WorldWideScience

Sample records for empirically based theoretical

  1. Theoretical Semi-Empirical AM1 studies of Schiff Bases

    International Nuclear Information System (INIS)

    Arora, K.; Burman, K.

    2005-01-01

The present communication reports theoretical semi-empirical studies of Schiff bases of 2-aminopyridine, along with a comparison with their parent compounds. The theoretical studies reveal that the azomethine group in the Schiff bases under study acts as the site for coordination to metals, as reported by many coordination chemists. (author)

  2. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence

    DEFF Research Database (Denmark)

    Mørcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-01-01

Our aim was to explore the underpinnings of OBE: its historical origins, theoretical basis, and empirical evidence of its effects, in order to answer the question: How can predetermined learning outcomes influence undergraduate medical education? This literature review had three components: a review of historical landmarks in the evolution of OBE; a review of conceptual frameworks and theories; and a systematic review of empirical publications from 1999 to 2010 that reported data concerning the effects of learning outcomes on undergraduate medical education. OBE had its origins … and professional attributes as "competencies". OBE has been adopted by consensus in the face of weak empirical evidence. OBE, which has been advocated for over 50 years, can contribute usefully to defining requisite knowledge and skills, and to blueprinting assessments. Its applicability to more complex aspects … greatest benefits.

  3. Theoretical and Empirical Analyses of an Improved Harmony Search Algorithm Based on Differential Mutation Operator

    Directory of Open Access Journals (Sweden)

    Longquan Yong

    2012-01-01

Full Text Available Harmony search (HS) is an emerging metaheuristic optimization algorithm. In this paper, an improved harmony search method based on a differential mutation operator (IHSDE) is proposed for optimization problems. Since population diversity plays an important role in the behavior of evolutionary algorithms, the aim of this paper is to calculate the expected population mean and variance of IHSDE from a theoretical viewpoint. Numerical results over a test suite of well-known benchmark functions, compared with the HSDE and NGHS methods, show that the IHSDE method has good convergence properties.
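The differential-mutation idea in this abstract can be sketched as follows. This is a minimal illustrative harmony search whose pitch adjustment is replaced by a DE/rand/1-style mutation, not the authors' exact IHSDE; the parameter values and the sphere test function are assumptions:

```python
import random

random.seed(0)

def ihsde_sketch(f, dim, bounds, hms=10, hmcr=0.9, F=0.5, iters=2000):
    """Harmony search variant where pitch adjustment is replaced by a
    DE-style differential mutation: x = a + F * (b - c)."""
    lo, hi = bounds
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:
                a, b, c = random.sample(memory, 3)   # differential mutation
                val = a[d] + F * (b[d] - c[d])
            else:
                val = random.uniform(lo, hi)         # random consideration
            new.append(min(max(val, lo), hi))        # keep within bounds
        worst = max(range(hms), key=lambda i: f(memory[i]))
        if f(new) < f(memory[worst]):                # replace worst harmony
            memory[worst] = new
    return min(memory, key=f)

sphere = lambda x: sum(v * v for v in x)
best = ihsde_sketch(sphere, dim=5, bounds=(-10.0, 10.0))
```

The greedy worst-replacement step is what lets the memory contract toward good regions, while the occasional uniform draws preserve the population diversity the abstract emphasizes.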

  4. Theoretical and empirical bases for dialect-neutral language assessment: contributions from theoretical and applied linguistics to communication disorders.

    Science.gov (United States)

    Pearson, Barbara Zurer

    2004-02-01

Three avenues of theoretical research provide insights for discovering abstract properties of language that are subject to disorder and amenable to assessment: (1) the study of universal grammar and its acquisition; (2) descriptions of African American English (AAE) syntax, semantics, and phonology within theoretical linguistics; and (3) the study of specific language impairment (SLI) cross-linguistically. Abstract linguistic concepts were translated into a set of assessment protocols that were used to establish normative data on language acquisition (developmental milestones) in typically developing AAE-speaking children ages 4 to 9 years. Testing AAE-speaking language-impaired (LI) children and both typically developing (TD) and LI Mainstream American English (MAE)-learning children on these same measures provided the data to select assessments for which (1) TD MAE and AAE children performed the same, and (2) TD performance was reliably different from LI performance in both dialect groups.

  5. Implementing Geographical Key Concepts: Design of a Symbiotic Teacher Training Course Based on Empirical and Theoretical Evidence

    Science.gov (United States)

    Fögele, Janis; Mehren, Rainer

    2015-01-01

    A central desideratum for the professionalization of qualified teachers is an improved practice of further teacher education. The present work constitutes a course of in-service training, which is built upon both a review of empirical findings concerning the efficacy of in-service training courses for teachers and theoretical assumptions about the…

  6. "Because I Am Worth It": A Theoretical Framework and Empirical Review of a Justification-Based Account of Self-Regulation Failure

    NARCIS (Netherlands)

    De Witt Huberts, Jessie C.; Evers, Catharine; De Ridder, Denise T D

    Self-regulation failure is often explained as being overwhelmed by impulse. The present article proposes a novel pathway, presenting a theoretical framework and empirical review of a justification-based account of self-regulation failure. With justification we refer to making excuses for one's

  7. Theoretical and Empirical Descriptions of Thermospheric Density

    Science.gov (United States)

    Solomon, S. C.; Qian, L.

    2004-12-01

The longest-term and most accurate overall description of the density of the upper thermosphere is provided by analysis of changes in the ephemerides of Earth-orbiting satellites. Empirical models of the thermosphere developed in part from these measurements can do a reasonable job of describing thermospheric properties on a climatological basis, but the promise of first-principles global general circulation models of the coupled thermosphere/ionosphere system is that a true high-resolution, predictive capability may ultimately be developed for thermospheric density. However, several issues are encountered when attempting to tune such models so that they accurately represent absolute densities as a function of altitude, and their changes on solar-rotational and solar-cycle time scales. Among these are the crucial ones of getting the heating rates (from both solar and auroral sources) right, getting the cooling rates right, and establishing the appropriate boundary conditions. However, there are several ancillary issues as well, such as the problem of registering a pressure-coordinate model onto an altitude scale, and dealing with possible departures from hydrostatic equilibrium in empirical models. Thus, tuning a theoretical model to match empirical climatology may be difficult, even in the absence of high temporal or spatial variation of the energy sources. We will discuss some of the challenges involved, and show comparisons of simulations using the NCAR Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIE-GCM) to empirical model estimates of neutral thermosphere density and temperature. We will also show some recent simulations using measured solar irradiance from the TIMED/SEE instrument as input to the TIE-GCM.

  8. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    Science.gov (United States)

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  9. An empirical evaluation of two theoretically-based hypotheses on the directional association between self-worth and hope.

    Science.gov (United States)

    McDavid, Lindley; McDonough, Meghan H; Smith, Alan L

    2015-06-01

Fostering self-worth and hope are important goals of positive youth development (PYD) efforts, yet intervention design is complicated by contrasting theoretical hypotheses regarding the directional association between these constructs. Therefore, within a longitudinal design we tested: (1) that self-worth predicts changes in hope (self-theory; Harter, 1999), and (2) that hope predicts changes in self-worth (hope theory; Snyder, 2002) over time. Youth (N = 321; mean age = 10.33 years) in a physical activity-based PYD program completed surveys 37-45 days prior to and on the second day and third-to-last day of the program. A latent variable panel model that included autoregressive and cross-lagged paths indicated that self-worth was a significant predictor of change in hope, but hope did not predict change in self-worth. Therefore, the directional association between self-worth and hope is better explained by self-theory, and PYD programs should aim to enhance perceptions of self-worth to build perceptions of hope. Copyright © 2015 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  10. A Theoretical and Empirical Integrated Method to Select the Optimal Combined Signals for Geometry-Free and Geometry-Based Three-Carrier Ambiguity Resolution.

    Science.gov (United States)

    Zhao, Dongsheng; Roberts, Gethin Wyn; Lau, Lawrence; Hancock, Craig M; Bai, Ruibin

    2016-11-16

Twelve GPS Block IIF satellites in the current constellation can transmit signals on three frequencies (L1, L2, L5). Taking advantage of these signals, Three-Carrier Ambiguity Resolution (TCAR) is expected to bring much benefit to ambiguity resolution. One research area is finding the optimal combined signals for better ambiguity resolution in geometry-free (GF) and geometry-based (GB) mode. However, existing research selects the signals through either pure theoretical analysis or testing with simulated data, which might be biased, as real observation conditions can differ from theoretical prediction or simulation. In this paper, we propose a theoretical and empirical integrated method, which first selects candidate optimal combined signals in theory and then refines these signals with real triple-frequency GPS data observed at eleven baselines of different lengths. An interpolation technique is also adopted to show how AR performance changes as baseline length increases. The results show that the AR success rate can be improved by 3% in GF mode and 8% in GB mode at certain intervals of baseline length. Therefore, TCAR performs better when adopting the combined signals proposed in this paper, provided the baseline meets the length condition.
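For context, the basic quantities behind "combined signals" follow from the standard triple-frequency linear-combination relations. The sketch below uses the published GPS carrier frequencies; the (0, 1, −1) extra-wide-lane is a common illustration, not necessarily one of the paper's selected combinations:

```python
# GPS carrier frequencies (Hz) and speed of light (m/s)
C = 299792458.0
F1, F2, F5 = 1575.42e6, 1227.60e6, 1176.45e6

def combo_properties(i, j, k):
    """Wavelength and first-order ionospheric amplification (relative to
    L1 delay) of the integer combination i*L1 + j*L2 + k*L5."""
    f = i * F1 + j * F2 + k * F5          # combined frequency
    lam = C / f                            # combined wavelength (m)
    beta = (i / F1 + j / F2 + k / F5) * F1**2 / f
    return lam, beta

# Extra-wide-lane (0, 1, -1): a very long wavelength (~5.86 m),
# which is what makes its integer ambiguity easy to fix.
lam_ewl, beta_ewl = combo_properties(0, 1, -1)
```

Long-wavelength, low-noise, low-ionosphere combinations of this kind are the candidates a theoretical selection produces; the paper's contribution is refining that candidate set against real baseline data.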

  11. Empirical and theoretical analysis of complex systems

    Science.gov (United States)

    Zhao, Guannan

structures evolve on a similar timescale to individual-level transmission, we investigated the process of transmission through a model population comprising social groups which follow simple dynamical rules for growth and break-up; the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power-law test algorithm, we have developed a fast testing procedure using parallel computation.

  12. Fickian-Based Empirical Approach for Diffusivity Determination in Hollow Alginate-Based Microfibers Using 2D Fluorescence Microscopy and Comparison with Theoretical Predictions

    Directory of Open Access Journals (Sweden)

    Maryam Mobed-Miremadi

    2014-12-01

Full Text Available Hollow alginate microfibers (od = 1.3 mm, id = 0.9 mm, th = 400 µm, L = 3.5 cm) comprised of 2% (w/v) medium-molecular-weight alginate cross-linked with 0.9 M CaCl2 were fabricated to model outward diffusion captured by 2D fluorescence microscopy. A two-fold comparison of diffusivity determination based on real-time diffusion of fluorescein isothiocyanate (FITC) molecular weight (MW) markers was conducted using a proposed Fickian-based approach in conjunction with a previously established numerical model developed from spectrophotometric data. Computed empirical/numerical (D_empirical/D_numerical) diffusivities, characterized by small standard deviations, for the 4-, 70- and 500-kDa markers, expressed in m²/s, are (1.06 × 10^−9 ± 1.96 × 10^−10)/(2.03 × 10^−11), (5.89 × 10^−11 ± 2.83 × 10^−12)/(4.6 × 10^−12) and (4.89 × 10^−12 ± 3.94 × 10^−13)/(1.27 × 10^−12), respectively, with the discrimination between the computation techniques narrowing as a function of MW. The numerical approach is recommended as the standard computational method for effective diffusivity determination from fluorescence-based measurements until capture rates (minimum 12 fps for the 4-kDa marker) and the use of linear instead of polynomial interpolating functions to model temporal intensity gradients have been proven to minimize the extent of systematic errors associated with the proposed empirical method.
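As a rough companion to the outward-diffusion modeling described above, Fick's second law across a membrane wall can be integrated with a simple explicit finite-difference scheme. The diffusivity and wall thickness below are taken from the abstract; the grid, boundary conditions, and run time are illustrative assumptions, not the paper's numerical model:

```python
# Explicit 1D finite-difference solution of Fick's second law,
# dc/dt = D * d2c/dx2, through a fiber wall acting as a slab.
D = 5.9e-11        # m^2/s, order of the reported 70-kDa diffusivity
L_wall = 400e-6    # wall thickness (m), per the abstract
nx, nt = 41, 20000
dx = L_wall / (nx - 1)
dt = 0.4 * dx * dx / D          # D*dt/dx^2 = 0.4 <= 0.5: stable scheme

c = [1.0] * nx                  # normalized concentration, initially loaded
c[-1] = 0.0                     # outer boundary: perfect sink (bath)
for _ in range(nt):
    new = c[:]
    for i in range(1, nx - 1):
        new[i] = c[i] + D * dt / dx**2 * (c[i + 1] - 2 * c[i] + c[i - 1])
    new[0] = new[1]             # inner boundary: no-flux (sealed lumen)
    c = new

retained = sum(c) / nx          # fraction of marker still inside the wall
```

Fitting curves of `retained` (or wall-averaged fluorescence intensity) against time to such a model is, in outline, how an effective diffusivity is extracted from imaging data.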

  13. Empirical and theoretical challenges in aboveground-belowground ecology

    DEFF Research Database (Denmark)

    W.H. van der Putten,; R.D. Bardgett; P.C. de Ruiter

    2009-01-01

…and environmental settings, we explore where and how they can be supported by theoretical approaches to develop testable predictions and to generalise empirical results. We review four key areas where a combined aboveground-belowground approach offers perspectives for enhancing ecological understanding, namely … Translation of the current conceptual succession models into more predictive models can help target empirical studies and generalise their results. Then, we discuss how understanding succession may help to enhance managing arable crops, grasslands and invasive plants, as well as provide insights into the effects…

  14. Physical Violence between Siblings: A Theoretical and Empirical Analysis

    Science.gov (United States)

    Hoffman, Kristi L.; Kiecolt, K. Jill; Edwards, John N.

    2005-01-01

    This study develops and tests a theoretical model to explain sibling violence based on the feminist, conflict, and social learning theoretical perspectives and research in psychology and sociology. A multivariate analysis of data from 651 young adults generally supports hypotheses from all three theoretical perspectives. Males with brothers have…

  15. Cognitive culture: theoretical and empirical insights into social learning strategies.

    Science.gov (United States)

    Rendell, Luke; Fogarty, Laurel; Hoppitt, William J E; Morgan, Thomas J H; Webster, Mike M; Laland, Kevin N

    2011-02-01

    Research into social learning (learning from others) has expanded significantly in recent years, not least because of productive interactions between theoretical and empirical approaches. This has been coupled with a new emphasis on learning strategies, which places social learning within a cognitive decision-making framework. Understanding when, how and why individuals learn from others is a significant challenge, but one that is critical to numerous fields in multiple academic disciplines, including the study of social cognition. Copyright © 2010 Elsevier Ltd. All rights reserved.

  16. Intuition in Decision Making – Theoretical and Empirical Aspects

    Directory of Open Access Journals (Sweden)

    Kamila Malewska

    2015-11-01

Full Text Available In an economy dominated by information and knowledge, analysis ceases to be the sole and sufficient source of knowledge. Managers seek alternative ways of obtaining and interpreting information and knowledge, and here managerial intuitive potential begins to play an important role. The aim of this paper is to present the issue of intuition in decision making in both theoretical and empirical terms. The first part presents the essence of intuition and its role in management, especially in decision making. The empirical part then attempts to identify the intuitive potential of managers and the extent of its use in practical decision making. The case study method was used to achieve this goal; the analysis involved a Polish food company, "Fawor", which employs more than 300 workers. The literature and empirical studies in the area of intuition were conducted within the research project "The impact of managerial intuitive potential on the effectiveness of decision making processes", financed by the National Science Centre, Poland (funds allocated on the basis of decision No. DEC-2014/13/D/HS4/01750).

  17. Empirical STORM-E Model. I. Theoretical and Observational Basis

    Science.gov (United States)

    Mertens, Christopher J.; Xu, Xiaojing; Bilitza, Dieter; Mlynczak, Martin G.; Russell, James M., III

    2013-01-01

Auroral nighttime infrared emission observed by the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument onboard the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite is used to develop an empirical model of geomagnetic storm enhancements to E-region peak electron densities. The empirical model is called STORM-E and will be incorporated into the 2012 release of the International Reference Ionosphere (IRI). The proxy for characterizing the E-region response to geomagnetic forcing is NO+(v) volume emission rates (VER) derived from the TIMED/SABER 4.3 µm channel limb radiance measurements. The storm-time response of the NO+(v) 4.3 µm VER is sensitive to auroral particle precipitation. A statistical database of storm-time to climatological quiet-time ratios of SABER-observed NO+(v) 4.3 µm VER is fit to widely available geomagnetic indices using the theoretical framework of linear impulse-response theory. The STORM-E model provides a dynamic storm-time correction factor to adjust a known quiescent E-region electron density peak concentration for geomagnetic enhancements due to auroral particle precipitation. Part II of this series describes the explicit development of the empirical storm-time correction factor for E-region peak electron densities, and shows comparisons of E-region electron densities between STORM-E predictions and incoherent scatter radar measurements. In this paper, Part I of the series, the efficacy of using SABER-derived NO+(v) VER as a proxy for the E-region response to solar-geomagnetic disturbances is presented. Furthermore, a detailed description of the algorithms and methodologies used to derive NO+(v) VER from SABER 4.3 µm limb emission measurements is given. Finally, an assessment of key uncertainties in retrieving NO+(v) VER is presented.
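The linear impulse-response fitting mentioned in the abstract can be illustrated with a toy storm-time correction factor driven by a geomagnetic index: the correction is a convolution of the index history with a decaying kernel. The kernel shape, gain, and ap values here are illustrative assumptions, not the fitted STORM-E coefficients:

```python
import math

# Synthetic 3-hourly ap index through an idealized storm (assumed values)
ap = [4, 4, 7, 15, 48, 111, 67, 32, 15, 7, 4, 4]
tau, gain = 6.0, 0.002     # assumed response decay (hours) and gain
dt = 3.0                   # ap cadence (hours)

def correction(t_idx):
    """Storm-time correction factor at step t_idx: 1 + the ap history
    weighted by an exponential impulse response exp(-lag/tau)."""
    s = sum(ap[k] * math.exp(-(t_idx - k) * dt / tau) for k in range(t_idx + 1))
    return 1.0 + gain * dt * s

quiet = correction(0)                               # near 1 before the storm
peak = max(correction(i) for i in range(len(ap)))   # enhanced during the storm
```

The attraction of this form is that once the kernel is fit to the statistical database of VER ratios, the correction factor can be driven in real time by widely available indices.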

  18. Trophic interaction modifications: an empirical and theoretical framework.

    Science.gov (United States)

    Terry, J Christopher D; Morris, Rebecca J; Bonsall, Michael B

    2017-10-01

Consumer-resource interactions are often influenced by other species in the community. At present these 'trophic interaction modifications' are rarely included in ecological models despite demonstrations that they can drive system dynamics. Here, we advocate and extend an approach that has the potential to unite and represent this key group of non-trophic interactions by emphasising the change to trophic interactions induced by modifying species. We highlight the opportunities this approach brings in comparison to frameworks that coerce trophic interaction modifications into pairwise relationships. To establish common frames of reference and explore the value of the approach, we set out a range of metrics for the 'strength' of an interaction modification which incorporate increasing levels of contextual information about the system. Through demonstrations in three-species model systems, we establish that these metrics capture complementary aspects of interaction modifications. We show how the approach can be used in a range of empirical contexts; as specific gaps in current understanding, we identify experiments with multiple levels of modifier species and the distributions of modifications in networks. The trophic interaction modification approach we propose can motivate and unite empirical and theoretical studies of system dynamics, providing a route to confront ecological complexity. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  19. A theoretical and empirical evaluation and extension of the Todaro migration model.

    Science.gov (United States)

    Salvatore, D

    1981-11-01

"This paper postulates that it is theoretically and empirically preferable to base internal labor migration on the relative difference in rural-urban real income streams and rates of unemployment, taken as separate and independent variables, rather than on the difference in the expected real income streams as postulated by the very influential and often quoted Todaro model. The paper goes on to specify several important ways of extending the resulting migration model and improving its empirical performance." The analysis is based on Italian data. (excerpt)

  20. Converging Paradigms: A Reflection on Parallel Theoretical Developments in Psychoanalytic Metapsychology and Empirical Dream Research.

    Science.gov (United States)

    Schmelowszky, Ágoston

    2016-08-01

    In the last decades one can perceive a striking parallelism between the shifting perspective of leading representatives of empirical dream research concerning their conceptualization of dreaming and the paradigm shift within clinically based psychoanalytic metapsychology with respect to its theory on the significance of dreaming. In metapsychology, dreaming becomes more and more a central metaphor of mental functioning in general. The theories of Klein, Bion, and Matte-Blanco can be considered as milestones of this paradigm shift. In empirical dream research, the competing theories of Hobson and of Solms respectively argued for and against the meaningfulness of the dream-work in the functioning of the mind. In the meantime, empirical data coming from various sources seemed to prove the significance of dream consciousness for the development and maintenance of adaptive waking consciousness. Metapsychological speculations and hypotheses based on empirical research data seem to point in the same direction, promising for contemporary psychoanalytic practice a more secure theoretical base. In this paper the author brings together these diverse theoretical developments and presents conclusions regarding psychoanalytic theory and technique, as well as proposing an outline of an empirical research plan for testing the specificity of psychoanalysis in developing dream formation.

  1. Myths in Transformation Processes. Theoretical Interpretation and Empirical Data

    DEFF Research Database (Denmark)

    Weik, E.

    2001-01-01

Transformation processes are historical times which differ considerably from the normal course of events. As societal and group identities crumble or break down, it becomes difficult for the individual actor to retain a reference structure on which to base rational action. In consequence, actions … rational thinking is not an adequate or even possible reaction. Using empirical materials from East German enterprises, the article shows how the concept can improve the researcher's understanding of managerial action in transformation times and explain hitherto "irrational" elements in people's accounts.

  2. Institutions and growth: theoretical foundations and empirical evidence

    International Nuclear Information System (INIS)

    Wagner, A.F.

    2000-09-01

Institutions and growth rates are strongly linked, both theoretically and empirically. Institutions act through 'efficiency of governance', as institutions of conflict management, and as devices for intertemporal optimization. Contrary to most of the literature, a non-linear (inversely U-shaped) influence of institutions on growth is also formally derived as a general hypothesis; this reflects the widely neglected notion that institutions also bring opportunity costs with them. Systematic econometric evaluations for a world-wide cross-sectional sample and a European Union panel show mixed results: the rule of law, property rights, and contract and law enforcement are consistently positively related to growth. In Europe, a non-linear relationship is often found for these and other institutions. Corporatism and trust are good for growth in Europe; both bring with them significant rent-seeking costs, though. Comparing the results, one notes that no easy transfer of knowledge is possible from one sample to the other. This is an important policy conclusion in its own right. (author)

  3. The money creation process: A theoretical and empirical analysis for the US

    OpenAIRE

    Levrero, Enrico Sergio; Deleidi, Matteo

    2017-01-01

The aim of this paper is to assess – on both theoretical and empirical grounds – the two main views of the money creation process, namely the endogenous and exogenous money approaches. After analysing the main issues and the related empirical literature, we apply a VAR and VECM methodology to the United States over the period 1959-2016 to assess the causal relationship between a number of critical variables that are supposed to determine the money supply, i.e., the monetary base, ban...
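The causal question a VAR addresses can be illustrated with a minimal Granger-style F-test on synthetic data. The data-generating process, lag length, and variable names below are assumptions for illustration, not the paper's specification or estimates:

```python
import numpy as np

# Synthetic data where the monetary base genuinely leads broad money
rng = np.random.default_rng(0)
n = 200
base = np.cumsum(rng.normal(size=n))      # random-walk "monetary base"
money = np.empty(n)
money[0] = 0.0
for t in range(1, n):
    money[t] = 0.5 * money[t - 1] + 0.4 * base[t - 1] + rng.normal()

def ols_rss(y, X):
    """Residual sum of squares from an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

# Restricted model: money on its own lag; unrestricted: add lagged base.
y = money[1:]
X_r = np.column_stack([np.ones(n - 1), money[:-1]])
X_u = np.column_stack([X_r, base[:-1]])
rss_r, rss_u = ols_rss(y, X_r), ols_rss(y, X_u)
F = ((rss_r - rss_u) / 1) / (rss_u / (n - 1 - 3))   # 1 restriction, 3 params
```

A large F rejects the restriction, i.e., lagged base helps predict money; running the test in both directions is the one-lag skeleton of the causality analysis the abstract describes.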

  4. Relationships between moment magnitude and fault parameters: theoretical and semi-empirical relationships

    Science.gov (United States)

    Wang, Haiyun; Tao, Xiaxin

    2003-12-01

Fault parameters are important in earthquake hazard analysis. In this paper, theoretical relationships between moment magnitude and fault parameters, including subsurface rupture length, downdip rupture width, rupture area, and average slip over the fault surface, are deduced from seismological theory. These theoretical relationships are further simplified by applying similarity conditions, and a unique form is established. Then, combining the simplified theoretical relationships between moment magnitude and fault parameters with seismic source data selected in this study, a practical semi-empirical relationship is established. The selected seismic source data are also used to derive empirical relationships between moment magnitude and fault parameters by the ordinary least squares regression method. Comparisons between the semi-empirical and empirical relationships show that the former depict the distribution trends of the data better than the latter. It is also observed that downdip rupture widths of strike-slip faults saturate when moment magnitude exceeds 7.0, but downdip rupture widths of dip-slip faults do not saturate in the moment magnitude range of this study.
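Empirical relationships of this kind are conventionally fit as log-linear regressions of moment magnitude on the logarithm of a fault parameter, Mw = a + b·log10(L). A minimal sketch on synthetic data follows; the fitted coefficients are illustrative, not the paper's:

```python
import numpy as np

# Synthetic "catalogue": magnitudes generated from an assumed scaling law
rng = np.random.default_rng(1)
log_L = rng.uniform(0.5, 2.5, 50)                 # log10 rupture length (km)
Mw = 4.4 + 1.5 * log_L + rng.normal(0, 0.2, 50)   # assumed a=4.4, b=1.5

# Ordinary least squares fit of Mw = a + b * log10(L)
A = np.column_stack([np.ones_like(log_L), log_L])
(a, b), *_ = np.linalg.lstsq(A, Mw, rcond=None)

# Predicted magnitude for a 100 km rupture (log10(100) = 2)
Mw_100 = a + b * 2.0
```

The semi-empirical approach the abstract describes differs in that the functional form is first constrained by seismological theory and similarity conditions before the data enter, rather than leaving both coefficients free.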

  5. Theoretical vs. empirical discriminability: the application of ROC methods to eyewitness identification.

    Science.gov (United States)

    Wixted, John T; Mickes, Laura

    2018-01-01

    Receiver operating characteristic (ROC) analysis was introduced to the field of eyewitness identification 5 years ago. Since that time, it has been both influential and controversial, and the debate has raised an issue about measuring discriminability that is rarely considered. The issue concerns the distinction between empirical discriminability (measured by area under the ROC curve) vs. underlying/theoretical discriminability (measured by d' or variants of it). Under most circumstances, the two measures will agree about a difference between two conditions in terms of discriminability. However, it is possible for them to disagree, and that fact can lead to confusion about which condition actually yields higher discriminability. For example, if the two conditions have implications for real-world practice (e.g., a comparison of competing lineup formats), should a policymaker rely on the area-under-the-curve measure or the theory-based measure? Here, we illustrate the fact that a given empirical ROC yields as many underlying discriminability measures as there are theories that one is willing to take seriously. No matter which theory is correct, for practical purposes, the singular area-under-the-curve measure best identifies the diagnostically superior procedure. For that reason, area under the ROC curve informs policy in a way that underlying theoretical discriminability never can. At the same time, theoretical measures of discriminability are equally important, but for a different reason. Without an adequate theoretical understanding of the relevant task, the field will be in no position to enhance empirical discriminability.
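The two measures the abstract contrasts can be computed side by side: empirical discriminability as area under the ROC, and theoretical discriminability as equal-variance signal-detection d′. The hit and false-alarm rates below are made-up illustration values, not eyewitness data:

```python
from statistics import NormalDist

nd = NormalDist()

# Cumulative hit and false-alarm rates across three confidence criteria
hits = [0.30, 0.50, 0.70]
fas = [0.05, 0.15, 0.35]

# Empirical discriminability: area under the ROC by the trapezoidal rule,
# anchoring the curve at (0, 0) and (1, 1).
xs = [0.0] + fas + [1.0]
ys = [0.0] + hits + [1.0]
auc = sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2
          for i in range(len(xs) - 1))

# Theoretical discriminability: d' = z(hit rate) - z(false-alarm rate)
# under the equal-variance Gaussian model, at the middle criterion.
d_prime = nd.inv_cdf(hits[1]) - nd.inv_cdf(fas[1])
```

The AUC is model-free, which is the abstract's point about its value for policy; d′ only measures underlying discriminability correctly insofar as the equal-variance Gaussian model is the right theory of the task.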

  6. Whole-body cryotherapy: empirical evidence and theoretical perspectives.

    Science.gov (United States)

    Bleakley, Chris M; Bieuzen, François; Davison, Gareth W; Costello, Joseph T

    2014-01-01

    Whole-body cryotherapy (WBC) involves short exposures to air temperatures below -100°C. WBC is increasingly accessible to athletes, and is purported to enhance recovery after exercise and facilitate rehabilitation postinjury. Our objective was to review the efficacy and effectiveness of WBC using empirical evidence from controlled trials. We found ten relevant reports; the majority were based on small numbers of active athletes aged less than 35 years. Although WBC produces a large temperature gradient for tissue cooling, the relatively poor thermal conductivity of air prevents significant subcutaneous and core body cooling. There is weak evidence from controlled studies that WBC enhances antioxidant capacity and parasympathetic reactivation, and alters inflammatory pathways relevant to sports recovery. A series of small randomized studies found WBC offers improvements in subjective recovery and muscle soreness following metabolic or mechanical overload, but little benefit towards functional recovery. There is evidence from one study only that WBC may assist rehabilitation for adhesive capsulitis of the shoulder. There were no adverse events associated with WBC; however, studies did not seem to undertake active surveillance of predefined adverse events. Until further research is available, athletes should remain cognizant that less expensive modes of cryotherapy, such as local ice-pack application or cold-water immersion, offer comparable physiological and clinical effects to WBC.

  7. Hybrid empirical--theoretical approach to modeling uranium adsorption

    International Nuclear Information System (INIS)

    Hull, Larry C.; Grossman, Christopher; Fjeld, Robert A.; Coates, John T.; Elzerman, Alan W.

    2004-01-01

An estimated 330 metric tons of U are buried in the radioactive waste Subsurface Disposal Area (SDA) at the Idaho National Engineering and Environmental Laboratory (INEEL). An assessment of U transport parameters is being performed to decrease the uncertainty in risk and dose predictions derived from computer simulations of U fate and transport to the underlying Snake River Plain Aquifer. Uranium adsorption isotherms were measured for 14 sediment samples collected from sedimentary interbeds underlying the SDA. The adsorption data were fit with a Freundlich isotherm. The Freundlich n parameter is statistically identical for all 14 sediment samples and the Freundlich K_f parameter is correlated to sediment surface area (r² = 0.80). These findings suggest an efficient approach to material characterization and implementation of a spatially variable reactive transport model that requires only the measurement of sediment surface area. To expand the potential applicability of the measured isotherms, a model is derived from the empirical observations by incorporating concepts from surface complexation theory to account for the effects of solution chemistry. The resulting model is then used to predict the range of adsorption conditions to be expected in the vadose zone at the SDA based on the range in measured pore water chemistry. Adsorption in the deep vadose zone is predicted to be stronger than in near-surface sediments because the total dissolved carbonate decreases with depth.
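The Freundlich fit described above, q = K_f · Cⁿ, is conventionally done by linear regression in log space, since ln q = ln K_f + n·ln C. A minimal sketch with illustrative concentrations (not the INEEL measurements):

```python
import math

# Illustrative adsorption data: aqueous concentration C and sorbed amount q
C = [0.1, 0.5, 1.0, 5.0, 10.0]
q = [2.1, 6.8, 10.2, 32.0, 51.0]

# Linearize: ln q = ln Kf + n * ln C, then fit by least squares
xs = [math.log(c) for c in C]
ys = [math.log(v) for v in q]
m = len(xs)
xbar, ybar = sum(xs) / m, sum(ys) / m
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))

n_f = slope                             # Freundlich n (dimensionless)
K_f = math.exp(ybar - slope * xbar)     # Freundlich Kf (units of q / C^n)
```

Repeating this fit per sediment sample and regressing the resulting K_f values against measured surface area is the kind of workflow that would support the surface-area-only characterization the abstract proposes.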

  8. Whole-body cryotherapy: empirical evidence and theoretical perspectives

    Directory of Open Access Journals (Sweden)

    Bleakley CM

    2014-03-01

    Full Text Available Chris M Bleakley,1 François Bieuzen,2 Gareth W Davison,1 Joseph T Costello3 1Sport and Exercise Science Research Institute, Faculty of Life and Health Sciences, University of Ulster, Newtownabbey, Northern Ireland; 2Research Department, Laboratory of Sport, Expertise and Performance, French National Institute of Sport (INSEP), Paris, France; 3School of Exercise and Nutrition Sciences and Institute of Health and Biomedical Innovation, Queensland University of Technology, Brisbane, Australia Abstract: Whole-body cryotherapy (WBC) involves short exposures to air temperatures below –100°C. WBC is increasingly accessible to athletes, and is purported to enhance recovery after exercise and facilitate rehabilitation postinjury. Our objective was to review the efficacy and effectiveness of WBC using empirical evidence from controlled trials. We found ten relevant reports; the majority were based on small numbers of active athletes aged less than 35 years. Although WBC produces a large temperature gradient for tissue cooling, the relatively poor thermal conductivity of air prevents significant subcutaneous and core body cooling. There is weak evidence from controlled studies that WBC enhances antioxidant capacity and parasympathetic reactivation, and alters inflammatory pathways relevant to sports recovery. A series of small randomized studies found WBC offers improvements in subjective recovery and muscle soreness following metabolic or mechanical overload, but little benefit towards functional recovery. There is evidence from one study only that WBC may assist rehabilitation for adhesive capsulitis of the shoulder. There were no adverse events associated with WBC; however, studies did not seem to undertake active surveillance of predefined adverse events. Until further research is available, athletes should remain cognizant that less expensive modes of cryotherapy, such as local ice-pack application or cold-water immersion, offer comparable physiological and clinical effects to WBC.

  9. Connecting theoretical and empirical studies of trait-mediated interactions

    Czech Academy of Sciences Publication Activity Database

    Bolker, B.; Holyoak, M.; Křivan, Vlastimil; Rowe, L.; Schmitz, O.

    2003-01-01

    Roč. 84, č. 5 (2003), s. 1101-1114 ISSN 0012-9658 Institutional research plan: CEZ:AV0Z5007907 Keywords : community models * competition * empirical study Subject RIV: EH - Ecology, Behaviour Impact factor: 3.701, year: 2003

  10. Potential benefits of remote sensing: Theoretical framework and empirical estimate

    Science.gov (United States)

    Eisgruber, L. M.

    1972-01-01

    A theoretical framework is outlined for estimating the social returns from research on and application of remote sensing. The approximate dollar magnitude of a particular application of remote sensing is given, namely estimates of corn, soybean, and wheat production. Finally, some comments are made on the limitations of this procedure and on the implications of the results.

  11. What drives adult personality development?: A comparison of theoretical perspectives and empirical evidence

    NARCIS (Netherlands)

    Specht, J.; Bleidorn, W.; Denissen, J.J.A.; Hennecke, M.; Hutteman, R.; Luhmann, M.; Orth, U.; Reitz, A.K.; Zimmerman, J.

    2014-01-01

    Increasing numbers of empirical studies provide compelling evidence that personality traits change across the entire lifespan. What initiates this continuing personality development and how does this development proceed? In this paper, we compare six theoretical perspectives that offer testable

  12. An extra-memetic empirical methodology to accompany theoretical memetics

    OpenAIRE

    Gill, Jameson

    2012-01-01

    Abstract. Purpose: The paper describes the difficulties encountered by researchers who are looking to operationalise theoretical memetics and provides a methodological avenue for studies that can test meme theory. Design/Methodology/Approach: The application of evolutionary theory to organisations is reviewed by critically reflecting on the validity of its truth claims. To focus the discussion a number of applications of meme theory are reviewed to raise specific issues which oug...

  13. Does the U.S. exercise contagion on Italy? A theoretical model and empirical evidence

    Science.gov (United States)

    Cerqueti, Roy; Fenga, Livio; Ventura, Marco

    2018-06-01

    This paper deals with the theme of contagion in financial markets. To this end, we develop a model based on Mixed Poisson Processes to describe the abnormal returns of the financial markets of two considered countries. In so doing, the article defines the theoretical conditions to be satisfied in order to state that one of them - the so-called leader - exercises contagion on the others - the followers. Specifically, we employ an invariant probabilistic result stating that a suitable transformation of a Mixed Poisson Process is still a Mixed Poisson Process. The theoretical claim is validated by implementing an extensive simulation analysis grounded on empirical data. The countries considered are the U.S. (as the leader) and Italy (as the follower), and the period under scrutiny is very long, ranging from 1970 to 2014.
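
    A minimal simulation sketch of the paper's central object may help: a Mixed Poisson Process first draws a random intensity, then generates Poisson counts conditional on it. The Gamma mixing law and all parameter values below are illustrative assumptions, not taken from the paper.

```python
import random

def mixed_poisson_counts(t, shape, rate, n_samples, rng):
    """Draw counts from a mixed Poisson process observed over [0, t]:
    the intensity Lambda is random (here Gamma-distributed) and, conditional
    on Lambda, the count is Poisson(Lambda * t). Gamma mixing makes the
    marginal count distribution negative binomial."""
    counts = []
    for _ in range(n_samples):
        lam = rng.gammavariate(shape, 1.0 / rate)  # random intensity draw
        # Sample Poisson(lam * t) by counting unit-rate exponential gaps
        mu, n, acc = lam * t, 0, 0.0
        while True:
            acc += rng.expovariate(1.0)
            if acc > mu:
                break
            n += 1
        counts.append(n)
    return counts

rng = random.Random(42)
counts = mixed_poisson_counts(t=1.0, shape=2.0, rate=1.0, n_samples=20000, rng=rng)
mean = sum(counts) / len(counts)
# E[N] = t * E[Lambda] = t * shape / rate, i.e. close to 2.0 here
print(round(mean, 2))
```

The over-dispersion of the resulting counts (variance exceeding the mean) is what distinguishes a mixed Poisson model from a plain Poisson one and makes it suitable for abnormal-return clustering.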

  14. Axiological aspects of sustainable development with theoretical and empirical approach

    Directory of Open Access Journals (Sweden)

    Włodzimierz Kaczocha

    2011-01-01

    Full Text Available The first part of the paper presents the values-goals which are contained in political programs of sustainable development. These values (e.g. social justice, intergenerational justice, liberty, sustainable consumption and the value of nature) should be explained with respect to the assumptions of the ethics of beliefs, obligation or responsibility. The second part discusses the results of empirical studies which focused on certain goals-values of sustainable development achieved by residents of rural areas. The third part contains an analytical interpretation of three values: positive liberty, social justice and community, which are of key importance to sustainable development. The fourth and final part discusses political and ethical dilemmas which must be faced by Polish politicians: should we design and implement sustainable development evenly, or, in consideration of the ethical aspect, should we start first with a radical improvement in the living conditions of poor people?

  15. The growth of business firms: theoretical framework and empirical evidence.

    Science.gov (United States)

    Fu, Dongfeng; Pammolli, Fabio; Buldyrev, S V; Riccaboni, Massimo; Matia, Kaushik; Yamasaki, Kazuko; Stanley, H Eugene

    2005-12-27

    We introduce a model of proportional growth to explain the distribution P_g(g) of business-firm growth rates. The model predicts that P_g(g) is exponential in the central part and exhibits asymptotic power-law behavior in the tails with an exponent zeta = 3. Because of data limitations, previous studies in this field had focused exclusively on the Laplace shape of the body of the distribution. In this article, we test the model at different levels of aggregation in the economy, from products to firms to countries, and we find that the predictions of the model agree with empirical growth distributions and size-variance relationships.
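
    The tail prediction P(|g| > x) ~ x^(-zeta) with zeta = 3 can be checked on data with a standard Hill estimator. The sketch below uses synthetic Pareto-tailed "growth rates", not the firm-level data of the study, and is not the authors' estimation procedure.

```python
import math
import random

def hill_tail_exponent(values, k):
    """Standard Hill estimator of the tail exponent zeta for
    P(|g| > x) ~ x**(-zeta), based on the k largest observations."""
    s = sorted((abs(v) for v in values), reverse=True)
    threshold = s[k]  # the (k+1)-th largest value acts as the tail cutoff
    return k / sum(math.log(x / threshold) for x in s[:k])

rng = random.Random(7)
# Synthetic sample with a Pareto tail of exponent 3, matching the model's zeta
g = [rng.random() ** (-1.0 / 3.0) for _ in range(50000)]
zeta_hat = hill_tail_exponent(g, k=500)
print(round(zeta_hat, 2))  # close to the theoretical zeta = 3
```

In practice the estimate is sensitive to the choice of k (the number of tail observations used), so tail exponents are usually reported across a range of k values.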

  16. Using Graph and Vertex Entropy to Compare Empirical Graphs with Theoretical Graph Models

    Directory of Open Access Journals (Sweden)

    Tomasz Kajdanowicz

    2016-09-01

    Full Text Available Over the years, several theoretical graph generation models have been proposed. Among the most prominent are: the Erdős–Rényi random graph model, the Watts–Strogatz small-world model, the Albert–Barabási preferential attachment model, the Price citation model, and many more. Often, researchers working with real-world data are interested in understanding the generative phenomena underlying their empirical graphs. They want to know which of the theoretical graph generation models would most probably generate a particular empirical graph. In other words, they expect some similarity assessment between the empirical graph and graphs artificially created from theoretical graph generation models. Usually, in order to assess the similarity of two graphs, centrality measure distributions are compared. For a theoretical graph model this means comparing the empirical graph to a single realization of a theoretical graph model, where the realization is generated from the given model using an arbitrary set of parameters. The similarity between centrality measure distributions can be measured using standard statistical tests, e.g., the Kolmogorov–Smirnov test of distances between cumulative distributions. However, this approach is both error-prone and leads to incorrect conclusions, as we show in our experiments. Therefore, we propose a new method for graph comparison and type classification by comparing the entropies of centrality measure distributions (degree centrality, betweenness centrality, closeness centrality). We demonstrate that our approach can help assign the empirical graph to the most similar theoretical model using a simple unsupervised learning method.
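
    The core idea, comparing entropies of centrality measure distributions, can be sketched for the simplest centrality (degree) in pure Python; the graph inputs are toy examples, not the datasets of the paper.

```python
import math
from collections import Counter

def degree_entropy(edges):
    """Shannon entropy (in bits) of the degree distribution of an
    undirected graph given as a list of (u, v) edges."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(deg)
    # frequency of each distinct degree value across the n nodes
    freq = Counter(deg.values())
    return -sum((c / n) * math.log2(c / n) for c in freq.values())

cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]  # every node has degree 2
star = [(0, 1), (0, 2), (0, 3)]           # one hub of degree 3, three leaves
print(degree_entropy(cycle))  # degenerate distribution: zero entropy
print(degree_entropy(star))   # mixed degrees: positive entropy
```

The same recipe applies to betweenness or closeness centrality after binning the (continuous) centrality values; two graphs can then be compared by the distance between their entropy profiles.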

  17. Theoretical, Methodological, and Empirical Approaches to Cost Savings: A Compendium

    Energy Technology Data Exchange (ETDEWEB)

    M Weimar

    1998-12-10

    This publication summarizes and contains the original documentation for understanding why the U.S. Department of Energy's (DOE's) privatization approach provides cost savings and the different approaches that could be used in calculating cost savings for the Tank Waste Remediation System (TWRS) Phase I contract. The initial section summarizes the approaches in the different papers. The appendices are the individual source papers which have been reviewed by individuals outside of the Pacific Northwest National Laboratory and the TWRS Program. Appendix A provides a theoretical basis for and estimate of the level of savings that can be obtained from a fixed-price contract with performance risk maintained by the contractor. Appendix B provides the methodology for determining cost savings when comparing a fixed-price contractor with a Management and Operations (M&O) contractor (cost-plus contractor). Appendix C summarizes the economic model used to calculate cost savings and provides hypothetical output from preliminary calculations. Appendix D provides the summary of the approach for the DOE-Richland Operations Office (RL) estimate of the M&O contractor to perform the same work as BNFL Inc. Appendix E contains information on cost growth and per-metric-ton-of-glass costs for high-level waste at two other DOE sites, West Valley and Savannah River. Appendix F addresses a risk allocation analysis of the BNFL proposal that indicates that the current approach is still better than the alternative.

  18. Historical Consciousness in Youth. Theoretical and Exemplary Empirical Analyses

    Directory of Open Access Journals (Sweden)

    Carlos Kölbl

    2001-09-01

    Full Text Available The thesis that historical consciousness is an anthropological competence and category is called into question. A concept of modern historical consciousness is outlined which from then on serves as a working concept. This kind of historical consciousness, it is argued, is not a universal anthropological fact, but a result of the development of occidental cultures and societies. A great number of groups and individuals have long since been deeply affected by this development, in which the establishment of a scientific world view and methodical thinking played a major role. Their historical consciousness is modern since it refers to a radically temporalized and dynamic world and since it ties partial representations of this world to (implicit) criteria of validity. Moreover, it is closely connected with the possibility of self-critical reflections which are grounded in the historically mediated encounter with strangers. After a concise overview of the important questions and the state of the art in different disciplines, selected results of a broader qualitative-empirical study are presented. In the group discussions which were carried out with young people—only results from a discussion with thirteen- to fourteen-year-old grammar-school pupils (Gymnasiasten) are presented here—the analysis revealed clear indicators of a specifically modern historical consciousness. Looked at closely, this consciousness is committed to a surprisingly high degree to scientific-methodical standards of rationality. One may welcome this as a successful implementation of a life form oriented towards rationality into young people's everyday life or deplore it as a symptom of the distortion of pragmatic orientations for activity and living by scientific standards: first of all it is a fact that the commitment to tie the reconstruction of past realities, historical events and contexts to an operation of knowledge which is intersubjectively transparent and rationally

  19. Promoting mental wellbeing: developing a theoretically and empirically sound complex intervention.

    Science.gov (United States)

    Millar, S L; Donnelly, M

    2014-06-01

    This paper describes the development of a complex intervention to promote mental wellbeing using the revised framework for developing and evaluating complex interventions produced by the UK Medical Research Council (UKMRC). Application of the first two phases of the framework is described: development, and feasibility and piloting. The theoretical case and evidence base were examined analytically to explicate the theoretical and empirical foundations of the intervention. These findings informed the design of a 12-week mental wellbeing promotion programme providing early intervention for people showing signs of mental health difficulties. The programme is based on the theoretical constructs of self-efficacy, self-esteem, purpose in life, resilience and social support and comprises 10 steps. A mixed methods approach was used to conduct a feasibility study with community and voluntary sector service users and in primary care. A significant increase in mental wellbeing was observed following participation in the intervention. Qualitative data corroborated this finding and suggested that the intervention was feasible to deliver and acceptable to participants, facilitators and health professionals. The revised UKMRC framework can be successfully applied to the development of public health interventions. © The Author 2013. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Peer influence in network markets: a theoretical and empirical analysis

    NARCIS (Netherlands)

    J. Henkel (Joachim); J.H. Block (Jörn)

    2013-01-01

    textabstractNetwork externalities spur the growth of networks and the adoption of network goods in two ways. First, they make it more attractive to join a network the larger its installed base. Second, they create incentives for network members to actively recruit new members. Despite indications

  1. Input Manipulation, Enhancement and Processing: Theoretical Views and Empirical Research

    Science.gov (United States)

    Benati, Alessandro

    2016-01-01

    Researchers in the field of instructed second language acquisition have been examining the issue of how learners interact with input by conducting research measuring particular kinds of instructional interventions (input-oriented and meaning-based). These interventions include such things as input flood, textual enhancement and processing…

  2. Adaptation in Food Networks: Theoretical Framework and Empirical Evidences

    Directory of Open Access Journals (Sweden)

    Gaetano Martino

    2013-03-01

    Full Text Available The paper concerns integration in food networks from a governance point of view. We conceptualize the integration processes in terms of adaptation theory and frame the related issues in a transaction cost economics perspective. We conjecture that the allocation of decision rights between the parties to a transaction is a key instrument for coping with the sources of basic uncertainty in food networks: technological innovation, sustainability strategies, and quality and safety objectives. Six case studies are proposed which contribute to corroborating our conjecture. Managerial patterns based on a joint decision approach are also documented.

  3. Determinants of Business Success – Theoretical Model and Empirical Verification

    Directory of Open Access Journals (Sweden)

    Kozielski Robert

    2016-12-01

    Full Text Available Market knowledge, market orientation, learning competencies, and business performance were the key issues of the research project conducted in the 2006 study. The main findings identified significant relationships between the independent variables (market knowledge, market orientation, learning competencies) and the dependent variable (business success). A partial correlation analysis indicated that business success primarily relies on organisational learning competencies. Organisational learning competencies, to a large extent (almost 60%), may be explained by the level of corporate market knowledge and market orientation. The aim of the paper is to evaluate to what extent the relationships between the variables are still valid. The research was based on primary and secondary data sources. The main part of the research was carried out in the form of quantitative studies. The results of the 2014 study are consistent with the previous (2006) results.

  4. The Effect of Private Benefits of Control on Minority Shareholders: A Theoretical Model and Empirical Evidence from State Ownership

    Directory of Open Access Journals (Sweden)

    Kerry Liu

    2017-06-01

    Full Text Available Purpose: The purpose of this paper is to examine the effect of private benefits of control on minority shareholders. Design/methodology/approach: A theoretical model is established. The empirical analysis includes hand-collected data from a wide range of data sources. OLS and 2SLS regression analysis are applied with Huber-White standard errors. Findings: The theoretical model shows that, while private benefits are generally harmful to minority shareholders, the overall effect depends on the size of large shareholder ownership. The empirical evidence from government ownership is consistent with theoretical analysis. Research limitations/implications: The empirical evidence is based on a small number of hand-collected data sets of government ownership. Further studies can be expanded to other types of ownership, such as family ownership and financial institutional ownership. Originality/value: This study is the first to theoretically analyse and empirically test the effect of private benefits. In general, this study significantly contributes to the understanding of the effect of large shareholder and corporate governance.

  5. A Balanced Theoretical and Empirical Approach for the Development of a Design Support Tool

    DEFF Research Database (Denmark)

    Jensen, Thomas Aakjær; Hansen, Claus Thorp

    1996-01-01

    The introduction of a new design support system may change the engineering designer's work situation. Therefore, it may not be possible to derive all the functionalities for a design support system solely from empirical studies of manual design work. Alternatively the design support system could ...... system, indicating a proposal for how to balance a theoretical and empirical approach. The result of this research will be utilized in the development of a Designer's Workbench to support the synthesis activity in mechanical design.

  6. Relationships between Unemployment and Economic Growth - the Review (Results of the Theoretical and Empirical Research

    Directory of Open Access Journals (Sweden)

    Katarzyna Nagel

    2015-04-01

    Full Text Available The article aims to discuss the relationship between economic growth and unemployment, as well as the related determinant factors, based on a literature review. The traditional approach presents this relationship through the prism of the effects of creation, capitalization, the pool of savings and creative destruction. Nowadays, an increasing number of researchers attach more importance to the impact of institutional factors, such as minimum and efficiency wages or the flexibility of the labor market. Theoretical and empirical research reveal both the evolution of the relevant views and the lack of consistency among the concepts explaining the relationship between economic growth and unemployment in different regions of the world and in different groups of countries.

  7. Liturgy as Experience - the Psychology of Worship. A Theoretical and Empirical Lacuna

    Directory of Open Access Journals (Sweden)

    Owe Wikström

    1993-01-01

    Full Text Available This article has three aims: (1) to plead for an approach to the study of the liturgy based on the psychology of religion, (2) to draw up a preliminary theoretical model for how the liturgy can be interpreted, and (3) to narrow down the field for further interdisciplinary development and empirical analysis. People undergo more or less strong experiences during and in conjunction with church services. Perhaps people are moved, experience holiness, reverence, fellowship or closeness to the risen Christ. The problem is what factors during the service strengthen such a religious experience. What is the role played by the music, the symbols, the place or building where the service is held, the number of participants and the liturgical event?

  8. For a new dialogue between theoretical and empirical studies in evo-devo

    Directory of Open Access Journals (Sweden)

    Giuseppe Fusco

    2015-08-01

    Full Text Available Despite its potentially broad scope, current evo-devo research is largely dominated by empirical developmental studies, whereas a comparably small role is played by theoretical research. I argue that this represents an obstacle to a wider appreciation of evo-devo and its integration within a more comprehensive evolutionary theory, and that this situation is causally linked to a limited exchange between theoretical and experimental studies in evo-devo. I discuss some features of current theoretical work in evo-devo, highlighting some possible impediments to an effective dialogue with experimental studies. Finally, I advance two suggestions for enhancing fruitful cross-fertilization between theoretical and empirical studies in evo-devo: (i) to broaden the scope of evo-devo beyond its current conceptualization, teaming up with other variational approaches to the study of evolution, and (ii) to develop more effective forms of scientific interaction and communication.

  9. Emotions and Motivation in Mathematics Education: Theoretical Considerations and Empirical Contributions

    Science.gov (United States)

    Schukajlow, Stanislaw; Rakoczy, K.; Pekrun, R.

    2017-01-01

    Emotions and motivation are important prerequisites, mediators, and outcomes of learning and achievement. In this article, we first review major theoretical approaches and empirical findings in research on students' emotions and motivation in mathematics, including a discussion of how classroom instruction can support emotions and motivation.…

  10. Theoretical and empirical approaches to using films as a means to increase communication efficiency.

    Directory of Open Access Journals (Sweden)

    Kiselnikova, N.V.

    2016-07-01

    Full Text Available The theoretical framework of this analytic study is based on studies in the field of film perception. Films are considered as a communicative system that is encrypted in an ordered series of shots, with decoding proceeding during perception. The shots are the elements of a cinematic message that must be “read” by the viewer. The objective of this work is to analyze the existing theoretical approaches to using films in psychotherapy and education. An original approach to film therapy, based on teaching clients to use new communicative sets and psychotherapeutic patterns through watching films, is presented. The article specifies the main points emphasized in theories of film therapy and education. It considers the specifics of film therapy in the process of increasing the effectiveness of communication, and discusses the advantages and limitations of the proposed method. The contemporary forms of film therapy and the formats of cinema clubs are criticized. The theoretical assumptions and empirical research that could serve as a basis for a method of developing effective communication by means of films are discussed. Our studies demonstrate that the use of film therapy must include an educational stage to achieve more effective and stable results. This means teaching viewers how to recognize certain psychotherapeutic and communicative patterns in the material of films, practicing the skill of finding as many examples as possible for each pattern, and transferring the acquired schemes of analyzing and recognizing patterns into one's own life circumstances. The four stages of the film therapeutic process, as well as the effects that are achieved at each stage, are described in detail. In conclusion, the conditions under which the use of the film therapy method would be most effective are considered. Various properties of client groups and psychotherapeutic scenarios for using the method of active film therapy are described.

  11. Theoretical and Empirical Review of Asset Pricing Models: A Structural Synthesis

    Directory of Open Access Journals (Sweden)

    Saban Celik

    2012-01-01

    Full Text Available The purpose of this paper is to give a comprehensive theoretical review of asset pricing models, emphasizing static and dynamic versions in line with their empirical investigations. A considerable amount of the financial economics literature is devoted to the concept of asset pricing and its implications. The main task of an asset pricing model can be seen as the way to evaluate the present value of payoffs or cash flows discounted for risk and time lags. The difficulty arising from the discounting process is that the relevant factors that affect the payoffs vary through time, whereas the theoretical framework is still useful for incorporating the changing factors into an asset pricing model. This paper fills the gap in the literature by giving a comprehensive review of the models and evaluating the historical stream of empirical investigations in the form of a structural empirical review.
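
    In its simplest static form, the valuation task described above reduces to a present-value computation over discounted cash flows; the figures below are purely illustrative:

```python
def present_value(cash_flows, discount_rate):
    """Present value of a stream of future cash flows under a constant
    risk-adjusted discount rate; cash_flows[t] arrives at period t+1."""
    return sum(cf / (1 + discount_rate) ** (t + 1)
               for t, cf in enumerate(cash_flows))

# Three annual payoffs of 100 discounted at a 10% risk-adjusted rate
print(round(present_value([100, 100, 100], 0.10), 2))  # → 248.69
```

The models the review surveys differ precisely in how they replace the constant rate with factor-dependent, time-varying discount factors.
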

  12. ADDRESSED IN EMPIRICALLY BASED PSYCHOTHERAPIES

    Directory of Open Access Journals (Sweden)

    Martin J. La Roche

    2017-01-01

    Full Text Available The psychological literature on implicit processes (IP), which encompass an individual's thoughts, actions, and feelings that occur independently of awareness, has expanded over the last two decades. During this same period, the proliferation of empirically based psychotherapies (EBP), with their emphasis on conscious processes, has gained momentum among many mental health professionals. However, the literature on the role of IP in empirically based psychotherapies (EBP) is scarce. The main objective of this paper is to suggest IP findings that can be used to improve the efficacy and effectiveness of EBP. Seven IP findings that may have important applications for EBP are highlighted. Within each of these seven considerations, the impact of IP on the psychotherapeutic process is discussed.

  13. Determinants of Financing Decisions in Innovative Firms: A Review on Theoretical Backgrounds and Empirical Evidence

    Directory of Open Access Journals (Sweden)

    Mihaela Diaconu

    2016-01-01

    Full Text Available We review some of the main aspects highlighted in the literature on financing innovation. The theoretical background related to the distinctive features of innovative firms impacting their financing decisions is reviewed, along with the empirical evidence. The growing literature on the financing of innovation shows that theoretical and empirical findings are not always consistent across the various samples and situations faced by firms, continually generating new findings. We highlight the interaction between financing choices for innovation and the changing internal and external conditions in which firms operate.

  14. Theoretical-empirical model of the steam-water cycle of the power unit

    Directory of Open Access Journals (Sweden)

    Grzegorz Szapajko

    2010-06-01

    Full Text Available The diagnostics of the operation of energy conversion systems is realised as a result of collecting, processing, evaluating and analysing the measurement signals. The result of the analysis is the determination of the process state. It requires the use of thermal process models. Construction of an analytical model with auxiliary empirical functions built in brings satisfying results. The paper presents a theoretical-empirical model of the steam-water cycle. The worked-out mathematical simulation model contains partial models of the turbine, the regenerative heat exchangers and the condenser. Statistical verification of the model is presented.

  15. Quantifying heterogeneity attributable to polythetic diagnostic criteria: theoretical framework and empirical application.

    Science.gov (United States)

    Olbert, Charles M; Gala, Gary J; Tupler, Larry A

    2014-05-01

    Heterogeneity within psychiatric disorders is both theoretically and practically problematic: For many disorders, it is possible for 2 individuals to share very few or even no symptoms in common yet share the same diagnosis. Polythetic diagnostic criteria have long been recognized to contribute to this heterogeneity, yet no unified theoretical understanding of the coherence of symptom criteria sets currently exists. A general framework for analyzing the logical and mathematical structure, coherence, and diversity of Diagnostic and Statistical Manual diagnostic categories (DSM-5 and DSM-IV-TR) is proposed, drawing from combinatorial mathematics, set theory, and information theory. Theoretical application of this framework to 18 diagnostic categories indicates that in most categories, 2 individuals with the same diagnosis may share no symptoms in common, and that any 2 theoretically possible symptom combinations will share on average less than half their symptoms. Application of this framework to 2 large empirical datasets indicates that patients who meet symptom criteria for major depressive disorder and posttraumatic stress disorder tend to share approximately three-fifths of symptoms in common. For both disorders in each of the datasets, pairs of individuals who shared no common symptoms were observed. Any 2 individuals with either diagnosis were unlikely to exhibit identical symptomatology. The theoretical and empirical results stemming from this approach have substantive implications for etiological research into, and measurement of, psychiatric disorders.
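
    The combinatorial framework can be made concrete with the widely cited "at least 5 of 9" rule for major depressive disorder (the sketch ignores the DSM's additional requirement that at least one core symptom be present):

```python
from math import comb

def qualifying_combinations(n_criteria, min_required):
    """Number of distinct symptom profiles that satisfy a polythetic
    'at least m of n' diagnostic rule."""
    return sum(comb(n_criteria, k) for k in range(min_required, n_criteria + 1))

def min_shared_symptoms(n_criteria, min_required):
    """Smallest possible symptom overlap between two individuals who each
    meet the minimum criterion count (pigeonhole bound, floored at zero)."""
    return max(0, 2 * min_required - n_criteria)

# Major depressive disorder: at least 5 of 9 symptom criteria
print(qualifying_combinations(9, 5))  # → 256 qualifying symptom profiles
print(min_shared_symptoms(9, 5))      # → 1 symptom minimally shared
```

The second function makes the article's heterogeneity point explicit: whenever 2·m ≤ n, two diagnosed individuals can share zero symptoms, which is the situation the authors report for most of the 18 categories analyzed.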

  16. Measuring health lifestyles in a comparative analysis: theoretical issues and empirical findings.

    Science.gov (United States)

    Abel, T

    1991-01-01

    The concept of lifestyle bears great potential for research in medical sociology. Yet, weaknesses in current methods have restrained lifestyle research from realizing its full potential. The present focus is on the links between theoretical conceptions and their empirical application. The paper is divided into two parts. The first part provides a discussion of basic theoretical and methodological issues. In particular, selected lines of thought from Max Weber are presented and their usefulness in providing a theoretical frame of reference for health lifestyle research is outlined. Next, a theory-guided definition of the subject matter is introduced and basic problems in empirical applications of theoretical lifestyle concepts are discussed. In its second part, the paper presents findings from comparative lifestyle analyses. Data from the U.S. and West Germany are utilized to explore issues of measurement equivalence and theoretical validity. Factor analyses indicate high conceptual equivalence for new measures of health lifestyle dimensions in both the U.S. and West Germany. Divisive cluster analyses detect three distinct lifestyle groups in both nations. Implications for future lifestyle research are discussed.

  17. A Survey on Agricultural Trade Policies in Bangladesh: theoretical Insights and empirical Evidence

    Directory of Open Access Journals (Sweden)

    Dayal Talukder

    2014-01-01

Full Text Available The purpose of this paper is to review the theoretical insights and empirical evidence on agricultural trade policies and their impacts on Bangladesh's economy, with a view to presenting both the positive and negative effects of trade liberalization. Theoretically, while advocates of trade liberalization argue that free trade is an engine of growth and protection leads to wasteful use of resources, critics argue that openness has its costs and sometimes it could be detrimental to economic development. The empirical evidence in Bangladesh was consistent with the ongoing debate on the effects of trade liberalization on economic development. The evidence remained mixed and loaded with criticisms on the grounds of choice of liberalization determinants, model specifications and methodology, as well as other measurement shortcomings. The review suggests that the literature is inconclusive and outcomes are largely case-specific.

  19. The Theoretical and Empirical Approaches to the Definition of Audit Risk

    Directory of Open Access Journals (Sweden)

    Berezhniy Yevgeniy B.

    2017-12-01

Full Text Available The risk category is one of the key factors in planning the audit and assessing its results. The article is aimed at generalizing the theoretical and empirical approaches to the definition of audit risk and the methods of its reduction. The structure of audit risk was analyzed, and it was found that each researcher approached the structuring of audit risk from a subjective point of view. The author's own model of audit risk is proposed. The basic methods of assessing audit risk are generalized and the theoretical and empirical approaches to its definition are distinguished; it is also noted that any of the given models is suitable for approximate estimation rather than exact calculation of audit risk, as each is accompanied by certain shortcomings.

  20. What is value for food retail chains? Theoretical aspects and empirical findings from Spain

    DEFF Research Database (Denmark)

    Skytte, Hans; Bove, Karsten

It is a well-established fact that creating value for customers (in the eyes of the customers) is a very important source of competitive advantage. No researchers have, however, analysed or defined what retail chains mean by value. In this study, building on a solid theoretical background, we propose a definition of 'retailer value'. Subsequently this concept is used in an empirical study of retail chains in Spain.

  1. Dignity in the care of older people – a review of the theoretical and empirical literature

    Directory of Open Access Journals (Sweden)

    Jones Ian

    2008-07-01

Full Text Available Abstract Background Dignity has become a central concern in UK health policy in relation to older and vulnerable people. The empirical and theoretical literature relating to dignity is extensive and as likely to confound and confuse as to clarify the meaning of dignity for nurses in practice. The aim of this paper is critically to examine the literature and to address the following questions: What does dignity mean? What promotes and diminishes dignity? And how might dignity be operationalised in the care of older people? This paper critically reviews the theoretical and empirical literature relating to dignity and clarifies the meaning and implications of dignity in relation to the care of older people. If nurses are to provide dignified care, clarification is an essential first step. Methods This is a review article, critically examining papers reporting theoretical perspectives and empirical studies relating to dignity. The following databases were searched: Assia, BHI, CINAHL, Social Services Abstracts, IBSS, Web of Knowledge Social Sciences Citation Index and Arts & Humanities Citation Index, together with the location of books and chapters in the philosophy literature. An analytical approach was adopted to the publications reviewed, focusing on the objectives of the review. Results and discussion We review a range of theoretical and empirical accounts of dignity and identify key dignity-promoting factors evident in the literature, including staff attitudes and behaviour; environment; culture of care; and the performance of specific care activities. Although there is scope to learn more about cultural aspects of dignity, we know a good deal about dignity in care in general terms. Conclusion We argue that what is required is to provide sufficient support and education to help nurses understand dignity and adequate resources to operationalise dignity in their everyday practice. Using the themes identified from our review we offer proposals for the direction of

  2. A review of the nurtured heart approach to parenting: evaluation of its theoretical and empirical foundations.

    Science.gov (United States)

    Hektner, Joel M; Brennan, Alison L; Brotherson, Sean E

    2013-09-01

    The Nurtured Heart Approach to parenting (NHA; Glasser & Easley, 2008) is summarized and evaluated in terms of its alignment with current theoretical perspectives and empirical evidence in family studies and developmental science. Originally conceived and promoted as a behavior management approach for parents of difficult children (i.e., with behavior disorders), NHA is increasingly offered as a valuable strategy for parents of any children, despite a lack of published empirical support. Parents using NHA are trained to minimize attention to undesired behaviors, provide positive attention and praise for compliance with rules, help children be successful by scaffolding and shaping desired behavior, and establish a set of clear rules and consequences. Many elements of the approach have strong support in the theoretical and empirical literature; however, some of the assumptions are more questionable, such as that negative child behavior can always be attributed to unintentional positive reinforcement by parents responding with negative attention. On balance, NHA appears to promote effective and validated parenting practices, but its effectiveness now needs to be tested empirically. © FPI, Inc.

  3. 6 essays about auctions: a theoretical and empirical analysis. Application to power markets

    International Nuclear Information System (INIS)

    Lamy, L.

    2007-06-01

This thesis is devoted to a theoretical and empirical analysis of auction mechanisms. Motivated by allocation issues in network industries, in particular by the liberalization of the electricity sector, it focuses on auctions with externalities (either allocative or informational) and on multi-object auctions. After an introduction which provides a survey of the use and the analysis of auctions in power markets, six chapters make up this thesis. The first one considers standard auctions in the Milgrom-Weber model with interdependent valuations when the seller cannot commit not to participate in the auction. The second and third chapters study the combinatorial auction mechanism proposed by Ausubel and Milgrom. The first of these two studies proposes a modification of this format with a final discount stage and clarifies the theoretical status of those formats, in particular the conditions such that truthful reporting is a dominant strategy. Motivated by the robustness issues of the generalizations of the Ausubel-Milgrom and the Vickrey combinatorial auctions to environments with allocative externalities between joint-purchasers, the second one characterizes the buyer-sub-modularity condition in a general model with allocative identity-dependent externalities between purchasers. In a complete information setup, the fourth chapter analyses the optimal design problem when the commitment abilities of the principal are reduced, namely she cannot commit to a simultaneous participation game. The fifth chapter is devoted to the structural analysis of the private value auction model for a single unit when the econometrician cannot observe bidders' identities. The asymmetric independent private value (IPV) model is identified. A multi-step kernel-based estimator is proposed and shown to be asymptotically optimal. Using auctions data for the Anglo-French electricity Interconnector, the last chapter analyses multi-unit ascending auctions through reduced forms. (author)
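
The structural exercise described in the fifth chapter can be illustrated with the canonical two-step estimator it builds on. The sketch below is not the thesis's estimator (which handles unobserved bidder identities and asymmetry); it implements the standard Guerre-Perrigne-Vuong (GPV) procedure for a symmetric IPV first-price auction on simulated data: estimate the bid distribution G and density g nonparametrically, then invert the first-order condition v = b + G(b)/((n − 1)g(b)) to recover each pseudo-value. The bandwidth rule and all simulation parameters are illustrative assumptions.

```python
import math
import random
import statistics

random.seed(7)

# Simulated symmetric IPV first-price auctions: n bidders, values ~ Uniform(0, 1).
# The symmetric equilibrium bid is b = v * (n - 1) / n, so true values are known.
n = 3
bids = [random.random() * (n - 1) / n for _ in range(6000)]

# Rule-of-thumb (Silverman) bandwidth for the Gaussian kernel density estimate.
h = 1.06 * statistics.stdev(bids) * len(bids) ** -0.2

def G(b):
    """Empirical CDF of the observed bids."""
    return sum(x <= b for x in bids) / len(bids)

def g(b):
    """Gaussian kernel density estimate of the bid density."""
    k = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
    return sum(k((b - x) / h) for x in bids) / (len(bids) * h)

def pseudo_value(b):
    """GPV first-order condition: v = b + G(b) / ((n - 1) * g(b))."""
    return b + G(b) / ((n - 1) * g(b))

# With uniform values, a bid of 0.4 rationalizes a true value of 0.6,
# so the recovered pseudo-value should be close to 0.6.
print(round(pseudo_value(0.4), 2))
```

Because the simulated values are uniform, the recovered pseudo-values can be checked against the known inverse bid function v = b·n/(n − 1); in a real application the second step re-estimates the value distribution from the recovered pseudo-values.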

  4. Uncovering curvilinear relationships between conscientiousness and job performance: how theoretically appropriate measurement makes an empirical difference.

    Science.gov (United States)

    Carter, Nathan T; Dalal, Dev K; Boyce, Anthony S; O'Connell, Matthew S; Kung, Mei-Chuan; Delgado, Kristin M

    2014-07-01

    The personality trait of conscientiousness has seen considerable attention from applied psychologists due to its efficacy for predicting job performance across performance dimensions and occupations. However, recent theoretical and empirical developments have questioned the assumption that more conscientiousness always results in better job performance, suggesting a curvilinear link between the 2. Despite these developments, the results of studies directly testing the idea have been mixed. Here, we propose this link has been obscured by another pervasive assumption known as the dominance model of measurement: that higher scores on traditional personality measures always indicate higher levels of conscientiousness. Recent research suggests dominance models show inferior fit to personality test scores as compared to ideal point models that allow for curvilinear relationships between traits and scores. Using data from 2 different samples of job incumbents, we show the rank-order changes that result from using an ideal point model expose a curvilinear link between conscientiousness and job performance 100% of the time, whereas results using dominance models show mixed results, similar to the current state of the literature. Finally, with an independent cross-validation sample, we show that selection based on predicted performance using ideal point scores results in more favorable objective hiring outcomes. Implications for practice and future research are discussed.
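
The contrast between the two measurement models can be sketched directly. Below is a minimal illustration, not the authors' scoring procedure: a dominance item response function (a 2PL logistic) is monotone in the trait, whereas an unfolding (ideal point) function, here a simple squared-distance kernel, peaks where the respondent's trait level matches the item's location and declines on both sides. All parameter values are arbitrary.

```python
import math

def dominance_response(theta, difficulty=0.0, discrimination=1.0):
    """2PL dominance model: endorsement probability rises monotonically with the trait."""
    return 1.0 / (1.0 + math.exp(-discrimination * (theta - difficulty)))

def ideal_point_response(theta, location=0.0, spread=1.0):
    """Unfolding (ideal point) model with a squared-distance kernel: endorsement
    peaks when the respondent's trait level matches the item's location."""
    return math.exp(-((theta - location) ** 2) / (2.0 * spread ** 2))

thetas = [-3.0, -1.5, 0.0, 1.5, 3.0]
dominance = [dominance_response(t) for t in thetas]
ideal = [ideal_point_response(t) for t in thetas]

# Dominance probabilities only increase with theta; ideal point probabilities
# rise and then fall, so a very high trait level can lower item endorsement,
# which is what lets rank orders (and trait-outcome links) become curvilinear.
```

Under the dominance assumption, a respondent far above an item's location always endorses it; under the ideal point assumption they may reject it as "too low" for them, which changes the rank order of scored individuals.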

  5. A theoretical and empirical review of the death-thought accessibility concept in terror management research.

    Science.gov (United States)

    Hayes, Joseph; Schimel, Jeff; Arndt, Jamie; Faucher, Erik H

    2010-09-01

    Terror management theory (TMT) highlights the motivational impact of thoughts of death in various aspects of everyday life. Since its inception in 1986, research on TMT has undergone a slight but significant shift from an almost exclusive focus on the manipulation of thoughts of death to a marked increase in studies that measure the accessibility of death-related cognition. Indeed, the number of death-thought accessibility (DTA) studies in the published literature has grown substantially in recent years. In light of this increasing reliance on the DTA concept, the present article is meant to provide a comprehensive theoretical and empirical review of the literature employing this concept. After discussing the roots of DTA, the authors outline the theoretical refinements to TMT that have accompanied significant research findings associated with the DTA concept. Four distinct categories (mortality salience, death association, anxiety-buffer threat, and dispositional) are derived to organize the reviewed DTA studies, and the theoretical implications of each category are discussed. Finally, a number of lingering empirical and theoretical issues in the DTA literature are discussed with the aim of stimulating and focusing future research on DTA specifically and TMT in general.

  6. The Role of Trait Emotional Intelligence in Academic Performance: Theoretical Overview and Empirical Update.

    Science.gov (United States)

    Perera, Harsha N

    2016-01-01

    Considerable debate still exists among scholars over the role of trait emotional intelligence (TEI) in academic performance. The dominant theoretical position is that TEI should be orthogonal or only weakly related to achievement; yet, there are strong theoretical reasons to believe that TEI plays a key role in performance. The purpose of the current article is to provide (a) an overview of the possible theoretical mechanisms linking TEI with achievement and (b) an update on empirical research examining this relationship. To elucidate these theoretical mechanisms, the overview draws on multiple theories of emotion and regulation, including TEI theory, social-functional accounts of emotion, and expectancy-value and psychobiological model of emotion and regulation. Although these theoretical accounts variously emphasize different variables as focal constructs, when taken together, they provide a comprehensive picture of the possible mechanisms linking TEI with achievement. In this regard, the article redresses the problem of vaguely specified theoretical links currently hampering progress in the field. The article closes with a consideration of directions for future research.

  7. Merged ontology for engineering design: Contrasting empirical and theoretical approaches to develop engineering ontologies

    DEFF Research Database (Denmark)

    Ahmed, Saeema; Storga, M

    2009-01-01

This paper presents a comparison of two previous and separate efforts to develop an ontology in the engineering design domain, together with an ontology proposal from which ontologies for a specific application may be derived. The research contrasts an empirical, user-centered approach to developing the ontology engineering design integrated taxonomies (EDIT) with a theoretical approach in which concepts and relations are elicited from engineering design theories ontology (DO). The limitations and advantages of each approach are discussed. The research methodology adopted is to map...

  8. THE LEGAL INDEBTEDNESS CAPACITY OF ROMANIAN LOCAL GOVERNMENTS - THEORETICAL AND EMPIRICAL EVIDENCES

    Directory of Open Access Journals (Sweden)

    Bilan Irina

    2011-12-01

Full Text Available The factual, not only formal, capacity of local governments to appeal to borrowed resources is, considering the current conditions, a prerequisite for ensuring the economic and social development of local communities. In this paper we intend to position the main theoretical and empirical evidence on local governments' indebtedness capacity, mainly focusing on its sizing according to the Romanian regulatory framework. With respect to previous research, the issue approached is one of great interest as it has not been, in the Romanian literature on local public finances, subject to a separate analysis of proportions. The undertaken analysis comprises a quantitative dimension, based on processed data from the consolidated general budget of Romanian local governments for 2007-2009, in permanent conjunction with monitoring and analysis of the involved qualitative aspects. To ensure the relevance of the research results, the analysis undertaken refers to the legal framework in force throughout the considered period of time, without involving the legislative changes operated in mid-2010. The main conclusions drawn from our analysis indicate that, considering the current Romanian socio-economic environment, under the impact of specific factors of different nature, the legal indebtedness capacity is far from being fully used, and thus from bringing its benefits to local community development. This conclusion is valid from a global perspective as well as for different types of local communities. This appears to be inconsistent with the permanently claimed need to fund important local public investments, mainly in infrastructure, indicating, despite the high legal indebtedness capacity, the lack of factual access to borrowed resources.
We therefore suggest introducing the concept of effective indebtedness capacity, the result of a correlation, particularized for different local governments, between legal indebtedness capacity and the manifestation of several factors

  9. Internet governance and global self regulation: theoretical and empirical building blocks for a general theory of self regulation

    NARCIS (Netherlands)

    Vey Mestdagh, C.; Rijgersberg, R.

    2010-01-01

    The following exposition sets out to identify the basic theoretical and empirical building blocks for a general theory of self-regulation. It uses the Internet as an empirical basis since its global reach and technical characteristics create interdependencies between actors that transcend national

  10. Common liability to addiction and “gateway hypothesis”: Theoretical, empirical and evolutionary perspective

    Science.gov (United States)

    Vanyukov, Michael M.; Tarter, Ralph E.; Kirillova, Galina P.; Kirisci, Levent; Reynolds, Maureen D.; Kreek, Mary Jeanne; Conway, Kevin P.; Maher, Brion S.; Iacono, William G.; Bierut, Laura; Neale, Michael C.; Clark, Duncan B.; Ridenour, Ty A.

    2013-01-01

    Background Two competing concepts address the development of involvement with psychoactive substances: the “gateway hypothesis” (GH) and common liability to addiction (CLA). Method The literature on theoretical foundations and empirical findings related to both concepts is reviewed. Results The data suggest that drug use initiation sequencing, the core GH element, is variable and opportunistic rather than uniform and developmentally deterministic. The association between risks for use of different substances, if any, can be more readily explained by common underpinnings than by specific staging. In contrast, the CLA concept is grounded in genetic theory and supported by data identifying common sources of variation in the risk for specific addictions. This commonality has identifiable neurobiological substrate and plausible evolutionary explanations. Conclusions Whereas the “gateway” hypothesis does not specify mechanistic connections between “stages”, and does not extend to the risks for addictions, the concept of common liability to addictions incorporates sequencing of drug use initiation as well as extends to related addictions and their severity, provides a parsimonious explanation of substance use and addiction co-occurrence, and establishes a theoretical and empirical foundation to research in etiology, quantitative risk and severity measurement, as well as targeted non-drug-specific prevention and early intervention. PMID:22261179

  11. The Theoretical and Empirical Basis for Meditation as an Intervention for PTSD

    Science.gov (United States)

    Lang, Ariel J.; Strauss, Jennifer L.; Bomyea, Jessica; Bormann, Jill E.; Hickman, Steven D.; Good, Raquel C.; Essex, Michael

    2012-01-01

    In spite of the existence of good empirically supported treatments for posttraumatic stress disorder (PTSD), consumers and providers continue to ask for more options for managing this common and often chronic condition. Meditation-based approaches are being widely implemented, but there is minimal research rigorously assessing their effectiveness.…

  12. Development and performance of self-managing work teams : a theoretical and empirical examination

    NARCIS (Netherlands)

    Kuipers, B.J.; Stoker, J.I.

    2009-01-01

    Several theories have been developed that prescribe the team development of self-managing work teams (SMWTs). Some of these have led to models with successive linear developmental phases. However, both the theory and the empirical data show little support for these models. Based on an extensive

  13. Evolution of the empirical and theoretical foundations of eyewitness identification reform.

    Science.gov (United States)

    Clark, Steven E; Moreland, Molly B; Gronlund, Scott D

    2014-04-01

    Scientists in many disciplines have begun to raise questions about the evolution of research findings over time (Ioannidis in Epidemiology, 19, 640-648, 2008; Jennions & Møller in Proceedings of the Royal Society, Biological Sciences, 269, 43-48, 2002; Mullen, Muellerleile, & Bryan in Personality and Social Psychology Bulletin, 27, 1450-1462, 2001; Schooler in Nature, 470, 437, 2011), since many phenomena exhibit decline effects-reductions in the magnitudes of effect sizes as empirical evidence accumulates. The present article examines empirical and theoretical evolution in eyewitness identification research. For decades, the field has held that there are identification procedures that, if implemented by law enforcement, would increase eyewitness accuracy, either by reducing false identifications, with little or no change in correct identifications, or by increasing correct identifications, with little or no change in false identifications. Despite the durability of this no-cost view, it is unambiguously contradicted by data (Clark in Perspectives on Psychological Science, 7, 238-259, 2012a; Clark & Godfrey in Psychonomic Bulletin & Review, 16, 22-42, 2009; Clark, Moreland, & Rush, 2013; Palmer & Brewer in Law and Human Behavior, 36, 247-255, 2012), raising questions as to how the no-cost view became well-accepted and endured for so long. Our analyses suggest that (1) seminal studies produced, or were interpreted as having produced, the no-cost pattern of results; (2) a compelling theory was developed that appeared to account for the no-cost pattern; (3) empirical results changed over the years, and subsequent studies did not reliably replicate the no-cost pattern; and (4) the no-cost view survived despite the accumulation of contradictory empirical evidence. Theories of memory that were ruled out by early data now appear to be supported by data, and the theory developed to account for early data now appears to be incorrect.

  14. When complexity science meets implementation science: a theoretical and empirical analysis of systems change.

    Science.gov (United States)

    Braithwaite, Jeffrey; Churruca, Kate; Long, Janet C; Ellis, Louise A; Herkes, Jessica

    2018-04-30

    Implementation science has a core aim - to get evidence into practice. Early in the evidence-based medicine movement, this task was construed in linear terms, wherein the knowledge pipeline moved from evidence created in the laboratory through to clinical trials and, finally, via new tests, drugs, equipment, or procedures, into clinical practice. We now know that this straight-line thinking was naïve at best, and little more than an idealization, with multiple fractures appearing in the pipeline. The knowledge pipeline derives from a mechanistic and linear approach to science, which, while delivering huge advances in medicine over the last two centuries, is limited in its application to complex social systems such as healthcare. Instead, complexity science, a theoretical approach to understanding interconnections among agents and how they give rise to emergent, dynamic, systems-level behaviors, represents an increasingly useful conceptual framework for change. Herein, we discuss what implementation science can learn from complexity science, and tease out some of the properties of healthcare systems that enable or constrain the goals we have for better, more effective, more evidence-based care. Two Australian examples, one largely top-down, predicated on applying new standards across the country, and the other largely bottom-up, adopting medical emergency teams in over 200 hospitals, provide empirical support for a complexity-informed approach to implementation. The key lessons are that change can be stimulated in many ways, but a triggering mechanism is needed, such as legislation or widespread stakeholder agreement; that feedback loops are crucial to continue change momentum; that extended sweeps of time are involved, typically much longer than believed at the outset; and that taking a systems-informed, complexity approach, having regard for existing networks and socio-technical characteristics, is beneficial. Construing healthcare as a complex adaptive system

  15. Physical violence and psychological abuse among siblings :a theoretical and empirical analysis

    OpenAIRE

    Hoffman, Kristi L.

    1996-01-01

This study develops and evaluates a theoretical model based on social learning, conflict, and feminist perspectives to explain teenage sibling physical violence and psychological abuse. Using regression analysis and data from 796 young adults, considerable support is found for all three theoretical approaches, suggesting that an integrated model best predicts acts of violence and abuse among siblings. For physical violence, males and brothers had significantly higher rates. Spousal...

  16. Quantifying multi-dimensional functional trait spaces of trees: empirical versus theoretical approaches

    Science.gov (United States)

    Ogle, K.; Fell, M.; Barber, J. J.

    2016-12-01

    Empirical, field studies of plant functional traits have revealed important trade-offs among pairs or triplets of traits, such as the leaf (LES) and wood (WES) economics spectra. Trade-offs include correlations between leaf longevity (LL) vs specific leaf area (SLA), LL vs mass-specific leaf respiration rate (RmL), SLA vs RmL, and resistance to breakage vs wood density. Ordination analyses (e.g., PCA) show groupings of traits that tend to align with different life-history strategies or taxonomic groups. It is unclear, however, what underlies such trade-offs and emergent spectra. Do they arise from inherent physiological constraints on growth, or are they more reflective of environmental filtering? The relative importance of these mechanisms has implications for predicting biogeochemical cycling, which is influenced by trait distributions of the plant community. We address this question using an individual-based model of tree growth (ACGCA) to quantify the theoretical trait space of trees that emerges from physiological constraints. ACGCA's inputs include 32 physiological, anatomical, and allometric traits, many of which are related to the LES and WES. We fit ACGCA to 1.6 million USFS FIA observations of tree diameters and heights to obtain vectors of trait values that produce realistic growth, and we explored the structure of this trait space. No notable correlations emerged among the 496 trait pairs, but stepwise regressions revealed complicated multi-variate structure: e.g., relationships between pairs of traits (e.g., RmL and SLA) are governed by other traits (e.g., LL, radiation-use efficiency [RUE]). We also simulated growth under various canopy gap scenarios that impose varying degrees of environmental filtering to explore the multi-dimensional trait space (hypervolume) of trees that died vs survived. The centroid and volume of the hypervolumes differed among dead and live trees, especially under gap conditions leading to low mortality. 
Traits most predictive

  17. Ignorance, Vulnerability and the Occurrence of "Radical Surprises": Theoretical Reflections and Empirical Findings

    Science.gov (United States)

    Kuhlicke, C.

    2009-04-01

    , that the flood was far beyond people's power of imagination (nescience). The reason therefore is that previous to the flood an institutionalized space of experience and horizon of expectation existed, which did not consider the possibility that the "stability" of the river is artificially created by engineering achievements to reduce its naturally given variability. Based on the empirical findings and the theoretical reasoning overall conclusions are drawn and implications for flood risk management under conditions global environmental change are outlined.

  18. Why autobiographical memories for traumatic and emotional events might differ: theoretical arguments and empirical evidence.

    Science.gov (United States)

    Sotgiu, Igor; Rusconi, Maria Luisa

    2014-01-01

The authors review five arguments supporting the hypothesis that memories for traumatic and nontraumatic emotional events should be considered as qualitatively different recollections. The first argument considers the objective features of traumatic and emotional events and their possible influence on the formation of memories for these events. The second argument assumes that traumatic memories are distinguished from emotional ones because trauma exposure is often associated with the development of psychological disorders involving memory disturbances. The third argument is that traumatic experiences are more likely than emotional experiences to be forgotten and recovered. The fourth argument concerns the possibility that emotional memories are socially shared more frequently than traumatic memories. A fifth argument suggests that trauma exposure may impair selected brain systems implicated in memory functions. Theoretical and empirical evidence supporting these claims is reviewed. In the conclusions, the authors illustrate future research directions and discuss some conceptual issues related to the definitions of traumatic event currently employed by memory researchers.

  19. The Influence of Education and Socialization on Radicalization: An Exploration of Theoretical Presumptions and Empirical Research.

    Science.gov (United States)

    Pels, Trees; de Ruyter, Doret J

    2012-06-01

BACKGROUND AND OBJECTIVE: Research into radicalization does not pay much attention to education. This is remarkable and possibly misses an important influence on the process of radicalization. Therefore this article sets out to explore the relation between education on the one hand and the onset or prevention of radicalization on the other hand. METHOD: This article is a theoretical literature review. It has analyzed empirical studies, mainly from European countries, about the educational aims, content and style of Muslim parents and parents with (extreme) right-wing sympathies. RESULTS: Research examining similarity in right-wing sympathies between parents and children yields mixed results, but studies among adolescents point to a significant concordance. Research also showed that authoritarian parenting may play a significant role. Similar research among Muslim families was not found. While raising children with distrust and an authoritarian style are prevalent, the impact on adolescents has not been investigated. The empirical literature we reviewed does not give sufficient evidence to conclude that the democratic ideal in, and an authoritative style of, education are conducive to the development of a democratic attitude. CONCLUSION: There is a knowledge gap with regard to the influence of education on the onset or the prevention of radicalization. Schools and families are underappreciated sources of informal social control and social capital and therefore the gap should be closed. If there is a better understanding of the effect of education, policy as well as interventions can be developed to assist parents and teachers in preventing radicalization.

  20. Why do electricity utilities cooperate with coal suppliers? A theoretical and empirical analysis from China

    International Nuclear Information System (INIS)

    Zhao Xiaoli; Lyon, Thomas P.; Wang Feng; Song Cui

    2012-01-01

    The asymmetry of Chinese coal and electricity pricing reforms leads to serious conflict between coal suppliers and electricity utilities. Electricity utilities experience significant losses as a result of conflict: severe coal price fluctuations, and uncertainty in the quantity and quality of coal supplies. This paper explores whether establishing cooperative relationships between coal suppliers and electricity utilities can resolve conflicts. We begin with a discussion of the history of coal and electricity pricing reforms, and then conduct a theoretical analysis of relational contracting to provide a new perspective on the drivers behind the establishment of cooperative relationships between the two parties. Finally, we empirically investigate the role of cooperative relationships and the establishment of mine-mouth power plants on the performance of electricity utilities. The results show that relational contracting between electricity utilities and coal suppliers improves the market performance of electricity utilities; meanwhile, the transportation cost savings derived from mine-mouth power plants are of importance in improving the performance of electricity utilities. - Highlights: ► We discuss the history of coal and electricity pricing reforms. ► The roots of conflicts between electricity and coal firms are presented. ► We conduct a theoretical analysis of relational contracting. ► The role of mine-mouth power plants on the performance of power firms is examined.

  1. Theoretical analyses of superconductivity in iron based ...

    African Journals Online (AJOL)

    This paper focuses on the theoretical analysis of superconductivity in the iron-based superconductor Ba1−xKxFe2As2. After reviewing the current findings on this system, we suggest that a combined phonon-exciton mechanism gives the right order of the superconducting transition temperature (TC) for Ba1−xKxFe2As2. By developing ...

  2. RETAIL STORE IMAGE: A COMPARISON AMONG THEORETICAL AND EMPIRICAL DIMENSIONS IN A BRAZILIAN STUDY

    Directory of Open Access Journals (Sweden)

    Janaina de Moura Engracia Giraldi

    2008-01-01

    Full Text Available The retail store can be the key success factor, the competitive advantage of a retail company. An important element of retail strategy is the store image: the total sum of customers' perceptions about a store. The present paper compares the theoretical and empirical dimensions of retail store image in a Brazilian study. A quantitative study was used, and the data collected were analyzed with the factor analysis technique in order to identify the factors underlying retail store image. In conclusion, it was observed that the way respondents evaluate the image of a specific supermarket in Brazil is simpler than what was foreseen by theory, with nine factors representing the following store image dimensions: quality, price, after-sales service, advertising, clientele, assortment, convenience, atmosphere and services. An important practical contribution of the present study is the development of a simpler scale that retailers can feasibly use to obtain data on their perceived image.

  3. Safety climate and injuries: an examination of theoretical and empirical relationships.

    Science.gov (United States)

    Beus, Jeremy M; Payne, Stephanie C; Bergman, Mindy E; Arthur, Winfred

    2010-07-01

    Our purpose in this study was to meta-analytically address several theoretical and empirical issues regarding the relationships between safety climate and injuries. First, we distinguished between extant safety climate-->injury and injury-->safety climate relationships for both organizational and psychological safety climates. Second, we examined several potential moderators of these relationships. Meta-analyses revealed that injuries were more predictive of organizational safety climate than safety climate was predictive of injuries. Additionally, the injury-->safety climate relationship was stronger for organizational climate than for psychological climate. Moderator analyses revealed that the degree of content contamination in safety climate measures inflated effects, whereas measurement deficiency attenuated effects. Additionally, moderator analyses showed that as the time period over which injuries were assessed lengthened, the safety climate-->injury relationship was attenuated. Supplemental meta-analyses of specific safety climate dimensions also revealed that perceived management commitment to safety is the most robust predictor of occupational injuries. Contrary to expectations, the operationalization of injuries did not meaningfully moderate safety climate-injury relationships. Implications and recommendations for future research and practice are discussed.

  4. How beauty works. Theoretical mechanisms and two empirical applications on students' evaluation of teaching.

    Science.gov (United States)

    Wolbring, Tobias; Riordan, Patrick

    2016-05-01

    Plenty of studies show that the physical appearance of a person affects a variety of outcomes in everyday life. However, due to an incomplete theoretical explication and empirical problems in disentangling different beauty effects, it is unclear which mechanisms are at work. To clarify how beauty works we present explanations from evolutionary theory and expectation states theory and show where both perspectives differ and where interlinkage appears promising. Using students' evaluations of teaching we find observational and experimental evidence for the different causal pathways of physical attractiveness. First, independent raters strongly agree over the physical attractiveness of a person. Second, attractive instructors receive better student ratings. Third, students attend classes of attractive instructors more frequently - even after controlling for teaching quality. Fourth, we find no evidence that attractiveness effects become stronger if rater and ratee are of the opposite sex. Finally, the beauty premium turns into a penalty if an attractive instructor falls short of students' expectations. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Evaluation of theoretical and empirical water vapor sorption isotherm models for soils

    Science.gov (United States)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per; de Jonge, Lis W.

    2016-01-01

    The mathematical characterization of water vapor sorption isotherms of soils is crucial for modeling processes such as volatilization of pesticides and diffusive and convective water vapor transport. Although numerous physically based and empirical models were previously proposed to describe sorption isotherms of building materials, food, and other industrial products, knowledge about the applicability of these functions for soils is noticeably lacking. We present an evaluation of nine models for characterizing adsorption/desorption isotherms for a water activity range from 0.03 to 0.93 based on measured data of 207 soils with widely varying textures, organic carbon contents, and clay mineralogy. In addition, the potential applicability of the models for prediction of sorption isotherms from known clay content was investigated. While in general, all investigated models described measured adsorption and desorption isotherms reasonably well, distinct differences were observed between physical and empirical models and due to the different degrees of freedom of the model equations. There were also considerable differences in model performance for adsorption and desorption data. While regression analysis relating model parameters and clay content and subsequent model application for prediction of measured isotherms showed promise for the majority of investigated soils, for soils with distinct kaolinitic and smectitic clay mineralogy predicted isotherms did not closely match the measurements.
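The abstract does not name the nine isotherm models it evaluates, so as an illustrative sketch of the fitting exercise it describes, the code below fits one widely used physically based isotherm, the GAB (Guggenheim-Anderson-de Boer) model, to synthetic adsorption data over the same 0.03-0.93 water activity range. All parameter values and the noise level are hypothetical, not taken from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def gab(aw, wm, C, k):
    # GAB isotherm: gravimetric water content as a function of water activity aw;
    # wm is the monolayer water content, C and k are energy constants.
    return wm * C * k * aw / ((1 - k * aw) * (1 - k * aw + C * k * aw))

aw = np.linspace(0.03, 0.93, 25)                  # activity range used in the study
rng = np.random.default_rng(1)
w_true = gab(aw, 0.05, 10.0, 0.8)                 # hypothetical "true" parameters
w_obs = w_true + rng.normal(0.0, 1e-4, aw.size)   # synthetic measurements

# Least-squares fit recovers the three parameters from the noisy isotherm
popt, _ = curve_fit(gab, aw, w_obs, p0=[0.04, 5.0, 0.7])
print(np.round(popt, 2))
```

The same pattern (a model function plus `curve_fit`) extends to any of the other isotherm equations; regressing the fitted parameters against clay content would then mirror the prediction step the abstract reports.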

  6. The use of theoretical and empirical knowledge in the production of explanations and arguments in an inquiry biology activity

    Directory of Open Access Journals (Sweden)

    Maíra Batistoni e Silva

    2017-08-01

    Full Text Available Agreeing that scientific literacy is the purpose of science education, and with recent propositions that achieving it requires engaging students in the practices of scientific culture, this study analyzes the production of explanations and arguments in an inquiry-based teaching activity in order to characterize students' mobilization of theoretical and empirical knowledge when engaging in these practices. Analyzing the scientific reports written by students (14-15 years old) after an inquiry activity on population dynamics, we highlight the importance of empirical knowledge about the experimental context as a repertoire for the construction of explanations, especially when students deal with anomalous data. This knowledge was also important for the production of valid arguments, since most of the justifications were empirical, regardless of whether or not the data were in accordance with the explanatory model already known. These results reinforce the importance of students' engagement in inquiry activities, as already defended by different authors in this research area, and indicate that the inquiry practice enabled engagement in epistemic practices, since knowledge about the experimental conditions and the procedures of data collection provided a repertoire for the production of explanations and arguments. Finally, we discuss the relevance of this research to the field of biology teaching, defending the promotion of inquiry activities with an experimental approach as an opportunity to integrate conceptual and epistemic objectives and to overcome the difficulties generated by the specificities of this area of knowledge relative to the other natural sciences.

  7. Box-Cox Test: the theoretical justification and US-China empirical study

    Directory of Open Access Journals (Sweden)

    Tam Bang Vu

    2011-01-01

    Full Text Available In econometrics, the derivation of a theoretical model sometimes leads to two econometric models, each of which can be considered justified by its own approximation approach. Hence, the choice between the two hinges on applied econometric tools. In this paper, the authors develop a theoretical consumer-maximization model to measure the flow of expenditures on durables, in which depreciation is added to the classical econometric model. The proposed model was formulated in both linear and logarithmic forms, and Box-Cox tests were used to choose the more appropriate of the two. The model was then applied to historical data from the U.S. and China for a comparative study, and the results are discussed.
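As a sketch of the kind of specification choice the abstract describes, the Box-Cox profile log-likelihood can be compared at λ = 1 (the linear form) and λ = 0 (the logarithmic form); the series below is hypothetical, generated with multiplicative errors so that the log form should win:

```python
import numpy as np
from scipy import stats

# Hypothetical expenditure series with multiplicative (log-normal) errors,
# standing in for the durables-expenditure data the paper analyzes.
rng = np.random.default_rng(42)
y = np.exp(1.0 + rng.normal(0.0, 0.5, size=500))

# Box-Cox profile log-likelihood at the two candidate transformations:
# lambda = 1 corresponds to the linear model, lambda = 0 to the log model.
llf_linear = stats.boxcox_llf(1.0, y)
llf_log = stats.boxcox_llf(0.0, y)
print(llf_log > llf_linear)  # True: the log specification fits better here
```

In practice the test is run on the regression residual scale rather than the raw series, but the λ = 0 versus λ = 1 comparison is the same.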

  8. Solving theoretical and empirical conundrums in international strategy research by matching foreign entry mode choices and performance

    NARCIS (Netherlands)

    Martin, Xavier

    2013-01-01

    Several theoretical and empirical developments in the literature on foreign entry mode and performance, and on (international) strategy more generally, were influenced or prefigured by Brouthers’ (2002) JIBS Decade Award winning paper. Regarding theory, Brouthers is an archetype of the integration

  9. A Socio-Cultural Model Based on Empirical Data of Cultural and Social Relationship

    DEFF Research Database (Denmark)

    Lipi, Afia Akhter; Nakano, Yukiko; Rehm, Matthias

    2010-01-01

    The goal of this paper is to integrate culture and social relationship as computational terms in an embodied conversational agent system by employing empirical and theoretical approaches. We propose a parameter-based model that predicts nonverbal expressions appropriate for specific cultures...... in different social relationships. So, first, we introduce the theories of social and cultural characteristics. Then, we performed a corpus analysis of human interaction in two cultures in two different social situations and extracted empirical data, and finally, by integrating socio-cultural characteristics...... with empirical data, we establish a parameterized network model that generates culture-specific non-verbal expressions in different social relationships....

  10. Empirically Based Myths: Astrology, Biorhythms, and ATIs.

    Science.gov (United States)

    Ragsdale, Ronald G.

    1980-01-01

    A myth may have an empirical basis through chance occurrence; perhaps Aptitude Treatment Interactions (ATIs) are in this category. While ATIs have great utility in describing, planning, and implementing instruction, few disordinal interactions have been found. Article suggests narrowing of ATI research with replications and estimates of effect…

  11. PHILOSOPHICAL VALIDITY, THEORETICAL, NORMATIVE AND EMPIRICAL PARADIGM OF GENERAL PRINCIPLES OF GOOD GOVERNANCE (AUPB AS A REVIEW OF PRESIDENTIAL IMPEACHMENT

    Directory of Open Access Journals (Sweden)

    Nadir Nadir

    2017-03-01

    Full Text Available The philosophical validity of the General Principles of Good Governance (AUPB) as a basis for reviewing presidential impeachment lies in the fact that the AUPB contain ethical normative values used as the foundation of good, clean and respectable governance, and moreover complement the shortcomings and ambiguities in the law. Technically, the application of the AUPB by the judges of the Constitutional Court (MK-RI) can be approached through inductive and deductive legal reasoning. The judges of the Constitutional Court apply the AUPB deductively at first: special rules focused on a certain field of law are deduced from basic rules, then into substantive rules, and then into rules for cases. The judge then applies the rules of the case to the concrete case, because the judges of the Constitutional Court act as kholifah fil'ardi, the representatives of God on earth, to uphold law and justice. Theoretically, the AUPB is valid because the judge, under ius curia novit, performs legal discovery (rechtsvinding) as a verdict maker. Empirically, the AUPB is valid, as can be seen from the impeachment case against the President of the United States, William Jefferson Clinton, on suspicion of "abominably act" (misdemeanors). Additionally, the AUPB has been tested empirically through jurisprudence since the Amtenarenwet 1929 officially applied on March 1, 1933: the verdict of the Centrale Raad van Beroep of June 22, 1933, and the jurisprudence of the Hoge Raad of November 13, 1936, as well as the Hoge Raad jurisprudence of 1919. The normative validity rests on the leading legal doctrine that the AUPB are positioned as unwritten law that must be obeyed by the government, and are considered part of positive law. Moreover, in Indonesia the AUPB are embodied in various legislations, even though their name remains as a principle.

  12. Tax design-tax evasion relationship in Serbia: New empirical approach to standard theoretical model

    Directory of Open Access Journals (Sweden)

    Ranđelović Saša

    2015-01-01

    Full Text Available This paper provides evidence on the impact of changes in income tax rates and in the degree of tax progressivity on the scale of labour tax evasion in Serbia, using a tax-benefit microsimulation model and econometric methods on 2007 Living Standard Measurement Survey data. The empirical analysis is based on the novel assumption that an individual's tax evasion decision depends on a change in disposable income, which is captured by the variation in their Effective Marginal Tax Rate (EMTR, rather than on a change in after-tax income. The results suggest that the elasticity of tax evasion to the EMTR equals -0.3, confirming Yitzhaki's theory, while the propensity to evade is decreasing in the level of wages and increasing in the level of self-employment income. The results also show that the introduction of revenue-neutral, progressive taxation of labour income would increase labour tax evasion by 1 percentage point.

  13. PWR surveillance based on correspondence between empirical models and physical

    International Nuclear Information System (INIS)

    Zwingelstein, G.; Upadhyaya, B.R.; Kerlin, T.W.

    1976-01-01

    An on-line surveillance method based on the correspondence between empirical models and physical models is proposed for pressurized water reactors. Two types of empirical models are considered, as well as the mathematical models defining the correspondence between the physical and empirical parameters. The efficiency of this method is illustrated for the surveillance of the Doppler coefficient for Oconee I (an 886 MWe PWR) [fr

  14. The Safety Culture Enactment Questionnaire (SCEQ): Theoretical model and empirical validation.

    Science.gov (United States)

    de Castro, Borja López; Gracia, Francisco J; Tomás, Inés; Peiró, José M

    2017-06-01

    This paper presents the Safety Culture Enactment Questionnaire (SCEQ), designed to assess the degree to which safety is an enacted value in the day-to-day running of nuclear power plants (NPPs). The SCEQ is based on a theoretical safety culture model that is manifested in three fundamental components of the functioning and operation of any organization: strategic decisions, human resources practices, and daily activities and behaviors. The extent to which the importance of safety is enacted in each of these three components provides information about the pervasiveness of the safety culture in the NPP. To validate the SCEQ and the model on which it is based, two separate studies were carried out with data collection in 2008 and 2014, respectively. In Study 1, the SCEQ was administered to the employees of two Spanish NPPs (N=533) belonging to the same company. Participants in Study 2 included 598 employees from the same NPPs, who completed the SCEQ and other questionnaires measuring different safety outcomes (safety climate, safety satisfaction, job satisfaction and risky behaviors). Study 1 comprised item formulation and examination of the factorial structure and reliability of the SCEQ. Study 2 tested internal consistency and provided evidence of factorial validity, validity based on relationships with other variables, and discriminant validity between the SCEQ and safety climate. Exploratory Factor Analysis (EFA) carried out in Study 1 revealed a three-factor solution corresponding to the three components of the theoretical model. Reliability analyses showed strong internal consistency for the three scales of the SCEQ, and each of the 21 items on the questionnaire contributed to the homogeneity of its theoretically developed scale. Confirmatory Factor Analysis (CFA) carried out in Study 2 supported the internal structure of the SCEQ; internal consistency of the scales was also supported. Furthermore, the three scales of the SCEQ showed the expected correlation

  15. Patients' Acceptance of Smartphone Health Technology for Chronic Disease Management: A Theoretical Model and Empirical Test.

    Science.gov (United States)

    Dou, Kaili; Yu, Ping; Deng, Ning; Liu, Fang; Guan, YingPing; Li, Zhenye; Ji, Yumeng; Du, Ningkai; Lu, Xudong; Duan, Huilong

    2017-12-06

    Chronic disease patients often face multiple challenges from difficult comorbidities. Smartphone health technology can be used to help them manage their conditions only if they accept and use the technology. The aim of this study was to develop and test a theoretical model to predict and explain the factors influencing patients' acceptance of smartphone health technology for chronic disease management. Multiple theories and factors that may influence patients' acceptance of smartphone health technology have been reviewed. A hybrid theoretical model was built based on the technology acceptance model, dual-factor model, health belief model, and the factors identified from interviews that might influence patients' acceptance of smartphone health technology for chronic disease management. Data were collected from patient questionnaire surveys and computer log records about 157 hypertensive patients' actual use of a smartphone health app. The partial least square method was used to test the theoretical model. The model accounted for .412 of the variance in patients' intention to adopt the smartphone health technology. Intention to use accounted for .111 of the variance in actual use and had a significant weak relationship with the latter. Perceived ease of use was affected by patients' smartphone usage experience, relationship with doctor, and self-efficacy. Although without a significant effect on intention to use, perceived ease of use had a significant positive influence on perceived usefulness. Relationship with doctor and perceived health threat had significant positive effects on perceived usefulness, countering the negative influence of resistance to change. Perceived usefulness, perceived health threat, and resistance to change significantly predicted patients' intentions to use the technology. Age and gender had no significant influence on patients' acceptance of smartphone technology. 
The study also confirmed the positive relationship between intention to use

  16. Patients’ Acceptance of Smartphone Health Technology for Chronic Disease Management: A Theoretical Model and Empirical Test

    Science.gov (United States)

    Dou, Kaili; Yu, Ping; Liu, Fang; Guan, YingPing; Li, Zhenye; Ji, Yumeng; Du, Ningkai; Lu, Xudong; Duan, Huilong

    2017-01-01

    Background Chronic disease patients often face multiple challenges from difficult comorbidities. Smartphone health technology can be used to help them manage their conditions only if they accept and use the technology. Objective The aim of this study was to develop and test a theoretical model to predict and explain the factors influencing patients’ acceptance of smartphone health technology for chronic disease management. Methods Multiple theories and factors that may influence patients’ acceptance of smartphone health technology have been reviewed. A hybrid theoretical model was built based on the technology acceptance model, dual-factor model, health belief model, and the factors identified from interviews that might influence patients’ acceptance of smartphone health technology for chronic disease management. Data were collected from patient questionnaire surveys and computer log records about 157 hypertensive patients’ actual use of a smartphone health app. The partial least square method was used to test the theoretical model. Results The model accounted for .412 of the variance in patients’ intention to adopt the smartphone health technology. Intention to use accounted for .111 of the variance in actual use and had a significant weak relationship with the latter. Perceived ease of use was affected by patients’ smartphone usage experience, relationship with doctor, and self-efficacy. Although without a significant effect on intention to use, perceived ease of use had a significant positive influence on perceived usefulness. Relationship with doctor and perceived health threat had significant positive effects on perceived usefulness, countering the negative influence of resistance to change. Perceived usefulness, perceived health threat, and resistance to change significantly predicted patients’ intentions to use the technology. Age and gender had no significant influence on patients’ acceptance of smartphone technology. The study also

  17. Choice of Foreign Market Entry Mode - Cognitions from Empirical and Theoretical Studies

    OpenAIRE

    Zhao, Xuemin; Decker, Reinhold

    2004-01-01

    This paper critically analyzes five basic theories of market entry mode decisions with respect to their strengths and weaknesses and the results of corresponding empirical studies. Starting from conflicts in both the theories and the empirical studies dealing with the entry mode choice problem, we motivate a significant need for further research in this important area of international marketing. Furthermore, we provide implications for managers in practice and outline emerging trends in market entr...

  18. Adaptation of the concept of varying time of concentration within flood modelling: Theoretical and empirical investigations across the Mediterranean

    Science.gov (United States)

    Michailidi, Eleni Maria; Antoniadi, Sylvia; Koukouvinos, Antonis; Bacchi, Baldassare; Efstratiadis, Andreas

    2017-04-01

    The time of concentration, tc, is a key hydrological concept and often an essential parameter of rainfall-runoff modelling, traditionally treated as a characteristic property of the river basin. However, both theoretical proof and empirical evidence imply that tc is a hydraulic quantity that depends on flow, and thus should be considered a variable rather than a constant parameter. Using a kinematic method approach, easily implemented in a GIS environment, we first illustrate that the relationship between tc and the effective rainfall produced over the catchment is well approximated by a power-type law, the exponent of which is associated with the slope of the longest flow path of the river basin. Next, we take advantage of this relationship to adapt the concept of a varying time of concentration within flood modelling, and particularly within the well-known SCS-CN approach. In this context, the initial abstraction ratio is also considered variable, while the propagation of the effective rainfall is handled through a parametric unit hydrograph, the shape of which is dynamically adjusted according to the runoff produced during the flood event. The above framework is tested in a number of Mediterranean river basins in Greece, Italy and Cyprus, ensuring faithful representation of most of the observed flood events. Based on the outcomes of this extended analysis, we provide guidance for employing this methodology in flood design studies for ungauged basins.

  19. Discovering the Neural Nature of Moral Cognition? Empirical, Theoretical, and Practical Challenges in Bioethical Research with Electroencephalography (EEG).

    Science.gov (United States)

    Wagner, Nils-Frederic; Chaves, Pedro; Wolff, Annemarie

    2017-06-01

    In this article we critically review the neural mechanisms of moral cognition that have recently been studied via electroencephalography (EEG). Such studies promise to shed new light on traditional moral questions by helping us to understand how effective moral cognition is embodied in the brain. It has been argued that conflicting normative ethical theories require different cognitive features and can, accordingly, in a broadly conceived naturalistic attempt, be associated with different brain processes that are rooted in different brain networks and regions. This potentially morally relevant brain activity has been empirically investigated through EEG-based studies on moral cognition. From neuroscientific evidence gathered in these studies, a variety of normative conclusions have been drawn and bioethical applications have been suggested. We discuss methodological and theoretical merits and demerits of the attempt to use EEG techniques in a morally significant way, point to legal challenges and policy implications, indicate the potential to reveal biomarkers of psychopathological conditions, and consider issues that might inform future bioethical work.

  20. 137Cs applicability to soil erosion assessment: theoretical and empirical model

    International Nuclear Information System (INIS)

    Andrello, Avacir Casanova

    2004-02-01

    The acceleration of soil erosion processes and the increase of soil erosion rates due to anthropogenic perturbation of the soil-weather-vegetation equilibrium have affected soil quality and the environment. Hence, the ability to assess the amplitude and severity of the impact of soil erosion on soil productivity and quality is important at local as well as regional and global scales. Several models have been developed to assess soil erosion both qualitatively and quantitatively. 137 Cs, an anthropogenic radionuclide, has been widely used to assess superficial soil erosion processes. Empirical and theoretical models were developed on the basis of 137 Cs redistribution as an indicator of soil movement by erosive processes. These models incorporate many parameters that can influence the quantification of soil erosion rates from 137 Cs redistribution. Statistical analysis was performed on the models recommended by the IAEA to determine the influence that each parameter has on the calculated soil redistribution. It was verified that the most important parameter is the 137 Cs redistribution itself, indicating the need for a good determination of the 137 Cs inventory values with a minimum deviation associated with these values. A 10% deviation was then assigned to the reference value of the 137 Cs inventory and 5% to the 137 Cs inventory of the sample, and the resulting deviation in the soil redistribution calculated by the models was determined. The soil redistribution results were compared to verify whether the models differed, but no difference between the models was found unless 137 Cs loss exceeded 70%. Analyzing three native forests and an area of undisturbed pasture in the Londrina region, it was verified that the 137 Cs spatial variability at local scale was 15%. Comparing the 137 Cs inventory values determined in the three native forests with the 137 Cs inventory value determined in the area of undisturbed pasture in the
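As a minimal sketch of the family of conversion models the abstract refers to, the simple proportional model converts the percentage reduction of the 137Cs inventory (relative to an undisturbed reference site) into a soil loss rate. This is a simplified form of the IAEA-recommended model; the input values below are hypothetical:

```python
def proportional_model(inv_sample, inv_ref, bulk_density, plough_depth, years):
    """Simplified proportional model: soil loss (t/ha/yr) from the percentage
    reduction of the 137Cs inventory relative to a reference inventory.
    bulk_density in kg/m3, plough_depth in m, years since fallout onset."""
    x = 100.0 * (inv_ref - inv_sample) / inv_ref   # % inventory reduction
    # factor 10 converts kg/m2/yr to t/ha/yr
    return 10.0 * bulk_density * plough_depth * x / (100.0 * years)

# Hypothetical inputs: 25% inventory loss, 1300 kg/m3, 0.25 m plough layer, 40 yr
print(proportional_model(1500.0, 2000.0, 1300.0, 0.25, 40))  # -> 20.3125
```

The sensitivity analysis described in the abstract amounts to perturbing `inv_ref` and `inv_sample` (e.g. by 10% and 5%) and propagating the deviation through such a conversion model.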

  1. Longitudinal Driving Behavior in Case of Emergency Situations : An Empirically Underpinned Theoretical Framework

    NARCIS (Netherlands)

    Hoogendoorn, R.G.; Van Arem, B.; Brookhuis, K.A.

    2013-01-01

    Adverse conditions have been shown to have a substantial impact on traffic flow operations. It is however not yet clear to what extent emergency situations actually lead to adaptation effects in empirical longitudinal driving behavior, what the causes of these adaptation effects are and how these

  2. Data, Information, Knowledge, Wisdom (DIKW: A Semiotic Theoretical and Empirical Exploration of the Hierarchy and its Quality Dimension

    Directory of Open Access Journals (Sweden)

    Sasa Baskarada

    2013-03-01

    Full Text Available What exactly is the difference between data and information? What is the difference between data quality and information quality; is there any difference between the two? And, what are knowledge and wisdom? Are there such things as knowledge quality and wisdom quality? As these primitives are the most basic axioms of information systems research, it is somewhat surprising that consensus on exact definitions seems to be lacking. This paper presents a theoretical and empirical exploration of the sometimes directly quoted, and often implied Data, Information, Knowledge, Wisdom (DIKW hierarchy and its quality dimension. We first review relevant literature from a range of perspectives and develop and contextualise a theoretical DIKW framework through semiotics. The literature review identifies definitional commonalities and divergences from a scholarly perspective; the theoretical discussion contextualises the terms and their relationships within a semiotic framework and proposes relevant definitions grounded in that framework. Next, rooted in Wittgenstein’s ordinary language philosophy, we analyse 20 online news articles for their uses of the terms and present the results of an online focus group discussion comprising 16 information systems experts. The empirical exploration identifies a range of definitional ambiguities from a practical perspective.

  3. Collective animal navigation and migratory culture: from theoretical models to empirical evidence

    Science.gov (United States)

    Dell, Anthony I.

    2018-01-01

    Animals often travel in groups, and their navigational decisions can be influenced by social interactions. Both theory and empirical observations suggest that such collective navigation can result in individuals improving their ability to find their way and could be one of the key benefits of sociality for these species. Here, we provide an overview of the potential mechanisms underlying collective navigation, review the known, and supposed, empirical evidence for such behaviour and highlight interesting directions for future research. We further explore how both social and collective learning during group navigation could lead to the accumulation of knowledge at the population level, resulting in the emergence of migratory culture. This article is part of the theme issue ‘Collective movement ecology’. PMID:29581394

  4. Regional differences of outpatient physician supply as a theoretical economic and empirical generalized linear model.

    Science.gov (United States)

    Scholz, Stefan; Graf von der Schulenburg, Johann-Matthias; Greiner, Wolfgang

    2015-11-17

    Regional differences in physician supply can be found in many health care systems, regardless of their organizational and financial structure. A theoretical model is developed for physicians' decisions on office location, covering demand-side factors and a consumption time function. To test the propositions that follow from the theoretical model, generalized linear models were estimated to explain differences across 412 German districts. Various factors found in the literature were included to control for physicians' regional preferences. Evidence in favor of the first three propositions of the theoretical model was found. Specialists show a stronger association with highly populated districts than GPs. Although indicators of regional preferences are significantly correlated with physician density, their coefficients are not as high as that of population density. If regional disparities are to be addressed by political action, the focus should be on counteracting those parameters representing physicians' preferences in over- and undersupplied regions.

  5. Attachment-based family therapy for depressed and suicidal adolescents: theory, clinical model and empirical support.

    Science.gov (United States)

    Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne

    2015-01-01

    Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support.

  6. A system of safety management practices and worker engagement for reducing and preventing accidents: an empirical and theoretical investigation.

    Science.gov (United States)

    Wachter, Jan K; Yorio, Patrick L

    2014-07-01

    The overall research objective was to theoretically and empirically develop the ideas around a system of safety management practices (ten practices were elaborated), to test their relationship with objective safety statistics (such as accident rates), and to explore how these practices work to achieve positive safety results (accident prevention) through worker engagement. Data were collected using safety manager, supervisor and employee surveys designed to assess and link safety management system practices, employee perceptions resulting from existing practices, and safety performance outcomes. Results indicate the following: there is a significant negative relationship between the presence of the ten individual safety management practices, as well as the composite of these practices, and accident rates; there is a significant negative relationship between the level of safety-focused worker emotional and cognitive engagement and accident rates; safety management systems and worker engagement levels can each be used to predict accident rates; safety management systems can be used to predict worker engagement levels; and worker engagement levels act as mediators between the safety management system and safety performance outcomes (such as accident rates). Even though the presence of safety management system practices is linked with incident reduction and may represent a necessary first step in accident prevention, safety performance may also depend on mediation by safety-focused cognitive and emotional engagement by workers. Thus, when organizations invest in a safety management system approach to reducing/preventing accidents and improving safety performance, they should also be concerned about winning over the minds and hearts of their workers through human performance-based safety management systems designed to promote and enhance worker engagement. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
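The mediation claim above (worker engagement mediating between safety management practices and accident rates) is commonly estimated with a product-of-coefficients approach. The following is a minimal, self-contained sketch on synthetic data; the variable names, effect sizes, and the plain normal-equations OLS solver are all illustrative assumptions, not the authors' actual analysis.

```python
import random

def ols(X, y):
    """Least-squares coefficients [b0, b1, ...] for y = b0 + b1*x1 + ...
    via the normal equations, solved with Gaussian elimination."""
    n, k = len(X), len(X[0])
    Z = [[1.0] + list(row) for row in X]           # prepend intercept column
    A = [[sum(Z[i][p] * Z[i][q] for i in range(n)) for q in range(k + 1)]
         for p in range(k + 1)]                    # Z'Z
    b = [sum(Z[i][p] * y[i] for i in range(n)) for p in range(k + 1)]  # Z'y
    for col in range(k + 1):                       # elimination with pivoting
        piv = max(range(col, k + 1), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k + 1):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * (k + 1)
    for r in range(k, -1, -1):                     # back-substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c]
                              for c in range(r + 1, k + 1))) / A[r][r]
    return coef

# Synthetic data: practices X raise engagement M, which lowers accident rate Y.
rng = random.Random(0)
X = [rng.gauss(0, 1) for _ in range(500)]
M = [0.8 * x + rng.gauss(0, 0.5) for x in X]
Y = [-0.6 * m + rng.gauss(0, 0.5) for m in M]

a = ols([[x] for x in X], M)[1]        # path X -> M
b_path = ols(list(zip(X, M)), Y)[2]    # path M -> Y, controlling for X
indirect = a * b_path                  # product-of-coefficients indirect effect
```

With the simulated effects (0.8 for practices to engagement, -0.6 for engagement to accident rates), the estimated indirect effect comes out near -0.48, mirroring the kind of mediated relationship the study reports.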

  7. The demand-induced strain compensation model : renewed theoretical considerations and empirical evidence

    NARCIS (Netherlands)

    de Jonge, J.; Dormann, C.; van den Tooren, M.; Näswall, K.; Hellgren, J.; Sverke, M.

    2008-01-01

    This chapter presents a recently developed theoretical model on jobrelated stress and performance, the so-called Demand-Induced Strain Compensation (DISC) model. The DISC model predicts in general that adverse health effects of high job demands can best be compensated for by matching job resources

  8. A theoretical assessment of the empirical literature on the impact of multinationality on performance

    NARCIS (Netherlands)

    Hennart, J.M.A.

    2011-01-01

    I assess the theoretical basis for the existence of a relationship between the size of a firm's foreign footprint (its multinationality) and its performance. I argue that multinationality results from a firm's choice between coordinating internally the stages of its value chain and letting them be

  9. Predicting Child Abuse Potential: An Empirical Investigation of Two Theoretical Frameworks

    Science.gov (United States)

    Begle, Angela Moreland; Dumas, Jean E.; Hanson, Rochelle F.

    2010-01-01

    This study investigated two theoretical risk models predicting child maltreatment potential: (a) Belsky's (1993) developmental-ecological model and (b) the cumulative risk model in a sample of 610 caregivers (49% African American, 46% European American; 53% single) with a child between 3 and 6 years old. Results extend the literature by using a…

  10. Moving Beyond Pioneering: Empirical and Theoretical Perspectives on Lesbian, Gay, and Bisexual Affirmative Training.

    Science.gov (United States)

    Croteau, James M.; Bieschke, Kathleen J.; Phillips, Julia C.; Lark, Julianne S.

    1998-01-01

    States that the literature to date has broken the silence on lesbian, gay, and bisexual (LGB) issues and has affirmed the field of psychology as being affirmative toward these issues. Proposes that research should move toward a greater understanding of LGB affirmative professional training by focusing on training from theoretical and empirical…

  11. The Role of Identity in Acculturation among Immigrant People: Theoretical Propositions, Empirical Questions, and Applied Recommendations

    Science.gov (United States)

    Schwartz, Seth J.; Montgomery, Marilyn J.; Briones, Ervin

    2006-01-01

    The present paper advances theoretical propositions regarding the relationship between acculturation and identity. The most central thesis argued is that acculturation represents changes in cultural identity and that personal identity has the potential to "anchor" immigrant people during their transition to a new society. The article emphasizes…

  12. Theoretical insight into an empirical rule about organic corrosion inhibitors containing nitrogen, oxygen, and sulfur atoms

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Lei, E-mail: cqglei@163.com [School of Material and Chemical Engineering, Tongren University, Tongren 554300 (China); Obot, Ime Bassey [Center of Research Excellence in Corrosion, King Fahd University of Petroleum and Minerals, Dhahran 31261 (Saudi Arabia); Zheng, Xingwen [Material Corrosion and Protection Key Laboratory of Sichuan province, Zigong 643000 (China); Shen, Xun [School of Material and Chemical Engineering, Tongren University, Tongren 554300 (China); Qiang, Yujie [Material Corrosion and Protection Key Laboratory of Sichuan province, Zigong 643000 (China); Kaya, Savaş; Kaya, Cemal [Department of Chemistry, Faculty of Science, Cumhuriyet University, Sivas 58140 (Turkey)

    2017-06-01

    Highlights: • The crystal habit of α-Fe was obtained with the “Morphology” module. • The adsorption of pyrrole, furan, and thiophene on the Fe(110) surface was studied by DFT calculations. • Our DFT modeling provides a reasonable microscopic explanation of the empirical rule. - Abstract: Steel is an important industrial material, and the addition of heterocyclic organic compounds has proved very efficient for steel protection. An empirical rule holds that the inhibition efficiencies of molecules containing heteroatoms generally follow the trend O < N < S. However, an atomic-level insight into the inhibition mechanism is still lacking. In this work, density functional theory calculations were therefore used to investigate the adsorption of three typical heterocyclic molecules, i.e., pyrrole, furan, and thiophene, on the Fe(110) surface. The approach consists of geometric optimization of the inhibitors on the stable and most exposed plane of α-Fe. Salient features such as the charge density difference, changes in the work function, and the density of states are described in detail. The present study helps to explain the afore-mentioned empirical rule.

  13. A Framework for the Corporate Governance of Data – Theoretical Background and Empirical Evidence

    OpenAIRE

    Tomi Dahlberg; Tiina Nokkala

    2015-01-01

    In a modern organization, IT and digital data have transformed from being functional resources to integral elements of business strategy. Against this background, our article addresses corporate governance of digital data in general and that of aging societies in particular. To describe the role of executives and managers in data governance, we first review the corporate and IT governance literature. We then propose a theoretical framework for the governance of data: a novel construct. We app...

  14. Cultural active approach to the issue of emotion regulation: theoretical explanation and empirical verification of a conceptual model

    Directory of Open Access Journals (Sweden)

    Elena I. Pervichko

    2016-06-01

    The paper gives a theoretical explanation and an empirical verification of a conceptual emotion-regulation model developed in the theoretical-methodological context of the cultural-activity paradigm. The hypothesis that emotion regulation forms a system comprising psychological and physiological levels was verified empirically. The psychological level may be subdivided into a motivational-thinking level and an operational-technical one, governed by the psychological mechanisms of reflection and symbolic mediation. Motivational peculiarities were found to determine the manifestation of the other components of the emotion-regulation system, not only in healthy subjects but also in patients with mitral valve prolapse (MVP). The significance of reflection and symbolic mediation within the cultural-activity paradigm and in emotion regulation was established. Emotion regulation in patients with MVP was shown to differ from that of healthy people: it is marked by a strong conflict between goal-achieving and failure-avoiding motives, a lack of personal reflection, distorted symbolic mediation, and very limited emotion-regulation resources. Patients with MVP also differ from the control group in experiencing far stronger emotional stress, which exerts an overall negative impact and reduces their ability to use emotion-regulation resources effectively in emotionally meaningful situations.

  15. Empirical Testing of a Theoretical Extension of the Technology Acceptance Model: An Exploratory Study of Educational Wikis

    Science.gov (United States)

    Liu, Xun

    2010-01-01

    This study extended the technology acceptance model and empirically tested the new model with wikis, a new type of educational technology. Based on social cognitive theory and the theory of planned behavior, three new variables, wiki self-efficacy, online posting anxiety, and perceived behavioral control, were added to the original technology…

  16. From the Cover: The growth of business firms: Theoretical framework and empirical evidence

    Science.gov (United States)

    Fu, Dongfeng; Pammolli, Fabio; Buldyrev, S. V.; Riccaboni, Massimo; Matia, Kaushik; Yamasaki, Kazuko; Stanley, H. Eugene

    2005-12-01

    We introduce a model of proportional growth to explain the distribution P_g(g) of business-firm growth rates. The model predicts that P_g(g) is exponential in the central part and exhibits an asymptotic power-law behavior in the tails with an exponent ζ = 3. Because of data limitations, previous studies in this field focused exclusively on the Laplace shape of the body of the distribution. In this article, we test the model at different levels of aggregation in the economy, from products to firms to countries, and we find that the predictions of the model agree with empirical growth distributions and size-variance relationships. Keywords: proportional growth | preferential attachment | Laplace distribution
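As a rough illustration of the mechanism (not the authors' full model, which also includes entry and exit of units), the sketch below treats each firm as a bundle of independently shocked units: firms with few units produce the fat tails of the growth-rate distribution, while many-unit firms populate the narrow body. All parameter values are illustrative.

```python
import math
import random

def firm_growth_rates(num_firms=5000, mean_units=5, sigma=0.2, seed=42):
    """Toy proportional-growth simulation: a firm is a bundle of units,
    each multiplied by an independent lognormal shock; the firm's growth
    rate is the log of the ratio of its total size after/before."""
    rng = random.Random(seed)
    rates = []
    for _ in range(num_firms):
        units = 1 + rng.randrange(2 * mean_units)     # heterogeneous firm sizes
        before = [rng.lognormvariate(0.0, 1.0) for _ in range(units)]
        after = [s * rng.lognormvariate(0.0, sigma) for s in before]
        rates.append(math.log(sum(after) / sum(before)))
    return rates

rates = firm_growth_rates()
# single-unit firms inherit the full shock variance (fat tails), while
# many-unit firms average their shocks out (narrow, tent-shaped body)
```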

  17. Empirical agent-based modelling challenges and solutions

    CERN Document Server

    Barreteau, Olivier

    2014-01-01

    This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications. It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM. In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modelling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes' ready to be implemented. Agent-based modeling (AB...

  18. Green Product Purchasing Phenomenon: Exploring The Gaps Of Theoretical, Methodological And Empirical

    Directory of Open Access Journals (Sweden)

    Rahab Bin Tafsir

    2016-12-01

    This study aims to identify research gaps on green purchasing topics and proposes several recommendations for future research. To explain the phenomenon of green-product purchase, the study uses the Theory of Planned Behavior (TPB) as its framework. A qualitative approach was adopted, reviewing articles traced through four popular journal providers: Ebsco, JSTOR, Proquest and Emeraldinsight. The literature search, conducted between April 2015 and June 2015, yielded 67 articles. The review identified four theoretical gaps, two methodological gaps and one practical gap.

  19. Energy transport in ASDEX in relation to theoretical and semi-empirical transport coefficients

    International Nuclear Information System (INIS)

    Gruber, O.; Wunderlich, R.; Lackner, K.; Schneider, W.

    1989-09-01

    A comparison of measurements with theoretically predicted energy transport coefficients has been carried out for ohmic and NBI-heated discharges using both analysis and simulation codes. The contribution of strong electrostatic turbulence from η_i-driven modes to the ion heat conductivity is very successful in explaining the observed response of confinement to density-profile changes and is even in good quantitative agreement. Regarding the electron branch, a combination of trapped-electron-driven turbulence and resistive ballooning modes appears to be a promising model to explain both the correct power and density dependence of the confinement time and the observed radial dependence of the electron heat conductivity. (orig.)

  20. Social Health Inequalities and eHealth: A Literature Review With Qualitative Synthesis of Theoretical and Empirical Studies.

    Science.gov (United States)

    Latulippe, Karine; Hamel, Christine; Giroux, Dominique

    2017-04-27

    eHealth is developing rapidly and brings with it a promise to reduce social health inequalities (SHIs). Yet, it appears that it also has the potential to increase them. The general objective of this review was to set out how to ensure that eHealth contributes to reducing SHIs rather than exacerbating them. This review has three objectives: (1) identifying characteristics of people at risk of experiencing social inequality in health; (2) determining the possibilities of developing eHealth tools that avoid increasing SHI; and (3) modeling the process of using an eHealth tool by people vulnerable to SHI. Following the EPPI approach (Evidence for Policy and Practice of Information of the Institute of Education at the University of London), two databases were searched for the terms SHIs and eHealth and their derivatives in titles and abstracts. Qualitative, quantitative, and mixed articles were included and evaluated. The software NVivo (QSR International) was employed to extract the data and allow for a metasynthesis of the data. Of the 73 articles retained, 10 were theoretical, 7 were from reviews, and 56 were based on empirical studies. Of the latter, 40 used a quantitative approach, 8 used a qualitative approach, 4 used mixed methods approach, and only 4 were based on participatory research-action approach. The digital divide in eHealth is a serious barrier and contributes greatly to SHI. Ethnicity and low income are the most commonly used characteristics to identify people at risk of SHI. The most promising actions for reducing SHI via eHealth are to aim for universal access to the tool of eHealth, become aware of users' literacy level, create eHealth tools that respect the cultural attributes of future users, and encourage the participation of people at risk of SHI. eHealth has the potential to widen the gulf between those at risk of SHI and the rest of the population. The widespread expansion of eHealth technologies calls for rigorous consideration of

  1. empiric treatment based on helicobacter pylori serology cannot ...

    African Journals Online (AJOL)

    Empiric treatment based on Helicobacter pylori serology cannot substitute for early endoscopy in the management of dyspeptic rural black Africans. Stephen JD O'Keefe, B Salvador, J Nainkin, S Majikir, H Stevens, A Atherstone. Background: Evidence that chronic gastric ...

  2. GIS Teacher Training: Empirically-Based Indicators of Effectiveness

    Science.gov (United States)

    Höhnle, Steffen; Fögele, Janis; Mehren, Rainer; Schubert, Jan Christoph

    2016-01-01

    In spite of various actions, the implementation of GIS (geographic information systems) in German schools is still very low. In the presented research, teaching experts as well as teaching novices were presented with empirically based constraints for implementation stemming from an earlier survey. In the process of various group discussions, the…

  3. The outcome of non-carbapenem-based empirical antibacterial ...

    African Journals Online (AJOL)

    Background: Febrile neutropenia (FN) is generally a complication of cancer chemotherapy. Objective: ... Furthermore, non-carbapenem-based empirical therapy provides benefit in regard to cost-effectiveness and antimicrobial stewardship when local antibiotic resistance patterns of gram-negative bacteria are considered.

  4. Pharmaceuticals, political money, and public policy: a theoretical and empirical agenda.

    Science.gov (United States)

    Jorgensen, Paul D

    2013-01-01

    Why, when confronted with policy alternatives that could improve patient care, public health, and the economy, does Congress neglect those goals and tailor legislation to suit the interests of pharmaceutical corporations? In brief, for generations, the pharmaceutical industry has convinced legislators to define policy problems in ways that protect its profit margin. It reinforces this framework by selectively providing information and by targeting campaign contributions to influential legislators and allies. In this way, the industry displaces the public's voice in developing pharmaceutical policy. Unless citizens mobilize to confront the political power of pharmaceutical firms, objectionable industry practices and public policy will not change. Yet we need to refine this analysis. I propose a research agenda to uncover pharmaceutical influence. It develops the theory of dependence corruption to explain how the pharmaceutical industry is able to deflect the broader interests of the general public. It includes empirical studies of lobbying and campaign finance to uncover the means drug firms use to: (1) shape the policy framework adopted and information used to analyze policy; (2) subsidize the work of political allies; and (3) influence congressional voting. © 2013 American Society of Law, Medicine & Ethics, Inc.

  5. On the theoretical development of new creep resistant alloys and their empirical validation

    International Nuclear Information System (INIS)

    Gaude-Fugarolas, D.; Regent, N.; Carlan, Y. de

    2008-01-01

    In anticipation of the present revival of nuclear power, and to obtain more efficient, secure and environmentally friendly power plants, new families of high-temperature-resistant, low-activation materials are under development. This work presents an example of work performed at CEA during the development of novel ferritic-martensitic reduced-activation alloys for Generation IV and fusion applications. In the past, the process of designing a new material was mostly heuristic, requiring repeated experimental trial and error; nowadays, synergies between the accuracy of current scientific knowledge of thermodynamics and transformation kinetics and increased computing capacity enable us to design successful new alloys with minimal empirical feedback. This work presents a comprehensive, multi-model approach to alloy and microstructure design. The CALPHAD method, thermo-kinetic modelling of precipitation reactions and artificial neural network analysis are combined in the development of new alloys whose compositions and microstructures are optimised for maximum creep resistance. To complete this work, a selection of the designed alloys has been cast, and the results obtained during alloy design and the modelling of various heat treatments have been verified. Optical and electron microscopy were used to characterise the microstructure. Uniaxial tensile tests measured the mechanical performance of the alloys at room, service and higher temperatures. Characterisation of the behaviour of the material under service conditions is underway with relaxation and creep tests. (authors)

  6. Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.

    Science.gov (United States)

    Johnson, Shane D; Groff, Elizabeth R

    2014-07-01

    The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews were completed. The aim of the first was to identify the articles published in JRCD that have been most influential and to classify the theoretical perspectives taken. The second was intended to identify studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing with such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.
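For readers unfamiliar with the approach, a deliberately minimal ABM in the spirit of routine activity theory (a crime event requires a motivated offender to converge on an unguarded location) might look as follows. The grid size, agent counts and random-walk movement rules are illustrative assumptions, not a model from the reviewed literature.

```python
import random

def simulate(steps=200, size=10, offenders=5, guardians=5, seed=1):
    """Minimal grid ABM: offenders and guardians walk randomly on a torus;
    a crime is recorded whenever an offender occupies an unguarded cell."""
    rng = random.Random(seed)

    def walk(pos):
        x, y = pos
        dx, dy = rng.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        return ((x + dx) % size, (y + dy) % size)

    off = [(rng.randrange(size), rng.randrange(size)) for _ in range(offenders)]
    gua = [(rng.randrange(size), rng.randrange(size)) for _ in range(guardians)]
    crimes = 0
    for _ in range(steps):
        off = [walk(p) for p in off]
        gua = [walk(p) for p in gua]
        guarded = set(gua)
        crimes += sum(1 for p in off if p not in guarded)
    return crimes
```

Re-running with more guardians yields fewer recorded crimes, giving the kind of falsifiable, mechanism-level prediction the article argues ABMs can contribute to theory testing.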

  7. The Empirical Measurement of a Theoretical Concept: Tracing Social Exclusion among Racial Minority and Migrant Groups in Canada

    Directory of Open Access Journals (Sweden)

    Luann Good Gingrich

    2015-07-01

    This paper provides an in-depth description and case application of a conceptual model of social exclusion, aiming to advance existing knowledge on how to conceive of and identify this complex idea, to evaluate the methodologies used to measure it, and to reconsider what is understood about its social realities toward a meaningful and measurable conception of social inclusion. Drawing on Pierre Bourdieu’s conceptual tools of social fields and systems of capital, our research posits and applies a theoretical framework that permits the measurement of social exclusion as dynamic, social, relational, and material. We begin with a brief review of the existing social exclusion research literature, and specifically examine the difficulties and benefits inherent in quantitatively operationalizing a necessarily multifarious theoretical concept. We then introduce our conceptual model of social exclusion and inclusion, which is built on measurable constructs. Using our ongoing program of research as a case study, we briefly present our approach to the quantitative operationalization of social exclusion using secondary data analysis in the Canadian context. Through the development of an Economic Exclusion Index, we demonstrate how our statistical and theoretical analyses reveal intersecting processes of social exclusion that produce consequential gaps and uneven trajectories for migrant individuals and groups compared with the Canadian-born, and for racial minority versus white individuals. To conclude, we consider some methodological implications for advancing the empirical measurement of social inclusion.

  8. Theoretical and empirical convergence results for additive congruential random number generators

    Science.gov (United States)

    Wikramaratna, Roy S.

    2010-03-01

    Additive Congruential Random Number (ACORN) generators represent an approach to generating uniformly distributed pseudo-random numbers that is straightforward to implement efficiently for arbitrarily large order and modulus; if it is implemented using integer arithmetic, it becomes possible to generate identical sequences on any machine. This paper briefly reviews existing results concerning ACORN generators and relevant theory concerning sequences that are well distributed mod 1 in k dimensions. It then demonstrates some new theoretical results for ACORN generators implemented in integer arithmetic with modulus M = 2^μ, showing that they are a family of generators that converge (in a sense that is defined in the paper) to being well distributed mod 1 in k dimensions, as μ = log2 M tends to infinity. By increasing k, it is possible to increase without limit the number of dimensions in which the resulting sequences approximate to well distributed. The paper concludes by applying the standard TestU01 test suite to ACORN generators for selected values of the modulus (between 2^60 and 2^150), the order (between 4 and 30) and various odd seed values. On the basis of these and earlier results, it is recommended that an order of at least 9 be used together with an odd seed and modulus equal to 2^(30p), for a small integer value of p. While a choice of p = 2 should be adequate for most typical applications, increasing p to 3 or 4 gives a sequence that will consistently pass all the tests in the TestU01 test suite, giving additional confidence in more demanding applications. The results demonstrate that the ACORN generators are a reliable source of uniformly distributed pseudo-random numbers, and that in practice (as suggested by the theoretical convergence results) the quality of the ACORN sequences increases with increasing modulus and order.
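The ACORN recurrence is simple enough to sketch in a few lines. The following is an illustrative Python implementation of the k-th order generator as described in the ACORN literature, with the abstract's recommended defaults (order at least 9, odd seed, modulus 2^(30p)); treat it as a sketch rather than a reference implementation.

```python
def acorn(seed, order=9, p=2):
    """Yield ACORN pseudo-random numbers in [0, 1).

    state[0] holds the fixed odd seed; each step adds each element into
    the next one, modulo M, and the highest-order element is output.
    """
    M = 1 << (30 * p)                 # modulus 2**(30p), as recommended
    if seed % 2 == 0:
        raise ValueError("seed must be odd")
    state = [seed % M] + [0] * order  # integer state, order + 1 elements
    while True:
        for m in range(1, order + 1):
            state[m] = (state[m] + state[m - 1]) % M
        yield state[order] / M        # divide only at output time
```

Because everything stays in integer arithmetic until the final division, the same seed reproduces the same sequence on any machine, exactly as the abstract notes.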

  9. An Empirical Investigation into the Matching Problems among Game Theoretically Coordinating Parties in a Virtual Organization

    Directory of Open Access Journals (Sweden)

    Muhammad Yasir

    2010-09-01

    Virtual organization emerged as a highly flexible structure in response to the rapidly changing environment of the 20th century. Such an organization consists of independently working parties that combine their best possible resources to exploit emerging market opportunities. There are no formal control and coordination mechanisms of the kind employed by classical hierarchical structures; parties therefore manage their dependencies on each other through mutual understanding and trust. The mathematician John Nash, who made significant contributions to game theory, showed that every finite non-cooperative game has at least one equilibrium point. At such a point, each player's strategy is a best response to the others' strategies. Such equilibria could exist in a virtual organization, at which parties coordinate with each other to optimize their performance. Coordination (matching) problems are likely to arise among game-theoretically coordinating parties in a virtual organization, mainly because of the lack of binding agreements. By identifying and resolving these matching problems, virtual organizations could achieve efficiency and better coordination among parties.
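Nash's equilibrium concept, and the coordination problem the abstract describes, can be made concrete with a small pure-strategy example; the payoff matrices below form an illustrative two-party coordination game, not data from the study.

```python
def pure_nash(A, B):
    """Return the pure-strategy Nash equilibria of a two-player game.
    A[i][j] and B[i][j] are the payoffs to players 1 and 2 when they
    play strategies i and j respectively."""
    rows, cols = len(A), len(A[0])
    equilibria = []
    for i in range(rows):
        for j in range(cols):
            best_row = all(A[i][j] >= A[k][j] for k in range(rows))
            best_col = all(B[i][j] >= B[i][k] for k in range(cols))
            if best_row and best_col:       # neither player can gain by deviating
                equilibria.append((i, j))
    return equilibria

# A coordination game: both parties prefer matching choices, but the two
# matches pay differently, so two equilibria coexist and the parties face
# exactly the matching problem described above.
A = [[2, 0], [0, 1]]
B = [[2, 0], [0, 1]]
print(pure_nash(A, B))  # → [(0, 0), (1, 1)]
```

The multiplicity of equilibria is the crux: without binding agreements, nothing guarantees both parties settle on the same one, which is the matching problem the paper investigates.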

  10. A Framework for the Corporate Governance of Data – Theoretical Background and Empirical Evidence

    Directory of Open Access Journals (Sweden)

    Tomi Dahlberg

    2015-06-01

    In a modern organization, IT and digital data have transformed from functional resources into integral elements of business strategy. Against this background, our article addresses the corporate governance of digital data in general and that of aging societies in particular. To describe the role of executives and managers in data governance, we first review the corporate and IT governance literature. We then propose a theoretical framework for the governance of data: a novel construct. We apply the framework to the governance of data related to aging societies, that is, to answer the question of how best to manage the provision of services to citizens with digital data enablement and support. We also disclose the results from two recent surveys, with 212 and 68 respondents respectively, on the business significance of data governance. The survey results reveal that good governance of data is considered critical to organizations. As concluding remarks, we discuss the significance of our results, our contributions to research, the limitations of our study and its managerial implications.

  11. Assessing the Potential Economic Viability of Precision Irrigation: A Theoretical Analysis and Pilot Empirical Evaluation

    Directory of Open Access Journals (Sweden)

    Francesco Galioto

    2017-12-01

    Full Text Available The present study explores the value generated by the use of information to rationalize the use of water resources in agriculture. The study introduces the value-of-information concept into the field of irrigation, developing a theoretical assessment framework to evaluate whether the introduction of “Precision Irrigation” (PI) practices can improve income expectations. This is supported by a stakeholder consultation and by a numerical example using secondary data and crop growth models. The study reveals that the value generated by the transition to PI varies with pedo-climatic, economic, technological and other conditions, and that it depends on the initial status of the farmer’s information environment. These factors shape the prerequisites needed to make PI viable. To foster the adoption of PI, stakeholders envisaged the need to set up a free meteorological information and advisory service that supports farmers in using PI, as well as other types of instruments. The paper concludes that the profitability of adoption and the associated environmental impact cannot be taken as given in general, but must be evaluated case by case, justifying (or not) the activation of specific agricultural policy measures supporting PI practices in target regions.
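    The value-of-information logic at the core of this framework can be made concrete with a toy expected-value calculation: the value of (perfect) information is the expected payoff when the farmer can condition the irrigation decision on the true state, minus the best expected payoff achievable without that information. A minimal Python sketch; the states, probabilities and payoffs are invented for illustration, not figures from the study:

    ```python
    # Toy value-of-information (VOI) calculation for an irrigation decision.
    # States: wet vs dry season; actions: irrigate heavily vs lightly.
    # All numbers are illustrative assumptions, not data from the study.
    probs = {"wet": 0.4, "dry": 0.6}
    payoff = {  # profit per hectare for each (action, state) pair
        ("heavy", "wet"): 80, ("heavy", "dry"): 120,
        ("light", "wet"): 110, ("light", "dry"): 70,
    }
    actions = ["heavy", "light"]

    # Without information: commit to the single action with best expected payoff.
    ev_no_info = max(sum(probs[s] * payoff[(a, s)] for s in probs) for a in actions)

    # With perfect information: pick the best action in each state, then average.
    ev_perfect = sum(probs[s] * max(payoff[(a, s)] for a in actions) for s in probs)

    voi = ev_perfect - ev_no_info  # expected value of perfect information
    print(round(ev_no_info, 2), round(ev_perfect, 2), round(voi, 2))  # 104.0 116.0 12.0
    ```

    In this toy setting, perfect seasonal information is worth 12 profit units per hectare, which is the most a rational farmer would pay for the information service.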

  12. Country risk premium: theoretical determinants and empirical evidence for latin american countries

    Directory of Open Access Journals (Sweden)

    Selmo Aronovich

    1999-12-01

    Full Text Available This paper investigates the behavior of the country risk premium for Argentina, Brazil and Mexico, from June 1997 to September 1998. It shows that the level of the country risk premium is determined by different factors: the US dollar bond market structure; restrictions on the acquisition of emerging market bonds imposed by developed nations' regulators; the credit risk measured by the notion of implied risk-neutral probability of default; and the different ways agents react to country risk due to asymmetric and imperfect information. The empirical investigation shows: the worse the country credit rating, the greater is the impact on international borrowing cost, which implies that negative expectations have greater impact on lower rated Latin American nations' bonds; and country risk yield spreads overreacted to changes in US dollar interest rates in the sample period.
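    The "implied risk-neutral probability of default" mentioned above can be sketched with a standard one-period bond-pricing identity: a risk-neutral investor is indifferent between the risk-free bond and the risky bond when (1 + rf) = (1 − p)(1 + y) + p·rec, which solves to p = (y − rf) / (1 + y − rec). A minimal Python illustration; the yields and the recovery rate below are hypothetical numbers, not values from the paper:

    ```python
    # One-period implied risk-neutral default probability from a sovereign spread.
    # Indifference condition: (1 + rf) = (1 - p)*(1 + y) + p*rec,
    # where rec is the assumed recovery value per unit of face.
    def implied_default_prob(y, rf, rec):
        """y: risky yield, rf: risk-free yield, rec: recovery rate (all decimals)."""
        return (y - rf) / (1.0 + y - rec)

    # Hypothetical late-1990s numbers: 12% emerging-market dollar yield,
    # 5.5% US Treasury yield, 40% assumed recovery.
    p = implied_default_prob(0.12, 0.055, 0.40)
    print(round(p, 4))  # roughly 0.09
    ```

    With these assumed inputs, the 6.5-percentage-point spread implies a risk-neutral default probability of about 9% per period; a rating downgrade that widens the spread raises p more than proportionally when recovery is low.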

  13. Swahili women since the nineteenth century: theoretical and empirical considerations on gender and identity construction.

    Science.gov (United States)

    Gower, R; Salm, S; Falola, T

    1996-01-01

    This paper provides an analysis and update on the theoretical discussion about the link between gender and identity and uses a group of Swahili women in eastern Africa as an example of how this link works in practice. The first part of the study provides a brief overview of gender theory related to the terms "gender" and "identity." It is noted that gender is only one aspect of identity and that the concept of gender has undergone important changes such as the reconceptualization of the terms "sex" and "gender." The second part of the study synthesizes the experiences of Swahili women in the 19th century when the convergence of gender and class was very important. The status of Muslim women is reviewed, and it is noted that even influential women practiced purdah and that all Swahili women experienced discrimination, which inhibited their opportunities for socioeconomic mobility. Slavery and concubinage were widespread during this period, and the participation of Islamic women in spirit possession cults was a way for women to express themselves culturally. The separation of men and women in Swahili culture led to the development of two distinct subcultures, which excluded women from most aspects of public life. The third part of the study looks at the experiences of Swahili women since the 19th century both during and after the colonial period. It is shown that continuity exists in trends observed over a period of 200 years. For example, the mobility of Swahili women remains limited by Islam, but women do exert influence behind the scenes. It is concluded that the socioeconomic status of Swahili women has been shaped more by complex factors such as class, ethnicity, religion, and geographic area than by the oppression of Islam and colonialism. This study indicates that gender cannot be studied in isolation from other salient variables affecting identity.

  14. Evidence-based Nursing Education - a Systematic Review of Empirical Research

    Science.gov (United States)

    Reiber, Karin

    2011-01-01

    The project „Evidence-based Nursing Education – Preparatory Stage“, funded by the Landesstiftung Baden-Württemberg within the programme Impulsfinanzierung Forschung (Funding to Stimulate Research), aims to collect information on current research concerned with nursing education and to process existing data. The results of empirical research which has already been carried out were systematically evaluated with the aim of identifying further topics, fields and matters of interest for empirical research in nursing education. In the course of the project, the available empirical studies on nursing education were scientifically analysed and systematised. The over-arching aim of the evidence-based training approach (which extends beyond the aims of this project) is the conception, organisation and evaluation of vocational training and educational processes in the caring professions on the basis of empirical data. The following contribution first provides a systematic, theoretical link to the over-arching reference framework, as the evidence-based approach is adapted from thematically related specialist fields. The research design of the project is oriented towards criteria introduced from a selection of studies and carries out a two-stage systematic review of the selected studies. As a result, the current status of research in nursing education, as well as its organisation and structure, and questions relating to specialist training and comparative education are introduced and discussed. Finally, the empirical research on nursing training is critically appraised as a complementary element in educational theory/psychology of learning and in the ethical tradition of research. This contribution aims, on the one hand, to derive and describe the methods used, and to introduce the steps followed in gathering and evaluating the data. On the other hand, it is intended to give a systematic overview of empirical research work in nursing education. In order to preserve a

  15. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

    Vocational Teachers and Professionalism - A Model Based on Empirical Analyses Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkorg, Kolb and Wenger). Models can sometimes be criticized...... emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and develop a model concerning teachers...... at vocational colleges based on empirical data in a specific context, a vocational teacher-training course in Denmark. By offering a basis and concepts for analysis of practice, such a model is meant to support the development of vocational teachers’ professionalism at courses and in organizational contexts...

  16. ORGANIZATIONAL CULTURE AND TECHNOLOGY-INFUSED MANAGEMENT IN HIGHER EDUCATION: THEORETICAL AND EMPIRICAL ASPECTS

    Directory of Open Access Journals (Sweden)

    Petro M. Boichuk

    2017-10-01

    Full Text Available The paper provides an overview of the research sources, methodology and main findings of research on the correlation between a higher educational institution's organizational culture and technology-infused management. Based on recent research, the authors provide a definition of the organizational culture concept. The research revealed that the current organizational culture of the Lutsk Pedagogical College most closely resembles the market culture type. The respondents in the present study identified the adhocracy culture as the desired organizational culture for the College. Notably, the results of an in-depth interview based on the expert assessment method indicated that teachers were moderately satisfied with the administrative staff's level of competence to meet the challenges of technology-infused management at their higher educational institution.

  17. Innovation in Information Technology: Theoretical and Empirical Study in SMQR Section of Export Import in Automotive Industry

    Science.gov (United States)

    Edi Nugroho Soebandrija, Khristian; Pratama, Yogi

    2014-03-01

    This paper has the objective of examining innovation in information technology in both theoretical and empirical study. Precisely, both aspects relate to Shortage Mispacking Quality Report (SMQR) Claims in Export and Import in the Automotive Industry. This paper discusses the major aspects of Innovation, Information Technology, Performance and Competitive Advantage. The empirical study of PT. Astra Honda Motor (AHM) refers to SMQR Claims, Communication Systems, and Systems Analysis and Design. Both the major aspects and the empirical study are discussed briefly in the Introduction section; a more detailed discussion follows in the related sections of this paper, in particular the Literature Review, in terms of classical and updated references for the current research. The increase in SMQR claims and communication problems at PT. Astra Daihatsu Motor (PT. ADM), which still relies on email, lengthens claim settlement times and ultimately leads to SMQR claims being rejected by suppliers. To address this problem, an integrated communication system was designed to manage the SMQR claim communication process between PT. ADM and its suppliers. The system that was analyzed and designed is expected to facilitate the claim communication process so that it runs in accordance with procedure, meets the claim settlement time target, and eliminates the difficulties and problems of the previous manual email-based communication. The system was designed using the system development life cycle approach of Kendall & Kendall (2006), covering the SMQR problem communication process, the judgment process by the supplier, the claim process, the claim payment process and the claim monitoring process. After obtaining the appropriate system designs for managing SMQR claims, the system was implemented and an improvement in claim communication

  18. Coping, acculturation, and psychological adaptation among migrants: a theoretical and empirical review and synthesis of the literature

    Science.gov (United States)

    Kuo, Ben C.H.

    2014-01-01

    Given the continuous, dynamic demographic changes internationally due to intensive worldwide migration and globalization, the need to more fully understand how migrants adapt and cope with acculturation experiences in their new host cultural environment is imperative and timely. However, a comprehensive review of what we currently know about the relationship between coping behavior and acculturation experience for individuals undergoing cultural changes has not yet been undertaken. Hence, the current article aims to compile, review, and examine cumulative cross-cultural psychological research that sheds light on the relationships among coping, acculturation, and psychological and mental health outcomes for migrants. To this end, the present article reviews prevailing literature pertaining to: (a) the stress and coping conceptual perspective of acculturation; (b) four theoretical models of coping, acculturation and cultural adaptation; (c) differential coping patterns among diverse acculturating migrant groups; and (d) the relationship between coping variabilities and acculturation levels among migrants. In terms of theoretical understanding, this review points to the relative strengths and limitations associated with each of the four theoretical models on coping-acculturation-adaptation. These theories and the empirical studies reviewed in this article further highlight the central role of coping behaviors/strategies in the acculturation process and outcome for migrants and ethnic populations, both conceptually and functionally. Moreover, the review shows that across studies culturally preferred coping patterns exist among acculturating migrants and migrant groups and vary with migrants' acculturation levels. Implications and limitations of the existing literature for coping, acculturation, and psychological adaptation research are discussed and recommendations for future research are put forth. PMID:25750766

  19. Social-Emotional Well-Being and Resilience of Children in Early Childhood Settings--PERIK: An Empirically Based Observation Scale for Practitioners

    Science.gov (United States)

    Mayr, Toni; Ulich, Michaela

    2009-01-01

    Compared with the traditional focus on developmental problems, research on positive development is relatively new. Empirical research in children's well-being has been scarce. The aim of this study was to develop a theoretically and empirically based instrument for practitioners to observe and assess preschool children's well-being in early…

  20. Parenting around child snacking: development of a theoretically-guided, empirically informed conceptual model.

    Science.gov (United States)

    Davison, Kirsten K; Blake, Christine E; Blaine, Rachel E; Younginer, Nicholas A; Orloski, Alexandria; Hamtil, Heather A; Ganter, Claudia; Bruton, Yasmeen P; Vaughn, Amber E; Fisher, Jennifer O

    2015-09-17

    Snacking contributes to excessive energy intakes in children. Yet factors shaping child snacking are virtually unstudied. This study examines food parenting practices specific to child snacking among low-income caregivers. Semi-structured interviews were conducted in English or Spanish with 60 low-income caregivers of preschool-aged children (18 non-Hispanic white, 22 African American/Black, 20 Hispanic; 92% mothers). A structured interview guide was used to solicit caregivers' definitions of snacking and strategies they use to decide what, when and how much snack their child eats. Interviews were audio-recorded, transcribed verbatim and analyzed using an iterative theory-based and grounded approach. A conceptual model of food parenting specific to child snacking was developed to summarize the findings and inform future research. Caregivers' descriptions of food parenting practices specific to child snacking were consistent with previous models of food parenting developed based on expert opinion [1, 2]. A few noteworthy differences however emerged. More than half of participants mentioned permissive feeding approaches (e.g., my child is the boss when it comes to snacks). As a result, permissive feeding was included as a higher order feeding dimension in the resulting model. In addition, a number of novel feeding approaches specific to child snacking emerged including child-centered provision of snacks (i.e., responding to a child's hunger cues when making decisions about snacks), parent unilateral decision making (i.e., making decisions about a child's snacks without any input from the child), and excessive monitoring of snacks (i.e., monitoring all snacks provided to and consumed by the child). The resulting conceptual model includes four higher order feeding dimensions including autonomy support, coercive control, structure and permissiveness and 20 sub-dimensions. This study formulates a language around food parenting practices specific to child snacking

  1. The Ease of Language Understanding (ELU model: theoretical, empirical, and clinical advances

    Directory of Open Access Journals (Sweden)

    Jerker eRönnberg

    2013-07-01

    Full Text Available Working memory is important for online language processing during conversation. We use it to maintain relevant information, to inhibit or ignore irrelevant information, and to attend to conversation selectively. Working memory helps us to keep track of and actively participate in conversation, including taking turns and following the gist. This paper examines the Ease of Language Understanding model (i.e., the ELU model; Rönnberg, 2003; Rönnberg et al., 2008) in light of new behavioral and neural findings concerning the role of working memory capacity (WMC) in uni-modal and bimodal language processing. The new ELU model is a meaning prediction system that depends on phonological and semantic interactions in rapid implicit and slower explicit processing mechanisms that both depend on WMC, albeit in different ways. A revised ELU model is proposed based on findings that address the relationship between WMC and (a) early attention processes in listening to speech, (b) signal processing in hearing aids and its effects on short-term memory, (c) inhibition of speech maskers and its effect on episodic long-term memory, (d) the effects of hearing impairment on episodic and semantic long-term memory, and finally, (e) listening effort. New predictions and clinical implications are outlined. Comparisons with other WMC and speech perception models are made.

  2. Theoretical Model for the Performance of Liquid Ring Pump Based on the Actual Operating Cycle

    Directory of Open Access Journals (Sweden)

    Si Huang

    2017-01-01

    Full Text Available The liquid ring pump is widely applied in many industrial fields due to the advantages of an isothermal compression process, a simple structure, and liquid sealing. Based on the actual operating cycle of “suction-compression-discharge-expansion,” a universal theoretical model for the performance of liquid ring pumps was established in this study, to address the problem that previous theoretical models deviate from actual performance over the operating cycle. With the major geometric parameters and operating conditions of a liquid ring pump, performance parameters such as the actual capacity for suction and discharge, shaft power, and global efficiency can be conveniently predicted by the proposed theoretical model, without the limitation of an empirical range, performance data, or the detailed 3D geometry of the pump. The proposed theoretical model was verified against the experimental performance of liquid ring pumps and could provide a feasible tool for the application of liquid ring pumps.

  3. Size-dependent standard deviation for growth rates: empirical results and theoretical modeling.

    Science.gov (United States)

    Podobnik, Boris; Horvatic, Davor; Pammolli, Fabio; Wang, Fengzhong; Stanley, H Eugene; Grosse, I

    2008-05-01

    We study annual logarithmic growth rates R of various economic variables such as exports, imports, and foreign debt. For each of these variables we find that the distributions of R can be approximated by double exponential (Laplace) distributions in the central parts and power-law distributions in the tails. For each of these variables we further find a power-law dependence of the standard deviation σ(R) on the average size of the economic variable with a scaling exponent surprisingly close to that found for the gross domestic product (GDP) [Phys. Rev. Lett. 81, 3275 (1998)]. By analyzing annual logarithmic growth rates R of wages of 161 different occupations, we find a power-law dependence of the standard deviation σ(R) on the average value of the wages with a scaling exponent β ≈ 0.14 close to those found for the growth of exports, imports, debt, and the growth of the GDP. In contrast to these findings, we observe for payroll data collected from 50 states of the USA that the standard deviation σ(R) of the annual logarithmic growth rate R increases monotonically with the average value of payroll. However, also in this case we observe a power-law dependence of σ(R) on the average payroll with a scaling exponent β ≈ -0.08. Based on these observations we propose a stochastic process for multiple cross-correlated variables where for each variable (i) the distribution of logarithmic growth rates decays exponentially in the central part, (ii) the distribution of the logarithmic growth rate decays algebraically in the far tails, and (iii) the standard deviation of the logarithmic growth rate depends algebraically on the average size of the stochastic variable.
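    The size-dependent scaling of the standard deviation, σ(R) ∝ S^(−β), can be reproduced with a few lines of simulation: draw growth rates whose dispersion shrinks with size as a power law, then recover the exponent from a log-log fit of binned standard deviations. A stdlib-only Python sketch with an assumed β = 0.14 (the exponent reported for wages); the size range and the prefactor are arbitrary choices for the illustration:

    ```python
    import math
    import random

    random.seed(0)

    beta_true, c = 0.14, 0.5  # assumed scaling exponent and prefactor

    # Simulate many "units": each has a size S and an annual log growth rate R
    # drawn with standard deviation sigma(S) = c * S**(-beta_true).
    sizes = [10 ** random.uniform(1, 6) for _ in range(20000)]
    rates = [random.gauss(0.0, c * s ** (-beta_true)) for s in sizes]

    # Bin units by decade of size and measure the empirical std of R per bin.
    bins = {}
    for s, r in zip(sizes, rates):
        bins.setdefault(int(math.log10(s)), []).append(r)

    xs, ys = [], []
    for k, rs in sorted(bins.items()):
        mu = sum(rs) / len(rs)
        sd = math.sqrt(sum((r - mu) ** 2 for r in rs) / len(rs))
        xs.append(k + 0.5)          # bin center in log10(size)
        ys.append(math.log10(sd))

    # Least-squares slope of log10(std) vs log10(size) estimates -beta.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    print(round(-slope, 2))  # close to beta_true = 0.14
    ```

    The same binning-and-regression recipe is how such exponents are typically estimated from real firm-level or occupational data, up to the choice of bins and of a robust dispersion measure.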

  5. Empirical projection-based basis-component decomposition method

    Science.gov (United States)

    Brendel, Bernhard; Roessl, Ewald; Schlomka, Jens-Peter; Proksa, Roland

    2009-02-01

    Advances in the development of semiconductor based, photon-counting x-ray detectors stimulate research in the domain of energy-resolving pre-clinical and clinical computed tomography (CT). For counting detectors acquiring x-ray attenuation in at least three different energy windows, an extended basis component decomposition can be performed in which in addition to the conventional approach of Alvarez and Macovski a third basis component is introduced, e.g., a gadolinium based CT contrast material. After the decomposition of the measured projection data into the basis component projections, conventional filtered-backprojection reconstruction is performed to obtain the basis-component images. In recent work, this basis component decomposition was obtained by maximizing the likelihood-function of the measurements. This procedure is time consuming and often unstable for excessively noisy data or low intrinsic energy resolution of the detector. Therefore, alternative procedures are of interest. Here, we introduce a generalization of the idea of empirical dual-energy processing published by Stenner et al. to multi-energy, photon-counting CT raw data. Instead of working in the image-domain, we use prior spectral knowledge about the acquisition system (tube spectra, bin sensitivities) to parameterize the line-integrals of the basis component decomposition directly in the projection domain. We compare this empirical approach with the maximum-likelihood (ML) approach considering image noise and image bias (artifacts) and see that only moderate noise increase is to be expected for small bias in the empirical approach. Given the drastic reduction of pre-processing time, the empirical approach is considered a viable alternative to the ML approach.
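    The projection-domain idea can be sketched in miniature: calibrate a mapping from multi-bin measurements to basis line integrals using samples of known composition, then apply it directly to new projection data. The Python sketch below assumes a simplified linear forward model with made-up effective attenuation values for a 2-material, 3-bin system; the published method uses real spectra and richer (e.g., polynomial) parameterizations, so this is an illustration of the calibration idea, not the authors' implementation:

    ```python
    # Empirical projection-domain basis decomposition, sketched for a
    # 2-material, 3-energy-bin system with a linear forward model m = mu @ A.
    # All attenuation numbers are illustrative assumptions.

    def matvec(M, v):
        return [sum(M[r][c] * v[c] for c in range(len(v))) for r in range(len(M))]

    def solve3(A, b):
        """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
        M = [row[:] + [bi] for row, bi in zip(A, b)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(3):
                if r != col:
                    f = M[r][col] / M[col][col]
                    M[r] = [a - f * p for a, p in zip(M[r], M[col])]
        return [M[r][3] / M[r][r] for r in range(3)]

    # Effective attenuation of material i in bin b (rows: bins, cols: materials).
    mu = [[0.30, 0.80],
          [0.20, 0.45],
          [0.15, 0.25]]

    # Calibration: known basis thickness pairs and their simulated log-measurements.
    calib_A = [(a1, a2) for a1 in (0.0, 1.0, 2.0) for a2 in (0.0, 0.5, 1.0)]
    calib_m = [matvec(mu, list(A)) for A in calib_A]

    # For each material, fit coefficients c with A_i ~ c . m (least squares).
    coeffs = []
    for i in range(2):
        G = [[sum(m[r] * m[c] for m in calib_m) for c in range(3)] for r in range(3)]
        for r in range(3):
            G[r][r] += 1e-9  # tiny ridge: the Gram matrix is rank-2 here
        rhs = [sum(A[i] * m[r] for A, m in zip(calib_A, calib_m)) for r in range(3)]
        coeffs.append(solve3(G, rhs))

    # Decompose an unseen measurement back into basis line integrals.
    A_true = [1.3, 0.7]
    m = matvec(mu, A_true)
    A_est = [sum(c * mb for c, mb in zip(coeffs[i], m)) for i in range(2)]
    print([round(a, 3) for a in A_est])  # recovers approximately [1.3, 0.7]
    ```

    Because the fitted mapping is evaluated per detector reading, it avoids the iterative maximum-likelihood solve, which is the speed advantage the abstract refers to; noise handling and spectral nonlinearity are what the fuller parameterizations address.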

  6. Developing a theoretical framework for complex community-based interventions.

    Science.gov (United States)

    Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana

    2014-01-01

    Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms how the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.

  7. Palm vein recognition based on directional empirical mode decomposition

    Science.gov (United States)

    Lee, Jen-Chun; Chang, Chien-Ping; Chen, Wei-Kuei

    2014-04-01

    Directional empirical mode decomposition (DEMD) has recently been proposed to make empirical mode decomposition suitable for texture analysis. Using DEMD, samples are decomposed into a series of images, referred to as two-dimensional intrinsic mode functions (2-D IMFs), from finer to larger scale. A DEMD-based two-dimensional linear discriminant analysis (2D-LDA) for palm vein recognition is proposed. The proposed method progresses through three steps: (i) a set of 2-D IMF features of various scales and orientations are extracted using DEMD, (ii) the 2D-LDA method is then applied to reduce the dimensionality of the feature space in both the row and column directions, and (iii) the nearest neighbor classifier is used for classification. We also propose two strategies for using the set of 2-D IMF features: ensemble DEMD vein representation (EDVR) and multichannel DEMD vein representation (MDVR). In experiments using palm vein databases, the proposed MDVR-based 2D-LDA method achieved a recognition accuracy of 99.73%, thereby demonstrating its feasibility for palm vein recognition.
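    Step (iii) of the pipeline, nearest-neighbor classification over the reduced features, is simple enough to sketch directly. In the Python toy below, the three-dimensional "feature vectors" and subject labels are invented placeholders standing in for the real 2-D IMF features after dimensionality reduction:

    ```python
    # 1-nearest-neighbour matching over reduced feature vectors, as in step (iii).
    # Gallery/probe vectors are made-up stand-ins for reduced 2-D IMF features.
    import math

    def euclidean(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

    def nearest_neighbor(probe, gallery):
        """gallery: list of (label, feature_vector); returns the closest label."""
        return min(gallery, key=lambda item: euclidean(probe, item[1]))[0]

    gallery = [
        ("subject_A", [0.9, 0.1, 0.3]),
        ("subject_B", [0.2, 0.8, 0.5]),
        ("subject_C", [0.4, 0.4, 0.9]),
    ]
    print(nearest_neighbor([0.85, 0.2, 0.35], gallery))  # subject_A
    ```

    In a real verification setting, the distance to the best match would additionally be compared against a threshold so that impostor probes, which are close to no enrolled subject, can be rejected.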

  8. Randomized Trial of ConquerFear: A Novel, Theoretically Based Psychosocial Intervention for Fear of Cancer Recurrence

    NARCIS (Netherlands)

    Butow, P.N.; Turner, J.; Gilchrist, J.; Sharpe, L.; Smith, A.B.; Fardell, J.E.; Tesson, S.; O'Connell, R.; Girgis, A.; Gebski, V.J.; Asher, R.; Mihalopoulos, C.; Bell, M.L.; Zola, K.G.; Beith, J.; Thewes, B.

    2017-01-01

    Purpose Fear of cancer recurrence (FCR) is prevalent, distressing, and long lasting. This study evaluated the impact of a theoretically/empirically based intervention (ConquerFear) on FCR. Methods Eligible survivors had curable breast or colorectal cancer or melanoma, had completed treatment (not

  9. Norms and the development of new knowledge as determinants of climate policy. Theoretical considerations and empirical evidence

    International Nuclear Information System (INIS)

    Schymura, Michael

    2013-01-01

    The evaluation of long-term effects of climate change in cost-benefit analysis has a long tradition in environmental economics. Since the publication of the "Stern Review" in 2006, the debate about the impacts of climate change on the economy and how to compare costs and benefits with each other was revived. The assessment of climate change mitigation policies mainly depends on three not mutually exclusive decisions: First, the discount rate chosen, since costs are incurred today and long-term benefits occur in the future. Second, the uncertainties related to the problem of climate change. This debate was spurred by the literature surrounding Martin Weitzman's "dismal theorem", stating that the unknown unknowns could be too large for cost-benefit analysis of long-term climate policy measures. And third, the treatment of technological change in economic models of climate policy. This dissertation contributes to all three mentioned aspects and consists of two interrelated parts. First, I discuss norms, economic welfare criteria, and catastrophic risks in the context of climate change. This first part is closely bound to a discussion of the Stern Review. The second part of the dissertation deals with technological change in its various dimensions and addresses several questions: How is technological change taken into consideration in large-scale environment-economy models? How do certain inputs substitute each other? And how does technology affect energy intensity, a key parameter for meeting targets set by climate policy? Alongside these questions, I also discuss how the production of knowledge takes place in an environmental and resource economics framework and how knowledge production patterns have changed during previous decades. While the first part is of a theoretical nature, the second part is mainly empirical. Summing up, I suggest approaches towards long-term evaluation of climate policies and how some methodological drawbacks of the "Stern Review" could

  10. Activity Theory as a Theoretical Framework for Health Self-Quantification: A Systematic Review of Empirical Studies.

    Science.gov (United States)

    Almalki, Manal; Gray, Kathleen; Martin-Sanchez, Fernando

    2016-05-27

    Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers' activities and their contextual components or constructs to pursue these health-related goals. Establishing such a framework is important because it is the first step toward operationalizing health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers' health systematically. The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and constructs of health SQ activity. A

  11. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    Science.gov (United States)

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  12. On the Development of a Theory of Traveler Attitude-Behavior Interrelationships : Volume 2. Theoretical and Empirical Findings.

    Science.gov (United States)

    1978-08-01

    The second volume of this final report presents conceptual and empirical findings which support the development of a theory of traveler attitude-behavior interrelationships. Such a theory will be useful in the design of transport systems and operatin...

  13. Central bank transparency, private information and the predictability of monetary policy in the financial markets : theoretical, experimental and empirical evidence

    NARCIS (Netherlands)

    Middeldorp, M.H.

    2010-01-01

    Central banks worldwide have become considerably more communicative about their policies and forecasts. An important reason is that democratic societies expect such transparency from public institutions. Central bankers, supported by a significant body of empirical research, also believe that

  14. In search of risk and safety cultures : empirical and theoretical considerations in the settings of northern and western Europe

    OpenAIRE

    Melinder, Karin

    2000-01-01

    Abstract There has been an increasing interest in determinants of public health at societal level. These factors have often been of a socio-economic nature, whereas culture has attracted less attention. The aim of this dissertation is to analyze, operationalize, and empirically explore the potentials of the concept of culture in macro- level injury research. The dissertation is built upon five different papers. The empirical papers investigate the scope available for emp...

  15. Theoretical bases analysis of scientific prediction on marketing principles

    OpenAIRE

    A.S. Rosohata

    2012-01-01

    The article presents an overview of the categorical apparatus of scientific prediction and the theoretical foundations of scientific forecasting results, which are an integral part of the effective management of economic activity. Approaches to prediction taken by scientists in different fields of social science are examined, and modifications of the categories of scientific prediction based on marketing principles are proposed.

  16. Antecedents of employee electricity saving behavior in organizations: An empirical study based on norm activation model

    International Nuclear Information System (INIS)

    Zhang, Yixiang; Wang, Zhaohua; Zhou, Guanghui

    2013-01-01

    China is one of the major energy-consuming countries, and is under great pressure to promote energy saving and reduce domestic energy consumption. Employees constitute an important target group for energy saving. However, little research effort has been devoted to studying what drives employee energy saving behavior in organizations. To fill this gap, drawing on the norm activation model (NAM), we built a research model to study antecedents of employee electricity saving behavior in organizations. The model was empirically tested using survey data collected from office workers in Beijing, China. Results show that personal norm positively influences employee electricity saving behavior. Organizational electricity saving climate negatively moderates the effect of personal norm on electricity saving behavior. Awareness of consequences, ascription of responsibility, and organizational electricity saving climate positively influence personal norm. Furthermore, awareness of consequences positively influences ascription of responsibility. This paper contributes to the energy saving behavior literature by building a theoretical model of employee electricity saving behavior, which is understudied in the current literature. Based on the empirical results, implications for how to promote employee electricity saving are discussed. - Highlights: • We studied employee electricity saving behavior based on norm activation model. • The model was tested using survey data collected from office workers in China. • Personal norm positively influences employee's electricity saving behavior. • Electricity saving climate negatively moderates personal norm's effect. • This research enhances our understanding of employee electricity saving behavior

  17. Empirical and theoretical evidence concerning the response of the earth's ice and snow cover to a global temperature increase

    Energy Technology Data Exchange (ETDEWEB)

    Hollin, J T; Barry, R G

    1979-01-01

    As a guide to the possible effects of a CO2-induced warming on the cryosphere, we review the effects of three warm periods in the past, and our theoretical understanding of fluctuations in mountain glaciers, the Greenland and Antarctic ice sheets, ground ice, sea ice and seasonal snow cover. Between 1890 and 1940 A.D. the glaciated area in Switzerland was reduced by over 25%. In the Hypsithermal, at about 6000 BP, ground ice in Eurasia retreated northward by several hundred kilometers. In the interglacial Stage 5e, at about 120 000 BP, global sea-level rose by over 6 m. Fluctuations of mountain glaciers depend on mesoscale weather and on their mechanical response to it. Any melting of the Greenland ice sheet is likely to be slow in human terms. The West Antarctic ice sheet (its base below sea-level) is susceptible to an ungrounding, and such an event may have been the cause of the sea-level rise above. The East Antarctic ice sheet is susceptible to mechanical surges, which might be triggered by a warming at its margin. Both an ungrounding and a surge might occupy less than 100 yr, and are potentially the most important ice changes in human terms. Modeling studies suggest that a 5 °C warming would remove the Arctic pack ice in summer, and this may be the most significant effect for further climatic change.

  18. E-learning in engineering education: a theoretical and empirical study of the Algerian higher education institution

    Science.gov (United States)

    Benchicou, Soraya; Aichouni, Mohamed; Nehari, Driss

    2010-06-01

    Technology-mediated education or e-learning is growing globally both in scale and delivery capacity due to the large diffusion of the ubiquitous information and communication technologies (ICT) in general and the web technologies in particular. This statement has not yet been fully supported by research, especially in developing countries such as Algeria. The purpose of this paper was to identify directions for addressing the needs of academics in higher education institutions in Algeria in order to adopt the e-learning approach as a strategy to improve quality of education. The paper will report results of an empirical study that measures the readiness of the Algerian higher education institutions towards the implementation of ICT in the educational process and the attitudes of faculty members towards the application of the e-learning approach in engineering education. Three main objectives were targeted, namely: (a) to provide an initial evaluation of faculty members' attitudes and perceptions towards web-based education; (b) reporting on their perceived requirements for implementing e-learning in university courses; (c) providing an initial input for a collaborative process of developing an institutional strategy for e-learning. Statistical analysis of the survey results indicates that the Algerian higher education institution, which adopted the Licence - Master and Doctorate educational system, is facing a big challenge to take advantage of emerging technological innovations and the advent of e-learning to further develop its teaching programmes and to enhance the quality of education in engineering fields. The successful implementation of this modern approach is shown to depend largely on a set of critical success factors that would include: 1. The extent to which the institution will adopt a formal and official e-learning strategy. 2. 
The extent to which faculty members will adhere to and adopt this strategy and develop ownership of the various measures in the

  19. Empirically modelled Pc3 activity based on solar wind parameters

    Directory of Open Access Journals (Sweden)

    B. Heilig

    2010-09-01

    It is known that under certain solar wind (SW)/interplanetary magnetic field (IMF) conditions (e.g. high SW speed, low cone angle) the occurrence of ground-level Pc3–4 pulsations is more likely. In this paper we demonstrate that in the event of anomalously low SW particle density, Pc3 activity is extremely low regardless of otherwise favourable SW speed and cone angle. We re-investigate the SW control of Pc3 pulsation activity through a statistical analysis and two empirical models with emphasis on the influence of SW density on Pc3 activity. We utilise SW and IMF measurements from the OMNI project and ground-based magnetometer measurements from the MM100 array to relate SW and IMF measurements to the occurrence of Pc3 activity. Multiple linear regression and artificial neural network models are used in iterative processes in order to identify sets of SW-based input parameters, which optimally reproduce a set of Pc3 activity data. The inclusion of SW density in the parameter set significantly improves the models. Not only the density itself, but other density-related parameters, such as the dynamic pressure of the SW or the standoff distance of the magnetopause, work equally well in the model. The disappearance of Pc3s during low-density events can have at least four reasons according to the existing upstream wave theory: 1. Pausing of the ion-cyclotron resonance that generates the upstream ultra-low-frequency waves in the absence of protons, 2. Weakening of the bow shock, which implies less efficient reflection, 3. The SW becomes sub-Alfvénic and hence is not able to sweep back the waves propagating upstream with the Alfvén speed, and 4. The increase of the standoff distance of the magnetopause (and of the bow shock). Although the models cannot account for the lack of Pc3s during intervals when the SW density is extremely low, the resulting sets of optimal model inputs support the generation of mid latitude Pc3 activity predominantly through
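
    As a toy illustration of the multiple-linear-regression step described above, the sketch below regresses a synthetic Pc3 activity index on solar-wind speed, log density, and IMF cone angle. The predictors' functional forms and coefficients are invented for the example and are not taken from the paper or the OMNI data:

```python
import numpy as np

# Synthetic stand-ins for the solar-wind inputs; all values and the assumed
# dependence below are illustrative, not measured quantities.
rng = np.random.default_rng(42)
n = 500
speed = rng.normal(450.0, 80.0, n)       # km/s
density = rng.lognormal(1.5, 0.4, n)     # cm^-3
cone = rng.uniform(0.0, 90.0, n)         # degrees

# assumed dependence: more Pc3 activity at high speed and density, low cone angle
activity = (0.01 * speed + 0.5 * np.log(density) - 0.02 * cone
            + rng.normal(0.0, 0.5, n))

# ordinary least squares with an intercept, as in a multiple-linear-regression model
X = np.column_stack([np.ones(n), speed, np.log(density), cone])
beta, *_ = np.linalg.lstsq(X, activity, rcond=None)
r = np.corrcoef(activity, X @ beta)[0, 1]
```

The fitted coefficients recover the assumed signs (positive for speed and density, negative for cone angle), mirroring how the paper tests whether adding density-related inputs improves the model.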

  20. Moment Conditions Selection Based on Adaptive Penalized Empirical Likelihood

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

    Empirical likelihood is a very popular method and has been widely used in the fields of artificial intelligence (AI) and data mining as tablets, mobile applications, and social media dominate the technology landscape. This paper proposes an empirical likelihood shrinkage method to efficiently estimate unknown parameters and select correct moment conditions simultaneously, when the model is defined by moment restrictions in which some are possibly misspecified. We show that our method enjoys oracle-like properties; that is, it consistently selects the correct moment conditions and at the same time its estimator is as efficient as the empirical likelihood estimator obtained by all correct moment conditions. Moreover, unlike the GMM, our proposed method allows us to carry out confidence regions for the parameters included in the model without estimating the covariances of the estimators. For empirical implementation, we provide some data-driven procedures for selecting the tuning parameter of the penalty function. The simulation results show that the method works remarkably well in terms of correct moment selection and the finite sample properties of the estimators. Also, a real-life example is carried out to illustrate the new methodology.
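
    Schematically, penalized empirical likelihood augments the usual EL program with slack variables and a penalty on them; the parameterization below is a generic sketch of the idea, not the paper's exact formulation:

```latex
% Schematic penalized-EL objective (slack form and penalty are assumed, not the paper's)
\ell_\tau(\theta, v) \;=\;
\max_{p_1,\dots,p_n}\Big\{ \sum_{i=1}^{n} \log(n p_i) \;:\;
p_i \ge 0,\ \textstyle\sum_i p_i = 1,\
\textstyle\sum_i p_i\, g_j(X_i,\theta) = v_j \ \forall j \Big\}
\;-\; n \sum_j P_\tau\!\big(|v_j|\big)
```

Read this way, the oracle property claimed in the abstract corresponds to the penalty driving the slack $v_j$ of correct moment conditions to zero (so they bind) while misspecified moments retain nonzero slack and are effectively discarded.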

  1. Empiric treatment based on Helicobacter Pylori serology cannot ...

    African Journals Online (AJOL)

    Background: Evidence that chronic gastric Helicobacter pylori (HP) infection is an aetiological factor in dyspepsia, peptic ulcer disease, gastric carcinoma and lymphoma has led to the suggestion that all serologically positive dyspeptic patients should be treated empirically with antibiotics to eradicate the infection, without ...

  2. Theoretical bases on thermal stability of layered metallic systems

    International Nuclear Information System (INIS)

    Kadyrzhanov, K.K.; Rusakov, V.S.; Turkebaev, T.Eh.; Zhankadamova, A.M.; Ensebaeva, M.Z.

    2003-01-01

    The paper is dedicated to the implementation of theoretical bases for the thermal stabilization of layered metallic systems. The theory is based on a stabilization mechanism relying on the formation of an intermediate two-phase field. The parameters of the calculation model are the mutual diffusion coefficients and the sizes of inclusions of the phases generated in the two-phase fields. As an example, the dependence of the stabilization time of a beryllium-iron (Be (1.1 μm)-Fe (5.5 μm)) layered system on the iron and beryllium diffusion coefficients and inclusion sizes is shown. A conclusion is drawn about the possible change of mechanisms at the transition from microscopic consideration to the nano-crystal physics level
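
    The role of the two model parameters can be summarized by the usual diffusive scaling, in which the stabilization time grows quadratically with inclusion size; this is a generic order-of-magnitude relation, not the paper's full model:

```latex
% Generic diffusive time scale: d = inclusion size, \tilde{D} = mutual diffusion coefficient
\tau \;\sim\; \frac{d^{2}}{\tilde{D}}
```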

  3. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    Science.gov (United States)

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework. So, it is difficult to characterize and evaluate this approach. In this paper, we propose, in the 2-D case, the use of an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang's EMD method. This approach, especially based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed. Despite some effort, 2-D versions for EMD appear poorly performing and are very time consuming. So in this paper, an extension to the 2-D space of the PDE-based approach is extensively described. This approach has been applied in cases of both signal and image decomposition. The obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data. Some results have been provided in the case of image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
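
    For contrast with the PDE-based formulation, the original algorithmic sifting process it replaces can be sketched in a few lines. This toy 1-D version uses linear envelope interpolation instead of the cubic splines of Huang's method and simple fixed stopping rules, so it is illustrative only:

```python
import numpy as np

def sift_once(x):
    """One sifting step: subtract the mean of the upper and lower envelopes.

    Envelopes are linear interpolations through local extrema (cubic splines
    are standard; linear keeps this sketch dependency-free).
    """
    n = len(x)
    idx = np.arange(n)
    maxima = [i for i in range(1, n - 1) if x[i - 1] < x[i] > x[i + 1]]
    minima = [i for i in range(1, n - 1) if x[i - 1] > x[i] < x[i + 1]]
    if len(maxima) < 2 or len(minima) < 2:
        return x, True                      # too few extrema: treat as residue
    upper = np.interp(idx, maxima, x[maxima])
    lower = np.interp(idx, minima, x[minima])
    return x - 0.5 * (upper + lower), False

def emd(x, max_imfs=4, n_sift=10):
    """Algorithmic EMD: repeatedly sift to peel off intrinsic mode functions."""
    imfs, residue = [], np.asarray(x, dtype=float).copy()
    for _ in range(max_imfs):
        h, done = residue, False
        for _ in range(n_sift):
            h, done = sift_once(h)
            if done:
                break
        if done:
            break
        imfs.append(h)
        residue = residue - h
    return imfs, residue

# fast + slow oscillation: the first IMF should capture the fast component
t = np.linspace(0, 2, 500)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 0.5 * t)
imfs, residue = emd(x)
```

By construction the decomposition is exact: the IMFs plus the final residue sum back to the input signal.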

  4. The Importance of Emotion in Theories of Motivation: Empirical, Methodological, and Theoretical Considerations from a Goal Theory Perspective

    Science.gov (United States)

    Turner, Julianne C.; Meyer, Debra K.; Schweinle, Amy

    2003-01-01

    Despite its importance to educational psychology, prominent theories of motivation have mostly ignored emotion. In this paper, we review theoretical conceptions of the relation between motivation and emotion and discuss the role of emotion in understanding student motivation in classrooms. We demonstrate that emotion is one of the best indicators…

  5. THE FLAT TAX EFFECTS – THEORETICAL AND EMPIRICAL EVIDENCE IN WESTERN AND EASTERN EUROPEAN COUNTRIES

    Directory of Open Access Journals (Sweden)

    Moga Aura Carmen

    2009-05-01

    This paper takes a close look at the advantages and disadvantages of the flat tax and looks at its proven benefits and failings in some European countries which adopted it, and its theoretical or possible effects on the economies of other European countries.

  6. Tests of Parameters Instability: Theoretical Study and Empirical Applications on Two Types of Models (ARMA Model and Market Model

    Directory of Open Access Journals (Sweden)

    Sahbi FARHANI

    2012-01-01

    This paper considers tests of parameter instability and structural change with known, unknown, or multiple breakpoints. The results apply to a wide class of parametric models that are suitable for estimation by strong rules for detecting the number of breaks in a time series. For that, we use the Chow, CUSUM, CUSUM of squares, Wald, likelihood ratio, and Lagrange multiplier tests. Each test implicitly uses an estimate of a change point. We conclude with an empirical analysis of two different models (an ARMA model and a simple linear regression model).
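
    Of the tests listed, the Chow test with a known breakpoint is the simplest to sketch. The data below are simulated, and the statistic follows the textbook F-form rather than any specific implementation from the paper:

```python
import numpy as np

def chow_test(y, X, split):
    """Chow F-statistic for a structural break at a known observation index.

    F = [(RSS_pooled - RSS_1 - RSS_2) / k] / [(RSS_1 + RSS_2) / (n - 2k)]
    """
    def rss(yy, XX):
        beta, *_ = np.linalg.lstsq(XX, yy, rcond=None)
        r = yy - XX @ beta
        return r @ r

    n, k = X.shape
    rss_pooled = rss(y, X)
    rss_1 = rss(y[:split], X[:split])
    rss_2 = rss(y[split:], X[split:])
    return ((rss_pooled - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (n - 2 * k))

# simulated series whose slope changes from 1 to 3 at t = 50
rng = np.random.default_rng(0)
t = np.arange(100.0)
X = np.column_stack([np.ones(100), t])
y = np.where(t < 50, t, 50 + 3 * (t - 50)) + rng.normal(0, 0.1, 100)
f_stat = chow_test(y, X, 50)
```

Under the null of no break, the statistic is F-distributed with (k, n − 2k) degrees of freedom; with the clear slope change above it comes out far in the rejection region.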

  7. DISCLOSURE POLICY AND PRICE VOLATILITY: A THEORETICAL DESCRIPTION AND EMPIRICAL TESTS OF THE 'FILTER EFFECT'

    Institute of Scientific and Technical Information of China (English)

    XiangminChen; JianghuiLin

    2004-01-01

    In the context of a push towards full disclosure by the regulatory authorities of securities markets, we evaluate the effectiveness of corporate disclosure policy by examining the 'filter effect'. Controlling for firm size and earnings changes, we conduct an empirical test of various disclosure options. Our results show that the recent increase in disclosure frequency in mainland China's securities markets has not yet achieved its anticipated objective. Disclosure quality remains low and small firms often manipulate their stock prices through selective release of information.

  8. Theoretical Investigation of Bismuth-Based Semiconductors for Photocatalytic Applications

    KAUST Repository

    Laradhi, Shaikhah

    2017-11-01

    Converting solar energy to clean fuel has gained remarkable attention as an emerging renewable energy resource, but optimum efficiency in photocatalytic applications has not yet been reached. One of the dominant factors is designing efficient photocatalytic semiconductors. This research presents a theoretical investigation of the optoelectronic properties of bismuth-based metal oxide and oxysulfide semiconductors, using a highly accurate first-principles quantum method based on density functional theory together with the range-separated hybrid HSE06 exchange-correlation functional. First, bismuth titanate compounds including Bi12TiO20, Bi4Ti3O12, and Bi2Ti2O7 were studied in a combined experimental and theoretical approach to prove their photocatalytic activity under UV light. They have a unique bismuth layered structure, tunable electronic properties, a high dielectric constant, and low electron and hole effective masses in one crystallographic direction, allowing for good charge separation and carrier diffusion properties. The accuracy of the investigation was determined by the good agreement between experimental and theoretical values. Next, BiVO4, with the highest efficiency for oxygen evolution, was investigated. A discrepancy between the experimental and theoretical bandgap was reported and inspired a systematic study of all intrinsic defects of the material and the corresponding effect on the optical and transport properties. A candidate defective structure was proposed for an efficient photocatalytic performance. To overcome the carrier transport limitation, a mild hydrogen treatment was also introduced. Carrier lifetime was enhanced due to a significant reduction of trap-assisted recombination, either via passivation of deep trap states or reduction of trap state density. Finally, an accurate theoretical approach to design a new family of semiconductors with enhanced optoelectronic properties for water splitting was proposed. 
We simulated the solid solutions Bi1−xRExCuOS (RE = Y, La

  9. Institutionalization of conflict capability in the management of natural resources : theoretical perspectives and empirical experience in Indonesia

    OpenAIRE

    Yasmi, Y.

    2007-01-01

    Keywords: natural resource conflict, conflict capability, impairment, escalation This study concerns natural resource management (NRM) conflict, particularly conflict in the forestry sector, and how such conflict can be addressed effectively. It consists of two major parts. The first deals with the theoretical review of conflict literature. It shows how conflict can be conceptualized distinctively and how such distinctive conceptualization can be used as a strong basis for understanding and addressing...

  10. What should we mean by empirical validation in hypnotherapy: evidence-based practice in clinical hypnosis.

    Science.gov (United States)

    Alladin, Assen; Sabatini, Linda; Amundson, Jon K

    2007-04-01

    This paper briefly surveys the trend of and controversy surrounding empirical validation in psychotherapy. Empirical validation of hypnotherapy has paralleled the practice of validation in psychotherapy and the professionalization of clinical psychology, in general. This evolution in determining what counts as evidence for bona fide clinical practice has gone from theory-driven clinical approaches in the 1960s and 1970s through critical attempts at categorization of empirically supported therapies in the 1990s on to the concept of evidence-based practice in 2006. Implications of this progression in professional psychology are discussed in the light of hypnosis's current quest for validation and empirical accreditation.

  11. Theoretical study to determine the heat transfer by forced convection coefficient in an empirical correlation in single phase, for annular channels

    International Nuclear Information System (INIS)

    Herrera A, E.

    1994-01-01

    In studies of heat transfer by forced convection, few data are available on the behavior of fluids in an annular channel heated by a concentric pipe. Such data are necessary to determine the heat transfer coefficient, which relates the exchange of energy and the thermal properties of the fluid to the geometry of the flow. The objective of this work was to compare several empirical correlations needed to determine the heat transfer coefficient for annular channels; the results obtained were similar to the theoretical results of an experiment made by Miller and Benforado. Knowing such coefficients is important because they determine the quantity of heat transmitted to a probe zone, in which we simulate a nuclear fuel element that develops a large quantity of heat that must be dissipated in a short time. We present theoretical data on forced-convection heat transfer and analyze the phenomena in annular channels, reviewing the empirical correlations employed by several investigators and analyzing each one. (Author)
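
    For orientation, single-phase forced-convection correlations for annular channels typically take the Dittus-Boelter form evaluated with the hydraulic diameter of the annulus. This is a representative textbook correlation, not necessarily one of those compared in the report:

```latex
% Dittus-Boelter form (heating), with the annulus hydraulic diameter
\mathrm{Nu} = \frac{h\,D_h}{k} = 0.023\,\mathrm{Re}^{0.8}\,\mathrm{Pr}^{0.4},
\qquad D_h = D_o - D_i,
\qquad \mathrm{Re} = \frac{\rho\, u\, D_h}{\mu}
```

Here $h$ is the convective heat transfer coefficient, $k$ the fluid thermal conductivity, and $D_o$, $D_i$ the outer and inner diameters of the annular gap.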

  12. Why resilience is unappealing to social science: Theoretical and empirical investigations of the scientific use of resilience

    Science.gov (United States)

    Olsson, Lennart; Jerneck, Anne; Thoren, Henrik; Persson, Johannes; O’Byrne, David

    2015-01-01

    Resilience is often promoted as a boundary concept to integrate the social and natural dimensions of sustainability. However, it is a troubled dialogue from which social scientists may feel detached. To explain this, we first scrutinize the meanings, attributes, and uses of resilience in ecology and elsewhere to construct a typology of definitions. Second, we analyze core concepts and principles in resilience theory that cause disciplinary tensions between the social and natural sciences (system ontology, system boundary, equilibria and thresholds, feedback mechanisms, self-organization, and function). Third, we provide empirical evidence of the asymmetry in the use of resilience theory in ecology and environmental sciences compared to five relevant social science disciplines. Fourth, we contrast the unification ambition in resilience theory with methodological pluralism. Throughout, we develop the argument that incommensurability and unification constrain the interdisciplinary dialogue, whereas pluralism drawing on core social scientific concepts would better facilitate integrated sustainability research. PMID:26601176

  13. Theoretical Insight Into the Empirical Tortuosity-Connectivity Factor in the Burdine-Brooks-Corey Water Relative Permeability Model

    Science.gov (United States)

    Ghanbarian, Behzad; Ioannidis, Marios A.; Hunt, Allen G.

    2017-12-01

    A model commonly applied to the estimation of water relative permeability krw in porous media is the Burdine-Brooks-Corey model, which relies on a simplified picture of pores as a bundle of noninterconnected capillary tubes. In this model, the empirical tortuosity-connectivity factor is assumed to be a power law function of effective saturation with an exponent (μ) commonly set equal to 2 in the literature. Invoking critical path analysis and using percolation theory, we relate the tortuosity-connectivity exponent μ to the critical scaling exponent t of percolation that characterizes the power law behavior of the saturation-dependent electrical conductivity of porous media. We also discuss the cause of the nonuniversality of μ in terms of the nonuniversality of t and compare model estimations with water relative permeability from experiments. The comparison supports determining μ from the electrical conductivity scaling exponent t, but also highlights limitations of the model.
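
    For reference, under the Brooks-Corey capillary pressure curve the Burdine integral yields the familiar closed form; this is the textbook derivation, not a result specific to the paper:

```latex
% Brooks-Corey capillary pressure and the Burdine form of k_rw
P_c(S_e) = P_e\, S_e^{-1/\lambda}, \qquad
k_{rw}(S_e) = S_e^{\mu}\,
\frac{\int_0^{S_e} P_c^{-2}\, dS}{\int_0^{1} P_c^{-2}\, dS}
= S_e^{\,\mu + (2+\lambda)/\lambda},
\qquad k_{rw}\big|_{\mu=2} = S_e^{(3\lambda+2)/\lambda}
```

The paper's proposal then amounts to replacing the conventional $\mu = 2$ with a value tied to the electrical conductivity scaling exponent $t$ from percolation theory.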

  14. Unsupervised active learning based on hierarchical graph-theoretic clustering.

    Science.gov (United States)

    Hu, Weiming; Hu, Wei; Xie, Nianhua; Maybank, Steve

    2009-10-01

    Most existing active learning approaches are supervised. Supervised active learning has the following problems: inefficiency in dealing with the semantic gap between the distribution of samples in the feature space and their labels, lack of ability in selecting new samples that belong to new categories that have not yet appeared in the training samples, and lack of adaptability to changes in the semantic interpretation of sample categories. To tackle these problems, we propose an unsupervised active learning framework based on hierarchical graph-theoretic clustering. In the framework, two promising graph-theoretic clustering algorithms, namely, dominant-set clustering and spectral clustering, are combined in a hierarchical fashion. Our framework has some advantages, such as ease of implementation, flexibility in architecture, and adaptability to changes in the labeling. Evaluations on data sets for network intrusion detection, image classification, and video classification have demonstrated that our active learning framework can effectively reduce the workload of manual classification while maintaining a high accuracy of automatic classification. It is shown that, overall, our framework outperforms the support-vector-machine-based supervised active learning, particularly in terms of dealing much more efficiently with new samples whose categories have not yet appeared in the training samples.
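
    A minimal sketch of the spectral-clustering side of such a framework, assuming an unnormalized graph Laplacian and a two-way split by the sign of the Fiedler vector (the paper's actual hierarchical combination with dominant-set clustering is more elaborate):

```python
import numpy as np

def spectral_bipartition(W):
    """Split a weighted graph into two clusters using the Fiedler vector.

    W: symmetric affinity matrix. Labels come from the sign of the
    eigenvector of the second-smallest eigenvalue of L = D - W.
    """
    D = np.diag(W.sum(axis=1))
    L = D - W
    vals, vecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    fiedler = vecs[:, 1]
    return fiedler > 0

# two dense blocks joined by one weak edge
W = np.zeros((7, 7))
W[:4, :4] = 1.0
W[4:, 4:] = 1.0
W[0, 4] = W[4, 0] = 0.01
np.fill_diagonal(W, 0.0)
labels = spectral_bipartition(W)
```

On this toy affinity matrix the sign split recovers the two dense blocks exactly, since the Fiedler vector is nearly constant within each weakly connected block.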

  15. Theoretical Proof and Empirical Confirmation of a Continuous Labeling Method Using Naturally 13C-Depleted Carbon Dioxide

    Institute of Scientific and Technical Information of China (English)

    Weixin Cheng; Feike A. Dijkstra

    2007-01-01

    Continuous isotope labeling and tracing is often needed to study the transformation, movement, and allocation of carbon in plant-soil systems. However, existing labeling methods have numerous limitations. The present study introduces a new continuous labeling method using naturally 13C-depleted CO2. We theoretically proved that a stable level of 13C-CO2 abundance in a labeling chamber can be maintained by controlling the rate of CO2-free air injection and the rate of ambient airflow, coupled with automatic control of CO2 concentration using a CO2 analyzer. The theoretical results were tested and confirmed in a 54 day experiment in a plant growth chamber. This new continuous labeling method avoids the use of radioactive 14C or expensive 13C-enriched CO2 required by existing methods and therefore eliminates issues of radiation safety or unaffordable isotope cost, as well as creating new opportunities for short- or long-term labeling experiments under a controlled environment.
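
    The stable-abundance claim can be read as a two-end-member isotopic mass balance at steady state; this schematic mixing equation is one plausible reading, not the authors' stated control law:

```latex
% Schematic steady-state mixing of chamber CO2 from two sources:
% ambient air (a) and the 13C-depleted source (t)
\delta^{13}\mathrm{C}_{\mathrm{chamber}} \;\approx\;
\frac{F_a C_a\, \delta_a + F_t C_t\, \delta_t}{F_a C_a + F_t C_t}
```

Here $F$ are flow rates, $C$ are CO2 concentrations, and $\delta$ the isotopic signatures; holding the flow-rate ratio fixed holds the chamber signature fixed.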

  16. Training-Based Interventions in Motor Rehabilitation after Stroke: Theoretical and Clinical Considerations

    Directory of Open Access Journals (Sweden)

    Annette Sterr

    2004-01-01

    Basic neuroscience research on brain plasticity, motor learning and recovery has stimulated new concepts in neurological rehabilitation. Combined with the development of set methodological standards in clinical outcome research, these findings have led to a double-paradigm shift in motor rehabilitation: (a) the move towards evidence-based procedures for the assessment of clinical outcome and the employment of disablement models to anchor outcome parameters, and (b) the introduction of practice-based concepts that are derived from testable models that specify treatment mechanisms. In this context, constraint-induced movement therapy (CIT) has played a catalytic role in taking motor rehabilitation forward into the scientific arena. As a theoretically founded and hypothesis-driven intervention, CIT research focuses on two main issues. The first issue is the assessment of long-term clinical benefits in an increasing range of patient groups, and the second issue is the investigation of neuronal and behavioural treatment mechanisms and their interactive contribution to treatment success. These studies are mainly conducted in the research environment and will eventually lead to increased treatment benefits for patients in standard health care. However, gradual but presumably more immediate benefits for patients may be achieved by introducing and testing derivatives of the CIT concept that are more compatible with current clinical practice. Here, we summarize the theoretical and empirical issues related to the translation of research-based CIT work into the clinical context of standard health care.

  17. The equivalence of information-theoretic and likelihood-based methods for neural dimensionality reduction.

    Directory of Open Access Journals (Sweden)

    Ross S Williamson

    2015-04-01

    Full Text Available Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron's probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as "single-spike information" to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex.
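    The equivalence claimed above can be made concrete in a few lines: under an LNP model, the binned Poisson log-likelihood (up to spike-count constants) is the objective that, per the abstract, MID effectively maximizes when the nonlinearity is estimated non-parametrically. The variable names below are illustrative, not the paper's notation:

```python
import numpy as np

def lnp_log_likelihood(stim, spikes, w, nonlin, dt):
    """Poisson log-likelihood of binned spike counts under a
    linear-nonlinear-Poisson model: rate = nonlin(stim @ w) * dt.
    Constant terms (log-factorials of the counts) are dropped."""
    rate = nonlin(stim @ w) * dt          # expected spike count per bin
    return float(np.sum(spikes * np.log(rate) - rate))
```

    Maximizing this quantity over the stimulus filter w is the likelihood-based reading of the MID objective sketched in the abstract.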

  18. Within-Country Inequality and the Modern World-System: A Theoretical Reprise and Empirical First Step

    Directory of Open Access Journals (Sweden)

    Matthew C. Mahutga

    2015-08-01

    Full Text Available This article calls for a renewed investigation of the world-system position—inequality link. We begin by outlining two general types of causal mechanisms through which a country’s position in the world-system should impact the distribution of income within it. The first type impacts inequality indirectly by conditioning the developmental process, and calls for conceptual and empirical models of inequality that account for the link between world-system position and economic development. The second type impacts inequality directly through processes that are more or less unobservable, because they change over time or belie cross-nationally comparative indicators, and can thereby be captured by direct measurements of world-system position itself that stand in for varying or unobservable causal processes. We then analyze five measures of world-system position to identify which, if any, provides the most useful association with income inequality. Our findings suggest that the classic measure of Snyder and Kick (1979) provides the strongest association. We conclude by suggesting fruitful directions for future research.

  19. Shape analysis of isoseismals based on empirical and synthetic data

    International Nuclear Information System (INIS)

    Molchan, G.; Panza, G.F.

    2000-11-01

    We present an attempt to compare modeled ground motion acceleration fields with macroseismic observations. Two techniques for the representation of the observed intensities by isoseismals, a smoothing technique and one which visualizes the local uncertainty of an isoseismal, are tested with synthetic and observed data. We show how noise in the data and irregularities in the distribution of observation sites affect the resolution of the isoseismal's shape. In addition to "standard" elongated shapes, we identify cross-like patterns in the macroseismic observations for two Italian earthquakes of strike-slip type; similar patterns are displayed by the theoretical peak acceleration fields calculated assuming the point source models given in the literature. (author)

  20. Transport simulations TFTR: Theoretically-based transport models and current scaling

    International Nuclear Information System (INIS)

    Redi, M.H.; Cummings, J.C.; Bush, C.E.; Fredrickson, E.; Grek, B.; Hahm, T.S.; Hill, K.W.; Johnson, D.W.; Mansfield, D.K.; Park, H.; Scott, S.D.; Stratton, B.C.; Synakowski, E.J.; Tang, W.M.; Taylor, G.

    1991-12-01

    In order to study the microscopic physics underlying observed L-mode current scaling, 1-1/2-d BALDUR has been used to simulate density and temperature profiles for high and low current, neutral beam heated discharges on TFTR with several semi-empirical, theoretically-based models previously compared for TFTR, including several versions of trapped electron drift wave driven transport. Experiments at TFTR, JET and DIII-D show that the I_p scaling of τ_E does not arise from edge modes as previously thought, and is most likely to arise from nonlocal processes or from the I_p-dependence of local plasma core transport. Consistent with this, it is found that strong current scaling does not arise from any of several edge models of resistive ballooning. Simulations with the profile-consistent drift wave model and with a new model for toroidal collisionless trapped electron mode core transport in a multimode formalism lead to strong current scaling of τ_E for the L-mode cases on TFTR. None of the theoretically-based models succeeded in simulating the measured temperature and density profiles for both high and low current experiments.

  1. Strong Generative Capacity and the Empirical Base of Linguistic Theory

    Directory of Open Access Journals (Sweden)

    Dennis Ott

    2017-09-01

    Full Text Available This Perspective traces the evolution of certain central notions in the theory of Generative Grammar (GG). The founding documents of the field suggested a relation between the grammar, construed as recursively enumerating an infinite set of sentences, and the idealized native speaker that was essentially equivalent to the relation between a formal language (a set of well-formed formulas) and an automaton that recognizes strings as belonging to the language or not. But this early view was later abandoned, when the focus of the field shifted to the grammar's strong generative capacity as recursive generation of hierarchically structured objects as opposed to strings. The grammar is no longer seen as specifying a set of well-formed expressions and in fact necessarily constructs expressions of any degree of intuitive “acceptability.” The field of GG, however, has not sufficiently acknowledged the significance of this shift in perspective, as evidenced by the fact that (informal and experimentally controlled) observations about string acceptability continue to be treated as bona fide data and generalizations for the theory of GG. The focus on strong generative capacity, it is argued, requires a new discussion of what constitutes valid empirical evidence for GG beyond observations pertaining to weak generation.

  2. Bridging process-based and empirical approaches to modeling tree growth

    Science.gov (United States)

    Harry T. Valentine; Annikki Makela; Annikki Makela

    2005-01-01

    The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...

  3. Validation of an empirically based instrument for the assessment of the quality of teaching in medicine

    OpenAIRE

    Prescher, Anja

    2016-01-01

    Measuring the quality of teaching is a necessary prerequisite for the evaluation and development of medical education and thus for high-quality patient care. Corresponding quality indicators can make the feedback for teachers comprehensible. A completely empirically based instrument for the assessment of the quality of teaching in medicine has not yet been described. Ten empirically based criteria from the field of general pedagogy were developed: clear structure, amount of true learning time...

  4. The emergence of a temporally extended self and factors that contribute to its development: from theoretical and empirical perspectives.

    Science.gov (United States)

    2013-04-01

    The main aims of the current research were to determine when children develop a temporally extended self (TES) and what factors contribute to its development. However, in order to address these aims it was important to, first, assess whether the test of delayed self-recognition (DSR) is a valid measure for the development of the TES, and, second, to propose and evaluate a theoretical model that describes what factors influence the development of the TES. The validity of the DSR test was verified by comparing the performance of 57 children on the DSR test to their performance on a meta-representational task (modified false belief task) and to a task that was essentially the same as the DSR test but was specifically designed to rely on the capacity to entertain secondary representations (i.e., surprise body task). Longitudinal testing of the children showed that at the mental age (MA) of 2.5 years they failed the DSR test, despite training them to understand the intended functions of the medium used in the DSR test; whereas, with training, children at the MA of 3.0 and 3.5 years exhibited DSR. Children at the MA of 4 years exhibited DSR without any training. Finally, results suggest that children's meta-representational ability was the only factor that contributed to the prediction of successful performance on the DSR test, and thus to the emergence of the TES. Furthermore, prospective longitudinal data revealed that caregiver conversational style was the only factor that contributed to the prediction of level of training required to pass the DSR test. That is, children of low-elaborative caregivers required significantly more training to pass the DSR test than children of high-elaborative caregivers, indicating that children who received more elaborative conversational input from their caregivers had a more advanced understanding of their TES. © 2013 The Society for Research in Child Development, Inc.

  5. Empirical methods for systematic reviews and evidence-based medicine

    NARCIS (Netherlands)

    van Enst, W.A.

    2014-01-01

    Evidence-Based Medicine is the integration of best research evidence with clinical expertise and patient values. Systematic reviews have become the cornerstone of evidence-based medicine, which is reflected in the position systematic reviews have in the pyramid of evidence-based medicine. Systematic

  6. Discourse Analysis of the Documentary Method as "Key" to Self-Referential Communication Systems? Theoretic-Methodological Basics and Empirical Vignettes

    Directory of Open Access Journals (Sweden)

    Gian-Claudio Gentile

    2010-09-01

    Full Text Available Niklas LUHMANN is well known for his deliberate departure from the classical focus on studying individual actions, directing attention instead to the actors' relatedness through so-called (autopoietic) communication systems. Yet while the focus on autopoietic systems yields a new perspective of observation, it is simultaneously the biggest methodological obstacle to the theory's use in the social and management sciences. The present contribution considers this shift both on a theoretical level and with a specific qualitative method. It argues for a deeper understanding of systemic sense-making and its enactment in a systematic and comprehensible way. Central to this approach is its focus on groups. Using group discussions as the method of data collection, and the "documentary method" of Ralf BOHNSACK (2003) as the method of data analysis, the article describes a methodologically grounded way to record the self-referential systems proposed by LUHMANN's system theory. The theoretical considerations of the paper are illustrated by empirical vignettes derived from a research project conducted in Switzerland concerning the social responsibility of business. URN: urn:nbn:de:0114-fqs1003156

  7. Deep in Data. Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository

    Energy Technology Data Exchange (ETDEWEB)

    Neymark, J. [J.Neymark and Associates, Golden, CO (United States); Roberts, D. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-06-01

    This paper describes progress toward developing a usable, standardized, empirical data-based software accuracy test suite using home energy consumption and building description data. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This could allow for modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data.

  8. THEORETICAL BASES OF DIVERSIFICATION OF PENITENTIARY EDUCATIONAL SYSTEM

    Directory of Open Access Journals (Sweden)

    Нэилэ Каюмовна Щепкина

    2013-08-01

    Full Text Available The article presents the main results of research devoted to the theoretical bases of diversification of the penitentiary educational system in institutions of confinement. The urgency of the research follows from the social importance of convicts’ education. The article draws attention to the fact that the problem of diversification of the penitentiary educational system has not yet been considered in pedagogy. It also identifies the main contradictions, tasks and methods of the research. A retrospective analysis of the criminal system in Russia helps to define the existing tendencies in convicts’ education and the unsolved problems in this field of science, and to formulate perspective ideas for modernizing the penitentiary educational system. The article explains the essence of diversification of the penitentiary educational system and presents it in a model; it gives a detailed analysis of the model’s components and depicts some practical ways of its embodiment in institutions of confinement. Moreover, the article describes the determinants of diversification of the penitentiary educational system, understood as the factors and conditions of its effective development. DOI: http://dx.doi.org/10.12731/2218-7405-2013-6-20

  9. Organizing the public health-clinical health interface: theoretical bases.

    Science.gov (United States)

    St-Pierre, Michèle; Reinharz, Daniel; Gauthier, Jacques-Bernard

    2006-01-01

    This article addresses the issue of the interface between public health and clinical health within the context of the search for networking approaches geared to a more integrated delivery of health services. The articulation of an operative interface is complicated by the fact that the definition of networking modalities involves complex intra- and interdisciplinary and intra- and interorganizational systems across which a new transversal dynamics of intervention practices and exchanges between service structures must be established. A better understanding of the situation is reached by shedding light on the rationale underlying the organizational methods that form the bases of the interface between these two sectors of activity. The Quebec experience demonstrates that neither the structural-functionalist approach, which emphasizes remodelling establishment structures and functions as determinants of integration, nor the structural-constructivist approach, which prioritizes distinct fields of practice in public health and clinical health, adequately serves the purpose of networking and integration. Consequently, a theoretical reframing is imperative. In this regard, structuration theory, which fosters the simultaneous study of methods of inter-structure coordination and inter-actor cooperation, paves the way for a better understanding of the situation and, in turn, to the emergence of new integration possibilities.

  10. Theoretical bases of individualization of training in wrestling

    Directory of Open Access Journals (Sweden)

    S.V. Latyshev

    2013-04-01

    Full Text Available Theoretical bases for the individualization of training in wrestling are developed. They include the structure and organization of the research, the positions of the conception, and the system of individualized training. The system of individualized training is designed as an aggregate of elements and subsystems that together assist the discovery, formation, development and perfection of a wrestler's own style of opposition. Within the training activity itself, substantially more attention is given to developing special endurance and the related directed qualities. In post-training and post-competition activity, the accent shifts toward the search for more effective means of recovery and stimulation of special capacity, new optimal nutrition rations and food supplements, and new methods of weight reduction for wrestlers. In competition activity, the tactics of conducting bouts change, foreseeing a still more rational and economical expenditure of energy in a single fight and in the competition as a whole.

  11. Accrual-based accounting system versus cash-based accounting: An empirical study in municipality organization

    Directory of Open Access Journals (Sweden)

    Mahbobeh Arab

    2013-01-01

    Full Text Available There are many cases where we wish to choose a good accounting system and need to learn how the alternatives work, along with the advantages and disadvantages of each, so that we can choose the better one for a business. In this paper, we present an empirical survey to understand whether to choose an accrual or a cash accounting system. The proposed study designs a questionnaire among 220 experts in the area of accounting affairs. The survey considers four sub-hypotheses and one main hypothesis to determine whether there are more reliable rules and regulations in accrual-based accounting than in cash accounting. Similarly, the survey investigates whether accrual-based accounting is more informative and comprehensive and provides better comparative results than cash accounting. The results indicate that accrual-based accounting performs better in terms of all the mentioned criteria and is a better method for managing accounting affairs than cash accounting systems.

  12. The Case of Value Based Communication—Epistemological and Methodological Reflections from a System Theoretical Perspective

    Directory of Open Access Journals (Sweden)

    Victoria von Groddeck

    2010-09-01

    Full Text Available The aim of this paper is to reflect on the epistemological and methodological aspects of an empirical research study which analyzes the phenomenon of increased value communication within business organizations from a system theoretical perspective in the tradition of Niklas LUHMANN. Drawing on the theoretical term of observation, it shows how a research perspective can be developed which opens up the scope for an empirical analysis of communication practices. This analysis focuses on the reconstruction of these practices by, first, understanding how these practices stabilize themselves and, second, contrasting different practices to educe an understanding of different forms of observation of the relevant phenomenon and of the functions of these forms. Thus, this approach combines system theoretical epistemology, analytical research strategies, such as form and functional analysis, and qualitative research methods, such as narrative interviews, participant observation and document analysis. URN: urn:nbn:de:0114-fqs1003177

  13. Augmented Reality-Based Simulators as Discovery Learning Tools: An Empirical Study

    Science.gov (United States)

    Ibáñez, María-Blanca; Di-Serio, Ángela; Villarán-Molina, Diego; Delgado-Kloos, Carlos

    2015-01-01

    This paper reports empirical evidence on having students use AR-SaBEr, a simulation tool based on augmented reality (AR), to discover the basic principles of electricity through a series of experiments. AR-SaBEr was enhanced with knowledge-based support and inquiry-based scaffolding mechanisms, which proved useful for discovery learning in…

  14. Designing internet-based payment system: guidelines and empirical basis

    NARCIS (Netherlands)

    Abrazhevich, D.; Markopoulos, P.; Rauterberg, G.W.M.

    2009-01-01

    This article describes research into online electronic payment systems, focusing on the aspects of payment systems that are critical for their acceptance by end users. Based on our earlier research and a diary study of payments with an online payment system and with online banking systems of a

  15. Empirical Studies On Machine Learning Based Text Classification Algorithms

    OpenAIRE

    Shweta C. Dharmadhikari; Maya Ingle; Parag Kulkarni

    2011-01-01

    Automatic classification of text documents has become an important research issue nowadays. Proper classification of text documents requires information retrieval, machine learning and Natural Language Processing (NLP) techniques. Our aim is to focus on important approaches to automatic text classification based on machine learning techniques, viz. supervised, unsupervised and semi-supervised. In this paper we present a review of various text classification approaches under the machine learning paradig...

  16. Supervisory Adaptive Network-Based Fuzzy Inference System (SANFIS) Design for Empirical Test of Mobile Robot

    Directory of Open Access Journals (Sweden)

    Yi-Jen Mon

    2012-10-01

    Full Text Available A supervisory Adaptive Network-based Fuzzy Inference System (SANFIS) is proposed for the empirical control of a mobile robot. This controller includes an ANFIS controller and a supervisory controller. The ANFIS controller is tuned off-line by an adaptive fuzzy inference system; the supervisory controller is designed to compensate for the approximation error between the ANFIS controller and the ideal controller, and to drive the trajectory of the system onto a specified surface (called the sliding surface or switching surface) while keeping the trajectory on this switching surface continuously to guarantee system stability. The SANFIS controller achieves favourable empirical control performance in tests of driving the mobile robot along a square path. Practical experimental results demonstrate that the proposed SANFIS achieves better control performance than an ANFIS controller alone for empirical control of the mobile robot.

  17. The Ethical Judgment and Moral Reaction to the Product-Harm Crisis: Theoretical Model and Empirical Research

    Directory of Open Access Journals (Sweden)

    Dong Lu

    2016-07-01

    Full Text Available Based on the dual-process theory of ethical judgment, a research model is proposed for examining consumers’ moral reactions to a product-harm crisis. A national-wide survey was conducted with 801 respondents in China. The results of this study indicate that consumers will react to a product-harm crisis through controlled cognitive processing and emotional intuition. The results of the study also show that consumers view a product-harm crisis as an ethical issue, and they will make an ethical judgment according to the perceived severity and perceived relevance of the crisis. The ethical judgment in the perceived crisis severity and perceived crisis relevance will affect consumers’ condemning emotions in terms of contempt and anger. Through controlled cognitive processing, a personal consumption-related reaction (purchasing intention is influenced by the perceived crisis severity. Furthermore, a social and interpersonal reaction (negative word of mouth is influenced by the perceived crisis relevance through the controlled cognitive processing. This social and interpersonal reaction is also influenced by the perceived crisis severity and perceived crisis relevance through the intuition of other-condemning emotion. Moreover, this study finds that the product knowledge negatively moderates the impact of the perceived crisis severity on the condemning emotions. Therefore, when a consumer has a high level of product knowledge, the effect of perceived crisis severity on the condemning emotions will be attenuated, and vice versa. This study provides scholars and managers with means of understanding and handling of consumers’ reactions to a product-harm crisis.

  18. Comparing strategies for controlling an African pest rodent: an empirically based theoretical study

    DEFF Research Database (Denmark)

    Stenseth, Nils Chr.; Leirs, Herwig; Mercelis, Saskia

    2001-01-01

    ...in particular cause major economic losses in Africa through damage to crops. Attempts to develop dynamic population models for this and other pest rodents are ongoing. 2. Demographic estimates from a capture-mark-recapture (CMR) study in Tanzania were used to parameterize a population model for this species of rats. Control measures affecting survival as well as reproduction were considered. 4. The model showed that control measures reducing survival will only have long-term effects on population size if they are also applied when rodent densities are low. Control measures applied only when rodent densities are high will not have persistent effects, even at high mortality rates. 5. The model demonstrated that control measures reducing reproduction are likely to prevent Mastomys outbreaks, but will keep densities low over a long period only when the contraceptive effect is strong (> 75% reduction). 6. Provided...

  19. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    Science.gov (United States)

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-06-24

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches.
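    The recognition side of this setup can be sketched in a few lines: each strategy is modeled by a PFA, and an observed action trace is assigned to the strategy whose automaton gives it the highest probability. The dictionary encoding below (deterministic transitions with emission probabilities) is a simplifying assumption, not the paper's learned models:

```python
import math

def trace_log_prob(pfa, trace, state=0):
    """Log-probability of an action trace under a PFA encoded as
    {state: {action: (next_state, prob)}}."""
    logp = 0.0
    for action in trace:
        state, p = pfa[state][action]
        logp += math.log(p)
    return logp

def recognize(pfas, trace):
    """Behavioral Recognition: pick the strategy whose PFA makes
    the observed trace most likely."""
    return max(pfas, key=lambda name: trace_log_prob(pfas[name], trace))
```

    For example, a one-state "wall-following" PFA that mostly emits left turns will be preferred over a "spiral" PFA for a trace dominated by L actions; Behavioral Cloning then corresponds to learning the transition probabilities themselves from the training traces.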

  20. Toward an Empirically-based Parametric Explosion Spectral Model

    Science.gov (United States)

    Ford, S. R.; Walter, W. R.; Ruppert, S.; Matzel, E.; Hauk, T. F.; Gok, R.

    2010-12-01

    Small underground nuclear explosions need to be confidently detected, identified, and characterized in regions of the world where they have never occurred. We develop a parametric model of the nuclear explosion seismic source spectrum derived from regional phases (Pn, Pg, and Lg) that is compatible with earthquake-based geometrical spreading and attenuation. Earthquake spectra are fit with a generalized version of the Brune spectrum, which is a three-parameter model that describes the long-period level, corner-frequency, and spectral slope at high-frequencies. These parameters are then correlated with near-source geology and containment conditions. There is a correlation of high gas-porosity (low strength) with increased spectral slope. However, there are trade-offs between the slope and corner-frequency, which we try to independently constrain using Mueller-Murphy relations and coda-ratio techniques. The relationship between the parametric equation and the geologic and containment conditions will assist in our physical understanding of the nuclear explosion source, and aid in the prediction of observed local and regional distance seismic amplitudes for event identification and yield determination in regions with incomplete or no prior history of underground nuclear testing.
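    The three-parameter spectral form described above can be sketched as a generalized Brune spectrum: flat at the long-period level below the corner frequency and falling off with a free exponent above it. The parameterization below follows the abstract's description and is illustrative only:

```python
def source_spectrum(f, omega0, fc, p):
    """Generalized Brune source spectrum: long-period level omega0,
    corner frequency fc, and high-frequency falloff exponent p
    (spectrum ~ f**(-p) well above fc)."""
    return omega0 / (1.0 + (f / fc) ** p)
```

    Fitting (omega0, fc, p) to regional-phase spectra, and then correlating the fitted parameters with near-source geology, is the workflow the abstract outlines; the trade-off it mentions is between fc and p, which jointly control the roll-off.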

  1. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    Science.gov (United States)

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using the Cronbach's alpha.…

  2. Empirical comparison of web-based antimicrobial peptide prediction tools.

    Science.gov (United States)

    Gabere, Musa Nur; Noble, William Stafford

    2017-07-01

    Antimicrobial peptides (AMPs) are innate immune molecules that exhibit activities against a range of microbes, including bacteria, fungi, viruses and protozoa. Recent increases in microbial resistance against current drugs have led to a concomitant increase in the need for novel antimicrobial agents. Over the last decade, a number of AMP prediction tools have been designed and made freely available online. These AMP prediction tools show potential to discriminate AMPs from non-AMPs, but the relative quality of the predictions produced by the various tools is difficult to quantify. We compiled two sets of AMP and non-AMP peptides, separated into three categories (antimicrobial, antibacterial and bacteriocins). Using these benchmark data sets, we carried out a systematic evaluation of ten publicly available AMP prediction methods. Among the six general AMP prediction tools (ADAM, CAMPR3(RF), CAMPR3(SVM), MLAMP, DBAASP and MLAMP), we find that CAMPR3(RF) provides a statistically significant improvement in performance, as measured by the area under the receiver operating characteristic (ROC) curve, relative to the other five methods. Surprisingly, for antibacterial prediction, the original AntiBP method significantly outperforms its successor, AntiBP2, on one benchmark dataset. The two bacteriocin prediction tools, BAGEL3 and BACTIBASE, both provide very good performance, and BAGEL3 outperforms its predecessor, BACTIBASE, on the larger of the two benchmarks. gaberemu@ngha.med.sa or william-noble@uw.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
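    The ROC-based comparison described above can be reproduced with the rank (Mann-Whitney) identity: the AUC equals the probability that a randomly chosen AMP receives a higher prediction score than a randomly chosen non-AMP, with ties counting one half. A toy implementation, not the authors' evaluation code:

```python
def roc_auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney identity:
    fraction of (positive, negative) pairs the classifier ranks
    correctly, ties counted as 0.5."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(pos_scores) * len(neg_scores))
```

    With one score list per tool on a shared benchmark, this gives exactly the per-tool AUC values being compared.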

  3. Empirical wind retrieval model based on SAR spectrum measurements

    Science.gov (United States)

    Panfilova, Maria; Karaev, Vladimir; Balandina, Galina; Kanevsky, Mikhail; Portabella, Marcos; Stoffelen, Ad

    ambiguity from polarimetric SAR. A criterion based on the complex correlation coefficient between the VV and VH signals sign is applied to select the wind direction. An additional quality control on the wind speed value retrieved with the spectral method is applied. Here, we use the direction obtained with the spectral method and the backscattered signal for CMOD wind speed estimate. The algorithm described above may be refined by the use of numerous SAR data and wind measurements. In the present preliminary work the first results of SAR images combined with in situ data processing are presented. Our results are compared to the results obtained using previously developed models CMOD, C-2PO for VH polarization and statistical wind retrieval approaches [1]. Acknowledgments. This work is supported by the Russian Foundation of Basic Research (grants 13-05-00852-a). [1] M. Portabella, A. Stoffelen, J. A. Johannessen, Toward an optimal inversion method for synthetic aperture radar wind retrieval, Journal of geophysical research, V. 107, N C8, 2002

  4. Generalized Empirical Likelihood-Based Focused Information Criterion and Model Averaging

    Directory of Open Access Journals (Sweden)

    Naoya Sueishi

    2013-07-01

    This paper develops model selection and averaging methods for moment restriction models. We first propose a focused information criterion based on the generalized empirical likelihood estimator. We address the issue of selecting an optimal model, rather than a correct model, for estimating a specific parameter of interest. Then, this study investigates a generalized empirical likelihood-based model averaging estimator that minimizes the asymptotic mean squared error. A simulation study suggests that our averaging estimator can be a useful alternative to existing post-selection estimators.

  5. Implementing community-based provider participation in research: an empirical study

    Science.gov (United States)

    2012-01-01

    Background Since 2003, the United States National Institutes of Health (NIH) has sought to restructure the clinical research enterprise in the United States by promoting collaborative research partnerships between academically-based investigators and community-based physicians. By increasing community-based provider participation in research (CBPPR), the NIH seeks to advance the science of discovery by conducting research in clinical settings where most people get their care, and accelerate the translation of research results into everyday clinical practice. Although CBPPR is seen as a promising strategy for promoting the use of evidence-based clinical services in community practice settings, few empirical studies have examined the organizational factors that facilitate or hinder the implementation of CBPPR. The purpose of this study is to explore the organizational start-up and early implementation of CBPPR in community-based practice. Methods We used longitudinal, case study research methods and an organizational model of innovation implementation to theoretically guide our study. Our sample consisted of three community practice settings that recently joined the National Cancer Institute’s (NCI) Community Clinical Oncology Program (CCOP) in the United States. Data were gathered through site visits, telephone interviews, and archival documents from January 2008 to May 2011. Results The organizational model for innovation implementation was useful in identifying and investigating the organizational factors influencing start-up and early implementation of CBPPR in CCOP organizations. In general, the three CCOP organizations varied in the extent to which they achieved consistency in CBPPR over time and across physicians. All three CCOP organizations demonstrated mixed levels of organizational readiness for change. Hospital management support and resource availability were limited across CCOP organizations early on, although they improved in one CCOP organization

  6. Development of theoretical oxygen saturation calibration curve based on optical density ratio and optical simulation approach

    Science.gov (United States)

    Jumadi, Nur Anida; Beng, Gan Kok; Ali, Mohd Alauddin Mohd; Zahedi, Edmond; Morsin, Marlia

    2017-09-01

    The implementation of a surface-based Monte Carlo simulation technique for oxygen saturation (SaO2) calibration curve estimation is demonstrated in this paper. Generally, the calibration curve is estimated either empirically, using animals as experimental subjects, or is derived from mathematical equations. However, determining the calibration curve using animals is time consuming and requires expertise to conduct the experiment. Alternatively, optical simulation techniques have been used widely in the biomedical optics field due to their capability to exhibit real tissue behavior. The mathematical relationship between optical density (OD) and the optical density ratio (ODR) associated with SaO2 during systole and diastole is used as the basis for obtaining the theoretical calibration curve. The optical properties corresponding to systolic and diastolic behaviors were applied to the tissue model to mimic the optical properties of the tissues. Based on the absorbed ray flux at the detectors, the OD and ODR were successfully calculated. The simulated optical density ratios obtained at every 20% interval of SaO2 are presented, with a maximum error of 2.17% when compared with a previous numerical simulation technique (MC model). The findings reveal the potential of the proposed method to be used for extended calibration curve studies using other wavelength pairs.
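    The OD/ODR relationship described above can be sketched in a few lines, assuming a simple log-ratio optical density and a hypothetical linear ODR-to-SaO2 calibration. All flux values and calibration constants below are invented for illustration; a real curve would come from the simulated or measured flux itself:

```python
import math

def optical_density(i_incident, i_detected):
    """OD = log10(I_in / I_out): attenuation of light through tissue."""
    return math.log10(i_incident / i_detected)

def odr(red_sys, red_dia, ir_sys, ir_dia, i_in=1.0):
    """Optical density ratio from systolic/diastolic detected fluxes.

    The pulsatile (arterial) component is the OD difference between
    systole and diastole at each wavelength; ODR is their ratio.
    """
    d_od_red = optical_density(i_in, red_sys) - optical_density(i_in, red_dia)
    d_od_ir = optical_density(i_in, ir_sys) - optical_density(i_in, ir_dia)
    return d_od_red / d_od_ir

# Hypothetical calibration: SaO2 (%) as a linear function of ODR.
# The constants a and b are placeholders, not values from the study.
def sao2_from_odr(r, a=110.0, b=25.0):
    return a - b * r

r = odr(red_sys=0.80, red_dia=0.85, ir_sys=0.70, ir_dia=0.78)
print(round(sao2_from_odr(r), 1))  # a physiologically plausible SaO2
```

    In the study itself, the mapping from ODR to SaO2 is built by tabulating simulated ODR at known SaO2 values rather than assuming linearity.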

  7. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    Science.gov (United States)

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  8. Theoretical Bases of the Model of Interaction of the Government and Local Government Creation

    OpenAIRE

    Nikolay I. Churinov

    2015-01-01

    The article is devoted to the theoretical underpinnings of systems of interaction between bodies of government at different levels. The author investigates the historical basis of the subject through research on foreign and domestic scientific experience in the theory of state and law. Much attention is paid to the scientific aspect of the question. Through an empirical approach, interpretation of the theory of interaction between public authorities and local government, and also ...

  9. PROCESS-BASED LEARNING: TOWARDS THEORETICAL AND LECTURE-BASED COURSEWORK IN STUDIO STYLE

    Directory of Open Access Journals (Sweden)

    Hatem Ezzat Nabih

    2010-07-01

    This article presents a process-based learning approach to design education where theoretical coursework is taught in studio style. Lecture-based coursework is sometimes regarded as lacking in challenge and as broadening the gap between theory and practice. Furthermore, lecture-based curricula tend to be detached from the studio and prevent students from applying their theoretically gained knowledge. Following the belief that student motivation is increased by establishing a higher level of autonomy in the learning process, I argue for a design education that links theory with applied design work within the studio setting. By synthesizing principles of Constructivist Learning and Problem-Based Learning (PBL), students are given greater autonomy by being actively involved in their education. Accordingly, I argue for a studio setting that incorporates learning in studio style, presenting three design applications that involve students in investigation and experimentation in order to self-experience the design process.

  10. Understanding the Organizational Nature of Student Persistence: Empirically-based Recommendations for Practice.

    Science.gov (United States)

    Berger, Joseph B.

    2002-01-01

    Builds on the assumption that colleges and universities are organizations and subsequently that the organizational perspective provides important insights for improving retention on campuses. A review of existing organizational studies of undergraduate persistence serves as the basis for ten empirically-based recommendations for practice that are…

  11. Distribution of longshore sediment transport along the Indian coast based on empirical model

    Digital Repository Service at National Institute of Oceanography (India)

    Chandramohan, P.; Nayak, B.U.

    An empirical sediment transport model has been developed based on the longshore energy flux equation. The study indicates that the annual gross sediment transport rate is high (1.5 × 10^6 to 2.0 × 10^6 cubic meters) along the coasts...

  12. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    Science.gov (United States)

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L−1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and ...

  13. Theoretical comparison between solar combisystems based on bikini tanks and tank-in-tank solar combisystems

    DEFF Research Database (Denmark)

    Yazdanshenas, Eshagh; Furbo, Simon; Bales, Chris

    2008-01-01

    Theoretical investigations have shown that solar combisystems based on bikini tanks for low energy houses perform better than solar domestic hot water systems based on mantle tanks. Tank-in-tank solar combisystems are also attractive from a thermal performance point of view. In this paper, theoretical comparisons between solar combisystems based on bikini tanks and tank-in-tank solar combisystems are presented.

  14. Theoretical Investigations of Plasma-Based Accelerators and Other Advanced Accelerator Concepts

    International Nuclear Information System (INIS)

    Shuets, G.

    2004-01-01

    The focus of the work was on the development of plasma-based and structure-based accelerating concepts, including laser-plasma, plasma channel, and microwave driven plasma accelerators.

  15. Theoretical and methodological bases of the cooperation and the cooperative

    Directory of Open Access Journals (Sweden)

    Claudio Alberto Rivera Rodríguez

    2013-12-01

    The present work aims to examine the theoretical and methodological foundations of the rise of cooperatives. The article studies the logical antecedents of cooperativism and the premises established by the Industrial Revolution for the emergence of the first modern cooperative, "The Pioneers of Rochdale", the inflection point of cooperativism, and then analyzes the contributions of the thinking of the time that sustains this process.

  16. Theoretical analysis of noncanonical base pairing interactions in ...

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    Noncanonical base pairs in RNA have strong structural and functional implications but are currently not considered ..... Full optimizations of the systems were also carried out using ... of the individual bases in the base pair through the equation.

  17. Environmental efficiency and labour productivity. Trade-off or joint dynamics? A theoretical investigation and empirical evidence from Italy using NAMEA

    International Nuclear Information System (INIS)

    Mazzanti, Massimiliano; Zoboli, Roberto

    2009-01-01

    In this paper we test an adapted EKC hypothesis to verify the relationship between 'environmental efficiency' (namely emissions per unit of value added) and labour productivity (value added per employee). We exploit NAMEA data on Italy for 29 sector branches and 6 categories of air emissions for the period 1991-2001. We employ data on capital stock and trade openness to test the robustness of our results. On the basis of the theoretical and empirical analyses focusing on innovation, firm performances and environmental externalities, we would expect a positive correlation between environmental efficiency and labour productivity - a negative correlation between the emissions intensity of value added and labour productivity - which departs from the conventional mainstream view. The hypothesis tested is a critical one within the longstanding debate on the potential trade-off or complementarity between environmental preservation and economic performance, which is strictly associated with the role of technological innovation. We find that for most air emission categories there is a positive relationship between labour productivity and environmental efficiency. Labour productivity dynamics, then, seem to be complementary to a decreasing emissions intensity in the production process. Taking a disaggregate sector perspective, we show that the macro-aggregate evidence is driven by sector dynamics in a non-homogenous way across pollutants. Services tend always to show a 'complementary' relationship, while industry seems to be associated with inverted U-shape dynamics for greenhouse gases and nitrogen oxides. This is in line with our expectations. In any case, EKC shapes appear to drive such productivity links towards complementarity. The extent to which this evidence derives from endogenous market forces, industrial and structural change, and policy effects is discussed by taking an evolutionary perspective to innovation and by referring to impure public goods arguments

  18. Empirical evidence of the game-based learning advantages for online students persistence

    OpenAIRE

    A. Imbellone; G. Marinensi; C.M. Medaglia

    2015-01-01

    The paper presents the empirical results obtained from a study conducted on a game-based online course that took place in 2014 with 47 participants. The study evidenced the benefits of learning game mechanics for learners' willingness to continue the course. Taking interest in the subject of the course as a fundamental condition for student persistence, it is shown how persistence can be significantly enhanced by the presence of both ludic and narrative game-based elements.

  19. Virtue-based Approaches to Professional Ethics: a Plea for More Rigorous Use of Empirical Science

    Directory of Open Access Journals (Sweden)

    Georg Spielthenner

    2017-08-01

    Until recently, the method of professional ethics has been largely principle-based. But the failure of this approach to take sufficient account of the character of professionals has led to a revival of virtue ethics. The kind of professional virtue ethics that I am concerned with in this paper is teleological in that it relates the virtues of a profession to the ends of that profession. My aim is to show how empirical research can (in addition to philosophical inquiry) be used to develop virtue-based accounts of professional ethics, and that such empirically well-informed approaches are more convincing than traditional kinds of professional virtue ethics. The paper is divided into four sections. In the first, I outline the structure of a teleological approach to virtue ethics. In Section 2, I show that empirical research can play an essential role in professional ethics by emphasizing the difference between conceptual and empirical matters. Section 3 demonstrates the relevance of virtues in professional life; and the last section is concerned with some meta-ethical issues that are raised by a teleological account of professional virtues.

  20. Theoretic base of Edge Local Mode triggering by vertical displacements

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z. T. [Southwestern Institute of Physics, Chengdu 610041 (China); College of Physics Science and Technology, Sichuan University, Chengdu 610065 (China); He, Z. X.; Wang, Z. H. [Southwestern Institute of Physics, Chengdu 610041 (China); Wu, N.; Tang, C. J. [College of Physics Science and Technology, Sichuan University, Chengdu 610065 (China)

    2015-05-15

    Vertical instability is studied with an R-dependent displacement. For Solovev's configuration, the stability boundary of the vertical instability is calculated. The pressure gradient is a destabilizing factor, which is contrary to Rebhan's result. The equilibrium parallel current density, j_//, at the plasma boundary is a drive of the vertical instability similar to peeling-ballooning modes; however, the vertical instability cannot be stabilized by the magnetic shear, which tends towards infinity near the separatrix. The induced current observed in the Edge Local Mode (ELM) triggering experiment by vertical modulation is derived. The theory provides some theoretical explanation for the mitigation of type-I ELMs on ASDEX Upgrade. The principle could also be used for ITER.

  1. Optimization of Investment Planning Based on Game-Theoretic Approach

    Directory of Open Access Journals (Sweden)

    Elena Vladimirovna Butsenko

    2018-03-01

    The game-theoretic approach has a vast potential in solving economic problems. On the other hand, the theory of games itself can be enriched by studies of real decision-making problems. Hence, this study is aimed at developing and testing a game-theoretic technique to optimize the management of investment planning. This technique makes it possible to forecast the results and manage the processes of investment planning. The proposed method of optimizing the management of investment planning allows choosing the best development strategy of an enterprise. The technique uses the "game with nature" model, with the Wald criterion, the maximax criterion and the Hurwicz criterion as decision criteria. The article presents a new algorithm for constructing the proposed econometric method to optimize investment project management. This algorithm combines the methods of matrix games. Furthermore, I show the implementation of this technique in a block diagram. The algorithm includes the formation of initial data and the elements of the payment matrix, as well as the determination of maximin, maximax, compromise and optimal management strategies. The methodology is tested on the example of the passenger transportation enterprise of the Sverdlovsk Railway in Ekaterinburg. The application of the proposed methodology and the corresponding algorithm yielded an optimal price strategy for transporting passengers in one direction of traffic. This price strategy contributes to an increase in the company's income with minimal risk from the launch of this direction. The obtained results and conclusions show the effectiveness of using the developed methodology for optimizing the management of investment processes in the enterprise. The results of the research can be used as a basis for the development of an appropriate tool and applied by any economic entity in its investment activities.
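    The decision criteria named in this record can be evaluated directly on a payoff matrix. The sketch below is a generic "game with nature" example with an invented payoff matrix, not the Sverdlovsk Railway data:

```python
# Decision criteria for a "game with nature": rows are strategies,
# columns are states of nature; entries are payoffs (illustrative numbers).

def wald(payoffs):
    """Maximin (pessimistic): pick the strategy with the best worst-case payoff."""
    return max(range(len(payoffs)), key=lambda i: min(payoffs[i]))

def maximax(payoffs):
    """Optimistic: pick the strategy with the best best-case payoff."""
    return max(range(len(payoffs)), key=lambda i: max(payoffs[i]))

def hurwicz(payoffs, alpha=0.5):
    """Weighted blend of best and worst case; alpha is the optimism level."""
    score = lambda row: alpha * max(row) + (1 - alpha) * min(row)
    return max(range(len(payoffs)), key=lambda i: score(payoffs[i]))

# Hypothetical payoff matrix: income under three demand scenarios.
payoffs = [
    [30, 40, 50],   # conservative pricing
    [10, 60, 90],   # aggressive pricing
    [25, 50, 70],   # compromise pricing
]
print(wald(payoffs), maximax(payoffs), hurwicz(payoffs, alpha=0.4))  # 0 1 2
```

    Note how the three criteria can recommend three different strategies from the same matrix; the "compromise" strategy selected by Hurwicz mirrors the compromise strategy mentioned in the abstract.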

  2. Adoption of energy-efficiency measures in SMEs—An empirical analysis based on energy audit data from Germany

    International Nuclear Information System (INIS)

    Fleiter, Tobias; Schleich, Joachim; Ravivanpong, Ployplearn

    2012-01-01

    This paper empirically investigates factors driving the adoption of energy-efficiency measures by small and medium-sized enterprises (SMEs). Our analyses are based on cross-sectional data from SMEs which participated in a German energy audit program between 2008 and 2010. In general, our findings appear robust to alternative model specifications and are consistent with the theoretical and still scarce empirical literature on barriers to energy-efficiency in SMEs. More specifically, high investment costs, which are captured by subjective and objective proxies, appear to impede the adoption of energy-efficiency measures, even if these measures are deemed profitable. Similarly, we find that lack of capital slows the adoption of energy-efficiency measures, primarily for larger investments. Hence, investment subsidies or soft loans (for larger investments) may help accelerate the diffusion of energy-efficiency measures in SMEs. Other barriers were not found to be statistically significant. Finally, our findings provide evidence that the quality of energy audits affects the adoption of energy-efficiency measures. Hence, effective regulation should involve quality standards for energy audits, templates for audit reports or mandatory monitoring of energy audits. - Highlights: ► We empirically analyze barriers to the adoption of energy-efficiency measures in SMEs. ► We focus on firms participating in the German energy audit program for SMEs. ► The program overcomes information related barriers. ► High investment costs still impede the adoption even for profitable measures. ► Low audit quality also impedes the adoption of profitable measures.

  3. Effectiveness of a theoretically-based judgment and decision making intervention for adolescents.

    Science.gov (United States)

    Knight, Danica K; Dansereau, Donald F; Becan, Jennifer E; Rowan, Grace A; Flynn, Patrick M

    2015-05-01

    Although adolescents demonstrate capacity for rational decision making, their tendency to be impulsive, place emphasis on peers, and ignore potential consequences of their actions often translates into higher risk-taking including drug use, illegal activity, and physical harm. Problems with judgment and decision making contribute to risky behavior and are core issues for youth in treatment. Based on theoretical and empirical advances in cognitive science, the Treatment Readiness and Induction Program (TRIP) represents a curriculum-based decision making intervention that can be easily inserted into a variety of content-oriented modalities as well as administered as a separate therapeutic course. The current study examined the effectiveness of TRIP for promoting better judgment among 519 adolescents (37 % female; primarily Hispanic and Caucasian) in residential substance abuse treatment. Change over time in decision making and premeditation (i.e., thinking before acting) was compared among youth receiving standard operating practice (n = 281) versus those receiving standard practice plus TRIP (n = 238). Change in TRIP-specific content knowledge was examined among clients receiving TRIP. Premeditation improved among youth in both groups; TRIP clients showed greater improvement in decision making. TRIP clients also reported significant increases over time in self-awareness, positive-focused thinking (e.g., positive self-talk, goal setting), and recognition of the negative effects of drug use. While both genders showed significant improvement, males showed greater gains in metacognitive strategies (i.e., awareness of one's own cognitive process) and recognition of the negative effects of drug use. These results suggest that efforts to teach core thinking strategies and apply/practice them through independent intervention modules may benefit adolescents when used in conjunction with content-based programs designed to change problematic behaviors.

  4. Value-based management: Theoretical base, shareholders' request and the concept

    Directory of Open Access Journals (Sweden)

    Kaličanin Đorđe M.

    2005-01-01

    The pressure of financial markets, which is a consequence of the shareholder revolution, directly affects the solution to the following dilemma: is the mission of corporations to maximize shareholders' wealth or to satisfy the interests of other stakeholders? The domination of shareholder theory has caused the appearance of the value-based management concept. Value-based management is a relevant concept and a process of management in the modern environment. The importance of shareholder value requires transformation of the traditional enterprise into a value-driven enterprise. This paper addresses the theoretical base, the shareholder revolution and the main characteristics of value-based management.

  5. Comparison of ITER performance predicted by semi-empirical and theory-based transport models

    International Nuclear Information System (INIS)

    Mukhovatov, V.; Shimomura, Y.; Polevoi, A.

    2003-01-01

    The values of Q = (fusion power)/(auxiliary heating power) predicted for ITER by three different methods, i.e., a transport model based on empirical confinement scaling, a dimensionless scaling technique, and theory-based transport models, are compared. The energy confinement time given by the ITERH-98(y,2) scaling for an inductive scenario with a plasma current of 15 MA and a plasma density 15% below the Greenwald value is 3.6 s, with one technical standard deviation of ±14%. These data translate into a Q interval of [7-13] at the auxiliary heating power P_aux = 40 MW and [7-28] at the minimum heating power satisfying a good-confinement ELMy H-mode. Predictions of dimensionless scalings and theory-based transport models such as Weiland, MMM and IFS/PPPL overlap with the empirical scaling predictions within the margins of uncertainty. (author)
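    As a rough illustration of the empirical-scaling route mentioned above, the commonly quoted form of the ITERH-98(y,2) (IPB98(y,2)) thermal confinement scaling can be evaluated directly. The coefficient and exponents below are the widely published values reproduced from memory, and the plasma parameters are illustrative ITER-like numbers, not the study's inputs:

```python
# IPB98(y,2) thermal energy confinement scaling, as commonly quoted in the
# literature (treat coefficient and exponents as an assumption here).
# Units: I [MA], B [T], n19 [1e19 m^-3], P = loss power [MW], R [m];
# eps = a/R (inverse aspect ratio); kappa = elongation; M = ion mass [amu].

def tau_e_ipb98y2(I, B, n19, P, R, eps, kappa, M):
    return (0.0562 * I**0.93 * B**0.15 * n19**0.41 * P**-0.69
            * R**1.97 * eps**0.58 * kappa**0.78 * M**0.19)

# Illustrative ITER-like inductive-scenario parameters (not official values):
# 15 MA, 5.3 T, density ~15% below Greenwald, ~87 MW loss power.
tau = tau_e_ipb98y2(I=15.0, B=5.3, n19=10.1, P=87.0, R=6.2,
                    eps=2.0 / 6.2, kappa=1.7, M=2.5)
print(round(tau, 2))  # roughly 3.6 s, consistent with the value quoted above
```

    The ±14% technical standard deviation quoted in the abstract would then propagate into the Q interval via a transport simulation, not via the scaling alone.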

  6. THE THEORETICAL ASTROPHYSICAL OBSERVATORY: CLOUD-BASED MOCK GALAXY CATALOGS

    Energy Technology Data Exchange (ETDEWEB)

    Bernyk, Maksym; Croton, Darren J.; Tonini, Chiara; Hodkinson, Luke; Hassan, Amr H.; Garel, Thibault; Duffy, Alan R.; Mutch, Simon J.; Poole, Gregory B.; Hegarty, Sarah [Centre for Astrophysics and Supercomputing, Swinburne University of Technology, P.O. Box 218, Hawthorn, Victoria, 3122 (Australia)

    2016-03-15

    We introduce the Theoretical Astrophysical Observatory (TAO), an online virtual laboratory that houses mock observations of galaxy survey data. Such mocks have become an integral part of the modern analysis pipeline. However, building them requires expert knowledge of galaxy modeling and simulation techniques, significant investment in software development, and access to high performance computing. These requirements make it difficult for a small research team or individual to quickly build a mock catalog suited to their needs. To address this TAO offers access to multiple cosmological simulations and semi-analytic galaxy formation models from an intuitive and clean web interface. Results can be funnelled through science modules and sent to a dedicated supercomputer for further processing and manipulation. These modules include the ability to (1) construct custom observer light cones from the simulation data cubes; (2) generate the stellar emission from star formation histories, apply dust extinction, and compute absolute and/or apparent magnitudes; and (3) produce mock images of the sky. All of TAO’s features can be accessed without any programming requirements. The modular nature of TAO opens it up for further expansion in the future.


  8. Dashboard auditing of ABC (Activity-Based Costing). Theoretical approaches

    OpenAIRE

    Căpuşneanu, Sorinel/I

    2009-01-01

    This article aims to define dashboard auditing according to the specifics of the Activity-Based Costing (ABC) method. It describes the main objectives of dashboard auditing, the criteria that a dashboard auditor should meet, and the step-by-step stages of the entire dashboard auditing process for an enterprise in the steel industry according to the ABC method.

  9. Tools for Empirical and Operational Analysis of Mobile Offloading in Loop-Based Applications

    Directory of Open Access Journals (Sweden)

    Alexandru-Corneliu OLTEANU

    2013-01-01

    Offloading for mobile devices is an increasingly popular research topic, matching the popularity mobile devices have in the general population. Studying mobile offloading is challenging because of device and application heterogeneity. However, we believe that focusing on a specific type of application can bring advances in offloading for mobile devices, while still keeping a wide range of applicability. In this paper we focus on loop-based applications, in which most of the functionality is given by iterating an execution loop. We model the main loop of the application with a graph that consists of a cycle and propose an operational analysis to study offloading on this model. We also propose a testbed based on a real-world application to empirically evaluate offloading. We conduct performance evaluation using both tools and compare the analytical and empirical results.
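    The operational flavor of such an analysis can be illustrated with a back-of-envelope per-iteration offloading model. This is a generic sketch, not the paper's model, and all parameter values are hypothetical:

```python
# Per-iteration offloading model for a loop-based application: each iteration
# performs C cycles of compute and exchanges D bytes of state per round trip.
# Offloading pays off when remote compute plus transfer beats local compute.

def iteration_time_local(C, speed_local):
    """Seconds per iteration executed on the device (speed in cycles/s)."""
    return C / speed_local

def iteration_time_offloaded(C, D, speed_remote, bandwidth, rtt):
    """Seconds per iteration executed remotely: compute + transfer + latency."""
    return C / speed_remote + D / bandwidth + rtt

def should_offload(C, D, speed_local, speed_remote, bandwidth, rtt):
    return (iteration_time_offloaded(C, D, speed_remote, bandwidth, rtt)
            < iteration_time_local(C, speed_local))

# A heavy iteration with little state: offloading wins.
print(should_offload(C=2e9, D=1e4, speed_local=1e9,
                     speed_remote=1e10, bandwidth=1e6, rtt=0.05))  # True
# A light iteration with lots of state: stay local.
print(should_offload(C=1e7, D=1e6, speed_local=1e9,
                     speed_remote=1e10, bandwidth=1e6, rtt=0.05))  # False
```

    An operational analysis of the whole loop would then multiply these per-iteration times by visit counts along the cycle rather than deciding each iteration in isolation.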

  10. Empirically Based Psychosocial Therapies for Schizophrenia: The Disconnection between Science and Practice

    Directory of Open Access Journals (Sweden)

    Glenn D. Shean

    2013-01-01

    Empirically validated psychosocial therapies for individuals diagnosed with schizophrenia were described in the report of the Schizophrenia Patient Outcomes Research Team (PORT, 2009). The PORT team identified eight psychosocial treatments: assertive community treatment, supported employment, cognitive behavioral therapy, family-based services, token economy, skills training, psychosocial interventions for alcohol and substance use disorders, and psychosocial interventions for weight management. PORT listings of empirically validated psychosocial therapies provide a useful template for the design of effective recovery-oriented mental health care systems. Unfortunately, surveys indicate that PORT listings have not been implemented in clinical settings. Obstacles to the implementation of PORT psychosocial therapy listings and suggestions for the changes needed to foster implementation are discussed. Limitations of PORT therapy listings based on therapy outcome efficacy studies are discussed, and cross-cultural and course-and-outcome studies of correlates of recovery are summarized.

  11. Measuring microscopic evolution processes of complex networks based on empirical data

    International Nuclear Information System (INIS)

    Chi, Liping

    2015-01-01

    Aiming at understanding the microscopic mechanisms of complex systems in the real world, we perform measurements that characterize the evolution properties of two empirical data sets. In the Autonomous Systems Internet data, the network size keeps growing although the system suffers a high rate of node deletion (r = 0.4) and link deletion (q = 0.81). However, the average degree stays almost unchanged over the whole time range. At each time step the external links attached to a new node number about c = 1.1, and the internal links added between existing nodes approximately m = 8. The Scientific Collaboration data are a cumulative result of all the authors from 1893 up to the considered year. There is no deletion of nodes or links, r = q = 0. The external and internal links at each time step are c = 1.04 and m = 0, correspondingly. The exponents γ_data of the degree distributions p(k) ∼ k^(-γ) of these two empirical datasets are in good agreement with those obtained theoretically, γ_theory. The results indicate that these evolution quantities may provide an insight into capturing the microscopic dynamical processes that govern the network topology. (paper)
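    The exponent γ_data of a degree distribution p(k) ∼ k^(-γ) is commonly estimated from a degree sequence with the continuous maximum-likelihood estimator γ̂ = 1 + n / Σ ln(k_i / k_min) (the Clauset-Shalizi-Newman form). The sketch below checks the estimator on synthetic samples with a known exponent, not on the paper's data:

```python
import math
import random

def gamma_mle(degrees, k_min=1.0):
    """Continuous MLE of the exponent gamma in p(k) ~ k^-gamma
    (Clauset/Shalizi/Newman form); only degrees >= k_min are used."""
    ks = [k for k in degrees if k >= k_min]
    return 1.0 + len(ks) / sum(math.log(k / k_min) for k in ks)

# Sanity check on synthetic power-law samples with known exponent 2.5:
# inverse-transform sampling, k = k_min * (1 - u)^(-1/(gamma - 1)).
random.seed(1)
true_gamma = 2.5
sample = [1.0 * (1 - random.random()) ** (-1 / (true_gamma - 1))
          for _ in range(200000)]
print(round(gamma_mle(sample), 2))  # close to 2.5
```

    For real degree sequences (which are discrete and have a finite k_min) the discrete MLE or a k_min selection procedure would be preferable; this continuous form is the usual first approximation.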

  12. Land of Addicts? An Empirical Investigation of Habit-Based Asset Pricing Behavior

    OpenAIRE

    Xiaohong Chen; Sydney C. Ludvigson

    2004-01-01

    This paper studies the ability of a general class of habit-based asset pricing models to match the conditional moment restrictions implied by asset pricing theory. We treat the functional form of the habit as unknown and estimate it along with the rest of the model's finite-dimensional parameters. Using quarterly data on consumption growth, asset returns and instruments, our empirical results indicate that the estimated habit function is nonlinear, the habit formation is better described...

  13. An Explanation to Individual Knowledge and Behavior Based on Empirical Substrates

    OpenAIRE

    Zhao, Liang; Zhu, Xian Chen

    2008-01-01

    Using recent findings from modern empirical disciplines and mainly building on F. A. Hayek's thoughts, the paper gives a definition of knowledge in accord with the Austrian School's tradition, and based on that definition it sums up three behavior assumptions and a framework for explaining individual behavior, and expounds ideas on hierarchical knowledge and its change in real situations. In this way, the paper argues that the Austrian School can be greatly advanced with the help of modern emp...

  14. Bacterial clonal diagnostics as a tool for evidence-based empiric antibiotic selection.

    Directory of Open Access Journals (Sweden)

    Veronika Tchesnokova

    Full Text Available Despite the known clonal distribution of antibiotic resistance in many bacteria, empiric (pre-culture) antibiotic selection still relies heavily on species-level cumulative antibiograms, resulting in overuse of broad-spectrum agents and excessive antibiotic/pathogen mismatch. Urinary tract infections (UTIs), which account for a large share of antibiotic use, are caused predominantly by Escherichia coli, a highly clonal pathogen. In an observational clinical cohort study of urgent care patients with suspected UTI, we assessed the potential for E. coli clonal-level antibiograms to improve empiric antibiotic selection. A novel PCR-based clonotyping assay was applied to fresh urine samples to rapidly detect E. coli and the urine strain's clonotype. Based on a database of clonotype-specific antibiograms, the acceptability of various antibiotics for empiric therapy was inferred using a 20%, 10%, and 30% allowed resistance threshold. The test's performance characteristics and possible effects on prescribing were assessed. The rapid test identified E. coli clonotypes directly in patients' urine within 25-35 minutes, with high specificity and sensitivity compared to culture. Antibiotic selection based on a clonotype-specific antibiogram could reduce the relative likelihood of antibiotic/pathogen mismatch by ≥ 60%. Compared to observed prescribing patterns, clonal diagnostics-guided antibiotic selection could safely double the use of trimethoprim/sulfamethoxazole and minimize fluoroquinolone use. In summary, a rapid clonotyping test showed promise for improving empiric antibiotic prescribing for E. coli UTI, including reversing preferential use of fluoroquinolones over trimethoprim/sulfamethoxazole. The clonal diagnostics approach merges epidemiologic surveillance, antimicrobial stewardship, and molecular diagnostics to bring evidence-based medicine directly to the point of care.

  15. Autonomous e-coaching in the wild: Empirical validation of a model-based reasoning system

    OpenAIRE

    Kamphorst, B.A.; Klein, M.C.A.; van Wissen, A.

    2014-01-01

    Autonomous e-coaching systems have the potential to improve people's health behaviors on a large scale. The intelligent behavior change support system eMate exploits a model of the human agent to support individuals in adopting a healthy lifestyle. The system attempts to identify the causes of a person's non-adherence by reasoning over a computational model (COMBI) that is based on established psychological theories of behavior change. The present work presents an extensive, monthlong empiric...

  16. Bacterial clonal diagnostics as a tool for evidence-based empiric antibiotic selection.

    Science.gov (United States)

    Tchesnokova, Veronika; Avagyan, Hovhannes; Rechkina, Elena; Chan, Diana; Muradova, Mariya; Haile, Helen Ghirmai; Radey, Matthew; Weissman, Scott; Riddell, Kim; Scholes, Delia; Johnson, James R; Sokurenko, Evgeni V

    2017-01-01

    Despite the known clonal distribution of antibiotic resistance in many bacteria, empiric (pre-culture) antibiotic selection still relies heavily on species-level cumulative antibiograms, resulting in overuse of broad-spectrum agents and excessive antibiotic/pathogen mismatch. Urinary tract infections (UTIs), which account for a large share of antibiotic use, are caused predominantly by Escherichia coli, a highly clonal pathogen. In an observational clinical cohort study of urgent care patients with suspected UTI, we assessed the potential for E. coli clonal-level antibiograms to improve empiric antibiotic selection. A novel PCR-based clonotyping assay was applied to fresh urine samples to rapidly detect E. coli and the urine strain's clonotype. Based on a database of clonotype-specific antibiograms, the acceptability of various antibiotics for empiric therapy was inferred using a 20%, 10%, and 30% allowed resistance threshold. The test's performance characteristics and possible effects on prescribing were assessed. The rapid test identified E. coli clonotypes directly in patients' urine within 25-35 minutes, with high specificity and sensitivity compared to culture. Antibiotic selection based on a clonotype-specific antibiogram could reduce the relative likelihood of antibiotic/pathogen mismatch by ≥ 60%. Compared to observed prescribing patterns, clonal diagnostics-guided antibiotic selection could safely double the use of trimethoprim/sulfamethoxazole and minimize fluoroquinolone use. In summary, a rapid clonotyping test showed promise for improving empiric antibiotic prescribing for E. coli UTI, including reversing preferential use of fluoroquinolones over trimethoprim/sulfamethoxazole. The clonal diagnostics approach merges epidemiologic surveillance, antimicrobial stewardship, and molecular diagnostics to bring evidence-based medicine directly to the point of care.
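    The threshold rule described in the abstract (an antibiotic is acceptable for empiric therapy when the clonotype-specific resistance rate does not exceed an allowed threshold) can be sketched as follows; the drug names and resistance fractions below are hypothetical placeholders, not values from the study's antibiogram database:

    ```python
    def acceptable_empiric_agents(antibiogram, threshold=0.20):
        """Return the antibiotics whose clonotype-specific resistance rate is
        at or below the allowed-resistance threshold (20% by default; the
        study also evaluated 10% and 30%)."""
        return sorted(drug for drug, resistance in antibiogram.items()
                      if resistance <= threshold)

    # Hypothetical clonotype-specific antibiogram: resistance fractions are
    # made up for illustration only.
    clonotype_antibiogram = {
        "trimethoprim/sulfamethoxazole": 0.12,
        "ciprofloxacin": 0.85,
        "nitrofurantoin": 0.05,
        "cephalexin": 0.25,
    }
    strict = acceptable_empiric_agents(clonotype_antibiogram)          # 20%
    relaxed = acceptable_empiric_agents(clonotype_antibiogram, 0.30)   # 30%
    ```

    With these illustrative numbers, relaxing the threshold from 20% to 30% adds one more agent to the acceptable list, which mirrors how the choice of threshold trades off spectrum against mismatch risk.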

  17. What Do We Know About Base Erosion and Profit Shifting? A Review of the Empirical Literature

    OpenAIRE

    Dhammika Dharmapala

    2014-01-01

    The issue of tax-motivated income shifting within multinational firms has attracted increasing global attention in recent years. It is of central importance to many current policy debates, including those related to recent initiatives by the OECD on base erosion and profit shifting (BEPS) and to proposals for US tax reform in a territorial direction. This paper provides a survey of the empirical literature on tax-motivated income-shifting within multinational firms. Its emphasis is on clarify...

  18. Theoretical study of GC+/GC base pair derivatives

    International Nuclear Information System (INIS)

    Meng Fancui; Wang Huanjie; Xu Weiren; Liu Chengbu

    2005-01-01

    The geometries of R (R = CH3, CH3O, F, NO2) substituted GC base pair derivatives and their cations have been optimized at the B3LYP/6-31G* level, and the substituent effects on the neutral and cationic geometric structures and energies have been discussed. The inner reorganization energies of the various base pair derivatives and the native GC base pair have been calculated to discuss the substituent effects on the reorganization energy. NBO (natural bond orbital) analysis has been carried out on both the neutral and the cationic systems to investigate the differences in the charge distributions and the electronic structures. The outcomes indicate that 8-CH3O-G:C has the greatest reorganization energy and 8-NO2-G:C the least, while the other substituted base pairs have a reorganization energy close to that of G:C. After ionization the positive charge is mostly localized on the guanine moiety, reaching as high as 0.95e. The bond distances of N1-N3' and N2-O2' in the cationic base pair derivatives are shortened and that of O6-N4' is elongated compared with the corresponding bond distances of the neutral GC base pair derivatives.

  19. Audiovisual Rehabilitation in Hemianopia: A Model-Based Theoretical Investigation.

    Science.gov (United States)

    Magosso, Elisa; Cuppini, Cristiano; Bertini, Caterina

    2017-01-01

    stimuli into short-latency saccades, possibly moving the stimuli into visual detection regions. The retina-SC-extrastriate circuit is related to restitutive effects: visual stimuli can directly elicit visual detection with no need for eye movements. Model predictions and assumptions are critically discussed in view of existing behavioral and neurophysiological data, forecasting that other oculomotor compensatory mechanisms, beyond short-latency saccades, are likely involved, and stimulating future experimental and theoretical investigations.

  20. An empirically based model for knowledge management in health care organizations.

    Science.gov (United States)

    Sibbald, Shannon L; Wathen, C Nadine; Kothari, Anita

    2016-01-01

    Knowledge management (KM) encompasses strategies, processes, and practices that allow an organization to capture, share, store, access, and use knowledge. Ideal KM combines different sources of knowledge to support innovation and improve performance. Despite the importance of KM in health care organizations (HCOs), there has been very little empirical research to describe KM in this context. This study explores KM in HCOs, focusing on the status of current intraorganizational KM. The intention is to provide insight for future studies and model development for effective KM implementation in HCOs. A qualitative methods approach was used to create an empirically based model of KM in HCOs. Methods included (a) qualitative interviews (n = 24) with senior leadership to identify types of knowledge important in these roles plus current information-seeking behaviors/needs and (b) in-depth case study with leaders in new executive positions (n = 2). The data were collected from 10 HCOs. Our empirically based model for KM was assessed for face and content validity. The findings highlight the paucity of formal KM in our sample HCOs. Organizational culture, leadership, and resources are instrumental in supporting KM processes. An executive's knowledge needs are extensive, but knowledge assets are often limited or difficult to acquire as much of the available information is not in a usable format. We propose an empirically based model for KM to highlight the importance of context (internal and external), and knowledge seeking, synthesis, sharing, and organization. Participants who reviewed the model supported its basic components and processes, and potential for incorporating KM into organizational processes. Our results articulate ways to improve KM, increase organizational learning, and support evidence-informed decision-making. This research has implications for how to better integrate evidence and knowledge into organizations while considering context and the role of

  1. HIRS-AMTS satellite sounding system test - Theoretical and empirical vertical resolving power. [High resolution Infrared Radiation Sounder - Advanced Moisture and Temperature Sounder

    Science.gov (United States)

    Thompson, O. E.

    1982-01-01

    The present investigation is concerned with the vertical resolving power of satellite-borne temperature sounding instruments. Information is presented on the capabilities of the High Resolution Infrared Radiation Sounder (HIRS) and a proposed sounding instrument called the Advanced Moisture and Temperature Sounder (AMTS). Two quite different methods for assessing the vertical resolving power of satellite sounders are discussed. The first is the theoretical method of Conrath (1972), which was patterned after the work of Backus and Gilbert (1968). The Backus-Gilbert-Conrath (BGC) approach includes a formalism for deriving a retrieval algorithm for optimizing the vertical resolving power. However, a retrieval algorithm constructed in the BGC optimal fashion is not necessarily optimal as far as actual temperature retrievals are concerned. Thus, an independent criterion for vertical resolving power is discussed. The criterion is based on actual retrievals of signal structure in the temperature field.

  2. Complementary Theoretical Perspectives on Task-Based Classroom Realities

    Science.gov (United States)

    Jackson, Daniel O.; Burch, Alfred Rue

    2017-01-01

    Tasks are viewed as a principled foundation for classroom teaching, social interaction, and language development. This special issue sheds new light on how task-based classroom practices are supported by a diverse range of principles. This introduction describes current trends in classroom practice and pedagogic research in relation to task-based…

  3. Proto-ribosome: a theoretical approach based on RNA relics

    OpenAIRE

    Demongeot, Jacques

    2017-01-01

    We describe in this paper, based on already published articles, a contribution to the theory postulating the existence of a proto-ribosome, which could have appeared early at the origin of life and we discuss the interest of this notion in an evolutionary perspective, taking into account the existence of possible RNA relics of this proto-ribosome.

  4. Theoretical Bases of the Model of Interaction of the Government and Local Government Creation

    Directory of Open Access Journals (Sweden)

    Nikolay I. Churinov

    2015-09-01

    Full Text Available This article addresses the theoretical foundations of systems of interaction between bodies at different levels of government. The author examines the historical basis of the subject through foreign and domestic scholarship in the theory of state and law. Much attention is paid to the scientific aspect of the question. Through an empirical approach, an interpretation of the theory of interaction between public authorities and local government is given, together with the author's own evaluative opinion.

  5. Intrinsic fluorescence of protein in turbid media using empirical relation based on Monte Carlo lookup table

    Science.gov (United States)

    Einstein, Gnanatheepam; Udayakumar, Kanniyappan; Aruna, Prakasarao; Ganesan, Singaravelu

    2017-03-01

    Fluorescence of protein has been widely used in diagnostic oncology for characterizing cellular metabolism. However, the intensity of fluorescence emission is affected by the absorbers and scatterers in tissue, which may lead to error in estimating the exact protein content of tissue. Extraction of intrinsic fluorescence from measured fluorescence has been achieved by different methods; among them, the Monte Carlo-based method yields the highest accuracy. In this work, we have attempted to generate a lookup table for Monte Carlo simulation of fluorescence emission by protein. Furthermore, we fitted the generated lookup table using an empirical relation. The empirical relation between measured and intrinsic fluorescence is validated using tissue phantom experiments. The proposed relation can be used for estimating intrinsic fluorescence of protein for real-time diagnostic applications, thereby improving the clinical interpretation of fluorescence spectroscopic data.
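    The idea of fitting an empirical relation over a simulated lookup table and then inverting it to recover intrinsic fluorescence can be sketched as below. The linear form, the constant attenuation factor, and all numeric values are assumptions for illustration; the paper's actual Monte Carlo forward model and fitted relation are not given in the abstract.

    ```python
    from statistics import linear_regression

    # Mock lookup table: (intrinsic, measured) fluorescence pairs produced by
    # some forward model -- here a made-up constant attenuation standing in
    # for the Monte Carlo simulation used in the paper.
    intrinsic = [10.0, 20.0, 30.0, 40.0, 50.0]
    attenuation = 0.62                      # hypothetical attenuation factor
    measured = [attenuation * f for f in intrinsic]

    # Fit the empirical relation measured = slope * intrinsic + intercept ...
    slope, intercept = linear_regression(intrinsic, measured)

    # ... then invert it to recover intrinsic fluorescence from a measurement.
    def intrinsic_from_measured(f_meas):
        return (f_meas - intercept) / slope

    recovered = intrinsic_from_measured(0.62 * 25.0)
    ```

    In a real application the lookup table would come from Monte Carlo runs over tissue optical properties, and the fitted relation would likely be nonlinear; the inversion step stays the same in spirit.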

  6. Organizational change in its context. A theoretical and empirical study of the linkages between organizational change projects and their administrative, strategic and institutional environment

    NARCIS (Netherlands)

    Man, Huibert de

    1988-01-01

    This dissertation deals with the multi-level analysis of organizational change projects. Empirical examples are taken from the COB-SER participation experiments which took place in the Netherlands between 1977 and 1983.

  7. Empirical validation of characteristics of design-based learning in higher education

    NARCIS (Netherlands)

    Gómez Puente, S.M.; Eijck, van M.W.; Jochems, W.M.G.

    2013-01-01

    Design-based learning (DBL) is an educational approach in which students gather and process theoretical knowledge while working on the design of artifacts, systems, and innovative solutions in project settings. Whereas DBL has been employed in the practice of teaching science in secondary education,

  8. 6 essays about auctions: a theoretical and empirical analysis. Application to power markets; 6 Essais sur les encheres: approches theorique et empirique. Application aux marches de l'electricite

    Energy Technology Data Exchange (ETDEWEB)

    Lamy, L

    2007-06-15

    This thesis is devoted to a theoretical and empirical analysis of auction mechanisms. Motivated by allocation issues in network industries, in particular the liberalization of the electricity sector, it focuses on auctions with externalities (either allocative or informational) and on multi-object auctions. After an introduction which provides a survey of the use and analysis of auctions in power markets, six chapters make up this thesis. The first considers standard auctions in Milgrom-Weber's model with interdependent valuations when the seller cannot commit not to participate in the auction. The second and third chapters study the combinatorial auction mechanism proposed by Ausubel and Milgrom. The first of these two studies proposes a modification of this format with a final discount stage and clarifies the theoretical status of those formats, in particular the conditions under which truthful reporting is a dominant strategy. Motivated by the robustness issues of the generalizations of the Ausubel-Milgrom and the Vickrey combinatorial auctions to environments with allocative externalities between joint-purchasers, the second characterizes the buyer-submodularity condition in a general model with allocative identity-dependent externalities between purchasers. In a complete-information setup, the fourth chapter analyses the optimal design problem when the commitment abilities of the principal are reduced, namely she cannot commit to a simultaneous participation game. The fifth chapter is devoted to the structural analysis of the private value auction model for a single unit when the econometrician cannot observe bidders' identities. The asymmetric independent private value (IPV) model is identified. A multi-step kernel-based estimator is proposed and shown to be asymptotically optimal. Using auction data for the Anglo-French electricity Interconnector, the last chapter analyses multi-unit ascending auctions through reduced forms. (author)

  9. Empirical evidence of the game-based learning advantages for online students persistence

    Directory of Open Access Journals (Sweden)

    A. Imbellone

    2015-07-01

    Full Text Available The paper presents the empirical results obtained from a study conducted on a game-based online course that took place in 2014 with 47 participants. The study evidenced the benefits of learning-game mechanics for learners' willingness to continue the course. Taking interest in the subject of the course as a fundamental condition for student persistence within the course, it is shown how that interest can be significantly enhanced by the presence of both ludic and narrative game-based elements.

  10. Band structure calculation of GaSe-based nanostructures using empirical pseudopotential method

    International Nuclear Information System (INIS)

    Osadchy, A V; Obraztsova, E D; Volotovskiy, S G; Golovashkin, D L; Savin, V V

    2016-01-01

    In this paper we present the results of band structure computer simulation of GaSe-based nanostructures using the empirical pseudopotential method. Calculations were performed using specially developed software that allows simulations to be run on computing clusters. Application of this method significantly reduces the demands on computing resources compared to traditional approaches based on ab-initio techniques while yielding adequate, comparable results. The use of cluster computing makes it possible to obtain information for structures that require an explicit account of a significant number of atoms, such as quantum dots and quantum pillars. (paper)

  11. Experimental and Theoretical Study of Microturbine-Based BCHP System

    International Nuclear Information System (INIS)

    Fairchild, P.D.

    2001-01-01

    On-site and near-site distributed power generation (DG), as part of a Buildings Cooling, Heating and Power (BCHP) system, brings both electricity and waste heat from the DG sources closer to the end user's electric and thermal loads. Consequently, the waste heat can be used as input power for heat-activated air conditioners, chillers, and desiccant dehumidification systems; to generate steam for space heating; or to provide hot water for laundry, kitchen, cleaning services and/or rest rooms. By making use of what is normally waste heat, BCHP systems meet a building's electrical and thermal loads with a lower input of fossil fuel, yielding resource efficiencies of 40 to 70% or more. To ensure the success of BCHP systems, interactions of a DG system-such as a microturbine and thermal heat recovery units under steady-state modes of operation with various exhaust back pressures-must be considered. This article studies the performance and emissions of a 30-kW microturbine over a range of design and off-design conditions in steady-state operating mode with various back pressures. In parallel with the experimental part of the project, a BCHP mathematical model was developed describing basic thermodynamic and hydraulic processes in the system, heat and material balances, and the relationship of the balances to the system configuration. The model can determine the efficiency of energy conversion both for an individual microturbine unit and for the entire BCHP system for various system configurations and external loads. Based on actual data from a 30-kW microturbine, linear analysis was used to obtain an analytical relationship between the changes in the thermodynamic and hydraulic parameters of the system. The actual data show that, when the backpressure at the microturbine exhaust outlet is increased to the maximum of 7 in. WC (0.017 atm), the microturbine's useful power output decreases from 3.5% at a full power setting of 30 kW to 5.5% at a one-third power setting (10
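    In the spirit of the linear analysis mentioned in the abstract, the two reported operating points (3.5% power loss at the 30-kW setting and 5.5% at the 10-kW setting, both at maximum backpressure) can be interpolated linearly; values at intermediate settings are an assumption of linearity, not measured data from the study:

    ```python
    def power_loss_percent(setting_kw):
        """Linear interpolation of the reported useful-power loss at maximum
        exhaust backpressure (7 in. WC): 5.5% at the 10-kW setting, 3.5% at
        the 30-kW setting. Intermediate settings assume linearity."""
        lo_kw, lo_loss = 10.0, 5.5
        hi_kw, hi_loss = 30.0, 3.5
        t = (setting_kw - lo_kw) / (hi_kw - lo_kw)
        return lo_loss + t * (hi_loss - lo_loss)

    # A half-power setting (20 kW) would then sit midway between the
    # two reported losses.
    loss_at_20kw = power_loss_percent(20.0)
    ```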

  12. Imitative Modeling as a Theoretical Base for Instructing Language-Disordered Children

    Science.gov (United States)

    Courtright, John A.; Courtright, Illene C.

    1976-01-01

    A modification of A. Bandura's social learning theory (imitative modeling) was employed as a theoretical base for language instruction with eight language disordered children (5 to 10 years old). (Author/SBH)

  13. Awareness-based game-theoretic space resource management

    Science.gov (United States)

    Chen, Genshe; Chen, Huimin; Pham, Khanh; Blasch, Erik; Cruz, Jose B., Jr.

    2009-05-01

    Over recent decades, the space environment has become more complex, with a significant increase in space debris and a greater density of spacecraft, which poses great difficulties for efficient and reliable space operations. In this paper we present a Hierarchical Sensor Management (HSM) method for space operations by (a) accommodating awareness modeling and updating and (b) collaboratively searching for and tracking space objects. The basic approach is as follows. First, partition the relevant region of interest into district cells. Second, initialize and model the dynamics of each cell with awareness and object covariance according to prior information. Third, explicitly assign sensing resources to objects with user-specified requirements. Note that when an object responds intelligently to a sensing event, the sensor assigned to observe it may switch from time to time between a strong, active signal mode and a passive mode to maximize the total amount of information obtained over a multi-step time horizon while avoiding risk. Fourth, if all explicitly specified requirements are satisfied and there are still sensing resources available, we assign the additional sensing resources to objects without explicitly specified requirements via an information-based approach. Finally, sensor scheduling is applied to each sensor-object or sensor-cell pair according to the object type. We demonstrate our method on a realistic space resource management scenario using NASA's General Mission Analysis Tool (GMAT) for space object search and track with multiple space-borne observers.
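    The assignment of leftover sensing resources "via an information based approach" described above could be sketched as a greedy allocation that repeatedly serves the most uncertain cell. The cells, covariance values, and the uncertainty-halving measurement model below are hypothetical illustrations, not the paper's actual algorithm:

    ```python
    def assign_spare_sensors(cell_uncertainty, n_spare):
        """Greedy information-based allocation: give each spare sensor to the
        currently most uncertain cell, then halve that cell's uncertainty to
        model the information gained by one observation (a toy measurement
        model chosen for illustration)."""
        u = dict(cell_uncertainty)
        assignment = []
        for _ in range(n_spare):
            cell = max(u, key=u.get)
            assignment.append(cell)
            u[cell] /= 2.0  # observing a cell reduces its uncertainty
        return assignment

    # Hypothetical awareness covariances for four district cells.
    cells = {"A": 9.0, "B": 4.0, "C": 6.0, "D": 1.0}
    plan = assign_spare_sensors(cells, 3)
    ```

    Because each observation shrinks a cell's uncertainty, the greedy loop naturally revisits a very uncertain cell rather than spreading sensors uniformly.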

  14. Game-Theoretic Models for Usage-based Maintenance Contract

    Science.gov (United States)

    Husniah, H.; Wangsaputra, R.; Cakravastia, A.; Iskandar, B. P.

    2018-03-01

    A usage-based maintenance contract with coordination and non-coordination between two parties is studied in this paper. The contract is applied to a dump truck operated in a mining industry. The situation under study is that an agent offers a service contract to the owner of the truck after the warranty ends. This contract has a time limit but no usage limit; however, if the total usage per period exceeds the maximum usage allowed in the contract, the owner is charged an additional cost. In general, the agent (Original Equipment Manufacturer/OEM) provides full coverage of maintenance, which includes PM and CM, under the lease contract. The decision problem for the owner is to select the best option offered that fits its requirements, and the decision problem for the agent is to find the optimal maintenance effort for a given price of the service option offered. We first find the optimal decisions under a coordination scheme and then under a non-coordination scheme for both parties.
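    The excess-usage charge described above (a per-period penalty when total usage exceeds the contracted maximum) can be sketched as a simple cost function; all parameter values are illustrative assumptions, not numbers from the paper:

    ```python
    def owner_period_cost(usage, contract_price, max_usage, penalty_rate):
        """Owner's cost for one contract period: the agreed contract price
        plus a penalty charged on any usage beyond the contracted maximum."""
        excess = max(0.0, usage - max_usage)
        return contract_price + penalty_rate * excess

    # A dump truck contracted for up to 1000 hours/period at a price of 5000
    # per period, with a penalty of 12 per excess hour (hypothetical numbers).
    cost_within = owner_period_cost(900.0, 5000.0, 1000.0, 12.0)   # no penalty
    cost_over = owner_period_cost(1100.0, 5000.0, 1000.0, 12.0)    # 100 h over
    ```

    In the paper's game-theoretic setting, the owner would compare such cost profiles across the offered options, while the agent tunes price and maintenance effort.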

  15. Theoretical study of impurity effects in iron-based superconductors

    Science.gov (United States)

    Navarro Gastiasoro, Maria; Hirschfeld, Peter; Andersen, Brian

    2013-03-01

    Several open questions remain unanswered for the iron-based superconductors (FeSC), including the importance of electronic correlations and the symmetry of the superconducting order parameter. Motivated by recent STM experiments which show a fascinating variety of resonant defect states in FeSC, we adopt a realistic five-band model including electronic Coulomb correlations to study local effects of disorder in the FeSC. In order to minimize the number of free parameters, we use the pairing interactions obtained from spin-fluctuation exchange to determine the homogeneous superconducting state. The ability of local impurity potentials to induce resonant states depends on their scattering strength Vimp; in addition, for appropriate Vimp, such states are associated with local orbital- and magnetic order. We investigate the density of states near such impurities and show how tunneling experiments may be used to probe local induced order. In the SDW phase, we show how C2 symmetry-breaking dimers are naturally formed around impurities which also form cigar-like (pi,pi) structures embedded in the (pi,0) magnetic bulk phase. Such electronic dimers have been shown to be candidates for explaining the so-called nematogens observed previously by QPI in Co-doped CaFe2As2.

  16. {sup 137}Cs applicability to soil erosion assessment: theoretical and empirical model; Aplicabilidade do {sup 137}Cs para medir erosao do solo: modelos teoricos e empiricos

    Energy Technology Data Exchange (ETDEWEB)

    Andrello, Avacir Casanova

    2004-02-15

    The acceleration of soil erosion processes and the increase of soil erosion rates due to anthropogenic perturbation of the soil-weather-vegetation equilibrium have affected soil quality and the environment. Thus, the ability to assess the amplitude and severity of the impact of soil erosion on the productivity and quality of soil is important at local as well as regional and global scales. Several models have been developed to assess soil erosion both qualitatively and quantitatively. {sup 137}Cs, an anthropogenic radionuclide, has been widely used to assess superficial soil erosion processes. Empirical and theoretical models were developed on the basis of {sup 137}Cs redistribution as an indicator of soil movement by erosive processes. These models incorporate many parameters that can influence the quantification of soil erosion rates by {sup 137}Cs redistribution. Statistical analysis was performed on the models recommended by the IAEA to determine the influence that each parameter has on the resulting soil redistribution. It was verified that the most important parameter is the {sup 137}Cs redistribution, indicating the need for a good determination of the {sup 137}Cs inventory values with a minimum deviation associated with these values. A 10% deviation was then associated with the reference value of the {sup 137}Cs inventory and 5% with the {sup 137}Cs inventory of the sample, and the resulting deviation in the soil redistribution calculated by the models was determined. The soil redistribution results were compared to verify whether the models differed, but no difference was found in the results determined by the models except above 70% of {sup 137}Cs loss. Analyzing three native forests and an area of undisturbed pasture in the Londrina region, the {sup 137}Cs spatial variability at local scale was found to be 15%. Comparing the {sup 137}Cs inventory values determined in the three native forests with the {sup 137}Cs inventory

  17. A Web-based multimedia collaboratory. Empirical work studies in film archives

    DEFF Research Database (Denmark)

    Pejtersen, A.M.; Albrechtsen, H.; Cleal, B.

    2001-01-01

    The Collaboratory for Annotation, Indexing and Retrieval of Digitized Historical Archive Material (Collate) is intended to foster and support collaboration on research, cultural mediation and preservation of films through a distributed multimedia repository. The tool will provide web-based tools and interfaces for collaborative work and content-based access to digital repositories for film archives, researchers and end-users. This report is based on empirical analysis of three film archives in Germany, Austria and the Czech Republic, and seeks to elicit the user needs for a collaboratory in this domain. Both the collection and analysis of data have been organised according to principles of Cognitive Work Analysis (CWA) as pioneered at Risø (cf. Rasmussen, Pejtersen and Goodstein 1994). Research based work on individual film projects is, due to international distribution and multiple versions, dependent...

  18. Generation of synthetic Kinect depth images based on empirical noise model

    DEFF Research Database (Denmark)

    Iversen, Thorbjørn Mosekjær; Kraft, Dirk

    2017-01-01

    The development, training and evaluation of computer vision algorithms rely on the availability of a large number of images. The acquisition of these images can be time-consuming if they are recorded using real sensors. An alternative is to rely on synthetic images, which can be rapidly generated. This Letter describes a novel method for the simulation of Kinect v1 depth images. The method is based on an existing empirical noise model from the literature. The authors show that their relatively simple method is able to provide depth images which have a high similarity with real depth images.

  19. Virtue-based Approaches to Professional Ethics: a Plea for More Rigorous Use of Empirical Science

    OpenAIRE

    Georg Spielthenner

    2017-01-01

    Until recently, the method of professional ethics has been largely principle-based. But the failure of this approach to take into sufficient account the character of professionals has led to a revival of virtue ethics. The kind of professional virtue ethics that I am concerned with in this paper is teleological in that it relates the virtues of a profession to the ends of this profession. My aim is to show how empirical research can (in addition to philosophical inquiry) be used to develop vi...

  20. Ensemble empirical mode decomposition based fluorescence spectral noise reduction for low concentration PAHs

    Science.gov (United States)

    Wang, Shu-tao; Yang, Xue-ying; Kong, De-ming; Wang, Yu-tian

    2017-11-01

    A new noise reduction method based on ensemble empirical mode decomposition (EEMD) is proposed to improve the detection of fluorescence spectra. Polycyclic aromatic hydrocarbons (PAHs), an important class of current environmental pollutants, are highly oncogenic, and they can be detected by fluorescence spectroscopy. However, the instrument introduces noise during measurement, and weak fluorescence signals are easily corrupted by it, so we propose a denoising method to improve the detection effect. Firstly, we use a fluorescence spectrometer to detect PAHs and obtain fluorescence spectra. Subsequently, the noise is reduced by the EEMD algorithm. Finally, the experimental results show that the proposed method is feasible.
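    The EEMD procedure the abstract relies on — decompose several noise-perturbed copies of the signal by plain EMD and average the aligned intrinsic mode functions — can be sketched compactly. This is a minimal illustration, not the authors' implementation; the sifting depth, ensemble size and noise amplitude are simplified assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, n_sift=8):
    """Extract one IMF candidate by repeatedly subtracting the envelope mean."""
    h = x.astype(float).copy()
    t = np.arange(len(h))
    for _ in range(n_sift):
        maxi = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        mini = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxi) < 4 or len(mini) < 4:   # too few extrema: treat as residue
            break
        upper = CubicSpline(maxi, h[maxi])(t)  # spline envelopes (extrapolated at ends)
        lower = CubicSpline(mini, h[mini])(t)
        h -= (upper + lower) / 2.0
    return h

def emd(x, max_imfs=4):
    """Plain EMD: peel off IMFs; by construction sum(IMFs) + residue == x."""
    imfs, r = [], x.astype(float).copy()
    for _ in range(max_imfs):
        imf = sift(r)
        imfs.append(imf)
        r = r - imf
    return imfs, r

def eemd(x, n_ens=20, noise_frac=0.2, max_imfs=4, seed=0):
    """Ensemble EMD: average the IMFs of noise-perturbed copies of the signal."""
    rng = np.random.default_rng(seed)
    acc = np.zeros((max_imfs, len(x)))
    for _ in range(n_ens):
        noisy = x + noise_frac * x.std() * rng.standard_normal(len(x))
        imfs, _ = emd(noisy, max_imfs)
        acc += np.array(imfs)
    return acc / n_ens
```

    Denoising then amounts to discarding the first (highest-frequency) averaged IMFs and summing the remainder.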

  1. Evaluating Process Quality Based on Change Request Data - An Empirical Study of the Eclipse Project

    Science.gov (United States)

    Schackmann, Holger; Schaefer, Henning; Lichter, Horst

    The information routinely collected in change request management systems contains valuable information for monitoring process quality. However, these data are currently utilized in a very limited way. This paper presents an empirical study of the process quality in the product portfolio of the Eclipse project. It is based on a systematic approach for the evaluation of process quality characteristics using change request data. Results of the study offer insights into the development process of Eclipse. Moreover, the study allows assessing the applicability and limitations of the proposed approach for the evaluation of process quality.

  2. Time to Guideline-Based Empiric Antibiotic Therapy in the Treatment of Pneumonia in a Community Hospital: A Retrospective Review.

    Science.gov (United States)

    Erwin, Beth L; Kyle, Jeffrey A; Allen, Leland N

    2016-08-01

    The 2005 American Thoracic Society/Infectious Diseases Society of America (ATS/IDSA) guidelines for hospital-acquired pneumonia (HAP), ventilator-associated pneumonia (VAP), and health care-associated pneumonia (HCAP) stress the importance of initiating prompt appropriate empiric antibiotic therapy. This study's purpose was to determine the percentage of patients with HAP, VAP, and HCAP who received guideline-based empiric antibiotic therapy and to determine the average time to receipt of an appropriate empiric regimen. A retrospective chart review of adults with HAP, VAP, or HCAP was conducted at a community hospital in suburban Birmingham, Alabama. The hospital's electronic medical record system utilized International Classification of Diseases, Ninth Revision (ICD-9) codes to identify patients diagnosed with pneumonia. The percentage of patients who received guideline-based empiric antibiotic therapy was calculated. The mean time from suspected diagnosis of pneumonia to initial administration of the final antibiotic within the empiric regimen was calculated for patients who received guideline-based therapy. Ninety-three patients met the inclusion criteria. The overall guideline adherence rate for empiric antibiotic therapy was 31.2%. The mean time to guideline-based therapy in hours:minutes was 7:47 for HAP and 28:16 for HCAP. For HAP and HCAP combined, the mean time to appropriate therapy was 21:55. Guideline adherence rates were lower and time to appropriate empiric therapy was greater for patients with HCAP compared to patients with HAP. © The Author(s) 2015.

  3. Empirical Modeling of Lithium-ion Batteries Based on Electrochemical Impedance Spectroscopy Tests

    International Nuclear Information System (INIS)

    Samadani, Ehsan; Farhad, Siamak; Scott, William; Mastali, Mehrdad; Gimenez, Leonardo E.; Fowler, Michael; Fraser, Roydon A.

    2015-01-01

    Highlights: • Two commercial Lithium-ion batteries are studied through HPPC and EIS tests. • An equivalent circuit model is developed for a range of operating conditions. • This model improves the current battery empirical models for vehicle applications. • This model is proved to be efficient in terms of predicting HPPC test resistances. - ABSTRACT: An empirical model for commercial lithium-ion batteries is developed based on electrochemical impedance spectroscopy (EIS) tests. An equivalent circuit is established according to EIS test observations at various battery states of charge and temperatures. A Laplace transfer time based model is developed based on the circuit which can predict the battery operating output potential difference in battery electric and plug-in hybrid vehicles at various operating conditions. This model demonstrates up to 6% improvement compared to simple resistance and Thevenin models and is suitable for modeling and on-board controller purposes. Results also show that this model can be used to predict the battery internal resistance obtained from hybrid pulse power characterization (HPPC) tests to within 20 percent, making it suitable for low to medium fidelity powertrain design purposes. In total, this simple battery model can be employed as a real-time model in electrified vehicle battery management systems.
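    For context on the Thevenin baseline the abstract compares against: a first-order Thevenin (RC) battery model can be discretized as below. The parameter values here are illustrative placeholders, not the paper's fitted values, and positive current is taken as discharge.

```python
import numpy as np

def thevenin_terminal_voltage(current, dt, ocv=3.7, r0=0.05, r1=0.03, c1=2000.0):
    """Discrete first-order Thevenin model.

    V_t[k]    = OCV - I[k]*R0 - V_rc[k]
    V_rc[k+1] = a*V_rc[k] + R1*(1 - a)*I[k],  with a = exp(-dt/(R1*C1))
    """
    a = np.exp(-dt / (r1 * c1))
    v_rc = 0.0
    v = np.empty(len(current))
    for k, i_k in enumerate(current):
        v[k] = ocv - i_k * r0 - v_rc       # terminal voltage before this step's relaxation
        v_rc = a * v_rc + r1 * (1.0 - a) * i_k  # exact discretization of the RC branch
    return v
```

    Under a sustained constant current I, the terminal voltage settles at OCV − I·(R0 + R1), which is why a single lumped resistance underestimates transient behaviour.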

  4. Debris flow susceptibility assessment based on an empirical approach in the central region of South Korea

    Science.gov (United States)

    Kang, Sinhang; Lee, Seung-Rae

    2018-05-01

    Many debris flow spreading analyses have been conducted during recent decades to prevent damage from debris flows. An empirical approach that has been used in various studies on debris flow spreading has advantages such as simple data acquisition and good applicability for large areas. In this study, a GIS-based empirical model that was developed at the University of Lausanne (Switzerland) is used to assess the debris flow susceptibility. Study sites are classified based on the types of soil texture or geological conditions, which can indirectly consider geotechnical or rheological properties, to supplement the weaknesses of Flow-R, which neglects local controlling factors. The mean travel angle for each classification is calculated from a debris flow inventory map. The debris flow susceptibility is assessed based on changes in the flow-direction algorithm and an inertial function, at a 5-m DEM resolution. A simplified friction-limited model was applied to the runout distance analysis by using the appropriate travel angle for the corresponding classification with a velocity limit of 28 m/s. The most appropriate algorithm combinations that derived the highest average of efficiency and sensitivity for each classification are finally determined by applying a confusion matrix with the efficiency and the sensitivity to the results of the susceptibility assessment. The proposed schemes can be useful for debris flow susceptibility assessment in both the study area and the central region of Korea, which has similar environmental factors such as geological conditions, topography and rainfall characteristics to the study area.
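    The simplified friction-limited model mentioned above stops the flow where an energy line, drawn from the source at the travel angle, meets the terrain; in the special case of a flat runout zone this reduces to L = H / tan(φ). The sketch below illustrates that closed-form special case plus the 28 m/s velocity cap — it is not the Flow-R implementation, which propagates cell by cell over a DEM.

```python
import math

def runout_distance(drop_height_m, travel_angle_deg):
    """Horizontal runout L for vertical drop H and travel angle phi: L = H / tan(phi)."""
    return drop_height_m / math.tan(math.radians(travel_angle_deg))

def capped_velocity(v_ms, v_max=28.0):
    """Apply the velocity limit used in the study (28 m/s)."""
    return min(v_ms, v_max)
```

    A smaller mean travel angle for a soil class therefore directly translates into a longer predicted runout for the same drop height.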

  5. Empirical Equation Based Chirality (n, m) Assignment of Semiconducting Single Wall Carbon Nanotubes from Resonant Raman Scattering Data

    Directory of Open Access Journals (Sweden)

    Md Shamsul Arefin

    2012-12-01

    This work presents a technique for the chirality (n, m) assignment of semiconducting single wall carbon nanotubes by solving a set of empirical equations of the tight binding model parameters. The empirical equations of the nearest neighbor hopping parameters, relating the term (2n − m) with the first and second optical transition energies of the semiconducting single wall carbon nanotubes, are also proposed. They provide almost the same level of accuracy for lower and higher diameter nanotubes. An algorithm is presented to determine the chiral index (n, m) of any unknown semiconducting tube by solving these empirical equations using values of the radial breathing mode frequency and the first or second optical transition energy from resonant Raman spectroscopy. In this paper, the chirality of 55 semiconducting nanotubes is assigned using the first and second optical transition energies. Unlike the existing methods of chirality assignment, this technique does not require graphical comparison or pattern recognition between the experimental data and the theoretical Kataura plot.

  6. Empirical Equation Based Chirality (n, m) Assignment of Semiconducting Single Wall Carbon Nanotubes from Resonant Raman Scattering Data

    Science.gov (United States)

    Arefin, Md Shamsul

    2012-01-01

    This work presents a technique for the chirality (n, m) assignment of semiconducting single wall carbon nanotubes by solving a set of empirical equations of the tight binding model parameters. The empirical equations of the nearest neighbor hopping parameters, relating the term (2n − m) with the first and second optical transition energies of the semiconducting single wall carbon nanotubes, are also proposed. They provide almost the same level of accuracy for lower and higher diameter nanotubes. An algorithm is presented to determine the chiral index (n, m) of any unknown semiconducting tube by solving these empirical equations using values of the radial breathing mode frequency and the first or second optical transition energy from resonant Raman spectroscopy. In this paper, the chirality of 55 semiconducting nanotubes is assigned using the first and second optical transition energies. Unlike the existing methods of chirality assignment, this technique does not require graphical comparison or pattern recognition between the experimental data and the theoretical Kataura plot. PMID:28348319
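    To illustrate how a radial breathing mode (RBM) frequency constrains (n, m): the tube diameter follows from the graphene lattice, and a widely used empirical relation ω_RBM ≈ 248/d (ω in cm⁻¹, d in nm) can be inverted to a candidate diameter. The constant 248 and the enumeration bounds below are assumptions for this sketch, not the paper's fitted equations; the paper additionally uses the optical transition energies to resolve the remaining ambiguity between candidates.

```python
import math

A_CC = 0.246  # graphene lattice constant, nm

def diameter(n, m):
    """SWCNT diameter d = a * sqrt(n^2 + n*m + m^2) / pi, in nm."""
    return A_CC * math.sqrt(n * n + n * m + m * m) / math.pi

def candidates_from_rbm(omega_rbm, tol=0.02, n_max=20):
    """Semiconducting (n, m) whose diameter matches d = 248/omega_rbm within tol nm."""
    d_target = 248.0 / omega_rbm
    out = []
    for n in range(1, n_max + 1):
        for m in range(0, n + 1):
            if (n - m) % 3 == 0:      # (n - m) divisible by 3: metallic, skip
                continue
            if abs(diameter(n, m) - d_target) < tol:
                out.append((n, m))
    return out
```

    Note that distinct chiralities such as (6, 5) and (9, 1) share the same diameter, which is exactly why diameter alone cannot assign chirality and the transition energies are needed.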

  7. Empirical research on risk taking of listed financial institutions based on the perspective of corporate governance

    Directory of Open Access Journals (Sweden)

    Chen Hao

    2017-03-01

    After the financial crisis in 2008, the risk control of financial institutions has once again become the focus of attention. This paper selects the unbalanced panel data of 44 listed financial institutions in China from 2009 to 2013 for empirical analysis to study the risk taking of China’s listed financial institutions based on the perspective of corporate governance. Then the paper analyzes the effect of corporate governance on the risk taking of listed financial institutions based on the empirical analysis from four aspects. The results indicate that there is a significant negative correlation between the proportion of the largest shareholder’s shareholding and risk taking; a significant positive correlation between the size of the board of supervisors and risk taking; a significant positive correlation between the executive pay and risk taking, and a significant negative correlation between the equity incentive and risk taking. By comparison, the factors related to governance of board of directors have no significant effect on the risk taking of listed financial institutions.

  8. Determination of knock characteristics in spark ignition engines: an approach based on ensemble empirical mode decomposition

    International Nuclear Information System (INIS)

    Li, Ning; Liang, Caiping; Yang, Jianguo; Zhou, Rui

    2016-01-01

    Knock is one of the major constraints to improve the performance and thermal efficiency of spark ignition (SI) engines. It can also result in severe permanent engine damage under certain operating conditions. Based on the ensemble empirical mode decomposition (EEMD), this paper proposes a new approach to determine the knock characteristics in SI engines. By adding a uniformly distributed and finite white Gaussian noise, the EEMD can preserve signal continuity in different scales and therefore alleviates the mode-mixing problem occurring in the classic empirical mode decomposition (EMD). The feasibilities of applying the EEMD to detect the knock signatures of a test SI engine via the pressure signal measured from the combustion chamber and the vibration signal measured from the cylinder head are investigated. Experimental results show that the EEMD-based method is able to detect the knock signatures from both the pressure signal and vibration signal, even in the initial stage of knock. Finally, by comparing the application results with those obtained by short-time Fourier transform (STFT), Wigner–Ville distribution (WVD) and discrete wavelet transform (DWT), the superiority of the EEMD method in determining knock characteristics is demonstrated. (paper)

  9. Dynamics of bloggers’ communities: Bipartite networks from empirical data and agent-based modeling

    Science.gov (United States)

    Mitrović, Marija; Tadić, Bosiljka

    2012-11-01

    We present an analysis of the empirical data and the agent-based modeling of the emotional behavior of users on the Web portals where the user interaction is mediated by posted comments, like Blogs and Diggs. We consider the dataset of discussion-driven popular Diggs, in which all comments are screened by machine-learning emotion detection in the text, to determine positive and negative valence (attractiveness and aversiveness) of each comment. By mapping the data onto a suitable bipartite network, we perform an analysis of the network topology and the related time-series of the emotional comments. The agent-based model is then introduced to simulate the dynamics and to capture the emergence of the emotional behaviors and communities. The agents are linked to posts on a bipartite network, whose structure evolves through their actions on the posts. The emotional states (arousal and valence) of each agent fluctuate in time, subject to the current contents of the posts to which the agent is exposed. By an agent’s action on a post its current emotions are transferred to the post. The model rules and the key parameters are inferred from the considered empirical data to ensure their realistic values and mutual consistency. The model assumes that the emotional arousal over posts drives the agent’s action. The simulations are performed for the case of constant flux of agents and the results are analyzed in full analogy with the empirical data. The main conclusions are that the emotion-driven dynamics leads to long-range temporal correlations and emergent networks with community structure, that are comparable with the ones in the empirical system of popular posts. In view of pure emotion-driven agents actions, this type of comparisons provide a quantitative measure for the role of emotions in the dynamics on real blogs. Furthermore, the model reveals the underlying mechanisms which relate the post popularity with the emotion dynamics and the prevalence of negative…

  10. Theoretical frameworks informing family-based child and adolescent obesity interventions

    DEFF Research Database (Denmark)

    Alulis, Sarah; Grabowski, Dan

    2017-01-01

    …into focus. However, the use of theoretical frameworks to strengthen these interventions is rare and very uneven. OBJECTIVE AND METHOD: To conduct a qualitative meta-synthesis of family-based interventions for child and adolescent obesity to identify the theoretical frameworks applied, thus understanding how … cognitive, self-efficacy and Family Systems Theory appeared most frequently. The remaining 24 were classified as theory-related, as theoretical elements of self-monitoring, stimulus control, reinforcement and modelling were used. CONCLUSION: The designs of family-based interventions reveal numerous inconsistencies and a significant void between research results and health care practice. Based on the analysis, this article proposes three themes to be used as focus points when designing future interventions and when selecting theories for the development of solid, theory-based frameworks for application…

  11. Empirical global model of upper thermosphere winds based on atmosphere and dynamics explorer satellite data

    Science.gov (United States)

    Hedin, A. E.; Spencer, N. W.; Killeen, T. L.

    1988-01-01

    Thermospheric wind data obtained from the Atmosphere Explorer E and Dynamics Explorer 2 satellites have been used to generate an empirical wind model for the upper thermosphere, analogous to the MSIS model for temperature and density, using a limited set of vector spherical harmonics. The model is limited to above approximately 220 km where the data coverage is best and wind variations with height are reduced by viscosity. The data base is not adequate to detect solar cycle (F10.7) effects at this time but does include magnetic activity effects. Mid- and low-latitude data are reproduced quite well by the model and compare favorably with published ground-based results. The polar vortices are present, but not to full detail.

  12. Tissue artifact removal from respiratory signals based on empirical mode decomposition.

    Science.gov (United States)

    Liu, Shaopeng; Gao, Robert X; John, Dinesh; Staudenmayer, John; Freedson, Patty

    2013-05-01

    On-line measurement of respiration plays an important role in monitoring human physical activities. Such measurement commonly employs sensing belts secured around the rib cage and abdomen of the test subject. Affected by the movement of body tissues, respiratory signals typically have a low signal-to-noise ratio. Removing tissue artifacts therefore is critical to ensuring effective respiration analysis. This paper presents a signal decomposition technique for tissue artifact removal from respiratory signals, based on the empirical mode decomposition (EMD). An algorithm based on mutual information and power criteria was devised to automatically select appropriate intrinsic mode functions for tissue artifact removal and respiratory signal reconstruction. Performance of the EMD algorithm was evaluated through simulations and real-life experiments (N = 105). Comparison with the low-pass filtering that has conventionally been applied confirmed the effectiveness of the technique in tissue artifact removal.

  13. An empirical Bayesian approach for model-based inference of cellular signaling networks

    Directory of Open Access Journals (Sweden)

    Klinke David J

    2009-11-01

    Background: A common challenge in systems biology is to infer mechanistic descriptions of a biological process given limited observations of a biological system. Mathematical models are frequently used to represent a belief about the causal relationships among proteins within a signaling network. Bayesian methods provide an attractive framework for inferring the validity of those beliefs in the context of the available data. However, efficient sampling of high-dimensional parameter space and appropriate convergence criteria provide barriers for implementing an empirical Bayesian approach. The objective of this study was to apply an Adaptive Markov chain Monte Carlo technique to a typical study of cellular signaling pathways. Results: As an illustrative example, a kinetic model for the early signaling events associated with the epidermal growth factor (EGF) signaling network was calibrated against dynamic measurements observed in primary rat hepatocytes. A convergence criterion, based upon the Gelman-Rubin potential scale reduction factor, was applied to the model predictions. The posterior distributions of the parameters exhibited complicated structure, including significant covariance between specific parameters and a broad range of variance among the parameters. The model predictions, in contrast, were narrowly distributed and were used to identify areas of agreement among a collection of experimental studies. Conclusion: In summary, an empirical Bayesian approach was developed for inferring the confidence that one can place in a particular model that describes signal transduction mechanisms and for inferring inconsistencies in experimental measurements.
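    The Gelman-Rubin potential scale reduction factor used here as a convergence criterion compares between-chain and within-chain variance across parallel MCMC chains. The sketch below is the standard textbook formula, not code from the study:

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for an (m_chains, n_samples) array.

    Values near 1 suggest the chains have mixed; values well above 1
    indicate that longer runs (or better mixing) are needed.
    """
    chains = np.asarray(chains, float)
    m, n = chains.shape
    b = n * chains.mean(axis=1).var(ddof=1)   # between-chain variance
    w = chains.var(axis=1, ddof=1).mean()     # mean within-chain variance
    var_hat = (n - 1) / n * w + b / n         # pooled posterior-variance estimate
    return np.sqrt(var_hat / w)
```

    In practice the criterion is evaluated per quantity of interest; here, it was applied to the model predictions rather than only to the raw parameters.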

  14. The impact of SOA for achieving healthcare interoperability. An empirical investigation based on a hypothetical adoption.

    Science.gov (United States)

    Daskalakis, S; Mantas, J

    2009-01-01

    The evaluation of a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit the use of service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and partially a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, whereas the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.

  15. Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    Naveed ur Rehman

    2015-05-01

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including principal component analysis (PCA), the discrete wavelet transform (DWT) and the non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically significant performance differences.

  16. An empirically based conceptual framework for fostering meaningful patient engagement in research.

    Science.gov (United States)

    Hamilton, Clayon B; Hoens, Alison M; Backman, Catherine L; McKinnon, Annette M; McQuitty, Shanon; English, Kelly; Li, Linda C

    2018-02-01

    Patient engagement in research (PEIR) is promoted to improve the relevance and quality of health research, but has little conceptualization derived from empirical data. To address this issue, we sought to develop an empirically based conceptual framework for meaningful PEIR founded on a patient perspective. We conducted a qualitative secondary analysis of in-depth interviews with 18 patient research partners from a research centre-affiliated patient advisory board. Data analysis involved three phases: identifying the themes, developing a framework and confirming the framework. We coded and organized the data, and abstracted, illustrated, described and explored the emergent themes using thematic analysis. Directed content analysis was conducted to derive concepts from 18 publications related to PEIR to supplement, confirm or refute, and extend the emergent conceptual framework. The framework was reviewed by four patient research partners on our research team. Participants' experiences of working with researchers were generally positive. Eight themes emerged: procedural requirements, convenience, contributions, support, team interaction, research environment, feel valued and benefits. These themes were interconnected and formed a conceptual framework to explain the phenomenon of meaningful PEIR from a patient perspective. This framework, the PEIR Framework, was endorsed by the patient research partners on our team. The PEIR Framework provides guidance on aspects of PEIR to address for meaningful PEIR. It could be particularly useful when patient-researcher partnerships are led by researchers with little experience of engaging patients in research. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  17. Empirical tests of natural selection-based evolutionary accounts of ADHD: a systematic review.

    Science.gov (United States)

    Thagaard, Marthe S; Faraone, Stephen V; Sonuga-Barke, Edmund J; Østergaard, Søren D

    2016-10-01

    ADHD is a prevalent and highly heritable mental disorder associated with significant impairment, morbidity and increased rates of mortality. This combination of high prevalence and high morbidity/mortality seen in ADHD and other mental disorders presents a challenge to natural selection-based models of human evolution. Several hypotheses have been proposed in an attempt to resolve this apparent paradox. The aim of this study was to review the evidence for these hypotheses. We conducted a systematic review of the literature on empirical investigations of natural selection-based evolutionary accounts for ADHD in adherence with the PRISMA guideline. The PubMed, Embase, and PsycINFO databases were screened for relevant publications, by combining search terms covering evolution/selection with search terms covering ADHD. The search identified 790 records. Of these, 15 full-text articles were assessed for eligibility, and three were included in the review. Two of these reported on the evolution of the seven-repeat allele of the ADHD-associated dopamine receptor D4 gene, and one reported on the results of a simulation study of the effect of suggested ADHD-traits on group survival. The authors of the three studies interpreted their findings as favouring the notion that ADHD-traits may have been associated with increased fitness during human evolution. However, we argue that none of the three studies really tap into the core symptoms of ADHD, and that their conclusions therefore lack validity for the disorder. This review indicates that the natural selection-based accounts of ADHD have not been subjected to empirical test and therefore remain hypothetical.

  18. Evaluating Fast Maximum Likelihood-Based Phylogenetic Programs Using Empirical Phylogenomic Data Sets

    Science.gov (United States)

    Zhou, Xiaofan; Shen, Xing-Xing; Hittinger, Chris Todd

    2018-01-01

    The sizes of the data matrices assembled to resolve branches of the tree of life have increased dramatically, motivating the development of programs for fast, yet accurate, inference. For example, several different fast programs have been developed in the very popular maximum likelihood framework, including RAxML/ExaML, PhyML, IQ-TREE, and FastTree. Although these programs are widely used, a systematic evaluation and comparison of their performance using empirical genome-scale data matrices has so far been lacking. To address this question, we evaluated these four programs on 19 empirical phylogenomic data sets with hundreds to thousands of genes and up to 200 taxa with respect to likelihood maximization, tree topology, and computational speed. For single-gene tree inference, we found that the more exhaustive and slower strategies (ten searches per alignment) outperformed faster strategies (one tree search per alignment) using RAxML, PhyML, or IQ-TREE. Interestingly, single-gene trees inferred by the three programs yielded comparable coalescent-based species tree estimations. For concatenation-based species tree inference, IQ-TREE consistently achieved the best-observed likelihoods for all data sets, and RAxML/ExaML was a close second. In contrast, PhyML often failed to complete concatenation-based analyses, whereas FastTree was the fastest but generated lower likelihood values and more dissimilar tree topologies in both types of analyses. Finally, data matrix properties, such as the number of taxa and the strength of phylogenetic signal, sometimes substantially influenced the programs’ relative performance. Our results provide real-world gene and species tree phylogenetic inference benchmarks to inform the design and execution of large-scale phylogenomic data analyses. PMID:29177474

  19. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' d between the logarithm of the model output and the logarithm of the experimental data, defined as d^2 = (1/n) Σ_{i=1}^{n} (ln M_i − ln O_i)^2, where M_i is the i-th experimental value, O_i the corresponding model evaluation and n the number of couplets 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this non-desired effect, in some circumstances, may be corrected by means of simple formulae.
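    The functional distance defined above is straightforward to compute; a short sketch follows (the function and variable names are mine, not the paper's):

```python
import numpy as np

def functional_distance(experimental, modeled):
    """d = sqrt((1/n) * sum_i (ln M_i - ln O_i)^2).

    M_i: i-th experimental value; O_i: corresponding model output.
    Both inputs must be strictly positive for the logarithms to exist.
    """
    m = np.asarray(experimental, float)
    o = np.asarray(modeled, float)
    return float(np.sqrt(np.mean((np.log(m) - np.log(o)) ** 2)))
```

    d = 0 means perfect agreement; a model that is uniformly off by a factor e gives d = 1, so the index measures multiplicative (order-of-magnitude) rather than additive error.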

  20. Theoretical and Experimental Analysis of Adsorption in Surface-based Biosensors

    DEFF Research Database (Denmark)

    Hansen, Rasmus

    The present Ph.D. dissertation concerns the application of surface plasmon resonance (SPR) spectroscopy, which is a surface-based biosensor technology, for studies of adsorption dynamics. The thesis contains both experimental and theoretical work. In the theoretical part we develop the theory … cell of the surface-based biosensor, in addition to the sensor surface, is investigated. In the experimental part of the thesis we use a Biacore SPR sensor to study lipase adsorption on model substrate surfaces, as well as competitive adsorption of lipase and surfactants. A part of the experimental…

  1. An Empirical Evaluation of Puzzle-Based Learning as an Interest Approach for Teaching Introductory Computer Science

    Science.gov (United States)

    Merrick, K. E.

    2010-01-01

    This correspondence describes an adaptation of puzzle-based learning to teaching an introductory computer programming course. Students from two offerings of the course--with and without the puzzle-based learning--were surveyed over a two-year period. Empirical results show that the synthesis of puzzle-based learning concepts with existing course…

  2. Integrated Management Systems and Workflow-Based Electronic Document Management: An Empirical Study

    DEFF Research Database (Denmark)

    Pho, Hang Thu; Tambo, Torben

    2014-01-01

    trustworthiness. Social implications: IMS has a tendency to stay with professionals, e.g. line managers and QA/QC/QMS professionals. The EDMS line of discussion suggests a broader inclusion. Originality/value: Researching IMS as a technological implementation is giving a better platform of aligning the IMS...... (EDMS) is essential, especially at global enterprises where a large amount of documents generated by processes flows through different work cultures. However, there is no "one-size-fits-all" design for EDMS because it depends on organizations' needs, size and resource allocation. This article discusses...... the interrelation between EDMS and IMS in order to suggest a best practice. Design/methodology/approach: This article is methodologically based upon a qualitative, interpretivistic, longitudinal empirical study in a wind turbine factory. Findings and Originality/value: IMS improvement and effectiveness has been......

  3. Fringe-projection profilometry based on two-dimensional empirical mode decomposition.

    Science.gov (United States)

    Zheng, Suzhen; Cao, Yiping

    2013-11-01

    In 3D shape measurement, because deformed fringes often contain low-frequency information degraded with random noise and background intensity information, a new fringe-projection profilometry is proposed based on 2D empirical mode decomposition (2D-EMD). The fringe pattern is first decomposed into a number of intrinsic mode functions by 2D-EMD. Because the method provides partial noise reduction, the background components can be removed to obtain the fundamental components needed to perform Hilbert transformation to retrieve the phase information. The 2D-EMD can effectively extract the modulation phase of a single-direction fringe and an inclined fringe pattern because it is a full 2D analysis method and considers the relationship between adjacent lines of a fringe pattern. In addition, as the method does not add noise repeatedly, as does ensemble EMD, the data processing time is shortened. Computer simulations and experiments prove the feasibility of this method.
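
    The final step described above, retrieving phase by Hilbert transformation from a background-free fringe line, can be sketched as follows. This is not the paper's 2D-EMD code: the EMD background-removal step is replaced by simple mean subtraction on a clean synthetic fringe, and the analytic signal is built with a plain FFT (equivalent to a Hilbert-transform step); all signal parameters are made up.

```python
import numpy as np

def analytic_signal(s):
    """Analytic signal via FFT (what a Hilbert-transform step computes)."""
    n = len(s)
    spec = np.fft.fft(s)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    h[n // 2] = 1.0  # n is even here
    return np.fft.ifft(spec * h)

# Synthetic scan line: I(x) = a + b*cos(2*pi*f0*x + phi(x)).
x = np.linspace(0.0, 1.0, 512)
phi = 2.0 * np.sin(2 * np.pi * x)              # test modulation phase
f0 = 20                                        # carrier fringes per unit length
fringe = 100 + 50 * np.cos(2 * np.pi * f0 * x + phi)

# Stand-in for the 2D-EMD background removal: subtract the DC component.
fundamental = fringe - fringe.mean()

# Wrapped total phase from the analytic signal, then unwrap and remove carrier.
total_phase = np.unwrap(np.angle(analytic_signal(fundamental)))
recovered = total_phase - 2 * np.pi * f0 * x
recovered -= recovered.mean() - phi.mean()     # align the constant phase offset

print(np.max(np.abs(recovered[32:-32] - phi[32:-32])))  # interior error
```

Away from the edges, the recovered phase tracks the test modulation closely; edge distortion is the usual Hilbert/FFT boundary effect.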

  4. An epileptic seizures detection algorithm based on the empirical mode decomposition of EEG.

    Science.gov (United States)

    Orosco, Lorena; Laciar, Eric; Correa, Agustina Garces; Torres, Abel; Graffigna, Juan P

    2009-01-01

    Epilepsy is a neurological disorder that affects around 50 million people worldwide. Seizure detection is an important component in the diagnosis of epilepsy. In this study, the Empirical Mode Decomposition (EMD) method was applied to the development of an automatic epileptic seizure detection algorithm. The algorithm first computes the Intrinsic Mode Functions (IMFs) of EEG records, then calculates the energy of each IMF and performs the detection based on an energy threshold and a minimum duration decision. The algorithm was tested on 9 invasive EEG records provided and validated by the Epilepsy Center of the University Hospital of Freiburg. In 90 segments analyzed (39 with epileptic seizures) the sensitivity and specificity obtained with the method were 56.41% and 75.86%, respectively. It can be concluded that EMD is a promising method for epileptic seizure detection in EEG records.
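
    The energy-threshold-plus-minimum-duration decision rule described above can be sketched independently of the EMD step. The IMF energy trace below is a made-up toy series, not EEG:

```python
import numpy as np

def detect_segments(energy, threshold, min_len):
    """Return (start, end) index pairs where energy stays above threshold
    for at least min_len consecutive samples (the decision rule above)."""
    segments, start = [], None
    for i, above in enumerate(energy > threshold):
        if above and start is None:
            start = i
        elif not above and start is not None:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    if start is not None and len(energy) - start >= min_len:
        segments.append((start, len(energy)))
    return segments

# Made-up IMF energy trace: a 3-sample burst (too short) and a 15-sample one.
energy = np.array([0.1] * 20 + [5.0] * 3 + [0.1] * 20 + [5.0] * 15 + [0.1] * 10)
print(detect_segments(energy, threshold=1.0, min_len=5))
```

Only the sustained burst is flagged; the minimum-duration condition rejects the brief energy spike.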

  5. Empirical likelihood based detection procedure for change point in mean residual life functions under random censorship.

    Science.gov (United States)

    Chen, Ying-Ju; Ning, Wei; Gupta, Arjun K

    2016-05-01

    The mean residual life (MRL) function is one of the basic parameters of interest in survival analysis that describes the expected remaining time of an individual after a certain age. The study of changes in the MRL function is practical and interesting because it may help us to identify factors such as age and gender that may influence the remaining lifetimes of patients after receiving a certain surgery. In this paper, we propose a detection procedure based on the empirical likelihood for changes in MRL functions with right-censored data. Two real examples, the Veterans' Administration lung cancer study and the Stanford heart transplant data, are given to illustrate the detection procedure. Copyright © 2016 John Wiley & Sons, Ltd.
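
    For uncensored data the MRL function has a direct empirical counterpart, m(t) = average of (T - t) over subjects with T > t; the paper's contribution is handling right censoring and change-point detection on top of this quantity. A toy sketch of the basic estimator only, with made-up lifetimes:

```python
import numpy as np

def empirical_mrl(times, t):
    """Empirical mean residual life m(t) = E[T - t | T > t] for uncensored
    data (the paper extends this to right censoring)."""
    times = np.asarray(times, dtype=float)
    alive = times[times > t]
    return float(np.mean(alive - t)) if alive.size else 0.0

lifetimes = [2.0, 3.0, 5.0, 8.0, 13.0]
print(empirical_mrl(lifetimes, 0.0))  # at t=0 this is the mean lifetime
print(empirical_mrl(lifetimes, 4.0))  # averages the residuals of survivors
```

A change-point procedure would then compare m(t) estimated before and after candidate ages t.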

  6. Empirically based Suggested Insights into the Concept of False-Self Defense: Contributions From a Study on Normalization of Children With Disabilities.

    Science.gov (United States)

    Eichengreen, Adva; Hoofien, Dan; Bachar, Eytan

    2016-02-01

    The concept of the false self has been used widely in psychoanalytic theory and practice but seldom in empirical research. In this empirically based study, elevated features of false-self defense were hypothetically associated with risk factors attendant on processes of rehabilitation and integration of children with disabilities, processes that encourage adaptation of the child to the able-bodied environment. Self-report questionnaires and in-depth interviews were conducted with 88 deaf and hard-of-hearing students and a comparison group of 88 hearing counterparts. Results demonstrate that despite the important contribution of rehabilitation and integration to the well-being of these children, these efforts may put the child at risk of increased use of the false-self defense. The empirical findings suggest two general theoretical conclusions: (1) The Winnicottian concept of the environment, usually confined to the parent-child relationship, can be understood more broadly as including cultural, social, and rehabilitational variables that both influence the parent-child relationship and operate independently of it. (2) The monolithic conceptualization of the false self may be more accurately unpacked to reveal two distinct subtypes: the compliant and the split false self. © 2016 by the American Psychoanalytic Association.

  7. The Role of Social Network Technologies in Online Health Promotion: A Narrative Review of Theoretical and Empirical Factors Influencing Intervention Effectiveness

    Science.gov (United States)

    Kennedy, Catriona M; Buchan, Iain; Powell, John; Ainsworth, John

    2015-01-01

    Background Social network technologies have become part of health education and wider health promotion—either by design or happenstance. Social support, peer pressure, and information sharing in online communities may affect health behaviors. If there are positive and sustained effects, then social network technologies could increase the effectiveness and efficiency of many public health campaigns. Social media alone, however, may be insufficient to promote health. Furthermore, there may be unintended and potentially harmful consequences of inaccurate or misleading health information. Given these uncertainties, there is a need to understand and synthesize the evidence base for the use of online social networking as part of health promoting interventions to inform future research and practice. Objective Our aim was to review the research on the integration of expert-led health promotion interventions with online social networking in order to determine the extent to which the complementary benefits of each are understood and used. We asked, in particular, (1) How is effectiveness being measured and what are the specific problems in effecting health behavior change?, and (2) To what extent is the designated role of social networking grounded in theory? Methods The narrative synthesis approach to literature review was used to analyze the existing evidence. We searched the indexed scientific literature using keywords associated with health promotion and social networking. The papers included were only those making substantial study of both social networking and health promotion—either reporting the results of the intervention or detailing evidence-based plans. General papers about social networking and health were not included. Results The search identified 162 potentially relevant documents after review of titles and abstracts. Of these, 42 satisfied the inclusion criteria after full-text review. Six studies described randomized controlled trials (RCTs) evaluating

  8. The Role of Social Network Technologies in Online Health Promotion: A Narrative Review of Theoretical and Empirical Factors Influencing Intervention Effectiveness.

    Science.gov (United States)

    Balatsoukas, Panos; Kennedy, Catriona M; Buchan, Iain; Powell, John; Ainsworth, John

    2015-06-11

    Social network technologies have become part of health education and wider health promotion—either by design or happenstance. Social support, peer pressure, and information sharing in online communities may affect health behaviors. If there are positive and sustained effects, then social network technologies could increase the effectiveness and efficiency of many public health campaigns. Social media alone, however, may be insufficient to promote health. Furthermore, there may be unintended and potentially harmful consequences of inaccurate or misleading health information. Given these uncertainties, there is a need to understand and synthesize the evidence base for the use of online social networking as part of health promoting interventions to inform future research and practice. Our aim was to review the research on the integration of expert-led health promotion interventions with online social networking in order to determine the extent to which the complementary benefits of each are understood and used. We asked, in particular, (1) How is effectiveness being measured and what are the specific problems in effecting health behavior change?, and (2) To what extent is the designated role of social networking grounded in theory? The narrative synthesis approach to literature review was used to analyze the existing evidence. We searched the indexed scientific literature using keywords associated with health promotion and social networking. The papers included were only those making substantial study of both social networking and health promotion—either reporting the results of the intervention or detailing evidence-based plans. General papers about social networking and health were not included. The search identified 162 potentially relevant documents after review of titles and abstracts. Of these, 42 satisfied the inclusion criteria after full-text review. Six studies described randomized controlled trials (RCTs) evaluating the effectiveness of online social

  9. Making Trade-Offs Visible: Theoretical and Methodological Considerations about the Relationship between Dimensions and Institutions of Democracy and Empirical Findings

    Directory of Open Access Journals (Sweden)

    Hans-Joachim Lauth

    2018-03-01

    Full Text Available Whereas the measurement of the quality of democracy initially focused on the rough differentiation of democracies and autocracies (e.g. Vanhanen, Polity, Freedom House), the focal point of newer instruments is the assessment of the quality of established democracies. In this context, tensions or trade-offs between dimensions of democracy are discussed as well (e.g. Democracy Barometer, Varieties of Democracy). However, these approaches lack a systematic discussion of trade-offs and are not able to show trade-offs empirically. We address this research desideratum in a three-step process: Firstly, we propose a new conceptual approach, which distinguishes between two different modes of relationship between dimensions: mutually reinforcing effects and a give-and-take relationship (trade-offs) between dimensions. By introducing our measurement tool, the Democracy Matrix, we finally locate mutually reinforcing effects as well as trade-offs. Secondly, we provide a new methodological approach to measure trade-offs: while one measuring strategy captures the mutually reinforcing effects, the other strategy employs indicators which serve to gauge trade-offs. Thirdly, we demonstrate empirical findings of our measurement drawing on the Varieties of Democracy dataset. Incorporating trade-offs into the measurement enables us to identify various profiles of democracy (libertarian, egalitarian and control-focused democracy) via the quality of their dimensions.

  10. Developing a Clustering-Based Empirical Bayes Analysis Method for Hotspot Identification

    Directory of Open Access Journals (Sweden)

    Yajie Zou

    2017-01-01

    Full Text Available Hotspot identification (HSID) is a critical part of network-wide safety evaluations. Typical methods for ranking sites are often rooted in using the Empirical Bayes (EB) method to estimate safety from both observed crash records and predicted crash frequency based on similar sites. The performance of the EB method is highly related to the selection of a reference group of sites (i.e., roadway segments or intersections) similar to the target site, from which the safety performance functions (SPF) used to predict crash frequency will be developed. As crash data often contain underlying heterogeneity that, in essence, can make them appear to be generated from distinct subpopulations, methods are needed to select similar sites in a principled manner. To overcome this possible heterogeneity problem, EB-based HSID methods that use common clustering methodologies (e.g., mixture models, K-means, and hierarchical clustering) to select "similar" sites for building SPFs are developed. Performance of the clustering-based EB methods is then compared using real crash data. Here, HSID results, when computed on Texas undivided rural highway crash data, suggest that all three clustering-based EB analysis methods are preferred over the conventional statistical methods. Thus, properly classifying the road segments for heterogeeneous crash data can further improve HSID accuracy.
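
    The EB step these methods share combines the SPF prediction mu with the observed crash count x as EB = w*mu + (1 - w)*x, with w = 1/(1 + mu/phi), where phi is the overdispersion parameter of the site's reference group (cluster). A minimal sketch with made-up numbers; the clustering step that assigns each site a reference group is assumed already done:

```python
import numpy as np

def eb_estimate(observed, mu, phi):
    """Empirical Bayes safety estimate: weighted average of the SPF
    prediction mu and the observed crash count, w = 1 / (1 + mu/phi)."""
    w = 1.0 / (1.0 + mu / phi)
    return w * mu + (1.0 - w) * observed

# Hypothetical sites: observed crashes, SPF prediction from each site's
# cluster, and the cluster's overdispersion parameter phi.
observed = np.array([12.0, 2.0, 7.0])
mu = np.array([5.0, 4.0, 6.0])
phi = np.array([2.0, 2.0, 3.0])

eb = eb_estimate(observed, mu, phi)
ranking = np.argsort(-eb)  # hotspot ranking: highest EB estimate first
print(eb, ranking)
```

Sites with counts far above their cluster's prediction are pulled down toward mu, which is how EB corrects for regression-to-the-mean before ranking.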

  11. Empirical source strength correlations for rans-based acoustic analogy methods

    Science.gov (United States)

    Kube-McDowell, Matthew Tyndall

    JeNo is a jet noise prediction code based on an acoustic analogy method developed by Mani, Gliebe, Balsa, and Khavaran. Using the flow predictions from a standard Reynolds-averaged Navier-Stokes computational fluid dynamics solver, JeNo predicts the overall sound pressure level and angular spectra for high-speed hot jets over a range of observer angles, with a processing time suitable for rapid design purposes. JeNo models the noise from hot jets as a combination of two types of noise sources: quadrupole sources dependent on velocity fluctuations, which represent the major noise of turbulent mixing, and dipole sources dependent on enthalpy fluctuations, which represent the effects of thermal variation. These two sources are modeled by JeNo as propagating independently into the far-field, with no cross-correlation at the observer location. However, high-fidelity computational fluid dynamics solutions demonstrate that this assumption is false. In this thesis, the theory, assumptions, and limitations of the JeNo code are briefly discussed, and a modification to the acoustic analogy method is proposed in which the cross-correlation of the two primary noise sources is allowed to vary with the speed of the jet and the observer location. As a proof-of-concept implementation, an empirical correlation correction function is derived from comparisons between JeNo's noise predictions and a set of experimental measurements taken for the Air Force Aero-Propulsion Laboratory. The empirical correlation correction is then applied to JeNo's predictions of a separate data set of hot jets tested at NASA's Glenn Research Center. Metrics are derived to measure the qualitative and quantitative performance of JeNo's acoustic predictions, and the empirical correction is shown to provide a quantitative improvement in the noise prediction at low observer angles with no freestream flow, and a qualitative improvement in the presence of freestream flow. However, the results also demonstrate

  12. Substituent effects on hydrogen bonding in Watson-Crick base pairs. A theoretical study

    NARCIS (Netherlands)

    Fonseca Guerra, C.; van der Wijst, T.; Bickelhaupt, F.M.

    2005-01-01

    We have theoretically analyzed Watson-Crick AT and GC base pairs in which purine C8 and/or pyrimidine C6 positions carry a substituent X = H, F, Cl or Br, using the generalized gradient approximation (GGA) of density functional theory at BP86/TZ2P. The purpose is to study the effects on structure

  13. Study on the Theoretical Foundation of Business English Curriculum Design Based on ESP and Needs Analysis

    Science.gov (United States)

    Zhu, Wenzhong; Liu, Dan

    2014-01-01

    Based on a review of the literature on ESP and needs analysis, this paper is intended to offer some theoretical support and inspiration for BE instructors in developing BE curricula for business contexts. It discusses how the theory of needs analysis can be used in Business English curriculum design, and proposes some principles of BE curriculum…

  14. Theoretical and practical bases of transfer pricing formation at the microlevel in terms of national economy

    OpenAIRE

    Oksana Desyatniuk; Olga Cherevko

    2015-01-01

    The theoretical and methodological bases of transfer pricing formation at the microlevel are studied. The factors acting upon transfer pricing are analysed and an algorithm for forming transfer prices at an enterprise is suggested. A model example of choosing the method of transfer pricing and calculating the profitability interval that meets modern legal requirements is considered.

  15. Written institutional ethics policies on euthanasia: an empirical-based organizational-ethical framework.

    Science.gov (United States)

    Lemiengre, Joke; Dierckx de Casterlé, Bernadette; Schotsmans, Paul; Gastmans, Chris

    2014-05-01

    As euthanasia has become a widely debated issue in many Western countries, hospitals and nursing homes especially are increasingly being confronted with this ethically sensitive societal issue. The focus of this paper is how healthcare institutions can deal with euthanasia requests on an organizational level by means of a written institutional ethics policy. The general aim is to make a critical analysis of whether these policies can be considered as organizational-ethical instruments that support healthcare institutions to take their institutional responsibility for dealing with euthanasia requests. By means of an interpretative analysis, we conducted a process of reinterpretation of results of former Belgian empirical studies on written institutional ethics policies on euthanasia in dialogue with the existing international literature. The study findings revealed that legal regulations, ethical and care-oriented aspects strongly affected the development, the content, and the impact of written institutional ethics policies on euthanasia. Hence, these three cornerstones (law, care and ethics) constituted the basis for the empirical-based organizational-ethical framework for written institutional ethics policies on euthanasia that is presented in this paper. However, having a euthanasia policy does not automatically lead to more legal transparency, or to a more professional and ethical care practice. The study findings suggest that the development and implementation of an ethics policy on euthanasia as an organizational-ethical instrument should be considered as a dynamic process. Administrators and ethics committees must take responsibility to actively create an ethical climate supporting care providers who have to deal with ethical dilemmas in their practice.

  16. Review: Joachim R. Höflich (2003. Mensch, Computer und Kommunikation. Theoretische Verortungen und empirische Befunde [Man, Computer, Communication. Theoretical Positions and Empirical Findings

    Directory of Open Access Journals (Sweden)

    Jan Schmidt

    2004-05-01

    Full Text Available Joachim R. HÖFLICH presents a theory of the institutionalization of computer-mediated communication that centers on the user and his/her expectations. "Computer frames", consisting of rules and routines for the appropriate use of a medium and its applications as a tool for information, public discussion or interpersonal communication, structure the single usage episodes as well as the users' expectations. Drawing on a variety of data on the development of the Newspaper-Mailbox "Augsburg Newsline" in the Mid-Nineties, HÖFLICH demonstrates the usefulness of his conceptual framework for empirical analysis. His book is, therefore, a valuable contribution to the field of online research in social and communication science alike. URN: urn:nbn:de:0114-fqs040297

  17. Theoretical and Experimental Study on Secondary Piezoelectric Effect Based on PZT-5

    International Nuclear Information System (INIS)

    Zhang, Z H; Sun, B Y; Shi, L P

    2006-01-01

    The purpose of this paper is to confirm the existence of the secondary and multiple piezoelectric effects theoretically and experimentally. Based on the Heckmann model, which shows the relationship among mechanical, electric and heat energy, and on a physical model of mechanical, electric, heat and magnetic energy, a theoretical analysis of the multiple piezoelectric effect is made through four kinds of piezoelectric equations. Experimental research on the secondary direct piezoelectric effect is conducted using PZT-5 piles. The results indicate that the charge generated by the secondary direct piezoelectric effect, as well as the displacement caused by the first converse piezoelectric effect, maintains good linearity with the applied voltage.

  18. The purchasing power parity in emerging Europe: Empirical results based on two-break analysis

    Directory of Open Access Journals (Sweden)

    Mladenović Zorica

    2013-01-01

    Full Text Available The purpose of the paper is to evaluate the validity of purchasing power parity (PPP) for eight countries of Emerging Europe: Hungary, the Czech Republic, Poland, Romania, Lithuania, Latvia, Serbia and Turkey. Monthly data for euro- and U.S. dollar-based real exchange rate time series are considered, covering the period January 2000 - August 2011. Given the significant changes in these economies over this sample, it seems plausible to assume that the real exchange rate time series are characterized by more than one structural break. In order to endogenously determine the number and type of breaks while testing for the presence of unit roots, we applied the Lee-Strazicich approach. For two euro-based real exchange rate time series (Hungary and Turkey) the unit root hypothesis was rejected. For the U.S. dollar-based real exchange rate time series in Poland, Romania and Turkey the presence of a unit root was also rejected. To assess the adjustment dynamics of those real exchange rates found to be stationary with two breaks, the impulse response function is calculated and the half-life is estimated. Our overall conclusion is that the persistence of real exchange rates in Emerging Europe is still substantially high. The lack of strong empirical support for PPP suggests that careful policy actions are needed in this region to prevent serious exchange rate misalignment.
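
    The half-life reported in such studies is commonly obtained from an AR(1) fit of the real exchange rate deviation, HL = ln(0.5)/ln(rho). A minimal sketch on simulated data; this ignores the structural breaks that the Lee-Strazicich analysis handles, and all series parameters are made up:

```python
import numpy as np

def half_life(series):
    """Half-life of mean reversion from an AR(1) fit q_t = c + rho*q_{t-1} + e_t:
    HL = ln(0.5) / ln(rho)."""
    y, x = series[1:], series[:-1]
    X = np.column_stack([np.ones_like(x), x])
    c, rho = np.linalg.lstsq(X, y, rcond=None)[0]
    return np.log(0.5) / np.log(rho)

# Simulated deviation of a (log) real exchange rate from parity, rho = 0.9.
rng = np.random.default_rng(0)
q = np.zeros(600)
for t in range(1, 600):
    q[t] = 0.9 * q[t - 1] + rng.normal(scale=0.02)

print(half_life(q))  # roughly ln(0.5)/ln(0.9), about 6.6 periods, for the true rho
```

With monthly data the result reads as "months for a PPP deviation to decay halfway"; high estimated rho (long half-life) is what "substantially high persistence" means above.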

  19. Bivariate empirical mode decomposition for ECG-based biometric identification with emotional data.

    Science.gov (United States)

    Ferdinando, Hany; Seppanen, Tapio; Alasaarela, Esko

    2017-07-01

    Emotions modulate ECG signals such that they might affect ECG-based biometric identification in real-life applications. This motivates the search for feature extraction methods on which the emotional state of the subject has minimal impact. This paper evaluates feature extraction based on bivariate empirical mode decomposition (BEMD) for biometric identification when emotion is considered. Using ECG signals from the Mahnob-HCI database for affect recognition, the features were statistical distributions of the dominant frequency after applying BEMD analysis to the ECG signals. The achieved accuracy was 99.5% with high consistency, using a kNN classifier in 10-fold cross-validation to identify 26 subjects when the emotional states of the subjects were ignored. When the emotional states of the subjects were considered, the proposed method also delivered high accuracy, around 99.4%. We conclude that the proposed method offers emotion-independent features for ECG-based biometric identification. The proposed method needs further evaluation, including testing with other classifiers and with variation in the ECG signals, e.g. normal ECG vs. ECG with arrhythmias, ECG from various ages, and ECG from other affective databases.

  20. Investigating properties of the cardiovascular system using innovative analysis algorithms based on ensemble empirical mode decomposition.

    Science.gov (United States)

    Yeh, Jia-Rong; Lin, Tzu-Yu; Chen, Yun; Sun, Wei-Zen; Abbod, Maysam F; Shieh, Jiann-Shing

    2012-01-01

    The cardiovascular system is known to be nonlinear and nonstationary. Traditional linear algorithms for assessing arterial stiffness and systemic resistance of the cardiac system suffer from problems with nonstationarity or are inconvenient in practical applications. In this pilot study, two new assessment methods were developed: the first is an ensemble empirical mode decomposition based reflection index (EEMD-RI), while the second is based on the phase shift between ECG and BP on cardiac oscillation. Both methods utilise the EEMD algorithm, which is suitable for nonlinear and nonstationary systems. These methods were used to investigate the properties of arterial stiffness and systemic resistance of a pig's cardiovascular system via ECG and blood pressure (BP). The experiment simulated a sequence of continuous changes of blood pressure, rising from a steady condition to high blood pressure by clamping the artery and the inverse by relaxing the artery. As a hypothesis, the arterial stiffness and systemic resistance should vary with the blood pressure due to clamping and relaxing the artery. The results show statistically significant correlations between BP, the EEMD-based RI, and the phase shift between ECG and BP on cardiac oscillation. The two assessment results demonstrate the merits of EEMD for signal analysis.
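
    One simple way to estimate the second index, the phase shift between two roughly periodic signals such as ECG and BP, is from the lag that maximizes their cross-correlation. A sketch on idealized sinusoids, not real cardiovascular data; sampling rate, heart rate and lag are all made-up values:

```python
import numpy as np

def phase_shift(a, b, fs, f0):
    """Estimate the phase (radians) by which b trails a at oscillation
    frequency f0, from the lag maximizing their cross-correlation."""
    a = a - a.mean()
    b = b - b.mean()
    corr = np.correlate(b, a, mode="full")
    lag = np.argmax(corr) - (len(a) - 1)    # samples by which b trails a
    return 2 * np.pi * f0 * lag / fs

fs, f0 = 250.0, 1.2                          # sampling rate (Hz), heart rate (Hz)
t = np.arange(0.0, 8.0, 1.0 / fs)
ecg_like = np.sin(2 * np.pi * f0 * t)        # idealized cardiac oscillation
bp_like = np.sin(2 * np.pi * f0 * t - 0.8)   # BP lagging by 0.8 rad
print(phase_shift(ecg_like, bp_like, fs, f0))
```

Real ECG/BP would first be band-limited to the cardiac oscillation (the role EEMD plays above) before such a lag estimate is meaningful.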

  1. Transparency in Transcribing: Making Visible Theoretical Bases Impacting Knowledge Construction from Open-Ended Interview Records

    Directory of Open Access Journals (Sweden)

    Audra Skukauskaite

    2012-01-01

    Full Text Available This article presents a reflexive analysis of two transcripts of an open-ended interview and argues for transparency in transcribing processes and outcomes. By analyzing ways in which a researcher's theories become consequential in producing and using transcripts of an open-ended interview, this paper makes visible the importance of examining and presenting theoretical bases of transcribing decisions. While scholars across disciplines have argued that transcribing is a theoretically laden process (GREEN, FRANQUIZ & DIXON, 1997; KVALE & BRINKMAN, 2009), few have engaged in reflexive analyses of the data history to demonstrate the consequences particular theoretical and methodological approaches pose in producing knowledge claims and inciting dialogues across traditions. The article demonstrates how theory-method-claim relationships in transcribing influence research transparency and warrantability. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1201146

  2. Empirical Evaluation Indicators in Thai Higher Education: Theory-Based Multidimensional Learners' Assessment

    Science.gov (United States)

    Sritanyarat, Dawisa; Russ-Eft, Darlene

    2016-01-01

    This study proposed empirical indicators which can be validated and adopted in higher education institutions to evaluate the quality of teaching and learning, and to serve as evaluation criteria for human resource management and development in higher institutions in Thailand. The main purpose of this study was to develop empirical indicators of a…

  3. An Empirical Study on Washback Effects of the Internet-Based College English Test Band 4 in China

    Science.gov (United States)

    Wang, Chao; Yan, Jiaolan; Liu, Bao

    2014-01-01

    Based on Bailey's washback model, in respect of participants, process and products, the present empirical study was conducted to find the actual washback effects of the internet-based College English Test Band 4 (IB CET-4). The methods adopted are questionnaires, class observation, interview and the analysis of both the CET-4 teaching and testing…

  4. River channel and bar patterns explained and predicted by an empirical and a physics-based method

    NARCIS (Netherlands)

    Kleinhans, M.G.; Berg, J.H. van den

    2011-01-01

    Our objective is to understand general causes of different river channel patterns. In this paper we compare an empirical stream power-based classification and a physics-based bar pattern predictor. We present a careful selection of data from the literature that contains rivers with discharge and

  5. How "Does" the Comforting Process Work? An Empirical Test of an Appraisal-Based Model of Comforting

    Science.gov (United States)

    Jones, Susanne M.; Wirtz, John G.

    2006-01-01

    Burleson and Goldsmith's (1998) comforting model suggests an appraisal-based mechanism through which comforting messages can bring about a positive change in emotional states. This study is a first empirical test of three causal linkages implied by the appraisal-based comforting model. Participants (N=258) talked about an upsetting event with a…

  6. "Trabajo productivo, relación precio/valor y tasa de plusvalía: puntos de vista teóricos y evidencia empírica con base en los Estados Unidos de América (1948-1987) y países de la Comunidad Económica Europea (1960/70-1986)" Productive labour, price/value ratio and rate of surplus-value: theoretical viewpoints and empirical evidence on the USA (1948-1987) and E.E.C. countries (1960/70-1986).

    Directory of Open Access Journals (Sweden)

    Gouverneur Jacques

    1990-12-01

    Full Text Available

    Presenting data from the economies of the United States (1948-1987) and the European Economic Community (1960/70-1986) as empirical support, the article seeks to contribute theoretical and methodological elements to the debate on the rate of surplus-value. The first section presents the author's position on the controversial debate over value theory, without an exhaustive review of it. The second section examines the relationship between the sum of prices and the sum of values, considering the connections this establishes between monetary magnitudes and labour magnitudes and the method used to estimate them. The third section considers problems related to the measurement of the rate of surplus-value, in order then to establish the relationship between the form and measures based on national accounts data, evaluating their role. The level and evolution of the rate of surplus-value in the U.S.A. and the E.E.C. are estimated here. Finally, a method is offered for analysing the evolution of the factors that affect the rate of surplus-value, applying it to the same countries.

    Presenting as empirical support data on the economies of the United States (1948-1987) and the European Economic Community (1960/70-86), the essay seeks to contribute theoretical and methodological elements to the debate on the theory of the rate of surplus-value. The first section of the essay presents the author's position in respect to the debate on the theory of value, without an exhaustive revision. The second section refers to the relationship that exists between the sum of prices and of values, taking into account the relationships that are established between monetary and labour magnitudes as well as the methods used to estimate them. Section three refers to problems related to the measurement of the rate of surplus-value in order to establish the relationship between forms and measures based

  7. Comparison of subset-based local and FE-based global digital image correlation: Theoretical error analysis and validation

    KAUST Repository

    Pan, B.

    2016-03-22

    Subset-based local and finite-element-based (FE-based) global digital image correlation (DIC) approaches are the two primary image matching algorithms widely used for full-field displacement mapping. Very recently, the performances of these different DIC approaches have been experimentally investigated using numerical and real-world experimental tests. The results have shown that in typical cases, where the subset (element) size is no less than a few pixels and the local deformation within a subset (element) can be well approximated by the adopted shape functions, the subset-based local DIC outperforms FE-based global DIC approaches because the former provides slightly smaller root-mean-square errors and offers much higher computation efficiency. Here we investigate the theoretical origin and lay a solid theoretical basis for the previous comparison. We assume that systematic errors due to imperfect intensity interpolation and undermatched shape functions are negligibly small, and perform a theoretical analysis of the random errors or standard deviation (SD) errors in the displacements measured by two local DIC approaches (i.e., a subset-based local DIC and an element-based local DIC) and two FE-based global DIC approaches (i.e., Q4-DIC and Q8-DIC). The equations that govern the random errors in the displacements measured by these local and global DIC approaches are theoretically derived. The correctness of the theoretically predicted SD errors is validated through numerical translation tests under various noise levels. We demonstrate that the SD errors induced by the Q4-element-based local DIC, the global Q4-DIC and the global Q8-DIC are 4, 1.8-2.2 and 1.2-1.6 times greater, respectively, than that associated with the subset-based local DIC, which is consistent with our conclusions from previous work. © 2016 Elsevier Ltd. All rights reserved.

  8. Confessional Ethical Base of Muslim Entrepreneurship in Russian Empire in Late 19th - Early 20th Centuries

    Directory of Open Access Journals (Sweden)

    Гадиля Гизатуллаевна Корноухова

    2012-06-01

    Full Text Available The article considers the confessional and ethical base of the Muslim entrepreneurship in the Russian Empire in the late 19th - early 20th centuries. The author analyzes the differentiation of the value-institutional system of the broad public on the one hand, and that of entrepreneurs - on the other hand. Whereas the former adhered to the national and ethical values of the traditional culture, the latter - to religious and moral values based on Islam and developed by the Russian Empire reformers of that period.

  9. A new approach for crude oil price analysis based on empirical mode decomposition

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shou-Yang; Lai, K.K.

    2008-01-01

The importance of understanding the underlying characteristics of international crude oil price movements attracts much attention from academic researchers and business practitioners. Due to the intrinsic complexity of the oil market, however, most of them fail to produce consistently good results. Empirical Mode Decomposition (EMD), recently proposed by Huang et al., appears to be a novel data analysis method for nonlinear and non-stationary time series. By decomposing a time series, based on scale separation, into a small number of independent intrinsic modes with concrete interpretations, EMD explains the generation of time series data from a novel perspective. Ensemble EMD (EEMD) is a substantial improvement of EMD which can better separate the scales naturally by adding white noise series to the original time series and then treating the ensemble averages as the true intrinsic modes. In this paper, we extend EEMD to crude oil price analysis. First, three crude oil price series with different time ranges and frequencies are decomposed into several independent intrinsic modes, from high to low frequency. Second, the intrinsic modes are composed into a fluctuating process, a slowly varying part and a trend based on fine-to-coarse reconstruction. The economic meanings of the three components are identified as short-term fluctuations caused by normal supply-demand disequilibrium or some other market activities, the effect of a shock of a significant event, and a long-term trend. Finally, the EEMD is shown to be a vital technique for crude oil price analysis. (author)
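The EEMD scheme described above rests on plain EMD sifting plus noise-assisted ensemble averaging. A minimal sketch of the sifting core is given below; the extrema detection, envelope construction and stopping rules are simplified illustrations, not the exact algorithm of Huang et al.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, n_iter=10):
    # Extract one intrinsic mode function (IMF) by repeatedly
    # subtracting the mean of the upper and lower extrema envelopes.
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(n_iter):
        maxima = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        minima = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxima) < 4 or len(minima) < 4:  # too few extrema for envelopes
            break
        upper = CubicSpline(maxima, h[maxima])(t)
        lower = CubicSpline(minima, h[minima])(t)
        h = h - (upper + lower) / 2.0
    return h

def emd(x, max_imfs=6):
    # Decompose x into IMFs (high to low frequency) plus a residue;
    # by construction the components sum back to x.
    imfs, residue = [], x.astype(float).copy()
    for _ in range(max_imfs):
        imf = sift(residue)
        if np.allclose(imf, residue):  # nothing left to extract
            break
        imfs.append(imf)
        residue = residue - imf
    return imfs, residue
```

EEMD would then run `emd` many times on `x` plus independent white-noise realizations and average the resulting IMFs.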

  10. Promoting Sustainability Transparency in European Local Governments: An Empirical Analysis Based on Administrative Cultures

    Directory of Open Access Journals (Sweden)

    Andrés Navarro-Galera

    2017-03-01

Full Text Available Nowadays, the transparency of governments with respect to the sustainability of public services is a very interesting issue for stakeholders and academics. It has led previous researchers and international organisations (EU, IMF, OECD, United Nations, IFAC, G-20, World Bank) to recommend promotion of the online dissemination of economic, social and environmental information. Based on previous studies about e-government and the influence of administrative cultures on governmental accountability, this paper seeks to identify political actions useful for improving the practices of transparency on economic, social and environmental sustainability in European local governments. We perform a comparative analysis of sustainability information published on the websites of 72 local governments in 10 European countries grouped into three main cultural contexts (Anglo-Saxon, Southern European and Nordic). Using international sustainability reporting guidelines, our results reveal significant differences in local government transparency in each context. The most transparent local governments are the Anglo-Saxon ones, followed by Southern European and Nordic governments. Based on individualized empirical results for each administrative style, our conclusions propose useful policy interventions to enhance sustainability transparency within each cultural tradition, such as development of legal rules on transparency and sustainability, tools to motivate local managers for online diffusion of sustainability information and analysis of the information needs of stakeholders.

  11. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    International Nuclear Information System (INIS)

    Han, G.; Lin, B.; Xu, Z.

    2017-01-01

The electrocardiogram (ECG) signal is a nonlinear, non-stationary and weak signal which reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal becomes a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart diseases. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. The EMD technique is a promising, though not yet perfect, method for processing nonlinear and non-stationary signals such as the ECG signal. Combining EMD with other algorithms is a good way to improve noise-cancellation performance. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in ECG signal denoising based on the EMD technique are clarified.

  12. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    Science.gov (United States)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

The electrocardiogram (ECG) signal is a nonlinear, non-stationary and weak signal which reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal becomes a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart diseases. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. The EMD technique is a promising, though not yet perfect, method for processing nonlinear and non-stationary signals such as the ECG signal. Combining EMD with other algorithms is a good way to improve noise-cancellation performance. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in ECG signal denoising based on the EMD technique are clarified.

  13. Going Global: A Model for Evaluating Empirically Supported Family-Based Interventions in New Contexts.

    Science.gov (United States)

    Sundell, Knut; Ferrer-Wreder, Laura; Fraser, Mark W

    2014-06-01

The spread of evidence-based practice throughout the world has resulted in the wide adoption of empirically supported interventions (ESIs) and a growing number of controlled trials of imported and culturally adapted ESIs. This article is informed by outcome research on family-based interventions, including programs listed in the American Blueprints Model and Promising Programs. Evidence from these controlled trials is mixed and, because it comprises both successful and unsuccessful replications of ESIs, it provides clues for the translation of promising programs in the future. At least four explanations appear plausible for the mixed results in replication trials. One has to do with methodological differences across trials. A second deals with ambiguities in the cultural adaptation process. A third explanation is that ESIs in failed replications have not been adequately implemented. A fourth source of variation derives from unanticipated contextual influences that might alter the effects of ESIs when they are transported to other cultures and countries. This article describes a model that allows for the differential examination of adaptations of interventions in new cultural contexts. © The Author(s) 2012.

  14. A hybrid filtering method based on a novel empirical mode decomposition for friction signals

    International Nuclear Information System (INIS)

    Li, Chengwei; Zhan, Liwei

    2015-01-01

During a measurement, the measured signal usually contains noise. To remove the noise and preserve the important features of the signal, we introduce a hybrid filtering method that uses a new intrinsic mode function (NIMF) and a modified Hausdorff distance. The NIMF is defined as the difference between the noisy signal and each intrinsic mode function (IMF), which is obtained by empirical mode decomposition (EMD), ensemble EMD, complementary ensemble EMD, or complete ensemble EMD with adaptive noise (CEEMDAN). The selection of relevant modes is based on the similarity between the first NIMF and the rest of the NIMFs. With this filtering method, the EMD and its improved versions are used to filter the simulation and friction signals. The friction signal between an airplane tire and the runway is recorded during a simulated airplane touchdown and features spikes of various amplitudes and noise. The filtering effectiveness of the four hybrid filtering methods is compared and discussed. The results show that the filtering method based on CEEMDAN outperforms the other signal filtering methods. (paper)
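The similarity measure used for mode selection can be illustrated with the modified Hausdorff distance of Dubuisson and Jain; representing each NIMF as a set of (time, amplitude) points is an assumption of this sketch, not necessarily the authors' exact formulation.

```python
import numpy as np

def modified_hausdorff(A, B):
    # Modified Hausdorff distance between two point sets A and B,
    # each an (n, d) array: the larger of the two mean
    # nearest-neighbour distances (A -> B and B -> A).
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())
```

Modes whose NIMF lies closest to the first NIMF under this distance would be retained for reconstruction.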

  15. An empirically based steady state friction law and implications for fault stability.

    Science.gov (United States)

    Spagnuolo, E; Nielsen, S; Violay, M; Di Toro, G

    2016-04-16

Empirically based rate-and-state friction laws (RSFLs) have been proposed to model the dependence of friction forces on slip and time. The relevance of the RSFL for earthquake mechanics is that few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates, and their extrapolation to seismic slip rates (V > 0.1 m/s) remains questionable on the basis of the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on the review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady state conditions from subseismic to seismic slip rates (V > 0.1 µm/s), allows the reassessment of fault frictional stability, with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest.

  16. Empirical Comparison of Graph-based Recommendation Engines for an Apps Ecosystem

    Directory of Open Access Journals (Sweden)

    Luis F. Chiroque

    2015-03-01

Full Text Available Recommendation engines (REs) are becoming highly popular, e.g., in the area of e-commerce. An RE offers new items (products or content) to users based on their profile and historical data. The most popular algorithms used in REs are based on collaborative filtering. This technique makes recommendations based on the past behavior of other users and the similarity between users and items. In this paper we have evaluated the performance of several REs based on the properties of the networks formed by users and items. The REs use graph-theoretic concepts, such as edge weights or network flow, in a novel way. The evaluation has been conducted in a real environment (ecosystem) for recommending apps to smartphone users. The analysis of the results allows concluding that the effectiveness of an RE can be improved if the age of the data is taken into account, and if a global view of the data is considered. It also shows that graph-based REs are effective, but more experiments are required for a more accurate characterization of their properties.
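The edge-weight idea can be made concrete with a toy bipartite recommender: items are scored for a user by the total weight of the co-occurrence edges linking them to the user's own items. This is a generic illustration, not one of the engines evaluated in the paper.

```python
from collections import defaultdict

def recommend(history, user, top_n=3):
    # history: user -> set of items (edges of the bipartite graph).
    # Weight an item-item edge by how many users hold both items,
    # then score each unseen item against the user's own items.
    weight = defaultdict(float)
    for items in history.values():
        for a in items:
            for b in items:
                if a != b:
                    weight[(a, b)] += 1.0
    seen = history[user]
    candidates = {i for items in history.values() for i in items} - seen
    scores = {c: sum(weight[(s, c)] for s in seen) for c in candidates}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

A production engine would also decay edge weights with the age of the interaction, which is exactly the "age of the data" effect the study reports.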

  17. Theoretical results on the tandem junction solar cell based on its Ebers-Moll transistor model

    Science.gov (United States)

    Goradia, C.; Vaughn, J.; Baraona, C. R.

    1980-01-01

    A one-dimensional theoretical model of the tandem junction solar cell (TJC) with base resistivity greater than about 1 ohm-cm and under low level injection has been derived. This model extends a previously published conceptual model which treats the TJC as an npn transistor. The model gives theoretical expressions for each of the Ebers-Moll type currents of the illuminated TJC and allows for the calculation of the spectral response, I(sc), V(oc), FF and eta under variation of one or more of the geometrical and material parameters and 1MeV electron fluence. Results of computer calculations based on this model are presented and discussed. These results indicate that for space applications, both a high beginning of life efficiency, greater than 15% AM0, and a high radiation tolerance can be achieved only with thin (less than 50 microns) TJC's with high base resistivity (greater than 10 ohm-cm).
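For context, the classical npn Ebers-Moll injection-model currents that such a transistor treatment extends have the textbook form below (sign conventions vary with the assumed current directions; the paper's illuminated-cell expressions add photogenerated terms not shown here):

```latex
I_C = \alpha_F I_{ES}\left(e^{V_{BE}/V_T} - 1\right) - I_{CS}\left(e^{V_{BC}/V_T} - 1\right),
\qquad
I_E = I_{ES}\left(e^{V_{BE}/V_T} - 1\right) - \alpha_R I_{CS}\left(e^{V_{BC}/V_T} - 1\right),
```

with the reciprocity relation \(\alpha_F I_{ES} = \alpha_R I_{CS}\), where \(V_T = kT/q\) is the thermal voltage.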

  18. Study of network resource allocation based on market and game theoretic mechanism

    Science.gov (United States)

    Liu, Yingmei; Wang, Hongwei; Wang, Gang

    2004-04-01

We address the issue of network resource allocation, a function of the network management system, based on a market-oriented mechanism. The scheme models telecommunication network resources as trading goods, in which the various network components could be owned by different competitive, real-world entities. This is a multidisciplinary framework concentrating on the similarity between resource allocation in a network environment and the market mechanism in economic theory. By taking an economic (market-based and game-theoretic) approach to routing in communication networks, we study the dynamic behavior of network resource allocation under a game-theoretic framework. Based on the prior work of Gibney and Jennings, we apply the concepts of utility and fitness to the market mechanism with the intention of closing the gap between the experimental environment and real-world situations.

  19. Theoretical and Empirical Study on Enterprise Design Strategy Mode

    Institute of Scientific and Technical Information of China (English)

    管顺丰; 肖雄; 李燕敏

    2016-01-01

To meet the needs of the development of design strategy theory, and on the basis of a systematic review of domestic and international research on design strategy modes, this paper constructs a design strategy mode system from three levels: target market, product positioning, and business domain (namely the value innovation strategy, standardization strategy, localization strategy, globalization strategy, business diversification strategy, and business concentration strategy). It then analyzes the characteristics and implementation paths of each design strategy mode. Finally, an empirical study is carried out with Apple Inc. and Motorola Inc. as cases.

  20. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instabilty.

    Science.gov (United States)

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient’s pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (“SBM”) was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology or “QCP”) which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient’s physiologic state. This platform allows for the use of predictive analytic techniques to identify early changes in a patient’s condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP in which the output of computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.0083 Hz) with the tamponade introduced at a point 420 minutes into the simulation sequence. The functionality of the SBM predictive analytics methodology to identify clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model is seen to disagree while the
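The kernel-based estimation at the heart of such similarity-based modeling can be sketched as a weighted lookup against a memory matrix of normal observations; the Gaussian kernel and bandwidth below are illustrative stand-ins for SBM's proprietary similarity operator.

```python
import numpy as np

def sbm_estimate(memory, x, bandwidth=1.0):
    # memory: (n, d) matrix of multivariate vectors recorded during
    # normal operation; x: current observation of length d.
    # Returns the kernel-weighted expected state and the residual;
    # a growing residual signals departure from learned behaviour.
    d2 = ((memory - x) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    w = w / w.sum()
    expected = w @ memory
    return expected, x - expected
```

Tracking the residual magnitude over time, rather than thresholding raw vitals, is what gives this class of method its early-warning behaviour.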

  1. The Consistency of Performance Management System Based on Attributes of the Performance Indicator: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Jan Zavadsky

    2014-07-01

Full Text Available Purpose: The performance management system (PMS) is a metasystem over all business processes at the strategic and operational level. The effectiveness of the various management systems depends on many factors. One of them is the consistent definition of each system's elements. The main purpose of this study is to explore whether the performance management systems of the sample companies are consistent and how companies can create such a system. Consistency in this case is based on the homogeneous definition of attributes relating to the performance indicator as a basic element of the PMS. Methodology: At the beginning, we used an affinity diagram that helped us to clarify and group the various attributes of performance indicators. The main research results were achieved through an empirical study carried out in a sample of Slovak companies. The criterion for selection was the existence of management systems certified according to ISO 9001. The representativeness of the sample companies was confirmed by application of Pearson's chi-squared test (χ² test) with respect to the above standards. Findings: Drawing on a review of the literature, we defined four groups of attributes relating to the performance indicator: formal attributes, attributes of target value, informational attributes and attributes of evaluation. The whole set contains 21 attributes. The consistency of the PMS is based not on a maximum or minimum number of attributes, but on the same type of attributes for each performance indicator used in the PMS at both the operational and strategic level. The main findings are: companies use various financial and non-financial indicators at the strategic or operational level; companies determine various attributes of the performance indicator, but most of the performance indicators are determined differently; we identified the common attributes for the whole sample of companies. Practical implications: The research results have implications for

  2. Eclectic continuum, distinct discipline or sub-domain of communication studies? Theoretical considerations and empirical findings on the disciplinarity, multidisciplinarity and transdisciplinarity of journalism studies

    Directory of Open Access Journals (Sweden)

    Martin Löffelholz

    2011-06-01

Full Text Available Is journalism studies a sub-domain of communication studies, a distinct discipline, a multidisciplinary merger or a transdisciplinary endeavour? This question is discussed by analyzing the 2008 and 2009 volumes of seven academic journals focusing on journalism research. The sample includes 349 articles published in Brazilian Journalism Research, Ecquid Novi, Journalism & Communication Monographs, Journalism & Mass Communication Quarterly, Pacific Journalism Review, Journalism Studies, or Journalism: Theory, Practice and Criticism. Overall, the findings reveal that journalism research mainly applies theoretical approaches and empirical methods deriving from other disciplines, particularly sociology, psychology or cultural studies. In many countries, however, journalism studies has reached a comparatively high level of institutionalization, indicated by the large number of specific schools, professorships, professional associations and respective academic journals. In conclusion, we argue that journalism studies is a sub-domain of communication studies, which integrates and transcends various disciplines aiming to become one of the axial subjects of the 21st century.

  3. ECLECTIC CONTINUUM, DISTINCT DISCIPLINE OR SUB-DOMAIN OF COMMUNICATION STUDIES? Theoretical considerations and empirical findings on the disciplinarity, multidisciplinarity and transdisciplinarity of journalism studies

    Directory of Open Access Journals (Sweden)

    Liane Rothenberger

    2011-06-01

Full Text Available Is journalism studies a sub-domain of communication studies, a distinct discipline, a multidisciplinary merger or a transdisciplinary endeavour? This question is discussed by analyzing the 2008 and 2009 volumes of seven academic journals focusing on journalism research. The sample includes 349 articles published in Brazilian Journalism Research, Ecquid Novi, Journalism & Communication Monographs, Journalism & Mass Communication Quarterly, Pacific Journalism Review, Journalism Studies, or Journalism: Theory, Practice and Criticism. Overall, the findings reveal that journalism research mainly applies theoretical approaches and empirical methods deriving from other disciplines, particularly sociology, psychology or cultural studies. In many countries, however, journalism studies has reached a comparatively high level of institutionalization, indicated by the large number of specific schools, professorships, professional associations and respective academic journals. In conclusion, we argue that journalism studies is a sub-domain of communication studies, which integrates and transcends various disciplines aiming to become one of the axial subjects of the 21st century.

  4. Benchmarking of a T-wave alternans detection method based on empirical mode decomposition.

    Science.gov (United States)

    Blanco-Velasco, Manuel; Goya-Esteban, Rebeca; Cruz-Roldán, Fernando; García-Alberola, Arcadi; Rojo-Álvarez, José Luis

    2017-07-01

T-wave alternans (TWA) is a fluctuation of the ST-T complex occurring on an every-other-beat basis in the surface electrocardiogram (ECG). It has been shown to be an informative risk stratifier for sudden cardiac death, though the lack of a gold standard to benchmark detection methods has promoted the use of synthetic signals. This work proposes a novel signal model to study the performance of TWA detection. Additionally, the methodological validation of a denoising technique based on empirical mode decomposition (EMD), which is used here along with the spectral method (SM), is also tackled. The proposed test bed system is based on the following guidelines: (1) use of open source databases to enable experimental replication; (2) use of real ECG signals and physiological noise; (3) inclusion of randomized TWA episodes. Both sensitivity (Se) and specificity (Sp) are separately analyzed. Also a nonparametric hypothesis test, based on Bootstrap resampling, is used to determine whether the presence of the EMD block actually improves the performance. The results show an outstanding specificity when the EMD block is used, even in very noisy conditions (0.96 compared to 0.72 for SNR = 8 dB), always superior to that of the conventional SM alone. Regarding the sensitivity, using the EMD method also outperforms in noisy conditions (0.57 compared to 0.46 for SNR = 8 dB), while it decreases in noiseless conditions. The proposed test setting designed to analyze the performance guarantees that the actual physiological variability of the cardiac system is reproduced. The use of the EMD-based block in noisy environments enables the identification of most patients with fatal arrhythmias. Copyright © 2017 Elsevier B.V. All rights reserved.
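The spectral method that the EMD block is paired with can be sketched in a few lines: FFT each sample position across aligned beats and read the alternans power at 0.5 cycles/beat. The noise-band choice below is an illustrative convention, not the paper's exact estimator.

```python
import numpy as np

def twa_ratio(beats):
    # beats: (n_beats, n_samples) matrix of time-aligned ST-T segments.
    # FFT along the beat axis; alternans appears at 0.5 cycles/beat,
    # i.e. the last rfft bin. Return alternans power relative to a
    # spectral noise band just below it.
    spec = np.abs(np.fft.rfft(beats - beats.mean(axis=0), axis=0)) ** 2
    agg = spec.mean(axis=1)               # aggregate spectrum over samples
    alternans = agg[-1]                   # bin at 0.5 cycles/beat
    noise = agg[len(agg) // 2:-1].mean()  # illustrative noise band
    return (alternans - noise) / (noise + 1e-12)
```

An EMD-based denoising block would clean each beat before this ratio is computed, which is where the specificity gains reported above come from.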

  5. Proposed Empirical Entropy and Gibbs Energy Based on Observations of Scale Invariance in Open Nonequilibrium Systems.

    Science.gov (United States)

    Tuck, Adrian F

    2017-09-07

There is no widely agreed definition of entropy, and consequently Gibbs energy, in open systems far from equilibrium. One recent approach has sought to formulate an entropy and Gibbs energy based on observed scale invariances in geophysical variables, particularly in atmospheric quantities, including the molecules constituting stratospheric chemistry. The Hamiltonian flux dynamics of energy in macroscopic open nonequilibrium systems maps to energy in equilibrium statistical thermodynamics, and the corresponding equivalences of scale invariant variables with other relevant statistical mechanical variables such as entropy, Gibbs energy, and 1/(k_B T), where k_B is the Boltzmann constant, are not just formally analogous but are also mappings. Three proof-of-concept representative examples from available adequate stratospheric chemistry observations (temperature, wind speed and ozone) are calculated, with the aim of applying these mappings and equivalences. Potential applications of the approach to scale invariant observations from the literature, involving scales from molecular through laboratory to astronomical, are considered. Theoretical support for the approach from the literature is discussed.

  6. The effects of sampling on the efficiency and accuracy of k-mer indexes: Theoretical and empirical comparisons using the human genome.

    Science.gov (United States)

    Almutairy, Meznah; Torng, Eric

    2017-01-01

    One of the most common ways to search a sequence database for sequences that are similar to a query sequence is to use a k-mer index such as BLAST. A big problem with k-mer indexes is the space required to store the lists of all occurrences of all k-mers in the database. One method for reducing the space needed, and also query time, is sampling where only some k-mer occurrences are stored. Most previous work uses hard sampling, in which enough k-mer occurrences are retained so that all similar sequences are guaranteed to be found. In contrast, we study soft sampling, which further reduces the number of stored k-mer occurrences at a cost of decreasing query accuracy. We focus on finding highly similar local alignments (HSLA) over nucleotide sequences, an operation that is fundamental to biological applications such as cDNA sequence mapping. For our comparison, we use the NCBI BLAST tool with the human genome and human ESTs. When identifying HSLAs, we find that soft sampling significantly reduces both index size and query time with relatively small losses in query accuracy. For the human genome and HSLAs of length at least 100 bp, soft sampling reduces index size 4-10 times more than hard sampling and processes queries 2.3-6.8 times faster, while still achieving retention rates of at least 96.6%. When we apply soft sampling to the problem of mapping ESTs against the genome, we map more than 98% of ESTs perfectly while reducing the index size by a factor of 4 and query time by 23.3%. These results demonstrate that soft sampling is a simple but effective strategy for performing efficient searches for HSLAs. We also provide a new model for sampling with BLAST that predicts empirical retention rates with reasonable accuracy by modeling two key problem factors.
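The trade-off between full indexing and sampling can be illustrated with a toy k-mer index (a sketch of the general idea, not BLAST's actual data structures): increasing the sampling step shrinks the index but lets some k-mer occurrences, and hence some alignments, go unfound.

```python
def build_index(seq, k, step=1):
    # Map each sampled k-mer to the list of positions where it starts;
    # step=1 stores every occurrence, step > 1 stores only a sample.
    index = {}
    for i in range(0, len(seq) - k + 1, step):
        index.setdefault(seq[i:i + k], []).append(i)
    return index

def lookup(index, kmer):
    # Positions of a k-mer, empty if it was never stored.
    return index.get(kmer, [])
```

Hard sampling chooses `step` small enough relative to the query length that every sufficiently long alignment still shares a stored k-mer; soft sampling deliberately exceeds that bound to save more space.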

  7. THEORETIC INCURSION IN THE IDENTIFICATION OF DETERMINANTS AND E-GOVERNMENT STRATEGY. EMPIRIC STUDY OVER THE GRADE OF IMPLEMENTATION OF E-GOVERNMENT IN BIHOR COUNTY

    Directory of Open Access Journals (Sweden)

    Pop Cohut Ioana

    2012-07-01

Full Text Available We are currently witnessing an unprecedented development of new technologies that offers great opportunities for the modernization of society. It radically changes the way governments, businesses and citizens obtain goods and services, the way public services (which are becoming increasingly important for citizens) are provided, the way information is obtained and transmitted, the way business connections are made, the interaction between different communities, and so on. From this point of view, we examine how public strategies for achieving and implementing an e-government model are shaped, a model which can exploit the opportunities offered by new technologies and define roles for citizens and public policy makers, and implicitly for local and regional development. How the public sector supports the implementation of e-government strategies, polarizes all communities of interest at the highest level, provides a framework for planning and action across the board (covering local administration, institutions, organizations and government agencies) and provides appropriate training for the implementation of IT strategies can stimulate the generating factors of local and regional development. We also intend to identify, through an empirical study of 30 public institutions in Bihor county, the degree of implementation of new technologies in public administration, the readiness of human resources for the appropriate use of IT facilities, and the level of implementation of e-government in these institutions, in order to outline the main characteristics and determinants of e-government implementation. The present paper is of interest to policy-makers, researchers and local communities through its analysis of how the introduction of new technologies into developing and implementing public policies, particularly by networking government and e-government, can cause

  8. Future scenarios of land change based on empirical data and demographic trends

    Science.gov (United States)

    Sleeter, Benjamin M.; Wilson, Tamara; Sharygin, Ethan; Sherba, Jason

    2017-01-01

    Changes in land use and land cover (LULC) have important and fundamental interactions with the global climate system. Top-down global scale projections of land use change have been an important component of climate change research; however, their utility at local to regional scales is often limited. The goal of this study was to develop an approach for projecting changes in LULC based on land use histories and demographic trends. We developed a set of stochastic, empirical-based projections of LULC change for the state of California, for the period 2001–2100. Land use histories and demographic trends were used to project a “business-as-usual” (BAU) scenario and three population growth scenarios. For the BAU scenario, we projected developed lands would more than double by 2100. When combined with cultivated areas, we projected a 28% increase in anthropogenic land use by 2100. As a result, natural lands were projected to decline at a rate of 139 km2 yr−1; grasslands experienced the largest net decline, followed by shrublands and forests. The amount of cultivated land was projected to decline by approximately 10%; however, the relatively modest change masked large shifts between annual and perennial crop types. Under the three population scenarios, developed lands were projected to increase 40–90% by 2100. Our results suggest that when compared to the BAU projection, scenarios based on demographic trends may underestimate future changes in LULC. Furthermore, regardless of scenario, the spatial pattern of LULC change was likely to have the greatest negative impacts on rangeland ecosystems.
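
A projection of this kind can be sketched as a simple state-transition (Markov) model over class areas. The transition probabilities and initial areas below are hypothetical placeholders, not the empirically estimated rates from the study, and the stochastic sampling step is omitted for brevity; the sketch only shows the area bookkeeping.

```python
import numpy as np

# Illustrative annual transition probabilities between four LULC classes
# (developed, cultivated, grassland, forest). Values are hypothetical,
# not the rates estimated from land use histories in the study.
classes = ["developed", "cultivated", "grassland", "forest"]
P = np.array([
    [1.000, 0.000, 0.000, 0.000],   # developed is effectively permanent
    [0.004, 0.990, 0.006, 0.000],
    [0.005, 0.003, 0.990, 0.002],
    [0.002, 0.001, 0.002, 0.995],
])

def project(area0, P, years):
    """Project class areas forward with a first-order Markov chain."""
    area = np.asarray(area0, dtype=float)
    for _ in range(years):
        area = area @ P          # redistribute area along the transitions
    return area

area_2001 = np.array([10e3, 30e3, 40e3, 50e3])   # km^2, hypothetical
area_2100 = project(area_2001, P, 99)
```

Because each row of `P` sums to one, total area is conserved while developed land grows and natural classes decline, mirroring the qualitative behavior of a "business-as-usual" run.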

  9. Integrated management systems and workflow-based electronic document management: An empirical study

    Directory of Open Access Journals (Sweden)

    Hang Thu Pho

    2014-01-01

    Full Text Available Purpose: Many global organizations have aligned their strategy and operations via the ISO-based framework of the integrated management system (IMS), which allows them to merge quality, environment, health and safety management systems. In such a context, having a robust electronic document management system (EDMS) is essential, especially at global enterprises where a large amount of documents generated by processes flows through different work cultures. However, there is no "one-size-fits-all" design for an EDMS, because it depends on an organization's needs, size and resource allocation. This article discusses the interrelation between EDMS and IMS in order to suggest a best practice. Design/methodology/approach: Methodologically, this article is based upon a qualitative, interpretivist, longitudinal empirical study in a wind turbine factory. Findings and Originality/value: Work on IMS improvement and effectiveness has overlooked EDMS as a key factor in establishing appropriate technological support for IMS processes. Proper application of EDMS can further contribute to organizational learning, precision of documentation and cross-organisational collaboration. Research limitations/implications: Theorising on IMS needs a stronger perspective on the technological limitations and potentials of basing IMS on EDMS. Practical implications: IMS are complex systems involving a large number of administrative functions. EDMS provides a formal representation with automation potentials, both heightening and securing document trustworthiness. Social implications: IMS has a tendency to stay with professionals, e.g. line managers and QA/QC/QMS professionals. The EDMS line of discussion suggests a broader inclusion. Originality/value: Researching IMS as a technological implementation gives a better platform for aligning the IMS with other business processes and brings IMS closer to the operational activities within the enterprise.

  10. Future Scenarios of Land Change Based on Empirical Data and Demographic Trends

    Science.gov (United States)

    Sleeter, Benjamin M.; Wilson, Tamara S.; Sharygin, Ethan; Sherba, Jason T.

    2017-11-01

    Changes in land use and land cover (LULC) have important and fundamental interactions with the global climate system. Top-down global scale projections of land use change have been an important component of climate change research; however, their utility at local to regional scales is often limited. The goal of this study was to develop an approach for projecting changes in LULC based on land use histories and demographic trends. We developed a set of stochastic, empirical-based projections of LULC change for the state of California, for the period 2001-2100. Land use histories and demographic trends were used to project a "business-as-usual" (BAU) scenario and three population growth scenarios. For the BAU scenario, we projected developed lands would more than double by 2100. When combined with cultivated areas, we projected a 28% increase in anthropogenic land use by 2100. As a result, natural lands were projected to decline at a rate of 139 km2 yr-1; grasslands experienced the largest net decline, followed by shrublands and forests. The amount of cultivated land was projected to decline by approximately 10%; however, the relatively modest change masked large shifts between annual and perennial crop types. Under the three population scenarios, developed lands were projected to increase 40-90% by 2100. Our results suggest that when compared to the BAU projection, scenarios based on demographic trends may underestimate future changes in LULC. Furthermore, regardless of scenario, the spatial pattern of LULC change was likely to have the greatest negative impacts on rangeland ecosystems.

  11. Dispelling myths about dissociative identity disorder treatment: an empirically based approach.

    Science.gov (United States)

    Brand, Bethany L; Loewenstein, Richard J; Spiegel, David

    2014-01-01

    Some claim that treatment for dissociative identity disorder (DID) is harmful. Others maintain that the available data support the view that psychotherapy is helpful. We review the empirical support for both arguments. Current evidence supports the conclusion that phasic treatment consistent with expert consensus guidelines is associated with improvements in a wide range of DID patients' symptoms and functioning, decreased rates of hospitalization, and reduced costs of treatment. Research indicates that poor outcome is associated with treatment that does not specifically involve direct engagement with DID self-states to repair identity fragmentation and to decrease dissociative amnesia. The evidence demonstrates that carefully staged trauma-focused psychotherapy for DID results in improvement, whereas dissociative symptoms persist when not specifically targeted in treatment. The claims that DID treatment is harmful are based on anecdotal cases, opinion pieces, reports of damage that are not substantiated in the scientific literature, misrepresentations of the data, and misunderstandings about DID treatment and the phenomenology of DID. Given the severe symptomatology and disability associated with DID, iatrogenic harm is far more likely to come from depriving DID patients of treatment that is consistent with expert consensus, treatment guidelines, and current research.

  12. Non-empirical exchange-correlation parameterizations based on exact conditions from correlated orbital theory.

    Science.gov (United States)

    Haiduke, Roberto Luiz A; Bartlett, Rodney J

    2018-05-14

    Some of the exact conditions provided by the correlated orbital theory are employed to propose new non-empirical parameterizations for exchange-correlation functionals from Density Functional Theory (DFT). This reparameterization process is based on range-separated functionals with 100% exact exchange for long-range interelectronic interactions. The functionals developed here, CAM-QTP-02 and LC-QTP, show mitigated self-interaction error, correctly predict vertical ionization potentials as the negative of eigenvalues for occupied orbitals, and provide nice excitation energies, even for challenging charge-transfer excited states. Moreover, some improvements are observed for reaction barrier heights with respect to the other functionals belonging to the quantum theory project (QTP) family. Finally, the most important achievement of these new functionals is an excellent description of vertical electron affinities (EAs) of atoms and molecules as the negative of appropriate virtual orbital eigenvalues. In this case, the mean absolute deviations for EAs in molecules are smaller than 0.10 eV, showing that physical interpretation can indeed be ascribed to some unoccupied orbitals from DFT.

  13. Service Quality of Online Shopping Platforms: A Case-Based Empirical and Analytical Study

    Directory of Open Access Journals (Sweden)

    Tsan-Ming Choi

    2013-01-01

    Full Text Available Customer service is crucially important for online shopping platforms (OSPs) such as eBay and Taobao. Based on well-established service quality instruments and the scenario of a specific case on Taobao, this paper explores the service quality of an OSP with the aim of revealing customer perceptions of the service quality associated with the provided functions and investigating their impacts on customer loyalty. Through an empirical study, this paper finds that the "fulfillment and responsiveness" function is significantly related to customer loyalty. A further analytical study reveals that the optimal service level on the "fulfillment and responsiveness" function for the risk-averse OSP uniquely exists. Moreover, the analytical results prove that (i) if customer loyalty is more positively correlated to the service level, it will lead to a larger optimal service level, and (ii) the optimal service level is independent of the profit target, the source of uncertainty, and the risk preference of the OSP.

  14. Non-empirical exchange-correlation parameterizations based on exact conditions from correlated orbital theory

    Science.gov (United States)

    Haiduke, Roberto Luiz A.; Bartlett, Rodney J.

    2018-05-01

    Some of the exact conditions provided by the correlated orbital theory are employed to propose new non-empirical parameterizations for exchange-correlation functionals from Density Functional Theory (DFT). This reparameterization process is based on range-separated functionals with 100% exact exchange for long-range interelectronic interactions. The functionals developed here, CAM-QTP-02 and LC-QTP, show mitigated self-interaction error, correctly predict vertical ionization potentials as the negative of eigenvalues for occupied orbitals, and provide nice excitation energies, even for challenging charge-transfer excited states. Moreover, some improvements are observed for reaction barrier heights with respect to the other functionals belonging to the quantum theory project (QTP) family. Finally, the most important achievement of these new functionals is an excellent description of vertical electron affinities (EAs) of atoms and molecules as the negative of appropriate virtual orbital eigenvalues. In this case, the mean absolute deviations for EAs in molecules are smaller than 0.10 eV, showing that physical interpretation can indeed be ascribed to some unoccupied orbitals from DFT.

  15. Multivariate Empirical Mode Decomposition Based Signal Analysis and Efficient-Storage in Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Lu [University of Tennessee, Knoxville (UTK); Albright, Austin P [ORNL; Rahimpour, Alireza [University of Tennessee, Knoxville (UTK); Guo, Jiandong [University of Tennessee, Knoxville (UTK); Qi, Hairong [University of Tennessee, Knoxville (UTK); Liu, Yilu [University of Tennessee (UTK) and Oak Ridge National Laboratory (ORNL)

    2017-01-01

    Wide-area measurement systems (WAMSs) are used in smart grid systems to enable efficient monitoring of grid dynamics. However, the overwhelming amount of data and severe contamination from noise often impede effective and efficient analysis and storage of WAMS-generated measurements. To solve this problem, we propose a novel framework, dubbed MEMD-based Signal Analysis (MSA), that takes advantage of Multivariate Empirical Mode Decomposition (MEMD), a fully data-driven approach to analyzing non-stationary signals. The frequency measurements are considered as a linear superposition of different oscillatory components and noise. The low-frequency components, corresponding to the long-term trend and inter-area oscillations, are grouped and compressed by MSA using the mean shift clustering algorithm. Higher-frequency components, mostly noise and potentially part of high-frequency inter-area oscillations, are analyzed using Hilbert spectral analysis and delineated by statistical behavior. By conducting experiments on both synthetic and real-world data, we show that the proposed framework can capture signal characteristics, such as trends and inter-area oscillations, while reducing data storage requirements.
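
The grouping step described above can be illustrated with a small sketch: given oscillatory components produced by an EMD-style decomposition (the decomposition itself is not implemented here), separate them into low- and high-frequency groups using a crude frequency proxy. The zero-crossing threshold below is an assumed illustration, not the MSA criterion.

```python
import numpy as np

def zero_crossing_rate(x):
    """Rough per-sample frequency proxy: fraction of sign changes."""
    return np.mean(np.abs(np.diff(np.sign(x))) > 0)

def group_components(imfs, threshold=0.1):
    """Split components into low- and high-frequency groups.

    `imfs` is an (n_components, n_samples) array, e.g. the output of an
    EMD/MEMD routine. The threshold on the zero-crossing rate is an
    illustrative choice, not the paper's grouping rule.
    """
    low, high = [], []
    for imf in imfs:
        (low if zero_crossing_rate(imf) < threshold else high).append(imf)
    return low, high

# Synthetic stand-ins for two components: a slow trend-like oscillation
# and a fast, noise-like oscillation.
t = np.linspace(0, 1, 1000)
slow = np.sin(2 * np.pi * 1 * t)        # ~2 zero crossings over the record
fast = np.sin(2 * np.pi * 100 * t)      # ~200 zero crossings
low, high = group_components(np.array([slow, fast]))
```

In the full framework the low group would then be compressed (after clustering), while the high group would go to Hilbert spectral analysis.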

  16. Bulgarian ethnos according to A.Kh. Khalikov’s works: scientific concept and its theoretical bases

    Directory of Open Access Journals (Sweden)

    Izmaylov Iskander L.

    2017-02-01

    Full Text Available The article is devoted to the problems of Bulgar and Tatar ethnogenesis studied in the works of the prominent Kazan archaeologist A.Kh. Khalikov. His concept was based on the fact that a number of ethnic groups (Turkic, Finno-Ugric, and East Slavic participated in the formation of these peoples and that the key role in these processes was played by their mutual cultural influence. The concept of ethnogenesis and ethnic history of the Tatar people offered by A.Kh. Khalikov was a serious theoretical breakthrough against the background of both ideology-biased historical schemes of the Soviet era and the various nationalist ideas, differing from them by a comprehensive, integral scientific analysis of predominantly archaeological data. At present, however, when theoretical and factual bases of historical and ethnological research have considerably expanded, a number of conflicting issues have arisen in the framework of this concept, which, therefore, require new approaches to their solution.

  17. Pseudopotential-based electron quantum transport: Theoretical formulation and application to nanometer-scale silicon nanowire transistors

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Jingtian, E-mail: jingtian.fang@utdallas.edu; Vandenberghe, William G.; Fu, Bo; Fischetti, Massimo V. [Department of Materials Science and Engineering, The University of Texas at Dallas, Richardson, Texas 75080 (United States)

    2016-01-21

    We present a formalism to treat quantum electronic transport at the nanometer scale based on empirical pseudopotentials. This formalism offers explicit atomistic wavefunctions and an accurate band structure, enabling a detailed study of the characteristics of devices with a nanometer-scale channel and body. Assuming externally applied potentials that change slowly along the electron-transport direction, we invoke the envelope-wavefunction approximation to apply the open boundary conditions and to develop the transport equations. We construct the full-band open boundary conditions (self-energies of device contacts) from the complex band structure of the contacts. We solve the transport equations and present the expressions required to calculate the device characteristics, such as device current and charge density. We apply this formalism to study ballistic transport in a gate-all-around (GAA) silicon nanowire field-effect transistor with a body-size of 0.39 nm, a gate length of 6.52 nm, and an effective oxide thickness of 0.43 nm. Simulation results show that this device exhibits a subthreshold slope (SS) of ∼66 mV/decade and a drain-induced barrier-lowering of ∼2.5 mV/V. Our theoretical calculations predict that low-dimensionality channels in a 3D GAA architecture are able to meet the performance requirements of future devices in terms of SS swing and electrostatic control.

  18. Biomarker-based strategy for early discontinuation of empirical antifungal treatment in critically ill patients: a randomized controlled trial.

    Science.gov (United States)

    Rouzé, Anahita; Loridant, Séverine; Poissy, Julien; Dervaux, Benoit; Sendid, Boualem; Cornu, Marjorie; Nseir, Saad

    2017-11-01

    The aim of this study was to determine the impact of a biomarker-based strategy on early discontinuation of empirical antifungal treatment. Prospective randomized controlled single-center unblinded study, performed in a mixed ICU. A total of 110 patients were randomly assigned to a strategy in which empirical antifungal treatment duration was determined by (1,3)-β-D-glucan, mannan, and anti-mannan serum assays, performed on day 0 and day 4, or to a routine care strategy, based on international guidelines, which recommend 14 days of treatment. In the biomarker group, an early stop recommendation was determined using an algorithm based on the results of the biomarkers. The primary outcome was the percentage of survivors discontinuing empirical antifungal treatment early, defined as discontinuation strictly before day 7. A total of 109 patients were analyzed (one patient withdrew consent). Empirical antifungal treatment was discontinued early in 29 out of 54 patients in the biomarker strategy group, compared with one patient out of 55 in the routine strategy group (54% vs 2%). The total duration of empirical antifungal treatment was also shorter with the biomarker strategy than with the routine strategy [median (IQR) 6 (4-13) vs 13 (12-14) days]. In summary, the biomarker-based strategy increased the percentage of early discontinuation of empirical antifungal treatment among critically ill patients with suspected invasive Candida infection. These results confirm previous findings suggesting that early discontinuation of empirical antifungal treatment has no negative impact on outcome. However, further studies are needed to confirm the safety of this strategy. This trial was registered at ClinicalTrials.gov, NCT02154178.
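
The stop recommendation could be sketched as a simple conjunction of assay results. The rule below (recommend early discontinuation only when all three serum assays are negative on both sampling days) is a simplified illustration, not the trial's actual algorithm.

```python
def early_stop_recommended(day0, day4):
    """Return True when all biomarker assays argue against invasive
    candidiasis, so empirical antifungals may be stopped early.

    Each argument is a dict of boolean results for the three serum
    assays. Stopping only when every assay is negative on both days is
    a deliberately simplified stand-in for the trial's algorithm.
    """
    assays = ("beta_d_glucan", "mannan", "anti_mannan")
    return all(not day[a] for day in (day0, day4) for a in assays)

# Hypothetical assay panels for two patients.
negative = {"beta_d_glucan": False, "mannan": False, "anti_mannan": False}
positive = dict(negative, beta_d_glucan=True)
```

A patient with all-negative panels on day 0 and day 4 would get a stop recommendation; any positive assay keeps treatment running under this sketch.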

  19. Shape of the self-concept clarity change during group psychotherapy predicts the outcome: an empirical validation of the theoretical model of the self-concept change

    Science.gov (United States)

    Styła, Rafał

    2015-01-01

    Background: Self-Concept Clarity (SCC) describes the extent to which the schemas of the self are internally integrated, well defined, and temporally stable. This article presents a theoretical model that describes how different shapes of SCC change (especially stable increase and “V” shape) observed in the course of psychotherapy are related to the therapy outcome. Linking the concept of Jean Piaget and the dynamic systems theory, the study postulates that a stable SCC increase is needed for the participants with a rather healthy personality structure, while SCC change characterized by a “V” shape or fluctuations is optimal for more disturbed patients. Method: Correlational study in a naturalistic setting with repeated measurements (M = 5.8) was conducted on the sample of 85 patients diagnosed with neurosis and personality disorders receiving intensive eclectic group psychotherapy under routine inpatient conditions. Participants filled in the Self-Concept Clarity Scale (SCCS), Symptoms' Questionnaire KS-II, and Neurotic Personality Questionnaire KON-2006 at the beginning and at the end of the course of psychotherapy. The SCCS was also administered every 2 weeks during psychotherapy. Results: As hypothesized, among the relatively healthiest group of patients the stable SCC increase was related to positive treatment outcome, while more disturbed patients benefited from the fluctuations and “V” shape of SCC change. Conclusions: The findings support the idea that for different personality dispositions either a monotonic increase or transient destabilization of SCC is a sign of a good treatment prognosis. PMID:26579001
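
The trajectory shapes in the model can be labeled mechanically from the repeated SCCS scores. The thresholds, tolerance, and labels below are illustrative assumptions, not the study's operationalization of trajectory shape.

```python
def classify_scc_shape(scores, tol=1.0):
    """Crudely label a self-concept clarity trajectory.

    `scores` are repeated SCCS measurements over the course of therapy.
    The rules are a sketch: a near-monotonic rise ending above the
    starting level counts as a stable increase; an interior minimum
    followed by recovery counts as a "V" shape.
    """
    lowest = min(range(len(scores)), key=scores.__getitem__)
    rises = all(b >= a - tol for a, b in zip(scores, scores[1:]))
    if rises and scores[-1] > scores[0]:
        return "stable increase"
    if 0 < lowest < len(scores) - 1 and scores[-1] > scores[lowest] + tol:
        return "V shape"
    return "other"
```

Under the paper's model, the first label would be prognostically favorable for relatively healthy patients, the second for more disturbed patients.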

  20. Theoretical bases and possibilities of program BRASIER for experimental data fitting and management

    International Nuclear Information System (INIS)

    Quintero, B.; Santos, J.; Garcia Yip, F.; Lopez, I.

    1992-01-01

    In the paper, the theoretical bases and primary capabilities of the program BRASIER, developed for the management and fitting of experimental data, are described. Relevant characteristics are: the use of several regression methods, treatment of errors, the "Point-Drop Technique", multidimensional fitting, friendly interactivity, graphical capabilities and file management. The use of various regression methods gives a greater possibility of convergence than other similar programs that use a single algorithm

  1. On the relationship between fiscal plans in the European Union: An empirical analysis based on real-time data

    NARCIS (Netherlands)

    Giuliodori, M.; Beetsma, R.M.W.J.

    2007-01-01

    We investigate the interdependence of fiscal policies, and in particular deficits, in the European Union using an empirical analysis based on real-time fiscal data. There are many potential reasons why fiscal policies could be interdependent, such as direct externalities due to cross-border public

  2. On the relationship between fiscal plans in the European Union: an empirical analysis based on real-time data

    NARCIS (Netherlands)

    Giuliodori, M.; Beetsma, R.

    2008-01-01

    We investigate the interdependence of fiscal policies, and in particular deficits, in the European Union using an empirical analysis based on real-time fiscal data. There are many potential reasons why fiscal policies could be interdependent, such as direct externalities due to cross-border public

  3. Empirically Based Phenotypic Profiles of Children with Pervasive Developmental Disorders : Interpretation in the Light of the DSM-5

    NARCIS (Netherlands)

    Greaves-Lord, Kirstin; Eussen, Mart L. J. M.; Verhulst, Frank C.; Minderaa, Ruud B.; Mandy, William; Hudziak, James J.; Steenhuis, Mark Peter; de Nijs, Pieter F.; Hartman, Catharina A.

    This study aimed to contribute to the Diagnostic and Statistical Manual (DSM) debates on the conceptualization of autism by investigating (1) whether empirically based distinct phenotypic profiles could be distinguished within a sample of mainly cognitively able children with pervasive developmental

  4. An Empirical Introduction to the Concept of Chemical Element Based on Van Hiele's Theory of Level Transitions

    Science.gov (United States)

    Vogelezang, Michiel; Van Berkel, Berry; Verdonk, Adri

    2015-01-01

    Between 1970 and 1990, the Dutch working group "Empirical Introduction to Chemistry" developed a secondary school chemistry education curriculum based on the educational vision of the mathematicians van Hiele and van Hiele-Geldof. This approach viewed learning as a process in which students must go through discontinuous level transitions…

  5. Empirically Based Phenotypic Profiles of Children with Pervasive Developmental Disorders: Interpretation in the Light of the DSM-5

    Science.gov (United States)

    Greaves-Lord, Kirstin; Eussen, Mart L. J. M.; Verhulst, Frank C.; Minderaa, Ruud B.; Mandy, William; Hudziak, James J.; Steenhuis, Mark Peter; de Nijs, Pieter F.; Hartman, Catharina A.

    2013-01-01

    This study aimed to contribute to the Diagnostic and Statistical Manual (DSM) debates on the conceptualization of autism by investigating (1) whether empirically based distinct phenotypic profiles could be distinguished within a sample of mainly cognitively able children with pervasive developmental disorder (PDD), and (2) how profiles related to…

  6. A novel signal compression method based on optimal ensemble empirical mode decomposition for bearing vibration signals

    Science.gov (United States)

    Guo, Wei; Tse, Peter W.

    2013-01-01

    Today, remote machine condition monitoring is popular due to the continuous advancement in wireless communication. The bearing is the most frequently and most easily failed component in many rotating machines. To accurately identify the type of bearing fault, large amounts of vibration data need to be collected. However, the volume of transmitted data cannot be too high because the bandwidth of wireless communication is limited. To solve this problem, the data are usually compressed before transmission to a remote maintenance center. This paper proposes a novel signal compression method that can substantially reduce the amount of data that need to be transmitted without sacrificing the accuracy of fault identification. The proposed signal compression method is based on ensemble empirical mode decomposition (EEMD), which is an effective method for adaptively decomposing the vibration signal into different bands of signal components, termed intrinsic mode functions (IMFs). An optimization method was designed to automatically select appropriate EEMD parameters for the analyzed signal, in particular the appropriate level of the added white noise in the EEMD method. An index termed the relative root-mean-square error was used to evaluate the decomposition performance under different noise levels to find the optimal level. After applying the optimal EEMD method to a vibration signal, the IMF relating to the bearing fault can be extracted from the original vibration signal. Compressing this signal component leaves a much smaller proportion of data samples to be retained for transmission and later reconstruction. The proposed compression method was also compared with the popular wavelet compression method.
Experimental results demonstrate that the optimization of EEMD parameters can automatically find appropriate EEMD parameters for the analyzed signals, and the IMF-based compression method provides a higher compression ratio, while retaining the bearing defect
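
The relative root-mean-square error index mentioned above can be written down directly. This is a generic definition that may differ in detail from the paper's formulation; the signals in the usage example are synthetic stand-ins.

```python
import numpy as np

def relative_rmse(original, reconstructed):
    """Relative RMSE between a signal and its reconstruction.

    An index of this form can be used to score EEMD decompositions
    obtained under different added-noise levels: lower values indicate
    that the retained components reconstruct the signal more faithfully.
    """
    original = np.asarray(original, dtype=float)
    reconstructed = np.asarray(reconstructed, dtype=float)
    err = np.sqrt(np.mean((original - reconstructed) ** 2))
    return err / np.sqrt(np.mean(original ** 2))

# Sanity checks on synthetic data: a perfect reconstruction scores 0,
# an all-zero reconstruction scores 1.
x = np.ones(8)
perfect = relative_rmse(x, x)
worst = relative_rmse(x, np.zeros(8))
```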

  7. A quantum theoretical study of reactions of methyldiazonium ion with DNA base pairs

    International Nuclear Information System (INIS)

    Shukla, P.K.; Ganapathy, Vinay; Mishra, P.C.

    2011-01-01

    Graphical abstract: Reactions of the methyldiazonium ion at the different sites of the DNA bases in the Watson-Crick GC and AT base pairs were investigated employing density functional and second-order Moller-Plesset (MP2) perturbation theories. Highlights: → Methylation of the DNA bases is important as it can cause mutation and cancer. → Methylation reactions of the GC and AT base pairs with CH3N2+ were not studied earlier theoretically. → Experimental observations have been explained using theoretical methods. - Abstract: Methylation of the DNA bases in the Watson-Crick GC and AT base pairs by the methyldiazonium ion was investigated employing density functional and second-order Moller-Plesset (MP2) perturbation theories. Methylation at the N3, N7 and O6 sites of guanine, the N1, N3 and N7 sites of adenine, the O2 and N3 sites of cytosine and the O2 and O4 sites of thymine was considered. The computed reactivities for methylation follow the order N7(guanine) > N3(adenine) > O6(guanine), which is in agreement with experiment. The base pairing in DNA is found to play a significant role with regard to the reactivities of the different sites.

  8. Shape of the self-concept clarity change during group psychotherapy predicts the outcome: An empirical validation of the theoretical model of the self-concept change

    Directory of Open Access Journals (Sweden)

    Rafał eStyła

    2015-10-01

    Full Text Available Background: Self-concept clarity describes the extent to which the schemas of the self are internally integrated, well defined, and temporally stable. This article presents a theoretical model that describes how different shapes of self-concept clarity change (especially a stable increase and a "V" shape) observed in the course of psychotherapy are related to the therapy outcome. Linking the concept of Jean Piaget and dynamic systems theory, the study postulates that a stable self-concept clarity increase is needed for participants with a rather healthy personality structure, while self-concept clarity change characterized by a "V" shape or fluctuations is optimal for more disturbed patients. Method: A correlational study in a naturalistic setting with repeated measurements (M = 5.8) was conducted on a sample of 85 patients diagnosed with neurosis and personality disorders receiving intensive eclectic group psychotherapy under routine inpatient conditions. Participants filled in the Self-Concept Clarity Scale, Symptoms’ Questionnaire KS-II, and Neurotic Personality Questionnaire KON-2006 at the beginning and at the end of the course of psychotherapy. The Self-Concept Clarity Scale was also administered every two weeks during psychotherapy. Results: As hypothesized, among the relatively healthiest group of patients the stable self-concept clarity increase was related to positive treatment outcome, while more disturbed patients benefited from fluctuations and a "V" shape of self-concept clarity change. Conclusions: The findings support the idea that for different personality dispositions either a monotonic increase or a transient destabilization of self-concept clarity is a sign of a good treatment prognosis.

  9. Theoretical thermal dosimetry produced by an annular phased array system in CT-based patient models

    International Nuclear Information System (INIS)

    Paulsen, K.D.; Strohbehn, J.W.; Lynch, D.R.

    1984-01-01

    Theoretical calculations for the specific absorption rate (SAR) and the resulting temperature distributions produced by an annular phased array (APA) type system are made. The finite element numerical method is used in the formulation of both the electromagnetic (EM) and the thermal boundary value problems. A number of detailed patient models based on CT-scan data from the pelvic, visceral, and thoracic regions are generated to simulate a variety of tumor locations and surrounding normal tissues. The SAR values from the EM solution are input into the bioheat transfer equation, and steady-state temperature distributions are calculated for a wide variety of blood flow rates. Based on theoretical modeling, the APA shows no preferential heating of superficial over deep-seated tumors. However, in most cases satisfactory thermal profiles (therapeutic volume near 60%) are obtained in all three regions of the human trunk only for tumors with little or no blood flow. Unsatisfactory temperature patterns (therapeutic volume <50%) are found for tumors with moderate to high perfusion rates. These theoretical calculations should aid the clinician in evaluating the effectiveness of APA-type devices in heating tumors located in the trunk region
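
The coupling described above (SAR from the EM solution feeding the bioheat equation) can be sketched in one dimension: a steady-state Pennes bioheat equation on a 10 cm slab with fixed 37 °C boundaries, solved by finite differences rather than finite elements. All parameter values are rough, illustrative tissue numbers, not values from the CT-based patient models.

```python
import numpy as np

# 1-D steady-state Pennes bioheat equation solved by finite differences:
#   k*T'' + w_b*rho_b*c_b*(T_a - T) + rho*SAR = 0
k = 0.5                        # W/(m*K), tissue thermal conductivity
wb = 0.5e-3                    # 1/s, blood perfusion rate
rho_b, c_b = 1060.0, 3600.0    # blood density (kg/m^3), specific heat (J/(kg*K))
rho = 1050.0                   # tissue density, kg/m^3
Ta = 37.0                      # arterial temperature, degC
sar = 10.0                     # W/kg, uniform deposited power (from the EM solution)

n, L = 101, 0.10               # grid nodes, domain length (m)
dx = L / (n - 1)
A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = A[-1, -1] = 1.0      # Dirichlet boundaries at body-core temperature
b[0] = b[-1] = Ta
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = k / dx**2
    A[i, i] = -2.0 * k / dx**2 - wb * rho_b * c_b
    b[i] = -wb * rho_b * c_b * Ta - rho * sar
T = np.linalg.solve(A, b)      # steady-state temperature profile, degC
```

With these illustrative numbers the perfusion term caps the interior temperature rise at roughly `rho*sar / (wb*rho_b*c_b)` above arterial temperature, which is the same balance that makes well-perfused tumors hard to heat.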

  10. Knowledge-based immunosuppressive therapy for kidney transplant patients--from theoretical model to clinical integration.

    Science.gov (United States)

    Seeling, Walter; Plischke, Max; de Bruin, Jeroen S; Schuh, Christian

    2015-01-01

    Immunosuppressive therapy is a risky necessity after a patient receives a kidney transplant. To reduce the risks, a knowledge-based system was developed that determines the right dosage of the immunosuppressive agent Tacrolimus. A theoretical model to classify medication blood levels as well as medication adaptations was created using data from almost 500 patients and over 13,000 examinations. This model was then translated into an Arden Syntax knowledge base and integrated directly into the hospital information system of the Vienna General Hospital. In this paper we give an overview of the construction and integration of such a system.
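
A single rule from such a knowledge base might classify a measured trough level against a target range. The range and the advice strings below are illustrative placeholders, not the system's clinical rules (which are encoded in Arden Syntax, not Python).

```python
def classify_trough(level_ng_ml, target=(5.0, 10.0)):
    """Classify a Tacrolimus trough blood level against a target range.

    The (5, 10) ng/mL range and the adjustment advice are hypothetical
    placeholders for illustration; real dosing rules depend on time
    since transplant and many patient-specific factors.
    """
    low, high = target
    if level_ng_ml < low:
        return "below range: consider increasing dose"
    if level_ng_ml > high:
        return "above range: consider reducing dose"
    return "in range: keep dose"
```

A clinical rule engine would chain many such classifications with patient context before surfacing a recommendation to the physician.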

  11. Substituent effect on redox potential of nitrido technetium complexes with Schiff base ligand. Theoretical calculations

    International Nuclear Information System (INIS)

    Takayama, T.; Sekine, T.; Kudo, H.

    2003-01-01

    Theoretical calculations based on density functional theory (DFT) were performed to understand the effect of substituents on the molecular and electronic structures of technetium nitrido complexes with salen-type Schiff base ligands. The optimized structures of these complexes are square pyramidal. The electron density on the Tc atom of a complex with electron-withdrawing substituents is lower than that of a complex with electron-donating substituents. The HOMO energy is lower in the complex with electron-withdrawing substituents than in the complex with electron-donating substituents. The charge on the Tc atom is a good measure that reflects the redox potential of the [TcN(L)] complexes. (author)

  12. Theoretical model and optimization of a novel temperature sensor based on quartz tuning fork resonators

    International Nuclear Information System (INIS)

    Xu Jun; You Bo; Li Xin; Cui Juan

    2007-01-01

    To accurately measure temperatures, a novel temperature sensor based on a quartz tuning fork resonator has been designed. The principle of the quartz tuning fork temperature sensor is that the resonant frequency of the quartz resonator changes with temperature. A tuning fork resonator with a new doubly rotated cut, working in the flexural vibration mode, has been designed as the temperature sensor. The characteristics of the temperature sensor were evaluated and the results fully met the development targets. A theoretical model for temperature sensing has been developed. The sensor structure was analysed and optimized by the finite element method (FEM), including the tuning fork geometry, the tine electrode pattern, and the size of the sensor elements. The performance curve of output versus measured temperature is given. The results from theoretical analysis and experiments indicate that the sensor's sensitivity can reach 60 ppm °C⁻¹ over the measured temperature range of 0 to 100 °C.
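
The sensing principle described above (resonant frequency varying linearly with temperature) can be sketched as a simple forward/inverse model. This is a hypothetical illustration, not the paper's calibration: the nominal frequency `F0_HZ` and strict linearity at the reported 60 ppm °C⁻¹ sensitivity are assumptions.

```python
# Hypothetical sketch: converting a quartz tuning-fork frequency reading to
# temperature, assuming the linear sensitivity reported in the abstract
# (60 ppm/degC over 0-100 degC). The nominal frequency is an assumption.

F0_HZ = 32768.0          # assumed nominal resonant frequency at 0 degC
SENSITIVITY = 60e-6      # fractional frequency change per degC (60 ppm/degC)

def frequency_at(t_c: float) -> float:
    """Forward model: resonant frequency (Hz) at temperature t_c (degC)."""
    return F0_HZ * (1.0 + SENSITIVITY * t_c)

def temperature_from_frequency(f_hz: float) -> float:
    """Invert the linear model f = f0 * (1 + S * T) for T."""
    return (f_hz / F0_HZ - 1.0) / SENSITIVITY

# Round trip at 50 degC: a 50 degC change shifts the frequency by ~98 Hz.
print(temperature_from_frequency(frequency_at(50.0)))
```

In practice the frequency-temperature characteristic of a doubly rotated cut would be calibrated against reference temperatures rather than assumed perfectly linear.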

  13. Principles and software realization of a multimedia course on theoretical electrical engineering based on enterprise technology

    Directory of Open Access Journals (Sweden)

    Penev Krasimir

    2003-01-01

    Full Text Available The Department of Theoretical Electrical Engineering (TEE) of the Technical University of Sofia has been developing an interactive, enterprise-technology-based course on Theoretical Electrical Engineering. One side of the project is the development of multimedia teaching modules for the core undergraduate electrical engineering courses (Circuit Theory and Electromagnetic Fields), and the other is the development of the software architecture of the web site on which the modules are deployed. Initial efforts have been directed at the development of multimedia modules for the subject Electrical Circuits and at developing the web site structure. The objective is to develop teaching materials that will enhance lectures and laboratory exercises and will allow computerized examinations on the subject. This article outlines the framework used to develop the web site structure, the Circuit Theory teaching modules, and the strategy for their use as a teaching tool.

  14. An empirical model to predict road dust emissions based on pavement and traffic characteristics.

    Science.gov (United States)

    Padoan, Elio; Ajmone-Marsan, Franco; Querol, Xavier; Amato, Fulvio

    2018-06-01

    The relative impact of non-exhaust sources (i.e. road dust, tire wear, road wear and brake wear particles) on urban air quality is increasing. Among them, road dust resuspension generally has the highest impact on PM concentrations, but its spatio-temporal variability has been rarely studied and modeled. Some recent studies attempted to observe and describe the time-variability but, as it is driven by traffic and meteorology, uncertainty remains on the seasonality of emissions. The knowledge gap on spatial variability is much wider, as several factors have been pointed out as responsible for road dust build-up: pavement characteristics, traffic intensity and speed, fleet composition, proximity to traffic lights, but also the presence of external sources. However, no parameterization is available as a function of these variables. We investigated mobile road dust smaller than 10 μm (MF10) in two cities with different climatic and traffic conditions (Barcelona and Turin), to explore MF10 seasonal variability and the relationship between MF10 and site characteristics (pavement macrotexture, traffic intensity and proximity to braking zone). Moreover, we provide the first estimates of emission factors in the Po Valley both in summer and winter conditions. Our results showed a good inverse relationship between MF10 and macrotexture, traffic intensity and distance from the nearest braking zone. We also found a clear seasonal effect on road dust emissions, with higher emission in summer, likely due to the lower pavement moisture. These results allowed building a simple empirical model predicting maximal dust loadings and, consequently, emission potential, based on the aforementioned variables. This model will need to be scaled for meteorological effects, using methods accounting for weather and pavement moisture. This can significantly improve bottom-up emission inventories for spatial allocation of emissions and air quality management, to select those roads with higher emissions.
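
The kind of empirical parameterization described above (dust loading decreasing with macrotexture, traffic intensity, and distance to the braking zone) can be sketched as a log-linear regression. The coefficients and data below are synthetic assumptions for illustration only; they are not the paper's fitted parameterization.

```python
# Illustrative sketch: fitting a log-linear empirical model of road dust
# loading (MF10) to pavement macrotexture, traffic intensity, and distance
# to the nearest braking zone. All data and coefficients are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 200
macrotexture = rng.uniform(0.3, 1.2, n)    # mm, mean texture depth (assumed)
traffic = rng.uniform(1e3, 5e4, n)         # vehicles/day (assumed)
dist_braking = rng.uniform(5.0, 500.0, n)  # m to nearest braking zone (assumed)

# Synthetic "true" model: MF10 decreases with all three predictors.
log_mf10 = (3.0 - 1.2 * macrotexture - 0.3 * np.log(traffic)
            - 0.2 * np.log(dist_braking) + rng.normal(0.0, 0.1, n))

# Ordinary least squares on the log-transformed response.
X = np.column_stack([np.ones(n), macrotexture,
                     np.log(traffic), np.log(dist_braking)])
beta, *_ = np.linalg.lstsq(X, log_mf10, rcond=None)
print(beta)  # recovered coefficients, close to [3.0, -1.2, -0.3, -0.2]
```

A real fit would also need the seasonal (pavement moisture) scaling the authors mention before being usable in an emission inventory.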

  15. Diet index-based and empirically derived dietary patterns are associated with colorectal cancer risk.

    Science.gov (United States)

    Miller, Paige E; Lazarus, Philip; Lesko, Samuel M; Muscat, Joshua E; Harper, Gregory; Cross, Amanda J; Sinha, Rashmi; Ryczak, Karen; Escobar, Gladys; Mauger, David T; Hartman, Terryl J

    2010-07-01

    Previous studies have derived patterns by measuring compliance with preestablished dietary guidance or empirical methods, such as principal components analysis (PCA). Our objective was to examine colorectal cancer risk associated with patterns identified by both methods. The study included 431 incident colorectal cancer cases (225 men, 206 women) and 726 healthy controls (330 men, 396 women) participating in a population-based, case-control study. PCA identified sex-specific dietary patterns and the Healthy Eating Index-2005 (HEI-05) assessed adherence to the 2005 Dietary Guidelines for Americans. A fruits and vegetables pattern and a meat, potatoes, and refined grains pattern were identified among men and women; a third pattern (alcohol and sweetened beverages) was identified in men. The fruits and vegetables pattern was inversely associated with risk among men [odds ratio (OR) = 0.38, 95% CI = 0.21-0.69 for the highest compared with the lowest quartile] and women (OR = 0.35, 95% CI = 0.19-0.65). The meat, potatoes, and refined grains pattern was positively associated with risk in women (OR = 2.20, 95% CI = 1.08-4.50) and there was a suggestion of a positive association among men (OR = 1.56, 95% CI = 0.84-2.90; P-trend = 0.070). Men and women with greater HEI-05 scores had a significantly reduced risk of colorectal cancer (OR = 0.56, 95% CI = 0.31-0.99; OR = 0.44, 95% CI = 0.24-0.77, respectively). Following the Dietary Guidelines or a dietary pattern lower in meat, potatoes, high fat, and refined foods and higher in fruits and vegetables may reduce colorectal cancer risk.

  16. Assessing for suicidal behavior in youth using the Achenbach System of Empirically Based Assessment.

    Science.gov (United States)

    Van Meter, Anna R; Algorta, Guillermo Perez; Youngstrom, Eric A; Lechtman, Yana; Youngstrom, Jen K; Feeny, Norah C; Findling, Robert L

    2018-02-01

    This study investigated the clinical utility of the Achenbach System of Empirically Based Assessment (ASEBA) for identifying youth at risk for suicide. Specifically, we investigated how well the Total Problems scores and the sum of two suicide-related items (#18 "Deliberately harms self or attempts suicide" and #91 "Talks about killing self") were able to distinguish youth with a history of suicidal behavior. Youth (N = 1117) aged 5-18 were recruited for two studies of mental illness. History of suicidal behavior was assessed by semi-structured interviews (K-SADS) with youth and caregivers. Youth, caregivers, and a primary teacher each completed the appropriate form (YSR, CBCL, and TRF, respectively) of the ASEBA. Areas under the curve (AUCs) from ROC analyses and diagnostic likelihood ratios (DLRs) were used to measure the ability of both Total Problems T scores, as well as the summed score of two suicide-related items, to identify youth with a history of suicidal behavior. The Suicide Items from the CBCL and YSR performed well (AUCs = 0.85 and 0.70, respectively). The TRF Suicide Items did not perform better than chance, AUC = 0.45. The AUCs for the Total Problems scores were poor-to-fair (0.33-0.65). The CBCL Suicide Items outperformed all other scores (ps = 0.04 to …) as indicators of youth's risk for suicidal behavior. The low burden of this approach could facilitate widespread screening for suicide in an increasingly at-risk population.
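
The screening logic above (summing two 0-2 rated items and evaluating the sum against a binary outcome with ROC analysis) can be sketched with a rank-based AUC, which is equivalent to the Mann-Whitney U statistic. The item ratings and outcomes below are invented toy data, not the study's.

```python
# Toy sketch: sum two suicide-related item ratings into a 0-4 score and
# evaluate it against a binary history of suicidal behavior using a
# rank-based AUC with tie handling. All data here are invented.

def auc_rank(scores, labels):
    """AUC = P(score_pos > score_neg) + 0.5 * P(score_pos == score_neg)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

item_18 = [0, 1, 1, 2, 0, 1, 0, 2, 0, 0]   # "deliberately harms self..." (toy)
item_91 = [0, 1, 1, 2, 0, 2, 0, 1, 0, 0]   # "talks about killing self" (toy)
history = [0, 0, 1, 1, 0, 1, 0, 1, 0, 0]   # 1 = suicidal behavior on K-SADS

suicide_score = [a + b for a, b in zip(item_18, item_91)]
print(auc_rank(suicide_score, history))  # → 0.9791666666666666 on this toy data
```

An AUC near 1.0 here simply reflects how the toy data were constructed; the study's reported AUCs (0.85 for CBCL, 0.70 for YSR) come from the real samples.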

  17. Internalisation of an externality and profitability: Based on an empirical study in the food industry

    Directory of Open Access Journals (Sweden)

    Fumihiko Isada

    2014-09-01

    Full Text Available The objective of this research was to verify empirically that internalising the externalities of private enterprises contributes to profitability on a long-term basis. One of the significant transitions of the global economic environment at the beginning of this century has been the rapid economic growth of emerging countries. On the other hand, there is also apprehension about various social problems due to the rapid industrialisation of such emerging countries, or the improvement in living standards. The political economist Malthus (1798) warned of such a drain on resources and energy, and the ravaging of the natural environment, at the end of the eighteenth century. Present-day Malthusians warn of ruin because of the exhaustion of resources and environmental deterioration resulting from active business activity. This view calls for tighter regulation, as "a governmental role", in response to the so-called "market failure." Mill (1848) introduced the concept of a "stationary state" in his "philosophy of economics." On the other hand, Robert Solow (1956) highlighted the significance of technological progress as a factor in economic growth. In terms of the positive role of a private company, Porter et al. (2011) called this the strategy of shared-value creation, that is, the internalisation of externalities in market transactions. In terms of research methodology, statistical verification was carried out based on the sustainability statements, the various publicity materials, and the financial data released by each company. In conclusion, when private enterprises actively adopt sustainability activities, consumer value, corporate value, and social value expand cyclically.

  18. Empirical validation of a real options theory based method for optimizing evacuation decisions within chemical plants.

    Science.gov (United States)

    Reniers, G L L; Audenaert, A; Pauwels, N; Soudan, K

    2011-02-15

    This article empirically assesses and validates a methodology to make evacuation decisions in case of major fire accidents in chemical clusters. In this paper, a number of empirical results are presented, processed and discussed with respect to the implications and management of evacuation decisions in chemical companies. It has been shown in this article that in realistic industrial settings, suboptimal interventions may result in case the prospect to obtain additional information at later stages of the decision process is ignored. Empirical results also show that implications of interventions, as well as the required time and workforce to complete particular shutdown activities, may be very different from one company to another. Therefore, to be optimal from an economic viewpoint, it is essential that precautionary evacuation decisions are tailor-made per company. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation

    Science.gov (United States)

    Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, EfaMaydhona; Musa, Ardiansyah

    2018-03-01

    To support the continued development of wireless geolocation as a key future technology, this paper derives a theoretical bound, i.e., the Cramer-Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The derivation of the theoretical bound is crucial both to evaluate whether the energy-efficient RSS-based factor graph wireless geolocation technique is effective and to open opportunities for further innovation of the technique. The CRLB is derived using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation results show that the derived CRLB is the tightest bound, exhibiting the lowest root mean squared error (RMSE) curve compared to the RMSE curve of the RSS-based factor graph geolocation technique. Hence, the derived CRLB serves as the lower bound for the energy-efficient RSS-based factor graph wireless geolocation technique.
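
A CRLB for RSS-based positioning can be illustrated with the standard log-distance path-loss model with log-normal shadowing, building the FIM from the gradient (Jacobian) of the mean RSS with respect to the unknown position. This is a generic textbook-style sketch under assumed parameters, not the paper's factor-graph-specific derivation.

```python
# Hedged sketch: CRLB on 2-D localization RMSE under a log-distance
# path-loss model, RSS_i = P0 - 10*n*log10(d_i) + noise(sigma_dB).
# Anchor layout, path-loss exponent, and sigma are assumptions.
import numpy as np

def rss_crlb(target, anchors, n_pl=3.0, sigma_db=4.0):
    """Lower bound on position RMSE (same units as the coordinates)."""
    k = 10.0 * n_pl / np.log(10.0)     # dB change per unit log-distance
    J = np.zeros((2, 2))               # Fisher information matrix
    for a in anchors:
        diff = target - a
        d2 = diff @ diff
        g = -k * diff / d2             # gradient of mean RSS w.r.t. (x, y)
        J += np.outer(g, g) / sigma_db**2
    # CRLB: trace of the inverse FIM bounds the position error variance.
    return np.sqrt(np.trace(np.linalg.inv(J)))

anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
target = np.array([40.0, 55.0])
print(rss_crlb(target, anchors))  # RMSE bound in meters for this geometry
```

Note that the bound scales linearly with the shadowing standard deviation: doubling `sigma_db` doubles the achievable RMSE floor, which is one way such a curve is read against the estimator's simulated RMSE.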

  20. How can results from macro economic analyses of the energy consumption of households be used in macro models? A discussion of theoretical and empirical literature about aggregation

    International Nuclear Information System (INIS)

    Halvorsen, Bente; Larsen, Bodil M.; Nesbakken, Runa

    2001-01-01

    The literature on energy demand shows that there are systematic differences in income and price elasticities between analyses based on macro data and on micro data. Even if one estimates models with the same explanatory variables, the results may differ with respect to estimated price and income sensitivity. These differences may be caused by problems involved in transferring micro properties to macro properties, or because the estimated macro relationships fail to adequately account for the fact that households behave differently in their energy demand. Political goals are often directed towards the entire household sector. Partial equilibrium models do not capture important equilibrium effects and feedback through the energy markets and the economy in general. Thus, it is of great political and scientific interest to carry out macroeconomic model analyses of different policy measures that affect energy consumption. The results of behavioural analyses, in which the heterogeneity of energy demand is investigated, must be based on information about individual households. When demand is studied based on micro data, it is difficult to aggregate its properties to a total demand function for the entire household sector if different groups of households behave differently. Such heterogeneity of behaviour may, for instance, arise when households in different regions have different heating equipment because of regional differences in the price of electricity. The question of aggregation arises immediately when one wants to draw conclusions about the household sector based on information about individual households, whether the discussion concerns the whole population or a selection of households. Thus, aggregation is a topic of interest in a wide range of problems.

  1. Sci—Thur AM: YIS - 09: Validation of a General Empirically-Based Beam Model for kV X-ray Sources

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, Y. [CancerCare Manitoba (Canada); University of Calgary (Canada); Sommerville, M.; Johnstone, C.D. [San Diego State University (United States); Gräfe, J.; Nygren, I.; Jacso, F. [Tom Baker Cancer Centre (Canada); Khan, R.; Villareal-Barajas, J.E. [University of Calgary (Canada); Tom Baker Cancer Centre (Canada); Tambasco, M. [University of Calgary (Canada); San Diego State University (United States)

    2014-08-15

    Purpose: To present an empirically-based beam model for computing dose deposited by kilovoltage (kV) x-rays and validate it for radiographic, CT, CBCT, superficial, and orthovoltage kV sources. Method and Materials: We modeled a wide variety of imaging (radiographic, CT, CBCT) and therapeutic (superficial, orthovoltage) kV x-ray sources. The model characterizes spatial variations of the fluence and spectrum independently. The spectrum is derived by matching measured values of the half value layer (HVL) and nominal peak potential (kVp) to computationally-derived spectra while the fluence is derived from in-air relative dose measurements. This model relies only on empirical values and requires no knowledge of proprietary source specifications or other theoretical aspects of the kV x-ray source. To validate the model, we compared measured doses to values computed using our previously validated in-house kV dose computation software, kVDoseCalc. The dose was measured in homogeneous and anthropomorphic phantoms using ionization chambers and LiF thermoluminescent detectors (TLDs), respectively. Results: The maximum difference between measured and computed dose measurements was within 2.6%, 3.6%, 2.0%, 4.8%, and 4.0% for the modeled radiographic, CT, CBCT, superficial, and the orthovoltage sources, respectively. In the anthropomorphic phantom, the computed CBCT dose generally agreed with TLD measurements, with an average difference and standard deviation ranging from 2.4 ± 6.0% to 5.7 ± 10.3% depending on the imaging technique. Most (42/62) measured TLD doses were within 10% of computed values. Conclusions: The proposed model can be used to accurately characterize a wide variety of kV x-ray sources using only empirical values.
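
The spectrum-matching step above rests on computing the half value layer (HVL) of a candidate spectrum: the aluminium thickness that halves the beam's transmission. The sketch below shows that computation for an invented two-bin "spectrum"; the bin weights and attenuation coefficients are placeholders, not real NIST data or the authors' model.

```python
# Illustrative sketch of the HVL idea behind spectrum matching: find the
# Al thickness at which a polyenergetic beam's transmission drops to 0.5.
# The two-bin spectrum and mu values are invented placeholders.
import numpy as np

def transmission(t_mm, weights, mu_per_mm):
    """Fraction of the (fluence-weighted) beam surviving t_mm of Al."""
    return np.sum(weights * np.exp(-mu_per_mm * t_mm)) / np.sum(weights)

def hvl(weights, mu_per_mm, lo=0.0, hi=50.0, iters=60):
    """Bisection for the thickness where transmission equals 0.5."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if transmission(mid, weights, mu_per_mm) > 0.5:
            lo = mid   # still too transparent: need more filtration
        else:
            hi = mid
    return 0.5 * (lo + hi)

weights = np.array([0.4, 0.6])   # relative fluence in two energy bins (assumed)
mu = np.array([0.15, 0.05])      # assumed linear attenuation, mm^-1 of Al
print(hvl(weights, mu))          # HVL in mm Al for this toy spectrum
```

A beam model of the kind described would repeat this for many computationally derived spectra and select the one whose HVL and kVp match the measurements.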

  2. A short review of theoretical and empirical models for characterization of optical materials doped with the transition metal and rare earth ions

    Science.gov (United States)

    Su, P.; Ma, C.-G.; Brik, M. G.; Srivastava, A. M.

    2018-05-01

    In this paper, a brief retrospective review of the main developments in crystal field theory is provided. We have examined how different crystal field models are applied to solve the problems that arise in the spectroscopy of optically active ions. Attention is focused on the joint application of crystal field and density functional theory (DFT) based models, which takes advantages of strong features of both individual approaches and allows for obtaining a complementary picture of the electronic properties of a doped crystal with impurity energy levels superimposed onto the host band structure.

  3. Meta-Theoretical Contributions to the Constitution of a Model-Based Didactics of Science

    Science.gov (United States)

    Ariza, Yefrin; Lorenzano, Pablo; Adúriz-Bravo, Agustín

    2016-10-01

    There is nowadays consensus in the community of didactics of science (i.e. science education understood as an academic discipline) regarding the need to include the philosophy of science in didactical research, science teacher education, curriculum design, and the practice of science education in all educational levels. Some authors have identified an ever-increasing use of the concept of `theoretical model', stemming from the so-called semantic view of scientific theories. However, it can be recognised that, in didactics of science, there are over-simplified transpositions of the idea of model (and of other meta-theoretical ideas). In this sense, contemporary philosophy of science is often blurred or distorted in the science education literature. In this paper, we address the discussion around some meta-theoretical concepts that are introduced into didactics of science due to their perceived educational value. We argue for the existence of a `semantic family', and we characterise four different versions of semantic views existing within the family. In particular, we seek to contribute to establishing a model-based didactics of science mainly supported in this semantic family.

  4. Comparison of two interpolation methods for empirical mode decomposition based evaluation of radiographic femur bone images.

    Science.gov (United States)

    Udhayakumar, Ganesan; Sujatha, Chinnaswamy Manoharan; Ramakrishnan, Swaminathan

    2013-01-01

    Analysis of bone strength in radiographic images is an important component of the estimation of bone quality in diseases such as osteoporosis. Conventional radiographic femur bone images are used to analyse bone architecture using the bi-dimensional empirical mode decomposition method. Surface interpolation of the local maxima and minima points of an image is a crucial part of bi-dimensional empirical mode decomposition, and the choice of an appropriate interpolation depends on the specific structure of the problem. In this work, two interpolation methods for bi-dimensional empirical mode decomposition are analysed to characterize the trabecular femur bone architecture of radiographic images. The trabecular bone regions of normal and osteoporotic femur bone images (N = 40) recorded under standard conditions are used for this study. The compressive and tensile strength regions of the images are delineated using pre-processing procedures. The delineated images are decomposed into their corresponding intrinsic mode functions using interpolation methods such as radial basis function (multiquadric) and hierarchical B-spline techniques. Results show that bi-dimensional empirical mode decomposition analyses using both interpolations are able to represent architectural variations of femur bone radiographic images. As the strength of the bone depends on architectural variation in addition to bone mass, this study appears to be clinically useful.
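
The envelope step of bi-dimensional EMD interpolates a smooth surface through scattered extrema; the multiquadric radial basis function variant mentioned above can be sketched in a few lines of numpy. The extrema locations, intensities, and the shape parameter `eps` below are toy assumptions.

```python
# Minimal numpy sketch of multiquadric RBF surface interpolation, as used
# for the upper/lower envelopes in bi-dimensional EMD. Toy data only.
import numpy as np

def multiquadric_fit(points, values, eps=1.0):
    """Solve for RBF weights w, with kernel phi(r) = sqrt(r^2 + eps^2)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    phi = np.sqrt(d**2 + eps**2)
    return np.linalg.solve(phi, values)

def multiquadric_eval(points, w, query, eps=1.0):
    """Evaluate the fitted RBF surface at the query locations."""
    d = np.linalg.norm(query[:, None, :] - points[None, :, :], axis=-1)
    return np.sqrt(d**2 + eps**2) @ w

# Toy "local maxima" of an image patch: (row, col) -> intensity
pts = np.array([[0.0, 0.0], [0.0, 4.0], [4.0, 0.0], [4.0, 4.0], [2.0, 2.0]])
vals = np.array([10.0, 12.0, 11.0, 13.0, 20.0])

w = multiquadric_fit(pts, vals)
print(multiquadric_eval(pts, w, pts))  # exactly reproduces vals at the knots
```

In the actual decomposition this surface would be evaluated on the full pixel grid for both maxima and minima, and the mean envelope subtracted during each sifting iteration.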

  5. Randomized Trial of ConquerFear: A Novel, Theoretically Based Psychosocial Intervention for Fear of Cancer Recurrence.

    Science.gov (United States)

    Butow, Phyllis N; Turner, Jane; Gilchrist, Jemma; Sharpe, Louise; Smith, Allan Ben; Fardell, Joanna E; Tesson, Stephanie; O'Connell, Rachel; Girgis, Afaf; Gebski, Val J; Asher, Rebecca; Mihalopoulos, Cathrine; Bell, Melanie L; Zola, Karina Grunewald; Beith, Jane; Thewes, Belinda

    2017-12-20

    Purpose Fear of cancer recurrence (FCR) is prevalent, distressing, and long lasting. This study evaluated the impact of a theoretically/empirically based intervention (ConquerFear) on FCR. Methods Eligible survivors had curable breast or colorectal cancer or melanoma, had completed treatment (not including endocrine therapy) 2 months to 5 years previously, were age > 18 years, and had scores above the clinical cutoff on the FCR Inventory (FCRI) severity subscale at screening. Participants were randomly assigned at a one-to-one ratio to either five face-to-face sessions of ConquerFear (attention training, metacognitions, acceptance/mindfulness, screening behavior, and values-based goal setting) or an attention control (Taking-it-Easy relaxation therapy). Participants completed questionnaires at baseline (T0), immediately post-therapy (T1), and 3 (T2) and 6 months (T3) later. The primary outcome was FCRI total score. Results Of 704 potentially eligible survivors from 17 sites and two online databases, 533 were contactable, of whom 222 (42%) consented; 121 were randomly assigned to intervention and 101 to control. Study arms were equivalent at baseline on all measured characteristics. ConquerFear participants had clinically and statistically greater improvements than control participants from T0 to T1 on FCRI total (P …; subscales including psychological distress and triggers) as well as in general anxiety, cancer-specific distress (total), and mental quality of life and metacognitions (total). Differences in FCRI psychological distress and cancer-specific distress (total) remained significantly different at T3. Conclusion This randomized trial demonstrated efficacy of ConquerFear compared with attention control (Taking-it-Easy) in reduction of FCRI total scores immediately post-therapy and 3 and 6 months later and in many secondary outcomes immediately post-therapy. Cancer-specific distress (total) remained more improved at 3- and 6-month follow-up.

  6. The Happy Culture: A Theoretical, Meta-Analytic, and Empirical Review of the Relationship Between Culture and Wealth and Subjective Well-Being.

    Science.gov (United States)

    Steel, Piers; Taras, Vasyl; Uggerslev, Krista; Bosco, Frank

    2018-05-01

    Do cultural values enhance financial and subjective well-being (SWB)? Taking a multidisciplinary approach, we meta-analytically reviewed the field, found it thinly covered, and focused on individualism. In counter, we collected a broad array of individual-level data, specifically an Internet sample of 8,438 adult respondents. Individual SWB was most strongly associated with cultural values that foster relationships and social capital, which typically accounted for more unique variance in life satisfaction than an individual's salary. At a national level, we used mean-based meta-analysis to construct a comprehensive cultural and SWB database. Results show some reversals from the individual level, particularly masculinity's facet of achievement orientation. In all, the happy nation has low power distance and low uncertainty avoidance, but is high in femininity and individualism, and these effects are interrelated but still partially independent from political and economic institutions. In short, culture matters for individual and national well-being.

  7. Consumers’ Acceptance and Use of Information and Communications Technology: A UTAUT and Flow Based Theoretical Model

    Directory of Open Access Journals (Sweden)

    Saleh Alwahaishi

    2013-03-01

    Full Text Available The world has changed a lot in the past years. The rapid advances in technology and the changing of communication channels have changed the way people work and, for many, where they work from. The Internet and mobile technology, the two most dynamic technological forces in modern information and communications technology (ICT), are converging into one ubiquitous mobile Internet service, which will change our way of both doing business and dealing with our daily routine activities. As the use of ICT expands globally, there is a need for further research into cultural aspects and implications of ICT. The acceptance of Information Technology (IT) has become a fundamental part of the research plan for most organizations (Igbaria, 1993). In IT research, numerous theories are used to understand users' adoption of new technologies. Various models have been developed, including the Technology Acceptance Model, Theory of Reasoned Action, Theory of Planned Behavior, and recently, the Unified Theory of Acceptance and Use of Technology. Each of these models has sought to identify the factors which influence a citizen's intention or actual use of information technology. Drawing on the UTAUT model and Flow Theory, this research composes a new hybrid theoretical framework to identify the factors affecting the acceptance and use of Mobile Internet, as an ICT application, in a consumer context. The proposed model incorporates eight constructs: Performance Expectancy, Effort Expectancy, Facilitating Conditions, Social Influences, Perceived Value, Perceived Playfulness, Attention Focus, and Behavioral Intention. Data collected online from 238 respondents in Saudi Arabia were tested against the research model, using the structural equation modeling approach. The proposed model was mostly supported by the empirical data. The findings of this study provide several crucial implications for ICT and, in particular, mobile Internet service practitioners and researchers.

  9. A theoretical and empirical study of the response of the high latitude thermosphere to the sense of the 'Y' component of the interplanetary magnetic field

    International Nuclear Information System (INIS)

    Rees, D.; Fuller-Rowell, T.J.; Gordon, R.

    1986-01-01

    The strength and direction of the Interplanetary Magnetic Field (IMF) control the transfer of solar wind momentum and energy to the high latitude thermosphere in a direct fashion. The sense of the 'Y' component of the IMF (BY) creates a significant asymmetry of the magnetospheric convection pattern as mapped onto the high latitude thermosphere and ionosphere. The resulting response of the polar thermospheric winds during periods when BY is either positive or negative is quite distinct, with pronounced changes in the relative strength of thermospheric winds in the dusk-dawn parts of the polar cap and in the dawn part of the auroral oval. In a study of four periods with a clear BY signature observed by the ISEE-3 satellite, together with observations of polar winds and electric fields from the Dynamics Explorer-2 satellite and wind observations by a ground-based Fabry-Perot interferometer located in Kiruna, northern Sweden, it is possible to explain features of the high latitude thermospheric circulation using three-dimensional global models that include BY-dependent, asymmetric polar convection fields. Anomalous zonal wind velocities are often observed, both when BY is positive and when BY is negative. These are matched by the observation of strong anti-sunward polar-cap wind jets from the DE-2 satellite, on the dusk side with BY negative and on the dawn side with BY positive. (author)

  10. The neural mediators of kindness-based meditation: a theoretical model

    Directory of Open Access Journals (Sweden)

    Jennifer Streiffer Mascaro

    2015-02-01

    Full Text Available Although kindness-based contemplative practices are increasingly employed by clinicians and cognitive researchers to enhance prosocial emotions, social cognitive skills, and well-being, and as a tool to understand the basic workings of the social mind, we lack a coherent theoretical model with which to test the mechanisms by which kindness-based meditation may alter the brain and body. Here we link contemplative accounts of compassion and loving-kindness practices with research from social cognitive neuroscience and social psychology to generate predictions about how diverse practices may alter brain structure and function and related aspects of social cognition. Contingent on the nuances of the practice, kindness-based meditation may enhance the neural systems related to faster and more basic perceptual or motor simulation processes, simulation of another’s affective body state, slower and higher-level perspective-taking, modulatory processes such as emotion regulation and self/other discrimination, and combinations thereof. This theoretical model will be discussed alongside best practices for testing such a model and potential implications and applications of future work.

  11. Patient centredness in integrated care: results of a qualitative study based on a systems theoretical framework

    Directory of Open Access Journals (Sweden)

    Daniel Lüdecke

    2014-11-01

    Full Text Available Introduction: Health care providers seek to improve patient-centred care. Due to fragmentation of services, this can only be achieved by establishing integrated care partnerships. The challenge is both to control costs while enhancing the quality of care and to coordinate this process in a setting with many organisations involved. The problem is to establish control mechanisms which ensure sufficient consideration of patient centredness. Theory and methods: Seventeen qualitative interviews were conducted in hospitals in metropolitan areas of northern Germany. The documentary method, embedded in a systems theoretical framework, was used to describe and analyse the data and to provide insight into the specific perception of organisational behaviour in integrated care. Results: The findings suggest that integrated care partnerships rely on networks based on professional autonomy in the context of reliability. The relationships of network partners are heavily based on informality. This corresponds with a systems theoretical conception of organisations, which are assumed to be autonomous in their decision-making. Conclusion and discussion: Networks based on formal contracts may restrict professional autonomy and competition. Contractual bindings that suppress the competitive environment have negative consequences for patient-centred care. Drawbacks remain due to the missing self-regulation of the network. To conclude, less regimentation of integrated care partnerships is recommended.

  13. Theoretical analysis and experimental evaluation of a CsI(Tl) based electronic portal imaging system

    International Nuclear Information System (INIS)

    Sawant, Amit; Zeman, Herbert; Samant, Sanjiv; Lovhoiden, Gunnar; Weinberg, Brent; DiBianca, Frank

    2002-01-01

    This article discusses the design and analysis of a portal imaging system based on a thick transparent scintillator. A theoretical analysis using Monte Carlo simulation was performed to calculate the x-ray quantum detection efficiency (QDE), signal to noise ratio (SNR) and the zero frequency detective quantum efficiency [DQE(0)] of the system. A prototype electronic portal imaging device (EPID) was built, using a 12.7 mm thick, 20.32 cm diameter, CsI(Tl) scintillator, coupled to a liquid nitrogen cooled CCD TV camera. The system geometry of the prototype EPID was optimized to achieve high spatial resolution. The experimental evaluation of the prototype EPID involved the determination of contrast resolution, depth of focus, light scatter and mirror glare. Images of humanoid and contrast detail phantoms were acquired using the prototype EPID and were compared with those obtained using conventional and high contrast portal film and a commercial EPID. A theoretical analysis was also carried out for a proposed full field of view system using a large area, thinned CCD camera and a 12.7 mm thick CsI(Tl) crystal. Results indicate that this proposed design could achieve DQE(0) levels up to 11%, due to its order of magnitude higher QDE compared to phosphor screen-metal plate based EPID designs, as well as significantly higher light collection compared to conventional TV camera based systems

  14. A global weighted mean temperature model based on empirical orthogonal function analysis

    Science.gov (United States)

    Li, Qinzheng; Chen, Peng; Sun, Langlang; Ma, Xiaping

    2018-03-01

    A global empirical orthogonal function (EOF) model of the tropospheric weighted mean temperature, called GEOFM_Tm, was developed using high-precision Global Geodetic Observing System (GGOS) Atmosphere Tm data during the years 2008-2014. Due to the quick convergence of the EOF decomposition, it is possible to use the first four EOF series, which consist of base functions Uk and associated coefficients Pk, to represent 99.99% of the overall variance of the original data sets and their spatial-temporal variations. Results show that U1 displays a prominent latitudinal profile with positive peaks in the low-latitude region. U2 manifests an asymmetric pattern in which positive values occur above 30° in the Northern Hemisphere and negative values are observed elsewhere. U3 and U4 display significant anomalies in Tibet and North America, respectively. Annual variation is the major component of the first and second associated coefficients P1 and P2, whereas P3 and P4 mainly reflect both annual and semi-annual variation components. Furthermore, the performance of the constructed GEOFM_Tm was validated by comparison with GTm_III and GTm_N using different kinds of data, including GGOS Atmosphere Tm data in 2015 and radiosonde data from the Integrated Global Radiosonde Archive (IGRA) in 2014. Generally speaking, GEOFM_Tm achieves the same accuracy and reliability as the GTm_III and GTm_N models on a global scale, and even improves on them in the Antarctic and Greenland regions. The MAE and RMS of GEOFM_Tm are 2.49 K and 3.14 K with respect to GGOS Tm data, and 3.38 K and 4.23 K with respect to IGRA sounding data, respectively. In addition, all three models have higher precision at low latitudes than at middle and high latitudes. The magnitude of Tm remains in the range 220-300 K and is highly correlated with geographic latitude. In the Northern Hemisphere there is a significant enhancement at high latitudes, reaching 270 K during summer.
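
    The decomposition described in this record (base functions Uk, associated coefficients Pk, and the variance fraction captured by the leading modes) can be reproduced in miniature with a standard SVD. The snippet below is an illustrative sketch on synthetic data, not the GEOFM_Tm code; the function name `eof_decompose` and the toy field are assumptions.

```python
# Illustrative sketch (not the GEOFM_Tm code): an EOF decomposition of a
# space-time field via SVD. Rows are time samples, columns are grid points.
import numpy as np

def eof_decompose(field, n_modes=4):
    """Return the first n_modes base functions Uk, coefficients Pk,
    the temporal mean, and the fraction of variance captured."""
    mean = field.mean(axis=0)
    anomaly = field - mean                      # remove the temporal mean
    # SVD: anomaly = L @ diag(s) @ R; rows of R are spatial patterns (EOFs)
    L, s, R = np.linalg.svd(anomaly, full_matrices=False)
    U = R[:n_modes]                             # base functions U1..Uk
    P = L[:, :n_modes] * s[:n_modes]            # associated coefficients P1..Pk
    var_frac = (s[:n_modes] ** 2).sum() / (s ** 2).sum()
    return U, P, mean, var_frac

# Toy field: 24 monthly samples on 10 grid points with one dominant pattern
t = np.arange(24)
pattern = np.linspace(-1.0, 1.0, 10)
field = 270.0 + np.outer(np.sin(2 * np.pi * t / 12), pattern)
U, P, mean, var_frac = eof_decompose(field, n_modes=1)
recon = mean + P @ U                            # truncated reconstruction
print(var_frac > 0.99)                          # one mode captures the signal
print(np.allclose(recon, field, atol=1e-8))
```

    Retaining only the leading modes and reconstructing as mean + Pk·Uk is the truncation step that lets four EOF series stand in for the full data set.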

  15. Upscaling Empirically Based Conceptualisations to Model Tropical Dominant Hydrological Processes for Historical Land Use Change

    Science.gov (United States)

    Toohey, R.; Boll, J.; Brooks, E.; Jones, J.

    2009-12-01

    Surface runoff and percolation to ground water are two hydrological processes of concern to the Atlantic slope of Costa Rica because of their impacts on flooding and drinking water contamination. As per legislation, the Costa Rican Government funds land use management from the farm to the regional scale to improve or conserve hydrological ecosystem services. In this study, we examined how land use (e.g., forest, coffee, sugar cane, and pasture) affects hydrological response at the point, plot (1 m2), and the field scale (1-6 ha) to empirically conceptualize the dominant hydrological processes in each land use. Using our field data, we upscaled these conceptual processes into a physically-based distributed hydrological model at the field, watershed (130 km2), and regional (1500 km2) scales. At the point and plot scales, the presence of macropores and large roots promoted greater vertical percolation and subsurface connectivity in the forest and coffee field sites. The lack of macropores and large roots, plus the addition of management artifacts (e.g., surface compaction and a plough layer), altered the dominant hydrological processes by increasing lateral flow and surface runoff in the pasture and sugar cane field sites. Macropores and topography were major influences on runoff generation at the field scale. Also at the field scale, antecedent moisture conditions suggest a threshold behavior as a temporal control on surface runoff generation. However, in this tropical climate with very intense rainstorms, annual surface runoff was less than 10% of annual precipitation at the field scale. Significant differences in soil and hydrological characteristics observed at the point and plot scales appear to have less significance when upscaled to the field scale. At the point and plot scales, percolation acted as the dominant hydrological process in this tropical environment. However, at the field scale for sugar cane and pasture sites, saturation-excess runoff increased as

  16. Theoretical study of solar combisystems based on bikini tanks and tank-in-tank stores

    DEFF Research Database (Denmark)

    Yazdanshenas, Eshagh; Furbo, Simon

    2012-01-01

    Purpose - Low flow bikini solar combisystems and high flow tank-in-tank solar combisystems have been studied theoretically. The aim of the paper is to study which of these two solar combisystem designs is suitable for different houses. The thermal performance of solar combisystems based on the two... Originality/value - Many different solar combisystem designs have been commercialized over the years. In the IEA-SHC Task 26, twenty one solar combisystems have been described and analyzed. Maybe the mantle tank approach can also be used with advantage for solar combisystems? This might be possible if the solar heating system is based on a so-called bikini tank. Therefore the newly developed solar combisystems based on bikini tanks are compared to the tank-in-tank solar combisystems to elucidate which one is suitable for three different houses with low, medium and high heating demand...

  17. Earthquake Prediction Analysis Based on Empirical Seismic Rate: The M8 Algorithm

    International Nuclear Information System (INIS)

    Molchan, G.; Romashkova, L.

    2010-07-01

    The quality of space-time earthquake prediction is usually characterized by a two-dimensional error diagram (n,τ), where n is the rate of failures-to-predict and τ is the normalized measure of space-time alarm. The most reasonable space measure for analysis of a prediction strategy is the rate of target events λ(dg) in a sub-area dg. In that case the quantity H = 1-(n +τ) determines the prediction capability of the strategy. The uncertainty of λ(dg) causes difficulties in estimating H and the statistical significance, α, of prediction results. We investigate this problem theoretically and show how the uncertainty of the measure can be taken into account in two situations, viz., the estimation of α and the construction of a confidence zone for the (n,τ)-parameters of the random strategies. We use our approach to analyse the results from prediction of M ≥ 8.0 events by the M8 method for the period 1985-2009 (the M8.0+ test). The model of λ(dg) based on the events Mw ≥ 5.5, 1977-2004, and the magnitude range of target events 8.0 ≤ M < 8.5 are considered as basic to this M8 analysis. We find the point and upper estimates of α and show that they are still unstable because the number of target events in the experiment is small. However, our results argue in favour of non-triviality of the M8 prediction algorithm. (author)

  18. Earthquake prediction analysis based on empirical seismic rate: the M8 algorithm

    Science.gov (United States)

    Molchan, G.; Romashkova, L.

    2010-12-01

    The quality of space-time earthquake prediction is usually characterized by a 2-D error diagram (n, τ), where n is the fraction of failures-to-predict and τ is the local rate of alarm averaged in space. The most reasonable averaging measure for analysis of a prediction strategy is the normalized rate of target events λ(dg) in a subarea dg. In that case the quantity H = 1 - (n + τ) determines the prediction capability of the strategy. The uncertainty of λ(dg) causes difficulties in estimating H and the statistical significance, α, of prediction results. We investigate this problem theoretically and show how the uncertainty of the measure can be taken into account in two situations, viz., the estimation of α and the construction of a confidence zone for the (n, τ)-parameters of the random strategies. We use our approach to analyse the results from prediction of M >= 8.0 events by the M8 method for the period 1985-2009 (the M8.0+ test). The model of λ(dg) based on the events Mw >= 5.5, 1977-2004, and the magnitude range of target events 8.0 <= M < 8.5 are considered as basic to this M8 analysis. We find the point and upper estimates of α and show that they are still unstable because the number of target events in the experiment is small. However, our results argue in favour of non-triviality of the M8 prediction algorithm.
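
    The (n, τ) error diagram and the prediction capability H = 1 - (n + τ) used in the two records above can be illustrated with a toy computation. This is a schematic sketch, not the M8 algorithm, and the cell values below are invented purely for illustration.

```python
# Toy illustration of the (n, tau) error diagram for an alarm-based
# prediction strategy (not the M8 code). Space-time is discretized into
# cells; each cell carries a normalized measure lambda(dg), an alarm
# flag, and whether a target event occurred there.
def error_diagram(cells):
    """cells: list of (rate, alarm, hit) tuples. Returns the fraction of
    failures-to-predict n, the alarm measure tau, and H = 1 - (n + tau)."""
    total_rate = sum(r for r, _, _ in cells)
    tau = sum(r for r, alarm, _ in cells if alarm) / total_rate
    events = sum(1 for _, _, hit in cells if hit)
    misses = sum(1 for _, alarm, hit in cells if hit and not alarm)
    n = misses / events
    return n, tau, 1.0 - (n + tau)   # H: prediction capability

# Four cells with equal measure; alarms cover half the measure and
# catch two of three target events.
cells = [
    (0.25, True, True), (0.25, True, True),
    (0.25, False, False), (0.25, False, True),
]
n, tau, H = error_diagram(cells)
print(round(n, 3), tau, round(H, 3))   # 0.333 0.5 0.167
```

    H > 0 means the strategy beats the trivial diagonal n + τ = 1 of random guessing, which is exactly the non-triviality the authors test for the M8 algorithm.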

  19. The Synergy of applying virtual collaboration tools and problem-based approach for development of knowledge sharing skills: empirical research

    OpenAIRE

    Schoop, Eric; Kriaučiūnienė, Roma; Brundzaitė, Rasa

    2004-01-01

    This article analyses the need for, and the possibilities of, teaching a new type of virtual collaboration skills to university students currently studying in the business and information systems area. We investigate the possibility of incorporating problem-based group learning and computer-supported tools into university curricula. The empirical research results are presented, summarizing experiences of using the virtual collaborative learning (VCL) environment provided by Business informat...

  20. "Little Island into Mighty Base": Indigeneity, Race, and U.S. Empire in Guam, 1944-1962

    OpenAIRE

    Flores, Alfred Peredo

    2015-01-01

    This dissertation examines the creation of Guam’s post-World War II multiracial society through Chamorro land stewardship and the recruitment of non-local labor. This tiny 212-square-mile island in the western Pacific became a crucible of American empire that connected Guam, the Philippines, and the United States. This synergy of expansion between the U.S. government and private industry resulted in the construction of Apra Harbor, bases, military homes, and roads throughout Guam. This pro...

  1. Potential of qualitative network analysis in migration studies- Reflections based on an empirical analysis of young researchers' mobility aspirations

    OpenAIRE

    Elisabeth Scheibelhofer

    2011-01-01

    Based on the example of an empirical research study, the paper examines the strengths and limitations of a qualitative network approach to migration and mobility. The method of graphic drawings produced by the respondents within an interview setting was applied. With this method, we argue, it is possible to analyse migrants' specific social embeddedness and its influence on future mobility aspirations. Likewise, connections between the migratory biography and the individuals' various social relati...

  2. Modelling of proton exchange membrane fuel cell performance based on semi-empirical equations

    Energy Technology Data Exchange (ETDEWEB)

    Al-Baghdadi, Maher A.R. Sadiq [Babylon Univ., Dept. of Mechanical Engineering, Babylon (Iraq)

    2005-08-01

    Using semi-empirical equations for modeling a proton exchange membrane fuel cell is proposed to provide a tool for the design and analysis of complete fuel cell systems. The focus of this study is to derive an empirical model, including process variations, to estimate the performance of the fuel cell without extensive calculations. The model takes into account not only the current density but also process variations, such as the gas pressure, temperature, humidity, and utilization, to cover the operating processes that are important factors in determining the real performance of the fuel cell. The modelling results compare well with known experimental results, showing good agreement between the modelled and measured data. The model can be used to investigate the influence of process variables for design optimization of fuel cells, stacks, and complete fuel cell power systems. (Author)
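
    Semi-empirical fuel cell models of the kind described here typically express cell voltage as an open-circuit value minus activation, ohmic, and concentration losses. A common textbook parameterization, not necessarily the author's exact equations, is V = E0 - b·log10(i) - R·i - m·exp(n·i); every coefficient value below is an illustrative assumption only.

```python
# Hedged sketch of a generic semi-empirical polarization curve of the
# form V = E0 - b*log10(i) - R*i - m*exp(n*i). This is a common textbook
# parameterization, not the specific equations of the cited model, and
# all coefficient values are illustrative.
import math

def cell_voltage(i, E0=0.95, b=0.06, R=0.2, m=3e-5, n=8.0):
    """i: current density [A/cm^2]; returns cell voltage [V].
    b: Tafel slope, R: area-specific resistance [ohm*cm^2],
    m, n: empirical mass-transport coefficients."""
    return E0 - b * math.log10(i) - R * i - m * math.exp(n * i)

# Voltage falls monotonically with current density, as expected for
# combined activation, ohmic, and concentration losses.
currents = [0.05, 0.2, 0.5, 1.0]
voltages = [cell_voltage(i) for i in currents]
print([round(v, 3) for v in voltages])
```

    Fitting E0, b, R, m, and n to measured polarization data is what lets such a model estimate performance "without extensive calculations", as the abstract puts it.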

  3. A Reliability Test of a Complex System Based on Empirical Likelihood

    OpenAIRE

    Zhou, Yan; Fu, Liya; Zhang, Jun; Hui, Yongchang

    2016-01-01

    To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic for the complex system and derive the limit distribution of the test statistic. Therefore, we can obtain a confidence interval for the reliability and make statistical inferences. The simulation studies also corroborate the theoretical results.

  4. Organizational Learning, Strategic Flexibility and Business Model Innovation: An Empirical Research Based on Logistics Enterprises

    Science.gov (United States)

    Bao, Yaodong; Cheng, Lin; Zhang, Jian

    Using data from 237 Jiangsu logistics firms, this paper empirically studies the relationship among organizational learning capability, business model innovation, and strategic flexibility. The results show the following: organizational learning capability has a positive impact on business model innovation performance; strategic flexibility mediates the relationship between organizational learning capability and business model innovation; and the interaction among strategic flexibility, explorative learning, and exploitative learning plays a significant role in both radical and incremental business model innovation.

  5. Banking Fragility in Colombia: An Empirical Analysis Based on Balance Sheets

    OpenAIRE

    Ignacio Lozano; Alexander Guarín

    2014-01-01

    In this paper, we study the empirical relationship between credit funding sources and the financial vulnerability of the Colombian banking system. We propose a statistical model to measure and predict banking-fragility episodes associated with credit funding sources classified into retail deposits and wholesale funds. We compute the probability of financial fragility for both the aggregated banking system and the individual banks. Our approach performs a Bayesian averaging of estimated logit ...

  6. An ACE-based Nonlinear Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Nielsen, Allan Aasbjerg; Andersen, Ole

    2001-01-01

    This paper shows the application of the empirical orthogonal functions/principal component transformation on global sea surface height and temperature data from 1996 and 1997. A nonlinear correlation analysis of the transformed data is proposed and performed by applying the alternating conditional expectations algorithm. New canonical variates are found that indicate that the highest correlation between ocean temperature and height is associated with the build-up of the El Niño during the last half of 1997.

  7. An Empirical Study Based on the SPSS Variance Analysis of College Teachers' Sports Participation and Satisfaction

    OpenAIRE

    Yunqiu Liang

    2013-01-01

    This study empirically examines the relationship between college teachers' sports participation and their job satisfaction, taking teachers' participation in group sports activities as the object of study and analysing the survey data with SPSS. Results show that teachers who participate in sports groups report higher job satisfaction than those who do not, and that job satisfaction differs with the form of sports participation. Recommendations for college teachers to address...

  8. The Fault Diagnosis of Rolling Bearing Based on Ensemble Empirical Mode Decomposition and Random Forest

    OpenAIRE

    Qin, Xiwen; Li, Qiaoling; Dong, Xiaogang; Lv, Siqi

    2017-01-01

    Accurate diagnosis of rolling bearing faults is of great significance for the normal operation of machinery and equipment. A method combining Ensemble Empirical Mode Decomposition (EEMD) and Random Forest (RF) is proposed. Firstly, the original signal is decomposed into several intrinsic mode functions (IMFs) by EEMD, and the effective IMFs are selected. Then their energy entropy is calculated as the feature. Finally, the classification is performed by RF. In addition, the wavelet meth...
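
    The energy-entropy feature in the pipeline above is straightforward once the IMFs exist. The sketch below omits EEMD itself (which needs a signal-processing library) and uses hand-made lists to stand in for IMFs; for IMFs c1..ck it computes Ei = Σci², pi = Ei/ΣE, and H = -Σ pi·log(pi).

```python
# Sketch of the energy-entropy feature used after EEMD (the decomposition
# itself is omitted; the lists below stand in for IMFs).
import math

def energy_entropy(imfs):
    # Energy of each IMF, normalized to a probability distribution,
    # then Shannon entropy of that distribution.
    energies = [sum(x * x for x in imf) for imf in imfs]
    total = sum(energies)
    probs = [e / total for e in energies]
    return -sum(p * math.log(p) for p in probs if p > 0)

# Energy spread evenly across k IMFs gives maximal entropy log(k);
# energy concentrated in one IMF (e.g. a localized fault signature)
# gives lower entropy, which is what makes this a useful feature.
even = [[1.0, -1.0], [1.0, 1.0], [-1.0, 1.0]]
spiky = [[3.0, -3.0], [0.1, 0.1], [0.1, -0.1]]
print(energy_entropy(even), math.log(3))
print(energy_entropy(spiky) < energy_entropy(even))
```

    The resulting scalar (or a vector of per-IMF energy ratios) is then fed to the Random Forest classifier as the fault feature.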

  9. An empirical typology of energy services based on a well-developed market: France

    International Nuclear Information System (INIS)

    Duplessis, Bruno; Adnot, Jérôme; Dupont, Maxime; Racapé, François

    2012-01-01

    The investigation is an attempt to apply a set of consistent and official definitions of energy services (ES), energy efficiency services (EES) and energy performance contracting (EPC) to a well-developed market: France. After defining the historical context of the French market, the authors describe the types of offers presently made that fall within the definition of energy services. There are many classic and novel factors in the success of energy services. For instance, the energy services market is now partly structured by the CEE scheme, the French ‘white certificates’ or ‘energy certificates’ scheme. Grid problems also lead to new services. The companies active on the market are described as a result of an empirical survey of the ES market in France, which includes estimates of the number of companies and of their turnover both for ES and EES. Examples and case studies are developed as background. - Highlights: ► We apply a consistent and official set of definitions of energy services (ES). ► The French ES market is structured by historical ES and shaped by recent evolutions. ► The French ES market is now partly structured by white certificates and grid management. ► We describe the companies active on the French market through an empirical survey. ► We have estimated the French market size (actors and turnover) both for ES and EES.

  10. Bioactive conformational generation of small molecules: A comparative analysis between force-field and multiple empirical criteria based methods

    Directory of Open Access Journals (Sweden)

    Jiang Hualiang

    2010-11-01

    Full Text Available Background: Conformational sampling for small molecules plays an essential role in the drug discovery research pipeline. Based on a multi-objective evolution algorithm (MOEA), we developed a conformational generation method called Cyndi in a previous study. In this work, in addition to the Tripos force field of the previous version, Cyndi was updated by incorporating the MMFF94 force field to assess the conformational energy more rationally. With the two force fields, against a larger dataset of 742 bioactive conformations of small ligands extracted from the PDB, a comparative analysis was performed between the pure force field based method (FFBM) and the multiple empirical criteria based method (MECBM) hybridized with different force fields. Results: Our analysis reveals that incorporating multiple empirical rules can significantly improve the accuracy of conformational generation. MECBM, which takes both empirical and force field criteria as objective functions, can reproduce about 54% (within 1 Å RMSD) of the bioactive conformations in the 742-molecule testset, much higher than the pure force field method (FFBM, about 37%). On the other hand, MECBM achieved a more complete and efficient sampling of the conformational space, because the average size of the unique-conformation ensemble per molecule is about 6 times larger than that of FFBM, while the time scale for conformational generation is nearly the same. Furthermore, as a complementary comparison between the methods with and without empirical biases, we also tested the performance of the three conformational generation methods in MacroModel in combination with different force fields. Compared with the methods in MacroModel, MECBM is more competitive in retrieving the bioactive conformations in terms of accuracy but has much lower computational cost. Conclusions: By incorporating different energy terms with several empirical criteria, the MECBM method can produce more reasonable conformational
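
    The 1 Å RMSD criterion used above to decide whether a generated conformation reproduces the bioactive one can be sketched directly. The snippet computes a plain coordinate RMSD on toy coordinates and assumes the two conformations are already superimposed; a real comparison would first find the optimal alignment (e.g. with the Kabsch algorithm), which is omitted here.

```python
# Sketch of the RMSD criterion for calling a generated conformation a
# reproduction of the bioactive one (within 1 angstrom). Assumes the
# conformations are already optimally superimposed; toy coordinates.
import math

def rmsd(coords_a, coords_b):
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

ref = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (2.3, 1.2, 0.0)]
gen = [(0.1, 0.0, 0.0), (1.4, 0.1, 0.0), (2.2, 1.3, 0.1)]
print(rmsd(ref, gen) < 1.0)   # True: counts as a reproduced conformation
```

    Reported reproduction rates (54% for MECBM vs about 37% for FFBM) are simply the fraction of test molecules for which at least one generated conformer passes this check.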

  11. Theoretical and experimental studies on ionic currents in nanopore-based biosensors.

    Science.gov (United States)

    Liu, Lei; Li, Chu; Ma, Jian; Wu, Yingdong; Ni, Zhonghua; Chen, Yunfei

    2014-12-01

    A novel generation of nanopore-based analytical technology has provided possibilities for fabricating nanofluidic devices for low-cost DNA sequencing or rapid biosensing. In this paper, a simplified model is suggested to describe a DNA molecule's translocation through a nanopore, and the internal potential, ion concentration, ionic flow velocity and ionic current in nanopores of different sizes are theoretically calculated and discussed on the basis of the Poisson-Boltzmann, Navier-Stokes and Nernst-Planck equations, considering several important parameters such as the applied voltage, the thickness and the electric potential distributions in the nanopores. In this way, the basic ionic currents, the modulated ionic currents and the current drops induced by translocation were obtained, and the size effects of the nanopores were carefully compared and discussed based on the calculated results and experimental data, which indicated that nanopores with a size of about 10 nm are more advantageous for achieving high-quality ionic current signals in DNA sensing.
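
    As a hedged back-of-the-envelope companion to the full Poisson-Boltzmann/Navier-Stokes/Nernst-Planck treatment, the open-pore conductance of a cylindrical nanopore is often estimated from channel resistance plus access resistance, G = σ·[4L/(πd²) + 1/d]⁻¹. This is a standard first-order approximation from the nanopore literature, not the model solved in the paper, and all numbers below are illustrative.

```python
# Hedged sketch: a widely used first-order estimate of open-pore ionic
# conductance for a cylindrical nanopore, combining channel resistance
# with access resistance: G = sigma * [4*L/(pi*d^2) + 1/d]^(-1).
# Not the paper's model; parameter values are illustrative only.
import math

def pore_conductance(diameter_nm, length_nm, sigma_S_per_m=10.5):
    """sigma ~10.5 S/m corresponds roughly to 1 M KCl at room temperature."""
    d = diameter_nm * 1e-9
    L = length_nm * 1e-9
    resistance = (4 * L / (math.pi * d ** 2) + 1 / d) / sigma_S_per_m
    return 1.0 / resistance          # siemens

# Conductance grows with pore diameter and shrinks with membrane thickness,
# which is the size effect the abstract discusses.
g_small = pore_conductance(5.0, 20.0)
g_large = pore_conductance(10.0, 20.0)
print(g_small * 1e9, g_large * 1e9)  # nanosiemens scale
```

    The translocation signal is then the drop from this open-pore baseline while the DNA partially blocks the channel, which is why intermediate pore sizes (~10 nm) give the best signal quality.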

  12. Graph theoretical analysis and application of fMRI-based brain network in Alzheimer's disease

    Directory of Open Access Journals (Sweden)

    LIU Xue-na

    2012-08-01

    Full Text Available Alzheimer's disease (AD), a progressive neurodegenerative disease, is clinically characterized by impaired memory and many other cognitive functions. However, the pathophysiological mechanisms underlying the disease are not thoroughly understood. In recent years, using functional magnetic resonance imaging (fMRI) as well as advanced graph-theory-based network analysis approaches, several studies of patients with AD have suggested abnormal topological organization in both global and regional properties of functional brain networks, specifically a loss of small-world network characteristics. These studies provide novel insights into the pathophysiological mechanisms of AD and could be helpful in developing imaging biomarkers for disease diagnosis. In this paper we introduce the essential concepts of complex brain network theory and review recent advances in the study of human functional brain networks in AD, focusing especially on graph theoretical analysis of small-world networks based on fMRI. We also outline open problems and future research directions.
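
    The "small-world" characterization above rests on two graph measures: the average clustering coefficient C and the characteristic path length L. The sketch below computes both in plain Python on a toy undirected network; in an fMRI study the adjacency would instead come from thresholded functional connectivity between brain regions.

```python
# Sketch of the two measures behind the "small-world" description:
# average clustering coefficient C and characteristic path length L,
# computed on a toy undirected graph given as {node: set_of_neighbours}.
from collections import deque

def clustering(adj):
    cs = []
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            cs.append(0.0)
            continue
        # count edges among the neighbours of v
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        cs.append(2.0 * links / (k * (k - 1)))
    return sum(cs) / len(cs)

def path_length(adj):
    total, pairs = 0, 0
    for src in adj:                      # BFS shortest paths from each node
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        for v, d in dist.items():
            if v != src:
                total += d
                pairs += 1
    return total / pairs

# Square with one diagonal: locally clustered and globally short.
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
print(round(clustering(adj), 4), round(path_length(adj), 4))  # 0.8333 1.1667
```

    A small-world network has C much higher than a random graph of the same size while L stays comparably short; AD patients show a shift away from this regime.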

  13. Theoretical study of the structure and reactivity of lanthanide and actinide based organometallic complexes

    International Nuclear Information System (INIS)

    Barros, N.

    2007-06-01

    In this PhD thesis, lanthanide and actinide based organometallic complexes are studied using quantum chemistry methods. In a first part, the catalytic properties of organo-lanthanide compounds are evaluated by studying two types of reactions: the catalytic hydro-functionalization of olefins and the polymerisation of polar monomers. The reaction mechanisms are theoretically determined and validated, and the influence of possible secondary non productive reactions is envisaged. A second part focuses on uranium-based complexes. Firstly, the electronic structure of uranium metallocenes is analysed. An analogy with the uranyl compounds is proposed. In a second chapter, two isoelectronic complexes of uranium IV are studied. After validating the use of DFT methods for describing the electronic structure and the reactivity of these compounds, it is shown that their reactivity difference can be related to a different nature of chemical bonding in these complexes. (author)

  14. Activity-based costing in sport organizations: Theoretical background & future prospects

    Directory of Open Access Journals (Sweden)

    PANAGIOTIS E. DIMITROPOULOS

    2007-01-01

    Full Text Available Costing systems have shown significant development in recent years, and activity-based costing (ABC) specifically has been considered a major contribution to cost management, particularly in service businesses. The sport sector is composed to a great extent of service functions, yet considerably less has been reported on the use of activity-based costing to support cost management in sport organizations. The power of information has become ever more crucial for effective business administration, and traditional methods of cost measurement proved insufficient in this respect, leading to the invention of ABC. The aim of this paper is twofold: first, to present the main theoretical background of ABC and its substantiated benefits, and second, to present some practical steps for the implementation of ABC in sport organizations.

  15. A theoretical global optimization method for vapor-compression refrigeration systems based on entransy theory

    International Nuclear Information System (INIS)

    Xu, Yun-Chao; Chen, Qun

    2013-01-01

    Vapor-compression refrigeration systems are among the essential energy conversion systems for humankind and consume huge amounts of energy. Many effective methods exist for optimizing their energy efficiency, but these rely mainly on engineering experience and computer simulations rather than theoretical analysis, owing to the complex physics involved. We propose a theoretical global optimization method based on in-depth physical analysis of the processes involved: heat transfer analysis of the condenser and evaporator, introducing the entransy theory, and thermodynamic analysis of the compressor and expansion valve. The integration of heat transfer and thermodynamic analyses forms an overall physical optimization model for the system that describes the relation between all the unknown parameters and the known conditions, which makes theoretical global optimization possible. With the aid of mathematical conditional-extremum solutions, an optimization equation group and the optimal configuration of all the unknown parameters are obtained analytically. Finally, via the optimization of a typical vapor-compression refrigeration system under various working conditions to minimize the total heat transfer area of the heat exchangers, the validity and superiority of the newly proposed optimization method are demonstrated. - Highlights: • A global optimization method for vapor-compression systems is proposed. • Integrating heat transfer and thermodynamic analyses forms the optimization model. • A mathematical relation between design parameters and requirements is derived. • Entransy dissipation is introduced into heat transfer analysis. • The validity of the method is proved via optimization of practical cases

  16. Calculating the Fee-Based Services of Library Institutions: Theoretical Foundations and Practical Challenges

    Directory of Open Access Journals (Sweden)

    Sysіuk Svitlana V.

    2017-05-01

    Full Text Available The article is aimed at highlighting features of the provision of fee-based services by library institutions, identifying problems related to the legal and regulatory framework for their calculation, and the methods to implement it. The objective of the study is to develop recommendations to improve the calculation of fee-based library services. The theoretical foundations have been systematized, and the need to develop a Provision on the procedure for fee-based services by library institutions has been substantiated. Such a Provision would protect a library institution from errors in fixing the fee for a paid service and would serve as an informational source explaining it. The appropriateness of applying the market pricing law based on demand and supply has been substantiated. The development and improvement of accounting and calculation, taking into consideration both industry-specific and market-based conditions, would optimize the costs and revenues generated by the provision of fee-based services. In addition, the complex combination of calculation leverages with the development of a system of internal accounting, together with use of its methodology, provides another equally efficient way of improving the efficiency of library institutions' activity.

  17. An empirically-based model for the lift coefficients of twisted airfoils with leading-edge tubercles

    Science.gov (United States)

    Ni, Zao; Su, Tsung-chow; Dhanak, Manhar

    2018-04-01

    Experimental data for untwisted airfoils are utilized to propose a model for predicting the lift coefficients of twisted airfoils with leading-edge tubercles. The effectiveness of the empirical model is verified through comparison with results of a corresponding computational fluid-dynamic (CFD) study. The CFD study is carried out for both twisted and untwisted airfoils with tubercles, the latter shown to compare well with available experimental data. Lift coefficients of twisted airfoils predicted from the proposed empirically-based model match well with the corresponding coefficients determined using the verified CFD study. Flow details obtained from the latter provide better insight into the underlying mechanism and behavior at stall of twisted airfoils with leading edge tubercles.

  18. Deep in Data: Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Neymark, J.; Roberts, D.

    2013-06-01

    An opportunity is available for using home energy consumption and building description data to develop a standardized accuracy test for residential energy analysis tools. That is, to test the ability of uncalibrated simulations to match real utility bills. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This may facilitate the possibility of modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data. This paper describes progress toward, and issues related to, developing a usable, standardized, empirical data-based software accuracy test suite.

  19. Sectoral patterns of interactive learning : an empirical exploration using an extended resource based model

    NARCIS (Netherlands)

    Meeus, M.T.H.; Oerlemans, L.A.G.; Hage, J.

    1999-01-01

    This paper pursues the development of a theoretical framework that explains interactive learning between innovating firms and external actors in the knowledge infrastructure and the production chain. The research question is: what kinds of factors explain the interactive learning of innovating firms?

  20. Metamaterial-based theoretical description of light scattering by metallic nano-hole array structures

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Mahi R. [Department of Physics and Astronomy, University of Western Ontario, London N6A 3K7 (Canada); Najiminaini, Mohamadreza; Carson, Jeffrey J. L. [Lawson Health Research Institute, St. Joseph's Health Care, 268 Grosvenor Street, London N6A 4V2 (Canada); Department of Medical Biophysics, University of Western Ontario, London N6A 3K7 (Canada); Balakrishnan, Shankar [Department of Physics and Astronomy, University of Western Ontario, London N6A 3K7 (Canada); Lawson Health Research Institute, St. Joseph's Health Care, 268 Grosvenor Street, London N6A 4V2 (Canada); Department of Medical Biophysics, University of Western Ontario, London N6A 3K7 (Canada)

    2015-05-14

    We have experimentally and theoretically investigated the light-matter interaction in metallic nano-hole array structures. The scattering cross section spectrum was measured for three samples, each having a unique nano-hole array radius and periodicity. Each measured spectrum had several peaks due to surface plasmon polaritons. The dispersion relation and the effective dielectric constant of the structure were calculated using transmission line theory and Bloch's theorem. Using the effective dielectric constant and the transfer matrix method, the surface plasmon polariton energies were calculated and found to be quantized. Using these quantized energies, a Hamiltonian for the surface plasmon polaritons was written in second quantized form. Working with this Hamiltonian, a theory of the scattering cross section was developed based on quantum scattering theory and the Green's function method. For both theory and experiment, the location of the surface plasmon polariton spectral peaks was dependent on the array periodicity and the radii of the nano-holes. Good agreement was observed between the experimental and theoretical results. It is proposed that the newly developed theory can be used to facilitate the optimization of nanosensors for medical and engineering applications.

  1. Theoretical rationalization for reduced charge recombination in bulky carbazole-based sensitizers in solar cells.

    Science.gov (United States)

    Surakhot, Yaowarat; Laszlo, Viktor; Chitpakdee, Chirawat; Promarak, Vinich; Sudyoadsuk, Taweesak; Kungwan, Nawee; Kowalczyk, Tim; Irle, Stephan; Jungsuttiwong, Siriporn

    2017-05-05

    The search for greater efficiency in organic dye-sensitized solar cells (DSCs) and in their perovskite cousins is greatly aided by a more complete understanding of the spectral and morphological properties of the photoactive layer. This investigation resolves a discrepancy in the observed photoconversion efficiency (PCE) of two closely related DSCs based on carbazole-containing D-π-A organic sensitizers. Detailed theoretical characterization of the absorption spectra, dye adsorption on TiO2, and electronic couplings for charge separation and recombination permits a systematic determination of the origin of the difference in PCE. Although the two dyes produce similar spectral features, ground- and excited-state density functional theory (DFT) simulations reveal that the dye with the bulkier donor group adsorbs more strongly to TiO2, experiences limited π-π aggregation, and is more resistant to loss of excitation energy via charge recombination on the dye. The effects of conformational flexibility on absorption spectra and on the electronic coupling between the bright exciton and charge-transfer states are revealed to be substantial and are characterized through density-functional tight-binding (DFTB) molecular dynamics sampling. These simulations offer a mechanistic explanation for the superior open-circuit voltage and short-circuit current of the bulky-donor dye sensitizer and provide theoretical justification of an important design feature for the pursuit of greater photocurrent efficiency in DSCs. © 2017 Wiley Periodicals, Inc.

  2. A Game-theoretic Framework for Network Coding Based Device-to-Device Communications

    KAUST Repository

    Douik, Ahmed S.; Sorour, Sameh; Tembine, Hamidou; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2016-01-01

    This paper investigates the delay minimization problem for instantly decodable network coding (IDNC) based device-to-device (D2D) communications. In D2D enabled systems, users cooperate to recover all their missing packets. The paper proposes a game-theoretic framework as a tool for improving the distributed solution by overcoming the need for a central controller or additional signaling in the system. The session is modeled by self-interested players in a non-cooperative potential game. The utility functions are designed so that increasing individual payoff results in a collective behavior achieving both a desirable system performance in a shared network environment and the Nash equilibrium. Three games are developed: the first reduces the completion time, the second the maximum decoding delay, and the third the sum decoding delay. The paper further improves the formulations by including a punishment policy upon collision occurrence so as to achieve the Nash bargaining solution. Learning algorithms are proposed for systems with complete and incomplete information, and for the imperfect feedback scenario. Numerical results suggest that the proposed game-theoretic formulation provides appreciable performance gain over conventional point-to-multipoint (PMP) transmission, especially for reliable user-to-user channels.

  3. Feasibility of theoretical formulas on the anisotropy of shale based on laboratory measurement and error analysis

    Science.gov (United States)

    Xie, Jianyong; Di, Bangrang; Wei, Jianxin; Luan, Xinyuan; Ding, Pinbo

    2015-04-01

    This paper designs a total-angle ultrasonic test method to measure the P-wave velocities (vp) and vertically and horizontally polarized shear wave velocities (vsv and vsh) at all angles to the bedding plane on different kinds of strongly anisotropic shale. Comparisons are made between the observations and the corresponding theoretical curves calculated from the various vertical transversely isotropic (TI) medium theories, discussing how closely each characterizes the TI medium in terms of dynamic behavior, and concluding which of the various theoretical formulas is most accurate and precise, as well as its suitable range, for characterizing the strong anisotropy of shale. At low phase angles theta, the Berryman expressions provide a much better agreement with the measured data for vp and vsv on shale. All three theories show larger deviations in the approximation of vsv than of vp and vsh. Furthermore, we created synthetic comparative ideal physical models (from coarse bakelite, cambric bakelite, and paper bakelite) as supplementary models to natural shale, used to model shale with different anisotropy, to research the effects of the anisotropic parameters upon the applicability of the former optimal TI theories, especially for vsv. We found that when the P-wave anisotropy ε and the S-wave anisotropy γ exceed 0.25, the Berryman curve is the best fit for vp and vsv on shale.
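
    For orientation, TI phase-velocity curves of the kind compared against the measurements above are often written in Thomsen's weak-anisotropy notation, with anisotropy parameters ε, δ, γ and symmetry-axis velocities vp0, vs0. The sketch below implements that standard approximation as a generic illustration only; it is not the specific Berryman formulation evaluated in the paper, and any parameter values used in a call are purely illustrative.

    ```python
    import math

    def thomsen_velocities(theta_deg, vp0, vs0, eps, delta, gamma):
        """Thomsen weak-anisotropy phase velocities in a TI medium; theta is
        measured from the symmetry axis (normal to the bedding plane)."""
        t = math.radians(theta_deg)
        s2, c2 = math.sin(t) ** 2, math.cos(t) ** 2
        vp = vp0 * (1.0 + delta * s2 * c2 + eps * s2 * s2)
        vsv = vs0 * (1.0 + (vp0 / vs0) ** 2 * (eps - delta) * s2 * c2)
        vsh = vs0 * (1.0 + gamma * s2)
        return vp, vsv, vsh
    ```

    At theta = 0 (along the symmetry axis) all three velocities reduce to the axis values, while at theta = 90° the formulas recover the familiar limits vp0(1 + ε) and vs0(1 + γ), which is how ε and γ acquire their meaning as P- and S-wave anisotropy parameters.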

  5. [DGRW-update: neurology--from empirical strategies towards evidence based interventions].

    Science.gov (United States)

    Schupp, W

    2011-12-01

    Stroke, multiple sclerosis (MS), traumatic brain injuries (TBI) and neuropathies are the most important diseases in neurological rehabilitation financed by the German Pension Insurance. The primary goal is vocational (re)integration. Driven by multiple findings of neuroscience research, the traditional holistic approach with mainly empirically derived strategies was developed further and improved by new evidence-based interventions. This process had been, and continues to be, necessary to meet the health-economic pressure for ever shorter and more efficient rehab measures. Evidence-based interventions refer to symptom-oriented measures, to team-management concepts, and to education and psychosocial interventions. Drug therapy and/or neurophysiological measures can be added to increase neuroregeneration and neuroplasticity. Evidence-based aftercare concepts support the sustainability and steadiness of rehab results. Mirror therapy, robot-assisted training, mental training, task-specific training, and above all constraint-induced movement therapy (CIMT) can restore motor arm and hand functions. Treadmill training and robot-assisted training improve stance and gait. Botulinum toxin injections in combination with physical and redressing methods are superior in managing spasticity. Guideline-oriented management of associated pain syndromes (myofascial, neuropathic, complex-regional=dystrophic) improves primary outcome and quality of life. Drug therapy with so-called co-analgesics and physical therapy play an important role in pain management. Swallowing disorders lead to higher mortality and morbidity in the acute phase; stepwise diagnostics (screening, endoscopy, radiology) and specific swallowing therapy can reduce these risks and frequently restore normal eating and drinking. In our modern industrial societies, communicative and cognitive disturbances are more impairing than the above-mentioned disorders. Speech and language therapy (SLT) is dominant in

  6. Creating a memory of causal relationships an integration of empirical and explanation-based learning methods

    CERN Document Server

    Pazzani, Michael J

    2014-01-01

    This book presents a theory of learning new causal relationships by making use of perceived regularities in the environment, general knowledge of causality, and existing causal knowledge. Integrating ideas from the psychology of causation and machine learning, the author introduces a new learning procedure called theory-driven learning that uses abstract knowledge of causality to guide the induction process. Known as OCCAM, the system uses theory-driven learning when new experiences conform to common patterns of causal relationships, empirical learning to learn from novel experiences, and expl

  7. Support Vector Regression Model Based on Empirical Mode Decomposition and Auto Regression for Electric Load Forecasting

    Directory of Open Access Journals (Sweden)

    Hong-Juan Li

    2013-04-01

    Full Text Available Electric load forecasting is an important issue for a power utility, associated with the management of daily operations such as energy transfer scheduling, unit commitment, and load dispatch. Inspired by the strong non-linear learning capability of support vector regression (SVR), this paper presents an SVR model hybridized with the empirical mode decomposition (EMD) method and auto-regression (AR) for electric load forecasting. The electric load data of the New South Wales (Australia) market are employed for comparing the forecasting performances of different forecasting models. The results confirm the validity of the idea that the proposed model can simultaneously provide forecasting with good accuracy and interpretability.
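
    As a rough, self-contained illustration of the hybrid decompose-then-model idea — not the paper's method: real EMD requires an envelope-interpolation routine (e.g. the PyEMD package), and the paper models components with SVR — the sketch below substitutes a moving average for the slow component and a closed-form least-squares AR(1) fit for the fast residual. All function names are hypothetical.

    ```python
    def ar1_coeff(x):
        """Least-squares AR(1) coefficient for x[t] ≈ phi * x[t-1]."""
        num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
        den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
        return num / den if den > 0 else 0.0

    def moving_average(x, w):
        """Trailing moving average, acting as a crude slow component."""
        return [sum(x[max(0, t - w + 1):t + 1]) / (t - max(0, t - w + 1) + 1)
                for t in range(len(x))]

    def hybrid_forecast(x, w=4):
        """One-step-ahead forecast: persistence on the slow part plus an
        AR(1) extrapolation of the fast residual."""
        slow = moving_average(x, w)
        fast = [a - b for a, b in zip(x, slow)]
        return slow[-1] + ar1_coeff(fast) * fast[-1]
    ```

    The design choice mirrors the abstract's rationale: the decomposition isolates components whose dynamics are simple enough for an interpretable model (AR here, SVR in the paper), and the component forecasts are summed back into a load forecast.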

  8. The Fault Diagnosis of Rolling Bearing Based on Ensemble Empirical Mode Decomposition and Random Forest

    Directory of Open Access Journals (Sweden)

    Xiwen Qin

    2017-01-01

    Full Text Available Accurate diagnosis of rolling bearing faults is of great significance for the normal operation of machinery and equipment. A method combining Ensemble Empirical Mode Decomposition (EEMD) and Random Forest (RF) is proposed. Firstly, the original signal is decomposed into several intrinsic mode functions (IMFs) by EEMD, and the effective IMFs are selected. Then their energy entropy is calculated as the feature. Finally, the classification is performed by RF. In addition, the wavelet method is applied in the same procedure for comparison with EEMD. The results of the comparison show that the EEMD method is more accurate than the wavelet method.
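
    The energy-entropy feature mentioned above has a simple closed form: each IMF's energy is normalized into a probability, and a Shannon entropy is taken over the IMFs. A minimal sketch, assuming the IMFs have already been produced by a decomposition routine and are available as lists of samples:

    ```python
    import math

    def energy_entropy(imfs):
        """Shannon entropy of the energy distribution across IMFs:
        E_i = sum(s**2), p_i = E_i / sum(E), H = -sum(p_i * ln p_i)."""
        energies = [sum(s * s for s in imf) for imf in imfs]
        total = sum(energies)
        return -sum((e / total) * math.log(e / total)
                    for e in energies if e > 0.0)
    ```

    Energy spread evenly across IMFs gives maximal entropy (ln of the number of IMFs), while energy concentrated in a single IMF gives zero — which is why the feature discriminates fault signatures that excite particular frequency bands.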

  9. Theoretical and experimental determination of mass attenuation coefficients of lead-based ceramics and their comparison with simulation

    Directory of Open Access Journals (Sweden)

    Vejdani-Noghreiyan Alireza

    2016-01-01

    Full Text Available Mass attenuation coefficients of lead-based ceramics have been measured experimentally and compared with theoretical and Monte Carlo simulation results. The lead-based ceramics were prepared using the mixed oxide method, and X-ray diffraction analysis was performed to evaluate the crystal structure of the handmade ceramics. The experimental results show good agreement with the theoretical and simulation results, although at two gamma-ray energies small differences between experimental and theoretical results were observed. By adding other additives to the ceramics and observing the changes in properties such as flexibility, one can synthesize and optimize ceramics as a neutron shield.

  10. Inglorious Empire

    DEFF Research Database (Denmark)

    Khair, Tabish

    2017-01-01

    Review of 'Inglorious Empire: What the British did to India' by Shashi Tharoor, London, Hurst Publishers, 2017, 296 pp., £20.00.

  11. Combining empirical and theory-based land-use modelling approaches to assess economic potential of biofuel production avoiding iLUC: Argentina as a case study

    NARCIS (Netherlands)

    Diogo, V.; van der Hilst, F.; van Eijck, J.; Verstegen, J.A.; Hilbert, J.; Carballo, S.; Volante, J.; Faaij, A.

    2014-01-01

    In this paper, a land-use modelling framework is presented combining empirical and theory-based modelling approaches to determine economic potential of biofuel production avoiding indirect land-use changes (iLUC) resulting from land competition with other functions. The empirical approach explores

  12. Theoretical bases of project management in conditions of innovative economy based on fuzzy modeling

    Science.gov (United States)

    Beilin, I. L.; Khomenko, V. V.

    2018-05-01

    In recent years, more and more Russian enterprises (both private and public) are trying to organize their activities on the basis of modern scientific research in order to improve the management of economic processes. Business planning, financial and investment analysis, modern software products based on the latest scientific developments are introduced everywhere. At the same time, there is a growing demand for market research (both at the microeconomic and macroeconomic levels), for financial and general economic information.

  13. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    Science.gov (United States)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Électricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been in use since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where a large amount of human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical modeling of the empirical error of perfect forecasts, by streamflow sub-samples of quantile class and lead-time. The second method (called the dynamical approach) is based on streamflow sub-samples of quantile class, streamflow variation, and lead-time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of the hydrological ensemble, allowing a good improvement of the reliability, skill and sharpness of the ensemble forecasts.
    The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological
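
    A minimal sketch of the empirical approach as described — dressing a deterministic forecast with empirical errors drawn from past forecast/observation pairs of the same streamflow quantile class — might look as follows. The class edges, the relative-error formulation, and the quantile choices are illustrative assumptions, and the lead-time stratification used in the paper is omitted:

    ```python
    from bisect import bisect_right

    def dress_forecast(forecast, past_fcst, past_obs, class_edges,
                       quantiles=(0.1, 0.5, 0.9)):
        """Dress a deterministic streamflow forecast with empirical relative
        errors (obs/forecast) taken from past pairs in the same flow class."""
        cls = bisect_right(class_edges, forecast)      # flow class of the forecast
        errs = sorted(o / f for f, o in zip(past_fcst, past_obs)
                      if f > 0 and bisect_right(class_edges, f) == cls)
        if not errs:                                   # no archive in this class
            return [forecast] * len(quantiles)
        return [forecast * errs[min(int(q * len(errs)), len(errs) - 1)]
                for q in quantiles]
    ```

    Stratifying the error archive by flow class is the key idea: model errors on low flows and on floods behave differently, so each class is dressed with its own empirical error distribution.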

  14. Analysis of theoretical security level of PDF Encryption mechanism based on X.509 certificates

    Directory of Open Access Journals (Sweden)

    Joanna Dmitruk

    2017-12-01

    Full Text Available PDF Encryption is a content security mechanism developed and used by Adobe in their products. In this paper, we have checked the theoretical security level of a variant that uses public key infrastructure and X.509 certificates. We have described the basis of this mechanism and performed a simple security analysis. Then, we have shown possible tweaks and security improvements. Finally, we have given some recommendations that can improve the security of content secured with PDF Encryption based on X.509 certificates. Keywords: DRM, cryptography, security level, PDF Encryption, Adobe, X.509

  15. The theoretical model of the school-based prevention programme Unplugged.

    Science.gov (United States)

    Vadrucci, Serena; Vigna-Taglianti, Federica D; van der Kreeft, Peer; Vassara, Maro; Scatigna, Maria; Faggiano, Fabrizio; Burkhart, Gregor

    2016-12-01

    Unplugged is a school-based prevention programme designed and tested in the EU-Dap trial. The programme consists of 12 units delivered by class teachers to adolescents 12-14 years old. It is a strongly interactive programme including training of personal and social skills with a specific focus on normative beliefs. The aim of this work is to define the theoretical model of the programme, the contribution of the theories to the units, and the targeted mediators. The programme integrates several theories: Social Learning, Social Norms, Health Belief, the theory of Reasoned Action-Attitude, and Problem Behaviour theory. Every theory contributes to the development of the units' contents, with specific weights. Knowledge, risk perception, attitudes towards drugs, normative beliefs, critical and creative thinking, relationship skills, communication skills, assertiveness, refusal skills, the ability to manage emotions and to cope with stress, empathy, and problem-solving and decision-making skills are the targeted mediators of the programme. © The Author(s) 2015.

  16. A Theoretical Analysis of the Mission Statement Based on the Axiological Approach

    Directory of Open Access Journals (Sweden)

    Marius-Costel EŞI

    2016-12-01

    Full Text Available This work focuses on a theoretical analysis of the formulation of the mission statement of business organizations in relation to the idea of the organizational axiological core. On the one hand, we consider CSR (Corporate Social Responsibility), which, in our view, must be brought into direct connection both with moral entrepreneurship (which should support the philosophical perspective of the statement of a business organization's mission) and with purely economic, profit-maximizing entrepreneurship (which should support the pragmatic perspective). On the other hand, an analysis of the moral concepts which should underpin business becomes fundamental, in our view, insofar as the idea of the specific social value of social entrepreneurship is evidenced. Therefore, our approach highlights a number of epistemic explanations in relation to the actual practice dimension.

  17. ANTI-CRISIS MANAGEMENT IN CONTEXT OF ITS THEORETICAL AND METHODOLOGICAL RESEARCH BASES

    Directory of Open Access Journals (Sweden)

    A. V. Trapitsyn

    2011-01-01

    Full Text Available Effective economic management at the country and enterprise levels determines the survival of tens of thousands of Russian enterprises under market economy conditions at any time, but management on the basis of marketing elements gains particular importance during crises. Anti-crisis management is a complex preventive management model created and functioning to neutralize or mitigate crisis phenomena. Different anti-crisis management theoretical and practical concepts used in the world are discussed and compared, along with the management research approaches. The article describes the author's approach to anti-crisis enterprise management, based on the need for an enterprise to study the relations between certain marketing elements and project efficiency indices and to take them into account.

  18. Empirical Study of E-logistics System Based on Tibet Logistics Industry

    OpenAIRE

    Liu, Yu

    2013-01-01

    With the rapid growth of E-logistics in the global logistics industry, it is important to gain insight into E-logistics systems in the Chinese logistics industry. Regarding the current situation of E-logistics in the Chinese logistics industry, there are still many problems to be addressed and resolved. This paper reviews the concepts and theoretical background of E-logistics systems from previous research. After acknowledging the essential issues concerning E-logistics systems, a research model is designed...

  19. Development of efficient air-cooling strategies for lithium-ion battery module based on empirical heat source model

    International Nuclear Information System (INIS)

    Wang, Tao; Tseng, K.J.; Zhao, Jiyun

    2015-01-01

    Thermal modeling is the key issue in the thermal management of lithium-ion battery systems, and cooling strategies need to be carefully investigated to keep the temperature of batteries in operation within a narrow optimal range as well as to provide cost-effective and energy-saving solutions for the cooling system. This article reviews and summarizes past cooling methods, especially forced air cooling, and introduces an empirical heat source model which can be widely applied in battery module/pack thermal modeling. In the development of the empirical heat source model, the three-dimensional computational fluid dynamics (CFD) method is employed, and thermal insulation experiments are conducted to provide the key parameters. A transient thermal model of a 5 × 5 battery module with forced air cooling is then developed based on the empirical heat source model. Thermal behaviors of the battery module under different air cooling conditions, discharge rates and ambient temperatures are characterized and summarized. Various cooling strategies are simulated and compared in order to obtain an optimal cooling method. Besides, battery fault conditions are predicted from transient simulation scenarios. The temperature distributions and variations during the discharge process are quantitatively described, and it is found that the upper limit of ambient temperature for forced air cooling is 35 °C, and that when the ambient temperature is lower than 20 °C, forced air cooling is not necessary. - Highlights: • An empirical heat source model is developed for battery thermal modeling. • Different air-cooling strategies on module thermal characteristics are investigated. • Impact of different discharge rates on module thermal responses is investigated. • Impact of ambient temperatures on module thermal behaviors is investigated. • Locations of maximum temperatures under different operation conditions are studied.

  20. Transformation of an empirical distribution to normal distribution by the use of Johnson system of translation and symmetrical quantile method

    OpenAIRE

    Ludvík Friebel; Jana Friebelová

    2006-01-01

    This article deals with the approximation of an empirical distribution by the standard normal distribution using the Johnson transformation. This transformation enables us to approximate a wide spectrum of continuous distributions with a normal distribution. The estimation of the parameters of the transformation formulas is based on percentiles of the empirical distribution. Theoretical probability distribution functions of the random variable are derived on the basis of the backward transformation of the standard normal ...
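
    For the S_U member of the Johnson system, the forward and backward transformations have simple closed forms: z = γ + δ·asinh((x − ξ)/λ) and x = ξ + λ·sinh((z − γ)/δ). A minimal sketch follows; the parameter values in any call are illustrative, and in practice γ, δ, ξ, λ would be estimated from empirical percentiles as the article describes:

    ```python
    import math

    def johnson_su_to_normal(x, gamma, delta, xi, lam):
        """Forward Johnson S_U transform:
        z = gamma + delta * asinh((x - xi) / lam)."""
        return gamma + delta * math.asinh((x - xi) / lam)

    def johnson_su_from_normal(z, gamma, delta, xi, lam):
        """Backward transform, recovering the original variable from z."""
        return xi + lam * math.sinh((z - gamma) / delta)
    ```

    Because asinh is strictly monotone, the forward map preserves the ordering of the data, and the backward map inverts it exactly — which is what makes the backward transformation of the standard normal usable for reconstructing the empirical variable.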

  1. Theoretical and numerical studies on the transport of transverse beam quality in plasma-based accelerators

    International Nuclear Information System (INIS)

    Mehrling, Timon Johannes

    2014-11-01

    This work examines effects which impact the transverse quality of electron beams in plasma-based accelerators, by means of theoretical and numerical methods. Plasma-based acceleration is a promising candidate for future particle accelerator technologies. In plasma-based acceleration, highly intense laser beams or high-current relativistic particle beams are focused into a plasma to excite plasma waves with extreme transverse and longitudinal electric fields. The amplitudes of these fields, at 10-100 GV/m, exceed those in today's radio-frequency accelerators by several orders of magnitude, hence in principle allowing for accordingly shorter and cheaper accelerators based on plasma. Despite the tremendous progress in the recent decade, beams from plasma accelerators do not yet achieve the quality demanded for pivotal applications of relativistic electron beams, e.g. free-electron lasers (FELs). Studies within this work examine how the quality can be optimized in the production of the beams and preserved during the acceleration and transport to the interaction region. Such studies cannot be approached purely analytically but necessitate numerical methods, such as the Particle-In-Cell (PIC) method, which can model kinetic, electrodynamic and relativistic plasma phenomena. However, this method is computationally too expensive for parameter scans in three-dimensional geometries. Hence, a quasi-static PIC code was developed in connection with this work, which is significantly more effective than the full PIC method for a class of problems in plasma-based acceleration. The evolution of the emittance of beams which are injected into plasma modules was studied in this work by means of theoretical methods and the above numerical methods. It was shown that the beam parameters need to be matched accurately into the focusing plasma channel in order to allow for beam-quality preservation. This suggested that new extraction and injection techniques are required in staged plasma

  2. Theoretical Analysis on Mechanical Deformation of Membrane-Based Photomask Blanks

    Science.gov (United States)

    Marumoto, Kenji; Aya, Sunao; Yabe, Hedeki; Okada, Tatsunori; Sumitani, Hiroaki

    2012-04-01

    Membrane-based photomasks are used in proximity X-ray lithography, including in the LIGA (Lithographie, Galvanoformung und Abformung) process, and in near-field photolithography. In this article, the out-of-plane deformation (OPD) and in-plane displacement (IPD) of membrane-based photomask blanks are theoretically analyzed to obtain mask blanks with a flat front surface and a low-stress absorber film. First, we derived the equations of OPD and IPD for the processing steps of a membrane-based photomask, such as film deposition, back-etching and bonding, using the theory of symmetrical bending of circular plates with a coaxial circular hole and that of the deformation of a cylinder under hydrostatic pressure. The validity of the equations was proved by comparing the calculated results with experimental ones. Using these equations, we investigated the general relation between the geometry of the mask blanks and the distortions, and gave the criterion for attaining a flat front surface. Moreover, the absorber stress-bias required to obtain zero stress on finished mask blanks was also calculated, and it was found that only a small stress-bias is required for an adequate hole size of the support plate.

  3. Semi-empirical Calculation of Detection Efficiency for Voluminous Source Based on Effective Solid Angle Concept

    Energy Technology Data Exchange (ETDEWEB)

    Kang, M. Y.; Kim, J. H.; Choi, H. D.; Sun, G. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To calculate the full-energy (FE) absorption peak efficiency for arbitrary-volume samples, we developed and verified the Effective Solid Angle (ESA) code. The procedure for the semi-empirical determination of the FE efficiency for arbitrary-volume sources, as well as the calculation principles and processes of the ESA code, was described in previous studies, in which the code was validated with an HPGe detector (relative efficiency 32%, n-type). In this study, we use HPGe detectors of different types and efficiencies in order to verify the performance of the ESA code across various detectors. We calculated the efficiency curves of voluminous sources and compared them with experimental data. We will carry out additional validation by measuring CRM volume sources of various media, volumes and shapes with detectors of different efficiencies and types, and we will incorporate the effect of the dead layer of the p-type HPGe detector and a coincidence-summing correction technique in the near future.

  4. An empirical model of the Earth's bow shock based on an artificial neural network

    Science.gov (United States)

    Pallocchia, Giuseppe; Ambrosino, Danila; Trenchi, Lorenzo

    2014-05-01

    All of the past empirical models of the Earth's bow shock shape were obtained by best-fitting given surfaces to sets of observed crossings. However, the issue of bow shock modeling can also be addressed by means of artificial neural networks (ANNs). In this regard, we present a perceptron, a simple feedforward network, which computes the bow shock distance along a given direction from the two angular coordinates of that direction, the bow shock distance RF79 predicted by Formisano's model (F79), and the upstream Alfvénic Mach number Ma. After a brief description of the ANN architecture and training method, we discuss the results of a statistical comparison, performed over a test set of 1140 IMP8 crossings, between the prediction accuracies of the ANN and F79 models.
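The record does not include the trained network, but the architecture it describes (a small feedforward perceptron mapping two angles, the F79 distance and the Alfvénic Mach number to a shock distance) can be sketched as follows. The hidden-layer size, weights and biases below are placeholders for illustration, not the authors' fitted values.

```python
import numpy as np

def perceptron_bow_shock(theta, phi, r_f79, mach_a, W1, b1, w2, b2):
    """One-hidden-layer feedforward network mapping the four inputs
    named in the abstract (two angular coordinates, the F79 predicted
    distance, the Alfvenic Mach number) to a bow shock distance.
    Weights here are placeholders; the trained weights of the actual
    model are not published in this record."""
    x = np.array([theta, phi, r_f79, mach_a], dtype=float)
    h = np.tanh(W1 @ x + b1)      # hidden layer with tanh activation
    return float(w2 @ h + b2)     # linear output: distance (e.g., in R_E)

# Illustrative forward pass with random (untrained) weights
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 4)) * 0.1, np.zeros(8)
w2, b2 = rng.normal(size=8) * 0.1, 14.0  # bias near a typical standoff distance
d = perceptron_bow_shock(0.3, 1.2, 14.6, 8.5, W1, b1, w2, b2)
```

In the actual model the weights would be fitted to observed crossings; here the forward pass merely shows the input-output structure.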

  5. Global optimization based on noisy evaluations: An empirical study of two statistical approaches

    International Nuclear Information System (INIS)

    Vazquez, Emmanuel; Villemonteix, Julien; Sidorkiewicz, Maryan; Walter, Eric

    2008-01-01

    The optimization of the output of complex computer codes often has to be achieved with a small budget of evaluations. Algorithms dedicated to such problems have been developed and compared, such as the Expected Improvement (EI) algorithm and the Informational Approach to Global Optimization (IAGO). However, the influence of noisy evaluation results on the outcome of these comparisons has often been neglected, despite its frequent occurrence in industrial problems. In this paper, empirical convergence rates for EI and IAGO are compared when additive noise corrupts the result of an evaluation. IAGO appears more efficient than EI and various modifications of EI designed to deal with noisy evaluations. Keywords: global optimization; computer simulations; kriging; Gaussian process; noisy evaluations.
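For reference, the EI acquisition compared in this study has a standard closed form under a Gaussian (kriging) posterior. A minimal sketch for minimization with a noise-free incumbent f_min is below; it is the textbook formula, not the paper's noise-adapted variants.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """Expected Improvement for minimization under a Gaussian posterior
    with mean mu and standard deviation sigma at a candidate point:
    EI = (f_min - mu) * Phi(z) + sigma * phi(z), z = (f_min - mu)/sigma.
    Where sigma == 0, EI degenerates to max(f_min - mu, 0)."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    safe = np.where(sigma > 0, sigma, 1.0)            # avoid division by zero
    z = np.where(sigma > 0, (f_min - mu) / safe, 0.0)
    ei = (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    return np.where(sigma > 0, ei, np.maximum(f_min - mu, 0.0))

# At the incumbent (mu = f_min) EI reduces to sigma * phi(0)
ei_at_incumbent = float(expected_improvement(0.0, 1.0, 0.0))
```

The comparison in the paper concerns how such acquisitions behave when each evaluation of f is itself corrupted by additive noise, which this sketch does not model.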

  6. A DISTANCE EDUCATION MODEL FOR JORDANIAN STUDENTS BASED ON AN EMPIRICAL STUDY

    Directory of Open Access Journals (Sweden)

    Ahmad SHAHER MASHHOUR

    2007-04-01

    Distance education is expanding worldwide, and the numbers of students enrolled in it are increasing at very high rates. Distance education is said to be the future of education because it addresses the educational needs of the new millennium. This paper presents the findings of an empirical study of a sample of Jordanian distance education students, distilled into a requirements model that addresses the needs of such education at the national level. The responses of the sample show that distance education offers a viable and satisfactory alternative for those who cannot enroll in regular residential education. The study also shows that the shortcomings of regular education and of the current form of distance education in Jordan can be overcome by the use of modern information technology.

  7. Inferring causal molecular networks: empirical assessment through a community-based effort.

    Science.gov (United States)

    Hill, Steven M; Heiser, Laura M; Cokelaer, Thomas; Unger, Michael; Nesser, Nicole K; Carlin, Daniel E; Zhang, Yang; Sokolov, Artem; Paull, Evan O; Wong, Chris K; Graim, Kiley; Bivol, Adrian; Wang, Haizhou; Zhu, Fan; Afsari, Bahman; Danilova, Ludmila V; Favorov, Alexander V; Lee, Wai Shing; Taylor, Dane; Hu, Chenyue W; Long, Byron L; Noren, David P; Bisberg, Alexander J; Mills, Gordon B; Gray, Joe W; Kellen, Michael; Norman, Thea; Friend, Stephen; Qutub, Amina A; Fertig, Elana J; Guan, Yuanfang; Song, Mingzhou; Stuart, Joshua M; Spellman, Paul T; Koeppl, Heinz; Stolovitzky, Gustavo; Saez-Rodriguez, Julio; Mukherjee, Sach

    2016-04-01

    It remains unclear whether causal, rather than merely correlational, relationships in molecular networks can be inferred in complex biological settings. Here we describe the HPN-DREAM network inference challenge, which focused on learning causal influences in signaling networks. We used phosphoprotein data from cancer cell lines as well as in silico data from a nonlinear dynamical model. Using the phosphoprotein data, we scored more than 2,000 networks submitted by challenge participants. The networks spanned 32 biological contexts and were scored in terms of causal validity with respect to unseen interventional data. A number of approaches were effective, and incorporating known biology was generally advantageous. Additional sub-challenges considered time-course prediction and visualization. Our results suggest that learning causal relationships may be feasible in complex settings such as disease states. Furthermore, our scoring approach provides a practical way to empirically assess inferred molecular networks in a causal sense.

  8. A Meta-Analysis of Empirically Tested School-Based Dating Violence Prevention Programs

    Directory of Open Access Journals (Sweden)

    Sarah R. Edwards

    2014-05-01

    Teen dating violence prevention programs implemented in schools and empirically tested were subjected to meta-analysis. Eight studies met the criteria for inclusion, consisting of both within- and between-subjects designs. Overall, the weighted mean effect size (ES) across studies was significant, ESr = .11, 95% confidence interval (CI) [.08, .15], p < .0001, showing an overall positive effect of the studied prevention programs. However, 25% of the studies showed an effect in the negative direction, meaning students appeared to be more supportive of dating violence after participating in a dating violence prevention program. This heightens the need for thorough program evaluation, as well as for decision makers to have access to data about the effectiveness of the programs they are considering implementing. Further implications of the results and recommendations for future research are discussed.
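For readers unfamiliar with the weighted mean effect size reported above, a minimal fixed-effect aggregation of correlation-type effect sizes via Fisher's z can be sketched as follows. The (r, n) pairs are hypothetical illustrations, not the eight meta-analysed studies.

```python
import math

def fixed_effect_mean_r(effect_rs, ns):
    """Fixed-effect weighted mean correlation via Fisher's z:
    z_i = atanh(r_i), weights w_i = n_i - 3; the weighted mean z is
    back-transformed with tanh, and a 95% CI follows from the
    standard error 1/sqrt(sum of weights)."""
    zs = [math.atanh(r) for r in effect_rs]
    ws = [n - 3 for n in ns]
    zbar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    se = 1.0 / math.sqrt(sum(ws))
    lo, hi = math.tanh(zbar - 1.96 * se), math.tanh(zbar + 1.96 * se)
    return math.tanh(zbar), (lo, hi)

# Hypothetical studies (r, n) -- illustrative only
mean_r, ci = fixed_effect_mean_r([0.15, 0.08, -0.05, 0.12], [120, 200, 90, 150])
```

A full meta-analysis would also test heterogeneity and possibly use a random-effects model, which this sketch omits.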

  9. Use of empirically based corrosion model to aid steam generator life management

    Energy Technology Data Exchange (ETDEWEB)

    Angell, P.; Balakrishnan, P.V.; Turner, C.W

    2000-07-01

    Alloy 800 (N08800) tubes used in CANDU 6 steam generators have shown a low incidence of corrosion damage because of the good corrosion resistance of N08800 and successful water chemistry control strategies. However, N08800 is not immune to corrosion, especially pitting, under plausible SG conditions. Electrochemical potentials are critical in determining both susceptibility to and rates of corrosion, and are known to be a function of water chemistry. Using laboratory data, an empirical model for pitting and crevice corrosion has been developed for N08800. Combining such a model with chemistry monitoring and diagnostic software makes it possible to assess the impact of plant operating conditions on SG tube corrosion for plant life management (PLIM). Possible transient chemistry regimes that could significantly shorten expected tube lifetimes have been identified, and predictions continue to support the position that under normal, low dissolved-oxygen conditions, pitting of N08800 will not initiate. (author)

  10. Bearing Fault Detection Based on Empirical Wavelet Transform and Correlated Kurtosis by Acoustic Emission.

    Science.gov (United States)

    Gao, Zheyu; Lin, Jing; Wang, Xiufeng; Xu, Xiaoqiang

    2017-05-24

    Rolling bearings are widely used in rotating equipment, and the detection of bearing faults is of great importance to guarantee the safe operation of mechanical systems. Acoustic emission (AE), as one of the bearing monitoring technologies, is sensitive to weak signals and performs well in detecting incipient faults; it is therefore widely used for monitoring the operating status of rolling bearings. This paper utilizes the Empirical Wavelet Transform (EWT) to adaptively decompose AE signals into mono-components, followed by calculation of the correlated kurtosis (CK) of these components at certain time intervals. By comparing these CK values, the resonant frequency of the rolling bearing can be determined; the fault characteristic frequencies are then found by spectral envelope analysis. Both a simulated signal and rolling bearing AE signals are used to verify the effectiveness of the proposed method. The results show that the new method performs well in identifying the bearing fault frequency under strong background noise.
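A compact sketch of the final stage of such an analysis: demodulate the selected resonance band with the Hilbert envelope, then take the spectrum of the envelope to reveal the fault characteristic frequency. Here a fixed Butterworth bandpass stands in for the adaptive EWT/CK band selection, and the AE signal is synthetic.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def envelope_spectrum(x, fs, band):
    """Bandpass around an assumed resonance band, take the Hilbert
    envelope, and return the spectrum of the demeaned envelope.
    EWT would select the band adaptively; a fixed Butterworth
    bandpass stands in for it in this sketch."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    env = np.abs(hilbert(filtfilt(b, a, x)))
    env -= env.mean()
    freqs = np.fft.rfftfreq(len(env), 1 / fs)
    return freqs, np.abs(np.fft.rfft(env))

# Synthetic AE-like signal: 100 Hz impacts exciting a 3 kHz resonance + noise
fs, f_fault, f_res = 20000, 100, 3000
t = np.arange(fs) / fs
impulses = (np.arange(fs) % (fs // f_fault) == 0).astype(float)
ring = np.exp(-400 * t[:200]) * np.sin(2 * np.pi * f_res * t[:200])
x = np.convolve(impulses, ring, mode="same")
x += 0.1 * np.random.default_rng(1).normal(size=fs)
freqs, mag = envelope_spectrum(x, fs, (2000, 4000))
peak = freqs[1 + np.argmax(mag[1:])]   # skip the DC bin
```

On this synthetic signal the dominant envelope-spectrum peak falls at the simulated fault repetition rate of 100 Hz.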

  11. Use of empirically based corrosion model to aid steam generator life management

    International Nuclear Information System (INIS)

    Angell, P.; Balakrishnan, P.V.; Turner, C.W.

    2000-01-01

    Alloy 800 (N08800) tubes used in CANDU 6 steam generators have shown a low incidence of corrosion damage because of the good corrosion resistance of N08800 and successful water chemistry control strategies. However, N08800 is not immune to corrosion, especially pitting, under plausible SG conditions. Electrochemical potentials are critical in determining both susceptibility to and rates of corrosion, and are known to be a function of water chemistry. Using laboratory data, an empirical model for pitting and crevice corrosion has been developed for N08800. Combining such a model with chemistry monitoring and diagnostic software makes it possible to assess the impact of plant operating conditions on SG tube corrosion for plant life management (PLIM). Possible transient chemistry regimes that could significantly shorten expected tube lifetimes have been identified, and predictions continue to support the position that under normal, low dissolved-oxygen conditions, pitting of N08800 will not initiate. (author)

  12. Theoretical analysis of transcranial Hall-effect stimulation based on passive cable model

    International Nuclear Information System (INIS)

    Yuan Yi; Li Xiao-Li

    2015-01-01

    Transcranial Hall-effect stimulation (THS) is a new stimulation method in which an ultrasonic wave in a static magnetic field generates an electric field in an area of interest, such as in the brain, to modulate neuronal activities. However, the biophysical basis of stimulating neurons this way remains unknown. To address this problem, we perform a theoretical analysis based on a passive cable model to investigate the THS mechanism in neurons. Nerve tissues are conductive; in a static magnetic field, an ultrasonic wave can move ions embedded in the tissue to generate an electric field (due to the Lorentz force). In this study, a simulation model for the ultrasonically induced electric field in a static magnetic field is derived. Then, based on the passive cable model, the analytical solution for the voltage distribution in nerve tissue is determined. The simulation results show that THS can generate a voltage sufficient to stimulate neurons. Because the THS method possesses a higher spatial resolution and a deeper penetration depth, it shows promise as a tool for treating or rehabilitating neuropsychiatric disorders. (paper)
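The paper's analytical solution is not reproduced in this record. As background for voltage-distribution analyses of this kind, the steady-state attenuation predicted by a passive cable model can be sketched as below; the parameter values are illustrative assumptions, not the paper's.

```python
import math

def cable_voltage(x, v0, r_m, r_i):
    """Steady-state solution of the passive cable equation for an
    infinite cable with voltage V0 imposed at x = 0:
    V(x) = V0 * exp(-|x| / lam), with space constant
    lam = sqrt(r_m / r_i), where r_m is the membrane resistance
    (ohm*cm) and r_i the axial resistance per unit length (ohm/cm)."""
    lam = math.sqrt(r_m / r_i)
    return v0 * math.exp(-abs(x) / lam)

# Illustrative parameters (assumed, not from the paper):
# r_m = 1.6e5 ohm*cm, r_i = 1.0e6 ohm/cm  ->  lam = 0.4 cm
lam = math.sqrt(1.6e5 / 1.0e6)
v_at_lambda = cable_voltage(lam, 10.0, 1.6e5, 1.0e6)  # decays to V0/e
```

One space constant away from the source the voltage has fallen to 1/e of its initial value, which is the basic attenuation behaviour such cable analyses build on.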

  13. Tetraphenylpyrimidine-Based AIEgens: Facile Preparation, Theoretical Investigation and Practical Application

    Directory of Open Access Journals (Sweden)

    Junkai Liu

    2017-10-01

    Aggregation-induced emission (AIE) has become a hot research area and tremendous numbers of AIE-active luminogens (AIEgens) have been generated. To further promote the development of AIE, new AIEgens are highly desirable. Herein, new AIEgens based on tetraphenylpyrimidine (TPPM) are rationally designed according to the AIE mechanism of restriction of intramolecular motion, and facilely prepared under mild reaction conditions. The photophysical properties of the resulting TPPM, TPPM-4M and TPPM-4P are systematically investigated, and the results show that they feature aggregation-enhanced emission (AEE) characteristics. A theoretical study shows that the high-frequency bending vibrations in the central pyrimidine ring of the TPPM derivatives dominate the nonradiative decay channels. Thanks to the AEE feature, their aggregates can be used to detect explosives with super-amplification quenching effects, and their sensing ability is higher than that of typical AIE-active tetraphenylethene. It is anticipated that TPPM derivatives could serve as a new type of widely used AIEgen owing to their facile preparation and good thermo-, photo- and chemostability.

  14. The NIHR Collaborations for Leadership in Applied Health Research and Care (CLAHRC) for Greater Manchester: combining empirical, theoretical and experiential evidence to design and evaluate a large-scale implementation strategy.

    Science.gov (United States)

    Harvey, Gill; Fitzgerald, Louise; Fielden, Sandra; McBride, Anne; Waterman, Heather; Bamford, David; Kislov, Roman; Boaden, Ruth

    2011-08-23

    In response to policy recommendations, nine National Institute for Health Research (NIHR) Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) were established in England in 2008, aiming to create closer working between the health service and higher education and narrow the gap between research and its implementation in practice. The Greater Manchester (GM) CLAHRC is a partnership between the University of Manchester and twenty National Health Service (NHS) trusts, with a five-year mission to improve healthcare and reduce health inequalities for people with cardiovascular conditions. This paper outlines the GM CLAHRC approach to designing and evaluating a large-scale, evidence- and theory-informed, context-sensitive implementation programme. The paper makes a case for embedding evaluation within the design of the implementation strategy. Empirical, theoretical, and experiential evidence relating to implementation science and methods has been synthesised to formulate eight core principles of the GM CLAHRC implementation strategy, recognising the multi-faceted nature of evidence, the complexity of the implementation process, and the corresponding need to apply approaches that are situationally relevant, responsive, flexible, and collaborative. In turn, these core principles inform the selection of four interrelated building blocks upon which the GM CLAHRC approach to implementation is founded. These determine the organizational processes, structures, and roles utilised by specific GM CLAHRC implementation projects, as well as the approach to researching implementation, and comprise: the Promoting Action on Research Implementation in Health Services (PARIHS) framework; a modified version of the Model for Improvement; multiprofessional teams with designated roles to lead, facilitate, and support the implementation process; and embedded evaluation and learning. 
Designing and evaluating a large-scale implementation strategy that can cope with and

  15. What 'empirical turn in bioethics'?

    Science.gov (United States)

    Hurst, Samia

    2010-10-01

    Uncertainty as to how we should articulate empirical data and normative reasoning seems to underlie most difficulties regarding the 'empirical turn' in bioethics. This article examines three different ways in which we could understand 'empirical turn'. Using real facts in normative reasoning is trivial and would not represent a 'turn'. Becoming an empirical discipline through a shift to the social and neurosciences would be a turn away from normative thinking, which we should not take. Conducting empirical research to inform normative reasoning is the usual meaning given to the term 'empirical turn'. In this sense, however, the turn is incomplete. Bioethics has imported methodological tools from empirical disciplines, but too often it has not imported the standards to which researchers in these disciplines are held. Integrating empirical and normative approaches also represents true added difficulties. Addressing these issues from the standpoint of debates on the fact-value distinction can cloud very real methodological concerns by displacing the debate to a level of abstraction where they need not be apparent. Ideally, empirical research in bioethics should meet standards for empirical and normative validity similar to those used in the source disciplines for these methods, and articulate these aspects clearly and appropriately. More modestly, criteria to ensure that none of these standards are completely left aside would improve the quality of empirical bioethics research and partly clear the air of critiques addressing its theoretical justification, when its rigour in the particularly difficult context of interdisciplinarity is what should be at stake.

  16. Do modified audit opinions have economic consequences? Empirical evidence based on financial constraints

    Directory of Open Access Journals (Sweden)

    Zhiwei Lin

    2011-09-01

    We present a framework and empirical evidence to explain why, on average, 11% of listed firms in China received modified audit opinions (MAOs between 1992 and 2009. We argue that there are two reasons for this phenomenon: strong earnings management incentives lower firms’ financial reporting quality and soft budget constraints weaken the information and governance roles of audit opinions. We find that firms’ financial constraints eased after receiving MAOs, which suggests that MAOs have limited economic consequences. Further analysis shows that this phenomenon predominantly exists in government-controlled firms and firms that receive MAOs for the first time. We also find that MAOs have not influenced financial constraints after 2006. Finally, we find that MAOs did not affect borrowing cash flows from banks until 2005, suggesting that MAOs did not start affecting bank financing until that year. We also find that firms receive more related-party financing after receiving MAOs. Our results indicate that a limited effect on bank financing and increased related-party financing reduce the effect of MAOs on financial constraints.

  17. Optimizing targeted vaccination across cyber-physical networks: an empirically based mathematical simulation study.

    Science.gov (United States)

    Mones, Enys; Stopczynski, Arkadiusz; Pentland, Alex 'Sandy'; Hupert, Nathaniel; Lehmann, Sune

    2018-01-01

    Targeted vaccination, whether to minimize the forward transmission of infectious diseases or their clinical impact, is one of the 'holy grails' of modern infectious disease outbreak response, yet it is difficult to achieve in practice due to the challenge of identifying optimal targets in real time. If interruption of disease transmission is the goal, targeting requires knowledge of underlying person-to-person contact networks. Digital communication networks may reflect not only virtual but also physical interactions that could result in disease transmission, but the precise overlap between these cyber and physical networks has never been empirically explored in real-life settings. Here, we study the digital communication activity of more than 500 individuals along with their person-to-person contacts at a 5-min temporal resolution. We then simulate different disease transmission scenarios on the person-to-person physical contact network to determine whether cyber communication networks can be harnessed to advance the goal of targeted vaccination for a disease spreading on the network of physical proximity. We show that individuals selected on the basis of their closeness centrality within cyber networks (what we call 'cyber-directed vaccination') can enhance vaccination campaigns against diseases with short-range (but not full-range) modes of transmission. © 2018 The Author(s).
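A minimal sketch of the 'cyber-directed' selection step described above: rank nodes of the communication graph by closeness centrality and choose the top-k as vaccination targets. The breadth-first implementation below is generic, and the toy star graph stands in for the empirical contact data.

```python
from collections import deque

def closeness(adj):
    """Closeness centrality on an unweighted graph given as
    {node: set(neighbors)}: the inverse of the mean shortest-path
    distance from a node to all other reachable nodes."""
    scores = {}
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:                       # breadth-first search from s
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total = sum(dist.values())
        scores[s] = (len(dist) - 1) / total if total else 0.0
    return scores

def cyber_directed_targets(adj, k):
    """Top-k nodes by closeness in the communication graph."""
    cc = closeness(adj)
    return sorted(cc, key=cc.get, reverse=True)[:k]

# Toy star graph: hub 0 linked to nodes 1..5; the hub is most central
adj = {0: {1, 2, 3, 4, 5}}
for i in range(1, 6):
    adj[i] = {0}
targets = cyber_directed_targets(adj, 2)
```

In the study this ranking is computed on the cyber (communication) network while the simulated epidemic spreads on the separate physical-proximity network; the sketch only shows the ranking step.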

  18. An empirical study on the relationship of purchasing a chocolate based on its packaging

    Directory of Open Access Journals (Sweden)

    Yasaman Giyahi

    2012-04-01

    Chocolate is a popular gift in many societies, and its packaging plays an important role in marketing it; the primary question of this survey is to determine the impact of packaging on how well the product presents itself. Inferential statistical tests show that packaging is an important factor in the selection of chocolate as a gift: the percentage of chocolate is the most important piece of information on the packaging, and the color of the packaging is of paramount significance when customers purchase chocolate for individuals with whom they have a formal relationship. In this paper, we present an empirical study measuring the effect of chocolate packaging on purchase decisions. The study designs a questionnaire and distributes it among different people; the results are analyzed using non-parametric tests and discussed. The preliminary results indicate that the number of packages purchased within a year, the cost of chocolate purchased within a year, the type of relationship with the gift's recipient, the recipient's gender and age group, the type of store, the chocolate's country of origin, the significance of packaging in various price ranges, the type of packaging, the information printed on the package, and the color of the packaging are all important factors influencing people to buy more.

  19. The Needs of Victims: An Empirical Categorization Based on Interpersonal Conflicts

    Directory of Open Access Journals (Sweden)

    Johanna Kirchhoff

    2013-09-01

    As a consequence of interpersonal conflicts, the needs of the victimized are violated, and these needs have to be addressed in order to achieve reconciliation. Given the heterogeneity of need categories in scholarly research, we scrutinized which need categories can be empirically identified. 478 participants reported on an experienced interpersonal conflict and responded to 109 items evaluating the perceived need violation for the conflict they reported on. By means of exploratory factor analysis on a random sub-sample (n1 = 239), six need categories were extracted: the need for respect, the need for meaning, the need for acceptance, the need for pleasure, the need for self-efficacy, and the need for safety. Confirmatory factor analyses showed that these needs replicated in the second random sub-sample (n2 = 239) as well as across sub-samples of people who had experienced an interpersonal conflict of lower (nA = 257) or higher (nB = 221) severity of transgression. In addition, each of the need categories mediated the relationship between the severity of transgression and the desire for revenge. However, the results for the two need categories "pleasure" and "safety" have to be interpreted with caution due to a lack of scalar invariance. Among the other four need categories, respect was identified as the only independent mediator variable. Implications for the transformation of interpersonal conflict and further scholarly inquiry are discussed.

  20. Public University Students' Expectations: An Empirical Study Based on the Stakeholders Theory

    Directory of Open Access Journals (Sweden)

    Emerson Wagner MAINARDES

    2012-02-01

    In accordance with the importance that the student stakeholder represents to universities, the objective of this research project was to identify and classify the leading expectations of students at public universities. To achieve this, the study adopted both the premises of Stakeholder Theory and the approaches of earlier studies on the management of university stakeholders. This empirical study began with an exploratory study of students at one university to identify their expectations, resulting in a list of twenty-five confirmed expectations. This provided the basis for a subsequent quantitative study of students attending eleven Portuguese public universities. Through an online questionnaire, we obtained 1,669 correctly completed surveys that provided the input for data analysis using descriptive statistics and multiple linear regression. Our findings show that the most important student expectations are the academic level of demand, the university's connections with the employment market, the student's personal self-fulfillment and the prevailing university environment. According to students, these expectations should receive priority attention from university managers, since students consider them the most relevant aspects of the relationship between the student and the university.

  1. Combining Empirical Relationships with Data Based Mechanistic Modeling to Inform Solute Tracer Investigations across Stream Orders

    Science.gov (United States)

    Herrington, C.; Gonzalez-Pinzon, R.; Covino, T. P.; Mortensen, J.

    2015-12-01

    Solute transport studies in streams and rivers often begin with the introduction of conservative and reactive tracers into the water column. Information on the transport of these substances is then captured within tracer breakthrough curves (BTCs) and used to estimate, for instance, travel times and dissolved nutrient and carbon dynamics. Traditionally, these investigations have been limited to systems with small discharges, owing in larger systems to high turbidity (e.g., interfering with nitrate signals from SUNA instruments or fluorescence measurements) and/or high total dissolved solids (e.g., making the use of salt tracers such as NaCl prohibitively expensive). Additionally, a successful time-of-travel study is valuable for only a single discharge and river stage. We have developed a method to predict tracer BTCs to inform sampling frequencies at small and large stream orders, using empirical relationships developed from multiple tracer injections spanning several orders of magnitude in discharge and reach length. This method was successfully tested in 1st- to 8th-order systems along the Middle Rio Grande River Basin in New Mexico, USA.

  2. Information-theoretic discrepancy based iterative reconstructions (IDIR) for polychromatic x-ray tomography

    International Nuclear Information System (INIS)

    Jang, Kwang Eun; Lee, Jongha; Sung, Younghun; Lee, SeongDeok

    2013-01-01

    Purpose: X-ray photons generated from a typical x-ray source for clinical applications exhibit a broad range of wavelengths, and the interactions between individual particles and biological substances depend on particles' energy levels. Most existing reconstruction methods for transmission tomography, however, neglect this polychromatic nature of measurements and rely on the monochromatic approximation. In this study, we developed a new family of iterative methods that incorporates the exact polychromatic model into tomographic image recovery, which improves the accuracy and quality of reconstruction.Methods: The generalized information-theoretic discrepancy (GID) was employed as a new metric for quantifying the distance between the measured and synthetic data. By using special features of the GID, the objective function for polychromatic reconstruction which contains a double integral over the wavelength and the trajectory of incident x-rays was simplified to a paraboloidal form without using the monochromatic approximation. More specifically, the original GID was replaced with a surrogate function with two auxiliary, energy-dependent variables. Subsequently, the alternating minimization technique was applied to solve the double minimization problem. Based on the optimization transfer principle, the objective function was further simplified to the paraboloidal equation, which leads to a closed-form update formula. Numerical experiments on the beam-hardening correction and material-selective reconstruction were conducted to compare and assess the performance of conventional methods and the proposed algorithms.Results: The authors found that the GID determines the distance between its two arguments in a flexible manner. In this study, three groups of GIDs with distinct data representations were considered. The authors demonstrated that one type of GIDs that comprises “raw” data can be viewed as an extension of existing statistical reconstructions; under a

  3. Empirical component model to predict the overall performance of heating coils: Calibrations and tests based on manufacturer catalogue data

    International Nuclear Information System (INIS)

    Ruivo, Celestino R.; Angrisani, Giovanni

    2015-01-01

    Highlights: • An empirical model for predicting the performance of heating coils is presented. • Low and high heating capacity cases are used for calibration. • Versions based on several effectiveness correlations are tested. • Catalogue data are considered in approach testing. • The approach is a suitable component model for use in dynamic simulation tools. - Abstract: A simplified methodology for predicting the overall behaviour of heating coils is presented in this paper. The coil performance is predicted by the ε-NTU method. Manufacturers usually do not provide information about the overall thermal resistance or the geometric details that are required either for device selection or for applying known empirical correlations to estimate the thermal resistances involved. In the present work, heating capacity tables from the manufacturer's catalogue are used to calibrate simplified approaches based on the classical theory of heat exchangers, namely the effectiveness method. Only two reference operating cases are required to calibrate each approach. The validity of the simplified approaches is investigated for a relatively high number of operating cases listed in the technical catalogue of one manufacturer, covering four types of coils in three sizes of air handling units. A comparison is conducted between the heating coil capacities predicted by the methodology and the values given by the manufacturer's catalogue. The results show that several of the proposed approaches are suitable component models for integration into dynamic simulation tools of air conditioning systems such as TRNSYS or EnergyPlus
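As a sketch of the effectiveness (ε-NTU) method the abstract builds on, the example below computes a coil heating duty from an assumed NTU using the standard crossflow correlation for both fluids unmixed. The NTU, flow heat-capacity rates and inlet temperatures are illustrative assumptions, not calibrated catalogue data.

```python
import math

def effectiveness_crossflow(ntu, cr):
    """epsilon-NTU effectiveness for a crossflow heat exchanger with
    both fluids unmixed (standard approximate correlation);
    cr = Cmin / Cmax is the heat-capacity-rate ratio."""
    if cr == 0.0:                     # one fluid condensing/evaporating
        return 1.0 - math.exp(-ntu)
    return 1.0 - math.exp((1.0 / cr) * ntu**0.22
                          * (math.exp(-cr * ntu**0.78) - 1.0))

def coil_heat_rate(ntu, c_air, c_water, t_water_in, t_air_in):
    """Heating coil duty from the effectiveness method:
    Q = eps * Cmin * (T_water_in - T_air_in)."""
    cmin, cmax = min(c_air, c_water), max(c_air, c_water)
    eps = effectiveness_crossflow(ntu, cmin / cmax)
    return eps * cmin * (t_water_in - t_air_in)

# Hypothetical coil: air-side 1.2 kW/K, water-side 2.0 kW/K, NTU = 1.5
q = coil_heat_rate(1.5, 1.2, 2.0, 80.0, 20.0)   # heating duty in kW
```

In the paper's approach, the NTU (i.e., the overall thermal resistance) is the quantity calibrated from two catalogue operating cases; here it is simply assumed.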

  4. Theoretical justification of space-mapping-based modeling utilizing a database and on-demand parameter extraction

    DEFF Research Database (Denmark)

    Koziel, Slawomir; Bandler, John W.; Madsen, Kaj

    2006-01-01

    the surrogate, we perform parameter extraction with weighting coefficients dependent on the distance between the point of interest and base points. We provide theoretical results showing that the new methodology can assure any accuracy that is required (provided the base set is dense enough), which...

  5. Theoretical Conversions of Different Hardness and Tensile Strength for Ductile Materials Based on Stress-Strain Curves

    Science.gov (United States)

    Chen, Hui; Cai, Li-Xun

    2018-04-01

    Based on the power-law stress-strain relation and the equivalent energy principle, theoretical equations for converting between Brinell hardness (HB), Rockwell hardness (HR), and Vickers hardness (HV) were established. Combining these with the pre-existing relation between the tensile strength (σb) and the Hollomon parameters (K, N), theoretical conversions between hardness (HB/HR/HV) and tensile strength (σb) were obtained as well. In addition, to confirm the pre-existing σb-(K, N) relation, a large number of uniaxial tensile tests were conducted on various ductile materials. Finally, to verify the theoretical conversions, extensive statistical data listed in ASTM and ISO standards were used to test the robustness of the conversion equations across a range of hardness and tensile strength values. The results show that both the hardness conversions and the hardness-strength conversions calculated from the theoretical equations accord well with the standard data.
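The σb-(K, N) link can be illustrated with the classical Considère construction (necking occurs at true strain ε = N for a Hollomon material), which may differ in detail from the paper's exact relation:

```python
import math

def uts_from_hollomon(K, N):
    """Engineering ultimate tensile strength for the Hollomon law
    sigma_true = K * eps_true**N, via the Considere criterion:
    necking starts at true strain eps_true = N."""
    sigma_true_at_necking = K * N ** N
    # engineering stress = true stress / exp(true strain)
    return sigma_true_at_necking * math.exp(-N)
```

For K = 700 MPa and N = 0.2 this gives roughly 415 MPa; as N approaches zero (no hardening), the UTS approaches K, as expected.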

  6. Grand Canonical adaptive resolution simulation for molecules with electrons: A theoretical framework based on physical consistency

    Science.gov (United States)

    Delle Site, Luigi

    2018-01-01

    A theoretical scheme for the treatment of an open molecular system with electrons and nuclei is proposed. The idea is based on the Grand Canonical description of a quantum region embedded in a classical reservoir of molecules. Electronic properties of the quantum region are calculated at constant electronic chemical potential, equal to that of the corresponding (large) bulk system treated at the full quantum level. Instead, the exchange of molecules between the quantum region and the classical environment occurs at the chemical potential of the macroscopic thermodynamic conditions. The Grand Canonical Adaptive Resolution Scheme is proposed for the treatment of the classical environment; this approach can treat the exchange of molecules according to the first principles of statistical mechanics and thermodynamics. The overall scheme is built on the basis of physical consistency, with corresponding numerical criteria for controlling the approximations implied by the coupling. Given the wide range of expertise required, this work aims to provide guiding principles for the construction of a well-founded computational protocol for actual multiscale simulations from the electronic to the mesoscopic scale.

  7. Experimental and theoretical investigation of vibrational spectra of coordination polymers based on TCE-TTF.

    Science.gov (United States)

    Olejniczak, Iwona; Lapiński, Andrzej; Swietlik, Roman; Olivier, Jean; Golhen, Stéphane; Ouahab, Lahcène

    2011-08-01

    The room-temperature infrared and Raman spectra of a series of four isostructural polymeric salts of 2,3,6,7-tetrakis(2-cyanoethylthio)-tetrathiafulvalene (TCE-TTF) with paramagnetic (Co(II), Mn(II)) and diamagnetic (Zn(II), Cd(II)) ions, together with BF(4)(-) or ClO(4)(-) anions are reported. Infrared and Raman-active modes are identified and assigned based on theoretical calculations for neutral and ionized TCE-TTF using density functional theory (DFT) methods. It is confirmed that the TCE-TTF molecules in all the materials investigated are fully ionized and interact in the crystal structure through cyanoethylthio groups. The vibrational modes related to the C=C stretching vibrations of TCE-TTF are analyzed assuming the occurrence of electron-molecular vibration coupling (EMV). The presence of the antisymmetric C=C dimeric mode provides evidence that charge transfer takes place between TCE-TTF molecules belonging to neighboring polymeric networks. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Triphenylamine-based fluorescent NLO phores with ICT characteristics: Solvatochromic and theoretical study

    Science.gov (United States)

    Katariya, Santosh B.; Patil, Dinesh; Rhyman, Lydia; Alswaidan, Ibrahim A.; Ramasami, Ponnadurai; Sekar, Nagaiyan

    2017-12-01

    The static first and second hyperpolarizabilities and their related properties were calculated for triphenylamine-based "push-pull" dyes using the B3LYP, CAM-B3LYP and BHHLYP functionals in conjunction with the 6-311+G(d,p) basis set. The electronic coupling for the electron transfer reaction of the dyes was calculated with the generalized Mulliken-Hush method. The results obtained were correlated with the polarizability parameter αCT, the first hyperpolarizability parameter βCT, and the solvatochromic descriptor ⟨γ⟩SD obtained by the solvatochromic method. The dyes studied show a high total first-order hyperpolarizability (70-238 times) and second-order hyperpolarizability (412-778 times) compared to urea. Among the three functionals, CAM-B3LYP and BHHLYP give hyperpolarizability values closer to the experimental ones. Experimental absorption and emission wavelengths measured for all the synthesized dyes are in good agreement with those predicted using time-dependent density functional theory. The theoretical examination of the non-linear optical properties focused on the key parameters of polarizability and hyperpolarizability. A remarkable increase in non-linear optical response is observed on insertion of a benzothiazole unit compared to a benzimidazole unit.

  9. Margins of freedom: a field-theoretic approach to class-based health dispositions and practices.

    Science.gov (United States)

    Burnett, Patrick John; Veenstra, Gerry

    2017-09-01

    Pierre Bourdieu's theory of practice situates social practices in the relational interplay between experiential mental phenomena (habitus), resources (capitals) and objective social structures (fields). When applied to class-based practices in particular, the overarching field of power within which social classes are potentially made manifest is the primary field of interest. Applying relational statistical techniques to original survey data from Toronto and Vancouver, Canada, we investigated whether smoking, engaging in physical activity and consuming fruit and vegetables are dispersed in a three-dimensional field of power shaped by economic and cultural capitals and cultural dispositions and practices. We find that aesthetic dispositions and flexibility of developing and established dispositions are associated with positioning in the Canadian field of power and embedded in the logics of the health practices dispersed in the field. From this field-theoretic perspective, behavioural change requires the disruption of existing relations of harmony between the habitus of agents, the fields within which the practices are enacted and the capitals that inform and enforce the mores and regularities of the fields. The three-dimensional model can be explored at: http://relational-health.ca/margins-freedom. © 2017 Foundation for the Sociology of Health & Illness.

  10. Sequential game-theoretical analysis of safeguards systems based on the principle of material accountability

    International Nuclear Information System (INIS)

    Abel, V.; Avenhaus, R.

    1981-01-01

    The international control of fissile material used in nuclear technology is based on the principle of material accountability, i.e. on the difference between the book inventory and the physical inventory of a plant at the end of an inventory period. Since statistical measurement errors cannot be avoided, this comparison calls for methods of statistical hypothesis testing. Moreover, game-theoretical methods are needed as an analytical tool, since a plant operator who wilfully diverts material will do so in an optimal manner. In this article, the optimal test strategy is determined for the case of two inventory periods. Two assumptions are made: the operation of the plant is stopped after the first inventory period if the test indicates wilful diversion, and the plant operator chooses the probability of wilfully diverting material for both inventory periods before the start of the first period. The assertions, which depend on the payoff parameters, are discussed, along with the substitutes that must be used when the payoff parameters cannot be estimated. (orig.) [de
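The game-theoretical core, a mixed-strategy equilibrium in which each side randomizes so as to make the other indifferent, can be sketched for a single-period 2x2 inspection game. The payoff numbers used below are hypothetical, not the paper's parameters:

```python
def mixed_equilibrium_2x2(A, B):
    """Fully mixed Nash equilibrium of a 2x2 bimatrix game.
    A[i][j]: row player's payoff, B[i][j]: column player's payoff.
    Returns (p, q): prob. the row player plays strategy 0, and prob. the
    column player plays strategy 0, from the indifference conditions."""
    # Column player's mix q makes the row player indifferent between rows.
    q = (A[1][1] - A[0][1]) / (A[0][0] - A[0][1] - A[1][0] + A[1][1])
    # Row player's mix p makes the column player indifferent between columns.
    p = (B[1][1] - B[1][0]) / (B[0][0] - B[1][0] - B[0][1] + B[1][1])
    return p, q
```

With operator rows (divert, comply) and inspector columns (inspect, don't), hypothetical payoffs A = [[-5, 2], [0, 0]] and B = [[3, -6], [-1, 0]] give an equilibrium diversion probability p = 0.1 and inspection probability q = 2/7.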

  11. Power Transmission Scheduling for Generators in a Deregulated Environment Based on a Game-Theoretic Approach

    Directory of Open Access Journals (Sweden)

    Bingtuan Gao

    2015-12-01

    In a deregulated environment of the power market, in order to lower their energy price and guarantee the stability of the power network, appropriate transmission lines have to be considered for electricity generators to sell their energy to the end users. This paper proposes a game-theoretic power transmission scheduling for multiple generators to lower their wheeling cost. Based on the embedded cost method, a wheeling cost model consisting of congestion cost, cost of losses and cost of transmission capacity is presented. By assuming each generator behaves in a selfish and rational way, the competition among the multiple generators is formulated as a non-cooperative game, where the players are the generators and the strategies are their daily schedules of power transmission. We will prove that there exists at least one pure-strategy Nash equilibrium of the formulated power transmission game. Moreover, a distributed algorithm will be provided to realize the optimization in terms of minimizing the wheeling cost. Finally, simulations were performed and discussed to verify the feasibility and effectiveness of the proposed non-cooperative game approach for the generators in a deregulated environment.
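The distributed best-response idea can be caricatured with a toy two-line congestion game. This is our own simplification (linear congestion prices, one unit of power per generator), not the paper's wheeling-cost model:

```python
def best_response(load_a_others, load_b_others):
    """Each generator splits 1 unit between lines A and B; a line's price
    equals its total load.  Minimizing
    x*(La + x) + (1 - x)*(Lb + (1 - x)) over x in [0, 1] gives:"""
    x = (load_b_others - load_a_others + 2.0) / 4.0
    return min(1.0, max(0.0, x))

def iterate_to_equilibrium(x, rounds=200):
    """Sequential best-response dynamics; converges here because this is a
    congestion (potential) game."""
    x = list(x)
    for _ in range(rounds):
        for i in range(len(x)):
            la = sum(x) - x[i]            # others' load on line A
            lb = (len(x) - 1) - la        # others' load on line B
            x[i] = best_response(la, lb)
    return x
```

Starting from any split, the symmetric three-generator game converges to every generator sending half its power down each line, the unique equilibrium of this toy model.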

  12. A Game Theoretic Framework for Incentive-Based Models of Intrinsic Motivation in Artificial Systems

    Directory of Open Access Journals (Sweden)

    Kathryn Elizabeth Merrick

    2013-10-01

    An emerging body of research is focusing on understanding and building artificial systems that can achieve open-ended development influenced by intrinsic motivations. In particular, research in robotics and machine learning is yielding systems and algorithms with increasing capacity for self-directed learning and autonomy. Traditional software architectures and algorithms are being augmented with intrinsic motivations to drive cumulative acquisition of knowledge and skills. Intrinsic motivations have recently been considered in reinforcement learning, active learning and supervised learning settings among others. This paper considers game theory as a novel setting for intrinsic motivation. A game theoretic framework for intrinsic motivation is formulated by introducing the concept of optimally motivating incentive as a lens through which players perceive a game. Transformations of four well-known mixed-motive games are presented to demonstrate the perceived games when players’ optimally motivating incentive falls in three cases corresponding to strong power, affiliation and achievement motivation. We use agent-based simulations to demonstrate that players with different optimally motivating incentive act differently as a result of their altered perception of the game. We discuss the implications of these results both for modeling human behavior and for designing artificial agents or robots.

  13. A game theoretic framework for incentive-based models of intrinsic motivation in artificial systems.

    Science.gov (United States)

    Merrick, Kathryn E; Shafi, Kamran

    2013-01-01

    An emerging body of research is focusing on understanding and building artificial systems that can achieve open-ended development influenced by intrinsic motivations. In particular, research in robotics and machine learning is yielding systems and algorithms with increasing capacity for self-directed learning and autonomy. Traditional software architectures and algorithms are being augmented with intrinsic motivations to drive cumulative acquisition of knowledge and skills. Intrinsic motivations have recently been considered in reinforcement learning, active learning and supervised learning settings among others. This paper considers game theory as a novel setting for intrinsic motivation. A game theoretic framework for intrinsic motivation is formulated by introducing the concept of optimally motivating incentive as a lens through which players perceive a game. Transformations of four well-known mixed-motive games are presented to demonstrate the perceived games when players' optimally motivating incentive falls in three cases corresponding to strong power, affiliation and achievement motivation. We use agent-based simulations to demonstrate that players with different optimally motivating incentive act differently as a result of their altered perception of the game. We discuss the implications of these results both for modeling human behavior and for designing artificial agents or robots.
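The "optimally motivating incentive as a lens" idea can be caricatured in a few lines. The distance-based transformation below is our own illustrative assumption, not the papers' formulation: a player experiences a raw payoff as more satisfying the closer it lies to that player's optimally motivating incentive value.

```python
def perceived_game(payoffs, incentive):
    """Hypothetical 'lens': perceived payoff is the negative distance
    between the raw payoff and the player's optimally motivating incentive."""
    return [[-abs(v - incentive) for v in row] for row in payoffs]
```

For prisoner's-dilemma row payoffs [[3, 0], [5, 1]] (rows: cooperate, defect), defection dominates in the raw game, but a player whose optimally motivating incentive is 3 perceives mutual cooperation as the best outcome, illustrating how altered perception changes play.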

  14. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    Science.gov (United States)

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse-grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and the application of a signal minimisation algorithm to selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain sizes, with and without defects. The influence of probe frequency and the data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for the detection of defects in 50 mm thick coarse-grained austenitic stainless steel specimens. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat-bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional enhancement in SNR is achieved compared to the sum-of-selected-IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal-to-noise ratio. This methodology was further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
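A single EMD sifting iteration, the building block that EEMD repeats over noise-perturbed copies of the signal, can be sketched with linear-interpolation envelopes. This is a minimal illustration; production work would use cubic-spline envelopes, full IMF extraction, and a library implementation such as PyEMD:

```python
import numpy as np

def sift_once(signal):
    """One EMD sifting iteration: subtract the mean of the upper and lower
    envelopes (linear interpolation through the interior local extrema)."""
    signal = np.asarray(signal, dtype=float)
    x = np.arange(len(signal))
    maxima = [i for i in range(1, len(signal) - 1)
              if signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]]
    minima = [i for i in range(1, len(signal) - 1)
              if signal[i] <= signal[i - 1] and signal[i] < signal[i + 1]]
    if len(maxima) < 2 or len(minima) < 2:
        return signal.copy()              # monotone residue: nothing to sift
    upper = np.interp(x, maxima, signal[maxima])
    lower = np.interp(x, minima, signal[minima])
    return signal - 0.5 * (upper + lower)

def eemd_like(signal, n_trials=50, noise_std=0.02, rng=None):
    """Noise-assisted (ensemble) version of the sifting step: average the
    sift over noise-perturbed copies of the signal, as EEMD does."""
    signal = np.asarray(signal, dtype=float)
    if rng is None:
        rng = np.random.default_rng(0)
    trials = [sift_once(signal + rng.normal(0.0, noise_std, len(signal)))
              for _ in range(n_trials)]
    return np.mean(trials, axis=0)
```

On a fast sine riding on a slow trend, one sift recovers the fast oscillation away from the edges; the ensemble averaging is what makes the decomposition robust to mode mixing in real, noisy ultrasonic data.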

  15. An Empirical Agent-Based Model to Simulate the Adoption of Water Reuse Using the Social Amplification of Risk Framework.

    Science.gov (United States)

    Kandiah, Venu; Binder, Andrew R; Berglund, Emily Z

    2017-10-01

    Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the "risk publics" model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and the parameters that define its behavior and attributes are derived from survey data. Community-level parameters (including social groups, relationships, and communication variables, also from survey data) are encoded to simulate the social processes that influence community opinion. The model demonstrates its capabilities to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks. © 2017 Society for Risk Analysis.
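A generic pairwise opinion-dynamics sketch (Deffuant-style bounded confidence, not the authors' risk-publics model) illustrates the kind of agent-level update such a simulation runs:

```python
import random

def simulate_opinions(n_agents=100, steps=2000, mu=0.3, confidence=0.5, seed=1):
    """Deffuant-style pairwise dynamics: two randomly paired agents whose
    opinions (in [0, 1]) differ by less than `confidence` move toward each
    other by a factor `mu`; otherwise they ignore each other."""
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n_agents)]
    for _ in range(steps):
        i, j = rng.sample(range(n_agents), 2)
        diff = opinions[j] - opinions[i]
        if abs(diff) < confidence:
            opinions[i] += mu * diff
            opinions[j] -= mu * diff
    return opinions

def adoption_rate(opinions, threshold=0.5):
    """Fraction of agents whose opinion exceeds the adoption threshold."""
    return sum(o > threshold for o in opinions) / len(opinions)
```

With full confidence (confidence = 1.0) and mu = 0.5, every interaction averages the pair, so opinions contract toward the population mean while the mean itself is conserved; with a narrow confidence bound, distinct opinion clusters survive, loosely analogous to the belief clusters above.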

  16. A fluorescent sensor based on dansyl-diethylenetriamine-thiourea conjugate: a thorough theoretical investigation

    International Nuclear Information System (INIS)

    Nguyen Khoa Hien; Nguyen Thi Ai Nhung; Duong Tuan Quang; Ho Quoc Dai; Nguyen Tien Trung

    2015-01-01

    A new dansyl-diethylenetriamine-thiourea conjugate (DT) for the detection of Hg2+ ions in aqueous solution has been theoretically designed and compared to our previously published results. The synthetic path, the optimized geometric structure and the characteristics of the DT were obtained from theoretical calculations at the B3LYP/LanL2DZ level. Accordingly, the DT can react with an Hg2+ ion to form a product with quenched fluorescence. It is remarkable that the experimental results are in excellent agreement with the theoretically evaluated data. (author)

  17. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...
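The simplest case mentioned above, a confidence region for a univariate mean, reduces to profiling the empirical likelihood ratio. A minimal sketch of the standard Lagrange-multiplier formulation (solved here by bisection for robustness):

```python
import math

def el_ratio_mean(x, mu, tol=1e-10):
    """-2 log empirical likelihood ratio for H0: E[X] = mu.
    Weights w_i = 1 / (n * (1 + lam * (x_i - mu))) maximize the empirical
    likelihood subject to sum(w_i * x_i) = mu.  Returns None when mu lies
    outside the convex hull of the data (no feasible weights)."""
    d = [xi - mu for xi in x]
    if min(d) >= 0 or max(d) <= 0:
        return None
    eps = 1e-9
    lo = -1.0 / max(d) + eps              # keep all 1 + lam * d_i > 0
    hi = -1.0 / min(d) - eps
    # g(lam) = sum d_i / (1 + lam * d_i) is strictly decreasing: bisection.
    def g(lam):
        return sum(di / (1.0 + lam * di) for di in d)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * sum(math.log1p(lam * di) for di in d)
```

The statistic is zero at the sample mean and grows as the hypothesized mean moves away from it; Wilks-type calibration against a chi-squared distribution then gives the confidence region.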

  18. Theoretical study of some aspects of the nucleo-bases reactivity: definition of new theoretical tools for the study of chemical reactivity

    International Nuclear Information System (INIS)

    Labet, V.

    2009-09-01

    In this work, three kinds of nucleo-base damage were studied from a theoretical point of view with quantum chemistry methods based on density-functional theory: the spontaneous deamination of cytosine and its derivatives, the formation of tandem lesions induced by hydroxyl radicals in anaerobic medium, and the formation of pyrimidine dimers under exposure to UV radiation. The complementary use of quantitative static methods that explore the potential energy surface of a chemical reaction, and of 'conceptual DFT' principles, yields information on the mechanisms involved and rationalizes the differences in nucleo-base reactivity towards the formation of the same kind of damage. At the same time, a reflection was undertaken on the concept of the asynchronous concerted mechanism, in terms of the physical meaning of the transition state, respect of the Maximum Hardness Principle, and determination of the number of primitive processes involved. Finally, a new local reactivity index was developed, relevant for understanding the reactivity of a molecular system in an excited state. (author)

  19. Theoretical and numerical studies of TWR based on ESFR core design

    International Nuclear Information System (INIS)

    Zhang, Dalin; Chen, Xue-Nong; Flad, Michael; Rineiski, Andrei; Maschek, Werner

    2013-01-01

    Highlights: • The traveling wave reactor (TWR) is studied based on the core design of the European Sodium-cooled Fast Reactor (ESFR). • The conventional fuel shuffling technique is used to produce a continuous radial fuel movement. • A stationary, self-sustainable nuclear fission power can be established asymptotically by loading only natural or depleted uranium. • The multi-group deterministic neutronic code ERANOS is applied. - Abstract: This paper deals with the so-called traveling wave reactor (TWR) based on the core design of the European Sodium-cooled Fast Reactor (ESFR). The current TWR concept is to use the conventional radial fuel shuffling technique to produce a continuous radial fuel movement so that a stationary, self-sustainable nuclear fission power can be established asymptotically by loading only fertile material consisting of natural or depleted uranium. The core design of the ESFR loaded with metallic uranium fuel, without considering the control mechanism, is used as a practical application example. The theoretical studies focus mainly on qualitative feasibility analyses, i.e. on identifying the essential parameter dependences of this kind of reactor. The numerical studies are carried out more specifically on a certain core design. The multi-group deterministic neutronic code ERANOS with the JEFF3.1 data library is applied as the basic tool to perform the neutronics and burn-up calculations. The calculations are performed in a 2-D R-Z geometry, which is sufficient for the current core layout. Numerical results of radial fuel shuffling indicate that the asymptotic k_eff varies parabolically with the shuffling period, while the burn-up increases linearly. Typical shuffling periods investigated in this study are in the range of 300–1000 days. The important parameters, e.g. k_eff, the burn-up, the power peaking factor, and the safety coefficients, are calculated.

  20. An ecological and theoretical deconstruction of a school-based obesity prevention program in Mexico.

    Science.gov (United States)

    Safdie, Margarita; Cargo, Margaret; Richard, Lucie; Lévesque, Lucie

    2014-08-10

    Ecological intervention programs are recommended to prevent overweight and obesity in children. The National Institute of Public Health (INSP) in Mexico implemented a successful ecological intervention program to promote healthy lifestyle behaviors in school-age children. This study assessed the integration of ecological principles and Social Cognitive Theory (SCT) constructs in this effective school-based obesity prevention program implemented in 15 elementary schools in Mexico City. Two coders applied the Intervention Analysis Procedure (IAP) to "map" the program's integration of ecological principles. A checklist gauged the use of SCT in program activities. Thirty-two distinct intervention strategies were implemented in one setting (i.e., the school) to engage four different target groups (students, parents, school representatives, government) across two domains (Nutrition and Physical Activity). Overall, 47.5% of the strategies targeted the school infrastructure and/or personnel; 37.5% of the strategies targeted a key political actor, the Public Education Secretariat, while fewer strategies targeted parents (12.5%) and children (3%). More strategies were implemented in the Nutrition domain (69%) than in Physical Activity (31%). The most frequently used SCT construct within both intervention domains was Reciprocal Determinism (where changes to the environment influence changes in behavior, and these behavioral changes influence further changes to the environment); no significant differences were observed in the use of SCT constructs across domains. The findings provide insight into a promising combination of strategies and theoretical constructs that can be used to implement a school-based obesity prevention program. Strategies emphasized school-level infrastructure/personnel change and strong political engagement and were most commonly underpinned by Reciprocal Determinism for both Nutrition and Physical Activity.

  1. Theoretical Issues

    Energy Technology Data Exchange (ETDEWEB)

    Marc Vanderhaeghen

    2007-04-01

    The theoretical issues in the interpretation of the precision measurements of the nucleon-to-Delta transition by means of electromagnetic probes are highlighted. The results of these measurements are confronted with the state-of-the-art calculations based on chiral effective-field theories (EFT), lattice QCD, large-Nc relations, perturbative QCD, and QCD-inspired models. The link of the nucleon-to-Delta form factors to generalized parton distributions (GPDs) is also discussed.

  2. Multifractal features of EUA and CER futures markets by using multifractal detrended fluctuation analysis based on empirical mode decomposition

    International Nuclear Information System (INIS)

    Cao, Guangxi; Xu, Wei

    2016-01-01

    Based on daily price data of carbon emission rights in the futures markets of Certified Emission Reduction (CER) and European Union Allowances (EUA), we analyze the multiscale characteristics of the markets by using empirical mode decomposition (EMD) and EMD-based multifractal detrended fluctuation analysis (MFDFA). The complexity of the daily returns of the CER and EUA futures markets changes with multiple time scales and multilayered features. The two markets also exhibit clear multifractal characteristics and long-range correlation. We employ shuffle and surrogate approaches to analyze the origins of the multifractality. Long-range correlations and fat-tailed distributions contribute significantly to the multifractality. Furthermore, we analyze the influence of high returns on multifractality by using a threshold method. The multifractality of the two futures markets is related to the presence of high values of returns in the price series.
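A minimal MFDFA sketch (order-1 detrending, non-overlapping segments, no multifractal-spectrum step) shows how the generalized Hurst exponent h(q) is estimated from the slope of the fluctuation function:

```python
import numpy as np

def mfdfa(x, scales, q):
    """Simplified MFDFA: q-th order fluctuation function Fq(s) per scale,
    using order-1 polynomial detrending of the cumulative profile."""
    profile = np.cumsum(x - np.mean(x))
    fq = []
    for s in scales:
        n_seg = len(profile) // s
        f2 = []
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            f2.append(np.mean((seg - trend) ** 2))
        f2 = np.asarray(f2)
        if q == 0:
            fq.append(np.exp(0.5 * np.mean(np.log(f2))))
        else:
            fq.append(np.mean(f2 ** (q / 2.0)) ** (1.0 / q))
    return np.asarray(fq)

def hurst_exponent(x, scales, q=2):
    """Generalized Hurst exponent h(q): slope of log Fq(s) versus log s."""
    fq = mfdfa(x, scales, q)
    return np.polyfit(np.log(scales), np.log(fq), 1)[0]
```

For uncorrelated white noise h(2) is close to 0.5, while long-range correlated returns push it away from 0.5; variation of h(q) with q is the multifractality signature probed by the shuffle and surrogate tests above.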

  3. Consideration of relativistic effects in band structure calculations based on the empirical tight-binding method

    International Nuclear Information System (INIS)

    Hanke, M.; Hennig, D.; Kaschte, A.; Koeppen, M.

    1988-01-01

    The energy band structure of cadmium telluride and mercury telluride materials is investigated by means of the tight-binding (TB) method, considering relativistic effects and the spin-orbit interaction. Taking relativistic effects into account in the method is rather simple, though the size of the Hamilton matrix doubles. Such considerations are necessary for the interesting small-gap semiconductors, and the experimental results are reflected correctly in the band structures. The transformation behaviour of the eigenvectors within the Brillouin zone becomes more complicated but remains theoretically controllable. If, however, the matrix elements of the Green operator are to be calculated, one has to use formula manipulation programmes, in particular for non-diagonal elements. For defect calculations by the Koster-Slater theory of scattering it is necessary to know these matrix elements. Knowledge of the transformation behaviour of the eigenfunctions saves frequent diagonalization of the Hamilton matrix and thus permits a numerical solution of the problem. Corresponding results for the sp3 basis are available.
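The doubling of the Hamilton matrix when spin is included can be illustrated with the atomic spin-orbit term λ L·S on a p shell. This is a generic textbook example, not the CdTe/HgTe tight-binding parameterization:

```python
import numpy as np

# Orbital angular momentum matrices for l = 1, basis |m> = |1>, |0>, |-1>
s2 = np.sqrt(2.0)
Lp = np.array([[0, s2, 0], [0, 0, s2], [0, 0, 0]], dtype=complex)  # L_plus
Lm = Lp.conj().T                                                   # L_minus
Lx, Ly = (Lp + Lm) / 2, (Lp - Lm) / 2j
Lz = np.diag([1.0, 0.0, -1.0]).astype(complex)

# Spin-1/2 matrices (Pauli matrices / 2)
Sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
Sy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
Sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2

def h_soc(lam):
    """Atomic spin-orbit Hamiltonian lam * L.S on the p shell: the 3x3
    orbital block doubles to 6x6 once spin is included, exactly as the
    TB Hamilton matrix doubles in a relativistic basis."""
    return lam * (np.kron(Lx, Sx) + np.kron(Ly, Sy) + np.kron(Lz, Sz))
```

Diagonalizing lam * L·S splits the six p states into a fourfold j = 3/2 level at +λ/2 and a twofold j = 1/2 level at -λ, the familiar spin-orbit splitting that the TB band structures above must reproduce.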

  4. Testing seasonal and long-term controls of streamwater DOC using empirical and process-based models.

    Science.gov (United States)

    Futter, Martyn N; de Wit, Heleen A

    2008-12-15

    Concentrations of dissolved organic carbon (DOC) in surface waters are increasing across Europe and parts of North America. Several mechanisms have been proposed to explain these increases, including reductions in acid deposition, changes in the frequency of winter storms, and changes in temperature and precipitation patterns. We used two modelling approaches to identify the mechanisms responsible for changing surface water DOC concentrations. Empirical regression analysis and INCA-C, a process-based model of stream-water DOC, were used to simulate long-term (1986-2003) patterns in stream-water DOC concentrations in a small boreal stream. Both modelling approaches successfully simulated seasonal and inter-annual patterns in DOC concentration. In both models, seasonal patterns of DOC concentration were controlled by hydrology, and inter-annual patterns were explained by climatic variation. There was a non-linear relationship between warmer summer temperatures and INCA-C predicted DOC. Only the empirical model was able to satisfactorily simulate the observed long-term increase in DOC. The observed long-term trends in DOC are likely to be driven by in-soil processes controlled by SO4(2-) and Cl(-) deposition, and to a lesser extent by temperature-controlled processes. Given the projected changes in climate and deposition, future modelling and experimental research should focus on the possible effects of soil temperature and moisture on organic carbon production, sorption and desorption rates, and chemical controls on organic matter solubility.

  5. An Empirical Study of Neural Network-Based Audience Response Technology in a Human Anatomy Course for Pharmacy Students.

    Science.gov (United States)

    Fernández-Alemán, José Luis; López-González, Laura; González-Sequeros, Ofelia; Jayne, Chrisina; López-Jiménez, Juan José; Carrillo-de-Gea, Juan Manuel; Toval, Ambrosio

    2016-04-01

    This paper presents an empirical study of a formative neural network-based assessment approach by using mobile technology to provide pharmacy students with intelligent diagnostic feedback. An unsupervised learning algorithm was integrated with an audience response system called SIDRA in order to generate states that collect some commonality in responses to questions and add diagnostic feedback for guided learning. A total of 89 pharmacy students enrolled on a Human Anatomy course were taught using two different teaching methods. Forty-four students employed intelligent SIDRA (i-SIDRA), whereas 45 students received the same training but without using i-SIDRA. A statistically significant difference was found between the experimental group (i-SIDRA) and the control group (traditional learning methodology), with T (87) = 6.598, p < 0.001. In four MCQs tests, the difference between the number of correct answers in the first attempt and in the last attempt was also studied. A global effect size of 0.644 was achieved in the meta-analysis carried out. The students expressed satisfaction with the content provided by i-SIDRA and the methodology used during the process of learning anatomy (M = 4.59). The new empirical contribution presented in this paper allows instructors to perform post hoc analyses of each particular student's progress to ensure appropriate training.

  6. An empirical nexus between oil price collapse and economic growth in Sub-Saharan African oil based economies

    Directory of Open Access Journals (Sweden)

    KEJI Sunday Anderu

    2018-06-01

    Full Text Available This study empirically investigates the nexus between the oil price collapse and economic growth in sub-Saharan African oil-based economies, specifically Angola, Nigeria and Sudan, between January 2010 and December 2015, through a panel random effects model (REM) with economic growth rate (GDPR) as the dependent variable and oil price (OPR), exchange rate (EXR), industrial output (IND) and terms of trade (TOT) as independent variables. The REM results showed a negative link between the oil price collapse and economic growth in the case of Angola, Nigeria and Sudan, which confirmed the nexus between oil price collapse and economic growth. Post-estimation tests, such as the Hausman and the Breusch-Pagan Lagrange Multiplier tests, were adopted to empirically establish the consistency and efficiency of the model. Interestingly, the two key variables (GDPR and OPR) disclose how the unprecedented fall in oil prices disrupted the growth of the selected economies. Meanwhile, poor institutional quality in the oil sector, coupled with weak fiscal measures among others, further exposed these economies to unprecedented external shocks characterized by skyrocketing exchange rates, destabilizing growth within the period under review. A robust fiscal framework is therefore needed in order to sustain economic growth.

  7. An empirical model of the topside plasma density around 600 km based on ROCSAT-1 and Hinotori observations

    Science.gov (United States)

    Huang, He; Chen, Yiding; Liu, Libo; Le, Huijun; Wan, Weixing

    2015-05-01

    It is an urgent task to improve the ability of ionospheric empirical models to more precisely reproduce the plasma density variations in the topside ionosphere. Based on the Republic of China Satellite 1 (ROCSAT-1) observations, we developed a new empirical model of topside plasma density around 600 km under relatively quiet geomagnetic conditions. The model reproduces the ROCSAT-1 plasma density observations with a root-mean-square-error of 0.125 in units of lg(Ni(cm-3)) and reasonably describes the temporal and spatial variations of plasma density at altitudes in the range from 550 to 660 km. The model results are also in good agreement with observations from Hinotori, Coupled Ion-Neutral Dynamics Investigations/Communications/Navigation Outage Forecasting System satellites and the incoherent scatter radar at Arecibo. Further, we combined ROCSAT-1 and Hinotori data to improve the ROCSAT-1 model and built a new model (R&H model) after the consistency between the two data sets had been confirmed with the original ROCSAT-1 model. In particular, we studied the solar activity dependence of topside plasma density at a fixed altitude by R&H model and find that its feature slightly differs from the case when the orbit altitude evolution is ignored. In addition, the R&H model shows the merging of the two crests of equatorial ionization anomaly above the F2 peak, while the IRI_Nq topside option always produces two separate crests in this range of altitudes.

  8. An Empirical Study of Instructor Adoption of Web-Based Learning Systems

    Science.gov (United States)

    Wang, Wei-Tsong; Wang, Chun-Chieh

    2009-01-01

    For years, web-based learning systems have been widely employed in both educational and non-educational institutions. Although web-based learning systems are emerging as a useful tool for facilitating teaching and learning activities, the number of users is not increasing as fast as expected. This study develops an integrated model of instructor…

  9. Empirically Supported Family-Based Treatments for Conduct Disorder and Delinquency in Adolescents

    Science.gov (United States)

    Henggeler, Scott W.; Sheidow, Ashli J.

    2012-01-01

    Several family-based treatments of conduct disorder and delinquency in adolescents have emerged as evidence-based and, in recent years, have been transported to more than 800 community practice settings. These models include multisystemic therapy, functional family therapy, multidimensional treatment foster care, and, to a lesser extent, brief…

  10. Formula-Based Public School Funding System in Victoria: An Empirical Analysis of Equity

    Science.gov (United States)

    Bandaranayake, Bandara

    2013-01-01

    This article explores the formula-based school funding system in the state of Victoria, Australia, where state funds are directly allocated to schools based on a range of equity measures. The impact of Victoria' funding system for education in terms of alleviating inequality and disadvantage is contentious, to say the least. It is difficult to…

  11. Text-Based On-Line Conferencing: A Conceptual and Empirical Analysis Using a Minimal Prototype.

    Science.gov (United States)

    McCarthy, John C.; And Others

    1993-01-01

    Analyzes requirements for text-based online conferencing through the use of a minimal prototype. Topics discussed include prototyping with a minimal system; text-based communication; the system as a message passer versus the system as a shared data structure; and three exercises that showed how users worked with the prototype. (Contains 61…

  12. The theoretical advantage of affinity membrane-based immunoadsorption therapy of hypercholesterolemia

    International Nuclear Information System (INIS)

    Green, P.; Odell, R.; Schindhelm, K.

    1996-01-01

    Full text: Therapy of hypercholesterolemia using immunoadsorption of Low Density Lipoprotein (LDL) to a gel substrate is a current clinical technique (Bosch T., Biomat., Art. Cells and Immob. Biotech, 20: 1165-1169, 1992). Recently, affinity membranes have been proposed as an alternate substrate for immunoadsorption (Brandt S and others, Bio/Technology, 6: 779-782, 1988). Potentially, the overall rate of adsorption to a membrane may be faster than to a gel because of the different geometry (ibid). This implies that, for the same conditions, a membrane-based device will have a higher Number of Transfer Units, more efficient adsorption and a smaller device size than a gel. To test this hypothesis, we calculated two key theoretical design parameters, the Separation Factor R and the Number of Transfer Units N, for a functioning clinical-scale affinity membrane device: R = K{sub d}/(K{sub d} + C{sub 0}), where K{sub d} is the equilibrium dissociation constant (M) and C{sub 0} the feed concentration (M); N = k{sub a}Q{sub max}V{sub m}/F, where k{sub a} is the intrinsic reaction rate constant (M{sup -1} min{sup -1}), Q{sub max} the substrate capacity (M), V{sub m} the membrane volume (ml) and F the flow rate (ml min{sup -1}). We assumed a 1 h treatment time during which 1 plasma volume (3 L) is treated, hence F = 50 ml min{sup -1}. If we assume 2/3 of LDL is removed from an initial level of 3 g/L, we can calculate an average feed concentration C{sub 0} = 2 g/L. There is some data available in the literature for typical values of K{sub d} (10{sup -8} M) and k{sub a} (10{sup 3} M{sup -1} s{sup -1} to 3 x 10{sup 5} M{sup -1} s{sup -1}) (Olsen WC and others, Molec. Immun., 26: 129-136, 1989). Since the intrinsic reaction kinetics may vary from very slow (10{sup 3} M{sup -1} s{sup -1}) to very fast (3 x 10{sup 5} M{sup -1} s{sup -1}), the Number of Transfer Units N may vary from small (2) to large (650). Hence, for a membrane device, we must select the antibody with the fastest reaction rate k{sub a} and highest capacity Q{sub max}; otherwise, there may be no advantage of a membrane-based device over a gel-based device
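    The design numbers quoted in this abstract can be checked with a back-of-envelope calculation. In the sketch below, the LDL particle molar mass (~3 MDa) and the combined capacity term Q{sub max}·V{sub m} are assumptions, the latter back-solved so the slowest antibody reproduces the quoted N of about 2; only K{sub d}, C{sub 0}, F and the k{sub a} range come from the abstract.

```python
# Back-of-envelope check of the abstract's design parameters.
KD = 1e-8              # equilibrium dissociation constant K_d, M (abstract)
LDL_MW = 3e6           # assumed LDL particle molar mass, g/mol (~3 MDa)
C0 = 2.0 / LDL_MW      # average feed concentration, M (2 g/L, abstract)

R = KD / (KD + C0)     # separation factor; R << 1 means favourable adsorption
print(f"R = {R:.4f}")

F = 50.0               # flow rate, ml/min (abstract)
QMAX_VM = 1.8e-3       # assumed Q_max * V_m, M*ml (back-solved, not from paper)
for ka_si in (1e3, 3e5):        # intrinsic rate constant, M^-1 s^-1 (abstract)
    ka = ka_si * 60.0           # convert to M^-1 min^-1 to match F in ml/min
    N = ka * QMAX_VM / F        # number of transfer units
    print(f"ka = {ka_si:.0e} M^-1 s^-1 -> N = {N:.0f}")   # N ~ 2 and N ~ 650
```

With these assumed values the slow/fast antibodies give N of roughly 2 and 650, matching the range the abstract quotes.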

  13. A new multivariate empirical mode decomposition method for improving the performance of SSVEP-based brain-computer interface

    Science.gov (United States)

    Chen, Yi-Feng; Atal, Kiran; Xie, Sheng-Quan; Liu, Quan

    2017-08-01

    Objective. Accurate and efficient detection of steady-state visual evoked potentials (SSVEP) in electroencephalogram (EEG) is essential for the related brain-computer interface (BCI) applications. Approach. Although the canonical correlation analysis (CCA) has been applied extensively and successfully to SSVEP recognition, the spontaneous EEG activities and artifacts that often occur during data recording can deteriorate the recognition performance. Therefore, it is meaningful to extract a few frequency sub-bands of interest to avoid or reduce the influence of unrelated brain activity and artifacts. This paper presents an improved method to detect the frequency component associated with SSVEP using multivariate empirical mode decomposition (MEMD) and CCA (MEMD-CCA). EEG signals from nine healthy volunteers were recorded to evaluate the performance of the proposed method for SSVEP recognition. Main results. We compared our method with CCA and temporally local multivariate synchronization index (TMSI). The results suggest that the MEMD-CCA achieved significantly higher accuracy in contrast to standard CCA and TMSI. It gave the improvements of 1.34%, 3.11%, 3.33%, 10.45%, 15.78%, 18.45%, 15.00% and 14.22% on average over CCA at time windows from 0.5 s to 5 s and 0.55%, 1.56%, 7.78%, 14.67%, 13.67%, 7.33% and 7.78% over TMSI from 0.75 s to 5 s. The method outperformed the filter-based decomposition (FB), empirical mode decomposition (EMD) and wavelet decomposition (WT) based CCA for SSVEP recognition. Significance. The results demonstrate the ability of our proposed MEMD-CCA to improve the performance of SSVEP-based BCI.
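    The full MEMD-CCA pipeline is beyond a short sketch, but the reference-signal idea that CCA-based SSVEP detection builds on can be illustrated: score a single channel against sine/cosine references at each candidate stimulation frequency and pick the best match. Everything below (sampling rate, window length, synthetic signal) is illustrative and not taken from the paper.

```python
import math

FS = 250           # assumed sampling rate, Hz
N = FS * 2         # 2 s analysis window

def ref_score(x, f):
    """Phase-independent power of x at frequency f via sine/cosine references."""
    s = [math.sin(2 * math.pi * f * n / FS) for n in range(N)]
    c = [math.cos(2 * math.pi * f * n / FS) for n in range(N)]
    ps = sum(a * b for a, b in zip(x, s))
    pc = sum(a * b for a, b in zip(x, c))
    return ps * ps + pc * pc

def detect(x, candidates):
    """Return the candidate stimulation frequency with the highest score."""
    return max(candidates, key=lambda f: ref_score(x, f))

# Synthetic SSVEP: a 10 Hz response plus a slow 3 Hz "artifact" component.
x = [math.sin(2 * math.pi * 10 * n / FS) + 0.5 * math.sin(2 * math.pi * 3 * n / FS)
     for n in range(N)]
print(detect(x, [8.0, 10.0, 12.0, 15.0]))   # -> 10.0
```

In the paper's method, MEMD first decomposes the multichannel EEG so that only SSVEP-related sub-bands enter this matching step, which is what reduces the influence of artifacts.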

  14. Cell death following BNCT: A theoretical approach based on Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ballarini, F., E-mail: francesca.ballarini@pv.infn.it [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy); Bakeine, J. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy); Bortolussi, S. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy); Bruschi, P. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy); Cansolino, L.; Clerici, A.M.; Ferrari, C. [University of Pavia, Department of Surgery, Experimental Surgery Laboratory, Pavia (Italy); Protti, N.; Stella, S. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy); Zonta, A.; Zonta, C. [University of Pavia, Department of Surgery, Experimental Surgery Laboratory, Pavia (Italy); Altieri, S. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy)

    2011-12-15

    In parallel to boron measurements and animal studies, investigations on radiation-induced cell death are also in progress in Pavia, with the aim of better characterising the effects of a BNCT treatment down to the cellular level. Such studies are being carried out not only experimentally but also theoretically, based on a mechanistic model and a Monte Carlo code. The model assumes that: (1) only clustered DNA strand breaks can lead to chromosome aberrations; (2) only chromosome fragments within a certain threshold distance can undergo misrejoining; (3) the so-called 'lethal aberrations' (dicentrics, rings and large deletions) lead to cell death. After applying the model to normal cells exposed to monochromatic fields of different radiation types, the irradiation section of the code was purposely extended to mimic cell exposure to the mixed radiation field produced by the {sup 10}B(n,{alpha}){sup 7}Li reaction, which gives rise to alpha particles and Li ions of short range and high biological effectiveness, and by the {sup 14}N(n,p){sup 14}C reaction, which produces 0.58 MeV protons. Very good agreement between model predictions and literature data was found for human and animal cells exposed to X- or gamma-rays, protons and alpha particles, thus allowing the model to be validated for cell death induced by monochromatic radiation fields. The model predictions also showed good agreement with experimental data obtained by our group exposing DHD cells to thermal neutrons in the TRIGA Mark II reactor of the University of Pavia; this allowed the model to be validated also for a BNCT exposure scenario, providing a useful predictive tool to bridge the gap between irradiation and cell death.

  15. The theoretical base of e-learning and its role in surgical education.

    Science.gov (United States)

    Evgeniou, Evgenios; Loizou, Peter

    2012-01-01

    The advances in Internet and computer technology offer many solutions that can enhance surgical education and increase the effectiveness of surgical teaching. E-learning plays an important role in surgical education today, with many e-learning projects already available on the Internet. E-learning is based on a mixture of educational theories that derive from behaviorist, cognitivist, and constructivist educational theoretical frameworks. CAN EDUCATIONAL THEORY IMPROVE E-LEARNING?: Conventional educational theory can be applied to improve the quality and effectiveness of e-learning. The theory of "threshold concepts" and educational theories on reflection, motivation, and communities of practice can be applied when designing e-learning material. E-LEARNING IN SURGICAL EDUCATION: E-learning has many advantages but also weaknesses. Studies have shown that e-learning is an effective teaching method that offers high levels of learner satisfaction. Instead of trying to compare e-learning with traditional methods of teaching, it is better to integrate into e-learning those elements of traditional teaching that have been proven effective. E-learning can play an important role in surgical education as a blended approach, combined with more traditional methods of teaching that offer better face-to-face interaction with patients and colleagues in different circumstances and hands-on practice of practical skills. National provision of e-learning can make evaluation easier. The correct utilization of Internet and computer resources, combined with the application of valid conventional educational theory to design e-learning relevant to the various levels of surgical training, can be effective in the training of future surgeons. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  16. Computer-Based Methods for Collecting Peer Nomination Data: Utility, Practice, and Empirical Support.

    Science.gov (United States)

    van den Berg, Yvonne H M; Gommans, Rob

    2017-09-01

    New technologies have led to several major advances in psychological research over the past few decades. Peer nomination research is no exception. Thanks to these technological innovations, computerized data collection is becoming more common in peer nomination research. However, computer-based assessment is more than simply programming the questionnaire and asking respondents to fill it in on computers. In this chapter the advantages and challenges of computer-based assessments are discussed. In addition, a list of practical recommendations and considerations is provided to inform researchers on how computer-based methods can be applied to their own research. Although the focus is on the collection of peer nomination data in particular, many of the requirements, considerations, and implications are also relevant for those who consider the use of other sociometric assessment methods (e.g., paired comparisons, peer ratings, peer rankings) or computer-based assessments in general. © 2017 Wiley Periodicals, Inc.

  17. Health care information systems and formula-based reimbursement: an empirical study.

    Science.gov (United States)

    Palley, M A; Conger, S

    1995-01-01

    Current initiatives in health care administration use formula-based approaches to reimbursement. Examples of such approaches include capitation and diagnosis related groups (DRGs). These approaches seek to contain medical costs and to facilitate managerial control over scarce health care resources. This article considers various characteristics of formula-based reimbursement, their operationalization on hospital information systems, and how these relate to hospital compliance costs.

  18. Holding-based network of nations based on listed energy companies: An empirical study on two-mode affiliation network of two sets of actors

    Science.gov (United States)

    Li, Huajiao; Fang, Wei; An, Haizhong; Gao, Xiangyun; Yan, Lili

    2016-05-01

    Economic networks in the real world are not homogeneous; therefore, it is important to study economic networks with heterogeneous nodes and edges to simulate a real network more precisely. In this paper, we present an empirical study of the one-mode derivative holding-based network constructed by the two-mode affiliation network of two sets of actors using the data of worldwide listed energy companies and their shareholders. First, we identify the primitive relationship in the two-mode affiliation network of the two sets of actors. Then, we present the method used to construct the derivative network based on the shareholding relationship between two sets of actors and the affiliation relationship between actors and events. After constructing the derivative network, we analyze different topological features on the node level, edge level and entire network level and explain the meanings of the different values of the topological features combining the empirical data. This study is helpful for expanding the usage of complex networks to heterogeneous economic networks. For empirical research on the worldwide listed energy stock market, this study is useful for discovering the inner relationships between the nations and regions from a new perspective.
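    The projection step described here, deriving a one-mode company network from the two-mode company-shareholder affiliation network, can be sketched in a few lines. All company and fund names below are hypothetical; the edge weight counts common shareholders, one simple choice for the "holding-based" relationship.

```python
from itertools import combinations
from collections import Counter

# Two-mode affiliation data: listed company -> set of its shareholders
# (all names hypothetical).
holdings = {
    "EnergyCoA": {"FundX", "FundY"},
    "EnergyCoB": {"FundY", "FundZ"},
    "EnergyCoC": {"FundY", "FundZ"},
}

# One-mode derivative network: link two companies when they share at least
# one holder; weight the edge by the number of common shareholders.
edges = Counter()
for a, b in combinations(sorted(holdings), 2):
    common = holdings[a] & holdings[b]
    if common:
        edges[(a, b)] = len(common)

print(dict(edges))
# -> {('EnergyCoA', 'EnergyCoB'): 1, ('EnergyCoA', 'EnergyCoC'): 1, ('EnergyCoB', 'EnergyCoC'): 2}
```

The same projection applied at the level of the shareholders' home countries yields the nation-level network the study analyzes.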

  19. An Empirical Study of Email-Based Advertisement and its Influence on Consumers’ Attitude

    Directory of Open Access Journals (Sweden)

    Navid Behravan

    2012-07-01

    Full Text Available E-commerce has become a cornerstone for many businesses over recent years. In line with e-commerce activities, marketing communication through online media plays a major role in achieving competitive advantage. E-mail advertising in this context offers businesses a cost-effective, direct and reciprocal means of overcoming time and geographical barriers. Accordingly, this study discusses the characteristics of advertising e-mail and their influence on customers' attitude towards email-based advertisement. According to the research findings, the entertainment and informativeness of advertising e-mail content strongly and positively affect customers' attitude towards email-based advertisement. On the other hand, the privacy of advertising e-mail strongly, yet negatively, influences customers' attitude towards email-based advertisement.

  20. Performance-based management and quality of work: an empirical assessment.

    Science.gov (United States)

    Falzon, Pierre; Nascimento, Adelaide; Gaudart, Corinne; Piney, Cécile; Dujarier, Marie-Anne; Germe, Jean-François

    2012-01-01

    In France, in the private sector as in the public sector, performance-based management tends to become a norm. Performance-based management is supposed to improve service quality, productivity and efficiency, transparency of allotted means and achieved results, and to better focus the activity of employees and of the whole organization. This text reports a study conducted for the French Ministry of Budget by a team of researchers in ergonomics, sociology and management science, in order to assess the impact of performance-based management on employees, on teams and on work organization. About 100 interviews were conducted with employees of all categories and 6 working groups were set up in order to discuss and validate or amend our first analyses. Results concern several aspects: workload and work intensification, indicators and performance management and the transformation of jobs induced by performance management.

  1. Wind turbine blades condition assessment based on vibration measurements and the level of an empirically decomposed feature

    International Nuclear Information System (INIS)

    Abouhnik, Abdelnasser; Albarbar, Alhussein

    2012-01-01

    Highlights: ► We used the finite element method to model wind turbine induced vibration characteristics. ► We developed a technique for eliminating wind turbine vibration modulation problems. ► We use empirical mode decomposition to decompose the vibration into its fundamental elements. ► We show the area under the shaft speed is a good indicator for assessing wind blade condition. ► We validate the technique under different wind turbine speeds and blade (crack) conditions. - Abstract: Vibration based monitoring techniques are well understood and widely adopted for monitoring the condition of rotating machinery. However, in the case of wind turbines the measured vibration is complex due to the high number of vibration sources and the modulation phenomenon. Therefore, extracting condition-related information on a specific element, e.g. blade condition, is very difficult. In the work presented in this paper, wind turbine vibration sources are outlined and a three-bladed wind turbine was then simulated by building its model in the ANSYS finite element program. Dynamic analysis was performed and the fundamental vibration characteristics were extracted with two healthy blades and one blade with one of four cracks introduced. The cracks were of lengths 10 mm, 20 mm, 30 mm and 40 mm, all with a consistent 3 mm width and 2 mm depth. The tests were carried out for three rotation speeds: 150, 250 and 360 r/min. The effects of the seeded faults were revealed by using a novel approach called empirically decomposed feature intensity level (EDFIL). The developed EDFIL algorithm is based on decomposing the measured vibration into its fundamental components and then determining the shaft rotational speed amplitude. A real model of the simulated wind turbine was constructed and the simulation outcomes were compared with real-time vibration measurements. The cracks were seeded sequentially in one of the blades and their presence and severity were determined by decomposing

  2. Empiric analysis of zero voltage switching in piezoelectric transformer based resonant converters

    DEFF Research Database (Denmark)

    Rødgaard, Martin Schøler; Andersen, Thomas; Andersen, Michael A. E.

    2012-01-01

    Research and development within piezoelectric transformer (PT) based converters are rapidly increasing, as the technology is maturing and starts to prove its capabilities. High power density and high efficiencies are reported and recently several inductor-less converters have emerged [1][2][7][10]...

  3. Gyroscope-driven mouse pointer with an EMOTIV® EEG headset and data analysis based on Empirical Mode Decomposition.

    Science.gov (United States)

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-08-14

    This paper presents a project on the development of a cursor control emulating the typical operations of a computer-mouse, using gyroscope and eye-blinking electromyographic signals which are obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick computational tool, yet effective, aimed to artifact reduction from head movements as well as a method to detect blinking signals for mouse control. Kalman filter is used as state estimator for mouse position control and jitter removal. The detection rate obtained in average was 94.9%. Experimental setup and some obtained results are presented.
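    The abstract mentions a Kalman filter as the state estimator for cursor position control and jitter removal. The exact filter settings are not given in the abstract, so the following is a minimal scalar Kalman filter of that kind, with process and measurement variances (q, r) chosen purely for illustration.

```python
def kalman_1d(measurements, q=1e-3, r=0.5):
    """Scalar Kalman filter with a random-walk state model.

    q: process variance (how fast the true position may drift);
    r: measurement variance (sensor jitter). Both values are illustrative.
    """
    x, p = measurements[0], 1.0      # initial state estimate and its variance
    out = []
    for z in measurements:
        p = p + q                    # predict: uncertainty grows by q
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update: blend prediction and measurement
        p = (1.0 - k) * p
        out.append(x)
    return out

noisy = [100, 104, 97, 101, 99, 103, 98, 100]   # jittery cursor x-positions
smooth = kalman_1d(noisy)                       # spread around 100 is reduced
```

A small q relative to r tells the filter the cursor moves slowly compared with the measurement jitter, which is what produces the smoothing; a full 2-D pointer would run one such filter per axis (or a joint position/velocity state).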

  4. Gyroscope-Driven Mouse Pointer with an EMOTIV® EEG Headset and Data Analysis Based on Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    Carlos Reyes-Garcia

    2013-08-01

    Full Text Available This paper presents a project on the development of a cursor control emulating the typical operations of a computer-mouse, using gyroscope and eye-blinking electromyographic signals which are obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user’s blinking with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick computational tool, yet effective, aimed to artifact reduction from head movements as well as a method to detect blinking signals for mouse control. Kalman filter is used as state estimator for mouse position control and jitter removal. The detection rate obtained in average was 94.9%. Experimental setup and some obtained results are presented.

  5. Precision comparison of the erosion rates derived from 137Cs measurements models with predictions based on empirical relationship

    International Nuclear Information System (INIS)

    Yang Mingyi; Liu Puling; Li Liqing

    2004-01-01

    The soil samples were collected from 6 cultivated runoff plots with a grid sampling method, and the soil erosion rates derived from {sup 137}Cs measurements were calculated. The precision of the models of Zhang Xinbao, Zhou Weizhi, Yang Hao and Walling was compared with predictions based on an empirical relationship. The data showed that the precision of the 4 models is high within a 50 m slope length, except for slopes with a low angle and short length. Relatively, the precision of Walling's model is better than that of Zhang Xinbao, Zhou Weizhi and Yang Hao. In addition, the relationship between the parameter Γ in Walling's improved model and slope angle was analyzed; the relation is Y = 0.0109X{sup 1.0072}. (authors)
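    The fitted relation between Γ and slope angle can be evaluated directly; the unit of the slope angle X (taken as degrees here) is an assumption, since the abstract does not state it.

```python
# Evaluate the fitted power law Gamma = 0.0109 * X**1.0072, where X is the
# slope angle (assumed to be in degrees; the abstract does not specify).
def gamma(x):
    return 0.0109 * x ** 1.0072

for angle in (5, 15, 25):
    print(f"slope angle {angle} -> Gamma = {gamma(angle):.4f}")
```

Because the fitted exponent is close to 1, Γ grows almost linearly with slope angle over this range.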

  6. A novel approach for baseline correction in 1H-MRS signals based on ensemble empirical mode decomposition.

    Science.gov (United States)

    Parto Dezfouli, Mohammad Ali; Dezfouli, Mohsen Parto; Rad, Hamidreza Saligheh

    2014-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) is a non-invasive diagnostic tool for measuring biochemical changes in the human body. Acquired (1)H-MRS signals may be corrupted by a wideband baseline signal generated by macromolecules. Recently, several methods have been developed for the correction of such baseline signals; however, most of them are not able to estimate the baseline in complex overlapped signals. In this study, a novel automatic baseline correction method is proposed for (1)H-MRS spectra based on ensemble empirical mode decomposition (EEMD). The method was applied to both simulated data and in-vivo (1)H-MRS signals of the human brain. Results justify the efficiency of the proposed method in removing the baseline from (1)H-MRS signals.

  7. On Feature Relevance in Image-Based Prediction Models: An Empirical Study

    DEFF Research Database (Denmark)

    Konukoglu, E.; Ganz, Melanie; Van Leemput, Koen

    2013-01-01

    Determining disease-related variations of the anatomy and function is an important step in better understanding diseases and developing early diagnostic systems. In particular, image-based multivariate prediction models and the “relevant features” they produce are attracting attention from the co...

  8. Perceptions of the Effectiveness of System Dynamics-Based Interactive Learning Environments: An Empirical Study

    Science.gov (United States)

    Qudrat-Ullah, Hassan

    2010-01-01

    The use of simulations in general and of system dynamics simulation based interactive learning environments (SDILEs) in particular is well recognized as an effective way of improving users' decision making and learning in complex, dynamic tasks. However, the effectiveness of SDILEs in classrooms has rarely been evaluated. This article describes…

  9. Young Readers' Narratives Based on a Picture Book: Model Readers and Empirical Readers

    Science.gov (United States)

    Hoel, Trude

    2015-01-01

    The article presents parts of a research project whose aim is to investigate six- to seven-year-old children's language use in storytelling. The children's oral texts are based on the wordless picture book "Frog, Where Are You?", which has been, and still remains, a frequent tool for collecting narratives from children. The Frog story…

  10. The role of social networks in financing technology-based ventures: an empirical exploration

    NARCIS (Netherlands)

    Heuven, J.M.J.; Groen, Arend J.

    2012-01-01

    The focus of this study is on the role of networks in both identifying and accessing financial resource providers by technology-based ventures. We explore the role of networks by taking into account several specifications. We (1) acknowledge that new ventures can access financial resource providers

  11. Homogeneity in Community-Based Rape Prevention Programs: Empirical Evidence of Institutional Isomorphism

    Science.gov (United States)

    Townsend, Stephanie M.; Campbell, Rebecca

    2007-01-01

    This study examined the practices of 24 community-based rape prevention programs. Although these programs were geographically dispersed throughout one state, they were remarkably similar in their approach to rape prevention programming. DiMaggio and Powell's (1991) theory of institutional isomorphism was used to explain the underlying causes of…

  12. An Adaptive E-Learning System Based on Students' Learning Styles: An Empirical Study

    Science.gov (United States)

    Drissi, Samia; Amirat, Abdelkrim

    2016-01-01

    Personalized e-learning implementation is recognized as one of the most interesting research areas in the distance web-based education. Since the learning style of each learner is different one must fit e-learning with the different needs of learners. This paper presents an approach to integrate learning styles into adaptive e-learning hypermedia.…

  13. Empirical research in service engineering based on AHP and fuzzy methods

    Science.gov (United States)

    Zhang, Yanrui; Cao, Wenfu; Zhang, Lina

    2015-12-01

    In recent years, the management consulting industry has been developing rapidly worldwide. Taking a large management consulting company as its research object, this paper establishes an index system for consulting service quality and, based on a customer satisfaction survey, evaluates the company's service quality using AHP and fuzzy comprehensive evaluation methods.

  14. Nonparametric and group-based person-fit statistics : a validity study and an empirical example

    NARCIS (Netherlands)

    Meijer, R.R.

    1994-01-01

    In person-fit analysis, the object is to investigate whether an item score pattern is improbable given the item score patterns of the other persons in the group or given what is expected on the basis of a test model. In this study, several existing group-based statistics to detect such improbable

  15. First empirical evaluation of outcomes for mentalization-based group therapy for adolescents with BPD

    DEFF Research Database (Denmark)

    Bo, Sune; Sharp, Carla; Beck, Emma

    2017-01-01

    Adolescent borderline personality disorder (BPD) is a devastating disorder, and it is essential to identify and treat the disorder in its early course. A total of 34 female Danish adolescents between 15 and 18 years old participated in 1 year of structured mentalization-based group therapy. Twent...... with borderline traits. (PsycINFO Database Record (c) 2017 APA, all rights reserved)....

  16. An empirical classification-based framework for the safety criticality assessment of energy production systems, in presence of inconsistent data

    International Nuclear Information System (INIS)

    Wang, Tai-Ran; Mousseau, Vincent; Pedroni, Nicola; Zio, Enrico

    2017-01-01

The technical problem addressed in the present paper is the assessment of the safety criticality of energy production systems. An empirical classification model is developed, based on the Majority Rule Sorting method, to evaluate the criticality class of the plant/system of interest with respect to safety. The model is built on a (limited-size) set of data representing the characteristics of a number of plants and their corresponding criticality classes, as assigned by experts. The construction of the classification model raises two issues. First, the classification examples provided by the experts may contain contradictions: a validation of the consistency of the considered dataset is thus required. Second, uncertainty affects the process: a quantitative assessment of the performance of the classification model is thus in order, in terms of accuracy and confidence in the class assignments. In this paper, two approaches are proposed to tackle the first issue: the inconsistencies in the data examples are “resolved” by deleting or relaxing, respectively, some constraints in the model construction process. Three methods are proposed to address the second issue: (i) a model retrieval-based approach, (ii) the bootstrap method and (iii) the cross-validation technique. Numerical analyses are presented with reference to an artificial case study regarding the classification of nuclear power plants. - Highlights: • We use a hierarchical framework to represent safety criticality. • We use an empirical classification model to evaluate safety criticality. • Inconsistencies in data examples are “resolved” by deleting/relaxing constraints. • Accuracy and confidence in the class assignments are computed by three methods. • The method is applied to fictitious nuclear power plants.
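Of the three performance-assessment methods listed, the bootstrap lends itself to a compact sketch: resampling a hold-out set with replacement and recomputing accuracy on each resample yields a percentile confidence interval for the classifier's accuracy. The labels below are hypothetical stand-ins, not the paper's case-study data.

```python
import random

def bootstrap_accuracy_ci(y_true, y_pred, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for classification accuracy.

    Resamples (true, predicted) pairs with replacement and returns the
    (alpha/2, 1 - alpha/2) percentiles of the resampled accuracies.
    """
    rng = random.Random(seed)
    pairs = list(zip(y_true, y_pred))
    n = len(pairs)
    accs = []
    for _ in range(n_boot):
        sample = [pairs[rng.randrange(n)] for _ in range(n)]
        accs.append(sum(t == p for t, p in sample) / n)
    accs.sort()
    lo = accs[int((alpha / 2) * n_boot)]
    hi = accs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical criticality classes (0/1/2) for ten held-out plants
y_true = [2, 1, 1, 0, 2, 0, 1, 2, 0, 1]
y_pred = [2, 1, 0, 0, 2, 0, 1, 2, 1, 1]
lo, hi = bootstrap_accuracy_ci(y_true, y_pred)
```

A wide interval signals low confidence in the class assignments, which is exactly the quantity the paper's second issue asks for.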

  17. Empirically based assessment and taxonomy of psychopathology for ages 1½-90+ years: Developmental, multi-informant, and multicultural findings.

    Science.gov (United States)

    Achenbach, Thomas M; Ivanova, Masha Y; Rescorla, Leslie A

    2017-11-01

Originating in the 1960s, the Achenbach System of Empirically Based Assessment (ASEBA) comprises a family of instruments for assessing problems and strengths for ages 1½-90+ years. This article provides an overview of the ASEBA, related research, and future directions for empirically based assessment and taxonomy. Standardized, multi-informant ratings of transdiagnostic dimensions of behavioral, emotional, social, and thought problems are hierarchically scored on narrow-spectrum syndrome scales, broad-spectrum internalizing and externalizing scales, and a total problems (general psychopathology) scale. DSM-oriented and strengths scales are also scored. The instruments and scales have been iteratively developed from assessments of clinical and population samples of hundreds of thousands of individuals. Items, instruments, scales, and norms are tailored to different kinds of informants for ages 1½-5, 6-18, 18-59, and 60-90+ years. To take account of differences between informants' ratings, parallel instruments are completed by parents, teachers, youths, adult probands, and adult collaterals. Syndromes and internalizing/externalizing scales derived from factor analyses of each instrument capture variations in patterns of problems that reflect different informants' perspectives. Confirmatory factor analyses have supported the syndrome structures in dozens of societies. Software displays scale scores in relation to user-selected multicultural norms for the age and gender of the person being assessed, according to ratings by each type of informant. Multicultural norms are derived from population samples in 57 societies on every inhabited continent. Ongoing and future research includes multicultural assessment of elders; advancing transdiagnostic progress and outcomes assessment; and testing higher-order structures of psychopathology. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. A beginner's guide to writing the nursing conceptual model-based theoretical rationale.

    Science.gov (United States)

    Gigliotti, Eileen; Manister, Nancy N

    2012-10-01

    Writing the theoretical rationale for a study can be a daunting prospect for novice researchers. Nursing's conceptual models provide excellent frameworks for placement of study variables, but moving from the very abstract concepts of the nursing model to the less abstract concepts of the study variables is difficult. Similar to the five-paragraph essay used by writing teachers to assist beginning writers to construct a logical thesis, the authors of this column present guidelines that beginners can follow to construct their theoretical rationale. This guide can be used with any nursing conceptual model but Neuman's model was chosen here as the exemplar.

  19. Caprylate Salts Based on Amines as Volatile Corrosion Inhibitors for Metallic Zinc: Theoretical and Experimental Studies

    Science.gov (United States)

    Valente, Marco A. G.; Teixeira, Deiver A.; Azevedo, David L.; Feliciano, Gustavo T.; Benedetti, Assis V.; Fugivara, Cecílio S.

    2017-01-01

    The interaction of volatile corrosion inhibitors (VCI), caprylate salt derivatives from amines, with zinc metallic surfaces is assessed by density functional theory (DFT) computer simulations, electrochemical impedance (EIS) measurements and humid chamber tests. The results obtained by the different methods were compared, and linear correlations were obtained between theoretical and experimental data. The correlations between experimental and theoretical results showed that the molecular size is the determining factor in the inhibition efficiency. The models used and experimental results indicated that dicyclohexylamine caprylate is the most efficient inhibitor. PMID:28620602

  20. Accelerated Internationalization in Emerging Markets: Empirical Evidence from Brazilian Technology-Based Firms

    Directory of Open Access Journals (Sweden)

    Fernanda Ferreira Ribeiro

    2014-04-01

This paper offers an analysis of the external factors influencing the accelerated internationalization of technology-based firms (TBFs) in the context of an emerging country, Brazil. This type of firm is typically called born global and has been reported mainly in high-technology sectors and in developed countries. A survey was administered to small and medium-sized Brazilian TBFs, and logistic regression was used to test the research hypotheses. The results suggest that new and small Brazilian technology-based firms that followed an accelerated internationalization process are most likely to be integrated into a global production chain. Results also show that TBFs that take more than five years to enter the international market benefit more from location in an innovation habitat, partnerships in the home country, and pro-internationalization government policies. This research therefore contributes to a better understanding of the phenomenon and points to new perspectives for study.
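The hypothesis-testing setup described — logistic regression on a binary internationalization outcome — can be illustrated with a minimal gradient-descent fit. The data and the single binary predictor (integration into a global production chain) are hypothetical stand-ins for the paper's survey variables.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Fit a logistic regression by batch gradient descent.

    Returns the weight vector (intercept first) minimizing the log-loss.
    """
    X = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)       # gradient of mean log-loss
    return w

# Hypothetical firms: x = 1 if integrated into a global production chain,
# y = 1 if the firm internationalized within five years of founding.
x = np.array([1, 1, 1, 1, 0, 0, 0, 0, 1, 0], dtype=float)
y = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0], dtype=float)
w = fit_logistic(x.reshape(-1, 1), y)
odds_ratio = np.exp(w[1])  # > 1 means integration raises the odds of the outcome
```

A positive slope (odds ratio above 1) would support the paper's hypothesis that chain integration is associated with accelerated internationalization; a real analysis would of course add the remaining covariates and significance tests.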