Motivational activities based on previous knowledge of students
García, J. A.; Gómez-Robledo, L.; Huertas, R.; Perales, F. J.
2014-07-01
Academic results depend strongly on the individual circumstances of students: background, motivation and aptitude. We believe that academic activities designed to increase motivation must be tuned to the particular situation of the students. The main goal of this work is to analyze the students in the first year of the Degree in Optics and Optometry at the University of Granada and the suitability of an activity designed for those students. Initial data were obtained from a survey asking about their reasons for choosing this degree, their knowledge of it, and their previous academic background. Results show that: 1) the group is quite heterogeneous, since students have very different backgrounds; 2) reasons for choosing the Degree in Optics and Optometry also vary widely, and in many cases it was selected as a second option; 3) knowledge of, and motivation about, the Degree are in general quite low. To increase student motivation we designed an academic activity presenting different topics studied in the Degree. Results show that the students who took part in this activity are the most motivated and the most satisfied with their choice of degree.
Theory-Based Stakeholder Evaluation
Hansen, Morten Balle; Vedung, Evert
2010-01-01
This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation, or the TSE model for short. Most theory-based approaches are program theory driven, and some are stakeholder oriented as well. Practically all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…
Sourcing quality-of-life weights obtained from previous studies: theory and reality in Korea.
Bae, SeungJin; Bae, Eun Young; Lim, Sang Hee
2014-01-01
The quality-of-life weights obtained in previous studies are frequently used in cost-utility analyses. The purpose of this study is to describe how the values obtained in previous studies are incorporated into the industry submissions requesting listing at the Korean National Health Insurance (NHI), focusing on the issues discussed in theoretical studies and national guidelines. The industry submissions requesting listing at the Korean NHI from January 2007 until December 2009 were evaluated by two independent researchers at the Health Insurance Review and Assessment Service (HIRA). Specifically, we observed the methods that were used to pool values, to predict joint health state utilities, and to retain consistency within submissions, in terms of the issues discussed in methodological research papers and recommendations from national guidelines. More than half of the submissions used QALY as an outcome measure, and most of these submissions were sourced from prior studies. Heterogeneous methodologies were frequently used within a submission, with the inconsistent use of upper and lower anchors being prevalent. Assumptions behind measuring joint health state utilities or pooling multiple values for single health states were omitted in all submissions. Most national guidelines were rather vague regarding how to predict joint health states, how to select the best available value, how to maintain consistency within a submission, and how to generalize values obtained from prior studies. Previously generated values were commonly sourced, but this practice was frequently related to inconsistencies within and among submissions. Attention should be paid to the consistency and transparency of the value, especially if the value is sourced from prior studies.
Attribute and topology based change detection in a constellation of previously detected objects
Paglieroni, David W.; Beer, Reginald N.
2016-01-19
A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
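As a rough illustration of the idea described above (not the patented system itself), a constellation of previously detected objects can be stored with simple attributes and compared against a new scan; the attribute set, distance metric, and threshold below are all assumptions made for this sketch:

```python
import math

def attribute_distance(a, b):
    """Euclidean distance between two objects' (x, y, size) attributes."""
    return math.dist((a["x"], a["y"], a["size"]), (b["x"], b["y"], b["size"]))

def detect_changes(constellation, new_objects, threshold=1.0):
    """Classify each newly detected object as 'unchanged' if it matches a
    previously detected object within `threshold`, else as 'new' (a change)."""
    changes = []
    for obj in new_objects:
        nearest = min(constellation, key=lambda c: attribute_distance(c, obj))
        status = "unchanged" if attribute_distance(nearest, obj) <= threshold else "new"
        changes.append((obj["id"], status))
    return changes

# hypothetical constellation database and latest-scan detections
constellation = [{"id": 1, "x": 0.0, "y": 0.0, "size": 2.0},
                 {"id": 2, "x": 5.0, "y": 5.0, "size": 1.0}]
scan = [{"id": "a", "x": 0.1, "y": 0.0, "size": 2.0},   # close to object 1
        {"id": "b", "x": 9.0, "y": 9.0, "size": 3.0}]   # matches nothing
print(detect_changes(constellation, scan))
```

A real system would also check topological consistency (relative geometry of neighboring objects), which this attribute-only sketch omits.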
Rakoczy, Hannes; Bergfeld, Delia; Schwarz, Ina; Fizke, Ella
2015-01-01
Existing evidence suggests that children, when they first pass standard theory-of-mind tasks, still fail to understand the essential aspectuality of beliefs and other propositional attitudes: such attitudes refer to objects only under specific aspects. Oedipus, for example, believes Yocaste (his mother) is beautiful, but this does not imply that he believes his mother is beautiful. In three experiments, 3- to 6-year-olds' (N = 119) understanding of aspectuality was tested with a novel, radically simplified task. In contrast to all previous findings, this task was as difficult as and highly correlated with a standard false belief task. This suggests that a conceptual capacity more unified than previously assumed emerges around ages 4-5, a full-fledged metarepresentational scheme of propositional attitudes.
Point pattern match-based change detection in a constellation of previously detected objects
Paglieroni, David W.
2016-06-07
A method and system is provided that applies attribute- and topology-based change detection to objects that were detected on previous scans of a medium. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, detection strength, size, elongation, orientation, etc. The locations define a three-dimensional network topology forming a constellation of previously detected objects. The change detection system stores attributes of the previously detected objects in a constellation database. The change detection system detects changes by comparing the attributes and topological consistency of newly detected objects encountered during a new scan of the medium to previously detected objects in the constellation database. The change detection system may receive the attributes of the newly detected objects as the objects are detected by an object detection system in real time.
Balcı, Fuat; Tosun, Tuğçe; Gür, Ezgi
2016-01-01
Animals can shape their timed behaviors based on experienced probabilistic relations in a nearly optimal fashion. On the other hand, it is not clear if they adopt these timed decisions by making computations based on previously learnt task parameters (time intervals, locations, and probabilities) or if they gradually develop their decisions based on trial and error. To address this question, we tested mice in the timed-switching task, which required them to anticipate when (after a short or l...
Tedeschi, Enrico; Canna, Antonietta; Cocozza, Sirio; Russo, Carmela; Angelini, Valentina; Brunetti, Arturo [University "Federico II", Neuroradiology, Department of Advanced Biomedical Sciences, Naples (Italy)]; Palma, Giuseppe; Quarantelli, Mario [National Research Council, Institute of Biostructure and Bioimaging, Naples (Italy)]; Borrelli, Pasquale; Salvatore, Marco [IRCCS SDN, Naples (Italy)]; Lanzillo, Roberta; Postiglione, Emanuela; Morra, Vincenzo Brescia [University "Federico II", Department of Neurosciences, Reproductive and Odontostomatological Sciences, Naples (Italy)]
2016-12-15
To evaluate changes in T1 and T2* relaxometry of dentate nuclei (DN) with respect to the number of previous administrations of Gadolinium-based contrast agents (GBCA). In 74 relapsing-remitting multiple sclerosis (RR-MS) patients with variable disease duration (9.8±6.8 years) and severity (Expanded Disability Status Scale scores: 3.1±0.9), the DN R1 (1/T1) and R2* (1/T2*) relaxation rates were measured using two unenhanced 3D dual-echo spoiled gradient-echo sequences with different flip angles. Correlations of the number of previous GBCA administrations with DN R1 and R2* relaxation rates were tested, including gender and age effects, in a multivariate regression analysis. The DN R1 (normalized by brainstem) significantly correlated with the number of GBCA administrations (p<0.001), maintaining the same significance even when including MS-related factors. Instead, the DN R2* values correlated only with age (p=0.003), and not with GBCA administrations (p=0.67). In a subgroup of 35 patients for whom the administered GBCA subtype was known, the effect of GBCA on DN R1 appeared mainly related to linear GBCA. In RR-MS patients, the number of previous GBCA administrations correlates with R1 relaxation rates of DN, while R2* values remain unaffected, suggesting that T1 shortening in these patients is related to the amount of Gadolinium given.
Rettedal, Elizabeth; Gumpert, Heidi; Sommer, Morten
2014-01-01
The human gut microbiota is linked to a variety of human health issues and implicated in antibiotic resistance gene dissemination. Most of these associations rely on culture-independent methods, since it is commonly believed that gut microbiota cannot be easily or sufficiently cultured. Here, we show that carefully designed conditions enable cultivation of a representative proportion of human gut bacteria, enabling rapid multiplex phenotypic profiling. We use this approach to determine the phylogenetic distribution of antibiotic tolerance phenotypes for 16 antibiotics in the human gut microbiota. Based on the phenotypic mapping, we tailor antibiotic combinations to specifically select for previously uncultivated bacteria. Utilizing this method we cultivate and sequence the genomes of four isolates, one of which apparently belongs to the genus Oscillibacter; uncultivated Oscillibacter…
Nasrin Saharkhiz
2014-11-01
Background: Embryo transfer (ET) is one of the most important steps in assisted reproductive technology (ART) cycles and is affected by many factors, notably the depth of embryo deposition in the uterus. In this study, the outcomes of intracytoplasmic sperm injection (ICSI) cycles after blind embryo transfer and embryo transfer based on previously measured uterine length using vaginal ultrasound were compared. Materials and Methods: This prospective randomised clinical trial included one hundred and forty non-donor fresh embryo transfers from January 2010 to June 2011. In group I, ET was performed using the conventional (blind) method at 5-6 cm from the external os; in group II, ET was done at a depth of 1-1.5 cm from the uterine fundus based on previously measured uterine length using vaginal sonography. Appropriate statistical analysis was performed using Student's t test and the Chi-square or Fisher's exact test, with PASW Statistics version 18. A p value <0.05 was considered statistically significant. Results: The chemical pregnancy rate was 28.7% in group I and 42.1% in group II; the difference was not statistically significant (p=0.105). Clinical pregnancy, ongoing pregnancy and implantation rates for group I were 21.2%, 17.7%, and 12.8%, and for group II 33.9%, 33.9%, and 22.1%, respectively. In groups I and II, abortion rates were 34.7% and 0%, respectively, a statistically significant difference (p<0.005). No ectopic pregnancy occurred in either group. Conclusion: The use of uterine length measurement during the treatment cycle in order to place embryos at a depth of 1-1.5 cm from the fundus significantly increases clinical and ongoing pregnancy and implantation rates, and leads to a decrease in the abortion rate (Registration Number: IRCT2014032512494N1).
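The group comparison reported above relies on a standard 2x2 chi-square test; a minimal sketch with made-up counts (not the trial's actual data) looks like this:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]],
    e.g. rows = groups, columns = pregnant / not pregnant."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# hypothetical counts: 20/70 pregnancies in group I vs 29/70 in group II
print(round(chi_square_2x2(20, 50, 29, 41), 3))
```

The statistic is then compared against the chi-square distribution with one degree of freedom (critical value 3.84 at p=0.05); Fisher's exact test is preferred when expected cell counts are small.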
Nilsson, Ingeborg; Townsend, Elizabeth
2014-01-01
The evolving theory of occupational justice links the concept to social justice and to concerns for a justice of difference: a justice that recognizes occupational rights to inclusive participation in everyday occupations for all persons in society, regardless of age, ability, gender, social class, or other differences. The purpose of this descriptive paper is to inspire and empower health professionals to build a theoretical bridge to practice with an occupational justice lens. Using illustrations from a study of leisure and the use of everyday technology in the lives of very old people in Northern Sweden, the authors argue that an occupational justice lens may inspire and empower health professionals to engage in critical dialogue on occupational justice; use global thinking about occupation, health, justice, and the environment; and combine population and individualized approaches. The authors propose that taking these initiatives to bridge theory and practice will energize health professionals to enable inclusive participation in everyday occupations in diverse contexts.
Theory-based interventions for contraception.
Lopez, Laureen M; Tolley, Elizabeth E; Grimes, David A; Chen, Mario; Stockton, Laurie L
2013-08-07
The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, educational interventions addressing contraception often have no stated theoretical base. Review randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice; encourage contraceptive use; or promote adherence to, or continuation of, a contraceptive regimen. Through June 2013, we searched computerized databases for trials that tested a theory-based intervention for improving contraceptive use (MEDLINE, POPLINE, CENTRAL, PsycINFO, ClinicalTrials.gov, and ICTRP). Previous searches also included EMBASE. For the initial review, we wrote to investigators to find other trials. Trials tested a theory-based intervention for improving contraceptive use. We excluded trials focused on high-risk groups and preventing sexually transmitted infections or HIV. Interventions addressed the use of one or more contraceptive methods for contraception. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy, contraceptive choice or use, and contraceptive adherence or continuation. The primary author evaluated abstracts for eligibility. Two authors extracted data from included studies. For the dichotomous outcomes, the Mantel-Haenszel odds ratio (OR) with 95% CI was calculated using a fixed-effect model. Cluster randomized trials used various methods of accounting for the clustering, such as multilevel modeling. Most reports did not provide information to calculate the effective sample size. Therefore, we presented the results as reported by the investigators. No meta-analysis was conducted due to differences in interventions and outcome measures. We included three new trials for a
Vocation in theology-based nursing theories.
Lundmark, Mikael
2007-11-01
By using the concepts of intrinsicality/extrinsicality as analytic tools, the theology-based nursing theories of Ann Bradshaw and Katie Eriksson are analyzed regarding their explicit and/or implicit understanding of vocation as a motivational factor for nursing. The results show that both theories view intrinsic values as guarantees against reducing nursing practice to mechanistic applications of techniques and as being a way of reinforcing a high ethical standard. The theories explicitly (Bradshaw) or implicitly (Eriksson) advocate a vocational understanding of nursing as being essential for nursing theories. Eriksson's theory has a potential for conceptualizing an understanding of extrinsic and intrinsic motivational factors for nursing but one weakness in the theory could be the risk of slipping over to moral judgments where intrinsic factors are valued as being superior to extrinsic. Bradshaw's theory is more complex and explicit in understanding the concept of vocation and is theologically more plausible, although also more confessional.
Yang, Ren-Qiang; Jabbari, Javad; Cheng, Xiao-Shu
2014-01-01
BACKGROUND: Marfan syndrome (MFS) is a rare autosomal dominantly inherited connective tissue disorder with an estimated prevalence of 1:5,000. More than 1000 variants have previously been reported to be associated with MFS. However, the disease-causing effect of these variants may be questionable … with regard to disease stratification based on these previously reported MFS-associated variants…
Analysis of Product Buying Decision on Lazada E-commerce based on Previous Buyers’ Comments
Neil Aldrin
2017-06-01
The aims of the present research are: 1) to know whether a product buying decision occurs, 2) to know how product buying decisions occur among Lazada e-commerce customers, and 3) to know how previous buyers' comments can increase product buying decisions on Lazada e-commerce. This research utilizes a qualitative method: it examines other studies and builds assumptions and discussion results so that further analysis can widen ideas and opinions. The results show that a product with many ratings and reviews will trigger other buyers to purchase it. The conclusion is that a product buying decision may occur through several stages before the decision is made: recognizing and searching for problems, knowing the needs, collecting information, evaluating alternatives, and post-purchase evaluation. In these stages, the buying decision on Lazada e-commerce is supported by price, promotion, service, and brand.
Wavelet-Based Quantum Field Theory
Mikhail V. Altaisky
2007-11-01
The Euclidean quantum field theory for the fields $\phi_{\Delta x}(x)$, which depend on both the position $x$ and the resolution $\Delta x$, constructed in SIGMA 2 (2006), 046, on the basis of the continuous wavelet transform, is considered. The Feynman diagrams in such a theory become finite under the assumption that there should be no scales in internal lines smaller than the minimal scale of the external lines. This regularisation agrees with the existing calculations of radiative corrections to the electron magnetic moment. The transition from the newly constructed theory to a standard Euclidean field theory is achieved by integration over the scale arguments.
APPLICATION OF DECISION THEORY BASED CRITERIA FOR ...
2012-11-03
STRUCTURAL APPRAISAL OF A BUILDING DURING … Keywords: probabilistic model, decision theory based criteria, target value, uncommon accidents, collapse.
Jack, B Kelsey; Kousky, Carolyn; Sims, Katharine R E
2008-07-15
Payments for ecosystem services (PES) policies compensate individuals or communities for undertaking actions that increase the provision of ecosystem services such as water purification, flood mitigation, or carbon sequestration. PES schemes rely on incentives to induce behavioral change and can thus be considered part of the broader class of incentive- or market-based mechanisms for environmental policy. By recognizing that PES programs are incentive-based, policymakers can draw on insights from the substantial body of accumulated knowledge about this class of instruments. In particular, this article offers a set of lessons about how the environmental, socioeconomic, political, and dynamic context of a PES policy is likely to interact with policy design to produce policy outcomes, including environmental effectiveness, cost-effectiveness, and poverty alleviation.
BASES OF CREATIVE PARTICLES OF HIGGS THEORY (CPH THEORY)
Javadi, Hossein; Forouzbakhsh, Farshid
2010-01-01
One way to explain the Zero Point Energy (ZPE) is by means of the uncertainty principle of quantum physics. In CPH Theory the ZPE is explained by using a novel description of the graviton. This is based on the behavior of photons in a gravitational field, leading to a new definition of the graviton … from the structure of the photon to the nucleus. These color charges and magnetic colors form the energy. Energy converts to matter and anti-matter, such as charged particles. Charged particles use gravitons and generate an electromagnetic field. This way of looking at the problem shows how two opposite charged…
Towards applied theories based on computability logic
Japaridze, Giorgi
2008-01-01
Computability logic (CL) (see http://www.cis.upenn.edu/~giorgi/cl.html) is a recently launched program for redeveloping logic as a formal theory of computability, as opposed to the formal theory of truth that logic has more traditionally been. Formulas in it represent computational problems, "truth" means existence of an algorithmic solution, and proofs encode such solutions. Within the line of research devoted to finding axiomatizations for ever more expressive fragments of CL, the present paper introduces a new deductive system CL12 and proves its soundness and completeness with respect to the semantics of CL. Conservatively extending classical predicate calculus and offering considerable additional expressive and deductive power, CL12 presents a reasonable, computationally meaningful, constructive alternative to classical logic as a basis for applied theories. To obtain a model example of such theories, this paper rebuilds the traditional, classical-logic-based Peano arithmetic into a computability-logic-b...
Jigsaw Cooperative Learning: Acid-Base Theories
Tarhan, Leman; Sesen, Burcin Acar
2012-01-01
This study focused on investigating the effectiveness of jigsaw cooperative learning instruction on first-year undergraduates' understanding of acid-base theories. Undergraduates' opinions about jigsaw cooperative learning instruction were also investigated. The participants of this study were 38 first-year undergraduates in chemistry education…
Nielsen, Lars Hougaard; Løkkegaard, Ellen; Andreasen, Anne Helms;
2009-01-01
PURPOSE: Many studies which investigate the effect of drugs categorize the exposure variable into never, current, and previous use of the study drug. When prescription registries are used to make this categorization, the exposure variable possibly gets misclassified, since the registries do not carry any information on the time of discontinuation of treatment. In this study, we investigated the amount of misclassification of exposure (never, current, previous use) to hormone therapy (HT) when the exposure variable was based on prescription data. Furthermore, we evaluated the significance … with Hazard Ratios ranging from 1.68 to 1.78 for current use compared to never use. CONCLUSIONS: The findings suggest that it is possible to estimate the effect of never, current and previous use of HT on breast cancer using prescription data.
Mean Spherical Approximation-Based Partitioned Density Functional Theory
ZHOU Shi-Qi
2003-01-01
Previous literature claims that the density functional theory for a non-uniform fluid with a non-hard-sphere interaction potential can be improved by treating the tail part with a third-order functional perturbation expansion approximation (FPEA), using the symmetrical and intuitively motivated simple function $C_0^{(3)}(r_1, r_2, r_3) = \zeta \int dr_4\, a(r_4 - r_1)\, a(r_4 - r_2)\, a(r_4 - r_3)$ as the uniform third-order direct correlation function (DCF) for the tail part, with kernel function $a(r) = (6/\pi\sigma^3)\,\mathrm{Heaviside}(\sigma/2 - r)$. The present contribution concludes that for the mean spherical approximation-based second-order DCF, the terms higher than second order in the FPEA of the tail part of the non-uniform first-order DCF are exactly zero. The partial success of the previous $a$-kernel-based third-order FPEA for the tail part is due to the adjustable parameter $\zeta$ and the short range of the $a$ kernel. An improvement over the previous theories is proposed and tested.
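As a quick consistency check (ours, not from the paper), the kernel $a(r)$ is normalized to unity over three-dimensional space, which is what makes it a sensible smearing function:

```latex
\int a(r)\,d^3r
  = \frac{6}{\pi\sigma^3}\int_{|\mathbf{r}|\le\sigma/2} d^3r
  = \frac{6}{\pi\sigma^3}\cdot\frac{4\pi}{3}\left(\frac{\sigma}{2}\right)^3
  = 1 .
```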
Theory- and evidence-based Intervention
Nissen, Poul
2011-01-01
In this paper, a model for assessment and intervention is presented. This model explains how to perform theory- and evidence-based as well as practice-based assessment and intervention. The assessment model applies a holistic approach to treatment planning which includes recognition of the influence of community, school, peers, family and the functional and structural domains of personality at the behavioural, phenomenological, intra-psychic and biophysical levels in a dialectical process. One important aspect of the theoretical basis for the presentation of this model is that the child…
Ensemble method: Community detection based on game theory
Zhang, Xia; Xia, Zhengyou; Xu, Shengwu; Wang, J. D.
2014-08-01
Timely and cost-effective analytics over social networks have emerged as a key ingredient for success in many business and government endeavors. Community detection is an active research area of relevance to the analysis of online social networks. The choice of a particular community detection algorithm is crucial if the aim is to unveil the community structure of a network: it can affect the outcome of the experiments, because different algorithms have different advantages and depend on tuning specific parameters. In this paper, we propose a community division model based on the notion of game theory, which can effectively combine the advantages of previous algorithms to obtain a better community classification. Experiments on standard datasets verify that our game theory-based community detection model is valid and performs better.
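The game-theoretic framing above, where nodes act as players choosing the community that maximizes a payoff, can be illustrated with a toy best-response dynamic; the payoff function and penalty parameter below are illustrative assumptions, not the authors' model:

```python
def payoff(node, label, labels, adj, penalty):
    """A node's utility for joining community `label`: number of its links
    into that community, minus a penalty proportional to community size
    (an assumed payoff form that discourages one giant community)."""
    links = sum(1 for nb in adj[node] if labels[nb] == label)
    size = sum(1 for n, l in enumerate(labels) if l == label and n != node)
    return links - penalty * size

def detect_communities(adj, penalty=0.35, rounds=20):
    """Each node repeatedly switches to its best-response label (ties broken
    by smallest label); a fixed point is a local Nash equilibrium partition."""
    labels = list(range(len(adj)))
    for _ in range(rounds):
        changed = False
        for node in range(len(adj)):
            options = {labels[nb] for nb in adj[node]} | {labels[node]}
            best = min(options,
                       key=lambda l: (-payoff(node, l, labels, adj, penalty), l))
            if best != labels[node]:
                labels[node] = best
                changed = True
        if not changed:
            break
    return labels

# two triangles joined by a single bridge edge (2-3)
adj = [[1, 2], [0, 2], [0, 1, 3], [2, 4, 5], [3, 5], [3, 4]]
print(detect_communities(adj))  # the two triangles end up in separate communities
```

Ensemble methods of the kind the paper proposes would combine several such partitions rather than rely on one payoff rule.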
Grey-theory based intrusion detection model
Qin Boping; Zhou Xianwei; Yang Jun; Song Cunyi
2006-01-01
To solve the problem that current intrusion detection models need large-scale data for model formulation in real-time use, an intrusion detection system model based on grey theory (GTIDS) is presented. Grey theory has the merits of requiring less original data, placing fewer restrictions on the distribution pattern, and using a simpler modeling algorithm. With these merits, GTIDS constructs its model from a partial time sequence for rapid detection of intrusive acts in a secure system. In this detection model, the false-alarm and missed-detection rates are effectively reduced through repeated modeling and detection on the target data. Furthermore, the GTIDS framework and the specific modeling algorithm are presented. The effectiveness of GTIDS is demonstrated through emulation experiments comparing it with Snort and the next-generation intrusion detection expert system (NIDES) of SRI International.
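Grey-theory models of this kind are typically built on the GM(1,1) model, which fits an exponential trend to a short, partial time sequence and flags observations that deviate from the forecast. The following is a hedged sketch of that standard construction, not the GTIDS algorithm itself; the example series is made up:

```python
import math

def gm11_forecast(x0, steps=1):
    """Fit a GM(1,1) grey model to the series x0 and forecast `steps` values.
    Works from very few data points, which is the appeal of grey modeling."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]                 # accumulated series
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]     # background values
    y = x0[1:]
    m = len(z)
    # closed-form least squares for the grey equation x0(k) = -a*z(k) + b
    szz, sz = sum(v * v for v in z), sum(z)
    szy, sy = sum(v * w for v, w in zip(z, y)), sum(y)
    alpha = (m * szy - sz * sy) / (m * szz - sz * sz)        # alpha = -a
    a, b = -alpha, (sy - alpha * sz) / m
    c = b / a
    def xhat1(k):  # fitted accumulated series
        return (x0[0] - c) * math.exp(-a * k) + c
    # de-accumulate fitted values to forecast the original series
    return [xhat1(n - 1 + s) - xhat1(n - 2 + s) for s in range(1, steps + 1)]

print(gm11_forecast([100, 110, 121, 133.1]))  # roughly geometric growth
```

In an intrusion detection setting, a large gap between the forecast and the observed traffic measure would be treated as a candidate intrusive act.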
A New Glauber Theory based on Multiple Scattering Theory
Yahiro, Masanobu; Ogata, Kazuyuki; Kawai, Mitsuji
2008-01-01
Glauber theory for nucleus-nucleus scattering at high incident energies is reformulated so as to become applicable also to scattering at intermediate energies. We test the validity of the eikonal and adiabatic approximations used in the formulation, and discuss the relation between the present theory and conventional Glauber calculations with either the empirical nucleon-nucleon profile function or the modified one including the in-medium effect.
The scope of usage-based theory.
Ibbotson, Paul
2013-01-01
Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the "cognitive commitment" of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure vs. cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works.
Theory-based interventions for contraception.
Lopez, Laureen M; Grey, Thomas W; Chen, Mario; Tolley, Elizabeth E; Stockton, Laurie L
2016-11-23
The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, many educational interventions addressing contraception have no explicit theoretical base. To review randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice and encourage or improve contraceptive use. To 1 November 2016, we searched for trials that tested a theory-based intervention for improving contraceptive use in PubMed, CENTRAL, POPLINE, Web of Science, ClinicalTrials.gov, and ICTRP. For the initial review, we wrote to investigators to find other trials. Included trials tested a theory-based intervention for improving contraceptive use. Interventions addressed the use of one or more methods for contraception. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy and contraceptive choice or use. We assessed titles and abstracts identified during the searches. One author extracted and entered the data into Review Manager; a second author verified accuracy. We examined studies for methodological quality.For unadjusted dichotomous outcomes, we calculated the Mantel-Haenszel odds ratio (OR) with 95% confidence interval (CI). Cluster randomized trials used various methods of accounting for the clustering, such as multilevel modeling. Most reports did not provide information to calculate the effective sample size. Therefore, we presented the results as reported by the investigators. We did not conduct meta-analysis due to varied interventions and outcome measures. We included 10 new trials for a total of 25. Five were conducted outside the USA. Fifteen randomly assigned individuals and 10 randomized clusters. This section focuses on nine trials with high or
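The Mantel-Haenszel pooled odds ratio mentioned in the methods can be computed directly from stratified 2x2 tables; a minimal sketch with hypothetical counts (the strata and numbers below are illustrative, not the review's data):

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel pooled odds ratio across strata. Each stratum is
    (a, b, c, d) for the 2x2 table:
        [[intervention & outcome (a),   intervention & no outcome (b)],
         [control & outcome (c),        control & no outcome (d)]]"""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# two hypothetical strata (e.g. two trial sites)
print(round(mantel_haenszel_or([(10, 20, 5, 25), (8, 12, 4, 16)]), 3))
```

The fixed-effect assumption here is that the true odds ratio is common across strata; the confidence interval (e.g. via the Robins-Breslow-Greenland variance) is omitted from this sketch.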
Information theory based approaches to cellular signaling.
Waltermann, Christian; Klipp, Edda
2011-10-01
Cells interact with their environment and they have to react adequately to internal and external changes such as changes in nutrient composition, physical properties like temperature or osmolarity, and other stresses. More specifically, they must be able to evaluate whether the external change is significant or just in the range of noise. Based on multiple external parameters they have to compute an optimal response. Cellular signaling pathways are considered the major means of information perception and transmission in cells. Here, we review different attempts to quantify information processing on the level of individual cells. We refer to Shannon entropy, mutual information, and informal measures of signaling pathway cross-talk and specificity. Information theory in systems biology has been successfully applied to identification of optimal pathway structures, mutual information and entropy as system response in sensitivity analysis, and quantification of input and output information. While the study of information transmission within the framework of information theory in technical systems is an advanced field with high impact in engineering and telecommunication, its application to biological objects and processes is still restricted to specific fields such as neuroscience, structural and molecular biology. However, in systems biology, which aims at a holistic understanding of biochemical systems and cellular signaling, examples of the application of information theory have emerged only recently. This article is part of a Special Issue entitled Systems Biology of Microorganisms. Copyright © 2011 Elsevier B.V. All rights reserved.
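The mutual information mentioned here quantifies how much a pathway's output reveals about its input. As a generic illustration (not code from the review), mutual information in bits can be computed directly from a discrete joint probability table:

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint probability table p(x, y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x), column vector
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y), row vector
    nz = joint > 0                          # skip zero-probability cells
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())
```

For independent variables the joint factorizes and the result is 0 bits; for a perfectly correlated binary input/output pair it equals the full 1 bit of input entropy.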
Oil monitoring methods based on information theory
XIA Yan-chun; HUO Hua
2009-01-01
To evaluate the wear condition of machines accurately, oil spectrographic entropy, mutual information, and ICA analysis methods based on information theory are presented. A full-scale diagnosis utilizing all channels of spectrographic analysis can be obtained. By measuring complexity and correlativity, the characteristics of the wear condition of machines can be shown clearly, and the diagnostic quality is improved. The analysis processes of these monitoring methods are illustrated through examples. The availability of these methods is validated and further research fields are indicated.
CONDITIONAL FACTORIZATION BASED ON LATTICE THEORY FOR -INTEGERS
Zheng Yonghui; Zhu Yuefei
2008-01-01
In this paper, the integer N = p^k q is called a -integer, if p and q are odd primes with almost the same size and k is a positive integer. Such integers were previously proposed for various cryptographic applications. The conditional factorization based on lattice theory for n-bit -integers is considered, and there is an algorithm in time polynomial in n to factor these integers if the least significant |(2k-1)n/(3k-1)(k-1)| bits of p are given.
Control theory based airfoil design using the Euler equations
Jameson, Antony; Reuther, James
1994-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.
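The key point of the control-theory (adjoint) approach is that one extra adjoint solve yields the full gradient of the cost, however many design variables there are. As a toy sketch on a linear "flow solver" (a generic illustration of the adjoint trick, not the paper's Euler-equation formulation), with state u solving Au = f and cost J = cᵀu, the adjoint solve Aᵀλ = c gives dJ/df = λ:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)) + 5 * np.eye(5)   # well-conditioned toy system
f = rng.standard_normal(5)                        # "design" right-hand side
c = rng.standard_normal(5)                        # cost weights

u = np.linalg.solve(A, f)        # forward solve
J = c @ u                        # scalar cost J = c^T u
lam = np.linalg.solve(A.T, c)    # single adjoint solve
grad = lam                       # dJ/df, all components at once

# sanity check against one finite difference per design variable
eps = 1e-6
fd = np.array([(c @ np.linalg.solve(A, f + eps * np.eye(5)[i]) - J) / eps
               for i in range(5)])
```

The adjoint gradient matches the finite-difference gradient while costing only one extra linear solve, which is why it provides "computationally inexpensive gradient information" for the optimizer.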
Introduction to the theory of bases
Marti, Jürg T
1969-01-01
Since the publication of Banach's treatise on the theory of linear operators, the literature on the theory of bases in topological vector spaces has grown enormously. Much of this literature has for its origin a question raised in Banach's book, the question whether every separable Banach space possesses a basis or not. The notion of a basis employed here is a generalization of that of a Hamel basis for a finite dimensional vector space. For a vector space X of infinite dimension, the concept of a basis is closely related to the convergence of the series which uniquely correspond to each point of X. Thus there are different types of bases for X, according to the topology imposed on X and the chosen type of convergence for the series. Although almost four decades have elapsed since Banach's query, the conjectured existence of a basis for every separable Banach space is not yet proved. On the other hand, no counter examples have been found to show the existence of a special Banach space having no basis. Howe...
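The basis notion discussed here is the Schauder basis. For reference, its standard definition (the textbook formulation, not a quotation from the book) can be stated as: a sequence $(x_n)$ in a Banach space $X$ is a Schauder basis if every $x \in X$ has a unique scalar expansion converging in norm,

```latex
x = \sum_{n=1}^{\infty} \alpha_n x_n ,
\qquad \text{i.e.} \qquad
\Bigl\| \, x - \sum_{n=1}^{N} \alpha_n x_n \Bigr\| \xrightarrow[N \to \infty]{} 0 .
```

The dependence on the topology of $X$ and on the mode of convergence of this series is what gives rise to the different types of bases the book classifies.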
Flocculation control study based on fractal theory
Anonymous
2005-01-01
A study on flocculation control based on fractal theory was carried out. Optimization tests of chemical coagulant dosage confirmed that the fractal dimension reflects the flocculation degree and settling characteristics of aggregates and correlates well with the turbidity of the settled effluent. The fractal dimension can therefore be used as the major parameter for flocculation system control, enabling automatic adjustment of the chemical coagulant dosage. Using the fractal-dimension-based flocculation control system, further study was carried out on the effects of various flocculation parameters, among them the dependency among aggregate fractal dimension, chemical coagulant dosage, and turbidity of settled effluent under conditions of variable water quality and quantity. Basic experimental data were obtained for establishing a chemical coagulant dosage control model based mainly on the aggregate fractal dimension.
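The fractal dimension used as the control parameter above is typically estimated by box counting over aggregate images. As a generic sketch of that estimate (not the study's instrumentation code), one counts occupied boxes at several grid scales and fits the log-log slope:

```python
import numpy as np

def box_counting_dimension(points, scales):
    """Estimate the box-counting (fractal) dimension of 2-D points in [0,1)^2.

    scales: list of box counts per side; dimension is the slope of
    log N(s) versus log s.
    """
    counts = []
    for s in scales:
        boxes = np.unique(np.floor(points * s).astype(int), axis=0)
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return slope

# a dense grid filling the unit square should give dimension ~ 2
g = np.linspace(0, 1, 256, endpoint=False)
pts = np.array([(x, y) for x in g for y in g])
d = box_counting_dimension(pts, [2, 4, 8, 16, 32])
```

A space-filling aggregate approaches dimension 2, while a more tenuous, branched aggregate gives a lower value; the study exploits exactly this sensitivity to floc structure.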
Torgén, M; Winkel, J; Alfredsson, L; Kilbom, A
1999-06-01
The principal aim of the present study was to evaluate questionnaire-based information on past physical work loads (6-year recall). Effects of memory difficulties on reproducibility were evaluated for 82 subjects by comparing previously reported results on current work loads (test-retest procedure) with the same items recalled 6 years later. Validity was assessed by comparing self-reports in 1995, regarding work loads in 1989, with worksite measurements performed in 1989. Six-year reproducibility, calculated as weighted kappa coefficients (k(w)), varied between 0.36 and 0.86, with the highest values for proportion of the workday spent sitting and for perceived general exertion and the lowest values for trunk and neck flexion. The six-year reproducibility results were similar to previously reported test-retest results for these items; this finding indicates that memory difficulties were a minor problem. The validity of the questionnaire responses, expressed as rank correlations (r(s)) between the questionnaire responses and workplace measurements, varied between -0.16 and 0.78. The highest values were obtained for the items sitting and repetitive work, and the lowest and "unacceptable" values were for head rotation and neck flexion. Misclassification of exposure did not appear to be differential with regard to musculoskeletal symptom status, as judged by the calculated risk estimates. The validity of some of these self-administered questionnaire items appears sufficient for a crude assessment of physical work loads in the past in epidemiologic studies of the general population with predominantly low levels of exposure.
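The weighted kappa coefficients reported above penalize disagreements by their distance on the ordinal rating scale. As a generic illustration of the computation (not the study's analysis code; quadratic weights are assumed here as one common choice):

```python
import numpy as np

def weighted_kappa(r1, r2, k):
    """Quadratically weighted kappa for two ratings on an ordinal 0..k-1 scale."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    obs = np.zeros((k, k))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()                       # observed agreement matrix
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # chance-expected matrix
    i, j = np.indices((k, k))
    w = (i - j) ** 2 / (k - 1) ** 2        # quadratic disagreement weights
    return 1 - (w * obs).sum() / (w * exp).sum()
```

Perfect agreement yields kappa = 1; off-by-one disagreements are penalized much less than disagreements across the full scale.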
Transitional clerkship: an experiential course based on workplace learning theory.
Chittenden, Eva H; Henry, Duncan; Saxena, Varun; Loeser, Helen; O'Sullivan, Patricia S
2009-07-01
Starting clerkships is anxiety provoking for medical students. To ease the transition from preclerkship to clerkship curricula, schools offer classroom-based courses which may not be the best model for preparing learners. Drawing from workplace learning theory, the authors developed a seven-day transitional clerkship (TC) in 2007 at the University of California, San Francisco School of Medicine in which students spent half of the course in the hospital, learning routines and logistics of the wards along with their roles and responsibilities as members of ward teams. Twice, they admitted and followed a patient into the next day as part of a shadow team that had no patient-care responsibilities. Dedicated preceptors gave feedback on oral presentations and patient write-ups. Satisfaction with the TC was higher than with the previous year's classroom-based course. TC students felt clearer about their roles and more confident in their abilities as third-year students compared with previous students. TC students continued to rate the transitional course highly after their first clinical rotation. Preceptors were enthusiastic about the course and expressed willingness to commit to future TC preceptorships. The transitional course models an approach to translating workplace learning theory into practice and demonstrates improved satisfaction, better understanding of roles, and increased confidence among new third-year students.
Sweis, R; Fox, M; Anggiansah, R; Anggiansah, A; Basavaraju, K; Canavan, R; Wong, T
2009-03-15
Standard pH monitoring is performed over 24 h with a naso-oesophageal catheter (C-pH). Limitations include naso-pharyngeal discomfort, nausea and social embarrassment resulting in reduced reflux-provoking activities. Recently a catheter-free pH-monitoring technique has become available. The tolerability and diagnostic yield of this system in patients who failed standard monitoring remain unknown. To examine the tolerability and diagnostic outcome of catheter-free pH-monitoring technique in patients who failed standard monitoring. Patients referred for C-pH and catheter-free pH monitoring completed a tolerability questionnaire. Acid exposure in the distal oesophagus and symptom index (SI) were reviewed. Over 4 years, 883/1751 (50%) of patients with typical reflux symptoms referred for C-pH were diagnosed with gastro-oesophageal reflux disease (GERD) based on a pathological percentage time acid exposure (%time pH patients failed C-pH and, of these, 129 successfully completed 2-day catheter-free pH monitoring. Ninety-eight (76%) of these patients had a pathological percentage pH patients (P patients who had previously failed C-pH; catheter-free pH monitoring assists the definitive diagnosis of GERD in this group.
Spectral clustering based on matrix perturbation theory
TIAN Zheng; LI XiaoBin; JU YanWei
2007-01-01
This paper exposes some intrinsic characteristics of the spectral clustering method by using tools from matrix perturbation theory. We construct a weight matrix of a graph and study its eigenvalues and eigenvectors. It is shown that the number of clusters is equal to the number of eigenvalues that are larger than 1, and that the number of points in each cluster can be approximated by the associated eigenvalue. It is also shown that the eigenvectors of the weight matrix can be used directly to perform clustering; that is, the angle between two row vectors of the matrix derived from the eigenvectors is a suitable distance measure for clustering. As a result, an unsupervised spectral clustering algorithm based on the weight matrix (USCAWM) is developed. Experimental results on a number of artificial and real-world data sets confirm the correctness of the theoretical analysis.
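The eigenvalue-counting claim can be checked numerically on well-separated data. The sketch below (a generic illustration, not the USCAWM algorithm itself; the Gaussian-affinity weight matrix and sigma value are assumptions) builds two tight point clouds and counts eigenvalues of the affinity matrix larger than 1:

```python
import numpy as np

# two tight, well-separated point clouds
rng = np.random.default_rng(1)
a = rng.normal(0.0, 0.05, (6, 2))
b = rng.normal(5.0, 0.05, (7, 2))
X = np.vstack([a, b])

# Gaussian-kernel weight matrix, sigma = 0.5
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / (2 * 0.5 ** 2))

eigvals = np.linalg.eigvalsh(W)
n_clusters = int((eigvals > 1).sum())   # per the paper: eigenvalues > 1 count clusters
```

With near-block-structured W, the two largest eigenvalues sit near the cluster sizes (6 and 7) while the rest stay well below 1, so the count recovers the two clusters.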
Physically based rendering from theory to implementation
Pharr, Matt
2010-01-01
"Physically Based Rendering, 2nd Edition" describes both the mathematical theory behind a modern photorealistic rendering system as well as its practical implementation. A method - known as 'literate programming'- combines human-readable documentation and source code into a single reference that is specifically designed to aid comprehension. The result is a stunning achievement in graphics education. Through the ideas and software in this book, you will learn to design and employ a full-featured rendering system for creating stunning imagery. This book features new sections on subsurface scattering, Metropolis light transport, precomputed light transport, multispectral rendering, and much more. It includes a companion site complete with source code for the rendering system described in the book, with support for Windows, OS X, and Linux. Code and text are tightly woven together through a unique indexing feature that lists each function, variable, and method on the page that they are first described.
Generalized theory of diffusion based on kinetic theory
Schäfer, T.
2016-10-01
We propose to use spin hydrodynamics, a two-fluid model of spin propagation, as a generalization of the diffusion equation. We show that in the dense limit spin hydrodynamics reduces to Fick's law and the diffusion equation. In the opposite limit spin hydrodynamics is equivalent to a collisionless Boltzmann treatment of spin propagation. Spin hydrodynamics avoids unphysical effects that arise when the diffusion equation is used to describe to a strongly interacting gas with a dilute corona. We apply spin hydrodynamics to the problem of spin diffusion in a trapped atomic gas. We find that the observed spin relaxation rate in the high-temperature limit [Sommer et al., Nature (London) 472, 201 (2011), 10.1038/nature09989] is consistent with the diffusion constant predicted by kinetic theory.
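For reference, the dense-limit behavior mentioned above is the standard Fick's-law form of diffusion (the textbook relation, not the paper's spin-hydrodynamic equations): the spin current is proportional to the density gradient, which yields the diffusion equation,

```latex
\vec{j}_s = -D \, \nabla n_s
\qquad \Longrightarrow \qquad
\frac{\partial n_s}{\partial t} = D \, \nabla^2 n_s ,
```

where $n_s$ is the spin density and $D$ the diffusion constant. Spin hydrodynamics interpolates between this dense limit and the collisionless Boltzmann regime.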
Information and communication theory. Citations from the NTIS data base
Carrigan, B.
1980-04-01
This bibliography cites Government-sponsored research on information and communication theory, including coding, decoding, and transmission of signals. Individual studies are cited on radio, television, and digital communication systems. Pure theory is also included. This updated bibliography contains 187 abstracts, 78 of which are new entries to the previous edition.
A molecularly based theory for electron transfer reorganization energy
Zhuang, Bilin; Wang, Zhen-Gang, E-mail: zgw@cheme.caltech.edu [Division of Chemistry and Chemical Engineering, California Institute of Technology, Pasadena, California 91125 (United States)
2015-12-14
Using field-theoretic techniques, we develop a molecularly based dipolar self-consistent-field theory (DSCFT) for charge solvation in pure solvents under equilibrium and nonequilibrium conditions and apply it to the reorganization energy of electron transfer reactions. The DSCFT uses a set of molecular parameters, such as the solvent molecule’s permanent dipole moment and polarizability, thus avoiding approximations that are inherent in treating the solvent as a linear dielectric medium. A simple, analytical expression for the free energy is obtained in terms of the equilibrium and nonequilibrium electrostatic potential profiles and electric susceptibilities, which are obtained by solving a set of self-consistent equations. With no adjustable parameters, the DSCFT predicts activation energies and reorganization energies in good agreement with previous experiments and calculations for the electron transfer between metallic ions. Because the DSCFT is able to describe the properties of the solvent in the immediate vicinity of the charges, it is unnecessary to distinguish between the inner-sphere and outer-sphere solvent molecules in the calculation of the reorganization energy as in previous work. Furthermore, examining the nonequilibrium free energy surfaces of electron transfer, we find that the nonequilibrium free energy is well approximated by a double parabola for self-exchange reactions, but the curvature of the nonequilibrium free energy surface depends on the charges of the electron-transferring species, contrary to the prediction by the linear dielectric theory.
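The double-parabola picture that the DSCFT results are compared against is the standard Marcus linear-response form (quoted here as background, not as the paper's own expressions): along a reaction coordinate $q$, the reactant and product free-energy surfaces and the resulting activation energy are

```latex
G_R(q) = \lambda q^{2}, \qquad
G_P(q) = \lambda (q-1)^{2} + \Delta G^{0}, \qquad
\Delta G^{\ddagger} = \frac{\bigl(\lambda + \Delta G^{0}\bigr)^{2}}{4\lambda} ,
```

with $\lambda$ the reorganization energy and $\Delta G^{0}$ the reaction free energy. The paper's finding is that the parabolic form holds well for self-exchange, but the curvature acquires a charge dependence absent from this linear-dielectric result.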
Analysis of system trustworthiness based on information flow noninterference theory
Xiangying Kong; Yanhui Chen; Yi Zhuang
2015-01-01
The trustworthiness analysis and evaluation are the bases of the trust chain transfer. In this paper the formal method of trustworthiness analysis of a system based on the noninterference (NI) theory of the information flow is studied. Firstly, existing methods cannot analyze the impact of the system states on the trustworthiness of software during the process of trust chain transfer. To solve this problem, the impact of the system state on trustworthiness of software is investigated, the run-time mutual interference behavior of software entities is described and an interference model of the access control automaton of a system is established. Secondly, based on the intransitive noninterference (INI) theory, a formal analytic method of trustworthiness for trust chain transfer is proposed, providing a theoretical basis for the analysis of dynamic trustworthiness of software during the trust chain transfer process. Thirdly, a prototype system with dynamic trustworthiness on a platform with dual core architecture is constructed and a verification algorithm of the system trustworthiness is provided. Finally, the monitor hypothesis is extended to the dynamic monitor hypothesis, a theorem of static judgment rule of system trustworthiness is provided, which is useful to prove dynamic trustworthiness of a system at the beginning of system construction. Compared with previous work in this field, this research proposes not only a formal analytic method for the determination of system trustworthiness, but also a modeling method and an analysis algorithm that are feasible for practical implementation.
Feature-Based Binding and Phase Theory
Antonenko, Andrei
2012-01-01
Current theories of binding cannot provide a uniform account for many facts associated with the distribution of anaphors, such as long-distance binding effects and the subject-orientation of monomorphemic anaphors. Further, traditional binding theory is incompatible with minimalist assumptions. In this dissertation I propose an analysis of…
Correspondence Between Bell Bases and Oriented Links in Knot Theory
QIAN ShangWu; GU ZhiYu
2002-01-01
From the comparison of the correlation tensor in the theory of quantum networks, the Alexander relation matrix in the theory of knot crystals, and the identical inversion relations under the action of Pauli matrices, we show that there is a one-to-one correspondence between the four Bell bases and four oriented links of the linkage 41 in knot theory.
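For reference, the four Bell bases in question are the standard maximally entangled two-qubit states (quoted as background, not from the paper):

```latex
|\Phi^{\pm}\rangle = \tfrac{1}{\sqrt{2}} \bigl( |00\rangle \pm |11\rangle \bigr),
\qquad
|\Psi^{\pm}\rangle = \tfrac{1}{\sqrt{2}} \bigl( |01\rangle \pm |10\rangle \bigr).
```

It is these four states that the paper places in correspondence with four oriented links.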
A Literature Review of Previous Research on Textual Cohesion Theory
杜晓文; 唐文杰
2011-01-01
The textual cohesion theory proposed by Halliday and Hasan consists of five categories: reference, substitution, ellipsis, conjunction, and lexical cohesion. Since the landmark publication of their Cohesion in English, much research has been conducted on cohesion and coherence. Scholars at home and abroad have discussed and studied cohesion from different perspectives and with different approaches, contributing to the continuing development and maturity of cohesion theory.
Unnur Valdimarsdóttir
2009-02-01
BACKGROUND: Psychotic illness following childbirth is a relatively rare but severe condition with unexplained etiology. The aim of this study was to investigate the impact of maternal background characteristics and obstetric factors on the risk of postpartum psychosis, specifically among mothers with no previous psychiatric hospitalizations. METHODS AND FINDINGS: We investigated incidence rates and potential maternal and obstetric risk factors of psychoses after childbirth in a national cohort of women who were first-time mothers from 1983 through 2000 (n = 745,596). Proportional hazard regression models were used to estimate relative risks of psychoses during and after the first 90 d postpartum, among mothers without any previous psychiatric hospitalization and among all mothers. Within 90 d after delivery, 892 women (1.2 per 1,000 births; 4.84 per 1,000 person-years) were hospitalized due to psychoses and 436 of these (0.6 per 1,000 births; 2.38 per 1,000 person-years) had not previously been hospitalized for any psychiatric disorder. During follow-up after the 90 d postpartum period, the corresponding incidence rates per 1,000 person-years were reduced to 0.65 for all women and 0.49 for women not previously hospitalized. During (but not after) the first 90 d postpartum, the risk of psychoses among women without any previous psychiatric hospitalization was independently affected by: maternal age (35 y or older versus 19 y or younger; hazard ratio 2.4, 95% confidence interval [CI] 1.2 to 4.7); high birth weight (> or = 4,500 g; hazard ratio 0.3, 95% CI 0.1 to 1.0); and diabetes (hazard ratio 0. CONCLUSIONS: The incidence of psychotic illness peaks immediately following a first childbirth, and almost 50% of the cases are women without any previous psychiatric hospitalization. High maternal age increases the risk while diabetes and high birth weight are associated with reduced risk of first-onset psychoses, distinctly during the postpartum period.
Andreasen, Charlotte Hartig; Nielsen, Jonas B; Refsgaard, Lena
2013-01-01
with these cardiomyopathies, but the disease-causing effect of reported variants is often dubious. In order to identify possible false-positive variants, we investigated the prevalence of previously reported cardiomyopathy-associated variants in recently published exome data. We searched for reported missense and nonsense...... variants in the NHLBI-Go Exome Sequencing Project (ESP) containing exome data from 6500 individuals. In ESP, we identified 94 variants out of 687 (14%) variants previously associated with HCM, 58 out of 337 (17%) variants associated with DCM, and 38 variants out of 209 (18%) associated with ARVC...... times higher than expected from the phenotype prevalences in the general population (HCM 1:500, DCM 1:2500, and ARVC 1:5000) and our data suggest that a high number of these variants are not monogenic causes of cardiomyopathy....
The Euphemism variety Based on Register theory
贾文庆
2015-01-01
A euphemism is the substitution of an agreeable expression for one that may offend or suggest something unpleasant. Euphemisms have a close relationship with context, especially the tenor of discourse. This paper attempts to analyze the variation of euphemisms guided by register theory.
Intelligent control based on fuzzy logic and neural net theory
Lee, Chuen-Chien
1991-01-01
In the conception and design of intelligent systems, one promising direction involves the use of fuzzy logic and neural network theory to enhance such systems' capability to learn from experience and adapt to changes in an environment of uncertainty and imprecision. Here, an intelligent control scheme is explored by integrating these multidisciplinary techniques. A self-learning system is proposed as an intelligent controller for dynamical processes, employing a control policy which evolves and improves automatically. One key component of the intelligent system is a fuzzy logic-based system which emulates human decision making behavior. It is shown that the system can solve a fairly difficult control learning problem. Simulation results demonstrate that improved learning performance can be achieved in relation to previously described systems employing bang-bang control. The proposed system is relatively insensitive to variations in the parameters of the system environment.
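A fuzzy logic-based controller of the kind described maps a measured error to a control action through overlapping membership functions and rules. The sketch below is a generic zero-order Sugeno illustration (the rule breakpoints and output levels are invented for illustration, not taken from the paper):

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b and support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error):
    """Zero-order Sugeno controller: three rules mapping error to a command."""
    rules = [
        (tri(error, -2.0, -1.0, 0.0),  1.0),   # error negative -> push up
        (tri(error, -1.0,  0.0, 1.0),  0.0),   # error near zero -> hold
        (tri(error,  0.0,  1.0, 2.0), -1.0),   # error positive -> push down
    ]
    num = sum(w * u for w, u in rules)
    den = sum(w for w, u in rules)
    return num / den if den else 0.0         # weighted-average defuzzification
```

Intermediate errors activate two rules at once, so the output blends smoothly between the rule consequents; in the paper's self-learning scheme, comparable rule parameters are what the system tunes from experience.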
A theory for cursive handwriting based on the minimization principle.
Wada, Y; Kawato, M
1995-06-01
We propose a trajectory planning and control theory which provides explanations at the computation, algorithm, representation, and hardware levels for continuous movement such as connected cursive handwriting. The hardware is based on our previously proposed forward-inverse-relaxation neural network. Computationally, the optimization principle is the minimum torque-change criterion. At the representation level, hard constraints satisfied by a trajectory are represented as a set of via-points extracted from handwritten characters. Accordingly, we propose a via-point estimation algorithm that estimates via-points by repeating trajectory formation of a character and via-point extraction from the character. It is shown experimentally that for movements with a single via-point target, the via-point estimation algorithm can assign a point near the actual via-point target. Good quantitative agreement is found between human movement data and the trajectories generated by the proposed model.
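The minimum torque-change criterion referred to above has the standard form (the usual statement of the criterion, paraphrased rather than quoted from the paper): among trajectories satisfying the via-point constraints, the planner selects the one minimizing

```latex
C_T = \frac{1}{2} \int_{0}^{t_f} \sum_{j} \left( \frac{d\tau_j}{dt} \right)^{2} dt ,
```

where $\tau_j$ is the torque at joint $j$ and $t_f$ the movement duration. Smooth, human-like pen strokes emerge because abrupt torque changes are penalized.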
Axial vibration analysis of nanocones based on nonlocal elasticity theory
Shu-Qi Guo; Shao-Pu Yang
2012-01-01
Carbon nanocones have quite fascinating electronic and structural properties, but their axial vibration has seldom been investigated in previous studies. In this paper, based on a nonlocal elasticity theory, a nonuniform rod model is applied to investigate the small-scale effect and the nonuniform effect on the axial vibration of nanocones. Using the modified Wentzel-Brillouin-Kramers (WBK) method, an asymptotic solution is obtained for the axial vibration of general nonuniform nanorods. Then, using a similar procedure, the axial vibration of nanocones is analyzed for nonuniform parameters, mode number and nonlocal parameters. Explicit expressions are derived for mode frequencies of clamped-clamped and clamped-free boundary conditions. It is found that axial vibration frequencies are highly overestimated by the classical rod model because it ignores the effect of small length scale.
Silva-Junior, Mario R.; Schreiber, Marko; Sauer, Stephan P. A.;
2008-01-01
Time-dependent density functional theory (TD-DFT) and DFT-based multireference configuration interaction (DFT/MRCI) calculations are reported for a recently proposed benchmark set of 28 medium-sized organic molecules. Vertical excitation energies, oscillator strengths, and excited-state dipole moments are computed using the same geometries (MP2/6-31G*) and basis set (TZVP) as in our previous ab initio benchmark study on electronically excited states. The results from TD-DFT (with the functionals BP86, B3LYP, and BHLYP) and from DFT/MRCI are compared against the previous high-level ab initio...
Kristina Dalberg
2006-09-01
BACKGROUND: Data on birth outcome and offspring health after the appearance of breast cancer are limited. The aim of this study was to assess the risk of adverse birth outcomes in women previously treated for invasive breast cancer compared with the general population of mothers. METHODS AND FINDINGS: Of all 2,870,932 singleton births registered in the Swedish Medical Birth Registry during 1973-2002, 331 first births following breast cancer surgery--with a mean time to pregnancy of 37 mo (range 7-163)--were identified using linkage with the Swedish Cancer Registry. Logistic regression analysis was used. The estimates were adjusted for maternal age, parity, and year of delivery. Odds ratios (ORs) and 95% confidence intervals (CIs) were used to estimate infant health and mortality, delivery complications, the risk of preterm birth, and the rates of instrumental delivery and cesarean section. The large majority of births from women previously treated for breast cancer had no adverse events. However, births by women exposed to breast cancer were associated with an increased risk of delivery complications (OR 1.5, 95% CI 1.2-1.9), cesarean section (OR 1.3, 95% CI 1.0-1.7), very preterm birth (<32 wk; OR 3.2, 95% CI 1.7-6.0), and low birth weight (<1500 g; OR 2.9, 95% CI 1.4-5.8). A tendency towards an increased risk of malformations among the infants was seen especially in the later time period (1988-2002) (OR 2.1, 95% CI 1.2-3.7). CONCLUSIONS: It is reassuring that births overall were without adverse events, but our findings indicate that pregnancies in previously treated breast cancer patients should possibly be regarded as higher risk pregnancies, with consequences for their surveillance and management.
A Translation Case Analysis Based on Skopos Theory
盖孟姣
2015-01-01
This paper is a translation case analysis based on Skopos Theory. It takes President Xi's New Year congratulations of 2015 as the analysis text and focuses on translating the text in light of Skopos Theory.
Nascimento, Marcelle M; Gordan, Valeria V; Qvist, Vibeke;
2010-01-01
The authors conducted a study to identify and quantify the reasons used by dentists in The Dental Practice-Based Research Network (DPBRN) for placing restorations on unrestored permanent tooth surfaces and the dental materials they used in doing so....
Mapping site-based construction workers' motivation: Expectancy theory approach
Ghoddousi, Parviz; Bahrami, Nima; Chileshe, Nicholas; Hosseini, M Reza
2014-01-01
The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in construction and confirm the validity of this model for the construction industry...
Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen
2014-01-01
Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently-rated attachment narrative representations and peer nominations. Results indicated that Attachment theory-based and Social Learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.
Release behaviour of clozapine matrix pellets based on percolation theory.
Aguilar-de-Leyva, Angela; Sharkawi, Tahmer; Bataille, Bernard; Baylac, Gilles; Caraballo, Isidoro
2011-02-14
The release behaviour of clozapine matrix pellets was studied in order to investigate whether it can be explained by the concepts of percolation theory, previously used to understand the release process of inert and hydrophilic matrix tablets. Thirteen batches of pellets with different proportions of clozapine/microcrystalline cellulose (MCC)/hydroxypropylmethyl cellulose (HPMC) and different clozapine particle size fractions were prepared by extrusion-spheronisation, and their release profiles were studied. It was observed that the distance to the excipient (HPMC) percolation threshold is important in controlling the release rate. Furthermore, the drug percolation threshold has a strong influence on these systems. Batches very close to the drug percolation threshold show a clear effect of drug particle size on the release rate. However, this effect is much less evident at a greater distance from the drug percolation threshold. The release behaviour of clozapine matrix pellets can therefore be explained on the basis of percolation theory.
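The percolation-threshold behaviour invoked in this abstract can be illustrated with a generic Monte Carlo sketch on a 2-D site lattice. This is a sketch of percolation theory itself, not of the authors' pellet formulations; the grid size, trial counts, and seed are arbitrary assumptions:

```python
import random

def percolates(grid):
    # True if occupied sites connect the top row to the bottom row
    # (depth-first search, 4-neighbour connectivity).
    n = len(grid)
    stack = [(0, j) for j in range(n) if grid[0][j]]
    seen = set(stack)
    while stack:
        i, j = stack.pop()
        if i == n - 1:
            return True
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] and (ni, nj) not in seen:
                seen.add((ni, nj))
                stack.append((ni, nj))
    return False

def spanning_probability(p, n=20, trials=200, seed=0):
    # Monte Carlo estimate of the probability that a random n x n site
    # lattice with occupation probability p percolates top-to-bottom.
    rng = random.Random(seed)
    hits = sum(
        percolates([[rng.random() < p for _ in range(n)] for _ in range(n)])
        for _ in range(trials)
    )
    return hits / trials

# The spanning probability jumps sharply near the site-percolation
# threshold of the square lattice (about 0.593).
for p in (0.4, 0.6, 0.8):
    print(p, spanning_probability(p))
```

The sharp rise of the spanning probability around the threshold is the behaviour that makes "distance to the percolation threshold" a meaningful control variable for release rates.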
The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory
Anil, Duygu
2008-01-01
In this study, the predictive power of experts' judgments of item characteristics, for conditions in which try-out practices cannot be applied, was examined against item characteristics computed according to classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…
The Euphemism Variety Based on Register Theory
贾文庆
2015-01-01
A euphemism is the substitution of an agreeable expression for one that may offend or suggest something unpleasant. Euphemisms have a close relationship with context, especially the tenor of discourse. This paper attempts to analyze the variation of euphemisms guided by register theory.
Recursive renormalization group theory based subgrid modeling
Zhou, YE
1991-01-01
Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.
Complete theory of symmetry-based indicators of band topology.
Po, Hoi Chun; Vishwanath, Ashvin; Watanabe, Haruki
2017-06-30
The interplay between symmetry and topology leads to a rich variety of electronic topological phases, protecting states such as the topological insulators and Dirac semimetals. Previous results, like the Fu-Kane parity criterion for inversion-symmetric topological insulators, demonstrate that symmetry labels can sometimes unambiguously indicate underlying band topology. Here we develop a systematic approach to expose all such symmetry-based indicators of band topology in all the 230 space groups. This is achieved by first developing an efficient way to represent band structures in terms of elementary basis states, and then isolating the topological ones by removing the subset of atomic insulators, defined by the existence of localized symmetric Wannier functions. Aside from encompassing all earlier results on such indicators, including in particular the notion of filling-enforced quantum band insulators, our theory identifies symmetry settings with previously hidden forms of band topology, and can be applied to the search for topological materials. Understanding the role of topology in determining electronic structure can lead to the discovery, or appreciation, of materials with exotic properties such as protected surface states. Here, the authors present a framework for identifying topologically distinct band structures for all 3D space groups.
SMD-based numerical stochastic perturbation theory
Dalla Brida, Mattia [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN, Sezione di Milano-Bicocca (Italy); Luescher, Martin [CERN, Theoretical Physics Department, Geneva (Switzerland); AEC, Institute for Theoretical Physics, University of Bern (Switzerland)
2017-05-15
The viability of a variant of numerical stochastic perturbation theory, where the Langevin equation is replaced by the SMD algorithm, is examined. In particular, the convergence of the process to a unique stationary state is rigorously established and the use of higher-order symplectic integration schemes is shown to be highly profitable in this context. For illustration, the gradient-flow coupling in finite volume with Schroedinger functional boundary conditions is computed to two-loop (i.e. NNL) order in the SU(3) gauge theory. The scaling behaviour of the algorithm turns out to be rather favourable in this case, which allows the computations to be driven close to the continuum limit. (orig.)
Graph-based linear scaling electronic structure theory
Niklasson, Anders M N; Negre, Christian F A; Cawkwell, Marc J; Swart, Pieter J; Mohd-Yusof, Jamal; Germann, Timothy C; Wall, Michael E; Bock, Nicolas; Djidjev, Hristo
2016-01-01
We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.
Support vector machines optimization based theory, algorithms, and extensions
Deng, Naiyang; Zhang, Chunhua
2013-01-01
Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions presents an accessible treatment of the two main components of support vector machines (SVMs): classification problems and regression problems. The book emphasizes the close connection between optimization theory and SVMs, since optimization is one of the pillars on which SVMs are built. The authors share insight on many of their research achievements. They give a precise interpretation of statistical learning theory for C-support vector classification. They also discuss regularized twi…
A Stylistic Study of Translator Based on Western Translation Theory
Wu; Jing
2015-01-01
This paper focuses on the historic development of the stylistic study of the translator in light of western translation theory, and analyzes its features in each phase and the current emphasis of research, by which knowledge of the development of western theory and translators' style can be shown. After that, we can examine the problems existing in the stylistic study of the translator based on western theory, so that we can provide a panoramic analysis of this field.
Theory of friction based on brittle fracture
Byerlee, J.D.
1967-01-01
A theory of friction is presented that may be more applicable to geologic materials than the classic Bowden and Tabor theory. In the model, surfaces touch at the peaks of asperities and sliding occurs when the asperities fail by brittle fracture. The coefficient of friction, μ, was calculated from the strength of asperities of certain ideal shapes; for cone-shaped asperities, μ is about 0.1 and for wedge-shaped asperities, μ is about 0.15. For actual situations which seem close to the ideal model, observed μ was found to be very close to 0.1, even for materials such as quartz and calcite with widely differing strengths. If surface forces are present, the theory predicts that μ should decrease with load and that it should be higher in a vacuum than in air. In the presence of a fluid film between sliding surfaces, μ should depend on the area of the surfaces in contact. Both effects are observed. The character of wear particles produced during sliding and the way in which μ depends on normal load, roughness, and environment lend further support to the model of friction presented here. © 1967 The American Institute of Physics.
Portuguese Public University Student Satisfaction: A Stakeholder Theory-Based Approach
Mainardes, Emerson; Alves, Helena; Raposo, Mario
2013-01-01
In accordance with the importance of the student stakeholder to universities, the objective of this research project was to evaluate student satisfaction at Portuguese public universities as regards their self-expressed core expectations. The research was based both on stakeholder theory itself and on previous studies of university stakeholders.…
Implications and Applications of Modern Test Theory in the Context of Outcomes Based Education.
Andrich, David
2002-01-01
Uses a framework previously developed to relate outcomes based education and B. Bloom's "Taxonomy of Educational Objectives" to consider ways in which modern test theory can be used to connect aspects of assessment to the curriculum framework and to consider insights this connection might provide. (SLD)
Danae Campos-Melo
2014-09-01
MicroRNAs (miRNAs) are small non-coding RNAs that regulate the majority of the transcriptome at a post-transcriptional level. Because of this critical role, it is important to ensure that the assays used to determine their functionality are robust and reproducible. Typically, the reporter gene assay in cell-based systems has been the first-line method to study miRNA functionality. In order to overcome some of the potential errors in interpretation that can be associated with this assay, we have developed a detailed protocol for the luciferase reporter gene assay that has been modified for miRNAs. We demonstrate that normalization against the effect of the miRNA and cellular factors on the luciferase coding sequence is essential to obtain the specific impact of the miRNA on the 3'UTR (untranslated region) target. Our findings suggest that there is a real possibility that the roles for miRNA in transcriptome regulation may be misreported due to inaccurate normalization of experimental data, and also that up-regulatory effects of miRNAs are not uncommon in cells. We propose to establish this comprehensive method as a standard for miRNA luciferase reporter assays to avoid errors and misinterpretations in the functionality of miRNAs.
How Can Theory-Based Evaluation Make Greater Headway?
Weiss, Carol H.
1997-01-01
Explores the problems of theory-based evaluation, describes the nature of potential benefits, and suggests that the benefits are significant enough to warrant continued effort to overcome the obstacles and advance its use. Many of the problems are related to inadequate theories about pathways to desired program outcomes. (SLD)
Multimedia College English Teaching based upon the Behaviorism Theory
蒋华; 徐卿
2014-01-01
According to behaviorism learning theory, language learning is a process of habit acquisition and practice. Based on the characteristics of the college English teaching mode under multimedia Internet conditions, this article explores the practical guiding function of behaviorism learning theory for college English teaching, so as to realize the goal of college English teaching.
Network Security Transmission Based on Bimatrix Game Theory
ZHENG Ying; HU Hanping; GUO Wenxuan
2006-01-01
Based on bimatrix game theory, network data transmission is depicted in game-theoretic terms: the actions of the attacker and the defender (legitimate users) are modeled as a two-person, non-cooperative bimatrix game. This paper proves the existence of a Nash equilibrium theoretically, which is further illustrated by experimental results.
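As a generic illustration of the bimatrix setting described in this abstract, a pure-strategy Nash-equilibrium check can be sketched as follows. The attacker/defender payoff numbers are invented for illustration and are not taken from the paper:

```python
def pure_nash_equilibria(A, B):
    """Return all pure-strategy Nash equilibria (i, j) of a bimatrix game.

    A[i][j] is the row player's payoff and B[i][j] the column player's
    payoff when row plays strategy i and column plays strategy j.
    """
    eqs = []
    rows, cols = len(A), len(A[0])
    for i in range(rows):
        for j in range(cols):
            # i must be a best response to j, and j a best response to i.
            best_row = max(A[r][j] for r in range(rows))
            best_col = max(B[i][c] for c in range(cols))
            if A[i][j] >= best_row and B[i][j] >= best_col:
                eqs.append((i, j))
    return eqs

# Toy game: row = attacker (attack, idle), column = defender (monitor, ignore).
attacker = [[-1, -2],
            [ 0,  0]]
defender = [[ 2, -5],
            [-1,  0]]
print(pure_nash_equilibria(attacker, defender))  # → [(1, 1)], i.e. (idle, ignore)
```

Note that bimatrix games need not have pure-strategy equilibria at all (the Nash existence theorem guarantees only mixed equilibria), so exhaustive best-response checks like this are a didactic sketch rather than a general solver.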
Theory-Based Approaches to the Concept of Life
El-Hani, Charbel Nino
2008-01-01
In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…
Task-Based Language Teaching and Expansive Learning Theory
Robertson, Margaret
2014-01-01
Task-Based Language Teaching (TBLT) has become increasingly recognized as an effective pedagogy, but its location in generalized sociocultural theories of learning has led to misunderstandings and criticism. The purpose of this article is to explain the congruence between TBLT and Expansive Learning Theory and the benefits of doing so. The merit…
Code generation based on formal BURS theory and heuristic search
Nymeyer, Albert; Katoen, Joost P.
BURS theory provides a powerful mechanism to efficiently generate pattern matches in a given expression tree. BURS, which stands for bottom-up rewrite system, is based on term rewrite systems, to which costs are added. We formalise the underlying theory, and derive an algorithm that computes all
MOTIVATING ENGLISH TEACHERS BASED ON THE BASIC NEEDS THEORY AND AN EXPECTANCY THEORY
Hidayatus Sholihah
2017-08-01
There are two main motivation theories: the hierarchy of basic needs theory and expectancy theory. In the hierarchy of basic needs theory, Maslow states that the basic needs that direct behaviour are structured into a hierarchy. There are five basic human needs. The first is physiological needs, such as salary, bonus or working conditions. The second is safety needs, such as a safe job environment, job security or health cover. The third is social needs, such as unions and team work. The next is self-esteem, such as receiving an award, medal, certificate or other recognition. The last is self-actualization, for example by providing an opportunity to share knowledge, skills and experience. The evaluation of this theory is as follows: the absence of spiritual needs among the basic human needs is a main weakness, and different levels of needs may have to be satisfied at the same time rather than in hierarchical order. The next motivation theory is expectancy theory, which is based on three main factors. First, English teachers will be motivated to work harder if they have a good perception of their own competence in relation to their job. Second, individual motivation depends on the rewards given when they finish a particular job. Finally, it also depends on their regard for the rewards given for the job that they do. Expectancy theory is a good theory; however, it is not easy to implement, because principals should provide various types of reward to satisfy the expectations of their English teachers. Considering the strengths and weaknesses of these two theories, it is better to combine both of them in practice to get more effective results.
Theory-Based Lexicographical Methods in a Functional Perspective
Tarp, Sven
2014-01-01
This contribution provides an overview of some of the methods used in relation to the function theory. It starts with a definition of the concept of method and the relation existing between theory and method. It establishes an initial distinction between artisanal and theory-based methods… of various methods used in the different sub-phases of the overall dictionary compilation process, from the making of the concept to the preparation for publication on the chosen media, with focus on the Internet. Finally, it briefly discusses some of the methods used to create and test the function theory…
An Optimization Model Based on Game Theory
Yang Shi
2014-04-01
Game theory has a wide range of applications in economics, but it is seldom used in the field of computer science, especially in optimization algorithms. In this paper, we integrate game-theoretic thinking into optimization and propose a new optimization model that can be widely used in optimization processing. This optimization model is divided into two types, called "complete consistency" and "partial consistency"; the partial consistency type adds a disturbance strategy on the basis of complete consistency. When the model's consistency is satisfied, the Nash equilibrium of the optimization model is globally optimal, and when the model's consistency is not met, the presence of the perturbation strategy can improve the applicability of the algorithm. Basic experiments suggest that this optimization model has broad applicability and better performance, and it gives a new idea for some intractable problems in the field of artificial intelligence.
Using Activity Theory as a Base for Investigating Language Teacher ...
Using Activity Theory as a Base for Investigating Language Teacher Education through ... However, use of the platform is constrained by tensions and contradictions at system and individual levels.
Making Theory Come Alive through Practice-based Design Research
Markussen, Thomas; Knutz, Eva; Rind Christensen, Poul
2011-01-01
The aim of this paper is to demonstrate how practice-based design research is able not only to challenge, but also to push toward further development of, some of the basic assumptions in emotion theories as used within design research. In so doing, we wish to increase knowledge on a central epistemological question for design research, namely how practice-based design research can be a vehicle for the construction of new theory for design research.
Measurement Theory in Deutsch's Algorithm Based on the Truth Values
Nagata, Koji; Nakamura, Tadao
2016-08-01
We propose a new measurement theory for handling qubits, based on the truth values, i.e., the truth T (1) for true and the falsity F (0) for false. The results of measurement are either 0 or 1. To implement Deutsch's algorithm, we need both observability and controllability of a quantum state; the new measurement theory can satisfy these two requirements. In particular, we systematically describe our assertion based on more mathematical analysis, using raw data from a thought experiment.
Modern Resource-Based Theory(ies)
Foss, Nicolai Juul; Stieglitz, Nils
We survey the resource-based view in strategic management, focusing on its roots in economics. We organize our discussion in terms of the Gavetti and Levinthal distinction between a "high church" and a "low church" resource-based view, and argue that these hitherto rather separate streams…
MODERN AREAS OF DEVELOPMENT OF RESOURCE-BASED THEORY
O. Miroshnychenko
2015-06-01
The essence of the resource-based view is its focus on using a unique and rare combination of resources, core competences and organizational capabilities of the firm. New intellectual resources and their combinations provide for the formation of sustainable competitive advantages. Therefore, there is a need to connect research on the application of resource-based theory with the creation of new combinations of resources in a dynamic environment. In this paper, modern areas of development of resource-based theory have been considered. The concept of knowledge management, the concept of open innovation, the resource-based view and the system organization of the economy have been characterised.
Theories about consensus-based conservation.
Leach, William D
2006-04-01
"Conservation and the Myth of Consensus" (Peterson et al. 2005) levels several serious indictments against consensus-based approaches to environmental decision making. Namely, the authors argue that consensus processes (1) reinforce apathy and ignorance of conservation issues; (2) legitimize damage to the environment; (3) quash public debate about conservation; (4) solidify the existing balance of power in favor of prodevelopment forces; and (5) block progress toward an ecologically sustainable future. Careful scrutiny of consensus-based approaches is important, especially considering their surging use in conservation policy. In the spirit of advancing the debate further, I review some of the limitations of the essay and its modes of inquiry.
Cognitive performance modeling based on general systems performance theory.
Kondraske, George V
2010-01-01
General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).
IMMAN: free software for information theory-based chemometric analysis.
Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo
2015-05-01
The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
Moon, Byongook; Morash, Merry; McCluskey, Cynthia Perez; Hwang, Hye-Won
2009-01-01
Using longitudinal data on South Korean youth, the authors addressed limitations of previous tests of general strain theory (GST), focusing on the relationships among key strains, situational- and trait-based negative emotions, conditioning factors, and delinquency. Eight types of strain previously shown most likely to result in delinquency,…
Nami, Mohammad Rahim; Janghorban, Maziar
2013-12-30
In this article, a new higher order shear deformation theory based on trigonometric shear deformation theory is developed. In order to consider the size effects, the nonlocal elasticity theory is used. An analytical method is adopted to solve the governing equations for static analysis of simply supported nanoplates. In the present theory, the transverse shear stresses satisfy the traction free boundary conditions of the rectangular plates and these stresses can be calculated from the constitutive equations. The effects of different parameters such as nonlocal parameter and aspect ratio are investigated on both nondimensional deflections and deflection ratios. It may be important to mention that the present formulations are general and can be used for isotropic, orthotropic and anisotropic nanoplates.
Homogenization-based multi-scale damage theory
Anonymous
2010-01-01
The research of modern mechanics reveals that the damage and failure of structures should be considered on different scales. The present paper is dedicated to establishing the multi-scale damage theory for the nonlinear structural analysis. Starting from the asymptotic expansion based homogenization theory, the multi-scale energy integration is proposed to bridge the gap between the micro and macro scales. By recalling the Helmholtz free energy based damage definition, the damage variable is represented by the multi-scale energy integration. Hence the damage evolution could be numerically simulated on the basis of the unit cell analysis rather than the experimental data identification. Finally the framework of the multi-scale damage theory is established by transforming the multi-scale damage evolution into the conventional continuum damage mechanics. The agreement between the simulated results and the benchmark results indicates the validity and effectiveness of the proposed theory.
Kinetic energy decomposition scheme based on information theory.
Imamura, Yutaka; Suzuki, Jun; Nakai, Hiromi
2013-12-15
We proposed a novel kinetic energy decomposition analysis based on information theory. Since the Hirshfeld partitioning for electron densities can be formulated in terms of Kullback-Leibler information deficiency in information theory, a similar partitioning for kinetic energy densities was newly proposed. The numerical assessments confirm that the current kinetic energy decomposition scheme provides reasonable chemical pictures for ionic and covalent molecules, and can also estimate atomic energies using a correction with virial ratios.
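The stockholder (Hirshfeld) partitioning that this abstract builds on can be illustrated in one dimension. This is a generic sketch with invented Gaussian "pro-atom" densities on a toy grid, not the authors' kinetic-energy scheme; the stockholder weights shown are the ones that minimize the Kullback-Leibler information deficiency:

```python
import math

def gaussian(x, center, width=1.0):
    # Normalized 1-D Gaussian used as a model "pro-atom" density.
    return math.exp(-((x - center) / width) ** 2) / (width * math.sqrt(math.pi))

def hirshfeld_weights(x, centers):
    # Stockholder weights w_A(x) = rho_A0(x) / sum_B rho_B0(x).
    pro = [gaussian(x, c) for c in centers]
    total = sum(pro)
    return [p / total for p in pro]

def partition_property(density, centers, xs):
    # Split an integrated property density into atomic shares via the
    # weights (simple rectangle-rule quadrature on the grid xs).
    dx = xs[1] - xs[0]
    shares = [0.0] * len(centers)
    for x in xs:
        for a, w in enumerate(hirshfeld_weights(x, centers)):
            shares[a] += w * density(x) * dx
    return shares

centers = [-1.0, 1.0]                          # two model "atoms"
xs = [-10.0 + 0.01 * i for i in range(2001)]   # quadrature grid
mol = lambda x: gaussian(x, -1.0) + gaussian(x, 1.0)  # model molecular density
shares = partition_property(mol, centers, xs)
print([round(s, 3) for s in shares])  # symmetric molecule: equal atomic shares
```

Replacing `mol` with a kinetic energy density (rather than an electron density) is, in spirit, the generalization the abstract describes.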
Complexity measurement based on information theory and kolmogorov complexity.
Lui, Leong Ting; Terrazas, Germán; Zenil, Hector; Alexander, Cameron; Krasnogor, Natalio
2015-01-01
In the past decades many definitions of complexity have been proposed. Most of these definitions are based either on Shannon's information theory or on Kolmogorov complexity; these two are often compared, but very few studies integrate the two ideas. In this article we introduce a new measure of complexity that builds on both of these theories. As a demonstration of the concept, the technique is applied to elementary cellular automata and simulations of the self-organization of porphyrin molecules.
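The idea of combining the two notions of complexity mentioned in this abstract can be sketched minimally: empirical Shannon entropy for the statistical side, and compressed length as a crude Kolmogorov-complexity proxy, applied to an elementary cellular automaton. The rule choice, grid size, and measures below are illustrative assumptions, not the authors' exact measure:

```python
import math
import zlib

def shannon_entropy(s):
    # Empirical Shannon entropy (bits per symbol) of a string.
    n = len(s)
    return -sum((c / n) * math.log2(c / n)
                for c in (s.count(ch) for ch in set(s)))

def compressed_size(s):
    # Kolmogorov-complexity proxy: length of the zlib-compressed string.
    return len(zlib.compress(s.encode()))

def eca_step(row, rule=110):
    # One step of an elementary cellular automaton (periodic boundary).
    # Neighbourhood (left, center, right) indexes a bit of the rule number.
    n = len(row)
    return [(rule >> (4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n])) & 1
            for i in range(n)]

# Evolve rule 110 from a single seed cell and measure the final row.
row = [0] * 63 + [1] + [0] * 64
for _ in range(100):
    row = eca_step(row, rule=110)
s = "".join(map(str, row))
print(round(shannon_entropy(s), 3), compressed_size(s))
```

Entropy alone cannot distinguish a random string from a structured one with the same symbol frequencies, which is why measures like the one in the abstract bring in a compressibility (Kolmogorov-style) component as well.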
Opera house acoustics based on subjective preference theory
Ando, Yoichi
2015-01-01
This book focuses on opera house acoustics based on subjective preference theory; it targets researchers in acoustics and vision who are working in physics, psychology, and brain physiology. This book helps readers to understand any subjective attributes in relation to objective parameters based on the powerful and workable model of the auditory system. It is reconfirmed here that the well-known Helmholtz theory, which was based on a peripheral model of the auditory system, may not well describe pitch, timbre, and duration as well as the spatial sensations described in this book, nor overall responses such as subjective preference of sound fields and the annoyance of environmental noise.
Elastic theory of origami-based metamaterials
Brunck, V.; Lechenault, F.; Reid, A.; Adda-Bedia, M.
2016-03-01
Origami offers the possibility for new metamaterials whose overall mechanical properties can be programed by acting locally on each crease. Starting from a thin plate and having knowledge about the properties of the material and the folding procedure, one would like to determine the shape taken by the structure at rest and its mechanical response. In this article, we introduce a vector deformation field acting on the imprinted network of creases that allows us to express the geometrical constraints of rigid origami structures in a simple and systematic way. This formalism is then used to write a general covariant expression of the elastic energy of n -creases meeting at a single vertex. Computations of the equilibrium states are then carried out explicitly in two special cases: the generalized waterbomb base and the Miura-Ori. For the waterbomb, we show a generic bistability for any number of creases. For the Miura folding, however, we uncover a phase transition from monostable to bistable states that explains the efficient deployability of this structure for a given range of geometrical and mechanical parameters. Moreover, the analysis shows that geometric frustration induces residual stresses in origami structures that should be taken into account in determining their mechanical response. This formalism can be extended to a general crease network, ordered or otherwise, and so opens new perspectives for the mechanics and the physics of origami-based metamaterials.
Computer-Based Integrated Learning Systems: Research and Theory.
Hativa, Nira, Ed.; Becker, Henry Jay, Ed.
1994-01-01
The eight chapters of this theme issue discuss recent research and theory concerning computer-based integrated learning systems. Following an introduction about their theoretical background and current use in schools, the effects of using computer-based integrated learning systems in the elementary school classroom are considered. (SLD)
Evaluating hydrological model performance using information theory-based metrics
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can serve as a complementary tool for hydrologic m...
Luis F López-Cortés
Significant controversy still exists about ritonavir-boosted protease inhibitor monotherapy (mtPI/rtv) as a simplification strategy that has, until now, been used to treat patients who have not experienced previous virological failure (VF) while on protease inhibitor (PI)-based regimens. We have evaluated the effectiveness of two mtPI/rtv regimens in an actual clinical practice setting, including patients who had experienced previous VF with PI-based regimens. This retrospective study analyzed 1060 HIV-infected patients with undetectable viremia who were switched to lopinavir/ritonavir or darunavir/ritonavir monotherapy. In cases in which the patient had previously experienced VF while on a PI-based regimen, the absence of major HIV protease resistance mutations to lopinavir or darunavir, respectively, was mandatory. The primary endpoint of this study was the percentage of participants with virological suppression after 96 weeks according to intention-to-treat analysis (non-complete/missing = failure). A total of 1060 patients were analyzed, including 205 with previous VF while on PI-based regimens, 90 of whom were on complex therapies due to extensive resistance. The rates of treatment effectiveness (intention-to-treat analysis) and virological efficacy (on-treatment analysis) at week 96 were 79.3% (CI95, 76.8-81.8) and 91.5% (CI95, 89.6-93.4), respectively. No relationships were found between VF and earlier VF while on PI-based regimens, the presence of major or minor protease resistance mutations, the previous time on viral suppression, CD4+ T-cell nadir, or HCV coinfection. Genotypic resistance tests were available for 49 of the 74 patients with VF, and only four patients presented new major protease resistance mutations. Switching to mtPI/rtv achieves sustained virological control in most patients, even in those with previous VF on PI-based regimens, as long as no major resistance mutations are present for the administered drug.
Toward theory-based diagnostic categories.
Gordon, M
1990-01-01
As a social institution, nursing has a responsibility to society for the development of knowledge in the areas described by nursing diagnoses. This article focuses on the need for developing the theoretical basis of each diagnostic category. Diagnoses are viewed as summarizations of underlying conceptual models for interpreting observations, and as such, they provide a perspective for understanding and thinking about a set of clinical observations. At present many diagnostic concepts do not meet this standard and suggest primitive, pretheoretical ideas with a minimal knowledge base. The importance of having valid and reliable diagnostic categories for use in making clinical judgements and as a focus for care planning is discussed. A cycle of development is outlined in three phases: diagnostic concept identification, concept analysis-model development, and construction/reconstruction of diagnostic categories. It is suggested that a useful category captures the conceptual understanding and state of knowledge development about a phenomenon in its (a) name, (b) definition, and (c) cluster of defining characteristics.
Modeling acquaintance networks based on balance theory
Vukašinović Vida
2014-09-01
An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, future interactions are more likely to happen between actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network, because the properties of the IB model more closely matched those of the e-mail URV network than the other models.
Robust optimization based upon statistical theory.
Sobotta, B; Söhn, M; Alber, M
2010-08-01
Organ movement is still the biggest challenge in cancer treatment despite advances in online imaging. Due to the resulting geometric uncertainties, the delivered dose cannot be predicted precisely at treatment planning time. Consequently, all associated dose metrics (e.g., EUD and maxDose) are random variables with a patient-specific probability distribution. The method that the authors propose makes these distributions the basis of the optimization and evaluation process. The authors start from a model of motion derived from patient-specific imaging. On a multitude of geometry instances sampled from this model, a dose metric is evaluated. The resulting pdf of this dose metric is termed outcome distribution. The approach optimizes the shape of the outcome distribution based on its mean and variance. This is in contrast to the conventional optimization of a nominal value (e.g., PTV EUD) computed on a single geometry instance. The mean and variance allow for an estimate of the expected treatment outcome along with the residual uncertainty. Besides being applicable to the target, the proposed method also seamlessly includes the organs at risk (OARs). The likelihood that a given value of a metric is reached in the treatment is predicted quantitatively. This information reveals potential hazards that may occur during the course of the treatment, thus helping the expert to find the right balance between the risk of insufficient normal tissue sparing and the risk of insufficient tumor control. By feeding this information to the optimizer, outcome distributions can be obtained where the probability of exceeding a given OAR maximum and that of falling short of a given target goal can be minimized simultaneously. The method is applicable to any source of residual motion uncertainty in treatment delivery. Any model that quantifies organ movement and deformation in terms of probability distributions can be used as basis for the algorithm. Thus, it can generate dose
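The outcome-distribution idea above can be sketched generically: draw geometry instances from a motion model, evaluate the dose metric on each sample, and summarize the resulting distribution by its mean and spread. A minimal Python sketch; the Gaussian target-shift model and the EUD-like score below are illustrative stand-ins, not the authors' implementation.

```python
import random
import statistics

def outcome_distribution(metric, sample_geometry, n_samples=1000, seed=0):
    """Evaluate a dose metric on geometry instances drawn from a motion
    model and summarize the resulting outcome distribution by its mean
    and standard deviation (all model details here are illustrative)."""
    rng = random.Random(seed)
    values = [metric(sample_geometry(rng)) for _ in range(n_samples)]
    return statistics.mean(values), statistics.pstdev(values)

# toy stand-ins: target shift ~ N(0, 3 mm); the 'metric' penalizes shifts
shift = lambda rng: rng.gauss(0.0, 3.0)
metric = lambda dx: 60.0 - 0.5 * abs(dx)   # hypothetical EUD-like score

mean, spread = outcome_distribution(metric, shift)
```

An optimizer in the spirit of the abstract would then trade off `mean` (expected outcome) against `spread` (residual uncertainty), rather than optimizing a single nominal geometry.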
Spectrum Allocation Based on Game Theory in Cognitive Radio Networks
Qiufen Ni
2013-03-01
As a kind of intelligent communication technology, the dynamic spectrum allocation characteristic of cognitive radio provides a feasible scheme for sharing spectrum resources among the primary user and secondary users, which addresses the current spectrum resource scarcity problem. In this paper, we comprehensively explore cognitive radio spectrum allocation models based on game theory, from both cooperative and non-cooperative games, and provide a detailed overview and analysis of the state of the art of spectrum allocation based on game theory. In order to provide flexible and efficient spectrum allocation in wireless networks, this paper also provides a general framework model based on game theory for cognitive radio spectrum allocation.
Circular Economy Development Mode Based on System Theory
Xiao Huamao; Wang Fengqi
2007-01-01
The paper tries to explore circular economy from the viewpoint of system theory. Circular economy is a kind of complicated economic activity, and it is a wholly new ecotype economy proposed by Western countries after they had summed up many experiences and lessons from the traditional economy. It is an entirety and has many layers. It keeps an open and dynamic balance. Because system theory is the theoretical foundation of circular economy, we should systematically analyze and study circular economy from the macroscopic view, correctly grasp its operational laws, improve its service functions, and realize human beings' sustainable development. The paper introduces the content and general characteristics of system theory and the idea of circular economy. Then it analyzes circular economy based on system theory. The paper concludes that the integration of circular economy and system theory can promote the functions of circular economy.
Rock mechanics modeling based on soft granulation theory
Owladeghaffari, H
2008-01-01
This paper describes the application of information granulation theory to the design of rock engineering flowcharts. First, an overall flowchart based on information granulation theory is highlighted. Information granulation theory, in crisp (non-fuzzy) or fuzzy format, can take into account engineering experience (especially incomplete or superfluous fuzzy information) and engineering judgment at each step of the design procedure, while suitable modeling instruments are employed. In this manner, and to extend soft modeling instruments, crisp and fuzzy granules are obtained from monitored data sets using three combinations of Self Organizing Map (SOM), Neuro-Fuzzy Inference System (NFIS), and Rough Set Theory (RST). The core of our algorithms is the balancing of crisp (rough or non-fuzzy) granules and sub-fuzzy granules within non-fuzzy information (initial granulation) over the open-close iterations. Using different criteria on balancing best granules (information pock...
A density gradient theory based method for surface tension calculations
Liang, Xiaodong; Michelsen, Michael Locht; Kontogeorgis, Georgios
2016-01-01
The density gradient theory has become a widely used framework for calculating surface tension, within which the same equation of state is used for the interface and bulk phases, because it is a theoretically sound, consistent and computationally affordable approach. Based on the observation that the optimal density path from the geometric mean density gradient theory passes through the saddle point of the tangent plane distance to the bulk phases, we propose to estimate surface tension with an approximate density path profile that goes through this saddle point. The linear density gradient theory, which assumes linearly distributed densities between the two bulk phases, has also been investigated. Numerical problems do not occur with these density path profiles. These two approximation methods, together with the full density gradient theory, have been used to calculate the surface tension of various...
Modification of evidence theory based on feature extraction
DU Feng; SHI Wen-kang; DENG Yong
2005-01-01
Although evidence theory has been widely used in information fusion due to its effectiveness in uncertainty reasoning, the classical DS evidence theory exhibits counter-intuitive behaviors when highly conflicting information exists. Many modification methods have been developed, which can be classified into two kinds of ideas: either modifying the combination rules or modifying the evidence sources. In order to make the modification more reasonable and more effective, this paper first gives a thorough analysis of some typical existing modification methods and then extracts the intrinsic features of the evidence sources by using evidence distance theory. Based on the extracted features, two modified schemes of evidence theory, following the corresponding modification ideas, are proposed. The results of numerical examples prove the good performance of these schemes when combining evidence sources with highly conflicting information.
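The counter-intuitive behavior of the classical Dempster-Shafer combination rule under high conflict, which motivates the modifications above, is easy to reproduce. A minimal Python sketch using Zadeh's well-known textbook example (the frame and mass values are the standard illustration, not taken from this paper):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) with the
    classical Dempster rule; returns the fused masses and the conflict K."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc            # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}, conflict

# Zadeh's classic high-conflict example
A, B, C = frozenset("a"), frozenset("b"), frozenset("c")
m1 = {A: 0.99, C: 0.01}
m2 = {B: 0.99, C: 0.01}
fused, K = dempster_combine(m1, m2)
# K = 0.9999; the fused belief concentrates entirely on {c}
```

Both sources assign 0.99 of their mass elsewhere, yet after normalization the fused belief puts all mass on {c}, the hypothesis both considered unlikely; this is exactly the behavior the modification methods target.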
Gauge theory of supergravity based only on a self-dual spin connection
Nieto, J.A.; Socorro, J.; Obregon, O. [Area of Superstrings, Escuela de Ciencias Fisico-Matematicas de la Universidad Michoacana de San Nicolas de Hidalgo, P.O. Box 749, 58000, Morelia, Michoacan (Mexico); Instituto de Fisica de la Universidad de Guanajuato, P.O. Box E-143, 37150, Leon, Gto. (Mexico); Depto. de Fisica, Universidad Autonoma Metropolitana-Iztapalapa, P.O. Box 55-534, 09340, D.F., Mexico (Mexico)]
1996-05-01
A gauge theory of supergravity is constructed based only on the supersymmetric self-dual spin connection associated to the supergroup OSp(1|4). We show that Jacobson's supergravity action arises naturally from our proposed action. It is formulated by taking the self-dual part of the MacDowell-Mansouri gauge theory of supergravity. In this sense, our quadratic action in the supersymmetric self-dual curvature tensor provides a relation between these two important previous extensions of supergravity. © 1996 The American Physical Society.
An information theory-based approach to modeling the information processing of NPP operators
Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute, Taejon (Korea, Republic of)
2002-08-01
This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e., the problems of single-channel approaches, we first develop an information processing model with multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, a kind of information theory.
3D Medical Image Segmentation Based on Rough Set Theory
CHEN Shi-hao; TIAN Yun; WANG Yi; HAO Chong-yang
2007-01-01
This paper presents a method which uses multiple types of expert knowledge together in 3D medical image segmentation based on rough set theory. The focus of this paper is how to approximate a ROI (region of interest) when there are multiple types of expert knowledge. Based on rough set theory, the image can be split into three regions: positive regions, negative regions, and boundary regions. With multiple knowledge sources, we refine the ROI as an intersection of all of the shapes expected from each single knowledge source. Finally, we show the results of implementing a rough 3D image segmentation and visualization system.
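The three-region split the abstract relies on follows directly from the lower and upper approximations of rough set theory: the positive region is the union of granules contained in the target, the boundary region is the upper approximation minus the lower, and the negative region is everything outside the upper approximation. A minimal Python sketch with toy granules (the example sets are illustrative, not image data):

```python
def rough_regions(blocks, target):
    """Positive, boundary, and negative regions of `target` with respect
    to a partition `blocks` (the equivalence classes / granules)."""
    lower, upper = set(), set()
    for block in blocks:
        if block & target:          # granule overlaps the target
            upper |= block
            if block <= target:     # granule lies entirely inside it
                lower |= block
    universe = set().union(*blocks)
    return lower, upper - lower, universe - upper

# toy example: elements 1..6 grouped into granules, target "ROI" {1, 2, 3}
blocks = [{1, 2}, {3, 4}, {5, 6}]
roi = {1, 2, 3}
pos, boundary, neg = rough_regions(blocks, roi)
# pos == {1, 2}, boundary == {3, 4}, neg == {5, 6}
```

In the segmentation setting, each granule would be a set of voxels indiscernible under the available attributes, and each type of expert knowledge yields its own three-way split before the intersections are taken.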
Snow avalanche friction relation based on extended kinetic theory
Rauter, Matthias; Fischer, Jan-Thomas; Fellin, Wolfgang; Kofler, Andreas
2016-11-01
Rheological models for granular materials play an important role in the numerical simulation of dry dense snow avalanches. This article describes the application of a physically based model from the field of kinetic theory to snow avalanche simulations. The fundamental structure of the so-called extended kinetic theory is outlined and the decisive model behavior for avalanches is identified. A simplified relation, covering the basic features of the extended kinetic theory, is developed and implemented into an operational avalanche simulation software. To test the obtained friction relation, simulation results are compared to velocity and runout observations of avalanches, recorded from different field tests. As reference we utilize a classic phenomenological friction relation, which is commonly applied for hazard estimation. The quantitative comparison is based on the combination of normalized residuals of different observation variables in order to take into account the quality of the simulations in various regards. It is demonstrated that the extended kinetic theory provides a physically based explanation for the structure of phenomenological friction relations. The friction relation derived with the help of the extended kinetic theory shows advantages to the classic phenomenological friction, in particular when different events and various observation variables are investigated.
Caveats: Numerical Requirements in Graph Theory Based Quantitation of Tissue Architecture
J. Sudbø
2000-01-01
Graph theory based methods represent one approach to an objective and reproducible structural analysis of tissue architecture. By these methods, neighborhood relations between a number of objects (e.g., cells) are explored, and inherent to these methods are therefore certain requirements as to the number of objects to be included in the analysis. However, the question of how many objects are required to achieve reproducible values in repeated computations of proposed structural features has previously not been addressed specifically.
Measurement-based load modeling: Theory and application
Anonymous
2007-01-01
Load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China for more than twenty years, this paper systematically introduces the mathematical theory and applications of load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on the trajectory sensitivity theory indicate the importance of the load model parameters for the identification. Case studies show the accuracy of the presented measurement-based load model. The load model thus built has been validated by field measurements all over China. Future working directions on measurement-based load modeling are also discussed in the paper.
Measurement Theory Based on the Truth Values Violates Local Realism
Nagata, Koji
2017-02-01
We investigate the violation factor of the Bell-Mermin inequality. Conventionally, one assumes that the results of measurement are ±1. In this case, the maximum violation factor is 2^((n-1)/2). The quantum predictions of the n-partite Greenberger-Horne-Zeilinger (GHZ) state violate the Bell-Mermin inequality by an amount that grows exponentially with n. Recently, a new measurement theory based on truth values was proposed (Nagata and Nakamura, Int. J. Theor. Phys. 55:3616, 2016), in which the values of measurement outcomes are either +1 or 0. Here we use the new measurement theory. We consider the multipartite GHZ state. It turns out that the Bell-Mermin inequality is violated by the amount of 2^((n-1)/2). The measurement theory based on the truth values thus provides the maximum violation of the Bell-Mermin inequality.
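For the conventional ±1-outcome setting that the abstract starts from, the exponential violation can be checked numerically. A small NumPy sketch for n = 3: the Mermin operator expectation on the GHZ state is 4, against a local-realistic bound of 2, giving the violation factor 2^((3-1)/2) = 2 (this illustrates the standard case only, not the truth-value theory the paper develops).

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron(*ops):
    """Tensor product of a sequence of operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Mermin operator for n = 3: M = XXX - XYY - YXY - YYX
M = kron(X, X, X) - kron(X, Y, Y) - kron(Y, X, Y) - kron(Y, Y, X)

# GHZ state (|000> + |111>) / sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

quantum = float(np.real(ghz.conj() @ M @ ghz))   # expectation value: 4
classical_bound = 2.0                            # local-realistic bound, n = 3
violation = quantum / classical_bound            # 2 == 2**((3 - 1) / 2)
```

Extending the operator to larger n reproduces the 2^((n-1)/2) growth the abstract quotes.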
Unifying ecology and macroevolution with individual-based theory.
Rosindell, James; Harmon, Luke J; Etienne, Rampal S
2015-05-01
A contemporary goal in both ecology and evolutionary biology is to develop theory that transcends the boundary between the two disciplines, to understand phenomena that cannot be explained by either field in isolation. This is challenging because macroevolution typically uses lineage-based models, whereas ecology often focuses on individual organisms. Here, we develop a new parsimonious individual-based theory by adding mild selection to the neutral theory of biodiversity. We show that this model generates realistic phylogenies showing a slowdown in diversification and also improves on the ecological predictions of neutral theory by explaining the occurrence of very common species. Moreover, we find the distribution of individual fitness changes over time, with average fitness increasing at a pace that depends positively on community size. Consequently, large communities tend to produce fitter species than smaller communities. These findings have broad implications beyond biodiversity theory, potentially impacting, for example, invasion biology and paleontology. © 2015 The Authors. Ecology Letters published by John Wiley & Sons Ltd and CNRS.
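The core idea of adding mild selection to neutral, individual-based dynamics can be sketched with a Moran-type zero-sum community: each step one individual dies and is replaced by the offspring of a fitness-weighted parent, and speciation occasionally perturbs fitness slightly. This is a minimal illustration in the spirit of the abstract, not the authors' model; all parameter values are made up.

```python
import random

def simulate(J=200, nu=0.01, s=0.02, steps=5000, seed=1):
    """Moran-type community of J individuals. Each step one random
    individual dies and is replaced by the offspring of a parent chosen
    proportionally to species fitness; with probability nu the offspring
    founds a new species whose fitness differs from the parent's by a
    mild random amount. Returns the mean fitness of extant species."""
    rng = random.Random(seed)
    species = [0] * J            # species label of each individual
    fitness = {0: 1.0}           # per-species fitness
    next_label = 1
    for _ in range(steps):
        dead = rng.randrange(J)
        weights = [fitness[lbl] for lbl in species]
        parent = rng.choices(range(J), weights=weights)[0]
        if rng.random() < nu:    # speciation with a mild fitness change
            new_fit = fitness[species[parent]] + s * rng.uniform(-1.0, 1.0)
            fitness[next_label] = max(0.01, new_fit)  # keep weights positive
            species[dead] = next_label
            next_label += 1
        else:
            species[dead] = species[parent]
    extant = set(species)
    return sum(fitness[lbl] for lbl in extant) / len(extant)

mean_extant_fitness = simulate()
```

Because fitter lineages win replacement events more often, runs with larger J accumulate fitter extant species over time, which is the qualitative prediction highlighted in the abstract.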
Inductive Data Types Based on Fibrations Theory in Programming
Decheng Miao
2016-03-01
Traditional methods, including algebra and category theory, have some deficiencies in analyzing semantic properties and describing inductive rules of inductive data types; we present a method based on Fibrations theory aimed at these questions. We systematically analyze some basic logical structures of inductive data types with respect to a fibration, such as the re-indexing functor, the truth functor and the comprehension functor, and build semantic models of the non-indexed fibration, the single-sorted indexed fibration and the many-sorted indexed fibration, respectively. On this basis, we thoroughly discuss semantic properties of fibred, single-sorted indexed and many-sorted indexed inductive data types, and abstractly describe their inductive rules with universality. Furthermore, we briefly introduce applications of the three inductive data types for analyzing semantic properties and describing inductive rules based on Fibrations theory via some examples. Compared with traditional methods, our work has the following three advantages. Firstly, the brief descriptions and flexible expansibility of Fibrations theory can analyze semantic properties of inductive data types accurately, whose semantics are computed automatically. Secondly, the superior abstractness of Fibrations theory does not rely on particular computing environments to depict inductive rules of inductive data types with universality. Thirdly, its rigorousness and consistency provide a sound basis for testing and maintenance in software development.
A theory-based approach to nursing shared governance.
Joseph, M Lindell; Bogue, Richard J
2016-01-01
The discipline of nursing uses a general definition of shared governance. The discipline's lack of a specified theory with precepts and propositions contributes to persistent barriers in progress toward building evidence-based knowledge through systematic study. The purposes of this article were to describe the development and elements of a program theory approach for nursing shared governance implementation and to recommend further testing. Five studies using multiple methods are described using a structured framework. The studies led to the use of Lipsey's method of theory development for program implementation to develop a theory for shared governance for nursing. Nine competencies were verified to define nursing practice council effectiveness. Other findings reveal that nurse empowerment results from alignment between the competencies of self-directed work teams and the competencies of organizational leaders. Implementation of GEMS theory-based nursing shared governance can advance goals at the individual, unit, department, and organization level. Advancing professional nursing practice requires that nursing concepts are systematically studied and then formalized for implementation. This article describes the development of a theoretical foundation for the systematic study and implementation of nursing shared governance. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
Field-Based Concerns about Fourth-Generation Evaluation Theory.
Lai, Morris K.
Some aspects of fourth generation evaluation procedures that have been advocated by E. G. Guba and Y. S. Lincoln were examined empirically, with emphasis on areas where there have been discrepancies between theory and field-based experience. In fourth generation evaluation, the product of an evaluation is not a set of conclusions, recommendations,…
Project-Based Language Learning: An Activity Theory Analysis
Gibbes, Marina; Carson, Lorna
2014-01-01
This paper reports on an investigation of project-based language learning (PBLL) in a university language programme. Learner reflections of project work were analysed through Activity Theory, where tool-mediated activity is understood as the central unit of analysis for human interaction. Data were categorised according to the components of human…
Reasserting Theory in Professionally Based Initial Teacher Education
Hodson, Elaine; Smith, Kim; Brown, Tony
2012-01-01
Conceptions of theory within initial teacher education in England are adjusting to new conditions where most learning how to teach is school-based. Student teachers on a programme situated primarily in an employing school were monitored within a practitioner enquiry by their university programme tutors according to how they progressively…
ANALYSIS OF CIRCUIT TOLERANCE BASED ON RANDOM SET THEORY
Anonymous
2008-01-01
Monte Carlo analysis has been an accepted method for circuit tolerance analysis, but its heavy computational complexity has always prevented its applications. Based on random set theory, this paper presents a simple and flexible tolerance analysis method to estimate circuit yield. It is an alternative to Monte Carlo analysis, but reduces the number of calculations dramatically.
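For context on the baseline the abstract compares against, a Monte Carlo yield estimate is only a few lines of Python. The circuit here (a resistive divider with 5%-tolerance resistors, a uniform spread, and a ±4% pass window) is an illustrative stand-in, not an example from the paper:

```python
import random

def monte_carlo_yield(n_trials=100_000, tol=0.05, seed=42):
    """Estimate the yield of a resistive divider Vout = Vin * R2 / (R1 + R2)
    with tolerance-`tol` resistors (uniform spread, an illustrative
    assumption). A sample passes if Vout is within +/-4% of nominal."""
    rng = random.Random(seed)
    vin, r1_nom, r2_nom = 10.0, 1000.0, 1000.0
    v_nom = vin * r2_nom / (r1_nom + r2_nom)   # nominal output: 5.0 V
    passed = 0
    for _ in range(n_trials):
        r1 = r1_nom * (1 + rng.uniform(-tol, tol))
        r2 = r2_nom * (1 + rng.uniform(-tol, tol))
        vout = vin * r2 / (r1 + r2)
        if abs(vout - v_nom) / v_nom <= 0.04:
            passed += 1
    return passed / n_trials

estimated_yield = monte_carlo_yield()
```

The cost is one full circuit evaluation per sample, which is the computational burden the random-set method aims to avoid.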
A theory of game trees, based on solution trees
W.H.L.M. Pijls (Wim); A. de Bruin (Arie); A. Plaat (Aske)
1996-01-01
In this paper a complete theory of game tree algorithms is presented, entirely based upon the notion of a solution tree. Two types of solution trees are distinguished: max and min solution trees, respectively. We show that most game tree algorithms construct a superposition of a max and a
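Among the algorithms such a theory covers is alpha-beta search; a minimal Python version makes concrete the pruning whose visited nodes form the superposition of max and min solution trees described above (the example tree is illustrative):

```python
def alphabeta(node, alpha=float("-inf"), beta=float("inf"), maximizing=True):
    """Plain alpha-beta on a game tree given as nested lists, where the
    leaves are numeric payoffs for the maximizing player."""
    if isinstance(node, (int, float)):
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break            # beta cutoff: subtree cannot affect the root
        return value
    value = float("inf")
    for child in node:
        value = min(value, alphabeta(child, alpha, beta, True))
        beta = min(beta, value)
        if alpha >= beta:
            break                # alpha cutoff
    return value

# small example: root is max over the min of each sublist
tree = [[3, 5], [6, 9], [1, 2]]
best = alphabeta(tree)
# best == max(min(3, 5), min(6, 9), min(1, 2)) == 6
```

In the third subtree the search is cut off after the leaf 1, since that subtree can no longer beat the value 6 already established, which is the kind of pruning the solution-tree framework characterizes.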
Design of structurally colored surfaces based on scalar diffraction theory
Johansen, Villads Egede; Andkjær, Jacob Anders; Sigmund, Ole
2014-01-01
reflective surface, paint-free text and coloration, UV-resistant coloring, etc. In this initial study, the main focus is on finding a systematic way to obtain these results. For now the simulation and optimization is based on a simple scalar diffraction theory model. From the results, several design issues...
New MPPT algorithm based on hybrid dynamical theory
Elmetennani, Shahrazed
2014-11-01
This paper presents a new maximum power point tracking algorithm based on the hybrid dynamical theory. A multicell converter has been considered as an adaptation stage for the photovoltaic chain. The proposed algorithm is a hybrid automaton switching between eight different operating modes, which has been validated by simulation tests under different working conditions. © 2014 IEEE.
Mapping site-based construction workers’ motivation: Expectancy theory approach
Parviz Ghoddousi
2014-03-01
The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy) is able to map the process of construction workers' motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones. This suggests that construction managers should place further focus on intrinsic motivators to motivate workers.
Regularization of identity based solution in string field theory
Zeze, Syoji
2010-10-01
We demonstrate that an Erler-Schnabl type solution in cubic string field theory can be naturally interpreted as a gauge invariant regularization of an identity based solution. We consider a solution which interpolates between an identity based solution and the ordinary Erler-Schnabl one. Two gauge invariant quantities, the classical action and the closed string tadpole, are evaluated for finite values of the gauge parameter. It is explicitly checked that both of them are independent of the gauge parameter.
Constructing Breaker Sequence based System Restoration Strategy with Graph Theory
Peng, C.; Qin, Z.; Wang, C.; Hou, Y
2014-01-01
This paper proposes a mapping approach to serve as an interface between the branch-bus model and the breaker-based model. In order to find the specific optimal operation of breakers in substations according to the restoration strategies, the paper first establishes a breaker-based model of the substation using graph theory, and then the optimal operation sequence for breakers is determined with Dijkstra's algorithm. Finally, a case study for a realistic power s...
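The abstract names Dijkstra's algorithm but gives no code. The sketch below is a generic illustration on a hypothetical breaker graph (the node names B1–B4 and the edge weights are invented, with weights standing in for switching cost), not the paper's actual substation model.

```python
import heapq

def dijkstra(graph, source):
    # Standard Dijkstra: cheapest cumulative switching cost from the source
    dist = {source: 0}
    prev = {}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def sequence(prev, target):
    # Walk predecessors back to the source to recover the operation order
    out = [target]
    while out[-1] in prev:
        out.append(prev[out[-1]])
    return out[::-1]

# Hypothetical breaker graph: edges are breaker operations, weights their cost
graph = {"B1": [("B2", 1), ("B3", 4)],
         "B2": [("B3", 1), ("B4", 5)],
         "B3": [("B4", 1)],
         "B4": []}
dist, prev = dijkstra(graph, "B1")
order = sequence(prev, "B4")   # cheapest breaker sequence from B1 to B4
```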
Evidence for an expectancy-based theory of avoidance behaviour.
Declercq, Mieke; De Houwer, Jan; Baeyens, Frank
2008-01-01
In most studies on avoidance learning, participants receive an aversive unconditioned stimulus after a warning signal is presented, unless the participant performs a particular response. Lovibond (2006) recently proposed a cognitive theory of avoidance learning, according to which avoidance behaviour is a function of both Pavlovian and instrumental conditioning. In line with this theory, we found that avoidance behaviour was based on an integration of acquired knowledge about, on the one hand, the relation between stimuli and, on the other hand, the relation between behaviour and stimuli.
A Construction Way of MAS Based on Organization Theory
GAO Bo; FEI Qi; CHEN Xue-guang
2002-01-01
Emphasizing that the integration of autonomy and coordination is the basis for constructing multi-agent systems (MAS), we analyze the organizational characteristics inherent in MAS and point out that it is natural and essential to construct MAS based on organization theory. We consider that the emphasis of the theory is the process of system analysis. We then present an analysis frame to expound this process, which includes the processes of organization definition, role definition, organizational structure definition and interaction protocol definition. Lastly, we discuss some issues associated with the processes of system design and implementation.
Thermoeconomic Diagnosis Theory Based on Thermo-Characterization
Victor Hugo Rangel-Hernandez
2010-11-01
In this paper, the axioms and procedures that define a thermoeconomic diagnosis theory based on thermo-characterization are presented. The theory can be applied to advanced energy systems. The thermo-characterization sets, for each component in a system, a reference operating condition in a three-dimensional map (w, s, FR) and pre-evaluates all the effects that occur when an environmental variation or an internal or control malfunction arises (dw/dmalf). It should allow real-time monitoring of the actual operating process by determining the matrix of malfunctions and its global thermoeconomic assessment. Results are validated against an analytical simulator, obtaining high accuracy.
Automata theory based on complete residuated lattice-valued logic
邱道文
2001-01-01
This paper establishes a fundamental framework of automata theory based on complete residuated lattice-valued logic. It first deals with how to extend the transition relation of states and, in particular, presents a characterization of residuated lattices by fuzzy automata (called valued automata). After that, fuzzy subautomata (called valued subautomata) and successor and source operators are proposed; their basic properties, as well as the equivalence relations among them, are discussed, from which it follows that the two fuzzy operators are exactly fuzzy closure operators. Finally, an L-bifuzzy topological characterization of valued automata is presented, so a more generalized fuzzy automata theory is built.
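To make "extending the transition relation" concrete, here is a minimal sketch of how a lattice-valued (fuzzy) automaton assigns a degree to a word: sup (max) over paths and min along each path, i.e. the Gödel t-norm special case of a residuated lattice on [0, 1]. The states, symbols and membership degrees below are invented, and the paper's full residuated-lattice setting is more general than this sketch.

```python
def word_degrees(delta, start, word):
    """Degree of reaching each state after reading `word`.
    delta: {(state, symbol, state): degree}; start: {state: degree}.
    Path degree = min of transition degrees; states aggregate by max (sup)."""
    current = dict(start)
    for symbol in word:
        nxt = {}
        for q, d in current.items():
            for (q1, a, q2), w in delta.items():
                if q1 == q and a == symbol:
                    nxt[q2] = max(nxt.get(q2, 0.0), min(d, w))
        current = nxt
    return current

# Hypothetical valued automaton over {a, b}
delta = {("q0", "a", "q1"): 0.9,
         ("q1", "b", "q2"): 0.5,
         ("q0", "a", "q2"): 0.3}
final = word_degrees(delta, {"q0": 1.0}, "ab")  # degree of reaching q2 on "ab"
```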
Theory-based Bayesian models of inductive learning and reasoning.
Tenenbaum, Joshua B; Griffiths, Thomas L; Kemp, Charles
2006-07-01
Inductive inference allows humans to make powerful generalizations from sparse data when learning about word meanings, unobserved properties, causal relationships, and many other aspects of the world. Traditional accounts of induction emphasize either the power of statistical learning, or the importance of strong constraints from structured domain knowledge, intuitive theories or schemas. We argue that both components are necessary to explain the nature, use and acquisition of human knowledge, and we introduce a theory-based Bayesian framework for modeling inductive learning and reasoning as statistical inferences over structured knowledge representations.
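A standard toy illustration of such theory-based Bayesian induction is the "size principle": among hypotheses consistent with the data, smaller (more specific) ones gain likelihood with each observation. The hypothesis sets and uniform prior below are invented for illustration; the framework described in the article is far richer.

```python
def posterior(hypotheses, data):
    """Uniform prior; size-principle likelihood P(data|h) = (1/|h|)^n
    if every observation lies in h, else 0."""
    weights = {}
    for name, h in hypotheses.items():
        if all(x in h for x in data):
            weights[name] = (1.0 / len(h)) ** len(data)
        else:
            weights[name] = 0.0
    z = sum(weights.values())
    return {name: w / z for name, w in weights.items()}

# Hypothetical number-concept hypotheses
hyps = {"even": {2, 4, 6, 8, 10}, "powers_of_two": {2, 4, 8}}
post = posterior(hyps, [2, 4, 8])
```

After seeing 2, 4 and 8, the smaller consistent hypothesis ("powers of two") dominates, even though "even numbers" also explains the data.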
A curvature theory for discrete surfaces based on mesh parallelity
Bobenko, Alexander Ivanovich
2009-12-18
We consider a general theory of curvatures of discrete surfaces equipped with edgewise parallel Gauss images, and where mean and Gaussian curvatures of faces are derived from the faces' areas and mixed areas. Remarkably these notions are capable of unifying notable previously defined classes of surfaces, such as discrete isothermic minimal surfaces and surfaces of constant mean curvature. We discuss various types of natural Gauss images, the existence of principal curvatures, constant curvature surfaces, Christoffel duality, Koenigs nets, contact element nets, s-isothermic nets, and interesting special cases such as discrete Delaunay surfaces derived from elliptic billiards. © 2009 Springer-Verlag.
Martinez-Trufero, J; Isla, D; Adansa, J C; Irigoyen, A; Hitt, R; Gil-Arnaiz, I; Lambea, J; Lecumberri, M J; Cruz, J J
2010-01-01
Background: Platinum-based therapy (PBT) is the standard therapy for recurrent and/or metastatic head and neck cancer (HNC), but the incidence of recurrence remains high. This study evaluates the efficacy and tolerability of capecitabine as palliative monotherapy for recurrent HNC previously treated with PBT. Methods: Patients aged 18–75 years, with Eastern Cooperative Oncology Group performance status 0–2, squamous HNC with locoregional and/or metastatic recurrence previously treated with PBT, and adequate organ function were included. Capecitabine (1250 mg m−2 BID) was administered on days 1–14 every 21 days for at least two cycles. Results: A total of 40 male patients with a median age of 58 years were analysed. Patients received a median of four cycles of capecitabine (range: 1–9), and the median relative dose intensity was 91%. Seven patients were not evaluable for response. The overall response rate was 24.2%. Median time to progression and overall survival were 4.8 and 7.3 months, respectively. Grade 3/4 haematological adverse events (AEs) were reported in six patients. The most common grade 3/4 non-haematological AEs were asthenia (12.5%), palmar-plantar erythrodysesthesia (10%), mucositis (10%), dysphagia (10%) and diarrhoea (7.5%). Conclusions: Capecitabine seems to be an active, feasible and well-tolerated mode of palliative treatment for advanced HNC patients who have previously received PBT schedules. PMID:20485287
Biophysics of risk aversion based on neurotransmitter receptor theory
Takahashi, Taiki
2011-01-01
Decision under risk and uncertainty has been attracting attention in neuroeconomics and neuroendocrinology of decision-making. This paper demonstrated that the neurotransmitter receptor theory-based value (utility) function can account for human and animal risk-taking behavior. The theory predicts that (i) when dopaminergic neuronal response is efficiently coupled to the formation of ligand-receptor complex, subjects are risk-aversive (irrespective of their satisfaction level) and (ii) when the coupling is inefficient, subjects are risk-seeking at low satisfaction levels, consistent with risk-sensitive foraging theory in ecology. It is further suggested that some anomalies in decision under risk are due to inefficiency of the coupling between dopamine receptor activation and neuronal response. Future directions in the application of the model to studies in neuroeconomics of addiction and neuroendocrine modulation of risk-taking behavior are discussed.
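The risk-aversion prediction follows from the concavity of a receptor-occupancy value function: by Jensen's inequality, a concave valuation of outcomes makes a sure amount preferable to a gamble with the same mean. The sketch below uses a generic Michaelis-Menten-type occupancy curve with an invented dissociation constant; it illustrates the qualitative mechanism, not the paper's fitted model.

```python
def occupancy_value(x, k=1.0):
    """Michaelis-Menten-type receptor occupancy x/(x+k): concave in x,
    so an agent valuing outcomes this way is risk averse (Jensen)."""
    return x / (x + k)

# 50/50 gamble between 0 and 10 versus its sure expected value, 5
expected_utility = 0.5 * occupancy_value(0) + 0.5 * occupancy_value(10)
utility_of_expectation = occupancy_value(5)
# utility_of_expectation exceeds expected_utility: the sure thing wins
```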
[Brazilian scientific production based on Orem's nursing theory: integrative review].
Raimondo, Maria Lúcia; Fegadoli, Débora; Méier, Marineli Joaquim; Wall, Marilene Loewen; Labronici, Liliana Maria; Raimondo-Ferraz, Maria Isabel
2012-01-01
An integrative review, conducted in the LILACS, SciELO and BDENF databases from January 2005 to May 2009, aimed to summarize the Brazilian scientific production based on Orem's Nursing Theory. We obtained 23 articles, analyzed by simple descriptive statistics. It was found that 100% of the studies focused on adults. Of this total, 65.22% addressed chronic diseases. In 39.15% of the studies the theory was used in full, and in 34.80% one of its constructs was used. 91.30% of the publications aimed at the construction and deployment of a structured and theoretically grounded practice of care. It was concluded that the theory has been used as a theoretical and philosophical basis to justify the practice of nursing in a variety of situations, emphasizing the role of the nurse in care.
Knowledge Access Based on the Rough Set Theory
HAN Yan-ling; YANG Bing-ru; CAO Shou-qi
2005-01-01
During fault diagnosis of large-scale complicated equipment, the existence of redundant and fuzzy information makes knowledge access difficult. Aiming at this characteristic, this paper applies Rough Set (RS) theory to the field of fault diagnosis. By means of RS theory, which excels at dealing with fuzzy and uncertain information, knowledge access about fault diagnosis was realized. The founding ideas of RS theory are expounded in detail, an amended RS algorithm is proposed, and the process model of knowledge access based on the amended RS algorithm is researched. Finally, we verified the correctness and practicability of this method during the procedure of knowledge access.
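The core rough-set operation behind such knowledge access is approximating a target concept by the equivalence classes of an indiscernibility relation. A minimal sketch, with an invented universe and equivalence classes, is:

```python
def approximations(blocks, target):
    """Lower/upper approximation of `target` by equivalence classes `blocks`.
    Lower: union of blocks fully inside target (certainly in the concept).
    Upper: union of blocks overlapping target (possibly in the concept)."""
    lower, upper = set(), set()
    for b in blocks:
        if b <= target:
            lower |= b
        if b & target:
            upper |= b
    return lower, upper

# Hypothetical indiscernibility classes over objects 1..5
blocks = [{1, 2}, {3}, {4, 5}]
target = {1, 2, 3, 4}          # objects known to exhibit the fault
lower, upper = approximations(blocks, target)
boundary = upper - lower       # the "fuzzy" region rough sets isolate
```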
Study of Idioms Based on the Conceptual Metaphor Theory
DONG Yan-yuan
2016-01-01
On the production and understanding of idioms, cognitive linguists put forward opinions totally different from traditional linguistic theory. One enduring belief about the arbitrariness of English idioms is that they are non-componential, because their idiomatic meanings are not deducible from the meanings of their individual parts. According to conceptual metaphor theory, proposed by the cognitive linguists Lakoff and Johnson, metaphor is a mapping from a more familiar, more easily understood source domain to a less familiar, more difficult target domain. The paper studies the idiomaticity and the motivation of idioms based on conceptual metaphor theory from the perspective of cognitive linguistics.
Kinematics Analysis Based on Screw Theory of a Humanoid Robot
MAN Cui-hua; FAN Xun; LI Cheng-rong; ZHAO Zhong-hui
2007-01-01
A humanoid robot is a complex dynamic system owing to its idiosyncrasy. This paper aims to provide a mathematical and theoretical foundation for the design of the configuration and the kinematics analysis of a novel humanoid robot, which has a simplified configuration designed for entertainment purposes. The design methods, principle and mechanism are discussed. According to the design goals of this research, there are ten degrees of freedom in the two bionic arms. Modularization, concurrent design and extension theory methods were adopted in the configuration study, and screw theory was introduced into the analysis of humanoid robot kinematics. Comparisons with other methods show that: 1) only two coordinate frames need to be established in the kinematics analysis of a humanoid robot based on screw theory; 2) the spatial manipulator Jacobian obtained by using the twist and the product-of-exponentials formula is succinct and legible; 3) adopting screw theory to solve the kinematics of the humanoid robot's arms avoids singularities; 4) using screw theory solves the problem of insufficient specification.
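The building block of the product-of-exponentials formula mentioned above is the matrix exponential of a twist; for a pure rotation this reduces to Rodrigues' formula. A self-contained sketch (axis and angle invented, rotation part only, not the robot's actual joint data):

```python
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def exp_so3(axis, theta):
    """Rodrigues' formula exp([w]θ) for a unit axis w: the rotation part
    of a revolute-joint twist in the product-of-exponentials formula."""
    wx, wy, wz = axis
    K = [[0, -wz, wy], [wz, 0, -wx], [-wy, wx, 0]]   # skew matrix [w]
    K2 = matmul(K, K)
    s, c = math.sin(theta), 1.0 - math.cos(theta)
    I = [[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]]
    return [[I[i][j] + s * K[i][j] + c * K2[i][j] for j in range(3)] for i in range(3)]

# Rotate the point (1,0,0) by 90 degrees about the z-axis
R = exp_so3((0, 0, 1), math.pi / 2)
p = matvec(R, [1, 0, 0])
```

Chaining several such exponentials (one per joint) gives the forward kinematics map the abstract refers to.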
Correlation theory-based signal processing method for CMF signals
Shen, Yan-lin; Tu, Ya-qing
2016-06-01
The signal processing precision of Coriolis mass flowmeter (CMF) signals directly affects the measurement accuracy of Coriolis mass flowmeters. To improve the measurement accuracy of CMFs, a correlation theory-based signal processing method for CMF signals is proposed, which comprises a correlation theory-based frequency estimation method and a phase difference estimation method. Theoretical analysis shows that the proposed method eliminates the effect of non-integral-period sampling on frequency and phase difference estimation. The results of simulations and field experiments demonstrate that the proposed method improves the anti-interference performance of frequency and phase difference estimation; it outperforms the adaptive notch filter, discrete Fourier transform and autocorrelation methods for frequency estimation, and the data extension-based correlation, Hilbert transform, quadrature delay estimator and discrete Fourier transform methods for phase difference estimation, which contributes to improving the measurement accuracy of Coriolis mass flowmeters.
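As background, the textbook correlation identity that such phase-difference estimators exploit is that, over an integer number of periods, R12(0)/sqrt(R11(0)·R22(0)) = cos(Δφ) for two sinusoids of the same frequency. The sketch below is this basic identity only (sample rate, frequency and phase values invented), not the paper's improved non-integral-period method; note acos leaves the sign of Δφ ambiguous.

```python
import math

def phase_difference(x1, x2):
    """Correlation-based |phase difference| between two same-frequency
    sinusoids sampled over an integer number of periods."""
    n = len(x1)
    r12 = sum(a * b for a, b in zip(x1, x2)) / n   # cross-correlation at lag 0
    r11 = sum(a * a for a in x1) / n               # autocorrelations at lag 0
    r22 = sum(b * b for b in x2) / n
    ratio = r12 / math.sqrt(r11 * r22)
    return math.acos(max(-1.0, min(1.0, ratio)))

fs, f, phi = 1000.0, 50.0, 0.6        # hypothetical: 20 samples/period, true shift 0.6 rad
n = int(fs / f) * 10                  # exactly 10 periods
x1 = [math.sin(2 * math.pi * f * k / fs) for k in range(n)]
x2 = [0.8 * math.sin(2 * math.pi * f * k / fs + phi) for k in range(n)]
est = phase_difference(x1, x2)        # close to 0.6; amplitudes cancel out
```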
Automatic web services classification based on rough set theory
陈立; 张英; 宋自林; 苗壮
2013-01-01
With the development of web services technology, the number of services on the internet grows day by day. In order to achieve automatic and accurate services classification, which can benefit service-related tasks, a rough set theory based method for services classification was proposed. First, the service descriptions were preprocessed and represented as vectors. Inspired by the discernibility-matrix-based attribute reduction in rough set theory, and taking into account the characteristics of the decision table of services classification, a method based on continuous discernibility matrices was proposed for dimensionality reduction. Finally, services classification was processed automatically. In the experiment, the proposed method achieves good classification results in all five test categories, showing that it is accurate and could be used in practical web services classification.
Computer-based Training in Medicine and Learning Theories.
Haag, Martin; Bauch, Matthias; Garde, Sebastian; Heid, Jörn; Weires, Thorsten; Leven, Franz-Josef
2005-01-01
Computer-based training (CBT) systems can efficiently support modern teaching and learning environments. In this paper, we demonstrate on the basis of the case-based CBT system CAMPUS that current learning theories and design principles (Bloom's Taxonomy and practice fields) are (i) relevant to CBT and (ii) feasible to implement using computer-based training and adequate learning environments. Not all design principles can be fulfilled by the system alone; the integration of the system into adequate teaching and learning environments is therefore essential. Adequately integrated, CBT programs become valuable means to build or support practice fields for learners that build domain knowledge and problem-solving skills. Learning theories and their design principles can support the design of these systems as well as the assessment of their value.
Analysis and synthesis of phase shifting algorithms based on linear systems theory
Servin, M.; Estrada, J. C.
2012-08-01
We review and update a recently published formalism for the theory of linear Phase Shifting Algorithms (PSAs) based on linear filtering (systems) theory, mainly using the Frequency Transfer Function (FTF). The FTF has been for decades the standard tool in Electrical Engineering to analyze and synthesize their linear systems. Given the well defined FTF approach (matured over the last century), it clarifies, in our view, many not fully understood properties of PSAs. We present easy formulae for the spectra of the PSAs (the FTF magnitude), their Signal to Noise (S/N) power-ratio gain, their detuning robustness, and their harmonic rejection in terms of the FTF. This paper has more practical appeal than previous publications by the same authors, hoping to enrich the understanding of this PSA's theory as applied to the analysis and synthesis of temporal interferometry algorithms in Optical Metrology.
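The FTF view treats a PSA as an FIR quadrature filter, H(ω) = Σ_n c_n e^(-inω), whose magnitude must vanish at DC and at the conjugate signal frequency. The sketch below checks this for a generic 4-step algorithm with π/2 phase steps (coefficients c_n = e^(inπ/2)); this is a standard textbook example chosen for illustration, not a specific algorithm from the paper.

```python
import cmath
import math

def ftf(coeffs, omega):
    """Frequency Transfer Function of a PSA viewed as an FIR filter:
    H(omega) = sum_n c_n * exp(-i * n * omega)."""
    return sum(c * cmath.exp(-1j * n * omega) for n, c in enumerate(coeffs))

# 4-step PSA with unit phase steps of pi/2: c_n = exp(i * n * pi/2)
coeffs = [cmath.exp(1j * n * math.pi / 2) for n in range(4)]

dc = abs(ftf(coeffs, 0.0))              # rejection of the background (DC)
conj = abs(ftf(coeffs, -math.pi / 2))   # rejection of the conjugate signal
peak = abs(ftf(coeffs, math.pi / 2))    # gain at the signal frequency
```

Both rejection conditions hold exactly (zeros of |H|), while the signal frequency passes with gain 4, i.e. one unit per sample.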
Chan, Emily Y Y; Kim, Jean H; Lin, Cherry; Cheung, Eliza Y L; Lee, Polly P Y
2014-06-01
Disaster preparedness is an important preventive strategy for protecting health and mitigating adverse health effects of unforeseen disasters. A multi-site based ethnic minority project (2009-2015) is set up to examine health and disaster preparedness related issues in remote, rural, disaster prone communities in China. The primary objective of this reported study is to examine if previous disaster experience significantly increases household disaster preparedness levels in remote villages in China. A cross-sectional, household survey was conducted in January 2011 in Gansu Province, in a predominately Hui minority-based village. Factors related to disaster preparedness were explored using quantitative methods. Two focus groups were also conducted to provide additional contextual explanations to the quantitative findings of this study. The village household response rate was 62.4 % (n = 133). Although previous disaster exposure was significantly associated with perception of living in a high disaster risk area (OR = 6.16), only 10.7 % households possessed a disaster emergency kit. Of note, for households with members who had non-communicable diseases, 9.6 % had prepared extra medications to sustain clinical management of their chronic conditions. This is the first study that examined disaster preparedness in an ethnic minority population in remote communities in rural China. Our results indicate the need of disaster mitigation education to promote preparedness in remote, resource-poor communities.
Device modeling of superconductor transition edge sensors based on the two-fluid theory
Wang, Tian-Shun; Zhu, Qing-Feng; Wang, Jun-Xian; Li, Tie-Fu; Liu, Jian-She; Chen, Wei; Zhou, Xingxiang
2012-01-01
In order to support the design and study of sophisticated large scale transition edge sensor (TES) circuits, we use basic SPICE elements to develop device models for TESs based on the superfluid-normal fluid theory. In contrast to previous studies, our device model is not limited to small signal simulation, and it relies only on device parameters that have clear physical meaning and can be easily measured. We integrate the device models in design kits based on powerful EDA tools such as CADENCE and OrCAD, and use them for versatile simulations of TES circuits. Comparing our simulation results with published experimental data, we find good agreement which suggests that device models based on the two-fluid theory can be used to predict the behavior of TES circuits reliably and hence they are valuable for assisting the design of sophisticated TES circuits.
Commitment-based action: Rational choice theory and contrapreferential choice
Radovanović Bojana
2014-01-01
This paper focuses on Sen's concept of contrapreferential choice. Sen developed this concept in order to overcome weaknesses of rational choice theory. According to rational choice theory, a decision-maker can always be seen as someone who maximises utility, and each choice he makes as the one that brings him the highest level of personal wellbeing. Sen argues that in some situations we choose alternatives that bring us a lower level of wellbeing than we could achieve had we chosen some other alternative available to us. This happens when we base our decisions on moral principles, when we act out of duty. Sen calls such action commitment-based action. When we act out of commitment we actually neglect our preferences and thus make a contrapreferential choice, as Sen argues. This paper shows that, contrary to Sen, commitment-based action can be explained within the framework of rational choice theory. However, when each choice we make can be explained within that framework, when a maximisation principle can be read into everything we do, then the variety of our motives and traits is lost, and the explanatory power of rational choice theory is questionable. [Projects of the Ministry of Science of the Republic of Serbia, no. 47009: European integration and socio-economic changes of the Serbian economy on the road to the EU, and no. 179015: Challenges and prospects of structural changes in Serbia: strategic directions of economic development and harmonisation with EU requirements]
Paquete, Ana Teresa; Miguel, Luís Silva; Becker, Ursula; Pereira, Catarina; Pinto, Carlos Gouveia
2017-08-01
Chronic lymphocytic leukaemia (CLL) mostly affects patients with comorbidities and limited therapeutic options. Obinutuzumab in combination with chlorambucil (GClb) is a new therapeutic option for previously untreated CLL patients who are unsuitable for full-dose fludarabine-based therapy. This combination delays disease progression but incurs additional costs; thus, an assessment of its value for money is relevant. To estimate the incremental cost-utility ratio of GClb in comparison with (i) rituximab in combination with chlorambucil (RClb), and (ii) chlorambucil alone (Clb) from the perspective of the Portuguese National Health Service (NHS). A Markov model was used to predict disease progression. Pre-progression clinical data were based on the latest CLL11 trial data, and post-progression clinical data were obtained from CLL5 trial data. Utility values are from Kosmas et al. (Leuk Lymphoma 56:1320-1326, 14). Only direct medical costs were included. The resource consumption was estimated by a panel of Portuguese experts, and the unit costs were obtained from official sources. A discount rate of 5% was applied to costs and consequences. GClb and RClb were associated with an increase of 1.06 and 0.39 quality-adjusted life-years (QALY) at an additional cost of €21,720 and €9836 when compared to Clb, respectively. The cost-utility ratio of GClb versus Clb was €20,397/QALY, while RClb was extendedly dominated. The use of GClb for previously untreated CLL patients who are unsuitable for full-dose fludarabine-based therapy incurs an incremental cost per QALY that is generally accepted in Portugal. Therefore, although there is some uncertainty, obinutuzumab is probably a cost-effective therapy in the Portuguese setting.
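The mechanics of such a Markov cost-utility comparison can be sketched with a three-state cohort model (pre-progression, post-progression, dead) and a 5% discount rate as in the abstract. Every transition probability, cost and utility below is invented for illustration; this is the generic method, not the CLL11/CLL5-based model or the Portuguese inputs.

```python
def markov_cohort(p_progress, p_death_pre, p_death_post, cost, utility,
                  cycles, discount=0.05):
    """Discounted total cost and QALYs for a 3-state cohort
    (pre-progression, post-progression, dead); one cycle = one year."""
    states = [1.0, 0.0, 0.0]                 # whole cohort starts pre-progression
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1 + discount) ** t        # discount factor for cycle t
        total_cost += d * sum(s * c for s, c in zip(states, cost))
        total_qaly += d * sum(s * u for s, u in zip(states, utility))
        pre, post, dead = states
        states = [pre * (1 - p_progress - p_death_pre),
                  pre * p_progress + post * (1 - p_death_post),
                  dead + pre * p_death_pre + post * p_death_post]
    return total_cost, total_qaly

# Hypothetical comparison: new therapy delays progression but costs more
c1, q1 = markov_cohort(0.10, 0.02, 0.10, [12000, 8000, 0], [0.80, 0.60, 0.0], 20)
c0, q0 = markov_cohort(0.20, 0.02, 0.10, [2000, 8000, 0], [0.80, 0.60, 0.0], 20)
icer = (c1 - c0) / (q1 - q0)   # incremental cost per QALY gained
```

The comparator is dominated or not depending on where this ICER falls relative to the willingness-to-pay threshold, which is the decision rule the abstract applies.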
Ontology mapping approach based on set & relation theory and OCL
QIAN Peng-fei; ZHANG Shen-sheng; LIU Ying-hua
2009-01-01
An ontology mapping approach based on set & relation theory and OCL is introduced; then an ontology mapping meta-model is established, composed of ontology-related elements, mapping-related elements and definition-rule-related elements. This ontology mapping meta-model can be regarded as a unified mechanism to realize different kinds of ontology mappings. The powerful computation capability of set and relation theory and the flexible expressive capability of OCL can be used in the computation of the ontology mapping meta-model to realize unified mapping among different ontology models. Based on the mapping meta-model, a general mapping management framework is developed to provide a common mapping storage mechanism, mapping APIs and mapping rule APIs.
Managing for resilience: an information theory-based ...
Ecosystems are complex and multivariate; hence, methods to assess the dynamics of ecosystems should have the capacity to evaluate multiple indicators simultaneously. Most research on identifying leading indicators of regime shifts has focused on univariate methods and simple models, which have limited utility when evaluating real ecosystems, particularly because drivers are often unknown. We discuss some common univariate and multivariate approaches for detecting critical transitions in ecosystems and demonstrate their capabilities via case studies. In response to the need to identify leading indicators of regime shifts, our research compares traditional indicators and Fisher information, an information theory-based method, by examining four case-study systems. We illustrate the utility of an information theory-based index for assessing ecosystem dynamics; trends in this index provide a sentinel of both abrupt and gradual transitions in ecosystems. The results demonstrate the utility of these methods and offer great promise for quantifying and managing for resilience.
A Model of Resurgence Based on Behavioral Momentum Theory
Shahan, Timothy A; Sweeney, Mary M.
2011-01-01
Resurgence is the reappearance of an extinguished behavior when an alternative behavior reinforced during extinction is subsequently placed on extinction. Resurgence is of particular interest because it may be a source of relapse to problem behavior following treatments involving alternative reinforcement. In this article we develop a quantitative model of resurgence based on the augmented model of extinction provided by behavioral momentum theory. The model suggests that alternative reinforc...
Theory-based Practice: Comparing and Contrasting OT Models
Nielsen, Kristina Tomra; Berg, Brett
2012-01-01
Theory-Based Practice: Comparing and Contrasting OT Models. The workshop will present a critical analysis of the major models of occupational therapy: A Model of Human Occupation, Enabling Occupation II, and the Occupational Therapy Intervention Process Model. Similarities and differences among the models will be discussed, including each model's limitations and unique contributions to the profession. The workshop format will include short lectures and group discussions.
Designing “Theory of Machines and Mechanisms” course on Project Based Learning approach
Shinde, Vikas
2013-01-01
Theory of Machines and Mechanisms is one of the essential courses of the Mechanical Engineering undergraduate curriculum practiced at Indian Institute. Previously, this course was taught by traditional instruction-based pedagogy. In order to achieve the profession-specific skills demanded by the industry and the learning outcomes specified by the National Board of Accreditation (NBA), India, this course is restructured on a Project Based Learning approach. A mini project is designed to suit the course objectives. An objective of this paper is to discuss the rationale of this course design...
Ebrahimi, Farzad; Reza Barati, Mohammad
2017-01-01
In this research, the vibration characteristics of a flexoelectric nanobeam in contact with a Winkler-Pasternak foundation are investigated based on nonlocal elasticity theory, considering surface effects. This nonclassical nanobeam model incorporates the flexoelectric effect to capture the coupling of strain gradients and electrical polarizations. Moreover, nonlocal elasticity theory is employed to study the nonlocal and long-range interactions between the particles. The present model degenerates into the classical model if the nonlocal parameter and the flexoelectric and surface effects are omitted. Hamilton's principle is employed to derive the governing equations and the related boundary conditions, which are solved applying a Galerkin-based solution. Natural frequencies are verified against those of previous papers on nanobeams. It is illustrated that flexoelectricity, nonlocality, surface stresses, the elastic foundation and the boundary conditions considerably affect the vibration frequencies of piezoelectric nanobeams.
Das, Susanta K.; Berry, K. J.
A two-cell theory is developed to measure proton exchange membrane (PEM) resistance to proton flow during conduction through a PEM fuel cell. The theoretical framework developed herein is based upon fundamental thermodynamic principles and engineering laws. We made appropriate corrections to develop the theoretical model previously proposed by Babu and Nair (B.V. Babu, N. Nair, J. Energy Edu. Sci. Technol. 13 (2004) 13-20) for measuring membrane resistance to the flow of protons, which is the only ion that travels from one electrode to the other through the membrane. A simple experimental set-up and procedure are also developed to validate the theoretical model predictions. A widely used commercial membrane (Nafion ®) and several in-house membranes are examined to compare relative resistance among membranes. According to the theory, resistance of the proton exchange membrane is directly proportional to the time taken for a specific amount of protons to pass through the membrane. A second order differential equation describes the entire process. The results show that theoretical predictions are in excellent agreement with experimental observations. It is our speculation that the investigation results will open up a route to develop a simple device to measure resistance during membrane manufacturing since electrolyte resistance is one of the key performance drivers for the advancement of fuel cell technology.
Principle-based concept analysis: intentionality in holistic nursing theories.
Aghebati, Nahid; Mohammadi, Eesa; Ahmadi, Fazlollah; Noaparast, Khosrow Bagheri
2015-03-01
This is a report of a principle-based concept analysis of intentionality in holistic nursing theories. A principle-based concept analysis method was used to analyze seven holistic theories. The data included eight books and 31 articles (1998-2011), which were retrieved through MEDLINE and CINAHL. Erickson, Kriger, Parse, Watson, and Zahourek define intentionality as a capacity, a focused consciousness, and a pattern of human being. Rogers and Newman do not explicitly mention intentionality; however, they do explain pattern and consciousness (epistemology). Intentionality has been operationalized as a core concept of nurse-client relationships (pragmatic). The theories are consistent on intentionality as a noun and as an attribute of the person-intentionality is different from intent and intention (linguistic). There is ambiguity concerning the boundaries between intentionality and consciousness (logic). Theoretically, intentionality is an evolutionary capacity to integrate human awareness and experience. Because intentionality is an individualized concept, we introduced it as "a matrix of continuous known changes" that emerges in two forms: as a capacity of human being and as a capacity of transpersonal caring. This study has produced a theoretical definition of intentionality and provides a foundation for future research to further investigate intentionality to better delineate its boundaries. © The Author(s) 2014.
An interface energy density-based theory considering the coherent interface effect in nanomaterials
Yao, Yin; Chen, Shaohua; Fang, Daining
2017-02-01
To characterize the coherent interface effect conveniently and feasibly in nanomaterials, a continuum theory is proposed that is based on the concept of the interface free energy density, which is a dominant factor affecting the mechanical properties of the coherent interface in materials of all scales. The effect of the residual strain caused by self-relaxation and the lattice misfit of nanomaterials, as well as that due to the interface deformation induced by an external load on the interface free energy density is considered. In contrast to the existing theories, the stress discontinuity at the interface is characterized by the interface free energy density through an interface-induced traction. As a result, the interface elastic constant introduced in previous theories, which is not easy to determine precisely, is avoided in the present theory. Only the surface energy density of the bulk materials forming the interface, the relaxation parameter induced by surface relaxation, and the mismatch parameter for forming a coherent interface between the two surfaces are involved. All the related parameters are far easier to determine than the interface elastic constants. The effective bulk and shear moduli of a nanoparticle-reinforced nanocomposite are predicted using the proposed theory. Closed-form solutions are achieved, demonstrating the feasibility and convenience of the proposed model for predicting the interface effect in nanomaterials.
A danger-theory-based immune network optimization algorithm.
Zhang, Ruirui; Li, Tao; Xiao, Xin; Shi, Yuanquan
2013-01-01
Existing artificial immune optimization algorithms suffer from a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated by changes in the environment guide different levels of immune response, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts each antibody's concentration according to its own danger signal and then triggers immune responses of self-regulation, so that population diversity can be maintained. Experimental results show that the algorithm has advantages in solution quality and population diversity. Compared with the influential optimization algorithms CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions meeting the specified accuracies within the allowed number of function evaluations.
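The danger-zone mechanism described in the abstract can be sketched in a few lines. The positions, radius, and linear concentration decay below are illustrative assumptions, not the published dt-aiNet update rules:

```python
import math

def danger_signal(antibodies, idx, radius):
    """Count neighbours inside the danger zone of antibody idx
    (a Euclidean ball of the given radius); a crowded zone yields
    a stronger signal, prompting suppression."""
    x = antibodies[idx]["pos"]
    return sum(
        1 for j, ab in enumerate(antibodies)
        if j != idx and math.dist(x, ab["pos"]) < radius
    )

def update_concentrations(antibodies, radius, decay=0.2):
    """Lower the concentration of antibodies whose danger zones are
    crowded, which preserves population diversity (illustrative rule)."""
    for i, ab in enumerate(antibodies):
        s = danger_signal(antibodies, i, radius)
        ab["conc"] = max(0.0, ab["conc"] - decay * s)
    return antibodies

population = [
    {"pos": (0.0, 0.0), "conc": 1.0},   # crowded pair
    {"pos": (0.1, 0.0), "conc": 1.0},
    {"pos": (5.0, 5.0), "conc": 1.0},   # isolated antibody
]
update_concentrations(population, radius=1.0)
```

Only the isolated antibody keeps its full concentration; the crowded pair is suppressed, which is the diversity-preserving effect the abstract attributes to danger signals.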
Two approaches to synthesis based on the domain theory
Hansen, Claus Thorp; Andreasen, Mogens Myrup
2002-01-01
The domain theory is described in this chapter. By a strict distinction between the structural characteristics and the behavioural properties of a mechanical artefact, each domain, i.e. the transformation, organ, and part domain, becomes a productive view for the design of mechanical artefacts. The functional reasoning within each domain and between the domains seems to be ruled by the function-means law (Hubka's law). On the basis of the domain theory and the function-means law we present two formal approaches to the synthesis of mechanical artefacts, namely a design-process-oriented approach and an artefact-oriented approach. The design-process-oriented synthesis approach can be seen as a basic design step for composite mechanical artefacts. The artefact-oriented approach has been utilised for the development of computer-based design support systems.
Cosmological Model Based on Gauge Theory of Gravity
WU Ning
2005-01-01
A cosmological model based on gauge theory of gravity is proposed in this paper. Combining the cosmological principle with the field equation of the gravitational gauge field, dynamical equations for the scale factor R(t) of our universe can be obtained. This set of equations has three different solutions. A prediction of the present model is that, if the energy density of the universe is not zero and the universe is expanding, the universe must be space-flat, and the total energy density must be the critical density ρc of the universe. For the space-flat case, this model gives the same solution as the Friedmann model. In other words, though they have different dynamics of gravitational interactions, general relativity and gauge theory of gravity give the same cosmological model.
A New Network Robustness Topology Measure based on Information Theory
Schieber, Tiago A; Frery, Alejandro C; Rosso, Osvaldo A; Pardalos, Panos M; Ravetti, Martin G
2014-01-01
A crucial challenge in network theory is to study how robust a network is when facing failures or attacks. In this work, we propose a novel methodology to measure the topological resilience and robustness of a network based on information theory quantifiers. This measure can be used with any probability distribution able to represent the network's properties. In particular, we analyze the efficiency of the degree and distance distributions in capturing small perturbations of the network's topology. Theoretical examples and real networks are used to study the performance of this methodology. Although both distributions are able to detect any single topological change, the distance distribution is more consistent in reflecting structural deviations of the network. In all cases, the novel resilience and robustness measures computed from the distance distribution better reflect the consequences of the failures, outperforming other methods.
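The abstract does not name its quantifiers; as an assumption, the sketch below compares the degree distributions of a network before and after removing an edge using the Jensen-Shannon divergence, a standard information-theory choice for comparing distributions:

```python
import math
from collections import Counter

def degree_distribution(adj):
    """Empirical degree distribution of an undirected graph given as
    an adjacency dict {node: set(neighbours)}."""
    degs = Counter(len(nbrs) for nbrs in adj.values())
    n = len(adj)
    return {d: c / n for d, c in degs.items()}

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2) between two distributions
    given as {value: probability} dicts."""
    support = set(p) | set(q)
    m = {k: 0.5 * (p.get(k, 0) + q.get(k, 0)) for k in support}
    def kl(a, b):
        return sum(a[k] * math.log2(a[k] / b[k]) for k in a if a[k] > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# A 4-node cycle vs. the same graph with one edge removed.
cycle = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {2, 0}}
path  = {0: {1},    1: {0, 2}, 2: {1, 3}, 3: {2}}
d = js_divergence(degree_distribution(cycle), degree_distribution(path))
```

A nonzero divergence flags the perturbation; the paper's point is that a distance (shortest-path) distribution, used in place of the degree distribution here, tracks structural damage more consistently.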
Wu, Christina J; Roytman, Marina M; Hong, Leena K; Huddleston, Leslie; Trujillo, Ruby; Cheung, Alvin; Poerzgen, Peter; Tsai, Naoky C S
2015-09-01
The introduction of sofosbuvir, a direct-acting antiviral, has revolutionized the treatment of chronic hepatitis C virus (HCV). Phase 3 clinical trials have demonstrated the efficacy, simplicity, and tolerability of sofosbuvir-based regimens and report high sustained virological response (SVR) rates. The purpose of this study was to assess whether clinical trial findings translate into a real-world setting, particularly with treatment of chronic HCV in our diverse, multiethnic population of Hawai'i. Retrospective analysis was performed for 113 patients with genotype 1-6 HCV infection treated at the Queen's Liver Center between January 2014 and March 2015. SVR rates for our cohort were slightly lower than the rates published by the clinical trials. Data analysis also suggested that most baseline characteristics previously associated with inferior response might not be as significant for sofosbuvir-based regimens; in our cohort, male gender was the only factor significantly related to increased risk of virologic relapse. Pacific Islanders also had a higher rate of relapse compared to other ethnic groups, but the small number of patients treated in this subgroup makes it difficult to validate this finding. While newer all-oral treatment regimens have been introduced since this study, we highlight the importance of comparing real-world versus clinical trial results for new treatments, and provide data analyses for treatment of chronic HCV in Hawai'i.
Hunault, C C; Habbema, J D F; Eijkemans, M J C; Collins, J A; Evers, J L H; te Velde, E R
2004-09-01
Several models have been published for the prediction of spontaneous pregnancy among subfertile patients. The aim of this study was to broaden the empirical basis for these predictions by making a synthesis of three previously published models. We used the original data from the studies of Eimers et al. (1994), Collins et al. (1995) and Snick et al. (1997) on couples consulting for various forms of subfertility. We developed a so-called three-sample synthesis model for predicting spontaneous conception leading to live birth within 1 year after intake based on the three data sets. The predictors used are duration of subfertility, woman's age, primary or secondary infertility, percentage of motile sperm, and whether the couple was referred by a general practitioner or by a gynaecologist (referral status). The performance of this model was assessed according to a 'jack-knife' analysis. Because the post-coital test (PCT) was not assessed in one of the samples, a synthesis model including the PCT was based on two samples only. The ability of the synthesis models to distinguish between women who became pregnant and those who did not was comparable to the ability of the one-sample models when applied in the other samples. The reliability of the predictions by the three-sample synthesis model was somewhat better. Predictions improved considerably by including the PCT. The synthesis models performed better and had a broader empirical basis than the original models. They are therefore better suited for application in other centres.
Determination of the sediment carrying capacity based on perturbed theory.
Ni, Zhi-hui; Zeng, Qiang; Wu, Li-chun
2014-01-01
Building on previous studies of sediment carrying capacity, a new method for determining the sediment carrying capacity based on perturbation theory is proposed. By taking into account the average water depth, average flow velocity, settling velocity, and other influencing factors, and introducing the median grain size as one main influencing factor, we established a new sediment carrying capacity formula. The coefficients were determined by the principle of dimensional analysis, the multiple linear regression method, and the least squares method. The new formula was then verified against measured data from natural rivers and flume tests, and the results were compared with those calculated by the Cao, Zhang, Li, Engelund-Hansen, Ackers-White, and Yang formulas. The comparison shows that the new method is of high accuracy. It could be a useful reference for the determination of sediment carrying capacity.
Ranking streamflow model performance based on Information theory metrics
Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas
2016-04-01
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to test whether information theory-based metrics can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: the mean information gain, which measures the randomness of the signal; the effective measure complexity, which characterizes predictability; and the fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency increased with model complexity, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
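The symbolization and mean-information-gain computation described above can be sketched as follows; the alphabet size and block length are illustrative assumptions, since the abstract does not specify them:

```python
import math
from collections import Counter

def symbolize(series, n_symbols=4):
    """Map each value to the quantile bin it falls in (0..n_symbols-1)."""
    ranked = sorted(series)
    def bin_of(x):
        r = sum(v <= x for v in ranked) / len(ranked)  # empirical CDF
        return min(n_symbols - 1, int(r * n_symbols))
    return [bin_of(x) for x in series]

def block_entropy(symbols, L):
    """Shannon entropy (bits) of length-L blocks of the symbol string."""
    blocks = Counter(tuple(symbols[i:i + L])
                     for i in range(len(symbols) - L + 1))
    n = sum(blocks.values())
    return -sum(c / n * math.log2(c / n) for c in blocks.values())

def mean_information_gain(symbols, L=1):
    """Entropy of the next symbol given the preceding L-block:
    H(L+1) - H(L). Higher values indicate a more random signal."""
    return block_entropy(symbols, L + 1) - block_entropy(symbols, L)

mig_const  = mean_information_gain([0] * 50)               # fully predictable
mig_series = mean_information_gain(symbolize([1, 2, 3, 4] * 25))
```

A constant signal yields zero gain; any signal with residual uncertainty about the next symbol yields a positive value, which is the sense in which streamflow (filtered by the watershed) scores lower than raw precipitation.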
Discovering Pair-Wise Genetic Interactions: An Information Theory-Based Approach
Ignac, Tomasz M.; Skupin, Alexander; Sakhanenko, Nikita A.; Galas, David J.
2014-01-01
Phenotypic variation, including that which underlies health and disease in humans, results in part from multiple interactions among both genetic variation and environmental factors. While diseases or phenotypes caused by single gene variants can be identified by established association methods and family-based approaches, complex phenotypic traits resulting from multi-gene interactions remain very difficult to characterize. Here we describe a new method based on information theory, and demonstrate how it improves on previous approaches to identifying genetic interactions, including both synthetic and modifier kinds of interactions. We apply our measure, called interaction distance, to previously analyzed data sets of yeast sporulation efficiency, lipid related mouse data and several human disease models to characterize the method. We show how the interaction distance can reveal novel gene interaction candidates in experimental and simulated data sets, and outperforms other measures in several circumstances. The method also allows us to optimize case/control sample composition for clinical studies. PMID:24670935
Half-global discretization algorithm based on rough set theory
Tan Xu; Chen Yingwu
2009-01-01
How to extract knowledge from a decision table based on rough set theory is being widely studied. A newer problem is how to discretize a decision table having continuous attributes. In order to obtain more reasonable discretization results, a discretization algorithm is proposed which arranges half-global discretization based on the correlation coefficient of each continuous attribute while respecting the characteristics of rough set theory. When choosing heuristic information, stability is combined with rough entropy. In terms of stability, the possibility of classifying objects belonging to a certain sub-interval of a given attribute into neighboring sub-intervals is minimized, so that rational discrete intervals can be determined. Rough entropy is employed to decide the optimal cut-points while guaranteeing the consistency of the decision table after discretization. The idea of the algorithm is illustrated with the Iris data, and experiments compare the outcomes of four discretized datasets calculated by the proposed algorithm and by four other typical discretization algorithms. Classification rules are then deduced through rough set based classifiers. Results show that the proposed discretization algorithm generates optimal classification accuracy while minimizing the number of discrete intervals, and it displays superiority especially when dealing with decision tables having a large number of attributes.
A communication-theory based view on telemedical communication.
Schall, Thomas; Roeckelein, Wolfgang; Mohr, Markus; Kampshoff, Joerg; Lange, Tim; Nerlich, Michael
2003-01-01
Communication-theory-based analysis sheds new light on the use of health telematics. This analysis of structures in electronic medical communication shows communicative structures with special features. Current and evolving telemedical applications are analyzed. The methodology of communication theory (focusing on linguistic pragmatics) is used to compare telemedical communication with its conventional counterpart. The semiotic model, the roles of the partners, the respective message, and their relations are discussed. Channels, sender, addressee, and other structural roles are analyzed for different types of electronic medical communication. The communicative processes are shown to be mutual, rational action towards a common goal. The types of communication/texts are analyzed in general. Furthermore, the basic communicative structures of medical education via the internet are presented with their special features. The analysis shows that electronic medical communication has special features compared to everyday communication: a third participant role is often involved, the patient; messages are often addressed to an unspecified partner, or to an unspecified partner within a group, so that addressing is (at least partially) role-based; communication and message often directly (rather than indirectly) influence actions of the participants; and communication is often heavily regulated, with legal implications such as liability. The conclusion from the analysis is that the development of telemedical applications has so far not sufficiently taken communicative structures into consideration. Based on these results, recommendations for future developments of telemedical applications and services are given.
Study on thermal wave based on the thermal mass theory
HU RuiFeng; CAO BingYang
2009-01-01
The conservation equations for heat conduction are established based on the concept of thermal mass. We obtain a general heat conduction law which takes into account the spatial and temporal inertia of thermal mass. The general law introduces a damped thermal wave equation. It reduces to the well-known CV model when the spatial inertia of heat flux and temperature and the temporal inertia of temperature are neglected, which indicates that the CV model only considers the temporal inertia of heat flux. Numerical simulations on the propagation and superposition of thermal waves show that for small thermal perturbation the CV model agrees with the thermal wave equation based on the thermal mass theory. For larger thermal perturbation, however, the physically impossible phenomenon predicted by the CV model, i.e. the negative temperature induced by the thermal wave superposition, is eliminated by the general heat conduction law, which demonstrates that the present heat conduction law based on the thermal mass theory is more reasonable.
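The CV model referred to above is a damped (telegraph-type) wave equation, tau*T_tt + T_t = alpha*T_xx. A minimal explicit finite-difference step is sketched below; the discretization and the parameter values are illustrative assumptions, not the paper's scheme:

```python
def cv_step(T, T_prev, alpha, tau, dx, dt):
    """One explicit time step of the CV (telegraph) equation
    tau*T_tt + T_t = alpha*T_xx on a 1-D rod with fixed ends.
    Central differences in space and time; solve the resulting
    linear relation for T at the new time level."""
    n = len(T)
    T_next = T[:]
    for i in range(1, n - 1):
        lap = (T[i - 1] - 2 * T[i] + T[i + 1]) / dx**2
        # tau*(Tn - 2T + Tp)/dt^2 + (Tn - Tp)/(2dt) = alpha*lap, solve for Tn:
        a = tau / dt**2 + 0.5 / dt
        rhs = (alpha * lap
               + tau * (2 * T[i] - T_prev[i]) / dt**2
               + T_prev[i] * 0.5 / dt)
        T_next[i] = rhs / a
    return T_next

T = [0.0] * 21
T[10] = 1.0                      # initial heat pulse at the centre
T_next = cv_step(T, T, alpha=1.0, tau=1.0, dx=0.1, dt=0.05)
```

The pulse spreads to its neighbours at the finite wave speed sqrt(alpha/tau) while decaying, which is the damped-wave behaviour the abstract contrasts with parabolic (Fourier) conduction; dt is kept below dx/sqrt(alpha/tau) for stability of the explicit scheme.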
Game Theory and Risk-Based Levee System Design
Hui, R.; Lund, J. R.; Madani, K.
2014-12-01
Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes the expected annual damage cost and the annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.
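The non-cooperative equilibrium described above can be illustrated with a toy best-response iteration. The cost function below (annualized construction cost plus an overtopping-probability damage term in which raising the opposite levee pushes flood risk back across the river) is a made-up stand-in for the paper's risk-based analysis, not its actual model:

```python
def total_cost(h_own, h_other, build=1.0, damage=10.0):
    """Illustrative annual cost for one bank: construction cost grows
    with own levee height; expected damage falls with own height but
    rises as the opposite bank raises its levee (risk transfer)."""
    overtopping_prob = 1.0 / (1.0 + max(0.0, 1.0 + h_own - 0.5 * h_other))
    return build * h_own + damage * overtopping_prob

def best_response(h_other, grid):
    """Height on a discrete grid minimizing this bank's own cost."""
    return min(grid, key=lambda h: total_cost(h, h_other))

# Iterate simultaneous best responses until neither bank deviates (Nash).
grid = [i * 0.1 for i in range(51)]          # heights 0.0 .. 5.0
hA = hB = 0.0
for _ in range(100):
    hA_new, hB_new = best_response(hB, grid), best_response(hA, grid)
    if (hA_new, hB_new) == (hA, hB):
        break
    hA, hB = hA_new, hB_new
```

Each bank's raise provokes a further raise by the other, and the iteration settles at a symmetric equilibrium with higher levees (and higher total cost) than either bank would need in isolation, the over-building externality the paper addresses with cooperative compensation.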
Theory of Networked Minority Games based on Strategy Pattern Dynamics
Lo, T. S.; Chan, H. Y.; Hui, P. M.; Johnson, N. F.
2004-01-01
We formulate a theory of agent-based models in which agents compete to be in a winning group. The agents may be part of a network or not, and the winning group may be a minority group or not. The novel feature of the present formalism is its focus on the dynamical pattern of strategy rankings, and its careful treatment of the strategy ties which arise during the system's temporal evolution. We apply it to the Minority Game (MG) with connected populations. Expressions for the mean success rate...
Method of Fire Image Identification Based on Optimization Theory
[no author listed]
2002-01-01
In view of some distinctive characteristics of early-stage flame images, a corresponding method of feature extraction is presented, together with the application of an improved BP algorithm based on optimization theory to identifying fire image characteristics. First, the optimization of the BP neural network using the Levenberg-Marquardt algorithm, which has the property of quadratic convergence, is discussed, and then a new fire image identification system is devised. Numerous experiments and field tests have shown that this system can detect early-stage fire flames quickly and reliably.
Diversification or splitting-an explanation based on Contract Theory
LUO Liang-zhong; SHI Zhan-zhong
2005-01-01
By inserting a variable for the exactness of corporate valuation into the classic model of contract theory, this paper, on the basis of the interaction among the accuracy of corporate valuation, managerial incentives, and operational risks, explores the deeper reasons for changes in corporate structures. It draws the conclusion that divestment of a subsidiary benefits shareholders when the parent corporation is undervalued and the relation between the parent and the subsidiary is disordered, and vice versa. This conclusion is consistent with the motives of many divestiture cases in practice.
Transportation optimization with fuzzy trapezoidal numbers based on possibility theory.
He, Dayi; Li, Ran; Huang, Qi; Lei, Ping
2014-01-01
In this paper, a parametric method is introduced to solve the fuzzy transportation problem. Considering that the parameters of the transportation problem carry uncertainty, this paper develops a generalized fuzzy transportation problem with fuzzy supply, demand, and cost. For simplicity, these parameters are assumed to be fuzzy trapezoidal numbers. Based on possibility theory and consistent with decision-makers' subjective preferences and practical requirements, the fuzzy transportation problem is transformed into a crisp linear transportation problem by defuzzifying the fuzzy constraints and objectives with the fractile and modality approaches. Finally, a numerical example is provided to exemplify the application of fuzzy transportation programming and to verify the validity of the proposed methods.
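A minimal sketch of the defuzzify-then-solve idea: trapezoidal costs are collapsed to crisp values (here by the simple average of the four corners, one common choice) and the crisp problem is handed to a greedy least-cost heuristic. Both choices are illustrative assumptions; the paper uses fractile/modality transformations and an exact linear-programming solve:

```python
def defuzz(tr):
    """Crisp value of a trapezoidal fuzzy number (a, b, c, d);
    the plain average is one common defuzzification choice."""
    a, b, c, d = tr
    return (a + b + c + d) / 4.0

def least_cost_allocation(supply, demand, cost):
    """Greedy least-cost method for the crisp transportation problem:
    fill the cheapest remaining cell first. A heuristic, not
    guaranteed optimal."""
    supply, demand = supply[:], demand[:]
    alloc = [[0] * len(demand) for _ in supply]
    cells = sorted((cost[i][j], i, j)
                   for i in range(len(supply)) for j in range(len(demand)))
    for _, i, j in cells:
        q = min(supply[i], demand[j])
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
    return alloc

# Fuzzy trapezoidal unit costs; crisp supply/demand for brevity.
fuzzy_cost = [[(1, 2, 3, 4), (4, 5, 6, 7)],
              [(2, 3, 4, 5), (1, 1, 2, 2)]]
cost = [[defuzz(t) for t in row] for row in fuzzy_cost]
plan = least_cost_allocation([30, 20], [25, 25], cost)
```

The cheapest route (source 2 to destination 2, crisp cost 1.5) is saturated first, then the remainder is assigned in cost order, so all supply and demand constraints are met.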
Active XML Document Rewriting Based on Tree Automata Theory
MA Haitao; HAO Zhongxiao; ZHU Yan
2006-01-01
The problem of document rewriting is a fundamental problem in active XML (AXML) data exchange and usually has a high complexity. Prior work focused on string automaton theory; this paper tries to solve the problem using tree automata. More precisely, the paper first defines a new tree automaton, the active XML tree automaton (AXTA), which can efficiently represent the set of AXML documents produced by an AXML document or an AXML document schema. Then, an algorithm for constructing the AXTA automaton is proposed. Finally, a polynomial-time (PTIME) decision algorithm for AXML document rewriting based on the AXTA automaton is presented.
Sadiq, Faizan A; Li, Yun; Liu, TongJie; Flint, Steve; Zhang, Guohua; He, GuoQing
2016-01-18
Aerobic spore forming bacteria are potential milk powder contaminants and are viewed as indicators of poor quality. A total of 738 bacteria, both mesophilic and thermophilic, isolated from twenty-five powdered milk samples representative of three types of milk powders in China were analyzed with the random amplified polymorphic DNA (RAPD) protocol to provide insight into species diversity. Bacillus licheniformis was found to be the most prevalent bacterium with the greatest diversity (~43% of the total isolates), followed by Geobacillus stearothermophilus (~21% of the total isolates). Anoxybacillus flavithermus represented only 8.5% of the total profiles. Interestingly, actinomycetes represented a major group of the isolates, with the predominance of Laceyella sacchari followed by Thermoactinomyces vulgaris, together comprising 7.3% of the total isolates. Of the nineteen separate bacterial species (plus five unidentified groups) recovered and identified from milk powders, twelve proved to belong to novel or previously unreported species in milk powders. Assessment and characterization of the harmful effects of this particular micro-flora on the quality and safety of milk powders will be worth doing in the future.
Control theory based airfoil design for potential flow and a finite volume discretization
Reuther, J.; Jameson, A.
1994-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.
A Translation Case Analysis Based on Skopos Theory
刘冬梅
2015-01-01
With the spread of globalization, the role of translation is crucial in cultural, economic, and social communication. The functionalist approaches to translation originated in the 1970s in Germany. They carried on the reasonable aspects of the traditional theories and broke free of their restraints, which makes them very practical. Skopos theory reflects a general shift from predominantly linguistic and rather formal translation theories to a more functionally and socio-culturally oriented concept of translation, which drew inspiration from communication theory, action theory, text linguistics, and text theory, as well as from movements in literary studies towards reception theories.
Control for Wind Power Generation Based on Inverse System Theory
Jiyong Zhang
2013-11-01
Traditional doubly-fed wind generation systems are based on the vector control method, which depends on motor parameters; the performance of the control system is affected when these parameters change. This paper proposes a new control method based on inverse system and variable structure sliding mode (VSS) theories: through inverse system theory and the structure of the state equations, the inverse system is obtained and a closed-loop control system for wind power generation is established. The VSS controller, designed with an exponential reaching law, can effectively improve the dynamic performance in the normal operation range. When the system operates with variable speed constant frequency (VSCF) and the phase voltage drops, simulations show that the control system can control the DC link voltage stably, maintain unity power factor, and achieve decoupling of the active and reactive power. Experiments show that the control method used in the VSCF wind power system is feasible.
Similarity theory based on the Dougherty-Ozmidov length scale
Grachev, Andrey A; Fairall, Christopher W; Guest, Peter S; Persson, P Ola G
2014-01-01
A local similarity theory is suggested based on the Brunt-Väisälä frequency and the dissipation rate of turbulent kinetic energy instead of the turbulent fluxes used in the traditional Monin-Obukhov similarity theory. Based on dimensional analysis (the Pi theorem), it is shown that any properly scaled statistics of the small-scale turbulence are universal functions of a stability parameter defined as the ratio of a reference height z and the Dougherty-Ozmidov length scale, which in the limit of z-less stratification is linearly proportional to the Obukhov length scale. Measurements of atmospheric turbulence made at five levels on a 20-m tower over the Arctic pack ice during the Surface Heat Budget of the Arctic Ocean experiment (SHEBA) are used to examine the behaviour of different similarity functions in the stable boundary layer. It is found that in the framework of this approach the non-dimensional turbulent viscosity is equal to the gradient Richardson number whereas the non-dimensional turbulent thermal diffusivit...
Disaster Rescue Simulation based on Complex Adaptive Theory
Feng Jiang
2013-05-01
Disaster rescue is one of the key measures of disaster reduction. The rescue process is a complex process characterized by large scale, complicated structure, and non-linearity, and it is hard to describe and analyze with traditional methods. Based on complex adaptive theory, this paper analyzes the complex adaptivity of the rescue process in terms of seven features: aggregation, nonlinearity, mobility, diversity, tagging, internal models, and building blocks. With the support of the Repast platform, an agent-based model including rescue agents and victim agents is proposed. Moreover, two simulations with different parameters are employed to examine the feasibility of the model. The proposed model is shown to be efficient in dealing with disaster rescue simulation and can provide a reference for decision making.
Face Recognition System Based on Spectral Graph Wavelet Theory
R. Premalatha Kanikannan
2014-09-01
This study presents an efficient approach for automatic face recognition based on Spectral Graph Wavelet Theory (SGWT). The SGWT is analogous to the wavelet transform, with transform functions defined on the vertices of a weighted graph. The given face image is first decomposed by the SGWT. The energies of the obtained sub-bands are fused together and taken as the feature vector for the corresponding image. The performance of the proposed system is analyzed on the ORL face database using a nearest neighbor classifier. The face images used in this study have variations in pose, expression, and facial details. The results indicate that the proposed system based on the SGWT is better than the wavelet transform, achieving 94% recognition accuracy.
STUDY ON IMAGE EDGE PROPERTY LOCATION BASED ON FRACTAL THEORY
[no author listed]
2001-01-01
A novel approach to printed circuit board (PCB) image locating is presented. Based on the rectangle mark image edge of the PCB, features are used to describe the image edge, and the fractal property of the image edge is analyzed. It is proved that the rectangle mark image edge of a PCB has some fractal features. A method of deleting unordinary curve noise and compensating the length of the fractal curve is put forward, which can obtain the fractal dimension value from one complex edge fractal property curve. The relation between the dimension of the fractal curve and the turning angle of the image can be acquired from an equation; as a result, the angle value of the PCB image is obtained exactly. A real image edge analysis confirms that the method based on fractal theory is a powerful new tool for angle locating on PCBs and related image areas.
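The fractal dimension of an edge curve is commonly estimated by box counting; the generic estimator below is a sketch of that idea, not the paper's compensated-length method:

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the fractal (box-counting) dimension of a planar curve
    given as (x, y) points: count occupied grid boxes N(s) at several
    box sizes s and fit the slope of log N(s) against log(1/s)."""
    logs = []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append((math.log(1.0 / s), math.log(len(boxes))))
    # least-squares slope of log N vs. log(1/s)
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    num = sum((x - mx) * (y - my) for x, y in logs)
    den = sum((x - mx) ** 2 for x, _ in logs)
    return num / den

# A straight edge segment should come out with dimension close to 1.
line = [(i / 1000.0, i / 1000.0) for i in range(1000)]
dim = box_counting_dimension(line, [0.1, 0.05, 0.02, 0.01])
```

A smooth edge yields a dimension near 1, while a rough, noise-laden edge yields a larger value, which is the kind of fractal signature the paper exploits for locating the rectangle mark.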
Inventory control based on advanced probability theory, an application
Krever, Maarten; Schorr, Bernd; Wunderink, S
2005-01-01
Whenever stock is placed as a buffer between consumption and supply the decision when to replenish the stock is based on uncertain values of future demand and supply variables. Uncertainty exists about the replenishment lead time, about the number of demands and the quantities demanded during this period. We develop a new analytical expression for the reorder point, which is based on the desired service level and three distributions: the distribution of the quantity of single demands during lead time, the distribution of the lengths of time intervals between successive demands, and the distribution of the lead time itself. The distribution of lead time demand is derived from the distributions of individual demand quantities and not from the demand per period. It is not surprising that the resulting formulae for the mean and variance are different from those currently used. The theory developed is also applicable to periodic review systems. The system has been implemented at CERN and enables a significant enha...
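The reorder-point idea, a service-level quantile of lead-time demand assembled from the three distributions, can be sketched by Monte Carlo; all distributional choices and parameter values here are illustrative assumptions, not the paper's analytical expression:

```python
import random

random.seed(42)

def simulate_lead_time_demand(n):
    """Sample total demand during the replenishment lead time:
    gamma lead time, exponential inter-demand times, lognormal demand
    sizes (all distributional choices are illustrative)."""
    totals = []
    for _ in range(n):
        lead_time = random.gammavariate(4.0, 0.5)     # days
        t, total = 0.0, 0.0
        while True:
            t += random.expovariate(2.0)              # ~2 demands/day
            if t > lead_time:
                break
            total += random.lognormvariate(1.0, 0.4)  # units per demand
        totals.append(total)
    return totals

def reorder_point(samples, service_level):
    """Reorder point = service-level quantile of lead-time demand."""
    s = sorted(samples)
    return s[int(service_level * (len(s) - 1))]

samples = simulate_lead_time_demand(20_000)
rop = reorder_point(samples, 0.95)
mean_ltd = sum(samples) / len(samples)
```

The 95% reorder point sits well above the mean lead-time demand, which is the safety-stock margin the analytical formula quantifies.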
Feature selection with neighborhood entropy-based cooperative game theory.
Zeng, Kai; She, Kun; Niu, Xinzheng
2014-01-01
Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods ignore features which have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features by using neighborhood entropy. The neighborhood entropy-based feature contribution is then proposed under the framework of cooperative game theory. The evaluation criteria of features can be formalized as the product of this contribution and other classical feature measures. Finally, the proposed method is tested on several UCI datasets. The results show that the neighborhood entropy-based cooperative game theory model (NECGT) yields better performance than classical ones.
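The cooperative-game notion of a feature's contribution is the Shapley value; a minimal exact computation on a toy value function in which two features are weak alone but strong together, exactly the interaction the abstract describes (the value function and feature names are illustrative):

```python
import itertools
import math

def shapley_values(features, coalition_value):
    """Exact Shapley value of each feature under a coalition value
    function (e.g. classification ability of a feature subset)."""
    n = len(features)
    phi = {f: 0.0 for f in features}
    for f in features:
        others = [g for g in features if g != f]
        for r in range(n):
            for S in itertools.combinations(others, r):
                w = (math.factorial(len(S)) * math.factorial(n - len(S) - 1)
                     / math.factorial(n))
                phi[f] += w * (coalition_value(set(S) | {f})
                               - coalition_value(set(S)))
    return phi

def value(S):
    """Toy value: 'a' and 'b' only pay off jointly; 'c' is mildly
    useful alone."""
    v = 1.0 if ('a' in S and 'b' in S) else 0.0
    if 'c' in S:
        v += 0.3
    return v

phi = shapley_values(['a', 'b', 'c'], value)
```

The joint effect of `a` and `b` is split evenly between them (0.5 each), and the values sum to the grand-coalition value, the efficiency property of the Shapley value.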
Theory based design and optimization of materials for spintronics applications
Xu, Tianyi
The spintronics industry has developed rapidly in the past decade. Finding the right material is very important for spintronics applications, which requires a good understanding of the physics behind specific phenomena. In this dissertation, we focus on two types of perpendicular transport phenomena: the current-perpendicular-to-plane giant magnetoresistance (CPP-GMR) phenomenon and the tunneling phenomenon in magnetic tunnel junctions. The Valet-Fert model is a very useful semi-classical approach for understanding the transport and spin-flip process in CPP-GMR. We present a finite-element-based implementation of the Valet-Fert model which enables a practical way to calculate the electron transport in real CPP-GMR spin valves. It is very important to find highly spin-polarized materials for CPP-GMR spin valves. The half-metal, due to its full spin polarization, is of interest, and we propose a rational way to find half-metals based on the gap theorem. We then turn to the high-MR TMR phenomenon. The tunneling theory of electron transport in mesoscopic systems is covered, and the transport properties of certain junctions are calculated with the help of Green's functions under the Landauer-Buttiker (scattering) formalism. The damping constant determines the switching rate of a device; we calculate it using a method based on the Extended Huckel Tight-Binding theory (EHTB). The symmetry filtering effect is very helpful for finding materials for TMR junctions, and based on it we identify a good candidate material for TMR applications, MnAl.
Cole, Galen E.
1999-01-01
Provides strategies for constructing theories of theory-based evaluation and provides examples in the field of public health. Techniques are designed to systematize and bring objectivity to the process of theory construction. Also introduces a framework of program theory. (SLD)
Investigating the Learning-Theory Foundations of Game-Based Learning: A Meta-Analysis
Wu, W-H.; Hsiao, H-C.; Wu, P-L.; Lin, C-H.; Huang, S-H.
2012-01-01
Past studies on the issue of learning-theory foundations in game-based learning stressed the importance of establishing learning-theory foundation and provided an exploratory examination of established learning theories. However, we found research seldom addressed the development of the use or failure to use learning-theory foundations and…
Priyan Perera
2013-03-01
Full Text Available A better understanding of the relationships between future behavioural intentions and their antecedents allows ecotourism operators to shape their ecotourism products to optimize customer satisfaction and improve marketing efforts. Although the relationship between previous visits and future behavioural intentions has been studied previously, less attention has been given to understanding how previous visits interact with other key determinants of behavioural intentions, such as trip quality, perceived value, and satisfaction, to form future behavioural intentions. This study proposes a model to examine the role of previous visits in predicting future behavioural intentions to participate in ecotourism; the relationship between previous visits and future behavioural intentions is modelled in a quality-satisfaction domain. Results suggest previous visits, trip quality, satisfaction and perceived value are important predictors of ecotourists' intention to revisit and recommend the destination, as well as of their propensity to engage in ecotourism in the future. Trip quality was the most important determinant of future ecotourism behavioural intentions. Implications of the study are discussed from the perspective of ecotourism marketing.
Towards a Complexity Theory of Randomized Search Heuristics: Ranking-Based Black-Box Complexity
Doerr, Benjamin
2011-01-01
Randomized search heuristics are a broadly used class of general-purpose algorithms. Analyzing them via classical methods of theoretical computer science is a growing field. A big step forward would be a useful complexity theory for such algorithms. We enrich the two existing black-box complexity notions due to Wegener and other authors by the restriction that not the actual objective values, but only the relative quality of the previously evaluated solutions, may be taken into account by the algorithm. Many randomized search heuristics belong to this class of algorithms. We show that the new ranking-based model gives more realistic complexity estimates for some problems, while for others the low complexities of the previous models still hold.
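A heuristic fitting the ranking-based model is easy to sketch: a (1+1) evolutionary algorithm on OneMax that consults only a comparison oracle, never raw objective values (parameters are illustrative):

```python
import random

random.seed(1)

def onemax(x):
    return sum(x)

def ranking_based_oea(n=30, budget=5000):
    """(1+1) EA that uses only the *relative* order of evaluated
    solutions, as in the ranking-based black-box model: the algorithm
    never sees objective values, only the comparison outcome."""
    parent = [random.randint(0, 1) for _ in range(n)]
    for _ in range(budget):
        child = [1 - b if random.random() < 1.0 / n else b for b in parent]
        # Comparison oracle: only "child not worse than parent" is revealed.
        if onemax(child) >= onemax(parent):
            parent = child
    return parent

best = ranking_based_oea()
```

With a budget of 5000 evaluations the algorithm comfortably exceeds the expected O(n log n) optimization time for OneMax with n = 30.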
Treatment of adolescent sexual offenders: theory-based practice.
Sermabeikian, P; Martinez, D
1994-11-01
The treatment of adolescent sexual offenders (ASO) has its theoretical underpinnings in social learning theory. Although social learning theory has been frequently cited in literature, a comprehensive application of this theory, as applied to practice, has not been mapped out. The social learning and social cognitive theories of Bandura appear to be particularly relevant to the group treatment of this population. The application of these theories to practice, as demonstrated in a program model, is discussed as a means of demonstrating how theory-driven practice methods can be developed.
Bathymetry Prediction Based on the Admittance Theory of Gravity Anomalies
OUYANG Mingda
2015-10-01
Full Text Available Based on the admittance theory of gravity anomalies, a method of bathymetry prediction is studied in detail in this paper. In the frequency domain, the correlation between gravity anomalies and bathymetry is analyzed; the strongly correlated wavelength band lies in the range of 20-300 km, and this band is appropriate for inverting bathymetry from gravity anomalies. Taking the Emperor Chain as an example, the uncompensated admittance model and the flexural isostatic admittance model were each used, and the included parameters of crustal thickness and effective elastic thickness were calculated from the isostatic response function. As the downward continuation factor is unstable, a high-cut filter is applied in the inversion procedure to ensure convergence of the series. The results show that the admittance theory of gravity anomalies can be used effectively for bathymetry prediction: the predicted result is reliable, with a relative precision of approximately 6%, comparable to the ETOPO1 model, while detailed seafloor features not shown in ETOPO1 are also depicted. Precision degrades in areas of densely distributed seamounts because of the complexity of the sea floor.
Laser image denoising technique based on multi-fractal theory
Du, Lin; Sun, Huayan; Tian, Weiqing; Wang, Shuai
2014-02-01
The noise of laser images is complex, comprising both additive and multiplicative noise. Considering the features of laser images and the basic capabilities and shortcomings of common algorithms, this paper introduces fractal theory into laser image denoising. The denoising is implemented mainly through analysis of the singularity exponent of each pixel in fractal space and the features of the multi-fractal spectrum. According to quantitative and qualitative evaluation of the processed images, the fractal-based technique not only effectively removes the complicated noise of laser images obtained by a range-gated laser active imaging system, but also maintains detail information during denoising. For different laser images, the multi-fractal denoising technique increases the SNR of the laser image by at least 1-2 dB compared with other denoising techniques, which basically meets the needs of laser image denoising.
Qigong in Cancer Care: Theory, Evidence-Base, and Practice
Penelope Klein
2017-01-01
Full Text Available Background: The purpose of this discussion is to explore the theory, evidence base, and practice of Qigong for individuals with cancer. Questions addressed are: What is Qigong? How does it work? What evidence exists supporting its practice in integrative oncology? What barriers to widespread programming access exist? Methods: Sources for this discussion include a review of scholarly texts, the Internet, PubMed, field observations, and expert opinion. Results: Qigong is a gentle mind/body exercise integral within Chinese medicine. Theoretical foundations include Chinese medicine energy theory, psychoneuroimmunology, the relaxation response, the meditation effect, and epigenetics. Research supports positive effects on quality of life (QOL), fatigue, immune function and cortisol levels, and cognition for individuals with cancer. There is indirect scientific evidence suggesting that Qigong practice may positively influence cancer prevention and survival. No single Qigong exercise regimen has been established as superior. Effective protocols do have common elements: slow mindful exercise, ease of learning, breath regulation, meditation, emphasis on relaxation, and energy cultivation including mental intent and self-massage. Conclusions: Regular practice of Qigong exercise therapy has the potential to improve cancer-related QOL and is indirectly linked to cancer prevention and survival. Widespread access to quality Qigong in cancer care programming may be challenged by the availability of existing programming and workforce capacity.
Martinez, Rafael; Rodriguez, Francisco de Borja; Camacho, David
2007-01-01
The main contribution of this paper is to design an Information Retrieval (IR) technique based on Algorithmic Information Theory (using the Normalized Compression Distance, NCD), statistical techniques (outliers), and a novel organization of the database structure. The paper shows how these can be integrated to retrieve information from generic databases using long (text-based) queries. Two important problems are analyzed. On the one hand, how to detect "false positives", where the distance between documents is very low despite there being no actual similarity. On the other hand, we propose a way to structure a document database in which the similarity-distance estimation depends on the length of the selected text. Finally, the experimental evaluations carried out to study these problems are presented.
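The NCD at the core of the technique can be approximated with any real compressor; a minimal sketch using zlib (the documents are illustrative):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance, approximated with zlib:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Illustrative documents: doc_b is a near-copy of doc_a, doc_c is unrelated.
doc_a = b"the cat sat on the mat " * 40
doc_b = b"the cat sat on the mat " * 39 + b"the dog sat on the log "
doc_c = b"quarterly revenue grew despite currency headwinds " * 20

d_similar = ncd(doc_a, doc_b)
d_different = ncd(doc_a, doc_c)
```

Similar documents compress well together, so their NCD is small; unrelated documents gain little from joint compression and score close to 1.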
A Lie based 4-dimensional higher Chern-Simons theory
Zucchini, Roberto
2015-01-01
We present and study a model of 4-dimensional higher Chern-Simons theory, special Chern-Simons (SCS) theory, instances of which have appeared in the string literature, whose symmetry is encoded in a skeletal semistrict Lie 2-algebra constructed from a compact Lie group with non-discrete center. The field content of SCS theory consists of a Lie-valued 2-connection coupled to a background closed 3-form. SCS theory enjoys a large gauge and gauge-for-gauge symmetry organized in an infinite-dimensional strict Lie 2-group. The partition function of SCS theory is simply related to that of a topological gauge theory localizing on flat connections with degree 3 second characteristic class determined by the background 3-form. Finally, SCS theory is related to a 3-dimensional special gauge theory whose 2-connection space has a natural symplectic structure with respect to which the 1-gauge transformation action is Hamiltonian, the 2-curvature map acting as moment map.
Invulnerability of power grids based on maximum flow theory
Fan, Wenli; Huang, Shaowei; Mei, Shengwei
2016-11-01
The invulnerability analysis against cascades is of great significance in evaluating the reliability of power systems. In this paper, we propose a novel cascading failure model based on maximum flow theory to analyze the invulnerability of power grids. In the model, initial node loads are built on the feasible flows of the nodes, with a tunable parameter γ used to control the initial node load distribution. The simulation results show that both the invulnerability against cascades and the tolerance parameter threshold αT are greatly affected by the node load distribution. As γ grows, the invulnerability shows distinct patterns under different attack strategies and different tolerance parameters α. These results are useful in power grid planning and cascading failure prevention.
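A stripped-down cascading-failure simulation illustrates the role of the tolerance parameter α; initial loads are given directly rather than computed from maximum flows as in the paper, and the ring topology is illustrative:

```python
def simulate_cascade(init_loads, capacities, neighbors, initially_failed):
    """Cascading-failure sketch: a failed node's load is shed equally
    onto its surviving neighbours; any node pushed past capacity fails
    in turn. Returns the set of surviving nodes."""
    loads = dict(init_loads)
    alive = set(loads) - set(initially_failed)
    frontier = set(initially_failed)
    while frontier:
        next_frontier = set()
        for f in frontier:
            targets = [n for n in neighbors[f] if n in alive]
            for n in targets:
                loads[n] += loads[f] / len(targets)
        for n in list(alive):
            if loads[n] > capacities[n]:
                alive.remove(n)
                next_frontier.add(n)
        frontier = next_frontier
    return alive

# Ring of four nodes; tolerance alpha sets capacity = (1 + alpha) * load.
init_loads = {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0}
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
surv_small = simulate_cascade(init_loads, {n: 1.2 for n in init_loads},
                              neighbors, initially_failed={0})  # alpha = 0.2
surv_large = simulate_cascade(init_loads, {n: 2.0 for n in init_loads},
                              neighbors, initially_failed={0})  # alpha = 1.0
```

With α = 0.2 the single initial failure propagates through the whole ring, while α = 1.0 absorbs the redistributed load, mirroring the tolerance-threshold behaviour the abstract reports.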
Multivariate discrimination technique based on the Bayesian theory
JIN Ping; PAN Chang-zhou; XIAO Wei-guo
2007-01-01
A multivariate discrimination technique was established based on the Bayesian theory. Using this technique, P/S ratios of different types (e.g., Pn/Sn, Pn/Lg, Pg/Sn or Pg/Lg) measured within different frequency bands and from different stations were combined together to discriminate seismic events in Central Asia. Major advantages of the Bayesian approach are that the probability to be an explosion for any unknown event can be directly calculated given the measurements of a group of discriminants, and at the same time correlations among these discriminants can be fully taken into account. It was proved theoretically that the Bayesian technique would be optimal and its discriminating performance would be better than that of any individual discriminant as well as better than that yielded by the linear combination approach ignoring correlations among discriminants. This conclusion was also validated in this paper by applying the Bayesian approach to the above-mentioned observed data.
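The Bayesian combination of correlated discriminants can be sketched with two Gaussian class-conditional densities sharing a full covariance matrix; the means, covariance and prior below are illustrative, not values from the paper:

```python
import numpy as np

# Two correlated log P/S-ratio discriminants per event (illustrative
# class means and a shared, correlated covariance).
mean_eq = np.array([-0.4, -0.3])   # earthquake class
mean_ex = np.array([0.5, 0.6])     # explosion class
cov = np.array([[0.2, 0.12], [0.12, 0.25]])

def log_gauss(x, mean, cov):
    """Log density of a multivariate normal."""
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet
                   + len(x) * np.log(2 * np.pi))

def p_explosion(x, prior_ex=0.5):
    """Posterior probability that event x is an explosion, combining
    the correlated discriminants through the full covariance (Bayes)."""
    le = log_gauss(x, mean_ex, cov) + np.log(prior_ex)
    lq = log_gauss(x, mean_eq, cov) + np.log(1 - prior_ex)
    m = max(le, lq)
    return np.exp(le - m) / (np.exp(le - m) + np.exp(lq - m))

p_near_ex = p_explosion(np.array([0.5, 0.6]))
p_near_eq = p_explosion(np.array([-0.4, -0.3]))
```

Because the covariance matrix carries the correlations between discriminants, this posterior uses exactly the information a naive per-discriminant combination would discard.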
Quantum theory of a spaser-based nanolaser.
Parfenyev, Vladimir M; Vergeles, Sergey S
2014-06-02
We present a quantum theory of a spaser-based nanolaser under the bad-cavity approximation. We find the first- and second-order correlation functions g(1)(τ) and g(2)(τ) below and above the generation threshold, and obtain the average number of plasmons in the cavity. The latter is shown to be of the order of unity near the generation threshold, where the spectral line narrows considerably. In this case the coherence is preserved in the state of the active atoms, in contrast to good-cavity lasers, where the coherence is preserved in the state of the photons. The damped oscillations in g(2)(τ) above the generation threshold indicate the unusual character of amplitude fluctuations of polarization and population, which become interconnected in this case. The results obtained clarify the fundamental principles of operation of nanolasers.
Piecewise Filter of Infrared Image Based on Moment Theory
GAO Yang; LI Yan-jun; ZHANG Ke
2007-01-01
The disadvantages of IR images mainly include high noise and blurry edges, characteristics that make existing smoothing methods ineffective at preserving edges. To solve this problem, a piecewise moment filter (PMF) is put forward. By using moment and piecewise-linear theory, the filter can preserve edges. Based on a statistical model of random noise, a related-coefficient method is presented to estimate the noise variance. The edge region and edge model are then detected from the estimated variance. The expectation of first-order derivatives is used to obtain a reliable offset of the edge. Finally, a fast moment filter with a double-stair edge model is used to obtain the piecewise smoothing results and reduce computation. Experimental results show that the new method outperforms other methods in suppressing noise while preserving edges.
A Lyapunov theory based UPFC controller for power flow control
Zangeneh, Ali; Kazemi, Ahad; Hajatipour, Majid; Jadid, Shahram [Center of Excellence for Power Systems Automation and Operation, Iran University of Science and Technology, Tehran (Iran)
2009-09-15
Unified power flow controller (UPFC) is the most comprehensive multivariable device among the FACTS controllers, and power flow control is its most important responsibility. Given the high importance of power flow control in transmission lines, the controller should be robust against uncertainty and disturbance and also have a suitable settling time. For this purpose, a new controller is designed based on Lyapunov theory and its stability is evaluated. The main goal of this paper is to design a controller which enables a power system to track reference signals precisely and to be robust in the presence of uncertainty of system parameters and disturbances. The performance of the proposed controller is simulated on a two-bus test system and compared with a conventional PI controller. The simulation results show the power and accuracy of the proposed controller. (author)
Novel welding image processing method based on fractal theory
陈强; 孙振国; 肖勇; 路井荣
2002-01-01
Computer vision has come into use in the fields of welding process control and automation. In order to improve the precision and speed of welding image processing, a novel method based on fractal theory is put forward in this paper. Compared with traditional methods, the image is preliminarily processed in the macroscopic regions and then thoroughly analyzed in the microscopic regions. An image is divided into regions according to the different fractal characteristics of the image edges, the fuzzy regions containing image edges are detected, and the edges are then identified with the Sobel operator and fitted by the least squares method (LSM). Since the amount of data to be processed is decreased and image noise is reduced, experiments verify that the edges of the weld seam or weld pool can be recognized correctly and quickly.
Similarity theory based method for MEMS dynamics analysis
LI Gui-xian; PENG Yun-feng; ZHANG Xin
2008-01-01
A new method for MEMS dynamics analysis is presented, based on similarity theory. With this method, similarities between two systems can be captured in terms of the physical quantities and governing equations of different energy fields, and the unknown dynamic characteristics of one system can then be analyzed according to the known characteristics of the similar system. The possibility of establishing a pair of similar systems between MEMS and other energy systems is also discussed, based on the equivalence between mechanics and electrics, and the feasibility of the method is proven by an example in which the squeeze-film damping force in a MEMS device is compared with the current of its equivalent circuit established by this method.
Error Analysis of English Writing Based on Interlanguage Theory
李玲
2014-01-01
Language learning has always been haunted by learners' errors, an unavoidable phenomenon. In the 1950s and 1960s, Contrastive Analysis (CA), based on behaviorism and structuralism, was generally employed in analyzing learners' errors, but CA soon lost its popularity. Error Analysis (EA), a branch of applied linguistics, has made great contributions to the study of second language learning and throws some light on its process. Careful study of errors reveals the common problems shared by language learners. Writing is important in the language learning process. In the Chinese context, English writing is always difficult for teachers and students, so errors in students' written work are unavoidable. In this thesis, the author studies the error analysis of English writing, with interlanguage theory as the theoretical guidance.
The Bus Station Spacing Optimization Based on Game Theory
Changjiang Zheng
2015-01-01
Full Text Available With the development of cities, the problem of traffic is becoming more and more serious, and developing public transportation has become key to solving it in all countries. Given an existing public transit network, improving bus operation efficiency and reducing residents' transit trip costs is a simple and effective way to develop public transportation. Bus stop spacing is an important factor affecting passengers' travel time, so setting the stop spacing properly is key to reducing that travel time. Based on a comprehensive traffic survey, theoretical analysis, and a summary of urban public transport characteristics, this paper analyzes the impact of bus stop spacing on passengers' in-bus and out-of-bus time costs and establishes in-bus time and out-of-bus time models. Finally, the paper derives the optimal stop spacing balancing the two costs by introducing game theory.
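The trade-off the models capture, walking access time falling and in-vehicle stop delay rising with spacing, can be sketched with a simple per-trip time cost; all parameter values are illustrative:

```python
import math

def trip_time(s, v_walk=5.0, stop_delay_h=30 / 3600, trip_km=5.0):
    """Average passenger time cost (hours) for stop spacing s (km):
    walk to the nearest stop (mean distance s/4 along the route) plus
    one dwell delay per stop passed. Parameter values are illustrative."""
    access = (s / 4.0) / v_walk              # out-of-bus (walking) time
    delay = (trip_km / s) * stop_delay_h     # in-bus stop-delay time
    return access + delay

# Setting d/ds [s/(4 v_w) + L t_d / s] = 0 gives s* = 2*sqrt(v_w * L * t_d).
s_opt = 2.0 * math.sqrt(5.0 * 5.0 * (30 / 3600))
s_best = min((0.1 * k for k in range(1, 31)), key=trip_time)
```

With a 5 km/h walking speed, 30 s dwell per stop and a 5 km trip, both the closed form and the grid search land near 0.9 km spacing: shorter spacing saves walking but accumulates dwell delays.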
Translation of Brand Names Based on Adaptation Theory
黄莉
2013-01-01
This paper, from the perspective of Verschueren's adaptation theory, explores how a translator should adapt to the properties of products, different language customs, and consumers' psychology when translating brand names. First, a general introduction to adaptation theory is given. Then, the application of adaptation theory to brand name translation is illustrated. Finally, it is shown that adaptation theory is very helpful for the translation of brand names.
An Analysis of Business English Translation Based on Schema Theory
夏兴宜
2014-01-01
Business English translation is a cognitive process, yet schema theory has seldom been applied to business English. This paper is written on the basis of the features of business English and schema theory; it studies the process by which schemata are formed in the brain and the application of four major types of schemata to business English translation. By introducing the theory and illustrating many examples, the thesis shows the importance of applying schema theory to business English translation.
Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K
2016-11-28
Data from ChIP-seq experiments can derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes.
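The information-theoretic weighting of a position weight matrix can be sketched from a toy set of aligned binding sites (2 − H bits per position, Schneider-style); this is a simplified stand-in for the paper's recursive entropy-minimization pipeline:

```python
import math

def information_pwm(sites):
    """Build a PWM annotated with per-position information content:
    info = 2 - H(position) bits, where H is the Shannon entropy of the
    base frequencies at that position."""
    n = len(sites)
    length = len(sites[0])
    pwm = []
    for i in range(length):
        counts = {b: 0 for b in "ACGT"}
        for s in sites:
            counts[s[i]] += 1
        freqs = {b: c / n for b, c in counts.items()}
        h = -sum(f * math.log2(f) for f in freqs.values() if f > 0)
        pwm.append({"info_bits": 2.0 - h, "freqs": freqs})
    return pwm

# Toy aligned binding sites: positions 0-2 are fully conserved,
# position 3 is degenerate.
sites = ["ACGT", "ACGA", "ACGT", "ACGC"]
pwm = information_pwm(sites)
```

Conserved positions carry the full 2 bits, degenerate positions less, which is what lets an iPWM quantify the strength of individual binding sites.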
Evaluating Theory-Based Evaluation: Information, Norms, and Adherence
Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose
2012-01-01
Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…
A Study on Translation Process Based on Relevance Theory
吴竞
2015-01-01
Relevance theory belongs to the field of pragmatics. Translation is by nature a communicative activity. In the framework of relevance theory, translation is a process of cognition and inference. This paper focuses on the study of the translation process on the basis of relevance theory, in order to improve the practice of translation.
Evolutionary game theory using agent-based methods.
Adami, Christoph; Schossau, Jory; Hintze, Arend
2016-12-01
Evolutionary game theory is a successful mathematical framework geared towards understanding the selective pressures that affect the evolution of the strategies of agents engaged in interactions with potential conflicts. While a mathematical treatment of the costs and benefits of decisions can predict the optimal strategy in simple settings, more realistic settings such as finite populations, non-vanishing mutation rates, stochastic decisions, communication between agents, and spatial interactions require agent-based methods, where each agent is modeled as an individual, carries its own genes that determine its decisions, and where the evolutionary outcome can only be ascertained by evolving the population of agents forward in time. While highlighting standard mathematical results, we compare those to agent-based methods that can go beyond the limitations of equations and simulate the complexity of heterogeneous populations and an ever-changing set of interactors. We conclude that agent-based methods can predict evolutionary outcomes where purely mathematical treatments cannot tread (for example in the weak selection-strong mutation limit), but that mathematics is crucial to validate the computational simulations.
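A minimal agent-based sketch of the kind of simulation described: fitness-proportional reproduction with mutation in a round-robin Prisoner's Dilemma, where defection is expected to spread (payoffs and parameters are illustrative):

```python
import random

random.seed(3)

# Prisoner's dilemma payoffs: (my move, their move) -> my payoff.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def play_round(population):
    """Round-robin: each agent plays every other agent once."""
    scores = [0.0] * len(population)
    for i in range(len(population)):
        for j in range(len(population)):
            if i != j:
                scores[i] += PAYOFF[(population[i], population[j])]
    return scores

def evolve(population, generations=30, mu=0.01):
    """Agent-based update: fitness-proportional resampling of the
    population, then per-agent strategy mutation with probability mu."""
    for _ in range(generations):
        scores = play_round(population)
        total = sum(scores)
        population = random.choices(population,
                                    weights=[s / total for s in scores],
                                    k=len(population))
        population = [("D" if a == "C" else "C")
                      if random.random() < mu else a
                      for a in population]
    return population

pop = evolve(["C"] * 10 + ["D"] * 10)
frac_d = pop.count("D") / len(pop)
```

Since a defector always outscores a cooperator in any mixed population, selection drives the population toward defection, the standard evolutionary game theory prediction that the agent-based run reproduces.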
C.C. Hunault; J.D.F. Habbema (Dik); M.J.C. Eijkemans (René); J.A. Collins (John); J.L.H. Evers (Johannes); E.R. te Velde (Egbert)
2004-01-01
BACKGROUND: Several models have been published for the prediction of spontaneous pregnancy among subfertile patients. The aim of this study was to broaden the empirical basis for these predictions by making a synthesis of three previously published models. METHODS: We u
Hildegard Peplau meets family systems nursing: innovation in theory-based practice.
Forchuk, C; Dorsay, J P
1995-01-01
Nursing theories which have evolved from mental health--psychiatric nursing have focused on the individual nurse-client relationship. Other nursing theories generally focus on the individual as client. Therefore, nurses working with families may have difficulty in applying these frameworks to their practice. Nursing theories need to be expanded to include families, groups and communities more explicitly. The well established theory of Hildegard Peplau, which previous studies have found to be the theory most frequently used by psychiatric nurses, and the family systems nursing theory of Wright and Leahey share a complementary focus. Both theories form part of the interpersonal paradigm of nursing; both view nursing from an interactional perspective, rather than focusing on individuals. Use of a combined theoretical approach offers several advantages. The approach explicitly considers both the individual and the family. The combination provides grounding for family work in an articulated nursing theory.
J.O. Akinyele
2011-02-01
Full Text Available The complexity and conservative nature of the Yield Line Theory, and its being an upper-bound theory, have made many design engineers jettison this analytical method for slabs. Until now the method has basically been a manual or hand method, which some engineers saw no need to use given the many computer-based packages for the analysis and design of slabs and other civil engineering structures. This paper presents a computer program that adopts the yield line theory in the analysis of solid slabs. Two rectangular slabs of the same depth but different dimensions were investigated. The Yield Line Theory was compared with two other analytical methods, namely the Finite Element Method and the Elastic Theory Method. The results obtained for a two-way spanning slab showed that the yield line theory is truly conservative, but increasing the result by 25% brought the moment obtained very close to the results of the other two methods. Although it was still conservative, the check for deflections showed that it is reliable and economical in terms of reinforcement provision. For a one-way spanning slab, the results without any increment fall between those of the two other methods, with the Elastic method giving conservative results. The paper concludes that the introduction of a computer-based yield line theory program will make the analytical method acceptable to design engineers in the developing countries of the world.
Switching theory-based steganographic system for JPEG images
Cherukuri, Ravindranath C.; Agaian, Sos S.
2007-04-01
Cellular communications constitute a significant portion of the global telecommunications market, so the need for secured communication over a mobile platform has increased exponentially. Steganography is the art of hiding critical data in an innocuous signal, which answers these needs. JPEG is one of the commonly used formats for storing and transmitting images on the web, and pictures captured with mobile cameras are mostly in JPEG format. In this article, we introduce a switching theory based steganographic system for JPEG images which is applicable to mobile and computer platforms. The proposed algorithm uses the fact that the energy distribution among the quantized AC coefficients varies from block to block and coefficient to coefficient. Existing approaches are effective with a subset of these coefficients, but when employed over all the coefficients they show their ineffectiveness. We therefore propose an approach that treats each set of AC coefficients within a different framework, thus enhancing the performance of the approach. The proposed system offers high capacity and embedding efficiency while withstanding simple statistical attacks. In addition, the embedded information can be retrieved without prior knowledge of the cover image. Based on simulation results, the proposed method demonstrates an improved embedding capacity over existing algorithms while maintaining a high embedding efficiency and preserving the statistics of the JPEG image after hiding information.
Consistency analysis of accelerated degradation mechanism based on gray theory
Yunxia Chen; Hongxia Chen; Zhou Yang; Rui Kang; Yi Yang
2014-01-01
A fundamental premise of accelerated testing is that the failure mechanism under elevated and normal stress levels should remain the same. Thus, verification of the consistency of failure mechanisms is essential during an accelerated test. A new consistency analysis method based on the gray theory is proposed for complex products. First of all, existing consistency analysis methods are reviewed with a focus on the comparison of the differences among them. Then, the proposed consistency analysis method is introduced. Two effective gray prediction models, the gray dynamic model and the new information and equal dimensional (NIED) model, are adapted in the proposed method. The process to determine the dimension of the NIED model is also discussed, and a decision rule is expanded. Based on that, the procedure for applying the new consistency analysis method is developed. Finally, a case study of the consistency analysis of a reliability enhancement test is conducted to demonstrate and validate the proposed method.
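The gray prediction models named in the abstract build on the basic GM(1,1) model. As a rough sketch of that baseline (not the paper's gray dynamic or NIED variants; function names and data are ours), a minimal GM(1,1) fit and forecast in pure Python might look like this:

```python
import math

# Minimal GM(1,1) gray prediction model: fit x0(k) + a*z1(k) = b by least
# squares on the 1-AGO background values, then restore/forecast the series.

def gm11_fit(x0):
    """Fit the gray differential equation; return (a, b)."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]               # 1-AGO series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    m = n - 1
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    szy = sum(z1[i] * x0[i + 1] for i in range(m))
    sy = sum(x0[1:])
    det = szz * m - sz * sz                                # 2x2 normal equations
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    return a, b

def gm11_predict(x0, a, b, steps):
    """Restore the original series and forecast `steps` further values."""
    n = len(x0)
    x1_hat = [(x0[0] - b / a) * math.exp(-a * k) + b / a
              for k in range(n + steps)]
    return [x0[0]] + [x1_hat[k] - x1_hat[k - 1] for k in range(1, n + steps)]
```

For a near-exponential degradation series the model reproduces the data closely and extrapolates the trend; a consistency analysis could then, for example, compare the fitted development coefficients a across stress levels.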
Landslides susceptibility mapping in Guizhou province based on fuzzy theory
WANG Wei-dong; XIE Cui-ming; DU Xiang-gang
2009-01-01
The purpose of this study was to assess the susceptibility to landslides of the area of Guizhou province based on fuzzy theory. First, slope, elevation, lithology, proximity to tectonic lines, proximity to drainage and annual precipitation were taken as independent causal factors in this study. A landslide hazard evaluation factor system was established by classifying these factors into subclasses according to a set of rules. Second, a trapezoidal fuzzy number weighting (TFNW) approach was used to assess the importance of the six causal factors to landslides in an ArcGIS environment. Third, a landslide susceptibility map was created based on a weighted linear combination model. According to this susceptibility map, the study area was classified into four categories of landslide susceptibility: low, moderate, high and very high. Finally, in order to verify the results obtained, the susceptibility map and the landslide inventory map were combined in the GIS. In addition, the weighting procedure showed that TFNW is an efficient method for weighting causal landslide factors.
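The TFNW-plus-weighted-linear-combination chain can be illustrated with a small sketch. The trapezoid parameters and factor scores below are invented, and the simple average-of-parameters defuzzification is one common choice, not necessarily the variant the study used:

```python
# Hypothetical sketch: trapezoidal fuzzy number weights defuzzified by the
# average of the four trapezoid parameters, then a weighted linear
# combination of causal-factor scores into a susceptibility index.

def defuzzify(tfn):
    a, b, c, d = tfn                    # trapezoid corners, a <= b <= c <= d
    return (a + b + c + d) / 4.0

def susceptibility(scores, tfn_weights):
    """Weighted linear combination of factor scores in [0, 1]."""
    w = [defuzzify(t) for t in tfn_weights]
    total = sum(w)
    w = [x / total for x in w]          # normalize weights to sum to 1
    return sum(wi * si for wi, si in zip(w, scores))
```

In a GIS setting this combination would be evaluated per raster cell, and the resulting index binned into the low/moderate/high/very-high classes.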
Image integrity authentication scheme based on fixed point theory.
Li, Xu; Sun, Xingming; Liu, Quansheng
2015-02-01
Based on the fixed point theory, this paper proposes a new scheme for image integrity authentication, which is very different from digital signature and fragile watermarking. In the new scheme, the sender transforms an original image into a fixed point image (very close to the original one) of a well-chosen transform and sends the fixed point image (instead of the original one) to the receiver; using the same transform, the receiver checks the integrity of the received image by testing whether it is a fixed point image, and locates the tampered areas if the image has been modified during transmission. A realization of the new scheme is based on the Gaussian convolution and deconvolution (GCD) transform, for which an existence theorem for fixed points is proved. The semifragility is analyzed via commutativity of transforms, and three commutativity theorems are found for the GCD transform. Three iterative algorithms are presented for finding a fixed point image within a few iterations and for the whole procedure of image integrity authentication; a fragile authentication system and a semifragile one are built separately. Experiments show that both systems perform well in transparence, fragility, security, and tampering localization. In particular, the semifragile system can perfectly resist rotation by a multiple of 90°, flipping, and brightness attacks.
Solar Activity Predictions Based on Solar Dynamo Theories
Schatten, Kenneth H.
2009-05-01
We review solar activity prediction methods: statistical, precursor, and recently the Dikpati and Choudhury groups' use of numerical flux-dynamo methods. Outlining various methods, we compare precursor techniques with weather forecasting. Precursors involve events prior to a solar cycle. First by the Russian geomagnetician Ohl, and then by Brown and Williams, the Earth's field variations near solar minimum were used to predict the next solar cycle, with a correlation of 0.95. From the standpoint of causality, as well as energetically, these relationships were somewhat bizarre. One index used was the "number of anomalous quiet days," an antiquated, subjective index. Scientific progress cannot be made without some suspension of disbelief; otherwise old paradigms become tautologies. So, with youthful naïveté, Svalgaard, Scherrer, Wilcox and I viewed the results through rose-colored glasses and pressed ahead searching for understanding. We eventually fumbled our way to explaining how the Sun could broadcast the state of its internal dynamo to Earth. We noted one key aspect of the Babcock-Leighton flux dynamo theory: the polar field at the end of a cycle serves as a seed for the next cycle's growth. Near solar minimum this field usually bathes the Earth, and thereby affects geomagnetic indices then. We found support by examining 8 previous solar cycles. Using our solar precursor technique we successfully predicted cycles 21, 22 and 23 using WSO and MWSO data. Pesnell and I improved the method using a SODA (SOlar Dynamo Amplitude) index. In 2005, nearing cycle 23's minimum, Svalgaard and I noted an unusually weak polar field and forecasted a small cycle 24. We discuss future advances: the flux-dynamo methods. As for future solar activity, I shall let the Sun decide; it will do so anyhow.
GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.
Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain
2015-01-01
Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.
Kim, Dong Hyun; Kim, Hak Sung [Hanyang University, Seoul (Korea, Republic of); Kim, Hyo Chan; Yang, Yong Sik; In, Wang kee [KAERI, Daejeon (Korea, Republic of)
2016-05-15
In this paper, an analytical method based on thick-walled theory has been studied to calculate the stress and strain of ATF cladding. In order to prescribe boundary conditions for the analytical method, two algorithms were employed, the subroutines 'Cladf' and 'Couple' of FRACAS, respectively. To evaluate the developed method, an equivalent model using the finite element method was established and the stress components of the method were compared with those of the equivalent FE model. One of the promising ATF concepts is the coated cladding, which offers advantages such as a high melting point, a high neutron economy, and a low tritium permeation rate. To evaluate the mechanical behavior and performance of the coated cladding, we need to develop a dedicated model to simulate ATF behavior in the reactor. In particular, a model for the simulation of stress and strain in the coated cladding should be developed because the previous model, 'FRACAS', is a one-body model. The FRACAS module employs an analytical method based on thin-walled theory. According to thin-walled theory, the radial stress is taken as zero, but this assumption is not suitable for ATF cladding because the radial stress is not negligible in that case. Recently, a structural model for multi-layered ceramic cylinders based on thick-walled theory was developed. Also, FE-based numerical simulation codes such as BISON have been developed to evaluate fuel performance. An analytical method that calculates the stress components of ATF cladding was developed in this study. Thick-walled theory was used to derive equations for calculating stress and strain. To solve these equations, boundary and loading conditions were obtained by the subroutines 'Cladf' and 'Couple' and applied to the analytical method. To evaluate the developed method, an equivalent FE model was established and its results were compared to those of the analytical model. Based on the
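The thick-walled stress solution referred to here is the classical Lamé result. As a generic sketch (not the paper's FRACAS-coupled implementation; radii and pressures below are made up), the radial and hoop stresses in a pressurized cylinder are:

```python
# Lamé solution for a thick-walled cylinder under internal/external pressure.
# Unlike the thin-walled assumption, the radial stress is not zero through
# the wall, which is the point made in the abstract.

def lame_stresses(r, a, b, p_in, p_out):
    """Radial and hoop stress (Pa) at radius r; a, b are inner/outer radii."""
    A = (p_in * a**2 - p_out * b**2) / (b**2 - a**2)
    B = (p_in - p_out) * a**2 * b**2 / (b**2 - a**2)
    sigma_r = A - B / r**2        # equals -p_in at r=a and -p_out at r=b
    sigma_theta = A + B / r**2
    return sigma_r, sigma_theta
```

At the inner surface the radial stress equals minus the internal pressure, so for a rod with high internal pressure it is clearly not negligible relative to the hoop stress.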
A blood pressure measurement method based on synergetics theory
2010-01-01
The principle of blood pressure measurement using pulse transit time is introduced in this paper, and the mathematical model of synergetics theory is studied in detail. The synergetics theory is applied in the analysis of blood pressure measurement data. The simulation results show that the application of synergetics theory is helpful in judging normal blood pressure, with an accuracy of up to 80%.
Nami, Mohammad Rahim; Janghorban, Maziar
2013-01-01
.... In order to consider the size effects, the nonlocal elasticity theory is used. An analytical method is adopted to solve the governing equations for static analysis of simply supported nanoplates...
A novel DEMATEL theory based on Liu’s polytomous ordering theory
Liu Hsiang-Chuan
2016-01-01
The most important issue in DEMATEL theory is how to obtain a reliable initial direct relation matrix of order n. The traditional theory obtains it by the pair-wise comparison method, in which each respondent must answer n(n-1) pair-wise comparisons of all of the direct influences; if n is large, the pair-wise comparison work becomes hard, time-consuming, and unreliable. In this paper, to overcome these drawbacks, we replace the pair-wise comparison method with Liu's ordering theory to find the initial direct relation matrix. This new method, which requires no pair-wise comparison, can be used for any order n; a simple example is also provided in this paper to illustrate the advantages of the proposed theory.
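Once an initial direct relation matrix is available (however obtained), the standard DEMATEL step is to normalize it and form the total relation matrix T = D(I - D)^(-1). A pure-Python sketch using the equivalent power series T = D + D^2 + D^3 + ... (the example matrix is illustrative, not from the paper):

```python
# Standard DEMATEL total-relation matrix from an initial direct-relation
# matrix A: normalize by the largest row sum, then sum the matrix powers.

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def dematel_total_relation(A, tol=1e-12):
    """Return T = D + D^2 + D^3 + ..., with D = A / max row sum of A."""
    n = len(A)
    s = max(sum(row) for row in A)
    D = [[a / s for a in row] for row in A]
    T = [row[:] for row in D]
    P = [row[:] for row in D]           # current power D^k
    while True:
        P = matmul(P, D)
        delta = 0.0
        for i in range(n):
            for j in range(n):
                T[i][j] += P[i][j]
                delta = max(delta, abs(P[i][j]))
        if delta < tol:
            return T
```

Row and column sums of T then give the usual prominence (r + c) and relation (r - c) indicators for each factor.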
Density functional theory based generalized effective fragment potential method
Nguyen, Kiet A., E-mail: kiet.nguyen@wpafb.af.mil, E-mail: ruth.pachter@wpafb.af.mil [Air Force Research Laboratory, Wright-Patterson Air Force Base, Ohio 45433 (United States); UES, Inc., Dayton, Ohio 45432 (United States); Pachter, Ruth, E-mail: kiet.nguyen@wpafb.af.mil, E-mail: ruth.pachter@wpafb.af.mil [Air Force Research Laboratory, Wright-Patterson Air Force Base, Ohio 45433 (United States); Day, Paul N. [Air Force Research Laboratory, Wright-Patterson Air Force Base, Ohio 45433 (United States); General Dynamics Information Technology, Inc., Dayton, Ohio 45431 (United States)
2014-06-28
We present a generalized Kohn-Sham (KS) density functional theory (DFT) based effective fragment potential (EFP2-DFT) method for the treatment of solvent effects. Similar to the original Hartree-Fock (HF) based potential with fitted parameters for water (EFP1) and the generalized HF based potential (EFP2-HF), EFP2-DFT includes electrostatic, exchange-repulsion, polarization, and dispersion potentials, which are generated for a chosen DFT functional for a given isolated molecule. The method does not have fitted parameters, except for implicit parameters within a chosen functional and the dispersion correction to the potential. The electrostatic potential is modeled with a multipolar expansion at each atomic center and bond midpoint using Stone's distributed multipolar analysis. The exchange-repulsion potential between two fragments is composed of the overlap and kinetic energy integrals and the nondiagonal KS matrices in the localized molecular orbital basis. The polarization potential is derived from the static molecular polarizability. The dispersion potential includes the intermolecular D3 dispersion correction of Grimme et al. [J. Chem. Phys. 132, 154104 (2010)]. The potential generated from the CAMB3LYP functional has mean unsigned errors (MUEs) with respect to results from coupled cluster singles, doubles, and perturbative triples with a complete basis set limit (CCSD(T)/CBS) extrapolation, of 1.7, 2.2, 2.0, and 0.5 kcal/mol, for the S22, water-benzene clusters, water clusters, and n-alkane dimers benchmark sets, respectively. The corresponding EFP2-HF errors for the respective benchmarks are 2.41, 3.1, 1.8, and 2.5 kcal/mol. Thus, the new EFP2-DFT-D3 method with the CAMB3LYP functional provides comparable or improved results at lower computational cost and, therefore, extends the range of applicability of EFP2 to larger system sizes.
An Approach to Theory-Based Youth Programming
Duerden, Mat D.; Gillard, Ann
2011-01-01
A key but often overlooked aspect of intentional, out-of-school-time programming is the integration of a guiding theoretical framework. The incorporation of theory in programming can provide practitioners valuable insights into essential processes and principles of successful programs. While numerous theories exist that relate to youth development…
Beginning Student Teachers' Teacher Identities Based on Their Practical Theories
Stenberg, Katariina; Karlsson, Liisa; Pitkaniemi, Harri; Maaranen, Katriina
2014-01-01
In this article, we investigate first-year student teachers' teacher identities through their practical theories and ask what these practical theories reveal about their emerging teacher identities. This study approaches teacher identity from a dialogical viewpoint where identity is constructed through various positions. The empirical part of this…
Viscosity Prediction of Hydrocarbon Mixtures Based on the Friction Theory
Zeberg-Mikkelsen, Claus Kjær; Cisneros, Sergio; Stenby, Erling Halfdan
2001-01-01
The application and capability of the friction theory (f-theory) for viscosity predictions of hydrocarbon fluids is further illustrated by predicting the viscosity of binary and ternary liquid mixtures composed of n-alkanes ranging from n-pentane to n-decane for wide ranges of temperature and fro...
Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory.
Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong
2016-01-18
Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when the evidence highly conflicts, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application to fault diagnosis based on sensor fusion is presented to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.
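The underlying D-S fusion step is Dempster's rule of combination. A minimal sketch of the generic rule (not the reliability-weighted modification proposed in the paper), with focal elements represented as frozensets:

```python
# Dempster's rule of combination for two mass functions over a frame of
# discernment. Intersecting focal elements multiply; fully conflicting
# pairs contribute to the conflict mass, which is normalized away.

def dempster_combine(m1, m2):
    combined = {}
    conflict = 0.0
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            C = A & B
            if C:
                combined[C] = combined.get(C, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    k = 1.0 - conflict            # normalization constant
    return {C: v / k for C, v in combined.items()}
```

The counterintuitive behavior the abstract mentions shows up when `conflict` approaches 1, which is exactly the regime the proposed sensor-reliability weighting is meant to tame.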
Sentiment Prediction Based on Dempster-Shafer Theory of Evidence
Mohammad Ehsan Basiri
2014-01-01
Sentiment prediction techniques are often used to assign numerical scores to free-text format reviews written by people in online review websites. In order to exploit the fine-grained structural information of textual content, a review may be considered as a collection of sentences, each with its own sentiment orientation and score. In this manner, a score aggregation method is needed to combine sentence-level scores into an overall review rating. While recent work has concentrated on designing effective sentence-level prediction methods, there remains the problem of finding efficient algorithms for score aggregation. In this study, we investigate different aggregation methods, as well as the cases in which they perform poorly. According to the analysis of existing methods, we propose a new score aggregation method based on the Dempster-Shafer theory of evidence. In the proposed method, we first detect the polarity of reviews using a machine learning approach and then, consider sentence scores as evidence for the overall review rating. The results from two public social web datasets show the higher performance of our method in comparison with existing score aggregation methods and state-of-the-art machine learning approaches.
Fowler Nordheim theory of carbon nanotube based field emitters
Parveen, Shama; Kumar, Avshish; Husain, Samina; Husain, Mushahid
2017-01-01
Field emission (FE) phenomena are generally explained in the framework of the Fowler Nordheim (FN) theory, which was derived for flat metal surfaces. In this work, an effort has been made to present the field emission mechanism in carbon nanotubes (CNTs), which have tip-type geometry at the nanoscale. The high aspect ratio of CNTs leads to a large field enhancement factor and lower operating voltages, because the electric field strength in the vicinity of the nanotube tip can be enhanced a thousandfold. The work function of the nanostructure has been calculated from the FN plot by reverse engineering. With the help of the modified FN equation, an important formula for the effective emitting area (the active area for electron emission) has been derived and employed to calculate the active emitting area of CNT field emitters. It is therefore of great interest to present a state-of-the-art study on the complete solution of the FN equation for CNT-based field emitter displays. This manuscript will also provide a better understanding of the calculation of different FE parameters of CNT field emitters using the FN equation.
System of marketing deciding support based on game theory
Gordana Dukić
2008-12-01
Quantitative methods and models can be applied in numerous spheres of marketing decision-making. The choice of an optimal strategy in product advertising is one of the problems that marketing management often meets. The use of models developed within the framework of game theory makes it significantly easier to find solutions to the conflict situations that arise here. The decision support system presented in this work is based on the supposition that two opposed sides take part in the game. With the aim of improving the decision process, the starting model incorporates computer simulation of the percentage changes in market share that represent the elements of the payoff matrix. The supposition is that the random variables representing them follow the normal distribution. It is necessary to estimate their parameters from the relevant data. Information technology, computers and adequate program applications occupy a special position in the solving and analysis of the suggested model. This kind of application represents the basic characteristic of the decision support system.
Rock burst prevention based on dissipative structure theory
Song Dazhao; Wang Enyuan; Li Nan; Jin Mingyue; Xue Shipeng
2012-01-01
Dynamic collapses of deeply mined coal rocks are severe. In order to explore new ideas for rock burst prevention, the relationship between entropy equations and dissipative structures was studied, and a concept, the rock burst activity system (RAS), was proposed and its entropy analyzed. The energy features of RAS were analyzed, and the relationship between electromagnetic radiation (EMR) intensity E and dissipated energy Ud was initially established. We suggest that rock burst normally happens only when diS << -deS in RAS: RAS is a dissipative structure before collapse, after which it becomes a new orderly structure, i.e., a "dead", statically ordered structure. We propose that the effective way to prevent rock burst is to introduce entropy into the system, which keeps the system away from the dissipative structure. E and Ud of RAS are positively related, which is used as a bridge between dissipative structure theory and engineering applications of rock burst prevention. Based on this, and using the data of rock burst prevention for working face No. 250205up of the Yanbei coal mine, an engineering verification of the dissipative structure of RAS was carried out, which showed good results.
The Development of an Attribution-Based Theory of Motivation: A History of Ideas
Weiner, Bernard
2010-01-01
The history of ideas guiding the development of an attribution-based theory of motivation is presented. These influences include the search for a "grand" theory of motivation (from drive and expectancy/value theory), an attempt to represent how the past may influence the present and the future (as Thorndike accomplished), and the…
Microscopic theory of semiconductor-based optoelectronic devices
Iotti, Rita C.; Rossi, Fausto
2005-11-01
Since the seminal paper by Esaki and Tsu, semiconductor-based nanometric heterostructures have been the subject of impressive theoretical and experimental activity due to their high potential impact in both fundamental research and device technology. The steady scaling down of typical space and time scales in quantum optoelectronic systems inevitably leads to a regime in which the validity of the traditional Boltzmann transport theory cannot be taken for granted and a more general quantum-transport description is imperative. In this paper, we shall review state-of-the-art approaches used in the theoretical modelling, design and optimization of optoelectronic quantum devices. The primary goal is to provide a cohesive treatment of basic quantum-transport effects, able to explain and predict the performances of new-generation semiconductor devices. With this aim, we shall review and discuss a fully three-dimensional microscopic treatment of time-dependent as well as steady-state quantum-transport phenomena, based on the density matrix formalism. This will allow us to introduce in a quite natural way the separation between coherent and incoherent processes. Starting with this general theoretical framework, we shall analyse two different types of quantum devices, namely periodically repeated structures and quantum systems with open boundaries. For devices within the first class, we will show how a proper use of periodic boundary conditions allows us to reproduce and predict their current-voltage characteristics without resorting to phenomenological parameters. For the second class of devices, we will address the relevant issue of a quantum treatment of charge transport in systems with open boundaries (electrical contacts) when studying and simulating an at least two-terminal device.
Rutting Prediction in Asphalt Pavement Based on Viscoelastic Theory
Nahi Mohammed Hadi
2016-01-01
Rutting is one of the most disturbing failures of asphalt roads because of the disruption it causes to drivers. Predicting asphalt pavement rutting is an essential tool that leads to better asphalt mixture design. This work describes a method of predicting the behaviour of various asphalt pavement mixes and linking these to accelerated performance testing. The objective of this study is to develop a finite element model based on viscoplastic theory for simulating the laboratory testing of asphalt mixes for rutting in the Hamburg Wheel Rut Tester (HWRT). The creep parameters C1, C2 and C3 are developed from the triaxial repeated load creep test at 50°C and at a frequency of 1 Hz, and the modulus of elasticity and Poisson's ratio are determined at the same temperature. A viscoelastic (creep) model is adopted using a FE simulator (ANSYS) in order to calculate the rutting for various mixes under a uniform loading pressure of 500 kPa. An eight-node element with three degrees of freedom (UX, UY, and UZ) is used for the simulation. The creep model developed for the HWRT tester was verified by comparing the predicted rut depths with the measured ones and by comparing the rut depth with an ABAQUS result from the literature. Reasonable agreement was obtained between the predicted rut depths and the measured ones. Moreover, it is found that the creep model parameters C1 and C3 have a strong relationship with rutting, and that parameter C1 influences rutting more strongly than parameter C3. Finally, it can be concluded that a creep model based on the finite element method can be used as an effective tool to analyse the rutting of asphalt pavements.
6D F-theory models and elliptically fibered Calabi-Yau threefolds over semi-toric base surfaces
Martini, Gabriella
2014-01-01
We carry out a systematic study of a class of 6D F-theory models and associated Calabi-Yau threefolds that are constructed using base surfaces with a generalization of toric structure. In particular, we determine all smooth surfaces that have a structure invariant under a single C^* action (sometimes called "T-varieties" in the mathematical literature) that can act as bases for an elliptic fibration with section of a Calabi-Yau threefold. We identify 162,408 distinct bases, which include as a subset the previously studied set of strictly toric bases. Calabi-Yau threefolds constructed in this fashion include examples with previously unknown Hodge numbers. There are also bases over which the generic elliptic fibration has a Mordell-Weil group of sections with nonzero rank, corresponding to U(1) factors in the 6D supergravity model; this type of structure does not arise for generic elliptic fibrations in the purely toric context.
The Application of Carl Rogers' Person-Centered Learning Theory to Web-Based Instruction.
Miller, Christopher T.
This paper provides a review of literature that relates research on Carl Rogers' person-centered learning theory to Web-based learning. Based on the review of the literature, a set of criteria is described that can be used to determine how closely a Web-based course matches the different components of Rogers' person-centered learning theory. Using…
Roybal, H; Baxendale, S J; Gupta, M
1999-01-01
Activity-based costing and the theory of constraints have been applied successfully in many manufacturing organizations. Recently, these concepts have been applied in service organizations. This article describes the application of activity-based costing and the theory of constraints in a managed care mental health and substance abuse organization. One of the unique aspects of this particular application was the integration of activity-based costing and the theory of constraints to guide process improvement efforts. This article describes the activity-based costing model and the application of the theory of constraints' focusing steps with an emphasis on unused capacities of activities in the organization.
Learning Theory Bases of Communicative Methodology and the Notional/Functional Syllabus
Jacqueline D., Beebe
1992-01-01
This paper examines the learning theories that underlie the philosophy and practices known as communicative language teaching methodology. These theories are identified first as a reaction against the behavioristic learning theory of audiolingualism. Approaches to syllabus design based on both the "weak" version of communicative language teaching-learning to use the second language-and the "strong" version-using the second language to learn it-are examined. The application of cognitive theory...
An information theory criteria based blind method for enumerating active users in DS-CDMA system
Samsami Khodadad, Farid; Abed Hodtani, Ghosheh
2014-11-01
In this paper, a new blind algorithm for active user enumeration in asynchronous direct sequence code division multiple access (DS-CDMA) under a multipath channel scenario is proposed. The proposed method is based on information theory criteria. Two such criteria are widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator and performs better at higher signal-to-noise ratios (SNRs), whereas AIC is preferred at lower SNRs. We therefore propose an SNR-adaptive method, based on subspace analysis and a trained genetic algorithm, that combines the strengths of both. Moreover, unlike previous methods, our method uses only a single antenna, which reduces hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users efficiently and without any prior knowledge.
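As a concrete illustration of how the AIC and MDL penalty functions differ, the following is a minimal sketch of the classical eigenvalue-based enumeration criteria (the Wax-Kailath textbook baseline, not the authors' subspace/genetic-algorithm variant); the eigenvalues in the usage example are hypothetical.

```python
import numpy as np

def enumerate_sources(eigvals, n_snapshots):
    """Estimate the number of active sources from the eigenvalues of a
    sample covariance matrix using the classical AIC and MDL criteria.
    Returns (k_aic, k_mdl)."""
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
    p, N = len(lam), n_snapshots
    aic, mdl = [], []
    for k in range(p):
        tail = lam[k:]  # the p-k smallest eigenvalues (noise subspace candidate)
        # Sphericity statistic: geometric mean / arithmetic mean of the tail.
        ratio = np.exp(np.mean(np.log(tail))) / np.mean(tail)
        loglik = N * (p - k) * np.log(ratio)
        aic.append(-2.0 * loglik + 2.0 * k * (2 * p - k))
        mdl.append(-loglik + 0.5 * k * (2 * p - k) * np.log(N))
    return int(np.argmin(aic)), int(np.argmin(mdl))
```

With two dominant eigenvalues above a flat noise floor, both criteria pick two sources; the differing penalty terms, 2k(2p-k) for AIC versus 0.5k(2p-k)logN for MDL, are what drives the consistency difference discussed above.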
V. I. Khvorostyanov
2012-03-01
A new analytical parameterization of homogeneous ice nucleation is developed based on extended classical nucleation theory, including new equations for the critical radii of the ice germs, free energies, and nucleation rates as functions of temperature and water saturation ratio simultaneously. By representing these quantities as separable products of analytical functions of temperature and supersaturation, analytical solutions are found for the integral-differential supersaturation equation and the concentration of nucleated crystals. Parcel model simulations are used to illustrate the general behavior of various nucleation properties under various conditions, to justify further key analytical simplifications, and to verify the resulting parameterization.
The final parameterization is based upon the values of the supersaturation that determine the current or maximum concentrations of the nucleated ice crystals. The crystal concentration is analytically expressed as a function of time and can be used for parameterization of homogeneous ice nucleation both in models with small time steps and for substep parameterization in models with large time steps. The crystal concentration is expressed analytically via error functions or elementary functions and depends only on the fundamental atmospheric parameters and the parameters of classical nucleation theory. The diffusion and kinetic limits of the new parameterization agree with previous semi-empirical parameterizations.
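The classical-nucleation-theory core of such schemes, the critical germ radius and the energy barrier as joint functions of temperature and saturation ratio, can be sketched in a few lines; the surface-energy and molecular-volume values in the test are rough illustrative numbers, not the paper's fitted parameters.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def critical_radius(sigma, v_mol, temp, sat_ratio):
    """CNT critical germ radius r* = 2*sigma*v / (k_B * T * ln S).
    sigma: interface energy (J/m^2); v_mol: molecular volume (m^3);
    sat_ratio: saturation ratio S > 1."""
    return 2.0 * sigma * v_mol / (K_B * temp * math.log(sat_ratio))

def energy_barrier(sigma, v_mol, temp, sat_ratio):
    """CNT free-energy barrier dF* = 16*pi*sigma^3*v^2 / (3*(k_B*T*ln S)^2);
    the nucleation rate then scales as exp(-dF* / (k_B * T))."""
    return (16.0 * math.pi * sigma**3 * v_mol**2
            / (3.0 * (K_B * temp * math.log(sat_ratio))**2))
```

Both quantities shrink as the saturation ratio grows, which is why nucleation rates rise so steeply with supersaturation.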
Recipe Prediction by Matching to K/S Values Based on a New Two-Constant Theory
HE Guo-xing; XING Huai-zhong; ZHOU Ming-xun
2006-01-01
A concept of a new two-constant characterization of colorants, (k/St) and (s/St), is introduced based on the Kubelka-Munk theory, and a new two-constant theory for color matching is presented. Basic equations used in matching to K/S values are given in matrix form based on the new two-constant theory. An algorithm for a least-squares match to the K/S values of a sample is developed using the new two-constant theory; the algorithm is suitable for single-constant theory as well as two-constant theory. The experimental data show that K/S values of disperse dyes calculated with the new two-constant theory accord with the measured ones. The recipes predicted by the new two-constant theory are closer to the actual recipes of the standard sample than those predicted by single-constant theory, and a sample dyed according to a recipe predicted by the new two-constant theory shows a smaller color difference against the standard than one dyed according to a single-constant prediction. The results show that the scattering of disperse dyes cannot be neglected, and that recipes matching textiles colored by disperse dyes should be predicted using the new two-constant theory.
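A minimal numerical sketch of least-squares recipe prediction against K/S values: the Kubelka-Munk function converts reflectance to K/S, and with per-dye absorption (k) and scattering (s) coefficients known at each wavelength, concentrations follow from a linearized least-squares solve. The mixing model below normalizes the substrate to unit scattering, a simplification of the paper's matrix formulation, and the coefficients in the test are invented for illustration.

```python
import numpy as np

def ks_from_reflectance(refl):
    """Kubelka-Munk function: K/S = (1 - R)^2 / (2 R)."""
    refl = np.asarray(refl, dtype=float)
    return (1.0 - refl) ** 2 / (2.0 * refl)

def match_recipe(k_dyes, s_dyes, ks_target):
    """Least-squares dye concentrations c such that, per wavelength,
    (sum_i c_i k_i) / (1 + sum_i c_i s_i) ~ target K/S
    (two-constant mixing, substrate scattering normalized to 1).
    Linearizing gives sum_i c_i (k_i - t * s_i) = t per wavelength.
    k_dyes, s_dyes: arrays of shape (n_dyes, n_wavelengths)."""
    a_mat = k_dyes.T - ks_target[:, None] * s_dyes.T
    conc, *_ = np.linalg.lstsq(a_mat, ks_target, rcond=None)
    return conc
```

For a synthetic target generated from known concentrations, the solver recovers the recipe exactly; with measured data the residual quantifies the match quality.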
Galante, Julieta; Adamska, Ligia; Young, Alan; Young, Heather; Littlejohns, Thomas J; Gallacher, John; Allen, Naomi
2016-02-28
Although dietary intake over a single 24-h period may be atypical of an individual's habitual pattern, multiple 24-h dietary assessments can be representative of habitual intake and help in assessing seasonal variation. Web-based questionnaires are convenient for the participant and result in automatic data capture for study investigators. This study reports on the acceptability of repeated web-based administration of the Oxford WebQ, a 24-h recall of frequency from a set food list suitable for self-completion, from which energy and nutrient values can be automatically generated. As part of the UK Biobank study, four invitations to complete the Oxford WebQ were sent by email over a 16-month period. Overall, 176 012 (53% of those invited) participants completed the online version of the Oxford WebQ at least once and 66% completed it more than once, although only 16% completed it on all four occasions. The response rate for any one round of invitations varied between 26% and 34%. On most occasions, the Oxford WebQ was completed on the same day the invitation was received, although this was less likely if it was sent on a weekend. Participants who completed the Oxford WebQ tended to be white, female, slightly older, less deprived and more educated, which is typical of health-conscious volunteer-based studies. These findings provide preliminary evidence to suggest that repeated 24-h dietary assessment via the Internet is acceptable to the public and a feasible strategy for large population-based studies.
State variable theories based on Hart's formulation
Korhonen, M.A.; Hannula, S.P.; Li, C.Y.
1985-01-01
In this paper a review of the development of a state variable theory for nonelastic deformation is given. The physical and phenomenological basis of the theory and the constitutive equations describing macroplastic, microplastic, anelastic and grain boundary sliding enhanced deformation are presented. The experimental and analytical evaluation of different parameters in the constitutive equations are described in detail followed by a review of the extensive experimental work on different materials. The technological aspects of the state variable approach are highlighted by examples of the simulative and predictive capabilities of the theory. Finally, a discussion of general capabilities, limitations and future developments of the theory and particularly the possible extensions to cover an even wider range of deformation or deformation-related phenomena is presented.
Lim, Lucy; Thompson, Alexander; Patterson, Scott; George, Jacob; Strasser, Simone; Lee, Alice; Sievert, William; Nicoll, Amanda; Desmond, Paul; Roberts, Stuart; Marion, Kaye; Bowden, Scott; Locarnini, Stephen; Angus, Peter
2017-06-01
Multidrug-resistant HBV continues to be an important clinical problem. The TDF-109 study demonstrated that TDF±LAM is an effective salvage therapy through 96 weeks for LAM-resistant patients who previously failed ADV add-on or switch therapy. We evaluated the 5-year efficacy and safety outcomes in patients receiving long-term TDF±LAM in the TDF-109 study. A total of 59 patients completed the first phase of the TDF-109 study and 54/59 were rolled over into a long-term prospective open-label study of TDF±LAM 300 mg daily. Results are reported at the end of year 5 of treatment. At year 5, 75% (45/59) had achieved viral suppression by intent-to-treat analysis. Per-protocol assessment revealed 83% (45/54) were HBV DNA undetectable. Nine patients remained HBV DNA detectable; however, 8/9 had very low HBV DNA levels (<264 IU/mL) and did not meet virological criteria for virological breakthrough (VBT). One patient experienced VBT, but this was in the setting of documented non-compliance. The response was independent of baseline LAM therapy or mutations conferring ADV resistance. Four patients discontinued TDF, one patient was lost to follow-up and one died from hepatocellular carcinoma. Long-term TDF treatment appears to be safe and effective in patients with prior failure of LAM and a suboptimal response to ADV therapy. These findings confirm that TDF has a high genetic barrier to resistance, is active against multidrug-resistant HBV, and should be the preferred oral anti-HBV agent in CHB patients who fail treatment with LAM and ADV. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Frequency Shift of Carbon-Nanotube-Based Mass Sensor Using Nonlocal Elasticity Theory
Lee Haw-Long
2010-01-01
The frequency equation of a carbon-nanotube-based cantilever sensor with an attached mass is derived analytically using nonlocal elasticity theory. From this equation, the relationship between the frequency shift of the sensor and the attached mass can be obtained. When the nonlocal effect is not taken into account, the variation of frequency shift with attached mass is compared with a previous study. The results show that the frequency shift of the sensor increases with increasing attached mass. When the attached mass is small compared with that of the sensor, the nonlocal effect is significant, and increasing the nonlocal parameter decreases the frequency shift of the sensor. In addition, when the attached mass is located closer to the free end, the frequency shift is more pronounced, making the sensor more sensitive. When the attached mass is small, a high sensitivity is obtained.
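The qualitative mass-frequency relationship is already visible in a lumped single-mode model, f = (1/2π)√(k/(m+Δm)); the sketch below deliberately ignores the nonlocal correction and the mass position effect discussed in the paper, and the parameter values in the test are arbitrary.

```python
import math

def resonant_frequency(stiffness, mass, attached=0.0):
    """f = sqrt(k / (m + dm)) / (2*pi) for a lumped cantilever mode;
    an attached mass dm lowers the resonant frequency."""
    return math.sqrt(stiffness / (mass + attached)) / (2.0 * math.pi)

def frequency_shift(stiffness, mass, attached):
    """Downward shift f(0) - f(dm); it grows with the attached mass."""
    return (resonant_frequency(stiffness, mass)
            - resonant_frequency(stiffness, mass, attached))
```

This reproduces the monotone trend reported above: a larger attached mass produces a larger frequency shift.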
An Analysis of Break, Break, Break Based on Stylistic Theory
李瑶
2014-01-01
Break, Break, Break is a poem by Alfred, Lord Tennyson, the Poet Laureate during Queen Victoria's reign. This exquisite little poem is well known for the poet's grief-stricken feelings and heart-broken emotions over the premature death of his best friend, Arthur Henry Hallam. Most previous studies of this poem focus on the emotional level, treating it as an elegy that expresses sorrow and lamentation for the death of a particular person. To reach a deeper understanding, however, this paper analyzes the poem based on stylistic theory, focusing on the phonological and grammatical levels. It aims to help readers cultivate a sense of appropriateness, sharpen their understanding and appreciation of literary works, and achieve adaptation in translation.
A Meme-Based Approach to Oral Traditional Theory
Michael D. C. Drout
2006-10-01
A meme is the simplest unit of cultural replication. This paper adapts meme theory to explain the workings of several aspects of oral traditions: traditional referentiality, anaphora, and the use of repeated metrical patterns. All three of these phenomena can be explained by operations of repetition and pattern recognition. The paper ultimately illustrates that the development of meme theory is an important first step towards a wholly materialist cultural poetics.
A Theory of Hierarchies Based on Limited Managerial Attention
John Geanakoplos; Paul R. Milgrom
1988-01-01
Our purpose in this paper is to investigate the economics of managerial organizations by focusing on the decision problem of management. Ours is a "team theory" analysis, that is, it ignores the problem of conflicting objectives among managers and focuses instead on the problem of coordinating the decisions of several imperfectly informed actors. However, unlike classical team theory, we concentrate on the choice by managers of what to know, as well as what to do, and we allow the possibility...
Trends in information theory-based chemical structure codification.
Barigye, Stephen J; Marrero-Ponce, Yovani; Pérez-Giménez, Facundo; Bonchev, Danail
2014-08-01
This report offers a chronological review of the most relevant applications of information theory in the codification of chemical structure information, through the so-called information indices. Basically, these are derived from the analysis of the statistical patterns of molecular structure representations, which include primitive global chemical formulae, chemical graphs, or matrix representations. Finally, new approaches that attempt to go "back to the roots" of information theory, in order to integrate other information-theoretic measures in chemical structure coding are discussed.
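As a toy example of an information index, the total information content of a partition of N atoms into equivalence classes, a classic Shannon-entropy-based descriptor, can be computed from a composition string; the formula I = N·log2(N) - Σ n_i·log2(n_i) is standard, though real indices typically operate on graph or matrix representations rather than bare formulae.

```python
import math
from collections import Counter

def total_information_content(symbols):
    """I = N*log2(N) - sum_i n_i*log2(n_i), where n_i is the size of
    the i-th equivalence class (here: atoms of the same element)."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return total * math.log2(total) - sum(
        c * math.log2(c) for c in counts.values())
```

A homogeneous composition carries zero information (one class), while mixed compositions score higher, mirroring how such indices discriminate molecular structures.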
Ali Hoseini Soorand
2015-10-01
Background and Aim: Hypertension is one of the most common and important non-communicable diseases and health problems in the world today; nevertheless, it is preventable and controllable. The Theory of Planned Behavior is one of the major theories explaining the process of adopting healthy behaviors. The present study aimed at determining the effect of an educational intervention based on this theory on its components in patients with hypertension. Materials and Methods: This randomized controlled field trial was done on 110 patients with hypertension in Zirkouh city, who were divided into two equal groups. Validity and reliability of the questionnaire used were determined through face and content validity, and through Cronbach's alpha and test-retest, respectively. The obtained data were analyzed by means of SPSS software (v. 16) using the t-test and repeated-measures analysis of variance. Results: Both groups were similar regarding the mean scores of the theory components before the intervention, but after the intervention the average scores of the experimental group increased: attitude from 48.7 to 64.1, subjective norm from 34.9 to 43.1, perceived behavioral control from 33.8 to 43, behavioral intention from 33.9 to 41.09, and behavior from 65.6 to 82.45; these differences were statistically significant (P<0.001). However, no significant difference was observed in the control group. Conclusion: Regarding the positive effect of education based on the Theory of Planned Behavior in controlling hypertension, planning a curriculum based on this theory is recommended.
Toward a brain-based theory of beauty.
Ishizu, Tomohiro; Zeki, Semir
2011-01-01
We wanted to learn whether activity in the same area(s) of the brain correlate with the experience of beauty derived from different sources. 21 subjects took part in a brain-scanning experiment using functional magnetic resonance imaging. Prior to the experiment, they viewed pictures of paintings and listened to musical excerpts, both of which they rated on a scale of 1-9, with 9 being the most beautiful. This allowed us to select three sets of stimuli--beautiful, indifferent and ugly--which subjects viewed and heard in the scanner, and rated at the end of each presentation. The results of a conjunction analysis of brain activity showed that, of the several areas that were active with each type of stimulus, only one cortical area, located in the medial orbito-frontal cortex (mOFC), was active during the experience of musical and visual beauty, with the activity produced by the experience of beauty derived from either source overlapping almost completely within it. The strength of activation in this part of the mOFC was proportional to the strength of the declared intensity of the experience of beauty. We conclude that, as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources--musical and visual--and probably by other sources as well. This has led us to formulate a brain-based theory of beauty.
Sánchez-Carracedo, David; López-Guimerà, Gemma; Fauquet, Jordi; Barrada, Juan Ramón; Pàmias, Montserrat; Puntí, Joaquim; Querol, Mireia; Trepat, Esther
2013-10-12
The prevention of eating disorders and disordered eating is increasingly recognized as a public health priority. Challenges in this field include moving from efficacy to effectiveness and developing an integrated approach to the prevention of a broad spectrum of eating and weight-related problems. A previous efficacy trial indicated that a universal disordered eating prevention program, based on the social cognitive model, the media literacy educational approach and cognitive dissonance theory, reduced risk factors for disordered eating, but it is unclear whether this program has effects under more real-world conditions. The main aim of this effectiveness trial protocol is to test whether this program has effects when incorporating an integrated approach to prevention and when previously-trained community providers implement the intervention. The research design involved a multi-center non-randomized controlled trial with baseline, post and 1-year follow-up measures. Six schools from the city of Sabadell (close to Barcelona) participated in the intervention group, and eleven schools from four towns neighboring Sabadell participated in the control group. A total of 174 girls and 180 boys in the intervention group, and 484 girls and 490 boys in the control group were registered in class lists prior to baseline. A total of 18 community providers (secondary-school class tutors, nurses from the Catalan Government's Health and School Program, and health promotion technicians from Sabadell City Council) were trained and delivered the program. Shared risk factors of eating and weight-related problems were assessed as main measures. It will be vital for progress in disordered eating prevention to conduct effectiveness trials, which test whether interventions are effective when delivered by community providers under ecologically valid conditions, as opposed to tightly controlled research trials. The MABIC project will provide new contributions in this transition from efficacy
Exceptional knowledge discovery in databases based on information theory
Suzuki, Einoshin [Yokohama National Univ. (Japan); Shimura, Masamichi [Tokyo Inst. of Technology (Japan)
1996-12-31
This paper presents an algorithm for discovering exceptional knowledge from databases. Exceptional knowledge, which is defined as an exception to a general fact, exhibits unexpectedness and is sometimes extremely useful in spite of its obscurity. Previous discovery approaches for this type of knowledge employ either background knowledge or domain-specific criteria for evaluating the possible usefulness, i.e. the interestingness of the knowledge extracted from a database. It has been pointed out, however, that these approaches are prone to overlook useful knowledge. In order to circumvent these difficulties, we propose an information-theoretic approach in which we obtain exceptional knowledge associated with general knowledge in the form of a rule pair using a depth-first search method. The product of the ACEs (Average Compressed Entropies) of the rule pair is introduced as the criterion for evaluating the interestingness of exceptional knowledge. The inefficiency of depth-first search is alleviated by a branch-and-bound method, which exploits the upper-bound for the product of the ACEs. MEPRO, which is a knowledge discovery system based on our approach, has been validated using the benchmark databases in the machine learning community.
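The rule-pair idea can be sketched with a much simpler interestingness measure than the paper's average compressed entropy: scan for a strong general rule (A → class) paired with a strong exception (A ∧ B → ¬class). This confidence-threshold stand-in and its toy records are illustrative only, not MEPRO's ACE-based branch-and-bound search.

```python
def find_exception_pairs(records, min_conf=0.8):
    """records: list of (frozenset_of_attributes, bool_class).
    Returns (general_attr, exception_attr) pairs where the general rule
    predicts the class and the refined rule predicts its negation."""
    attrs = set().union(*(a for a, _ in records))

    def confidence(premise, positive):
        covered = [cls for a, cls in records if premise <= a]
        if not covered:
            return 0.0
        return sum(1 for cls in covered if cls == positive) / len(covered)

    pairs = []
    for a in sorted(attrs):
        if confidence({a}, True) >= min_conf:            # general rule
            for b in sorted(attrs - {a}):
                if confidence({a, b}, False) >= min_conf:  # exception rule
                    pairs.append((a, b))
    return pairs
```

On a toy dataset where a treatment usually succeeds except for allergic patients, the scan surfaces exactly that exception pair, the kind of obscure-but-useful knowledge the abstract describes.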
Petrarca, Mateus H; Fernandes, José O; Godoy, Helena T; Cunha, Sara C
2016-12-01
With the aim of developing a new gas chromatography-mass spectrometry method to analyze 24 pesticide residues in baby foods at the levels imposed by established regulation, two simple, rapid and environmentally friendly sample preparation techniques based on QuEChERS (quick, easy, cheap, effective, rugged and safe) were compared: QuEChERS with dispersive liquid-liquid microextraction (DLLME) and QuEChERS with dispersive solid-phase extraction (d-SPE). Both sample preparation techniques achieved suitable performance criteria, including selectivity, linearity, acceptable recovery (70-120%) and precision (⩽20%). A higher enrichment factor was observed for DLLME, and consequently better limits of detection and quantification were obtained. Nevertheless, d-SPE removed matrix co-extractives from extracts more effectively than DLLME, which contributed to lower matrix effects. Twenty-two commercial fruit-based baby food samples were analyzed by the developed method, with procymidone detected in one sample at a level above the legal limit established by the EU.
Malte Kroenig
2016-01-01
Objective. In this study, we compared prostate cancer detection rates between MRI-TRUS fusion targeted and systematic biopsies using a robot-guided, software-based transperineal approach. Methods and Patients. 52 patients received an MRI/TRUS fusion targeted biopsy followed by a systematic volume-adapted biopsy using the same robot-guided transperineal approach. The primary outcome was the detection rate of clinically significant disease (Gleason grade ≥ 4). Secondary outcomes were the detection rate of all cancers, sampling efficiency and utility, and the serious adverse event rate. Patients received no antibiotic prophylaxis. Results. From 52 patients, 519 targeted biopsies from 135 lesions and 1561 random biopsies were generated (total n=2080). The overall detection rate of clinically significant PCa was 44.2% (23/52) for targeted and 50.0% (26/52) for random biopsy. Sampling efficiency, as the median number of cores needed to detect clinically significant prostate cancer, was 9 for targeted (IQR: 6–14.0) and 32 (IQR: 24–32) for random biopsy. Utility, as the number of clinically significant PCa cases additionally detected by either strategy, was 0% (0/52) for targeted and 3.9% (2/52) for random biopsy. Conclusions. MRI/TRUS fusion based targeted biopsy did not show an advantage in the overall detection rate of clinically significant prostate cancer.
Cluster density functional theory for lattice models based on the theory of Möbius functions
Lafuente, Luis; Cuesta, José A.
2005-08-01
Rosenfeld's fundamental-measure theory for lattice models is given a rigorous formulation in terms of the theory of Möbius functions of partially ordered sets. The free-energy density functional is expressed as an expansion in a finite set of lattice clusters. This set is endowed with a partial order, so that the coefficients of the cluster expansion are connected to its Möbius function. Because of this, it is rigorously proven that a unique such expansion exists for any lattice model. The low-density analysis of the free-energy functional motivates a redefinition of the basic clusters (zero-dimensional cavities) which guarantees a correct zero-density limit of the pair and triplet direct correlation functions. This new definition extends Rosenfeld's theory to lattice models with any kind of short-range interaction (repulsive or attractive, hard or soft, one or multicomponent ...). Finally, a proof is given that these functionals have a consistent dimensional reduction, i.e. the functional for dimension d' can be obtained from that for dimension d (d' < d) if the latter is evaluated at a density profile confined to a d'-dimensional subset.
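The Möbius function underlying such cluster expansions is computable directly from its defining recursion on any finite poset; the sketch below uses the divisor order as a toy poset, an illustration rather than the lattice-cavity poset of the paper.

```python
from functools import lru_cache

def moebius_function(elements, leq):
    """Return mu(x, y) for a finite poset given its elements and order
    relation leq(x, y): mu(x, x) = 1, mu(x, y) = -sum_{x <= z < y} mu(x, z),
    and mu(x, y) = 0 when x is not below y."""
    elems = tuple(elements)

    @lru_cache(maxsize=None)
    def mu(x, y):
        if x == y:
            return 1
        if not leq(x, y):
            return 0
        return -sum(mu(x, z) for z in elems
                    if leq(x, z) and leq(z, y) and z != y)

    return mu
```

On the divisors of 12 ordered by divisibility, mu(1, n) reproduces the number-theoretic Möbius function, a standard sanity check for the recursion.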
Han, Gang; Newell, Jay
2014-01-01
This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an…
Andrew C. Elton
2017-01-01
Salmonella meningitis is a rare manifestation of meningitis, typically presenting in neonates and the elderly. The infection is typically associated with foodborne outbreaks in developing nations and AIDS-endemic regions. We report a case of a 19-year-old male presenting with altered mental status after a 3-day absence from work at a Wisconsin tourist area. He was febrile, tachycardic, and tachypneic, with a GCS of 8. The patient was intubated and a presumptive diagnosis of meningitis was made. Treatment was initiated with ceftriaxone, vancomycin, acyclovir, dexamethasone, and fluid resuscitation. A lumbar puncture showed cloudy CSF with Gram-negative rods. He was admitted to the ICU. CSF culture confirmed Salmonella enterica subsp. I (enterica) Enteritidis (A). Based on this finding, a 4th-generation HIV antibody/p24 antigen test was sent. When this returned positive, a CD4 count was obtained and showed 3 cells/mm3, confirming AIDS. The patient ultimately received 38 days of ceftriaxone, was placed on elvitegravir, cobicistat, emtricitabine, and tenofovir alafenamide (Genvoya) for HIV/AIDS, and was discharged neurologically intact after a 44-day admission.
Mean field theory for fermion-based U(2) anyons
McGraw, P
1996-01-01
The energy density is computed for a U(2) Chern-Simons theory coupled to a non-relativistic fermion field (a theory of "non-Abelian anyons") under the assumptions of uniform charge and matter density. When the matter field is a spinless fermion, we find that this energy is independent of the two Chern-Simons coupling constants and is minimized when the non-Abelian charge density is zero. This suggests that there is no spontaneous breaking of the SU(2) subgroup of the symmetry, at least in this mean-field approximation. For spin-1/2 fermions, we find self-consistent mean-field states with a small non-Abelian charge density, which vanishes as the theory of free fermions is approached.
Nikolaev, Evgeni V; Sontag, Eduardo D
2016-04-01
Synthetic constructs in biotechnology, biocomputing, and modern gene therapy interventions are often based on plasmids or transfected circuits which implement some form of "on-off" switch. For example, the expression of a protein used for therapeutic purposes might be triggered by the recognition of a specific combination of inducers (e.g., antigens), and memory of this event should be maintained across a cell population until a specific stimulus commands a coordinated shut-off. The robustness of such a design is hampered by molecular ("intrinsic") or environmental ("extrinsic") noise, which may lead to spontaneous changes of state in a subset of the population and is reflected in the bimodality of protein expression, as measured for example using flow cytometry. In this context, a "majority-vote" correction circuit, which brings deviant cells back into the required state, is highly desirable, and quorum-sensing has been suggested as a way for cells to broadcast their states to the population as a whole so as to facilitate consensus. In this paper, we propose what we believe is the first such design that has mathematically guaranteed properties of stability and auto-correction under certain conditions. Our approach is guided by concepts and theory from the field of "monotone" dynamical systems developed by M. Hirsch, H. Smith, and others. We benchmark our design by comparing it to an existing design which has been the subject of experimental and theoretical studies, illustrating its superiority in stability and self-correction of synchronization errors. Our stability analysis, based on dynamical systems theory, guarantees global convergence to steady states, ruling out unpredictable ("chaotic") behaviors and even sustained oscillations in the limit of convergence. These results are valid regardless of parameter values and are based only on the wiring diagram. The theory is complemented by extensive computational bifurcation analysis.
Theory of sampling and its application in tissue based diagnosis
Kayser Gian
2009-02-01
Background: A general theory of sampling and its application in tissue-based diagnosis is presented. Sampling is defined as the extraction of information from certain limited spaces and its transformation into a statement or measure that is valid for the entire (reference) space. The procedure should be reproducible in time and space, i.e. give the same results when applied under similar circumstances. Sampling includes two different aspects: the procedure of sample selection and the efficiency of its performance. The practical performance of sample selection focuses on the search for the localization of specific compartments within the basic space, and on the search for the presence of specific compartments. Methods: When a sampling procedure is applied in diagnostic processes, two different procedures can be distinguished: (I) the evaluation of the diagnostic significance of a certain object, which is the probability that the object can be grouped into a certain diagnosis, and (II) the probability of detecting these basic units. Sampling can be performed without or with external knowledge, such as the size of the searched objects, neighbourhood conditions, spatial distribution of objects, etc. If the sample size is much larger than the object size, the application of a translation-invariant transformation results in Krige's formula, which is widely used in the search for ores. Usually, sampling is performed in a series of area (space) selections of identical size. The size can be defined in relation to the reference space or according to interspatial relationships. The first method is called random sampling, the second stratified sampling. Results: Random sampling does not require knowledge about the reference space, and is used to estimate the number and size of objects. Estimated features include area (volume) fraction, numerical, boundary and surface densities. Stratified sampling requires knowledge of the objects (and their features) and evaluates spatial features in relation to
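Random sampling's use for estimating area fractions can be illustrated with uniform point sampling over a unit-square reference space; the half-plane object and point count below are arbitrary illustrative choices, not a procedure from the paper.

```python
import random

def estimate_area_fraction(in_object, n_points, seed=0):
    """Fraction of uniformly placed random points in the unit square
    that fall inside the object; an unbiased estimator of the object's
    area fraction, with no knowledge of the reference space required."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_points)
               if in_object(rng.random(), rng.random()))
    return hits / n_points
```

For an object occupying the left half of the square, the estimate converges to 0.5 as the number of sampled points grows, with sampling error shrinking as 1/sqrt(n).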
Frustrated magnetism and caloric effects in Mn-based antiperovskite nitrides: Ab initio theory
Zemen, J.; Mendive-Tapia, E.; Gercsi, Z.; Banerjee, R.; Staunton, J. B.; Sandeman, K. G.
2017-05-01
We model changes of magnetic ordering in Mn-based antiperovskite nitrides driven by biaxial lattice strain at zero and at finite temperature. We employ a noncollinear spin-polarized density functional theory to compare the response of the geometrically frustrated exchange interactions to a tetragonal symmetry breaking (the so-called piezomagnetic effect) across a range of Mn3AN (A = Rh, Pd, Ag, Co, Ni, Zn, Ga, In, Sn) at zero temperature. Building on the robustness of the effect, we focus on Mn3GaN and extend our study to finite temperature using the disordered local moment (DLM) first-principles electronic structure theory to model the interplay between the ordering of Mn magnetic moments and itinerant electron states. We discover a rich temperature-strain magnetic phase diagram with two previously unreported phases stabilized by strains larger than 0.75% and with transition temperatures strongly dependent on strain. We propose an elastocaloric cooling cycle crossing two of the available phase transitions to achieve simultaneously a large isothermal entropy change (due to the first-order transition) and a large adiabatic temperature change (due to the second-order transition).
Course Management and Students' Expectations: Theory-Based Considerations
Buckley, M. Ronald; Novicevic, Milorad M.; Halbesleben, Jonathon R. B.; Harvey, Michael
2004-01-01
This paper proposes a theoretical, yet practical, framework for managing the formation of students' unrealistic expectations in a college course. Using relational contracting theory, alternative teacher interventions aimed at effective management of students' expectations about the course are described. Also, the formation of the student…
Videogames, Tools for Change: A Study Based on Activity Theory
Méndez, Laura; Lacasa, Pilar
2015-01-01
Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…
Aligning activity theory with community based participatory research
Introduction. In the year 2000 about five ... Through the use of the language of Cultural Historical Activity Theory (CHAT), in anticipation .... models for analysing information; (ii) new ideas and tools that the participants come up with; and ...
Effective Contraceptive Use: An Exploration of Theory-Based Influences
Peyman, N.; Oakley, D.
2009-01-01
The purpose of this study was to explore factors that influence oral contraceptive (OC) use among women in Iran using the Theory of Planned Behavior (TPB) and concept of self-efficacy (SE). The study sample consisted of 360 married OC users, aged 18-49 years recruited at public health centers of Mashhad, 900 km east of Tehran. SE had the strongest…
Automatic Trading Agent. RMT based Portfolio Theory and Portfolio Selection
Snarska, Malgorzata; Krzych, Jakub
2006-01-01
Portfolio theory is a very powerful tool in modern investment theory. It is helpful in estimating the risk of an investor's portfolio, which arises from our lack of information, uncertainty and incomplete knowledge of reality, which forbid a perfect prediction of future price changes. Despite its many advantages, this tool is not well known and is not widely used among investors on the Warsaw Stock Exchange. The main reason for abandoning this method is its high level of complexity and immense calculations. The aim of this paper is to introduce an automatic decision-making system, which allows a single investor to use such complex methods of Modern Portfolio Theory (MPT). The key tool in MPT is the analysis of an empirical covariance matrix. This matrix, obtained from historical data, is biased by such a high amount of statistical uncertainty that it can be seen as random. By bringing into practice the ideas of Random Matrix Theory (RMT), the noise is removed or significantly reduced, so the future risk and return are b...
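The RMT noise-filtering idea in this abstract can be sketched with a standard Marchenko-Pastur filter; this is a textbook variant under i.i.d.-returns assumptions, not necessarily the authors' exact procedure, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_assets, n_obs = 50, 500
returns = rng.standard_normal((n_obs, n_assets))  # pure-noise returns

corr = np.corrcoef(returns, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)

# Marchenko-Pastur upper edge for q = n_assets / n_obs: eigenvalues below it
# are statistically indistinguishable from noise.
q = n_assets / n_obs
lambda_max = (1 + np.sqrt(q)) ** 2

# "Clean" the matrix: replace noise eigenvalues by their average (preserving
# the trace), keep any signal eigenvalues above the edge.
noise = eigvals < lambda_max
cleaned = eigvals.copy()
cleaned[noise] = eigvals[noise].mean()
corr_clean = eigvecs @ np.diag(cleaned) @ eigvecs.T

print(f"{noise.sum()} of {n_assets} eigenvalues fall in the noise band")
```

For genuinely random returns almost the entire spectrum lands in the noise band, which is exactly why an unfiltered empirical covariance matrix "can be seen as random".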
A Method for Dispersion Compensation Based on GLM Theory
(no author listed)
2000-01-01
A method used to design waveguide gratings for dispersion compensation employing GLM theory is briefly described. Using this method, a reflective grating is designed which has both a flat amplitude and a quadratic phase response over the transfer bandwidth.
Varzinczak, IJ
2010-02-01
We first revisit the semantics of action theory contraction proposed in previous work, giving more robust operators that express minimal change based on a notion of distance between Kripke models. Second, we give algorithms for syntactical action theory contraction...
Problem-based learning for technical students based on TRIZ (theory of inventive problem solving)
Babenko Oksana
2016-01-01
The basis of modern educational technology in teaching is problem-based learning through the use of the educational technology "Powerful Thinking" — the Theory of Inventive Problem Solving (TRIZ) — including a systematic approach to organizing independent work of a search-and-research character. Physics workshops based on TRIZ were developed and systematically introduced into the natural sciences cycle, implementing all aspects of educational activity: substantive, procedural and motivational. A new model of the physics workshop and its form of organization is presented, based on problem-based learning with the use of TRIZ. The interactive form of the workshop yields high-quality substantive and personal outcomes for students, plays a significant role in the formation of professional competencies, and affects the quality of training of practice-oriented specialists.
Carol A. Gordon
2009-09-01
Objective – The purpose of this paper is to articulate a theory for the use of action research as a tool of evidence based practice for information literacy instruction in school libraries. The emerging theory is intended to capture the complex phenomenon of information skills teaching as it is embedded in school curricula. Such a theory is needed to support research on the integrated approach to teaching information skills and knowledge construction within the framework of inquiry learning. Part 1 of this paper, in the previous issue, built a foundation for emerging theory, which established user‐centric information behavior and constructivist learning theory as the substantive theory behind evidence based library instruction in schools. Part 2 continues to build on the Information Search Process and Guided Inquiry as foundational to studying the information‐to‐knowledge connection and the concepts of help and intervention characteristic of 21st century school library instruction. Methods – This paper examines the purpose and methodology of action research as a tool of evidence based instruction. This is accomplished through the explication of three components of theory‐building: paradigm, substantive research, and metatheory. Evidence based practice is identified as the paradigm that contributes values and assumptions about school library instruction. It establishes the role of evidence in teaching and learning, linking theory and practice. Action research, as a tool of evidence based practice, is defined as the synthesis of authentic learning, or performance‐based assessment practices that continuously generate evidence throughout the inquiry unit of instruction, and traditional data collection methods typically used in formal research. This paper adds social psychology theory from Lewin's work, which contributes methodology from Gestalt psychology, field theory, group dynamics, and change theory. For Lewin the purpose of action
Gamification in library websites based on motivational theories
Zahed Bigdeli
2016-06-01
Gamification is defined as "the use of game elements and techniques in non-game contexts"; this is the most comprehensive definition presented so far. The concept first emerged in 2002, but it took about eight years to attract wide attention. Gamification has been applied in various disciplines according to their different needs. In essence, gamification tries to present tedious, routine daily tasks in a manner that proves engaging for users/players. This paper studies gamification's role as an engagement tool for libraries. The study also aims to investigate the role of two common motivational theories in library gamification, namely "self-determination theory" and "flow theory".
SMD-based numerical stochastic perturbation theory
Dalla Brida, Mattia
The viability of a variant of numerical stochastic perturbation theory, where the Langevin equation is replaced by the SMD algorithm, is examined. In particular, the convergence of the process to a unique stationary state is rigorously established and the use of higher-order symplectic integration schemes is shown to be highly profitable in this context. For illustration, the gradient-flow coupling in finite volume with Schrödinger functional boundary conditions is computed to two-loop (i.e. NNL) order in the SU(3) gauge theory. The scaling behaviour of the algorithm turns out to be rather favourable in this case, which allows the computations to be driven close to the continuum limit.
THE RESPONSIBILITY TO PROTECT. A JUST WAR THEORY BASED ANALYSIS
Andreea IANCU
2014-11-01
This paper analyzes the Responsibility to Protect principle as the paradigm that reinforces just war theory in current international relations. The importance of this analysis is given by the fact that, amid the current change in the sources of international conflict, the Responsibility to Protect principle affirms the responsibility of the international community to protect all the citizens of the world. In this context we witness a transition toward a post-Westphalian international system, which values the individual as a security referent. This article discusses the origins of the Responsibility to Protect principle and problematizes the legitimacy of the use of violence and force in the current international system. Moreover, the paper analyzes the possible humanization of current international relations and, simultaneously, the persistence of conflict and warfare in the international system. The conclusion of this research states that the Responsibility to Protect principle revises just war theory by centering it on the individual.
Optimal Differential Routing based on Finite State Machine Theory
M. S. Krishnamoorthy; Loy, James R.; McDonald, John F.
1999-01-01
Noise margins in high-speed digital systems continue to erode. Fully differential signal routing provides a mechanism for deferring these effects. This paper proposes a three-stage routing process for solving the adjacent-placement routing problem of differential signal pairs, and proves that it is optimal. The process views differential pairs as logical nets; routes the logical nets; then bifurcates the result to achieve a physical realization. Finite state machine theory provides the critica...
Fault Diagnosis of Machine Based on Fuzzy Reliability Theory
(no author listed)
2001-01-01
According to life analysis in reliability theory, certain diagnosis rules can be used to diagnose machine faults. On this basis, considering the indefiniteness of machine working states, the exact diagnosis rule was extended to a fuzzy diagnosis rule using basic concepts and methods of fuzzy mathematics. The formulas of fault probability under different conditions were deduced. Finally, an example is given and the results of the two methods are compared.
Research on DFRP Theory and Application Based on PDM
(no author listed)
2001-01-01
Product data management is the foundation and platform to implement Concurrent Engineering. To take the manufacturing resource planning of enterprises into account while performing concurrent product design, the paper puts forth a new theory, DFRP (Design for Resource Planning), which fulfills data intercommunication between PDM (Product Data Management) and MRPII (Manufacturing Resource Planning), thus bringing DFRP into the family of DFX (Design for X) tools. Finally, the paper analyzes present problems in the field and points out directions for development.
Buckled graphene: A model study based on density functional theory
Khan, Mohammad A.
2010-09-01
We make use of ab initio calculations within density functional theory to investigate the influence of buckling on the electronic structure of single layer graphene. Our systematic study addresses a wide range of bond length and bond angle variations in order to obtain insights into the energy scale associated with the formation of ripples in a graphene sheet.
Literary pedagogy in nursing: a theory-based perspective.
Sakalys, Jurate A
2002-09-01
Using fictional and autobiographical literature in nursing education is a primary way of understanding patients' lived experiences and fostering development of essential relational and reflective thinking skills. Application of literary theory to this pedagogic practice can expand conceptualization of teaching goals, inform specific teaching strategies, and potentially contribute to socially consequential educational outcomes. This article describes a theoretical schema that focuses on pedagogical goals in terms of the three related skills (i.e., reading, interpretation, criticism) of textual competence.
The emergence of information systems: a communication-based theory
Holten, Roland; Rosenkranz, Christoph
2012-01-01
An information system is more than just the information technology; it is the system that emerges from the complex interactions and relationships between the information technology and the organization. However, what impact information technology has on an organization and how organizational structures and organizational change influence information technology remains an open question. We propose a theory to explain how communication structures emerge and adapt to environmental changes. We op...
Extensive Generalization of Statistical Mechanics Based on Incomplete Information Theory
Qiuping A. Wang
2003-06-01
Statistical mechanics is generalized on the basis of an additive information theory for incomplete probability distributions. The incomplete normalization is used to obtain a generalized entropy. The concomitant incomplete statistical mechanics is applied to some physical systems in order to show the effect of the incompleteness of information. It is shown that this extensive generalized statistics can be useful for correlated electron systems in the weak coupling regime.
Collective learning modeling based on the kinetic theory of active particles
Burini, D.; De Lillo, S.; Gibelli, L.
2016-03-01
This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.
INVESTIGATION ON KANE DYNAMIC EQUATIONS BASED ON SCREW THEORY FOR OPEN-CHAIN MANIPULATORS
LIU Wu-fa; GONG Zhen-bang; WANG Qin-que
2005-01-01
First, screw theory, the product-of-exponentials formula and the Jacobian matrix are introduced. Then definitions are given of the active force wrench, inertial force wrench, partial velocity twist, generalized active force, and generalized inertial force according to screw theory. After that, Kane dynamic equations based on screw theory for open-chain manipulators are derived. It is then illustrated how to compute the partial velocity twist by a geometrical method. Finally, the correctness of the conclusions is verified by an example.
Research on adsorption mechanism of wall climbing robots based on internally balanced theory
FAN Ji-zhuang; ZHU Yan-he; ZHAO Jie; CAI He-gao
2007-01-01
The internally balanced theory, proposed by Japanese researchers, solved the contradiction between the adsorption ability and the moving capability of permanent magnetic adsorption mechanisms. However, it still has some problems when applied to wall-climbing robots. This paper analyzes and improves the theory, and the improved internally balanced theory satisfies the requirements of the adsorption mechanism significantly better. Finally, a practical prototype is proposed based on this method, and both the ANSYS analysis and the experimental results justify the design validity.
Gravitational Cherenkov losses in theories based on modified Newtonian dynamics.
Milgrom, Mordehai
2011-03-18
Survival of high-energy cosmic rays (HECRs) against gravitational Cherenkov losses is shown not to cast strong constraints on modified Newtonian dynamics (MOND) theories that are compatible with general relativity (GR): theories that coincide with GR for accelerations ≫ a_0 (a_0 is the MOND constant). The energy-loss rate, Ė, is many orders of magnitude smaller than those derived in the literature for theories with no extra scale. The modification to GR, which underlies Ė, enters only beyond the MOND radius of the particle: r_M = (Gp/(c a_0))^(1/2). The spectral cutoff, entering Ė quadratically, is thus r_M^(-1), not k_dB = p/ℏ. Thus, Ė is smaller than published rates, which use k_dB, by a factor ∼(r_M k_dB)^2 ≈ 10^39 (cp/3×10^11 GeV)^3. Losses are important only beyond D_loss ≈ q ℓ_M, where q is a dimensionless factor and ℓ_M = c²/a_0 is the MOND length, which is ≈2π times the Hubble distance.
The refined theory of deep rectangular beams based on general solutions of elasticity
GAO Yang; WANG Minzhong
2006-01-01
The problem of deducing one-dimensional theory from two-dimensional theory for a homogeneous isotropic beam is investigated. Based on elasticity theory, the refined theory of rectangular beams is derived by using Papkovich-Neuber solution and Lur'e method without ad hoc assumptions. It is shown that the displacements and stresses of the beam can be represented by the angle of rotation and the deflection of the neutral surface. Based on the refined beam theory, the exact equations for the beam without transverse surface loadings are derived and consist of two governing differential equations: the fourth-order equation and the transcendental equation. The approximate equations for the beam under transverse loadings are derived directly from the refined beam theory and are almost the same as the governing equations of Timoshenko beam theory. In two examples, it is shown that the new theory provides better results than Levinson's beam theory when compared with those obtained from the linear theory of elasticity.
Nonextensive random-matrix theory based on Kaniadakis entropy
Abul-Magd, A.Y. [Department of Mathematics, Faculty of Science, Zagazig University, Zagazig (Egypt)]. E-mail: a_y_abul_magd@hotmail.com
2007-02-12
The joint eigenvalue distributions of random-matrix ensembles are derived by applying the principle of maximum entropy to the Renyi, Abe and Kaniadakis entropies. While the Renyi entropy produces essentially the same matrix-element distributions as the expression previously obtained using the Tsallis entropy, and the Abe entropy does not lead to a closed-form expression, the Kaniadakis entropy leads to a new generalized form of the Wigner surmise that describes a transition of the spacing distribution from chaos to order. This expression is compared with the corresponding expression obtained by assuming Tsallis' entropy, as well as with the results of a previous numerical experiment.
Nezarat, Amin; Dastghaibifard, G H
2015-01-01
One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability and, on the other hand, users expect the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game-theoretic mechanism, holding a repeated game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium, where players are no longer inclined to alter their bids for that resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used, the proposed model is simulated in CloudSim, and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, produces the fewest service-level-agreement violations, and provides the most utility to the provider.
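The convergence to a Nash equilibrium by repeated play that this abstract relies on can be sketched with a deliberately simpler game; the Cournot-style payoffs below are illustrative assumptions, not the paper's auction model.

```python
# Repeated best responses driving two non-cooperative players to the Nash
# equilibrium of a Cournot duopoly (chosen only to illustrate convergence).
a, c = 10.0, 1.0  # demand intercept and unit cost (assumed)
q1 = q2 = 0.0     # initial strategies
for _ in range(100):
    q1 = max(0.0, (a - c - q2) / 2)  # best response of player 1 to q2
    q2 = max(0.0, (a - c - q1) / 2)  # best response of player 2 to q1

nash = (a - c) / 3  # analytic Nash equilibrium of this game
print(q1, q2, nash)
```

At the fixed point neither player gains by changing its strategy unilaterally, which is exactly the stopping condition the abstract describes for the bidders.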
A Review on Theories of Task-based Teaching
谢济光
2014-01-01
Based on several books on English teaching and learning, this paper gives a general review of task-based teaching. Firstly, it introduces the definitions of task and task-based syllabus and makes two distinctions, namely between "task" and "activity", and between "task-based teaching" and "communicative teaching". Then it states the notions behind task-based teaching. Finally, it gives notes for the application of this approach.
Cooperative Learning: Improving University Instruction by Basing Practice on Validated Theory
Johnson, David W.; Johnson, Roger T.; Smith, Karl A.
2014-01-01
Cooperative learning is an example of how theory validated by research may be applied to instructional practice. The major theoretical base for cooperative learning is social interdependence theory. It provides clear definitions of cooperative, competitive, and individualistic learning. Hundreds of research studies have validated its basic…
O'Connor, Thomas G.; Matias, Carla; Futh, Annabel; Tantam, Grace; Scott, Stephen
2013-01-01
Parenting programs for school-aged children are typically based on behavioral principles as applied in social learning theory. It is not yet clear if the benefits of these interventions extend beyond aspects of the parent-child relationship quality conceptualized by social learning theory. The current study examined the extent to which a social…
山口,生史
2002-01-01
This study is based on organizational justice theory. Although organizational justice theory is useful for explaining organizational behavior, it has not focused on motivation per se. In this study, the linkage between organizational justice and motivation is explored with the mediating effect of interpersonal communication in an organization (i.e., organizational communication).
Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact
McQuillin, Samuel D.; Lyons, Michael D.
2016-01-01
This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…
The TEACH Method: An Interactive Approach for Teaching the Needs-Based Theories Of Motivation
Moorer, Cleamon, Jr.
2014-01-01
This paper describes an interactive approach for explaining and teaching the Needs-Based Theories of Motivation. The acronym TEACH stands for Theory, Example, Application, Collaboration, and Having Discussion. This method can help business students to better understand and distinguish the implications of Maslow's Hierarchy of Needs,…
Pearce, Ella Elizabeth
Four seventh grade life science classes, given curriculum materials based upon Piagetian theories of intellectual development and Skinner's theories of secondary reinforcement, were compared with four control classes from the same school districts. Nine students from each class, who (at the pretest) were at the concrete operations stage of…
van Urk, Felix; Grant, Sean; Bonell, Chris
2016-01-01
The use of explicit programme theory to guide evaluation is widely recommended. However, practitioners and other partnering stakeholders often initiate programmes based on implicit theories, leaving researchers to explicate them before commencing evaluation. The current study aimed to apply a systematic method to undertake this process. We…
Bresciani, Marilee J.
2011-01-01
The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…
No Previous Public Services Required
Taylor, Kelley R.
2009-01-01
In 2007, the Supreme Court heard a case that involved the question of whether a school district could be required to reimburse parents who unilaterally placed their child in private school when the child had not previously received special education and related services in a public institution ("Board of Education v. Tom F."). The…
A precepted leadership course based on Bandura's social learning theory.
Haddock, K S
1994-01-01
Transition from student to registered nurse (RN) has long been cited as a difficult time for new graduates entering health care. Bandura's (1977) theory of social learning guided a revision of a nursing leadership course required of baccalaureate student nurses (BSNs) in their final semester. The preceptorship allowed students to work closely with RNs, practice their modeled behaviors, and then receive feedback and reinforcement from both the preceptor and the supervising faculty member. Students were thus better prepared to function in the reality of the practice setting. Positive outcomes were experienced by students, BSN preceptors, faculty, and nurse administrators.
Interactive Image Segmentation Framework Based On Control Theory.
Zhu, Liangjia; Kolesov, Ivan; Karasev, Peter; Tannenbaum, Allen
2015-02-21
Segmentation of anatomical structures in medical imagery is a key step in a variety of clinical applications. Designing a generic, automated method that works for various structures and imaging modalities is a daunting task. Instead of proposing a new specific segmentation algorithm, in this paper, we present a general design principle on how to integrate user interactions from the perspective of control theory. In this formulation, Lyapunov stability analysis is employed to design and analyze an interactive segmentation system. The effectiveness and robustness of the proposed method are demonstrated.
Rapid Occupant Classification System Based on Rough Sets Theory
Lin Chen
2012-09-01
In an intelligent airbag system, correct classification of the occupant type is the precondition for, and plays an important role in, controlling the airbag release time and inflation strength during emergencies. In this paper, a novel rapid occupant classification system is proposed in which tens of pressure sensors collect pressure distribution data in real time, and rough set theory is then applied to extract classification knowledge from the data features. Furthermore, experiments have been done to verify its efficiency and effectiveness.
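The rough-set step of extracting classification knowledge can be sketched on toy data; the discretized pressure features and labels below are invented for illustration and are not the paper's data.

```python
from collections import defaultdict

# Discretized pressure features -> occupant label; rows sharing the same
# feature tuple form an indiscernibility class.
samples = [
    (("low", "narrow"), "child"),
    (("low", "narrow"), "child"),
    (("high", "wide"), "adult"),
    (("high", "narrow"), "adult"),
    (("high", "narrow"), "child"),  # conflicts with the row above -> boundary
]

classes = defaultdict(set)
for features, label in samples:
    classes[features].add(label)

target = "adult"
# Lower approximation: feature classes whose members are certainly `target`.
lower = {f for f, labels in classes.items() if labels == {target}}
# Upper approximation: feature classes that possibly contain `target`.
upper = {f for f, labels in classes.items() if target in labels}
print(lower, upper)
```

Classification rules derived from the lower approximation are certain; the boundary region (upper minus lower) captures the indefiniteness that rough sets are used to handle.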
A Gyrocompass for Maritime Applications Based Upon Multivariable Control Theory
Olav Egeland
1984-10-01
A gyrocompass is designed using multivariable control theory. The compass can be implemented with an inertial platform or as a strap-down system. Measurement noise caused by vessel acceleration is modeled, and feedforward is taken from vessel speed. Though the model is of order 9, it has only three unknown parameters, of which one can be chosen a priori. Parameter estimation is discussed. For simulation of the compass, a non-linear surface vessel model with 6 degrees of freedom and wave excitation is used.
An eddy tracking algorithm based on dynamical systems theory
Conti, Daniel; Orfila, Alejandro; Mason, Evan; Sayol, Juan Manuel; Simarro, Gonzalo; Balle, Salvador
2016-11-01
This work introduces a new method for ocean eddy detection that applies concepts from stationary dynamical systems theory. The method is composed of three steps: first, the centers of eddies are obtained from fixed points and their linear stability analysis; second, the size of the eddies is estimated from the vorticity between the eddy center and its neighboring fixed points, and, third, a tracking algorithm connects the different time frames. The tracking algorithm has been designed to avoid mismatching connections between eddies at different frames. Eddies are detected for the period between 1992 and 2012 using geostrophic velocities derived from AVISO altimetry and a new database is provided for the global ocean.
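The first two steps of the described method (fixed-point detection and linear stability analysis) can be sketched on a synthetic velocity field; this is an illustrative reconstruction, not the authors' algorithm, and the field itself is an assumption.

```python
import numpy as np

def velocity(p):
    # Synthetic steady 2D field: a single idealized vortex at (0.3, -0.2).
    x, y = p
    return np.array([-(y + 0.2), (x - 0.3)])

def jacobian(f, p, eps=1e-6):
    # Central finite differences, one column per coordinate direction.
    return np.column_stack([
        (f(p + eps * e) - f(p - eps * e)) / (2 * eps) for e in np.eye(len(p))
    ])

# Step 1: Newton iteration for the fixed point velocity(p) = 0.
p = np.array([0.0, 0.0])
for _ in range(20):
    p = p - np.linalg.solve(jacobian(velocity, p), velocity(p))

# Step 2: purely imaginary Jacobian eigenvalues mean rotation about the
# fixed point, i.e. an eddy center rather than a saddle.
eigs = np.linalg.eigvals(jacobian(velocity, p))
is_eddy = np.all(np.abs(eigs.real) < 1e-6) and np.all(eigs.imag != 0)
print(p, is_eddy)
```

In the paper's setting the field would come from altimetry-derived geostrophic velocities, with the eddy size then estimated from vorticity around each detected center.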
Computer-based teaching module design: principles derived from learning theories.
Lau, K H Vincent
2014-03-01
The computer-based teaching module (CBTM), which has recently gained prominence in medical education, is a teaching format in which a multimedia program serves as a single source for knowledge acquisition rather than playing an adjunctive role as it does in computer-assisted learning (CAL). Despite empirical validation in the past decade, there is limited research into the optimisation of CBTM design. This review aims to summarise research in classic and modern multimedia-specific learning theories applied to computer learning, and to collapse the findings into a set of design principles to guide the development of CBTMs. Scopus was searched for: (i) studies of classic cognitivism, constructivism and behaviourism theories (search terms: 'cognitive theory' OR 'constructivism theory' OR 'behaviourism theory' AND 'e-learning' OR 'web-based learning') and their sub-theories applied to computer learning, and (ii) recent studies of modern learning theories applied to computer learning (search terms: 'learning theory' AND 'e-learning' OR 'web-based learning') for articles published between 1990 and 2012. The first search identified 29 studies, dominated in topic by the cognitive load, elaboration and scaffolding theories. The second search identified 139 studies, with diverse topics in connectivism, discovery and technical scaffolding. Based on their relative representation in the literature, the applications of these theories were collapsed into a list of CBTM design principles. Ten principles were identified and categorised into three levels of design: the global level (managing objectives, framing, minimising technical load); the rhetoric level (optimising modality, making modality explicit, scaffolding, elaboration, spaced repeating), and the detail level (managing text, managing devices). This review examined the literature in the application of learning theories to CAL to develop a set of principles that guide CBTM design. Further research will enable educators to
Branes in Extended Spacetime: Brane Worldvolume Theory Based on Duality Symmetry.
Sakatani, Yuho; Uehara, Shozo
2016-11-04
We propose a novel approach to the brane worldvolume theory based on the geometry of extended field theories: double field theory and exceptional field theory. We demonstrate the effectiveness of this approach by showing that one can reproduce the conventional bosonic string and membrane actions, and the M5-brane action in the weak-field approximation. At a glance, the proposed 5-brane action without approximation looks different from the known M5-brane actions, but it is consistent with the known nonlinear self-duality relation, and it may provide a new formulation of a single M5-brane action. Actions for exotic branes are also discussed.
Quaternion based generalization of Chern-Simons theories in arbitrary dimensions
D'Adda, Alessandro; Shimode, Naoki; Tsukioka, Takuya
2016-01-01
A generalization of Chern-Simons gauge theory is formulated in any dimension and arbitrary gauge group where gauge fields and gauge parameters are differential forms of any degree. The quaternion algebra structure of this formulation is shown to be equivalent to a three Z(2)-gradings structure, thus clarifying the quaternion role in a previous formulation.
Quaternion based generalization of Chern–Simons theories in arbitrary dimensions
Alessandro D'Adda
2017-08-01
A generalization of Chern–Simons gauge theory is formulated in any dimension and arbitrary gauge group where gauge fields and gauge parameters are differential forms of any degree. The quaternion algebra structure of this formulation is shown to be equivalent to a three Z2-gradings structure, thus clarifying the quaternion role in the previous formulation.
Tumour chemotherapy strategy based on impulse control theory.
Ren, Hai-Peng; Yang, Yan; Baptista, Murilo S; Grebogi, Celso
2017-03-06
Chemotherapy is a widely accepted method for tumour treatment. A medical doctor usually treats patients periodically with an amount of drug according to empirical medicine guides. From the point of view of cybernetics, this procedure is an impulse control system, where the amount and frequency of drug used can be determined analytically using the impulse control theory. In this paper, the stability of a chemotherapy treatment of a tumour is analysed applying the impulse control theory. The globally stable condition for prescription of a periodic oscillatory chemotherapeutic agent is derived. The permanence of the solution of the treatment process is verified using the Lyapunov function and the comparison theorem. Finally, we provide the values for the strength and the time interval that the chemotherapeutic agent needs to be applied such that the proposed impulse chemotherapy can eliminate the tumour cells and preserve the immune cells. The results given in the paper provide an analytical formula to guide medical doctors to choose the theoretical minimum amount of drug to treat the cancer and prevent harming the patients because of over-treating. This article is part of the themed issue 'Horizons of cybernetical physics'.
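The paper's analysis concerns a specific tumour-immune model; as a deliberately simplified sketch, periodic impulsive dosing on plain logistic growth already exhibits the kind of eradication threshold that impulse control theory formalizes: for small tumours the per-cycle factor (1 - k)·e^{rτ} must drop below 1. All parameter values below are illustrative, not from the paper:

```python
import math

def tumour_after_cycles(T0, r, K, tau, kill, n_cycles, dt=0.01):
    """Logistic tumour growth punctuated by periodic impulsive drug doses.

    Between doses: dT/dt = r*T*(1 - T/K).  At each dose: T -> (1 - kill)*T.
    An illustrative stand-in for the paper's tumour-immune model.
    """
    T = T0
    steps = int(tau / dt)
    for _ in range(n_cycles):
        for _ in range(steps):
            T += dt * r * T * (1.0 - T / K)   # forward-Euler growth phase
        T *= (1.0 - kill)                      # impulsive kill at t = n*tau
    return T

r, tau, kill = 0.2, 7.0, 0.8   # growth rate /day, dosing period, kill fraction
# For T << K, eradication requires the per-cycle factor below to be < 1:
print((1 - kill) * math.exp(r * tau))   # ≈ 0.81, so the tumour shrinks per cycle
T_final = tumour_after_cycles(T0=1e6, r=r, K=1e9, tau=tau, kill=kill, n_cycles=20)
print(T_final)   # orders of magnitude below the initial 1e6 cells
```

Raising τ or lowering the kill fraction until the factor exceeds 1 flips the outcome, which is the intuition behind tuning "strength and time interval" analytically rather than empirically.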
Johnson, Victoria A; Ronan, Kevin R; Johnston, David M; Peace, Robin
2016-11-01
A main weakness in the evaluation of disaster education programs for children is evaluators' propensity to judge program effectiveness based on changes in children's knowledge. Few studies have articulated an explicit program theory of how children's education would achieve desired outcomes and impacts related to disaster risk reduction in households and communities. This article describes the advantages of constructing program theory models for the purpose of evaluating disaster education programs for children. Following a review of some potential frameworks for program theory development, including the logic model, the program theory matrix, and the stage step model, the article provides working examples of these frameworks. The first example is the development of a program theory matrix used in an evaluation of ShakeOut, an earthquake drill practiced in two Washington State school districts. The model illustrates a theory of action; specifically, the effectiveness of school earthquake drills in preventing injuries and deaths during disasters. The second example is the development of a stage step model used for a process evaluation of What's the Plan Stan?, a voluntary teaching resource distributed to all New Zealand primary schools for curricular integration of disaster education. The model illustrates a theory of use; specifically, expanding the reach of disaster education for children through increased promotion of the resource. The process of developing the program theory models for the purpose of evaluation planning is discussed, as well as the advantages and shortcomings of the theory-based approaches.
Sun, Songsong; Yu, Xiaoli; Liu, Zhentao; Chen, Xiaoping
2016-01-01
For critical engine parts such as the crankshaft, the fatigue limit load is one of the most important parameters involved in the design and manufacturing stage. In previous engineering applications, this parameter has always been obtained by experiment, which is expensive and time-consuming. This paper, based on the theory of critical distance (TCD), first analyzes the stress distribution of a crankshaft under its limit load; in this way, the length of the critical distance can be obtained. Then a certain load is applied to a new crankshaft made of the same material and the effective stress is calculated based on the critical distance above. Finally, the fatigue limit load of the new crankshaft can be obtained by comparing the effective stress with the fatigue limit of the material. Comparison between the prediction and the corresponding experimental data shows that the traditional TCD may result in larger errors on some occasions, while the modified TCD proposed in this paper provides a more satisfactory result in terms of the fatigue limit for a quick engineering prediction.
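The paper's modified TCD is not detailed in the abstract, but the classical point-method TCD it builds on is standard: the critical distance follows from the threshold stress-intensity-factor range and the plain-specimen fatigue limit, and the effective stress is read at half that distance ahead of the stress raiser. A hedged sketch with made-up material data and a hypothetical notch-root stress field:

```python
import math

def critical_distance(dK_th, dSigma0):
    """El Haddad / Taylor material length L = (1/pi) * (dK_th / dSigma0)**2."""
    return (dK_th / dSigma0) ** 2 / math.pi

def point_method_stress(sigma, L):
    """Point method of the classical TCD: stress evaluated at x = L/2."""
    return sigma(L / 2.0)

# Illustrative material data (not from the paper)
dK_th = 7.0e6       # threshold SIF range, Pa*sqrt(m)
dSigma0 = 400.0e6   # plain-specimen fatigue limit range, Pa
L = critical_distance(dK_th, dSigma0)
print(L * 1e3)      # critical distance in mm, ≈ 0.097

# A generic decaying stress field ahead of the notch root (hypothetical)
sigma = lambda x: 500.0e6 / math.sqrt(1.0 + x / 50e-6)
s_eff = point_method_stress(sigma, L)
print(s_eff < dSigma0)   # effective stress below the fatigue limit -> predicted safe
```

In the paper's workflow, σ(x) would come from the FE stress distribution of the crankshaft under the applied load, and the limit load is scaled until s_eff equals the material fatigue limit.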
Theory-Based Formative Research on an Anti-Cyberbullying Victimization Intervention Message.
Savage, Matthew W; Deiss, Douglas M; Roberto, Anthony J; Aboujaoude, Elias
2017-02-01
Cyberbullying is a common byproduct of the digital revolution with serious consequences to victims. Unfortunately, there is a dearth of empirically based methods to confront it. This study used social cognitive theory to design and test an intervention message aimed at persuading college students to abstain from retaliation, seek social support, save evidence, and notify authorities-important victim responses identified and recommended in previous research. Using a posttest-only control group design, this study tested the effectiveness of an intervention message in changing college students' perceived susceptibility to and perceived severity of cyberbullying as well as their self-efficacy, response efficacy, attitudes, and behavioral intentions toward each recommended response in future episodes of cyberbullying. Results indicated that the intervention message caused participants in the experimental condition to report significantly higher susceptibility, but not perceived severity, to cyberbullying than those in the control condition. The intervention message also caused expected changes in all outcomes except self-efficacy for not retaliating and in all outcomes for seeking social support, saving evidence, and notifying an authority. Implications for message design and future research supporting evidence-based anti-cyberbullying health communication campaigns are discussed.
Some considerations on the definition of risk based on concepts of systems theory and probability.
Andretta, Massimo
2014-07-01
The concept of risk has been applied in many modern science and technology fields. Despite its successes in many applicative fields, there is still not a well-established vision and universally accepted definition of the principles and fundamental concepts of the risk assessment discipline. As emphasized recently, the risk fields suffer from a lack of clarity on their scientific bases that can define, in a unique theoretical framework, the general concepts in the different areas of application. The aim of this article is to make suggestions for another perspective of risk definition that could be applied and, in a certain sense, generalize some of the previously known definitions (at least in the fields of technical and scientific applications). By drawing on my experience of risk assessment in different applicative situations (particularly in the risk estimation for major industrial accidents, and in the health and ecological risk assessment for contaminated sites), I would like to revise some general and foundational concepts of risk analysis in as consistent a manner as possible from the axiomatic/deductive point of view. My proposal is based on the fundamental concepts of the systems theory and of the probability. In this way, I try to frame, in a single, broad, and general theoretical context some fundamental concepts and principles applicable in many different fields of risk assessment. I hope that this article will contribute to the revitalization and stimulation of useful discussions and new insights into the key issues and theoretical foundations of risk assessment disciplines.
A global test for gene-gene interactions based on random matrix theory.
Frost, H Robert; Amos, Christopher I; Moore, Jason H
2016-12-01
Statistical interactions between markers of genetic variation, or gene-gene interactions, are believed to play an important role in the etiology of many multifactorial diseases and other complex phenotypes. Unfortunately, detecting gene-gene interactions is extremely challenging due to the large number of potential interactions and ambiguity regarding marker coding and interaction scale. For many data sets, there is insufficient statistical power to evaluate all candidate gene-gene interactions. In these cases, a global test for gene-gene interactions may be the best option. Global tests have much greater power relative to multiple individual interaction tests and can be used on subsets of the markers as an initial filter prior to testing for specific interactions. In this paper, we describe a novel global test for gene-gene interactions, the global epistasis test (GET), that is based on results from random matrix theory. As we show via simulation studies based on previously proposed models for common diseases including rheumatoid arthritis, type 2 diabetes, and breast cancer, our proposed GET method has superior performance characteristics relative to existing global gene-gene interaction tests. A glaucoma GWAS data set is used to demonstrate the practical utility of the GET method.
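The GET statistic itself is not given in the abstract, but the underlying random-matrix idea can be illustrated with a generic screen: compare the largest eigenvalue of the sample correlation matrix of the markers against the Marchenko-Pastur upper edge, which independent markers should not exceed by much. This is a stand-in for the flavour of the approach, not the paper's test:

```python
import numpy as np

def top_eig_vs_mp_edge(X):
    """Largest eigenvalue of the sample correlation matrix vs. the
    Marchenko-Pastur upper edge (1 + sqrt(p/n))**2.

    Exceeding the edge hints at dependence structure among the p markers.
    A generic random-matrix screen, not the paper's GET statistic.
    """
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    lam_max = np.linalg.eigvalsh(R)[-1]
    edge = (1.0 + np.sqrt(p / n)) ** 2
    return lam_max, edge

rng = np.random.default_rng(0)
n, p = 2000, 50

null = rng.standard_normal((n, p))          # independent markers
lam0, edge = top_eig_vs_mp_edge(null)

signal = rng.standard_normal((n, p))
factor = rng.standard_normal(n)
signal[:, :10] += 0.8 * factor[:, None]     # shared latent structure in 10 markers
lam1, _ = top_eig_vs_mp_edge(signal)

print(lam0, lam1, edge)   # null stays near the edge; the structured data exceed it
```

A calibrated test would replace the raw edge comparison with the Tracy-Widom distribution of the edge fluctuations; the point here is only that global dependence lifts the top eigenvalue out of the bulk predicted by random matrix theory.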
Kirchner, Tom
2013-05-01
Ion-impact induced ionization and fragmentation of complex molecules have important applications in many branches of science. If the molecule is H2O an obvious topic to address is the radiobiological relevance of these processes, e.g. in the context of hadron therapy, to name just one example. From a more fundamental physics viewpoint ion-molecule collision systems constitute interesting many-body systems, whose analysis poses challenges to both experimentalists and theorists. This talk will describe a theoretical approach to ion-molecule collisions, which is based on density functional theory to describe the nonperturbative electron dynamics. The basis generator method applied in the past successfully to ion-atom collisions is adapted to deal with the multi-center problem one faces when one considers molecular targets. Cross sections for single- and multiple-electron processes (capture and transfer to the continuum) are obtained directly from solving time-dependent Kohn-Sham-type orbital equations and using a Slater determinant based analysis. Fragmentation yields are predicted on the basis of a semi-phenomenological model which uses the calculated cross sections as input. Results will be presented for various ions impacting on water molecules in the energy range of 10-5000 keV/amu and compared with experimental data and previous theoretical calculations where available. First applications of the model to collisions involving CH4 molecules will also be discussed. This work has been supported by SHARCNET and NSERC Canada.
Sandler, Jen
2007-12-01
This paper critically reviews two diverse intellectual traditions concerned with community-based interventions: the literature on dissemination of community interventions and the critical psychology literature that is concerned with systemic power inequalities and structural injustice. The dominant dissemination-of-innovations framework has shifted toward an emphasis on community, yet it does not generally take into account issues of power and inequality within the diverse community spheres into which interventions are disseminated. On the other hand, critical psychologists, who have concerned themselves with both understanding and addressing issues of power and structural injustice, have tended to eschew the possibility of standardizing and making transferable practices, programs, and even processes that address these issues in particular settings. This paper traces and critiques both sides of this divide within community psychology, positing a framework to bring these diverse intellectual resources together so that community interventions might fruitfully be examined in terms of their community-based practices, or practices that bear on structural injustice. This framework is illustrated with a case study of the community-based practices of a widely disseminated evidence-based community intervention.
ANALYTICAL RELATIONS BETWEEN EIGENVALUES OF CIRCULAR PLATE BASED ON VARIOUS PLATE THEORIES
Anonymous
2006-01-01
Based on the mathematical similarity of the axisymmetric eigenvalue problems of a circular plate between the classical plate theory (CPT), the first-order shear deformation plate theory (FPT) and Reddy's third-order shear deformation plate theory (RPT), analytical relations between the eigenvalues of a circular plate based on the various plate theories are investigated. In the present paper, the eigenvalue problem is transformed into solving an algebraic equation. Analytical relationships between the various theories are presented explicitly. Therefore, from these relationships one can easily obtain the exact RPT and FPT solutions of the critical buckling load and natural frequency for a circular plate from the CPT solutions. The relationships are useful for engineering application, and can be used to check the validity, convergence and accuracy of numerical results for the eigenvalue problem of plates.
2009-01-01
Seezink, A., Poell, R. F., & Kirschner, P. A. (2009). Teachers' individual action theories about competence-based education: The value of the cognitive apprenticeship model. Journal of Vocational Education & Training, 61, 203-215.
The theory and practice of 100 pilot SHP-based rural electrification counties in China
Luo Gaorong [Organization of the United Nations, Beijing (China). International Centre of Small Hydroelectric Power Plants]
1995-07-01
This document presents the theory and practice of 100 pilot small hydroelectric power plants (SHP) based rural electrification counties in China. The document reports the research contents, methodology and results, and the pilot benefit analysis and evaluation.
Simplified theory of plastic zones based on Zarka's method
Hübel, Hartwig
2017-01-01
The present book provides a new method to estimate elastic-plastic strains via a series of linear elastic analyses. For a life prediction of structures subjected to variable loads, frequently encountered in mechanical and civil engineering, the cyclically accumulated deformation and the elastic-plastic strain ranges are required. The Simplified Theory of Plastic Zones (STPZ) is a direct method which provides estimates of these and all other mechanical quantities in the state of elastic and plastic shakedown. The STPZ is described in detail, so that not only scientists but also engineers working in applied fields and advanced students can get an idea of its possibilities and limitations. Numerous illustrations and examples are provided to support the reader's understanding.
Emitter Design and Numerical Simulation Based on the Extenics Theory
Jiang Fan
2014-05-01
In order to improve the performance of the emitter, extenics theory is introduced: its divergent thinking is used to resolve the conflict between anti-clogging and energy dissipation, and a new structure is proposed. Wide triangular areas are designed to reduce the flow rate behind each orifice and to ease the precipitation of impurities. The orifices are set to gradually decrease the kinetic energy of the water, and the flow channel is designed to be demountable. Numerical simulation is used to analyze the internal flow field of the emitter; the flow-field results show that the improved emitter has a strong energy-dissipation and anti-clogging effect. At the same time, the structure of the emitter is optimized, with L1 = 31 mm, L2 = 21 mm, L3 = 8 mm and L4 = 5 mm as the optimized dimensions.
Short-term Power Load Forecasting Based on Gray Theory
Cui Herui
2013-11-01
Power load forecasting provides the basis for power planning; accurate short-term power load forecasting in particular makes it possible to formulate reliable and timely load-rationing programs for an area and so maintain normal production and life. This article describes the grey prediction method and improves the GM(1,1) model by smoothing the original data sequence, using a correction model to amend the parameter values, adding a residual model, and applying the idea of metabolism (rolling model updates). An empirical analysis of the 10 kV main cable of the Guigang Power Supply Bureau in Nan Ping is conducted, and the limitations of ordinary grey theory are verified. The improved grey model has higher prediction accuracy than the conventional GM(1,1) model.
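The baseline GM(1,1) model that the abstract improves upon is well defined: accumulate the series, fit the grey differential equation by least squares, evaluate the exponential time-response function, and difference it back. A minimal sketch (the improvements described above — smoothing, parameter correction, residual model, metabolism — are not included):

```python
import numpy as np

def gm11_forecast(x0, horizon):
    """Fit a GM(1,1) grey model to the series x0 and forecast `horizon` steps ahead."""
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                    # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])         # mean sequence of consecutive AGO values
    B = np.column_stack([-z1, np.ones(n - 1)])
    y = x0[1:]
    a, b = np.linalg.lstsq(B, y, rcond=None)[0]  # develop coefficient a, grey input b
    k = np.arange(n + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # time-response function
    x0_hat = np.empty_like(x1_hat)
    x0_hat[0] = x1_hat[0]
    x0_hat[1:] = np.diff(x1_hat)          # inverse AGO restores the original scale
    return x0_hat[n:]

# Roughly exponential toy load series (made up): 2 * 1.1**k
pred = gm11_forecast([2, 2.2, 2.42, 2.662, 2.9282], horizon=1)
print(pred)   # ≈ [3.21], close to the true next value 3.221
```

GM(1,1) assumes quasi-exponential behaviour of the accumulated series, which is why the residual and metabolism corrections mentioned in the abstract matter for real load data.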
A queueing theory based model for business continuity in hospitals.
Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R
2013-01-01
Clinical activities can be seen as the result of a precise and defined succession of events, where every single phase is characterized by a waiting time which includes working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough; a risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. The following paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queueing theory model, for defining the proper number of devices essential to guarantee medical activity and comply with business continuity management requirements in hospitals.
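The paper's integrated model is not reproduced here, but the queueing-theory ingredient for sizing a device fleet can be sketched with the standard M/M/c Erlang C formula: choose the smallest number of devices that keeps the probability of a clinical activity having to wait under a target. The offered load and target below are hypothetical:

```python
import math

def erlang_c(c, a):
    """Probability an arriving job waits in an M/M/c queue, offered load a = lambda/mu."""
    if a >= c:
        return 1.0                     # unstable regime: everyone waits
    inv = a ** c / (math.factorial(c) * (1.0 - a / c))
    s = sum(a ** k / math.factorial(k) for k in range(c)) + inv
    return inv / s

def min_devices(a, p_wait_max):
    """Smallest device count keeping the waiting probability under p_wait_max."""
    c = max(1, math.ceil(a))
    while erlang_c(c, a) > p_wait_max:
        c += 1
    return c

# Hypothetical figures: 2 erlangs of demand, at most 20% of activities may wait
print(min_devices(a=2.0, p_wait_max=0.2))   # -> 4
```

Note that the answer exceeds the naive working-load sizing (2 devices for 2 erlangs), which is exactly the abstract's point: planning by load alone ignores queueing delay and failure risk.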
Thermal rectification based on phonon hydrodynamics and thermomass theory
Dong Yuan
2016-06-01
The thermal diode is the fundamental device for phononics. There are various mechanisms for thermal rectification, e.g. different temperature-dependent thermal conductivities of the two ends, asymmetric interfacial resistance, and nonlocal behavior of phonon transport in asymmetric structures. The phonon hydrodynamics and thermomass theories treat heat conduction from a fluidic viewpoint: the phonon gas flowing through the medium is characterized by a momentum balance equation, like the Navier-Stokes equation in fluid mechanics. The generalized heat conduction law thereby contains a spatial acceleration (convection) term and a viscous (Laplacian) term. The viscous term predicts size-dependent thermal conductivity, and rectification appears due to the MFP supersession of phonons. The convection term also predicts rectification because of the inertia effect, like a gas passing through a nozzle or diffuser.
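The "convection" and "viscous" (Laplacian) terms the abstract refers to appear in Guyer-Krumhansl-type generalized heat-conduction laws; a schematic form is the following (the coefficients and the exact shape of the convective contribution differ between the phonon-hydrodynamic and thermomass formulations):

```latex
% Generalized heat-flux balance: relaxation + Fourier + viscous terms
% (schematic Guyer-Krumhansl form; not the paper's specific equation)
\tau_R \,\frac{\partial \mathbf{q}}{\partial t}
  + \mathbf{q}
  = -\kappa \nabla T
  + \ell^{2} \left( \nabla^{2} \mathbf{q} + 2\,\nabla\!\left(\nabla \cdot \mathbf{q}\right) \right)
```

Here τ_R is the resistive relaxation time and ℓ a phonon mean-free-path scale; the ℓ² (viscous) term is what makes the effective conductivity size-dependent, as the abstract notes.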
Design of the Soymilk Mill based on TRIZ Theory
Jiang Fan
2013-05-01
The soymilk mill is an important food machine, but its volume is too large for household use. This study first analyzes some problems in miniaturizing the soymilk mill. For these problems, the thinking tools, evolutionary tools and contradiction-solving tools of TRIZ theory are used to resolve the conflicts in integrating grinding and boiling while keeping the grinding effect, and to tackle the optimization of the grinding-stria structure and mill-plate speed; the Dwarfs method and the substance-field analysis model are then used to solve the interference and water-supply problems encountered in the design of the overall structure, and a micro soymilk mill is designed. Finally, mechanical analysis models of the soy granules and soymilk particles in the grinding zone and a computational model of the motor starting torque are obtained; they provide reference data for the application of the soymilk mill.
Dynamic Simulation of Backward Diffusion Based on Random Walk Theory
Dung, Vu Ba; Nguyen, Bui Huu
2016-06-01
Results of diffusion studies in silicon showed that diffusion of the self-interstitial and the vacancy can be backward diffusion and that their diffusivity can be negative [1]. Backward diffusion and negative diffusivity are contrary to the fundamental laws of diffusion such as Fick's law: the diffusive flux of backward diffusion goes from regions of low concentration to regions of high concentration. The backward diffusion process has been explained [2]. In this paper, the backward diffusion process is simulated. The results correspond to theory and show that when the thermal velocity in the low-concentration area is greater than the thermal velocity in the high-concentration area, backward diffusion can occur.
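The paper's stochastic simulation is not reproduced; a deterministic master-equation version of the same random walk shows the effect. With a site-dependent hop rate (a proxy for local thermal velocity), the steady state satisfies rate·concentration = const, so particles flow from the fast, hot region into the slow, cold one even after the latter's concentration is higher — backward diffusion in Fick's sense. The lattice size and rates below are arbitrary:

```python
import numpy as np

def relax(conc, rate, steps):
    """Deterministic random-walk (master-equation) relaxation on a 1-D lattice.

    Each step, site i sends rate[i]*conc[i]/2 to each neighbour; the ends are
    reflecting.  The steady state satisfies rate[i]*conc[i] = const, so the
    concentration piles up where the hop rate (thermal velocity) is low.
    """
    c = conc.astype(float).copy()
    for _ in range(steps):
        out = rate * c                 # amount leaving each site this step
        c -= out
        c[:-1] += 0.5 * out[1:]       # half of each outflow goes left
        c[1:] += 0.5 * out[:-1]       # half goes right
        c[0] += 0.5 * out[0]          # reflecting walls bounce the edge halves back
        c[-1] += 0.5 * out[-1]
    return c

n = 20
rate = np.where(np.arange(n) < n // 2, 0.8, 0.2)   # hot/fast left half, cold/slow right
c = relax(np.ones(n), rate, steps=20000)
print(c[:n // 2].mean() < c[n // 2:].mean())        # particles accumulate in the slow half
```

Starting from a uniform profile, the net flux is from the left half into the right half and persists while the right half's concentration exceeds the left's, which is the low-to-high-concentration flux the abstract describes.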
HRM Model in Tourism, Based on Dialectical Systems Theory
Simona Šarotar Žižek
2015-12-01
A human resources management (HRM) model integrating trends in HRM with trends in tourism into a dialectical system by the Dialectical Systems Theory (DST). The HRM strategy, integrated within the tourism organization's (TO's) strategy, is implemented through functional strategies helping their users to achieve a requisitely holistic (RH) HRM strategy replacing the prevailing one-sided ones. The TO's strategy covers employees': (1) planning; (2) acquisition and selection; (3) development and training; (4) diversity management; (5) teamwork and creativity; (6) motivation and rewarding; (7) stress reduction and health; (8) relationships; (9) personal holism; (10) well-being; (11) work and results assessment; etc. Everyone matters; their synergy is crucial. An innovated HRM model for TOs is presented, which applies employees' and organizations' requisite holism and integrates new knowledge about HRM. HRM belongs among central managers' tools, and it must be adapted for TOs, where employees are crucial.
Geometry-based density functional theory an overview
Schmidt, M
2003-01-01
An overview of recent developments and applications of a specific density functional approach that originates from Rosenfeld's fundamental measure theory for hard spheres is given. Model systems that were treated include penetrable spheres that interact with a step function pair potential, the Widom-Rowlinson model, the Asakura-Oosawa colloid-polymer mixture, ternary mixtures of spheres, needles, and globular polymers, hard-body amphiphilic mixtures, fluids in porous media, and random sequential adsorption that describes non-equilibrium processes such as colloidal deposition and random car parking. In these systems various physical phenomena were studied, such as correlations in liquids, freezing and demixing phase behaviour, the properties of fluid interfaces with and without orientational order, and wetting and layering phenomena at walls.
Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander
2010-01-01
Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analysis by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single layer theories often results in inaccuracies near the boundaries and when the lamina have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as the refined zigzag theory.
Molar conductivity calculation of Li-ion battery electrolyte based on mode coupling theory
Pu, Weihua; He, Xiangming; Lu, Jiufang; Jiang, Changyin; Wan, Chunrong
2005-12-01
A method is proposed to calculate molar conductivity based on mode coupling theory in which the ion transference number is introduced into the theory. The molar conductivities of LiPF6, LiClO4, LiBF4, LiAsF6 in PC (propylene carbonate) are calculated based on this method. The results fit well to the literature data. This presents a potential way to calculate the conductivities of Li-ion battery electrolytes.
Zu-Tong Wang; Jian-Sheng Guo; Ming-Fa Zheng; Ying Wang
2014-01-01
Based on credibility theory, this paper is devoted to the fuzzy multiobjective programming (MOP) problem. Firstly, the expected-value model of the fuzzy multiobjective programming problem is provided based on credibility theory; then two new approaches for obtaining efficient solutions are proposed on the basis of the expected-value model, whose validity has been proven. For solving the fuzzy MOP problem efficiently, Latin hypercube sampling, fuzzy simulation, support vector machine, an...
Applying critical chain buffer management theory in location-based management
Büchmann-Slorup, Rolf
2014-01-01
Guidelines for prioritizing buffers on location-based management (LBM) projects are established through the use of critical chain theory (CCT). Buffer management theory in LBM has gained little attention from the research community. CCT builds on the assumption that each task is, either consciously or unconsciously, given a certain time buffer with which to cope with unpredicted events, and that these buffers become a large part of the project lead time. However, CCT suggests that these buffers entail inherent waste within schedules and fail to protect both critical activities and projects. The guidelines proposed are based on critical chain buffer management theory.
Stanton, B F; Fitzgerald, A M; Li, X; Shipena, H; Ricardo, I B; Galbraith, J S; Terreri, N; Strijdom, J; Hangula-Ndlovu, V; Kahihuata, J
1999-04-01
Considerable progress has been made in the United States and Europe regarding HIV risk prevention efforts targeting adolescents. However, in Africa less progress has been made to date. This article addresses three questions: Can risk assessment questionnaires developed in Western countries be modified so as to be appropriate for use in African countries? Are social cognitive models appropriate in African settings? Does covariation among risk behaviors occur among youth residing in African countries? The data were obtained from a cross-sectional survey conducted among 922 youth ages 12 to 18 years living in school-based hostels in Namibia. Data were collected using a theory-based risk assessment questionnaire. One third of the youth were sexually experienced, three quarters of whom had engaged in sexual intercourse in the previous 6 months. Over one third of these youth had had more than one sexual partner in the previous 6 months, and over one half had not used a condom at the last episode of intercourse. The psychometric properties of the questionnaire and the relationship between perceptions and behaviors provide evidence that theory-based questionnaires developed in Western countries can be modified for use in different cultural settings. The data also provide strong evidence of covariation between risk behaviors among Namibian youth.
Xiong, Xiao-Gen; Yanai, Takeshi
2017-07-11
The Projector Augmented Wave (PAW) method developed by Blöchl is well recognized as an efficient, accurate pseudopotential approach in solid-state density functional theory (DFT) calculations with the plane-wave basis. Here we present an approach to incorporate the PAW method into the Gauss-type function (GTF) based DFT implementation, which is widely used for molecular quantum chemistry calculations. The nodal and high-exponent GTF components of valence molecular orbitals (MOs) are removed or pseudized by the ultrasoft PAW treatment, while the formalism retains a transparent prescription for constructing an accurate and well-controlled pseudopotential from the all-electron atomic description and for reconstructing an all-electron form of the valence MOs from the pseudo MOs. The smoothness of the pseudo MOs should benefit the efficiency of GTF-based DFT calculations in terms of elimination of high-exponent primitive GTFs and reduction of grid points in the numerical quadrature. The processes of the PAW method are divided into basis-independent and basis-dependent parts. The former is carried out using the previously developed PAW libraries libpaw and atompaw. The present scheme is implemented by incorporating libpaw into the conventional GTF-based DFT solver. The details of the formulations and implementations of GTF-related PAW procedures are presented. Test calculations are shown to illustrate the performance. With the near-complete GTF basis at the cc-pVQZ level, the total energies obtained using our PAW method with suited frozen core treatments converge to those with the conventional all-electron GTF-based method with a rather small absolute error.
GENERALIZED LANDSCAPE THEORY: AGENT-BASED APPROACH TO ALLIANCE FORMATIONS IN CIVIL AVIATION INDUSTRY
Kyoichi Kijima
2001-01-01
The purpose of this paper is to generalize Landscape theory proposed by R. Axelrod and, then, to apply it to the civil aviation industry for simulating alliance formations in it. Landscape theory provides a well-known agent-based simulation model for analyzing alliance (or coalition) formation processes. When a set N of agents or autonomous decision makers is given, the theory assumes that each agent tries to make a coalition in such a way that the resulting alliance minimizes its frustration. The theory is essentially based on two premises. One is that propensity is symmetric, i.e., the propensity of agent i toward j is exactly the same as that of j toward i for any agents i and j in N. The other is that the number of alliances is restricted to two, i.e., at any moment N is partitioned into two parties. Though the two basic premises underpin the theory and make the model simple and operational, they do not always reflect alliance formation processes in a realistic way. The generalized Landscape theory that this paper proposes removes them and allows asymmetric propensity and the existence of any number of alliances. Since the premises are essential for the model, the generalization requires a drastic reconstruction of the whole idea of the theory. Finally, we analyze a real alliance formation process in the civil aviation industry. This analysis provides interesting insights about the industry as well as some validation of our generalized Landscape theory.
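A minimal sketch of the generalized setting described above: asymmetric propensities and an arbitrary number of alliances, with each agent greedily moving to the alliance that minimizes its own frustration. The frustration definition and the greedy dynamics here are our simplified stand-ins for the paper's model, not its actual equations.

```python
def frustration(i, assign, P):
    """Frustration of agent i: positive propensity toward separated agents
    and negative propensity toward co-members both add to frustration."""
    return sum(P[i][j] if assign[i] != assign[j] else -P[i][j]
               for j in range(len(P)) if j != i)

def greedy_alliances(P, n_alliances, start):
    """Each agent repeatedly moves to its least-frustrating alliance
    until no agent wants to move (a local minimum of the landscape)."""
    assign = list(start)
    moved = True
    while moved:
        moved = False
        for i in range(len(P)):
            def f(a, i=i):
                old, assign[i] = assign[i], a
                val = frustration(i, assign, P)
                assign[i] = old
                return val
            best = min(range(n_alliances), key=f)
            if f(best) < f(assign[i]) - 1e-12:
                assign[i] = best
                moved = True
    return assign

# Asymmetric propensities (P[0][1] != P[1][0]): agents 0-1 and 2-3
# attract each other, cross-pairs repel
P = [[0, 2, -1, -1],
     [1, 0, -1, -1],
     [-1, -1, 0, 2],
     [-1, -1, 1.5, 0]]
alliances = greedy_alliances(P, 2, [0, 1, 0, 1])
```

Starting from a deliberately mixed assignment, the dynamics settle into the two mutually attracting pairs, which is the qualitative behavior the theory predicts.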
Gröbner bases in control theory and signal processing
Regensburger, Georg
2007-01-01
This volume contains survey and original articles presenting the state of the art on the application of Gröbner bases in control theory and signal processing. The contributions are based on talks delivered at the Special Semester on Gröbner Bases and Related Methods at the Johann Radon Institute of Computational and Applied Mathematics (RICAM), Linz, Austria, in May 2006.
Pay-based Screening Mechanism: Personnel Selection in the View of Economics Theory
刘帮成; 唐宁玉
2003-01-01
Based on economic theories, the paper studies personnel selection in an asymmetric job market using a signaling and screening model. The authors hold that an organization can screen candidates' signaling, based on the self-selection principle, by providing an appropriate compensation choice. A pay-based screening mechanism can thus identify qualified applicants and retain the excellent ones.
Community-Based Research: From Practice to Theory and Back Again.
Stoecker, Randy
2003-01-01
Explores the theoretical strands being combined in community-based research--charity service learning, social justice service learning, action research, and participatory research. Shows how different models of community-based research, based in different theories of society and different approaches to community work, may combine or conflict. (EV)
Gravitational Wave Spectrums from Pole-like Inflations based on Generalized Gravity Theories
Hwang, J
1998-01-01
We present a general and unified formulation which can handle the classical evolution and quantum generation processes of the cosmological gravitational wave in a broad class of generalized gravity theories. Applications are made in several inflation models based on the scalar-tensor theory, the induced gravity, and the low energy effective action of string theory. The gravitational wave power spectrums based on the vacuum expectation value of the quantized fluctuating metric during the pole-like inflation stages are derived in analytic forms. Assuming that the gravity theory transitions to the Einstein theory while the relevant scales remain in the superhorizon scale, we derive the consequent power spectrums and the directional fluctuations of the relic radiation produced by the gravitational wave. The spectrums seeded by the vacuum fluctuations in the pole-like inflation models based on the generalized gravity show a distinguished common feature which differs from the scale-invariant spectrum generated in an exponent…
Resolving defence mechanisms: A perspective based on dissipative structure theory.
Zhang, Wei; Guo, Ben-Yu
2017-04-01
Theories and classifications of defence mechanisms are not unified. This study addresses the psychological system as a dissipative structure which exchanges information with the external and internal world. When using defence mechanisms, the cognitive-affective schema of an individual could remain stable and ordered by excluding psychological entropy, obtaining psychological negentropy or by dissipating the energy of self-presentation. From this perspective, defences can be classified into three basic types: isolation, compensation and self-dissipation. However, not every kind of defence mechanisms can actually help the individual. Non-adaptive defences are just functioning as an effective strategy in the short run but can be a harmful approach in the long run, while adaptive defences could instead help the individual as a long-term mechanism. Thus, we would like to suggest that it is more useful for the individual to use more adaptive defence mechanisms and seek out social or interpersonal support when undergoing psychic difficulties. As this model of defences is theoretical at present, we therefore aim to support and enrich this viewpoint with empirical evidence.
Consolidity: Stack-based systems change pathway theory elaborated
Hassen Taher Dorrah
2014-06-01
This paper presents an elaborated analysis for investigating the stack-based layering processes during the systems change pathway. The system change pathway is defined as the path resulting from the combinations of all successive changes induced on the system when subjected to varying environments, activities, events, or any excessive internal or external influences and happenings “on and above” its normal stands, situations or set-points during its course of life. The analysis is essentially based on the important overall system paradigm of “Time driven-event driven-parameters change”. Based on this paradigm, it is considered that any affected activity, event or varying environment is intelligently self-recorded inside the system through an incremental consolidity-scaled change in system parameters of the stack-based layering types. Various joint stack-based mathematical and graphical approaches supported by representable case studies are suggested for the identification, extraction, and processing of various stack-based systems changes layering of different classifications and categorizations. Moreover, some selected real life illustrative applications are provided to demonstrate the (infinite) stack-based identification and recognition of the change pathway process in the areas of geology, archeology, life sciences, ecology, environmental science, engineering, materials, medicine, biology, sociology, humanities, and other important fields. These case studies and selected applications revealed that there are general similarities of the stack-based layering structures and formations among all the various research fields. Such general similarities clearly demonstrate the global concept of the “fractals-general stacking behavior” of real life systems during their change pathways. Therefore, it is recommended that concentrated efforts should be expedited toward building generic modular stack-based systems or blocks for the mathematical…
Theory of Dynamic Diagnosis Based on Integrated Maintenance Information
Anonymous
2002-01-01
Based on the concept of an integrated maintenance information system and the related information environment, this paper discusses the process of troubleshooting in modern maintenance in detail, and gives a model of dynamic fault diagnosis, in which a reasoning program is designed by taking advantage of information fusion and time analysis. Finally, the authors present the logic process of dynamic diagnosis through a typical example and propose a dynamic diagnostic system based on information fusion.
Boundary based on exchange symmetry theory for multilevel simulations. I. Basic theory.
Shiga, Motoyuki; Masia, Marco
2013-07-28
In this paper, we lay the foundations for a new method that allows multilevel simulations of a diffusive system, i.e., a system where a flux of particles through the boundaries might disrupt the primary region. The method is based on the use of flexible restraints that maintain the separation between inner and outer particles. It is shown that, by introducing a bias potential that accounts for the exchange symmetry of the system, the correct statistical distribution is preserved. Using a toy model consisting of non-interacting particles in an asymmetric potential well, we prove that the method is formally exact, and that it could be simplified by considering only up to a couple of particle exchanges without a loss of accuracy. A real-world test is then made by considering a hybrid MM(∗)/MM calculation of cesium ion in water. In this case, the single exchange approximation is sound enough that the results superimpose to the exact solutions. Potential applications of this method to many different hybrid QM/MM systems are discussed, as well as its limitations and strengths in comparison to existing approaches.
Using genetic algorithm based fuzzy adaptive resonance theory for clustering analysis
LIU Bo; WANG Yong; WANG Hong-jian
2006-01-01
In the clustering applications field, the fuzzy adaptive resonance theory system has been widely applied. However, three parameters of fuzzy adaptive resonance theory need to be adjusted manually to obtain good clustering; manual tuning takes much time and does not guarantee the best result. Genetic algorithm is an optimal mathematical search technique based on the principles of natural selection and genetic recombination. To automate the choice of the fuzzy adaptive resonance theory parameters, an approach incorporating a genetic algorithm and a fuzzy adaptive resonance theory neural network has been applied, so that the best clustering result can be obtained. Experiments show that the most appropriate parameters of fuzzy adaptive resonance theory can be found effectively by this approach.
Theory of fractional order elements based impedance matching networks
Radwan, Ahmed G.
2011-03-01
Fractional order circuit elements (inductors and capacitors) based impedance matching networks are introduced for the first time. In comparison to the conventional integer-order L-type matching networks, fractional matching networks are much simpler and more versatile. Any complex load can be matched utilizing a single series fractional element, which generally requires two elements for matching in the conventional approach. It is shown that all the Smith chart circles (resistance and reactance) are actually pairs of completely identical circles. They appear to be single for the conventional integer order case, where the identical circles completely overlap each other. The concept is supported by design equations and impedance matching examples. © 2010 IEEE.
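The single-series-element idea has a simple closed form: the matching condition Z_load + F(jω)^α = Z0 fixes the order α from the phase of the required impedance and the coefficient F from its magnitude. The sketch below is our minimal illustration of that condition, not the paper's design equations; the function name and the numeric example are ours.

```python
import math

def series_fractional_match(z_load, z0, omega):
    """Solve Z_load + F*(j*omega)**alpha = Z0 (real) for a single
    series fractional element with impedance F*(j*omega)**alpha."""
    # Impedance the fractional element must supply
    target = complex(z0, 0.0) - z_load
    # (j*omega)**alpha has phase alpha*pi/2, so alpha follows directly
    # from the phase of the target impedance
    alpha = (2.0 / math.pi) * math.atan2(target.imag, target.real)
    coeff = abs(target) / omega ** alpha
    return coeff, alpha

# Inductive load 30 + j40 ohm matched to 50 ohm at omega = 1e6 rad/s:
# alpha comes out negative, i.e. one fractional capacitive element suffices
F, alpha = series_fractional_match(30 + 40j, 50.0, 1e6)
z_total = (30 + 40j) + F * (1j * 1e6) ** alpha
```

With these values alpha is about -0.70 and z_total returns to 50 ohm, whereas an integer-order L-network would need two elements for the same load.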
Research on Maintainability Evaluation Model Based on Fuzzy Theory
Anonymous
2007-01-01
Maintainability influencing attributes are analyzed, methods for calculating their weights and values are given, and a fuzzy maintainability evaluation method based on relative closeness is proposed. Based on maintenance task simulation operated in a virtual environment, the maintainability virtual evaluation model is built by analyzing the maintenance task for each replaceable unit of the product. Finally, a case study is given based upon the main landing gear system of a certain type of civil aircraft, and the result indicates that the model is suitable for qualitative maintainability evaluation and can support maintainability concurrent design.
A new method of single celestial-body sun positioning based on theory of mechanisms
Zhang Wei; Xu Xiaofeng; Wu Yuanzhe
2016-01-01
Considering defects of current single celestial-body positioning methods such as discontinuity and long period, a new sun positioning algorithm is herein put forward. Instead of traditional astronomical spherical trigonometry and celestial coordinate system, the proposed new positioning algorithm is built by theory of mechanisms. Based on previously derived solar vector equations (from a C1R2P2 series mechanism), a further global positioning method is developed by inverse kinematics. The longitude and latitude coordinates expressed by Greenwich mean time (GMT) and solar vector in local coordinate system are formulated. Meanwhile, elimination method of multiple solutions, errors of longitude and latitude calculation are given. In addition, this algorithm has been integrated successfully into a mobile phone application to visualize sun positioning process. Results of theoretical verification and smart phone's test demonstrate the validity of presented coordinate's expressions. Precision is shown as equivalent to current works and is acceptable to civil aviation requirement. This new method solves long-period problem in sun sight running fixing and improves applicability of sun positioning. Its methodology can inspire development of new sun positioning device. It would be more applicable to be combined with inertial navigation systems for overcoming discontinuity of celestial navigation systems and accumulative errors of inertial navigation systems.
Lu, Chao; Zheng, Yefeng; Birkbeck, Neil; Zhang, Jingdan; Kohlberger, Timo; Tietjen, Christian; Boettger, Thomas; Duncan, James S; Zhou, S Kevin
2012-01-01
In this paper, we present a novel method by incorporating information theory into the learning-based approach for automatic and accurate pelvic organ segmentation (including the prostate, bladder and rectum). We target 3D CT volumes that are generated using different scanning protocols (e.g., contrast and non-contrast, with and without implant in the prostate, various resolutions and positions), and the volumes come from largely diverse sources (e.g., diseased in different organs). Three key ingredients are combined to solve this challenging segmentation problem. First, marginal space learning (MSL) is applied to efficiently and effectively localize the multiple organs in the largely diverse CT volumes. Second, learning techniques with steerable features are applied for robust boundary detection. This enables the handling of highly heterogeneous texture patterns. Third, a novel information theoretic scheme is incorporated into the boundary inference process. The incorporation of the Jensen-Shannon divergence further drives the mesh to the best fit of the image, thus improving the segmentation performance. The proposed approach is tested on a challenging dataset containing 188 volumes from diverse sources. Our approach not only produces excellent segmentation accuracy, but also runs about eighty times faster than previous state-of-the-art solutions. The proposed method can be applied to CT images to provide visual guidance to physicians during computer-aided diagnosis, treatment planning and image-guided radiotherapy to treat cancers in the pelvic region.
Competency-based medical education: theory to practice
Frank, Jason R.; Snell, Linda S.; Ten Cate, Olle; Holmboe, Eric S.; Carraccio, Carol; Swing, Susan R.; Harris, Peter; Glasgow, Nicholas J.; Campbell, Craig; Dath, Deepak; Harden, Ronald M.; Iobst, William; Long, Donlin M.; Mungroo, Rani; Richardson, Denyse L.; Sherbino, Jonathan; Silver, Ivan; Taber, Sarah; Talbot, Martin; Harris, Kenneth A.
2010-01-01
Although competency-based medical education (CBME) has attracted renewed interest in recent years among educators and policy-makers in the health care professions, there is little agreement on many aspects of this paradigm. We convened a unique partnership - the International CBME Collaborators - to…
A Conceptual Framework Based on Activity Theory for Mobile CSCL
Zurita, Gustavo; Nussbaum, Miguel
2007-01-01
There is a need for collaborative group activities that promote student social interaction in the classroom. Handheld computers interconnected by a wireless network allow people who work on a common task to interact face to face while maintaining the mediation afforded by a technology-based system. Wirelessly interconnected handhelds open up new…
"Theory repositories" via the web for problem-based learning
van der Veen, Johan; van Riemsdijk, Maarten; Jones, Valerie M.; Collis, Betty
2000-01-01
This paper describes a series of experiments conducted at the School of Management Studies at the University of Twente designed to improve students' concentration on the theoretical study materials in a particular course. In 1997 a problem-based learning approach was introduced into a course on orga…
Structured-light Image Compression Based on Fractal Theory
Anonymous
2002-01-01
A fractal image compression method is introduced and applied to compress line structured-light images. Based on the self-similarity of the structured-light image, we attain a satisfactory compression ratio and a high peak signal-to-noise ratio (PSNR). The experimental results indicate that this method can achieve high performance.
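The PSNR quoted above is the standard fidelity measure for lossy compression. A minimal sketch of the metric (our helper function, not the paper's code):

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB, assuming 8-bit pixel range."""
    mse = np.mean((np.asarray(original, float)
                   - np.asarray(reconstructed, float)) ** 2)
    if mse == 0.0:
        return float("inf")  # identical images: no distortion
    return 10.0 * np.log10(peak ** 2 / mse)
```

For example, a uniform reconstruction error of 16 grey levels out of 255 corresponds to a PSNR of roughly 24 dB.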
The Underdetermined Knowledge-Based Theory of the MNC
Fransson, Anders; Håkanson, Lars; W. Liesch, Peter
2011-01-01
…parties; and (2) that the threat of opportunism is not necessary, although it may be sufficient, to explain the existence of the MNC. Their knowledge-based view shifted the conceptualization of the firm from an institution arising from market failure and transaction costs economizing to a progeny…
Modeling powder encapsulation in dosator-based machines: I. Theory.
Khawam, Ammar
2011-12-15
Automatic encapsulation machines have two dosing principles: dosing disc and dosator. Dosator-based machines compress the powder to plugs that are transferred into capsules. The encapsulation process in dosator-based capsule machines was modeled in this work. A model was proposed to predict the weight and length of produced plugs. According to the model, the plug weight is a function of piston dimensions, powder-bed height, bulk powder density and precompression densification inside dosator while plug length is a function of piston height, set piston displacement, spring stiffness and powder compressibility. Powder densification within the dosator can be achieved by precompression, compression or both. Precompression densification depends on the powder to piston height ratio while compression densification depends on piston displacement against powder. This article provides the theoretical basis of the encapsulation model, including applications and limitations. The model will be applied to experimental data separately.
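The stated dependencies (piston dimensions, powder-bed height, bulk density, precompression densification) can be collected into a toy calculation. The densification rule below, bed-to-piston height ratio as the precompression factor, is a hypothetical simplification for illustration only, not the article's model, and the function name and numbers are ours.

```python
import math

def plug_weight(piston_diameter, piston_height, bed_height, bulk_density):
    """Illustrative dosator plug-weight estimate (SI units).

    Assumes the chamber of height `piston_height` dips into a powder bed
    of height `bed_height`, so the powder inside is precompressed by
    roughly the bed-to-piston height ratio (hypothetical rule)."""
    area = math.pi * (piston_diameter / 2.0) ** 2
    precompression = max(bed_height / piston_height, 1.0)
    return bulk_density * area * piston_height * precompression

# 10 mm piston, 15 mm chamber, 30 mm bed, 500 kg/m^3 bulk density
w = plug_weight(0.010, 0.015, 0.030, 500.0)   # roughly 1.2 g
```

The toy model reproduces the qualitative behavior in the abstract: a deeper powder bed relative to the piston chamber yields a heavier, denser plug.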
Grid Computing based on Game Optimization Theory for Networks Scheduling
Peng-fei Zhang
2014-05-01
The resource sharing mechanism is introduced into a grid computing algorithm so as to solve complex computational tasks in heterogeneous network computing. In the grid environment, however, the resources available from the network must be reasonably scheduled and coordinated to obtain a good workflow, appropriate network performance and acceptable response time. In order to improve the performance of resource allocation and task scheduling in grid computing, a model based on a non-cooperative game is proposed. Setting the time and cost of users' resource allocation increases network performance, and an optimization scheduling algorithm minimizes the time and cost of resource scheduling. Simulation results show the feasibility and suitability of the model. In addition, the experimental results show that the model-based genetic algorithm is the best resource scheduling algorithm.
Resource-based theory and mergers & acquisitions success
Grill, Plina; Bresser, Rudi K. F.
2011-01-01
Mergers & acquisitions (M&A) are the most popular external growth strategies. While the number of M&A has been increasing during the past decades, on average, only the shareholders of target firms gain value during the acquisition process, while acquirers do not receive abnormal positive returns. This paper analyses the impact of strategically valuable resources on the success of M&A decisions. We test complementary resource-based hypotheses regarding the value of M&A for the shareholders of both…
Hybrid Fundamental Solution Based Finite Element Method: Theory and Applications
Changyong Cao; Qing-Hua Qin
2015-01-01
An overview on the development of the hybrid fundamental solution based finite element method (HFS-FEM) and its application in engineering problems is presented in this paper. The framework and formulations of HFS-FEM for potential problems, plane elasticity, three-dimensional elasticity, thermoelasticity, anisotropic elasticity, and plane piezoelectricity are presented. In this method, two independent assumed fields (intraelement field and auxiliary frame field) are employed. The formulations for...
Transient response of lattice structures based on exact member theory
Anderson, Melvin S.
1989-01-01
The computer program BUNVIS-RG, which treats vibration and buckling of lattice structures using exact member stiffness matrices, has been extended to calculate the exact modal mass and stiffness quantities that can be used in a conventional transient response analysis based on modes. The exact nature of the development allows inclusion of local member response without introduction of any interior member nodes. Results are given for several problems in which significant interaction between local and global response occurs.
Research on e-learning services based on ontology theory
Liu, Rui
2013-07-01
E-learning services can realize network learning resource sharing and interoperability, but they cannot realize automatic discovery, implementation and integration of services. This paper proposes a framework of e-learning services based on ontology; ontology technology is applied to the publication and discovery of e-learning services in order to realize their accurate and efficient retrieval and utilization.
INTERNET BANKING ACCEPTANCE IN MALAYSIA BASED ON THE THEORY OF REASONED ACTION
J Michael Pearson
2008-09-01
The theory of reasoned action, originally introduced in the field of social psychology, has been widely used to explain individuals' behaviour. The theory postulates that individuals' behaviour is influenced by their attitude and subjective norm. The purpose of this study was to determine factors that influence an individual's intention to use a technology based on the theory of reasoned action. We used Internet banking as the target technology and Malaysian subjects as the sampling frame. A principal component analysis was used to validate the constructs and multiple regressions were used to analyze the data. As expected, the results supported the theory's proposition that an individual's behavioural intention to use Internet banking is influenced by their attitude and subjective norm. Based on the findings, theoretical and practical implications are offered. Keywords: theory of reasoned action, Internet banking, technology acceptance
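The regression step described above, behavioural intention regressed on attitude and subjective norm, can be sketched with synthetic data. The "true" coefficients and the data below are invented for illustration; they are not the study's results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
attitude = rng.normal(size=n)
subjective_norm = rng.normal(size=n)
# Synthetic "true" model: intention driven by both predictors plus noise
intention = (0.6 * attitude + 0.3 * subjective_norm
             + rng.normal(scale=0.2, size=n))

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), attitude, subjective_norm])
coef, *_ = np.linalg.lstsq(X, intention, rcond=None)
# coef[1] and coef[2] recover roughly 0.6 and 0.3
```

Significant positive coefficients on both predictors are exactly the pattern the theory of reasoned action predicts for behavioural intention.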
Design-based research – issues in connecting theory, research and practice
Kolmos, Anette
2015-01-01
During the last 20 years, design-based research (DBR) has become a popular methodology for connecting educational theory, research and practice. The missing link between educational theory, research and educational practice is an ongoing issue, and DBR is seen as an integrated methodology to bridge… …indicate several key issues at both the scientific and personal level. Scientifically, the main issues are contribution to theory and the role of the researcher. At the personal level, it is an investment beyond normal research procedures to involve yourself as a researcher in curriculum change.
Exact Amplitude-Based Resummation in Quantum Field Theory: Recent Results
Ward, B F L
2012-01-01
We present the current status of the application of our approach of exact amplitude-based resummation in quantum field theory to two areas of investigation: precision QCD calculations of all three of us as needed for LHC physics and the resummed quantum gravity realization by one of us (B.F.L.W.) of Feynman's formulation of Einstein's theory of general relativity. We discuss recent results as they relate to experimental observations. There is reason for optimism in the attendant comparison of theory and experiment.
Microstructure-dependent piezoelectric beam based on modified strain gradient theory
Li, Y. S.; Feng, W. J.
2014-09-01
A microstructure-dependent piezoelectric beam model was developed using a variational formulation, which is based on the modified strain gradient theory and the Timoshenko beam theory. The new model contains three material length scale parameters and can capture the size effect, unlike the classical beam theory. To illustrate the new piezoelectric beam model, the static bending and the free vibration problems of a simply supported beam are numerically solved. These results may be useful in the analysis and design of smart structures that are constructed from piezoelectric materials.
New weighting factors assignment of evidence theory based on evidence distance
Chen Liangzhou; Shi Wenkang; Du Feng
2005-01-01
Evidence theory has been widely used in information fusion for its effectiveness in uncertainty reasoning. However, the classical DS evidence theory involves counter-intuitive behaviors when highly conflicting information exists. Based on the analysis of some modified methods, assigning weighting factors according to the intrinsic characteristics of the existing evidence sources is proposed, with the factors determined from evidence distance theory. In the numerical examples, the proposed method provides reasonable results with good convergence efficiency. In addition, the new rule reduces to Yager's formula when all the evidence sources completely contradict each other.
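A compact sketch of the distance-weighted idea, with mass functions as dicts keyed by frozensets over the frame of discernment. The Jousselme distance and credibility weighting used here are standard choices in this literature and may differ in detail from the paper's formulas.

```python
import math
from itertools import product

def dempster(m1, m2):
    """Dempster's rule: combine two basic probability assignments."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b          # mass assigned to the empty set
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

def jousselme(m1, m2):
    """Jousselme evidence distance between two mass functions."""
    focals = list(set(m1) | set(m2))
    d = [m1.get(A, 0.0) - m2.get(A, 0.0) for A in focals]
    J = [[len(A & B) / len(A | B) for B in focals] for A in focals]
    quad = sum(d[i] * J[i][j] * d[j]
               for i in range(len(focals)) for j in range(len(focals)))
    return math.sqrt(0.5 * max(quad, 0.0))

def weighted_combination(masses):
    """Weight each source by its support (closeness to the others),
    average the masses, then combine n-1 times with Dempster's rule."""
    n = len(masses)
    support = [sum(1.0 - jousselme(masses[i], masses[j])
                   for j in range(n) if j != i) for i in range(n)]
    total = sum(support)
    weights = [s / total for s in support]
    focals = set().union(*masses)
    avg = {A: sum(w * m.get(A, 0.0) for w, m in zip(weights, masses))
           for A in focals}
    result = avg
    for _ in range(n - 1):
        result = dempster(result, avg)
    return result

a, b, theta = frozenset("a"), frozenset("b"), frozenset("abc")
sources = [{a: 0.5, b: 0.2, theta: 0.3},
           {a: 0.6, frozenset("c"): 0.1, theta: 0.3},
           {a: 0.55, b: 0.10, theta: 0.35}]
fused = weighted_combination(sources)
```

Averaging before combining is what tames the counter-intuitive behavior: an outlying source receives a small weight instead of vetoing the consensus.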
A NEW QUADRILATERAL THIN PLATE ELEMENT BASED ON THE MEMBRANE-PLATE SIMILARITY THEORY
黄若煜; 郑长良; 钟万勰; 姚伟岸
2002-01-01
A new effective path has been proposed to formulate thin plate elements by using the similarity theory between plane elasticity and plate bending. Because the difficulty of C1 continuity is avoided, the construction of thin plate elements becomes easier. The similarity theory and its applications are discussed in more depth, and a new four-node, sixteen-DOF (degree of freedom) thin plate element is presented on the basis of the similarity theory. Numerical results for typical problems show that this new element can pass the patch test and has very good convergence and high precision.
Following Human Footsteps: Proposal of a Decision Theory Based on Human Behavior
Mahmud, Faisal
2011-01-01
Human behavior is complex in nature, depending on circumstances and on decisions that vary from time to time as well as from place to place. The way a decision is made is either directly or indirectly related to the availability of options. These options, though they appear random in nature, have a solid directional pattern for decision making. In this paper, a decision theory based on human behavior is proposed. The theory is structured with model sets that show all possible combinations for making a decision. A virtual, simulated environment is used to show the results of the proposed decision theory.
zahra asadi; bahman hajipour
2014-01-01
In today's competitive world, all market participants, from individuals to organizations, should be looking for ways to succeed in the market. For high-contact service providers, an important part of market participants, the secret to success is customers' compliance with and adherence to the providers' instructions and guidance. In this paper, a model of customer compliance based on Bandura's social-cognitive theory is provided. According to Bandura's social-cognitive theory and t...
Steering laws analysis of SGCMGs based on singular value decomposition theory
ZHANG Jing-rui
2008-01-01
The steering laws of single gimbal control moment gyros (SGCMGs) for a spacecraft attitude control system are analyzed and compared in this paper based on singular value decomposition (SVD) theory. The mechanism by which steering laws escape singularity, in particular how the steering laws affect the singularity of the gimbal configuration and the output torque error, is studied using SVD theory. The performance of various steering laws is analyzed and compared quantitatively by simulation. The obtained results can be used as a reference by designers.
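The abstract does not reproduce the mathematics; as a sketch, a standard SVD-related singularity measure for an SGCMG cluster is m = sqrt(det(J J^T)), the product of the singular values of the torque Jacobian J. The pyramid geometry, skew angle, and gimbal angles below are illustrative assumptions, not details from the paper:

```python
import math

def torque_jacobian(gimbal_angles, beta_deg=53.13):
    """Torque Jacobian of a pyramid-mounted 4-SGCMG cluster (assumed geometry).
    Column i is the torque direction produced by gimbal rate i; beta is the
    pyramid skew angle."""
    cb, sb = math.cos(math.radians(beta_deg)), math.sin(math.radians(beta_deg))
    J = [[0.0] * 4 for _ in range(3)]
    for i, d in enumerate(gimbal_angles):
        c, s = math.cos(d), math.sin(d)
        if i == 0:
            col = (-cb * c, s, sb * c)
        elif i == 1:
            col = (-s, -cb * c, sb * c)
        elif i == 2:
            col = (cb * c, -s, sb * c)
        else:
            col = (s, cb * c, sb * c)
        for r in range(3):
            J[r][i] = col[r]
    return J

def singularity_measure(J):
    """m = sqrt(det(J J^T)), equal to the product of the singular values of J;
    m -> 0 as the gimbal set approaches a singular configuration."""
    JJt = [[sum(J[r][k] * J[c][k] for k in range(4)) for c in range(3)]
           for r in range(3)]
    a, b, c = JJt[0]
    d, e, f = JJt[1]
    g, h, i = JJt[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return math.sqrt(max(det, 0.0))

m_nominal = singularity_measure(torque_jacobian([0.0, 0.0, 0.0, 0.0]))
# All gimbals at 90 deg: every torque axis falls into one plane (singular).
m_singular = singularity_measure(torque_jacobian([math.pi / 2] * 4))
```

Steering laws differ mainly in how they keep m away from zero (e.g. by adding a singularity-robust term when the smallest singular value collapses).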
Hamiltonian system for orthotropic plate bending based on analogy theory
(no author listed)
2001-01-01
Based on the analogy between plane elasticity and plate bending as well as variational principles of mixed energy, the Hamiltonian system is further extended to orthotropic plate bending problems in this paper. Thus many effective methods of mathematical physics, such as separation of variables and eigenfunction expansion, can be employed in orthotropic plate bending problems just as they are used in plane elasticity. Analytical solutions for rectangular plates are presented directly, which expands the range of available analytical solutions. There is an essential distinction between this method and the traditional semi-inverse method. Numerical results for an orthotropic plate with two lateral sides fixed are included to demonstrate the effectiveness and accuracy of the method.
Ten practical, theory-based tips for clinical course planners
Balslev, T.; Westphall, I.; Blichfeldt, S.
2008-01-01
A list of practical advice and examples is given based on the literature. E-learning with cliffhanger text-cases can activate prior knowledge, and selected examination skills can be trained with simulated patients. Patient video recordings can be used to train clinical reasoning skills, including pattern recognition and hypothetico-deductive approaches. Interactive approaches, for example questioning, quizzes or buzz groups, imply active involvement and participation. Quizzes and MCQ testing can provide a formative 'check-up' on learning and point to gaps in understanding for the teachers…
Venture Capital Investment Based on Grey Relational Theory
Zhang, Xubo
This paper builds a venture capital investment project selection and evaluation model based on risk-weighted investment return using grey relational analysis. The risks and returns in the venture capital investment project selection process are analyzed. These risks and returns are mainly concentrated in management ability, operation ability, market ability, exit opportunity and investment cost. Eighteen sub-indicators are the impact factors contributing to these five evaluation aspects. Grey relational analysis is used to evaluate venture capital investment selection, and the optimal solution of the risk-weighted, double-objective investment selection evaluation model is obtained. An example is used to demonstrate the model.
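The paper's actual indicator data are not given in the abstract; below is a minimal sketch of the grey relational analysis step, assuming min-max normalization, an ideal reference series, and the customary distinguishing coefficient rho = 0.5 (the project scores are invented for illustration):

```python
def grey_relational_grades(alternatives, weights=None, rho=0.5):
    """Grey relational analysis: score alternatives (rows of benefit-type
    indicator values) against the ideal reference series.
    rho is the distinguishing coefficient (0.5 is customary)."""
    n_ind = len(alternatives[0])
    # Min-max normalize each indicator to [0, 1] (larger-is-better assumed).
    lo = [min(a[j] for a in alternatives) for j in range(n_ind)]
    hi = [max(a[j] for a in alternatives) for j in range(n_ind)]
    norm = [[(a[j] - lo[j]) / (hi[j] - lo[j]) if hi[j] > lo[j] else 1.0
             for j in range(n_ind)] for a in alternatives]
    ref = [1.0] * n_ind                      # ideal (best) series
    deltas = [[abs(r[j] - ref[j]) for j in range(n_ind)] for r in norm]
    dmin = min(min(row) for row in deltas)
    dmax = max(max(row) for row in deltas)
    w = weights or [1.0 / n_ind] * n_ind
    grades = []
    for row in deltas:
        # Grey relational coefficient per indicator, then weighted grade.
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
        grades.append(sum(wj * c for wj, c in zip(w, coeffs)))
    return grades

# Three candidate projects scored on five aspects
# (management, operation, market, exit, cost-effectiveness):
projects = [
    [7, 8, 6, 7, 5],
    [9, 7, 8, 8, 7],
    [5, 6, 5, 6, 9],
]
grades = grey_relational_grades(projects)
best = max(range(len(projects)), key=grades.__getitem__)
```

The alternative with the highest grey relational grade is taken as the preferred investment; the paper's 18 sub-indicators would simply widen the rows and the weight vector.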
Willis, Jerry
2011-01-01
This article is the second in a series (see Willis, 2011) that looks at the current status of instructional design scholarship and theory. In this concluding article, the focus is on two cultures of ID work, one based on constructivist and interpretivist theory and the other based on critical theory and critical pedagogy. There are distinct…
Is There an Optimal Strategic Oil Reserve for Each Country? A Study Based on the Game Theory
Yang, Junan; Cong, Ronggang
2014-01-01
…non-cooperative game theory. It also analyzes the establishment of strategic oil reserves among different countries based on coalition game theory and presents the core solution for it. The results show that, based on a certain constraint mechanism, it is feasible for different countries to establish their own suitable strategic oil reserves in theory and practice.
Safety models incorporating graph theory based transit indicators.
Quintero, Liliana; Sayed, Tarek; Wahba, Mohamed M
2013-01-01
There is a considerable need for tools to enable the evaluation of the safety of transit networks at the planning stage. One interesting approach for the planning of public transportation systems is the study of networks. Network techniques involve the analysis of systems by viewing them as a graph composed of a set of vertices (nodes) and edges (links). Once the transport system is visualized as a graph, various network properties can be evaluated based on the relationships between the network elements. Several indicators can be calculated including connectivity, coverage, directness and complexity, among others. The main objective of this study is to investigate the relationship between network-based transit indicators and safety. The study develops macro-level collision prediction models that explicitly incorporate transit physical and operational elements and transit network indicators as explanatory variables. Several macro-level (zonal) collision prediction models were developed using a generalized linear regression technique, assuming a negative binomial error structure. The models were grouped into four main themes: transit infrastructure, transit network topology, transit route design, and transit performance and operations. The safety models showed that collisions were significantly associated with transit network properties such as: connectivity, coverage, overlapping degree and the Local Index of Transit Availability. As well, the models showed a significant relationship between collisions and some transit physical and operational attributes such as the number of routes, frequency of routes, bus density, length of bus and 3+ priority lanes.
[Training -- competency-based education -- learning theory and practice].
Breuer, Georg
2013-11-01
A lifelong learning process is necessarily the basis for specialization and expertise in the field of anesthesiology. Competency as a physician is thus a complex, multidimensional construct of knowledge, skills and attitudes needed to solve the complex challenges of daily clinical work in a flexible and responsible way. Experts therefore show flexible and intuitive capabilities in pursuing their profession. Accordingly, modern competency-based learning objectives are very helpful. The DGAI Commission for "Further Education" already thought ahead in defining a competency-based curriculum for specialization in the field of anesthesiology, which could be integrated into the framework of the German Medical Association. In addition to the curricular framework, elements of assessment are necessary. A single oral exam is consequently not representative of different levels of competency. However, besides the learners' responsibility for their own learning process, clinical teachers have a strong obligation to attend to the learning process and to ensure a positive learning atmosphere with scope for feedback. Some competencies could potentially be learned better in a "sheltered" simulation-based setting outside the OR, for example to train for rare incidents or emergency procedures. In general, there should be an ongoing effort to enhance the process of expertise development, also in the context of patient safety and quality management.
Constraint-Based Modeling: From Cognitive Theory to Computer Tutoring--and Back Again
Ohlsson, Stellan
2016-01-01
The ideas behind the constraint-based modeling (CBM) approach to the design of intelligent tutoring systems (ITSs) grew out of attempts in the 1980's to clarify how declarative and procedural knowledge interact during skill acquisition. The learning theory that underpins CBM was based on two conceptual innovations. The first innovation was to…
Prospect Theory and Travel Behaviour: a Personal Reflection Based on a Seminar
Van Wee, G.P.
2010-01-01
This paper is the final paper of a special issue on Prospect Theory (PT) and its applications in travel behaviour research. It is largely (but not exclusively) based on discussions held during a seminar that took place on the 8th of October 2009. The paper presents some personal reflections based on
Josephson current in Fe-based superconducting junctions: theory and experiment
Burmistrova, A.V.; Devyatov, I.A.; Golubov, A.; Yada, K.; Tanaka, Y.; Tortello, M.; Gonnelli, R.S.; Stepanov, V.A.; Ding, X.X.; Wen, H.H.; Green, L.H.
2015-01-01
We present a theory of the dc Josephson effect in contacts between Fe-based and spin-singlet s-wave superconductors. The method is based on the calculation of temperature Green's function in the junction within the tight-binding model. We calculate the phase dependencies of the Josephson current for
Advances in the application of decision theory to test-based decision making
Linden, van der Wim J.
1985-01-01
This paper reviews recent research in the Netherlands on the application of decision theory to test-based decision making about personnel selection and student placement. The review is based on an earlier model proposed for the classification of decision problems, and emphasizes an empirical Bayesian…
Applications of the theory of Gröbner bases to the study of linear recurring arrays
(no author listed)
2001-01-01
This is a small survey of applications of the theory of Gröbner bases to the study of linear recurring arrays. It applies some properties of Gröbner bases to studying linear recurring arrays and contains recent new results on linear recurring arrays.
Using Game Theory and Competition-Based Learning to Stimulate Student Motivation and Performance
Burguillo, Juan C.
2010-01-01
This paper introduces a framework for using Game Theory tournaments as a base to implement Competition-based Learning (CnBL), together with other classical learning techniques, to motivate the students and increase their learning performance. The paper also presents a description of the learning activities performed along the past ten years of a…
Holographic theory and recording techniques. Citations from the NTIS data base
Carrigan, B.
1980-05-01
The topics cited include holographic recording techniques, theory, equipment, and materials. Among the techniques cited are color holography, X-ray holography, high speed holography, and motion picture holography. Photographic materials, films, emulsions, and equipment for recording and information storage are covered. Techniques for image motion compensation, image deblurring, wave-front reconstruction, and resolution are also cited. This updated bibliography contains 251 abstracts, 17 of which are new entries to the previous edition.
Kinematics and dynamics of deployable structures with scissor-like-elements based on screw theory
Sun, Yuantao; Wang, Sanmin; Mills, James K.; Zhi, Changjian
2014-07-01
Because deployable structures are complex multi-loop structures, and methods of derivation that lead to simpler kinematic and dynamic equations of motion are the subject of research effort, the kinematics and dynamics of deployable structures with scissor-like elements are presented based on screw theory and the principle of virtual work, respectively. According to the geometric characteristics of the deployable structure examined, the basic structural unit is the common scissor-like element (SLE). First, a spatial deployable structure comprised of three SLEs is defined, and the constraint topology graph is obtained. The equations of motion are then derived based on screw theory and the geometric nature of scissor elements. Second, to develop the dynamics of the whole deployable structure, the local coordinates of the SLEs and the Jacobian matrices of the center of mass of the deployable structure are derived. Then, the equivalent forces are assembled and added to the equations of motion based on the principle of virtual work. Finally, the dynamic behavior and unfolding process of the deployable structure are simulated, and plots of velocity, acceleration and input torque are obtained from the simulation results. Screw theory not only provides an efficient solution formulation and theoretical guidance for complex multi-closed-loop deployable structures, but also extends the method to solve the dynamics of deployable structures. As an efficient mathematical tool, it allows the simpler equations of motion to be derived.
Game Theory Based Trust Model for Cloud Environment.
Gokulnath, K; Uthariaraj, Rhymend
2015-01-01
The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theory-based approach for achieving trust at the bootload level from the perception of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restricts service providers and users from violating the service level agreement (SLA). Significantly, the cold-start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of a cloud user's application to a cloud service provider for segregating trust levels is achieved as part of the mapping. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of conventional methods and the proposed method. Several metrics such as execution time, accuracy, error identification, and undecidability of the resources were considered.
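The abstract does not detail the payoff structure; below is a minimal sketch of the Nash-equilibrium core, assuming a 2x2 provider/user game in which honoring the SLA is mutually optimal (the payoff numbers are illustrative, not from the paper):

```python
from itertools import product

def pure_nash_equilibria(payoff_p, payoff_u):
    """Enumerate pure-strategy Nash equilibria of a bimatrix game:
    (i, j) is an equilibrium if neither player gains by deviating alone."""
    rows, cols = len(payoff_p), len(payoff_p[0])
    eqs = []
    for i, j in product(range(rows), range(cols)):
        best_p = all(payoff_p[i][j] >= payoff_p[k][j] for k in range(rows))
        best_u = all(payoff_u[i][j] >= payoff_u[i][l] for l in range(cols))
        if best_p and best_u:
            eqs.append((i, j))
    return eqs

# Strategies: 0 = honor the SLA, 1 = violate it. Illustrative payoffs in
# which violations are detected and penalized by the trust mechanism.
provider = [[4, 2],
            [1, 0]]
user = [[4, 1],
        [2, 0]]
equilibria = pure_nash_equilibria(provider, user)
```

Under such a payoff design, mutual compliance (0, 0) is the unique pure equilibrium, which is how an NE-based mechanism can anchor trust for first-time users and providers.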
Error Analysis of Robotic Assembly System Based on Screw Theory
韩卫军; 费燕琼; 赵锡芳
2003-01-01
Assembly errors have great influence on assembly quality in robotic assembly systems. Error analysis is directed at the propagation and accumulation of various errors and their effect on assembly success. Using screw coordinates, assembly errors are represented as an "error twist", an extremely compact expression. According to the law of screw composition, the relative position and orientation errors of mating parts are computed and the necessary condition for assembly success is derived. A new, simple method for measuring assembly errors is also proposed based on the transformation law of a screw. Because of the compact representation of error, the model presented for error analysis can be applied to various part-mating types and is especially useful for error analysis of complex assemblies.
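The paper's exact twist algebra is not reproduced in the abstract; below is a minimal sketch of how a small error twist (angular part w, linear part v) can be mapped between frames by the adjoint of a rigid displacement (R, p) and composed to first order (the frames and error magnitudes are illustrative):

```python
import math

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def adjoint_transform(R, p, twist):
    """Map a small error twist (w, v) into another frame:
    w' = R w,  v' = p x (R w) + R v  (adjoint of the displacement (R, p))."""
    w, v = twist[:3], twist[3:]
    Rw = [sum(R[i][k] * w[k] for k in range(3)) for i in range(3)]
    Rv = [sum(R[i][k] * v[k] for k in range(3)) for i in range(3)]
    pxRw = cross(p, Rw)
    return Rw + [pxRw[i] + Rv[i] for i in range(3)]

def compose_error_twists(twists):
    """First-order composition: small error twists expressed in a common
    frame simply add (higher-order terms neglected)."""
    return [sum(t[i] for t in twists) for i in range(6)]

# Peg frame rotated 90 deg about z and offset 0.1 m along x from the hole frame:
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
R = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
p = [0.1, 0.0, 0.0]
e_local = [0.0, 0.0, 0.002, 0.001, 0.0, 0.0]  # 2 mrad about z, 1 mm along x
e_hole = adjoint_transform(R, p, e_local)
total = compose_error_twists([e_hole, [0.0, 0.0, 0.0, 0.0, 0.0005, 0.0]])
```

Comparing the composed error twist against the mating clearance then gives a first-order check of the "necessary condition of assembly success" the abstract mentions.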
[The Chinese urban metabolisms based on the emergy theory].
Song, Tao; Cai, Jian-Ming; Ni, Pan; Yang, Zhen-Shan
2014-04-01
By using emergy indices of urban metabolisms, this paper analyzed 31 Chinese urban metabolisms' systematic structures and characteristics in 2000 and 2010. The results showed that Chinese urban metabolisms were characterized as resource consumption and coastal external dependency. Non-renewable resource emergy accounted for a higher proportion of the total emergy in the inland cities' urban metabolisms. The emergy of imports and exports accounted for the vast majority of urban metabolic systems in metropolises and coastal cities such as Beijing and Shanghai, showing a significant externally-oriented metabolic characteristic. Based on that, the related policies were put forward: to develop the renewable resource and energy industry; to improve the non-renewable resource and energy utilization efficiencies; to optimize the import and export structure of services, cargo and fuel; and to establish the flexible management mechanism of urban metabolisms.
Hybrid Fundamental Solution Based Finite Element Method: Theory and Applications
Changyong Cao
2015-01-01
An overview of the development of the hybrid fundamental solution based finite element method (HFS-FEM) and its application to engineering problems is presented in this paper. The framework and formulations of HFS-FEM for the potential problem, plane elasticity, three-dimensional elasticity, thermoelasticity, anisotropic elasticity, and plane piezoelectricity are presented. In this method, two independently assumed fields (an intra-element field and an auxiliary frame field) are employed. The formulations for all cases are derived from the modified variational functionals and the fundamental solutions to a given problem. The generation of elemental stiffness equations from the modified variational principle is also described. Typical numerical examples are given to demonstrate the validity and performance of the HFS-FEM. Finally, a brief summary of the approach is provided and future trends in this field are identified.
Theory of Carbon Nanotube (CNT)-Based Electron Field Emitters
Alexander V. Eletskii
2013-07-01
Theoretical problems arising in connection with development and operation of electron field emitters on the basis of carbon nanotubes are reviewed. The physical aspects of electron field emission that underlie the unique emission properties of carbon nanotubes (CNTs) are considered. Physical effects and phenomena affecting the emission characteristics of CNT cathodes are analyzed. Effects given particular attention include: the electric field amplification near a CNT tip with taking into account the shape of the tip, the deviation from the vertical orientation of nanotubes and electrical field-induced alignment of those; electric field screening by neighboring nanotubes; statistical spread of the parameters of the individual CNTs comprising the cathode; the thermal effects resulting in degradation of nanotubes during emission. Simultaneous consideration of the above-listed effects permitted the development of the optimization procedure for CNT array in terms of the maximum reachable emission current density. In accordance with this procedure, the optimum inter-tube distance in the array depends on the region of the external voltage applied. The phenomenon of self-misalignment of nanotubes in an array has been predicted and analyzed in terms of the recent experiments performed. A mechanism of degradation of CNT-based electron field emitters has been analyzed consisting of the bombardment of the emitters by ions formed as a result of electron impact ionization of the residual gas molecules.
Theory of Carbon Nanotube (CNT)-Based Electron Field Emitters.
Bocharov, Grigory S; Eletskii, Alexander V
2013-07-17
Theoretical problems arising in connection with development and operation of electron field emitters on the basis of carbon nanotubes are reviewed. The physical aspects of electron field emission that underlie the unique emission properties of carbon nanotubes (CNTs) are considered. Physical effects and phenomena affecting the emission characteristics of CNT cathodes are analyzed. Effects given particular attention include: the electric field amplification near a CNT tip with taking into account the shape of the tip, the deviation from the vertical orientation of nanotubes and electrical field-induced alignment of those; electric field screening by neighboring nanotubes; statistical spread of the parameters of the individual CNTs comprising the cathode; the thermal effects resulting in degradation of nanotubes during emission. Simultaneous consideration of the above-listed effects permitted the development of the optimization procedure for CNT array in terms of the maximum reachable emission current density. In accordance with this procedure, the optimum inter-tube distance in the array depends on the region of the external voltage applied. The phenomenon of self-misalignment of nanotubes in an array has been predicted and analyzed in terms of the recent experiments performed. A mechanism of degradation of CNT-based electron field emitters has been analyzed consisting of the bombardment of the emitters by ions formed as a result of electron impact ionization of the residual gas molecules.
Dynamics of Plant Growth; A Theory Based on Riemannian Geometry
Pulwicki, Julia
2016-01-01
In this work, a new model for macroscopic plant tissue growth based on dynamical Riemannian geometry is presented. We treat 1D and 2D tissues as continuous, deformable, growing geometries for sizes larger than 1mm. The dynamics of the growing tissue are described by a set of coupled tensor equations in non-Euclidean (curved) space. These coupled equations represent a novel feedback mechanism between growth and curvature dynamics. For 1D growth, numerical simulations are compared to two measures of root growth. First, modular growth along the simulated root shows an elongation zone common to many species of plant roots. Second, the relative elemental growth rate (REGR) calculated in silico exhibits temporal dynamics recently characterized in high-resolution root growth studies but which thus far lack a biological hypothesis to explain them. Namely, the REGR can evolve from a single peak localized near the root tip to a double-peak structure. In our model, this is a direct consequence of considering growth as b...
[Modeling continuous scaling of NDVI based on fractal theory].
Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng
2013-07-01
Scale effect is one of the most important scientific problems in remote sensing. The scale effect in quantitative remote sensing can be used to study the relationships between retrievals from images of different resolutions, and its study has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe the scale-changing features of retrievals over an entire series of scales; meanwhile, they face serious parameter-correction issues because imaging parameters vary between sensors (e.g., geometric correction, spectral correction). Using a single-sensor image, a fractal methodology was employed to solve these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model the continuous scaling of retrievals. The experimental results indicated that: (1) for NDVI, a scale effect exists, and it can be described by a fractal model of continuous scaling; and (2) the fractal method is suitable for the validation of NDVI. All of this proves that fractal analysis is an effective methodology for studying the scaling of quantitative remote sensing.
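The paper's fractal model is not given explicitly in the abstract; below is a minimal sketch of the idea, assuming block-averaging of band radiances as the up-scaling operator and a power-law (log-log linear) scaling model for mean NDVI (the toy bands are invented for illustration):

```python
import math

def ndvi(red, nir):
    return (nir - red) / (nir + red)

def aggregate(grid, factor):
    """Coarsen a square 2-D grid by block-averaging factor x factor windows
    (a simple stand-in for spatial up-scaling of band radiance)."""
    n = len(grid) // factor
    return [[sum(grid[factor * i + a][factor * j + b]
                 for a in range(factor) for b in range(factor)) / factor ** 2
             for j in range(n)] for i in range(n)]

def mean_ndvi_at_scale(red, nir, factor):
    """NDVI computed after aggregating the bands to the coarser scale."""
    r, n = aggregate(red, factor), aggregate(nir, factor)
    vals = [ndvi(r[i][j], n[i][j])
            for i in range(len(r)) for j in range(len(r))]
    return sum(vals) / len(vals)

def loglog_slope(scales, values):
    """Least-squares slope of ln(value) vs ln(scale); under a fractal
    (power-law) scaling model this slope is constant across scales."""
    xs = [math.log(s) for s in scales]
    ys = [math.log(v) for v in values]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Toy 8x8 "bands" with some spatial heterogeneity (values are illustrative):
red = [[0.05 + 0.01 * ((i + j) % 4) for j in range(8)] for i in range(8)]
nir = [[0.40 - 0.02 * ((i * j) % 5) for j in range(8)] for i in range(8)]
scales = [1, 2, 4]
series = [mean_ndvi_at_scale(red, nir, f) for f in scales]
slope = loglog_slope(scales, series)  # scaling exponent of mean NDVI
```

A single fitted exponent of this kind is what lets the fractal model describe the retrieval continuously across the whole series of scales, rather than pairwise between two fixed resolutions.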
Time dependent mechanical modeling for polymers based on network theory
Billon, Noëlle
2016-05-01
Despite many attempts in recent years, the complex mechanical behaviour of polymers remains incompletely modelled, making industrial design of structures under complex, cyclic and severe loadings not fully reliable. The non-linear and dissipative viscoelastic, viscoplastic behaviour of these materials requires taking into account the non-linear and combined effects of mechanical and thermal phenomena. In this view, a visco-hyperelastic, viscoplastic model based on a network description of the material has recently been developed within a complete thermodynamic framework in order to account for the main thermo-mechanical couplings. A way to account for the coupled effects of strain rate and temperature was also suggested. First experimental validations, conducted in the 1D limit on an amorphous rubbery-like PMMA under isothermal conditions, led to pretty good results. In this paper a more complete formalism is presented and validated for semi-crystalline polymers; a PA66 and a PET (either amorphous or semi-crystalline) are used. The protocol for identifying the constitutive parameters is described. It is concluded that this new approach should be the route to accurately modelling the thermo-mechanical behaviour of polymers using a reduced number of parameters of some physical meaning.
Theory of Carbon Nanotube (CNT)-Based Electron Field Emitters
Bocharov, Grigory S.; Eletskii, Alexander V.
2013-01-01
Theoretical problems arising in connection with development and operation of electron field emitters on the basis of carbon nanotubes are reviewed. The physical aspects of electron field emission that underlie the unique emission properties of carbon nanotubes (CNTs) are considered. Physical effects and phenomena affecting the emission characteristics of CNT cathodes are analyzed. Effects given particular attention include: the electric field amplification near a CNT tip with taking into account the shape of the tip, the deviation from the vertical orientation of nanotubes and electrical field-induced alignment of those; electric field screening by neighboring nanotubes; statistical spread of the parameters of the individual CNTs comprising the cathode; the thermal effects resulting in degradation of nanotubes during emission. Simultaneous consideration of the above-listed effects permitted the development of the optimization procedure for CNT array in terms of the maximum reachable emission current density. In accordance with this procedure, the optimum inter-tube distance in the array depends on the region of the external voltage applied. The phenomenon of self-misalignment of nanotubes in an array has been predicted and analyzed in terms of the recent experiments performed. A mechanism of degradation of CNT-based electron field emitters has been analyzed consisting of the bombardment of the emitters by ions formed as a result of electron impact ionization of the residual gas molecules. PMID:28348342
Time dependent mechanical modeling for polymers based on network theory
Billon, Noëlle [MINES ParisTech, PSL-Research University, CEMEF – Centre de mise en forme des matériaux, CNRS UMR 7635, CS 10207 rue Claude Daunesse 06904 Sophia Antipolis Cedex (France)
2016-05-18
Despite many attempts in recent years, the complex mechanical behaviour of polymers remains incompletely modelled, making industrial design of structures under complex, cyclic and severe loadings not fully reliable. The non-linear and dissipative viscoelastic, viscoplastic behaviour of these materials requires taking into account the non-linear and combined effects of mechanical and thermal phenomena. In this view, a visco-hyperelastic, viscoplastic model based on a network description of the material has recently been developed within a complete thermodynamic framework in order to account for the main thermo-mechanical couplings. A way to account for the coupled effects of strain rate and temperature was also suggested. First experimental validations, conducted in the 1D limit on an amorphous rubbery-like PMMA under isothermal conditions, led to pretty good results. In this paper a more complete formalism is presented and validated for semi-crystalline polymers; a PA66 and a PET (either amorphous or semi-crystalline) are used. The protocol for identifying the constitutive parameters is described. It is concluded that this new approach should be the route to accurately modelling the thermo-mechanical behaviour of polymers using a reduced number of parameters of some physical meaning.
THEORY OF REGENERATION BASED ON MASS ACTION. II.
Loeb, J
1923-11-20
1. Quantitative proof is furnished that all the material available for shoot and root formation in an isolated leaf of Bryophyllum calycinum flows to those notches where through the influence of gravity or by a more abundant supply of water growth is accelerated. As soon as the acceleration of growth in these notches commences, the growth of shoots and roots in the other notches which may already have started ceases. 2. It had been shown in a preceding paper that the regeneration of an isolated piece of stem may be and frequently is in the beginning not markedly polar, but that after some time the growth of all the roots except those at the base and of all the shoots except those at the apex is suppressed. This analogy with the behavior of regeneration in a leaf in which the growth in one set of notches is accelerated, suggests that in an isolated stem a more rapid growth is favored at the extreme ends (probably by a block of the sap flow at the extreme ends) and that when this happens the total flow of ascending sap goes to the most apical buds and the total flow of the descending sap goes to the most basal roots. As soon as this occurs, the growth of the other roots and shoots is suppressed.
Smith D
2005-10-01
Abstract. Background: To prospectively evaluate the efficacy and safety of selective internal radiation (SIR) spheres in patients with inoperable liver metastases from colorectal cancer who have failed 5-FU based chemotherapy. Methods: Patients were prospectively enrolled at three Australian centres. All patients had previously received 5-FU based chemotherapy for metastatic colorectal cancer. Patients were ECOG 0–2 and had liver-dominant or liver-only disease. Concurrent 5-FU was given at investigator discretion. Results: Thirty patients were treated between January 2002 and March 2004. As of July 2004 the median follow-up is 18.3 months. Median patient age was 61.7 years (range 36–77). Twenty-nine patients are evaluable for toxicity and response. There were 10 partial responses (33%), with a median duration of response of 8.3 months (range 2–18) and median time to progression of 5.3 months. Response rates were lower (21%) and progression-free survival shorter (3.9 months) in patients who had received all standard chemotherapy options (n = 14). No responses were seen in patients with a poor performance status (n = 3) or extrahepatic disease (n = 6). Overall treatment-related toxicity was acceptable; however, significant late toxicity included 4 cases of gastric ulceration. Conclusion: In patients with metastatic colorectal cancer who have previously received treatment with 5-FU based chemotherapy, treatment with SIR-spheres has demonstrated encouraging activity. Further studies are required to better define the subsets of patients most likely to respond.
A Christian faith-based recovery theory: understanding God as sponsor.
Timmons, Shirley M
2012-12-01
This article reports the development of a substantive theory to explain an evangelical Christian-based process of recovery from addiction. Faith-based, 12-step, mutual aid programs can improve drug abstinence by offering: (a) an intervention option alone and/or in conjunction with secular programs and (b) an opportunity for religious involvement. Although the literature on religion, spirituality, and addiction is voluminous, traditional 12-step programs fail to explain the mechanism that underpins the process of Christian-based recovery (CR). This pilot study used grounded theory to explore and describe the essence of recovery of 10 former crack cocaine-addicted persons voluntarily enrolled in a CR program. Data were collected from in-depth interviews during 4 months of 2008. Audiotapes were transcribed verbatim, and the constant comparative method was used to analyze data, resulting in the basic social process theory, understanding God as sponsor. The theory was determined through writing theoretical memos that generated key elements that allow persons to recover: acknowledging God-centered crises, communicating with God, and planning for the future. Findings from this preliminary study identify important factors that can help persons in recovery to sustain sobriety and help program administrators benefit from theory that guides the development of evidence-based addiction interventions.
Tokuyama, Michio
2017-01-01
The renormalized simplified model is proposed to investigate indirectly how the static structure factor plays an important role in renormalizing a quadratic nonlinear term in the ideal mode-coupling memory function near the glass transition. The renormalized simplified recursion equation is then derived based on the time-convolutionless mode-coupling theory (TMCT) proposed recently by the present author. This phenomenological approach is successfully applied to check from a unified point of view how strong liquids are different from fragile liquids. The simulation results for those two types of liquids are analyzed consistently by the numerical solutions of the recursion equation. Then, the control parameter dependence of the renormalized nonlinear exponent in both types of liquids is fully investigated. Thus, it is shown that there exists a novel difference between the universal behavior in strong liquids and that in fragile liquids not only for their transport coefficients but also for their dynamics.
Uncertainty, Pluralism, and the Knowledge-Based Theory of the Firm
Reihlen, Markus; Ringberg, Torsten
2013-01-01
, and the interaction between the individual and the social. This socio-cognitive theory of the firm posits that sustained competitive advantage of a firm is founded on the ability to align knowledge internally within the firm as well as externally with its stakeholders through the individual sense-making of feedback......J.-C. Spender’s award-winning, knowledge-based theory of the firm is based on four premises: (1) The firm can be sufficiently understood as a system of knowledge, (2) explicit and implicit knowing can be clearly dissociated, (3) organizations are conceived as cognizing entities, and (4) intuition......-cultural conventions and other social processes. Although comprehensive in scope, we argue that a knowledge-based theory of the firm needs to integrate a cognitivist approach that includes the synergetic production of tacit and explicit knowledge, the role of reflective thinking in resolving strategic uncertainties...
Efficacy of a Transition Theory-Based Discharge Planning Program for Childhood Asthma Management.
Ekim, Ayfer; Ocakci, Ayse Ferda
2016-02-01
This study tested the efficacy of a nurse-led discharge planning program for childhood asthma management, based on transition theory. A quasi-experimental design was used. The sample comprised 120 children with asthma and their parents (intervention group n = 60, control group n = 60). The asthma management self-efficacy perception level of parents in the intervention group increased significantly, and the number of triggers their children were exposed to at home was reduced by 60.8%. The rates of admission to emergency departments and unscheduled outpatient visits were significantly lower in the intervention group compared with the control group. Transition theory-based nursing interventions can provide successful outcomes in childhood asthma management. A transition theory-based discharge planning program can guide nursing interventions to standardize care of the child with asthma. Combining care at home with hospital care strengthens ongoing qualified asthma management. © 2015 NANDA International, Inc.
Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory
Matsumura, Koki; Kawamoto, Masaru
This paper proposed a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computations. Moreover, this paper proposed a method using prospect theory, from behavioral finance, to set the psychological bias for profit and loss, and attempted to select the appropriate strike price of the option for higher investment efficiency. As a result, this technique produced good results and demonstrated the effectiveness of the trading model with the optimized dealing strategy.
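The psychological bias for profit and loss mentioned in the abstract can be illustrated with the standard prospect-theory value function. This is a hedged sketch: the functional form and the parameter values (the classic Tversky-Kahneman 1992 estimates) are illustrative assumptions, not taken from the paper.

```python
# Sketch of the prospect-theory value function: concave for gains,
# convex and steeper for losses (loss aversion). Parameters alpha,
# beta and lambda_ are the classic 1992 estimates, assumed here.

def prospect_value(x, alpha=0.88, beta=0.88, lambda_=2.25):
    """Subjective value of a monetary outcome x relative to a reference point."""
    if x >= 0:
        return x ** alpha
    return -lambda_ * (-x) ** beta

# Loss aversion: a loss looms larger than an equal-sized gain.
gain = prospect_value(100.0)
loss = prospect_value(-100.0)
```

A strategy tree could use such a value function as part of its fitness evaluation, weighting drawdowns more heavily than equal-sized profits.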
Gerjets, Peter; Scheiter, Katharina; Cierniak, Gabriele
2009-01-01
In this paper, two methodological perspectives are used to elaborate on the value of cognitive load theory (CLT) as a scientific theory. According to the more traditional critical rationalism of Karl Popper, CLT cannot be considered a scientific theory because some of its fundamental assumptions cannot be tested empirically and are thus not…
Nekrasov, Nikita
2004-01-01
We present the evidence for the existence of the topological string analogue of M-theory, which we call Z-theory. The corners of Z-theory moduli space correspond to the Donaldson-Thomas theory, Kodaira-Spencer theory, Gromov-Witten theory, and Donaldson-Witten theory. We discuss the relations of Z-theory with Hitchin's gravities in six and seven dimensions, and make our own proposal, involving spinor generalization of Chern-Simons theory of three-forms. Based on the talk at Strings'04 in Paris.
A design method based on photonic crystal theory for Bragg concave diffraction grating
Du, Bingzheng; Zhu, Jingping; Mao, Yuzheng; Li, Bao; Zhang, Yunyao; Hou, Xun
2017-02-01
A design method based on one-dimensional photonic crystal theory (1-D PC theory) is presented for designing a Bragg concave diffraction grating (Bragg-CDG) for demultiplexers. With this design method, the reflection condition calculated by 1-D PC theory can be matched perfectly with the diffraction condition. As a result, the shift of the central wavelength of the diffraction spectra can be improved while keeping high diffraction efficiency. The performance of the Bragg-CDG is investigated for TE and TM modes, and the simulation results are consistent with 1-D PC theory. This design method is expected to improve the accuracy and efficiency of Bragg-CDGs after further research.
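For context on the reflection condition the abstract refers to, the first-order Bragg condition of a 1-D periodic structure is m·λ = 2·n_eff·Λ. The following sketch and its numeric values are illustrative assumptions, not the paper's design procedure.

```python
# Hedged sketch: Bragg reflection condition for a 1-D photonic
# crystal mirror, m * lambda = 2 * n_eff * period. The effective
# index and period below are illustrative, not from the paper.

def bragg_wavelength(n_eff, period_nm, order=1):
    """Central reflected wavelength (nm) for diffraction order m."""
    return 2.0 * n_eff * period_nm / order

lam = bragg_wavelength(n_eff=1.5, period_nm=500.0)  # -> 1500.0 nm
```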
Attribute reduction theory of concept lattice based on decision formal contexts
WEI Ling; QI diandun; ZHANG WenXiu
2008-01-01
The theory of concept lattices is an efficient tool for knowledge representation and knowledge discovery, and has been applied successfully in many fields. One focus of knowledge discovery is knowledge reduction. Based on the reduction theory of the classical formal context, this paper proposes the definition of the decision formal context and its reduction theory, which extends the reduction theory of concept lattices. In this paper, strong consistency and weak consistency of decision formal contexts are defined respectively. For strongly consistent decision formal contexts, judgment theorems for consistent sets are examined, and approaches to reduction are given. For weakly consistent decision formal contexts, implication mapping is defined, and its reduction is studied. Finally, the relation between reducts of the weakly consistent decision formal context and reducts of the implication mapping is discussed.
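The reduction theory summarized above builds on the derivation operators of formal concept analysis. A minimal sketch, with a toy context that is purely illustrative:

```python
# Hedged sketch of the two derivation operators of formal concept
# analysis: A' is the set of attributes shared by all objects in A,
# and B' the set of objects having all attributes in B. A pair (A, B)
# with A' = B and B' = A is a formal concept. Toy context only.

context = {          # object -> set of attributes it has
    "g1": {"a", "b"},
    "g2": {"a", "c"},
    "g3": {"a", "b", "c"},
}

def common_attributes(objects):
    sets = [context[g] for g in objects]
    return set.intersection(*sets) if sets else set()

def common_objects(attributes):
    return {g for g, atts in context.items() if attributes <= atts}

extent = {"g1", "g3"}
intent = common_attributes(extent)   # attributes shared by g1 and g3
closure = common_objects(intent)     # closing back gives a formal concept
```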
Zhang, Lei; Chen, Lingen; Sun, Fengrui
2016-03-01
The finite-time thermodynamic method based on probability analysis can describe the performance parameters of thermodynamic systems more accurately. Based on the relation between the optimal efficiency and the power output of a generalized Carnot heat engine with a finite high-temperature heat reservoir (heat source), an infinite low-temperature heat reservoir (heat sink), and heat transfer as the only irreversibility, this paper studies the power optimization of a chemically driven heat engine based on first- and second-order reaction kinetic theory. It puts forward a model of a coupled heat engine that can be run periodically, and obtains the effects of the finite-time thermodynamic characteristics of the coupling between the chemical reaction and the heat engine on power optimization. The results show that the first-order reaction kinetics model uses fuel more effectively and provides the heat engine with a higher-temperature heat source, increasing its power output. Moreover, the power fluctuation bounds of the chemically driven heat engine are obtained using the probability analysis method. The results may provide guidelines for the performance analysis and power optimization of chemically driven heat engines.
Towards socio-material approaches in simulation-based education: lessons from complexity theory.
Fenwick, Tara; Dahlgren, Madeleine Abrandt
2015-04-01
Review studies of simulation-based education (SBE) consistently point out that theory-driven research is lacking. The literature to date is dominated by discourses of fidelity and authenticity - creating the 'real' - with a strong focus on the development of clinical procedural skills. Little of this writing incorporates the theory and research proliferating in professional studies more broadly, which show how professional learning is embodied, relational and situated in social-material relations. A key question for medical educators is how to better prepare students for the unpredictable and dynamic ambiguity of professional practice; this has stimulated the movement towards socio-material theories in education that address precisely this question. Among the various socio-material theories that are informing new developments in professional education, complexity theory has been of particular importance for medical educators interested in updating current practices. This paper outlines key elements of complexity theory, illustrated with examples from empirical study, to argue its particular relevance for improving SBE. Complexity theory can make visible important material dynamics, and their problematic consequences, that are not often noticed in simulated experiences in medical training. It also offers conceptual tools that can be put to practical use. This paper focuses on concepts of emergence, attunement, disturbance and experimentation. These suggest useful new approaches for designing simulated settings and scenarios, and for effective pedagogies before, during and following simulation sessions. Socio-material approaches such as complexity theory are spreading through research and practice in many aspects of professional education across disciplines. Here, we argue for the transformative potential of complexity theory in medical education using simulation as our focus. Complexity tools open questions about the socio-material contradictions inherent in
Category Theory as a Formal Mathematical Foundation for Model-Based Systems Engineering
Mabrok, Mohamed
2017-01-09
In this paper, we introduce Category Theory as a formal foundation for model-based systems engineering. A generalised view of the system based on category theory is presented, in which any system can be considered as a category. The objects of the category represent the elements and components of the system, and the arrows (morphisms) represent the relations between these components. The Olog is introduced as a formal language for describing a given real-world situation and for requirement writing. A simple example is provided.
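The "system as category" view described above can be sketched minimally: components as objects, relations as morphisms, with identity and composition. All names here are illustrative assumptions, not from the paper.

```python
# Hedged sketch of a tiny category: objects are system components,
# morphisms are relations between them, with composition and identity.

class Morphism:
    def __init__(self, name, src, dst):
        self.name, self.src, self.dst = name, src, dst

    def then(self, other):
        """Compose self ; other (endpoints must match)."""
        assert self.dst == other.src, "non-composable morphisms"
        return Morphism(f"{self.name};{other.name}", self.src, other.dst)

def identity(obj):
    """Identity morphism on an object."""
    return Morphism(f"id_{obj}", obj, obj)

# A toy system: a sensor feeds a controller, which drives an actuator.
f = Morphism("senses", "Sensor", "Controller")
g = Morphism("commands", "Controller", "Actuator")
h = f.then(g)   # composite relation Sensor -> Actuator
```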
Map Building for a Mobile Robot Based on Grey System Theory
王卫华; 席裕庚; 陈卫东
2003-01-01
In this paper, a new method for mobile robot map building based on grey system theory is presented, by which the interpretation and integration of sonar readings can be handled robustly and efficiently. The concept of the "grey number" is introduced to model and handle the uncertainty of sonar readings. A new data fusion approach based on grey system theory is proposed to construct the environment model. Map building experiments are performed both on a simulation platform and on a real mobile robot. Experimental results show that the method is robust and accurate.
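A minimal sketch of the "grey number" idea for sonar uncertainty: an interval number with a whitenization operation, fused by interval intersection. The representation and the fusion rule are illustrative assumptions, not the paper's exact algorithm.

```python
# Hedged sketch: an interval grey number for an uncertain sonar range.
# Whitenization collapses the interval to a crisp value; two readings
# of the same cell are fused by interval intersection (assumed rule).

class GreyNumber:
    def __init__(self, lower, upper):
        assert lower <= upper
        self.lower, self.upper = lower, upper

    def whiten(self, w=0.5):
        """Whitenization: crisp value at weight w inside the interval."""
        return (1 - w) * self.lower + w * self.upper

    def fuse(self, other):
        """Fuse two readings by intersecting intervals; None if disjoint."""
        lo = max(self.lower, other.lower)
        hi = min(self.upper, other.upper)
        return GreyNumber(lo, hi) if lo <= hi else None

r1 = GreyNumber(0.9, 1.4)   # sonar range reading, metres
r2 = GreyNumber(1.1, 1.6)
fused = r1.fuse(r2)          # overlap: [1.1, 1.4]
```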
Research on the Formation Modes of Emerging Technologies Based on the Species Origin Theory
SONG Yan; YIN Lu
2006-01-01
Based on research on emerging technologies and "the technology species", this thesis seeks the homologous relations between emerging technologies and new species. The theory of new biological species in synthetic evolution and modern genetics is drawn on in the research. The modes of forming emerging technologies are discussed through examples from the evolution of IT, following the modes of biological speciation. Finally, it is pointed out that the biological species-origin theory can continue to serve as an analogy for researching the evolution mode of each emerging technology, and implications have been brought to enterprises based on the formation conditions of each mode.
Prospect Theory and Travel Behaviour: a Personal Reflection Based on a Seminar
van Wee, G.P.
2010-01-01
This paper is the final paper of a special issue on Prospect Theory (PT) and its applications in travel behaviour research. It is largely (but not exclusively) based on discussions held during a seminar that took place on the 8th of October 2009. The paper presents some personal reflections based on the discussion subjects on the use of PT in the area of travel behaviour research, clustered by the position of PT as a theory, applications of PT, future research, and relevance for policy making...
Revisited: The South Dakota Board of Nursing theory-based regulatory decisioning model.
Damgaard, Gloria; Bunkers, Sandra Schmidt
2012-07-01
The authors of this column describe the South Dakota Board of Nursing's 11-year journey utilizing a humanbecoming theory-based regulatory decisioning model. The column revisits the model with an emphasis on the cocreation of a strategic plan guiding the work of the South Dakota Board of Nursing through 2014. The strategic plan was influenced by the latest refinements of the humanbecoming postulates and the humanbecoming community change concepts. A graphic picture of the decisioning model is presented, along with future plans for the theory-based model.
An ambiguity-based theory of the linguistic verbal joke in English
Lew, Robert
1996-01-01
The dissertation analyzes the role of linguistic ambiguity in canned linguistic verbal jokes in English. A typology of jokes based on the type of ambiguity underlying the joke is proposed. It is claimed that jokes based on ambiguity contain purposeful textual devices (ambiguators) enhancing the probability of both semantic interpretations being perceived by the recipient. The ambiguity-based theory of the joke offers an alternative justification for the featu...
Kerner, B. S.; Klenov, S. L.; Brakemeier, A.
2007-01-01
A testbed for wireless vehicle communication based on a microscopic model in the framework of three-phase traffic theory is presented. In this testbed, vehicle motion in traffic flow and analyses of a vehicle communication channel access based on IEEE 802.11e mechanisms, radio propagation modeling, message reception characteristics as well as all other effects associated with ad-hoc networks are integrated into a three-phase traffic flow model. Based on simulations of this testbed, some stati...
Middle-aged women's preferred theory-based features in mobile physical activity applications.
Ehlers, Diane K; Huberty, Jennifer L
2014-09-01
The purpose of this study was to describe which theory-based behavioral and technological features middle-aged women prefer to be included in a mobile application designed to help them adopt and maintain regular physical activity (PA). Women aged 30 to 64 years (N = 120) completed an online survey measuring their demographics and mobile PA application preferences. The survey was based on behavioral principles of social cognitive theory, recent mobile app research, and technology adoption principles of the Unified Theory of Acceptance and Use of Technology. Frequencies were calculated and content analyses conducted to identify which features women most preferred. Behavioral features that help women self-regulate their PA (PA tracking, goal-setting, progress monitoring) were most preferred, as were technological features that enhance perceived effort expectancy and playfulness. Many women reported the desire to interact and compete with others through the application. Theory-based PA self-regulation features and theory-based design features that improve perceived effort expectancy and playfulness may be most beneficial in a mobile PA application for middle-aged women. Opportunities to interact with other people and the employment of social, game-like activities may also be attractive. Interdisciplinary engagement of experts in PA behavior change, technology adoption, and software development is needed.
Studying thin film damping in a micro-beam resonator based on non-classical theories
Ghanbari, Mina; Hossainpour, Siamak; Rezazadeh, Ghader
2016-06-01
In this paper, a mathematical model is presented for studying thin film damping of the surrounding fluid in an in-plane oscillating micro-beam resonator. The model comprises a clamped-clamped micro-beam bound between two fixed layers, with the micro-gap between the micro-beam and the fixed layers filled with air. As classical theories cannot properly predict the size-dependent behavior of the micro-beam, or the behavior of micro-scale fluid media, in the presented model the equation of motion governing the longitudinal displacement of the micro-beam has been derived based on non-local elasticity theory, and the fluid field has been modeled based on micro-polar theory. These coupled equations have been simplified using the Newton-Laplace and continuity equations. After transforming to non-dimensional form and linearizing, the equations have been discretized and solved simultaneously using a Galerkin-based reduced-order model. Considering slip boundary conditions and applying a complex frequency approach, the equivalent damping ratio and quality factor of the micro-beam resonator have been obtained. The obtained values for the quality factor have been compared with those based on classical theories; the non-classical theories yield lower quality factors than the classical ones. The effects of the geometrical parameters of the micro-beam and the micro-scale fluid field on the quality factor of the resonator have also been investigated.
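The "complex frequency approach" mentioned above can be sketched generically: an equivalent damping ratio and quality factor extracted from a complex eigenvalue of the linearized system. The relation Q = 1/(2ζ) holds for light damping; the eigenvalue below is illustrative, not from the paper.

```python
# Hedged sketch: damping ratio and quality factor from a complex
# eigenvalue s = -zeta*wn + 1j*wd (rad/s). Numbers are illustrative.

def damping_and_q(s):
    """Equivalent damping ratio and Q from a complex eigenvalue s."""
    wn = abs(s)                      # natural frequency (magnitude of s)
    zeta = -s.real / wn              # equivalent damping ratio
    return zeta, 1.0 / (2.0 * zeta)  # Q = 1/(2*zeta) for light damping

s = complex(-50.0, 1.0e5)            # a lightly damped resonator mode
zeta, q = damping_and_q(s)
```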
Studying thin film damping in a micro-beam resonator based on non-classical theories
Mina Ghanbari; Siamak Hossainpour; Ghader Rezazadeh
2016-01-01
In this paper, a mathematical model is presented for studying thin film damping of the surrounding fluid in an in-plane oscillating micro-beam resonator. The proposed model for this study is made up of a clamped-clamped micro-beam bound between two fixed layers. The micro-gap between the micro-beam and fixed layers is filled with air. As classical theories are not properly capable of pre-dicting the size dependence behaviors of the micro-beam, and also behavior of micro-scale fluid media, hence in the presented model, equation of motion governing longitudinal displacement of the micro-beam has been extracted based on non-local elasticity theory. Furthermore, the fluid field has been modeled based on micro-polar theory. These coupled equations have been simplified using Newton-Laplace and continuity equations. After transforming to non-dimensional form and linearizing, the equations have been discretized and solved simultaneously using a Galerkin-based reduced order model. Considering slip boundary conditions and applying a complex frequency approach, the equivalent damping ratio and quality factor of the micro-beam resonator have been obtained. The obtained values for the quality factor have been compared to those based on classical theories. We have shown that applying non-classical theories underestimate the values of the quality factor obtained based on classical theo-ries. The effects of geometrical parameters of the micro-beam and micro-scale fluid field on the quality factor of the res-onator have also been investigated.
Theory, the Final Frontier? A Corpus-Based Analysis of the Role of Theory in Psychological Articles
Sieghard Beller
2017-06-01
Contemporary psychology regards itself as an empirical science, at least in most of its subfields. Theory building and development are often considered critical to the sciences, but the extent to which psychology can be cast in this way is under debate. According to those advocating a strong role of theory, studies should be designed to test hypotheses derived from theories (theory-driven) and ideally should yield findings that stimulate hypothesis formation and theory building (theory-generating). The alternative position values empirical findings over theories as the lasting legacy of science. To investigate which role theory actually plays in current research practice, we analyse references to theory in the complete set of 2,046 articles accepted for publication in Frontiers of Psychology in 2015. This sample of articles, while not representative in the strictest sense, covers a broad range of sub-disciplines, both basic and applied, and a broad range of article types, including research articles, reviews, hypothesis & theory, and commentaries. For the titles, keyword lists, and abstracts in this sample, we conducted a text search for terms related to empiricism and theory, assessed the frequency and scope of usage for six theory-related terms, and analyzed their distribution over different article types and subsections of the journal. The results indicate substantially lower frequencies of theoretical than empirical terms, with references to a specific (named) theory in less than 10% of the sample and references to any of even the most frequently mentioned theories in less than 0.5% of the sample. In conclusion, we discuss possible limitations of our study and the prospect of theoretical advancement.
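The text search described in the abstract, counting theory-related versus empirical terms in titles and abstracts, can be sketched with simple stem matching. The stem lists are illustrative assumptions, not the authors' actual six terms.

```python
# Hedged sketch: count occurrences of theory-related vs. empirical
# word stems in a text. Stem lists are illustrative only.

import re

THEORY_STEMS = ["theor", "hypothes", "model"]
EMPIRICAL_STEMS = ["data", "experiment", "empiric", "measure"]

def stem_counts(text, stems):
    """Count words beginning with each stem (case-insensitive)."""
    text = text.lower()
    return {s: len(re.findall(rf"\b{s}\w*", text)) for s in stems}

abstract = ("We test hypotheses derived from theories "
            "against experimental data.")
theory_hits = sum(stem_counts(abstract, THEORY_STEMS).values())
empirical_hits = sum(stem_counts(abstract, EMPIRICAL_STEMS).values())
```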
Fariba Shahraki-Sanavi
2012-09-01
Background: The purpose of this research was to study the attitudes of pregnant women intending to have an elective cesarean section, based on the theory of planned behavior. Materials and Methods: This cross-sectional study was carried out on 150 pregnant women in their third trimester of pregnancy with an intention or decision to have an elective cesarean section, who were selected through probability sampling. Data were collected with a questionnaire based on the theory of planned behavior. Results: In a majority of women, attitude and perceived behavioral control were weak or intermediate. The ANOVA test showed a statistically significant correlation between the mean scores of attitude and education level, and between perceived behavioral control and type of previous labor. The incentive to comply was based on physicians', mothers', and spouses' decisions, respectively. Conclusion: Continuous classes for training psychological skills and preparing mothers for delivery should be established to reduce pregnant women's preference for elective cesarean section.
May, Jon; Andrade, Jackie; Kavanagh, David J; Feeney, Gerald F X; Gullo, Mathew J; Statham, Dixie J; Skorka-Brown, Jessica; Connolly, Jennifer M; Cassimatis, Mandy; Young, Ross McD; Connor, Jason P
2014-05-01
Research into craving is hampered by a lack of theoretical specification and a plethora of substance-specific measures. This study aimed to develop a generic measure of craving based on elaborated intrusion (EI) theory. Confirmatory factor analysis (CFA) examined whether a generic measure replicated the three-factor structure of the Alcohol Craving Experience (ACE) scale over different consummatory targets and time-frames. Twelve studies were pooled for CFA. Targets included alcohol, cigarettes, chocolate and food. Focal periods varied from the present moment to the previous week. Separate analyses were conducted for strength and frequency forms. Nine studies included university students, with single studies drawn from an internet survey, a community sample of smokers and alcohol-dependent out-patients, giving a heterogeneous sample of 1230 participants. Measures were adaptations of the ACE questionnaire. Both craving strength (comparative fit index (CFI) = 0.974; root mean square error of approximation (RMSEA) = 0.039, 95% confidence interval (CI) = 0.035-0.044) and frequency (CFI = 0.971, RMSEA = 0.049, 95% CI = 0.044-0.055) gave an acceptable three-factor solution across desired targets that mapped onto the structure of the original ACE (intensity, imagery, intrusiveness), after removing an item, re-allocating another and taking intercorrelated error terms into account. Similar structures were obtained across time-frames and targets. Preliminary validity data on the resulting 10-item Craving Experience Questionnaire (CEQ) for cigarettes and alcohol were strong. The CEQ is a brief, conceptually grounded and psychometrically sound measure of desires. It demonstrates a consistent factor structure across a range of consummatory targets in both laboratory and clinical contexts. © 2014 Society for the Study of Addiction.
Fundamentals of the fuzzy logic-based generalized theory of decisions
Aliev, Rafik Aziz
2013-01-01
Everyday decision making, and decision making in complex human-centric systems, is characterized by imperfect decision-relevant information. The main drawback of existing decision theories is their incapability to deal with imperfect information and to model vague preferences. Actually, a paradigm of non-numerical probabilities in decision making has a long history and arose also in Keynes's analysis of uncertainty. There is a need for further generalization: a move to decision theories with perception-based imperfect information described in natural language (NL). The languages of new decision models for human-centric systems should not be languages based on binary logic but human-centric computational schemes able to operate on NL-described information. Development of new theories is now possible due to the increased computational power of information processing systems, which allows computations with imperfect information, particularly imprecise and partially true information, which are much more complex than comput...
Varzinczak, Ivan
2008-01-01
Like any other logical theory, domain descriptions in reasoning about actions may evolve, and thus need revision methods to adequately accommodate new information about the behavior of actions. The present work is about changing action domain descriptions in propositional dynamic logic. Its contribution is threefold: first we revisit the semantics of action theory contraction that has been done in previous work, giving more robust operators that express minimal change based on a notion of dis...
Theory-based predictors of multiple clinician behaviors in the management of diabetes.
Presseau, Justin; Johnston, Marie; Francis, Jill J; Hrisos, Susan; Stamp, Elaine; Steen, Nick; Hawthorne, Gillian; Grimshaw, Jeremy M; Elovainio, Marko; Hunter, Margaret; Eccles, Martin P
2014-08-01
Behavioral theory is often tested on one behavior in isolation from other behaviors and theories. We aimed to test the predictive validity of constructs from motivation and action theories of behavior across six diabetes-related clinician behaviors, within the same sample of primary care clinicians. Physicians and nurses (n = 427 from 99 practices in the United Kingdom) completed questionnaires at baseline and 12 months. Primary outcomes: six self-reported clinician behaviors related to advising, prescribing and examining, measured at 12 months; secondary outcomes: baseline intention and patient-scenario-based simulated behavior. Across the six behaviors, each theory accounted for a medium amount of variance in 12-month behavior (median adjusted R² = 0.15), a large and a medium amount of variance for the two intention measures (median adjusted R² = 0.66 and 0.34), and a small amount of variance for simulated behavior (median adjusted R² = 0.05). Intention/proximal goals, self-efficacy, and habit predicted all behaviors. Constructs from social cognitive theory (self-efficacy), learning theory (habit), and action and coping planning consistently predicted multiple clinician behaviors and should be targeted by quality improvement interventions.
Battino, U.; Pignatari, M.; Ritter, C.; Herwig, F.; Denisenkov, P.; Den Hartogh, J. W.; Trappitsch, R.; Hirschi, R.; Freytag, B.; Thielemann, F.; Paxton, B.
2016-08-01
The s-process nucleosynthesis in asymptotic giant branch (AGB) stars depends on the modeling of convective boundaries. We present models and s-process simulations that adopt a treatment of convective boundaries based on the results of hydrodynamic simulations and on the theory of mixing due to gravity waves in the vicinity of convective boundaries. Hydrodynamics simulations suggest the presence of convective boundary mixing (CBM) at the bottom of the thermal pulse-driven convective zone. Similarly, convection-induced mixing processes are proposed for the mixing below the convective envelope during third dredge-up (TDU), where the ¹³C pocket for the s process in AGB stars forms. In this work, we apply a CBM model motivated by simulations and theory to models with initial mass M = 2 and M = 3 M⊙, and with initial metal content Z = 0.01 and Z = 0.02. As reported previously, the He-intershell abundances of ¹²C and ¹⁶O are increased by CBM at the bottom of the pulse-driven convection zone. This mixing affects the ²²Ne(α,n)²⁵Mg activation and the s-process efficiency in the ¹³C pocket. In our model, CBM at the bottom of the convective envelope during the TDU represents gravity-wave mixing. Furthermore, we take into account the fact that hydrodynamic simulations indicate a mixing efficiency that is already declining about a pressure scale height from the convective boundaries, compared with mixing-length theory. We obtain the formation of a ¹³C pocket with a mass of ≈10⁻⁴ M⊙. The final s-process abundances are characterized by 0.36 < [s/Fe] < 0.78, and the heavy-to-light s-process ratio is −0.23 < [hs/ls] < 0.45. Finally, we compare our results with stellar observations, presolar grain measurements and previous work.
White, Becky K; Martin, Annegret; White, James A; Burns, Sharyn K; Maycock, Bruce R; Giglia, Roslyn C
2016-01-01
Background Despite evidence of the benefits of breastfeeding, few babies are exclusively breastfed to the recommended 6 months. The support of the father is one of the most important factors in breastfeeding success, and targeting breastfeeding interventions to the father has been a successful strategy in previous research. Mobile technology offers unique opportunities to engage and reach populations to enhance health literacy and healthy behavior. Objective The objective of our study was to use previous research, formative evaluation, and behavior change theory to develop the first evidence-based breastfeeding app targeted at men. We designed the app to provide men with social support and information aiming to increase the support men can offer their breastfeeding partners. Methods We used social cognitive theory to design and develop the Milk Man app through stages of formative research, testing, and iteration. We held focus groups with new and expectant fathers (n=18), as well as health professionals (n=16), and used qualitative data to inform the design and development of the app. We tested a prototype with fathers (n=4) via a think-aloud study and the completion of the Mobile Application Rating Scale (MARS). Results Fathers and health professionals provided input through the focus groups that informed the app development. The think-aloud walkthroughs identified 6 areas of functionality and usability to be addressed, including the addition of a tutorial, increased size of text and icons, and greater personalization. Testers rated the app highly, and the average MARS score for the app was 4.3 out of 5. Conclusions To our knowledge, Milk Man is the first breastfeeding app targeted specifically at men. The development of Milk Man followed a best practice approach, including the involvement of a multidisciplinary team and grounding in behavior change theory. It tested well with end users during development. Milk Man is currently being trialed as part of the Parent
Tambo, Yuichi; Hosomi, Yukio; Sakai, Hiroshi; Nogami, Naoyuki; Atagi, Shinji; Sasaki, Yasutsuna; Kato, Terufumi; Takahashi, Toshiaki; Seto, Takashi; Maemondo, Makoto; Nokihara, Hiroshi; Koyama, Ryo; Nakagawa, Kazuhiko; Kawaguchi, Tomoya; Okamura, Yuta; Nakamura, Osamu; Nishio, Makoto; Tamura, Tomohide
2017-04-01
Objectives To determine the recommended dose and efficacy/safety of docetaxel combined with resminostat (DR) in non-small cell lung cancer (NSCLC) patients with previous platinum-based chemotherapy. Materials and Methods A multicenter, open-label, phase I/II study was performed in Japanese patients with stage IIIB/IV or recurrent NSCLC and prior platinum-based chemotherapy. The recommended phase II dose was determined using a standard 3 + 3 dose design in the phase I part. Resminostat was escalated from 400 to 600 mg/day, and docetaxel was fixed at 75 mg/m². In the phase II part, the patients were randomly assigned to docetaxel alone (75 mg/m²) or DR therapy. Docetaxel was administered on day 1 and resminostat on days 1-5 in the DR group. Treatment was repeated every 21 days until progression or unacceptable toxicity. The primary endpoint was progression-free survival (PFS). Results A total of 117 patients (phase I part, 9; phase II part, 108) were enrolled. There was no dose-limiting toxicity in the phase I part; the recommended dose for resminostat was 600 mg/day with 75 mg/m² of docetaxel. In the phase II part, median PFS (95% confidence interval [CI]) was 4.2 (2.8-5.7) months in the docetaxel group and 4.1 (1.5-5.4) months in the DR group (hazard ratio [HR]: 1.354, 95% CI: 0.835-2.195; p = 0.209). Grade ≥ 3 adverse events significantly more common in the DR group than in the docetaxel group were leukopenia, febrile neutropenia, thrombocytopenia, and anorexia. Conclusion In Japanese NSCLC patients previously treated with platinum-based chemotherapy, DR therapy did not improve PFS compared with docetaxel alone and increased toxicity.
Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course
McGowan, Ian S.
2016-01-01
Building on Dabbagh (2005), this paper presents a four-component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body-of-exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…
Yan, Zi
2014-01-01
The theory of planned behaviour (TPB) was used to explore Hong Kong teachers' intentions to implement school-based assessment (SBA) and the predictors of those intentions. A total of 280 teachers from Hong Kong secondary schools who had been involved in SBA were surveyed. Rasch-calibrated teacher measures were calculated for each of the 6…
Predicting Study Abroad Intentions Based on the Theory of Planned Behavior
Schnusenberg, Oliver; de Jong, Pieter; Goel, Lakshmi
2012-01-01
The emphasis on study abroad programs is growing in the academic context as U.S.-based universities seek to incorporate a global perspective in education. Using a model that has underpinnings in the theory of planned behavior (TPB), we predict students' intention to participate in a short-term study abroad program. We use the TPB to identify behavioral,…
Transdiagnostic Theory and Application of Family-Based Treatment for Youth with Eating Disorders
Loeb, Katharine L.; Lock, James; Greif, Rebecca; le Grange, Daniel
2012-01-01
This paper describes the transdiagnostic theory and application of family-based treatment (FBT) for children and adolescents with eating disorders. We review the fundamentals of FBT, a transdiagnostic theoretical model of FBT and the literature supporting its clinical application, adaptations across developmental stages and the diagnostic spectrum…
Examining Instruction in MIDI-based Composition through a Critical Theory Lens
Louth, Paul
2013-01-01
This paper considers the issue of computer-assisted composition in formal music education settings from the perspective of critical theory. The author examines the case of MIDI-based software applications and suggests that the greatest danger from the standpoint of ideology critique is not the potential for circumventing a traditional…
WANG Can-fu; CHENG Xiao-qiu
2011-01-01
The construction of forestry socialized service systems is an important component of the reform of collective forestry tenure systems. Based on the necessity, possibility, and problems of constructing such a system, and drawing on Barnard's organizational structure theory, the paths and countermeasures for China's forestry socialized service system are discussed.
A theory of longitudinally polarised piezocomposite rod based on Mindlin-Herrmann model
Shatalov, M
2010-09-01
The conventional theory of the piezoelectric rod is based on the assumption that its lateral vibrations are negligible. In this case the rod's vibration can be described in terms of a one-dimensional wave equation and a set of mechanical and electric...
An economic theory-based explanatory model of agricultural land-use patterns
Diogo, V.; Koomen, E.; Kuhlman, T.
2015-01-01
An economic theory-based land-use modelling framework is presented aiming to explain the causal link between economic decisions and resulting spatial patterns of agricultural land use. The framework assumes that farmers pursue utility maximisation in agricultural production systems, while conside
Simulation of Climate Negotiation Strategies between China and the U.S. Based on Game Theory
Zhu-Gang Jin
2014-01-01
Citation: Jin, Z.-G., W.-J. Cai, and C. Wang, 2014: Simulation of climate negotiation strategies between China and the U.S. based on game theory. Adv. Clim. Change Res., 5(1), doi: 10.3724/SP.J.1248.2014.034.
Effects of Guided Writing Strategies on Students' Writing Attitudes Based on Media Richness Theory
Lan, Yu-Feng; Hung, Chun-Ling; Hsu, Hung-Ju
2011-01-01
The purpose of this paper is to develop different guided writing strategies based on media richness theory and further evaluate the effects of these writing strategies on younger students' writing attitudes in terms of motivation, enjoyment and anxiety. A total of 66 sixth-grade elementary students with an average age of twelve were invited to…
Liaw, Shu-Sheng; Huang, Hsiu-Mei
2016-01-01
This paper investigates the use of e-books as learning tools in terms of learner satisfaction, usefulness, behavioral intention, and learning effectiveness. Based on the activity theory approach, this research develops a research model to understand learner attitudes toward e-books in two physical sizes: 10″ and 7″. Results suggest that screen…
Assessment of Prevalence of Persons with Down Syndrome: A Theory-Based Demographic Model
de Graaf, Gert; Vis, Jeroen C.; Haveman, Meindert; van Hove, Geert; de Graaf, Erik A. B.; Tijssen, Jan G. P.; Mulder, Barbara J. M.
2011-01-01
Background: The Netherlands is lacking reliable empirical data on the development of birth and population prevalence of Down syndrome. For the UK and Ireland, more historical empirical data are available. A theory-based model is developed for predicting Down syndrome prevalence in the Netherlands from the 1950s onwards. It is…
Schuitema, J.; Peetsma, T.; van der Veen, I.
2014-01-01
The authors investigated the effects of an intervention developed to enhance student motivation in the first years of secondary education. The intervention, based on future time perspective (FTP) theory, has been found to be effective in prevocational secondary education (T. T. D. Peetsma & I. Van
Poverty Lines Based on Fuzzy Sets Theory and Its Application to Malaysian Data
Abdullah, Lazim
2011-01-01
The definition of the poverty line has been acknowledged as highly variable in the majority of published literature. Despite long discussions and successes, the poverty line has a number of problems due to its arbitrary nature. This paper proposes three measurements of poverty lines using membership functions based on fuzzy set theory. The three…
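A membership-function approach of the kind this abstract describes can be illustrated with a minimal sketch. The trapezoidal shape and the income thresholds below are hypothetical illustrations chosen for clarity, not the paper's three proposed measurements:

```python
def poverty_membership(income, lower=500.0, upper=1500.0):
    """Degree of poverty in [0, 1] via a linear (trapezoidal) membership
    function: incomes at or below `lower` are fully poor (1.0), incomes
    at or above `upper` are not poor (0.0), and membership declines
    linearly in between. Thresholds are illustrative only.
    """
    if income <= lower:
        return 1.0
    if income >= upper:
        return 0.0
    return (upper - income) / (upper - lower)

degree = poverty_membership(1000.0)  # halfway between the thresholds -> 0.5
```

Unlike a crisp poverty line, households near the cutoff receive intermediate membership degrees, which is the property the fuzzy-set formulation exploits.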
Min, Shangchao; He, Lianzhen
2014-01-01
This study examined the relative effectiveness of the multidimensional bi-factor model and multidimensional testlet response theory (TRT) model in accommodating local dependence in testlet-based reading assessment with both dichotomously and polytomously scored items. The data used were 14,089 test-takers' item-level responses to the testlet-based…
Gabriel, Rachael
2011-01-01
In 1999, Ball and Cohen proposed a practice-based theory of professional education, which would replace inadequate professional development efforts with a more comprehensive approach. Their work has been widely referenced over the past decade, yet there have been limited attempts to actualize their ideals and research their implications. In this article, I…
Agent-based models for higher-order theory of mind
de Weerd, Harmen; Verbrugge, Rineke; Verheij, Bart; Kamiński, Bogumił; Koloch, Grzegorz
2014-01-01
Agent-based models are a powerful tool for explaining the emergence of social phenomena in a society. In such models, individual agents typically have little cognitive ability. In this paper, we model agents with the cognitive ability to make use of theory of mind. People use this ability to reason
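The distinction between orders of theory of mind that such agent-based models rely on can be sketched in code. The classes and the matching-pennies framing below are illustrative assumptions, not the authors' model: a zero-order agent reasons only from observed actions, while a first-order agent models its opponent as a zero-order reasoner:

```python
import random


class ZeroOrderAgent:
    """Zero-order ToM: predicts the opponent will simply repeat
    the action the opponent chose last round."""

    def __init__(self):
        self.opponent_last = None

    def predict_opponent(self):
        if self.opponent_last is None:
            return random.choice((0, 1))
        return self.opponent_last

    def observe(self, opponent_action):
        self.opponent_last = opponent_action


class FirstOrderAgent(ZeroOrderAgent):
    """First-order ToM: attributes zero-order reasoning to the opponent.
    In a matching-pennies-style game, a zero-order opponent expects me to
    repeat my own last action and will best-respond by playing the
    opposite, so that is what I predict."""

    def __init__(self):
        super().__init__()
        self.my_last = None

    def predict_opponent(self):
        if self.my_last is None:
            return random.choice((0, 1))
        return 1 - self.my_last  # opponent avoids my expected repeat

    def record_own(self, action):
        self.my_last = action
```

Higher orders nest the same idea one level deeper: a second-order agent would model its opponent as a `FirstOrderAgent`.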
Li, Zhenying
2012-01-01
Based on constructivism theory, this paper investigates the application of online multimedia courseware to college English teaching. Through experiments and student feedback, experience has been accumulated, problems have been identified, and insights have been gained in English teaching practice, which pave the…
The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology
Wang, Greg G.; Swanson, Richard A.
2008-01-01
Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…
Marcussen, Lis; Aasberg-Petersen, K.; Krøll, Annette Elisabeth
2000-01-01
An adsorption isotherm equation for nonideal pure component adsorption based on vacancy solution theory and the Non-Random-Two-Liquid (NRTL) equation is found to be useful for predicting pure component adsorption equilibria at a variety of conditions. The isotherm equation is evaluated successfully… adsorption systems; spreading pressure and isosteric heat of adsorption are also calculated.
Determinants of oral hygiene behavior : a study based on the theory of planned behavior
Buunk-Werkhoven, Y.A.; Dijkstra, Arie; van der Schans, C.P.
2011-01-01
Objective: The aim of this study was to develop an index for oral hygiene behavior (OHB) and to examine potential predictors of this actual behavior based on the theory of planned behavior (TPB). Measures of oral health knowledge (OHK) and the expected effect of having healthy teeth on social relati
A resource-based theory of market structure and organizational form
van Witteloostuijn, A.; Boone, C.A.J.J.
2006-01-01
We argue that combining the insights from both the industrial organization and organizational ecology perspectives is likely to produce value added. We develop a resource-based theory of market structure, where resources pertain to the environmental assets (together forming the resource space) witho
Applications of Cognitive Load Theory to Multimedia-Based Foreign Language Learning: An Overview
Chen, I-Jung; Chang, Chi-Cheng; Lee, Yen-Chang
2009-01-01
This article reviews the multimedia instructional design literature based on cognitive load theory (CLT) in the context of foreign language learning. Multimedia are of particular importance in language learning materials because they incorporate text, image, and sound, thus offering an integrated learning experience of the four language skills…
Campbell, Chris; MacPherson, Seonaigh; Sawkins, Tanis
2014-01-01
This case study describes how sociocultural and activity theory were applied in the design of a publicly funded, Canadian Language Benchmark (CLB)-based English as a Second Language (ESL) credential program and curriculum for immigrant and international students in postsecondary institutions in British Columbia, Canada. The ESL Pathways Project…
Imitation dynamics of vaccine decision-making behaviours based on the game theory.
Yang, Junyuan; Martcheva, Maia; Chen, Yuming
2016-01-01
Based on game theory, we propose an age-structured model to investigate the imitation dynamics of vaccine uptake. We first establish the existence and local stability of equilibria, and we show that Hopf bifurcation can occur. We also establish the global stability of the boundary equilibria and the persistence of the disease. The theoretical results are supported by numerical simulations.
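The imitation mechanism underlying such models can be sketched without the age structure. The sketch below is a minimal, non-age-structured illustration under assumed cost values (the function name, parameters, and numbers are hypothetical, not from the paper): individuals switch toward vaccination at a rate proportional to the payoff gain of vaccinating over not vaccinating:

```python
def imitation_dynamics(x0, vacc_cost, infection_cost, risk,
                       k=1.0, dt=0.01, steps=10000):
    """Euler-integrate dx/dt = k * x * (1 - x) * (infection_cost * risk(x)
    - vacc_cost), where x is the vaccinating fraction of the population
    and risk(x) is the infection risk faced by non-vaccinators, assumed
    to decline as coverage x rises. Returns the final coverage."""
    x = x0
    for _ in range(steps):
        payoff_gain = infection_cost * risk(x) - vacc_cost
        x += dt * k * x * (1.0 - x) * payoff_gain
        x = min(max(x, 0.0), 1.0)  # keep the fraction in [0, 1]
    return x

# Illustrative run: infection risk falls linearly with coverage, so the
# interior equilibrium sits where risk(x) equals the cost ratio 0.1.
coverage = imitation_dynamics(x0=0.2, vacc_cost=0.1, infection_cost=1.0,
                              risk=lambda x: max(0.0, 1.0 - x))
```

The x(1 − x) factor is the standard imitation (replicator-style) term: switching requires an encounter between a vaccinator and a non-vaccinator, so uptake stalls at the boundary equilibria x = 0 and x = 1.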
Liaw, Shu-Sheng; Hatala, Marek; Huang, Hsiu-Mei
2010-01-01
Mobile devices could facilitate human interaction and access to knowledge resources anytime and anywhere. With respect to wide application possibilities of mobile learning, investigating learners' acceptance towards it is an essential issue. Based on activity theory approach, this research explores positive factors for the acceptance of m-learning…
Critically Evaluating Competing Theories: An Exercise Based on the Kitty Genovese Murder
Sagarin, Brad J.; Lawler-Sagarin, Kimberly A.
2005-01-01
We describe an exercise based on the 1964 murder of Catherine Genovese--a murder observed by 38 witnesses, none of whom called the police. Students read a summary of the murder and worked in small groups to design an experiment to test the competing theories for the inaction of the witnesses (Americans' selfishness and insensitivity vs. diffusion…
Amadei, A; Apol, MEF; DiNola, A; Berendsen, HJC
1996-01-01
A new theory is presented for calculating the Helmholtz free energy based on the potential energy distribution function. The usual expressions of free energy, internal energy and entropy involving the partition function are rephrased in terms of the potential energy distribution function, which must
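The rephrasing the abstract describes can be sketched schematically. The symbols below are generic placeholders, not necessarily the paper's notation: the configurational partition function is written as an integral over the potential energy distribution rather than over configurations:

```latex
% Sketch, assuming a generic canonical ensemble; \rho(U) denotes the
% distribution of potential energy values U over configuration space.
Q(\beta) = \int e^{-\beta U(\mathbf{r})}\,\mathrm{d}\mathbf{r}
         \;\propto\; \int \rho(U)\, e^{-\beta U}\,\mathrm{d}U,
\qquad
A(\beta) = -\frac{1}{\beta}\ln Q(\beta)
```

Internal energy and entropy then follow from the usual thermodynamic derivatives, e.g. $U_{\mathrm{int}} = \partial(\beta A)/\partial\beta$, which is why knowing $\rho(U)$ suffices to recover all three quantities.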
A communication-based theory of the choice between Greenfield and acquisition entry
A.H.L. Slangen
2011-01-01
This paper develops a communication-based theory of the choice by multinational enterprises (MNEs) between greenfield and acquisition entry. It argues that MNE parents communicate with their subsidiaries for reasons of knowledge exchange, coordination, monitoring, and socialization. The expected com
New Image Recognition Method Based on Rough-Sets and Fuzzy Theory
张艳; 李凤霞; 战守义
2003-01-01
A new image recognition method based on fuzzy-rough set theory is proposed, and its implementation is discussed. The performance of this method as applied to ferrography image recognition is evaluated. It is shown that the new method gives better results than either the fuzzy or the rough-set method used alone.