WorldWideScience

Sample records for model translation methodology

  1. Learning by Translating: A Contrastive Methodology for ESP Learning and Translation

    Directory of Open Access Journals (Sweden)

    Sara Laviosa

    2015-11-01

Full Text Available Over the last few years applied linguists have explored the possibility of integrating the insights of second language acquisition theories, contrastive analysis, foreign language teaching methodologies, and translation studies with a view to enhancing current communicative models and techniques for L2 teaching and translator training (see for example Sewell and Higgins 1996; Laviosa-Braithwaite 1997; Campbell 1998; Malmkjær 1998; Laviosa 2000; Colina 2002). We intend to make a contribution to this interdisciplinary orientation by putting forward a translation-based methodology for learning ESP vocabulary and grammar through real-life mediating communicative activities. With particular reference to the translation task itself, we endeavour to provide teachers of English for special purposes and translator trainers with a methodology for guiding their students in producing, to the best of their abilities, a target text which meets the quality criteria of terminological accuracy and stylistic fluency, and is also effective in terms of the communicative situation it is intended for. After outlining the rationale and main theoretical approaches underpinning our work, we will illustrate our methodology for learning ESP vocabulary and translation skills from a contrastive perspective, as in our book Learning by Translating (Laviosa and Cleverton 2003).

  2. Methodological considerations when translating “burnout”

    Directory of Open Access Journals (Sweden)

    Allison Squires

    2014-09-01

Full Text Available No study has systematically examined how researchers address the cross-cultural adaptation of burnout measures. We conducted an integrative review to examine how researchers had adapted the instruments to different contexts. We also reviewed the Content Validity Indexing scores for the Maslach Burnout Inventory-Human Services Survey from the 12-country comparative nursing workforce study RN4CAST. The integrative review found multiple translation-related issues in existing studies. In the cross-cultural instrument analysis, 7 out of 22 items on the instrument received an extremely low kappa score. Investigators may need to employ more rigorous cross-cultural adaptation methods when attempting to measure burnout.
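
The item-level validity screening described in this abstract can be sketched in code. The following is a minimal sketch assuming a 4-point relevance scale and a Polit-Beck style of content validity indexing; the panel size and ratings below are hypothetical, not data from the RN4CAST study:

```python
from math import comb

def i_cvi(ratings):
    """Item-level Content Validity Index: the share of experts rating
    the item 3 or 4 on a 4-point relevance scale."""
    relevant = sum(1 for r in ratings if r >= 3)
    return relevant / len(ratings)

def modified_kappa(ratings):
    """I-CVI adjusted for chance agreement (Polit-Beck modified kappa):
    k* = (I-CVI - Pc) / (1 - Pc), where Pc is the binomial probability
    of this split arising by chance."""
    n = len(ratings)
    a = sum(1 for r in ratings if r >= 3)
    pc = comb(n, a) * 0.5 ** n
    return (i_cvi(ratings) - pc) / (1 - pc)

# Hypothetical panel of 6 experts rating one translated item
ratings = [4, 3, 4, 2, 4, 3]
print(round(i_cvi(ratings), 2))         # 0.83
print(round(modified_kappa(ratings), 2))  # 0.82
```

Items whose adjusted kappa falls below a conventional cutoff (e.g. 0.40) would be flagged for revision, as 7 of the 22 instrument items were here.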

  3. A Metadata based Knowledge Discovery Methodology for Seeding Translational Research.

    Science.gov (United States)

    Kothari, Cartik R; Payne, Philip R O

    2015-01-01

    In this paper, we present a semantic, metadata based knowledge discovery methodology for identifying teams of researchers from diverse backgrounds who can collaborate on interdisciplinary research projects: projects in areas that have been identified as high-impact areas at The Ohio State University. This methodology involves the semantic annotation of keywords and the postulation of semantic metrics to improve the efficiency of the path exploration algorithm as well as to rank the results. Results indicate that our methodology can discover groups of experts from diverse areas who can collaborate on translational research projects.
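
The path-exploration step the abstract mentions can be illustrated with a toy sketch. This is not the authors' algorithm: it assumes a simple bipartite researcher-keyword representation and plain breadth-first search, with hypothetical names and keywords:

```python
from collections import deque

def collaboration_path(profiles, start, goal):
    """profiles: {researcher: set of annotated keywords}. Returns a
    chain of researchers in which each adjacent pair shares at least
    one keyword, found by breadth-first search (shortest chain)."""
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for other, kws in profiles.items():
            # Link researchers whose keyword sets overlap
            if other not in seen and profiles[path[-1]] & kws:
                seen.add(other)
                frontier.append(path + [other])
    return None

profiles = {
    "alice": {"ontology", "metadata"},
    "bob": {"metadata", "clinical-trials"},
    "carol": {"clinical-trials", "genomics"},
}
print(collaboration_path(profiles, "alice", "carol"))
# ['alice', 'bob', 'carol']
```

The semantic metrics postulated in the paper would replace the binary "shares a keyword" test with weighted edges, pruning and ranking the explored paths.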

  4. Reflexivity: a methodological tool in the knowledge translation process?

    Science.gov (United States)

    Alley, Sarah; Jackson, Suzanne F; Shakya, Yogendra B

    2015-05-01

    Knowledge translation is a dynamic and iterative process that includes the synthesis, dissemination, exchange, and application of knowledge. It is considered the bridge that closes the gap between research and practice. Yet it appears that in all areas of practice, a significant gap remains in translating research knowledge into practical application. Recently, researchers and practitioners in the field of health care have begun to recognize reflection and reflexive exercises as a fundamental component to the knowledge translation process. As a practical tool, reflexivity can go beyond simply looking at what practitioners are doing; when approached in a systematic manner, it has the potential to enable practitioners from a wide variety of backgrounds to identify, understand, and act in relation to the personal, professional, and political challenges they face in practice. This article focuses on how reflexive practice as a methodological tool can provide researchers and practitioners with new insights and increased self-awareness, as they are able to critically examine the nature of their work and acknowledge biases, which may affect the knowledge translation process. Through the use of structured journal entries, the nature of the relationship between reflexivity and knowledge translation was examined, specifically exploring if reflexivity can improve the knowledge translation process, leading to increased utilization and application of research findings into everyday practice.

  5. The use of social surveys in translation studies: methodological characteristics

    OpenAIRE

    Kuznik, Anna; Hurtado Albir, Amparo; Espinal Berenguer, Anna; Andrews, Mark

    2010-01-01

    Translation is an activity carried out by professionals – in some cases after a period of formal training – who are employed or self-employed, and whose work is destined for translation users. Translators, translator trainees, employers of translators, and translation users are four clearly defined social groups within the translation industry that may be the subject of study using one of the methods most frequently used within the field of social sciences: the social survey. This paper prese...

  6. Contemporary research on parenting: conceptual, methodological, and translational issues.

    Science.gov (United States)

    Power, Thomas G; Sleddens, Ester F C; Berge, Jerica; Connell, Lauren; Govig, Bert; Hennessy, Erin; Liggett, Leanne; Mallan, Kimberley; Santa Maria, Diane; Odoms-Young, Angela; St George, Sara M

    2013-08-01

Researchers over the last decade have documented the association between general parenting style and numerous factors related to childhood obesity (e.g., children's eating behaviors, physical activity, and weight status). Many recent childhood obesity prevention programs are family-focused and designed to modify parenting behaviors thought to contribute to childhood obesity risk. This article presents a brief consideration of conceptual, methodological, and translational issues that can inform future research on the role of parenting in childhood obesity. They include: (1) general versus domain-specific parenting styles and practices; (2) the role of ethnicity and culture; (3) assessing bidirectional influences; (4) broadening assessments beyond the immediate family; (5) novel approaches to parenting measurement; and (6) designing effective interventions. Numerous directions for future research are offered.

  7. Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?

    Science.gov (United States)

    Brondani, Mario; He, Sarah

    2013-01-01

    Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…

  9. Translational models of lung disease.

    Science.gov (United States)

    Mercer, Paul F; Abbott-Banner, Katharine; Adcock, Ian M; Knowles, Richard G

    2015-02-01

    The 2nd Cross Company Respiratory Symposium (CCRS), held in Horsham, U.K. in 2012, brought together representatives from across the pharmaceutical industry with expert academics, in the common interest of improving the design and translational predictiveness of in vivo models of respiratory disease. Organized by the respiratory representatives of the European Federation of Pharmaceutical Industries and Federations (EFPIA) group of companies involved in the EU-funded project (U-BIOPRED), the aim of the symposium was to identify state-of-the-art improvements in the utility and design of models of respiratory disease, with a view to improving their translational potential and reducing wasteful animal usage. The respiratory research and development community is responding to the challenge of improving translation in several ways: greater collaboration and open sharing of data, careful selection of the species, complexity and chronicity of the models, improved practices in preclinical research, continued refinement in models of respiratory diseases and their sub-types, greater understanding of the biology underlying human respiratory diseases and their sub-types, and finally greater use of human (and especially disease-relevant) cells, tissues and explants. The present review highlights these initiatives, combining lessons from the symposium and papers published in Clinical Science arising from the symposium, with critiques of the models currently used in the settings of asthma, idiopathic pulmonary fibrosis and COPD. The ultimate hope is that this will contribute to a more rational, efficient and sustainable development of a range of new treatments for respiratory diseases that continue to cause substantial morbidity and mortality across the world.

  10. The Integration of Project-Based Methodology into Teaching in Machine Translation

    Science.gov (United States)

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…

  11. Key Methodological Aspects of Translators' Training in Ukraine and in the USA

    Science.gov (United States)

    Skyba, Kateryna

    2015-01-01

    The diversity of international relations in the globalized world has influenced the role of a translator that is becoming more and more important. Translators' training institutions today are to work out and to implement the best teaching methodology taking into consideration the new challenges of modern multinational and multicultural society.…

  12. Hybrid intelligent methodology to design translation invariant morphological operators for Brazilian stock market prediction.

    Science.gov (United States)

    Araújo, Ricardo de A

    2010-12-01

    This paper presents a hybrid intelligent methodology to design increasing translation invariant morphological operators applied to Brazilian stock market prediction (overcoming the random walk dilemma). The proposed Translation Invariant Morphological Robust Automatic phase-Adjustment (TIMRAA) method consists of a hybrid intelligent model composed of a Modular Morphological Neural Network (MMNN) with a Quantum-Inspired Evolutionary Algorithm (QIEA), which searches for the best time lags to reconstruct the phase space of the time series generator phenomenon and determines the initial (sub-optimal) parameters of the MMNN. Each individual of the QIEA population is further trained by the Back Propagation (BP) algorithm to improve the MMNN parameters supplied by the QIEA. Also, for each prediction model generated, it uses a behavioral statistical test and a phase fix procedure to adjust time phase distortions observed in stock market time series. Furthermore, an experimental analysis is conducted with the proposed method through four Brazilian stock market time series, and the achieved results are discussed and compared to results found with random walk models and the previously introduced Time-delay Added Evolutionary Forecasting (TAEF) and Morphological-Rank-Linear Time-lag Added Evolutionary Forecasting (MRLTAEF) methods.
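
The lag-selection idea underlying this method, feeding a predictor with series values at chosen time lags to reconstruct the phase space, can be sketched independently of the QIEA/MMNN machinery. A minimal illustration with made-up numbers; the lags here are arbitrary, not ones the TIMRAA method would search for:

```python
def lag_patterns(series, lags):
    """Build (input, target) training pairs for a lag-based predictor:
    each input holds the series values at the chosen lags before time t,
    and the target is the value at t."""
    max_lag = max(lags)
    pairs = []
    for t in range(max_lag, len(series)):
        pairs.append(([series[t - lag] for lag in lags], series[t]))
    return pairs

series = [0.1, 0.4, 0.2, 0.5, 0.3, 0.6]
for x, y in lag_patterns(series, lags=[1, 3]):
    print(x, y)
```

A forecasting model (here, the MMNN) is then trained on these pairs; the evolutionary search evaluates candidate lag sets by how well the resulting model predicts held-out data.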

  13. A Study on Jerome Translation Model and Horace Model

    Institute of Scientific and Technical Information of China (English)

    杨坚; 石美

    2013-01-01

The two translation models, namely the Jerome model and the Horace model, can be seen as forerunners of present-day translation theory. They share one common concept: faithfulness. But the two models understand faithfulness differently: in the Jerome model the translator is faithful to a text, whereas in the Horace model he is faithful to his customers. Moreover, the principles of faithfulness, equivalence, domesticating and foreignizing in the two translation models all relate to the three factors necessary in translation: the author, the text and the reader.

  14. "Traduction" et didactique des langues ("Translation" and Language-Teaching Methodology)

    Science.gov (United States)

    Besse, Henri

    1975-01-01

Attempts, through a clarification of the term "translation," to define and situate, from a methodological point of view, the learner's latent tendency to fall back on the native-language system, and its effect on the learning of the target language. (Text is in French.) (IFS/WGA)

  15. Towards a Multimodal Methodology for the Analysis of Translated/Localised Games

    Directory of Open Access Journals (Sweden)

    Bárbara Resende Coelho

    2016-12-01

Full Text Available Multimedia materials require research methodologies that are able to comprehend all of their assets. Videogames are the epitome of multimedia, joining image, sound, video, animation, graphics and text with the interactivity factor. A methodology for research into the translation and localisation of videogames should therefore be able to analyse all of their assets and features. This paper sets out to develop a research methodology for games and their translations/localisations that goes beyond the collection and analysis of screenshots and includes as many of their assets as possible. Using the fully localised version of the game Watchdogs, this paper shows how tools and technologies allow for transcending the mere analysis of linguistic contents within multimedia materials. Using the ELAN annotation software from The Language Archive to analyse Portuguese-language dubbed and English-language subtitled excerpts from the videogame, it was possible to identify patterns in both linguistic and audio-visual elements, as well as to correlate them.
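
Extracting time-aligned annotations of the kind the study analyses can be sketched with Python's standard library. This assumes a simplified .eaf structure (aligned annotations only) and a made-up single-annotation tier; it is not the workflow used in the paper:

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily trimmed ELAN .eaf document
EAF = """<ANNOTATION_DOCUMENT>
  <TIME_ORDER>
    <TIME_SLOT TIME_SLOT_ID="ts1" TIME_VALUE="0"/>
    <TIME_SLOT TIME_SLOT_ID="ts2" TIME_VALUE="1200"/>
  </TIME_ORDER>
  <TIER TIER_ID="dub_pt">
    <ANNOTATION>
      <ALIGNABLE_ANNOTATION TIME_SLOT_REF1="ts1" TIME_SLOT_REF2="ts2">
        <ANNOTATION_VALUE>Vamos nessa.</ANNOTATION_VALUE>
      </ALIGNABLE_ANNOTATION>
    </ANNOTATION>
  </TIER>
</ANNOTATION_DOCUMENT>"""

def tier_annotations(eaf_xml, tier_id):
    """Return (start_ms, end_ms, text) triples for one tier, resolving
    each annotation's time-slot references against the TIME_ORDER map."""
    root = ET.fromstring(eaf_xml)
    slots = {ts.get("TIME_SLOT_ID"): int(ts.get("TIME_VALUE"))
             for ts in root.iter("TIME_SLOT")}
    triples = []
    for tier in root.iter("TIER"):
        if tier.get("TIER_ID") == tier_id:
            for ann in tier.iter("ALIGNABLE_ANNOTATION"):
                triples.append((slots[ann.get("TIME_SLOT_REF1")],
                                slots[ann.get("TIME_SLOT_REF2")],
                                ann.findtext("ANNOTATION_VALUE", "")))
    return triples

print(tier_annotations(EAF, "dub_pt"))  # [(0, 1200, 'Vamos nessa.')]
```

Aligning a dubbed tier against a subtitle tier by comparing their time intervals is then a straightforward extension of this extraction step.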

  16. Which Principles and which Methodology for Specialized Translation? Contrastive Typology (French – Italian) of Specialized Discourses: The Example of Translation from the Field of Economics

    Directory of Open Access Journals (Sweden)

    Louis BEGIONI

    2011-01-01

Full Text Available The few reflections presented here are the result of our observations on specialized translation, both version (translation into the mother tongue) and thème (translation into the foreign language), and on the translation methodology we follow as part of our teaching activity in the Applied Modern Languages unit of our University.

  17. Nonhuman primate models in translational regenerative medicine.

    Science.gov (United States)

    Daadi, Marcel M; Barberi, Tiziano; Shi, Qiang; Lanford, Robert E

    2014-12-01

    Humans and nonhuman primates (NHPs) are similar in size, behavior, physiology, biochemistry, structure and function of organs, and complexity of the immune system. Research on NHPs generates complementary data that bridge translational research from small animal models to humans. NHP models of human disease offer unique opportunities to develop stem cell-based therapeutic interventions that directly address relevant and challenging translational aspects of cell transplantation therapy. These include the use of autologous induced pluripotent stem cell-derived cellular products, issues related to the immune response in autologous and allogeneic setting, pros and cons of delivery techniques in a clinical setting, as well as the safety and efficacy of candidate cell lines. The NHP model allows the assessment of complex physiological, biochemical, behavioral, and imaging end points, with direct relevance to human conditions. At the same time, the value of using primates in scientific research must be carefully evaluated and timed due to expense and the necessity for specialized equipment and highly trained personnel. Often it is more efficient and useful to perform initial proof-of-concept studies for new therapeutics in rodents and/or other species before the pivotal studies in NHPs that may eventually lead to first-in-human trials. In this report, we present how the Southwest National Primate Research Center, one of seven NIH-funded National Primate Research Centers, may help the global community in translating promising technologies to the clinical arena.

  18. Sketch of a Noisy Channel Model for the Translation Process

    DEFF Research Database (Denmark)

    Carl, Michael

The paper develops a Noisy Channel Model for the translation process that is based on actual user activity data. It builds on the monitor model and makes a distinction between early, automatic and late, conscious translation processes: while early priming processes are at the basis of a "literal… The author's current research interests are related to the investigation of human translation processes and how advanced computer tools (such as machine translation) can fruitfully complement and support human translation activities. Furthermore, he is the director of the Center for Research and Innovation in Translation and Translation Technology (CRITT) at the Department of International Business Communication.

  19. Applying Corpus Methodology to Error Analysis of Students' Translation into the L1: The Context of Audiovisual Translation

    OpenAIRE

    Yakimovskaya, Ksenia

    2012-01-01

The aim of the present research is to investigate error patterns in students' translation of audiovisual discourse and to describe the factors influencing the process of interpretation. As translation into the mother tongue is usually considered the norm, excerpts taken from a movie script were rendered by participants from English (L2) into Russian (L1). The data was collected from 12 learners studying at the Department of Translation and Interpretation at Pyatigorsk State Linguistic U…

  20. Translational In Vivo Models for Cardiovascular Diseases.

    Science.gov (United States)

    Fliegner, Daniela; Gerdes, Christoph; Meding, Jörg; Stasch, Johannes-Peter

    2016-01-01

Cardiovascular diseases are still the leading cause of death and morbidity in developed countries. Experimental cardiology research and preclinical drug development in cardiology call for appropriate, and especially clinically relevant, in vitro and in vivo studies. The use of animal models has expanded our knowledge and understanding of the underlying mechanisms and accordingly provided new approaches focused on improving the diagnosis and treatment of various cardiac pathologies. Numerous animal models in different species, both small and large animals, have been developed to address cardiovascular complications, including heart failure, pulmonary hypertension, and thrombotic diseases. However, a perfect model of heart failure, or of any other indication, that reproduces every aspect of the natural disease does not exist. The complexity and heterogeneity of cardiac diseases, together with the influence of genetic and environmental factors, make it impossible to mirror a particular disease with a single experimental model. Thus, drug development in the field of cardiology is not only very challenging but also inspiring; animal models should therefore be selected that reflect, as closely as possible, the disease being investigated. Given the wide range of animal models available nowadays, choosing models that reflect critical features of human pathophysiology increases the likelihood of translation to patients. Furthermore, this knowledge, and the resulting increase in the predictive value of preclinical models, helps us to find more efficient and reliable solutions as well as better and innovative treatment strategies for cardiovascular diseases.

  1. Animal models of tic disorders: a translational perspective.

    Science.gov (United States)

    Godar, Sean C; Mosher, Laura J; Di Giovanni, Giuseppe; Bortolato, Marco

    2014-12-30

    Tics are repetitive, sudden movements and/or vocalizations, typically enacted as maladaptive responses to intrusive premonitory urges. The most severe tic disorder, Tourette syndrome (TS), is a childhood-onset condition featuring multiple motor and at least one phonic tic for a duration longer than 1 year. The pharmacological treatment of TS is mainly based on antipsychotic agents; while these drugs are often effective in reducing tic severity and frequency, their therapeutic compliance is limited by serious motor and cognitive side effects. The identification of novel therapeutic targets and development of better treatments for tic disorders is conditional on the development of animal models with high translational validity. In addition, these experimental tools can prove extremely useful to test hypotheses on the etiology and neurobiological bases of TS and related conditions. In recent years, the translational value of these animal models has been enhanced, thanks to a significant re-organization of our conceptual framework of neuropsychiatric disorders, with a greater focus on endophenotypes and quantitative indices, rather than qualitative descriptors. Given the complex and multifactorial nature of TS and other tic disorders, the selection of animal models that can appropriately capture specific symptomatic aspects of these conditions can pose significant theoretical and methodological challenges. In this article, we will review the state of the art on the available animal models of tic disorders, based on genetic mutations, environmental interventions as well as pharmacological manipulations. Furthermore, we will outline emerging lines of translational research showing how some of these experimental preparations have led to significant progress in the identification of novel therapeutic targets for tic disorders.

  2. Education in health research methodology: use of a wiki for knowledge translation.

    Directory of Open Access Journals (Sweden)

    Michele P Hamm

    Full Text Available INTRODUCTION: A research-practice gap exists between what is known about conducting methodologically rigorous randomized controlled trials (RCTs and what is done. Evidence consistently shows that pediatric RCTs are susceptible to high risk of bias; therefore novel methods of influencing the design and conduct of trials are required. The objective of this study was to develop and pilot test a wiki designed to educate pediatric trialists and trainees in the principles involved in minimizing risk of bias in RCTs. The focus was on preliminary usability testing of the wiki. METHODS: The wiki was developed through adaptation of existing knowledge translation strategies and through tailoring the site to the identified needs of the end-users. The wiki was evaluated for usability and user preferences regarding the content and formatting. Semi-structured interviews were conducted with 15 trialists and systematic reviewers, representing varying levels of experience with risk of bias or the conduct of trials. Data were analyzed using content analysis. RESULTS: Participants found the wiki to be well organized, easy to use, and straightforward to navigate. Suggestions for improvement tended to focus on clarification of the text or on esthetics, rather than on the content or format. Participants liked the additional features of the site that were supplementary to the text, such as the interactive examples, and the components that focused on practical applications, adding relevance to the theory presented. While the site could be used by both trialists and systematic reviewers, the lack of a clearly defined target audience caused some confusion among participants. CONCLUSIONS: Participants were supportive of using a wiki as a novel educational tool. The results of this pilot test will be used to refine the risk of bias wiki, which holds promise as a knowledge translation intervention for education in medical research methodology.

  3. Serbian translation of the 20-item Toronto Alexithymia Scale: Psychometric properties and a new methodological approach to translating scales

    Directory of Open Access Journals (Sweden)

    Trajanović Nikola N.

    2013-01-01

Full Text Available Introduction. Since the inception of the alexithymia construct in the 1970s, there has been a continuous effort to improve both its theoretical postulates and its clinical utility through the development, standardization and validation of assessment scales. Objective. The aim of this study was to validate the Serbian translation of the 20-item Toronto Alexithymia Scale (TAS-20) and to propose a new method for translating scales with the property of temporal stability. Methods. The scale was expertly translated by bilingual medical professionals and a linguist, and given to a sample of bilingual participants from the general population who completed both the English and the Serbian version of the scale one week apart. Results. The findings showed that the Serbian version of the TAS-20 had good internal consistency reliability for the total scale (α=0.86) and acceptable reliability for the three factors (α=0.71-0.79). Conclusion. The analysis confirmed the validity and consistency of the Serbian translation of the scale, with an observed weakness of the factorial structure consistent with studies in other languages. The results also showed that the method of using a self-controlled bilingual subject is a useful alternative to the back-translation method, particularly for linguistically and structurally sensitive scales, or where a larger sample is not available. This method, dubbed 'forth-translation', could be used to translate psychometric scales measuring properties that remain temporally stable over a period of at least several weeks.
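
Internal-consistency figures like the α values quoted above can be reproduced mechanically from raw item scores. A minimal sketch of Cronbach's alpha with made-up responses, not TAS-20 data:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha. items: one list of scores per questionnaire
    item, aligned across respondents (same respondent order in every
    inner list). Uses sample variances throughout."""
    k = len(items)
    sum_item_var = sum(statistics.variance(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum_item_var / statistics.variance(totals))

# Hypothetical responses: 3 items, 4 respondents
items = [[2, 4, 3, 5], [1, 3, 3, 4], [2, 4, 2, 5]]
print(round(cronbach_alpha(items), 2))  # 0.95
```

Running the same computation separately on the items of each factor yields the per-factor reliabilities reported in studies of this kind.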

  4. A Model for Cognitive Process of Neologisms Translation

    Directory of Open Access Journals (Sweden)

    Seyed Mohammad Moghadas

    2014-03-01

Full Text Available Over the past three decades, process-oriented Descriptive Translation Studies has developed noticeably. Think-aloud protocols (TAPs), a form of verbal report, are still the most widely applied empirical method for investigating the complex and conscious processes of the translator's mind during translating. This article deals with the cognitive process of professional translators' problem-solving when translating a neologism from an English source text into Persian, using TAPs. The researcher also used video recording to observe other behaviors of the participants during problem-solving. The results indicate that professional translators do not use one single way of performing a translation task, and that the complexity of the problem-solving process (neologism translation) depends on the translation competence of the translators. Finally, the researcher presents a cognitive model of the translation process for neologisms in ideal situations. As translation universals are cognitive phenomena, the cognitive model presented here can serve as a pattern to help trainee translators visualize the natural process of neologism translation.

  5. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the w…

  6. Intelligent CAD Methodology Research of Adaptive Modeling

    Institute of Scientific and Technical Information of China (English)

    ZHANG Weibo; LI Jun; YAN Jianrong

    2006-01-01

The key to implementing ICAD technology is to establish a knowledge-based product model covering a wide range of domains. This paper puts forward a knowledge-based methodology for adaptive modeling. Guided by ontology principles, it uses object-oriented technology within a knowledge-based model framework. It spans the diverse domains involved in product design and realizes multi-domain modeling, embedding the relevant information, including standards, rules and expert experience. To test the feasibility of the methodology, the research addresses the design of an automotive diaphragm spring clutch, and an adaptive clutch design model is established using the knowledge-based modeling language AML.

  7. A Structured Methodology for Spreadsheet Modelling

    CERN Document Server

    Knight, Brian; Rajalingham, Kamalesen

    2008-01-01

    In this paper, we discuss the problem of the software engineering of a class of business spreadsheet models. A methodology for structured software development is proposed, which is based on structured analysis of data, represented as Jackson diagrams. It is shown that this analysis allows a straightforward modularisation, and that individual modules may be represented with indentation in the block-structured form of structured programs. The benefits of structured format are discussed, in terms of comprehensibility, ease of maintenance, and reduction in errors. The capability of the methodology to provide a modular overview in the model is described, and examples are given. The potential for a reverse-engineering tool, to transform existing spreadsheet models is discussed.

  8. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  9. Representational Translation with Concrete Models in Organic Chemistry

    Science.gov (United States)

    Stull, Andrew T.; Hegarty, Mary; Dixon, Bonnie; Stieff, Mike

    2012-01-01

    In representation-rich domains such as organic chemistry, students must be facile and accurate when translating between different 2D representations, such as diagrams. We hypothesized that translating between organic chemistry diagrams would be more accurate when concrete models were used because difficult mental processes could be augmented by…

  10. Probabilistic Boolean Network Modelling and Analysis Framework for mRNA Translation.

    Science.gov (United States)

    Zhao, Yun-Bo; Krishnan, J

    2016-01-01

    mRNA translation is a complex process involving the progression of ribosomes on the mRNA, resulting in the synthesis of proteins, and is subject to multiple layers of regulation. This process has been modelled using different formalisms, both stochastic and deterministic. Recently, we introduced a Probabilistic Boolean modelling framework for mRNA translation, which possesses the advantage of tools for numerically exact computation of steady state probability distribution, without requiring simulation. Here, we extend this model to incorporate both random sequential and parallel update rules, and demonstrate its effectiveness in various settings, including its flexibility in accommodating additional static and dynamic biological complexities and its role in parameter sensitivity analysis. In these applications, the results from the model analysis match those of TASEP model simulations. Importantly, the proposed modelling framework maintains the stochastic aspects of mRNA translation and provides a way to exactly calculate probability distributions, providing additional tools of analysis in this context. Finally, the proposed modelling methodology provides an alternative approach to the understanding of the mRNA translation process, by bridging the gap between existing approaches, providing new analysis tools, and contributing to a more robust platform for modelling and understanding translation.
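
    The random sequential update rule described above can be made concrete with a toy example: a three-site probabilistic Boolean chain of ribosome occupancy whose steady state is computed numerically exactly from the transition matrix rather than by simulation. The three-site geometry and all rates are assumptions for illustration, not the authors' model.

    ```python
    import itertools

    # Toy probabilistic Boolean model of ribosome flow on a 3-codon mRNA.
    # Each site is Boolean (1 = ribosome present). One of four events is
    # attempted per step (random sequential update): initiation, two hops,
    # termination. All rates below are illustrative assumptions.
    P_INIT, P_HOP, P_TERM = 0.4, 0.7, 0.5

    STATES = list(itertools.product((0, 1), repeat=3))
    INDEX = {s: i for i, s in enumerate(STATES)}

    def transition_row(s):
        """Exact one-step distribution over successor states of s."""
        row = [0.0] * len(STATES)
        events = [
            # (enabled condition, success probability, resulting state)
            (s[0] == 0, P_INIT, (1, s[1], s[2])),            # initiation
            (s[0] == 1 and s[1] == 0, P_HOP, (0, 1, s[2])),  # hop 0 -> 1
            (s[1] == 1 and s[2] == 0, P_HOP, (s[0], 0, 1)),  # hop 1 -> 2
            (s[2] == 1, P_TERM, (s[0], s[1], 0)),            # termination
        ]
        for enabled, p, nxt in events:
            if enabled:
                row[INDEX[nxt]] += 0.25 * p      # event chosen w.p. 1/4
                row[INDEX[s]] += 0.25 * (1 - p)  # chosen but failed
            else:
                row[INDEX[s]] += 0.25            # event not enabled
        return row

    P = [transition_row(s) for s in STATES]

    # Steady state by power iteration on the 8-state chain -- no simulation.
    pi = [1.0 / len(STATES)] * len(STATES)
    for _ in range(5000):
        pi = [sum(pi[i] * P[i][j] for i in range(len(STATES)))
              for j in range(len(STATES))]

    # Protein production rate = stationary flux through the termination event.
    protein_rate = 0.25 * P_TERM * sum(p for s, p in zip(STATES, pi) if s[2] == 1)
    print(round(sum(pi), 6), round(protein_rate, 4))
    ```

    With 8 states the stationary distribution is cheap to obtain exactly, which is the advantage the abstract highlights over TASEP simulation.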

  11. Translation

    OpenAIRE

    2005-01-01

    "Translation" is a life narrative about the ways in which cultural histories shape personal stories, and the capacity of the imagination to develop alternative narratives about oneself and the world. It can also be read as a way of addressing the effects of what Ato Quayson calls the global process of postcolonializing. Quayson's critical perspective might be used as an interpretive lens for seeing some of the ways in which this autobiographical narrative complicates the jargon of race, cl...

  13. A Stochastic Model of RNA Translation with Frameshifting

    Science.gov (United States)

    Bailey, Brenae

    2011-10-01

    Many viruses can produce different proteins from the same RNA sequence by encoding them in overlapping genes. One mechanism that causes the ribosomes of infected cells to decode both genes is called programmed ribosomal frameshifting (PRF). Although PRF has been recognized for 25 years, the mechanism is not well understood. We have developed a model that treats RNA translation as a stochastic process in which the transition probabilities are based on the free energies of local molecular interactions. The model reproduces observed translation rates and frameshift efficiencies, and can be used to predict the effects of mutations in the viral RNA sequence on both the mean translation rate and the frameshift efficiency.
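
    The idea of transition probabilities derived from free energies can be sketched as follows, assuming two competing pathways (normal translocation vs. a -1 frameshift) at a single slippery site, with Boltzmann-weighted branching. The energies, site position and sequence length are invented for illustration, not taken from the paper.

    ```python
    import math
    import random

    random.seed(1)

    KT = 0.62       # ~RT at 37 C, kcal/mol
    E_STEP = 1.0    # assumed barrier for normal translocation (kcal/mol)
    E_SHIFT = 1.8   # assumed barrier for the -1 frameshift pathway

    def frameshift_probability(e_step, e_shift, kt=KT):
        """Boltzmann-weighted choice between the two competing pathways."""
        w_step = math.exp(-e_step / kt)
        w_shift = math.exp(-e_shift / kt)
        return w_shift / (w_step + w_shift)

    def simulate(n_ribosomes=10_000, slippery_pos=30, length=100):
        """Monte Carlo estimate of frameshift efficiency on a toy mRNA."""
        p_shift = frameshift_probability(E_STEP, E_SHIFT)
        shifted = 0
        for _ in range(n_ribosomes):
            frame = 0
            for pos in range(length):  # ribosome steps codon by codon
                if pos == slippery_pos and random.random() < p_shift:
                    frame = -1         # programmed -1 frameshift
            shifted += frame == -1
        return shifted / n_ribosomes

    print(round(frameshift_probability(E_STEP, E_SHIFT), 3))
    ```

    In this stochastic framing, a mutation in the viral sequence would change the local free energies, and hence both the branching probability and the predicted frameshift efficiency.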

  14. Linguistically motivated statistical machine translation models and algorithms

    CERN Document Server

    Xiong, Deyi

    2015-01-01

    This book provides a wide variety of algorithms and models to integrate linguistic knowledge into Statistical Machine Translation (SMT). It helps advance conventional SMT to linguistically motivated SMT by enhancing the following three essential components: translation, reordering and bracketing models. It also serves the purpose of promoting the in-depth study of the impacts of linguistic knowledge on machine translation. Finally it provides a systematic introduction of Bracketing Transduction Grammar (BTG) based SMT, one of the state-of-the-art SMT formalisms, as well as a case study of linguistically motivated SMT on a BTG-based platform.

  15. The Multidisciplinary Translational Team (MTT) Model for Training and Development of Translational Research Investigators.

    Science.gov (United States)

    Ameredes, Bill T; Hellmich, Mark R; Cestone, Christina M; Wooten, Kevin C; Ottenbacher, Kenneth J; Chonmaitree, Tasnee; Anderson, Karl E; Brasier, Allan R

    2015-10-01

    Multi-institutional research collaborations now form the most rapid and productive project execution structures in the health sciences. Effective adoption of a multidisciplinary team research approach is widely accepted as one mechanism enabling rapid translation of new discoveries into interventions in human health. Although the impact of successful team-based approaches facilitating innovation has been well documented, its utility for training a new generation of scientists has not been thoroughly investigated. We describe the characteristics of how multidisciplinary translational teams (MTTs) promote career development of translational research scholars through competency building, interprofessional integration, and team-based mentoring approaches. Exploratory longitudinal and outcome assessments from our experience show that MTT membership had a positive effect on the development of translational research competencies, as determined by a self-report survey of 32 scholars. We also observed that all trainees produced a large number of collaborative publications that appeared to be associated with their CTSA association and participation with MTTs. We conclude that the MTT model provides a unique training environment for translational and team-based learning activities, for investigators at early stages of career development.

  16. Lost and Found in Translation: An Ecological Approach to Bilingual Research Methodology

    Directory of Open Access Journals (Sweden)

    Justin Jagosh PhD

    2009-06-01

    Full Text Available Translation issues emerged from a qualitative study, conducted in French and English, that gathered patient perspectives on a newly implemented undergraduate medical curriculum entitled Physicianship: The Physician as Professional and Healer. French-speaking participants were interviewed using a translated interview guide, originally developed in English. A major finding was that francophone participants contested the idea of the physician-healer in a manner not witnessed among the anglophone participants. Consultation with multilingual health professionals was undertaken to explore whether the contestation was the result of poor translation of the word healer. This process confirmed that no appropriate French equivalent could be found. With hindsight, the authors emphasize the importance of pretesting translated research instrumentation. An ecological perspective on language equivalency is also emphasized, in which emergent linguistic discrepancies are viewed as opportunities for learning about the culture-language relationship.

  17. Hon-yaku: a biology-driven Bayesian methodology for identifying translation initiation sites in prokaryotes

    Directory of Open Access Journals (Sweden)

    de Hoon Michiel JL

    2007-02-01

    Full Text Available Abstract Background Computational prediction methods are currently used to identify genes in prokaryote genomes. However, identification of the correct translation initiation sites remains a difficult task. Accurate translation initiation sites (TISs) are important not only for the annotation of unknown proteins but also for the prediction of operons, promoters, and small non-coding RNA genes, as this typically makes use of the intergenic distance. A further problem is that most existing methods are optimized for Escherichia coli data sets; applying these methods to newly sequenced bacterial genomes may not result in an equivalent level of accuracy. Results Based on a biological representation of the translation process, we applied Bayesian statistics to create a score function for predicting translation initiation sites. In contrast to existing programs, our combination of methods uses supervised learning to optimally use the set of known translation initiation sites. We combined the Ribosome Binding Site (RBS) sequence, the distance between the translation initiation site and the RBS sequence, the base composition of the start codon, the nucleotide composition (A-rich sequences) following start codons, and the expected distribution of the protein length in a Bayesian scoring function. To further increase the prediction accuracy, we also took into account the operon orientation. The outcome of the procedure achieved a prediction accuracy of 93.2% in 858 E. coli genes from the EcoGene data set and 92.7% accuracy in a data set of 1243 Bacillus subtilis 'non-y' genes. We confirmed the performance in the GC-rich Gamma-Proteobacteria Herminiimonas arsenicoxydans, Pseudomonas aeruginosa, and Burkholderia pseudomallei K96243. Conclusion Hon-yaku, being based on a careful choice of elements important in translation, improved the prediction accuracy in B. subtilis data sets and other bacteria except for E. coli. We believe that most remaining…
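
    A minimal sketch of a Bayesian scoring function of this kind (naive-Bayes style, as a log-likelihood ratio) combining only a start-codon term and an RBS-spacing term. All probabilities and distance parameters below are assumed placeholders, not values from Hon-yaku.

    ```python
    import math

    # Illustrative parameters only -- in the supervised setting described in
    # the abstract these would be learned from a set of verified starts.
    START_CODON_PROB = {"ATG": 0.80, "GTG": 0.12, "TTG": 0.08}  # P(codon | TIS)
    BACKGROUND_CODON = 1.0 / 64.0                               # P(codon | not TIS)
    SPACER_MEAN, SPACER_SD = 9.0, 2.5   # RBS-to-start distance model (nt)

    def log_gaussian(x, mu, sd):
        """Log-density of Normal(mu, sd) at x."""
        return -0.5 * ((x - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))

    def tis_log_score(codon, spacer, spacer_background_sd=15.0):
        """Naive-Bayes style log-likelihood ratio: TIS vs. background."""
        codon_term = math.log(START_CODON_PROB.get(codon, 1e-4) / BACKGROUND_CODON)
        spacer_term = (log_gaussian(spacer, SPACER_MEAN, SPACER_SD)
                       - log_gaussian(spacer, SPACER_MEAN, spacer_background_sd))
        return codon_term + spacer_term

    # A canonical ATG 9 nt downstream of the RBS should outscore a TTG 20 nt away.
    print(tis_log_score("ATG", 9) > tis_log_score("TTG", 20))
    ```

    The real method adds further independent terms (base composition after the start, protein-length distribution, operon orientation) to the same log-score sum.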

  18. A Critical Evaluation of the Methodological Obstacles to Translating Cell-Based Research Into an Effective Treatment for People With Parkinson's Disease.

    Science.gov (United States)

    Polgar, Stephen; Karimi, Leila; Buultjens, Melissa; Morris, Meg E

    2016-10-01

    The remarkable scientific and technological advances in the field of cell research have not been translated into viable restorative therapies for brain disorders. In this article, we examine the best available evidence for the clinical efficacy of reconstructive intracerebral transplantation in people with Parkinson's disease (PD), with the aim of identifying methodological obstacles to the translation process. The major stumbling block is the fact that the potential contributions of people with neural grafts and the effects of the physical and social environment in which they recover have not been adequately investigated and applied to advancing the clinical stages of the research program. We suggest that the biopsychosocial model along with emerging evidence of targeted rehabilitation can provide a useful framework for conducting research and evaluation that will ensure the best possible outcomes following intracerebral transplantation for PD.

  19. From Translational Research to Translational Effectiveness: The “Patient-Centered Dental Home” Model

    Directory of Open Access Journals (Sweden)

    Francesco Chiappelli

    2011-06-01

    Full Text Available Toward revitalizing the Nation’s primary medical care system, the Agency for Health Research & Quality (AHRQ) stated that new foundational measures must be crafted for achieving high-quality, accessible, efficient health care for all Americans. The efficiency of medical care is viewed along two dimensions: first, we must continue to pursue translational research; and second, we must translate research to optimize effectiveness in specific clinical settings. It is increasingly evident that the efficiency of both translational processes is critical to the revitalization of health care, and that it rests on the practical functionality of the nexus among three cardinal entities: the researcher, the clinician, and the patient. A novel model has evolved that encapsulates this notion, and that proposes the advanced primary care “medical home”, more commonly referred to as the “patient-centered medical home” (PCMH). It is a promising model for transforming the organization and delivery of primary medical care, because it is not simply a place per se, but a functioning unit that delivers medical care along the fundamental principles of being patient-centered, comprehensive, coordinated, and accessible. It is energized by translational research, and its principal aim and ultimate goal is translational effectiveness. The PCMH is a model that works well within the priorities set by the American Recovery and Reinvestment Act of 2009, and the Health Care Reform Act of 2010. However, while dentistry has a clearly defined place in both Acts, the PCMH is designed for medical and nursing care. A parallel model of the “patient-centered dental home” (PCDH) must be realized.

  20. Theoretical and Methodological Principles of Developing Translational Capabilities of Controlling in Ensuring Sustainable Development of the Enterprise

    Directory of Open Access Journals (Sweden)

    Tarasova Tetiana O.

    2016-08-01

    Full Text Available The aim of the article is to study the role of controlling in the modern system of social, ecological and economic relations and assess its capabilities in ensuring sustainable development of a business unit. Under these conditions the concept of controlling should focus on the development of translational capabilities, which provides for the extension of accounting and analytical support functions forming the information environment of the data bank and the knowledge base of management of future events. Taking into account the new management philosophy, it is appropriate to consider controlling as a socio-economic cross-functional management technology, which forms the information space supporting operational, tactical and strategic management by using the mechanism of internal management actions of a reflex nature aimed at sustainable development of the enterprise. A new direction of the controlling concept is proposed that is based on the philosophy of sustainable development of the enterprise and includes the following translational components: basic and auxiliary business processes; centers of responsibility in the system of business architecture; planning and budgeting systems; motivational tools. In view of the basic management functions, the following destination controlling elements are identified: the modeling of the organizational structure of the enterprise as a system of hierarchically dependent business units; formation of a system of operational and strategic planning; development of a tool support system (control values), which allows creating a multilevel system to control achievement of strategic goals and detection of deviations. A mechanism of forming the information environment of controlling is proposed, which includes six interrelated informative components – planning, organizing, monitoring, analysis, regulation and consulting – making it possible to bridge gaps in information communications in the control system in a single contour of…

  1. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model development…

  2. Translational research challenges: finding the right animal models.

    Science.gov (United States)

    Prabhakar, Sharma

    2012-12-01

    Translation of scientific discoveries into meaningful human applications, particularly novel therapies of human diseases, requires development of suitable animal models. Experimental approaches to test new drugs in preclinical phases often necessitate animal models that not only replicate human disease in etiopathogenesis and pathobiology but also support biomarker development and toxicity prediction. Whereas the transgenic and knockout techniques have revolutionized manipulation of rodents and other species to get greater insights into human disease pathogenesis, we are far from generating ideal animal models of most human disease states. The challenges in using the currently available animal models for translational research, particularly for developing potentially new drugs for human disease, coupled with the difficulties in toxicity prediction, have led some researchers to develop a scoring system for translatability. These aspects, and the challenges in selecting an animal model among those that are available to study human disease pathobiology and drug development, are the topics covered in this detailed review.

  3. Sight Translation and written translation. A comparative Analysis of causes of problems, Strategies and Translation Errors within the PACTE Translation Competence Model

    OpenAIRE

    2008-01-01

    This paper presents a comparative empirical exploratory study of some cognitive aspects of the oral and written translation process within the translation competence construct. This research has a twofold objective: finding some evidence of specific translation competence skills in translation tasks and comparing these data in sight translation and written translation in order to empirically check if sight translation can really be considered an interpreting modality. A sample ...

  4. Multi-functional scaling methodology for translational pharmacokinetic and pharmacodynamic applications using integrated microphysiological systems (MPS).

    Science.gov (United States)

    Maass, Christian; Stokes, Cynthia L; Griffith, Linda G; Cirit, Murat

    2017-04-18

    Microphysiological systems (MPS) provide relevant physiological environments in vitro for studies of pharmacokinetics, pharmacodynamics and biological mechanisms for translational research. Designing multi-MPS platforms is essential to study multi-organ systems. Typical design approaches, including direct and allometric scaling, scale each MPS individually and are based on relative sizes not function. This study's aim was to develop a new multi-functional scaling approach for integrated multi-MPS platform design for specific applications. We developed an optimization approach using mechanistic modeling and specification of an objective that considered multiple MPS functions, e.g., drug absorption and metabolism, simultaneously to identify system design parameters. This approach informed the design of two hypothetical multi-MPS platforms consisting of gut and liver (multi-MPS platform I) and gut, liver and kidney (multi-MPS platform II) to recapitulate in vivo drug exposures in vitro. This allows establishment of clinically relevant drug exposure-response relationships, a prerequisite for efficacy and toxicology assessment. Design parameters resulting from multi-functional scaling were compared to designs based on direct and allometric scaling. Human plasma time-concentration profiles of eight drugs were used to inform the designs, and profiles of an additional five drugs were calculated to test the designed platforms on an independent set. Multi-functional scaling yielded exposure times in good agreement with in vivo data, while direct and allometric scaling approaches resulted in short exposure durations. Multi-functional scaling allows appropriate scaling from in vivo to in vitro of multi-MPS platforms, and in the cases studied provides designs that better mimic in vivo exposures than standard MPS scaling methods.
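
    The multi-functional idea, one objective covering several functions and drugs simultaneously rather than scaling each MPS by size, can be caricatured as a toy optimization: choose a single tissue-amount design parameter so that in-vitro drug exposures (AUCs) best match in-vivo targets for all drugs at once. The one-compartment well model, drug parameters and grid search below are assumptions for illustration, not the paper's mechanistic models.

    ```python
    import math

    # name: (intrinsic clearance per mg tissue, in-vivo AUC target)
    # -- all numbers invented for the sketch.
    DRUGS = {
        "drug_a": (0.8, 2.0),
        "drug_b": (0.3, 5.0),
        "drug_c": (1.5, 1.1),
    }
    DOSE = 1.0  # arbitrary units

    def auc_in_vitro(cl_int, tissue_mg):
        """Well-mixed one-compartment well: AUC = dose / total clearance."""
        return DOSE / (cl_int * tissue_mg)

    def objective(tissue_mg):
        """Squared log-errors summed over drugs couple them into one design."""
        return sum((math.log(auc_in_vitro(cl, tissue_mg)) - math.log(target)) ** 2
                   for cl, target in DRUGS.values())

    # Grid search over the design parameter stands in for the paper's
    # model-based optimization of the multi-MPS platform.
    best = min((objective(m / 100), m / 100) for m in range(1, 300))
    print(f"tissue per well: {best[1]:.2f} mg (objective {best[0]:.4f})")
    ```

    Direct or allometric scaling would instead fix the tissue amount from organ-size ratios, which is exactly what the abstract reports can yield unrealistically short exposures.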

  5. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States.
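
    The article's framing, risk as a dependent variable in expected annual monetary terms driven by population and infrastructure attributes, reduces to a simple expected-loss product per region. The regions, probabilities and dollar figures below are invented placeholders, not values from the model.

    ```python
    # region: (annual attack probability, success given attack,
    #          consequence in $M, population-concentration weight)
    # -- all figures are illustrative assumptions.
    REGIONS = {
        "metro": (0.020, 0.4, 900.0, 1.00),
        "port":  (0.010, 0.5, 600.0, 0.35),
        "rural": (0.001, 0.6,  40.0, 0.05),
    }

    def expected_annual_loss(p_attack, p_success, consequence, pop_weight):
        """Terrorism risk expressed in expected annual monetary terms."""
        return p_attack * p_success * consequence * pop_weight

    risk = {name: expected_annual_loss(*attrs) for name, attrs in REGIONS.items()}
    ranked = sorted(risk, key=risk.get, reverse=True)
    print(ranked)  # mitigation resources can be allocated in this order
    ```

    Putting accident and natural-hazard losses in the same monetary units is what allows the all-hazards comparison the article describes.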

  6. Post-translation modification of proteins; methodologies and applications in plant sciences.

    Science.gov (United States)

    Bond, A E; Row, P E; Dudley, E

    2011-07-01

    Proteins have the potential to undergo a variety of post-translational modifications, and the different methods available to study these cellular processes have advanced rapidly with the continuing development of proteomic technologies. In this review we aim to detail five major post-translational modifications (phosphorylation, glycosylation, lipid modification, ubiquitination and redox-related modifications), elaborate on the techniques that have been developed for their analysis and briefly discuss the study of these modifications in selected areas of plant science. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Methodology, models and algorithms in thermographic diagnostics

    CERN Document Server

    Živčák, Jozef; Madarász, Ladislav; Rudas, Imre J

    2013-01-01

    This book presents the methodology and techniques of thermographic applications, with a focus primarily on medical thermography implemented for parametrizing the diagnostics of the human body. The first part of the book describes the basics of infrared thermography, the possibilities of thermographic diagnostics and the physical nature of thermography. The second half includes tools of intelligent engineering applied to the solving of selected applications and projects. Thermographic diagnostics was applied to the problems of paraplegia, tetraplegia and carpal tunnel syndrome (CTS). The results of the research activities were created with the cooperation of four projects within the Ministry of Education, Science, Research and Sport of the Slovak Republic entitled Digital control of complex systems with two degrees of freedom, Progressive methods of education in the area of control and modeling of complex object oriented systems on aircraft turbocompressor engines, Center for research of control of te...

  8. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    Agent-Based Modeling Methodology for Analyzing Weapons Systems. Thesis by Casey D. Connors, Major, USA, presented to the Faculty, Department of Operational Sciences. Includes Figure 14, "Simulation Study Methodology for the Weapon System Analysis," covering metrics definition and data collection for the analysis plan.

  9. RM-structure alignment based statistical machine translation model

    Institute of Scientific and Technical Information of China (English)

    Sun Jiadong; Zhao Tiejun

    2008-01-01

    A novel model based on structure alignments is proposed for statistical machine translation in this paper. Meta-structure and sequence of meta-structure for a parse tree are defined. During the translation process, a parse tree is decomposed to deal with the structure divergence and the alignments can be constructed at different levels of recombination of meta-structure (RM). This method can perform the structure mapping across the sub-tree structure between languages. As a result, we get not only the translation for the target language, but also the sequence of meta-structure of its parse tree at the same time. Experiments show that the model, in the framework of a log-linear model, has better generative ability and significantly outperforms Pharaoh, a phrase-based system.

  10. Modeling Syntax for Parsing and Translation

    Science.gov (United States)

    2003-12-15

    In this thesis, we can see a theme running throughout: incorporating syntax into generative models of human language, which is just a special case of a more general theme: combining knowledge with statistical models. In this thesis we incorporated linguistic knowledge into statistical models of language.

  11. Model evaluation methodology applicable to environmental assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes.
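
    Two of the specific techniques named above can be shown together in one short sketch: for a multiplicative chain of independent lognormal inputs, ln Y is Normal with mean Σμi and standard deviation sqrt(Σσi²), and Latin hypercube sampling recovers this with stratified draws. The (μi, σi) parameters are illustrative assumptions.

    ```python
    import math
    import random
    import statistics

    random.seed(0)

    # Multiplicative chain Y = X1 * X2 * X3 with independent lognormal inputs:
    # ln Y ~ Normal(sum of mu_i, sqrt(sum of sigma_i^2)).
    PARAMS = [(0.0, 0.5), (1.0, 0.3), (-0.5, 0.4)]  # (mu_i, sigma_i) of ln X_i

    def lhs_normal(n, mu, sigma):
        """Centered Latin hypercube sample of Normal(mu, sigma):
        one draw per equal-probability stratum, shuffled for random pairing."""
        strata = [(i + 0.5) / n for i in range(n)]
        random.shuffle(strata)
        dist = statistics.NormalDist(mu, sigma)
        return [dist.inv_cdf(p) for p in strata]

    n = 2000
    cols = [lhs_normal(n, mu, sd) for mu, sd in PARAMS]
    log_y = [sum(vals) for vals in zip(*cols)]  # ln of the chain output

    mu_hat = statistics.mean(log_y)
    sd_hat = statistics.pstdev(log_y)
    mu_true = sum(mu for mu, _ in PARAMS)                  # 0.5
    sd_true = math.sqrt(sum(sd * sd for _, sd in PARAMS))  # ~0.707
    print(round(mu_hat, 3), round(sd_hat, 3), round(sd_true, 3))
    ```

    The stratification is why LHS is attractive for the sensitivity analyses mentioned in the abstract: every probability band of every input parameter is represented even at modest sample sizes.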

  12. Methodology for Modeling and Analysis of Business Processes (MMABP)

    OpenAIRE

    Vaclav Repa; Tomas Bruckner

    2015-01-01

    This paper introduces the methodology for modeling business processes. Creation of the methodology is described in terms of the Design Science Method. Firstly, the gap in contemporary Business Process Modeling approaches is identified and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description…

  13. Methodological challenges in quality of life research among Turkish and Moroccan ethnic minority cancer patients: translation, recruitment and ethical issues.

    Science.gov (United States)

    Hoopman, Rianne; Terwee, Caroline B; Muller, Martin J; Ory, Ferko G; Aaronson, Neil K

    2009-06-01

    The large population of first generation Turkish and Moroccan immigrants who moved to Western Europe in the 1960s and 1970s is now reaching an age at which the incidence of chronic diseases, including cancer, rises sharply. To date, little attention has been paid to the health-related quality of life (HRQOL) of these ethnic minority groups, primarily due to the paucity of well translated and validated measures, but also because of a range of methodological and logistical barriers. The primary objective of this paper is to describe the methodological challenges in conducting HRQOL research among these patient populations, based on experience gained in a project in which four widely used HRQOL questionnaires were translated into Turkish, Moroccan-Arabic and Tarifit, and administered to a sample of 90 Turkish and 79 Moroccan cancer patients in the Netherlands. Problems encountered in translating and administering the questionnaires included achieving semantic equivalence (use of loanwords), use of numerical rating scales, lengthy questions and response scales, and culturally sensitive and/or inappropriate questions. Privacy laws that prohibit hospitals from registering the ethnicity of patients hampered efficient identification of eligible patients. Recruiting patients to studies is often difficult due to low literacy levels, lack of familiarity with and distrust of research, concerns about immigration status, and inaccurate or missing contact information. This can lead to lower response rates than is the case with the population of Dutch cancer patients. Additional ethical issues that arise in such studies concern patients' problems with communicating with their health care providers, their lack of understanding of their diagnosis, treatment and prognosis, and the potential role conflict experienced by bilingual research assistants who may wish or be asked to intervene on the patients' behalf. Practical approaches to resolving these issues are presented.

  14. Measuring Difficulty in English-Chinese Translation: Towards a General Model of Translation Difficulty

    Science.gov (United States)

    Sun, Sanjun

    2012-01-01

    Accurate assessment of a text's level of translation difficulty is critical for translator training and accreditation, translation research, and the language industry as well. Traditionally, people rely on their general impression to gauge a text's translation difficulty level. If the evaluation process is to be more effective and the…

  15. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront in their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  16. The Application of the Jerome Model and the Horace Model in Translation Practice

    Institute of Scientific and Technical Information of China (English)

    WU Jiong

    2015-01-01

    The Jerome model and the Horace model have had a great influence on translation theories and practice since ancient times. This paper starts from a comparative study of the two models, and mainly discusses their similarities, differences and weaknesses. Then, through a case study, it analyzes the application of the two models to English-Chinese translation. In the end, it draws the conclusion that a generally accepted translation criterion does not exist: different types of texts require different translation criteria.

  17. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    Full Text Available This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using the LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  18. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  19. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles that can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical points identified in business process modeling are process states, process hierarchy, and the granularity of process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main features of the methodology are explained, together with the significant problems met during the project. From these problems, together with the results of the evaluation, the needed future development of the methodology is outlined.

  20. Exploration of Disease Markers under Translational Medicine Model

    Institute of Scientific and Technical Information of China (English)

    Rajagopal Krishnamoorthy; Octavio Fernandes; Arturo Chiti

    2015-01-01

    Disease markers are defined as biomarkers with specific characteristics during general physical, pathological, or therapeutic processes, the detection of which can inform the progression of the ongoing biological process of an organism. However, the exploration of disease markers is complicated and difficult: only a few markers can be used in clinical practice, and there is no significant difference in the mortality of cancers before and after biomarker exploration. Translational medicine focuses on breaking the blockage between basic medicine and clinical practice. It establishes an effective association between researchers engaged in basic scientific discovery and clinical physicians well informed of patients' requirements, and gives particular attention to how to translate basic molecular biological research into the most effective and appropriate methods for the diagnosis, treatment, and prevention of disease, hoping to translate basic research into new therapeutic methods in the clinic. This study therefore summarizes the exploration of disease markers under the translational medicine model so as to provide a basis for translating basic research results into clinical application.

  1. Designing and implementing INTREPID, an intensive program in translational research methodologies for new investigators.

    Science.gov (United States)

    Plottel, Claudia S; Aphinyanaphongs, Yindalon; Shao, Yongzhao; Micoli, Keith J; Fang, Yixin; Goldberg, Judith D; Galeano, Claudia R; Stangel, Jessica H; Chavis-Keeling, Deborah; Hochman, Judith S; Cronstein, Bruce N; Pillinger, Michael H

    2014-12-01

    Senior housestaff and junior faculty are often expected to perform clinical research, yet may not always have the requisite knowledge and skills to do so successfully. Formal degree programs provide such knowledge, but require a significant commitment of time and money. Short-term training programs (days to weeks) provide alternative ways to accrue essential information and acquire fundamental methodological skills. Unfortunately, published information about short-term programs is sparse. To encourage discussion and exchange of ideas regarding such programs, we here share our experience developing and implementing INtensive Training in Research Statistics, Ethics, and Protocol Informatics and Design (INTREPID), a 24-day immersion training program in clinical research methodologies. Designing, planning, and offering INTREPID was feasible, and required significant faculty commitment, support personnel and infrastructure, as well as committed trainees.

  2. Threat model framework and methodology for personal networks (PNs)

    DEFF Research Database (Denmark)

    Prasad, Neeli R.

    2007-01-01

    is to give a structured, convenient approach for building threat models. A framework for the threat model is presented with a list of requirements for methodology. The methodology will be applied to build a threat model for Personal Networks. Practical tools like UML sequence diagrams and attack trees have...

  3. Translation rescoring through recurrent neural network language models

    OpenAIRE

    PERIS ABRIL, ÁLVARO

    2014-01-01

    This work is framed within the Statistical Machine Translation field, more specifically the language modeling challenge. In this area the n-gram approach has classically predominated, but in recent years different approaches have arisen to tackle this problem. One of these approaches is the use of artificial recurrent neural networks, which are expected to outperform n-gram language models. The aim of this work is to test empirically these new language...
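The rescoring setup this abstract describes can be sketched in a few lines: each hypothesis in an n-best list keeps its decoder score, and a language model re-scores the target side before re-ranking. In the minimal sketch below, the "LM" is a toy add-one-smoothed bigram model standing in for a recurrent network; the sentences, scores, and interpolation weight are all hypothetical, not taken from the thesis.

```python
import math
from collections import Counter

# Toy "training corpus" and bigram counts; a real system would use an
# RNN language model trained on target-language text.
TRAIN = "the house is green . the car is red .".split()
BIGRAMS = Counter(zip(TRAIN, TRAIN[1:]))
TOTAL = sum(BIGRAMS.values())

def lm_log_prob(tokens):
    """Add-one-smoothed bigram score; an RNN LM would sum per-token log-probs."""
    return sum(math.log((BIGRAMS[(a, b)] + 1) / (TOTAL + 1))
               for a, b in zip(tokens, tokens[1:]))

def rescore(nbest, lm_weight=0.5):
    """Re-rank (hypothesis, decoder_score) pairs by an interpolated score."""
    return sorted(((hyp, score + lm_weight * lm_log_prob(hyp.split()))
                   for hyp, score in nbest),
                  key=lambda pair: pair[1], reverse=True)

# The fluent hypothesis overtakes the one the decoder scored higher.
nbest = [("the house is green", -10.0), ("house the green is", -9.5)]
best_hypothesis = rescore(nbest)[0][0]
```

The point of the sketch is the interpolation step: the LM contribution can reverse the decoder's ranking when the higher-scored hypothesis is disfluent.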

  4. Semiotics of Umberto Eco in a Literary Translation Class: The Model Reader as the Competent Translator

    Science.gov (United States)

    Ozturk Kasar, Sündüz; Can, Alize

    2017-01-01

    Classroom environment can be thought as an absolute place to practice and improve translation skills of students. They have the possibility to brainstorm and discuss problematic points they face with each other during a translation activity. It can be estimated in the same way in a literary translation class. Students who are supposed to become…

  5. Genetic Algorithms for Models Optimization for Recognition of Translation Initiation Sites

    KAUST Repository

    Mora, Arturo Magana

    2011-06-01

    This work uses genetic algorithms (GA) to reduce the complexity of the artificial neural networks (ANNs) and decision trees (DTs) used for the accurate recognition of translation initiation sites (TISs) in Arabidopsis thaliana. The Arabidopsis data were extracted directly from genomic DNA sequences. Methods derived in this work resulted both in reduced complexity of the predictors and in improved prediction accuracy (generalization). Optimization through GA is generally a computationally intensive task. One approach to overcome this problem is to parallelize the code that implements the GA, allowing computation on multiprocessing infrastructure. However, further improvement in the performance of the GA implementation can be achieved through modifications to basic GA operations such as selection, crossover and mutation. In this work we explored two such improvements, namely evolutive mutation and the GA-Simplex crossover operation, and studied their benefit on the problem of TIS recognition. Compared to the non-modified GA approach, we reduced the number of weights in the resulting model's neural network component by 51% and the number of nodes in the model's DT component by 97%, while improving the model's accuracy at the same time. Separately, we developed another methodology for reducing the complexity of prediction models by optimizing the composition of training data subsets in the bootstrap aggregation (bagging) methodology. This optimization is achieved by applying a new GA-based bagging methodology to optimize the composition of each training data subset. In our test cases, this approach considerably enhanced the accuracy of the TIS prediction model compared to the original bagging methodology. Although these methods are applied to the problem of accurate prediction of TISs, we believe these methodologies have potential for a wider scope of application.
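A minimal genetic algorithm in the spirit of the complexity-reduction idea above can be sketched as follows: evolve a feature mask that keeps the informative inputs while penalizing model size. The fitness function, the set of "informative" features, and all GA parameters here are illustrative stand-ins, not those of the thesis.

```python
import random

random.seed(0)
INFORMATIVE = {1, 3, 5}                 # hypothetical "useful" features
N_FEATURES, POP_SIZE, GENERATIONS = 8, 20, 40

def fitness(mask):
    """Reward informative features found; penalize extra complexity."""
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & INFORMATIVE) - 0.2 * len(chosen - INFORMATIVE)

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)   # single-point crossover
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.1):
    return tuple(bit ^ (random.random() < rate) for bit in mask)

population = [tuple(random.randint(0, 1) for _ in range(N_FEATURES))
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]    # elitist selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
```

The thesis's refinements (evolutive mutation, GA-Simplex crossover, GA-optimized bagging subsets) would replace the plain `mutate` and `crossover` operators and the fitness function in a sketch like this.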

  6. A conceptual model for translating omic data into clinical action

    Directory of Open Access Journals (Sweden)

    Timothy M Herr

    2015-01-01

    Full Text Available Genomic, proteomic, epigenomic, and other "omic" data have the potential to enable precision medicine, also commonly referred to as personalized medicine. The volume and complexity of omic data are rapidly overwhelming human cognitive capacity, requiring innovative approaches to translate such data into patient care. Here, we outline a conceptual model for the application of omic data in the clinical context, called "the omic funnel." This model parallels the classic "Data, Information, Knowledge, Wisdom pyramid" and adds context for how to move between each successive layer. Its goal is to allow informaticians, researchers, and clinicians to approach the problem of translating omic data from bench to bedside, by using discrete steps with clearly defined needs. Such an approach can facilitate the development of modular and interoperable software that can bring precision medicine into widespread practice.

  7. Translated Poisson Mixture Model for Stratification Learning (PREPRINT)

    Science.gov (United States)

    2007-09-01

    Translated Poisson Mixture Model for Stratification Learning, Gloria Haro, Dept. Teoria... Figure 1 shows, for each algorithm, the point cloud with each point colored and marked differently according to its classification. Figure 1 caption: Clustering of a spiral and a plane; results with different algorithms (color figure). Due to the statistical nature of the R-TPMM

  8. Modeling Virtual Organization Architecture with the Virtual Organization Breeding Methodology

    Science.gov (United States)

    Paszkiewicz, Zbigniew; Picard, Willy

    While Enterprise Architecture Modeling (EAM) methodologies become more and more popular, an EAM methodology tailored to the needs of virtual organizations (VO) is still to be developed. Among the most popular EAM methodologies, TOGAF has been chosen as the basis for a new EAM methodology taking into account the characteristics of VOs presented in this paper. In this new methodology, referred to as the Virtual Organization Breeding Methodology (VOBM), concepts developed within the ECOLEAD project, e.g. the concept of a Virtual Breeding Environment (VBE) or the VO creation schema, serve as fundamental elements for the development of VOBM. VOBM is a generic methodology that should be adapted to a given VBE. VOBM defines the structure of VBE and VO architectures in a service-oriented environment, as well as an architecture development method for virtual organizations (ADM4VO). Finally, a preliminary set of tools and methods for VOBM is given in this paper.

  9. Modeling Virtual Organization Architecture with the Virtual Organization Breeding Methodology

    CERN Document Server

    Paszkiewicz, Zbigniew

    2011-01-01

    While Enterprise Architecture Modeling (EAM) methodologies become more and more popular, an EAM methodology tailored to the needs of virtual organizations (VO) is still to be developed. Among the most popular EAM methodologies, TOGAF has been chosen as the basis for a new EAM methodology taking into account the characteristics of VOs presented in this paper. In this new methodology, referred to as the Virtual Organization Breeding Methodology (VOBM), concepts developed within the ECOLEAD project, e.g. the concept of a Virtual Breeding Environment (VBE) or the VO creation schema, serve as fundamental elements for the development of VOBM. VOBM is a generic methodology that should be adapted to a given VBE. VOBM defines the structure of VBE and VO architectures in a service-oriented environment, as well as an architecture development method for virtual organizations (ADM4VO). Finally, a preliminary set of tools and methods for VOBM is given in this paper.

  10. Goal Model to Business Process Model: A Methodology for Enterprise Government Tourism System Development

    National Research Council Canada - National Science Library

    Ahmad Nurul Fajar; Imam Marzuki Shofi

    2016-01-01

    .... However, the goal model cannot be used directly to build a business process model. To solve this problem, this paper presents and proposes a methodology, called the GBPM Methodology, for extracting a goal model into a business process model...

  11. Evaluating Supply Chain Management: A Methodology Based on a Theoretical Model

    Directory of Open Access Journals (Sweden)

    Alexandre Tadeu Simon

    2015-01-01

    Full Text Available Despite the increasing interest in supply chain management (SCM by researchers and practitioners, there is still a lack of academic literature concerning topics such as methodologies to guide and support SCM evaluation. Most developed methodologies have been provided by consulting companies and are restricted in their publication and use. This article presents a methodology for evaluating companies’ degree of adherence to a SCM conceptual model. The methodology is based on Cooper, Lambert and Pagh’s original contribution and involves analysis of eleven referential axes established from key business processes, horizontal structures, and initiatives & practices. We analyze the applicability of the proposed model based on findings from interviews with experts - academics and practitioners - as well as from case studies of three focal firms and their supply chains. In general terms, the methodology can be considered a diagnostic instrument that allows companies to evaluate their maturity regarding SCM practices. From this diagnosis, firms can identify and implement activities to improve degree of adherence to the reference model and achieve SCM benefits. The methodology aims to contribute to SCM theory development. It is an initial, but structured, reference for translating a theoretical approach into practical aspects.

  12. A methodology for ecosystem-scale modeling of selenium

    Science.gov (United States)

    Presser, T.S.; Luoma, S.N.

    2010-01-01

    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. Validation of the model is through data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled, 2) the choice of food web is critical because the particulate material to prey kinetics of bioaccumulation differs widely among invertebrates, 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web, and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure
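The "translation" step the abstract describes, converting a fish-tissue criterion into a dissolved concentration, amounts to walking the modeled chain backwards (fish to invertebrate to particulate to dissolved) with a trophic transfer factor (TTF) per step and a particulate/dissolved partitioning coefficient Kd. The sketch below shows that back-calculation; all parameter values are illustrative, not site-specific numbers from the paper.

```python
def allowed_dissolved_se(tissue_criterion_ug_g, kd_l_per_kg,
                         ttf_invertebrate, ttf_fish):
    """Dissolved Se (ug/L) consistent with a fish-tissue criterion (ug/g dry wt).

    Chain: dissolved -> particulate (Kd) -> invertebrate (TTF) -> fish (TTF),
    inverted here to back-calculate from the fish-tissue end.
    """
    # Undo the two trophic-transfer steps to get a particulate concentration.
    particulate_ug_g = tissue_criterion_ug_g / (ttf_fish * ttf_invertebrate)
    # Kd = particulate (ug/kg) / dissolved (ug/L); convert ug/g -> ug/kg.
    return particulate_ug_g * 1000.0 / kd_l_per_kg

# Hypothetical example: 8 ug/g criterion, Kd = 1000 L/kg, TTFs of 2.8 and 1.1.
target = allowed_dissolved_se(8.0, 1000.0, 2.8, 1.1)   # about 2.6 ug/L
```

This also makes the paper's site-specificity point concrete: the same tissue criterion yields a different allowable dissolved concentration whenever Kd or either TTF differs between ecosystems.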

  13. Efficacy of Female Rat Models in Translational Cardiovascular Aging Research

    Directory of Open Access Journals (Sweden)

    K. M. Rice

    2014-01-01

    Full Text Available Cardiovascular disease is the leading cause of death in women in the United States. Aging is a primary risk factor for the development of cardiovascular disease, as well as for cardiovascular-related morbidity and mortality. Aging is a universal process that all humans undergo; however, research in aging is limited by cost and time constraints. Therefore, most research in aging has been done in primates and rodents, yet it is unknown how well the effects of aging in rat models translate to humans. To compound the complications of aging, gender has also been indicated as a risk factor for various cardiovascular diseases. This review addresses the systemic pathophysiology of the cardiovascular system associated with aging and gender, with regard to the applicability of rat-derived data for translational application to human aging.

  14. Methodological Approach for Modeling of Multienzyme in-pot Processes

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia; Roman Martinez, Alicia; Sin, Gürkan;

    2011-01-01

    This paper presents a methodological approach for modeling multi-enzyme in-pot processes. The methodology is exemplified stepwise through the bi-enzymatic production of N-acetyl-D-neuraminic acid (Neu5Ac) from N-acetyl-D-glucosamine (GlcNAc). In this case study, sensitivity analysis is also used...

  15. Optimal Data Split Methodology for Model Validation

    CERN Document Server

    Morrison, Rebecca; Terejanu, Gabriel; Miki, Kenji; Prudhomme, Serge

    2011-01-01

    The decision to incorporate cross-validation into validation processes of mathematical models raises an immediate question - how should one partition the data into calibration and validation sets? We answer this question systematically: we present an algorithm to find the optimal partition of the data subject to certain constraints. While doing this, we address two critical issues: 1) that the model be evaluated with respect to predictions of a given quantity of interest and its ability to reproduce the data, and 2) that the model be highly challenged by the validation set, assuming it is properly informed by the calibration set. This framework also relies on the interaction between the experimentalist and/or modeler, who understand the physical system and the limitations of the model; the decision-maker, who understands and can quantify the cost of model failure; and the computational scientists, who strive to determine if the model satisfies both the modeler's and decision maker's requirements. We also note...
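The partitioning question posed in this abstract can be rendered as a toy brute-force search: among all ways to hold out a validation set, choose the split whose validation points lie farthest from the calibration set, i.e. the split that most challenges the calibrated model. The distance criterion below is a simple stand-in for the paper's quantity-of-interest constraints, and the data are hypothetical.

```python
from itertools import combinations

def best_split(xs, n_val):
    """Exhaustively pick the most 'challenging' validation subset of size n_val."""
    def challenge(val, cal):
        # Mean distance from each validation point to its nearest calibration point.
        return sum(min(abs(v - c) for c in cal) for v in val) / len(val)

    best, best_score = None, float("-inf")
    for val in combinations(xs, n_val):
        cal = [x for x in xs if x not in val]
        score = challenge(val, cal)
        if score > best_score:
            best, best_score = set(val), score
    return best

# The outlying point is the most challenging single-point validation set.
holdout = best_split([0.0, 1.0, 2.0, 3.0, 10.0], 1)
```

Exhaustive enumeration is only feasible for tiny data sets; the paper's contribution is precisely an algorithm for doing this optimization under realistic constraints.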

  16. The unfolded protein response and translation attenuation: a modelling approach.

    Science.gov (United States)

    Trusina, A; Tang, C

    2010-10-01

    Unfolded protein response (UPR) is a stress response to increased levels of unfolded proteins in the endoplasmic reticulum (ER). To deal with this stress, all eukaryotic cells share a well-conserved strategy: the upregulation of chaperones and proteases to facilitate protein folding and to degrade the misfolded proteins. For metazoans, however, an additional and seemingly redundant strategy has evolved: translation attenuation (TA) of proteins targeted to the ER via the protein kinase PERK pathway. PERK is essential in secretory cells, such as the pancreatic β-cells, but not in non-secretory cell types. We have recently developed a mathematical model of UPR, focusing on the interplay and synergy between the TA arm and the conserved Ire1 arm of the UPR. The model showed that the TA mechanism is beneficial in a highly fluctuating environment, for example, where the ER stress changes frequently. Under highly variable levels of ER stress, tight regulation of the ER load by TA avoids an excess of chaperones and proteases being produced. The model also showed that TA is of greater importance when there is a large flux of proteins through the ER. In this study, we further expand our model to investigate different types of ER stress and different temporal profiles of the stress. We found that TA is more desirable in dealing with translation stress, for example, prolonged stimulation of proinsulin biosynthesis, than with chemical stress.

  17. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  18. Translational Mouse Models of Autism: Advancing Toward Pharmacological Therapeutics.

    Science.gov (United States)

    Kazdoba, Tatiana M; Leach, Prescott T; Yang, Mu; Silverman, Jill L; Solomon, Marjorie; Crawley, Jacqueline N

    Animal models provide preclinical tools to investigate the causal role of genetic mutations and environmental factors in the etiology of autism spectrum disorder (ASD). Knockout and humanized knock-in mice, and more recently knockout rats, have been generated for many of the de novo single gene mutations and copy number variants (CNVs) detected in ASD and comorbid neurodevelopmental disorders. Mouse models incorporating genetic and environmental manipulations have been employed for preclinical testing of hypothesis-driven pharmacological targets, to begin to develop treatments for the diagnostic and associated symptoms of autism. In this review, we summarize rodent behavioral assays relevant to the core features of autism, preclinical and clinical evaluations of pharmacological interventions, and strategies to improve the translational value of rodent models of autism.

  19. Knowledge Translation for Research Utilization: Design of a Knowledge Translation Model at Tehran University of Medical Sciences

    Science.gov (United States)

    Majdzadeh, Reza; Sadighi, Jila; Nejat, Saharnaz; Mahani, Ali Shahidzade; Gholami, Jaleh

    2008-01-01

    Introduction: The present study aimed to generate a model that would provide a conceptual framework for linking disparate components of knowledge translation. A theoretical model of such would enable the organization and evaluation of attempts to analyze current conditions and to design interventions on the transfer and utilization of research…

  20. Efficient Modelling Methodology for Reconfigurable Underwater Robots

    DEFF Research Database (Denmark)

    Nielsen, Mikkel Cornelius; Blanke, Mogens; Schjølberg, Ingrid

    2016-01-01

    This paper considers the challenge of applying reconfigurable robots in an underwater environment. The main result presented is the development of a model for a system comprised of N, possibly heterogeneous, robots dynamically connected to each other and moving with 6 Degrees of Freedom (DOF......). This paper presents an application of the Udwadia-Kalaba Equation for modelling the Reconfigurable Underwater Robots. The constraints developed to enforce the rigid connection between robots in the system are derived through restrictions on relative distances and orientations. To avoid singularities...

  1. Efficient Modelling Methodology for Reconfigurable Underwater Robots

    DEFF Research Database (Denmark)

    Nielsen, Mikkel Cornelius; Blanke, Mogens; Schjølberg, Ingrid

    2016-01-01

    This paper considers the challenge of applying reconfigurable robots in an underwater environment. The main result presented is the development of a model for a system comprised of N, possibly heterogeneous, robots dynamically connected to each other and moving with 6 Degrees of Freedom (DOF......). This paper presents an application of the Udwadia-Kalaba Equation for modelling the Reconfigurable Underwater Robots. The constraints developed to enforce the rigid connection between robots in the system are derived through restrictions on relative distances and orientations. To avoid singularities...... in the orientation and, thereby, allow the robots to undertake any relative configuration the attitude is represented in Euler parameters....

  2. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component in software evolution, and it requires a huge amount of time, manpower, and financial resources. The challenges are the size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  3. Efficient Modelling Methodology for Reconfigurable Underwater Robots

    DEFF Research Database (Denmark)

    Nielsen, Mikkel Cornelius; Blanke, Mogens; Schjølberg, Ingrid

    2016-01-01

    This paper considers the challenge of applying reconfigurable robots in an underwater environment. The main result presented is the development of a model for a system comprised of N, possibly heterogeneous, robots dynamically connected to each other and moving with 6 Degrees of Freedom (DOF...

  4. Expectations for the methodology and translation of animal research: a survey of the general public, medical students and animal researchers in North America.

    Science.gov (United States)

    Joffe, Ari R; Bara, Meredith; Anton, Natalie; Nobis, Nathan

    2016-09-01

    To determine what are considered acceptable standards for animal research (AR) methodology and translation rate to humans, a validated survey was sent to: a) a sample of the general public, via Sampling Survey International (SSI; Canada), Amazon Mechanical Turk (AMT; USA), a Canadian city festival (CF) and a Canadian children's hospital (CH); b) a sample of medical students (two first-year classes); and c) a sample of scientists (corresponding authors and academic paediatricians). There were 1379 responses from the general public sample (SSI, n = 557; AMT, n = 590; CF, n = 195; CH, n = 102), 205/330 (62%) medical student responses, and 23/323 (7%, too few to report) scientist responses. Asked about methodological quality, most of the general public and medical student respondents expect that: AR is of high quality (e.g. anaesthesia and analgesia are monitored, even overnight, and 'humane' euthanasia, optimal statistical design, comprehensive literature review, randomisation and blinding, are performed), and costs and difficulty are not acceptable justifications for lower quality (e.g. costs of expert consultation, or more laboratory staff). Asked about their expectations of translation to humans (of toxicity, carcinogenicity, teratogenicity and treatment findings), most expect translation more than 60% of the time. If translation occurred less than 20% of the time, a minority disagreed that this would "significantly reduce your support for AR". Medical students were more supportive of AR, even if translation occurred less than 20% of the time. Expectations for AR are much higher than empirical data show to have been achieved.

  5. A stochastic model of translation with -1 programmed ribosomal frameshifting

    Science.gov (United States)

    Bailey, Brenae L.; Visscher, Koen; Watkins, Joseph

    2014-02-01

    Many viruses produce multiple proteins from a single mRNA sequence by encoding overlapping genes. One mechanism to decode both genes, which reside in alternate reading frames, is -1 programmed ribosomal frameshifting. Although recognized for over 25 years, the molecular and physical mechanism of -1 frameshifting remains poorly understood. We have developed a mathematical model that treats mRNA translation and associated -1 frameshifting as a stochastic process in which the transition probabilities are based on the energetics of local molecular interactions. The model predicts both the location and efficiency of -1 frameshift events in HIV-1. Moreover, we compute -1 frameshift efficiencies upon mutations in the viral mRNA sequence and variations in relative tRNA abundances, predictions that are directly testable in experiment.
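The kind of stochastic translation model described above can be illustrated with a Monte Carlo sketch: each ribosome steps codon by codon and, at a designated slippery site, shifts into the -1 frame with some probability standing in for the local interaction energetics. The site position and shift probability below are illustrative placeholders, not fitted HIV-1 parameters.

```python
import random

random.seed(1)

def frameshift_efficiency(n_ribosomes=10000, slippery_pos=25,
                          p_shift=0.08, mrna_length=100):
    """Fraction of simulated ribosomes that undergo a -1 frameshift."""
    shifted = 0
    for _ in range(n_ribosomes):
        for pos in range(mrna_length):            # step codon by codon
            if pos == slippery_pos and random.random() < p_shift:
                shifted += 1                      # continues in the -1 frame
                break
    return shifted / n_ribosomes

efficiency = frameshift_efficiency()   # close to the per-site p_shift
```

In the paper's model the single `p_shift` constant would be replaced by transition probabilities computed from the energetics of the slippery sequence, downstream pseudoknot, and tRNA abundances, which is what lets it predict how mutations change the efficiency.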

  6. General Methodology for developing UML models from UI

    CERN Document Server

    Reddy, Ch Ram Mohan; Srinivasa, K G; Kumar, T V Suresh; Kanth, K Rajani

    2012-01-01

    In the recent past, every discipline and every industry had its own methods of developing products, be it software development, mechanics, construction, psychology and so on. These demarcations work fine as long as the requirements are within one discipline. However, if the project extends over several disciplines, interfaces have to be created and coordinated between the methods of these disciplines. Performance is an important quality aspect of Web Services because of their distributed nature, and predicting the performance of web services during early stages of software development is significant. In industry, a prototype of these applications is developed during the analysis phase of the Software Development Life Cycle (SDLC). Performance models, however, are generated from UML models, and methodologies for predicting performance from UML models are available. Hence, in this paper, a methodology for developing a Use Case model and an Activity model from a User Interface is presented. The methodology is illustrated with a case...

  7. Drosophila Melanogaster as an Emerging Translational Model of Human Nephrolithiasis

    Science.gov (United States)

    Miller, Joe; Chi, Thomas; Kapahi, Pankaj; Kahn, Arnold J.; Kim, Man Su; Hirata, Taku; Romero, Michael F.; Dow, Julian A.T.; Stoller, Marshall L.

    2013-01-01

    Purpose The limitations imposed by human clinical studies and mammalian models of nephrolithiasis have hampered the development of effective medical treatments and preventative measures for decades. The simple but elegant Drosophila melanogaster is emerging as a powerful translational model of human disease, including nephrolithiasis and may provide important information essential to our understanding of stone formation. We present the current state of research using D. melanogaster as a model of human nephrolithiasis. Materials and Methods A comprehensive review of the English language literature was performed using PUBMED. When necessary, authoritative texts on relevant subtopics were consulted. Results The genetic composition, anatomic structure and physiologic function of Drosophila Malpighian tubules are remarkably similar to those of the human nephron. The direct effects of dietary manipulation, environmental alteration, and genetic variation on stone formation can be observed and quantified in a matter of days. Several Drosophila models of human nephrolithiasis, including genetically linked and environmentally induced stones, have been developed. A model of calcium oxalate stone formation is among the most recent fly models of human nephrolithiasis. Conclusions The ability to readily manipulate and quantify stone formation in D. melanogaster models of human nephrolithiasis presents the urologic community with a unique opportunity to increase our understanding of this enigmatic disease. PMID:23500641

  8. Information Model Translation to Support a Wider Science Community

    Science.gov (United States)

    Hughes, John S.; Crichton, Daniel; Ritschel, Bernd; Hardman, Sean; Joyner, Ronald

    2014-05-01

    The Planetary Data System (PDS), NASA's long-term archive for solar system exploration data, has just released PDS4, a modernization of the PDS architecture, data standards, and technical infrastructure. This next generation system positions the PDS to meet the demands of the coming decade, including big data, international cooperation, distributed nodes, and multiple ways of analysing and interpreting data. It also addresses three fundamental project goals: providing more efficient data delivery by data providers to the PDS, enabling a stable, long-term usable planetary science data archive, and enabling services for the data consumer to find, access, and use the data they require in contemporary data formats. The PDS4 information architecture is used to describe all PDS data using a common model. Captured in an ontology modeling tool it supports a hierarchy of data dictionaries built to the ISO/IEC 11179 standard and is designed to increase flexibility, enable complex searches at the product level, and to promote interoperability that facilitates data sharing both nationally and internationally. A PDS4 information architecture design requirement stipulates that the content of the information model must be translatable to external data definition languages such as XML Schema, XMI/XML, and RDF/XML. To support the semantic Web standards we are now in the process of mapping the contents into RDF/XML to support SPARQL capable databases. We are also building a terminological ontology to support virtually unified data retrieval and access. This paper will provide an overview of the PDS4 information architecture focusing on its domain information model and how the translation and mapping are being accomplished.

  9. Mathematical Modelling of Translation and Rotation Movement in Quad Tiltrotor

    Directory of Open Access Journals (Sweden)

    Andi Dharmawan

    2017-06-01

    Full Text Available The quadrotor, as one type of UAV (Unmanned Aerial Vehicle), is an underactuated mechanical system, meaning that the system has fewer control inputs than its DOF (Degrees of Freedom). This condition gives the quadrotor limited mobility because of its inherent underactuation, namely the availability of four independent control signals (four-speed rotating propellers) versus the 6 degrees of freedom parameterizing the quadrotor's position and orientation in space. If a quadrotor is made to have 6 DOF, a full motion control system to optimize the flight will differ from before, so it becomes necessary to develop an overactuated quad tiltrotor. A quad tiltrotor has more control signals than its DOF; therefore, we can refer to it as an overactuated system. A good control system is needed to fly the quad tiltrotor, and it can be designed using a model of the quad tiltrotor system. We can create this model from the quad tiltrotor's dynamics based on the Newton-Euler approach. Once we have the model, we can simulate the control system using various control methods. Several control methods can be used in the quad tiltrotor flight system; however, control can be improved by implementing a modern control system that uses the concept of state space. The simulations show that the quad tiltrotor performs successful translational motion without significant interference. Moreover, the undesirable rotational movement during translational motion, resulting from the transition process associated with tilt-rotor changes, was successfully reduced to below 1 degree.

  10. My Understanding of the Main Similarities and Differences between the Three Translation Models

    Institute of Scientific and Technical Information of China (English)

    支志

    2009-01-01

    In this paper, the author seeks to show that the three translation models have both similarities and differences: the similarities are that they all address faithful and free translation and the status of the reader; the differences are that their focuses differ considerably and their influence upon present translation theory and practice varies.

  11. Data and models in Action. Methodological Issues in Production Ecology

    NARCIS (Netherlands)

    Stein, A.; Penning, de F.W.T.

    1999-01-01

    This book addresses methodological issues of production ecology. A central issue is the combination of the agricultural model with reliable data in relation to scale. A model is developed with data from a single point, whereas decisions are to be made for areas of land. Such an approach requires the

  12. Stochastic sequence-level model of coupled transcription and translation in prokaryotes

    OpenAIRE

    Yli-Harja Olli; Lloyd-Price Jason; Mäkelä Jarno; Ribeiro Andre S

    2011-01-01

    Abstract Background In prokaryotes, transcription and translation are dynamically coupled, as the latter starts before the former is complete. Also, from one transcript, several translation events occur in parallel. To study how events in transcription elongation affect translation elongation and fluctuations in protein levels, we propose a delayed stochastic model of prokaryotic transcription and translation at the nucleotide and codon level that includes the promoter open complex formation ...

  13. Building better biomarkers: brain models in translational neuroimaging.

    Science.gov (United States)

    Woo, Choong-Wan; Chang, Luke J; Lindquist, Martin A; Wager, Tor D

    2017-02-23

    Despite its great promise, neuroimaging has yet to substantially impact clinical practice and public health. However, a developing synergy between emerging analysis techniques and data-sharing initiatives has the potential to transform the role of neuroimaging in clinical applications. We review the state of translational neuroimaging and outline an approach to developing brain signatures that can be shared, tested in multiple contexts and applied in clinical settings. The approach rests on three pillars: (i) the use of multivariate pattern-recognition techniques to develop brain signatures for clinical outcomes and relevant mental processes; (ii) assessment and optimization of their diagnostic value; and (iii) a program of broad exploration followed by increasingly rigorous assessment of generalizability across samples, research contexts and populations. Increasingly sophisticated models based on these principles will help to overcome some of the obstacles on the road from basic neuroscience to better health and will ultimately serve both basic and applied goals.

  14. Modeling survival in colon cancer: a methodological review

    Directory of Open Access Journals (Sweden)

    Holbert Don

    2007-02-01

    Full Text Available Abstract The Cox proportional hazards model is the most widely used model for survival analysis because of its simplicity. The fundamental assumption in this model is the proportionality of the hazard function. When this condition is not met, modifications or alternative models must be used for the analysis of survival data. We illustrate in this review several methodological approaches to dealing with the violation of the proportionality assumption, using survival in colon cancer as an illustrative example.
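
    As a minimal sketch of the baseline model discussed in this review, the single-covariate Cox partial likelihood can be maximised by Newton-Raphson in a few lines; the data and convergence settings below are illustrative, not from the paper.

```python
import math

def cox_fit(times, events, x, iters=25):
    # Newton-Raphson maximisation of the single-covariate Cox partial
    # likelihood (no special ties handling; illustrative sketch only).
    n = len(times)
    beta = 0.0
    for _ in range(iters):
        grad, hess = 0.0, 0.0
        for i in range(n):
            if not events[i]:
                continue
            risk = [j for j in range(n) if times[j] >= times[i]]
            w = [math.exp(beta * x[j]) for j in risk]
            s0 = sum(w)
            s1 = sum(wj * x[j] for wj, j in zip(w, risk))
            s2 = sum(wj * x[j] ** 2 for wj, j in zip(w, risk))
            grad += x[i] - s1 / s0
            hess -= s2 / s0 - (s1 / s0) ** 2
        beta -= grad / hess
    return beta

# Hypothetical data: group 1 tends to fail earlier than group 0.
times  = [1, 3, 4, 6, 8, 2, 5, 7, 9, 10]
events = [1] * 10
group  = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
beta = cox_fit(times, events, group)  # log hazard ratio, expected > 0
```

    Checking whether such a fitted hazard ratio stays constant over time is exactly the proportionality assumption the review is concerned with.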

  15. Suitability of Modern Software Development Methodologies for Model Driven Development

    Directory of Open Access Journals (Sweden)

    Ruben Picek

    2009-12-01

    Full Text Available In answer to today's growing challenges in the software industry, a wide spectrum of new approaches to software development has emerged. One prominent direction is the currently most promising software development paradigm, Model Driven Development (MDD). Despite much skepticism and many problems, the MDD paradigm is being used and improved to realize its many inherent potential benefits. A methodological approach to software development requires some kind of development process. Modern methodologies can be classified into two main categories: formal (heavyweight) and agile (lightweight). However, when it comes to MDD and a development process for MDD, currently known methodologies are very poor in this respect, or rather, offer no explanation of the MDD process. As the result of this research, the author examines in this paper the possibilities of using existing modern software methodologies in the context of the MDD paradigm.

  16. Design Intelligent Model base Online Tuning Methodology for Nonlinear System

    Directory of Open Access Journals (Sweden)

    Ali Roshanzamir

    2014-04-01

    Full Text Available In systems with varying dynamic parameters that require on-line training, adaptive control methodology is used. In this paper, a fuzzy model-based adaptive methodology is used to tune the linear Proportional Integral Derivative (PID) controller. The main objectives in any system are stability, robustness and reliability. Although the PID controller is used in many applications, it faces many challenges in the control of continuum robots. To solve these problems, a nonlinear adaptive methodology based on model-based fuzzy logic is used. This research aims to reduce or eliminate the PID controller's problems, based on model-reference fuzzy logic theory, to control a flexible robot manipulator system, with the quality of process control tested in the MATLAB/SIMULINK simulation environment.
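
    For reference, the linear PID law being tuned can be sketched on a toy first-order plant; the gains and plant below are hypothetical stand-ins, not the fuzzy model-based tuner the paper describes.

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000):
    # Discrete PID driving a toy first-order plant x' = -x + u.
    x = 0.0
    integral = 0.0
    prev_err = setpoint - x
    for _ in range(steps):
        err = setpoint - x
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        x += (-x + u) * dt          # forward-Euler plant step
        prev_err = err
    return x

# Hypothetical fixed gains; the paper adapts such gains on-line via fuzzy logic.
final = simulate_pid(kp=2.0, ki=1.0, kd=0.1)
```

    With these gains the plant output settles at the setpoint; an adaptive tuner would adjust kp, ki, kd as the plant parameters drift.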

  17. Translation of a High-Level Temporal Model into Lower Level Models: Impact of Modelling at Different Description Levels

    DEFF Research Database (Denmark)

    Kraft, Peter; Sørensen, Jens Otto

    2001-01-01

    the existences in time can be mapped precisely and consistently securing a consistent handling of the temporal properties. We translate the high level temporal model into an entity-relationship model, with the information in a two-dimensional graph, and finally we look at the translations into relational...

  18. In vitro models of pancreatic cancer for translational oncology research

    Science.gov (United States)

    Feldmann, Georg; Rauenzahn, Sherri; Maitra, Anirban

    2009-01-01

    Background Pancreatic cancer is a disease of near uniform fatality and the overwhelming majority of patients succumb to their advanced malignancy within a few months of diagnosis. Despite considerable advances in our understanding of molecular mechanisms underlying pancreatic carcinogenesis, this knowledge has not yet been fully translated into clinically available treatment strategies that yield significant improvements in disease-free or overall survival. Objective Cell line-based in vitro model systems provide powerful tools to identify potential molecular targets for therapeutic intervention as well as for initial pre-clinical evaluation of novel drug candidates. Here we provide a brief overview of recent literature on cell line-based model systems of pancreatic cancer and their application in the search for novel therapeutics against this vicious disease. Conclusion While in vitro models of pancreatic cancer are of tremendous value for genetic studies and initial functional screenings in drug discovery, they carry several inherent drawbacks and are often poor at predicting therapeutic response in humans. Therefore, in most instances they are successfully exploited to generate hypotheses and identify molecular targets for novel therapeutics, which are subsequently subject to further in-depth characterization using more advanced in vivo model systems and clinical trials. PMID:20160967

  19. Translating patient reported outcome measures: methodological issues explored using cognitive interviewing with three rheumatoid arthritis measures in six European languages

    NARCIS (Netherlands)

    Hewlett, Sarah E.; Nicklin, Joanna; Bode, Christina; Carmona, Loretto; Dures, Emma; Engelbrecht, Matthias; Hagel, Sofia; Kirwan, John R.; Molto, Anna; Redondo, Marta; Gossec, Laure

    2016-01-01

    Objective. Cross-cultural translation of patient-reported outcome measures (PROMs) is a lengthy process, often performed professionally. Cognitive interviewing assesses patient comprehension of PROMs. The objective was to evaluate the usefulness of cognitive interviewing to assess translations and c

  20. DIGITAL GEOMETRIC MODELLING OF TEETH PROFILE BY USING CAD METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Krzysztof TWARDOCH

    2014-03-01

    Full Text Available This article is devoted to the problem of properly defining the spatial model of a tooth profile with CAD methodologies, motivated by the problem of the accuracy with which the defined curves describing the geometry of the teeth are mapped. Particular attention is paid to precise geometric modelling of the involute tooth profile, which has a significant influence on the process of identifying mesh stiffness for studies of the dynamic phenomena occurring in gear transmission systems conducted using dynamic models.

  1. A methodology for constructing the calculation model of scientific spreadsheets

    OpenAIRE

    Vos, de, Ans; Wielemaker, J.; Schreiber, G.; Wielinga, B.; Top, J.L.

    2015-01-01

    Spreadsheet models are frequently used by scientists to analyze research data. These models are typically described in a paper or a report, which serves as the single source of information on the underlying research project. As the calculation workflow in these models is not made explicit, readers are not able to fully understand how the research results are calculated, or trace them back to the underlying spreadsheets. This paper proposes a methodology for semi-automatically deriving the calcu...

  2. Masked Translation Priming with Semantic Categorization: Testing the Sense Model

    Science.gov (United States)

    Wang, Xin; Forster, Kenneth I.

    2010-01-01

    Four experiments are reported which were designed to test hypotheses concerning the asymmetry of masked translation priming. Experiment 1 confirmed the presence of L2-L1 priming with a semantic categorization task and demonstrated that this effect was restricted to exemplars. Experiment 2 showed that the translation priming effect was not due to…

  4. On Interactive Teaching Model of Translation Course Based on Wechat

    Science.gov (United States)

    Lin, Wang

    2017-01-01

    Constructivism is a theory related to knowledge and learning, focusing on learners' subjective initiative, based on which the interactive approach has been proved to play a crucial role in language learning. Accordingly, the interactive approach can also be applied to translation teaching since translation itself is a bilingual transformational…

  5. Recursive modular modelling methodology for lumped-parameter dynamic systems.

    Science.gov (United States)

    Orsino, Renato Maia Matarazzo

    2017-08-01

    This paper proposes a novel approach to the modelling of lumped-parameter dynamic systems, based on representing them by hierarchies of mathematical models of increasing complexity instead of a single (complex) model. Exploring the multilevel modularity that these systems typically exhibit, a general recursive modelling methodology is proposed, in order to reconcile the use of already existing modelling techniques. The general algorithm is based on a fundamental theorem that states the conditions for computing projection operators recursively. Three procedures for these computations are discussed: orthonormalization, use of orthogonal complements and use of generalized inverses. The novel methodology is also applied to the development of a recursive algorithm based on the Udwadia-Kalaba equation, which proves to be identical to that of a Kalman filter for estimating the state of a static process, given a sequence of noiseless measurements representing the constraints that must be satisfied by the system.
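
    The recursive computation of projection operators mentioned in the abstract can be sketched for the orthonormalization procedure: appending a constraint row a updates the null-space projector P via the component of a not already constrained. This is a generic linear-algebra sketch, not the paper's algorithm.

```python
def update_projector(P, a, tol=1e-12):
    # Given projector P onto the null space of the previous constraint rows,
    # append constraint row a: q = P a is the genuinely new direction;
    # if q != 0, subtract its normalised outer product from P.
    n = len(a)
    q = [sum(P[i][j] * a[j] for j in range(n)) for i in range(n)]
    norm2 = sum(qi * qi for qi in q)
    if norm2 < tol:              # constraint already implied by earlier ones
        return P
    return [[P[i][j] - q[i] * q[j] / norm2 for j in range(n)]
            for i in range(n)]

n = 3
P = [[float(i == j) for j in range(n)] for i in range(n)]   # start from identity
for a in ([1.0, 0.0, 0.0], [0.0, 1.0, 1.0]):                # constraint rows
    P = update_projector(P, a)
# P now projects onto span{[0, 1, -1]}, the remaining free direction.
```

    The recursion never needs the full constraint matrix at once, which is the point of the hierarchical, modular formulation.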

  6. Translating the Foundational Model of Anatomy into OWL.

    Science.gov (United States)

    Noy, Natalya F; Rubin, Daniel L

    2008-01-01

    The Foundational Model of Anatomy (FMA) represents the result of manual and disciplined modeling of the structural organization of the human body. It is a tremendous resource in bioinformatics that facilitates sharing of information among applications that use anatomy knowledge. The FMA was developed in Protégé and the Protégé frames language is the canonical representation language for the FMA. We present a translation of the original Protégé frame representation of the FMA into OWL. Our effort is complementary to the earlier efforts to represent FMA in OWL and is focused on two main goals: (1) representing only the information that is explicitly present in the frames representation of the FMA or that can be directly inferred from the semantics of Protégé frames; (2) representing all the information that is present in the frames representation of the FMA, thus producing an OWL representation for the complete FMA. Our complete representation of the FMA in OWL consists of two components: an OWL DL component that contains the FMA constructs that are compatible with OWL DL; and an OWL Full component that imports the OWL DL component and adds the FMA constructs that OWL DL does not allow.

  7. Arterial Calcification in Diabetes Mellitus: Preclinical Models and Translational Implications.

    Science.gov (United States)

    Stabley, John N; Towler, Dwight A

    2017-02-01

    Diabetes mellitus increasingly afflicts our aging and dysmetabolic population. Type 2 diabetes mellitus and the antecedent metabolic syndrome represent the vast majority of the disease burden-increasingly prevalent in children and older adults. However, type 1 diabetes mellitus is also advancing in preadolescent children. As such, a crushing wave of cardiometabolic disease burden now faces our society. Arteriosclerotic calcification is increased in metabolic syndrome, type 2 diabetes mellitus, and type 1 diabetes mellitus-impairing conduit vessel compliance and function, thereby increasing the risk for dementia, stroke, heart attack, limb ischemia, renal insufficiency, and lower extremity amputation. Preclinical models of these dysmetabolic settings have provided insights into the pathobiology of arterial calcification. Osteochondrogenic morphogens in the BMP-Wnt signaling relay and transcriptional regulatory programs driven by Msx and Runx gene families are entrained to innate immune responses-responses activated by the dysmetabolic state-to direct arterial matrix deposition and mineralization. Recent studies implicate the endothelial-mesenchymal transition in contributing to the phenotypic drift of mineralizing vascular progenitors. In this brief overview, we discuss preclinical disease models that provide mechanistic insights-and point to challenges and opportunities to translate these insights into new therapeutic strategies for our patients afflicted with diabetes mellitus and its arteriosclerotic complications. © 2016 American Heart Association, Inc.

  8. SYSTEMS METHODOLOGY AND MATHEMATICAL MODELS FOR KNOWLEDGE MANAGEMENT

    Institute of Scientific and Technical Information of China (English)

    Yoshiteru NAKAMORI

    2003-01-01

    This paper first introduces a new discipline, knowledge science, and the role of systems science in its development. Then, after a discussion of current trends in systems science, the paper proposes a new systems methodology for knowledge management and creation. Finally, the paper discusses mathematical modeling techniques to represent and manage human knowledge that is essentially vague and context-dependent.

  9. A methodology to annotate systems biology markup language models with the synthetic biology open language.

    Science.gov (United States)

    Roehner, Nicholas; Myers, Chris J

    2014-02-21

    Recently, we have begun to witness the potential of synthetic biology, noted here in the form of bacteria and yeast that have been genetically engineered to produce biofuels, manufacture drug precursors, and even invade tumor cells. The success of these projects, however, has often failed in translation and application to new projects, a problem exacerbated by a lack of engineering standards that combine descriptions of the structure and function of DNA. To address this need, this paper describes a methodology to connect the systems biology markup language (SBML) to the synthetic biology open language (SBOL), existing standards that describe biochemical models and DNA components, respectively. Our methodology involves first annotating SBML model elements such as species and reactions with SBOL DNA components. A graph is then constructed from the model, with vertices corresponding to elements within the model and edges corresponding to the cause-and-effect relationships between these elements. Lastly, the graph is traversed to assemble the annotating DNA components into a composite DNA component, which is used to annotate the model itself and can be referenced by other composite models and DNA components. In this way, our methodology can be used to build up a hierarchical library of models annotated with DNA components. Such a library is a useful input to any future genetic technology mapping algorithm that would automate the process of composing DNA components to satisfy a behavioral specification. Our methodology for SBML-to-SBOL annotation is implemented in the latest version of our genetic design automation (GDA) software tool, iBioSim.
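
    The annotate-build-traverse workflow described above can be sketched with plain data structures; the element names and the graph representation below are hypothetical, not the actual SBML/SBOL or iBioSim APIs.

```python
# Toy sketch: annotate model elements with DNA components, build a
# cause-and-effect graph, then traverse it to assemble a composite component.
from collections import defaultdict, deque

elements = {  # model element -> annotating DNA component (hypothetical names)
    "pTet": "promoter_1",
    "rbs":  "rbs_1",
    "gfp":  "cds_gfp",
    "term": "terminator_1",
}
edges = [("pTet", "rbs"), ("rbs", "gfp"), ("gfp", "term")]  # cause -> effect

def assemble_composite(elements, edges):
    # Topologically traverse the graph, concatenating annotations in order.
    indeg = defaultdict(int)
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
    queue = deque(e for e in elements if indeg[e] == 0)
    composite = []
    while queue:
        u = queue.popleft()
        composite.append(elements[u])
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return composite

parts = assemble_composite(elements, edges)
```

    The resulting ordered part list stands in for the composite DNA component that would annotate the model itself.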

  10. In vitro micro-physiological models for translational immunology.

    Science.gov (United States)

    Ramadan, Qasem; Gijs, Martin A M

    2015-02-07

    -tissue interactions, focusing in particular on the study of immune reactions, inflammation and the development of diseases. Also, an outlook on the opportunities and issues for further translational development of functional in vitro models in immunology will be presented.

  11. Availability modeling methodology applied to solar power systems

    Science.gov (United States)

    Unione, A.; Burns, E.; Husseiny, A.

    1981-01-01

    Availability is discussed as a measure for estimating the expected performance for solar- and wind-powered generation systems and for identifying causes of performance loss. Applicable analysis techniques, ranging from simple system models to probabilistic fault tree analysis, are reviewed. A methodology incorporating typical availability models is developed for estimating reliable plant capacity. Examples illustrating the impact of design and configurational differences on the expected capacity of a solar-thermal power plant with a fossil-fired backup unit are given.
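
    The simple end of the modelling spectrum mentioned above reduces to steady-state availability algebra: A = MTBF/(MTBF+MTTR) per unit, multiplied for series dependencies and combined complementarily for redundant (backup) units. The MTBF/MTTR figures below are illustrative only.

```python
def availability(mtbf, mttr):
    # Steady-state availability of a single repairable unit.
    return mtbf / (mtbf + mttr)

def series(*units):
    # All units must be up (e.g. collector field AND turbine).
    a = 1.0
    for u in units:
        a *= u
    return a

def parallel(*units):
    # Redundant units: system is down only if every unit is down.
    down = 1.0
    for u in units:
        down *= 1.0 - u
    return 1.0 - down

# Hypothetical hours: solar side = field + turbine in series, plus fossil backup.
solar = series(availability(1000, 50), availability(2000, 40))
plant = parallel(solar, availability(500, 10))
```

    Comparing `solar` with `plant` quantifies how the fossil-fired backup raises the expected reliable capacity, the kind of configurational difference the paper examines.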

  12. Stochastic sequence-level model of coupled transcription and translation in prokaryotes

    Directory of Open Access Journals (Sweden)

    Yli-Harja Olli

    2011-04-01

    Full Text Available Abstract Background In prokaryotes, transcription and translation are dynamically coupled, as the latter starts before the former is complete. Also, from one transcript, several translation events occur in parallel. To study how events in transcription elongation affect translation elongation and fluctuations in protein levels, we propose a delayed stochastic model of prokaryotic transcription and translation at the nucleotide and codon level that includes the promoter open complex formation and alternative pathways to elongation, namely pausing, arrests, editing, pyrophosphorolysis, RNA polymerase traffic, and premature termination. Stepwise translation can start after the ribosome binding site is formed and accounts for variable codon translation rates, ribosome traffic, back-translocation, drop-off, and trans-translation. Results First, we show that the model accurately matches measurements of sequence-dependent translation elongation dynamics. Next, we characterize the degree of coupling between fluctuations in RNA and protein levels, and its dependence on the rates of transcription and translation initiation. Finally, modeling sequence-specific transcriptional pauses, we find that these affect protein noise levels. Conclusions For parameter values within realistic intervals, transcription and translation are found to be tightly coupled in Escherichia coli, as the noise in protein levels is mostly determined by the underlying noise in RNA levels. Sequence-dependent events in transcription elongation, e.g. pauses, are found to cause tangible effects in the degree of fluctuations in protein levels.
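
    A drastically simplified, lumped version of such a model (the paper works at nucleotide and codon resolution, with delays, pauses and traffic; this sketch has only four reactions) can be simulated with the Gillespie algorithm:

```python
import random

def gillespie(k_r, g_r, k_p, g_p, t_end, seed=0):
    # Reactions: transcription (k_r), RNA decay (g_r*R),
    # translation (k_p*R), protein decay (g_p*P).
    rng = random.Random(seed)
    t, R, P = 0.0, 0, 0
    samples = []
    while t < t_end:
        rates = [k_r, g_r * R, k_p * R, g_p * P]
        total = sum(rates)
        t += rng.expovariate(total)          # time to next reaction
        u = rng.random() * total             # pick which reaction fires
        if u < rates[0]:
            R += 1
        elif u < rates[0] + rates[1]:
            R -= 1
        elif u < rates[0] + rates[1] + rates[2]:
            P += 1
        else:
            P -= 1
        samples.append((t, R, P))
    return samples

# Illustrative rates: mean RNA = k_r/g_r = 1, mean protein around 50.
samples = gillespie(k_r=1.0, g_r=1.0, k_p=5.0, g_p=0.1, t_end=500.0)
tail = [P for t, _, P in samples if t > 250.0]
mean_P = sum(tail) / len(tail)
```

    Even this lumped version reproduces the qualitative point of the abstract: protein fluctuations largely inherit the underlying RNA fluctuations, since each RNA burst is amplified by many translation events.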

  13. Stochastic sequence-level model of coupled transcription and translation in prokaryotes.

    Science.gov (United States)

    Mäkelä, Jarno; Lloyd-Price, Jason; Yli-Harja, Olli; Ribeiro, Andre S

    2011-04-26

    In prokaryotes, transcription and translation are dynamically coupled, as the latter starts before the former is complete. Also, from one transcript, several translation events occur in parallel. To study how events in transcription elongation affect translation elongation and fluctuations in protein levels, we propose a delayed stochastic model of prokaryotic transcription and translation at the nucleotide and codon level that includes the promoter open complex formation and alternative pathways to elongation, namely pausing, arrests, editing, pyrophosphorolysis, RNA polymerase traffic, and premature termination. Stepwise translation can start after the ribosome binding site is formed and accounts for variable codon translation rates, ribosome traffic, back-translocation, drop-off, and trans-translation. First, we show that the model accurately matches measurements of sequence-dependent translation elongation dynamics. Next, we characterize the degree of coupling between fluctuations in RNA and protein levels, and its dependence on the rates of transcription and translation initiation. Finally, modeling sequence-specific transcriptional pauses, we find that these affect protein noise levels. For parameter values within realistic intervals, transcription and translation are found to be tightly coupled in Escherichia coli, as the noise in protein levels is mostly determined by the underlying noise in RNA levels. Sequence-dependent events in transcription elongation, e.g. pauses, are found to cause tangible effects in the degree of fluctuations in protein levels.

  14. A translation invariant bipolaron in the Holstein model and superconductivity.

    Science.gov (United States)

    Lakhno, Victor

    2016-01-01

    Large-radius translation invariant (TI) bipolarons are considered in a one-dimensional Holstein molecular chain. Criteria of their stability are obtained. The energy of a translation invariant bipolaron is shown to be lower than that of a bipolaron with broken symmetry. The results obtained are applied to the problem of superconductivity in 1D-systems. It is shown that TI-bipolaron mechanism of Bose-Einstein condensation can support superconductivity even for infinite chain.

  15. An experimental methodology for a fuzzy set preference model

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers has yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which decomposes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented at an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model.
The combination of complex aggregate

  16. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge diversity of information and information assets produced in different systems such as KMSs, financial and accounting systems, office and industrial automation systems, and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model mean that organizations have a strong incentive to implement cloud computing. Maintaining and managing information security are the main challenges in developing and accepting this model. In this paper, first, in accordance with the design science research methodology and compatible with the design process in information systems research, a complete categorization of organizational assets, including 355 different types of information assets in 7 groups and 3 levels, is presented so that managers are able to plan corresponding security controls according to the importance of each group. Then, to guide an organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture resulting from the proposed methodology, and the presented classification model, were discussed and verified according to the Delphi method and experts' comments.

  17. Qualitative response models: A survey of methodology and illustrative applications

    Directory of Open Access Journals (Sweden)

    Nojković Aleksandra

    2007-01-01

    Full Text Available This paper introduces econometric modeling with discrete (categorical) dependent variables. Such models, commonly referred to as qualitative response (QR) models, have become a standard tool of microeconometric analysis. Microeconometric research represents empirical analysis of microdata, i.e. economic information about individuals, households and firms. Microeconometrics has been most widely adopted in various fields, such as labour economics, consumer behavior, or transport economics. The latest research shows that this methodology can also be successfully transferred to a macroeconomic context and applied to time series and panel data analysis in a wider scope.
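    A minimal sketch of the simplest QR model, the binary logit, fitted by gradient ascent on the log-likelihood; the toy data and tuning constants below are illustrative, not drawn from any survey.

```python
import math

def fit_logit(xs, ys, steps=5000, lr=0.5):
    """Fit P(y=1|x) = 1/(1+exp(-(a+b*x))) by gradient ascent
    on the log-likelihood (the log-likelihood is concave)."""
    a = b = 0.0
    n = len(xs)
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += y - p        # d logL / d a
            gb += (y - p) * x  # d logL / d b
        a += lr * ga / n
        b += lr * gb / n
    return a, b

# Toy microdata: a binary choice against a single covariate.
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
a, b = fit_logit(xs, ys)
p_hi = 1.0 / (1.0 + math.exp(-(a + b * 3.5)))
print(b > 0 and p_hi > 0.5)  # True
```

    Real microeconometric applications would add more covariates and report standard errors, but the estimation principle is the same.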

  18. Powerline Communications Channel Modelling Methodology Based on Statistical Features

    CERN Document Server

    Tan, Bo

    2012-01-01

    This paper proposes a new channel modelling method for powerline communications networks based on the multipath profile in the time domain. The new channel model is developed to be applied in a range of Powerline Communications (PLC) research topics such as impulse noise modelling, deployment and coverage studies, and communications theory analysis. To develop the methodology, channels are categorised according to their propagation distance and power delay profile. The statistical multipath parameters such as path arrival time, magnitude and interval for each category are analyzed to build the model. Each generated channel based on the proposed statistical model represents a different realisation of a PLC network. Simulation results in the time and frequency domains show that the proposed statistical modelling method, which integrates the impact of network topology, presents the same PLC channel features as the underlying transmission line theory model. Furthermore, two potential application scenarios are d...
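    The generation step of such a statistical channel model can be sketched as below; the distributions and parameter values are illustrative assumptions, not the values fitted in the paper.

```python
import math
import random

def generate_channel(n_paths=5, mean_interval=2e-7, decay=1.5e6, seed=42):
    """Draw one multipath realisation: exponentially distributed
    path intervals with magnitudes that decay with delay.
    All distributional choices and constants are illustrative."""
    rng = random.Random(seed)
    t = 0.0
    paths = []
    for _ in range(n_paths):
        t += rng.expovariate(1.0 / mean_interval)  # next arrival time (s)
        magnitude = math.exp(-decay * t) * rng.uniform(0.5, 1.0)
        paths.append((t, magnitude))
    return paths

# Each call with a different seed is a different PLC network realisation.
for delay, mag in generate_channel():
    print(f"{delay * 1e6:.3f} us  {mag:.4f}")
```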

  19. Translating the Foundational Model of Anatomy into French using knowledge-based and lexical methods

    Directory of Open Access Journals (Sweden)

    Merabti Tayeb

    2011-10-01

    Full Text Available Abstract Background The Foundational Model of Anatomy (FMA) is the reference ontology regarding human anatomy. The FMA vocabulary was integrated into the Health Multi Terminological Portal (HMTP) developed by CISMeF, based on the CISMeF Information System, which also includes 26 other terminologies and controlled vocabularies, mainly in French. However, the FMA is primarily in English. In this context, the translation of FMA English terms into French could also be useful for searching and indexing French anatomy resources. Various studies have investigated automatic methods to assist the translation of medical terminologies or create multilingual medical vocabularies. The goal of this study was to facilitate the translation of the FMA vocabulary into French. Methods We compare two types of approaches to translate the FMA terms into French. The first is UMLS-based, relying on the conceptual information of the UMLS Metathesaurus. The second is lexically based, relying on several Natural Language Processing (NLP) tools. Results The UMLS-based approach produced a translation of 3,661 FMA terms into French, whereas the lexical approach produced a translation of 3,129 FMA terms. A qualitative evaluation was made on 100 FMA terms translated by each method. For the UMLS-based approach, among the 100 translations, 52% were manually rated as "very good" and only 7% as "bad". For the lexical approach, among the 100 translations, 47% were rated as "very good" and 20% as "bad". Conclusions Overall, a low rate of translations was demonstrated by the two methods. The two approaches allowed us to semi-automatically translate 3,776 FMA terms from English into French; these were added to the existing 10,844 French FMA terms in the HMTP (4,436 FMA French terms and 6,408 FMA terms manually translated).

  20. Translating the Foundational Model of Anatomy into French using knowledge-based and lexical methods.

    Science.gov (United States)

    Merabti, Tayeb; Soualmia, Lina F; Grosjean, Julien; Palombi, Olivier; Müller, Jean-Michel; Darmoni, Stéfan J

    2011-10-26

    The Foundational Model of Anatomy (FMA) is the reference ontology regarding human anatomy. The FMA vocabulary was integrated into the Health Multi Terminological Portal (HMTP) developed by CISMeF, based on the CISMeF Information System, which also includes 26 other terminologies and controlled vocabularies, mainly in French. However, the FMA is primarily in English. In this context, the translation of FMA English terms into French could also be useful for searching and indexing French anatomy resources. Various studies have investigated automatic methods to assist the translation of medical terminologies or create multilingual medical vocabularies. The goal of this study was to facilitate the translation of the FMA vocabulary into French. We compare two types of approaches to translate the FMA terms into French. The first is UMLS-based, relying on the conceptual information of the UMLS Metathesaurus. The second is lexically based, relying on several Natural Language Processing (NLP) tools. The UMLS-based approach produced a translation of 3,661 FMA terms into French, whereas the lexical approach produced a translation of 3,129 FMA terms. A qualitative evaluation was made on 100 FMA terms translated by each method. For the UMLS-based approach, among the 100 translations, 52% were manually rated as "very good" and only 7% as "bad". For the lexical approach, among the 100 translations, 47% were rated as "very good" and 20% as "bad". Overall, a low rate of translations was demonstrated by the two methods. The two approaches allowed us to semi-automatically translate 3,776 FMA terms from English into French; these were added to the existing 10,844 French FMA terms in the HMTP (4,436 FMA French terms and 6,408 FMA terms manually translated).
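    A toy sketch of the lexical approach: translating a term word-by-word with a bilingual lexicon and flagging unknown words for manual review. The lexicon here is a tiny hypothetical fragment, not the NLP resources used in the study.

```python
# Tiny illustrative English -> French anatomy lexicon (hypothetical).
LEXICON = {
    "right": "droit",
    "left": "gauche",
    "lung": "poumon",
    "kidney": "rein",
}

def translate_term(term):
    """Word-by-word lexical translation; unknown words are kept
    as-is and flagged so a human can complete the translation."""
    translated, unknown = [], []
    for word in term.lower().split():
        if word in LEXICON:
            translated.append(LEXICON[word])
        else:
            translated.append(word)
            unknown.append(word)
    return " ".join(translated), unknown

print(translate_term("right lung"))  # ('droit poumon', [])
```

    A real lexical pipeline must also reorder words ("poumon droit", not "droit poumon") and handle morphology, which is one reason a share of the automatic translations was still rated "bad" and required manual review.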

  1. The Methodology Roles in the Realization of a Model Development Environment

    OpenAIRE

    Arthur, James D.; Nance, Richard E.

    1988-01-01

    The definition of "methodology" is followed by a very brief review of past work in modeling methodologies. The dual role of a methodology is explained: (1) conceptual guidance in the modeling task, and (2) definition of needs for environment designers. A model development environment based on the conical methodology serves for specific illustration of both roles.

  2. Supply Chain Modeling: Downstream Risk Assessment Methodology (DRAM)

    Science.gov (United States)

    2013-12-05

    Supply Chain Modeling: Downstream Risk Assessment Methodology (DRAM) Dr. Sean Barnett December 5, 2013 Institute for Defense Analyses Alexandria, Virginia DMSMS Conference 2013 These Slides are Unclassified and Not Proprietary

  3. An improved methodology for precise geoid/quasigeoid modelling

    Science.gov (United States)

    Nesvadba, Otakar; Holota, Petr

    2016-04-01

    The paper describes recent development of a computational procedure useful for precise local quasigeoid modelling. The overall methodology is primarily based on a solution of the so-called gravimetric boundary value problem for an ellipsoidal domain (exterior to an oblate spheroid), which means that gravity disturbances on the ellipsoid are used as input data. The problem of the difference between the Earth's topography and the chosen ellipsoidal surface is solved iteratively, by analytical continuation of the gravity disturbances to the computational ellipsoid. The methodology covers an interpolation technique for the discrete gravity data which, considering an a priori adopted covariance function, provides the best linear unbiased estimate of the respective quantity; a numerical integration technique developed on the surface of the ellipsoid in the spectral domain; an iterative procedure of analytical continuation in ellipsoidal coordinates; removal and restoration of the atmospheric masses; an estimate of the far-zone contribution (in the case of regional data coverage); and the restore step from the obtained disturbing gravity potential to the target height anomaly. All the computational steps of the procedure are modest in their consumption of computing resources, so the methodology can be used on a common personal computer, free of any accuracy or resolution penalty. Finally, the performance of the developed methodology is demonstrated on real-world examples related to the territories of France (Auvergne regional quasigeoid) and the Czech Republic.

  4. Methodology of problem space modeling in industrial enterprise management system

    Directory of Open Access Journals (Sweden)

    V.V. Glushchevsky

    2015-03-01

    Full Text Available The aim of the article. The aim of the article is to develop methodological principles for building a problem space model which can be integrated into an industrial enterprise management system. The results of the analysis. The author developed methodological principles for constructing the problem space of an industrial enterprise as a structural and functional model. These problems arise on the enterprise business process network topology and can be solved by its management system. The centerpiece of the article is a description of the main stages of implementing the modeling methodology for typical industrial enterprise management problems. These stages help units of the organizational management structure of the enterprise to solve problems within their functional competence. The author formulated an axiom system of the structural and characteristic properties of the problem space elements and the interconnections between them. This axiom system is in effect a justification of the correctness and adequacy of the proposed modeling methodology and serves as the theoretical basis for the construction of the structural and functional model of the management problem space.
This model generalizes three basic structural components of the enterprise management system with the help of the axiom system: a three-dimensional model of the management problem space (the first dimension is the enterprise business process network, the second dimension is the set of management problems, the third dimension is four vectors of measurable and qualitative characteristics of management problems, which can be analyzed and managed during enterprise functioning); a two-dimensional model of the cybernetic space of analytical problems, which are a formalized form of management problems (multivariate model experiments can be implemented with this model to solve a wide range of problem situations and determine the most effective or optimal management solutions); a two-dimensional model

  5. Mixed-mode modelling mixing methodologies for organisational intervention

    CERN Document Server

    Clarke, Steve; Lehaney, Brian

    2001-01-01

    The 1980s and 1990s have seen a growing interest in research and practice in the use of methodologies within problem contexts characterised by a primary focus on technology, human issues, or power. During the last five to ten years, this has given rise to challenges regarding the ability of a single methodology to address all such contexts, and the consequent development of approaches which aim to mix methodologies within a single problem situation. This has been particularly so where the situation has called for a mix of technological (the so-called 'hard') and human-centred (so-called 'soft') methods. The approach developed has been termed mixed-mode modelling. The area of mixed-mode modelling is relatively new, with the phrase being coined approximately four years ago by Brian Lehaney in a keynote paper published at the 1996 Annual Conference of the UK Operational Research Society. Mixed-mode modelling, as suggested above, is a new way of considering problem situations faced by organisations. Traditional...

  6. Environmental sustainability modeling with exergy methodology for building life cycle

    Institute of Scientific and Technical Information of China (English)

    刘猛; 姚润明

    2009-01-01

    As an important human activity, the building industry has created comfortable space for living and work, and at the same time brought considerable pollution and huge consumption of energy and resources. Since the 1990s, when the first building environmental assessment model, BREEAM, was released in the UK, a number of assessment models have been formulated, analytical and practical in methodology respectively. This paper aims to introduce a generic model of exergy assessment on the environmental impact of the building life cycle, taking into consideration previous models and focusing on the natural environment as well as the building life cycle; three environmental impacts will be analyzed, namely energy embodied exergy, resource chemical exergy and abatement exergy, on energy consumption, resource consumption and pollutant discharge respectively. The model of exergy assessment on the environmental impact of the building life cycle thus formulated contains two sub-models, one from the aspect of building energy utilization, and the other from building materials use. Combining theories by ecologists such as Odum, building environmental sustainability modeling with exergy methodology is put forward with the index of exergy footprint of building environmental impacts.
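    The three impact terms named above combine additively into an exergy footprint; a minimal sketch with hypothetical stage values in MJ (none of these numbers come from the paper):

```python
# Hypothetical life-cycle stages with (embodied exergy, chemical
# exergy, abatement exergy) in MJ; values are illustrative only.
STAGES = {
    "materials":    (1200.0, 300.0, 150.0),
    "construction": (400.0,   50.0,  80.0),
    "operation":    (5000.0, 100.0, 600.0),
}

def exergy_footprint(stages):
    """Sum the three exergy impact terms over all life-cycle stages."""
    return sum(embodied + chemical + abatement
               for embodied, chemical, abatement in stages.values())

print(exergy_footprint(STAGES))  # 7880.0
```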

  7. All Majorana Models with Translation Symmetry are Supersymmetric

    CERN Document Server

    Hsieh, Timothy H; Grover, Tarun

    2016-01-01

    We establish results similar to Kramers and Lieb-Schultz-Mattis theorems but involving only translation symmetry and for Majorana modes. In particular, we show that all states are at least doubly degenerate in any one and two dimensional array of Majorana modes with translation symmetry, periodic boundary conditions, and an odd number of modes per unit cell. Moreover, we show that all such systems have an underlying N=2 supersymmetry and explicitly construct the generator of the supersymmetry. Furthermore, we show that there cannot be a unique gapped ground state in such one dimensional systems with anti-periodic boundary conditions. These general results are fundamentally a consequence of the fact that translations for Majorana modes are represented projectively, which in turn stems from the anomalous nature of a single Majorana mode.

  8. All Majorana Models with Translation Symmetry are Supersymmetric

    Science.gov (United States)

    Hsieh, Timothy H.; Halász, Gábor B.; Grover, Tarun

    2016-10-01

    We establish results similar to Kramers and Lieb-Schultz-Mattis theorems but involving only translation symmetry and for Majorana modes. In particular, we show that all states are at least doubly degenerate in any one- and two-dimensional array of Majorana modes with translation symmetry, periodic boundary conditions, and an odd number of modes per unit cell. Moreover, we show that all such systems have an underlying N =2 supersymmetry and explicitly construct the generator of the supersymmetry. Furthermore, we establish that there cannot be a unique gapped ground state in such one-dimensional systems with antiperiodic boundary conditions. These general results are fundamentally a consequence of the fact that translations for Majorana modes are represented projectively, which in turn stems from the anomalous nature of a single Majorana mode. An experimental signature of the degeneracy arising from supersymmetry is a zero-bias peak in tunneling conductance.

  9. Translation techniques for distributed-shared memory programming models

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, Douglas James [Iowa State Univ., Ames, IA (United States)

    2005-01-01

    The high performance computing community has experienced an explosive improvement in distributed-shared memory hardware. Driven by increasing real-world problem complexity, this explosion has ushered in vast numbers of new systems. Each new system presents new challenges to programmers and application developers. Part of the challenge is adapting to new architectures with new performance characteristics. Different vendors release systems with widely varying architectures that perform differently in different situations. Furthermore, since vendors need only provide a single performance number (total MFLOPS, typically for a single benchmark), they only have strong incentive initially to optimize the API of their choice. Consequently, only a fraction of the available APIs are well optimized on most systems. This causes issues porting and writing maintainable software, let alone issues for programmers burdened with mastering each new API as it is released. Also, programmers wishing to use a certain machine must choose their API based on the underlying hardware instead of the application. This thesis argues that a flexible, extensible translator for distributed-shared memory APIs can help address some of these issues. For example, a translator might take as input code in one API and output an equivalent program in another. Such a translator could provide instant porting for applications to new systems that do not support the application's library or language natively. While open-source APIs are abundant, they do not perform optimally everywhere. A translator would also allow performance testing using a single base code translated to a number of different APIs. Most significantly, this type of translator frees programmers to select the most appropriate API for a given application based on the application (and developer) itself instead of the underlying hardware.

  10. Translation techniques for distributed-shared memory programming models

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, Douglas James

    2005-08-01

    The high performance computing community has experienced an explosive improvement in distributed-shared memory hardware. Driven by increasing real-world problem complexity, this explosion has ushered in vast numbers of new systems. Each new system presents new challenges to programmers and application developers. Part of the challenge is adapting to new architectures with new performance characteristics. Different vendors release systems with widely varying architectures that perform differently in different situations. Furthermore, since vendors need only provide a single performance number (total MFLOPS, typically for a single benchmark), they only have strong incentive initially to optimize the API of their choice. Consequently, only a fraction of the available APIs are well optimized on most systems. This causes issues porting and writing maintainable software, let alone issues for programmers burdened with mastering each new API as it is released. Also, programmers wishing to use a certain machine must choose their API based on the underlying hardware instead of the application. This thesis argues that a flexible, extensible translator for distributed-shared memory APIs can help address some of these issues. For example, a translator might take as input code in one API and output an equivalent program in another. Such a translator could provide instant porting for applications to new systems that do not support the application's library or language natively. While open-source APIs are abundant, they do not perform optimally everywhere. A translator would also allow performance testing using a single base code translated to a number of different APIs. Most significantly, this type of translator frees programmers to select the most appropriate API for a given application based on the application (and developer) itself instead of the underlying hardware.

  11. TRANSLATOR OF FINITE STATE MACHINE MODEL PARAMETERS FROM MATLAB ENVIRONMENT INTO HUMAN-MACHINE INTERFACE APPLICATION

    OpenAIRE

    2012-01-01

    A technology and means for the automatic translation of FSM model parameters from a Matlab application into a human-machine interface application are proposed. An example of applying the technology to an electric apparatus model is described.

  12. Didaktisch-methodisches Modell, Methode und methodisches Instrumentarium im Fremdsprachenunterricht (Pedagogical-Methodological Model, Method and Methodological Arsenal in Foreign Language Teaching)

    Science.gov (United States)

    Guenther, Klaus

    1975-01-01

    Concentrates on (1) an exposition of the categories "pedagogical-methodological model", "method", and "methodological arsenal" from the viewpoint of FL teaching; (2) clearing up the relation between the pedagogical-methodological model and teaching method; (3) explaining an example of the application of the categories mentioned. (Text is in…

  13. Modeling sleep alterations in Parkinson's disease: How close are we to valid translational animal models?

    Science.gov (United States)

    Fifel, Karim; Piggins, Hugh; Deboer, Tom

    2016-02-01

    Parkinson's disease is one of the neurodegenerative diseases that has benefited the most from the use of non-human models. Consequently, significant advances have been made in the symptomatic treatments of the motor aspects of the disease. Unfortunately, this translational success has been tempered by the recognition of the debilitating aspect of multiple non-motor symptoms of the illness. Alterations of sleep/wakefulness behavior experienced as insomnia, excessive daytime sleepiness, sleep/wake cycle fragmentation and REM sleep behavior disorder are among the non-motor symptoms that predate motor alterations and inevitably worsen over disease progression. The absence of adequate humanized animal models with a perfect phenocopy of these sleep alterations undoubtedly contributes to the lack of efficient therapies for these non-motor complications. In the context of developing efficient translational therapies, we provide an overview of the strengths and limitations of the various currently available models for replicating the sleep alterations of Parkinson's disease. Our investigation reveals that although these models replicate dopaminergic deficiency and related parkinsonism, they rarely display a combination of sleep fragmentation and excessive daytime sleepiness, and never REM sleep behavior disorder. In this light, we critically discuss the construct, face and predictive validities of both rodent and non-human primate models of the main sleep abnormalities experienced by patients with PD. We conclude by highlighting the need to integrate a network-based perspective in our modeling approach to such a complex syndrome in order to create valid translational models.

  14. Porcine models of digestive disease: the future of large animal translational research

    OpenAIRE

    Gonzalez, Liara M.; Moeser, Adam J; Blikslager, Anthony T.

    2015-01-01

    There is increasing interest in non-rodent translational models for the study of human disease. The pig, in particular, serves as a useful animal model for the study of pathophysiological conditions relevant to the human intestine. This review assesses currently used porcine models of gastrointestinal physiology and disease and provides a rationale for the use of these models for future translational studies. The pig has proven its utility for the study of fundamental disease conditions such ...

  15. A generalized methodology to characterize composite materials for pyrolysis models

    Science.gov (United States)

    McKinnon, Mark B.

    The predictive capabilities of computational fire models have improved in recent years such that models have become an integral part of many research efforts. Models improve the understanding of the fire risk of materials and may decrease the number of expensive experiments required to assess the fire hazard of a specific material or designed space. A critical component of a predictive fire model is the pyrolysis sub-model that provides a mathematical representation of the rate of gaseous fuel production from condensed phase fuels given a heat flux incident to the material surface. The modern, comprehensive pyrolysis sub-models that are common today require the definition of many model parameters to accurately represent the physical description of materials that are ubiquitous in the built environment. Coupled with the increase in the number of parameters required to accurately represent the pyrolysis of materials is the increasing prevalence in the built environment of engineered composite materials that have never been measured or modeled. The motivation behind this project is to develop a systematic, generalized methodology to determine the requisite parameters to generate pyrolysis models with predictive capabilities for layered composite materials that are common in industrial and commercial applications. This methodology has been applied to four common composites in this work that exhibit a range of material structures and component materials. The methodology utilizes a multi-scale experimental approach in which each test is designed to isolate and determine a specific subset of the parameters required to define a material in the model. Data collected in simultaneous thermogravimetry and differential scanning calorimetry experiments were analyzed to determine the reaction kinetics, thermodynamic properties, and energetics of decomposition for each component of the composite. Data collected in microscale combustion calorimetry experiments were analyzed to
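    The reaction kinetics extracted from such thermogravimetric data are typically expressed as a first-order Arrhenius rate law; the sketch below integrates it by forward Euler under a constant heating rate, with all parameter values being illustrative rather than fitted.

```python
import math

GAS_CONSTANT = 8.314  # J/(mol K)

def remaining_mass(pre_exp, activation_e, heat_rate,
                   t_end, dt=0.01, t0_kelvin=300.0):
    """Integrate d(alpha)/dt = A*exp(-E/(R*T))*(1 - alpha) with
    T = T0 + heat_rate*t, where alpha is the conversion fraction.
    Parameters are illustrative, not fitted to any material."""
    alpha, t = 0.0, 0.0
    while t < t_end:
        temp = t0_kelvin + heat_rate * t
        rate = pre_exp * math.exp(-activation_e / (GAS_CONSTANT * temp))
        alpha += dt * rate * (1.0 - alpha)
        t += dt
    return 1.0 - alpha  # mass fraction left

m = remaining_mass(pre_exp=1e10, activation_e=1.5e5,
                   heat_rate=0.5, t_end=1000.0)
print(0.0 < m < 1.0)  # True
```

    In the methodology described above, the pre-exponential factor and activation energy of each component would be determined from the measured TGA/DSC curves rather than assumed.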

  16. Selected translated abstracts of Russian-language climate-change publications. 4: General circulation models

    Energy Technology Data Exchange (ETDEWEB)

    Burtis, M.D. [comp.] [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Razuvaev, V.N.; Sivachok, S.G. [All-Russian Research Inst. of Hydrometeorological Information--World Data Center, Obninsk (Russian Federation)

    1996-10-01

    This report presents English-translated abstracts of important Russian-language literature concerning general circulation models as they relate to climate change. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Russian. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  17. Unbiased quantitative models of protein translation derived from ribosome profiling data

    NARCIS (Netherlands)

    Gritsenko, A.A.; Hulsman, M.; Reinders, M.J.T.; Ridder, de D.

    2015-01-01

    Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of prot

  18. Unbiased Quantitative Models of Protein Translation Derived from Ribosome Profiling Data

    NARCIS (Netherlands)

    Gritsenko, A.A.; Hulsman, M.; Reinders, M.J.T.; De Ridder, D.

    2015-01-01

    Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of prot

  19. Unbiased Quantitative Models of Protein Translation Derived from Ribosome Profiling Data.

    Directory of Open Access Journals (Sweden)

    Alexey A Gritsenko

    2015-08-01

    Full Text Available Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of protein synthesis. Although a number of computational models of the process have been proposed, their application is limited by the assumptions they make. Ribosome profiling (RP), a relatively new sequencing-based technique capable of recording snapshots of the locations of actively translating ribosomes, is a promising source of information for deriving unbiased data-driven translation models. However, quantitative analysis of RP data is challenging due to high measurement variance and the inability to discriminate between the number of ribosomes measured on a gene and their speed of translation. We propose a solution in the form of a novel multi-scale interpretation of RP data that allows for deriving models with translation dynamics extracted from the snapshots. We demonstrate the usefulness of this approach by simultaneously determining for the first time per-codon translation elongation and per-gene translation initiation rates of Saccharomyces cerevisiae from RP data for two versions of the Totally Asymmetric Exclusion Process (TASEP) model of translation. We do this in an unbiased fashion, by fitting the models using only RP data with a novel optimization scheme based on Monte Carlo simulation to keep the problem tractable. The fitted models match the data significantly better than existing models and their predictions show better agreement with several independent protein abundance datasets than existing models. Results additionally indicate that the tRNA pool adaptation hypothesis is incomplete, with evidence suggesting that tRNA post-transcriptional modifications and codon context may play a role in determining codon elongation rates.

  20. Unbiased Quantitative Models of Protein Translation Derived from Ribosome Profiling Data.

    Science.gov (United States)

    Gritsenko, Alexey A; Hulsman, Marc; Reinders, Marcel J T; de Ridder, Dick

    2015-08-01

    Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of protein synthesis. Although a number of computational models of the process have been proposed, their application is limited by the assumptions they make. Ribosome profiling (RP), a relatively new sequencing-based technique capable of recording snapshots of the locations of actively translating ribosomes, is a promising source of information for deriving unbiased data-driven translation models. However, quantitative analysis of RP data is challenging due to high measurement variance and the inability to discriminate between the number of ribosomes measured on a gene and their speed of translation. We propose a solution in the form of a novel multi-scale interpretation of RP data that allows for deriving models with translation dynamics extracted from the snapshots. We demonstrate the usefulness of this approach by simultaneously determining for the first time per-codon translation elongation and per-gene translation initiation rates of Saccharomyces cerevisiae from RP data for two versions of the Totally Asymmetric Exclusion Process (TASEP) model of translation. We do this in an unbiased fashion, by fitting the models using only RP data with a novel optimization scheme based on Monte Carlo simulation to keep the problem tractable. The fitted models match the data significantly better than existing models and their predictions show better agreement with several independent protein abundance datasets than existing models. Results additionally indicate that the tRNA pool adaptation hypothesis is incomplete, with evidence suggesting that tRNA post-transcriptional modifications and codon context may play a role in determining codon elongation rates.
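    The TASEP model of translation referenced in these abstracts can be sketched as a simple Monte Carlo simulation on a codon lattice; the rates, lattice size, and time step below are arbitrary illustrative values, not the estimates from the paper.

```python
import random

def simulate_tasep(n_sites=30, init_rate=0.1, elong_rate=1.0,
                   steps=50000, dt=0.01, seed=0):
    """TASEP: ribosomes initiate at site 0, hop right only if the
    next site is empty (steric exclusion), and terminate at the
    last site. Returns the number of completed proteins."""
    rng = random.Random(seed)
    lattice = [0] * n_sites
    completed = 0
    for _ in range(steps):
        # Sweep right-to-left so a ribosome moves at most once per step.
        if lattice[-1] and rng.random() < elong_rate * dt:
            lattice[-1] = 0
            completed += 1  # termination
        for i in range(n_sites - 2, -1, -1):
            if lattice[i] and not lattice[i + 1] \
                    and rng.random() < elong_rate * dt:
                lattice[i], lattice[i + 1] = 0, 1  # elongation
        if not lattice[0] and rng.random() < init_rate * dt:
            lattice[0] = 1  # initiation
    return completed

print(simulate_tasep() > 0)  # True
```

    In the per-codon version fitted in the paper, elong_rate would be a vector of site-specific rates estimated from the ribosome profiling data rather than a single constant.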

  1. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  2. Improved Methodology for Parameter Inference in Nonlinear, Hydrologic Regression Models

    Science.gov (United States)

    Bates, Bryson C.

    1992-01-01

    A new method is developed for the construction of reliable marginal confidence intervals and joint confidence regions for the parameters of nonlinear, hydrologic regression models. A parameter power transformation is combined with measures of the asymptotic bias and asymptotic skewness of maximum likelihood estimators to determine the transformation constants which cause the bias or skewness to vanish. These optimized constants are used to construct confidence intervals and regions for the transformed model parameters using linear regression theory. The resulting confidence intervals and regions can be easily mapped into the original parameter space to give close approximations to likelihood method confidence intervals and regions for the model parameters. Unlike many other approaches to parameter transformation, the procedure does not use a grid search to find the optimal transformation constants. An example involving the fitting of the Michaelis-Menten model to velocity-discharge data from an Australian gauging station is used to illustrate the usefulness of the methodology.
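As a sketch of the setting (omitting the paper's bias/skewness optimization), the following fits the Michaelis-Menten model v = Vmax·q/(K+q) by Gauss-Newton, takes linear-theory standard errors from the Jacobian, and maps an interval built on a transformed parameter (log K, one member of the power family) back to the original scale, which yields an asymmetric interval. The velocity-discharge data are synthetic:

```python
import math

def michaelis_menten(q, vmax, k):
    return vmax * q / (k + q)

def fit_mm(qs, vs, vmax0, k0, iters=50):
    """Gauss-Newton fit of v = Vmax*q/(K+q) plus approximate standard
    errors from the linearised (Jacobian) theory.  Needs a sensible start."""
    vmax, k = vmax0, k0
    for _ in range(iters):
        r  = [v - michaelis_menten(q, vmax, k) for q, v in zip(qs, vs)]
        j1 = [q / (k + q) for q in qs]                 # d/dVmax
        j2 = [-vmax * q / (k + q) ** 2 for q in qs]    # d/dK
        a = sum(x * x for x in j1)
        b = sum(x * y for x, y in zip(j1, j2))
        c = sum(x * x for x in j2)
        g1 = sum(x * y for x, y in zip(j1, r))
        g2 = sum(x * y for x, y in zip(j2, r))
        det = a * c - b * b
        vmax += (c * g1 - b * g2) / det                # solve (J'J) d = J'r
        k    += (a * g2 - b * g1) / det
    s2 = sum((v - michaelis_menten(q, vmax, k)) ** 2
             for q, v in zip(qs, vs)) / (len(qs) - 2)
    return vmax, k, math.sqrt(s2 * c / det), math.sqrt(s2 * a / det)

qs = [0.2, 0.5, 1.0, 2.0, 4.0, 8.0]                    # synthetic discharges
vs = [michaelis_menten(q, 10.0, 1.5) for q in qs]      # noise-free velocities
vmax, k, se_vmax, se_k = fit_mm(qs, vs, vmax0=9.0, k0=2.0)
# 95% interval built on log(K) via the delta method, then mapped back:
# asymmetric around K in the original parameter space
lo = k * math.exp(-1.96 * se_k / k)
hi = k * math.exp(+1.96 * se_k / k)
print(round(vmax, 6), round(k, 6))
```

The paper's contribution is choosing the transformation constant analytically (so that asymptotic bias or skewness vanishes) rather than fixing log as done here.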

  3. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  4. teaching translation

    Directory of Open Access Journals (Sweden)

    Sergio Bolaños Cuéllar

    2007-01-01

Full Text Available The advance of culturally-oriented perspectives in Translation Studies has sometimes played down the text-linguistic nature of translation. A pilot study in teaching translation was carried out to make students aware of the text-linguistic character of translating and help them to improve their translation skills, particularly with an emphasis on self-awareness and self-correcting strategies. The theoretical background is provided by the Dynamic Translation Model (2004, 2005) proposed by the author, with relevant and important contributions taken from Genette's (1982) transtextuality phenomena (hypertext, hypotext, metatext, paratext, intertext) and House and Kasper's (1981) pragmatic modality markers (downgraders, upgraders). The key conceptual role of equivalence as a defining feature of translation is also dealt with. The textual relationship with the Source Language Text (SLT) is deemed to be pivotal for performing translation and correction tasks in the classroom. Finally, results of the pilot study are discussed and some conclusions are drawn.

  5. Animal Models of Virus-Induced Neurobehavioral Sequelae: Recent Advances, Methodological Issues, and Future Prospects

    Directory of Open Access Journals (Sweden)

    Marco Bortolato

    2010-01-01

Full Text Available Converging lines of clinical and epidemiological evidence suggest that viral infections in early developmental stages may be a causal factor in neuropsychiatric disorders such as schizophrenia, bipolar disorder, and autism-spectrum disorders. This etiological link, however, remains controversial in view of the lack of consistent and reproducible associations between viruses and mental illness. Animal models of virus-induced neurobehavioral disturbances afford powerful tools to test etiological hypotheses and explore pathophysiological mechanisms. Prenatal or neonatal inoculations of neurotropic agents (such as herpes-, influenza-, and retroviruses) in rodents result in a broad spectrum of long-term alterations reminiscent of psychiatric abnormalities. Nevertheless, the complexity of these sequelae often poses methodological and interpretational challenges and thwarts their characterization. The recent conceptual advancements in psychiatric nosology and behavioral science may help determine new heuristic criteria to enhance the translational value of these models. A particularly critical issue is the identification of intermediate phenotypes, defined as quantifiable factors representing single neurochemical, neuropsychological, or neuroanatomical aspects of a diagnostic category. In this paper, we examine how the employment of these novel concepts may lead to new methodological refinements in the study of virus-induced neurobehavioral sequelae through animal models.

  6. Quality Assessment of Persian Translation of English Pharmaceutical Leaflets Based on House’s Model

    Directory of Open Access Journals (Sweden)

    Atefeh Zekri

    2016-11-01

Full Text Available This research attempted to evaluate the quality of Persian translations of drug leaflets. The researchers randomly selected a set of 30 pharmaceutical leaflets collected between March and August 2015. The leaflets were analyzed based on House's functional-pragmatic model of translation quality assessment. At first, the profiles of both source texts and target texts were collected. Then, their overt and covert errors and the kinds of strategies used by the translators in translating the pharmaceutical leaflets into Persian were identified. The results indicate that out of 90 selected sentences of English leaflets, 47 were overtly erroneous and 43 were error-free. These overtly erroneous translations contained 27 instances of “mistranslation”, 15 instances of “grammatical mistakes”, six instances of “addition”, six instances of “omission” and four instances of “substitution”. The only covert error was “tenor mismatch”, found in all sample sentences. The study findings can help teachers in translation studies to improve the quality of students’ translations. Moreover, awareness of the errors in current leaflet translations can assist students in performing their future jobs as translators. The findings may directly and indirectly affect the health of patients in an efficient and effective way.

  7. Lost in translation: animal models and clinical trials in cancer treatment.

    Science.gov (United States)

    Mak, Isabella Wy; Evaniew, Nathan; Ghert, Michelle

    2014-01-01

Due to practical and ethical concerns associated with human experimentation, animal models have been essential in cancer research. However, the average rate of successful translation from animal models to clinical cancer trials is less than 8%. Animal models are limited in their ability to mimic the extremely complex processes of human carcinogenesis, physiology and progression. Therefore, the safety and efficacy identified in animal studies are generally not translated to human trials. Animal models can serve as an important source of in vivo information, but alternative translational approaches have emerged that may eventually replace the link between in vitro studies and clinical applications. This review summarizes the current state of animal model translation to clinical practice, and offers some explanations for the general lack of success in this process. In addition, some alternative strategies to the classic in vivo approach are discussed.

  8. Rodent models of ischemic stroke lack translational relevance... are baboon models the answer?

    Science.gov (United States)

    Kwiecien, Timothy D; Sy, Christopher; Ding, Yuchuan

    2014-05-01

    Rodent models of ischemic stroke are associated with many issues and limitations, which greatly diminish the translational potential of these studies. Recent studies demonstrate that significant differences exist between rodent and human ischemic stroke. These differences include the physical characteristics of the stroke, as well as changes in the subsequent inflammatory and molecular pathways following the acute ischemic insult. Non-human primate (NHP) models of ischemic stroke, however, are much more similar to humans. In addition to evident anatomical similarities, the physiological responses that NHPs experience during ischemic stroke are much more applicable to the human condition and thus make it an attractive model for future research. The baboon ischemic stroke model, in particular, has been studied extensively in comparison to other NHP models. Here we discuss the major shortcomings associated with rodent ischemic stroke models and provide a comparative overview of baboon ischemic stroke models. Studies have shown that baboons, although more difficult to obtain and handle, are more representative of ischemic events in humans and may have greater translational potential that can offset these deficiencies. There remain critical issues within these baboon stroke studies that need to be addressed in future investigations. The most critical issue revolves around the size and the variability of baboon ischemic stroke. Compared to rodent models, however, issues such as these can be addressed in future studies. Importantly, baboon models avoid many drawbacks associated with rodent models including vascular variability and inconsistent inflammatory responses - issues that are inherent to the species and cannot be avoided.

  9. Developing the Business Model – a Methodology for Virtual Enterprises

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a methodology to develop Virtual Enterprises (VEs). This Virtual Enterprise Methodology (VEM) outlines activities to consider when setting up and managing virtual enterprises. As a methodology the VEM helps companies to ask the right questions when preparing for, and setting...... and Methodology ISO15704:2000)....

  10. Methodology and basic algorithms of the Livermore Economic Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.B.

    1981-03-17

    The methodology and the basic pricing algorithms used in the Livermore Economic Modeling System (EMS) are described. The report explains the derivations of the EMS equations in detail; however, it could also serve as a general introduction to the modeling system. A brief but comprehensive explanation of what EMS is and does, and how it does it is presented. The second part examines the basic pricing algorithms currently implemented in EMS. Each algorithm's function is analyzed and a detailed derivation of the actual mathematical expressions used to implement the algorithm is presented. EMS is an evolving modeling system; improvements in existing algorithms are constantly under development and new submodels are being introduced. A snapshot of the standard version of EMS is provided and areas currently under study and development are considered briefly.

  11. MODERN MODELS AND METHODS OF DIAGNOSIS OF METHODOLOGY COMPETENT TEACHERS

    Directory of Open Access Journals (Sweden)

    Loyko V. I.

    2016-06-01

Full Text Available The purpose of the research is the development of models and methods for diagnosing the methodical competence of a teacher. According to modern views, methodical thinking is the key competence of teachers. Modern experts consider the methodical competence of a teacher as a personal and professional quality that is a fundamentally important factor in the success of the teacher's professional activity, as well as a subsystem of his or her professional competence. This is because, in today's world, a high level of subject knowledge and a command of the basics of teaching methods cannot fully describe the level of a teacher's professional competence. The authors characterize the functional components of methodical competence, its relationship with other personal-professional qualities (first of all, psychological-pedagogical, research, and informational competence), and its levels of formation. In forming a model of methodical competence, the authors proceeded from the fact that high demands are placed on the contemporary teacher: he or she must be ready to conduct independent research, design learning technologies, and forecast the results of training and educating students. The leading component of the methodical competence of the teacher is his or her personal experience of methodological activity, and the requirements on methodical competence are determined by the goals and objectives of that activity. In the present study, the formation of the model of methodical competence was preceded by a refinement of existing models of the methodical activity of the scientific and pedagogical staff of higher education institutions and secondary vocational education institutions.
The proposed model of methodical competence of the teacher provides the scientific basis for a system of monitoring the teacher's personal and professional development, together with evaluation criteria and levels for its diagnosis.

  12. Proposed Methodology for Generation of Building Information Model with Laserscanning

    Institute of Scientific and Technical Information of China (English)

Shutao Li; Jörg Isele; Georg Bretthauer

    2008-01-01

For refurbishment and state review of an existing old building, a new model reflecting the current state is often required, especially when the original plans are no longer accessible. Laser scanners are used more and more as surveying instruments for various applications because of their high-precision scanning abilities. For buildings, the most notable and widely accepted product data model is the IFC product data model. It is designed to cover the whole lifecycle, is supported by various software vendors, and enables applications to efficiently share and exchange project information. The models obtained with the laser scanner, normally sets of points ("point cloud"), have to be transferred to an IFC compatible building information model to serve the needs of different planning states. This paper presents an approach designed by the German Research Center in Karlsruhe (Forschungszentrum Karlsruhe) to create an IFC compatible building information model from laser range images. The methodology covering the entire process, from data acquisition to the IFC compatible product model, is proposed in this paper. In addition, IFC models with different levels of detail (LoDs) are introduced and discussed within the work.

  13. A thematic analysis of theoretical models for translational science in nursing: mapping the field.

    Science.gov (United States)

    Mitchell, Sandra A; Fisher, Cheryl A; Hastings, Clare E; Silverman, Leanne B; Wallen, Gwenyth R

    2010-01-01

    The quantity and diversity of conceptual models in translational science may complicate rather than advance the use of theory. This paper offers a comparative thematic analysis of the models available to inform knowledge development, transfer, and utilization. Literature searches identified 47 models for knowledge translation. Four thematic areas emerged: (1) evidence-based practice and knowledge transformation processes, (2) strategic change to promote adoption of new knowledge, (3) knowledge exchange and synthesis for application and inquiry, and (4) designing and interpreting dissemination research. This analysis distinguishes the contributions made by leaders and researchers at each phase in the process of discovery, development, and service delivery. It also informs the selection of models to guide activities in knowledge translation. A flexible theoretical stance is essential to simultaneously develop new knowledge and accelerate the translation of that knowledge into practice behaviors and programs of care that support optimal patient outcomes.

  14. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of a leak path factor (LPF), the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. This LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and on other pathways from the building, such as doorways (both open and closed). This study shows how the multiple LPFs from the building interior can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). This study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.

  15. A generic method for automatic translation between input models for different versions of simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Serfontein, Dawid E., E-mail: Dawid.Serfontein@nwu.ac.za [School of Mechanical and Nuclear Engineering, North West University (PUK-Campus), PRIVATE BAG X6001 (Internal Post Box 360), Potchefstroom 2520 (South Africa); Mulder, Eben J. [School of Mechanical and Nuclear Engineering, North West University (South Africa); Reitsma, Frederik [Calvera Consultants (South Africa)

    2014-05-01

A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as those for the VSOP codes, often are very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. Therefore the task of, for instance, nuclear regulators to verify the accuracy of such translated files can be very difficult and cumbersome. This may cause translation errors not to be picked up, which may have disastrous consequences later on when a reactor with such a faulty design is built. Therefore a generic algorithm for producing such automatic translation codes may ease the translation and verification process to a great extent. It will also remove human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm also automatically creates a verification log file which permanently records the names and values of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications.
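Stripped to its essentials, a translator of this kind is a mapping applied to name/value pairs plus a permanently written verification log. A hedged sketch follows; the variable names and value meanings are hypothetical, not actual VSOP input:

```python
def translate_model(entries, name_map, value_meanings):
    """Translate a {name: value} input model into another code version's
    vocabulary, and build a verification log recording every variable's
    old name, new name, value, and the documented meanings of its values."""
    translated, log = {}, []
    for old_name, value in entries.items():
        new_name = name_map.get(old_name, old_name)  # unmapped names pass through
        translated[new_name] = value
        log.append({"old": old_name, "new": new_name, "value": value,
                    "meanings": value_meanings.get(new_name, {})})
    return translated, log

# hypothetical rename between two input formats
old_model = {"NLAYER": 5, "FUELTYP": 2}
name_map = {"FUELTYP": "FUEL_TYPE"}
value_meanings = {"FUEL_TYPE": {1: "UO2 pellet", 2: "TRISO particle"}}

new_model, log = translate_model(old_model, name_map, value_meanings)
print(new_model)  # {'NLAYER': 5, 'FUEL_TYPE': 2}
```

The log, written alongside the translated file, is what lets a regulator audit thousands of entries without reading the raw numeric deck.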

  16. Assessing the Quality of Persian Translation of Orwell's Nineteen Eighty-four Based on House's Model: Overt-covert Translation Distinction

    Directory of Open Access Journals (Sweden)

    Hossein Heidani TABRIZI

    2014-12-01

Full Text Available This study aimed to assess the quality of the Persian translation of Orwell's (1949) Nineteen Eighty-Four by Balooch (2004) based on House's (1997) model of translation quality assessment. To do so, about 10 percent of the source text was randomly selected. The profile of the source text register was produced and the genre was realized. The source text profile was compared to the translation text profile. The result of this comparison was dimensional mismatches and overt errors. The dimensional mismatches were categorized based on different dimensions of register. The overt errors, which were based on denotative mismatches and target system errors, were categorized into omissions, additions, substitutions, and breaches of the target language system. Then, the frequencies of occurrences of subcategories of overt errors along with their percentages were calculated. The dimensional mismatches and a large number of major overt errors including omissions and substitutions indicated that the translation was not in accordance with House's view stating that literary works needed to be translated overtly. Mismatches on different levels of register showed that the cultural filter was applied in translation and the second-level functional equivalence required for overt translation was not reached. Therefore, the Persian translation of Nineteen Eighty-Four did not fulfill the criteria to be an overt translation.

  17. A methodology for modeling barrier island storm-impact scenarios

    Science.gov (United States)

    Mickey, Rangley C.; Long, Joseph W.; Plant, Nathaniel G.; Thompson, David M.; Dalyander, P. Soupy

    2017-02-16

    A methodology for developing a representative set of storm scenarios based on historical wave buoy and tide gauge data for a region at the Chandeleur Islands, Louisiana, was developed by the U.S. Geological Survey. The total water level was calculated for a 10-year period and analyzed against existing topographic data to identify when storm-induced wave action would affect island morphology. These events were categorized on the basis of the threshold of total water level and duration to create a set of storm scenarios that were simulated, using a high-fidelity, process-based, morphologic evolution model, on an idealized digital elevation model of the Chandeleur Islands. The simulated morphological changes resulting from these scenarios provide a range of impacts that can help coastal managers determine resiliency of proposed or existing coastal structures and identify vulnerable areas within those structures.
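The core of this scenario-building step, thresholding a total-water-level series and grouping exceedances by duration, can be sketched as follows. The series, threshold, and minimum duration below are invented; the USGS analysis used 10 years of buoy and tide-gauge data evaluated against island topography:

```python
def storm_events(twl_series, dt_hours, threshold, min_duration_h):
    """Group hours where the total water level exceeds a morphology-affecting
    threshold into discrete storm events, keeping only those that last long
    enough.  Returns (start_hour, duration_h, peak_twl) tuples."""
    events, start = [], None
    for i, twl in enumerate(twl_series + [float("-inf")]):  # sentinel closes runs
        if twl >= threshold and start is None:
            start = i                                        # event begins
        elif twl < threshold and start is not None:
            duration = (i - start) * dt_hours
            if duration >= min_duration_h:
                events.append((start * dt_hours, duration,
                               max(twl_series[start:i])))
            start = None                                     # event ends
    return events

# hourly total water level = tide + surge + wave runup (illustrative series)
twl = [0.5, 0.6, 1.4, 1.6, 1.7, 1.2, 0.4, 1.5, 0.6, 1.6, 1.8, 1.9, 1.5, 0.7]
print(storm_events(twl, dt_hours=1, threshold=1.3, min_duration_h=3))
# → [(2, 3, 1.7), (9, 4, 1.9)]
```

Each resulting (start, duration, peak) tuple is a candidate scenario to hand to the process-based morphologic model.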

  18. Modeling of electrohydrodynamic drying process using response surface methodology.

    Science.gov (United States)

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-05-01

Energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the required energy of the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box-Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interior relationships between parameters were well defined by RSM.
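The RSM step can be sketched in miniature: fit a full quadratic response surface by least squares on coded three-level design points. This is a two-factor toy with synthetic data, not the study's four-factor Box-Behnken design:

```python
def fit_quadratic_rsm(xs, ys):
    """Least-squares fit of a two-factor quadratic response surface
    y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2,
    solved via the normal equations with plain Gaussian elimination."""
    rows = [[1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2] for x1, x2 in xs]
    n = 6
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    g = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(n)]
    for col in range(n):                       # forward elimination w/ pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        g[col], g[piv] = g[piv], g[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            g[r] -= f * g[col]
    b = [0.0] * n
    for r in range(n - 1, -1, -1):             # back substitution
        b[r] = (g[r] - sum(A[r][c] * b[c] for c in range(r + 1, n))) / A[r][r]
    return b

# nine coded three-level points (-1/0/+1) and a known quadratic surface
pts = [(i, j) for i in (-1, 0, 1) for j in (-1, 0, 1)]
surface = lambda x1, x2: 2.0 + 1.0*x1 - 0.5*x2 + 0.3*x1*x1 + 0.2*x2*x2 - 0.4*x1*x2
obs = [surface(x1, x2) for x1, x2 in pts]
coef = fit_quadratic_rsm(pts, obs)
print([round(c, 3) for c in coef])  # recovers [2.0, 1.0, -0.5, 0.3, 0.2, -0.4]
```

With noisy measurements, the same fit yields the predictive model whose stationary point and coefficient significance RSM studies then interpret.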

  19. A ROADMAP FOR GENERATING SEMANTICALLY ENRICHED BUILDING MODELS ACCORDING TO CITYGML MODEL VIA TWO DIFFERENT METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    G. Floros

    2016-10-01

Full Text Available Methodologies for 3D modeling have multiplied due to the rapid advances of new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models, but also on their semantic features during the modeling procedure. As a result, the models thus generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow for the immediate conversion of the model’s format, via semi-automatic procedures, with respect to the user’s scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second includes the model’s generation in CityEngine and its transformation into the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two aforesaid methodologies are compared and specific characteristics are evaluated, in order to infer which methodology is best applied depending on the purposes of different projects.

  20. Ketamine-induced brain activation in awake female nonhuman primates: a translational functional imaging model.

    Science.gov (United States)

    Maltbie, Eric; Gopinath, Kaundinya; Urushino, Naoko; Kempf, Doty; Howell, Leonard

    2016-03-01

    There is significant interest in the NMDA receptor antagonist ketamine due to its efficacy in treating depressive disorders and its induction of psychotic-like symptoms that make it a useful tool for modeling psychosis. The present study extends the successful development of an apparatus and methodology to conduct pharmacological MRI studies in awake rhesus monkeys in order to evaluate the CNS effects of ketamine. Functional MRI scans were conducted in four awake adult female rhesus monkeys during sub-anesthetic intravenous (i.v.) infusions of ketamine (0.345 mg/kg bolus followed by 0.256 mg/kg/h constant infusion) with and without risperidone pretreatment (0.06 mg/kg). Statistical parametric maps of ketamine-induced blood oxygenation level-dependent (BOLD) activation were obtained with appropriate general linear regression models (GLMs) incorporating motion and hemodynamics of ketamine infusion. Ketamine infusion induced and sustained robust BOLD activation in a number of cortical and subcortical regions, including the thalamus, cingulate gyrus, and supplementary motor area. Pretreatment with the antipsychotic drug risperidone markedly blunted ketamine-induced activation in many brain areas. The results are remarkably similar to human imaging studies showing ketamine-induced BOLD activation in many of the same brain areas, and pretreatment with risperidone or another antipsychotic blunting the ketamine response to a similar extent. The strong concordance of the functional imaging data in humans with these results from nonhuman primates highlights the translational value of the model and provides an excellent avenue for future research examining the CNS effects of ketamine. This model may also be a useful tool for evaluating the efficacy of novel antipsychotic drugs.

  1. Effects of different per translational kinetics on the dynamics of a core circadian clock model.

    Directory of Open Access Journals (Sweden)

    Paula S Nieto

Full Text Available Living beings display self-sustained daily rhythms in multiple biological processes, which persist in the absence of external cues since they are generated by endogenous circadian clocks. The period (per) gene is a central player within the core molecular mechanism for keeping circadian time in most animals. Recently, modulation of PER translation has been reported, both in mammals and flies, suggesting that translational regulation of clock components is important for proper clock gene expression and molecular clock performance. Because translational regulation ultimately implies changes in the kinetics of translation and, therefore, in the circadian clock dynamics, we sought to study how and to what extent the molecular clock dynamics is affected by the kinetics of PER translation. With this objective, we used a minimal mathematical model of the molecular circadian clock to qualitatively characterize the dynamical changes derived from kinetically different PER translational mechanisms. We found that the emergence of self-sustained oscillations with characteristic period, amplitude, and phase lag (time delays between per mRNA and protein expression) depends on the kinetic parameters related to PER translation. Interestingly, under certain conditions, a PER translation mechanism with saturable kinetics introduces longer time delays than a mechanism ruled by first-order kinetics. In addition, the kinetic laws of PER translation significantly changed the sensitivity of our model to parameters related to the synthesis and degradation of per mRNA and PER degradation. Lastly, we found a set of parameters, with realistic values, for which our model reproduces some experimental results reported recently for Drosophila melanogaster, and we present some predictions derived from our analysis.
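The effect of the translation kinetics on the mRNA-protein phase lag can be illustrated with a deliberately reduced sketch: drive protein synthesis with a forced sinusoidal per-mRNA rhythm and compare a first-order translation law with a saturable (Michaelis-Menten) one. All parameters are invented, the paper's full feedback clock model is omitted, and which law gives the longer lag depends on the kinetic constants:

```python
import math

def simulate_protein(translate, d_p=0.3, period=24.0, dt=0.01, cycles=10):
    """Euler-integrate dP/dt = translate(m(t)) - d_p*P under a forced
    sinusoidal mRNA rhythm m(t); return the lag (hours) between the mRNA
    peak and the protein peak in the last cycle."""
    w = 2 * math.pi / period
    mrna = lambda t: 1.0 + 0.9 * math.sin(w * t)
    p, t = 0.0, 0.0
    best_t, best_p = 0.0, -1.0
    for _ in range(int(cycles * period / dt)):
        p += dt * (translate(mrna(t)) - d_p * p)
        t += dt
        if t > (cycles - 1) * period and p > best_p:   # search last cycle only
            best_t, best_p = t, p
    mrna_peak = (cycles - 1) * period + period / 4     # sin peaks at T/4
    return (best_t - mrna_peak) % period

first_order = lambda m: 0.5 * m              # mass-action translation
saturable = lambda m: 1.0 * m / (0.5 + m)    # Michaelis-Menten translation
lag_lin = simulate_protein(first_order)
lag_sat = simulate_protein(saturable)
print(round(lag_lin, 2), round(lag_sat, 2))  # both lags in hours
```

For the first-order law the lag is analytic, atan(w/d_p)/w, so the simulation can be checked against theory before exploring the saturable case.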

  2. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

Full Text Available In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines focus not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city, defining main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of the independence of Lithuania set off an uncontrolled urbanization process, so most of the city's development regulations emerged as a consequence of the unmanaged legalization of investors' expectations. The importance of a consistent urban fabric, and of conserving and representing the city's most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, alongside unmanaged urbanization in the city center and urban sprawl in suburbia caused by land-use projects. Current Vilnius spatial planning documents clearly define the urban structure and key development principles; however, the definitions are relatively abstract, resulting in uniform building coverage requirements for territories with distinct qualities and in simplified planar designs that do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts, principles, the criteria for quality assessment, and their applicability in Vilnius. The text outlines possible building coverage regulations and impact assessment criteria for new development, and contains a compendium of requirements for high-quality spatial planning and building design.

  3. Developing the Business Model – a Methodology for Virtual Enterprises

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a methodology to develop Virtual Enterprises (VEs). This Virtual Enterprise Methodology (VEM) outlines activities to consider when setting up and managing virtual enterprises. As a methodology the VEM helps companies to ask the right questions when preparing for, and setting...

  4. On the Translator Model and Translation Strategy for the Dissemination of Mo Yan's Works

    Institute of Scientific and Technical Information of China (English)

    邵霞

    2015-01-01

    The importance of the translator was demonstrated when Mo Yan won the Nobel Prize in Literature. The selection of translators and the rational choice of translation strategies are the two key factors in Chinese literature "going abroad". In fact, the sinologist translator model and a domesticating translation strategy should become the translation community's effective means of bringing Chinese literature abroad.

  5. Critical Appraisal of Translational Research Models for Suitability in Performance Assessment of Cancer Centers

    OpenAIRE

    Rajan, Abinaya; Sullivan, Richard; Bakker, Suzanne; van Harten, Wim H.

    2012-01-01

    This study aimed to critically appraise translational research models for suitability in performance assessment of cancer centers. Process models, such as the Process Marker Model and Lean and Six Sigma applications, seem to be suitable for performance assessment of cancer centers. However, they must be thoroughly tested in practice.

  6. Broken Time Translation Symmetry as a Model for Quantum State Reduction

    Directory of Open Access Journals (Sweden)

    Jasper van Wezel

    2010-04-01

    Full Text Available The symmetries that govern the laws of nature can be spontaneously broken, enabling the occurrence of ordered states. Crystals arise from the breaking of translation symmetry, magnets from broken spin rotation symmetry and massive particles break a phase rotation symmetry. Time translation symmetry can be spontaneously broken in exactly the same way. The order associated with this form of spontaneous symmetry breaking is characterised by the emergence of quantum state reduction: systems which spontaneously break time translation symmetry act as ideal measurement machines. In this review the breaking of time translation symmetry is first compared to that of other symmetries such as spatial translations and rotations. It is then discussed how broken time translation symmetry gives rise to the process of quantum state reduction and how it generates a pointer basis, Born’s rule, etc. After a comparison between this model and alternative approaches to the problem of quantum state reduction, the experimental implications and possible tests of broken time translation symmetry in realistic experimental settings are discussed.

  7. Maximizing protein translation rate in the non-homogeneous ribosome flow model: a convex optimization approach.

    Science.gov (United States)

    Poker, Gilad; Zarai, Yoram; Margaliot, Michael; Tuller, Tamir

    2014-11-06

    Translation is an important stage in gene expression. During this stage, macro-molecules called ribosomes travel along the mRNA strand linking amino acids together in a specific order to create a functioning protein. An important question, related to many biomedical disciplines, is how to maximize protein production. Indeed, translation is known to be one of the most energy-consuming processes in the cell, and it is natural to assume that evolution shaped this process so that it maximizes the protein production rate. If this is indeed so, then one can estimate various parameters of the translation machinery by solving an appropriate mathematical optimization problem. The same problem also arises in the context of synthetic biology, namely, re-engineering heterologous genes in order to maximize their translation rate in a host organism. We consider the problem of maximizing the protein production rate using a computational model for translation-elongation called the ribosome flow model (RFM). This model describes the flow of the ribosomes along an mRNA chain of length n using a set of n first-order nonlinear ordinary differential equations. It also includes n + 1 positive parameters: the ribosomal initiation rate into the mRNA chain, and n elongation rates along the chain sites. We show that the steady-state translation rate in the RFM is a strictly concave function of its parameters. This means that the problem of maximizing the translation rate under a suitable constraint always admits a unique solution, and that this solution can be determined using highly efficient algorithms for solving convex optimization problems even for large values of n. Furthermore, our analysis shows that the optimal translation rate can be computed based only on the optimal initiation rate and the elongation rate of the codons near the beginning of the ORF. We discuss some applications of the theoretical results to synthetic biology, molecular evolution, and functional genomics.
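
The RFM equations summarized in this abstract follow a standard form in the literature. A minimal numerical sketch, using made-up rates and a plain forward-Euler integration rather than the paper's convex-optimization machinery, might look like this:

```python
# Illustrative sketch of the ribosome flow model (RFM), not the paper's code.
# Sites x_1..x_n hold ribosome occupancies in [0, 1]; lam[0] is the
# initiation rate and lam[1..n] are the elongation/exit rates, so that
#   dx_i/dt = lam[i-1] * x_{i-1} * (1 - x_i) - lam[i] * x_i * (1 - x_{i+1}),
# with the conventions x_0 = 1 and x_{n+1} = 0.
def rfm_steady_state(lam, dt=0.01, steps=100_000):
    n = len(lam) - 1
    x = [0.0] * n
    for _ in range(steps):
        new = list(x)
        for i in range(n):
            left = 1.0 if i == 0 else x[i - 1]
            right = 0.0 if i == n - 1 else x[i + 1]
            inflow = lam[i] * left * (1.0 - x[i])
            outflow = lam[i + 1] * x[i] * (1.0 - right)
            new[i] = x[i] + dt * (inflow - outflow)
        x = new
    return x, lam[-1] * x[-1]   # occupancies and steady-state translation rate

occupancies, rate = rfm_steady_state([1.0, 0.8, 1.2, 0.9])  # hypothetical rates
```

At steady state the flux through every site equals the production rate, which gives a quick sanity check on the integration.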

  8. Harmonization and translation of crop modeling data to ensure interoperability

    NARCIS (Netherlands)

    Porter, C.; Villalobos, C.; Holzworth, D.; Nelson, R.; White, J.W.; Athanasiadis, I.N.; Janssen, S.J.C.; Ripoche, D.; Cufi, J.; Raes, D.; Zhang, M.; Knapen, M.J.R.; Sahajpal, R.; Boote, K.; Jones, J.W.

    2014-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) seeks to improve the capability of ecophysiological and economic models to describe the potential impacts of climate change on agricultural systems. AgMIP protocols emphasize the use of multiple models; consequently, data harmoni

  10. A translational research framework for enhanced validity of mouse models of psychopathological states in depression.

    Science.gov (United States)

    Pryce, Christopher R; Seifritz, Erich

    2011-04-01

    Depression presents as a disorder of feelings and thoughts that debilitate daily functioning and can be life threatening. Increased understanding of these specific emotional-cognitive pathological states and their underlying pathophysiologies and neuropathologies is fundamental to an increased understanding of the disorder and, therefore, to development of much-needed improved therapies. Despite this, there is a current lack of emphasis on development and application of translational (i.e. valid) neuropsychological measures in depression research. The appropriate strategy is neuropsychological research translated, bi-directionally, between epidemiological and clinical human research and in vivo - ex vivo preclinical research conducted, primarily, with mice. This paper presents a translational framework to stimulate and inform such research, in four inter-dependent sections. (1) A depression systems-model describes the pathway between human environment-gene (E-G) epidemiology, pathophysiology, psycho- and neuropathology, symptoms, and diagnosis. This model indicates that G→emotional-cognitive endophenotypes and E-G/endophenotype→emotional-cognitive state markers are central to experimental and translational depression research. (2) Human neuropsychological tests with (potential) translational value for the quantitative study of these endophenotypes and state markers are presented. (3) The analogous rodent behavioural tests are presented and their translational validity in terms of providing analogue emotional-cognitive endophenotypes and state markers are discussed. (4) The need for aetiological validity of mouse models in terms of G→endophenotypes and E-G→state markers is presented. We conclude that the informed application of the proposed neuropsychological translational framework will yield mouse models of high face, construct and aetiological validity with respect to emotional-cognitive dysfunction in depression. These models, together with the available

  11. An isolated perfused pig heart model for the development, validation and translation of novel cardiovascular magnetic resonance techniques

    Directory of Open Access Journals (Sweden)

    Perera Divaka

    2010-09-01

    Full Text Available Abstract Background Novel cardiovascular magnetic resonance (CMR) techniques and imaging biomarkers are often validated in small animal models or empirically in patients. Direct translation of small animal CMR protocols to humans is rarely possible, while validation in humans is often difficult, slow and occasionally not possible due to ethical considerations. The aim of this study is to overcome these limitations by introducing an MR-compatible, free beating, blood-perfused, isolated pig heart model for the development of novel CMR methodology. Methods 6 hearts were perfused outside of the MR environment to establish preparation stability. Coronary perfusion pressure (CPP), coronary blood flow (CBF), left ventricular pressure (LVP), arterial blood gas and electrolyte composition were monitored over 4 hours. Further hearts were perfused within 3T (n = 3) and 1.5T (n = 3) clinical MR scanners, and characterised using functional (CINE), perfusion and late gadolinium enhancement (LGE) imaging. Perfusion imaging was performed globally and selectively for the right (RCA) and left (LCA) coronary arteries. In one heart the RCA perfusion territory was determined and compared to infarct size after coronary occlusion. Results All physiological parameters measured remained stable and within normal ranges. The model proved amenable to CMR at both field strengths using typical clinical acquisitions. There was good agreement between the RCA perfusion territory measured by selective first pass perfusion and LGE after coronary occlusion (37% versus 36% of the LV, respectively). Conclusions This flexible model allows imaging of cardiac function in a controllable, beating, human-sized heart using clinical MR systems. It should aid further development, validation and clinical translation of novel CMR methodologies and imaging sequences.

  12. FUZZY MODEL OPTIMIZATION FOR TIME SERIES DATA USING A TRANSLATION IN THE EXTENT OF MEAN ERROR

    Directory of Open Access Journals (Sweden)

    Nurhayadi

    2014-01-01

    Full Text Available Recently, many researchers have written about forecasting stock prices, electricity load demand and academic enrollment using fuzzy methods. In general, however, such modelling does not yet consider the position of the model relative to the actual data, which means that error is not handled optimally. Error that is not managed well reduces the accuracy of the forecast. This paper therefore discusses reducing error by translating the model. The error to be reduced is the Mean Square Error (MSE). The analysis is done mathematically, and the empirical study applies translation to a fuzzy model for enrollment forecasting at the University of Alabama. The results of this analysis show that translation in the extent of the mean error can reduce the MSE.
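
As a toy illustration of the idea (not the paper's fuzzy model), translating a forecast by its mean error removes the bias term from the decomposition MSE = var(e) + mean(e)^2, so the shifted forecast can never do worse:

```python
# Toy sketch: "translating" a biased forecast by its mean error lowers the MSE.
# The data and forecasts below are made up for illustration.
def mse(actual, predicted):
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def translate_by_mean_error(actual, predicted):
    # shift every prediction by the average residual
    mean_err = sum(a - p for a, p in zip(actual, predicted)) / len(actual)
    return [p + mean_err for p in predicted]

actual = [10.0, 12.0, 15.0, 13.0]
biased = [8.0, 11.0, 13.0, 12.0]          # systematically under-predicts
shifted = translate_by_mean_error(actual, biased)
```

Here the raw forecast has MSE 2.5; after the shift only the variance of the errors remains, giving MSE 0.25.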

  13. Development of a General Modelling Methodology for Vacuum Residue Hydroconversion

    Directory of Open Access Journals (Sweden)

    Pereira de Oliveira L.

    2013-11-01

    Full Text Available This work concerns the development of a methodology for kinetic modelling of refining processes, and more specifically for vacuum residue conversion. The proposed approach makes it possible to overcome the lack of molecular detail of the petroleum fractions and to simulate the transformation of the feedstock molecules into effluent molecules by means of a two-step procedure. In the first step, a synthetic mixture of molecules representing the feedstock for the process is generated via a molecular reconstruction method, termed SR-REM molecular reconstruction. In the second step, a kinetic Monte-Carlo method (kMC) is used to simulate the conversion reactions on this mixture of molecules. The molecular reconstruction was applied to several petroleum residues and is illustrated for an Athabasca (Canada) vacuum residue. The kinetic Monte-Carlo method is then described in detail. In order to validate this stochastic approach, a lumped deterministic model for vacuum residue conversion was simulated using Gillespie’s Stochastic Simulation Algorithm. Despite the fact that both approaches are based on very different hypotheses, the stochastic simulation algorithm simulates the conversion reactions with the same accuracy as the deterministic approach. The full-scale stochastic simulation approach using molecular-level reaction pathways provides high amounts of detail on the effluent composition and is briefly illustrated for Athabasca VR hydrocracking.

  14. An ABET assessment model using Six Sigma methodology

    Science.gov (United States)

    Lalovic, Mira

    Technical fields are changing so rapidly that even the core of an engineering education must be constantly reevaluated. Graduates of today give more dedication and, almost certainly, more importance to continued learning than to mastery of specific technical concepts. Continued learning shapes a high-quality education, which is what an engineering college must offer its students. The question is how to guarantee the quality of education. In addition, the Accreditation Board for Engineering and Technology is asking that universities commit to continuous and comprehensive education, assuring the quality of the educational process. The research is focused on developing a generic assessment model for a college of engineering as an annual cycle that consists of a systematic assessment of every course in the program, followed by an assessment of the program and of the college as a whole using Six Sigma methodology. This unique approach to assessment in education will provide a college of engineering with valuable information regarding many important curriculum decisions in every accreditation cycle. The Industrial and Manufacturing Engineering (IME) Program in the College of Engineering at the University of Cincinnati will be used as a case example for a preliminary test of the generic model.

  15. SR-Site groundwater flow modelling methodology, setup and results

    Energy Technology Data Exchange (ETDEWEB)

    Selroos, Jan-Olof (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Follin, Sven (SF GeoLogic AB, Taeby (Sweden))

    2010-12-15

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed; the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in a SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  16. Mathematical modeling of translation initiation for the estimation of its efficiency to computationally design mRNA sequences with desired expression levels in prokaryotes

    Directory of Open Access Journals (Sweden)

    Lee Sunjae

    2010-05-01

    Full Text Available Abstract Background Within the emerging field of synthetic biology, engineering paradigms have recently been used to design biological systems with novel functionalities. One of the essential challenges hampering the construction of such systems is the need to precisely optimize protein expression levels for robust operation. However, it is difficult to design mRNA sequences for expression at targeted protein levels, since even a few nucleotide modifications around the start codon may alter translational efficiency and dramatically (up to 250-fold) change protein expression. Previous studies have used ad hoc approaches (e.g., random mutagenesis) to obtain the desired translational efficiencies for mRNA sequences. Hence, the development of a mathematical methodology capable of estimating translational efficiency would greatly facilitate the future design of mRNA sequences aimed at yielding desired protein expression levels. Results We herein propose a mathematical model that focuses on translation initiation, which is the rate-limiting step in translation. The model uses mRNA-folding dynamics and ribosome-binding dynamics to estimate translational efficiencies solely from mRNA sequence information. We confirmed the feasibility of our model using previously reported expression data on the MS2 coat protein. For further confirmation, we used our model to design 22 luxR mRNA sequences predicted to have diverse translation efficiencies ranging from 10^-5 to 1. The expression levels of these sequences were measured in Escherichia coli and found to be highly correlated (R^2 = 0.87) with their estimated translational efficiencies. Moreover, we used our computational method to successfully transform a low-expressing DsRed2 mRNA sequence into a high-expressing mRNA sequence by maximizing its translational efficiency through the modification of only eight nucleotides upstream of the start codon. Conclusions We herein describe a mathematical model that uses m

  17. Maximizing Protein Translation Rate in the Ribosome Flow Model: The Homogeneous Case.

    Science.gov (United States)

    Zarai, Yoram; Margaliot, Michael; Tuller, Tamir

    2014-01-01

    Gene translation is the process in which intracellular macro-molecules, called ribosomes, decode genetic information in the mRNA chain into the corresponding proteins. Gene translation includes several steps. During the elongation step, ribosomes move along the mRNA in a sequential manner and link amino-acids together in the corresponding order to produce the proteins. The homogeneous ribosome flow model (HRFM) is a deterministic computational model for translation-elongation under the assumption of constant elongation rates along the mRNA chain. The HRFM is described by a set of n first-order nonlinear ordinary differential equations, where n represents the number of sites along the mRNA chain. The HRFM also includes two positive parameters: ribosomal initiation rate and the (constant) elongation rate. In this paper, we show that the steady-state translation rate in the HRFM is a concave function of its parameters. This means that the problem of determining the parameter values that maximize the translation rate is relatively simple. Our results may contribute to a better understanding of the mechanisms and evolution of translation-elongation. We demonstrate this by using the theoretical results to estimate the initiation rate in M. musculus embryonic stem cell. The underlying assumption is that evolution optimized the translation mechanism. For the infinite-dimensional HRFM, we derive a closed-form solution to the problem of determining the initiation and transition rates that maximize the protein translation rate. We show that these expressions provide good approximations for the optimal values in the n-dimensional HRFM already for relatively small values of n. These results may have applications for synthetic biology where an important problem is to re-engineer genomic systems in order to maximize the protein production rate.

  18. Translational Perspectives for Computational Neuroimaging.

    Science.gov (United States)

    Stephan, Klaas E; Iglesias, Sandra; Heinzle, Jakob; Diaconescu, Andreea O

    2015-08-19

    Functional neuroimaging has made fundamental contributions to our understanding of brain function. It remains challenging, however, to translate these advances into diagnostic tools for psychiatry. Promising new avenues for translation are provided by computational modeling of neuroimaging data. This article reviews contemporary frameworks for computational neuroimaging, with a focus on forward models linking unobservable brain states to measurements. These approaches (biophysical network models, generative models, and model-based fMRI analyses of neuromodulation) strive to move beyond statistical characterizations and toward mechanistic explanations of neuroimaging data. Focusing on schizophrenia as a paradigmatic spectrum disease, we review applications of these models to psychiatric questions, identify methodological challenges, and highlight trends of convergence among computational neuroimaging approaches. We conclude by outlining a translational neuromodeling strategy, highlighting the importance of openly available datasets from prospective patient studies for evaluating the clinical utility of computational models. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Pharmacologic modulation of RORγt translates to efficacy in preclinical and translational models of psoriasis and inflammatory arthritis

    Science.gov (United States)

    Xue, Xiaohua; Soroosh, Pejman; De Leon-Tabaldo, Aimee; Luna-Roman, Rosa; Sablad, Marciano; Rozenkrants, Natasha; Yu, Jingxue; Castro, Glenda; Banie, Homayon; Fung-Leung, Wai-Ping; Santamaria-Babi, Luis; Schlueter, Thomas; Albers, Michael; Leonard, Kristi; Budelsky, Alison L.; Fourie, Anne M.

    2016-01-01

    The IL-23/IL-17 pathway is implicated in autoimmune diseases, particularly psoriasis, where biologics targeting IL-23 and IL-17 have shown significant clinical efficacy. Retinoid-related orphan nuclear receptor gamma t (RORγt) is required for Th17 differentiation and IL-17 production in adaptive and innate immune cells. We identified JNJ-54271074, a potent and highly-selective RORγt inverse agonist, which dose-dependently inhibited RORγt-driven transcription, decreased co-activator binding and promoted interaction with co-repressor protein. This compound selectively blocked Th17 differentiation, significantly reduced IL-17A production from memory T cells, and decreased IL-17A- and IL-22-producing human and murine γδ and NKT cells. In a murine collagen-induced arthritis model, JNJ-54271074 dose-dependently suppressed joint inflammation. Furthermore, JNJ-54271074 suppressed IL-17A production in human PBMC from rheumatoid arthritis patients. RORγt-deficient mice showed decreased IL-23-induced psoriasis-like skin inflammation and cytokine gene expression, consistent with dose-dependent inhibition in wild-type mice through oral dosing of JNJ-54271074. In a translational model of human psoriatic epidermal cells and skin-homing T cells, JNJ-54271074 selectively inhibited streptococcus extract-induced IL-17A and IL-17F. JNJ-54271074 is thus a potent, selective RORγt modulator with therapeutic potential in IL-23/IL-17 mediated autoimmune diseases. PMID:27905482

  20. A note on the translation of conceptual data models into description logics: disjointness and covering assumptions

    CSIR Research Space (South Africa)

    Casini, G

    2012-10-01

    Full Text Available possibilities for conceptual data modeling. It also raises the question of how existing conceptual models using ER, UML or ORM could be translated into Description Logics (DLs), a family of logics that have proved to be particularly appropriate for formalizing...

  1. Metaphors and Models in Translation between College and Workplace Mathematics

    Science.gov (United States)

    Williams, Julian; Wake, Geoff

    2007-01-01

    We report a study of repairs in communication between workers and visiting outsiders (students, researchers or teachers). We show how cultural models such as metaphors and mathematical models facilitated explanations and repair work in inquiry and pedagogical dialogues. We extend previous theorisations of metaphor by Black; Lakoff and Johnson;…

  3. The Standard Profile of the 21st Century Translator: Implications for Translator Training

    Directory of Open Access Journals (Sweden)

    Sakwe George Mbotake

    2015-09-01

    Full Text Available This study examines the profile of the translator in Cameroon and posits that the translating activity is increasingly becoming part of the translation service, reflecting the market expectation to train translation service providers rather than translators. The paper demonstrates that the translation profession, as it is performed in the field and portrayed in job adverts, reveals that a wide range of employers are looking for translators and their services. A survey of the Cameroonian translation market was carried out to raise awareness of the language skills translators need in order to work successfully as language service providers. The data for this study were obtained from 36 professional translators drawn from the public service, the freelance and the in-house corporate translation markets in Cameroon. The study argues that today’s new translator profile and activities are basically variants of interlingual communication in which the traditional concept of translation constitutes only one option, and that these ‘add-ons’, which contribute to a better professionalization of the translator, pose new challenges to translation pedagogy in terms of both content and methodology. In this vein the study proposes a translational language teaching model aimed at making training more responsive to market exigencies in this era of modernization.

  4. Predictive in vivo animal models and translation to clinical trials.

    Science.gov (United States)

    Cook, Natalie; Jodrell, Duncan I; Tuveson, David A

    2012-03-01

    Vast resources are expended during the development of new cancer therapeutics, and selection of optimal in vivo models should improve this process. Genetically engineered mouse models (GEMM) of cancer have progressively improved in technical sophistication and, accurately recapitulating the human cognate condition, have had a measureable impact on our knowledge of tumourigenesis. However, the application of GEMMs to facilitate the development of innovative therapeutic and diagnostic approaches has lagged behind. GEMMs that recapitulate human cancer offer an additional opportunity to accelerate drug development, and should complement the role of the widely used engraftment tumour models.

  5. Modeling neurodevelopmental cognitive deficits in tasks with cross-species translational validity.

    Science.gov (United States)

    Cope, Z A; Powell, S B; Young, J W

    2016-01-01

    Numerous psychiatric disorders whose cognitive dysfunction is linked to functional outcome have neurodevelopmental origins, including schizophrenia, autism and bipolar disorder. Treatments are needed for these cognitive deficits, which require development using animal models. Models of neurodevelopmental disorders are as varied and diverse as the disorders themselves, recreating some but not all aspects of the disorder. This variety may in part underlie why purported procognitive treatments translated from these models have failed to restore functioning in the targeted patient populations. Further complications arise from environmental factors used in these models that can contribute to numerous disorders, perhaps only impacting specific domains, while diagnostic boundaries define individual disorders, limiting translational efficacy. The Research Domain Criteria project seeks to 'develop new ways to classify mental disorders based on behavioral dimensions and neurobiological measures' in hopes of facilitating translational research by remaining agnostic toward diagnostic borders derived from clinical presentation in humans. Models could therefore recreate biosignatures of cognitive dysfunction irrespective of disease state. This review highlights work within the field of neurodevelopmental models of psychiatric disorders tested in cross-species translational cognitive paradigms that directly inform this newly developing research strategy. By expounding on this approach, the hope is that a fuller understanding of each model may be attained in terms of the cognitive profile elicited by each manipulation. Hence, conclusions may begin to be drawn on the nature of cognitive neuropathology in neurodevelopmental and other disorders, increasing the chances of procognitive treatment development for individuals affected in specific cognitive domains.

  6. Translator Awareness

    Directory of Open Access Journals (Sweden)

    Wolfram Wilss

    2008-04-01

    Full Text Available If we want to encompass adequately the wide-ranging field of human translation, it is necessary to include in translation studies (TS) the concept of translator awareness (or translator consciousness, for that matter). However, this is more easily said than done, because this concept does not easily lend itself to definition, let alone to measurement, e.g., by investigating translator behaviour. To put it bluntly: translator awareness is a fuzzy concept. Like many obviously difficult-to-define concepts, with which dialogue in TS is burdened, translator awareness lacks an articulated theory within which different forms of translator behaviour can be convincingly related to, or distinguished from, one another. Hence, TS has so far not tackled, at least not systematically, the issue of translator awareness.

  7. QEFSM model and Markov Algorithm for translating Quran reciting rules into Braille code

    Directory of Open Access Journals (Sweden)

    Abdallah M. Abualkishik

    2015-07-01

    Full Text Available The Holy Quran is the central religious verbal text of Islam. Muslims are expected to read, understand, and apply the teachings of the Holy Quran. The Holy Quran was translated to Braille code as normal Arabic text without having its reciting rules included. It is obvious that the users of this transliteration will not be able to recite the Quran the right way. Through this work, the Quran Braille Translator (QBT) presents a specific translator to translate Quran verses and their reciting rules into the Braille code. The Quran Extended Finite State Machine (QEFSM) model is proposed through this study as it is able to detect the Quran reciting rules (QRR) from the Quran text. Basis path testing was used to evaluate the inner working of the model by checking all of its test cases. The Markov Algorithm (MA) was used for translating the detected QRR and Quran text into the matched Braille code. The data entries for QBT are Arabic letters and diacritics. The outputs of this study are seen in the double lines of Braille symbols; the first line is for the proposed Quran reciting rules and the second line is for the Quran scripts.
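A Markov algorithm of the kind cited here is an ordered list of rewrite rules applied to the leftmost match, restarting from the first rule after every substitution, until no rule fires. A minimal interpreter sketch follows; the rule table is purely hypothetical and does not reproduce the real QBT rules or Braille assignments.

```python
# Minimal Markov-algorithm interpreter: ordered rewrite rules applied
# to the leftmost match, restarting from the first rule after each
# substitution. The rule table below is hypothetical, not the QBT rules.
RULES = [
    ("GH", "⠛⠓"),  # hypothetical reciting-rule token -> two Braille cells
    ("A", "⠁"),     # hypothetical letter mapping
    ("B", "⠃"),
]

def markov_rewrite(text, rules=RULES, max_steps=10_000):
    """Apply the first applicable rule to its leftmost occurrence until none fires."""
    for _ in range(max_steps):
        for lhs, rhs in rules:
            i = text.find(lhs)
            if i != -1:
                text = text[:i] + rhs + text[i + len(lhs):]
                break
        else:
            return text  # no rule applies: the algorithm halts
    raise RuntimeError("rule set did not terminate")

print(markov_rewrite("GHAB"))
```

Because every right-hand side here contains only Braille cells that no left-hand side matches, the rule set is guaranteed to terminate.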

  8. Translation-invariant and periodic Gibbs measures for the Potts model on a Cayley tree

    Science.gov (United States)

    Khakimov, R. M.; Khaydarov, F. Kh.

    2016-11-01

    We study translation-invariant Gibbs measures on a Cayley tree of order k = 3 for the ferromagnetic three-state Potts model. We obtain explicit formulas for translation-invariant Gibbs measures. We also consider periodic Gibbs measures on a Cayley tree of order k for the antiferromagnetic q-state Potts model. Moreover, we improve previously obtained results: we find the exact number of periodic Gibbs measures with period two on a Cayley tree of order k ≥ 3 that are defined on some invariant sets.
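For background, the q-state Potts model on a Cayley tree is defined by the Hamiltonian below; the fixed-point equation shown is one standard boundary-law reduction used in this literature for translation-invariant splitting Gibbs measures, sketched from the general theory rather than reproduced from this paper.

```latex
% q-state Potts Hamiltonian on the Cayley tree (L = edge set)
H(\sigma) = -J \sum_{\langle x,y\rangle \in L} \delta_{\sigma(x)\sigma(y)},
\qquad \sigma(x) \in \{1,\dots,q\}
% Translation-invariant splitting Gibbs measures correspond to positive
% roots of a boundary-law equation of the form
% (theta = e^{J\beta}, k = order of the tree):
z = \left( \frac{\theta z + q - 1}{z + \theta + q - 2} \right)^{k}
```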

  9. Translation of overlay models of student knowledge for relative domains based on domain ontology mapping

    DEFF Research Database (Denmark)

    Sosnovsky, Sergey; Dolog, Peter; Henze, Nicola;

    2007-01-01

    argue that the implementation of underlying knowledge models in a sharable format, as domain ontologies - along with application of automatic ontology mapping techniques for model alignment - can help to overcome the "new-user" problem and will greatly widen opportunities for student model translation....... Moreover, it then becomes possible for systems from relevant domains to rely on knowledge transfer and reuse those portions of the student models that are related to overlapping concepts....
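The overlay-model translation idea sketched in this abstract can be illustrated in a few lines; the concept names, mastery scores, and mapping below are all invented for illustration (in the paper's approach the mapping would come from automatic ontology mapping).

```python
# Toy overlay student-model translation through an ontology mapping.
# Concept names, scores, and the mapping itself are invented.
system_a_model = {"loops": 0.9, "arrays": 0.6, "recursion": 0.3}

# Mapping between the two systems' domain ontologies; in practice this
# would be produced by automatic ontology-mapping techniques.
mapping = {"loops": "iteration", "arrays": "lists"}  # 'recursion' unmapped

# Only overlapping concepts transfer; unmapped concepts fall back to
# the target system's "new user" defaults.
system_b_model = {mapping[c]: p for c, p in system_a_model.items() if c in mapping}
print(system_b_model)
```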

  10. Methodologies for modelling energy and amino acid responses in poultry

    Directory of Open Access Journals (Sweden)

    Robert Mervyn Gous

    2007-07-01

    Full Text Available The objective of this paper is to present some of the issues faced by those whose interest is to predict responses in poultry, concentrating mainly on those related to the prediction of voluntary food intake, as this should be the basis of models designed to optimise both performance and feeding programmes. The value of models designed to predict growth or reproductive performance has been improved inestimably by making food intake an output from, as opposed to an input to, such models. Predicting voluntary food intake requires the potential of the bird to be known, be this the growth of body protein or lipid, the growth of feather protein, or the rate at which yolk and albumen may be deposited daily in the form of an egg, and some of the issues relating to the description of potentials are discussed. This potential defines the nutrients that would be required by the bird on the day, which can be converted to a desired food intake by dividing each requirement by the content of that nutrient in the feed. There will be occasions when the bird will be unable to consume what is required, and predicting the magnitude of these constraints on intake and performance provides the greatest challenge for modellers. This paper concentrates on some issues raised in defining the nutrient requirements of an individual, on constraints such as high temperatures and the social and infectious environment on voluntary food intake, on some recent differences in the response to dietary protein that have been observed between the major broiler strains, and on the methodologies used to deal with populations of birds, and finally with broiler breeder hens, whose food intake is constrained by management, not by the environment. These issues suggest that there are still challenges that lie ahead for those wishing to predict responses to nutrients in poultry. It is imperative, however, that the methods used to measure the numbers that make theories work, and that the
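The intake rule described above (desired intake for each nutrient equals the requirement divided by that nutrient's content in the feed, with consumption driven by the most limiting nutrient) can be illustrated with invented numbers:

```python
# Illustrative numbers only: requirements (per day) and feed contents
# (per kg feed) are hypothetical, not taken from the paper.
requirements = {"lysine": 1.2, "methionine": 0.5, "energy_MJ": 1.45}
feed_content = {"lysine": 11.0, "methionine": 4.5, "energy_MJ": 12.5}

# Desired intake for each nutrient: requirement / content (kg feed/day)
desired = {n: requirements[n] / feed_content[n] for n in requirements}

# The bird must eat enough to satisfy the most demanding nutrient,
# i.e. the largest of the per-nutrient desired intakes.
desired_feed_intake = max(desired.values())
limiting = max(desired, key=desired.get)
print(f"{limiting}: {desired_feed_intake:.3f} kg/day")
```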

  11. Methodological improvements of geoid modelling for the Austrian geoid computation

    Science.gov (United States)

    Kühtreiber, Norbert; Pail, Roland; Wiesenhofer, Bernadette; Pock, Christian; Wirnsberger, Harald; Hofmann-Wellenhof, Bernhard; Ullrich, Christian; Höggerl, Norbert; Ruess, Diethard; Imrek, Erich

    2010-05-01

    The geoid computation method of Least Squares Collocation (LSC) is usually applied in connection with the remove-restore technique. The basic idea is to remove, before applying LSC, not only the long-wavelength gravity field effect represented by the global gravity field model, but also the high-frequency signals, which are mainly related to topography, by applying a topographic-isostatic reduction. In the current Austrian geoid solution, an Airy-Heiskanen model with a standard density of 2670 kg/m3 was used. A close investigation of the absolute error structure of this solution reveals some correlations with topography, which may be explained by these simplified assumptions. One parameter of the remove-restore process investigated in this work is the depth of the reference surface of isostatic compensation, the Mohorovicic discontinuity (Moho). The recently compiled European plate Moho depth model, which is based on 3D-seismic tomography and other geophysical measurements, is used instead of the reference surface derived from the Airy-Heiskanen isostatic model. Additionally, the standard density of 2670 kg/m3 is replaced by a laterally variable (surface) density model. The impact of these two significant modifications of the geophysical conception of the remove-restore procedure on the Austrian geoid solution is investigated and analyzed in detail. In the current Austrian geoid solution, the remove-restore concept described above was used in a first step to derive a pure gravimetric geoid and to predict the geoid heights for 161 GPS/levelling points. The difference between measured and predicted geoid heights shows a long-wavelength structure. These systematic distortions are commonly attributed to inconsistencies in the datum, distortions of the orthometric height system, and systematic GPS errors. In order to cope with this systematic term, a polynomial of degree 3 was fitted to the difference of predicted geoid heights and GPS
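The remove-compute-restore bookkeeping described above can be sketched schematically; the numbers are toys, and the LSC prediction on the residual field is replaced by a trivial stand-in to show only the flow of the computation.

```python
# Schematic remove-compute-restore bookkeeping with toy numbers.
dg_observed = 30.0   # observed gravity anomaly (mGal), hypothetical
dg_global = 22.0     # long-wavelength part from the global gravity field model
dg_topo = 6.5        # topographic-isostatic reduction

# Remove: subtract both effects, leaving a smooth residual field
dg_residual = dg_observed - dg_global - dg_topo

def predict_residual_geoid(dg):
    """Stand-in for Least Squares Collocation on the residual field."""
    return 0.01 * dg  # hypothetical metres-per-mGal response

# Restore: add back the geoid contributions of the removed parts (m)
N_global, N_topo = 44.30, 0.12  # hypothetical contributions
N = N_global + N_topo + predict_residual_geoid(dg_residual)
print(N)
```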

  12. Assessing the Quality of Persian Translation of Kite Runner based on House’s (2014) Functional Pragmatic Model

    Directory of Open Access Journals (Sweden)

    Fateme Kargarzadeh

    2017-03-01

    Full Text Available Translation quality assessment is at the heart of any theory of translation. It is used in academic or teaching contexts to judge translations, to discuss their merits and demerits and to suggest solutions. However, literary translation needs more consideration in terms of quality and clarity, as it is a widely read form of translation. In this respect, the Persian literary translation of Kite Runner was taken for investigation based on House’s (2014) functional pragmatic model of translation quality assessment. To this end, around 100 pages from the beginning of both the English and Persian versions of the novel were selected and compared. Using House’s model, the profile of the source text register was created and the genre was recognized. The source text profile was then compared to the translation text profile. The results were minor mismatches in field, tenor, and mode, which were counted as overtly erroneous expressions, and leading matches, which were counted as covert translation. The mismatches included mistranslations of tenses and the selection of inappropriate meanings for lexical items. Since the informal and culture-specific terms were transferred thoroughly, the culture filter was not applied; moreover, the translation was a covert one. The findings of the study have implications for translators, researchers and translator trainers.

  13. Evaluating Supply Chain Management: A Methodology Based on a Theoretical Model

    OpenAIRE

    Alexandre Tadeu Simon; Luiz Carlos Di Serio; Silvio Roberto Ignacio Pires; Guilherme Silveira Martins

    2015-01-01

    Despite the increasing interest in supply chain management (SCM) by researchers and practitioners, there is still a lack of academic literature concerning topics such as methodologies to guide and support SCM evaluation. Most developed methodologies have been provided by consulting companies and are restricted in their publication and use. This article presents a methodology for evaluating companies’ degree of adherence to a SCM conceptual model. The methodology is based on Cooper, Lambert an...

  14. High level models and methodologies for information systems

    CERN Document Server

    Isaias, Pedro

    2014-01-01

    This book introduces methods and methodologies in Information Systems (IS) by presenting, describing, explaining, and illustrating their uses in various contexts, including website development, usability evaluation, quality evaluation, and success assessment.

  15. Modeling and Architectural Design in Agile Development Methodologies

    NARCIS (Netherlands)

    Stojanovic, Z.; Dahanayake, A.; Sol, H

    2003-01-01

    Agile Development Methodologies have been designed to address the problem of delivering high-quality software on time under constantly and rapidly changing requirements in business and IT environments. Agile development processes are characterized by extensive coding practice, intensive communication

  16. A Statistical Word-Level Translation Model for Comparable Corpora

    Science.gov (United States)

    2000-06-01

    readily available resources such as corpora, thesauri, bilingual and multilingual lexicons and dictionaries. The acquisition of such resources has...could aid in Monolingual Information Retrieval (MIR) by methods of query expansion, and thesauri construction. To date, most of the existing...testing the limits of its performance. Future directions include testing the model with a monolingual comparable corpus, e.g. WSJ [42M] and either IACA/B

  17. Einstein's cosmic model of 1931: a translation and analysis of a forgotten model of the universe

    CERN Document Server

    Raifeartaigh, C O

    2013-01-01

    We present a translation and analysis of a cosmic model published by Einstein in 1931. The paper, which is not widely known, features a model of a universe that undergoes an expansion followed by a contraction, quite different to his static model of 1917 or the monotonic Einstein-de Sitter model of 1932. The paper offers many insights into the cosmology of Albert Einstein in the light of the first evidence for an expanding universe, and we discuss his views of issues such as the curvature of space, the cosmological constant, the singularity and the timespan of the expansion. We argue that retrospective descriptions of this model as cyclic or periodic are not historically or mathematically accurate. We find that calculations in the paper of the matter density and radius of the universe contain a numerical error, a finding that is supported by writing on a blackboard used by Einstein during a lecture at Oxford University in May 1931. Our article concludes with a general discussion of his philosophy of cosmology...

  18. Reference Management Methodologies for Large Structural Models at Kennedy Space Center

    Science.gov (United States)

    Jones, Corey; Bingham, Ryan; Schmidt, Rick

    2011-01-01

    There have been many challenges associated with modeling some of NASA KSC's largest structures. Given the size of the welded structures here at KSC, it was critically important to properly organize model structure and carefully manage references. Additionally, because of the amount of hardware to be installed on these structures, it was very important to have a means to coordinate between different design teams and organizations, check for interferences, produce consistent drawings, and allow for simple release processes. Facing these challenges, the modeling team developed a unique reference management methodology and model fidelity methodology. This presentation will describe the techniques and methodologies that were developed for these projects. The attendees will learn about KSC's reference management and model fidelity methodologies for large structures. The attendees will understand the goals of these methodologies. The attendees will appreciate the advantages of developing a reference management methodology.

  19. Natural gas production problems : solutions, methodologies, and modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, Christopher Arthur; Herrin, James M.; Cooper, Scott Patrick; Basinski, Paul M. (El Paso Production Company, Houston, TX); Olsson, William Arthur; Arnold, Bill Walter; Broadhead, Ronald F. (New Mexico Bureau of Geology and Mineral Resources, Socorro, NM); Knight, Connie D. (Consulting Geologist, Golden, CO); Keefe, Russell G.; McKinney, Curt (Devon Energy Corporation, Oklahoma City, OK); Holm, Gus (Vermejo Park Ranch, Raton, NM); Holland, John F.; Larson, Rich (Vermejo Park Ranch, Raton, NM); Engler, Thomas W. (New Mexico Institute of Mining and Technology, Socorro, NM); Lorenz, John Clay

    2004-10-01

    Natural gas is a clean fuel that will be the most important domestic energy resource for the first half of the 21st century. Ensuring a stable supply is essential for our national energy security. The research we have undertaken will maximize the extractable volume of gas while minimizing the environmental impact of surface disturbances associated with drilling and production. This report describes a methodology for comprehensive evaluation and modeling of the total gas system within a basin, focusing on problematic horizontal fluid flow variability. This has been accomplished through extensive use of geophysical, core (rock sample) and outcrop data to interpret and predict directional flow and production trends. Side benefits include reduced environmental impact of drilling due to the reduced number of wells required for resource extraction. These results have been accomplished through a cooperative and integrated systems approach involving industry, government, academia and a multi-organizational team within Sandia National Laboratories. Industry has provided essential in-kind support to this project in the forms of extensive core data, production data, maps, seismic data, production analyses, engineering studies, plus equipment and staff for obtaining geophysical data. This approach provides innovative ideas and technologies to bring new resources to market and to reduce the overall environmental impact of drilling. More importantly, the products of this research are not location specific but can be extended to other areas of gas production throughout the Rocky Mountain area. Thus this project is designed to solve problems associated with natural gas production at developing sites, or at old sites under redevelopment.

  20. FUZZY MODEL OPTIMIZATION FOR TIME SERIES DATA USING A TRANSLATION IN THE EXTENT OF MEAN ERROR

    OpenAIRE

    Nurhayadi; ., Subanar; Abdurakhman; Agus Maman Abadi

    2014-01-01

    Recently, many researchers have written about the prediction of stock prices, electricity load demand and academic enrollment using fuzzy methods. In general, however, such modeling does not yet consider the position of the model relative to the actual data, which means that error has not been handled optimally. Error that is not managed well can reduce the accuracy of the forecasting. Therefore, the paper will discuss reducing error using model translation. The error that will be reduced i...
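The "translation in the extent of mean error" idea can be sketched as a bias correction: shift the model output by the mean of its errors so the model is repositioned relative to the actual data. The series and forecasts below are invented for illustration.

```python
# Translate a model's forecasts by the mean error to remove systematic
# bias. Data and fuzzy-model forecasts are hypothetical.
actual = [10.0, 12.0, 13.0, 15.0]
forecast = [9.0, 10.5, 12.0, 13.5]

mean_error = sum(a - f for a, f in zip(actual, forecast)) / len(actual)
translated = [f + mean_error for f in forecast]

def mse(y, p):
    return sum((a - b) ** 2 for a, b in zip(y, p)) / len(y)

# Shifting by the mean error can only lower (never raise) the MSE.
print(mse(actual, forecast), mse(actual, translated))
```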

  1. The marmoset monkey: a multi-purpose preclinical and translational model of human biology and disease.

    Science.gov (United States)

    't Hart, Bert A; Abbott, David H; Nakamura, Katsuki; Fuchs, Eberhard

    2012-11-01

    The development of biologic molecules (monoclonal antibodies, cytokines, soluble receptors) as specific therapeutics for human disease creates a need for animal models in which safety and efficacy can be tested. Models in lower animal species are precluded when the reagents fail to recognize their targets, which is often the case in rats and mice. In this Feature article we will highlight the common marmoset, a small-bodied nonhuman primate (NHP), as a useful model in biomedical and preclinical translational research.

  2. Antisense Oligonucleotides: Translation from Mouse Models to Human Neurodegenerative Diseases.

    Science.gov (United States)

    Schoch, Kathleen M; Miller, Timothy M

    2017-06-21

    Multiple neurodegenerative diseases are characterized by single-protein dysfunction and aggregation. Treatment strategies for these diseases have often targeted downstream pathways to ameliorate consequences of protein dysfunction; however, targeting the source of that dysfunction, the affected protein itself, seems most judicious to achieve a highly effective therapeutic outcome. Antisense oligonucleotides (ASOs) are small sequences of DNA able to target RNA transcripts, resulting in reduced or modified protein expression. ASOs are ideal candidates for the treatment of neurodegenerative diseases, given numerous advancements made to their chemical modifications and delivery methods. Successes achieved in both animal models and human clinical trials have proven ASOs both safe and effective. With proper considerations in mind regarding the human applicability of ASOs, we anticipate ongoing in vivo research and clinical trial development of ASOs for the treatment of neurodegenerative diseases. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Optimization of Glioblastoma Mouse Orthotopic Xenograft Models for Translational Research.

    Science.gov (United States)

    Irtenkauf, Susan M; Sobiechowski, Susan; Hasselbach, Laura A; Nelson, Kevin K; Transou, Andrea D; Carlton, Enoch T; Mikkelsen, Tom; deCarvalho, Ana C

    2017-08-01

    Glioblastoma is an aggressive primary brain tumor predominantly localized to the cerebral cortex. We developed a panel of patient-derived mouse orthotopic xenografts (PDOX) for preclinical drug studies by implanting cancer stem cells (CSC) cultured from fresh surgical specimens intracranially into 8-wk-old female athymic nude mice. Here we optimize the glioblastoma PDOX model by assessing the effect of implantation location on tumor growth, survival, and histologic characteristics. To trace the distribution of intracranial injections, toluidine blue dye was injected at 4 locations with defined mediolateral, anterioposterior, and dorsoventral coordinates within the cerebral cortex. Glioblastoma CSC from 4 patients and a glioblastoma nonstem-cell line were then implanted by using the same coordinates for evaluation of tumor location, growth rate, and morphologic and histologic features. Dye injections into one of the defined locations resulted in dye dissemination throughout the ventricles, whereas tumor cell implantation at the same location resulted in a much higher percentage of small multifocal ventricular tumors than did the other 3 locations tested. Ventricular tumors were associated with a lower tumor growth rate, as measured by in vivo bioluminescence imaging, and decreased survival in 4 of 5 cell lines. In addition, tissue oxygenation, vasculature, and the expression of astrocytic markers were altered in ventricular tumors compared with nonventricular tumors. Based on this information, we identified an optimal implantation location that avoided the ventricles and favored cortical tumor growth. To assess the effects of stress from oral drug administration, mice that underwent daily gavage were compared with stress-positive and -negative control groups. Oral gavage procedures did not significantly affect the survival of the implanted mice or physiologic measurements of stress. Our findings document the importance of optimization of the implantation site for

  4. What Does It Mean to Be Pragmatic? Pragmatic Methods, Measures, and Models to Facilitate Research Translation

    Science.gov (United States)

    Glasgow, Russell E.

    2013-01-01

    Background: One of the reasons for the slow and uncertain translation of research into practice is likely due to the emphasis in science on explanatory models and efficacy designs rather than more pragmatic approaches. Methods: Following a brief definition of what constitutes a pragmatic approach, I provide examples of pragmatic methods, measures,…

  5. Improvement of preclinical animal models for autoimmune-mediated disorders via reverse translation of failed therapies

    NARCIS (Netherlands)

    't Hart, Bert A.; Jagessar, S. Anwar; Kap, Yolanda S.; Haanstra, Krista G.; Philippens, Ingrid H. C. H. M.; Serguera, Che; Langermans, Jan; Vierboom, Michel

    2014-01-01

    The poor translational validity of autoimmune-mediated inflammatory disease (AIMID) models in inbred and specific pathogen-free (SPF) rodents underlies the high attrition of new treatments for the corresponding human disease. Experimental autoimmune encephalomyelitis (EAE) is a frequently used precl

  6. Reverse translation of failed treatments can help improving the validity of preclinical animal models

    NARCIS (Netherlands)

    't Hart, Bert A.

    2015-01-01

    A major challenge in translational research is to reduce the currently high proportion of new candidate treatment agents for neuroinflammatory disease, which fail to reproduce promising effects observed in animal models when tested in patients. This disturbing situation has raised criticism against

  7. Transcriptome and proteome exploration to model translation efficiency and protein stability in Lactococcus lactis.

    Directory of Open Access Journals (Sweden)

    Clémentine Dressaire

    2009-12-01

    Full Text Available This genome-scale study analysed the various parameters influencing protein levels in cells. To achieve this goal, the model bacterium Lactococcus lactis was grown at steady state in continuous cultures at different growth rates, and proteomic and transcriptomic data were thoroughly compared. Ratios of mRNA to protein were highly variable among proteins but also, for a given gene, between the different growth conditions. The modeling of cellular processes combined with a data fitting modeling approach allowed both translation efficiencies and degradation rates to be estimated for each protein in each growth condition. Estimated translational efficiencies and degradation rates strongly differed between proteins and were tested for their biological significance through statistical correlations with relevant parameters such as codon or amino acid bias. These efficiencies and degradation rates were not constant in all growth conditions and were inversely proportional to the growth rate, indicating a more efficient translation at low growth rate but an antagonistic higher rate of protein degradation. Estimated protein median half-lives ranged from 23 to 224 min, underlining the importance of protein degradation, notably at low growth rates. The regulation of intracellular protein level was analysed through regulatory coefficient calculations, revealing a complex control depending on protein and growth conditions. The modeling approach enabled translational efficiencies and protein degradation rates to be estimated, two biological parameters extremely difficult to determine experimentally and generally lacking in bacteria. This method is generic and can now be extended to other environments and/or other micro-organisms.
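The estimation rests on a steady-state balance of the form k_tl * M = (k_deg + mu) * P for each growth condition (mu the growth/dilution rate, M the mRNA level, P the protein level). A toy sketch follows, assuming, purely for illustration and unlike the paper's condition-dependent fits, that k_tl and k_deg are shared across two conditions so the balances form a solvable linear system.

```python
import math

# Steady-state protein balance in a chemostat at dilution rate mu:
#   k_tl * M = (k_deg + mu) * P
# With shared k_tl, k_deg (illustration only), two conditions give:
#   k_tl * M1 - k_deg * P1 = mu1 * P1
#   k_tl * M2 - k_deg * P2 = mu2 * P2
M1, P1, mu1 = 2.0, 100.0, 0.1   # hypothetical mRNA, protein, growth rate
M2, P2, mu2 = 3.0, 120.0, 0.5

# Solve the 2x2 linear system by Cramer's rule
det = M1 * (-P2) - (-P1) * M2
k_tl = (mu1 * P1 * (-P2) - (-P1) * mu2 * P2) / det
k_deg = (M1 * mu2 * P2 - mu1 * P1 * M2) / det

half_life = math.log(2) / k_deg  # protein half-life from the degradation rate
print(k_tl, k_deg, half_life)
```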

  8. Knowledge Structure Measures of Reader's Situation Models across Languages: Translation Engenders Richer Structure

    Science.gov (United States)

    Kim, Kyung; Clariana, Roy B.

    2015-01-01

    In order to further validate and extend the application of recent knowledge structure (KS) measures to second language settings, this investigation explores how second language (L2, English) situation models are influenced by first language (L1, Korean) translation tasks. Fifty Korean low proficient English language learners were asked to read an…

  9. Anatomy and bronchoscopy of the porcine lung. A model for translational respiratory medicine.

    LENUS (Irish Health Repository)

    Judge, Eoin P

    2014-09-01

    The porcine model has contributed significantly to biomedical research over many decades. The similar size and anatomy of pig and human organs make this model particularly beneficial for translational research in areas such as medical device development, therapeutics and xenotransplantation. In recent years, a major limitation with the porcine model was overcome with the successful generation of gene-targeted pigs and the publication of the pig genome. As a result, the role of this model is likely to become even more important. For the respiratory medicine field, the similarities between pig and human lungs give the porcine model particular potential for advancing translational medicine. An increasing number of lung conditions are being studied and modeled in the pig. Genetically modified porcine models of cystic fibrosis have been generated that, unlike mouse models, develop lung disease similar to human cystic fibrosis. However, the scientific literature relating specifically to porcine lung anatomy and airway histology is limited and is largely restricted to veterinary literature and textbooks. Furthermore, methods for in vivo lung procedures in the pig are rarely described. The aims of this review are to collate the disparate literature on porcine lung anatomy, histology, and microbiology; to provide a comparison with the human lung; and to describe appropriate bronchoscopy procedures for the pig lungs to aid clinical researchers working in the area of translational respiratory medicine using the porcine model.

  10. contemporary translation studies and bible translation

    African Journals Online (AJOL)

    the source text to the translation process, the product and/or reception of ... The methodological impact was a shift from normative linguistic-based ... ary Society Period, with formal-equivalent translations being made by mis- ... the theoretical foundation of the functional-equivalent approach problem- ... After a review of.

  11. ChOrDa: a methodology for the modeling of business processes with BPMN

    CERN Document Server

    Buferli, Matteo; Montesi, Danilo

    2009-01-01

    In this paper we present a modeling methodology for BPMN, the standard notation for the representation of business processes. Our methodology simplifies the development of collaborative BPMN diagrams, enabling the automated creation of skeleton process diagrams representing complex choreographies. To evaluate and tune the methodology, we have developed a tool supporting it, that we apply to the modeling of an international patenting process as a working example.

  12. Masked translation priming asymmetry in Chinese-English bilinguals: making sense of the Sense Model.

    Science.gov (United States)

    Xia, Violet; Andrews, Sally

    2015-01-01

    Masked translation priming asymmetry is the robust finding that priming from a bilingual's first language (L1) to their second language (L2) is stronger than priming from L2 to L1. This asymmetry has been claimed to be task dependent. The Sense Model proposed by Finkbeiner, Forster, Nicol, and Nakamura (2004) claims that the asymmetry is reduced in semantic categorization relative to lexical decision due to a category filtering mechanism that limits the features considered in categorization decisions to dominant, category-relevant features. This paper reports two pairs of semantic categorization and lexical decision tasks designed to test the Sense Model's predictions. The experiments replicated the finding of Finkbeiner et al. that L2-L1 priming is somewhat stronger in semantic categorization than lexical decision, selectively for exemplars of the category. However, the direct comparison of L2-L1 and L1-L2 translation priming across tasks failed to confirm the Sense Model's central prediction that translation priming asymmetry is significantly reduced in semantic categorization. The data therefore fail to support the category filtering account of translation priming asymmetry. Rather, they suggest that pre-activation of conceptual features of the target category provides feedback to lexical forms that compensates for the weak connections between the lexical and conceptual representations of L2 words.

  13. HIERARCHICAL METHODOLOGY FOR MODELING HYDROGEN STORAGE SYSTEMS PART II: DETAILED MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, B; Donald L. Anton, D

    2008-12-22

    There is significant interest in hydrogen storage systems that employ a medium which either adsorbs, absorbs or reacts with hydrogen in a nearly reversible manner. In any media-based storage system the rate of hydrogen uptake and the system capacity are governed by a number of complex, coupled physical processes. To design and evaluate such storage systems, a comprehensive methodology was developed, consisting of a hierarchical sequence of models that range from scoping calculations to numerical models that couple reaction kinetics with heat and mass transfer for both the hydrogen charging and discharging phases. The scoping models were presented in Part I [1] of this two-part series of papers. This paper describes a detailed numerical model that integrates the phenomena occurring when hydrogen is charged and discharged. A specific application of the methodology is made to a system using NaAlH₄ as the storage medium.
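A zero-dimensional caricature of the coupled charging problem (Arrhenius-type uptake kinetics plus a lumped bed heat balance, integrated with explicit Euler) conveys the coupling the detailed model resolves in full; every parameter below is invented and no NaAlH4 kinetics are reproduced.

```python
import math

# Toy lumped charging model: Arrhenius uptake kinetics coupled to a
# simple bed heat balance. All parameters are invented for illustration.
C, Ea, R = 50.0, 30_000.0, 8.314         # 1/s, J/mol, J/(mol K)
dH, n_tot = 40_000.0, 10.0               # J per mol H2 absorbed, mol capacity
m_cp, hA, T_cool = 5_000.0, 50.0, 300.0  # J/K, W/K, K

w, T, dt = 0.0, 300.0, 0.1               # reacted fraction, bed temp (K), step (s)
for _ in range(20_000):                  # explicit Euler integration, 2000 s
    rate = C * math.exp(-Ea / (R * T)) * (1.0 - w)  # fraction per second
    w += dt * rate
    # exothermic uptake heats the bed; convective cooling removes heat
    T += dt * (dH * n_tot * rate - hA * (T - T_cool)) / m_cp
print(w, T)
```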

  14. Translating therapies for Huntington's disease from genetic animal models to clinical trials.

    Science.gov (United States)

    Hersch, Steven M; Ferrante, Robert J

    2004-07-01

    Genetic animal models of inherited neurological diseases provide an opportunity to test potential treatments and explore their promise for translation to humans experiencing these diseases. Therapeutic trials conducted in mouse models of Huntington's disease have identified a growing number of potential therapies that are candidates for clinical trials. Although it is very exciting to have these candidates, there has been increasing concern about the feasibility and desirability of taking all of the compounds that may work in mice and testing them in patients with HD. There is a need to begin to prioritize leads emerging from transgenic mouse studies; however, it is difficult to compare results between compounds and laboratories, and there are also many additional factors that can affect translation to humans. Among the important issues are what constitutes an informative genetic model, what principles should be followed in designing and conducting experiments using genetic animal models, how can results from different laboratories and in different models be compared, what body of evidence is desirable to fully inform clinical decision making, and what factors contribute to the equipoise in determining whether preclinical information about a therapy makes clinical study warranted. In the context of Huntington's disease, we will review the current state of genetic models and their successes in putting forward therapeutic leads, provide a guide to assessing studies in mouse models, and discuss some of the salient issues related to translation from mice to humans.

  15. Translational PK/PD modeling to increase probability of success in drug discovery and early development.

    Science.gov (United States)

    Lavé, Thierry; Caruso, Antonello; Parrott, Neil; Walz, Antje

    In this review we present ways in which translational PK/PD modeling can address opportunities to enhance probability of success in drug discovery and early development. This is achieved by impacting efficacy and safety-driven attrition rates, through increased focus on the quantitative understanding and modeling of translational PK/PD. Application of the proposed principles early in the discovery and development phases is anticipated to bolster confidence of successfully evaluating proof of mechanism in humans and ultimately improve Phase II success. The present review is centered on the application of predictive modeling and simulation approaches during drug discovery and early development, and more specifically of mechanism-based PK/PD modeling. Case studies are presented, focused on the relevance of M&S contributions to real-world questions and the impact on decision making.

  16. Early-life stress origins of gastrointestinal disease: animal models, intestinal pathophysiology, and translational implications.

    Science.gov (United States)

    Pohl, Calvin S; Medland, Julia E; Moeser, Adam J

    2015-12-15

    Early-life stress and adversity are major risk factors in the onset and severity of gastrointestinal (GI) disease in humans later in life. The mechanisms by which early-life stress leads to increased GI disease susceptibility in adult life remain poorly understood. Animal models of early-life stress have provided a foundation from which to gain a more fundamental understanding of this important GI disease paradigm. This review focuses on animal models of early-life stress-induced GI disease, with a specific emphasis on translational aspects of each model to specific human GI disease states. Early postnatal development of major GI systems and the consequences of stress on their development are discussed in detail. Relevant translational differences between species and models are highlighted.

  17. A Model-Based Systems Engineering Methodology for Employing Architecture In System Analysis: Developing Simulation Models Using Systems Modeling Language Products to Link Architecture and Analysis

    Science.gov (United States)

    2016-06-01

    This work contributes to model-based systems engineering (MBSE) by formally defining an MBSE methodology for employing architecture in system analysis (MEASA) that presents...

  18. Development and translational imaging of a TP53 porcine tumorigenesis model

    OpenAIRE

    Sieren, Jessica C; Meyerholz, David K.; Wang, Xiao-Jun; Davis, Bryan T.; Newell, John D; Hammond, Emily; Rohret, Judy A.; Rohret, Frank A.; Struzynski, Jason T.; Goeken, J. Adam; Naumann, Paul W.; Leidinger, Mariah R.; Taghiyev, Agshin; Van Rheeden, Richard; Hagen, Jussara

    2014-01-01

    Cancer is the second deadliest disease in the United States, necessitating improvements in tumor diagnosis and treatment. Current model systems of cancer are informative, but translating promising imaging approaches and therapies to clinical practice has been challenging. In particular, the lack of a large-animal model that accurately mimics human cancer has been a major barrier to the development of effective diagnostic tools along with surgical and therapeutic interventions. Here, we develo...

  19. A new model for utricular function testing using a sinusoidal translation profile during unilateral centrifugation.

    Science.gov (United States)

    Buytaert, K I; Nooij, S A E; Neyt, X; Migeotte, P-F; Vanspauwen, R; Van de Heyning, P H; Wuyts, F L

    2010-01-01

    The utricle plays an important role in orientation with respect to gravity. The unilateral centrifugation test allows a side-by-side investigation of both utricles. During this test, the subject is rotated about an earth-vertical axis at high rotation speeds (e.g. 400°/s) and translated along an interaural axis to consecutively align the axis of rotation with the left and the right utricle. A simple sinusoidal translation profile (0.013 Hz; amplitude = 4 cm) was chosen. The combined rotation and translation induces ocular counter rolling (OCR), which is measured using 3-D video-oculography. This OCR is the sum of the reflexes generated by both the semicircular canals and the utricles. In this paper, we present a new physiological model that decomposes this total OCR into a canal and a utricular contribution, modelled by a second-order transfer function and a combination of 2 sine functions, respectively. This model yields parameters such as canal gain, cupular and adaptation time constants and a velocity storage component for the canals. Utricular gain, bias, phase and the asymmetry between the left and the right utricle are characteristic parameters generated by the model for the utricles. The model is presented along with the results of 10 healthy subjects and 2 patients with a unilateral vestibular loss due to acoustic neuroma surgery to illustrate the effectiveness of the model.
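The additive decomposition described above can be sketched numerically. The functional forms and parameter values below are illustrative assumptions, not the fitted transfer function or cadaveric data from the paper; they only show the structure "total OCR = canal contribution + utricular contribution".

```python
import numpy as np

def utricular_ocr(t, gain=1.0, bias=0.1, phase=0.0, f=0.013):
    # Assumed form: two sinusoids at the translation frequency and its
    # first harmonic, plus a bias; gain/bias/phase stand in for the
    # utricular parameters named in the abstract.
    return gain * (np.sin(2 * np.pi * f * t + phase)
                   + 0.3 * np.sin(2 * np.pi * 2 * f * t + phase)) + bias

def canal_ocr(t, gain=0.2, tau_c=6.0):
    # Simplified canal response after a velocity step: exponential decay
    # with a cupular time constant (a stand-in for the second-order
    # transfer function of the full model).
    return gain * np.exp(-t / tau_c)

t = np.linspace(0, 80, 400)
total_ocr = canal_ocr(t) + utricular_ocr(t)  # total OCR = canal + utricle
```

Fitting such a sum to measured OCR traces would then yield the canal and utricular parameters separately for each side.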

  20. Translation between representation languages

    Science.gov (United States)

    Vanbaalen, Jeffrey

    1994-01-01

    A capability for translating between representation languages is critical for effective knowledge base reuse. A translation technology for knowledge representation languages based on the use of an interlingua for communicating knowledge is described. The interlingua-based translation process consists of three major steps: translation from the source language into a subset of the interlingua, translation between subsets of the interlingua, and translation from a subset of the interlingua into the target language. The first translation step into the interlingua can typically be specified in the form of a grammar that describes how each top-level form in the source language translates into the interlingua. In cases where the source language does not have a declarative semantics, such a grammar is also a specification of a declarative semantics for the language. A methodology for building translators that is currently under development is described. A 'translator shell' based on this methodology is also under development. The shell has been used to build translators for multiple representation languages and those translators have successfully translated nontrivial knowledge bases.
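The three-step interlingua process can be sketched with two toy representation languages. Every language, predicate, and function name below is hypothetical, invented for illustration; none of it is part of the translator shell described.

```python
# Toy interlingua pipeline: source triples -> interlingua -> normalised
# interlingua subset -> target s-expressions. All names are hypothetical.

def source_to_interlingua(facts):
    """Step 1: translate source-language forms into an interlingua subset.
    Source forms here are (predicate, subject, object) triples."""
    return [{"pred": p, "args": [s, o]} for (p, s, o) in facts]

def interlingua_to_interlingua(terms):
    """Step 2: rewrite between interlingua subsets, e.g. normalise every
    binary predicate into a single 'holds' relation."""
    return [{"pred": "holds", "args": [t["pred"]] + t["args"]} for t in terms]

def interlingua_to_target(terms):
    """Step 3: emit target-language syntax (here, Lisp-like s-expressions)."""
    return ["(%s)" % " ".join([t["pred"]] + t["args"]) for t in terms]

def translate(facts):
    return interlingua_to_target(
        interlingua_to_interlingua(source_to_interlingua(facts)))

print(translate([("parent", "alice", "bob")]))
# -> ['(holds parent alice bob)']
```

Step 1 plays the role of the per-language grammar mentioned above; only that stage needs to change when a new source language is added.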

  1. Translations and Translators.

    Science.gov (United States)

    Nida, Eugene A.

    1979-01-01

    The necessity for stylistic appropriateness in translation as well as correct content is discussed. To acquire this skill, translators must be trained in stylistics through close examination of their own language and must have practice in translating for different audiences at different levels. (PMJ)

  2. TREXMO: a translation tool to support the use of regulatory occupational exposure models

    OpenAIRE

    Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, Dernez

    2016-01-01

    Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern-TREXMO or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely, ART v.1.5, STOFFENMANAGER(®) v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EX...

  3. The Translation Invariant Massive Nelson Model: III. Asymptotic Completeness Below the Two-Boson Threshold

    Science.gov (United States)

    Dybalski, Wojciech; Møller, Jacob Schach

    2015-11-01

    We show asymptotic completeness of two-body scattering for a class of translation invariant models describing a single quantum particle (the electron) linearly coupled to a massive scalar field (bosons). Our proof is based on a recently established Mourre estimate for these models. In contrast to previous approaches, it requires no number cutoff, no restriction on the particle-field coupling strength, and no restriction on the magnitude of total momentum. Energy, however, is restricted by the two-boson threshold, admitting only scattering of a dressed electron and a single asymptotic boson. The class of models we consider includes the UV-cutoff Nelson and polaron models.

  4. a Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and
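The cross-correlation idea behind the fully automated assessment can be illustrated on synthetic envelopes: the lag at which the cross-correlation of source and target energy envelopes peaks estimates the interpreter's delay. The sampling rate and signals below are invented, not the corpus recordings.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100                                            # envelope sample rate, Hz
source = (rng.random(10 * fs) > 0.5).astype(float)  # toy on/off speech envelope
delay_s = 1.2                                       # simulated interpreter lag
target = np.roll(source, int(delay_s * fs))         # delayed "translation"

# Peak of the cross-correlation of the mean-removed envelopes gives the lag.
corr = np.correlate(target - target.mean(), source - source.mean(), "full")
lag_s = (int(np.argmax(corr)) - (len(source) - 1)) / fs
print(lag_s)
```

On real recordings the envelopes would come from short-time energy of the audio, and the recovered lag feeds the delay parameter of the proficiency measures.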

  5. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens;

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology, which shows promising results for systems composed of reconfigurable underwater modules. The purpose of the model is to enable design of control strategies for cooperative reconfigurable underwater systems.

  6. TREXMO: A Translation Tool to Support the Use of Regulatory Occupational Exposure Models.

    Science.gov (United States)

    Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David

    2016-10-01

    Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern-TREXMO or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely, ART v.1.5, STOFFENMANAGER(®) v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EXPO-TOOL, and EASE v.2.0. By enabling a semi-automatic translation between the parameters of these six models, TREXMO facilitates their simultaneous use. For a given exposure situation, defined by a set of parameters in one of the models, TREXMO provides the user with the most appropriate parameters to use in the other exposure models. Results showed that, once an exposure situation and parameters were set in ART, TREXMO reduced the number of possible outcomes in the other models by 1-4 orders of magnitude. The tool should manage to reduce the uncertain entry or selection of parameters in the six models, improve between-user reliability, and reduce the time required for running several models for a given exposure situation. In addition to these advantages, registrants of chemicals and authorities should benefit from more reliable exposure estimates for the risk characterization of dangerous chemicals under Registration, Evaluation, Authorisation and restriction of CHemicals (REACH).
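The kind of parameter translation TREXMO performs can be caricatured as a crosswalk lookup: a parameter choice in the source model narrows the admissible choices in a target model. The activity names and mappings below are invented for illustration and are not TREXMO's actual rules.

```python
# Hypothetical crosswalk: an ART-style activity class constrains which
# STOFFENMANAGER-style handling categories remain plausible. These
# mappings are made up; the real tool encodes expert-derived rules.
CROSSWALK = {
    "spray_application": ["handling_3", "handling_4"],
    "transfer_of_liquids": ["handling_2", "handling_3"],
}

def translate_parameters(source_activity):
    """Return the candidate target-model parameters for a source-model
    parameter, narrowing the user's choices instead of leaving all open."""
    return CROSSWALK.get(source_activity, [])

print(translate_parameters("spray_application"))
# -> ['handling_3', 'handling_4']
```

Chaining such lookups across six models is what reduces the space of possible outcomes by orders of magnitude once one model's situation is fixed.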

  7. Translating Words, Translating Cultures

    Directory of Open Access Journals (Sweden)

    Richard Whitaker

    2012-03-01

    What exactly does (or should) translation from one language into another try to do? Attempt to convey to readers of the target language (the language into which one is translating) something of the strangeness, difference and historicity of the original in the source language (the language from which one is translating)? Or must translation try to bridge the gap between source and target language, by rendering the original in a thoroughly contemporary style and diction, as if this were a work being written now for the first time? And, related to these, the further questions: how closely should a translation render the genre, language, metre, style and content of the original? How far can a translation depart from the original without ceasing to be a translation – in other words, where is one to situate the border between “translation”, “version” and “adaptation”?

  8. Fear Extinction as a Model for Translational Neuroscience: Ten Years of Progress

    OpenAIRE

    Milad, Mohammed R.; Quirk, Gregory J

    2012-01-01

    The psychology of extinction has been studied for decades. Approximately 10 years ago, however, there began a concerted effort to understand the neural circuits of extinction of fear conditioning, in both animals and humans. Progress during this period has been facilitated by an unusual degree of coordination between rodent and human researchers examining fear extinction. This successful research program could serve as a model for translational research in other areas of behavioral neuroscien...

  9. Boundary Conditions for Translation-Invariant Gibbs Measures of the Potts Model on Cayley Trees

    Science.gov (United States)

    Gandolfo, D.; Rahmatullaev, M. M.; Rozikov, U. A.

    2017-06-01

    We consider translation-invariant splitting Gibbs measures (TISGMs) for the q-state Potts model on a Cayley tree of order two. Recently a full description of the TISGMs was obtained, and it was shown in particular that at sufficiently low temperatures their number is 2^q − 1. In this paper for each TISGM μ we explicitly give the set of boundary conditions such that limiting Gibbs measures with respect to these boundary conditions coincide with μ.

  10. An Analysis of Step, Jt, and Pdf Format Translation Between Constraint-based Cad Systems with a Benchmark Model

    OpenAIRE

    McKenzie-Veal, Dillon

    2012-01-01

    This research was conducted to provide greater depth into the ability of STEP AP 203 Edition 2, JT, and 3D PDF to translate and preserve information while using a benchmark model. The benchmark model was designed based on four industry models and created natively in the five industry leading 3D CAD programs. The native CAD program models were translated using STEP, JT, and 3D PDF. Several criteria were analyzed along the paths of translation from one disparate CAD program to another. Along wi...

  11. Developing a multidisciplinary model of comparative effectiveness research within a clinical and translational science award.

    Science.gov (United States)

    Marantz, Paul R; Strelnick, A Hal; Currie, Brian; Bhalla, Rohit; Blank, Arthur E; Meissner, Paul; Selwyn, Peter A; Walker, Elizabeth A; Hsu, Daphne T; Shamoon, Harry

    2011-06-01

    The Clinical and Translational Science Awards (CTSAs) were initiated to improve the conduct and impact of the National Institutes of Health's research portfolio, transforming training programs and research infrastructure at academic institutions and creating a nationwide consortium. They provide a model for translating research across disciplines and offer an efficient and powerful platform for comparative effectiveness research (CER), an effort that has long struggled but enjoys renewed hope under health care reform. CTSAs include study design and methods expertise, informatics, and regulatory support; programs in education, training, and career development in domains central to CER; and programs in community engagement. Albert Einstein College of Medicine of Yeshiva University and Montefiore Medical Center have entered a formal partnership that places their CTSA at a critical intersection for clinical and translational research. Their CTSA leaders were asked to develop a strategy for enhancing CER activities, and in 2010 they developed a model that encompasses four broadly defined "compartments" of research strength that must be coordinated for this enterprise to succeed: evaluation and health services research, biobehavioral research and prevention, efficacy studies and clinical trials, and social science and implementation research. This article provides historical context for CER, elucidates Einstein-Montefiore's CER model and strategic planning efforts, and illustrates how a CTSA can provide vision, leadership, coordination, and services to support an academic health center's collaborative efforts to develop a robust CER portfolio and thus contribute to the national effort to improve health and health care.

  12. A geometrical model of vertical translation and alar ligament tension in atlanto-axial rotation.

    Science.gov (United States)

    Boszczyk, B M; Littlewood, A P; Putz, R

    2012-08-01

    While allowing the greatest range of axial rotation of the entire spine with 40° to each side, gradual restraint at the extremes of motion by the alar ligaments is of vital importance. In order for the ligaments to facilitate a gradual transition from the neutral to the elastic zone, a complex interaction of axial rotation and vertical translation via the biconvex articular surfaces is essential. The aim of this investigation is to establish a geometrical model of the intricate interaction of the alar ligaments and vertical translatory motion of C1/C2 in axial rotation. Bilateral alar ligaments including the odontoid process and condylar bony entheses were removed from six adult cadavers aged 65-89 years within 48 h of death. All specimens were judged to be free of abnormalities with the exception of non-specific degenerative changes. Dimensions of the odontoid process and alar ligaments were measured. Graphical multiplanar reconstruction of atlanto-axial rotation was done in the transverse and frontal planes for the neutral position and for rotation to 40° with vertical translation of 3 mm. The necessary fibre elongation of the alar ligaments in the setting with and without vertical translation of the atlas was calculated. The mean diameter of the odontoid process in the sagittal plane was 10.6 mm (SD 1.1). The longest fibre length was measured from the posterior border of the odontoid enthesis to the posterior border of the condylar enthesis with an average of 13.2 mm (SD 2.5) and the shortest between the lateral (anterior) border odontoid enthesis and the anterior condylar enthesis with an average of 8.2 mm (SD 2.2). In graphical multiplanar reconstruction of atlanto-axial rotation to 40° without vertical translation of C1/C2, theoretical alar fibre elongation reaches 27.1% for the longest fibres, which is incompatible with the collagenous structure of the alar ligaments. Allowing 3 mm caudal translation of C1 on C2 at 40° rotation, as facilitated by the
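A toy version of the fibre-elongation calculation shows the effect the abstract describes: rotating the condylar enthesis about the dens axis stretches the fibre, and a small caudal translation reduces the required elongation. The coordinates below are assumed for illustration, not the cadaveric measurements, so the percentages differ from the paper's values.

```python
import numpy as np

# Dens axis = z-axis. O is the odontoid enthesis on the dens surface;
# C is the condylar enthesis, which rotates with the atlanto-occipital
# complex. Both positions are invented (mm), chosen only to be plausible.
O = np.array([5.3, 0.0, 0.0])
C = np.array([9.0, 9.0, 3.0])

def fibre_length(rotation_deg, caudal_mm=0.0):
    a = np.radians(rotation_deg)
    rot = np.array([[np.cos(a), -np.sin(a), 0.0],
                    [np.sin(a),  np.cos(a), 0.0],
                    [0.0,        0.0,       1.0]])
    c = rot @ C - np.array([0.0, 0.0, caudal_mm])  # rotate, then shift caudally
    return float(np.linalg.norm(c - O))

L0 = fibre_length(0.0)                                        # neutral length
elong_rot_only = (fibre_length(40.0) - L0) / L0 * 100.0
elong_with_caudal = (fibre_length(40.0, caudal_mm=3.0) - L0) / L0 * 100.0
print(round(elong_rot_only, 1), round(elong_with_caudal, 1))
```

With the paper's measured entheses in place of these assumed points, the same computation reproduces the comparison between pure rotation and rotation with 3 mm of vertical translation.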

  13. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare

    NARCIS (Netherlands)

    Bracke, M.B.M.; Edwards, S.A.; Metz, J.H.M.; Noordhuizen, J.P.T.M.; Algers, B.

    2008-01-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally app

  14. Translational Rodent Models for Research on Parasitic Protozoa—A Review of Confounders and Possibilities

    Directory of Open Access Journals (Sweden)

    Totta Ehret

    2017-06-01

    Rodents, in particular Mus musculus, have a long and invaluable history as models for human diseases in biomedical research, although their translational value has been challenged in a number of cases. We provide some examples in which rodents have been suboptimal as models for human biology and discuss confounders which influence experiments and may explain some of the misleading results. Infections of rodents with protozoan parasites are no exception in requiring close consideration upon model choice. We focus on the significant differences between inbred, outbred and wild animals, and the importance of factors such as microbiota, which are gaining attention as crucial variables in infection experiments. Frequently, mouse or rat models are chosen for convenience (e.g., availability in the institution) rather than on an unbiased evaluation of whether they provide the answer to a given question. Apart from a general discussion on translational success or failure, we provide examples where infections with single-celled parasites in a chosen lab rodent gave contradictory or misleading results, and when possible discuss the reason for this. We present emerging alternatives to traditional rodent models, such as humanized mice and organoid primary cell cultures. So-called recombinant inbred strains such as the Collaborative Cross collection are also a potential solution for certain challenges. In addition, we emphasize the advantages of using wild rodents for certain immunological, ecological, and/or behavioral questions. The experimental challenges (e.g., availability of species-specific reagents) that come with the use of such non-model systems are also discussed. Our intention is to foster critical judgment of both traditional and newly available translational rodent models for research on parasitic protozoa that can complement the existing mouse and rat models.

  15. Model for bridging the translational "valleys of death" in spinal cord injury research

    Directory of Open Access Journals (Sweden)

    Barrable B

    2014-04-01

    Bill Barrable, Nancy Thorogood, Vanessa Noonan, Jocelyn Tomkinson, Phalgun Joshi, Ken Stephenson, John Barclay, Katharina Kovacs Burns (Rick Hansen Institute; Division of Spine, Department of Orthopaedics, University of British Columbia, Vancouver, BC; Health Sciences Council, University of Alberta, Edmonton, AB, Canada) Abstract: To improve health care outcomes with cost-effective treatments and prevention initiatives, basic health research must be translated into clinical application and studied during implementation, a process commonly referred to as translational research. It is estimated that only 14% of health-related scientific discoveries enter into medical practice and that it takes an average of 17 years for them to do so. The transition from basic research to clinical knowledge and from clinical knowledge to practice or implementation is so fraught with obstacles that these transitions are often referred to as “valleys of death”. The Rick Hansen Institute has developed a unique praxis model for translational research in the field of spinal cord injury (SCI). The praxis model involves three components. The first is a coordinated program strategy of cure, care, consumer engagement, and commercialization. The second is a knowledge cycle that consists of four phases, ie, environmental scanning, knowledge generation and synthesis, knowledge validation, and implementation. The third is the provision of relevant resources and infrastructure to overcome obstacles in the “valleys of death”, ie, funding, clinical research operations, informatics, clinical research and best practice implementation, consumer engagement, collaborative networks, and strategic partnerships. This model, which is to be independently evaluated in 2018 to determine its strengths and limitations, has been used to advance treatments for pressure ulcers in SCI. The Rick Hansen Institute has developed an innovative solution to move knowledge into action by

  16. Dynamical modeling of microRNA action on the protein translation process

    Directory of Open Access Journals (Sweden)

    Barillot Emmanuel

    2010-02-01

    Abstract. Background: Protein translation is a multistep process which can be represented as a cascade of biochemical reactions (initiation, ribosome assembly, elongation, etc.), the rate of which can be regulated by small non-coding microRNAs through multiple mechanisms. It remains unclear which mechanisms of microRNA action are the most dominant; moreover, many experimental reports deliver controversial messages on what is the concrete mechanism actually observed in the experiment. Nissan and Parker have recently demonstrated that it might be impossible to distinguish alternative biological hypotheses using the steady-state data on the rate of protein synthesis. For their analysis they used two simple kinetic models of protein translation. Results: In contrast to the study by Nissan and Parker, we show that dynamical data allow discriminating some of the mechanisms of microRNA action. We demonstrate this using the same models as developed by Nissan and Parker for the sake of comparison, but the methods developed (asymptotology of biochemical networks) can be used for other models. We formulate a hypothesis that the effect of microRNA action is measurable and observable only if it affects the dominant system (a generalization of the limiting-step notion for complex networks) of the protein translation machinery. The dominant system can vary in different experimental conditions, which can partially explain the existing controversy in some of the experimental data. Conclusions: Our analysis of the transient protein translation dynamics shows that it gives enough information to verify or reject a hypothesis about a particular molecular mechanism of microRNA action on protein translation. For multiscale systems, only that action of microRNA is distinguishable which affects the parameters of the dominant system (critical parameters) or changes the dominant system itself. Dominant systems generalize and further develop the old and very popular idea of the limiting step.
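A minimal translation cascade in the spirit of the simple kinetic models discussed above can make the transient-versus-steady-state point concrete. All rate constants below are invented; microRNA repression is modelled as a factor mu scaling the initiation step.

```python
def simulate(k_init=0.1, k_elong=1.0, k_deg=0.05, mu=1.0, T=200.0, dt=0.01):
    """Euler integration of a two-pool mRNA cycle: free mRNA is initiated
    (scaled by the microRNA factor mu), initiated mRNA completes a round
    of elongation and returns to the free pool, producing protein that
    decays at rate k_deg. All rates are illustrative, not fitted."""
    m_free, m_init, protein = 1.0, 0.0, 0.0
    for _ in range(int(T / dt)):
        init_flux = mu * k_init * m_free
        elong_flux = k_elong * m_init
        m_free += dt * (elong_flux - init_flux)   # ribosome cycles back
        m_init += dt * (init_flux - elong_flux)
        protein += dt * (elong_flux - k_deg * protein)
    return protein

p_no_mirna = simulate(mu=1.0)   # no repression
p_mirna = simulate(mu=0.5)      # initiation halved by microRNA
```

Recording the whole protein trajectory rather than only the end point is what would let one tell a repressed initiation step apart from, say, a repressed elongation step, which is the paper's argument for using dynamical data.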

  17. Conception d'une methodologie generale d'evaluation de la traduction automatique (Conception of a General Methodology for Evaluating Machine Translation).

    Science.gov (United States)

    van Slype, Georges

    1982-01-01

    It is proposed that assessment of human translation versus machine translation programs use methods and criteria that measure efficiency and cost effectiveness and are efficient and cost-effective in themselves. A variety of methods and criteria are evaluated and discussed. (MSE)

  18. Integrated methodology for constructing a quantified hydrodynamic model for application to clastic petroleum reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Honarpour, M. M.; Schatzinger, R. A.; Szpakiewicz, M. J.; Jackson, S. R.; Sharma, B.; Tomutsa, L.; Chang, M. M.

    1990-01-01

    A comprehensive, multidisciplinary, stepwise methodology is developed for constructing and integrating geological and engineering information for predicting petroleum reservoir performance. This methodology is based on our experience in characterizing shallow marine reservoirs, but it should also apply to other deposystems. The methodology is presented as Part 1 of this report. Three major tasks that must be studied to facilitate a systematic approach for constructing a predictive hydrodynamic model for petroleum reservoirs are addressed: (1) data collection, organization, evaluation, and integration; (2) hydrodynamic model construction and verification; and (3) prediction and ranking of reservoir parameters by numerical simulation using data derived from the model. 39 refs., 62 figs., 13 tabs.

  19. A cislunar guidance methodology and model for low thrust trajectory generation

    Science.gov (United States)

    Korsmeyer, David J.

    1992-01-01

    A guidance methodology for generating low-thrust cislunar trajectories was developed and incorporated in a computer model. The guidance methodology divides the cislunar transfer into three phases. Each phase is discussed in turn. To establish the effectiveness of the methodology and algorithms the computer model generated three example cases for the cislunar transfer of a low-thrust electric orbital transfer vehicle (EOTV). Transfers from both earth orbit to lunar orbit and from lunar orbit back to earth orbit are considered. The model allows the determination of the low-thrust EOTV's time-of-flight, propellant mass, payload mass, and thrusting history.

  20. Novel Methodology for Functional Modeling and Simulation of Wireless Embedded Systems

    Directory of Open Access Journals (Sweden)

    Nitasha Jugessur

    2008-07-01

    A novel methodology is presented for the modeling and simulation of wireless embedded systems. Tight interaction between the analog and the digital functionality makes the design and verification of such systems a real challenge. The applied methodology brings together the functional models of the baseband algorithms written in C language with the circuit descriptions at behavioral level in Verilog or Verilog-AMS for system simulations in a single-kernel environment. The physical layer of an ultrawideband system has been successfully modeled and simulated. The results confirm that this methodology provides a standardized framework to efficiently and accurately simulate complex mixed-signal applications for embedded systems.

  1. Noise analysis of genome-scale protein synthesis using a discrete computational model of translation.

    Science.gov (United States)

    Racle, Julien; Stefaniuk, Adam Jan; Hatzimanikatis, Vassily

    2015-07-28

    Noise in genetic networks has been the subject of extensive experimental and computational studies. However, very few of these studies have considered noise properties using mechanistic models that account for the discrete movement of ribosomes and RNA polymerases along their corresponding templates (messenger RNA (mRNA) and DNA). The large size of these systems, which scales with the number of genes, mRNA copies, codons per mRNA, and ribosomes, is responsible for some of the challenges. Additionally, one should be able to describe the dynamics of ribosome exchange between the free ribosome pool and those bound to mRNAs, as well as how mRNA species compete for ribosomes. We developed an efficient algorithm for stochastic simulations that addresses these issues and used it to study the contribution and trade-offs of noise to translation properties (rates, time delays, and rate-limiting steps). The algorithm scales linearly with the number of mRNA copies, which allowed us to study the importance of genome-scale competition between mRNAs for the same ribosomes. We determined that noise is minimized under conditions maximizing the specific synthesis rate. Moreover, sensitivity analysis of the stochastic system revealed the importance of the elongation rate in the resultant noise, whereas the translation initiation rate constant was more closely related to the average protein synthesis rate. We observed significant differences between our results and the noise properties of the most commonly used translation models. Overall, our studies demonstrate that the use of full mechanistic models is essential for the study of noise in translation and transcription.
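A stripped-down version of such a discrete model (not the authors' algorithm; all rates and sizes are invented) can be written as a Gillespie-style simulation of ribosomes stepping along one mRNA with exclusion:

```python
import random

def simulate_mrna(n_codons=50, k_init=0.5, k_elong=5.0, t_end=500.0, seed=1):
    """Ribosomes initiate at codon 1, hop forward when the next codon is
    free (footprint of one codon, a simplification), and release a
    protein when stepping off the last codon."""
    random.seed(seed)
    occupied = set()          # codon positions currently holding a ribosome
    t, proteins = 0.0, 0
    while t < t_end:
        # Enabled reactions: initiation (if codon 1 free) plus every
        # ribosome whose next codon is free.
        reactions = []
        if 1 not in occupied:
            reactions.append(("init", k_init))
        reactions += [(p, k_elong) for p in occupied if p + 1 not in occupied]
        total = sum(rate for _, rate in reactions)
        t += random.expovariate(total)        # time to next event
        x = random.uniform(0.0, total)        # pick one reaction by weight
        for chosen, rate in reactions:
            x -= rate
            if x <= 0:
                break
        if chosen == "init":
            occupied.add(1)
        elif chosen == n_codons:              # terminating step
            occupied.remove(chosen)
            proteins += 1
        else:
            occupied.remove(chosen)
            occupied.add(chosen + 1)
    return proteins

counts = [simulate_mrna(seed=s) for s in range(20)]
mean_count = sum(counts) / len(counts)
```

Repeating the run across seeds, as in the last two lines, gives the protein-count distribution from which noise measures such as the variance-to-mean ratio are computed; the genome-scale version additionally shares a finite ribosome pool across many mRNAs.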

  2. Noise analysis of genome-scale protein synthesis using a discrete computational model of translation

    Energy Technology Data Exchange (ETDEWEB)

    Racle, Julien; Hatzimanikatis, Vassily, E-mail: vassily.hatzimanikatis@epfl.ch [Laboratory of Computational Systems Biotechnology, Ecole Polytechnique Fédérale de Lausanne (EPFL), CH-1015 Lausanne (Switzerland); Swiss Institute of Bioinformatics (SIB), CH-1015 Lausanne (Switzerland); Stefaniuk, Adam Jan [Laboratory of Computational Systems Biotechnology, Ecole Polytechnique Fédérale de Lausanne (EPFL), CH-1015 Lausanne (Switzerland)

    2015-07-28

    Noise in genetic networks has been the subject of extensive experimental and computational studies. However, very few of these studies have considered noise properties using mechanistic models that account for the discrete movement of ribosomes and RNA polymerases along their corresponding templates (messenger RNA (mRNA) and DNA). The large size of these systems, which scales with the number of genes, mRNA copies, codons per mRNA, and ribosomes, is responsible for some of the challenges. Additionally, one should be able to describe the dynamics of ribosome exchange between the free ribosome pool and those bound to mRNAs, as well as how mRNA species compete for ribosomes. We developed an efficient algorithm for stochastic simulations that addresses these issues and used it to study the contribution and trade-offs of noise to translation properties (rates, time delays, and rate-limiting steps). The algorithm scales linearly with the number of mRNA copies, which allowed us to study the importance of genome-scale competition between mRNAs for the same ribosomes. We determined that noise is minimized under conditions maximizing the specific synthesis rate. Moreover, sensitivity analysis of the stochastic system revealed the importance of the elongation rate in the resultant noise, whereas the translation initiation rate constant was more closely related to the average protein synthesis rate. We observed significant differences between our results and the noise properties of the most commonly used translation models. Overall, our studies demonstrate that the use of full mechanistic models is essential for the study of noise in translation and transcription.

  3. A model of relative translation and rotation in leader-follower spacecraft formations

    Directory of Open Access Journals (Sweden)

    Raymond Kristiansen

    2007-01-01

Full Text Available In this paper, a model of a leader-follower spacecraft formation in six degrees of freedom is derived and presented. The nonlinear model describes the relative translational and rotational motion of the spacecraft, and extends previous work by providing a more complete factorization, together with detailed information about the matrices in the model. The resulting model shows many similarities with models for systems such as robot manipulators and marine vehicles. In addition, mathematical models of orbital perturbations due to gravitational variations, atmospheric drag, solar radiation and third-body effects are presented for completeness. Results from simulations are presented to visualize the properties of the model and to show the impact of the different orbital perturbations on the flight path.
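The paper's full nonlinear 6-DOF model is not reproduced here. As context for the relative-translation part, a common linearized starting point (assuming a near-circular leader orbit) is the Clohessy-Wiltshire equations, sketched below with assumed orbit parameters for a low-Earth-orbit leader.

```python
import math

def cw_derivs(state, n):
    """Clohessy-Wiltshire linearized relative dynamics about a circular
    leader orbit with mean motion n. state = (x, y, z, vx, vy, vz) in
    the leader's local-vertical/local-horizontal frame."""
    x, y, z, vx, vy, vz = state
    ax = 3 * n * n * x + 2 * n * vy
    ay = -2 * n * vx
    az = -n * n * z
    return (vx, vy, vz, ax, ay, az)

def rk4_step(state, n, dt):
    """One classical Runge-Kutta 4 integration step."""
    def add(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = cw_derivs(state, n)
    k2 = cw_derivs(add(state, k1, dt / 2), n)
    k3 = cw_derivs(add(state, k2, dt / 2), n)
    k4 = cw_derivs(add(state, k3, dt), n)
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# assumed leader orbit: ~400 km altitude circular LEO
mu, a = 3.986e14, 6.778e6
n = math.sqrt(mu / a ** 3)               # mean motion [rad/s]
# drift-free initial condition vy0 = -2*n*x0 gives a closed relative orbit
state = (100.0, 0.0, 0.0, 0.0, -2 * n * 100.0, 0.0)
for _ in range(5000):                     # propagate 5000 s at 1 s steps
    state = rk4_step(state, n, 1.0)
```

With the drift-free initial condition, the follower traces a bounded 2x1 relative ellipse around the leader, which is the baseline behaviour the paper's perturbation models (drag, J2, third-body) then distort.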

  4. Writing Through: Practising Translation

    Directory of Open Access Journals (Sweden)

    Joel Scott

    2010-05-01

Full Text Available This essay exists as a segment in a line of study and writing practice that moves between a critical theory analysis of translation studies conceptions of language, and the practical questions of what those ideas might mean for contemporary translation and writing practice. Although the underlying preoccupation of this essay, and my more general line of inquiry, is translation studies and practice, in many ways translation is merely a way into a discussion on language. For this essay, translation is the threshold of language. But the two trails of the discussion never manage to elude each other, and these concatenations have informed two experimental translation methods, referred to here as Live Translations and Series Translations. Following the essay are a number of poems in translation, all of which come from Blanco Nuclear by the contemporary Spanish poet, Esteban Pujals Gesalí. The first group, the Live Translations, consists of transcriptions I made from audio recordings read in a public setting, in which the texts were translated in situ, either off the page of original Spanish-language poems, or through a process very much like that carried out by simultaneous translators, for which readings of the poems were played back to me through headphones at varying speeds to be translated before the audience. The translations collected are imperfect renderings, attesting to a moment in language practice rather than language objects. The second method involves an iterative translation process, by which three versions of any one poem are rendered, with varying levels of fluency, fidelity and servility. All three translations are presented one after the other as a series, with no version asserting itself as the primary translation.
These examples, as well as the translation methods themselves, are intended as preliminary experiments within an endlessly divergent continuum of potential methods and translations, and not as a complete representation of

  5. Rabbit models for the study of human atherosclerosis: from pathophysiological mechanisms to translational medicine.

    Science.gov (United States)

    Fan, Jianglin; Kitajima, Shuji; Watanabe, Teruo; Xu, Jie; Zhang, Jifeng; Liu, Enqi; Chen, Y Eugene

    2015-02-01

Laboratory animal models play an important role in the study of human diseases. Using appropriate animals is critical not only for basic research but also for the development of therapeutics and diagnostic tools. Rabbits are widely used for the study of human atherosclerosis. Because rabbits have a unique feature of lipoprotein metabolism (like humans but unlike rodents) and are sensitive to a cholesterol diet, rabbit models have not only provided many insights into the pathogenesis and development of human atherosclerosis but also made a great contribution to translational research. In fact, the rabbit was the first animal model used for studying human atherosclerosis, more than a century ago. Currently, three types of rabbit model are commonly used for the study of human atherosclerosis and lipid metabolism: (1) cholesterol-fed rabbits, (2) Watanabe heritable hyperlipidemic rabbits, analogous to human familial hypercholesterolemia due to genetic deficiency of LDL receptors, and (3) genetically modified (transgenic and knock-out) rabbits. Despite their importance, rabbit models are still used far less than the mouse, currently the most widely used laboratory animal model. In this review, we focus on the features of rabbit lipoprotein metabolism and the pathology of atherosclerotic lesions that make the rabbit an optimal model for human atherosclerotic disease, especially for translational medicine. For the sake of clarity, the review does not attempt to be completely inclusive, but instead summarizes substantial information concisely and provides a guideline for experiments using rabbits.

  6. Epilepsy Therapy Development: Technical and Methodological Issues in Studies with Animal Models

    Science.gov (United States)

    Galanopoulou, Aristea S.; Kokaia, Merab; Loeb, Jeffrey A.; Nehlig, Astrid; Pitkänen, Asla; Rogawski, Michael A.; Staley, Kevin J.; Whittemore, Vicky H.; Dudek, F. Edward

    2013-01-01

SUMMARY The search for new treatments for seizures, epilepsies and their comorbidities faces considerable challenges. This is partly due to gaps in our understanding of the etiology and pathophysiology of most forms of epilepsy. An additional challenge is the difficulty of predicting the efficacy, tolerability and impact of potential new treatments on epilepsies and comorbidities in humans, using the available resources. Here we provide a summary of the discussions and proposals of Working Group 2, as presented in the Joint American Epilepsy Society and International League Against Epilepsy Translational Workshop in London (September 2012). We propose methodological and reporting practices that will enhance the uniformity, reliability and reporting of early-stage preclinical studies with animal seizure and epilepsy models that aim to develop and evaluate new therapies for seizures or epilepsies, using multi-disciplinary approaches. The topics considered include: (a) implementation of better study design and reporting practices, (b) incorporation in the study design and analysis of covariants that may impact outcomes (including species, age, sex), (c) utilization of approaches to document target relevance, exposure and engagement by the tested treatment, (d) utilization of clinically relevant treatment protocols, (e) optimization of the use of video-EEG recordings to best meet the study goals, and (f) inclusion of outcome measures that address the tolerability of the treatment or study endpoints apart from seizures. We further discuss the different expectations for studies aiming to meet regulatory requirements to obtain approval for clinical testing in humans. Implementation of the rigorous practices discussed in this report will require considerable investment in time, funds and other research resources, which may create challenges for academic researchers seeking to contribute to epilepsy therapy discovery and development. We propose several infrastructure

  7. Epilepsy therapy development: technical and methodologic issues in studies with animal models.

    Science.gov (United States)

    Galanopoulou, Aristea S; Kokaia, Merab; Loeb, Jeffrey A; Nehlig, Astrid; Pitkänen, Asla; Rogawski, Michael A; Staley, Kevin J; Whittemore, Vicky H; Dudek, F Edward

    2013-08-01

    The search for new treatments for seizures, epilepsies, and their comorbidities faces considerable challenges. This is due in part to gaps in our understanding of the etiology and pathophysiology of most forms of epilepsy. An additional challenge is the difficulty in predicting the efficacy, tolerability, and impact of potential new treatments on epilepsies and comorbidities in humans, using the available resources. Herein we provide a summary of the discussions and proposals of the Working Group 2 as presented in the Joint American Epilepsy Society and International League Against Epilepsy Translational Workshop in London (September 2012). We propose methodologic and reporting practices that will enhance the uniformity, reliability, and reporting of early stage preclinical studies with animal seizure and epilepsy models that aim to develop and evaluate new therapies for seizures or epilepsies, using multidisciplinary approaches. The topics considered include the following: (1) implementation of better study design and reporting practices; (2) incorporation in the study design and analysis of covariants that may influence outcomes (including species, age, sex); (3) utilization of approaches to document target relevance, exposure, and engagement by the tested treatment; (4) utilization of clinically relevant treatment protocols; (5) optimization of the use of video-electroencephalography (EEG) recordings to best meet the study goals; and (6) inclusion of outcome measures that address the tolerability of the treatment or study end points apart from seizures. We further discuss the different expectations for studies aiming to meet regulatory requirements to obtain approval for clinical testing in humans. Implementation of the rigorous practices discussed in this report will require considerable investment in time, funds, and other research resources, which may create challenges for academic researchers seeking to contribute to epilepsy therapy discovery and

  8. Interval Methods for Model Qualification: Methodology and Advanced Application

    OpenAIRE

    Alexandre dit Sandretto, Julien; Trombettoni, Gilles; Daney, David

    2012-01-01

An actual model is often too complex to use in simulation or control applications, and is sometimes impossible to obtain. To handle a system in practice, a simplification of the real model is then necessary. This simplification rests on hypotheses made about the system or the modeling approach. In this paper, we deal with all models that can be expressed by real-valued variables involved in analytical relations and depending on parameters. We propose a method that qualifies the simplificatio...

  9. International orientation on methodologies for modelling developments in road safety.

    NARCIS (Netherlands)

    Reurings, M.C.B. & Commandeur, J.J.F.

    2007-01-01

    This report gives an overview of the models developed in countries other than the Netherlands to evaluate past developments in road traffic safety and to obtain estimates of these developments in the future. These models include classical linear regression and loglinear models as applied in Great Br

  10. New systematic methodology for incorporating dynamic heat transfer modelling in multi-phase biochemical reactors.

    Science.gov (United States)

    Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E

    2014-09-01

This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model component stoichiometry and formation enthalpies, the proposed methodology successfully integrates the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in the biochemical reactors. The methodology has been incorporated into a plant-wide modelling framework in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, two case studies are described as illustrative examples of the capability of the methodology. In the first, a predenitrification-nitrification dynamic process is analysed, with the aim of demonstrating how easily the methodology integrates into any system. In the second, the simulation of a thermal model for an ATAD shows the potential of the proposed methodology for analysing the effect of ventilation and influent characterization.
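The Hess's-law bookkeeping at the core of such a methodology can be sketched in a few lines. The species and formation enthalpies below are illustrative textbook values for methane combustion, not taken from the paper.

```python
def reaction_enthalpy(stoich, h_form):
    """Enthalpy change of a reaction via Hess's law:
    dH_rxn = sum(nu_i * dHf_i), with products carrying positive and
    reactants negative stoichiometric coefficients (kJ/mol)."""
    return sum(nu * h_form[species] for species, nu in stoich.items())

# illustrative standard formation enthalpies (kJ/mol)
h_form = {"CH4": -74.8, "O2": 0.0, "CO2": -393.5, "H2O(l)": -285.8}

# CH4 + 2 O2 -> CO2 + 2 H2O(l)
combustion = {"CH4": -1, "O2": -2, "CO2": 1, "H2O(l)": 2}
dH = reaction_enthalpy(combustion, h_form)  # negative: exothermic
```

In the paper's matrix formulation, the same sum is evaluated row-by-row over the stoichiometric matrix of each biochemical transformation, alongside the conventional mass balances.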

  11. Chronic early life stress induced by limited bedding and nesting (LBN) material in rodents: critical considerations of methodology, outcomes and translational potential.

    Science.gov (United States)

    Walker, Claire-Dominique; Bath, Kevin G; Joels, Marian; Korosi, Aniko; Larauche, Muriel; Lucassen, Paul J; Morris, Margaret J; Raineki, Charlis; Roth, Tania L; Sullivan, Regina M; Taché, Yvette; Baram, Tallie Z

    2017-07-12

    The immediate and long-term effects of exposure to early life stress (ELS) have been documented in humans and animal models. Even relatively brief periods of stress during the first 10 days of life in rodents can impact later behavioral regulation and the vulnerability to develop adult pathologies, in particular an impairment of cognitive functions and neurogenesis, but also modified social, emotional, and conditioned fear responses. The development of preclinical models of ELS exposure allows the examination of mechanisms and testing of therapeutic approaches that are not possible in humans. Here, we describe limited bedding and nesting (LBN) procedures, with models that produce altered maternal behavior ranging from fragmentation of care to maltreatment of infants. The purpose of this paper is to discuss important issues related to the implementation of this chronic ELS procedure and to describe some of the most prominent endpoints and consequences, focusing on areas of convergence between laboratories. Effects on the hypothalamic-pituitary adrenal (HPA) axis, gut axis and metabolism are presented in addition to changes in cognitive and emotional functions. Interestingly, recent data have suggested a strong sex difference in some of the reported consequences of the LBN paradigm, with females being more resilient in general than males. As both the chronic and intermittent variants of the LBN procedure have profound consequences on the offspring with minimal external intervention from the investigator, this model is advantageous ecologically and has a large translational potential. In addition to the direct effect of ELS on neurodevelopmental outcomes, exposure to adverse early environments can also have intergenerational impacts on mental health and function in subsequent generation offspring. Thus, advancing our understanding of the effect of ELS on brain and behavioral development is of critical concern for the health and wellbeing of both the current

  12. Spreadsheets Grow Up: Three Spreadsheet Engineering Methodologies for Large Financial Planning Models

    CERN Document Server

    Grossman, Thomas A

    2010-01-01

    Many large financial planning models are written in a spreadsheet programming language (usually Microsoft Excel) and deployed as a spreadsheet application. Three groups, FAST Alliance, Operis Group, and BPM Analytics (under the name "Spreadsheet Standards Review Board") have independently promulgated standardized processes for efficiently building such models. These spreadsheet engineering methodologies provide detailed guidance on design, construction process, and quality control. We summarize and compare these methodologies. They share many design practices, and standardized, mechanistic procedures to construct spreadsheets. We learned that a written book or standards document is by itself insufficient to understand a methodology. These methodologies represent a professionalization of spreadsheet programming, and can provide a means to debug a spreadsheet that contains errors. We find credible the assertion that these spreadsheet engineering methodologies provide enhanced productivity, accuracy and maintain...

  13. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    Science.gov (United States)

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  14. Testing for Equivalence: A Methodology for Computational Cognitive Modelling

    Science.gov (United States)

    Stewart, Terrence; West, Robert

    2010-12-01

    The equivalence test (Stewart and West, 2007; Stewart, 2007) is a statistical measure for evaluating the similarity between a model and the system being modelled. It is designed to avoid over-fitting and to generate an easily interpretable summary of the quality of a model. We apply the equivalence test to two tasks: Repeated Binary Choice (Erev et al., 2010) and Dynamic Stocks and Flows (Gonzalez and Dutt, 2007). In the first case, we find a broad range of statistically equivalent models (and win a prediction competition) while identifying particular aspects of the task that are not yet adequately captured. In the second case, we re-evaluate results from the Dynamic Stocks and Flows challenge, demonstrating how our method emphasizes the breadth of coverage of a model and how it can be used for comparing different models. We argue that the explanatory power of models hinges on numerical similarity to empirical data over a broad set of measures.

  15. OxLM: A Neural Language Modelling Framework for Machine Translation

    Directory of Open Access Journals (Sweden)

    Paul Baltescu

    2014-09-01

Full Text Available This paper presents an open source implementation of a neural language model for machine translation. Neural language models deal with the problem of data sparsity by learning distributed representations for words in a continuous vector space. The language modelling probabilities are estimated by projecting a word's context into the same space as the word representations and by assigning probabilities proportional to the distance between the words and the context's projection. Neural language models are notoriously slow to train and test. Our framework is designed with scalability in mind and provides two optional techniques for reducing the computational cost: the so-called class decomposition trick and a training algorithm based on noise contrastive estimation. Our models may be extended to incorporate direct n-gram features to learn weights for every n-gram in the training data. Our framework comes with wrappers for the cdec and Moses translation toolkits, allowing our language models to be incorporated as normalized features in their decoders (inside the beam search).
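The class decomposition trick mentioned above can be illustrated in a few lines: the vocabulary is partitioned into classes, and a word's probability factors into a class probability times a within-class probability, so normalisation never runs over the whole vocabulary. The toy vocabulary, classes and logit values below are invented for illustration and are not part of OxLM's API.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    mx = max(scores)
    exps = [math.exp(s - mx) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def class_factored_probs(class_scores, word_scores_by_class):
    """Class decomposition: p(w|h) = p(c(w)|h) * p(w|c(w), h).
    Normalising costs O(|C|) plus O(|V_c|) for one class, instead
    of O(|V|) over the full vocabulary."""
    p_class = softmax(class_scores)
    probs = {}
    for c, word_scores in word_scores_by_class.items():
        p_words = softmax([s for _, s in word_scores])
        for (w, _), pw in zip(word_scores, p_words):
            probs[w] = p_class[c] * pw
    return probs

probs = class_factored_probs(
    class_scores=[1.0, 0.2],  # invented logits for two word classes
    word_scores_by_class={0: [("the", 2.0), ("a", 1.0)],
                          1: [("cat", 0.5), ("dog", 0.3), ("sat", 0.1)]},
)
```

The factored probabilities still form a proper distribution over the whole vocabulary, which is what allows the model to be plugged into a decoder as a normalized feature.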

  16. The Oncopig Cancer Model: An Innovative Large Animal Translational Oncology Platform

    DEFF Research Database (Denmark)

    Schachtschneider, Kyle M.; Schwind, Regina M.; Newson, Jordan

    2017-01-01

    in bridging the gap between fundamental diagnostic and therapeutic discoveries and human clinical trials. Such animal models offer insights into all aspects of the basic science-clinical translational cancer research continuum (screening, detection, oncogenesis, tumor biology, immunogenicity, therapeutics......-the Oncopig Cancer Model (OCM)-as a next-generation large animal platform for the study of hematologic and solid tumor oncology. With mutations in key tumor suppressor and oncogenes, TP53R167H and KRASG12D , the OCM recapitulates transcriptional hallmarks of human disease while also exhibiting clinically...

  17. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
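As an example of the empirical (phenomenological) primary growth models the abstract contrasts with mechanistic and individual-based approaches, here is a sketch of Zwietering's reparameterised Gompertz curve; the parameter values are assumed for illustration.

```python
import math

def gompertz_log10_count(t, n0_log10, A, mu_max, lag):
    """Zwietering's reparameterised Gompertz primary growth model:
    log10 N(t) = log10 N0 + A * exp(-exp(mu_max*e/A * (lag - t) + 1)).
    A: asymptotic increase (log10 units), mu_max: maximum specific
    growth rate (log10 units/h), lag: lag time (h)."""
    e = math.e
    return n0_log10 + A * math.exp(-math.exp(mu_max * e / A * (lag - t) + 1))

# growth curve sampled every 4 h over 48 h, with illustrative parameters
curve = [gompertz_log10_count(t, n0_log10=3.0, A=6.0, mu_max=0.5, lag=2.0)
         for t in range(0, 49, 4)]
```

The curve reproduces the classic lag, exponential and stationary phases from just four fitted parameters, which is exactly the strength (and, per the SWOT analysis, the mechanistic weakness) of whole-system empirical models.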

  18. Towards a methodology for educational modelling: a case in educational assessment

    NARCIS (Netherlands)

    Giesbers, Bas; Van Bruggen, Jan; Hermans, Henry; Joosten-ten Brinke, Desirée; Burgers, Jan; Koper, Rob; Latour, Ignace

    2005-01-01

    Giesbers, B., van Bruggen, J., Hermans, H., Joosten-ten Brinke, D., Burgers, J., Koper, R., & Latour, I. (2007). Towards a methodology for educational modelling: a case in educational assessment. Educational Technology & Society, 10 (1), 237-247.

  19. A changing climate: impacts on human exposures to O3 using an integrated modeling methodology

    Science.gov (United States)

    Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposu...

  20. A holistic methodology for modeling consumer response to innovation.

    Science.gov (United States)

    Bagozzi, R P

    1983-01-01

    A general structural equation model for representing consumer response to innovation is derived and illustrated. The approach both complements and extends an earlier model proposed by Hauser and Urban. Among other benefits, the model is able to take measurement error into account explicitly, to estimate the intercorrelation among exogenous factors if these exist, to yield a unique solution in a statistical sense, and to test complex hypotheses (e.g., systems of relations, simultaneity, feedback) associated with the measurement of consumer responses and their impact on actual choice behavior. In addition, the procedures permit one to model environmental and managerially controllable stimuli as they constrain and influence consumer choice. Limitations of the procedures are discussed and related to existing approaches. Included in the discussion is a development of four generic response models designed to provide a framework for modeling how consumers behave and how managers might better approach the design of products, persuasive appeals, and other controllable factors in the marketing mix.

  1. Proposal for product development model focused on ce certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

Full Text Available This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of Product Certification of the European Community (CE). Furthermore, it presents a product development model comprising the steps in the models analyzed, including improved activities for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  2. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.

  3. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

Full Text Available This paper presents a software engineering approach to a research proposal for an expert system for scheduling in service systems, using methodologies and processes of software development. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We make UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology, identifying the actors, elements and interactions in the research process.

  4. Modeling Epistemic and Ontological Cognition: Philosophical Perspectives and Methodological Directions

    Science.gov (United States)

    Greene, Jeffrey A.; Azevedo, Roger A.; Torney-Purta, Judith

    2008-01-01

We propose an integration of aspects of several developmental and systems-of-beliefs models of personal epistemology. Qualitatively different positions, including realism, dogmatism, skepticism, and rationalism, are characterized according to individuals' beliefs across three dimensions in a model of epistemic and ontological cognition. This model…

  5. A methodology to calibrate pedestrian walker models using multiple objectives

    NARCIS (Netherlands)

    Campanella, M.C.; Daamen, W.; Hoogendoorn, S.P.

    2012-01-01

The application of walker models to simulate real situations requires accuracy in several traffic situations. One strategy for obtaining a generic model is to calibrate its parameters in several situations using multiple-objective functions in the optimization process. In this paper, we propose a general

  6. USEPA SHEDS MODEL: METHODOLOGY FOR EXPOSURE ASSESSMENT FOR WOOD PRESERVATIVES

    Science.gov (United States)

    A physically-based, Monte Carlo probabilistic model (SHEDS-Wood: Stochastic Human Exposure and Dose Simulation model for wood preservatives) has been applied to assess the exposure and dose of children to arsenic (As) and chromium (Cr) from contact with chromated copper arsenat...

  7. A Reordering Model Using a Source-Side Parse-Tree for Statistical Machine Translation

    Science.gov (United States)

    Hashimoto, Kei; Yamamoto, Hirofumi; Okuma, Hideo; Sumita, Eiichiro; Tokuda, Keiichi

    This paper presents a reordering model using a source-side parse-tree for phrase-based statistical machine translation. The proposed model is an extension of IST-ITG (imposing source tree on inversion transduction grammar) constraints. In the proposed method, the target-side word order is obtained by rotating nodes of the source-side parse-tree. We modeled the node rotation, monotone or swap, using word alignments based on a training parallel corpus and source-side parse-trees. The model efficiently suppresses erroneous target word orderings, especially global orderings. Furthermore, the proposed method conducts a probabilistic evaluation of target word reorderings. In English-to-Japanese and English-to-Chinese translation experiments, the proposed method resulted in a 0.49-point improvement (29.31 to 29.80) and a 0.33-point improvement (18.60 to 18.93) in word BLEU-4 compared with IST-ITG constraints, respectively. This indicates the validity of the proposed reordering model.
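The node-rotation idea can be sketched as follows: each internal node of a binary source parse-tree is either kept in monotone order or has its two children swapped, and reading off the leaves yields the target word order. The tiny parse tree and rotation choices below are invented for illustration, and the sketch omits the paper's probabilistic scoring of rotations.

```python
def reorder(node, choices):
    """ITG-style reordering: an internal node (id, left, right) is kept
    'monotone' or its children are 'swap'ped; a string is a leaf word.
    `choices` maps an internal node id to 'monotone' or 'swap'."""
    if isinstance(node, str):          # leaf: a source word
        return [node]
    node_id, left, right = node
    l = reorder(left, choices)
    r = reorder(right, choices)
    return r + l if choices.get(node_id) == "swap" else l + r

# source parse of "John reads books": (S John (VP reads books))
tree = ("S", "John", ("VP", "reads", "books"))
# a Japanese-like SOV order results from swapping inside the VP only
target = reorder(tree, {"VP": "swap"})  # ['John', 'books', 'reads']
```

In the paper, the monotone/swap decision at each node is not fixed by hand as here but modelled probabilistically from word alignments over a training parallel corpus.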

  8. Advancing Transdisciplinary and Translational Research Practice: Issues and Models of Doctoral Education in Public Health

    Directory of Open Access Journals (Sweden)

    Linda Neuhauser

    2007-01-01

    Finding solutions to complex health problems, such as obesity, violence, and climate change, will require radical changes in cross-disciplinary education, research, and practice. The fundamental determinants of health include many interrelated factors such as poverty, culture, education, environment, and government policies. However, traditional public health training has tended to focus more narrowly on diseases and risk factors, and has not adequately leveraged the rich contributions of sociology, anthropology, economics, geography, communication, political science, and other disciplines. Further, students are often not sufficiently trained to work across sectors to translate research findings into effective, large-scale sustainable actions. During the past 2 decades, national and international organizations have called for more effective interdisciplinary, transdisciplinary, and translational approaches to graduate education. Although it has been difficult to work across traditional academic boundaries, some promising models draw on pedagogical theory and feature cross-disciplinary training focused on real-world problems, linkage between research, professional practice, and community action, and cultivation of leadership skills. We describe the development of the Doctor of Public Health program at the University of California, Berkeley, USA and its efforts to improve transdisciplinary and translational research education. We stress the need for international collaboration to improve educational approaches and better evaluate their impact.

  9. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar...... of the essential energy-momentum spectrum and either the two-body threshold, if there are no exited isolated mass shells, or the one-body threshold pertaining to the first exited isolated mass shell, if it exists. For the model restricted to the vacuum and one-particle sectors, the absence of singular continuous...... spectrum is proven to hold globally and scattering theory of the model is studied using time-dependent methods, of which the main result is asymptotic completeness....

  10. Quantum Supremacy for Simulating a Translation-Invariant Ising Spin Model

    Science.gov (United States)

    Gao, Xun; Wang, Sheng-Tao; Duan, L.-M.

    2017-01-01

    We introduce an intermediate quantum computing model built from translation-invariant Ising-interacting spins. Despite being nonuniversal, the model cannot be classically efficiently simulated unless the polynomial hierarchy collapses. Equipped with the intrinsic single-instance-hardness property, a single fixed unitary evolution in our model is sufficient to produce classically intractable results, compared to several other models that rely on implementation of an ensemble of different unitaries (instances). We propose a feasible experimental scheme to implement our Hamiltonian model using cold atoms trapped in a square optical lattice. We formulate a procedure to certify the correct functioning of this quantum machine. The certification requires only a polynomial number of local measurements assuming measurement imperfections are sufficiently small.

  11. WRF Model Methodology for Offshore Wind Energy Applications

    Directory of Open Access Journals (Sweden)

    Evangelia-Maria Giannakopoulou

    2014-01-01

    Among the parameters that must be considered for an offshore wind farm development, the stability conditions of the marine atmospheric boundary layer (MABL) are of significant importance. Atmospheric stability is a vital parameter in wind resource assessment (WRA) due to its direct relation to wind and turbulence profiles. A better understanding of the stability conditions occurring offshore and of the interaction between MABL and wind turbines is needed. Accurate simulations of the offshore wind and stability conditions using mesoscale modelling techniques can lead to a more precise WRA. However, the use of any mesoscale model for wind energy applications requires a proper validation process to understand the accuracy and limitations of the model. For this validation process, the weather research and forecasting (WRF) model has been applied over the North Sea during March 2005. The sensitivity of the WRF model performance to the use of different horizontal resolutions, input datasets, PBL parameterisations, and nesting options was examined. Comparison of the model results with other modelling studies and with high quality observations recorded at the offshore measurement platform FINO1 showed that the ERA-Interim reanalysis data in combination with the 2.5-level MYNN PBL scheme satisfactorily simulate the MABL over the North Sea.

  12. Terminology and methodology in modelling for water quality management

    DEFF Research Database (Denmark)

    Carstensen, J.; Vanrolleghem, P.; Rauch, W.

    1997-01-01

    There is a widespread need for a common terminology in modelling for water quality management. This paper points out sources of confusion in the communication between researchers due to misuse of existing terminology or use of unclear terminology. The paper attempts to clarify the context...... of the most widely used terms for characterising models and within the process of model building. It is essential to the ever growing society of researchers within water quality management, that communication is eased by establishing a common terminology. This should not be done by giving broader definitions...

  13. Methodology and models in erosion research: discussion and conclusions

    National Research Council Canada - National Science Library

    Shellis, R P; Ganss, C; Ren, Y; Zero, D T; Lussi, A

    2011-01-01

    .... The prospects for clinical trials are also discussed. All models in erosion research require a number of choices regarding experimental conditions, study design and measurement techniques, and these general aspects are discussed first...

  14. Methodologies in the modeling of combined chemo-radiation treatments

    Science.gov (United States)

    Grassberger, C.; Paganetti, H.

    2016-11-01

    The variety of treatment options for cancer patients has increased significantly in recent years. Not only do we combine radiation with surgery and chemotherapy, new therapeutic approaches such as immunotherapy and targeted therapies are starting to play a bigger role. Physics has made significant contributions to radiation therapy treatment planning and delivery. In particular, treatment plan optimization using inverse planning techniques has improved dose conformity considerably. Furthermore, medical physics is often the driving force behind tumor control and normal tissue complication modeling. While treatment optimization and outcome modeling focus mainly on the effects of radiation, treatment modalities such as chemotherapy are treated independently or are even neglected entirely. This review summarizes the published efforts to model combined modality treatments combining radiation and chemotherapy. These models will play an increasing role in optimizing cancer therapy not only from a radiation and drug dosage standpoint, but also in terms of spatial and temporal optimization of treatment schedules.

  15. Methodology for physical modeling of melter electrode power plug

    Energy Technology Data Exchange (ETDEWEB)

    Heath, W.O.

    1984-09-01

    A method is presented for building and testing a one-third scale model of an electrode power plug used to supply up to 3000 amperes to a liquid fed ceramic melter. The method describes how a one-third scale model can be used to verify the ampacity of the power plug, the effectiveness of the power plug cooling system and the effect of the high amperage current on eddy current heating of rebar in the cell wall. Scale-up of the test data, including cooling air flow rate and pressure drop, temperature profiles, melter water jacket heat duty and electrical resistance is covered. The materials required to build the scale model are specified as well as scale surface finish and dimensions. The method for designing and testing a model power plug involves developing a way to recreate the thermal conditions including heat sources, sinks and boundary temperatures on a scale basis. The major heat sources are the molten glass in contact with the electrode, joule heat generation within the power plug, and eddy current heating of the wall rebar. The melting cavity heat source is modelled using a plate heater to provide radiant heat transfer to a geometrically similar, one-third scale electrode housed in a scale model of a melting cavity having a thermally and geometrically similar wall and floor. The joule heat generation within the power plug is simulated by passing electricity through the model power plug with geometrically similar rebar positioned to simulate the eddy heating phenomenon. The proposed model also features two forced air cooling circuits similar to those on the full design. The interaction of convective, natural and radiant heat transfer in the wall cooling circuit are considered. The cell environment and a melter water jacket, along with the air cooling circuits, constitute the heat sinks and are also simulated.

  16. A Review of Kinetic Modeling Methodologies for Complex Processes

    Directory of Open Access Journals (Sweden)

    de Oliveira Luís P.

    2016-05-01

    In this paper, kinetic modeling techniques for complex chemical processes are reviewed. After a brief historical overview of chemical kinetics, an overview is given of the theoretical background of kinetic modeling of elementary steps and of multistep reactions. Classic lumping techniques are introduced and analyzed. Two examples of lumped kinetic models (atmospheric gasoil hydrotreating and residue hydroprocessing) developed at IFP Energies nouvelles (IFPEN) are presented. The largest part of this review describes advanced kinetic modeling strategies, in which the molecular detail is retained, i.e. the reactions are represented between molecules or even subdivided into elementary steps. To be able to retain this molecular level throughout the kinetic model and the reactor simulations, several hurdles have to be cleared first: (i) the feedstock needs to be described in terms of molecules, (ii) large reaction networks need to be automatically generated, and (iii) a large number of rate equations with their rate parameters need to be derived. For these three obstacles, molecular reconstruction techniques, deterministic or stochastic network generation programs, and single-event micro-kinetics and/or linear free energy relationships have been applied at IFPEN, as illustrated by several examples of kinetic models for industrial refining processes.
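    The automatic generation of large reaction networks mentioned in this abstract can be illustrated with a toy breadth-first generator. Everything below is an invented placeholder, not IFPEN's chemistry: species are abstracted as (carbon count, double-bond count) pairs and the two rules are deliberately simplistic.

    ```python
    # Toy deterministic reaction-network generator: apply every rule to every
    # known species until no new species appear (closure). Species encoding
    # and rules are invented for illustration only.

    def hydrogenate(species):
        c, db = species
        return [[(c, db - 1)]] if db > 0 else []

    def crack(species):
        c, db = species
        # split roughly in half; double bonds stay on one fragment
        return [[(c // 2, db), (c - c // 2, 0)]] if c >= 4 else []

    def generate_network(seeds, rules):
        """Breadth-first closure over species; returns (species, reactions)."""
        species = set(seeds)
        reactions = []
        frontier = list(seeds)
        while frontier:
            s = frontier.pop()
            for rule in rules:
                for products in rule(s):
                    reactions.append((s, tuple(products)))
                    for p in products:
                        if p not in species:
                            species.add(p)
                            frontier.append(p)
        return species, reactions

    # Starting from one "octadiene-like" pseudo-species, the network closes
    # over all lighter, more saturated pseudo-species.
    species, reactions = generate_network([(8, 2)], [hydrogenate, crack])
    print(len(species), len(reactions))
    ```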

  17. Systematic reviews of animal models: methodology versus epistemology.

    Science.gov (United States)

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions.

  18. Systematic Reviews of Animal Models: Methodology versus Epistemology

    Directory of Open Access Journals (Sweden)

    Ray Greek, Andre Menache

    2013-01-01

    Full Text Available Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions.

  19. A Roadmap for Generating Semantically Enriched Building Models According to CityGML Model via Two Different Methodologies

    Science.gov (United States)

    Floros, G.; Solou, D.; Pispidikis, I.; Dimopoulou, E.

    2016-10-01

    The methodologies of 3D modeling have multiplied with the rapid advance of new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models, but also on their semantic features during the modeling procedure. As a result, the models thus generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow for the immediate conversion of the model's format, via semi-automatic procedures, with respect to the user's scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second includes the model's generation in CityEngine and its transformation into the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two methodologies are compared and specific characteristics are evaluated, in order to infer which is best applied depending on the project's purpose.

  20. Mathematical modelling of translational motion of rail-guided cart with suspended payload

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    In this paper, modelling of the translational motion of a transportation rail-guided cart with a rope-suspended payload is considered. The linearly moving cart, driven by a travel mechanism, is modelled as a discrete six-degrees-of-freedom (DOF) dynamic system. The hoisting mechanism for lowering and lifting the payload is considered and included in the dynamic model as a one-DOF system. Differential equations of motion of the cart elements are derived using Lagrangian dynamics and are solved for a set of real-life constant parameters of the cart. A two-sided interaction was observed between the swinging payload and the travel mechanism. Results for kinematical and force parameters of the system are obtained. A verification of the proposed model was conducted.
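    A drastically reduced version of such a system, a planar cart with a single pendulum payload (two DOF instead of the paper's six plus hoisting), illustrates the Lagrangian modelling approach. The equations are the textbook cart-pendulum ones, and all masses, lengths, and initial conditions are invented, not the paper's.

    ```python
    import math

    # Two-DOF sketch of a cart with a suspended payload, derived from the
    # standard cart-pendulum Lagrangian (a stand-in for the paper's 6-DOF
    # model). State: [x, x_dot, theta, theta_dot], theta from the vertical.
    M, m, l, g = 10.0, 2.0, 1.5, 9.81   # cart mass, payload mass, rope length

    def derivatives(state, force):
        x, xd, th, thd = state
        s, c = math.sin(th), math.cos(th)
        # (M+m)x'' + m*l*(th''*cos - th'^2*sin) = F;  l*th'' + x''*cos + g*sin = 0
        xdd = (force + m * s * (l * thd**2 + g * c)) / (M + m * s**2)
        thdd = -(xdd * c + g * s) / l
        return [xd, xdd, thd, thdd]

    def rk4_step(state, force, dt):
        def shift(base, k, h):
            return [b + h * ki for b, ki in zip(base, k)]
        k1 = derivatives(state, force)
        k2 = derivatives(shift(state, k1, dt / 2), force)
        k3 = derivatives(shift(state, k2, dt / 2), force)
        k4 = derivatives(shift(state, k3, dt), force)
        return [s + dt / 6 * (a + 2 * b + 2 * c + d)
                for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

    # Free swing from a 0.1 rad offset: the swinging payload drags the cart
    # back and forth -- the "two-sided interaction" noted in the abstract.
    state = [0.0, 0.0, 0.1, 0.0]
    for _ in range(2000):               # 2 s at dt = 1 ms
        state = rk4_step(state, 0.0, 0.001)
    ```

    With zero drive force, the total horizontal momentum of cart plus payload stays at zero, which is a quick sanity check on the derived equations.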

  1. Automated Translation and Thermal Zoning of Digital Building Models for Energy Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Nathaniel L. [Cornell University; McCrone, Colin J. [Cornell University; Walter, Bruce J. [Cornell University; Pratt, Kevin B. [Cornell University; Greenberg, Donald P. [Cornell University

    2013-08-26

    Building energy simulation is valuable during the early stages of design, when decisions can have the greatest impact on energy performance. However, preparing digital design models for building energy simulation typically requires tedious manual alteration. This paper describes a series of five automated steps to translate geometric data from an unzoned CAD model into a multi-zone building energy model. First, CAD input is interpreted as geometric surfaces with materials. Second, surface pairs defining walls of various thicknesses are identified. Third, normal directions of unpaired surfaces are determined. Fourth, space boundaries are defined. Fifth, optionally, settings from previous simulations are applied, and spaces are aggregated into a smaller number of thermal zones. Building energy models created quickly using this method can offer guidance throughout the design process.
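    Step two, identifying surface pairs that define walls of various thicknesses, might look roughly like the sketch below. The geometry representation and the thickness tolerances are assumptions for illustration, not the authors' implementation.

    ```python
    # Hypothetical sketch of wall-surface pairing: two planar surfaces become
    # a wall candidate when their normals are (nearly) opposite and the offset
    # between their planes lies within a plausible wall-thickness range.
    # Surfaces are (unit_normal, point_on_plane); tolerances are invented.

    MIN_T, MAX_T = 0.05, 0.60          # metres, assumed thickness bounds

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def pair_walls(surfaces):
        pairs = []
        for i, (n1, p1) in enumerate(surfaces):
            for j in range(i + 1, len(surfaces)):
                n2, p2 = surfaces[j]
                if dot(n1, n2) > -0.99:          # normals not opposite
                    continue
                offset = [b - a for a, b in zip(p1, p2)]
                t = abs(dot(offset, n1))         # separation along the normal
                if MIN_T <= t <= MAX_T:
                    pairs.append((i, j, t))
        return pairs

    surfaces = [
        ((0.0, 0.0, 1.0), (0.0, 0.0, 0.0)),     # lower face of a slab
        ((0.0, 0.0, -1.0), (0.0, 0.0, 0.30)),   # upper face, 0.30 m away
        ((1.0, 0.0, 0.0), (5.0, 0.0, 0.0)),     # unrelated wall face
    ]
    print(pair_walls(surfaces))  # [(0, 1, 0.3)]
    ```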

  2. Measuring Team Shared Understanding Using the Analysis-Constructed Shared Mental Model Methodology

    Science.gov (United States)

    Johnson, Tristan E.; O'Connor, Debra L.

    2008-01-01

    Teams are an essential part of successful performance in learning and work environments. Analysis-constructed shared mental model (ACSMM) methodology is a set of techniques where individual mental models are elicited and sharedness is determined not by the individuals who provided their mental models but by an analytical procedure. This method…

  3. A European test of pesticide-leaching models: methodology and major recommendations

    NARCIS (Netherlands)

    Vanclooster, M.; Boesten, J.J.T.I.; Trevisan, M.; Brown, C.D.; Capri, E.; Eklo, O.M.; Gottesbüren, B.; Gouy, V.; Linden, van der A.M.A.

    2000-01-01

    Testing of pesticide-leaching models is important in view of their increasing use in pesticide registration procedures in the European Union. This paper presents the methodology and major conclusions of a test of pesticide-leaching models. Twelve models simulating the vertical one-dimensional moveme

  4. Translating China

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Sidney Shapiro, an American-born translator famous for his translation of Chinese literary works, received the Lifetime Achievement Award in Translation by the Translators Association of China on December 2, 2010.

  5. A branch-and-bound methodology within algebraic modelling systems

    NARCIS (Netherlands)

    Bisschop, J.J.; Heerink, J.B.J.; Kloosterman, G.

    1998-01-01

    Through the use of application-specific branch-and-bound directives it is possible to find solutions to combinatorial models that would otherwise be difficult or impossible to find by just using generic branch-and-bound techniques within the framework of mathematical programming. {\\sc Minto} is an e

  6. Translation Theory 'Translated'

    DEFF Research Database (Denmark)

    Wæraas, Arild; Nielsen, Jeppe

    2016-01-01

    Translation theory has proved to be a versatile analytical lens used by scholars working from different traditions. On the basis of a systematic literature review, this study adds to our understanding of the ‘translations’ of translation theory by identifying the distinguishing features of the most common theoretical approaches to translation within the organization and management discipline: actor-network theory, knowledge-based theory, and Scandinavian institutionalism. Although each of these approaches already has borne much fruit in research, the literature is diverse and somewhat fragmented, but also overlapping. We discuss the ways in which the three versions of translation theory may be combined and enrich each other so as to inform future research, thereby offering a more complete understanding of translation in and across organizational settings.

  7. Discriminative feature-rich models for syntax-based machine translation.

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, Kevin R.

    2012-12-01

    This report describes the campus executive LDRD “Discriminative Feature-Rich Models for Syntax-Based Machine Translation,” which was an effort to foster a better relationship between Sandia and Carnegie Mellon University (CMU). The primary purpose of the LDRD was to fund the research of a promising graduate student at CMU; in this case, Kevin Gimpel was selected from the pool of candidates. This report gives a brief overview of Kevin Gimpel's research.

  8. Fear extinction as a model for translational neuroscience: ten years of progress.

    Science.gov (United States)

    Milad, Mohammed R; Quirk, Gregory J

    2012-01-01

    The psychology of extinction has been studied for decades. Approximately 10 years ago, however, there began a concerted effort to understand the neural circuits of extinction of fear conditioning, in both animals and humans. Progress during this period has been facilitated by a high degree of coordination between rodent and human researchers examining fear extinction. Here we review the major advances and highlight new approaches to understanding and exploiting fear extinction. Research in fear extinction could serve as a model for translational research in other areas of behavioral neuroscience.

  9. Understanding Translation

    DEFF Research Database (Denmark)

    Schjoldager, Anne Gram; Gottlieb, Henrik; Klitgård, Ida

    Understanding Translation is designed as a textbook for courses on the theory and practice of translation in general and of particular types of translation - such as interpreting, screen translation and literary translation. The aim of the book is to help you gain an in-depth understanding...... - translators, language teachers, translation users and literary, TV and film critics, for instance. Discussions focus on translation between Danish and English....

  11. Methodology Aspects of Quantifying Stochastic Climate Variability with Dynamic Models

    Science.gov (United States)

    Nuterman, Roman; Jochum, Markus; Solgaard, Anna

    2015-04-01

    The paleoclimatic records show that climate has changed dramatically through time. For the past few million years it has been oscillating between ice ages, with large parts of the continents covered with ice, and warm interglacial periods like the present one. It is commonly assumed that these glacial cycles are related to changes in insolation due to periodic changes in Earth's orbit around the Sun (Milankovitch theory). However, this relationship is far from understood. The insolation changes are so small that enhancing feedbacks must be at play. It might even be that the external perturbation only plays a minor role in comparison to internal stochastic variations or internal oscillations. This claim is based on several shortcomings in the Milankovitch theory: Prior to one million years ago, the duration of the glacial cycles was indeed 41,000 years, in line with the obliquity cycle of Earth's orbit. This duration changed at the so-called Mid-Pleistocene transition to approximately 100,000 years. Moreover, according to Milankovitch's theory the interglacial of 400,000 years ago should not have happened. Thus, while prior to one million years ago the pacing of these glacial cycles may be tied to changes in Earth's orbit, we do not understand the current magnitude and phasing of the glacial cycles. In principle it is possible that the glacial/interglacial cycles are not due to variations in Earth's orbit, but due to stochastic forcing or internal modes of variability. We present a new method and preliminary results for a unified framework using a fully coupled Earth System Model (ESM), in which the leading three ice age hypotheses will be investigated together. Was the waxing and waning of ice sheets due to an internal mode of variability, due to variations in Earth's orbit, or simply due to a low-order auto-regressive process (i.e., noise integrated by a system with memory)? The central idea is to use Generalized Linear Models (GLMs), which can handle both
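    The "noise integrated by a system with memory" alternative is conventionally illustrated by a first-order auto-regressive (AR(1), red-noise) process. The sketch below uses invented parameters and simply checks that the process reaches its theoretical stationary variance.

    ```python
    import random

    # AR(1) red-noise sketch: x_t = phi * x_{t-1} + eps_t. A system with
    # memory (phi close to 1) integrates white noise into slow, large
    # excursions that can mimic low-frequency climate variability.

    def ar1(phi, sigma, n, seed=0):
        rng = random.Random(seed)
        x, out = 0.0, []
        for _ in range(n):
            x = phi * x + rng.gauss(0.0, sigma)
            out.append(x)
        return out

    series = ar1(phi=0.9, sigma=1.0, n=200_000)
    mean = sum(series) / len(series)
    var = sum((v - mean) ** 2 for v in series) / len(series)
    # Stationary variance of AR(1) is sigma^2 / (1 - phi^2) ~= 5.26 here.
    print(round(var, 2))
    ```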

  12. Experimental liver fibrosis research: update on animal models, legal issues and translational aspects

    Science.gov (United States)

    2013-01-01

    Liver fibrosis is defined as excessive extracellular matrix deposition and is based on complex interactions between matrix-producing hepatic stellate cells and an abundance of liver-resident and infiltrating cells. Investigation of these processes requires in vitro and in vivo experimental work in animals. However, the use of animals in translational research will be increasingly challenged, at least in countries of the European Union, because of the adoption of new animal welfare rules in 2013. These rules will create an urgent need for optimized standard operating procedures regarding animal experimentation and improved international communication in the liver fibrosis community. This review gives an update on current animal models, techniques and underlying pathomechanisms with the aim of fostering a critical discussion of the limitations and potential of up-to-date animal experimentation. We discuss potential complications in experimental liver fibrosis and provide examples of how the findings of studies in which these models are used can be translated to human disease and therapy. In this review, we want to motivate the international community to design more standardized animal models which might help to address the legally requested replacement, refinement and reduction of animals in fibrosis research. PMID:24274743

  13. Early-Life Nutrition and Neurodevelopment: Use of the Piglet as a Translational Model.

    Science.gov (United States)

    Mudd, Austin T; Dilger, Ryan N

    2017-01-01

    Optimal nutrition early in life is critical to ensure proper structural and functional development of infant organ systems. Although pediatric nutrition historically has emphasized research on the relation between nutrition, growth rates, and gastrointestinal maturation, efforts increasingly have focused on how nutrition influences neurodevelopment. The provision of human milk is considered the gold standard in pediatric nutrition; thus, there is interest in understanding how functional nutrients and bioactive components in milk may modulate developmental processes. The piglet has emerged as an important translational model for studying neurodevelopmental outcomes influenced by pediatric nutrition. Given the comparable nutritional requirements and strikingly similar brain developmental patterns between young pigs and humans, the piglet is being used increasingly in developmental nutritional neuroscience studies. The piglet primarily has been used to assess the effects of dietary fatty acids and their accretion in the brain throughout neurodevelopment. However, recent research indicates that other dietary components, including choline, iron, cholesterol, gangliosides, and sialic acid, among other compounds, also affect neurodevelopment in the pig model. Moreover, novel analytical techniques, including but not limited to MRI, behavioral assessments, and molecular quantification, allow for a more holistic understanding of how nutrition affects neurodevelopmental patterns. By combining early-life nutritional interventions with innovative analytical approaches, opportunities abound to quantify factors affecting neurodevelopmental trajectories in the neonate. This review discusses research using the translational pig model with primary emphasis on early-life nutrition interventions assessing neurodevelopment outcomes, while also discussing nutritionally-sensitive methods to characterize brain maturation.

  14. A logic model for community engagement within the Clinical and Translational Science Awards consortium: can we measure what we model?

    Science.gov (United States)

    Eder, Milton Mickey; Carter-Edwards, Lori; Hurd, Thelma C; Rumala, Bernice B; Wallerstein, Nina

    2013-10-01

    The Clinical and Translational Science Award (CTSA) initiative calls on academic health centers to engage communities around a clinical research relationship measured ultimately in terms of public health. Among a few initiatives involving university accountability for advancing public interests, a small CTSA workgroup devised a community engagement (CE) logic model that organizes common activities within a university-community infrastructure to facilitate CE in research. Whereas the model focuses on the range of institutional CE inputs, it purposefully does not include an approach for assessing how CE influences research implementation and outcomes. Rather, with communities and individuals beginning to transition into new research roles, this article emphasizes studying CE through specific relationship types and assessing how expanded research teams contribute to the full spectrum of translational science.The authors propose a typology consisting of three relationship types-engagement, collaboration, and shared leadership-to provide a foundation for investigating community-academic contributions to the new CTSA research paradigm. The typology shifts attention from specific community-academic activities and, instead, encourages analyses focused on measuring the strength of relationships through variables like synergy and trust. The collaborative study of CE relationships will inform an understanding of CTSA infrastructure development in support of translational research and its goal, which is expressed in the logic model: better science, better answers, better population health.

  15. Animal models of gastrointestinal and liver diseases. Animal models of visceral pain: pathophysiology, translational relevance, and challenges.

    Science.gov (United States)

    Greenwood-Van Meerveld, Beverley; Prusator, Dawn K; Johnson, Anthony C

    2015-06-01

    Visceral pain describes pain emanating from the thoracic, pelvic, or abdominal organs. In contrast to somatic pain, visceral pain is generally vague, poorly localized, and characterized by hypersensitivity to a stimulus such as organ distension. Animal models have played a pivotal role in our understanding of the mechanisms underlying the pathophysiology of visceral pain. This review focuses on animal models of visceral pain and their translational relevance. In addition, the challenges of using animal models to develop novel therapeutic approaches to treat visceral pain will be discussed. Copyright © 2015 the American Physiological Society.

  16. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    Energy Technology Data Exchange (ETDEWEB)

    Knezevic, J.; Odoom, E.R

    2001-07-01

    A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of the classical (crisp) set theory because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because unlike the fault tree methodology, the use of Petri nets allows efficient simultaneous generation of minimal cut and path sets.
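    The Lambda-Tau part of this approach is easy to sketch: the standard gate expressions (shown here for a two-input OR gate) are evaluated over alpha-cuts of triangular fuzzy numbers, so that expert-supplied ranges for failure rates and repair times propagate to the system-level indices. The fuzzy values below are hypothetical and the vertex method is one simple choice for the interval arithmetic; this is an illustration of the technique, not the paper's implementation.

```python
from itertools import product

def alpha_cut(tfn, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, b, c) at level alpha."""
    a, b, c = tfn
    return (a + (b - a) * alpha, c - (c - b) * alpha)

def vertex_method(f, intervals):
    """Min/max of f over all corners of the interval box (vertex method)."""
    vals = [f(*corner) for corner in product(*intervals)]
    return (min(vals), max(vals))

# Standard Lambda-Tau expressions for a two-input OR gate:
#   lambda_OR = l1 + l2,  tau_OR = (l1*t1 + l2*t2) / (l1 + l2)
lam_or = lambda l1, l2: l1 + l2
tau_or = lambda l1, t1, l2, t2: (l1 * t1 + l2 * t2) / (l1 + l2)

# Hypothetical fuzzy failure rates (per hour) and repair times (hours)
l1, l2 = (0.8e-3, 1.0e-3, 1.2e-3), (1.5e-3, 2.0e-3, 2.5e-3)
t1, t2 = (4.0, 5.0, 6.0), (9.0, 10.0, 11.0)

for alpha in (0.0, 0.5, 1.0):
    lam = vertex_method(lam_or, [alpha_cut(l1, alpha), alpha_cut(l2, alpha)])
    tau = vertex_method(tau_or, [alpha_cut(l1, alpha), alpha_cut(t1, alpha),
                                 alpha_cut(l2, alpha), alpha_cut(t2, alpha)])
    print(f"alpha={alpha}: lambda_OR={lam}, tau_OR={tau}")
```

    At alpha = 1 the intervals collapse to the crisp (most likely) values, while lower alpha levels carry the expert-opinion spread through to the system indices.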

  17. Prototype integration of the joint munitions assessment and planning model with the OSD threat methodology

    Energy Technology Data Exchange (ETDEWEB)

    Lynn, R.Y.S.; Bolmarcich, J.J.

    1994-06-01

    The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.

  18. A multiscale approach to blast neurotrauma modeling:Part II: Methodology for inducing blast injury to in vitro models

    Directory of Open Access Journals (Sweden)

    Gwen B. Effgen

    2012-02-01

    Full Text Available Due to the prominent role of improvised explosive devices (IEDs) in wounding patterns of U.S. war-fighters in Iraq and Afghanistan, blast injury has risen to a new level of importance and is recognized to be a major cause of injuries to the brain. However, an injury risk-function for microscopic, macroscopic, behavioral, and neurological deficits has yet to be defined. While operational blast injuries can be very complex and thus difficult to analyze, a simplified blast injury model would facilitate studies correlating biological outcomes with blast biomechanics to define tolerance criteria. Blast-induced traumatic brain injury (bTBI) results from the translation of a shock wave in air, such as that produced by an IED, into a pressure wave within the skull-brain complex. Our blast injury methodology recapitulates this phenomenon in vitro, allowing for control of the injury biomechanics via a compressed-gas shock tube used in conjunction with a custom-designed, fluid-filled receiver that contains the living culture. The receiver converts the air shock wave into a fast-rising pressure transient with minimal reflections, mimicking the intracranial pressure history in blast. We have developed an organotypic hippocampal slice culture model that exhibits cell death when exposed to a 530 ± 17.7 kPa peak overpressure with a 1.026 ± 0.017 ms duration and 190 ± 10.7 kPa-ms impulse in-air. We have also injured a simplified in vitro model of the blood-brain barrier, which exhibits disrupted integrity immediately following exposure to 581 ± 10.0 kPa peak overpressure with a 1.067 ms ± 0.006 ms duration and 222 ± 6.9 kPa-ms impulse in-air. To better prevent and treat bTBI, both the initiating biomechanics and the ensuing pathobiology must be understood in greater detail. A well-characterized, in vitro model of bTBI, in conjunction with animal models, will be a powerful tool for developing strategies to mitigate the risks of bTBI.

  19. Improved methodology for developing cost uncertainty models for naval vessels

    OpenAIRE

    Brown, Cinda L.

    2008-01-01

    The purpose of this thesis is to analyze the probabilistic cost model currently in use by NAVSEA 05C to predict cost uncertainty in naval vessel construction and to develop a method that better predicts the ultimate cost risk. The data used to develop the improved approach is collected from analysis of the CG(X) class ship by NAVSEA 05C. The NAVSEA 05C cost risk factors are reviewed and analyzed to determine if different factors are better cost predictors. The impact of data elicitation, t...

  20. A Classification Methodology and Retrieval Model to Support Software Reuse

    Science.gov (United States)

    1988-01-01

    The abstract for this record is garbled in the source; the legible fragments reference probabilistic information retrieval (including Harter's model) and Caliban, an experimental IR system developed at the Swiss Federal Institute in which the user retrieves information by specifying a "virtual information item", a template describing the desired item.

  1. Methodological aspects of journaling a dynamic adjusting entry model

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská

    2011-01-01

    Full Text Available This paper expands the discussion of the importance and function of adjusting entries for loan receivables. Discussion of the cyclical development of adjusting entries, their negative impact on the business cycle and potential solutions has intensified during the financial crisis. These discussions are still ongoing and continue to be relevant to members of the professional public, banking regulators and representatives of international accounting institutions. The objective of this paper is to evaluate a method of journaling dynamic adjusting entries under current accounting law. It also expresses the authors’ opinions on the potential for consistently implementing basic accounting principles in journaling adjusting entries for loan receivables under a dynamic model.

  2. Fuel cycle assessment: A compendium of models, methodologies, and approaches

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    The purpose of this document is to profile analytical tools and methods which could be used in a total fuel cycle analysis. The information in this document provides a significant step towards: (1) Characterizing the stages of the fuel cycle. (2) Identifying relevant impacts which can feasibly be evaluated quantitatively or qualitatively. (3) Identifying and reviewing other activities that have been conducted to perform a fuel cycle assessment or some component thereof. (4) Reviewing the successes/deficiencies and opportunities/constraints of previous activities. (5) Identifying methods and modeling techniques/tools that are available, tested and could be used for a fuel cycle assessment.

  3. Translating landfill methane generation parameters among first-order decay models.

    Science.gov (United States)

    Krause, Max J; Chickering, Giles W; Townsend, Timothy G

    2016-11-01

    Landfill gas (LFG) generation is predicted by a first-order decay (FOD) equation that incorporates two parameters: a methane generation potential (L0) and a methane generation rate (k). Because non-hazardous waste landfills may accept many types of waste streams, multiphase models have been developed in an attempt to more accurately predict methane generation from heterogeneous waste streams. The ability of a single-phase FOD model to predict methane generation using weighted-average methane generation parameters and tonnages translated from multiphase models was assessed in two exercises. In the first exercise, waste composition from four Danish landfills represented by low-biodegradable waste streams was modeled in the Afvalzorg Multiphase Model and methane generation was compared to the single-phase Intergovernmental Panel on Climate Change (IPCC) Waste Model and LandGEM. In the second exercise, waste composition represented by IPCC waste components was modeled in the multiphase IPCC and compared to single-phase LandGEM and Australia's Solid Waste Calculator (SWC). In both cases, weight-averaging of methane generation parameters from waste composition data in single-phase models was effective in predicting cumulative methane generation from -7% to +6% of the multiphase models. The results underscore the understanding that multiphase models will not necessarily improve LFG generation prediction because the uncertainty of the method rests largely within the input parameters. A unique method of calculating the methane generation rate constant by mass of anaerobically degradable carbon was presented (kc) and compared to existing methods, providing a better fit in 3 of 8 scenarios. Generally, single phase models with weighted-average inputs can accurately predict methane generation from multiple waste streams with varied characteristics; weighted averages should therefore be used instead of regional default values when comparing models. 
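    The single-phase FOD calculation with weighted-average parameters described above can be sketched in a few lines. The waste-stream values below are hypothetical, and the model is reduced to a single waste placement so that cumulative generation has the closed form G(t) = M · L0 · (1 − e^(−kt)):

```python
import math

def fod_cumulative(mass, L0, k, t):
    """Cumulative CH4 generated t years after placing `mass` tonnes:
    G(t) = mass * L0 * (1 - exp(-k * t))."""
    return mass * L0 * (1.0 - math.exp(-k * t))

def weighted_single_phase(streams):
    """Mass-weighted average (L0, k) for a single-phase model.
    streams: list of (mass_tonnes, L0_m3_per_tonne, k_per_year)."""
    total = sum(m for m, _, _ in streams)
    L0 = sum(m * L for m, L, _ in streams) / total
    k = sum(m * kk for m, _, kk in streams) / total
    return total, L0, k

# Two hypothetical streams: slowly and rapidly degrading waste
streams = [(1000, 100, 0.05), (3000, 60, 0.20)]

# Multiphase prediction: one FOD curve per stream, summed
multi = sum(fod_cumulative(m, L, kk, 50) for m, L, kk in streams)

# Single-phase prediction with mass-weighted parameters
mass, L0, k = weighted_single_phase(streams)
single = fod_cumulative(mass, L0, k, 50)
print(f"50-yr CH4, multiphase: {multi:.0f} m^3; single-phase: {single:.0f} m^3")
```

    Because the methane generation potential is weighted by mass, the ultimate (t → ∞) generation of the single-phase model matches the multiphase sum exactly; only the timing differs, which is what keeps the cumulative predictions within a few percent of each other.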

  4. Literature Survey of previous research work in Models and Methodologies in Project Management

    OpenAIRE

    Ravinder Singh; Dr. Kevin Lano

    2014-01-01

    This paper provides a survey of the existing literature and research carried out in the area of project management using different models, methodologies, and frameworks. Project Management (PM) broadly means programme management, portfolio management, practice management, project management office, etc. A project management system has a set of processes, procedures, framework, methods, tools, methodologies, techniques, resources, etc. which are used to manage the full life cycle of projects. ...

  5. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for testing expected returns models are widely used in the international academic literature. However, these methods have not been used systematically in Brazil: empirical studies with Brazilian stock market data generally carry out only the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor, and 4-factor models using a predictive methodology with two steps – time-series and cross-sectional regressions – with standard errors obtained by the technique of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model over the 3-factor model, and of the 3-factor model over the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use a predictive methodology, the size and momentum effects seem not to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor in explaining expected returns. These findings raise some questions, mainly because of the novelty of the methodology in the local market and because this subject is still incipient and polemic in the Brazilian academic environment.
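    The two-step predictive methodology the article refers to can be sketched on synthetic single-factor (CAPM-style) data: time-series regressions estimate each asset's beta, monthly cross-sectional regressions of returns on those betas yield a series of slope estimates, and the Fama-MacBeth standard error is the standard deviation of that series divided by √T. All numbers below are simulated for illustration:

```python
import random
import statistics as st

random.seed(0)
T, N = 120, 10                    # months, assets
true_lambda = 0.6                 # assumed factor risk premium (% / month)
betas_true = [0.5 + 0.15 * i for i in range(N)]

# Simulated factor (market) returns and asset returns
market = [random.gauss(true_lambda, 4.0) for _ in range(T)]
returns = [[b * market[t] + random.gauss(0, 2.0) for b in betas_true]
           for t in range(T)]

def ols_slope(x, y):
    """Slope of a simple OLS regression of y on x."""
    mx, my = st.fmean(x), st.fmean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return num / sum((xi - mx) ** 2 for xi in x)

# Step 1: time-series regressions -> one beta estimate per asset
betas = [ols_slope(market, [returns[t][i] for t in range(T)])
         for i in range(N)]

# Step 2: cross-sectional regression each month -> series of risk premia
lambdas = [ols_slope(betas, returns[t]) for t in range(T)]

lam_hat = st.fmean(lambdas)               # estimated premium
se = st.stdev(lambdas) / T ** 0.5         # Fama-MacBeth standard error
print(f"lambda_hat = {lam_hat:.3f} (FM s.e. {se:.3f})")
```

    Because the monthly cross-sectional slopes are treated as an i.i.d. sample, their dispersion over time gives the standard error directly, which is the feature that makes the second step "predictive" rather than purely descriptive.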

  6. Methodology for modeling the microbial contamination of air filters.

    Directory of Open Access Journals (Sweden)

    Yun Haeng Joe

    Full Text Available In this paper, we propose a theoretical model to simulate microbial growth on contaminated air filters and entrainment of bioaerosols from the filters to an indoor environment. Air filter filtration and antimicrobial efficiencies, and effects of dust particles on these efficiencies, were evaluated. The number of bioaerosols downstream of the filter could be characterized according to three phases: initial, transitional, and stationary. In the initial phase, the number was determined by filtration efficiency, the concentration of dust particles entering the filter, and the flow rate. During the transitional phase, the number of bioaerosols gradually increased up to the stationary phase, at which point no further increase was observed. The antimicrobial efficiency and flow rate were the dominant parameters affecting the number of bioaerosols downstream of the filter in the transitional and stationary phase, respectively. It was found that the nutrient fraction of dust particles entering the filter caused a significant change in the number of bioaerosols in both the transitional and stationary phases. The proposed model would be a solution for predicting the air filter life cycle in terms of microbiological activity by simulating the microbial contamination of the filter.

  7. Methodological characteristics in establishing rat models of poststroke depression

    Institute of Scientific and Technical Information of China (English)

    Fuyou Liu; Shi Yang; Weiyin Chen; Jinyu Wang; Yi Tang; Guanxiang Zhu

    2006-01-01

    BACKGROUND: Ideal model of poststroke depression (PSD) may be induced in rats guided by the theoretical evidence that "primary endogenous mechanism" and "reactivity mechanism" theories for PSD in human being.OBJECTIVE: To investigate the feasibility of comprehensive methods to induce PSD models in rats.DESrGN: A randomized controlled animal trial.SETTING: Department of Neurology, Affiliated Hospital of Chengdu University of Traditional Chinese Medicine.MATERrALS: Male SD rats of SPF degree, weighing 350-500 g, were provided by the experimental animal center of Chengdu University of Traditional Chinese Medicine. The rats were raised for 1 week adaptively, then screened behaviorally by open-field test and passive avoidance test. Forty-five rats with close scores were randomly divided into normal control group (n =10), simple stroke group (n =10), stress group (n =10) and PSD group (n =15).METHODS: The experiments were carried out in the laboratory of Chengdu University of Traditional Chinese Medicine from July 2002 to February 2003. ① Rat models of focal cerebral ischemia were induced by thread embolization, then treated with separate raising and unpredictable stress to induce PSD models. ②The neurologic deficit was evaluated by Longa 5-grade standard (the higher the score, the severer the neurologic deficit) and horizontal round rod test (normal rat could stay on it for at least 3 minutes). ③ The behavioral changes of PSD rats were evaluated by the saccharin water test, open-field text and passive avoidance test,including the changes of interest, spontaneous and exploratory activities, etc. ④ The levels of monoamine neurotransmitters, including norepinephrine (NE), serotonin (5-HT) and dopamine, in brain were determined using fluorospectrophotometry.MAIN OUTCOME MEASURES: ① Score of Longa 5-grade standard; Stayed time in the horizontal round rod test;② Amount of saccharin water consumption; Open-field text: time stayed in the central square, times

  8. The virtue of translational PKPD modeling in drug discovery: selecting the right clinical candidate while sparing animal lives.

    Science.gov (United States)

    Bueters, Tjerk; Ploeger, Bart A; Visser, Sandra A G

    2013-09-01

    Translational pharmacokinetic-pharmacodynamic (PKPD) modeling has been fully implemented at AstraZeneca's drug discovery unit for central nervous system and pain indications to facilitate timely progression of the right compound to clinical studies, simultaneously assuring essential preclinical efficacy and safety knowledge. This review illustrates the impact of a translational PKPD paradigm with examples from drug discovery programs. Paradoxically, laboratory animal use decreased owing to better understanding of in vitro-in vivo relationships, optimized in vivo study designs, meta-analyses and hypothesis testing using simulations. From an ethical and effectivity perspective, we advocate that translational PKPD approaches should be implemented more broadly in drug discovery.

  9. Modeling menopause: The utility of rodents in translational behavioral endocrinology research.

    Science.gov (United States)

    Koebele, Stephanie V; Bimonte-Nelson, Heather A

    2016-05-01

    The human menopause transition and aging are each associated with an increase in a variety of health risk factors including, but not limited to, cardiovascular disease, osteoporosis, cancer, diabetes, stroke, sexual dysfunction, affective disorders, sleep disturbances, and cognitive decline. It is challenging to systematically evaluate the biological underpinnings associated with the menopause transition in the human population. For this reason, rodent models have been invaluable tools for studying the impact of gonadal hormone fluctuations and eventual decline on a variety of body systems. While it is essential to keep in mind that some of the mechanisms associated with aging and the transition into a reproductively senescent state can differ when translating from one species to another, animal models provide researchers with opportunities to gain a fundamental understanding of the key elements underlying reproduction and aging processes, paving the way to explore novel pathways for intervention associated with known health risks. Here, we discuss the utility of several rodent models used in the laboratory for translational menopause research, examining the benefits and drawbacks in helping us to better understand aging and the menopause transition in women. The rodent models discussed are ovary-intact, ovariectomy, and 4-vinylcyclohexene diepoxide for the menopause transition. We then describe how these models may be implemented in the laboratory, particularly in the context of cognition. Ultimately, we aim to use these animal models to elucidate novel perspectives and interventions for maintaining a high quality of life in women, and to potentially prevent or postpone the onset of negative health consequences associated with these significant life changes during aging. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. New temperature model of the Netherlands from new data and novel modelling methodology

    Science.gov (United States)

    Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik

    2017-04-01

    Interest in deep geothermal energy has grown in Western Europe in recent decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of subsurface temperature, together with the available flow from the reservoir, is an important factor that can determine the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of subsurface temperature based on thermal data and also on geological elements (Bonté et al, 2012). An outcome of this work was ThermoGIS, which uses the temperature model. This work is a revision of the model used in ThermoGIS. The improvements on the first model are multiple: we have improved not only the dataset used for calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The sedimentary description of the basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the modelling methodology: with b3t the calibration is made using not only the lithospheric parameters but also the thermal conductivity of the sediments. The result is a much more accurate definition of the parameters for the model and a perfected handling of the calibration process. The outcome is a precise and improved temperature model of the Netherlands. The thermal conductivity variation in the sediments, associated with the geometry of the layers, is an important driver of temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat

  11. Translational relevance of rodent models of hypothalamic-pituitary-adrenal function and stressors in adolescence

    Directory of Open Access Journals (Sweden)

    Cheryl M. McCormick

    2017-02-01

    Full Text Available Elevations in glucocorticoids that result from environmental stressors can have programming effects on brain structure and function when the exposure occurs during sensitive periods that involve heightened neural development. In recent years, adolescence has gained increasing attention as another sensitive period of development, a period in which pubertal transitions may increase the vulnerability to stressors. There are similarities in physical and behavioural development between humans and rats, and rats have been used effectively as an animal model of adolescence and the unique plasticity of this period of ontogeny. This review focuses on benefits and challenges of rats as a model for translational research on hypothalamic-pituitary-adrenal (HPA) function and stressors in adolescence, highlighting important parallels and contrasts between adolescent rats and humans, and we review the main stress procedures that are used in investigating HPA stress responses and their consequences in adolescence in rats. We conclude that a greater focus on timing of puberty as a factor in research in adolescent rats may increase the translational relevance of the findings.

  12. Coupling watersheds, estuaries and regional ocean through numerical modelling for Western Iberia: a novel methodology

    Science.gov (United States)

    Campuzano, Francisco; Brito, David; Juliano, Manuela; Fernandes, Rodrigo; de Pablo, Hilda; Neves, Ramiro

    2016-12-01

    An original methodology for integrating the water cycle from the rain water to the open ocean by numerical models was set up using an offline coupling technique. The different components of the water continuum, including watersheds, estuaries and ocean, for Western Iberia were reproduced using numerical components of the MOHID Water Modelling System (http://www.mohid.com). This set of models, when combined through this novel methodology, is able to fill information gaps, and to include, in a realistic mode, the fresh water inputs in terms of volume and composition, into a regional ocean model. The designed methodology is illustrated using the Tagus River, estuary and its region of fresh water influence as case study, and its performance is evaluated by means of river flow and salinity observations.

  13. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    In the present paper, an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as the occurrence...... if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable to uncertainty analysis of this application of an urban drainage model, although it proved quite difficult to get good fits for the whole time series....
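    The GLUE methodology itself is straightforward to sketch: sample parameter sets by Monte Carlo, score each against observations with an informal likelihood (here Nash-Sutcliffe efficiency), discard non-behavioural sets below a threshold, and form likelihood-weighted uncertainty bounds from the rest. The toy linear-reservoir model, noise level, and threshold below are assumptions for illustration, not the MOUSE setup used in the paper:

```python
import math
import random

random.seed(1)

def reservoir(k, t):
    """Toy linear-reservoir response to a unit input, outflow rate k (1/h)."""
    return k * math.exp(-k * t)

# Synthetic observations from an assumed 'true' k with 5% noise
k_true, t_obs = 0.35, [float(t) for t in range(1, 11)]
obs = [reservoir(k_true, t) * (1 + random.gauss(0, 0.05)) for t in t_obs]

def nse(sim, ob):
    """Nash-Sutcliffe efficiency, used here as the informal GLUE likelihood."""
    mo = sum(ob) / len(ob)
    return 1 - (sum((s - o) ** 2 for s, o in zip(sim, ob))
                / sum((o - mo) ** 2 for o in ob))

# Monte Carlo sampling; keep behavioural sets above an assumed threshold
behavioural = []
for _ in range(5000):
    k = random.uniform(0.05, 1.0)
    L = nse([reservoir(k, t) for t in t_obs], obs)
    if L > 0.7:
        behavioural.append((L, k))

# Likelihood-weighted 5-95% bounds on the parameter
behavioural.sort(key=lambda s: s[1])
total = sum(L for L, _ in behavioural)
cum, bounds = 0.0, []
for L, k in behavioural:
    cum += L / total
    if not bounds and cum >= 0.05:
        bounds.append(k)
    if len(bounds) == 1 and cum >= 0.95:
        bounds.append(k)
        break
print(f"behavioural sets: {len(behavioural)}, 5-95% bounds on k: {bounds}")
```

    The same weighting applied to the simulated flow series, rather than to the parameter, yields the prediction uncertainty bands typically reported in GLUE studies.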

  14. METHODOLOGICAL APPROACH AND MODEL ANALYSIS FOR IDENTIFICATION OF TOURIST TRENDS

    Directory of Open Access Journals (Sweden)

    Neven Šerić

    2015-06-01

    Full Text Available The draw and diversity of a destination's offer is an antecedent of growth in tourism visits. Destination supply is differentiated through new, specialised tourism products. The usual approach consists of forming specialised tourism products in accordance with the existing tourism destination image. Another approach, prevalent in the practice of developed tourism destinations, is based on innovating the destination supply in accordance with global tourism trends. For this purpose, it is advisable to choose a method for monitoring and analysing tourism trends. The goal is to determine the actual trends governing target markets, differentiating whims from trends during the tourism preseason. When considering the return on investment, modifying the destination's tourism offer on the basis of a tourism whim is a risky endeavour indeed. Adapting the destination's supply to tourism whims can result in a shifted image, one that is unable to ensure long-term interest and growth in tourist visits. With regard to tourism trend research and based on the research conducted, a model for evaluating tourism phenomena is proposed, one that determines whether a tourism phenomenon is a tourism trend or a tourism whim.

  15. Developing a new model for the invention and translation of neurotechnologies in academic neurosurgery.

    Science.gov (United States)

    Leuthardt, Eric C

    2013-01-01

    There is currently an acceleration of new scientific and technical capabilities that create new opportunities for academic neurosurgery. To engage these changing dynamics, the Center for Innovation in Neuroscience and Technology (CINT) was created on the premise that successful innovation of device-related ideas relies on collaboration between multiple disciplines. The CINT has created a unique model that integrates scientific, medical, engineering, and legal/business experts to participate in the continuum from idea generation to translation. To detail the method by which this model has been implemented in the Department of Neurological Surgery at Washington University in St. Louis and the experience that has been accrued thus far. The workflow is structured to enable cross-disciplinary interaction, both intramurally and extramurally between academia and industry. This involves a structured method for generating, evaluating, and prototyping promising device concepts. The process begins with the "invention session," which consists of a structured exchange between inventors from diverse technical and medical backgrounds. Successful ideas, which pass a separate triage mechanism, are then sent to industry-sponsored multidisciplinary fellowships to create functioning prototypes. After 3 years, the CINT has engaged 32 clinical and nonclinical inventors, resulting in 47 ideas, 16 fellowships, and 12 patents, for which 7 have been licensed to industry. Financial models project that if commercially successful, device sales could have a notable impact on departmental revenue. The CINT is a model that supports an integrated approach from the time an idea is created through its translational development. To date, the approach has been successful in creating numerous concepts that have led to industry licenses. In the long term, this model will create a novel revenue stream to support the academic neurosurgical mission.

  16. Towards A Model-Based Prognostics Methodology For Electrolytic Capacitors: A Case Study Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter...
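    A minimal version of the Kalman-filter idea behind such model-driven prognostics can be sketched for a scalar degradation state: filter noisy capacitance measurements against an assumed linear degradation model, then extrapolate the filtered state to a failure threshold to get remaining useful life (RUL). All parameter values below are hypothetical:

```python
import random

random.seed(2)

# Assumed degradation model: capacitance loses `drift` uF per cycle
C0, drift, threshold = 100.0, -0.05, 80.0
q, r = 1e-4, 0.25                 # process / measurement noise variances

# Simulate noisy capacitance measurements over 200 cycles
truth = [C0 + drift * k for k in range(200)]
meas = [c + random.gauss(0, r ** 0.5) for c in truth]

# Scalar Kalman filter tracking the degrading capacitance
x, p = meas[0], 1.0
for z in meas[1:]:
    x, p = x + drift, p + q       # predict with the degradation model
    K = p / (p + r)               # Kalman gain
    x, p = x + K * (z - x), (1 - K) * p

rul = (x - threshold) / -drift    # cycles until the failure threshold
print(f"filtered C = {x:.2f} uF, estimated RUL = {rul:.0f} cycles")
```

    A production prognostic model would also estimate the drift (and its uncertainty) as part of the state vector; the scalar form above is just the smallest filter that shows the predict-update-extrapolate loop.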

  17. Joint intelligence operations centers (JIOC) business process model & capabilities evaluation methodology

    OpenAIRE

    Schacher, Gordon; Irvine, Nelson; Hoyt, Roger

    2012-01-01

    A JIOC Business Process Model has been developed for use in evaluating JIOC capabilities. The model is described and depicted through OV5 and organization swim-lane diagrams. Individual intelligence activities diagrams are included. A JIOC evaluation methodology is described.

  18. Translation: An Integration of Cultures.

    Science.gov (United States)

    Mohanty, Niranjan

    1994-01-01

    Discusses translation in the Indian context. Posits that translation involves cultural transfer in addition to linguistic meaning. Shows that several established models of translation can accommodate the inclusion of cultural features. Illustrates this with two translations of Orissan poetry. Concludes that the translator is a creative agent in…

  19. Design, modeling and control of a novel multi functional translational-rotary micro ultrasonic motor

    Science.gov (United States)

    Tuncdemir, Safakcan

    The major goal of this thesis was to design and develop an actuator, which is capable of producing translational and rotary output motions in a compact structure with simple driving conditions, for the needs of small-scale actuators for micro robotic systems. Piezoelectric ultrasonic motors were selected as the target actuator schemes because of their unbeatable characteristics in the meso-scale range, which covers the structure sizes from hundred micrometers to ten millimeters and with operating ranges from few nanometers to centimeters. In order to meet the objectives and the design constraints, a number of key research tasks had to be undertaken. The design constraints and objectives were so stringent and entangled that none of the existing methods in literature could solve the research problems individually. Therefore, several unique methods were established to accomplish the research objectives. The methods produced novel solutions at every stage of design, development and modeling of the multi functional micro ultrasonic motor. Specifically, an ultrasonic motor utilizing slanted ceramics on a brass rod was designed. Because of the unique slanted ceramics design, longitudinal and torsional mode vibration modes could be obtained on the same structure. A ring shaped mobile element was loosely fitted on the metal rod stator. The mobile element moved in translational or rotational, depending on whether the vibration mode was longitudinal or torsional. A new ultrasonic motor drive method was required because none of the existing ultrasonic motor drive techniques were able to provide both output modes in a compact and cylindrical structure with the use of single drive source. By making use of rectangular wave drive signals, saw-tooth shaped displacement profile could be obtained at longitudinal and torsional resonance modes. Thus, inheriting the operating principle of smooth impact drive method, a new resonance type inertial drive was introduced. 

  20. Efficiency of Iranian Translation Syllabus at BA Level; Deficiency: A New Comprehensive Model

    Science.gov (United States)

    Sohrabi, Sarah; Rahimi, Ramin; Arjmandi, Masoume

    2015-01-01

    This study aims at investigating the practicality of the current curriculum for translation studies at the national level (the Iranian curriculum). It seeks a comprehensive picture of how translation students and teachers (university lecturers) view the current translation syllabus at BA level in Iran. A researcher-made CEQ questionnaire (Curriculum…

  1. Embedding Web-Based Statistical Translation Models in Cross-Language Information Retrieval

    NARCIS (Netherlands)

    Kraaij, W.; Nie, J.Y.; Simard, M.

    2003-01-01

    Although more and more language pairs are covered by machine translation (MT) services, there are still many pairs that lack translation resources. Cross-language information retrieval (CLIR) is an application that needs translation functionality of a relatively low level of sophistication, since
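
    A minimal sketch of the low-sophistication translation functionality CLIR can get by with: dictionary-based query translation, using translation probabilities as term weights. The Dutch-English entries and probabilities are invented for illustration, not taken from the paper.

```python
# Invented toy translation table: source term -> [(target term, probability)]
TRANSLATION_TABLE = {
    "huis":  [("house", 0.7), ("home", 0.3)],
    "prijs": [("price", 0.6), ("prize", 0.4)],
}

def translate_query(source_terms, min_prob=0.2):
    """Expand each source term into weighted target terms; unknown terms
    (e.g. proper nouns) pass through untranslated with weight 1.0."""
    weighted = {}
    for term in source_terms:
        for target, p in TRANSLATION_TABLE.get(term, [(term, 1.0)]):
            if p >= min_prob:
                weighted[target] = weighted.get(target, 0.0) + p
    return weighted
```

    The weighted target terms can then be fed to a standard monolingual retrieval engine, which is why the translation component can remain simple.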

  3. A parameter estimation and identifiability analysis methodology applied to a street canyon air pollution model

    DEFF Research Database (Denmark)

    Ottosen, Thor Bjørn; Ketzel, Matthias; Skov, Henrik

    2016-01-01

    Mathematical models are increasingly used in environmental science, thus increasing the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model – the Operational Street Pollution Model (OSPM®). To assess the predictive validity of the model, the data is split into an estimation and a prediction data set using two data splitting approaches, and data preparation techniques (clustering and outlier detection) are analysed. The sensitivity analysis, being part…

  4. Integration of process design and controller design for chemical processes using model-based methodology

    DEFF Research Database (Denmark)

    Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan; Gani, Rafiqul

    2010-01-01

    In this paper, a novel systematic model-based methodology for performing integrated process design and controller design (IPDC) for chemical processes is presented. The methodology uses a decomposition method to solve the IPDC, typically formulated as a mathematical programming (optimization with constraints) problem. Accordingly, the optimization problem is decomposed into four sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection and verification, which are relatively easier to solve. The methodology makes use of thermodynamic-process insights and the reverse design approach to arrive at the final process design–controller design decisions. The developed methodology is illustrated through the design of: (a) a single reactor, (b) a single separator, and (c) a reactor–separator–recycle system, and is shown to provide effective solutions…

  5. Novel Methodology for Functional Modeling and Simulation of Wireless Embedded Systems

    Directory of Open Access Journals (Sweden)

    Sosa Morales Emma

    2008-01-01

    A novel methodology is presented for the modeling and the simulation of wireless embedded systems. Tight interaction between the analog and the digital functionality makes the design and verification of such systems a real challenge. The applied methodology brings together the functional models of the baseband algorithms written in C language with the circuit descriptions at behavioral level in Verilog or Verilog-AMS for the system simulations in a single kernel environment. The physical layer of an ultrawideband system has been successfully modeled and simulated. The results confirm that this methodology provides a standardized framework in order to efficiently and accurately simulate complex mixed signal applications for embedded systems.

  6. Towards a Cognitive Handoff for the Future Internet: Model-driven Methodology and Taxonomy of Scenarios

    CERN Document Server

    Gonzalez-Horta, Francisco A; Ramirez-Cortes, Juan M; Martinez-Carballido, Jorge; Buenfil-Alpuche, Eldamira

    2011-01-01

    A cognitive handoff is a multipurpose handoff that achieves many desirable features simultaneously; e.g., seamlessness, autonomy, security, correctness, adaptability, etc. But, the development of cognitive handoffs is a challenging task that has not been properly addressed in the literature. In this paper, we discuss the difficulties of developing cognitive handoffs and propose a new model-driven methodology for their systematic development. The theoretical framework of this methodology is the holistic approach, the functional decomposition method, the model-based design paradigm, and the theory of design as scientific problem-solving. We applied the proposed methodology and obtained the following results: (i) a correspondence between handoff purposes and quantitative environment information, (ii) a novel taxonomy of handoff mobility scenarios, and (iii) an original state-based model representing the functional behavior of the handoff process.

  7. Translational atherosclerosis research: From experimental models to coronary artery disease in humans.

    Science.gov (United States)

    Gleissner, Christian A

    2016-05-01

    Atherosclerosis is the leading cause of death worldwide. Research on the pathophysiological mechanisms of atherogenesis has made tremendous progress over the past two decades. However, despite great advances there is still a lack of therapies that reduce adverse cardiovascular events to an acceptable degree. This review addresses successes, but also questions, challenges, and chances regarding the translation of basic science results into clinical practice, i.e. the capability to apply the results of basic and/or clinical research in order to design therapies suitable to improve patient outcome. Specifically, it discusses problems in translating findings from the most broadly used murine models of atherosclerosis into clinically feasible therapies and strategies potentially improving the results of clinical trials. Most likely, the key to success will be a multimodal approach employing novel imaging methods as well as large-scale screening tools, summarized as the "omics" approach. Using individually tailored therapies, plaque stabilization and regression could prevent adverse cardiovascular events, thereby improving the outcome of a large number of patients.

  8. Key-Aspects of Scientific Modeling Exemplified by School Science Models: Some Units for Teaching Contextualized Scientific Methodology

    Science.gov (United States)

    Develaki, Maria

    2016-01-01

    Models and modeling are core elements of scientific methods and consequently also are of key importance for the conception and teaching of scientific methodology. The epistemology of models and its transfer and adaption to nature of science education are not, however, simple themes. We present some conceptual units in which school science models…

  9. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to analyse the implementation of the feedback processes while students discuss collaboratively in a specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship of written and asynchronous group interaction, and students' activity and changes incorporated into the final text. The model proposed includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain the model's operativity regarding how students incorporate such feedback into their essays.

  10. Translating Means Translating Meaning

    Institute of Scientific and Technical Information of China (English)

    李海燕

    2000-01-01

    The famous American translation theorist Eugene Nida said, "Translating means translating meaning." In essence, translation is the translating of meaning: expressing the meaning conveyed in one language in another language. Translation consists of two steps, comprehension and expression. Comprehension is the foundation of translation, while expression directly determines the success and quality of the translated text; neither can be dispensed with.

  11. Human Translator and Translation Technology

    Institute of Scientific and Technical Information of China (English)

    李辰

    2016-01-01

    With the rapid development of technology, translation technology exerts a great influence on human translators, who may use many computer-aided translation tools in their work, such as TRADOS, Snowman, WordFisher, etc. However, the concept of computer-aided translation is often misunderstood, so this thesis provides details about translation technology and human translators' strengths, in order to help translators improve the productivity and quality of their translation work effectively and efficiently.

  12. Introducing the Interactive Model for the Training of Audiovisual Translators and Analysis of Multimodal Texts

    Directory of Open Access Journals (Sweden)

    Pietro Luigi Iaia

    2015-07-01

    This paper introduces the ‘Interactive Model’ of audiovisual translation developed in the context of my PhD research on the cognitive-semantic, functional and socio-cultural features of the Italian-dubbing translation of a corpus of humorous texts. The Model is based on two interactive macro-phases – ‘Multimodal Critical Analysis of Scripts’ (MuCrAS) and ‘Multimodal Re-Textualization of Scripts’ (MuReTS). Its construction and application are justified by a multidisciplinary approach to the analysis and translation of audiovisual texts, so as to focus on the linguistic and extralinguistic dimensions affecting both the reception of source texts and the production of target ones (Chaume 2004; Díaz Cintas 2004). By resorting to Critical Discourse Analysis (Fairclough 1995, 2001), to a process-based approach to translation and to a socio-semiotic analysis of multimodal texts (van Leeuwen 2004; Kress and van Leeuwen 2006), the Model is meant to be applied to the training of audiovisual translators and discourse analysts in order to help them enquire into the levels of pragmalinguistic equivalence between the source and the target versions. Finally, a practical application is discussed, detailing the Italian rendering of a comic sketch from the American late-night talk show Conan.

  13. A PLM components monitoring framework for SMEs based on a PLM maturity model and FAHP methodology

    OpenAIRE

    Zhang, Haiqing; Sekhari, Aicha; Ouzrout, Yacine; Bouras, Abdelaziz

    2014-01-01

    Right PLM components selection and investments increase business advantages. This paper develops a PLM components monitoring framework to assess and guide PLM implementation in small and middle enterprises (SMEs). The framework builds upon PLM maturity models and decision-making methodology. PLM maturity model has the capability to analyze PLM functionalities and evaluate PLM components. A proposed PLM components maturity assessment (PCMA) model can obtain general maturity levels of PLM compo...

  14. Computational modelling for congenital heart disease: how far are we from clinical translation?

    Science.gov (United States)

    Biglino, Giovanni; Capelli, Claudio; Bruse, Jan; Bosi, Giorgia M; Taylor, Andrew M; Schievano, Silvia

    2017-01-01

    Computational models of congenital heart disease (CHD) have become increasingly sophisticated over the last 20 years. They can provide an insight into complex flow phenomena, allow for testing devices in patient-specific anatomies (pre- or post-CHD repair) and generate predictive data. This has been applied to different CHD scenarios, including patients with single ventricle, tetralogy of Fallot, aortic coarctation and transposition of the great arteries. Patient-specific simulations have been shown to be informative for preprocedural planning in complex cases, allowing for virtual stent deployment. Novel techniques such as statistical shape modelling can further aid in the morphological assessment of CHD, risk stratification of patients and possible identification of new ‘shape biomarkers’. Cardiovascular statistical shape models can provide valuable insights into phenomena such as ventricular growth in tetralogy of Fallot, or morphological aortic arch differences in repaired coarctation. In a constant move towards more realistic simulations, models can also account for multiscale phenomena (eg, thrombus formation) and importantly include measures of uncertainty (ie, CIs around simulation results). While their potential to aid understanding of CHD, surgical/procedural decision-making and personalisation of treatments is undeniable, important elements are still lacking prior to clinical translation of computational models in the field of CHD, that is, large validation studies, cost-effectiveness evaluation and establishing possible improvements in patient outcomes. PMID:27798056

  15. Development and translational imaging of a TP53 porcine tumorigenesis model.

    Science.gov (United States)

    Sieren, Jessica C; Meyerholz, David K; Wang, Xiao-Jun; Davis, Bryan T; Newell, John D; Hammond, Emily; Rohret, Judy A; Rohret, Frank A; Struzynski, Jason T; Goeken, J Adam; Naumann, Paul W; Leidinger, Mariah R; Taghiyev, Agshin; Van Rheeden, Richard; Hagen, Jussara; Darbro, Benjamin W; Quelle, Dawn E; Rogers, Christopher S

    2014-09-01

    Cancer is the second deadliest disease in the United States, necessitating improvements in tumor diagnosis and treatment. Current model systems of cancer are informative, but translating promising imaging approaches and therapies to clinical practice has been challenging. In particular, the lack of a large-animal model that accurately mimics human cancer has been a major barrier to the development of effective diagnostic tools along with surgical and therapeutic interventions. Here, we developed a genetically modified porcine model of cancer in which animals express a mutation in TP53 (which encodes p53) that is orthologous to one commonly found in humans (R175H in people, R167H in pigs). TP53(R167H/R167H) mutant pigs primarily developed lymphomas and osteogenic tumors, recapitulating the tumor types observed in mice and humans expressing orthologous TP53 mutant alleles. CT and MRI imaging data effectively detected developing tumors, which were validated by histopathological evaluation after necropsy. Molecular genetic analyses confirmed that these animals expressed the R167H mutant p53, and evaluation of tumors revealed characteristic chromosomal instability. Together, these results demonstrated that TP53(R167H/R167H) pigs represent a large-animal tumor model that replicates the human condition. Our data further suggest that this model will be uniquely suited for developing clinically relevant, noninvasive imaging approaches to facilitate earlier detection, diagnosis, and treatment of human cancers.

  16. A methodology for semiautomatic generation of finite element models: Application to mechanical devices

    Directory of Open Access Journals (Sweden)

    Jesús López

    2015-02-01

    In this work, a methodology to create parameterized finite element models is presented, particularly focusing on the development of suitable algorithms in order to generate models and meshes with high computational efficiency. The methodology is applied to the modeling of two common mechanical devices: an optical linear encoder and a gear transmission. This practical application constitutes a tough test of the methodology proposed, given the complexity and the large number of components that make up this high-precision measurement device and the singularity of the internal gears. The geometrical and mechanical particularities of the components lead to multidimensional modeling, seeking to ensure proper interaction between the different types of finite elements. Besides, modeling criteria to create components such as compression and torsion springs, sheet springs, bearings, or adhesive joints are also presented in the article. The last part of the work aims to validate the simulation results obtained with the methodology proposed against those derived from experimental tests through white noise base-driven vibration and hammer impact excitation modal analysis.

  17. Collective dynamics in atomistic models with coupled translational and spin degrees of freedom

    Science.gov (United States)

    Perera, Dilina; Nicholson, Don M.; Eisenbach, Markus; Stocks, G. Malcolm; Landau, David P.

    2017-01-01

    Using an atomistic model that simultaneously treats the dynamics of translational and spin degrees of freedom, we perform combined molecular and spin dynamics simulations to investigate the mutual influence of the phonons and magnons on their respective frequency spectra and lifetimes in ferromagnetic bcc iron. By calculating the Fourier transforms of the space- and time-displaced correlation functions, the characteristic frequencies and the linewidths of the vibrational and magnetic excitation modes were determined. Comparison of the results with those of the stand-alone molecular dynamics and spin dynamics simulations reveals that the dynamic interplay between the phonons and magnons leads to a shift in the respective frequency spectra and a decrease in the lifetimes. Moreover, in the presence of lattice vibrations, additional longitudinal magnetic excitations were observed with the same frequencies as the longitudinal phonons.
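
    The spectral-analysis step described above (Fourier transforming a time-displaced correlation function to read off a mode's characteristic frequency) can be illustrated on a synthetic signal. The damped cosine below stands in for a measured correlation function; the frequency `omega0` and damping `gamma` (whose inverse plays the role of the mode lifetime) are invented values.

```python
import numpy as np

dt = 0.01
t = np.arange(0, 200, dt)
omega0, gamma = 2.0, 0.05                 # mode frequency and damping (lifetime ~ 1/gamma)
corr = np.cos(omega0 * t) * np.exp(-gamma * t)   # stand-in correlation function

# Power spectrum of the correlation function
spectrum = np.abs(np.fft.rfft(corr)) ** 2
freqs = np.fft.rfftfreq(len(t), d=dt) * 2 * np.pi  # angular frequencies

peak = freqs[np.argmax(spectrum)]         # characteristic frequency of the mode
```

    In the same way, the width of the spectral peak (here set by `gamma`) gives the excitation's linewidth, i.e. the inverse lifetime that the combined simulations show decreasing when phonons and magnons interact.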

  18. A component modes projection and assembly model reduction methodology for articulated, multi-flexible body structures

    Science.gov (United States)

    Lee, Allan Y.; Tsuha, Walter S.

    1993-01-01

    A two-stage model reduction methodology, combining the classical Component Mode Synthesis (CMS) method and the newly developed Enhanced Projection and Assembly (EP&A) method, is proposed in this research. The first stage of this methodology, called the COmponent Modes Projection and Assembly model REduction (COMPARE) method, involves the generation of CMS mode sets, such as the MacNeal-Rubin mode sets. These mode sets are then used to reduce the order of each component model in the Rayleigh-Ritz sense. The resultant component models are then combined to generate reduced-order system models at various system configurations. A composite mode set which retains important system modes at all system configurations is then selected from these reduced-order system models. In the second stage, the EP&A model reduction method is employed to reduce further the order of the system model generated in the first stage. The effectiveness of the COMPARE methodology has been successfully demonstrated on a high-order, finite-element model of the cruise-configured Galileo spacecraft.
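
    The Rayleigh-Ritz projection at the heart of the COMPARE method's first stage can be sketched in a few lines: project a component's stiffness matrix onto a retained set of low-frequency mode shapes and verify that the reduced model reproduces them. The 10-DOF spring-mass chain is an invented toy component; keeping exact eigenvectors makes the reduction exact here, whereas real CMS mode sets (e.g. MacNeal-Rubin) are only approximate.

```python
import numpy as np

n, kept = 10, 4
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # chain stiffness matrix
M = np.eye(n)                                            # unit masses

# Full-model eigensolution (M = I, so a standard symmetric eigenproblem)
eigvals, eigvecs = np.linalg.eigh(K)

# Rayleigh-Ritz projection onto the lowest `kept` mode shapes
Phi = eigvecs[:, :kept]
K_red = Phi.T @ K @ Phi          # reduced stiffness (kept x kept)
M_red = Phi.T @ M @ Phi          # reduced mass (identity here, modes are orthonormal)

reduced_eigvals = np.linalg.eigvalsh(K_red)
```

    The payoff is the drop in size: system assembly and the second-stage EP&A reduction then operate on `kept`-dimensional component models instead of the full finite-element order.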

  19. An overview of models, methods, and reagents developed for translational autoimmunity research in the common marmoset (Callithrix jacchus)

    NARCIS (Netherlands)

    S.A. Jagessar (Anwar); M.P.M. Vierboom (Michel); E. Blezer (Erwin); J. Bauer; B.A. 't Hart (Bert); Y.S. Kap (Yolanda)

    2013-01-01

    textabstractThe common marmoset (Callithrix jacchus) is a small-bodied Neotropical primate and a useful preclinical animal model for translational research into autoimmune-mediated inflammatory diseases (AIMID), such as rheumatoid arthritis (RA) and multiple sclerosis (MS). The animal model for MS e

  20. An Overview of Models, Methods, and Reagents Developed for Translational Autoimmunity Research in the Common Marmoset (Callithrix jacchus)

    NARCIS (Netherlands)

    Jagessar, S. Anwar; Vierboom, Michel; Blezer, Erwin L. A.; Bauer, Jan; 't Hart, Bert A.; Kap, Yolanda S.

    2013-01-01

    The common marmoset (Callithrix jacchus) is a small-bodied Neotropical primate and a useful preclinical animal model for translational research into autoimmune-mediated inflammatory diseases (AIMID), such as rheumatoid arthritis (RA) and multiple sclerosis (MS). The animal model for MS established i

  1. 回归译本对比——鲁译研究方法论刍议 (Return to Contrastive Analysis: The Research Methodology of Lu Xun's Translation)

    Institute of Scientific and Technical Information of China (English)

    陈红

    2012-01-01

    Research on Lu Xun's translations suffers from methodological problems: neglect of contrastive analysis of the translated texts, a one-sided focus on target-language texts, and misuse of source-language texts. As a result, many conclusions are unreliable, lack support from concrete examples, and fail to stimulate deeper research. Besides the sheer quantity of Lu Xun's translations and the variety of their genres, the complexity of the translations themselves and the identities and academic backgrounds of the researchers are also important reasons behind this phenomenon.

  2. Machine Translation

    Institute of Scientific and Technical Information of China (English)

    张严心

    2015-01-01

    As an ancillary translation tool, Machine Translation has long received increasing attention and study from a great number of researchers and scholars. Knowing the definition of Machine Translation and analysing its benefits and problems is significant for translators in order to make good use of it, and helpful for developing and improving Machine Translation systems in the future.

  3. From LCAs to simplified models: a generic methodology applied to wind power electricity.

    Science.gov (United States)

    Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle

    2013-02-05

    This study presents a generic methodology to produce simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse gas (GHG) performances have been plotted as a function of these identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, thus avoiding an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them with a robust but simple support tool for assessing the environmental performance of energy systems.
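
    As a sketch of the kind of simplified model such a methodology produces, GHG intensity can be written as a function of only the two key parameters identified (load factor and lifetime). The turbine rating and embodied-emissions figure below are invented illustration values, not results from the study.

```python
def ghg_intensity(load_factor, lifetime_years,
                  rated_power_kw=2000.0, embodied_kg_co2=1.5e6):
    """Life-cycle GHG intensity in g CO2-eq per kWh produced.

    Total embodied emissions are spread over the electricity generated
    during the turbine's lifetime (8760 hours per year at the given
    capacity/load factor).
    """
    kwh_produced = rated_power_kw * 8760.0 * load_factor * lifetime_years
    return embodied_kg_co2 * 1000.0 / kwh_produced
```

    Better sites (higher load factor) and longer lifetimes lower the per-kWh burden, which is exactly the dependence the fitted curves capture without rerunning a full LCA.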

  4. Translational pain research: Evaluating analgesic effect in experimental visceral pain models

    Institute of Scientific and Technical Information of China (English)

    Anne Estrup Olesen; Trine Andresen; Lona Louring Christrup; Richard N Upton

    2009-01-01

    Deep visceral pain is frequent and presents major challenges in pain management, since its pathophysiology is still poorly understood. One way to optimize treatment of visceral pain is to improve knowledge of the mechanisms behind the pain and the mode of action of analgesic substances. This can be achieved through standardized experimental human pain models. Experimental pain models in healthy volunteers are advantageous for evaluation of analgesic action, as this is often difficult to assess in the clinic because of confounding factors such as sedation, nausea and general malaise. These pain models help minimize the gap between knowledge gained in animal studies and human clinical studies. Combining experimental pain studies and pharmacokinetic studies can improve understanding of the pharmacokinetic-pharmacodynamic relationship of analgesics and, thus, provide valuable insight into optimal clinical treatment of visceral pain. To improve treatment of visceral pain, it is important to study the underlying mechanisms of pain and the action of the analgesics used for its treatment. An experimental pain model activates different modalities and can be used to investigate the mechanism of action of different analgesics in detail. In combination with pharmacokinetic studies and objective assessment such as electroencephalography, new information regarding a given drug substance and its effects can be obtained. Results from experimental human visceral pain research can bridge the gap in knowledge between animal studies and the clinical condition of patients suffering from visceral pain, and thus constitute the missing link in translational pain research.

  5. Cognitive Dysfunction in Major Depressive Disorder. A Translational Review in Animal Models of the Disease.

    Science.gov (United States)

    Darcet, Flavie; Gardier, Alain M; Gaillard, Raphael; David, Denis J; Guilloux, Jean-Philippe

    2016-02-17

    Major Depressive Disorder (MDD) is the most common psychiatric disease, affecting millions of people worldwide. In addition to the well-defined depressive symptoms, patients suffering from MDD consistently complain about cognitive disturbances, significantly exacerbating the burden of this illness. Among cognitive symptoms, impairments in attention, working memory, learning and memory or executive functions are often reported. However, available data about the heterogeneity of MDD patients and the magnitude of cognitive symptoms through the different phases of MDD remain difficult to summarize. Thus, the first part of this review briefly overviews clinical studies, focusing on the cognitive dysfunctions depending on the MDD type. As animal models are essential translational tools for underpinning the mechanisms of cognitive deficits in MDD, the second part of this review synthesizes preclinical studies observing cognitive deficits in different rodent models of anxiety/depression. For each cognitive domain, we determined whether deficits could be shared across models. In particular, we established whether specific stress-related procedures or unspecific criteria (such as species, sex or age) could segregate common cognitive alterations across models. Finally, the role of adult hippocampal neurogenesis in rodents in cognitive dysfunctions during the MDD state is also discussed.

  6. Caenorhabditis elegans as a model system to study post-translational modifications of human transthyretin

    Science.gov (United States)

    Henze, Andrea; Homann, Thomas; Rohn, Isabelle; Aschner, Michael; Link, Christopher D.; Kleuser, Burkhard; Schweigert, Florian J.; Schwerdtle, Tanja; Bornhorst, Julia

    2016-11-01

    The visceral protein transthyretin (TTR) is frequently affected by oxidative post-translational protein modifications (PTPMs) in various diseases. Thus, better insight into structure-function relationships due to oxidative PTPMs of TTR should contribute to the understanding of pathophysiologic mechanisms. While the in vivo analysis of TTR in mammalian models is complex, time- and resource-consuming, transgenic Caenorhabditis elegans expressing hTTR provide an optimal model for the in vivo identification and characterization of drug-mediated oxidative PTPMs of hTTR by means of matrix assisted laser desorption/ionization – time of flight – mass spectrometry (MALDI-TOF-MS). Herein, we demonstrated that hTTR is expressed in all developmental stages of Caenorhabditis elegans, enabling the analysis of hTTR metabolism during the whole life-cycle. The suitability of the applied model was verified by exposing worms to D-penicillamine and menadione. Both drugs induced substantial changes in the oxidative PTPM pattern of hTTR. Additionally, for the first time a covalent binding of both drugs with hTTR was identified and verified by molecular modelling.

  7. Developing a methodology to predict PM10 concentrations in urban areas using generalized linear models.

    Science.gov (United States)

    Garcia, J M; Teodoro, F; Cerdeira, R; Coelho, L M R; Kumar, Prashant; Carvalho, M G

    2016-09-01

    A methodology to predict PM10 concentrations in urban outdoor environments is developed based on generalized linear models (GLMs). The methodology is based on the relationship developed between atmospheric concentrations of air pollutants (i.e. CO, NO2, NOx, VOCs, SO2) and meteorological variables (i.e. ambient temperature, relative humidity (RH) and wind speed) for the city of Barreiro, Portugal. The model uses air pollution and meteorological data from the Portuguese air quality monitoring station networks. The developed GLM considers PM10 concentrations as the dependent variable, and both the gaseous pollutants and meteorological variables as explanatory independent variables. A logarithmic link function was considered with a Poisson probability distribution. Particular attention was given to cases with air temperatures both below and above 25°C. The best performance of modelled results against measured data was achieved by the model restricted to air temperatures above 25°C, compared with the model considering all ranges of air temperature and the model considering only temperatures below 25°C. The model was also tested with similar data from another Portuguese city, Oporto, and the results were found to behave similarly. It is concluded that this model and methodology could be adopted for other cities to predict PM10 concentrations when such data are not available from measurements at air quality monitoring stations or by other means.
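
    The model family described (Poisson distribution with a logarithmic link, PM10 as the dependent variable) can be sketched with a self-contained iteratively reweighted least squares (IRLS) fit. The synthetic predictors standing in for a gaseous pollutant and a meteorological variable, and all coefficient values, are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n),           # intercept
                     rng.normal(size=n),   # standardized NO2 (stand-in)
                     rng.normal(size=n)])  # standardized temperature (stand-in)
beta_true = np.array([3.0, 0.4, -0.2])
y = rng.poisson(np.exp(X @ beta_true))     # PM10-like count observations

# Poisson GLM with log link, fitted by IRLS
beta, *_ = np.linalg.lstsq(X, np.log(y + 1.0), rcond=None)  # rough starting point
for _ in range(25):
    mu = np.exp(X @ beta)                  # mean under the log link
    z = X @ beta + (y - mu) / mu           # working response
    W = mu                                 # Poisson working weights
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
```

    After convergence `beta` recovers the generating coefficients; with real monitoring data the same fit yields the pollutant and meteorology coefficients used for prediction.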

  8. Do Methodological Choices in Environmental Modeling Bias Rebound Effects? : A Case Study on Electric Cars

    NARCIS (Netherlands)

    Font Vivanco, D.; Tukker, A.; Kemp, R.

    2016-01-01

    Improvements in resource efficiency often underperform because of rebound effects. Calculations of the size of rebound effects are subject to various types of bias, among which methodological choices have received particular attention. Modellers have primarily focused on choices related to changes i

  9. Have Cognitive Diagnostic Models Delivered Their Goods? Some Substantial and Methodological Concerns

    Science.gov (United States)

    Wilhelm, Oliver; Robitzsch, Alexander

    2009-01-01

    The paper by Rupp and Templin (2008) is an excellent work on the characteristics and features of cognitive diagnostic models (CDM). In this article, the authors comment on some substantial and methodological aspects of this focus paper. They organize their comments by going through issues associated with the terms "cognitive," "diagnostic" and…

  10. Development in methodologies for modelling of human and ecotoxic impacts in LCA

    DEFF Research Database (Denmark)

    Hauschild, Michael Zwicky; Huijbregts, Mark; Jolliet, Olivier;

    2009-01-01

    Under the UNEP-SETAC Life Cycle Initiative there is an aim to develop an internationally backed recommended practice of life cycle impact assessment addressing methodological issues like choice of characterization model and characterization factors. In this context, an international comparison wa...

  11. 3D Buildings Modelling Based on a Combination of Techniques and Methodologies

    NARCIS (Netherlands)

    Pop, G.; Bucksch, A.K.; Gorte, B.G.H.

    2007-01-01

    Three dimensional architectural models are more and more important for a large number of applications. Specialists look for faster and more precise ways to generate them. This paper discusses methods to combine methodologies for handling data acquired from multiple sources: maps, terrestrial laser a

  12. Analysis of Feedback Processes in Online Group Interaction: A Methodological Model

    Science.gov (United States)

    Espasa, Anna; Guasch, Teresa; Alvarez, Ibis M.

    2013-01-01

    The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the…

  13. Setting road safety targets in Cambodia : a methodological demonstration using the latent risk time series model.

    NARCIS (Netherlands)

    Commandeur, J.J.F.; Wesemann, P.; Bijleveld, F.D.; Chhoun, V. & Sann, S.

    2017-01-01

    The authors present the methodology used for estimating forecasts for the number of road traffic fatalities in 2011-2020 in Cambodia based on observed developments in Cambodian road traffic fatalities and motor vehicle ownership in the years 1995-2009. Using the latent risk time series model

  14. Literature Survey of previous research work in Models and Methodologies in Project Management

    Directory of Open Access Journals (Sweden)

    Ravinder Singh

    2014-09-01

    Full Text Available This paper provides a survey of the existing literature and research carried out in the area of project management using different models, methodologies, and frameworks. Project Management (PM) broadly encompasses programme management, portfolio management, practice management, the project management office, etc. A project management system comprises a set of processes, procedures, frameworks, methods, tools, methodologies, techniques, resources, etc., which are used to manage the full life cycle of projects. This also means creating risk, quality, performance, and other management plans to monitor and manage projects efficiently and effectively.

  15. Numerical Methodology for Metal Forming Processes Using Elastoplastic Model with Damage Occurrence

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Ductile damage often occurs during metal forming processes due to large thermo-elasto(visco)plastic flow localisation. This paper presents a numerical methodology which aims to virtually improve any metal forming process. The methodology is based on elastoplastic constitutive equations accounting for nonlinear mixed isotropic and kinematic hardening strongly coupled with isotropic ductile damage. An adaptive remeshing scheme based on geometrical and physical error estimates, including a kill-element procedure, is used. Some numerical results are presented to show the capability of the model to predict damage initiation and growth during metal forming processes.

  16. Towards a Pattern-Driven Topical Ontology Modeling Methodology in Elderly Care Homes

    Science.gov (United States)

    Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert; Pudkey, Kevin

    This paper presents a pattern-driven ontology modeling methodology, which is used to create topical ontologies in the human resource management (HRM) domain. An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). We use the Organization for Economic Co-operation and Development (OECD) and the National Vocational Qualification (NVQ) as resources to create the topical ontologies in this paper. The methodology is implemented in a tool called the PAD-ON suite. The approach is illustrated with a use case from elderly care homes in the UK.

  17. A new methodology for the development of high-latitude ionospheric climatologies and empirical models

    Science.gov (United States)

    Chisham, G.

    2017-01-01

    Many empirical models and climatologies of high-latitude ionospheric processes, such as convection, have been developed over the last 40 years. One common feature in the development of these models is that measurements from different times are combined and averaged on fixed coordinate grids. This methodology ignores the reality that high-latitude ionospheric features are organized relative to the location of the ionospheric footprint of the boundary between open and closed geomagnetic field lines (OCB). This boundary is in continual motion, and the polar cap that it encloses is continually expanding and contracting in response to changes in the rates of magnetic reconnection at the Earth's magnetopause and in the magnetotail. As a consequence, models that are developed by combining and averaging data in fixed coordinate grids heavily smooth the variations that occur near the boundary location. Here we propose that the development of future models should consider the location of the OCB in order to more accurately model the variations in this region. We present a methodology which involves identifying the OCB from spacecraft auroral images and then organizing measurements in a grid where the bins are placed relative to the OCB location. We demonstrate the plausibility of this methodology using ionospheric vorticity measurements made by the Super Dual Auroral Radar Network radars and OCB measurements from the IMAGE spacecraft FUV auroral imagers. This demonstration shows that this new methodology results in sharpening and clarifying features of climatological maps near the OCB location. We discuss the potential impact of this methodology on space weather applications.
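The core of the proposed methodology, binning each measurement by its offset from the instantaneous OCB location rather than on a fixed latitude grid, can be sketched as below. All data here are synthetic stand-ins for the SuperDARN vorticity and IMAGE FUV boundary measurements; the bin width and latitude ranges are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
meas_lat = rng.uniform(60.0, 90.0, n)  # magnetic latitude of each measurement
ocb_lat = rng.normal(72.0, 2.0, n)     # OCB latitude observed at the same time
vorticity = rng.normal(0.0, 1.0, n)    # stand-in for ionospheric vorticity values

# Organize measurements relative to the boundary instead of a fixed grid
offset = meas_lat - ocb_lat
bins = np.arange(-10, 11, 2)           # 2-degree bins about the OCB
idx = np.digitize(offset, bins)

# Average within each OCB-relative bin (interior bins only)
mean_per_bin = [vorticity[idx == i].mean() for i in range(1, len(bins)) if (idx == i).any()]
print(len(mean_per_bin))
```

Averaging in `offset` coordinates is what prevents the boundary-adjacent features from being smoothed out when the OCB itself moves between observations.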

  18. Future Directions in Medical Physics: Models, Technology, and Translation to Medicine

    Science.gov (United States)

    Siewerdsen, Jeffrey

    The application of physics in medicine has been integral to major advances in diagnostic and therapeutic medicine. Two primary areas represent the mainstay of medical physics research in the last century: in radiation therapy, physicists have propelled advances in conformal radiation treatment and high-precision image guidance; and in diagnostic imaging, physicists have advanced an arsenal of multi-modality imaging that includes CT, MRI, ultrasound, and PET as indispensible tools for noninvasive screening, diagnosis, and assessment of treatment response. In addition to their role in building such technologically rich fields of medicine, physicists have also become integral to daily clinical practice in these areas. The future suggests new opportunities for multi-disciplinary research bridging physics, biology, engineering, and computer science, and collaboration in medical physics carries a strong capacity for identification of significant clinical needs, access to clinical data, and translation of technologies to clinical studies. In radiation therapy, for example, the extraction of knowledge from large datasets on treatment delivery, image-based phenotypes, genomic profile, and treatment outcome will require innovation in computational modeling and connection with medical physics for the curation of large datasets. Similarly in imaging physics, the demand for new imaging technology capable of measuring physical and biological processes over orders of magnitude in scale (from molecules to whole organ systems) and exploiting new contrast mechanisms for greater sensitivity to molecular agents and subtle functional / morphological change will benefit from multi-disciplinary collaboration in physics, biology, and engineering. Also in surgery and interventional radiology, where needs for increased precision and patient safety meet constraints in cost and workflow, development of new technologies for imaging, image registration, and robotic assistance can leverage

  19. Bovine Brain: An in vitro Translational Model in Developmental Neuroscience and Neurodegenerative Research

    Science.gov (United States)

    Peruffo, Antonella; Cozzi, Bruno

    2014-01-01

    Animal models provide convenient and clinically relevant tools in the research on neurodegenerative diseases. Studies on developmental disorders extensively rely on the use of laboratory rodents. The present mini-review proposes an alternative translational model based on the use of fetal bovine brain tissue. The bovine (Bos taurus) possesses a large and highly gyrencephalic brain and the long gestation period (41 weeks) is comparable to human pregnancy (38–40 weeks). Primary cultures obtained from fetal bovine brain constitute a validated in vitro model that allows examinations of neurons and/or glial cells under controlled and reproducible conditions. Physiological processes can be also studied on cultured bovine neural cells incubated with specific substrates or by electrically coupled electrolyte-oxide-semiconductor capacitors that permit direct recording from neuronal cells. Bovine neural cells and specific in vitro cell culture could be an alternative in comparative neuroscience and in neurodegenerative research, useful for studying development of normal and altered circuitry in a long gestation mammalian species. Use of bovine tissues would promote a substantial reduction in the use of laboratory animals. PMID:25072040

  20. A Dynamic Defense Modeling and Simulation Methodology using Semantic Web Services

    Directory of Open Access Journals (Sweden)

    Kangsun Lee

    2010-04-01

    Full Text Available Defense modeling and simulations require interoperable and autonomous federates in order to fully simulate the complex behavior of war-fighters and to dynamically adapt to various war-game events, commands and controls. In this paper, we propose a semantic web service based methodology to develop war-game simulations. Our methodology encapsulates war-game logic in a set of web services with additional semantic information in WSDL (Web Service Description Language) and OWL (Web Ontology Language). By utilizing the dynamic discovery and binding power of semantic web services, we are able to dynamically reconfigure federates according to various simulation events. An ASuW (Anti-Surface Warfare) simulator is constructed to demonstrate the methodology and successfully shows that the level of interoperability and autonomy can be greatly improved.

  1. Can we grow sperm? A translational perspective on the current animal and human spermatogenesis models

    Institute of Scientific and Technical Information of China (English)

    Kirk C Lo; Trustin Domes

    2011-01-01

    There have been tremendous advances in both the diagnosis and treatment of male factor infertility; however, the mechanisms required to recreate spermatogenesis outside of the testicular environment continue to elude andrologists. Having the ability to 'grow' human sperm would be a tremendous advance in reproductive biology, with multiple possible clinical applications, such as a treatment option for men with testicular failure and azoospermia of multiple etiologies. To understand the complexities of human spermatogenesis in a research environment, model systems have been designed with the intent to replicate the testicular microenvironment. Currently, there are both in vivo and in vitro model systems. In vivo model systems involve the transplantation of either spermatogonial stem cells or testicular xenografts. In vitro model systems involve the use of pluripotent stem cells and complex co-culturing and/or three-dimensional culturing techniques. This review discusses the basic methodologies, possible clinical applications, benefits and limitations of each model system. Although these model systems have greatly improved our understanding of human spermatogenesis, we have unfortunately not been successful in demonstrating complete human spermatogenesis outside of the testicle.

  2. An improved methodology for dynamic modelling and simulation of electromechanically coupled drive systems: An experimental validation

    Indian Academy of Sciences (India)

    Nuh Erdogan; Humberto Henao; Richard Grisel

    2015-10-01

    The complexity of electromechanically coupled drive systems (ECDSs), specifically electrical drive systems, makes studying them in their entirety challenging, since they consist of elements of diverse nature, i.e. electrical, electronic and mechanical. This presents a real struggle to engineers who want to design and implement such systems with high performance, efficiency and reliability. For this purpose, engineers need a tool capable of modelling and/or simulating components of diverse nature within the ECDS. However, the majority of available tools are limited in their capacity to describe the characteristics of such components sufficiently. To overcome this difficulty, this paper first proposes an improved methodology for the modelling and simulation of ECDSs. The approach is based on using domain-specific simulators individually, namely electrical and mechanical part simulators, and integrating them in a co-simulation. As for the modelling of the drive machine, a finely tuned dynamic model is developed by taking the saturation effect into account. In order to validate the developed model as well as the proposed methodology, an industrial ECDS is tested experimentally. Both the experimental and simulation results are then compared to prove the accuracy of the developed model and the relevance of the proposed methodology.

  3. A coupled groundwater-flow-modelling and vulnerability-mapping methodology for karstic terrain management

    Science.gov (United States)

    Kavouri, Konstantina P.; Karatzas, George P.; Plagnes, Valérie

    2017-02-01

    A coupled groundwater-flow-modelling and vulnerability-mapping methodology for the management of karst aquifers with spatial variability is developed. The methodology takes into consideration the duality of flow and recharge in karst and introduces a simple method to integrate the effect of temporal storage in the unsaturated zone. In order to investigate the applicability of the developed methodology, simulation results are validated against available field measurement data. The criteria maps from the PaPRIKa vulnerability-mapping method are used to document the groundwater flow model. The FEFLOW model is employed for the simulation of the saturated zone of Palaikastro-Chochlakies karst aquifer, in the island of Crete, Greece, for the hydrological years 2010-2012. The simulated water table reproduces typical karst characteristics, such as steep slopes and preferred drain axes, and is in good agreement with field observations. Selected calculated error indicators—Nash-Sutcliffe efficiency (NSE), root mean squared error (RMSE) and model efficiency (E')—are within acceptable value ranges. Results indicate that different storage processes take place in different parts of the aquifer. The north-central part seems to be more sensitive to diffuse recharge, while the southern part is affected primarily by precipitation events. Sensitivity analysis is performed on the parameters of hydraulic conductivity and specific yield. The methodology is used to estimate the feasibility of artificial aquifer recharge (AAR) at the study area. Based on the developed methodology, guidelines were provided for the selection of the appropriate AAR scenario that has positive impact on the water table.
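The error indicators cited above, Nash-Sutcliffe efficiency (NSE) and root mean squared error (RMSE), are standard and can be computed as in this minimal sketch; the observed and simulated head values are illustrative, not data from the Palaikastro-Chochlakies study.

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, values < 0 are worse than the mean."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def rmse(observed, simulated):
    """Root mean squared error in the units of the observations."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return float(np.sqrt(np.mean((observed - simulated) ** 2)))

obs = [12.1, 11.8, 13.0, 12.6, 11.5]  # e.g. observed water-table heads (m)
sim = [12.0, 11.9, 12.7, 12.8, 11.6]  # simulated heads at the same locations
print(nse(obs, sim), rmse(obs, sim))
```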

  5. An integrated measurement and modeling methodology for estuarine water quality management

    Institute of Scientific and Technical Information of China (English)

    Michael Hartnett; Stephen Nash

    2015-01-01

    This paper describes research undertaken by the authors to develop an integrated measurement and modeling methodology for the water quality management of estuaries. The approach developed utilizes modeling and measurement results in a synergistic manner. Modeling results were initially used to inform the field campaign of appropriate sampling locations and times, and field data were used to develop accurate models. Remote sensing techniques were used to capture data for both model development and model validation. Field surveys were undertaken to provide model initial conditions through data assimilation and to determine nutrient fluxes into the model domain. From field data, salinity relationships were developed with various water quality parameters, and relationships between chlorophyll a concentrations, transparency, and light attenuation were also developed. These relationships proved invaluable in model development, particularly in modeling the growth and decay of chlorophyll a. Cork Harbour, an estuary that regularly experiences summer algal blooms due to anthropogenic sources of nutrients, was used as a case study to develop the methodology. The integration of remote sensing, conventional fieldwork, and modeling is one of the novel aspects of this research, and the approach developed has widespread applicability.
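Relationships of the kind described, linking transparency to light attenuation, typically take the form sketched below: Beer-Lambert decay of irradiance with depth, with the attenuation coefficient Kd estimated from Secchi-disk transparency. The constant 1.7 is a widely quoted rule of thumb, not a value taken from this paper, and the numbers are illustrative.

```python
import math

def attenuation_coefficient(secchi_depth_m, k=1.7):
    """Empirical estimate of the light attenuation coefficient Kd (1/m) from Secchi depth."""
    return k / secchi_depth_m

def irradiance_at_depth(surface_irradiance, kd, depth_m):
    """Beer-Lambert exponential decay of irradiance with depth."""
    return surface_irradiance * math.exp(-kd * depth_m)

kd = attenuation_coefficient(2.0)  # Secchi depth of 2 m -> Kd = 0.85 per metre
print(round(irradiance_at_depth(100.0, kd, 3.0), 2))
```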

  6. Critical appraisal of the suitability of translational research models for performance assessment of cancer institutions

    NARCIS (Netherlands)

    Rajan, A.; Sullivan, R.; Bakker, S.; Harten, van W.H.

    2012-01-01

    Background. Translational research is a complex cumulative process that takes time. However, the operating environment for cancer centers engaged in translational research is now financially insecure. Centers are challenged to improve results and reduce time from discovery to practice innovations. P

  7. Critical Appraisal of Translational Research Models for Suitability in Performance Assessment of Cancer Centers

    NARCIS (Netherlands)

    Rajan, Abinaya; Sullivan, Richard; Bakker, Suzanne; Harten, van Wim H.

    2012-01-01

    Background. Translational research is a complex cumulative process that takes time. However, the operating environment for cancer centers engaged in translational research is now financially insecure. Centers are challenged to improve results and reduce time from discovery to practice innovations. P

  8. Developing translational medicine professionals : The Marie Skłodowska-Curie action model

    NARCIS (Netherlands)

    Petrelli, Alessandra; Prakken, Berent J.; Rosenblum, Norman D.

    2016-01-01

    End goal of translational medicine is to combine disciplines and expertise to eventually promote improvement of the global healthcare system by delivering effective therapies to individuals and society. Well-trained experts of the translational medicine process endowed with profound knowledge of

  9. Developing translational medicine professionals : The Marie Skłodowska-Curie action model

    NARCIS (Netherlands)

    Petrelli, Alessandra; Prakken, Berent J.; Rosenblum, Norman D.

    2016-01-01

    End goal of translational medicine is to combine disciplines and expertise to eventually promote improvement of the global healthcare system by delivering effective therapies to individuals and society. Well-trained experts of the translational medicine process endowed with profound knowledge of bio

  10. Comparative wound healing--are the small animal veterinarian's clinical patients an improved translational model for human wound healing research?

    Science.gov (United States)

    Volk, Susan W; Bohling, Mark W

    2013-01-01

    Despite intensive research efforts into understanding the pathophysiology of both chronic wounds and scar formation, and the development of wound care strategies to target both healing extremes, problematic wounds in human health care remain a formidable challenge. Although valuable fundamental information regarding the pathophysiology of problematic wounds can be gained from in vitro investigations and in vivo studies performed in laboratory animal models, the lack of concordance with human pathophysiology has been cited as a major impediment to translational research in human wound care. Therefore, the identification of superior clinical models for both chronic wounds and scarring disorders should be a high priority for scientists who work in the field of human wound healing research. To be successful, translational wound healing research should function as an intellectual ecosystem in which information flows from basic science researchers using in vitro and in vivo models to clinicians and back again from the clinical investigators to the basic scientists. Integral to the efficiency of this process is the incorporation of models which can accurately predict clinical success. The aim of this review is to describe the potential advantages and limitations of using clinical companion animals (primarily dogs and cats) as translational models for cutaneous wound healing research by describing comparative aspects of wound healing in these species, common acute and chronic cutaneous wounds in clinical canine and feline patients, and the infrastructure that currently exists in veterinary medicine which may facilitate translational studies and simultaneously benefit both veterinary and human wound care patients.

  11. Animal models in idiopathic inflammatory myopathies: How to overcome a translational roadblock?

    Science.gov (United States)

    Afzali, Ali Maisam; Ruck, Tobias; Wiendl, Heinz; Meuth, Sven G

    2017-03-07

    Idiopathic inflammatory myopathies (IIMs) encompass a heterogenic group of rare muscle diseases with common symptoms including muscle weakness and the presence of certain histological features. Since the pathogenesis remains unclear, therapeutic approaches in general comprise unspecific immunosuppression strategies that have been met with limited success. Therefore, a deeper understanding of the underlying pathophysiological mechanisms is critically required to assist in development of targeted therapies. Animal models have proven to be tremendously helpful in mechanistic studies and allow researchers to overcome the inevitable restrictions of human research. Although the number of different IIM models has drastically increased over the last few decades, a model that exhibits the phenotypical and histopathological hallmarks of IIM is still missing. Recent publications have shown promising results addressing different pathophysiological issues like mechanisms of onset, chronification or relapse in IIM. However, a standardization of the methodology is critically required in order to improve comparability and transferability among different groups. Here we provide an overview of the currently available IIM models including our own C-peptide based small-peptide model, critically discuss their advantages and disadvantages and give perspectives to their future use.

  12. Construction of an Integrated Translation Teaching Model Based on a Translation Practice Platform

    Institute of Scientific and Technical Information of China (English)

    殷燕; 严红艳

    2015-01-01

    The application of computer networks as a platform for translation and translation services has become the main work pattern of professional translators in the digital age. In order to meet the challenges of this era of professional translation, the proposed integrated translation teaching model, built on a web-based translation practice platform, combines off-line classroom teaching grounded in "multiple perspectives" with on-line practical training activities aimed at developing translation competence. The results of its implementation show that this blended on-line and off-line translation teaching model is conducive to the development of learner autonomy and enhances students' collaborative competence and translation potential.

  13. Conflicts: Second Thoughts on Models of Translational Ethics

    Institute of Scientific and Technical Information of China (English)

    梅阳春; 汤金霞

    2013-01-01

    One of the major trends in translation studies in China is drawing inspiration from western translational ethics to construct a Chinese translational ethics. Among the western schools of translational ethics, the theory of models of translation ethics, constituted by the translational ethics of representation, of service, of communication, norm-based translational ethics and the translational ethics of commitment, has so inspired the construction of Chinese translational ethics that voices have emerged in Chinese translation circles calling for a Chinese translation ethics formulated on the basis of this theory. It is found, however, that the first four models of translational ethics contradict each other in which interest group they should serve most, in what status they should confer on each of the translation agents other than the translator, and in what status they should confer on the translator. The translational ethics of commitment, intended to integrate the four preceding models, cannot solve this problem of incompatibility. Therefore, the theory of models of translational ethics, though beneficial to the construction of Chinese translational ethics, cannot act as the basis of that construction.

  14. Acellular Hydrogels for Regenerative Burn Wound Healing: Translation from a Porcine Model.

    Science.gov (United States)

    Shen, Yu-I; Song, Hyun-Ho G; Papa, Arianne E; Burke, Jacqueline A; Volk, Susan W; Gerecht, Sharon

    2015-10-01

    Currently available skin grafts and skin substitutes for healing following third-degree burn injuries are fraught with complications, often resulting in long-term physical and psychological sequelae. Synthetic treatment that can promote wound healing in a regenerative manner would provide an off-the-shelf, non-immunogenic strategy to improve clinical care of severe burn wounds. Here, we demonstrate the vulnerary efficacy and accelerated healing mechanism of a dextran-based hydrogel in a third-degree porcine burn model. The model was optimized to allow examination of the hydrogel treatment for clinical translation and its regenerative response mechanisms. Hydrogel treatment accelerated third-degree burn wound healing by rapid wound closure, improved re-epithelialization, enhanced extracellular matrix remodeling, and greater nerve reinnervation, compared with the dressing-treated group. These effects appear to be mediated through the ability of the hydrogel to facilitate a rapid but brief initial inflammatory response that coherently stimulates neovascularization within the granulation tissue during the first week of treatment, followed by an efficient vascular regression to promote a regenerative healing process. Our results suggest that the dextran-based hydrogels may substantially improve healing quality and reduce skin grafting incidents and thus pave the way for clinical studies to improve the care of severe burn injury patients.

  15. CloudLM: a Cloud-based Language Model for Machine Translation

    Directory of Open Access Journals (Sweden)

    Ferrández-Tordera Jorge

    2016-04-01

    Full Text Available Language models (LMs) are an essential element in statistical approaches to natural language processing for tasks such as speech recognition and machine translation (MT). The advent of big data has led to the availability of massive amounts of data for building LMs; in fact, for the most prominent languages, using current techniques and hardware, it is not feasible to train LMs on all the data available nowadays. At the same time, it has been shown that the more data is used for an LM the better the performance, e.g. for MT, with no indication yet of reaching a plateau. This paper presents CloudLM, an open-source cloud-based LM intended for MT, which allows querying distributed LMs. CloudLM relies on Apache Solr and provides the functionality of state-of-the-art language modelling (it builds upon KenLM), while allowing massive LMs to be queried (as the use of local memory is drastically reduced), at the expense of slower decoding speed.

  16. Travelling models of participation: Global ideas and local translations of water management in Namibia

    Directory of Open Access Journals (Sweden)

    Michael Schnegg

    2016-09-01

    Full Text Available In recent decades, water management in Namibia has profoundly changed. Beginning in the 1990s, the Namibian state incrementally turned over ownership of, and responsibility for, its rural water supply to local user groups. While the state withdrew from managing resources directly, it continued to circumscribe the ways in which local communities should govern them. In so doing, a "new commons" was created. Inclusive participation became the leitmotif of the new management scheme, and the participation of women in particular was a major political and societal goal. In this article, we use the notion of travelling models as a theoretical guide to explore how the idea of participation emerged in international development discourses and how it was then translated through national legislation into the local context. The results of the analysis show that, more than 20 years after the formulation of the international conventions, the average participation of women in local water committees remains low. However, older women do manage the funds associated with water and thus occupy one of the most important functions. Our explanation takes the wider social and cultural field into account and shows that gender and generational roles provide older women with the autonomy and authority that prepare their way into these new official roles. We conclude by considering whether and how the travelling model of participation has been changing local social structures in general and the role of older women in particular.

  17. Translating dosages from animal models to human clinical trials--revisiting body surface area scaling.

    Science.gov (United States)

    Blanchard, Otis L; Smoliga, James M

    2015-05-01

    Body surface area (BSA) scaling has been used for prescribing individualized dosages of various drugs and has also been recommended by the U.S. Food and Drug Administration as one method for using data from animal model species to establish safe starting dosages for first-in-human clinical trials. Although BSA conversion equations have been used in certain clinical applications for decades, recent recommendations to use BSA to derive interspecies equivalents for therapeutic dosages of drug and natural products are inappropriate. A thorough review of the literature reveals that BSA conversions are based on antiquated science and have little justification in current translational medicine compared to more advanced allometric and physiologically based pharmacokinetic modeling. Misunderstood and misinterpreted use of BSA conversions may have disastrous consequences, including underdosing leading to abandonment of potentially efficacious investigational drugs, and unexpected deadly adverse events. We aim to demonstrate that recent recommendations for BSA are not appropriate for animal-to-human dosage conversions and use pharmacokinetic data from resveratrol studies to demonstrate how confusion between the "human equivalent dose" and "pharmacologically active dose" can lead to inappropriate dose recommendations. To optimize drug development, future recommendations for interspecies scaling must be scientifically justified using physiologic, pharmacokinetic, and toxicology data rather than simple BSA conversion.
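The BSA-based conversion the authors critique is usually applied through the standard FDA Km factors (body weight divided by BSA). A minimal sketch of the human-equivalent-dose arithmetic follows; the Km values are the commonly cited ones from the 2005 FDA guidance and should be confirmed against the original table before any real use:

```python
# Commonly cited FDA Km factors (body weight / body surface area).
KM = {"mouse": 3, "rat": 6, "rabbit": 12, "monkey": 12, "dog": 20, "human": 37}

def human_equivalent_dose(animal_dose_mg_per_kg, species):
    """BSA-based human equivalent dose (mg/kg): HED = dose * Km_animal / Km_human."""
    return animal_dose_mg_per_kg * KM[species] / KM["human"]

# 100 mg/kg in the rat maps to ~16.2 mg/kg in humans under BSA scaling
print(round(human_equivalent_dose(100, "rat"), 1))  # 16.2
```

The abstract's point is that this ratio, however convenient, ignores pharmacokinetics entirely; allometric or physiologically based models use the same inputs plus clearance and exposure data.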

  18. Nonhuman primates: translational models for predicting antipsychotic-induced movement disorders.

    Science.gov (United States)

    Porsolt, Roger D; Castagné, Vincent; Hayes, Eric; Virley, David

    2013-12-01

    Repeated haloperidol treatment administered to nonhuman primates (NHPs) over several months or even years leads to the gradual appearance of drug-induced dystonic reactions in the orofacial region (mouth opening, tongue protrusion or retraction, bar biting) and in the whole body (writhing of the limbs and trunk, bar grasping). The propensity of antipsychotics to induce dystonia in NHPs is not correlated with their propensity to induce catalepsy in rodents, suggesting that the two types of effects are dissociated and may represent distinct aspects of the extrapyramidal symptoms induced by antipsychotics. In view of the clear homology to clinically observed phenomena, antipsychotic-induced dystonias in antipsychotic-primed NHPs would appear to possess a high degree of translational validity. These NHP phenomena could therefore serve as a useful model for predicting the occurrence of similar abnormal movements with novel substances developed for the treatment of schizophrenia or other psychotic disorders. Moreover, the NHP dystonia model could possibly serve as a biomarker for substances that will eventually cause tardive dyskinesia in patients.

  19. Imaging of Small Animal Peripheral Artery Disease Models: Recent Advancements and Translational Potential

    Directory of Open Access Journals (Sweden)

    Jenny B. Lin

    2015-05-01

    Full Text Available Peripheral artery disease (PAD is a broad disorder encompassing multiple forms of arterial disease outside of the heart. As such, PAD development is a multifactorial process with a variety of manifestations. For example, aneurysms are pathological expansions of an artery that can lead to rupture, while ischemic atherosclerosis reduces blood flow, increasing the risk of claudication, poor wound healing, limb amputation, and stroke. Current PAD treatment is often ineffective or associated with serious risks, largely because these disorders are commonly undiagnosed or misdiagnosed. Active areas of research are focused on detecting and characterizing deleterious arterial changes at early stages using non-invasive imaging strategies, such as ultrasound, as well as emerging technologies like photoacoustic imaging. Earlier disease detection and characterization could improve interventional strategies, leading to better prognosis in PAD patients. While rodents are being used to investigate PAD pathophysiology, imaging of these animal models has been underutilized. This review focuses on structural and molecular information and disease progression revealed by recent imaging efforts of aortic, cerebral, and peripheral vascular disease models in mice, rats, and rabbits. Effective translation to humans involves better understanding of underlying PAD pathophysiology to develop novel therapeutics and apply non-invasive imaging techniques in the clinic.

  20. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare.

    Science.gov (United States)

    Bracke, M B M; Edwards, S A; Metz, J H M; Noordhuizen, J P T M; Algers, B

    2008-07-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally applied. Recently, a qualitative Risk Assessment approach has been published by the European Food Safety Authority (EFSA) for the first time, concerning the welfare of intensively reared calves. This paper reports on a critical analysis of this Risk Assessment (RA) approach from a semantic-modelling (SM) perspective, emphasizing the importance of several seemingly self-evident principles, including the definition of concepts, application of explicit methodological procedures and specification of how underlying values and scientific information lead to the RA output. In addition, the need to include positive aspects of welfare and overall welfare assessments are emphasized. The analysis shows that the RA approach for animal welfare could benefit from SM methodology to support transparent and science-based decision-making.

  1. A fault diagnosis methodology for rolling element bearings based on advanced signal pretreatment and autoregressive modelling

    Science.gov (United States)

    Al-Bugharbee, Hussein; Trendafilova, Irina

    2016-05-01

    This study proposes a methodology for rolling element bearings fault diagnosis which gives a complete and highly accurate identification of the faults present. It has two main stages: signals pretreatment, which is based on several signal analysis procedures, and diagnosis, which uses a pattern-recognition process. The first stage is principally based on linear time invariant autoregressive modelling. One of the main contributions of this investigation is the development of a pretreatment signal analysis procedure which subjects the signal to noise cleaning by singular spectrum analysis and then stationarisation by differencing. So the signal is transformed to bring it close to a stationary one, rather than complicating the model to bring it closer to the signal. This type of pretreatment allows the use of a linear time invariant autoregressive model and improves its performance when the original signals are non-stationary. This contribution is at the heart of the proposed method, and the high accuracy of the diagnosis is a result of this procedure. The methodology emphasises the importance of preliminary noise cleaning and stationarisation, and demonstrates that the information needed for fault identification is contained in the stationary part of the measured signal. The methodology is further validated using three different experimental setups, demonstrating very high accuracy for all of the applications. It is able to correctly classify nearly 100 percent of the faults with regard to their type and size. This high accuracy is the other important contribution of this methodology. Thus, this research suggests a highly accurate methodology for rolling element bearing fault diagnosis which is based on relatively simple procedures. This is also an advantage, as the simplicity of the individual processes ensures easy application and the possibility for automation of the entire process.
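The core of the second stage, fitting a linear time-invariant AR model to a (pretreated) stationary signal, can be sketched with ordinary least squares. This is a generic illustration, not the authors' pipeline: the SSA noise-cleaning step is omitted and a synthetic AR(2) process stands in for bearing vibration data:

```python
import numpy as np

def fit_ar(x, p):
    """Ordinary least-squares AR(p) fit: x[t] ~ sum_k a[k] * x[t-1-k]."""
    X = np.column_stack([x[p - 1 - k: len(x) - 1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a

# Simulate a known stationary AR(2) process, then recover its coefficients.
# (In the paper's setting, non-stationary signals would first be differenced
# with np.diff before fitting.)
rng = np.random.default_rng(42)
true_a = np.array([0.6, -0.3])
x = np.zeros(2000)
for t in range(2, len(x)):
    x[t] = true_a[0] * x[t - 1] + true_a[1] * x[t - 2] + 0.1 * rng.standard_normal()

est = fit_ar(x, 2)
```

The estimated coefficient vector is what a pattern-recognition stage would then use as a feature for classifying fault type and size.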

  2. Cross-Language Translation Priming Asymmetry with Chinese-English Bilinguals: A Test of the Sense Model

    Science.gov (United States)

    Chen, Baoguo; Zhou, Huixia; Gao, Yiwen; Dunlap, Susan

    2014-01-01

    The present study aimed to test the Sense Model of cross-linguistic masked translation priming asymmetry, proposed by Finkbeiner et al. ("J Mem Lang" 51:1-22, 2004), by manipulating the number of senses that bilingual participants associated with words from both languages. Three lexical decision experiments were conducted with…

  4. Translating Evidence-Based Practice into a Comprehensive Educational Model within an Autism-Specific Special School

    Science.gov (United States)

    Lambert-Lee, Katy A.; Jones, Rebecca; O'Sullivan, Julie; Hastings, Richard P.; Douglas-Cobane, Emma; Thomas J., Esther; Hughes, Carl; Griffith, Gemma

    2015-01-01

    Research evaluations of Applied Behaviour Analysis (ABA)-based interventions for children with autism demonstrate positive outcomes. However, little research has focused on the translation of these evidence-based interventions into service delivery models within existing education systems. In the present article, we provide a description of the…

  5. Methodology for Constructing Reduced-Order Power Block Performance Models for CSP Applications: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, M.

    2010-10-01

    The inherent variability of the solar resource presents a unique challenge for CSP systems. Incident solar irradiation can fluctuate widely over a short time scale, but plant performance must be assessed for long time periods. As a result, annual simulations with hourly (or sub-hourly) timesteps are the norm in CSP analysis. A highly detailed power cycle model provides accuracy but tends to suffer from prohibitively long run-times; alternatively, simplified empirical models can run quickly but don't always provide enough information, accuracy, or flexibility for the modeler. The ideal model for feasibility-level analysis incorporates both the detail and accuracy of a first-principle model with the low computational load of a regression model. The work presented in this paper proposes a methodology for organizing and extracting information from the performance output of a detailed model, then using it to develop a flexible reduced-order regression model in a systematic and structured way. A similar but less generalized approach for characterizing power cycle performance and a reduced-order modeling methodology for CFD analysis of heat transfer from electronic devices have been presented. This paper builds on these publications and the non-dimensional approach originally described.
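The sample-once, regress, reuse-cheaply workflow described above can be sketched in a few lines. The "detailed model" below is a made-up stand-in (a mildly nonlinear efficiency function of ambient temperature and load), and the surrogate is a full quadratic response surface fitted by least squares; the paper's actual basis functions and non-dimensionalization are not reproduced here:

```python
import numpy as np

def detailed_model(T_amb, load):
    """Stand-in for an expensive first-principles power-cycle model (hypothetical form)."""
    return 0.40 * load ** 0.9 - 0.002 * (T_amb - 25.0) * load

def quad_design(T, L):
    """Full quadratic basis in two variables: [1, T, L, T^2, T*L, L^2]."""
    return np.column_stack([np.ones_like(T), T, L, T * T, T * L, L * L])

# Sample the detailed model once over the operating envelope...
Tg, Lg = np.meshgrid(np.linspace(0, 45, 10), np.linspace(0.5, 1.0, 10))
T, L = Tg.ravel(), Lg.ravel()
eta = detailed_model(T, L)

# ...fit the reduced-order regression model...
c, *_ = np.linalg.lstsq(quad_design(T, L), eta, rcond=None)

# ...then reuse the cheap surrogate in place of the detailed model.
err = np.max(np.abs(quad_design(T, L) @ c - eta))
```

In an annual hourly simulation, the surrogate is evaluated thousands of times at negligible cost while staying close to the detailed model over the sampled envelope.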

  6. Revisiting interaction in knowledge translation

    Directory of Open Access Journals (Sweden)

    Zackheim Lisa

    2007-10-01

    Full Text Available Abstract Background Although the study of research utilization is not new, there has been increased emphasis on the topic over the recent past. Science-push models that are researcher driven and controlled, and demand-pull models emphasizing user/decision-maker interests, have largely been abandoned in favour of more interactive models that emphasize linkages between researchers and decision-makers. However, despite these and other theoretical and empirical advances in the area of research utilization, there remains a fundamental gap between the generation of research findings and the application of those findings in practice. Methods Using a case approach, the current study looks at the impact of one particular interaction approach to research translation used by a Canadian funding agency. Results Results suggest there may be certain conditions under which different levels of decision-maker involvement in research will be more or less effective. Four attributes are illuminated by the current case study: stakeholder diversity, addressability/actionability of results, finality of study design and methodology, and politicization of results. Future research could test whether these or other variables can be used to specify some of the conditions under which different approaches to interaction in knowledge translation are likely to facilitate research utilization. Conclusion This work suggests that the efficacy of interaction approaches to research translation may be more limited than current theory proposes and underscores the need for more completely specified models of research utilization that can help address the slow pace of change in this area.

  7. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithms. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
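The methodological point here, that model selection should rest on a statistical comparison of validated errors rather than a single fit, can be sketched without the RRegrs package. The example below compares two linear-in-parameters regression models by k-fold cross-validated RMSE on synthetic data (numpy only; the paper's ten models and significance tests are not reproduced):

```python
import numpy as np

def cv_rmse(design, y, k=5):
    """k-fold cross-validated RMSE of a linear-in-parameters model."""
    idx = np.arange(len(y))
    sq_errs = []
    for fold in range(k):
        test = idx % k == fold  # interleaved folds, deterministic
        c, *_ = np.linalg.lstsq(design[~test], y[~test], rcond=None)
        sq_errs.append((design[test] @ c - y[test]) ** 2)
    return float(np.sqrt(np.mean(np.concatenate(sq_errs))))

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 200)
y = 1.0 + 0.5 * x + 0.8 * x ** 2 + 0.2 * rng.standard_normal(x.size)

linear = np.column_stack([np.ones_like(x), x])
quad = np.column_stack([np.ones_like(x), x, x ** 2])
rmse_lin, rmse_quad = cv_rmse(linear, y), cv_rmse(quad, y)
```

A full methodology would repeat this across datasets and apply a paired statistical test to the per-fold errors before declaring one model best.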

  8. A methodology for the design of experiments in computational intelligence with multiple regression models

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Lozano

    2016-12-01

    Full Text Available The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithms. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  9. A methodology for including wall roughness effects in k-ε low-Reynolds turbulence models

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosini, W., E-mail: walter.ambrosini@ing.unipi.it; Pucciarelli, A.; Borroni, I.

    2015-05-15

    Highlights: • A model for taking into account wall roughness in low-Reynolds k-ε models is presented. • The model is subjected to a first validation to show its potential in general applications. • The application of the model in predicting heat transfer to supercritical fluids is also discussed. - Abstract: A model accounting for wall roughness effects in k-ε low-Reynolds turbulence models is described in the present paper. In particular, the introduction in the transport equations of k and ε of additional source terms related to roughness, based on simple assumptions and dimensional relationships, is proposed. An objective of the present paper, in addition to obtaining more realistic predictions of wall friction, is the application of the proposed model to the study of heat transfer to supercritical fluids. A first validation of the model is reported. The model shows the capability of predicting, at least qualitatively, some of the most important trends observed when dealing with rough pipes in very different flow conditions. Qualitative comparisons with some DNS data available in literature are also performed. Further analyses provided promising results concerning the ability of the model in reproducing the trend of friction factor when varying the flow conditions, though improvements are necessary for achieving better quantitative accuracy. First applications of the model in simulating heat transfer to supercritical fluids are also described, showing the capability of the model to affect the predictions of these heat transfer phenomena, in particular in the vicinity of the pseudo-critical conditions. A more extended application of the model to relevant deteriorated heat transfer conditions will clarify the usefulness of this modelling methodology in improving predictions of these difficult phenomena. Whatever the possible success in this particular application that motivated its development, this approach suggests a general methodology for accounting
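The abstract does not give the proposed source terms, so they are not reproduced here. For the friction-factor trends in rough pipes that the model is validated against, a classical reference point is the Haaland (1983) explicit approximation to the Colebrook equation, sketched below (this is standard correlation-level physics, not the authors' turbulence model):

```python
import math

def haaland_friction(re, rel_roughness):
    """Darcy friction factor from Haaland's explicit Colebrook approximation.

    re: Reynolds number (turbulent regime); rel_roughness: roughness height / diameter.
    """
    inv_sqrt_f = -1.8 * math.log10((rel_roughness / 3.7) ** 1.11 + 6.9 / re)
    return 1.0 / inv_sqrt_f ** 2

f_smooth = haaland_friction(1e5, 0.0)    # hydraulically smooth pipe
f_rough = haaland_friction(1e5, 0.01)    # 1% relative roughness raises friction
```

A roughness-aware low-Reynolds k-ε model should reproduce exactly this kind of monotonic increase of friction factor with relative roughness across flow conditions.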

  10. An Intelligent Response Surface Methodology for Modeling of Domain Level Constraints

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An effective modeling method for domain-level constraints in the constraint network for concurrent engineering (CE) was developed. The domain-level constraints were analyzed, and a framework for modeling them based on simulation and approximation technology was given. An intelligent response surface methodology (IRSM) was proposed, in which artificial intelligence technologies are introduced into the optimization process. The design of the crank and connecting rod in a V6 engine was given as an example to show the validity of the modeling method.

  11. Translational Creativity

    DEFF Research Database (Denmark)

    Nielsen, Sandro

    2010-01-01

    A long-established approach to legal translation focuses on terminological equivalence, making translators strictly follow the words of source texts. Recent research suggests that there is room for some creativity, allowing translators to deviate from the source texts. However, little attention is given to genre conventions in source texts and the ways in which they can best be translated. I propose that translators of statutes with an informative function in expert-to-expert communication may be allowed limited translational creativity when translating specific types of genre convention. This creativity is a result of translators adopting either a source-language or a target-language oriented strategy and is limited by the pragmatic principle of co-operation. Examples of translation options are provided illustrating the different results in target texts. The use of a target-language oriented…

  12. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning, and the emergency evacuation of large commercial shopping areas, as typical service systems, is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model is proposed and examined in a case study involving evacuation from a commercial shopping mall. Pedestrian movement is simulated with the Cellular Automata, while the event-driven model schedules pedestrian behaviour; the simulation distinguishes between normal operation and emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer, and a trajectory layer. When simulating pedestrian routes, the model takes into account the purchase intentions of customers and the density of pedestrians. The combined evacuation model reflects the behavioural characteristics of customers and clerks in both normal and emergency situations. The distribution of individual evacuation times as a function of initial position and the dynamics of the evacuation process are studied. Our results indicate that the combination of Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
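The static part of a floor-field CA can be sketched compactly: a BFS distance field from the exit, with each pedestrian greedily stepping to the neighbouring cell of lowest field value. This toy sketch omits the dynamic floor field, cell contention, and the event-driven scheduling layer of the paper:

```python
from collections import deque

def static_floor_field(grid, exit_pos):
    """BFS distance-to-exit over a grid of strings; '#' cells are walls."""
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    dist[exit_pos[0]][exit_pos[1]] = 0
    dq = deque([exit_pos])
    while dq:
        r, c = dq.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and dist[nr][nc] is None):
                dist[nr][nc] = dist[r][c] + 1
                dq.append((nr, nc))
    return dist

def step(pos, dist):
    """Greedy CA update: move to the neighbour with the smallest field value."""
    best = pos
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = pos[0] + dr, pos[1] + dc
        if (0 <= nr < len(dist) and 0 <= nc < len(dist[0])
                and dist[nr][nc] is not None
                and dist[nr][nc] < dist[best[0]][best[1]]):
            best = (nr, nc)
    return best

grid = ["....",
        ".##.",
        "...."]
field = static_floor_field(grid, (0, 3))  # exit at the top-right corner
ped, steps = (2, 0), 0
while field[ped[0]][ped[1]] != 0 and steps < 20:
    ped = step(ped, field)
    steps += 1
```

Because greedy descent on a BFS field shortens the distance by one per update, the evacuation time of each pedestrian equals its initial field value, which is exactly the "evacuation time as a function of initial position" the paper studies.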

  13. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    Property prediction models are a fundamental tool of process modeling and analysis, especially at the early stage of process development. Furthermore, property prediction models are the fundamental tool for computer-aided molecular design used for the development of new refrigerants. Group contribution (GC) based prediction methods use structurally dependent parameters in order to determine the property of pure components. The aim of the GC parameter estimation is to find the best possible set of model parameters that fits the experimental data. In that sense, there is often a lack of attention… The GC model uses the Marrero-Gani (MR) method, which considers the group contribution at different levels, both functional and structural. The methodology helps improve the accuracy and reliability of property modeling and provides a rigorous model quality check and assurance. This is expected to further…
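The parameter-estimation step described above reduces, in its simplest first-level form, to a least-squares fit of group contributions against a compound-by-group incidence matrix. The sketch below uses invented toy numbers and a single additive level, not the Marrero-Gani groups or the paper's uncertainty analysis:

```python
import numpy as np

# Incidence matrix: rows = compounds, columns = occurrence counts of each
# structural group (toy data; e.g. column 0 could be CH3, column 1 CH2, ...).
N = np.array([[2, 1, 0],
              [2, 2, 0],
              [2, 3, 0],
              [2, 1, 1],
              [2, 2, 1]], dtype=float)

true_contrib = np.array([5.0, 3.0, 10.0])  # hypothetical group contributions
prop = N @ true_contrib                    # stand-in for experimental property data

# GC parameter estimation: best-fit contributions in the least-squares sense.
est, *_ = np.linalg.lstsq(N, prop, rcond=None)
```

With real (noisy) data the residuals of this fit are exactly where a rigorous methodology adds value: they feed the model quality checks and parameter uncertainty estimates the abstract calls for.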

  14. The Oncopig Cancer Model: An Innovative Large Animal Translational Oncology Platform

    Directory of Open Access Journals (Sweden)

    Kyle M. Schachtschneider

    2017-08-01

    Full Text Available Despite an improved understanding of cancer molecular biology, immune landscapes, and advancements in cytotoxic, biologic, and immunologic anti-cancer therapeutics, cancer remains a leading cause of death worldwide. More than 8.2 million deaths were attributed to cancer in 2012, and it is anticipated that cancer incidence will continue to rise, with 19.3 million cases expected by 2025. The development and investigation of new diagnostic modalities and innovative therapeutic tools is critical for reducing the global cancer burden. Toward this end, translational animal models serve a crucial role in bridging the gap between fundamental diagnostic and therapeutic discoveries and human clinical trials. Such animal models offer insights into all aspects of the basic science-clinical translational cancer research continuum (screening, detection, oncogenesis, tumor biology, immunogenicity, therapeutics, and outcomes. To date, however, cancer research progress has been markedly hampered by lack of a genotypically, anatomically, and physiologically relevant large animal model. Without progressive cancer models, discoveries are hindered and cures are improbable. Herein, we describe a transgenic porcine model—the Oncopig Cancer Model (OCM—as a next-generation large animal platform for the study of hematologic and solid tumor oncology. With mutations in a key tumor suppressor gene and oncogene, TP53R167H and KRASG12D, the OCM recapitulates transcriptional hallmarks of human disease while also exhibiting clinically relevant histologic and genotypic tumor phenotypes. Moreover, as obesity rates increase across the global population, cancer patients commonly present clinically with multiple comorbid conditions. Due to the effects of these comorbidities on patient management, therapeutic strategies, and clinical outcomes, an ideal animal model should develop cancer on the background of representative comorbid conditions (tumor macro- and microenvironments. As

  15. 2014 International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Ören, Tuncer; Kacprzyk, Janusz; Filipe, Joaquim

    2015-01-01

    The present book includes a set of selected extended papers from the 4th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2014), held in Vienna, Austria, from 28 to 30 August 2014. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2014 received 167 submissions, from 45 countries, in all continents. After a double blind paper review performed by the Program Committee, 23% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2014. Commitment to high quality standards is a major concern of SIMULTEC...

  16. 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Kacprzyk, Janusz; Ören, Tuncer; Filipe, Joaquim

    2016-01-01

    The present book includes a set of selected extended papers from the 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2015), held in Colmar, France, from 21 to 23 July 2015. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2015 received 102 submissions, from 36 countries, in all continents. After a double blind paper review performed by the Program Committee, 19% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2015. Commitment to high quality standards is a major concern of SIMULTECH t...

  17. A methodology for modeling photocatalytic reactors for indoor pollution control using previously estimated kinetic parameters

    Energy Technology Data Exchange (ETDEWEB)

    Passalia, Claudio; Alfano, Orlando M. [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Gueemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina); Brandi, Rodolfo J., E-mail: rbrandi@santafe-conicet.gov.ar [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Gueemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina)

    2012-04-15

    Highlights: • Indoor pollution control via photocatalytic reactors. • Scaling-up methodology based on previously determined mechanistic kinetics. • Radiation interchange model between catalytic walls using configuration factors. • Modeling and experimental validation of a complex geometry photocatalytic reactor. - Abstract: A methodology for modeling photocatalytic reactors for their application in indoor air pollution control is carried out. The methodology implies, firstly, the determination of intrinsic reaction kinetics for the removal of formaldehyde. This is achieved by means of a simple geometry, continuous reactor operating under kinetic control regime and steady state. The kinetic parameters were estimated from experimental data by means of a nonlinear optimization algorithm. The second step was the application of the obtained kinetic parameters to a very different photoreactor configuration. In this case, the reactor is a corrugated wall type using nanosize TiO₂ as catalyst irradiated by UV lamps that provided a spatially uniform radiation field. The radiative transfer within the reactor was modeled through a superficial emission model for the lamps, the ray tracing method and the computation of view factors. The velocity and concentration fields were evaluated by means of a commercial CFD tool (Fluent 12) where the radiation model was introduced externally. The results of the model were compared experimentally in a corrugated wall, bench scale reactor constructed in the laboratory. The overall pollutant conversion showed good agreement between model predictions and experiments, with a root mean square error less than 4%.
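The first step of the methodology, estimating intrinsic kinetics from a kinetically controlled, steady-state reactor, can be illustrated with the simplest possible case. Assuming first-order removal in an ideal plug-flow reactor (an illustrative simplification, not the authors' mechanistic rate expression), the rate constant follows directly from inlet/outlet concentrations:

```python
import math

def rate_constant_from_conversion(c_in, c_out, residence_time):
    """First-order plug-flow estimate: C_out = C_in * exp(-k * tau) => k = ln(C_in/C_out) / tau."""
    return math.log(c_in / c_out) / residence_time

# Hypothetical formaldehyde data: 1.0 -> 0.5 ppm over a 20 s residence time.
k = rate_constant_from_conversion(1.0, 0.5, 20.0)  # = ln(2)/20 per second
```

The paper's actual estimation is richer, a nonlinear optimization over a mechanistic rate law, but the principle is the same: fit parameters in a simple, well-characterized geometry, then carry them unchanged into the radiation-resolved CFD model of the corrugated reactor.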

  18. Abnormal Motor Activity and Thermoregulation in a Schizophrenia Rat Model for Translational Science.

    Science.gov (United States)

    Horvath, Gyongyi; Kekesi, Gabriella; Petrovszki, Zita; Benedek, Gyorgy

    2015-01-01

    Schizophrenia is accompanied by altered motor activity and abnormal thermoregulation; therefore, the presence of these symptoms can enhance the face validity of a schizophrenia animal model. The goal was to characterize these parameters under freely moving conditions in a new rat substrain showing several schizophrenia-related alterations. Male Wistar rats were used: the new substrain, housed individually (for four weeks) and treated subchronically with ketamine, and naive animals without any manipulations. Adult animals were implanted with E-Mitter transponders intraabdominally to record body temperature and locomotor activity continuously. The circadian rhythm of these parameters and the acute effects of changes in light conditions were analyzed under undisturbed circumstances, and the effects of different interventions (handling, bed changing or intraperitoneal vehicle injection) were also determined. Decreased motor activity with a fragmented pattern was observed in the new substrain. However, these animals had higher body temperature during the active phase, and they showed a wider range of its alterations, too. The changes in light conditions and different interventions produced blunted hyperactivity and altered body temperature responses in the new substrain. Poincaré plot analysis of body temperature revealed enhanced short- and long-term variabilities during the active phase compared to the inactive phase in both groups. Furthermore, the new substrain showed increased short- and long-term variabilities with a lower degree of asymmetry, suggesting autonomic dysregulation. In summary, the new substrain with schizophrenia-related phenomena showed disturbed motor activity and thermoregulation, suggesting that these objectively determined parameters can be biomarkers in translational research.
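The short- and long-term variability measures from a Poincaré plot are conventionally the SD1/SD2 descriptors of the (x[i], x[i+1]) scatter. The abstract does not specify the authors' exact computation, so the sketch below uses the standard definitions (SD1 from successive differences, SD2 from successive sums):

```python
import math

def poincare_sd(x):
    """Standard Poincaré descriptors of a series: (SD1, SD2).

    SD1 (short-term variability) is the spread perpendicular to the identity
    line; SD2 (long-term variability) is the spread along it.
    """
    diffs = [b - a for a, b in zip(x, x[1:])]
    sums = [a + b for a, b in zip(x, x[1:])]

    def pstd(v):  # population standard deviation
        m = sum(v) / len(v)
        return math.sqrt(sum((u - m) ** 2 for u in v) / len(v))

    return pstd(diffs) / math.sqrt(2), pstd(sums) / math.sqrt(2)

# A perfectly alternating series has maximal short-term and zero long-term variability.
sd1, sd2 = poincare_sd([0, 1, 0, 1, 0, 1, 0])
```

Applied to continuously logged body temperature, an active-phase rise in both SD1 and SD2 is precisely the pattern the abstract reports for the new substrain.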

  19. The Traumatic Brain Injury Model Systems: a longitudinal database, research, collaboration and knowledge translation.

    Science.gov (United States)

    Hammond, F M; Malec, J F

    2010-12-01

    In 1988, the National Institute on Disability and Rehabilitation Research (NIDRR) launched the Traumatic Brain Injury Model Systems (TBIMS) program, creating the longest and largest longitudinal database on individuals with moderate-to-severe traumatic brain injury (TBI) available today. In addition to sustaining the longitudinal database, centers that successfully compete to be part of the TBIMS centers are also expected to complete local and collaborative research projects to further scientific knowledge about TBI. The research has focused on areas of the NIDRR Long Range Plan which emphasizes employment, health and function, technology for access and function, independent living and community integration, and other associated disability research areas. Centers compete for funded participation in the TBIMS on a 5-year cycle. Dissemination of scientific knowledge gained through the TBIMS is the responsibility of both individual centers and the TBIMS as a whole. This is accomplished through multiple venues that target a broad audience of those who need to receive the information and learn how to best apply it to practice. The sites produce many useful websites, manuals, publications and other materials to accomplish this translation of knowledge to practice.

  20. A Model-Based Methodology for Spray-Drying Process Development

    OpenAIRE

    Dobry, Dan E.; Settell, Dana M.; Baumann, John M.; Ray, Rod J.; Graham, Lisa J; Beyerinck, Ron A.

    2009-01-01

    Solid amorphous dispersions are frequently used to improve the solubility and, thus, the bioavailability of poorly soluble active pharmaceutical ingredients (APIs). Spray-drying, a well-characterized pharmaceutical unit operation, is ideally suited to producing solid amorphous dispersions due to its rapid drying kinetics. This paper describes a novel flowchart methodology based on fundamental engineering models and state-of-the-art process characterization techniques that ensure that spray-dr...

  1. Model-based interpretation of the ECG: a methodology for temporal and spatial reasoning.

    OpenAIRE

    Tong, D. A.; Widman, L. E.

    1992-01-01

    A new software architecture for automatic interpretation of the electrocardiogram is presented. Using the hypothesize-and-test paradigm, a semi-quantitative physiological model and production rule-based knowledge are combined to reason about time- and space-varying characteristics of complex heart rhythms. A prototype system implementing the methodology accepts a semi-quantitative description of the onset and morphology of the P waves and QRS complexes that are observed in the body-surface el...

  2. A Methodology for Modeling the Flow of Military Personnel Across Air Force Active and Reserve Components

    Science.gov (United States)

    2016-01-01

    This RAND Corporation research report presents a methodology for modeling the flow of military personnel across Air Force active and reserve components. For example, the reserve supply of pilots will depend on the number of active component pilots who separate and the fraction of separating pilots who affiliate with the reserve component. Economic covariates such as GDP were collected from the Federal Reserve Economic Data (FRED) database.

  3. A METHODOLOGICAL MODEL FOR INTEGRATING CHARACTER WITHIN CONTENT AND LANGUAGE INTEGRATED LEARNING IN SOCIOLOGY OF RELIGION

    OpenAIRE

    Moh Yasir Alimi

    2014-01-01

    Abstract: In this article, I describe a methodological model I used in an experimental study on how to integrate character within the practice of Content and Language Integrated Learning (CLIL) in higher education in Indonesia. This research adds to research on character education and CLIL in tertiary education, giving nuance to the practice of CLIL, so far predominantly a practice in primary and secondary schools. The research was conducted in Semarang State University, in the Departm...

  4. Modeling the Capacity and Emissions Impacts of Reduced Electricity Demand. Part 1. Methodology and Preliminary Results

    Energy Technology Data Exchange (ETDEWEB)

    Coughlin, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Shen, Hongxia [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Chan, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; McDevitt, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Sturges, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division

    2013-02-07

    Policies aimed at energy conservation and efficiency have broad environmental and economic impacts. Even if these impacts are relatively small, they may be significant compared to the cost of implementing the policy. Methodologies that quantify the marginal impacts of reduced demand for energy have an important role to play in developing accurate measures of both the benefits and costs of a given policy choice. This report presents a methodology for estimating the impacts of reduced demand for electricity on the electric power sector as a whole. The approach uses the National Energy Modeling System (NEMS), a mid-range energy forecast model developed and maintained by the U.S. Department of Energy, Energy Information Administration (EIA) (DOE EIA 2013). The report is organized as follows: in the rest of this section the traditional NEMS-BT approach is reviewed and an outline of the new reduced-form NEMS methodology is presented. Section 2 provides an overview of how the NEMS model works, and describes the set of NEMS-BT runs that are used as input to the reduced-form approach. Section 3 presents our NEMS-BT simulation results and post-processing methods. In Section 4 we show how the NEMS-BT output can be generalized to apply to a broader set of end-uses. In Section 5 we discuss the application of this approach to policy analysis, and summarize some of the issues that will be further investigated in Part 2 of this study.
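
    The marginal-impact idea behind a reduced-form approach can be illustrated with a pair of model runs, a base case and a reduced-demand case, differenced to give a marginal factor. The function and all numbers below are purely illustrative, not NEMS outputs:

    ```python
    def marginal_factor(base_output, policy_output, base_demand, policy_demand):
        """Marginal impact per unit of avoided demand, from two scenario runs."""
        d_output = base_output - policy_output
        d_demand = base_demand - policy_demand
        if d_demand == 0:
            raise ValueError("scenarios have identical demand")
        return d_output / d_demand

    # Illustrative numbers: CO2 (Mt) and electricity demand (TWh) in two runs
    mef = marginal_factor(base_output=2100.0, policy_output=2085.0,
                          base_demand=3900.0, policy_demand=3880.0)
    # 15 Mt avoided over 20 TWh avoided -> 0.75 Mt per TWh
    ```

    A reduced-form methodology essentially precomputes such factors from a limited set of full-model runs so that later policy cases can be evaluated without rerunning the full model.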

  5. MODEL - INTEGRAL METHODOLOGY FOR SUCCESSFUL DESIGNING AND IMPLEMENTING OF TQM SYSTEM IN MACEDONIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    Elizabeta Mitreva

    2011-12-01

    Full Text Available The subject of this paper is linked with the valorization of the meaning and the perspectives of Total Quality Management (TQM) system design and implementation within domestic companies, and with creating a model-methodology for improved performance, efficiency and effectiveness. The research is designed as an attempt to depict the existing condition in Macedonian companies regarding quality system design and implementation, analysed through 4 polls in the "house of quality", whose top is the ultimate management and whose base is the measurement, evaluation, analysis and comparison of quality. This "house" is held up by 4 subsystems, i.e. internal standardization, methods and techniques for flawless work performance, education and motivation, and analysis of the quality costs. The data received from the research and the proposed integral methodology for designing and implementing a TQM system are intended to help and present useful directions to all Macedonian companies tending to become "world class" organizations. The basis for the creation of this model is the redesign of the business processes, after which a new phase of business performance begins: continued improvement, the rolling of Deming's Quality Circle (Plan-Do-Check-Act). The model-methodology proposed in this paper is integral and universal, which means that it is applicable to all companies regardless of the business area.

  6. IMPLEMENTATION OF DATA ASSIMILATION METHODOLOGY FOR PHYSICAL MODEL UNCERTAINTY EVALUATION USING POST-CHF EXPERIMENTAL DATA

    Directory of Open Access Journals (Sweden)

    JAESEOK HEO

    2014-10-01

    Full Text Available The Best Estimate Plus Uncertainty (BEPU method has been widely used to evaluate the uncertainty of a best-estimate thermal hydraulic system code against a figure of merit. This uncertainty is typically evaluated based on the physical model's uncertainties determined by expert judgment. This paper introduces the application of data assimilation methodology to determine the uncertainty bands of the physical models, e.g., the mean value and standard deviation of the parameters, based upon the statistical approach rather than expert judgment. Data assimilation suggests a mathematical methodology for the best estimate bias and the uncertainties of the physical models which optimize the system response following the calibration of model parameters and responses. The mathematical approaches include deterministic and probabilistic methods of data assimilation to solve both linear and nonlinear problems with the a posteriori distribution of parameters derived based on Bayes' theorem. The inverse problem was solved analytically to obtain the mean value and standard deviation of the parameters assuming Gaussian distributions for the parameters and responses, and a sampling method was utilized to illustrate the non-Gaussian a posteriori distributions of parameters. SPACE is used to demonstrate the data assimilation method by determining the bias and the uncertainty bands of the physical models employing Bennett's heated tube test data and Becker's post critical heat flux experimental data. Based on the results of the data assimilation process, the major sources of the modeling uncertainties were identified for further model development.
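
    For the linear problem with Gaussian assumptions that the paper solves analytically, the posterior of a model parameter has a closed form. The sketch below works through a one-parameter version; all numbers are invented, whereas the actual study calibrates physical-model parameters in SPACE against the heated-tube and post-CHF data:

    ```python
    def gaussian_update(mu0, var0, a, y, var_y):
        """Posterior mean/variance of theta for y = a*theta + noise.

        Prior: theta ~ N(mu0, var0); noise ~ N(0, var_y). This is the
        scalar case of the analytic inverse problem described above.
        """
        precision = 1.0 / var0 + a * a / var_y
        var_post = 1.0 / precision
        mu_post = var_post * (mu0 / var0 + a * y / var_y)
        return mu_post, var_post

    # Illustrative: prior N(1.0, 0.25), observed response y = 1.5, a = 1
    mu_post, var_post = gaussian_update(1.0, 0.25, 1.0, 1.5, 0.25)
    # Equal prior/measurement weight -> posterior mean 1.25, variance 0.125
    ```

    Note how the posterior variance is always smaller than the prior variance: assimilating a response tightens the uncertainty band of the physical model, which is the stated purpose of the methodology.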

  7. Developing translational medicine professionals: the Marie Skłodowska-Curie action model.

    Science.gov (United States)

    Petrelli, Alessandra; Prakken, Berent J; Rosenblum, Norman D

    2016-11-29

    The end goal of translational medicine is to combine disciplines and expertise to promote improvement of the global healthcare system by delivering effective therapies to individuals and society. Well-trained experts in the translational medicine process, endowed with profound knowledge of biomedical technology, ethical and clinical issues, as well as leadership and teamwork abilities, are essential for the effective development of tangible therapeutic products for patients. In this article we focus on education and, in particular, we discuss how programs providing training on the broad spectrum of the translational medicine continuum still have a limited degree of diffusion and do not provide professional support and mentorship in the long term, resulting in a lack of well-established professionals of translational medicine (TMPs) in the scientific community. Here, we describe the Marie Skłodowska-Curie Actions program ITN-EUtrain (EUropean Translational tRaining for Autoimmunity & Immune manipulation Network), where training on the translational medicine machinery was integrated with education on professional and personal skills, mentoring, and a long-lasting network of TMPs.

  8. A Conceptual Model for the Translation of Bioethics Research and Scholarship.

    Science.gov (United States)

    Mathews, Debra J H; Hester, D Micah; Kahn, Jeffrey; McGuire, Amy; McKinney, Ross; Meador, Keith; Philpott-Jones, Sean; Youngner, Stuart; Wilfond, Benjamin S

    2016-09-01

    While the bioethics literature demonstrates that the field has spent substantial time and thought over the last four decades on the goals, methods, and desired outcomes for service and training in bioethics, there has been less progress defining the nature and goals of bioethics research and scholarship. This gap makes it difficult both to describe the breadth and depth of these areas of bioethics and, importantly, to gauge their success. However, the gap also presents us with an opportunity to define this scope of work for ourselves and to help shape the broader conversation about the impact of academic research. Because of growing constraints on academic funding, researchers and scholars in many fields are being asked to demonstrate and also forecast the value and impact of their work. To do that, and also to satisfy ourselves that our work has meaningful effect, we must understand how our work can motivate change and how that change can be meaningfully measured. In a field as diverse as bioethics, the pathways to and metrics of change will likewise be diverse. It is therefore critical that any assessment of the impact of bioethics research and scholarship be informed by an understanding of the nature of the work, its goals, and how those goals can and ought to be furthered. In this paper, we propose a conceptual model that connects individual bioethics projects to the broader goals of scholarship, describing the translation of research and scholarly output into changes in thinking, practice, and policy. One of the key implications of the model is that impact in bioethics is generally the result of a collection of projects rather than of any single piece of research or scholarship. Our goal is to lay the groundwork for a thoroughgoing conversation about bioethics research and scholarship that will advance and shape the important conversation about their impact.

  9. Methodology of a diabetes prevention translational research project utilizing a community-academic partnership for implementation in an underserved Latino community

    Directory of Open Access Journals (Sweden)

    Ma Yunsheng

    2009-03-01

    Full Text Available Abstract. Background: Latinos comprise the largest racial/ethnic group in the United States and have 2-3 times the prevalence of type 2 diabetes mellitus as Caucasians. Methods and design: The Lawrence Latino Diabetes Prevention Project (LLDPP) is a community-based translational research study which aims to reduce the risk of diabetes among Latinos who have a ≥ 30% probability of developing diabetes in the next 7.5 years per a predictive equation. The project was conducted in Lawrence, Massachusetts, a predominantly Caribbean-origin urban Latino community. Individuals were identified primarily from a community health center's patient panel, screened for study eligibility, randomized to either a usual care or a lifestyle intervention condition, and followed for one year. Like the efficacious Diabetes Prevention Program (DPP), the LLDPP intervention targeted weight loss through dietary change and increased physical activity. However, unlike the DPP, the LLDPP intervention was less intensive, tailored to literacy needs and cultural preferences, and delivered in Spanish. The group format of the intervention (13 group sessions over 1 year) was complemented by 3 individual home visits and was implemented by individuals from the community with training and supervision by a clinical research nutritionist and a behavioral psychologist. Study measures included demographics, Stern predictive equation components (age, gender, ethnicity, fasting glucose, systolic blood pressure, HDL-cholesterol, body mass index, and family history of diabetes), glycosylated hemoglobin, dietary intake, physical activity, depressive symptoms, social support, quality of life, and medication use. Body weight was measured at baseline, 6-months, and one-year; all other measures were assessed at baseline and one-year. All surveys were orally administered in Spanish. Results: A community-academic partnership enabled the successful recruitment, intervention, and assessment of Latinos at

  10. Methodology of a diabetes prevention translational research project utilizing a community-academic partnership for implementation in an underserved Latino community.

    Science.gov (United States)

    Merriam, Philip A; Tellez, Trinidad L; Rosal, Milagros C; Olendzki, Barbara C; Ma, Yunsheng; Pagoto, Sherry L; Ockene, Ira S

    2009-03-13

    Latinos comprise the largest racial/ethnic group in the United States and have 2-3 times the prevalence of type 2 diabetes mellitus as Caucasians. The Lawrence Latino Diabetes Prevention Project (LLDPP) is a community-based translational research study which aims to reduce the risk of diabetes among Latinos who have a >/= 30% probability of developing diabetes in the next 7.5 years per a predictive equation. The project was conducted in Lawrence, Massachusetts, a predominantly Caribbean-origin urban Latino community. Individuals were identified primarily from a community health center's patient panel, screened for study eligibility, randomized to either a usual care or a lifestyle intervention condition, and followed for one year. Like the efficacious Diabetes Prevention Program (DPP), the LLDPP intervention targeted weight loss through dietary change and increased physical activity. However, unlike the DPP, the LLDPP intervention was less intensive, tailored to literacy needs and cultural preferences, and delivered in Spanish. The group format of the intervention (13 group sessions over 1 year) was complemented by 3 individual home visits and was implemented by individuals from the community with training and supervision by a clinical research nutritionist and a behavioral psychologist. Study measures included demographics, Stern predictive equation components (age, gender, ethnicity, fasting glucose, systolic blood pressure, HDL-cholesterol, body mass index, and family history of diabetes), glycosylated hemoglobin, dietary intake, physical activity, depressive symptoms, social support, quality of life, and medication use. Body weight was measured at baseline, 6-months, and one-year; all other measures were assessed at baseline and one-year. All surveys were orally administered in Spanish. A community-academic partnership enabled the successful recruitment, intervention, and assessment of Latinos at risk of diabetes with a one-year study retention rate of 93%. 
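
    The predictive equation used for eligibility above is a logistic risk model. The coefficients and threshold mechanics below are invented placeholders (the study uses the published Stern equation, whose actual coefficients are not reproduced here), so this only sketches the screening computation:

    ```python
    import math

    def risk(intercept, coeffs, features):
        """Logistic risk: P = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))).

        All coefficients here are hypothetical, not the Stern equation's.
        """
        z = intercept + sum(b * features[name] for name, b in coeffs.items())
        return 1.0 / (1.0 + math.exp(-z))

    COEFFS = {"age": 0.03, "fasting_glucose": 0.02, "bmi": 0.05}  # invented
    person = {"age": 50, "fasting_glucose": 105, "bmi": 32}
    p = risk(-6.0, COEFFS, person)
    eligible = p >= 0.30            # study criterion: >= 30% 7.5-year risk
    ```

    In a screening pipeline, each candidate's record is pushed through the equation and only those above the risk threshold are randomized, which is how the study concentrates the intervention on high-risk individuals.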

  11. Application of Box-Behnken design and response surface methodology for modeling of some Turkish coals

    Energy Technology Data Exchange (ETDEWEB)

    N. Aslan; Y. Cebeci [Cumhuriyet University, Sivas (Turkey). Mining Engineering Department

    2007-01-15

    The aim of our research was to apply Box-Behnken experimental design and response surface methodology for modeling of some Turkish coals. As a base for this study, standard Bond grindability tests were initially done and Bond work index (Wi) values were calculated for three Turkish coals. The Box-Behnken experimental design was used to provide data for modeling, and the variables of the model were Bond work index, grinding time and ball diameter of the mill. Coal grinding tests were performed varying these three variables for three size fractions of coals (-3350 + 1700 µm, -1700 + 710 µm and -710 µm). Using these sets of experimental data obtained by a mathematical software package (MATLAB 7.1), mathematical models were then developed to show the effect of each parameter and their interactions on product 80% passing size (d80). Predicted values of d80 obtained using the model equations were in good agreement with the experimental values of d80 (R² value of 0.96 for -3350 + 1700 µm, R² value of 0.98 for -1700 + 710 µm and R² value of 0.94 for -710 µm). This study proved that Box-Behnken design and response surface methodology can be applied efficiently to modeling the grinding of some Turkish coals. 19 refs., 14 figs., 6 tabs.
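
    A response surface captures the curvature that a linear model misses. As a minimal illustration (invented data, one factor rather than the paper's three), a quadratic surface for d80 versus grinding time can be fitted through three design points and used to predict an untested setting:

    ```python
    def fit_quadratic(x, y):
        """Fit the exact quadratic through three points via divided differences.

        Returns a callable predictor; a real response-surface fit would use
        least squares over all Box-Behnken design points and three factors.
        """
        (x0, x1, x2), (y0, y1, y2) = x, y
        d1 = (y1 - y0) / (x1 - x0)                      # slope term
        d2 = ((y2 - y1) / (x2 - x1) - d1) / (x2 - x0)   # curvature term
        return lambda t: y0 + d1 * (t - x0) + d2 * (t - x0) * (t - x1)

    # Hypothetical design points: grinding time (min) vs product d80 (um)
    d80_hat = fit_quadratic((5.0, 10.0, 15.0), (2000.0, 1200.0, 800.0))
    ```

    The fitted surface reproduces the design points exactly and interpolates between them, e.g. predicting d80 at a grinding time of 12 min that was never tested.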

  12. Translation Nation

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The International Federation of Translators will hold its largest-ever world congress in China on the eve of the 2008 Olympic Games. China's position as a powerhouse of the translation industry is to be cemented.

  13. Translating Europe

    Directory of Open Access Journals (Sweden)

    Yves Chevrel

    2007-07-01

    Europe thinks in many languages and Europe is a land of translation. Translation is a means of transmitting culture, a means of making it available to others and an invitation to share. It is a cement which binds Europe together.

  14. Object-oriented modelling with unified modelling language 2.0 for simple software application based on agile methodology

    CERN Document Server

    Warnars, Spits

    2010-01-01

    Unified modelling language (UML) 2.0, introduced in 2002, has been developing and influencing object-oriented software engineering and has become a standard and reference for information system analysis and design modelling. There are many concepts and theories for modelling an information system or software application with UML 2.0, which can create ambiguities and inconsistencies for a novice learning how to model a system with UML, especially with UML 2.0. This article discusses how to model a simple software application by using some of the diagrams of UML 2.0, rather than the whole set of diagrams, as suggested by agile methodology. Agile methodology is considered convenient for novices because it can deliver the information technology environment to the end-user quickly and adaptively with minimal documentation. It also has the ability to deliver the best-performing software application according to the customer's needs. Agile methodology will make a simple model with simple documentation, simple team and si...

  15. L-leucine partially rescues translational and developmental defects associated with zebrafish models of Cornelia de Lange syndrome.

    Science.gov (United States)

    Xu, Baoshan; Sowa, Nenja; Cardenas, Maria E; Gerton, Jennifer L

    2015-03-15

    Cohesinopathies are human genetic disorders that include Cornelia de Lange syndrome (CdLS) and Roberts syndrome (RBS) and are characterized by defects in limb and craniofacial development as well as mental retardation. The developmental phenotypes of CdLS and other cohesinopathies suggest that mutations in the structure and regulation of the cohesin complex during embryogenesis interfere with gene regulation. In a previous project, we showed that RBS was associated with highly fragmented nucleoli and defects in both ribosome biogenesis and protein translation. l-leucine stimulation of the mTOR pathway partially rescued translation in human RBS cells and development in zebrafish models of RBS. In this study, we investigate protein translation in zebrafish models of CdLS. Our results show that phosphorylation of RPS6 as well as 4E-binding protein 1 (4EBP1) was reduced in nipbla/b, rad21 and smc3-morphant embryos, a pattern indicating reduced translation. Moreover, protein biosynthesis and rRNA production were decreased in the cohesin morphant embryo cells. l-leucine partly rescued protein synthesis and rRNA production in the cohesin morphants and partially restored phosphorylation of RPS6 and 4EBP1. Concomitantly, l-leucine treatment partially improved cohesinopathy embryo development including the formation of craniofacial cartilage. Interestingly, we observed that alpha-ketoisocaproate (α-KIC), which is a keto derivative of leucine, also partially rescued the development of rad21 and nipbla/b morphants by boosting mTOR-dependent translation. In summary, our results suggest that cohesinopathies are caused in part by defective protein synthesis, and stimulation of the mTOR pathway through l-leucine or its metabolite α-KIC can partially rescue development in zebrafish models for CdLS. © The Author 2014. Published by Oxford University Press.

  16. Concepts and methodologies for modeling and simulation a tribute to Tuncer Oren

    CERN Document Server

    Yilmaz, Levent

    2015-01-01

    This comprehensive text/reference presents cutting-edge advances in the theory and methodology of modeling and simulation (M&S), and reveals how this work has been influenced by the fundamental contributions of Professor Tuncer Ören to this field. Exploring the synergies among the domains of M&S and systems engineering (SE), the book describes how M&S and SE can help to address the complex problems identified as "Grand Challenges" more effectively under a model-driven and simulation-directed systems engineering framework. Topics and features: examines frameworks for the development of advan

  17. A methodology to model causal relationships on offshore safety assessment focusing on human and organizational factors.

    Science.gov (United States)

    Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B

    2008-01-01

    Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense that it will be capable of accommodating modeling of multiple risk factors considered in offshore operations and will have the ability to deal with different types of data that may come from different resources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and Bayesian Network (BN) is tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level-structure model to address latent failures within the causal sequence of events. The five levels include Root causes level, Trigger events level, Incidents level, Accidents level, and Consequences level. To analyze and model a specified offshore installation safety, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities regarding the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BN can be jointly used in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs that are capable of providing graphical demonstration of inter-relationships as well as calculating numerical values of occurrence likelihood for each failure event. Bayesian inference mechanism also makes it possible to monitor how a safety situation changes when information flow travel forwards and backwards within the networks. On the other hand, BN modeling relies heavily on experts' personal experiences and is therefore highly domain specific. "Swiss cheese" model is such a theoretic framework that it is based on solid behavioral theory and therefore can be used to provide industry with a roadmap for BN modeling and
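
    The five-level structure can be miniaturized to show how a BN propagates probabilities both forwards and backwards. The three-node chain and every probability below are invented for illustration, not taken from the paper's offshore model:

    ```python
    # Chain: root cause R -> trigger event T -> incident I
    p_r = 0.10                                # prior P(R)
    p_t_given = {True: 0.60, False: 0.05}     # P(T | R)
    p_i_given = {True: 0.30, False: 0.01}     # P(I | T)

    # Forward inference: marginal probability of an incident
    p_t = p_r * p_t_given[True] + (1 - p_r) * p_t_given[False]
    p_i = p_t * p_i_given[True] + (1 - p_t) * p_i_given[False]

    # Backward (diagnostic) inference via Bayes: P(R | incident observed)
    p_i_given_r = (p_t_given[True] * p_i_given[True]
                   + (1 - p_t_given[True]) * p_i_given[False])
    p_r_given_i = p_r * p_i_given_r / p_i
    ```

    Observing an incident raises the belief in the root cause well above its prior, which is the "information flow travels backwards" behaviour the abstract describes; a full model would simply have more nodes per level and expert-elicited conditional tables.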

  18. Literal Translation and Free Translation

    Institute of Scientific and Technical Information of China (English)

    彭佼

    2011-01-01

    Dispute over the method of literal translation and that of free translation has a long history in China. In the East Jin Dynasty, Daoan (道安, 314-385), a well-known monk, was the representative of those who firmly advocated literal translation. Since he feared that free translation might not be true to the original, he advocated strict literal translation so as to preserve the true features. Works under his direction were typical of word-for-word translation, in which no alteration was made except accidental changes in word order.

  19. Business Model Change Methodology: Applying New Technology in Organization: The Case of Mobile Technology in Learning Industry

    OpenAIRE

    Nastaran Hajiheydari; Payam Hanafizadeh

    2013-01-01

    The present study intends to design a methodology for examining the influence of modern information and communication technology on business models (BMs). Theoretical framework is mainly selected based on literature as well as consultation with expert focus groups. This methodology is validated by expert judgment and simulated as a real case applying system dynamics. The outcome of the survey includes a change methodology formulated in 5 phases and 37 activities. Not only has this study cover...

  20. Methodological considerations for economic modelling of latent tuberculous infection screening in migrants.

    Science.gov (United States)

    Shedrawy, J; Siroka, A; Oxlade, O; Matteelli, A; Lönnroth, K

    2017-09-01

    Tuberculosis (TB) in migrants from endemic to low-incidence countries results mainly from the reactivation of latent tuberculous infection (LTBI). LTBI screening policies for migrants vary greatly between countries, and the evidence on the cost-effectiveness of the different approaches is weak and heterogeneous. The aim of this review was to assess the methodology used in published economic evaluations of LTBI screening among migrants to identify critical methodological options that must be considered when using modelling to determine value for money from different economic perspectives. Three electronic databases were searched and 10 articles were included. There was considerable variation across this small number of studies with regard to economic perspective, main outcomes, modelling technique, screening options and target populations considered, as well as in parameterisation of the epidemiological situation, test accuracy, efficacy, safety and programme performance. Only one study adopted a societal perspective; others adopted a health care or wider government perspective. Parameters representing the cascade of screening and treating LTBI varied widely, with some studies using highly aspirational scenarios. This review emphasises the need for a more harmonised approach for economic analysis, and better transparency in how policy options and economic perspectives influence methodological choices. Variability is justifiable for some parameters. However, sufficient data are available to standardise others. A societal perspective is ideal, but can be challenging due to limited data. Assumptions about programme performance should be based on empirical data or at least realistic assumptions. Results should be interpreted within specific contexts and policy options, with cautious generalisations.
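
    The core quantity in most of the reviewed economic evaluations is the incremental cost-effectiveness ratio (ICER), comparing a screening strategy against a comparator. A minimal sketch with invented numbers (not drawn from any of the ten reviewed studies):

    ```python
    def icer(cost_new, cost_base, effect_new, effect_base):
        """Incremental cost per unit of health effect (e.g., per QALY gained)."""
        d_effect = effect_new - effect_base
        if d_effect <= 0:
            raise ValueError("new strategy is not more effective")
        return (cost_new - cost_base) / d_effect

    # Illustrative: screening costs 900 vs 500 per migrant, gains 0.02 QALY
    ratio = icer(cost_new=900.0, cost_base=500.0,
                 effect_new=0.05, effect_base=0.03)
    # 400 extra cost / 0.02 QALY gained -> 20000 per QALY
    ```

    The methodological choices the review highlights (perspective, cascade-of-care parameters, programme performance) all enter through the cost and effect inputs, which is why two models of the same policy can report very different ratios.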

  1. Economic modeling of electricity production from hot dry rock geothermal reservoirs: methodology and analyses. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Cummings, R.G.; Morris, G.E.

    1979-09-01

    An analytical methodology is developed for assessing alternative modes of generating electricity from hot dry rock (HDR) geothermal energy sources. The methodology is used in sensitivity analyses to explore relative system economics. The methodology used a computerized, intertemporal optimization model to determine the profit-maximizing design and management of a unified HDR electric power plant with a given set of geologic, engineering, and financial conditions. By iterating this model on price, a levelized busbar cost of electricity is established. By varying the conditions of development, the sensitivity of both optimal management and busbar cost to these conditions is explored. A plausible set of reference case parameters is established at the outset of the sensitivity analyses. This reference case links a multiple-fracture reservoir system to an organic, binary-fluid conversion cycle. A levelized busbar cost of 43.2 mills/kWh ($1978) was determined for the reference case, which had an assumed geothermal gradient of 40°C/km, a design well-flow rate of 75 kg/s, an effective heat transfer area per pair of wells of 1.7 × 10⁶ m², and a plant design temperature of 160°C. Variations in the presumed geothermal gradient, size of the reservoir, drilling costs, real rates of return, and other system parameters yield minimum busbar costs between -40% and +76% of the reference case busbar cost.
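
    A levelized busbar cost is, in essence, the discounted lifetime cost divided by the discounted lifetime generation. The toy figures below are not the report's (those come from the intertemporal optimization model); they only show the mechanics:

    ```python
    def levelized_cost(capital, annual_cost, annual_energy, rate, years):
        """Levelized cost = PV of all costs / PV of all energy delivered."""
        pv_costs = capital          # capital outlay at time zero
        pv_energy = 0.0
        for t in range(1, years + 1):
            df = 1.0 / (1.0 + rate) ** t
            pv_costs += annual_cost * df
            pv_energy += annual_energy * df
        return pv_costs / pv_energy

    # Illustrative: 100 capital, 10/yr O&M, 50 units energy/yr, 10%, 3 years
    lc = levelized_cost(100.0, 10.0, 50.0, 0.10, 3)
    ```

    Iterating the optimization model on price until revenue covers this quantity is what "iterating this model on price" accomplishes in the report's methodology.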

  2. Testing spectral models for stellar populations with star clusters: I. Methodology

    CERN Document Server

    Fernandes, Roberto Cid

    2009-01-01

    High resolution spectral models for simple stellar populations (SSP) developed in the past few years have become a standard ingredient in studies of stellar population of galaxies. As more such models become available, it becomes increasingly important to test them. In this and a companion paper, we test a suite of publicly available evolutionary synthesis models using integrated optical spectra in the blue-near-UV range of 27 well studied star clusters from the work of Leonardi & Rose (2003), spanning a wide range of ages and metallicities. Most (23) of the clusters are from the Magellanic Clouds. This paper concentrates on methodological aspects of spectral fitting. The data are fitted with SSP spectral models from Vazdekis and collaborators, based on the MILES library. Best-fit and Bayesian estimates of age, metallicity and extinction are presented, and degeneracies between these parameters are mapped. We find that these models can match the observed spectra very well in most cases, with small formal uncertainties in t, Z and A_V.

  3. Comparative Heat Conduction Model of a Cold Storage with PUF & EPS Insulation Using Taguchi Methodology

    Directory of Open Access Journals (Sweden)

    Dr. N. Mukhopadhyay

    2015-05-01

    Full Text Available In this project work a mathematical heat conduction model of a cold storage (developed with the help of a computer program and multiple regression analysis) has been proposed, which can be used for the further development of cold storages in the future. In a cold storage, the refrigeration system brings down the temperature initially during start-up, but thermal insulation maintains the temperature continuously thereafter. In this view, a simple methodology is presented to calculate heat transfer by an analytical method, and an attempt has been made to minimize energy consumption by replacing 150 mm expanded polystyrene (EPS) with 100 mm polyurethane foam (PUF) insulation. The methodology is validated against actual data obtained from the Penguin cold storage situated in Pune, India. Insulation thickness of the side walls (TW), area of the wall (AW), and insulation thickness of the roof (TR) have been chosen as predictor variables of the study.
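The EPS-versus-PUF comparison rests on steady-state one-dimensional conduction, Q = k·A·ΔT/L. A minimal sketch; the conductivities are typical handbook values, and the wall area and temperature difference are hypothetical, not taken from the Penguin cold storage data:

```python
def conduction_loss_w(k, area_m2, delta_t, thickness_m):
    """Steady-state one-dimensional conduction: Q = k * A * dT / L, in watts."""
    return k * area_m2 * delta_t / thickness_m

# Assumed thermal conductivities in W/(m.K) -- typical handbook values
K_EPS, K_PUF = 0.035, 0.023
A, DT = 100.0, 30.0   # hypothetical wall area (m^2) and temperature difference (K)

q_eps = conduction_loss_w(K_EPS, A, DT, 0.150)  # 150 mm EPS
q_puf = conduction_loss_w(K_PUF, A, DT, 0.100)  # 100 mm PUF
```

Under these assumed values, 100 mm of PUF conducts slightly less heat (690 W) than 150 mm of EPS (700 W), illustrating how a thinner layer of the better insulator can match or beat the thicker one.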

  4. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social networks analysis. In order to illustrate these methods in action, two cases based in materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.

  5. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    Science.gov (United States)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of the spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.

  6. Sharing on the Web 3D Models of Ancient Theatres: A Methodological Workflow

    Science.gov (United States)

    Scianna, A.; La Guardia, M.; Scaduto, M. L.

    2016-06-01

    In the last few years, the need to share on the Web the knowledge of Cultural Heritage (CH) through navigable 3D models has increased. This need requires the availability of Web-based virtual reality systems and 3D WebGIS. In order to make the information available to all stakeholders, these instruments should be powerful and at the same time very user-friendly. However, research and experiments carried out so far show that a standardized methodology does not exist. This is due both to the complexity and size of the geometric models to be published, on the one hand, and to the excessive costs of hardware and software tools, on the other. In light of this background, the paper describes a methodological approach for creating 3D models of CH, freely exportable on the Web, based on HTML5 and free and open source software. HTML5, supporting the WebGL standard, allows the exploration of 3D spatial models using the most widely used Web browsers, such as Chrome, Firefox, Safari and Internet Explorer. The methodological workflow described here has been tested in the construction of a multimedia geo-spatial platform developed for the three-dimensional exploration and documentation of the ancient theatres of Segesta and of Carthage, and the surrounding landscapes. The experimental application has allowed us to explore the potential and limitations of sharing 3D CH models on the Web based on the WebGL standard. Sharing capabilities could be extended by defining suitable geospatial Web services based on the capabilities of HTML5 and WebGL technology.

  7. Translation-A Process of Transformation

    Institute of Scientific and Technical Information of China (English)

    Viola Zhu

    2008-01-01

    Translation is a process that involves transformation and reproduction. This essay discusses some useful techniques of translating practice by introducing a simple model. A few examples of good translation are presented to help explain the model clearly.

  8. The Need for a Model of Translational Mind Science Justice Research

    Directory of Open Access Journals (Sweden)

    Phillip Atiba Goff

    2013-12-01

    Full Text Available Despite the historical importance of translational research to social psychological investigations of social justice issues, the culture and incentives of contemporary social psychology are ambivalent towards non-experimental field research. This ambivalence poses a significant impediment to social psychology's role in societal change. This paper offers a brief history of how the field evolved from a relative emphasis on translating social psychology from the laboratory to the field (and back) to the present moment. In doing so, we enumerate the most significant impediments to contemporary translational social psychology, namely that conducting translational research often involves greater cost, greater difficulty advancing psychological theory, and more time navigating logistics compared with basic laboratory research. Finally, using the example of recent multi-investigator research on race and gender equity in policing, we outline emerging strategies for how to conduct translational research amidst contemporary impediments, and offer modest suggestions for how the field can better facilitate this kind of research in the future. Taken together, this review offers a set of theoretical and practical suggestions for easing the path from research to societal change.

  9. Fear extinction and BDNF: translating animal models of PTSD to the clinic.

    Science.gov (United States)

    Andero, R; Ressler, K J

    2012-07-01

    Brain-derived neurotrophic factor (BDNF) is the most studied neurotrophin involved in synaptic plasticity processes that are required for long-term learning and memory. Specifically, BDNF gene expression and activation of its high-affinity tropomyosin-related kinase B (TrkB) receptor are necessary in the amygdala, hippocampus and prefrontal cortex for the formation of emotional memories, including fear memories. Among the psychiatric disorders with altered fear processing is post-traumatic stress disorder (PTSD), which is characterized by an inability to extinguish fear memories. Since BDNF appears to enhance the extinction of fear, targeting impaired extinction in anxiety disorders such as PTSD via BDNF signalling may be an important and novel way to enhance treatment efficacy. The aim of this review is to provide a translational point of view that stems from findings in the BDNF regulation of synaptic plasticity and fear extinction. In addition, there are different systems that seem to alter fear extinction through BDNF modulation, such as the endocannabinoid system and the hypothalamic-pituitary-adrenal axis. Recent work also finds that the pituitary adenylate cyclase-activating polypeptide and PAC1 receptor, which are upstream of BDNF activation, may be implicated in PTSD. Especially interesting are data that exogenous fear extinction enhancers such as antidepressants, histone deacetylase inhibitors and D-cycloserine, a partial N-methyl-D-aspartate agonist, may act through or in concert with the BDNF-TrkB system. Finally, we review studies where recombinant BDNF and a putative TrkB agonist, 7,8-dihydroxyflavone, may enhance extinction of fear. These approaches may lead to novel agents that improve extinction in animal models and eventually humans.

  10. Translation of Methodology used in Human Myocardial Imaging to a Sheep Model of Acute Myocardial Infarction

    Directory of Open Access Journals (Sweden)

    Elizabeth A Bailey

    2013-10-01

    Full Text Available Introduction: Pre-clinical investigation of stem cells for repairing damaged myocardium has predominantly used rodents; however, large animals have a cardiac circulation closely resembling the human heart. The aim of this study was to evaluate whether SPECT/CT myocardial perfusion imaging (MPI) could be used for assessing sheep myocardium following an acute myocardial infarction (MI) and the response to intervention. Method: Eighteen sheep were enrolled in a pilot study to evaluate [99mTc]-sestamibi MPI at baseline, post-MI and after therapy. Modifications to the standard MPI protocols were developed. All data were reconstructed with OSEM using CT-derived attenuation and scatter correction. Standard analyses were performed and inter-observer agreement was measured using kappa (κ). A power calculation determined the sample sizes needed to show statistically significant changes due to intervention. Results: Ten sheep completed the full protocol. Data processing was performed using pre-existing hardware and software used in human MPI scanning. No improvement in perfusion was seen in the control group; however, improvements of 15%-35% were seen after intra-myocardial stem cell administration. Inter-observer agreement was excellent (κ=0.89). Using a target power of 0.9, 28 sheep were required to detect a 10-12% change in perfusion. Conclusions: The study demonstrates the suitability of large animal models for imaging with standard MPI protocols and its feasibility with a manageable number of animals. These protocols could be translated into humans to study the efficacy of stem cell therapy in heart regeneration and repair.
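The sample-size calculation reported above (power 0.9 to detect a 10-12% perfusion change) can be approximated with the standard two-sample normal-approximation formula, n per group = 2·((z₁₋α/₂ + z₁₋β)·σ/δ)². A sketch under assumed values; the standard deviation below is hypothetical, not from the study:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(sigma, delta, alpha=0.05, power=0.9):
    """Normal-approximation sample size per group for a two-sample comparison
    of means: n = 2 * ((z_{1-alpha/2} + z_{1-beta}) * sigma / delta) ** 2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_b = NormalDist().inv_cdf(power)           # target power
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# Hypothetical: detect a 10-point perfusion change assuming an SD of 12 points
n = n_per_group(sigma=12.0, delta=10.0)
```

With these assumed inputs the formula gives 31 per group; the study's figure of 28 sheep would follow from its own (unstated here) variance estimate and design.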

  12. Ab initio optimization principle for the ground states of translationally invariant strongly correlated quantum lattice models

    Science.gov (United States)

    Ran, Shi-Ju

    2016-05-01

    In this work, a simple and fundamental numeric scheme dubbed the ab initio optimization principle (AOP) is proposed for the ground states of translationally invariant strongly correlated quantum lattice models. The idea is to transform a nondeterministic-polynomial-hard ground-state simulation with infinite degrees of freedom into a single optimization problem of a local function with a finite number of physical and ancillary degrees of freedom. This work contributes mainly in the following aspects: (1) AOP provides a simple and efficient scheme to simulate the ground state by solving a local optimization problem. Its solution contains two kinds of boundary states, one of which plays the role of the entanglement bath that mimics the interactions between a supercell and the infinite environment, while the other gives the ground state in a tensor network (TN) form. (2) In the sense of TN, a novel decomposition named tensor ring decomposition (TRD) is proposed to implement AOP. Instead of following the contraction-truncation scheme used by many existing TN-based algorithms, TRD solves the contraction of a uniform TN in an opposite way, by encoding the contraction in a set of self-consistent equations that automatically reconstruct the whole TN, making the simulation simple and unified. (3) AOP inherits and develops the ideas of several well-established methods, including the density matrix renormalization group (DMRG), infinite time-evolving block decimation (iTEBD), network contractor dynamics, and density matrix embedding theory, providing a unified perspective that was previously missing in this field. (4) AOP as well as TRD give novel implications for existing TN-based algorithms: a modified iTEBD is suggested, and the two-dimensional (2D) AOP is argued to be an intrinsic 2D extension of DMRG based on the infinite projected entangled pair state. This paper focuses on one-dimensional quantum models to present AOP. The benchmark is given on a transverse Ising model.

  13. Testing spectral models for stellar populations with star clusters - I. Methodology

    Science.gov (United States)

    Cid Fernandes, Roberto; González Delgado, Rosa M.

    2010-04-01

    High-resolution spectral models for simple stellar populations (SSP) developed in the past few years have become a standard ingredient in studies of stellar population of galaxies. As more such models become available, it becomes increasingly important to test them. In this and a companion paper, we test a suite of publicly available evolutionary synthesis models using integrated optical spectra in the blue-near-UV range of 27 well-studied star clusters from the work of Leonardi and Rose, spanning a wide range of ages and metallicities. Most (23) of the clusters are from the Magellanic Clouds. This paper concentrates on the methodological aspects of spectral fitting. The data are fitted with SSP spectral models from Vazdekis and collaborators, based on the Medium-resolution INT Library of Empirical Spectra. Best-fitting and Bayesian estimates of age, metallicity and extinction are presented, and degeneracies between these parameters are mapped. We find that these models can match the observed spectra very well in most cases, with small formal uncertainties in t, Z and A_V. In some cases, the spectral fits indicate that the models lack a blue old population, probably associated with the horizontal branch. This methodology, which is mostly based on the publicly available code STARLIGHT, is extended to other sets of models in Paper II, where a comparison with properties derived from spatially resolved data (colour-magnitude diagrams) is presented. The global aim of these two papers is to provide guidance to users of evolutionary synthesis models and empirical feedback to model makers.
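The grid-based fitting described above (best-fit and Bayesian estimates of age, metallicity and extinction, with χ² mapped over the parameter grid) can be illustrated on a toy grid. The `ssp_model` function below is a hypothetical stand-in for real SSP spectra, not the Vazdekis models:

```python
import numpy as np

wave = np.linspace(3500.0, 5500.0, 200)   # wavelength grid, Angstrom

def ssp_model(age, met, a_v):
    """Toy SSP spectrum: a smooth continuum whose amplitude, shape and slope
    depend on age, metallicity and extinction. Illustrative only."""
    flux = (1.0 + 0.1 * age) * np.exp(-0.5 * met * (wave / 5000.0 - 1.0) ** 2)
    return flux * 10.0 ** (-0.4 * a_v * (5500.0 / wave - 1.0))

# Small (t, Z, A_V) grid; "observed" spectrum generated from a grid point
ages, mets, avs = [1.0, 5.0, 10.0], [0.5, 1.0, 2.0], [0.0, 0.5, 1.0]
truth = (5.0, 1.0, 0.5)
obs = ssp_model(*truth)
err = 0.01 * np.ones_like(obs)

# Chi-square over the whole grid; Bayesian weights would be exp(-chi2 / 2)
chi2 = {(t, z, a): float(np.sum(((obs - ssp_model(t, z, a)) / err) ** 2))
        for t in ages for z in mets for a in avs}
best = min(chi2, key=chi2.get)
```

Marginalizing the exp(-χ²/2) weights over two of the three axes is how the degeneracy maps between t, Z and A_V mentioned in the abstract are typically produced.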

  14. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    Energy Technology Data Exchange (ETDEWEB)

    Lahtinen, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Launiainen, T.; Heljanko, K.; Ropponen, J. [Aalto Univ., Espoo (Finland). Dept. of Information and Computer Science

    2012-07-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)

  15. Avoiding and identifying errors in health technology assessment models: qualitative study and methodological review.

    Science.gov (United States)

    Chilcott, J; Tappenden, P; Rawdin, A; Johnson, M; Kaltenthaler, E; Paisley, S; Papaioannou, D; Shippam, A

    2010-05-01

    Health policy decisions must be relevant, evidence-based and transparent. Decision-analytic modelling supports this process but its role is reliant on its credibility. Errors in mathematical decision models or simulation exercises are unavoidable but little attention has been paid to processes in model development. Numerous error avoidance/identification strategies could be adopted but it is difficult to evaluate the merits of strategies for improving the credibility of models without first developing an understanding of error types and causes. The study aims to describe the current comprehension of errors in the HTA modelling community and generate a taxonomy of model errors. Four primary objectives are to: (1) describe the current understanding of errors in HTA modelling; (2) understand current processes applied by the technology assessment community for avoiding errors in development, debugging and critically appraising models for errors; (3) use HTA modellers' perceptions of model errors with the wider non-HTA literature to develop a taxonomy of model errors; and (4) explore potential methods and procedures to reduce the occurrence of errors in models. It also describes the model development process as perceived by practitioners working within the HTA community. A methodological review was undertaken using an iterative search methodology. Exploratory searches informed the scope of interviews; later searches focused on issues arising from the interviews. Searches were undertaken in February 2008 and January 2009. In-depth qualitative interviews were performed with 12 HTA modellers from academic and commercial modelling sectors. All qualitative data were analysed using the Framework approach. Descriptive and explanatory accounts were used to interrogate the data within and across themes and subthemes: organisation, roles and communication; the model development process; definition of error; types of model error; strategies for avoiding errors; strategies for

  16. Efficient methodologies for system matrix modelling in iterative image reconstruction for rotating high-resolution PET

    Energy Technology Data Exchange (ETDEWEB)

    Ortuno, J E; Kontaxakis, G; Rubio, J L; Santos, A [Departamento de Ingenieria Electronica (DIE), Universidad Politecnica de Madrid, Ciudad Universitaria s/n, 28040 Madrid (Spain); Guerra, P [Networking Research Center on Bioengineering, Biomaterials and Nanomedicine (CIBER-BBN), Madrid (Spain)], E-mail: juanen@die.upm.es

    2010-04-07

    A fully 3D iterative image reconstruction algorithm has been developed for high-resolution PET cameras composed of pixelated scintillator crystal arrays and rotating planar detectors, based on the ordered subsets approach. The associated system matrix is precalculated with Monte Carlo methods that incorporate physical effects not included in analytical models, such as positron range effects and interaction of the incident gammas with the scintillator material. Custom Monte Carlo methodologies have been developed and optimized for modelling of system matrices for fast iterative image reconstruction adapted to specific scanner geometries, without redundant calculations. According to the methodology proposed here, only one-eighth of the voxels within two central transaxial slices need to be modelled in detail. The rest of the system matrix elements can be obtained with the aid of axial symmetries and redundancies, as well as in-plane symmetries within transaxial slices. Sparse matrix techniques for the non-zero system matrix elements are employed, allowing for fast execution of the image reconstruction process. This 3D image reconstruction scheme has been compared in terms of image quality to a 2D fast implementation of the OSEM algorithm combined with Fourier rebinning approaches. This work confirms the superiority of fully 3D OSEM in terms of spatial resolution, contrast recovery and noise reduction as compared to conventional 2D approaches based on rebinning schemes. At the same time it demonstrates that fully 3D methodologies can be efficiently applied to the image reconstruction problem for high-resolution rotational PET cameras by applying accurate pre-calculated system models and taking advantage of the system's symmetries.
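The ordered-subsets reconstruction the abstract builds on can be sketched in a few lines. The tiny dense system matrix below is illustrative only; real PET system matrices are sparse and compressed via the symmetries described above:

```python
import numpy as np

def osem(A, y, n_iters=500, n_subsets=2):
    """Ordered-subsets EM for emission data y ~ Poisson(A @ x).

    Rows of the system matrix A (detector bins) are split into subsets;
    each sub-iteration applies the multiplicative EM update on one subset.
    """
    x = np.ones(A.shape[1])                       # flat initial image
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iters):
        for rows in subsets:
            As, ys = A[rows], y[rows]
            ratio = ys / np.maximum(As @ x, 1e-12)          # measured / projected
            sens = np.maximum(As.T @ np.ones(len(rows)), 1e-12)  # subset sensitivity
            x *= (As.T @ ratio) / sens
    return x

# Toy 4-bin, 2-voxel system with noiseless data
A = np.array([[1.0, 0.5], [0.2, 1.0], [0.7, 0.3], [0.1, 0.9]])
x_true = np.array([2.0, 3.0])
x_hat = osem(A, A @ x_true)
```

With noiseless, consistent data the true image is a fixed point of every subset update, so the iteration converges to it; with real Poisson data one stops early or regularizes.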

  17. Translation Techniques

    Directory of Open Access Journals (Sweden)

    Marcia Pinheiro

    2015-05-01

    Full Text Available In this paper, we discuss three translation techniques: literal, cultural, and artistic. Literal translation is a well-known technique, which means that it is quite easy to find sources on the topic. Cultural and artistic translation may be new terms. Whilst cultural translation focuses on matching contexts, artistic translation focuses on matching reactions. Because literal translation matches only words, it is not hard to find situations in which we should not use this technique. Because artistic translation focuses on reactions, judging the quality of an artistic translation work is one of the most difficult things one can do. We end up having a score of complexity and humanity for each one of the mentioned techniques: literal translation would be the closest thing we have to the machine world, and artistic translation would be the closest thing we have to the purely human world. By creating these classifications and studying the subtleties of each one of them, we are adding degrees of quality to our courses and to translation as a professional field. The main contribution of this paper is thus the formalization of this piece of knowledge. We also, however, lay the foundations for studies of this type.

  18. 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Koziel, Slawomir; Kacprzyk, Janusz; Leifsson, Leifur; Ören, Tuncer

    2015-01-01

    This book includes extended and revised versions of a set of selected papers from the 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2013) which was co-organized by the Reykjavik University (RU) and sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). SIMULTECH 2013 was held in cooperation with the ACM SIGSIM - Special Interest Group (SIG) on SImulation and Modeling (SIM), Movimento Italiano Modellazione e Simulazione (MIMOS) and AIS Special Interest Group on Modeling and Simulation (AIS SIGMAS) and technically co-sponsored by the Society for Modeling & Simulation International (SCS), Liophant Simulation, Simulation Team and International Federation for Information Processing (IFIP). These proceedings bring together researchers, engineers, applied mathematicians and practitioners working on advances and applications in the field of system simulation.

  19. Methodology of the Access to Care and Timing Simulation Model for Traumatic Spinal Cord Injury Care.

    Science.gov (United States)

    Santos, Argelio; Fallah, Nader; Lewis, Rachel; Dvorak, Marcel F; Fehlings, Michael G; Burns, Anthony Scott; Noonan, Vanessa K; Cheng, Christiana L; Chan, Elaine; Singh, Anoushka; Belanger, Lise M; Atkins, Derek

    2017-03-12

    Despite the relatively low incidence, the management and care of persons with traumatic spinal cord injury (tSCI) can be resource intensive and complex, spanning multiple phases of care and disciplines. Using a simulation model built with a system level view of the healthcare system allows for prediction of the impact of interventions on patient and system outcomes from injury through to community reintegration after tSCI. The Access to Care and Timing (ACT) project developed a simulation model for tSCI care using techniques from operations research and its development has been described previously. The objective of this article is to briefly describe the methodology and the application of the ACT Model as it was used in several of the articles in this focus issue. The approaches employed in this model provide a framework to look into the complexity of interactions both within and among the different SCI programs, sites and phases of care.
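A simulation model of this kind can be sketched as a minimal discrete-event loop. The single-resource patient-flow model below is a hypothetical illustration of the operations-research technique, not the ACT Model itself:

```python
import random

def simulate(n_patients, mean_interarrival, mean_stay, seed=0):
    """Minimal discrete-event sketch: patients arrive at random intervals,
    wait for a single care resource (e.g. a specialized bed), stay for a
    random time, then move on. Returns the mean wait (in days)."""
    rng = random.Random(seed)
    t_arrive, free_at, waits = 0.0, 0.0, []
    for _ in range(n_patients):
        t_arrive += rng.expovariate(1.0 / mean_interarrival)  # next arrival
        start = max(t_arrive, free_at)                        # wait if busy
        waits.append(start - t_arrive)
        free_at = start + rng.expovariate(1.0 / mean_stay)    # length of stay
    return sum(waits) / len(waits)

# Hypothetical rates: one arrival every 2 days, 1-day mean stay
mean_wait = simulate(1000, mean_interarrival=2.0, mean_stay=1.0)
```

Extending this loop with multiple phases of care, capacities and routing rules is the essence of system-level models like ACT; the value comes from varying interventions (extra capacity, faster transfer) and comparing predicted waits and outcomes.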

  20. The Relationships of Soft Systems Methodology (SSM), Business Process Modeling and e-Government

    Directory of Open Access Journals (Sweden)

    Arief Ramadhan

    2012-01-01

    Full Text Available e-Government has emerged in several countries. Because many aspects must be considered, and because e-Government involves some soft components, the Soft Systems Methodology (SSM) can be considered for use in the e-Government systems development process. On the other hand, business process modeling is nowadays essential in many fields, including e-Government. Some researchers have used SSM in e-Government, and several studies relating business process modeling to e-Government have been conducted. This paper tries to reveal the relationship between SSM and business process modeling. Moreover, it also tries to explain how business process modeling is integrated within SSM, and further links that integration to e-Government.

  1. Power Prediction Model for Turning EN-31 Steel Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    M. Hameedullah

    2010-01-01

    Full Text Available Power consumption in turning EN-31 steel (a material that is most extensively used in the automotive industry) with a tungsten carbide tool under different cutting conditions was experimentally investigated. The experimental runs were planned according to a 24+8 added centre point factorial design of experiments, replicated thrice. The data collected were statistically analyzed using the Analysis of Variance technique, and first-order and second-order power consumption prediction models were developed by using response surface methodology (RSM). It is concluded that the second-order model is more accurate than the first-order model and fits well with the experimental data. The model can be used in the automotive industries for deciding the cutting parameters for minimum power consumption and hence maximum productivity.
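A second-order RSM model of the kind described is a quadratic polynomial (linear, squared and interaction terms) fitted by least squares. A sketch with hypothetical coded factors, not the EN-31 data:

```python
import numpy as np

def design_matrix(X):
    """Second-order RSM design: intercept, x_i, x_i^2, and cross terms x_i*x_j."""
    n = X.shape[1]
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(n)]                 # linear terms
    cols += [X[:, i] ** 2 for i in range(n)]            # squared terms
    cols += [X[:, i] * X[:, j]                          # interactions
             for i in range(n) for j in range(i + 1, n)]
    return np.column_stack(cols)

# Hypothetical data: power as a quadratic function of two coded factors
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(30, 2))
y = 5.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 0] ** 2 + 0.3 * X[:, 0] * X[:, 1]

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
```

The fitted coefficients (column order: 1, x₀, x₁, x₀², x₁², x₀x₁) recover the generating quadratic exactly here because the synthetic response is noiseless; with real ANOVA-screened data the same fit yields the prediction surface used to pick minimum-power cutting parameters.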

  2. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios;

    2013-01-01

    An extended systematic methodology for the design of emulsion-based Chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one...

  4. DATA MINING METHODOLOGY FOR DETERMINING THE OPTIMAL MODEL OF COST PREDICTION IN SHIP INTERIM PRODUCT ASSEMBLY

    Directory of Open Access Journals (Sweden)

    Damir Kolich

    2016-03-01

    Full Text Available In order to accurately predict the costs of the thousands of interim products that are assembled in shipyards, it is necessary to use skilled engineers to develop detailed Gantt charts for each interim product separately, which takes many hours. It is helpful to develop a prediction tool to estimate the cost of interim products accurately and quickly without the need for skilled engineers. This will drive down shipyard costs and improve competitiveness. Data mining is used extensively for developing prediction models in other industries. Since ships consist of thousands of interim products, it is logical to develop a data mining methodology for a shipyard or any other manufacturing industry where interim products are produced. The methodology involves analysis of existing interim products and data collection. Pre-processing and principal component analysis are done to make the data "user-friendly" for later prediction processing and the development of both accurate and robust models. The support vector machine is demonstrated to be the better model when the number of tuples is smaller. However, as the number of tuples increases beyond 10,000, the artificial neural network model is recommended.
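The pre-processing step mentioned above (principal component analysis) can be sketched as a standardize-then-SVD projection; the feature set below is hypothetical, not the shipyard data:

```python
import numpy as np

def pca_transform(X, n_components):
    """Standardize each feature, then project onto the leading principal
    components obtained from the SVD of the standardized matrix."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:n_components].T   # component scores

# Hypothetical interim-product features: e.g. weight, weld length, part count, area
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 4))
Z = pca_transform(X, 2)
```

The decorrelated scores `Z` would then feed the SVM or neural-network cost model; the SVM-vs-ANN comparison itself is best done with a machine-learning library and is omitted here.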

  5. A New Mathematical Model for Flank Wear Prediction Using Functional Data Analysis Methodology

    Directory of Open Access Journals (Sweden)

    Sonja Jozić

    2014-01-01

    Full Text Available This paper presents a new approach improving the reliability of flank wear prediction during the end milling process. In the present work, prediction of flank wear has been achieved by using cutting parameters and force signals as the sensitive carriers of information about the machining process. A series of experiments were conducted to establish the relationship between flank wear and the cutting force components as well as the cutting parameters such as cutting speed, feed per tooth, and radial depth of cut. In order to predict flank wear, a new linear regression mathematical model has been developed by utilizing functional data analysis methodology. The regression coefficients of the model are time-dependent functions that have been determined through the use of functional data analysis methodology. The mathematical model has been developed by means of the applied cutting parameters and the cutting force components measured during the end milling of a workpiece made of 42CrMo4 steel. The efficiency and flexibility of the developed model have been verified by comparing it with a separate experimental data set.
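The idea of regression coefficients that are functions of time can be sketched as a pointwise least-squares fit on a discretized time grid. This is a simplification of functional data analysis (which would additionally smooth the coefficient functions with a basis expansion); shapes and names are illustrative:

```python
import numpy as np

def fit_time_varying_coeffs(X, Y):
    """Fit one least-squares coefficient vector per time sample.

    X : (n_experiments, n_features) cutting parameters / force features.
    Y : (n_experiments, n_times) flank wear measured over time.
    Returns betas of shape (n_features, n_times), a discretized beta(t).
    """
    n_times = Y.shape[1]
    betas = np.empty((X.shape[1], n_times))
    for t in range(n_times):
        # Ordinary least squares for this time sample only.
        betas[:, t], *_ = np.linalg.lstsq(X, Y[:, t], rcond=None)
    return betas
```

A proper functional data analysis would then represent each row of `betas` with, e.g., B-spline basis functions to obtain smooth coefficient curves.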

  6. Methodology of synchronization among strategy and operation. A standards-based modeling approach

    Directory of Open Access Journals (Sweden)

    VICTOR EDWIN COLLAZOS

    2017-05-01

    Full Text Available Enterprise Architecture (EA) has gained importance in recent years, mainly for its concept of “alignment” between the strategic and operational levels of organizations. Such alignment occurs when Information Technology (IT) is applied correctly and timely, working in synergy and harmony with strategy and the operation to achieve their goals mutually and satisfy organizational needs. Both the strategic and operational levels have standards that help model the elements necessary to obtain the desired results. In this sense, BMM and BPMN were selected because both have the support of the OMG and are fairly well known for modelling the strategic and operational levels, respectively. In addition, i* goal modeling can be used to reduce the gap between these two standards. This proposal may help both with the high-level design of the information system and with the appropriate identification of the business processes that will support it. This paper presents a methodology for aligning strategy and the operation based on standards and heuristics. We have made a classification of the elements of the models and, for some specific cases, an extension of the heuristics associated between them. This allows us to propose a methodology which uses the above-mentioned standards and combines mappings, transformations and actions to be considered in the alignment process.

  7. Electricity Capacity Expansion Modeling, Analysis, and Visualization: A Summary of High-Renewable Modeling Experiences (Chinese Translation)

    Energy Technology Data Exchange (ETDEWEB)

    Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zhou, Ella [National Renewable Energy Lab. (NREL), Golden, CO (United States); Getman, Dan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Arent, Douglas J. [Joint Inst. for Strategic Energy Analysis, Golden, CO (United States)

    2015-10-01

    This is the Chinese translation of NREL/TP-6A20-64831. Mathematical and computational models are widely used for the analysis and design of both physical and financial systems. Modeling the electric grid is of particular importance to China for three reasons. First, power-sector assets are expensive and long-lived, and they are critical to any country's development. China's electric load, transmission, and other energy-related infrastructure are expected to continue to grow rapidly; therefore it is crucial to understand and help plan for the future in which those assets will operate. Second, China has dramatically increased its deployment of renewable energy (RE), and is likely to continue further accelerating such deployment over the coming decades. Careful planning and assessment of the various aspects (technical, economic, social, and political) of integrating a large amount of renewables on the grid is required. Third, companies need the tools to develop a strategy for their own involvement in the power market China is now developing, and to enable a possible transition to an efficient and high RE future.

  8. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Indian Academy of Sciences (India)

    Diego Rivera; Yessica Rivas; Alex Godoy

    2015-02-01

    Hydrological models are simplified representations of natural processes and subject to errors. Uncertainty bounds are a commonly used way to assess the impact of an input or model architecture uncertainty on model outputs. Different sets of parameters could have equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs from a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range for the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m³s⁻¹ after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite the criticisms against the GLUE methodology, such as its lack of statistical formality, it is identified as a useful tool assisting the modeller with the identification of critical parameters.
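The GLUE screening step described above can be sketched in a few lines. This is a minimal illustration, not the paper's setup: Nash-Sutcliffe efficiency as the informal likelihood measure and a fixed behavioural threshold are both assumptions:

```python
import numpy as np

def glue_bounds(simulations, observed, threshold=0.5, quantiles=(0.05, 0.95)):
    """GLUE-style uncertainty bounds from Monte Carlo simulations.

    simulations : (n_parameter_sets, n_times) model outputs.
    observed    : (n_times,) observed series.
    Parameter sets whose Nash-Sutcliffe efficiency (NSE) exceeds
    `threshold` are kept as behavioural; likelihood-weighted quantiles
    of their outputs give the lower/upper bounds at each time step.
    """
    sse = ((simulations - observed) ** 2).sum(axis=1)
    sst = ((observed - observed.mean()) ** 2).sum()
    nse = 1.0 - sse / sst
    behavioural = nse > threshold
    weights = nse[behavioural] / nse[behavioural].sum()
    sims = simulations[behavioural]
    lower = np.empty(observed.size)
    upper = np.empty(observed.size)
    for t in range(observed.size):
        # Weighted empirical quantiles across behavioural simulations.
        order = np.argsort(sims[:, t])
        cum = np.cumsum(weights[order])
        lower[t] = sims[order, t][np.searchsorted(cum, quantiles[0])]
        upper[t] = sims[order, t][np.searchsorted(cum, quantiles[1])]
    return nse, lower, upper
```

Narrowing a parameter's range, as done for the areal-precipitation parameter in the study, amounts to discarding parameter sets and recomputing these bounds.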

  9. Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology

    Science.gov (United States)

    Macioł, Piotr; Michalik, Kazimierz

    2016-10-01

    Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is its high computational demands. Among other solutions, the parallelization of multiscale computations is a promising one. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelizing multiscale models based on the Agile Multiscale Methodology framework is discussed. A sequential, FEM based macroscopic model has been combined with concurrently computed fine-scale models, employing a MatCalc thermodynamic simulator. The main issues investigated in this work are: (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the decrease in quality of computations enforced by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law equations. The problem of `delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.
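Speed-up estimates of the kind mentioned above follow Amdahl's law, which bounds the gain from parallelizing only a fraction of the runtime. A one-function sketch:

```python
def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    """Amdahl's law: overall speed-up when a fraction p of the runtime is
    parallelized perfectly over n workers and the remainder stays serial.

    speedup = 1 / ((1 - p) + p / n)
    """
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_workers)
```

Here the sequential macroscopic FEM sub-model is the serial fraction, so even with many workers for the fine-scale computations the speed-up saturates at 1 / (1 - p).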

  10. A geostatistical methodology to assess the accuracy of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
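The node-by-node accounting can be sketched as a grid comparison. Array shapes and the single within/outside-measurement-error category are illustrative assumptions, not the report's exact bookkeeping:

```python
import numpy as np

def node_error_accounting(predicted, interpolated, measurement_error):
    """Compare model predictions with geostatistically interpolated
    observations node by node on a common grid.

    Returns the absolute error field, a boolean mask of nodes whose error
    is within measurement error, and the fraction of such nodes.
    """
    error = np.abs(np.asarray(predicted) - np.asarray(interpolated))
    within = error <= measurement_error
    return error, within, within.mean()
```

On the 16 x 16 x 36 grid of the study, the mask would directly flag regions, such as silt lenses, where model error exceeds measurement error.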

  11. A consistent modelling methodology for secondary settling tanks in wastewater treatment.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Nopens, Ingmar

    2011-03-01

    The aim of this contribution is partly to build consensus on a consistent modelling methodology (CMM) of complex real processes in wastewater treatment by combining classical concepts with results from applied mathematics, and partly to apply it to the clarification-thickening process in the secondary settling tank. In the CMM, the real process should be approximated by a mathematical model (process model; ordinary or partial differential equation (ODE or PDE)), which in turn is approximated by a simulation model (numerical method) implemented on a computer. These steps have often not been carried out in a correct way. The secondary settling tank was chosen as a case since this is one of the most complex processes in a wastewater treatment plant and simulation models developed decades ago have no guarantee of satisfying fundamental mathematical and physical properties. Nevertheless, such methods are still used in commercial tools to date. This particularly becomes of interest as the state-of-the-art practice is moving towards plant-wide modelling. Then all submodels interact and errors propagate through the model and severely hamper any calibration effort and, hence, the predictive purpose of the model. The CMM is described by applying it first to a simple conversion process in the biological reactor yielding an ODE solver, and then to the solid-liquid separation in the secondary settling tank, yielding a PDE solver. Time has come to incorporate established mathematical techniques into environmental engineering, and wastewater treatment modelling in particular, and to use proven reliable and consistent simulation models.

  12. Frescoed Vaults: Accuracy Controlled Simplified Methodology for Planar Development of Three-Dimensional Textured Models

    Directory of Open Access Journals (Sweden)

    Marco Giorgio Bevilacqua

    2016-03-01

    Full Text Available In the field of documentation and preservation of cultural heritage, there is keen interest in 3D metric viewing and rendering of architecture for both formal appearance and color. On the other hand, operative steps of restoration interventions still require full-scale, 2D metric surface representations. The transition from 3D to 2D representation, with the related geometric transformations, has not yet been fully formalized for the planar development of frescoed vaults. Methodologies proposed so far on this subject provide for transitioning from point cloud models to ideal mathematical surfaces and for projecting textures using software tools. The methodology used for geometry and texture development in the present work does not require any dedicated software. The different processing steps can be individually checked for any error introduced, which can then be quantified. A direct accuracy check of the planar development of the frescoed surface has been carried out by qualified restorers, yielding a result of 3 mm. The proposed methodology, although requiring further studies to improve the automation of the different processing steps, allowed extracting 2D drafts fully usable by operators restoring the vault frescoes.

  13. A Model-Based Methodology for Spray-Drying Process Development.

    Science.gov (United States)

    Dobry, Dan E; Settell, Dana M; Baumann, John M; Ray, Rod J; Graham, Lisa J; Beyerinck, Ron A

    2009-09-01

    Solid amorphous dispersions are frequently used to improve the solubility and, thus, the bioavailability of poorly soluble active pharmaceutical ingredients (APIs). Spray-drying, a well-characterized pharmaceutical unit operation, is ideally suited to producing solid amorphous dispersions due to its rapid drying kinetics. This paper describes a novel flowchart methodology based on fundamental engineering models and state-of-the-art process characterization techniques that ensure that spray-drying process development and scale-up are efficient and require minimal time and API. This methodology offers substantive advantages over traditional process-development methods, which are often empirical and require large quantities of API and long development times. This approach is also in alignment with the current guidance on Pharmaceutical Development Q8(R1). The methodology is used from early formulation-screening activities (involving milligrams of API) through process development and scale-up for early clinical supplies (involving kilograms of API) to commercial manufacturing (involving metric tons of API). It has been used to progress numerous spray-dried dispersion formulations, increasing bioavailability of formulations at preclinical through commercial scales.

  14. A New Methodology for Building-Up a Robust Model for Heliostat Field Flux Characterization

    Directory of Open Access Journals (Sweden)

    Nicolás C. Cruz

    2017-05-01

    Full Text Available The heliostat field of solar central receiver systems (SCRS) is formed by hundreds, even thousands, of working heliostats. Their adequate configuration and control define a currently active research line. For instance, automatic aiming methodologies for existing heliostat fields are being widely studied. In general, control techniques require a model of the system to be controlled in order to obtain an estimation of its states. However, this kind of information may not be available or may be hard to obtain for every plant to be studied. In this work, an innovative methodology for data-based analytical heliostat field characterization is proposed and described. It formalizes the way in which the behavior of a whole field can be derived from the study of its most descriptive parts. By successfully applying this procedure, the instantaneous behavior of a field can be expressed by a reduced set of expressions that can be seen as a field descriptor. It is not intended to replace real experimentation but to enhance researchers' autonomy to build their own reliable and portable synthetic datasets at preliminary stages of their work. The methodology proposed in this paper is successfully applied to a virtual field. Only 30 heliostats out of 541 were studied to characterize the whole field. For the validation set, the average difference in power between the flux maps directly fitted from the measured information and the estimated ones is only 0.67% (just 0.10946 kW/m2 of root-mean-square error, on average, between them). According to these results, a consistent field descriptor can be built by applying the proposed methodology, which is hence ready for use.

  15. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [JA Streamflow AB, Aelvsjoe (Sweden); Berglund, Johan [SwedPower AB, Stockholm (Sweden); Follin, Sven [SF Geologic AB, Stockholm (Sweden); Hakami, Eva [Itasca Geomekanik AB, Stockholm (Sweden); Halvarson, Jan [Swedish Nuclear Fuel and Waste Management Co, Stockholm (Sweden); Hermanson, Jan [Golder Associates AB, Stockholm (Sweden); Laaksoharju, Marcus [Geopoint (Sweden); Rhen, Ingvar [Sweco VBB/VIAK, Stockholm (Sweden); Wahlgren, C.H. [Sveriges Geologiska Undersoekning, Uppsala (Sweden)

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether available methodology for Site Descriptive Modelling based on surface and borehole data is adequate and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model in local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results including data from two deep core drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single hole interpretation, hydrogeological single hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline

  16. A new methodology for building energy benchmarking: An approach based on clustering concept and statistical models

    Science.gov (United States)

    Gao, Xuefeng

    Though many building energy benchmarking programs have been developed during the past decades, they hold certain limitations. The major concern is that they may cause misleading benchmarking by not fully considering the impacts of the multiple features of buildings on energy performance. The existing methods classify buildings according to only one of many features of buildings, the use type, which may result in a comparison between two buildings that are tremendously different in other features and not properly comparable as a result. This research aims to tackle this challenge by proposing a new methodology based on the clustering concept and statistical analysis. The clustering concept, drawing on machine learning algorithms, classifies buildings based on a multi-dimensional domain of building features, rather than the single dimension of use type. Buildings with the greatest similarity of features that influence energy performance are classified into the same cluster, and benchmarked according to the centroid reference of the cluster. Statistical analysis is applied to find the most influential features impacting building energy performance, as well as provide prediction models for the new design energy consumption. The proposed methodology, applicable to both existing building benchmarking and new design benchmarking, was discussed in this dissertation. The former contains four steps: feature selection, clustering algorithm adaptation, results validation, and interpretation. The latter consists of three parts: data observation, inverse modeling, and forward modeling. The experimentation and validation were carried out for both perspectives. It was shown that the proposed methodology could account for the total building energy performance and was able to provide a more comprehensive approach to benchmarking. In addition, the multi-dimensional clustering concept enables energy benchmarking among different types of buildings, and inspires a new
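The cluster-then-benchmark idea can be sketched with a minimal k-means over a multi-dimensional feature space. This is illustrative only; the dissertation's actual feature selection, clustering algorithm, and statistical models are richer:

```python
import numpy as np

def kmeans(features, k, n_iter=50, seed=0):
    """Minimal k-means: group buildings on several features at once
    rather than on use type alone."""
    rng = np.random.default_rng(seed)
    centroids = features[rng.choice(len(features), size=k, replace=False)].astype(float)
    labels = np.zeros(len(features), dtype=int)
    for _ in range(n_iter):
        # Assign each building to its nearest centroid, then recenter.
        labels = ((features[:, None, :] - centroids) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = features[labels == j].mean(axis=0)
    return labels

def benchmark_gap(features, energy_use, k=2):
    """Benchmark each building's energy use against its own cluster's mean
    (a stand-in for the centroid reference of the cluster)."""
    energy = np.asarray(energy_use, dtype=float)
    labels = kmeans(np.asarray(features, dtype=float), k)
    gaps = np.array([energy[i] - energy[labels == labels[i]].mean()
                     for i in range(len(energy))])
    return labels, gaps
```

A positive gap flags a building that uses more energy than its peers with genuinely similar features, which is the comparison a use-type-only benchmark can get wrong.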

  17. Exploring theoretical functions of corpus data in teaching translation

    Directory of Open Access Journals (Sweden)

    Éric Poirier

    2016-06-01

    Full Text Available As language referential data banks, corpora are instrumental in the exploration of translation solutions in bilingual parallel texts and of conventional usages of the source or target language in monolingual general or specialized texts. These roles are firmly rooted in translation processes, from analysis and interpretation of the source text to searching for an acceptable equivalent and integrating it into the production of the target text. Provided the creative rather than the conservative path is taken, validation or adaptation of the target text in accordance with conventional usages in the target language also benefits from corpora. Translation teaching, however, does not exploit this way of translating, which is common practice in professional translation markets around the world. Instead of showing what corpus tools can do for translation teaching, we start our analysis with a common issue within translation teaching and show how corpus data can help to resolve it in learning activities in translation courses. We suggest a corpus-driven model for the interpretation of ‘business’ as a term and as an item in complex terms based on source text pattern analysis. This methodology will make it possible for teachers to explain and justify interpretation rules that have been defined theoretically from corpus data. It will also help teachers to conceive and non-subjectively assess practical activities designed for learners of translation. Corpus data selected for the examples of rule-based interpretations provided in this paper have been compiled in a corpus-driven study (Poirier, 2015) on the translation of the noun ‘business’ in the field of specialized translation in business, economics, and finance from English to French. The corpus methodology and rule-based interpretation of senses can be generalized and applied in the definition of interpretation rules for other language pairs and other specialized simple and complex terms. These works will encourage the

  18. What is translational research? Concepts and applications in nutrition and dietetics.

    Science.gov (United States)

    Zoellner, Jamie; Van Horn, Linda; Gleason, Philip M; Boushey, Carol J

    2015-07-01

    This monograph is tenth in a series of articles focused on research design and analysis, and provides an overview of translational research concepts. Specifically, this article presents models and processes describing translational research, defines key terms, discusses methodological considerations for speeding the translation of nutrition research into practice, illustrates application of translational research concepts for nutrition practitioners and researchers, and provides examples of translational research resources and training opportunities. To promote the efficiency and translation of evidence-based nutrition guidelines into routine clinical-, community-, and policy-based practice, the dissemination and implementation phases of translational research are highlighted and illustrated in this monograph. Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  19. Do Methodological Choices in Environmental Modeling Bias Rebound Effects? A Case Study on Electric Cars.

    Science.gov (United States)

    Font Vivanco, David; Tukker, Arnold; Kemp, René

    2016-10-18

    Improvements in resource efficiency often underperform because of rebound effects. Calculations of the size of rebound effects are subject to various types of bias, among which methodological choices have received particular attention. Modellers have primarily focused on choices related to changes in demand; choices related to modeling the environmental burdens from such changes have received less attention. In this study, we analyze choices in the environmental assessment methods (life cycle assessment (LCA) and hybrid LCA) and environmental input-output databases (E3IOT, Exiobase and WIOD) used as a source of bias. The analysis is done for a case study on battery electric and hydrogen cars in Europe. The results describe moderate rebound effects for both technologies in the short term. Additionally, long-run scenarios are calculated by simulating the total cost of ownership, and these describe notable rebound effect sizes under favorable economic conditions, from 26 to 59% and from 18 to 28%, respectively, depending on the methodological choices. Relevant sources of bias are found to be related to incomplete background systems, technology assumptions and sectorial aggregation. These findings highlight the importance of the method setup and of sensitivity analyses of choices related to environmental modeling in rebound effect assessments.

  20. A methodology for assessing the market benefits of alternative motor fuels: The Alternative Fuels Trade Model

    Energy Technology Data Exchange (ETDEWEB)

    Leiby, P.N.

    1993-09-01

    This report describes a modeling methodology for examining the prospective economic benefits of displacing motor gasoline use by alternative fuels. The approach is based on the Alternative Fuels Trade Model (AFTM). AFTM development was undertaken by the US Department of Energy (DOE) as part of a longer term study of alternative fuels issues. The AFTM is intended to assist with evaluating how alternative fuels may be promoted effectively, and what the consequences of substantial alternative fuels use might be. Such an evaluation of policies and consequences of an alternative fuels program is being undertaken by DOE as required by Section 502(b) of the Energy Policy Act of 1992. Interest in alternative fuels is based on the prospective economic, environmental and energy security benefits from the substitution of these fuels for conventional transportation fuels. The transportation sector is heavily dependent on oil. Increased oil use implies increased petroleum imports, with much of the increase coming from OPEC countries. Conversely, displacement of gasoline has the potential to reduce US petroleum imports, thereby reducing reliance on OPEC oil and possibly weakening OPEC's ability to extract monopoly profits. The magnitude of US petroleum import reduction, the attendant fuel price changes, and the resulting US benefits, depend upon the nature of oil-gas substitution and the supply and demand behavior of other world regions. The methodology applies an integrated model of fuel market interactions to characterize these effects.

  1. Computational simulation methodologies for mechanobiological modelling: a cell-centred approach to neointima development in stents.

    Science.gov (United States)

    Boyle, C J; Lennon, A B; Early, M; Kelly, D J; Lally, C; Prendergast, P J

    2010-06-28

    The design of medical devices could be very much improved if robust tools were available for computational simulation of tissue response to the presence of the implant. Such tools require algorithms to simulate the response of tissues to mechanical and chemical stimuli. Available methodologies include those based on the principle of mechanical homeostasis, those which use continuum models to simulate biological constituents, and the cell-centred approach, which models cells as autonomous agents. In the latter approach, cell behaviour is governed by rules based on the state of the local environment around the cell, informed by experiment. Tissue growth and differentiation require simulating many of these cells together. In this paper, the methodology and applications of cell-centred techniques, with particular application to mechanobiology, are reviewed, and a cell-centred model of tissue formation in the lumen of an artery in response to the deployment of a stent is presented. The method is capable of capturing some of the most important aspects of restenosis, including nonlinear lesion growth with time. The approach taken in this paper provides a framework for simulating restenosis; the next step will be to couple it with more patient-specific geometries and quantitative parameter data.

  2. METHODOLOGY FOR THE ESTIMATION OF PARAMETERS, OF THE MODIFIED BOUC-WEN MODEL

    Directory of Open Access Journals (Sweden)

    Tomasz HANISZEWSKI

    2015-03-01

    Full Text Available The Bouc-Wen model is a theoretical formulation that can reproduce the real hysteresis loop of the modeled object, such as a wire rope present in the equipment of a crane lifting mechanism. The adopted modified version of the model has nine parameters. Determining such a number of parameters is a complex and problematic issue. This article presents the identification methodology and sample results of numerical simulations. The results were compared with data obtained from laboratory tests of ropes [3], and on this basis it was found that the results agree and that the model can be applied in dynamic systems containing wire ropes in their structures [4].
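For readers unfamiliar with the model, the classical (unmodified) Bouc-Wen hysteresis can be sketched with explicit-Euler integration. Parameter names follow the standard formulation; the article's nine-parameter modified variant adds further terms not shown here:

```python
import numpy as np

def bouc_wen_force(x, dt, A=1.0, beta=0.5, gamma=0.5, n=1.0, alpha=0.1, k=1.0):
    """Restoring force of the classical Bouc-Wen model for a displacement
    history x sampled at step dt, integrated with explicit Euler.

    The hysteretic variable z evolves as
        dz/dt = A*v - beta*|v|*|z|**(n-1)*z - gamma*v*|z|**n,  v = dx/dt,
    and the force is alpha*k*x + (1 - alpha)*k*z.
    """
    z = 0.0
    force = np.empty_like(x, dtype=float)
    for i in range(len(x)):
        v = (x[i] - x[i - 1]) / dt if i else 0.0  # backward-difference velocity
        z += dt * (A * v
                   - beta * abs(v) * abs(z) ** (n - 1) * z
                   - gamma * v * abs(z) ** n)
        force[i] = alpha * k * x[i] + (1 - alpha) * k * z
    return force
```

Parameter identification then amounts to fitting these constants so the simulated loop matches a measured force-displacement loop, which is the nine-dimensional search the article tackles.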

  3. CALS and the Product State Model - Methodology and Supporting Schools and Paradigms

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1998-01-01

    This paper addresses the preliminary considerations in a research project, initiated February 1997, regarding Continuous Acquisition and Life-cycle Support (CALS), which is part of the activities in CALS Center Denmark. The CALS concept is presented with a focus on the Product State Model (PSM). The PSM...... incorporates relevant information about each stage of the production process. The paper will describe the research object and the model object, and discuss a part of the methodology in developing a Product State Model. The project is primarily technological; however, organisational and human aspects...... will be considered, as the intention is that a prototype should be implemented in the production line at Odense Steel Shipyard. Hence, a Multiview approach will be considered, incorporating the informational needs of many actors/machines. Parameter identification, i.e. describing the parameters which PSM...

  4. Modeling Customer Loyalty by System Dynamics Methodology (Case Study: Internet Service Provider Company

    Directory of Open Access Journals (Sweden)

    Alireza Bafandeh Zendeh

    2016-03-01

    Full Text Available Due to the complexity of customer loyalty, we provide a conceptual model to explain it in an Internet service provider company using a system dynamics approach. To do so, customer loyalty for the statistical population was analyzed according to Sterman's modeling methodology. First, the reference modes (the historical behavior of customer loyalty) were evaluated. Then dynamic hypotheses were developed by utilizing causal-loop diagrams and stock-flow maps, based on the theoretical literature. In the third stage, initial conditions of variables, parameters, and the mathematical functions between them were estimated. The model was tested; finally, scenarios of advertising, service quality improvement, and continuation of the current situation were evaluated. Results showed that the service quality improvement scenario is more effective than the others.
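
    The stock-and-flow structure that Sterman's methodology prescribes can be sketched minimally as a single "Customers" stock with an acquisition inflow and a quality-dependent churn outflow. The stock, flows, and parameter values below are hypothetical, not taken from the study:

    ```python
    def simulate_loyalty(months=24, dt=1.0, ad_effect=120.0,
                         base_churn=0.08, quality=0.7):
        """Minimal stock-and-flow sketch: 'customers' is a stock;
        acquisition (driven by advertising) and churn (damped by service
        quality) are the flows. All names and values are illustrative."""
        customers = 1000.0
        history = [customers]
        for _ in range(months):
            acquisition = ad_effect                           # inflow
            churn = base_churn * (1.0 - quality) * customers  # outflow
            customers += (acquisition - churn) * dt           # stock update
            history.append(customers)
        return history

    h = simulate_loyalty()
    ```

    Scenario analysis then amounts to re-running the model with different `ad_effect` or `quality` values and comparing the resulting customer trajectories.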

  5. A Mapping Model for Transforming Traditional Software Development Methods to Agile Methodology

    Directory of Open Access Journals (Sweden)

    Rashmi Popli

    2013-07-01

    Full Text Available Agility brings responsibility and ownership to individuals, which will eventually bring out effectiveness and efficiency in deliverables. The Agile model is growing in the market at a very good pace. Companies are drifting from traditional Software Development Life Cycle models to an Agile environment for the purpose of attaining quality and saving cost and time. The nimble nature of Agile is helpful for frequent releases, satisfying the customer through frequent feedback. In traditional models, the life cycle is properly defined, and phases are elaborated by specifying the needed input and output parameters. In an Agile environment, on the other hand, phases are specific to the Agile methodologies, such as Extreme Programming. In this paper a common life cycle approach is proposed that is applicable to different kinds of teams. The paper aims to describe a mapping function from traditional methods to Agile methods.

  6. Methodology for Training Small Domain-specific Language Models and Its Application in Service Robot Speech Interface

    Directory of Open Access Journals (Sweden)

    ONDAS Stanislav

    2014-05-01

    Full Text Available This paper introduces a novel methodology for training small domain-specific language models from the domain vocabulary alone. The proposed methodology is intended for situations when no training data are available and preparing an appropriate deterministic grammar is not a trivial task. The methodology consists of two phases. In the first phase, a "random" deterministic grammar, which enables generation of all possible combinations of unigrams and bigrams, is constructed from the vocabulary. The prepared random grammar then serves to generate a training corpus, from which a "random" n-gram model is trained; this model can be adapted in the second phase. Evaluation of the proposed approach has shown the usability of the methodology for small domains. The results of the assessment favor the designed method over constructing an appropriate deterministic grammar.
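
    The first phase can be sketched as follows. Enumerating all fixed-length word tuples is a simplified stand-in for the "random" deterministic grammar described in the record, and the toy vocabulary is an assumption for illustration:

    ```python
    import itertools
    from collections import Counter

    def train_random_bigram_model(vocab, sent_len=3):
        """With no training data, generate a corpus covering the possible
        word combinations from the domain vocabulary alone, then estimate
        a smoothed bigram model from it."""
        corpus = [list(words)
                  for words in itertools.product(vocab, repeat=sent_len)]
        bigrams, histories = Counter(), Counter()
        for toks in corpus:
            histories.update(toks[:-1])          # words seen as bigram histories
            bigrams.update(zip(toks, toks[1:]))  # adjacent word pairs
        def prob(w2, w1):
            # add-one (Laplace) smoothed estimate of P(w2 | w1)
            return (bigrams[(w1, w2)] + 1) / (histories[w1] + len(vocab))
        return prob

    p = train_random_bigram_model(["robot", "go", "kitchen"])
    ```

    The resulting uniform-ish model is deliberately uninformative; the second phase of the methodology would adapt these probabilities once real utterances become available.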

  7. A Methodology for Robust Multiproxy Paleoclimate Reconstructions and Modeling of Temperature Conditional Quantiles.

    Science.gov (United States)

    Janson, Lucas; Rajaratnam, Bala

    Great strides have been made in the field of reconstructing past temperatures based on models relating temperature to temperature-sensitive paleoclimate proxies. One of the goals of such reconstructions is to assess if current climate is anomalous in a millennial context. These regression based approaches model the conditional mean of the temperature distribution as a function of paleoclimate proxies (or vice versa). Some of the recent focus in the area has considered methods which help reduce the uncertainty inherent in such statistical paleoclimate reconstructions, with the ultimate goal of improving the confidence that can be attached to such endeavors. A second important scientific focus in the subject area is the area of forward models for proxies, the goal of which is to understand the way paleoclimate proxies are driven by temperature and other environmental variables. One of the primary contributions of this paper is novel statistical methodology for (1) quantile regression with autoregressive residual structure, (2) estimation of corresponding model parameters, (3) development of a rigorous framework for specifying uncertainty estimates of quantities of interest, yielding (4) statistical byproducts that address the two scientific foci discussed above. We show that by using the above statistical methodology we can demonstrably produce a more robust reconstruction than is possible by using conditional-mean-fitting methods. Our reconstruction shares some of the common features of past reconstructions, but we also gain useful insights. More importantly, we are able to demonstrate a significantly smaller uncertainty than that from previous regression methods. In addition, the quantile regression component allows us to model, in a more complete and flexible way than least squares, the conditional distribution of temperature given proxies. This relationship can be used to inform forward models relating how proxies are driven by temperature.
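
    The conditional-quantile idea at the core of this methodology can be illustrated with a toy pinball-loss fit. This sketch omits the autoregressive residual structure that is central to the paper's full method, and the data are synthetic:

    ```python
    def fit_quantile_line(xs, ys, tau=0.5, lr=0.01, epochs=2000):
        """Fit y = a + b*x for quantile level tau by subgradient descent
        on the pinball (check) loss rho_tau(r) = r * (tau - 1[r < 0])."""
        a = b = 0.0
        n = len(xs)
        for _ in range(epochs):
            ga = gb = 0.0
            for x, y in zip(xs, ys):
                r = y - (a + b * x)
                # subgradient of the pinball loss w.r.t. the prediction
                g = -tau if r > 0 else (1.0 - tau)
                ga += g / n
                gb += g * x / n
            a -= lr * ga
            b -= lr * gb
        return a, b

    xs = list(range(10))
    ys = [2.0 * x + (1.0 if x % 2 == 0 else -1.0) for x in xs]
    a, b = fit_quantile_line(xs, ys)  # median fit, roughly y = 2x
    ```

    Choosing tau other than 0.5 fits other conditional quantiles, which is what allows the full method to describe the whole conditional distribution of temperature given proxies rather than only its mean.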

  8. A new methodology to test galaxy formation models using the dependence of clustering on stellar mass

    Science.gov (United States)

    Campbell, David J. R.; Baugh, Carlton M.; Mitchell, Peter D.; Helly, John C.; Gonzalez-Perez, Violeta; Lacey, Cedric G.; Lagos, Claudia del P.; Simha, Vimal; Farrow, Daniel J.

    2015-09-01

    We present predictions for the two-point correlation function of galaxy clustering as a function of stellar mass, computed using two new versions of the GALFORM semi-analytic galaxy formation model. These models make use of a high resolution, large volume N-body simulation, set in the 7-year Wilkinson Microwave Anisotropy Probe cosmology. One model uses a universal stellar initial mass function (IMF), while the other assumes different IMFs for quiescent star formation and bursts. Particular consideration is given to how the assumptions required to estimate the stellar masses of observed galaxies (such as the choice of IMF, stellar population synthesis model, and dust extinction) influence the perceived dependence of galaxy clustering on stellar mass. Broad-band spectral energy distribution fitting is carried out to estimate stellar masses for the model galaxies in the same manner as in observational studies. We show clear differences between the clustering signals computed using the true and estimated model stellar masses. As such, we highlight the importance of applying our methodology to compare theoretical models to observations. We introduce an alternative scheme for the calculation of the merger time-scales for satellite galaxies in GALFORM, which takes into account the dark matter subhalo information from the simulation. This reduces the amplitude of small-scale clustering. The new merger scheme offers improved or similar agreement with observational clustering measurements, over the redshift range from z = 0 out to that probed by the VIMOS Public Extragalactic Redshift Survey, depending on the GALFORM model used.
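
    The clustering statistic compared throughout this record, the two-point correlation function, is estimated by pair counting. The sketch below uses the simple "natural" estimator DD/RR - 1 in 2-D with toy catalogues; survey analyses use 3-D separations and estimators such as Landy-Szalay:

    ```python
    import math, random

    def two_point_xi(data, randoms, edges):
        """Natural estimator xi(r) = DD/RR - 1 from pair counts in
        separation bins, for 2-D point catalogues."""
        def pair_counts(points):
            counts = [0] * (len(edges) - 1)
            for i in range(len(points)):
                for j in range(i + 1, len(points)):
                    r = math.hypot(points[i][0] - points[j][0],
                                   points[i][1] - points[j][1])
                    for k in range(len(edges) - 1):
                        if edges[k] <= r < edges[k + 1]:
                            counts[k] += 1
                            break
            return counts
        dd = pair_counts(data)
        rr = pair_counts(randoms)
        # normalise for different catalogue sizes
        norm = (len(randoms) * (len(randoms) - 1)) / (len(data) * (len(data) - 1))
        return [(d * norm / r) - 1 if r > 0 else 0.0 for d, r in zip(dd, rr)]

    random.seed(1)
    field = [(random.random(), random.random()) for _ in range(200)]
    # a clustered catalogue: every point gets a close companion
    clustered = field[:100] + [(x + 0.005, y + 0.005) for x, y in field[:100]]
    xi = two_point_xi(clustered, field, edges=[0.0, 0.02, 0.1, 0.3])
    ```

    The excess of close pairs in the clustered catalogue drives xi well above zero in the smallest bin, which is the signal whose stellar-mass dependence the paper predicts.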

  9. Evaluation of methodologies for interpolation of data for hydrological modeling in glacierized basins with limited information

    Science.gov (United States)

    Muñoz, Randy; Paredes, Javier; Huggel, Christian; Drenkhan, Fabian; García, Javier

    2017-04-01

    The availability and consistency of data is a determining factor for the reliability of any hydrological model and its simulated results. Unfortunately, there are many regions worldwide where data are not available in the desired quantity and quality. The Santa River basin (SRB), located within a complex topographic and climatic setting in the tropical Andes of Peru, is a clear example of this challenging situation. A monitoring network of in-situ stations in the SRB recorded series of hydro-meteorological variables but finally ceased to operate in 1999. In the following years, several researchers evaluated and completed many of these series. This database was used by multiple research and policy-oriented projects in the SRB. However, hydroclimatic information remains limited, making it difficult to perform research, especially when dealing with the assessment of current and future water resources. In this context, we present an evaluation of different methodologies for interpolating temperature and precipitation data at a monthly time step, as well as ice volume data, in glacierized basins with limited data. The methodologies were evaluated for the Quillcay River, a tributary of the SRB, where hydro-meteorological data are available from nearby monitoring stations since 1983. The study period was 1983 - 1999, with a validation period of 1993 - 1999. For the temperature series, the aim was to extend the observed data and interpolate them. NCEP reanalysis data were used to extend the observed series: 1) using a simple correlation with multiple field stations, or 2) applying the altitudinal correction proposed in previous studies. The interpolation was then applied as a function of altitude. Both methodologies provide very similar results; by parsimony, simple correlation is shown to be a viable choice. For the precipitation series, the aim was to interpolate observed data.
Two methodologies were evaluated: 1) Inverse Distance Weighting, whose results underestimate the amount
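
    The first of these interpolation schemes, Inverse Distance Weighting, can be sketched in a few lines. The station coordinates and values below are illustrative, not the Quillcay network:

    ```python
    def idw_interpolate(stations, target, power=2.0):
        """Inverse Distance Weighting: the value at an ungauged location
        is a distance-weighted average of station observations, with
        weights 1 / d^power."""
        num = den = 0.0
        for (x, y, value) in stations:
            d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
            if d2 == 0.0:
                return value  # target coincides with a station
            w = 1.0 / d2 ** (power / 2.0)
            num += w * value
            den += w
        return num / den

    # (x, y, monthly precipitation) for three hypothetical stations
    obs = [(0.0, 0.0, 100.0), (1.0, 0.0, 80.0), (0.0, 1.0, 120.0)]
    ```

    Because the estimate is always a convex combination of the observations, IDW can never exceed the station maximum, one reason it tends to underestimate precipitation extremes in data-sparse mountain basins.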

  10. A MAINTENANCE STRATEGY MODEL FOR STATIC EQUIPMENT USING INSPECTION METHODOLOGIES AND RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    J.K. Visser

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Mechanical equipment used on process plants can be categorised into two main types, namely static and rotating equipment. A brief survey at a number of chemical process plants indicated that a number of maintenance strategies exist and are used for rotating equipment. However, some of these strategies are not directly applicable to static equipment, although the risk-based inspection (RBI) methodology has been developed for pressure vessels. A generalised risk-based maintenance strategy for all types of static equipment does not currently exist. This paper describes the development of an optimised model of inspection methodologies, maintenance strategies, and risk management principles that are generically applicable for static equipment. It enables maintenance managers and engineers to select an applicable maintenance strategy and inspection methodology, based on the operational and business risks posed by the individual pieces of equipment.

    AFRIKAANSE OPSOMMING: Mechanical equipment used on process plants can be divided into two categories, namely static and rotating equipment. A brief survey at a number of chemical process plants indicated that a number of strategies are used for the maintenance of rotating equipment, while the risk-based inspection methodology is indeed used for pressure vessels. A general risk-based maintenance strategy for all types of static equipment, however, is not currently available. This article describes the development of an optimised model of inspection methodologies, maintenance strategies, and risk management principles that can be used generically for static equipment. It enables maintenance managers and engineers to select a maintenance strategy and inspection methodology based on the operational and business risks of the individual equipment.

  11. A system-of-systems modeling methodology for strategic general aviation design decision-making

    Science.gov (United States)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system of systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As of yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision making environment through the construction of a system of systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented is the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. 
Additionally, results indicate the ability to find synergetic solutions, that is solutions in which two systems might collaborate to achieve a better result than acting

  12. A new methodology for dynamic modelling of health risks arising from wastewater influenced urban flooding

    Science.gov (United States)

    Jørgensen, Claus; Mark, Ole; Djordjevic, Slobodan; Hammond, Michael; Khan, David M.; Erichsen, Anders; Dorrit Enevoldsen, Ann; Heinicke, Gerald; Helwigh, Birgitte

    2015-04-01

    Introduction: Urban flooding due to rainfall exceeding the design capacity of drainage systems is a global problem with significant economic and social consequences. While the cost of the direct damages of urban flooding is well understood, the indirect damages, such as water-borne diseases, are in general still poorly understood. Climate change is expected to increase the frequency of urban flooding in many countries, which is likely to increase water-borne disease. Diarrheal diseases are most prevalent in developing countries, where poor sanitation, poor drinking water and poor surface water quality cause a high disease burden and mortality, especially during floods. The level of water-borne diarrhea in countries with well-developed water and wastewater infrastructure has been reduced to an acceptable level, and the population in general does not consider wastewater a health risk. Nevertheless, exposure to wastewater-influenced urban flood water still has the potential to transmit diarrheal diseases. When managing urban flooding and planning urban climate change adaptations, health risks are rarely taken into consideration. This paper outlines a novel methodology for linking dynamic urban flood modelling with Quantitative Microbial Risk Assessment (QMRA). This provides a unique possibility for understanding the interaction between urban flooding and the health risks caused by direct human contact with flood water, and provides an option for reducing the burden of disease in the population through intelligent urban flood risk management. Methodology: We have linked hydrodynamic urban flood modelling with quantitative microbial risk assessment (QMRA) to determine the risk of infection caused by exposure to wastewater-influenced urban flood water.
The deterministic model MIKE Flood, which integrates the sewer network model in MIKE Urban and the 2D surface model MIKE21, was used to calculate the concentration of pathogens in the
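
    The QMRA step can be illustrated with the standard dose-response forms that convert a modelled pathogen concentration and an ingested volume into a probability of infection. The parameter values below are literature-style examples (e.g. the beta-Poisson parameters often quoted for Campylobacter), not values from this study:

    ```python
    import math

    def p_infection_exponential(concentration, volume_ml, r=0.5):
        """Exponential dose-response model: P_inf = 1 - exp(-r * dose),
        with dose = concentration (organisms/mL) * ingested volume (mL)."""
        dose = concentration * volume_ml
        return 1.0 - math.exp(-r * dose)

    def p_infection_beta_poisson(concentration, volume_ml,
                                 alpha=0.145, n50=896.0):
        """Approximate beta-Poisson dose-response model, parameterised by
        alpha and the median infectious dose N50."""
        dose = concentration * volume_ml
        return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)
    ```

    In the linked framework, the hydrodynamic model supplies the concentration field in the flood water, and the exposure scenario (who ingests how much) supplies the volume term.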

  13. Methodology for the Incorporation of Passive Component Aging Modeling into the RAVEN/ RELAP-7 Environment

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua; Alfonsi, Andrea; Askin Guler; Tunc Aldemir

    2014-11-01

    Passive systems, structures and components (SSCs) degrade over their operating life, and this degradation may cause a reduction in the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data, and the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology, [1] does consider physics-based models that account for the operating conditions in the plant; however, [1] does not include the effects of surveillance/inspection. This paper presents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment, which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework, and is often needed when there is complex process/hardware/software/firmware/human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models, and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program), also currently under development at INL [3], as well as RELAP5 [4]. The overall methodology aims to: • Address multiple aging mechanisms involving a large number of components in a computationally feasible manner where sequencing of events is conditioned on the physical conditions predicted in a simulation

  14. Geared rotor dynamic methodologies for advancing prognostic modeling capabilities in rotary-wing transmission systems

    Science.gov (United States)

    Stringer, David Blake

    The overarching objective in this research is the development of a robust, rotor dynamic, physics based model of a helicopter drive train as a foundation for the prognostic modeling for rotary-wing transmissions. Rotorcrafts rely on the integrity of their drive trains for their airworthiness. Drive trains rely on gear technology for their integrity and function. Gears alter the vibration characteristics of a mechanical system and significantly contribute to noise, component fatigue, and personal discomfort prevalent in rotorcraft. This research effort develops methodologies for generating a rotor dynamic model of a rotary-wing transmission based on first principles, through (i) development of a three-dimensional gear-mesh stiffness model for helical and spur gears and integration of this model in a finite element rotor dynamic model, (ii) linear and nonlinear analyses of a geared system for comparison and validation of the gear-mesh model, (iii) development of a modal synthesis technique for potentially providing model reduction and faster analysis capabilities for geared systems, and (iv) extension of the gear-mesh model to bevel and epicyclic configurations. In addition to model construction and validation, faults indigenous to geared systems are presented and discussed. Two faults are selected for analysis and seeded into the transmission model. Diagnostic vibration parameters are presented and used as damage indicators in the analysis. The fault models produce results consistent with damage experienced during experimental testing. The results of this research demonstrate the robustness of the physics-based approach in simulating multiple normal and abnormal conditions. The advantages of this physics-based approach, when combined with contemporary probabilistic and time-series techniques, provide a useful method for improving health monitoring technologies in mechanical systems.

  15. Revising Translations

    DEFF Research Database (Denmark)

    Rasmussen, Kirsten Wølch; Schjoldager, Anne

    2011-01-01

    out by specialised revisers, but by staff translators, who revise the work of colleagues and freelancers on an ad hoc basis. Corrections are mostly given in a peer-to-peer fashion, though the work of freelancers and inexperienced in-house translators is often revised in an authoritative (nonnegotiable......) way. Most respondents and interviewees are worried about increasing pressures on the translation market, which, combined with customers’ general lack of understanding of the translation process, mean that systematic, all-encompassing quality assurance is rarely financially viable....

  16. Dunedin's free clinic: an exploration of its model of care using case study methodology.

    Science.gov (United States)

    Loh, Lik; Jaye, Chrystal; Dovey, Susan; Lloyd, Hywel; Rowe, Joanne

    2015-06-01

    Models of care are important therapeutic modalities for achieving the goals of health care teams, but they are seldom explicitly stated or investigated. The aim was to describe the model of care at Dunedin's free clinic, and to assess whether this model catered to the particular needs of enrolled patients. A mixed methods study was conducted using case study methodology to construct the clinic's model of care from multiple data sources, and to create a profile of patients' needs. A nested case study of patients with diabetes examined patients' social vulnerability characteristics. The pattern-matching analytic technique was used to assess the degree of alignment between the model of care and patients' needs. Patients were not only high users of both primary and secondary health care, but also of justice and social welfare sector services. The care of patients with diabetes was complicated by coexisting social vulnerability and medical comorbidities. Surveyed patients placed high value on interpersonal dimensions of care, the Christian ethos of the clinic, and the wider range of services available. This study suggests a degree of 'fit' between the clinic's model of care and the needs of enrolled patients. A model of care that caters to patients with complex needs is important for securing their engagement in health services.

  17. Modeling and Analysis of MRR, EWR and Surface Roughness in EDM Milling through Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    A. K.M.S. Iqbal

    2010-01-01

    Full Text Available Problem statement: Electrical Discharge Machining (EDM) has grown over the last few decades from a novelty to a mainstream manufacturing process. Although the EDM process is in high demand, its mechanism is complex and far from completely understood. It is difficult to establish a model that can accurately predict the performance by correlating the process parameters. Optimum processing parameters are essential for increasing the production rate and decreasing the machining time, since the materials processed by EDM, and the process itself, are very costly. This research establishes empirical relations between the machining parameters and the responses in analyzing the machinability of stainless steel. Approach: The machining factors used are voltage, rotational speed of the electrode, and feed rate; the responses are MRR, EWR and Ra. Response surface methodology was used to investigate the relationships and parametric interactions between the three controllable variables and MRR, EWR and Ra. A central composite experimental design was used to estimate the model coefficients of the three factors. The responses were modeled using a response surface model based on experimental results. The significant coefficients were obtained by performing Analysis of Variance (ANOVA) at the 95% level of significance. Results: The variation in percentage errors for the developed models was found to be within 5%. Conclusion: The developed models show that voltage and the rotary motion of the electrode are the most significant machining parameters influencing MRR, EWR and Ra. These models can be used to obtain the desired responses within the experimental range.
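
    The polynomial models that response surface methodology builds are ordinary least-squares fits of second-order terms. A one-factor sketch (the study itself uses three factors) with synthetic, exactly quadratic data:

    ```python
    def fit_quadratic_rsm(xs, ys):
        """Least-squares fit of a second-order response surface
        y = b0 + b1*x + b2*x^2 via the normal equations, solved with
        Gaussian elimination (no external libraries)."""
        rows = [[1.0, x, x * x] for x in xs]          # design matrix columns
        ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)]
               for i in range(3)]
        aty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
        # forward elimination with partial pivoting
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(ata[r][col]))
            ata[col], ata[piv] = ata[piv], ata[col]
            aty[col], aty[piv] = aty[piv], aty[col]
            for r in range(col + 1, 3):
                f = ata[r][col] / ata[col][col]
                for c in range(col, 3):
                    ata[r][c] -= f * ata[col][c]
                aty[r] -= f * aty[col]
        # back substitution
        b = [0.0, 0.0, 0.0]
        for r in (2, 1, 0):
            b[r] = (aty[r] - sum(ata[r][c] * b[c]
                                 for c in range(r + 1, 3))) / ata[r][r]
        return b

    xs = [-2, -1, 0, 1, 2]
    ys = [9.0, 4.0, 1.0, 0.0, 1.0]   # exactly y = 1 - 2x + x^2
    b0, b1, b2 = fit_quadratic_rsm(xs, ys)
    ```

    In the full three-factor case the design matrix additionally contains cross-product terms such as x1*x2, and ANOVA on the fitted coefficients identifies which are significant.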

  18. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  19. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  20. PAGIS summary report of phase 1: a common methodological approach based on European data and models

    Energy Technology Data Exchange (ETDEWEB)

    Cadelli, N.; Cottone, G.; Bertozzi, G.; Girardi, F.

    1984-01-01

    Since 1982 a joint study has been launched by the CEC with the participation of national institutions in the E.C., aiming at a Performance Assessment of Geological Isolation Systems (PAGIS) for HLW disposal. This document is a summary of the first phase of the study, which was devoted to the collection of data and models and to the choice of an appropriate methodology. To this purpose, real or national sites have been chosen, which are representative of three types of continental geological formations in the E.C.: clay, granite and salt (although the choices imply no commitment of any kind about their final use). Moreover, sub-seabed areas have also been identified. The study covers the following items: - basic data on waste characteristics, site data and repository designs; - methodology, which allows sensitivity and uncertainty analyses to be performed, as well as the assessment of radiation doses to individuals and populations; - preliminary modelling of radionuclide release and migration through the geosphere (near- and far-field) and the biosphere following their various pathways to man; - selection of the most relevant radionuclide release scenarios and their probability of occurrence. Reference values have been selected for the basic data as well as variants covering the various options which are under consideration in the different Countries of the E.C.

  1. The Double Layer Methodology and the Validation of Eigenbehavior Techniques Applied to Lifestyle Modeling

    Science.gov (United States)

    Lamichhane, Bishal

    2017-01-01

    A novel methodology, the double layer methodology (DLM), for modeling an individual's lifestyle and its relationships with health indicators is presented. The DLM is applied to model behavioral routines emerging from self-reports of daily diet and activities, annotated by 21 healthy subjects over 2 weeks. Unsupervised clustering on the first layer of the DLM separated our population into two groups. Using eigendecomposition techniques on the second layer of the DLM, we could find activity and diet routines, predict behaviors in a portion of the day (with an accuracy of 88% for diet and 66% for activity), determine between-day and between-individual similarities, and detect an individual's membership in a group based on behavior (with an accuracy of up to 64%). We found that clustering based on health indicators mapped back into activity behaviors, but not into diet behaviors. In addition, we showed the limitations of eigendecomposition for lifestyle applications, in particular when applied to noisy and sparse behavioral data such as dietary information. Finally, we proposed the use of the DLM for supporting adaptive and personalized recommender systems for stimulating behavior change. PMID:28133607
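
    The eigendecomposition step can be sketched as extracting the dominant "eigenbehavior" from a days-by-timeslots behavior matrix. The sketch below uses power iteration on the mean-centered data rather than a full SVD, and the toy week of binary activity indicators is an illustrative assumption:

    ```python
    def principal_eigenbehavior(days, iters=200):
        """Each row of `days` is one day's behavior vector (e.g. activity
        indicators per time slot). Power iteration on X^T X of the
        mean-centered matrix recovers the top eigenbehavior, i.e. the
        dominant daily routine."""
        n, m = len(days), len(days[0])
        means = [sum(row[j] for row in days) / n for j in range(m)]
        x = [[row[j] - means[j] for j in range(m)] for row in days]
        v = [1.0] + [0.0] * (m - 1)        # starting direction
        for _ in range(iters):
            xv = [sum(r[j] * v[j] for j in range(m)) for r in x]   # X v
            w = [sum(x[i][j] * xv[i] for i in range(n))            # X^T X v
                 for j in range(m)]
            norm = sum(c * c for c in w) ** 0.5
            v = [c / norm for c in w]
        return v

    # toy week: days alternate between two four-slot activity patterns
    week = [[1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1]]
    v = principal_eigenbehavior(week)
    ```

    Projecting a new day onto the leading eigenbehaviors gives the low-dimensional coordinates used for prediction and similarity comparisons in the second layer.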

  2. Application of infinite model predictive control methodology to other advanced controllers.

    Science.gov (United States)

    Abu-Ayyad, M; Dubay, R; Hernandez, J M

    2009-01-01

    This paper presents an application of a recently developed predictive control algorithm, infinite model predictive control (IMPC), to other advanced control schemes. The IMPC strategy was derived for systems with different degrees of nonlinearity in the process gain and time constant. It was also shown that the IMPC structure uses nonlinear open-loop modeling, conducted while closed-loop control is executed every sampling instant. The main objective of this work is to demonstrate that the IMPC methodology can be applied to other advanced control strategies, making the methodology generic. The IMPC strategy was implemented on several advanced controllers, such as a PI controller using a Smith predictor, a Dahlin controller, simplified predictive control (SPC), dynamic matrix control (DMC), and shifted dynamic matrix control (m-DMC). Experimental work using these approaches combined with IMPC was conducted on both single-input-single-output (SISO) and multi-input-multi-output (MIMO) systems and compared with the original forms of these advanced controllers. Computer simulations were performed on nonlinear plants, demonstrating that the IMPC strategy can be readily implemented on other advanced control schemes, providing improved control performance. Practical work included real-time control applications on a DC motor, a plastic injection molding machine, and a MIMO three-zone thermal system.
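
    Of the controllers listed, dynamic matrix control has a particularly compact core: the control move is a regularized least-squares fit of the predicted error trajectory against the plant's step-response coefficients. A single-move (control horizon 1) sketch, with illustrative numbers not taken from the paper's experiments:

    ```python
    def dmc_control_move(step_response, error_traj, lam=0.1):
        """DMC with control horizon 1: with step-response coefficients a_i
        and predicted errors e_i, the optimal move minimizes
        sum_i (e_i - a_i * du)^2 + lam * du^2, giving the closed form
        du = (a . e) / (a . a + lam)."""
        num = sum(a * e for a, e in zip(step_response, error_traj))
        den = sum(a * a for a in step_response) + lam
        return num / den

    # step-response samples of a first-order plant, constant setpoint error
    a = [0.39, 0.63, 0.78, 0.86, 0.92]
    du = dmc_control_move(a, [1.0] * 5)
    ```

    With a control horizon greater than 1 the same idea becomes a matrix least-squares problem over the dynamic matrix; the IMPC scheme described in the record wraps such controllers with a nonlinear open-loop model updated each sampling instant.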

  3. Top-down methodology for rainfall-runoff modelling and evaluation of hydrological extremes

    Science.gov (United States)

    Willems, Patrick

    2014-05-01

    A top-down methodology is presented for implementation and calibration of a lumped conceptual catchment rainfall-runoff model that aims to produce high model performance (depending on the quality and availability of data) in terms of rainfall-runoff discharges over the full range from low to high discharges, including the peak and low flow extremes. The model is to be used to support water engineering applications, which most often deal with high and low flows as well as cumulative runoff volumes. With this application in mind, the paper aims to contribute to the above-mentioned problems and advancements in model evaluation and model-structure selection, to the overparameterization problem, and to the considerable time a modeller must invest, and the difficulties encountered, when building and calibrating a lumped conceptual model for a river catchment. The methodology is an empirical, step-wise technique that examines the various model components one by one through a data-based analysis of response characteristics. The approach starts from a generalized lumped conceptual model structure, in which only the general components, such as the existence of storage and routing elements and their inter-links, are pre-defined. The detailed specifications of model equations and parameters are supported by advanced time series analysis of the empirical response between the rainfall and evapotranspiration inputs and the river flow output. Subresponses are separated, and submodel components and related subsets of parameters are calibrated as independently as possible. At the same time, the model-structure identification process aims at parsimonious submodel structures and accounts for the serial dependency of runoff values, which typically is higher for low flows than for high flows. It also accounts for the heteroscedasticity and dependency of model residuals when evaluating model performance.
It is shown that this step…
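
The kind of storage-and-routing element such a generalized structure starts from can be illustrated with a single linear reservoir; the recession constant and rainfall series below are arbitrary choices, not values from the paper.

```python
import numpy as np

def linear_reservoir(rain, k, dt=1.0, q0=0.0):
    """Single linear reservoir: dS/dt = rain - Q with Q = S/k.

    Exact discrete solution for rain held constant over each step.
    """
    c = np.exp(-dt / k)
    q = np.empty_like(rain, dtype=float)
    prev = q0
    for t, r in enumerate(rain):
        prev = c * prev + (1 - c) * r   # exponential decay toward inflow
        q[t] = prev
    return q

rain = np.zeros(40)
rain[2:5] = 10.0                        # a 3-step storm
q = linear_reservoir(rain, k=5.0)       # routed discharge
```

After the storm ends, the hydrograph recedes geometrically with ratio exp(-dt/k), which is exactly the kind of recession behaviour the data-based analysis of response characteristics exploits to identify k.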

  4. Methodology for Outdoor Water Savings Model and Spreadsheet Tool for U.S. and Selected States

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alison A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chen, Yuting [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dunham, Camilla [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fuchs, Heidi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Hannah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    Green lawns and landscaping are archetypical of the populated American landscape, and typically require irrigation, which corresponds to a significant fraction of residential, commercial, and institutional water use. In North American cities, the estimated portion of residential water used for outdoor purposes ranges from 22-38% in cooler climates up to 59-67% in dry and hot environments, while turfgrass coverage within the United States spans 11.1-20.2 million hectares (Milesi et al. 2009). One national estimate uses satellite and aerial photography data to develop a relationship between impervious surface and lawn surface area, yielding a conservative estimate of 16.4 (± 3.6) million hectares of lawn surface area in the United States—an area three times larger than that devoted to any irrigated crop (Milesi et al. 2005). One approach that holds promise for cutting unnecessary outdoor water use is the increased deployment of “smart” irrigation controllers to increase the water efficiency of irrigation systems. This report describes the methodology and inputs employed in a mathematical model that quantifies the effects of the U.S. Environmental Protection Agency’s WaterSense labeling program for one such type of controller, weather-based irrigation controllers (WBIC). This model builds on the one described in “Methodology for National Water Savings Model and Spreadsheet Tool–Outdoor Water Use” and uses a two-tiered approach to quantify outdoor water savings attributable to the WaterSense program for WBIC, as well as the net present value (NPV) of those savings. While the first iteration of the model assessed national impacts using averaged national values, this version begins by evaluating impacts in three key large states that make up a sizable portion of the irrigation market: California, Florida, and Texas. These states are considered the principal market for “smart” irrigation controllers and may account for the bulk of national savings. Modeled…
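
The net-present-value step of such a savings model can be sketched as a plain discounted sum; the cash flows and discount rate below are invented for illustration and do not reflect the report's actual inputs.

```python
# NPV of a stream of annual water-cost savings from a more efficient controller.
def npv(annual_savings, rate):
    """Discount end-of-year savings back to present value."""
    return sum(s / (1 + rate) ** t for t, s in enumerate(annual_savings, 1))

# e.g. a hypothetical WBIC saving $120/year over a 5-year life at a 3% rate
value = npv([120.0] * 5, 0.03)
```

In a full model this sum would be taken over shipment cohorts and state-level water prices rather than a single device.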

  5. Rosetta: an operator basis translator for standard model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Falkowski, Adam [Laboratoire de Physique Théorique, Bat. 210, Université Paris-Sud, 91405, Orsay (France); Fuks, Benjamin [Département Recherches Subatomiques, Institut Pluridisciplinaire Hubert Curien, Université de Strasbourg/CNRS-IN2P3, 23 rue du Loess, 67037, Strasbourg (France); Mawatari, Kentarou [Theoretische Natuurkunde and IIHE/ELEM, Vrije Universiteit Brussel, and International Solvay Institutes, Pleinlaan 2, 1050, Brussels (Belgium); Mimasu, Ken, E-mail: k.mimasu@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, BN1 9QH, Brighton (United Kingdom); Riva, Francesco [CERN, Theory Division, 1211, Geneva (Switzerland); Sanz, Verónica [Department of Physics and Astronomy, University of Sussex, BN1 9QH, Brighton (United Kingdom)

    2015-12-10

    We introduce Rosetta, a program allowing for the translation between different bases of effective field theory operators. We present the main functions of the program and provide an example of usage. One of the Lagrangians which Rosetta can translate into has been implemented into FeynRules, which allows Rosetta to be interfaced into various high-energy physics programs such as Monte Carlo event generators. In addition to popular bases choices, such as the Warsaw and Strongly Interacting Light Higgs bases already implemented in the program, we also detail how to add new operator bases into the Rosetta package. In this way, phenomenological studies using an effective field theory framework can be straightforwardly performed.

  6. Rosetta: an operator basis translator for standard model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Falkowski, Adam [Universite Paris-Sud, Laboratoire de Physique Theorique, Bat. 210, Orsay (France); Fuks, Benjamin [Universite de Strasbourg/CNRS-IN2P3, Departement Recherches Subatomiques, Institut Pluridisciplinaire Hubert Curien, Strasbourg (France); Mawatari, Kentarou [Theoretische Natuurkunde and IIHE/ELEM, Vrije Universiteit Brussel, and International Solvay Institutes, Brussels (Belgium); Mimasu, Ken; Sanz, Veronica [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom); Riva, Francesco [CERN, Theory Division, Geneva (Switzerland)

    2015-12-15

    We introduce Rosetta, a program allowing for the translation between different bases of effective field theory operators. We present the main functions of the program and provide an example of usage. One of the Lagrangians which Rosetta can translate into has been implemented into FeynRules, which allows Rosetta to be interfaced into various high-energy physics programs such as Monte Carlo event generators. In addition to popular bases choices, such as the Warsaw and Strongly Interacting Light Higgs bases already implemented in the program, we also detail how to add new operator bases into the Rosetta package. In this way, phenomenological studies using an effective field theory framework can be straightforwardly performed. (orig.)

  7. Rosetta: an operator basis translator for Standard Model effective field theory

    CERN Document Server

    Falkowski, Adam; Mawatari, Kentarou; Mimasu, Ken; Riva, Francesco; Sanz, Verónica

    2015-01-01

    We introduce Rosetta, a program allowing for the translation between different bases of effective field theory operators. We present the main functions of the program and provide an example of usage. One of the bases which Rosetta can translate into has been implemented into FeynRules, which allows Rosetta to be interfaced into various high-energy physics programs such as Monte Carlo event generators. In addition to popular bases choices, such as the Warsaw and Strongly Interacting Light Higgs bases already implemented in the program, we also detail how to add new operator bases into the Rosetta package. In this way, phenomenological studies using an effective field theory framework can be straightforwardly performed.
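
Stripped to its algebra, a change of operator basis with the Lagrangian held fixed is a linear map on Wilson coefficients: writing L = c_old · O_old = c_new · O_new and expressing the old operators in the new basis as O_old = M @ O_new gives c_new = M.T @ c_old. The 2x2 mixing matrix below is a toy stand-in, not the actual Warsaw or SILH relations that Rosetta implements.

```python
import numpy as np

M = np.array([[1.0, 0.5],
              [0.0, 2.0]])          # toy basis change: O_old = M @ O_new

def translate(c_old, M):
    """Translate Wilson coefficients under O_old = M @ O_new."""
    return M.T @ c_old

c_old = np.array([0.3, -1.2])       # coefficients in the old basis
c_new = translate(c_old, M)         # same Lagrangian, new basis
```

Translating back with the inverse mixing matrix recovers the original coefficients, which is the consistency check any such translator should pass.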

  8. Direct Adaptive Control Methodologies for Flexible-Joint Space Manipulators with Uncertainties and Modeling Errors

    Science.gov (United States)

    Ulrich, Steve

    This work addresses the direct adaptive trajectory tracking control problem associated with lightweight space robotic manipulators that exhibit elastic vibrations in their joints, and which are subject to parametric uncertainties and modeling errors. Unlike existing adaptive control methodologies, the proposed flexible-joint control techniques do not require identification of unknown parameters, or mathematical models of the system to be controlled. The direct adaptive controllers developed in this work are based on the model reference adaptive control approach, and manage modeling errors and parametric uncertainties by time-varying the controller gains using new adaptation mechanisms, thereby reducing the errors between an ideal model and the actual robot system. More specifically, new decentralized adaptation mechanisms derived from the simple adaptive control technique and fuzzy logic control theory are considered in this work. Numerical simulations compare the performance of the adaptive controllers with a nonadaptive and a conventional model-based controller, in the context of 12.6 m × 12.6 m square trajectory tracking. To validate the robustness of the controllers to modeling errors, a new dynamics formulation that includes several nonlinear effects usually neglected in flexible-joint dynamics models is proposed. Results obtained with the adaptive methodologies demonstrate an increased robustness to both uncertainties in joint stiffness coefficients and dynamics modeling errors, as well as highly improved tracking performance compared with the nonadaptive and model-based strategies. Finally, this work considers the partial state feedback problem related to flexible-joint space robotic manipulators equipped only with sensors that provide noisy measurements of motor positions and velocities. An extended Kalman filter-based estimation strategy is developed to estimate all state variables in real-time. The state estimation filter is combined with an adaptive…
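
The flavor of such a direct adaptive scheme — no parameter identification, gains driven by the tracking error alone — can be shown with a classical scalar model reference adaptive controller; the plant, reference model, and adaptation gain below are textbook-style assumptions, not the flexible-joint dynamics of the paper.

```python
# Scalar MRAC: adapt feedback/feedforward gains so an unstable plant
# tracks a stable reference model, without ever estimating a or b.
a, b = 1.0, 1.0          # plant dx/dt = a*x + b*u (unknown to the controller)
am, bm = -2.0, 2.0       # reference model dxm/dt = am*xm + bm*r
gamma, dt = 5.0, 1e-3    # adaptation gain, Euler step
x = xm = 0.0
kx = kr = 0.0            # adaptive gains, start with no knowledge
r = 1.0                  # constant reference command

for _ in range(40000):   # 40 s of simulated time
    u = kx * x + kr * r
    e = x - xm
    kx -= gamma * e * x * dt      # Lyapunov-based adaptation (sign(b) = +1)
    kr -= gamma * e * r * dt
    x += (a * x + b * u) * dt
    xm += (am * xm + bm * r) * dt
```

The tracking error is driven toward zero even though kx and kr need not converge to their ideal values for a constant reference; the paper's decentralized mechanisms apply the same principle joint by joint.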

  9. Modelling and Statistical Optimization of Dilute Acid Hydrolysis of Corn Stover Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Andrew Nosakhare Amenaghawon

    2014-07-01

    Full Text Available Response surface methodology (RSM) was employed for the analysis of the simultaneous effect of acid concentration, pretreatment time and temperature on the total reducing sugar concentration obtained during acid hydrolysis of corn stover. A three-variable, three-level Box-Behnken design (BBD) was used to develop a statistical model for the optimization of the process variables. The optimal hydrolysis conditions that resulted in the maximum total reducing sugar concentration were an acid concentration of 1.72% (w/w), a temperature of 169.26 °C, and a pretreatment time of 48.73 minutes. Under these conditions, the total reducing sugar concentration was 23.41 g/L. Validation of the model indicated no significant difference between predicted and observed values.
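
The second-order fitting step behind RSM can be shrunk to one factor to show the mechanics: fit a quadratic to coded experimental runs and solve for the stationary point. The data below are synthetic, with the true optimum placed at coded x = 0.3; the study's actual Box-Behnken design has three factors.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.array([-1.0, -1.0, 0.0, 0.0, 0.0, 1.0, 1.0])        # coded factor levels
y = 20.0 - 5.0 * (x - 0.3) ** 2 + 0.05 * rng.standard_normal(x.size)

X = np.column_stack([np.ones_like(x), x, x ** 2])           # quadratic model matrix
beta, *_ = np.linalg.lstsq(X, y, rcond=None)                # least-squares fit
x_opt = -beta[1] / (2.0 * beta[2])                          # solve dy/dx = 0
```

With three factors, X gains linear, square, and cross-product columns and the stationary point comes from solving the gradient system, but the least-squares step is identical.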

  10. A new methodology for modelling of health risk from urban flooding exemplified by cholera

    DEFF Research Database (Denmark)

    Mark, Ole; Jørgensen, Claus; Hammond, Michael

    2016-01-01

    The phenomenon of urban flooding due to rainfall exceeding the design capacity of drainage systems is a global problem and can have significant economic and social consequences. This is even more extreme in developing countries, where poor sanitation still causes a high infectious disease burden and mortality, especially during floods. At present, there are no software tools capable of combining hydrodynamic modelling and health risk analyses, and the links between urban flooding and the health risk for the population due to direct contact with the flood water are poorly understood. The present paper outlines a novel methodology for linking dynamic urban flood modelling with quantitative microbial risk assessment (QMRA). This provides a unique possibility for understanding the interaction between urban flooding and health risk caused by direct human contact with the flood water and hence gives...

  11. State-space models for bio-loggers: A methodological road map

    DEFF Research Database (Denmark)

    Jonsen, I.D.; Basson, M.; Bestley, S.

    2012-01-01

    ... bio-physical datasets to understand physiological and ecological influences on habitat selection. In most cases, however, the behavioural context is not directly observable and therefore must be inferred. Animal movement data are complex in structure, entailing a need for stochastic analysis methods. The recent development of state-space modelling approaches for animal movement data provides statistical rigor for inferring hidden behavioural states, relating these states to bio-physical data, and ultimately for predicting the potential impacts of climate change. Despite the widespread utility, and current popularity, of state-space models for analysis of animal tracking data, these tools are not simple and require considerable care in their use. Here we develop a methodological “road map” for ecologists by reviewing currently available state-space implementations. We discuss appropriate use of state-space methods...
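
A minimal example of inferring hidden behavioural states is a two-state hidden Markov model decoded with the Viterbi algorithm; the "resident"/"transit" labels, Gaussian step-length emissions, and all parameter values are illustrative assumptions, not any particular tracking study's.

```python
import numpy as np

# states: 0 = "resident" (short steps), 1 = "transit" (long steps)
mu, sigma = np.array([1.0, 5.0]), np.array([0.5, 1.0])
logA = np.log(np.array([[0.9, 0.1],        # sticky transitions: animals tend
                        [0.1, 0.9]]))      # to persist in a behavioural mode
steps = np.array([1.1, 0.8, 1.3, 5.2, 4.7, 5.5, 1.0, 0.9])  # synthetic track

def log_emis(x):
    """Gaussian log-likelihood of one step length under each state."""
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

def viterbi(obs):
    delta = np.log([0.5, 0.5]) + log_emis(obs[0])
    back = []
    for x in obs[1:]:
        trans = delta[:, None] + logA          # trans[i, j]: come from i, land in j
        back.append(trans.argmax(axis=0))
        delta = trans.max(axis=0) + log_emis(x)
    path = [int(delta.argmax())]
    for b in reversed(back):                   # backtrack the best path
        path.append(int(b[path[-1]]))
    return path[::-1]

states = viterbi(steps)
```

Full state-space implementations add movement persistence, measurement error, and continuous states, but the inference target — the hidden behavioural sequence — is the same.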

  12. A methodology for 3D modeling and visualization of geological objects

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Geological body structure is the product of geological evolution in the time dimension, presented in 3D configuration in the natural world. However, many geologists still record and process their geological data in 2D or 1D patterns, which results in the loss of a large quantity of spatial data. One of the reasons is that current methods have limitations in expressing underground geological objects. To analyze and interpret geological models, we present a layer data model to organize different kinds of geological datasets. The data model implements unified expression and storage of geological data and geometric models. In addition, it provides a method for visualizing large-scale geological datasets by rapidly building multi-resolution geological models, which can meet the demands of operating on, analyzing, and interpreting 3D geological objects. The results prove that our methodology is competent for 3D modeling and self-adaptive visualization of large geological objects and is a good way to solve the problem of integrating and sharing geological spatial data.

  13. The Methodology of Interactive Parametric Modelling of Construction Site Facilities in BIM Environment

    Science.gov (United States)

    Kozlovská, Mária; Čabala, Jozef; Struková, Zuzana

    2014-11-01

    Information technology is becoming a powerful tool in many industries, including construction. The recent trend in building design is leading to the creation of comprehensive virtual building models (Building Information Models) that allow problems relating to the project to be solved as early as the design phase. Building information modelling is a new approach to producing building project documentation. Currently, the building site layout, as a part of the building design documents, has very little support in the BIM environment. Recent research into designing construction process conditions has centred on improving general planning practice and on new approaches to construction site layout planning. The state of the art in this field indicated an unexplored problem related to connecting a knowledge system with construction site facilities (CSF) layout through interactive modelling. The goal of the paper is to present a methodology for building a 3D construction site facility allocation model (3D CSF-IAM), based on principles of parametric and interactive modelling.

  14. Methodology to develop crash modification functions for road safety treatments with fully specified and hierarchical models.

    Science.gov (United States)

    Chen, Yongsheng; Persaud, Bhagwant

    2014-09-01

    Crash modification factors (CMFs) for road safety treatments are developed as multiplicative factors that are used to reflect the expected changes in safety performance associated with changes in highway design and/or the traffic control features. However, current CMFs have methodological drawbacks. For example, variability with application circumstance is not well understood, and, as importantly, correlation is not addressed when several CMFs are applied multiplicatively. These issues can be addressed by developing safety performance functions (SPFs) with components of crash modification functions (CM-Functions), an approach that includes all CMF-related variables, along with others, while capturing quantitative and other effects of factors and accounting for cross-factor correlations. CM-Functions can capture the safety impact of factors through a continuous and quantitative approach, avoiding the problematic categorical analysis that is often used to capture CMF variability. There are two formulations to develop such SPFs with CM-Function components - fully specified models and hierarchical models. Based on sample datasets from two Canadian cities, both approaches are investigated in this paper. While both model formulations yielded promising results and reasonable CM-Functions, the hierarchical model was found to be more suitable in retaining homogeneity of first-level SPFs, while addressing CM-Functions in sub-level modeling. In addition, hierarchical models better capture the correlations between different impact factors.
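
The multiplicative CMF convention the paper starts from, and the continuous CM-Function form it argues for, can both be written in a few lines; the base SPF value, the CMFs, and the lane-width coefficient are invented for illustration, not calibrated values.

```python
import math

def predicted_crashes(spf_base, cmfs):
    """Expected crashes = SPF prediction x product of applicable CMFs."""
    n = spf_base
    for c in cmfs:
        n *= c
    return n

def cmf_lane_width(width_m, beta=-0.5, base_m=3.6):
    """Continuous CM-Function: CMF = exp(beta * (width - base)).

    Returns 1.0 at the base condition; beta is a hypothetical coefficient.
    """
    return math.exp(beta * (width_m - base_m))

# 10 expected crashes/yr at base conditions, two treatments, plus 3.0 m lanes
n = predicted_crashes(10.0, [0.9, 0.8, cmf_lane_width(3.0)])
```

The multiplicative form implicitly assumes the factors act independently, which is exactly the correlation problem the CM-Function approach is meant to address.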

  15. A methodology for 3D modeling and visualization of geological objects

    Institute of Scientific and Technical Information of China (English)

    ZHANG LiQiang; TAN YuMin; KANG ZhiZhong; RUI XiaoPing; ZHAO YuanYuan; LIU Liu

    2009-01-01

    Geological body structure is the product of the geological evolution in the time dimension, which is presented in 3D configuration in the natural world. However, many geologists still record and process their geological data using the 2D or 1D pattern, which results in the loss of a large quantity of spatial data. One of the reasons is that the current methods have limitations on how to express underground geological objects. To analyze and interpret geological models, we present a layer data model to organize different kinds of geological datasets. The data model implemented the unification expression and storage of geological data and geometric models. In addition, it is a method for visualizing large-scale geological datasets through building multi-resolution geological models rapidly, which can meet the demand of the operation, analysis, and interpretation of 3D geological objects. It proves that our methodology is competent for 3D modeling and self-adaptive visualization of large geological objects and it is a good way to solve the problem of integration and share of geological spatial data.

  16. Agent-Oriented Methodology and Modeling Tools%面向主体的开发方法和可视化建模工具

    Institute of Scientific and Technical Information of China (English)

    季强

    2002-01-01

    This paper introduces an agent-oriented methodology and modeling tools based on MAGE. The methodology supports analysis, design and implementation of multi-agent systems. The modeling tools assist the developer in building multi-agent systems with the methodology through a set of visual model editors.

  17. Revising Translations

    DEFF Research Database (Denmark)

    Rasmussen, Kirsten Wølch; Schjoldager, Anne

    2011-01-01

    ... out by specialised revisers, but by staff translators, who revise the work of colleagues and freelancers on an ad hoc basis. Corrections are mostly given in a peer-to-peer fashion, though the work of freelancers and inexperienced in-house translators is often revised in an authoritative (nonnegotiable...

  18. Prediction of hip joint load and translation using musculoskeletal modelling with force-dependent kinematics and experimental validation.

    Science.gov (United States)

    Zhang, Xuan; Chen, Zhenxian; Wang, Ling; Yang, Wenjian; Li, Dichen; Jin, Zhongmin

    2015-07-01

    Musculoskeletal lower limb models are widely used to predict the resultant contact force in the hip joint as a non-invasive alternative to instrumented implants. Previous musculoskeletal models based on rigid-body assumptions treated the hip joint as an ideal sphere with only three rotational degrees of freedom. A musculoskeletal model that considers force-dependent kinematics with three additional translational degrees of freedom was developed and validated in this study by comparison with a previous experimental measurement. A 32-mm femoral head against a polyethylene cup was considered in the musculoskeletal model for calculating the contact forces. The changes in the main modelling parameters were found to have little influence on the hip joint forces (relative deviation of the peak value …). The force-dependent kinematics approach underestimated the maximum hip contact force by a mean value of 6.68 ± 1.75% BW compared with the experimental measurements. The predicted maximum translations of the hip joint centres were 0.125 ± 0.03 mm in level walking and 0.123 ± 0.005 mm in climbing stairs.

  19. Integrated Methodology for Information System Change Control Based on Enterprise Architecture Models

    Directory of Open Access Journals (Sweden)

    Pirta Ruta

    2015-12-01

    Full Text Available The information system (IS) change management and governance are, according to best practices, defined and described in several international methodologies, standards, and frameworks (ITIL, COBIT, ValIT, etc.). These methodologies describe IS change management aspects from the viewpoint of their particular enterprise resource management area. The areas are mainly viewed in a partly isolated environment, and the integration of the existing methodologies is insufficient for providing unified and controlled methodological support for holistic IS change management. In this paper, an integrated change management methodology is introduced. The methodology consists of guidelines for IS change control that integrate the following significant resource management areas – information technology (IT) governance, change management and enterprise architecture (EA) change management. In addition, the methodology includes lists of controls applicable at different phases. The approach is based on re-use and fusion of principles used by related methodologies, as well as on empirical observations about typical IS change management mistakes in enterprises.

  20. A Model-Based Prognostics Methodology For Electrolytic Capacitors Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...
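
The Kalman-filter prognostics idea can be sketched for a scalar health parameter with a constant-rate degradation model and a remaining-useful-life (RUL) extrapolation to a failure threshold; the degradation rate, noise levels, and threshold below are illustrative, not the paper's empirical aging model for electrolytic capacitors.

```python
import numpy as np

# Synthetic degradation data: a health parameter (think normalized capacitance)
# decaying slowly and linearly, observed with measurement noise.
rng = np.random.default_rng(2)
dt, steps = 1.0, 50
truth = 1.0 - 0.002 * dt * np.arange(steps)
meas = truth + 0.01 * rng.standard_normal(steps)

F = np.array([[1.0, dt], [0.0, 1.0]])          # state: [health, degradation rate]
H = np.array([[1.0, 0.0]])                     # only health is measured
Q, R = 1e-7 * np.eye(2), np.array([[1e-4]])    # process / measurement noise

x, P = np.array([1.0, 0.0]), np.eye(2)
for z in meas:
    x, P = F @ x, F @ P @ F.T + Q              # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)        # update with measurement
    P = (np.eye(2) - K @ H) @ P

threshold = 0.8                                # failure criterion
rul = (threshold - x[0]) / x[1]                # steps until threshold at est. rate
```

A full prognostics implementation would replace the linear model with an empirically derived degradation model from accelerated aging data and carry the state covariance into an RUL uncertainty bound.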