WorldWideScience

Sample records for model translation methodology

  1. Research methodology workshops evaluation using the Kirkpatrick's model: translating theory into practice.

    Science.gov (United States)

    Abdulghani, Hamza Mohammad; Shaik, Shaffi Ahamed; Khamis, Nehal; Al-Drees, Abdulmajeed Abdulrahman; Irshad, Mohammad; Khalil, Mahmoud Salah; Alhaqwi, Ali Ibrahim; Isnani, Arthur

    2014-04-01

    Qualitative and quantitative evaluation of academic programs can enhance the development, effectiveness, and dissemination of comparative quality reports as well as quality improvement efforts. The aim was to evaluate five research methodology workshops by assessing participants' satisfaction, knowledge and skills gain, and impact on practice, using Kirkpatrick's evaluation model. The four-level Kirkpatrick model was applied. Training feedback questionnaires, pre- and post-tests, learner development plan reports, and behavioral surveys were used to evaluate the effectiveness of the workshop programs. Of the 116 participants, 28 (24.1%) liked the programs with appreciation, 62 (53.4%) liked them with suggestions, and 26 (22.4%) disliked them. Mean scores on pre- and post-workshop MCQ tests showed a significant improvement of 17.67% in relevant basic knowledge and cognitive skills (p ≤ 0.005). Scores on workshop sub-topics improved significantly for manuscript writing (p ≤ 0.031) but not for proposal writing (p = 0.834). As for impact, 56.9% of participants started research and 6.9% published their studies. Participants' performance revealed overall positive feedback, and 79% of participants reported transfer of training skills to their workplace. The achievement of course outcomes and the suggestions given for improvement offer encouraging and useful insight into the program. Encouraging a "research culture" and work-based learning are probably the most powerful determinants of research promotion. These findings therefore encourage the faculty development unit to continue its training and development in research methodology.
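    The pre/post comparison reported above follows a standard paired design. As a minimal sketch (with invented MCQ scores on a 0-100 scale, not the study's data), the paired t-statistic behind such a significance claim can be computed as:

```python
import math

def paired_t(pre, post):
    """Mean gain and paired t-statistic for pre/post test scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the differences (n - 1 in the denominator)
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    se = math.sqrt(var / n)  # standard error of the mean difference
    return mean, mean / se

# Hypothetical scores for 8 participants (illustrative only)
pre = [52, 60, 48, 55, 63, 50, 58, 61]
post = [70, 74, 66, 72, 80, 65, 75, 77]
gain, t_stat = paired_t(pre, post)
```

    A large t-statistic on n - 1 degrees of freedom corresponds to a small p-value, as in the workshops' p ≤ 0.005 result.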

  2. Learning by Translating: A Contrastive Methodology for ESP Learning and Translation

    Directory of Open Access Journals (Sweden)

    Sara Laviosa

    2015-11-01

    Over the last few years applied linguists have explored the possibility of integrating the insights of second language acquisition theories, contrastive analysis, foreign language teaching methodologies, and translation studies with a view to enhancing current communicative models and techniques for L2 teaching and translator training (see for example Sewell and Higgins 1996; Laviosa-Braithwaite 1997; Campbell 1998; Malmkjær 1998; Laviosa 2000; Colina 2002). We intend to make a contribution to this interdisciplinary orientation by putting forward a translation-based methodology for learning ESP vocabulary and grammar through real-life mediating communicative activities. With particular reference to the translation task itself, we endeavour to provide teachers of English for special purposes and translator trainers with a methodology for guiding their students in producing, to the best of their abilities, a target text which meets the quality criteria of terminological accuracy and stylistic fluency, and is also effective in terms of the communicative situation it is intended for. After outlining the rationale and main theoretical approaches underpinning our work, we illustrate our methodology for learning ESP vocabulary and translation skills from a contrastive perspective, as in our book Learning by Translating (Laviosa and Cleverton 2003).

  3. Methodological considerations when translating “burnout”

    Directory of Open Access Journals (Sweden)

    Allison Squires

    2014-09-01

    No study has systematically examined how researchers address cross-cultural adaptation of burnout. We conducted an integrative review to examine how researchers had adapted the instruments to the different contexts. We reviewed the Content Validity Indexing scores for the Maslach Burnout Inventory-Human Services Survey from the 12-country comparative nursing workforce study, RN4CAST. In the integrative review, multiple issues related to translation were found in existing studies. In the cross-cultural instrument analysis, 7 out of 22 items on the instrument received an extremely low kappa score. Investigators may need to employ more rigorous cross-cultural adaptation methods when attempting to measure burnout.
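    The "kappa score" referred to above measures chance-corrected agreement between raters. A minimal Cohen's kappa for two raters' relevant/not-relevant item judgments (the ratings are hypothetical, and content validity indexing studies typically use a related chance-adjusted variant rather than exactly this form):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    # Observed proportion of agreement
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement if the raters were independent
    p_exp = sum((rater1.count(l) / n) * (rater2.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical relevance ratings for 10 survey items (1 = relevant)
r1 = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
r2 = [1, 0, 1, 0, 1, 1, 1, 1, 0, 1]
kappa = cohens_kappa(r1, r2)
```

    Values near 0 indicate agreement no better than chance, which is why extremely low item-level kappas flag translation problems.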

  4. A Metadata based Knowledge Discovery Methodology for Seeding Translational Research.

    Science.gov (United States)

    Kothari, Cartik R; Payne, Philip R O

    2015-01-01

    In this paper, we present a semantic, metadata-based knowledge discovery methodology for identifying teams of researchers from diverse backgrounds who can collaborate on interdisciplinary research projects in areas identified as high-impact at The Ohio State University. This methodology involves the semantic annotation of keywords and the postulation of semantic metrics to improve the efficiency of the path exploration algorithm as well as to rank the results. Results indicate that our methodology can discover groups of experts from diverse areas who can collaborate on translational research projects.
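    The path exploration the abstract describes can be pictured as a shortest-path search over a graph linking experts through annotated keywords. A toy sketch using Dijkstra's algorithm, with edge weights standing in for the paper's semantic metrics (the graph, names, and weights are all illustrative assumptions, not the authors' actual data or metric):

```python
import heapq

def shortest_semantic_path(graph, start, goal):
    """Dijkstra over an expert/keyword graph; smaller edge weights
    mean stronger semantic relatedness (illustrative stand-in for
    the paper's semantic metrics)."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")

# Toy graph: experts linked through semantically annotated keywords
graph = {
    "expert_A": [("keyword_imaging", 0.2)],
    "keyword_imaging": [("keyword_oncology", 0.5), ("expert_B", 0.3)],
    "keyword_oncology": [("expert_C", 0.4)],
}
candidates = ["expert_B", "expert_C"]
ranked = sorted(candidates,
                key=lambda e: shortest_semantic_path(graph, "expert_A", e))
```

    Candidates are then ranked by semantic distance from the seed expert, mirroring the ranking step the methodology postulates.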

  5. Contemporary Research on Parenting: Conceptual, Methodological, and Translational Issues

    OpenAIRE

    Power, Thomas G.; Sleddens, Ester F. C.; Berge, Jerica; Connell, Lauren; Govig, Bert; Hennessy, Erin; Liggett, Leanne; Mallan, Kimberley; Santa Maria, Diane; Odoms-Young, Angela; St. George, Sara M.

    2013-01-01

    Researchers over the last decade have documented the association between general parenting style and numerous factors related to childhood obesity (e.g., children's eating behaviors, physical activity, and weight status). Many recent childhood obesity prevention programs are family focused and designed to modify parenting behaviors thought to contribute to childhood obesity risk. This article presents a brief consideration of conceptual, methodological, and translational issues that can infor...

  6. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes are reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, to guide the development of EHR systems, and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also describes possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices of CIM, software, and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Bringing translation out of the shadows: translation as an issue of methodological significance in cross-cultural qualitative research.

    Science.gov (United States)

    Wong, Josephine Pui-Hing; Poon, Maurice Kwong-Lai

    2010-04-01

    Translation is an integral component of cross-cultural research that has remained invisible. It is commonly assumed that translation is an objective and neutral process, in which the translators are "technicians" in producing texts in different languages. Drawing from the field of translation studies and the findings of a translation exercise conducted with three bilingual Cantonese-English translators, the authors highlight some of the methodological issues about translation in cross-cultural qualitative research. They argue that only by making translation visible and through open dialogue can researchers uncover the richness embedded in the research data and facilitate multiple ways of knowing.

  8. Livestock models in translational medicine.

    Science.gov (United States)

    Roth, James A; Tuggle, Christopher K

    2015-01-01

    This issue of the ILAR Journal focuses on livestock models in translational medicine. Livestock models of selected human diseases present important advantages as compared with rodent models for translating fundamental breakthroughs in biology to useful preventatives and therapeutics for humans. Livestock reflect the complexity of applying medical advances in an outbred species. In many cases, the pathogenesis of infectious, metabolic, genetic, and neoplastic diseases in livestock species more closely resembles that in humans than does the pathogenesis of rodent models. Livestock models also provide the advantage of similar organ size and function and the ability to serially sample an animal throughout the study period. Research using livestock models for human disease often benefits not only human health but animal health and food production as well. This issue of the ILAR Journal presents information on translational research using livestock models in two broad areas: microbiology and infectious disease (transmissible spongiform encephalopathies, mycobacterial infections, influenza A virus infection, vaccine development and testing, the human microbiota) and metabolic, neoplastic, and genetic disorders (stem cell therapy, male germ line cell biology, pulmonary adenocarcinoma, muscular dystrophy, wound healing). In addition, there is a manuscript devoted to Institutional Animal Care and Use Committees' responsibilities for reviewing research using livestock models. Conducting translational research using livestock models requires special facilities and researchers with expertise in livestock. There are many institutions in the world with experienced researchers and facilities designed for livestock research, primarily associated with colleges of agriculture and veterinary medicine or government laboratories. © The Author 2015. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research. All rights reserved.

  9. The ECOUTER methodology for stakeholder engagement in translational research.

    Science.gov (United States)

    Murtagh, Madeleine J; Minion, Joel T; Turner, Andrew; Wilson, Rebecca C; Blell, Mwenza; Ochieng, Cynthia; Murtagh, Barnaby; Roberts, Stephanie; Butters, Oliver W; Burton, Paul R

    2017-04-04

    Because no single person or group holds knowledge about all aspects of research, mechanisms are needed to support knowledge exchange and engagement. Expertise in the research setting necessarily includes scientific and methodological expertise, but also expertise gained through the experience of participating in research and/or being a recipient of research outcomes (as a patient or member of the public). Engagement is, by its nature, reciprocal and relational: the process of engaging research participants, patients, citizens and others (the many 'publics' of engagement) brings them closer to the research but also brings the research closer to them. When translating research into practice, engaging the public and other stakeholders is explicitly intended to make the outcomes of translation relevant to its constituency of users. In practice, engagement faces numerous challenges and is often time-consuming, expensive and 'thorny' work. We explore the epistemic and ontological considerations and implications of four common critiques of engagement methodologies that contest: representativeness, communication and articulation, impacts and outcome, and democracy. The ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) methodology addresses problems of representation and epistemic foundationalism using a methodology that asks, "How could it be otherwise?" ECOUTER affords the possibility of engagement where spatial and temporal constraints are present, relying on saturation as a method of 'keeping open' the possible considerations that might emerge and including reflexive use of qualitative analytic methods. This paper describes the ECOUTER process, focusing on one worked example and detailing lessons learned from four other pilots. ECOUTER uses mind-mapping techniques to 'open up' engagement, iteratively and organically. ECOUTER aims to balance the breadth, accessibility and user-determination of the scope of engagement. An ECOUTER

  10. Contemporary research on parenting: conceptual, methodological, and translational issues.

    Science.gov (United States)

    Power, Thomas G; Sleddens, Ester F C; Berge, Jerica; Connell, Lauren; Govig, Bert; Hennessy, Erin; Liggett, Leanne; Mallan, Kimberley; Santa Maria, Diane; Odoms-Young, Angela; St George, Sara M

    2013-08-01

    Researchers over the last decade have documented the association between general parenting style and numerous factors related to childhood obesity (e.g., children's eating behaviors, physical activity, and weight status). Many recent childhood obesity prevention programs are family focused and designed to modify parenting behaviors thought to contribute to childhood obesity risk. This article presents a brief consideration of conceptual, methodological, and translational issues that can inform future research on the role of parenting in childhood obesity. They include: (1) General versus domain specific parenting styles and practices; (2) the role of ethnicity and culture; (3) assessing bidirectional influences; (4) broadening assessments beyond the immediate family; (5) novel approaches to parenting measurement; and (6) designing effective interventions. Numerous directions for future research are offered.

  11. Contemporary Research on Parenting: Conceptual, Methodological, and Translational Issues

    Science.gov (United States)

    Sleddens, Ester F. C.; Berge, Jerica; Connell, Lauren; Govig, Bert; Hennessy, Erin; Liggett, Leanne; Mallan, Kimberley; Santa Maria, Diane; Odoms-Young, Angela; St. George, Sara M.

    2013-01-01

    Researchers over the last decade have documented the association between general parenting style and numerous factors related to childhood obesity (e.g., children's eating behaviors, physical activity, and weight status). Many recent childhood obesity prevention programs are family focused and designed to modify parenting behaviors thought to contribute to childhood obesity risk. This article presents a brief consideration of conceptual, methodological, and translational issues that can inform future research on the role of parenting in childhood obesity. They include: (1) General versus domain specific parenting styles and practices; (2) the role of ethnicity and culture; (3) assessing bidirectional influences; (4) broadening assessments beyond the immediate family; (5) novel approaches to parenting measurement; and (6) designing effective interventions. Numerous directions for future research are offered. PMID:23944927

  12. Translational invariance in bag model

    International Nuclear Information System (INIS)

    Megahed, F.

    1981-10-01

    In this thesis, the effect of restoring translational invariance to an approximation to the MIT bag model on the calculation of deep inelastic structure functions is investigated. In chapter one, the model and its major problems are reviewed and Dirac's method of quantisation is outlined. This method is used in chapter two to quantise a two-dimensional complex scalar bag, and formal expressions for the form factor and the structure functions are obtained. In chapter three, the expression for the structure function away from the Bjorken limit is studied. The corrections to the L₀ approximation to the structure function are calculated in chapter four and shown to be large. Finally, in chapter five, a bag-like model for kinematic corrections to structure functions is introduced and agreement with data between 2 and 6 (GeV/c)² is obtained. (author)

  13. Lost in Translation: Methodological Considerations in Cross-Cultural Research

    Science.gov (United States)

    Pena, Elizabeth D.

    2007-01-01

    In cross-cultural child development research there is often a need to translate instruments and instructions to languages other than English. Typically, the translation process focuses on ensuring linguistic equivalence. However, establishment of linguistic equivalence through translation techniques is often not sufficient to guard against…

  14. Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?

    Science.gov (United States)

    Brondani, Mario; He, Sarah

    2013-01-01

    Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…

  15. Studies into abnormal aggression in humans and rodents: Methodological and translational aspects.

    Science.gov (United States)

    Haller, Jozsef

    2017-05-01

    Here we review the principles based on which aggression is rendered abnormal in humans and laboratory rodents, and comparatively overview the main methodological approaches based on which this behavior is studied in the two categories of subjects. It appears that the discriminating property of abnormal aggression is rule breaking, which renders aggression dysfunctional from the point of view of the perpetrator. We show that rodent models of abnormal aggression were created by the translation of human conditions into rodent equivalents, and discuss how findings obtained with such models may be "translated back" to human conditions when the mechanisms underlying aggression and its possibilities of treatment are investigated. We suggest that the complementary nature of human and rodent research approaches invites a more intense cross-talk between the two sides of aggression research than the one presently observed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Methodological standards for in vitro models of epilepsy and epileptic seizures. A TASK1-WG4 report of the AES/ILAE Translational Task Force of the ILAE

    Czech Academy of Sciences Publication Activity Database

    Raimondo, J. V.; Heinemann, U.; de Curtis, M.; Goodkin, H. P.; Dulla, Ch. G.; Janigro, D.; Ikeda, A.; Lin, Ch.-Ch. K.; Jiruška, Přemysl; Galanopoulou, A. S.; Bernard, Ch.

    2017-01-01

    Vol. 58, Suppl. 4 (2017), pp. 40-52. ISSN 0013-9580. R&D Projects: GA MZd(CZ) NV15-29835A; GA MZd(CZ) NV15-33115A; GA ČR(CZ) GA14-02634S; GA ČR(CZ) GA15-08565S. Institutional support: RVO:67985823. Keywords: brain slice preparation * electrophysiological recording methods * recording solution composition * in vitro models of seizures * animal selection and killing. Subject RIV: FH - Neurology. OECD field: Neurosciences (including psychophysiology). Impact factor: 5.295, year: 2016

  17. Latent domain models for statistical machine translation

    NARCIS (Netherlands)

    Hoàng, C.

    2017-01-01

    A data-driven approach to model translation suffers from the data mismatch problem and demands domain adaptation techniques. Given parallel training data originating from a specific domain, training an MT system on the data would result in a rather suboptimal translation for other domains. But does

  18. The Integration of Project-Based Methodology into Teaching in Machine Translation

    Science.gov (United States)

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…

  19. Key Methodological Aspects of Translators' Training in Ukraine and in the USA

    Science.gov (United States)

    Skyba, Kateryna

    2015-01-01

    The diversity of international relations in the globalized world has influenced the role of a translator that is becoming more and more important. Translators' training institutions today are to work out and to implement the best teaching methodology taking into consideration the new challenges of modern multinational and multicultural society.…

  20. Methodological considerations when translating “burnout”

    Science.gov (United States)

    Squires, Allison; Finlayson, Catherine; Gerchow, Lauren; Cimiotti, Jeannie P.; Matthews, Anne; Schwendimann, Rene; Griffiths, Peter; Busse, Reinhard; Heinen, Maude; Brzostek, Tomasz; Moreno-Casbas, Maria Teresa; Aiken, Linda H.; Sermeus, Walter

    2014-01-01

    No study has systematically examined how researchers address cross-cultural adaptation of burnout. We conducted an integrative review to examine how researchers had adapted the instruments to the different contexts. We reviewed the Content Validity Indexing scores for the Maslach Burnout Inventory-Human Services Survey from the 12-country comparative nursing workforce study, RN4CAST. In the integrative review, multiple issues related to translation were found in existing studies. In the cross-cultural instrument analysis, 7 out of 22 items on the instrument received an extremely low kappa score. Investigators may need to employ more rigorous cross-cultural adaptation methods when attempting to measure burnout. PMID:25343131

  1. Methodological aspects in quantitative translational neuroimaging in central nervous system diseases with Positron Emission Tomography

    International Nuclear Information System (INIS)

    Müllauer, J.

    2013-01-01

    Patients suffering from central nervous system (CNS) diseases crucially depend on a sufficient supply of CNS-active drugs that help them to control and endure their illness. As the site of action of CNS drugs is in the brain, these substances need to pass the blood-brain barrier (BBB), a physiological barrier separating the blood circulation and the brain. However, CNS drug treatment is often accompanied by pharmacoresistance (drug resistance). Multidrug transporters, such as P-glycoprotein (Pgp), are responsible for a gradient-dependent transport of substances over the BBB. Drug resistance is hypothesised to result from overactivity of multidrug transporters at the BBB, leading to insufficient CNS drug levels in the brain. In epilepsy, drug resistance is observed in up to 20-40% of patients. The influence of Pgp overexpression on drug resistance in epilepsy was studied using positron emission tomography (PET), a non-invasive nuclear imaging method, together with radioligands that interact with Pgp. Radiolabeled Pgp substrates ((R)-[11C]verapamil) and inhibitors ([11C]elacridar and [11C]tariquidar) were developed and used to study the influence of Pgp and other transporters at the BBB in a translational research approach, in animal models of epilepsy and in humans. The aim in translational PET research is a direct comparison of gathered animal and human data. Consequently, diverse methodological challenges arise that need to be addressed and observed in order to enable a translation between species. To achieve full quantification of the function and density of drug transporters at the BBB in both humans and rodents, kinetic (compartmental) modeling was applied to the PET pharmacokinetic data. Estimated modeling parameters were then used to estimate biological and physiological processes of Pgp at the BBB. Subsequently, nonlinear mixed effects modeling was deployed to increase the mechanistic

  2. The ECOUTER methodology for stakeholder engagement in translational research

    OpenAIRE

    Murtagh, Madeleine J.; Minion, Joel T.; Turner, Andrew; Wilson, Rebecca C.; Blell, Mwenza; Ochieng, Cynthia; Murtagh, Barnaby; Roberts, Stephanie; Butters, Oliver W.; Burton, Paul R

    2017-01-01

    Background: Because no single person or group holds knowledge about all aspects of research, mechanisms are needed to support knowledge exchange and engagement. Expertise in the research setting necessarily includes scientific and methodological expertise, but also expertise gained through the experience of participating in research and/or being a recipient of research outcomes (as a patient or member of the public). Engagement i...

  3. Hybrid intelligent methodology to design translation invariant morphological operators for Brazilian stock market prediction.

    Science.gov (United States)

    Araújo, Ricardo de A

    2010-12-01

    This paper presents a hybrid intelligent methodology to design increasing translation invariant morphological operators applied to Brazilian stock market prediction (overcoming the random walk dilemma). The proposed Translation Invariant Morphological Robust Automatic phase-Adjustment (TIMRAA) method consists of a hybrid intelligent model composed of a Modular Morphological Neural Network (MMNN) with a Quantum-Inspired Evolutionary Algorithm (QIEA), which searches for the best time lags to reconstruct the phase space of the time series generator phenomenon and determines the initial (sub-optimal) parameters of the MMNN. Each individual of the QIEA population is further trained by the Back Propagation (BP) algorithm to improve the MMNN parameters supplied by the QIEA. Also, for each prediction model generated, it uses a behavioral statistical test and a phase fix procedure to adjust time phase distortions observed in stock market time series. Furthermore, an experimental analysis is conducted with the proposed method through four Brazilian stock market time series, and the achieved results are discussed and compared to results found with random walk models and the previously introduced Time-delay Added Evolutionary Forecasting (TAEF) and Morphological-Rank-Linear Time-lag Added Evolutionary Forecasting (MRLTAEF) methods. Copyright © 2010 Elsevier Ltd. All rights reserved.
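    The "best time lags to reconstruct the phase space" refers to delay embedding of the time series. A minimal sketch with a fixed lag set (in TIMRAA the QIEA would search for these lags; the series below is invented for illustration):

```python
def delay_embed(series, lags):
    """Build delay vectors (x[t-l1], x[t-l2], ...) from a scalar series."""
    max_lag = max(lags)
    # One embedded point per time step that has enough history
    return [[series[t - lag] for lag in lags]
            for t in range(max_lag, len(series))]

# Toy price-like series; each embedded point would feed the predictor
series = [0.1, 0.4, 0.3, 0.7, 0.6, 0.9, 0.8, 1.0]
points = delay_embed(series, lags=[1, 2, 3])
```

    The embedded points serve as input vectors to the forecasting model; choosing the lag set well is what the evolutionary search is for.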

  4. Evaluating Translational Research: A Process Marker Model

    Science.gov (United States)

    Trochim, William; Kane, Cathleen; Graham, Mark J.; Pincus, Harold A.

    2011-01-01

    Objective: We examine the concept of translational research from the perspective of evaluators charged with assessing translational efforts. One of the major tasks for evaluators involved in translational research is to help assess efforts that aim to reduce the time it takes to move research to practice and health impacts. Another is to assess efforts that are intended to increase the rate and volume of translation. Methods: We offer an alternative to the dominant contemporary tendency to define translational research in terms of a series of discrete “phases.” Results: We contend that this phased approach has been confusing and that it is insufficient as a basis for evaluation. Instead, we argue for the identification of key operational and measurable markers along a generalized process pathway from research to practice. Conclusions: This model provides a foundation for the evaluation of interventions designed to improve translational research and the integration of these findings into a field of translational studies. Clin Trans Sci 2011; Volume 4: 153–162. PMID:21707944

  5. MODEL OF TEACHING PROFESSION SPECIFIC BILATERAL TRANSLATION

    Directory of Open Access Journals (Sweden)

    Yana Fabrychna

    2017-03-01

    The article deals with the author’s interpretation of the process of teaching profession-specific bilateral translation to student teachers of English in the Master’s program. The goal of developing the model of teaching profession-specific bilateral translation is to determine the logical sequence of educational activities of the teacher as the organizer of the educational process and of students as its members. English and Ukrainian texts on methods of teaching foreign languages and cultures are defined as the object of study. Learning activities aimed at developing student teachers’ profession-specific competence in bilateral translation, together with the Translation Proficiency Language Portfolio for Student Teachers of English, are suggested as teaching tools. The realization of the model of teaching profession-specific bilateral translation to student teachers of English in the Master’s program is suggested within the module topics of the academic discipline «Practice of English as the first foreign language»: Globalization; Localization; Education; Work; The role of new communication technologies in personal and professional development. We believe that the amount of time needed for efficient functioning of the model is 48 academic hours, which was determined by calculating the total number of academic hours allotted for the academic discipline «Practice of English as the first foreign language» in Ukrainian universities. Peculiarities of the model’s realization, as well as learning goals and the content of class activities and students’ self-study work, are outlined.

  6. Methodology for the analysis of transcription and translation in transcription-coupled-to-translation systems in vitro.

    Science.gov (United States)

    Castro-Roa, Daniel; Zenkin, Nikolay

    2015-09-15

    The various properties of RNA polymerase (RNAP) complexes with nucleic acids during different stages of transcription involve various types of regulation and different cross-talk with other cellular entities and with fellow RNAP molecules. The interactions of the transcriptional apparatus with the translational machinery have been studied mainly in terms of gene expression outcomes, whereas the physical interaction of the ribosome and the RNAP remains obscure, partly due to the lack of a system that allows such observations. In this article we describe the methodology needed to set up a pure transcription-coupled-to-translation system in which the translocation of the ribosome can be performed in a step-wise manner towards RNAP, allowing investigation of the interactions between the two machineries at colliding and non-colliding distances. At the same time, RNAP can be put into various states, such as paused, roadblocked, backtracked, etc. The experimental system thus allows studying the effects of the ribosome on different aspects of transcription elongation and the effects of RNAP on translation. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Neural Machine Translation with Recurrent Attention Modeling

    OpenAIRE

    Yang, Zichao; Hu, Zhiting; Deng, Yuntian; Dyer, Chris; Smola, Alex

    2016-01-01

    Knowing which words have been attended to in previous time steps while generating a translation is a rich source of information for predicting what words will be attended to in the future. We improve upon the attention model of Bahdanau et al. (2014) by explicitly modeling the relationship between previous and subsequent attention levels for each word using one recurrent network per input word. This architecture easily captures informative features, such as fertility and regularities in relat...

  8. Towards a Multimodal Methodology for the Analysis of Translated/Localised Games

    Directory of Open Access Journals (Sweden)

    Bárbara Resende Coelho

    2016-12-01

Full Text Available Multimedia materials require research methodologies that can account for all of their assets. Videogames are the epitome of multimedia, joining image, sound, video, animation, graphics and text with the factor of interactivity. A methodology for research into the translation and localisation of videogames should therefore be able to analyse all of their assets and features. This paper sets out to develop a research methodology for games and their translations/localisations that goes beyond the collection and analysis of screenshots and includes as many of their assets as possible. Using the fully localised version of the game Watchdogs, this paper shows how tools and technologies make it possible to go beyond the mere analysis of linguistic content in multimedia materials. Using the ELAN software (The Language Archive) to analyse Portuguese-language dubbed and English-language subtitled excerpts from the videogame, it was possible to identify patterns in both linguistic and audio-visual elements, as well as to correlate them.

  9. Standards and Methodological Rigor in Pulmonary Arterial Hypertension Preclinical and Translational Research.

    Science.gov (United States)

    Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien

    2018-03-30

Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigour in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from

  10. Mapping new theoretical and methodological terrain for knowledge translation: contributions from critical realism and the arts

    Science.gov (United States)

    Kontos, Pia C; Poland, Blake D

    2009-01-01

    Background Clinical practice guidelines have been a popular tool for the improvement of health care through the implementation of evidence from systematic research. Yet, it is increasingly clear that knowledge alone is insufficient to change practice. The social, cultural, and material contexts within which practice occurs may invite or reject innovation, complement or inhibit the activities required for success, and sustain or alter adherence to entrenched practices. However, knowledge translation (KT) models are limited in providing insight about how and why contextual contingencies interact, the causal mechanisms linking structural aspects of context and individual agency, and how these mechanisms influence KT. Another limitation of KT models is the neglect of methods to engage potential adopters of the innovation in critical reflection about aspects of context that influence practice, the relevance and meaning of innovation in the context of practice, and the identification of strategies for bringing about meaningful change. Discussion This paper presents a KT model, the Critical Realism and the Arts Research Utilization Model (CRARUM), that combines critical realism and arts-based methodologies. Critical realism facilitates understanding of clinical settings by providing insight into the interrelationship between its structures and potentials, and individual action. The arts nurture empathy, and can foster reflection on the ways in which contextual factors influence and shape clinical practice, and how they may facilitate or impede change. The combination of critical realism and the arts within the CRARUM model promotes the successful embedding of interventions, and greater impact and sustainability. Conclusion CRARUM has the potential to strengthen the science of implementation research by addressing the complexities of practice settings, and engaging potential adopters to critically reflect on existing and proposed practices and strategies for sustaining

  11. Expectations for methodology and translation of animal research: a survey of health care workers.

    Science.gov (United States)

    Joffe, Ari R; Bara, Meredith; Anton, Natalie; Nobis, Nathan

    2015-05-07

Health care workers (HCW) often perform, promote, and advocate the use of public funds for animal research (AR); therefore, an awareness of the empirical costs and benefits of animal research is an important issue for HCW. We aimed to determine what HCW consider acceptable standards of AR methodology and translation rate to humans. After development and validation, an e-mail survey was sent to all pediatricians and pediatric intensive care unit nurses and respiratory therapists (RTs) affiliated with a Canadian university. We presented questions about demographics, methodology of AR, and expectations from AR. Responses of pediatricians and nurses/RTs were compared using Chi-square tests. Regarding methodological quality, most respondents expect that: AR is done to high quality; costs and difficulty are not acceptable justifications for low quality; findings should be reproducible between laboratories and strains of the same species; and guidelines for AR funded with public money should be consistent with these expectations. Asked about benefits of AR, most thought that there are sometimes/often large benefits to humans from AR, and disagreed that "AR rarely produces benefit to humans." Asked about expectations of translation to humans (of toxicity, carcinogenicity, teratogenicity, and treatment findings), most expect translation >40% of the time and thought that misleading AR results should be rare. HCW thus have high expectations of the methodological quality of, and the translation rate to humans of, findings from AR. These expectations are higher than what the empirical data show has been achieved. Unless these areas of AR significantly improve, HCW support of AR may be tenuous.

  12. Innovation and Integrity in Intervention Research: Conceptual Issues, Methodology, and Knowledge Translation.

    Science.gov (United States)

    Malti, Tina; Beelmann, Andreas; Noam, Gil G; Sommer, Simon

    2018-04-01

    In this article, we introduce the special issue entitled Innovation and Integrity in Intervention Science. Its focus is on essential problems and prospects for intervention research examining two related topics, i.e., methodological issues and research integrity, and challenges in the transfer of research knowledge into practice and policy. The main aims are to identify how to advance methodology in order to improve research quality, examine scientific integrity in the field of intervention science, and discuss future steps to enhance the transfer of knowledge about evidence-based intervention principles into sustained practice, routine activities, and policy decisions. Themes of the special issue are twofold. The first includes questions about research methodology in intervention science, both in terms of research design and methods, as well as data analyses and the reporting of findings. Second, the issue tackles questions surrounding the types of knowledge translation frameworks that might be beneficial to mobilize the transfer of research-based knowledge into practice and public policies. The issue argues that innovations in methodology and thoughtful approaches to knowledge translation can enable transparency, quality, and sustainability of intervention research.

  13. The Modelling of Axially Translating Flexible Beams

    Science.gov (United States)

    Theodore, R. J.; Arakeri, J. H.; Ghosal, A.

    1996-04-01

The axially translating flexible beam with a prismatic joint can be modelled by using the Euler-Bernoulli beam equation together with the convective terms. In general, the method of separation of variables cannot be applied to solve this partial differential equation. In this paper, a non-dimensional form of the Euler-Bernoulli beam equation is presented, obtained by using the concept of group velocity, along with the conditions under which separation of variables and the assumed modes method can be used. The use of clamped-mass boundary conditions leads to a time-dependent frequency equation for the translating flexible beam. A novel method is presented for solving this time-dependent frequency equation by using a differential form of the frequency equation. The assumed modes/Lagrangian formulation of dynamics is employed to derive closed-form equations of motion. It is shown by using Lyapunov's first method that the dynamic responses of the flexural modal variables become unstable during retraction of the flexible beam, while the dynamic response during extension of the beam is stable. Numerical simulation results are presented for transverse vibration induced by uniform axial motion for a typical flexible beam.
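    For reference, the governing equation the abstract alludes to, for a uniform beam of density $\rho$, cross-sectional area $A$ and flexural rigidity $EI$ translating axially at speed $v(t)$, with transverse deflection $w(x,t)$, is commonly written with the convective terms as follows (the notation here is assumed, not taken from the paper):

```latex
\[
\rho A \left(
  \frac{\partial^{2} w}{\partial t^{2}}
  + 2 v \, \frac{\partial^{2} w}{\partial x \, \partial t}
  + v^{2} \, \frac{\partial^{2} w}{\partial x^{2}}
  + \dot{v} \, \frac{\partial w}{\partial x}
\right)
+ E I \, \frac{\partial^{4} w}{\partial x^{4}} = 0
\]
```

    The $2v\,w_{xt}$ (Coriolis) and $v^{2}w_{xx}$ (centrifugal) convective terms vanish for $v = 0$, recovering the classical Euler-Bernoulli equation; with clamped-mass boundary conditions on a length that changes in time, the resulting frequency equation becomes time-dependent, as the abstract notes.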

  14. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution, sources include those outdoors and indoors as well as in occupational and in-transit environments. Fate...

  15. Animal models of tic disorders: a translational perspective.

    Science.gov (United States)

    Godar, Sean C; Mosher, Laura J; Di Giovanni, Giuseppe; Bortolato, Marco

    2014-12-30

    Tics are repetitive, sudden movements and/or vocalizations, typically enacted as maladaptive responses to intrusive premonitory urges. The most severe tic disorder, Tourette syndrome (TS), is a childhood-onset condition featuring multiple motor and at least one phonic tic for a duration longer than 1 year. The pharmacological treatment of TS is mainly based on antipsychotic agents; while these drugs are often effective in reducing tic severity and frequency, their therapeutic compliance is limited by serious motor and cognitive side effects. The identification of novel therapeutic targets and development of better treatments for tic disorders is conditional on the development of animal models with high translational validity. In addition, these experimental tools can prove extremely useful to test hypotheses on the etiology and neurobiological bases of TS and related conditions. In recent years, the translational value of these animal models has been enhanced, thanks to a significant re-organization of our conceptual framework of neuropsychiatric disorders, with a greater focus on endophenotypes and quantitative indices, rather than qualitative descriptors. Given the complex and multifactorial nature of TS and other tic disorders, the selection of animal models that can appropriately capture specific symptomatic aspects of these conditions can pose significant theoretical and methodological challenges. In this article, we will review the state of the art on the available animal models of tic disorders, based on genetic mutations, environmental interventions as well as pharmacological manipulations. Furthermore, we will outline emerging lines of translational research showing how some of these experimental preparations have led to significant progress in the identification of novel therapeutic targets for tic disorders. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. PCI: A PATRAN-NASTRAN model translator

    Science.gov (United States)

    Sheerer, T. J.

    1990-01-01

    The amount of programming required to develop a PATRAN-NASTRAN translator was surprisingly small. The approach taken produced a highly flexible translator comparable with the PATNAS translator and superior to the PATCOS translator. The coding required varied from around ten lines for a shell element to around thirty for a bar element, and the time required to add a feature to the program is typically less than an hour. The use of a lookup table for element names makes the translator also applicable to other versions of NASTRAN. The saving in time as a result of using PDA's Gateway utilities was considerable. During the writing of the program it became apparent that, with a somewhat more complex structure, it would be possible to extend the element data file to contain all data required to define the translation from PATRAN to NASTRAN by mapping of data between formats. Similar data files on property, material and grid formats would produce a completely universal translator from PATRAN to any FEA program, or indeed any CAE system.
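    The lookup-table idea described above can be illustrated with a small sketch: a table keyed by element name holds the target card name and node count, so supporting a new element is one table entry rather than new code. The table contents and the eight-column small-field formatting below are illustrative assumptions, not PCI's actual data file.

```python
# Illustrative element lookup table: PATRAN element name -> NASTRAN card.
ELEMENT_TABLE = {
    "QUAD/4": ("CQUAD4", 4),   # (card name, node count)
    "TRI/3":  ("CTRIA3", 3),
    "BAR/2":  ("CBAR",   2),
}

def translate_element(pat_name, eid, pid, nodes):
    """Emit a NASTRAN small-field (8-column) bulk data card for one element."""
    card, n_nodes = ELEMENT_TABLE[pat_name]
    if len(nodes) != n_nodes:
        raise ValueError(f"{pat_name} expects {n_nodes} nodes")
    fields = [card, eid, pid, *nodes]
    # Each field occupies a fixed 8-character column.
    return "".join(f"{f:<8}" for f in fields).rstrip()

line = translate_element("QUAD/4", 101, 7, [1, 2, 3, 4])
```

    Adding, say, a ten-line shell element definition then amounts to one more `ELEMENT_TABLE` entry, which is in the spirit of the data-driven translator the abstract describes.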

  18. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  19. Serbian translation of the 20-item Toronto Alexithymia Scale: Psychometric properties and a new methodological approach to translating scales

    Directory of Open Access Journals (Sweden)

    Trajanović Nikola N.

    2013-01-01

Full Text Available Introduction. Since the inception of the alexithymia construct in the 1970s, there has been a continuous effort to improve both its theoretical postulates and its clinical utility through the development, standardization and validation of assessment scales. Objective. The aim of this study was to validate the Serbian translation of the 20-item Toronto Alexithymia Scale (TAS-20) and to propose a new method of translating scales with the property of temporal stability. Methods. The scale was expertly translated by bilingual medical professionals and a linguist, and given to a sample of bilingual participants from the general population who completed both the English and the Serbian version of the scale one week apart. Results. The findings showed that the Serbian version of the TAS-20 had good internal consistency reliability for the total scale (α=0.86) and acceptable reliability of the three factors (α=0.71-0.79). Conclusion. The analysis confirmed the validity and consistency of the Serbian translation of the scale, with an observed weakness of the factorial structure consistent with studies in other languages. The results also showed that the method of utilizing self-controlled bilingual subjects is a useful alternative to the back-translation method, particularly for linguistically and structurally sensitive scales, or where a larger sample is not available. This method, dubbed ‘forth-translation’, could be used to translate psychometric scales measuring properties that have temporal stability over a period of at least several weeks.

  20. Education in health research methodology: use of a wiki for knowledge translation.

    Directory of Open Access Journals (Sweden)

    Michele P Hamm

Full Text Available INTRODUCTION: A research-practice gap exists between what is known about conducting methodologically rigorous randomized controlled trials (RCTs) and what is done. Evidence consistently shows that pediatric RCTs are susceptible to high risk of bias; therefore, novel methods of influencing the design and conduct of trials are required. The objective of this study was to develop and pilot test a wiki designed to educate pediatric trialists and trainees in the principles involved in minimizing risk of bias in RCTs. The focus was on preliminary usability testing of the wiki. METHODS: The wiki was developed through adaptation of existing knowledge translation strategies and through tailoring the site to the identified needs of the end-users. The wiki was evaluated for usability and user preferences regarding the content and formatting. Semi-structured interviews were conducted with 15 trialists and systematic reviewers, representing varying levels of experience with risk of bias or the conduct of trials. Data were analyzed using content analysis. RESULTS: Participants found the wiki to be well organized, easy to use, and straightforward to navigate. Suggestions for improvement tended to focus on clarification of the text or on esthetics, rather than on the content or format. Participants liked the additional features of the site that were supplementary to the text, such as the interactive examples, and the components that focused on practical applications, adding relevance to the theory presented. While the site could be used by both trialists and systematic reviewers, the lack of a clearly defined target audience caused some confusion among participants. CONCLUSIONS: Participants were supportive of using a wiki as a novel educational tool. The results of this pilot test will be used to refine the risk of bias wiki, which holds promise as a knowledge translation intervention for education in medical research methodology.

  1. A Flexible Statechart-to-Model-Checker Translator

    Science.gov (United States)

    Rouquette, Nicolas; Dunphy, Julia; Feather, Martin S.

    2000-01-01

Many current-day software design tools offer some variant of statechart notation for system specification. We, like others, have built an automatic translator from (a subset of) statecharts to a model checker, for use in validating behavioral requirements. Our translator is designed to be flexible. This allows us to quickly adjust the translator to variants of statechart semantics, including problem-specific notational conventions that designers employ. Our system demonstration will be of interest to the following two communities: (1) Potential end-users: Our demonstration will show translation from statecharts created in a commercial UML tool (Rational Rose) to Promela, the input language of Holzmann's model checker SPIN. The translation is accomplished automatically. To accommodate the major variants of statechart semantics, our tool offers user-selectable choices among semantic alternatives. Options for customized semantic variants are also made available. The net result is an easy-to-use tool that operates on a wide range of statechart diagrams to automate the pathway to model-checking input. (2) Other researchers: Our translator embodies, in one tool, ideas and approaches drawn from several sources. Solutions to the major challenges of statechart-to-model-checker translation (e.g., determining which transition(s) will fire, handling of concurrent activities) are treated in a uniform, fully mechanized setting. The way in which the underlying architecture of the translator itself facilitates flexible and customizable translation will also be evident.
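    A minimal flavor of such a translation: flatten each statechart transition into a guarded alternative inside a Promela event loop. The two-state chart and the generated text below are illustrative only; a real translator like the one described must also handle hierarchy, concurrency, and the chosen firing semantics.

```python
# Toy flat statechart as (source, trigger, target) triples. Illustrative only.
TRANSITIONS = [
    ("Idle",    "start", "Running"),
    ("Running", "stop",  "Idle"),
]

def to_promela(transitions, initial):
    """Generate a Promela process that consumes events on a channel and
    fires the matching transition, if any."""
    states = sorted({s for (s, _, _) in transitions} | {d for (_, _, d) in transitions})
    events = sorted({e.upper() for (_, e, _) in transitions})
    branches = "\n".join(
        f"     :: (state == {s} && e == {e.upper()}) -> state = {d}"
        for (s, e, d) in transitions)
    return "\n".join([
        "mtype = { " + ", ".join(states + events) + " };",
        "chan ev = [0] of { mtype };",
        "active proctype chart() {",
        f"  mtype state = {initial};",
        "  mtype e;",
        "  do",
        "  :: ev ? e ->",
        "     if",
        branches,
        "     :: else -> skip",   # unmatched events are dropped
        "     fi",
        "  od",
        "}",
    ])

model = to_promela(TRANSITIONS, "Idle")
```

    Semantic variants (e.g. queuing versus discarding unmatched events, or priority among enabled transitions) would change only how `branches` and the `else` arm are emitted, which is where a flexible translator earns its keep.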

  2. Proposal for a telehealth concept in the translational research model.

    Science.gov (United States)

    Silva, Angélica Baptista; Morel, Carlos Médicis; Moraes, Ilara Hämmerli Sozzi de

    2014-04-01

    To review the conceptual relationship between telehealth and translational research. Bibliographical search on telehealth was conducted in the Scopus, Cochrane BVS, LILACS and MEDLINE databases to find experiences of telehealth in conjunction with discussion of translational research in health. The search retrieved eight studies based on analysis of models of the five stages of translational research and the multiple strands of public health policy in the context of telehealth in Brazil. The models were applied to telehealth activities concerning the Network of Human Milk Banks, in the Telemedicine University Network. The translational research cycle of human milk collected, stored and distributed presents several integrated telehealth initiatives, such as video conferencing, and software and portals for synthesizing knowledge, composing elements of an information ecosystem, mediated by information and communication technologies in the health system. Telehealth should be composed of a set of activities in a computer mediated network promoting the translation of knowledge between research and health services.

  3. Sketch of a Noisy Channel Model for the Translation Process

    DEFF Research Database (Denmark)

    Carl, Michael

The paper develops a Noisy Channel Model for the translation process that is based on actual user activity data. It builds on the monitor model and makes a distinction between early, automatic and late, conscious translation processes: while early priming processes are at the basis of a "literal default rendering" procedure, later conscious processes are triggered by a monitor who interferes when something goes wrong. An attempt is made to explain monitor activities with relevance-theoretic concepts, according to which a translator needs to ensure the similarity of explicatures and implicatures of the source and the target texts. It is suggested that events and parameters in the model need to be measurable and quantifiable in the user activity data so as to trace back monitoring activities in the translation process data.

  4. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  5. From Translational Research to Translational Effectiveness: The “Patient-Centered Dental Home” Model

    Directory of Open Access Journals (Sweden)

    Francesco Chiappelli

    2011-06-01

Full Text Available Toward revitalizing the Nation’s primary medical care system, the Agency for Healthcare Research and Quality (AHRQ) stated that new foundational measures must be crafted for achieving high-quality, accessible, efficient health care for all Americans. The efficiency of medical care is viewed along two dimensions: first, we must continue to pursue translational research; and second, we must translate research to optimize effectiveness in specific clinical settings. It is increasingly evident that the efficiency of both translational processes is critical to the revitalization of health care, and that it rests on the practical functionality of the nexus among three cardinal entities: the researcher, the clinician, and the patient. A novel model has evolved that encapsulates this notion, and that proposes the advanced primary care “medical home”, more commonly referred to as the “patient-centered medical home” (PCMH). It is a promising model for transforming the organization and delivery of primary medical care, because it is not simply a place per se, but a functioning unit that delivers medical care along the fundamental principles of being patient-centered, comprehensive, coordinated, and accessible. It is energized by translational research, and its principal aim and ultimate goal is translational effectiveness. The PCMH is a model that works well within the priorities set by the American Recovery and Reinvestment Act of 2009 and the Health Care Reform Act of 2010. However, while dentistry has a clearly defined place in both Acts, the PCMH is designed for medical and nursing care. A parallel model of the “patient-centered dental home” (PCDH) must be realized.

  6. Methodological Issues in Cross-Cultural Counseling Research: Equivalence, Bias, and Translations

    Science.gov (United States)

    Aegisdottir, Stefania; Gerstein, Lawrence A.; Cinarbas, Deniz Canel

    2008-01-01

    Concerns about the cross-cultural validity of constructs are discussed, including equivalence, bias, and translation procedures. Methods to enhance equivalence are described, as are strategies to evaluate and minimize types of bias. Recommendations for translating instruments are also presented. To illustrate some challenges of cross-cultural…

  7. Hon-yaku: a biology-driven Bayesian methodology for identifying translation initiation sites in prokaryotes

    Directory of Open Access Journals (Sweden)

    de Hoon Michiel JL

    2007-02-01

Full Text Available Abstract Background Computational prediction methods are currently used to identify genes in prokaryote genomes. However, identification of the correct translation initiation sites remains a difficult task. Accurate translation initiation sites (TISs) are important not only for the annotation of unknown proteins but also for the prediction of operons, promoters, and small non-coding RNA genes, as this typically makes use of the intergenic distance. A further problem is that most existing methods are optimized for Escherichia coli data sets; applying these methods to newly sequenced bacterial genomes may not result in an equivalent level of accuracy. Results Based on a biological representation of the translation process, we applied Bayesian statistics to create a score function for predicting translation initiation sites. In contrast to existing programs, our combination of methods uses supervised learning to make optimal use of the set of known translation initiation sites. We combined the Ribosome Binding Site (RBS) sequence, the distance between the translation initiation site and the RBS sequence, the base composition of the start codon, the nucleotide composition (A-rich sequences) following start codons, and the expected distribution of the protein length in a Bayesian scoring function. To further increase the prediction accuracy, we also took the operon orientation into account. The procedure achieved a prediction accuracy of 93.2% in 858 E. coli genes from the EcoGene data set and 92.7% accuracy in a data set of 1243 Bacillus subtilis 'non-y' genes. We confirmed the performance in the GC-rich Gamma-Proteobacteria Herminiimonas arsenicoxydans, Pseudomonas aeruginosa, and Burkholderia pseudomallei K96243. Conclusion Hon-yaku, being based on a careful choice of elements important in translation, improved the prediction accuracy in B. subtilis data sets and other bacteria except for E. coli. We believe that most remaining
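    The flavor of such a Bayesian score can be sketched as a sum of log-likelihood ratios over features of a candidate start site (start codon identity, RBS-to-start spacer distance, downstream A-richness), with the highest-scoring candidate chosen. The probability tables below are invented for illustration; the actual Hon-yaku model is richer, with a full RBS sequence model, a protein-length distribution, and operon orientation.

```python
import math

# Invented likelihoods: P(feature | true TIS) vs. P(feature | background).
START_CODON = {"ATG": (0.80, 0.50), "GTG": (0.12, 0.30), "TTG": (0.08, 0.20)}
SPACER      = {"5-10": (0.70, 0.30), "other": (0.30, 0.70)}  # RBS-to-start distance bin
A_RICH      = {True: (0.60, 0.35), False: (0.40, 0.65)}      # A-rich just after start?

def tis_score(codon, spacer_bin, a_rich):
    """Naive-Bayes log-likelihood-ratio score for one candidate start site:
    positive favors 'true TIS', negative favors 'background'."""
    score = 0.0
    for table, key in ((START_CODON, codon), (SPACER, spacer_bin), (A_RICH, a_rich)):
        p_tis, p_bg = table[key]
        score += math.log(p_tis / p_bg)
    return score

candidates = {
    "site_a": ("ATG", "5-10", True),    # canonical-looking start
    "site_b": ("GTG", "other", False),  # weak candidate
}
best = max(candidates, key=lambda k: tis_score(*candidates[k]))
```

    Supervised learning, as in the paper, would amount to estimating these per-feature likelihood tables from a genome's set of known translation initiation sites rather than hand-picking them.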

  8. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  9. Measuring Difficulty in English-Chinese Translation: Towards a General Model of Translation Difficulty

    Science.gov (United States)

    Sun, Sanjun

    2012-01-01

    Accurate assessment of a text's level of translation difficulty is critical for translator training and accreditation, translation research, and the language industry as well. Traditionally, people rely on their general impression to gauge a text's translation difficulty level. If the evaluation process is to be more effective and the…

  10. Syntactic discriminative language model rerankers for statistical machine translation

    NARCIS (Netherlands)

    Carter, S.; Monz, C.

    2011-01-01

    This article describes a method that successfully exploits syntactic features for n-best translation candidate reranking using perceptrons. We motivate the utility of syntax by demonstrating the superior performance of parsers over n-gram language models in differentiating between Statistical

  11. A Minimal Cognitive Model for Translating and Post-editing

    DEFF Research Database (Denmark)

    Schaeffer, Moritz; Carl, Michael

    2017-01-01

    This study investigates the coordination of reading (input) and writing (output) activities in from-scratch translation and post-editing. We segment logged eye movements and keylogging data into minimal units of reading and writing activity and model the process of post-editing and from-scratch t...

  12. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  13. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    Full Text Available This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  14. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the reliability computation model validity using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model
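The Bayes-factor test at the core of this methodology can be sketched as a likelihood ratio between "model valid" and a diffuse alternative hypothesis. The function below is an illustrative simplification under assumed normal likelihoods, not the paper's predictor estimator.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_factor(prediction, observation, sigma_exp, diffuse_scale=10.0):
    """B = P(data | model valid) / P(data | model invalid).

    Under H0 the prediction error is pure experimental noise, N(0, sigma_exp);
    under H1 (an assumed diffuse alternative) the error may be much larger,
    N(0, diffuse_scale * sigma_exp). B > 1 supports accepting the model.
    """
    diff = observation - prediction
    return (normal_pdf(diff, 0.0, sigma_exp)
            / normal_pdf(diff, 0.0, diffuse_scale * sigma_exp))
```

A prediction within one noise standard deviation of the observation yields B > 1 (evidence for the model), while a prediction many standard deviations off yields B < 1; the paper extends this idea by treating B itself as a random variable when statistical uncertainty is present.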

  15. Exploration of Disease Markers under Translational Medicine Model

    Directory of Open Access Journals (Sweden)

    Rajagopal Krishnamoorthy

    2015-06-01

    Full Text Available Disease markers are defined as biomarkers with specific characteristics during general physical, pathological, or therapeutic processes, whose detection can inform the progression of the present biological process of an organism. However, the exploration of disease markers is complicated and difficult; only a few markers can be used in clinical practice, and there is no significant difference in cancer mortality before and after biomarker exploration. Translational medicine focuses on breaking the blockage between basic medicine and clinical practice. In addition, it establishes an effective association between researchers engaged in basic scientific discovery and clinical physicians well informed of patients' requirements, and pays particular attention to how to translate basic molecular biological research into the most effective and appropriate methods for the diagnosis, treatment, and prevention of diseases, hoping to translate basic research into new therapeutic methods in the clinic. Therefore, this study summarizes the exploration of disease markers under the translational medicine model so as to provide a basis for the translation of basic research results into clinical application.

  16. Model of cap-dependent translation initiation in sea urchin: a step towards the eukaryotic translation regulation network.

    Science.gov (United States)

    Bellé, Robert; Prigent, Sylvain; Siegel, Anne; Cormier, Patrick

    2010-03-01

    The large and rapid increase in the rate of protein synthesis following fertilization of the sea urchin egg has long been a paradigm of translational control, an important component of the regulation of gene expression in cells. This translational up-regulation is linked to physiological changes that occur upon fertilization and is necessary for entry into the first cell division cycle. Accumulated knowledge on cap-dependent initiation of translation makes it both feasible and timely to start integrating the data into a system view of biological functions. Using a programming environment for systems biology coupled with model validation (named Biocham), we have built an integrative model for cap-dependent initiation of translation. The model is described by abstract rules. It contains 51 reactions involved in 74 molecular complexes. The model proved to be coherent with existing knowledge by using queries based on computational tree logic (CTL) as well as Boolean simulations. The model could simulate the change in translation occurring at fertilization in the sea urchin model. It could also be coupled with an existing model designed for cell-cycle control. Therefore, the cap-dependent translation initiation model can be considered a first step towards the eukaryotic translation regulation network.

  17. Semiotics of Umberto Eco in a Literary Translation Class: The Model Reader as the Competent Translator

    Science.gov (United States)

    Ozturk Kasar, Sündüz; Can, Alize

    2017-01-01

    Classroom environment can be thought as an absolute place to practice and improve translation skills of students. They have the possibility to brainstorm and discuss problematic points they face with each other during a translation activity. It can be estimated in the same way in a literary translation class. Students who are supposed to become…

  18. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs.

  19. Translation and development of the BNWL-geosphere model

    International Nuclear Information System (INIS)

    Grundfelt, B.

    1977-02-01

    The report deals with the rate of radioactivity discharge from a repository for radioactive waste in a geologic formation to the biosphere. A BASIC-language computer program called GETOUT was developed in the USA; it was obtained by the Swedish project Nuclear Fuel Safety and has thereafter been translated into FORTRAN. The main extension of the code made during the translation is a model for averaging the hydrodynamic and geochemical parameters in the case of non-uniform packing of the column (e.g., for a repository in cracked rock with crack width, crack spacing, etc. varying between zones). The program has been tested on an IBM model 360/75 computer. The migration is governed by three parameters: the ground water velocity, the dispersion coefficient, and the nuclide retentivities. (L.B.)
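The three governing parameters named in this record map onto the textbook one-dimensional advection-dispersion equation with a retardation factor. The sketch below evaluates its standard step-input analytical solution; it is a generic simplification of such migration models, not the GETOUT code.

```python
import math

def breakthrough(x, t, velocity, dispersion, retardation):
    """Relative concentration C/C0 for a step input in 1-D uniform flow.

    Standard analytical solution of the advection-dispersion equation with
    a linear-sorption retardation factor R (a textbook simplification of
    the nuclide-migration physics that codes like GETOUT model in detail).
    """
    arg = ((retardation * x - velocity * t)
           / (2.0 * math.sqrt(dispersion * retardation * t)))
    return 0.5 * math.erfc(arg)

# Strong sorption (large R) delays the front: at the same point and time
# the retarded nuclide shows a much lower relative concentration.
c_mobile = breakthrough(x=100.0, t=50.0, velocity=2.0, dispersion=5.0, retardation=1.0)
c_sorbed = breakthrough(x=100.0, t=50.0, velocity=2.0, dispersion=5.0, retardation=10.0)
```

At the chosen point the non-sorbing front has just arrived (C/C0 = 0.5), while the strongly retarded nuclide is still essentially absent.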

  20. A conceptual model for translating omic data into clinical action

    Directory of Open Access Journals (Sweden)

    Timothy M Herr

    2015-01-01

    Full Text Available Genomic, proteomic, epigenomic, and other "omic" data have the potential to enable precision medicine, also commonly referred to as personalized medicine. The volume and complexity of omic data are rapidly overwhelming human cognitive capacity, requiring innovative approaches to translate such data into patient care. Here, we outline a conceptual model for the application of omic data in the clinical context, called "the omic funnel." This model parallels the classic "Data, Information, Knowledge, Wisdom pyramid" and adds context for how to move between each successive layer. Its goal is to allow informaticians, researchers, and clinicians to approach the problem of translating omic data from bench to bedside, by using discrete steps with clearly defined needs. Such an approach can facilitate the development of modular and interoperable software that can bring precision medicine into widespread practice.

  1. Models of culture in Nabokov's memoirs and translation memoirs in Serbian and Croatian language

    Directory of Open Access Journals (Sweden)

    Razdobudko-Čović Larisa I.

    2012-01-01

    Full Text Available The paper presents an analysis of two Serbian translations of V. Nabokov's memoirs, that is, the translation of the novel 'Drugie berega' ('The Other Shores') published in Russian as an authorized translation from the original English version 'Conclusive Evidence', and the translation of Nabokov's authorized translation from Russian to English entitled 'Speak, Memory'. Creolization of three models of culture in translation from the two originals - Russian and English - is presented. Specific features of the two Serbian translations are analyzed, and a survey of characteristic mistakes caused by some specific characteristics of the source language is given. Also, Nabokov's very original approach to translation, which is quite interpretative, is highlighted.

  2. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model...

  3. Modeling and prediction of human word search behavior in interactive machine translation

    Science.gov (United States)

    Ji, Duo; Yu, Bai; Ma, Bin; Ye, Na

    2017-12-01

    As a kind of computer-aided translation method, interactive machine translation technology reduces the repetitive and mechanical operations of manual translation through a variety of methods, thereby improving translation efficiency, and plays an important role in the practical application of translation work. In this paper, we regard users' frequent word-searching behavior during the translation process as the research object, and transform this behavior into a translation selection problem under the current translation. The paper presents a prediction model that comprehensively utilizes an alignment model, a translation model, and a language model of word-searching behavior. It achieves highly accurate prediction of word-searching behavior and reduces the switching between mouse and keyboard operations in the users' translation process.
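One common way to combine alignment, translation, and language-model evidence, as this record describes, is a weighted log-linear score over candidate target words. The function, weights, and toy probability tables below are hypothetical illustrations of that general scheme, not the paper's actual model.

```python
import math

def predict_searched_word(source_word, candidates,
                          p_align, p_trans, p_lm,
                          weights=(1.0, 1.0, 1.0)):
    """Rank candidate target words by a weighted log-linear combination of
    alignment, translation, and language-model probabilities; return the
    highest-scoring candidate as the predicted word the user will search for."""
    w_a, w_t, w_l = weights

    def score(c):
        # Unseen events get a tiny floor probability instead of log(0).
        return (w_a * math.log(p_align.get((source_word, c), 1e-9))
                + w_t * math.log(p_trans.get((source_word, c), 1e-9))
                + w_l * math.log(p_lm.get(c, 1e-9)))

    return max(candidates, key=score)

# Toy tables (hypothetical probabilities):
p_align = {("maison", "house"): 0.6, ("maison", "home"): 0.3}
p_trans = {("maison", "house"): 0.5, ("maison", "home"): 0.4}
p_lm = {"house": 0.02, "home": 0.03}
```

With these numbers the alignment and translation evidence for "house" outweighs the slightly better language-model score of "home".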

  4. Animal Models for Tuberculosis in Translational and Precision Medicine

    Directory of Open Access Journals (Sweden)

    Lingjun Zhan

    2017-05-01

    Full Text Available Tuberculosis (TB) is a health threat to the global population. Anti-TB drugs and vaccines are key approaches for TB prevention and control. TB animal models are basic tools for developing biomarkers of diagnosis, drugs for therapy, vaccines for prevention and researching pathogenic mechanisms for identification of targets; thus, they serve as the cornerstone of comparative medicine, translational medicine, and precision medicine. In this review, we discuss the current use of TB animal models and their problems, as well as offering perspectives on the future of these models.

  5. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.
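Expressing risk "in expected annual monetary terms" amounts to summing probability-weighted consequences over a scenario set, which is the common core of the event-based and risk-indicator approaches the article combines. The sketch below uses hypothetical numbers and is a generic illustration, not the article's fitted regional model.

```python
def expected_annual_loss(scenarios):
    """Expected annual monetary risk: the sum over scenarios of annual
    occurrence probability times monetary consequence."""
    return sum(probability * consequence for probability, consequence in scenarios)

# Hypothetical regional scenario set: (annual probability, consequence in $M).
region = [
    (0.001, 500.0),  # rare, high-consequence attack on critical infrastructure
    (0.010, 50.0),   # moderate event in a population center
    (0.050, 5.0),    # frequent, low-consequence incident
]
eal = expected_annual_loss(region)  # expected annual loss in $M
```

Computing the same quantity for accident and natural-hazard scenario sets puts all three hazard classes on a common monetary scale for allocating mitigation resources.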

  6. Translational Models of Gambling-Related Decision-Making.

    Science.gov (United States)

    Winstanley, Catharine A; Clark, Luke

    Gambling is a harmless, recreational pastime that is ubiquitous across cultures. However, for some, gambling becomes maladaptive and compulsive, and this syndrome is conceptualized as a behavioural addiction. Laboratory models that capture the key cognitive processes involved in gambling behaviour, and that can be translated across species, have the potential to make an important contribution to both decision neuroscience and the study of addictive disorders. The Iowa gambling task has been widely used to assess human decision-making under uncertainty, and this paradigm can be successfully modelled in rodents. Similar neurobiological processes underpin choice behaviour in humans and rats, and thus, a preference for the disadvantageous "high-risk, high-reward" options may reflect meaningful vulnerability for mental health problems. However, the choice behaviour operationalized by these tasks does not necessarily approximate the vulnerability to gambling disorder (GD) per se. We consider a number of psychological challenges that apply to modelling gambling in a translational way, and evaluate the success of the existing models. Heterogeneity in the structure of gambling games, as well as in the motivations of individuals with GD, is highlighted. The potential issues with extrapolating too directly from established animal models of drug dependency are discussed, as are the inherent difficulties in validating animal models of GD in the absence of any approved treatments for GD. Further advances in modelling the cognitive biases endemic in human decision-making, which appear to be exacerbated in GD, may be a promising line of research.

  7. Methodology, models and algorithms in thermographic diagnostics

    CERN Document Server

    Živčák, Jozef; Madarász, Ladislav; Rudas, Imre J

    2013-01-01

    This book presents the methodology and techniques of thermographic applications with focus primarily on medical thermography implemented for parametrizing the diagnostics of the human body. The first part of the book describes the basics of infrared thermography, the possibilities of thermographic diagnostics and the physical nature of thermography. The second half includes tools of intelligent engineering applied for the solving of selected applications and projects. Thermographic diagnostics was applied to problematics of paraplegia and tetraplegia and carpal tunnel syndrome (CTS). The results of the research activities were created with the cooperation of the four projects within the Ministry of Education, Science, Research and Sport of the Slovak Republic entitled Digital control of complex systems with two degrees of freedom, Progressive methods of education in the area of control and modeling of complex object oriented systems on aircraft turbocompressor engines, Center for research of control of te...

  8. Designing and Implementing INTREPID, an Intensive Program in Translational Research Methodologies for New Investigators

    Science.gov (United States)

    Aphinyanaphongs, Yindalon; Shao, Yongzhao; Micoli, Keith J.; Fang, Yixin; Goldberg, Judith D.; Galeano, Claudia R.; Stangel, Jessica H.; Chavis‐Keeling, Deborah; Hochman, Judith S.; Cronstein, Bruce N.; Pillinger, Michael H.

    2014-01-01

    Abstract Senior housestaff and junior faculty are often expected to perform clinical research, yet may not always have the requisite knowledge and skills to do so successfully. Formal degree programs provide such knowledge, but require a significant commitment of time and money. Short‐term training programs (days to weeks) provide alternative ways to accrue essential information and acquire fundamental methodological skills. Unfortunately, published information about short‐term programs is sparse. To encourage discussion and exchange of ideas regarding such programs, we here share our experience developing and implementing INtensive Training in Research Statistics, Ethics, and Protocol Informatics and Design (INTREPID), a 24‐day immersion training program in clinical research methodologies. Designing, planning, and offering INTREPID was feasible, and required significant faculty commitment, support personnel and infrastructure, as well as committed trainees. PMID:25066862

  9. A cognitive-pragmatic model for translation-shift analysis in ...

    African Journals Online (AJOL)

    A cognitive-pragmatic model for translation-shift analysis in descriptive case ... This model responds to the tendency of descriptive studies to analyse all translation shifts under the same rubric of neutrality. ... AJOL African Journals Online.

  10. Computerized methodology for micro-CT and histological data inflation using an IVUS based translation map.

    Science.gov (United States)

    Athanasiou, Lambros S; Rigas, George A; Sakellarios, Antonis I; Exarchos, Themis P; Siogkas, Panagiotis K; Naka, Katerina K; Panetta, Daniele; Pelosi, Gualtiero; Vozzi, Federico; Michalis, Lampros K; Parodi, Oberdan; Fotiadis, Dimitrios I

    2015-10-01

    A framework for the inflation of micro-CT and histology data using intravascular ultrasound (IVUS) images is presented. The proposed methodology consists of three steps. In the first step the micro-CT/histological images are manually co-registered with IVUS by experts using fiducial points as landmarks. In the second step the lumens of both the micro-CT/histological images and the IVUS images are automatically segmented. Finally, in the third step the micro-CT/histological images are inflated by applying a transformation method to each image. The transformation method is based on the difference between the IVUS and micro-CT/histological contours. In order to validate the proposed image inflation methodology, plaque areas in the inflated micro-CT and histological images are compared with those in the IVUS images. The proposed methodology for inflating micro-CT/histological images increases the sensitivity of plaque area matching between the inflated and the IVUS images (7% and 22% in histological and micro-CT images, respectively). Copyright © 2015 Elsevier Ltd. All rights reserved.
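A contour-difference transformation of the kind this record describes can be illustrated by radially rescaling each shrunken-lumen point onto the corresponding IVUS lumen point. This is a simplified stand-in under strong assumptions (matched angular sampling, a shared center), not the paper's implementation.

```python
import math

def inflate_contour(shrunken, target, center=(0.0, 0.0)):
    """Radially rescale a shrunken lumen contour onto a target (IVUS) lumen.

    Both contours are lists of (x, y) points assumed to be sampled at
    matching angles about a common center; each point is pushed outward by
    the ratio of the target radius to the shrunken radius at that angle.
    """
    cx, cy = center
    out = []
    for (xs, ys), (xt, yt) in zip(shrunken, target):
        rs = math.hypot(xs - cx, ys - cy)
        rt = math.hypot(xt - cx, yt - cy)
        scale = rt / rs if rs > 0 else 1.0
        out.append((cx + (xs - cx) * scale, cy + (ys - cy) * scale))
    return out

# A circle of radius 1 sampled at 4 angles, inflated onto a radius-2 circle:
small = [(1, 0), (0, 1), (-1, 0), (0, -1)]
big = [(2, 0), (0, 2), (-2, 0), (0, -2)]
inflated = inflate_contour(small, big)
```

In the paper the same idea operates on automatically segmented lumen borders, so that plaque in the shrunken ex vivo image is carried outward to its in vivo position.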

  11. Genetic Algorithms for Models Optimization for Recognition of Translation Initiation Sites

    KAUST Repository

    Mora, Arturo Magana

    2011-06-01

    This work uses genetic algorithms (GA) to reduce the complexity of the artificial neural networks (ANNs) and decision trees (DTs) for the accurate recognition of translation initiation sites (TISs) in Arabidopsis Thaliana. The Arabidopsis data was extracted directly from genomic DNA sequences. Methods derived in this work resulted in both reduced complexity of the predictors, as well as in improvement in prediction accuracy (generalization). Optimization through use of GA is generally a computationally intensive task. One of the approaches to overcome this problem is to use parallelization of code that implements GA, thus allowing computation on multiprocessing infrastructure. However, further improvement in performance of the GA implementation could be achieved through modifications to GA basic operations such as selection, crossover and mutation. In this work we explored two such improvements, namely evolutive mutation and the GA-Simplex crossover operation. In this thesis we studied the benefit of these modifications on the problem of TIS recognition. Compared to the non-modified GA approach, we reduced the number of weights in the resulting model's neural network component by 51% and the number of nodes in the model's DTs component by 97% whilst improving the model's accuracy at the same time. Separately, we developed another methodology for reducing the complexity of prediction models by optimizing the composition of training data subsets in the bootstrap aggregation (bagging) methodology. This optimization is achieved by applying a new GA-based bagging methodology in order to optimize the composition of each of the training data subsets. This approach has shown in our test cases to considerably enhance the accuracy of the TIS prediction model compared to the original bagging methodology. Although these methods are applied to the problem of accurate prediction of TISs, we believe that these methodologies have a potential for wider scope of application.
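The basic GA operations this record modifies (selection, crossover, mutation) can be sketched as a plain generational GA over bit strings, where each bit might mark a network weight or tree node as kept or pruned. This toy version deliberately lacks the thesis's evolutive-mutation and GA-Simplex operators; the fitness function below is hypothetical.

```python
import random

def genetic_search(fitness, n_bits, pop_size=20, generations=40,
                   mutation_rate=0.05, seed=0):
    """Minimal generational GA over bit strings: truncation selection,
    one-point crossover, bit-flip mutation, with the top half of each
    generation carried over unchanged (elitism)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)       # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_bits):              # bit-flip mutation
                if rng.random() < mutation_rate:
                    child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Hypothetical fitness: reward keeping indispensable component 0 while
# penalizing every extra retained component (i.e. prefer simpler models).
def toy_fitness(bits):
    return 10 * bits[0] - sum(bits[1:])

best = genetic_search(toy_fitness, n_bits=12)
```

The search quickly favors chromosomes that keep component 0 and prune most others, which is the complexity-reduction behavior the thesis pursues at much larger scale.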

  12. Methodological Developments in Geophysical Assimilation Modeling

    Science.gov (United States)

    Christakos, George

    2005-06-01

    This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to

  13. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
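Latin hypercube sampling, which this record offers as a candidate for sensitivity analyses, stratifies each parameter's range so that every stratum is sampled exactly once. A minimal implementation on the unit hypercube (a generic sketch; the report's multiplicative-chain analysis works with lognormal parameters) might look like:

```python
import random

def latin_hypercube(n_samples, n_params, seed=0):
    """Latin hypercube sample on [0, 1)^n_params: each parameter's range is
    split into n_samples equal strata, each stratum is used exactly once per
    parameter, and the strata are independently permuted across samples."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_params):
        strata = list(range(n_samples))
        rng.shuffle(strata)                      # random pairing of strata
        columns.append([(s + rng.random()) / n_samples for s in strata])
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

samples = latin_hypercube(10, 3)
```

Mapping each coordinate through a parameter's inverse CDF (e.g. a lognormal) yields stratified inputs for the assessment model, covering each parameter's distribution far more evenly than the same number of purely random draws.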

  14. A Model of Translator's Competence from an Educational Perspective

    Science.gov (United States)

    Eser, Oktay

    2015-01-01

    Translation as a business is a service. The concept of translation competence is a term covering the various skills and knowledge that a translator needs to have in order to translate functionally. The term which is often studied as a multi-componential concept in literature may not cover the necessary skills if it is taken from an organizational…

  15. A methodology for spectral wave model evaluation

    Science.gov (United States)

    Siqueira, S. A.; Edwards, K. L.; Rogers, W. E.

    2017-12-01

    climate, omitting the energy in the frequency band between the two lower limits tested can lead to an incomplete characterization of model performance. This methodology was developed to aid in selecting a comparison frequency range that does not needlessly increase computational expense and does not exclude energy to the detriment of model performance analysis.

  16. Common Marmosets: A Potential Translational Animal Model of Juvenile Depression

    Directory of Open Access Journals (Sweden)

    Nicole Leite Galvão-Coelho

    2017-09-01

    Full Text Available Major depression is a psychiatric disorder with high prevalence in the general population, with increasing expression in adolescence, about 14% in young people. Frequently, it presents as a chronic condition, showing no remission even after several pharmacological treatments and persisting in adult life. Therefore, distinct protocols and animal models have been developed to increase the understanding of this disease or search for new therapies. To this end, this study investigated the effects of chronic social isolation and the potential antidepressant action of nortriptyline in juvenile Callithrix jacchus males and females by monitoring fecal cortisol, body weight, and behavioral parameters and searching for biomarkers and a protocol for inducing depression. The purpose was to validate this species and protocol as a translational model of juvenile depression, addressing all domain criteria of validation: etiologic, face, functional, predictive, inter-relational, evolutionary, and population. In both sexes and both protocols (IDS and DPT), we observed a significant reduction in cortisol levels in the last phase of social isolation, concomitant with increases in autogrooming, stereotyped and anxiety behaviors, and the presence of anhedonia. The alterations induced by chronic social isolation are characteristic of the depressive state in non-human primates and/or in humans, and were reversed in large part by treatment with an antidepressant drug (nortriptyline). Therefore, these results indicate C. jacchus as a potential translational model of juvenile depression by addressing all criteria of validation.

  17. Translational Assays for Assessment of Cognition in Rodent Models of Alzheimer's Disease and Dementia.

    Science.gov (United States)

    Shepherd, A; Tyebji, S; Hannan, A J; Burrows, E L

    2016-11-01

    Cognitive dysfunction appears as a core feature of dementia, which includes its most prevalent form, Alzheimer's disease (AD), as well as vascular dementia, frontotemporal dementia, and other brain disorders. AD alone affects more than 45 million people worldwide, with growing prevalence in aging populations. There is no cure, and therapeutic options remain limited. Gene-edited and transgenic animal models, expressing disease-specific gene mutations, illuminate pathogenic mechanisms leading to cognitive decline in AD and other forms of dementia. To date, cognitive tests in AD mouse models have not been directly relevant to the clinical presentation of AD, posing challenges for translation of findings to the clinic. Touchscreen testing has enabled the assessment of specific cognitive domains in mice that are directly relevant to impairments described in human AD patients. In this review, we provide context for how cognitive decline is measured in the clinic, describe traditional methods for assessing cognition in mice, and outline novel approaches, including the use of the touchscreen platform for cognitive testing. We highlight the limitations of traditional memory-testing paradigms in mice, particularly their capacity for direct translation into cognitive testing of patients. While it is not possible to expect direct translation in testing methodologies, we can aim to develop tests that engage similar neural substrates in both humans and mice. Ultimately, that would enable us to better predict efficacy across species and therefore improve the chances that a treatment that works in mice will also work in the clinic.

  18. Modeling, methodologies and tools for molecular and nano-scale communications

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and to advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  19. Translational Mouse Models of Autism: Advancing Toward Pharmacological Therapeutics

    Science.gov (United States)

    Kazdoba, Tatiana M.; Leach, Prescott T.; Yang, Mu; Silverman, Jill L.; Solomon, Marjorie

    2016-01-01

    Animal models provide preclinical tools to investigate the causal role of genetic mutations and environmental factors in the etiology of autism spectrum disorder (ASD). Knockout and humanized knock-in mice, and more recently knockout rats, have been generated for many of the de novo single gene mutations and copy number variants (CNVs) detected in ASD and comorbid neurodevelopmental disorders. Mouse models incorporating genetic and environmental manipulations have been employed for preclinical testing of hypothesis-driven pharmacological targets, to begin to develop treatments for the diagnostic and associated symptoms of autism. In this review, we summarize rodent behavioral assays relevant to the core features of autism, preclinical and clinical evaluations of pharmacological interventions, and strategies to improve the translational value of rodent models of autism. PMID:27305922

  20. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    This paper introduces a methodology for modeling business processes. Creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified, and general modeling principles that can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical points identified in business process modeling are process states, process hierarchy, and the granularity of process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main features of the methodology are explained, together with the significant problems encountered during the project. Drawing on these problems and on the results of the methodology evaluation, the needed future development of the methodology is outlined.

  1. Slow dynamics in translation-invariant quantum lattice models

    Science.gov (United States)

    Michailidis, Alexios A.; Žnidarič, Marko; Medvedyeva, Mariya; Abanin, Dmitry A.; Prosen, Tomaž; Papić, Z.

    2018-03-01

    Many-body quantum systems typically display fast dynamics and ballistic spreading of information. Here we address the open problem of how slow the dynamics can be after a generic breaking of integrability by local interactions. We develop a method based on degenerate perturbation theory that reveals slow dynamical regimes and delocalization processes in general translation invariant models, along with accurate estimates of their delocalization time scales. Our results shed light on the fundamental questions of the robustness of quantum integrable systems and the possibility of many-body localization without disorder. As an example, we construct a large class of one-dimensional lattice models where, despite the absence of asymptotic localization, the transient dynamics is exceptionally slow, i.e., the dynamics is indistinguishable from that of many-body localized systems for the system sizes and time scales accessible in experiments and numerical simulations.

  2. Drosophila Melanogaster as an Emerging Translational Model of Human Nephrolithiasis

    Science.gov (United States)

    Miller, Joe; Chi, Thomas; Kapahi, Pankaj; Kahn, Arnold J.; Kim, Man Su; Hirata, Taku; Romero, Michael F.; Dow, Julian A.T.; Stoller, Marshall L.

    2013-01-01

    Purpose: The limitations imposed by human clinical studies and mammalian models of nephrolithiasis have hampered the development of effective medical treatments and preventative measures for decades. The simple but elegant Drosophila melanogaster is emerging as a powerful translational model of human disease, including nephrolithiasis, and may provide important information essential to our understanding of stone formation. We present the current state of research using D. melanogaster as a model of human nephrolithiasis. Materials and Methods: A comprehensive review of the English language literature was performed using PubMed. When necessary, authoritative texts on relevant subtopics were consulted. Results: The genetic composition, anatomic structure and physiologic function of Drosophila Malpighian tubules are remarkably similar to those of the human nephron. The direct effects of dietary manipulation, environmental alteration, and genetic variation on stone formation can be observed and quantified in a matter of days. Several Drosophila models of human nephrolithiasis, including genetically linked and environmentally induced stones, have been developed. A model of calcium oxalate stone formation is among the most recent fly models of human nephrolithiasis. Conclusions: The ability to readily manipulate and quantify stone formation in D. melanogaster models of human nephrolithiasis presents the urologic community with a unique opportunity to increase our understanding of this enigmatic disease. PMID:23500641

  3. Methodology to translate policy assessment problems into scenarios: the example of the SEAMLESS integrated framework

    NARCIS (Netherlands)

    Therond, O.; Belhouchette, H.; Janssen, S.J.C.; Louhichi, K.; Ewert, F.; Bergez, J.E.; Wery, J.; Heckelei, T.; Olsson, J.A.; Leenhardt, D.; Ittersum, van M.K.

    2009-01-01

    Scenario-based approaches in environmental and policy assessment studies are increasingly applied within integrated assessment and modelling frameworks. The SEAMLESS project develops such an integrated framework (SEAMLESS-IF) aiming to assess, ex-ante, impacts of alternative agro-environmental

  4. A Novel Translational Model of Spinal Cord Injury in Nonhuman Primate.

    Science.gov (United States)

    Le Corre, Marine; Noristani, Harun N; Mestre-Frances, Nadine; Saint-Martin, Guillaume P; Coillot, Christophe; Goze-Bac, Christophe; Lonjon, Nicolas; Perrin, Florence E

    2017-11-27

    Spinal cord injuries (SCI) lead to major disabilities affecting > 2.5 million people worldwide. Major shortcomings in clinical translation result from multiple factors, including species differences, development of moderately predictive animal models, and differences in methodologies between preclinical and clinical studies. To overcome these obstacles, we first conducted a comparative neuroanatomical analysis of the spinal cord between mice, Microcebus murinus (a nonhuman primate), and humans. Next, we developed and characterized a new model of lateral spinal cord hemisection in M. murinus. Over a 3-month period after SCI, we carried out a detailed, longitudinal, behavioral follow-up associated with in vivo magnetic resonance imaging (1H-MRI) monitoring. Then, we compared lesion extension and tissue alteration using 3 methods: in vivo 1H-MRI, ex vivo 1H-MRI, and classical histology. The general organization and glial cell distribution/morphology in the spinal cord of M. murinus closely resembles that of humans. Animals assessed at different stages following lateral hemisection of the spinal cord presented specific motor deficits and spinal cord tissue alterations. We also found a close correlation between 1H-MRI signal and microglia reactivity and/or associated post-trauma phenomena. Spinal cord hemisection in M. murinus provides a reliable new nonhuman primate model that can be used to promote translational research on SCI and represents a novel and more affordable alternative to larger primates.

  5. Translating patient reported outcome measures: methodological issues explored using cognitive interviewing with three rheumatoid arthritis measures in six European languages

    NARCIS (Netherlands)

    Hewlett, Sarah E.; Nicklin, Joanna; Bode, Christina; Carmona, Loretto; Dures, Emma; Engelbrecht, Matthias; Hagel, Sofia; Kirwan, John R.; Molto, Anna; Redondo, Marta; Gossec, Laure

    2016-01-01

    Objective. Cross-cultural translation of patient-reported outcome measures (PROMs) is a lengthy process, often performed professionally. Cognitive interviewing assesses patient comprehension of PROMs. The objective was to evaluate the usefulness of cognitive interviewing to assess translations and

  6. A methodology for ecosystem-scale modeling of selenium

    Science.gov (United States)

    Presser, T.S.; Luoma, S.N.

    2010-01-01

    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. The model is validated against data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate material, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled; 2) the choice of food web is critical because the particulate-material-to-prey kinetics of bioaccumulation differ widely among invertebrates; 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web; and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure
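    The back-translation in point 4 can be sketched in a few lines; this is a minimal illustration of the general idea only, and the function name, parameter values, and units are assumptions for the example, not values from the published model.

    ```python
    # Minimal sketch of translating a predator-tissue Se guideline into a
    # site-specific dissolved Se concentration. Kd partitions dissolved Se
    # into particulate material; trophic transfer factors (TTFs) carry Se
    # from particulate material to invertebrate prey and then to fish.
    # All numeric values below are illustrative, not regulatory.

    def dissolved_from_tissue(c_fish, kd, ttf_invert, ttf_fish):
        """c_fish: fish-tissue guideline (ug/g dry weight);
        kd: particulate/dissolved partitioning ((ug/kg)/(ug/L));
        returns the allowable dissolved Se concentration (ug/L)."""
        c_particulate = c_fish / (ttf_fish * ttf_invert)   # ug/g dry weight
        return c_particulate * 1000.0 / kd                 # ug/kg -> ug/L via Kd

    # Example: an 8 ug/g tissue guideline with Kd = 1000 and typical TTFs
    allowable = dissolved_from_tissue(8.0, 1000.0, ttf_invert=2.8, ttf_fish=1.1)
    ```

    Because Kd and the TTFs are site- and food-web-specific, the same tissue guideline maps to different allowable dissolved concentrations in different ecosystems, which is the paper's central point.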

  7. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    Currently, fault tree analysis is widely used in the field of probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually constructed manually by analysts, review by other analysts is widely used to verify constructed fault tree models. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; therefore, modeling a system for reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the RGGG (reliability graph with general gates) methodology. The difference is that the RBDGG methodology focuses on block failures, while the RGGG methodology focuses on connection line failures. It is also known that an RGGG model can be converted to an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated.
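    As background for readers unfamiliar with reliability block diagrams, the classical series/parallel evaluation that RBDGG generalizes can be sketched as follows; this is a generic textbook computation for independent blocks, not the RBDGG algorithm from the paper, and the example system is invented for illustration.

    ```python
    from math import prod

    # Classical reliability-block-diagram combination rules for independent
    # blocks: a series arrangement works only if every block works; a
    # parallel (redundant) arrangement fails only if every block fails.
    # RBDGG extends such diagrams with general gates; this sketch covers
    # only the classical special case.

    def series(*reliabilities):
        return prod(reliabilities)

    def parallel(*reliabilities):
        return 1.0 - prod(1.0 - r for r in reliabilities)

    # A system with two redundant pumps (0.9 each) feeding one valve (0.95)
    system = series(parallel(0.9, 0.9), 0.95)
    ```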

  8. Mapping research questions about translation to methods, measures, and models

    NARCIS (Netherlands)

    Berninger, V.; Rijlaarsdam, G.; Fayol, M.L.; Fayol, M.; Alamargot, D.; Berninger, V.W.

    2012-01-01

    About the book: Translation of cognitive representations into written language is one of the most important processes in writing. This volume provides a long-awaited updated overview of the field. The contributors discuss each of the commonly used research methods for studying translation; theorize

  9. On Interactive Teaching Model of Translation Course Based on Wechat

    Science.gov (United States)

    Lin, Wang

    2017-01-01

    Constructivism is a theory related to knowledge and learning, focusing on learners' subjective initiative, based on which the interactive approach has been proved to play a crucial role in language learning. Accordingly, the interactive approach can also be applied to translation teaching since translation itself is a bilingual transformational…

  10. Understanding the limits of animal models as predictors of human biology: lessons learned from the sbv IMPROVER Species Translation Challenge.

    Science.gov (United States)

    Rhrissorrakrai, Kahn; Belcastro, Vincenzo; Bilal, Erhan; Norel, Raquel; Poussin, Carine; Mathis, Carole; Dulize, Rémi H J; Ivanov, Nikolai V; Alexopoulos, Leonidas; Rice, J Jeremy; Peitsch, Manuel C; Stolovitzky, Gustavo; Meyer, Pablo; Hoeng, Julia

    2015-02-15

    Inferring how humans respond to external cues such as drugs, chemicals, viruses or hormones is an essential question in biomedicine. Very often, however, this question cannot be addressed because it is not possible to perform experiments in humans. A reasonable alternative consists of generating responses in animal models and 'translating' those results to humans. The limitations of such translation, however, are far from clear, and systematic assessments of its actual potential are urgently needed. sbv IMPROVER (systems biology verification for Industrial Methodology for PROcess VErification in Research) was designed as a series of challenges to address translatability between humans and rodents. This collaborative crowd-sourcing initiative invited scientists from around the world to apply their own computational methodologies on a multilayer systems biology dataset composed of phosphoproteomics, transcriptomics and cytokine data derived from normal human and rat bronchial epithelial cells exposed in parallel to 52 different stimuli under identical conditions. Our aim was to understand the limits of species-to-species translatability at different levels of biological organization: signaling, transcriptional and release of secreted factors (such as cytokines). Participating teams submitted 49 different solutions across the sub-challenges, two-thirds of which were statistically significantly better than random. Additionally, similar computational methods were found to range widely in their performance within the same challenge, and no single method emerged as a clear winner across all sub-challenges. Finally, computational methods were able to effectively translate some specific stimuli and biological processes in the lung epithelial system, such as DNA synthesis, cytoskeleton and extracellular matrix, translation, immune/inflammation and growth factor/proliferation pathways, better than the expected response similarity between species. 

  11. Methodological Approach for Modeling of Multienzyme in-pot Processes

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia; Roman Martinez, Alicia; Sin, Gürkan

    2011-01-01

    This paper presents a methodological approach for modeling multi-enzyme in-pot processes. The methodology is exemplified stepwise through the bi-enzymatic production of N-acetyl-D-neuraminic acid (Neu5Ac) from N-acetyl-D-glucosamine (GlcNAc). In this case study, sensitivity analysis is also used ...

  12. Locally Simple Models Construction: Methodology and Practice

    Directory of Open Access Journals (Sweden)

    I. A. Kazakov

    2017-12-01

    One of the most notable trends associated with the Fourth Industrial Revolution is a significant strengthening of the role played by semantic methods. They are employed in artificial intelligence, in knowledge mining from huge flows of big data, in robotization, and in the internet of things. Smart contracts can also be mentioned here, although the 'intelligence' of smart contracts still needs serious elaboration. These trends should inevitably lead to an increased role for logical methods working with semantics and significantly expand the scope of their practical application. However, a number of problems hinder this process. We are developing an approach that makes the application of logical modeling efficient in several important areas. The approach is based on the concept of locally simple models and is primarily focused on solving tasks in the management of enterprises, organizations, and governing bodies. The most important feature of locally simple models is their ability to replace software systems. Replacing programming with modeling gives huge advantages; for instance, it dramatically reduces development and support costs. Modeling, unlike programming, preserves the explicit semantics of models, allowing integration with artificial intelligence and robots. In addition, models are much more understandable to non-specialists than programs. In this paper we propose an implementation of the concept of locally simple modeling on the basis of the document models we developed earlier. It is shown that locally simple modeling is realized through document models with finite submodel coverages. In the second part of the paper, the use of document models to solve a management problem of realistic complexity is demonstrated.

  13. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
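    One widely used ingredient of discretization-error estimation of this kind is Richardson extrapolation between two grid resolutions; the sketch below is a generic illustration of that standard technique, not the specific framework of the cited report.

    ```python
    # Richardson-extrapolation estimate of discretization error: given
    # solutions on a coarse and a fine grid related by refinement ratio r,
    # and an assumed convergence order p, the error remaining in the
    # fine-grid solution can be estimated without knowing the exact answer.

    def discretization_error(f_fine, f_coarse, r=2.0, p=2.0):
        """Estimated signed error in f_fine."""
        return (f_fine - f_coarse) / (r**p - 1.0)

    def extrapolated(f_fine, f_coarse, r=2.0, p=2.0):
        """Higher-order estimate of the grid-converged value."""
        return f_fine + discretization_error(f_fine, f_coarse, r, p)

    # Example: a second-order solver returning 1.04 on a grid with spacing h
    # and 1.01 on a grid with spacing h/2 suggests the converged value is 1.0
    err = discretization_error(1.01, 1.04)   # about -0.01
    best = extrapolated(1.01, 1.04)          # about 1.0
    ```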

  14. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    technique involve model structure, system representation and the degree of validity, coupled with the simplicity, of the overall model. ABM is best suited... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and... (thesis by Casey D. Connors, Major, USA)

  15. Modeling Methodologies for Representing Urban Cultural Geographies in Stability Operations

    National Research Council Canada - National Science Library

    Ferris, Todd P

    2008-01-01

    ... 2.0.0, in an effort to provide modeling methodologies for a single simulation tool capable of exploring the complex world of urban cultural geographies undergoing Stability Operations in an irregular warfare (IW) environment...

  16. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component in software evolution. It requires a huge amount of time, manpower, and financial resources. The challenges are the size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  17. Translation and linguistic validation of the Pediatric Patient-Reported Outcomes Measurement Information System measures into simplified Chinese using cognitive interviewing methodology.

    Science.gov (United States)

    Liu, Yanyan; Hinds, Pamela S; Wang, Jichuan; Correia, Helena; Du, Shizheng; Ding, Jian; Gao, Wen Jun; Yuan, Changrong

    2013-01-01

    The Pediatric Patient-Reported Outcomes Measurement Information System (PROMIS) measures were developed using modern measurement theory and tested in a variety of settings to assess the quality of life, function, and symptoms of children and adolescents experiencing a chronic illness and its treatment. Developed in English, this set of measures had not been translated into Chinese. The objective of this study was to develop the Chinese version of the Pediatric PROMIS measures (C-Ped-PROMIS), specifically 8 short forms, and to pretest the translated measures in children and adolescents through cognitive interviewing methodology. The C-Ped-PROMIS was developed following the standard Functional Assessment of Chronic Illness Therapy Translation Methodology. Bilingual teams from the United States and China reviewed the translation to develop a provisional version, which was then pretested through cognitive interviews with 10 native Chinese-speaking children aged 8 to 17 years in China. The translation was finalized by the bilingual teams. Most items, response options, and instructions were well understood by the children, and some revisions were made to address patients' comments during the cognitive interviews. The results indicated that the C-Ped-PROMIS items were semantically and conceptually equivalent to the original. Children aged 8 to 17 years in China were able to comprehend these measures and express their experience and feelings about illness or their life. The C-Ped-PROMIS is available for psychometric validation. Future work will be directed at translating the rest of the item banks, calibrating them and creating a Chinese final version of the short forms.

  18. Translating the foundational model of anatomy into french using knowledge-based and lexical methods

    Directory of Open Access Journals (Sweden)

    Merabti Tayeb

    2011-10-01

    Background: The Foundational Model of Anatomy (FMA) is the reference ontology for human anatomy. The FMA vocabulary was integrated into the Health Multi Terminological Portal (HMTP) developed by CISMeF, based on the CISMeF information system, which also includes 26 other terminologies and controlled vocabularies, mainly in French. However, the FMA is primarily in English. In this context, the translation of FMA English terms into French could also be useful for searching and indexing French anatomy resources. Various studies have investigated automatic methods to assist the translation of medical terminologies or to create multilingual medical vocabularies. The goal of this study was to facilitate the translation of the FMA vocabulary into French. Methods: We compare two approaches to translating FMA terms into French. The first is UMLS-based, relying on the conceptual information of the UMLS Metathesaurus. The second is lexically based, using several natural language processing (NLP) tools. Results: The UMLS-based approach produced a translation of 3,661 FMA terms into French, whereas the lexical approach produced a translation of 3,129 FMA terms. A qualitative evaluation was made on 100 FMA terms translated by each method. For the UMLS-based approach, 52% of the 100 translations were manually rated as "very good" and only 7% as "bad". For the lexical approach, 47% were rated as "very good" and 20% as "bad". Conclusions: Overall, both methods covered only a small fraction of the vocabulary. The two approaches allowed us to semi-automatically translate 3,776 FMA terms from English into French, added to the existing 10,844 French FMA terms in the HMTP (4,436 FMA French terms and 6,408 FMA terms manually translated).
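    A lexically based approach of this general kind can be illustrated with a deliberately tiny sketch; the lexicon, the single noun-adjective reordering rule, and the function below are illustrative assumptions, not the paper's NLP pipeline.

    ```python
    # Toy lexically-based English->French translation of anatomical terms.
    # Real systems use morphosyntactic analysis and large lexicons; here we
    # look each word up in a small bilingual lexicon and apply one crude
    # reordering rule (French typically places the adjective after the noun).

    LEXICON = {
        "left": "gauche", "right": "droit",
        "kidney": "rein", "lung": "poumon",
    }

    def translate_term(term):
        words = term.lower().split()
        if not all(w in LEXICON for w in words):
            return None  # out of lexicon: propose no translation
        # reverse "adjective noun" into French "noun adjective" order
        return " ".join(LEXICON[w] for w in reversed(words))
    ```

    Returning `None` for out-of-lexicon terms mirrors why such methods cover only part of a large vocabulary: every word of a term must be translatable before a candidate can be proposed.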

  19. Modeling workflow to design machine translation applications for public health practice.

    Science.gov (United States)

    Turner, Anne M; Brownstein, Megumu K; Cole, Kate; Karasz, Hilary; Kirchhoff, Katrin

    2015-02-01

    Provide a detailed understanding of the information workflow processes related to translating health promotion materials for limited English proficiency individuals in order to inform the design of context-driven machine translation (MT) tools for public health (PH). We applied a cognitive work analysis framework to investigate the translation information workflow processes of two large health departments in Washington State. Researchers conducted interviews, performed a task analysis, and validated results with PH professionals to model translation workflow and identify functional requirements for a translation system for PH. The study resulted in a detailed description of work related to translation of PH materials, an information workflow diagram, and a description of attitudes towards MT technology. We identified a number of themes that hold design implications for incorporating MT in PH translation practice. A PH translation tool prototype was designed based on these findings. This study underscores the importance of understanding the work context and information workflow for which systems will be designed. Based on themes and translation information workflow processes, we identified key design guidelines for incorporating MT into PH translation work. Primary amongst these is that MT should be followed by human review for translations to be of high quality and for the technology to be adopted into practice. The time and costs of creating multilingual health promotion materials are barriers to translation. PH personnel were interested in MT's potential to improve access to low-cost translated PH materials, but expressed concerns about ensuring quality. We outline design considerations and a potential machine translation tool to best fit MT systems into PH practice. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Efficient Modelling Methodology for Reconfigurable Underwater Robots

    DEFF Research Database (Denmark)

    Nielsen, Mikkel Cornelius; Blanke, Mogens; Schjølberg, Ingrid

    2016-01-01

    This paper considers the challenge of applying reconfigurable robots in an underwater environment. The main result presented is the development of a model for a system comprised of N, possibly heterogeneous, robots dynamically connected to each other and moving with 6 Degrees of Freedom (DOF). Th...

  1. Translation techniques for distributed-shared memory programming models

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, Douglas James [Iowa State Univ., Ames, IA (United States)

    2005-01-01

    The high performance computing community has experienced an explosive improvement in distributed-shared memory hardware. Driven by increasing real-world problem complexity, this explosion has ushered in vast numbers of new systems. Each new system presents new challenges to programmers and application developers. Part of the challenge is adapting to new architectures with new performance characteristics. Different vendors release systems with widely varying architectures that perform differently in different situations. Furthermore, since vendors need only provide a single performance number (total MFLOPS, typically for a single benchmark), they initially have a strong incentive to optimize only the API of their choice. Consequently, only a fraction of the available APIs are well optimized on most systems. This causes issues in porting and writing maintainable software, to say nothing of the burden on programmers of mastering each new API as it is released. Also, programmers wishing to use a certain machine must choose their API based on the underlying hardware instead of the application. This thesis argues that a flexible, extensible translator for distributed-shared memory APIs can help address some of these issues. For example, a translator might take as input code in one API and output an equivalent program in another. Such a translator could provide instant porting of applications to new systems that do not support the application's library or language natively. While open-source APIs are abundant, they do not perform optimally everywhere. A translator would also allow performance testing using a single base code translated to a number of different APIs. Most significantly, this type of translator frees programmers to select the most appropriate API for a given application based on the application (and developer) itself instead of the underlying hardware.
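    The core idea of such a translator can be sketched in miniature as a table-driven source-to-source rewrite. The call-name pairs below (SHMEM to UPC) are used loosely for illustration; a real translator must also reconcile argument conventions, synchronization semantics, and memory-consistency models, not just rename calls.

    ```python
    import re

    # Illustrative mapping from one distributed-shared memory API's calls
    # to rough counterparts in another. Renaming alone is NOT a correct
    # translation in general; it is only the skeleton of the approach.
    CALL_MAP = {
        "shmem_put": "upc_memput",
        "shmem_get": "upc_memget",
        "shmem_barrier_all": "upc_barrier",
    }

    # \b word boundaries keep us from rewriting substrings of longer names
    _PATTERN = re.compile(r"\b(" + "|".join(CALL_MAP) + r")\b")

    def translate(source):
        """Rewrite every mapped API call name in the source text."""
        return _PATTERN.sub(lambda m: CALL_MAP[m.group(1)], source)
    ```

    A table-driven design keeps the API knowledge in data rather than code, so supporting a new target API means extending the table instead of rewriting the translator.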

  2. From theoretical model to practical use: an example of knowledge translation.

    Science.gov (United States)

    Bjørk, Ida Torunn; Lomborg, Kirsten; Nielsen, Carsten Munch; Brynildsen, Grethe; Frederiksen, Anne-Marie Skovsgaard; Larsen, Karin; Reierson, Inger Åse; Sommer, Irene; Stenholt, Britta

    2013-10-01

To present a case of knowledge translation in nursing education and practice and to discuss mechanisms relevant to bringing knowledge into action. The process of knowledge translation aspires to close the gap between theory and practice. Knowledge translation is a cyclic process involving both the creation and application of knowledge in several phases. The case presented in this paper is the translation of the Model of Practical Skill Performance into education and practice. Advantages and problems with the use of this model and its adaptation and tailoring to local contexts illustrate the cyclic and iterative process of knowledge translation. The cultivation of a three-sided relationship between researchers, educators, and clinical nurses was a major asset in driving the process of knowledge translation. The knowledge translation process gained momentum by replacing passive diffusion strategies with interaction and teamwork between stakeholders. The use of knowledge creates feedback that might have consequences for the refinement and tailoring of that same knowledge. With end-users in mind, several heuristics were used by the research group to increase clarity of the model and to tailor the implementation of knowledge to the users. This article illustrates the need for enduring collaboration between stakeholders to promote the process of knowledge translation. Translation of research knowledge into practice is a time-consuming process that is enhanced when appropriate support is given by leaders in the involved facilities. Knowledge translation is a time-consuming and collaborative endeavour. On the basis of our experience we advocate the implementation and use of a conceptual framework for the entire process of knowledge translation. More descriptions of knowledge translation in the nursing discipline are needed to inspire and advise in this process. © 2013 Blackwell Publishing Ltd.

  3. A Translational Model of Research-Practice Integration

    Science.gov (United States)

    Vivian, Dina; Hershenberg, Rachel; Teachman, Bethany A.; Drabick, Deborah A. G.; Goldfried, Marvin R.; Wolfe, Barry

    2013-01-01

    We propose a four-level, recursive Research-Practice Integration framework as a heuristic to (a) integrate and reflect on the articles in this Special Section as contributing to a bidirectional bridge between research and practice, and (b) consider additional opportunities to address the research–practice gap. Level 1 addresses Treatment Validation studies and includes an article by Lochman and colleagues concerning the programmatic adaptation, implementation, and dissemination of the empirically supported Coping Power treatment program for youth aggression. Level 2 translation, Training in Evidence-Based Practice, includes a paper by Hershenberg, Drabick, and Vivian, which focuses on the critical role that predoctoral training plays in bridging the research–practice gap. Level 3 addresses the Assessment of Clinical Utility and Feedback to Research aspects of translation. The articles by Lambert and Youn, Kraus, and Castonguay illustrate the use of commercial outcome packages that enable psychotherapists to integrate ongoing client assessment, thus enhancing the effectiveness of treatment implementation and providing data that can be fed back to researchers. Lastly, Level 4 translation, the Cross-Level Integrative Research and Communication, concerns research efforts that integrate data from clinical practice and all other levels of translation, as well as communication efforts among all stakeholders, such as researchers, psychotherapists, and clients. Using a two-chair technique as a framework for his discussion, Wolfe's article depicts the struggle inherent in research–practice integration efforts and proposes a rapprochement that highlights advancements in the field. PMID:22642522

  4. Selected translated abstracts of Russian-language climate-change publications. 4: General circulation models

    Energy Technology Data Exchange (ETDEWEB)

    Burtis, M.D. [comp.] [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Razuvaev, V.N.; Sivachok, S.G. [All-Russian Research Inst. of Hydrometeorological Information--World Data Center, Obninsk (Russian Federation)

    1996-10-01

This report presents English-translated abstracts of important Russian-language literature concerning general circulation models as they relate to climate change. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Russian. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  5. Comparison of Methods for Modeling a Hydraulic Loader Crane With Flexible Translational Links

    DEFF Research Database (Denmark)

    Pedersen, Henrik Clemmensen; Andersen, Torben O.; Nielsen, Brian K.

    2015-01-01

    not hold for translational links. Hence, special care has to be taken when including flexible translational links. In the current paper, different methods for modeling a hydraulic loader crane with a telescopic arm are investigated and compared using both the finite segment (FS) and AMs method...

  6. Efficient accurate syntactic direct translation models: one tree at a time

    NARCIS (Netherlands)

    Hassan, H.; Sima'an, K.; Way, A.

    2011-01-01

    A challenging aspect of Statistical Machine Translation from Arabic to English lies in bringing the Arabic source morpho-syntax to bear on the lexical as well as word-order choices of the English target string. In this article, we extend the feature-rich discriminative Direct Translation Model 2

  7. Using physiologically based models for clinical translation: predictive modelling, data interpretation or something in-between?

    Science.gov (United States)

    Niederer, Steven A; Smith, Nic P

    2016-12-01

    Heart disease continues to be a significant clinical problem in Western society. Predictive models and simulations that integrate physiological understanding with patient information derived from clinical data have huge potential to contribute to improving our understanding of both the progression and treatment of heart disease. In particular they provide the potential to improve patient selection and optimisation of cardiovascular interventions across a range of pathologies. Currently a significant proportion of this potential is still to be realised. In this paper we discuss the opportunities and challenges associated with this realisation. Reviewing the successful elements of model translation for biophysically based models and the emerging supporting technologies, we propose three distinct modes of clinical translation. Finally we outline the challenges ahead that will be fundamental to overcome if the ultimate goal of fully personalised clinical cardiac care is to be achieved. © 2016 The Authors. The Journal of Physiology © 2016 The Physiological Society.

  8. Translation of a High-Level Temporal Model into Lower Level Models: Impact of Modelling at Different Description Levels

    DEFF Research Database (Denmark)

    Kraft, Peter; Sørensen, Jens Otto

    2001-01-01

    given types of properties, and examine how descriptions on higher levels translate into descriptions on lower levels. Our example looks at temporal properties where the information is concerned with the existence in time. In a high level temporal model with information kept in a three-dimensional space...... the existences in time can be mapped precisely and consistently securing a consistent handling of the temporal properties. We translate the high level temporal model into an entity-relationship model, with the information in a two-dimensional graph, and finally we look at the translations into relational...... and other textual models. We also consider the aptness of models that include procedural mechanisms such as active and object databases...

  9. Geologic modeling in risk assessment methodology for radioactive waste management

    International Nuclear Information System (INIS)

    Logan, S.E.; Berbano, M.C.

    1977-01-01

Under contract to the U.S. Environmental Protection Agency (EPA), the University of New Mexico is developing a computer-based assessment methodology for evaluating public health and environmental impacts from the disposal of radioactive waste in geologic formations. The methodology incorporates a release or fault tree model, an environmental model, and an economic model. The release model and its application to a model repository in bedded salt is described. Fault trees are constructed to provide the relationships between various geologic and man-caused events which are potential mechanisms for release of radioactive material beyond the immediate environs of the repository. The environmental model includes: 1) the transport to and accumulations at various receptors in the biosphere, 2) pathways from these environmental concentrations, and 3) radiation dose to man. Finally, economic results are used to compare and assess various disposal configurations as a basis for formulating

  10. Machine Translation as a Model for Overcoming Some Common Errors in English-into-Arabic Translation among EFL University Freshmen

    Science.gov (United States)

    El-Banna, Adel I.; Naeem, Marwa A.

    2016-01-01

    This research work aimed at making use of Machine Translation to help students avoid some syntactic, semantic and pragmatic common errors in translation from English into Arabic. Participants were a hundred and five freshmen who studied the "Translation Common Errors Remedial Program" prepared by the researchers. A testing kit that…

  11. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent-oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. One reason is that there are few case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. They demonstrate how AOM is useful for epidemiological and ecological studies, and hence further validate AOM in a qualitative manner.

  12. A methodology to annotate systems biology markup language models with the synthetic biology open language.

    Science.gov (United States)

    Roehner, Nicholas; Myers, Chris J

    2014-02-21

    Recently, we have begun to witness the potential of synthetic biology, noted here in the form of bacteria and yeast that have been genetically engineered to produce biofuels, manufacture drug precursors, and even invade tumor cells. The success of these projects, however, has often failed in translation and application to new projects, a problem exacerbated by a lack of engineering standards that combine descriptions of the structure and function of DNA. To address this need, this paper describes a methodology to connect the systems biology markup language (SBML) to the synthetic biology open language (SBOL), existing standards that describe biochemical models and DNA components, respectively. Our methodology involves first annotating SBML model elements such as species and reactions with SBOL DNA components. A graph is then constructed from the model, with vertices corresponding to elements within the model and edges corresponding to the cause-and-effect relationships between these elements. Lastly, the graph is traversed to assemble the annotating DNA components into a composite DNA component, which is used to annotate the model itself and can be referenced by other composite models and DNA components. In this way, our methodology can be used to build up a hierarchical library of models annotated with DNA components. Such a library is a useful input to any future genetic technology mapping algorithm that would automate the process of composing DNA components to satisfy a behavioral specification. Our methodology for SBML-to-SBOL annotation is implemented in the latest version of our genetic design automation (GDA) software tool, iBioSim.
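
The annotate, build-graph, traverse pipeline described in this abstract can be sketched in a few lines. This is a toy illustration with hypothetical element names and DNA-component identifiers, not the iBioSim implementation: model elements carry optional DNA-component annotations, the edges capture cause-and-effect, and a topological traversal assembles the annotations into one composite component.

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# element -> annotating DNA component (hypothetical identifiers; None means
# the element, e.g. a reaction, carries no DNA component of its own)
annotations = {"promoter_sp": "BBa_P1", "cds_sp": "BBa_C1", "rxn": None}

# cause-and-effect edges: each key lists the elements it depends on
edges = {"cds_sp": {"rxn"}, "rxn": {"promoter_sp"}}

def assemble_composite(annotations, edges):
    """Traverse the model graph in dependency order and collect the
    annotating DNA components into a composite component."""
    order = TopologicalSorter(edges).static_order()
    return [annotations[e] for e in order if annotations.get(e)]

print(assemble_composite(annotations, edges))  # ['BBa_P1', 'BBa_C1']
```

The resulting list stands in for the composite DNA component that, per the abstract, would annotate the model itself and be referenced by other models.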

  13. A generic method for automatic translation between input models for different versions of simulation codes

    International Nuclear Information System (INIS)

    Serfontein, Dawid E.; Mulder, Eben J.; Reitsma, Frederik

    2014-01-01

A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as those for the VSOP codes, are often very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. Therefore the task of verifying the accuracy of such translated files, for instance by nuclear regulators, can be very difficult and cumbersome. Translation errors may thus go unnoticed, which may have disastrous consequences later on when a reactor with such a faulty design is built. A generic algorithm for producing such automatic translation codes may therefore ease the translation and verification process to a great extent. It will also remove human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm also automatically creates a verification log file which permanently records the names and values of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications
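
A toy version of such a translator, with the verification log the abstract describes, might look as follows. The field names, formats, and meanings are invented for illustration and are not actual VSOP input variables.

```python
# Hypothetical mapping from old-format keys to new-format keys, plus the
# recorded meaning of each variable for the verification log.
FIELD_MAP = {"NDENS": "number_density", "TEMP": "temperature_K"}
MEANINGS = {"NDENS": "atoms per barn-cm", "TEMP": "kelvin"}

def translate_model(old_model: dict):
    """Translate an input model dict and build a verification log that
    permanently records each variable's name, value and meaning."""
    new_model, log = {}, []
    for key, value in old_model.items():
        new_key = FIELD_MAP[key]
        new_model[new_key] = value
        log.append(f"{key} -> {new_key} = {value} ({MEANINGS[key]})")
    return new_model, log

new_model, log = translate_model({"NDENS": 0.0478, "TEMP": 900})
print(new_model)  # {'number_density': 0.0478, 'temperature_K': 900}
```

The log, written alongside the translated file, is what lets a reviewer check each translated entry without reading thousands of bare numbers.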

  14. A generic method for automatic translation between input models for different versions of simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Serfontein, Dawid E., E-mail: Dawid.Serfontein@nwu.ac.za [School of Mechanical and Nuclear Engineering, North West University (PUK-Campus), PRIVATE BAG X6001 (Internal Post Box 360), Potchefstroom 2520 (South Africa); Mulder, Eben J. [School of Mechanical and Nuclear Engineering, North West University (South Africa); Reitsma, Frederik [Calvera Consultants (South Africa)

    2014-05-01

A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as those for the VSOP codes, are often very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. Therefore the task of verifying the accuracy of such translated files, for instance by nuclear regulators, can be very difficult and cumbersome. Translation errors may thus go unnoticed, which may have disastrous consequences later on when a reactor with such a faulty design is built. A generic algorithm for producing such automatic translation codes may therefore ease the translation and verification process to a great extent. It will also remove human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm also automatically creates a verification log file which permanently records the names and values of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications.

  15. K-Means Subject Matter Expert Refined Topic Model Methodology

    Science.gov (United States)

    2017-01-01

computing environment, the Visual Basic for Applications (VBA) programming language presents itself as our programming language of choice. We propose...background, or access to other computational programming environments, to build topic models from free-text datasets using a familiar Excel-based...environment that restricts access to other software-based text analytic tools. Opportunities to deploy developmental versions of the methodology and

  16. Combining patient journey modelling and visual multi-agent computer simulation: a framework to improving knowledge translation in a healthcare environment.

    Science.gov (United States)

    Curry, Joanne; Fitzgerald, Anneke; Prodan, Ante; Dadich, Ann; Sloan, Terry

    2014-01-01

    This article focuses on a framework that will investigate the integration of two disparate methodologies: patient journey modelling and visual multi-agent simulation, and its impact on the speed and quality of knowledge translation to healthcare stakeholders. Literature describes patient journey modelling and visual simulation as discrete activities. This paper suggests that their combination and their impact on translating knowledge to practitioners are greater than the sum of the two technologies. The test-bed is ambulatory care and the goal is to determine if this approach can improve health services delivery, workflow, and patient outcomes and satisfaction. The multidisciplinary research team is comprised of expertise in patient journey modelling, simulation, and knowledge translation.

  17. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.

  18. Efficient Embedded Decoding of Neural Network Language Models in a Machine Translation System.

    Science.gov (United States)

    Zamora-Martinez, Francisco; Castro-Bleda, Maria Jose

    2018-02-22

Neural Network Language Models (NNLMs) are a successful approach to Natural Language Processing tasks, such as Machine Translation. We introduce in this work a Statistical Machine Translation (SMT) system which fully integrates NNLMs in the decoding stage, breaking the traditional approach based on N-best list rescoring. The neural net models (both language models (LMs) and translation models) are fully coupled in the decoding stage, allowing them to influence the translation quality more strongly. Computational issues were solved by using a novel idea based on memorization and smoothing of the softmax constants to avoid their computation, which introduces a trade-off between LM quality and computational cost. These ideas were studied in a machine translation task with different combinations of neural networks used both as translation models and as target LMs, comparing phrase-based and n-gram-based systems, showing that the integrated approach seems more promising for n-gram-based systems, even with non-full-quality NNLMs.
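
The memorization-and-smoothing idea for softmax constants might be sketched as below: cache exact log-normalizers for contexts visited during a precomputation pass, and substitute a smoothed (here, simply averaged) constant for contexts not in the cache. The class and its names are illustrative assumptions, not the authors' implementation.

```python
import math

class CachedSoftmaxLM:
    """Language model wrapper that avoids recomputing softmax normalization
    constants: exact cached constants for precomputed contexts, a smoothed
    default for the rest (trading LM exactness for decoding speed)."""

    def __init__(self, logits_fn, precomputed_contexts):
        self.logits_fn = logits_fn  # context -> dict(word -> raw score)
        self.cache = {c: self._log_z(c) for c in precomputed_contexts}
        # smoothed constant: mean of the cached log-normalizers
        self.default_log_z = sum(self.cache.values()) / len(self.cache)

    def _log_z(self, context):
        scores = list(self.logits_fn(context).values())
        m = max(scores)  # max-shift for numerical stability
        return m + math.log(sum(math.exp(s - m) for s in scores))

    def log_prob(self, word, context):
        score = self.logits_fn(context)[word]
        return score - self.cache.get(context, self.default_log_z)

toy_scores = {"ctx1": {"a": 1.0, "b": 0.0}, "ctx2": {"a": 0.0, "b": 2.0}}
lm = CachedSoftmaxLM(toy_scores.__getitem__, ["ctx1"])
print(round(lm.log_prob("a", "ctx1"), 4))  # -0.3133 (exact for cached context)
```

For cached contexts the log-probability is exact; for unseen contexts it is only approximately normalized, which is precisely the quality/cost trade-off the abstract mentions.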

  19. Effects of different per translational kinetics on the dynamics of a core circadian clock model.

    Science.gov (United States)

    Nieto, Paula S; Revelli, Jorge A; Garbarino-Pico, Eduardo; Condat, Carlos A; Guido, Mario E; Tamarit, Francisco A

    2015-01-01

Living beings display self-sustained daily rhythms in multiple biological processes, which persist in the absence of external cues since they are generated by endogenous circadian clocks. The period (per) gene is a central player within the core molecular mechanism for keeping circadian time in most animals. Recently, the modulation of PER translation has been reported, both in mammals and flies, suggesting that translational regulation of clock components is important for proper clock gene expression and molecular clock performance. Because translational regulation ultimately implies changes in the kinetics of translation and, therefore, in the circadian clock dynamics, we sought to study how, and to what extent, the molecular clock dynamics is affected by the kinetics of PER translation. With this objective, we used a minimal mathematical model of the molecular circadian clock to qualitatively characterize the dynamical changes derived from kinetically different PER translational mechanisms. We found that the emergence of self-sustained oscillations with characteristic period, amplitude, and phase lag (time delays) between per mRNA and protein expression depends on the kinetic parameters related to PER translation. Interestingly, under certain conditions, a PER translation mechanism with saturable kinetics introduces longer time delays than a mechanism ruled by first-order kinetics. In addition, the kinetic laws of PER translation significantly changed the sensitivity of our model to parameters related to the synthesis and degradation of per mRNA and PER degradation. Lastly, we found a set of parameters, with realistic values, for which our model reproduces some experimental results recently reported for Drosophila melanogaster, and we present some predictions derived from our analysis.
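
The kinetic distinction at the heart of this abstract, first-order versus saturable PER translation, reduces to two rate laws. The sketch below uses arbitrary parameter values, purely to illustrate the contrast:

```python
# First-order kinetics: translation rate proportional to per mRNA level.
def first_order_rate(mrna, k=1.0):
    return k * mrna

# Saturable (Michaelis-Menten-type) kinetics: rate plateaus at vmax.
def saturable_rate(mrna, vmax=1.0, km=0.5):
    return vmax * mrna / (km + mrna)

# At low mRNA the two laws nearly coincide; at high mRNA the saturable law
# plateaus at vmax, a qualitative difference that, per the abstract, can
# lengthen the clock's time delays.
for m in (0.1, 1.0, 10.0):
    print(m, first_order_rate(m), round(saturable_rate(m), 3))
```

Swapping one function for the other in the per protein equation of a minimal clock model is all it takes to reproduce the comparison the authors describe.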

  20. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving.

    Science.gov (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-10-11

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  1. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    Directory of Open Access Journals (Sweden)

    Jos Elfring

    2016-10-01

    Full Text Available The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  2. Making sense of the Sense Model: translation priming with Japanese-English bilinguals

    OpenAIRE

    Allen, David; Conklin, Kathy; Van Heuven, Walter J.B.

    2015-01-01

Many studies have reported that first language (L1) translation primes speed responses to second language (L2) targets, whereas L2 translation primes generally do not speed up responses to L1 targets in lexical decision. According to the Sense Model (Finkbeiner, Forster, Nicol & Nakamura, 2004) this asymmetry is due to the proportion of senses activated by the prime. Because L2 primes activate only a subset of the L1 translation's senses, priming is not observed. In this study we test the pred...

  3. Genome-Scale Analysis of Translation Elongation with a Ribosome Flow Model

    Science.gov (United States)

    Meilijson, Isaac; Kupiec, Martin; Ruppin, Eytan

    2011-01-01

    We describe the first large scale analysis of gene translation that is based on a model that takes into account the physical and dynamical nature of this process. The Ribosomal Flow Model (RFM) predicts fundamental features of the translation process, including translation rates, protein abundance levels, ribosomal densities and the relation between all these variables, better than alternative (‘non-physical’) approaches. In addition, we show that the RFM can be used for accurate inference of various other quantities including genes' initiation rates and translation costs. These quantities could not be inferred by previous predictors. We find that increasing the number of available ribosomes (or equivalently the initiation rate) increases the genomic translation rate and the mean ribosome density only up to a certain point, beyond which both saturate. Strikingly, assuming that the translation system is tuned to work at the pre-saturation point maximizes the predictive power of the model with respect to experimental data. This result suggests that in all organisms that were analyzed (from bacteria to Human), the global initiation rate is optimized to attain the pre-saturation point. The fact that similar results were not observed for heterologous genes indicates that this feature is under selection. Remarkably, the gap between the performance of the RFM and alternative predictors is strikingly large in the case of heterologous genes, testifying to the model's promising biotechnological value in predicting the abundance of heterologous proteins before expressing them in the desired host. PMID:21909250
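
A minimal sketch of the RFM site equations (initiation into the first site, hops between internal sites, exit from the last site) can be integrated directly with Euler stepping. The rates below are arbitrary illustrative values; the protein production rate is the steady-state exit flow.

```python
def rfm_steady_state(lam_in, lam, lam_out, dt=0.01, steps=100_000):
    """Integrate the RFM: a chain of len(lam)+1 sites with occupancies
    x_i in [0, 1], initiation rate lam_in, internal hop rates lam[i]
    and exit rate lam_out. Returns (densities, translation rate)."""
    n = len(lam) + 1
    x = [0.0] * n
    for _ in range(steps):
        new = x[:]
        prev_flow = lam_in * (1.0 - x[0])        # initiation flow
        for i in range(n - 1):
            f = lam[i] * x[i] * (1.0 - x[i + 1])  # hop i -> i+1
            new[i] += dt * (prev_flow - f)
            prev_flow = f
        out = lam_out * x[-1]                     # exit (protein production)
        new[-1] += dt * (prev_flow - out)
        x = new
    return x, lam_out * x[-1]

x, rate = rfm_steady_state(lam_in=0.8, lam=[1.0, 1.0], lam_out=1.0)
print(round(rate, 3))
```

Raising `lam_in` in this sketch reproduces the saturation behaviour the abstract reports: the exit rate grows with the initiation rate only up to a point.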

  4. Multi-dimensional knowledge translation: enabling health informatics capacity audits using patient journey models.

    Science.gov (United States)

    Catley, Christina; McGregor, Carolyn; Percival, Jennifer; Curry, Joanne; James, Andrew

    2008-01-01

This paper presents a multi-dimensional approach to knowledge translation, enabling results obtained from a survey evaluating the uptake of Information Technology within Neonatal Intensive Care Units to be translated into knowledge, in the form of health informatics capacity audits. Survey data, which spans multiple roles, patient care scenarios, levels, and hospitals, is translated using a structured data-modeling approach into patient journey models. The data model is defined such that users can develop queries to generate patient journey models based on a pre-defined Patient Journey Model architecture (PaJMa). PaJMa models are then analyzed to build capacity audits. Capacity audits offer a sophisticated view of health informatics usage, providing not only details of what IT solutions a hospital utilizes, but also answering the questions of when, how and why, by determining when the IT solutions are integrated into the patient journey, how they support the patient information flow, and why they improve the patient journey.

  5. A Methodology to Assess Ionospheric Models for GNSS

    Science.gov (United States)

    Rovira-Garcia, Adria; Juan, José Miguel; Sanz, Jaume; González-Casado, Guillermo; Ibánez, Deimos

    2015-04-01

Testing the accuracy of the ionospheric models used in the Global Navigation Satellite System (GNSS) is a long-standing issue. It is still a challenging problem due to the lack of accurate enough slant ionospheric determinations to be used as a reference. The present study proposes a methodology to assess any ionospheric model used in satellite-based applications and, in particular, GNSS ionospheric models. The methodology complements other analysis comparing the navigation based on different models to correct the code and carrier-phase observations. Specifically, the following ionospheric models are assessed: the operational models broadcast in the Global Positioning System (GPS), Galileo and the European Geostationary Navigation Overlay System (EGNOS), the post-process Global Ionospheric Maps (GIMs) from different analysis centers belonging to the International GNSS Service (IGS) and, finally, a new GIM computed by the gAGE/UPC research group. The methodology is based in the comparison between the predictions of the ionospheric model with actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences shall be separated into the hardware delays (a receiver constant plus a satellite constant) per data interval, e.g., a day. The condition that these Differential Code Biases (DCBs) are commonly shared throughout the world-wide network of receivers and satellites provides a global character to the assessment. This approach generalizes simple tests based on double differenced Slant Total Electron Contents (STECs) between pairs of satellites and receivers on a much local scale. The present study has been conducted during the entire 2014, i.e., the last Solar Maximum. The seasonal and latitudinal structures of the results clearly reflect the different strategies used by the different models. On one hand, ionospheric model corrections based on a grid (IGS-GIMs or EGNOS) are shown to be several times better than the models
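
The separation of model-minus-measurement differences into a receiver constant plus a satellite constant can be sketched as a small fitting problem. The alternating-averages solver and the zero-mean satellite constraint below are one simple choice among several ways to remove the rank deficiency, and the data are synthetic.

```python
def fit_dcbs(diffs, n_iter=50):
    """diffs: dict[(receiver, satellite)] -> model-minus-measurement STEC
    difference. Fit a per-receiver bias plus a per-satellite bias, with the
    satellite biases constrained to zero mean."""
    recv = {r for r, _ in diffs}
    sats = {s for _, s in diffs}
    b_r = {r: 0.0 for r in recv}
    b_s = {s: 0.0 for s in sats}
    for _ in range(n_iter):
        for r in recv:
            vals = [d - b_s[s] for (rr, s), d in diffs.items() if rr == r]
            b_r[r] = sum(vals) / len(vals)
        for s in sats:
            vals = [d - b_r[r] for (r, ss), d in diffs.items() if ss == s]
            b_s[s] = sum(vals) / len(vals)
        mean_s = sum(b_s.values()) / len(b_s)  # enforce zero-mean satellites
        b_s = {s: v - mean_s for s, v in b_s.items()}
        b_r = {r: v + mean_s for r, v in b_r.items()}
    return b_r, b_s

# synthetic truth: two receivers, two satellites, zero-mean satellite biases
truth_r = {"A": 2.0, "B": -1.0}
truth_s = {"G01": 0.5, "G02": -0.5}
diffs = {(r, s): truth_r[r] + truth_s[s] for r in truth_r for s in truth_s}
b_r, b_s = fit_dcbs(diffs)
```

Whatever remains after removing these two constants per interval is the model error the assessment actually measures.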

  6. An experimental methodology for a fuzzy set preference model

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers has yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which de-composes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model. 
The combination of complex aggregate
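
The additive combination function at the heart of conjoint analysis can be sketched as follows; the attributes, part-worth utilities and importance weights below are illustrative assumptions, not data from the study.

```python
# Additive conjoint model: overall preference as a weighted sum of
# part-worth utilities for each attribute level. All numbers are
# illustrative assumptions.
part_worths = {
    "taste": {"mild": 0.2, "strong": 0.8},
    "price": {"low": 0.9, "high": 0.1},
}
weights = {"taste": 0.6, "price": 0.4}  # attribute importance weights

def overall_preference(profile):
    """Combine part-worths with importance weights (additive rule)."""
    return sum(weights[attr] * part_worths[attr][level]
               for attr, level in profile.items())

# A strong-tasting, low-priced product outscores a mild, high-priced one.
p1 = overall_preference({"taste": "strong", "price": "low"})
p2 = overall_preference({"taste": "mild", "price": "high"})
```

A fuzzy extension of the kind the record describes would replace the crisp levels ("mild", "strong") with membership functions elicited from each consumer.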

  7. Predictive biophysical modeling and understanding of the dynamics of mRNA translation and its evolution

    Science.gov (United States)

    Zur, Hadas; Tuller, Tamir

    2016-01-01

    mRNA translation is the fundamental process of decoding the information encoded in mRNA molecules by the ribosome for the synthesis of proteins. The centrality of this process in various biomedical disciplines such as cell biology, evolution and biotechnology, encouraged the development of dozens of mathematical and computational models of translation in recent years. These models aimed at capturing various biophysical aspects of the process. The objective of this review is to survey these models, focusing on those based and/or validated on real large-scale genomic data. We consider aspects such as the complexity of the models, the biophysical aspects they regard and the predictions they may provide. Furthermore, we survey the central systems biology discoveries reported on their basis. This review demonstrates the fundamental advantages of employing computational biophysical translation models in general, and discusses the relative advantages of the different approaches and the challenges in the field. PMID:27591251

  8. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge volume and diversity of information and information assets produced in different systems such as KMS, financial and accounting systems, official and industrial automation systems and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model have created a strong trend in organizations towards implementing cloud computing. Maintaining and managing information security is the main challenge in developing and accepting this model. In this paper, first, following the "design science research methodology" and compatible with the "design process in information systems research", a complete categorization of organizational assets, comprising 355 different types of information assets in 7 groups and 3 levels, is presented so that managers can plan corresponding security controls according to the importance of each group. Then, an appropriate methodology is presented for directing an organization in architecting its information security in a cloud computing environment. The presented cloud computing security architecture resulting from the proposed methodology, together with the presented classification model, were discussed and verified using the Delphi method and expert comments.

  9. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    Science.gov (United States)

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. We here report in a perspective article a summary of presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  10. Teaching methodology for modeling reference evapotranspiration with artificial neural networks

    OpenAIRE

    Martí, Pau; Pulido Calvo, Inmaculada; Gutiérrez Estrada, Juan Carlos

    2015-01-01

    [EN] Artificial neural networks are a robust alternative to conventional models for estimating different targets in irrigation engineering, among others, reference evapotranspiration, a key variable for estimating crop water requirements. This paper presents a didactic methodology for introducing students to the application of artificial neural networks for reference evapotranspiration estimation using MatLab©. Apart from learning a specific application of this software wi...
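
A minimal sketch of the kind of network such a course introduces, assuming a single hidden layer with tanh activations and purely illustrative weights (a real model would be trained in MatLab against measured or Penman-Monteith ET0 data):

```python
import math

# One-hidden-layer feedforward network for reference evapotranspiration
# (ET0) estimation. Weights and normalisation constants are arbitrary
# placeholders, not fitted values.
def neuron(inputs, weights, bias):
    return math.tanh(sum(w * x for w, x in zip(weights, inputs)) + bias)

def et0_estimate(temperature_c, radiation_mj):
    x = [temperature_c / 40.0, radiation_mj / 30.0]  # crude normalisation
    hidden = [neuron(x, [0.8, 0.4], -0.2), neuron(x, [-0.3, 0.9], 0.1)]
    # linear output layer, clipped at zero (ET0 cannot be negative), mm/day
    return max(0.0, 2.0 * hidden[0] + 1.5 * hidden[1] + 1.0)

eto = et0_estimate(25.0, 20.0)  # a warm, sunny day
```

Training would adjust the weight vectors and biases by backpropagation against observed ET0; only the forward pass is shown here.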

  11. Mixed-mode modelling mixing methodologies for organisational intervention

    CERN Document Server

    Clarke, Steve; Lehaney, Brian

    2001-01-01

    The 1980s and 1990s have seen a growing interest in research and practice in the use of methodologies within problem contexts characterised by a primary focus on technology, human issues, or power. During the last five to ten years, this has given rise to challenges regarding the ability of a single methodology to address all such contexts, and the consequent development of approaches which aim to mix methodologies within a single problem situation. This has been particularly so where the situation has called for a mix of technological (the so-called 'hard') and human-centred (so-called 'soft') methods. The approach developed has been termed mixed-mode modelling. The area of mixed-mode modelling is relatively new, with the phrase being coined approximately four years ago by Brian Lehaney in a keynote paper published at the 1996 Annual Conference of the UK Operational Research Society. Mixed-mode modelling, as suggested above, is a new way of considering problem situations faced by organisations. Traditional...

  12. Spectral and scattering theory for translation invariant models in quantum field theory

    DEFF Research Database (Denmark)

    Rasmussen, Morten Grud

    This thesis is concerned with a large class of massive translation invariant models in quantum field theory, including the Nelson model and the Fröhlich polaron. The models in the class describe a matter particle, e.g. a nucleon or an electron, linearly coupled to a second quantised massive scalar...... by the physically relevant choices. The translation invariance implies that the Hamiltonian may be decomposed into a direct integral over the space of total momentum where the fixed momentum fiber Hamiltonians are given by , where denotes total momentum and is the Segal field operator. The fiber Hamiltonians...

  13. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

    The methodology of system analysis allows us to draw a mathematical model of the complex technological process. The mathematical description of the plasma-chemical process was proposed. The importance of the quenching rate and of the initial temperature decrease time was confirmed for producing the maximum amount of the target product. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal hardening mode. Such models are applicable both for solving control problems and predicting future states of sophisticated technological systems.

  14. A note on the translation of conceptual data models into description logics: disjointness and covering assumptions

    CSIR Research Space (South Africa)

    Casini, G

    2012-10-01

    Full Text Available possibilities for conceptual data modeling. It also raises the question of how existing conceptual models using ER, UML or ORM could be translated into Description Logics (DLs), a family of logics that have proved to be particularly appropriate for formalizing...

  15. 3D Urban Virtual Models generation methodology for smart cities

    Directory of Open Access Journals (Sweden)

    M. Álvarez

    2018-04-01

    Full Text Available Currently the use of Urban 3D Models goes beyond mere three-dimensional visualization of our urban surroundings. Three-dimensional Urban Models are in themselves fundamental tools to manage the different phenomena that occur in smart cities. It is therefore necessary to generate realistic models, in which BIM building design information can be integrated with GIS and other space technologies. The generation of 3D Urban Models benefits from the wealth of data provided by the latest sensor technologies, such as airborne sensors, and from the existence of international standards such as CityGML. This paper presents a methodology for the development of a three-dimensional Urban Model, based on LiDAR data and the CityGML standard, applied to the city of Lorca.

  16. A methodology for overall consequence modeling in chemical industry

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2009-01-01

    Risk assessment in the chemical process industry is a very important issue for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve either time-consuming complex mathematical models or simple aggregation of losses without considering all the consequence factors. This leads to a deterioration in the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses in optimal time, to improve the decision value of the estimated risk. The losses can be broadly categorized into production loss, assets loss, human health and safety loss, and environment loss. In this paper, a conceptual framework is first developed to assess the overall consequence considering all the important components of the major losses. Secondly, a methodology is developed for the calculation of all the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction. The case study result using the proposed consequence assessment scheme is compared with that from the existing methodologies.
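
The normalization-and-weighting step described above can be sketched as follows; the loss figures, maximum credible losses and importance weights are illustrative assumptions, not values from the paper.

```python
# Each loss category is scaled to [0, 1] against an assumed maximum
# credible loss, then combined by importance weights into a single
# overall consequence score. All figures are illustrative.
losses = {"production": 2.0e6, "assets": 5.0e5,
          "human": 3.0e6, "environment": 1.0e6}     # estimated losses ($)
max_loss = {"production": 1.0e7, "assets": 5.0e6,
            "human": 1.0e7, "environment": 5.0e6}   # maximum credible losses ($)
weights = {"production": 0.2, "assets": 0.1,
           "human": 0.5, "environment": 0.2}        # relative importance

overall = sum(weights[k] * losses[k] / max_loss[k] for k in losses)
```

With these numbers the overall consequence is a dimensionless score near 0.24, comparable across scenarios regardless of the mix of loss types.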

  17. Murine models of osteosarcoma: A piece of the translational puzzle.

    Science.gov (United States)

    Walia, Mannu K; Castillo-Tandazo, Wilson; Mutsaers, Anthony J; Martin, Thomas John; Walkley, Carl R

    2018-06-01

    Osteosarcoma (OS) is the most common cancer of bone in children and young adults. Despite extensive research efforts, there has been no significant improvement in patient outcome for many years. An improved understanding of the biology of this cancer and how genes frequently mutated contribute to OS may help improve outcomes for patients. While our knowledge of the mutational burden of OS is approaching saturation, our understanding of how these mutations contribute to OS initiation and maintenance is less clear. Murine models of OS have now been demonstrated to be highly valid recapitulations of human OS. These models were originally based on the frequent disruption of p53 and Rb in familial OS syndromes, which are also common mutations in sporadic OS. They have been applied to significantly improve our understanding about the functions of recurrently mutated genes in disease. The murine models can be used as a platform for preclinical testing and identifying new therapeutic targets, in addition to testing the role of additional mutations in vivo. Most recently these models have begun to be used for discovery based approaches and screens, which hold significant promise in furthering our understanding of the genetic and therapeutic sensitivities of OS. In this review, we discuss the mouse models of OS that have been reported in the last 3-5 years and newly identified pathways from these studies. Finally, we discuss the preclinical utilization of the mouse models of OS for identifying and validating actionable targets to improve patient outcome. © 2017 Wiley Periodicals, Inc.

  18. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

    Full Text Available Subject: the paper describes the research results on validation of a rural settlement developmental model. The basic methods and approaches for solving the problem of assessment of the urban and rural settlement development efficiency are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements as well as the authors-developed method for assessing the level of agro-towns development, the systems/factors that are necessary for a rural settlement sustainable development are identified. Results: we created the rural development model which consists of five major systems that include critical factors essential for achieving a sustainable development of a settlement system: ecological system, economic system, administrative system, anthropogenic (physical system and social system (supra-structure. The methodological approaches for creating an evaluation model of rural settlements development were revealed; the basic motivating factors that provide interrelations of systems were determined; the critical factors for each subsystem were identified and substantiated. Such an approach was justified by the composition of tasks for territorial planning of the local and state administration levels. The feasibility of applying the basic Pentagon-model, which was successfully used for solving the analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  19. QEFSM model and Markov Algorithm for translating Quran reciting rules into Braille code

    Directory of Open Access Journals (Sweden)

    Abdallah M. Abualkishik

    2015-07-01

    Full Text Available The Holy Quran is the central religious verbal text of Islam. Muslims are expected to read, understand, and apply the teachings of the Holy Quran. The Holy Quran was translated to Braille code as a normal Arabic text without having its reciting rules included. It is obvious that the users of this transliteration will not be able to recite the Quran the right way. Through this work, Quran Braille Translator (QBT) presents a specific translator to translate Quran verses and their reciting rules into the Braille code. The Quran Extended Finite State Machine (QEFSM) model is proposed through this study as it is able to detect the Quran reciting rules (QRR) from the Quran text. Basis path testing was used to evaluate the inner working of the model by checking all its test cases. The Markov Algorithm (MA) was used for translating the detected QRR and Quran text into the matched Braille code. The data entries for QBT are Arabic letters and diacritics. The outputs of this study are seen in the double lines of Braille symbols; the first line is for the proposed Quran reciting rules and the second line is for the Quran script.
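
A toy sketch of the two-line output idea, assuming a hypothetical two-character reciting-rule pattern and placeholder Braille cells (the actual QEFSM states, Quran reciting rules and Arabic Braille mappings are far richer than this):

```python
# One pass over the text: a tiny state machine flags a hypothetical
# rule pattern "N" followed by "B" on the first output line, while a
# lookup table maps each character to a placeholder Braille cell on the
# second line. Everything here is an illustrative assumption.
BRAILLE = {"N": "⠝", "B": "⠃", "A": "⠁"}

def translate(text):
    rules, cells, state = [], [], "START"
    for ch in text:
        cells.append(BRAILLE.get(ch, "⠿"))      # unknown chars -> filler cell
        if state == "SAW_N" and ch == "B":
            rules[-1] = "*"                      # mark both characters
            rules.append("*")
            state = "START"
        else:
            rules.append(" ")
            state = "SAW_N" if ch == "N" else "START"
    return "".join(rules), "".join(cells)

rule_line, braille_line = translate("ANB")
```

The two returned strings align column-for-column, mirroring the "double lines of Braille symbols" the record describes.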

  20. Animal Models of Virus-Induced Neurobehavioral Sequelae: Recent Advances, Methodological Issues, and Future Prospects

    Directory of Open Access Journals (Sweden)

    Marco Bortolato

    2010-01-01

    Full Text Available Converging lines of clinical and epidemiological evidence suggest that viral infections in early developmental stages may be a causal factor in neuropsychiatric disorders such as schizophrenia, bipolar disorder, and autism-spectrum disorders. This etiological link, however, remains controversial in view of the lack of consistent and reproducible associations between viruses and mental illness. Animal models of virus-induced neurobehavioral disturbances afford powerful tools to test etiological hypotheses and explore pathophysiological mechanisms. Prenatal or neonatal inoculations of neurotropic agents (such as herpes-, influenza-, and retroviruses) in rodents result in a broad spectrum of long-term alterations reminiscent of psychiatric abnormalities. Nevertheless, the complexity of these sequelae often poses methodological and interpretational challenges and thwarts their characterization. The recent conceptual advancements in psychiatric nosology and behavioral science may help determine new heuristic criteria to enhance the translational value of these models. A particularly critical issue is the identification of intermediate phenotypes, defined as quantifiable factors representing single neurochemical, neuropsychological, or neuroanatomical aspects of a diagnostic category. In this paper, we examine how the employment of these novel concepts may lead to new methodological refinements in the study of virus-induced neurobehavioral sequelae through animal models.

  1. Current Translational Research and Murine Models For Duchenne Muscular Dystrophy

    Science.gov (United States)

    Rodrigues, Merryl; Echigoya, Yusuke; Fukada, So-ichiro; Yokota, Toshifumi

    2016-01-01

    Duchenne muscular dystrophy (DMD) is an X-linked genetic disorder characterized by progressive muscle degeneration. Mutations in the DMD gene result in the absence of dystrophin, a protein required for muscle strength and stability. Currently, there is no cure for DMD. Since murine models are relatively easy to genetically manipulate, cost effective, and easily reproducible due to their short generation time, they have helped to elucidate the pathobiology of dystrophin deficiency and to assess therapies for treating DMD. Recently, several murine models have been developed by our group and others to be more representative of the human DMD mutation types and phenotypes. For instance, mdx mice on a DBA/2 genetic background, developed by Fukada et al., have lower regenerative capacity and exhibit a very severe phenotype. Cmah-deficient mdx mice display an accelerated disease onset and severe cardiac phenotype due to differences in glycosylation between humans and mice. Other novel murine models include mdx52, which harbors a deletion mutation in exon 52, a hot spot region in humans, and dystrophin/utrophin double-deficient (dko), which displays a severe dystrophic phenotype due to the absence of utrophin, a dystrophin homolog. This paper reviews the pathological manifestations and recent therapeutic developments in murine models of DMD such as standard mdx (C57BL/10), mdx on C57BL/6 background (C57BL/6-mdx), mdx52, dystrophin/utrophin double-deficient (dko), mdxβgeo, Dmd-null, humanized DMD (hDMD), mdx on DBA/2 background (DBA/2-mdx), Cmah-mdx, and mdx/mTRKO murine models. PMID:27854202

  2. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
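
Fault trees produced by such backtracking can be evaluated mechanically. A minimal sketch, with an assumed AND/OR tree and illustrative basic-event probabilities (not from the paper):

```python
# Fault-tree evaluation: a top event expressed as AND/OR combinations
# of basic events, assumed independent. Tree shape and probabilities
# are illustrative.
def evaluate(node, prob):
    """Return the probability of the event represented by `node`."""
    kind = node[0]
    if kind == "basic":
        return prob[node[1]]
    children = [evaluate(c, prob) for c in node[1]]
    if kind == "AND":                    # P(all) = product of P(child)
        p = 1.0
        for c in children:
            p *= c
        return p
    if kind == "OR":                     # P(any) = 1 - product of (1 - P)
        q = 1.0
        for c in children:
            q *= 1.0 - c
        return 1.0 - q
    raise ValueError(kind)

# Hypothetical top event: sensor fault AND (software timeout OR stale data)
tree = ("AND", [("basic", "sensor"),
                ("OR", [("basic", "timeout"), ("basic", "stale")])])
p_top = evaluate(tree, {"sensor": 0.01, "timeout": 0.1, "stale": 0.05})
```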

  3. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  4. Domain Adaptation of Translation Models for Multilingual Applications

    Science.gov (United States)

    2009-04-01

    employed. In the past two years, domain adaptation for NLP tasks has become an active research area [3, 38, 25, 23]. New domain adaptation tasks have...and unlabeled data in the target domain and learn a mixture model to adapt from the source domain. Other NLP tasks where domain adaptation has been...evaluation forum, http://www.clef-campaign.org. [13] K. Darwish and D. Oard, CLIR experiments at Maryland for TREC-2002: Evidence combination for Arabic

  5. Testing the validity of a translated pharmaceutical therapy-related quality of life instrument, using qualitative 'think aloud' methodology.

    Science.gov (United States)

    Renberg, T; Kettis Lindblad, A; Tully, M P

    2008-06-01

    In pharmacy practice, there is a need for valid and reliable instruments to study patient-reported outcomes. One potential candidate is a pharmaceutical therapy-related quality of life (PTRQoL) instrument. This study explored the face and content validity, including cognitive aspects of question answering, of a PTRQoL instrument translated from English to Swedish. A sample of 16 customers at Swedish community pharmacies was asked to fill in the PTRQoL instrument while constantly reporting how they reasoned. The resulting interviews and concurrent probing were audio-taped, transcribed verbatim and analysed using the constant comparison method. The relation between the measurement and its theoretical underpinning was challenged. Respondents neglected to read the instructions, used response options in an unpredictable way, and varied in their interpretations of the items. The combination of 'think-aloud', retrospective probing and qualitative analysis provided information on the validity of the PTRQoL instrument and was valuable in questionnaire development. The study also identified specific problems that could be relevant for other instruments probing patients' medicines-related attitudes and behaviour.

  6. Assessing the Quality of Persian Translation of Kite Runner based on House’s (2014) Functional Pragmatic Model

    Directory of Open Access Journals (Sweden)

    Fateme Kargarzadeh

    2017-03-01

    Full Text Available Translation quality assessment is at the heart of any theory of translation. It is used in academic or teaching contexts to judge translations, to discuss their merits and demerits and to suggest solutions. However, literary translation needs more consideration in terms of quality and clarity, as it is a widely read form of translation. In this respect, the Persian literary translation of Kite Runner was taken for investigation based on House’s (2014) functional pragmatic model of translation quality assessment. To this end, around 100 pages from the beginning of both the English and Persian versions of the novel were selected and compared. Using House’s model, the profile of the source text register was created and the genre was recognized. The source text profile was compared to the translation text profile. The results were minute mismatches in field, tenor, and mode, which were accounted for as overtly erroneous expressions, and leading matches, which were accounted for as covert translation. The mismatches included some mistranslations of tenses and the selection of inappropriate meanings for lexical items. Since the informal and culture-specific terms were transferred thoroughly, the culture filter was not applied; besides, the translation was a covert one. The findings of the study have implications for translators, researchers and translator trainers.

  7. Translation from mathematical model to data driven knowledge

    OpenAIRE

    Boixareu Fiol, Margarita

    2017-01-01

    The aim is to prove whether, with the acquisition of new data samples, the knowledge we can obtain is more accurate. The project centers on the data obtained from the patient-specific calibration of a computer simulation of a human heart. Neural networks are used to understand the input-output relation of the data, and they are compared with a physical model that relates them. The input data are some parameters of the heart and the output data is the elastance of the heart (variation of pressure/ variation of volu...

  8. The companion dog as a unique translational model for aging.

    Science.gov (United States)

    Mazzatenta, Andrea; Carluccio, Augusto; Robbe, Domenico; Giulio, Camillo Di; Cellerino, Alessandro

    2017-10-01

    The dog is a unique species due to its wide variation among breeds in terms of size, morphology, behaviour and lifespan, coupled with a genetic structure that facilitates the dissection of the genetic architecture that controls these traits. Dogs and humans co-evolved and share recent evolutionary selection processes, such as adaptation to digest starch-rich diets. Many diseases of the dog have a human counterpart, and notably Alzheimer's disease, which is otherwise difficult to model in other organisms. Unlike laboratory animals, companion dogs share the human environment and lifestyle, are exposed to the same pollutants, and are faced with pathogens and infections. Dogs represented a very useful model to understand the relationship between size, insulin-like growth factor-1 genetic variation and lifespan, and have been used to test the effects of dietary restriction and immunotherapy for Alzheimer's disease. Very recently, rapamycin was tested in companion dogs outside the laboratory, and this approach where citizens are involved in research aimed at the benefit of dog welfare might become a game changer in geroscience. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Methodology and preliminary models for analyzing nuclear safeguards decisions

    International Nuclear Information System (INIS)

    1978-11-01

    This report describes a general analytical tool designed to assist the NRC in making nuclear safeguards decisions. The approach is based on decision analysis--a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material, demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria), and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant

  10. Methodology and preliminary models for analyzing nuclear-safeguards decisions

    International Nuclear Information System (INIS)

    Judd, B.R.; Weissenberger, S.

    1978-11-01

    This report describes a general analytical tool designed with Lawrence Livermore Laboratory to assist the Nuclear Regulatory Commission in making nuclear safeguards decisions. The approach is based on decision analysis - a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material; demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria); and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant

  11. Model identification methodology for fluid-based inerters

    Science.gov (United States)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor via the force-current analogy. It has the property that the force across its terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping and simplicity of design. In order to improve the understanding of the physical behaviour of this fluid-based device, especially the effects caused by the hydraulic resistance and fluid inertia in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established, which allows the topological arrangement of the mechanical networks to be obtained by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed, which separates the identification of friction, stiffness and various damping effects. Furthermore, an experimental set-up is introduced, where two pressure gauges are used to accurately measure the pressure drop across the external tube. The theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype. The sources of remaining discrepancies are further analysed.
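
The defining relation stated above, force proportional to the relative acceleration of the terminals, is easily expressed; the inertance and acceleration values below are illustrative.

```python
# Ideal (lossless) inerter: F = b * (a2 - a1), with inertance b in kg
# and terminal accelerations a1, a2 in m/s^2. Real fluid inerters add
# damping and friction terms, which the record's methodology identifies
# separately.
def inerter_force(b, a1, a2):
    """Force across an ideal inerter's terminals."""
    return b * (a2 - a1)

f = inerter_force(b=100.0, a1=0.2, a2=1.2)  # 100 kg inertance
```

The force-current analogy behind this: force maps to current and velocity to voltage, so the inerter relation F = b · dv_rel/dt mirrors the capacitor relation i = C · dV/dt.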

  12. Integrating FMEA in a Model-Driven Methodology

    Science.gov (United States)

    Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno

    2016-08-01

    Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system, by integration into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models, by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.

  13. Modeling collective animal behavior with a cognitive perspective: a methodological framework.

    Directory of Open Access Journals (Sweden)

    Sebastian Weitz

    Full Text Available The last decades have seen an increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible, individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on the objective of establishing or enriching links between collective behavior research and cognitive or physiological research, which then requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: they are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate two of these models even in the limit of an infinite amount of data whatever the accuracy level. A set of methodological steps is then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, conserving group-level data for the overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. 
Illustrative examples are provided of the

  14. Anatomy and bronchoscopy of the porcine lung. A model for translational respiratory medicine.

    LENUS (Irish Health Repository)

    Judge, Eoin P

    2014-09-01

    The porcine model has contributed significantly to biomedical research over many decades. The similar size and anatomy of pig and human organs make this model particularly beneficial for translational research in areas such as medical device development, therapeutics and xenotransplantation. In recent years, a major limitation with the porcine model was overcome with the successful generation of gene-targeted pigs and the publication of the pig genome. As a result, the role of this model is likely to become even more important. For the respiratory medicine field, the similarities between pig and human lungs give the porcine model particular potential for advancing translational medicine. An increasing number of lung conditions are being studied and modeled in the pig. Genetically modified porcine models of cystic fibrosis have been generated that, unlike mouse models, develop lung disease similar to human cystic fibrosis. However, the scientific literature relating specifically to porcine lung anatomy and airway histology is limited and is largely restricted to veterinary literature and textbooks. Furthermore, methods for in vivo lung procedures in the pig are rarely described. The aims of this review are to collate the disparate literature on porcine lung anatomy, histology, and microbiology; to provide a comparison with the human lung; and to describe appropriate bronchoscopy procedures for the pig lungs to aid clinical researchers working in the area of translational respiratory medicine using the porcine model.

  15. OvidSP Medline-to-PubMed search filter translation: a methodology for extending search filter range to include PubMed's unique content.

    Science.gov (United States)

    Damarell, Raechel A; Tieman, Jennifer J; Sladek, Ruth M

    2013-07-02

    PubMed translations of OvidSP Medline search filters offer searchers improved ease of access. They may also facilitate access to PubMed's unique content, including citations for the most recently published biomedical evidence. Retrieving this content requires a search strategy comprising natural language terms ('textwords'), rather than Medical Subject Headings (MeSH). We describe a reproducible methodology that uses a validated PubMed search filter translation to create a textword-only strategy to extend retrieval to PubMed's unique heart failure literature. We translated an OvidSP Medline heart failure search filter for PubMed and established version equivalence in terms of indexed literature retrieval. The PubMed version was then run within PubMed to identify citations retrieved by the filter's MeSH terms (Heart failure, Left ventricular dysfunction, and Cardiomyopathy). It was then rerun with the same MeSH terms restricted to searching on title and abstract fields (i.e. as 'textwords'). Citations retrieved by the MeSH search but not the textword search were isolated. Frequency analysis of their titles/abstracts identified natural language alternatives for those MeSH terms that performed less effectively as textwords. These terms were tested in combination to determine the best performing search string for reclaiming this 'lost set'. This string, restricted to searching on PubMed's unique content, was then combined with the validated PubMed translation to extend the filter's performance in this database. The PubMed heart failure filter retrieved 6829 citations. Of these, 834 (12%) failed to be retrieved when MeSH terms were converted to textwords. Frequency analysis of the 834 citations identified five high frequency natural language alternatives that could improve retrieval of this set (cardiac failure, cardiac resynchronization, left ventricular systolic dysfunction, left ventricular diastolic dysfunction, and LV dysfunction). Together these terms reclaimed
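The frequency-analysis step described above can be sketched in a few lines. The mini "lost set" below is invented for illustration (it is not the study's data), and the count threshold is an arbitrary assumption:

```python
from collections import Counter
import re

# Hypothetical titles from the 'lost set': citations retrieved by the MeSH
# search but missed when the same terms were searched as textwords.
lost_set = [
    "Outcomes of cardiac resynchronization therapy in cardiac failure",
    "LV dysfunction after myocardial infarction: a cohort study",
    "Left ventricular systolic dysfunction and exercise capacity",
]

def term_frequencies(docs, ngram_max=4):
    """Count word n-grams (1..ngram_max) across all documents."""
    counts = Counter()
    for doc in docs:
        words = re.findall(r"[a-z]+", doc.lower())
        for n in range(1, ngram_max + 1):
            for i in range(len(words) - n + 1):
                counts[" ".join(words[i:i + n])] += 1
    return counts

freqs = term_frequencies(lost_set)
# High-frequency phrases become candidate natural-language alternatives
# to test (in combination) for reclaiming the lost set.
candidates = [t for t, c in freqs.most_common() if c >= 2 and len(t) > 2]
```

In practice the candidates would then be combined into a search string and rerun against PubMed's unique content, as the abstract describes.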

  16. Recent Advances in Translational Magnetic Resonance Imaging in Animal Models of Stress and Depression.

    Science.gov (United States)

    McIntosh, Allison L; Gormley, Shane; Tozzi, Leonardo; Frodl, Thomas; Harkin, Andrew

    2017-01-01

    Magnetic resonance imaging (MRI) is a valuable translational tool that can be used to investigate alterations in brain structure and function in both patients and animal models of disease. Regional changes in brain structure, functional connectivity, and metabolite concentrations have been reported in depressed patients, giving insight into the networks and brain regions involved; however, preclinical models are less well characterized. The development of more effective treatments depends upon animal models that best translate to the human condition, and animal models may be exploited to assess the molecular and cellular alterations that accompany neuroimaging changes. Recent advances in preclinical imaging have facilitated significant developments within the field, particularly relating to high-resolution structural imaging and resting-state functional imaging, which are emerging techniques in clinical research. This review aims to bring together the current literature on preclinical neuroimaging in animal models of stress and depression, highlighting promising avenues of research toward understanding the pathological basis of this hugely prevalent disorder.

  17. Methodological Aspects of Modelling and Simulation of Robotized Workstations

    Directory of Open Access Journals (Sweden)

    Naqib Daneshjo

    2018-05-01

    Full Text Available From the point of view of developing application and program products, the key directions that need to be respected in computer support for project activities are quite clearly specified. User interfaces with a high degree of graphical interactive convenience and two- and three-dimensional computer graphics contribute greatly to streamlining project methodologies and procedures in particular. This is mainly because many of the tasks solved in the modern design of robotic systems are inherently graphical. Automating such graphical tasks is therefore a significant development direction for the field. The authors present results of their research in the area of automation and computer-aided design of robotized systems. A new methodical approach to modelling robotic workstations, consisting of ten steps incorporated into the four phases of the logistics process of creating and implementing a robotic workplace, is presented. The emphasis is placed on the modelling and simulation phase, with verification of the elaborated methodologies on specific projects or elements of a robotized welding plant in automotive production.

  18. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of the leak path factor (LPF), the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and upon other pathways from the building, such as doorways, both open and closed. This study shows how the multiple LPFs from the building interior can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). The study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.
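The combinatory evaluation of leak path factors can be sketched generically: per-pathway transfer fractions multiply along a serial release path, and parallel paths sum. The compartment values below are purely illustrative assumptions, not results from the MELCOR study:

```python
# Each release path is a sequence of compartment-to-compartment transfer
# fractions; the total LPF sums the products over all parallel paths.
def path_lpf(fractions):
    lpf = 1.0
    for f in fractions:
        lpf *= f
    return lpf

def total_lpf(paths):
    return sum(path_lpf(p) for p in paths)

# The assumed standard 0.5 x 0.5 corresponds to a single two-hop path:
standard = total_lpf([[0.5, 0.5]])  # 0.25

# A facility-specific estimate might instead combine a filtered ventilation
# path with an unfiltered doorway path (values invented for illustration):
facility = total_lpf([
    [0.4, 0.01],   # room -> ventilation duct -> filter penetration
    [0.1, 0.3],    # room -> corridor -> open doorway
])
```

The total LPF then scales the material at risk to give the respirable source term for downstream consequence analysis.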

  19. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    Science.gov (United States)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full-scale. However in some applications, we seek to obtain enhanced performance at the low range, therefore expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent of reading accuracy requirement, which has broad application in all types of transducer applications where low range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System that employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
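The core idea can be illustrated with a minimal sketch (synthetic data, not the Mars Entry Atmospheric Data System calibration): weighting the regression by the inverse square of the reading makes the fit minimize percent-of-reading rather than percent-of-full-scale error.

```python
import numpy as np

# Synthetic calibration data: applied load x, instrument output y (assumed).
x = np.array([1.0, 2.0, 5.0, 10.0, 50.0, 100.0])
y = 0.98 * x + 0.05  # pretend true sensitivity 0.98 with a small offset

# Ordinary least squares treats all residuals equally (percent of full
# scale); weights w = 1/y^2 instead minimize relative residuals, boosting
# low-range performance.
w = 1.0 / y**2
A = np.vstack([x, np.ones_like(x)]).T
W = np.diag(w)
coef = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)  # weighted normal equations
slope, intercept = coef
```

With noisy data the two weightings give visibly different fits at the low end of the range, which is the motivation the abstract describes.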

  20. Language Model Adaptation Using Machine-Translated Text for Resource-Deficient Languages

    Directory of Open Access Journals (Sweden)

    Sadaoki Furui

    2009-01-01

    Full Text Available Text corpus size is an important issue when building a language model (LM). This is a particularly important issue for languages where little data is available. This paper introduces an LM adaptation technique to improve an LM built using a small amount of task-dependent text with the help of a machine-translated text corpus. Icelandic speech recognition experiments were performed using data machine translated (MT) from English to Icelandic on a word-by-word and sentence-by-sentence basis. LM interpolation using the baseline LM and an LM built from either word-by-word or sentence-by-sentence translated text reduced the word error rate significantly when the manually obtained utterances used as a baseline were very sparse.
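The interpolation itself is linear mixing of the two models' probabilities. A hedged unigram sketch (toy corpora and a hypothetical weight λ; real systems use n-gram models and tune λ on held-out data):

```python
from collections import Counter

def unigram_lm(corpus):
    """Maximum-likelihood unigram model from a whitespace-tokenized corpus."""
    counts = Counter(corpus.split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

baseline_lm = unigram_lm("góðan daginn takk fyrir daginn")      # sparse in-domain text
translated_lm = unigram_lm("takk fyrir hjálpina góðan daginn")  # MT-derived text

lam = 0.6  # interpolation weight (assumed; tuned on held-out data in practice)

def interp_prob(word, floor=1e-6):
    # P(w) = lam * P_baseline(w) + (1 - lam) * P_translated(w)
    return lam * baseline_lm.get(word, floor) + (1 - lam) * translated_lm.get(word, floor)
```

Words unseen in the sparse baseline (here "hjálpina") still receive usable probability mass from the translated-text model, which is the mechanism behind the reported error-rate reduction.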

  1. A ROADMAP FOR GENERATING SEMANTICALLY ENRICHED BUILDING MODELS ACCORDING TO CITYGML MODEL VIA TWO DIFFERENT METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    G. Floros

    2016-10-01

    Full Text Available Methodologies for 3D modeling have multiplied due to the rapid advances of new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models, but also on their semantic features during the modeling procedure. As a result, the models thus generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow for the immediate conversion of the model’s format, via semi-automatic procedures, with respect to the user’s scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second includes the model’s generation in CityEngine and its transformation to the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two methodologies are compared and specific characteristics are evaluated, in order to infer which methodology is best applied depending on the purposes of different projects.

  2. MoPCoM Methodology: Focus on Models of Computation

    Science.gov (United States)

    Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent

    Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we have presented our Model Based Engineering methodology addressing those issues. In this paper, we make a focus on Models of Computation design and analysis. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).

  3. Building Modelling Methodologies for Virtual District Heating and Cooling Networks

    Energy Technology Data Exchange (ETDEWEB)

    Saurav, Kumar; Choudhury, Anamitra R.; Chandan, Vikas; Lingman, Peter; Linder, Nicklas

    2017-10-26

    District heating and cooling (DHC) systems are a proven energy solution that has been deployed for many years in a growing number of urban areas worldwide. They comprise a variety of technologies that seek to develop synergies between the production and supply of heat, cooling, domestic hot water and electricity. Although the benefits of DHC systems are significant and have been widely acclaimed, the full potential of modern DHC systems remains largely untapped. There are several opportunities for the development of energy-efficient DHC systems, which will enable the effective exploitation of alternative renewable resources, waste heat recovery, etc., in order to increase the overall efficiency and facilitate the transition towards the next generation of DHC systems. This motivates the need for modelling these complex systems. Large-scale modelling of DHC networks is challenging, as they have several components interacting with each other. In this paper we present two methodologies for modelling consumer buildings. These models will be further integrated with the network model and the control system layer to create a virtual test bed for the entire DHC system. The model is validated using data collected from a real-life DHC system located at Lulea, a city on the coast of northern Sweden. The test bed will then be used to simulate various test cases such as peak energy reduction and overall demand reduction.
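A common building-level abstraction in such test beds is a lumped resistance-capacitance (RC) thermal model. The sketch below integrates a single-zone 1R1C model with explicit Euler; all parameter values are invented for illustration and are not taken from the Lulea system:

```python
# 1R1C zone model: C * dT/dt = (T_out - T) / R + Q_heat
def simulate_zone(T0, T_out, Q_heat, R=0.005, C=1e7, dt=60.0, steps=60):
    """Return the indoor temperature trace.

    R: thermal resistance to outdoors [K/W], C: thermal capacitance [J/K],
    dt: time step [s]. Values are assumptions for illustration only.
    """
    T = T0
    trace = [T]
    for _ in range(steps):
        dT = ((T_out - T) / R + Q_heat) / C * dt
        T += dT
        trace.append(T)
    return trace

# One hour of a heated zone on a cold day; heat supply slightly undersized,
# so the indoor temperature drifts toward its steady state of 15 degC.
trace = simulate_zone(T0=20.0, T_out=-5.0, Q_heat=4000.0)
```

Per-building models of this form can then be coupled to a network and control layer, as the abstract describes for the virtual test bed.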

  4. Early-life stress origins of gastrointestinal disease: animal models, intestinal pathophysiology, and translational implications.

    Science.gov (United States)

    Pohl, Calvin S; Medland, Julia E; Moeser, Adam J

    2015-12-15

    Early-life stress and adversity are major risk factors in the onset and severity of gastrointestinal (GI) disease in humans later in life. The mechanisms by which early-life stress leads to increased GI disease susceptibility in adult life remain poorly understood. Animal models of early-life stress have provided a foundation from which to gain a more fundamental understanding of this important GI disease paradigm. This review focuses on animal models of early-life stress-induced GI disease, with a specific emphasis on translational aspects of each model to specific human GI disease states. Early postnatal development of major GI systems and the consequences of stress on their development are discussed in detail. Relevant translational differences between species and models are highlighted. Copyright © 2015 the American Physiological Society.

  5. Early-life stress origins of gastrointestinal disease: animal models, intestinal pathophysiology, and translational implications

    Science.gov (United States)

    Pohl, Calvin S.; Medland, Julia E.

    2015-01-01

    Early-life stress and adversity are major risk factors in the onset and severity of gastrointestinal (GI) disease in humans later in life. The mechanisms by which early-life stress leads to increased GI disease susceptibility in adult life remain poorly understood. Animal models of early-life stress have provided a foundation from which to gain a more fundamental understanding of this important GI disease paradigm. This review focuses on animal models of early-life stress-induced GI disease, with a specific emphasis on translational aspects of each model to specific human GI disease states. Early postnatal development of major GI systems and the consequences of stress on their development are discussed in detail. Relevant translational differences between species and models are highlighted. PMID:26451004

  6. An Investigation of Pun Translatability in English Translations of Sa'di's Ghazals Based on Delabastita's Proposed Model

    Science.gov (United States)

    Koochacki, Fahime

    2016-01-01

    The rich cultural connotations behind puns and the distinctive features of the puns' form, sound and meanings pose great challenges to the translator. Furthermore, given puns' non-negligible effects in Persian literary texts, it has been the aim of the present study to analyze and measure how puns in Sa'di's Ghazals have actually been treated in…

  7. Modeling methodology for a CMOS-MEMS electrostatic comb

    Science.gov (United States)

    Iyer, Sitaraman V.; Lakdawala, Hasnain; Mukherjee, Tamal; Fedder, Gary K.

    2002-04-01

    A methodology for combined modeling of capacitance and force in a multi-layer electrostatic comb is demonstrated in this paper. Conformal mapping-based analytical methods are limited to 2D symmetric cross-sections and cannot account for charge concentration effects at corners. Vertex capacitance can be more than 30% of the total capacitance in a single-layer, 2 micrometer thick comb with 10 micrometers of overlap. Furthermore, analytical equations are strictly valid only for perfectly symmetrical finger positions. Fringing and corner effects are likely to be more significant in a multi-layered CMOS-MEMS comb because of the presence of more edges and vertices. Vertical curling of CMOS-MEMS comb fingers may also lead to reduced capacitance and vertical forces. Gyroscopes are particularly sensitive to such undesirable forces, which therefore need to be well quantified. In order to address the above issues, a hybrid approach of superposing linear regression models over a set of core analytical models is implemented. Design of experiments is used to obtain data for capacitance and force using a commercial 3D boundary-element solver. Since accurate force values require significantly higher mesh refinement than accurate capacitance values, we use numerical derivatives of the capacitance values to compute the forces. The model is formulated such that the capacitance and force models use the same regression coefficients. The comb model thus obtained fits the numerical capacitance data to within +/- 3% and the force data to within +/- 10%. The model is experimentally verified by measuring the capacitance change in a specially designed test structure. The capacitance model matches measurements to within 10%. The comb model is implemented in an analog hardware description language (AHDL) for use in behavioral simulation of manufacturing variations in a CMOS-MEMS gyroscope.
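The force-from-capacitance step generalizes beyond this paper: the electrostatic force is F = ½ V² dC/dx, which can be approximated by a central difference of any capacitance model. The sketch below uses a toy comb-overlap capacitance (an assumption standing in for the paper's analytical-plus-regression model):

```python
# Electrostatic force from a capacitance model via central difference.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(x, n_fingers=50, thickness=2e-6, gap=2e-6):
    """Toy comb capacitance: two gaps per finger, linear in overlap x (m)."""
    return 2 * n_fingers * EPS0 * thickness * x / gap

def force(x, volts, dx=1e-9):
    """F = 0.5 * V^2 * dC/dx, with dC/dx from a central difference."""
    dC_dx = (capacitance(x + dx) - capacitance(x - dx)) / (2 * dx)
    return 0.5 * volts**2 * dC_dx

F = force(x=10e-6, volts=10.0)  # force at 10 um overlap, 10 V bias
```

Because differentiation amplifies noise, the paper's choice to differentiate a smooth fitted capacitance model, rather than raw solver forces, is the cheaper and more robust route.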

  8. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

    Full Text Available In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world introduced new methodological principles to city’s spatial development models. These methodologies and spatial planning guidelines are focused not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city by defining main development criteria and estimating possible consequences. Vilnius city is no exception, however the re-establishment of independence of Lithuania caused uncontrolled urbanization process, so most of the city development regulations emerged as a consequence of unmanaged processes of investors’ expectations legalization. The importance of consistent urban fabric as well as conservation and representation of city’s most important objects gained attention only when an actual threat of overshadowing them with new architecture along with unmanaged urbanization in the city center or urban sprawl at suburbia, caused by land-use projects, had emerged. Current Vilnius’ spatial planning documents clearly define urban structure and key development principles, however the definitions are relatively abstract, causing uniform building coverage requirements for territories with distinct qualities and simplifying planar designs which do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts, principles, the criteria for quality assessment and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development. The article contains a compendium of requirements for high-quality spatial planning and building design.

  9. A Transparent Translation from Legacy System Model into Common Information Model: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Fei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Simpson, Jeffrey [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhang, Yingchen [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-27

    Advances in the smart grid are forcing utilities towards better monitoring, control and analysis of distribution systems, and require extensive cyber-based intelligent systems and applications to realize various functionalities. The ability of systems, or components within systems, to interact and exchange services or information with each other is the key to the success of smart grid technologies, and it requires an efficient information exchange and data sharing infrastructure. The Common Information Model (CIM) is a standard that allows different applications to exchange information about an electrical system, and it has become a widely accepted solution for information exchange among different platforms and applications. However, most existing legacy systems were not developed using CIM, but using their own languages. Integrating such legacy systems is a challenge for utilities, and the appropriate utilization of the integrated legacy systems is even more intricate. Thus, this paper develops an approach and an open-source tool to translate legacy system models into the CIM format. The developed tool is tested on a commercial distribution management system, and simulation results prove its effectiveness.
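The translation idea can be sketched as mapping a legacy record onto CIM-style RDF/XML. Both the legacy record layout and the namespace/class names below are simplified assumptions for illustration; a real CIM profile (per IEC 61970) is far richer:

```python
import xml.etree.ElementTree as ET

# Hypothetical legacy line record: name, end buses, resistance in ohms.
legacy_line = {"name": "L12", "from": "Bus1", "to": "Bus2", "r": 0.12}

# Namespace URIs are illustrative stand-ins for the CIM profile in use.
CIM = "http://iec.ch/TC57/CIM100#"
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
ET.register_namespace("cim", CIM)
ET.register_namespace("rdf", RDF)

root = ET.Element(f"{{{RDF}}}RDF")
seg = ET.SubElement(root, f"{{{CIM}}}ACLineSegment",
                    {f"{{{RDF}}}ID": legacy_line["name"]})
ET.SubElement(seg, f"{{{CIM}}}IdentifiedObject.name").text = legacy_line["name"]
ET.SubElement(seg, f"{{{CIM}}}ACLineSegment.r").text = str(legacy_line["r"])

xml_out = ET.tostring(root, encoding="unicode")
```

The bulk of the real work in such a tool lies in the semantic mapping tables from each legacy schema to CIM classes and in preserving topology (terminals, connectivity nodes), which this fragment omits.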

  10. A case study in data audit and modelling methodology. Australia

    Energy Technology Data Exchange (ETDEWEB)

    Apelbaum, John [Apelbaum Consulting Group, 750 Blackburn Road, Melbourne VIC 3170 (Australia)

    2009-10-15

    The purpose of the paper is to outline a rigorous, spatially consistent and cost-effective transport planning tool that projects travel demand, energy and emissions for all modes associated with domestic and international transport. The planning tool (AuseTran) is a multi-modal, multi-fuel and multi-regional macroeconomic and demographic-based computational model of the Australian transport sector that overcomes some of the gaps associated with existing strategic level transport emission models. The paper also identifies a number of key data issues that need to be resolved prior to model development with particular reference to the Australian environment. The strategic model structure endogenously derives transport demand, energy and emissions by jurisdiction, vehicle type, emission type and transport service for both freight and passenger transport. Importantly, the analytical framework delineates the national transport task, energy consumed and emissions according to region, state/territory of origin and jurisdictional protocols, provides an audit mechanism for the evaluation of the methodological framework, integrates a mathematical protocol to derive time series FFC emission factors and allows for the impact of non-registered road vehicles on transport, fuel and emissions. (author)

  11. Translating Institutional Templates: A Historical Account of the Consequences of Importing Policing Models into Argentina

    Directory of Open Access Journals (Sweden)

    Matías Dewey

    2017-01-01

    Full Text Available This article focuses on the translation of the French and English law enforcement models into Argentina and analyzes its consequences in terms of social order. Whereas in the former two models the judiciary and police institutions originated in large-scale processes of historical consolidation, in the latter these institutions were implanted without the antecedents present in their countries of origin. The empirical references are Argentine police institutions, particularly the police of the Buenos Aires Province, observed at two moments in which the institutional import was particularly intense: towards the end of the nineteenth and beginning of the twentieth centuries, and at the end of the twentieth century. By way of tracing these processes of police constitution and reform, we show how new models of law enforcement and policing interacted with indigenous political structures and cultural frames, as well as how this constellation produced a social order in which legality and illegality are closely interwoven. The article is an attempt to go beyond the common observations regarding how an imported model failed; instead, it dissects the effects the translation actually produced and how the translated models transform into resources that reshape the new social order. A crucial element, the article shows, is that these resources can be instrumentalized according to »idiosyncrasies«, interests, and quotas of power.

  12. Respiratory nanoparticle-based vaccines and challenges associated with animal models and translation.

    Science.gov (United States)

    Renukaradhya, Gourapura J; Narasimhan, Balaji; Mallapragada, Surya K

    2015-12-10

    Vaccine development has had a huge impact on human health. However, there is a significant need to develop efficacious vaccines for several existing as well as emerging respiratory infectious diseases. Several challenges need to be overcome to develop efficacious vaccines with translational potential. This review focuses on two aspects to overcome some barriers - 1) the development of nanoparticle-based vaccines, and 2) the choice of suitable animal models for respiratory infectious diseases that will allow for translation. Nanoparticle-based vaccines, including subunit vaccines involving synthetic and/or natural polymeric adjuvants and carriers, as well as those based on virus-like particles offer several key advantages to help overcome the barriers to effective vaccine development. These include the ability to deliver combinations of antigens, target the vaccine formulation to specific immune cells, enable cross-protection against divergent strains, act as adjuvants or immunomodulators, allow for sustained release of antigen, enable single dose delivery, and potentially obviate the cold chain. While mouse models have provided several important insights into the mechanisms of infectious diseases, they are often a limiting step in translation of new vaccines to the clinic. An overview of different animal models involved in vaccine research for respiratory infections, with advantages and disadvantages of each model, is discussed. Taken together, advances in nanotechnology, combined with the right animal models for evaluating vaccine efficacy, has the potential to revolutionize vaccine development for respiratory infections. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Exaggerated Cap-Dependent Translation as a Mechanism for Corticostriatal Dysfunction in Fragile X Syndrome Model Mice

    Science.gov (United States)

    2017-11-01

    AWARD NUMBER: W81XWH-15-1-0361. TITLE: “Exaggerated Cap-Dependent Translation as a Mechanism for Corticostriatal Dysfunction in Fragile X Syndrome Model Mice”. REPORT TYPE: Annual. DATES COVERED: 19 Oct 2016 - 18 Oct 2017. [Only fragments of the abstract survive extraction from the report form:] “... translation inhibitors. Our specific tasks are centered on a proteomic study of FXS striatal synapses by using a transgenic mouse model that allows to ...”

  14. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer-driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors, including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system, and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of the synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
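The reachability check described above reduces to graph search over the system's state space. The toy state graph and disruption semantics below are invented for illustration:

```python
from collections import deque

# Toy state graph: nodes are supply chain states, edges feasible transitions.
transitions = {
    "normal": ["supplier_down", "shipping"],
    "supplier_down": ["recovered"],
    "shipping": ["delivered"],
    "recovered": ["shipping"],
}

def reachable(start, goal, graph):
    """Breadth-first search: is `goal` reachable from `start`?"""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:
            return True
        for nxt in graph.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

ok = reachable("normal", "delivered", transitions)

# Model a disruption by severing the recovery transition: from
# 'supplier_down' the goal state 'delivered' becomes unreachable.
disrupted = {k: v for k, v in transitions.items() if k != "recovered"}
stuck = reachable("supplier_down", "delivered", disrupted)
```

On the full hierarchical model the same question is asked of far larger state spaces, but the principle (can the system still reach a desired operating state after a disruption?) is identical.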

  15. Modeling timelines for translational science in cancer; the impact of technological maturation.

    Directory of Open Access Journals (Sweden)

    Laura M McNamee

    Full Text Available This work examines translational science in cancer based on theories of innovation that posit a relationship between the maturation of technologies and their capacity to generate successful products. We examined the growth of technologies associated with 138 anticancer drugs using an analytical model that identifies the point of initiation of exponential growth and the point at which growth slows as the technology becomes established. Approval of targeted and biological products corresponded with technological maturation, with first approval averaging 14 years after the established point and 44 years after initiation of associated technologies. The lag in cancer drug approvals after the increases in cancer funding and dramatic scientific advances of the 1970s thus reflects predictable timelines of technology maturation. Analytical models of technological maturation may be used for technological forecasting to guide more efficient translation of scientific discoveries into cures.
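The initiation and established points can be read off a fitted growth curve. As a hedged sketch, assume a logistic model of cumulative output with invented parameters, and define the two points as 10% and 90% of saturation (one plausible convention, not necessarily the authors' exact definition):

```python
import math

# Logistic model of cumulative publications: N(t) = K / (1 + exp(-r (t - t0)))
K, r, t0 = 1000.0, 0.25, 1975.0  # assumed fitted values, for illustration

def n_pubs(t):
    return K / (1 + math.exp(-r * (t - t0)))

# Solving N(t) = f * K for t gives t = t0 + ln(f / (1 - f)) / r.
def point_at_fraction(f):
    return t0 + math.log(f / (1 - f)) / r

initiation = point_at_fraction(0.10)   # exponential growth takes off
established = point_at_fraction(0.90)  # growth slows; technology established
```

With these numbers the established point lands in the mid-1980s, and adding the roughly 14-year average lag reported above would place first drug approvals decades after the initiation of the underlying technology, matching the paper's argument.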

  16. Modeling timelines for translational science in cancer; the impact of technological maturation

    OpenAIRE

    McNamee, Laura M.; Ledley, Fred D.

    2017-01-01

    This work examines translational science in cancer based on theories of innovation that posit a relationship between the maturation of technologies and their capacity to generate successful products. We examined the growth of technologies associated with 138 anticancer drugs using an analytical model that identifies the point of initiation of exponential growth and the point at which growth slows as the technology becomes established. Approval of targeted and biological products corresponded ...

  17. SR-Site groundwater flow modelling methodology, setup and results

    International Nuclear Information System (INIS)

    Selroos, Jan-Olof; Follin, Sven

    2010-12-01

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed: the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in an SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  18. Development of a General Modelling Methodology for Vacuum Residue Hydroconversion

    Directory of Open Access Journals (Sweden)

    Pereira de Oliveira L.

    2013-11-01

    This work concerns the development of a methodology for kinetic modelling of refining processes, and more specifically for vacuum residue conversion. The proposed approach makes it possible to overcome the lack of molecular detail of the petroleum fractions and to simulate the transformation of the feedstock molecules into effluent molecules by means of a two-step procedure. In the first step, a synthetic mixture of molecules representing the feedstock for the process is generated via a molecular reconstruction method, termed SR-REM molecular reconstruction. In the second step, a kinetic Monte-Carlo method (kMC) is used to simulate the conversion reactions on this mixture of molecules. The molecular reconstruction was applied to several petroleum residues and is illustrated for an Athabasca (Canada) vacuum residue. The kinetic Monte-Carlo method is then described in detail. In order to validate this stochastic approach, a lumped deterministic model for vacuum residue conversion was simulated using Gillespie’s Stochastic Simulation Algorithm. Despite the fact that both approaches are based on very different hypotheses, the stochastic simulation algorithm simulates the conversion reactions with the same accuracy as the deterministic approach. The full-scale stochastic simulation approach using molecular-level reaction pathways provides high amounts of detail on the effluent composition and is briefly illustrated for Athabasca VR hydrocracking.
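Gillespie's Stochastic Simulation Algorithm, used here to validate the kMC approach against a lumped deterministic model, can be sketched for a toy lumped scheme (vacuum residue to distillate to gas). The species and rate constants below are invented for illustration, not the paper's values.

```python
import math, random

def gillespie(state, reactions, t_end, seed=1):
    """Gillespie's direct method: `reactions` is a list of
    (propensity_fn, state_update_fn) pairs."""
    rng = random.Random(seed)
    t = 0.0
    while True:
        props = [a(state) for a, _ in reactions]
        total = sum(props)
        if total == 0.0:
            break
        t += -math.log(rng.random()) / total   # exponential waiting time
        if t > t_end:
            break
        r = rng.random() * total               # choose reaction by propensity
        acc = 0.0
        for p, (_, update) in zip(props, reactions):
            acc += p
            if r < acc:
                update(state)
                break
    return state

# Toy lumped scheme with invented rate constants:
#   VR --k1--> distillate,  distillate --k2--> gas
k1, k2 = 0.5, 0.1
state = {"VR": 1000, "dist": 0, "gas": 0}

def shift(s, frm, to):
    s[frm] -= 1
    s[to] += 1

reactions = [
    (lambda s: k1 * s["VR"],   lambda s: shift(s, "VR", "dist")),
    (lambda s: k2 * s["dist"], lambda s: shift(s, "dist", "gas")),
]
final = gillespie(state, reactions, t_end=20.0)
print(final)
```

For large molecule counts the trajectory tracks the deterministic rate equations, which is exactly the comparison the authors use for validation.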

  19. SR-Site groundwater flow modelling methodology, setup and results

    Energy Technology Data Exchange (ETDEWEB)

    Selroos, Jan-Olof (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Follin, Sven (SF GeoLogic AB, Taeby (Sweden))

    2010-12-15

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed: the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in an SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  20. Modeling and control of lateral vibration of an axially translating flexible link

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Heon Seop; Rhim, Sung Soo [Kyung Hee University, Yongin (Korea, Republic of)

    2015-01-15

    Manipulators used for the transportation of large panel-shape payloads often adopt long and slender links (or forks) with translational joints to carry the payloads. As the size of the payload increases, the length of the links also increases to hold the payload securely. The increased length of the link inevitably amplifies the effect of the flexure in the link. Intuitively, the translational motion of the link in its longitudinal direction should have no effect on the lateral vibration of the link because of the orthogonality between the direction of the translational motion and the lateral vibration. If, however, the link was flexible and translated horizontally (perpendicular to the gravitational field), the asymmetric deflection of the link caused by gravity would break the orthogonality between the two directions, and the longitudinal motion of the link would excite lateral motion in the link. In this paper, the lateral oscillatory motion of the flexible link in a large-scale solar cell panel handling robot is investigated, where the links carry the panel in its longitudinal direction. The Newtonian approach in conjunction with the assumed modes method is used for derivation of the equation of motion for the flexible forks, where non-zero control force is applied at the base of the link. The analysis illustrates the effect of longitudinal motion on the lateral vibration and the dynamic stiffening effect (variation of the natural frequency) of the link due to the translational velocity. Lateral vibration behavior is simulated using the derived equations of motion. A robust vibration control scheme, the input shaping filter technique, is implemented on the model and the effectiveness of the scheme is verified numerically.
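The input shaping filter mentioned above is commonly realized as a Zero-Vibration (ZV) shaper: the motion command is convolved with two impulses spaced half a damped period apart so that the residual vibration of the targeted mode cancels. A minimal sketch, with an assumed link frequency and damping ratio (not the paper's values):

```python
import math

def zv_shaper(wn, zeta):
    """Two-impulse Zero-Vibration (ZV) input shaper for a mode with
    natural frequency wn (rad/s) and damping ratio zeta.
    Returns impulse amplitudes and times; convolving the motion command
    with these impulses cancels the mode's residual vibration."""
    wd = wn * math.sqrt(1.0 - zeta**2)                     # damped frequency
    K = math.exp(-zeta * math.pi / math.sqrt(1.0 - zeta**2))
    amplitudes = [1.0 / (1.0 + K), K / (1.0 + K)]          # sum to 1
    times = [0.0, math.pi / wd]                            # half damped period
    return amplitudes, times

# Assumed (illustrative) first lateral mode of the link:
A, T = zv_shaper(wn=12.0, zeta=0.02)
print(A, T)
```

The price of shaping is a command delay of half a damped period; robust variants (ZVD, EI) add impulses to tolerate errors in the modeled frequency, which matters here since the abstract notes the natural frequency varies with translational velocity.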

  1. Modeling and control of lateral vibration of an axially translating flexible link

    International Nuclear Information System (INIS)

    Shin, Heon Seop; Rhim, Sung Soo

    2015-01-01

    Manipulators used for the transportation of large panel-shape payloads often adopt long and slender links (or forks) with translational joints to carry the payloads. As the size of the payload increases, the length of the links also increases to hold the payload securely. The increased length of the link inevitably amplifies the effect of the flexure in the link. Intuitively, the translational motion of the link in its longitudinal direction should have no effect on the lateral vibration of the link because of the orthogonality between the direction of the translational motion and the lateral vibration. If, however, the link was flexible and translated horizontally (perpendicular to the gravitational field), the asymmetric deflection of the link caused by gravity would break the orthogonality between the two directions, and the longitudinal motion of the link would excite lateral motion in the link. In this paper, the lateral oscillatory motion of the flexible link in a large-scale solar cell panel handling robot is investigated, where the links carry the panel in its longitudinal direction. The Newtonian approach in conjunction with the assumed modes method is used for derivation of the equation of motion for the flexible forks, where non-zero control force is applied at the base of the link. The analysis illustrates the effect of longitudinal motion on the lateral vibration and the dynamic stiffening effect (variation of the natural frequency) of the link due to the translational velocity. Lateral vibration behavior is simulated using the derived equations of motion. A robust vibration control scheme, the input shaping filter technique, is implemented on the model and the effectiveness of the scheme is verified numerically.

  2. Translation of overlay models of student knowledge for relative domains based on domain ontology mapping

    DEFF Research Database (Denmark)

    Sosnovsky, Sergey; Dolog, Peter; Henze, Nicola

    2007-01-01

    The effectiveness of an adaptive educational system in many respects depends on the precision of modeling assumptions it makes about a student. One of the well-known challenges in student modeling is to adequately assess the initial level of student's knowledge when s/he starts working...... with a system. Sometimes potentially helpful data are available as a part of user model from a system used by the student before. The usage of external user modeling information is troublesome because of differences in system architecture, knowledge representation, modeling constraints, etc. In this paper, we...... argue that the implementation of underlying knowledge models in a sharable format, as domain ontologies - along with application of automatic ontology mapping techniques for model alignment - can help to overcome the "new-user" problem and will greatly widen opportunities for student model translation...

  3. Translational Rodent Models for Research on Parasitic Protozoa-A Review of Confounders and Possibilities.

    Science.gov (United States)

    Ehret, Totta; Torelli, Francesca; Klotz, Christian; Pedersen, Amy B; Seeber, Frank

    2017-01-01

    Rodents, in particular Mus musculus, have a long and invaluable history as models for human diseases in biomedical research, although their translational value has been challenged in a number of cases. We provide some examples in which rodents have been suboptimal as models for human biology and discuss confounders which influence experiments and may explain some of the misleading results. Infections of rodents with protozoan parasites are no exception in requiring close consideration upon model choice. We focus on the significant differences between inbred, outbred and wild animals, and the importance of factors such as microbiota, which are gaining attention as crucial variables in infection experiments. Frequently, mouse or rat models are chosen for convenience, e.g., availability in the institution, rather than on an unbiased evaluation of whether they provide the answer to a given question. Apart from a general discussion on translational success or failure, we provide examples where infections with single-celled parasites in a chosen lab rodent gave contradictory or misleading results, and when possible discuss the reason for this. We present emerging alternatives to traditional rodent models, such as humanized mice and organoid primary cell cultures. So-called recombinant inbred strains such as the Collaborative Cross collection are also a potential solution for certain challenges. In addition, we emphasize the advantages of using wild rodents for certain immunological, ecological, and/or behavioral questions. The experimental challenges (e.g., availability of species-specific reagents) that come with the use of such non-model systems are also discussed. Our intention is to foster critical judgment of both traditional and newly available translational rodent models for research on parasitic protozoa that can complement the existing mouse and rat models.

  4. Translational Rodent Models for Research on Parasitic Protozoa—A Review of Confounders and Possibilities

    Directory of Open Access Journals (Sweden)

    Totta Ehret

    2017-06-01

    Rodents, in particular Mus musculus, have a long and invaluable history as models for human diseases in biomedical research, although their translational value has been challenged in a number of cases. We provide some examples in which rodents have been suboptimal as models for human biology and discuss confounders which influence experiments and may explain some of the misleading results. Infections of rodents with protozoan parasites are no exception in requiring close consideration upon model choice. We focus on the significant differences between inbred, outbred and wild animals, and the importance of factors such as microbiota, which are gaining attention as crucial variables in infection experiments. Frequently, mouse or rat models are chosen for convenience, e.g., availability in the institution, rather than on an unbiased evaluation of whether they provide the answer to a given question. Apart from a general discussion on translational success or failure, we provide examples where infections with single-celled parasites in a chosen lab rodent gave contradictory or misleading results, and when possible discuss the reason for this. We present emerging alternatives to traditional rodent models, such as humanized mice and organoid primary cell cultures. So-called recombinant inbred strains such as the Collaborative Cross collection are also a potential solution for certain challenges. In addition, we emphasize the advantages of using wild rodents for certain immunological, ecological, and/or behavioral questions. The experimental challenges (e.g., availability of species-specific reagents) that come with the use of such non-model systems are also discussed. Our intention is to foster critical judgment of both traditional and newly available translational rodent models for research on parasitic protozoa that can complement the existing mouse and rat models.

  5. Dynamical modeling of microRNA action on the protein translation process.

    Science.gov (United States)

    Zinovyev, Andrei; Morozova, Nadya; Nonne, Nora; Barillot, Emmanuel; Harel-Bellan, Annick; Gorban, Alexander N

    2010-02-24

    Protein translation is a multistep process which can be represented as a cascade of biochemical reactions (initiation, ribosome assembly, elongation, etc.), the rate of which can be regulated by small non-coding microRNAs through multiple mechanisms. It remains unclear what mechanisms of microRNA action are the most dominant: moreover, many experimental reports deliver controversial messages on what is the concrete mechanism actually observed in the experiment. Nissan and Parker have recently demonstrated that it might be impossible to distinguish alternative biological hypotheses using the steady state data on the rate of protein synthesis. For their analysis they used two simple kinetic models of protein translation. Contrary to the study by Nissan and Parker, we show that dynamical data allow discriminating some of the mechanisms of microRNA action. We demonstrate this using the same models as developed by Nissan and Parker for the sake of comparison, but the methods developed (asymptotology of biochemical networks) can be used for other models. We formulate a hypothesis that the effect of microRNA action is measurable and observable only if it affects the dominant system (generalization of the limiting step notion for complex networks) of the protein translation machinery. The dominant system can vary in different experimental conditions that can partially explain the existing controversy of some of the experimental data. Our analysis of the transient protein translation dynamics shows that it gives enough information to verify or reject a hypothesis about a particular molecular mechanism of microRNA action on protein translation. For multiscale systems only that action of microRNA is distinguishable which affects the parameters of the dominant system (critical parameters), or changes the dominant system itself. Dominant systems generalize and further develop the old and very popular idea of limiting step.
Algorithms for identifying dominant systems in multiscale

  6. Dynamical modeling of microRNA action on the protein translation process

    Directory of Open Access Journals (Sweden)

    Barillot Emmanuel

    2010-02-01

    Abstract Background Protein translation is a multistep process which can be represented as a cascade of biochemical reactions (initiation, ribosome assembly, elongation, etc.), the rate of which can be regulated by small non-coding microRNAs through multiple mechanisms. It remains unclear what mechanisms of microRNA action are the most dominant: moreover, many experimental reports deliver controversial messages on what is the concrete mechanism actually observed in the experiment. Nissan and Parker have recently demonstrated that it might be impossible to distinguish alternative biological hypotheses using the steady state data on the rate of protein synthesis. For their analysis they used two simple kinetic models of protein translation. Results Contrary to the study by Nissan and Parker, we show that dynamical data allow discriminating some of the mechanisms of microRNA action. We demonstrate this using the same models as developed by Nissan and Parker for the sake of comparison, but the methods developed (asymptotology of biochemical networks) can be used for other models. We formulate a hypothesis that the effect of microRNA action is measurable and observable only if it affects the dominant system (generalization of the limiting step notion for complex networks) of the protein translation machinery. The dominant system can vary in different experimental conditions that can partially explain the existing controversy of some of the experimental data. Conclusions Our analysis of the transient protein translation dynamics shows that it gives enough information to verify or reject a hypothesis about a particular molecular mechanism of microRNA action on protein translation. For multiscale systems only that action of microRNA is distinguishable which affects the parameters of the dominant system (critical parameters), or changes the dominant system itself. Dominant systems generalize and further develop the old and very popular idea of limiting step
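The "limiting step" intuition that dominant systems generalize can be made concrete with a toy first-order cascade: the mean throughput time is dominated by the slowest step, so only perturbations of that step produce a readily observable change in output. The rate constants below are invented for illustration.

```python
# Irreversible first-order cascade A1 -> A2 -> ... -> P with rate constants
# k_i: the mean throughput time is sum(1/k_i), dominated by the smallest k
# (the limiting step). Perturbing a non-limiting step barely changes the
# output, which is the intuition the "dominant system" notion generalizes.
def throughput_time(ks):
    return sum(1.0 / k for k in ks)

base = [10.0, 0.1, 5.0]                        # step 2 is limiting
t_base = throughput_time(base)
t_fast = throughput_time([20.0, 0.1, 5.0])     # speed up a non-limiting step
t_lim = throughput_time([10.0, 0.2, 5.0])      # speed up the limiting step
print(t_base, t_fast, t_lim)
```

In the authors' terms, a microRNA acting on step 1 or 3 here would be invisible in the output, while the same action on step 2 (a critical parameter of the dominant system) roughly halves the throughput time.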

  7. Model for bridging the translational "valleys of death" in spinal cord injury research

    Directory of Open Access Journals (Sweden)

    Barrable B

    2014-04-01

    Bill Barrable,1 Nancy Thorogood,1 Vanessa Noonan,1,2 Jocelyn Tomkinson,1 Phalgun Joshi,1 Ken Stephenson,1 John Barclay,1 Katharina Kovacs Burns3 1Rick Hansen Institute, 2Division of Spine, Department of Orthopaedics, University of British Columbia, Vancouver, BC, 3Health Sciences Council, University of Alberta, Edmonton, AB, Canada Abstract: To improve health care outcomes with cost-effective treatments and prevention initiatives, basic health research must be translated into clinical application and studied during implementation, a process commonly referred to as translational research. It is estimated that only 14% of health-related scientific discoveries enter into medical practice and that it takes an average of 17 years for them to do so. The transition from basic research to clinical knowledge and from clinical knowledge to practice or implementation is so fraught with obstacles that these transitions are often referred to as “valleys of death”. The Rick Hansen Institute has developed a unique praxis model for translational research in the field of spinal cord injury (SCI. The praxis model involves three components. The first is a coordinated program strategy of cure, care, consumer engagement, and commercialization. The second is a knowledge cycle that consists of four phases, ie, environmental scanning, knowledge generation and synthesis, knowledge validation, and implementation. The third is the provision of relevant resources and infrastructure to overcome obstacles in the “valleys of death”, ie, funding, clinical research operations, informatics, clinical research and best practice implementation, consumer engagement, collaborative networks, and strategic partnerships. This model, which is to be independently evaluated in 2018 to determine its strengths and limitations, has been used to advance treatments for pressure ulcers in SCI. The Rick Hansen Institute has developed an innovative solution to move knowledge into action by

  8. Noise analysis of genome-scale protein synthesis using a discrete computational model of translation

    Energy Technology Data Exchange (ETDEWEB)

    Racle, Julien; Hatzimanikatis, Vassily, E-mail: vassily.hatzimanikatis@epfl.ch [Laboratory of Computational Systems Biotechnology, Ecole Polytechnique Fédérale de Lausanne (EPFL), CH-1015 Lausanne (Switzerland); Swiss Institute of Bioinformatics (SIB), CH-1015 Lausanne (Switzerland); Stefaniuk, Adam Jan [Laboratory of Computational Systems Biotechnology, Ecole Polytechnique Fédérale de Lausanne (EPFL), CH-1015 Lausanne (Switzerland)

    2015-07-28

    Noise in genetic networks has been the subject of extensive experimental and computational studies. However, very few of these studies have considered noise properties using mechanistic models that account for the discrete movement of ribosomes and RNA polymerases along their corresponding templates (messenger RNA (mRNA) and DNA). The large size of these systems, which scales with the number of genes, mRNA copies, codons per mRNA, and ribosomes, is responsible for some of the challenges. Additionally, one should be able to describe the dynamics of ribosome exchange between the free ribosome pool and those bound to mRNAs, as well as how mRNA species compete for ribosomes. We developed an efficient algorithm for stochastic simulations that addresses these issues and used it to study the contribution and trade-offs of noise to translation properties (rates, time delays, and rate-limiting steps). The algorithm scales linearly with the number of mRNA copies, which allowed us to study the importance of genome-scale competition between mRNAs for the same ribosomes. We determined that noise is minimized under conditions maximizing the specific synthesis rate. Moreover, sensitivity analysis of the stochastic system revealed the importance of the elongation rate in the resultant noise, whereas the translation initiation rate constant was more closely related to the average protein synthesis rate. We observed significant differences between our results and the noise properties of the most commonly used translation models. Overall, our studies demonstrate that the use of full mechanistic models is essential for the study of noise in translation and transcription.
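The discrete movement of ribosomes along an mRNA with excluded volume is often modeled as a TASEP-like process. A minimal fixed-time-step sketch is given below; it is not the authors' algorithm (which is an exact stochastic simulation scaling to genome-wide mRNA pools), and all parameters are invented.

```python
import random

def simulate_translation(n_codons=60, footprint=10, alpha=0.5, k_elong=5.0,
                         t_total=200.0, dt=0.001, seed=0):
    """Fixed-time-step simulation of ribosome traffic on one mRNA.
    Ribosomes initiate at rate alpha (1/s) when the start region is clear,
    hop one codon at rate k_elong (1/s), and cannot overlap (`footprint`
    codons). Returns the number of completed proteins."""
    rng = random.Random(seed)
    ribosomes = []          # codon positions, most downstream first
    completed = 0
    for _ in range(int(t_total / dt)):
        # elongation attempts, downstream ribosome first
        for i in range(len(ribosomes)):
            if rng.random() < k_elong * dt:
                blocked = (i > 0 and
                           ribosomes[i - 1] - ribosomes[i] <= footprint)
                if not blocked:
                    ribosomes[i] += 1
        # termination at the stop codon
        while ribosomes and ribosomes[0] >= n_codons:
            ribosomes.pop(0)
            completed += 1
        # initiation if the start region is unoccupied
        if (not ribosomes or ribosomes[-1] > footprint) \
                and rng.random() < alpha * dt:
            ribosomes.append(0)
    return completed

proteins = simulate_translation()
print(proteins)
```

Repeating such runs with different seeds gives the distribution of synthesis rates, from which noise measures (e.g. the coefficient of variation across runs) can be estimated; competition for a shared ribosome pool would couple many such tracks through a common free-ribosome count.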

  9. Translating and transforming (a) CALL for leadership for learning

    DEFF Research Database (Denmark)

    Weinreich, Elvi; Bjerg, Helle

    2015-01-01

    The paper pursues the argument that the process of translation is not solely a linguistic exercise. It also implies methodological and conceptual questions related to the translation and as such transformation of general and theoretical research based models of leadership for learning...

  10. Methodology for assessing electric vehicle charging infrastructure business models

    International Nuclear Information System (INIS)

    Madina, Carlos; Zamora, Inmaculada; Zabala, Eduardo

    2016-01-01

    The analysis of economic implications of innovative business models in networked environments, as electro-mobility is, requires a global approach to ensure that all the involved actors obtain a benefit. Although electric vehicles (EVs) provide benefits for the society as a whole, there are a number of hurdles for their widespread adoption, mainly the high investment cost for the EV and for the infrastructure. Therefore, a sound business model must be built up for charging service operators, which allows them to recover their costs while, at the same time, offer EV users a charging price which makes electro-mobility comparable to internal combustion engine vehicles. For that purpose, three scenarios are defined, which present different EV charging alternatives, in terms of charging power and charging station ownership and accessibility. A case study is presented for each scenario and the required charging station usage to have a profitable business model is calculated. We demonstrate that private home charging is likely to be the preferred option for EV users who can charge at home, as it offers a lower total cost of ownership under certain conditions, even today. On the contrary, finding a profitable business case for fast charging requires more intensive infrastructure usage. - Highlights: • Ecosystem is a network of actors who collaborate to create a positive business case. • Electro-mobility (electricity-powered road vehicles and ICT) is a complex ecosystem. • Methodological analysis to ensure that all actors benefit from electro-mobility. • Economic analysis of charging infrastructure deployment linked to its usage. • Comparison of EV ownership cost vs. ICE for vehicle users.
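The required charging-station usage for a profitable business case reduces to a break-even calculation: annualized fixed costs divided by the margin earned per charging session. All figures below are invented for illustration and are not the case-study values.

```python
def breakeven_sessions_per_day(capex, lifetime_years, opex_per_year,
                               price_per_kwh, energy_cost_per_kwh,
                               kwh_per_session):
    """Sessions per day at which revenue covers annualized CAPEX + OPEX
    (straight-line annualization; no discounting, for simplicity)."""
    annual_fixed = capex / lifetime_years + opex_per_year
    margin_per_session = (price_per_kwh - energy_cost_per_kwh) * kwh_per_session
    return annual_fixed / (365 * margin_per_session)

# Illustrative fast charger: 40 kEUR CAPEX, 10-year life, 3 kEUR/yr OPEX,
# 0.40 EUR/kWh tariff vs 0.15 EUR/kWh energy cost, 20 kWh per session.
n = breakeven_sessions_per_day(40_000, 10, 3_000, 0.40, 0.15, 20)
print(round(n, 1))  # sessions per day needed to break even
```

A fuller model would discount cash flows and add grid-connection fees and demand charges, but the qualitative conclusion in the abstract, that fast charging needs intensive infrastructure usage to be profitable, falls out of exactly this kind of arithmetic.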

  11. Translational mixed-effects PKPD modelling of recombinant human growth hormone - from hypophysectomized rat to patients

    DEFF Research Database (Denmark)

    Thorsted, A; Thygesen, P; Agersø, H

    2016-01-01

    BACKGROUND AND PURPOSE: We aimed to develop a mechanistic mixed-effects pharmacokinetic (PK)-pharmacodynamic (PD) (PKPD) model for recombinant human growth hormone (rhGH) in hypophysectomized rats and to predict the human PKPD relationship. EXPERIMENTAL APPROACH: A non-linear mixed-effects model...... was developed from experimental PKPD studies of rhGH and effects of long-term treatment as measured by insulin-like growth factor 1 (IGF-1) and bodyweight gain in rats. Modelled parameter values were scaled to human values using the allometric approach with fixed exponents for PKs and unscaled for PDs...... s.c. administration was over-predicted. After correction of the human s.c. absorption model, the induction model for IGF-1 well described the human PKPD data. CONCLUSIONS: A translational mechanistic PKPD model for rhGH was successfully developed from experimental rat data. The model links......
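The allometric scaling step, PK parameters scaled by body weight with a fixed exponent, can be sketched as follows; the clearance value, body weights, and the common 0.75 exponent here are illustrative, not the study's estimates.

```python
def allometric_scale(cl_rat_ml_min, bw_rat_kg=0.25, bw_human_kg=70.0,
                     exponent=0.75):
    """Scale a rat clearance to a human prediction with the standard
    allometric power law CL_h = CL_r * (BW_h / BW_r) ** exponent."""
    return cl_rat_ml_min * (bw_human_kg / bw_rat_kg) ** exponent

# Hypothetical rat clearance of 2 mL/min scaled to a 70 kg human:
cl_human = allometric_scale(2.0)
print(round(cl_human))  # mL/min
```

Volume terms are conventionally scaled with an exponent of 1, while PD parameters (as in this study) are often carried over unscaled; which exponents to fix is itself a modeling choice.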

  12. Rabbit models for the study of human atherosclerosis: from pathophysiological mechanisms to translational medicine.

    Science.gov (United States)

    Fan, Jianglin; Kitajima, Shuji; Watanabe, Teruo; Xu, Jie; Zhang, Jifeng; Liu, Enqi; Chen, Y Eugene

    2015-02-01

    Laboratory animal models play an important role in the study of human diseases. Using appropriate animals is critical not only for basic research but also for the development of therapeutics and diagnostic tools. Rabbits are widely used for the study of human atherosclerosis. Because rabbits have a unique feature of lipoprotein metabolism (like humans but unlike rodents) and are sensitive to a cholesterol diet, rabbit models have not only provided many insights into the pathogenesis and development of human atherosclerosis but also made a great contribution to translational research. In fact, rabbit was the first animal model used for studying human atherosclerosis, more than a century ago. Currently, three types of rabbit model are commonly used for the study of human atherosclerosis and lipid metabolism: (1) cholesterol-fed rabbits, (2) Watanabe heritable hyperlipidemic rabbits, analogous to human familial hypercholesterolemia due to genetic deficiency of LDL receptors, and (3) genetically modified (transgenic and knock-out) rabbits. Despite their importance, compared with the mouse, the most widely used laboratory animal model nowadays, the use of rabbit models is still limited. In this review, we focus on the features of rabbit lipoprotein metabolism and pathology of atherosclerotic lesions that make it the optimal model for human atherosclerotic disease, especially for the translational medicine. For the sake of clarity, the review is not an attempt to be completely inclusive, but instead attempts to summarize substantial information concisely and provide a guideline for experiments using rabbits. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. A methodology model for quality management in a general hospital.

    Science.gov (United States)

    Stern, Z; Naveh, E

    1997-01-01

    A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, presents the main intrinsic characteristics, which create problems in application of the industrial quality management approaches to medical care. Several examples are the complexity of the relationship between the medical action and the result obtained, the client's nonacceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors, measurements of quality care, cost of medical care, quality of service, and human resources. These are based on the trilogy of planning quality, quality control, and improving quality. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in formulation of appropriate concepts.

  14. Modeling myocardial infarction in mice: methodology, monitoring, pathomorphology.

    Science.gov (United States)

    Ovsepyan, A A; Panchenkov, D N; Prokhortchouk, E B; Telegin, G B; Zhigalova, N A; Golubev, E P; Sviridova, T E; Matskeplishvili, S T; Skryabin, K G; Buziashvili, U I

    2011-01-01

    Myocardial infarction is one of the most serious and widespread diseases in the world. In this work, a minimally invasive method for simulating myocardial infarction in mice is described in the Russian Federation for the very first time; the procedure is carried out by ligation of the coronary heart artery or by controlled electrocoagulation. As a part of the methodology, a series of anesthetic, microsurgical and revival protocols are designed, owing to which a decrease in the postoperational mortality from the initial 94.6 to 13.6% is achieved. ECG confirms the development of large-focal or surface myocardial infarction. Postmortal histological examination confirms the presence of necrosis foci in the heart muscles of 87.5% of animals. Altogether, the medical data allow us to conclude that an adequate mouse model for myocardial infarction was generated. A further study is focused on the standardization of the experimental procedure and the use of genetically modified mouse strains, with the purpose of finding the most efficient therapeutic approaches for this disease.

  15. Writing Through: Practising Translation

    Directory of Open Access Journals (Sweden)

    Joel Scott

    2010-05-01

    This essay exists as a segment in a line of study and writing practice that moves between a critical theory analysis of translation studies conceptions of language, and the practical questions of what those ideas might mean for contemporary translation and writing practice. Although the underlying preoccupation of this essay, and my more general line of inquiry, is translation studies and practice, in many ways translation is merely a way into a discussion on language. For this essay, translation is the threshold of language. But the two trails of the discussion never manage to elude each other, and these concatenations have informed two experimental translation methods, referred to here as Live Translations and Series Translations. Following the essay are a number of poems in translation, all of which come from Blanco Nuclear by the contemporary Spanish poet, Esteban Pujals Gesalí. The first group, the Live Translations, consists of transcriptions I made from audio recordings read in a public setting, in which the texts were translated in situ, either off the page of original Spanish-language poems, or through a process very much like that carried out by simultaneous translators, for which readings of the poems were played back to me through headphones at varying speeds to be translated before the audience. The translations collected are imperfect renderings, attesting to a moment in language practice rather than language objects. The second method involves an iterative translation process, by which three versions of any one poem are rendered, with varying levels of fluency, fidelity and servility. All three translations are presented one after the other as a series, with no version asserting itself as the primary translation.
These examples, as well as the translation methods themselves, are intended as preliminary experiments within an endlessly divergent continuum of potential methods and translations, and not as a complete representation of

  16. Reverse-translational biomarker validation of Abnormal Repetitive Behaviors in mice: an illustration of the 4P's modeling approach.

    Science.gov (United States)

    Garner, Joseph P; Thogerson, Collette M; Dufour, Brett D; Würbel, Hanno; Murray, James D; Mench, Joy A

    2011-06-01

    The NIMH's new strategic plan, with its emphasis on the "4P's" (Prediction, Pre-emption, Personalization, and Populations) and biomarker-based medicine, requires a radical shift in animal modeling methodology. In particular, 4P's models will be non-determinant (i.e. disease severity will depend on secondary environmental and genetic factors) and validated by reverse-translation of animal homologues to human biomarkers. A powerful consequence of the biomarker approach is that closely related disorders each have a unique fingerprint of biomarkers. Animals can be validated as a highly specific model of a single disorder by matching this 'fingerprint', or as a model of a symptom seen in multiple disorders by matching common biomarkers. Here we illustrate this approach with two Abnormal Repetitive Behaviors (ARBs) in mice: stereotypies and barbering (hair pulling). We developed animal versions of the neuropsychological biomarkers that distinguish human ARBs, and tested the fingerprint of the different mouse ARBs. As predicted, the two mouse ARBs were associated with different biomarkers. Both barbering and stereotypy could be discounted as models of OCD (even though they are widely used as such), owing to the absence of the limbic biomarkers that are characteristic of OCD and hence necessary for a valid model. Conversely, barbering matched the fingerprint of trichotillomania (i.e. selective deficits in set-shifting), suggesting it may be a highly specific model of this disorder. In contrast, stereotypies were associated only with a biomarker (deficits in response shifting) that correlates with stereotypies in multiple disorders, suggesting that animal stereotypies model stereotypies in multiple disorders. Copyright © 2011 Elsevier B.V. All rights reserved.
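    The fingerprint-matching logic described above amounts to a set comparison: a disorder is a candidate match only if its complete biomarker fingerprint is present. The sketch below uses invented placeholder marker names, not the study's actual neuropsychological test battery:

```python
# Toy fingerprint matcher (marker names are illustrative placeholders,
# not the study's actual neuropsychological measures).
DISORDER_FINGERPRINTS = {
    "OCD":              {"limbic_marker", "set_shifting_deficit", "response_shifting_deficit"},
    "trichotillomania": {"set_shifting_deficit"},
}

def match_disorders(animal_markers):
    """Return disorders whose complete fingerprint appears in the animal's markers."""
    return sorted(d for d, fp in DISORDER_FINGERPRINTS.items() if fp <= animal_markers)

# Barbering mice showed set-shifting deficits but no limbic biomarkers, so the
# OCD fingerprint fails while the trichotillomania fingerprint matches:
barbering = {"set_shifting_deficit"}
print(match_disorders(barbering))  # ['trichotillomania']
```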

  17. Translational models of infection prevention and control: lessons from studying high risk aging populations.

    Science.gov (United States)

    Mody, Lona

    2018-06-13

    The present review describes our research experiences and efforts in advancing the field of infection prevention and control in nursing facilities, including postacute and long-term care settings. There are over two million infections in postacute and long-term care settings each year in the United States and $4 billion in associated costs. To define a target group most amenable to infection prevention and control interventions, we sought to quantify the relation between indwelling device use and microbial colonization in nursing facility patients. Using various methodologies including survey methods, observational epidemiology, randomized controlled studies, and collaboratives, we showed that (1) indwelling device type is related to the site of multidrug-resistant organism (MDRO) colonization; (2) multianatomic site colonization with MDROs is common; (3) community-associated methicillin-resistant Staphylococcus aureus (MRSA) appeared in the nursing facility setting almost immediately following its emergence in acute care; (4) MDRO prevalence and catheter-associated infection rates can be reduced through a multimodal targeted infection prevention intervention; and (5) using a collaborative approach, such an intervention can be successfully scaled up. Our work advances the infection prevention field through translational research utilizing various methodologies, including quantitative and qualitative surveys, patient-oriented randomized controlled trials, and clinical microbiologic and molecular methods. The resulting interventions employ patient-oriented methods to reduce infections and antimicrobial resistance, and with partnerships from major national entities, can be implemented nationally.

  18. Entrainment to periodic initiation and transition rates in a computational model for gene translation.

    Directory of Open Access Journals (Sweden)

    Michael Margaliot

    Full Text Available Periodic oscillations play an important role in many biomedical systems. Proper functioning of biological systems that respond to periodic signals requires the ability to synchronize with the periodic excitation. For example, the sleep/wake cycle is a manifestation of an internal timing system that synchronizes to the solar day. In the terminology of systems theory, the biological system must entrain or phase-lock to the periodic excitation. Entrainment is also important in synthetic biology. For example, connecting several artificial biological systems that entrain to a common clock may lead to a well-functioning modular system. The cell-cycle is a periodic program that regulates DNA synthesis and cell division. Recent biological studies suggest that cell-cycle related genes entrain to this periodic program at the gene translation level, leading to periodically-varying protein levels of these genes. The ribosome flow model (RFM) is a deterministic model obtained via a mean-field approximation of a stochastic model from statistical physics that has been used to model numerous processes including ribosome flow along the mRNA. Here we analyze the RFM under the assumption that the initiation and/or transition rates vary periodically with a common period T. We show that the ribosome distribution profile in the RFM entrains to this periodic excitation. In particular, the protein synthesis pattern converges to a unique periodic solution with period T. To the best of our knowledge, this is the first proof of entrainment in a mathematical model for translation that encapsulates aspects such as initiation and termination rates, ribosomal movement and interactions, and non-homogeneous elongation speeds along the mRNA. Our results support the conjecture that periodic oscillations in tRNA levels and other factors related to the translation process can induce periodic oscillations in protein levels, and may suggest a new approach for re-engineering genetic
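    The entrainment result can be checked numerically on a small RFM. The sketch below (the baseline rate values and the sinusoidal T-periodic modulation are our own arbitrary choices, not parameters from the paper) integrates the model with forward Euler and compares the state at the end of the run with the state exactly one period earlier:

```python
import math

def rfm_derivative(x, lam):
    """Ribosome flow model: x[i] is the occupancy of site i in [0, 1];
    lam[0] is the initiation rate, lam[i+1] the transition rate out of site i."""
    n = len(x)
    dx = []
    inflow = lam[0] * (1.0 - x[0])                    # initiation flow into site 0
    for i in range(n):
        nxt = (1.0 - x[i + 1]) if i + 1 < n else 1.0  # last site exits unhindered
        outflow = lam[i + 1] * x[i] * nxt
        dx.append(inflow - outflow)
        inflow = outflow
    return dx

def simulate(T=10.0, n=5, t_end=300.0, dt=0.005):
    base = [1.0, 0.8, 0.9, 0.7, 0.85, 0.6]    # arbitrary baseline rates (n+1 of them)
    x = [0.5] * n
    snapshot = None
    steps = round(t_end / dt)
    snap_step = round((t_end - T) / dt)       # index of the state one period early
    for k in range(steps):
        if k == snap_step:
            snapshot = list(x)
        t = k * dt
        mod = 1.0 + 0.5 * math.sin(2.0 * math.pi * t / T)  # common period T
        lam = [b * mod for b in base]
        x = [xi + dt * dxi for xi, dxi in zip(x, rfm_derivative(x, lam))]
    return x, snapshot

final, one_period_earlier = simulate()
drift = max(abs(a - b) for a, b in zip(final, one_period_earlier))
print(drift)  # small: the occupancy profile has converged to a T-periodic solution
```

    After the transient decays, the occupancy profile repeats every T time units, which is exactly the entrainment property the record describes.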

  19. Translation Theory 'Translated'

    DEFF Research Database (Denmark)

    Wæraas, Arild; Nielsen, Jeppe

    2016-01-01

    Translation theory has proved to be a versatile analytical lens used by scholars working from different traditions. On the basis of a systematic literature review, this study adds to our understanding of the ‘translations’ of translation theory by identifying the distinguishing features of the most common theoretical approaches to translation within the organization and management discipline: actor-network theory, knowledge-based theory, and Scandinavian institutionalism. Although each of these approaches has already borne much fruit in research, the literature is diverse and somewhat fragmented, but also overlapping. We discuss the ways in which the three versions of translation theory may be combined and enrich each other so as to inform future research, thereby offering a more complete understanding of translation in and across organizational settings.

  20. Recent Advances in Translational Magnetic Resonance Imaging in Animal Models of Stress and Depression

    Directory of Open Access Journals (Sweden)

    Allison L. McIntosh

    2017-05-01

    Full Text Available Magnetic resonance imaging (MRI) is a valuable translational tool that can be used to investigate alterations in brain structure and function in both patients and animal models of disease. Regional changes in brain structure, functional connectivity, and metabolite concentrations have been reported in depressed patients, giving insight into the networks and brain regions involved; however, preclinical models are less well characterized. The development of more effective treatments depends upon animal models that best translate to the human condition, and animal models may be exploited to assess the molecular and cellular alterations that accompany neuroimaging changes. Recent advances in preclinical imaging have facilitated significant developments within the field, particularly relating to high-resolution structural imaging and resting-state functional imaging, which are emerging techniques in clinical research. This review aims to bring together the current literature on preclinical neuroimaging in animal models of stress and depression, highlighting promising avenues of research toward understanding the pathological basis of this hugely prevalent disorder.

  1. Advancing Transdisciplinary and Translational Research Practice: Issues and Models of Doctoral Education in Public Health

    Directory of Open Access Journals (Sweden)

    Linda Neuhauser

    2007-01-01

    Full Text Available Finding solutions to complex health problems, such as obesity, violence, and climate change, will require radical changes in cross-disciplinary education, research, and practice. The fundamental determinants of health include many interrelated factors such as poverty, culture, education, environment, and government policies. However, traditional public health training has tended to focus more narrowly on diseases and risk factors, and has not adequately leveraged the rich contributions of sociology, anthropology, economics, geography, communication, political science, and other disciplines. Further, students are often not sufficiently trained to work across sectors to translate research findings into effective, large-scale sustainable actions. During the past 2 decades, national and international organizations have called for more effective interdisciplinary, transdisciplinary, and translational approaches to graduate education. Although it has been difficult to work across traditional academic boundaries, some promising models draw on pedagogical theory and feature cross-disciplinary training focused on real-world problems, linkage between research, professional practice, and community action, and cultivation of leadership skills. We describe the development of the Doctor of Public Health program at the University of California, Berkeley, USA, and its efforts to improve transdisciplinary and translational research education. We stress the need for international collaboration to improve educational approaches and better evaluate their impact.

  2. The SMART Study, a Mobile Health and Citizen Science Methodological Platform for Active Living Surveillance, Integrated Knowledge Translation, and Policy Interventions: Longitudinal Study.

    Science.gov (United States)

    Katapally, Tarun Reddy; Bhawra, Jasmin; Leatherdale, Scott T; Ferguson, Leah; Longo, Justin; Rainham, Daniel; Larouche, Richard; Osgood, Nathaniel

    2018-03-27

    Physical inactivity is the fourth leading cause of death worldwide, costing approximately US $67.5 billion per year to health care systems. To curb the physical inactivity pandemic, it is time to move beyond traditional approaches and engage citizens by repurposing sedentary behavior (SB)-enabling ubiquitous tools (eg, smartphones). The primary objective of the Saskatchewan, let's move and map our activity (SMART) Study was to develop a mobile and citizen science methodological platform for active living surveillance, knowledge translation, and policy interventions. This methodology paper enumerates the SMART Study platform's conceptualization, design, implementation, data collection procedures, analytical strategies, and potential for informing policy interventions. This longitudinal investigation was designed to engage participants (ie, citizen scientists) in Regina and Saskatoon, Saskatchewan, Canada, in four different seasons across 3 years. In spring 2017, pilot data collection was conducted, where 317 adult citizen scientists (≥18 years) were recruited in person and online. Citizen scientists used a custom-built smartphone app, Ethica (Ethica Data Services Inc), for 8 consecutive days to provide a complex series of objective and subjective data. Citizen scientists answered a succession of validated surveys that were assigned different smartphone triggering mechanisms (eg, user-triggered and schedule-triggered). The validated surveys captured physical activity (PA), SB, motivation, perception of outdoor and indoor environment, and eudaimonic well-being. Ecological momentary assessments were employed on each day to capture not only PA but also physical and social contexts along with barriers and facilitators of PA, as relayed by citizen scientists using geo-coded pictures and audio files. To obtain a comprehensive objective picture of participant location, motion, and compliance, 6 types of sensor-based (eg, global positioning system and accelerometer) data


  4. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code, and may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology; the application process and the main results are briefly described. The application of CIRCE provided satisfactory results. This research is expected to be useful for improving the present audit calculation methodology, KINS-REM.
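    The flavor of such an uncertainty quantification can be conveyed with a deliberately simplified one-parameter stand-in for CIRCE (which in practice uses an expectation-maximization algorithm over code-experiment residuals and sensitivity coefficients). All names and values below are synthetic; the point is only that the bias and spread of a model correction can be estimated from residuals instead of expert judgment:

```python
import math
import random

random.seed(1)

# Synthetic "experiments": residual r_i = measured - calculated is modeled as
# r_i = s_i * alpha_i + noise, where alpha_i ~ N(b, sigma^2) is the unknown
# correction to the physical model and s_i its known sensitivity coefficient.
TRUE_B, TRUE_SIGMA, MEAS_SD = 1.2, 0.3, 0.1
data = []
for _ in range(100):
    s = random.uniform(0.5, 2.0)
    alpha = random.gauss(TRUE_B, TRUE_SIGMA)
    data.append((s, s * alpha + random.gauss(0.0, MEAS_SD)))

def log_likelihood(b, sigma):
    ll = 0.0
    for s, r in data:
        v = (s * sigma) ** 2 + MEAS_SD ** 2   # total variance of residual r
        ll += -0.5 * math.log(2.0 * math.pi * v) - (r - s * b) ** 2 / (2.0 * v)
    return ll

# Coarse grid search for the maximum-likelihood bias b and spread sigma
best = max(((b / 50.0, s / 50.0) for b in range(25, 101) for s in range(3, 51)),
           key=lambda p: log_likelihood(*p))
print(best)  # near the true values (1.2, 0.3)
```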

  5. TREXMO: A Translation Tool to Support the Use of Regulatory Occupational Exposure Models.

    Science.gov (United States)

    Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David

    2016-10-01

    Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern: TREXMO, or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely, ART v.1.5, STOFFENMANAGER(®) v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EXPO-TOOL, and EASE v.2.0. By enabling a semi-automatic translation between the parameters of these six models, TREXMO facilitates their simultaneous use. For a given exposure situation, defined by a set of parameters in one of the models, TREXMO provides the user with the most appropriate parameters to use in the other exposure models. Results showed that, once an exposure situation and parameters were set in ART, TREXMO reduced the number of possible outcomes in the other models by 1-4 orders of magnitude. The tool should reduce the uncertain entry or selection of parameters in the six models, improve between-user reliability, and reduce the time required for running several models for a given exposure situation. In addition to these advantages, registrants of chemicals and authorities should benefit from more reliable exposure estimates for the risk characterization of dangerous chemicals under Registration, Evaluation, Authorisation and restriction of CHemicals (REACH). © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
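    The narrowing mechanism can be sketched as a rule table intersected with the target model's parameter space. Every category, value, and rule below is invented for illustration and is not taken from TREXMO's actual translation tables; only the way a source-model choice prunes target-model combinations is shown:

```python
from itertools import product

# Invented cross-model rules: a parameter choice in the source model constrains
# the admissible values of parameters in the target model.
RULES = {
    ("task", "surface spraying"): {"activity_class": {"spraying", "dispersion"}},
    ("ventilation", "outdoors"):  {"locality": {"outdoor"}},
}

# Full parameter space of the (invented) target model
TARGET_SPACE = {
    "activity_class": {"spraying", "dispersion", "pouring", "wiping"},
    "locality": {"outdoor", "indoor small room", "indoor large room"},
}

def translate(source_params):
    """Intersect each triggered rule with the target model's full value sets."""
    allowed = {name: set(vals) for name, vals in TARGET_SPACE.items()}
    for item in source_params.items():
        for param, values in RULES.get(item, {}).items():
            allowed[param] &= values
    return allowed

source = {"task": "surface spraying", "ventilation": "outdoors"}
allowed = translate(source)
n_full = len(list(product(*TARGET_SPACE.values())))
n_left = len(list(product(*allowed.values())))
print(n_full, n_left)  # 12 2 -- the candidate combinations shrink from 12 to 2
```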

  6. From an animal model to human patients: An example of a translational study on obsessive compulsive disorder (OCD).

    Science.gov (United States)

    Eilam, David

    2017-05-01

    The application of similar analyses enables a direct projection from translational research in animals to human studies. Following is an example of how the methodology of a specific animal model of obsessive-compulsive disorder (OCD) was applied to study human patients. Specifically, the quinpirole rat model for OCD was based on analyzing the trajectories of travel among different locales and scoring the set of acts performed at each locale. Applying this analytic approach in human patients unveiled various aspects of OCD, such as the repetition and addition of acts, incompleteness, and the link between behavior and specific locations. We also illustrate how the same analytical approach could be applied to the study of other mental disorders. Finally, it is suggested that the development of OCD could be explained by the four-phase sequence of Repetition, Addition, Condensation, and Elimination, as outlined in the study of ontogeny and phylogeny and applied to normal development of behavior. In OCD, this sequence is curtailed, resulting in the abundant repetition and addition of acts. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Reference Management Methodologies for Large Structural Models at Kennedy Space Center

    Science.gov (United States)

    Jones, Corey; Bingham, Ryan; Schmidt, Rick

    2011-01-01

    There have been many challenges associated with modeling some of NASA KSC's largest structures. Given the size of the welded structures here at KSC, it was critically important to properly organize model structure and carefully manage references. Additionally, because of the amount of hardware to be installed on these structures, it was very important to have a means to coordinate between different design teams and organizations, check for interferences, produce consistent drawings, and allow for simple release processes. Facing these challenges, the modeling team developed a unique reference management methodology and model fidelity methodology. This presentation will describe the techniques and methodologies that were developed for these projects. The attendees will learn about KSC's reference management and model fidelity methodologies for large structures, understand the goals of these methodologies, and appreciate the advantages of developing a reference management methodology.

  8. Fear Extinction as a Model for Translational Neuroscience: Ten Years of Progress

    Science.gov (United States)

    Milad, Mohammed R.; Quirk, Gregory J.

    2016-01-01

    The psychology of extinction has been studied for decades. Approximately 10 years ago, however, there began a concerted effort to understand the neural circuits of extinction of fear conditioning, in both animals and humans. Progress during this period has been facilitated by an unusual degree of coordination between rodent and human researchers examining fear extinction. This successful research program could serve as a model for translational research in other areas of behavioral neuroscience. Here we review the major advances and highlight new approaches to understanding and exploiting fear extinction. PMID:22129456

  9. Patient Derived Xenograft Models: An Emerging Platform for Translational Cancer Research

    Science.gov (United States)

    Hidalgo, Manuel; Amant, Frederic; Biankin, Andrew V.; Budinská, Eva; Byrne, Annette T.; Caldas, Carlos; Clarke, Robert B.; de Jong, Steven; Jonkers, Jos; Mælandsmo, Gunhild Mari; Roman-Roman, Sergio; Seoane, Joan; Trusolino, Livio; Villanueva, Alberto

    2014-01-01

    Recently, there has been increasing interest in the development and characterization of patient-derived tumor xenograft (PDX) models for cancer research. PDX models mostly retain the principal histological and genetic characteristics of their donor tumor and remain stable across passages. These models have been shown to be predictive of clinical outcomes and are being used for preclinical drug evaluation, biomarker identification, biological studies, and personalized medicine strategies. This paper summarizes the current state of the art in this field, including methodological issues, available collections, practical applications, challenges and shortcomings, and future directions, and introduces a European consortium of PDX models. PMID:25185190

  10. Risk methodology for geologic disposal of radioactive waste: model description and user manual for Pathways model

    International Nuclear Information System (INIS)

    Helton, J.C.; Kaestner, P.C.

    1981-03-01

    A model for the environmental movement and human uptake of radionuclides is presented. This model is designated the Pathways-to-Man Model and was developed as part of a project funded by the Nuclear Regulatory Commission to design a methodology to assess the risk associated with the geologic disposal of high-level radioactive waste. The Pathways-to-Man Model is divided into two submodels. One of these, the Environmental Transport Model, represents the long-term distribution and accumulation of radionuclides in the environment. This model is based on a mixed-cell approach and describes radionuclide movement with a system of linear differential equations. The other, the Transport-to-Man Model, represents the movement of radionuclides from the environment to man. This model is based on concentration ratios. General descriptions of these models are provided in this report. Further, documentation is provided for the computer program which implements the Pathways Model
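    The structure of such a mixed-cell model can be sketched as a small linear ODE system: transfers move activity between cells, decay removes it everywhere, and a concentration ratio maps an environmental concentration to the human pathway. The compartments, rates, and concentration ratio below are invented for illustration and are not the parameters of the actual Pathways Model:

```python
import math

# Toy mixed-cell system (3 environmental cells; all rates per year are invented).
# dC/dt = (transfers)C - DECAY*C, with transfers conserving total activity.
DECAY = 0.01                        # radioactive decay constant (1/yr)
FLOW = {(0, 1): 0.20,               # (from_cell, to_cell): transfer rate (1/yr)
        (1, 2): 0.05,
        (2, 1): 0.02}

def step(c, dt):
    dc = [-DECAY * ci for ci in c]           # decay removes activity in every cell
    for (i, j), k in FLOW.items():           # inter-cell transfers
        dc[i] -= k * c[i]
        dc[j] += k * c[i]
    return [ci + dt * dci for ci, dci in zip(c, dc)]

c = [1.0, 0.0, 0.0]                 # initial inventory entirely in cell 0
dt, t_end = 0.001, 10.0
for _ in range(round(t_end / dt)):
    c = step(c, dt)

# Because transfers conserve activity, the total must decay as exp(-DECAY*t):
print(sum(c), math.exp(-DECAY * t_end))  # the two values agree closely

# Transport-to-Man step: exposure proxy via a concentration ratio (invented value)
CR_CROP = 0.05                      # plant/soil concentration ratio
crop_conc = CR_CROP * c[1]          # concentration entering the food pathway
```

    The conservation check (total activity decaying exactly at the decay constant) is a useful sanity test for any mixed-cell implementation of this kind.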

  11. Experimental liver fibrosis research: update on animal models, legal issues and translational aspects

    Science.gov (United States)

    2013-01-01

    Liver fibrosis is defined as excessive extracellular matrix deposition and is based on complex interactions between matrix-producing hepatic stellate cells and an abundance of liver-resident and infiltrating cells. Investigation of these processes requires in vitro and in vivo experimental work in animals. However, the use of animals in translational research will be increasingly challenged, at least in countries of the European Union, because of the adoption of new animal welfare rules in 2013. These rules will create an urgent need for optimized standard operating procedures regarding animal experimentation and improved international communication in the liver fibrosis community. This review gives an update on current animal models, techniques and underlying pathomechanisms with the aim of fostering a critical discussion of the limitations and potential of up-to-date animal experimentation. We discuss potential complications in experimental liver fibrosis and provide examples of how the findings of studies in which these models are used can be translated to human disease and therapy. In this review, we want to motivate the international community to design more standardized animal models which might help to address the legally requested replacement, refinement and reduction of animals in fibrosis research. PMID:24274743

  12. Proposed Model for Translational Research at a Teaching-Intensive College of Pharmacy.

    Science.gov (United States)

    Ulrich, Erin; Grady, Sarah; Vonderhaar, Jacqueline; Ruplin, Andrew

    2017-08-08

    Many American colleges of pharmacy are small, private, teaching institutions. Faculty are required to maintain a research agenda, although the publication quota is less compared with their publicly funded college of pharmacy peers. Faculty at these smaller schools conduct research with very little internal or external funding. This tends to lead to smaller, less impactful research findings. Translational research is becoming popular for research faculty as it bridges theory to practice. The Knowledge-to-Action (KTA) framework presents the steps to conduct translational research. To apply and determine if the KTA framework would be able to produce practice-impactful research at an institution that does not depend on grant funding as part of faculty research agendas. An interdisciplinary team was formed with providers at the clinical faculty's practice site. As the team moved through the KTA steps, authors documented the roles of each team member. It was clear that many different types of teams were formed throughout the KTA process. These teams were then categorized according to the Interdisciplinary Teamwork System. The final result is a proposed model of types of teams and required member roles that are necessary within each KTA step for faculty to conduct practice-impactful research at a small, private, teaching institution without substantial grant funding awards. Applying the KTA framework, two impactful original research manuscripts were developed over two academic years. Furthermore, the practitioners at the clinical faculty member's site were very pleased with the ease of conducting research, as they were never required to take a lead role. In addition, both faculty members alternated lead and support role allowing for a decreased burden of workload while producing theory-driven research. The KTA framework can create a model for translational research and may be particularly beneficial to small teaching institutions to conduct impactful research. 

  13. BRIDG: a domain information model for translational and clinical protocol-driven research.

    Science.gov (United States)

    Becnel, Lauren B; Hastak, Smita; Ver Hoef, Wendy; Milius, Robert P; Slack, MaryAnn; Wold, Diane; Glickman, Michael L; Brodsky, Boris; Jaffe, Charles; Kush, Rebecca; Helton, Edward

    2017-09-01

    It is critical to integrate and analyze data from biological, translational, and clinical studies with data from health systems; however, electronic artifacts are stored in thousands of disparate systems that are often unable to readily exchange data. To facilitate meaningful data exchange, a model that presents a common understanding of biomedical research concepts and their relationships with health care semantics is required. The Biomedical Research Integrated Domain Group (BRIDG) domain information model fulfills this need. Software systems created from BRIDG have shared meaning "baked in," enabling interoperability among disparate systems. For nearly 10 years, the Clinical Data Standards Interchange Consortium, the National Cancer Institute, the US Food and Drug Administration, and Health Level 7 International have been key stakeholders in developing BRIDG. BRIDG is an open-source Unified Modeling Language-class model developed through use cases and harmonization with other models. With its 4+ releases, BRIDG includes clinical and now translational research concepts in its Common, Protocol Representation, Study Conduct, Adverse Events, Regulatory, Statistical Analysis, Experiment, Biospecimen, and Molecular Biology subdomains. The model is a Clinical Data Standards Interchange Consortium, Health Level 7 International, and International Standards Organization standard that has been utilized in national and international standards-based software development projects. It will continue to mature and evolve in the areas of clinical imaging, pathology, ontology, and vocabulary support. BRIDG 4.1.1 and prior releases are freely available at https://bridgmodel.nci.nih.gov . © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  14. Chronic early life stress induced by limited bedding and nesting (LBN) material in rodents: critical considerations of methodology, outcomes and translational potential.

    Science.gov (United States)

    Walker, Claire-Dominique; Bath, Kevin G; Joels, Marian; Korosi, Aniko; Larauche, Muriel; Lucassen, Paul J; Morris, Margaret J; Raineki, Charlis; Roth, Tania L; Sullivan, Regina M; Taché, Yvette; Baram, Tallie Z

    2017-09-01

    The immediate and long-term effects of exposure to early life stress (ELS) have been documented in humans and animal models. Even relatively brief periods of stress during the first 10 days of life in rodents can impact later behavioral regulation and the vulnerability to develop adult pathologies, in particular an impairment of cognitive functions and neurogenesis, but also modified social, emotional, and conditioned fear responses. The development of preclinical models of ELS exposure allows the examination of mechanisms and testing of therapeutic approaches that are not possible in humans. Here, we describe limited bedding and nesting (LBN) procedures, with models that produce altered maternal behavior ranging from fragmentation of care to maltreatment of infants. The purpose of this paper is to discuss important issues related to the implementation of this chronic ELS procedure and to describe some of the most prominent endpoints and consequences, focusing on areas of convergence between laboratories. Effects on the hypothalamic-pituitary-adrenal (HPA) axis, gut axis, and metabolism are presented in addition to changes in cognitive and emotional functions. Interestingly, recent data have suggested a strong sex difference in some of the reported consequences of the LBN paradigm, with females being more resilient in general than males. As both the chronic and intermittent variants of the LBN procedure have profound consequences on the offspring with minimal external intervention from the investigator, this model is advantageous ecologically and has a large translational potential. In addition to the direct effect of ELS on neurodevelopmental outcomes, exposure to adverse early environments can also have intergenerational impacts on mental health and function in subsequent-generation offspring. Thus, advancing our understanding of the effect of ELS on brain and behavioral development is of critical concern for the health and wellbeing of both the current

  15. Translational relevance of rodent models of hypothalamic-pituitary-adrenal function and stressors in adolescence

    Directory of Open Access Journals (Sweden)

    Cheryl M. McCormick

    2017-02-01

Full Text Available Elevations in glucocorticoids that result from environmental stressors can have programming effects on brain structure and function when the exposure occurs during sensitive periods that involve heightened neural development. In recent years, adolescence has gained increasing attention as another sensitive period of development, a period in which pubertal transitions may increase the vulnerability to stressors. There are similarities in physical and behavioural development between humans and rats, and rats have been used effectively as an animal model of adolescence and the unique plasticity of this period of ontogeny. This review focuses on the benefits and challenges of rats as a model for translational research on hypothalamic-pituitary-adrenal (HPA) function and stressors in adolescence, highlighting important parallels and contrasts between adolescent rats and humans, and reviews the main stress procedures used to investigate HPA stress responses and their consequences in adolescent rats. We conclude that a greater focus on the timing of puberty as a factor in research on adolescent rats may increase the translational relevance of the findings.

  16. Dynamic Modeling of GAIT System Reveals Transcriptome Expansion and Translational Trickle Control Device

    Science.gov (United States)

    Yao, Peng; Potdar, Alka A.; Arif, Abul; Ray, Partho Sarothi; Mukhopadhyay, Rupak; Willard, Belinda; Xu, Yichi; Yan, Jun; Saidel, Gerald M.; Fox, Paul L.

    2012-01-01

    SUMMARY Post-transcriptional regulatory mechanisms superimpose “fine-tuning” control upon “on-off” switches characteristic of gene transcription. We have exploited computational modeling with experimental validation to resolve an anomalous relationship between mRNA expression and protein synthesis. Differential GAIT (Gamma-interferon Activated Inhibitor of Translation) complex activation repressed VEGF-A synthesis to a low, constant rate despite high, variable VEGFA mRNA expression. Dynamic model simulations indicated the presence of an unidentified, inhibitory GAIT element-interacting factor. We discovered a truncated form of glutamyl-prolyl tRNA synthetase (EPRS), the GAIT constituent that binds the 3’-UTR GAIT element in target transcripts. The truncated protein, EPRSN1, prevents binding of functional GAIT complex. EPRSN1 mRNA is generated by a remarkable polyadenylation-directed conversion of a Tyr codon in the EPRS coding sequence to a stop codon (PAY*). By low-level protection of GAIT element-bearing transcripts, EPRSN1 imposes a robust “translational trickle” of target protein expression. Genome-wide analysis shows PAY* generates multiple truncated transcripts thereby contributing to transcriptome expansion. PMID:22386318

  17. Critical Appraisal of Translational Research Models for Suitability in Performance Assessment of Cancer Centers

    NARCIS (Netherlands)

    Rajan, Abinaya; Sullivan, Richard; Bakker, Suzanne; van Harten, Willem H.

    2012-01-01

    Background. Translational research is a complex cumulative process that takes time. However, the operating environment for cancer centers engaged in translational research is now financially insecure. Centers are challenged to improve results and reduce time from discovery to practice innovations.

  18. Critical appraisal of the suitability of translational research models for performance assessment of cancer institutions

    NARCIS (Netherlands)

    Rajan, A.; Sullivan, R.; Bakker, S.; van Harten, Willem H.

    2012-01-01

    Background. Translational research is a complex cumulative process that takes time. However, the operating environment for cancer centers engaged in translational research is now financially insecure. Centers are challenged to improve results and reduce time from discovery to practice innovations.

  19. Developing a new model for the invention and translation of neurotechnologies in academic neurosurgery.

    Science.gov (United States)

    Leuthardt, Eric C

    2013-01-01

    There is currently an acceleration of new scientific and technical capabilities that create new opportunities for academic neurosurgery. To engage these changing dynamics, the Center for Innovation in Neuroscience and Technology (CINT) was created on the premise that successful innovation of device-related ideas relies on collaboration between multiple disciplines. The CINT has created a unique model that integrates scientific, medical, engineering, and legal/business experts to participate in the continuum from idea generation to translation. To detail the method by which this model has been implemented in the Department of Neurological Surgery at Washington University in St. Louis and the experience that has been accrued thus far. The workflow is structured to enable cross-disciplinary interaction, both intramurally and extramurally between academia and industry. This involves a structured method for generating, evaluating, and prototyping promising device concepts. The process begins with the "invention session," which consists of a structured exchange between inventors from diverse technical and medical backgrounds. Successful ideas, which pass a separate triage mechanism, are then sent to industry-sponsored multidisciplinary fellowships to create functioning prototypes. After 3 years, the CINT has engaged 32 clinical and nonclinical inventors, resulting in 47 ideas, 16 fellowships, and 12 patents, for which 7 have been licensed to industry. Financial models project that if commercially successful, device sales could have a notable impact on departmental revenue. The CINT is a model that supports an integrated approach from the time an idea is created through its translational development. To date, the approach has been successful in creating numerous concepts that have led to industry licenses. In the long term, this model will create a novel revenue stream to support the academic neurosurgical mission.

  20. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable fell below recommended levels in several models. Only 2 models presented recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. This review identified HF risk prediction models with sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.
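The C-statistic used here as the measure of discriminative ability can be computed directly from predicted risks; a minimal pure-Python sketch with hypothetical scores (not data from the review):

```python
from itertools import product

def c_statistic(scores_events, scores_nonevents):
    """Concordance (C-) statistic: the probability that a randomly chosen
    subject who developed the outcome received a higher predicted risk
    than one who did not. Ties count as half-concordant."""
    pairs = concordant = 0
    for e, n in product(scores_events, scores_nonevents):
        pairs += 1
        if e > n:
            concordant += 1
        elif e == n:
            concordant += 0.5
    return concordant / pairs

# Toy predicted risks (hypothetical values, for illustration only)
events = [0.9, 0.8, 0.6]      # subjects who developed heart failure
nonevents = [0.7, 0.4, 0.2]   # subjects who did not
print(c_statistic(events, nonevents))
```

A value of 0.5 indicates no discrimination and 1.0 perfect discrimination; the 0.70 threshold mentioned above is a conventional bar for acceptable discrimination.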

  1. Efficiency of Iranian Translation Syllabus at BA Level; Deficiency: A New Comprehensive Model

    Science.gov (United States)

    Sohrabi, Sarah; Rahimi, Ramin; Arjmandi, Masoume

    2015-01-01

    This study aims at investigating the practicality of the current curriculum for translation studies at national level (Iranian curriculum). It is going to have a comprehensive idea of translation students and teachers (university lecturers) over the current translation syllabus at BA level in Iran. A researcher-made CEQ questionnaire (Curriculum…

  2. Mapping Translation Technology Research in Translation Studies

    DEFF Research Database (Denmark)

    Schjoldager, Anne; Christensen, Tina Paulsen; Flanagan, Marian

    2017-01-01

    /Schjoldager 2010, 2011; Christensen 2011). Unfortunately, the increasing professional use of translation technology has not been mirrored within translation studies (TS) by a similar increase in research projects on translation technology (Munday 2009: 15; O’Hagan 2013; Doherty 2016: 952). The current thematic...... section aims to improve this situation by presenting new and innovative research papers that reflect on recent technological advances and their impact on the translation profession and translators from a diversity of perspectives and using a variety of methods. In Section 2, we present translation...... technology research as a subdiscipline of TS, and we define and discuss some basic concepts and models of the field that we use in the rest of the paper. Based on a small-scale study of papers published in TS journals between 2006 and 2016, Section 3 attempts to map relevant developments of translation...

  3. ROCK inhibition in models of neurodegeneration and its potential for clinical translation.

    Science.gov (United States)

    Koch, Jan Christoph; Tatenhorst, Lars; Roser, Anna-Elisa; Saal, Kim-Ann; Tönges, Lars; Lingor, Paul

    2018-04-03

Neurodegenerative disorders like Parkinson's disease, Alzheimer's disease, or amyotrophic lateral sclerosis affect a rapidly increasing population worldwide. While common pathomechanisms such as protein aggregation, axonal degeneration, dysfunction of protein clearing and an altered immune response have been characterized, no disease-modifying therapies have been developed so far. Interestingly, a significant involvement of the Rho kinase (ROCK) signaling pathway has been described in all of these mechanisms, making it a promising target for new therapeutic approaches. In this article, we first review current knowledge of the involvement of ROCK in neurodegenerative disorders and the utility of its inhibition as a disease-modifying therapy. After a detailed description of the biochemical characteristics of ROCK and its molecular interactors, differences in ROCK expression under physiological and pathological conditions are compared. Next, different pharmacological and molecular-genetic strategies to inhibit ROCK function are discussed, focusing on pharmacological ROCK inhibitors. The role of the ROCK pathway in cellular processes central to the pathology of neurodegenerative disorders, such as axonal degeneration, autophagy, and synaptic and glial function, is explained in detail. Finally, all available data on ROCK inhibition in different animal models of neurodegenerative disorders are reviewed, and first approaches for translation into human patients are discussed. Taken together, there is now extensive evidence from preclinical studies in several neurodegenerative disorders that characterizes ROCK as a promising drug target for further translational research. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  4. DG TO FT - AUTOMATIC TRANSLATION OF DIGRAPH TO FAULT TREE MODELS

    Science.gov (United States)

    Iverson, D. L.

    1994-01-01

Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Each model has its advantages. While digraphs can be derived in a fairly straightforward manner from system schematics and knowledge about component failure modes and system design, the fault tree structure allows for fast processing using efficient techniques developed for tree data structures. The similarities between digraphs and fault trees permit the information encoded in the digraph to be translated into a logically equivalent fault tree. The DG TO FT translation tool will automatically translate digraph models, including those with loops or cycles, into fault tree models that have the same minimum cut set solutions as the input digraph. This tool could be useful, for example, if some parts of a system have been modeled using digraphs and others using fault trees. The digraphs could be translated and incorporated into the fault trees, allowing them to be analyzed using a number of powerful fault tree processing codes, such as cut set and quantitative solution codes. A cut set for a given node is a group of failure events that will cause the failure of the node. A minimum cut set for a node is a cut set such that, if any of its failures were removed, the remaining failures would no longer cause the failure of the event represented by the node. Cut set calculations can be used to find dependencies, weak links, and vital system components whose failures would cause serious system failure. The DG TO FT translation system reads in a digraph with each node listed as a separate object in the input file. The user specifies a terminal node for the digraph that will be used as the top node of the resulting fault tree. A fault tree basic event node representing the failure of that digraph node is created and becomes a child of the terminal
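The translation and the cut set solutions described above can be sketched as follows. This is a simplified illustration for acyclic digraphs only (the actual tool additionally handles loops and cycles), using a hypothetical pump/valve/sensor system:

```python
def digraph_to_fault_tree(digraph, terminal):
    """Translate an acyclic AND/OR digraph into a logically equivalent
    fault tree rooted at `terminal`. Each digraph node maps to a tuple
    (gate, inputs); nodes without inputs become basic events."""
    gate, inputs = digraph[terminal]
    if not inputs:
        return {"event": terminal}
    return {"gate": gate, "node": terminal,
            "children": [digraph_to_fault_tree(digraph, i) for i in inputs]}

def min_cut_sets(tree):
    """Minimum cut sets, bottom-up: OR gates pool their children's cut
    sets, AND gates take cross-products; non-minimal sets are pruned."""
    if "event" in tree:
        return [{tree["event"]}]
    child_sets = [min_cut_sets(c) for c in tree["children"]]
    if tree["gate"] == "OR":
        sets = [s for cs in child_sets for s in cs]
    else:  # AND
        sets = [set()]
        for cs in child_sets:
            sets = [a | b for a in sets for b in cs]
    return [s for s in sets if not any(t < s for t in sets)]

# Hypothetical system: fails if the pump fails OR (valve AND sensor fail)
dg = {"system": ("OR", ["pump", "sub"]),
      "sub": ("AND", ["valve", "sensor"]),
      "pump": (None, []), "valve": (None, []), "sensor": (None, [])}
ft = digraph_to_fault_tree(dg, "system")
print(min_cut_sets(ft))  # two minimum cut sets: {pump} and {valve, sensor}
```

Removing either the valve failure or the sensor failure from the second set no longer fails the system, which is exactly the minimality property defined above.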

  5. A methodology for the parametric modelling of the flow coefficients and flow rate in hydraulic valves

    International Nuclear Information System (INIS)

    Valdés, José R.; Rodríguez, José M.; Saumell, Javier; Pütz, Thomas

    2014-01-01

Highlights: • We develop a methodology for the parametric modelling of flow in hydraulic valves. • We characterize the flow coefficients with a generic function with two parameters. • The parameters are derived from CFD simulations of the generic geometry. • We apply the methodology to two cases from the automotive brake industry. • We validate by comparing with CFD results varying the original dimensions. - Abstract: The main objective of this work is to develop a methodology for the parametric modelling of the flow rate in hydraulic valve systems. This methodology is based on the derivation, from CFD simulations, of the flow coefficient of the critical restrictions as a function of the Reynolds number, using a generalized square root function with two parameters. The methodology is then demonstrated by applying it to two completely different hydraulic systems: a brake master cylinder and an ABS valve. This type of parametric valve model facilitates implementation in dynamic simulation models of complex hydraulic systems.
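As an illustration of this kind of parametric model, the sketch below assumes a two-parameter square-root flow coefficient of the form Cd(Re) = Cd∞·sqrt(Re/(Re + Re_crit)); the exact functional form and all numeric values are hypothetical, not taken from the paper. Because Cd depends on the Reynolds number, which itself depends on the flow, the flow rate is obtained by fixed-point iteration:

```python
import math

# Hypothetical parameters: Cd_inf (asymptotic flow coefficient) and
# Re_crit (transition Reynolds number) would be fitted to CFD data.
CD_INF, RE_CRIT = 0.7, 100.0
RHO, MU = 850.0, 0.03        # hydraulic oil density [kg/m3], viscosity [Pa.s]
D = 1e-3                     # restriction diameter [m]
A = math.pi * D ** 2 / 4     # restriction area [m2]

def cd(re):
    """Two-parameter square-root flow coefficient (assumed form)."""
    return CD_INF * math.sqrt(re / (re + RE_CRIT))

def flow_rate(dp, iters=50):
    """Flow through the restriction for pressure drop dp [Pa]; Cd depends
    on Re, which depends on the flow itself, so iterate to a fixed point."""
    q = A * math.sqrt(2 * dp / RHO)          # initial guess with Cd = 1
    for _ in range(iters):
        re = RHO * (q / A) * D / MU          # Reynolds number of the jet
        q = cd(re) * A * math.sqrt(2 * dp / RHO)
    return q

print(flow_rate(5e5))  # flow [m3/s] for a 5 bar pressure drop
```

At high Re the coefficient saturates at Cd∞ (turbulent regime), while at low Re it drops roughly with sqrt(Re), which is the qualitative behaviour such generic functions are designed to capture.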

  6. Scoping review identifies significant number of knowledge translation theories, models and frameworks with limited use.

    Science.gov (United States)

    Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E

    2018-04-13

    To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.

  7. Introducing the Interactive Model for the Training of Audiovisual Translators and Analysis of Multimodal Texts

    Directory of Open Access Journals (Sweden)

    Pietro Luigi Iaia

    2015-07-01

Full Text Available Abstract – This paper introduces the ‘Interactive Model’ of audiovisual translation developed in the context of my PhD research on the cognitive-semantic, functional and socio-cultural features of the Italian-dubbing translation of a corpus of humorous texts. The Model is based on two interactive macro-phases – ‘Multimodal Critical Analysis of Scripts’ (MuCrAS) and ‘Multimodal Re-Textualization of Scripts’ (MuReTS). Its construction and application are justified by a multidisciplinary approach to the analysis and translation of audiovisual texts, so as to focus on the linguistic and extralinguistic dimensions affecting both the reception of source texts and the production of target ones (Chaume 2004; Díaz Cintas 2004). By resorting to Critical Discourse Analysis (Fairclough 1995, 2001), to a process-based approach to translation and to a socio-semiotic analysis of multimodal texts (van Leeuwen 2004; Kress and van Leeuwen 2006), the Model is meant to be applied to the training of audiovisual translators and discourse analysts in order to help them enquire into the levels of pragmalinguistic equivalence between the source and the target versions. Finally, a practical application shall be discussed, detailing the Italian rendering of a comic sketch from the American late-night talk show Conan.

  8. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development.

    Science.gov (United States)

    Tøndel, Kristin; Niederer, Steven A; Land, Sander; Smith, Nicolas P

    2014-05-20

Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input-output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard deviation of on
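The iterative "zooming" into feasible parameter regions can be sketched with a toy model standing in for the cardiac mechanics simulator; the model, parameter ranges and shrink factor below are all hypothetical illustrations, not the published pipeline:

```python
import random
random.seed(1)

def model(k1, k2, t):
    """Toy deterministic stand-in for the real simulator."""
    return k1 * t / (k2 + t)

# "Measured" data generated from known parameters
true_k1, true_k2 = 2.0, 0.5
data = [(t, model(true_k1, true_k2, t)) for t in (0.1, 0.5, 1.0, 2.0)]

def misfit(k1, k2):
    """Sum of squared deviations from the parameterising data."""
    return sum((model(k1, k2, t) - y) ** 2 for t, y in data)

# Iteratively generate a new experimental design inside the current box,
# keep the best-fitting point, and shrink (zoom) the box around it.
box = [(0.1, 10.0), (0.01, 5.0)]          # initial ranges for k1, k2
best = None
for _ in range(20):
    design = [tuple(random.uniform(lo, hi) for lo, hi in box)
              for _ in range(50)]
    cand = min(design, key=lambda p: misfit(*p))
    if best is None or misfit(*cand) < misfit(*best):
        best = cand
    box = [(max(lo, b - 0.3 * (hi - lo)), min(hi, b + 0.3 * (hi - lo)))
           for (lo, hi), b in zip(box, best)]
print(best, misfit(*best))
```

The real framework replaces the direct simulator calls with a multivariate metamodel (a cheap statistical surrogate), which is what makes repeated design generation and look-up near the measured data affordable.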

  9. Understanding Translation

    DEFF Research Database (Denmark)

    Schjoldager, Anne Gram; Gottlieb, Henrik; Klitgård, Ida

    Understanding Translation is designed as a textbook for courses on the theory and practice of translation in general and of particular types of translation - such as interpreting, screen translation and literary translation. The aim of the book is to help you gain an in-depth understanding...... of the phenomenon of translation and to provide you with a conceptual framework for the analysis of various aspects of professional translation. Intended readers are students of translation and languages, but the book will also be relevant for others who are interested in the theory and practice of translation...... - translators, language teachers, translation users and literary, TV and film critics, for instance. Discussions focus on translation between Danish and English....

  10. FEM BASED PARAMETRIC DESIGN STUDY OF TIRE PROFILE USING DEDICATED CAD MODEL AND TRANSLATION CODE

    Directory of Open Access Journals (Sweden)

    Nikola Korunović

    2014-12-01

    Full Text Available In this paper a finite element method (FEM based parametric design study of the tire profile shape and belt width is presented. One of the main obstacles that similar studies have faced is how to change the finite element mesh after a modification of the tire geometry is performed. In order to overcome this problem, a new approach is proposed. It implies automatic update of the finite elements mesh, which follows the change of geometric design parameters on a dedicated CAD model. The mesh update is facilitated by an originally developed mapping and translation code. In this way, the performance of a large number of geometrically different tire design variations may be analyzed in a very short time. Although a pilot one, the presented study has also led to the improvement of the existing tire design.

  11. Translation Techniques

    OpenAIRE

    Marcia Pinheiro

    2015-01-01

    In this paper, we discuss three translation techniques: literal, cultural, and artistic. Literal translation is a well-known technique, which means that it is quite easy to find sources on the topic. Cultural and artistic translation may be new terms. Whilst cultural translation focuses on matching contexts, artistic translation focuses on matching reactions. Because literal translation matches only words, it is not hard to find situations in which we should not use this technique.  Because a...

  12. Maintaining the clinical relevance of animal models in translational studies of post-traumatic stress disorder.

    Science.gov (United States)

    Cohen, Hagit; Matar, Michael A; Zohar, Joseph

    2014-01-01

    The diagnosis of Post-Traumatic Stress Disorder (PTSD) is conditional on directly experiencing or witnessing a significantly threatening event and the presence of a certain minimal number of symptoms from each of four symptom clusters (re-experiencing, avoidance, negative cognition and mood, and hyperarousal) at least one month after the event (DSM 5) (American Psychiatric Association 2013). Only a proportion of the population exposed develops symptoms fulfilling the criteria. The individual heterogeneity in responses of stress-exposed animals suggested that adapting clearly defined and reliably reproducible "diagnostic", i.e. behavioral, criteria for animal responses would augment the clinical validity of the analysis of study data. We designed cut-off (inclusion/exclusion) behavioral criteria (CBC) which classify study subjects as being severely, minimally or partially affected by the stress paradigm, to be applied retrospectively in the analysis of behavioral data. Behavioral response classification enables the researcher to correlate (retrospectively) specific anatomic, bio-molecular and physiological parameters with the degree and pattern of the individual behavioral response, and also introduces "prevalence rates" as a valid study-parameter. The cumulative results of our studies indicate that, by classifying the data from individual subjects according to their response patterns, the animal study can more readily be translated into clinical "follow-up" studies and back again. This article will discuss the concept of the model and its background, and present a selection of studies employing and examining the model, alongside the underlying translational rationale of each. © The Author 2014. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
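A minimal sketch of how such cut-off behavioral criteria classify individual animals into severely, minimally or partially affected groups; the measures, thresholds and labels below are hypothetical placeholders, not the validated criteria:

```python
def classify(anxiety_index, startle_amplitude,
             anxiety_cutoff=0.8, startle_cutoff=1.5):
    """Cut-off behavioral criteria (CBC), simplified sketch.
    Measures and thresholds are hypothetical placeholders.
    'EBR' = extreme, 'MBR' = minimal, 'PBR' = partial behavioral response."""
    above = (anxiety_index >= anxiety_cutoff,
             startle_amplitude >= startle_cutoff)
    if all(above):
        return "EBR"      # severely affected on both measures
    if not any(above):
        return "MBR"      # minimally affected on both measures
    return "PBR"          # partially affected (mixed profile)

# Hypothetical cohort of stress-exposed animals
cohort = [(0.9, 1.8), (0.2, 0.6), (0.9, 0.4)]
labels = [classify(a, s) for a, s in cohort]
print(labels)                              # ['EBR', 'MBR', 'PBR']
print(labels.count("EBR") / len(labels))   # "prevalence rate" of extreme responders
```

Applied retrospectively, such labels let physiological and molecular endpoints be correlated with individual response patterns, and make the prevalence of extreme responders itself a study parameter, as the abstract describes.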

  13. Methodology for estimating soil carbon for the forest carbon budget model of the United States, 2001

    Science.gov (United States)

    L. S. Heath; R. A. Birdsey; D. W. Williams

    2002-01-01

    The largest carbon (C) pool in United States forests is the soil C pool. We present methodology and soil C pool estimates used in the FORCARB model, which estimates and projects forest carbon budgets for the United States. The methodology balances knowledge, uncertainties, and ease of use. The estimates are calculated using the USDA Natural Resources Conservation...

  14. Methodology for Designing Models Predicting Success of Infertility Treatment

    OpenAIRE

    Alireza Zarinara; Mohammad Mahdi Akhondi; Hojjat Zeraati; Koorsh Kamali; Kazem Mohammad

    2016-01-01

Abstract Background: Prediction models for infertility treatment success have been presented for 25 years. There are scientific principles for designing and applying prediction models, which are also used to predict the success rate of infertility treatment. The purpose of this study is to provide basic principles for designing models to predict infertility treatment success. Materials and Methods: In this paper, the principles for developing predictive models are explained and...

  15. A generic methodology for developing fuzzy decision models

    NARCIS (Netherlands)

    Bosma, R.; Berg, van den J.; Kaymak, U.; Udo, H.; Verreth, J.

    2012-01-01

An important paradigm in decision-making models is utility-maximization, where most models do not include actors’ motives. Fuzzy set theory, on the other hand, offers a method to simulate human decision-making. However, the literature describing expert-driven fuzzy logic models rarely gives precise

  16. A generic methodology for developing fuzzy decision models

    NARCIS (Netherlands)

    Bosma, R.H.; Berg, van den J.; Kaymak, Uzay; Udo, H.M.J.; Verreth, J.A.J.

    2012-01-01

An important paradigm in decision-making models is utility-maximization, where most models do not include actors’ motives. Fuzzy set theory, on the other hand, offers a method to simulate human decision-making. However, the literature describing expert-driven fuzzy logic models rarely gives precise

  17. A methodology for constructing the calculation model of scientific spreadsheets

    NARCIS (Netherlands)

    Vos, de M.; Wielemaker, J.; Schreiber, G.; Wielinga, B.; Top, J.L.

    2015-01-01

Spreadsheet models are frequently used by scientists to analyze research data. These models are typically described in a paper or a report, which serves as the single source of information on the underlying research project. As the calculation workflow in these models is not made explicit, readers are
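One way to make such a calculation workflow explicit is to extract a dependency graph from the cell formulas; below is a minimal sketch over a hypothetical sheet with named cells (a real tool would parse actual spreadsheet files and A1-style references):

```python
import re

# Hypothetical spreadsheet: formulas reference other cells by name.
sheet = {
    "biomass":  "12.5",
    "area":     "3.0",
    "density":  "= biomass / area",
    "log_dens": "= log(density)",
}

def dependencies(sheet):
    """Make the calculation workflow explicit: map each formula cell
    to the cells it reads (identifiers that name other cells)."""
    deps = {}
    for cell, formula in sheet.items():
        if formula.startswith("="):
            deps[cell] = [tok for tok in re.findall(r"[A-Za-z_]\w*", formula)
                          if tok in sheet]
    return deps

print(dependencies(sheet))
# {'density': ['biomass', 'area'], 'log_dens': ['density']}
```

The resulting graph documents which inputs feed which derived quantities, which is exactly the information a reader of the accompanying paper would otherwise have to reconstruct by hand.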

  18. An Overview of Models, Methods, and Reagents Developed for Translational Autoimmunity Research in the Common Marmoset (Callithrix jacchus)

    NARCIS (Netherlands)

    Jagessar, S. Anwar; Vierboom, Michel; Blezer, Erwin L. A.; Bauer, Jan; 't Hart, Bert A.; Kap, Yolanda S.

    The common marmoset (Callithrix jacchus) is a small-bodied Neotropical primate and a useful preclinical animal model for translational research into autoimmune-mediated inflammatory diseases (AIMID), such as rheumatoid arthritis (RA) and multiple sclerosis (MS). The animal model for MS established

  19. An overview of models, methods, and reagents developed for translational autoimmunity research in the common marmoset (Callithrix jacchus)

    NARCIS (Netherlands)

    S.A. Jagessar (Anwar); M.P.M. Vierboom (Michel); E. Blezer (Erwin); J. Bauer; B.A. 't Hart (Bert); Y.S. Kap (Yolanda)

    2013-01-01

    textabstractThe common marmoset (Callithrix jacchus) is a small-bodied Neotropical primate and a useful preclinical animal model for translational research into autoimmune-mediated inflammatory diseases (AIMID), such as rheumatoid arthritis (RA) and multiple sclerosis (MS). The animal model for MS

  20. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main results presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types...... of experiments are performed, a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for usage with systems composed of reconfigurable underwater modules...

  1. Cognitive Dysfunction in Major Depressive Disorder. A Translational Review in Animal Models of the Disease

    Science.gov (United States)

    Darcet, Flavie; Gardier, Alain M.; Gaillard, Raphael; David, Denis J.; Guilloux, Jean-Philippe

    2016-01-01

Major Depressive Disorder (MDD) is the most common psychiatric disease, affecting millions of people worldwide. In addition to the well-defined depressive symptoms, patients suffering from MDD consistently complain about cognitive disturbances, significantly exacerbating the burden of this illness. Among cognitive symptoms, impairments in attention, working memory, learning and memory or executive functions are often reported. However, available data about the heterogeneity of MDD patients and magnitude of cognitive symptoms through the different phases of MDD remain difficult to summarize. Thus, the first part of this review briefly overviews clinical studies, focusing on the cognitive dysfunctions depending on the MDD type. As animal models are essential translational tools for underpinning the mechanisms of cognitive deficits in MDD, the second part of this review synthesizes preclinical studies observing cognitive deficits in different rodent models of anxiety/depression. For each cognitive domain, we determined whether deficits could be shared across models. Particularly, we established whether specific stress-related procedures or unspecific criteria (such as species, sex or age) could segregate common cognitive alterations across models. Finally, the role of adult hippocampal neurogenesis in rodents in cognitive dysfunctions during MDD state was also discussed. PMID:26901205

  2. Extensive and systematic rewiring of histone post-translational modifications in cancer model systems.

    Science.gov (United States)

    Noberini, Roberta; Osti, Daniela; Miccolo, Claudia; Richichi, Cristina; Lupia, Michela; Corleone, Giacomo; Hong, Sung-Pil; Colombo, Piergiuseppe; Pollo, Bianca; Fornasari, Lorenzo; Pruneri, Giancarlo; Magnani, Luca; Cavallaro, Ugo; Chiocca, Susanna; Minucci, Saverio; Pelicci, Giuliana; Bonaldi, Tiziana

    2018-05-04

    Histone post-translational modifications (PTMs) generate a complex combinatorial code that regulates gene expression and nuclear functions, and whose deregulation has been documented in different types of cancers. Therefore, the availability of relevant culture models that can be manipulated and that retain the epigenetic features of the tissue of origin is absolutely crucial for studying the epigenetic mechanisms underlying cancer and testing epigenetic drugs. In this study, we took advantage of quantitative mass spectrometry to comprehensively profile histone PTMs in patient tumor tissues, primary cultures and cell lines from three representative tumor models, breast cancer, glioblastoma and ovarian cancer, revealing an extensive and systematic rewiring of histone marks in cell culture conditions, which includes a decrease of H3K27me2/me3, H3K79me1/me2 and H3K9ac/K14ac, and an increase of H3K36me1/me2. While some changes occur in short-term primary cultures, most of them are instead time-dependent and appear only in long-term cultures. Remarkably, such changes mostly revert in cell line- and primary cell-derived in vivo xenograft models. Taken together, these results support the use of xenografts as the most representative models of in vivo epigenetic processes, suggesting caution when using cultured cells, in particular cell lines and long-term primary cultures, for epigenetic investigations.

  3. Cognitive Dysfunction in Major Depressive Disorder. A Translational Review in Animal Models of the Disease

    Directory of Open Access Journals (Sweden)

    Flavie Darcet

    2016-02-01

    Major Depressive Disorder (MDD) is the most common psychiatric disease, affecting millions of people worldwide. In addition to the well-defined depressive symptoms, patients suffering from MDD consistently complain about cognitive disturbances, significantly exacerbating the burden of this illness. Among cognitive symptoms, impairments in attention, working memory, learning and memory or executive functions are often reported. However, available data about the heterogeneity of MDD patients and the magnitude of cognitive symptoms through the different phases of MDD remain difficult to summarize. Thus, the first part of this review briefly overviews clinical studies, focusing on the cognitive dysfunctions depending on the MDD type. As animal models are essential translational tools for underpinning the mechanisms of cognitive deficits in MDD, the second part of this review synthesizes preclinical studies observing cognitive deficits in different rodent models of anxiety/depression. For each cognitive domain, we determined whether deficits could be shared across models. In particular, we established whether specific stress-related procedures or unspecific criteria (such as species, sex or age) could segregate common cognitive alterations across models. Finally, the role of adult hippocampal neurogenesis in rodents in cognitive dysfunctions during the MDD state is also discussed.

  4. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
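
    The primary/secondary distinction above can be made concrete with a minimal sketch: a primary model (here, logistic growth) describes the population over time, while a secondary model (here, a Ratkowsky square-root model) links the growth rate to an environmental factor such as temperature. All parameter values below are illustrative, not fitted to any real dataset.

```python
import numpy as np

def ratkowsky_sqrt_mu(T, b=0.023, T_min=2.0):
    """Secondary model: Ratkowsky square-root model linking the maximum
    specific growth rate mu_max (1/h) to temperature T (deg C).
    Parameter values are illustrative only."""
    return (b * (T - T_min)) ** 2

def logistic_growth(t, N0, N_max, mu_max):
    """Primary model: logistic growth of a population over time t (h)."""
    return N_max / (1.0 + (N_max / N0 - 1.0) * np.exp(-mu_max * t))

# Chain the two levels: temperature -> growth rate -> growth curve
T = 15.0
mu = ratkowsky_sqrt_mu(T)
t = np.linspace(0.0, 72.0, 10)
N = logistic_growth(t, N0=1e3, N_max=1e9, mu_max=mu)  # CFU/g over 72 h
```

A tertiary model would wrap this chain in a user-facing tool; an IbM would instead simulate each cell individually and recover the same curve as an emergent population-level observable.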

  5. Proposal for a product development model focused on CE certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

    This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of Product Certification of the European Community (CE). Furthermore, it presents a product development model comprising the steps in the models analyzed, including improvements in activities for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  6. Determinants of translation ambiguity

    Science.gov (United States)

    Degani, Tamar; Prior, Anat; Eddington, Chelsea M.; Arêas da Luz Fontes, Ana B.; Tokowicz, Natasha

    2016-01-01

    Ambiguity in translation is highly prevalent, and has consequences for second-language learning and for bilingual lexical processing. To better understand this phenomenon, the current study compared the determinants of translation ambiguity across four sets of translation norms from English to Spanish, Dutch, German and Hebrew. The number of translations an English word received was correlated across these different languages, and was also correlated with the number of senses the word has in English, demonstrating that translation ambiguity is partially determined by within-language semantic ambiguity. For semantically-ambiguous English words, the probability of the different translations in Spanish and Hebrew was predicted by the meaning-dominance structure in English, beyond the influence of other lexical and semantic factors, for bilinguals translating from their L1, and translating from their L2. These findings are consistent with models postulating direct access to meaning from L2 words for moderately-proficient bilinguals. PMID:27882188
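
    The core measurement described above reduces to correlating per-word translation counts across norms, and against within-language sense counts. A toy sketch (all words and counts below are invented placeholders, not values from the actual norms):

```python
import numpy as np

# Hypothetical translation norms: number of distinct translations each English
# word received in two target languages, and its number of English senses.
words       = ["bank", "run", "light", "table", "spring", "dog"]
n_spanish   = [3, 4, 3, 1, 4, 1]
n_dutch     = [2, 5, 3, 1, 3, 1]
n_senses_en = [2, 6, 4, 1, 4, 1]

def pearson(x, y):
    """Pearson correlation between two count vectors."""
    return float(np.corrcoef(x, y)[0, 1])

# Ambiguity correlates across target languages...
r_cross = pearson(n_spanish, n_dutch)
# ...and with within-language semantic ambiguity in English.
r_senses = pearson(n_spanish, n_senses_en)
```

With the invented counts both correlations come out strongly positive, mirroring the qualitative pattern the norms study reports.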

  7. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.

  8. New systematic methodology for incorporating dynamic heat transfer modelling in multi-phase biochemical reactors.

    Science.gov (United States)

    Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E

    2014-09-01

    This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model components stoichiometry and formation enthalpies, the proposed modelling methodology has integrated successfully the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in the biochemical reactors. The methodology has been implemented in a plant-wide modelling methodology in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, as illustrative examples of the capability of the methodology, two case studies have been described. In the first one, a predenitrification-nitrification dynamic process has been analysed, with the aim of demonstrating the easy integration of the methodology in any system. In the second case study, the simulation of a thermal model for an ATAD has shown the potential of the proposed methodology for analysing the effect of ventilation and influent characterization. Copyright © 2014 Elsevier Ltd. All rights reserved.
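
    The central computation the abstract describes, obtaining the enthalpy change of each transformation from component formation enthalpies via Hess's law, can be sketched with a toy stoichiometric matrix. The components and values below are a textbook combustion example, not the paper's biochemical model:

```python
import numpy as np

# Columns = model components; entries of the stoichiometric matrix are
# negative for consumed and positive for produced components.
components = ["glucose", "O2", "CO2", "H2O"]
dHf = np.array([-1273.3, 0.0, -393.5, -285.8])  # formation enthalpies, kJ/mol

# One transformation per row. Here: C6H12O6 + 6 O2 -> 6 CO2 + 6 H2O
stoich = np.array([[-1.0, -6.0, 6.0, 6.0]])

# Hess's law: the enthalpy change of each transformation falls out of the
# same matrix structure already used for the mass balances.
dH_rxn = stoich @ dHf  # kJ per mol of reaction extent (negative = exothermic)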

  9. Future Directions in Medical Physics: Models, Technology, and Translation to Medicine

    Science.gov (United States)

    Siewerdsen, Jeffrey

    The application of physics in medicine has been integral to major advances in diagnostic and therapeutic medicine. Two primary areas represent the mainstay of medical physics research in the last century: in radiation therapy, physicists have propelled advances in conformal radiation treatment and high-precision image guidance; and in diagnostic imaging, physicists have advanced an arsenal of multi-modality imaging that includes CT, MRI, ultrasound, and PET as indispensable tools for noninvasive screening, diagnosis, and assessment of treatment response. In addition to their role in building such technologically rich fields of medicine, physicists have also become integral to daily clinical practice in these areas. The future suggests new opportunities for multi-disciplinary research bridging physics, biology, engineering, and computer science, and collaboration in medical physics carries a strong capacity for identification of significant clinical needs, access to clinical data, and translation of technologies to clinical studies. In radiation therapy, for example, the extraction of knowledge from large datasets on treatment delivery, image-based phenotypes, genomic profile, and treatment outcome will require innovation in computational modeling and connection with medical physics for the curation of large datasets. Similarly in imaging physics, the demand for new imaging technology capable of measuring physical and biological processes over orders of magnitude in scale (from molecules to whole organ systems) and exploiting new contrast mechanisms for greater sensitivity to molecular agents and subtle functional / morphological change will benefit from multi-disciplinary collaboration in physics, biology, and engineering. Also in surgery and interventional radiology, where needs for increased precision and patient safety meet constraints in cost and workflow, development of new technologies for imaging, image registration, and robotic assistance can leverage

  10. National Center for Advancing Translational Sciences

    Science.gov (United States)


  11. Jungmann's translation of Paradise Lost

    OpenAIRE

    Janů, Karel

    2014-01-01

    This thesis examines Josef Jungmann's translation of John Milton's Paradise Lost. Josef Jungmann was one of the leading figures of the Czech National Revival and translated Milton's poem between the years 1800 and 1804. The thesis covers Jungmann's theoretical model of translation and presents Jungmann's motives for translating Milton's epic poem. The paper also describes the aims Jungmann had with his translation and whether he achieved them. The reception Jungmann's translation rece...

  12. Methodologies for quantitative systems pharmacology (QSP) models : Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, Hp; Agoram, B.; Davies, M.R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, Ph.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  13. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  14. NOVEL APPROACH TO IMPROVE GEOCENTRIC TRANSLATION MODEL PERFORMANCE USING ARTIFICIAL NEURAL NETWORK TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Yao Yevenyo Ziggah

    The geocentric translation model (GTM) has in recent times not gained much popularity in coordinate transformation research due to its attainable accuracy. Accurate transformation of coordinates is a major goal and essential procedure for the solution of a number of important geodetic problems. Therefore, motivated by the successful application of Artificial Intelligence techniques in geodesy, this study developed, tested and compared a novel technique capable of improving the accuracy of the GTM. First, the GTM based on official parameters (OP) and on new parameters determined using the arithmetic mean (AM) was applied to transform coordinates from the global WGS84 datum to the local Accra datum. On the basis of the results, the new parameters (AM) attained a maximum horizontal position error of 1.99 m compared to the 2.75 m attained by the OP. In line with this, the artificial neural network technologies of backpropagation neural network (BPNN), radial basis function neural network (RBFNN) and generalized regression neural network (GRNN) were then used to compensate for the GTM-generated errors based on the AM parameters to obtain a new coordinate transformation model. The new implemented models offered significant improvement in the horizontal position error, from 1.99 m to 0.93 m.
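
    The two-step scheme described above can be sketched as follows. The numbers are invented, not the published Accra-datum parameters, and a plain least-squares surface stands in for the BPNN/RBFNN/GRNN error-compensation networks to keep the sketch self-contained:

```python
import numpy as np

def geocentric_translation(xyz, dX):
    """Step 1: apply a three-parameter geocentric translation model (GTM)."""
    return xyz + dX

rng = np.random.default_rng(0)
xyz_global = rng.uniform(6_370_000.0, 6_380_000.0, size=(50, 3))  # toy coords
dX_true = np.array([-130.0, 29.0, 364.0])                          # invented shift

# "Observed" local coordinates: the true shift plus a smooth systematic error
systematic = 0.5e-6 * (xyz_global[:, :1] - 6_375_000.0)
xyz_local = xyz_global + dX_true + systematic

dX_est = np.array([-129.5, 29.2, 363.7])          # imperfect GTM parameters
pred = geocentric_translation(xyz_global, dX_est)
residuals = xyz_local - pred                       # GTM-generated errors

# Step 2: fit a correction model to the residuals (least squares standing in
# for the neural networks used in the paper) and apply it.
A = np.column_stack([xyz_global[:, 0], np.ones(len(xyz_global))])
coef, *_ = np.linalg.lstsq(A, residuals, rcond=None)
corrected = pred + A @ coef

rmse_before = float(np.sqrt(np.mean(residuals ** 2)))
rmse_after = float(np.sqrt(np.mean((xyz_local - corrected) ** 2)))
```

The point of the design is that the correction model only has to learn the smooth residual field left behind by the translation, which is a much easier target than the full transformation.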

  15. The Potential of Zebrafish as a Model Organism for Improving the Translation of Genetic Anticancer Nanomedicines

    Directory of Open Access Journals (Sweden)

    C Gutiérrez-Lovera

    2017-11-01

    In the last few decades, the field of nanomedicine applied to cancer has revolutionized cancer treatment: several nanoformulations have already reached the market and are routinely being used in clinical practice. In the case of genetic nanomedicines, i.e., those designed to deliver gene therapies to cancer cells for therapeutic purposes, advances have been less impressive. This is because of the many barriers that limit the access of the therapeutic nucleic acids to their target site, and the lack of models that would allow for an improvement in the understanding of how nanocarriers can be tailored to overcome them. Zebrafish has important advantages as a model species for the study of anticancer therapies, and has much to offer regarding the rational development of efficient delivery of genetic nanomedicines, hence increasing the chances of their successful translation. This review aims to provide an overview of the recent advances in the development of genetic anticancer nanomedicines, and of the zebrafish models that stand as promising tools to shed light on their mechanisms of action and overall potential in oncology.

  16. Principles and methodology for translation and cross-cultural adaptation of the Nordic Occupational Skin Questionnaire (NOSQ-2002) to Spanish and Catalan.

    Science.gov (United States)

    Sala-Sastre, Nohemi; Herdman, Mike; Navarro, Lidia; de la Prada, Miriam; Pujol, Ramón M; Serra, Consol; Alonso, Jordi; Flyvholm, Mari-Ann; Giménez-Arnau, Ana M

    2009-08-01

    Occupational skin diseases are among the most frequent work-related diseases in industrialized countries. The Nordic Occupational Skin Questionnaire (NOSQ-2002), developed in English, is a useful tool for screening of occupational skin diseases. To culturally adapt the NOSQ-2002 to Spanish and Catalan and to assess the clarity, comprehension, cultural relevance and appropriateness of the translated versions. The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) principles of good practice for the translation and cultural adaptation of patient-reported outcomes were followed. After translation into the target language, a first consensus version of the questionnaire was evaluated in multiple cognitive debriefing interviews. The expert panel introduced modifications in 39 (68%) and 27 (47%) items in the Spanish and Catalan versions, respectively (e.g. addition of examples and definitions, reformulation of instructions and use of direct question format). This version was back-translated and submitted to the original authors, who suggested a further seven and two modifications in the Spanish and Catalan versions, respectively. A second set of cognitive interviews was performed. A consensus version of both questionnaires was obtained after final modifications based on comments by the patients. The final versions of the Spanish and Catalan NOSQ-2002 questionnaires are now available at www.NRCWE.dk/NOSQ.

  17. A changing climate: impacts on human exposures to O3 using an integrated modeling methodology

    Science.gov (United States)

    Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposu...

  18. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    In the present paper an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as the occurrence of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine if one model setup gives a better goodness of fit conditional on the observations than the other. Moreover, different methodological investigations of GLUE are conducted in order to test if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it was shown to be quite difficult to get good fits of the whole time series.
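
    A minimal GLUE sketch, with a toy model standing in for MOUSE and the Nash-Sutcliffe efficiency as the informal likelihood measure, illustrates the conditioning step: Monte Carlo sample parameter sets, score each against observations, keep the "behavioural" sets above a threshold, and form prediction bounds from them.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0.0, 10.0, 50)

def model(t, a, b):
    """Toy stand-in for the drainage model: an exponential recession."""
    return a * np.exp(-b * t)

obs = model(t, 2.0, 0.5) + rng.normal(0.0, 0.05, t.size)  # synthetic gauge data

# Monte Carlo sampling of the parameter space
n = 2000
a_s = rng.uniform(0.5, 4.0, n)
b_s = rng.uniform(0.1, 1.0, n)
sims = model(t, a_s[:, None], b_s[:, None])               # shape (n, len(t))

# Informal likelihood: Nash-Sutcliffe efficiency of each parameter set
nse = 1.0 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
behavioural = nse > 0.7                                   # acceptance threshold
w = nse[behavioural] / nse[behavioural].sum()             # normalised likelihoods

# Prediction bounds from the behavioural ensemble (full GLUE would use
# likelihood-weighted quantiles; plain quantiles keep the sketch short)
lower = np.quantile(sims[behavioural], 0.05, axis=0)
upper = np.quantile(sims[behavioural], 0.95, axis=0)
mean_pred = w @ sims[behavioural]                         # weighted mean prediction
```

Testing alternative conceptual setups, as in the paper, amounts to repeating this procedure per setup and comparing the behavioural likelihood distributions.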

  19. An improved methodology for dynamic modelling and simulation of ...

    Indian Academy of Sciences (India)

    The diversity of the processes and the complexity of the drive system ... modelling the specific event, general simulation tools such as MATLAB provide the user with tools for creating ... using the pulse width modulation (PWM) techniques.

  20. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data-based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments
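
    The event-chain idea, sequencing per-event models and estimating the end-to-end probability by Monte Carlo, can be sketched as follows. All distributions and thresholds below are invented placeholders, not TORMIS data:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000  # Monte Carlo trials, each representing one plant-year

# Event 1: does a tornado strike? (assumed annual probability)
p_tornado = 1e-3
tornado = rng.random(n) < p_tornado

# Event 2: is a missile injected? (assumed wind-speed distribution, m/s)
wind = rng.weibull(2.0, n) * 60.0
injected = tornado & (wind > 50.0)

# Event 3: does the missile transport far enough to reach the structure?
range_m = rng.exponential(120.0, n)  # assumed transport-distance distribution
hits = injected & (range_m > 200.0)

# Monte Carlo estimate of the per-year impact probability
p_damage = hits.mean()
```

Each boolean stage corresponds to one model in the sequenced time-history simulation; the real methodology replaces these placeholder distributions with the data-based models listed in the abstract.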

  1. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    This paper presents a software engineering approach to a research proposal for building an expert system for scheduling in service systems, using methodologies and processes of software development. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We draw UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology, identifying the actors, elements and interactions in the research process.

  2. Developing translational medicine professionals : The Marie Skłodowska-Curie action model

    NARCIS (Netherlands)

    Petrelli, Alessandra; Prakken, Berent J.; Rosenblum, Norman D.

    2016-01-01

    End goal of translational medicine is to combine disciplines and expertise to eventually promote improvement of the global healthcare system by delivering effective therapies to individuals and society. Well-trained experts of the translational medicine process endowed with profound knowledge of

  3. A mediation model for the translation of radio news texts in a ...

    African Journals Online (AJOL)

    Broadcast journalists in South Africa are media workers, editors and translators simultaneously producing news for bilingual or multilingual audiences. News texts are translated from English into one or more of the other official languages, depending on the target audience of the broadcaster. This article aims to indicate how ...

  4. Methodology for assessing electric vehicle charging infrastructure business models

    OpenAIRE

    Madina, Carlos; Zamora, Inmaculada; Zabala, Eduardo

    2016-01-01

    The analysis of economic implications of innovative business models in networked environments, as electro-mobility is, requires a global approach to ensure that all the involved actors obtain a benefit. Although electric vehicles (EVs) provide benefits for the society as a whole, there are a number of hurdles for their widespread adoption, mainly the high investment cost for the EV and for the infrastructure. Therefore, a sound business model must be built up for charging service operators, w...

  5. The methodology of energy policy-making in economical models

    Energy Technology Data Exchange (ETDEWEB)

    Poursina, B.

    1998-08-01

    The careful study of energy is a subject that has been investigated in the human sciences from different points of view. Because of its importance and its effect on different dimensions of human life, this research has expanded into the fields of political and economic science. Economics evaluates energy alongside elements such as labour, capital and technology in the production functions of firms; the nature of these discussions is mainly micro-analytical. Nevertheless, the variations and challenges concerning energy and the environment during recent decades, and economists' detailed investigations in analysing and evaluating them, have brought energy discussions in a special shape into macro planning and large economic models. The paper compares various energy models - EFDM, MEDEE, MIDAS and HERMES. This extent of planning, and consequently of modelling, which lacks a background in earlier economic research, deals with the analysis of the interacting effects of energy and the economy. Modelling of energy-economy interaction and of energy policy in large macroeconomic models are new ideas in energy studies and economics. 7 refs., 6 figs., 1 tab.

  6. A Comparative Study of Three Methodologies for Modeling Dynamic Stall

    Science.gov (United States)

    Sankar, L.; Rhee, M.; Tung, C.; ZibiBailly, J.; LeBalleur, J. C.; Blaise, D.; Rouzaud, O.

    2002-01-01

    During the past two decades, there has been an increased reliance on the use of computational fluid dynamics methods for modeling rotors in high speed forward flight. Computational methods are being developed for modeling the shock induced loads on the advancing side, first-principles based modeling of the trailing wake evolution, and for retreating blade stall. The retreating blade dynamic stall problem has received particular attention, because the large variations in lift and pitching moments encountered in dynamic stall can lead to blade vibrations and pitch link fatigue. Restricting attention to aerodynamics, the numerical prediction of dynamic stall is still a complex and challenging CFD problem that, even in two dimensions at low speed, gathers the major difficulties of aerodynamics, such as the grid resolution requirements for the viscous phenomena at leading-edge bubbles or in mixing layers and the bias of the numerical viscosity, as well as the major difficulties of physical modeling, such as the turbulence and transition models, whose determinant influences, already present in static maximum-lift or stall computations, are emphasized by the dynamic aspect of the phenomena.

  7. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  8. Translational neuropharmacology and the appropriate and effective use of animal models.

    Science.gov (United States)

    Green, A R; Gabrielsson, J; Fone, K C F

    2011-10-01

    This issue of the British Journal of Pharmacology is dedicated to reviews of the major animal models used in neuropharmacology to examine drugs for both neurological and psychiatric conditions. Almost all major conditions are reviewed. In general, regulatory authorities require evidence for the efficacy of novel compounds in appropriate animal models. However, the failure of many compounds in clinical trials following clear demonstration of efficacy in animal models has called into question both the value of the models and the discovery process in general. These matters are expertly reviewed in this issue and proposals for better models outlined. In this editorial, we further suggest that more attention be paid to incorporating pharmacokinetic knowledge into the studies (quantitative pharmacology). We also suggest that more attention be paid to ensuring that full methodological details are published, and recommend that journals should be more amenable to publishing negative data. Finally, we propose that new approaches must be used in drug discovery so that preclinical studies become more reflective of the clinical situation, and studies using animal models mimic the anticipated design of studies to be performed in humans, as closely as possible. © 2011 The Authors. British Journal of Pharmacology © 2011 The British Pharmacological Society.

  9. Terminology and methodology in modelling for water quality management

    DEFF Research Database (Denmark)

    Carstensen, J.; Vanrolleghem, P.; Rauch, W.

    1997-01-01

    There is a widespread need for a common terminology in modelling for water quality management. This paper points out sources of confusion in the communication between researchers due to misuse of existing terminology or use of unclear terminology. The paper attempts to clarify the context of the most widely used terms for characterising models and within the process of model building. It is essential to the ever growing society of researchers within water quality management that communication is eased by establishing a common terminology. This should not be done by giving broader definitions of the terms, but by stressing the use of a stringent terminology. Therefore, the goal of the paper is to advocate the use of such a well defined and clear terminology. (C) 1997 IAWQ. Published by Elsevier Science Ltd.

  10. Benefits and limitations of animal models in partial bladder outlet obstruction for translational research.

    Science.gov (United States)

    Kitta, Takeya; Kanno, Yukiko; Chiba, Hiroki; Higuchi, Madoka; Ouchi, Mifuka; Togo, Mio; Moriya, Kimihiko; Shinohara, Nobuo

    2018-01-01

    The functions of the lower urinary tract have been investigated for more than a century. Lower urinary tract symptoms after partial bladder outlet obstruction, such as incomplete bladder emptying, weak urine stream, daytime urinary frequency, urgency, urge incontinence and nocturia, are a frequent consequence of benign prostatic hyperplasia in aging men. However, the pathophysiological mechanisms have not been fully elucidated. The use of animal models is absolutely imperative for understanding the pathophysiological processes involved in bladder dysfunction. Surgical induction has been used to study lower urinary tract functions of numerous animal species, such as pig, dog, rabbit, guinea pig, rat and mouse, of both sexes. Several morphological and functional modifications under partial bladder outlet obstruction have been observed not only in the bladder, but also in the central nervous system. Understanding the changes of lower urinary tract functions induced by partial bladder outlet obstruction would also contribute to appropriate drug development for treating these pathophysiological conditions. In the present review, we discuss techniques for creating partial bladder outlet obstruction, the characteristics of several species, as well as issues of each model, and their translational value. © 2017 The Japanese Urological Association.

  11. CloudLM: a Cloud-based Language Model for Machine Translation

    Directory of Open Access Journals (Sweden)

    Ferrández-Tordera Jorge

    2016-04-01

Full Text Available Language models (LMs) are an essential element in statistical approaches to natural language processing for tasks such as speech recognition and machine translation (MT). The advent of big data has led to the availability of massive amounts of data for building LMs; in fact, for the most prominent languages, with current techniques and hardware it is not feasible to train LMs on all the data available nowadays. At the same time, it has been shown that the more data is used for an LM, the better the performance, e.g. for MT, with no indication yet of reaching a plateau. This paper presents CloudLM, an open-source cloud-based LM intended for MT, which allows querying distributed LMs. CloudLM relies on Apache Solr and provides the functionality of state-of-the-art language modelling (it builds upon KenLM), while allowing massive LMs to be queried (as the use of local memory is drastically reduced), at the expense of slower decoding speed.
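A minimal sketch of the querying pattern behind such a distributed LM: a client fetches n-gram log-probabilities from a remote store and applies standard back-off scoring. Here a plain dict stands in for the Solr-backed index; `NGRAM_STORE`, `lookup`, and all probability values are illustrative assumptions, not CloudLM's actual API.

```python
# Hypothetical stand-in for a remote (e.g. Solr-backed) n-gram store:
# maps an n-gram tuple to (log10 probability, log10 backoff weight).
NGRAM_STORE = {
    ("the",): (-1.0, -0.5),
    ("cat",): (-2.0, -0.4),
    ("the", "cat"): (-0.7, None),
}

def lookup(ngram):
    """One remote round-trip per n-gram in a real client; a dict get here."""
    return NGRAM_STORE.get(tuple(ngram))

def score(ngram):
    """Standard back-off scoring: use the full n-gram if stored; otherwise
    back off to the shortened n-gram and add the history's backoff weight."""
    entry = lookup(ngram)
    if entry is not None:
        return entry[0]
    if len(ngram) == 1:
        return -99.0  # out-of-vocabulary floor
    hist = lookup(ngram[:-1])
    backoff = hist[1] if hist and hist[1] is not None else 0.0
    return backoff + score(ngram[1:])

print(score(["the", "cat"]))  # stored bigram: -0.7
print(score(["a", "cat"]))    # backs off to P(cat): -2.0
```

In a real deployment each `lookup` is a network request, which is why CloudLM trades decoding speed for the ability to hold the model outside local memory.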

  12. Imaging of Small Animal Peripheral Artery Disease Models: Recent Advancements and Translational Potential

    Directory of Open Access Journals (Sweden)

    Jenny B. Lin

    2015-05-01

    Full Text Available Peripheral artery disease (PAD is a broad disorder encompassing multiple forms of arterial disease outside of the heart. As such, PAD development is a multifactorial process with a variety of manifestations. For example, aneurysms are pathological expansions of an artery that can lead to rupture, while ischemic atherosclerosis reduces blood flow, increasing the risk of claudication, poor wound healing, limb amputation, and stroke. Current PAD treatment is often ineffective or associated with serious risks, largely because these disorders are commonly undiagnosed or misdiagnosed. Active areas of research are focused on detecting and characterizing deleterious arterial changes at early stages using non-invasive imaging strategies, such as ultrasound, as well as emerging technologies like photoacoustic imaging. Earlier disease detection and characterization could improve interventional strategies, leading to better prognosis in PAD patients. While rodents are being used to investigate PAD pathophysiology, imaging of these animal models has been underutilized. This review focuses on structural and molecular information and disease progression revealed by recent imaging efforts of aortic, cerebral, and peripheral vascular disease models in mice, rats, and rabbits. Effective translation to humans involves better understanding of underlying PAD pathophysiology to develop novel therapeutics and apply non-invasive imaging techniques in the clinic.

  13. Translational research in immune senescence: Assessing the relevance of current models

    Science.gov (United States)

    High, Kevin P.; Akbar, Arne N.; Nikolich-Zugich, Janko

    2014-01-01

    Advancing age is accompanied by profound changes in immune function; some are induced by the loss of critical niches that support development of naïve cells (e.g. thymic involution), others by the intrinsic physiology of long-lived cells attempting to maintain homeostasis, still others by extrinsic effects such as oxidative stress or long-term exposure to antigen due to persistent viral infections. Once compensatory mechanisms can no longer maintain a youthful phenotype the end result is the immune senescent milieu – one characterized by chronic, low grade, systemic inflammation and impaired responses to immune challenge, particularly when encountering new antigens. This state is associated with progression of chronic illnesses like atherosclerosis and dementia, and an increased risk of acute illness, disability and death in older adults. The complex interaction between immune senescence and chronic illness provides an ideal landscape for translational research with the potential to greatly affect human health. However, current animal models and even human investigative strategies for immune senescence have marked limitations, and the reductionist paradigm itself may be poorly suited to meet these challenges. A new paradigm, one that embraces complexity as a core feature of research in older adults is required to address the critical health issues facing the burgeoning senior population, the group that consumes the majority of healthcare resources. In this review, we outline the major advantages and limitations of current models and offer suggestions for how to move forward. PMID:22633440

  14. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.
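The subtraction-based noise measurement described above can be sketched as follows, with synthetic arrays standing in for the repeated phantom scans: subtracting two repeats cancels the fixed anatomy, and dividing the difference image's standard deviation by sqrt(2) recovers the per-image quantum noise. All values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for two repeated scans of the same textured phantom:
# identical anatomy plus independent quantum-noise realizations.
anatomy = rng.normal(50.0, 10.0, size=(256, 256))  # fixed background texture
sigma_quantum = 8.0
scan1 = anatomy + rng.normal(0.0, sigma_quantum, size=anatomy.shape)
scan2 = anatomy + rng.normal(0.0, sigma_quantum, size=anatomy.shape)

# The difference cancels the (identical) anatomy, leaving only quantum noise;
# its variance is the sum of the two scans' noise variances, hence sqrt(2).
diff = scan1 - scan2
noise_estimate = diff.std() / np.sqrt(2.0)
print(round(noise_estimate, 1))  # should be close to the true 8.0
```

Measuring noise this way, rather than from a uniform region, is what lets the refined model capture the background-dependent noise of iterative reconstruction.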

  15. Development and Validation of a Translation Test.

    Science.gov (United States)

    Ghonsooly, Behzad

    1993-01-01

    Translation testing methodology has been criticized for its subjective character. No real strides have so far been made in developing an objective translation test. In this paper, certain detailed procedures including various phases of pretesting have been performed to achieve objectivity and scorability in translation testing methodology. In…

  16. A Methodology for Validation of High Resolution Combat Models

    Science.gov (United States)

    1988-06-01

Table-of-contents fragments: C. The Epistemological Problem; D. The Uncertainty Principle. Abstract fragments discuss "The Teleological Problem" (how a model by its nature formulates an explicit cause-and-effect relationship that excludes other theoretical issues), the role of "experts" in establishing the standard for reality, and how generalization from personal experience is often hampered by the parochial aspects of the

  17. Experimental animal models for COPD: a methodological review

    Directory of Open Access Journals (Sweden)

    Vahideh Ghorani

    2017-05-01

The present review describes the various methods used for induction of animal models of COPD, the different animals used (mainly mice, guinea pigs and rats), and the measured parameters. The information provided in this review is valuable for choosing the appropriate animal, the method of induction, and the parameters to be measured in studies concerning COPD.

18. A multiscale approach to blast neurotrauma modeling: Part II: Methodology for inducing blast injury to in vitro models

    Directory of Open Access Journals (Sweden)

    Gwen B. Effgen

    2012-02-01

Full Text Available Due to the prominent role of improvised explosive devices (IEDs) in the wounding patterns of U.S. war-fighters in Iraq and Afghanistan, blast injury has risen to a new level of importance and is recognized to be a major cause of injuries to the brain. However, an injury risk-function for microscopic, macroscopic, behavioral, and neurological deficits has yet to be defined. While operational blast injuries can be very complex and thus difficult to analyze, a simplified blast injury model would facilitate studies correlating biological outcomes with blast biomechanics to define tolerance criteria. Blast-induced traumatic brain injury (bTBI) results from the translation of a shock wave in air, such as that produced by an IED, into a pressure wave within the skull-brain complex. Our blast injury methodology recapitulates this phenomenon in vitro, allowing for control of the injury biomechanics via a compressed-gas shock tube used in conjunction with a custom-designed, fluid-filled receiver that contains the living culture. The receiver converts the air shock wave into a fast-rising pressure transient with minimal reflections, mimicking the intracranial pressure history in blast. We have developed an organotypic hippocampal slice culture model that exhibits cell death when exposed to a 530 ± 17.7 kPa peak overpressure with a 1.026 ± 0.017 ms duration and 190 ± 10.7 kPa-ms impulse in-air. We have also injured a simplified in vitro model of the blood-brain barrier, which exhibits disrupted integrity immediately following exposure to a 581 ± 10.0 kPa peak overpressure with a 1.067 ± 0.006 ms duration and 222 ± 6.9 kPa-ms impulse in-air. To better prevent and treat bTBI, both the initiating biomechanics and the ensuing pathobiology must be understood in greater detail. A well-characterized, in vitro model of bTBI, in conjunction with animal models, will be a powerful tool for developing strategies to mitigate the risks of bTBI.
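The exposure metrics quoted in this abstract (peak overpressure, positive-phase duration, impulse) can be illustrated on an idealized Friedlander waveform; this functional form and its decay constant are textbook assumptions, not the study's measured shock-tube traces.

```python
import numpy as np

# Illustrative Friedlander blast-overpressure profile:
# p(t) = p_peak * (1 - t/t_dur) * exp(-t/t_dur), for 0 <= t <= t_dur.
p_peak = 530.0    # kPa, peak overpressure (echoes the reported exposure)
t_dur = 1.026e-3  # s, positive-phase duration

t = np.linspace(0.0, t_dur, 2000)
p = p_peak * (1.0 - t / t_dur) * np.exp(-t / t_dur)

peak = p.max()  # kPa; the peak occurs at t = 0
# Impulse is the time integral of overpressure (trapezoidal rule), in kPa-ms.
impulse = float(np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(t))) * 1e3

print(round(peak, 1), "kPa peak,", round(impulse, 1), "kPa-ms impulse")
```

For this waveform the integral works out analytically to p_peak·t_dur/e, about 200 kPa-ms, the same order as the reported 190 ± 10.7 kPa-ms impulse.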

19. Conflicts: Second Thoughts on Models of Translational Ethics

    Institute of Scientific and Technical Information of China (English)

    梅阳春; 汤金霞

    2013-01-01

One of the major trends in translation studies in China is drawing inspiration from Western translational ethics to construct a Chinese translational ethics. Among the Western schools of translational ethics, the theory of models of translation ethics, constituted by the translational ethics of representation, of service, of communication, norm-based translational ethics, and the translational ethics of commitment, has so inspired the construction of a Chinese translational ethics that voices within the Chinese translation community call for formulating Chinese translation ethics on the basis of this theory. It is found, however, that the first four models of translational ethics contradict each other in which interest group they should serve most, in what status they should confer on each of the translation agents other than the translator, and in what status they should confer on the translator. The translational ethics of commitment, intended to integrate the four preceding models, cannot solve this incompatibility. Therefore, the theory of models of translational ethics, though beneficial to the construction of a Chinese translational ethics, cannot serve as the basis of that construction.

  20. Animal models to guide clinical drug development in ADHD: lost in translation?

    Science.gov (United States)

    Wickens, Jeffery R; Hyland, Brian I; Tripp, Gail

    2011-01-01

We review strategies for developing animal models for examining and selecting compounds with potential therapeutic benefit in attention-deficit hyperactivity disorder (ADHD). ADHD is a behavioural disorder of unknown aetiology and pathophysiology. Current understanding suggests that genetic factors play an important role in the aetiology of ADHD. The involvement of dopaminergic and noradrenergic systems in the pathophysiology of ADHD is probable. We review the clinical features of ADHD including inattention, hyperactivity and impulsivity and how these are operationalized for laboratory study. Measures of temporal discounting (but not premature responding) appear to predict known drug effects well (treatment validity). Open-field measures of overactivity commonly used do not have treatment validity in human populations. A number of animal models have been proposed that simulate the symptoms of ADHD. The most commonly used are the spontaneously hypertensive rat (SHR) and the 6-hydroxydopamine-lesioned (6-OHDA) animals. To date, however, the SHR lacks treatment validity, and the effects of drugs on symptoms of impulsivity and inattention have not been studied extensively in 6-OHDA-lesioned animals. At the present stage of development, there are no in vivo models of proven effectiveness for examining and selecting compounds with potential therapeutic benefit in ADHD. However, temporal discounting is an emerging theme in theories of ADHD, and there is good evidence of increased value of delayed reward following treatment with stimulant drugs. Therefore, operant behaviour paradigms that measure the effects of drugs in situations of delayed reinforcement, whether in normal rats or selected models, show promise for the future. Linked Articles: This article is part of a themed issue on Translational Neuropharmacology. To view the other articles in this issue visit http://dx.doi.org/10.1111/bph.2011.164.issue-4 PMID:21480864
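Temporal discounting, the measure highlighted in this review, is commonly quantified with a hyperbolic model, V = A / (1 + kD); the functional form and the discount rates below are standard illustrative assumptions, not values from the review itself.

```python
def discounted_value(amount, delay_s, k):
    """Hyperbolic discounting, V = A / (1 + k*D): a standard quantitative
    form for temporal discounting (the review does not prescribe a model)."""
    return amount / (1.0 + k * delay_s)

# A steeper discount rate k makes a delayed reward lose subjective value
# faster, the pattern associated with impulsive choice in ADHD paradigms;
# stimulant treatment is often described as shifting choice toward the
# pattern produced by a smaller k.
for k in (0.1, 1.0):  # hypothetical discount rates
    print(k, round(discounted_value(100.0, 10.0, k), 1))
```

Fitting k per subject from choices between immediate and delayed rewards is one way an operant delayed-reinforcement paradigm yields a drug-sensitive readout.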

  1. A European test of pesticide-leaching models: methodology and major recommendations

    NARCIS (Netherlands)

    Vanclooster, M.; Boesten, J.J.T.I.; Trevisan, M.; Brown, C.D.; Capri, E.; Eklo, O.M.; Gottesbüren, B.; Gouy, V.; Linden, van der A.M.A.

    2000-01-01

    Testing of pesticide-leaching models is important in view of their increasing use in pesticide registration procedures in the European Union. This paper presents the methodology and major conclusions of a test of pesticide-leaching models. Twelve models simulating the vertical one-dimensional

  2. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  3. Cross-Language Translation Priming Asymmetry with Chinese-English Bilinguals: A Test of the Sense Model

    Science.gov (United States)

    Chen, Baoguo; Zhou, Huixia; Gao, Yiwen; Dunlap, Susan

    2014-01-01

    The present study aimed to test the Sense Model of cross-linguistic masked translation priming asymmetry, proposed by Finkbeiner et al. ("J Mem Lang" 51:1-22, 2004), by manipulating the number of senses that bilingual participants associated with words from both languages. Three lexical decision experiments were conducted with…

  4. A Methodology for Modeling Confined, Temperature Sensitive Cushioning Systems

    Science.gov (United States)

    1980-06-01

thickness of cushion T, and temperature θ, and, as the dependent variable, G, the peak acceleration. The initial model, Equation (IV-11), proved deficient. (The remainder of this extracted fragment is an unrecoverable listing of regression terms in T and θ.)

  5. Modeling postpartum depression in rats: theoretic and methodological issues

    Science.gov (United States)

    Ming, LI; Shinn-Yi, CHOU

    2016-01-01

The postpartum period is when a host of changes occur at molecular, cellular, physiological and behavioral levels to prepare female humans for the challenge of maternity. Alteration or prevention of these normal adaptations is thought to contribute to disruptions of emotion regulation, motivation and cognitive abilities that underlie postpartum mental disorders, such as postpartum depression. Despite the high incidence of this disorder, and the detrimental consequences for both mother and child, its etiology and related neurobiological mechanisms remain poorly understood, partially due to the lack of appropriate animal models. In recent decades, there have been a number of attempts to model postpartum depression in rats. In the present review, we first describe clinical symptoms of postpartum depression and discuss known risk factors, including both genetic and environmental factors. Thereafter, we discuss various rat models that have been developed to capture various aspects of this disorder and the knowledge gained from such attempts. In doing so, we focus on the theories behind each attempt and the methods used to achieve their goals. Finally, we point out several understudied areas in this field and make suggestions for future directions. PMID:27469254

  6. Modeling postpartum depression in rats: theoretic and methodological issues

    Directory of Open Access Journals (Sweden)

    Ming LI

    2018-06-01

Full Text Available The postpartum period is when a host of changes occur at molecular, cellular, physiological and behavioral levels to prepare female humans for the challenge of maternity. Alteration or prevention of these normal adaptations is thought to contribute to disruptions of emotion regulation, motivation and cognitive abilities that underlie postpartum mental disorders, such as postpartum depression. Despite the high incidence of this disorder, and the detrimental consequences for both mother and child, its etiology and related neurobiological mechanisms remain poorly understood, partially due to the lack of appropriate animal models. In recent decades, there have been a number of attempts to model postpartum depression in rats. In the present review, we first describe clinical symptoms of postpartum depression and discuss known risk factors, including both genetic and environmental factors. Thereafter, we discuss various rat models that have been developed to capture various aspects of this disorder and the knowledge gained from such attempts. In doing so, we focus on the theories behind each attempt and the methods used to achieve their goals. Finally, we point out several understudied areas in this field and make suggestions for future directions.

  7. Translational Creativity

    DEFF Research Database (Denmark)

    Nielsen, Sandro

    2010-01-01

A long-established approach to legal translation focuses on terminological equivalence, making translators strictly follow the words of source texts. Recent research suggests that there is room for some creativity, allowing translators to deviate from the source texts. However, little attention … is given to genre conventions in source texts and the ways in which they can best be translated. I propose that translators of statutes with an informative function in expert-to-expert communication may be allowed limited translational creativity when translating specific types of genre convention. … This creativity is a result of translators adopting either a source-language or a target-language oriented strategy and is limited by the pragmatic principle of co-operation. Examples of translation options are provided illustrating the different results in target texts. The use of a target-language oriented …

  8. On the development of non-commutative translation-invariant quantum gauge field models

    International Nuclear Information System (INIS)

    Sedmik, R.I.P.

    2009-01-01

Aiming to understand the most fundamental principles of nature, one has to approach the highest possible energy scales corresponding to the smallest possible distances - the Planck scale. Historically, three different theoretical fields have been developed to treat the problems appearing in this endeavor: string theory, quantum gravity, and non-commutative (NC) quantum field theory (QFT). The latter was originally motivated by the conjecture that the introduction of uncertainty relations between space-time coordinates introduces a natural energy cutoff, which should render the resulting computations well defined and finite. Despite failing to fulfill this expectation, NC physics is a challenging field of research, which has proved to be a fruitful source of new ideas and methods. Mathematically, non-commutativity is implemented by the so-called Weyl quantization, giving rise to a modified product - the Groenewold-Moyal product. It realizes an operator ordering, and allows one to work within the well-established framework of QFT on non-commutative spaces. The main obstacle of NCQFT is the appearance of singularities being shifted from high to low energies. This effect, referred to as 'UV/IR mixing', is a direct consequence of the deformation of the product, and inhibits or complicates the direct application of well-approved renormalization schemes. In order to remedy this problem, several approaches have been worked out during the past decade which, unfortunately, all have shortcomings such as the breaking of translation invariance or an inappropriate alteration of degrees of freedom. Hence, the resulting theories are either rendered 'unphysical' or considered a priori to be toy models. Nonetheless, these efforts have helped to analyze the mechanisms leading to UV/IR mixing and finally led to the insight that renormalizability can only be achieved by respecting the inherent connection of long and short distances (scales) of NCQFT in the construction of

  9. Fuel cycle assessment: A compendium of models, methodologies, and approaches

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    The purpose of this document is to profile analytical tools and methods which could be used in a total fuel cycle analysis. The information in this document provides a significant step towards: (1) Characterizing the stages of the fuel cycle. (2) Identifying relevant impacts which can feasibly be evaluated quantitatively or qualitatively. (3) Identifying and reviewing other activities that have been conducted to perform a fuel cycle assessment or some component thereof. (4) Reviewing the successes/deficiencies and opportunities/constraints of previous activities. (5) Identifying methods and modeling techniques/tools that are available, tested and could be used for a fuel cycle assessment.

  10. Methodological aspects of journaling a dynamic adjusting entry model

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská

    2011-01-01

    Full Text Available This paper expands the discussion of the importance and function of adjusting entries for loan receivables. Discussion of the cyclical development of adjusting entries, their negative impact on the business cycle and potential solutions has intensified during the financial crisis. These discussions are still ongoing and continue to be relevant to members of the professional public, banking regulators and representatives of international accounting institutions. The objective of this paper is to evaluate a method of journaling dynamic adjusting entries under current accounting law. It also expresses the authors’ opinions on the potential for consistently implementing basic accounting principles in journaling adjusting entries for loan receivables under a dynamic model.

  11. Treatment of Ligament Constructs with Exercise-conditioned Serum: A Translational Tissue Engineering Model.

    Science.gov (United States)

    Lee-Barthel, Ann; Baar, Keith; West, Daniel W D

    2017-06-11

In vitro experiments are essential to understand biological mechanisms; however, the gap between monolayer tissue culture and human physiology is large, and translation of findings is often poor. Thus, there is ample opportunity for alternative experimental approaches. Here we present an approach in which human cells are isolated from human anterior cruciate ligament tissue remnants, expanded in culture, and used to form engineered ligaments. Exercise alters the biochemical milieu in the blood such that the function of many tissues, organs and bodily processes is improved. In this experiment, ligament construct culture media was supplemented with experimental human serum that had been 'conditioned' by exercise. Thus the intervention is more biologically relevant, since an experimental tissue is exposed to the full endogenous biochemical milieu, including binding proteins and adjunct compounds that may be altered in tandem with the activity of an unknown agent of interest. After treatment, engineered ligaments can be analyzed for mechanical function, collagen content, morphology, and cellular biochemistry. Overall, the physiological model of ligament tissue presented here has four major advantages over traditional monolayer culture and animal models. First, ligament constructs are three-dimensional, allowing mechanical properties (i.e., function) such as ultimate tensile stress, maximal tensile load, and modulus to be quantified. Second, the enthesis, the interface between bony and sinew elements, can be examined in detail and within functional context. Third, preparing media with post-exercise serum allows the effects of the exercise-induced biochemical milieu, which is responsible for the wide range of health benefits of exercise, to be investigated in an unbiased manner. Finally, this experimental model advances scientific research in a humane and ethical manner by replacing the use of animals, a core mandate of the National

  12. Behavioral profiling as a translational approach in an animal model of posttraumatic stress disorder.

    Science.gov (United States)

    Ardi, Ziv; Albrecht, Anne; Richter-Levin, Alon; Saha, Rinki; Richter-Levin, Gal

    2016-04-01

Diagnosis of psychiatric disorders in humans is based on comparing individuals to the normal population. However, many animal models analyze averaged group effects, thus compromising their translational power. This discrepancy is particularly relevant in posttraumatic stress disorder (PTSD), where only a minority develop the disorder following a traumatic experience. In our PTSD rat model, we utilize a novel behavioral profiling approach that allows the classification of affected and unaffected individuals in a trauma-exposed population. Rats were exposed to underwater trauma (UWT) and four weeks later their individual performances in the open field and elevated plus maze were compared to those of the control group, allowing the identification of affected and resilient UWT-exposed rats. Behavioral profiling revealed that only a subset of the UWT-exposed rats developed long-lasting behavioral symptoms. The proportion of affected rats was further enhanced by pre-exposure to juvenile stress, a well-described risk factor for PTSD. For a biochemical proof of concept, we analyzed the expression levels of the GABAA receptor subunits α1 and α2 in the ventral hippocampus, dorsal hippocampus and basolateral amygdala. Increased expression, mainly of α1, was observed in the ventral but not the dorsal hippocampus of exposed animals, which would traditionally be interpreted as being associated with the exposure-resultant psychopathology. However, behavioral profiling revealed that this increased expression was confined to exposed-unaffected individuals, suggesting a resilience-associated regulation of expression. The results provide evidence for the importance of employing behavioral profiling in animal models of PTSD, in order to better understand the neural basis of stress vulnerability and resilience. Copyright © 2016 Elsevier Inc. All rights reserved.
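The individual-level classification described in this abstract can be sketched with a simple cutoff rule against the control distribution; the ±2 SD criterion, the scores, and the function name below are illustrative assumptions, not the study's exact procedure.

```python
import statistics

def profile(exposed_scores, control_scores, n_sd=2.0):
    """Classify each exposed individual against the control distribution:
    'affected' if its score lies more than n_sd control SDs from the control
    mean (an illustrative cutoff, not the paper's exact criterion)."""
    mu = statistics.mean(control_scores)
    sd = statistics.stdev(control_scores)
    return ["affected" if abs(s - mu) > n_sd * sd else "unaffected"
            for s in exposed_scores]

# Hypothetical open-field anxiety scores: controls cluster near 10,
# while only some trauma-exposed animals deviate markedly.
control = [9.5, 10.2, 10.0, 9.8, 10.5, 9.9, 10.1]
exposed = [10.3, 14.0, 9.7, 13.5]
print(profile(exposed, control))  # ['unaffected', 'affected', 'unaffected', 'affected']
```

Group-mean statistics would average the deviant and normal animals together, which is exactly the loss of translational power the profiling approach avoids.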

  13. Methodology for modeling the microbial contamination of air filters.

    Science.gov (United States)

    Joe, Yun Haeng; Yoon, Ki Young; Hwang, Jungho

    2014-01-01

    In this paper, we propose a theoretical model to simulate microbial growth on contaminated air filters and entrainment of bioaerosols from the filters to an indoor environment. Air filter filtration and antimicrobial efficiencies, and effects of dust particles on these efficiencies, were evaluated. The number of bioaerosols downstream of the filter could be characterized according to three phases: initial, transitional, and stationary. In the initial phase, the number was determined by filtration efficiency, the concentration of dust particles entering the filter, and the flow rate. During the transitional phase, the number of bioaerosols gradually increased up to the stationary phase, at which point no further increase was observed. The antimicrobial efficiency and flow rate were the dominant parameters affecting the number of bioaerosols downstream of the filter in the transitional and stationary phase, respectively. It was found that the nutrient fraction of dust particles entering the filter caused a significant change in the number of bioaerosols in both the transitional and stationary phases. The proposed model would be a solution for predicting the air filter life cycle in terms of microbiological activity by simulating the microbial contamination of the filter.
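The three-phase downstream behavior described above can be sketched with a toy simulation; the logistic growth form and every parameter value below are illustrative assumptions, not the paper's actual model.

```python
# Toy three-phase model: downstream bioaerosols = filter penetration of the
# incoming stream plus re-entrainment of microbes growing on the filter.
penetration = 0.01   # 1 - filtration efficiency
incoming = 1000.0    # bioaerosols entering the filter per time step
growth = 0.3         # net microbial growth rate on the filter (antimicrobial
                     # efficiency would lower this value)
capacity = 1e6       # nutrient-limited carrying capacity (stationary phase)
shed_frac = 1e-3     # fraction of the filter load re-entrained per step

load = 10.0          # viable microbes initially retained on the filter
downstream = []
for _ in range(60):
    load += growth * load * (1.0 - load / capacity)  # logistic growth
    downstream.append(penetration * incoming + shed_frac * load)

print(round(downstream[0], 1))   # initial phase: dominated by penetration
print(round(downstream[-1], 1))  # stationary phase: dominated by shedding
```

The transitional phase is the stretch in between, where the growing filter load steadily raises the downstream count toward its plateau.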

  14. Methodology for modeling the microbial contamination of air filters.

    Directory of Open Access Journals (Sweden)

    Yun Haeng Joe

    Full Text Available In this paper, we propose a theoretical model to simulate microbial growth on contaminated air filters and entrainment of bioaerosols from the filters to an indoor environment. Air filter filtration and antimicrobial efficiencies, and effects of dust particles on these efficiencies, were evaluated. The number of bioaerosols downstream of the filter could be characterized according to three phases: initial, transitional, and stationary. In the initial phase, the number was determined by filtration efficiency, the concentration of dust particles entering the filter, and the flow rate. During the transitional phase, the number of bioaerosols gradually increased up to the stationary phase, at which point no further increase was observed. The antimicrobial efficiency and flow rate were the dominant parameters affecting the number of bioaerosols downstream of the filter in the transitional and stationary phase, respectively. It was found that the nutrient fraction of dust particles entering the filter caused a significant change in the number of bioaerosols in both the transitional and stationary phases. The proposed model would be a solution for predicting the air filter life cycle in terms of microbiological activity by simulating the microbial contamination of the filter.

  15. New temperature model of the Netherlands from new data and novel modelling methodology

    Science.gov (United States)

    Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik

    2017-04-01

Deep geothermal energy has attracted growing interest in Western Europe in recent decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of the subsurface temperature, together with the available flow from the reservoir, is an important factor in the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of the subsurface temperature based on thermal data as well as geological elements (Bonté et al., 2012). An outcome of this work was ThermoGIS, which uses the temperature model. This work is a revision of the model used in ThermoGIS. The improvements on the first model are multiple: we have improved not only the dataset used for calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The description of the sedimentary basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the modelling methodology: with b3t, the calibration uses not only the lithospheric parameters but also the thermal conductivity of the sediments. The result is a much more accurate definition of the model parameters and better handling of the calibration process, yielding a precise, improved temperature model of the Netherlands. The thermal conductivity variation in the sediments, together with the geometry of the layers, is an important factor in temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat
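The effect of sediment thermal conductivity on the geotherm can be illustrated with a steady-state 1-D conduction sketch; the layer stack, conductivities, and heat flow below are assumed round numbers for illustration, not values from the calibrated Dutch model.

```python
# Steady-state 1-D conductive geotherm through layered sediments:
# dT/dz = q / k, so each layer adds q * thickness / conductivity.
q = 0.065  # W/m^2, basal heat flow (illustrative)
layers = [
    ("shale",          1500.0, 1.5),  # name, thickness (m), k (W/m/K)
    ("Zechstein salt", 1000.0, 4.5),  # salt conducts well -> low gradient
    ("sandstone",      1500.0, 3.0),
]

temp = 10.0  # surface temperature, deg C
for name, thickness, k in layers:
    temp += q * thickness / k
    print(f"{name}: T at base = {temp:.1f} C")
```

The high conductivity of the salt gives it a much smaller temperature increase per metre than the shale above it, which is why salt geometry matters so much for temperatures below and around the Zechstein.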

  16. Developing a Collection of Composable Data Translation Software Units to Improve Efficiency and Reproducibility in Ecohydrologic Modeling Workflows

    Science.gov (United States)

    Olschanowsky, C.; Flores, A. N.; FitzGerald, K.; Masarik, M. T.; Rudisill, W. J.; Aguayo, M.

    2017-12-01

    Dynamic models of the spatiotemporal evolution of water, energy, and nutrient cycling are important tools to assess impacts of climate and other environmental changes on ecohydrologic systems. These models require spatiotemporally varying environmental forcings like precipitation, temperature, humidity, windspeed, and solar radiation. These input data originate from a variety of sources, including global and regional weather and climate models, global and regional reanalysis products, and geostatistically interpolated surface observations. Data translation steps, often subsetting in space and/or time and transforming and converting variable units, represent a seemingly mundane but critical step in application workflows. Translation steps can introduce errors, misrepresent data, slow execution, and interrupt data provenance. We leverage a workflow that subsets a large regional dataset derived from the Weather Research and Forecasting (WRF) model and prepares inputs to the ParFlow integrated hydrologic model to demonstrate the impact of translation tool software quality on scientific workflow results and performance. We propose that such workflows will benefit from a community-approved collection of data transformation components. The components should be self-contained, composable units of code. This design pattern enables automated parallelization and software verification, improving performance and reliability. Ensuring that individual translation components are self-contained and target small, well-defined tasks increases reliability. The small code size of each component enables effective unit and regression testing. The components can be automatically composed for efficient execution. An efficient data translation framework should be written to minimize data movement; composing components within a single streaming process reduces data movement.
Each component will typically have a low arithmetic intensity, meaning that it requires about the same number of
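    The record above is cut off in the source, but its design pattern is concrete enough to illustrate. The sketch below, in Python with invented function names and toy forcing data (nothing here comes from the authors' code), shows translation units as self-contained, composable streaming stages: each unit does one small task (subsetting, unit conversion) and composition happens within a single generator pipeline.

```python
# Sketch of composable data-translation units as streaming generator stages.
# Function names and the toy forcing records are illustrative assumptions,
# not from the authors' code.

def subset_time(records, start, end):
    """Keep only records whose 'hour' falls in [start, end)."""
    for r in records:
        if start <= r["hour"] < end:
            yield r

def convert_units(records, field, factor):
    """Scale one field by a conversion factor (e.g., mm -> m)."""
    for r in records:
        r = dict(r)  # do not mutate the upstream record
        r[field] *= factor
        yield r

def compose(source, *stages):
    """Chain translation units into a single streaming pipeline."""
    stream = source
    for stage in stages:
        stream = stage(stream)
    return stream

forcings = [{"hour": h, "precip_mm": 2.0} for h in range(48)]
pipeline = compose(
    iter(forcings),
    lambda s: subset_time(s, 0, 24),                 # subset in time
    lambda s: convert_units(s, "precip_mm", 0.001),  # mm -> m
)
out = list(pipeline)
print(len(out), out[0]["precip_mm"])  # 24 0.002
```

    Because each stage is a generator, records flow one at a time through the whole pipeline, so composing stages adds no intermediate copies of the dataset, and each unit can be unit- and regression-tested in isolation.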

  17. Small-Diameter Awls Improve Articular Cartilage Repair After Microfracture Treatment in a Translational Animal Model.

    Science.gov (United States)

    Orth, Patrick; Duffner, Julia; Zurakowski, David; Cucchiarini, Magali; Madry, Henning

    2016-01-01

    Microfracture is the most commonly applied arthroscopic marrow stimulation procedure. Articular cartilage repair is improved when the subchondral bone is perforated by small-diameter microfracture awls compared with larger awls. Controlled laboratory study. Standardized rectangular (4 × 8 mm) full-thickness chondral defects (N = 24) were created in the medial femoral condyle of 16 adult sheep and debrided down to the subchondral bone plate. Three treatment groups (n = 8 defects each) were tested: 6 microfracture perforations using small-diameter awls (1.0 mm; group 1), large-diameter awls (1.2 mm; group 2), or without perforations (debridement control; group 3). Osteochondral repair was assessed at 6 months in vivo using established macroscopic, histological, immunohistochemical, biochemical, and micro-computed tomography analyses. Compared with control defects, histological cartilage repair was always improved after both microfracture techniques. Subchondral bone cysts and intralesional osteophytes were frequently observed after either microfracture treatment. Macroscopic grading, DNA, proteoglycan, and type I and type II collagen contents as well as degenerative changes within the adjacent cartilage remained unaffected by the awl diameter. Small-diameter microfracture awls improve articular cartilage repair in the translational sheep model more effectively than do larger awls. These data support the use of small microfracture instruments for the surgical treatment of cartilage defects and warrant prolonged clinical investigations. © 2015 The Author(s).

  18. Osteochondral Allograft Transplantation in Cartilage Repair: Graft Storage Paradigm, Translational Models, and Clinical Applications

    Science.gov (United States)

    Bugbee, William D.; Pallante-Kichura, Andrea L.; Görtz, Simon; Amiel, David; Sah, Robert

    2016-01-01

    The treatment of articular cartilage injury and disease has become an increasingly relevant part of orthopaedic care. Articular cartilage transplantation, in the form of osteochondral allografting, is one of the most established techniques for restoration of articular cartilage. Our research efforts over the last two decades have supported the transformation of this procedure from experimental “niche” status to a cornerstone of orthopaedic practice. In this Kappa Delta paper, we describe our translational and clinical science contributions to this transformation: (1) to enhance the ability of tissue banks to process and deliver viable tissue to surgeons and patients, (2) to improve the biological understanding of in vivo cartilage and bone remodeling following osteochondral allograft (OCA) transplantation in an animal model system, (3) to define effective surgical techniques and pitfalls, and (4) to identify and clarify clinical indications and outcomes. The combination of coordinated basic and clinical studies is part of our continuing comprehensive academic OCA transplant program. Taken together, the results have led to the current standards for OCA processing and storage prior to implantation and also novel observations and mechanisms of the biological and clinical behavior of OCA transplants in vivo. Thus, OCA transplantation is now a successful and increasingly available treatment for patients with disabling osteoarticular cartilage pathology. PMID:26234194

  19. Adaptability and stability of maize varieties using mixed model methodology

    Directory of Open Access Journals (Sweden)

    Walter Fernandes Meirelles

    2012-01-01

    Full Text Available The objective of this study was to evaluate the performance, adaptability and stability of corn cultivars simultaneously in unbalanced experiments, using the method of harmonic means of the relative performance of genetic values. The grain yield of 45 cultivars, including hybrids and varieties, was evaluated in 49 environments over two growing seasons. In the 2007/2008 growing season, 36 cultivars were evaluated, and in 2008/2009, 25 cultivars, of which 16 were used in both seasons. Statistical analyses were performed based on mixed models, considering genotypes as random and replications within environments as fixed factors. The experimental precision in the combined analyses was high (accuracy estimates > 92%). Despite the existence of genotype × environment interaction, hybrids and varieties with high adaptability and stability were identified. Results showed that the method of harmonic means of the relative performance of genetic values is a suitable method for maize breeding programs.
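    As a concrete illustration of the ranking criterion named in the abstract, the Python sketch below computes the harmonic mean of the relative performance of genotypic values (MHPRVG) for a toy data set. The genotype names and values are invented; in the actual analysis the genotypic values would be BLUPs predicted by the mixed model, not raw means.

```python
# Illustrative computation of the harmonic mean of the relative performance
# of genotypic values (MHPRVG). Genotype names and values are invented.

def mhprvg(genotypic_values):
    """genotypic_values: {genotype: {environment: genotypic value}}."""
    envs = {e for vals in genotypic_values.values() for e in vals}
    env_mean = {
        e: (sum(v[e] for v in genotypic_values.values() if e in v)
            / sum(1 for v in genotypic_values.values() if e in v))
        for e in envs
    }
    scores = {}
    for g, vals in genotypic_values.items():
        # relative performance in each environment, then harmonic mean
        rel = [vals[e] / env_mean[e] for e in vals]
        scores[g] = len(rel) / sum(1.0 / r for r in rel)
    return scores

vals = {
    "hybrid_A": {"E1": 8.2, "E2": 6.1, "E3": 7.4},   # t/ha, invented
    "hybrid_B": {"E1": 7.0, "E2": 6.5, "E3": 6.9},
    "variety_C": {"E1": 6.3, "E2": 5.2, "E3": 6.1},
}
scores = mhprvg(vals)
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # ['hybrid_A', 'hybrid_B', 'variety_C']
```

    Because the relative performance is computed only over the environments where a genotype was actually tested, the criterion naturally accommodates the unbalanced layout described in the abstract, and the harmonic mean penalizes genotypes with unstable performance across environments.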

  20. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    International Nuclear Information System (INIS)

    Knezevic, J.; Odoom, E.R.

    2001-01-01

    A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of the classical (crisp) set theory because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because unlike the fault tree methodology, the use of Petri nets allows efficient simultaneous generation of minimal cut and path sets
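    A minimal sketch of the Lambda-Tau step for an OR gate (any input failure fails the gate) is shown below, with failure rates and repair times as triangular fuzzy numbers. Evaluating the crisp expressions vertex-wise is a rough simplification; the paper's method would propagate fuzziness through alpha-cut interval arithmetic. All numbers are invented.

```python
# Sketch of the Lambda-Tau expressions for an OR gate with triangular fuzzy
# failure rates and repair times, represented as (low, mode, high) triples.
# Vertex-wise evaluation of the crisp formulas is a rough simplification of
# alpha-cut interval arithmetic. All numbers are invented.

def fuzzy_or_lambda(lams):
    """lambda_OR = sum of the component failure rates."""
    return tuple(sum(l[i] for l in lams) for i in range(3))

def fuzzy_or_tau(lams, taus):
    """tau_OR = sum(lambda_i * tau_i) / sum(lambda_i)."""
    return tuple(
        sum(l[i] * t[i] for l, t in zip(lams, taus)) / sum(l[i] for l in lams)
        for i in range(3)
    )

lams = [(1e-4, 2e-4, 3e-4), (2e-4, 3e-4, 4e-4)]  # failures per hour
taus = [(2.0, 4.0, 6.0), (1.0, 2.0, 3.0)]        # repair times in hours

lam = fuzzy_or_lambda(lams)     # modal value: 5e-4 per hour
tau = fuzzy_or_tau(lams, taus)  # modal value: 2.8 hours
print(lam[1], tau[1])
```

    The spread of the resulting triples is exactly how the abstract's point about fuzzy numbers shows up in practice: expert uncertainty in the component data survives into the system-level reliability indices instead of being collapsed to a single crisp value.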

  1. Selection of low-level radioactive waste disposal sites using screening models versus more complex methodologies

    International Nuclear Information System (INIS)

    Uslu, I.; Fields, D.E.

    1993-01-01

    The task of choosing a waste-disposal site from a set of candidate sites requires an approach capable of objectively handling many environmental variables for each site. Several computer methodologies have been developed to assist in the process of choosing a site for the disposal of low-level radioactive waste; however, most of these models are costly to apply, in terms of computer resources and the time and effort required by professional modelers, geologists, and waste-disposal experts. The authors describe how the relatively simple DRASTIC methodology (a standardized system for evaluating groundwater pollution potential using hydrogeologic settings) may be used for "pre-screening" of sites to determine which subset of candidate sites is worthy of more detailed screening. Results of site comparisons made with DRASTIC are compared with results obtained using the PRESTO-II methodology, which is representative of the more complex release-transport-human exposure methodologies. 6 refs., 1 fig., 1 tab
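    The DRASTIC index itself is just a weighted sum of seven hydrogeologic ratings, which is what makes it cheap enough for pre-screening. The sketch below uses the standard DRASTIC parameter weights; the two sites and their ratings are invented for illustration.

```python
# Sketch of the DRASTIC pollution-potential index as a weighted sum of seven
# hydrogeologic ratings (1-10). The weights are the standard DRASTIC values;
# the site ratings below are invented.

DRASTIC_WEIGHTS = {
    "Depth_to_water": 5,
    "net_Recharge": 4,
    "Aquifer_media": 3,
    "Soil_media": 2,
    "Topography": 1,
    "Impact_of_vadose_zone": 5,
    "hydraulic_Conductivity": 3,
}

def drastic_index(ratings):
    """Weighted sum W_i * R_i over the seven DRASTIC parameters."""
    return sum(DRASTIC_WEIGHTS[p] * r for p, r in ratings.items())

site_a = {"Depth_to_water": 7, "net_Recharge": 6, "Aquifer_media": 5,
          "Soil_media": 4, "Topography": 9, "Impact_of_vadose_zone": 6,
          "hydraulic_Conductivity": 4}
site_b = dict(site_a, Depth_to_water=3, Impact_of_vadose_zone=2)

# Higher index = higher groundwater pollution potential.
print(drastic_index(site_a), drastic_index(site_b))  # 133 93
```

    Because the index collapses each site to a single comparable number, ranking candidates by it suffices for the pre-screening role described above; the more expensive PRESTO-II style analysis is then reserved for the shortlisted subset.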

  2. Models of expected returns on the Brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for testing expected returns models are widely diffused in the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies conducted with Brazilian stock market data are concentrated only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor and 4-factor models using a predictive methodology, considering two steps – time-series and cross-sectional regressions – with standard errors obtained by the techniques of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model as compared to the 3-factor model, and the superiority of the 3-factor model as compared to the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects do not seem to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor for explaining expected returns. These findings raise some questions, mainly because of the originality of the methodology in the local market and the fact that this subject is still incipient and polemic in the Brazilian academic environment.
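    The two-step predictive methodology referred to above can be sketched in a few lines. The pure-Python example below runs the Fama-MacBeth (1973) procedure for a single-factor (CAPM-style) model on invented returns: time-series regressions give each asset's beta, cross-sectional regressions at each date give a premium estimate, and the premium's standard error comes from the time series of those estimates.

```python
# Sketch of the two-step Fama-MacBeth (1973) procedure for a single-factor
# model, in pure Python with made-up excess returns.

def ols_slope_intercept(x, y):
    """Simple one-regressor OLS via the normal equations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return b, my - b * mx

market = [0.02, -0.01, 0.03, 0.01, -0.02, 0.04]
assets = {                       # excess returns over the same 6 dates
    "A": [0.03, -0.02, 0.05, 0.02, -0.03, 0.06],
    "B": [0.01, -0.005, 0.02, 0.005, -0.01, 0.02],
}

# Step 1: time-series betas, one regression per asset
betas = {k: ols_slope_intercept(market, r)[0] for k, r in assets.items()}

# Step 2: cross-sectional slope (factor premium) at each date
premia = []
for t in range(len(market)):
    x = [betas[k] for k in assets]
    y = [assets[k][t] for k in assets]
    premia.append(ols_slope_intercept(x, y)[0])

mean_premium = sum(premia) / len(premia)
var = sum((p - mean_premium) ** 2 for p in premia) / (len(premia) - 1)
se = (var / len(premia)) ** 0.5   # Fama-MacBeth standard error
print(round(mean_premium, 4), round(se, 4))
```

    With two assets the cross-sectional fit is exact, so the example only illustrates the mechanics; a real test, as in the article, uses many portfolios and adds the size, value and momentum factors in both steps.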

  3. Competence development organizations in project management on the basis of genomic model methodologies

    OpenAIRE

    Бушуев, Сергей Дмитриевич; Рогозина, Виктория Борисовна; Ярошенко, Юрий Федерович

    2013-01-01

    The matrix technology for identification of organisational competencies in project management is presented in the article. The matrix elements are the components of organisational competence in the field of project management and of the project management methodology represented in the structure of the genome. The matrix model of competence within the framework of the adopted methodologies and a scanning method for identifying organisational competences are formalised. Proposed methods for building effective proj...

  4. Oil Well Blowout 3D computational modeling: review of methodology and environmental requirements

    OpenAIRE

    Pedro Mello Paiva; Alexandre Nunes Barreto; Jader Lugon Junior; Leticia Ferraço de Campos

    2016-01-01

    This literature review aims to present the different methodologies used in the three-dimensional modeling of the dispersion of hydrocarbons originating from an oil well blowout. It presents the concepts of coastal environmental sensitivity and vulnerability, their importance for prioritizing the most vulnerable areas in case of contingency, and the relevant legislation. We also discuss some limitations of the methodology currently used in environmental studies of oil drift, which considers sim...

  5. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of the vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and validity analysis of the simulation model is still lacking. The developed validation paradigm has a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers although they are aimed for real world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, then the simulation is deemed ''not invalid''. If the simulation model fails to meet the criteria, the model is deemed invalid, and model iteration should be performed. The results are analyzed to determine if the results indicate a modeling error or a modeling inadequacy; and if a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model, and defined the limits of application. The tested simulation model is found to be acceptable but valid only in a certain dynamical range. Several insights for the deficiencies of the model are reported in the analysis but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time and cost efficient simulation projects with
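    The pass/fail core of such a validation paradigm can be sketched simply: for each declared test maneuver, compute a validation metric between measured and simulated responses and compare it against the declared criterion, deeming the model "not invalid" only if every maneuver passes. The metric (normalized RMS error), threshold, and signals below are illustrative assumptions, not taken from the thesis.

```python
# Sketch of the pass/fail step of a maneuver-based validation paradigm.
# Metric, criterion, and signals are invented for illustration.

def nrmse(measured, simulated):
    """Normalized RMS error between two equally sampled signals."""
    n = len(measured)
    rms = (sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n) ** 0.5
    span = max(measured) - min(measured)
    return rms / span

def validate(maneuvers, criterion=0.1):
    """Return per-maneuver verdicts and an overall 'not invalid' flag."""
    verdicts = {name: nrmse(m, s) <= criterion
                for name, (m, s) in maneuvers.items()}
    return verdicts, all(verdicts.values())

maneuvers = {   # name -> (measured response, simulated response)
    "step_steer": ([0.0, 1.0, 2.0, 2.0], [0.0, 1.1, 1.9, 2.0]),
    "sine_sweep": ([0.0, 1.0, 0.0, -1.0], [0.1, 0.9, 0.05, -0.95]),
}
verdicts, not_invalid = validate(maneuvers)
print(verdicts, not_invalid)
```

    Keeping the per-maneuver verdicts, rather than only the overall flag, is what lets a failed run be diagnosed as a modeling error versus an inadequacy, and lets a conditional validity range be defined as the thesis describes.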

  6. Revisiting interaction in knowledge translation

    Directory of Open Access Journals (Sweden)

    Zackheim Lisa

    2007-10-01

    Full Text Available Abstract Background Although the study of research utilization is not new, there has been increased emphasis on the topic over the recent past. Science push models that are researcher driven and controlled and demand pull models emphasizing users/decision-maker interests have largely been abandoned in favour of more interactive models that emphasize linkages between researchers and decisionmakers. However, despite these and other theoretical and empirical advances in the area of research utilization, there remains a fundamental gap between the generation of research findings and the application of those findings in practice. Methods Using a case approach, the current study looks at the impact of one particular interaction approach to research translation used by a Canadian funding agency. Results Results suggest there may be certain conditions under which different levels of decisionmaker involvement in research will be more or less effective. Four attributes are illuminated by the current case study: stakeholder diversity, addressability/actionability of results, finality of study design and methodology, and politicization of results. Future research could test whether these or other variables can be used to specify some of the conditions under which different approaches to interaction in knowledge translation are likely to facilitate research utilization. Conclusion This work suggests that the efficacy of interaction approaches to research translation may be more limited than current theory proposes and underscores the need for more completely specified models of research utilization that can help address the slow pace of change in this area.

  7. Methodology of inspections to carry out the nuclear outages model

    International Nuclear Information System (INIS)

    Aycart, J.; Mortenson, S.; Fourquet, J. M.

    2005-01-01

    Before the nuclear generation industry was deregulated in the United States, refueling and maintenance outages in nuclear power plants usually lasted around 100 days. After deregulation took effect, improved capacity factors and performance became more important. As a result, it became essential to reduce the critical path time during the outage, which meant that activities that had typically been done in series had to be executed in parallel. The new outage model required the development of new tools and new processes. The 360-degree platform developed by GE Energy has made it possible to execute multiple activities in parallel. Various in-vessel visual inspection (IVVI) systems can now simultaneously perform inspections on the reactor pressure vessel (RPV) components. The larger number of inspection systems in turn results in a larger volume of data, with the risk of increasing the time needed for examining them and postponing the end of the analysis phase, which is critical for the outage. To decrease data analysis times, the IVVI Digitalisation process has been developed. With this process, the IVVI data are sent via a high-speed transmission line to a site outside the plant called the Center of Excellence (COE), where a team of Level III experts is in charge of analyzing them. The tools for the different product lines are being developed to interfere with each other as little as possible, thus minimizing the impact on the critical path of plant refueling activities. Methods are also being developed to increase the intervals between inspections. In accordance with the guidelines of the Boiling Water Reactor Vessel and Internals Project (BWRVIP), the intervals between inspections are typically longer if ultrasonic volumetric inspections are performed than if the scope is limited to IVVI. (Author)

  8. Machine Translation

    Indian Academy of Sciences (India)

    Research MT System Example: The 'Janus' Translating Phone Project. The Janus ... based on laptops, and simultaneous translation of two speakers in a dialogue. For more ..... The current focus in MT research is on using machine learning.

  9. A Conceptual Model for the Translation of Bioethics Research and Scholarship.

    Science.gov (United States)

    Mathews, Debra J H; Hester, D Micah; Kahn, Jeffrey; McGuire, Amy; McKinney, Ross; Meador, Keith; Philpott-Jones, Sean; Youngner, Stuart; Wilfond, Benjamin S

    2016-09-01

    While the bioethics literature demonstrates that the field has spent substantial time and thought over the last four decades on the goals, methods, and desired outcomes for service and training in bioethics, there has been less progress defining the nature and goals of bioethics research and scholarship. This gap makes it difficult both to describe the breadth and depth of these areas of bioethics and, importantly, to gauge their success. However, the gap also presents us with an opportunity to define this scope of work for ourselves and to help shape the broader conversation about the impact of academic research. Because of growing constraints on academic funding, researchers and scholars in many fields are being asked to demonstrate and also forecast the value and impact of their work. To do that, and also to satisfy ourselves that our work has meaningful effect, we must understand how our work can motivate change and how that change can be meaningfully measured. In a field as diverse as bioethics, the pathways to and metrics of change will likewise be diverse. It is therefore critical that any assessment of the impact of bioethics research and scholarship be informed by an understanding of the nature of the work, its goals, and how those goals can and ought to be furthered. In this paper, we propose a conceptual model that connects individual bioethics projects to the broader goals of scholarship, describing the translation of research and scholarly output into changes in thinking, practice, and policy. One of the key implications of the model is that impact in bioethics is generally the result of a collection of projects rather than of any single piece of research or scholarship. Our goal is to lay the groundwork for a thoroughgoing conversation about bioethics research and scholarship that will advance and shape the important conversation about their impact. © 2016 The Hastings Center.

  10. An Overview of Models, Methods, and Reagents Developed for Translational Autoimmunity Research in the Common Marmoset (Callithrix jacchus)

    OpenAIRE

    Jagessar, S. Anwar; Vierboom, Michel; Blezer, Erwin L.A.; Bauer, Jan; Hart, Bert A. ‘t; Kap, Yolanda S.

    2013-01-01

    textabstractThe common marmoset (Callithrix jacchus) is a small-bodied Neotropical primate and a useful preclinical animal model for translational research into autoimmune-mediated inflammatory diseases (AIMID), such as rheumatoid arthritis (RA) and multiple sclerosis (MS). The animal model for MS established in marmosets has proven their value for exploratory research into (etio) pathogenic mechanisms and for the evaluation of new therapies that cannot be tested in lower species because of t...

  11. A methodology and supply chain management inspired reference ontology for modeling healthcare teams.

    Science.gov (United States)

    Kuziemsky, Craig E; Yazdi, Sara

    2011-01-01

    Numerous studies and strategic plans are advocating more team-based healthcare delivery that is facilitated by information and communication technologies (ICTs). However, before we can design ICTs to support teams we need a solid conceptual model of team processes and a methodology for using such a model in healthcare settings. This paper draws upon success in the supply chain management domain to develop a reference ontology of healthcare teams and a methodology for modeling teams to instantiate the ontology in specific settings. This research can help us understand how teams function and how we can design ICTs to support teams.

  12. Rapid acquisition and model-based analysis of cell-free transcription–translation reactions from nonmodel bacteria

    Science.gov (United States)

    Wienecke, Sarah; Ishwarbhai, Alka; Tsipa, Argyro; Aw, Rochelle; Kylilis, Nicolas; Bell, David J.; McClymont, David W.; Jensen, Kirsten; Biedendieck, Rebekka

    2018-01-01

    Native cell-free transcription–translation systems offer a rapid route to characterize the regulatory elements (promoters, transcription factors) for gene expression from nonmodel microbial hosts, which can be difficult to assess through traditional in vivo approaches. One such host, Bacillus megaterium, is a giant Gram-positive bacterium with potential biotechnology applications, although many of its regulatory elements remain uncharacterized. Here, we have developed a rapid automated platform for measuring and modeling in vitro cell-free reactions and have applied this to B. megaterium to quantify a range of ribosome binding site variants and previously uncharacterized endogenous constitutive and inducible promoters. To provide quantitative models for cell-free systems, we have also applied a Bayesian approach to infer ordinary differential equation model parameters by simultaneously using time-course data from multiple experimental conditions. Using this modeling framework, we were able to infer previously unknown transcription factor binding affinities and quantify the sharing of cell-free transcription–translation resources (energy, ribosomes, RNA polymerases, nucleotides, and amino acids) using a promoter competition experiment. This allows insights into resource limiting-factors in batch cell-free synthesis mode. Our combined automated and modeling platform allows for the rapid acquisition and model-based analysis of cell-free transcription–translation data from uncharacterized microbial cell hosts, as well as resource competition within cell-free systems, which potentially can be applied to a range of cell-free synthetic biology and biotechnology applications. PMID:29666238
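    The kind of ordinary differential equation model fitted in such work can be illustrated with a minimal two-species transcription-translation model: mRNA is produced at a constant rate and degraded, and protein is translated from mRNA. The rate constants below are invented; in the paper's Bayesian framework such parameters would be inferred jointly from multi-condition time-course data.

```python
# Minimal sketch of an ODE model of batch cell-free transcription-translation,
# integrated with forward Euler. All rate constants are invented.

def simulate(k_tx=2.0, d_m=0.2, k_tl=5.0, dt=0.01, t_end=10.0):
    """Return a (time, mRNA, protein) trajectory."""
    m, p, t = 0.0, 0.0, 0.0
    traj = []
    while t < t_end:
        dm = k_tx - d_m * m   # transcription minus mRNA decay
        dp = k_tl * m         # translation (no protein decay in batch mode)
        m, p, t = m + dm * dt, p + dp * dt, t + dt
        traj.append((t, m, p))
    return traj

traj = simulate()
t, m, p = traj[-1]
print(round(m, 2), round(p, 1))  # mRNA approaches k_tx / d_m = 10
```

    Resource limitation in batch mode, a key point of the abstract, would enter such a model by making k_tx and k_tl depend on shared, depleting pools (energy, ribosomes, polymerases) rather than staying constant.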

  13. L-leucine partially rescues translational and developmental defects associated with zebrafish models of Cornelia de Lange syndrome.

    Science.gov (United States)

    Xu, Baoshan; Sowa, Nenja; Cardenas, Maria E; Gerton, Jennifer L

    2015-03-15

    Cohesinopathies are human genetic disorders that include Cornelia de Lange syndrome (CdLS) and Roberts syndrome (RBS) and are characterized by defects in limb and craniofacial development as well as mental retardation. The developmental phenotypes of CdLS and other cohesinopathies suggest that mutations in the structure and regulation of the cohesin complex during embryogenesis interfere with gene regulation. In a previous project, we showed that RBS was associated with highly fragmented nucleoli and defects in both ribosome biogenesis and protein translation. l-leucine stimulation of the mTOR pathway partially rescued translation in human RBS cells and development in zebrafish models of RBS. In this study, we investigate protein translation in zebrafish models of CdLS. Our results show that phosphorylation of RPS6 as well as 4E-binding protein 1 (4EBP1) was reduced in nipbla/b, rad21 and smc3-morphant embryos, a pattern indicating reduced translation. Moreover, protein biosynthesis and rRNA production were decreased in the cohesin morphant embryo cells. l-leucine partly rescued protein synthesis and rRNA production in the cohesin morphants and partially restored phosphorylation of RPS6 and 4EBP1. Concomitantly, l-leucine treatment partially improved cohesinopathy embryo development including the formation of craniofacial cartilage. Interestingly, we observed that alpha-ketoisocaproate (α-KIC), which is a keto derivative of leucine, also partially rescued the development of rad21 and nipbla/b morphants by boosting mTOR-dependent translation. In summary, our results suggest that cohesinopathies are caused in part by defective protein synthesis, and stimulation of the mTOR pathway through l-leucine or its metabolite α-KIC can partially rescue development in zebrafish models for CdLS. © The Author 2014. Published by Oxford University Press.

  14. Towards A Model-Based Prognostics Methodology For Electrolytic Capacitors: A Case Study Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter...
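    As an illustration of the model-driven idea named in this record, the sketch below runs a scalar Kalman filter over noisy capacitance measurements under a linear degradation model and extrapolates remaining useful life to an end-of-life threshold. The degradation model, noise levels, and threshold are all assumptions for illustration, not values from the paper.

```python
# Sketch of a scalar Kalman filter tracking normalized capacitance under an
# assumed linear degradation model, then extrapolating remaining useful life
# (RUL) to an end-of-life threshold. All numbers are invented.

def kalman_track(measurements, decay_per_step, q=1e-6, r=1e-4):
    x, p = measurements[0], 1.0              # initial state and covariance
    estimates = []
    for z in measurements:
        x, p = x - decay_per_step, p + q     # predict: linear degradation
        k = p / (p + r)                      # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p  # update with measurement z
        estimates.append(x)
    return estimates

true_decay = 0.004
# simulated noisy readings (normalized capacitance, 1.0 = pristine)
meas = [1.0 - true_decay * t + (-1) ** t * 0.002 for t in range(30)]
est = kalman_track(meas, decay_per_step=true_decay)

eol = 0.85                                # end-of-life threshold
rul_steps = (est[-1] - eol) / true_decay  # aging steps until threshold
print(round(est[-1], 3), round(rul_steps, 1))
```

    In a full prognostics setting the decay rate itself would be part of the filter state and updated from accelerated-aging data, so the RUL estimate would carry an uncertainty band rather than a point value.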

  15. Application of SADT and ARIS methodologies for modeling and management of business processes of information systems

    Directory of Open Access Journals (Sweden)

    O. V. Fedorova

    2018-01-01

    Full Text Available The article is devoted to the application of the SADT and ARIS methodologies for modeling and management of business processes of information systems. The relevance of this article is beyond doubt, because the design of the architecture of information systems, based on a thorough system analysis of the subject area, is of paramount importance for the development of information systems in general. The authors conducted a serious analysis of the application of the SADT and ARIS methodologies for modeling and managing business processes of information systems. The analysis was carried out both in terms of modeling business processes (notation and application of the CASE tool) and in terms of business process management. The first point of view reflects the interaction of the business analyst and the programmer in the development of the information system. The second point of view is the interaction of the business analyst and the customer. The SADT methodology is the basis of many modern methodologies for modeling business processes. Using the methodologies of the IDEF family, it is possible to efficiently display and analyze activity models of a wide range of complex information systems in various aspects. The CASE tool ARIS is a suite of tools for analysis and modeling of an organization's activities. The methodical basis of ARIS is a set of different modeling methods that reflect different views on the system under study. The authors' conclusions are fully justified. The results of the work can be useful for specialists in the field of modeling business processes of information systems. In addition, the article can serve as a guide when developing curricula for students specializing in information and management fields, updating the content and structure of courses on modeling the architecture of information systems and organization management.

  16. Machine translation

    Energy Technology Data Exchange (ETDEWEB)

    Nagao, M

    1982-04-01

    Each language has its own structure. In translating one language into another, language attributes and grammatical interpretation must be defined in an unambiguous form. In order to parse a sentence, it is necessary to recognize its structure. A so-called context-free grammar can help in this respect for machine translation and machine-aided translation. Problems to be solved in studying machine translation are taken up in the paper, which discusses subjects in semantics, syntactic analysis, and translation software. 14 references.
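    The role of a context-free grammar in recognizing sentence structure, as mentioned above, can be shown with a toy CYK recognizer over a grammar in Chomsky normal form. The grammar, lexicon, and sentences below are invented for illustration.

```python
# Toy CYK recognizer: decides whether a sentence is derivable from a tiny
# context-free grammar in Chomsky normal form. Grammar and lexicon invented.

GRAMMAR = {                      # (rhs1, rhs2) -> lhs
    ("NP", "VP"): "S",
    ("Det", "N"): "NP",
    ("V", "NP"): "VP",
}
LEXICON = {"the": "Det", "cat": "N", "dog": "N", "saw": "V"}

def cyk(words):
    n = len(words)
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i + 1].add(LEXICON[w])       # fill in word categories
    for span in range(2, n + 1):              # grow spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for a in table[i][k]:
                    for b in table[k][j]:
                        if (a, b) in GRAMMAR:
                            table[i][j].add(GRAMMAR[(a, b)])
    return "S" in table[0][n]

print(cyk("the cat saw the dog".split()))  # True
print(cyk("the cat saw".split()))          # False
```

    A real MT front end would keep the parse trees (not just the yes/no answer) so that the recognized structure can drive transfer and generation in the target language.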

  17. Novel Methodology for Functional Modeling and Simulation of Wireless Embedded Systems

    Directory of Open Access Journals (Sweden)

    Sosa Morales Emma

    2008-01-01

    Full Text Available A novel methodology is presented for the modeling and simulation of wireless embedded systems. Tight interaction between the analog and the digital functionality makes the design and verification of such systems a real challenge. The applied methodology brings together the functional models of the baseband algorithms written in C language with the circuit descriptions at behavioral level in Verilog or Verilog-AMS for system simulations in a single-kernel environment. The physical layer of an ultrawideband system has been successfully modeled and simulated. The results confirm that this methodology provides a standardized framework to efficiently and accurately simulate complex mixed-signal applications for embedded systems.

  18. Fear extinction and BDNF: Translating animal models of PTSD to the clinic

    Science.gov (United States)

    Andero, Raül; Ressler, Kerry J

    2012-01-01

    Brain-derived neurotrophic factor (BDNF) is the most studied neurotrophin involved in the synaptic plasticity processes that are required for long-term learning and memory. Specifically, BDNF gene expression and activation of its high-affinity TrkB receptor are necessary in the amygdala, hippocampus and prefrontal cortex for the formation of emotional memories, including fear memories. Among the psychiatric disorders with altered fear processing is Post-traumatic Stress Disorder (PTSD), which is characterized by an inability to extinguish fear memories. Since BDNF appears to enhance extinction of fear, targeting impaired extinction in anxiety disorders such as PTSD via BDNF signalling may be an important and novel way to enhance treatment efficacy. The aim of this review is to provide a translational point of view that stems from findings on the BDNF regulation of synaptic plasticity and fear extinction. In addition, different systems seem to alter fear extinction through BDNF modulation, like the endocannabinoid system and the hypothalamic-pituitary-adrenal (HPA) axis. Recent work also finds that the pituitary adenylate cyclase-activating polypeptide (PACAP) and the PAC1 receptor, which are upstream of BDNF activation, may be implicated in PTSD. Especially interesting are data indicating that exogenous fear extinction enhancers such as antidepressants, histone deacetylase inhibitors (HDACi) and D-cycloserine, a partial NMDA agonist, may act through or in concert with the BDNF-TrkB system. Finally, we review studies where recombinant BDNF and a putative TrkB agonist, 7,8-DHF, may enhance extinction of fear. These approaches may lead to novel agents that improve extinction in animal models and eventually humans. PMID:22530815

  19. Ab initio optimization principle for the ground states of translationally invariant strongly correlated quantum lattice models.

    Science.gov (United States)

    Ran, Shi-Ju

    2016-05-01

    In this work, a simple and fundamental numeric scheme, dubbed the ab initio optimization principle (AOP), is proposed for the ground states of translationally invariant strongly correlated quantum lattice models. The idea is to transform an NP-hard ground-state simulation with infinite degrees of freedom into a single optimization problem of a local function with a finite number of physical and ancillary degrees of freedom. This work contributes mainly in the following aspects: (1) AOP provides a simple and efficient scheme to simulate the ground state by solving a local optimization problem. Its solution contains two kinds of boundary states, one of which plays the role of an entanglement bath that mimics the interactions between a supercell and the infinite environment, while the other gives the ground state in tensor network (TN) form. (2) In the TN picture, a novel decomposition, termed tensor ring decomposition (TRD), is proposed to implement AOP. Instead of following the contraction-truncation scheme used by many existing TN-based algorithms, TRD solves the contraction of a uniform TN in the opposite way, by encoding the contraction in a set of self-consistent equations that automatically reconstruct the whole TN, making the simulation simple and unified. (3) AOP inherits and develops the ideas of several well-established methods, including the density matrix renormalization group (DMRG), infinite time-evolving block decimation (iTEBD), network contractor dynamics, and density matrix embedding theory, providing a unified perspective previously missing in this field. (4) AOP and TRD also carry implications for existing TN-based algorithms: a modified iTEBD is suggested, and the two-dimensional (2D) AOP is argued to be an intrinsic 2D extension of DMRG based on the infinite projected entangled pair state. This paper focuses on one-dimensional quantum models to present AOP. The benchmark is given on a transverse Ising
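The benchmark system named in the abstract, the transverse-field Ising chain, can be cross-checked on small sizes against a brute-force reference. The sketch below is not AOP or TRD; it is a plain exact-diagonalization baseline (power iteration on a shifted dense Hamiltonian) for a short open chain, of the kind commonly used to validate tensor-network results. All parameter values are illustrative.

```python
import math

def build_h(n_sites, j, h):
    """Dense Hamiltonian of an open transverse-field Ising chain:
    H = -J * sum_i sz_i sz_{i+1} - h * sum_i sx_i  (bit i of the basis
    index encodes spin i)."""
    dim = 1 << n_sites
    ham = [[0.0] * dim for _ in range(dim)]
    for s in range(dim):
        for i in range(n_sites - 1):      # diagonal ZZ couplings
            zi = 1 - 2 * ((s >> i) & 1)
            zj = 1 - 2 * ((s >> (i + 1)) & 1)
            ham[s][s] -= j * zi * zj
        for i in range(n_sites):          # off-diagonal X terms flip one spin
            ham[s ^ (1 << i)][s] -= h
    return ham

def ground_energy(ham, iters=3000):
    """Power iteration on (c*I - H); the Gershgorin bound c makes the
    ground state of H the dominant eigenvector of the shifted matrix."""
    dim = len(ham)
    c = max(sum(abs(x) for x in row) for row in ham) + 1.0
    v = [1.0] * dim
    for _ in range(iters):
        w = [c * v[s] - sum(ham[s][t] * v[t] for t in range(dim))
             for s in range(dim)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    hv = [sum(ham[s][t] * v[t] for t in range(dim)) for s in range(dim)]
    return sum(v[s] * hv[s] for s in range(dim))
```

For 4 sites with J = 1, h = 0 the ground energy is -3 (three satisfied bonds), and with J = 0, h = 1 it is -4; such limits are the usual sanity checks before trusting a variational result.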

  20. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    Science.gov (United States)

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model selected for a petroleum resource assessment depends upon the purpose of the assessment, the basic geologic assumptions for the area, the type of available data, the time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. More than one geologic model may also be needed in a single project, for assessing different regions of the study area or for cross-checking resource estimates. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis, and the corresponding quantitative methodologies usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) include a variety of geologic models; (2) use an analytic methodology instead of Monte Carlo simulation; (3) possess the capacity to aggregate estimates from many areas that have been assessed by different geologic models; and (4) run quickly on a microcomputer. The geologic models are of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.
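The contrast between Monte Carlo simulation and an analytic methodology can be illustrated on a toy aggregation problem (not the USGS system itself): total undiscovered resource modeled as a Poisson number of fields with lognormal sizes. By independence the analytic mean is E[T] = λ·exp(μ + σ²/2), which a Monte Carlo run merely recovers by sampling. The distributions and parameter values below are assumptions for illustration.

```python
import math
import random

def analytic_mean(lam, mu, sigma):
    """E[T] = E[N] * E[S]; for lognormal S, E[S] = exp(mu + sigma^2 / 2)."""
    return lam * math.exp(mu + 0.5 * sigma ** 2)

def sample_poisson(rng, lam):
    """Knuth's method, adequate for small lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def monte_carlo_mean(lam, mu, sigma, trials=20_000, seed=7):
    """Simulated mean total resource: Poisson field count, lognormal sizes."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        n_fields = sample_poisson(rng, lam)
        total += sum(rng.lognormvariate(mu, sigma) for _ in range(n_fields))
    return total / trials
```

The analytic route gives the answer in closed form in microseconds; the Monte Carlo route needs tens of thousands of trials to agree to a few percent, which is the speed argument requirement (2) above is making.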

  1. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    The aim of this article is to present a methodological model for analyzing students' group interaction to improve their essays in online learning environments based on asynchronous, written communication. In these environments, teacher and student scaffolds for discussion are essential to promote interaction, and one such scaffold is feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills that gap, contributing to the analysis of feedback processes as students discuss collaboratively in a specific case of writing assignments. A review of different methodological models was carried out to define a framework suited to analyzing the relationship between written, asynchronous group interaction and students' activity and the changes incorporated into the final text. The proposed model includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and to demonstrate its operativity regarding how students incorporate such feedback into their essays.

  2. Electricity Capacity Expansion Modeling, Analysis, and Visualization: A Summary of High-Renewable Modeling Experiences (Chinese Translation)

    Energy Technology Data Exchange (ETDEWEB)

    Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zhou, Ella [National Renewable Energy Lab. (NREL), Golden, CO (United States); Getman, Dan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Arent, Douglas J. [Joint Inst. for Strategic Energy Analysis, Golden, CO (United States)

    2015-10-01

    This is the Chinese translation of NREL/TP-6A20-64831. Mathematical and computational models are widely used for the analysis and design of both physical and financial systems. Modeling the electric grid is of particular importance to China for three reasons. First, power-sector assets are expensive and long-lived, and they are critical to any country's development. China's electric load, transmission, and other energy-related infrastructure are expected to continue to grow rapidly; therefore it is crucial to understand and help plan for the future in which those assets will operate. Second, China has dramatically increased its deployment of renewable energy (RE), and is likely to continue further accelerating such deployment over the coming decades. Careful planning and assessment of the various aspects (technical, economic, social, and political) of integrating a large amount of renewables on the grid is required. Third, companies need the tools to develop a strategy for their own involvement in the power market China is now developing, and to enable a possible transition to an efficient and high RE future.

  3. Mathematical models of tumor growth: translating absorbed dose to tumor control probability

    International Nuclear Information System (INIS)

    Sgouros, G.

    1996-01-01

    Full text: The dose-rate in internal emitter therapy is low and time-dependent as compared to external beam radiotherapy. Once the total absorbed dose delivered to a target tissue is calculated, however, most dosimetric analyses of radiopharmaceuticals are considered complete. To translate absorbed dose estimates obtained for internal emitter therapy to biologic effect, the growth characteristics, repair capacity, and radiosensitivity of the tumor must be considered. Tumor growth may be represented by the Gompertz equation in which tumor cells increase at an exponential growth rate that is itself decreasing at an exponential rate; as the tumor increases in size, the growth rate diminishes. The empirical Gompertz expression for tumor growth may be derived from a mechanistic model in which growth is represented by a balance between tumor-cell birth and loss. The birth rate is assumed to be fixed, while the cell loss rate is time-dependent and increases with tumor size. The birth rate of the tumors may be related to their potential doubling time. Multiple biopsies of individual tumors have demonstrated a heterogeneity in the potential doubling time of tumors. By extending the mechanistic model described above to allow for sub-populations of tumor cells with different birth rates, the effect of kinetic heterogeneity within a tumor may be examined. Model simulations demonstrate that the cell kinetic parameters of a tumor are predicted to change over time and measurements obtained using a biopsy are unlikely to reflect the kinetics of the tumor throughout its growth history. A decrease in overall tumor mass, in which each sub-population is reduced in proportion to its cell number, i.e., the log-kill assumption, leads to re-growth of a tumor that has a greater proliferation rate. Therapy that is linked to the potential doubling time or to the effective proliferation rate of the tumor may lead to re-growth of a tumor that is kinetically unchanged. The simplest model of
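The Gompertz growth described above, an exponential growth rate that itself decays exponentially, can be written dN/dt = A·e^(−Bt)·N, with closed form N(t) = N₀·exp((A/B)(1 − e^(−Bt))) and plateau N₀·e^(A/B). A minimal numerical sketch (parameter values are illustrative, not from the paper):

```python
import math

def gompertz(n0, a, b, t):
    """Closed-form Gompertz curve: initial growth rate a decays as exp(-b*t),
    so the tumor plateaus at n0 * exp(a / b)."""
    return n0 * math.exp((a / b) * (1.0 - math.exp(-b * t)))

def gompertz_euler(n0, a, b, t, steps=100_000):
    """Forward-Euler integration of dN/dt = a * exp(-b*t) * N, as a check
    on the closed form."""
    dt = t / steps
    n = n0
    for i in range(steps):
        ti = i * dt
        n += a * math.exp(-b * ti) * n * dt
    return n
```

With a = 0.5 and b = 0.1 the plateau is e^5 ≈ 148 times the initial cell number; the diminishing growth rate as the tumor enlarges is exactly the behavior the mechanistic birth/loss balance in the abstract reproduces.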

  4. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) waste form/waste package degradation; (2) waste package isotopic inventory; (3) criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) probability of criticality (for each potential critical configuration as well as the total event); and (5) criticality consequences. The purpose of this summary report is to provide the status of the model reports and a schedule for their completion. This report also provides information on model report content and validation. The model reports and their revisions are being generated as a result of: (1) commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) open items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during the DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002).

  5. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents; hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed, classified as first and second generation according to their viewpoints on problem-solving. Accident analysis can be carried out with three families of techniques: sequential, epidemiological and systemic, with marine accidents falling under the epidemiological techniques. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model as applied to marine accidents. The MOP model can effectively describe the relationships among the other factors that affect accidents, whereas the HEART methodology focuses only on human factors.
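The core HEART calculation multiplies a generic-task nominal error probability by a factor for each applicable error-producing condition (EPC): HEP = nominal × Π[(max_effect − 1)·proportion + 1]. A minimal sketch of that arithmetic; the numeric values below are illustrative, not taken from this paper or from the official HEART tables.

```python
def heart_hep(nominal_hep, epcs):
    """HEART human error probability: a generic-task nominal unreliability
    scaled by each error-producing condition.

    epcs: list of (max_effect, assessed_proportion) pairs, where
    assessed_proportion in [0, 1] is the analyst's judgement of how much
    of the EPC's maximum effect applies to this task."""
    hep = nominal_hep
    for max_effect, proportion in epcs:
        hep *= (max_effect - 1.0) * proportion + 1.0
    return min(hep, 1.0)  # a probability cannot exceed 1

# Illustration: nominal HEP 0.003 with two EPCs,
# (max effect 11, proportion 0.4) and (max effect 3, proportion 0.5):
# 0.003 * 5.0 * 2.0 = 0.03
```

The multiplicative structure is why a handful of strong EPCs (fatigue, time pressure, poor interface) can push a routine task's error probability up by one or two orders of magnitude.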

  6. So, You Think You Have an Idea: A Practical Risk Reduction-Conceptual Model for Academic Translational Research

    Directory of Open Access Journals (Sweden)

    John Schwartz

    2017-04-01

    elements of a market-driven translational program: (1) problem identification and validation; (2) defining the conceptual model of disease; and (3) risk evaluation and mitigation strategies.

  7. Methodologies for Wind Turbine and STATCOM Integration in Wind Power Plant Models for Harmonic Resonances Assessment

    DEFF Research Database (Denmark)

    Freijedo Fernandez, Francisco Daniel; Chaudhary, Sanjay Kumar; Guerrero, Josep M.

    2015-01-01

    This paper approaches modelling methodologies for the integration of wind turbines and STATCOMs in harmonic resonance studies. Firstly, an admittance equivalent model representing the harmonic signature of grid-connected voltage source converters is provided; a simplified type IV wind turbine model is then straightforward. This linear modelling is suitable for representing the wind turbine in the range of frequencies at which harmonic interactions are likely. Even though the admittance method is suitable for both frequency- and time-domain studies, some limitations arise in practice when implementing it in the time domain. As an alternative, a power-based averaged modelling is also proposed. Type IV wind turbine harmonic signature and STATCOM active harmonic mitigation are considered for the simulation case studies. Simulation results provide good insight into the features and limitations of the proposed methodologies.

  8. Translating VDM to Alloy

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    2013-01-01

    specifications. However, to take advantage of the automated analysis of Alloy, the model-oriented VDM specifications must be translated into constraint-based Alloy specifications. We describe how a subset of VDM can be translated into Alloy and how assertions can be expressed in VDM and checked by the Alloy

  9. Translating India

    CERN Document Server

    Kothari, Rita

    2014-01-01

    The cultural universe of urban, English-speaking middle class in India shows signs of growing inclusiveness as far as English is concerned. This phenomenon manifests itself in increasing forms of bilingualism (combination of English and one Indian language) in everyday forms of speech - advertisement jingles, bilingual movies, signboards, and of course conversations. It is also evident in the startling prominence of Indian Writing in English and somewhat less visibly, but steadily rising, activity of English translation from Indian languages. Since the eighties this has led to a frenetic activity around English translation in India's academic and literary circles. Kothari makes this very current phenomenon her chief concern in Translating India.   The study covers aspects such as the production, reception and marketability of English translation. Through an unusually multi-disciplinary approach, this study situates English translation in India amidst local and global debates on translation, representation an...

  10. Translating Inclusion

    DEFF Research Database (Denmark)

    Fallov, Mia Arp; Birk, Rasmus

    2018-01-01

    The purpose of this paper is to explore how practices of translation shape particular paths of inclusion for people living in marginalized residential areas in Denmark. Inclusion, we argue, is not an end-state, but rather something which must be constantly performed. Active citizenship, today, is not merely a question of participation, but of learning to become active in all spheres of life. The paper draws on empirical examples from multi-sited field work in 6 different sites of local community work in Denmark to demonstrate how different dimensions of translation are involved in shaping active citizenship. We propose the following dimensions of translation: translating authority, translating language, translating social problems. The paper takes its theoretical point of departure from assemblage urbanism, arguing that cities are heterogeneous assemblages of socio-material interactions

  11. From LCAs to simplified models: a generic methodology applied to wind power electricity.

    Science.gov (United States)

    Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle

    2013-02-05

    This study presents a generic methodology for producing simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on global sensitivity analysis to identify the key parameters explaining the impact variability of systems over their life cycle; simplified models are then built upon those key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse gas (GHG) performance was plotted as a function of these key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated without undertaking an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them a robust but simple support tool for assessing the environmental performance of energy systems.
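The shape of such a simplified model can be illustrated with the two key parameters the study identifies, load factor and lifetime: lifecycle emissions divided by lifetime electricity output. This is a generic sketch of the functional form, not the fitted curves from the paper, and all numbers are illustrative.

```python
def ghg_per_kwh(embodied_kg_co2, rated_kw, load_factor, lifetime_years):
    """Illustrative simplified model: total lifecycle (embodied) emissions
    spread over lifetime electricity output. Returns g CO2-eq per kWh."""
    lifetime_kwh = rated_kw * load_factor * 8760.0 * lifetime_years
    return 1000.0 * embodied_kg_co2 / lifetime_kwh

# Hypothetical 2 MW turbine with 1 500 t CO2-eq embodied, 25% load factor,
# 20-year lifetime:
example = ghg_per_kwh(1.5e6, 2000.0, 0.25, 20.0)  # ~17 g CO2-eq/kWh
```

Because both key parameters sit in the denominator, a windier site or a longer-lived machine lowers the per-kWh footprint hyperbolically, which is why these two variables dominate the variability found by the sensitivity analysis.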

  12. Through the Looking Glass: No Wonderland Yet! (The Reciprocal Relationship between Methodology and Models of Reality).

    Science.gov (United States)

    Unger, Rhoda Kesler

    1983-01-01

    Discusses the relationship between conceptual frameworks and methodology in psychology. Argues that models of reality influence research in terms of question selection, causal factors hypothesized, and interpretation of data. Considers the position and role of women as objects and agents of research using a sociology of knowledge perspective.…

  13. Nirex methodology for scenario and conceptual model development. An international peer review

    International Nuclear Information System (INIS)

    1999-06-01

    Nirex has responsibilities for nuclear waste management in the UK. The company's top-level objectives are to maintain technical credibility on deep disposal, to gain public acceptance for a deep geologic repository, and to provide relevant advice to customers on the safety implications of their waste packaging proposals. Nirex utilizes peer reviews as appropriate to keep its scientific tools up to date and to periodically verify the quality of its products. The NEA formed an International Review Team (IRT) consisting of four internationally recognised experts plus a member of the NEA Secretariat. The IRT performed an in-depth analysis of five Nirex scientific reports identified in the terms of reference of the review. The review was primarily to judge whether the Nirex methodology provides an adequate framework to support the building of a future licensing safety case. Another objective was to judge whether the methodology could aid in establishing a better understanding, and, ideally, enhance acceptance of a repository among stakeholders. Methodologies for conducting safety assessments include, at a very basic level, the identification of features, events, and processes (FEPs) relevant to the system at hand, their combination into scenarios for analysis, and the formulation of conceptual models to be addressed through numerical modelling. The main conclusion of the IRT is that Nirex has developed a potentially sound methodology for the identification and analysis of FEPs and for the identification of conceptual model needs and model requirements. The work is still in progress and is not yet complete. (R.P.)

  14. A Methodological Review of Structural Equation Modelling in Higher Education Research

    Science.gov (United States)

    Green, Teegan

    2016-01-01

    Despite increases in the number of articles published in higher education journals using structural equation modelling (SEM), research addressing their statistical sufficiency, methodological appropriateness and quantitative rigour is sparse. In response, this article provides a census of all covariance-based SEM articles published up until 2013…

  15. Eigenvectors determination of the ribosome dynamics model during mRNA translation using the Kleene Star algorithm

    Science.gov (United States)

    Ernawati; Carnia, E.; Supriatna, A. K.

    2018-03-01

    Eigenvalues and eigenvectors in max-plus algebra play the same important role as eigenvalues and eigenvectors in conventional algebra. In max-plus algebra, they are useful for understanding the dynamics of a system, for example in scheduling train systems, production systems, and learning activities in moving classes. In protein translation, where ribosomes move unidirectionally along the mRNA strand to recruit the amino acids that make up the protein, eigenvalues and eigenvectors are used to calculate protein production rates and the density of ribosomes on the mRNA. It is therefore important to examine the eigenvalues and eigenvectors of the protein translation process. In this paper, an eigenvector formula is given for a model of ribosome dynamics during mRNA translation using the Kleene star algorithm; the resulting formula is simpler and easier to apply to the system than that introduced elsewhere. This paper also discusses the properties of the model's matrix B_λ^{⊗n}. Among the important properties, it always has the same elements in the first column for n = 1, 2, … if the eigenvalue is the initiation time, λ = τ_in, and that column is the eigenvector of the model corresponding to λ.
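The Kleene star construction referred to here is standard in max-plus algebra (⊕ = max, ⊗ = +): once the eigenvalue λ is known, columns of the star (A − λ)* associated with critical nodes are eigenvectors of A. The self-contained sketch below illustrates that construction on a tiny hypothetical matrix, not the paper's ribosome matrix B_λ^{⊗n}.

```python
NEG_INF = float("-inf")

def mp_mul(A, B):
    """Max-plus matrix product: (A ⊗ B)[i][j] = max_k (A[i][k] + B[k][j])."""
    return [[max(A[i][k] + B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def kleene_star(A):
    """A* = E ⊕ A ⊕ A⊗A ⊕ ... ⊕ A^n (finite when the max cycle mean <= 0).
    E is the max-plus identity: 0 on the diagonal, -inf elsewhere."""
    n = len(A)
    star = [[0.0 if i == j else NEG_INF for j in range(n)] for i in range(n)]
    power = [row[:] for row in star]
    for _ in range(n):
        power = mp_mul(power, A)
        star = [[max(star[i][j], power[i][j]) for j in range(n)]
                for i in range(n)]
    return star

def eigenvector_from_star(A, lam):
    """A column j of (A - lam)* is an eigenvector of A whenever node j lies
    on a critical cycle, i.e. (A - lam)+ has a 0 diagonal entry at j."""
    A_lam = [[x - lam for x in row] for row in A]
    star = kleene_star(A_lam)
    plus = mp_mul(A_lam, star)  # A+ = A ⊗ A*
    for j in range(len(A)):
        if plus[j][j] == 0.0:
            return [star[i][j] for i in range(len(A))]
    return None
```

For A = [[0, 3], [1, 2]] the max cycle mean is λ = 2, and the returned column v satisfies A ⊗ v = λ ⊗ v, i.e. max_k(A[i][k] + v[k]) = v[i] + 2 for every row i.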

  16. On the fit of models to covariances and methodology to the Bulletin.

    Science.gov (United States)

    Bentler, P M

    1992-11-01

    It is noted that 7 of the 10 top-cited articles in the Psychological Bulletin deal with methodological topics. One of these is the Bentler-Bonett (1980) article on the assessment of fit in covariance structure models. Some context is provided on the popularity of this article. In addition, a citation study of methodology articles appearing in the Bulletin since 1978 was carried out. It verified that publications in design, evaluation, measurement, and statistics continue to be important to psychological research. Some thoughts are offered on the role of the journal in making developments in these areas more accessible to psychologists.

  17. Methodology of a diabetes prevention translational research project utilizing a community-academic partnership for implementation in an underserved Latino community

    Directory of Open Access Journals (Sweden)

    Ma Yunsheng

    2009-03-01

    Abstract Background Latinos comprise the largest racial/ethnic group in the United States and have 2-3 times the prevalence of type 2 diabetes mellitus as Caucasians. Methods and design The Lawrence Latino Diabetes Prevention Project (LLDPP) is a community-based translational research study which aims to reduce the risk of diabetes among Latinos who have a ≥ 30% probability of developing diabetes in the next 7.5 years per a predictive equation. The project was conducted in Lawrence, Massachusetts, a predominantly Caribbean-origin urban Latino community. Individuals were identified primarily from a community health center's patient panel, screened for study eligibility, randomized to either a usual care or a lifestyle intervention condition, and followed for one year. Like the efficacious Diabetes Prevention Program (DPP), the LLDPP intervention targeted weight loss through dietary change and increased physical activity. However, unlike the DPP, the LLDPP intervention was less intensive, tailored to literacy needs and cultural preferences, and delivered in Spanish. The group format of the intervention (13 group sessions over 1 year) was complemented by 3 individual home visits and was implemented by individuals from the community with training and supervision by a clinical research nutritionist and a behavioral psychologist. Study measures included demographics, Stern predictive equation components (age, gender, ethnicity, fasting glucose, systolic blood pressure, HDL-cholesterol, body mass index, and family history of diabetes), glycosylated hemoglobin, dietary intake, physical activity, depressive symptoms, social support, quality of life, and medication use. Body weight was measured at baseline, 6 months, and one year; all other measures were assessed at baseline and one year. All surveys were orally administered in Spanish. Results A community-academic partnership enabled the successful recruitment, intervention, and assessment of Latinos at

  18. Rosetta: an operator basis translator for standard model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Falkowski, Adam [Laboratoire de Physique Théorique, Bat. 210, Université Paris-Sud, 91405, Orsay (France); Fuks, Benjamin [Département Recherches Subatomiques, Institut Pluridisciplinaire Hubert Curien, Université de Strasbourg/CNRS-IN2P3, 23 rue du Loess, 67037, Strasbourg (France); Mawatari, Kentarou [Theoretische Natuurkunde and IIHE/ELEM, Vrije Universiteit Brussel, and International Solvay Institutes, Pleinlaan 2, 1050, Brussels (Belgium); Mimasu, Ken, E-mail: k.mimasu@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, BN1 9QH, Brighton (United Kingdom); Riva, Francesco [CERN, Theory Division, 1211, Geneva (Switzerland); Sanz, Verónica [Department of Physics and Astronomy, University of Sussex, BN1 9QH, Brighton (United Kingdom)

    2015-12-10

    We introduce Rosetta, a program allowing for the translation between different bases of effective field theory operators. We present the main functions of the program and provide an example of usage. One of the Lagrangians which Rosetta can translate into has been implemented into FeynRules, which allows Rosetta to be interfaced into various high-energy physics programs such as Monte Carlo event generators. In addition to popular basis choices, such as the Warsaw and Strongly Interacting Light Higgs bases already implemented in the program, we also detail how to add new operator bases into the Rosetta package. In this way, phenomenological studies using an effective field theory framework can be straightforwardly performed.

  19. Rosetta: an operator basis translator for standard model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Falkowski, Adam [Universite Paris-Sud, Laboratoire de Physique Theorique, Bat. 210, Orsay (France); Fuks, Benjamin [Universite de Strasbourg/CNRS-IN2P3, Departement Recherches Subatomiques, Institut Pluridisciplinaire Hubert Curien, Strasbourg (France); Mawatari, Kentarou [Theoretische Natuurkunde and IIHE/ELEM, Vrije Universiteit Brussel, and International Solvay Institutes, Brussels (Belgium); Mimasu, Ken; Sanz, Veronica [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom); Riva, Francesco [CERN, Theory Division, Geneva (Switzerland)

    2015-12-15

    We introduce Rosetta, a program allowing for the translation between different bases of effective field theory operators. We present the main functions of the program and provide an example of usage. One of the Lagrangians which Rosetta can translate into has been implemented into FeynRules, which allows Rosetta to be interfaced into various high-energy physics programs such as Monte Carlo event generators. In addition to popular basis choices, such as the Warsaw and Strongly Interacting Light Higgs bases already implemented in the program, we also detail how to add new operator bases into the Rosetta package. In this way, phenomenological studies using an effective field theory framework can be straightforwardly performed. (orig.)

  20. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdfs), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions whose accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code physical models, i.e. the input information needed by the code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
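The clustering step compares samples of model error drawn from different experimental regimes with the Kruskal-Wallis test; when the test rejects, the regimes are kept as separate uncertainty pdfs. A stdlib sketch of the H statistic (no tie correction; the data are illustrative, nothing from the RETRAN-3D application):

```python
def kruskal_h(groups):
    """Kruskal-Wallis H statistic (no tie correction). A large H suggests
    the error samples come from differently-located distributions, i.e.
    the model's accuracy depends on the experimental regime."""
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    rank_sums = {}  # 1-based rank sum per group
    for rank, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] = rank_sums.get(gi, 0) + rank
    n = len(pooled)
    return 12.0 / (n * (n + 1)) * sum(
        rank_sums[gi] ** 2 / len(g) for gi, g in enumerate(groups)
    ) - 3.0 * (n + 1)

# Two well-separated error samples give a large H (reject "same regime");
# two interleaved samples give H near 0 (pool them into one pdf).
```

In practice H would be compared against a chi-squared quantile with k−1 degrees of freedom before deciding whether to split or pool the regions.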

  1. Simplified life cycle assessment models: methodological framework and applications to energy pathways

    International Nuclear Information System (INIS)

    Padey, Pierryves

    2013-01-01

    The energy transition debate is a key issue for today and the coming years. One of the challenges is to limit the environmental impacts of electricity production. Decision support tools that are sufficiently accurate, simple to use, account for environmental aspects, and inform future energy choices must be implemented. However, the environmental assessment of energy pathways is complex and requires characterization at two levels. The first level, the 'energy pathway', corresponds to the distribution of environmental impacts across a pathway, allowing overall pathways to be compared. The second level, the 'system', compares the environmental impacts of individual systems within each pathway. We have devised a generic methodology covering both characterization levels, estimating the environmental profile of an energy pathway while allowing a simple comparison of the environmental impacts of its systems. This methodology is based on the definition of a parameterized Life Cycle Assessment model and considers, through a global sensitivity analysis, the environmental impacts of a large sample of systems representative of an energy pathway. As a second step, the methodology defines simplified models based on the few key parameters identified as inducing the largest variability in the energy pathway's environmental impacts. These models assess the systems' environmental impacts in a simple way, avoiding complex LCAs. This reduction methodology has been applied to the onshore wind power pathway in Europe and the photovoltaic pathway in France. (author)

  2. Exploring theoretical functions of corpus data in teaching translation

    Directory of Open Access Journals (Sweden)

    Éric Poirier

    2016-04-01

    http://dx.doi.org/10.5007/2175-7968.2016v36nesp1p177 As language referential data banks, corpora are instrumental in the exploration of translation solutions in bilingual parallel texts or conventional usages of source or target language in monolingual general or specialized texts. These roles are firmly rooted in translation processes, from analysis and interpretation of the source text to searching for an acceptable equivalent and integrating it into the production of the target text. Provided the creative rather than the conservative path is taken, validation or adaptation of the target text in accordance with conventional usages in the target language also benefits from corpora. Translation teaching, however, does not yet exploit this way of translating, which is common practice in professional translation markets around the world. Instead of showing what corpus tools can do for translation teaching, we start our analysis with a common issue within translation teaching and show how corpus data can help to resolve it in learning activities in translation courses. We suggest a corpus-driven model for the interpretation of ‘business’ as a term and as an item in complex terms based on source text pattern analysis. This methodology will make it possible for teachers to explain and justify interpretation rules that have been defined theoretically from corpus data. It will also help teachers to conceive and non-subjectively assess practical activities designed for learners of translation. Corpus data selected for the examples of rule-based interpretations provided in this paper have been compiled in a corpus-driven study (Poirier, 2015) on the translation of the noun ‘business’ in the field of specialized translation in business, economics, and finance from English to French. The corpus methodology and rule-based interpretation of senses can be generalized and applied in the definition of interpretation rules for other language pairs and other specialized simple and

  4. A Dynamic Defense Modeling and Simulation Methodology using Semantic Web Services

    Directory of Open Access Journals (Sweden)

    Kangsun Lee

    2010-04-01

    Defense Modeling and Simulations require interoperable and autonomous federates in order to fully simulate the complex behavior of war-fighters and to dynamically adapt to various war-game events, commands and controls. In this paper, we propose a semantic-web-service-based methodology for developing war-game simulations. Our methodology encapsulates war-game logic into a set of web services with additional semantic information in WSDL (Web Service Description Language) and OWL (Web Ontology Language). By utilizing the dynamic discovery and binding power of semantic web services, we are able to dynamically reconfigure federates according to various simulation events. An ASuW (Anti-Surface Warfare) simulator is constructed to demonstrate the methodology and shows that the level of interoperability and autonomy can be greatly improved.
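The dynamic discovery-and-binding idea can be illustrated without any web-service stack; here a plain dictionary stands in for the semantic (WSDL/OWL-annotated) service registry, and the capability name is invented for illustration:

```python
# Minimal sketch of dynamic discovery and binding: federates publish
# services under semantic capability tags, and callers bind at call time.
registry = {}

def publish(capability, service):
    """A federate advertises a service under a semantic capability tag."""
    registry.setdefault(capability, []).append(service)

def invoke(capability, *args):
    """Discover at call time whichever federate currently provides the
    capability, then bind to it and invoke it."""
    providers = registry.get(capability, [])
    if not providers:
        raise LookupError("no service for " + capability)
    return providers[-1](*args)  # bind to the most recently published one

# A surface-search radar federate publishes a detection service ...
publish("detect_surface_target", lambda range_km: range_km < 20.0)
# ... and can later be replaced without touching any caller:
publish("detect_surface_target", lambda range_km: range_km < 35.0)
```

Because the caller names only the capability, federates can be swapped or reconfigured mid-simulation, which is the property the paper obtains from semantic web services.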

  5. Graduate Education for the Future: New Models and Methods for the Clinical and Translational Workforce

    Science.gov (United States)

    Bennett, L. Michelle; Cicutto, Lisa; Gadlin, Howard; Moss, Marc; Tentler, John; Schoenbaum, Ellie

    2015-01-01

    Abstract This paper is the third in a five-part series on the clinical and translational science educational pipeline, and it focuses on strategies for enhancing graduate research education to improve skills for interdisciplinary team science. Although some of the most cutting-edge science takes place at the borders between disciplines, it is widely perceived that advancements in clinical and translational science are hindered by the “siloed” efforts of researchers who are comfortable working in their separate domains and reluctant to stray from their own discipline when conducting research. Without appropriate preparation for career success as members and leaders of interdisciplinary teams, talented scientists may choose to remain siloed or to leave careers in clinical and translational science altogether, weakening the pipeline and depleting the future biomedical research workforce. To address this threat, it is critical to begin at what is perhaps the most formative moment for academics: graduate training. This paper focuses on designs for graduate education, and contrasts the methods and outcomes from traditional educational approaches with those skills perceived as essential for the workforce of the future, including the capacity for research collaboration that crosses disciplinary boundaries. PMID:26643714

  6. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    Science.gov (United States)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. A number of data-worth and experimental design strategies have been developed for this purpose, but these studies often ignore issues related to real-world groundwater models such as computational expense, existing observation data, and high parameter dimensionality. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established D-optimality criterion together with the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification, and a heuristic methodology based on the greedy algorithm is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
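The greedy, minimax-flavoured selection step can be sketched as follows; the worth values (uncertainty reduction per candidate observation under each posterior parameter sample) are invented for illustration, not taken from the study:

```python
# Hypothetical data-worth table: worth[s][c] is the reduction in predictive
# uncertainty achieved by candidate observation c under posterior parameter
# sample s (all values invented).
worth = [
    [0.10, 0.40, 0.25, 0.05],   # parameter sample 0
    [0.30, 0.10, 0.20, 0.35],   # parameter sample 1
    [0.20, 0.15, 0.30, 0.10],   # parameter sample 2
]

def greedy_minimax_design(worth, k):
    """Greedily build a k-observation design that maximizes the worst-case
    (minimum over parameter samples) cumulative uncertainty reduction."""
    chosen, totals = [], [0.0] * len(worth)
    remaining = set(range(len(worth[0])))
    for _ in range(k):
        best_c, best_score = None, None
        for c in remaining:
            # Worst-case reduction if candidate c is added to the design.
            score = min(t + row[c] for t, row in zip(totals, worth))
            if best_score is None or score > best_score:
                best_c, best_score = c, score
        chosen.append(best_c)
        remaining.discard(best_c)
        totals = [t + row[best_c] for t, row in zip(totals, worth)]
    return chosen, min(totals)
```

With the table above, a two-observation design picks candidates 2 and then 0: candidate 2 has the best worst-case worth on its own, and candidate 0 best raises the minimum once 2 is already in the design.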

  7. Translation of the CN Cofrentes plant model from the TRAC-BF1 code to SNAP-TRACE

    International Nuclear Information System (INIS)

    Escriva, A.; Muñoz-Cobo, J. L.; Concejal, A.; Melara, J.; Albendea, M.

    2012-01-01

    The aim is to develop a three-dimensional model of the Cofrentes NPP whose results are consistent with those of the programs currently in use (TRAC-BF1, RETRAN), which are validated against plant data. This comparison must be made globally, so that no compensation of errors can occur. To verify the correctness of the translation, the results obtained with TRACE have been compared with those of the programs currently in use, and the relevant adjustments have been made, taking into account that both the correlations and the models differ between the codes. During this work, several errors were detected that must be corrected in future versions of these tools.

  8. Methodology Development for SiC Sensor Signal Modelling in the Nuclear Reactor Radiation Environments

    International Nuclear Information System (INIS)

    Cetnar, J.; Krolikowski, I.P.

    2013-06-01

    This paper deals with the SiC detector simulation methodology for signal formation by neutrons and induced secondary radiation, as well as its inverse interpretation. The primary goal is to achieve, with SiC, the capability of simultaneous spectroscopic measurement of neutrons and gamma-rays, for which an appropriate methodology for detector signal modelling and its interpretation must be adopted. The process of detector simulation is divided into two basically separate but interconnected sections. The first is the forward simulation of detector signal formation in the field of the primary neutron and secondary radiations, whereas the second is the inverse problem of finding a representation of the primary radiation based on the measured detector signals. The methodology under development is based on the Monte Carlo description of radiation transport and analysis of the reactor physics. The methodology for SiC detector signal interpretation will build on existing experience in neutron metrology developed in the past for various neutron and gamma-ray detection systems. Since the novel SiC-based sensors are characterised by a new structure, yet to be finally designed, the methodology for spectroscopic particle-fluence measurement must be developed while giving productive feedback to the design process of the SiC sensor, in order to arrive at the best possible design. (authors)
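A minimal stand-in for the inverse step, recovering neutron and gamma fluences from channel signals via a response matrix, is sketched below. The matrix values are invented, and the MLEM-style multiplicative scheme is offered as one common unfolding choice, not the interpretation methodology the paper is developing:

```python
# Toy unfolding sketch: 3 detector channels, 2 incident components
# (neutron, gamma). R[i][j] = response of channel i to unit fluence of
# component j (illustrative numbers, not real SiC responses).
R = [[0.8, 0.1],
     [0.3, 0.6],
     [0.1, 0.9]]
true_fluence = [100.0, 50.0]
measured = [sum(R[i][j] * true_fluence[j] for j in range(2)) for i in range(3)]

def mlem_unfold(R, measured, iters=500):
    """Multiplicative (MLEM-style) iterative unfolding; keeps the fluence
    estimates nonnegative, a common choice in spectrum unfolding."""
    ni, nj = len(R), len(R[0])
    phi = [1.0] * nj
    col_sum = [sum(R[i][j] for i in range(ni)) for j in range(nj)]
    for _ in range(iters):
        pred = [sum(R[i][j] * phi[j] for j in range(nj)) for i in range(ni)]
        phi = [phi[j] * sum(R[i][j] * measured[i] / pred[i]
                            for i in range(ni)) / col_sum[j]
               for j in range(nj)]
    return phi
```

For consistent, noiseless data the iteration converges to the generating fluences; with real counting statistics the number of iterations acts as a regularization knob.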

  9. Compositional translation

    NARCIS (Netherlands)

    Appelo, Lisette; Janssen, Theo; Jong, de F.M.G.; Landsbergen, S.P.J.

    1994-01-01

    This book provides an in-depth review of machine translation by discussing in detail a particular method, called compositional translation, and a particular system, Rosetta, which is based on this method. The Rosetta project is a unique combination of fundamental research and large-scale

  10. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in manufacturing. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are generated daily in plants, which costs the industry tremendously in waste treatment and disposal and hinders its further development. It has therefore become urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies for minimizing waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. These form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner.
    The unique contribution of this research is that it is the first time for the electroplating industry to (i) systematically use available WM strategies, (ii) know quantitatively and
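The knowledge-based half, WM strategies encoded as fuzzy rules, can be sketched in a few lines; the rule names, indicators and membership ranges below are invented for illustration and are not taken from WMEP-Advisor:

```python
# Toy fuzzy rule base: each strategy fires with a strength derived from
# triangular membership functions over normalized plant indicators.
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def advise(dragout, rinse_flow):
    """Return the WM strategy whose fuzzy rule fires most strongly."""
    rules = [
        # "IF drag-out is high THEN install drain boards"
        ("install drain boards", tri(dragout, 0.4, 0.7, 1.01)),
        # "IF rinse flow is high AND drag-out is low THEN counterflow rinsing"
        ("counterflow rinsing", min(tri(rinse_flow, 0.5, 0.8, 1.01),
                                    tri(dragout, 0.0, 0.3, 0.6))),
    ]
    return max(rules, key=lambda r: r[1])
```

A real rule base would contain many such rules and a defuzzification stage, but the pattern of graded antecedents combined with `min`/`max` is the same.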

  11. A coupled groundwater-flow-modelling and vulnerability-mapping methodology for karstic terrain management

    Science.gov (United States)

    Kavouri, Konstantina P.; Karatzas, George P.; Plagnes, Valérie

    2017-08-01

    A coupled groundwater-flow-modelling and vulnerability-mapping methodology for the management of karst aquifers with spatial variability is developed. The methodology takes into consideration the duality of flow and recharge in karst and introduces a simple method to integrate the effect of temporal storage in the unsaturated zone. In order to investigate the applicability of the developed methodology, simulation results are validated against available field measurement data. The criteria maps from the PaPRIKa vulnerability-mapping method are used to inform the groundwater flow model. The FEFLOW model is employed for the simulation of the saturated zone of the Palaikastro-Chochlakies karst aquifer, on the island of Crete, Greece, for the hydrological years 2010-2012. The simulated water table reproduces typical karst characteristics, such as steep slopes and preferred drain axes, and is in good agreement with field observations. Selected error indicators, namely the Nash-Sutcliffe efficiency (NSE), root mean squared error (RMSE) and model efficiency (E'), are within acceptable value ranges. Results indicate that different storage processes take place in different parts of the aquifer. The north-central part seems to be more sensitive to diffuse recharge, while the southern part is affected primarily by precipitation events. Sensitivity analysis is performed on the parameters of hydraulic conductivity and specific yield. The methodology is used to estimate the feasibility of artificial aquifer recharge (AAR) in the study area. Based on the developed methodology, guidelines are provided for the selection of the AAR scenario with a positive impact on the water table.
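The two goodness-of-fit indicators quoted, NSE and RMSE, are easy to compute directly; the water-table values below are illustrative, not the study's data:

```python
import math

def rmse(obs, sim):
    """Root mean squared error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; values <= 0 mean the
    model is no better than predicting the observed mean."""
    mean = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean) ** 2 for o in obs)
    return 1.0 - num / den

heads_obs = [12.1, 11.8, 11.5, 11.9, 12.4]   # illustrative water-table levels
heads_sim = [12.0, 11.9, 11.6, 11.7, 12.3]
```

For this toy series the RMSE is about 0.13 and the NSE about 0.82, the kind of "acceptable range" values the abstract refers to.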

  12. Can Neuroscience Contribute to Practical Ethics? A Critical Review and Discussion of the Methodological and Translational Challenges of the Neuroscience of Ethics.

    Science.gov (United States)

    Racine, Eric; Dubljević, Veljko; Jox, Ralf J; Baertschi, Bernard; Christensen, Julia F; Farisco, Michele; Jotterand, Fabrice; Kahane, Guy; Müller, Sabine

    2017-06-01

    Neuroethics is an interdisciplinary field that arose in response to novel ethical challenges posed by advances in neuroscience. Historically, neuroethics has provided an opportunity to synergize different disciplines, notably proposing a two-way dialogue between an 'ethics of neuroscience' and a 'neuroscience of ethics'. However, questions surface as to whether a 'neuroscience of ethics' is a useful and unified branch of research and whether it can actually inform or lead to theoretical insights and transferable practical knowledge to help resolve ethical questions. In this article, we examine why the neuroscience of ethics is a promising area of research and summarize what we have learned so far regarding its most promising goals and contributions. We then review some of the key methodological challenges which may have hindered the use of results generated thus far by the neuroscience of ethics. Strategies are suggested to address these challenges and improve the quality of research and increase neuroscience's usefulness for applied ethics and society at large. Finally, we reflect on potential outcomes of a neuroscience of ethics and discuss the different strategies that could be used to support knowledge transfer to help different stakeholders integrate knowledge from the neuroscience of ethics. © 2017 John Wiley & Sons Ltd.

  13. A hierarchical modeling methodology for the definition and selection of requirements

    Science.gov (United States)

    Dufresne, Stephane

    This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching stakeholder expectations with the concept of operations, systems, and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define it. These requirements are defined early in conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the system alternatives. A taxonomy of requirements is created to classify the information gathered during problem definition. Subsequently, the operational and system functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the
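The Monte Carlo treatment of the hierarchical model can be illustrated on a minimal example: two alternatives scored against three criteria, with the criterion weights sampled randomly to see how often each alternative ranks first. The scores and the uniform weight model are invented stand-ins, not the ANP formulation used in the dissertation:

```python
import random

random.seed(2)

# Two system alternatives scored on three criteria (invented values).
alternatives = {"alt_A": [0.9, 0.4, 0.6],
                "alt_B": [0.6, 0.8, 0.5]}

def mc_rank(alternatives, trials=2000):
    """Count how often each alternative ranks first when the criterion
    weights are uncertain (sampled uniformly, then normalized)."""
    wins = dict.fromkeys(alternatives, 0)
    for _ in range(trials):
        w = [random.random() for _ in range(3)]
        s = sum(w)
        w = [x / s for x in w]
        best = max(alternatives,
                   key=lambda a: sum(wi * si
                                     for wi, si in zip(w, alternatives[a])))
        wins[best] += 1
    return wins

wins = mc_rank(alternatives)
```

An alternative that wins under almost any weighting is a robust down-selection; a split verdict flags a decision that is sensitive to the weight uncertainty.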

  14. Translation of a High-Level Temporal Model into Lower Level Models: Impact of Modelling at Different Description Levels

    DEFF Research Database (Denmark)

    Kraft, Peter; Sørensen, Jens Otto

    2001-01-01

    The paper attempts theoretically to clarify the interrelation between various levels of descriptions used in the modelling and the programming of information systems. We suggest an analysis where we characterise the description levels with respect to how precisely they may handle information abou...... and other textual models. We also consider the aptness of models that include procedural mechanisms such as active and object databases...

  15. A User’s Manual for the Revised Defense Translator Model

    Science.gov (United States)

    1990-06-01

    Subsequently, the translator has undergone a series of revisions in order to reflect changes in computational methods and in the amount and types of goods. A surviving fragment of the model's commodity-sector coefficient table reads: ELECTRONIC COMPONENTS, N.E.C. 0.13643; 341 STORAGE BATTERIES 0.03045; 342 PRIMARY BATTERIES, DRY & WET 0.09329; 343 X-RAY APPARATUS & TUBES 0.00415; 345 ELECTRICAL

  16. A methodology for including wall roughness effects in k-ε low-Reynolds turbulence models

    International Nuclear Information System (INIS)

    Ambrosini, W.; Pucciarelli, A.; Borroni, I.

    2015-01-01

    Highlights: • A model for taking into account wall roughness in low-Reynolds k-ε models is presented. • The model is subjected to a first validation to show its potential in general applications. • The application of the model in predicting heat transfer to supercritical fluids is also discussed. - Abstract: A model accounting for wall roughness effects in k-ε low-Reynolds turbulence models is described in the present paper. In particular, the introduction in the transport equations of k and ε of additional source terms related to roughness, based on simple assumptions and dimensional relationships, is proposed. An objective of the present paper, in addition to obtaining more realistic predictions of wall friction, is the application of the proposed model to the study of heat transfer to supercritical fluids. A first validation of the model is reported. The model shows the capability of predicting, at least qualitatively, some of the most important trends observed when dealing with rough pipes in very different flow conditions. Qualitative comparisons with some DNS data available in literature are also performed. Further analyses provided promising results concerning the ability of the model in reproducing the trend of friction factor when varying the flow conditions, though improvements are necessary for achieving better quantitative accuracy. First applications of the model in simulating heat transfer to supercritical fluids are also described, showing the capability of the model to affect the predictions of these heat transfer phenomena, in particular in the vicinity of the pseudo-critical conditions. A more extended application of the model to relevant deteriorated heat transfer conditions will clarify the usefulness of this modelling methodology in improving predictions of these difficult phenomena. Whatever the possible success in this particular application that motivated its development, this approach suggests a general methodology for accounting
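Schematically, the proposed closure amounts to adding an extra source term to the k (and ε) transport equations. The toy explicit update below shows where such a term enters; the source values are purely illustrative and are not the paper's actual roughness model:

```python
# Schematic per-cell update of the k transport equation showing where a
# wall-roughness source term would enter; all numbers are illustrative.
def advance_k(k, production, dissipation, roughness_source, dt):
    """One explicit step of dk/dt = P_k - eps + S_rough, per cell."""
    return [ki + dt * (p - e + s)
            for ki, p, e, s in zip(k, production, dissipation,
                                   roughness_source)]

# Three near-wall cells; the roughness source is strongest at the wall.
k = [0.10, 0.20, 0.30]
k_new = advance_k(k, [0.5, 0.4, 0.3], [0.3, 0.3, 0.3], [0.2, 0.1, 0.0], 0.01)
```

A dimensional closure would tie `roughness_source` to the roughness height and local friction quantities; here it is just an extra term in the balance, which is the structural point the paper makes.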

  17. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
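A minimal version of the comparison workflow, cross-validate competing regression models and then compare their per-fold scores, can be sketched without R or RRegrs; the data are synthetic, and simple OLS versus a mean-only baseline stands in for the package's ten models:

```python
import random

random.seed(1)

# Synthetic dataset: y = 3x + Gaussian noise (illustrative).
data = [(x, 3.0 * x + random.gauss(0, 0.5)) for x in
        [i / 10 for i in range(50)]]

def fit_linear(train):
    """Ordinary least squares for y = a*x + b; returns a predictor."""
    n = len(train)
    sx = sum(x for x, _ in train); sy = sum(y for _, y in train)
    sxx = sum(x * x for x, _ in train); sxy = sum(x * y for x, y in train)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

def fit_mean(train):
    """Baseline: always predict the training mean."""
    m = sum(y for _, y in train) / len(train)
    return lambda x: m

def cv_rmse(fit, data, k=5):
    """Per-fold RMSE under k-fold cross-validation."""
    scores = []
    for f in range(k):
        test = data[f::k]
        train = [d for i, d in enumerate(data) if i % k != f]
        model = fit(train)
        mse = sum((y - model(x)) ** 2 for x, y in test) / len(test)
        scores.append(mse ** 0.5)
    return scores

lin = cv_rmse(fit_linear, data)
base = cv_rmse(fit_mean, data)
```

With per-fold scores in hand, a paired statistical test (the paper's central point) decides whether the difference between models is significant rather than an artifact of one lucky split.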

  19. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem-solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics which extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  20. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning, and the emergency evacuation of large commercial shopping areas, a typical service system, is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model is proposed, and it is examined in a case study involving evacuation from a commercial shopping mall. Pedestrian walking is modeled with Cellular Automata, while the event-driven model is adopted to simulate pedestrian movement patterns; the simulation process is divided into a normal situation and emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer, and a trajectory layer. For simulating pedestrians' movement routes, the model takes into account customers' purchase intentions and pedestrian density. The evacuation model combining Cellular Automata with a Dynamic Floor Field and event-driven scheduling reflects the behavioral characteristics of customers and clerks in both normal and emergency situations. The distribution of individual evacuation times as a function of initial position, and the dynamics of the evacuation process, are studied. Our results indicate that this combination can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
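The static-floor-field ingredient of such CA models can be sketched compactly: a breadth-first search assigns every walkable cell its distance to the exit, and a pedestrian hops to the neighbouring cell with the lowest field value. The grid below is a toy example, not the mall geometry from the study:

```python
from collections import deque

# '#' = wall, 'E' = exit, ' ' = walkable cell.
grid = ["#####",
        "#  E#",
        "# ###",
        "#   #",
        "#####"]

def static_field(grid):
    """BFS from the exit: each walkable cell gets its walking distance."""
    dist, q = {}, deque()
    for r, row in enumerate(grid):
        for c, cell in enumerate(row):
            if cell == "E":
                dist[(r, c)] = 0
                q.append((r, c))
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if grid[nr][nc] != "#" and (nr, nc) not in dist:
                dist[(nr, nc)] = dist[(r, c)] + 1
                q.append((nr, nc))
    return dist

def step(pos, dist):
    """Move one pedestrian to the neighbouring cell with the lowest field
    value (scan-order tie-breaking); stay put if nothing improves."""
    r, c = pos
    best = pos
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        n = (r + dr, c + dc)
        if n in dist and dist[n] < dist[best]:
            best = n
    return best
```

A dynamic floor field would add a second, decaying grid of virtual traces left by moving pedestrians; the event-driven layer of the paper then schedules which agents move and when.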

  1. On the Universality of Theory Models of Translation Ethics: From the Perspective of the Exact Referent and Status of Each Translation Agent

    Institute of Scientific and Technical Information of China (English)

    汤金霞; 梅阳春

    2014-01-01

    The theory of models of translation ethics, comprising the translational ethics of representation, of service, of communication, norm-based translational ethics, and the translational ethics of commitment, has contributed greatly to the development of translation ethics in China and the West, to the point that proposals have appeared in the Chinese translation community to construct a universal translation ethics on its basis. This paper therefore examines the feasibility of that proposal. The exact referents of the translation agents (chiefly the source, the target culture, the original writer, the client, the translator and the target readership) in the ethics of representation, service, communication and norm-based ethics, and the status that each of these four models confers on each agent, are compared and contrasted. It is found that these four models do not specify the exact referents of the translation agents (the translator excepted), and that the status they confer on the relevant agents (the target readership excepted) is mutually inconsistent; the ethics of commitment, intended to integrate the four preceding models, does not resolve these two problems either. Therefore, the idea of constructing a universal translation ethics on the basis of the theory of models of translation ethics is not feasible.

  2. A methodology for modeling photocatalytic reactors for indoor pollution control using previously estimated kinetic parameters

    Energy Technology Data Exchange (ETDEWEB)

    Passalia, Claudio; Alfano, Orlando M. [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Güemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina); Brandi, Rodolfo J., E-mail: rbrandi@santafe-conicet.gov.ar [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Güemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina)

    2012-04-15

    Highlights: • Indoor pollution control via photocatalytic reactors. • Scaling-up methodology based on previously determined mechanistic kinetics. • Radiation interchange model between catalytic walls using configuration factors. • Modeling and experimental validation of a complex geometry photocatalytic reactor. - Abstract: A methodology is presented for modeling photocatalytic reactors for application in indoor air pollution control. The methodology involves, first, the determination of intrinsic reaction kinetics for the removal of formaldehyde. This is achieved by means of a simple-geometry continuous reactor operating under a kinetic control regime at steady state. The kinetic parameters were estimated from experimental data by means of a nonlinear optimization algorithm. The second step was the application of the obtained kinetic parameters to a very different photoreactor configuration: a corrugated-wall reactor using nanosized TiO2 as catalyst, irradiated by UV lamps that provided a spatially uniform radiation field. The radiative transfer within the reactor was modeled through a superficial emission model for the lamps, the ray-tracing method, and the computation of view factors. The velocity and concentration fields were evaluated by means of a commercial CFD tool (Fluent 12) into which the radiation model was introduced externally. The model results were compared with experiments in a corrugated-wall, bench-scale reactor constructed in the laboratory. The overall pollutant conversion showed good agreement between model predictions and experiments, with a root mean square error of less than 4%.
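
The first step, estimating kinetic parameters from experimental data with a nonlinear optimization algorithm, can be illustrated as follows. The rate expression (a generic Langmuir–Hinshelwood form), the data, and the parameter values below are invented for the sketch; the paper's actual formaldehyde kinetics differ.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical Langmuir-Hinshelwood-type rate law: r = k*K*C / (1 + K*C).
# k and K are the intrinsic kinetic parameters to be estimated.
def rate(C, k, K):
    return k * K * C / (1.0 + K * C)

# Synthetic "kinetic-control regime" measurements with 2% noise
C_obs = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])   # concentrations
rng = np.random.default_rng(0)
r_obs = rate(C_obs, 0.8, 0.15) * (1.0 + 0.02 * rng.standard_normal(C_obs.size))

# Nonlinear least-squares estimation of (k, K) from the data
(k_est, K_est), _ = curve_fit(rate, C_obs, r_obs, p0=[1.0, 0.1])
print(k_est, K_est)  # should land near the true values 0.8 and 0.15
```

The same pattern (measured removal rates fitted to a mechanistic rate law) yields the intrinsic parameters that are then carried over to the corrugated-wall reactor model.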

  3. 2014 International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Ören, Tuncer; Kacprzyk, Janusz; Filipe, Joaquim

    2015-01-01

    The present book includes a set of selected extended papers from the 4th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2014), held in Vienna, Austria, from 28 to 30 August 2014. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2014 received 167 submissions, from 45 countries on all continents. After a double-blind paper review performed by the Program Committee, 23% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2014. Commitment to high quality standards is a major concern of SIMULTECH…

  4. 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Kacprzyk, Janusz; Ören, Tuncer; Filipe, Joaquim

    2016-01-01

    The present book includes a set of selected extended papers from the 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2015), held in Colmar, France, from 21 to 23 July 2015. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2015 received 102 submissions, from 36 countries on all continents. After a double-blind paper review performed by the Program Committee, 19% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2015. Commitment to high quality standards is a major concern of SIMULTECH…

  5. A road map to Translational Medicine in Qatar and a model for the world

    Science.gov (United States)

    2012-01-01

    Translational Medicine (TM) in Qatar is part of a concerted effort of the Qatari medical and scientific leadership, supported by a strong political will of the Qatari authorities, to deliver world-class health care to Qatari residents while participating in the worldwide quest to bridge the gap from bench to bedside to community. TM programs should embrace the Qatar National vision for research: to become an international hub of excellence in research and development, based on intellectual merit, contributing to global knowledge and adhering to international standards; to innovate by translating new and original ideas into useful applications; to be inclusive at the national and international level; to build and maintain a competitive and diversified economy; and ultimately to improve the health and well-being of Qatar's population. Although this writing focuses on Qatar, we hope that the thoughts expressed here may be of broader use for the development of any TM program, particularly in regions where an established academic community surrounded by a rich research infrastructure and/or a vibrant biotechnology enterprise is not already present. PMID:22929646

  6. Binary translation using peephole translation rules

    Science.gov (United States)

    Bansal, Sorav; Aiken, Alex

    2010-05-04

    An efficient binary translator uses peephole translation rules to directly translate executable code from one instruction set to another. In a preferred embodiment, the translation rules are generated using superoptimization techniques that enable the translator to automatically learn rules for translating code from the source to the target instruction set architecture.
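
The rule-table idea can be sketched in a few lines. The "instruction sets" and rules below are invented for illustration; a real translator learns such rules by superoptimization rather than writing them by hand.

```python
# Illustrative peephole translation: a rule table maps short sequences of
# hypothetical "source ISA" instructions to equivalent "target ISA" sequences.
RULES = {
    ("push eax", "pop ebx"): ("mov x1, x0",),   # push/pop pair -> register move
    ("mov eax, 0",): ("xor x0, x0",),
    ("add eax, 1",): ("inc x0",),
}
MAX_WINDOW = 2                                  # longest source pattern in the table

def translate(code):
    """Greedy longest-match application of the peephole rule table."""
    out, i = [], 0
    while i < len(code):
        for w in range(min(MAX_WINDOW, len(code) - i), 0, -1):
            window = tuple(code[i:i + w])
            if window in RULES:
                out.extend(RULES[window])
                i += w
                break
        else:
            raise ValueError(f"no rule for {code[i]!r}")
    return out

print(translate(["mov eax, 0", "push eax", "pop ebx", "add eax, 1"]))
# -> ['xor x0, x0', 'mov x1, x0', 'inc x0']
```

Matching multi-instruction windows before single instructions is what lets peephole rules exploit idioms (like the push/pop pair) instead of translating one instruction at a time.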

  7. Testing Pixel Translation Digital Elevation Models to Reconstruct Slip Histories: An Example from the Agua Blanca Fault, Baja California, Mexico

    Science.gov (United States)

    Wilson, J.; Wetmore, P. H.; Malservisi, R.; Ferwerda, B. P.; Teran, O.

    2012-12-01

    We use recently collected slip vector and total offset data from the Agua Blanca fault (ABF) to constrain a pixel translation digital elevation model (DEM) that reconstructs the slip history of this fault. The model was constructed using a Perl script that reads a DEM file (Easting, Northing, Elevation) and a configuration file with coordinates defining the boundary of each fault segment. A pixel translation vector is defined as a magnitude of lateral offset in an azimuthal direction. The program translates pixels north of the fault and writes their pre-faulting positions to a new DEM file that can be gridded and displayed. This analysis, in which multiple DEMs are created with different translation vectors, allows us to identify areas of transtension or transpression while viewing the topographic expression of those areas. The benefit of this technique, in contrast to a simple block model, is that the DEM provides a valuable graphic that can be used to pose new research questions. We have found that many topographic features, such as valleys and ridges, correlate across the fault, which likely has implications for the age of the ABF and long-term landscape evolution rates, and potentially provides confirmation of total slip assessments. The ABF of northern Baja California, Mexico is an active, dextral strike-slip fault that transfers Pacific-North American plate boundary strain out of the Gulf of California and around the "Big Bend" of the San Andreas Fault. Total displacement on the ABF in the central and eastern parts of the fault is 10 +/- 2 km based on offset Early Cretaceous features such as terrane boundaries and intrusive bodies (plutons and dike swarms). Where the fault bifurcates to the west, displacement on the northern strand (northern Agua Blanca fault, or NABF) is constrained to 7 +/- 1 km. We have not yet identified piercing points on the southern strand, the Santo Tomas fault (STF), but displacement is inferred to be ~4 km assuming that the sum of slip on the NABF and STF is
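
The pixel-translation operation described above (read a DEM, translate points north of the fault by a slip vector, recover their pre-faulting positions) can be sketched with NumPy instead of Perl. The fault geometry and slip values here are invented; the real script handles segmented fault traces defined in a configuration file.

```python
import numpy as np

def restore_dem(points, fault_y, magnitude, azimuth_deg):
    """Undo fault slip: translate every DEM point north of an E-W fault
    trace back along a slip vector given as (magnitude, azimuth).
    points: array of (easting, northing, elevation) rows."""
    pts = points.copy()
    az = np.radians(azimuth_deg)           # azimuth measured clockwise from north
    dx = magnitude * np.sin(az)            # easting component of slip
    dy = magnitude * np.cos(az)            # northing component of slip
    north = pts[:, 1] > fault_y            # pixels north of the fault trace
    pts[north, 0] -= dx                    # subtract slip -> pre-faulting position
    pts[north, 1] -= dy
    return pts

dem = np.array([[100.0, 50.0, 12.0],      # south of fault: left in place
                [100.0, 150.0, 15.0]])    # north of fault: restored
restored = restore_dem(dem, fault_y=100.0, magnitude=10.0, azimuth_deg=90.0)
print(restored)  # northern point shifted 10 units west (dextral slip undone)
```

Rerunning with a range of translation vectors, then gridding each restored point set, reproduces the multi-DEM comparison the abstract describes.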

  8. A Model-Driven Methodology for Big Data Analytics-as-a-Service

    OpenAIRE

    Damiani, Ernesto; Ardagna, Claudio Agostino; Ceravolo, Paolo; Bellandi, Valerio; Bezzi, Michele; Hebert, Cedric

    2017-01-01

    The Big Data revolution has promised to build a data-driven ecosystem where better decisions are supported by enhanced analytics and data management. However, critical issues still need to be solved on the road to the commoditization of Big Data Analytics, such as the management of Big Data complexity and the protection of data security and privacy. In this paper, we focus on the first issue and propose a methodology based on Model Driven Engineering (MDE) that aims to substantially lower…

  9. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  10. A phased translation function

    International Nuclear Information System (INIS)

    Read, R.J.; Schierbeek, A.J.

    1988-01-01

    A phased translation function, which takes advantage of prior phase information to determine the position of an oriented molecular replacement model, is examined. The function is the coefficient of correlation between the electron density computed with the prior phases and the electron density of the translated model, evaluated in reciprocal space as a Fourier transform. The correlation coefficient used in this work is closely related to an overlap function devised by Colman, Fehlhammer and Bartels. Tests with two protein structures, one of which was solved with the help of the phased translation function, show that little phase information is required to resolve the translation problem, and that the function is relatively insensitive to misorientation of the model. (orig.)
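
The core computation, a translation-dependent correlation evaluated at every shift at once via Fourier transforms, can be illustrated in a one-dimensional analogue. The "densities" below are synthetic Gaussians, not crystallographic data, and the normalisation of the full correlation coefficient is omitted.

```python
import numpy as np

n = 64
x = np.arange(n)

def density(center, width=2.0):
    """Periodic 1-D Gaussian 'electron density' blob."""
    d = (x - center + n / 2) % n - n / 2
    return np.exp(-0.5 * (d / width) ** 2)

model = density(10)    # oriented model placed at a trial origin
target = density(25)   # density computed from the prior phases

# Overlap of the target with every translation of the model, evaluated
# at once in reciprocal space: IFFT( F_target * conj(F_model) ).
corr = np.fft.ifft(np.fft.fft(target) * np.conj(np.fft.fft(model))).real
best_shift = int(np.argmax(corr))
print(best_shift)  # 15: translating the model by 15 grid points maximises overlap
```

In the crystallographic setting the same product of structure-factor terms is accumulated over reflections, so one FFT scores all candidate translations simultaneously.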

  11. Methodological Aspects of Modeling Development and Viability of Systems and Counterparties in the Digital Economy

    Directory of Open Access Journals (Sweden)

    Vitlinskyy Valdemar V.

    2018-03-01

    The aim of the article is to study and generalize methodological approaches to modeling the development and viability of economic systems under risk, accounting for changes in their goals, status, and behavior in the digital economy. The categories of economic development and viability are defined, and directions for studying them by means of mathematical modeling are grounded. The system of characteristics and markers of the external economic environment under digitalization of economic activity is analyzed. The theoretical foundations and methodology of mathematical modeling of the development of economic systems, and of ensuring their viability and security during the introduction of information-society and digital-economy infrastructure, are considered on the principles of the information and knowledge approach. It is argued that in an information society, predictive model technologies are a growing safety resource. Prerequisites are studied for replacing the traditional integration concept of evaluation, analysis, modeling, management, and administration of economic development with a threat-oriented approach to defining security protectors, information, and knowledge. A concept is proposed for a database of models for examining trends and patterns of economic development which, unlike traditional trend models of dynamics, identifies and iteratively conceptualizes processes using knowledgeable predictors built with data mining and machine learning tools, including deep learning.

  12. Model methodology for estimating pesticide concentration extremes based on sparse monitoring data

    Science.gov (United States)

    Vecchia, Aldo V.

    2018-03-22

    This report describes a new methodology for using sparse (weekly or less frequent observations) and potentially highly censored pesticide monitoring data to simulate daily pesticide concentrations and associated quantities used for acute and chronic exposure assessments, such as the annual maximum daily concentration. The new methodology is based on a statistical model that expresses log-transformed daily pesticide concentration in terms of a seasonal wave, flow-related variability, long-term trend, and serially correlated errors. Methods are described for estimating the model parameters, generating conditional simulations of daily pesticide concentration given sparse (weekly or less frequent) and potentially highly censored observations, and estimating concentration extremes based on the conditional simulations. The model can be applied to datasets with as few as 3 years of record, as few as 30 total observations, and as few as 10 uncensored observations. The model was applied to atrazine, carbaryl, chlorpyrifos, and fipronil data for U.S. Geological Survey pesticide sampling sites with sufficient data for applying the model. A total of 112 sites were analyzed for atrazine, 38 for carbaryl, 34 for chlorpyrifos, and 33 for fipronil. The results are summarized in this report, and R functions, described in this report and provided in an accompanying model archive, can be used to fit the model parameters and generate conditional simulations of daily concentrations for use in investigations involving pesticide exposure risk and uncertainty.
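
The structure of the statistical model (log-concentration = seasonal wave + trend + serially correlated errors) can be sketched as a forward simulation. All coefficients below are invented for illustration; the report estimates them from monitoring data, and the flow-related term is omitted here.

```python
import numpy as np

rng = np.random.default_rng(42)
days = np.arange(3 * 365)                               # 3 years of daily values

seasonal = 0.8 * np.sin(2.0 * np.pi * days / 365.25)    # seasonal wave
trend = -0.0002 * days                                   # slow long-term decline
phi, sigma = 0.9, 0.3                                    # AR(1) serially correlated errors
eps = np.zeros(days.size)
for t in range(1, days.size):
    eps[t] = phi * eps[t - 1] + sigma * rng.standard_normal()

log_conc = -1.0 + seasonal + trend + eps                 # model on the log scale
conc = np.exp(log_conc)                                  # daily concentration

# Quantity used for acute exposure assessment: annual maximum daily concentration
annual_max = [float(conc[y * 365:(y + 1) * 365].max()) for y in range(3)]
print(annual_max)
```

In the actual methodology, many such daily traces are generated conditionally on the sparse, censored observations, and the distribution of the resulting annual maxima quantifies the uncertainty of the extremes.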

  13. MODEL - INTEGRAL METHODOLOGY FOR SUCCESSFUL DESIGNING AND IMPLEMENTING OF TQM SYSTEM IN MACEDONIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    Elizabeta Mitreva

    2011-12-01

    The subject of this paper is the valorization of the meaning and perspectives of Total Quality Management (TQM) system design and implementation within domestic companies, and the creation of a model-methodology for improved performance, efficiency and effectiveness. The research depicts the existing condition in Macedonian companies regarding quality system design and implementation, analyzed through the four pillars of the "house of quality", whose top is top management and whose base is the measurement, evaluation, analysis and comparison of quality. This "house" rests on four subsystems: internal standardization; methods and techniques for flawless work performance; education and motivation; and analysis of quality costs. The data received from the research, and the proposed integral methodology for designing and implementing a TQM system, are intended to offer useful directions to all Macedonian companies aspiring to become "world class" organizations. The basis of this model is the redesign of business processes, after which a new phase of business performance begins: continuous improvement, the rolling of Deming's quality circle (Plan-Do-Check-Act). The model-methodology proposed in this paper is integral and universal, which means it is applicable to all companies regardless of business area.

  14. Precision translator

    Science.gov (United States)

    Reedy, Robert P.; Crawford, Daniel W.

    1984-01-01

    A precision translator for focusing a beam of light on the end of a glass fiber, which includes two tuning-fork-like members rigidly connected to each other. These members each have two prongs whose separation is adjusted by a screw, thereby adjusting the orthogonal positioning of a glass fiber attached to one of the members. The translator is made of simple parts and holds its adjustment even under rough handling.

  15. Energy Demand Modeling Methodology of Key State Transitions of Turning Processes

    Directory of Open Access Journals (Sweden)

    Shun Jia

    2017-04-01

    Energy demand modeling of machining processes is the foundation of energy optimization. The energy demand of machining state transitions is integral to the energy requirements of the machining process; however, research on energy modeling of state transitions is scarce. To fill this gap, an energy demand modeling methodology for key state transitions of the turning process is proposed. An energy demand model of state transitions improves the accuracy of the energy model of the machining process and provides an accurate model and reliable data for energy optimization of the machining process. Finally, case studies were conducted on a CK6153i CNC lathe, with the results demonstrating that the predictive accuracy of the proposed method is generally above 90% for the state transition cases.
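
The underlying idea, that the energy demand of a state transition is the time integral of power over the transition, can be sketched as follows. The power samples below are invented for illustration, not measurements from the cited lathe.

```python
# Energy of a machining state transition (e.g. spindle ramp-up from standby
# to cutting speed) approximated as the time integral of sampled power.
def transition_energy(power_watts, dt_s):
    """Trapezoidal integration of sampled power over a state transition."""
    e = 0.0
    for p0, p1 in zip(power_watts, power_watts[1:]):
        e += 0.5 * (p0 + p1) * dt_s     # trapezoid between consecutive samples
    return e

# Hypothetical spindle ramp-up sampled at 0.1 s intervals (watts)
samples = [300, 800, 1500, 2100, 1700, 1300, 1250]
print(transition_energy(samples, 0.1), "J")   # 817.5 J for this profile
```

Summing such transition energies together with the steady-state demands of each machining state gives the total energy requirement of the process.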

  16. Development in methodologies for modelling of human and ecotoxic impacts in LCA

    DEFF Research Database (Denmark)

    Hauschild, Michael Zwicky; Huijbregts, Mark; Jolliet, Olivier

    2009-01-01

    Under the UNEP-SETAC Life Cycle Initiative there is an aim to develop an internationally backed recommended practice of life cycle impact assessment, addressing methodological issues such as the choice of characterization model and characterization factors. In this context, an international comparison was performed of characterization models for toxic impacts from chemicals in life cycle assessment. Six commonly used characterization models were compared in a sequence of workshops. Crucial fate, exposure and effect aspects were identified for which the models differed in their treatment. The USEtox™ model has been used to calculate characterization factors for several thousand substances and is currently under review with the intention that it shall form the basis of the recommendations from the UNEP-SETAC Life Cycle Initiative regarding characterization of toxic impacts in Life Cycle Assessment.

  17. Current Status of Animal Models of Posttraumatic Stress Disorder: Behavioral and Biological Phenotypes, and Future Challenges in Improving Translation.

    Science.gov (United States)

    Deslauriers, Jessica; Toth, Mate; Der-Avakian, Andre; Risbrough, Victoria B

    2018-05-15

    Increasing predictability of animal models of posttraumatic stress disorder (PTSD) has required active collaboration between clinical and preclinical scientists. Modeling PTSD is challenging, as it is a heterogeneous disorder with ≥20 symptoms. Clinical research increasingly utilizes objective biological measures (e.g., imaging, peripheral biomarkers) or nonverbal behaviors and/or physiological responses to complement verbally reported symptoms. This shift toward more-objectively measurable phenotypes enables refinement of current animal models of PTSD, and it supports the incorporation of homologous measures across species. We reviewed >600 articles to examine the ability of current rodent models to probe biological phenotypes of PTSD (e.g., sleep disturbances, hippocampal and fear-circuit dysfunction, inflammation, glucocorticoid receptor hypersensitivity) in addition to behavioral phenotypes. Most models reliably produced enduring generalized anxiety-like or depression-like behaviors, as well as hyperactive fear circuits, glucocorticoid receptor hypersensitivity, and response to long-term selective serotonin reuptake inhibitors. Although a few paradigms probed fear conditioning/extinction or utilized peripheral immune, sleep, and noninvasive imaging measures, we argue that these should be incorporated more to enhance translation. Data on female subjects, on subjects at different ages across the life span, or on temporal trajectories of phenotypes after stress that can inform model validity and treatment study design are needed. Overall, preclinical (and clinical) PTSD researchers are increasingly incorporating homologous biological measures to assess markers of risk, response, and treatment outcome. This shift is exciting, as we and many others hope it not only will support translation of drug efficacy from animal models to clinical trials but also will potentially improve predictability of stage II for stage III clinical trials. Published by Elsevier Inc.

  18. Finite Element Modelling and Analysis of Damage Detection Methodology in Piezo Electric Sensor and Actuator Integrated Sandwich Cantilever Beam

    Science.gov (United States)

    Pradeep, K. R.; Thomas, A. M.; Basker, V. T.

    2018-03-01

    Structural health monitoring (SHM) is an essential component of futuristic civil, mechanical and aerospace structures. It detects damage in a system, or gives warning of structural degradation, by evaluating performance parameters; this is achieved by the integration of sensors and actuators into the structure. This paper studies the damage detection process in a sandwich cantilever beam with integrated piezoelectric sensors and actuators. A possible skin-core debond at the root of the cantilever beam is simulated and compared with the undamaged case. The beam is actuated using piezoelectric actuators and performance differences are evaluated using polyvinylidene fluoride (PVDF) sensors. The methodology compares the voltage/strain response of the damaged versus the undamaged beam under transient actuation. A finite element model of the piezo-beam is simulated in ANSYS™ using an 8-noded coupled-field element whose nodal degrees of freedom are translations in the x and y directions and voltage. An aluminium sandwich beam with a length of 800 mm, a core thickness of 22.86 mm and a skin thickness of 0.3 mm is considered. The skin-core debond is simulated in the model as unmerged nodes. The reduction in the fundamental frequency of the damaged beam is found to be negligible, but the voltage response of the PVDF sensor under transient excitation shows a clearly visible change indicating the debond. Piezoelectric-based damage detection is an effective tool for aerospace and civil structural systems with inaccessible or critical locations, and it enables online monitoring, as the power requirement is minimal.
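
The damage-detection logic (compare the sensor response of the damaged beam against the undamaged baseline under the same transient actuation) can be reduced to a simple index. The signals below are synthetic stand-ins for the FE-computed PVDF voltages, with invented frequency, amplitude and damping values.

```python
import numpy as np

# Synthetic stand-ins for baseline and damaged sensor responses: the debond
# slightly shifts frequency and changes amplitude and damping.
t = np.linspace(0.0, 0.1, 1000)
f0 = 120.0                                   # hypothetical beam frequency, Hz
undamaged = np.sin(2 * np.pi * f0 * t) * np.exp(-20 * t)
damaged = 1.15 * np.sin(2 * np.pi * 0.99 * f0 * t) * np.exp(-25 * t)

def damage_index(ref, sig):
    """Normalised RMS deviation of the measured response from the baseline."""
    return np.sqrt(np.mean((sig - ref) ** 2)) / np.sqrt(np.mean(ref ** 2))

print(damage_index(undamaged, damaged))  # well above a healthy-state level
```

A monitoring system would trend such an index over time and flag the structure when it exceeds a threshold calibrated on the healthy state.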

  19. Water level management of lakes connected to regulated rivers: An integrated modeling and analytical methodology

    Science.gov (United States)

    Hu, Tengfei; Mao, Jingqiao; Pan, Shunqi; Dai, Lingquan; Zhang, Peipei; Xu, Diandian; Dai, Huichao

    2018-07-01

    Reservoir operations significantly alter the hydrological regime of the downstream river and river-connected lake, which has far-reaching impacts on the lake ecosystem. To facilitate the management of lakes connected to regulated rivers, the following information must be provided: (1) the response of lake water levels to reservoir operation schedules in the near future and (2) the importance of different rivers in terms of affecting the water levels in different lake regions of interest. We develop an integrated modeling and analytical methodology for the water level management of such lakes. The data-driven method is used to model the lake level as it has the potential of producing quick and accurate predictions. A new genetic algorithm-based synchronized search is proposed to optimize input variable time lags and data-driven model parameters simultaneously. The methodology also involves the orthogonal design and range analysis for extracting the influence of an individual river from that of all the rivers. The integrated methodology is applied to the second largest freshwater lake in China, the Dongting Lake. The results show that: (1) the antecedent lake levels are of crucial importance for the current lake level prediction; (2) the selected river discharge time lags reflect the spatial heterogeneity of the rivers' impacts on lake level changes; (3) the predicted lake levels are in very good agreement with the observed data (RMSE ≤ 0.091 m; R2 ≥ 0.9986). This study demonstrates the practical potential of the integrated methodology, which can provide both the lake level responses to future dam releases and the relative contributions of different rivers to lake level changes.
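
The genetic-algorithm-based synchronized search for input time lags can be illustrated on synthetic data with a known answer. Everything below (the data-generating model, lag ranges, GA settings) is invented for the sketch; the paper optimizes lags and data-driven model parameters jointly on real discharge and lake-level series.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic data with known answer: lake level responds to river discharge
# with a true lag of 3 days and to its own antecedent value with lag 1.
n = 300
discharge = rng.normal(size=n)
level = np.zeros(n)
for t in range(3, n):
    level[t] = 0.7 * level[t - 1] + 0.5 * discharge[t - 3] + 0.05 * rng.normal()

def fitness(lag_q, lag_h):
    """Negative RMSE of a linear model built with the candidate time lags."""
    lo = max(lag_q, lag_h)
    X = np.column_stack([discharge[lo - lag_q:n - lag_q],
                         level[lo - lag_h:n - lag_h],
                         np.ones(n - lo)])
    y = level[lo:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return -np.sqrt(np.mean((y - X @ beta) ** 2))

def ga(pop_size=20, gens=15, lag_range=(1, 7)):
    """Tiny GA over (discharge lag, antecedent-level lag) genomes."""
    pop = [tuple(rng.integers(*lag_range, size=2)) for _ in range(pop_size)]
    for _ in range(gens):
        parents = sorted(pop, key=lambda g: fitness(*g), reverse=True)[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a = parents[rng.integers(len(parents))]
            b = parents[rng.integers(len(parents))]
            child = [a[0], b[1]]                   # one-point crossover
            if rng.random() < 0.3:                 # random-reset mutation
                child[rng.integers(2)] = int(rng.integers(*lag_range))
            children.append(tuple(child))
        pop = parents + children                   # elitism: parents survive
    return max(pop, key=lambda g: fitness(*g))

best = ga()
print(best)  # expected to recover the true lags (3, 1)
```

Keeping the best half of each generation (elitism) guarantees that once the true lag pair is sampled it is never lost, which is why this small search converges reliably on such a small lag space.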

  20. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    Science.gov (United States)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, by applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients being proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary, the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid understanding of long-term air quality measurements, and help assess the representativeness of monitoring locations for population…
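
Step (iii) of the methodology, combining the CFD-calculated fields weighted by scenario frequency, is a simple weighted average. The concentration fields and occurrence hours below are invented for illustration.

```python
import numpy as np

# Rows: 3 hypothetical wind scenarios; columns: 4 receptor locations in the
# street canyon. Values are normalised CFD-calculated concentrations.
fields = np.array([[8.0, 6.0, 3.0, 2.0],     # scenario A: along-canyon wind
                   [2.0, 3.5, 5.0, 9.0],     # scenario B: reversed vortex
                   [4.0, 4.0, 4.0, 4.0]])    # scenario C: low wind
hours = np.array([420.0, 250.0, 50.0])       # hours each scenario was observed

weights = hours / hours.sum()                # frequency of occurrence
long_term = weights @ fields                 # long-term average per receptor
print(long_term)
```

The same weighting, computed per month or per year from the meteorological record, yields the monthly and annually averaged concentrations that are compared with the roadside measurements.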

  1. Testing methodologies for quantifying physical models uncertainties. A comparative exercise using CIRCE and IPREM (FFTBM)

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, Jordi, E-mail: jordi.freixa-terradas@upc.edu; Alfonso, Elsa de, E-mail: elsa.de.alfonso@upc.edu; Reventós, Francesc, E-mail: francesc.reventos@upc.edu

    2016-08-15

    Highlights: • Uncertainty of physical models is a key issue in Best Estimate Plus Uncertainty analysis. • Estimation of uncertainties of physical models of thermal hydraulics system codes. • Comparison of CIRCÉ and FFTBM methodologies. • Simulation of reflood experiments in order to evaluate the uncertainty of physical models related to the reflood scenario. - Abstract: The increasing importance of Best-Estimate Plus Uncertainty (BEPU) analyses in nuclear safety and licensing processes has led to several international activities. The latest findings highlighted the uncertainties of physical models as one of the most controversial aspects of BEPU. This type of uncertainty is an important contributor to the total uncertainty of NPP BE calculations. Due to the complexity of estimating this uncertainty, it is often assessed solely by engineering judgment. The present study comprises a comparison of two different state-of-the-art methodologies, CIRCÉ and IPREM (FFTBM), capable of quantifying the uncertainty of physical models. Similarities and differences in their results are discussed through the observation of probability distribution functions and envelope calculations. In particular, the analyzed scenario is core reflood. Experimental data from the FEBA and PERICLES test facilities are employed, while the thermal hydraulic simulations are carried out with RELAP5/mod3.3. This work is undertaken under the framework of the PREMIUM (Post-BEMUSE Reflood Model Input Uncertainty Methods) benchmark.
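
Although CIRCÉ and IPREM (FFTBM) are specific quantification methodologies, the downstream use of their output can be illustrated generically: once a physical-model uncertainty is expressed as a probability distribution on a model multiplier, it is propagated by Monte Carlo sampling to an output figure of merit. The surrogate function and distribution below are invented, not results from RELAP5 or the benchmark.

```python
import numpy as np

rng = np.random.default_rng(1)

def peak_clad_temperature(htc_multiplier):
    """Toy surrogate for a system-code run: better heat transfer -> lower PCT."""
    return 1200.0 - 250.0 * np.log(htc_multiplier + 0.5)

# Suppose the quantification step produced a log-normal multiplier
# (median 1, sigma 0.2 in log space) for a reflood heat-transfer model.
mult = rng.lognormal(mean=0.0, sigma=0.2, size=10_000)
pct = peak_clad_temperature(mult)            # propagate by Monte Carlo

p95 = np.percentile(pct, 95)                 # upper-bound estimate for BEPU
print(p95)
```

Comparing the probability distribution functions produced by two quantification methods, as the study does, amounts to comparing the input distributions fed into this propagation step.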

  2. Translational pain research: evaluating analgesic effect in experimental visceral pain models

    DEFF Research Database (Denmark)

    Olesen, Anne Estrup; Andresen, Trine; Christrup, Lona Louring

    2009-01-01

    Deep visceral pain is frequent and presents major challenges in pain management, since its pathophysiology is still poorly understood. One way to optimize treatment of visceral pain is to improve knowledge of the mechanisms behind the pain and the mode of action of analgesic substances. This can facilitate minimizing the gap between knowledge gained in animal and human clinical studies. Combining experimental pain studies and pharmacokinetic studies can improve understanding of the pharmacokinetic-pharmacodynamic relationship of analgesics and, thus, provide valuable insight into optimal clinical … studies and the clinical condition of patients suffering from visceral pain, and thus constitute the missing link in translational pain research.

  3. Concepts and methodologies for modeling and simulation a tribute to Tuncer Oren

    CERN Document Server

    Yilmaz, Levent

    2015-01-01

    This comprehensive text/reference presents cutting-edge advances in the theory and methodology of modeling and simulation (M&S), and reveals how this work has been influenced by the fundamental contributions of Professor Tuncer Ören to this field. Exploring the synergies among the domains of M&S and systems engineering (SE), the book describes how M&S and SE can help to address the complex problems identified as "Grand Challenges" more effectively under a model-driven and simulation-directed systems engineering framework. Topics and features: examines frameworks for the development of advan

  4. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
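    Modified Guyan Reduction builds on classical Guyan (static) condensation, whose core step can be sketched as follows. The 3-DOF spring chain and the master/slave partition are illustrative only, not taken from the SLS models.

```python
import numpy as np

# Classical Guyan (static) condensation: partition DOFs into masters (kept)
# and slaves (condensed out). Illustrative 3-DOF spring chain, k = 1 each,
# grounded at one end; DOF 0 is the master.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])

m = [0]      # master DOFs
s = [1, 2]   # slave DOFs

Kmm = K[np.ix_(m, m)]
Kms = K[np.ix_(m, s)]
Ksm = K[np.ix_(s, m)]
Kss = K[np.ix_(s, s)]

# Guyan transformation T maps master DOFs to the full DOF set,
# with the slaves following statically: u = T @ u_m.
T = np.vstack([np.eye(len(m)), -np.linalg.solve(Kss, Ksm)])

# Reduced stiffness; equals the Schur complement Kmm - Kms Kss^-1 Ksm.
K_red = T.T @ K @ T
print(K_red)  # [[1.]]  only the ground spring resists at DOF 0
```

    The same transformation applied to the mass matrix (M_red = T.T @ M @ T) is what introduces the approximation that RMA and HR aim to correct.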

  5. Decision modelling of non-pharmacological interventions for individuals with dementia: a systematic review of methodologies

    DEFF Research Database (Denmark)

    Sopina, Liza; Sørensen, Jan

    2018-01-01

    alongside an RCT without additional modelling. Results: Two primary, five secondary and three tertiary prevention intervention studies were identified and reviewed. Five studies utilised Markov models, with others using discrete event, regression-based simulation, and decision tree approaches. A number...... of challenging methodological issues were identified, including the use of MMSE-score as the main outcome measure, limited number of strategies compared, restricted time horizons, and limited or dated data on dementia onset, progression and mortality. Only one of the three tertiary prevention studies explicitly...
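    Several of the reviewed studies used Markov cohort models; a minimal sketch of such a model (with made-up states, transition probabilities, costs and utilities, purely for illustration) might look like:

```python
import numpy as np

# Hypothetical 3-state Markov cohort model: Mild -> Severe -> Dead.
# All transition probabilities, annual costs and utilities are invented.
P = np.array([[0.80, 0.15, 0.05],   # from Mild
              [0.00, 0.85, 0.15],   # from Severe
              [0.00, 0.00, 1.00]])  # Dead is absorbing
cost    = np.array([2_000.0, 10_000.0, 0.0])  # annual cost per state
utility = np.array([0.75, 0.40, 0.0])         # QALY weight per state

cohort = np.array([1.0, 0.0, 0.0])  # everyone starts in Mild
total_cost = total_qaly = 0.0
for year in range(20):              # 20-year horizon, no discounting
    total_cost += cohort @ cost
    total_qaly += cohort @ utility
    cohort = cohort @ P             # advance one cycle

print(round(total_cost), round(total_qaly, 2))
```

    Comparing such totals between an intervention arm and a comparator arm (each with its own transition matrix) yields the incremental cost-effectiveness ratio the reviewed studies report.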

  6. Methodology and Applications in Non-linear Model-based Geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    that are approximately Gaussian. Parameter estimation and prediction for the transformed Gaussian model is studied. In some cases a transformation cannot possibly render the data Gaussian. A methodology for analysing such data was introduced by Diggle, Tawn and Moyeed (1998): The generalised linear spatial model...... priors for Bayesian inference is discussed. Procedures for parameter estimation and prediction are studied. Theoretical properties of Markov chain Monte Carlo algorithms are investigated, and different algorithms are compared. In addition, the thesis contains a manual for an R-package, geoRglmm, which...

  7. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    ) weighted-least-square regression. 3) Initialization of estimation by use of linear algebra providing a first guess. 4) Sequential and simultaneous GC parameter estimation using 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of the parameter...... covariance matrix, b) based on the bootstrap method, providing 95%-confidence intervals of parameters and predicted properties. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) for refrigerants...... their credibility and robustness in wider industrial and scientific applications....
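    The bootstrap-based uncertainty analysis of step 5b can be sketched generically; here synthetic linear data stand in for a GC property model, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "property" data: y = a + b*x + noise stands in for a GC model.
a_true, b_true = 2.0, 0.5
x = np.linspace(0.0, 10.0, 50)
y = a_true + b_true * x + rng.normal(0.0, 0.2, x.size)
X = np.column_stack([np.ones_like(x), x])

# Least-squares point estimate of the parameters.
theta_hat = np.linalg.lstsq(X, y, rcond=None)[0]

# Pairs bootstrap: refit on resampled (x, y) pairs, collect parameters.
boot = []
for _ in range(2000):
    idx = rng.integers(0, x.size, x.size)
    boot.append(np.linalg.lstsq(X[idx], y[idx], rcond=None)[0])
boot = np.array(boot)

# Percentile 95%-confidence intervals for each parameter.
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print(theta_hat, lo, hi)
```

    The asymptotic alternative of step 5a instead derives the covariance matrix from the Jacobian at the optimum; the bootstrap avoids that approximation at the cost of many refits.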

  8. A methodology to model flow-thermals inside a domestic gas oven

    International Nuclear Information System (INIS)

    Mistry, Hiteshkumar; Ganapathisubbu, S.; Dey, Subhrajit; Bishnoi, Peeush; Castillo, Jose Luis

    2011-01-01

    In this paper, the authors describe development of a CFD based methodology to evaluate performance of a domestic gas oven. This involves modeling three-dimensional, unsteady, forced convective flow field coupled with radiative participating media. Various strategies for capturing transient heat transfer coupled with mixed convection flow field are evaluated considering the trade-off between computational time and accuracy of predictions. A new technique of modeling gas oven that does not require detailed modeling of flow-thermals through the burner is highlighted. Experiments carried out to support this modeling development shows that heat transfer from burners can be represented as non-dimensional false bottom temperature profiles. Transient validation of this model with experiments show less than 6% discrepancy in thermal field during preheating of bake cycle of gas oven.

  9. Establishment, maintenance and in vitro and in vivo applications of primary human glioblastoma multiforme (GBM) xenograft models for translational biology studies and drug discovery.

    Science.gov (United States)

    Carlson, Brett L; Pokorny, Jenny L; Schroeder, Mark A; Sarkaria, Jann N

    2011-03-01

    Development of clinically relevant tumor model systems for glioblastoma multiforme (GBM) is important for advancement of basic and translational biology. One model that has gained wide acceptance in the neuro-oncology community is the primary xenograft model. This model entails the engraftment of patient tumor specimens into the flank of nude mice and subsequent serial passage of these tumors in the flank of mice. These tumors are then used to establish short-term explant cultures or intracranial xenografts. This unit describes detailed procedures for establishment, maintenance, and utilization of a primary GBM xenograft panel for the purpose of using them as tumor models for basic or translational studies.

  10. Validation of a Methodology to Predict Micro-Vibrations Based on Finite Element Model Approach

    Science.gov (United States)

    Soula, Laurent; Rathband, Ian; Laduree, Gregory

    2014-06-01

    This paper presents the second part of the ESA R&D study called "METhodology for Analysis of structure-borne MICro-vibrations" (METAMIC). After defining an integrated analysis and test methodology to help predict micro-vibrations [1], a full-scale validation test campaign has been carried out. It is based on a bread-board representative of a typical spacecraft (S/C) platform, consisting of a versatile structure made of aluminium sandwich panels equipped with different disturbance sources and a dummy payload made of a silicon carbide (SiC) bench. The bread-board has been instrumented with a large set of sensitive accelerometers, and tests have been performed including background noise measurement, modal characterization and micro-vibration tests. The results provided responses to the perturbation coming from a reaction wheel or cryo-cooler compressors, operated independently and then simultaneously with different operation modes. Using consistent modelling and associated experimental characterization techniques, a correlation status has been assessed by comparing test results with predictions based on the FEM approach. Very good results have been achieved, particularly for the case of a wheel in sweeping-rate operation, with test results over-predicted within a reasonable margin (a factor lower than two). Some limitations of the methodology have also been identified for sources operating at a fixed rate or coming with a small number of dominant harmonics, and recommendations have been issued in order to deal with model uncertainties and stay conservative.

  11. Methodological considerations for economic modelling of latent tuberculous infection screening in migrants.

    Science.gov (United States)

    Shedrawy, J; Siroka, A; Oxlade, O; Matteelli, A; Lönnroth, K

    2017-09-01

    Tuberculosis (TB) in migrants from endemic to low-incidence countries results mainly from the reactivation of latent tuberculous infection (LTBI). LTBI screening policies for migrants vary greatly between countries, and the evidence on the cost-effectiveness of the different approaches is weak and heterogeneous. The aim of this review was to assess the methodology used in published economic evaluations of LTBI screening among migrants to identify critical methodological options that must be considered when using modelling to determine value for money from different economic perspectives. Three electronic databases were searched and 10 articles were included. There was considerable variation across this small number of studies with regard to economic perspective, main outcomes, modelling technique, screening options and target populations considered, as well as in parameterisation of the epidemiological situation, test accuracy, efficacy, safety and programme performance. Only one study adopted a societal perspective; others adopted a health care or wider government perspective. Parameters representing the cascade of screening and treating LTBI varied widely, with some studies using highly aspirational scenarios. This review emphasises the need for a more harmonised approach for economic analysis, and better transparency in how policy options and economic perspectives influence methodological choices. Variability is justifiable for some parameters. However, sufficient data are available to standardise others. A societal perspective is ideal, but can be challenging due to limited data. Assumptions about programme performance should be based on empirical data or at least realistic assumptions. Results should be interpreted within specific contexts and policy options, with cautious generalisations.

  12. SOURCE LANGUAGE TEXT, PARALLEL TEXT, AND MODEL TRANSLATED TEXT: A PILOT STUDY IN TEACHING TRANSLATION TEXTO LENGUA ORIGEN, TEXTO PARALELO Y TEXTO TRADUCIDO MODELO. ESTUDIO PILOTO EN LA ENSEÑANZA DE LA TRADUCCIÓN

    Directory of Open Access Journals (Sweden)

    Sergio Bolaños Cuéllar

    2007-12-01

    Full Text Available The advance of cultural-oriented perspectives in Translation Studies has sometimes played down the text-linguistic nature of translation. A pilot study in teaching translation was carried out to make students aware of the text-linguistic character of translating and to help them improve their translation skills, particularly with an emphasis on self-awareness and self-correcting strategies. The theoretical background is provided by the Dynamic Translation Model (2004, 2005) proposed by the author, with relevant and important contributions taken from Genette's (1982) transtextuality phenomena (hypertext, hypotext, metatext, paratext, intertext) and House and Kasper's (1981) pragmatic modality markers (downgraders, upgraders). The key conceptual role of equivalence as a defining feature of translation is also dealt with. The textual relationship involving the Source Language Text (SLT) is deemed to be pivotal for performing translation and correction tasks in the classroom. Finally, results of the pilot study are discussed and some conclusions are drawn.

  13. The TetO rat as a new translational model for type 2 diabetic retinopathy by inducible insulin receptor knockdown.

    Science.gov (United States)

    Reichhart, Nadine; Crespo-Garcia, Sergio; Haase, Nadine; Golic, Michaela; Skosyrski, Sergej; Rübsam, Anne; Herrspiegel, Christina; Kociok, Norbert; Alenina, Natalia; Bader, Michael; Dechend, Ralf; Strauss, Olaf; Joussen, Antonia M

    2017-01-01

    Although the renin-angiotensin system plays an important role in the progression of diabetic retinopathy, its influence therein has not been systematically evaluated. Here we test the suitability of a new translational model of diabetic retinopathy, the TetO rat, for addressing the role of angiotensin-II receptor 1 (AT1) blockade in experimental diabetic retinopathy. Diabetes was induced by tetracycline-inducible small hairpin RNA (shRNA) knockdown of the insulin receptor in rats, generating TetO rats. Systemic treatment consisted of an AT1 blocker (ARB) at the onset of diabetes, following which, 4-5 weeks later the retina was analysed in vivo and ex vivo. Retinal function was assessed by Ganzfeld electroretinography (ERG). Retinal vessels in TetO rats showed differences in vessel calibre, together with gliosis. The total number and the proportion of activated mononuclear phagocytes was increased. TetO rats presented with loss of retinal ganglion cells (RGC) and ERG indicated photoreceptor malfunction. Both the inner and outer blood-retina barriers were affected. The ARB treated group showed reduced gliosis and an overall amelioration of retinal function, alongside RGC recovery, whilst no statistically significant differences in vascular and inflammatory features were detected. The TetO rat represents a promising translational model for the early neurovascular changes associated with type 2 diabetic retinopathy. ARB treatment had an effect on the neuronal component of the retina but not on the vasculature.

  14. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    Science.gov (United States)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, thereby increasing the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The 5 rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper will describe the methodology used for developing a visiting vehicle risk model.
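    The core arithmetic of aggregating per-docking collision risk across vehicle traffic can be sketched as follows; the probabilities and flight rates below are invented for illustration, not NASA figures:

```python
# Hypothetical per-docking collision probabilities and annual flight rates.
vehicles = {
    "Soyuz":    (1e-4, 4),   # (P(collision per docking), dockings per year)
    "Progress": (1e-4, 4),
    "ATV":      (2e-4, 1),
    "HTV":      (2e-4, 1),
}

# Assuming independent dockings, P(no collision all year) is the product
# of per-docking survival probabilities across all traffic.
p_none = 1.0
for p, n in vehicles.values():
    p_none *= (1.0 - p) ** n

p_any = 1.0 - p_none
print(f"annual P(at least one collision) ~ {p_any:.2e}")
```

    This shows why adding vehicles raises station risk even when each individual docking is very safe: the survival probabilities multiply.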

  15. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social networks analysis. In order to illustrate these methods in action, two cases based in materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.
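    A minimal agent-based model of food-practice diffusion over a social network (toy network and adoption rule invented for illustration, with no relation to the Quebrada de Humahuaca data) might be sketched as:

```python
import random

random.seed(42)

# Toy social network: agents and who they share meals with.
neighbors = {
    0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4],
    3: [1, 4], 4: [2, 3, 5], 5: [4],
}

# Each agent starts with food practice "A" or "B".
state = {a: random.choice("AB") for a in neighbors}

# Simple majority-adoption rule, iterated synchronously: an agent
# switches to the practice held by most of its meal-sharing neighbors.
for step in range(10):
    new = {}
    for a, nbrs in neighbors.items():
        count_a = sum(1 for n in nbrs if state[n] == "A")
        if count_a * 2 > len(nbrs):
            new[a] = "A"
        elif count_a * 2 < len(nbrs):
            new[a] = "B"
        else:
            new[a] = state[a]  # tie: keep current practice
    state = new

print(state)
```

    The same adjacency structure also supports the social-network-analysis side: degree, centrality and clustering can be computed directly from the `neighbors` dictionary.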

  16. A modelling methodology for assessing the impact of climate variability and climatic change on hydroelectric generation

    International Nuclear Information System (INIS)

    Munoz, J.R.; Sailor, D.J.

    1998-01-01

    A new methodology relating basic climatic variables to hydroelectric generation was developed. The methodology can be implemented in large or small basins with any number of hydro plants. The method was applied to the Sacramento, Eel and Russian river basins in northern California, where more than 100 hydroelectric plants are located. The final model predicts the availability of hydroelectric generation for the entire basin, given present and recent past climate conditions, with about 90% accuracy. The results can be used for water management purposes or for analyzing the effect of climate variability on hydrogeneration availability in the basin. A wide range of results can be obtained depending on the climate change scenario used. (Author)

  17. Mathematical Methodology for New Modeling of Water Hammer in Emergency Core Cooling System

    International Nuclear Information System (INIS)

    Lee, Seungchan; Yoon, Dukjoo; Ha, Sangjun

    2013-01-01

    From an engineering perspective, the study of water hammer has been carried out through experimental work and fluid mechanics. In this study, a new methodology is introduced based on Newtonian mechanics and a mathematical method. Also, NRC Generic Letter 2008-01 requires nuclear power plant operators to evaluate the effect of water hammer for the protection of pipes of the Emergency Core Cooling System, which is related to the Residual Heat Removal System and the Containment Spray System. This paper includes the modeling, the processes of derivation of the mathematical equations, and the comparison with other experimental work. This mathematical methodology is carried out to analyze the effect of water hammer. This study is in good agreement with other experimental results, as noted above. This method is very efficient in explaining the water-hammer phenomena

  18. Mathematical Methodology for New Modeling of Water Hammer in Emergency Core Cooling System

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungchan; Yoon, Dukjoo; Ha, Sangjun [Korea Hydro Nuclear Power Co. Ltd, Daejeon (Korea, Republic of)

    2013-05-15

    From an engineering perspective, the study of water hammer has been carried out through experimental work and fluid mechanics. In this study, a new methodology is introduced based on Newtonian mechanics and a mathematical method. Also, NRC Generic Letter 2008-01 requires nuclear power plant operators to evaluate the effect of water hammer for the protection of pipes of the Emergency Core Cooling System, which is related to the Residual Heat Removal System and the Containment Spray System. This paper includes the modeling, the processes of derivation of the mathematical equations, and the comparison with other experimental work. This mathematical methodology is carried out to analyze the effect of water hammer. This study is in good agreement with other experimental results, as noted above. This method is very efficient in explaining the water-hammer phenomena.

  19. Modeling of Throughput in Production Lines Using Response Surface Methodology and Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Federico Nuñez-Piña

    2018-01-01

    Full Text Available The problem of assigning buffers in a production line to obtain an optimum production rate is a combinatorial problem of type NP-Hard, known as the Buffer Allocation Problem. It is of great importance for designers of production systems due to the costs involved in terms of space requirements. In this work, the relationship among the number of buffer slots, the number of work stations, and the production rate is studied. Response surface methodology and an artificial neural network were used to develop predictive models to find optimal throughput values. 360 production rate values for different numbers of buffer slots and workstations were used to obtain a fourth-order mathematical model and a four-hidden-layer artificial neural network. Both models have a good performance in predicting the throughput, although the artificial neural network model shows a better fit (R=1.0000) against the response surface methodology (R=0.9996). Moreover, the artificial neural network produces better predictions for data not utilized in the models' construction. Finally, this study can be used as a guide to forecast the maximum or near-maximum throughput of production lines, taking into account the buffer size and the number of machines in the line.
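    A response-surface fit of throughput against buffer slots and workstation count can be sketched with ordinary least squares; the data below are synthetic (the study itself used 360 measured production rates), and a second-order surface stands in for the paper's fourth-order model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic throughput data over (buffer slots b, workstations w).
b, w = np.meshgrid(np.arange(1, 11), np.arange(2, 8))
b, w = b.ravel().astype(float), w.ravel().astype(float)
rate = 5.0 + 0.8 * b - 0.04 * b**2 - 0.3 * w + rng.normal(0, 0.05, b.size)

# Second-order response surface: 1, b, w, b^2, w^2, b*w.
X = np.column_stack([np.ones_like(b), b, w, b**2, w**2, b * w])
coef, *_ = np.linalg.lstsq(X, rate, rcond=None)

# Goodness of fit, comparable to the R values quoted in the abstract.
pred = X @ coef
r = np.corrcoef(rate, pred)[0, 1]
print(f"R = {r:.4f}")
```

    Once fitted, the surface can be evaluated over the feasible (b, w) grid to locate the near-maximum throughput configuration.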

  20. A new methodology for modeling of direct landslide costs for transportation infrastructures

    Science.gov (United States)

    Klose, Martin; Terhorst, Birgit

    2014-05-01

    The world's transportation infrastructure is at risk of landslides in many areas across the globe. Safe and affordable operation of traffic routes are the two main criteria for transportation planning in landslide-prone areas. The right balancing of these often conflicting priorities requires, amongst others, profound knowledge of the direct costs of landslide damage. These costs include capital investments for landslide repair and mitigation as well as operational expenditures for first response and maintenance works. This contribution presents a new methodology for ex post assessment of direct landslide costs for transportation infrastructures. The methodology includes tools to compile, model, and extrapolate landslide losses on different spatial scales over time. A landslide susceptibility model enables regional cost extrapolation by means of a cost figure obtained from local cost compilation for representative case study areas. On the local level, cost survey is closely linked with cost modeling, a toolset for cost estimation based on landslide databases. Cost modeling uses Landslide Disaster Management Process Models (LDMMs) and cost modules to simulate and monetize cost factors for certain types of landslide damage. The landslide susceptibility model provides a regional exposure index and updates the cost figure to a cost index which describes the costs per km of traffic route at risk of landslides. Both indexes enable the regionalization of local landslide losses. The methodology is applied and tested in a cost assessment for highways in the Lower Saxon Uplands, NW Germany, in the period 1980 to 2010. The basis of this research is a regional subset of a landslide database for the Federal Republic of Germany. In the 7,000 km² large Lower Saxon Uplands, 77 km of highway are located in potential landslide hazard areas. Annual average costs of 52k per km of highway at risk of landslides are identified as the cost index for a local case study area in this region.
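    The cost-index extrapolation reduces to simple arithmetic. The 52k-per-km cost index and the 77 km at risk come from the abstract; the inclusive-year count and the assumption that the local index applies region-wide are simplifications for illustration:

```python
# Local cost index: average annual cost per km of highway at risk
# (currency unit as in the source, "52k" per km per year).
local_annual_cost_per_km = 52_000.0
km_at_risk_region = 77.0   # highway km in potential landslide hazard area

# Regional extrapolation: cost index x exposed length x assessment period.
years = 31.0               # 1980-2010, counted inclusively (assumption)
regional_direct_costs = local_annual_cost_per_km * km_at_risk_region * years
print(f"extrapolated direct costs: {regional_direct_costs:,.0f}")
```

    In the paper's methodology the susceptibility-based exposure index refines this step, weighting route segments by landslide hazard rather than applying one uniform index.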

  1. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios

    2013-01-01

    An extended systematic methodology for the design of emulsion-based chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one-b...... to obtain one or more candidate formulations. A conceptual case study representing a personal detergent is presented to highlight the methodology....

  2. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios

    2013-01-01

    An extended systematic methodology for the design of emulsion-based chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one...... to obtain one or more candidate formulations. A conceptual case study representing a personal detergent is presented to highlight the methodology....

  3. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    International Nuclear Information System (INIS)

    Lahtinen, J.; Launiainen, T.; Heljanko, K.; Ropponen, J.

    2012-01-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)
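    The essence of model checking a safety property, exhaustively exploring a state machine's reachable states, can be sketched in a few lines. This toy two-process mutex model is purely illustrative and is not the SARANA tooling, which relies on dedicated model checkers:

```python
from collections import deque

# Toy model: two processes, each idle -> trying -> critical, with a shared
# turn variable granting entry. A state is the tuple (pc0, pc1, turn).
IDLE, TRYING, CRITICAL = 0, 1, 2

def with_pc(state, i, new_pc, turn):
    pc = list(state[:2])
    pc[i] = new_pc
    return (pc[0], pc[1], turn)

def successors(state):
    turn = state[2]
    for i in (0, 1):
        if state[i] == IDLE:
            yield with_pc(state, i, TRYING, turn)
        elif state[i] == TRYING and turn == i:
            yield with_pc(state, i, CRITICAL, turn)
        elif state[i] == CRITICAL:
            yield with_pc(state, i, IDLE, 1 - i)  # release, pass the turn

# Breadth-first reachability check of the safety property
# "never both processes in the critical section".
init = (IDLE, IDLE, 0)
seen, queue = {init}, deque([init])
violation = False
while queue:
    s = queue.popleft()
    if s[0] == CRITICAL and s[1] == CRITICAL:
        violation = True
        break
    for t in successors(s):
        if t not in seen:
            seen.add(t)
            queue.append(t)

print("mutual exclusion holds:", not violation)
```

    Real I&C models explode far beyond what explicit enumeration can handle, which is why the report's symbolic methods, modular algorithms and asynchronous function-block modelling matter.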

  4. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    Energy Technology Data Exchange (ETDEWEB)

    Lahtinen, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Launiainen, T.; Heljanko, K.; Ropponen, J. [Aalto Univ., Espoo (Finland). Dept. of Information and Computer Science

    2012-07-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)

  5. 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Koziel, Slawomir; Kacprzyk, Janusz; Leifsson, Leifur; Ören, Tuncer

    2015-01-01

    This book includes extended and revised versions of a set of selected papers from the 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2013), which was co-organized by Reykjavik University (RU) and sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). SIMULTECH 2013 was held in cooperation with the ACM SIGSIM - Special Interest Group (SIG) on SImulation and Modeling (SIM), Movimento Italiano Modellazione e Simulazione (MIMOS) and the AIS Special Interest Group on Modeling and Simulation (AIS SIGMAS), and technically co-sponsored by the Society for Modeling & Simulation International (SCS), Liophant Simulation, Simulation Team and the International Federation for Information Processing (IFIP). These proceedings bring together researchers, engineers, applied mathematicians and practitioners working on advances and applications in the field of system simulation.

  6. Power Prediction Model for Turning EN-31 Steel Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    M. Hameedullah

    2010-01-01

    Full Text Available Power consumption in turning EN-31 steel (a material that is most extensively used in the automotive industry) with a tungsten carbide tool under different cutting conditions was experimentally investigated. The experimental runs were planned according to a 2⁴+8 added-centre-point factorial design of experiments, replicated thrice. The data collected was statistically analyzed using the Analysis of Variance technique, and first-order and second-order power consumption prediction models were developed by using response surface methodology (RSM). It is concluded that the second-order model is more accurate than the first-order model and fits well with the experimental data. The model can be used in the automotive industries for deciding the cutting parameters for minimum power consumption and hence maximum productivity

  7. Topobathymetric elevation model development using a new methodology: Coastal National Elevation Database

    Science.gov (United States)

    Danielson, Jeffrey J.; Poppenga, Sandra K.; Brock, John C.; Evans, Gayla A.; Tyler, Dean; Gesch, Dean B.; Thatcher, Cindy A.; Barras, John

    2016-01-01

    During the coming decades, coastlines will respond to widely predicted sea-level rise, storm surge, and coastal inundation flooding from disastrous events. Because physical processes in coastal environments are controlled by the geomorphology of over-the-land topography and underwater bathymetry, many applications of geospatial data in coastal environments require detailed knowledge of the near-shore topography and bathymetry. In this paper, an updated methodology used by the U.S. Geological Survey Coastal National Elevation Database (CoNED) Applications Project is presented for developing coastal topobathymetric elevation models (TBDEMs) from multiple topographic data sources with adjacent intertidal topobathymetric and offshore bathymetric sources to generate seamlessly integrated TBDEMs. This repeatable, updatable, and logically consistent methodology assimilates topographic data (land elevation) and bathymetry (water depth) into a seamless coastal elevation model. Within the overarching framework, vertical datum transformations are standardized in a workflow that interweaves spatially consistent interpolation (gridding) techniques with a land/water boundary mask delineation approach. Output gridded raster TBDEMs are stacked into a file storage system of mosaic datasets within an Esri ArcGIS geodatabase for efficient updating while maintaining current and updated spatially referenced metadata. Topobathymetric data provide a required seamless elevation product for several science application studies, such as shoreline delineation, coastal inundation mapping, sediment transport, sea-level rise, storm surge models, and tsunami impact assessment. These detailed coastal elevation data are critical to depict regions prone to climate change impacts and are essential to planners and managers responsible for mitigating the associated risks and costs to both human communities and ecosystems. The CoNED methodology approach has been used to construct integrated TBDEM models

  8. Efficient methodologies for system matrix modelling in iterative image reconstruction for rotating high-resolution PET

    Energy Technology Data Exchange (ETDEWEB)

    Ortuno, J E; Kontaxakis, G; Rubio, J L; Santos, A [Departamento de Ingenieria Electronica (DIE), Universidad Politecnica de Madrid, Ciudad Universitaria s/n, 28040 Madrid (Spain); Guerra, P [Networking Research Center on Bioengineering, Biomaterials and Nanomedicine (CIBER-BBN), Madrid (Spain)], E-mail: juanen@die.upm.es

    2010-04-07

    A fully 3D iterative image reconstruction algorithm has been developed for high-resolution PET cameras composed of pixelated scintillator crystal arrays and rotating planar detectors, based on the ordered subsets approach. The associated system matrix is precalculated with Monte Carlo methods that incorporate physical effects not included in analytical models, such as positron range effects and interaction of the incident gammas with the scintillator material. Custom Monte Carlo methodologies have been developed and optimized for modelling of system matrices for fast iterative image reconstruction adapted to specific scanner geometries, without redundant calculations. According to the methodology proposed here, only one-eighth of the voxels within two central transaxial slices need to be modelled in detail. The rest of the system matrix elements can be obtained with the aid of axial symmetries and redundancies, as well as in-plane symmetries within transaxial slices. Sparse matrix techniques for the non-zero system matrix elements are employed, allowing for fast execution of the image reconstruction process. This 3D image reconstruction scheme has been compared in terms of image quality to a 2D fast implementation of the OSEM algorithm combined with Fourier rebinning approaches. This work confirms the superiority of fully 3D OSEM in terms of spatial resolution, contrast recovery and noise reduction as compared to conventional 2D approaches based on rebinning schemes. At the same time it demonstrates that fully 3D methodologies can be efficiently applied to the image reconstruction problem for high-resolution rotational PET cameras by applying accurate pre-calculated system models and taking advantage of the system's symmetries.
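The one-eighth symmetry argument above can be illustrated with a small sketch (in Python). The grid size and the mapping function are illustrative, not the actual scanner geometry: every voxel of a square transaxial slice is mapped by reflections to a representative in one wedge, so only those representatives would need explicit Monte Carlo modelling.

```python
# Sketch: exploit in-plane reflection symmetries so only about one-eighth
# of the voxels in a transaxial slice need explicit Monte Carlo modelling.
# Illustrative only; real PET system matrices also exploit axial redundancies.

def canonical_voxel(x, y, n):
    """Map (x, y) on an n x n grid to its representative voxel via
    reflections about the vertical, horizontal and diagonal axes."""
    cx, cy = x, y
    # reflect into the left/top half
    if cx > n - 1 - cx:
        cx = n - 1 - cx
    if cy > n - 1 - cy:
        cy = n - 1 - cy
    # reflect about the diagonal so cy <= cx
    if cy > cx:
        cx, cy = cy, cx
    return cx, cy

# Only canonical voxels would carry explicitly modelled matrix elements.
n = 8
unique = {canonical_voxel(x, y, n) for x in range(n) for y in range(n)}
print(len(unique), "modelled voxels instead of", n * n)
```

For an 8 x 8 slice the wedge holds 10 representatives rather than exactly 64/8 = 8, because voxels on the reflection axes are shared between symmetric octants.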

  9. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    The Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree”, which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from the cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and the likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses the “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to their influencing factors.

  10. A geostatistical methodology to assess the accuracy of unsaturated flow models

    International Nuclear Information System (INIS)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
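The node-by-node comparison step can be sketched as follows, using synthetic fields on the 16 x 16 x 36 grid mentioned above; the measurement-error threshold and both fields are assumed values, not Hanford data.

```python
import numpy as np

# Sketch of the node-by-node model-evaluation step: compare model-predicted
# water content against geostatistically interpolated values at every node
# of a 16 x 16 x 36 grid. Fields and error threshold are synthetic/assumed.

rng = np.random.default_rng(0)
interpolated = rng.uniform(0.05, 0.30, size=(16, 16, 36))   # "observed" field
model = interpolated + rng.normal(0.0, 0.01, size=(16, 16, 36))

measurement_error = 0.02                       # assumed water-content accuracy
error = model - interpolated
within = np.abs(error) <= measurement_error    # categorize every node

print(f"{within.mean():.1%} of nodes within measurement error")
```

The boolean mask gives the complete accounting the abstract describes: each node is categorized, and regions (such as the silt lenses) where `within` is False can be localized directly.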

  11. A geostatistical methodology to assess the accuracy of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.

  12. DATA MINING METHODOLOGY FOR DETERMINING THE OPTIMAL MODEL OF COST PREDICTION IN SHIP INTERIM PRODUCT ASSEMBLY

    Directory of Open Access Journals (Sweden)

    Damir Kolich

    2016-03-01

    In order to accurately predict the costs of the thousands of interim products assembled in shipyards, skilled engineers must currently develop detailed Gantt charts for each interim product separately, which takes many hours. It is therefore helpful to develop a prediction tool that estimates the cost of interim products accurately and quickly without the need for skilled engineers. This will drive down shipyard costs and improve competitiveness. Data mining is used extensively for developing prediction models in other industries. Since ships consist of thousands of interim products, it is logical to develop a data mining methodology for a shipyard or any other manufacturing industry where interim products are produced. The methodology involves analysis of existing interim products and data collection. Pre-processing and principal component analysis are done to make the data “user-friendly” for later prediction processing and to develop models that are both accurate and robust. The support vector machine is shown to be the better model when the number of tuples is low. However, as the number of tuples increases beyond 10,000, the artificial neural network model is recommended.
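The pre-processing and principal component analysis step of such a pipeline can be sketched with plain NumPy; the data, attribute count, and the 95% variance cutoff below are illustrative, not the shipyard dataset.

```python
import numpy as np

# Sketch of the pre-processing step: standardize interim-product attributes,
# then keep the principal components explaining most of the variance before
# fitting a cost-prediction model (SVM or ANN). Data are synthetic.

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))            # 200 interim products, 6 attributes
X[:, 3] = X[:, 0] * 2 + 0.1 * rng.normal(size=200)   # a correlated attribute

Xs = (X - X.mean(axis=0)) / X.std(axis=0)            # standardize
cov = np.cov(Xs, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1]                     # largest variance first
explained = eigval[order] / eigval.sum()

# keep enough components for 95% of the variance
k = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)
scores = Xs @ eigvec[:, order[:k]]                   # reduced feature matrix
print("components kept:", k, "of", X.shape[1])
```

The reduced `scores` matrix is what a downstream SVM or neural network regressor would be trained on; the near-collinear attribute is absorbed into one component.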

  13. Methodology of synchronization among strategy and operation. A standards-based modeling approach

    Directory of Open Access Journals (Sweden)

    VICTOR EDWIN COLLAZOS

    2017-05-01

    Enterprise Architecture (EA) has gained importance in recent years, mainly for its concept of “alignment” between the strategic and operational levels of organizations. Such alignment occurs when Information Technology (IT) is applied correctly and timely, working in synergy and harmony with strategy and operations to achieve their mutual goals and satisfy organizational needs. Both the strategic and operational levels have standards that help model the elements necessary to obtain the desired results. In this sense, BMM and BPMN were selected because both have the support of the OMG and are fairly well known for modelling the strategic and operational levels, respectively. In addition, i* goal modelling can be used to reduce the gap between these two standards. This proposal may contribute both to the high-level design of the information system and to the appropriate identification of the business processes that will support it. This paper presents a methodology for aligning strategy and operations based on standards and heuristics. We have classified the elements of the models and, for some specific cases, extended the heuristics associated between them. This allows us to propose a methodology that uses the above-mentioned standards and combines mappings, transformations and actions to be considered in the alignment process.

  14. Methodology for identifying parameters for the TRNSYS model Type 210 - wood pellet stoves and boilers

    Energy Technology Data Exchange (ETDEWEB)

    Persson, Tomas; Fiedler, Frank; Nordlander, Svante

    2006-05-15

    This report describes a method for performing measurements on boilers and stoves and for identifying, from those measurements, the parameters of the boiler/stove model TRNSYS Type 210. The model can be used for detailed annual system simulations in TRNSYS. Experience from measurements on three different pellet stoves and four boilers was used to develop this methodology. Recommendations for the set-up of measurements are given, together with the combustion theory required for data evaluation and preparation. The data evaluation showed that the uncertainties are quite large for the measured flue gas flow rate and, for boilers and stoves with a high fraction of energy going to the water jacket, the calculated heat rate to the room may also have large uncertainties. A methodology for the parameter identification process and identified parameters for two different stoves and three boilers are given. Finally, the identified models are compared with measured data, showing that the model generally agrees well with measurements during both stationary and dynamic conditions.
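The measure-then-identify workflow can be sketched on a deliberately simplified thermal model: fit a first-order boiler model T'(t) = P/C - (UA/C)(T - T_amb) to noisy "measurements" by least squares. The parameter names (UA, C) and all numbers are illustrative; the real Type 210 model has many more parameters and dynamic effects.

```python
import numpy as np

# Sketch of parameter identification from measurements: simulate noisy
# boiler temperatures, then recover the heat-loss coefficient UA and the
# thermal capacitance C by linear regression on the finite differences.

dt, T_amb, P = 60.0, 20.0, 5000.0          # s, degC, W (assumed test rig)
UA_true, C_true = 15.0, 4.0e5              # W/K, J/K ("unknown" truth)

# simulate noisy "measurements" with explicit Euler
temps = [20.0]
for _ in range(200):
    temps.append(temps[-1] + dt * (P - UA_true * (temps[-1] - T_amb)) / C_true)
rng = np.random.default_rng(2)
T_meas = np.array(temps) + rng.normal(0.0, 0.01, len(temps))

# linear model: dT/dt = P/C - (UA/C) * (T - T_amb)
dTdt = np.diff(T_meas) / dt
A = np.column_stack([np.ones(dTdt.size), -(T_meas[:-1] - T_amb)])
(a, b), *_ = np.linalg.lstsq(A, dTdt, rcond=None)
C_est = P / a                               # from intercept a = P/C
UA_est = b * C_est                          # from slope b = UA/C
print(f"UA ~ {UA_est:.1f} W/K, C ~ {C_est:.0f} J/K")
```

The same measure-then-regress pattern underlies the report's identification process, just with a far richer model structure and real test-bench data.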

  15. Oil Well Blowout 3D computational modeling: review of methodology and environmental requirements

    Directory of Open Access Journals (Sweden)

    Pedro Mello Paiva

    2016-12-01

    This literature review presents the different methodologies used in three-dimensional modeling of the dispersion of hydrocarbons originating from an oil well blowout. It presents the concepts of coastal environmental sensitivity and vulnerability, their importance for prioritizing the most vulnerable areas in case of contingency, and the relevant legislation. We also discuss some limitations of the methodology currently used in environmental studies of oil drift, which assumes a simplified spill at the surface even in the well blowout scenario. Efforts to better understand the behavior of oil and gas in the water column, and to model the trajectory in three dimensions, gained strength after the Deepwater Horizon spill in the Gulf of Mexico in 2010. The data collected and the observations made during the accident were widely used to adjust the models, incorporating various factors related to the hydrodynamic forcing and weathering processes to which the hydrocarbons are subjected during subsurface leaks. The difficulties prove to be even more challenging for blowouts in deep waters, where the uncertainties are still larger. The studies addressed different variables to adjust oil and gas dispersion models along the upward trajectory. Factors that exert strong influences include: the speed of the subsurface currents; gas separation from the main plume; hydrate formation; dissolution of oil and gas droplets; variations in droplet diameter; intrusion of the droplets at intermediate depths; biodegradation; and appropriate parametrization of the density, salinity and temperature profiles through the water column.

  16. Machine Translation and Other Translation Technologies.

    Science.gov (United States)

    Melby, Alan

    1996-01-01

    Examines the application of linguistic theory to machine translation and translator tools, discusses the use of machine translation and translator tools in the real world of translation, and addresses the impact of translation technology on conceptions of language and other issues. Findings indicate that the human mind is flexible and linguistic…

  17. Probabilistic risk assessment modeling of digital instrumentation and control systems using two dynamic methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, T., E-mail: aldemir.1@osu.ed [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Guarro, S. [ASCA, Inc., 1720 S. Catalina Avenue, Suite 220, Redondo Beach, CA 90277-5501 (United States); Mandelli, D. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Kirschenbaum, J. [Ohio State University, Department of Computer Science and Engineering, Columbus, OH 43210 (United States); Mangan, L.A. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Bucci, P. [Ohio State University, Department of Computer Science and Engineering, Columbus, OH 43210 (United States); Yau, M. [ASCA, Inc., 1720 S. Catalina Avenue, Suite 220, Redondo Beach, CA 90277-5501 (United States); Ekici, E. [Ohio State University, Department of Electrical and Computer Engineering, Columbus, OH 43210 (United States); Miller, D.W.; Sun, X. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Arndt, S.A. [U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001 (United States)

    2010-10-15

    The Markov/cell-to-cell mapping technique (CCMT) and the dynamic flowgraph methodology (DFM) are two system logic modeling methodologies that have been proposed to address the dynamic characteristics of digital instrumentation and control (I&C) systems and provide risk-analytical capabilities that supplement those provided by traditional probabilistic risk assessment (PRA) techniques for nuclear power plants. Both methodologies utilize a discrete-state, multi-valued logic representation of the digital I&C system. For probabilistic quantification purposes, both techniques require the estimation of the probabilities of basic system failure modes, including digital I&C software failure modes, that appear in the prime implicants identified as contributors to a given system event of interest. As in any other system modeling process, the accuracy and predictive value of the models produced by the two techniques depend not only on the intrinsic features of the modeling paradigm, but also, to a considerable extent, on the information and knowledge available to the analyst concerning the system behavior and operation rules under normal and off-nominal conditions, and the associated controlled/monitored process dynamics. The application of the two methodologies is illustrated using a digital feedwater control system (DFWCS) similar to that of an operating pressurized water reactor. This application was carried out to demonstrate how the use of either technique, or both, can facilitate the updating of an existing nuclear power plant PRA model following an upgrade of the instrumentation and control system from analog to digital. Because of scope limitations, the focus of the demonstration of the methodologies was intentionally limited to aspects of digital I&C system behavior for which probabilistic data were on hand or could be generated within the existing project bounds of time and resources. The data used in the probabilistic quantification portion of the

  18. Capturing complexity in work disability research: application of system dynamics modeling methodology.

    Science.gov (United States)

    Jetha, Arif; Pransky, Glenn; Hettinger, Lawrence J

    2016-01-01

    Work disability (WD) is characterized by variable and occasionally undesirable outcomes. The underlying determinants of WD outcomes include patterns of dynamic relationships among health, personal, organizational and regulatory factors that have been challenging to characterize, and inadequately represented by contemporary WD models. System dynamics modeling (SDM) methodology applies a sociotechnical systems thinking lens to view WD systems as comprising a range of influential factors linked by feedback relationships. SDM can potentially overcome limitations in contemporary WD models by uncovering causal feedback relationships, and conceptualizing dynamic system behaviors. It employs a collaborative and stakeholder-based model building methodology to create a visual depiction of the system as a whole. SDM can also enable researchers to run dynamic simulations to provide evidence of anticipated or unanticipated outcomes that could result from policy and programmatic intervention. SDM may advance rehabilitation research by providing greater insights into the structure and dynamics of WD systems while helping to understand inherent complexity. Challenges related to data availability, determining validity, and the extensive time and technical skill requirements for model building may limit SDM's use in the field and should be considered. Contemporary work disability (WD) models provide limited insight into complexity associated with WD processes. System dynamics modeling (SDM) has the potential to capture complexity through a stakeholder-based approach that generates a simulation model consisting of multiple feedback loops. SDM may enable WD researchers and practitioners to understand the structure and behavior of the WD system as a whole, and inform development of improved strategies to manage straightforward and complex WD cases.
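A minimal stock-and-flow sketch can make the SDM idea concrete: one stock (workers off work) with an inflow and a recovery outflow that weakens as the caseload grows, forming a feedback loop. All rates are illustrative, not calibrated to any work disability data.

```python
# Minimal system dynamics sketch: one stock with a feedback loop in which
# a larger caseload slows per-case recovery. Rates are illustrative only.

def simulate(steps=120, inflow=10.0, base_recovery=0.20, strain=0.002):
    off_work = 50.0                     # initial stock (cases)
    history = [off_work]
    for _ in range(steps):
        # feedback: a larger caseload reduces the per-case recovery rate
        recovery_rate = max(base_recovery - strain * off_work, 0.02)
        off_work += inflow - recovery_rate * off_work
        history.append(off_work)
    return history

traj = simulate()
print(f"stock after 120 steps: {traj[-1]:.1f}")
```

Even this toy model exhibits the behavior SDM is meant to expose: the feedback pushes the system toward a much higher caseload than a fixed-recovery-rate model would predict, which is the kind of unanticipated outcome simulation runs are intended to reveal.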

  19. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Science.gov (United States)

    Rivera, Diego; Rivas, Yessica; Godoy, Alex

    2015-02-01

    Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model-architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE (Generalized Likelihood Uncertainty Estimation) methodology were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m3 s-1 after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite criticisms of the GLUE methodology, such as its lack of statistical formality, it proved a useful tool for assisting the modeller with the identification of critical parameters.
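The GLUE workflow itself (Monte Carlo parameter sampling, an informal likelihood, a behavioural threshold, and uncertainty bounds over the behavioural simulations) can be sketched on a toy linear-reservoir runoff model. The model, the forcing, the Nash-Sutcliffe likelihood choice and the 0.7 threshold are all illustrative assumptions.

```python
import numpy as np

# Sketch of GLUE: sample parameter sets, score each with an informal
# likelihood (Nash-Sutcliffe efficiency), keep "behavioural" sets above
# a threshold, and form uncertainty bounds from their simulations.

rng = np.random.default_rng(3)
rain = rng.gamma(2.0, 5.0, size=100)            # synthetic forcing (mm)

def model(k):
    """Toy linear-reservoir runoff model: outflow = k * storage."""
    s, out = 10.0, []
    for p in rain:
        s += p
        q = k * s
        s -= q
        out.append(q)
    return np.array(out)

obs = model(0.30) + rng.normal(0.0, 0.5, 100)   # synthetic "observations"

def nse(sim):
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

ks = rng.uniform(0.05, 0.95, 500)               # Monte Carlo parameter sets
likelihood = np.array([nse(model(k)) for k in ks])

behavioural = ks[likelihood > 0.7]              # equifinal parameter sets
sims = np.array([model(k) for k in behavioural])
lower, upper = np.percentile(sims, [5, 95], axis=0)   # uncertainty bounds
print(len(behavioural), "behavioural sets; mean band width",
      round(float((upper - lower).mean()), 2))
```

The spread of `behavioural` values is the equifinality the abstract discusses, and narrowing the band width by constraining a parameter mirrors the paper's reduction from 50 to 20 m3 s-1.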

  20. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  1. New methodologies for calculation of flight parameters on reduced scale wings models in wind tunnel =

    Science.gov (United States)

    Ben Mosbah, Abdallah

    In order to improve the quality of wind tunnel tests and the tools used to perform aerodynamic tests on aircraft wings in the wind tunnel, new methodologies were developed and tested on rigid and flexible wing models. The flexible wing concept consists of replacing a portion (lower and/or upper) of the skin with a flexible portion whose shape can be changed using an actuation system installed inside the wing. The main purpose of this concept is to improve the aerodynamic performance of the aircraft, and especially to reduce the fuel consumption of the airplane. Numerical and experimental analyses were conducted to develop and test the methodologies proposed in this thesis. To control the flow inside the test sections of the Price-Paidoussis wind tunnel at LARCASE, numerical and experimental analyses were performed. Computational fluid dynamics calculations were made in order to obtain a database used to develop a new hybrid methodology for wind tunnel calibration. This approach allows controlling the flow in the test section of the Price-Paidoussis wind tunnel. For the fast determination of aerodynamic parameters, new hybrid methodologies were proposed. These methodologies were used to control flight parameters through calculation of the drag, lift and pitching moment coefficients and of the pressure distribution around an airfoil. These aerodynamic coefficients were calculated from known airflow conditions such as the angle of attack and the Mach and Reynolds numbers. In order to modify the shape of the wing skin, electric actuators were installed inside the wing to obtain the desired shape. These deformations provide optimal profiles for different flight conditions in order to reduce fuel consumption. A controller based on neural networks was implemented to obtain the desired actuator displacements. 
A metaheuristic algorithm was used in hybridization with neural networks, and support vector machine approaches and their

  2. Agile Methodologies and Software Process Improvement Maturity Models, Current State of Practice in Small and Medium Enterprises

    OpenAIRE

    Koutsoumpos, Vasileios; Marinelarena, Iker

    2013-01-01

    Abstract—Background: Software Process Improvement (SPI) maturity models have been developed to assist organizations to enhance software quality. Agile methodologies are used to ensure productivity and quality of a software product. Amongst others they are applied in Small and Medium – sized Enterprises (SMEs). However, little is known about the combination of Agile methodologies and SPI maturity models regarding SMEs and the results that could emerge, as all the current SPI models are address...

  3. Typologically robust statistical machine translation : Understanding and exploiting differences and similarities between languages in machine translation

    NARCIS (Netherlands)

    Daiber, J.

    2018-01-01

    Machine translation systems often incorporate modeling assumptions motivated by properties of the language pairs they initially target. When such systems are applied to language families with considerably different properties, translation quality can deteriorate. Phrase-based machine translation

  4. Frescoed Vaults: Accuracy Controlled Simplified Methodology for Planar Development of Three-Dimensional Textured Models

    Directory of Open Access Journals (Sweden)

    Marco Giorgio Bevilacqua

    2016-03-01

    In the field of documentation and preservation of cultural heritage, there is keen interest in 3D metric viewing and rendering of architecture, for both formal appearance and color. On the other hand, the operative steps of restoration interventions still require full-scale, 2D metric surface representations. The transition from 3D to 2D representation, with the related geometric transformations, has not yet been fully formalized for the planar development of frescoed vaults. Methodologies proposed so far on this subject provide for transitioning from point cloud models to ideal mathematical surfaces and projecting textures using software tools. The methodology used for geometry and texture development in the present work does not require any dedicated software. The different processing steps can be individually checked for any error introduced, which can then be quantified. A direct accuracy check of the planar development of the frescoed surface has been carried out by qualified restorers, yielding a result of 3 mm. The proposed methodology, although requiring further studies to improve automation of the different processing steps, allowed extracting 2D drafts fully usable by operators restoring the vault frescoes.

  5. Ab initio translationally invariant nonlocal one-body densities from no-core shell-model theory

    Science.gov (United States)

    Burrows, M.; Elster, Ch.; Popa, G.; Launey, K. D.; Nogga, A.; Maris, P.

    2018-02-01

    Background: It is well known that effective nuclear interactions are in general nonlocal. Thus if nuclear densities obtained from ab initio no-core shell-model (NCSM) calculations are to be used in reaction calculations, translationally invariant nonlocal densities must be available. Purpose: Though it is standard to extract translationally invariant one-body local densities from NCSM calculations to calculate local nuclear observables like radii and transition amplitudes, the corresponding nonlocal one-body densities have not been considered so far. A major reason for this is that the procedure for removing the center-of-mass component from NCSM wave functions up to now has only been developed for local densities. Results: A formulation for removing center-of-mass contributions from nonlocal one-body densities obtained from NCSM and symmetry-adapted NCSM (SA-NCSM) calculations is derived, and applied to the ground state densities of 4He, 6Li, 12C, and 16O. The nonlocality is studied as a function of angular momentum components in momentum as well as coordinate space. Conclusions: We find that the nonlocality for the ground state densities of the nuclei under consideration increases as a function of the angular momentum. The relative magnitude of those contributions decreases with increasing angular momentum. In general, the nonlocal structure of the one-body density matrices we studied is given by the shell structure of the nucleus, and cannot be described with simple functional forms.

  6. Boolean modeling in systems biology: an overview of methodology and applications

    International Nuclear Information System (INIS)

    Wang, Rui-Sheng; Albert, Réka; Saadatpour, Assieh

    2012-01-01

    Mathematical modeling of biological processes provides deep insights into complex cellular systems. While quantitative and continuous models such as differential equations have been widely used, their use is obstructed in systems wherein the knowledge of mechanistic details and kinetic parameters is scarce. On the other hand, a wealth of molecular level qualitative data on individual components and interactions can be obtained from the experimental literature and high-throughput technologies, making qualitative approaches such as Boolean network modeling extremely useful. In this paper, we build on our research to provide a methodology overview of Boolean modeling in systems biology, including Boolean dynamic modeling of cellular networks, attractor analysis of Boolean dynamic models, as well as inferring biological regulatory mechanisms from high-throughput data using Boolean models. We finally demonstrate how Boolean models can be applied to perform the structural analysis of cellular networks. This overview aims to acquaint life science researchers with the basic steps of Boolean modeling and its applications in several areas of systems biology. (paper)
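The basic steps of Boolean dynamic modeling and attractor analysis can be sketched in a few lines: a toy three-node regulatory network under synchronous update, with an exhaustive attractor search over all 2^3 states. The wiring below is illustrative, not a published network.

```python
from itertools import product

# Sketch of Boolean network modeling: synchronous update rules for a
# three-node toy network, plus exhaustive attractor identification.

def update(state):
    a, b, c = state
    return (not c,        # A is inhibited by C
            a,            # B is activated by A
            a and b)      # C requires both A and B

def attractor(state):
    """Iterate until a state repeats; return the cycle in a canonical
    rotation so identical attractors compare equal."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = update(state)
    cyc = seen[seen.index(state):]
    rotations = [tuple(cyc[i:] + cyc[:i]) for i in range(len(cyc))]
    return min(rotations)

# Every initial state flows into some attractor; collect the distinct ones.
cycles = {attractor(s) for s in product([False, True], repeat=3)}
print(len(cycles), "attractor(s); period", len(next(iter(cycles))))
```

For this wiring every one of the eight states falls into a single limit-cycle attractor of period five, the kind of qualitative system-level conclusion (steady states versus oscillations) that Boolean models deliver without any kinetic parameters.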

  7. A novel methodology improves reservoir characterization models using geologic fuzzy variables

    Energy Technology Data Exchange (ETDEWEB)

    Soto B, Rodolfo [DIGITOIL, Maracaibo (Venezuela); Soto O, David A. [Texas A and M University, College Station, TX (United States)

    2004-07-01

    One of the research projects carried out in the Cusiana field to explain its rapid decline during recent years aimed to obtain better permeability models. The reservoir of this field has a complex layered system that is not easy to model using conventional methods. The new technique included the development of porosity and permeability maps from cored wells following the same trend of the sand depositions for each facies or layer, according to the sedimentary facies and depositional system models. Then, we used fuzzy logic to reproduce those maps in three dimensions as geologic fuzzy variables. After multivariate statistical and factor analyses, we found independence and a good correlation coefficient between the geologic fuzzy variables and core permeability and porosity. This means the geologic fuzzy variable can explain the fabric, the grain size and the pore geometry of the reservoir rock throughout the field. Finally, we developed a neural network permeability model using porosity, gamma ray and the geologic fuzzy variable as input variables. This model has a cross-correlation coefficient of 0.873 and an average absolute error of 33%, compared with the existing model's correlation coefficient of 0.511 and absolute error greater than 250%. We tested different methodologies, and this new one proved to be a promising way to obtain better permeability models. The use of the models has had a high impact on the explanation of well performance and workovers, and on reservoir simulation models. (author)

  8. The Translator's Turn: in the Cultural Turn

    Institute of Scientific and Technical Information of China (English)

    徐玮玮

    2003-01-01

Introduction: Douglas Robinson rose to the defense of the "atheoretical" American literary translator in The Translator's Turn (1991). Here, I borrow his title, but my paper concerns the translator's role in translating. In his book, Robinson argued that the literary translator embodies an integration of feeling and thought, of intuition and systematization. In analyzing the "turn" that the translator takes from the source text to the target text, Robinson offered a "dialogical" model, that is, the translator's dialogical engagement with the source language and with the ethic of the target language. Robinson allows the translator to intervene, subvert, divert, even entertain, emphasizing the creative aspect of literary translation. Linguists, scientists, and philosophers have had their chance at translation theory; now it is time, he argued, for literary translators to have their "turn".

  9. PWR Facility Dose Modeling Using MCNP5 and the CADIS/ADVANTG Variance-Reduction Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Blakeman, Edward D [ORNL; Peplow, Douglas E. [ORNL; Wagner, John C [ORNL; Murphy, Brian D [ORNL; Mueller, Don [ORNL

    2007-09-01

    The feasibility of modeling a pressurized-water-reactor (PWR) facility and calculating dose rates at all locations within the containment and adjoining structures using MCNP5 with mesh tallies is presented. Calculations of dose rates resulting from neutron and photon sources from the reactor (operating and shut down for various periods) and the spent fuel pool, as well as for the photon source from the primary coolant loop, were all of interest. Identification of the PWR facility, development of the MCNP-based model and automation of the run process, calculation of the various sources, and development of methods for visually examining mesh tally files and extracting dose rates were all a significant part of the project. Advanced variance reduction, which was required because of the size of the model and the large amount of shielding, was performed via the CADIS/ADVANTG approach. This methodology uses an automatically generated three-dimensional discrete ordinates model to calculate adjoint fluxes from which MCNP weight windows and source bias parameters are generated. Investigative calculations were performed using a simple block model and a simplified full-scale model of the PWR containment, in which the adjoint source was placed in various regions. In general, it was shown that placement of the adjoint source on the periphery of the model provided adequate results for regions reasonably close to the source (e.g., within the containment structure for the reactor source). A modification to the CADIS/ADVANTG methodology was also studied in which a global adjoint source is weighted by the reciprocal of the dose response calculated by an earlier forward discrete ordinates calculation. This method showed improved results over those using the standard CADIS/ADVANTG approach, and its further investigation is recommended for future efforts.
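The CADIS weighting principle the abstract refers to can be illustrated numerically. The cell importances and source distribution below are invented; the point is only that biasing the source by the adjoint flux makes birth weight times importance constant, which is what flattens the variance of the detector-response estimate:

```python
# Toy numeric sketch of the CADIS idea (not the ORNL implementation).
phi_adj = [0.01, 0.1, 1.0, 10.0]      # adjoint flux (importance) per cell
source  = [0.70, 0.20, 0.10, 0.00]    # true (unbiased) source distribution

# Estimated response R = sum_i q_i * phi_adj_i, used as the normalization.
response = sum(q * p for q, p in zip(source, phi_adj))

# Biased sampling distribution q_hat_i proportional to q_i * phi_adj_i.
biased = [q * p / response for q, p in zip(source, phi_adj)]

# A particle born in cell i carries statistical weight q_i / q_hat_i = R / phi_adj_i.
weights = [q / b if b > 0 else 0.0 for q, b in zip(source, biased)]

# Check: weight x importance equals R in every cell that can be sampled.
print([round(w * p, 6) for w, p in zip(weights, phi_adj)])
```

In the real workflow the adjoint fluxes come from the discrete ordinates calculation, and the same quantities also seed the MCNP weight-window lower bounds.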

  10. Modeling companion diagnostics in economic evaluations of targeted oncology therapies: systematic review and methodological checklist.

    Science.gov (United States)

    Doble, Brett; Tan, Marcus; Harris, Anthony; Lorgelly, Paula

    2015-02-01

The successful use of a targeted therapy is intrinsically linked to the ability of a companion diagnostic to correctly identify the patients most likely to benefit from treatment. The aim of this study was to review the characteristics of companion diagnostics that are important to include in an economic evaluation. Approaches for including these characteristics in model-based economic evaluations are compared with the intent of describing best-practice methods. Five databases and government agency websites were searched to identify model-based economic evaluations comparing a companion diagnostic and subsequent treatment strategy to another alternative treatment strategy with model parameters for the sensitivity and specificity of the companion diagnostic (primary synthesis). Economic evaluations that limited model parameters for the companion diagnostic to only its cost were also identified (secondary synthesis). Quality was assessed using the Quality of Health Economic Studies instrument. 30 studies were included in the review (primary synthesis n = 12; secondary synthesis n = 18). Incremental cost-effectiveness ratios may be lower when the only parameter for the companion diagnostic included in a model is the cost of testing. Incorporating the test's accuracy in addition to its cost may be a more appropriate methodological approach. Altering the prevalence of the genetic biomarker, the specific population tested, the type of test, test accuracy and the timing/sequence of multiple tests can all impact overall model results. The impact of altering a test's threshold for positivity is unknown, as it was not addressed in any of the included studies. Additional quality criteria as outlined in our methodological checklist should be considered due to the shortcomings of standard quality assessment tools in differentiating studies that incorporate important test-related characteristics from those that do not. There is a need to refine methods for incorporating the characteristics
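As a hedged illustration of the review's central finding, the toy decision tree below (all costs, QALYs, prevalence and accuracy values invented) shows how folding sensitivity and specificity into the model shifts the ICER relative to a cost-only treatment of the test:

```python
# Invented inputs for a test-and-treat vs. standard-care comparison.
prev, sens, spec = 0.3, 0.95, 0.90            # biomarker prevalence, test accuracy
cost_test, cost_tx, cost_std = 500.0, 40_000.0, 10_000.0
q_gain = 0.2                                  # QALY gain for treated true positives

def icer(sens, spec):
    tp = prev * sens                          # true positives: treated, benefit
    fp = (1 - prev) * (1 - spec)              # false positives: treated, no benefit
    treated = tp + fp
    # Incremental cost vs. standard care for everyone; only TPs gain QALYs.
    d_cost = cost_test + treated * (cost_tx - cost_std)
    d_qaly = tp * q_gain
    return d_cost / d_qaly

icer_cost_only = icer(1.0, 1.0)               # "perfect test": only its cost counts
icer_with_accuracy = icer(sens, spec)
print(round(icer_cost_only), round(icer_with_accuracy))
```

With these invented numbers the cost-only model understates the ICER, consistent with the review's observation that ratios "may be lower" when accuracy is ignored.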

  11. PWR Facility Dose Modeling Using MCNP5 and the CADIS/ADVANTG Variance-Reduction Methodology

    International Nuclear Information System (INIS)

    Blakeman, Edward D.; Peplow, Douglas E.; Wagner, John C.; Murphy, Brian D.; Mueller, Don

    2007-01-01

    The feasibility of modeling a pressurized-water-reactor (PWR) facility and calculating dose rates at all locations within the containment and adjoining structures using MCNP5 with mesh tallies is presented. Calculations of dose rates resulting from neutron and photon sources from the reactor (operating and shut down for various periods) and the spent fuel pool, as well as for the photon source from the primary coolant loop, were all of interest. Identification of the PWR facility, development of the MCNP-based model and automation of the run process, calculation of the various sources, and development of methods for visually examining mesh tally files and extracting dose rates were all a significant part of the project. Advanced variance reduction, which was required because of the size of the model and the large amount of shielding, was performed via the CADIS/ADVANTG approach. This methodology uses an automatically generated three-dimensional discrete ordinates model to calculate adjoint fluxes from which MCNP weight windows and source bias parameters are generated. Investigative calculations were performed using a simple block model and a simplified full-scale model of the PWR containment, in which the adjoint source was placed in various regions. In general, it was shown that placement of the adjoint source on the periphery of the model provided adequate results for regions reasonably close to the source (e.g., within the containment structure for the reactor source). A modification to the CADIS/ADVANTG methodology was also studied in which a global adjoint source is weighted by the reciprocal of the dose response calculated by an earlier forward discrete ordinates calculation. This method showed improved results over those using the standard CADIS/ADVANTG approach, and its further investigation is recommended for future efforts

  12. Methodology for geometric modelling. Presentation and administration of site descriptive models; Metodik foer geometrisk modellering. Presentation och administration av platsbeskrivande modeller

    Energy Technology Data Exchange (ETDEWEB)

    Munier, Raymond [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hermanson, Jan [Golder Associates (Sweden)

    2001-03-01

This report presents a methodology to construct, visualise and present geoscientific descriptive models based on data from the site investigations which SKB currently performs in order to build an underground nuclear waste disposal facility in Sweden. It is designed for interaction with SICADA (SKB's site characterisation database) and RVS (SKB's Rock Visualisation System). However, the concepts of the methodology are general and can be used with other tools capable of handling 3D geometries and parameters. The descriptive model is intended to be an instrument where site investigation data from all disciplines are put together to form a comprehensive visual interpretation of the studied rock mass. The methodology has four main components: 1. Construction of a geometrical model of the interpreted main structures at the site. 2. Description of the geoscientific characteristics of the structures. 3. Description and geometrical implementation of the geometric uncertainties in the interpreted model structures. 4. A quality system for the handling of the geometrical model, its associated database and some aspects of the technical auditing. The geometrical model forms a basis for understanding the main elements and structures of the investigated site. Once the interpreted geometries are in place in the model, the system allows descriptive and quantitative data to be added to each modelled object through a system of intuitive menus. The associated database allows each geometrical object a complete quantitative description across all geoscientific disciplines, including variabilities, uncertainties in interpretation and a full version history. The complete geometrical model and its associated database of object descriptions are recorded in a central quality system. Official, new and old versions of the model are administered centrally in order to have complete quality assurance of each step in the interpretation process. The descriptive model is a cornerstone in the understanding of the

  13. StellaR: A software to translate Stella models into R open-source environment

    NARCIS (Netherlands)

    Naimi, N.; Voinov, A.

    2012-01-01

    Stella is a popular system dynamics modeling tool, which helps to put together conceptual diagrams and converts them into numeric computer models. Although it can be very useful, especially in participatory modeling, it lacks the power and flexibility of a programming language. This paper presents

  14. Animal Models and Bone Histomorphometry: Translational Research for the Human Research Program

    Science.gov (United States)

    Sibonga, Jean D.

    2010-01-01

    This slide presentation reviews the use of animal models to research and inform bone morphology, in particular relating to human research on bone loss resulting from low-gravity environments. Reasons for using animal models as tools for human research programs include: they are time-efficient and cost-effective, they permit invasive measures, and they offer predictability, as some models are predictive of drug effects.

  15. Translation and Adaptation of Tests: Lessons Learned and Recommendations for Countries Participating in timss, pisa and other International Comparisons

    Directory of Open Access Journals (Sweden)

    Guillermo Solano-Flores

    2006-11-01

    In this paper we present a conceptual model and methodology for the review of translated tests in the context of such international comparisons as the Trends in International Mathematics and Science Study (TIMSS) and the Programme for International Student Assessment (PISA). We also present the results of an investigation into the quality of the Mexican Spanish translation of TIMSS-1995. We identified translation errors in a significant percentage of the items, as well as relatively high correlations between the severity of translation errors and the items' p-values. These findings indicate that our error-coding system is highly sensitive to test-translation error. The results underscore the need for improved translation and translation-review procedures in international comparisons. In our opinion, to implement properly the guidelines for test translation in international comparisons, each participating country needs internal procedures that ensure a rigorous review of its own translations. The article concludes with four recommendations for countries participating in international comparisons. These recommendations relate to: (a) the characteristics of the individuals in charge of translating instruments; (b) the use of review, not simply at the end of the process, but during the process of test translation; (c) the minimum time needed for various translation review iterations to take place; and (d) the need for proper documentation of the entire process of test translation.
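The reported correlation between error severity and item p-values can be reproduced in miniature. The severity codes and p-values below are invented; only the computation is illustrative:

```python
# Pearson correlation between coded translation-error severity and item
# p-values (proportion answering correctly), the kind of check the study ran.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

severity = [0, 1, 1, 2, 3, 0, 2, 3]                          # hypothetical codes
p_value  = [0.82, 0.74, 0.70, 0.55, 0.34, 0.79, 0.51, 0.40]  # item p-values
r = pearson(severity, p_value)
print(round(r, 2))  # strongly negative: severe errors, harder items
```

A strong negative r is what one would expect if translation errors make items artificially difficult in the target language.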

  16. Complex methodology of the model elaboration of the quantified transnationalization process assessment

    Directory of Open Access Journals (Sweden)

    Larysa Rudenko-Sudarieva

    2009-03-01

    The article examines the theoretical foundations of transnationalization and the peculiarities of its development, based on a study of world theory and practice; proposes a systematic methodological approach to defining the economic category of "transnationalization", together with the author's own definition; develops a complex methodology for building a model of quantified assessment of the transnationalization process, based on a seven-milestone algorithm for the formation of key indicators; and systematizes and synthesizes empirical investigations of the state and tendencies of transnationalization, including a comparative analysis of transnationalization levels within separate groups of TNCs.

  17. 3CE Methodology for Conducting a Modeling, Simulation, and Instrumentation Tool Capability Analysis

    Science.gov (United States)

    2010-05-01

    [Leading text garbled in extraction.] The methodology supports analysis of a modeling, simulation, and instrumentation (MS&I) environment. It uses the DoDAF product set to document operational and systems views. Issues in the engineering process were identified and resolved, such as duplication of data elements derived from DoDAF operational and system views.

  18. Do Methodological Choices in Environmental Modeling Bias Rebound Effects? A Case Study on Electric Cars.

    Science.gov (United States)

    Font Vivanco, David; Tukker, Arnold; Kemp, René

    2016-10-18

    Improvements in resource efficiency often underperform because of rebound effects. Calculations of the size of rebound effects are subject to various types of bias, among which methodological choices have received particular attention. Modellers have primarily focused on choices related to changes in demand; choices related to modeling the environmental burdens from such changes have received less attention. In this study, we analyze choices in the environmental assessment methods (life cycle assessment (LCA) and hybrid LCA) and environmental input-output databases (E3IOT, Exiobase and WIOD) used as a source of bias. The analysis is done for a case study on battery electric and hydrogen cars in Europe. The results describe moderate rebound effects for both technologies in the short term. Additionally, long-run scenarios are calculated by simulating the total cost of ownership; these describe notable rebound effect sizes under favorable economic conditions, from 26 to 59% and from 18 to 28%, respectively, depending on the methodological choices. Relevant sources of bias are found to be related to incomplete background systems, technology assumptions and sectorial aggregation. These findings highlight the importance of the method setup and of sensitivity analyses of choices related to environmental modeling in rebound effect assessments.
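The rebound-effect arithmetic underlying figures such as "26 to 59%" is simple; a minimal sketch with invented numbers:

```python
def rebound(expected_saving, actual_saving):
    """Fraction of the expected environmental saving lost to induced demand.

    Both arguments in the same unit, e.g. kg CO2-eq per year.
    """
    return 1 - actual_saving / expected_saving

# An efficiency measure expected to save 1000 kg CO2-eq/yr delivers only 740
# once cheaper usage induces extra demand -> a 26% rebound.
print(rebound(1000, 740))
```

The study's point is that the *actual_saving* term depends on how the environmental burden of re-spent income is modeled, so the choice of LCA method and input-output database propagates directly into the rebound estimate.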

  19. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A., E-mail: adlyamr@gmail.com [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt); Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)

    2017-07-15

    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing an ensemble of triangular sub-region hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended to 3D tetrahedral sub-volumes. - Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on applying the integral equation approach to discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.
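The paper's HNN construction is not reproduced here, but the hysteresis behaviour such models capture can be illustrated with a minimal ensemble of relay operators (a Preisach-flavoured sketch, not the authors' method):

```python
class Relay:
    """Elementary hysteresis operator with switch-up/-down thresholds."""
    def __init__(self, alpha, beta):
        self.alpha, self.beta, self.state = alpha, beta, -1
    def __call__(self, h):
        if h >= self.alpha:
            self.state = 1
        elif h <= self.beta:
            self.state = -1
        return self.state  # otherwise: keep the previous state (memory)

# Four symmetric relays; their average plays the role of magnetization.
relays = [Relay(a, -a) for a in (0.2, 0.4, 0.6, 0.8)]

def magnetization(h):
    return sum(r(h) for r in relays) / len(relays)

up   = [magnetization(h) for h in (-1.0, -0.5, 0.0, 0.5, 1.0)]
down = [magnetization(h) for h in (1.0, 0.5, 0.0, -0.5, -1.0)]
print(up[2], down[2])  # different output at h = 0: the loop has memory
```

In the paper's setting, an ensemble of such sub-region models is what lets the field computation keep track of local hysteresis losses.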

  20. A methodology for assessing the market benefits of alternative motor fuels: The Alternative Fuels Trade Model

    Energy Technology Data Exchange (ETDEWEB)

    Leiby, P.N.

    1993-09-01

    This report describes a modeling methodology for examining the prospective economic benefits of displacing motor gasoline use by alternative fuels. The approach is based on the Alternative Fuels Trade Model (AFTM). AFTM development was undertaken by the US Department of Energy (DOE) as part of a longer term study of alternative fuels issues. The AFTM is intended to assist with evaluating how alternative fuels may be promoted effectively, and what the consequences of substantial alternative fuels use might be. Such an evaluation of policies and consequences of an alternative fuels program is being undertaken by DOE as required by Section 502(b) of the Energy Policy Act of 1992. Interest in alternative fuels is based on the prospective economic, environmental and energy security benefits from the substitution of these fuels for conventional transportation fuels. The transportation sector is heavily dependent on oil. Increased oil use implies increased petroleum imports, with much of the increase coming from OPEC countries. Conversely, displacement of gasoline has the potential to reduce US petroleum imports, thereby reducing reliance on OPEC oil and possibly weakening OPEC's ability to extract monopoly profits. The magnitude of US petroleum import reduction, the attendant fuel price changes, and the resulting US benefits, depend upon the nature of oil-gas substitution and the supply and demand behavior of other world regions. The methodology applies an integrated model of fuel market interactions to characterize these effects.

  1. Methodological aspects of modeling household solid waste generation in Japan: Evidence from Okayama and Otsu cities.

    Science.gov (United States)

    Gu, Binxian; Fujiwara, Takeshi; Jia, Renfu; Duan, Ruiyang; Gu, Aijun

    2017-12-01

    This paper presents a quantitative methodology and two empirical case studies in Japan on modeling household solid waste (HSW) generation based on individual consumption expenditure (ICE) and local waste policy effects by using the coupled estimation model systems. Results indicate that ICE on food, miscellaneous commodities and services, as well as education, cultural, and recreation services are mainly associated with the changes of HSW generation and its components in Okayama and Otsu from 1980 to 2014. The effects of waste policy measures were also identified. HSW generation in Okayama will increase from 11.60 million tons (mt) in 1980 to 25.02 mt in 2025, and the corresponding figures are 6.82 mt (in 1980) and 14.00 mt (in 2025) in Otsu. To better manage local HSW, several possible and appropriate implications such as promoting a green lifestyle, extending producer responsibility, intensifying recycling and source separation, generalizing composting, and establishing flexible measures and sustainable policies should be adopted. Results of this study would facilitate consumer management of low waste generation and support an effective HSW policy design in the two case cities. Success could lead to emulation by other Japanese cities seeking to build and maintain a sustainable, eco-friendly society. Moreover, the methodologies of establishing coupled estimation model systems could be extended to China and other global cities.
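The coupling of waste generation to consumption expenditure rests on regression models; a minimal single-predictor sketch with invented data (not the paper's coupled estimation system):

```python
# Invented data: ICE (consumption expenditure) vs. HSW generation, arbitrary units.
ice   = [100, 120, 140, 160, 180]
waste = [41, 48, 57, 63, 72]

# Ordinary least squares for a single predictor, in closed form.
n = len(ice)
mean_x, mean_y = sum(ice) / n, sum(waste) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ice, waste))
         / sum((x - mean_x) ** 2 for x in ice))
intercept = mean_y - slope * mean_x

def predict(x):
    """Predicted HSW generation at expenditure level x."""
    return intercept + slope * x

print(round(slope, 3), round(intercept, 3))
```

The paper's system additionally layers policy-effect terms and multiple expenditure categories on top of this basic expenditure-to-waste relationship.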

  2. Translational Pharmacokinetic‐Pharmacodynamic Modeling and Simulation: Optimizing 5‐Fluorouracil Dosing in Children With Pediatric Ependymoma

    Science.gov (United States)

    Daryani, VM; Patel, YT; Tagen, M; Turner, DC; Carcaboso, AM; Atkinson, JM; Gajjar, A; Gilbertson, RJ; Wright, KD

    2016-01-01

    We previously investigated novel therapies for pediatric ependymoma and found 5‐fluorouracil (5‐FU) i.v. bolus increased survival in a representative mouse model. However, without a quantitative framework to derive clinical dosing recommendations, we devised a translational pharmacokinetic‐pharmacodynamic (PK‐PD) modeling and simulation approach. Results from our preclinical PK‐PD model suggested tumor concentrations exceeded the 1‐hour target exposure (in vitro IC90), leading to tumor growth delay and increased survival. Using an adult population PK model, we scaled our preclinical PK‐PD model to children. To select a 5‐FU dosage for our clinical trial in children with ependymoma, we simulated various 5‐FU dosages for tumor exposures and tumor growth inhibition, as well as considering tolerability to bolus 5‐FU administration. We developed a pediatric population PK model of bolus 5‐FU and simulated tumor exposures for our patients. Simulations for tumor concentrations indicated that all patients would be above the 1‐hour target exposure for antitumor effect. PMID:27104090
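A hedged sketch of the simulation step: a one-compartment i.v. bolus model and a check of time above a target concentration. All parameter values are invented and are not the study's PK-PD model:

```python
import math

def conc(t, dose, v, cl):
    """Concentration after an i.v. bolus: C(t) = (dose/V) * exp(-(CL/V) * t)."""
    return dose / v * math.exp(-cl / v * t)

dose, v, cl = 500.0, 25.0, 10.0     # mg, L, L/h; illustrative values only
target = 5.0                        # mg/L, stand-in for an IC90-derived target

# Crude numeric check of time spent above target over 5 h (0.1 h grid).
times = [i / 10 for i in range(0, 51)]
time_above = sum(0.1 for t in times if conc(t, dose, v, cl) >= target)
print(round(time_above, 1))
```

In the study's workflow, simulating such profiles for candidate dosages against a 1-hour target exposure is what supported the pediatric dose selection.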

  3. Science for education: a new model of translational research applied to education

    Directory of Open Access Journals (Sweden)

    Roberto Lent

    2017-07-01

    A major advance at the turn of the century has been the consolidation of the concept of translational research, applied successfully in Health and Engineering in practically all countries of medium and high GDP. Intriguingly, this has not occurred with Education. It is not yet widely perceived that science can already explain how people learn, which mechanisms accelerate learning and teaching, and how this would impact the economy and the social progress of nations. Nor is it perceived that innovations can be validated with population studies to rationalize and scale novel teaching initiatives, or which socioemotional competences future citizens will need to work in increasingly automatized and informatized companies. Perhaps because of this omission, the progress of Brazilian educational indicators has been modest. In Health, public policies invest not only in material improvements (sanitation, hospital attendance, nutritional coverage, etc.) but also in the science and innovation capable of creating new options in the international scenario (therapies for degenerative diseases, vaccines for infectious diseases, etc.). In Education, by contrast, investment has focused exclusively on material improvements (more schools, better salaries for teachers, etc.), necessary but insufficient to accelerate the growth of our indicators at faster and more competitive rates. This scenario opens a window of opportunity to create a new science policy aimed at Education. To give concreteness to this possibility, the proposal under discussion is that new initiatives of support and funding by public and private agencies should have Science for Education as their structuring axis.

  4. Translation Competence

    DEFF Research Database (Denmark)

    Vandepitte, Sonia; Mousten, Birthe; Maylath, Bruce

    2014-01-01

    After Kiraly (2000) introduced the collaborative form of translation in classrooms, Pavlovic (2007), Kenny (2008), and Huertas Barros (2011) provided empirical evidence that testifies to the impact of collaborative learning. This chapter sets out to describe the collaborative forms of learning at...

  5. Translating Harbourscapes

    DEFF Research Database (Denmark)

    Diedrich, Lisa Babette

    -specific design are proposed for all actors involved in harbour transformation. The study ends with an invitation to further investigate translation as a powerful metaphor for the way existing qualities of a site can be transformed, rather than erased or rewritten, and to explore how this metaphor can foster new...

  6. Translation Meets Cognitive Science: The Imprint of Translation on Cognitive Processing

    Science.gov (United States)

    Rojo, Ana

    2015-01-01

    Translation has long played a role in linguistic and literary studies research. More recently, the theoretical and methodological concerns of process research have given translation an additional role in cognitive science. The interest in the cognitive aspects of translation has led scholars to turn to disciplines such as cognitive linguistics,…

  7. A new methodology for modelling of health risk from urban flooding exemplified by cholera

    DEFF Research Database (Denmark)

    Mark, Ole; Jørgensen, Claus; Hammond, Michael

    2016-01-01

    At present, there are no software tools capable of combining hydrodynamic modelling and health risk analyses, and the links between urban flooding and the health risk for the population due to direct contact with the flood water are poorly understood. The present paper outlines a novel methodology for linking dynamic urban flood modelling with quantitative microbial risk assessment (QMRA). This provides a unique possibility for understanding the interaction between urban flooding and the health risk caused by direct human contact with the flood water, and hence gives an option for reducing the burden of disease in the population by use of intelligent urban flood risk management. The model linking urban flooding and health risk is applied to Dhaka City in Bangladesh, where waterborne diseases including cholera are endemic and cause mortality, especially during floods. The application to Dhaka City is supported
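The QMRA step typically passes an ingested dose through a dose-response model; a sketch using the approximate beta-Poisson form with illustrative (not study-specific) parameters:

```python
def beta_poisson(dose, alpha, n50):
    """Approximate beta-Poisson probability of infection for an ingested dose.

    P = 1 - (1 + (dose/N50) * (2**(1/alpha) - 1))**(-alpha)
    """
    return 1 - (1 + dose / n50 * (2 ** (1 / alpha) - 1)) ** -alpha

# Hypothetical exposure: 30 mL of flood water ingested at 10 organisms/mL.
dose = 30 * 10
p_inf = beta_poisson(dose, alpha=0.25, n50=243)
print(round(p_inf, 2))
```

Coupling such a per-exposure probability to flood-model outputs (who contacts how much water, with what pathogen concentration) is the linkage the paper's methodology formalizes.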

  8. METHODOLOGY FOR THE ESTIMATION OF PARAMETERS, OF THE MODIFIED BOUC-WEN MODEL

    Directory of Open Access Journals (Sweden)

    Tomasz HANISZEWSKI

    2015-03-01

    The Bouc-Wen model is a theoretical formulation that can reproduce the real hysteresis loop of a modeled object. Such an object is, for example, a wire rope of the kind present in crane lifting mechanisms. The modified version of the model adopted here has nine parameters, and determining such a number of parameters is a complex and problematic issue. This article presents a methodology for their identification, together with sample results of numerical simulations. The results were compared with data obtained from laboratory tests of ropes [3]; on this basis it was found that the results agree and that the model can be applied to dynamic systems containing wire ropes in their structure [4].
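The standard (non-modified) Bouc-Wen equation can be integrated in a few lines to exhibit the hysteresis loop the abstract mentions; the parameter values below are invented:

```python
import math

# Standard Bouc-Wen hysteretic variable z driven by displacement x(t) = sin(t):
#   dz/dt = A*dx/dt - beta*|dx/dt|*|z|**(n-1)*z - gamma*(dx/dt)*|z|**n
A, beta, gamma, n = 1.0, 0.5, 0.5, 1.0      # invented parameter values
dt = 1e-3
z = 0.0
xs, zs = [], []
for k in range(int(2 * math.pi / dt)):      # one full loading/unloading cycle
    t = k * dt
    dx = math.cos(t) * dt                   # increment of x = sin(t)
    z += (A * dx
          - beta * abs(dx) * abs(z) ** (n - 1) * z
          - gamma * dx * abs(z) ** n)
    xs.append(math.sin(t))
    zs.append(z)

# History dependence: z takes different values on the descending and the
# ascending pass through x = 0, tracing a hysteresis loop in the (x, z) plane.
```

The modified nine-parameter variant the paper identifies adds further terms, but the identification problem (fitting parameters so the simulated loop matches measured rope data) has the same structure.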

  9. A Consistent Methodology Based Parameter Estimation for a Lactic Acid Bacteria Fermentation Model

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2017-01-01

    Lactic acid bacteria are used in many industrial applications, e.g. as starter cultures in the dairy industry or as probiotics, and research on their cell production is highly required. A first-principles kinetic model was developed to describe and understand the biological, physical, and chemical mechanisms in a lactic acid bacteria fermentation. We present here a consistent approach to methodology-based parameter estimation for a lactic acid fermentation. In the beginning, just an initial knowledge-based guess of the parameters was available, and an initial estimation of the complete set of parameters was performed in order to obtain a good model fit to the data. However, not all parameters are identifiable with the given data set and model structure. Sensitivity, identifiability, and uncertainty analyses were completed, and a relevant identifiable subset of parameters was determined for a new
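The sensitivity step mentioned above can be sketched on a stand-in model (a logistic biomass curve, not the paper's fermentation model), using central finite differences:

```python
import math

def biomass(t, mu_max, x_max, x0=0.1):
    """Logistic growth: biomass at time t for growth rate mu_max, capacity x_max."""
    return x_max / (1 + (x_max / x0 - 1) * math.exp(-mu_max * t))

def sensitivity(t, params, name, h=1e-6):
    """Central finite-difference d(biomass)/d(param), scaled by the nominal value."""
    up, down = dict(params), dict(params)
    up[name] += h
    down[name] -= h
    d = (biomass(t, **up) - biomass(t, **down)) / (2 * h)
    return d * params[name]

params = {"mu_max": 0.8, "x_max": 5.0}
s_mu = sensitivity(10.0, params, "mu_max")
s_xm = sensitivity(10.0, params, "x_max")
# Late in the fermentation the capacity parameter dominates; a low-sensitivity
# parameter is a candidate for exclusion from the identifiable subset.
```

Ranking parameters by such (scaled) sensitivities is one common way to pick the identifiable subset that the abstract describes re-estimating.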

  10. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    Science.gov (United States)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  11. CALS and the Product State Model - Methodology and Supporting Schools and Paradigms

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1998-01-01

    This paper addresses the preliminary considerations of a research project, initiated in February 1997, regarding Continuous Acquisition and Life-cycle Support (CALS), which is part of the activities of CALS Center Denmark. The CALS concept is presented with a focus on the Product State Model (PSM). The PSM incorporates relevant information about each stage of the production process. The paper describes the research object and the model object, and discusses part of the methodology for developing a Product State Model. The project is primarily technological; however, organisational and human aspects will also be discussed, as will the parameters for evaluating the PSM. In establishing the theoretical body of knowledge with respect to CALS, an identification of schools and paradigms within the research area of applying information technology in a manufacturing environment

  12. Biomedical informatics and translational medicine

    Directory of Open Access Journals (Sweden)

    Sarkar Indra

    2010-02-01

    Biomedical informatics involves a core set of methodologies that can provide a foundation for crossing the "translational barriers" associated with translational medicine. To this end, the fundamental aspects of biomedical informatics (e.g., bioinformatics, imaging informatics, clinical informatics, and public health informatics) may be essential in helping improve the ability to bring basic research findings to the bedside, evaluate the efficacy of interventions across communities, and enable the assessment of the eventual impact of translational medicine innovations on health policies. Here, a brief description is provided for a selection of key biomedical informatics topics (Decision Support, Natural Language Processing, Standards, Information Retrieval, and Electronic Health Records) and their relevance to translational medicine. Based on contributions and advancements in each of these topic areas, the article proposes that biomedical informatics practitioners ("biomedical informaticians") can be essential members of translational medicine teams.

  13. Word translation entropy in translation

    DEFF Research Database (Denmark)

    Schaeffer, Moritz; Dragsted, Barbara; Hvelplund, Kristian Tangsgaard

    2016-01-01

    This study reports on an investigation into the relationship between the number of translation alternatives for a single word and eye movements on the source text. In addition, the effect of word order differences between source and target text on eye movements on the source text is studied. In particular, the current study investigates the effect of these variables on early and late eye movement measures. Early eye movement measures are indicative of processes that are more automatic, while late measures are more indicative of conscious processing. Most studies that found evidence of target language activation during source text reading in translation, i.e. co-activation of the two linguistic systems, employed late eye movement measures or reaction times. The current study therefore aims to investigate if and to what extent earlier eye movement measures in reading for translation show …

  14. The epistemology of mathematical and statistical modeling: a quiet methodological revolution.

    Science.gov (United States)

    Rodgers, Joseph Lee

    2010-01-01

    A quiet methodological revolution, a modeling revolution, has occurred over the past several decades, almost without discussion. In contrast, the 20th century ended with contentious argument over the utility of null hypothesis significance testing (NHST). The NHST controversy may have been at least partially irrelevant, because in certain ways the modeling revolution obviated the NHST argument. I begin with a history of NHST and modeling and their relation to one another. Next, I define and illustrate principles involved in developing and evaluating mathematical models. Then I discuss the difference between using statistical procedures within a rule-based framework and building mathematical models from a scientific epistemology. Only the former is treated carefully in most psychology graduate training. The pedagogical implications of this imbalance and the revised pedagogy required to account for the modeling revolution are described. To conclude, I discuss how attention to modeling implies shifting statistical practice in certain progressive ways. The epistemological basis of statistics has moved away from being a set of procedures, applied mechanistically, and moved toward building and evaluating statistical and scientific models. Copyright 2009 APA, all rights reserved.

  15. Topical Review: Translating Translational Research in Behavioral Science.

    Science.gov (United States)

    Hommel, Kevin A; Modi, Avani C; Piazza-Waggoner, Carrie; Myers, James D

    2015-01-01

    To present a model of translational research for behavioral science that communicates the role of behavioral research at each phase of translation. A task force identified gaps in knowledge regarding behavioral translational research processes and made recommendations regarding advancement of knowledge. A comprehensive model of translational behavioral research was developed. This model represents T1, T2, and T3 research activities, as well as Phase 1, 2, 3, and 4 clinical trials. Clinical illustrations of translational processes are also offered as support for the model. Behavioral science has struggled with defining a translational research model that effectively articulates each stage of translation and complements biomedical research. Our model defines key activities at each phase of translation from basic discovery to dissemination/implementation. This should be a starting point for communicating the role of behavioral science in translational research and a catalyst for better integration of biomedical and behavioral research. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. A MAINTENANCE STRATEGY MODEL FOR STATIC EQUIPMENT USING INSPECTION METHODOLOGIES AND RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    J.K. Visser

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Mechanical equipment used on process plants can be categorised into two main types, namely static and rotating equipment. A brief survey at a number of chemical process plants indicated that a number of maintenance strategies exist and are used for rotating equipment. However, some of these strategies are not directly applicable to static equipment, although the risk-based inspection (RBI) methodology has been developed for pressure vessels. A generalised risk-based maintenance strategy for all types of static equipment does not currently exist. This paper describes the development of an optimised model of inspection methodologies, maintenance strategies, and risk management principles that are generically applicable to static equipment. It enables maintenance managers and engineers to select an applicable maintenance strategy and inspection methodology, based on the operational and business risks posed by the individual pieces of equipment.

    AFRIKAANSE OPSOMMING (English translation): Mechanical equipment used on process plants can be divided into two categories, namely static and rotating equipment. A brief survey at a number of chemical process plants indicated that a number of strategies are used for the maintenance of rotating equipment, while the risk-based inspection methodology is indeed used for pressure vessels. A general risk-based maintenance strategy for all types of static equipment is, however, not currently available. This article describes the development of an optimised model of inspection methodologies, maintenance strategies, and risk management principles that can be applied generally to static equipment. It enables maintenance managers and engineers to select a maintenance strategy and inspection methodology based on the operational and business risks of the individual equipment.
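The risk-based strategy selection described in this abstract can be sketched as a simple likelihood-consequence risk matrix. The scoring scale, the score bands, and the strategy names below are illustrative assumptions for the sketch, not values from the paper:

```python
# Illustrative risk-based maintenance strategy selection for static equipment.
# Likelihood and consequence are scored 1-5; risk = likelihood * consequence.
# The bands and strategy names are assumptions, not the paper's model.

def select_strategy(likelihood: int, consequence: int) -> str:
    """Map 1-5 likelihood and consequence scores to a maintenance strategy."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("scores must be in 1..5")
    risk = likelihood * consequence
    if risk >= 15:
        return "risk-based inspection (shortened interval)"
    if risk >= 8:
        return "scheduled inspection"
    return "run-to-failure / corrective"

print(select_strategy(4, 5))  # high operational and business risk
print(select_strategy(1, 2))  # low risk
```

The point of the sketch is only that, once operational and business risks are quantified per equipment item, strategy selection reduces to a lookup in the risk matrix.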

  17. Prototype methodology for obtaining cloud seeding guidance from HRRR model data

    Science.gov (United States)

    Dawson, N.; Blestrud, D.; Kunkel, M. L.; Waller, B.; Ceratto, J.

    2017-12-01

    Weather model data, along with real-time observations, are critical for determining whether atmospheric conditions are conducive to super-cooled liquid water during cloud seeding operations. Cloud seeding groups can either use operational forecast models or run their own model on a computer cluster. A custom weather model provides the most flexibility, but is also expensive. For programs with smaller budgets, openly available operational forecasting models are the de facto method for obtaining forecast data. The new High-Resolution Rapid Refresh (HRRR) model (3 x 3 km grid size), developed by the Earth System Research Laboratory (ESRL), provides hourly model runs with 18 forecast hours per run. While the model cannot be fine-tuned for a specific area or edited to provide cloud-seeding-specific output, model output is openly available on a near-real-time basis. This presentation focuses on a prototype methodology for using HRRR model data to create maps that aid in near-real-time cloud seeding decision making. The R programming language is used to run a script on a Windows® desktop/laptop computer, either on a schedule (such as every half hour) or manually. The latest HRRR model run is downloaded from NOAA's Operational Model Archive and Distribution System (NOMADS). A GRIB-filter service, provided by NOMADS, is used to obtain surface and mandatory pressure level data for a subset domain, which greatly cuts down on the amount of data transfer. Then a set of criteria, identified by the Idaho Power Atmospheric Science Group, is used to create guidance maps. These criteria include atmospheric stability (lapse rates), dew point depression, air temperature, and wet bulb temperature. The maps highlight potential areas where super-cooled liquid water may exist, reasons why cloud seeding should not be attempted, and wind speed at flight level.
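The data-retrieval step can be sketched as assembling a parameterised GRIB-filter URL for a spatial subset of one HRRR file. The endpoint path, file-name pattern, and parameter names below follow the common NOMADS filter convention but are assumptions to verify against the live NOMADS pages (the abstract's actual implementation is an R script, not this Python sketch):

```python
# Build a NOMADS GRIB-filter URL for a subset of an HRRR surface file.
# Endpoint, file-name pattern, and parameter names are assumptions based on
# the usual NOMADS filter convention; verify against the live NOMADS site.
from urllib.parse import urlencode

def hrrr_filter_url(cycle: str, fhour: int, lon_min: float, lon_max: float,
                    lat_min: float, lat_max: float, datedir: str) -> str:
    base = "https://nomads.ncep.noaa.gov/cgi-bin/filter_hrrr_2d.pl"
    params = {
        "file": f"hrrr.t{cycle}z.wrfsfcf{fhour:02d}.grib2",
        "var_TMP": "on",              # air temperature
        "var_DPT": "on",              # dew point, for dew point depression
        "lev_2_m_above_ground": "on",
        "subregion": "",              # request only the subset domain
        "leftlon": lon_min, "rightlon": lon_max,
        "bottomlat": lat_min, "toplat": lat_max,
        "dir": datedir,               # e.g. a /hrrr.YYYYMMDD/conus directory
    }
    return base + "?" + urlencode(params)

# Hypothetical southern-Idaho subset, 00Z cycle, 6 h forecast:
url = hrrr_filter_url("00", 6, -117.5, -113.0, 42.5, 45.5, "/hrrr.20171201/conus")
print(url)
```

Requesting only the needed variables, levels, and subregion is what keeps the transfer small enough for a desktop script to run every model cycle.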

  18. Three-dimensional analytic probabilities of coupled vibrational-rotational-translational energy transfer for DSMC modeling of nonequilibrium flows

    International Nuclear Information System (INIS)

    Adamovich, Igor V.

    2014-01-01

    A three-dimensional, nonperturbative, semiclassical analytic model of vibrational energy transfer in collisions between a rotating diatomic molecule and an atom, and between two rotating diatomic molecules (the Forced Harmonic Oscillator–Free Rotation model), has been extended to incorporate rotational relaxation and coupling between vibrational, translational, and rotational energy transfer. The model is based on analysis of semiclassical trajectories of rotating molecules interacting by a repulsive exponential atom-to-atom potential. The model predictions are compared with the results of three-dimensional close-coupled semiclassical trajectory calculations using the same potential energy surface. The comparison demonstrates good agreement between analytic and numerical probabilities of rotational and vibrational energy transfer processes over a wide range of total collision energies, rotational energies, and impact parameters. The model predicts probabilities of single-quantum and multi-quantum vibrational-rotational transitions and is applicable up to very high collision energies and quantum numbers. Closed-form analytic expressions for these transition probabilities lend themselves to straightforward incorporation into DSMC nonequilibrium flow codes.
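For concreteness, the pairwise interaction named in the abstract, a repulsive exponential atom-to-atom potential, has the generic form below; the amplitude A and range parameter α are generic symbols, not parameter values from the paper:

```latex
V(r) = A\, e^{-\alpha r}, \qquad A > 0,\ \alpha > 0,
```

where r is the atom-to-atom distance. The semiclassical trajectories, and hence the analytic transition probabilities of the FHO-FR model, are computed for interactions of this form.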

  19. A new methodology for dynamic modelling of health risks arising from wastewater influenced urban flooding

    Science.gov (United States)

    Jørgensen, Claus; Mark, Ole; Djordjevic, Slobodan; Hammond, Michael; Khan, David M.; Erichsen, Anders; Dorrit Enevoldsen, Ann; Heinicke, Gerald; Helwigh, Birgitte

    2015-04-01

    Introduction: Urban flooding due to rainfall exceeding the design capacity of drainage systems is a global problem, and it has significant economic and social consequences. While the cost of the direct damages of urban flooding is well understood, the indirect damages, such as waterborne diseases, are in general still poorly understood. Climate change is expected to increase the frequency of urban flooding in many countries, which is likely to increase waterborne diseases. Diarrheal diseases are most prevalent in developing countries, where poor sanitation, poor drinking water and poor surface water quality cause a high disease burden and mortality, especially during floods. The level of waterborne diarrhea in countries with well-developed water and wastewater infrastructure has been reduced to an acceptable level, and the population in general does not consider wastewater to be a health risk. Nonetheless, exposure to wastewater-influenced urban flood water still has the potential to cause transmission of diarrheal diseases. When managing urban flooding and planning urban climate change adaptations, health risks are rarely taken into consideration. This paper outlines a novel methodology for linking dynamic urban flood modelling with Quantitative Microbial Risk Assessment (QMRA). This provides a unique possibility for understanding the interaction between urban flooding and the health risks caused by direct human contact with flood water, and provides an option for reducing the burden of disease in the population through intelligent urban flood risk management. Methodology: We have linked hydrodynamic urban flood modelling with quantitative microbial risk assessment (QMRA) to determine the risk of infection caused by exposure to wastewater-influenced urban flood water.
The deterministic model MIKE Flood, which integrates the sewer network model in MIKE Urban and the 2D surface model MIKE21, was used to calculate the concentration of pathogens in the
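The QMRA step that turns a modelled flood-water pathogen concentration into an infection probability is commonly an exponential dose-response model, P_inf = 1 − exp(−r·dose). The pathogen parameter r, the concentration, and the ingestion volume below are arbitrary placeholders for the sketch, not values from this study:

```python
# Exponential dose-response model used in QMRA: P_inf = 1 - exp(-r * dose).
# All numeric values here are illustrative placeholders, not study data.
import math

def infection_risk(conc_per_litre: float, ingested_litres: float, r: float) -> float:
    """Per-event infection probability from an exponential dose-response."""
    dose = conc_per_litre * ingested_litres  # expected organisms ingested
    return 1.0 - math.exp(-r * dose)

# Illustration: 1e4 organisms/L in flood water, 30 mL accidental ingestion,
# r chosen arbitrarily; dose = 300, r*dose = 1.5.
risk = infection_risk(1e4, 0.03, r=0.005)
print(f"per-event infection probability: {risk:.3f}")  # prints 0.777
```

Feeding the spatially varying concentrations from a hydrodynamic flood model into such a dose-response relation is what yields a map of infection risk rather than a single number.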

  20. A system-of-systems modeling methodology for strategic general aviation design decision-making

    Science.gov (United States)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system of systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As of yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision making environment through the construction of a system of systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented is the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. 
Additionally, the results indicate the ability to find synergistic solutions, that is, solutions in which two systems might collaborate to achieve a better result than acting

  1. Neuroinflammation in epileptogenesis: Insights and translational perspectives from new models of epilepsy.

    Science.gov (United States)

    Barker-Haliski, Melissa L; Löscher, Wolfgang; White, H Steve; Galanopoulou, Aristea S

    2017-07-01

    Animal models have provided a wealth of information on mechanisms of epileptogenesis and comorbidogenesis, and have significantly advanced our ability to investigate the potential of new therapies. Processes implicating brain inflammation have been increasingly observed in epilepsy research. Herein we discuss the progress on animal models of epilepsy and comorbidities that inform us on the potential role of inflammation in epileptogenesis and comorbidity pathogenesis in rodent models of West syndrome and the Theiler's murine encephalomyelitis virus (TMEV) mouse model of viral encephalitis-induced epilepsy. Rat models of infantile spasms were generated in rat pups after right intracerebral injections of proinflammatory compounds (lipopolysaccharides with or without doxorubicin, or cytokines) and were longitudinally monitored for epileptic spasms and neurodevelopmental and cognitive deficits. Anti-inflammatory treatments were tested after the onset of spasms. The TMEV mouse model was induced with intracerebral administration of TMEV and prospective monitoring for handling-induced seizures or seizure susceptibility, as well as long-term evaluations of behavioral comorbidities of epilepsy. Inflammatory processes are evident in both models and are implicated in the pathogenesis of the observed seizures and comorbidities. A common feature of these models, based on the data so far available, is their pharmacoresistant profile. The presented data support the role of inflammatory pathways in epileptogenesis and comorbidities in two distinct epilepsy models. Pharmacoresistance is a common feature of both inflammation-based models. Utilization of these models may facilitate the identification of age-specific, syndrome- or etiology-specific therapies for the epilepsies and attendant comorbidities, including the drug-resistant forms. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  2. Exaggerated Cap-Dependent Translation as a Mechanism for Corticostriatal Dysfunction in Fragile X Syndrome Model Mice

    Science.gov (United States)

    2016-10-01

    Gordon Research Conference "Fragile X and Autism-Related Disorders", Vermont, USA. 2) NCCR Synapsy "The Neurobiology of Mental Health" … and repetitive/perseverative behaviours displayed by FXS model mice are reversed by novel cap-dependent translation inhibitors … graduate student. Person months worked: 3 cal mos. Contribution to project: perform experiments and analyze data. Name: Jonathan

  3. Using Workflow Modeling to Identify Areas to Improve Genetic Test Processes in the University of Maryland Translational Pharmacogenomics Project.

    Science.gov (United States)

    Cutting, Elizabeth M; Overby, Casey L; Banchero, Meghan; Pollin, Toni; Kelemen, Mark; Shuldiner, Alan R; Beitelshees, Amber L

    Delivering genetic test results to clinicians is a complex process. It involves many actors and multiple steps, requiring all of these to work together in order to create an optimal course of treatment for the patient. We used information gained from focus groups in order to illustrate the current process of delivering genetic test results to clinicians. We propose a business process model and notation (BPMN) representation of this process for a Translational Pharmacogenomics Project being implemented at the University of Maryland Medical Center, so that personalized medicine program implementers can identify areas to improve genetic testing processes. We found that the current process could be improved to reduce input errors, better inform and notify clinicians about the implications of certain genetic tests, and make results more easily understood. We demonstrate our use of BPMN to improve this important clinical process for CYP2C19 genetic testing in patients undergoing invasive treatment of coronary heart disease.

  4. In Vivo Imaging Biomarkers in Mouse Models of Alzheimer's Disease: Are We Lost in Translation or Breaking Through?

    Directory of Open Access Journals (Sweden)

    Benoît Delatour

    2010-01-01

    Full Text Available Identification of biomarkers of Alzheimer's Disease (AD) is a critical priority to efficiently diagnose patients, to stage the progression of neurodegeneration in living subjects, and to assess the effects of disease-modifier treatments. This paper addresses the development and usefulness of preclinical neuroimaging biomarkers of AD. It is today possible to image in vivo the brain of small rodents at high resolution and to detect the occurrence of macroscopic/microscopic lesions in these species, as well as of functional alterations reminiscent of AD pathology. We will outline three different types of imaging biomarkers that can be used in AD mouse models: biomarkers with clear translational potential, biomarkers that can serve as in vivo readouts (in particular in the context of drug discovery) exclusively for preclinical research, and finally biomarkers that constitute new tools for fundamental research on AD physiopathogeny.

  5. Pattern-based Automatic Translation of Structured Power System Data to Functional Models for Decision Support Applications

    DEFF Research Database (Denmark)

    Heussen, Kai; Weckesser, Johannes Tilman Gabriel; Kullmann, Daniel

    2013-01-01

    Improved information and insight for decision support in operations and design are central promises of a smart grid. Well-structured information about the composition of power systems is increasingly becoming available in the domain, e.g. due to standard information models (e.g. CIM or IEC61850) or otherwise structured databases. More measurements and data do not automatically improve decisions, but there is an opportunity to capitalize on this information for decision support. With suitable reasoning strategies, data can be contextualized and decision-relevant events can be promoted and identified. This paper presents an approach to link available structured power system data directly to a functional representation suitable for diagnostic reasoning. The translation method is applied to test cases also illustrating decision support.

  6. Patient-Derived Xenograft Models : An Emerging Platform for Translational Cancer Research

    NARCIS (Netherlands)

    Hidalgo, Manuel; Amant, Frederic; Biankin, Andrew V.; Budinska, Eva; Byrne, Annette T.; Caldas, Carlos; Clarke, Robert B.; de Jong, Steven; Jonkers, Jos; Maelandsmo, Gunhild Mari; Roman-Roman, Sergio; Seoane, Joan; Trusolino, Livio; Villanueva, Alberto

    Recently, there has been an increasing interest in the development and characterization of patient-derived tumor xenograft (PDX) models for cancer research. PDX models mostly retain the principal histologic and genetic characteristics of their donor tumor and remain stable across passages. These

  7. Pattern-based translation of BPMN process models to BPEL web services

    NARCIS (Netherlands)

    Ouyang, C.; Dumas, M.; Hofstede, ter A.H.M.; Aalst, van der W.M.P.

    2008-01-01

    The business process modeling notation (BPMN) is a graph-oriented language primarily targeted at domain analysts and supported by many modeling tools. The business process execution language for Web services (BPEL) on the other hand is a mainly block-structured language targeted at software

  8. UPCaD: A Methodology of Integration Between Ontology-Based Context-Awareness Modeling and Relational Domain Data

    Directory of Open Access Journals (Sweden)

    Vinícius Maran

    2018-01-01

    Full Text Available Context-awareness is a key feature for applications in ubiquitous computing scenarios. Currently, technologies and methodologies have been proposed for the integration of context-awareness concepts in intelligent information systems to adapt them to the execution of services, user interfaces and data retrieval. Recent research proposed conceptual modeling alternatives for the integration of domain modeling in RDBMS and context-awareness modeling, described using highly expressive ontologies. The present work describes the UPCaD (Unified Process for Integration between Context-Awareness and Domain) methodology, which is composed of formalisms and processes to guide the data integration considering RDBMS and context modeling. The methodology was evaluated in a virtual learning environment application. The evaluation shows the possibility of using a highly expressive context ontology to filter the relational data query, and discusses the main contributions of the methodology compared with recent approaches.

  9. [Systemic inflammation: theoretical and methodological approaches to description of general pathological process model. Part 3. Backgroung for nonsyndromic approach].

    Science.gov (United States)

    Gusev, E Yu; Chereshnev, V A

    2013-01-01

    Theoretical and methodological approaches to describing systemic inflammation as a general pathological process are discussed. It is shown that developing a model of systemic inflammation requires the integration of a wide range of research types.

  10. Methodological Bases for Describing Risks of the Enterprise Business Model in Integrated Reporting

    Directory of Open Access Journals (Sweden)

    Nesterenko Oksana O.

    2017-12-01

    Full Text Available The aim of the article is to substantiate the methodological bases for describing the business and accounting risks of an enterprise business model in integrated reporting for their timely detection and assessment, and to develop methods for their leveling or minimizing and possible prevention. It is proposed to consider risks in the process of forming integrated reporting from two sides: first, risks that arise in the business model of an organization and should be disclosed in its integrated report; second, accounting risks of integrated reporting, which should be taken into account by members of the cross-sectoral working group and management personnel in the process of forming and promulgating integrated reporting. To develop an adequate accounting and analytical tool for disclosing information about the risks of the business model and integrated reporting, and for their leveling or minimization, the article carries out a terminological analysis of the essence of entrepreneurial and accounting risks. The entrepreneurial risk is defined as an objective-subjective economic category that characterizes the probability of negative or positive consequences of economic-social-ecological activity within the framework of the business model of an enterprise under uncertainty. The accounting risk is suggested to be understood as the probability of unfavorable consequences resulting from organizational and methodological errors in the integrated accounting system, which present a threat to the quality, accuracy and reliability of the reporting information on economic, social and environmental activities in integrated reporting, as well as a threat of inappropriate decision-making by stakeholders based on the integrated report.
For the timely identification of business risks and maximum leveling of the influence of accounting risks on the process of formation and publication of integrated reporting, the study determines the place of entrepreneurial and accounting risks in

  11. Modeling and Analysis of The Pressure Die Casting Using Response Surface Methodology

    International Nuclear Information System (INIS)

    Kittur, Jayant K.; Herwadkar, T. V.; Parappagoudar, M. B.

    2010-01-01

    Pressure die casting is successfully used in the manufacture of aluminum alloy components for the automobile and many other industries. Die casting is a process involving many process parameters that have a complex relationship with the quality of the cast product. Though various process parameters influence the quality of a die cast component, the major influence is seen from the die casting machine parameters and their proper settings. In the present work, non-linear regression models have been developed for making predictions and analyzing the effect of die casting machine parameters on the performance characteristics of the die casting process. Design of Experiments (DOE) with Response Surface Methodology (RSM) has been used to analyze the effect of input parameters and their interactions on the response, and further used to develop non-linear input-output relationships. Die casting machine parameters, namely fast shot velocity, slow shot to fast shot change-over point, intensification pressure and holding time, have been considered as the input variables. The quality characteristics of the cast product were determined by porosity, hardness and surface roughness (outputs/responses). Design of experiments has been used to plan the experiments and analyze the impact of variables on the quality of casting. On the other hand, Response Surface Methodology (Central Composite Design) is utilized to develop non-linear input-output relationships (regression models). The developed regression models have been tested for their statistical adequacy through an ANOVA test. The practical usefulness of these models has been tested with some test cases. These models can be used to make predictions about different quality characteristics, for a known set of die casting machine parameters, without conducting the experiments.
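The RSM regression step amounts to fitting a second-order (quadratic) response surface to DOE data by least squares. The two-factor data below are synthetic and the factor names are placeholders, not the paper's die-casting measurements:

```python
# Fit a second-order response-surface model
#   y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# by ordinary least squares. Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)   # e.g. coded fast shot velocity
x2 = rng.uniform(-1, 1, 30)   # e.g. coded intensification pressure
# Noise-free "true" surface so the fit recovers the coefficients exactly:
y = 5 + 1.5*x1 - 2.0*x2 + 0.8*x1**2 + 0.3*x1*x2

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 3))  # recovers approximately [5, 1.5, -2, 0.8, 0, 0.3]
```

In a real central composite design the x-columns would come from the coded design points and y from measured porosity, hardness, or surface roughness, with an ANOVA on the residuals checking the model's statistical adequacy.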

  12. Translational mixed-effects PKPD modelling of recombinant human growth hormone - from hypophysectomized rat to patients

    DEFF Research Database (Denmark)

    Thorsted, Anders; Thygesen, Peter; Agersø, Henrik

    2016-01-01

    …was developed from experimental PKPD studies of rhGH and effects of long-term treatment as measured by insulin-like growth factor 1 (IGF-1) and bodyweight gain in rats. Modelled parameter values were scaled to human values using the allometric approach, with fixed exponents for PKs and unscaled for PDs, and validated through simulations relative to patient data. KEY RESULTS: The final model described rhGH PK as a two-compartment model with parallel linear and non-linear elimination terms and parallel first-order absorption, with a total s.c. bioavailability of 87% in rats. Induction of IGF-1 was described by an indirect response model with stimulation of kin, related to rhGH exposure through an Emax relationship. Increase in bodyweight was directly linked to individual concentrations of IGF-1 by a linear relation. The scaled model provided robust predictions of human systemic PK of rhGH, but exposure following …
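The IGF-1 turnover described above (an indirect response model with stimulation of kin via an Emax function of rhGH concentration) can be sketched as a simple ODE integration. All parameter values below are arbitrary placeholders for illustration, not the fitted estimates from this study:

```python
# Indirect response model: dR/dt = kin * (1 + Emax*C/(EC50 + C)) - kout*R,
# integrated with a simple Euler scheme. Parameter values are placeholders.

def simulate_response(conc, kin=10.0, kout=0.1, emax=2.0, ec50=5.0,
                      dt=0.01, t_end=200.0):
    """conc: function t -> drug concentration; returns the response R(t_end)."""
    r = kin / kout          # start at the drug-free steady state kin/kout
    t = 0.0
    while t < t_end:
        c = conc(t)
        stim = 1.0 + emax * c / (ec50 + c)
        r += dt * (kin * stim - kout * r)
        t += dt
    return r

baseline = simulate_response(lambda t: 0.0)     # no drug: stays at kin/kout
stimulated = simulate_response(lambda t: 50.0)  # constant exposure: higher plateau
print(baseline, stimulated)
```

With constant exposure the response settles at kin·(1 + Emax·C/(EC50 + C))/kout, which is the mechanism by which sustained rhGH exposure raises IGF-1, and in the full model the elevated IGF-1 then drives bodyweight gain linearly.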

  13. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  14. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have a solid understanding of specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  15. PAGIS summary report of phase 1: a common methodological approach based on European data and models

    International Nuclear Information System (INIS)

    Cadelli, N.; Cottone, G.; Bertozzi, G.; Girardi, F.

    1984-01-01

    Since 1982 a joint study has been launched by the CEC with the participation of national institutions in the E.C., aiming at a Performance Assessment of Geological Isolation Systems (PAGIS) for HLW disposal. This document is a summary of the first phase of the study, which was devoted to the collection of data and models and to the choice of an appropriate methodology. For this purpose, real or national sites have been chosen, which are representative of three types of continental geological formations in the E.C.: clay, granite and salt (although the choices imply no commitment of any kind about their final use). Moreover, sub-seabed areas have also been identified. The study covers the following items: - basic data on waste characteristics, site data and repository designs; - methodology, which allows sensitivity and uncertainty analyses to be performed, as well as the assessment of radiation doses to individuals and populations; - preliminary modelling of radionuclide release and migration through the geosphere (near- and far-field) and the biosphere following their various pathways to man; - selection of the most relevant radionuclide release scenarios and their probability of occurrence. Reference values have been selected for the basic data as well as variants covering the various options which are under consideration in the different countries of the E.C.

  16. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user-base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and are likely applied even more widely in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus improved data and enhanced assumptions, on model outcomes and thus, ultimately, on study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  17. Assessment of historical leak model methodology as applied to the REDOX high-level waste tank SX-108

    International Nuclear Information System (INIS)

    JONES, T.E.

    1999-01-01

    Using the Historical Leak Model approach, the estimated leak rate (and therefore, projected leak volume) for Tank 241-SX-108 could not be reproduced using the data included in the initial document describing the leak methodology. An analysis of parameters impacting tank heat load calculations strongly suggests that the historical tank operating data lack the precision and accuracy required to estimate tank leak volumes using the Historical Leak Model methodology.

  18. Methodology for Outdoor Water Savings Model and Spreadsheet Tool for U.S. and Selected States

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alison A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chen, Yuting [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dunham, Camilla [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fuchs, Heidi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Hannah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    Green lawns and landscaping are archetypical of the populated American landscape, and typically require irrigation, which corresponds to a significant fraction of residential, commercial, and institutional water use. In North American cities, the estimated portion of residential water used for outdoor purposes ranges from 22-38% in cooler climates up to 59-67% in dry and hot environments, while turfgrass coverage within the United States spans 11.1-20.2 million hectares (Milesi et al. 2009). One national estimate uses satellite and aerial photography data to develop a relationship between impervious surface and lawn surface area, yielding a conservative estimate of 16.4 (± 3.6) million hectares of lawn surface area in the United States—an area three times larger than that devoted to any irrigated crop (Milesi et al. 2005). One approach that holds promise for cutting unnecessary outdoor water use is the increased deployment of “smart” irrigation controllers to increase the water efficiency of irrigation systems. This report describes the methodology and inputs employed in a mathematical model that quantifies the effects of the U.S. Environmental Protection Agency’s WaterSense labeling program for one such type of controller, weather-based irrigation controllers (WBIC). This model builds on the model described in “Methodology for National Water Savings Model and Spreadsheet Tool–Outdoor Water Use” and uses a two-tiered approach to quantify outdoor water savings attributable to the WaterSense program for WBIC, as well as the net present value (NPV) of those savings. While the first iteration of the model assessed national impacts using averaged national values, this version begins by evaluating impacts in three key large states that make up a sizable portion of the irrigation market: California, Florida, and Texas. These states are considered to be the principal market of “smart” irrigation controllers that may result in the bulk of national savings. Modeled
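
The net-present-value step of such a savings model can be illustrated with a minimal sketch; the discount rate, controller cost, and yearly savings below are hypothetical illustrations, not WaterSense figures:

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical example: a WBIC costs $150 up front and saves $40/yr
# in avoided water costs for 10 years, discounted at 3%.
flows = [-150.0] + [40.0] * 10
value = npv(flows, 0.03)  # positive => the controller pays for itself
```

A real analysis would also discount water savings in physical units separately from dollar flows, since water prices change over the analysis period.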

  19. Methodology for predicting oily mixture properties in the mathematical modeling of molecular distillation

    Directory of Open Access Journals (Sweden)

    M. F. Gayol

    2017-06-01

    Full Text Available A methodology is presented for predicting the thermodynamic and transport properties of a multi-component oily mixture in which the different mixture components are grouped into a small number of pseudo-components. This property prediction is used in the mathematical modeling of molecular distillation, which consists of a system of partial differential equations based on the principles of transport phenomena, solved by an implicit finite difference method using a computer code. The mathematical model was validated with experimental data, specifically the molecular distillation of a deodorizer distillate (DD) of sunflower oil. The results obtained were satisfactory, with errors of less than 10% with respect to the experimental data within a temperature range in which it is possible to apply the proposed method.
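
The abstract does not give the model equations, but the numerical core it names, an implicit finite difference method, can be sketched for a generic one-dimensional diffusion equation (backward Euler with a tridiagonal Thomas solve; the grid and coefficients below are illustrative only, not the authors' distillation model):

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def implicit_step(u, alpha, dt, dx):
    """One backward-Euler step of u_t = alpha * u_xx with fixed (Dirichlet) ends."""
    n = len(u)
    r = alpha * dt / dx ** 2
    a = [-r] * n; b = [1.0 + 2.0 * r] * n; c = [-r] * n
    b[0] = b[-1] = 1.0; c[0] = 0.0; a[-1] = 0.0  # identity rows pin the ends
    return thomas(a, b, c, list(u))

u1 = implicit_step([0.0, 1.0, 1.0, 1.0, 0.0], 1.0, 0.1, 1.0)
```

Unlike an explicit scheme, the backward-Euler step is unconditionally stable, which is why implicit methods are the usual choice for stiff transport problems like this one.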

  20. Methodology for predicting oily mixture properties in the mathematical modeling of molecular distillation

    International Nuclear Information System (INIS)

    Gayol, M.F.; Pramparo, M.C.; Miró Erdmann, S.M.

    2017-01-01

    A methodology is presented for predicting the thermodynamic and transport properties of a multi-component oily mixture in which the different mixture components are grouped into a small number of pseudo-components. This property prediction is used in the mathematical modeling of molecular distillation, which consists of a system of partial differential equations based on the principles of transport phenomena, solved by an implicit finite difference method using a computer code. The mathematical model was validated with experimental data, specifically the molecular distillation of a deodorizer distillate (DD) of sunflower oil. The results obtained were satisfactory, with errors of less than 10% with respect to the experimental data within a temperature range in which it is possible to apply the proposed method.

  1. A methodology for modeling surface effects on stiff and soft solids

    Science.gov (United States)

    He, Jin; Park, Harold S.

    2018-06-01

    We present a computational method that can be applied to capture surface stress and surface tension-driven effects both in stiff, crystalline nanostructures, such as size-dependent mechanical properties, and in soft solids, such as elastocapillary effects. We show that the method is equivalent to the classical Young-Laplace model. The method is based on converting surface tension and surface elasticity on a zero-thickness surface to an initial stress and corresponding elastic properties on a finite-thickness shell, where the consideration of geometric nonlinearity enables capturing the out-of-plane component of the surface tension that arises for curved surfaces through evaluation of the surface stress in the deformed configuration. In doing so, we are able to use commercially available finite element technology, and thus do not require consideration and implementation of the classical Young-Laplace equation. Several examples are presented to demonstrate the capability of the methodology for modeling surface stress in both soft solids and crystalline nanostructures.
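
The equivalence the authors invoke can be checked on the simplest case: a zero-thickness surface with tension γ and a finite-thickness shell with initial in-plane stress σ₀ = γ/t carry the same membrane force per unit length, and hence produce the same Young-Laplace pressure jump Δp = 2γ/R across a sphere. A minimal numeric sketch with hypothetical values:

```python
def laplace_pressure_sphere(gamma, radius):
    """Young-Laplace pressure jump across a spherical surface of tension gamma."""
    return 2.0 * gamma / radius

def shell_equivalent_stress(gamma, thickness):
    """Initial in-plane stress for a finite-thickness shell carrying the
    same membrane force per unit length as surface tension gamma."""
    return gamma / thickness

# Hypothetical water droplet: gamma = 0.072 N/m, R = 1 mm, shell t = 1 um.
gamma, R, t = 0.072, 1e-3, 1e-6
dp = laplace_pressure_sphere(gamma, R)     # pressure jump, Pa
sigma0 = shell_equivalent_stress(gamma, t) # initial stress on the shell, Pa
# Thin-shell check: membrane force sigma0*t equals gamma, so the shell
# reproduces the same pressure jump 2*sigma0*t/R as the ideal surface.
dp_shell = 2.0 * sigma0 * t / R
```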

  2. A generic analytical foot rollover model for predicting translational ankle kinematics in gait simulation studies.

    Science.gov (United States)

    Ren, Lei; Howard, David; Ren, Luquan; Nester, Chris; Tian, Limei

    2010-01-19

    The objective of this paper is to develop an analytical framework for representing ankle-foot kinematics by modelling the foot as a rollover rocker, which can not only be used as a generic tool for general gait simulation but also allows for case-specific modelling if required. Previously, the rollover models used in gait simulation have often been based on specific functions of a simple form. In contrast, the analytical model described here takes a general form in which the effective foot rollover shape can be represented by any polar function ρ = ρ(φ). Furthermore, a normalized generic foot rollover model has been established based on a normative foot rollover shape dataset of 12 normal healthy subjects. To evaluate model accuracy, the predicted ankle motions and the centre of pressure (CoP) were compared with measurement data for both subject-specific and general cases. The results demonstrated that the ankle joint motions in both vertical and horizontal directions (relative RMSE approximately 10%) and the CoP (relative RMSE approximately 15% for most of the subjects) are accurately predicted over most of the stance phase (from 10% to 90% of stance). However, we found that the foot cannot be very accurately represented by a rollover model just after heel strike (HS) and just before toe off (TO), probably due to shear deformation of the foot plantar tissues (ankle motion can occur without any foot rotation). The proposed foot rollover model can be used in both inverse and forward dynamics gait simulation studies and may also find applications in rehabilitation engineering. Copyright 2009 Elsevier Ltd. All rights reserved.
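
As a rough illustration of the polar representation ρ = ρ(φ), the arc length travelled by the contact point (a proxy for CoP progression during rollover) can be integrated numerically for any rollover shape using the standard polar arc-length formula ∫ sqrt(ρ² + ρ'²) dφ. The circular-rocker radius and rotation angle below are hypothetical, not values from the paper:

```python
import math

def cop_advance(rho, phi0, phi1, n=1000):
    """Arc length travelled by the contact point as a rollover shape
    rho(phi) rolls from angle phi0 to phi1 (midpoint quadrature)."""
    total, dphi = 0.0, (phi1 - phi0) / n
    for i in range(n):
        phi = phi0 + (i + 0.5) * dphi
        r = rho(phi)
        # derivative of rho by central difference
        dr = (rho(phi + 1e-6) - rho(phi - 1e-6)) / 2e-6
        total += math.hypot(r, dr) * dphi
    return total

# Sanity check: a circular rocker of radius 0.3 m rolling through 20 degrees
# moves the contact point by exactly r * theta.
arc = cop_advance(lambda phi: 0.3, 0.0, math.radians(20.0))
```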

  3. Defining Leadership as Process Reference Model: Translating Organizational Goals into Practice Using a Structured Leadership Approach

    OpenAIRE

    Tuffley , David

    2010-01-01

    Effective leadership in organisations is important to the achievement of organizational objectives. Yet leadership is widely seen as a quality that individuals innately possess, and which cannot be learned. This paper makes two assertions: (a) that leadership is a skill that not only can be learned, but which can be formalized into a Process Reference Model that is intelligible from an Enterprise Architecture perspective, and (b) that Process Reference Models in the st...

  4. Reliability Modeling of Electromechanical System with Meta-Action Chain Methodology

    Directory of Open Access Journals (Sweden)

    Genbao Zhang

    2018-01-01

    Full Text Available To establish a more flexible and accurate reliability model, this work applies reliability modeling and a solving algorithm based on the meta-action chain concept. Instead of estimating the reliability of the whole system only in the standard operating mode, it adopts the structure chain and the operating-action chain for system reliability modeling. The failure information and structure information for each component are integrated into the model to overcome the fixed assumptions applied in traditional modeling. In industrial applications, a multicomponent system may have several different operating modes. The meta-action chain methodology can estimate the system reliability under different operating modes by modeling the components with differing failure sensitivities. The approach has been verified on several electromechanical system cases. The results indicate that it improves system reliability estimation and is an effective tool for estimating the reliability of systems operating under various modes.
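
The abstract does not detail the meta-action chain mathematics, but the core idea of mode-dependent failure sensitivities can be sketched as a series chain of exponentially failing meta-actions whose rates are scaled by the current operating mode. All rates and sensitivity factors below are hypothetical:

```python
import math

def chain_reliability(failure_rates, sensitivities, t):
    """Series-chain reliability at time t: each meta-action i fails
    exponentially with base rate lambda_i scaled by a mode-specific
    sensitivity s_i, and the chain survives only if every link does."""
    total_rate = sum(l * s for l, s in zip(failure_rates, sensitivities))
    return math.exp(-total_rate * t)

rates = [1e-4, 2e-4, 5e-5]   # failures per hour per meta-action (hypothetical)
standard = [1.0, 1.0, 1.0]   # standard operating mode
heavy = [1.5, 2.0, 1.0]      # heavy-duty mode stresses some meta-actions more
r_std = chain_reliability(rates, standard, 1000.0)
r_heavy = chain_reliability(rates, heavy, 1000.0)
```

The same chain thus yields different reliability estimates per operating mode, which is the flexibility the abstract claims over single-mode models.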

  5. Introduction of a methodology for visualization and graphical interpretation of Bayesian classification models.

    Science.gov (United States)

    Balfer, Jenny; Bajorath, Jürgen

    2014-09-22

    Supervised machine learning models are widely used in chemoinformatics, especially for the prediction of new active compounds or targets of known actives. Bayesian classification methods are among the most popular machine learning approaches for the prediction of activity from chemical structure. Much work has focused on predicting structure-activity relationships (SARs) on the basis of experimental training data. By contrast, only a few efforts have thus far been made to rationalize the performance of Bayesian or other supervised machine learning models and better understand why they might succeed or fail. In this study, we introduce an intuitive approach for the visualization and graphical interpretation of naïve Bayesian classification models. Parameters derived during supervised learning are visualized and interactively analyzed to gain insights into model performance and identify features that determine predictions. The methodology is introduced in detail and applied to assess Bayesian modeling efforts and predictions on compound data sets of varying structural complexity. Different classification models and features determining their performance are characterized in detail. A prototypic implementation of the approach is provided.
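
A minimal sketch of the underlying idea: for a naïve Bayesian classifier over binary (fingerprint-like) features, each feature's contribution to the prediction is an additive log-odds term, and ranking or plotting these terms reveals which features determine a prediction. The toy data, Laplace smoothing, and Bernoulli feature model below are illustrative assumptions, not the authors' implementation:

```python
import math

def fit_bernoulli_nb(X, y, alpha=1.0):
    """Laplace-smoothed per-feature P(x_j = 1 | class) for binary classes 0/1."""
    params = {}
    for c in (0, 1):
        rows = [x for x, label in zip(X, y) if label == c]
        n = len(rows)
        params[c] = [(sum(r[j] for r in rows) + alpha) / (n + 2 * alpha)
                     for j in range(len(X[0]))]
    return params

def feature_log_odds(x, params):
    """Per-feature contribution to log P(active | x) / P(inactive | x)."""
    out = []
    for j, xj in enumerate(x):
        p1, p0 = params[1][j], params[0][j]
        out.append(math.log(p1 / p0) if xj else math.log((1 - p1) / (1 - p0)))
    return out

# Toy fingerprints: feature 0 marks actives, feature 1 is uninformative noise.
X = [[1, 0], [1, 1], [0, 0], [0, 1]]
y = [1, 1, 0, 0]
params = fit_bernoulli_nb(X, y)
contrib = feature_log_odds([1, 0], params)  # ranked/plotted for visualization
```

Sorting `contrib` by magnitude gives exactly the kind of per-feature picture the abstract describes: large positive terms push a compound toward "active", large negative terms toward "inactive", and near-zero terms are irrelevant to the prediction.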

  6. Development of a practical methodology for integrating shoreline oil-holding capacity into modeling

    International Nuclear Information System (INIS)

    Schmidt Etkin, D.; French-McCay, D.; Rowe, J.; Michel, J.; Boufadel, M.; Li, H.

    2008-01-01

    The factors that influence the behaviour of oil in the aftermath of an oil spill on water include oil type and characteristics; oil thickness on the shoreline; time until shoreline impact; timing with regard to tides; weathering during and after the spill; and nearshore wave energy. The oil behaviour also depends on the shoreline characteristics, particularly porosity and permeability. The interactions of spilled oil with sediments on beaches must be well understood in order to model the oil spill trajectory, fate and risk. The movement of oil can be most accurately simulated if the algorithm incorporates an estimate of shoreline oil retention. This paper presented a literature review of relevant shoreline oiling studies and considered the relevance of study findings for inclusion in modelling. Survey data from a detailed shoreline cleanup assessment team (SCAT) were analyzed for patterns in oil penetration and oil-holding capacity by shoreline sediment type and oil type for potential use in modelling algorithms. A theoretical beach hydraulics model was then developed for use in a stochastic spill model. Gaps in information were identified, including the manner in which wave action and other environmental variables have an impact on the dynamic processes involved in shoreline oiling. The methodology presented in this paper can be used to estimate the amount of oil held by a shoreline upon impact to allow a trajectory model to more accurately project the total spread of oil. 27 refs., 13 tabs., 3 figs

  7. Guinea pig models for translation of the developmental origins of health and disease hypothesis into the clinic.

    Science.gov (United States)

    Morrison, Janna L; Botting, Kimberley J; Darby, Jack R T; David, Anna L; Dyson, Rebecca M; Gatford, Kathryn L; Gray, Clint; Herrera, Emilio A; Hirst, Jonathan J; Kim, Bona; Kind, Karen L; Krause, Bernardo J; Matthews, Stephen G; Palliser, Hannah K; Regnault, Timothy R H; Richardson, Bryan S; Sasaki, Aya; Thompson, Loren P; Berry, Mary J

    2018-04-06

    Over 30 years ago Professor David Barker first proposed the theory that events in early life could explain an individual's risk of non-communicable disease in later life: the developmental origins of health and disease (DOHaD) hypothesis. During the 1990s the validity of the DOHaD hypothesis was extensively tested in a number of human populations and the mechanisms underpinning it characterised in a range of experimental animal models. Over the past decade, researchers have sought to use this mechanistic understanding of DOHaD to develop therapeutic interventions during pregnancy and early life to improve adult health. A variety of animal models have been used to develop and evaluate interventions, each with strengths and limitations. It is becoming apparent that effective translational research requires that the animal paradigm selected mirrors the tempo of human fetal growth and development as closely as possible so that the effect of a perinatal insult and/or therapeutic intervention can be fully assessed. The guinea pig is one such animal model that over the past two decades has demonstrated itself to be a very useful platform for these important reproductive studies. This review highlights similarities in the in utero development between humans and guinea pigs, the strengths and limitations of the guinea pig as an experimental model of DOHaD and the guinea pig's potential to enhance clinical therapeutic innovation to improve human health. © 2018 The Authors. The Journal of Physiology © 2018 The Physiological Society.

  8. Predicting Barrett's Esophagus in Families: An Esophagus Translational Research Network (BETRNet) Model Fitting Clinical Data to a Familial Paradigm.

    Science.gov (United States)

    Sun, Xiangqing; Elston, Robert C; Barnholtz-Sloan, Jill S; Falk, Gary W; Grady, William M; Faulx, Ashley; Mittal, Sumeet K; Canto, Marcia; Shaheen, Nicholas J; Wang, Jean S; Iyer, Prasad G; Abrams, Julian A; Tian, Ye D; Willis, Joseph E; Guda, Kishore; Markowitz, Sanford D; Chandar, Apoorva; Warfe, James M; Brock, Wendy; Chak, Amitabh

    2016-05-01

    Barrett's esophagus is often asymptomatic and only a small portion of Barrett's esophagus patients are currently diagnosed and under surveillance. It is therefore important to develop risk prediction models to identify high-risk individuals with Barrett's esophagus. Familial aggregation of Barrett's esophagus and esophageal adenocarcinoma, and the increased risk of esophageal adenocarcinoma for individuals with a family history, make it necessary to include genetic factors in the prediction model. Methods to determine risk prediction models using both risk covariates and ascertained family data are not well developed. We developed a Barrett's Esophagus Translational Research Network (BETRNet) risk prediction model from 787 singly ascertained Barrett's esophagus pedigrees and 92 multiplex Barrett's esophagus pedigrees, fitting a multivariate logistic model that incorporates family history and clinical risk factors. Eight risk factors (age, sex, education level, parental status, smoking, heartburn frequency, regurgitation frequency, and use of acid suppressants) were included in the model. The prediction accuracy was evaluated on the training dataset and an independent validation dataset of 643 multiplex Barrett's esophagus pedigrees. Our results indicate that family information helps to predict Barrett's esophagus risk, and that predicting within families improves both prediction calibration and discrimination accuracy. Our model can predict Barrett's esophagus risk for anyone with family members known to have, or not have, had Barrett's esophagus, and can predict risk for unrelated individuals without knowing any relatives' information. Our prediction model will shed light on effectively identifying high-risk individuals for Barrett's esophagus screening and surveillance, consequently allowing intervention at an early stage and reducing mortality from esophageal adenocarcinoma. Cancer Epidemiol Biomarkers Prev; 25(5); 727-35. ©2016 AACR.
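
The form of such a model can be sketched as a multivariate logistic regression with a family-history indicator among the covariates. The coefficients, intercept, and feature encoding below are hypothetical illustrations, not the fitted BETRNet values:

```python
import math

def logistic_risk(features, coeffs, intercept):
    """Predicted probability from a multivariate logistic model:
    P(case) = 1 / (1 + exp(-(intercept + sum_i b_i * x_i)))."""
    z = intercept + sum(b * x for b, x in zip(coeffs, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical covariates: [age/10, male, weekly heartburn, family history]
coeffs = [0.30, 0.80, 0.90, 1.10]
intercept = -7.0
base = logistic_risk([6.0, 1, 1, 0], coeffs, intercept)     # no family history
with_fh = logistic_risk([6.0, 1, 1, 1], coeffs, intercept)  # positive history
```

The family-history indicator shifts the log-odds additively, which is how a logistic model lets the same equation score both individuals with known affected relatives and unrelated individuals (indicator set to zero).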

  9. Translational genomics

    Directory of Open Access Journals (Sweden)

    Martin Kussmann

    2014-09-01

    Full Text Available The term “Translational Genomics” reflects both the title and the mission of this new journal. “Translational” has traditionally been understood as “applied research” or “development”, different from or even opposed to “basic research”. Recent scientific and societal developments have triggered a re-assessment of the connotation that “translational” and “basic” are either/or activities: translational research nowadays aims at feeding the best science into applications and solutions for human society. We therefore argue that basic science should be challenged and leveraged for its relevance to human health and societal benefits. This more recent approach and attitude are catalyzed by four trends or developments: evidence-based solutions; large-scale, high-dimensional data; consumer/patient empowerment; and systems-level understanding.

  10. Beyond Translation

    DEFF Research Database (Denmark)

    Olwig, Mette Fog

    2013-01-01

    This article contributes to the growing scholarship on local development practitioners by re-examining conceptualizations of practitioners as ‘brokers’ strategically translating between ‘travelling’ (development institution) rationalities and ‘placed’ (recipient area) rationalities in relation...... and practice spurred by new challenges deriving from climate change anxiety, the study shows how local practitioners often make local activities fit into travelling development rationalities as a matter of habit, rather than as a conscious strategy. They may therefore cease to ‘translate’ between different...... rationalities. This is shown to have important implications for theory, research and practice concerning disaster risk reduction and climate change adaptation in which such translation is often expected....

  11. Revising Translations

    DEFF Research Database (Denmark)

    Rasmussen, Kirsten Wølch; Schjoldager, Anne

    2011-01-01

    The paper explains the theoretical background and findings of an empirical study of revision policies, using Denmark as a case in point. After an overview of important definitions, types and parameters, the paper explains the methods and data gathered from a questionnaire survey and an interview survey. Results clearly show that most translation companies regard both unilingual and comparative revisions as essential components of professional quality assurance. Data indicate that revision is rarely fully comparative, as the preferred procedure seems to be a unilingual revision followed by a more or less comparative rereading. Though questionnaire data seem to indicate that translation companies use linguistic correctness and presentation as the only revision parameters, interview data reveal that textual and communicative aspects are also considered. Generally speaking, revision is not carried...

  12. A generalized methodology for identification of threshold for HRU delineation in SWAT model

    Science.gov (United States)

    M, J.; Sudheer, K.; Chaubey, I.; Raj, C.

    2016-12-01

    The distributed hydrological model Soil and Water Assessment Tool (SWAT) is a comprehensive hydrologic model widely used to inform a variety of decisions. The simulation accuracy of a distributed hydrological model depends on the mechanism used to subdivide the watershed. SWAT subdivides the watershed and its sub-basins into small computing units known as hydrologic response units (HRUs). HRUs are delineated based on unique combinations of land use, soil type, and slope within the sub-watersheds, and are not spatially contiguous. Computations in SWAT are performed at the HRU level and then aggregated to the sub-basin outlet, which is routed through the stream system. Generally, HRUs are delineated using threshold percentages of land use, soil and slope, specified by the modeler to decrease the computation time of the model. These thresholds constrain the minimum area for constructing an HRU. In the current HRU delineation practice in SWAT, any land use, soil or slope class within a sub-basin that falls below the predefined threshold is subsumed into the dominating land use, soil or slope class, which introduces some ambiguity into the process simulations in the form of an inappropriate representation of the area. The information lost by varying the threshold values depends strongly on the purpose of the study. This research therefore studies the effects of HRU delineation threshold values on SWAT sediment simulations and suggests guidelines for selecting appropriate threshold values with respect to sediment simulation accuracy. A preliminary study was carried out on an Illinois watershed by assigning different thresholds for land use and soil. A general methodology was proposed for identifying an appropriate threshold for HRU delineation in the SWAT model that considers both computational time and simulation accuracy.
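
The threshold mechanism can be sketched in a few lines. Redistributing the dropped area proportionally to the surviving classes is one common convention, assumed here for illustration; the class names and areas are hypothetical:

```python
def apply_hru_threshold(areas, threshold):
    """Drop classes below `threshold` (fraction of sub-basin area) and
    redistribute their area proportionally to the surviving classes,
    so the sub-basin total is preserved."""
    total = sum(areas.values())
    kept = {k: a for k, a in areas.items() if a / total >= threshold}
    scale = total / sum(kept.values())
    return {k: a * scale for k, a in kept.items()}

# Hypothetical 100 ha sub-basin: a 5% urban class vanishes under a 10%
# land-use threshold, and its area is absorbed by corn and forest.
landuse = {"corn": 55.0, "forest": 40.0, "urban": 5.0}
hrus = apply_hru_threshold(landuse, 0.10)
```

This is exactly the information loss the abstract describes: the urban 5 ha still exists on the ground but is simulated as corn and forest, which can matter greatly for sediment and nutrient loads.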

  13. A consistent modelling methodology for secondary settling tanks in wastewater treatment.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Nopens, Ingmar

    2011-03-01

    The aim of this contribution is partly to build consensus on a consistent modelling methodology (CMM) of complex real processes in wastewater treatment by combining classical concepts with results from applied mathematics, and partly to apply it to the clarification-thickening process in the secondary settling tank. In the CMM, the real process should be approximated by a mathematical model (process model; ordinary or partial differential equation (ODE or PDE)), which in turn is approximated by a simulation model (numerical method) implemented on a computer. These steps have often not been carried out in a correct way. The secondary settling tank was chosen as a case since this is one of the most complex processes in a wastewater treatment plant and simulation models developed decades ago have no guarantee of satisfying fundamental mathematical and physical properties. Nevertheless, such methods are still used in commercial tools to date. This particularly becomes of interest as the state-of-the-art practice is moving towards plant-wide modelling. Then all submodels interact and errors propagate through the model and severely hamper any calibration effort and, hence, the predictive purpose of the model. The CMM is described by applying it first to a simple conversion process in the biological reactor yielding an ODE solver, and then to the solid-liquid separation in the secondary settling tank, yielding a PDE solver. Time has come to incorporate established mathematical techniques into environmental engineering, and wastewater treatment modelling in particular, and to use proven reliable and consistent simulation models. Copyright © 2011 Elsevier Ltd. All rights reserved.
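
The CMM's separation of process model and simulation model can be illustrated on the simple conversion process the authors mention: a first-order decay ODE as the process model, explicit Euler as the simulation model, with the exact solution available to verify that the numerical method converges to the mathematical model. The rate constant and times below are hypothetical:

```python
import math

def euler_decay(c0, k, t_end, n):
    """Simulation model: explicit Euler for the process model dC/dt = -k*C."""
    dt, c = t_end / n, c0
    for _ in range(n):
        c += dt * (-k * c)  # one Euler step
    return c

# Process model has the exact solution C(t) = C0 * exp(-k*t).
exact = 1.0 * math.exp(-0.5 * 2.0)
coarse = euler_decay(1.0, 0.5, 2.0, 10)     # crude discretization
fine = euler_decay(1.0, 0.5, 2.0, 10000)    # refined discretization
```

The point of the CMM is precisely this chain of approximations: the error between `fine` and `exact` is a property of the simulation model, not of the process model, and must shrink under refinement for the simulation to be trusted.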

  14. Application of fault tree methodology to modeling of the AP1000 plant digital reactor protection system

    International Nuclear Information System (INIS)

    Teolis, D.S.; Zarewczynski, S.A.; Detar, H.L.

    2012-01-01

    The reactor trip system (RTS) and engineered safety features actuation system (ESFAS) in nuclear power plants utilize instrumentation and control (IC) to provide automatic protection against unsafe and improper reactor operation during steady-state and transient power operations. During normal operating conditions, various plant parameters are continuously monitored to assure that the plant is operating in a safe state. In response to deviations of these parameters from pre-determined set points, the protection system will initiate actions required to maintain the reactor in a safe state. These actions may include shutting down the reactor by opening the reactor trip breakers and actuation of safety equipment based on the situation. The RTS and ESFAS are represented in probabilistic risk assessments (PRAs) to reflect the impact of their contribution to core damage frequency (CDF). The reactor protection systems (RPS) in existing nuclear power plants are generally analog based and there is general consensus within the PRA community on fault tree modeling of these systems. In new plants, such as the AP1000 plant, the RPS is based on digital technology. Digital systems are more complex combinations of hardware components and software. This combination of complex hardware and software can result in the presence of faults and failure modes unique to a digital RPS. The United States Nuclear Regulatory Commission (NRC) is currently performing research on the development of probabilistic models for digital systems for inclusion in PRAs; however, no consensus methodology exists at this time. Westinghouse is currently updating the AP1000 plant PRA to support initial operation of plants currently under construction in the United States. The digital RPS is modeled using fault tree methodology similar to that used for analog based systems. This paper presents high level descriptions of a typical analog based RPS and of the AP1000 plant digital RPS. Application of current fault
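
The plant-specific model is not given here, but the basic fault tree arithmetic is standard: assuming independent basic events, an OR gate combines failure probabilities as 1 − Π(1 − p_i) and an AND gate as Π p_i. A minimal sketch with hypothetical probabilities (a real PRA would also add common-cause failure terms for redundant divisions):

```python
def or_gate(probs):
    """Gate output fails if ANY independent input fails."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """Gate output fails only if ALL independent inputs fail."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical: a trip fails only if both redundant divisions fail (AND),
# while a division fails if its processor OR its power supply fails.
division = or_gate([1e-3, 5e-4])
top = and_gate([division, division])
```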

  15. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    Science.gov (United States)

    2014-01-01

    Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in d