WorldWideScience

Sample records for include evolving interpretations

  1. Decoding transcriptional enhancers: Evolving from annotation to functional interpretation.

    Science.gov (United States)

    Engel, Krysta L; Mackiewicz, Mark; Hardigan, Andrew A; Myers, Richard M; Savic, Daniel

    2016-09-01

    Deciphering the intricate molecular processes that orchestrate the spatial and temporal regulation of genes has become an increasingly central focus of biological research. The differential expression of genes by diverse cell types with a common genome is a hallmark of complex cellular functions, as well as the basis for multicellular life. Importantly, a more coherent understanding of gene regulation is critical for defining developmental processes, evolutionary principles and disease etiologies. Here we present our current understanding of gene regulation by focusing on the role of enhancer elements in these complex processes. Although functional genomic methods have provided considerable advances to our understanding of gene regulation, these assays, which are usually performed on a genome-wide scale, typically provide correlative observations that lack functional interpretation. Recent innovations in genome editing technologies have placed gene regulatory studies at an exciting crossroads, as systematic, functional evaluation of enhancers and other transcriptional regulatory elements can now be performed in a coordinated, high-throughput manner across the entire genome. This review provides insights into transcriptional enhancer function and the role of enhancers in development and disease, and catalogues experimental tools commonly used to study these elements. Additionally, we discuss the crucial role of novel techniques in deciphering the complex gene regulatory landscape and how these studies will shape future research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. EVOLVE

    CERN Document Server

    Deutz, André; Schütze, Oliver; Legrand, Pierrick; Tantar, Emilia; Tantar, Alexandru-Adrian

    2017-01-01

    This book comprises nine selected works on numerical and computational methods for solving multiobjective optimization, game theory, and machine learning problems. It provides extended versions of selected papers from various fields of science such as computer science, mathematics and engineering that were presented at EVOLVE 2013 held in July 2013 at Leiden University in the Netherlands. The internationally peer-reviewed papers include original work on important topics in both theory and applications, such as the role of diversity in optimization, statistical approaches to combinatorial optimization, computational game theory, and cell mapping techniques for numerical landscape exploration. Applications focus on aspects including robustness, handling multiple objectives, and complex search spaces in engineering design and computational biology.

  3. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale, multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale, multi-physics, computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale, multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these emerging issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying uncertainties in a way that allows extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M and S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for

  4. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory]; Williams, Brian [Los Alamos National Laboratory]; McClure, Patrick [Los Alamos National Laboratory]; Nelson, Ralph A. [Idaho National Laboratory]

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale, multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale, multi-physics, computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale, multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these emerging issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying uncertainties in a way that allows extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost

  5. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    Research highlights: → The best estimate plus uncertainty methodology (BEPU) is one option in the licensing of nuclear reactors. → The challenges for extending the BEPU method for fuel qualification for an advanced reactor fuel are primarily driven by schedule, the need for data, and the sufficiency of the data. → In this paper we develop an extended BEPU methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors. → The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. → The methodology includes a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools. - Abstract: Many evolving nuclear energy technologies use advanced predictive multiscale, multiphysics modeling and simulation (M and S) capabilities to reduce the cost and schedule of design and licensing. Historically, the role of experiments has been as a primary tool for the design and understanding of nuclear system behavior, while M and S played the subordinate role of supporting experiments. In the new era of multiscale, multiphysics computational-based technology development, this role has been reversed. The experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations for design and licensing. Minimizing the required number of validation experiments produces cost and time savings. The use of multiscale, multiphysics models introduces challenges in validating these predictive tools - traditional methodologies will have to be modified to address these challenges. This paper gives the basic aspects of a methodology that can potentially be used to address these new challenges in
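
    The data-assimilation-based calibration step mentioned above can be illustrated with a minimal Bayesian update. The sketch below is not the authors' methodology; the toy response model, the single calibration parameter, and the experimental values and errors are all invented for illustration.

```python
# Minimal sketch of calibration by data assimilation: a grid-based Bayesian update
# of one model parameter against experimental data (all values are hypothetical).
import numpy as np

def simulate(theta, x):
    """Toy surrogate for an expensive multi-physics code: predicted response at x."""
    return theta * x + 0.1 * x ** 2

# Hypothetical experimental data: measurement locations, values, 1-sigma errors.
x_obs = np.array([0.5, 1.0, 1.5, 2.0])
y_obs = np.array([0.62, 1.18, 1.85, 2.55])
sigma = 0.05

# Gaussian prior on the calibration parameter theta.
theta_grid = np.linspace(0.5, 1.5, 1001)
prior = np.exp(-0.5 * ((theta_grid - 1.0) / 0.2) ** 2)

# Gaussian likelihood of the data for each candidate theta.
residuals = y_obs[None, :] - np.array([simulate(t, x_obs) for t in theta_grid])
log_like = -0.5 * np.sum((residuals / sigma) ** 2, axis=1)

# Posterior is proportional to prior times likelihood (normalized on the grid).
posterior = prior * np.exp(log_like - log_like.max())
posterior /= np.trapz(posterior, theta_grid)

print(f"posterior mode of theta: {theta_grid[np.argmax(posterior)]:.3f}")
```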

  6. Professional Development for Researchers in Solid Earth Science Evolved to Include Scientific and Educational Content

    Science.gov (United States)

    Eriksson, S. C.; Arrowsmith, R.; Olds, S. E.

    2011-12-01

    Integrated measures of crustal deformation provide valuable insight into tectonic and human-induced processes for scientists and educators alike. UNAVCO, in conjunction with EarthScope, initiated a series of short courses for researchers to learn the processing and interpretation of data from new technologies such as high-precision GPS, strainmeters, InSAR and LiDAR that provide deformation information relevant to many geoscience sub-disciplines. Intensive short courses of a few days and the widespread availability of processed data through large projects such as EarthScope and GEON enable more geoscientists to incorporate these data into diverse projects. Characteristics of the UNAVCO Short Course Series, reaching over 400 participants since 2005, include having short course faculty who have pioneered development of each technology; open web-access to course materials; processing software installed on class-ready computers; no course fees; scholarships for students, post-doctoral fellows, and emerging faculty when needed; formative evaluation of the courses; community-based decisions on topics; and recruitment of participants across relevant geoscience disciplines. In 2009, when EarthScope airborne LiDAR data became available to the public through OpenTopography, teaching materials were provided to these researchers to incorporate the latest technologies into teaching. Multiple data sets across technologies have been developed with instructions on how to access the various data sets and incorporate them into geological problem sets. Courses in GPS, airborne LiDAR, strainmeter, and InSAR concentrate on data processing with examples of various geoscience applications. Ground-based LiDAR courses also include data acquisition. Google Earth is used to integrate various forms of data in educational applications. Various types of EarthScope data can now be used by a variety of geoscientists, and the number of scientists who have the skills and tools to use these various

  7. DEEP MIXING IN EVOLVED STARS. II. INTERPRETING Li ABUNDANCES IN RED GIANT BRANCH AND ASYMPTOTIC GIANT BRANCH STARS

    International Nuclear Information System (INIS)

    Palmerini, S.; Busso, M.; Maiorca, E.; Cristallo, S.; Abia, C.; Uttenthaler, S.; Gialanella, L.

    2011-01-01

    We reanalyze the problem of Li abundances in red giants of nearly solar metallicity. After outlining the problems affecting our knowledge of the Li content in low-mass stars (M ≤ 3 M_⊙), we discuss deep-mixing models for the red giant branch stages suitable to account for the observed trends and for the correlated variations of the carbon isotope ratio; we find that Li destruction in these phases is limited to masses below about 2.3 M_⊙. Subsequently, we concentrate on the final stages of evolution for both O-rich and C-rich asymptotic giant branch (AGB) stars. Here, the constraints on extra-mixing phenomena previously derived from heavier nuclei (from C to Al), coupled to recent updates in stellar structure models (including both the input physics and the set of reaction rates used), are suitable to account for the observations of Li abundances below A(Li) ≡ log ε(Li) ≅ 1.5 (and sometimes more). Also, their relations with other nucleosynthesis signatures of AGB phases (like the abundance of F, and the C/O and ¹²C/¹³C ratios) can be explained. This requires generally moderate efficiencies (Ṁ ≲ 10⁻⁶ M_⊙ yr⁻¹) for non-convective mass transport. At such rates, slow extra mixing does not remarkably modify Li abundances in early AGB phases; on the other hand, faster mixing encounters a physical limit in destroying Li, set by the mixing velocity. Beyond this limit, Li starts to be produced; therefore, its destruction on the AGB is modest. Li is then significantly produced by the third dredge-up. We also show that effective circulation episodes, while not destroying Li, would easily bring the ¹²C/¹³C ratios to equilibrium, contrary to the evidence in most AGB stars, and would burn F beyond the limits shown by C(N) giants. Hence, we do not confirm the common idea that efficient extra mixing drastically reduces the Li content of C stars with respect to K-M giants. This misleading appearance is induced by biases in the data, namely: (1) the difficulty

  8. Quantitative evaluation of SIMS spectra including spectrum interpretation and Saha-Eggert correction

    International Nuclear Information System (INIS)

    Steiger, W.; Ruedenauer, F.G.

    1978-01-01

    A spectrum identification program is described, using a computer algorithm which solely relies on the natural isotopic abundances for identification of elemental, molecular and cluster ions. The thermodynamic approach to the quantitative interpretation of SIMS spectra, through the use of the Saha-Eggert equation, is discussed, and a computer program is outlined. (U.K.)
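
    For reference, the thermodynamic correction referred to above rests on the Saha-Eggert ionization equation; a standard form is shown below, using conventional notation rather than the paper's own.

```latex
\frac{n_{i+1}\, n_e}{n_i}
  = \frac{2\, g_{i+1}}{g_i}
    \left(\frac{2\pi m_e k T}{h^{2}}\right)^{3/2}
    \exp\!\left(-\frac{E_i}{k T}\right)
```

    Here n_i and n_{i+1} are the number densities of successive ionization states, n_e is the electron density, g_i and g_{i+1} the corresponding statistical weights (partition functions), m_e the electron mass, T the plasma temperature and E_i the ionization energy.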

  9. John Dewey in Italy. The Operation of The New Italian Publishing: Including Translation, Interpretation and Dissemination

    Directory of Open Access Journals (Sweden)

    Franco Cambi

    2016-07-01

    The essay reconstructs the various phases of the discovery of John Dewey's ideas on education and the spread of their influence throughout Italian pedagogical circles from the end of the Second World War to the 1970s. Several Italian intellectual pioneers discerned within Dewey's theories significant overtones of democratic political activism, and the potential for developing innovative practices by which the obsolete education system of the day could be modernized, and the demands for better schooling being put forward by many could be met. In the immediate aftermath of the Second World War, one such pioneer was Ernesto Codignola, a shrewd educational theorist who used the journal «Scuola e Città» (Schooling and the City), published by La Nuova Italia publishing house, as a mouthpiece for his ideas. Once the American philosopher's ideas had been rediscovered, his most significant works were quickly translated and published, and then subjected to a flurry of detailed critical analysis and interpretation. During the 1960s and '70s, much of the research into Dewey's theories was carried out in Florence, in particular by Lamberto Borghi, who interpreted them as the blueprint for a secular, democratic system of education that could be applied across the Italian peninsula.

  10. Evolving interpretation of the athlete's electrocardiogram: from European Society of Cardiology and Stanford criteria, to Seattle criteria and beyond.

    Science.gov (United States)

    Zorzi, Alessandro; ElMaghawry, Mohamed; Corrado, Domenico

    2015-01-01

    Electrocardiographic (ECG) pre-participation screening can prevent sudden cardiac death in athletes by early diagnosis and disqualification of affected individuals. Interpretation of the athlete's ECG should be based on specific criteria, because ECG changes that would be considered abnormal in the untrained population may develop in trained athletes as a physiologic and benign consequence of the heart's adaptation to exercise. In 2010, a stem document from the Section of Sports Cardiology of the European Society of Cardiology (ESC) proposed to classify the athlete's ECG changes into two groups, according to their prevalence, relation to exercise training, association with an increased risk of cardiovascular disease and need for further investigation: "common and training-related" (Group 1) and "uncommon and training-unrelated" (Group 2). Over recent years, several efforts have been made to refine the ESC criteria for interpretation of the athlete's ECG in order to improve specificity while maintaining good sensitivity, especially among elite and Afro-Caribbean athletes, who show the highest rate of false-positive Group 2 ECG abnormalities. However, the balance between improvement in specificity and loss of sensitivity should be evaluated keeping in mind that the primary aim of the screening program is to save athletes' lives rather than money. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Improvement in Detection of Wrong-Patient Errors When Radiologists Include Patient Photographs in Their Interpretation of Portable Chest Radiographs.

    Science.gov (United States)

    Tridandapani, Srini; Olsen, Kevin; Bhatti, Pamela

    2015-12-01

    This study was conducted to determine whether facial photographs obtained simultaneously with radiographs improve radiologists' detection rate of wrong-patient errors, when they are explicitly asked to include the photographs in their evaluation. Radiograph-photograph combinations were obtained from 28 patients at the time of portable chest radiography imaging. From these, pairs of radiographs were generated. Each unique pair consisted of one new and one old (comparison) radiograph. Twelve pairs of mismatched radiographs (i.e., pairs containing radiographs of different patients) were also generated. In phase 1 of the study, 5 blinded radiologist observers were asked to interpret 20 pairs of radiographs without the photographs. In phase 2, each radiologist interpreted another 20 pairs of radiographs with the photographs. Radiologist observers were not instructed about the purpose of the photographs but were asked to include the photographs in their review. The detection rate of mismatched errors was recorded along with the interpretation time for each session for each observer. The two-tailed Fisher exact test was used to evaluate differences in mismatch detection rates between the two phases. A p value of less than 0.05 was considered statistically significant. The error detection rates without (0/20 = 0%) and with (17/18 = 94.4%) photographs were significantly different (p = 0.0001). The average interpretation times for the set of 20 radiographs were 26.45 (SD 8.69) and 20.55 (SD 3.40) min for phase 1 and phase 2, respectively (two-tailed Student t test, p = 0.1911). When radiologists include simultaneously obtained photographs in their review of portable chest radiographs, there is a significant improvement in the detection of labeling errors. No statistically significant difference in interpretation time was observed. This may lead to improved patient safety without affecting radiologists' throughput.
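
    The reported comparison can be reproduced in outline with a two-tailed Fisher exact test on the aggregate counts quoted in the abstract; this is only a sketch, and the study's own comparison may have been structured differently (for example per observer), so the printed p value need not match the reported figure exactly.

```python
# Two-tailed Fisher exact test on the aggregate mismatch-detection counts
# quoted in the abstract (a sketch, not the study's own analysis).
from scipy.stats import fisher_exact

#                 detected  missed
without_photos = [0,        20]   # phase 1: 0/20 mismatches detected
with_photos    = [17,        1]   # phase 2: 17/18 mismatches detected

odds_ratio, p_value = fisher_exact([without_photos, with_photos],
                                   alternative="two-sided")
print(f"two-tailed Fisher exact p = {p_value:.2e}")
```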

  12. Evolvable synthetic neural system

    Science.gov (United States)

    Curtis, Steven A. (Inventor)

    2009-01-01

    An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.

  13. The visualCMAT: A web-server to select and interpret correlated mutations/co-evolving residues in protein families.

    Science.gov (United States)

    Suplatov, Dmitry; Sharapova, Yana; Timonina, Daria; Kopylov, Kirill; Švedas, Vytas

    2018-04-01

    The visualCMAT web-server was designed to assist experimental research in the fields of protein/enzyme biochemistry, protein engineering, and drug discovery by providing an intuitive and easy-to-use interface to the analysis of correlated mutations/co-evolving residues. Sequence and structural information describing homologous proteins are used to predict correlated substitutions by the mutual information-based CMAT approach, to classify them into spatially close co-evolving pairs, which either form a direct physical contact or interact with the same ligand (e.g. a substrate or a crystallographic water molecule), and long-range correlations, and to annotate and rank binding sites on the protein surface by the presence of statistically significant co-evolving positions. The results of the visualCMAT are organized for convenient visual analysis and can be downloaded to a local computer as a content-rich all-in-one PyMol session file with multiple layers of annotation corresponding to bioinformatic, statistical and structural analyses of the predicted co-evolution, or further studied online using the built-in interactive analysis tools. The online interactivity is implemented in HTML5 and therefore neither plugins nor Java are required. The visualCMAT web-server is integrated with the Mustguseal web-server, which is capable of constructing large structure-guided sequence alignments of protein families and superfamilies using all available information about their structures and sequences in public databases. The visualCMAT web-server can be used to understand the relationship between structure and function in proteins, to select hotspots and compensatory mutations for rational design and directed evolution experiments aimed at producing novel enzymes with improved properties, and to study the mechanism of selective ligand binding and allosteric communication between topologically independent sites in protein structures. The web-server is freely available at https
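
    A minimal illustration of the mutual-information statistic that underlies correlated-mutation analysis is given below; it is not the visualCMAT implementation, which adds statistical significance testing, structural filtering and ranking, and the toy alignment is invented.

```python
# Mutual information between columns of a multiple sequence alignment
# (bare-bones sketch; visualCMAT applies further corrections and filters).
from collections import Counter
from itertools import combinations
from math import log2

# Hypothetical toy alignment: one string per homologous sequence, equal lengths.
alignment = ["ACDEK", "ACDQK", "GCEEK", "GCEQK"]

def column(msa, i):
    return [seq[i] for seq in msa]

def mutual_information(col_a, col_b):
    n = len(col_a)
    pa, pb, pab = Counter(col_a), Counter(col_b), Counter(zip(col_a, col_b))
    return sum((c / n) * log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pab.items())

length = len(alignment[0])
scores = {(i, j): mutual_information(column(alignment, i), column(alignment, j))
          for i, j in combinations(range(length), 2)}
for (i, j), mi in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"columns {i}-{j}: MI = {mi:.3f}")
```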

  14. Combined scintigraphic and radiographic diagnosis of bone and joint diseases. Including gamma correction interpretation. 4. rev. and enl. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Bahk, Yong-Whee [Sung Ae General Hospital, Seoul (Korea, Republic of). Dept. of Nuclear Medicine and Radiology]

    2013-07-01

    In this fourth edition of Combined Scintigraphic and Radiographic Diagnosis of Bone and Joint Diseases, the text has been thoroughly amended, updated, and partially rearranged to reflect the latest advances. In addition to discussing the role of pinhole imaging in the range of disorders previously covered, the new edition pays detailed attention to the novel diagnostic use of gamma correction pinhole bone scan in a broad spectrum of skeletal disorders, including physical, traumatic, and sports injuries, infectious and non-infectious bone diseases, benign and malignant bone tumors, and soft tissue diseases. A large number of state of the art pinhole scans and corroborative CT, MRI, and/or ultrasound images are presented side by side. The book has been enlarged to encompass various new topics, including occult fractures; cervical sprain and whiplash trauma; bone marrow edema; microfractures of trabeculae; evident, gaping, and stress fractures; and differential diagnosis. This new edition will be essential reading for practitioners and researchers in not only nuclear medicine but also radiology, orthopedic surgery, and pathology.

  15. Combined scintigraphic and radiographic diagnosis of bone and joint diseases. Including gamma correction interpretation. 4. rev. and enl. ed.

    International Nuclear Information System (INIS)

    Bahk, Yong-Whee

    2013-01-01

    In this fourth edition of Combined Scintigraphic and Radiographic Diagnosis of Bone and Joint Diseases, the text has been thoroughly amended, updated, and partially rearranged to reflect the latest advances. In addition to discussing the role of pinhole imaging in the range of disorders previously covered, the new edition pays detailed attention to the novel diagnostic use of gamma correction pinhole bone scan in a broad spectrum of skeletal disorders, including physical, traumatic, and sports injuries, infectious and non-infectious bone diseases, benign and malignant bone tumors, and soft tissue diseases. A large number of state of the art pinhole scans and corroborative CT, MRI, and/or ultrasound images are presented side by side. The book has been enlarged to encompass various new topics, including occult fractures; cervical sprain and whiplash trauma; bone marrow edema; microfractures of trabeculae; evident, gaping, and stress fractures; and differential diagnosis. This new edition will be essential reading for practitioners and researchers in not only nuclear medicine but also radiology, orthopedic surgery, and pathology.

  16. Maintaining evolvability.

    Science.gov (United States)

    Crow, James F

    2008-12-01

    Although molecular methods, such as QTL mapping, have revealed a number of loci with large effects, it is still likely that the bulk of quantitative variability is due to multiple factors, each with small effect. Typically, these have a large additive component. Conventional wisdom argues that selection, natural or artificial, uses up additive variance and thus depletes its supply. Over time, the variance should be reduced, and at equilibrium be near zero. This is especially expected for fitness and traits highly correlated with it. Yet, populations typically have a great deal of additive variance, and do not seem to run out of genetic variability even after many generations of directional selection. Long-term selection experiments show that populations continue to retain seemingly undiminished additive variance despite large changes in the mean value. I propose that there are several reasons for this. (i) The environment is continually changing so that what was formerly most fit no longer is. (ii) There is an input of genetic variance from mutation, and sometimes from migration. (iii) As intermediate-frequency alleles increase in frequency towards one, producing less variance (as p → 1, p(1 − p) → 0), others that were originally near zero become more common and increase the variance. Thus, a roughly constant variance is maintained. (iv) There is always selection for fitness and for characters closely related to it. To the extent that the trait is heritable, later generations inherit a disproportionate number of genes acting additively on the trait, thus increasing genetic variance. For these reasons a selected population retains its ability to evolve. Of course, genes with large effect are also important. Conspicuous examples are the small number of loci that changed teosinte to maize, and major phylogenetic changes in the animal kingdom. The relative importance of these along with duplications, chromosome rearrangements, horizontal transmission and polyploidy
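
    The parenthetical variance argument can be made explicit with the textbook expression for the additive variance contributed by a biallelic locus (a standard quantitative-genetics formula, not one specific to this paper):

```latex
V_A = 2\,p\,(1-p)\,\alpha^{2}
```

    where p is the allele frequency and α the average effect of an allelic substitution. V_A vanishes as p → 0 or p → 1, so fixation of once-intermediate alleles removes additive variance, while formerly rare alleles drifting or being selected toward intermediate frequencies restore it.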

  17. Life cycle planning: An evolving concept

    International Nuclear Information System (INIS)

    Moore, P.J.R.; Gorman, I.G.

    1994-01-01

    Life-cycle planning is an evolving concept in the management of oil and gas projects. BHP Petroleum now interprets this idea to include all development planning from discovery and field appraisal to final abandonment, and includes safety, environmental, technical, plant, regulatory, and staffing issues. This article describes, in the context of the Timor Sea, how, despite initial successes and continuing facilities upgrades, BHPP came to perceive that current operations could be the victim of early development successes, particularly in the areas of corrosion and maintenance. The search for analogies elsewhere led to the UK North Sea, including the experiences of Britoil and BP, both of which performed detailed Life of Field studies in the late eighties. These materials have been used to construct a format and content for total Life-cycle plans in general, and the social changes required to ensure their successful application in Timor Sea operations and deployment throughout Australia

  18. Spacetimes containing slowly evolving horizons

    International Nuclear Information System (INIS)

    Kavanagh, William; Booth, Ivan

    2006-01-01

    Slowly evolving horizons are trapping horizons that are "almost" isolated horizons. This paper reviews their definition and discusses several spacetimes containing such structures. These include certain Vaidya and Tolman-Bondi solutions as well as (perturbatively) tidally distorted black holes. Taking into account the mass scales and orders of magnitude that arise in these calculations, we conjecture that slowly evolving horizons are the norm rather than the exception in astrophysical processes that involve stellar-scale black holes

  19. Performing Interpretation

    Science.gov (United States)

    Kothe, Elsa Lenz; Berard, Marie-France

    2013-01-01

    Utilizing a/r/tographic methodology to interrogate interpretive acts in museums, multiple areas of inquiry are raised in this paper, including: which knowledge is assigned the greatest value when preparing a gallery talk; what lies outside of disciplinary knowledge; how invitations to participate invite and disinvite in the same gesture; and what…

  20. canEvolve: a web portal for integrative oncogenomics.

    Directory of Open Access Journals (Sweden)

    Mehmet Kemal Samur

    BACKGROUND & OBJECTIVE: Genome-wide profiles of tumors obtained using functional genomics platforms are being deposited to the public repositories at an astronomical scale, as a result of focused efforts by individual laboratories and large projects such as the Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium. Consequently, there is an urgent need for reliable tools that integrate and interpret these data in light of current knowledge and disseminate results to biomedical researchers in a user-friendly manner. We have built the canEvolve web portal to meet this need. RESULTS: canEvolve query functionalities are designed to fulfill the most frequent analysis needs of cancer researchers with a view to generating novel hypotheses. canEvolve stores gene, microRNA (miRNA) and protein expression profiles, copy number alterations for multiple cancer types, and protein-protein interaction information. canEvolve allows querying of results of primary analysis, integrative analysis and network analysis of oncogenomics data. Querying for primary analysis includes differential gene and miRNA expression as well as changes in gene copy number measured with SNP microarrays. canEvolve provides results of integrative analysis of gene expression profiles with copy number alterations and with miRNA profiles, as well as generalized integrative analysis using gene set enrichment analysis. The network analysis capability includes storage and visualization of gene co-expression, inferred gene regulatory networks and protein-protein interaction information. Finally, canEvolve provides correlations between gene expression and clinical outcomes in terms of univariate survival analysis. CONCLUSION: At present canEvolve provides different types of information extracted from 90 cancer genomics studies comprising more than 10,000 patients. The presence of multiple data types, novel integrative analysis for identifying regulators of oncogenesis, network
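
    The univariate survival correlations mentioned above are commonly obtained by splitting patients on a gene's expression and comparing survival between the groups; the sketch below is not canEvolve's code, and the expression, follow-up and event data are randomly generated stand-ins.

```python
# Median-split univariate survival comparison for one gene (illustrative only;
# the data are simulated and this is not canEvolve's implementation).
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n = 200
expression = rng.normal(size=n)               # stand-in gene expression values
time_to_event = rng.exponential(60, size=n)   # follow-up time in months
event_observed = rng.integers(0, 2, size=n)   # 1 = event (death), 0 = censored

high = expression > np.median(expression)
result = logrank_test(time_to_event[high], time_to_event[~high],
                      event_observed_A=event_observed[high],
                      event_observed_B=event_observed[~high])
print(f"log-rank p = {result.p_value:.3f}")
```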

  1. Fat: an evolving issue

    Directory of Open Access Journals (Sweden)

    John R. Speakman

    2012-09-01

    Work on obesity is evolving, and obesity is a consequence of our evolutionary history. In the space of 50 years, we have become an obese species. The reasons why can be addressed at a number of different levels. These include separating whether the primary cause lies on the food intake or the energy expenditure side of the energy balance equation, and determining how genetic and environmental effects contribute to weight variation between individuals. Opinion on whether increased food intake or decreased energy expenditure drives the obesity epidemic is still divided, but recent evidence favours the idea that food intake, rather than altered expenditure, is most important. There is more of a consensus that genetics explains most (probably around 65%) of weight variation between individuals. Recent advances in genome-wide association studies have identified many polymorphisms that are linked to obesity, yet much of the genetic variance remains unexplained. Finding the causes of this unexplained variation will be an impetus for genetic and epigenetic research on obesity over the next decade. Many environmental factors – including gut microbiota, stress and endocrine disruptors – have been linked to the risk of developing obesity. A better understanding of gene-by-environment interactions will also be key to understanding obesity in the years to come.

  2. Interpretive Medicine

    Science.gov (United States)

    Reeve, Joanne

    2010-01-01

    Patient-centredness is a core value of general practice; it is defined as the interpersonal processes that support the holistic care of individuals. To date, efforts to demonstrate their relationship to patient outcomes have been disappointing, whilst some studies suggest values may be more rhetoric than reality. Contextual issues influence the quality of patient-centred consultations, impacting on outcomes. The legitimate use of knowledge, or evidence, is a defining aspect of modern practice, and has implications for patient-centredness. Based on a critical review of the literature, on my own empirical research, and on reflections from my clinical practice, I critique current models of the use of knowledge in supporting individualised care. Evidence-Based Medicine (EBM), and its implementation within health policy as Scientific Bureaucratic Medicine (SBM), define best evidence in terms of an epistemological emphasis on scientific knowledge over clinical experience. It provides objective knowledge of disease, including quantitative estimates of the certainty of that knowledge. Whilst arguably appropriate for secondary care, involving episodic care of selected populations referred in for specialist diagnosis and treatment of disease, application to general practice can be questioned given the complex, dynamic and uncertain nature of much of the illness that is treated. I propose that general practice is better described by a model of Interpretive Medicine (IM): the critical, thoughtful, professional use of an appropriate range of knowledges in the dynamic, shared exploration and interpretation of individual illness experience, in order to support the creative capacity of individuals in maintaining their daily lives. Whilst the generation of interpreted knowledge is an essential part of daily general practice, the profession does not have an adequate framework by which this activity can be externally judged to have been done well. Drawing on theory related to the

  3. Interpreting Impoliteness: Interpreters’ Voices

    Directory of Open Access Journals (Sweden)

    Tatjana Radanović Felberg

    2017-11-01

    Interpreters in the public sector in Norway interpret in a variety of institutional encounters, and the interpreters evaluate the majority of these encounters as polite. However, some encounters are evaluated as impolite, and they pose challenges when it comes to interpreting impoliteness. This issue raises the question of whether interpreters should take a stance on their own evaluation of impoliteness and whether they should interfere in communication. In order to find out more about how interpreters cope with this challenge, in 2014 a survey was sent to all interpreters registered in the Norwegian Register of Interpreters. The survey data were analyzed within the theoretical framework of impoliteness theory, using the notion of moral order as an explanatory tool in a close reading of interpreters' answers. The analysis shows that interpreters reported using a variety of strategies for interpreting impoliteness, including omissions and downtoning. However, the interpreters also gave examples of individual strategies for coping with impoliteness, such as interrupting and postponing interpreting. These strategies border on behavioural strategies and conflict with the Norwegian ethical guidelines for interpreting. In light of the ethical guidelines and actual practice, mapping and discussing the different strategies used by interpreters might heighten interpreters' and interpreter-users' awareness of the role impoliteness can play in institutional interpreter-mediated encounters.

  4. Mentoring: An Evolving Relationship.

    Science.gov (United States)

    Block, Michelle; Florczak, Kristine L

    2017-04-01

    The column concerns itself with mentoring as an evolving relationship between mentor and mentee. The collegiate mentoring model, the transformational transcendence model, and the humanbecoming mentoring model are considered in light of a dialogue with mentors at a Midwest university and conclusions are drawn.

  5. Methods Evolved by Observation

    Science.gov (United States)

    Montessori, Maria

    2016-01-01

    Montessori's idea of the child's nature and the teacher's perceptiveness begins with amazing simplicity, and when she speaks of "methods evolved," she is unveiling a methodological system for observation. She begins with the early childhood explosion into writing, which is a familiar child phenomenon that Montessori has written about…

  6. EVOLVE 2014 International Conference

    CERN Document Server

    Tantar, Emilia; Sun, Jian-Qiao; Zhang, Wei; Ding, Qian; Schütze, Oliver; Emmerich, Michael; Legrand, Pierrick; Moral, Pierre; Coello, Carlos

    2014-01-01

    This volume contains research articles that were presented at the EVOLVE 2014 International Conference in Beijing, China, July 1–4, 2014. The book gathers contributions that emerged from the conference tracks, ranging from probability to set-oriented numerics and evolutionary computation, all complemented by the bridging purpose of the conference, e.g. Complex Networks and Landscape Analysis, or by the more application-oriented perspective. The novelty of the volume, when considering the EVOLVE series, comes from also targeting the practitioner's view. This is supported by the Machine Learning Applied to Networks and Practical Aspects of Evolutionary Algorithms tracks, providing surveys on new application areas, as in the networking area, and useful insights into the development of evolutionary techniques from a practitioner's perspective. Complementary to these directions, the conference tracks supporting the volume follow on the individual advancements of the subareas constituting the scope of the confe...

  7. Interpreting land records

    CERN Document Server

    Wilson, Donald A

    2014-01-01

    Base retracement on solid research and historically accurate interpretation Interpreting Land Records is the industry's most complete guide to researching and understanding the historical records germane to land surveying. Coverage includes boundary retracement and the primary considerations during new boundary establishment, as well as an introduction to historical records and guidance on effective research and interpretation. This new edition includes a new chapter titled "Researching Land Records," and advice on overcoming common research problems and insight into alternative resources wh

  8. Evolving Procurement Organizations

    DEFF Research Database (Denmark)

    Bals, Lydia; Laine, Jari; Mugurusi, Godfrey

    Procurement has to find further levers and advance its contribution to corporate goals continuously. This places pressure on its organization in order to facilitate its performance. Therefore, procurement organizations constantly have to evolve in order to match these demands. A conceptual model...... and external contingency factors and having a more detailed look at the structural dimensions chosen, beyond the well-known characteristics of centralization, formalization, participation, specialization, standardization and size. From a theoretical perspective, it opens up insights that can be leveraged...

  9. Symbiotic Composition and Evolvability

    OpenAIRE

    Watson, Richard A.; Pollack, Jordan B.

    2001-01-01

    Several of the Major Transitions in natural evolution, such as the symbiogenic origin of eukaryotes from prokaryotes, share the feature that existing entities became the components of composite entities at a higher level of organisation. This composition of pre-adapted extant entities into a new whole is a fundamentally different source of variation from the gradual accumulation of small random variations, and it has some interesting consequences for issues of evolvability. In this paper we p...

  10. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes under a ruler node, which is itself an NBF construct. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.
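
    The hierarchy described above can be read as a recursive containment structure: each NBF couples a heuristic and an autonomic subsystem through an ENI, and sets of NBFs report to a ruler NBF. The sketch below is only a structural reading of the abstract; the class and method names are invented for illustration and this is not the NASA implementation.

```python
# Structural sketch of the NBF hierarchy described in the abstract
# (class and method names are invented; this is not the NASA code).
from dataclasses import dataclass, field
from typing import List

@dataclass
class HeuristicNeuralSystem:            # higher-level functions
    def decide(self, sensed: float) -> float:
        return 1.0 if sensed > 0.5 else 0.0

@dataclass
class AutonomicNeuralSystem:            # lower-level functions
    def act(self, command: float) -> str:
        return f"actuating with command {command:.2f}"

@dataclass
class EvolvableNeuralInterface:         # links HNS and ANS and adapts over time
    gain: float = 1.0
    def couple(self, hns_output: float) -> float:
        return self.gain * hns_output
    def adapt(self, error: float) -> None:
        self.gain += 0.1 * error        # crude stand-in for interface evolution

@dataclass
class NeuralBasisFunction:
    hns: HeuristicNeuralSystem = field(default_factory=HeuristicNeuralSystem)
    ans: AutonomicNeuralSystem = field(default_factory=AutonomicNeuralSystem)
    eni: EvolvableNeuralInterface = field(default_factory=EvolvableNeuralInterface)
    subordinates: List["NeuralBasisFunction"] = field(default_factory=list)

    def step(self, sensed: float) -> str:
        command = self.eni.couple(self.hns.decide(sensed))
        for sub in self.subordinates:   # a ruler NBF drives its set of NBFs
            sub.step(sensed)
        return self.ans.act(command)

ruler = NeuralBasisFunction(subordinates=[NeuralBasisFunction(), NeuralBasisFunction()])
print(ruler.step(sensed=0.7))
```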

  11. Evolved H II regions

    International Nuclear Information System (INIS)

    Churchwell, E.

    1975-01-01

    A probable evolutionary sequence of H II regions, based on six distinct types of observed objects, is suggested. Two examples which may deviate from this idealized sequence are discussed. Even though a size-mean density relation of H II regions can be used as a rough indication of whether a nebula is very young or evolved, it is argued that such a relation is not likely to be useful for the quantitative assignment of ages to H II regions. Evolved H II regions appear to fit into one of four structural types: rings, core-halos, smooth structures, and irregular or filamentary structures. Examples of each type are given with their derived physical parameters. The energy balance in these nebulae is considered. The mass of ionized gas in evolved H II regions is in general too large to trace the nebula back to a single compact H II region. Finally, the morphological type of the Galaxy is considered from its H II region content. 2 tables, 2 figs., 29 refs

  12. Evolving phenotypic networks in silico.

    Science.gov (United States)

    François, Paul

    2014-11-01

    Evolved gene networks are constrained by natural selection. Their structures and functions are consequently far from random, as exemplified by the multiple instances of parallel/convergent evolution. One can thus ask whether features of actual gene networks can be recovered from evolutionary first principles. I review a method for in silico evolution of small models of gene networks aimed at performing predefined biological functions. I summarize the current implementation of the algorithm, emphasizing the construction of a proper "fitness" function. I illustrate the approach on three examples: biochemical adaptation, ligand discrimination and vertebrate segmentation (somitogenesis). While the structure of the evolved networks is variable, their dynamics are usually constrained and present many features similar to those of actual gene networks, including properties that were not explicitly selected for. In silico evolution can thus be used to predict biological behaviours without a detailed knowledge of the mapping between genotype and phenotype. Copyright © 2014 The Author. Published by Elsevier Ltd. All rights reserved.
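
    As a minimal illustration of the procedure (repeated mutation of small network models plus selection on a predefined fitness function), the sketch below evolves a toy one-gene dose-response model toward a target behaviour; the model, the fitness definition and all parameter values are invented and are not the author's algorithm.

```python
# Toy in silico evolution loop for a small network model (illustrative only).
import random

def response(params, signal):
    """Steady-state output of a one-gene model with activation a and decay d."""
    a, d = params
    return a * signal / (1.0 + a * signal) / d

def fitness(params):
    """Predefined fitness: reward outputs close to a target dose-response."""
    targets = {0.1: 0.2, 1.0: 0.6, 10.0: 0.9}
    return -sum((response(params, s) - t) ** 2 for s, t in targets.items())

def mutate(params, scale=0.2):
    return tuple(max(1e-3, p + random.gauss(0.0, scale)) for p in params)

random.seed(1)
population = [(random.uniform(0.1, 2.0), random.uniform(0.1, 2.0)) for _ in range(50)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                                  # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(40)]

best = max(population, key=fitness)
print("best parameters:", tuple(round(p, 3) for p in best),
      "fitness:", round(fitness(best), 4))
```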

  13. Ranking in evolving complex networks

    Science.gov (United States)

    Liao, Hao; Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng; Zhou, Ming-Yang

    2017-05-01

    Complex networks have emerged as a simple yet powerful framework to represent and analyze a wide range of complex systems. The problem of ranking the nodes and the edges in complex networks is critical for a broad range of real-world problems because it affects how we access online information and products, how success and talent are evaluated in human activities, and how scarce resources are allocated by companies and policymakers, among others. This calls for a deep understanding of how existing ranking algorithms perform, and of the possible biases that may impair their effectiveness. Many popular ranking algorithms (such as Google's PageRank) are static in nature and, as a consequence, they exhibit important shortcomings when applied to real networks that rapidly evolve in time. At the same time, recent advances in the understanding and modeling of evolving networks have enabled the development of a wide and diverse range of ranking algorithms that take the temporal dimension into account. The aim of this review is to survey the existing ranking algorithms, both static and time-aware, and their applications to evolving networks. We emphasize both the impact of network evolution on well-established static algorithms and the benefits from including the temporal dimension for tasks such as prediction of network traffic, prediction of future links, and identification of significant nodes.

  14. Interpretative commenting.

    Science.gov (United States)

    Vasikaran, Samuel

    2008-08-01

    * Clinical laboratories should be able to offer interpretation of the results they produce. * At a minimum, contact details for interpretative advice should be available on laboratory reports. * Interpretative comments may be verbal or written and printed. * Printed comments on reports should be offered judiciously, only where they would add value; no comment preferred to inappropriate or dangerous comment. * Interpretation should be based on locally agreed or nationally recognised clinical guidelines where available. * Standard tied comments ("canned" comments) can have some limited use. * Individualised narrative comments may be particularly useful in the case of tests that are new, complex or unfamiliar to the requesting clinicians and where clinical details are available. * Interpretative commenting should only be provided by appropriately trained and credentialed personnel. * Audit of comments and continued professional development of personnel providing them are important for quality assurance.

  15. Objective interpretation as conforming interpretation

    Directory of Open Access Journals (Sweden)

    Lidka Rodak

    2011-12-01

    The practical discourse willingly uses the formula of "objective interpretation", with no regard to its controversial nature, which has been discussed in the literature. The main aim of the article is to investigate what "objective interpretation" could mean and how it could be understood in the practical discourse, focusing on the understanding offered by judicature. The thesis of the article is that objective interpretation, as identified with the textualists' position, is not possible to uphold, and should rather be linked with conforming interpretation. What this actually implies is that it is not the virtues of certainty and predictability – which are usually associated with objectivity – but coherence that forms the foundation of the applicability of objectivity in law. What can be observed from the analyses is that both the phenomenon of conforming interpretation and objective interpretation play the role of arguments in the interpretive discourse, arguments that provide justification that interpretation is not arbitrary or subjective. With regard to an important part of the ideology of legal application, namely the conviction that decisions should be taken on the basis of law in order to exclude arbitrariness, objective interpretation could be read as the question of what kind of authority "supports" a certain interpretation, an interpretation that is almost never free of judicial creativity and judicial activism. One can say that objective and conforming interpretation are just further arguments used in legal discourse.

  16. Evolving Procurement Organizations

    DEFF Research Database (Denmark)

    Bals, Lydia; Laiho, Aki; Laine, Jari

    Procurement has to find further levers and advance its contribution to corporate goals continuously. This places pressure on its organization in order to facilitate its performance. Therefore, Procurement organizations constantly have to evolve in order to match these demands. A conceptual model... is presented and results of a first case study discussed. The findings highlight the importance of taking a contingency perspective on Procurement organization, understanding the internal and external contingency factors. From a theoretical perspective, it opens up insights that can be further leveraged... in future studies in the fields of hybrid procurement organizations, global sourcing organizations as well as international procurement offices (IPOs). From a practical standpoint, an assessment of external and internal contingencies provides the opportunity to consciously match the organization to its...

  17. Diffusion between evolving interfaces

    International Nuclear Information System (INIS)

    Juntunen, Janne; Merikoski, Juha

    2010-01-01

    Diffusion in an evolving environment is studied by continuous-time Monte Carlo simulations. Diffusion is modeled by continuous-time random walkers on a lattice, in a dynamic environment provided by bubbles between two one-dimensional interfaces driven symmetrically towards each other. For one-dimensional random walkers constrained by the interfaces, the bubble size distribution dominates diffusion. For two-dimensional random walkers, it is also controlled by the topography and dynamics of the interfaces. The results of the one-dimensional case are recovered in the limit where the interfaces are strongly driven. Even with simple hard-core repulsion between the interfaces and the particles, diffusion is found to depend strongly on the details of the dynamical rules of particles close to the interfaces.
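
    A stripped-down version of this setup is sketched below: a single continuous-time random walker on a one-dimensional lattice, reflected by hard-core interfaces that are driven symmetrically towards each other. The rates, sizes and update rules are invented for illustration and are not taken from the paper.

```python
# Minimal continuous-time (Gillespie-type) simulation of one random walker
# confined between two symmetrically driven interfaces (illustrative parameters).
import random

random.seed(2)
left, right = 0, 100          # interface positions
x = 50                        # walker position
hop_rate = 1.0                # hop-attempt rate in each direction
drive_rate = 0.05             # rate at which each interface steps inwards
t = 0.0

while right - left > 2:
    total = 2 * hop_rate + 2 * drive_rate
    t += random.expovariate(total)              # Gillespie waiting time
    u = random.uniform(0.0, total)
    if u < hop_rate:                            # attempt a hop to the left
        if x - 1 > left:                        # hard-core wall: otherwise rejected
            x -= 1
    elif u < 2 * hop_rate:                      # attempt a hop to the right
        if x + 1 < right:
            x += 1
    elif u < 2 * hop_rate + drive_rate:         # left interface driven one step in
        left += 1
        x = max(x, left + 1)                    # walker pushed along if necessary
    else:                                       # right interface driven one step in
        right -= 1
        x = min(x, right - 1)

print(f"interfaces met at t = {t:.1f}, final walker position x = {x}")
```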

  18. Penultimate interpretation.

    Science.gov (United States)

    Neuman, Yair

    2010-10-01

    Interpretation is at the center of psychoanalytic activity. However, interpretation is always challenged by that which is beyond our grasp, the 'dark matter' of our mind, what Bion describes as 'O'. O is one of the most central and difficult concepts in Bion's thought. In this paper, I explain the enigmatic nature of O as a high-dimensional mental space and point to the price one should pay for substituting a low-dimensional symbolic representation for the pre-symbolic lexicon of the emotion-laden and high-dimensional unconscious. This price is reification--objectifying lived experience and draining it of vitality and complexity. In order to address the difficulty of approaching O through symbolization, I introduce the term 'Penultimate Interpretation'--a form of interpretation that seeks 'loopholes' through which the analyst and the analysand may reciprocally save themselves from the curse of reification. Three guidelines for 'Penultimate Interpretation' are proposed and illustrated through an imaginary dialogue. Copyright © 2010 Institute of Psychoanalysis.

  19. Interpreting Physics

    CERN Document Server

    MacKinnon, Edward

    2012-01-01

    This book is the first to offer a systematic account of the role of language in the development and interpretation of physics. An historical-conceptual analysis of the co-evolution of mathematical and physical concepts leads to the classical/quantum interface. Bohrian orthodoxy stresses the indispensability of classical concepts and the functional role of mathematics. This book analyses ways of extending, and then going beyond, this orthodoxy. Finally, the book analyzes how a revised interpretation of physics impacts on basic philosophical issues: conceptual revolutions, realism, and r

  20. Why did heterospory evolve?

    Science.gov (United States)

    Petersen, Kurt B; Burd, Martin

    2017-08-01

    The primitive land plant life cycle featured the production of spores of unimodal size, a condition called homospory. The evolution of bimodal size distributions with small male spores and large female spores, known as heterospory, was an innovation that occurred repeatedly in the history of land plants. The importance of desiccation-resistant spores for colonization of the land is well known, but the adaptive value of heterospory has never been well established. It was an addition to a sexual life cycle that already involved male and female gametes. Its role as a precursor to the evolution of seeds has received much attention, but this is an evolutionary consequence of heterospory that cannot explain the transition from homospory to heterospory (and the lack of evolutionary reversal from heterospory to homospory). Enforced outcrossing of gametophytes has often been mentioned in connection to heterospory, but we review the shortcomings of this argument as an explanation of the selective advantage of heterospory. Few alternative arguments concerning the selective forces favouring heterospory have been proposed, a paucity of attention that is surprising given the importance of this innovation in land plant evolution. In this review we highlight two ideas that may lead us to a better understanding of why heterospory evolved. First, models of optimal resource allocation - an approach that has been used for decades in evolutionary ecology to help understand parental investment and other life-history patterns - suggest that an evolutionary increase in spore size could reach a threshold at which small spores yielding small, sperm-producing gametophytes would return greater fitness per unit of resource investment than would large spores and bisexual gametophytes. With the advent of such microspores, megaspores would evolve under frequency-dependent selection. This argument can account for the appearance of heterospory in the Devonian, when increasingly tall and complex

  1. Evolving a photosynthetic organelle

    Directory of Open Access Journals (Sweden)

    Nakayama Takuro

    2012-04-01

    Full Text Available The evolution of plastids from cyanobacteria is believed to represent a singularity in the history of life. The enigmatic amoeba Paulinella and its 'recently' acquired photosynthetic inclusions provide a fascinating system through which to gain fresh insight into how endosymbionts become organelles. The plastids, or chloroplasts, of algae and plants evolved from cyanobacteria by endosymbiosis. This landmark event conferred on eukaryotes the benefits of photosynthesis - the conversion of solar energy into chemical energy - and in so doing had a huge impact on the course of evolution and the climate of Earth [1]. From the present state of plastids, however, it is difficult to trace the evolutionary steps involved in this momentous development, because all modern-day plastids have fully integrated into their hosts. Paulinella chromatophora is a unicellular eukaryote that bears photosynthetic entities called chromatophores that are derived from cyanobacteria and has thus received much attention as a possible example of an organism in the early stages of organellogenesis. Recent studies have unlocked the genomic secrets of its chromatophore [2,3] and provided concrete evidence that the Paulinella chromatophore is a bona fide photosynthetic organelle [4]. The question is how Paulinella can help us to understand the process by which an endosymbiont is converted into an organelle.

  2. Evolving a photosynthetic organelle.

    Science.gov (United States)

    Nakayama, Takuro; Archibald, John M

    2012-04-24

    The evolution of plastids from cyanobacteria is believed to represent a singularity in the history of life. The enigmatic amoeba Paulinella and its 'recently' acquired photosynthetic inclusions provide a fascinating system through which to gain fresh insight into how endosymbionts become organelles. The plastids, or chloroplasts, of algae and plants evolved from cyanobacteria by endosymbiosis. This landmark event conferred on eukaryotes the benefits of photosynthesis--the conversion of solar energy into chemical energy--and in so doing had a huge impact on the course of evolution and the climate of Earth [1]. From the present state of plastids, however, it is difficult to trace the evolutionary steps involved in this momentous development, because all modern-day plastids have fully integrated into their hosts. Paulinella chromatophora is a unicellular eukaryote that bears photosynthetic entities called chromatophores that are derived from cyanobacteria and has thus received much attention as a possible example of an organism in the early stages of organellogenesis. Recent studies have unlocked the genomic secrets of its chromatophore [2,3] and provided concrete evidence that the Paulinella chromatophore is a bona fide photosynthetic organelle [4]. The question is how Paulinella can help us to understand the process by which an endosymbiont is converted into an organelle.

  3. Communicability across evolving networks.

    Science.gov (United States)

    Grindrod, Peter; Parsons, Mark C; Higham, Desmond J; Estrada, Ernesto

    2011-04-01

    Many natural and technological applications generate time-ordered sequences of networks, defined over a fixed set of nodes; for example, time-stamped information about "who phoned who" or "who came into contact with who" arise naturally in studies of communication and the spread of disease. Concepts and algorithms for static networks do not immediately carry through to this dynamic setting. For example, suppose A and B interact in the morning, and then B and C interact in the afternoon. Information, or disease, may then pass from A to C, but not vice versa. This subtlety is lost if we simply summarize using the daily aggregate network given by the chain A-B-C. However, using a natural definition of a walk on an evolving network, we show that classic centrality measures from the static setting can be extended in a computationally convenient manner. In particular, communicability indices can be computed to summarize the ability of each node to broadcast and receive information. The computations involve basic operations in linear algebra, and the asymmetry caused by time's arrow is captured naturally through the noncommutativity of matrix-matrix multiplication. Illustrative examples are given for both synthetic and real-world communication data sets. We also discuss the use of the new centrality measures for real-time monitoring and prediction.
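
    The dynamic communicability described above can be sketched in a few lines. The snippet below follows the walk-based construction the abstract refers to (a time-ordered product of matrix resolvents), but the three-node example, the node labels and the damping parameter a are illustrative choices, not taken from the paper.

    # Dynamic communicability of an evolving network: Q = prod_k (I - a*A_k)^(-1),
    # taken over time-ordered adjacency snapshots. The toy snapshots reproduce the
    # morning A-B / afternoon B-C example from the abstract.
    import numpy as np

    def dynamic_communicability(snapshots, a):
        """Accumulate the resolvent product over time-ordered adjacency matrices."""
        n = snapshots[0].shape[0]
        Q = np.eye(n)
        for A in snapshots:  # order matters: matrix products do not commute
            Q = Q @ np.linalg.inv(np.eye(n) - a * A)
        return Q

    # Nodes 0, 1, 2 stand for A, B, C.
    A_morning = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], float)
    A_afternoon = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]], float)

    # a must satisfy a < 1/max_k rho(A_k); both snapshots here have rho = 1.
    Q = dynamic_communicability([A_morning, A_afternoon], a=0.5)
    broadcast = Q.sum(axis=1)  # ability of each node to send information forward in time
    receive = Q.sum(axis=0)    # ability of each node to receive information
    print("Q[A,C] =", Q[0, 2], "Q[C,A] =", Q[2, 0])  # information reaches C from A, not vice versa

    The asymmetry of Q (Q[A,C] > 0 while Q[C,A] = 0) is exactly the "time's arrow" effect that the abstract attributes to the noncommutativity of matrix multiplication.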

  4. Evolving Concepts of Asthma

    Science.gov (United States)

    Ray, Anuradha; Wenzel, Sally E.

    2015-01-01

    Our understanding of asthma has evolved over time from a singular disease to a complex of various phenotypes, with varied natural histories, physiologies, and responses to treatment. Early therapies treated most patients with asthma similarly, with bronchodilators and corticosteroids, but these therapies had varying degrees of success. Similarly, despite initial studies that identified an underlying type 2 inflammation in the airways of patients with asthma, biologic therapies targeted toward these type 2 pathways were unsuccessful in all patients. These observations led to increased interest in phenotyping asthma. Clinical approaches, both biased and later unbiased/statistical approaches to large asthma patient cohorts, identified a variety of patient characteristics, but they also consistently identified the importance of age of onset of disease and the presence of eosinophils in determining clinically relevant phenotypes. These paralleled molecular approaches to phenotyping that developed an understanding that not all patients share a type 2 inflammatory pattern. Using biomarkers to select patients with type 2 inflammation, repeated trials of biologics directed toward type 2 cytokine pathways saw newfound success, confirming the importance of phenotyping in asthma. Further research is needed to clarify additional clinical and molecular phenotypes, validate predictive biomarkers, and identify new areas for possible interventions. PMID:26161792

  5. UKAEA'S evolving contract philosophy

    International Nuclear Information System (INIS)

    Nicol, R. D.

    2003-01-01

    The United Kingdom Atomic Energy Authority (UKAEA) has gone through fundamental change over the last ten years. At the heart of this change has been UKAEA's relationship with the contracting and supply market. This paper describes the way in which UKAEA actively developed the market to support the decommissioning programme, and how the approach to contracting has evolved as external pressures and demands have changed. UKAEA's pro-active approach to industry has greatly assisted the development of a healthy, competitive market for services supporting decommissioning in the UK. There have been difficult changes and many challenges along the way, and some retrenchment was necessary to meet regulatory requirements. Nevertheless, UKAEA has sustained a high level of competition - now measured in terms of competed spend as a proportion of competable spend - with annual out-turns consistently over 80%. The prime responsibility for market development will pass to the new Nuclear Decommissioning Authority (NDA) in 2005, as the owner, on behalf of the Government, of the UK's civil nuclear liabilities. The preparatory work for the NDA indicates that the principles established by UKAEA will be carried forward. (author)

  6. Interpreting conjunctions.

    Science.gov (United States)

    Bott, Lewis; Frisson, Steven; Murphy, Gregory L

    2009-04-01

    The interpretation generated from a sentence of the form P and Q can often be different to that generated by Q and P, despite the fact that and has a symmetric truth-conditional meaning. We experimentally investigated to what extent this difference in meaning is due to the connective and and to what extent it is due to order of mention of the events in the sentence. In three experiments, we collected interpretations of sentences in which we varied the presence of the conjunction, the order of mention of the events, and the type of relation holding between the events (temporally vs. causally related events). The results indicated that the effect of using a conjunction was dependent on the discourse relation between the events. Our findings contradict a narrative marker theory of and, but provide partial support for a single-unit theory derived from Carston (2002). The results are discussed in terms of conjunction processing and implicatures of temporal order.

  7. Infrared spectroscopy of evolved objects

    International Nuclear Information System (INIS)

    Aitken, D.K.; Roche, P.F.

    1984-01-01

    In this review, the authors are concerned with spectroscopic observations of evolved objects made in the wavelength range 1-300μm. Spectroscopic observations can conveniently be divided into studies of narrow lines, bands and broader continua. The vibrational frequencies of molecular groups fall mainly in this spectral region and appear as vibration-rotation bands from the gas phase, and as less structured, but often broader, features from the solid state. Many ionic lines, including recombination lines of abundant species and fine structure lines of astrophysically important ions also appear in this region. The continuum can arise from a number of mechanisms - photospheric emission, radiation from dust, free-free transitions in ionized gas and non-thermal processes. (Auth.)

  8. Interpretation of computed tomographic images

    International Nuclear Information System (INIS)

    Stickle, R.L.; Hathcock, J.T.

    1993-01-01

    This article discusses the production of optimal CT images in small animal patients as well as principles of radiographic interpretation. Technical factors affecting image quality and aiding image interpretation are included. Specific considerations for scanning various anatomic areas are given, including indications and potential pitfalls. Selected patient images are illustrated.

  9. Objective interpretation as conforming interpretation

    OpenAIRE

    Lidka Rodak

    2011-01-01

    The practical discourse willingly uses the formula of “objective interpretation”, with no regard to its controversial nature, which has been discussed in the literature. The main aim of the article is to investigate what “objective interpretation” could mean and how it could be understood in the practical discourse, focusing on the understanding offered by judicature. The thesis of the article is that objective interpretation, as identified with textualists’ position, is not possible to uphold, and ...

  10. Disgust: Evolved function and structure

    NARCIS (Netherlands)

    Tybur, J.M.; Lieberman, D.; Kurzban, R.; DeScioli, P.

    2013-01-01

    Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and

  11. Natural selection promotes antigenic evolvability

    NARCIS (Netherlands)

    Graves, C.J.; Ros, V.I.D.; Stevenson, B.; Sniegowski, P.D.; Brisson, D.

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide

  12. Mammographic interpretation

    International Nuclear Information System (INIS)

    Tabor, L.

    1987-01-01

    For mammography to be an effective diagnostic method, it must be performed to a very high standard of quality. Otherwise many lesions, in particular cancer in its early stages, will simply not be detectable on the films, regardless of the skill of the mammographer. Mammographic interpretation consists of two basic steps: perception and analysis. The process of mammographic interpretation begins with perception of the lesion on the mammogram. Perception is influenced by several factors. One of the most important is the parenchymal pattern of the breast tissue, detection of pathologic lesions being easier with fatty involution. The mammographer should use a method for the systematic viewing of the mammograms that will ensure that all parts of each mammogram are carefully searched for the presence of lesions. The method of analysis proceeds according to the type of lesion. Contour analysis is of primary importance in the evaluation of circumscribed tumors. After having analyzed the contour and density of a lesion and considered its size, the mammographer should be fairly certain whether the circumscribed tumor is benign or malignant. Fine-needle puncture and/or US may assist the mammographer in making this decision. Painstaking analysis is required because many circumscribed tumors do not need to be biopsied. The perception of circumscribed tumors seldom causes problems, but their analysis needs careful attention. On the other hand, the major challenge with star-shaped lesions is perception. They may be difficult to discover when small. Although the final diagnosis of a stellate lesion can be made only with the help of histologic examination, the preoperative mammographic differential diagnosis can be highly accurate. The differential diagnostic problem is between malignant tumors (scirrhous carcinoma), on the one hand, and traumatic fat necrosis as well as radial scars, on the other hand.

  13. Designing Garments to Evolve Over Time

    DEFF Research Database (Denmark)

    Riisberg, Vibeke; Grose, Lynda

    2017-01-01

    This paper proposes a REDO of the current fashion paradigm by investigating how garments might be designed to evolve over time. The purpose is to discuss ways of expanding the traditional role of the designer to include temporal dimensions of creating, producing and using clothes and to suggest...... to a REDO of design education, to further research and the future fashion and textile industry....

  14. The Interpretive Function

    DEFF Research Database (Denmark)

    Agerbo, Heidi

    2017-01-01

    Approximately a decade ago, it was suggested that a new function should be added to the lexicographical function theory: the interpretive function(1). However, hardly any research has been conducted into this function, and though it was only suggested that this new function was relevant...... to incorporate into lexicographical theory, some scholars have since then assumed that this function exists(2), including the author of this contribution. In Agerbo (2016), I present arguments supporting the incorporation of the interpretive function into the function theory and suggest how non-linguistic signs...... can be treated in specific dictionary articles. However, in the current article, due to the results of recent research, I argue that the interpretive function should not be considered an individual main function. The interpretive function, contrary to some of its definitions, is not connected...

  15. Schrodinger's mechanics interpretation

    CERN Document Server

    Cook, David B

    2018-01-01

    The interpretation of quantum mechanics has been in dispute for nearly a century with no sign of a resolution. Using a careful examination of the relationship between the final form of classical particle mechanics (the Hamilton–Jacobi Equation) and Schrödinger's mechanics, this book presents a coherent way of addressing the problems and paradoxes that emerge through conventional interpretations. Schrödinger's Mechanics critiques the popular way of giving physical interpretation to the various terms in perturbation theory and other technologies and places an emphasis on development of the theory and not on an axiomatic approach. When this interpretation is made, the extension of Schrödinger's mechanics in relation to other areas, including spin, relativity and fields, is investigated and new conclusions are reached.

  16. Revealing evolved massive stars with Spitzer

    Science.gov (United States)

    Gvaramadze, V. V.; Kniazev, A. Y.; Fabrika, S.

    2010-06-01

    Massive evolved stars lose a large fraction of their mass via copious stellar wind or instant outbursts. During certain evolutionary phases, they can be identified by the presence of their circumstellar nebulae. In this paper, we present the results of a search for compact nebulae (reminiscent of circumstellar nebulae around evolved massive stars) using archival 24-μm data obtained with the Multiband Imaging Photometer for Spitzer. We have discovered 115 nebulae, most of which bear a striking resemblance to the circumstellar nebulae associated with luminous blue variables (LBVs) and late WN-type (WNL) Wolf-Rayet (WR) stars in the Milky Way and the Large Magellanic Cloud (LMC). We interpret this similarity as an indication that the central stars of detected nebulae are either LBVs or related evolved massive stars. Our interpretation is supported by follow-up spectroscopy of two dozen of these central stars, most of which turn out to be either candidate LBVs (cLBVs), blue supergiants or WNL stars. We expect that the forthcoming spectroscopy of the remaining objects from our list, accompanied by the spectrophotometric monitoring of the already discovered cLBVs, will further increase the known population of Galactic LBVs. This, in turn, will have profound consequences for better understanding the LBV phenomenon and its role in the transition between hydrogen-burning O stars and helium-burning WR stars. We also report on the detection of an arc-like structure attached to the cLBV HD 326823 and an arc associated with the LBV R99 (HD 269445) in the LMC.

  17. An Interpreter's Interpretation: Sign Language Interpreters' View of Musculoskeletal Disorders

    National Research Council Canada - National Science Library

    Johnson, William L

    2003-01-01

    Sign language interpreters are at increased risk for musculoskeletal disorders. This study used content analysis to obtain detailed information about these disorders from the interpreters' point of view...

  18. Paediatric dentistry- novel evolvement

    Directory of Open Access Journals (Sweden)

    Saleha Shah, B.D.S, MClinDent Paediatric Dentistry (UK)

    2018-01-01

    Full Text Available Pediatric dentistry provides primary and comprehensive preventive and therapeutic oral health care for infants and children through adolescence, including those with special health care needs. This specialty encompasses a variety of skills, disciplines, procedures and techniques that share a common origin with other dental specialties; however, these have been modified and adapted to the distinctive requirements of infants, children, adolescents and patients with special health care needs. Disciplines comprise behavior guidance, care of the medically and developmentally compromised and disabled patient, supervision of orofacial growth and development, caries prevention, sedation, pharmacological management, and hospital dentistry, in addition to other traditional fields of dentistry. The skills apply to the ever-changing stages of dental, physical, and psychosocial development for treating conditions and diseases distinctive to growing individuals. Hence, with the changing scope of practice, it is imperative that the clinician stays updated with current evidence-based trends in practice, collaborates with other disciplines and imparts quality oral health care tailored to the specific needs of every child.

  19. Interpreting the Customary Rules on Interpretation

    NARCIS (Netherlands)

    Merkouris, Panos

    2017-01-01

    International courts have at times interpreted the customary rules on interpretation. This is interesting because what is being interpreted is: i) rules of interpretation, which sounds dangerously tautological, and ii) customary law, the interpretation of which has not been the object of critical

  20. Evolving Capabilities for Virtual Globes

    Science.gov (United States)

    Glennon, A.

    2006-12-01

    Though thin-client spatial visualization software like Google Earth and NASA World Wind enjoys widespread popularity, a common criticism is its general lack of analytical functionality. This concern, however, is rapidly being addressed; standard and advanced geographic information system (GIS) capabilities are being developed for virtual globes--though not centralized into a single implementation or software package. The innovation is mostly originating from the user community. Three such capabilities relevant to the earth science, education, and emergency management communities are modeling dynamic spatial phenomena, real-time data collection and visualization, and multi-input collaborative databases. Modeling dynamic spatial phenomena has been facilitated through joining virtual globe geometry definitions--like KML--to relational databases. Real-time data collection uses short scripts to transform user-contributed data into a format usable by virtual globe software. Similarly, collaborative data collection for virtual globes has become possible by dynamically referencing online, multi-person spreadsheets. Examples of these functions include mapping flows within a karst watershed, real-time disaster assessment and visualization, and a collaborative geyser eruption spatial decision support system. Virtual globe applications will continue to evolve, adding further analytical capabilities, richer temporal data handling, and support for scales from nano to intergalactic. This progression opens education and research avenues in all scientific disciplines.
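
    As a concrete illustration of the 'short scripts' mentioned above, the sketch below converts user-contributed point records (for example, rows exported from an online spreadsheet) into KML placemarks that a virtual globe can display. The column names and file paths are hypothetical, not taken from the abstract.

    # Turn tabular point data into a KML document of placemarks.
    # Assumed input columns: name, description, lat, lon (hypothetical).
    import csv

    def rows_to_kml(rows):
        """Build a KML document from dicts with 'name', 'description', 'lat', 'lon'."""
        lines = ['<?xml version="1.0" encoding="UTF-8"?>',
                 '<kml xmlns="http://www.opengis.net/kml/2.2">', '<Document>']
        for r in rows:
            lines += ['<Placemark>',
                      '  <name>{}</name>'.format(r["name"]),
                      '  <description>{}</description>'.format(r["description"]),
                      '  <Point><coordinates>{},{},0</coordinates></Point>'.format(
                          float(r["lon"]), float(r["lat"])),
                      '</Placemark>']
        lines += ['</Document>', '</kml>']
        return "\n".join(lines)

    if __name__ == "__main__":
        with open("observations.csv", newline="") as f:   # hypothetical input file
            rows = list(csv.DictReader(f))
        with open("observations.kml", "w") as f:          # load the output in a virtual globe
            f.write(rows_to_kml(rows))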

  1. Natural selection promotes antigenic evolvability.

    Science.gov (United States)

    Graves, Christopher J; Ros, Vera I D; Stevenson, Brian; Sniegowski, Paul D; Brisson, Dustin

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed 'cassettes' that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish chronic infections.
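
    The key quantity in the argument above is variation among the unexpressed cassettes. The study itself quantified this with dN/dS ratios and Bayesian population-genetic analyses; as a much simpler illustration of the kind of measure involved, the sketch below computes the mean pairwise amino-acid diversity of a set of aligned cassette fragments. The sequences are invented.

    # Illustrative only: mean pairwise amino-acid diversity among aligned cassette
    # sequences, a simple proxy for among-cassette variation. The toy sequences
    # below are invented, not taken from the vls locus.
    from itertools import combinations

    def pairwise_diversity(seqs):
        """Mean proportion of differing positions over all pairs of equal-length sequences."""
        def diff(a, b):
            return sum(x != y for x, y in zip(a, b)) / len(a)
        pairs = list(combinations(seqs, 2))
        return sum(diff(a, b) for a, b in pairs) / len(pairs)

    cassettes = [
        "MKKISSAILLTALF",   # hypothetical aligned cassette fragments
        "MKKLSSAIMLTALF",
        "MKRISSAVLLTGLF",
    ]
    print(f"mean pairwise diversity: {pairwise_diversity(cassettes):.3f}")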

  2. Natural selection promotes antigenic evolvability.

    Directory of Open Access Journals (Sweden)

    Christopher J Graves

    Full Text Available The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed 'cassettes' that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish

  3. Robustness to Faults Promotes Evolvability: Insights from Evolving Digital Circuits.

    Science.gov (United States)

    Milano, Nicola; Nolfi, Stefano

    2016-01-01

    We demonstrate how the need to cope with operational faults enables evolving circuits to find more fit solutions. The analysis of the results obtained in different experimental conditions indicates that, in the absence of faults, evolution tends to select circuits that are small and have low phenotypic variability and evolvability. The need to face operational faults, instead, drives evolution toward the selection of larger circuits that are truly robust with respect to genetic variations and that have a greater level of phenotypic variability and evolvability. Overall our results indicate that the need to cope with operational faults leads to the selection of circuits that have a greater probability to generate better circuits as a result of genetic variation with respect to a control condition in which circuits are not subjected to faults.

  4. Interpreting & Biomechanics. PEPNet Tipsheet

    Science.gov (United States)

    PEPNet-Northeast, 2001

    2001-01-01

    Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…

  5. Idiopathic pulmonary fibrosis: evolving concepts.

    Science.gov (United States)

    Ryu, Jay H; Moua, Teng; Daniels, Craig E; Hartman, Thomas E; Yi, Eunhee S; Utz, James P; Limper, Andrew H

    2014-08-01

    Idiopathic pulmonary fibrosis (IPF) occurs predominantly in middle-aged and older adults and accounts for 20% to 30% of interstitial lung diseases. It is usually progressive, resulting in respiratory failure and death. Diagnostic criteria for IPF have evolved over the years, and IPF is currently defined as a disease characterized by the histopathologic pattern of usual interstitial pneumonia occurring in the absence of an identifiable cause of lung injury. Understanding of the pathogenesis of IPF has shifted away from chronic inflammation and toward dysregulated fibroproliferative repair in response to alveolar epithelial injury. Idiopathic pulmonary fibrosis is likely a heterogeneous disorder caused by various interactions between genetic components and environmental exposures. High-resolution computed tomography can be diagnostic in the presence of typical findings such as bilateral reticular opacities associated with traction bronchiectasis/bronchiolectasis in a predominantly basal and subpleural distribution, along with subpleural honeycombing. In other circumstances, a surgical lung biopsy may be needed. The clinical course of IPF can be unpredictable and may be punctuated by acute deteriorations (acute exacerbation). Although progress continues in unraveling the mechanisms of IPF, effective therapy has remained elusive. Thus, clinicians and patients need to reach informed decisions regarding management options including lung transplant. The findings in this review were based on a literature search of PubMed using the search terms idiopathic pulmonary fibrosis and usual interstitial pneumonia, limited to human studies in the English language published from January 1, 2000, through December 31, 2013, and supplemented by key references published before the year 2000. Copyright © 2014 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  6. A neighbourhood evolving network model

    International Nuclear Information System (INIS)

    Cao, Y.J.; Wang, G.Z.; Jiang, Q.Y.; Han, Z.X.

    2006-01-01

    Many social, technological, biological and economic systems are best described by evolving network models. In this short Letter, we propose and study a new evolving network model. The model is based on the new concept of neighbourhood connectivity, which exists in many physical complex networks. The statistical properties and dynamics of the proposed model are analytically studied and compared with those of the Barabasi-Albert scale-free model. Numerical simulations indicate that this network model yields a transition between power-law and exponential scaling, while the Barabasi-Albert scale-free model is only one of its special (limiting) cases. Particularly, this model can be used to enhance the evolving mechanism of complex networks in the real world, such as the development of some social networks

  7. The evolving epidemiology of inflammatory bowel disease.

    LENUS (Irish Health Repository)

    Shanahan, Fergus

    2009-07-01

    Epidemiologic studies in inflammatory bowel disease (IBD) include assessments of disease burden and evolving patterns of disease presentation. Although it is hoped that sound epidemiologic studies provide aetiological clues, traditional risk factor-based epidemiology has provided limited insights into either Crohn's disease or ulcerative colitis etiopathogenesis. In this update, we will summarize how the changing epidemiology of IBD associated with modernization can be reconciled with current concepts of disease mechanisms and will discuss studies of clinically significant comorbidity in IBD.

  8. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    Science.gov (United States)

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  9. Evolving artificial metalloenzymes via random mutagenesis

    Science.gov (United States)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.

  10. Evolving expectations from international organisations

    International Nuclear Information System (INIS)

    Ruiz Lopez, C.

    2008-01-01

    The author stated that implementation of the geological disposal concept requires a strategy that provides national decision makers with sufficient confidence in the level of long-term safety and protection ultimately achieved. The concept of protection against harm has a broader meaning than radiological protection in terms of risk and dose. It includes the protection of the environment and socio-economic interests of communities. She recognised that a number of countries have established regulatory criteria already, and others are now discussing what constitutes a proper regulatory test and suitable time frame for judging the safety of long-term disposal. Each regulatory programme seeks to define reasonable tests of repository performance, using protection criteria and safety approaches consistent with the culture, values and expectations of the citizens of the country concerned. This means that there are differences in how protection and safety are addressed in national approaches to regulation and in the bases used for that. However, as was recognised in the Cordoba Workshop, it would be important to reach a minimum level of consistency and be able to explain the differences. C. Ruiz-Lopez presented an overview of the development of international guidance from ICRP, IAEA and NEA from the Cordoba workshop up to now, and positions of independent National Advisory Bodies. The evolution of these guidelines over time demonstrates an evolving understanding of long-term implications, with the recognition that dose and risk constraints should not be seen as measures of detriment beyond a few hundred years, the emphasis on sound engineering practices, and the introduction of new concepts and approaches which take into account social and economic aspects (e.g. constrained optimisation, BAT, managerial principles). In its new recommendations, ICRP (draft 2006) recognizes, in particular, that decision making processes may depend on other societal concerns and considers

  11. Interpretive Media Study and Interpretive Social Science.

    Science.gov (United States)

    Carragee, Kevin M.

    1990-01-01

    Defines the major theoretical influences on interpretive approaches in mass communication, examines the central concepts of these perspectives, and provides a critique of these approaches. States that the adoption of interpretive approaches in mass communication has ignored varied critiques of interpretive social science. Suggests that critical…

  12. Interpreters, Interpreting, and the Study of Bilingualism.

    Science.gov (United States)

    Valdes, Guadalupe; Angelelli, Claudia

    2003-01-01

    Discusses research on interpreting focused specifically on issues raised by this literature about the nature of bilingualism. Suggests research carried out on interpreting--while primarily produced with a professional audience in mind and concerned with improving the practice of interpreting--provides valuable insights about complex aspects of…

  13. Laplacian Estrada and normalized Laplacian Estrada indices of evolving graphs.

    Directory of Open Access Journals (Sweden)

    Yilun Shang

    Full Text Available Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we established general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices.

  14. Laplacian Estrada and normalized Laplacian Estrada indices of evolving graphs.

    Science.gov (United States)

    Shang, Yilun

    2015-01-01

    Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we established general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices.
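
    Both records above build on per-snapshot spectral quantities. As a minimal sketch of those building blocks (not of the papers' dynamic bounds themselves), the code below computes the Estrada, Laplacian Estrada and normalized Laplacian Estrada indices for each snapshot of a synthetic evolving graph, using one common set of definitions based on the eigenvalues of the adjacency, Laplacian and normalized Laplacian matrices.

    # Per-snapshot Estrada-type indices for a synthetic evolving graph.
    # EE uses adjacency eigenvalues, LEE the Laplacian L = D - A, and NLEE the
    # normalized Laplacian; the random snapshots below are purely illustrative.
    import numpy as np

    def estrada_indices(A):
        """Return (EE, LEE, NLEE) for a symmetric 0/1 adjacency matrix A."""
        deg = A.sum(axis=1)
        L = np.diag(deg) - A                                 # combinatorial Laplacian
        d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(np.maximum(deg, 1)), 0.0)
        N = d_inv_sqrt[:, None] * L * d_inv_sqrt[None, :]    # normalized Laplacian
        ee = np.exp(np.linalg.eigvalsh(A)).sum()
        lee = np.exp(np.linalg.eigvalsh(L)).sum()
        nlee = np.exp(np.linalg.eigvalsh(N)).sum()
        return ee, lee, nlee

    rng = np.random.default_rng(0)
    n, p = 30, 0.1
    for t in range(5):                                       # five random snapshots
        A = (rng.random((n, n)) < p).astype(float)
        A = np.triu(A, 1)
        A = A + A.T                                          # symmetric, no self-loops
        ee, lee, nlee = estrada_indices(A)
        print(f"t={t}: EE={ee:.1f}  LEE={lee:.1f}  NLEE={nlee:.1f}")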

  15. Interpretation and clinical applications

    International Nuclear Information System (INIS)

    Higgins, C.B.

    1987-01-01

    This chapter discusses the factors to be kept in mind during routine interpretation of MR images. This includes the factors that determine contrast on standard spin-echo images and some distinguishing features between true lesions and artifactually simulated lesions. This chapter also indicates the standard protocols for MRI of various portions of the body. Finally, the current indications for MRI of various portions of the body are suggested; however, it is recognized that the indications for MRI are rapidly increasing and consequently, at the time of publication of this chapter, it is likely that many more applications will have become evident. Interpretation of magnetic resonance (MR) images requires consideration of anatomy and tissue characteristics and extraction of artifacts resulting from motion and other factors

  16. The 'E' factor -- evolving endodontics.

    Science.gov (United States)

    Hunter, M J

    2013-03-01

    Endodontics is a constantly developing field, with new instruments, preparation techniques and sealants competing with trusted and traditional approaches to tooth restoration. Thus general dental practitioners must question and understand the significance of these developments before adopting new practices. In view of this, the aim of this article, and the associated presentation at the 2013 British Dental Conference & Exhibition, is to provide an overview of endodontic methods and constantly evolving best practice. The presentation will review current preparation techniques, comparing rotary versus reciprocation, and question current trends in restoration of the endodontically treated tooth.

  17. Mobile computing acceptance grows as applications evolve.

    Science.gov (United States)

    Porn, Louis M; Patrick, Kelly

    2002-01-01

    Handheld devices are becoming more cost-effective to own, and their use in healthcare environments is increasing. Handheld devices currently are being used for e-prescribing, charge capture, and accessing daily schedules and reference tools. Future applications may include education on medications, dictation, order entry, and test-results reporting. Selecting the right handheld device requires careful analysis of current and future applications, as well as vendor expertise. It is important to recognize the technology will continue to evolve over the next three years.

  18. Medical interpreters as tools: dangers and challenges in the utilitarian approach to interpreters' roles and functions.

    Science.gov (United States)

    Hsieh, Elaine; Kramer, Eric Mark

    2012-10-01

    This study explores the tensions, challenges, and dangers when a utilitarian view of interpreter is constructed, imposed, and/or reinforced in health care settings. We conducted in-depth interviews and focus groups with 26 medical interpreters from 17 different languages and cultures and 39 providers of five specialties. Grounded theory was used for data analysis. The utilitarian view to interpreters' roles and functions influences providers in the following areas: (a) hierarchical structure and unidirectional communication, (b) the interpreter seen as information gatekeeper, (c) the interpreter seen as provider proxy, and (d) interpreter's emotional support perceived as tools. When interpreters are viewed as passive instruments, a utilitarian approach may compromise the quality of care by silencing patients' and interpreters' voice, objectifying interpreters' emotional work, and exploiting patients' needs. Providers need to recognize that a utilitarian approach to the interpreter's role and functions may create interpersonal and ethical dilemmas that compromise the quality of care. By viewing interpreters as smart technology (rather than passive instruments), both providers and interpreters can learn from and co-evolve with each other, allowing them to maintain control over their expertise and to work as collaborators in providing quality care. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. On court interpreters' visibility

    DEFF Research Database (Denmark)

    Dubslaff, Friedel; Martinsen, Bodil

    of the service they receive. Ultimately, the findings will be used for training purposes. Future - and, for that matter, already practising - interpreters as well as the professional users of interpreters ought to take the reality of the interpreters' work in practice into account when assessing the quality...... on the interpreter's interpersonal role and, in particular, on signs of the interpreter's visibility, i.e. active co-participation. At first sight, the interpreting assignment in question seems to be a short and simple routine task which would not require the interpreter to deviate from the traditional picture...

  20. Peripartum hysterectomy: an evolving picture.

    LENUS (Irish Health Repository)

    Turner, Michael J

    2012-02-01

    Peripartum hysterectomy (PH) is one of the obstetric catastrophes. Evidence is emerging that the role of PH in modern obstetrics is evolving. Improving management of postpartum hemorrhage and newer surgical techniques should decrease PH for uterine atony. Rising levels of repeat elective cesarean deliveries should decrease PH following uterine scar rupture in labor. Increasing cesarean rates, however, have led to an increase in the number of PHs for morbidly adherent placenta. In the case of uterine atony or rupture where PH is required, a subtotal PH is often sufficient. In the case of pathological placental localization involving the cervix, however, a total hysterectomy is required. Furthermore, the involvement of other pelvic structures may prospectively make the diagnosis difficult and the surgery challenging. If resources permit, PH for pathological placental localization merits a multidisciplinary approach. Despite advances in clinical practice, it is likely that peripartum hysterectomy will be more challenging for obstetricians in the future.

  1. Adaptive inferential sensors based on evolving fuzzy models.

    Science.gov (United States)

    Angelov, Plamen; Kordon, Arthur

    2010-04-01

    A new technique for the design and use of inferential sensors in the process industry is proposed in this paper, which is based on the recently introduced concept of evolving fuzzy models (EFMs). They address the challenge that the modern process industry faces today, namely, to develop such adaptive and self-calibrating online inferential sensors that reduce the maintenance costs while keeping the high precision and interpretability/transparency. The proposed new methodology makes it possible for inferential sensors to recalibrate automatically, which reduces significantly the life-cycle efforts for their maintenance. This is achieved by the adaptive and flexible open-structure EFM used. The novelty of this paper lies in the following: (1) the overall concept of inferential sensors with evolving and self-developing structure from the data streams; (2) the new methodology for online automatic selection of input variables that are most relevant for the prediction; (3) the technique to detect automatically a shift in the data pattern using the age of the clusters (and fuzzy rules); (4) the online standardization technique used by the learning procedure of the evolving model; and (5) the application of this innovative approach to several real-life industrial processes from the chemical industry (evolving inferential sensors, namely, eSensors, were used for predicting the chemical properties of different products in The Dow Chemical Company, Freeport, TX). It should be noted, however, that the methodology and conclusions of this paper are valid for the broader area of chemical and process industries in general. The results demonstrate that interpretable inferential sensors with a simple structure, which predict various process variables of interest, can automatically be designed from the data stream in real time. The proposed approach can be used as a basis for the development of a new generation of adaptive and evolving inferential sensors that can address the
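
    Item (4) above mentions an online standardization step inside the learning procedure. As a generic illustration of how a data stream can be standardized recursively, the sketch below uses a textbook running mean/variance update (Welford's algorithm); it is a stand-in for the general idea, not the authors' exact eSensor procedure, and the sample values are invented.

    # Generic recursive (online) standardization of a data stream.
    class OnlineStandardizer:
        def __init__(self):
            self.n = 0
            self.mean = 0.0
            self.m2 = 0.0  # running sum of squared deviations from the mean

        def update(self, x):
            """Incorporate one new sample and return its standardized value."""
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)
            std = (self.m2 / (self.n - 1)) ** 0.5 if self.n > 1 else 0.0
            return (x - self.mean) / std if std > 0 else 0.0

    stream = [2.1, 2.4, 2.0, 5.7, 2.2, 2.3]   # made-up process measurements
    z = OnlineStandardizer()
    for x in stream:
        print(f"x={x:4.1f}  z={z.update(x):+.2f}")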

  2. CERN internal communication is evolving

    CERN Multimedia

    2016-01-01

    CERN news will now be regularly updated on the CERN People page (see here).      Dear readers, All over the world, communication is becoming increasingly instantaneous, with news published in real time on websites and social networks. In order to keep pace with these changes, CERN's internal communication is evolving too. From now on, you will be informed of what’s happening at CERN more often via the “CERN people” page, which will frequently be updated with news. The Bulletin is following this trend too: twice a month, we will compile the most important articles published on the CERN site, with a brand-new layout. You will receive an e-mail every two weeks as soon as this new form of the Bulletin is available. If you have interesting news or stories to share, tell us about them through the form at: https://communications.web.cern.ch/got-story-cern-website​. You can also find out about news from CERN in real time...

  3. Economies Evolve by Energy Dispersal

    Directory of Open Access Journals (Sweden)

    Stanley Salthe

    2009-10-01

    Full Text Available Economic activity can be regarded as an evolutionary process governed by the 2nd law of thermodynamics. The universal law, when formulated locally as an equation of motion, reveals that a growing economy develops functional machinery and organizes hierarchically in such a way as to tend to equalize energy density differences within the economy and with respect to the surroundings it is open to. Diverse economic activities result in flows of energy that will preferentially channel along the most steeply descending paths, leveling a non-Euclidean free energy landscape. This principle of 'maximal energy dispersal', equivalent to the maximal rate of entropy production, gives rise to economic laws and regularities. The law of diminishing returns follows from the diminishing free energy while the relation between supply and demand displays a quest for a balance among interdependent energy densities. Economic evolution is dissipative motion where the driving forces and energy flows are inseparable from each other. When there are multiple degrees of freedom, economic growth and decline are inherently impossible to forecast in detail. Namely, trajectories of an evolving economy are non-integrable, i.e. unpredictable in detail because a decision by a player will also affect future decisions of other players. We propose that decision making is ultimately about choosing from various actions those that would reduce most effectively subjectively perceived energy gradients.

  4. Recommendation in evolving online networks

    Science.gov (United States)

    Hu, Xiao; Zeng, An; Shang, Ming-Sheng

    2016-02-01

    Recommender system is an effective tool to find the most relevant information for online users. By analyzing the historical selection records of users, recommender system predicts the most likely future links in the user-item network and accordingly constructs a personalized recommendation list for each user. So far, the recommendation process is mostly investigated in static user-item networks. In this paper, we propose a model which allows us to examine the performance of the state-of-the-art recommendation algorithms in evolving networks. We find that the recommendation accuracy in general decreases with time if the evolution of the online network fully depends on the recommendation. Interestingly, some randomness in users' choice can significantly improve the long-term accuracy of the recommendation algorithm. When a hybrid recommendation algorithm is applied, we find that the optimal parameter gradually shifts towards the diversity-favoring recommendation algorithm, indicating that recommendation diversity is essential to keep a high long-term recommendation accuracy. Finally, we confirm our conclusions by studying the recommendation on networks with the real evolution data.
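
    The abstract above does not name the specific hybrid algorithm, but a widely used choice in this literature is the mass-diffusion/heat-conduction ('ProbS+HeatS') hybrid with a mixing parameter lambda. The sketch below scores items for each user on a toy bipartite user-item matrix; the rating matrix and the lambda value are invented for illustration.

    # Hybrid mass-diffusion / heat-conduction recommendation on a bipartite network.
    # W[a, b] = (1 / (k_a^(1-lam) * k_b^lam)) * sum_u A[u,a] * A[u,b] / k_u
    import numpy as np

    def hybrid_scores(A, lam):
        """A: users x items 0/1 matrix. Returns a users x items score matrix."""
        k_item = A.sum(axis=0)                 # item degrees
        k_user = A.sum(axis=1)                 # user degrees
        k_item = np.where(k_item > 0, k_item, 1)
        k_user = np.where(k_user > 0, k_user, 1)
        C = (A / k_user[:, None]).T @ A        # sum_u A[u,a] * A[u,b] / k_u
        W = C / (k_item[:, None] ** (1 - lam) * k_item[None, :] ** lam)
        scores = A @ W.T                       # redistribute each user's resource to items
        scores[A > 0] = -np.inf                # never re-recommend collected items
        return scores

    A = np.array([[1, 1, 0, 0, 0],             # toy user-item selection matrix
                  [1, 0, 1, 0, 0],
                  [0, 1, 1, 1, 0],
                  [0, 0, 0, 1, 1]], float)
    scores = hybrid_scores(A, lam=0.5)         # lam=1 is pure mass diffusion, lam=0 pure heat conduction
    print("top recommendation per user:", scores.argmax(axis=1))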

  5. Genre and Interpretation

    DEFF Research Database (Denmark)

    Auken, Sune

    2015-01-01

    Despite the immensity of genre studies as well as studies in interpretation, our understanding of the relationship between genre and interpretation is sketchy at best. The article attempts to unravel some of intricacies of that relationship through an analysis of the generic interpretation carrie...

  6. Engineering Definitional Interpreters

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Ramsay, Norman; Larsen, Bradford

    2013-01-01

    A definitional interpreter should be clear and easy to write, but it may run 4--10 times slower than a well-crafted bytecode interpreter. In a case study focused on implementation choices, we explore ways of making definitional interpreters faster without expending much programming effort. We imp...

  7. Interpreter services in emergency medicine.

    Science.gov (United States)

    Chan, Yu-Feng; Alagappan, Kumar; Rella, Joseph; Bentley, Suzanne; Soto-Greene, Marie; Martin, Marcus

    2010-02-01

    Emergency physicians are routinely confronted with problems associated with language barriers. It is important for emergency health care providers and the health system to strive for cultural competency when communicating with members of an increasingly diverse society. Possible solutions that can be implemented include appropriate staffing, use of new technology, and efforts to develop new kinds of ties to the community served. Linguistically specific solutions include professional interpretation, telephone interpretation, the use of multilingual staff members, the use of ad hoc interpreters, and, more recently, the use of mobile computer technology at the bedside. Each of these methods carries a specific set of advantages and disadvantages. Although professionally trained medical interpreters offer improved communication, improved patient satisfaction, and overall cost savings, they are often underutilized due to their perceived inefficiency and the inconclusive results of their effect on patient care outcomes. Ultimately, the best solution for each emergency department will vary depending on the population served and available resources. Access to the multiple interpretation options outlined above and solid support and commitment from hospital institutions are necessary to provide proper and culturally competent care for patients. Appropriate communications inclusive of interpreter services are essential for culturally and linguistically competent provider/health systems and overall improved patient care and satisfaction. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  8. Semantic interpretation of search engine resultant

    Science.gov (United States)

    Nasution, M. K. M.

    2018-01-01

    In semantics, logical language can be interpreted in various forms, but the certainty of meaning always carries some uncertainty, which in turn influences the role of technology. One result of this uncertainty concerns search engines as user interfaces to information spaces such as the Web. The behaviour of search engine results should therefore be interpreted through a semantic formulation. Such a formulation shows that several interpretations can be made semantically, whether temporary, by inclusion, or by repetition.

  9. A new evolutionary system for evolving artificial neural networks.

    Science.gov (United States)

    Yao, X; Liu, Y

    1997-01-01

    This paper presents a new evolutionary system, i.e., EPNet, for evolving artificial neural networks (ANNs). The evolutionary algorithm used in EPNet is based on Fogel's evolutionary programming (EP). Unlike most previous studies on evolving ANNs, this paper puts its emphasis on evolving ANNs' behaviors. Five mutation operators proposed in EPNet reflect such an emphasis on evolving behaviors. Close behavioral links between parents and their offspring are maintained by various mutations, such as partial training and node splitting. EPNet evolves ANNs' architectures and connection weights (including biases) simultaneously in order to reduce the noise in fitness evaluation. The parsimony of evolved ANNs is encouraged by preferring node/connection deletion to addition. EPNet has been tested on a number of benchmark problems in machine learning and ANNs, such as the parity problem, the medical diagnosis problems, the Australian credit card assessment problem, and the Mackey-Glass time series prediction problem. The experimental results show that EPNet can produce very compact ANNs with good generalization ability in comparison with other algorithms.
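
    As a heavily simplified illustration of the EP-style loop behind such systems, the sketch below evolves only the connection weights of a small fixed-architecture network by Gaussian mutation (no crossover) on the XOR task. EPNet itself additionally mutates architectures through operators such as partial training, node splitting and node/connection addition or deletion, which are omitted here; the task, population size and mutation scale are arbitrary choices.

    # EP-style (mutation-only) evolution of the weights of a tiny fixed ANN on XOR.
    import numpy as np

    rng = np.random.default_rng(1)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([0, 1, 1, 0], float)              # XOR targets

    def forward(params, X):
        W1, b1, W2, b2 = params
        h = np.tanh(X @ W1 + b1)
        return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

    def fitness(params):                           # negative mean squared error
        return -np.mean((forward(params, X).ravel() - y) ** 2)

    def random_net(hidden=3):
        return [rng.normal(0, 1, (2, hidden)), rng.normal(0, 1, hidden),
                rng.normal(0, 1, (hidden, 1)), rng.normal(0, 1, 1)]

    def mutate(params, sigma=0.3):
        return [p + rng.normal(0, sigma, p.shape) for p in params]

    pop = [random_net() for _ in range(20)]
    for generation in range(300):
        offspring = [mutate(p) for p in pop]       # every parent yields one mutant
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:20]

    best = pop[0]
    print("best fitness:", round(fitness(best), 4))
    print("network outputs:", forward(best, X).ravel().round(2))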

  10. Interpretation of galaxy counts

    International Nuclear Information System (INIS)

    Tinsley, B.M.

    1980-01-01

    New models are presented for the interpretation of recent counts of galaxies to 24th magnitude, and predictions are shown to 28th magnitude for future comparison with data from the Space Telescope. The results supersede earlier, more schematic models by the author. Tyson and Jarvis found in their counts a 'local' density enhancement at 17th magnitude, on comparison with the earlier models; the excess is no longer significant when a more realistic mixture of galaxy colors is used. Bruzual and Kron's conclusion that Kron's counts show evidence for evolution at faint magnitudes is confirmed, and it is predicted that some 23rd-magnitude galaxies have redshifts greater than unity. These may include spheroidal systems, elliptical galaxies, and the bulges of early-type spirals and S0's, seen during their primeval rapid star formation.

  11. The Interpretive Approach to Religious Education: Challenging Thompson's Interpretation

    Science.gov (United States)

    Jackson, Robert

    2012-01-01

    In a recent book chapter, Matthew Thompson makes some criticisms of my work, including the interpretive approach to religious education and the research and activity of Warwick Religions and Education Research Unit. Against the background of a discussion of religious education in the public sphere, my response challenges Thompson's account,…

  12. Evolving Technologies: A View to Tomorrow

    Science.gov (United States)

    Tamarkin, Molly; Rodrigo, Shelley

    2011-01-01

    Technology leaders must participate in strategy creation as well as operational delivery within higher education institutions. The future of higher education--the view to tomorrow--is irrevocably integrated and intertwined with evolving technologies. This article focuses on two specific evolving technologies: (1) alternative IT sourcing; and (2)…

  13. Evolvability Search: Directly Selecting for Evolvability in order to Study and Produce It

    DEFF Research Database (Denmark)

    Mengistu, Henok; Lehman, Joel Anthony; Clune, Jeff

    2016-01-01

    …of evolvable digital phenotypes. Although some types of selection in evolutionary computation indirectly encourage evolvability, one unexplored possibility is to directly select for evolvability. To do so, we estimate an individual's future potential for diversity by calculating the behavioral diversity of its immediate offspring, and select organisms with increased offspring variation. While the technique is computationally expensive, we hypothesized that direct selection would better encourage evolvability than indirect methods. Experiments in two evolutionary robotics domains confirm this hypothesis: in both domains, such Evolvability Search produces solutions with higher evolvability than those produced with Novelty Search or traditional objective-based search algorithms. Further experiments demonstrate that the higher evolvability produced by Evolvability Search in a training environment also generalizes…
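
    The core idea, selecting on an estimate of evolvability obtained from the behavioural diversity of an individual's immediate offspring, can be sketched in a few lines. The genome-to-behaviour mapping, mutation model and population sizes below are purely illustrative assumptions, not the paper's evolutionary robotics setup.

```python
# Illustrative evolvability scoring (assumed toy genome/behaviour; not the paper's code).
import numpy as np

rng = np.random.default_rng(1)

def behavior(genome):
    # Toy behaviour characterization: a nonlinear 2-D projection of the genome.
    return np.array([np.sin(genome).sum(), np.cos(genome).sum()])

def mutate(genome, sigma=0.2):
    return genome + rng.normal(0.0, sigma, size=genome.shape)

def evolvability_score(genome, n_offspring=30):
    """Mean pairwise distance between offspring behaviours (higher = more evolvable)."""
    behaviors = np.array([behavior(mutate(genome)) for _ in range(n_offspring)])
    diffs = behaviors[:, None, :] - behaviors[None, :, :]
    return np.linalg.norm(diffs, axis=-1).mean()

pop = [rng.normal(size=8) for _ in range(30)]
for gen in range(20):
    ranked = sorted(pop, key=evolvability_score, reverse=True)
    parents = ranked[:len(pop) // 2]              # keep the most "evolvable" half
    pop = parents + [mutate(p) for p in parents]  # refill with their offspring
print("top evolvability score:", round(evolvability_score(pop[0]), 3))
```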

  14. Copenhagen interpretation versus Bohm's theory

    International Nuclear Information System (INIS)

    Baumann, K.

    1985-01-01

    The objections raised against Bohm's interpretation of quantum theory are reexamined, and arguments are presented in favour of this theory. Bohm's QED is modified such as to include Dirac particles. It is pointed out that the electric field may be chosen as the 'actual' field instead of the magnetic field. Finally, the theory is reformulated in terms of an arbitrary actual field. (Author)

  15. Evolving the Evolving: Territory, Place and Rewilding in the California Delta

    Directory of Open Access Journals (Sweden)

    Brett Milligan

    2017-10-01

    Current planning and legislation in California’s Sacramento-San Joaquin Delta call for the large-scale ecological restoration of aquatic and terrestrial habitats. These ecological mandates have emerged in response to the region’s infrastructural transformation and the Delta’s predominant use as the central logistical hub in the state’s vast water conveyance network. Restoration is an attempt to recover what was externalized by the logic and abstractions of this logistical infrastructure. However, based on findings from our research, which examined how people are using restored and naturalized landscapes in the Delta and how these landscapes are currently planned for, we argue that as a mitigatory response, restoration planning continues some of the same spatial abstractions and inequities by failing to account for the Delta as an urbanized, cultural and unique place. In interpreting how these conditions have come to be, we give attention to a pluralistic landscape approach and a coevolutionary reading of planning, policy, science and landscapes to discuss the conservation challenges presented by “Delta as an Evolving Place”. We suggest that for rewilding efforts to be successful in the Delta, a range of proactive, opportunistic, grounded and participatory tactics will be required to shift towards a more socio-ecological approach.

  16. The evolving universe and the origin of life the search for our cosmic roots

    CERN Document Server

    Teerikorpi, Pekka; Lehto, Harry; Chernin, Arthur; Byrd, Gene; Lehto, K

    2008-01-01

    Sir Isaac Newton famously said, regarding his discoveries, "If I have seen further it is by standing upon the shoulders of giants." The Evolving Universe and the Origin of Life describes, complete with fascinating biographical details of the thinkers involved, the ascent to the metaphorical shoulders accomplished by the greatest minds in history. For the first time, a single book can take the reader on a journey through the history of the universe as interpreted by the expanding body of knowledge of humankind. From subatomic particles to the protein chains that form life, and expanding in scale to the entire universe, this book covers the science that explains how we came to be. The Evolving Universe and the Origin of Life contains a great breadth of knowledge, from astronomy to physics, from chemistry to biology. It includes over 350 figures that enhance the comprehension of concepts both basic and advanced, and is a non-technical, easy-to-read text at an introductory college level that is ideal for anyone i...

  17. Image analysis enhancement and interpretation

    International Nuclear Information System (INIS)

    Glauert, A.M.

    1978-01-01

    The necessary practical and mathematical background is provided for the analysis of an electron microscope image in order to extract the maximum amount of structural information. Instrumental methods of image enhancement are described, including the use of the energy-selecting electron microscope and the scanning transmission electron microscope. The problems of image interpretation are considered with particular reference to the limitations imposed by radiation damage and specimen thickness. A brief survey is given of the methods for producing a three-dimensional structure from a series of two-dimensional projections, although the emphasis is on the analysis, processing and interpretation of the two-dimensional projection of a structure. (Auth.)

  18. Diverticular Disease: Traditional and Evolving Paradigms.

    Science.gov (United States)

    Lamanna, Lenore; Moran, Patricia E

    Diverticular disease includes diverticulosis, the presence of sac protrusions of the intestinal mucosa, and diverticulitis, inflammation of the diverticula. Diverticular disease is listed as one of the top 10 leading physician diagnoses for gastrointestinal disorders in outpatient clinic visits in the United States. There are several classifications of diverticular disease ranging from asymptomatic diverticulosis to diverticulitis with complications. Several theories are linked to the development of diverticula, including the physiology of the colon itself, collagen cross-linking, and the recently challenged role of low-fiber intake. The differential diagnoses of lower abdominal pain in addition to diverticular disease have overlapping signs and symptoms, which can make a diagnosis challenging. Identification of the distinct signs and symptoms of each classification will assist the practitioner in making the correct diagnosis and lead to appropriate management. The findings from recent studies have changed the paradigm of diverticular disease. The purpose of this article is to discuss traditional dogma and evolving concepts in the pathophysiology, prevention, and management of diverticular disease. Practitioners must be knowledgeable about diverticular disease for improved outcomes.

  19. FY1995 evolvable hardware chip; 1995 nendo shinkasuru hardware chip

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This project aims at the development of 'Evolvable Hardware' (EHW), which can adapt its hardware structure to the environment, under the control of genetic algorithms, to attain better hardware performance. EHW is a key technology for exploring new application areas requiring real-time performance and on-line adaptation. Results include: 1. Development of an EHW-LSI for function-level hardware evolution, which includes 15 DSPs in one chip. 2. Application of EHW to practical industrial applications such as data compression, ATM control, and digital mobile communication. 3. Two patents: (1) the architecture and processing method for a programmable EHW-LSI; (2) a method of loss-less data compression using EHW. 4. The first international conference on evolvable hardware, the Intl. Conf. on Evolvable Systems (ICES96), was held by the authors; it was determined at ICES96 that ICES will be held every two years, alternating between Japan and Europe, and a new society has been established. (NEDO)

  20. Cosmic Biology How Life Could Evolve on Other Worlds

    CERN Document Server

    Irwin, Louis Neil

    2011-01-01

    It is very unlikely that little green humanoids are living on Mars. But what are the possible life forms that might exist in our Solar System and how might they have evolved? This uniquely authoritative and imaginative book on the possibilities for alien life addresses the intrinsic interest that we have about life on other worlds - reinforcing some of our assumptions and reshaping others. It introduces new possibilities that will enlarge our understanding of the issue overall, in particular the enormous range of environments and planetary conditions within which life might evolve. Cosmic Biology -discusses a broad range of possible environments where alien life might have evolved; -explains why carbon-based, water-borne life is more likely than its alternatives, but is not the only possibility; -applies the principles of planetary science and modern biology to evolutionary scenarios on other worlds; -looks at the future fates of living systems, including those on Earth.

  1. Linguistics in Text Interpretation

    DEFF Research Database (Denmark)

    Togeby, Ole

    2011-01-01

    A model for how text interpretation proceeds from what is pronounced, through what is said, to what is communicated, and a definition of the concepts 'presupposition' and 'implicature'.

  2. Somatoparaphrenia: evolving theories and concepts.

    Science.gov (United States)

    Feinberg, Todd E; Venneri, Annalena

    2014-12-01

    Somatoparaphrenia is a syndrome that involves, at a minimum, unawareness of ownership of a body part and, in addition, productive features including delusional misidentification and confabulation. In this review we describe some of the clinical and neuroanatomical features of somatoparaphrenia, highlighting its delusional and confabulatory aspects. Possible theoretical frameworks are reviewed, taking into account cognitive, psychodynamic, and philosophical views. We suggest that future studies should approach this syndrome through investigations of structural and functional connectivity and focus on the possible interplay between alterations in major functional networks of the brain, such as the default mode and salience networks, but also take into account motivational variables. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Evolving Deep Networks Using HPC

    Energy Technology Data Exchange (ETDEWEB)

    Young, Steven R. [ORNL, Oak Ridge]; Rose, Derek C. [ORNL, Oak Ridge]; Johnston, Travis [ORNL, Oak Ridge]; Heller, William T. [ORNL, Oak Ridge]; Karnowski, Thomas P. [ORNL, Oak Ridge]; Potok, Thomas E. [ORNL, Oak Ridge]; Patton, Robert M. [ORNL, Oak Ridge]; Perdue, Gabriel [Fermilab]; Miller, Jonathan [Santa Maria U., Valparaiso]

    2017-01-01

    While a large number of deep learning networks have been studied and published that produce outstanding results on natural image datasets, these datasets only make up a fraction of those to which deep learning can be applied. These datasets include text data, audio data, and arrays of sensors that have very different characteristics than natural images. As these “best” networks for natural images have been largely discovered through experimentation and cannot be proven optimal on some theoretical basis, there is no reason to believe that they are the optimal network for these drastically different datasets. Hyperparameter search is thus often a very important process when applying deep learning to a new problem. In this work we present an evolutionary approach to searching the possible space of network hyperparameters and construction that can scale to 18,000 nodes. This approach is applied to datasets of varying types and characteristics where we demonstrate the ability to rapidly find the best hyperparameters in order to enable practitioners to quickly iterate between idea and result.
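
    A generic, hedged sketch of this kind of evolutionary hyperparameter search is shown below; it is not the ORNL code and omits all HPC scaling. The hyperparameter space and the stand-in evaluate function are assumptions, where in practice evaluate would train a network and return its validation score.

```python
# Generic evolutionary hyperparameter search (illustrative; not the ORNL system).
import random

random.seed(0)

SPACE = {                                   # assumed toy search space
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3, 1e-2],
    "batch_size":    [32, 64, 128, 256],
    "num_layers":    [2, 3, 4, 5, 6],
    "kernel_size":   [3, 5, 7],
}

def random_config():
    return {k: random.choice(v) for k, v in SPACE.items()}

def evaluate(cfg):
    # Stand-in for "train the network and return validation accuracy".
    return 1.0 / (1.0 + 100 * abs(cfg["learning_rate"] - 1e-3)
                  + 0.1 * abs(cfg["num_layers"] - 4)
                  + 0.05 * abs(cfg["kernel_size"] - 5))

def mutate(cfg):
    child = dict(cfg)
    key = random.choice(list(SPACE))
    child[key] = random.choice(SPACE[key])
    return child

def crossover(a, b):
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

population = [random_config() for _ in range(20)]
for generation in range(15):
    population.sort(key=evaluate, reverse=True)
    parents = population[:10]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

print("best configuration found:", max(population, key=evaluate))
```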

  4. The fastest evolving white dwarfs

    International Nuclear Information System (INIS)

    D'Antona, F.; Mazzitelli, I.

    1989-01-01

    The evolution of white dwarfs (WDs) at their lowest luminosities is investigated by computing a reference track with solar metal and helium abundances down to the beginning of WD evolution. The main characteristics of the cooling tracks are described, including the onset of crystallization and its completion, and the differentiation in the relation T(c) - T(eff) is shown for the tracks. It is shown why the evolutionary times do not shorten abruptly at a given luminosity as a result of Debye cooling. The structure of the coolest models is shown to consist of dense atmospheres, with photospheres lying at the boundary of pressure ionization. A study of the resulting luminosity functions (LFs) shows that fast cooling never occurs, and that the LF in the crucial region log L/L(solar) between -4 and -6 is either flat or slowly decreasing. Comparisons with the observed LFs explain well the peak or flattening of the LF at log L/L(solar) = -3 or less but fail to reproduce the drop at log L/L(solar) = -4.5. 48 refs

  5. Sex determination: ways to evolve a hermaphrodite.

    OpenAIRE

    Braendle , Christian; Félix , Marie-Anne

    2006-01-01

    Most species of the nematode genus Caenorhabditis reproduce through males and females; C. elegans and C. briggsae, however, produce self-fertile hermaphrodites instead of females. These transitions to hermaphroditism evolved convergently through distinct modifications of germline sex determination mechanisms.

  6. WSC-07: Evolving the Web Services Challenge

    NARCIS (Netherlands)

    Blake, M. Brian; Cheung, William K.W.; Jaeger, Michael C.; Wombacher, Andreas

    Service-oriented architecture (SOA) is an evolving architectural paradigm where businesses can expose their capabilities as modular, network-accessible software services. By decomposing capabilities into modular services, organizations can share their offerings at multiple levels of granularity

  7. Marshal: Maintaining Evolving Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — SIFT proposes to design and develop the Marshal system, a mixed-initiative tool for maintaining task models over the course of evolving missions. Marshal-enabled...

  8. Satcom access in the Evolved Packet Core

    NARCIS (Netherlands)

    Cano Soveri, M.D.; Norp, A.H.J.; Popova, M.P.

    2011-01-01

    Satellite communications (Satcom) networks are increasingly integrating with terrestrial communications networks, namely Next Generation Networks (NGN). In the area of NGN the Evolved Packet Core (EPC) is a new network architecture that can support multiple access technologies. When Satcom is

  9. Satcom access in the evolved packet core

    NARCIS (Netherlands)

    Cano, M.D.; Norp, A.H.J.; Popova, M.P.

    2012-01-01

    Satellite communications (Satcom) networks are increasingly integrating with terrestrial communications networks, namely Next Generation Networks (NGN). In the area of NGN the Evolved Packet Core (EPC) is a new network architecture that can support multiple access technologies. When Satcom is

  10. The evolving energy budget of accretionary wedges

    Science.gov (United States)

    McBeck, Jessica; Cooke, Michele; Maillot, Bertrand; Souloumiac, Pauline

    2017-04-01

    The energy budget of evolving accretionary systems reveals how deformational processes partition energy as faults slip, topography uplifts, and layer-parallel shortening produces distributed off-fault deformation. The energy budget provides a quantitative framework for evaluating the energetic contribution or consumption of diverse deformation mechanisms. We investigate energy partitioning in evolving accretionary prisms by synthesizing data from physical sand accretion experiments and numerical accretion simulations. We incorporate incremental strain fields and cumulative force measurements from two suites of experiments to design numerical simulations that represent accretionary wedges with stronger and weaker detachment faults. One suite of the physical experiments includes a basal glass bead layer and the other does not. Two physical experiments within each suite implement different boundary conditions (stable base versus moving base configuration). Synthesizing observations from the differing base configurations reduces the influence of sidewall friction because the force vector produced by sidewall friction points in opposite directions depending on whether the base is fixed or moving. With the numerical simulations, we calculate the energy budget at two stages of accretion: at the maximum force preceding the development of the first thrust pair, and at the minimum force following the development of the pair. To identify the appropriate combination of material and fault properties to apply in the simulations, we systematically vary the Young's modulus and the fault static and dynamic friction coefficients in numerical accretion simulations, and identify the set of parameters that minimizes the misfit between the normal force measured on the physical backwall and the numerically simulated force. Following this derivation of the appropriate material and fault properties, we calculate the components of the work budget in the numerical simulations and in the
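
    The parameter-identification step described above, varying material and fault properties to minimise the misfit between the measured backwall force and the simulated one, can be illustrated with a simple grid search. The simulate_force stand-in and all numerical values below are assumptions for illustration only.

```python
# Schematic misfit-minimizing grid search (all functions and numbers are assumptions).
import itertools
import numpy as np

measured_force = 125.0   # hypothetical normal force measured on the physical backwall (N)

def simulate_force(youngs_modulus, mu_static, mu_dynamic):
    # Stand-in for running a numerical accretion simulation and returning the
    # simulated backwall normal force; a real workflow would call the mechanical model.
    return 1e-8 * youngs_modulus + 150.0 * mu_static - 60.0 * mu_dynamic

youngs_moduli = np.linspace(5e9, 50e9, 10)       # Pa
static_frictions = np.linspace(0.3, 0.7, 9)
dynamic_frictions = np.linspace(0.1, 0.5, 9)

best = None
for E, mu_s, mu_d in itertools.product(youngs_moduli, static_frictions, dynamic_frictions):
    if mu_d >= mu_s:           # dynamic friction should not exceed static friction
        continue
    misfit = abs(simulate_force(E, mu_s, mu_d) - measured_force)
    if best is None or misfit < best[0]:
        best = (misfit, E, mu_s, mu_d)

print("minimum misfit %.2f N at E=%.2e Pa, mu_s=%.2f, mu_d=%.2f" % best)
```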

  11. Functional Topology of Evolving Urban Drainage Networks

    Science.gov (United States)

    Yang, Soohyun; Paik, Kyungrock; McGrath, Gavan S.; Urich, Christian; Krueger, Elisabeth; Kumar, Praveen; Rao, P. Suresh C.

    2017-11-01

    We investigated the scaling and topology of engineered urban drainage networks (UDNs) in two cities, and further examined UDN evolution over decades. UDN scaling was analyzed using two power-law scaling characteristics widely employed for river networks: (1) Hack's law of length (L)-area (A) scaling [L ∝ A^h] and (2) the exceedance probability distribution of upstream contributing area (δ) [P(A ≥ δ) ~ a δ^(-ε)]. For the smallest UDNs […] while P(A ≥ δ) plots for river networks are abruptly truncated, those for UDNs display exponential tempering [P(A ≥ δ) = a δ^(-ε) exp(-cδ)]. The tempering parameter c decreases as the UDNs grow, implying that the distribution evolves in time to resemble those for river networks. However, the power-law exponent ε for large UDNs tends to be greater than the range reported for river networks. Differences in generative processes and engineering design constraints contribute to observed differences in the evolution of UDNs and river networks, including subnet heterogeneity and nonrandom branching.
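
    As an illustration of the second scaling characteristic, the sketch below builds an empirical exceedance curve from synthetic contributing areas and fits the tempered power law P(A ≥ δ) = a δ^(-ε) exp(-cδ) by linear least squares in log space. The synthetic data and threshold grid are assumptions; the study's UDN data are not reproduced here.

```python
# Fit P(A >= delta) = a * delta**(-eps) * exp(-c*delta) to synthetic data (assumed).
import numpy as np

rng = np.random.default_rng(2)
areas = rng.pareto(1.5, size=5000) + 1.0      # synthetic upstream contributing areas

# Empirical exceedance probability on a grid of thresholds delta.
deltas = np.logspace(0, 2, 40)
exceedance = np.array([(areas >= d).mean() for d in deltas])
mask = exceedance > 0
d, p = deltas[mask], exceedance[mask]

# log P = log(a) - eps*log(delta) - c*delta  ->  ordinary linear least squares.
design = np.column_stack([np.ones_like(d), -np.log(d), -d])
(log_a, eps, c), *_ = np.linalg.lstsq(design, np.log(p), rcond=None)
print(f"a={np.exp(log_a):.3f}, eps={eps:.3f}, tempering c={c:.4f}")
```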

  12. An Evolving Worldview: Making Open Source Easy

    Science.gov (United States)

    Rice, Z.

    2017-12-01

    NASA Worldview is an interactive interface for browsing full-resolution, global satellite imagery. Worldview supports an open data policy so that academia, private industries and the general public can use NASA's satellite data to address Earth science related issues. Worldview was open sourced in 2014. By shifting to an open source approach, the Worldview application has evolved to better serve end-users. Project developers are able to have discussions with end-users and community developers to understand issues and develop new features. Community developers are able to track upcoming features, collaborate on them and make their own contributions. Developers who discover issues are able to address those issues and submit a fix. This reduces the time it takes for a project developer to reproduce an issue or develop a new feature. Getting new developers to contribute to the project has been one of the most important and difficult aspects of open sourcing Worldview. After witnessing potential outside contributors struggle, a focus has been made on making the installation of Worldview simple to reduce the initial learning curve and make contributing code easy. One way we have addressed this is through a simplified setup process. Our setup documentation includes a set of prerequisites and a set of straightforward commands to clone, configure, install and run. This presentation will emphasize our focus to simplify and standardize Worldview's open source code so that more people are able to contribute. The more people who contribute, the better the application will become over time.

  13. How does cognition evolve? Phylogenetic comparative psychology.

    Science.gov (United States)

    MacLean, Evan L; Matthews, Luke J; Hare, Brian A; Nunn, Charles L; Anderson, Rindy C; Aureli, Filippo; Brannon, Elizabeth M; Call, Josep; Drea, Christine M; Emery, Nathan J; Haun, Daniel B M; Herrmann, Esther; Jacobs, Lucia F; Platt, Michael L; Rosati, Alexandra G; Sandel, Aaron A; Schroepfer, Kara K; Seed, Amanda M; Tan, Jingzhi; van Schaik, Carel P; Wobber, Victoria

    2012-03-01

    Now more than ever animal studies have the potential to test hypotheses regarding how cognition evolves. Comparative psychologists have developed new techniques to probe the cognitive mechanisms underlying animal behavior, and they have become increasingly skillful at adapting methodologies to test multiple species. Meanwhile, evolutionary biologists have generated quantitative approaches to investigate the phylogenetic distribution and function of phenotypic traits, including cognition. In particular, phylogenetic methods can quantitatively (1) test whether specific cognitive abilities are correlated with life history (e.g., lifespan), morphology (e.g., brain size), or socio-ecological variables (e.g., social system), (2) measure how strongly phylogenetic relatedness predicts the distribution of cognitive skills across species, and (3) estimate the ancestral state of a given cognitive trait using measures of cognitive performance from extant species. Phylogenetic methods can also be used to guide the selection of species comparisons that offer the strongest tests of a priori predictions of cognitive evolutionary hypotheses (i.e., phylogenetic targeting). Here, we explain how an integration of comparative psychology and evolutionary biology will answer a host of questions regarding the phylogenetic distribution and history of cognitive traits, as well as the evolutionary processes that drove their evolution.

  14. How does cognition evolve? Phylogenetic comparative psychology

    Science.gov (United States)

    Matthews, Luke J.; Hare, Brian A.; Nunn, Charles L.; Anderson, Rindy C.; Aureli, Filippo; Brannon, Elizabeth M.; Call, Josep; Drea, Christine M.; Emery, Nathan J.; Haun, Daniel B. M.; Herrmann, Esther; Jacobs, Lucia F.; Platt, Michael L.; Rosati, Alexandra G.; Sandel, Aaron A.; Schroepfer, Kara K.; Seed, Amanda M.; Tan, Jingzhi; van Schaik, Carel P.; Wobber, Victoria

    2014-01-01

    Now more than ever animal studies have the potential to test hypotheses regarding how cognition evolves. Comparative psychologists have developed new techniques to probe the cognitive mechanisms underlying animal behavior, and they have become increasingly skillful at adapting methodologies to test multiple species. Meanwhile, evolutionary biologists have generated quantitative approaches to investigate the phylogenetic distribution and function of phenotypic traits, including cognition. In particular, phylogenetic methods can quantitatively (1) test whether specific cognitive abilities are correlated with life history (e.g., lifespan), morphology (e.g., brain size), or socio-ecological variables (e.g., social system), (2) measure how strongly phylogenetic relatedness predicts the distribution of cognitive skills across species, and (3) estimate the ancestral state of a given cognitive trait using measures of cognitive performance from extant species. Phylogenetic methods can also be used to guide the selection of species comparisons that offer the strongest tests of a priori predictions of cognitive evolutionary hypotheses (i.e., phylogenetic targeting). Here, we explain how an integration of comparative psychology and evolutionary biology will answer a host of questions regarding the phylogenetic distribution and history of cognitive traits, as well as the evolutionary processes that drove their evolution. PMID:21927850

  15. Evolving effective incremental SAT solvers with GP

    OpenAIRE

    Bader, Mohamed; Poli, R.

    2008-01-01

    Hyper-heuristics can simply be defined as heuristics that choose other heuristics: a way of combining existing heuristics to generate new ones. Within such a hyper-heuristic framework, we evolve effective incremental (Inc*) solvers for SAT. We test the evolved heuristics (IncHH) against other known local search heuristics on a variety of benchmark SAT problems.
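
    A minimal sketch of the hyper-heuristic idea (heuristics choosing heuristics) is given below for a toy SAT instance: a top-level chooser picks, at each step, one of two low-level flip heuristics according to how often each has recently improved the assignment. The instance, scoring scheme and credit rule are illustrative assumptions and not the evolved IncHH solvers.

```python
# Toy hyper-heuristic local search for SAT (instance and credit scheme are assumptions).
import random

random.seed(0)

# CNF as a list of clauses; each literal is +v or -v for variable v (1-based).
clauses = [[1, -2, 3], [-1, 2], [2, 3], [-3, -2, 1], [-1, -3]]
n_vars = 3

def num_unsat(assign):
    return sum(not any((lit > 0) == assign[abs(lit)] for lit in clause)
               for clause in clauses)

def flip_random_in_unsat(assign):
    unsat = [c for c in clauses if not any((lit > 0) == assign[abs(lit)] for lit in c)]
    var = abs(random.choice(random.choice(unsat)))
    assign[var] = not assign[var]

def flip_greedy(assign):
    # Flip the variable whose flip leaves the fewest unsatisfied clauses.
    scores = []
    for var in range(1, n_vars + 1):
        assign[var] = not assign[var]
        scores.append((num_unsat(assign), var))
        assign[var] = not assign[var]
    best_var = min(scores)[1]
    assign[best_var] = not assign[best_var]

heuristics = [flip_random_in_unsat, flip_greedy]
credit = [1.0, 1.0]            # running credit for each low-level heuristic

assign = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
for step in range(200):
    if num_unsat(assign) == 0:
        break
    i = random.choices(range(len(heuristics)), weights=credit)[0]
    before = num_unsat(assign)
    heuristics[i](assign)      # the chosen low-level heuristic makes one move
    credit[i] = 0.9 * credit[i] + (1.0 if num_unsat(assign) < before else 0.0)

print("unsatisfied clauses:", num_unsat(assign))
```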

  16. Nuclear power: An evolving scenario

    International Nuclear Information System (INIS)

    ElBaradei, Mohamed

    2004-01-01

    The past two years have found the IAEA often in the spotlight - primarily because of our role as the world's 'nuclear watchdog', as we are sometimes referred to on the evening news. The most visible, and often controversial, peaceful nuclear application is the generation of electricity, the focus of this article largely from a European perspective. At the end of last year there were 440 nuclear power units operating worldwide. Together, they supply about 16% of the world's electricity. That percentage has remained relatively steady for almost 20 years. Expansion and growth prospects for nuclear power are centred in Asia. Of the 31 units under construction worldwide, 18 are located in India, Japan, South Korea and China, including Taiwan. Twenty of the last 29 reactors to be connected to the grid are also in the Far East and South Asia. That is probably more active construction than most Europeans would guess, given how little recent growth has occurred in the West. For Western Europe and North America, nuclear construction has been a frozen playing field - the last plant to be completed being Civaux-2 in France in 1999. That should raise a question: with little to no new construction, how has nuclear power been able to keep up with other energy sources, to maintain its share of electricity generation? Interestingly enough, the answer is tied directly to efforts to improve safety performance. The accident at Chernobyl in 1986 prompted the creation of the World Association of Nuclear Operators (WANO), and revolutionized the IAEA approach to nuclear power plant safety. Some analysts believe the case for new nuclear construction in Europe is gaining new ground, for a number of reasons: efforts to limit greenhouse gas emissions and reduce the risk of climate change; security of energy supply; comparative public health risk; and the different set of variables each country and region weighs when choosing its energy strategy. Looking to the future, certain key challenges are of direct

  17. Architectural design of an Algol interpreter

    Science.gov (United States)

    Jackson, C. K.

    1971-01-01

    The design of a syntax-directed interpreter for a subset of Algol is described. It is a conceptual design with sufficient details and completeness but as much independence of implementation as possible. The design includes a detailed description of a scanner, an analyzer described in the Floyd-Evans productions, a hash-coded symbol table, and an executor. Interpretation of sample programs is also provided to show how the interpreter functions.
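
    The scanner/analyzer/symbol-table/executor pipeline described above can be illustrated with a toy syntax-directed interpreter for a tiny assignment-and-expression language (not the Algol subset of the report; grammar and token names are assumptions).

```python
# Toy syntax-directed interpreter: scanner -> analyzer/executor with a symbol table.
import re

TOKEN_RE = re.compile(r"\s*(?:(\d+)|([A-Za-z_]\w*)|(.))")

def scan(source):
    """Scanner: turn source text into (kind, value) tokens."""
    tokens = []
    for number, name, op in TOKEN_RE.findall(source):
        if number:
            tokens.append(("NUM", int(number)))
        elif name:
            tokens.append(("NAME", name))
        else:
            tokens.append(("OP", op))
    return tokens

class Interpreter:
    """Analyzer + executor: parse each statement and evaluate it immediately,
    keeping variable bindings in a symbol table (a plain dict here)."""

    def __init__(self):
        self.symbols = {}                  # symbol table: name -> value

    def run(self, line):
        self.tokens, self.pos = scan(line), 0
        name = self.expect("NAME")         # statement := NAME ':=' expression
        if self.expect("OP") + self.expect("OP") != ":=":
            raise SyntaxError("expected ':='")
        self.symbols[name] = self.expression()
        return self.symbols[name]

    def expect(self, kind):
        k, value = self.tokens[self.pos]
        if k != kind:
            raise SyntaxError(f"expected {kind}, got {k}")
        self.pos += 1
        return value

    def expression(self):
        value = self.term()                # expression := term (('+'|'-') term)*
        while self.pos < len(self.tokens) and self.tokens[self.pos] in (("OP", "+"), ("OP", "-")):
            op = self.expect("OP")
            value = value + self.term() if op == "+" else value - self.term()
        return value

    def term(self):
        kind, value = self.tokens[self.pos]
        self.pos += 1
        if kind == "NUM":
            return value
        if kind == "NAME":
            return self.symbols[value]     # variable lookup in the symbol table
        raise SyntaxError(f"unexpected token {value!r}")

interp = Interpreter()
interp.run("x := 2 + 3")
print(interp.run("y := x + 10"))           # -> 15
```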

  18. Cytological artifacts masquerading interpretation

    Directory of Open Access Journals (Sweden)

    Khushboo Sahay

    2013-01-01

    Conclusions: In order to justify a cytosmear interpretation, a cytologist must be well acquainted with delayed fixation-induced cellular changes and microscopic appearances of common contaminants so as to implicate better prognosis and therapy.

  19. Normative interpretations of diversity

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2009-01-01

    Normative interpretations of particular cases consist of normative principles or values coupled with social theoretical accounts of the empirical facts of the case. The article reviews the most prominent normative interpretations of the Muhammad cartoons controversy over the publication of drawings of the Prophet Muhammad in the Danish newspaper Jyllands-Posten. The controversy was seen as a case of freedom of expression, toleration, racism, (in)civility and (dis)respect, and the article notes different understandings of these principles and how the application of them to the controversy implied different social theoretical accounts of the case. In disagreements between different normative interpretations, appeals are often made to the 'context', so it is also considered what roles 'context' might play in debates over normative interpretations…

  20. Principles of radiological interpretation

    International Nuclear Information System (INIS)

    Rowe, L.J.; Yochum, T.R.

    1987-01-01

    Conventional radiographic procedures (plain film) are the most frequently utilized imaging modality in the evaluation of the skeletal system. This chapter outlines the essentials of skeletal imaging, anatomy, physiology, and interpretation

  1. Interpretable Active Learning

    OpenAIRE

    Phillips, Richard L.; Chang, Kyu Hyun; Friedler, Sorelle A.

    2017-01-01

    Active learning has long been a topic of study in machine learning. However, as increasingly complex and opaque models have become standard practice, the process of active learning, too, has become more opaque. There has been little investigation into interpreting what specific trends and patterns an active learning strategy may be exploring. This work expands on the Local Interpretable Model-agnostic Explanations framework (LIME) to provide explanations for active learning recommendations. W...
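
    For orientation, the sketch below shows a plain active-learning loop with uncertainty sampling plus a very crude per-query 'explanation' (the largest weight-times-feature contributions of a linear model). It is not the LIME-based method of the paper; it only illustrates the kind of recommendation an interpretable active learner has to explain, and the dataset and model choices are assumptions.

```python
# Plain uncertainty-sampling active learning with a crude linear "explanation"
# per query (illustrative assumptions; not the LIME-based approach of the paper).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=5, random_state=0)

# Seed the labeled pool with a few examples of each class.
labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])
unlabeled = [i for i in range(len(X)) if i not in labeled]

for round_ in range(5):
    model = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])

    # Uncertainty sampling: query the instance whose predicted probability is closest to 0.5.
    proba = model.predict_proba(X[unlabeled])[:, 1]
    query_pos = int(np.argmin(np.abs(proba - 0.5)))
    query_idx = unlabeled.pop(query_pos)
    labeled.append(query_idx)

    # Crude "explanation": which features push this instance toward the decision boundary.
    contributions = model.coef_[0] * X[query_idx]
    top = np.argsort(np.abs(contributions))[::-1][:2]
    print(f"round {round_}: queried #{query_idx}, "
          f"p={proba[query_pos]:.2f}, most influential features {top.tolist()}")
```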

  2. QUALITATIVE INTERPRETATION OF GALAXY SPECTRA

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Almeida, J.; Morales-Luis, A. B. [Instituto de Astrofisica de Canarias, E-38205 La Laguna, Tenerife (Spain); Terlevich, R.; Terlevich, E. [Instituto Nacional de Astrofisica, Optica y Electronica, Tonantzintla, Puebla (Mexico); Cid Fernandes, R., E-mail: jos@iac.es, E-mail: abml@iac.es, E-mail: rjt@ast.cam.ac.uk, E-mail: eterlevi@inaoep.mx, E-mail: cid@astro.ufsc.br [Departamento de Fisica-CFM, Universidade Federal de Santa Catarina, P.O. Box 476, 88040-900 Florianopolis, SC (Brazil)

    2012-09-10

    We describe a simple step-by-step guide to qualitative interpretation of galaxy spectra. Rather than an alternative to existing automated tools, it is put forward as an instrument for quick-look analysis and for gaining physical insight when interpreting the outputs provided by automated tools. Though the recipe is for general application, it was developed for understanding the nature of the Automatic Spectroscopic K-means-based (ASK) template spectra. They resulted from the classification of all the galaxy spectra in the Sloan Digital Sky Survey data release 7, thus being a comprehensive representation of the galaxy spectra in the local universe. Using the recipe, we give a description of the properties of the gas and the stars that characterize the ASK classes, from those corresponding to passively evolving galaxies, to H II galaxies undergoing a galaxy-wide starburst. The qualitative analysis is found to be in excellent agreement with quantitative analyses of the same spectra. We compare the mean ages of the stellar populations with those inferred using the code STARLIGHT. We also examine the estimated gas-phase metallicity with the metallicities obtained using electron-temperature-based methods. A number of byproducts follow from the analysis. There is a tight correlation between the age of the stellar population and the metallicity of the gas, which is stronger than the correlations between galaxy mass and stellar age, and galaxy mass and gas metallicity. The galaxy spectra are known to follow a one-dimensional sequence, and we identify the luminosity-weighted mean stellar age as the affine parameter that describes the sequence. All ASK classes happen to have a significant fraction of old stars, although spectrum-wise they are outshined by the youngest populations. Old stars are metal-rich or metal-poor depending on whether they reside in passive galaxies or in star-forming galaxies.

  3. Interpreter-mediated dentistry.

    Science.gov (United States)

    Bridges, Susan; Drew, Paul; Zayts, Olga; McGrath, Colman; Yiu, Cynthia K Y; Wong, H M; Au, T K F

    2015-05-01

    The global movements of healthcare professionals and patient populations have increased the complexities of medical interactions at the point of service. This study examines interpreter mediated talk in cross-cultural general dentistry in Hong Kong where assisting para-professionals, in this case bilingual or multilingual Dental Surgery Assistants (DSAs), perform the dual capabilities of clinical assistant and interpreter. An initial language use survey was conducted with Polyclinic DSAs (n = 41) using a logbook approach to provide self-report data on language use in clinics. Frequencies of mean scores using a 10-point visual analogue scale (VAS) indicated that the majority of DSAs spoke mainly Cantonese in clinics and interpreted for postgraduates and professors. Conversation Analysis (CA) examined recipient design across a corpus (n = 23) of video-recorded review consultations between non-Cantonese speaking expatriate dentists and their Cantonese L1 patients. Three patterns of mediated interpreting indicated were: dentist designated expansions; dentist initiated interpretations; and assistant initiated interpretations to both the dentist and patient. The third, rather than being perceived as negative, was found to be framed either in response to patient difficulties or within the specific task routines of general dentistry. The findings illustrate trends in dentistry towards personalized care and patient empowerment as a reaction to product delivery approaches to patient management. Implications are indicated for both treatment adherence and the education of dental professionals. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Design of interpretable fuzzy systems

    CERN Document Server

    Cpałka, Krzysztof

    2017-01-01

    This book shows that the term “interpretability” goes far beyond the concept of readability of a fuzzy set and fuzzy rules. It focuses on novel and precise operators of aggregation, inference, and defuzzification leading to flexible Mamdani-type and logical-type systems that can achieve the required accuracy using a less complex rule base. The individual chapters describe various aspects of interpretability, including appropriate selection of the structure of a fuzzy system, focusing on improving the interpretability of fuzzy systems designed using both gradient-learning and evolutionary algorithms. It also demonstrates how to eliminate various system components, such as inputs, rules and fuzzy sets, whose reduction does not adversely affect system accuracy. It illustrates the performance of the developed algorithms and methods with commonly used benchmarks. The book provides valuable tools for possible applications in many fields including expert systems, automatic control and robotics.
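
    As a reminder of what a minimal Mamdani-type system looks like, the following self-contained sketch maps a temperature input to a fan-speed output with two rules, triangular membership functions, min implication, max aggregation and centroid defuzzification. The rule base and membership parameters are illustrative assumptions, not taken from the book.

```python
# Minimal Mamdani-type fuzzy inference (rule base and membership values are assumed).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

speed = np.linspace(0.0, 100.0, 501)        # output universe of discourse
slow_mf = tri(speed, -50.0, 0.0, 50.0)      # "slow" peaks at 0
fast_mf = tri(speed, 50.0, 100.0, 150.0)    # "fast" peaks at 100

def fan_speed(temperature):
    # Rule 1: IF temperature IS cold THEN speed IS slow
    # Rule 2: IF temperature IS hot  THEN speed IS fast
    cold = tri(np.asarray(temperature, dtype=float), -10.0, 10.0, 25.0)
    hot = tri(np.asarray(temperature, dtype=float), 20.0, 35.0, 50.0)
    # Mamdani implication (min), aggregation (max), then centroid defuzzification.
    aggregated = np.maximum(np.minimum(cold, slow_mf), np.minimum(hot, fast_mf))
    if aggregated.sum() == 0.0:
        return 0.0
    return float((speed * aggregated).sum() / aggregated.sum())

print(fan_speed(18.0), fan_speed(40.0))     # cool day -> slow fan, hot day -> fast fan
```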

  5. An Authentic Interpretation of Laws

    Directory of Open Access Journals (Sweden)

    Teodor Antić

    2015-01-01

    Authentic interpretation of laws is a legal institute whereby a legislator gives the authentic meaning to a specific legal norm in case of its incorrect or diversified interpretation in practice. It has the same legal force as the law. Retroactivity and influence on pending cases are its inherent characteristics. Due to these characteristics and their relation to the principles of the rule of law, legal certainty and separation of powers, it is subjected to severe criticism not only in legal theory but also in legal practice. The author analyses the institute of authentic interpretation from a historical and comparative point of view and through the Croatian normative regulation, the practice of the Croatian Parliament and academic debate, including opinions in favour as well as against it. On these grounds the author concludes that a higher-quality law-making procedure could make authentic interpretation dispensable. On the other hand, should this institute be kept in the legal order, it is essential that it receive more effective constitutional control.

  6. A Generator for Composition Interpreters

    DEFF Research Database (Denmark)

    Steensgaard-Madsen, Jørgen

    1997-01-01

    Composition of program components must be expressed in some language, and late composition can be achieved by an interpreter for the composition language. A suitable notion of component is obtained by identifying it with the semantics of a generalised structured command. Experiences from programming language design, specification and implementation then apply. A component can be considered as defining objects or commands according to convenience. A description language including type information provides sufficient means to describe component interaction according to the underlying abstract…

  7. Problems Faced by Court Interpreters in Botswana | Miyanda ...

    African Journals Online (AJOL)

    The problems established include lack of training for interpreters, the absence of a job description and guidelines for interpreters, long hours of work, lack of a forum for interpreters to share ideas on their job, lack ... other staff, and lack of equipment such as microphones during interpreting.

  8. 12 CFR 609.920 - Interpretations.

    Science.gov (United States)

    2010-01-01

    Banks and Banking; Farm Credit Administration; Farm Credit System; Electronic Commerce; Interpretations and Definitions. § 609.920 Interpretations. (a) E-SIGN preempts most statutes and regulations, including the Act... E-commerce as long as the safeguards of E-SIGN are met and its exceptions recognized. Generally, an...

  9. Localized Smart-Interpretation

    Science.gov (United States)

    Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom

    2014-05-01

    The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also requires consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model the geologist needs to rely on the geophysical data. The problem, however, is that the amount of geophysical data is in many cases so vast that it is practically impossible to integrate all of it in the manual interpretation process. This means that much of the information available from the geophysical surveys is unexploited, which is a problem, because the resulting geological model does not fulfil its full potential and hence is less trustworthy. We suggest an approach to geological modeling that (1) allows all geophysical data to be considered when building the geological model, (2) is fast, and (3) allows quantification of geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as, for example, the depth to the base of a groundwater reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretation, [m1, m2, ...]. This makes it possible to quantify how the geological expert performs interpolation through f(d,m). As the geological expert proceeds with interpreting, the number of interpreted datapoints from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases. When a model f
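
    A hedged sketch of such a statistical model f(d,m) is given below: past expert interpretations d_i and the quantified information m_i at those points are used to fit a simple regression that proposes d at new locations and is refit as more interpretations arrive. The two made-up geophysical attributes, the k-nearest-neighbour regressor and the synthetic numbers are assumptions chosen only for illustration.

```python
# Hedged sketch of f(d, m): regress expert depth picks d on quantified information m.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(3)

# m: quantified information at interpreted points (e.g. log-resistivity, elevation) -- assumed.
m_interpreted = rng.uniform([1.0, 0.0], [3.0, 50.0], size=(25, 2))
# d: the expert's interpreted depth to the base of a reservoir at those points (synthetic).
d_interpreted = (10.0 + 8.0 * m_interpreted[:, 0] + 0.2 * m_interpreted[:, 1]
                 + rng.normal(0.0, 1.0, size=25))

model = KNeighborsRegressor(n_neighbors=5).fit(m_interpreted, d_interpreted)

# Propose interpretations at new locations where only geophysical information exists.
m_new = np.array([[1.5, 10.0], [2.8, 40.0]])
print("suggested depths:", model.predict(m_new))

# As the expert keeps interpreting, refit on the growing data set to sharpen f(d, m).
m_all = np.vstack([m_interpreted, m_new])
d_all = np.concatenate([d_interpreted, [23.0, 41.0]])    # the expert's new picks (assumed)
model = KNeighborsRegressor(n_neighbors=5).fit(m_all, d_all)
```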

  10. Conjunctive interpretations of disjunctions

    Directory of Open Access Journals (Sweden)

    Robert van Rooij

    2010-09-01

    In this extended commentary I discuss the problem of how to account for "conjunctive" readings of some sentences with embedded disjunctions for globalist analyses of conversational implicatures. Following Franke (2010, 2009), I suggest that earlier proposals failed because they did not take into account the interactive reasoning about what else the speaker could have said, and how else the hearer could have interpreted the (alternative) sentence(s). I show how Franke's idea relates to more traditional pragmatic interpretation strategies. doi:10.3765/sp.3.11

  11. Evolving Intelligent Systems Methodology and Applications

    CERN Document Server

    Angelov, Plamen; Kasabov, Nik

    2010-01-01

    From theory to techniques, the first all-in-one resource for EIS. There is a clear demand in advanced process industries, defense, and Internet and communication (VoIP) applications for intelligent yet adaptive/evolving systems. Evolving Intelligent Systems is the first self- contained volume that covers this newly established concept in its entirety, from a systematic methodology to case studies to industrial applications. Featuring chapters written by leading world experts, it addresses the progress, trends, and major achievements in this emerging research field, with a strong emphasis on th

  12. Interactively Evolving Compositional Sound Synthesis Networks

    DEFF Research Database (Denmark)

    Jónsson, Björn Þór; Hoover, Amy K.; Risi, Sebastian

    2015-01-01

    While the success of electronic music often relies on the uniqueness and quality of selected timbres, many musicians struggle with complicated and expensive equipment and techniques to create their desired sounds. Instead, this paper presents a technique for producing novel timbres that are evolved … the space of potential sounds that can be generated through such compositional sound synthesis networks (CSSNs). To study the effect of evolution on subjective appreciation, participants in a listener study ranked evolved timbres by personal preference, resulting in preferences skewed toward the first…

  13. Wild Origins: The Evolving Nature of Animal Behavior

    Science.gov (United States)

    Flores, Ifigenia

    For billions of years, evolution has been the driving force behind the incredible range of biodiversity on our planet. Wild Origins is a concept plan for an exhibition at the National Zoo that uses case studies of animal behavior to explain the theory of evolution. Behaviors evolve, just as physical forms do. Understanding natural selection can help us interpret animal behavior and vice-versa. A living collection, digital media, interactives, fossils, and photographs will relay stories of social behavior, sex, navigation and migration, foraging, domestication, and relationships between different species. The informal learning opportunities visitors are offered at the zoo will create a connection with the exhibition's teaching points. Visitors will leave with an understanding and sense of wonder at the evolutionary view of life.

  14. Reconsidering evolved sex differences in jealousy: comment on Harris (2003).

    Science.gov (United States)

    Sagarin, Brad J

    2005-01-01

    In a recent article, Harris (2003) concluded that the data do not support the existence of evolved sex differences in jealousy. Harris' review correctly identifies fatal flaws in three lines of evidence (spousal abuse, homicide, morbid jealousy), but her criticism of two other lines of evidence (self-report responses, psychophysiological measures) is based, in part, on a mischaracterization of the evolutionary psychological theory and a misunderstanding of the empirical implications of the theory. When interpreted according to the correct criterion (i.e., an interaction between sex and infidelity type), self-report studies (both forced-choice and non-forced choice) offer strong support for the existence of sex differences in jealousy. Psychophysiological data also offer some support, although these data are weakened by validity-related concerns. In addition, some refutational evidence cited by Harris (responses to real infidelity, responses under cognitive load) actually does not refute the theory. An integrative model that describes how jealousy might result from the interaction of sociocultural variables and evolved sex differences and suggestions for future research directions are discussed.

  15. On the Benefits of Divergent Search for Evolved Representations

    DEFF Research Database (Denmark)

    Lehman, Joel; Risi, Sebastian; Stanley, Kenneth O

    2012-01-01

    Evolved representations in evolutionary computation are often fragile, which can impede representation-dependent mechanisms such as self-adaptation. In contrast, evolved representations in nature are robust, evolvable, and creatively exploit available representational features. This paper provides…

  16. Preface: evolving rotifers, evolving science: Proceedings of the XIV International Rotifer Symposium

    Czech Academy of Sciences Publication Activity Database

    Devetter, Miloslav; Fontaneto, D.; Jersabek, Ch.D.; Welch, D.B.M.; May, L.; Walsh, E.J.

    2017-01-01

    Roč. 796, č. 1 (2017), s. 1-6 ISSN 0018-8158 Institutional support: RVO:60077344 Keywords : evolving rotifers * 14th International Rotifer Symposium * evolving science Subject RIV: EG - Zoology OBOR OECD: Zoology Impact factor: 2.056, year: 2016

  17. Tokens: Facts and Interpretation.

    Science.gov (United States)

    Schmandt-Besserat, Denise

    1986-01-01

    Summarizes some of the major pieces of evidence concerning the archeological clay tokens, specifically the technique for their manufacture, their geographic distribution, chronology, and the context in which they are found. Discusses the interpretation of tokens as the first example of visible language, particularly as an antecedent of Sumerian…

  18. Life Cycle Interpretation

    DEFF Research Database (Denmark)

    Hauschild, Michael Z.; Bonou, Alexandra; Olsen, Stig Irving

    2018-01-01

    The interpretation is the final phase of an LCA where the results of the other phases are considered together and analysed in the light of the uncertainties of the applied data and the assumptions that have been made and documented throughout the study. This chapter teaches how to perform an inte...

  19. Interpretations of Greek Mythology

    NARCIS (Netherlands)

    Bremmer, Jan

    1987-01-01

    This collection of original studies offers new interpretations of some of the best known characters and themes of Greek mythology, reflecting the complexity and fascination of the Greek imagination. Following analyses of the concept of myth and the influence of the Orient on Greek mythology, the

  20. Translation, Interpreting and Lexicography

    DEFF Research Database (Denmark)

    Dam, Helle Vrønning; Tarp, Sven

    2018-01-01

    in the sense that their practice fields are typically ‘about something else’. Translators may, for example, be called upon to translate medical texts, and interpreters may be assigned to work on medical speeches. Similarly, practical lexicography may produce medical dictionaries. In this perspective, the three...

  1. Visual perception and radiographic interpretation

    International Nuclear Information System (INIS)

    Papageorges, M.

    1998-01-01

    Although interpretation errors are common in radiology, their causes are still debated. Perceptual mechanisms appear to be responsible for a large proportion of mistakes made by both neophytes and trained radiologists. Erroneous perception of familiar contours can be triggered by unrelated opacities. Conversely, visual information cannot induce a specific perception if the observer is not familiar with the concept represented or its radiographic appearance. Additionally, the area of acute vision is smaller than is commonly recognized. Other factors, such as the attitude, beliefs, preconceptions, and expectations of the viewer, can affect what he or she 'sees' when viewing any object, including a radiograph. Familiarity with perceptual mechanisms and the limitations of the visual system as well as multiple readings may be necessary to reduce interpretation errors.

  2. Evolving R Coronae Borealis Stars with MESA

    Science.gov (United States)

    Clayton, Geoffrey C.; Lauer, Amber; Chatzopoulos, Emmanouil; Frank, Juhan

    2018-01-01

    R Coronae Borealis (RCB) stars form a small class of cool, carbon-rich supergiants that have almost no hydrogen. They undergo extreme, irregular declines in brightness of up to 8 magnitudes due to the formation of thick clouds of carbon dust. Two scenarios have been proposed for the origin of an RCB star: the merger of a CO/He white dwarf (WD) binary and a final helium-shell flash. We are using a combination of 3D hydrodynamics codes and the 1D MESA (Modules for Experiments in Stellar Astrophysics) stellar evolution code including nucleosynthesis to construct post-merger spherical models based on realistic merger progenitor models and on our hydrodynamical simulations, and then following the evolution into the region of the HR diagram where RCB stars are located. We are investigating nucleosynthesis in the dynamically accreting material of CO/He WD mergers which may provide a suitable environment for significant production of 18O and the very low 16O/18O values observed.Our MESA modeling consists of two steps: first mimicking the WD merger event using two different techniques, (a) by choosing a very high mass accretion rate with appropriate abundances and (b) by applying "stellar engineering" to an initial CO WD model to account for the newly merged material by applying an entropy adjusting procedure. Second, we follow the post-merger evolution using a large nuclear reaction network including the effects of convective and rotational instabilities to the mixing of material in order to match the observed RCB abundances. MESA follows the evolution of the merger product as it expands and cools to become an RCB star. We then examine the surface abundances and compare them to the observed RCB abundances. We also investigate how long fusion continues in the He shell near the core and how this processed material is mixed up to the surface of the star. We then model the later evolution of RCB stars to determine their likely lifetimes and endpoints when they have returned to

  3. The evolving integrated vascular surgery residency curriculum.

    Science.gov (United States)

    Smith, Brigitte K; Greenberg, Jacob A; Mitchell, Erica L

    2014-10-01

    Since their introduction several years ago, integrated (0 + 5) vascular surgery residency programs are being increasingly developed across the country. To date, however, there is no defined "universal" curriculum for these programs and each program is responsible for creating its own curriculum. The aim of this study was to review the experiences of current 0 + 5 program directors (PDs) to determine what factors contributed to the curricular development within their institution. Semistructured interviews were conducted with 0 + 5 PDs to explore their experiences with program development, factors influencing the latter, and rationale for current curricula. The interview script was loosely structured to explore several factors including time of incoming residents' first exposure to the vascular surgical service, timing and rationale behind the timing of core surgical rotations throughout the 5 year program, educational value of nonsurgical rotations, opportunities for leadership and scholarly activity, and influence the general surgery program and institutional climate had on curricular structure. All interviews were conducted by a single interviewer. All interviews were qualitatively analyzed using emergent theme analysis. Twenty-six 0 + 5 PDs participated in the study. A total of 69% believed establishing professional identity early reduces resident attrition and recommend starting incoming trainees on vascular surgical services. Sixty-two percent spread core surgical rotations over the first 3 years to optimize general surgical exposure and most of the programs have eliminated specific rotations, as they were not considered valuable to the goals of training. Factors considered most important by PDs in curricular development include building on existing institutional opportunities (96%), avoiding rotations considered unsuccessful by "experienced" programs (92%), and maintaining a good working relationship with general surgery (77%). Fifty-eight percent of

  4. Views on Evolvability of Embedded Systems

    NARCIS (Netherlands)

    Laar, P. van de; Punter, T.

    2011-01-01

    Evolvability, the ability to respond effectively to change, represents a major challenge to today's high-end embedded systems, such as those developed in the medical domain by Philips Healthcare. These systems are typically developed by multi-disciplinary teams, located around the world, and are in

  5. Views on evolvability of embedded systems

    NARCIS (Netherlands)

    Laar, van de P.J.L.J.; Punter, H.T.

    2011-01-01

    Evolvability, the ability to respond effectively to change, represents a major challenge to today's high-end embedded systems, such as those developed in the medical domain by Philips Healthcare. These systems are typically developed by multi-disciplinary teams, located around the world, and are in

  6. EVOLVING AN EMPIRICAL METHODOLOGY FOR DETERMINING ...

    African Journals Online (AJOL)

    The uniqueness of this approach is that it can be applied to any forest or dynamic feature on the earth, and can enjoy universal application as well. KEY WORDS: Evolving empirical methodology, innovative mathematical model, appropriate interval, remote sensing, forest environment planning and management. Global Jnl ...

  7. Continual Learning through Evolvable Neural Turing Machines

    DEFF Research Database (Denmark)

    Lüders, Benno; Schläger, Mikkel; Risi, Sebastian

    2016-01-01

    Continual learning, i.e. the ability to sequentially learn tasks without catastrophic forgetting of previously learned ones, is an important open challenge in machine learning. In this paper we take a step in this direction by showing that the recently proposed Evolving Neural Turing Machine (ENTM...

  8. Did Language Evolve Like the Vertebrate Eye?

    Science.gov (United States)

    Botha, Rudolf P.

    2002-01-01

    Offers a critical appraisal of the way in which the idea that human language or some of its features evolved like the vertebrate eye by natural selection is articulated in Pinker and Bloom's (1990) selectionist account of language evolution. Argues that this account is less than insightful because it fails to draw some of the conceptual…

  9. Interpretation of aluminum-alloy weld radiography

    Science.gov (United States)

    Duren, P. C.; Risch, E. R.

    1971-01-01

    Report proposes radiographic terminology standardization which allows scientific interpretation of radiographic films to replace dependence on individual judgement and experience. Report includes over 50 photographic pages in which radiographs of aluminum welds with defects are compared with photomacrographs of prepared weld sections.

  10. Risk factors which cause senile cataract evolvement: outline

    Directory of Open Access Journals (Sweden)

    E.V. Bragin

    2018-03-01

    Full Text Available Examination of natural ageing processes, including those caused by multiple external factors, has been attracting researchers' attention over the last years. Senile cataract is a multi-factor disease. Expenditure on cataract surgery remains one of the greatest expense items in public health care. Age is the basic factor which causes senile cataract; morbidity with cataract doubles with each 10 years of life. This outline considers literature sources which describe research results on the influence exerted on cataract evolvement by such risk factors as age, sex, race, smoking, alcohol intake, pancreatic diabetes, intake of certain medications, and a number of environmental factors including ultraviolet and ionizing radiation. Many of these factors are shown to increase or reduce senile cataract risk; for certain factors the data are conflicting. The outline also contains quantitative characteristics of cataract risks, given as odds ratios, associated with age, alcohol intake, ionizing radiation, etc. The authors also state that there is still no answer to the question of whether the dose-effect relationship for cataract evolvement is a threshold or non-threshold one.

  11. Clinical ethics and values: how do norms evolve from practice?

    Science.gov (United States)

    Spranzi, Marta

    2013-02-01

    Bioethics laws in France have just undergone a revision process. The bioethics debate is often cast in terms of ethical principles and norms resisting emerging social and technological practices. This leads to the expression of confrontational attitudes based on widely differing interpretations of the same principles and values, and ultimately results in a deadlock. In this paper I would like to argue that focusing on values, as opposed to norms and principles, provides an interesting perspective on the evolution of norms. As Joseph Raz has convincingly argued, "life-building" values and practices are closely intertwined. Precisely because values have a more indeterminate meaning than norms, they can be cited as reasons for action by concerned stakeholders, and thus can help us understand how controversial practices, e.g. surrogate motherhood, can be justified. Finally, norms evolve when the interpretations of the relevant values shift and cause a change in the presumptions implicit in the norms. Thus, norms are not a prerequisite of the ethical solution of practical dilemmas, but rather the outcome of the decision-making process itself. Struggling to reach the right decision in controversial clinical ethics situations indirectly causes social and moral values to change and principles to be understood differently.

  12. Sperm should evolve to make female meiosis fair.

    Science.gov (United States)

    Brandvain, Yaniv; Coop, Graham

    2015-04-01

    Genomic conflicts arise when an allele gains an evolutionary advantage at a cost to organismal fitness. Oögenesis is inherently susceptible to such conflicts because alleles compete for inclusion into the egg. Alleles that distort meiosis in their favor (i.e., meiotic drivers) often decrease organismal fitness, and therefore indirectly favor the evolution of mechanisms to suppress meiotic drive. In this light, many facets of oögenesis and gametogenesis have been interpreted as mechanisms of protection against genomic outlaws. That females of many animal species do not complete meiosis until after fertilization appears to run counter to this interpretation, because this delay provides an opportunity for sperm-acting alleles to meddle with the outcome of female meiosis and help like alleles drive in heterozygous females. Contrary to this perceived danger, the population genetic theory presented herein suggests that, in fact, sperm nearly always evolve to increase the fairness of female meiosis in the face of genomic conflicts. These results are consistent with the apparent sperm dependence of the best characterized female meiotic drivers in animals. Rather than providing an opportunity for sperm collaboration in female meiotic drive, the "fertilization requirement" indirectly protects females from meiotic drivers by providing sperm an opportunity to suppress drive. © 2015 The Author(s).

  13. Sperm should evolve to make female meiosis fair

    Science.gov (United States)

    Brandvain, Yaniv; Coop, Graham

    2017-01-01

    Genomic conflicts arise when an allele gains an evolutionary advantage at a cost to organismal fitness. Oögenesis is inherently susceptible to such conflicts because alleles compete for inclusion into the egg. Alleles that distort meiosis in their favor (i.e. meiotic drivers) often decrease organismal fitness, and therefore indirectly favor the evolution of mechanisms to suppress meiotic drive. In this light, many facets of oögenesis and gametogenesis have been interpreted as mechanisms of protection against genomic outlaws. That females of many animal species do not complete meiosis until after fertilization appears to run counter to this interpretation, because this delay provides an opportunity for sperm-acting alleles to meddle with the outcome of female meiosis and help like alleles drive in heterozygous females. Contrary to this perceived danger, the population genetic theory presented herein suggests that, in fact, sperm nearly always evolve to increase the fairness of female meiosis in the face of genomic conflicts. These results are consistent with the apparent sperm dependence of the best characterized female meiotic drivers in animals. Rather than providing an opportunity for sperm collaboration in female meiotic drive, the ‘fertilization requirement’ indirectly protects females from meiotic drivers by providing sperm an opportunity to suppress drive. PMID:25662355

  14. Design of the tool for periodic not evolvent profiles

    Directory of Open Access Journals (Sweden)

    Anisimov Roman

    2017-01-01

    Full Text Available The article considers a new approach to profiling the tool for machining parts with periodic non-involute (not evolvent) profiles. A classification of periodic profiles is offered, covering repetition of the profile both in the plane perpendicular to the part axis and in the plane passing along the part axis. The proposed profiling method is based on the idea of spatial shaping of the tool surface by the nominal (rated) surface of the product. A major advantage of the proposed approach is that profiling is combined with analysis of the machining process parameters, which makes it possible to predict the accuracy and surface quality of a product with a non-involute periodic profile. Using the proposed approach, a pinion cutter for machining wheels with internal triangular teeth and a milling cutter for machining the screw of a liquid flow counter, whose complex profile consists of several generating curves, have been obtained.

  15. Personal literary interpretation

    Directory of Open Access Journals (Sweden)

    Michał Januszkiewicz

    2015-11-01

    Full Text Available The article titled “Personal literary interpretation” deals with problems which have usually been marginalized in literary studies, but which seem to be very important in the context of the humanities, as broadly defined. The author of this article intends to rethink the problem of literary studies not in objective, but in personal terms. This is why the author wants to talk about what he calls personal literary interpretation, which has nothing to do with subjective or irrational thinking, but which is rather grounded in the hermeneutical rule that says that one must believe in order to understand a text or the other (where ‘believe’ also means: ‘to love’, ‘engage’, and ‘be open’). The article presents different determinants of this attitude, ranging from Dilthey to Heidegger and Gadamer. Finally, the author subscribes to the theory of personal interpretation, which is always dialogical.

  16. The Age of Interpretation

    OpenAIRE

    Gianni Vattimo

    2013-01-01

    Gianni Vattimo, who is both a Catholic and a frequent critic of the Church, explores the surprising congruence between Christianity and hermeneutics in light of the dissolution of metaphysical truth. As in hermeneutics, Vattimo claims, interpretation is central to Christianity. Influenced by hermeneutics and borrowing largely from the Nietzschean and Heideggerian heritage, the Italian philosopher, who has been instrumental in promoting a nihilistic approach to Christianity, draws here on Nietz...

  17. The Age of Interpretation

    Directory of Open Access Journals (Sweden)

    Gianni Vattimo

    2013-01-01

    Full Text Available Gianni Vattimo, who is both a Catholic and a frequent critic of the Church, explores the surprising congruence between Christianity and hermeneutics in light of the dissolution of metaphysical truth. As in hermeneutics, Vattimo claims, interpretation is central to Christianity. Influenced by hermeneutics and borrowing largely from the Nietzschean and Heideggerian heritage, the Italian philosopher, who has been instrumental in promoting a nihilistic approach to Christianity, draws here on Nietzsche’s writings on nihilism, which is not to be understood in a purely negative sense. Vattimo suggests that nihilism not only expands the Christian message of charity, but also transforms it into its endless human potential. In “The Age of Interpretation,” the author shows that hermeneutical radicalism “reduces all reality to message,” so that the opposition between facts and norms turns out to be misguided, for both are governed by the interpretative paradigms through which someone (always a concrete, historically situated someone) makes sense of them. Vattimo rejects some of the deplorable political consequences of hermeneutics and claims that traditional hermeneutics is in collusion with various political-ideological neutralizations.

  18. The transactional interpretation of quantum mechanics

    Science.gov (United States)

    Cramer, John G.

    2001-06-01

    The transactional interpretation of quantum mechanics [1] was originally published in 1986 and is now about 14 years old. It is an explicitly nonlocal and Lorentz invariant alternative to the Copenhagen interpretation. It interprets the formalism for a quantum interaction as describing a "handshake" between retarded waves (ψ) and advanced waves (ψ*) for each quantum event or "transaction" in which energy, momentum, angular momentum, and other conserved quantities are transferred. The transactional interpretation offers the advantages that (1) it is actually "visible" in the formalism of quantum mechanics, (2) it is economical, involving fewer independent assumptions than its rivals, (3) it is paradox-free, resolving all of the paradoxes of standard quantum theory including nonlocality and wave function collapse, (4) it does not give a privileged role to observers or measurements, and (5) it permits the visualization of quantum events. We will review the transactional interpretation and some of its applications to "quantum paradoxes."
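
    For readers less familiar with the formalism, a standard way to summarize the handshake picture (a textbook statement of the transactional account, not a quotation from the abstract above) is that the weight of a completed transaction with an absorber at x is the product of the retarded offer wave and the advanced confirmation wave, which recovers the Born rule:

        P(x) \propto \psi(x)\,\psi^{*}(x) = |\psi(x)|^{2}

    Competing transactions are then realized with these relative probabilities.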

  19. Interpretive focus groups: a participatory method for interpreting and extending secondary analysis of qualitative data

    Directory of Open Access Journals (Sweden)

    Michelle Redman-MacLaren

    2014-08-01

    Full Text Available Background: Participatory approaches to qualitative research practice constantly change in response to evolving research environments. Researchers are increasingly encouraged to undertake secondary analysis of qualitative data, despite epistemological and ethical challenges. Interpretive focus groups can be described as a more participative method for groups to analyse qualitative data. Objective: To facilitate interpretive focus groups with women in Papua New Guinea to extend analysis of existing qualitative data and co-create new primary data. The purpose of this was to inform a transformational grounded theory and subsequent health promoting action. Design: A two-step approach was used in a grounded theory study about how women experience male circumcision in Papua New Guinea. Participants analysed portions or ‘chunks’ of existing qualitative data in story circles and built upon this analysis by using the visual research method of storyboarding. Results: New understandings of the data were evoked when women in interpretive focus groups analysed the data ‘chunks’. Interpretive focus groups encouraged women to share their personal experiences about male circumcision. The visual method of storyboarding enabled women to draw pictures to represent their experiences. This provided an additional focus for whole-of-group discussions about the research topic. Conclusions: Interpretive focus groups offer opportunity to enhance trustworthiness of findings when researchers undertake secondary analysis of qualitative data. The co-analysis of existing data and co-generation of new data between research participants and researchers informed an emergent transformational grounded theory and subsequent health promoting action.

  20. Interpretive focus groups: a participatory method for interpreting and extending secondary analysis of qualitative data.

    Science.gov (United States)

    Redman-MacLaren, Michelle; Mills, Jane; Tommbe, Rachael

    2014-01-01

    Participatory approaches to qualitative research practice constantly change in response to evolving research environments. Researchers are increasingly encouraged to undertake secondary analysis of qualitative data, despite epistemological and ethical challenges. Interpretive focus groups can be described as a more participative method for groups to analyse qualitative data. To facilitate interpretive focus groups with women in Papua New Guinea to extend analysis of existing qualitative data and co-create new primary data. The purpose of this was to inform a transformational grounded theory and subsequent health promoting action. A two-step approach was used in a grounded theory study about how women experience male circumcision in Papua New Guinea. Participants analysed portions or 'chunks' of existing qualitative data in story circles and built upon this analysis by using the visual research method of storyboarding. New understandings of the data were evoked when women in interpretive focus groups analysed the data 'chunks'. Interpretive focus groups encouraged women to share their personal experiences about male circumcision. The visual method of storyboarding enabled women to draw pictures to represent their experiences. This provided an additional focus for whole-of-group discussions about the research topic. Interpretive focus groups offer opportunity to enhance trustworthiness of findings when researchers undertake secondary analysis of qualitative data. The co-analysis of existing data and co-generation of new data between research participants and researchers informed an emergent transformational grounded theory and subsequent health promoting action.

  1. The evolving definition of systemic arterial hypertension.

    Science.gov (United States)

    Ram, C Venkata S; Giles, Thomas D

    2010-05-01

    Systemic hypertension is an important risk factor for premature cardiovascular disease. Hypertension also contributes to excessive morbidity and mortality. Whereas excellent therapeutic options are available to treat hypertension, there is an unsettled issue about the very definition of hypertension. At what level of blood pressure should we treat hypertension? Does the definition of hypertension change in the presence of co-morbid conditions? This article covers in detail the evolving concepts in the diagnosis and management of hypertension.

  2. Development and the evolvability of human limbs

    OpenAIRE

    Young, Nathan M.; Wagner, Günter P.; Hallgrímsson, Benedikt

    2010-01-01

    The long legs and short arms of humans are distinctive for a primate, the result of selection acting in opposite directions on each limb at different points in our evolutionary history. This mosaic pattern challenges our understanding of the relationship of development and evolvability because limbs are serially homologous and genetic correlations should act as a significant constraint on their independent evolution. Here we test a developmental model of limb covariation in anthropoid primate...

  3. Quantum games on evolving random networks

    OpenAIRE

    Pawela, Łukasz

    2015-01-01

    We study the advantages of quantum strategies in evolutionary social dilemmas on evolving random networks. We focus our study on the two-player games: prisoner's dilemma, snowdrift and stag-hunt games. The obtained results show the benefits of quantum strategies for the prisoner's dilemma game. For the other two games, we obtain regions of parameters where the quantum strategies dominate, as well as regions where the classical strategies coexist.

  4. The Evolving Leadership Path of Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Kluse, Michael; Peurrung, Anthony J.; Gracio, Deborah K.

    2012-01-02

    This is a requested book chapter for an internationally authored book on visual analytics and related fields, coordinated by a UK university and to be published by Springer in 2012. This chapter is an overview of the leadership strategies that PNNL's Jim Thomas and other stakeholders used to establish visual analytics as a field, and how those strategies may evolve in the future.

  5. CMIP6 Data Citation of Evolving Data

    Directory of Open Access Journals (Sweden)

    Martina Stockhause

    2017-06-01

    Full Text Available Data citations have become widely accepted. Technical infrastructures as well as principles and recommendations for data citation are in place but best practices or guidelines for their implementation are not yet available. On the other hand, the scientific climate community requests early citations on evolving data for credit, e.g. for CMIP6 (Coupled Model Intercomparison Project Phase 6. The data citation concept for CMIP6 is presented. The main challenges lie in limited resources, a strict project timeline and the dependency on changes of the data dissemination infrastructure ESGF (Earth System Grid Federation to meet the data citation requirements. Therefore a pragmatic, flexible and extendible approach for the CMIP6 data citation service was developed, consisting of a citation for the full evolving data superset and a data cart approach for citing the concrete used data subset. This two citation approach can be implemented according to the RDA recommendations for evolving data. Because of resource constraints and missing project policies, the implementation of the second part of the citation concept is postponed to CMIP7.

  6. FY1995 evolvable hardware chip; 1995 nendo shinkasuru hardware chip

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This project aims at the development of 'Evolvable Hardware' (EHW), which can adapt its hardware structure to the environment to attain better hardware performance under the control of genetic algorithms. EHW is a key technology for exploring new application areas requiring real-time performance and on-line adaptation. Results: 1. Development of an EHW-LSI for function-level hardware evolution, which includes 15 DSPs in one chip. 2. Application of the EHW to practical industrial applications such as data compression, ATM control, and digital mobile communication. 3. Two patents: (1) the architecture and the processing method for a programmable EHW-LSI; (2) a method of loss-less data compression using EHW. 4. The first international conference on evolvable hardware, the Intl. Conf. on Evolvable Systems (ICES96), was held by the authors; it was decided at ICES96 that ICES will be held every two years, alternating between Japan and Europe, and a new society has accordingly been established. (NEDO)
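
    To make the genetic-algorithm control loop behind EHW concrete, here is a minimal, generic sketch in Python; the function names, parameters, and fitness are illustrative placeholders and are not taken from the EHW-LSI described above.

        import random

        def evolve(fitness, bits=64, pop_size=50, generations=100, mutation_rate=0.02):
            """Evolve a configuration bitstring under a user-supplied fitness function."""
            pop = [[random.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
            for _ in range(generations):
                ranked = sorted(pop, key=fitness, reverse=True)
                parents = ranked[:pop_size // 2]                 # truncation selection
                children = []
                while len(parents) + len(children) < pop_size:
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, bits)              # one-point crossover
                    child = a[:cut] + b[cut:]
                    child = [1 - g if random.random() < mutation_rate else g
                             for g in child]                     # bit-flip mutation
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        # Illustrative fitness: reward configurations with many set bits, a stand-in
        # for a measured hardware score such as compression ratio on test data.
        best = evolve(fitness=sum)
        print(sum(best), "bits set in the best configuration found")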

  7. Structural interpretation of seismic data and inherent uncertainties

    Science.gov (United States)

    Bond, Clare

    2013-04-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and building knowledge from their interpretation. This interpretative basis for the science is fundamental at all levels, from creation of a geological map to interpretation of remotely sensed data. To teach and understand better the uncertainties in dealing with incomplete data we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution in their final output that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies focused on large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placements but also whether interpreters thought faults existed at all, or agreed on their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations. Experts are successful because of their application of these techniques. In a new set of experiments a small number of experts is studied in detail to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight on their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with

  8. Interpretation of Internet technology

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    2001-01-01

    Research scope: The topic of the research project is to investigate how new internet technologies such as e-trade and customer relation marketing and management are implemented in Danish food processing companies. The aim is to use Weick's (1995) sensemaking concept to analyse the strategic...... processes leading to the use of internet marketing technologies and to investigate how these new technologies are interpreted into the organisation. Investigating the organisational socio-cognitive processes underlying the decision making processes will give further insight into the socio...

  9. Changing interpretations of Plotinus

    DEFF Research Database (Denmark)

    Catana, Leo

    2013-01-01

    about method point in other directions. Eduard Zeller (active in the second half of the 19th century) is typically regarded as the first who gave a satisfying account of Plotinus’ philosophy as a whole. In this article, on the other hand, Zeller is seen as the one who finalised a tradition initiated...... in the 18th century. Very few Plotinus scholars have examined the interpretative development prior to Zeller. Schiavone (1952) and Bonetti (1971), for instance, have given little attention to Brucker’s introduction of the concept system of philosophy. The present analysis, then, has value...

  10. Human biomonitoring data interpretation and ethics; obstacles or surmountable challenges?

    Directory of Open Access Journals (Sweden)

    Sepai Ovnair

    2008-01-01

    Full Text Available Abstract The use of human samples to assess environmental exposure and uptake of chemicals is more than an analytical exercise and requires consideration of the utility and interpretation of data as well as due consideration of ethical issues. These aspects are inextricably linked. In 2004 the EC expressed its commitment to the development of a harmonised approach to human biomonitoring (HBM) by including an action in the EU Environment and Health Strategy to develop a Human Biomonitoring Pilot Study. This further underlined the need for interpretation strategies as well as guidance on ethical issues. A workshop held in December 2006 brought together stakeholders from academia, policy makers, non-governmental organisations and chemical industry associations, and over two days built a mutual understanding of the issues in an open and frank discussion forum. This paper describes the discussion and recommendations from the workshop. The workshop developed key recommendations for a Pan-European HBM Study: 1. A strategy for the interpretation of human biomonitoring data should be developed. 2. The pilot study should include the development of a strategy to integrate health data and environmental monitoring with human biomonitoring data at national and international levels. 3. Communication strategies should be developed when designing the study and evolve as the study continues. 4. Early communication with stakeholders is essential to achieve maximum efficacy of policy developments and facilitate subsequent monitoring. 5. Member states will have to apply individually for project approval from their National Research Ethics Committees. 6. The study population needs to have sufficient information on the way data will be gathered, interpreted and disseminated and how samples will be stored and used in the future (if biobanked) before they can give informed consent. 7. The participants must be given the option of anonymity. This has an impact

  11. Physical interpretation of antigravity

    Science.gov (United States)

    Bars, Itzhak; James, Albin

    2016-02-01

    Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl-invariant Standard Model coupled to general relativity (SM +GR ), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated, but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and out signals that interact with the antigravity region. This is no different from a spacetime black box for which the information about its interior is encoded in scattering amplitudes for in/out states at its exterior. Through examples we show that negative kinetic energy in antigravity presents no problems of principles but is an interesting topic for physical investigations of fundamental significance.

  12. Evolving Random Forest for Preference Learning

    DEFF Research Database (Denmark)

    Abou-Zleikha, Mohamed; Shaker, Noor

    2015-01-01

    This paper introduces a novel approach for pairwise preference learning through a combination of an evolutionary method and random forest. Grammatical evolution is used to describe the structure of the trees in the Random Forest (RF) and to handle the process of evolution. Evolved random forests ...... obtained for predicting pairwise self-reports of users for the three emotional states engagement, frustration and challenge show very promising results that are comparable and in some cases superior to those obtained from state-of-the-art methods....
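
    For context, pairwise preference learning is often reduced to binary classification on feature differences. The minimal Python sketch below shows that reduction with an ordinary scikit-learn random forest on synthetic data; it does not reproduce the paper's grammatical-evolution procedure, and all names and values are illustrative.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Toy data: feature vectors for paired sessions (A, B) and a label that is
        # 1 when the user preferred A over B. All values are synthetic placeholders.
        rng = np.random.default_rng(0)
        A = rng.normal(size=(200, 5))
        B = rng.normal(size=(200, 5))
        prefer_A = (A[:, 0] > B[:, 0]).astype(int)   # hidden preference rule

        # Standard reduction: train on feature differences so the classifier
        # predicts the preference direction, using a plain (non-evolved) forest.
        X = A - B
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, prefer_A)
        print("training accuracy:", clf.score(X, prefer_A))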

  13. An evolving network model with community structure

    International Nuclear Information System (INIS)

    Li Chunguang; Maini, Philip K

    2005-01-01

    Many social and biological networks consist of communities-groups of nodes within which connections are dense, but between which connections are sparser. Recently, there has been considerable interest in designing algorithms for detecting community structures in real-world complex networks. In this paper, we propose an evolving network model which exhibits community structure. The network model is based on the inner-community preferential attachment and inter-community preferential attachment mechanisms. The degree distributions of this network model are analysed based on a mean-field method. Theoretical results and numerical simulations indicate that this network model has community structure and scale-free properties
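
    A minimal Python sketch of this kind of growth rule, assuming a single tunable probability p_intra of attaching within the new node's own community and degree-proportional choice of targets; the parameters and details are illustrative rather than the authors' exact model.

        import random

        def grow_community_network(n_nodes=500, n_communities=4, m=2, p_intra=0.8):
            """Grow a network with intra-/inter-community preferential attachment."""
            community, degree, edges = {}, {}, set()
            for c in range(n_communities):          # seed: one node per community
                community[c] = c
                degree[c] = 1                       # +1 smoothing avoids zero weights
            for new in range(n_communities, n_nodes):
                c = random.randrange(n_communities)
                community[new], degree[new] = c, 0
                for _ in range(m):
                    same = [v for v in degree if v != new and community[v] == c]
                    other = [v for v in degree if v != new and community[v] != c]
                    pool = same if (random.random() < p_intra and same) else other
                    weights = [degree[v] + 1 for v in pool]   # preferential attachment
                    target = random.choices(pool, weights=weights, k=1)[0]
                    if (new, target) not in edges and (target, new) not in edges:
                        edges.add((new, target))
                        degree[new] += 1
                        degree[target] += 1
            return edges, community

        edges, community = grow_community_network()
        print(len(edges), "edges grown")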

  14. Radio Imaging of Envelopes of Evolved Stars

    Science.gov (United States)

    Cotton, Bill

    2018-04-01

    This talk will cover imaging of stellar envelopes using radio VLBI techniques; special attention will be paid to the technical differences between radio and optical/IR interferometry. Radio heterodyne receivers allow a straightforward way to derive spectral cubes and full polarization observations. Milliarcsecond resolution of very bright, i.e. non-thermal, emission of molecular masers in the envelopes of evolved stars can be achieved using VLBI techniques with baselines of thousands of km. Emission from SiO, H2O and OH masers is commonly seen at increasing distance from the photosphere. The very narrow maser lines allow accurate measurements of the velocity field within the emitting region.

  15. Evolved Minimal Frustration in Multifunctional Biomolecules.

    Science.gov (United States)

    Röder, Konstantin; Wales, David J

    2018-05-25

    Protein folding is often viewed in terms of a funnelled potential or free energy landscape. A variety of experiments now indicate the existence of multifunnel landscapes, associated with multifunctional biomolecules. Here, we present evidence that these systems have evolved to exhibit the minimal number of funnels required to fulfil their cellular functions, suggesting an extension to the principle of minimum frustration. We find that minimal disruptive mutations result in additional funnels, and the associated structural ensembles become more diverse. The same trends are observed in an atomic cluster. These observations suggest guidelines for rational design of engineered multifunctional biomolecules.

  16. SALT Spectroscopy of Evolved Massive Stars

    Science.gov (United States)

    Kniazev, A. Y.; Gvaramadze, V. V.; Berdnikov, L. N.

    2017-06-01

    Long-slit spectroscopy with the Southern African Large Telescope (SALT) of central stars of mid-infrared nebulae detected with the Spitzer Space Telescope and Wide-Field Infrared Survey Explorer (WISE) led to the discovery of numerous candidate luminous blue variables (cLBVs) and other rare evolved massive stars. With the recent advent of the SALT fiber-fed high-resolution echelle spectrograph (HRS), a new perspective for the study of these interesting objects has appeared. Using the HRS we obtained spectra of a dozen newly identified massive stars. Some results on the recently identified cLBV Hen 3-729 are presented.

  17. Biblical Interpretation Beyond Historicity

    DEFF Research Database (Denmark)

    Biblical Interpretation beyond Historicity evaluates the new perspectives that have emerged since the crisis over historicity in the 1970s and 80s in the field of biblical scholarship. Several new studies in the field, as well as the ‘deconstructive’ side of literary criticism that emerged from...... writers such as Derrida and Wittgenstein, among others, lead biblical scholars today to view the texts of the Bible more as literary narratives than as sources for a history of Israel. Increased interest in archaeological and anthropological studies in writing the history of Palestine and the ancient Near...... and the commitment to a new approach to both the history of Palestine and the Bible’s place in ancient history. This volume features essays from a range of highly regarded scholars, and is divided into three sections: “Beyond Historicity”, which explores alternative historical roles for the Bible, “Greek Connections...

  18. A Weighted Evolving Network with Community Size Preferential Attachment

    International Nuclear Information System (INIS)

    Zhuo Zhiwei; Shan Erfang

    2010-01-01

    Community structure is an important characteristic of real complex networks. Such a network consists of groups of nodes within which links are dense but among which links are sparse. In this paper, the evolving network includes node, link and community growth; we apply community-size preferential attachment and strength preferential attachment to a growing weighted network model and utilize the weight-assigning mechanism from the BBV model. The resulting network reflects the intrinsic community structure with generalized power-law distributions of nodes' degrees and strengths.

  19. Scanning Tunneling Microscopy - image interpretation

    International Nuclear Information System (INIS)

    Maca, F.

    1998-01-01

    The basic ideas of image interpretation in Scanning Tunneling Microscopy are presented using simple quantum-mechanical models and supplied with examples of successful application. The importance of a correct interpretation of the results of this brilliant experimental surface technique is stressed.

  20. Critical Assessment of Metagenome Interpretation

    DEFF Research Database (Denmark)

    Sczyrba, Alexander; Hofmann, Peter; Belmann, Peter

    2017-01-01

    Methods for assembly, taxonomic profiling and binning are key to interpreting metagenome data, but a lack of consensus about benchmarking complicates performance assessment. The Critical Assessment of Metagenome Interpretation (CAMI) challenge has engaged the global developer community to benchma...

  1. Completed lineament interpretation of the Olkiluoto region

    International Nuclear Information System (INIS)

    Paananen, M.

    2013-10-01

    Site characterization activities at Olkiluoto have been taking place for c. 25 years, including a wide range of different geophysical survey methods using various geometries and scales of investigation. The measurements have been done from the air, ground surface, shallow and deep drillholes and the ONKALO underground facility. As a part of the complementary site investigations, two low-altitude geophysical airborne survey campaigns were done around and at Olkiluoto in 2008 and 2009. The survey in 2008 was focused in the Eurajoensalmi area N or NE of Olkiluoto Island. The survey in 2009 covered most of the Olkiluoto Island, the neighbouring sea area and the archipelago W, SW and S of Olkiluoto as well as some of the mainland area SE of Olkiluoto. This report presents a new lineament interpretation based on these new geophysical airborne surveys. For the interpretation work, the data were extensively further processed into different gradients and filtered data sets and maps. Furthermore, the potential of automatic curvature analyses was examined. Also, quantitative profile interpretation was done from a number of profiles to find out the dips and exact locations of the contacts of some features. The qualitative interpretation of the lineaments was carried out by visually inspecting the different versions of the geophysical maps and by digitizing the geometry of each interpreted lineament. The lineaments are collated into two ArcGIS themes (one for magnetic and one for EM lineaments), accompanied by an attribute table that includes a number of attributes for each interpreted feature: lineament identifier, reference to the data used in interpretation, uncertainty, length, average orientation and probable geological character. The total number of new interpreted features is 125 magnetic and 33 electromagnetic lineaments. The main trend of the interpreted features varies between WNW-ESE and NNW-SSE. Furthermore, trends in directions almost N-S and E-W are also

  2. Completed lineament interpretation of the Olkiluoto region

    Energy Technology Data Exchange (ETDEWEB)

    Paananen, M. [Geological Survey of Finland, Espoo (Finland)

    2013-10-15

    Site characterization activities at Olkiluoto have been taking place for c. 25 years, including a wide range of different geophysical survey methods using various geometries and scales of investigation. The measurements have been done from the air, ground surface, shallow and deep drillholes and the ONKALO underground facility. As a part of the complementary site investigations, two low-altitude geophysical airborne survey campaigns were done around and at Olkiluoto in 2008 and 2009. The survey in 2008 was focused in the Eurajoensalmi area N or NE of Olkiluoto Island. The survey in 2009 covered most of the Olkiluoto Island, the neighbouring sea area and the archipelago W, SW and S of Olkiluoto as well as some of the mainland area SE of Olkiluoto. This report presents a new lineament interpretation based on these new geophysical airborne surveys. For the interpretation work, the data were extensively further processed into different gradients and filtered data sets and maps. Furthermore, the potential of automatic curvature analyses was examined. Also, quantitative profile interpretation was done from a number of profiles to find out the dips and exact locations of the contacts of some features. The qualitative interpretation of the lineaments was carried out by visually inspecting the different versions of the geophysical maps and by digitizing the geometry of each interpreted lineament. The lineaments are collated into two ArcGIS themes (one for magnetic and one for EM lineaments), accompanied by an attribute table that includes a number of attributes for each interpreted feature: lineament identifier, reference to the data used in interpretation, uncertainty, length, average orientation and probable geological character. The total number of new interpreted features is 125 magnetic and 33 electromagnetic lineaments. The main trend of the interpreted features varies between WNW-ESE and NNW-SSE. Furthermore, trends in directions almost N-S and E-W are also

  3. Participatory Interpretive Training for Tikal National Park, Guatemala.

    Science.gov (United States)

    Jacobson, Susan K.; Jurado, Magali

    1996-01-01

    Describes an interpretive training course for Tikal National Park, Guatemala to promote environmentally sound management of the region. Goals were to ensure that local knowledge and cultural norms were included in the design of interpretive materials, to introduce resource managers to park interpretation through course participation, and to train…

  4. BOOK REVIEW: OPENING SCIENCE, THE EVOLVING GUIDE ...

    Science.gov (United States)

    The way we get our funding, collaborate, do our research, and get the word out has evolved over hundreds of years but we can imagine a more open science world, largely facilitated by the internet. The movement towards this more open way of doing and presenting science is coming, and it is not taking hundreds of years. If you are interested in these trends, and would like to find out more about where this is all headed and what it means to you, consider downloading Opening Science, edited by Sönke Bartling and Sascha Friesike, subtitled The Evolving Guide on How the Internet is Changing Research, Collaboration, and Scholarly Publishing. In 26 chapters by various authors from a range of disciplines the book explores the developing world of open science, starting from the first scientific revolution and bringing us to the next scientific revolution, sometimes referred to as "Science 2.0". Some of the articles deal with the impact of the changing landscape of how science is done, looking at the impact of open science on Academia, or journal publishing, or medical research. Many of the articles look at the uses, pitfalls, and impact of specific tools, like microblogging (think Twitter), social networking, and reference management. There is lots of discussion and definition of terms you might use or misuse like "altmetrics" and "impact factor". Science will probably never be completely open, and Twitter will probably never replace the journal article,

  5. Evolving NASA's Earth Science Data Systems

    Science.gov (United States)

    Walter, J.; Behnke, J.; Murphy, K. J.; Lowe, D. R.

    2013-12-01

    NASA's Earth Science Data and Information System Project (ESDIS) is charged with managing, maintaining, and evolving NASA's Earth Observing System Data and Information System (EOSDIS) and is responsible for processing, archiving, and distributing NASA Earth science data. The system supports a multitude of missions and serves diverse science research and other user communities. Keeping up with ever-changing information technology and figuring out how to leverage those changes across such a large system in order to continuously improve and meet the needs of a diverse user community is a significant challenge. Maintaining and evolving the system architecture and infrastructure is a continuous and multi-layered effort. It requires a balance between a "top down" management paradigm that provides a coherent system view and maintaining the managerial, technological, and functional independence of the individual system elements. This presentation will describe some of the key elements of the current system architecture, some of the strategies and processes we employ to meet these challenges, current and future challenges, and some ideas for meeting those challenges.

  6. The Comet Cometh: Evolving Developmental Systems.

    Science.gov (United States)

    Jaeger, Johannes; Laubichler, Manfred; Callebaut, Werner

    In a recent opinion piece, Denis Duboule has claimed that the increasing shift towards systems biology is driving evolutionary and developmental biology apart, and that a true reunification of these two disciplines within the framework of evolutionary developmental biology (EvoDevo) may easily take another 100 years. He identifies methodological, epistemological, and social differences as causes for this supposed separation. Our article provides a contrasting view. We argue that Duboule's prediction is based on a one-sided understanding of systems biology as a science that is only interested in functional, not evolutionary, aspects of biological processes. Instead, we propose a research program for an evolutionary systems biology, which is based on local exploration of the configuration space in evolving developmental systems. We call this approach, which is based on reverse engineering, simulation, and mathematical analysis, the natural history of configuration space. We discuss a number of illustrative examples that demonstrate the past success of local exploration, as opposed to global mapping, in different biological contexts. We argue that this pragmatic mode of inquiry can be extended and applied to the mathematical analysis of the developmental repertoire and evolutionary potential of evolving developmental mechanisms and that evolutionary systems biology so conceived provides a pragmatic epistemological framework for the EvoDevo synthesis.

  7. The interpretation of administrative contracts

    Directory of Open Access Journals (Sweden)

    Cătălin-Silviu SĂRARU

    2014-06-01

    Full Text Available The article analyzes the principles of interpretation of administrative contracts in French law and in Romanian law. The article highlights derogations from the rules of contract interpretation in the common law of contracts. The exceptions to the principle of good faith, the principle of common intention (the will of the parties), the principle of good administration, and the principle of extensive interpretation of the administrative contract are examined. The article highlights the importance and role of interpretation in administrative contracts.

  8. Hidden variable interpretation of spontaneous localization theory

    Energy Technology Data Exchange (ETDEWEB)

    Bedingham, Daniel J, E-mail: d.bedingham@imperial.ac.uk [Blackett Laboratory, Imperial College, London SW7 2BZ (United Kingdom)

    2011-07-08

    The spontaneous localization theory of Ghirardi, Rimini, and Weber (GRW) is a theory in which wavepacket reduction is treated as a genuine physical process. Here it is shown that the mathematical formalism of GRW can be given an interpretation in terms of an evolving distribution of particles on configuration space similar to Bohmian mechanics (BM). The GRW wavefunction acts as a pilot wave for the set of particles. In addition, a continuous stream of noisy information concerning the precise whereabouts of the particles must be specified. Nonlinear filtering techniques are used to determine the dynamics of the distribution of particles conditional on this noisy information and consistency with the GRW wavefunction dynamics is demonstrated. Viewing this development as a hybrid BM-GRW theory, it is argued that, besides helping to clarify the relationship between the GRW theory and BM, its merits make it worth considering in its own right.
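
    As background on the nonlinear-filtering step mentioned above, here is a minimal bootstrap particle filter in Python: a cloud of particles is propagated and then reweighted and resampled against a stream of noisy observations. It illustrates the generic technique only, not the specific hybrid BM-GRW construction, and every number in it is an illustrative placeholder.

        import numpy as np

        rng = np.random.default_rng(1)
        n_particles, n_steps = 1000, 50
        x_true = 0.0
        particles = rng.normal(0.0, 1.0, n_particles)   # prior particle cloud

        for _ in range(n_steps):
            x_true += rng.normal(0.0, 0.1)                         # hidden dynamics
            y = x_true + rng.normal(0.0, 0.5)                      # noisy observation
            particles += rng.normal(0.0, 0.1, n_particles)         # propagate particles
            weights = np.exp(-0.5 * ((y - particles) / 0.5) ** 2)  # likelihood of y
            weights /= weights.sum()
            idx = rng.choice(n_particles, n_particles, p=weights)  # resample
            particles = particles[idx]

        print("true state:", round(x_true, 3), "filtered estimate:", round(particles.mean(), 3))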

  9. Wilhelm Wundt's Theory of Interpretation

    Directory of Open Access Journals (Sweden)

    Jochen Fahrenberg

    2008-09-01

    Full Text Available Wilhelm WUNDT was a pioneer in experimental and physiological psychology. However, his theory of interpretation (hermeneutics) remains virtually neglected. According to WUNDT psychology belongs to the domain of the humanities (Geisteswissenschaften), and, throughout his books and research, he advocated two basic methodologies: experimentation (as the means of controlled self-observation) and interpretative analysis of mental processes and products. He was an experimental psychologist and a profound expert in traditional hermeneutics. Today, he still may be acknowledged as the author of the monumental Völkerpsychologie, but not for his advances in epistemology and methodology. His subsequent work, the Logik (1908/1921), contains about 120 pages on hermeneutics. In the present article a number of issues are addressed. Noteworthy was WUNDT's general intention to account for the logical constituents and the psychological process of understanding, and his reflections on quality control. In general, WUNDT demanded methodological pluralism and a complementary approach to the study of consciousness and neurophysiological processes. In the present paper WUNDT's approach is related to the continuing controversy on basic issues in methodology, e.g. experimental and statistical methods vs. qualitative (hermeneutic) methods. Varied explanations are given for the one-sided or distorted reception of WUNDT's methodology. Presently, in Germany the basic program of study in psychology lacks thorough teaching and training in qualitative (hermeneutic) methods. Appropriate courses are not included in the curricula, in contrast to the training in experimental design, observation methods, and statistics. URN: urn:nbn:de:0114-fqs0803291

  10. Network Analysis of Earth's Co-Evolving Geosphere and Biosphere

    Science.gov (United States)

    Hazen, R. M.; Eleish, A.; Liu, C.; Morrison, S. M.; Meyer, M.; Consortium, K. D.

    2017-12-01

    A fundamental goal of Earth science is the deep understanding of Earth's dynamic, co-evolving geosphere and biosphere through deep time. Network analysis of geo- and bio-'big data' provides an interactive, quantitative, and predictive visualization framework to explore complex and otherwise hidden high-dimension features of diversity, distribution, and change in the evolution of Earth's geochemistry, mineralogy, paleobiology, and biochemistry [1]. Networks also facilitate quantitative comparison of different geological time periods, tectonic settings, and geographical regions, as well as different planets and moons, through network metrics, including density, centralization, diameter, and transitivity. We render networks by employing data related to geographical, paragenetic, environmental, or structural relationships among minerals, fossils, proteins, and microbial taxa. An important recent finding is that the topography of many networks reflects parameters not explicitly incorporated in constructing the network. For example, networks for minerals, fossils, and protein structures reveal embedded qualitative time axes, with additional network geometries possibly related to extinction and/or other punctuation events (see Figure). Other axes related to chemical activities and volatile fugacities, as well as pressure and/or depth of formation, may also emerge from network analysis. These patterns provide new insights into the way planets evolve, especially Earth's co-evolving geosphere and biosphere. Reference: 1. Morrison, S.M. et al. (2017) Network analysis of mineralogical systems. American Mineralogist 102, in press. Figure caption: A network of Phanerozoic Era fossil animals from the past 540 million years includes blue, red, and black circles (nodes) representing family-level taxa and grey lines (links) between coexisting families. Age information was not used in the construction of this network; nevertheless an intrinsic timeline is embedded in the network topology. In
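
    The network metrics named above can be computed with standard tools. The short Python sketch below uses networkx on a stand-in graph; the graph and the meaning of its nodes are placeholders, not the actual mineral or fossil networks, and degree centralization is computed by hand since networkx offers no single Freeman-centralization call.

        import networkx as nx

        # Stand-in co-occurrence network; nodes could be mineral species or fossil
        # families, with an edge whenever two co-occur at a locality.
        G = nx.karate_club_graph()

        density = nx.density(G)
        transitivity = nx.transitivity(G)
        diameter = nx.diameter(G)                    # assumes a connected graph
        deg = dict(G.degree())
        n = G.number_of_nodes()
        max_deg = max(deg.values())
        # Freeman degree centralization: spread of degrees relative to a star graph.
        centralization = sum(max_deg - d for d in deg.values()) / ((n - 1) * (n - 2))

        print(f"density={density:.3f} transitivity={transitivity:.3f} "
              f"diameter={diameter} centralization={centralization:.3f}")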

  11. Deaf-Blind Interpreting: Building on What You Already Know

    OpenAIRE

    Petronio, Karen

    2010-01-01

    http://dx.doi.org/10.5007/2175-7968.2010v2n26p237 This article focuses on visual considerations and describes the numerous similarities between video interpreting and deaf-blind interpreting. It also looks at linguistic considerations for deaf-blind interpreting and presents research findings showing similarities and differences between ASL and Tactile ASL. Because many interpreters are unfamiliar with tactile communication, there is a section that includes an overview of Tactile ASL. The...

  12. Interpretations, perspectives and intentions in surrogate motherhood

    OpenAIRE

    van Zyl, L.; van Niekerk, A.

    2000-01-01

    In this paper we examine the questions "What does it mean to be a surrogate mother?" and "What would be an appropriate perspective for a surrogate mother to have on her pregnancy?" In response to the objection that such contracts are alienating or dehumanising since they require women to suppress their evolving perspective on their pregnancies, liberal supporters of surrogate motherhood argue that the freedom to contract includes the freedom to enter a contract to bear a child for an infertil...

  13. Evolvability as a Quality Attribute of Software Architectures

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Duchien, Laurence; D'Hondt, Maja; Mens, Tom

    We review the definition of evolvability as it appears on the literature. In particular, the concept of software evolvability is compared with other system quality attributes, such as adaptability, maintainability and modifiability.

  14. Evolving colon injury management: a review.

    Science.gov (United States)

    Greer, Lauren T; Gillern, Suzanne M; Vertrees, Amy E

    2013-02-01

    The colon is the second most commonly injured intra-abdominal organ in penetrating trauma. Management of traumatic colon injuries has evolved significantly over the past 200 years. Traumatic colon injuries can have a wide spectrum of severity, presentation, and management options. There is strong evidence that most non-destructive colon injuries can be successfully managed with primary repair or primary anastomosis. The management of destructive colon injuries remains controversial with most favoring resection with primary anastomosis and others favor colonic diversion in specific circumstances. The historical management of traumatic colon injuries, common mechanisms of injury, demographics, presentation, assessment, diagnosis, management, and complications of traumatic colon injuries both in civilian and military practice are reviewed. The damage control revolution has added another layer of complexity to management with continued controversy.

  15. Pulmonary Sporotrichosis: An Evolving Clinical Paradigm.

    Science.gov (United States)

    Aung, Ar K; Spelman, Denis W; Thompson, Philip J

    2015-10-01

    In recent decades, sporotrichosis, caused by thermally dimorphic fungi Sporothrix schenckii complex, has become an emerging infection in many parts of the world. Pulmonary infection with S. schenckii still remains relatively uncommon, possibly due to underrecognition. Pulmonary sporotrichosis presents with distinct clinical and radiological patterns in both immunocompetent and immunocompromised hosts and can often result in significant morbidity and mortality despite treatment. Current understanding regarding S. schenckii biology, epidemiology, immunopathology, clinical diagnostics, and treatment options has been evolving in the recent years with increased availability of molecular sequencing techniques. However, this changing knowledge has not yet been fully translated into a better understanding of the clinical aspects of pulmonary sporotrichosis, as such current management guidelines remain unsupported by high-level clinical evidence. This article examines recent advances in the knowledge of sporotrichosis and its application to the difficult challenges of managing pulmonary sporotrichosis.

  16. Resiliently evolving supply-demand networks

    Science.gov (United States)

    Rubido, Nicolás; Grebogi, Celso; Baptista, Murilo S.

    2014-01-01

    The ability to design a transport network such that commodities are brought from suppliers to consumers in a steady, optimal, and stable way is of great importance for distribution systems nowadays. In this work, by using the circuit laws of Kirchhoff and Ohm, we provide the exact capacities of the edges that an optimal supply-demand network should have to operate stably under perturbations, i.e., without overloading. The perturbations we consider are the evolution of the connecting topology, the decentralization of hub sources or sinks, and the intermittence of supplier and consumer characteristics. We analyze these conditions and the impact of our results, both on the current United Kingdom power-grid structure and on numerically generated evolving archetypal network topologies.
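
    A minimal numerical illustration of the circuit-law machinery invoked above, assuming unit conductances on a toy four-node network; the paper derives the exact capacities analytically, and this sketch only shows how node potentials and edge flows follow from Kirchhoff's and Ohm's laws via the graph Laplacian.

        import numpy as np

        # Toy supply-demand network: nodes 0 and 1 supply one unit each, node 3
        # consumes two units, and every edge has unit conductance.
        edges = [(0, 2), (1, 2), (2, 3), (0, 3)]
        n = 4
        s = np.array([1.0, 1.0, 0.0, -2.0])     # net injections (supply +, demand -)

        L = np.zeros((n, n))                     # weighted graph Laplacian
        for i, j in edges:
            L[i, i] += 1; L[j, j] += 1
            L[i, j] -= 1; L[j, i] -= 1

        v = np.linalg.pinv(L) @ s                # node potentials (Kirchhoff's laws)
        for i, j in edges:
            flow = v[i] - v[j]                   # Ohm's law with unit conductance
            print(f"edge {i}-{j}: flow {flow:+.3f} -> capacity needed {abs(flow):.3f}")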

  17. Development and the evolvability of human limbs.

    Science.gov (United States)

    Young, Nathan M; Wagner, Günter P; Hallgrímsson, Benedikt

    2010-02-23

    The long legs and short arms of humans are distinctive for a primate, the result of selection acting in opposite directions on each limb at different points in our evolutionary history. This mosaic pattern challenges our understanding of the relationship of development and evolvability because limbs are serially homologous and genetic correlations should act as a significant constraint on their independent evolution. Here we test a developmental model of limb covariation in anthropoid primates and demonstrate that both humans and apes exhibit significantly reduced integration between limbs when compared to quadrupedal monkeys. This result indicates that fossil hominins likely escaped constraints on independent limb variation via reductions to genetic pleiotropy in an ape-like last common ancestor (LCA). This critical change in integration among hominoids, which is reflected in macroevolutionary differences in the disparity between limb lengths, facilitated selection for modern human limb proportions and demonstrates how development helps shape evolutionary change.

  18. Evolving spiking networks with variable resistive memories.

    Science.gov (United States)

    Howard, Gerard; Bull, Larry; de Lacy Costello, Ben; Gale, Ella; Adamatzky, Andrew

    2014-01-01

    Neuromorphic computing is a brainlike information processing paradigm that requires adaptive learning mechanisms. A spiking neuro-evolutionary system is used for this purpose; plastic resistive memories are implemented as synapses in spiking neural networks. The evolutionary design process exploits parameter self-adaptation and allows the topology and synaptic weights to be evolved for each network in an autonomous manner. Variable resistive memories are the focus of this research; each synapse has its own conductance profile which modifies the plastic behaviour of the device and may be altered during evolution. These variable resistive networks are evaluated on a noisy robotic dynamic-reward scenario against two static resistive memories and a system containing standard connections only. The results indicate that the extra behavioural degrees of freedom available to the networks incorporating variable resistive memories enable them to outperform the comparative synapse types.

  19. Argentina and Brazil: an evolving nuclear relationship

    International Nuclear Information System (INIS)

    Redick, J.R.

    1990-01-01

    Argentina and Brazil have Latin America's most advanced nuclear research and power programs. Both nations reject the Non-Proliferation Treaty (NPT), and have not formally embraced the Tlatelolco Treaty creating a regional nuclear-weapon-free zone. Disturbing ambiguities persist regarding certain indigenous nuclear facilities and growing nuclear submarine and missile capabilities. For these, and other reasons, the two nations are widely considered potential nuclear weapon states. However both nations have been active supporters of the International Atomic Energy Agency (IAEA) and have, in recent years, assumed a generally responsible position in regard to their own nuclear export activities (requiring IAEA safeguards). Most important, however, has been the advent of bilateral nuclear cooperation. This paper considers the evolving nuclear relationship in the context of recent and dramatic political change in Argentina and Brazil. It discusses current political and nuclear developments and the prospects for maintaining and expanding present bilateral cooperation into an effective non-proliferation arrangement. (author)

  20. The genotype-phenotype map of an evolving digital organism

    OpenAIRE

    Fortuna, Miguel A.; Zaman, Luis; Ofria, Charles; Wagner, Andreas

    2017-01-01

    To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms fr...

  1. Delineating slowly and rapidly evolving fractions of the Drosophila genome.

    Science.gov (United States)

    Keith, Jonathan M; Adams, Peter; Stephen, Stuart; Mattick, John S

    2008-05-01

    Evolutionary conservation is an important indicator of function and a major component of bioinformatic methods to identify non-protein-coding genes. We present a new Bayesian method for segmenting pairwise alignments of eukaryotic genomes while simultaneously classifying segments into slowly and rapidly evolving fractions. We also describe an information criterion similar to the Akaike Information Criterion (AIC) for determining the number of classes. Working with pairwise alignments enables detection of differences in conservation patterns among closely related species. We analyzed three whole-genome and three partial-genome pairwise alignments among eight Drosophila species. Three distinct classes of conservation level were detected. Sequences comprising the most slowly evolving component were consistent across a range of species pairs, and constituted approximately 62-66% of the D. melanogaster genome. Almost all (>90%) of the aligned protein-coding sequence is in this fraction, suggesting much of it (comprising the majority of the Drosophila genome, including approximately 56% of non-protein-coding sequences) is functional. The size and content of the most rapidly evolving component was species dependent, and varied from 1.6% to 4.8%. This fraction is also enriched for protein-coding sequence (while containing significant amounts of non-protein-coding sequence), suggesting it is under positive selection. We also classified segments according to conservation and GC content simultaneously. This analysis identified numerous sub-classes of those identified on the basis of conservation alone, but was nevertheless consistent with that classification. Software, data, and results available at www.maths.qut.edu.au/-keithj/. Genomic segments comprising the conservation classes are available in BED format.
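
    The abstract describes its model-selection rule only as an information criterion similar to the AIC; for reference, the standard Akaike Information Criterion it resembles (the paper's exact criterion is not given in the abstract) is:

```latex
% k = number of estimated parameters, \hat{L} = maximized likelihood; the
% number of conservation classes would be chosen to minimize this score.
\mathrm{AIC} = 2k - 2\ln\hat{L}
```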

  2. Japanese experience of evolving nurses' roles in changing social contexts.

    Science.gov (United States)

    Kanbara, S; Yamamoto, Y; Sugishita, T; Nakasa, T; Moriguchi, I

    2017-06-01

    To discuss the evolving roles of Japanese nurses in meeting the goals and concerns of ongoing global sustainable development. Japanese nurses' roles have evolved as the needs of the country and the communities they served changed over time. Comprehensive public healthcare services in Japan were provided through the cooperation of hospitals and public health nurses. The nursing profession is exploring ways to identify and systemize nursing skills and competencies that address global health initiatives for the sustainable development goals. This paper is based on the summary of a symposium (part of the 2015 annual meeting of the Japan Association for International Health) with panel members including experts from Japan's Official Development Assistance. The evolving role of nurses in response to national and international needs is illustrated by nursing practices from Japan. Japanese public health nurses have also assisted overseas healthcare plans. In recent catastrophes, Japanese nurses assumed the roles of community health coordinators for the restoration and maintenance of public health. The Japanese experience shows that nursing professionals are best placed to work with community health issues, high-risk situations and vulnerable communities. Their cooperation can address current social needs and help global communities to transform our world. Nurses have tremendous potential to make transformative changes in health and bring about the necessary paradigm shift. They must be involved in global sustainable development goals, health policies and disaster risk management. A mutual understanding between global citizens and nurses will help to renew and strengthen their capacities. Nursing professionals can contribute effectively to achieving national and global health goals and making transformative changes. © 2017 International Council of Nurses.

  3. Orientalisms: New Interpretative Perspectives

    Directory of Open Access Journals (Sweden)

    Gabriele Proglio

    2012-11-01

    Full Text Available This paper reconsiders the concept of Orientalism from a new and multiple perspective, and proposes a different interpretation of the relationship between culture and power, starting from Edward Said's theoretical frame of reference. If Said's representational model is repositioned outside structuralist and Foucauldian frameworks and separated from the Gramscian idea of hegemony-subordination, it may indeed be possible to re-discuss the traditional profile identifying the Other in European cultures. My basic assumption here is that Orientalism should not be understood as a consensus mechanism able to produce diversified images of the Orient and the Oriental on demand. Although in most cases Orientalism is connected to the issue of power, its meanings could also be explained otherwise, as will be shown. Take The Invisible Cities by Italo Calvino as an example: the narratives are not just multiple repetitions of Venice (in Said's case, the same would hold for Europeanism), but could be strategically re-appropriated by those "others" and "alterities" whose bodies and identities are imposed by the Eurocentric discourse. In this sense, a double link may be identified with queer theories and postcolonial studies, and the notion of subordination will be rethought. Finally, from the above-mentioned borders, a new idea of image emerges, which appears linear, uniform and flattened only to the European gaze, whereas in actual fact it is made of imaginaries and forms of knowledge which combine representation with the conceptualization of power relationships.

  4. A mathematical model for interpretable clinical decision support with applications in gynecology.

    Directory of Open Access Journals (Sweden)

    Vanya M C A Van Belle

    Full Text Available Over time, methods for the development of clinical decision support (CDS) systems have evolved from interpretable and easy-to-use scoring systems to very complex and non-interpretable mathematical models. In order to accomplish effective decision support, CDS systems should provide information on how the model arrives at a certain decision. To address the issue of incompatibility between performance, interpretability and applicability of CDS systems, this paper proposes an innovative model structure, automatically leading to interpretable and easily applicable models. The resulting models can be used to guide clinicians when deciding upon the appropriate treatment, estimating patient-specific risks and to improve communication with patients. We propose the interval coded scoring (ICS) system, which imposes that the effect of each variable on the estimated risk is constant within consecutive intervals. The number and position of the intervals are automatically obtained by solving an optimization problem, which additionally performs variable selection. The resulting model can be visualised by means of appealing scoring tables and color bars. ICS models can be used within software packages, in smartphone applications, or on paper, which is particularly useful for bedside medicine and home-monitoring. The ICS approach is illustrated on two gynecological problems: diagnosis of malignancy of ovarian tumors using a dataset containing 3,511 patients, and prediction of first trimester viability of pregnancies using a dataset of 1,435 women. Comparison of the performance of the ICS approach with a range of prediction models proposed in the literature illustrates the ability of ICS to combine optimal performance with the interpretability of simple scoring systems. The ICS approach can improve patient-clinician communication and will provide additional insights into the importance and influence of available variables. Future challenges include extensions of the
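
    A minimal sketch of how an interval coded score of the kind described might be evaluated is given below: each variable's range is cut into consecutive intervals, each interval contributes a constant number of points, and the total maps to a risk estimate through a logistic link. The variables, cut points, point values, and link coefficients are hypothetical, not the fitted ICS models from the paper.

```python
# Minimal sketch of evaluating an interval-coded score: each variable's range is
# split into consecutive intervals, and each interval contributes a fixed number
# of points. Intervals, points and the logistic link are hypothetical.
import bisect
import math

# (cut points, points per resulting interval) per variable; len(points) == len(cuts) + 1
SCORING_TABLE = {
    "age":           ([35, 50, 65], [0, 1, 2, 3]),
    "tumor_size_mm": ([20, 50],     [0, 2, 4]),
    "ca125":         ([35, 200],    [0, 1, 3]),
}

def interval_coded_score(patient):
    """Sum the points of the interval each measurement falls into."""
    total = 0
    for var, (cuts, points) in SCORING_TABLE.items():
        idx = bisect.bisect_right(cuts, patient[var])
        total += points[idx]
    return total

def risk(score, intercept=-4.0, slope=0.8):
    """Map the integer score to a risk estimate with a logistic link."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * score)))

patient = {"age": 58, "tumor_size_mm": 42, "ca125": 180}
s = interval_coded_score(patient)
print(f"score = {s}, estimated risk = {risk(s):.2f}")
```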

  5. New seismograph includes filters

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-02

    The new Nimbus ES-1210 multichannel signal enhancement seismograph from EG&G Geometrics has recently been redesigned to include multimode signal filters on each amplifier. The ES-1210F is a shallow exploration seismograph for near-subsurface exploration such as depth-to-bedrock determination, geological hazard location, mineral exploration, and landslide investigations.

  6. Radiological interpretation 2020: Toward quantitative image assessment

    International Nuclear Information System (INIS)

    Boone, John M.

    2007-01-01

    The interpretation of medical images by radiologists is primarily and fundamentally a subjective activity, but there are a number of clinical applications such as tumor imaging where quantitative imaging (QI) metrics (such as tumor growth rate) would be valuable to the patient’s care. It is predicted that the subjective interpretive environment of the past will, over the next decade, evolve toward the increased use of quantitative metrics for evaluating patient health from images. The increasing sophistication and resolution of modern tomographic scanners promote the development of meaningful quantitative end points, determined from images which are in turn produced using well-controlled imaging protocols. For the QI environment to expand, medical physicists, physicians, other researchers and equipment vendors need to work collaboratively to develop the quantitative protocols for imaging, scanner calibrations, and robust analytical software that will lead to the routine inclusion of quantitative parameters in the diagnosis and therapeutic assessment of human health. Most importantly, quantitative metrics need to be developed which have genuine impact on patient diagnosis and welfare, and only then will QI techniques become integrated into the clinical environment.

  7. Analytic device including nanostructures

    KAUST Repository

    Di Fabrizio, Enzo M.; Fratalocchi, Andrea; Totero Gongora, Juan Sebastian; Coluccio, Maria Laura; Candeloro, Patrizio; Cuda, Gianni

    2015-01-01

    A device for detecting an analyte in a sample comprising: an array including a plurality of pixels, each pixel including a nanochain comprising: a first nanostructure, a second nanostructure, and a third nanostructure, wherein size of the first nanostructure is larger than that of the second nanostructure, and size of the second nanostructure is larger than that of the third nanostructure, and wherein the first nanostructure, the second nanostructure, and the third nanostructure are positioned on a substrate such that when the nanochain is excited by an energy, an optical field between the second nanostructure and the third nanostructure is stronger than an optical field between the first nanostructure and the second nanostructure, wherein the array is configured to receive a sample; and a detector arranged to collect spectral data from a plurality of pixels of the array.

  8. Saskatchewan resources. [including uranium

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    The production of chemicals and minerals for the chemical industry in Saskatchewan are featured, with some discussion of resource taxation. The commodities mentioned include potash, fatty amines, uranium, heavy oil, sodium sulfate, chlorine, sodium hydroxide, sodium chlorate and bentonite. Following the successful outcome of the Cluff Lake inquiry, the uranium industry is booming. Some developments and production figures for Gulf Minerals, Amok, Cenex and Eldorado are mentioned.

  9. Working memory and simultaneous interpreting

    OpenAIRE

    Timarova, Sarka

    2009-01-01

    Working memory is a cognitive construct underlying a number of abilities, and it has been hypothesised for many years that it is crucial for interpreting. A number of studies have been conducted with the aim to support this hypothesis, but research has not yielded convincing results. Most researchers focused on studying working memory differences between interpreters and non-interpreters with the rationale that differences in working memory between the two groups would provide evidence of wor...

  10. Evolving Microbial Communities in Cellulose-Fed Microbial Fuel Cell

    Directory of Open Access Journals (Sweden)

    Renata Toczyłowska-Mamińska

    2018-01-01

    Full Text Available The abundance of cellulosic wastes makes them an attractive source of energy for producing electricity in microbial fuel cells (MFCs). However, electricity production from cellulose requires obligate anaerobes that can degrade cellulose and transfer electrons to the electrode (exoelectrogens), and thus most previous MFC studies have been conducted using two-chamber systems to avoid oxygen contamination of the anode. Single-chamber, air-cathode MFCs typically produce higher power densities than aqueous-catholyte MFCs and avoid energy input for the cathodic reaction. To better understand the bacterial communities that evolve in single-chamber air-cathode MFCs fed cellulose, we examined the changes in the bacterial consortium in an MFC fed cellulose over time. The predominant bacteria shown to be capable of electron generation were Firmicutes, with Bacteroidetes acting as the fermenters decomposing cellulose. The main genera that developed after extended operation of the cellulose-fed MFC were cellulolytic strains, fermenters and electrogens, including Parabacteroides, Proteiniphilum, Catonella and Clostridium. These results demonstrate that different communities evolve in air-cathode MFCs fed cellulose than in the previous two-chamber reactors.

  11. Evolving technologies drive the new roles of Biomedical Engineering.

    Science.gov (United States)

    Frisch, P H; St Germain, J; Lui, W

    2008-01-01

    Rapidly changing technology, coupled with the financial impact of organized health care, has required hospital Biomedical Engineering organizations to augment their traditional operational and business models to increase their role in developing enhanced clinical applications utilizing new and evolving technologies. The deployment of these technology-based applications has required Biomedical Engineering organizations to re-organize to optimize the manner in which they provide and manage services. Memorial Sloan-Kettering Cancer Center has implemented a strategy to explore evolving technologies, integrating them into enhanced clinical applications while optimally utilizing the expertise of the traditional Biomedical Engineering component (Clinical Engineering) to provide expanded support in technology/equipment management, device repair, preventive maintenance and integration with legacy clinical systems. Specifically, Biomedical Engineering is an integral component of the Medical Physics Department, which provides comprehensive and integrated support to the Center in advanced physical, technical and engineering technology. This organizational structure emphasizes the integration and collaboration between a spectrum of technical expertise for clinical support and equipment management roles. The high cost of clinical equipment purchases, coupled with the increasing cost of service, has driven equipment management responsibilities to include significant business and financial aspects to provide a cost-effective service model. This case study details the dynamics of these expanded roles, future initiatives and benefits for Biomedical Engineering and Memorial Sloan-Kettering Cancer Center.

  12. International Conference “Ultraviolet Properties of Evolved Stellar Populations

    CERN Document Server

    Chavez Dagostino, Miguel

    2009-01-01

    This book presents an up-to-date collection of reviews and contributed articles in the field of ultraviolet astronomy. Its content has been mainly motivated by the recent access to the rest-frame UV light of distant red galaxies, gained through large optical facilities. This access has led to a renewed interest in the stars that presumably dominate, or have important effects on, the integrated UV properties of evolved systems in the nearby and distant Universe. The topics included in this volume extend from fresh spectroscopic analyses of high-redshift early-type galaxies observed with the 8-10 m class telescopes to the fundamental outcomes from various satellites, from the long-lived International Ultraviolet Explorer to current facilities such as the Galaxy Evolution Explorer. This is one of the few volumes published in recent years devoted to UV astronomical research and the only one dedicated to the properties of evolved stellar populations at these wavelengths. This contemporary panorama will be ...

  13. How Life and Rocks Have Co-Evolved

    Science.gov (United States)

    Hazen, R.

    2014-04-01

    The near-surface environment of terrestrial planets and moons evolves as a consequence of selective physical, chemical, and biological processes - an evolution that is preserved in the mineralogical record. Mineral evolution begins with approximately 12 different refractory minerals that form in the cooling envelopes of exploding stars. Subsequent aqueous and thermal alteration of planetesimals results in the approximately 250 minerals now found in unweathered lunar and meteorite samples. Following Earth's accretion and differentiation, mineral evolution resulted from a sequence of geochemical and petrologic processes, which led to perhaps 1500 mineral species. According to some origin-of-life scenarios, a planet must progress through at least some of these stages of chemical processing as a prerequisite for life. Once life emerged, mineralogy and biology co-evolved and dramatically increased Earth's mineral diversity to >4000 species. Sequential stages of a planet's near-surface evolution arise from three primary mechanisms: (1) the progressive separation and concentration of the elements from their original relatively uniform distribution in the presolar nebula; (2) the increase in range of intensive variables such as pressure, temperature, and volatile activities; and (3) the generation of far-from-equilibrium conditions by living systems. Remote observations of the mineralogy of other terrestrial bodies may thus provide evidence for biological influences beyond Earth. Recent studies of mineral diversification through time reveal striking correlations with major geochemical, tectonic, and biological events, including large changes in ocean chemistry, the supercontinent cycle, the increase of atmospheric oxygen, and the rise of the terrestrial biosphere.

  14. Evolving cell models for systems and synthetic biology.

    Science.gov (United States)

    Cao, Hongqing; Romero-Campero, Francisco J; Heeb, Stephan; Cámara, Miguel; Krasnogor, Natalio

    2010-03-01

    This paper proposes a new methodology for the automated design of cell models for systems and synthetic biology. Our modelling framework is based on P systems, a discrete, stochastic and modular formal modelling language. The automated design of biological models comprising the optimization of the model structure and its stochastic kinetic constants is performed using an evolutionary algorithm. The evolutionary algorithm evolves model structures by combining different modules taken from a predefined module library and then it fine-tunes the associated stochastic kinetic constants. We investigate four alternative objective functions for the fitness calculation within the evolutionary algorithm: (1) equally weighted sum method, (2) normalization method, (3) randomly weighted sum method, and (4) equally weighted product method. The effectiveness of the methodology is tested on four case studies of increasing complexity including negative and positive autoregulation as well as two gene networks implementing a pulse generator and a bandwidth detector. We provide a systematic analysis of the evolutionary algorithm's results as well as of the resulting evolved cell models.
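
    The four objective-aggregation schemes named in the abstract can be sketched as follows; the objective values, ranges, and the particular normalization used are illustrative assumptions, since the abstract does not specify them.

```python
# Sketch of the four multi-objective aggregation schemes named in the abstract
# (equally weighted sum, normalization, randomly weighted sum, equally weighted
# product). Objective values here are hypothetical model-fit errors to minimize.
import random
import math

def equally_weighted_sum(objs):
    return sum(objs) / len(objs)

def normalized_sum(objs, ranges):
    # One plausible normalization: rescale each objective by its observed range
    # before summing (the paper's exact scheme is not given in the abstract).
    return sum(o / r for o, r in zip(objs, ranges)) / len(objs)

def randomly_weighted_sum(objs, rng=random):
    w = [rng.random() for _ in objs]
    s = sum(w)
    return sum(wi / s * o for wi, o in zip(w, objs))

def equally_weighted_product(objs):
    return math.prod(objs) ** (1.0 / len(objs))   # geometric mean keeps the scale comparable

objectives = [0.12, 3.5, 0.8, 1.9]   # e.g. time-series errors of one candidate cell model
ranges = [1.0, 10.0, 2.0, 5.0]       # observed spread of each objective across the population
print(equally_weighted_sum(objectives))
print(normalized_sum(objectives, ranges))
print(randomly_weighted_sum(objectives))
print(equally_weighted_product(objectives))
```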

  15. Automated, computer interpreted radioimmunoassay results

    International Nuclear Information System (INIS)

    Hill, J.C.; Nagle, C.E.; Dworkin, H.J.; Fink-Bennett, D.; Freitas, J.E.; Wetzel, R.; Sawyer, N.; Ferry, D.; Hershberger, D.

    1984-01-01

    90,000 radioimmunoassay results have been interpreted and transcribed automatically using software developed for use on a Hewlett Packard Model 1000 mini-computer system with conventional dot-matrix printers. The computer program correlates the results of a combination of assays, interprets them and prints a report ready for physician review and signature within minutes of completion of the assay. The authors designed and wrote a computer program to query their patient database for radioassay laboratory results and to produce a computer-generated interpretation of these results using an algorithm that produces normal and abnormal interpretives. Their laboratory assays 50,000 patient samples each year using 28 different radioassays. Of these, 85% have been interpreted using the computer program. Allowances are made for drug and patient history, and individualized reports are generated with regard to the patient's age and sex. Finalization of reports is still subject to change by the nuclear physician at the time of final review. Automated, computerized interpretations have realized cost savings through reduced personnel and personnel time and provided uniformity of the interpretations among the five physicians. Prior to computerization of interpretations, all radioassay results had to be dictated and reviewed for signing by one of the resident or staff physicians. Turnaround times for reports prior to the automated computer program were generally two to three days, whereas the computerized interpretive system allows reports to generally be issued the day assays are completed
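
    A toy sketch of the kind of rule-based interpretive logic described (compare each assay result against a reference range adjusted for patient sex and age, then emit normal/abnormal text for physician review) is shown below; the assays and reference ranges are entirely hypothetical.

```python
# Toy sketch of a rule-based interpretive report of the kind described: compare
# each assay against a sex-adjusted reference range and print normal/abnormal
# text pending physician review. Reference ranges here are hypothetical.
REFERENCE_RANGES = {
    # assay: sex -> (low, high); a real system would also branch on age and history
    "TSH (mIU/L)": {"F": (0.4, 4.5), "M": (0.4, 4.5)},
    "T4 (ug/dL)":  {"F": (5.0, 12.0), "M": (4.5, 11.5)},
}

def interpret(results, sex, age):
    lines = [f"Automated interpretation (age {age}, sex {sex}):"]
    for assay, value in results.items():
        low, high = REFERENCE_RANGES[assay][sex]
        if value < low:
            flag = "LOW - abnormal"
        elif value > high:
            flag = "HIGH - abnormal"
        else:
            flag = "within reference range"
        lines.append(f"  {assay}: {value} ({flag}, ref {low}-{high})")
    lines.append("  Pending physician review and signature.")
    return "\n".join(lines)

print(interpret({"TSH (mIU/L)": 6.2, "T4 (ug/dL)": 7.8}, sex="F", age=54))
```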

  16. On the Critical Role of Divergent Selection in Evolvability

    Directory of Open Access Journals (Sweden)

    Joel Lehman

    2016-08-01

    Full Text Available An ambitious goal in evolutionary robotics is to evolve increasingly complex robotic behaviors with minimal human design effort. Reaching this goal requires evolutionary algorithms that can unlock from genetic encodings their latent potential for evolvability. One issue clouding this goal is conceptual confusion about evolvability, which often obscures the aspects of evolvability that are important or desirable. The danger from such confusion is that it may establish unrealistic goals for evolvability that prove unproductive in practice. An important issue separate from conceptual confusion is the common misalignment between selection and evolvability in evolutionary robotics. While more expressive encodings can represent higher-level adaptations (e.g. sexual reproduction or developmental systems) that increase long-term evolutionary potential (i.e. evolvability), realizing such potential requires gradients of fitness and evolvability to align. In other words, selection is often a critical factor limiting increasing evolvability. Thus, drawing from a series of recent papers, this article seeks to both (1) clarify and focus the ways in which the term evolvability is used within artificial evolution, and (2) argue for the importance of one type of selection, i.e. divergent selection, for enabling evolvability. The main argument is that there is a fundamental connection between divergent selection and evolvability (on both the individual and population level) that does not hold for typical goal-oriented selection. The conclusion is that selection pressure plays a critical role in realizing the potential for evolvability, and that divergent selection in particular provides a principled mechanism for encouraging evolvability in artificial evolution.

  17. Interpreting Early Career Trajectories

    Science.gov (United States)

    Barnatt, Joan; Gahlsdorf Terrell, Dianna; D'Souza, Lisa Andries; Jong, Cindy; Cochran-Smith, Marilyn; Viesca, Kara Mitchell; Gleeson, Ann Marie; McQuillan, Patrick; Shakman, Karen

    2017-01-01

    Career decisions of four teachers are explored through the concept of figured worlds in this qualitative, longitudinal case study. Participants were purposefully chosen for similarity at entry, with a range of career trajectories over time. Teacher career paths included remaining in one school, repeated changes in schools, attrition after…

  18. Internship: Interpreting Micropolitical Contexts

    Science.gov (United States)

    Ehrich, Lisa C.; Millwater, Jan

    2011-01-01

    Many university faculties of education across Australia employ a model of internship for final semester pre-service teacher education students to help them make a smooth transition into the teaching profession. While a growing body of research has explored pre-service teachers' experiences of their practicum, including the internship, which is the…

  19. Approximating centrality in evolving graphs: toward sublinearity

    Science.gov (United States)

    Priest, Benjamin W.; Cybenko, George

    2017-05-01

    The identification of important nodes is a ubiquitous problem in the analysis of social networks. Centrality indices (such as degree centrality, closeness centrality, betweenness centrality, PageRank, and others) are used across many domains to accomplish this task. However, the computation of such indices is expensive on large graphs. Moreover, evolving graphs are becoming increasingly important in many applications. It is therefore desirable to develop on-line algorithms that can approximate centrality measures using memory sublinear in the size of the graph. We discuss the challenges facing the semi-streaming computation of many centrality indices. In particular, we apply recent advances in the streaming and sketching literature to provide a preliminary streaming approximation algorithm for degree centrality utilizing CountSketch and a multi-pass semi-streaming approximation algorithm for closeness centrality leveraging a spanner obtained through iteratively sketching the vertex-edge adjacency matrix. We also discuss possible ways forward for approximating betweenness centrality, as well as spectral measures of centrality. We provide a preliminary result using sketched low-rank approximations to approximate the output of the HITS algorithm.
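
    A minimal sketch of the CountSketch approach mentioned for streaming degree centrality is given below: each edge arrival updates signed, hashed counters for both endpoints, and a node's degree is later estimated as the median of its signed counters, keeping memory fixed regardless of graph size. The hash construction and table sizes are illustrative, not those of the paper.

```python
# Minimal sketch of CountSketch-based streaming degree estimation: each edge
# arrival updates signed, hashed counters for both endpoints; a node's degree is
# recovered as the median of its signed counter estimates. Table sizes and
# hashing are illustrative, not the configuration used in the paper.
import hashlib
from statistics import median

class CountSketch:
    def __init__(self, depth=5, width=256):
        self.depth, self.width = depth, width
        self.table = [[0] * width for _ in range(depth)]

    def _bucket_sign(self, row, key):
        h = hashlib.blake2b(f"{row}:{key}".encode(), digest_size=8).digest()
        value = int.from_bytes(h, "big")
        return value % self.width, 1 if (value >> 33) & 1 else -1

    def update(self, key, count=1):
        for row in range(self.depth):
            bucket, sign = self._bucket_sign(row, key)
            self.table[row][bucket] += sign * count

    def estimate(self, key):
        estimates = []
        for row in range(self.depth):
            bucket, sign = self._bucket_sign(row, key)
            estimates.append(sign * self.table[row][bucket])
        return median(estimates)

# Stream of edges from an evolving graph (hypothetical); degree centrality of a
# node is approximated without storing the full adjacency structure.
sketch = CountSketch()
edge_stream = [("a", "b"), ("a", "c"), ("b", "c"), ("a", "d"), ("d", "e")]
for u, v in edge_stream:
    sketch.update(u)
    sketch.update(v)
print("approximate degree of 'a':", sketch.estimate("a"))   # true degree is 3
```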

  20. Extreme insular dwarfism evolved in a mammoth.

    Science.gov (United States)

    Herridge, Victoria L; Lister, Adrian M

    2012-08-22

    The insular dwarfism seen in Pleistocene elephants has come to epitomize the island rule; yet our understanding of this phenomenon is hampered by poor taxonomy. For Mediterranean dwarf elephants, where the most extreme cases of insular dwarfism are observed, a key systematic question remains unresolved: are all taxa phyletic dwarfs of a single mainland species Palaeoloxodon antiquus (straight-tusked elephant), or are some referable to Mammuthus (mammoths)? Ancient DNA and geochronological evidence have been used to support a Mammuthus origin for the Cretan 'Palaeoloxodon' creticus, but these studies have been shown to be flawed. On the basis of existing collections and recent field discoveries, we present new, morphological evidence for the taxonomic status of 'P'. creticus, and show that it is indeed a mammoth, most probably derived from Early Pleistocene Mammuthus meridionalis or possibly Late Pliocene Mammuthus rumanus. We also show that Mammuthus creticus is smaller than other known insular dwarf mammoths, and is similar in size to the smallest dwarf Palaeoloxodon species from Sicily and Malta, making it the smallest mammoth species known to have existed. These findings indicate that extreme insular dwarfism has evolved to a similar degree independently in two elephant lineages.

  1. An evolving network model with modular growth

    International Nuclear Information System (INIS)

    Zou Zhi-Yun; Liu Peng; Lei Li; Gao Jian-Zhi

    2012-01-01

    In this paper, we propose an evolving network model that grows rapidly in units of modules, based on an analysis of the evolution characteristics of real complex networks. Each module is a small-world network containing several interconnected nodes, and nodes between modules are linked by preferential attachment on node degree. We study the modularity measure of the proposed model, which can be adjusted by changing the ratio of the number of inner-module edges to the number of inter-module edges. Using mean-field theory, we derive an analytical expression for the degree distribution, which is verified by a numerical example and indicates that the degree distribution shows characteristics of the small-world network and of the scale-free network distinctly in different segments. The clustering coefficient and the average path length of the network are simulated numerically, indicating that the network shows the small-world property and is affected little by the randomness of the new modules. (interdisciplinary physics and related areas of science and technology)

  2. A local-world evolving hypernetwork model

    International Nuclear Information System (INIS)

    Yang Guang-Yong; Liu Jian-Guo

    2014-01-01

    Complex hypernetworks are ubiquitous in real systems, and it is important to investigate their evolution mechanisms. In this paper, we present a local-world evolving hypernetwork model by taking into account hyperedge growth and local-world hyperedge preferential attachment mechanisms. At each time step, a newly added hyperedge encircles a newly arriving node and a number of nodes from a randomly selected local world. The number of nodes selected from the local world obeys a uniform distribution with mean value m. The analytical and simulation results show that the hyperdegree approximately obeys a power-law form and the exponent of the hyperdegree distribution is γ = 2 + 1/m. Furthermore, we numerically investigate the node degree, hyperedge degree and clustering coefficient, as well as the average distance, and find that the hypernetwork model shares the scale-free and small-world properties, which sheds some light on the evolution mechanisms of real systems. (interdisciplinary physics and related areas of science and technology)

  3. Evolving autonomous learning in cognitive networks.

    Science.gov (United States)

    Sheneman, Leigh; Hintze, Arend

    2017-12-01

    There are two common approaches for optimizing the performance of a machine: genetic algorithms and machine learning. A genetic algorithm is applied over many generations, whereas machine learning works by applying feedback until the system meets a performance threshold. These methods have been combined previously, particularly in artificial neural networks using an external objective feedback mechanism. We adapt this approach to Markov Brains, which are evolvable networks of probabilistic and deterministic logic gates. Prior to this work, Markov Brains could only adapt from one generation to the next, so we introduce feedback gates which augment their ability to learn during their lifetime. We show that Markov Brains can incorporate these feedback gates in such a way that they do not rely on an external objective feedback signal, but instead can generate internal feedback that is then used to learn. This results in a more biologically accurate model of the evolution of learning, which will enable us to study the interplay between evolution and learning and could be another step towards autonomously learning machines.

  4. Orbital Decay in Binaries with Evolved Stars

    Science.gov (United States)

    Sun, Meng; Arras, Phil; Weinberg, Nevin N.; Troup, Nicholas; Majewski, Steven R.

    2018-01-01

    Two mechanisms are often invoked to explain tidal friction in binary systems. The "dynamical tide" is the resonant excitation of internal gravity waves by the tide, and their subsequent damping by nonlinear fluid processes or thermal diffusion. The "equilibrium tide" refers to non-resonant excitation of fluid motion in the star's convection zone, with damping by interaction with the turbulent eddies. There have been numerous studies of these processes in main-sequence stars, but fewer on the subgiant and red giant branches. Motivated by the newly discovered close binary systems in the Apache Point Observatory Galactic Evolution Experiment (APOGEE-1), we have performed calculations of both the dynamical and equilibrium tide processes for stars over a range of masses as they cease core hydrogen burning and evolve to shell burning. Even for stars which had a radiative core on the main sequence, the dynamical tide may have very large amplitude in the newly radiative core in the post-main-sequence phase, giving rise to wave breaking. The resulting large dynamical-tide dissipation rate is compared to the equilibrium tide, and the range of secondary masses and orbital periods over which rapid orbital decay may occur will be discussed, as well as applications to close APOGEE binaries.

  5. Minority games, evolving capitals and replicator dynamics

    International Nuclear Information System (INIS)

    Galla, Tobias; Zhang, Yi-Cheng

    2009-01-01

    We discuss a simple version of the minority game (MG) in which agents hold only one strategy each, but in which their capitals evolve dynamically according to their success and the total trading volume varies in time accordingly. This feature is known to be crucial for MGs to reproduce stylized facts of real market data. The stationary states and phase diagram of the model can be computed, and we show that the ergodicity-breaking phase transition common to MGs, marked by a divergence of the integrated response, is present also in this simplified model. An analogous majority game turns out to be relatively devoid of interesting features, and the total capital is found to diverge in time. Introducing a restraining force leads to a model akin to the replicator dynamics of evolutionary game theory, and we demonstrate that here a different type of phase transition is observed. Finally, we briefly discuss the relation of this model with one strategy per player to more sophisticated minority games with dynamical capitals and several trading strategies per agent.
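
    A toy simulation in the spirit of the model described is sketched below: each agent holds a single fixed strategy over the recent history of winning sides, bids in proportion to its capital, the minority side wins, and capitals grow or shrink with success so that the trading volume varies in time. The specific capital-update rule and parameter values are illustrative assumptions, not the paper's exact dynamics.

```python
# Toy minority-game simulation: one fixed strategy per agent (a lookup table
# over the recent history of winning sides), bids proportional to capital, and
# multiplicative capital updates. Not the paper's exact capital dynamics.
import numpy as np

rng = np.random.default_rng(0)
n_agents, memory, steps, eps = 101, 3, 2000, 0.01

# One strategy per agent: a random map from each of the 2**memory histories to +/-1
strategies = rng.choice([-1, 1], size=(n_agents, 2 ** memory))
capitals = np.ones(n_agents)
history = rng.integers(0, 2 ** memory)           # encoded string of recent winning sides

volumes = []
for _ in range(steps):
    actions = strategies[:, history]             # each agent's +/-1 decision
    bids = capitals * actions                    # bid size proportional to capital
    attendance = bids.sum()
    winning_side = -np.sign(attendance) if attendance != 0 else rng.choice([-1, 1])
    winners = actions == winning_side
    # capitals of agents on the minority side grow, the rest shrink
    capitals *= np.where(winners, 1 + eps, 1 - eps)
    volumes.append(np.abs(bids).sum())           # total trading volume varies in time
    history = ((history << 1) | (winning_side > 0)) % (2 ** memory)

print("final volume:", volumes[-1], "capital spread:", capitals.min(), capitals.max())
```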

  6. An evolving model of online bipartite networks

    Science.gov (United States)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Liu, Chuang

    2013-12-01

    Understanding the structure and evolution of online bipartite networks is a significant task since they play a crucial role in various e-commerce services nowadays. Recently, various attempts have been made to propose different models, resulting in either power-law or exponential degree distributions. However, many empirical results show that the user degree distribution actually follows a shifted power-law distribution, the so-called Mandelbrot's law, which cannot be fully described by previous models. In this paper, we propose an evolving model considering two different user behaviors: random and preferential attachment. Extensive empirical results on two real bipartite networks, Delicious and CiteULike, show that the theoretical model can well characterize the structure of real networks for both user and object degree distributions. In addition, we introduce a structural parameter p to demonstrate that the hybrid user behavior leads to the shifted power-law degree distribution, and that the region of the power-law tail increases with p. The proposed model might shed some light on understanding the underlying laws governing the structure of real online bipartite networks.
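
    The hybrid attachment mechanism can be sketched as follows: each new object links to users either preferentially (with probability p, proportional to user degree) or uniformly at random otherwise, which is the kind of mixture that produces a shifted power-law (Mandelbrot) user degree distribution. The bookkeeping details and parameter values are illustrative, not the paper's exact specification.

```python
# Sketch of a hybrid random/preferential bipartite growth process: each new
# object connects to users preferentially with probability p and uniformly
# otherwise. Parameters and bookkeeping are illustrative assumptions.
import random
from collections import Counter

def grow_bipartite(n_objects=5000, links_per_object=3, p=0.7, seed=1):
    rng = random.Random(seed)
    users = list(range(10))                       # small seed set of users
    user_degree = Counter({u: 1 for u in users})
    degree_list = users[:]                        # multiset: one entry per unit of degree
    for _ in range(n_objects):
        targets = set()
        for _ in range(links_per_object):
            if rng.random() < p:
                targets.add(rng.choice(degree_list))   # preferential: proportional to degree
            elif rng.random() < 0.5:
                new_user = len(users)                  # uniform branch may introduce a new user
                users.append(new_user)
                user_degree[new_user] = 0
                targets.add(new_user)
            else:
                targets.add(rng.choice(users))         # uniform over existing users
        for u in targets:
            user_degree[u] += 1
            degree_list.append(u)
    return user_degree

degrees = grow_bipartite()
# A shifted power law (Mandelbrot's law) would show up as a straightening tail
# in the log-log degree distribution once a constant offset is added to k.
dist = Counter(degrees.values())
for k in sorted(dist)[:10]:
    print(k, dist[k])
```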

  7. The Evolving Classification of Pulmonary Hypertension.

    Science.gov (United States)

    Foshat, Michelle; Boroumand, Nahal

    2017-05-01

    An explosion of information on pulmonary hypertension has occurred during the past few decades. The perception of this disease has shifted from purely clinical to incorporate new knowledge of the underlying pathology. This shift has occurred in light of advancements in pathophysiology, histology, and molecular medical diagnostics. This review aims to update readers on the evolving understanding of the etiology and pathogenesis of pulmonary hypertension and to demonstrate how pathology has shaped the current classification. It draws on information presented at the five World Symposia on pulmonary hypertension held since 1973, the last of which occurred in 2013. Pulmonary hypertension represents a heterogeneous group of disorders that are differentiated based on differences in clinical, hemodynamic, and histopathologic features. Early concepts of pulmonary hypertension were largely influenced by pharmacotherapy, hemodynamic function, and clinical presentation of the disease. The initial nomenclature for pulmonary hypertension segregated the clinical classifications from pathologic subtypes. Major restructuring of this disease classification occurred between the first and second symposia, which was the first to unite clinical and pathologic information in the categorization scheme. Additional changes were introduced in subsequent meetings, particularly between the third and fourth World Symposia, when additional pathophysiologic information was gained. Discoveries in molecular diagnostics significantly progressed the understanding of idiopathic pulmonary arterial hypertension. Continued advancements in imaging modalities, mechanistic pathogenicity, and molecular biomarkers will enable physicians to define pulmonary hypertension phenotypes based on the pathobiology and allow for treatment customization.

  8. Evolving application of biomimetic nanostructured hydroxyapatite

    Directory of Open Access Journals (Sweden)

    Norberto Roveri

    2010-11-01

    Full Text Available Norberto Roveri, Michele Iafisco; Laboratory of Environmental and Biological Structural Chemistry (LEBSC), Dipartimento di Chimica ‘G. Ciamician’, Alma Mater Studiorum, Università di Bologna, Bologna, Italy. Abstract: By mimicking Nature, we can design and synthesize inorganic smart materials that are reactive to biological tissues. These smart materials can be utilized to design innovative third-generation biomaterials, which are able to not only optimize their interaction with biological tissues and environment, but also mimic biogenic materials in their functionalities. The biomedical applications involve increasing the biomimetic levels from chemical composition, structural organization, morphology, mechanical behavior, nanostructure, and bulk and surface chemical–physical properties until the surface becomes bioreactive and stimulates cellular materials. The chemical–physical characteristics of biogenic hydroxyapatites from bone and tooth have been described, in order to point out the elective sides, which are important to reproduce the design of a new biomimetic synthetic hydroxyapatite. This review outlines the evolving applications of biomimetic synthetic calcium phosphates, details the main characteristics of bone and tooth, where the calcium phosphates are present, and discusses the chemical–physical characteristics of biomimetic calcium phosphates, methods of synthesizing them, and some of their biomedical applications. Keywords: hydroxyapatite, nanocrystals, biomimetism, biomaterials, drug delivery, remineralization

  9. Synchrony in diachronic analysis: the interpretation of

    NARCIS (Netherlands)

    Vis, J.; Kitis, E.; Lavidas, N.; Topintzi, N.; Tsangalidis, T.

    2011-01-01

    In historical linguistics, it is very common to interpret the data mainly by means of a diachronic approach. In this article, I will claim that a combination of various linguistic methods, including a synchronic analysis and cross-linguistic parallels, leads to better motivated conclusions. I will

  10. Being Included and Excluded

    DEFF Research Database (Denmark)

    Korzenevica, Marina

    2016-01-01

    Following the civil war of 1996–2006, there was a dramatic increase in the labor mobility of young men and the inclusion of young women in formal education, which led to the transformation of the political landscape of rural Nepal. Mobility and schooling represent a level of prestige that rural...... politics. It analyzes how formal education and mobility either challenge or reinforce traditional gendered norms which dictate a lowly position for young married women in the household and their absence from community politics. The article concludes that women are simultaneously excluded and included from...... community politics. On the one hand, their mobility and decision-making powers decrease with the increase in the labor mobility of men and their newly gained education is politically devalued when compared to the informal education that men gain through mobility, but on the other hand, schooling strengthens...

  11. IMAGE INTERPRETATION OF COASTAL AREAS

    Directory of Open Access Journals (Sweden)

    M. A. Lazaridou

    2012-07-01

    Full Text Available Coasts were formed with the overall shape of the earth's surface. They represent a landform, as defined by the science of geomorphology. Being the boundary between land and sea, they present important features and particularities, such as water currents, waves, winds, estuaries, drainage networks, pollution, etc. Coasts are examined at various levels: continents and oceans, states and large seas, such as the Mediterranean Sea. Greece, because of its horizontal and vertical partitioning, presents a great extent and variety of coasts, including mainland, peninsulas and islands. Depending on the geomorphology, geology, soils, hydrology and land use of the inland and of the coasts themselves, these are very diverse. Photogrammetry and Remote Sensing (as defined by Statute II of ISPRS) is the art, science, and technology of obtaining reliable information from non-contact imaging and other sensor systems about the Earth and its environment, and other physical objects and processes, through recording, measuring, analyzing and representation. This paper concerns critical considerations on the above. It also includes the case of the Thessaloniki coasts in Greece, particularly river estuary areas (river deltas). The study of the coastal areas in the wide surroundings of Thessaloniki city includes visual image interpretation and digital image processing techniques on satellite data of high spatial resolution.

  12. Perceptual basis of evolving Western musical styles.

    Science.gov (United States)

    Rodriguez Zivic, Pablo H; Shifres, Favio; Cecchi, Guillermo A

    2013-06-11

    The brain processes temporal statistics to predict future events and to categorize perceptual objects. These statistics, called expectancies, are found in music perception, and they span a variety of different features and time scales. Specifically, there is evidence that music perception involves strong expectancies regarding the distribution of a melodic interval, namely, the distance between two consecutive notes within the context of another. The recent availability of a large Western music dataset, consisting of the historical record condensed as melodic interval counts, has opened new possibilities for data-driven analysis of musical perception. In this context, we present an analytical approach that, based on cognitive theories of music expectation and machine learning techniques, recovers a set of factors that accurately identifies historical trends and stylistic transitions between the Baroque, Classical, Romantic, and Post-Romantic periods. We also offer a plausible musicological and cognitive interpretation of these factors, allowing us to propose them as data-driven principles of melodic expectation.

  13. Can Venus magnetosheath plasma evolve into turbulence?

    Science.gov (United States)

    Dwivedi, Navin; Schmid, Daniel; Narita, Yasuhito; Volwerk, Martin; Delva, Magda; Voros, Zoltan; Zhang, Tielong

    2014-05-01

    The present work aims to understand turbulence properties in planetary magnetosheath regions to obtain physical insight into the energy transfer from larger to smaller scales, in the spirit of searching for power-law behaviors in the spectra, which are an indication of the energy cascade and wave-wave interaction. We perform a statistical analysis of energy spectra using Venus Express spacecraft data in the Venusian magnetosheath. The fluxgate magnetometer data (VEXMAG) calibrated down to 1 Hz, as well as plasma data from the ion mass analyzer (ASPERA) aboard the spacecraft, are used for the years 2006-2009. Ten-minute intervals in the magnetosheath are selected, which is a typical time length for observations of quasi-stationary fluctuations that avoids multiple boundary crossings. The magnetic field data are transformed into the mean-field-aligned (MFA) coordinate system with respect to the large-scale magnetic field direction, and the energy spectra are evaluated using a Welch algorithm in the frequency range between 0.008 Hz and 0.5 Hz for 105 time intervals. The averaged energy spectra show a power law up to 0.3 Hz with an approximate slope of -1, which is flatter than the Kolmogorov slope of -5/3. A slight hump in the spectra is found in the compressive component near 0.3 Hz, which could possibly be a realization of mirror modes in the magnetosheath. A spectral break (sudden change in slope) accompanies the spectral hump at 0.4 Hz, above which the spectral curve becomes steeper. The overall spectral shape is reminiscent of turbulence. The low-frequency part with the slope -1 is interpreted as a realization of the energy-containing range, while the high-frequency part with the steepening is interpreted either as the beginning of the energy cascade mediated by mirror modes or as the dissipation range due to wave-particle resonance processes. The present research work is fully supported by FP7/STORM (313038).
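
    The spectral estimation step described can be sketched as follows: a Welch power spectral density of a 1 Hz field component over a ten-minute interval, with a power-law slope fitted in the 0.008-0.3 Hz band. The signal below is synthetic noise, and the mean-field-aligned rotation and interval selection used in the study are omitted.

```python
# Minimal sketch of the spectral analysis described: a Welch power spectral
# density estimate of a 1 Hz magnetic-field component over a ten-minute
# interval, with a rough power-law slope fit between 0.008 and 0.3 Hz.
# The signal here is synthetic noise, not Venus Express data.
import numpy as np
from scipy.signal import welch

fs = 1.0                                    # 1 Hz calibrated fluxgate data
t = np.arange(600)                          # one ten-minute interval
rng = np.random.default_rng(42)
b = np.cumsum(rng.standard_normal(t.size))  # random-walk proxy for a field component

freq, psd = welch(b, fs=fs, nperseg=256)

# Fit a power-law slope in the analysis band (0.008-0.3 Hz)
band = (freq >= 0.008) & (freq <= 0.3)
slope, _ = np.polyfit(np.log10(freq[band]), np.log10(psd[band]), 1)
print(f"spectral slope in band: {slope:.2f}")   # a slope near -1 would match the reported spectra
```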

  14. Nuclear arbitration: Interpreting non-proliferation agreements

    International Nuclear Information System (INIS)

    Tzeng, Peter

    2015-01-01

    At the core of the nuclear non-proliferation regime lie international agreements. These agreements include, inter alia, the Nuclear Non-proliferation Treaty, nuclear co-operation agreements and nuclear export control agreements.1 States, however, do not always comply with their obligations under these agreements. In response, commentators have proposed various enforcement mechanisms to promote compliance. The inconvenient truth, however, is that states are generally unwilling to consent to enforcement mechanisms concerning issues as critical to national security as nuclear non-proliferation.3 This article suggests an alternative solution to the non-compliance problem: interpretation mechanisms. Although an interpretation mechanism does not have the teeth of an enforcement mechanism, it can induce compliance by providing an authoritative interpretation of a legal obligation. Interpretation mechanisms would help solve the non-compliance problem because, as this article shows, in many cases of alleged non-compliance with a non-proliferation agreement, the fundamental problem has been the lack of an authoritative interpretation of the agreement, not the lack of an enforcement mechanism. Specifically, this article proposes arbitration as the proper interpretation mechanism for non-proliferation agreements. It advocates the establishment of a 'Nuclear Arbitration Centre' as an independent branch of the International Atomic Energy Agency (IAEA), and recommends the gradual introduction of arbitration clauses into the texts of non-proliferation agreements. Section I begins with a discussion of international agreements in general and the importance of interpretation and enforcement mechanisms. Section II then discusses nuclear non-proliferation agreements and their lack of interpretation and enforcement mechanisms. Section III examines seven case studies of alleged non-compliance with non-proliferation agreements in order to show that the main problem in many cases

  15. Interpretation of Written Contracts in England

    Directory of Open Access Journals (Sweden)

    Neil Andrews

    2014-01-01

    Full Text Available This article examines the leading principles governing interpretation of written contracts under English law. This is a comprehensive and incisive analysis of the current law and of the relevant doctrines, including the equitable principles of rectification, as well as the powers of appeal courts or of the High Court when hearing an appeal from an arbitral award. The topic of interpretation of written contracts is fast-moving. It is of fundamental importance because this is the most significant commercial focus for dispute and because of the number of cross-border transactions to which English law is expressly applied by businesses.

  16. Tax Treaty Interpretation in Spain

    OpenAIRE

    Soler Roch, María Teresa; Ribes Ribes, Aurora

    2001-01-01

    This paper provides insight in the interpretation of Spanish double taxation conventions. Taking as a premise the Vienna Convention on the Law of Treaties and the wording of Article 3(2) OECD Model Convention, the authors explore the relevance of mutual agreements, tax authority practice and foreign court decisions on the tax treaty interpretation.

  17. Pragmatics in Court Interpreting: Additions

    DEFF Research Database (Denmark)

    Jacobsen, Bente

    2003-01-01

    Danish court interpreters are expected to follow ethical guidelines, which instruct them to deliver exact verbatim versions of source texts. However, this requirement often clashes with the reality of the interpreting situation in the courtroom. This paper presents and discusses the findings of a...

  18. Intercultural pragmatics and court interpreting

    DEFF Research Database (Denmark)

    Jacobsen, Bente

    2008-01-01

      This paper reports on an on-going investigation of conversational implicature in triadic speech events: Interpreter-mediated questionings in criminal proceedings in Danish district courts. The languages involved are Danish and English, and the mode of interpreting is the consecutive mode. The c...

  19. Interpreting Recoil for Undergraduate Students

    Science.gov (United States)

    Elsayed, Tarek A.

    2012-01-01

    The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is…

  20. Quantum mechanics in an evolving Hilbert space

    Science.gov (United States)

    Artacho, Emilio; O'Regan, David D.

    2017-03-01

    Many basis sets for electronic structure calculations evolve with varying external parameters, such as moving atoms in dynamic simulations, giving rise to extra derivative terms in the dynamical equations. Here we revisit these derivatives in the context of differential geometry, thereby obtaining a more transparent formalization, and a geometrical perspective for better understanding the resulting equations. The effect of the evolution of the basis set within the spanned Hilbert space separates explicitly from the effect of the turning of the space itself when moving in parameter space, as the tangent space turns when moving in a curved space. New insights are obtained using familiar concepts in that context such as the Riemann curvature. The differential geometry is not strictly that for curved spaces as in general relativity, a more adequate mathematical framework being provided by fiber bundles. The language used here, however, will be restricted to tensors and basic quantum mechanics. The local gauge implied by a smoothly varying basis set readily connects with Berry's formalism for geometric phases. Generalized expressions for the Berry connection and curvature are obtained for a parameter-dependent occupied Hilbert space spanned by nonorthogonal Wannier functions. The formalism is applicable to basis sets made of atomic-like orbitals and also more adaptative moving basis functions (such as in methods using Wannier functions as intermediate or support bases), but should also apply to other situations in which nonorthogonal functions or related projectors should arise. The formalism is applied to the time-dependent quantum evolution of electrons for moving atoms. The geometric insights provided here allow us to propose new finite-difference time integrators, and also better understand those already proposed.
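
    For reference, the standard Berry connection and geometric phase for an orthonormal, parameter-dependent state are given below; the paper generalizes these to an occupied subspace spanned by nonorthogonal Wannier functions, and those generalized expressions are not reproduced here.

```latex
% Berry connection A_\mu and geometric phase \gamma for a normalized state
% |\psi(\lambda)\rangle depending smoothly on parameters \lambda^\mu.
A_\mu(\lambda) = i\,\langle \psi(\lambda) \mid \partial_{\lambda^\mu} \psi(\lambda) \rangle ,
\qquad
\gamma = \oint_{\mathcal{C}} A_\mu(\lambda)\, \mathrm{d}\lambda^\mu
```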

  1. Public participation at Fernald: FERMCO's evolving role

    International Nuclear Information System (INIS)

    Williams, J.B.; Fellman, R.W.; Brettschneider, D.J.

    1995-01-01

    In an effort to improve public involvement in the site restoration decision-making process, the DOE has established site-specific advisory boards, of which the Fernald Citizens Task Force is one. The Fernald Task Force is focused on making recommendations in four areas: (1) What should be the future use of the site? (2) Determination of cleanup levels (how clean is clean?) (3) Where should the wastes be disposed of? (4) What should be the cleanup priorities? Because these questions are being asked very early in the decision-making process, the answers are necessarily qualified, and are based on a combination of preliminary data, assumptions, and professional judgment. The requirement to make progress in the absence of accurate data has required FERMCO and the Task Force to employ an approach similar to sensitivity analysis, in which a range of possible data values are evaluated and the relative importance of the various factors is assessed. Because of its charter to provide recommendations on future site use, the Task Force has developed a sitewide perspective, compared to the more common operable-unit-specific focus of public participation under CERCLA. The relationship between FERMCO and the Task Force is evolving toward one of partnership with DOE in managing the obstacles and hidden opportunities for success. The Task Force likely will continue to participate in the Fernald project long after its initial recommendations have been made. DOE already has made the commitment that the process of public participation will extend into the Remedial Design phase. There is substantial reason for optimism that continuing the Task Force process through the design phase will assist in developing the appropriate balance of cost and engineered protectiveness.

  2. A consistent interpretation of quantum mechanics

    International Nuclear Information System (INIS)

    Omnes, Roland

    1990-01-01

    Some mostly recent theoretical and mathematical advances can be linked together to yield a new consistent interpretation of quantum mechanics. It relies upon a unique and universal interpretative rule of a logical character which is based upon Griffiths consistent history. Some new results in semi-classical physics allow classical physics to be derived from this rule, including its logical aspects, and to prove accordingly the existence of determinism within the quantum framework. Together with decoherence, this can be used to retrieve the existence of facts, despite the probabilistic character of the theory. Measurement theory can then be made entirely deductive. It is accordingly found that wave packet reduction is a logical property, whereas one can always choose to avoid using it. The practical consequences of this interpretation are most often in agreement with the Copenhagen formulation but they can be proved never to give rise to any logical inconsistency or paradox. (author)

  3. DIFFICULTY OF AMENDMENT AND INTERPRETATIVE CHOICE

    Directory of Open Access Journals (Sweden)

    Andrew Coan

    2016-01-01

    Full Text Available The extreme difficulty of amending the U.S. Constitution plays a central but largely unexamined role in theoretical debates over interpretive choice. In particular, conventional wisdom assumes that the extreme difficulty of Article V amendment weakens the case for originalism. This view might ultimately be correct, but it is not the freestanding argument against originalism it is often presumed to be. Rather, it depends on contestable normative and empirical premises that require defense. If those premises are wrong, the stringency of Article V might actually strengthen the case for originalism. Or Article V might have no impact on that case one way or another. This “complexity thesis” highlights and clarifies the role that difficulty of amendment plays across a range of significant interpretive debates, including those surrounding writtenness, John Hart Ely’s representation-reinforcement theory, interpretive pluralism, and originalism as a theory of positive law. It also has important implications for the under-studied relations between statutory and constitutional interpretation and federal and state constitutional interpretation.

  4. Do Interpreters Indeed Have Superior Working Memory in Interpreting

    Institute of Scientific and Technical Information of China (English)

    于飞

    2012-01-01

    With frequent communication between China and western countries in the fields of economy, politics and culture, interpreting is becoming more and more important to people in all walks of life. This paper aims to test the author's hypothesis that "professional interpreters have similar short-term memory to unprofessional interpreters, but they have superior working memory." After a review of the literature on consecutive interpreting, short-term memory and working memory, the experiments are designed and the analyses described.

  5. Alcohol use and policy formation: an evolving social problem.

    Science.gov (United States)

    Levine, Amir

    2012-01-01

    This article explores the evolutionary course that the social problem of alcohol use has taken in the United States since the Colonial Era. This article utilizes a range of theoretical models to analyze the evolving nature of alcohol use from an unrecognized to a perceived social problem. The models used include critical constructionism (Heiner, 2002), the top-down policy model (Dye, 2001) and Mauss' (1975) understanding of social problems and movements. These theoretical constructs exhibit the relative nature of alcohol use as a social problem with regard to a specific time, place, and social context, as well as the powerful and influential role that social elites have in defining alcohol as a social problem. Studies regarding the development of alcohol policy formation are discussed to illuminate the different powers, constituents, and factors that play a role in alcohol policy formation. Finally, implications for future study are discussed [corrected].

  6. Insect sex determination: it all evolves around transformer.

    Science.gov (United States)

    Verhulst, Eveline C; van de Zande, Louis; Beukeboom, Leo W

    2010-08-01

    Insects exhibit a variety of sex determining mechanisms including male or female heterogamety and haplodiploidy. The primary signal that starts sex determination is processed by a cascade of genes ending with the conserved switch doublesex that controls sexual differentiation. Transformer is the doublesex splicing regulator and has been found in all examined insects, indicating its ancestral function as a sex-determining gene. Despite this conserved function, the variation in transformer nucleotide sequence, amino acid composition and protein structure can accommodate a multitude of upstream sex determining signals. Transformer regulation of doublesex and its taxonomic distribution indicate that the doublesex-transformer axis is conserved among all insects and that transformer is the key gene around which variation in sex determining mechanisms has evolved.

  7. Concurrent approach for evolving compact decision rule sets

    Science.gov (United States)

    Marmelstein, Robert E.; Hammack, Lonnie P.; Lamont, Gary B.

    1999-02-01

    The induction of decision rules from data is important to many disciplines, including artificial intelligence and pattern recognition. To improve the state of the art in this area, we introduced the genetic rule and classifier construction environment (GRaCCE). It was previously shown that GRaCCE consistently evolved decision rule sets from data, which were significantly more compact than those produced by other methods (such as decision tree algorithms). The primary disadvantage of GRaCCE, however, is its relatively poor run-time execution performance. In this paper, a concurrent version of the GRaCCE architecture is introduced, which improves the efficiency of the original algorithm. A prototype of the algorithm is tested on an in-house parallel processor configuration and the results are discussed.
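
    GRaCCE itself is not specified in this record, so the following is only a hypothetical, generic sketch of how a genetic algorithm can evolve compact IF-THEN rules (here, axis-aligned intervals) from labeled data; the toy data, rule encoding, fitness function and operators are illustrative assumptions, not GRaCCE's.

        import random

        def make_point():
            # Toy labeled data: class 1 inside a hidden rectangle (illustrative only).
            x1, x2 = random.random(), random.random()
            return (x1, x2), 1 if (x1 > 0.6 and x2 < 0.7) else 0

        DATA = [make_point() for _ in range(200)]

        def random_rule():
            # A rule is one interval per feature; points inside every interval -> class 1.
            return [sorted((random.random(), random.random())) for _ in range(2)]

        def covers(rule, point):
            return all(lo <= x <= hi for (lo, hi), x in zip(rule, point))

        def fitness(rule):
            # Reward accuracy, penalize wide (non-compact) intervals.
            correct = sum((1 if covers(rule, p) else 0) == label for p, label in DATA)
            width = sum(hi - lo for lo, hi in rule)
            return correct / len(DATA) - 0.1 * width

        def mutate(rule):
            child = [list(interval) for interval in rule]
            i, j = random.randrange(2), random.randrange(2)
            child[i][j] = min(1.0, max(0.0, child[i][j] + random.gauss(0, 0.1)))
            child[i].sort()
            return child

        population = [random_rule() for _ in range(50)]
        for generation in range(100):
            population.sort(key=fitness, reverse=True)
            survivors = population[:10]  # elitist selection
            population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

        best = max(population, key=fitness)
        print("best rule:", best, "fitness:", round(fitness(best), 3))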

  8. The evolving history of influenza viruses and influenza vaccines.

    Science.gov (United States)

    Hannoun, Claude

    2013-09-01

    The isolation of influenza virus 80 years ago in 1933 very quickly led to the development of the first generation of live-attenuated vaccines. The first inactivated influenza vaccine was monovalent (influenza A). In 1942, a bivalent vaccine was produced after the discovery of influenza B. It was later discovered that influenza viruses mutated leading to antigenic changes. Since 1973, the WHO has issued annual recommendations for the composition of the influenza vaccine based on results from surveillance systems that identify currently circulating strains. In 1978, the first trivalent vaccine included two influenza A strains and one influenza B strain. Currently, there are two influenza B lineages circulating; in the latest WHO recommendations, it is suggested that a second B strain could be added to give a quadrivalent vaccine. The history of influenza vaccine and the associated technology shows how the vaccine has evolved to match the evolution of influenza viruses.

  9. Evolving Our Evaluation of Lighting Environments Project

    Science.gov (United States)

    Terrier, Douglas; Clayton, Ronald; Clark, Toni Anne

    2016-01-01

    Imagine you are an astronaut on the 100th day of your three-year exploration mission. During your daily routine in the small hygiene compartment of the spacecraft, you realize that no matter what you do, your body blocks the light from the lamp. You can clearly see your hands or your toes but not both! What were those design engineers thinking! It would have been nice if they could have made the walls glow instead! The reason the designers were not more innovative is that their interpretation of the system lighting requirements didn't allow them to be so! Currently, our interior spacecraft lighting standards and requirements are written around the concept of a quantity of light illuminating a spacecraft surface. The natural interpretation for the engineer is that a lamp that throws light onto the surface is required. Because of certification costs, only one lamp is designed, and small rooms can wind up with lamps that may be inappropriate for the room architecture. The advances in solid state light emitting technologies and optics for lighting and visual communication necessitate the evaluation of how NASA envisions spacecraft lighting architectures and how NASA uses industry standards for the design and evaluation of lighting systems. Current NASA lighting standards and requirements for existing architectures focus on the separate ability of a lighting system to throw light against a surface or the ability of a display system to provide the appropriate visual contrast. The possibility of integrating these systems is not recognized. The result is that the systems are developed independently of one another, and potential efficiencies that could be gained by borrowing a concept from one technology and applying it for the purpose of the other are not realized. This project investigated the possibility of incorporating large luminous surface lamps as an alternative or supplement to overhead lighting. We identified existing industry standards for architectural

  10. Interpretations of virtual reality.

    Science.gov (United States)

    Voiskounsky, Alexander

    2011-01-01

    University students were surveyed to learn what they know about virtual realities. Two studies were administered at a half-year interval, in which the students (N=90, specializing either in mathematics and science, or in social science and humanities) were asked to name particular examples of virtual realities. The second study, but not the first, was administered after the participants had had the chance to see the movie "Avatar" (no investigation was made into whether they actually saw it). While the students in both studies widely believed that activities such as social networking and online gaming represent virtual realities, some other examples provided by the students in the two studies differed: in the second study the participants expressed a better understanding of the items related to virtual realities. At the same time, not a single participant reported particular psychological states (either regular or altered) as examples of virtual realities. Considerable popularization efforts are needed to acquaint the public, including college students, with virtual realities and to let the public adequately understand how such systems work.

  11. Mechanics of evolving thin film structures

    Science.gov (United States)

    Liang, Jim

    In the Stranski-Krastanov system, the lattice mismatch between the film and the substrate causes the film to break into islands. During annealing, both the surface energy and the elastic energy drive the islands to coarsen. Motivated by several related studies, we suggest that stable islands should form when a stiff ceiling is placed at a small gap above the film. We show that the role of elasticity is reversed: with the ceiling, the total elastic energy stored in the system increases as the islands coarsen laterally. Consequently, the islands select an equilibrium size to minimize the combined elastic energy and surface energy. In lithographically-induced self-assembly, when a two-phase fluid confined between parallel substrates is subjected to an electric field, one phase can self-assemble into a triangular lattice of islands in another phase. We describe a theory of the stability of the island lattice. The islands select the equilibrium diameter to minimize the combined interface energy and electrostatic energy. Furthermore, we study compressed SiGe thin film islands fabricated on a glass layer, which itself lies on a silicon wafer. Upon annealing, the glass flows, and the islands relax. A small island relaxes by in-plane expansion. A large island, however, wrinkles at the center before the in-plane relaxation arrives. The wrinkles may cause significant tensile stress in the island, leading to fracture. We model the island by the von Karman plate theory and the glass layer by the Reynolds lubrication theory. Numerical simulations evolve the in-plane expansion and the wrinkles simultaneously. We determine the critical island size, below which in-plane expansion prevails over wrinkling. Finally, in devices that integrate dissimilar materials in small dimensions, crack extension in one material often accompanies inelastic deformation in another. We analyze a channel crack advancing in an elastic film under tension, while an underlayer creeps. We use a two

  12. Abstract Interpretation and Attribute Grammars

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    The objective of this thesis is to explore the connections between abstract interpretation and attribute grammars as frameworks in program analysis. Abstract interpretation is a semantics-based program analysis method. A large class of data flow analysis problems can be expressed as non-standard ...... is presented in the thesis. Methods from abstract interpretation can also be used in correctness proofs of attribute grammars. This proof technique introduces a new class of attribute grammars based on domain theory. This method is illustrated with examples....
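
    As a small, self-contained illustration of the style of analysis discussed here (a textbook example, not taken from the thesis), the sketch below abstractly interprets arithmetic expression trees over a sign domain {neg, zero, pos, top} instead of the concrete integers, proving sign properties without running the program on concrete values.

        # Abstract domain for a sign analysis: an abstraction of the integers.
        NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

        def alpha(n):
            """Abstraction function: map a concrete integer to its sign."""
            return ZERO if n == 0 else (POS if n > 0 else NEG)

        def abs_add(a, b):
            if ZERO in (a, b):
                return b if a == ZERO else a
            return a if a == b else TOP  # e.g. pos + neg is unknown

        def abs_mul(a, b):
            if ZERO in (a, b):
                return ZERO
            if TOP in (a, b):
                return TOP
            return POS if a == b else NEG

        def abs_eval(expr, env):
            """Abstractly interpret an expression tree over the sign domain."""
            kind = expr[0]
            if kind == "const":
                return alpha(expr[1])
            if kind == "var":
                return env[expr[1]]
            left, right = abs_eval(expr[1], env), abs_eval(expr[2], env)
            return abs_add(left, right) if kind == "+" else abs_mul(left, right)

        # With x known positive and y known negative, the analysis proves x*y + 0 is negative.
        expr = ("+", ("*", ("var", "x"), ("var", "y")), ("const", 0))
        print(abs_eval(expr, {"x": POS, "y": NEG}))  # -> "neg"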

  13. Evolving the theory and praxis of knowledge translation through social interaction: a social phenomenological study

    Directory of Open Access Journals (Sweden)

    Forbes Dorothy

    2009-05-01

    Full Text Available Abstract Background As an inherently human process fraught with subjectivity, dynamic interaction, and change, social interaction knowledge translation (KT invites implementation scientists to explore what might be learned from adopting the academic tradition of social constructivism and an interpretive research approach. This paper presents phenomenological investigation of the second cycle of a participatory action KT intervention in the home care sector to answer the question: What is the nature of the process of implementing KT through social interaction? Methods Social phenomenology was selected to capture how the social processes of the KT intervention were experienced, with the aim of representing these as typical socially-constituted patterns. Participants (n = 203, including service providers, case managers, administrators, and researchers organized into nine geographically-determined multi-disciplinary action groups, purposefully selected and audiotaped three meetings per group to capture their enactment of the KT process at early, middle, and end-of-cycle timeframes. Data, comprised of 36 hours of transcribed audiotapes augmented by researchers' field notes, were analyzed using social phenomenology strategies and authenticated through member checking and peer review. Results Four patterns of social interaction representing organization, team, and individual interests were identified: overcoming barriers and optimizing facilitators; integrating 'science push' and 'demand pull' approaches within the social interaction process; synthesizing the research evidence with tacit professional craft and experiential knowledge; and integrating knowledge creation, transfer, and uptake throughout everyday work. Achieved through relational transformative leadership constituted simultaneously by both structure and agency, in keeping with social phenomenology analysis approaches, these four patterns are represented holistically in a typical

  14. Competency in ECG Interpretation Among Medical Students

    Science.gov (United States)

    Kopeć, Grzegorz; Magoń, Wojciech; Hołda, Mateusz; Podolec, Piotr

    2015-01-01

    Background Electrocardiogram (ECG) is commonly used in diagnosis of heart diseases, including many life-threatening disorders. We aimed to assess skills in ECG interpretation among Polish medical students and to analyze the determinants of these skills. Material/Methods Undergraduates from all Polish medical schools were asked to complete a web-based survey containing 18 ECG strips. Questions concerned primary ECG parameters (rate, rhythm, and axis), emergencies, and common ECG abnormalities. Analysis was restricted to students in their clinical years (4th–6th), and students in their preclinical years (1st–3rd) were used as controls. Results We enrolled 536 medical students (females: n=299; 55.8%), aged 19 to 31 (23±1.6) years, from all Polish medical schools. Most (72%) were in their clinical years. The overall rate of good responses was higher in students in their clinical years than in those in years 1st–3rd (66% vs. 56%). The rate of good responses was also higher in students who reported ECG self-learning (69% vs. 62%) but not in those who attended regular ECG classes (66% vs. 66%; p=0.99); on multivariable analysis, self-learning remained a determinant of skill in ECG interpretation. Conclusions Polish medical students in their clinical years have a good level of competency in interpreting the primary ECG parameters, but their ability to recognize ECG signs of emergencies and common heart abnormalities is low. ECG interpretation skills are determined by self-education but not by attendance at regular ECG classes. Our results indicate qualitative and quantitative deficiencies in teaching ECG interpretation at medical schools. PMID:26541993

  15. Andries van Aarde's Matthew Interpretation

    African Journals Online (AJOL)

    Test

    2011-01-14

    Jan 14, 2011 ... Secondly, it is an individual and independent interpretation of the Matthean .... specific social context is emphasised: certain events in the early church ...... Moses-theology, a Covenant-theology or any other exclusive theology.

  16. Dialectica Interpretation with Marked Counterexamples

    Directory of Open Access Journals (Sweden)

    Trifon Trifonov

    2011-01-01

    Full Text Available Goedel's functional "Dialectica" interpretation can be used to extract functional programs from non-constructive proofs in arithmetic by employing two sorts of higher-order witnessing terms: positive realisers and negative counterexamples. In the original interpretation decidability of atoms is required to compute the correct counterexample from a set of candidates. When combined with recursion, this choice needs to be made for every step in the extracted program, however, in some special cases the decision on negative witnesses can be calculated only once. We present a variant of the interpretation in which the time complexity of extracted programs can be improved by marking the chosen witness and thus avoiding recomputation. The achieved effect is similar to using an abortive control operator to interpret computational content of non-constructive principles.

  17. Federal Aviation Administration Legal Interpretations

    Data.gov (United States)

    Department of Transportation — Legal Interpretations and the Chief Counsel's opinions are now available at this site. You may choose to search by year or by text. Please note that not all...

  18. Interpreting Sustainability for Urban Forests

    Directory of Open Access Journals (Sweden)

    Camilo Ordóñez

    2010-06-01

    Full Text Available Incisive interpretations of urban-forest sustainability are important in furthering our understanding of how to sustain the myriad values associated with urban forests. Our analysis of earlier interpretations reveals conceptual gaps. These interpretations are attached to restrictive definitions of a sustainable urban forest and limited to a rather mechanical view of maintaining the biophysical structure of trees. The probing of three conceptual domains (urban forest concepts, sustainable development, and sustainable forest management leads to a broader interpretation of urban-forest sustainability as the process of sustaining urban forest values through time and across space. We propose that values—and not services, benefits, functions or goods—is a superior concept to refer to what is to be sustained in and by an urban forest.

  19. Tarague Interpretive Trail Mitigation Plan

    National Research Council Canada - National Science Library

    Welch, David

    2001-01-01

    ...), International Archaeological Research Institute, Inc. (IARII) has prepared a mitigation plan for development of an interpretive trail at Tarague Beach, located on the north coast of the island of Guam (Fig. 1...

  20. Risks of cardiovascular diseases evolvement and occupational stress

    Directory of Open Access Journals (Sweden)

    Z.F. Gimaeva

    2017-03-01

    Full Text Available Our aim was to study how significant psychosocial factors are in the development of occupational stress and cardiovascular diseases in workers employed in petrochemical production, and to work out a set of preventive measures. Our hygienic and social-psychological research enabled us to detect the factors causing stress in workers employed in petrochemical production: chemical exposure, noise, an unfavorable microclimate, and labor hardness and intensity. A high level of risk to their own lives and responsibility for the safety of others, as well as work under time pressure with increased responsibility for the final results, were the most significant psychosocial factors for these workers. In the course of questioning, we found that 74% of machine operators, 63% of instrument technicians working with controllers and automatic devices, and 57% of repairmen reported stress at work, and 38% of workers subjectively rated their professional activity as having a markedly "stressful" character. The questioning also revealed that 48% of workers across occupations had elevated scores on the anxiety scale (HADS), and 23% had elevated scores on the depression scale (HADS). Primary hypertension was the most widespread nosologic form among chronic non-infectious diseases; it was found in 46.1% of operators and in 45.2% of repairmen servicing processing stations. For 30.1% of instrument technicians working with controllers and automatic devices, primary hypertension showed an average degree of occupational causation by production factors. We detected a direct relation between hyperlipidemia and both age and length of service. We laid the foundation for preventive measures and worked out a program aimed at increasing resistance to stress at the corporate and individual levels; it should provide a significant social effect and, in the longer term, an economic one. To overcome social stress it is necessary to create safe working conditions at workplaces and to increase labor motivation

  1. Co-evolving prisoner's dilemma: Performance indicators and analytic approaches

    Science.gov (United States)

    Zhang, W.; Choi, C. W.; Li, Y. S.; Xu, C.; Hui, P. M.

    2017-02-01

    Understanding the intrinsic relation between the dynamical processes in a co-evolving network and the necessary ingredients in formulating a reliable theory is an important question and a challenging task. Using two slightly different definitions of the performance indicator in the context of a co-evolving prisoner's dilemma game, it is shown that very different cooperative levels result and theories of different complexity are required to understand the key features. When the payoff per opponent is used as the indicator (Case A), the non-cooperative strategy has an edge and dominates in a large part of the parameter space formed by the cutting-and-rewiring probability and the strategy imitation probability. When the payoff from all opponents is used (Case B), the cooperative strategy has an edge and dominates the parameter space. Two distinct phases, one homogeneous and dynamical and another inhomogeneous and static, emerge and the phase boundary in the parameter space is studied in detail. A simple theory assuming an average competing environment for cooperative agents and another for non-cooperative agents is shown to perform well in Case A. The same theory, however, fails badly for Case B. It is necessary to include more spatial correlation into a theory for Case B. We show that the local configuration approximation, which takes into account the different competing environments for agents with different strategies and degrees, is needed to give reliable results for Case B. The results illustrate that formulating a proper theory requires both a conceptual understanding of the effects of the adaptive processes in the problem and a delicate balance between simplicity and accuracy.
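
    The difference between the two indicators can be made concrete with a minimal sketch (not the authors' code; the payoff values are a standard prisoner's dilemma parametrization chosen for illustration): Case A scores an agent by its payoff per opponent, Case B by its total payoff over all opponents, so a defector exploiting a few cooperators wins under Case A while a well-connected cooperator can win under Case B.

        # Standard PD payoffs (illustrative): R=reward, S=sucker, T=temptation, P=punishment.
        R, S, T, P = 3, 0, 5, 1
        PAYOFF = {("C", "C"): R, ("C", "D"): S, ("D", "C"): T, ("D", "D"): P}

        def indicator(strategy, neighbor_strategies, case):
            """Performance indicator of an agent given its neighbors' strategies."""
            total = sum(PAYOFF[(strategy, s)] for s in neighbor_strategies)
            if case == "A":
                return total / len(neighbor_strategies)  # Case A: payoff per opponent
            return total                                 # Case B: payoff from all opponents

        # A defector with 2 cooperating neighbors vs. a cooperator with 6 cooperating neighbors.
        print(indicator("D", ["C"] * 2, "A"), indicator("C", ["C"] * 6, "A"))  # 5.0 vs 3.0
        print(indicator("D", ["C"] * 2, "B"), indicator("C", ["C"] * 6, "B"))  # 10 vs 18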

  2. Begriffsverwirrung? Interpretation Analyse Bedeutung Applikation

    Directory of Open Access Journals (Sweden)

    Mayr, Jeremia Josef M.

    2017-11-01

    Full Text Available Empirical research on the reception of biblical texts confronts scientific exegesis with valid and challenging requests and demands. The hermeneutic question of the compatibility of interpretations arising from different contexts (e.g. scientific exegesis and ordinary readers' exegesis) plays an important role. Taking these requests seriously by coherently restructuring fundamental and central aspects of the theory of scientific interpretation, the present article attempts to offer a stimulating approach for further investigation.

  3. Court interpreting and pragmatic meaning

    DEFF Research Database (Denmark)

    Jacobsen, Bente

    In Denmark, court interpreters are required to deliver verbatim translations of speakers' originals and to refrain from transferring pragmatic meaning. Yet, as this paper demonstrates, pragmatic meaning is central to courtroom interaction.

  4. Interpretation of macroscopic quantum phenomena

    International Nuclear Information System (INIS)

    Baumann, K.

    1986-01-01

    It is argued that a quantum theory without an observer is required for the interpretation of macroscopic quantum tunnelling. Such a theory is obtained by augmenting QED by the actual electric field in the rest system of the universe. An equation of motion for this field is formulated, from which the correct macroscopic behavior of the universe and the validity of the Born interpretation are derived. Care is taken to use mathematically sound concepts only. (Author)

  5. Interpretation of vector magnetograph data including magneto-optic effects. Pt. 1

    International Nuclear Information System (INIS)

    West, E.A.; Hagyard, J.; National Aeronautics and Space Administration, Huntsville, AL

    1983-01-01

    In this paper, the presence of Faraday rotation in measurements of the orientation of a sunspot's transverse magnetic field is investigated. Using observations obtained with the Marshall Space Flight Center's (MSFC) vector magnetograph, the derived vector magnetic field of a simple, symmetric sunspot is used to calculate the degree of Faraday rotation in the azimuth of the transverse field as a function of wavelength from analytical expressions for the Stokes parameters. These results are then compared with the observed rotation of the field's azimuth which is derived from observations at different wavelengths within the Fe I 5250 Å spectral line. From these comparisons, we find: the observed rotation of the azimuth is simulated to a reasonable degree by the theoretical formulations if the line-formation parameter η0 is varied over the sunspot; these variations in η0 are substantiated by the line-intensity data; for the MSFC system, Faraday rotation can be neglected for field strengths less than 1800 G and field inclinations greater than 45°; to minimize the effects of Faraday rotation in sunspot umbrae, MSFC magnetograph measurements must be made in the far wings of the Zeeman-sensitive spectral line. (orig.)

  6. A depth-averaged debris-flow model that includes the effects of evolving dilatancy. I. physical basis

    Science.gov (United States)

    Iverson, Richard M.; George, David L.

    2014-01-01

    To simulate debris-flow behaviour from initiation to deposition, we derive a depth-averaged, two-phase model that combines concepts of critical-state soil mechanics, grain-flow mechanics and fluid mechanics. The model's balance equations describe coupled evolution of the solid volume fraction, m, basal pore-fluid pressure, flow thickness and two components of flow velocity. Basal friction is evaluated using a generalized Coulomb rule, and fluid motion is evaluated in a frame of reference that translates with the velocity of the granular phase, v_s. Source terms in each of the depth-averaged balance equations account for the influence of the granular dilation rate, defined as the depth integral of ∇·v_s. Calculation of the dilation rate involves the effects of an elastic compressibility and an inelastic dilatancy angle proportional to m − m_eq, where m_eq is the value of m in equilibrium with the ambient stress state and flow rate. Normalization of the model equations shows that predicted debris-flow behaviour depends principally on the initial value of m − m_eq and on the ratio of two fundamental timescales. One of these timescales governs downslope debris-flow motion, and the other governs pore-pressure relaxation that modifies Coulomb friction and regulates evolution of m. A companion paper presents a suite of model predictions and tests.
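
    In symbols, the two closures named above can be sketched as follows (schematic notation only; the proportionality constant and the full functional forms are given in the paper and are not reproduced here). Under the usual critical-state convention, m > m_eq corresponds to dilation on shearing and m < m_eq to contraction:

        D = \int_{0}^{h} \nabla\cdot\mathbf{v}_s \, dz ,
        \qquad
        \tan\psi \;\propto\; m - m_{\mathrm{eq}} .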

  7. The genotype-phenotype map of an evolving digital organism.

    Directory of Open Access Journals (Sweden)

    Miguel A Fortuna

    2017-02-01

    Full Text Available To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms from a vast space of 10^141 genotypes (instruction sequences), which can form 512 different phenotypes. These phenotypes are distinguished by different Boolean logic functions they can compute, as well as by the complexity of these functions. We observe several properties with parallels in natural systems, such as connected genotype networks and asymmetric phenotypic transitions. The likely common cause is robustness to genotypic change. We describe an intriguing tension between phenotypic complexity and evolvability that may have implications for biological evolution. On the one hand, genotypic change is more likely to yield novel phenotypes in more complex organisms. On the other hand, the total number of novel phenotypes reachable through genotypic change is highest for organisms with simple phenotypes. Artificial evolving systems can help us study aspects of biological evolvability that are not accessible in vastly more complex natural systems. They can also help identify properties, such as robustness, that are required for both human-designed artificial systems and synthetic biological systems to be evolvable.

  8. The genotype-phenotype map of an evolving digital organism.

    Science.gov (United States)

    Fortuna, Miguel A; Zaman, Luis; Ofria, Charles; Wagner, Andreas

    2017-02-01

    To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms from a vast space of 10^141 genotypes (instruction sequences), which can form 512 different phenotypes. These phenotypes are distinguished by different Boolean logic functions they can compute, as well as by the complexity of these functions. We observe several properties with parallels in natural systems, such as connected genotype networks and asymmetric phenotypic transitions. The likely common cause is robustness to genotypic change. We describe an intriguing tension between phenotypic complexity and evolvability that may have implications for biological evolution. On the one hand, genotypic change is more likely to yield novel phenotypes in more complex organisms. On the other hand, the total number of novel phenotypes reachable through genotypic change is highest for organisms with simple phenotypes. Artificial evolving systems can help us study aspects of biological evolvability that are not accessible in vastly more complex natural systems. They can also help identify properties, such as robustness, that are required for both human-designed artificial systems and synthetic biological systems to be evolvable.

  9. An Evolving Asymmetric Game for Modeling Interdictor-Smuggler Problems

    Science.gov (United States)

    2016-06-01

    Master's thesis by Richard J. Allain, Naval Postgraduate School, Monterey, California, June 2016. Title: An Evolving Asymmetric Game for Modeling Interdictor-Smuggler Problems. Thesis Advisor: David L. Alderson; Second Reader: W... Approved for public release; distribution is unlimited.

  10. Adaptation of Escherichia coli to glucose promotes evolvability in lactose.

    Science.gov (United States)

    Phillips, Kelly N; Castillo, Gerardo; Wünsche, Andrea; Cooper, Tim F

    2016-02-01

    The selective history of a population can influence its subsequent evolution, an effect known as historical contingency. We previously observed that five of six replicate populations that were evolved in a glucose-limited environment for 2000 generations, then switched to lactose for 1000 generations, had higher fitness increases in lactose than populations started directly from the ancestor. To test if selection in glucose systematically increased lactose evolvability, we started 12 replay populations--six from a population subsample and six from a single randomly selected clone--from each of the six glucose-evolved founder populations. These replay populations and 18 ancestral populations were evolved for 1000 generations in a lactose-limited environment. We found that replay populations were initially slightly less fit in lactose than the ancestor, but were more evolvable, in that they increased in fitness at a faster rate and to higher levels. This result indicates that evolution in the glucose environment resulted in genetic changes that increased the potential of genotypes to adapt to lactose. Genome sequencing identified four genes--iclR, nadR, spoT, and rbs--that were mutated in most glucose-evolved clones and are candidates for mediating increased evolvability. Our results demonstrate that short-term selective costs during selection in one environment can lead to changes in evolvability that confer longer term benefits. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.

  11. PCBA demand forecasting using an evolving Takagi-Sugeno system

    NARCIS (Netherlands)

    van Rooijen, M.; Almeida, R.J.; Kaymak, U.

    2016-01-01

    This paper investigates the use of an evolving fuzzy system for printed circuit board assembly (PCBA) demand forecasting. The algorithm is based on the evolving Takagi-Sugeno (eTS) fuzzy system, which has the ability to incorporate new patterns by changing its internal structure in an on-line fashion.
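
    The on-line rule-creation and rule-updating machinery of eTS is beyond a short sketch, but the inference step it shares with any first-order Takagi-Sugeno system can be illustrated as below; the rule centres, spreads and consequent parameters here are made-up numbers, whereas in eTS they would be added and recursively updated as data stream in.

        import numpy as np

        # Each rule: (centre / focal point, spread, linear consequent parameters [a1, a2, b]).
        rules = [
            (np.array([0.2, 0.3]), 0.25, np.array([1.0, 0.5, 0.1])),
            (np.array([0.8, 0.7]), 0.25, np.array([-0.4, 1.2, 0.3])),
        ]

        def ts_predict(x, rules):
            """First-order Takagi-Sugeno inference: weighted average of local linear models."""
            x = np.asarray(x, dtype=float)
            weights, outputs = [], []
            for centre, spread, theta in rules:
                w = np.exp(-np.sum((x - centre) ** 2) / (2 * spread ** 2))  # Gaussian firing strength
                y = theta[:-1] @ x + theta[-1]                              # local linear consequent
                weights.append(w)
                outputs.append(y)
            weights = np.array(weights)
            return float(weights @ np.array(outputs) / weights.sum())

        print(ts_predict([0.5, 0.5], rules))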

  12. Revisiting Robustness and Evolvability: Evolution in Weighted Genotype Spaces

    Science.gov (United States)

    Partha, Raghavendran; Raman, Karthik

    2014-01-01

    Robustness and evolvability are highly intertwined properties of biological systems. The relationship between these properties determines how biological systems are able to withstand mutations and show variation in response to them. Computational studies have explored the relationship between these two properties using neutral networks of RNA sequences (genotype) and their secondary structures (phenotype) as a model system. However, these studies have assumed every mutation to a sequence to be equally likely; the differences in the likelihood of the occurrence of various mutations, and the consequences of the probabilistic nature of the mutations in such a system, have previously been ignored. Associating probabilities to mutations essentially results in the weighting of genotype space. We here perform a comparative analysis of weighted and unweighted neutral networks of RNA sequences, and subsequently explore the relationship between robustness and evolvability. We show that assuming an equal likelihood for all mutations (as in an unweighted network) underestimates robustness and overestimates evolvability of a system. Even after discarding this assumption, we observe that a negative correlation between sequence (genotype) robustness and sequence evolvability persists, and also that structure (phenotype) robustness promotes structure evolvability, as observed in earlier studies using unweighted networks. We also study the effects of base composition bias on robustness and evolvability. In particular, we explore the association between robustness and evolvability in a sequence space that is AU-rich – sequences with an AU content of 80% or higher – compared to a normal (unbiased) sequence space. We find that the evolvability of both sequences and structures in an AU-rich space is lower than in the normal space, and robustness is higher. We also observe that AU-rich populations evolving on neutral networks of phenotypes can access less phenotypic variation compared to
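
    The distinction between unweighted and weighted genotype spaces can be illustrated with a toy robustness calculation (a hypothetical sketch, not the authors' RNA-folding pipeline, and the stand-in phenotype below is deliberately trivial): robustness is the probability that a one-point mutation preserves the phenotype, computed either with uniform mutation probabilities or with probabilities biased by substitution type.

        BASES = "AUGC"

        def toy_phenotype(seq):
            """Stand-in for RNA secondary structure: here simply the GC content, rounded."""
            return round(sum(b in "GC" for b in seq) / len(seq), 2)

        def robustness(seq, weight):
            """Probability that a one-point mutation leaves the phenotype unchanged.

            weight(old, new) is the relative probability of substituting old -> new;
            a constant function recovers the unweighted (uniform-mutation) case.
            """
            wild_type = toy_phenotype(seq)
            neutral = total = 0.0
            for i, old in enumerate(seq):
                for new in BASES:
                    if new == old:
                        continue
                    w = weight(old, new)
                    total += w
                    if toy_phenotype(seq[:i] + new + seq[i + 1:]) == wild_type:
                        neutral += w
            return neutral / total

        uniform = lambda old, new: 1.0
        # Hypothetical bias: transitions (A<->G, C<->U) three times likelier than transversions.
        biased = lambda old, new: 3.0 if {old, new} in ({"A", "G"}, {"C", "U"}) else 1.0

        seq = "AUGGCAUC"
        print(robustness(seq, uniform), robustness(seq, biased))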

  13. Video interpretability rating scale under network impairments

    Science.gov (United States)

    Kreitmair, Thomas; Coman, Cristian

    2014-01-01

    This paper presents the results of a study of the impact of network transmission channel parameters on the quality of streaming video data. A common practice for estimating the interpretability of video information is to use the Motion Imagery Quality Equation (MIQE). MIQE combines a few technical features of video images (such as: ground sampling distance, relative edge response, modulation transfer function, gain and signal-to-noise ratio) to estimate the interpretability level. One observation of this study is that the MIQE does not fully account for video-specific parameters such as spatial and temporal encoding, which are relevant to appreciating degradations caused by the streaming process. In streaming applications the main artifacts impacting the interpretability level are related to distortions in the image caused by lossy decompression of video data (due to loss of information and in some cases lossy re-encoding by the streaming server). One parameter in MIQE that is influenced by network transmission errors is the Relative Edge Response (RER). The automated calculation of RER includes the selection of the best edge in the frame, which in case of network errors may be incorrectly associated with a blocked region (e.g. low resolution areas caused by loss of information). A solution is discussed in this document to address this inconsistency by removing corrupted regions from the image analysis process. Furthermore, a recommendation is made on how to account for network impairments in the MIQE, such that a more realistic interpretability level is estimated in case of streaming applications.
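
    One way to picture the correction discussed above (purely an illustrative sketch; the actual MIQE terms and the paper's detection method are not reproduced here) is to mask out low-variance, blocky regions of a decoded frame before the strongest edge is selected for the relative edge response estimate:

        import numpy as np

        def usable_blocks(frame, block=16, min_std=4.0):
            """Mask out low-variance blocks, which often correspond to lost/blocked regions."""
            h, w = frame.shape
            mask = np.zeros((h // block, w // block), dtype=bool)
            for by in range(h // block):
                for bx in range(w // block):
                    tile = frame[by * block:(by + 1) * block, bx * block:(bx + 1) * block]
                    mask[by, bx] = tile.std() > min_std
            return mask

        def sharpest_edge_strength(frame, mask, block=16):
            """Maximum horizontal gradient, taken only over blocks judged usable."""
            grad = np.abs(np.diff(frame.astype(float), axis=1))
            best = 0.0
            for by, bx in zip(*np.nonzero(mask)):
                tile = grad[by * block:(by + 1) * block, bx * block:(bx + 1) * block - 1]
                if tile.size:
                    best = max(best, float(tile.max()))
            return best

        frame = (np.random.rand(64, 64) * 255).astype(np.uint8)
        frame[:16, :16] = 128  # simulate a blocked (flat) region caused by packet loss
        print(sharpest_edge_strength(frame, usable_blocks(frame)))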

  14. The impact of working memory on interpreting

    Institute of Scientific and Technical Information of China (English)

    白云安; 张国梅

    2016-01-01

    This paper investigates the role of working memory in the interpreting process. First, it gives a brief introduction to interpreting. Second, the paper exemplifies the role of working memory in interpreting. The results reveal that the working memory capacity of interpreters is not absolutely proportional to the quality of interpreting under real interpreting conditions; the performance of an interpreter with a well-developed working memory capacity is comprehensively influenced by various factors.

  15. Discrimination in Public Employment: The Evolving Law.

    Science.gov (United States)

    McCarthy, Martha M.

    This monograph reviews the current status of constitutional, statutory, and case law governing public employers' obligations to assure equal employment opportunities and employees' rights to nondiscriminatory treatment. An initial overview of the legal framework discusses federal equal protection mandates including the guarantee of equal…

  16. Selective Attention and Control of Action: Comparative Psychology of an Artificial, Evolved Agent and People

    Science.gov (United States)

    Ward, Robert; Ward, Ronnie

    2008-01-01

    This study examined the selective attention abilities of a simple, artificial, evolved agent and considered implications of the agent's performance for theories of selective attention and action. The agent processed two targets in continuous time, catching one and then the other. This task required many cognitive operations, including prioritizing…

  17. Virtual Microscopy: A Useful Tool for Meeting Evolving Challenges in the Veterinary Medical Curriculum

    Science.gov (United States)

    Kogan, Lori R.; Dowers, Kristy L.; Cerda, Jacey R.; Schoenfeld-Tacher, Regina M.; Stewart, Sherry M.

    2014-01-01

    Veterinary schools, similar to many professional health programs, face a myriad of evolving challenges in delivering their professional curricula including expansion of class size, costs to maintain expensive laboratories, and increased demands on veterinary educators to use curricular time efficiently and creatively. Additionally, exponential…

  18. Will the Amaranthus tuberculatus Resistance Mechanism to PPO-Inhibiting Herbicides Evolve in Other Amaranthus Species?

    Directory of Open Access Journals (Sweden)

    Chance W. Riggins

    2012-01-01

    Full Text Available Resistance to herbicides that inhibit protoporphyrinogen oxidase (PPO) has been slow to evolve and, to date, is confirmed for only four weed species. Two of these species are members of the genus Amaranthus L. Previous research has demonstrated that PPO-inhibitor resistance in A. tuberculatus (Moq.) Sauer, the first weed to have evolved this type of resistance, involves a unique codon deletion in the PPX2 gene. Our hypothesis is that A. tuberculatus may have been predisposed to evolving this resistance mechanism due to the presence of a repetitive motif at the mutation site and that lack of this motif in other amaranth species is why PPO-inhibitor resistance has not become more common despite strong herbicide selection pressure. Here we investigate inter- and intraspecific variability of the PPX2 gene—specifically exon 9, which includes the mutation site—in ten amaranth species via sequencing and a PCR-RFLP assay. Few polymorphisms were observed in this region of the gene, and intraspecific variation was observed only in A. quitensis. However, sequencing revealed two distinct repeat patterns encompassing the mutation site. Most notably, A. palmeri S. Watson possesses the same repetitive motif found in A. tuberculatus. We thus predict that A. palmeri will evolve resistance to PPO inhibitors via the same PPX2 codon deletion that evolved in A. tuberculatus.

  19. Closed loop deep brain stimulation: an evolving technology.

    Science.gov (United States)

    Hosain, Md Kamal; Kouzani, Abbas; Tye, Susannah

    2014-12-01

    Deep brain stimulation is an effective and safe medical treatment for a variety of neurological and psychiatric disorders including Parkinson's disease, essential tremor, dystonia, and treatment-resistant obsessive compulsive disorder. A closed loop deep brain stimulation (CLDBS) system automatically adjusts stimulation parameters according to the brain's response in real time. CLDBS continues to evolve due to advances in brain stimulation technologies. This paper provides a study of the existing systems developed for CLDBS. It highlights the issues associated with CLDBS systems, including feedback signal recording and processing, stimulation parameter setting, control algorithms, wireless telemetry, size, and power consumption. The benefits and limitations of the existing CLDBS systems are also presented. Whilst robust clinical proof of the benefits of the technology remains to be achieved, it has the potential to offer several advantages over open loop DBS. CLDBS can improve the efficiency and efficacy of therapy, eliminate the lengthy start-up period for programming and adjustment, provide personalized treatment, and make parameter setting automatic and adaptive.
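
    The closed-loop idea can be pictured with a deliberately simplified sketch (hypothetical and illustrative only; real CLDBS systems rely on validated biomarkers, safety limits and clinically approved hardware): a proportional controller adjusts stimulation amplitude from a recorded biomarker such as beta-band power.

        def update_stimulation(beta_power, amplitude, target=0.5, gain=0.8,
                               min_amp=0.0, max_amp=3.0):
            """One step of a proportional closed-loop update of DBS amplitude (mA).

            Stimulation increases when the biomarker (e.g. beta-band power) is above
            target, decreases when below, and is clamped to a safe range.
            """
            amplitude += gain * (beta_power - target)
            return max(min_amp, min(max_amp, amplitude))

        # Simulated biomarker trace: stimulation ramps up while beta power stays high.
        amp = 1.0
        for beta in [0.9, 0.8, 0.6, 0.5, 0.4]:
            amp = update_stimulation(beta, amp)
            print(round(amp, 2))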

  20. Default Sarcastic Interpretations: On the Priority of Nonsalient Interpretations

    Science.gov (United States)

    Giora, Rachel; Drucker, Ari; Fein, Ofer; Mendelson, Itamar

    2015-01-01

    Findings from five experiments support the view that negation generates sarcastic utterance-interpretations by default. When presented in isolation, novel negative constructions ("Punctuality is not his forte," "Thoroughness is not her most distinctive feature"), free of semantic anomaly or internal incongruity, were…

  1. Interpretation and evaluation of radiograph

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    After digestion, the interpreter must interpret and evaluate the image on the film; many radiographs get held up at this step, although if the density is good there is usually no problem. This is the final stage of radiographic work, and it must be done by a Level 2 or Level 3 radiographer. It is the last stage before the radiographer gives a result to the customer for further action. A good interpreter must know what kind of artifact is present and whether or not it is significant, among other things. In this chapter, the artifacts that commonly appear are discussed briefly, with illustrations and pictures to help the reader understand and recognize the types of artifacts that exist.

  2. Interpretation and digestion of radiograph

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiograph digestion is the final check in radiography, carried out to make sure the quality of the image on the radiograph produced is inspected before it is interpreted. This is a critical stage: if a mistake is made here, all of the radiographic work done before it may be rejected. As mentioned earlier, this can waste time and money and, worse, can force production to shut down. At this step, a Level 2 radiographer or interpreter must therefore evaluate the radiograph carefully. For this purpose, a digestion (viewing) room and a densitometer must be used, and all procedures must follow the specifications stated in the governing documents. Several requirements must be met before a radiograph can be accepted, such as the location of the penetrameter, the number of penetrameters shown, and the film density; usually there is no problem at this step, and the radiograph can proceed to the interpretation and evaluation stage described in the next chapter.

  3. Diverse CRISPRs evolving in human microbiomes.

    Directory of Open Access Journals (Sweden)

    Mina Rho

    Full Text Available CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats loci, together with cas (CRISPR-associated genes, form the CRISPR/Cas adaptive immune system, a primary defense strategy that eubacteria and archaea mobilize against foreign nucleic acids, including phages and conjugative plasmids. Short spacer sequences separated by the repeats are derived from foreign DNA and direct interference to future infections. The availability of hundreds of shotgun metagenomic datasets from the Human Microbiome Project (HMP enables us to explore the distribution and diversity of known CRISPRs in human-associated microbial communities and to discover new CRISPRs. We propose a targeted assembly strategy to reconstruct CRISPR arrays, which whole-metagenome assemblies fail to identify. For each known CRISPR type (identified from reference genomes, we use its direct repeat consensus sequence to recruit reads from each HMP dataset and then assemble the recruited reads into CRISPR loci; the unique spacer sequences can then be extracted for analysis. We also identified novel CRISPRs or new CRISPR variants in contigs from whole-metagenome assemblies and used targeted assembly to more comprehensively identify these CRISPRs across samples. We observed that the distributions of CRISPRs (including 64 known and 86 novel ones are largely body-site specific. We provide detailed analysis of several CRISPR loci, including novel CRISPRs. For example, known streptococcal CRISPRs were identified in most oral microbiomes, totaling ∼8,000 unique spacers: samples resampled from the same individual and oral site shared the most spacers; different oral sites from the same individual shared significantly fewer, while different individuals had almost no common spacers, indicating the impact of subtle niche differences on the evolution of CRISPR defenses. We further demonstrate potential applications of CRISPRs to the tracing of rare species and the virus exposure of individuals
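
    The spacer-extraction step of the targeted-assembly strategy can be pictured with a hedged toy example (not the authors' pipeline; a real implementation would use approximate, mismatch-tolerant matching of the repeat): given an assembled locus and the direct-repeat consensus, spacers are the segments lying between successive repeat copies.

        def extract_spacers(locus, repeat):
            """Return spacer sequences lying between exact copies of the repeat consensus."""
            spacers, pos = [], locus.find(repeat)
            while pos != -1:
                next_pos = locus.find(repeat, pos + len(repeat))
                if next_pos == -1:
                    break
                spacers.append(locus[pos + len(repeat):next_pos])
                pos = next_pos
            return spacers

        # Toy locus: the repeat "GTTTT" separates two made-up spacers.
        locus = "GTTTT" + "ACGTACGT" + "GTTTT" + "TTGCAATTGC" + "GTTTT"
        print(extract_spacers(locus, "GTTTT"))  # -> ['ACGTACGT', 'TTGCAATTGC']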

  4. Evolving Educational Techniques in Surgical Training.

    Science.gov (United States)

    Evans, Charity H; Schenarts, Kimberly D

    2016-02-01

    Training competent and professional surgeons efficiently and effectively requires innovation and modernization of educational methods. Today's medical learner is quite adept at using multiple platforms to gain information, providing surgical educators with numerous innovative avenues to promote learning. With the growth of technology, and the restriction of work hours in surgical education, there has been an increase in use of simulation, including virtual reality, robotics, telemedicine, and gaming. The use of simulation has shifted the learning of basic surgical skills to the laboratory, reserving limited time in the operating room for the acquisition of complex surgical skills. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Interpretative challenges in face analysis

    DEFF Research Database (Denmark)

    de Oliveira, Sandi Michele; Hernández-Flores, Nieves

    2015-01-01

    In current research on face analysis questions of who and what should be interpreted, as well as how, are of central interest. In English language research, this question has led to a debate on the concepts of P1 (laypersons, representing the “emic” perspective) and P2 (researchers, representing...... in Spanish and address forms in European Portuguese, we view P1 and P2 as being far more complex than the literature suggests, with subgroups (different types of laypersons and researchers, respectively). At the micro-level we will describe the roles each subgroup plays in the interpretative process...

  6. The evolving genetic foundations of eating disorders.

    Science.gov (United States)

    Klump, K L; Kaye, W H; Strober, M

    2001-06-01

    Data described earlier are clear in establishing a role for genes in the development of eating abnormalities. Estimates from the most rigorous studies suggest that more than 50% of the variance in eating disorders and disordered eating behaviors can be accounted for by genetic effects. These high estimates indicate a need for studies identifying the specific genes contributing to this large proportion of variance. Twin and family studies suggest that several heritable characteristics that are commonly comorbid with AN and BN may share genetic transmission with these disorders, including anxiety disorders or traits, body weight, and possibly major depression. Moreover, some developmental research suggests that the genes involved in ovarian hormones or the genes that these steroids affect also may be genetically linked to eating abnormalities. Molecular genetic research of these disorders is in its infant stages. However, promising areas for future research have already been identified (e.g., 5-HT2A receptor gene, UCP-2/UCP-3 gene, and estrogen receptor beta gene), and several large-scale linkage and association studies are underway. These studies likely will provide invaluable information regarding the appropriate phenotypes to be included in genetic studies and the genes with the most influence on the development of these disorders.

  7. Choledocholithiasis: Evolving standards for diagnosis and management

    Institute of Scientific and Technical Information of China (English)

    Marilee L Freitas; Robert L Bell; Andrew J Duffy

    2006-01-01

    Cholelithiasis, one of the most common medical conditions leading to surgical intervention, affects approximately 10% of the adult population in the United States. Choledocholithiasis develops in about 10%-20% of patients with gallbladder stones, and the literature suggests that at least 3%-10% of patients undergoing cholecystectomy will have common bile duct (CBD) stones. CBD stones may be discovered preoperatively, intraoperatively or postoperatively. Multiple modalities are available for assessing patients for choledocholithiasis, including laboratory tests, ultrasound, computed tomography (CT) scans, and magnetic resonance cholangiopancreatography (MRCP). Intraoperative cholangiography during cholecystectomy can be used routinely or selectively to diagnose CBD stones. The most common intervention for CBD stones is ERCP. Other commonly used interventions include intraoperative bile duct exploration, either laparoscopic or open. Percutaneous transhepatic stone removal and other novel techniques of biliary clearance have been devised. The availability of equipment and skilled practitioners who are facile with these techniques varies among institutions. The timing of the intervention is often dictated by the clinical situation.

  8. Preparing for Mars: The Evolvable Mars Campaign 'Proving Ground' Approach

    Science.gov (United States)

    Bobskill, Marianne R.; Lupisella, Mark L.; Mueller, Rob P.; Sibille, Laurent; Vangen, Scott; Williams-Byrd, Julie

    2015-01-01

    As the National Aeronautics and Space Administration (NASA) prepares to extend human presence beyond Low Earth Orbit, we are in the early stages of planning missions within the framework of an Evolvable Mars Campaign. Initial missions would be conducted in near-Earth cis-lunar space and would eventually culminate in extended duration crewed missions on the surface of Mars. To enable such exploration missions, critical technologies and capabilities must be identified, developed, and tested. NASA has followed a principled approach to identify critical capabilities and a "Proving Ground" approach is emerging to address testing needs. The Proving Ground is a period subsequent to current International Space Station activities wherein exploration-enabling capabilities and technologies are developed and the foundation is laid for sustained human presence in space. The Proving Ground domain essentially includes missions beyond Low Earth Orbit that will provide increasing mission capability while reducing technical risks. Proving Ground missions also provide valuable experience with deep space operations and support the transition from "Earth-dependence" to "Earth-independence" required for sustainable space exploration. A Technology Development Assessment Team identified a suite of critical technologies needed to support the cadence of exploration missions. Discussions among mission planners, vehicle developers, subject-matter-experts, and technologists were used to identify a minimum but sufficient set of required technologies and capabilities. Within System Maturation Teams, known challenges were identified and expressed as specific performance gaps in critical capabilities, which were then refined and activities required to close these critical gaps were identified. Analysis was performed to identify test and demonstration opportunities for critical technical capabilities across the Proving Ground spectrum of missions. This suite of critical capabilities is expected to

  9. Diversity Generation in Evolving Microbial Populations

    DEFF Research Database (Denmark)

    Markussen, Trine

    Pseudomonas aeruginosa infections in the airways of patients with cystic fibrosis (CF) offer opportunities to study bacterial evolution and adaptation in natural environments. Significant phenotypic and genomic changes of P. aeruginosa have been observed during chronic infection. While P. aeruginosa...... bacterial genome sequencing, phenotypic profiling and unique sampling materials, which included clonal bacterial isolates sampled over more than 4 decades from chronically infected CF patients, we were able to investigate the diversity generation of the clinically important and highly successful P. aeruginosa...... DK1 clone type during chronic airway infection in CF patients. We show here that diversification of P. aeruginosa DK1 occurs through the emergence of coexisting subpopulations with distinct phenotypic and genomic features and demonstrate that this diversification was a result of niche specialization...

  10. Evolving Marine Biomimetics for Regenerative Dentistry

    Science.gov (United States)

    Green, David W.; Lai, Wing-Fu; Jung, Han-Sung

    2014-01-01

    New products that help make human tissue and organ regeneration more effective are in high demand and include materials, structures and substrates that drive cell-to-tissue transformations, orchestrate anatomical assembly and tissue integration with biology. Marine organisms are exemplary bioresources that have extensive possibilities in supporting and facilitating development of human tissue substitutes. Such organisms represent a deep and diverse reserve of materials, substrates and structures that can facilitate tissue reconstruction within lab-based cultures. The reason is that they possess sophisticated structures, architectures and biomaterial designs that are still difficult to replicate using synthetic processes, so far. These products offer tantalizing pre-made options that are versatile, adaptable and have many functions for current tissue engineers seeking fresh solutions to the deficiencies in existing dental biomaterials, which lack the intrinsic elements of biofunctioning, structural and mechanical design to regenerate anatomically correct dental tissues both in the culture dish and in vivo. PMID:24828293

  11. Evolving Marine Biomimetics for Regenerative Dentistry

    Directory of Open Access Journals (Sweden)

    David W. Green

    2014-05-01

    Full Text Available New products that help make human tissue and organ regeneration more effective are in high demand and include materials, structures and substrates that drive cell-to-tissue transformations, orchestrate anatomical assembly and tissue integration with biology. Marine organisms are exemplary bioresources that have extensive possibilities in supporting and facilitating development of human tissue substitutes. Such organisms represent a deep and diverse reserve of materials, substrates and structures that can facilitate tissue reconstruction within lab-based cultures. The reason is that they possess sophisticated structures, architectures and biomaterial designs that are still difficult to replicate using synthetic processes, so far. These products offer tantalizing pre-made options that are versatile, adaptable and have many functions for current tissue engineers seeking fresh solutions to the deficiencies in existing dental biomaterials, which lack the intrinsic elements of biofunctioning, structural and mechanical design to regenerate anatomically correct dental tissues both in the culture dish and in vivo.

  12. Evolving prosocial and sustainable neighborhoods and communities.

    Science.gov (United States)

    Biglan, Anthony; Hinds, Erika

    2009-01-01

    In this review, we examine randomized controlled trials of community interventions to affect health. The evidence supports the efficacy of community interventions for preventing tobacco, alcohol, and other drug use; several recent trials have shown the benefits of community interventions for preventing multiple problems of young people, including antisocial behavior. However, the next generation of community intervention research needs to reflect more fully the fact that most psychological and behavioral problems of humans are interrelated and result from the same environmental conditions. The evidence supports testing new comprehensive community interventions that focus on increasing nurturance in communities. Nurturing communities will be ones in which families, schools, neighborhoods, and workplaces (a) minimize biologically and socially toxic events, (b) richly reinforce prosocial behavior, and (c) foster psychological acceptance. Such interventions also have the potential to make neighborhoods more sustainable.

  13. Data analysis and interpretation for environmental surveillance

    International Nuclear Information System (INIS)

    1992-06-01

    The Data Analysis and Interpretation for Environmental Surveillance Conference was held in Lexington, Kentucky, February 5-7, 1990. The conference was sponsored by what is now the Office of Environmental Compliance and Documentation, Oak Ridge National Laboratory. Participants included technical professionals from all Martin Marietta Energy Systems facilities, Westinghouse Materials Company of Ohio, Pacific Northwest Laboratory, and several technical support contractors. Presentations at the conference spanned the full spectrum of issues that affect the analysis and interpretation of environmental data. Topics included tracking systems for samples and schedules associated with ongoing programs; coalescing data from a variety of sources and pedigrees into integrated data bases; methods for evaluating the quality of environmental data through empirical estimates of parameters such as charge balance, pH, and specific conductance; statistical applications to the interpretation of environmental information; and uses of environmental information in risk and dose assessments. Hearing about and discussing this wide variety of topics provided an opportunity to capture the subtlety of each discipline and to appreciate the continuity that is required among the disciplines in order to perform high-quality environmental information analysis

  14. Analyzing and Interpreting Historical Sources

    DEFF Research Database (Denmark)

    Kipping, Matthias; Wadhwani, Dan; Bucheli, Marcelo

    2014-01-01

    This chapter outlines a methodology for the interpretation of historical sources, helping to realize their full potential for the study of organization, while overcoming their challenges in terms of distortions created by time, changes in context, and selective production or preservation. Drawing....... The chapter contributes to the creation of a language for describing the use of historical sources in management research....

  15. Quantum theory needs no 'Interpretation'

    International Nuclear Information System (INIS)

    Fuchs, Christopher A.; Peres, Asher

    2000-01-01

    The purpose of this article is to stress the fact that Quantum Theory does not need an interpretation other than being an algorithm for computing probabilities associated with macroscopic phenomena and measurements. It does not "describe" reality, and the wave function is not an objective entity; it only gives the evolution of our probabilities for the outcomes of potential experiments. (AIP) (c)

  16. Interpretation of Recurrent Neural Networks

    DEFF Research Database (Denmark)

    Pedersen, Morten With; Larsen, Jan

    1997-01-01

    This paper addresses techniques for interpretation and characterization of trained recurrent nets for time series problems. In particular, we focus on assessment of effective memory and suggest an operational definition of memory. Further we discuss the evaluation of learning curves. Various nume...

  17. An interpretation of signature inversion

    International Nuclear Information System (INIS)

    Onishi, Naoki; Tajima, Naoki

    1988-01-01

    An interpretation in terms of the cranking model is presented to explain why signature inversion occurs for positive γ of the axially asymmetric deformation parameter and emerges into specific orbitals. By introducing a continuous variable, the eigenvalue equation can be reduced to a one dimensional Schroedinger equation by means of which one can easily understand the cause of signature inversion. (author)

  18. Abstract Interpretation of Mobile Ambients

    DEFF Research Database (Denmark)

    Hansen, René Rydhof; Jensen, J. G.; Nielson, Flemming

    1999-01-01

    We demonstrate that abstract interpretation is useful for analysing calculi of computation such as the ambient calculus (which is based on the π-calculus); more importantly, we show that the entire development can be expressed in a constraint-based formalism that is becoming exceedingly popular...

  19. Interpretive Reproduction in Children's Play

    Science.gov (United States)

    Corsaro, William A.

    2012-01-01

    The author looks at children's play from the perspective of interpretive reproduction, emphasizing the way children create their own unique peer cultures, which he defines as a set of routines, artifacts, values, and concerns that children engage in with their playmates. The article focuses on two types of routines in the peer culture of preschool…

  20. Interpreting Data: The Hybrid Mind

    Science.gov (United States)

    Heisterkamp, Kimberly; Talanquer, Vicente

    2015-01-01

    The central goal of this study was to characterize major patterns of reasoning exhibited by college chemistry students when analyzing and interpreting chemical data. Using a case study approach, we investigated how a representative student used chemical models to explain patterns in the data based on structure-property relationships. Our results…

  1. Interpretable functional principal component analysis.

    Science.gov (United States)

    Lin, Zhenhua; Wang, Liangliang; Cao, Jiguo

    2016-09-01

    Functional principal component analysis (FPCA) is a popular approach to explore major sources of variation in a sample of random curves. These major sources of variation are represented by functional principal components (FPCs). The intervals where the values of FPCs are significant are interpreted as where sample curves have major variations. However, these intervals are often hard for naïve users to identify, because of the vague definition of "significant values". In this article, we develop a novel penalty-based method to derive FPCs that are only nonzero precisely in the intervals where the values of FPCs are significant, whence the derived FPCs possess better interpretability than the FPCs derived from existing methods. To compute the proposed FPCs, we devise an efficient algorithm based on projection deflation techniques. We show that the proposed interpretable FPCs are strongly consistent and asymptotically normal under mild conditions. Simulation studies confirm that with a competitive performance in explaining variations of sample curves, the proposed FPCs are more interpretable than the traditional counterparts. This advantage is demonstrated by analyzing two real datasets, namely, electroencephalography data and Canadian weather data. © 2015, The International Biometric Society.
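
    As a rough illustration of the deflation idea underlying the proposed algorithm, the sketch below extracts successive components from discretized curves and removes each component's contribution by projection deflation before computing the next one. It is a minimal Python sketch with illustrative function names; the paper's interpretability penalty, which localizes each FPC to the intervals where it is significant, is not reproduced here.

        import numpy as np

        def deflate(cov, v):
            # Projection deflation: remove the variance captured by component v.
            proj = np.eye(len(v)) - np.outer(v, v)
            return proj @ cov @ proj

        def leading_components(curves, n_components=3):
            # Rows are sampled curves, columns are points on a common grid.
            X = curves - curves.mean(axis=0)           # center the sample
            cov = X.T @ X / (len(X) - 1)               # empirical covariance on the grid
            comps = []
            for _ in range(n_components):
                eigvals, eigvecs = np.linalg.eigh(cov)
                v = eigvecs[:, -1]                     # leading eigenvector
                comps.append(v)
                cov = deflate(cov, v)                  # deflate before the next component
            return np.array(comps)

        # toy usage: 50 noisy sinusoidal curves sampled on 100 grid points
        grid = np.linspace(0, 1, 100)
        curves = np.sin(2 * np.pi * np.outer(np.random.rand(50) + 0.5, grid))
        curves += 0.05 * np.random.randn(*curves.shape)
        print(leading_components(curves).shape)        # (3, 100)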

  2. Interpretation and the Aesthetic Dimension

    Science.gov (United States)

    Mortensen, Charles O.

    1976-01-01

    The author, utilizing a synthesis of philosophic comments on aesthetics, provides a discourse on the aesthetic dimension and offers examples of how interpreters can nurture the innate sense of beauty in man. Poetic forms, such as haiku, are used to relate the aesthetic relationship between man and the environment. (BT)

  3. Abstract Interpretation Using Attribute Grammar

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1990-01-01

    This paper deals with the correctness proofs of attribute grammars using methods from abstract interpretation. The technique will be described by defining a live-variable analysis for a small flow-chart language and proving it correct with respect to a continuation style semantics. The proof...

  4. Interpreting peptide mass spectra by VEMS

    DEFF Research Database (Denmark)

    Mathiesen, Rune; Lundsgaard, M.; Welinder, Karen G.

    2003-01-01

    Most existing Mass Spectra (MS) analysis programs are automatic and provide limited opportunity for editing during the interpretation. Furthermore, they rely entirely on publicly available databases for interpretation. VEMS (Virtual Expert Mass Spectrometrist) is a program for interactive analysis of peptide MS/MS spectra imported in text file format. Peaks are annotated, the monoisotopic peaks retained, and the b- and y-ion series identified in an interactive manner. The called peptide sequence is searched against a local protein database for sequence identity and peptide mass. The report compares the calculated and the experimental mass spectrum of the called peptide. The program package includes four accessory programs. VEMStrans creates protein databases in FASTA format from EST or cDNA sequence files. VEMSdata creates a virtual peptide database from FASTA files. VEMSdist displays the distribution...

  5. Probabilistic interpretation of data a physicist's approach

    CERN Document Server

    Miller, Guthrie

    2013-01-01

    This book is a physicist's approach to the interpretation of data using Markov Chain Monte Carlo (MCMC). The concepts are derived from first principles using a style of mathematics that quickly elucidates the basic ideas, sometimes with the aid of examples. Probabilistic data interpretation is a straightforward problem involving conditional probability. A prior probability distribution is essential, and examples are given. In this small book (200 pages) the reader is led from the most basic concepts of mathematical probability all the way to parallel processing algorithms for Markov Chain Monte Carlo. Fortran source code (for eigenvalue analysis of finite discrete Markov Chains, for MCMC, and for nonlinear least squares) is included with the supplementary material for this book (available online).
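
    To make the core idea concrete, here is a minimal random-walk Metropolis sampler in Python for a single parameter with a flat prior and a Gaussian likelihood. It is an illustrative sketch only and is unrelated to the Fortran source code distributed with the book.

        import numpy as np

        def log_posterior(theta, data):
            # Unnormalized log posterior: flat prior on theta > 0,
            # Gaussian likelihood with unit variance (illustrative choice).
            if theta <= 0:
                return -np.inf
            return -0.5 * np.sum((data - theta) ** 2)

        def metropolis(data, n_steps=10000, step=0.5, theta0=1.0, seed=0):
            # Random-walk Metropolis: propose, then accept or reject.
            rng = np.random.default_rng(seed)
            chain = np.empty(n_steps)
            theta, logp = theta0, log_posterior(theta0, data)
            for i in range(n_steps):
                prop = theta + step * rng.normal()            # symmetric proposal
                logp_prop = log_posterior(prop, data)
                if np.log(rng.random()) < logp_prop - logp:   # Metropolis acceptance
                    theta, logp = prop, logp_prop
                chain[i] = theta
            return chain

        data = np.random.default_rng(1).normal(2.0, 1.0, size=20)
        samples = metropolis(data)[2000:]          # discard burn-in
        print(samples.mean(), samples.std())       # posterior summary for theta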

  6. Interpreting the universal phylogenetic tree

    Science.gov (United States)

    Woese, C. R.

    2000-01-01

    The universal phylogenetic tree not only spans all extant life, but its root and earliest branchings represent stages in the evolutionary process before modern cell types had come into being. The evolution of the cell is an interplay between vertically derived and horizontally acquired variation. Primitive cellular entities were necessarily simpler and more modular in design than are modern cells. Consequently, horizontal gene transfer early on was pervasive, dominating the evolutionary dynamic. The root of the universal phylogenetic tree represents the first stage in cellular evolution when the evolving cell became sufficiently integrated and stable to the erosive effects of horizontal gene transfer that true organismal lineages could exist.

  7. Information filtering in evolving online networks

    Science.gov (United States)

    Chen, Bo-Lun; Li, Fen-Fen; Zhang, Yong-Jun; Ma, Jia-Lin

    2018-02-01

    Recommender systems use the records of users' activities and the profiles of both users and products to predict users' future preferences. Considerable work on recommendation algorithms has been published to address problems such as accuracy, diversity, congestion, cold-start, novelty and coverage. However, most of this research did not consider the temporal effects of the information included in the users' historical data. For example, the segmentation of the training set and test set was completely random, which is entirely different from the real scenario in recommender systems. More seriously, all objects are treated the same, regardless of whether products are new, popular or obsolete, and so are the users. These data processing methods lose useful information and mislead the understanding of the system's state. In this paper, we analyzed in detail the difference in network structure between the traditional random division method and the temporal division method on two benchmark data sets, Netflix and MovieLens. Then three classical recommendation algorithms, the Global Ranking method, Collaborative Filtering and the Mass Diffusion method, were employed. The results show that all these algorithms became worse on all four key indicators, ranking score, precision, popularity and diversity, in the temporal scenario. Finally, we design a new recommendation algorithm based on both users' and objects' first appearance time in the system. Experimental results showed that the new algorithm can greatly improve the accuracy and other metrics.
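
    The contrast between the two segmentation schemes can be illustrated with a short Python sketch: a conventional random split scatters future records into the training set, whereas a temporal split holds out only the most recent records, as a deployed recommender would see them. Field names and the toy data are illustrative, not taken from the Netflix or MovieLens sets.

        import numpy as np

        def random_split(ratings, test_frac=0.1, seed=0):
            # Conventional random segmentation of (user, item, timestamp) records.
            rng = np.random.default_rng(seed)
            mask = rng.random(len(ratings)) < test_frac
            return ratings[~mask], ratings[mask]

        def temporal_split(ratings, test_frac=0.1):
            # Temporal segmentation: the latest records form the test set.
            order = np.argsort(ratings["timestamp"])
            cut = int(len(ratings) * (1 - test_frac))
            return ratings[order[:cut]], ratings[order[cut:]]

        # toy usage with a structured array of pseudo-ratings
        rng = np.random.default_rng(42)
        ratings = np.zeros(100, dtype=[("user", int), ("item", int), ("timestamp", float)])
        ratings["user"] = rng.integers(0, 20, 100)
        ratings["item"] = rng.integers(0, 50, 100)
        ratings["timestamp"] = rng.random(100)
        train, test = temporal_split(ratings)
        print(len(train), len(test))               # 90 10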

  8. Evolving understanding and treatment of labour dystocia.

    Science.gov (United States)

    Karaçam, Zekiye; Walsh, Denis; Bugg, George John

    2014-11-01

    The objective of the review is to critically review the diagnosis and management of dystocia in the first stage of labour. We conducted a narrative review of research since 1998. Eight studies were identified, four about the onset and duration of active phase of the first stage of labour, one on the diagnosis of dystocia, and three focused on the treatment of dystocia. The review demonstrates that current understandings of dystocia rest on outdated definitions of active first stage of labour, its progress and on treatments with an equivocal evidence base. These include the cervical dilatation threshold for active first stage, uncertainty over whether a reduced rate of dilatation and reduced strength of uterine contractions always represent pathology and the effectiveness of amniotomy/oxytocin for treating dystocia. Prospective studies should evaluate the impact of defining the active phase of the first stage of labour as commencing at 6 cm dilated and should test this definition in combination with Zhang's revised partogram. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  9. Star formation in evolving molecular clouds

    Science.gov (United States)

    Völschow, M.; Banerjee, R.; Körtgen, B.

    2017-09-01

    Molecular clouds are the principal stellar nurseries of our universe; they thus remain a focus of both observational and theoretical studies. From observations, some of the key properties of molecular clouds are well known but many questions regarding their evolution and star formation activity remain open. While numerical simulations feature a large number and complexity of involved physical processes, this plethora of effects may hide the fundamentals that determine the evolution of molecular clouds and enable the formation of stars. Purely analytical models, on the other hand, tend to suffer from rough approximations or a lack of completeness, limiting their predictive power. In this paper, we present a model that incorporates central concepts of astrophysics as well as reliable results from recent simulations of molecular clouds and their evolutionary paths. Based on that, we construct a self-consistent semi-analytical framework that describes the formation, evolution, and star formation activity of molecular clouds, including a number of feedback effects to account for the complex processes inside those objects. The final equation system is solved numerically but at much lower computational expense than, for example, hydrodynamical descriptions of comparable systems. The model presented in this paper agrees well with a broad range of observational results, showing that molecular cloud evolution can be understood as an interplay between accretion, global collapse, star formation, and stellar feedback.

  10. Project Seahorse evolves into major marine protector | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2012-10-29

    Oct 29, 2012 ... Project Seahorse evolves into major marine protector ... local people, have greatly improved the prospects of survival for threatened species. ... “We tackle issues on any political level or geographical scale, according to what ...

  11. Incremental Frequent Subgraph Mining on Large Evolving Graphs

    KAUST Repository

    Abdelhamid, Ehab; Canim, Mustafa; Sadoghi, Mohammad; Bhatta, Bishwaranjan; Chang, Yuan-Chi; Kalnis, Panos

    2017-01-01

    , such as social networks, utilize large evolving graphs. Mining these graphs using existing techniques is infeasible, due to the high computational cost. In this paper, we propose IncGM+, a fast incremental approach for continuous frequent subgraph mining problem

  12. Genetic Algorithms Evolve Optimized Transforms for Signal Processing Applications

    National Research Council Canada - National Science Library

    Moore, Frank; Babb, Brendan; Becke, Steven; Koyuk, Heather; Lamson, Earl, III; Wedge, Christopher

    2005-01-01

    .... The primary goal of the research described in this final report was to establish a methodology for using genetic algorithms to evolve coefficient sets describing inverse transforms and matched...

  13. Biofabrication : reappraising the definition of an evolving field

    NARCIS (Netherlands)

    Groll, Jürgen; Boland, Thomas; Blunk, Torsten; Burdick, Jason A; Cho, Dong-Woo; Dalton, Paul D; Derby, Brian; Forgacs, Gabor; Li, Qing; Mironov, Vladimir A; Moroni, Lorenzo; Nakamura, Makoto; Shu, Wenmiao; Takeuchi, Shoji; Vozzi, Giovanni; Woodfield, Tim B F; Xu, Tao; Yoo, James J; Malda, Jos|info:eu-repo/dai/nl/412461099

    2016-01-01

    Biofabrication is an evolving research field that has recently received significant attention. In particular, the adoption of Biofabrication concepts within the field of Tissue Engineering and Regenerative Medicine has grown tremendously, and has been accompanied by a growing inconsistency in

  14. Biofabrication : Reappraising the definition of an evolving field

    NARCIS (Netherlands)

    Groll, Jürgen; Boland, Thomas; Blunk, Torsten; Burdick, Jason A.; Cho, Dong Woo; Dalton, Paul D.; Derby, Brian; Forgacs, Gabor; Li, Qing; Mironov, Vladimir A.; Moroni, Lorenzo; Nakamura, Makoto; Shu, Wenmiao; Takeuchi, Shoji; Vozzi, Giovanni; Woodfield, Tim B.F.; Xu, Tao; Yoo, James J.; Malda, Jos

    2016-01-01

    Biofabrication is an evolving research field that has recently received significant attention. In particular, the adoption of Biofabrication concepts within the field of Tissue Engineering and Regenerative Medicine has grown tremendously, and has been accompanied by a growing inconsistency in

  15. Orthogonally Evolved AI to Improve Difficulty Adjustment in Video Games

    DEFF Research Database (Denmark)

    Hintze, Arend; Olson, Randal; Lehman, Joel Anthony

    2016-01-01

    Computer games are most engaging when their difficulty is well matched to the player's ability, thereby providing an experience in which the player is neither overwhelmed nor bored. In games where the player interacts with computer-controlled opponents, the difficulty of the game can be adjusted...... not only by changing the distribution of opponents or game resources, but also through modifying the skill of the opponents. Applying evolutionary algorithms to evolve the artificial intelligence that controls opponent agents is one established method for adjusting opponent difficulty. Less-evolved agents...... (i.e. agents subject to fewer generations of evolution) make for easier opponents, while highly-evolved agents are more challenging to overcome. In this publication we test a new approach for difficulty adjustment in games: orthogonally evolved AI, where the player receives support from collaborating...

  16. Evolving political science. Biological adaptation, rational action, and symbolism.

    Science.gov (United States)

    Tingley, Dustin

    2006-01-01

    Political science, as a discipline, has been reluctant to adopt theories and methodologies developed in fields studying human behavior from an evolutionary standpoint. I ask whether evolutionary concepts are reconcilable with standard political-science theories and whether those concepts help solve puzzles to which these theories classically are applied. I find that evolutionary concepts readily and simultaneously accommodate theories of rational choice, symbolism, interpretation, and acculturation. Moreover, phenomena perennially hard to explain in standard political science become clearer when human interactions are understood in light of natural selection and evolutionary psychology. These phenomena include the political and economic effects of emotion, status, personal attractiveness, and variations in information-processing and decision-making under uncertainty; exemplary is the use of "focal points" in multiple-equilibrium games. I conclude with an overview of recent research by, and ongoing debates among, scholars analyzing politics in evolutionarily sophisticated terms.

  17. Evolving Frameworks for Different Communities of Scientists and End Users

    Science.gov (United States)

    Graves, S. J.; Keiser, K.

    2016-12-01

    Two evolving frameworks for interdisciplinary science will be described in the context of the Common Data Framework for Earth-Observation Data and the importance of standards and protocols. The Event Data Driven Delivery (ED3) Framework, funded by NASA Applied Sciences, provides the delivery of data based on predetermined subscriptions and associated workflows to various communities of end users. ED3's capabilities are used by scientists, as well as policy and resource managers, when event alerts are triggered to respond to their needs. The EarthCube Integration and Testing Environment (ECITE) Assessment Framework for Technology Interoperability and Integration is being developed to facilitate the EarthCube community's assessment of NSF-funded technologies addressing Earth science problems. ECITE is addressing the translation of geoscience researchers' use cases into technology use cases that apply EarthCube-funded building block technologies (and other existing technologies) to solving science problems. EarthCube criteria for technology assessment include the use of data, metadata and service standards to improve interoperability and integration across program components. The long-range benefit will be the growth of a cyberinfrastructure with technology components that have been shown to work together to solve known science objectives.

  18. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions at the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  19. The Evolving Context for Science and Society

    Science.gov (United States)

    Leshner, Alan I.

    2012-01-01

    The relationship between science and the rest of society is critical both to the support it receives from the public and to the receptivity of the broader citizenry to science's explanations of the nature of the world and to its other outputs. Science's ultimate usefulness depends on a receptive public. For example, given that science and technology are imbedded in virtually every issue of modern life, either as a cause or a cure, it is critical that the relationship be strong and that the role of science is well appreciated by society, or the impacts of scientific advances will fall short of their great potential. Unfortunately, a variety of problems have been undermining the science-society relationship for over a decade. Some problems emerge from within the scientific enterprise - like scientific misconduct or conflicts of interest - and tarnish or weaken its image and credibility. Other problems and stresses come from outside the enterprise. The most obvious external pressure is that the world economic situation is undermining the financial support of both the conduct and infrastructure of science. Other examples of external pressures include conflicts between what science is revealing and political or economic expediency - e.g., global climate change - or instances where scientific advances encroach upon core human values or beliefs - e.g., scientific understanding of the origins and evolution of the universe as compared to biblical accounts of creation. Significant efforts - some dramatically non-traditional for many in the scientific community - are needed to restore balance to the science-society relationship.

  20. Nonalcoholic fatty liver disease: Evolving paradigms

    Science.gov (United States)

    Lonardo, Amedeo; Nascimbeni, Fabio; Maurantonio, Mauro; Marrazzo, Alessandra; Rinaldi, Luca; Adinolfi, Luigi Elio

    2017-01-01

    In the last years new evidence has accumulated on nonalcoholic fatty liver disease (NAFLD) challenging the paradigms that had been holding the scene over the previous 30 years. NAFLD has such an epidemic prevalence as to make it impossible to screen the general population looking for NAFLD cases. Conversely, focusing on those cohorts of individuals exposed to the highest risk of NAFLD could be a more rational approach. NAFLD, which can be diagnosed with either non-invasive strategies or through liver biopsy, is a pathogenically complex and clinically heterogeneous disease. The existence of metabolic as opposed to genetic-associated disease, notably including "lean NAFLD", has recently been recognized. Moreover, NAFLD is a systemic condition, featuring metabolic, cardiovascular and (hepatic/extra-hepatic) cancer risk. Among the clinico-laboratory features of NAFLD we discuss hyperuricemia, insulin resistance, atherosclerosis, gallstones, psoriasis and selected endocrine derangements. NAFLD is a precursor of type 2 diabetes (T2D) and metabolic syndrome, and progressive liver disease develops in T2D patients in whom the course of disease is worsened by NAFLD. Finally, lifestyle changes and drug treatment options to be implemented in the individual patient are also critically discussed. In conclusion, this review emphasizes the new concepts on the clinical and pathogenic heterogeneity of NAFLD, a systemic disorder with a multifactorial pathogenesis and protean clinical manifestations. It is highly prevalent in certain cohorts of individuals who are thus potentially amenable to selective screening strategies, intensive follow-up schedules for early identification of liver-related and extrahepatic complications and in whom earlier and more aggressive treatment schedules should be carried out whenever possible. PMID:29085206

  1. Evolving a polymerase for hydrophobic base analogues.

    Science.gov (United States)

    Loakes, David; Gallego, José; Pinheiro, Vitor B; Kool, Eric T; Holliger, Philipp

    2009-10-21

    Hydrophobic base analogues (HBAs) have shown great promise for the expansion of the chemical and coding potential of nucleic acids but are generally poor polymerase substrates. While extensive synthetic efforts have yielded examples of HBAs with favorable substrate properties, their discovery has remained challenging. Here we describe a complementary strategy for improving HBA substrate properties by directed evolution of a dedicated polymerase using compartmentalized self-replication (CSR) with the archetypal HBA 5-nitroindole (d5NI) and its derivative 5-nitroindole-3-carboxamide (d5NIC) as selection substrates. Starting from a repertoire of chimeric polymerases generated by molecular breeding of DNA polymerase genes from the genus Thermus, we isolated a polymerase (5D4) with a generically enhanced ability to utilize HBAs. The selected polymerase 5D4 was able to form and extend d5NI and d5NIC (d5NI(C)) self-pairs as well as d5NI(C) heteropairs with all four bases with efficiencies approaching, or exceeding, those of the cognate Watson-Crick pairs, despite significant distortions caused by the intercalation of the d5NI(C) heterocycles into the opposing strand base stack, as shown by nuclear magnetic resonance (NMR) spectroscopy. Unlike Taq polymerase, 5D4 was also able to extend HBA pairs such as pyrene:φ (abasic site), d5NI:φ, and isocarbostyril (ICS):7-azaindole (7AI), allowed bypass of a chemically diverse spectrum of HBAs, and enabled PCR amplification with primers comprising multiple d5NI(C) substitutions, while maintaining high levels of catalytic activity and fidelity. The selected polymerase 5D4 promises to expand the range of nucleobase analogues amenable to replication and should find numerous applications, including the synthesis and replication of nucleic acid polymers with expanded chemical and functional diversity.

  2. Origins of fluorescence in evolved bacteriophytochromes.

    Science.gov (United States)

    Bhattacharya, Shyamosree; Auldridge, Michele E; Lehtivuori, Heli; Ihalainen, Janne A; Forest, Katrina T

    2014-11-14

    Use of fluorescent proteins to study in vivo processes in mammals requires near-infrared (NIR) biomarkers that exploit the ability of light in this range to penetrate tissue. Bacteriophytochromes (BphPs) are photoreceptors that couple absorbance of NIR light to photoisomerization, protein conformational changes, and signal transduction. BphPs have been engineered to form NIR fluorophores, including IFP1.4, Wi-Phy, and the iRFP series, initially by replacement of Asp-207 by His. This position was suggestive because its main chain carbonyl is within hydrogen-bonding distance to pyrrole ring nitrogens of the biliverdin chromophore, thus potentially functioning as a crucial transient proton sink during photoconversion. To explain the origin of fluorescence in these phytofluors, we solved the crystal structures of IFP1.4 and a comparison non-fluorescent monomeric phytochrome DrCBDmon. Met-186 and Val-288 in IFP1.4 are responsible for the formation of a tightly packed hydrophobic hub around the biliverdin D ring. Met-186 is also largely responsible for the blue-shifted IFP1.4 excitation maximum relative to the parent BphP. The structure of IFP1.4 revealed decreased structural heterogeneity and a contraction of two surface regions as direct consequences of side chain substitutions. Unexpectedly, IFP1.4 with Asp-207 reinstalled (IFPrev) has a higher fluorescence quantum yield (∼9%) than most NIR phytofluors published to date. In agreement, fluorescence lifetime measurements confirm the exceptionally long excited state lifetimes, up to 815 ps, in IFP1.4 and IFPrev. Our research helps delineate the origin of fluorescence in engineered BphPs and will facilitate the wide-spread adoption of phytofluors as biomarkers. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  3. A changed name with an evolving function.

    Science.gov (United States)

    Xie, Z

    1995-12-01

    Changes in family planning, which took place in 1994, are described for the Mianzhu County Family Planning Committee and other townships in Sichuan Province. The Committee changed its name to Population Committee. The administrative structure changed at the town and township level. The Secretary of the Chinese Communist Party assigned the former Director of the township Family Planning Office to serve as Director of the General Office of township Population Committee. This administrative change did not take place in the county office. Reforms at the county level were expected to be more gradual, since there was no other model elsewhere in China to follow. The name change reflected a change in function and not a decline in family planning. The function will include implementation, management, and coordination instead of just fertility control. The Committee joined with the Women's Federation in offering premarital education to young people and in establishing a kindergarten for 3-5 year old children. In Qifu there were 18 township businesses, which hired surplus labor. In Qifu preferential treatment in hiring was given to single-child and two-daughter families. Wage labor has resulted in higher income and less time in the fields. The average Qifu township income in 1994 was 1250 yuan. 3200 of the 6100 single-child households were given elderly insurance by the Population Committee. In Dongbei town 4173 households had single children (56.4% of total households). In 1994 average household yearly income was 1400 yuan. 3350 households (80.2% of total single-child households) had an average yearly income of 1500-3000 yuan. 307 households (7.5%) had a yearly income of 3000-5000 yuan. 100 households (2.5%) had income greater than 5000 yuan.

  4. Diabetes benefit management: evolving strategies for payers.

    Science.gov (United States)

    Tzeel, Albert L

    2011-11-01

    Over the next quarter century, the burden of type 2 diabetes mellitus (T2DM) is expected to at least double. Currently, 1 in every 10 healthcare dollars is spent on diabetes management; by 2050, it has been projected that the annual costs of managing T2DM will rise to $336 billion. Without substantial, systemic changes, T2DM management costs will lead to a potentially untenable strain on the healthcare system. However, the appropriate management of diabetes can reduce associated mortality and delay comorbidities. In addition, adequate glycemic control can improve patient outcomes and significantly reduce diabetes-related complications. This article provides an overview of key concepts associated with a value-based insurance design (VBID) approach to T2DM coverage. By promoting the use of services or treatments that provide high benefits relative to cost, and by alternatively discouraging patients from utilizing services whose benefits do not justify their cost, VBID improves the quality of healthcare while simultaneously reining in spending. VBID initiatives tend to focus on chronic disease management and generally target prescription drug use. However, some programs have expanded their scope by incorporating services traditionally offered by wellness and disease management programs. The concept of VBID is growing, and it is increasingly being implemented by a diverse and growing number of public and private entities, including pharmacy benefit managers, health plans, and employers. This article provides key background on VBID strategies, with a focus on T2DM management. It also provides a road map for health plans seeking to implement VBID as part of their programs.

  5. Clustering evolving proteins into homologous families.

    Science.gov (United States)

    Chan, Cheong Xin; Mahbob, Maisarah; Ragan, Mark A

    2013-04-08

    Clustering sequences into groups of putative homologs (families) is a critical first step in many areas of comparative biology and bioinformatics. The performance of clustering approaches in delineating biologically meaningful families depends strongly on characteristics of the data, including content bias and degree of divergence. New, highly scalable methods have recently been introduced to cluster the very large datasets being generated by next-generation sequencing technologies. However, there has been little systematic investigation of how characteristics of the data impact the performance of these approaches. Using clusters from a manually curated dataset as reference, we examined the performance of a widely used graph-based Markov clustering algorithm (MCL) and a greedy heuristic approach (UCLUST) in delineating protein families coded by three sets of bacterial genomes of different G+C content. Both MCL and UCLUST generated clusters that are comparable to the reference sets at specific parameter settings, although UCLUST tends to under-cluster compositionally biased sequences (G+C content 33% and 66%). Using simulated data, we sought to assess the individual effects of sequence divergence, rate heterogeneity, and underlying G+C content. Performance decreased with increasing sequence divergence, decreasing among-site rate variation, and increasing G+C bias. Two MCL-based methods recovered the simulated families more accurately than did UCLUST. MCL using local alignment distances is more robust across the investigated range of sequence features than are greedy heuristics using distances based on global alignment. Our results demonstrate that sequence divergence, rate heterogeneity and content bias can individually and in combination affect the accuracy with which MCL and UCLUST can recover homologous protein families. For application to data that are more divergent, and exhibit higher among-site rate variation and/or content bias, MCL may often be the better
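
    For readers unfamiliar with MCL, the sketch below implements its two alternating steps, expansion (spreading flow by matrix powers) and inflation (strengthening strong flows by elementwise powers), on a small similarity matrix. It is a bare-bones Python illustration of the algorithm's core loop, not the tuned pipeline or parameter settings used in the study.

        import numpy as np

        def mcl(adjacency, expansion=2, inflation=2.0, n_iter=100, tol=1e-6):
            # Minimal Markov clustering on a symmetric similarity matrix.
            M = adjacency.astype(float) + np.eye(len(adjacency))   # add self-loops
            M /= M.sum(axis=0, keepdims=True)                      # column-stochastic
            for _ in range(n_iter):
                prev = M.copy()
                M = np.linalg.matrix_power(M, expansion)           # expansion
                M = M ** inflation                                 # inflation
                M /= M.sum(axis=0, keepdims=True)
                if np.abs(M - prev).max() < tol:
                    break
            # In the converged matrix, non-empty rows mark attractors; their
            # nonzero columns are the members of one cluster.
            clusters = [frozenset(np.flatnonzero(row > 1e-6)) for row in M if row.max() > 1e-6]
            return sorted(set(clusters), key=min)

        # two obvious groups: {0, 1, 2} and {3, 4}
        A = np.array([[0, 1, 1, 0, 0],
                      [1, 0, 1, 0, 0],
                      [1, 1, 0, 0, 0],
                      [0, 0, 0, 0, 1],
                      [0, 0, 0, 1, 0]])
        print(mcl(A))    # [frozenset({0, 1, 2}), frozenset({3, 4})]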

  6. (N+1)-dimensional Lorentzian evolving wormholes supported by polytropic matter

    Energy Technology Data Exchange (ETDEWEB)

    Cataldo, Mauricio [Universidad del Bio-Bio, Departamento de Fisica, Facultad de Ciencias, Concepcion (Chile); Arostica, Fernanda; Bahamonde, Sebastian [Universidad de Concepcion, Departamento de Fisica, Concepcion (Chile)

    2013-08-15

    In this paper we study (N+1)-dimensional evolving wormholes supported by energy satisfying a polytropic equation of state. The considered evolving wormhole models are described by a constant redshift function and generalize the standard flat Friedmann-Robertson-Walker spacetime. The polytropic equation of state allows us to consider, in (3+1) dimensions, generalizations of the phantom energy and generalized Chaplygin gas sources. (orig.)
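
    For reference, a polytropic equation of state in this context is commonly written in the form below (standard notation, not necessarily the paper's own); a linear equation of state, as used for phantom energy, and a generalized Chaplygin gas are recovered for particular choices of the constant and the index.

        p = K \rho^{\,1 + \frac{1}{n}}, \qquad
        p = \omega \rho \quad (\text{linear limit}), \qquad
        p = -\frac{A}{\rho^{\alpha}} \quad (\text{generalized Chaplygin gas}).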

  7. Ecotourism and Interpretation in Costa Rica: Parallels and Peregrinations.

    Science.gov (United States)

    Williams, Wayne E.

    1994-01-01

    Discusses the ecotourism industry in Costa Rica and some of the problems faced by its national park system, including megaparks, rapid increase in tourism, and interpretive services. Suggests alternatives for the problems. (MKR)

  8. Interpretative Communities in Conflict: A Master Syllabus for Political Communication.

    Science.gov (United States)

    Smith, Craig Allen

    1992-01-01

    Advocates the interpretive communities approach to teaching political communication. Discusses philosophical issues in the teaching of political communication courses, and pedagogical techniques (including concepts versus cases, clustering examples, C-SPAN video examples, and simulations and games). (SR)

  9. Federal Motor Vehicle Safety Standards Interpretations

    Data.gov (United States)

    Department of Transportation — NHTSA's Chief Counsel interprets the statutes that the agency administers and the regulations that it promulgates. The Chief Counsel's interpretations, issued in the...

  10. Including climate change in energy investment decisions

    International Nuclear Information System (INIS)

    Ybema, J.R.; Boonekamp, P.G.M.; Smit, J.T.J.

    1995-08-01

    To properly take climate change into account in the analysis of energy investment decisions, it is necessary to apply decision analysis methods that are capable of considering the specific characteristics of climate change (large uncertainties, long time horizon). Such decision analysis methods do exist. They can explicitly include evolving uncertainties, multi-stage decisions, cumulative effects and risk-averse attitudes. Various methods are considered in this report and two of them have been selected: hedging calculations and sensitivity analysis. These methods are applied to illustrative examples, and their limitations are discussed. The examples are (1a) space heating and hot water for new houses from a private investor perspective and (1b) as example (1a) but from a government perspective, (2) electricity production with an integrated coal gasification combined cycle (ICGCC) with or without CO2 removal, and (3) a national energy strategy to hedge against climate change. 9 figs., 21 tabs., 42 refs., 1 appendix
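
    The flavour of such a hedging calculation with sensitivity analysis can be sketched in a few lines of Python: choose a first-stage investment that minimizes expected lifetime cost over climate-policy scenarios, then vary the scenario probabilities to see where the preferred choice flips. All option names, costs and probabilities below are invented for illustration and are not taken from the report.

        # Illustrative two-stage hedging calculation for a heating investment
        # under an uncertain future carbon price.
        options = {
            "gas_boiler": {"capex": 3.0, "fuel": 1.2, "co2_per_year": 2.0},
            "heat_pump":  {"capex": 12.0, "fuel": 0.8, "co2_per_year": 0.5},
        }
        scenarios = {   # future carbon price per tonne and its probability
            "weak_policy":   {"carbon_price": 0.00, "prob": 0.5},
            "strong_policy": {"carbon_price": 0.15, "prob": 0.5},
        }
        YEARS = 20

        def expected_cost(option):
            fixed = option["capex"] + YEARS * option["fuel"]
            carbon = sum(s["prob"] * s["carbon_price"] * option["co2_per_year"] * YEARS
                         for s in scenarios.values())
            return fixed + carbon

        def best_option():
            return min(options, key=lambda name: expected_cost(options[name]))

        # sensitivity analysis: how does the choice depend on the probability
        # of strong climate policy?
        for p in (0.1, 0.3, 0.5, 0.7, 0.9):
            scenarios["strong_policy"]["prob"] = p
            scenarios["weak_policy"]["prob"] = 1 - p
            print(p, best_option(), round(expected_cost(options[best_option()]), 2))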

  11. Rate My Stake: Interpretation of Ordinal Stake Ratings

    Science.gov (United States)

    Patricia Lebow; Grant Kirker

    2014-01-01

    Ordinal rating systems are commonly employed to evaluate biodeterioration of wood exposed outdoors over long periods of time. The purpose of these ratings is to compare the durability of test systems to nondurable wood products or known durable wood products. There are many reasons why these systems have evolved as the chosen method of evaluation, including having an...

  12. An Evolved System of Radiological Protection

    International Nuclear Information System (INIS)

    Kaneko, M.

    2004-01-01

    The current system of radiological protection based on the Linear No-Threshold (LNT) hypothesis has greatly contributed to the minimization of doses received by workers and members of the public. However, it has brought about 'radiophobia' among people and waste of resources due to over-regulation, because the LNT implies that radiation is harmful no matter how small the dose is. The author reviewed the results of research on health effects of radiation, including major epidemiological studies on radiation workers, and found no clear evidence of deleterious health effects from radiation exposures below the current maximum dose limits (50 mSv/y for workers and 5 mSv/y for members of the public), which have been adopted worldwide in the second half of the 20th century. Now that the existence of bio-defensive mechanisms such as DNA repair, apoptosis and adaptive response is well recognized, the linearity assumption cannot be said to be 'scientific'. Evidence increasingly implies that there are thresholds in the risk of radiation. A concept of practical thresholds or virtually safe doses will have to be introduced into the new system of radiological protection in order to resolve the low dose issues. Practical thresholds may be defined as dose levels below which induction of detectable radiogenic cancers or hereditary effects is not expected. If any workers and members of the public do not gain benefits from being exposed, excepting intentional irradiation for medical purposes, their radiation exposures should be kept below practical thresholds. On the assumption that the current dose limits are below practical thresholds and carry no radiation detriments, there is no need for justification and optimization (ALARA) principles for occupational and public exposures. Then the ethical issue of justification to allow benefit to society to offset radiation detriments to individuals can be resolved. And also the ethical issue of optimization to exchange health or safety for

  13. Directionality effects in simultaneous language interpreting: the case of sign language interpreters in The Netherlands.

    Science.gov (United States)

    Van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of The Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives was assessed by 5 certified sign language interpreters who did not participate in the study. Two measures were used to assess interpreting quality: the propositional accuracy of the interpreters' interpretations and a subjective quality measure. The results showed that the interpreted narratives in the SLN-to-Dutch interpreting direction were of lower quality (on both measures) than the interpreted narratives in the Dutch-to-SLN and Dutch-to-SSD directions. Furthermore, interpreters who had begun acquiring SLN when they entered the interpreter training program performed as well in all 3 interpreting directions as interpreters who had acquired SLN from birth.

  14. MRI for clinically suspected pediatric appendicitis: case interpretation

    International Nuclear Information System (INIS)

    Moore, Michael M.; Brian, James M.; Methratta, Sosamma T.; Hulse, Michael A.; Choudhary, Arabinda K.; Eggli, Kathleen D.; Boal, Danielle K.B.

    2014-01-01

    As utilization of MRI for clinically suspected pediatric appendicitis becomes more common, there will be increased focus on case interpretation. The purpose of this pictorial essay is to share our institution's case interpretation experience. MRI findings of appendicitis include appendicoliths, tip appendicitis, intraluminal fluid-debris level, pitfalls of size measurements, and complications including abscesses. The normal appendix and inguinal appendix are also discussed. (orig.)

  15. MRI for clinically suspected pediatric appendicitis: case interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Michael M.; Brian, James M.; Methratta, Sosamma T.; Hulse, Michael A.; Choudhary, Arabinda K.; Eggli, Kathleen D.; Boal, Danielle K.B. [Penn State Milton S. Hershey Medical Center, Division of Pediatric Radiology, Department of Radiology, Hershey, PA (United States)

    2014-05-15

    As utilization of MRI for clinically suspected pediatric appendicitis becomes more common, there will be increased focus on case interpretation. The purpose of this pictorial essay is to share our institution's case interpretation experience. MRI findings of appendicitis include appendicoliths, tip appendicitis, intraluminal fluid-debris level, pitfalls of size measurements, and complications including abscesses. The normal appendix and inguinal appendix are also discussed. (orig.)

  16. Design and interpretation of anthropometric and fitness testing of basketball players.

    Science.gov (United States)

    Drinkwater, Eric J; Pyne, David B; McKenna, Michael J

    2008-01-01

    The volume of literature on fitness testing in court sports such as basketball is considerably less than for field sports or individual sports such as running and cycling. Team sport performance is dependent upon a diverse range of qualities including size, fitness, sport-specific skills, team tactics, and psychological attributes. The game of basketball has evolved such that coaches and players place a high priority on body size and physical fitness. A player's size has a large influence on the position in the team, while the high-intensity, intermittent nature of the physical demands requires players to have a high level of fitness. Basketball coaches and sport scientists often use a battery of sport-specific physical tests to evaluate body size and composition, and aerobic fitness and power. This testing may be used to track changes within athletes over time to evaluate the effectiveness of training programmes or to screen players for selection. Sports science research is establishing typical (or 'reference') values for both within-athlete changes and between-athlete differences. Newer statistical approaches such as magnitude-based inferences have emerged that provide more meaningful interpretation of fitness testing results in the field for coaches and athletes. Careful selection and implementation of tests, and more pertinent interpretation of data, will enhance the value of fitness testing in high-level basketball programmes. This article presents reference values of fitness and body size in basketball players, and identifies practical methods of interpreting changes within players and differences between players beyond the null-hypothesis.
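
    A magnitude-based interpretation of a single player's change score can be sketched as follows: given the test's typical error and a smallest worthwhile change, compute the chances that the true change is beneficial, trivial or harmful. This is a simplified Python illustration under a normality assumption, with made-up numbers, not the specific procedure or thresholds recommended in the article.

        from math import erf, sqrt

        def chances(observed_change, typical_error, smallest_worthwhile):
            # Uncertainty of an individual change score across two test
            # occasions is taken as typical_error * sqrt(2).
            sd = typical_error * sqrt(2)
            cdf = lambda x: 0.5 * (1 + erf(x / sqrt(2)))   # standard normal CDF
            p_beneficial = 1 - cdf((smallest_worthwhile - observed_change) / sd)
            p_harmful = cdf((-smallest_worthwhile - observed_change) / sd)
            return p_beneficial, 1 - p_beneficial - p_harmful, p_harmful

        # e.g. a 1.5% improvement in a jump test, typical error 1.0%,
        # smallest worthwhile change 0.8%
        beneficial, trivial, harmful = chances(1.5, 1.0, 0.8)
        print(round(beneficial, 2), round(trivial, 2), round(harmful, 2))   # 0.69 0.26 0.05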

  17. Modeling and interpretation of images*

    Directory of Open Access Journals (Sweden)

    Min Michiel

    2015-01-01

    Full Text Available Imaging protoplanetary disks is a challenging but rewarding task. It is challenging because of the glare of the central star outshining the weak signal from the disk at shorter wavelengths and because of the limited spatial resolution at longer wavelengths. It is rewarding because it contains a wealth of information on the structure of the disks and can directly probe things like gaps and spiral structure. Because it is so challenging, telescopes are often pushed to their limitations to get a signal. Proper interpretation of these images therefore requires intimate knowledge of the instrumentation, the detection method, and the image processing steps. In this chapter I will give some examples and stress some issues that are important when interpreting images from protoplanetary disks.

  18. An objective interpretation of Lagrangian quantum mechanics

    International Nuclear Information System (INIS)

    Roberts, K.V.

    1978-01-01

    Unlike classical mechanics, the Copenhagen interpretation of quantum mechanics does not provide an objective space-time picture of the actual history of a physical system. This paper suggests how the conceptual foundations of quantum mechanics can be reformulated, without changing the mathematical content of the theory or its detailed agreement with experiment and without introducing any hidden variables, in order to provide an objective, covariant, Lagrangian description of reality which is deterministic and time-symmetric on the microscopic scale. The basis of this description can be expressed either as an action functional or as a summation over Feynman diagrams or paths. The probability laws associated with the quantum-mechanical measurement process, and the asymmetry in time of the principles of macroscopic causality and of the laws of statistical mechanics, are interpreted as consequences of the particular boundary conditions that apply to the actual universe. The objective interpretation does not include the observer and the measurement process among the fundamental concepts of the theory, but it does not entail a revision of the ideas of determinism and of time, since in a Lagrangian theory both initial and final boundary conditions on the action functional are required. (author)

  19. Quantum mechanics interpretation: scalled debate

    International Nuclear Information System (INIS)

    Sanchez Gomez, J. L.

    2000-01-01

    This paper discusses the two main issues of the so-called quantum debate that started in 1927 with the famous Bohr-Einstein controversy, namely non-separability and the projection postulate. Relevant interpretations and formulations of quantum mechanics are critically analyzed in the light of these issues. The treatment focuses chiefly on fundamental points, so that technical ones are practically not dealt with here. (Author) 20 refs

  20. Topological interpretation of Luttinger theorem

    OpenAIRE

    Seki, Kazuhiro; Yunoki, Seiji

    2017-01-01

    Based solely on the analytical properties of the single-particle Green's function of fermions at finite temperatures, we show that the generalized Luttinger theorem inherently possesses topological aspects. The topological interpretation of the generalized Luttinger theorem can be introduced because i) the Luttinger volume is represented as the winding number of the single-particle Green's function and thus ii) the deviation of the theorem, expressed with a ratio between the interacting and n...

  1. Operational interpretations of quantum discord

    International Nuclear Information System (INIS)

    Cavalcanti, D.; Modi, K.; Aolita, L.; Boixo, S.; Piani, M.; Winter, A.

    2011-01-01

    Quantum discord quantifies nonclassical correlations beyond the standard classification of quantum states into entangled and unentangled. Although it has received considerable attention, it still lacks any precise interpretation in terms of some protocol in which quantum features are relevant. Here we give quantum discord its first information-theoretic operational meaning in terms of entanglement consumption in an extended quantum-state-merging protocol. We further relate the asymmetry of quantum discord with the performance imbalance in quantum state merging and dense coding.
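
    For reference, the textbook definition underlying the abstract can be written as follows (standard notation; the paper's own conventions may differ). Discord with respect to measurements on subsystem B is the gap between total and classical correlations:

        \mathcal{I}(A\!:\!B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}), \qquad
        \mathcal{J}_B(A|B) = S(\rho_A) - \min_{\{\Pi_j^B\}} \sum_j p_j\, S(\rho_{A|j}),

        \mathcal{D}_B(\rho_{AB}) = \mathcal{I}(A\!:\!B) - \mathcal{J}_B(A|B) \;\ge\; 0,

    where S is the von Neumann entropy, the minimization runs over measurements {Π_j^B} on B, and ρ_{A|j} is the state of A conditioned on outcome j; exchanging the roles of A and B generally gives a different value, which is the asymmetry the abstract relates to quantum state merging.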

  2. Defunctionalized Interpreters for Programming Languages

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2008-01-01

    by Reynolds in ``Definitional Interpreters for Higher-Order Programming Languages'' for functional implementations of denotational semantics, natural semantics, and big-step abstract machines using closure conversion, CPS transformation, and defunctionalization. Over the last few years, the author and his...... operational semantics can be expressed as a reduction semantics: for deterministic languages, a reduction semantics is a structural operational semantics in continuation style, where the reduction context is a defunctionalized continuation. As the defunctionalized counterpart of the continuation of a one...
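
    As a minimal illustration of the defunctionalization step described above (a sketch written in Python purely for self-containedness; Reynolds' and Danvy's work is phrased in functional languages such as Scheme and ML), the continuation of a small arithmetic evaluator is represented as a first-order data type together with an apply function:

```python
from dataclasses import dataclass
from typing import Union

# Expressions: literals and additions
@dataclass
class Lit:
    value: int

@dataclass
class Add:
    left: "Expr"
    right: "Expr"

Expr = Union[Lit, Add]

# Defunctionalized continuations: each constructor records exactly what the
# corresponding closure would have captured.
@dataclass
class Done:                      # top-level continuation
    pass

@dataclass
class AddRight:                  # "evaluate the right operand next"
    right: Expr
    k: "Cont"

@dataclass
class AddLeft:                   # "left operand already evaluated"
    left_value: int
    k: "Cont"

Cont = Union[Done, AddRight, AddLeft]

def evaluate(expr: Expr, k: Cont) -> int:
    """Evaluator in continuation style, with first-order continuations."""
    if isinstance(expr, Lit):
        return apply_cont(k, expr.value)
    elif isinstance(expr, Add):
        return evaluate(expr.left, AddRight(expr.right, k))
    raise TypeError(f"unknown expression: {expr!r}")

def apply_cont(k: Cont, value: int) -> int:
    """The 'apply' function produced by defunctionalizing the continuation."""
    if isinstance(k, Done):
        return value
    elif isinstance(k, AddRight):
        return evaluate(k.right, AddLeft(value, k.k))
    elif isinstance(k, AddLeft):
        return apply_cont(k.k, k.left_value + value)
    raise TypeError(f"unknown continuation: {k!r}")

# Example: (1 + 2) + 3 evaluates to 6
print(evaluate(Add(Add(Lit(1), Lit(2)), Lit(3)), Done()))
```

    Reading the Done/AddRight/AddLeft constructors as a reduction context makes concrete the correspondence mentioned above between a structural operational semantics in continuation style and a reduction semantics.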

  3. Interpreting radiographs. 4. The carpus

    International Nuclear Information System (INIS)

    Burguez, P.N.

    1984-01-01

    The complexity of the carpus, which has three major joints, seven or eight carpal bones and five adjacent bones, each of which articulates with one or more of the carpal elements, necessitates good quality radiographs for definitive interpretation. Radiographic interpretation may be extremely difficult because of the disparity between radiographic changes and obvious clinical signs and, therefore, the findings must be discussed in the light of a thorough clinical assessment.

  4. Paris convention - Decisions, recommendations, interpretations

    International Nuclear Information System (INIS)

    1990-01-01

    This booklet is published in a single edition in English and French. It contains decisions, recommendations and interpretations concerning the 1960 Paris Convention on Third Party Liability in the Field of Nuclear Energy adopted by the OECD Steering Committee and the OECD Council. All the instruments are set out according to the Article of the Convention to which they relate and explanatory notes are added where necessary.

  5. EEVEE: the Empathy-Enhancing Virtual Evolving Environment

    Directory of Open Access Journals (Sweden)

    Philip L. Jackson

    2015-03-01

    Empathy is a multifaceted emotional and mental faculty that is often found to be affected in a great number of psychopathologies, including schizophrenia, yet it remains very difficult to measure in an ecological context. The challenge stems partly from the complexity and fluidity of this social process, but also from its covert nature. A powerful tool to enhance experimental control over such dynamic social interactions is the use of avatars in virtual reality (VR), and one way to collect information about an individual in an interaction is through the analysis of his or her neurophysiological and behavioural responses. We have developed a unique platform, the Empathy-Enhancing Virtual Evolving Environment (EEVEE), which is built around three main components: (1) different avatars capable of expressing feelings and emotions at various levels based on the Facial Action Coding System (FACS); (2) systems for measuring the physiological responses of the observer (heart and respiration rate, skin conductance, gaze and eye movements, facial expression); and (3) a multimodal interface linking the avatar’s behaviour to the observer’s neurophysiological response. In this article, we provide a detailed description of the components of this innovative platform and validation data from the first phases of development. Our data show that healthy adults can discriminate different negative emotions, including pain, expressed by avatars at varying intensities. We also provide evidence that masking part of an avatar’s face (top or bottom half) does not prevent the detection of different levels of pain. Overall, this innovative and flexible platform provides a unique tool to study and even modulate empathy in a comprehensive and ecological manner in a number of populations suffering from neurological or psychiatric disorders.

  6. Deaf-Blind Interpreting: Building on What You Already Know

    Directory of Open Access Journals (Sweden)

    Karen Petronio

    2010-10-01

    http://dx.doi.org/10.5007/2175-7968.2010v2n26p237 This article focuses on visual considerations and describes the numerous similarities between video interpreting and deaf-blind interpreting. It also looks at linguistic considerations for deaf-blind interpreting and presents research findings showing similarities and differences between ASL and Tactile ASL. Because many interpreters are unfamiliar with tactile communication, there is a section that includes an overview of Tactile ASL. The issues, descriptions, and data presented in this article are based on situations in the United States and involve the use of ASL and Tactile ASL; however, it is highly likely that these discussions and findings also relate to deaf-blind interpreting done in other countries using other sign languages.

  7. Challenges for fuel cells as stationary power resource in the evolving energy enterprise

    Science.gov (United States)

    Rastler, Dan

    The primary market challenges for fuel cells as stationary power resources in evolving energy markets are reviewed. Fuel cell power systems have significant barriers to overcome in their anticipated role as decentralized energy power systems. Market segments for fuel cells include combined heat and power; low-cost energy, premium power; peak shaving; and load management and grid support. Understanding the role and fit of fuel cell systems in evolving energy markets and the highest value applications are a major challenge for developers and government funding organizations. The most likely adopters of fuel cell systems and the challenges facing each adopter in the target market segment are reviewed. Adopters include generation companies, utility distribution companies, retail energy service providers and end-users. Key challenges include: overcoming technology risk; achieving retail competitiveness; understanding high value markets and end-user needs; distribution and service channels; regulatory policy issues; and the integration of these decentralized resources within the electrical distribution system.

  8. Interpreter in Criminal Cases: Allrounders First!

    Science.gov (United States)

    Frid, Arthur

    1974-01-01

    The interpreter in criminal cases generally has had a purely linguistic training with no difference from the education received by his colleague interpreters. The position of interpreters in criminal cases is vague and their role depends to a large extent on individual interpretation of officials involved in the criminal procedure. Improvements on…

  9. Patterns of Communication through Interpreters: A Detailed Sociolinguistic Analysis

    Science.gov (United States)

    Aranguri, Cesar; Davidson, Brad; Ramirez, Robert

    2006-01-01

    BACKGROUND Numerous articles have detailed how the presence of an interpreter leads to less satisfactory communication with physicians; few have studied how actual communication takes place through an interpreter in a clinical setting. OBJECTIVE Record and analyze physician-interpreter-patient interactions. DESIGN Primary care physicians with high-volume Hispanic practices were recruited for a communication study. Dyslipidemic Hispanic patients, either monolingual Spanish or bilingual Spanish-English, were recruited on the day of a normally scheduled appointment and, once consented, recorded without a researcher present in the room. Separate postvisit interviews were conducted with the patient and the physician. All interactions were fully transcribed and analyzed. PARTICIPANTS Sixteen patients were recorded interacting with 9 physicians. Thirteen patients used an interpreter with 8 physicians, and 3 patients spoke Spanish with the 1 bilingual physician. APPROACH Transcript analysis based on sociolinguistic and discourse analytic techniques, including but not limited to time speaking, analysis of questions asked and answered, and the loss of semantic information. RESULTS Speech was significantly reduced and revised by the interpreter, resulting in an alteration of linguistic features such as content, meaning, reinforcement/validation, repetition, and affect. In addition, visits that included an interpreter had virtually no rapport-building “small talk,” which typically enables the physician to gain comprehensive patient history, learn clinically relevant information, and increase emotional engagement in treatment. CONCLUSIONS The presence of an interpreter increases the difficulty of achieving good physician-patient communication. Physicians and interpreters should be trained in the process of communication and interpretation, to minimize conversational loss and maximize the information and relational exchange with interpreted patients. PMID:16808747

  10. Electrocardiographic interpretation skills of cardiology residents: are they competent?

    Science.gov (United States)

    Sibbald, Matthew; Davies, Edward G; Dorian, Paul; Yu, Eric H C

    2014-12-01

    Achieving competency at electrocardiogram (ECG) interpretation among cardiology subspecialty residents has traditionally focused on interpreting a target number of ECGs during training. However, there is little evidence to support this approach. Further, there are no data documenting the competency of ECG interpretation skills among cardiology residents, who become de facto the gold standard in their practice communities. We tested 29 Cardiology residents from all 3 years in a large training program using a set of 20 ECGs collected from a community cardiology practice over a 1-month period. Residents interpreted half of the ECGs using a standard analytic framework, and half using their own approach. Residents were scored on the number of correct and incorrect diagnoses listed. Overall diagnostic accuracy was 58%. Of 6 potentially life-threatening diagnoses, residents missed 36% (123 of 348) including hyperkalemia (81%), long QT (52%), complete heart block (35%), and ventricular tachycardia (19%). Residents provided additional inappropriate diagnoses on 238 ECGs (41%). Diagnostic accuracy was similar between ECGs interpreted using an analytic framework vs ECGs interpreted without an analytic framework (59% vs 58%; F(1,1333) = 0.26; P = 0.61). Cardiology resident proficiency at ECG interpretation is suboptimal. Despite the use of an analytic framework, there remain significant deficiencies in ECG interpretation among Cardiology residents. A more systematic method of addressing these important learning gaps is urgently needed. Copyright © 2014 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  11. Statistics and Data Interpretation for Social Work

    CERN Document Server

    Rosenthal, James

    2011-01-01

    "Without question, this text will be the most authoritative source of information on statistics in the human services. From my point of view, it is a definitive work that combines a rigorous pedagogy with a down to earth (commonsense) exploration of the complex and difficult issues in data analysis (statistics) and interpretation. I welcome its publication.". -Praise for the First Edition. Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes

  12. Holographic Imaging of Evolving Laser-Plasma Structures

    Energy Technology Data Exchange (ETDEWEB)

    Downer, Michael [Univ. of Texas, Austin, TX (United States); Shvets, G. [Univ. of Texas, Austin, TX (United States)

    2014-07-31

    In the 1870s, English photographer Eadweard Muybridge captured motion pictures within one cycle of a horse’s gallop, which settled a hotly debated question of his time by showing that the horse became temporarily airborne. In the 1940s, Manhattan project photographer Berlin Brixner captured a nuclear blast at a million frames per second, and resolved a dispute about the explosion’s shape and speed. In this project, we developed methods to capture detailed motion pictures of evolving, light-velocity objects created by a laser pulse propagating through matter. These objects include electron density waves used to accelerate charged particles, laser-induced refractive index changes used for micromachining, and ionization tracks used for atmospheric chemical analysis, guide star creation and ranging. Our “movies”, like Muybridge’s and Brixner’s, are obtained in one shot, since the laser-created objects of interest are insufficiently repeatable for accurate stroboscopic imaging. Our high-speed photographs have begun to resolve controversies about how laser-created objects form and evolve, questions that previously could be addressed only by intensive computer simulations based on estimated initial conditions. Resolving such questions helps develop better tabletop particle accelerators, atmospheric ranging devices and many other applications of laser-matter interactions. Our photographic methods all begin by splitting one or more “probe” pulses from the laser pulse that creates the light-speed object. A probe illuminates the object and obtains information about its structure without altering it. We developed three single-shot visualization methods that differ in how the probes interact with the object of interest or are recorded. (1) Frequency-Domain Holography (FDH). In FDH, there are 2 probes, like “object” and “reference” beams in conventional holography. Our “object” probe surrounds the light-speed object, like a fleas swarming around a

  14. Modeling and clustering users with evolving profiles in usage streams

    KAUST Repository

    Zhang, Chongsheng; Masseglia, Florent; Zhang, Xiangliang

    2012-01-01

    Today, there is an increasing need for data stream mining technology to discover important patterns on the fly. Existing data stream models and algorithms commonly assume that users' records or profiles in data streams will not be updated or revised once they arrive. Nevertheless, in various applications such as Web usage, the records/profiles of the users can evolve over time. This kind of streaming data evolves in two forms, the streaming of tuples or transactions as in the case of traditional data streams, and more importantly, the evolving of user records/profiles inside the streams. Such data streams bring difficulties in modeling and clustering for exploring users' behaviors. In this paper, we propose three models to summarize this kind of data streams, which are the batch model, the Evolving Objects (EO) model and the Dynamic Data Stream (DDS) model. Through creating, updating and deleting user profiles, these models summarize the behaviors of each user as a profile object. Based upon these models, clustering algorithms are employed to discover interesting user groups from the profile objects. We have evaluated all the proposed models on a large real-world data set, showing that the DDS model summarizes the data streams with evolving tuples more efficiently and effectively, and provides a better basis for clustering users than the other two models. © 2012 IEEE.
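
    The batch, EO and DDS models themselves are not spelled out in the abstract; purely as a hypothetical sketch of the general idea (one profile object per user, updated as that user's records evolve, then clustered to find user groups), something along the following lines could be written. The exponential-decay update and the use of k-means are assumptions made here for illustration, not the paper's actual models.

```python
import numpy as np
from collections import defaultdict
from sklearn.cluster import KMeans

def summarize_stream(records, n_features, decay=0.9):
    """Maintain one evolving profile vector per user from a stream of
    (user_id, feature_vector) records, using exponential decay so that
    newer behaviour gradually overrides older behaviour."""
    profiles = defaultdict(lambda: np.zeros(n_features))
    for user_id, features in records:
        profiles[user_id] = decay * profiles[user_id] + (1.0 - decay) * np.asarray(features, float)
    return profiles

def cluster_profiles(profiles, n_clusters=2):
    """Cluster the per-user profile objects to discover user groups."""
    users = list(profiles)
    matrix = np.vstack([profiles[u] for u in users])
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(matrix)
    return dict(zip(users, labels))

# Toy stream: (user, page-category counts for one visit)
stream = [("u1", [3, 0, 1]), ("u2", [0, 4, 0]), ("u1", [2, 1, 0]),
          ("u3", [0, 3, 1]), ("u2", [1, 5, 0]), ("u3", [0, 2, 2])]
print(cluster_profiles(summarize_stream(stream, n_features=3)))
```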

  15. REQUIREMENTS FOR A GENERAL INTERPRETATION THEORY

    Directory of Open Access Journals (Sweden)

    Anda Laura Lungu Petruescu

    2013-06-01

    Time has proved that economic analysis alone is not enough to meet all the needs of the economic field. The present study proposes a new method of approaching economic phenomena and processes, based on research carried out outside the economic space – a new general interpretation theory – which is centered on the human being as the basic actor of the economy. A general interpretation theory must provide the interpretation of the causalities among economic phenomena and processes – causal interpretation; the interpretation of the correlations and dependencies among indicators – normative interpretation; the interpretation of social and communicational processes in economic organizations – social and communicational interpretation; the interpretation of the community status of companies – transsocial interpretation; the interpretation of the purposes of human activities and their coherency – teleological interpretation; and the interpretation of equilibrium/disequilibrium inside economic systems – optimality interpretation. To respond to such demands, rigor, pragmatism, praxiology and contextual connectors are required. To progress, economic science must improve its language, both its syntax and its semantics. Clarity of exposition requires clarity of language, and progress in scientific theory requires hypotheses in the building of theories. The switch from common language to symbolic language is a switch from ambiguity to rigor and rationality, that is, to order in thinking. But order implies structure, which implies formalization. Our paper is a plea for these requirements, which should be fulfilled by a modern interpretation theory.

  16. Evolving BioAssay Ontology (BAO): modularization, integration and applications.

    Science.gov (United States)

    Abeyruwan, Saminda; Vempati, Uma D; Küçük-McGinty, Hande; Visser, Ubbo; Koleti, Amar; Mir, Ahsan; Sakurai, Kunie; Chung, Caty; Bittker, Joshua A; Clemons, Paul A; Brudz, Steve; Siripala, Anosha; Morales, Arturo J; Romacker, Martin; Twomey, David; Bureeva, Svetlana; Lemmon, Vance; Schürer, Stephan C

    2014-01-01

    The lack of established standards to describe and annotate biological assays and screening outcomes in the domain of drug and chemical probe discovery is a severe limitation to utilize public and proprietary drug screening data to their maximum potential. We have created the BioAssay Ontology (BAO) project (http://bioassayontology.org) to develop common reference metadata terms and definitions required for describing relevant information of low-and high-throughput drug and probe screening assays and results. The main objectives of BAO are to enable effective integration, aggregation, retrieval, and analyses of drug screening data. Since we first released BAO on the BioPortal in 2010 we have considerably expanded and enhanced BAO and we have applied the ontology in several internal and external collaborative projects, for example the BioAssay Research Database (BARD). We describe the evolution of BAO with a design that enables modeling complex assays including profile and panel assays such as those in the Library of Integrated Network-based Cellular Signatures (LINCS). One of the critical questions in evolving BAO is the following: how can we provide a way to efficiently reuse and share among various research projects specific parts of our ontologies without violating the integrity of the ontology and without creating redundancies. This paper provides a comprehensive answer to this question with a description of a methodology for ontology modularization using a layered architecture. Our modularization approach defines several distinct BAO components and separates internal from external modules and domain-level from structural components. This approach facilitates the generation/extraction of derived ontologies (or perspectives) that can suit particular use cases or software applications. We describe the evolution of BAO related to its formal structures, engineering approaches, and content to enable modeling of complex assays and integration with other ontologies and

  17. Sauropod dinosaurs evolved moderately sized genomes unrelated to body size.

    Science.gov (United States)

    Organ, Chris L; Brusatte, Stephen L; Stein, Koen

    2009-12-22

    Sauropodomorph dinosaurs include the largest land animals to have ever lived, some reaching up to 10 times the mass of an African elephant. Despite their status defining the upper range for body size in land animals, it remains unknown whether sauropodomorphs evolved larger-sized genomes than non-avian theropods, their sister taxon, or whether a relationship exists between genome size and body size in dinosaurs, two questions critical for understanding broad patterns of genome evolution in dinosaurs. Here we report inferences of genome size for 10 sauropodomorph taxa. The estimates are derived from a Bayesian phylogenetic generalized least squares approach that generates posterior distributions of regression models relating genome size to osteocyte lacunae volume in extant tetrapods. We estimate that the average genome size of sauropodomorphs was 2.02 pg (range of species means: 1.77-2.21 pg), a value in the upper range of extant birds (mean = 1.42 pg, range: 0.97-2.16 pg) and near the average for extant non-avian reptiles (mean = 2.24 pg, range: 1.05-5.44 pg). The results suggest that the variation in size and architecture of genomes in extinct dinosaurs was lower than the variation found in mammals. A substantial difference in genome size separates the two major clades within dinosaurs, Ornithischia (large genomes) and Saurischia (moderate to small genomes). We find no relationship between body size and estimated genome size in extinct dinosaurs, which suggests that neutral forces did not dominate the evolution of genome size in this group.
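
    Stripped of the phylogenetic covariance structure that the actual Bayesian analysis models, the underlying idea is a regression of genome size on osteocyte lacuna volume calibrated on living species and applied to fossils. The sketch below uses invented numbers and ordinary least squares purely to illustrate that logic; it is not the study's method or data.

```python
import numpy as np

# Hypothetical calibration data for extant taxa: osteocyte lacuna volume (um^3)
# and genome size (pg). All values are made up for illustration only.
lacuna_volume = np.array([210.0, 260.0, 320.0, 410.0, 520.0, 650.0])
genome_size   = np.array([1.0, 1.2, 1.5, 1.9, 2.4, 3.0])

# Ordinary least-squares fit in log-log space (the study itself uses a Bayesian
# phylogenetic generalized least squares model, which also accounts for shared ancestry).
slope, intercept = np.polyfit(np.log(lacuna_volume), np.log(genome_size), deg=1)

def predict_genome_size(fossil_lacuna_volume):
    """Point prediction of genome size (pg) from a measured lacuna volume."""
    return float(np.exp(intercept + slope * np.log(fossil_lacuna_volume)))

print(predict_genome_size(450.0))
```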

  18. Rapidly Evolving Transients in the Dark Energy Survey

    Energy Technology Data Exchange (ETDEWEB)

    Pursiainen, M.; et al.

    2018-03-13

    We present the results of a search for rapidly evolving transients in the Dark Energy Survey Supernova Programme. These events are characterized by fast light curve evolution (rise to peak in $\lesssim 10$ d and exponential decline in $\lesssim 30$ d after peak). We discovered 72 events, including 37 transients with a spectroscopic redshift from host galaxy spectral features. The 37 events increase the total number of rapid optical transients by more than a factor of two. They are found at a wide range of redshifts ($0.05 < z < 1.56$) and peak magnitudes ($-15.75 > M_\mathrm{g} > -22.25$). The multiband photometry is well fit by a blackbody up to a few weeks after peak. The events appear to be hot ($T \approx 10000-30000$ K) and large ($R \approx 10^{14}-2\cdot10^{15}$ cm) at peak, and generally expand and cool in time, though some events show evidence for a receding photosphere with roughly constant temperature. Spectra taken around peak are dominated by a blue featureless continuum consistent with hot, optically thick ejecta. We compare our events with a previously suggested physical scenario involving shock breakout in an optically thick wind surrounding a core-collapse supernova (CCSN), and we conclude that current models for such a scenario might need an additional power source to describe the exponential decline. We find these transients tend to favor star-forming host galaxies, which could be consistent with a core-collapse origin. However, more detailed modeling of the light curves is necessary to determine their physical origin.
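
    As a rough illustration of the blackbody fitting mentioned above (temperature and radius estimated from multiband photometry), a minimal sketch follows. The band wavelengths, distance and synthetic fluxes are placeholders, and the real analysis additionally involves filter responses, redshift corrections and proper uncertainties.

```python
import numpy as np
from scipy.optimize import curve_fit

# Physical constants (SI)
h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23

def blackbody_flux(wavelength, T, R, d):
    """Flux density F_lambda (W m^-2 m^-1) of a spherical blackbody photosphere
    of temperature T (K) and radius R (m) observed at distance d (m)."""
    B = 2 * h * c**2 / wavelength**5 / np.expm1(h * c / (wavelength * k_B * T))
    return np.pi * B * (R / d) ** 2

# Placeholder effective wavelengths of the griz bands (m), a placeholder distance,
# and synthetic "observed" fluxes generated from known parameters.
wavelengths = np.array([472e-9, 642e-9, 784e-9, 926e-9])
d = 3.1e25                                   # roughly 1 Gpc, placeholder
observed = blackbody_flux(wavelengths, 15000.0, 5e12, d)

popt, _ = curve_fit(lambda w, T, R: blackbody_flux(w, T, R, d),
                    wavelengths, observed, p0=[12000.0, 3e12])
print(f"T ~ {popt[0]:.0f} K, R ~ {popt[1]:.2e} m")
```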

  19. Radiographer interpretation of trauma radiographs: Issues for radiography education providers

    International Nuclear Information System (INIS)

    Hardy, Maryann; Snaith, Beverly

    2009-01-01

    Background: The role of radiographers with respect to image interpretation within clinical practice is well recognised. It is the expectation of the professional, regulatory and academic bodies that upon qualification, radiographers will possess image interpretation skills. Additionally, The College of Radiographers has asserted that its aspiration is for all radiographers to be able to provide an immediate written interpretation on skeletal trauma radiographs by 2010. This paper explores the readiness of radiography education programmes in the UK to deliver this expectation. Method: A postal questionnaire was distributed to 25 Higher Education Institutions in the UK (including Northern Ireland) that provided pre-registration radiography education as identified from the Society and College of Radiographers register. Information was sought relating to the type of image interpretation education delivered at pre- and post-registration levels; the anatomical range of image interpretation education; and education delivery styles. Results: A total of 19 responses (n = 19/25; 76.0%) were received. Image interpretation education was included as part of all radiographer pre-registration programmes and offered at post-registration level at 12 academic centres (n = 12/19; 63.2%). The anatomical areas and educational delivery methods varied across institutions. Conclusion: Radiography education providers have embraced the need for image interpretation education within both pre- and post-registration radiography programmes. As a result, UK education programmes are able to meet the 2010 College of Radiographers aspiration.

  20. Evolving Systems: An Outcome of Fondest Hopes and Wildest Dreams

    Science.gov (United States)

    Frost, Susan A.; Balas, Mark J.

    2012-01-01

    New theory is presented for evolving systems, which are autonomously controlled subsystems that self-assemble into a new evolved system with a higher purpose. Evolving systems of aerospace structures often require additional control when assembling to maintain stability during the entire evolution process. This is the concept of Adaptive Key Component Control that operates through one specific component to maintain stability during the evolution. In addition, this control must often overcome persistent disturbances that occur while the evolution is in progress. Theoretical results will be presented for Adaptive Key Component control for persistent disturbance rejection. An illustrative example will demonstrate the Adaptive Key Component controller on a system composed of rigid body and flexible body modes.

  1. Qualitative Functional Decomposition Analysis of Evolved Neuromorphic Flight Controllers

    Directory of Open Access Journals (Sweden)

    Sanjay K. Boddhu

    2012-01-01

    In the previous work, it was demonstrated that one can effectively employ CTRNN-EH (a neuromorphic variant of the EH method) methodology to evolve neuromorphic flight controllers for a flapping wing robot. This paper describes a novel frequency grouping-based analysis technique, developed to qualitatively decompose the evolved controllers into explainable functional control blocks. A summary of the previous work related to evolving flight controllers for two categories of controller types, called autonomous and nonautonomous controllers, is provided, and the applicability of the newly developed decomposition analysis for both controller categories is demonstrated. Further, the paper concludes with appropriate discussion of ongoing work and implications for possible future work related to employing the CTRNN-EH methodology and the decomposition analysis techniques presented in this paper.

  2. Computational Genetic Regulatory Networks Evolvable, Self-organizing Systems

    CERN Document Server

    Knabe, Johannes F

    2013-01-01

    Genetic Regulatory Networks (GRNs) in biological organisms are primary engines for cells to enact their engagements with environments, via incessant, continually active coupling. In differentiated multicellular organisms, tremendous complexity has arisen in the course of evolution of life on earth. Engineering and science have so far achieved no working system that can compare with this complexity, depth and scope of organization. Abstracting the dynamics of genetic regulatory control to a computational framework in which artificial GRNs in artificial simulated cells differentiate while connected in a changing topology, it is possible to apply Darwinian evolution in silico to study the capacity of such developmental/differentiated GRNs to evolve. In this volume an evolutionary GRN paradigm is investigated for its evolvability and robustness in models of biological clocks, in simple differentiated multicellularity, and in evolving artificial developing 'organisms' which grow and express an ontogeny starting fr...

  3. An open room for interpretation

    DEFF Research Database (Denmark)

    Tofte-Hansen, Inge

    2015-01-01

    Based on a concept that I have developed, which is called: "An open room for interpretation", the following article states that creative work and aesthetic expression in a pedagogical context with 2-6 years old children must give space for the children's own expressions. To teach music should...... not only be seen as a learning task where initiative and product is defined by the teacher. In contrast, I suggest that creative activities and aesthetic processes must be seen as an interaction between children's immediate physicality and curiosity and the teacher's musical skills and abilities to follow...

  4. Touch design and narrative interpretation

    DEFF Research Database (Denmark)

    Zhao, Sumin; Unsworth, Len

    2016-01-01

    and the Bottle in depth, and illustrate how interactive design elements help to create an interpretative possibility of the story. We suggest that a better understanding of interactive touch design would promote more effective adult-child interactions around mobile applications....... of technology, but also a resource for meaning making. We distinguish two basic types of interactivity—intra-text and extra-text—incorporated in the touch design, and explore the different functions they perform in a broad range of picture book apps. In particular, we look at the app version of The Heart...

  5. Interpreting CNNs via Decision Trees

    OpenAIRE

    Zhang, Quanshi; Yang, Yu; Wu, Ying Nian; Zhu, Song-Chun

    2018-01-01

    This paper presents a method to learn a decision tree to quantitatively explain the logic of each prediction of a pre-trained convolutional neural networks (CNNs). Our method boosts the following two aspects of network interpretability. 1) In the CNN, each filter in a high conv-layer must represent a specific object part, instead of describing mixed patterns without clear meanings. 2) People can explain each specific prediction made by the CNN at the semantic level using a decision tree, i.e....

  6. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters. Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  7. Is plate tectonics needed to evolve technological species on exoplanets?

    Directory of Open Access Journals (Sweden)

    Robert J. Stern

    2016-07-01

    As we continue searching for exoplanets, we wonder if life and technological species capable of communicating with us exist on any of them. As geoscientists, we can also wonder how important the presence or absence of plate tectonics is for the evolution of technological species. This essay considers this question, focusing on tectonically active rocky (silicate) planets, like Earth, Venus, and Mars. The development of technological species on Earth provides key insights for understanding evolution on exoplanets, including the likely role that plate tectonics may play. An Earth-sized silicate planet is likely to experience several tectonic styles over its lifetime, as it cools and its lithosphere thickens, strengthens, and becomes denser. These include magma ocean, various styles of stagnant lid, and perhaps plate tectonics. Abundant liquid water favors both life and plate tectonics. An ocean is required for the early evolution of diverse single-celled organisms, then colonies of cells which specialized further to form guts, appendages, and sensory organs up to the complexity of fish (central nervous system, appendages, eyes). Large expanses of dry land also begin in the ocean, today produced above subduction zones in juvenile arcs and by their coalescence to form continents, although it is not clear that plate tectonics was required to create continental crust on Earth. The dry land of continents is required for further evolution of technological species, where modification of appendages for grasping and manipulating, and improvement of eyes and the central nervous system, could be perfected. These bioassets allowed intelligent creatures to examine the night sky and wonder, the beginning of abstract thinking, including religion and science. Technology arises from the exigencies of daily living such as tool-making, agriculture, clothing, and weapons, but the pace of innovation accelerates once it is allied with science. Finally, the importance of plate

  8. Long-Term Environmental Research Programs - Evolving Capacity for Discovery

    Science.gov (United States)

    Swanson, F. J.

    2008-12-01

    Long-term forestry, watershed, and ecological research sites have become critical, productive nodes for environmental science research and in some cases for work in the social sciences and humanities. The Forest Service's century-old Experimental Forests and Ranges and the National Science Foundation's 28- year-old Long-Term Ecological Research program have been remarkably productive in both basic and applied sciences, including characterization of acid rain and old-growth ecosystems and development of forest, watershed, and range management systems for commercial and other land use objectives. A review of recent developments suggests steps to enhance the function of collections of long-term research sites as interactive science networks. The programs at these sites have evolved greatly, especially over the past few decades, as the questions addressed, disciplines engaged, and degree of science integration have grown. This is well displayed by small, experimental watershed studies, which first were used for applied hydrology studies then more fundamental biogeochemical studies and now examination of complex ecosystem processes; all capitalizing on the legacy of intensive studies and environmental monitoring spanning decades. In very modest ways these collections of initially independent sites have functioned increasingly as integrated research networks addressing inter-site questions by using common experimental designs, being part of a single experiment, and examining long-term data in a common analytical framework. The network aspects include data sharing via publicly-accessible data-harvester systems for climate and streamflow data. The layering of one research or environmental monitoring network upon another facilitates synergies. Changing climate and atmospheric chemistry highlight a need to use these networks as continental-scale observatory systems for assessing the impacts of environmental change on ecological services. To better capitalize on long

  9. A BINARY ORBIT FOR THE MASSIVE, EVOLVED STAR HDE 326823, A WR+O SYSTEM PROGENITOR

    International Nuclear Information System (INIS)

    Richardson, N. D.; Gies, D. R.; Williams, S. J.

    2011-01-01

    The hot star HDE 326823 is a candidate transition-phase object that is evolving into a nitrogen-enriched Wolf-Rayet star. It is also a known low-amplitude, photometric variable with a 6.123 day period. We present new, high- and moderate-resolution spectroscopy of HDE 326823, and we show that the absorption lines show coherent Doppler shifts with this period while the emission lines display little or no velocity variation. We interpret the absorption line shifts as the orbital motion of the apparently brighter star in a close, interacting binary. We argue that this star is losing mass to a mass gainer star hidden in a thick accretion torus and to a circumbinary disk that is the source of the emission lines. HDE 326823 probably belongs to a class of objects that produce short-period WR+O binaries.

  10. Interpretation of ultrasonic images; Interpretation von Ultraschall-Abbildungen

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, W; Schmitz, V; Kroening, M [Fraunhofer-Institut fuer Zerstoerungsfreie Pruefverfahren, Saarbruecken (Germany)

    1998-11-01

    During the evaluation of ultrasonic images, e.g. SAFT-reconstructed B-scan images (SAFT = Synthetic Aperture Focusing Technique), it is often difficult to decide what the origin of reconstructed image points is: were they caused by defects, specimen geometry or mode conversions? To facilitate this evaluation, a tool based on the comparison of data sets was developed. Different kinds of data comparison are possible: identification of the RF signals that caused a reconstructed image point, i.e. comparison of a reconstructed image with the corresponding RF data; comparison of two reconstructed images by superimposing them with logical operators, where e.g. the reconstruction of an unknown reflector is compared with that of a known one; and comparison of raw RF data by scanning simultaneously through two data sets, where the echoes of an unknown reflector are compared with the echoes of a known one. The necessary data sets of known reflectors may be generated experimentally on reference reflectors or modelled. The aim is the identification of the reflector type, e.g. crack-like or not, and the determination of position, size and orientation, as well as the identification of accompanying satellite echoes. The interpretation of the SAFT-reconstructed B-scan image results in a complete description of the reflector. In addition, the tool described is well suited to educating and training ultrasonic testers. (orig./MM)
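
    The second comparison mode, superimposing two reconstructed images with logical operators, amounts to a simple per-pixel operation. A hypothetical sketch (arrays and thresholds invented for illustration, not taken from the tool described above) might look like this:

```python
import numpy as np

def logical_overlay(image_a, image_b, threshold_a, threshold_b, op=np.logical_and):
    """Superimpose two SAFT-reconstructed B-scan images with a logical operator:
    a pixel is set where both (AND) or either (OR) image exceeds its threshold."""
    return op(image_a > threshold_a, image_b > threshold_b)

# Toy 4x4 "reconstructions": unknown reflector vs. known reference reflector
unknown   = np.array([[0, 1, 8, 1], [0, 7, 9, 2], [0, 1, 6, 1], [0, 0, 1, 0]], float)
reference = np.array([[0, 0, 7, 2], [0, 6, 8, 1], [0, 0, 5, 0], [0, 0, 0, 0]], float)

overlap = logical_overlay(unknown, reference, threshold_a=5.0, threshold_b=4.0)
print(overlap.astype(int))   # 1 where both reconstructions indicate a reflector
```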

  11. Towards operational interpretations of generalized entropies

    Science.gov (United States)

    Topsøe, Flemming

    2010-12-01

    The driving force behind our study has been to overcome the difficulties you encounter when you try to extend the clear and convincing operational interpretations of classical Boltzmann-Gibbs-Shannon entropy to other notions, especially to generalized entropies as proposed by Tsallis. Our approach is philosophical, based on speculations regarding the interplay between truth, belief and knowledge. The main result demonstrates that, accepting philosophically motivated assumptions, the only possible measures of entropy are those suggested by Tsallis - which, as we know, include classical entropy. This result constitutes, so it seems, a more transparent interpretation of entropy than previously available. However, further research to clarify the assumptions is still needed. Our study points to the thesis that one should never consider the notion of entropy in isolation - in order to enable a rich and technically smooth study, further concepts, such as divergence, score functions and descriptors or controls should be included in the discussion. This will clarify the distinction between Nature and Observer and facilitate a game theoretical discussion. The usefulness of this distinction and the subsequent exploitation of game theoretical results - such as those connected with the notion of Nash equilibrium - is demonstrated by a discussion of the Maximum Entropy Principle.

  12. Towards operational interpretations of generalized entropies

    International Nuclear Information System (INIS)

    Topsoee, Flemming

    2010-01-01

    The driving force behind our study has been to overcome the difficulties you encounter when you try to extend the clear and convincing operational interpretations of classical Boltzmann-Gibbs-Shannon entropy to other notions, especially to generalized entropies as proposed by Tsallis. Our approach is philosophical, based on speculations regarding the interplay between truth, belief and knowledge. The main result demonstrates that, accepting philosophically motivated assumptions, the only possible measures of entropy are those suggested by Tsallis - which, as we know, include classical entropy. This result constitutes, so it seems, a more transparent interpretation of entropy than previously available. However, further research to clarify the assumptions is still needed. Our study points to the thesis that one should never consider the notion of entropy in isolation - in order to enable a rich and technically smooth study, further concepts, such as divergence, score functions and descriptors or controls should be included in the discussion. This will clarify the distinction between Nature and Observer and facilitate a game theoretical discussion. The usefulness of this distinction and the subsequent exploitation of game theoretical results - such as those connected with the notion of Nash equilibrium - is demonstrated by a discussion of the Maximum Entropy Principle.
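
    For orientation, the family of generalized entropies proposed by Tsallis that the two records above refer to has the standard form

    $$ S_q(p) = \frac{1}{q-1}\left(1 - \sum_i p_i^{\,q}\right), \qquad \lim_{q \to 1} S_q(p) = -\sum_i p_i \ln p_i, $$

    so that the classical Boltzmann-Gibbs-Shannon entropy is recovered in the limit $q \to 1$.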

  13. Practical guide to interpretive near-infrared spectroscopy

    CERN Document Server

    Workman, Jr, Jerry

    2007-01-01

    Containing focused, comprehensive coverage, Practical Guide to Interpretive Near-Infrared Spectroscopy gives you the tools necessary to interpret NIR spectra. The authors present extensive tables, charts, and figures with NIR absorption band assignments and structural information for a broad range of functional groups, organic compounds, and polymers. They include visual spectral representation of all major compound functional groupings and NIR frequency ranges. Organized by functional group type and chemical structure, based on standard compound classification, the chapters are easy to

  14. Q&A: What is human language, when did it evolve and why should we care?

    OpenAIRE

    Pagel, Mark

    2017-01-01

    Human language is unique among all forms of animal communication. It is unlikely that any other species, including our close genetic cousins the Neanderthals, ever had language, and so-called sign 'language' in Great Apes is nothing like human language. Language evolution shares many features with biological evolution, and this has made it useful for tracing recent human history and for studying how culture evolves among groups of people with related languages. A case can be made that languag...

  15. Circumstellar ammonia in oxygen-rich evolved stars

    Science.gov (United States)

    Wong, K. T.; Menten, K. M.; Kamiński, T.; Wyrowski, F.; Lacy, J. H.; Greathouse, T. K.

    2018-04-01

    Context. The circumstellar ammonia (NH3) chemistry in evolved stars is poorly understood. Previous observations and modelling showed that NH3 abundance in oxygen-rich stars is several orders of magnitude above that predicted by equilibrium chemistry. Aims: We would like to characterise the spatial distribution and excitation of NH3 in the oxygen-rich circumstellar envelopes (CSEs) of four diverse targets: IK Tau, VY CMa, OH 231.8+4.2, and IRC +10420. Methods: We observed NH3 emission from the ground state in the inversion transitions near 1.3 cm with the Very Large Array (VLA) and submillimetre rotational transitions with the Heterodyne Instrument for the Far-Infrared (HIFI) aboard Herschel Space Observatory from all four targets. For IK Tau and VY CMa, we observed NH3 rovibrational absorption lines in the ν2 band near 10.5 μm with the Texas Echelon Cross Echelle Spectrograph (TEXES) at the NASA Infrared Telescope Facility (IRTF). We also attempted to search for the rotational transition within the excited vibrational state (v2 = 1) near 2 mm with the IRAM 30m Telescope. Non-LTE radiative transfer modelling, including radiative pumping to the vibrational state, was carried out to derive the radial distribution of NH3 in the CSEs of these targets. Results: We detected NH3 inversion and rotational emission in all four targets. IK Tau and VY CMa show blueshifted absorption in the rovibrational spectra. We did not detect vibrationally excited rotational transition from IK Tau. Spatially resolved VLA images of IK Tau and IRC +10420 show clumpy emission structures; unresolved images of VY CMa and OH 231.8+4.2 indicate that the spatial-kinematic distribution of NH3 is similar to that of assorted molecules, such as SO and SO2, that exhibit localised and clumpy emission. Our modelling shows that the NH3 abundance relative to molecular hydrogen is generally of the order of 10-7, which is a few times lower than previous estimates that were made without considering radiative

  16. Dynamics of Large Systems of Nonlinearly Evolving Units

    Science.gov (United States)

    Lu, Zhixin

    The dynamics of large systems of many nonlinearly evolving units is a general research area that has great importance for many areas in science and technology, including biology, computation by artificial neural networks, statistical mechanics, flocking in animal groups, the dynamics of coupled neurons in the brain, and many others. While universal principles and techniques are largely lacking in this broad area of research, there is still one particular phenomenon that seems to be broadly applicable. In particular, this is the idea of emergence, by which is meant macroscopic behaviors that "emerge" from a large system of many "smaller or simpler entities such that...large entities" [i.e., macroscopic behaviors] arise which "exhibit properties the smaller/simpler entities do not exhibit." In this thesis we investigate mechanisms and manifestations of emergence in four dynamical systems consisting many nonlinearly evolving units. These four systems are as follows. (a) We first study the motion of a large ensemble of many noninteracting particles in a slowly changing Hamiltonian system that undergoes a separatrix crossing. In such systems, we find that separatrix-crossing induces a counterintuitive effect. Specifically, numerical simulation of two sets of densely sprinkled initial conditions on two energy curves appears to suggest that the two energy curves, one originally enclosing the other, seemingly interchange their positions. This, however, is topologically forbidden. We resolve this paradox by introducing a numerical simulation method we call "robust" and study its consequences. (b) We next study the collective dynamics of oscillatory pacemaker neurons in Suprachiasmatic Nucleus (SCN), which, through synchrony, govern the circadian rhythm of mammals. We start from a high-dimensional description of the many coupled oscillatory neuronal units within the SCN. This description is based on a forced Kuramoto model. We then reduce the system dimensionality by using
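
    The forced Kuramoto description mentioned for the SCN pacemaker neurons has, in its generic textbook form (the thesis' specific coupling and forcing terms may differ),

    $$ \dot{\theta}_i = \omega_i + \frac{K}{N} \sum_{j=1}^{N} \sin(\theta_j - \theta_i) + F \sin(\sigma t - \theta_i), \qquad i = 1, \dots, N, $$

    where $\theta_i$ and $\omega_i$ are the phase and natural frequency of oscillator $i$, $K$ is the coupling strength, and $F$ and $\sigma$ set the amplitude and frequency of the external (light-driven) forcing.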

  17. In-Space Transportation for NASA's Evolvable Mars Campaign

    Science.gov (United States)

    Percy, Thomas K.; McGuire, Melissa; Polsgrove, Tara

    2015-01-01

    As the nation embarks on a new and bold journey to Mars, significant work is being done to determine what that mission and those architectural elements will look like. The Evolvable Mars Campaign, or EMC, is being evaluated as a potential approach to getting humans to Mars. Built on the premise of leveraging current technology investments and maximizing element commonality to reduce cost and development schedule, the EMC transportation architecture is focused on developing the elements required to move crew and equipment to Mars as efficiently and effectively as possible, both from a performance and a programmatic standpoint. Over the last 18 months the team has been evaluating potential options for those transportation elements. One of the key aspects of the EMC is leveraging investments being made today in missions like the Asteroid Redirect Mission (ARM), using derived versions of its Solar Electric Propulsion (SEP) systems and coupling them with chemical propulsion elements that maximize commonality across the architecture between both transportation and Mars operations elements. This paper outlines the broad trade space being evaluated, including the different technologies being assessed for transportation elements and how those elements are assembled into an architecture. Impacts to potential operational scenarios at Mars are also investigated. Trades are being made on the size and power level of the SEP vehicle for delivering cargo, as well as the size of the chemical propulsion systems, and various mission aspects including in-space assembly and sequencing. Maximizing payload delivery to Mars with the SEP vehicle will better support the operational scenarios at Mars by enabling the delivery of landers and habitation elements that are appropriately sized for the mission. The purpose of this investigation is not to find the solution but rather a suite of solutions with potential application to the challenge of sending cargo and crew to Mars

  18. Modular Orbital Demonstration of an Evolvable Space Telescope (MODEST)

    Science.gov (United States)

    Baldauf, Brian; Conti, Alberto

    2016-01-01

    The "Search for Life" via imaging of exoplanets is a mission that requires extremely stable telescopes with apertures in the 10 m to 20 m range. The High Definition Space Telescope (HDST) envisioned for this mission would have an aperture >10 m, which is a larger payload than what can be delivered to space using a single launch vehicle. Building and assembling the mirror segments enabling large telescopes will likely require multiple launches and assembly in space. Space-based telescopes with large apertures will require major changes to system architectures.The Optical Telescope Assembly (OTA) for HDST is a primary mission cost driver. Enabling and affordable solutions for this next generation of large aperture space-based telescope are needed.This paper reports on the concept for the Modular Orbital Demonstration of an Evolvable Space Telescope (MODEST), which demonstrates on-orbit robotic and/or astronaut assembly of a precision optical telescope in space. It will also facilitate demonstration of active correction of phase and mirror shape. MODEST is proposed to be delivered to the ISS using standard Express Logistics Carriers (ELCs) and can mounted to one of a variety of ISS pallets. Post-assembly value includes space, ground, and environmental studies, and a testbed for new instruments. This demonstration program for next generation mirror technology provides significant risk reduction and demonstrates the technology in a six-mirror phased telescope. Other key features of the demonstration include the use of an active primary optical surface with wavefront feedback control that allows on-orbit optimization and demonstration of precise surface control to meet optical system wavefront and stability requirements.MODEST will also be used to evaluate advances in lightweight mirror and metering structure materials such as SiC or Carbon Fiber Reinforced Polymer that have excellent mechanical and thermal properties, e.g. high stiffness, high modulus, high thermal

  19. Behaviour of and mass transfer at gas-evolving electrodes

    NARCIS (Netherlands)

    Janssen, L.J.J.

    1989-01-01

    A complete set of models for the mass transfer of indicator ions to gas-evolving electrodes with different behaviour of bubbles is described theoretically. Sliding bubbles, rising detached single bubbles, jumping detached coalescence bubbles and ensembles of these types of bubbles are taken into

  20. Analysis of Lamarckian Evolution in Morphologically Evolving Robots

    NARCIS (Netherlands)

    Jelisavcic, Milan; Kiesel, Rafael; Glette, Kyrre; Haasdijk, Evert; Eiben, A.E.

    Evolving robot morphologies implies the need for lifetime learning so that newborn robots can learn to manipulate their bodies. An individual’s morphology will obviously combine traits of all its parents; it must adapt its own controller to suit its morphology, and cannot rely on the controller of

  1. Evolving fuzzy rules for relaxed-criteria negotiation.

    Science.gov (United States)

    Sim, Kwang Mong

    2008-12-01

    In the literature on automated negotiation, very few negotiation agents are designed with the flexibility to slightly relax their negotiation criteria to reach a consensus more rapidly and with more certainty. Furthermore, these relaxed-criteria negotiation agents were not equipped with the ability to enhance their performance by learning and evolving their relaxed-criteria negotiation rules. The impetus of this work is designing market-driven negotiation agents (MDAs) that not only have the flexibility of relaxing bargaining criteria using fuzzy rules, but can also evolve their structures by learning new relaxed-criteria fuzzy rules to improve their negotiation outcomes as they participate in negotiations in more e-markets. To this end, an evolutionary algorithm for adapting and evolving relaxed-criteria fuzzy rules was developed. Implementing the idea in a testbed, two kinds of experiments for evaluating and comparing EvEMDAs (MDAs with relaxed-criteria rules that are evolved using the evolutionary algorithm) and EMDAs (MDAs with relaxed-criteria rules that are manually constructed) were carried out through stochastic simulations. Empirical results show that: 1) EvEMDAs generally outperformed EMDAs in different types of e-markets and 2) the negotiation outcomes of EvEMDAs generally improved as they negotiated in more e-markets.
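
    Purely as an illustrative sketch of the "learn and evolve relaxed-criteria rules" idea (the actual EvEMDA algorithm, its fuzzy membership functions and its fitness measure are not reproduced here), a generic evolutionary loop over parameter vectors encoding such rules might look like:

```python
import random

def evolve_rules(fitness, n_params=4, pop_size=20, generations=50,
                 mutation_scale=0.1, seed=0):
    """Simple evolutionary loop with truncation selection and Gaussian mutation
    over parameter vectors that encode relaxed-criteria rules (e.g. how far the
    acceptable utility may be relaxed as a negotiation deadline approaches)."""
    rng = random.Random(seed)
    population = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]              # keep the better half
        children = [[max(0.0, min(1.0, g + rng.gauss(0.0, mutation_scale)))
                     for g in rng.choice(parents)] for _ in range(pop_size // 2)]
        population = parents + children
    return max(population, key=fitness)

# Toy fitness: pretend higher average relaxation parameters yield better deals.
best = evolve_rules(lambda rule: sum(rule) / len(rule))
print([round(g, 2) for g in best])
```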

  2. Friends Drinking Together: Young Adults' Evolving Support Practices

    Science.gov (United States)

    Dresler, Emma; Anderson, Margaret

    2018-01-01

    Purpose: Young adults' drinking is about pleasure, a communal practice of socialising together in a friendship group. The purpose of this paper is to investigate the evolving support practices of drinking groups for better targeting of health communication messages. Design/methodology/approach: This qualitative descriptive study examined the…

  3. Evolving Concepts of Development through the Experience of ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will explore the experiences of emerging and developing countries in order to identify how the concept of international development has evolved and where it may be heading. It will do so through a series of workshops convening scholars and practitioners from both the developing and the industrialized ...

  4. Heritage – A Conceptually Evolving and Dissonant Phenomenon ...

    African Journals Online (AJOL)

    I therefore, drawing from literature and experiences gained during field observations and focus group interviews, came up with the idea of working with three viewpoints of heritage. Drawing on real life cases I argue that current heritage management and education practices' failure to recognise and respect the evolving, ...

  5. A Conceptual Framework for Evolving, Recommender Online Learning Systems

    Science.gov (United States)

    Peiris, K. Dharini Amitha; Gallupe, R. Brent

    2012-01-01

    A comprehensive conceptual framework is developed and described for evolving recommender-driven online learning systems (ROLS). This framework describes how such systems can support students, course authors, course instructors, systems administrators, and policy makers in developing and using these ROLS. The design science information systems…

  6. Evolving intelligent vehicle control using multi-objective NEAT

    NARCIS (Netherlands)

    Willigen, W.H. van; Haasdijk, E.; Kester, L.J.H.M.

    2013-01-01

    The research in this paper is inspired by a vision of intelligent vehicles that autonomously move along motorways: they join and leave trains of vehicles (platoons), overtake other vehicles, etc. We propose a multi-objective algorithm based on NEAT and SPEA2 that evolves controllers for such

  7. Hormonal evaluation of the infertile male: has it evolved?

    Science.gov (United States)

    Sussman, Ernest M; Chudnovsky, Aleksander; Niederberger, Craig S

    2008-05-01

    An endocrinologic evaluation of patients who have male-factor infertility has clearly evolved and leads to specific diagnoses and treatment strategies in a large population of infertile men. A well-considered endocrine evaluation is especially essential with the ever-growing popularity of assisted reproductive techniques and continued refinements with intracytoplasmic sperm injection.

  8. Regional and Inter-Regional Effects in Evolving Climate Networks

    Czech Academy of Sciences Publication Activity Database

    Hlinka, Jaroslav; Hartman, David; Jajcay, Nikola; Vejmelka, Martin; Donner, R.; Marwan, N.; Kurths, J.; Paluš, Milan

    2014-01-01

    Roč. 21, č. 2 (2014), s. 451-462 ISSN 1023-5809 R&D Projects: GA ČR GCP103/11/J068 Institutional support: RVO:67985807 Keywords : climate networks * evolving networks * principal component analysis * network connectivity * El Nino Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.987, year: 2014

  9. Multivariate Epi-splines and Evolving Function Identification Problems

    Science.gov (United States)

    2015-04-15

    such extrinsic information as well as observed function and subgradient values often evolve in applications, we establish conditions under which the...previous study [30] dealt with compact intervals of IR. Splines are intimately tied to optimization problems through their variational theory pioneered...approximation. Motivated by applications in curve fitting, regression, probability density estimation, variogram computation, financial curve construction

  10. Adapting Morphology to Multiple Tasks in Evolved Virtual Creatures

    DEFF Research Database (Denmark)

    Lessin, Dan; Fussell, Don; Miikkulainen, Risto

    2014-01-01

    The ESP method for evolving virtual creatures (Lessin et al., 2013) consisted of an encapsulation mechanism to preserve learned skills, a human-designed syllabus to build higher-level skills by combining lower-level skills systematically, and a pandemonium mechanism to resolve conflicts between...

  11. Exploring the Evolving Professional Identity of Novice School Counselors

    Science.gov (United States)

    Bamgbose, Olamojiba Omolara

    2017-01-01

    The study employed a grounded theory approach to explore the evolving professional identity of novice school counselors. Participants, who are currently employed as school counselors at the elementary, middle, or high school level with 1-4 years' experience, were career changers from other helping professions and graduates from an intensive school…

  12. Evaluation and testing methodology for evolving entertainment systems

    NARCIS (Netherlands)

    Jurgelionis, A.; Bellotti, F.; IJsselsteijn, W.A.; Kort, de Y.A.W.; Bernhaupt, R.; Tscheligi, M.

    2007-01-01

    This paper presents a testing and evaluation methodology for evolving pervasive gaming and multimedia systems. We introduce the Games@Large system, a complex gaming and multimedia architecture comprised of a multitude of elements: heterogeneous end user devices, wireless and wired network

  13. Degree distribution of a new model for evolving networks

    Indian Academy of Sciences (India)

    on intuitive but realistic consideration that nodes are added to the network with both preferential and random attachments. The degree distribution of the model is between a power-law and an exponential decay. Motivated by the features of network evolution, we introduce a new model of evolving networks, incorporating the ...
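
    A record like this can be made concrete with a short simulation: each new node links to an existing node chosen preferentially (in proportion to degree) with some probability and uniformly at random otherwise, which is the kind of mixed attachment the abstract describes. The sketch below is a generic illustration; the mixing probability of 0.5 and the one-edge-per-new-node rule are assumptions, not parameters of the cited model.

```python
import random
from collections import Counter

def grow_mixed_attachment_network(n_nodes, p_preferential=0.5, seed=0):
    """Grow a network in which each new node attaches to one existing node,
    chosen by preferential attachment with probability p_preferential and
    uniformly at random otherwise."""
    rng = random.Random(seed)
    degrees = {0: 1, 1: 1}      # start from a single edge 0--1
    degree_urn = [0, 1]         # each node appears once per unit of degree
    for new in range(2, n_nodes):
        if rng.random() < p_preferential:
            target = rng.choice(degree_urn)        # proportional to degree
        else:
            target = rng.choice(list(degrees))     # uniform over existing nodes
        degrees[new] = 1
        degrees[target] += 1
        degree_urn.extend([new, target])
    return degrees

if __name__ == "__main__":
    degs = grow_mixed_attachment_network(50_000)
    hist = Counter(degs.values())
    for k in sorted(hist)[:10]:
        print(k, hist[k])       # tail decays between power law and exponential
```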

  14. Evolving Nature of Sexual Orientation and Gender Identity

    Science.gov (United States)

    Jourian, T. J.

    2015-01-01

    This chapter discusses the historical and evolving terminology, constructs, and ideologies that inform the language used by those who are lesbian, gay, bisexual, and same-gender loving, who may identify as queer, as well as those who are members of trans* communities from multiple and intersectional perspectives.

  15. The Evolving Military Learner Population: A Review of the Literature

    Science.gov (United States)

    Ford, Kate; Vignare, Karen

    2015-01-01

    This literature review examines the evolving online military learner population with emphasis on current generation military learners, who are most frequently Post-9/11 veterans. The review synthesizes recent scholarly and grey literature on military learner demographics and attributes, college experiences, and academic outcomes against a backdrop…

  16. The evolving role of governments in the nuclear energy field

    International Nuclear Information System (INIS)

    Anon.

    2004-01-01

    The NEA Nuclear Development Committee (NDC) recently completed a study that looks into the evolving role of governments in nuclear energy matters. Many decisions on government intervention in recent decades have been based on the earlier experience of what works best. The report suggests some considerations that all governments could take into account when establishing their respective roles. (author)

  17. Evolving information systems: meeting the ever-changing environment

    NARCIS (Netherlands)

    Oei, J.L.H.; Proper, H.A.; Falkenberg, E.D.

    1994-01-01

    To meet the demands of organizations and their ever-changing environment, information systems are required which are able to evolve to the same extent as organizations do. Such a system has to support changes in all time- and application-dependent aspects. In this paper, requirements and a conceptual

  18. You 3.0: The Most Important Evolving Technology

    Science.gov (United States)

    Tamarkin, Molly; Bantz, David A.; Childs, Melody; diFilipo, Stephen; Landry, Stephen G.; LoPresti, Frances; McDonald, Robert H.; McGuthry, John W.; Meier, Tina; Rodrigo, Rochelle; Sparrow, Jennifer; Diggs, D. Teddy; Yang, Catherine W.

    2010-01-01

    That technology evolves is a given. Not as well understood is the impact of technological evolution on each individual--on oneself, one's skill development, one's career, and one's relationship with the work community. The authors believe that everyone in higher education will become an IT worker and that IT workers will be managing a growing…

  19. Sextant: Visualizing time-evolving linked geospatial data

    NARCIS (Netherlands)

    C. Nikolaou (Charalampos); K. Dogani (Kallirroi); K. Bereta (Konstantina); G. Garbis (George); M. Karpathiotakis (Manos); K. Kyzirakos (Konstantinos); M. Koubarakis (Manolis)

    2015-01-01

    The linked open data cloud is constantly evolving as datasets get continuously updated with newer versions. As a result, representing, querying, and visualizing the temporal dimension of linked data is crucial. This is especially important for geospatial datasets that form the backbone

  20. Quantitative interpretation of nuclear logging data by adopting point-by-point spectrum striping deconvolution technology

    International Nuclear Information System (INIS)

    Tang Bin; Liu Ling; Zhou Shumin; Zhou Rongsheng

    2006-01-01

    The paper discusses gamma-ray spectrum interpretation technology in nuclear logging. The principles of familiar quantitative interpretation methods, including the average content method and the traditional spectrum striping method, are introduced, and their limitations in determining the contents of radioactive elements on unsaturated ledges (where radioactive elements are distributed unevenly) are presented. On the basis of the quantitative interpretation of gamma-intensity logging by the deconvolution method, a new quantitative interpretation method that separates radioactive elements is presented for interpreting gamma-spectrum logging. This is a point-by-point spectrum striping deconvolution technique which can give the logging data a quantitative interpretation. (authors)
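
    As a rough, hypothetical illustration of the stripping idea behind such methods (not the authors' point-by-point deconvolution algorithm), the count rates recorded in the K, U and Th energy windows at each depth point can be unfolded into element concentrations by inverting a calibration matrix. The sensitivity values below are made-up placeholders.

```python
import numpy as np

# Hypothetical calibration matrix: rows are the K, U, Th energy windows,
# columns are count rates per unit concentration of K, U, Th.
SENSITIVITY = np.array([
    [3.0, 0.8, 0.4],
    [0.1, 2.0, 0.6],
    [0.0, 0.2, 1.5],
])

def strip_point(window_rates):
    """Solve SENSITIVITY @ concentrations = window_rates for one depth point."""
    return np.linalg.solve(SENSITIVITY, window_rates)

def strip_log(window_log):
    """Apply the stripping point by point along the logged interval."""
    return np.array([strip_point(rates) for rates in window_log])

if __name__ == "__main__":
    measured = np.array([[35.0, 12.0, 3.0],    # counts per second per window
                         [40.0, 14.0, 3.5]])
    print(strip_log(measured))                 # K, U, Th content at each point
```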

  1. Adequate proverb interpretation is associated with performance on the independent living scales.

    Science.gov (United States)

    Ahmed, Fayeza S; Miller, L Stephen

    2015-01-01

    The purpose of this study was to examine proverb interpretation performance and functional independence in older adults. From the limited literature on proverb interpretation in aging and its conceptualization as an executive function, it was hypothesized that proverb interpretation would be related to functional independence, similarly to other executive functions. Tests of proverb interpretation, additional executive functions, and functional ability were administered to nondemented older adults. Results showed that proverb interpretation accounted for a significant amount of unique variance in functional ability scores. This supports including a measure of proverb interpretation in the assessment of older adults.

  2. Evolving career choice narratives of new graduate nurses.

    Science.gov (United States)

    Price, Sheri L; McGillis Hall, Linda; Murphy, Gail Tomblin; Pierce, Bridget

    2018-01-01

    This article describes findings from one stage of a longitudinal study of the professional socialization experiences of Millennial nurses as they prepared for graduation and transition to practice. This study employed an interpretive narrative methodology guided by Polkinghorne's theory of narrative identity. Analysis of face-to-face interviews and journal entries by Millennial nursing students uncovered the formal professional socialization experiences over four years of nursing education. Participants include six Millennial nursing students (born after 1980) interviewed approximately one month after graduation. These six participants are a voluntary subset of twelve who were interviewed prior to beginning their nursing studies, the analysis of which is captured in Price et al. (2013a) and Price et al. (2013b). Narrative analysis of the post-graduation interviews resulted in three main themes: 'Real Nursing: Making a Difference', 'The Good Nurse: Defined by Practice' and 'Creating Career Life Balance'. Graduate nurses strive to provide excellent nursing care as they transition into the workforce and identify a need for ongoing peer and professional supports to assist their ongoing professional socialization. Ongoing formal socialization and professional development is required to support the transition and retention of new nurse graduates in the workplace and the profession. Millennial generation nurses seek opportunities for career mapping, goal setting and formal mentorship by role models and peers to actualize their professional aspirations. Copyright © 2017. Published by Elsevier Ltd.

  3. Risk and responsibility: a complex and evolving relationship.

    Science.gov (United States)

    Kermisch, Céline

    2012-03-01

    This paper analyses the nature of the relationship between risk and responsibility. Since neither the concept of risk nor the concept of responsibility has an unequivocal definition, it is obvious that there is no single interpretation of their relationship. After introducing the different meanings of responsibility used in this paper, we analyse four conceptions of risk. This allows us to make their link with responsibility explicit and to determine if a shift in the connection between risk and responsibility can be outlined. (1) In the engineer's paradigm, the quantitative conception of risk does not include any concept of responsibility. Their relationship is indirect, the locus of responsibility being risk management. (2) In Mary Douglas' cultural theory, risks are constructed through the responsibilities they engage. (3) Rayner and (4) Wolff go further by integrating forms of responsibility in the definition of risk itself. Analysis of these four frameworks shows that the concepts of risk and responsibility are increasingly intertwined. This tendency is reinforced by increasing public awareness and a call for the integration of a moral dimension in risk management. Therefore, we suggest that a form of virtue-responsibility should also be integrated in the concept of risk. © Springer Science+Business Media B.V. 2010

  4. Interpretation modification training reduces social anxiety in clinically anxious children.

    Science.gov (United States)

    Klein, Anke M; Rapee, Ronald M; Hudson, Jennifer L; Schniering, Carolyn A; Wuthrich, Viviana M; Kangas, Maria; Lyneham, Heidi J; Souren, Pierre M; Rinck, Mike

    2015-12-01

    The present study was designed to examine the effects of training in positive interpretations in clinically anxious children. A total of 87 children between 7 and 12 years of age were randomly assigned to either a positive cognitive bias modification training for interpretation (CBM-I) or a neutral training. Training included 15 sessions in a two-week period. Children with an interpretation bias prior to training in the positive training group showed a significant reduction in interpretation bias on the social threat scenarios after training, but not children in the neutral training group. No effects on interpretation biases were found for the general threat scenarios or the non-threat scenarios. Furthermore, children in the positive training did not self-report lower anxiety than children in the neutral training group. However, mothers and fathers reported a significant reduction in social anxiety in their children after positive training, but not after neutral training. This study demonstrated that clinically anxious children with a prior interpretation bias can be trained away from negative social interpretation biases and there is some evidence that this corresponds to reductions in social anxiety. This study also highlights the importance of using specific training stimuli. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. ICADx: interpretable computer aided diagnosis of breast masses

    Science.gov (United States)

    Kim, Seong Tae; Lee, Hakmin; Kim, Hak Gu; Ro, Yong Man

    2018-02-01

    In this study, a novel computer aided diagnosis (CADx) framework is devised to investigate interpretability for classifying breast masses. Recently, a deep learning technology has been successfully applied to medical image analysis including CADx. Existing deep learning based CADx approaches, however, have a limitation in explaining the diagnostic decision. In real clinical practice, clinical decisions could be made with reasonable explanation. So current deep learning approaches in CADx are limited in real world deployment. In this paper, we investigate interpretability in CADx with the proposed interpretable CADx (ICADx) framework. The proposed framework is devised with a generative adversarial network, which consists of interpretable diagnosis network and synthetic lesion generative network to learn the relationship between malignancy and a standardized description (BI-RADS). The lesion generative network and the interpretable diagnosis network compete in an adversarial learning so that the two networks are improved. The effectiveness of the proposed method was validated on public mammogram database. Experimental results showed that the proposed ICADx framework could provide the interpretability of mass as well as mass classification. It was mainly attributed to the fact that the proposed method was effectively trained to find the relationship between malignancy and interpretations via the adversarial learning. These results imply that the proposed ICADx framework could be a promising approach to develop the CADx system.

  6. Chest radiograph interpretation by medical students

    International Nuclear Information System (INIS)

    Jeffrey, D.R.; Goddard, P.R.; Callaway, M.P.; Greenwood, R.

    2003-01-01

    AIM: To assess the ability of final year medical students to interpret conventional chest radiographs. MATERIALS AND METHODS: Ten conventional chest radiographs were selected from a teaching hospital radiology department library that were good radiological examples of common conditions. All were conditions that a medical student should be expected to recognize by the end of their training. One normal radiograph was included. The radiographs were shown to 52 final year medical students who were asked to describe their findings. RESULTS: The median score achieved was 12.5 out of 20 (range 6-18). There was no difference between the median scores of male and female students (12.5 and 12.3, respectively, p=0.82) but male students were more likely to be certain of their answers than female students (median certainty scores 23.0 and 14.0, respectively). The overall degree of certainty was low. On no radiograph were more than 25% of students definite about their answer. Students had received little formal radiology teaching (2-42 h, median 21) and few expressed an interest in radiology as a career. Only two (3.8%) students thought they were good at interpreting chest radiographs, 17 (32.7%) thought they were bad or awful. CONCLUSION: Medical students reaching the end of their training do not perform well at interpreting simple chest radiographs. They lack confidence and have received little formal radiological tuition. Perhaps as a result, few are interested in radiology as a career, which is a matter for concern in view of the current shortage of radiologists in the UK

  7. A new interpretation of chaos

    International Nuclear Information System (INIS)

    Luo Chuanwen; Wang Gang; Wang Chuncheng; Wei Junjie

    2009-01-01

    The concepts of uniform index and expectation uniform index are two mathematical descriptions of the uniformity and the mean uniformity of a finite set in a polyhedron. The concepts of instantaneous chaometry (ICM) and k step chaometry (k SCM) are introduced in order to apply the method in statistics for studying the nonlinear difference equations. It is found that k step chaometry is an indirect estimation of the expectation uniform index. The simulations illustrate that the expectation uniform index increases linearly with parameter b for the Lorenz system, but nonlinearly for Chen's system. In other words, the orbits of each system become more and more uniform as parameter b increases. Finally, a conjecture is also brought forward, which implies that chaos can be interpreted by its orbit's mean uniformity, described by the expectation uniform index and indirectly estimated by k SCM. The k SCM of the heart rate reflects the weakening and ageing process of the heart.

  8. APL interpreter on MITRA-15

    International Nuclear Information System (INIS)

    Davcev, Danco

    1975-01-01

    APL in its present forms is an ideal instrument for the establishment of logic systems since it requires no specific declaration of type or form of variables. An APL system for C II computer of the MITRA series is described, with the following minimum configuration: MITRA central unit, 16-bit 32 K word memory, disc with fixed or mobile heads, type 4013 TEKTRONIX visualisation system. The originality of our APL interpreter on MITRA 15 lies in the use of a virtual memory system with pages of 128 word size. The so-called beating process is used to set up APL operators: the selection expressions in the tables may be evaluated without any manipulation of the values. (author) [fr

  9. Interpretation of neonatal chest radiography

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Hye Kyung [Dept. of Radiology, Kangwon National University Hospital, Chuncheon (Korea, Republic of)

    2016-05-15

    Plain radiographs for infants in the neonatal intensive care unit are obtained using the portable X-ray equipment in order to evaluate the neonatal lungs and also to check the position of the tubes and catheters used for monitoring critically-ill neonates. Neonatal respiratory distress is caused by a variety of medical or surgical disease conditions. Clinical information about the gestational week, respiratory symptoms, and any events during delivery is essential for interpretation of the neonatal chest radiographs. Awareness of common chest abnormality in the prematurely born or term babies is also very important for chest evaluation in the newborn. Furthermore, knowledge about complications such as air leaks and bronchopulmonary dysplasia following treatment are required to accurately inform the clinicians. The purpose of this article was to briefly review radiographic findings of chest diseases in newborns that are relatively common in daily practice.

  10. Interpreting the cosmic ray composition

    International Nuclear Information System (INIS)

    O'C Drury, L.; Ellisson, D.C; Meyer, J.-P.

    2000-01-01

    The detailed pattern of elemental abundances in the Galactic Cosmic Rays is well determined at energies of a few GeV per nucleon. After correction for propagation effects the inferred source composition shows significant deviations from the standard pattern of Galactic elemental abundances. These deviations, surprisingly overabundances of the heavy elements relative to Hydrogen, are clearly a significant clue to the origin of the cosmic rays, but one which has proven very difficult to interpret. We have recently shown that the 'standard' model for the origin of the bulk of the Galactic cosmic rays, namely acceleration by the diffusive shock acceleration process at the strong shocks associated with supernova remnants, can quantitatively explain all features of the source composition if the acceleration occurs from a dusty interstellar medium. This success must be regarded as one of the stronger pieces of evidence in favour of the standard model

  11. Interpreting the cosmic ray composition

    Energy Technology Data Exchange (ETDEWEB)

    O' C Drury, L.; Ellisson, D.C; Meyer, J.-P

    2000-01-31

    The detailed pattern of elemental abundances in the Galactic Cosmic Rays is well determined at energies of a few GeV per nucleon. After correction for propagation effects the inferred source composition shows significant deviations from the standard pattern of Galactic elemental abundances. These deviations, surprisingly overabundances of the heavy elements relative to Hydrogen, are clearly a significant clue to the origin of the cosmic rays, but one which has proven very difficult to interpret. We have recently shown that the 'standard' model for the origin of the bulk of the Galactic cosmic rays, namely acceleration by the diffusive shock acceleration process at the strong shocks associated with supernova remnants, can quantitatively explain all features of the source composition if the acceleration occurs from a dusty interstellar medium. This success must be regarded as one of the stronger pieces of evidence in favour of the standard model.

  12. Interpretation of neonatal chest radiography

    International Nuclear Information System (INIS)

    Yoon, Hye Kyung

    2016-01-01

    Plain radiographs for infants in the neonatal intensive care unit are obtained using the portable X-ray equipment in order to evaluate the neonatal lungs and also to check the position of the tubes and catheters used for monitoring critically-ill neonates. Neonatal respiratory distress is caused by a variety of medical or surgical disease conditions. Clinical information about the gestational week, respiratory symptoms, and any events during delivery is essential for interpretation of the neonatal chest radiographs. Awareness of common chest abnormality in the prematurely born or term babies is also very important for chest evaluation in the newborn. Furthermore, knowledge about complications such as air leaks and bronchopulmonary dysplasia following treatment are required to accurately inform the clinicians. The purpose of this article was to briefly review radiographic findings of chest diseases in newborns that are relatively common in daily practice

  13. Defunctionalized Interpreters for Programming Languages

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2008-01-01

    This document illustrates how functional implementations of formal semantics (structural operational semantics, reduction semantics, small-step and big-step abstract machines, natural semantics, and denotational semantics) can be transformed into each other. These transformations were foreshadowed by Reynolds in ``Definitional Interpreters for Higher-Order Programming Languages'' for functional implementations of denotational semantics, natural semantics, and big-step abstract machines using closure conversion, CPS transformation, and defunctionalization. Over the last few years, the author and his students have further observed that functional implementations of small-step and of big-step abstract machines are related using fusion by fixed-point promotion and that functional implementations of reduction semantics and of small-step abstract machines are related using refocusing and transition
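
    For readers unfamiliar with the transformation named here, defunctionalization replaces the function values passed around by a program with first-order data constructors plus a single apply function. The toy sketch below (in Python, rather than the ML-family languages usually used in this literature) shows the general technique on a simple filter; it is not taken from the document itself.

```python
from dataclasses import dataclass
from typing import Union

# Higher-order version: the predicate is a function value.
def filter_ho(pred, xs):
    return [x for x in xs if pred(x)]

# Defunctionalized version: each lambda becomes a data constructor, and a
# single first-order 'apply' function interprets those constructors.
@dataclass
class GreaterThan:
    bound: int

@dataclass
class IsEven:
    pass

Predicate = Union[GreaterThan, IsEven]

def apply(pred: Predicate, x: int) -> bool:
    if isinstance(pred, GreaterThan):
        return x > pred.bound
    if isinstance(pred, IsEven):
        return x % 2 == 0
    raise TypeError(pred)

def filter_defun(pred: Predicate, xs):
    return [x for x in xs if apply(pred, x)]

if __name__ == "__main__":
    xs = [1, 2, 3, 4, 5, 6]
    assert filter_ho(lambda x: x > 3, xs) == filter_defun(GreaterThan(3), xs)
    assert filter_ho(lambda x: x % 2 == 0, xs) == filter_defun(IsEven(), xs)
    print("higher-order and defunctionalized filters agree")
```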

  14. Concurrent LISP and its interpreter

    Energy Technology Data Exchange (ETDEWEB)

    Tabata, K; Sugimoto, S; Ohno, Y

    1981-01-01

    In the research field of artificial intelligence many languages have been developed based on LISP, such as Planner, Conniver and so on. They have been developed to give users many useful facilities, especially for describing flexible control structures. Backtracking and coroutine facilities are typical ones introduced into these languages. Compared with backtracking and coroutine facilities, multi-process description facilities are considered to be a better alternative for writing well-structured programs. This paper describes concurrent LISP, a new concurrent programming language based on LISP. Concurrent LISP is designed to provide simple and flexible facilities for multi-process description without changing the original language features of LISP. This paper also describes the concurrent LISP interpreter which has been implemented on a FACOM M-200 at the Data Processing Center of Kyoto University. 19 references.

  15. Evolvable mathematical models: A new artificial Intelligence paradigm

    Science.gov (United States)

    Grouchy, Paul

    We develop a novel Artificial Intelligence paradigm to autonomously generate artificial agents as mathematical models of behaviour. Agent/environment inputs are mapped to agent outputs via equation trees which are evolved in a manner similar to Symbolic Regression in Genetic Programming. Equations are comprised of only the four basic mathematical operators, addition, subtraction, multiplication and division, as well as input and output variables and constants. From these operations, equations can be constructed that approximate any analytic function. These Evolvable Mathematical Models (EMMs) are tested and compared to their Artificial Neural Network (ANN) counterparts on two benchmarking tasks: the double-pole balancing without velocity information benchmark and the challenging discrete Double-T Maze experiments with homing. The results from these experiments show that EMMs are capable of solving tasks typically solved by ANNs, and that they have the ability to produce agents that demonstrate learning behaviours. To further explore the capabilities of EMMs, as well as to investigate the evolutionary origins of communication, we develop NoiseWorld, an Artificial Life simulation in which interagent communication emerges and evolves from initially noncommunicating EMM-based agents. Agents develop the capability to transmit their x and y position information over a one-dimensional channel via a complex, dialogue-based communication scheme. These evolved communication schemes are analyzed and their evolutionary trajectories examined, yielding significant insight into the emergence and subsequent evolution of cooperative communication. Evolved agents from NoiseWorld are successfully transferred onto physical robots, demonstrating the transferability of EMM-based AIs from simulation into physical reality.
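
    To make the representation more tangible, the sketch below encodes a model as an expression tree over the four basic operators, two input variables and constants, with protected division and random subtree mutation; this is a generic symbolic-regression-style construction under assumed details, not the EMM implementation from the work described above.

```python
import random

OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
    "/": lambda a, b: a / b if abs(b) > 1e-9 else 1.0,   # protected division
}

def random_tree(rng, depth=3):
    """Build a random expression tree over inputs x0, x1 and constants."""
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(["x0", "x1", rng.uniform(-1.0, 1.0)])
    op = rng.choice(list(OPS))
    return (op, random_tree(rng, depth - 1), random_tree(rng, depth - 1))

def evaluate(tree, inputs):
    """Recursively evaluate a tree for a dict of input values."""
    if isinstance(tree, tuple):
        op, left, right = tree
        return OPS[op](evaluate(left, inputs), evaluate(right, inputs))
    if isinstance(tree, str):
        return inputs[tree]
    return tree                      # numeric constant

def mutate(tree, rng, p=0.1):
    """With probability p, replace a subtree by a fresh random subtree."""
    if rng.random() < p:
        return random_tree(rng, depth=2)
    if isinstance(tree, tuple):
        op, left, right = tree
        return (op, mutate(left, rng, p), mutate(right, rng, p))
    return tree

if __name__ == "__main__":
    rng = random.Random(1)
    model = random_tree(rng)
    print(model)
    print(evaluate(model, {"x0": 0.5, "x1": -0.2}))
    print(mutate(model, rng))
```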

  16. A modular interpretation of various cubic towers

    DEFF Research Database (Denmark)

    Anbar Meidl, Nurdagül; Bassa, Alp; Beelen, Peter

    2017-01-01

    In this article we give a Drinfeld modular interpretation for various towers of function fields meeting Zink's bound.

  17. Issues related to interpretation of space imagery

    Energy Technology Data Exchange (ETDEWEB)

    Alferenok, A V; Przhiyalgovskii, Ye S

    1981-01-01

    A method is described for interpreting remotely derived data of various generalization levels (e.g. the northern section of the Chu-Sarysuiskaya basin); it suggests the use of a uniform legend for the interpretation of maps.

  18. Language production and interpretation linguistics meets cognition

    CERN Document Server

    Zeevat, Henk

    2014-01-01

    A model of production and interpretation of natural language utterances is developed which explains why communication is normally fast and successful. Interpretation is taken to be analogous with visual perception in finding the most probable hypothesis that explains the utterance.

  19. Three-Dimensional Interpretation of Sculptural Heritage with Digital and Tangible 3D Printed Replicas

    Science.gov (United States)

    Saorin, José Luis; Carbonell-Carrera, Carlos; Cantero, Jorge de la Torre; Meier, Cecile; Aleman, Drago Diaz

    2017-01-01

    Spatial interpretation features as a skill to acquire in educational curricula. The visualization and interpretation of three-dimensional objects on tactile devices, together with the possibility of digital manufacturing with 3D printers, offer an opportunity to include replicas of sculptures in teaching and, thus, facilitate the 3D interpretation of…

  20. Blackboard architecture for medical image interpretation

    Science.gov (United States)

    Davis, Darryl N.; Taylor, Christopher J.

    1991-06-01

    There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.
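
    The hypothesise-and-test control cycle of a blackboard system can be illustrated in miniature: independent knowledge sources inspect a shared blackboard and post new hypotheses until none of them can contribute anything further. The class and landmark names below are hypothetical placeholders, not components of the cephalometric system described above.

```python
class Blackboard:
    """Shared store of hypotheses that knowledge sources read and extend."""
    def __init__(self, **initial):
        self.data = dict(initial)

class LocateSkullOutline:
    def can_contribute(self, bb):
        return "image" in bb.data and "skull_outline" not in bb.data
    def contribute(self, bb):
        bb.data["skull_outline"] = f"outline extracted from {bb.data['image']}"

class HypothesiseLandmarks:
    def can_contribute(self, bb):
        return "skull_outline" in bb.data and "landmarks" not in bb.data
    def contribute(self, bb):
        bb.data["landmarks"] = ["sella", "nasion"]   # placeholder hypotheses

def control_loop(blackboard, knowledge_sources):
    """Fire any knowledge source that can contribute until none of them can."""
    progress = True
    while progress:
        progress = False
        for source in knowledge_sources:
            if source.can_contribute(blackboard):
                source.contribute(blackboard)
                progress = True
    return blackboard.data

if __name__ == "__main__":
    bb = Blackboard(image="lateral_skull_xray.png")
    # Order does not matter: the control loop keeps cycling until quiescent.
    print(control_loop(bb, [HypothesiseLandmarks(), LocateSkullOutline()]))
```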

  1. RETHINKING RESIDENTIAL MOBILITY: AN INTERDISCIPLINARY INTERPRETATION

    Directory of Open Access Journals (Sweden)

    Roderick J. Lawrence

    2008-03-01

    Since the 1950s academics and professionals have proposed a number of disciplinary and sector-based interpretations of why, when and where households move or choose to stay in the same housing unit at different periods of the life cycle and especially the family cycle. This article challenges studies that only analyse one set of factors. The article stems from a synthesis of 20 years of research by the author, who has interdisciplinary training in the broad field of people-environment relations. First, it reviews some key concepts related to human ecology, including housing, culture, identity and cultivation. Then it will consider how these concepts can be applied to interpret residential mobility using an interdisciplinary approach. An empirical case study of residential mobility in Geneva, Switzerland is presented in order to show how this approach can help improve our understanding of the motives people have regarding the wish to stay in their residence or to move elsewhere.

  2. Galileo and the Interpretation of the Bible

    Science.gov (United States)

    Carroll, William E.

    Galileo's understanding of the relationship between science and the Bible has frequently been celebrated as anticipating a modern distinction between the essentially religious nature of scripture and the claims of the natural sciences. Galileo's reference to the remarks of Cardinal Baronius, that the Bible teaches one how to go to heaven and not how the heavens go, has been seen as emblematic of his commitment to the distinction between the Book of Nature and the Book of Scripture. This essay argues that, contrary to the common view, Galileo shares with the theologians of the Inquisition the same fundamental principles of biblical interpretation: principles which include traditional scriptural hermeneutics enunciated by Augustine and Aquinas, as well as those characteristic of Counter-Reformation Catholicism. Although Galileo argues that one should not begin with biblical passages in order to discover truths about nature, he does think that the Bible contains scientific truths and that it is the function of wise interpreters to discover these truths. The dispute with the theologians of the Inquisition occurred because they thought that it was obviously true scientifically that the earth did not move and, on the basis of this view, they read the Bible as revealing the same thing. They reached this conclusion because, like Galileo, they thought that the Bible contained truths about nature. Of course, what these theologians accepted as scientifically true, Galileo denied.

  3. An Online Synchronous Test for Professional Interpreters

    Science.gov (United States)

    Chen, Nian-Shing; Ko, Leong

    2010-01-01

    This article is based on an experiment designed to conduct an interpreting test for multiple candidates online, using web-based synchronous cyber classrooms. The test model was based on the accreditation test for Professional Interpreters produced by the National Accreditation Authority of Translators and Interpreters (NAATI) in Australia.…

  4. Modular interpreters with implicit context propagation

    NARCIS (Netherlands)

    P.A. Inostroza Valdera (Pablo); T. van der Storm (Tijs)

    2017-01-01

    Modular interpreters are a crucial first step towards component-based language development: instead of writing language interpreters from scratch, they can be assembled from reusable, semantic building blocks. Unfortunately, traditional language interpreters can be hard to extend because

  5. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  6. Children's Comprehension of Metaphor: A Piagetian Interpretation

    Science.gov (United States)

    Smith, J. W. A.

    1976-01-01

    When the descriptive interpretations that sixth and eighth graders provided for metaphors selected from fifth-grade readers were examined in a Piagetian framework, the poorest interpretations showed characteristics of concrete and pre-operational thought, while the best interpretations showed characteristics of formal operational thought. (RL)

  7. Extended Smoluchowski models for interpreting relaxation phenomena in liquids

    International Nuclear Information System (INIS)

    Polimeno, A.; Frezzato, D.; Saielli, G.; Moro, G.J.; Nordio, P.L.

    1998-01-01

    Interpretation of the dynamical behaviour of single molecules or collective modes in liquids has been increasingly centered, in the last decade, on complex liquid systems, including ionic solutions, polymeric liquids, supercooled fluids and liquid crystals. This has been made necessary by the need to interpret dynamical data obtained by advanced experiments, like optical Kerr effect, time-dependent fluorescence shift experiments, two-dimensional Fourier-transform and high-field electron spin resonance, and scattering experiments like quasi-elastic neutron scattering. This communication is centered on the definition, treatment and application of several extended stochastic models, which have proved to be very effective tools for interpreting and rationalizing complex relaxation phenomena in liquids. First, applications of standard Fokker-Planck equations for the orientational relaxation of molecules in isotropic and ordered liquid phases are reviewed. In particular, attention will be focused on the interpretation of neutron scattering in nematics. Next, an extended stochastic model is used to interpret time-domain resolved fluorescence emission experiments. A two-body stochastic model allows the theoretical interpretation of dynamical Stokes shift effects in fluorescence emission spectra, performed on probes in isotropic and ordered polar phases. Finally, for the case of isotropic fluids made of small rigid molecules, a very detailed model is considered, which includes as basic ingredients a Fokker-Planck description of the molecular vibrational motion and the slow diffusive motion of a persistent cage structure together with the decay processes related to the changing structure of the cage. (author)

  8. Momentum conservation decides Heisenberg's interpretation of the uncertainty formulas

    International Nuclear Information System (INIS)

    Angelidis, T.D.

    1977-01-01

    In the light of Heisenberg's interpretation of the uncertainty formulas, the conditions necessary for the derivation of the quantitative statement or law of momentum conservation are considered. The result of such considerations is a contradiction between the formalism of quantum physics and the asserted consequences of Heisenberg's interpretation. This contradiction decides against Heisenberg's interpretation of the uncertainty formulas on upholding that the formalism of quantum physics is both consistent and complete, at least insofar as the statement of momentum conservation can be proved within this formalism. A few comments are also included on Bohr's complementarity interpretation of the formalism of quantum physics. A suggestion, based on a statistical mode of empirical testing of the uncertainty formulas, does not give rise to any such contradiction

  9. Interpreting values in the daily practices of Nordic preschools

    DEFF Research Database (Denmark)

    Broström, Stig; Anna-Maija, Puriola; Johannesson, Eva Marianne

    2016-01-01

    This study explored how practitioners interpreted educational practices from the perspective of values in Nordic preschools. Drawing data from group interviews in five Nordic countries (Denmark, Finland, Iceland, Norway and Sweden), practitioners reflected on an observational episode about children dressing for outdoor play in a Swedish preschool. The research material consisted of extracts from group interviews in ten preschools (two from each Nordic country). The research questions included: How do values emerge in practitioners' interpretations? What is the interpretive process like ... and the co-construction of interpretations in the group dialogues? The practitioners employed indirect means more often than direct means to express their values. The group interviews contained themes that were connected to caring, disciplinary, competence and democratic values. The study provided evidence

  10. An evolving user-oriented model of Internet health information seeking.

    Science.gov (United States)

    Gaie, Martha J

    2006-01-01

    This paper presents an evolving user-oriented model of Internet health information seeking (IS) based on qualitative data collected from 22 lung cancer (LC) patients and caregivers. This evolving model represents information search behavior as more highly individualized, complex, and dynamic than previous models, including pre-search psychological activity, use of multiple heuristics throughout the process, and cost-benefit evaluation of search results. This study's findings suggest that IS occurs in four distinct phases: search initiation/continuation, selective exposure, message processing, and message evaluation. The identification of these phases and the heuristics used within them suggests a higher order of complexity in the decision-making processes that underlie IS, which could lead to the development of a conceptual framework that more closely reflects the complex nature of contextualized IS. It also illustrates the advantages of using qualitative methods to extract more subtle details of the IS process and fill in the gaps in existing models.

  11. (Inter)Temporal Considerations in the Interpretative Process of the VCLT : Do Treaties Endure, Perdure or Exdure?

    NARCIS (Netherlands)

    Merkouris, Panos

    2014-01-01

    When interpreted, sometimes treaties have to go through a trial by fire and are found either to be ‘living instruments’ evolving alongside the relevant changes both in law and in facts or to have a ‘fixed’ meaning. The aim of the present article is to examine how temporal considerations find their

  12. Climate in Context - How partnerships evolve in regions

    Science.gov (United States)

    Parris, A. S.

    2014-12-01

    In 2015, NOAA's RISA program will celebrate its 20th year of exploration in the development of usable climate information. In the mid-1990s, a vision emerged to develop interdisciplinary research efforts at the regional scale for several important reasons. Recognizable climate patterns, such as the El Nino Southern Oscillation (ENSO), emerge at the regional level where our understanding of observations and models coalesce. Critical resources for society are managed in a context of regional systems, such as water supply and human populations. Multiple scales of governance (local, state, and federal) with complex institutional relationships can be examined across a region. Climate information (i.e. data, science, research etc) developed within these contexts has greater potential for use. All of this work rests on a foundation of iterative engagement between scientists and decision makers. Throughout these interactions, RISAs have navigated diverse politics, extreme events and disasters, socio-economic and ecological disruptions, and advances in both science and technology. Our understanding of information needs is evolving into a richer understanding of complex institutional, legal, political, and cultural contexts within which people can use science to make informed decisions. The outcome of RISA work includes both cases where climate information was used in decisions and cases where capacity for using climate information and making climate resilient decisions has increased over time. In addition to balancing supply and demand of scientific information, RISAs are engaged in a social process of reconciling climate information use with important drivers of society. Because partnerships are critical for sustained engagement, and because engagement is critically important to the use of science, the rapid development of new capacity in regionally-based science programs focused on providing climate decision support is both needed and challenging. New actors can bolster

  13. Resident physicians' opinions and behaviors regarding the use of interpreters in New Orleans.

    Science.gov (United States)

    Sandler, Rachel; Myers, Leann; Springgate, Benjamin

    2014-11-01

    In academic medical centers, resident physicians are most involved in the care of patients, yet many have little training in the proper use of interpreters in the care of patients with limited English-language proficiency. Residents have cited lack of time and lack of access to trained medical interpreters as barriers to the use of professional interpreter services. The purpose of this study was to examine the usage patterns of interpreters and perceived barriers to using interpreters in New Orleans. Subjects included resident physicians training in internal medicine, pediatrics, and combined internal medicine and pediatrics at Tulane University and Louisiana State University in New Orleans. A survey that consisted of demographics, short-answer, and Likert-scale questions regarding attitudes related to the use of interpreters was used as the metric. The overall response rate was 55.5%. A total of 92.4% of subjects surveyed stated that they had used an interpreter during their residency. Telephone services and family members were the most commonly used types of interpreters (41.3% and 30.5%, respectively). Resident physicians were most likely to use interpreter services during their initial history taking as well as at discharge, but use declined throughout patients' hospitalization. Resident physicians in New Orleans have experience using interpreter services; however, they continue to use untrained interpreters and use varies during the hospital encounter. Targeted training for residents, including interpreter logistics, may help increase the use of interpreters.

  14. Musculoskeletal ultrasound including definitions for ultrasonographic pathology

    DEFF Research Database (Denmark)

    Wakefield, RJ; Balint, PV; Szkudlarek, Marcin

    2005-01-01

    Ultrasound (US) has great potential as an outcome in rheumatoid arthritis trials for detecting bone erosions, synovitis, tendon disease, and enthesopathy. It has a number of distinct advantages over magnetic resonance imaging, including good patient tolerability and ability to scan multiple joints...... in a short period of time. However, there are scarce data regarding its validity, reproducibility, and responsiveness to change, making interpretation and comparison of studies difficult. In particular, there are limited data describing standardized scanning methodology and standardized definitions of US...... pathologies. This article presents the first report from the OMERACT ultrasound special interest group, which has compared US against the criteria of the OMERACT filter. Also proposed for the first time are consensus US definitions for common pathological lesions seen in patients with inflammatory arthritis....

  15. Langevin simulations of QCD, including fermions

    International Nuclear Information System (INIS)

    Kronfeld, A.S.

    1986-02-01

    We encounter critical slow down in updating when ξ/a → ∞ and in matrix inversion (needed to include fermions) when m_q a → 0. A simulation that purports to solve QCD numerically will encounter these limits, so to face the challenge in the title of this workshop, we must cure the disease of critical slow down. Physically, this critical slow down is due to the reluctance of changes at short distances to propagate to large distances. Numerically, the stability of an algorithm at short wavelengths requires a (moderately) small step size; critical slow down occurs when the effective long wavelength step size becomes tiny. The remedy for this disease is an algorithm that propagates signals quickly throughout the system; i.e. one whose effective step size is not reduced for the long wavelength components of the fields. (Here the effective "step size" is essentially an inverse decorrelation time.) To do so one must resolve various wavelengths of the system and modify the dynamics (in CPU time) of the simulation so that all modes evolve at roughly the same rate. This can be achieved by introducing Fourier transforms. I show how to implement Fourier acceleration for Langevin updating and for conjugate gradient matrix inversion. The crucial feature of these algorithms that lends them to Fourier acceleration is that they update the lattice globally; hence the Fourier transforms are computed once per sweep rather than once per hit. (orig./HSI)
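
    A minimal sketch of the Fourier-acceleration idea for Langevin updating, using a free one-dimensional lattice scalar field rather than QCD: each Fourier mode is evolved with its own effective step size so that long- and short-wavelength components decorrelate at comparable rates. Lattice size, mass and step size are arbitrary illustrative choices, and the update is only exact in the small-step limit.

```python
import numpy as np

def fourier_accelerated_langevin(n_sites=64, mass=0.1, eps=0.05,
                                 n_steps=5000, seed=0):
    """Langevin evolution of a free 1-D lattice scalar field with the step
    size rescaled mode by mode, so that all wavelengths evolve at roughly
    the same rate."""
    rng = np.random.default_rng(seed)
    k = 2.0 * np.pi * np.fft.rfftfreq(n_sites)
    khat2 = 2.0 - 2.0 * np.cos(k)            # lattice momentum squared
    lam = khat2 + mass ** 2                  # stiffness of each mode
    eps_k = eps * lam.max() / lam            # accelerated, mode-dependent step
    phi = np.zeros(n_sites)
    for _ in range(n_steps):
        eta = rng.standard_normal(n_sites)   # real-space Gaussian noise
        phik = np.fft.rfft(phi)
        etak = np.fft.rfft(eta)
        phik += -eps_k * lam * phik + np.sqrt(2.0 * eps_k) * etak
        phi = np.fft.irfft(phik, n=n_sites)
    return phi

if __name__ == "__main__":
    field = fourier_accelerated_langevin()
    print("sample variance of the thermalized field:", field.var())
```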

  16. A robust interpretation of duration calculus

    DEFF Research Database (Denmark)

    Franzle, M.; Hansen, Michael Reichhardt

    2005-01-01

    We transfer the concept of robust interpretation from arithmetic first-order theories to metric-time temporal logics. The idea is that the interpretation of a formula is robust iff its truth value does not change under small variation of the constants in the formula. Exemplifying this on Duration Calculus (DC), our findings are that the robust interpretation of DC is equivalent to a multi-valued interpretation that uses the real numbers as semantic domain and assigns Lipschitz-continuous interpretations to all operators of DC. Furthermore, this continuity permits approximation between discrete...

  17. Interpreting Electromagnetic Reflections In Glaciology

    Science.gov (United States)

    Eisen, O.; Nixdorf, U.; Wilhelms, F.; Steinhage, D.; Miller, H.

    Electromagnetic reflection (EMR) measurements are active remote sensing methods that have become a major tool for glaciological investigations. Although the basic processes are well understood, the unambiguous interpretation of EMR data, especially internal layering, still requires further information. The Antarctic ice sheet provides a unique setting for investigating the relation between physical-chemical properties of ice and EMR data. Cold ice, smooth surface topography, and low accumulation make it feasible to use low-energy ground penetrating radar (GPR) devices to penetrate several tens to hundreds of meters of ice, covering several thousands of years of snow deposition history. Thus, sufficient internal layers, primarily of volcanic origin, are recorded to enable studies on a local and regional scale. Based on dated ice core records, GPR measurements at various frequencies, and airborne radio-echo sounding (RES) from Dronning Maud Land (DML), Antarctica, combined with numerical modeling techniques, we investigate the influence of internal layering characteristics and properties of the propagating electromagnetic wave on EMR data.

  18. Guide to Magellan image interpretation

    Science.gov (United States)

    Ford, John P.; Plaut, Jeffrey J.; Weitz, Catherine M.; Farr, Tom G.; Senske, David A.; Stofan, Ellen R.; Michaels, Gregory; Parker, Timothy J.; Fulton, D. (Editor)

    1993-01-01

    An overview of Magellan Mission requirements, radar system characteristics, and methods of data collection is followed by a description of the image data, mosaic formats, areal coverage, resolution, and pixel DN-to-dB conversion. The availability and sources of image data are outlined. Applications of the altimeter data to estimate relief, Fresnel reflectivity, and surface slope, and the radiometer data to derive microwave emissivity are summarized and illustrated in conjunction with corresponding SAR image data. Same-side and opposite-side stereo images provide examples of parallax differences from which to measure relief with a lateral resolution many times greater than that of the altimeter. Basic radar interactions with geologic surfaces are discussed with respect to radar-imaging geometry, surface roughness, backscatter modeling, and dielectric constant. Techniques are described for interpreting the geomorphology and surface properties of surficial features, impact craters, tectonically deformed terrain, and volcanic landforms. The morphologic characteristics that distinguish impact craters from volcanic craters are defined. Criteria for discriminating extensional and compressional origins of tectonic features are discussed. Volcanic edifices, constructs, and lava channels are readily identified from their radar outlines in images. Geologic map units are identified on the basis of surface texture, image brightness, pattern, and morphology. Superposition, cross-cutting relations, and areal distribution of the units serve to elucidate the geologic history.

  19. Monitoring and interpreting bioremediation effectiveness

    International Nuclear Information System (INIS)

    Bragg, J.R.; Prince, R.C.; Harner, J.; Atlas, R.M.

    1993-01-01

    Following the Exxon Valdez oil spill in 1989, extensive research was conducted by the US Environmental Protection Agency and Exxon to develop and implement bioremediation techniques for oil spill cleanup. A key challenge of this program was to develop effective methods for monitoring and interpreting bioremediation effectiveness on extremely heterogeneous intertidal shorelines. Fertilizers were applied to shorelines at concentrations known to be safe, and the effectiveness achieved in accelerating biodegradation of oil residues was measured using several techniques. This paper describes the most definitive method identified, which monitors biodegradation loss by measuring changes in ratios of hydrocarbons to hopane, a cycloalkane present in the oil that showed no measurable degradation. Rates of loss measured by the hopane ratio method have high levels of statistical confidence, and show that the fertilizer addition stimulated biodegradation rates as much as fivefold. Multiple regression analyses of data show that fertilizer addition of nitrogen in interstitial pore water per unit of oil load was the most important parameter affecting biodegradation rate, and results suggest that monitoring nitrogen concentrations in the subsurface pore water is the preferred technique for determining fertilizer dosage and reapplication frequency
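
    The hopane normalisation described above reduces to a simple calculation: because hopane is essentially not degraded, the fractional loss of any hydrocarbon analyte can be estimated by comparing its analyte-to-hopane ratio in a weathered sample with the ratio in the source oil. The concentrations in the sketch are made-up values used only to show the arithmetic.

```python
def percent_depletion(analyte_sample, hopane_sample, analyte_source, hopane_source):
    """Loss of an analyte relative to conserved hopane:
    1 - (analyte/hopane)_sample / (analyte/hopane)_source, in percent."""
    ratio_sample = analyte_sample / hopane_sample
    ratio_source = analyte_source / hopane_source
    return 100.0 * (1.0 - ratio_sample / ratio_source)

if __name__ == "__main__":
    # Hypothetical GC-MS concentrations (mg analyte per kg of extract).
    loss = percent_depletion(analyte_sample=12.0, hopane_sample=4.0,
                             analyte_source=90.0, hopane_source=6.0)
    print(f"estimated biodegradation loss: {loss:.1f} %")   # 80.0 %
```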

  20. Equivalent statistics and data interpretation.

    Science.gov (United States)

    Francis, Gregory

    2017-08-01

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
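
    The equivalence claimed for the two-sample t test is easy to verify numerically: once the sample sizes are fixed, the p value and Cohen's d are both algebraic functions of the t statistic (d = t * sqrt(1/n1 + 1/n2)), so they summarise the same information; the JZS Bayes factor is likewise a function of (t, n1, n2) but is omitted below. A minimal sketch using SciPy, assuming the equal-variance test:

```python
import numpy as np
from scipy import stats

def equivalent_summaries(t_value, n1, n2):
    """Convert a two-sample (equal-variance) t statistic into the two-sided
    p value and Cohen's d; all three depend only on (t, n1, n2)."""
    df = n1 + n2 - 2
    p_value = 2.0 * stats.t.sf(abs(t_value), df)
    cohens_d = t_value * np.sqrt(1.0 / n1 + 1.0 / n2)
    return p_value, cohens_d

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a, b = rng.normal(0.0, 1.0, 30), rng.normal(0.5, 1.0, 30)
    t_value, p_direct = stats.ttest_ind(a, b)
    p_from_t, d_from_t = equivalent_summaries(t_value, len(a), len(b))
    print(p_direct, p_from_t, d_from_t)   # the two p values coincide
```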

  1. Clustering impact regime with shocks in freely evolving granular gas

    Science.gov (United States)

    Isobe, Masaharu

    2017-06-01

    A freely cooling granular gas without any external force evolves from an initially homogeneous state to an inhomogeneous clustering state, at which point the energy decay deviates from Haff's law. The asymptotic behavior of the energy in the inelastic hard sphere model has been predicted by several theories, based either on mode coupling theory or on extensions of the inelastic hard rod gas. In this study, we revisited the clustering regime of a freely evolving granular gas via large-scale molecular dynamics simulations with up to 16.7 million inelastic hard disks. We found a novel regime involving collisions between "clusters" that appear spontaneously after the clustering regime sets in, and which can only be identified in systems of more than a few million particles. Volumetric dilatation patterns of semicircular shape, originating from density shock propagation, characterize the appearance of "cluster impact" during the aggregation process of the clusters.
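
    For reference, Haff's law for the homogeneous cooling state of an inelastic hard-particle gas predicts an algebraic energy decay; the clustering regime discussed above is identified by the departure from this form. With an initial energy $E_0$ and a characteristic cooling time $\tau_0$ (notation assumed here, not taken from the paper), the law reads

    $$E(t) = \frac{E_0}{\left(1 + t/\tau_0\right)^{2}}.$$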

  2. Finding evolved stars in the inner Galactic disk with Gaia

    Science.gov (United States)

    Quiroga-Nuñez, L. H.; van Langevelde, H. J.; Pihlström, Y. M.; Sjouwerman, L. O.; Brown, A. G. A.

    2018-04-01

    The Bulge Asymmetries and Dynamical Evolution (BAaDE) survey will provide positions and line-of-sight velocities of ~20,000 evolved, maser-bearing stars in the Galactic plane. Although this Galactic region is affected by optical extinction, BAaDE targets may have Gaia cross-matches, eventually providing additional stellar information. In an initial attempt to cross-match BAaDE targets with Gaia, we have found more than 5,000 candidates. Of these, we may expect half to show SiO emission, which will allow us to obtain velocity information. The cross-match is being refined to avoid false positives using different criteria based on distance analysis, flux variability, and color assessment in the mid- and near-IR. Once the cross-matches can be confirmed, we will have a unique sample to characterize the stellar population of evolved stars in the Galactic bulge, which can be considered fossils of the Milky Way formation.

  3. Self-regulating and self-evolving particle swarm optimizer

    Science.gov (United States)

    Wang, Hui-Min; Qiao, Zhao-Wei; Xia, Chang-Liang; Li, Liang-Yu

    2015-01-01

    In this article, a novel self-regulating and self-evolving particle swarm optimizer (SSPSO) is proposed. Learning from the idea of direction reversal, self-regulating behaviour is a modified position update rule for particles, according to which the algorithm improves the best position to accelerate convergence in situations where the traditional update rule does not work. Borrowing the idea of mutation from evolutionary computation, self-evolving behaviour acts on the current best particle in the swarm to prevent the algorithm from prematurely converging. The performance of SSPSO and four other improved particle swarm optimizers is numerically evaluated by unimodal, multimodal and rotated multimodal benchmark functions. The effectiveness of SSPSO in solving real-world problems is shown by the magnetic optimization of a Halbach-based permanent magnet machine. The results show that SSPSO has good convergence performance and high reliability, and is well matched to actual problems.
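
    The two behaviours can be pictured with a bare-bones particle swarm loop: a standard velocity/position update for the swarm, plus an occasional mutation applied to the current global best so the swarm does not converge prematurely. The sketch below is only a generic illustration of these ideas under assumed update rules and parameters; it is not the SSPSO algorithm itself, whose specific self-regulating position rule is defined in the paper.

    ```python
    import numpy as np

    def sphere(x):
        return float(np.sum(x**2))  # simple unimodal test function

    def illustrative_pso(f, dim=10, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
                         mutation_scale=0.1, seed=0):
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5, 5, (n_particles, dim))            # positions
        v = np.zeros_like(x)                                  # velocities
        pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
        i0 = int(np.argmin(pbest_f))
        g, g_f = pbest[i0].copy(), pbest_f[i0]                # global best and its value

        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = x + v
            fx = np.array([f(p) for p in x])
            improved = fx < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], fx[improved]
            i_best = int(np.argmin(pbest_f))
            if pbest_f[i_best] < g_f:
                g, g_f = pbest[i_best].copy(), pbest_f[i_best]

            # "Self-evolving"-style step (illustrative): mutate the global best and keep
            # the mutant only if it improves the objective.
            mutant = g + rng.normal(0, mutation_scale, dim)
            m_f = f(mutant)
            if m_f < g_f:
                g, g_f = mutant, m_f
        return g, g_f

    best, best_f = illustrative_pso(sphere)
    print(best_f)
    ```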

  4. AUTOMOTIVE APPLICATIONS OF EVOLVING TAKAGI-SUGENO-KANG FUZZY MODELS

    Directory of Open Access Journals (Sweden)

    Radu-Emil Precup

    2017-08-01

    This paper presents theoretical and application results concerning the development of evolving Takagi-Sugeno-Kang fuzzy models for two dynamic systems, viewed as controlled processes, in the field of automotive applications. The two dynamic systems are the nonlinear dynamics of the longitudinal slip in Anti-lock Braking Systems (ABS) and of the vehicle speed in vehicles with Continuously Variable Transmission (CVT) systems. The evolving Takagi-Sugeno-Kang fuzzy models are obtained as discrete-time fuzzy models by incremental online identification algorithms. The fuzzy models are validated against experimental results in the case of the ABS and against first-principles simulation results in the case of the vehicle with the CVT.

  5. Evolvability of thermophilic proteins from archaea and bacteria.

    Science.gov (United States)

    Takano, Kazufumi; Aoi, Atsushi; Koga, Yuichi; Kanaya, Shigenori

    2013-07-16

    Proteins from thermophiles possess high thermostability. The stabilization mechanisms differ between archaeal and bacterial proteins, whereby archaeal proteins are mainly stabilized via hydrophobic interactions and bacterial proteins by ion pairs. High stability is an important factor in promoting protein evolution, but the precise means by which different stabilization mechanisms affect the evolution process remain unclear. In this study, we investigated a random mutational drift of esterases from thermophilic archaea and bacteria at high temperatures. Our results indicate that mutations in archaeal proteins lead to improved function with no loss of stability, while mutant bacterial proteins are largely destabilized with decreased activity at high temperatures. On the basis of these findings, we suggest that archaeal proteins possess higher "evolvability" than bacterial proteins under temperature selection and are additionally able to evolve into eukaryotic proteins.

  6. Real-time evolvable pulse shaper for radiation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Lanchares, Juan, E-mail: julandan@dacya.ucm.es [Facultad de Informática, Universidad Complutense de Madrid (UCM), C/Prof. José García Santesmases s/n, 28040 Madrid (Spain); Garnica, Oscar, E-mail: ogarnica@dacya.ucm.es [Facultad de Informática, Universidad Complutense de Madrid (UCM), C/Prof. José García Santesmases s/n, 28040 Madrid (Spain); Risco-Martín, José L., E-mail: jlrisco@dacya.ucm.es [Facultad de Informática, Universidad Complutense de Madrid (UCM), C/Prof. José García Santesmases s/n, 28040 Madrid (Spain); Ignacio Hidalgo, J., E-mail: hidalgo@dacya.ucm.es [Facultad de Informática, Universidad Complutense de Madrid (UCM), C/Prof. José García Santesmases s/n, 28040 Madrid (Spain); Regadío, Alberto, E-mail: alberto.regadio@insa.es [Área de Tecnologías Electrónicas, Instituto Nacional de Técnica Aeroespacial (INTA), 28850 Torrejón de Ardoz, Madrid (Spain)

    2013-11-01

    In the last two decades, recursive algorithms for real-time digital pulse shaping in pulse height measurements have been developed and published in a number of articles and textbooks. All these algorithms try to synthesize, in real time, optimum or near-optimum shapes in the presence of noise. Even though some of these shapers can be considered effective designs, side effects such as aging cannot be ignored: after sensor degradation, the signal obtained is no longer valid. In this regard, we present in this paper a novel technique that, based on evolvable hardware concepts, is able to evolve the degraded shaper into a new design with better performance than the original one under the new sensor characteristics.
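
    For context, a widely used family of real-time shapers produces a trapezoidal weighting of the digitized detector signal; one simple way to see the idea is a delayed difference (which turns a step into a rectangle) followed by a moving sum (which turns the rectangle into a trapezoid). The sketch below is a minimal offline illustration of that principle under assumed window lengths; it is not the evolvable-hardware shaper proposed in the paper.

    ```python
    import numpy as np

    def moving_sum(x, length):
        """Running sum over a window of `length` samples (recursive form: add new, drop old)."""
        out = np.zeros(len(x))
        acc = 0.0
        for n, v in enumerate(x):
            acc += v
            if n >= length:
                acc -= x[n - length]
            out[n] = acc
        return out

    def trapezoidal_shape(x, k=20, m=30):
        """Delayed difference x(n) - x(n-k) followed by an m-sample moving sum.

        For a step input this yields a trapezoid with a rise of min(k, m) samples and a
        flat top of |k - m| samples; the output is normalized so the flat top is ~1.
        """
        diff = np.zeros(len(x))
        diff[:k] = x[:k]              # samples before the delay is available pass through
        diff[k:] = x[k:] - x[:-k]     # x(n) - x(n-k)
        return moving_sum(diff, m) / min(k, m)

    # Toy input: a noisy step standing in for an (idealized) preamplifier signal.
    rng = np.random.default_rng(1)
    signal = np.concatenate([np.zeros(100), np.ones(300)]) + rng.normal(0, 0.05, 400)
    print(trapezoidal_shape(signal).max())   # ~1 at the flat top
    ```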

  7. Programming adaptive control to evolve increased metabolite production.

    Science.gov (United States)

    Chou, Howard H; Keasling, Jay D

    2013-01-01

    The complexity inherent in biological systems challenges efforts to rationally engineer novel phenotypes, especially those not amenable to high-throughput screens and selections. In nature, increased mutation rates generate diversity in a population that can lead to the evolution of new phenotypes. Here we construct an adaptive control system that increases the mutation rate in order to generate diversity in the population, and decreases the mutation rate as the concentration of a target metabolite increases. This system is called feedback-regulated evolution of phenotype (FREP), and is implemented with a sensor to gauge the concentration of a metabolite and an actuator to alter the mutation rate. To evolve certain novel traits that have no known natural sensors, we develop a framework to assemble synthetic transcription factors using metabolic enzymes and construct four different sensors that recognize isopentenyl diphosphate in bacteria and yeast. We verify FREP by evolving increased tyrosine and isoprenoid production.

  8. Evolving approaches to the ethical management of genomic data.

    Science.gov (United States)

    McEwen, Jean E; Boyer, Joy T; Sun, Kathie Y

    2013-06-01

    The ethical landscape in the field of genomics is rapidly shifting. Plummeting sequencing costs, along with ongoing advances in bioinformatics, now make it possible to generate an enormous volume of genomic data about vast numbers of people. The informational richness, complexity, and frequently uncertain meaning of these data, coupled with evolving norms surrounding the sharing of data and samples and persistent privacy concerns, have generated a range of approaches to the ethical management of genomic information. As calls increase for the expanded use of broad or even open consent, and as controversy grows about how best to handle incidental genomic findings, these approaches, informed by normative analysis and empirical data, will continue to evolve alongside the science. Published by Elsevier Ltd.

  9. Who can monitor the court interpreter's performance?

    DEFF Research Database (Denmark)

    Martinsen, Bodil

    2009-01-01

    Who can monitor the court interpreter's performance? Results of a case study. This paper presents the results of a case study of an unusual interpreting event in a Danish courtroom setting. During the trial, the interpreter's non-normative performance was explicitly criticised by the audience, and the conflict about her competence was negotiated. Because of this unusual constellation, combined with a multi-method approach, this single case study can shed some light on the question of the participants' ability to monitor the interpreter's performance. Legal professional users of interpreters tend ... are far less transparent for the legal participants than they normally assume. This problem, in turn, stresses the importance of a) the interpreter's competence and self-awareness and b) the use of check interpreters.

  10. Evolving Robot Controllers for Structured Environments Through Environment Decomposition

    DEFF Research Database (Denmark)

    Moreno, Rodrigo; Faiña, Andres; Støy, Kasper

    2015-01-01

    In this paper we aim to develop a controller that allows a robot to traverse a structured environment. The approach we use is to decompose the environment into simple sub-environments that we use as the basis for evolving the controller. Specifically, we decompose a narrow corridor environment ... environments and that the order in which the decomposed sub-environments are presented in sequence impacts the performance of the evolutionary algorithm.

  11. The Evolving Importance of Banks and Securities Markets

    OpenAIRE

    Demirguc-Kunt, Asli; Feyen, Erik; Levine, Ross

    2011-01-01

    The roles of banks and securities markets evolve during the process of economic development. As countries develop economically, (1) the size of both banks and securities markets increases relative to the size of the economy, (2) the association between an increase in economic output and an increase in bank development becomes smaller, and (3) the association between an increase in economic output and an increase in securities market development becomes larger. These findings are consistent wi...

  12. A novel evolving scale-free model with tunable attractiveness

    International Nuclear Information System (INIS)

    Xuan, Liu; Tian-Qi, Liu; Xing-Yuan, Li; Hao, Wang

    2010-01-01

    In this paper, a new evolving model with tunable attractiveness is presented. Based on the Barabási–Albert (BA) model, we introduce an attractiveness of each node which can change with node degree. Using mean-field theory, we obtain the analytical expression of the power-law degree distribution with exponent γ in (3, ∞). The new model is more homogeneous and has a lower clustering coefficient and a larger average path length than the BA model. (general)
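
    The flavour of such a model can be captured in a few lines of growth-by-preferential-attachment code in which each new node connects to existing nodes with probability proportional to degree plus an attractiveness term. In the sketch below the attractiveness is an arbitrary function of degree chosen purely for illustration; the specific degree-dependent form and the resulting exponent are the subject of the paper, and the `attractiveness` callable here is a hypothetical stand-in.

    ```python
    import random

    def grow_network(n_nodes=1000, m=2, attractiveness=lambda k: 1.0, seed=0):
        """Grow a BA-style network in which a new node attaches to node i with probability
        proportional to (degree_i + attractiveness(degree_i))."""
        random.seed(seed)
        degree = {0: 1, 1: 1}          # start from a single edge between nodes 0 and 1
        edges = [(0, 1)]
        for new in range(2, n_nodes):
            nodes = list(degree)
            weights = [degree[i] + attractiveness(degree[i]) for i in nodes]
            targets = set()
            while len(targets) < min(m, len(nodes)):
                targets.add(random.choices(nodes, weights=weights)[0])
            for t in targets:
                edges.append((new, t))
                degree[t] += 1
            degree[new] = len(targets)
        return degree, edges

    deg, _ = grow_network(attractiveness=lambda k: 2.0 / (1 + k))  # hypothetical degree-dependent form
    print(max(deg.values()))   # hub degree; heavy-tailed degree distributions produce large hubs
    ```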

  13. india's northward drift and collision with asia: evolving faunal response

    Indian Academy of Sciences (India)

    INDIA'S NORTHWARD DRIFT AND COLLISION WITH ASIA: EVOLVING FAUNAL RESPONSE (presentation, 24 slides).

  14. Reconstructing the Morphology of an Evolving Coronal Mass Ejection

    Science.gov (United States)

    2009-01-01

    Reconstructing the Morphology of an Evolving Coronal Mass Ejection. B. E. Wood, R. A. Howard, D. G. Socker, Naval Research Laboratory, Space Science ... mission, we empirically reconstruct the time-dependent three-dimensional morphology of a coronal mass ejection (CME) from 2008 June 1, which exhibits ...

  15. Analysis of motility in multicellular Chlamydomonas reinhardtii evolved under predation.

    Directory of Open Access Journals (Sweden)

    Margrethe Boyd

    The advent of multicellularity was a watershed event in the history of life, yet the transition from unicellularity to multicellularity is not well understood. Multicellularity opens up opportunities for innovations in intercellular communication, cooperation, and specialization, which can provide selective advantages under certain ecological conditions. The unicellular alga Chlamydomonas reinhardtii has never had a multicellular ancestor, yet it is closely related to the volvocine algae, a clade containing taxa that range from simple unicells to large, specialized multicellular colonies. Simple multicellular structures have been observed to evolve in C. reinhardtii in response to predation or to settling rate-based selection. Structures formed in response to predation consist of individual cells confined within a shared transparent extracellular matrix. Evolved isolates form such structures obligately under culture conditions in which their wild-type ancestors do not, indicating that the newly evolved multicellularity is heritable. C. reinhardtii is capable of photosynthesis, and possesses an eyespot and two flagella with which it moves towards or away from light in order to optimize input of radiant energy. Motility contributes to C. reinhardtii fitness because it allows cells or colonies to achieve this optimum. Utilizing phototaxis to assay motility, we determined that newly evolved multicellular strains do not exhibit significant directional movement, even though the flagella of their constituent unicells are present and active. In C. reinhardtii the first steps towards multicellularity in response to predation appear to result in a trade-off between motility and differential survivorship, a trade-off that must be overcome by further genetic change to ensure long-term success of the new multicellular organism.

  16. The evolving role of information technology in internal auditing

    OpenAIRE

    2015-01-01

    M.Com. (Computer Auditing) Modern organizations are increasingly dependent on information technology (IT) for various reasons: to enhance their operational efficiency, reduce costs or even attain a competitive advantage. The role of information technology in the organization continues to evolve, and this has an impact on the internal audit functions that serve these organizations. The study investigated whether the King III report, ISACA standards and IIA standards assist the internal audi...

  17. The evolving role of paramedics - a NICE problem to have?

    Science.gov (United States)

    Eaton, Georgette; Mahtani, Kamal; Catterall, Matt

    2018-07-01

    This short essay supports the growing role of paramedics in the clinical and academic workforce. We present a commentary of recent draft consultations by the National Institute for Health and Care Excellence in England that set out how the role of paramedics may be evolving to assist with the changing demands on the clinical workforce. Using these consultations as a basis, we extend their recommendations and suggest that the profession should also lead the academically driven evaluation of these new roles.

  18. A Genealogical Interpretation of Principal Components Analysis

    Science.gov (United States)

    McVean, Gil

    2009-01-01

    Principal components analysis, PCA, is a statistical method commonly used in population genetics to identify structure in the distribution of genetic variation across geographical location and ethnic background. However, while the method is often used to inform about historical demographic processes, little is known about the relationship between fundamental demographic parameters and the projection of samples onto the primary axes. Here I show that for SNP data the projection of samples onto the principal components can be obtained directly from considering the average coalescent times between pairs of haploid genomes. The result provides a framework for interpreting PCA projections in terms of underlying processes, including migration, geographical isolation, and admixture. I also demonstrate a link between PCA and Wright's fst and show that SNP ascertainment has a largely simple and predictable effect on the projection of samples. Using examples from human genetics, I discuss the application of these results to empirical data and the implications for inference. PMID:19834557
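
    In practice, the projections discussed here are obtained from a centered genotype matrix by an eigendecomposition or singular value decomposition; a minimal sketch of that computation (on simulated 0/1/2 genotype calls rather than real data) is given below.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    genotypes = rng.integers(0, 3, size=(100, 5000)).astype(float)   # samples x SNPs, 0/1/2 calls

    # Center each SNP (column); scaling by an allele-frequency-based variance is often added.
    centered = genotypes - genotypes.mean(axis=0)

    # SVD of the centered matrix: the rows of U*S are the sample projections onto the PCs.
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    projections = U * S
    print(projections[:3, :2])   # first three samples on the first two principal axes
    ```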

  19. A search for radio emission from exoplanets around evolved stars

    Science.gov (United States)

    O'Gorman, E.; Coughlan, C. P.; Vlemmings, W.; Varenius, E.; Sirothia, S.; Ray, T. P.; Olofsson, H.

    2018-04-01

    The majority of searches for radio emission from exoplanets have to date focused on short period planets, i.e., the so-called hot Jupiter type planets. However, these planets are likely to be tidally locked to their host stars and may not generate sufficiently strong magnetic fields to emit electron cyclotron maser emission at the low frequencies used in observations (typically ≥150 MHz). In comparison, the large mass-loss rates of evolved stars could enable exoplanets at larger orbital distances to emit detectable radio emission. Here, we first show that the large ionized mass-loss rates of certain evolved stars relative to the solar value could make them detectable with the LOw Frequency ARray (LOFAR) at 150 MHz (λ = 2 m), provided they have surface magnetic field strengths >50 G. We then report radio observations of three long period (>1 au) planets that orbit the evolved stars β Gem, ι Dra, and β UMi using LOFAR at 150 MHz. We do not detect radio emission from any system but place tight 3σ upper limits of 0.98, 0.87, and 0.57 mJy on the flux density at 150 MHz for β Gem, ι Dra, and β UMi, respectively. Despite our non-detections these stringent upper limits highlight the potential of LOFAR as a tool to search for exoplanetary radio emission at meter wavelengths.

  20. Biomimetic molecular design tools that learn, evolve, and adapt

    Directory of Open Access Journals (Sweden)

    David A Winkler

    2017-06-01

    Full Text Available A dominant hallmark of living systems is their ability to adapt to changes in the environment by learning and evolving. Nature does this so superbly that intensive research efforts are now attempting to mimic biological processes. Initially this biomimicry involved developing synthetic methods to generate complex bioactive natural products. Recent work is attempting to understand how molecular machines operate so their principles can be copied, and learning how to employ biomimetic evolution and learning methods to solve complex problems in science, medicine and engineering. Automation, robotics, artificial intelligence, and evolutionary algorithms are now converging to generate what might broadly be called in silico-based adaptive evolution of materials. These methods are being applied to organic chemistry to systematize reactions, create synthesis robots to carry out unit operations, and to devise closed loop flow self-optimizing chemical synthesis systems. Most scientific innovations and technologies pass through the well-known “S curve”, with slow beginning, an almost exponential growth in capability, and a stable applications period. Adaptive, evolving, machine learning-based molecular design and optimization methods are approaching the period of very rapid growth and their impact is already being described as potentially disruptive. This paper describes new developments in biomimetic adaptive, evolving, learning computational molecular design methods and their potential impacts in chemistry, engineering, and medicine.

  1. Biomimetic molecular design tools that learn, evolve, and adapt

    Science.gov (United States)

    2017-01-01

    A dominant hallmark of living systems is their ability to adapt to changes in the environment by learning and evolving. Nature does this so superbly that intensive research efforts are now attempting to mimic biological processes. Initially this biomimicry involved developing synthetic methods to generate complex bioactive natural products. Recent work is attempting to understand how molecular machines operate so their principles can be copied, and learning how to employ biomimetic evolution and learning methods to solve complex problems in science, medicine and engineering. Automation, robotics, artificial intelligence, and evolutionary algorithms are now converging to generate what might broadly be called in silico-based adaptive evolution of materials. These methods are being applied to organic chemistry to systematize reactions, create synthesis robots to carry out unit operations, and to devise closed loop flow self-optimizing chemical synthesis systems. Most scientific innovations and technologies pass through the well-known “S curve”, with slow beginning, an almost exponential growth in capability, and a stable applications period. Adaptive, evolving, machine learning-based molecular design and optimization methods are approaching the period of very rapid growth and their impact is already being described as potentially disruptive. This paper describes new developments in biomimetic adaptive, evolving, learning computational molecular design methods and their potential impacts in chemistry, engineering, and medicine. PMID:28694872

  2. Social networks: Evolving graphs with memory dependent edges

    Science.gov (United States)

    Grindrod, Peter; Parsons, Mark

    2011-10-01

    The plethora of digital communication technologies, and their mass take up, has resulted in a wealth of interest in social network data collection and analysis in recent years. Within many such networks the interactions are transient: thus those networks evolve over time. In this paper we introduce a class of models for such networks using evolving graphs with memory dependent edges, which may appear and disappear according to their recent history. We consider time discrete and time continuous variants of the model. We consider the long term asymptotic behaviour as a function of parameters controlling the memory dependence. In particular we show that such networks may continue evolving forever, or else may quench and become static (containing immortal and/or extinct edges). This depends on the existence or otherwise of certain infinite products and series involving age dependent model parameters. We show how to differentiate between the alternatives based on a finite set of observations. To test these ideas we show how model parameters may be calibrated based on limited samples of time dependent data, and we apply these concepts to three real networks: summary data on mobile phone use from a developing region; online social-business network data from China; and disaggregated mobile phone communications data from a reality mining experiment in the US. In each case we show that there is evidence for memory dependent dynamics, such as that embodied within the class of models proposed here.
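
    A toy version of such a model can be simulated directly: each potential edge tracks how long it has been in its current state (present or absent), and its probability of switching at the next time step depends on that age. The age-dependent rate functions and parameter values below are illustrative assumptions, not the forms calibrated in the paper; note that rates which decay with age can make an edge effectively "immortal" or "extinct", echoing the quenching behaviour described above.

    ```python
    import random

    def simulate_edge(steps=1000, seed=0,
                      p_appear=lambda age: 0.10 / (1 + age),   # assumed age-dependent birth probability
                      p_vanish=lambda age: 0.30 / (1 + age)):  # assumed age-dependent death probability
        """Simulate one edge whose switching probability depends on the time already spent
        in its current state ('memory'). Returns the fraction of time the edge was present."""
        random.seed(seed)
        present, age, time_present = False, 0, 0
        for _ in range(steps):
            p_switch = p_vanish(age) if present else p_appear(age)
            if random.random() < p_switch:
                present, age = not present, 0
            else:
                age += 1
            time_present += present
        return time_present / steps

    print(simulate_edge())
    ```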

  3. Higher rates of sex evolve in spatially heterogeneous environments.

    Science.gov (United States)

    Becks, Lutz; Agrawal, Aneil F

    2010-11-04

    The evolution and maintenance of sexual reproduction has puzzled biologists for decades. Although this field is rich in hypotheses, experimental evidence is scarce. Some important experiments have demonstrated differences in evolutionary rates between sexual and asexual populations; other experiments have documented evolutionary changes in phenomena related to genetic mixing, such as recombination and selfing. However, direct experiments of the evolution of sex within populations are extremely rare (but see ref. 12). Here we use the rotifer, Brachionus calyciflorus, which is capable of both sexual and asexual reproduction, to test recent theory predicting that there is more opportunity for sex to evolve in spatially heterogeneous environments. Replicated experimental populations of rotifers were maintained in homogeneous environments, composed of either high- or low-quality food habitats, or in heterogeneous environments that consisted of a mix of the two habitats. For populations maintained in either type of homogeneous environment, the rate of sex evolves rapidly towards zero. In contrast, higher rates of sex evolve in populations experiencing spatially heterogeneous environments. The data indicate that the higher level of sex observed under heterogeneity is not due to sex being less costly or selection against sex being less efficient; rather sex is sufficiently advantageous in heterogeneous environments to overwhelm its inherent costs. Counter to some alternative theories for the evolution of sex, there is no evidence that genetic drift plays any part in the evolution of sex in these populations.

  4. Charting a New Course. NAI National Interpreters Workshop. Proceedings of the Annual Conference (San Diego, California, October 24-28, 1988).

    Science.gov (United States)

    Erikson, Debra M., Ed.

    Selected presentations in this publication include: "AAPRCO, Amtrak, the Railroads, and Interpretation"; "Check It Out!"; "Wildlife Rehabilitation as an Interpretive Tool"; "Improving the Monorail Tour"; "Interpreting Our Heretics"; "Evaluation: A Critical Management Process"; "A…

  5. SPATIO-TEMPORAL DATA MODEL FOR INTEGRATING EVOLVING NATION-LEVEL DATASETS

    Directory of Open Access Journals (Sweden)

    A. Sorokine

    2017-10-01

    Ability to easily combine the data from diverse sources in a single analytical workflow is one of the greatest promises of the Big Data technologies. However, such integration is often challenging as datasets originate from different vendors, governments, and research communities that results in multiple incompatibilities including data representations, formats, and semantics. Semantics differences are hardest to handle: different communities often use different attribute definitions and associate the records with different sets of evolving geographic entities. Analysis of global socioeconomic variables across multiple datasets over prolonged time is often complicated by the difference in how boundaries and histories of countries or other geographic entities are represented. Here we propose an event-based data model for depicting and tracking histories of evolving geographic units (countries, provinces, etc.) and their representations in disparate data. The model addresses the semantic challenge of preserving identity of geographic entities over time by defining criteria for the entity existence, a set of events that may affect its existence, and rules for mapping between different representations (datasets). Proposed model is used for maintaining an evolving compound database of global socioeconomic and environmental data harvested from multiple sources. Practical implementation of our model is demonstrated using PostgreSQL object-relational database with the use of temporal, geospatial, and NoSQL database extensions.

  6. Spatio-Temporal Data Model for Integrating Evolving Nation-Level Datasets

    Science.gov (United States)

    Sorokine, A.; Stewart, R. N.

    2017-10-01

    Ability to easily combine the data from diverse sources in a single analytical workflow is one of the greatest promises of the Big Data technologies. However, such integration is often challenging as datasets originate from different vendors, governments, and research communities that results in multiple incompatibilities including data representations, formats, and semantics. Semantics differences are hardest to handle: different communities often use different attribute definitions and associate the records with different sets of evolving geographic entities. Analysis of global socioeconomic variables across multiple datasets over prolonged time is often complicated by the difference in how boundaries and histories of countries or other geographic entities are represented. Here we propose an event-based data model for depicting and tracking histories of evolving geographic units (countries, provinces, etc.) and their representations in disparate data. The model addresses the semantic challenge of preserving identity of geographic entities over time by defining criteria for the entity existence, a set of events that may affect its existence, and rules for mapping between different representations (datasets). Proposed model is used for maintaining an evolving compound database of global socioeconomic and environmental data harvested from multiple sources. Practical implementation of our model is demonstrated using PostgreSQL object-relational database with the use of temporal, geospatial, and NoSQL database extensions.
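
    One way to picture the event-based approach is as a small schema in which geographic entities exist between a creation event and (possibly) a dissolution event, and dataset-specific codes are mapped to entities over validity intervals. The sketch below is a deliberately simplified, hypothetical rendering of that idea in plain Python; it does not reproduce the actual PostgreSQL schema described in the paper.

    ```python
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Entity:
        """A geographic unit whose identity is bounded by events (e.g., creation, dissolution)."""
        name: str
        created: int                      # year of the event that created the entity
        dissolved: Optional[int] = None   # year of the event that ended it, if any

        def exists_in(self, year: int) -> bool:
            return self.created <= year and (self.dissolved is None or year < self.dissolved)

    @dataclass
    class Mapping:
        """Maps a dataset-specific code to an entity over a validity interval."""
        dataset: str
        code: str
        entity: Entity
        valid_from: int
        valid_to: Optional[int] = None

    # Hypothetical example: a country that splits in 1993, with codes from one dataset.
    csk = Entity("Czechoslovakia", created=1918, dissolved=1993)
    cze = Entity("Czech Republic", created=1993)
    mappings = [
        Mapping("dataset_A", "CSK", csk, 1918, 1993),
        Mapping("dataset_A", "CZE", cze, 1993),
    ]
    print([m.entity.name for m in mappings if m.entity.exists_in(1990)])
    ```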

  7. Evolving Metadata in NASA Earth Science Data Systems

    Science.gov (United States)

    Mitchell, A.; Cechini, M. F.; Walter, J.

    2011-12-01

    NASA's effort to continually evolve its data systems led ECHO to enhancing the method in which it receives inventory metadata from the data centers to allow for multiple metadata formats including ISO 19115. ECHO's metadata model will also be mapped to the NASA-specific convention for ingesting science metadata into the ECHO system. As NASA's new Earth Science missions and data centers are migrating to the ISO 19115 standards, EOSDIS is developing metadata management resources to assist in the reading, writing and parsing ISO 19115 compliant metadata. To foster interoperability with other agencies and international partners, NASA is working to ensure that a common ISO 19115 convention is developed, enhancing data sharing capabilities and other data analysis initiatives. NASA is also investigating the use of ISO 19115 standards to encode data quality, lineage and provenance with stored values. A common metadata standard across NASA's Earth Science data systems promotes interoperability, enhances data utilization and removes levels of uncertainty found in data products.

  8. Páramo is the world’s fastest evolving and coolest biodiversity hotspot

    Directory of Open Access Journals (Sweden)

    Santiago eMadriñán

    2013-10-01

    Understanding the processes that cause speciation is a key aim of evolutionary biology. Lineages or biomes that exhibit recent and rapid diversification are ideal model systems for determining these processes. Species-rich biomes reported to be of relatively recent origin, i.e., since the beginning of the Miocene, include Mediterranean ecosystems such as the California Floristic Province, oceanic islands such as the Hawaiian archipelago and the Neotropical high elevation ecosystem of the Páramos. Páramos constitute grasslands above the forest tree-line (at elevations of c. 2800–4700 m) with high species endemism. Organisms that occupy this ecosystem are a likely product of unique adaptations to an extreme environment that evolved during the last three to five million years when the Andes reached an altitude that was capable of sustaining this type of vegetation. We compared net diversification rates of lineages in fast evolving biomes using 73 dated molecular phylogenies. Based on our sample, we demonstrate that average net diversification rates of Páramo plant lineages are faster than those of other reportedly fast evolving hotspots and that the faster evolving lineages are more likely to be found in Páramos than the other hotspots. Páramos therefore represent the ideal model system for studying diversification processes. Most of the speciation events that we observed in the Páramos (144 out of 177) occurred during the Pleistocene, possibly due to the effects of species range contraction and expansion that may have resulted from the well-documented climatic changes during that period. Understanding these effects will assist with efforts to determine how future climatic changes will impact plant populations.

  9. The emergent Copenhagen interpretation of quantum mechanics

    Science.gov (United States)

    Hollowood, Timothy J.

    2014-05-01

    We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent book keeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations like macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and that a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR-Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics but in a way that is rather novel. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems.

  10. The emergent Copenhagen interpretation of quantum mechanics

    International Nuclear Information System (INIS)

    Hollowood, Timothy J

    2014-01-01

    We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent book keeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations like macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and that a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR–Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics but in a way that is rather novel. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems. (paper)

  11. (including travel dates) Proposed itinerary

    Indian Academy of Sciences (India)

    Ashok

    31 July to 22 August 2012 (including travel dates). Proposed itinerary: Arrival in Bangalore on 1 August. 1-5 August: Bangalore, Karnataka. Suggested institutions: Indian Institute of Science, Bangalore. St Johns Medical College & Hospital, Bangalore. Jawaharlal Nehru Centre, Bangalore. 6-8 August: Chennai, TN.

  12. The interplay of evolved seawater and magmatic-hydrothermal fluids in the 3.24 Ga panorama volcanic-hosted massive sulfide hydrothermal system, North Pilbara Craton, Western Australia

    Science.gov (United States)

    Drieberg, Susan L.; Hagemann, Steffen G.; Huston, David L.; Landis, Gary; Ryan, Chris G.; Van Achterbergh, Esmé; Vennemann, Torsten

    2013-01-01

    The ~3240 Ma Panorama volcanic-hosted massive sulfide (VHMS) district is unusual for its high degree of exposure and low degree of postdepositional modification. In addition to typical seafloor VHMS deposits, this district contains greisen- and vein-hosted Mo-Cu-Zn-Sn mineral occurrences that are contemporaneous with VHMS orebodies and are hosted by the Strelley granite complex, which also drove VHMS circulation. Hence the Panorama district is a natural laboratory to investigate the role of magmatic-hydrothermal fluids in VHMS hydrothermal systems. Regional and proximal high-temperature alteration zones in volcanic rocks underlying the VHMS deposits are dominated by chlorite-quartz ± albite assemblages, with lesser low-temperature sericite-quartz ± K-feldspar assemblages. These assemblages are typical of VHMS hydrothermal systems. In contrast, the alteration assemblages associated with granite-hosted greisens and veins include quartz-topaz-muscovite-fluorite and quartz-muscovite (sericite)-chlorite-ankerite. These vein systems generally do not extend into the overlying volcanic pile. Fluid inclusion and stable isotope studies suggest that the greisens were produced by high-temperature (~590°C), high-salinity (38–56 wt % NaCl equiv) fluids with high densities (>1.3 g/cm3) and high δ18O (9.3 ± 0.6‰). These fluids are compatible with the measured characteristics of magmatic fluids evolved from the Strelley granite complex. In contrast, fluids in the volcanic pile (including the VHMS ore-forming fluids) were of lower temperature (90°–270°C), lower salinity (5.0–11.2 wt % NaCl equiv), with lower densities (0.88–1.01 g/cm3) and lower δ18O (−0.8 ± 2.6‰). These fluids are compatible with evolved Paleoarchean seawater. Fluids that formed the quartz-chalcopyrite-sphalerite-cassiterite veins, which are present within the granite complex near the contact with the volcanic pile, were intermediate in temperature and isotopic composition between the greisen

  13. An information gap in DNA evidence interpretation.

    Directory of Open Access Journals (Sweden)

    Mark W Perlin

    Full Text Available Forensic DNA evidence often contains mixtures of multiple contributors, or is present in low template amounts. The resulting data signals may appear to be relatively uninformative when interpreted using qualitative inclusion-based methods. However, these same data can yield greater identification information when interpreted by computer using quantitative data-modeling methods. This study applies both qualitative and quantitative interpretation methods to a well-characterized DNA mixture and dilution data set, and compares the inferred match information. The results show that qualitative interpretation loses identification power at low culprit DNA quantities (below 100 pg, but that quantitative methods produce useful information down into the 10 pg range. Thus there is a ten-fold information gap that separates the qualitative and quantitative DNA mixture interpretation approaches. With low quantities of culprit DNA (10 pg to 100 pg, computer-based quantitative interpretation provides greater match sensitivity.

  14. Theory including future not excluded

    DEFF Research Database (Denmark)

    Nagao, K.; Nielsen, H.B.

    2013-01-01

    We study a complex action theory (CAT) whose path runs over not only the past but also the future. We show that, if we regard a matrix element defined in terms of the future state at time T and the past state at time T_A as an expectation value in the CAT, then we are allowed to have the Heisenberg equation, Ehrenfest's theorem, and the conserved probability current density. In addition, we show that the expectation value at the present time t of a future-included theory for large T - t and large t - T_A corresponds to that of a future-not-included theory with a proper inner product for large t - T_A. Hence, the CAT...

  15. Using American sign language interpreters to facilitate research among deaf adults: lessons learned.

    Science.gov (United States)

    Sheppard, Kate

    2011-04-01

    Health care providers commonly discuss depressive symptoms with clients, enabling earlier intervention. Such discussions rarely occur between providers and Deaf clients. Most culturally Deaf adults experience early-onset hearing loss, self-identify as part of a unique culture, and communicate in the visual language of American Sign Language (ASL). Communication barriers abound, and depression screening instruments may be unreliable. To train and use ASL interpreters for a qualitative study describing depressive symptoms among Deaf adults. Training included research versus community interpreting. During data collection, interpreters translated to and from voiced English and ASL. Training eliminated potential problems during data collection. Unexpected issues included participants asking for "my interpreter" and worrying about confidentiality or friendship in a small community. Lessons learned included the value of careful training of interpreters prior to initiating data collection, including resolution of possible role conflicts and ensuring conceptual equivalence in real-time interpreting.

  16. Physical interpretation of the combinatorial hierarchy

    International Nuclear Information System (INIS)

    Bastin, T.; Noyes, H.P.

    1978-01-01

    The combinatorial hierarchy model for basic particle processes is compared and contrasted with the Ur-theory as developed at the Tutzing Conferences. It agrees with Ur-theory about a finite basis, the "fixed past--uncertain future" aspects of physics, and the necessity of dropping Bohr's requirement of reduction to the haptic language of common sense and classical physics. However, it retains a constructive, hierarchical approach which can yield only an approximate and discrete "space-time", and introduces the observation metaphysic at the start. Concrete interpretation of the four levels of the hierarchy (with cardinals 3, 7, 127, 2^127 - 1 ≈ 10^38) associates the three levels which map up and down with three absolute conservation laws (charge, baryon number, lepton number) and the spin dichotomy. The first level represents +, -, and ± unit charge. The second has the quantum numbers of a baryon-antibaryon pair and associated charged meson (e.g., n anti-n, p anti-n, p anti-p, n anti-p, π+, π0, π-). The third level associates this pair, now including four spin states as well as four charge states, with a neutral lepton-antilepton pair (e anti-e or ν anti-ν) in four spin states (total, 64 states): three charged spinless, three charged spin-1, and neutral spin-1 mesons (15 states), and a neutral vector boson associated with the leptons; this gives 3 + 15 + 3 x 15 = 63 possible boson states, so a total correct count of 63 + 64 = 127 states. Something like SU(2) x SU(3) and other indications of quark quantum numbers can occur as substructures at the fourth (unstable) level. A slight extension gives the usual static approximation to the binding energy of the hydrogen atom, α^2 m_e c^2. Cosmological implications of the theory are in accord with current experience. A beginning in the physical interpretation of a theory which could eventually encompass all branches of physics was made. 3 figures, 6 tables
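
    The level cardinals quoted above follow a simple recursion in which each cardinal c is replaced by 2^c - 1 at the next level; starting the arithmetic from 2 (a choice made here only to reproduce the quoted sequence) gives 3, 7, 127, and 2^127 - 1, the last of which is the value cited as approximately 10^38.

    ```python
    c = 2
    levels = []
    for _ in range(4):
        c = 2**c - 1          # each level's cardinal is 2^(previous cardinal) - 1
        levels.append(c)

    print(levels)             # [3, 7, 127, 170141183460469231731687303715884105727]
    print(float(levels[-1]))  # ~1.7e38, i.e., of order 10^38 as quoted in the abstract
    ```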

  17. Understanding women's interpretations of infant formula advertising.

    Science.gov (United States)

    Parry, Kathleen; Taylor, Emily; Hall-Dardess, Pam; Walker, Marsha; Labbok, Miriam

    2013-06-01

    Exclusive breastfeeding for 6 months and continued breastfeeding for at least 1 year is recommended by all major health organizations. Whereas 74.6 percent of mothers initiate breastfeeding at birth, exclusivity and duration remain significantly lower than national goals. Empirical evidence suggests that exposure to infant formula marketing contributes to supplementation and premature cessation. The objective of this study was to explore how women interpret infant formula advertising to aid in an understanding of this association. Four focus groups were structured to include women with similar childbearing experience divided according to reproductive status: preconceptional, pregnant, exclusive breastfeeders, and formula feeders. Facilitators used a prepared protocol to guide discussion of infant formula advertisements. Authors conducted a thematic content analysis with special attention to women's statements about what they believed the advertisements said about how the products related to human milk (superior, inferior, similar) and how they reported reacting to these interpretations. Participants reported that the advertisements conveyed an expectation of failure with breastfeeding, and that formula is a solution to fussiness, spitting up, and other normal infant behaviors. Participants reported that the advertisements were confusing in terms of how formula-feeding is superior, inferior or the same as breastfeeding. This confusion was exacerbated by an awareness of distribution by health care practitioners and institutions, suggesting provider endorsement of infant formula. Formula marketing appears to decrease mothers' confidence in their ability to breastfeed, especially when provided by health care practitioners and institutions. Therefore, to be supportive of breastfeeding, perinatal educators and practitioners could be more effective if they did not offer infant formula advertising to mothers. © 2013, Copyright the Authors, Journal compilation © 2013

  18. Model-Agnostic Interpretability of Machine Learning

    OpenAIRE

    Ribeiro, Marco Tulio; Singh, Sameer; Guestrin, Carlos

    2016-01-01

    Understanding why machine learning models behave the way they do empowers both system designers and end-users in many ways: in model selection, feature engineering, in order to trust and act upon the predictions, and in more intuitive user interfaces. Thus, interpretability has become a vital concern in machine learning, and work in the area of interpretable models has found renewed interest. In some applications, such models are as accurate as non-interpretable ones, and thus are preferred f...

  19. Statistical models for brain signals with properties that evolve across trials.

    Science.gov (United States)

    Ombao, Hernando; Fiecas, Mark; Ting, Chee-Ming; Low, Yin Fen

    2017-12-07

    Most neuroscience cognitive experiments involve repeated presentations of various stimuli across several minutes or a few hours. It has been observed that brain responses, even to the same stimulus, evolve over the course of the experiment. These changes in brain activation and connectivity are believed to be associated with learning and/or habituation. In this paper, we present two general approaches to modeling dynamic brain connectivity using electroencephalograms (EEGs) recorded across replicated trials in an experiment. The first approach is the Markovian regime-switching vector autoregressive model (MS-VAR) which treats EEGs as realizations of an underlying brain process that switches between different states both within a trial and across trials in the entire experiment. The second is the slowly evolutionary locally stationary process (SEv-LSP) which characterizes the observed EEGs as a mixture of oscillatory activities at various frequency bands. The SEv-LSP model captures the dynamic nature of the amplitudes of the band-oscillations and cross-correlations between them. The MS-VAR model is able to capture abrupt changes in the dynamics while the SEv-LSP directly gives interpretable results. Moreover, it is nonparametric and hence does not suffer from model misspecification. For both of these models, time-evolving connectivity metrics in the frequency domain are derived from the model parameters for both functional and effective connectivity. We illustrate these two models for estimating cross-trial connectivity in selective attention using EEG data from an oddball paradigm auditory experiment where the goal is to characterize the evolution of brain responses to target stimuli and to standard tones presented randomly throughout the entire experiment. The results suggest dynamic changes in connectivity patterns over trials with inter-subject variability. Copyright © 2017. Published by Elsevier Inc.
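
    As a minimal illustration of the vector autoregressive building block behind the first approach, a VAR(1) model for a multichannel recording can be fit by ordinary least squares; the regime switching across and within trials, and the frequency-domain connectivity metrics derived from the fitted parameters, are additional machinery on top of this. The sketch below uses simulated data and assumed dimensions; it is not the authors' estimation procedure.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate a 3-channel VAR(1) process: x_t = A x_{t-1} + noise.
    A_true = np.array([[0.5, 0.1, 0.0],
                       [0.0, 0.4, 0.2],
                       [0.1, 0.0, 0.3]])
    T, d = 2000, 3
    x = np.zeros((T, d))
    for t in range(1, T):
        x[t] = A_true @ x[t - 1] + rng.normal(0, 1, d)

    # Ordinary least-squares estimate of the transition matrix from lagged data:
    # x[1:] ~ x[:-1] @ A.T, so lstsq returns A.T.
    B, *_ = np.linalg.lstsq(x[:-1], x[1:], rcond=None)
    A_hat = B.T
    print(np.round(A_hat, 2))   # should be close to A_true
    ```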

  20. Statistical models for brain signals with properties that evolve across trials

    KAUST Repository

    Ombao, Hernando

    2017-12-07

    Most neuroscience cognitive experiments involve repeated presentations of various stimuli across several minutes or a few hours. It has been observed that brain responses, even to the same stimulus, evolve over the course of the experiment. These changes in brain activation and connectivity are believed to be associated with learning and/or habituation. In this paper, we present two general approaches to modeling dynamic brain connectivity using electroencephalograms (EEGs) recorded across replicated trials in an experiment. The first approach is the Markovian regime-switching vector autoregressive model (MS-VAR) which treats EEGs as realizations of an underlying brain process that switches between different states both within a trial and across trials in the entire experiment. The second is the slowly evolutionary locally stationary process (SEv-LSP) which characterizes the observed EEGs as a mixture of oscillatory activities at various frequency bands. The SEv-LSP model captures the dynamic nature of the amplitudes of the band-oscillations and cross-correlations between them. The MS-VAR model is able to capture abrupt changes in the dynamics while the SEv-LSP directly gives interpretable results. Moreover, it is nonparametric and hence does not suffer from model misspecification. For both of these models, time-evolving connectivity metrics in the frequency domain are derived from the model parameters for both functional and effective connectivity. We illustrate these two models for estimating cross-trial connectivity in selective attention using EEG data from an oddball paradigm auditory experiment where the goal is to characterize the evolution of brain responses to target stimuli and to standard tones presented randomly throughout the entire experiment. The results suggest dynamic changes in connectivity patterns over trials with inter-subject variability.

  1. Interpreting quantum discord through quantum state merging

    International Nuclear Information System (INIS)

    Madhok, Vaibhav; Datta, Animesh

    2011-01-01

    We present an operational interpretation of quantum discord based on the quantum state merging protocol. Quantum discord is the markup in the cost of quantum communication in the process of quantum state merging, if one discards relevant prior information. Our interpretation has an intuitive explanation based on the strong subadditivity of von Neumann entropy. We use our result to provide operational interpretations of other quantities like the local purity and quantum deficit. Finally, we discuss in brief some instances where our interpretation is valid in the single-copy scenario.
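
    For reference, quantum discord with respect to measurements on subsystem A is conventionally defined as the gap between the total mutual information and the classical correlations extractable by measuring A; the state-merging interpretation above identifies this gap with the extra communication cost incurred when prior information is discarded. The notation below is the standard textbook definition, not a formula specific to this paper:

    $$
    D_A(\rho_{AB}) = I(A{:}B) - \max_{\{\Pi_k^A\}} \Big[ S(\rho_B) - \sum_k p_k\, S(\rho_{B|k}) \Big],
    \qquad I(A{:}B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}).
    $$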

  2. Evolutions in clinical reasoning assessment: The Evolving Script Concordance Test.

    Science.gov (United States)

    Cooke, Suzette; Lemay, Jean-François; Beran, Tanya

    2017-08-01

    Script concordance testing (SCT) is a method of assessment of clinical reasoning. We developed a new type of SCT case design, the evolving SCT (E-SCT), in which the patient's clinical story "evolves": with thoughtful integration of new information at each stage, decisions related to clinical decision-making become increasingly clear. We aimed to: (1) determine whether an E-SCT could differentiate clinical reasoning ability among junior residents (JR), senior residents (SR), and pediatricians, (2) evaluate the reliability of an E-SCT, and (3) obtain qualitative feedback from participants to help inform the potential acceptability of the E-SCT. A 12-case E-SCT, embedded within a 24-case pediatric SCT (PaedSCT), was administered to 91 pediatric residents (JR: n = 50; SR: n = 41). A total of 21 pediatricians served on the panel of experts (POE). A one-way analysis of variance (ANOVA) was conducted across the levels of experience. Participants' feedback on the E-SCT was obtained with a post-test survey and analyzed using two methods: percentage preference and thematic analysis. Statistical differences existed across levels of training: F = 19.31 (df = 2), p < ... decision-making process. The E-SCT demonstrated very good reliability and was effective in distinguishing clinical reasoning ability across three levels of experience. Participants found the E-SCT engaging and representative of real-life clinical reasoning and decision-making processes. We suggest that further refinement and utilization of the evolving-style case will enhance SCT as a robust, engaging, and relevant method for the assessment of clinical reasoning.

  3. A Change Impact Analysis to Characterize Evolving Program Behaviors

    Science.gov (United States)

    Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua

    2012-01-01

    Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing efforts used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging: finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.

  4. Why, when and where did honey bee dance communication evolve?

    Directory of Open Access Journals (Sweden)

    Robbie I'Anson Price

    2015-11-01

    Honey bees (Apis sp.) are the only known bee genus that uses nest-based communication to provide nest-mates with information about the location of resources, the so-called dance language. Successful foragers perform waggle dances for high-quality food sources and, during swarming, for suitable nest-sites. However, since many species of social insects do not communicate the location of resources to their nest-mates, the question of why the dance language evolved is of ongoing interest. We review recent theoretical and empirical research into the ecological circumstances that make dance communication beneficial in present-day environments. This research suggests that the dance language is most beneficial when food sources differ greatly in quality and are hard to find. The dances of extant honey bee species differ in important ways, and phylogenetic studies suggest an increase in dance complexity over time: species with the least complex dance were the first to appear and species with the most complex dance are the most derived. We review the fossil record of honey bees and speculate about the time and context (foraging vs. swarming) in which spatially referential dance communication might have evolved. We conclude that there are few certainties about when the dance language first appeared; dance communication could be older than 40 million years and, thus, predate the genus Apis, or it could be as recent as 20 million years ago, when extant honey bee species diverged during the early Miocene. The most parsimonious scenario assumes it evolved in a sub-tropical to temperate climate, with patchy vegetation, somewhere in Eurasia.

  5. Reflective Practice: Origins and Interpretations

    Science.gov (United States)

    Reynolds, Michael

    2011-01-01

    The idea of reflection is central to the theory and practice of learning--especially learning which is grounded in past or current experience. This paper proposes a working definition of reflection and reviews its origins and recent developments. The author also provides an account of "critical reflection", including its rationale and…

  6. A Whig Interpretation of History?

    NARCIS (Netherlands)

    Kennedy, J.

    2013-01-01

    Since last year (2012) I have been involved in a European Commission research project that seeks to discover which measures aimed against corruption actually work. This research includes a relatively modest historical dimension, in which "lessons learned" from the past is one of the main aims. In

  7. Ionizing radiation accidents. Data interpretation

    International Nuclear Information System (INIS)

    Cascon, Adriana S.

    2003-01-01

    After a general overview of biological effects at the cellular and molecular level, the somatic effects of ionizing radiation are described. Argentine regulations and the ICRP recommendations on the radiological protection of occupationally exposed workers are also summarized. The paper includes practical advice for the physician who has to care for an irradiated patient.

  8. Programmable Applications: Interpreter Meets Interface

    Science.gov (United States)

    1991-10-01

    ...graphics program written for professional architects and designers, and including a huge library of files written in AutoLisp, a "design-enriched" Lisp... AutoLisp procedures). The choice of Lisp as a base language is a happy one for AutoCAD; the application has clearly benefitted from the contribution
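
    The record describes a programmable application: a host program (AutoCAD) that embeds an extension language (AutoLisp) so users can script it through an exposed interface. As a language-neutral illustration of that architecture (not AutoCAD's actual API; all names below are made up), the hypothetical Python sketch embeds a tiny script interpreter that can only see the functions the host chooses to expose.

        class DrawingApp:
            """Hypothetical host application with a small embedded scripting hook."""

            def __init__(self):
                self.shapes = []

            def add_line(self, x1, y1, x2, y2):
                self.shapes.append(("line", x1, y1, x2, y2))

            def run_script(self, source):
                # Only the names placed in this namespace are visible to user scripts.
                namespace = {"__builtins__": {}, "add_line": self.add_line, "range": range}
                exec(source, namespace)

        app = DrawingApp()
        app.run_script("for i in range(3): add_line(0, i, 10, i)")
        print(app.shapes)  # three 'line' tuples created by the user script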

  9. Gravity Effects on Information Filtering and Network Evolving

    Science.gov (United States)

    Liu, Jin-Hu; Zhang, Zi-Ke; Chen, Lingjiao; Liu, Chuang; Yang, Chengcheng; Wang, Xueqi

    2014-01-01

    In this paper, based on the gravity principle of classical physics, we propose a tunable gravity-based model that uses tag-usage patterns to weigh both the mass and the distance of network nodes. We then apply this model to the problems of information filtering and network evolving. Experimental results on two real-world data sets, Del.icio.us and MovieLens, show that it can not only enhance algorithmic performance but also better characterize the properties of real networks. This work may shed some light on an in-depth understanding of the effects of the gravity model. PMID:24622162
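
    As a rough illustration of the underlying idea (not the paper's exact formulation, whose mass and distance terms come from tag-usage patterns), the hypothetical Python sketch below ranks objects for a user by a gravity-style score: attraction grows with an object's "mass" (popularity) and decays with the user-object "distance" (dissimilarity), controlled by a tunable exponent.

        def gravity_score(mass, distance, lam=1.0, eps=1e-9):
            """Gravity-style attraction: larger mass and smaller distance score higher."""
            return mass / max(distance, eps) ** lam

        # Hypothetical candidate objects: (name, popularity, distance from the target user).
        candidates = [("item_a", 120, 0.9), ("item_b", 15, 0.2), ("item_c", 60, 0.5)]

        ranked = sorted(candidates,
                        key=lambda c: gravity_score(c[1], c[2], lam=1.5),
                        reverse=True)
        print([name for name, _, _ in ranked])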

  10. Analytical Design of Evolvable Software for High-Assurance Computing

    Science.gov (United States)

    2001-02-14

    ... system size: $S_{\text{ext}} = \sum_{i=1}^{N} \bigl( \sum_{j=1}^{A_i} w_{ij} + \sum_{k=1}^{M_i} w_{ik} \bigr)$. Chapter 5, Analytical Partition of Components: As discussed in Chapter 1...76]. Does the research approach yield evolvable components in less mathematically-oriented applications such as multimedia and e-commerce? There is... Appendix H, Benchmark Design for the Microwave Oven Software: The benchmark design consists of the
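
    Reading the reconstructed formula as a per-component sum of element weights (interpreting A_i and M_i as, say, the counts of attributes and methods of component i is an assumption, not taken from the report), a minimal Python sketch of the metric looks like this:

        def external_system_size(components):
            """components: list of (attribute_weights, method_weights) per component.
            Returns S_ext = sum over components of (sum of w_ij + sum of w_ik)."""
            return sum(sum(attr_w) + sum(method_w) for attr_w, method_w in components)

        # Hypothetical weights for a two-component system.
        components = [([1, 2], [3, 1, 1]),   # component 1: A_1 = 2 attributes, M_1 = 3 methods
                      ([2],    [4, 2])]      # component 2: A_2 = 1 attribute,  M_2 = 2 methods
        print(external_system_size(components))  # 16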

  11. MK classification of evolved blue stars in the halo

    International Nuclear Information System (INIS)

    Garrison, R.F.

    1987-01-01

    The problem of the masses and origin of the evolved blue stars is very complex. No single approach can give all the answers unambiguously; it would be naive to suppose otherwise. The MK process and the MK system give a perspective which complements photometric, kinematic, high dispersion and other quantitative data. It is useful to know which stars are similar (or not) in spectral morphology, so that interesting candidates can be selected for further study. In many cases, the gross physical characteristics can be fairly well determined by use of the MK System. 8 references

  12. Open-Ended Behavioral Complexity for Evolved Virtual Creatures

    DEFF Research Database (Denmark)

    Lessin, Dan; Fussell, Don; Miikkulainen, Risto

    2013-01-01

    notable exception to this progress. Despite the potential benefits, there has been no clear increase in the behavioral complexity of evolved virtual creatures (EVCs) beyond the light following demonstrated in Sims' original work. This paper presents an open-ended method to move beyond this limit, making...... creature with behavioral complexity that clearly exceeds previously achieved levels. ESP thus demonstrates that EVCs may indeed have the potential to one day rival the behavioral complexity--and therefore the entertainment value--of their non-virtual counterparts....

  13. Evolving Neural Turing Machines for Reward-based Learning

    DEFF Research Database (Denmark)

    Greve, Rasmus Boll; Jacobsen, Emil Juul; Risi, Sebastian

    2016-01-01

    An unsolved problem in neuroevolution (NE) is to evolve artificial neural networks (ANN) that can store and use information to change their behavior online. While plastic neural networks have shown promise in this context, they have difficulties retaining information over longer periods of time...... version of the double T-Maze, a complex reinforcement-like learning problem. In the T-Maze learning task the agent uses the memory bank to display adaptive behavior that normally requires a plastic ANN, thereby suggesting a complementary and effective mechanism for adaptive behavior in NE....
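
    To make the memory-bank idea concrete, the sketch below (a generic NumPy toy, not the evolved Neural Turing Machine controller from the paper) shows content-based addressing over a small memory matrix with a soft read and a blended write; the keys, values, and sizes are made up.

        import numpy as np

        def address(memory, key):
            """Attention weights over memory rows, by cosine similarity to the key."""
            sims = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-9)
            w = np.exp(sims)
            return w / w.sum()

        def read(memory, key):
            return address(memory, key) @ memory      # weighted sum of memory rows

        def write(memory, key, value, erase=0.5):
            w = address(memory, key)[:, None]
            return memory * (1 - erase * w) + w * value

        memory = np.zeros((4, 3))                     # 4 slots, 3-dimensional content
        memory = write(memory, key=np.array([1.0, 0.0, 0.0]), value=np.array([0.9, 0.1, 0.0]))
        print(read(memory, key=np.array([1.0, 0.0, 0.0])))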

  14. Simulations of embodied evolving semiosis: Emergent semantics in artificial environments

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, L.M.; Joslyn, C.

    1998-02-01

    As we enter this amazing new world of artificial and virtual systems and environments in the context of human communities, we are interested in the development of systems and environments which have the capacity to grow and evolve their own meanings in the context of this community of interaction. In this paper the authors analyze the conditions necessary to achieve systems and environments with these properties: (1) a coupled interaction between a system and its environment; (2) an environment with sufficient initial richness and structure to allow for (3) embodied, emergent classification of that environment-system coupling, (4) which is subject to pragmatic selection.

  15. [Cardiac computed tomography: new applications of an evolving technique].

    Science.gov (United States)

    Martín, María; Corros, Cecilia; Calvo, Juan; Mesa, Alicia; García-Campos, Ana; Rodríguez, María Luisa; Barreiro, Manuel; Rozado, José; Colunga, Santiago; de la Hera, Jesús M; Morís, César; Luyando, Luis H

    2015-01-01

    In recent years we have witnessed increasing development of the imaging techniques applied in cardiology. Among them, cardiac computed tomography is an emerging and evolving technique. With the current possibility of very-low-radiation studies, its applications have expanded and now go beyond coronary angiography. In the present article we review the technical developments of cardiac computed tomography and its new applications. Copyright © 2014 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.

  16. f(R) gravity solutions for evolving wormholes

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharya, Subhra [Presidency University, Department of Mathematics, Kolkata (India); Chakraborty, Subenoy [Jadavpur University, Department of Mathematics, Kolkata (India)

    2017-08-15

    The scalar-tensor f(R) theory of gravity is considered in the framework of a simple inhomogeneous space-time model. In this research we use the reconstruction technique to look for possible evolving wormhole solutions within viable f(R) gravity formalism. These f(R) models are then constrained so that they are consistent with existing experimental data. Energy conditions related to the matter threading the wormhole are analyzed graphically and are in general found to obey the null energy conditions (NEC) in regions around the throat, while in the limit f(R) = R, NEC can be violated at large in regions around the throat. (orig.)
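
    For reference, the standard metric-formalism field equations of f(R) gravity and the null energy condition discussed above can be written as follows (shown for general context only; the paper's specific reconstructed solutions are not reproduced here, and the anisotropic-fluid form of the NEC is an assumption of this sketch):

        f'(R)\,R_{\mu\nu} - \tfrac{1}{2} f(R)\, g_{\mu\nu}
          - \left( \nabla_\mu \nabla_\nu - g_{\mu\nu} \Box \right) f'(R) = 8\pi G\, T_{\mu\nu},
        \qquad
        \text{NEC:}\ \ T_{\mu\nu} k^\mu k^\nu \ge 0 \ \text{for all null } k^\mu,
        \ \text{i.e.}\ \rho + p_r \ge 0 \ \text{and}\ \rho + p_t \ge 0 \ \text{for an anisotropic fluid.}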

  17. Forces shaping the fastest evolving regions in the human genome

    DEFF Research Database (Denmark)

    Pollard, Katherine S; Salama, Sofie R; King, Bryan

    2006-01-01

    Comparative genomics allow us to search the human genome for segments that were extensively changed in the last approximately 5 million years since divergence from our common ancestor with chimpanzee, but are highly conserved in other species and thus are likely to be functional. We found 202...... genomic elements that are highly conserved in vertebrates but show evidence of significantly accelerated substitution rates in human. These are mostly in non-coding DNA, often near genes associated with transcription and DNA binding. Resequencing confirmed that the five most accelerated elements...... contributed to accelerated evolution of the fastest evolving elements in the human genome....

  18. Evolving Oxygen Landscape of the Early Atmosphere and Oceans

    Science.gov (United States)

    Lyons, T. W.; Reinhard, C. T.; Planavsky, N. J.

    2013-12-01

    The past decade has witnessed remarkable advances in our understanding of oxygen on the early Earth, and a new framework, the topic of this presentation, is now in place to address the controls on spatiotemporal distributions of oxygen and their potential relationships to deep-Earth processes. Recent challenges to the Archean biomarker record have put an added burden on inorganic geochemistry to fingerprint and quantify the early production, accumulation, and variation of biospheric oxygen. Fortunately, a wide variety of techniques now point convincingly to photosynthetic oxygen production and dynamic accumulation well before the canonical Great Oxidation Event (GOE). Recent modeling of sulfur recycling over this interval allows for transient oxygen accumulation in the atmosphere without the disappearance of non-mass-dependent (NMD) sulfur isotope anomalies from the stratigraphic record and further allows for persistent accumulation in the atmosphere well before the permanent disappearance of NMD signals. This recent work suggests that the initial rise of oxygen may have occurred in fits and starts rather than a single step, and that once permanently present in the atmosphere, oxygen likely rose to high levels and then plummeted, in phase with the Paleoproterozoic Lomagundi positive carbon isotope excursion. More than a billion years of oxygen-free conditions in the deep ocean followed and set a challenging course for life, including limited abundances and diversity of eukaryotic organisms. Despite this widespread anoxia, sulfidic (euxinic) conditions were likely limited to productive ocean margins. Nevertheless, euxinia was sufficiently widespread to impact redox-dependent nutrient relationships, particularly the availability of bioessential trace metals critical in the nitrogen cycle, which spawned feedbacks that likely maintained oxygen at very low levels in the ocean and atmosphere and delayed the arrival of animals. Then, in the mid, pre-glacial Neoproterozoic

  19. Dream interpretation, affect, and the theory of neuronal group selection: Freud, Winnicott, Bion, and Modell.

    Science.gov (United States)

    Shields, Walker

    2006-12-01

    The author uses a dream specimen as interpreted during psychoanalysis to illustrate Modell's hypothesis that Edelman's theory of neuronal group selection (TNGS) may provide a valuable neurobiological model for Freud's dynamic unconscious, imaginative processes in the mind, the retranscription of memory in psychoanalysis, and intersubjective processes in the analytic relationship. He draws parallels between the interpretation of the dream material with keen attention to affect-laden meanings in the evolving analytic relationship in the domain of psychoanalysis and the principles of Edelman's TNGS in the domain of neurobiology. The author notes how this correlation may underscore the importance of dream interpretation in psychoanalysis. He also suggests areas for further investigation in both realms based on study of their interplay.

  20. Interpretation of Chemical Pathology Test Results in Paediatrics ...

    African Journals Online (AJOL)

    Whenever we interpret paediatric chemical pathology test results, we must take into consideration a number of factors that are related to, and specific to, paediatric patients. Such factors include the paediatric patient's age, which may range from prematurity to above 18 years, and the paediatric patient's body weight ...