WorldWideScience

Sample records for minimal biologically plausible

  1. Biologically plausible learning in neural networks: a lesson from bacterial chemotaxis.

    Science.gov (United States)

    Shimansky, Yury P

    2009-12-01

    Learning processes in the brain are usually associated with plastic changes made to optimize the strength of connections between neurons. Although many details related to biophysical mechanisms of synaptic plasticity have been discovered, it is unclear how the concurrent performance of adaptive modifications in a huge number of spatial locations is organized to minimize a given objective function. Since direct experimental observation of even a relatively small subset of such changes is not feasible, computational modeling is an indispensable investigation tool for solving this problem. However, the conventional method of error back-propagation (EBP) employed for optimizing synaptic weights in artificial neural networks is not biologically plausible. This study based on computational experiments demonstrated that such optimization can be performed rather efficiently using the same general method that bacteria employ for moving closer to an attractant or away from a repellent. With regard to neural network optimization, this method consists of regulating the probability of an abrupt change in the direction of synaptic weight modification according to the temporal gradient of the objective function. Neural networks utilizing this method (regulation of modification probability, RMP) can be viewed as analogous to swimming in the multidimensional space of their parameters in the flow of biochemical agents carrying information about the optimality criterion. The efficiency of RMP is comparable to that of EBP, while RMP has several important advantages. Since the biological plausibility of RMP is beyond a reasonable doubt, the RMP concept provides a constructive framework for the experimental analysis of learning in natural neural networks.
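
    As a minimal illustration of the run-and-tumble idea summarized above, the Python/NumPy sketch below keeps a modification direction for each weight and raises the probability of abruptly reversing it when the objective worsens between steps (the temporal gradient). The function names, tumble probabilities, step size, and the quadratic toy objective are assumptions for illustration, not the author's implementation.

      import numpy as np

      def rmp_step(weights, directions, loss_fn, prev_loss, rng,
                   step=0.01, p_tumble_improving=0.05, p_tumble_worsening=0.5):
          # Evaluate the objective at the current weights ("sensing the attractant").
          loss = loss_fn(weights)
          if prev_loss is not None:
              # A worsening objective raises the chance of an abrupt direction change.
              p = p_tumble_improving if loss < prev_loss else p_tumble_worsening
              flip = rng.random(weights.shape) < p
              directions = np.where(flip, -directions, directions)
          return weights + step * directions, directions, loss

      rng = np.random.default_rng(0)
      w = rng.normal(size=5)                       # toy "synaptic weights"
      d = rng.choice([-1.0, 1.0], size=5)          # current modification directions
      objective = lambda w: float(np.sum(w ** 2))  # hypothetical objective function
      prev = None
      for _ in range(2000):
          w, d, prev = rmp_step(w, d, objective, prev, rng)
      print(f"final objective: {objective(w):.4f}")  # far below the starting value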

  2. Optimality and Plausibility in Language Design

    Directory of Open Access Journals (Sweden)

    Michael R. Levot

    2016-12-01

    Full Text Available The Minimalist Program in generative syntax has been the subject of much rancour, a good proportion of it stoked by Noam Chomsky’s suggestion that language may represent “a ‘perfect solution’ to minimal design specifications.” A particular flash point has been the application of Minimalist principles to speculations about how language evolved in the human species. This paper argues that Minimalism is well supported as a plausible approach to language evolution. It is claimed that an assumption of minimal design specifications like that employed in MP syntax satisfies three key desiderata of evolutionary and general scientific plausibility: Physical Optimism, Rational Optimism, and Darwin’s Problem. In support of this claim, the methodologies employed in MP to maximise parsimony are characterised through an analysis of recent theories in Minimalist syntax, and those methodologies are defended with reference to practices and arguments from evolutionary biology and other natural sciences.

  3. Epidemiologic studies of occupational pesticide exposure and cancer: regulatory risk assessments and biologic plausibility.

    Science.gov (United States)

    Acquavella, John; Doe, John; Tomenson, John; Chester, Graham; Cowell, John; Bloemen, Louis

    2003-01-01

    Epidemiologic studies frequently show associations between self-reported use of specific pesticides and human cancers. These findings have engendered debate largely on methodologic grounds. However, biologic plausibility is a more fundamental issue that has received only superficial attention. The purpose of this commentary is to review briefly the toxicology and exposure data that are developed as part of the pesticide regulatory process and to discuss the applicability of this data to epidemiologic research. The authors also provide a generic example of how worker pesticide exposures might be estimated and compared to relevant toxicologic dose levels. This example provides guidance for better characterization of exposure and for consideration of biologic plausibility in epidemiologic studies of pesticides.

  4. On the biological plausibility of Wind Turbine Syndrome.

    Science.gov (United States)

    Harrison, Robert V

    2015-01-01

    An emerging environmental health issue relates to potential ill-effects of wind turbine noise. There have been numerous suggestions that the low-frequency acoustic components in wind turbine signals can cause symptoms associated with vestibular system disorders, namely vertigo, nausea, and nystagmus. This constellation of symptoms has been labeled as Wind Turbine Syndrome, and has been identified in case studies of individuals living close to wind farms. This review discusses whether it is biologically plausible for the turbine noise to stimulate the vestibular parts of the inner ear and, by extension, cause Wind Turbine Syndrome. We consider the sound levels that can activate the semicircular canals or otolith end organs in normal subjects, as well as in those with preexisting conditions known to lower vestibular threshold to sound stimulation.

  5. Particulate air pollution and increased mortality: Biological plausibility for causal relationship

    International Nuclear Information System (INIS)

    Henderson, R.F.

    1995-01-01

    Recently, a number of epidemiological studies have concluded that ambient particulate exposure is associated with increased mortality and morbidity at PM concentrations well below those previously thought to affect human health. These studies have been conducted in several different geographical locations and have involved a range of populations. While the consistency of the findings and the presence of an apparent concentration response relationship provide a strong argument for causality, epidemiological studies can only conclude this based upon inference from statistical associations. The biological plausibility of a causal relationship between low concentrations of PM and daily mortality and morbidity rates is neither intuitively obvious nor expected based on past experimental studies on the toxicity of inhaled particles. Chronic toxicity from inhaled, poorly soluble particles has been observed based on the slow accumulation of large lung burdens of particles, not on small daily fluctuations in PM levels. Acute toxicity from inhaled particles is associated mainly with acidic particles and is observed at much higher concentrations than those observed in the epidemiology studies reporting an association between PM concentrations and morbidity/mortality. To approach the difficult problem of determining if the association between PM concentrations and daily morbidity and mortality is biologically plausible and causal, one must consider (1) the chemical and physical characteristics of the particles in the inhaled atmospheres, (2) the characteristics of the morbidity/mortality observed and the people who are affected, and (3) potential mechanisms that might link the two

  6. A swarm intelligence framework for reconstructing gene networks: searching for biologically plausible architectures.

    Science.gov (United States)

    Kentzoglanakis, Kyriakos; Poole, Matthew

    2012-01-01

    In this paper, we investigate the problem of reverse engineering the topology of gene regulatory networks from temporal gene expression data. We adopt a computational intelligence approach comprising swarm intelligence techniques, namely particle swarm optimization (PSO) and ant colony optimization (ACO). In addition, the recurrent neural network (RNN) formalism is employed for modeling the dynamical behavior of gene regulatory systems. More specifically, ACO is used for searching the discrete space of network architectures and PSO for searching the corresponding continuous space of RNN model parameters. We propose a novel solution construction process in the context of ACO for generating biologically plausible candidate architectures. The objective is to concentrate the search effort into areas of the structure space that contain architectures which are feasible in terms of their topological resemblance to real-world networks. The proposed framework is initially applied to the reconstruction of a small artificial network that has previously been studied in the context of gene network reverse engineering. Subsequently, we consider an artificial data set with added noise for reconstructing a subnetwork of the genetic interaction network of S. cerevisiae (yeast). Finally, the framework is applied to a real-world data set for reverse engineering the SOS response system of the bacterium Escherichia coli. Results demonstrate the relative advantage of utilizing problem-specific knowledge regarding biologically plausible structural properties of gene networks over conducting a problem-agnostic search in the vast space of network architectures.
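
    The Python/NumPy sketch below illustrates only the plausibility-biased architecture-sampling step described above: candidate regulatory edges are drawn with probabilities proportional to pheromone levels, and a small in-degree cap biases construction toward sparse topologies resembling real gene networks. The cap, the pheromone update rule, and the fitness value are illustrative assumptions; the paper's full framework couples such sampling with PSO-trained RNN parameters, which is omitted here.

      import numpy as np

      def sample_architecture(pheromone, max_in_degree, rng):
          """Draw one candidate regulatory network (adjacency matrix) edge by edge."""
          n = pheromone.shape[0]
          adj = np.zeros((n, n), dtype=int)
          for target in range(n):
              # Regulators are picked with probability proportional to pheromone levels.
              probs = pheromone[:, target] / pheromone[:, target].sum()
              k = rng.integers(1, max_in_degree + 1)       # sparse, plausible in-degree
              regulators = rng.choice(n, size=k, replace=False, p=probs)
              adj[regulators, target] = 1
          return adj

      rng = np.random.default_rng(1)
      n_genes = 6
      pheromone = np.ones((n_genes, n_genes))              # uniform initial pheromone
      candidate = sample_architecture(pheromone, max_in_degree=2, rng=rng)
      # After a candidate is scored (in the paper, by how well a PSO-fitted RNN
      # reproduces the expression data), its edges are reinforced and all trails decay.
      score = 0.8                                          # hypothetical fitness value
      pheromone = 0.9 * pheromone + 0.1 * score * candidate
      print(candidate)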

  7. Gene-ontology enrichment analysis in two independent family-based samples highlights biologically plausible processes for autism spectrum disorders.

    LENUS (Irish Health Repository)

    Anney, Richard J L

    2012-02-01

    Recent genome-wide association studies (GWAS) have implicated a range of genes from discrete biological pathways in the aetiology of autism. However, despite the strong influence of genetic factors, association studies have yet to identify statistically robust, replicated major effect genes or SNPs. We apply the principle of the SNP ratio test methodology described by O'Dushlaine et al to over 2100 families from the Autism Genome Project (AGP). Using a two-stage design we examine association enrichment in 5955 unique gene-ontology classifications across four groupings based on two phenotypic and two ancestral classifications. Based on estimates from simulation we identify an excess of association enrichment across all analyses. We observe enrichment in association for sets of genes involved in diverse biological processes, including pyruvate metabolism, transcription factor activation, cell-signalling and cell-cycle regulation. Both genes and processes that show enrichment have previously been examined in autistic disorders and offer biological plausibility to these findings.
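
    The Python/NumPy sketch below gives a much-simplified version of the SNP-ratio-test idea referred to above: for a gene set, compute the ratio of nominally significant SNPs to all SNPs mapped to the set, then compare against ratios from resampled SNP p-values to obtain an empirical enrichment p-value. The significance threshold, the resampling scheme (the published method uses simulated GWAS rather than simple resampling), and the toy data are assumptions for illustration.

      import numpy as np

      def snp_ratio_enrichment(pvals_in_set, all_pvals, n_perm=2000, alpha=0.05, rng=None):
          rng = rng if rng is not None else np.random.default_rng()
          observed = np.mean(pvals_in_set < alpha)          # ratio of significant SNPs
          k = len(pvals_in_set)
          null_ratios = np.array([
              np.mean(rng.choice(all_pvals, size=k, replace=False) < alpha)
              for _ in range(n_perm)
          ])
          # Empirical p-value: how often a random SNP set of the same size is as enriched.
          return (np.sum(null_ratios >= observed) + 1) / (n_perm + 1)

      rng = np.random.default_rng(2)
      all_pvals = rng.uniform(size=5000)                    # toy genome-wide SNP p-values
      set_pvals = rng.beta(0.6, 1.0, size=80)               # toy, mildly enriched gene set
      print(f"empirical enrichment p = {snp_ratio_enrichment(set_pvals, all_pvals, rng=rng):.4f}")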

  8. Photoinduced catalytic synthesis of biologically important metabolites from formaldehyde and ammonia under plausible "prebiotic" conditions

    Science.gov (United States)

    Delidovich, I. V.; Taran, O. P.; Simonov, A. N.; Matvienko, L. G.; Parmon, V. N.

    2011-08-01

    The article analyzes new and previously reported data on several catalytic and photochemical processes yielding biologically important molecules. UV-irradiation of formaldehyde aqueous solution yields acetaldehyde, glyoxal, glycolaldehyde and glyceraldehyde, which can serve as precursors of more complex biochemically relevant compounds. Photolysis of aqueous solution of acetaldehyde and ammonium nitrate results in formation of alanine and pyruvic acid. Dehydration of glyceraldehyde catalyzed by zeolite HZSM-5-17 yields pyruvaldehyde. Monosaccharides are formed in the course of the phosphate-catalyzed aldol condensation reactions of glycolaldehyde, glyceraldehyde and formaldehyde. The possibility of the direct synthesis of tetroses, keto- and aldo-pentoses from pure formaldehyde due to the combination of the photochemical production of glycolaldehyde and phosphate-catalyzed carbohydrate chain growth is demonstrated. Erythrulose and 3-pentulose are the main products of such combined synthesis with selectivity up to 10%. Biologically relevant aldotetroses, aldo- and ketopentoses are more resistant to the photochemical destruction owing to the stabilization in hemiacetal cyclic forms. They are formed as products of isomerization of erythrulose and 3-pentulose. The combination of these reactions provides a plausible route to the formation of sugars, amino acids and organic acids from formaldehyde and ammonia under presumed 'prebiotic' conditions.

  9. Systems biology perspectives on minimal and simpler cells.

    Science.gov (United States)

    Xavier, Joana C; Patil, Kiran Raosaheb; Rocha, Isabel

    2014-09-01

    The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  10. Systems Biology Perspectives on Minimal and Simpler Cells

    Science.gov (United States)

    Xavier, Joana C.; Patil, Kiran Raosaheb

    2014-01-01

    SUMMARY The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. PMID:25184563

  11. Estimating biological elementary flux modes that decompose a flux distribution by the minimal branching property

    DEFF Research Database (Denmark)

    Chan, Siu Hung Joshua; Solem, Christian; Jensen, Peter Ruhdal

    2014-01-01

    biologically feasible EFMs by considering their graphical properties. A previous study on the transcriptional regulation of metabolic genes found that distinct branches at a branch point metabolite usually belong to distinct metabolic pathways. This suggests an intuitive property of biologically feasible EFMs......, i.e. minimal branching. RESULTS: We developed the concept of minimal branching EFM and derived the minimal branching decomposition (MBD) to decompose flux distributions. Testing in the core Escherichia coli metabolic network indicated that MBD can distinguish branches at branch points and greatly...... knowledge, which facilitates interpretation. Comparison of the methods applied to a complex flux distribution in Lactococcus lactis similarly showed the advantages of MBD. The minimal branching EFM concept underlying MBD should be useful in other applications....

  12. Speech recognition employing biologically plausible receptive fields

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Bothe, Hans-Heinrich

    2011-01-01

    spectro-temporal receptive fields to auditory spectrogram input, motivated by the auditory pathway of humans, and ii) the adaptation or learning algorithms involved are biologically inspired. This is in contrast to state-of-the-art combinations of Mel-frequency cepstral coefficients and Hidden Markov...

  13. MRI Proton Density Fat Fraction Is Robust Across the Biologically Plausible Range of Triglyceride Spectra in Adults With Nonalcoholic Steatohepatitis

    Science.gov (United States)

    Hong, Cheng William; Mamidipalli, Adrija; Hooker, Jonathan C.; Hamilton, Gavin; Wolfson, Tanya; Chen, Dennis H.; Dehkordy, Soudabeh Fazeli; Middleton, Michael S.; Reeder, Scott B.; Loomba, Rohit; Sirlin, Claude B.

    2017-01-01

    Background Proton density fat fraction (PDFF) estimation requires spectral modeling of the hepatic triglyceride (TG) signal. Deviations in the TG spectrum may occur, leading to bias in PDFF quantification. Purpose To investigate the effects of varying six-peak TG spectral models on PDFF estimation bias. Study Type Retrospective secondary analysis of prospectively acquired clinical research data. Population Forty-four adults with biopsy-confirmed nonalcoholic steatohepatitis. Field Strength/Sequence Confounder-corrected chemical-shift-encoded 3T MRI (using a 2D multiecho gradient-recalled echo technique with magnitude reconstruction) and MR spectroscopy. Assessment In each patient, 61 pairs of colocalized MRI-PDFF and MRS-PDFF values were estimated: one pair used the standard six-peak spectral model, the other 60 were six-peak variants calculated by adjusting spectral model parameters over their biologically plausible ranges. MRI-PDFF values calculated using each variant model and the standard model were compared, and the agreement between MRI-PDFF and MRS-PDFF was assessed. Statistical Tests MRS-PDFF and MRI-PDFF were summarized descriptively. Bland–Altman (BA) analyses were performed between PDFF values calculated using each variant model and the standard model. Linear regressions were performed between BA biases and mean PDFF values for each variant model, and between MRI-PDFF and MRS-PDFF. Results Using the standard model, mean MRS-PDFF of the study population was 17.9±8.0% (range: 4.1–34.3%). The difference between the highest and lowest mean variant MRI-PDFF values was 1.5%. Relative to the standard model, the model with the greatest absolute BA bias overestimated PDFF by 1.2%. Bias increased with increasing PDFF. Data Conclusion Across the observed range of hepatic fat content, PDFF estimation is robust across the biologically plausible range of TG spectra. Although absolute estimation bias increased with higher PDFF, its magnitude was small and unlikely to be clinically meaningful.

  14. Toward Petascale Biologically Plausible Neural Networks

    Science.gov (United States)

    Long, Lyle

    This talk will describe an approach to achieving petascale neural networks. Artificial intelligence has been oversold for many decades. Computers in the beginning could only do about 16,000 operations per second. Computer processing power, however, has been doubling every two years thanks to Moore's law, and growing even faster due to massively parallel architectures. Finally, 60 years after the first AI conference we have computers on the order of the performance of the human brain (10^16 operations per second). The main issues now are algorithms, software, and learning. We have excellent models of neurons, such as the Hodgkin-Huxley model, but we do not know how the human neurons are wired together. With careful attention to efficient parallel computing, event-driven programming, table lookups, and memory minimization, massive-scale simulations can be performed. The code that will be described was written in C++ and uses the Message Passing Interface (MPI). It uses the full Hodgkin-Huxley neuron model, not a simplified model. It also allows arbitrary network structures (deep, recurrent, convolutional, all-to-all, etc.). The code is scalable, and has, so far, been tested on up to 2,048 processor cores using 10^7 neurons and 10^9 synapses.
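
    For reference, the Python/NumPy sketch below integrates a single Hodgkin-Huxley point neuron with forward Euler, using the textbook squid-axon parameters; it is not the author's C++/MPI code, and the stimulus current, step size, and crude spike detection are arbitrary illustrative choices.

      import numpy as np

      # Standard Hodgkin-Huxley parameters (mV, ms, uA/cm^2, mS/cm^2).
      C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
      E_Na, E_K, E_L = 50.0, -77.0, -54.387

      def alpha_beta(V):
          a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
          b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
          a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
          b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
          a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
          b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
          return a_m, b_m, a_h, b_h, a_n, b_n

      def simulate(I_ext=10.0, T=50.0, dt=0.01):
          V, m, h, n = -65.0, 0.05, 0.6, 0.32          # typical resting-state values
          spikes, above = 0, False
          for _ in range(int(T / dt)):
              a_m, b_m, a_h, b_h, a_n, b_n = alpha_beta(V)
              m += dt * (a_m * (1 - m) - b_m * m)
              h += dt * (a_h * (1 - h) - b_h * h)
              n += dt * (a_n * (1 - n) - b_n * n)
              I_ion = (g_Na * m**3 * h * (V - E_Na)
                       + g_K * n**4 * (V - E_K)
                       + g_L * (V - E_L))
              V += dt * (I_ext - I_ion) / C_m
              if V > 0 and not above:                  # crude spike detection
                  spikes, above = spikes + 1, True
              elif V < -30:
                  above = False
          return spikes

      print("spikes in 50 ms:", simulate())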

  15. Bisimulation for Single-Agent Plausibility Models

    DEFF Research Database (Denmark)

    Andersen, Mikkel Birkegaard; Bolander, Thomas; van Ditmarsch, H.

    2013-01-01

    define a proper notion of bisimulation, and prove that bisimulation corresponds to logical equivalence on image-finite models. We relate our results to other epistemic notions, such as safe belief and degrees of belief. Our results imply that there are only finitely many non-bisimilar single-agent epistemic plausibility models on a finite set of propositions. This gives decidability for single-agent epistemic plausibility planning.

  16. From Never Born Proteins to Minimal Living Cells: two projects in synthetic biology.

    Science.gov (United States)

    Luisi, Pier Luigi; Chiarabelli, Cristiano; Stano, Pasquale

    2006-12-01

    The Never Born Proteins (NBPs) and the Minimal Cell projects are two currently developed research lines belonging to the field of synthetic biology. The first deals with the investigation of structural and functional properties of de novo proteins with random sequences, selected and isolated using phage display methods. The minimal cell is the simplest cellular construct which displays living properties, such as self-maintenance, self-reproduction and evolvability. The semi-synthetic approach to minimal cells involves the use of extant genes and proteins in order to build a supramolecular construct based on lipid vesicles. Results and outlooks on these two research lines are briefly discussed, mainly focusing on their relevance to the origin of life studies.

  17. The application of the random regret minimization model to drivers’ choice of crash avoidance maneuvers

    DEFF Research Database (Denmark)

    Kaplan, Sigal; Prato, Carlo Giacomo

    This study explores the plausibility of regret minimization as behavioral paradigm underlying the choice of crash avoidance maneuvers. Alternatively to previous studies that considered utility maximization, this study applies the random regret minimization (RRM) model while assuming that drivers ...

  18. The application of the random regret minimization model to drivers’ choice of crash avoidance maneuvers

    DEFF Research Database (Denmark)

    Kaplan, Sigal; Prato, Carlo Giacomo

    2012-01-01

    This study explores the plausibility of regret minimization as behavioral paradigm underlying the choice of crash avoidance maneuvers. Alternatively to previous studies that considered utility maximization, this study applies the random regret minimization (RRM) model while assuming that drivers ...
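
    The Python/NumPy sketch below shows the classical random regret minimization choice rule that such a study builds on: the regret of an alternative sums, over competing alternatives and attributes, a log-sum term that penalizes being outperformed, and choice probabilities follow a logit over negative regret. The attribute matrix and taste parameters are made-up illustrations, not estimates from the study.

      import numpy as np

      def rrm_choice_probabilities(X, beta):
          """X: (n_alternatives, n_attributes) attribute levels; beta: taste weights."""
          n = X.shape[0]
          regret = np.zeros(n)
          for i in range(n):
              for j in range(n):
                  if i == j:
                      continue
                  # Regret grows when alternative j beats i on an attribute.
                  regret[i] += np.sum(np.log1p(np.exp(beta * (X[j] - X[i]))))
          expneg = np.exp(-regret)
          return expneg / expneg.sum()

      # Toy example: three crash-avoidance maneuvers described by two attributes
      # (e.g., expected effectiveness and required effort; purely illustrative).
      X = np.array([[0.8, 0.3],
                    [0.6, 0.1],
                    [0.4, 0.6]])
      beta = np.array([2.0, -1.0])      # hypothetical taste parameters
      print(rrm_choice_probabilities(X, beta))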

  19. A biologically plausible transform for visual recognition that is invariant to translation, scale and rotation

    Directory of Open Access Journals (Sweden)

    Pavel Sountsov

    2011-11-01

    Full Text Available Visual object recognition occurs easily despite differences in position, size, and rotation of the object, but the neural mechanisms responsible for this invariance are not known. We have found a set of transforms that achieve invariance in a neurally plausible way. We find that a transform based on local spatial frequency analysis of oriented segments and on logarithmic mapping, when applied twice in an iterative fashion, produces an output image that is unique to the object and that remains constant as the input image is shifted, scaled or rotated.

  20. A Biologically Plausible Transform for Visual Recognition that is Invariant to Translation, Scale, and Rotation.

    Science.gov (United States)

    Sountsov, Pavel; Santucci, David M; Lisman, John E

    2011-01-01

    Visual object recognition occurs easily despite differences in position, size, and rotation of the object, but the neural mechanisms responsible for this invariance are not known. We have found a set of transforms that achieve invariance in a neurally plausible way. We find that a transform based on local spatial frequency analysis of oriented segments and on logarithmic mapping, when applied twice in an iterative fashion, produces an output image that is unique to the object and that remains constant as the input image is shifted, scaled, or rotated.
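
    The Python/NumPy sketch below is a compact analogue of the two-stage idea in these two records, following the classic Fourier-Mellin construction rather than the authors' neural implementation: an FFT magnitude removes translation, a log-polar remap turns scaling and rotation into shifts, and a second FFT magnitude removes those shifts. The grid sizes and nearest-neighbour sampling are simplifications.

      import numpy as np

      def log_polar(mag, n_r=64, n_theta=64):
          """Resample a centred magnitude spectrum onto a log-radius / angle grid."""
          cy, cx = (np.array(mag.shape) - 1) / 2.0
          r_max = min(cy, cx)
          rs = np.exp(np.linspace(0.0, np.log(r_max), n_r))
          ts = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
          rr, tt = np.meshgrid(rs, ts, indexing="ij")
          ys = np.clip(np.rint(cy + rr * np.sin(tt)).astype(int), 0, mag.shape[0] - 1)
          xs = np.clip(np.rint(cx + rr * np.cos(tt)).astype(int), 0, mag.shape[1] - 1)
          return mag[ys, xs]

      def invariant_signature(img):
          spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))   # translation-invariant
          lp = log_polar(spectrum)                                # scale/rotation -> shifts
          sig = np.abs(np.fft.fft2(lp))                           # shift-invariant again
          return sig / (sig.sum() + 1e-12)

      rng = np.random.default_rng(3)
      img = rng.random((128, 128))
      shifted = np.roll(img, (10, -7), axis=(0, 1))
      a, b = invariant_signature(img), invariant_signature(shifted)
      print("difference under translation:", float(np.abs(a - b).mean()))   # ~0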

  1. Pilgrims sailing the Titanic: plausibility effects on memory for misinformation.

    Science.gov (United States)

    Hinze, Scott R; Slaten, Daniel G; Horton, William S; Jenkins, Ryan; Rapp, David N

    2014-02-01

    People rely on information they read even when it is inaccurate (Marsh, Meade, & Roediger, Journal of Memory and Language 49:519-536, 2003), but how ubiquitous is this phenomenon? In two experiments, we investigated whether this tendency to encode and rely on inaccuracies from text might be influenced by the plausibility of misinformation. In Experiment 1, we presented stories containing inaccurate plausible statements (e.g., "The Pilgrims' ship was the Godspeed"), inaccurate implausible statements (e.g., . . . the Titanic), or accurate statements (e.g., . . . the Mayflower). On a subsequent test of general knowledge, participants relied significantly less on implausible than on plausible inaccuracies from the texts but continued to rely on accurate information. In Experiment 2, we replicated these results with the addition of a think-aloud procedure to elicit information about readers' noticing and evaluative processes for plausible and implausible misinformation. Participants indicated more skepticism and less acceptance of implausible than of plausible inaccuracies. In contrast, they often failed to notice, completely ignored, and at times even explicitly accepted the misinformation provided by plausible lures. These results offer insight into the conditions under which reliance on inaccurate information occurs and suggest potential mechanisms that may underlie reported misinformation effects.

  2. Plausibility and evidence: the case of homeopathy.

    Science.gov (United States)

    Rutten, Lex; Mathie, Robert T; Fisher, Peter; Goossens, Maria; van Wassenhoven, Michel

    2013-08-01

    Homeopathy is controversial and hotly debated. The conclusions of systematic reviews of randomised controlled trials of homeopathy vary from 'comparable to conventional medicine' to 'no evidence of effects beyond placebo'. It is claimed that homeopathy conflicts with scientific laws and that homoeopaths reject the naturalistic outlook, but no evidence has been cited. We are homeopathic physicians and researchers who do not reject the scientific outlook; we believe that examination of the prior beliefs underlying this enduring stand-off can advance the debate. We show that interpretations of the same set of evidence--for homeopathy and for conventional medicine--can diverge. Prior disbelief in homeopathy is rooted in the perceived implausibility of any conceivable mechanism of action. Using the 'crossword analogy', we demonstrate that plausibility bias impedes assessment of the clinical evidence. Sweeping statements about the scientific impossibility of homeopathy are themselves unscientific: scientific statements must be precise and testable. There is growing evidence that homeopathic preparations can exert biological effects; due consideration of such research would reduce the influence of prior beliefs on the assessment of systematic review evidence.

  3. Heuristic Elements of Plausible Reasoning.

    Science.gov (United States)

    Dudczak, Craig A.

    At least some of the reasoning processes involved in argumentation rely on inferences which do not fit within the traditional categories of inductive or deductive reasoning. The reasoning processes involved in plausibility judgments have neither the formal certainty of deduction nor the imputed statistical probability of induction. When utilizing…

  4. Plausible values in statistical inference

    NARCIS (Netherlands)

    Marsman, M.

    2014-01-01

    In Chapter 2 it is shown that the marginal distribution of plausible values is a consistent estimator of the true latent variable distribution, and, furthermore, that convergence is monotone in an embedding in which the number of items tends to infinity. This result is used to clarify some of the

  5. Biological pacemaker created by minimally invasive somatic reprogramming in pigs with complete heart block

    Science.gov (United States)

    Hu, Yu-Feng; Dawkins, James Frederick; Cho, Hee Cheol; Marbán, Eduardo; Cingolani, Eugenio

    2016-01-01

    Somatic reprogramming by reexpression of the embryonic transcription factor T-box 18 (TBX18) converts cardiomyocytes into pacemaker cells. We hypothesized that this could be a viable therapeutic avenue for pacemaker-dependent patients afflicted with device-related complications, and therefore tested whether adenoviral TBX18 gene transfer could create biological pacemaker activity in vivo in a large-animal model of complete heart block. Biological pacemaker activity, originating from the intramyocardial injection site, was evident in TBX18-transduced animals starting at day 2 and persisted for the duration of the study (14 days) with minimal backup electronic pacemaker use. Relative to controls transduced with a reporter gene, TBX18-transduced animals exhibited enhanced autonomic responses and physiologically superior chronotropic support of physical activity. Induced sinoatrial node cells could be identified by their distinctive morphology at the site of injection in TBX18-transduced animals, but not in controls. No local or systemic safety concerns arose. Thus, minimally invasive TBX18 gene transfer creates physiologically relevant pacemaker activity in complete heart block, providing evidence for therapeutic somatic reprogramming in a clinically relevant disease model. PMID:25031269

  6. Anatomically Plausible Surface Alignment and Reconstruction

    DEFF Research Database (Denmark)

    Paulsen, Rasmus R.; Larsen, Rasmus

    2010-01-01

    With the increasing clinical use of 3D surface scanners, there is a need for accurate and reliable algorithms that can produce anatomically plausible surfaces. In this paper, a combined method for surface alignment and reconstruction is proposed. It is based on an implicit surface representation...

  7. Stereotyping to infer group membership creates plausible deniability for prejudice-based aggression.

    Science.gov (United States)

    Cox, William T L; Devine, Patricia G

    2014-02-01

    In the present study, participants administered painful electric shocks to an unseen male opponent who was either explicitly labeled as gay or stereotypically implied to be gay. Identifying the opponent with a gay-stereotypic attribute produced a situation in which the target's group status was privately inferred but plausibly deniable to others. To test the plausible deniability hypothesis, we examined aggression levels as a function of internal (personal) and external (social) motivation to respond without prejudice. Whether plausible deniability was present or absent, participants high in internal motivation aggressed at low levels, and participants low in both internal and external motivation aggressed at high levels. The behavior of participants low in internal and high in external motivation, however, depended on experimental condition. They aggressed at low levels when observers could plausibly attribute their behavior to prejudice and aggressed at high levels when the situation granted plausible deniability. This work has implications for both obstacles to and potential avenues for prejudice-reduction efforts.

  8. Application of plausible reasoning to AI-based control systems

    Science.gov (United States)

    Berenji, Hamid; Lum, Henry, Jr.

    1987-01-01

    Some current approaches to plausible reasoning in artificial intelligence are reviewed and discussed. Some of the most significant recent advances in plausible and approximate reasoning are examined. A synergism among the techniques of uncertainty management is advocated, and brief discussions on the certainty factor approach, probabilistic approach, Dempster-Shafer theory of evidence, possibility theory, linguistic variables, and fuzzy control are presented. Some extensions to these methods are described, and the applications of the methods are considered.

  9. Credibility judgments of narratives: language, plausibility, and absorption.

    Science.gov (United States)

    Nahari, Galit; Glicksohn, Joseph; Nachson, Israel

    2010-01-01

    Two experiments were conducted in order to find out whether textual features of narratives differentially affect credibility judgments made by judges having different levels of absorption (a disposition associated with rich visual imagination). Participants in both experiments were exposed to a textual narrative and requested to judge whether the narrator actually experienced the event he described in his story. In Experiment 1, the narrative varied in terms of language (literal, figurative) and plausibility (ordinary, anomalous). In Experiment 2, the narrative varied in terms of language only. The participants' perceptions of the plausibility of the story described and the extent to which they were absorbed in reading were measured. The data from both experiments together suggest that the groups applied entirely different criteria in credibility judgments. For high-absorption individuals, their credibility judgment depends on the degree to which the text can be assimilated into their own vivid imagination, whereas for low-absorption individuals it depends mainly on plausibility. That is, high-absorption individuals applied an experiential mental set while judging the credibility of the narrator, whereas low-absorption individuals applied an instrumental mental set. Possible cognitive mechanisms and implications for credibility judgments are discussed.

  10. Searching for Plausible N-k Contingencies Endangering Voltage Stability

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel; Van Cutsem, Thierry

    2017-01-01

    This paper presents a novel search algorithm using time-domain simulations to identify plausible N − k contingencies endangering voltage stability. Starting from an initial list of disturbances, progressively more severe contingencies are investigated. After simulation of an N − k contingency, the simulation results are assessed. If the system response is unstable, a plausible harmful contingency sequence has been found. Otherwise, components affected by the contingencies are considered as candidate next events leading to N − (k + 1) contingencies. This implicitly takes into account hidden failures...
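
    The Python sketch below shows the search loop described above in schematic form: each candidate contingency sequence is simulated, recorded if the response is unstable, and otherwise extended with the components affected by the events so far. The simulate stub and toy component data are placeholders; in the paper this role is played by full time-domain dynamic simulations.

      from collections import deque

      def find_harmful_sequences(initial_events, simulate, max_k=3):
          """Breadth-first search over contingency sequences of growing length."""
          harmful = []
          queue = deque((event,) for event in initial_events)
          while queue:
              sequence = queue.popleft()
              stable, affected = simulate(sequence)
              if not stable:
                  harmful.append(sequence)          # plausible harmful N-k sequence found
              elif len(sequence) < max_k:
                  for component in affected:        # affected components become candidates
                      if component not in sequence:
                          queue.append(sequence + (component,))
          return harmful

      # Toy stand-in for a time-domain simulation: losing both lines feeding a bus
      # is unstable, and tripping a line "affects" the components that share a bus.
      def simulate(sequence):
          unstable = {"line_a", "line_b"}.issubset(sequence)
          affected = {"line_b", "line_c"} if "line_a" in sequence else {"line_a"}
          return (not unstable), sorted(affected - set(sequence))

      print(find_harmful_sequences(["line_a", "line_b", "line_c"], simulate))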

  11. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation.

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-11-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition. Copyright © 2015 Cognitive Science Society, Inc.

  12. Taxonomic minimalism.

    Science.gov (United States)

    Beattie, A J; Oliver, I

    1994-12-01

    Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. Copyright © 1994. Published by Elsevier Ltd.

  13. Endocrine disrupting chemicals and human health: The plausibility ...

    African Journals Online (AJOL)

    The plausibility of research results on DDT and reproductive health ... chemicals in the environment and that human health is inextricably linked to the health of ... periods of folliculo-genesis or embryo-genesis that increases risk for adverse effects.

  14. Biological insights from 108 schizophrenia-associated genetic loci

    DEFF Research Database (Denmark)

    Ripke, Stephan; Neale, Benjamin M.; Corvin, Aiden

    2014-01-01

    and 113,075 controls. We identify 128 independent associations spanning 108 conservatively defined loci that meet genome-wide significance, 83 of which have not been previously reported. Associations were enriched among genes expressed in brain, providing biological plausibility for the findings. Many...

  15. Neural networks, nativism, and the plausibility of constructivism.

    Science.gov (United States)

    Quartz, S R

    1993-09-01

    Recent interest in PDP (parallel distributed processing) models is due in part to the widely held belief that they challenge many of the assumptions of classical cognitive science. In the domain of language acquisition, for example, there has been much interest in the claim that PDP models might undermine nativism. Related arguments based on PDP learning have also been given against Fodor's anti-constructivist position--a position that has contributed to the widespread dismissal of constructivism. A limitation of many of the claims regarding PDP learning, however, is that the principles underlying this learning have not been rigorously characterized. In this paper, I examine PDP models from within the framework of Valiant's PAC (probably approximately correct) model of learning, now the dominant model in machine learning, and which applies naturally to neural network learning. From this perspective, I evaluate the implications of PDP models for nativism and Fodor's influential anti-constructivist position. In particular, I demonstrate that, contrary to a number of claims, PDP models are nativist in a robust sense. I also demonstrate that PDP models actually serve as a good illustration of Fodor's anti-constructivist position. While these results may at first suggest that neural network models in general are incapable of the sort of concept acquisition that is required to refute Fodor's anti-constructivist position, I suggest that there is an alternative form of neural network learning that demonstrates the plausibility of constructivism. This alternative form of learning is a natural interpretation of the constructivist position in terms of neural network learning, as it employs learning algorithms that incorporate the addition of structure in addition to weight modification schemes. By demonstrating that there is a natural and plausible interpretation of constructivism in terms of neural network learning, the position that nativism is the only plausible model of

  16. Generation of Plausible Hurricane Tracks for Preparedness Exercises

    Science.gov (United States)

    2017-04-25

    product kernel. KDE with a beta kernel generates maximum sustained winds, and linear regression simulates minimum central pressure. Maximum significant ... the Storm level models the number of waypoints M, birth and death locations w_1 and w_M, and total number of steps L. The Stage level models the ... MATLAB and leverages HURDAT2 to construct data-driven statistical models that can generate plausible yet never-before-seen storm behaviors. For a ...
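
    The Python sketch below illustrates the part of the generation scheme that survives in this snippet: fit a kernel density estimate to historical maximum sustained winds, resample synthetic values, and predict minimum central pressure from wind via linear regression. SciPy's Gaussian KDE stands in for the beta and product kernels named in the report, and the "historical" numbers are made-up toy values rather than HURDAT2 records.

      import numpy as np
      from scipy.stats import gaussian_kde

      # Made-up toy "historical" records (knots, hPa); real work would draw on HURDAT2.
      max_wind = np.array([45, 60, 75, 90, 100, 115, 130, 85, 70, 55], dtype=float)
      min_pressure = np.array([1000, 990, 975, 960, 950, 935, 920, 965, 980, 995], dtype=float)

      kde = gaussian_kde(max_wind)                   # density model of peak winds
      synthetic_wind = kde.resample(5)[0]            # draw plausible new storm intensities

      slope, intercept = np.polyfit(max_wind, min_pressure, deg=1)
      synthetic_pressure = slope * synthetic_wind + intercept

      for w, p in zip(synthetic_wind, synthetic_pressure):
          print(f"wind {w:5.1f} kt -> pressure {p:6.1f} hPa")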

  17. Restoration ecology: two-sex dynamics and cost minimization.

    Directory of Open Access Journals (Sweden)

    Ferenc Molnár

    Full Text Available We model a spatially detailed, two-sex population dynamics, to study the cost of ecological restoration. We assume that cost is proportional to the number of individuals introduced into a large habitat. We treat dispersal as homogeneous diffusion in a one-dimensional reaction-diffusion system. The local population dynamics depends on sex ratio at birth, and allows mortality rates to differ between sexes. Furthermore, local density dependence induces a strong Allee effect, implying that the initial population must be sufficiently large to avert rapid extinction. We address three different initial spatial distributions for the introduced individuals; for each we minimize the associated cost, constrained by the requirement that the species must be restored throughout the habitat. First, we consider spatially inhomogeneous, unstable stationary solutions of the model's equations as plausible candidates for small restoration cost. Second, we use numerical simulations to find the smallest rectangular cluster, enclosing a spatially homogeneous population density, that minimizes the cost of assured restoration. Finally, by employing simulated annealing, we minimize restoration cost among all possible initial spatial distributions of females and males. For biased sex ratios, or for a significant between-sex difference in mortality, we find that sex-specific spatial distributions minimize the cost. But as long as the sex ratio maximizes the local equilibrium density for given mortality rates, a common homogeneous distribution for both sexes that spans a critical distance yields a similarly low cost.

  18. Restoration ecology: two-sex dynamics and cost minimization.

    Science.gov (United States)

    Molnár, Ferenc; Caragine, Christina; Caraco, Thomas; Korniss, Gyorgy

    2013-01-01

    We model a spatially detailed, two-sex population dynamics, to study the cost of ecological restoration. We assume that cost is proportional to the number of individuals introduced into a large habitat. We treat dispersal as homogeneous diffusion in a one-dimensional reaction-diffusion system. The local population dynamics depends on sex ratio at birth, and allows mortality rates to differ between sexes. Furthermore, local density dependence induces a strong Allee effect, implying that the initial population must be sufficiently large to avert rapid extinction. We address three different initial spatial distributions for the introduced individuals; for each we minimize the associated cost, constrained by the requirement that the species must be restored throughout the habitat. First, we consider spatially inhomogeneous, unstable stationary solutions of the model's equations as plausible candidates for small restoration cost. Second, we use numerical simulations to find the smallest rectangular cluster, enclosing a spatially homogeneous population density, that minimizes the cost of assured restoration. Finally, by employing simulated annealing, we minimize restoration cost among all possible initial spatial distributions of females and males. For biased sex ratios, or for a significant between-sex difference in mortality, we find that sex-specific spatial distributions minimize the cost. But as long as the sex ratio maximizes the local equilibrium density for given mortality rates, a common homogeneous distribution for both sexes that spans a critical distance yields a similarly low cost.
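
    The Python/NumPy sketch below is a heavily reduced version of the kind of computation involved: a single-density (both sexes collapsed into one field) reaction-diffusion model with a strong Allee effect is integrated on a one-dimensional habitat, and the width of an initially occupied rectangular cluster is varied to see whether the introduced population spreads or collapses. The growth function, parameter values, and the single-field simplification are illustrative assumptions, not the paper's two-sex model.

      import numpy as np

      def restoration_outcome(width, density=0.8, D=1.0, r=1.0, allee=0.3,
                              L=100.0, dx=0.5, dt=0.05, T=200.0):
          """Integrate u_t = D*u_xx + r*u*(u - allee)*(1 - u) from a rectangular cluster."""
          x = np.arange(0.0, L, dx)
          u = np.where(np.abs(x - L / 2) < width / 2, density, 0.0)
          for _ in range(int(T / dt)):
              lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2   # periodic habitat
              u = u + dt * (D * lap + r * u * (u - allee) * (1.0 - u))
              u = np.clip(u, 0.0, 1.0)
          return u.mean()

      for width in (2.0, 5.0, 10.0, 20.0):
          final = restoration_outcome(width)
          status = "restored" if final > 0.5 else "collapsed"
          print(f"initial cluster width {width:5.1f}: mean density {final:.2f} ({status})")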

  19. Consistent robustness analysis (CRA) identifies biologically relevant properties of regulatory network models.

    Science.gov (United States)

    Saithong, Treenut; Painter, Kevin J; Millar, Andrew J

    2010-12-16

    A number of studies have previously demonstrated that "goodness of fit" is insufficient in reliably classifying the credibility of a biological model. Robustness and/or sensitivity analysis is commonly employed as a secondary method for evaluating the suitability of a particular model. The results of such analyses invariably depend on the particular parameter set tested, yet many parameter values for biological models are uncertain. Here, we propose a novel robustness analysis that aims to determine the "common robustness" of the model with multiple, biologically plausible parameter sets, rather than the local robustness for a particular parameter set. Our method is applied to two published models of the Arabidopsis circadian clock (the one-loop [1] and two-loop [2] models). The results reinforce current findings suggesting the greater reliability of the two-loop model and pinpoint the crucial role of TOC1 in the circadian network. Consistent Robustness Analysis can indicate both the relative plausibility of different models and also the critical components and processes controlling each model.
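
    The Python/NumPy sketch below illustrates the "common robustness" idea on a toy model: rather than perturbing parameters around one nominal set, many plausible parameter sets are sampled, a local sensitivity-based robustness score is computed for each, and the scores are averaged. The toy Hill-response model and the robustness metric are assumptions standing in for the circadian-clock ODE models analysed in the paper.

      import numpy as np

      def model_output(p, signal=1.0):
          """Toy stand-in for a model readout: a repressive Hill response."""
          v, k, n = p
          return v * k**n / (k**n + signal**n)

      def local_robustness(p, rel_step=0.05):
          base = model_output(p)
          sens = []
          for i in range(len(p)):
              q = np.array(p, dtype=float)
              q[i] *= 1.0 + rel_step                      # perturb one parameter at a time
              sens.append(abs(model_output(q) - base) / (abs(base) * rel_step))
          return 1.0 / (1.0 + np.mean(sens))              # 1 = insensitive, -> 0 = fragile

      rng = np.random.default_rng(5)
      # Sample parameter sets within assumed biologically plausible ranges.
      samples = np.column_stack([rng.uniform(0.5, 2.0, 200),    # v: production rate
                                 rng.uniform(0.2, 1.0, 200),    # k: repression threshold
                                 rng.uniform(1.0, 4.0, 200)])   # n: Hill coefficient
      consistent = np.mean([local_robustness(p) for p in samples])
      print(f"consistent robustness across plausible parameter sets: {consistent:.3f}")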

  20. Of paradox and plausibility: the dynamic of change in medical law.

    Science.gov (United States)

    Harrington, John

    2014-01-01

    This article develops a model of change in medical law. Drawing on systems theory, it argues that medical law participates in a dynamic of 'deparadoxification' and 'reparadoxification' whereby the underlying contingency of the law is variously concealed through plausible argumentation, or revealed by critical challenge. Medical law is, thus, thoroughly rhetorical. An examination of the development of the law on abortion and on the sterilization of incompetent adults shows that plausibility is achieved through the deployment of substantive common sense and formal stylistic devices. It is undermined where these elements are shown to be arbitrary and constructed. In conclusion, it is argued that the politics of medical law are constituted by this antagonistic process of establishing and challenging provisionally stable normative regimes. © The Author [2014]. Published by Oxford University Press; all rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Minimal information: an urgent need to assess the functional reliability of recombinant proteins used in biological experiments

    Directory of Open Access Journals (Sweden)

    de Marco Ario

    2008-07-01

    Full Text Available Structural characterization of proteins used in biological experiments is largely neglected. In most publications, the information available is totally insufficient to judge the functionality of the proteins used and, therefore, the significance of identified protein-protein interactions (was the interaction specific or due to unspecific binding of misfolded protein regions?) or the reliability of kinetic and thermodynamic data (how much protein was in its native form?). As a consequence, the results of single experiments might not only become questionable, but the whole reliability of systems biology, built on these foundations, would be weakened. The introduction of Minimal Information concerning purified proteins to add as metadata to the main body of a manuscript would make it straightforward to assess their functional and structural qualities and, consequently, the results obtained using these proteins. Furthermore, accepted standards for protein annotation would simplify data comparison and exchange. This article has been envisaged as a proposal for aggregating scientists who share the opinion that the scientific community needs a platform for Minimum Information for Protein Functionality Evaluation (MIPFE).

  2. Minimal rates for lepton flavour violation from supersymmetric leptogenesis

    International Nuclear Information System (INIS)

    Ibarra, A; Simonetto, C

    2010-01-01

    The see-saw is a very attractive model for neutrino mass generation in particular in association with supersymmetry as a solution to the hierarchy problem. Under the plausible assumptions of hierarchical neutrino Yukawa eigenvalues and the absence of cancellations, we derive an upper bound on the lightest right-handed neutrino mass from the non-observation of μ → eγ and μ-e conversion in nuclei. The ongoing experiment MEG as well as the planned experiments Mu2e, COMET and PRISM/PRIME will improve this bound if no evidence of lepton flavour violation is found. We lastly comment on the possibility of ruling out minimal leptogenesis if these experiments find no signal.

  3. Minimize corrosion degradation of steam generator tube materials

    International Nuclear Information System (INIS)

    Lu, Y.

    2006-01-01

    As part of a coordinated program, AECL is developing a set of tools to aid with the prediction and management of steam generator performance. Although stress corrosion cracking (of Alloy 800) has not been detected in any operating steam generator, for life management it is necessary to develop mechanistic models to predict the conditions under which stress corrosion cracking is plausible. Experimental data suggest that all steam generator tube materials are susceptible to corrosion degradation under some specific off-specification conditions. The tolerance to the chemistry upset for each steam generator tube alloy is different. Electrochemical corrosion behaviors of major steam generator tube alloys were studied under the plausible aggressive crevice chemistry conditions. The potential hazardous conditions leading to steam generator tube degradation and the conditions, which can minimize steam generator tube degradation have been determined. Recommended electrochemical corrosion potential/pH zones were defined for all major steam generator tube materials, including Alloys 600, 800, 690 and 400, under CANDU steam generator operating and startup conditions. Stress corrosion cracking tests and accelerated corrosion tests were carried out to verify and revise the recommended electrochemical corrosion potential/pH zones. Based on this information, utilities can prevent steam generator material degradation surprises by appropriate steam generator water chemistry management and increase the reliability of nuclear power generating stations. (author)

  4. A Stochastic Model of Plausibility in Live Virtual Constructive Environments

    Science.gov (United States)

    2017-09-14

    from the model parameters that are inputs to the computer (mathematical) model but whose exact values are unknown to experimentalists ... The dissertation includes a section on computing plausibility exceedance probabilities.

  5. Depressive symptoms predict head and neck cancer survival: Examining plausible behavioral and biological pathways.

    Science.gov (United States)

    Zimmaro, Lauren A; Sephton, Sandra E; Siwik, Chelsea J; Phillips, Kala M; Rebholz, Whitney N; Kraemer, Helena C; Giese-Davis, Janine; Wilson, Liz; Bumpous, Jeffrey M; Cash, Elizabeth D

    2018-03-01

    Head and neck cancers are associated with high rates of depression, which may increase the risk for poorer immediate and long-term outcomes. Here it was hypothesized that greater depressive symptoms would predict earlier mortality, and behavioral (treatment interruption) and biological (treatment response) mediators were examined. Patients (n = 134) reported depressive symptomatology at treatment planning. Clinical data were reviewed at the 2-year follow-up. Greater depressive symptoms were associated with significantly shorter survival (hazard ratio, 0.868; 95% confidence interval [CI], 0.819-0.921) ... (ratio, 0.865; 95% CI, 0.774-0.966; P = .010), and poorer treatment response (odds ratio, 0.879; 95% CI, 0.803-0.963; P = .005). The poorer treatment response partially explained the depression-survival relation. Other known prognostic indicators did not challenge these results. Depressive symptoms at the time of treatment planning predict overall 2-year mortality. Effects are partly influenced by the treatment response. Depression screening and intervention may be beneficial. Future studies should examine parallel biological pathways linking depression to cancer survival, including endocrine disruption and inflammation. Cancer 2018;124:1053-60. © 2018 American Cancer Society.

  6. Trial-by-Trial Modulation of Associative Memory Formation by Reward Prediction Error and Reward Anticipation as Revealed by a Biologically Plausible Computational Model.

    Science.gov (United States)

    Aberg, Kristoffer C; Müller, Julia; Schwartz, Sophie

    2017-01-01

    Anticipation and delivery of rewards improves memory formation, but little effort has been made to disentangle their respective contributions to memory enhancement. Moreover, it has been suggested that the effects of reward on memory are mediated by dopaminergic influences on hippocampal plasticity. Yet, evidence linking memory improvements to actual reward computations reflected in the activity of the dopaminergic system, i.e., prediction errors and expected values, is scarce and inconclusive. For example, different previous studies reported that the magnitude of prediction errors during a reinforcement learning task was a positive, negative, or non-significant predictor of successfully encoding simultaneously presented images. Individual sensitivities to reward and punishment have been found to influence the activation of the dopaminergic reward system and could therefore help explain these seemingly discrepant results. Here, we used a novel associative memory task combined with computational modeling and showed independent effects of reward-delivery and reward-anticipation on memory. Strikingly, the computational approach revealed positive influences from both reward delivery, as mediated by prediction error magnitude, and reward anticipation, as mediated by magnitude of expected value, even in the absence of behavioral effects when analyzed using standard methods, i.e., by collapsing memory performance across trials within conditions. We additionally measured trait estimates of reward and punishment sensitivity and found that individuals with increased reward (vs. punishment) sensitivity had better memory for associations encoded during positive (vs. negative) prediction errors when tested after 20 min, but a negative trend when tested after 24 h. In conclusion, modeling trial-by-trial fluctuations in the magnitude of reward, as we did here for prediction errors and expected value computations, provides a comprehensive and biologically plausible description of
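
    The Python/NumPy sketch below illustrates the modeling logic in miniature: a Rescorla-Wagner learner produces trial-by-trial expected values and reward prediction errors in a probabilistic reward task, and those quantities are then related to memory for items encoded on the same trials. The task structure, learning rate, and the simulated memory outcomes are illustrative assumptions, not the study's data or model.

      import numpy as np

      rng = np.random.default_rng(6)
      n_trials, alpha, reward_prob = 200, 0.1, 0.7

      value = 0.5
      expected_values, prediction_errors = [], []
      for _ in range(n_trials):
          reward = float(rng.random() < reward_prob)
          pe = reward - value                      # reward prediction error
          expected_values.append(value)
          prediction_errors.append(pe)
          value += alpha * pe                      # Rescorla-Wagner update

      ev = np.array(expected_values)
      pe_mag = np.abs(np.array(prediction_errors))

      # Simulate memory encoding that improves with both |PE| and expected value, then
      # recover the trial-by-trial relationship, as the computational analysis does.
      p_remember = 1.0 / (1.0 + np.exp(-(-1.0 + 2.0 * pe_mag + 1.5 * ev)))
      remembered = (rng.random(n_trials) < p_remember).astype(float)
      print("corr(|PE|, memory):", round(np.corrcoef(pe_mag, remembered)[0, 1], 3))
      print("corr(EV,  memory):", round(np.corrcoef(ev, remembered)[0, 1], 3))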

  7. Sludge minimization technologies - an overview

    Energy Technology Data Exchange (ETDEWEB)

    Oedegaard, Hallvard

    2003-07-01

    The management of wastewater sludge from wastewater treatment plants represents one of the major challenges in wastewater treatment today. The cost of the sludge treatment amounts to more than the cost of the liquid treatment in many cases. Therefore the focus on and interest in sludge minimization is steadily increasing. In this paper, an overview is given of sludge minimization (sludge mass reduction) options. It is demonstrated that sludge minimization may be a result of reduced production of sludge and/or disintegration processes that may take place both in the wastewater treatment stage and in the sludge stage. Various sludge disintegration technologies for sludge minimization are discussed, including mechanical methods (focusing on stirred ball-mill, high-pressure homogenizer, ultrasonic disintegrator), chemical methods (focusing on the use of ozone), physical methods (focusing on thermal and thermal/chemical hydrolysis) and biological methods (focusing on enzymatic processes). (author)

  8. Biology Teachers' Conceptions of the Diversity of Life and the Historical Development of Evolutionary Concepts

    Science.gov (United States)

    da Silva, Paloma Rodrigues; de Andrade, Mariana A. Bologna Soares; de Andrade Caldeira, Ana Maria

    2015-01-01

    Biology is a science that involves study of the diversity of living organisms. This diversity has always generated questions and has motivated cultures to seek plausible explanations for the differences and similarities between types of organisms. In biology teaching, these issues are addressed by adopting an evolutionary approach. The aim of this…

  9. A plausible neural circuit for decision making and its formation based on reinforcement learning.

    Science.gov (United States)

    Wei, Hui; Dai, Dawei; Bu, Yijie

    2017-06-01

    A human's, or a lower insect's, behavior is dominated by its nervous system. Each stable behavior has its own inner steps and control rules, and is regulated by a neural circuit. Understanding how the brain influences perception, thought, and behavior is a central mandate of neuroscience. The phototactic flight of insects is a widely observed deterministic behavior. Since its movement is not stochastic, the behavior should be dominated by a neural circuit. Based on the basic firing characteristics of biological neurons and the neural circuit's constitution, we designed a plausible neural circuit for this phototactic behavior from a logic perspective. The circuit's output layer, which generates a stable spike firing rate to encode flight commands, controls the insect's angular velocity when flying. The firing pattern and connection type of excitatory and inhibitory neurons are considered in this computational model. We simulated the circuit's information processing using a distributed PC array, and used the real-time average firing rate of output neuron clusters to drive a flying behavior simulation. In this paper, we also explored how a correct neural decision circuit is generated from a network flow view through a bee's behavior experiment based on the reward and punishment feedback mechanism. The significance of this study is as follows: firstly, we designed a neural circuit to achieve the behavioral logic rules by strictly following the electrophysiological characteristics of biological neurons and anatomical facts. Secondly, our circuit's generality permits the design and implementation of behavioral logic rules based on the most general information processing and activity mode of biological neurons. Thirdly, through computer simulation, we achieved new understanding of the cooperative conditions under which multiple neurons achieve behavioral control. Fourthly, this study aims at understanding the information encoding mechanism and how neural circuits achieve behavior control…
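    As a rough illustration of an output layer whose population firing rate encodes a flight command, the following sketch simulates a small leaky integrate-and-fire population and maps its average rate onto an angular velocity. All parameters and the rate-to-velocity gain are hypothetical and are not taken from the authors' circuit.

```python
import numpy as np

def lif_population_rate(input_current, n_neurons=50, t_sim=0.5, dt=1e-3,
                        tau=0.02, v_thresh=1.0, v_reset=0.0, noise=0.2, seed=0):
    """Average firing rate (Hz) of a leaky integrate-and-fire population
    driven by a common input current plus independent noise."""
    rng = np.random.default_rng(seed)
    v = np.zeros(n_neurons)
    spikes = 0
    steps = int(t_sim / dt)
    for _ in range(steps):
        dv = (-v + input_current + noise * rng.standard_normal(n_neurons)) * dt / tau
        v += dv
        fired = v >= v_thresh
        spikes += fired.sum()
        v[fired] = v_reset                      # reset membrane after a spike
    return spikes / (n_neurons * t_sim)

# Illustrative mapping: the population rate drives the insect's angular velocity
rate = lif_population_rate(input_current=1.2)
gain = 0.5  # hypothetical deg/s per Hz
print(f"output rate = {rate:.1f} Hz -> angular velocity = {gain * rate:.1f} deg/s")
```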

  10. Turing patterns and biological explanation

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Turing patterns are a class of minimal mathematical models that have been used to discover and conceptualize certain abstract features of early biological development. This paper examines a range of these minimal models in order to articulate and elaborate a philosophical analysis…, promoting theory exploration, and acting as constitutive parts of empirically adequate explanations of naturally occurring phenomena, such as biological pattern formation. Focusing on the roles that minimal model explanations play in science motivates the adoption of a broader diachronic view of scientific…

  11. Synthetic biology and its alternatives. Descartes, Kant and the idea of engineering biological machines.

    Science.gov (United States)

    Kogge, Werner; Richter, Michael

    2013-06-01

    The engineering-based approach of synthetic biology is characterized by an assumption that 'engineering by design' enables the construction of 'living machines'. These 'machines', as biological machines, are expected to display certain properties of life, such as adapting to changing environments and acting in a situated way. This paper proposes that a tension exists between the expectations placed on biological artefacts and the notion of producing such systems by means of engineering; this tension makes it seem implausible that biological systems, especially those with properties characteristic of living beings, can in fact be produced using the specific methods of engineering. We do not claim that engineering techniques have nothing to contribute to the biotechnological construction of biological artefacts. However, drawing on Descartes's and Kant's thinking on the relationship between the organism and the machine, we show that it is considerably more plausible to assume that distinctively biological artefacts emerge within a paradigm different from the paradigm of the Cartesian machine that underlies the engineering approach. We close by calling for increased attention to be paid to approaches within molecular biology and chemistry that rest on conceptions different from those of synthetic biology's engineering paradigm. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Consciousness and biological evolution.

    Science.gov (United States)

    Lindahl, B I

    1997-08-21

    It has been suggested that if the preservation and development of consciousness in the biological evolution is a result of natural selection, it is plausible that consciousness not only has been influenced by neural processes, but has had a survival value itself; and it could only have had this, if it had also been efficacious. This argument for mind-brain interaction is examined, both as the argument has been developed by William James and Karl Popper and as it has been discussed by C.D. Broad. The problem of identifying mental phenomena with certain neural phenomena is also addressed. The main conclusion of the analysis is that an explanation of the evolution of consciousness in Darwinian terms of natural selection does not rule out that consciousness may have evolved as a mere causally inert effect of the evolution of the nervous system, or that mental phenomena are identical with certain neural phenomena. However, the interactionistic theory still seems more plausible and more fruitful for other reasons brought up in the discussion.

  13. Probabilistic reasoning in intelligent systems networks of plausible inference

    CERN Document Server

    Pearl, Judea

    1988-01-01

    Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty. The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic. The author distinguishes syntactic and semantic approaches to uncertainty--and offers techniques, based on belief networks, that provide…
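    A minimal worked example of the belief-network style of plausible reasoning that the book formalizes: a two-node network (Cause -> Effect) whose posterior belief in the cause is obtained with Bayes' rule after the effect is observed. The numbers are arbitrary and serve only to show belief updating.

```python
def posterior_cause_given_effect(p_cause, p_effect_given_cause, p_effect_given_not_cause):
    """Posterior belief in a cause after observing its effect, for a minimal
    two-node belief network (Cause -> Effect), computed with Bayes' rule."""
    p_effect = (p_effect_given_cause * p_cause +
                p_effect_given_not_cause * (1.0 - p_cause))
    return p_effect_given_cause * p_cause / p_effect

# Example: weak prior belief, moderately diagnostic evidence
print(posterior_cause_given_effect(p_cause=0.1,
                                   p_effect_given_cause=0.9,
                                   p_effect_given_not_cause=0.2))  # ~0.333
```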

  14. A Note on Unified Statistics Including Fermi-Dirac, Bose-Einstein, and Tsallis Statistics, and Plausible Extension to Anisotropic Effect

    Directory of Open Access Journals (Sweden)

    Christianto V.

    2007-04-01

    In the light of some recent hypotheses suggesting a plausible unification of thermostatistics where Fermi-Dirac, Bose-Einstein and Tsallis statistics become its special subsets, we consider a further plausible extension to include non-integer Hausdorff dimension, which becomes a realization of the fractal entropy concept. In the subsequent section, we also discuss a plausible extension of this unified statistics to include an anisotropic effect by using a quaternion oscillator, which may be observed in the context of Cosmic Microwave Background Radiation. Further observation is of course recommended in order to refute or verify this proposition.
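    For orientation, the standard textbook forms that such a unification would treat as special cases can be written as follows. This is generic background, not the note's own derivation, and the anisotropic quaternion extension is not reproduced here.

```latex
% Standard occupation-number forms: Fermi-Dirac (a = +1), Bose-Einstein (a = -1),
% Maxwell-Boltzmann (a = 0) appear as special cases of one expression.
\[
  \bar{n}(E) \;=\; \frac{1}{e^{(E-\mu)/k_{B}T} + a},
  \qquad a \in \{+1,\,-1,\,0\}.
\]
% Tsallis statistics replaces the exponential by the q-exponential,
% recovering the Boltzmann factor in the limit q -> 1:
\[
  e_{q}(x) \;=\; \bigl[\,1 + (1-q)\,x\,\bigr]^{\tfrac{1}{1-q}},
  \qquad \lim_{q\to 1} e_{q}(x) = e^{x}.
\]
```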

  15. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    Science.gov (United States)

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Capturing complete medical knowledge is challenging-often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning , which generalizes the commonalities among the data to induce new rules, and analogical reasoning , which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15

  16. Theories and models on the biology of cells in space

    Science.gov (United States)

    Todd, P.; Klaus, D. M.

    1996-01-01

    A wide variety of observations on cells in space, admittedly made under constraining and unnatural conditions in many cases, have led to experimental results that were surprising or unexpected. Reproducibility, freedom from artifacts, and plausibility must be considered in all cases, even when results are not surprising. The papers in the symposium on 'Theories and Models on the Biology of Cells in Space' are dedicated to the subject of the plausibility of cellular responses to gravity -- inertial accelerations between 0 and 9.8 m/s² and higher. The mechanical phenomena inside the cell, the gravitactic locomotion of single eukaryotic and prokaryotic cells, and the effects of inertial unloading on cellular physiology are addressed in theoretical and experimental studies.

  17. Consequentialism and the Synthetic Biology Problem.

    Science.gov (United States)

    Heavey, Patrick

    2017-04-01

    This article analyzes the ethics of synthetic biology (synbio) from a consequentialist perspective, examining potential effects on food and agriculture, and on medicine, fuel, and the advancement of science. The issues of biosafety and biosecurity are also examined. A consequentialist analysis offers an essential road map to policymakers and regulators as to how to deal with synbio. Additionally, the article discusses the limitations of consequentialism as a tool for analysing synbioethics. Is it possible to predict, with any degree of plausibility, what the consequences of synthetic biology will be in 50 years, or in 100, or in 500? Synbio may take humanity to a place of radical departure from what is known or knowable.

  18. From information processing to decisions: Formalizing and comparing psychologically plausible choice models.

    Science.gov (United States)

    Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten

    2017-08-01

    Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can directly be compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
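    The take-the-best heuristic discussed above can be sketched in a few lines: cues are inspected in order of validity and the first discriminating cue decides. The probabilistic version studied in the paper additionally attaches a rank-ordered error probability to each cue's prediction; only the deterministic core is shown here, with hypothetical cues and options.

```python
def take_the_best(cues_a, cues_b, validity_order):
    """Deterministic take-the-best: inspect cues in descending order of validity
    and decide for the option favored by the first discriminating cue.
    Returns 'A', 'B', or 'guess' if no cue discriminates."""
    for cue in validity_order:
        if cues_a[cue] != cues_b[cue]:
            return 'A' if cues_a[cue] > cues_b[cue] else 'B'
    return 'guess'

# Hypothetical example: which of two cities is larger, given binary cues
validity_order = ['capital', 'airport', 'university']     # most to least valid
city_a = {'capital': 1, 'airport': 1, 'university': 1}
city_b = {'capital': 0, 'airport': 1, 'university': 1}
print(take_the_best(city_a, city_b, validity_order))      # -> 'A'
```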

  19. L’Analyse du Risque Géopolitique: du Plausible au Probable

    OpenAIRE

    Adib Bencherif

    2015-01-01

    This paper is going to explore the logical process behind risk analysis, particularly in geopolitics. The main goal is to demonstrate the ambiguities behind risk calculation and to highlight the continuum between plausibility and probability in risk analysis. To demonstrate it, the author introduces two notions: the inference of abduction, often neglected in the social sciences literature, and the Bayesian calculation. Inspired by the works of Louise Amoore, this paper tries to go further by ...

  20. The Radical Promise of Reformist Zeal: What Makes "Inquiry for Equity" Plausible?

    Science.gov (United States)

    Lashaw, Amanda

    2010-01-01

    Education reform movements often promise more than they deliver. Why are such promises plausible in light of seemingly perpetual education reform? Drawing on ethnographic fieldwork based in a nonprofit education reform organization, this article explores the appeal of popular notions about "using data to close the racial achievement…

  1. Toward mechanical systems biology in bone.

    Science.gov (United States)

    Trüssel, Andreas; Müller, Ralph; Webster, Duncan

    2012-11-01

    Cyclic mechanical loading is perhaps the most important physiological factor regulating bone mass and shape in a way which balances optimal strength with minimal weight. This bone adaptation process spans multiple length and time scales. Forces resulting from physiological exercise at the organ scale are sensed at the cellular scale by osteocytes, which reside inside the bone matrix. Via biochemical pathways, osteocytes orchestrate the local remodeling action of osteoblasts (bone formation) and osteoclasts (bone resorption). Together these local adaptive remodeling activities sum up to strengthen bone globally at the organ scale. To resolve the underlying mechanisms it is required to identify and quantify both cause and effect across the different scales. Progress has been made at the different scales experimentally. Computational models of bone adaptation have been developed to piece together various experimental observations at the different scales into coherent and plausible mechanisms. However additional quantitative experimental validation is still required to build upon the insights which have already been achieved. In this review we discuss emerging as well as state-of-the-art experimental and computational techniques and how they might be used in a mechanical systems biology approach to further our understanding of the mechanisms governing load induced bone adaptation, i.e., ways are outlined in which experimental and computational approaches could be coupled in a quantitative manner to create more reliable multiscale models of bone.
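    One common ingredient of such computational bone-adaptation models is a local, strain-driven remodeling rule. The sketch below shows a generic strain-energy set-point rule with a "lazy zone"; the constants are purely illustrative and this is not the specific model advocated in the review.

```python
def update_density(rho, strain_energy, k_ref=0.25, dead_zone=0.10, rate=1.0,
                   rho_min=0.01, rho_max=1.74):
    """One step of a generic strain-energy-driven remodeling rule: the
    mechanical stimulus is strain energy per unit density, and density only
    changes when the stimulus leaves a 'lazy zone' around a reference
    set-point (all constants are illustrative)."""
    stimulus = strain_energy / rho
    if stimulus > k_ref * (1.0 + dead_zone):        # overload -> formation
        rho += rate * (stimulus - k_ref * (1.0 + dead_zone))
    elif stimulus < k_ref * (1.0 - dead_zone):      # disuse -> resorption
        rho += rate * (stimulus - k_ref * (1.0 - dead_zone))
    return min(max(rho, rho_min), rho_max)

# Usage: repeated overload gradually densifies the tissue element
rho = 0.8
for _ in range(5):
    rho = update_density(rho, strain_energy=0.30)
    print(round(rho, 3))
```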

  2. Contrast normalization contributes to a biologically-plausible model of receptive-field development in primary visual cortex (V1)

    Science.gov (United States)

    Willmore, Ben D.B.; Bulstrode, Harry; Tolhurst, David J.

    2012-01-01

    Neuronal populations in the primary visual cortex (V1) of mammals exhibit contrast normalization. Neurons that respond strongly to simple visual stimuli – such as sinusoidal gratings – respond less well to the same stimuli when they are presented as part of a more complex stimulus which also excites other, neighboring neurons. This phenomenon is generally attributed to generalized patterns of inhibitory connections between nearby V1 neurons. The Bienenstock, Cooper and Munro (BCM) rule is a neural network learning rule that, when trained on natural images, produces model neurons which, individually, have many tuning properties in common with real V1 neurons. However, when viewed as a population, a BCM network is very different from V1 – each member of the BCM population tends to respond to the same dominant features of visual input, producing an incomplete, highly redundant code for visual information. Here, we demonstrate that, by adding contrast normalization into the BCM rule, we arrive at a neurally-plausible Hebbian learning rule that can learn an efficient sparse, overcomplete representation that is a better model for stimulus selectivity in V1. This suggests that one role of contrast normalization in V1 is to guide the neonatal development of receptive fields, so that neurons respond to different features of visual input. PMID:22230381
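    A schematic reading of the learning scheme described above, combining a BCM-style synaptic update with divisive (contrast) normalization across the model population, might look like the following. This is an assumption-laden sketch trained on random input, not the authors' implementation or their natural-image pipeline.

```python
import numpy as np

def bcm_step(W, x, theta, eta=0.01, tau_theta=0.05, sigma=1.0):
    """One learning step of a BCM-style rule with divisive contrast
    normalization across the model population (an illustrative reading of
    the scheme in the abstract, not the authors' exact implementation)."""
    drive = W @ x                                  # feedforward drive of each unit
    y = drive / (sigma + np.abs(drive).sum())      # divisive (contrast) normalization
    # BCM: Hebbian term gated by (y - theta), with a sliding modification threshold
    W += eta * np.outer(y * (y - theta), x)
    theta += tau_theta * (y ** 2 - theta)          # running estimate of E[y^2]
    return W, theta

# Usage on random "image patches" (placeholders for natural-image input)
rng = np.random.default_rng(0)
n_units, n_inputs = 8, 64
W = rng.normal(scale=0.1, size=(n_units, n_inputs))
theta = np.full(n_units, 0.1)
for _ in range(1000):
    patch = rng.normal(size=n_inputs)
    W, theta = bcm_step(W, patch, theta)
```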

  3. Cell-free protein synthesis in micro compartments: building a minimal cell from biobricks.

    Science.gov (United States)

    Jia, Haiyang; Heymann, Michael; Bernhard, Frank; Schwille, Petra; Kai, Lei

    2017-10-25

    The construction of a minimal cell that exhibits the essential characteristics of life is a great challenge in the field of synthetic biology. Assembling a minimal cell requires multidisciplinary expertise from physics, chemistry and biology. Scientists from different backgrounds tend to define the essence of 'life' differently and have thus proposed different artificial cell models possessing one or several essential features of living cells. Using the tools and methods of molecular biology, the bottom-up engineering of a minimal cell appears in reach. However, several challenges still remain. In particular, the integration of individual sub-systems that is required to achieve a self-reproducing cell model presents a complex optimization challenge. For example, multiple self-organisation and self-assembly processes have to be carefully tuned. We review advances and developments of new methods and techniques, for cell-free protein synthesis as well as micro-fabrication, for their potential to resolve challenges and to accelerate the development of minimal cells. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    Science.gov (United States)

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  5. Plausibility of stromal initiation of epithelial cancers without a mutation in the epithelium: a computer simulation of morphostats

    Directory of Open Access Journals (Sweden)

    Cappuccio Antonio

    2009-03-01

    Background: There is experimental evidence from animal models favoring the notion that the disruption of interactions between stroma and epithelium plays an important role in the initiation of carcinogenesis. These disrupted interactions are hypothesized to be mediated by molecules, termed morphostats, which diffuse through the tissue to determine cell phenotype and maintain tissue architecture. Methods: We developed a computer simulation based on simple properties of cell renewal and morphostats. Results: Under the computer simulation, the disruption of the morphostat gradient in the stroma generated epithelial precursors of cancer without any mutation in the epithelium. Conclusion: The model is consistent with the possibility that the accumulation of genetic and epigenetic changes found in tumors could arise after the formation of a founder population of aberrant cells, defined as cells that are created by low or insufficient morphostat levels and that no longer respond to morphostat concentrations. Because the model is biologically plausible, we hope that these results will stimulate further experiments.
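    The morphostat idea can be caricatured in one spatial dimension: a stromal source maintains a diffusible gradient, and epithelial cells whose local concentration falls below a response threshold are counted as aberrant. The toy simulation below (all parameters hypothetical, not taken from the paper) only illustrates how disrupting the stromal source can create aberrant cells without any epithelial mutation.

```python
import numpy as np

def simulate_morphostat(n=60, steps=5000, d=0.2, decay=0.001,
                        threshold=0.05, disrupted=False):
    """Toy 1-D morphostat model: the stroma at cell 0 secretes a diffusible
    factor that decays as it spreads; epithelial cells become 'aberrant'
    wherever the local concentration stays below a response threshold."""
    c = np.zeros(n)
    source = 0.0 if disrupted else 1.0
    for _ in range(steps):
        c[0] = source                              # stromal boundary condition
        lap = np.roll(c, 1) + np.roll(c, -1) - 2 * c
        lap[0] = lap[-1] = 0.0                     # no-flux ends
        c += d * lap - decay * c                   # explicit diffusion + decay
    aberrant = c < threshold
    return c, aberrant

for flag in (False, True):
    c, aberrant = simulate_morphostat(disrupted=flag)
    print(f"disrupted stroma={flag}: aberrant cells = {aberrant.sum()} of {len(c)}")
```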

  6. Cost minimization in a full-scale conventional wastewater treatment plant: associated costs of biological energy consumption versus sludge production.

    Science.gov (United States)

    Sid, S; Volant, A; Lesage, G; Heran, M

    2017-11-01

    Energy consumption and sludge production minimization represent rising challenges for wastewater treatment plants (WWTPs). The goal of this study is to investigate how energy is consumed throughout the whole plant and how operating conditions affect this energy demand. A WWTP based on the activated sludge process was selected as a case study. Simulations were performed using a pre-compiled model implemented in GPS-X simulation software. Model validation was carried out by comparing experimental and modeling data of the dynamic behavior of the mixed liquor suspended solids (MLSS) concentration and nitrogen compounds concentration, energy consumption for aeration, mixing and sludge treatment and annual sludge production over a three-year exercise. In this plant, the energy required for bioreactor aeration was calculated at approximately 44% of the total energy demand. A cost optimization strategy was applied by varying the MLSS concentrations (from 1 to 8 gTSS/L) while recording energy consumption, sludge production and effluent quality. An increase of MLSS led to an increase of the oxygen requirement for biomass aeration, but it also reduced total sludge production. Results permit identification of a key MLSS concentration representing the best compromise between levels of treatment required, biological energy demand and sludge production while minimizing the overall costs.
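    The optimization logic, an interior optimum of MLSS at which rising aeration energy is traded against falling sludge production, can be illustrated with a deliberately schematic cost function. The functional forms and prices below are hypothetical and do not come from the GPS-X model or the plant data.

```python
def total_cost(mlss, energy_price=0.12, sludge_price=60.0):
    """Schematic trade-off used only to illustrate the optimization idea:
    aeration energy (kWh/d) grows with MLSS while sludge production (t/d)
    falls with it. Functional forms and coefficients are hypothetical."""
    aeration_kwh = 4000.0 + 900.0 * mlss          # more biomass -> more O2 demand
    sludge_tons = 12.0 / (1.0 + 0.4 * mlss)       # higher MLSS -> lower net yield
    return energy_price * aeration_kwh + sludge_price * sludge_tons

costs = {mlss: round(total_cost(mlss), 1) for mlss in range(1, 9)}  # 1..8 gTSS/L
best = min(costs, key=costs.get)
print(costs)
print(f"lowest-cost MLSS (in this toy model): {best} gTSS/L")
```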

  7. An information geometric approach to least squares minimization

    Science.gov (United States)

    Transtrum, Mark; Machta, Benjamin; Sethna, James

    2009-03-01

    Parameter estimation by nonlinear least squares minimization is a ubiquitous problem that has an elegant geometric interpretation: all possible parameter values induce a manifold embedded within the space of data. The minimization problem is then to find the point on the manifold closest to the origin. The standard algorithm for minimizing sums of squares, the Levenberg-Marquardt algorithm, also has geometric meaning. When the standard algorithm fails to efficiently find accurate fits to the data, geometric considerations suggest improvements. Problems involving large numbers of parameters, such as often arise in biological contexts, are notoriously difficult. We suggest an algorithm based on geodesic motion that may offer improvements over the standard algorithm for a certain class of problems.
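    For reference, the standard Levenberg-Marquardt iteration that the abstract refers to can be written in a few lines: it damps the Gauss-Newton normal equations with lambda*I and adapts lambda according to whether a step reduces the sum of squares. This is a generic sketch on a toy exponential fit, not the geodesic-motion variant proposed by the authors.

```python
import numpy as np

def levenberg_marquardt(residuals, jacobian, p0, lam=1e-3, n_iter=50):
    """Minimal Levenberg-Marquardt loop for nonlinear least squares:
    interpolates between Gauss-Newton and gradient descent by damping the
    normal equations with lambda * I."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = residuals(p)
        J = jacobian(p)
        A = J.T @ J + lam * np.eye(p.size)
        step = np.linalg.solve(A, -J.T @ r)
        if np.sum(residuals(p + step) ** 2) < np.sum(r ** 2):
            p, lam = p + step, lam / 3.0          # accept step, trust model more
        else:
            lam *= 3.0                            # reject step, damp more heavily
    return p

# Usage: fit y = a * exp(b * t) to noisy data (parameters a, b)
t = np.linspace(0, 1, 30)
y = 2.0 * np.exp(-1.5 * t) + 0.01 * np.random.default_rng(0).standard_normal(t.size)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t), p[0] * t * np.exp(p[1] * t)])
print(levenberg_marquardt(res, jac, p0=[1.0, 0.0]))   # approximately [2.0, -1.5]
```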

  8. Is knowing believing? The role of event plausibility and background knowledge in planting false beliefs about the personal past.

    Science.gov (United States)

    Pezdek, Kathy; Blandon-Gitlin, Iris; Lam, Shirley; Hart, Rhiannon Ellis; Schooler, Jonathan W

    2006-12-01

    False memories are more likely to be planted for plausible than for implausible events, but does just knowing about an implausible event make individuals more likely to think that the event happened to them? Two experiments assessed the independent contributions of plausibility and background knowledge to planting false beliefs. In Experiment 1, subjects rated 20 childhood events as to the likelihood of each event having happened to them. The list included the implausible target event "received an enema," a critical target event of Pezdek, Finger, and Hodge (1997). Two weeks later, subjects were presented with (1) information regarding the high prevalence rate of enemas; (2) background information on how to administer an enema; (3) neither type of information; or (4) both. Immediately or 2 weeks later, they rated the 20 childhood events again. Only plausibility significantly increased occurrence ratings. In Experiment 2, the target event was changed from "barium enema administered in a hospital" to "home enema for constipation"; significant effects of both plausibility and background knowledge resulted. The results suggest that providing background knowledge can increase beliefs about personal events, but that its impact is limited by the extent of the individual's familiarity with the context of the suggested target event.

  9. Resolution of cosmological singularity and a plausible mechanism of the big bang

    International Nuclear Information System (INIS)

    Choudhury, D.C.

    2002-01-01

    The initial cosmological singularity in the framework of the general theory of relativity is resolved by introducing the effect of the uncertainty principle of quantum theory without violating conventional laws of physics. A plausible account of the mechanism of the big bang, analogous to that of a nuclear explosion, is given and the currently accepted Planck temperature of ≅10³² K at the beginning of the big bang is predicted.

  10. Metastasis in renal cell carcinoma: Biology and implications for therapy

    Directory of Open Access Journals (Sweden)

    Jun Gong

    2016-10-01

    Although multiple advances have been made in systemic therapy for renal cell carcinoma (RCC), metastatic RCC remains incurable. In the current review, we focus on the underlying biology of RCC and plausible mechanisms of metastasis. We further outline evolving strategies to combat metastasis through adjuvant therapy. Finally, we discuss clinical patterns of metastasis in RCC and how distinct systemic therapy approaches may be considered based on the anatomic location of metastasis.

  11. The prospect of minimally invasive therapy for oncology in 21st century

    International Nuclear Information System (INIS)

    Wu Peihong

    2005-01-01

    Minimally invasive therapy and biotherapy are two major trends in medicine of the 21st century. Minimally invasive therapy offers precise localization and treatment, less pain and faster recovery. By engaging the host's own defence mechanisms and biologicals, confining the tumor and reducing recurrence will improve the patient's quality of life. The following are the megatrends of minimally invasive therapy in the 21st century: 1. close alignment with new technology; 2. precise localization and treatment; 3. sequential combination of treatment modalities; 4. combination with immunotherapy; 5. radical cure of cancer by minimally invasive therapy. A new mode of minimally invasive therapy combined with biotherapy is expected to become an important ingredient of oncotherapy in the 21st century. (authors)

  12. Vulnerabilities to agricultural production shocks: An extreme, plausible scenario for assessment of risk for the insurance sector

    Directory of Open Access Journals (Sweden)

    Tobias Lunt

    2016-01-01

    Climate risks pose a threat to the function of the global food system and therefore also a hazard to the global financial sector, the stability of governments, and the food security and health of the world's population. This paper presents a method to assess plausible impacts of an agricultural production shock and potential materiality for global insurers. A hypothetical, near-term, plausible, extreme scenario was developed based upon modules of historical agricultural production shocks, linked under a warm-phase El Niño-Southern Oscillation (ENSO) meteorological framework. The scenario included teleconnected floods and droughts in disparate agricultural production regions around the world, as well as plausible, extreme biotic shocks. In this scenario, global crop yield declines of 10% for maize, 11% for soy, 7% for wheat and 7% for rice result in quadrupled commodity prices and commodity stock fluctuations, civil unrest, significant negative humanitarian consequences and major financial losses worldwide. This work illustrates a need for the scientific community to partner across sectors and industries towards better-integrated global data, modeling and analytical capacities, to better respond to and prepare for concurrent agricultural failure. Governments, humanitarian organizations and the private sector collectively may recognize significant benefits from more systematic assessment of exposure to agricultural climate risk.

  13. Minimal metabolic pathway structure is consistent with associated biomolecular interactions

    DEFF Research Database (Denmark)

    Bordbar, Aarash; Nagarajan, Harish; Lewis, Nathan E.

    2014-01-01

    Pathways are a universal paradigm for functionally describing cellular processes. Even though advances in high-throughput data generation have transformed biology, the core of our biological understanding, and hence data interpretation, is still predicated on human-defined pathways. Here, we…, effectively doubling the known regulatory roles for Nac and MntR. This study suggests an underlying and fundamental principle in the evolutionary selection of pathway structures; namely, that pathways may be minimal, independent, and segregated.

  14. Morality Principles for Risk Modelling: Needs and Links with the Origins of Plausible Inference

    Science.gov (United States)

    Solana-Ortega, Alberto; Solana, Vicente

    2009-12-01

    In comparison with the foundations of probability calculus, the inescapable and controversial issue of how to assign probabilities has only recently become a matter of formal study. The introduction of information as a technical concept was a milestone, but the most promising entropic assignment methods still face unsolved difficulties, manifesting the incompleteness of plausible inference theory. In this paper we examine the situation faced by risk analysts in the critical field of extreme events modelling, where the former difficulties are especially visible, due to scarcity of observational data, the large impact of these phenomena and the obligation to assume professional responsibilities. To respond to the claim for a sound framework to deal with extremes, we propose a metafoundational approach to inference, based on a canon of extramathematical requirements. We highlight their strong moral content, and show how this emphasis in morality, far from being new, is connected with the historic origins of plausible inference. Special attention is paid to the contributions of Caramuel, a contemporary of Pascal, unfortunately ignored in the usual mathematical accounts of probability.

  15. Plausible scenarios for the radiography profession in Sweden in 2025

    International Nuclear Information System (INIS)

    Björkman, B.; Fridell, K.; Tavakol Olofsson, P.

    2017-01-01

    Introduction: Radiography is a healthcare speciality with many technical challenges. Advances in engineering and information technology applications may continue to drive and be driven by radiographers. The world of diagnostic imaging is changing rapidly and radiographers must be proactive in order to survive. To ensure sustainable development, organisations have to identify future opportunities and threats in a timely manner and incorporate them into their strategic planning. Hence, the aim of this study was to analyse and describe plausible scenarios for the radiography profession in 2025. Method: The study has a qualitative design with an inductive approach based on focus group interviews. The interviews were inspired by the Scenario-Planning method. Results: Of the seven trends identified in a previous study, the radiographers considered two as the most uncertain scenarios that would have the greatest impact on the profession should they occur. These trends, labelled “Access to career advancement” and “A sufficient number of radiographers”, were inserted into the scenario cross. The resulting four plausible future scenarios were: The happy radiographer, the specialist radiographer, the dying profession and the assembly line. Conclusion: It is suggested that “The dying profession” scenario could probably be turned in the opposite direction by facilitating career development opportunities for radiographers within the profession. Changing the direction would probably lead to a profession composed of “happy radiographers” who are specialists, proud of their profession and competent to carry out advanced tasks, in contrast to being solely occupied by “the assembly line”. - Highlights: • The world of radiography is changing rapidly and radiographers must be proactive in order to survive. • Future opportunities and threats should be identified and incorporated into the strategic planning. • Appropriate actions can probably change the…

  16. Resolution of Cosmological Singularity and a Plausible Mechanism of the Big Bang

    OpenAIRE

    Choudhury, D. C.

    2001-01-01

    The initial cosmological singularity in the framework of the general theory of relativity is resolved by introducing the effect of the uncertainty principle of quantum theory without violating conventional laws of physics. A plausible account of the mechanism of the big bang, analogous to that of a nuclear explosion, is given and the currently accepted Planck temperature of about 10^(32)K at the beginning of the big bang is predicted. Subj-class: cosmology: theory-pre-big bang; mechanism of t...

  17. The metaphysical lessons of synthetic biology and neuroscience.

    Science.gov (United States)

    Baertschi, Bernard

    2015-01-01

    In this paper, I examine some important metaphysical lessons that are often presented as derived from two new scientific disciplines: synthetic biology and neuroscience. I analyse four of them: the nature of life, the existence of a soul (the mind-body problem), personhood, and free will. Many caveats are in order, and each 'advance' or each case should be assessed for itself. I conclude that a main lesson can nevertheless be learned: in conjunction with modern science, neuroscience and synthetic biology allow us to enrich old metaphysical debates, to deepen and even renew them. In particular, it becomes less and less plausible to consider life, mind, person, and agency as non-natural or non-physical entities. Copyright © 2015 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  18. High School Students' Evaluations, Plausibility (Re) Appraisals, and Knowledge about Topics in Earth Science

    Science.gov (United States)

    Lombardi, Doug; Bickel, Elliot S.; Bailey, Janelle M.; Burrell, Shondricka

    2018-01-01

    Evaluation is an important aspect of science and is receiving increasing attention in science education. The present study investigated (1) changes to plausibility judgments and knowledge as a result of a series of instructional scaffolds, called model-evidence link activities, that facilitated evaluation of scientific and alternative models in…

  19. Neural correlates of early-closure garden-path processing: Effects of prosody and plausibility.

    Science.gov (United States)

    den Ouden, Dirk-Bart; Dickey, Michael Walsh; Anderson, Catherine; Christianson, Kiel

    2016-01-01

    Functional magnetic resonance imaging (fMRI) was used to investigate neural correlates of early-closure garden-path sentence processing and use of extrasyntactic information to resolve temporary syntactic ambiguities. Sixteen participants performed an auditory picture verification task on sentences presented with natural versus flat intonation. Stimuli included sentences in which the garden-path interpretation was plausible, implausible because of a late pragmatic cue, or implausible because of a semantic mismatch between an optionally transitive verb and the following noun. Natural sentence intonation was correlated with left-hemisphere temporal activation, but also with activation that suggests the allocation of more resources to interpretation when natural prosody is provided. Garden-path processing was associated with upregulation in bilateral inferior parietal and right-hemisphere dorsolateral prefrontal and inferior frontal cortex, while differences between the strength and type of plausibility cues were also reflected in activation patterns. Region of interest (ROI) analyses in regions associated with complex syntactic processing are consistent with a role for posterior temporal cortex supporting access to verb argument structure. Furthermore, ROI analyses within left-hemisphere inferior frontal gyrus suggest a division of labour, with the anterior-ventral part primarily involved in syntactic-semantic mismatch detection, the central part supporting structural reanalysis, and the posterior-dorsal part showing a general structural complexity effect.

  20. Systematic reviews need to consider applicability to disadvantaged populations: inter-rater agreement for a health equity plausibility algorithm.

    Science.gov (United States)

    Welch, Vivian; Brand, Kevin; Kristjansson, Elizabeth; Smylie, Janet; Wells, George; Tugwell, Peter

    2012-12-19

    Systematic reviews have been challenged to consider effects on disadvantaged groups. A priori specification of subgroup analyses is recommended to increase the credibility of these analyses. This study aimed to develop and assess inter-rater agreement for an algorithm for systematic review authors to predict whether differences in effect measures are likely for disadvantaged populations relative to advantaged populations (only relative effect measures were addressed). A health equity plausibility algorithm was developed using clinimetric methods with three items based on literature review, key informant interviews and methodology studies. The three items dealt with the plausibility of differences in relative effects across sex or socioeconomic status (SES) due to: 1) patient characteristics; 2) intervention delivery (i.e., implementation); and 3) comparators. Thirty-five respondents (consisting of clinicians, methodologists and research users) assessed the likelihood of differences across sex and SES for ten systematic reviews with these questions. We assessed inter-rater reliability using Fleiss multi-rater kappa. The proportion agreement was 66% for patient characteristics (95% confidence interval: 61%-71%), 67% for intervention delivery (95% confidence interval: 62% to 72%) and 55% for the comparator (95% confidence interval: 50% to 60%). Inter-rater kappa, assessed with Fleiss kappa, ranged from 0 to 0.199, representing very low agreement beyond chance. Users of systematic reviews rated that important differences in relative effects across sex and socioeconomic status were plausible for a range of individual and population-level interventions. However, there was very low inter-rater agreement for these assessments. There is an unmet need for discussion of plausibility of differential effects in systematic reviews. Increased consideration of external validity and applicability to different populations and settings is warranted in systematic reviews to meet this…
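    The agreement statistic reported above, Fleiss' kappa for multiple raters, can be computed directly from a table of category counts per item. The example data below are hypothetical and are not the study's ratings.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for multiple raters. `counts` is an (n_items, n_categories)
    matrix whose entry [i, j] is the number of raters assigning item i to
    category j; every row must sum to the same number of raters."""
    counts = np.asarray(counts, dtype=float)
    n_items, _ = counts.shape
    n_raters = counts[0].sum()
    p_j = counts.sum(axis=0) / (n_items * n_raters)            # category proportions
    P_i = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar, P_e = P_i.mean(), (p_j ** 2).sum()
    return (P_bar - P_e) / (1.0 - P_e)

# Hypothetical example: 5 items rated into 3 categories by 6 raters each
ratings = [[4, 1, 1],
           [2, 2, 2],
           [0, 6, 0],
           [3, 3, 0],
           [1, 1, 4]]
print(round(fleiss_kappa(ratings), 3))
```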

  1. The minimally tuned minimal supersymmetric standard model

    International Nuclear Information System (INIS)

    Essig, Rouven; Fortin, Jean-Francois

    2008-01-01

    The regions in the Minimal Supersymmetric Standard Model with the minimal amount of fine-tuning of electroweak symmetry breaking are presented for general messenger scale. No a priori relations among the soft supersymmetry breaking parameters are assumed and fine-tuning is minimized with respect to all the important parameters which affect electroweak symmetry breaking. The superpartner spectra in the minimally tuned region of parameter space are quite distinctive with large stop mixing at the low scale and negative squark soft masses at the high scale. The minimal amount of tuning increases enormously for a Higgs mass beyond roughly 120 GeV

  2. Generative models versus underlying symmetries to explain biological pattern.

    Science.gov (United States)

    Frank, S A

    2014-06-01

    Mathematical models play an increasingly important role in the interpretation of biological experiments. Studies often present a model that generates the observations, connecting hypothesized process to an observed pattern. Such generative models confirm the plausibility of an explanation and make testable hypotheses for further experiments. However, studies rarely consider the broad family of alternative models that match the same observed pattern. The symmetries that define the broad class of matching models are in fact the only aspects of information truly revealed by observed pattern. Commonly observed patterns derive from simple underlying symmetries. This article illustrates the problem by showing the symmetry associated with the observed rate of increase in fitness in a constant environment. That underlying symmetry reveals how each particular generative model defines a single example within the broad class of matching models. Further progress on the relation between pattern and process requires deeper consideration of the underlying symmetries. © 2014 The Author. Journal of Evolutionary Biology © 2014 European Society For Evolutionary Biology.

  3. [Medical and biological consequences of nuclear disasters].

    Science.gov (United States)

    Stalpers, Lukas J A; van Dullemen, Simon; Franken, N A P Klaas

    2012-01-01

    Medical risks of radiation exaggerated; psychological risks underestimated. The discussion about atomic energy has become topical again following the nuclear accident in Fukushima. There is some argument about the gravity of medical and biological consequences of prolonged exposure to radiation. The risk of cancer following a low dose of radiation is usually estimated by linear extrapolation of the incidence of cancer among survivors of the atomic bombs dropped on Hiroshima and Nagasaki in 1945. The radiobiological linear-quadratic model (LQ-model) gives a more accurate description of observed data, is radiobiologically more plausible and is better supported by experimental and clinical data. On the basis of this model there is less risk of cancer being induced following radiation exposure. The gravest consequence of Chernobyl and Fukushima is not the medical and biological damage, but the psychological and economical impact on rescue workers and former inhabitants.

  4. Telemetry System of Biological Parameters

    Directory of Open Access Journals (Sweden)

    Jan Spisak

    2005-01-01

    The mobile telemetry system of biological parameters serves for reading and wireless transfer of measured values of selected biological parameters to a remote computer. It basically concerns long-term monitoring of the vital functions of a car pilot. The goal of this project is to propose a mobile telemetry system for reading, wireless transfer and processing of the biological parameters of a car pilot during physical and psychical stress. It has to be designed with respect to minimal consumption and weight and maximal device mobility. The system has to eliminate signal noise created by biological artifacts and disturbances during data transfer.

  5. The minimal non-minimal standard model

    International Nuclear Information System (INIS)

    Bij, J.J. van der

    2006-01-01

    In this Letter I discuss a class of extensions of the standard model that have a minimal number of possible parameters, but can in principle explain dark matter and inflation. It is pointed out that the so-called new minimal standard model contains a large number of parameters that can be put to zero, without affecting the renormalizability of the model. With the extra restrictions one might call it the minimal (new) non-minimal standard model (MNMSM). A few hidden discrete variables are present. It is argued that the inflaton should be higher-dimensional. Experimental consequences for the LHC and the ILC are discussed

  6. Anticholinergic drugs and negative outcomes in the older population: from biological plausibility to clinical evidence.

    Science.gov (United States)

    Collamati, Agnese; Martone, Anna Maria; Poscia, Andrea; Brandi, Vincenzo; Celi, Michela; Marzetti, Emanuele; Cherubini, Antonio; Landi, Francesco

    2016-02-01

    The use of medication with anticholinergic properties is widespread among older subjects. Many drugs of common use such as antispasmodics, bronchodilators, antiarrhythmics, antihistamines, anti-hypertensive drugs, antiparkinson agents, skeletal muscle relaxants, and psychotropic drugs have been demonstrated to have an anticholinergic activity. The most frequent adverse effects are dry mouth, nausea, vomiting, constipation, abdominal pain, urinary retention, blurred vision, tachycardia and neurologic impairment such as confusion, agitation and coma. Growing evidence from experimental studies and clinical observations suggests that drugs with anticholinergic properties can cause physical and mental impairment in the elderly population. However, the morbidity and management issues associated with unwanted anticholinergic activity are underestimated and frequently overlooked. Moreover, their possible relation with specific negative outcomes in the elderly population is still not firmly established. The aim of the present review was to evaluate the relationship between the use of drugs with anticholinergic activity and negative outcomes in older persons. We searched PubMed and Cochrane combining the search terms "anticholinergic", "delirium", "cognitive impairment", "falls", "mortality" and "discontinuation". Medicines with anticholinergic properties may increase the risks of functional and cognitive decline, morbidity, institutionalization and mortality in older people. However, such evidence is still not conclusive, probably owing to possible confounding factors. In particular, more studies are needed to investigate the effects of discontinuation of drugs with anticholinergic properties. Overall, minimizing anticholinergic burden should always be encouraged in clinical practice to improve short-term memory, confusion and delirium, quality of life and daily functioning.

  7. Minimalism

    CERN Document Server

    Obendorf, Hartmut

    2009-01-01

    The notion of Minimalism is proposed as a theoretical tool supporting a more differentiated understanding of reduction and thus forms a standpoint that allows definition of aspects of simplicity. This book traces the development of minimalism, defines the four types of minimalism in interaction design, and looks at how to apply it.

  8. Bio-physically plausible visualization of highly scattering fluorescent neocortical models for in silico experimentation

    KAUST Repository

    Abdellah, Marwan

    2017-02-15

    Background: We present a visualization pipeline capable of accurate rendering of highly scattering fluorescent neocortical neuronal models. The pipeline is mainly developed to serve the computational neurobiology community. It allows scientists to visualize the results of their virtual experiments that are performed in computer simulations, or in silico. The presented pipeline opens novel avenues for assisting neuroscientists in building biologically accurate models of the brain. These models result from computer simulations of physical experiments that use fluorescence imaging to understand the structural and functional aspects of the brain. Due to the limited capabilities of current visualization workflows to handle fluorescent volumetric datasets, we propose a physically-based optical model that can accurately simulate light interaction with fluorescent-tagged scattering media based on the basic principles of geometric optics and Monte Carlo path tracing. We also develop an automated and efficient framework for generating dense fluorescent tissue blocks from a neocortical column model that is composed of approximately 31000 neurons. Results: Our pipeline is used to visualize a virtual fluorescent tissue block of 50 μm³ that is reconstructed from the somatosensory cortex of a juvenile rat. The fluorescence optical model is qualitatively analyzed and validated against experimental emission spectra of different fluorescent dyes from the Alexa Fluor family. Conclusion: We discussed a scientific visualization pipeline for creating images of synthetic neocortical neuronal models that are tagged virtually with fluorescent labels on a physically-plausible basis. The pipeline is applied to analyze and validate simulation data generated from neuroscientific in silico experiments.
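    A conceptual, heavily simplified sketch of Monte Carlo photon transport in a scattering, fluorescent medium is given below: path lengths are sampled from the total attenuation coefficient, and each interaction is either a scattering event, a tissue absorption, or a dye absorption followed by Stokes-shifted re-emission with a given quantum yield. All coefficients and the 1-D slab geometry are assumptions for illustration; the authors' renderer uses full physically-based path tracing, which is not reproduced here.

```python
import numpy as np

def trace_photon(mu_s=10.0, mu_a=0.5, mu_af=1.0, quantum_yield=0.8,
                 slab_thickness=0.05, rng=None):
    """Conceptual Monte Carlo walk of one photon in a 1-D scattering slab
    containing a fluorescent dye: free path lengths are sampled from the total
    attenuation, and at each event the photon is scattered, absorbed by tissue,
    or absorbed by the dye and re-emitted (Stokes-shifted) with the given
    quantum yield. Coefficients (per cm) and geometry are illustrative only."""
    rng = rng or np.random.default_rng()
    mu_t = mu_s + mu_a + mu_af
    z, direction, fluorescent = 0.0, 1.0, False
    while 0.0 <= z <= slab_thickness:
        z += direction * rng.exponential(1.0 / mu_t)        # next interaction site
        u = rng.random() * mu_t
        if u < mu_s:                                        # elastic scattering
            direction = rng.choice([-1.0, 1.0])
        elif u < mu_s + mu_a:                               # absorbed by tissue
            return "absorbed", fluorescent
        else:                                               # absorbed by the dye
            if rng.random() > quantum_yield:
                return "absorbed", fluorescent
            fluorescent = True                              # re-emitted, red-shifted
            direction = rng.choice([-1.0, 1.0])
    return ("transmitted" if z > slab_thickness else "reflected"), fluorescent

rng = np.random.default_rng(0)
outcomes = [trace_photon(rng=rng) for _ in range(10000)]
escaped = [fluo for fate, fluo in outcomes if fate in ("transmitted", "reflected")]
print(f"escaping photons: {len(escaped)}, carrying fluorescence: {sum(escaped)}")
```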

  9. A Critical Perspective on Synthetic Biology

    OpenAIRE

    Michel Morange

    2009-01-01

    Synthetic biology emerged around 2000 as a new biological discipline. It shares with systems biology the same modular vision of organisms, but is more concerned with applications than with a better understanding of the functioning of organisms. A herald of this new discipline is Craig Venter who aims to create an artificial microorganism with the minimal genome compatible with life and to implement into it different 'functional modules' to generate new micro-organisms adapted to specific task...

  10. The ethical plausibility of the 'Right To Try' laws.

    Science.gov (United States)

    Carrieri, D; Peccatori, F A; Boniolo, G

    2018-02-01

    'Right To Try' (RTT) laws originated in the USA to allow terminally ill patients to request access to early stage experimental medical products directly from the producer, removing the oversight and approval of the Food and Drug Administration. These laws have received significant media attention and almost equally unanimous criticism by the bioethics, clinical and scientific communities. They touch indeed on complex issues such as the conflict between individual and public interest, and the public understanding of medical research and its regulation. The increased awareness around RTT laws means that healthcare providers directly involved in the management of patients with life-threatening conditions such as cancer, infective, or neurologic conditions will deal more frequently with patients' requests of access to experimental medical products. This paper aims to assess the ethical plausibility of the RTT laws, and to suggest some possible ethical tools and considerations to address the main issues they touch. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Plausible inference: A multi-valued logic for problem solving

    Science.gov (United States)

    Friedman, L.

    1979-01-01

    A new logic is developed which permits continuously variable strength of belief in the truth of assertions. Four inference rules result, with formal logic as a limiting case. Quantification of belief is defined. Propagation of belief to linked assertions results from dependency-based techniques of truth maintenance so that local consistency is achieved or contradiction discovered in problem solving. Rules for combining, confirming, or disconfirming beliefs are given, and several heuristics are suggested that apply to revising already formed beliefs in the light of new evidence. The strength of belief that results from such revisions based on conflicting evidence is a highly subjective phenomenon. Certain quantification rules appear to reflect an orderliness in the subjectivity. Several examples of reasoning by plausible inference are given, including a legal example and one from robot learning. Propagation of belief takes place in directions forbidden in formal logic and this results in conclusions becoming possible for a given set of assertions that are not reachable by formal logic.

  12. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb-Argument Information on Predictive Processing in Aphasia.

    Science.gov (United States)

    Hayes, Rebecca A; Dickey, Michael Walsh; Warren, Tessa

    2016-12-01

    This study examined the influence of verb-argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54-82 years) as well as 44 young adults (aged 18-31 years) and 18 older adults (aged 50-71 years) participated. Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure.

  13. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb–Argument Information on Predictive Processing in Aphasia

    Science.gov (United States)

    Dickey, Michael Walsh; Warren, Tessa

    2016-01-01

    Purpose This study examined the influence of verb–argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. Method This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54–82 years) as well as 44 young adults (aged 18–31 years) and 18 older adults (aged 50–71 years) participated. Results Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Conclusions Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure. PMID:27997951

  14. Liderazgo preventivo para la universidad. Una experiencia plausible

    Directory of Open Access Journals (Sweden)

    Alejandro Rodríguez Rodríguez

    2015-06-01

    Leadership development in higher education looks for solutions that can be applied immediately to the contexts in which every leader operates, but the theoretical and practical grounding of leader formation, which would make it possible to understand the intellective processes at work during decision making, is diluted. A paradigm of convergence between the Lonerganian anthropological method, the Vygotskian learning community and a re-reading of the Salesian preventive system is presented as a plausible proposal for formation in preventive leadership among the various actors of a university community. A case study of the Universidad Salesiana in Mexico, using a mixed-methods research design, supports a re-reading of leadership from a preventive perspective as a possibility of convergence in an interdisciplinary dialogue. The theoretical and practical results proposed and examined prove to be a useful tool for evaluating, enriching and renewing theory about the leader and about leadership development in universities facing a globalized society.

  15. Minimal changes in health status questionnaires: distinction between minimally detectable change and minimally important change

    Directory of Open Access Journals (Sweden)

    Knol Dirk L

    2006-08-01

    Changes in scores on health status questionnaires are difficult to interpret. Several methods to determine minimally important changes (MICs) have been proposed which can broadly be divided into distribution-based and anchor-based methods. Comparisons of these methods have led to insight into essential differences between these approaches. Some authors have tried to come to a uniform measure for the MIC, such as 0.5 standard deviation and the value of one standard error of measurement (SEM). Others have emphasized the diversity of MIC values, depending on the type of anchor, the definition of minimal importance on the anchor, and characteristics of the disease under study. A closer look makes clear that some distribution-based methods have been merely focused on minimally detectable changes. For assessing minimally important changes, anchor-based methods are preferred, as they include a definition of what is minimally important. Acknowledging the distinction between minimally detectable and minimally important changes is useful, not only to avoid confusion among MIC methods, but also to gain information on two important benchmarks on the scale of a health status measurement instrument. Appreciating the distinction, it becomes possible to judge whether the minimally detectable change of a measurement instrument is sufficiently small to detect minimally important changes.

  16. Minimal surfaces

    CERN Document Server

    Dierkes, Ulrich; Sauvigny, Friedrich; Jakob, Ruben; Kuster, Albrecht

    2010-01-01

    Minimal Surfaces is the first volume of a three volume treatise on minimal surfaces (Grundlehren Nr. 339-341). Each volume can be read and studied independently of the others. The central theme is boundary value problems for minimal surfaces. The treatise is a substantially revised and extended version of the monograph Minimal Surfaces I, II (Grundlehren Nr. 295 & 296). The first volume begins with an exposition of basic ideas of the theory of surfaces in three-dimensional Euclidean space, followed by an introduction of minimal surfaces as stationary points of area, or equivalently

  17. Preview Effects of Plausibility and Character Order in Reading Chinese Transposed Words: Evidence from Eye Movements

    Science.gov (United States)

    Yang, Jinmian

    2013-01-01

    The current paper examined the role of plausibility information in the parafovea for Chinese readers by using two-character transposed words (in which the order of the component characters is reversed but are still words). In two eye-tracking experiments, readers received a preview of a target word that was (1) identical to the target word, (2) a…

  18. Periodical cicadas: A minimal automaton model

    Science.gov (United States)

    de O. Cardozo, Giovano; de A. M. M. Silvestre, Daniel; Colato, Alexandre

    2007-08-01

    The Magicicada spp. life cycles, with their prime periods and highly synchronized emergence, have defied reasonable scientific explanation since their discovery. During the last decade several models and explanations for this phenomenon appeared in the literature, along with a great deal of discussion. Despite this considerable effort, there is no final conclusion about this long-standing biological problem. Here, we construct a minimal automaton model without predation/parasitism which reproduces some of these aspects. Our results point towards competition between different strains with a limited dispersal threshold as the main factor leading to the emergence of prime-numbered life cycles.
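
    The automaton itself is the paper's contribution; as a back-of-the-envelope companion (my own toy calculation, not the authors' model), the Python snippet below counts how often a strain would emerge in the same year as competing strains of other periods, since two periods p and q co-emerge every lcm(p, q) years. Prime periods such as 13 and 17 overlap far less with competitors than their composite neighbours do.

        from math import gcd

        def co_emergences(p, q, horizon=1000):
            """Number of years in [1, horizon] in which strains of period p and q emerge together."""
            lcm = p * q // gcd(p, q)
            return horizon // lcm

        competitors = range(2, 20)
        for period in (12, 13, 14, 15, 16, 17, 18):
            total = sum(co_emergences(period, q) for q in competitors if q != period)
            print(period, total)     # prime periods (13, 17) co-emerge least with competitors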

  19. Biological treatment of Crohn's disease

    DEFF Research Database (Denmark)

    Nielsen, Ole Haagen; Bjerrum, Jacob Tveiten; Seidelin, Jakob Benedict

    2012-01-01

    Introduction of biological agents for the treatment of Crohn's disease (CD) has led to a transformation of the treatment paradigm. Several biological compounds have been approved for patients with CD refractory to conventional treatment: infliximab, adalimumab and certolizumab pegol (and...... natalizumab in several countries outside the European Union). However, despite the use of biologics for more than a decade, questions still remain about the true efficacy and the best treatment regimens - especially about when to discontinue treatment. Furthermore, a need for optimizing treatment...... with biologics still exists, as 20-40% of patients with CD (depending on selection criteria) do not have any relevant response to the current biological agents (i.e. primary failures). A better patient selection might maximize the clinical outcome while minimizing the complications associated with this type...

  20. Synthetic Biology and the Moral Significance of Artificial Life: A Reply to Douglas, Powell and Savulescu.

    Science.gov (United States)

    Christiansen, Andreas

    2016-06-01

    I discuss the moral significance of artificial life within synthetic biology via a discussion of Douglas, Powell and Savulescu's paper 'Is the creation of artificial life morally significant'. I argue that the definitions of 'artificial life' and of 'moral significance' are too narrow. Douglas, Powell and Savulescu's definition of artificial life does not capture all core projects of synthetic biology or the ethical concerns that have been voiced, and their definition of moral significance fails to take into account the possibility that creating artificial life is conditionally acceptable. Finally, I show how several important objections to synthetic biology are plausibly understood as arguing that creating artificial life in a wide sense is only conditionally acceptable. © 2016 John Wiley & Sons Ltd.

  1. Systems Biology as an Integrated Platform for Bioinformatics, Systems Synthetic Biology, and Systems Metabolic Engineering

    Science.gov (United States)

    Chen, Bor-Sen; Wu, Chia-Chou

    2013-01-01

    Systems biology aims at achieving a system-level understanding of living organisms and applying this knowledge to various fields such as synthetic biology, metabolic engineering, and medicine. System-level understanding of living organisms can be derived from insight into: (i) system structure and the mechanism of biological networks such as gene regulation, protein interactions, signaling, and metabolic pathways; (ii) system dynamics of biological networks, which provides an understanding of stability, robustness, and transduction ability through system identification, and through system analysis methods; (iii) system control methods at different levels of biological networks, which provide an understanding of systematic mechanisms to robustly control system states, minimize malfunctions, and provide potential therapeutic targets in disease treatment; (iv) systematic design methods for the modification and construction of biological networks with desired behaviors, which provide system design principles and system simulations for synthetic biology designs and systems metabolic engineering. This review describes current developments in systems biology, systems synthetic biology, and systems metabolic engineering for engineering and biology researchers. We also discuss challenges and future prospects for systems biology and the concept of systems biology as an integrated platform for bioinformatics, systems synthetic biology, and systems metabolic engineering. PMID:24709875

  2. Systems Biology as an Integrated Platform for Bioinformatics, Systems Synthetic Biology, and Systems Metabolic Engineering

    Directory of Open Access Journals (Sweden)

    Bor-Sen Chen

    2013-10-01

    Systems biology aims at achieving a system-level understanding of living organisms and applying this knowledge to various fields such as synthetic biology, metabolic engineering, and medicine. System-level understanding of living organisms can be derived from insight into: (i) system structure and the mechanism of biological networks such as gene regulation, protein interactions, signaling, and metabolic pathways; (ii) system dynamics of biological networks, which provides an understanding of stability, robustness, and transduction ability through system identification, and through system analysis methods; (iii) system control methods at different levels of biological networks, which provide an understanding of systematic mechanisms to robustly control system states, minimize malfunctions, and provide potential therapeutic targets in disease treatment; (iv) systematic design methods for the modification and construction of biological networks with desired behaviors, which provide system design principles and system simulations for synthetic biology designs and systems metabolic engineering. This review describes current developments in systems biology, systems synthetic biology, and systems metabolic engineering for engineering and biology researchers. We also discuss challenges and future prospects for systems biology and the concept of systems biology as an integrated platform for bioinformatics, systems synthetic biology, and systems metabolic engineering.

  3. Minimizing Environmental Magnetic Field Sources for nEDM

    Science.gov (United States)

    Brinson, Alex; Filippone, Bradley; Slutsky, Simon; Osthelder, Charles

    2017-09-01

    Measurement of the neutron's Electric Dipole Moment (nEDM) could potentially explain the Baryon Asymmetry Problem, and would suggest plausible extensions to the Standard Model. We will attempt to detect the nEDM by measuring the electric-field-dependent neutron precession frequency, which is highly sensitive to magnetic field gradients. In order to produce fields with sufficiently low gradients for our experiment, we eliminate environmental effects by offsetting the ambient field with a Field Compensation System (FCS), then magnetically shielding the reduced field with a Mu-Metal cylinder. We discovered that the strongest environmental effect in our lab came from iron rebar embedded in the floor beneath the proposed experiment location. The large extent and strength of the floor's magnetization made the effect too large to offset with the FCS, forcing us to relocate our apparatus. The floor's magnetic field was mapped with a Hall probe in order to determine the most viable experiment locations. A 3-axis Fluxgate magnetometer was then used to determine the floor field's drop-off and shape at these locations, and a final apparatus position was determined which minimized the floor's effect such that it could be effectively offset and shielded by our experiment. Caltech SFP Office.

  4. The role of biological fertility in predicting family size

    DEFF Research Database (Denmark)

    Joffe, M; Key, J; Best, N

    2009-01-01

    BACKGROUND: It is plausible that a couple's ability to achieve the desired number of children is limited by biological fertility, especially if childbearing is postponed. Family size has declined and semen quality may have deteriorated in much of Europe, although studies have found an increase.... Potential confounders were maternal age when unprotected sex began prior to the first birth, and maternal smoking. Desired family size was available in only one of the datasets. RESULTS: Couples with a TTP of at least 12 months tended to have smaller families, with odds ratios for the risk of not having... for the first child. CONCLUSIONS: Within the limits of the available data quality, family size appears to be predicted by biological fertility, even after adjustment for maternal age, if the woman was at least 20 years old when the couple's first attempt at conception started. The contribution of behavioural...

  5. Uncertain socioeconomic projections used in travel demand and emissions models: could plausible errors result in air quality nonconformity?

    International Nuclear Information System (INIS)

    Rodier, C.J.; Johnston, R.A.

    2002-01-01

    A sensitivity analysis of plausible errors in population, employment, fuel price, and income projections is conducted using the travel demand and emissions models of the Sacramento, CA, USA, region for their transportation plan. The results of the analyses indicate that plausible error ranges for household income and fuel prices are not a significant source of uncertainty with respect to the region's travel demand and emissions projections. However, plausible errors in population and employment projections (within approximately one standard deviation) may result in the region's transportation plan not meeting the conformity test for oxides of nitrogen (NOx) in the year 2005 (i.e., an approximately 16% probability). This outcome is also possible in the year 2015 but less likely (within approximately two standard deviations, or a 2.5% probability). Errors in socioeconomic projections are only one of many sources of error in travel demand and emissions models. These results have several policy implications. First, regions like Sacramento that meet their conformity tests by a very small margin should rethink new highway investment and consider contingency transportation plans that incorporate more aggressive emissions reduction policies. Second, regional transportation planning agencies should conduct sensitivity analyses as part of their conformity analysis to make explicit significant uncertainties in the methods and to identify the probability of their transportation plan not conforming. Third, the US Environmental Protection Agency (EPA) should clarify the interpretation of "demonstrate" conformity of transportation plans; that is, specify the level of certainty that it considers a sufficient demonstration of conformity. (author)

  6. [Theory and practice of minimally invasive endodontics].

    Science.gov (United States)

    Jiang, H W

    2016-08-01

    The primary goal of modern endodontic therapy is to achieve the long-term retention of a functional tooth by preventing or treating pulpitis or apical periodontitis. The long-term retention of an endodontically treated tooth is correlated with the remaining amount of tooth tissue and with the quality of the restoration after root canal filling. In recent years, there has been rapid progress in basic research on endodontic biology, instruments and applied materials, making treatment procedures safer, more accurate, and more efficient. Thus, minimally invasive endodontics (MIE) has received increasing attention at present. MIE aims to preserve the maximum of tooth structure during root canal therapy, and the concept covers the whole process of diagnosis and treatment of teeth. This review article focuses on describing the minimally invasive concepts and operating essentials in endodontics, from diagnosis and treatment planning to access opening, pulp cavity finishing, root canal cleaning and shaping, 3-dimensional root canal filling and restoration after root canal treatment.

  7. A Biologically Plausible Action Selection System for Cognitive Architectures: Implications of Basal Ganglia Anatomy for Learning and Decision-Making Models

    Science.gov (United States)

    Stocco, Andrea

    2018-01-01

    Several attempts have been made previously to provide a biological grounding for cognitive architectures by relating their components to the computations of specific brain circuits. Often, the architecture's action selection system is identified with the basal ganglia. However, this identification overlooks one of the most important features of…

  8. Approaches to chemical synthetic biology.

    Science.gov (United States)

    Chiarabelli, Cristiano; Stano, Pasquale; Anella, Fabrizio; Carrara, Paolo; Luisi, Pier Luigi

    2012-07-16

    Synthetic biology is first represented in terms of two complementary aspects: the bio-engineering one, based on the genetic manipulation of extant microbial forms in order to obtain forms of life which do not exist in nature; and chemical synthetic biology, an approach mostly based on chemical manipulation for the laboratory synthesis of biological structures that do not exist in nature. The paper is mostly devoted to a short review of chemical synthetic biology projects currently carried out in our laboratory. In particular, we describe the minimal cell project, then the "Never Born Proteins" and lastly the Never Born RNAs. We describe and critically analyze the main results, emphasizing the possible relevance of chemical synthetic biology for progress in basic science and biotechnology. Copyright © 2012 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  9. Transplant Biology at a Crossroads

    OpenAIRE

    Sedwick, Caitlin

    2008-01-01

    Despite major advances in transplantation biology, allowing transplants not just of critical organs like heart and kidney but also of limbs and faces, researchers are still struggling to minimize the risks from achieving the level of immunosuppression needed to make the body accept foreign tissues.

  10. Analytic models of plausible gravitational lens potentials

    International Nuclear Information System (INIS)

    Baltz, Edward A.; Marshall, Phil; Oguri, Masamune

    2009-01-01

    Gravitational lenses on galaxy scales are plausibly modelled as having ellipsoidal symmetry and a universal dark matter density profile, with a Sérsic profile to describe the distribution of baryonic matter. Predicting all lensing effects requires knowledge of the total lens potential: in this work we give analytic forms for that of the above hybrid model. Emphasising that complex lens potentials can be constructed from simpler components in linear combination, we provide a recipe for attaining elliptical symmetry in either projected mass or lens potential. We also provide analytic formulae for the lens potentials of Sérsic profiles for integer and half-integer index. We then present formulae describing the gravitational lensing effects due to smoothly-truncated universal density profiles in cold dark matter model. For our isolated haloes the density profile falls off as radius to the minus fifth or seventh power beyond the tidal radius, functional forms that allow all orders of lens potential derivatives to be calculated analytically, while ensuring a non-divergent total mass. We show how the observables predicted by this profile differ from that of the original infinite-mass NFW profile. Expressions for the gravitational flexion are highlighted. We show how decreasing the tidal radius allows stripped haloes to be modelled, providing a framework for a fuller investigation of dark matter substructure in galaxies and clusters. Finally we remark on the need for finite mass halo profiles when doing cosmological ray-tracing simulations, and the need for readily-calculable higher order derivatives of the lens potential when studying catastrophes in strong lenses
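
    The quoted asymptotic slopes are easy to sanity-check numerically. The Python snippet below is my own sketch (the truncation kernel (r_t^2/(r^2 + r_t^2))^n is an assumed form, not copied from the paper): it multiplies an NFW density by that smooth truncation factor and verifies that the log-slope far beyond the tidal radius approaches -5 for n = 1 and -7 for n = 2, as stated in the abstract.

        import numpy as np

        def rho_truncated_nfw(r, rs=1.0, rt=10.0, n=2):
            """NFW density times a smooth truncation factor (an assumed form, for illustration)."""
            nfw = 1.0 / ((r / rs) * (1.0 + r / rs) ** 2)
            return nfw * (rt ** 2 / (r ** 2 + rt ** 2)) ** n

        r = np.logspace(2, 4, 5)                           # radii far beyond the tidal radius rt
        for n, expected in ((1, -5), (2, -7)):
            slope = np.gradient(np.log(rho_truncated_nfw(r, n=n)), np.log(r))
            print(n, expected, round(float(slope[2]), 2))  # measured log-slope matches -5 / -7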

  11. Biological formal counterparts of logical machines

    Energy Technology Data Exchange (ETDEWEB)

    Moreno-diaz, R; Hernandez Guarch, F

    1983-01-01

    The significance of the McCulloch-Pitts formal neural net theory (1943) is still frequently misunderstood today, and its basic units are wrongly taken as factual models of neurons. As a consequence, the whole original theory and its later addenda are unreasonably criticized for their simplicity. But, as was proved then and since, the theory is rather the modular neurophysiological counterpart of logical machines, so that it actually provides biologically plausible models for automata, Turing machines, etc., and not vice versa. In its true context, no theory has surpassed its proposals. In memory of McCulloch and Pitts, and for the sake of future theoretical research, the authors stress this important historical point, including also some recent results on the neurophysiological counterparts of modular arbitrary probabilistic automata. 16 references.
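
    The point that McCulloch-Pitts units are formal counterparts of logic, rather than factual neuron models, is easy to make concrete. The Python sketch below (an illustration of the standard textbook construction, not material from the paper) implements a threshold unit and composes AND, OR and NOT gates from it, from which any finite logical machine can in principle be built.

        def mcp_unit(inputs, weights, threshold):
            """McCulloch-Pitts unit: fires (returns 1) iff the weighted input sum reaches threshold."""
            return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

        AND = lambda a, b: mcp_unit((a, b), (1, 1), 2)
        OR  = lambda a, b: mcp_unit((a, b), (1, 1), 1)
        NOT = lambda a:    mcp_unit((a,),   (-1,),  0)

        # any Boolean function, and hence any finite automaton, can be composed from such units
        XOR = lambda a, b: AND(OR(a, b), NOT(AND(a, b)))
        print([XOR(a, b) for a in (0, 1) for b in (0, 1)])   # [0, 1, 1, 0]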

  12. Standard reporting requirements for biological samples in metabolomics experiments: Microbial and in vitro biology experiments

    NARCIS (Netherlands)

    Werf, M.J. van der; Takors, R.; Smedsgaard, J.; Nielsen, J.; Ferenci, T.; Portais, J.C.; Wittmann, C.; Hooks, M.; Tomassini, A.; Oldiges, M.; Fostel, J.; Sauer, U.

    2007-01-01

    With the increasing use of metabolomics as a means to study a large number of different biological research questions, there is a need for a minimal set of reporting standards that allow the scientific community to evaluate, understand, repeat, compare and re-investigate metabolomics studies. Here

  13. Learning and coding in biological neural networks

    Science.gov (United States)

    Fiete, Ila Rani

    How can large groups of neurons that locally modify their activities learn to collectively perform a desired task? Do studies of learning in small networks tell us anything about learning in the fantastically large collection of neurons that make up a vertebrate brain? What factors do neurons optimize by encoding sensory inputs or motor commands in the way they do? In this thesis I present a collection of four theoretical works: each of the projects was motivated by specific constraints and complexities of biological neural networks, as revealed by experimental studies; together, they aim to partially address some of the central questions of neuroscience posed above. We first study the role of sparse neural activity, as seen in the coding of sequential commands in a premotor area responsible for birdsong. We show that the sparse coding of temporal sequences in the songbird brain can, in a network where the feedforward plastic weights must translate the sparse sequential code into a time-varying muscle code, facilitate learning by minimizing synaptic interference. Next, we propose a biologically plausible synaptic plasticity rule that can perform goal-directed learning in recurrent networks of voltage-based spiking neurons that interact through conductances. Learning is based on the correlation of noisy local activity with a global reward signal; we prove that this rule performs stochastic gradient ascent on the reward. Thus, if the reward signal quantifies network performance on some desired task, the plasticity rule provably drives goal-directed learning in the network. To assess the convergence properties of the learning rule, we compare it with a known example of learning in the brain. Song-learning in finches is a clear example of a learned behavior, with detailed available neurophysiological data. With our learning rule, we train an anatomically accurate model birdsong network that drives a sound source to mimic an actual zebrafinch song. Simulation and
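
    The thesis's plasticity rule is formulated for conductance-based spiking neurons; the Python sketch below is only a rate-based caricature of the same idea (my simplification, not the rule from the thesis): perturb the weights with noise, correlate the resulting change in a global scalar reward with that noise, and move the weights accordingly, which on average follows the gradient of the reward.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(size=(200, 5))                      # fixed input patterns
        w_target = rng.normal(size=5)
        y_target = x @ w_target                            # desired outputs

        w = np.zeros(5)                                    # learned weights
        eta, sigma = 0.05, 0.1
        for step in range(2000):
            xi = rng.normal(scale=sigma, size=5)           # exploratory noise on the weights
            r0 = -np.mean((x @ w - y_target) ** 2)         # global reward without the noise
            r1 = -np.mean((x @ (w + xi) - y_target) ** 2)  # global reward with the noise
            w += eta * (r1 - r0) * xi / sigma ** 2         # correlate noise with reward change

        print(np.round(w_target, 2))
        print(np.round(w, 2))                              # learned weights end up close to the targets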

  14. emMAW: computing minimal absent words in external memory.

    Science.gov (United States)

    Héliou, Alice; Pissis, Solon P; Puglisi, Simon J

    2017-09-01

    The biological significance of minimal absent words has been investigated in genomes of organisms from all domains of life. For instance, three minimal absent words of the human genome were found in Ebola virus genomes. There exists an O(n)-time and O(n)-space algorithm for computing all minimal absent words of a sequence of length n on a fixed-sized alphabet based on suffix arrays. A standard implementation of this algorithm, when applied to a large sequence of length n, requires more than 20n bytes of RAM. Such memory requirements are a significant hurdle to the computation of minimal absent words in large datasets. We present emMAW, the first external-memory algorithm for computing minimal absent words. A free open-source implementation of our algorithm is made available. This allows for computation of minimal absent words on far bigger data sets than was previously possible. Our implementation requires less than 3 h on a standard workstation to process the full human genome when as little as 1 GB of RAM is made available. We stress that our implementation, despite making use of external memory, is fast; indeed, even on relatively smaller datasets when enough RAM is available to hold all necessary data structures, it is less than two times slower than state-of-the-art internal-memory implementations. https://github.com/solonas13/maw (free software under the terms of the GNU GPL). alice.heliou@lix.polytechnique.fr or solon.pissis@kcl.ac.uk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
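
    For readers unfamiliar with the object being computed: a minimal absent word of a sequence is a word that does not occur in the sequence although both its longest proper prefix and its longest proper suffix do. The Python sketch below is a naive in-memory reference of my own (emMAW itself uses suffix arrays and external memory to reach the stated complexity), useful only for small inputs.

        def minimal_absent_words(s, alphabet=None, max_len=None):
            """Naive reference: a word is a MAW iff it is absent while its two longest proper factors occur."""
            alphabet = sorted(set(s)) if alphabet is None else sorted(alphabet)
            max_len = max_len or len(s)
            present = {s[i:j] for i in range(len(s)) for j in range(i + 1, min(len(s), i + max_len) + 1)}
            maws = set()
            for factor in present:                       # extend every occurring factor by one letter
                for a in alphabet:
                    w = factor + a
                    if w not in present and w[1:] in present:
                        maws.add(w)
            # letters of the alphabet that never occur are minimal absent words of length one
            maws.update(a for a in alphabet if a not in present)
            return sorted(maws)

        print(minimal_absent_words("abaab"))   # ['aaa', 'aaba', 'bab', 'bb']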

  15. Free-energy minimization and the dark-room problem.

    Science.gov (United States)

    Friston, Karl; Thornton, Christopher; Clark, Andy

    2012-01-01

    Recent years have seen the emergence of an important new fundamental theory of brain function. This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). The most comprehensive such treatment is the "free-energy minimization" formulation due to Karl Friston (see e.g., Friston and Stephan, 2007; Friston, 2010a,b - see also Fiorillo, 2010; Thornton, 2010). A recurrent puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. We do not simply seek a dark, unchanging chamber, and stay there. This is the "Dark-Room Problem." Here, we describe the problem and further unpack the issues to which it speaks. Using the same format as the prolog of Eddington's Space, Time, and Gravitation (Eddington, 1920) we present our discussion as a conversation between: an information theorist (Thornton), a physicist (Friston), and a philosopher (Clark).

  16. Finding minimal action sequences with a simple evaluation of actions

    Science.gov (United States)

    Shah, Ashvin; Gurney, Kevin N.

    2014-01-01

    Animals are able to discover the minimal number of actions that achieves an outcome (the minimal action sequence). In most accounts of this, actions are associated with a measure of behavior that is higher for actions that lead to the outcome with a shorter action sequence, and learning mechanisms find the actions associated with the highest measure. In this sense, previous accounts focus on more than the simple binary signal of “was the outcome achieved?”; they focus on “how well was the outcome achieved?” However, such mechanisms may not govern all types of behavioral development. In particular, in the process of action discovery (Redgrave and Gurney, 2006), actions are reinforced if they simply lead to a salient outcome because biological reinforcement signals occur too quickly to evaluate the consequences of an action beyond an indication of the outcome's occurrence. Thus, action discovery mechanisms focus on the simple evaluation of “was the outcome achieved?” and not “how well was the outcome achieved?” Notwithstanding this impoverishment of information, can the process of action discovery find the minimal action sequence? We address this question by implementing computational mechanisms, referred to in this paper as no-cost learning rules, in which each action that leads to the outcome is associated with the same measure of behavior. No-cost rules focus on “was the outcome achieved?” and are consistent with action discovery. No-cost rules discover the minimal action sequence in simulated tasks and execute it for a substantial amount of time. Extensive training, however, results in extraneous actions, suggesting that a separate process (which has been proposed in action discovery) must attenuate learning if no-cost rules participate in behavioral development. We describe how no-cost rules develop behavior, what happens when attenuation is disrupted, and relate the new mechanisms to wider computational and biological context. PMID:25506326

  17. Enumeration of minimal stoichiometric precursor sets in metabolic networks.

    Science.gov (United States)

    Andrade, Ricardo; Wannagat, Martin; Klein, Cecilia C; Acuña, Vicente; Marchetti-Spaccamela, Alberto; Milreu, Paulo V; Stougie, Leen; Sagot, Marie-France

    2016-01-01

    What an organism needs at least from its environment to produce a set of metabolites, e.g. target(s) of interest and/or biomass, has been called a minimal precursor set. Early approaches to enumerate all minimal precursor sets took into account only the topology of the metabolic network (topological precursor sets). Due to cycles and the stoichiometric values of the reactions, it is often not possible to produce the target(s) from a topological precursor set in the sense that there is no feasible flux. Although considering the stoichiometry makes the problem harder, it enables to obtain biologically reasonable precursor sets that we call stoichiometric. Recently a method to enumerate all minimal stoichiometric precursor sets was proposed in the literature. The relationship between topological and stoichiometric precursor sets had however not yet been studied. Such relationship between topological and stoichiometric precursor sets is highlighted. We also present two algorithms that enumerate all minimal stoichiometric precursor sets. The first one is of theoretical interest only and is based on the above mentioned relationship. The second approach solves a series of mixed integer linear programming problems. We compared the computed minimal precursor sets to experimentally obtained growth media of several Escherichia coli strains using genome-scale metabolic networks. The results show that the second approach efficiently enumerates minimal precursor sets taking stoichiometry into account, and allows for broad in silico studies of strains or species interactions that may help to understand e.g. pathotype and niche-specific metabolic capabilities. sasita is written in Java, uses cplex as LP solver and can be downloaded together with all networks and input files used in this paper at http://www.sasita.gforge.inria.fr.
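
    The core check behind a stoichiometric (as opposed to topological) precursor set is whether some nonnegative steady-state flux can actually deliver the target from the candidate nutrients. The toy network and SciPy-based feasibility test below are my own illustration (the paper's tool, sasita, is written in Java and uses cplex): a cycle regenerates only half of the cofactor X it consumes, so no steady-state flux can deliver the target T from A alone, whereas adding X to the precursor set restores feasibility.

        import numpy as np
        from scipy.optimize import linprog

        def stoichiometrically_sufficient(S, target_col, demand=1.0):
            """Is there a nonnegative steady-state flux (S v = 0, v >= 0) exporting `demand` target units?"""
            n = S.shape[1]
            bounds = [(0, None)] * n
            bounds[target_col] = (demand, demand)       # force the target-export reaction to run
            res = linprog(c=np.zeros(n), A_eq=S, b_eq=np.zeros(S.shape[0]),
                          bounds=bounds, method="highs")
            return res.status == 0                      # 0 means a feasible flux was found

        # metabolites A, X, Y, T; reactions: uptake_A, uptake_X, R1: A+X->Y, R2: 2Y->2T+X, export_T
        S = np.array([[ 1, 0, -1,  0,  0],   # A
                      [ 0, 1, -1,  1,  0],   # X
                      [ 0, 0,  1, -2,  0],   # Y
                      [ 0, 0,  0,  2, -1]])  # T

        with_X    = stoichiometrically_sufficient(S, target_col=4)
        without_X = stoichiometrically_sufficient(S[:, [0, 2, 3, 4]], target_col=3)   # drop uptake_X
        print(with_X, without_X)             # True False: {A, X} is sufficient, {A} alone is not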

  18. More on Grandmother Cells and the Biological Implausibility of PDP Models of Cognition: A Reply to Plaut and McClelland (2010) and Quian Quiroga and Kreiman (2010)

    Science.gov (United States)

    Bowers, Jeffrey S.

    2010-01-01

    Plaut and McClelland (2010) and Quian Quiroga and Kreiman both challenged my characterization of localist and distributed representations. They also challenged the biological plausibility of grandmother cells on conceptual and empirical grounds. This reply addresses these issues in turn. The premise of my argument is that grandmother cells in…

  19. The United States and biological warfare: secrets from the early cold war and Korea.

    Science.gov (United States)

    Bruwer, A

    2001-01-01

    The United States and Biological Warfare is about accusations that the United States resorted to bacteriological warfare at a time of great military stress during the Korean War. In December 1951, the then US Secretary of Defense ordered early readiness for offensive use of biological weapons. Soon afterwards, the North Korean and Chinese armies accused the United States of starting a large-scale biological warfare experiment in Korea. The US State Department denied the accusation. Both parties to the dispute maintain their positions today. The authors spent 20 years researching the accusations in North America, Europe and Japan. They were the first foreigners to be given access to Chinese classified documents. The reader is also introduced to the concept of 'plausible denial', an official US policy which allowed responsible governmental representatives to deny knowledge of certain events. The authors hope that their work will contribute to the understanding of a time when modern war expanded into a new type of violence.

  20. Influence of biological, experiential and psychological factors in wine preference segmentation.

    Science.gov (United States)

    Pickering, Gary J; Hayes, John E

    2017-06-01

    We sought to determine the influence of selected biological, experiential and psychological variables on self-reported liking and consumption of wine in a sample of 329 Ontario wine consumers. Cluster analysis revealed three distinct groups, representing plausible market segments: wine lovers; dry table wine likers/sweet dislikers; and sweet wine likers/fortified dislikers. These groups differ in level of wine expertise, wine adventurousness, alcohol intake, bitterness from 6-n-propylthiouracil (PROP), and several demographic variables. PROP hypo-tasters (n=113) and PROP hyper-tasters (n=112) differed in liking scores for nine of the 11 wine styles [ANCOVA, P(F) ...]. Ontario wine consumers thus fall into one of three wine liking clusters, which differ in experiential, biological, psychological and demographic features that can be targeted through branding and marketing strategies.

  1. Quantum theory as plausible reasoning applied to data obtained by robust experiments.

    Science.gov (United States)

    De Raedt, H; Katsnelson, M I; Michielsen, K

    2016-05-28

    We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference applied to experiments for which the observed events are independent, and for which the frequency distribution of these events is robust with respect to small changes of the conditions under which the experiments are carried out, yields, without introducing any concept of quantum theory, the quantum theoretical description, in terms of the Schrödinger or the Pauli equation, of the Stern-Gerlach or Einstein-Podolsky-Rosen-Bohm experiments. The extraordinary descriptive power of quantum theory then follows from the fact that it is plausible reasoning, that is, common sense, applied to reproducible and robust experimental data. © 2016 The Author(s).

  2. Structure before meaning: sentence processing, plausibility, and subcategorization.

    Science.gov (United States)

    Kizach, Johannes; Nyvad, Anne Mette; Christensen, Ken Ramshøj

    2013-01-01

    Natural language processing is a fast and automatized process. A crucial part of this process is parsing, the online incremental construction of a syntactic structure. The aim of this study was to test whether a wh-filler extracted from an embedded clause is initially attached as the object of the matrix verb with subsequent reanalysis, and if so, whether the plausibility of such an attachment has an effect on reaction time. Finally, we wanted to examine whether subcategorization plays a role. We used a method called G-Maze to measure response time in a self-paced reading design. The experiments confirmed that there is early attachment of fillers to the matrix verb. When this attachment is implausible, the off-line acceptability of the whole sentence is significantly reduced. The on-line results showed that G-Maze was highly suited for this type of experiment. In accordance with our predictions, the results suggest that the parser ignores (or has no access to information about) implausibility and attaches fillers as soon as possible to the matrix verb. However, the results also show that the parser uses the subcategorization frame of the matrix verb. In short, the parser ignores semantic information and allows implausible attachments but adheres to information about which type of object a verb can take, ensuring that the parser does not make impossible attachments. We argue that the evidence supports a syntactic parser informed by syntactic cues, rather than one guided by semantic cues or one that is blind, or completely autonomous.

  3. Structure before meaning: sentence processing, plausibility, and subcategorization.

    Directory of Open Access Journals (Sweden)

    Johannes Kizach

    Natural language processing is a fast and automatized process. A crucial part of this process is parsing, the online incremental construction of a syntactic structure. The aim of this study was to test whether a wh-filler extracted from an embedded clause is initially attached as the object of the matrix verb with subsequent reanalysis, and if so, whether the plausibility of such an attachment has an effect on reaction time. Finally, we wanted to examine whether subcategorization plays a role. We used a method called G-Maze to measure response time in a self-paced reading design. The experiments confirmed that there is early attachment of fillers to the matrix verb. When this attachment is implausible, the off-line acceptability of the whole sentence is significantly reduced. The on-line results showed that G-Maze was highly suited for this type of experiment. In accordance with our predictions, the results suggest that the parser ignores (or has no access to information about) implausibility and attaches fillers as soon as possible to the matrix verb. However, the results also show that the parser uses the subcategorization frame of the matrix verb. In short, the parser ignores semantic information and allows implausible attachments but adheres to information about which type of object a verb can take, ensuring that the parser does not make impossible attachments. We argue that the evidence supports a syntactic parser informed by syntactic cues, rather than one guided by semantic cues or one that is blind, or completely autonomous.

  4. Nicotine dose-concentration relationship and pregnancy outcomes in rat: Biologic plausibility and implications for future research

    International Nuclear Information System (INIS)

    Hussein, Jabeen; Farkas, Svetlana; MacKinnon, Yolanda; Ariano, Robert E.; Sitar, Daniel S.; Hasan, Shabih U.

    2007-01-01

    Cigarette smoke (CS) exposure during pregnancy can lead to profound adverse effects on fetal development. Although CS contains several thousand chemicals, nicotine has been widely used as its surrogate as well as in its own right as a neuroteratogen. The justification for the route and dose of nicotine administration is largely based on inferential data suggesting that nicotine 6 mg/kg/day infused continuously via osmotic mini pumps (OMP) would mimic maternal CS exposure. We provide evidence that 6 mg/kg/day nicotine dose as commonly administered to pregnant rats leads to plasma nicotine concentrations that are 3-10-fold higher than those observed in moderate to heavy smokers and pregnant mothers, respectively. Furthermore, the cumulative daily nicotine dose exceeds by several hundred fold the amount consumed by human heavy smokers. Our study does not support the widely accepted notion that regardless of the nicotine dose, a linear nicotine dose-concentration relationship exists in a steady-state OMP model. We also show that total nicotine clearance increases with advancing pregnancy but no significant change is observed between the 2nd and 3rd trimester. Furthermore, nicotine infusion even at this extremely high dose has little effect on a number of maternal and fetal biologic variables and pregnancy outcome suggesting that CS constituents other than nicotine mediate the fetal growth restriction in infants born to smoking mothers. Our current study has major implications for translational research in developmental toxicology and pharmacotherapy using nicotine replacement treatment as an aid to cessation of cigarette smoking in pregnant mothers

  5. Influence of biological, experiential and psychological factors in wine preference segmentation

    Science.gov (United States)

    Pickering, Gary J; Hayes, John E

    2016-01-01

    Background and Aims We sought to determine the influence of selected biological, experiential and psychological variables on self-reported liking and consumption of wine in a sample of 329 Ontario wine consumers. Methods and Results Cluster analysis revealed three distinct groups, representing plausible market segments: wine lovers; dry table wine likers/sweet dislikers; and sweet wine likers/fortified dislikers. These groups differ in level of wine expertise, wine adventurousness, alcohol intake, bitterness from 6-n-propylthiouracil (PROP), and several demographic variables. PROP hypo-tasters (n=113) and PROP hyper-tasters (n=112) differed in liking scores for nine of the 11 wine styles [ANCOVA, P(F)variables examined. Taste phenotype also contributes significantly to variation in wine liking. Significance of the Study Ontario wine consumers fall into one of three wine liking clusters, which differ in experiential, biological, psychological and demographic features that can be targeted through branding and marketing strategies. PMID:28579910

  6. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  7. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support

    OpenAIRE

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Background Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mech...

  8. On Matrix Sampling and Imputation of Context Questionnaires with Implications for the Generation of Plausible Values in Large-Scale Assessments

    Science.gov (United States)

    Kaplan, David; Su, Dan

    2016-01-01

    This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…

  9. The Sarrazin effect: the presence of absurd statements in conspiracy theories makes canonical information less plausible.

    Science.gov (United States)

    Raab, Marius Hans; Auer, Nikolas; Ortlieb, Stefan A; Carbon, Claus-Christian

    2013-01-01

    Reptile prime ministers and flying Nazi saucers: extreme and sometimes off-the-wall conclusions are typical ingredients of conspiracy theories. While individual differences are a common research topic concerning conspiracy theories, the role of extreme statements in the process of acquiring and passing on conspiratorial stories has not so far been examined in an experimental design. We identified six morphological components of conspiracy theories empirically. On the basis of these content categories a set of narrative elements for a 9/11 story was compiled. These elements varied systematically in terms of conspiratorial allegation, i.e., they contained official statements concerning the events of 9/11, statements alleging a conspiracy limited in time and space, as well as extreme statements indicating an all-encompassing cover-up. Using the method of narrative construction, 30 people were given a set of cards with these statements and asked to construct the course of events of 9/11 they deem most plausible. When extreme statements were present in the set, the resulting stories were more conspiratorial; the number of official statements included in the narrative dropped significantly, whereas the self-assessment of the story's plausibility did not differ between conditions. This indicates that blatant statements in a pool of information foster the synthesis of conspiracy theories on an individual level. By relating these findings to one of Germany's most successful (and controversial) non-fiction books, we refer to the real-world dangers of this effect.

  10. Leveraging advances in biology to design biomaterials

    Science.gov (United States)

    Darnell, Max; Mooney, David J.

    2017-12-01

    Biomaterials have dramatically increased in functionality and complexity, allowing unprecedented control over the cells that interact with them. From these engineering advances arises the prospect of improved biomaterial-based therapies, yet practical constraints favour simplicity. Tools from the biology community are enabling high-resolution and high-throughput bioassays that, if incorporated into a biomaterial design framework, could help achieve unprecedented functionality while minimizing the complexity of designs by identifying the most important material parameters and biological outputs. However, to avoid data explosions and to effectively match the information content of an assay with the goal of the experiment, material screens and bioassays must be arranged in specific ways. By borrowing methods to design experiments and workflows from the bioprocess engineering community, we outline a framework for the incorporation of next-generation bioassays into biomaterials design to effectively optimize function while minimizing complexity. This framework can inspire biomaterials designs that maximize functionality and translatability.

  11. Minimally invasive orthognathic surgery.

    Science.gov (United States)

    Resnick, Cory M; Kaban, Leonard B; Troulis, Maria J

    2009-02-01

    Minimally invasive surgery is defined as the discipline in which operative procedures are performed in novel ways to diminish the sequelae of standard surgical dissections. The goals of minimally invasive surgery are to reduce tissue trauma and to minimize bleeding, edema, and injury, thereby improving the rate and quality of healing. In orthognathic surgery, there are two minimally invasive techniques that can be used separately or in combination: (1) endoscopic exposure and (2) distraction osteogenesis. This article describes the historical developments of the fields of orthognathic surgery and minimally invasive surgery, as well as the integration of the two disciplines. Indications, techniques, and the most current outcome data for specific minimally invasive orthognathic surgical procedures are presented.

  12. Spectral Mixing in Nervous Systems: Experimental Evidenceand Biologically Plausible Circuits

    Science.gov (United States)

    Kleinfeld, D.; Mehta, S. B.

    The ability to compute the difference frequency for two periodic signals depends on a nonlinear operation that mixes those signals. Behavioral and psychophysical evidence suggest that such mixing is likely to occur in the vertebrate nervous system as a means to compare rhythmic sensory signals, such as occurs in human audition, and as a means to lock an intrinsic rhythm to a sensory input. Electrophysiological data from electroreceptors in the immobilized electric fish and somatosensory cortex in the anesthetized rat yield direct evidence for such mixing, providing a neurological substrate for the modulation and demodulation of rhythmic neuronal signals. We consider an analytical model of spectral mixing that makes use of the threshold characteristics of neuronal firing and which has features consistent with the experimental observations. This model serves as a guide for constructing circuits that isolate given mixture components. In particular, such circuits can generate nearly pure difference tones from sinusoidal inputs without the use of band-pass filters, in analogy to an image-reject mixer in communications engineering. We speculate that such computations may play a role in coding of sensory input and feedback stabilization of motor output in nervous systems.
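
    As a purely illustrative companion (my own sketch, not the authors' model or data), the Python snippet below passes the sum of two sinusoidal "inputs" through a thresholding nonlinearity of the kind neuronal firing provides and confirms, from the spectrum of the output, that energy appears at the difference frequency without any band-pass filtering.

        import numpy as np

        fs, T = 2000.0, 4.0                        # sample rate (Hz) and duration (s)
        t = np.arange(0, T, 1 / fs)
        f1, f2 = 140.0, 200.0                      # two rhythmic input frequencies
        drive = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

        rate = np.maximum(drive - 0.5, 0.0)        # thresholded "firing rate" (the nonlinearity)

        spectrum = np.abs(np.fft.rfft(rate))
        freqs = np.fft.rfftfreq(rate.size, 1 / fs)
        for f in (f2 - f1, f1, f2, f1 + f2):       # the difference tone appears at 60 Hz
            k = int(np.argmin(np.abs(freqs - f)))
            print(f, round(float(spectrum[k]), 1))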

  13. Regularity of Minimal Surfaces

    CERN Document Server

    Dierkes, Ulrich; Tromba, Anthony J; Kuster, Albrecht

    2010-01-01

    "Regularity of Minimal Surfaces" begins with a survey of minimal surfaces with free boundaries. Following this, the basic results concerning the boundary behaviour of minimal surfaces and H-surfaces with fixed or free boundaries are studied. In particular, the asymptotic expansions at interior and boundary branch points are derived, leading to general Gauss-Bonnet formulas. Furthermore, gradient estimates and asymptotic expansions for minimal surfaces with only piecewise smooth boundaries are obtained. One of the main features of free boundary value problems for minimal surfaces is t

  14. Proper project planning helps minimize overruns and delays

    International Nuclear Information System (INIS)

    Donnelly, G.; Cooney, D.J.

    1994-01-01

    This paper describes planning methods to help minimize cost overruns during the construction of oil and gas pipelines. These steps include background data collection methods, field surveys, determining preliminary pipeline routes, regulatory agency pre-application meetings, and preliminary engineering. Methods for planning also include preliminary aerial mapping, biological assessments, cultural resources investigations, wetlands delineation, geotechnical investigations, and environmental audits. Identification of potential problems can allow for rerouting of the pipeline or remediation processes before they are raised during the permitting process. By coordinating these events from the very beginning, significant cost savings will result that prevent having to rebudget for them after the permitting process starts

  15. Computing motion using resistive networks

    Science.gov (United States)

    Koch, Christof; Luo, Jin; Mead, Carver; Hutchinson, James

    1988-01-01

    Recent developments in the theory of early vision are described which lead from the formulation of the motion problem as an ill-posed one to its solution by minimizing certain 'cost' functions. These cost or energy functions can be mapped onto simple analog and digital resistive networks. It is shown how the optical flow can be computed by injecting currents into resistive networks and recording the resulting stationary voltage distribution at each node. These networks can be implemented in CMOS VLSI circuits and represent plausible candidates for biological vision systems.
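
    A one-dimensional caricature of this mapping (my own sketch, not the authors' implementation) is shown below in Python: a quadratic data term plus a smoothness term is minimized by solving the same linear node equations that a resistive line would settle into, with conductance lam between neighbouring nodes and unit conductance from each node to its data source.

        import numpy as np

        # minimize E(v) = sum_i (v_i - d_i)^2 + lam * sum_i (v_{i+1} - v_i)^2
        n, lam = 50, 5.0
        d = np.where(np.arange(n) < n // 2, 1.0, 3.0)         # step-like "measurements"
        d = d + 0.2 * np.random.default_rng(1).normal(size=n) # plus noise

        A = np.eye(n)                                         # unit conductance to each data source
        for i in range(n - 1):                                # smoothness "resistors" between neighbours
            A[i, i] += lam;  A[i + 1, i + 1] += lam
            A[i, i + 1] -= lam;  A[i + 1, i] -= lam

        v = np.linalg.solve(A, d)                             # stationary "node voltages"
        print(np.round(v[::10], 2))                           # a smoothed version of d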

  16. Synthetic biology, inspired by synthetic chemistry.

    Science.gov (United States)

    Malinova, V; Nallani, M; Meier, W P; Sinner, E K

    2012-07-16

    The topic synthetic biology still appears to be an 'empty basket to be filled'. However, there are already plenty of claims and visions, as well as convincing research strategies, about the theme of synthetic biology. First of all, synthetic biology seems to be about the engineering of biology - about bottom-up and top-down approaches, compromising complexity versus stability of artificial architectures, relevant in biology. Synthetic biology accounts for heterogeneous approaches towards minimal and even artificial life, the engineering of biochemical pathways on the organismic level, the modelling of molecular processes and finally, the combination of synthetic with nature-derived materials and architectural concepts, such as a cellular membrane. Still, synthetic biology is a discipline which embraces interdisciplinary attempts in order to have a profound, scientific base to enable the re-design of nature and to compose architectures and processes with man-made matter. We would like to give an overview of developments in the field of synthetic biology, regarding polymer-based analogs of cellular membranes and what questions can be answered by applying synthetic polymer science towards the smallest unit in life, namely a cell. Copyright © 2012 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  17. Minimal Poems Written in 1979 Minimal Poems Written in 1979

    Directory of Open Access Journals (Sweden)

    Sandra Sirangelo Maggio

    2008-04-01

    The reading of M. van der Slice's Minimal Poems Written in 1979 (the work, actually, has no title) reminded me of a book I saw a long time ago, called Truth, which had not even a single word printed inside. In either case we have a sample of how often eccentricities can prove efficient means of artistic creativity, in this new literary trend known as Minimalism.

  18. Minimally invasive treatment of pilon fractures with a low profile plate: preliminary results in 17 cases

    NARCIS (Netherlands)

    Borens, Olivier; Kloen, Peter; Richmond, Jeffrey; Roederer, Goetz; Levine, David S.; Helfet, David L.

    2009-01-01

    To determine the results of "biologic fixation" with a minimally invasive plating technique using a newly designed low profile "Scallop" plate in the treatment of pilon fractures. Retrospective case series. A tertiary referral center. Seventeen patients were treated between 1999 and 2001 for a

  19. Correlates of minimal dating.

    Science.gov (United States)

    Leck, Kira

    2006-10-01

    Researchers have associated minimal dating with numerous factors. The present author tested shyness, introversion, physical attractiveness, performance evaluation, anxiety, social skill, social self-esteem, and loneliness to determine the nature of their relationships with 2 measures of self-reported minimal dating in a sample of 175 college students. For women, shyness, introversion, physical attractiveness, self-rated anxiety, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. For men, physical attractiveness, observer-rated social skill, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. The patterns of relationships were not identical for the 2 indicators of minimal dating, indicating the possibility that minimal dating is not a single construct as researchers previously believed. The present author discussed implications and suggestions for future researchers.

  20. Optoelectronic system and apparatus for connection to biological systems

    Science.gov (United States)

    Okandan, Murat; Nielson, Gregory N.

    2018-03-06

    The present invention relates to a biological probe structure, as well as apparatuses, systems, and methods employing this structure. In particular embodiments, the structure includes a hermetically sealed unit configured to receive and transmit one or more optical signals. Furthermore, the structure can be implanted subcutaneously and interrogated externally. In this manner, a minimally invasive method can be employed to detect, treat, and/or assess the biological target. Additional methods and systems are also provided.

  1. Minimal Super Technicolor

    DEFF Research Database (Denmark)

    Antola, M.; Di Chiara, S.; Sannino, F.

    2011-01-01

    We introduce novel extensions of the Standard Model featuring a supersymmetric technicolor sector (supertechnicolor). As the first minimal conformal supertechnicolor model we consider N=4 Super Yang-Mills which breaks to N=1 via the electroweak interactions. This is a well defined, economical......, between unparticle physics and Minimal Walking Technicolor. We consider also other N =1 extensions of the Minimal Walking Technicolor model. The new models allow all the standard model matter fields to acquire a mass....

  2. Residual Neurocognitive Features of Long-Term Ecstasy Users With Minimal Exposure to Other Drugs

    Science.gov (United States)

    Halpern, John H.; Sherwood, Andrea R.; Hudson, James I.; Gruber, Staci; Kozin, David; Pope, Harrison G.

    2010-01-01

    Aims In field studies assessing cognitive function in illicit ecstasy users, there are several frequent confounding factors that might plausibly bias the findings toward an overestimate of ecstasy-induced neurocognitive toxicity. We designed an investigation seeking to minimize these possible sources of bias. Design We compared illicit ecstasy users and non-users while 1) excluding individuals with significant lifetime exposure to other illicit drugs or alcohol; 2) requiring that all participants be members of the “rave” subculture; and 3) testing all participants with breath, urine, and hair samples at the time of evaluation to exclude possible surreptitious substance use. We compared groups with adjustment for age, gender, race/ethnicity, family-of-origin variables, and childhood history of conduct disorder and attention deficit hyperactivity disorder. We provide significance levels without correction for multiple comparisons. Setting Field study. Participants Fifty-two illicit ecstasy users and 59 non-users, age 18-45. Measurements Battery of 15 neuropsychological tests tapping a range of cognitive functions. Findings We found little evidence of decreased cognitive performance in ecstasy users, save for poorer strategic-self-regulation, possibly reflecting increased impulsivity. However this finding might have reflected a premorbid attribute of ecstasy users, rather than a residual neurotoxic effect of the drug. Conclusions In a study designed to minimize limitations found in many prior investigations, we failed to demonstrate marked residual cognitive effects in ecstasy users. This finding contrasts with many previous findings—including our own—and emphasizes the need for continued caution in interpreting field studies of cognitive function in illicit ecstasy users. PMID:21205042

  3. Robust dynamical pattern formation from a multifunctional minimal genetic circuit

    Directory of Open Access Journals (Sweden)

    Carrera Javier

    2010-04-01

    Full Text Available Abstract Background A practical problem during the analysis of natural networks is their complexity, thus the use of synthetic circuits would allow us to unveil the natural mechanisms of operation. Autocatalytic gene regulatory networks play an important role in shaping the development of multicellular organisms, whereas oscillatory circuits are used to control gene expression under variable environments such as the light-dark cycle. Results We propose a new mechanism to generate developmental patterns and oscillations using a minimal number of genes. For this, we design a synthetic gene circuit with an antagonistic self-regulation to study the spatio-temporal control of protein expression. Here, we show that our minimal system can behave as a biological clock or memory, and that it exhibits an inherent robustness due to a quorum sensing mechanism. We analyze this property by accounting for molecular noise in a heterogeneous population. We also show how the period of the oscillations is tunable by environmental signals, and we study the bifurcations of the system by constructing different phase diagrams. Conclusions As this minimal circuit is based on a single transcriptional unit, it provides a new mechanism based on post-translational interactions to generate targeted spatio-temporal behavior.

  4. Minimal Disease Activity as a Treatment Target in Psoriatic Arthritis

    DEFF Research Database (Denmark)

    Gossec, Laure; McGonagle, Dennis; Korotaeva, Tatiana

    2018-01-01

    As in other inflammatory rheumatic diseases, the objective of psoriatic arthritis (PsA) treatment is the achievement of a defined target. Recent recommendations propose aiming for remission or low disease activity; however, a consensual definition of remission is lacking. A state of minimal disease....... Since its development, MDA has been used increasingly in studies and clinical trials. In this article, the potential use of MDA as a treatment target in PsA is reviewed. The frequencies of MDA achievement with biologic disease-modifying antirheumatic drugs are summarized based on data from registries...

  5. Synthetic constructs in/for the environment: managing the interplay between natural and engineered Biology.

    Science.gov (United States)

    Schmidt, Markus; de Lorenzo, Víctor

    2012-07-16

    The plausible release of deeply engineered or even entirely synthetic/artificial microorganisms raises the issue of their intentional (e.g. bioremediation) or accidental interaction with the Environment. Containment systems designed in the 1980s-1990s for limiting the spread of genetically engineered bacteria and their recombinant traits are still applicable to contemporary Synthetic Biology constructs. Yet, the ease of DNA synthesis and the uncertainty on how non-natural properties and strains could interplay with the existing biological world poses yet again the challenge of designing safe and efficacious firewalls to curtail possible interactions. Such barriers may include xeno-nucleic acids (XNAs) instead of DNA as information-bearing molecules, rewriting the genetic code to make it non-understandable by the existing gene expression machineries, and/or making growth dependent on xenobiotic chemicals. Copyright © 2012 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  6. A flood-based information flow analysis and network minimization method for gene regulatory networks.

    Science.gov (United States)

    Pavlogiannis, Andreas; Mozhayskiy, Vadim; Tagkopoulos, Ilias

    2013-04-24

    Biological networks tend to have high interconnectivity, complex topologies and multiple types of interactions. This renders difficult the identification of sub-networks that are involved in condition-specific responses. In addition, we generally lack scalable methods that can reveal the information flow in gene regulatory and biochemical pathways. Doing so will help us to identify key participants and paths under specific environmental and cellular contexts. This paper introduces the theory of network flooding, which aims to address the problem of network minimization and regulatory information flow in gene regulatory networks. Given a regulatory biological network, a set of source (input) nodes and optionally a set of sink (output) nodes, our task is to find (a) the minimal sub-network that encodes the regulatory program involving all input and output nodes and (b) the information flow from the source to the sink nodes of the network. Here, we describe a novel, scalable, network traversal algorithm and we assess its potential to achieve significant network size reduction in both synthetic and E. coli networks. Scalability and sensitivity analysis show that the proposed method scales well with the size of the network, and is robust to noise and missing data. The method of network flooding proves to be a useful, practical approach towards information flow analysis in gene regulatory networks. Further extension of the proposed theory has the potential to lead to a unifying framework for the simultaneous network minimization and information flow analysis across various "omics" levels.
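
    The network-minimization step described above can be illustrated with a minimal sketch (not the authors' implementation; the graph representation and all names are assumptions): a node belongs to the minimal sub-network exactly when it is reachable from some source and can itself reach some sink.

        # Minimal sketch of the 'keep only what lies between inputs and outputs'
        # step; breadth-first search stands in for the flooding traversal.
        from collections import deque

        def reachable(adj, starts):
            seen, queue = set(starts), deque(starts)
            while queue:
                node = queue.popleft()
                for nxt in adj.get(node, ()):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen

        def minimal_subnetwork(edges, sources, sinks):
            fwd, rev = {}, {}
            for u, v in edges:
                fwd.setdefault(u, []).append(v)
                rev.setdefault(v, []).append(u)
            # keep nodes reachable from a source AND able to reach a sink
            keep = reachable(fwd, sources) & reachable(rev, sinks)
            return [(u, v) for u, v in edges if u in keep and v in keep]

        # toy regulatory edges (regulator -> target); one branch is irrelevant
        edges = [("input", "geneA"), ("geneA", "geneB"), ("geneB", "output"),
                 ("geneC", "geneD")]
        print(minimal_subnetwork(edges, {"input"}, {"output"}))

    The published algorithm additionally quantifies the information flow from source to sink nodes, which this purely topological sketch omits.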

  7. Biological intrusion of low-level-waste trench covers

    Science.gov (United States)

    Hakonson, T. E.; Gladney, E. S.

    The long-term integrity of low-level waste shallow land burial sites is dependent on the interaction of physical, chemical, and biological factors that modify the waste containment system. The need to consider biological processes as being potentially important in reducing the integrity of waste burial site cover treatment is demonstrated. One approach to limiting biological intrusion through the waste cover is to apply a barrier within the profile to limit root and animal penetration with depth. Experiments in the Los Alamos Experimental Engineered Test Facility were initiated to develop and evaluate biological barriers that are effective in minimizing intrusion into waste trenches. The experiments that are described employ four different candidate barrier materials of geologic origin. Experimental variables that will be evaluated, in addition to barrier type, are barrier depth and soil overburden depth.

  8. On the plausibility of socioeconomic mortality estimates derived from linked data: a demographic approach.

    Science.gov (United States)

    Lerch, Mathias; Spoerri, Adrian; Jasilionis, Domantas; Viciana Fernandèz, Francisco

    2017-07-14

    Reliable estimates of mortality according to socioeconomic status play a crucial role in informing the policy debate about social inequality, social cohesion, and exclusion as well as about the reform of pension systems. Linked mortality data have become a gold standard for monitoring socioeconomic differentials in survival. Several approaches have been proposed to assess the quality of the linkage, in order to avoid the misclassification of deaths according to socioeconomic status. However, the plausibility of mortality estimates has never been scrutinized from a demographic perspective, and the potential problems with the quality of the data on the at-risk populations have been overlooked. Using indirect demographic estimation (i.e., the synthetic extinct generation method), we analyze the plausibility of old-age mortality estimates according to educational attainment in four European data contexts with different quality issues: deterministic and probabilistic linkage of deaths, as well as differences in the methodology of the collection of educational data. We evaluate whether the at-risk population according to educational attainment is misclassified and/or misestimated, correct these biases, and estimate the education-specific linkage rates of deaths. The results confirm a good linkage of death records within different educational strata, even when probabilistic matching is used. The main biases in mortality estimates concern the classification and estimation of the person-years of exposure according to educational attainment. Changes in the census questions about educational attainment led to inconsistent information over time, which misclassified the at-risk population. Sample censuses also misestimated the at-risk populations according to educational attainment. The synthetic extinct generation method can be recommended for quality assessments of linked data because it is capable not only of quantifying linkage precision, but also of tracking problems in

  9. Nitrogenous Derivatives of Phosphorus and the Origins of Life: Plausible Prebiotic Phosphorylating Agents in Water

    Directory of Open Access Journals (Sweden)

    Megha Karki

    2017-07-01

    Full Text Available Phosphorylation under plausible prebiotic conditions continues to be one of the defining issues for the role of phosphorus in the origins of life processes. In this review, we cover the reactions of alternative forms of phosphate, specifically the nitrogenous versions of phosphate (and other forms of reduced phosphorus species) from a prebiotic, synthetic organic and biochemistry perspective. The ease with which such amidophosphates or phosphoramidate derivatives phosphorylate a wide variety of substrates suggests that alternative forms of phosphate could have played a role in overcoming the “phosphorylation in water problem”. We submit that serious consideration should be given to the search for primordial sources of nitrogenous versions of phosphate and other versions of phosphorus.

  10. Minimizing Mutual Coupling

    DEFF Research Database (Denmark)

    2010-01-01

    Disclosed herein are techniques, systems, and methods relating to minimizing mutual coupling between a first antenna and a second antenna.

  11. A Reconfigurable and Biologically Inspired Paradigm for Computation Using Network-On-Chip and Spiking Neural Networks

    Directory of Open Access Journals (Sweden)

    Jim Harkin

    2009-01-01

    Full Text Available FPGA devices have emerged as a popular platform for the rapid prototyping of biological Spiking Neural Network (SNN) applications, offering the key requirement of reconfigurability. However, FPGAs do not efficiently realise the biologically plausible neuron and synaptic models of SNNs, and current FPGA routing structures cannot accommodate the high levels of interneuron connectivity inherent in complex SNNs. This paper highlights and discusses the current challenges of implementing scalable SNNs on reconfigurable FPGAs. The paper proposes a novel field programmable neural network architecture (EMBRACE), incorporating low-power analogue spiking neurons, interconnected using a Network-on-Chip architecture. Results on the evaluation of the EMBRACE architecture using the XOR benchmark problem are presented, and the performance of the architecture is discussed. The paper also discusses the adaptability of the EMBRACE architecture in supporting fault tolerant computing.

  12. Engineering genetic circuit interactions within and between synthetic minimal cells

    Science.gov (United States)

    Adamala, Katarzyna P.; Martin-Alarcon, Daniel A.; Guthrie-Honea, Katriona R.; Boyden, Edward S.

    2017-05-01

    Genetic circuits and reaction cascades are of great importance for synthetic biology, biochemistry and bioengineering. An open question is how to maximize the modularity of their design to enable the integration of different reaction networks and to optimize their scalability and flexibility. One option is encapsulation within liposomes, which enables chemical reactions to proceed in well-isolated environments. Here we adapt liposome encapsulation to enable the modular, controlled compartmentalization of genetic circuits and cascades. We demonstrate that it is possible to engineer genetic circuit-containing synthetic minimal cells (synells) to contain multiple-part genetic cascades, and that these cascades can be controlled by external signals as well as inter-liposomal communication without crosstalk. We also show that liposomes that contain different cascades can be fused in a controlled way so that the products of incompatible reactions can be brought together. Synells thus enable a more modular creation of synthetic biology cascades, an essential step towards their ultimate programmability.

  13. Legal incentives for minimizing waste

    International Nuclear Information System (INIS)

    Clearwater, S.W.; Scanlon, J.M.

    1991-01-01

    Waste minimization, or pollution prevention, has become an integral component of federal and state environmental regulation. Minimizing waste offers many economic and public relations benefits. In addition, waste minimization efforts can also dramatically reduce potential criminal liability. This paper addresses the legal incentives for minimizing waste under current and proposed environmental laws and regulations.

  14. MOCUS, Minimal Cut Sets and Minimal Path Sets from Fault Tree Analysis

    International Nuclear Information System (INIS)

    Fussell, J.B.; Henry, E.B.; Marshall, N.H.

    1976-01-01

    1 - Description of problem or function: From a description of the Boolean failure logic of a system, called a fault tree, and control parameters specifying the minimal cut set length to be obtained, MOCUS determines the system failure modes, or minimal cut sets, and the system success modes, or minimal path sets. 2 - Method of solution: MOCUS uses direct resolution of the fault tree into the cut and path sets. The algorithm used starts with the main failure of interest, the top event, and proceeds to basic independent component failures, called primary events, to resolve the fault tree and obtain the minimal sets. A key point of the algorithm is that an AND gate alone always increases the size of cut sets and the number of path sets, while an OR gate alone always increases the number of cut sets and the size of path sets. Other types of logic gates must be described in terms of AND and OR logic gates. 3 - Restrictions on the complexity of the problem: Output from MOCUS can include minimal cut and path sets for up to 20 gates.
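
    A toy version of the top-down resolution described above (in the spirit of MOCUS, not the original code) makes the AND/OR behaviour concrete: OR gates add rows (more cut sets), AND gates widen rows (bigger cut sets), and non-minimal rows are pruned at the end.

        # Top-down expansion of a fault tree into minimal cut sets.
        def minimal_cut_sets(gates, top):
            rows, expanded = [[top]], True
            while expanded:
                expanded, new_rows = False, []
                for row in rows:
                    gate = next((g for g in row if g in gates), None)
                    if gate is None:            # row contains only primary events
                        new_rows.append(row)
                        continue
                    expanded = True
                    kind, inputs = gates[gate]
                    rest = [x for x in row if x != gate]
                    if kind == "AND":           # widen the row
                        new_rows.append(rest + list(inputs))
                    else:                       # "OR": one new row per input
                        new_rows.extend(rest + [inp] for inp in inputs)
                rows = new_rows
            sets = {frozenset(r) for r in rows}
            return [s for s in sets if not any(o < s for o in sets)]

        # toy tree: TOP = G1 AND G2, G1 = A OR B, G2 = B OR C
        gates = {"TOP": ("AND", ["G1", "G2"]),
                 "G1": ("OR", ["A", "B"]),
                 "G2": ("OR", ["B", "C"])}
        print(minimal_cut_sets(gates, "TOP"))   # -> {B} and {A, C}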

  15. Effect of sub-pore scale morphology of biological deposits on porous media flow properties

    Science.gov (United States)

    Ghezzehei, T. A.

    2012-12-01

    Biological deposits often influence fluid flow by altering the pore space morphology and related hydrologic properties such as porosity, water retention characteristics, and permeability. In most coupled-processes models changes in porosity are inferred from biological process models using mass-balance. The corresponding evolution of permeability is estimated using (semi-) empirical porosity-permeability functions such as the Kozeny-Carman equation or power-law functions. These equations typically do not account for the heterogeneous spatial distribution and morphological irregularities of the deposits. As a result, predictions of permeability evolution are generally unsatisfactory. In this presentation, we demonstrate the significance of pore-scale deposit distribution on porosity-permeability relations using high resolution simulations of fluid flow through a single pore interspersed with deposits of varying morphologies. Based on these simulations, we present a modification to the Kozeny-Carman model that accounts for the shape of the deposits. Limited comparison with published experimental data suggests the plausibility of the proposed conceptual model.
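
    For orientation, the classical Kozeny-Carman relation referred to above can be written as a one-line function; the extra exponent below is a purely illustrative stand-in for a morphology-dependent correction and is not the modification actually proposed in the paper.

        # Relative permeability k/k0 from porosity via Kozeny-Carman.
        def kozeny_carman_relative(phi, phi0, n=3.0):
            # n = 3 recovers the classical porosity-cubed dependence;
            # larger n mimics deposits that clog pore throats more strongly.
            return (phi / phi0) ** n * ((1.0 - phi0) / (1.0 - phi)) ** 2

        # clean porosity 0.4, reduced to 0.3 by biological deposits
        print(kozeny_carman_relative(0.30, 0.40))        # classical estimate
        print(kozeny_carman_relative(0.30, 0.40, n=5.0)) # stronger clogging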

  16. Is non-minimal inflation eternal?

    International Nuclear Information System (INIS)

    Feng, Chao-Jun; Li, Xin-Zhou

    2010-01-01

    The possibility that the non-minimal coupling inflation could be eternal is investigated. We calculate the quantum fluctuation of the inflaton in a Hubble time and find that it has the same value as that in the minimal case in the slow-roll limit. Armed with this result, we have studied some concrete non-minimal inflationary models including the chaotic inflation and the natural inflation, in which the inflaton is non-minimally coupled to the gravity. We find that the non-minimal coupling inflation could be eternal in some parameter spaces.

  17. Signature of Plausible Accreting Supermassive Black Holes in Mrk 261/262 and Mrk 266

    Directory of Open Access Journals (Sweden)

    Gagik Ter-Kazarian

    2013-01-01

    Full Text Available We address the neutrino radiation of plausible accreting supermassive black holes closely linking to the 5 nuclear components of galaxy samples of Mrk 261/262 and Mrk 266. We predict a time delay before neutrino emission of the same scale as the age of the Universe. The ultrahigh energy neutrinos are produced in superdense protomatter medium via simple (quark or pionic) reactions or modified URCA processes (G. Gamow was inspired to name the process URCA after the name of a casino in Rio de Janeiro). The resulting neutrino fluxes for quark reactions are ranging from to , where is the opening parameter. For pionic and modified URCA reactions, the fluxes are and , respectively. These fluxes are highly beamed along the plane of the accretion disk, peaked at ultrahigh energies, and collimated in smaller opening angle .

  18. Using novel descriptor accounting for ligand-receptor interactions to define and visually explore biologically relevant chemical space.

    Science.gov (United States)

    Rabal, Obdulia; Oyarzabal, Julen

    2012-05-25

    The definition and pragmatic implementation of biologically relevant chemical space is critical in addressing navigation strategies in the overlapping regions where chemistry and therapeutically relevant targets reside and, therefore, also key to performing an efficient drug discovery project. Here, we describe the development and implementation of a simple and robust method for representing biologically relevant chemical space as a general reference according to current knowledge, independently of any reference space, and analyzing chemical structures accordingly. Underlying our method is the generation of a novel descriptor (LiRIf) that converts structural information into a one-dimensional string accounting for the plausible ligand-receptor interactions as well as for topological information. Capitalizing on ligand-receptor interactions as a descriptor enables the clustering, profiling, and comparison of libraries of compounds from a chemical biology and medicinal chemistry perspective. In addition, as a case study, R-groups analysis is performed to identify the most populated ligand-receptor interactions according to different target families (GPCR, kinases, etc.), as well as to evaluate the coverage of biologically relevant chemical space by structures annotated in different databases (ChEMBL, Glida, etc.).

  19. Minimal families of curves on surfaces

    KAUST Repository

    Lubbes, Niels

    2014-11-01

    A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal families of a given surface. The classification of minimal families of curves can be reduced to the classification of minimal families which cover weak Del Pezzo surfaces. We classify the minimal families of weak Del Pezzo surfaces and present a table with the number of minimal families of each weak Del Pezzo surface up to Weyl equivalence. As an application of this classification we generalize some results of Schicho. We classify algebraic surfaces that carry a family of conics. We determine the minimal lexicographic degree for the parametrization of a surface that carries at least 2 minimal families. © 2014 Elsevier B.V.

  20. Hexavalent Chromium Minimization Strategy

    Science.gov (United States)

    2011-05-01

    DoD Hexavalent Chromium Minimization Strategy, Office of the Secretary of Defense, May 2011; the report addresses minimization of hexavalent chromium, a cancer hazard, including non-chrome primer alternatives.

  1. Silk-polypyrrole biocompatible actuator performance under biologically relevant conditions

    Science.gov (United States)

    Hagler, Jo'elen; Peterson, Ben; Murphy, Amanda; Leger, Janelle

    Biocompatible actuators that are capable of controlled movement and can function under biologically relevant conditions are of significant interest in biomedical fields. Previously, we have demonstrated that a composite material of silk biopolymer and the conducting polymer polypyrrole (PPy) can be formed into a bilayer device that can bend under applied voltage. Further, these silk-PPy composites can generate forces comparable to human muscle (>0.1 MPa), making them ideal candidates for interfacing with biological tissues. Here silk-PPy composite films are tested for performance under biologically relevant conditions, including exposure to a complex protein serum and biologically relevant temperatures. Free-end bending actuation performance, current response, force generation, and mass degradation were investigated. Preliminary results show that when exposed to proteins and biologically relevant temperatures, these silk-PPy composites show minimal degradation and are able to generate forces and conduct currents comparable to devices tested under standard conditions. NSF.

  2. Prochlorococcus: Advantages and Limits of Minimalism

    Science.gov (United States)

    Partensky, Frédéric; Garczarek, Laurence

    2010-01-01

    Prochlorococcus is the key phytoplanktonic organism of tropical gyres, large ocean regions that are depleted of the essential macronutrients needed for photosynthesis and cell growth. This cyanobacterium has adapted itself to oligotrophy by minimizing the resources necessary for life through a drastic reduction of cell and genome sizes. This strategy, rarely observed in free-living organisms, has conferred on Prochlorococcus a considerable advantage over other phototrophs, including its closest relative Synechococcus, for life in this vast yet little variable ecosystem. However, this strategy seems to reach its limits in the upper layer of the South Pacific gyre, the most oligotrophic region of the world ocean. By losing some important genes and/or functions during evolution, Prochlorococcus has seemingly become dependent on co-occurring microorganisms. In this review, we present some of the recent advances in the ecology, biology, and evolution of Prochlorococcus, which, because of its ecological importance and tiny genome, is rapidly imposing itself as a model organism in environmental microbiology.

  3. Minimal and non-minimal standard models: Universality of radiative corrections

    International Nuclear Information System (INIS)

    Passarino, G.

    1991-01-01

    The possibility of describing electroweak processes by means of models with a non-minimal Higgs sector is analyzed. The renormalization procedure which leads to a set of fitting equations for the bare parameters of the lagrangian is first reviewed for the minimal standard model. A solution of the fitting equations is obtained, which correctly includes large higher-order corrections. Predictions for physical observables, notably the W boson mass and the Z0 partial widths, are discussed in detail. Finally the extension to non-minimal models is described under the assumption that new physics will appear only inside the vector boson self-energies, and the concept of universality of radiative corrections is introduced, showing that to a large extent they are insensitive to the details of the enlarged Higgs sector. Consequences for the bounds on the top quark mass are also discussed. (orig.)

  4. Towards physical principles of biological evolution

    Science.gov (United States)

    Katsnelson, Mikhail I.; Wolf, Yuri I.; Koonin, Eugene V.

    2018-03-01

    Biological systems reach organizational complexity that far exceeds the complexity of any known inanimate objects. Biological entities undoubtedly obey the laws of quantum physics and statistical mechanics. However, is modern physics sufficient to adequately describe, model and explain the evolution of biological complexity? Detailed parallels have been drawn between statistical thermodynamics and the population-genetic theory of biological evolution. Based on these parallels, we outline new perspectives on biological innovation and major transitions in evolution, and introduce a biological equivalent of thermodynamic potential that reflects the innovation propensity of an evolving population. Deep analogies have been suggested to also exist between the properties of biological entities and processes, and those of frustrated states in physics, such as glasses. Such systems are characterized by frustration, whereby local states with minimal free energy conflict with the global minimum, resulting in ‘emergent phenomena’. We extend such analogies by examining frustration-type phenomena, such as conflicts between different levels of selection, in biological evolution. These frustration effects appear to drive the evolution of biological complexity. We further address evolution in multidimensional fitness landscapes from the point of view of percolation theory and suggest that percolation at a level above the critical threshold dictates the tree-like evolution of complex organisms. Taken together, these multiple connections between fundamental processes in physics and biology imply that construction of a meaningful physical theory of biological evolution might not be a futile effort. However, it is unrealistic to expect that such a theory can be created in one scoop; if it ever comes to being, this can only happen through integration of multiple physical models of evolutionary processes. Furthermore, the existing framework of theoretical physics is unlikely to suffice

  5. Cannabis and psychosis: an update on course and biological plausible mechanisms

    NARCIS (Netherlands)

    Linszen, Don; van Amelsvoort, Therese

    2007-01-01

    PURPOSE OF REVIEW: Cannabis use is the most commonly abused illicit substance. Its relation with psychosis remains a topic of debate. Epidemiological studies suggest that cannabis is a component cause accounting for approximately 10% of cases. An increasing number of studies have been published on

  6. Detection of Low-order Curves in Images using Biologically-plausible Hardware

    Science.gov (United States)

    2012-09-29


  7. A cognitively plausible model for grammar induction

    Directory of Open Access Journals (Sweden)

    Roni Katzir

    2015-01-01

    Full Text Available This paper aims to bring theoretical linguistics and cognition-general theories of learning into closer contact. I argue that linguists' notions of rich UGs are well-founded, but that cognition-general learning approaches are viable as well and that the two can and should co-exist and support each other. Specifically, I use the observation that any theory of UG provides a learning criterion -- the total memory space used to store a grammar and its encoding of the input -- that supports learning according to the principle of Minimum Description-Length. This mapping from UGs to learners maintains a minimal ontological commitment: the learner for a particular UG uses only what is already required to account for linguistic competence in adults. I suggest that such learners should be our null hypothesis regarding the child's learning mechanism, and that furthermore, the mapping from theories of UG to learners provides a framework for comparing theories of UG.
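
    The description-length criterion sketched above can be written down directly; the scoring below is a toy illustration under assumed inputs (candidate grammars supplied as storage costs plus probability functions), not the author's model.

        import math

        def description_length(grammar_bits, corpus, prob_of):
            # total cost = bits to store the grammar + bits to encode the data with it
            data_bits = sum(-math.log2(prob_of(s)) for s in corpus)
            return grammar_bits + data_bits

        def best_grammar(candidates, corpus):
            # candidates: iterable of (name, grammar_bits, prob_of) triples
            return min(candidates,
                       key=lambda c: description_length(c[1], corpus, c[2]))

        corpus = ["ab", "ab", "abab"]
        candidates = [
            ("rote list", 40.0, lambda s: {"ab": 0.6, "abab": 0.4}.get(s, 1e-9)),
            ("loop rule", 10.0, lambda s: 0.5 ** (len(s) // 2)),
        ]
        print(best_grammar(candidates, corpus)[0])   # the compact 'loop rule' wins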

  8. Using reefcheck monitoring database to develop the coral reef index of biological integrity

    DEFF Research Database (Denmark)

    Nguyen, Hai Yen T.; Pedersen, Ole; Ikejima, Kou

    2009-01-01

    The coral reef indices of biological integrity were constituted based on the Reef Check monitoring data. Seventy-six minimally disturbed sites and 72 maximally disturbed sites in shallow water, and 39 minimally disturbed sites and 37 maximally disturbed sites in deep water, were classified based...... on the high-end and low-end percentages and ratios of hard coral, dead coral and fleshy algae. A total of 52 candidate metrics was identified and compiled. Eight and four metrics were finally selected to constitute the shallow and deep water coral reef indices, respectively. The rating curve was applied.......05) and coral damaged by other factors -0.283 (p.... The coral reef indices were sensitive responses to stressors and can be used as a coral reef biological monitoring tool....

  9. Minimal Gromov-Witten rings

    International Nuclear Information System (INIS)

    Przyjalkowski, V V

    2008-01-01

    We construct an abstract theory of Gromov-Witten invariants of genus 0 for quantum minimal Fano varieties (a minimal class of varieties which is natural from the quantum cohomological viewpoint). Namely, we consider the minimal Gromov-Witten ring: a commutative algebra whose generators and relations are of the form used in the Gromov-Witten theory of Fano varieties (of unspecified dimension). The Gromov-Witten theory of any quantum minimal variety is a homomorphism from this ring to C. We prove an abstract reconstruction theorem which says that this ring is isomorphic to the free commutative ring generated by 'prime two-pointed invariants'. We also find solutions of the differential equation of type DN for a Fano variety of dimension N in terms of the generating series of one-pointed Gromov-Witten invariants

  10. Swarm robotics and minimalism

    Science.gov (United States)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  11. Supporting cognition in systems biology analysis: findings on users' processes and design implications.

    Science.gov (United States)

    Mirel, Barbara

    2009-02-13

    Current usability studies of bioinformatics tools suggest that tools for exploratory analysis support some tasks related to finding relationships of interest but not the deep causal insights necessary for formulating plausible and credible hypotheses. To better understand design requirements for gaining these causal insights in systems biology analyses, a longitudinal field study of 15 biomedical researchers was conducted. Researchers interacted with the same protein-protein interaction tools to discover possible disease mechanisms for further experimentation. Findings reveal patterns in scientists' exploratory and explanatory analysis and reveal that tools positively supported a number of well-structured query and analysis tasks. But for several of scientists' more complex, higher-order ways of knowing and reasoning, the tools did not offer adequate support. Results show that for a better fit with scientists' cognition for exploratory analysis, systems biology tools need to better match scientists' processes for validating, for making a transition from classification to model-based reasoning, and for engaging in causal mental modelling. As the next great frontier in bioinformatics usability, tool designs for exploratory systems biology analysis need to move beyond the successes already achieved in supporting formulaic query and analysis tasks and now reduce current mismatches with several of scientists' higher-order analytical practices. The implications of results for tool designs are discussed.

  12. Minimal Marking: A Success Story

    Science.gov (United States)

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  13. Minimal families of curves on surfaces

    KAUST Repository

    Lubbes, Niels

    2014-01-01

    A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal

  14. COMPUTATIONAL MODELING AND SIMULATION IN BIOLOGY TEACHING: A MINIMALLY EXPLORED FIELD OF STUDY WITH A LOT OF POTENTIAL

    Directory of Open Access Journals (Sweden)

    Sonia López

    2016-09-01

    Full Text Available This study is part of a research project that aims to characterize the epistemological, psychological and didactic presuppositions of science teachers (Biology, Physics, Chemistry) who implement Computational Modeling and Simulation (CMS) activities as a part of their teaching practice. We present here a synthesis of a literature review on the subject, showing how in the last two decades this form of computer usage for science teaching has boomed in disciplines such as Physics and Chemistry, but to a lesser degree in Biology. Additionally, in the works that dwell on the use of CMS in Biology, we identified a lack of theoretical bases that support their epistemological, psychological and/or didactic postures. Accordingly, this generates significant considerations for the fields of research and teacher instruction in Science Education.

  15. A generic framework for individual-based modelling and physical-biological interaction

    DEFF Research Database (Denmark)

    Christensen, Asbjørn; Mariani, Patrizio; Payne, Mark R.

    2018-01-01

    The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian...... scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions...

  16. Minimal residual disease after surgery of HPV 16-associated tumours as target for immunotherapy

    Czech Academy of Sciences Publication Activity Database

    Bubeník, Jan; Reiniš, Milan; Šímová, Jana

    2006-01-01

    Roč. 18, Supplement 1 (2006), - ISSN 1107-3756. [World Congress on Advances in Oncology /11./ and International Symposium on Molecular Medicine /9./. 12.10.2006-14.10.2006, Hersonissos] R&D Projects: GA MZd(CZ) NR7807; GA ČR(CZ) GA301/04/0492; GA AV ČR(CZ) IAA500520605 Institutional research plan: CEZ:AV0Z50520514 Keywords : minimal residual disease * HPV16 * immunotherapy Subject RIV: EB - Genetics ; Molecular Biology

  17. Semi-phenomenological method for applying microdosimetry in estimating biological response

    International Nuclear Information System (INIS)

    Higgins, P.D.; DeLuca, P.M. Jr.; Pearson, D.W.; Gould, M.N.

    1981-01-01

    A semi-phenomenological approach has been used to estimate cell survival on the basis of microdosimetrically obtained measurements of beam quality, together with determinations of the biological cytotoxic response parameters of V79 Chinese hamster cells. Cells were exposed to a field of minimally ionizing radiation and to fields at least partially comprised of high LET radiation. We show that for widely varying experimental conditions, we can predict, with good reliability, cell survival for any arbitrary known beam quality and with a minimum of biological input

  18. Waste minimization assessment procedure

    International Nuclear Information System (INIS)

    Kellythorne, L.L.

    1993-01-01

    Perry Nuclear Power Plant began developing a waste minimization plan early in 1991. In March of 1991 the plan was documented, following a format similar to that described in the EPA Waste Minimization Opportunity Assessment Manual. Initial implementation involved obtaining management's commitment to support a waste minimization effort. The primary assessment goal was to identify all hazardous waste streams and to evaluate those streams for minimization opportunities. As implementation of the plan proceeded, non-hazardous waste streams routinely generated in large volumes were also evaluated for minimization opportunities. The next step included collection of process and facility data which would be useful in helping the facility accomplish its assessment goals. This paper describes the resources that were used and which were most valuable in identifying both the hazardous and non-hazardous waste streams that existed on site. For each material identified as a waste stream, additional information regarding the material's use, manufacturer, EPA hazardous waste number and DOT hazard class was also gathered. Waste streams were then evaluated for potential source reduction, recycling, re-use, re-sale, or burning for heat recovery, with disposal as the last viable alternative.

  19. Westinghouse Hanford Company waste minimization actions

    International Nuclear Information System (INIS)

    Greenhalgh, W.O.

    1988-09-01

    Companies that generate hazardous waste materials are now required by national regulations to establish a waste minimization program. Accordingly, in FY88 the Westinghouse Hanford Company formed a waste minimization team organization. The purpose of the team is to assist the company in its efforts to minimize the generation of waste, train personnel on waste minimization techniques, document successful waste minimization effects, track dollar savings realized, and to publicize and administer an employee incentive program. A number of significant actions have been successful, resulting in the savings of materials and dollars. The team itself has been successful in establishing some worthwhile minimization projects. This document briefly describes the waste minimization actions that have been successful to date. 2 refs., 26 figs., 3 tabs

  20. The Role of Synthetic Biology in NASA's Missions

    Science.gov (United States)

    Rothschild, Lynn J.

    2016-01-01

    The time has come for NASA to exploit synthetic biology in pursuit of its missions, including aeronautics, earth science, astrobiology and, most notably, human exploration. Conversely, NASA advances the fundamental technology of synthetic biology as no one else can because of its unique expertise in the origin of life and life in extreme environments, including the potential for alternate life forms. This enables unique, creative "game changing" advances. NASA's requirement for minimizing upmass in flight will also drive the field toward miniaturization and automation. These drivers will greatly increase the utility of synthetic biology solutions for military, remote-area health, and commercial purposes. To this end, we have begun a program at NASA to explore the use of synthetic biology in NASA's missions, in particular space exploration. As part of this program, we began hosting an iGEM team of undergraduates drawn from Brown and Stanford Universities to conduct synthetic biology research at NASA Ames Research Center. The 2011 team (http://2011.igem.org/Team:Brown-Stanford) produced an award-winning project on using synthetic biology as a basis for a human Mars settlement.

  1. Phthalates impact human health: Epidemiological evidences and plausible mechanism of action.

    Science.gov (United States)

    Benjamin, Sailas; Masai, Eiji; Kamimura, Naofumi; Takahashi, Kenji; Anderson, Robin C; Faisal, Panichikkal Abdul

    2017-10-15

    Disregarding the rising alarm on the hazardous nature of various phthalates and their metabolites, ruthless usage of phthalates as plasticizers in plastics and as additives in innumerable consumer products continues due to their low cost, attractive properties, and lack of suitable alternatives. Globally, in silico computational, in vitro mechanistic, in vivo preclinical and limited clinical or epidemiological human studies showed that over a dozen phthalates and their metabolites ingested passively by man from the general environment, foods, drinks, breathing air, and routine household products cause various dysfunctions. Thus, this review addresses the health hazards posed by phthalates on children and adolescents, epigenetic modulation, reproductive toxicity in women and men; insulin resistance and type II diabetes; overweight and obesity, skeletal anomalies, allergy and asthma, cancer, etc., coupled with a description of the major phthalates and their general uses, phthalate exposure routes, biomonitoring and risk assessment, a special account of endocrine disruption, and finally a plausible molecular cross-talk with a unique mechanism of action. This clinically focused comprehensive review on the hazards of phthalates would benefit the general population, academia, scientists, clinicians, environmentalists, and law or policy makers in deciding whether the usage of phthalates should continue swiftly without sufficient deceleration, be regulated by law, or be phased out from earth forever. Copyright © 2017. Published by Elsevier B.V.

  2. Minimizing tip-sample forces in jumping mode atomic force microscopy in liquid

    Energy Technology Data Exchange (ETDEWEB)

    Ortega-Esteban, A. [Departamento de Fisica de la Materia Condensada, C-3, Universidad Autonoma de Madrid, Cantoblanco, 28049 Madrid (Spain); Horcas, I. [Nanotec Electronica S.L., Centro Empresarial Euronova 3, Ronda de Poniente 12, 28760 Tres Cantos, Madrid (Spain); Hernando-Perez, M. [Departamento de Fisica de la Materia Condensada, C-3, Universidad Autonoma de Madrid, Cantoblanco, 28049 Madrid (Spain); Ares, P. [Nanotec Electronica S.L., Centro Empresarial Euronova 3, Ronda de Poniente 12, 28760 Tres Cantos, Madrid (Spain); Perez-Berna, A.J.; San Martin, C.; Carrascosa, J.L. [Centro Nacional de Biotecnologia (CNB-CSIC), Darwin 3, 28049 Madrid (Spain); Pablo, P.J. de [Departamento de Fisica de la Materia Condensada, C-3, Universidad Autonoma de Madrid, Cantoblanco, 28049 Madrid (Spain); Gomez-Herrero, J., E-mail: julio.gomez@uam.es [Departamento de Fisica de la Materia Condensada, C-3, Universidad Autonoma de Madrid, Cantoblanco, 28049 Madrid (Spain)

    2012-03-15

    Control and minimization of tip-sample interaction forces are imperative tasks to maximize the performance of atomic force microscopy. In particular, when imaging soft biological matter in liquids, the cantilever dragging force prevents identification of the tip-sample mechanical contact, resulting in deleterious interaction with the specimen. In this work we present an improved jumping mode procedure that allows detecting the tip-sample contact with high accuracy, thus minimizing the scanning forces (~100 pN) during the approach cycles. To illustrate this method we report images of human adenovirus and T7 bacteriophage particles which are prone to uncontrolled modifications when using conventional jumping mode. Highlights: Improvement in atomic force microscopy in buffer solution. Peak force detection. Subtracting the cantilever dragging force. Forces in the 100 pN range. Imaging of delicate viruses with atomic force microscopy.

  3. Minimal but non-minimal inflation and electroweak symmetry breaking

    Energy Technology Data Exchange (ETDEWEB)

    Marzola, Luca [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia); Institute of Physics, University of Tartu,Ravila 14c, 50411 Tartu (Estonia); Racioppi, Antonio [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia)

    2016-10-07

    We consider the most minimal scale invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet, that plays the rôle of the inflaton, and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10⁻³, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≃ 0.97 which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  4. Software Replica of Minimal Living Processes

    Science.gov (United States)

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulations in theoretical biology to complement pure analytical mathematics, which is often limited in reproducing and understanding the self-organization phenomena resulting from the non-linear and spatially grounded interactions of the huge number of diverse biological objects. Since John Von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect a lot of materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only software development and execution make it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists.

  5. Mixed-Methods Design in Biology Education Research: Approach and Uses

    Science.gov (United States)

    Warfa, Abdi-Rizak M.

    2016-01-01

    Educational research often requires mixing different research methodologies to strengthen findings, better contextualize or explain results, or minimize the weaknesses of a single method. This article provides practical guidelines on how to conduct such research in biology education, with a focus on mixed-methods research (MMR) that uses both…

  6. The Protein Cost of Metabolic Fluxes: Prediction from Enzymatic Rate Laws and Cost Minimization.

    Directory of Open Access Journals (Sweden)

    Elad Noor

    2016-11-01

    Full Text Available Bacterial growth depends crucially on metabolic fluxes, which are limited by the cell's capacity to maintain metabolic enzymes. The necessary enzyme amount per unit flux is a major determinant of metabolic strategies both in evolution and bioengineering. It depends on enzyme parameters (such as kcat and KM constants), but also on metabolite concentrations. Moreover, similar amounts of different enzymes might incur different costs for the cell, depending on enzyme-specific properties such as protein size and half-life. Here, we developed enzyme cost minimization (ECM), a scalable method for computing enzyme amounts that support a given metabolic flux at a minimal protein cost. The complex interplay of enzyme and metabolite concentrations, e.g. through thermodynamic driving forces and enzyme saturation, would make it hard to solve this optimization problem directly. By treating enzyme cost as a function of metabolite levels, we formulated ECM as a numerically tractable, convex optimization problem. Its tiered approach allows for building models at different levels of detail, depending on the amount of available data. Validating our method with measured metabolite and protein levels in E. coli central metabolism, we found typical prediction fold errors of 4.1 and 2.6, respectively, for the two kinds of data. This result from the cost-optimized metabolic state is significantly better than randomly sampled metabolite profiles, supporting the hypothesis that enzyme cost is important for the fitness of E. coli. ECM can be used to predict enzyme levels and protein cost in natural and engineered pathways, and could be a valuable computational tool to assist metabolic engineering projects. Furthermore, it establishes a direct connection between protein cost and thermodynamics, and provides a physically plausible and computationally tractable way to include enzyme kinetics into constraint-based metabolic models, where kinetics have usually been ignored or
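
    A stripped-down version of the idea (one irreversible Michaelis-Menten step per reaction, no thermodynamic driving forces) shows how enzyme demand becomes a function of metabolite levels that can be minimized numerically; parameter values and names are illustrative, not taken from the paper.

        import numpy as np
        from scipy.optimize import minimize

        # (flux, kcat [1/s], KM [mM], index of the substrate metabolite)
        reactions = [(1.0, 50.0, 0.1, 0),
                     (1.0, 20.0, 1.0, 1)]

        def enzyme_cost(log_c):
            c = np.exp(log_c)                        # optimize in log-space
            cost = 0.0
            for v, kcat, km, i in reactions:
                saturation = c[i] / (km + c[i])      # Michaelis-Menten occupancy
                cost += v / (kcat * saturation)      # enzyme needed to carry flux v
            return cost

        # keep metabolites within rough physiological bounds (1 uM to 10 mM)
        bounds = [(np.log(1e-3), np.log(10.0))] * 2
        result = minimize(enzyme_cost, x0=np.zeros(2), bounds=bounds)
        print(np.exp(result.x), result.fun)

    In this toy case the optimum simply pushes both metabolites to their upper bounds; the published method becomes non-trivial because metabolites are shared between reactions as substrates and products, and thermodynamic constraints penalize concentrations that are too high.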

  7. Mindfulness and Cardiovascular Disease Risk: State of the Evidence, Plausible Mechanisms, and Theoretical Framework

    Science.gov (United States)

    Schuman-Olivier, Zev; Britton, Willoughby B.; Fresco, David M.; Desbordes, Gaelle; Brewer, Judson A.; Fulwiler, Carl

    2016-01-01

    The purpose of this review is to provide (1) a synopsis on relations of mindfulness with cardiovascular disease (CVD) and major CVD risk factors, and (2) an initial consensus-based overview of mechanisms and theoretical framework by which mindfulness might influence CVD. Initial evidence, often of limited methodological quality, suggests possible impacts of mindfulness on CVD risk factors including physical activity, smoking, diet, obesity, blood pressure, and diabetes regulation. Plausible mechanisms include (1) improved attention control (e.g., ability to hold attention on experiences related to CVD risk, such as smoking, diet, physical activity, and medication adherence), (2) emotion regulation (e.g., improved stress response, self-efficacy, and skills to manage craving for cigarettes, palatable foods, and sedentary activities), and (3) self-awareness (e.g., self-referential processing and awareness of physical sensations due to CVD risk factors). Understanding mechanisms and theoretical framework should improve etiologic knowledge, providing customized mindfulness intervention targets that could enable greater mindfulness intervention efficacy. PMID:26482755

  8. Reciprocity-based reasons for benefiting research participants: most fail, the most plausible is problematic.

    Science.gov (United States)

    Sofaer, Neema

    2014-11-01

    A common reason for giving research participants post-trial access (PTA) to the trial intervention appeals to reciprocity, the principle, stated most generally, that if one person benefits a second, the second should reciprocate: benefit the first in return. Many authors consider it obvious that reciprocity supports PTA. Yet their reciprocity principles differ, with many authors apparently unaware of alternative versions. This article is the first to gather the range of reciprocity principles. It finds that: (1) most are false. (2) The most plausible principle, which is also problematic, applies only when participants experience significant net risks or burdens. (3) Seldom does reciprocity support PTA for participants or give researchers stronger reason to benefit participants than equally needy non-participants. (4) Reciprocity fails to explain the common view that it is bad when participants in a successful trial have benefited from the trial intervention but lack PTA to it. © 2013 John Wiley & Sons Ltd.

  9. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail

    2011-10-30

    A Laguerre minimal surface is an immersed surface in ℝ³ that is an extremal of the functional ∫(H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces R(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ) + λ(sin φ, cos φ, 0), where A, B, C, D ∈ ℝ are fixed. To achieve invariance under Laguerre transformations, we also derive all Laguerre minimal surfaces that are enveloped by a family of cones. The methodology is based on the isotropic model of Laguerre geometry. In this model a Laguerre minimal surface enveloped by a family of cones corresponds to a graph of a biharmonic function carrying a family of isotropic circles. We classify such functions by showing that the top view of the family of circles is a pencil. © 2011 Springer-Verlag.

  10. Global Analysis of Minimal Surfaces

    CERN Document Server

    Dierkes, Ulrich; Tromba, Anthony J

    2010-01-01

    Many properties of minimal surfaces are of a global nature, and this is already true for the results treated in the first two volumes of the treatise. Part I of the present book can be viewed as an extension of these results. For instance, the first two chapters deal with existence, regularity and uniqueness theorems for minimal surfaces with partially free boundaries. Here one of the main features is the possibility of 'edge-crawling' along free parts of the boundary. The third chapter deals with a priori estimates for minimal surfaces in higher dimensions and for minimizers of singular integ

  11. Minimal Surfaces for Hitchin Representations

    DEFF Research Database (Denmark)

    Li, Qiongling; Dai, Song

    2018-01-01

    In this paper, we investigate the properties of immersed minimal surfaces inside symmetric space associated to a subloci of Hitchin component: $q_n$ and $q_{n-1}$ case. First, we show that the pullback metric of the minimal surface dominates a constant multiple of the hyperbolic metric in the same conformal...... class and has a strong rigidity property. Secondly, we show that the immersed minimal surface is never tangential to any flat inside the symmetric space. As a direct corollary, the pullback metric of the minimal surface is always strictly negatively curved. In the end, we find a fully decoupled system...

  12. Minimal Webs in Riemannian Manifolds

    DEFF Research Database (Denmark)

    Markvorsen, Steen

    2008-01-01

    For a given combinatorial graph $G$ a {\it geometrization} $(G, g)$ of the graph is obtained by considering each edge of the graph as a $1$-dimensional manifold with an associated metric $g$. In this paper we are concerned with {\it minimal isometric immersions} of geometrized graphs $(G, g......)$ into Riemannian manifolds $(N^{n}, h)$. Such immersions we call {\em minimal webs}. They admit a natural 'geometric' extension of the intrinsic combinatorial discrete Laplacian. The geometric Laplacian on minimal webs enjoys standard properties such as the maximum principle and the divergence theorems, which...... are of instrumental importance for the applications. We apply these properties to show that minimal webs in ambient Riemannian spaces share several analytic and geometric properties with their smooth (minimal submanifold) counterparts in such spaces. In particular we use appropriate versions of the divergence...

  13. Waste minimization handbook, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  14. Waste minimization handbook, Volume 1

    International Nuclear Information System (INIS)

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996

  15. Systems biology definition of the core proteome of metabolism and expression is consistent with high-throughput data

    DEFF Research Database (Denmark)

    Yang, Laurence; Tan, Justin; O'Brien, Edward J.

    2015-01-01

    Finding the minimal set of gene functions needed to sustain life is of both fundamental and practical importance. Minimal gene lists have been proposed by using comparative genomics-based core proteome definitions. A definition of a core proteome that is supported by empirical data, is understood at the systems level, and provides a basis for computing essential cell functions is lacking. Here, we use a systems biology-based genome-scale model of metabolism and expression to define a functional core proteome consisting of 356 gene products, accounting for 44% of the Escherichia coli proteome by mass, based on proteomics data. This systems biology core proteome includes 212 genes not found in previous comparative genomics-based core proteome definitions, accounts for 65% of known essential genes in E. coli, and has 78% gene function overlap with minimal genomes (Buchnera aphidicola and Mycoplasma...

  16. Exposures to airborne particulate matter and adverse perinatal outcomes: a biologically plausible mechanistic framework for exploring potential

    Directory of Open Access Journals (Sweden)

    Srimathi Kannan

    2007-12-01

    Full Text Available This article has three objectives: to describe the biologically plausible mechanistic pathways by which exposure to particulate matter (PM) may lead to adverse perinatal outcomes of low birth weight (LBW), intrauterine growth retardation (IUGR), and preterm delivery (PTD); review evidence showing that nutrition affects biologic pathways; and explain mechanisms by which nutrition may modify the impact of PM exposure on perinatal outcomes. We propose an interdisciplinary framework that brings together maternal and infant nutrition, air pollution exposure assessment, and cardiopulmonary and perinatal epidemiology. Five possible biologic mechanisms have been put forth in the emerging environmental sciences literature and provide corollaries for the proposed framework. The literature indicates that the effects of PM on LBW, PTD, and IUGR may manifest through the cardiovascular mechanisms of oxidative stress, inflammation, coagulation, endothelial function, and hemodynamic responses. PM exposure studies relating mechanistic pathways to perinatal outcomes should consider the likelihood that biologic responses and adverse birth outcomes may be derived from both PM and non-PM sources. We present strategies for empirically testing the proposed model and developing future research efforts.

  17. The biological effectiveness of antiproton irradiation

    International Nuclear Information System (INIS)

    Holzscheiter, Michael H.; Bassler, Niels; Agazaryan, Nzhde; Beyer, Gerd; Blackmore, Ewart; DeMarco, John J.; Doser, Michael; Durand, Ralph E.; Hartley, Oliver; Iwamoto, Keisuke S.; Knudsen, Helge V.; Landua, Rolf; Maggiore, Carl; McBride, William H.; Moller, Soren Pape; Petersen, Jorgen; Skarsgard, Lloyd D.; Smathers, James B.; Solberg, Timothy D.; Uggerhoj, Ulrik I.; Vranjes, Sanja; Withers, H. Rodney; Wong, Michelle; Wouters, Bradly G.

    2006-01-01

    Background and purpose: Antiprotons travel through tissue in a manner similar to that for protons until they reach the end of their range where they annihilate and deposit additional energy. This makes them potentially interesting for radiotherapy. The aim of this study was to conduct the first ever measurements of the biological effectiveness of antiprotons. Materials and methods: V79 cells were suspended in a semi-solid matrix and irradiated with 46.7 MeV antiprotons, 48 MeV protons, or 60Co γ-rays. Clonogenic survival was determined as a function of depth along the particle beams. Dose and particle fluence response relationships were constructed from data in the plateau and Bragg peak regions of the beams and used to assess the biological effectiveness. Results: Due to uncertainties in antiproton dosimetry we defined a new term, called the biologically effective dose ratio (BEDR), which compares the response in a minimally spread out Bragg peak (SOBP) to that in the plateau as a function of particle fluence. This value was ∼3.75 times larger for antiprotons than for protons. This increase arises due to the increased dose deposited in the Bragg peak by annihilation and because this dose has a higher relative biological effectiveness (RBE). Conclusion: We have produced the first measurements of the biological consequences of antiproton irradiation. These data substantiate theoretical predictions of the biological effects of antiproton annihilation within the Bragg peak, and suggest antiprotons warrant further investigation.

  18. Non-specific effects of vaccines: plausible and potentially important, but implications uncertain.

    Science.gov (United States)

    Pollard, Andrew J; Finn, Adam; Curtis, Nigel

    2017-11-01

    Non-specific effects (NSE) or heterologous effects of vaccines are proposed to explain observations in some studies that certain vaccines have an impact beyond the direct protection against infection with the specific pathogen for which the vaccines were designed. The importance and implications of such effects remain controversial. There are several known immunological mechanisms which could lead to NSE, since it is widely recognised that the generation of specific immunity is initiated by non-specific innate immune mechanisms that may also have wider effects on adaptive immune function. However, there are no published studies that demonstrate a mechanistic link between such immunological phenomena and clinically relevant NSE in humans. While it is highly plausible that some vaccines do have NSE, their magnitude and duration, and thus importance, remain uncertain. Although the WHO recently concluded that current evidence does not justify changes to immunisation policy, further studies of sufficient size and quality are needed to assess the importance of NSE for all-cause mortality. This could provide insights into vaccine immunobiology with important implications for infant health and survival. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Siting Samplers to Minimize Expected Time to Detection

    Energy Technology Data Exchange (ETDEWEB)

    Walter, Travis; Lorenzetti, David M.; Sohn, Michael D.

    2012-05-02

    We present a probabilistic approach to designing an indoor sampler network for detecting an accidental or intentional chemical or biological release, and demonstrate it for a real building. In an earlier paper, Sohn and Lorenzetti (1) developed a proof-of-concept algorithm that assumed samplers could return measurements only slowly (on the order of hours). This led to optimal detect-to-treat architectures, which maximize the probability of detecting a release. This paper develops a more general approach, and applies it to samplers that can return measurements relatively quickly (in minutes). This leads to optimal detect-to-warn architectures, which minimize the expected time to detection. Using a model of a real, large, commercial building, we demonstrate the approach by optimizing networks against uncertain release locations, source terms, and sampler characteristics. Finally, we speculate on rules of thumb for general sampler placement.
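
    The following sketch conveys the general flavour of a detect-to-warn design problem: given a hypothetical matrix of detection times per release scenario and candidate location, it greedily picks sampler sites that minimize the expected time to first detection. The random data, equal scenario weights and greedy heuristic are illustrative assumptions, not the authors' algorithm or building model.

```python
# Greedy detect-to-warn sampler siting sketch over hypothetical scenarios.
import numpy as np

rng = np.random.default_rng(0)
n_scenarios, n_locations = 200, 30
# t[i, j] = time (minutes) until a sampler at location j detects scenario i
t = rng.uniform(5, 120, size=(n_scenarios, n_locations))
p = np.full(n_scenarios, 1.0 / n_scenarios)   # scenario probabilities

def expected_detection_time(chosen):
    # first detection per scenario = minimum over the chosen sampler locations
    return float(p @ t[:, chosen].min(axis=1))

def greedy_placement(n_samplers):
    chosen = []
    for _ in range(n_samplers):
        remaining = [j for j in range(n_locations) if j not in chosen]
        best = min(remaining, key=lambda j: expected_detection_time(chosen + [j]))
        chosen.append(best)
    return chosen

sites = greedy_placement(4)
print("chosen sampler locations:", sites)
print("expected time to detection (min):", expected_detection_time(sites))
```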

  20. Biological applications of phase-contrast electron microscopy.

    Science.gov (United States)

    Nagayama, Kuniaki

    2014-01-01

    Here, I review the principles and applications of phase-contrast electron microscopy using phase plates. First, I develop the principle of phase contrast based on a minimal model of microscopy, introducing a double Fourier-transform process to mathematically formulate the image formation. Next, I explain four phase-contrast (PC) schemes, defocus PC, Zernike PC, Hilbert differential contrast, and schlieren optics, as image-filtering processes in the context of the minimal model, with particular emphases on the Zernike PC and corresponding Zernike phase plates. Finally, I review applications of Zernike PC cryo-electron microscopy to biological systems such as protein molecules, virus particles, and cells, including single-particle analysis to delineate three-dimensional (3D) structures of protein and virus particles and cryo-electron tomography to reconstruct 3D images of complex protein systems and cells.
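
    The double Fourier-transform picture of Zernike phase contrast can be sketched numerically: transform the exit wave, shift the phase of the scattered components by π/2 relative to the unscattered beam (the role of the phase plate), and transform back to obtain the image. The weak phase object and the idealized single-pixel "central beam" below are illustrative assumptions, not a model of a real phase plate or microscope.

```python
# Numerical sketch of Zernike phase contrast as a Fourier-space filter.
import numpy as np

N = 256
x, y = np.meshgrid(np.linspace(-1, 1, N), np.linspace(-1, 1, N))
phase_object = 0.2 * np.exp(-(x**2 + y**2) / 0.05)   # weak phase specimen
exit_wave = np.exp(1j * phase_object)

F = np.fft.fftshift(np.fft.fft2(exit_wave))           # first Fourier transform
plate = 1j * np.ones((N, N), dtype=complex)           # pi/2 shift for scattered beams
plate[N // 2, N // 2] = 1.0                           # unscattered (central) beam unshifted
image_wave = np.fft.ifft2(np.fft.ifftshift(F * plate))  # second (inverse) transform

zernike_image = np.abs(image_wave) ** 2
image_without_plate = np.abs(exit_wave) ** 2          # ~no contrast for a pure phase object
print("contrast without plate:", image_without_plate.std())
print("contrast with Zernike plate:", zernike_image.std())
```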

  1. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail; Pottmann, Helmut; Grohs, Philipp

    2011-01-01

    A Laguerre minimal surface is an immersed surface in ℝ³ that is an extremal of the functional ∫(H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces R(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ

  2. Manipulation of biological samples using micro and nano techniques.

    Science.gov (United States)

    Castillo, Jaime; Dimaki, Maria; Svendsen, Winnie Edith

    2009-01-01

    The constant interest in handling, integrating and understanding biological systems of interest for the biomedical field, the pharmaceutical industry and biomaterial researchers demands the use of techniques that allow the manipulation of biological samples causing minimal or no damage to their natural structure. Thanks to the advances in micro- and nanofabrication during the last decades, several manipulation techniques offer us the possibility to image, characterize and manipulate biological material in a controlled way. Using these techniques the integration of biomaterials with remarkable properties with physical transducers has been possible, giving rise to new and highly sensitive biosensing devices. This article reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), by grabbing them and moving them to a new position (3-D manipulation), or by manipulating and relocating them applying external forces. The advantages and drawbacks are mentioned together with examples that reflect the state of the art of manipulation techniques for biological samples (171 references).

  3. Anticipating and Communicating Plausible Environmental and Health Concerns Associated with Future Disasters: The ShakeOut and ARkStorm Scenarios as Examples

    Science.gov (United States)

    Plumlee, G. S.; Morman, S. A.; Alpers, C. N.; Hoefen, T. M.; Meeker, G. P.

    2010-12-01

    Disasters commonly pose immediate threats to human safety, but can also produce hazardous materials (HM) that pose short- and long-term environmental-health threats. The U.S. Geological Survey (USGS) has helped assess potential environmental health characteristics of HM produced by various natural and anthropogenic disasters, such as the 2001 World Trade Center collapse, 2005 hurricanes Katrina and Rita, 2007-2009 southern California wildfires, various volcanic eruptions, and others. Building upon experience gained from these responses, we are now developing methods to anticipate plausible environmental and health implications of the 2008 Great Southern California ShakeOut scenario (which modeled the impacts of a 7.8 magnitude earthquake on the southern San Andreas fault, http://urbanearth.gps.caltech.edu/scenario08/), and the recent ARkStorm scenario (modeling the impacts of a major, weeks-long winter storm hitting nearly all of California, http://urbanearth.gps.caltech.edu/winter-storm/). Environmental-health impacts of various past earthquakes and extreme storms are first used to identify plausible impacts that could be associated with the disaster scenarios. Substantial insights can then be gleaned using a Geographic Information Systems (GIS) approach to link ShakeOut and ARkStorm effects maps with data extracted from diverse database sources containing geologic, hazards, and environmental information. This type of analysis helps constrain where potential geogenic (natural) and anthropogenic sources of HM (and their likely types of contaminants or pathogens) fall within areas of predicted ShakeOut-related shaking, firestorms, and landslides, and predicted ARkStorm-related precipitation, flooding, and winds. Because of uncertainties in the event models and many uncertainties in the databases used (e.g., incorrect location information, lack of detailed information on specific facilities, etc.) this approach should only be considered as the first of multiple steps

  4. Minimal experimental requirements for definition of extracellular vesicles and their functions: a position statement from the International Society for Extracellular Vesicles.

    Science.gov (United States)

    Lötvall, Jan; Hill, Andrew F; Hochberg, Fred; Buzás, Edit I; Di Vizio, Dolores; Gardiner, Christopher; Gho, Yong Song; Kurochkin, Igor V; Mathivanan, Suresh; Quesenberry, Peter; Sahoo, Susmita; Tahara, Hidetoshi; Wauben, Marca H; Witwer, Kenneth W; Théry, Clotilde

    2014-01-01

    Secreted membrane-enclosed vesicles, collectively called extracellular vesicles (EVs), which include exosomes, ectosomes, microvesicles, microparticles, apoptotic bodies and other EV subsets, encompass a very rapidly growing scientific field in biology and medicine. Importantly, it is currently technically challenging to obtain a totally pure EV fraction free from non-vesicular components for functional studies, and therefore there is a need to establish guidelines for analyses of these vesicles and reporting of scientific studies on EV biology. Here, the International Society for Extracellular Vesicles (ISEV) provides researchers with a minimal set of biochemical, biophysical and functional standards that should be used to attribute any specific biological cargo or functions to EVs.

  5. Y-12 Plant waste minimization strategy

    International Nuclear Information System (INIS)

    Kane, M.A.

    1987-01-01

    The 1984 Amendments to the Resource Conservation and Recovery Act (RCRA) mandate that waste minimization be a major element of hazardous waste management. In response to this mandate and the increasing costs for waste treatment, storage, and disposal, the Oak Ridge Y-12 Plant developed a waste minimization program to encompass all types of wastes. Thus, waste minimization has become an integral part of the overall waste management program. Unlike traditional approaches, waste minimization focuses on controlling waste at the beginning of production instead of the end. This approach includes: (1) substituting nonhazardous process materials for hazardous ones, (2) recycling or reusing waste effluents, (3) segregating nonhazardous waste from hazardous and radioactive waste, and (4) modifying processes to generate less waste or less toxic waste. An effective waste minimization program must provide the appropriate incentives for generators to reduce their waste and provide the necessary support mechanisms to identify opportunities for waste minimization. This presentation focuses on the Y-12 Plant's strategy to implement a comprehensive waste minimization program. This approach consists of four major program elements: (1) promotional campaign, (2) process evaluation for waste minimization opportunities, (3) waste generation tracking system, and (4) information exchange network. The presentation also examines some of the accomplishments of the program and issues which need to be resolved

  6. Minimal open strings

    International Nuclear Information System (INIS)

    Hosomichi, Kazuo

    2008-01-01

    We study FZZT-branes and open string amplitudes in (p, q) minimal string theory. We focus on the simplest boundary changing operators in two-matrix models, and identify the corresponding operators in worldsheet theory through the comparison of amplitudes. Along the way, we find a novel linear relation among FZZT boundary states in minimal string theory. We also show that the boundary ground ring is realized on physical open string operators in a very simple manner, and discuss its use for perturbative computation of higher open string amplitudes.

  7. Minimal Composite Inflation

    DEFF Research Database (Denmark)

    Channuie, Phongpichit; Jark Joergensen, Jakob; Sannino, Francesco

    2011-01-01

    We investigate models in which the inflaton emerges as a composite field of a four dimensional, strongly interacting and nonsupersymmetric gauge theory featuring purely fermionic matter. We show that it is possible to obtain successful inflation via non-minimal coupling to gravity, and that the u...

  8. International Conference on Medical and Biological Engineering 2017

    CERN Document Server

    2017-01-01

    This volume presents the proceedings of the International Conference on Medical and Biological Engineering held from 16 to 18 March 2017 in Sarajevo, Bosnia and Herzegovina. Focusing on the theme of ‘Pursuing innovation. Shaping the future’, it highlights the latest advancements in Biomedical Engineering and also presents the latest findings, innovative solutions and emerging challenges in this field. Topics include: - Biomedical Signal Processing - Biomedical Imaging and Image Processing - Biosensors and Bioinstrumentation - Bio-Micro/Nano Technologies - Biomaterials - Biomechanics, Robotics and Minimally Invasive Surgery - Cardiovascular, Respiratory and Endocrine Systems Engineering - Neural and Rehabilitation Engineering - Molecular, Cellular and Tissue Engineering - Bioinformatics and Computational Biology - Clinical Engineering and Health Technology Assessment - Health Informatics, E-Health and Telemedicine - Biomedical Engineering Education - Pharmaceutical Engineering.

  9. Oversight of High-Containment Biological Laboratories: Issues for Congress

    Science.gov (United States)

    2009-05-04

    Industry and Non-Profit Laboratories: Private sector companies and non-profit...resources for these endeavors. Whether public or private sector, high-containment laboratories are planned and designed to minimize the possibility of... equine encephalitis, and yellow fever. Some of the pathogens that cause these diseases have been considered as biological weapons. Expanding the number

  10. Minimal abdominal incisions

    Directory of Open Access Journals (Sweden)

    João Carlos Magi

    2017-04-01

    Full Text Available Minimally invasive procedures aim to resolve the disease with minimal trauma to the body, resulting in a rapid return to activities and in reductions of infection, complications, costs and pain. Minimally incised laparotomy, sometimes referred to as minilaparotomy, is an example of such minimally invasive procedures. The aim of this study is to demonstrate the feasibility and utility of laparotomy with minimal incision based on the literature and exemplifying with a case. The case in question describes reconstruction of the intestinal transit with the use of this incision. Male, young, HIV-positive patient in a late postoperative of ileotiflectomy, terminal ileostomy and closing of the ascending colon by an acute perforating abdomen, due to ileocolonic tuberculosis. The barium enema showed a proximal stump of the right colon near the ileostomy. The access to the cavity was made through the orifice resulting from the release of the stoma, with a lateral-lateral ileo-colonic anastomosis with a 25 mm circular stapler and manual closure of the ileal stump. These surgeries require their own tactics, such as rigor in the lysis of adhesions, tissue traction, and hemostasis, in addition to requiring surgeon dexterity – but without the need for investments in technology; moreover, the learning curve is reported as being lower than that for videolaparoscopy. Laparotomy with minimal incision should be considered as a valid and viable option in the treatment of surgical conditions.

  11. Philosophy of biology: naturalistic or transcendental?

    Science.gov (United States)

    Kolen, Filip; Van de Vijver, Gertrudis

    2007-01-01

    The aim of this article is to clarify the meaning of a naturalistic position within philosophy of biology, against the background of an alternative view, founded on the basic insights of transcendental philosophy. It is argued that the apparently minimal and neutral constraints naturalism imposes on philosophy of science turn out to involve a quite heavily constraining metaphysics, due to naturalism's fundamental neglect of its own perspective. Because of its intrinsic sensitivity to perspectivity and historicity, transcendental philosophy can avoid this type of hidden metaphysics.

  12. Mixed-Methods Design in Biology Education Research: Approach and Uses

    Science.gov (United States)

    Warfa, Abdi-Rizak M.

    2016-01-01

    Educational research often requires mixing different research methodologies to strengthen findings, better contextualize or explain results, or minimize the weaknesses of a single method. This article provides practical guidelines on how to conduct such research in biology education, with a focus on mixed-methods research (MMR) that uses both quantitative and qualitative inquiries. Specifically, the paper provides an overview of mixed-methods design typologies most relevant in biology education research. It also discusses common methodological issues that may arise in mixed-methods studies and ways to address them. The paper concludes with recommendations on how to report and write about MMR. PMID:27856556

  13. Quantitative global sensitivity analysis of a biologically based dose-response pregnancy model for the thyroid endocrine system.

    Science.gov (United States)

    Lumen, Annie; McNally, Kevin; George, Nysia; Fisher, Jeffrey W; Loizou, George D

    2015-01-01

    A deterministic biologically based dose-response model for the thyroidal system in a near-term pregnant woman and the fetus was recently developed to evaluate quantitatively thyroid hormone perturbations. The current work focuses on conducting a quantitative global sensitivity analysis on this complex model to identify and characterize the sources and contributions of uncertainties in the predicted model output. The workflow and methodologies suitable for computationally expensive models, such as the Morris screening method and Gaussian Emulation processes, were used for the implementation of the global sensitivity analysis. Sensitivity indices, such as main, total and interaction effects, were computed for a screened set of the total thyroidal system descriptive model input parameters. Furthermore, a narrower sub-set of the most influential parameters affecting the model output of maternal thyroid hormone levels were identified in addition to the characterization of their overall and pair-wise parameter interaction quotients. The characteristic trends of influence in model output for each of these individual model input parameters over their plausible ranges were elucidated using Gaussian Emulation processes. Through global sensitivity analysis we have gained a better understanding of the model behavior and performance beyond the domains of observation by the simultaneous variation in model inputs over their range of plausible uncertainties. The sensitivity analysis helped identify parameters that determine the driving mechanisms of the maternal and fetal iodide kinetics, thyroid function and their interactions, and contributed to an improved understanding of the system modeled. We have thus demonstrated the use and application of global sensitivity analysis for a biologically based dose-response model for sensitive life-stages such as pregnancy that provides richer information on the model and the thyroidal system modeled compared to local sensitivity analysis.
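
    For readers unfamiliar with Morris screening, the sketch below applies it to a toy stand-in model using the SALib Python package (assuming it is installed). The model function, parameter names and ranges are hypothetical and are not the actual thyroid model inputs; the sketch only shows the sample-evaluate-analyze workflow behind elementary-effects screening.

```python
# Morris screening sketch on a hypothetical dose-response-like toy model.
import numpy as np
from SALib.sample import morris as morris_sample
from SALib.analyze import morris as morris_analyze

problem = {
    "num_vars": 3,
    "names": ["iodide_intake", "clearance_rate", "binding_capacity"],  # hypothetical
    "bounds": [[50.0, 250.0], [0.1, 1.0], [0.5, 2.0]],
}

def toy_hormone_level(x):
    intake, clearance, binding = x
    return (intake * binding) / (clearance * 100.0 + intake * 0.1)

X = morris_sample.sample(problem, N=100, num_levels=4)        # trajectories
Y = np.apply_along_axis(toy_hormone_level, 1, X)              # model evaluations
Si = morris_analyze.analyze(problem, X, Y, num_levels=4)      # elementary effects
for name, mu_star, sigma in zip(problem["names"], Si["mu_star"], Si["sigma"]):
    print(f"{name}: mu* = {mu_star:.3f}, sigma = {sigma:.3f}")
```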

  14. Quantitative global sensitivity analysis of a biologically based dose-response pregnancy model for the thyroid endocrine system

    Directory of Open Access Journals (Sweden)

    Annie eLumen

    2015-05-01

    Full Text Available A deterministic biologically based dose-response model for the thyroidal system in a near-term pregnant woman and the fetus was recently developed to evaluate quantitatively thyroid hormone perturbations. The current work focuses on conducting a quantitative global sensitivity analysis on this complex model to identify and characterize the sources and contributions of uncertainties in the predicted model output. The workflow and methodologies suitable for computationally expensive models, such as the Morris screening method and Gaussian Emulation processes, were used for the implementation of the global sensitivity analysis. Sensitivity indices, such as main, total and interaction effects, were computed for a screened set of the total thyroidal system descriptive model input parameters. Furthermore, a narrower sub-set of the most influential parameters affecting the model output of maternal thyroid hormone levels were identified in addition to the characterization of their overall and pair-wise parameter interaction quotients. The characteristic trends of influence in model output for each of these individual model input parameters over their plausible ranges were elucidated using Gaussian Emulation processes. Through global sensitivity analysis we have gained a better understanding of the model behavior and performance beyond the domains of observation by the simultaneous variation in model inputs over their range of plausible uncertainties. The sensitivity analysis helped identify parameters that determine the driving mechanisms of the maternal and fetal iodide kinetics, thyroid function and their interactions, and contributed to an improved understanding of the system modeled. We have thus demonstrated the use and application of global sensitivity analysis for a biologically based dose-response model for sensitive life-stages such as pregnancy that provides richer information on the model and the thyroidal system modeled compared to local

  15. A plausible mechanism of biosorption in dual symbioses by vesicular-arbuscular mycorrhizal in plants.

    Science.gov (United States)

    Azmat, Rafia; Hamid, Neelofer

    2015-03-01

    Dual symbioses of vesicular-arbuscular mycorrhizal (VAM) fungi with the growth of Momordica charantia are elucidated in terms of a plausible mechanism of biosorption in this article. The experiment was conducted in a greenhouse and a mixed inoculum of the VAM fungi was used in the three replicates. Results demonstrated that the starch contents were the main source of C for the VAM to build their hyphae. The increased plant height and leaf surface area were explained in relation to an increase in the photosynthetic rates to produce rapid sugar contents for the survival of plants. A decrease in protein and amino acid contents and an increase in proline and protease activity in VAM plants suggested that these contents were the main bio-indicators of the plants under biotic stress. The decline in protein may be due to the degradation of these contents, which were later converted into dextrose that can easily be absorbed during the period of symbiosis. A mechanism of C chemisorption in relation to the physiology and morphology of the plant was discussed.

  16. Integration of ecological-biological thresholds in conservation decision making.

    Science.gov (United States)

    Mavrommati, Georgia; Bithas, Kostas; Borsuk, Mark E; Howarth, Richard B

    2016-12-01

    In the Anthropocene, coupled human and natural systems dominate and only a few natural systems remain relatively unaffected by human influence. On the one hand, conservation criteria based on areas of minimal human impact are not relevant to much of the biosphere. On the other hand, conservation criteria based on economic factors are problematic with respect to their ability to arrive at operational indicators of well-being that can be applied in practice over multiple generations. Coupled human and natural systems are subject to economic development which, under current management structures, tends to affect natural systems and cross planetary boundaries. Hence, designing and applying conservation criteria applicable in real-world systems where human and natural systems need to interact and sustainably coexist is essential. By recognizing the criticality of satisfying basic needs as well as the great uncertainty over the needs and preferences of future generations, we sought to incorporate conservation criteria based on minimal human impact into economic evaluation. These criteria require the conservation of environmental conditions such that the opportunity for intergenerational welfare optimization is maintained. Toward this end, we propose the integration of ecological-biological thresholds into decision making and use as an example the planetary-boundaries approach. Both conservation scientists and economists must be involved in defining operational ecological-biological thresholds that can be incorporated into economic thinking and reflect the objectives of conservation, sustainability, and intergenerational welfare optimization. © 2016 Society for Conservation Biology.

  17. [How to be prudent with synthetic biology. Synthetic Biology and the precautionary principle].

    Science.gov (United States)

    Rodríguez López, Blanca

    2014-01-01

    Synthetic biology is a new discipline with two faces: on the one hand, it promises benefits that could alleviate some of the ills that plague mankind; on the other hand, like all technologies, it carries risks. In view of these risks, its strongest critics invoke the precautionary principle, commonly applied when an activity or new technology creates risks to the environment and/or human health, but far from universally accepted and currently one of the most controversial principles. This paper analyses the risks and benefits of synthetic biology and the relevance of applying the precautionary principle. The first part characterizes the discipline, with special attention to what is novel compared with so-called "genetic engineering"; it then discusses both the benefits and the risks associated with it, and concludes with a review of the efforts currently being made to control or minimize the risks. The second part analyses the precautionary principle and its possible relevance to synthetic biology: it reviews the different versions and interpretations of the principle and the various criticisms it has attracted and, finally, after discarding the precautionary principle as a useful tool, argues that recent proposals for assessing technologies that take into account not only their risks but also their benefits are more appropriate.

  18. Minimal Flavour Violation and Beyond

    CERN Document Server

    Isidori, Gino

    2012-01-01

    We review the formulation of the Minimal Flavour Violation (MFV) hypothesis in the quark sector, as well as some "variations on a theme" based on smaller flavour symmetry groups and/or less minimal breaking terms. We also review how these hypotheses can be tested in B decays and by means of other flavour-physics observables. The phenomenological consequences of MFV are discussed both in general terms, employing a general effective theory approach, and in the specific context of the Minimal Supersymmetric extension of the SM.

  19. Mixed waste and waste minimization: The effect of regulations and waste minimization on the laboratory

    International Nuclear Information System (INIS)

    Dagan, E.B.; Selby, K.B.

    1993-08-01

    The Hanford Site is located in the State of Washington and is subject to state and federal environmental regulations that hamper waste minimization efforts. This paper addresses the negative effect of these regulations on waste minimization and mixed waste issues related to the Hanford Site. Also, issues are addressed concerning the regulations becoming more lenient. In addition to field operations, the Hanford Site is home to the Pacific Northwest Laboratory which has many ongoing waste minimization activities of particular interest to laboratories

  20. Cooperation through Competition?Dynamics and Microeconomics of a Minimal Nutrient Trade System in Arbuscular Mycorrhizal Symbiosis

    OpenAIRE

    Schott, Stephan; Valdebenito, Braulio; Bustos, Daniel; Gomez-Porras, Judith L.; Sharma, Tripti; Dreyer, Ingo

    2016-01-01

    In arbuscular mycorrhizal (AM) symbiosis, fungi and plants exchange nutrients (sugars and phosphate, for instance) for reciprocal benefit. Until now it is not clear how this nutrient exchange system works. Here, we used computational cell biology to simulate the dynamics of a network of proton pumps and proton-coupled transporters that are upregulated during AM formation. We show that this minimal network is sufficient to describe accurately and realistically the nutrient trade system. By app...

  1. Knowledge-fused differential dependency network models for detecting significant rewiring in biological networks.

    Science.gov (United States)

    Tian, Ye; Zhang, Bai; Hoffman, Eric P; Clarke, Robert; Zhang, Zhen; Shih, Ie-Ming; Xuan, Jianhua; Herrington, David M; Wang, Yue

    2014-07-24

    Modeling biological networks serves as both a major goal and an effective tool of systems biology in studying mechanisms that orchestrate the activities of gene products in cells. Biological networks are context-specific and dynamic in nature. To systematically characterize the selectively activated regulatory components and mechanisms, modeling tools must be able to effectively distinguish significant rewiring from random background fluctuations. While differential networks cannot be constructed by existing knowledge alone, novel incorporation of prior knowledge into data-driven approaches can improve the robustness and biological relevance of network inference. However, the major unresolved roadblocks include: big solution space but a small sample size; highly complex networks; imperfect prior knowledge; missing significance assessment; and heuristic structural parameter learning. To address these challenges, we formulated the inference of differential dependency networks that incorporate both conditional data and prior knowledge as a convex optimization problem, and developed an efficient learning algorithm to jointly infer the conserved biological network and the significant rewiring across different conditions. We used a novel sampling scheme to estimate the expected error rate due to "random" knowledge. Based on that scheme, we developed a strategy that fully exploits the benefit of this data-knowledge integrated approach. We demonstrated and validated the principle and performance of our method using synthetic datasets. We then applied our method to yeast cell line and breast cancer microarray data and obtained biologically plausible results. The open-source R software package and the experimental data are freely available at http://www.cbil.ece.vt.edu/software.htm. Experiments on both synthetic and real data demonstrate the effectiveness of the knowledge-fused differential dependency network in revealing the statistically significant rewiring in biological
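
    As a much-simplified stand-in for the knowledge-fused formulation (which is not reproduced here), the sketch below estimates a sparse dependency network per condition with the graphical lasso and reports edges that appear or disappear between conditions, i.e. candidate "rewiring". The synthetic data, regularization strength and threshold are illustrative assumptions; the published method additionally integrates prior knowledge and assesses statistical significance.

```python
# Simplified differential dependency sketch: per-condition graphical lasso,
# then compare edge sets across conditions (not the knowledge-fused method).
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(1)
n_samples, n_genes = 80, 10
cond_A = rng.normal(size=(n_samples, n_genes))
cond_B = cond_A.copy()
cond_B[:, 1] = 0.8 * cond_B[:, 0] + 0.2 * rng.normal(size=n_samples)  # rewired edge 0-1

def dependency_edges(data, alpha=0.2, tol=1e-2):
    prec = GraphicalLasso(alpha=alpha).fit(data).precision_
    return {(i, j) for i in range(n_genes) for j in range(i + 1, n_genes)
            if abs(prec[i, j]) > tol}

edges_A, edges_B = dependency_edges(cond_A), dependency_edges(cond_B)
print("edges gained in condition B:", sorted(edges_B - edges_A))
print("edges lost in condition B:", sorted(edges_A - edges_B))
```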

  2. Preventing Absenteeism and Promoting Resilience Among Health Care Workers In Biological Emergencies

    Energy Technology Data Exchange (ETDEWEB)

    Lesperance, Ann M.; Miller, James S.

    2009-05-08

    The ability to ensure adequate numbers of medical staff represents a crucial part of the medical response to any disaster. However, healthcare worker absenteeism during disasters, especially in the event of an attack of biological terrorism or an epidemic such as pandemic influenza, is a serious concern. Though a significant rate of absenteeism is often included as a baseline assumption in emergency planning, published reports on strategies to minimize absenteeism are comparatively few. This report documents interviews with managers and emergency response planners at hospitals and public health agencies and reviews existing survey data on healthcare worker absenteeism and studies of disasters to glean lessons about the needs of healthcare workers during those disasters. Based on this research, expected rates of absenteeism and individual determinants of absenteeism are presented along with recommendations of steps that hospitals, emergency medical services departments, public health organizations, and government agencies can take to meet the needs of healthcare workers and minimize absenteeism during a biological event.

  3. Quantifying conservation biological control for management of Bemisia tabaci (Hemiptera: Aleyrodidae) in cotton

    Science.gov (United States)

    Conservation biological control (CBC) can be an effective tactic for minimizing insect-induced damage to agricultural production. The most effective manner of applying CBC is through an Integrated Pest Management (IPM) strategy, combining many tactics including cultural controls, pest sampling, the use of

  4. Minimal experimental requirements for definition of extracellular vesicles and their functions: a position statement from the International Society for Extracellular Vesicles

    Directory of Open Access Journals (Sweden)

    Jan Lötvall

    2014-12-01

    Full Text Available Secreted membrane-enclosed vesicles, collectively called extracellular vesicles (EVs), which include exosomes, ectosomes, microvesicles, microparticles, apoptotic bodies and other EV subsets, encompass a very rapidly growing scientific field in biology and medicine. Importantly, it is currently technically challenging to obtain a totally pure EV fraction free from non-vesicular components for functional studies, and therefore there is a need to establish guidelines for analyses of these vesicles and reporting of scientific studies on EV biology. Here, the International Society for Extracellular Vesicles (ISEV) provides researchers with a minimal set of biochemical, biophysical and functional standards that should be used to attribute any specific biological cargo or functions to EVs.

  5. Minimizing waste in environmental restoration

    International Nuclear Information System (INIS)

    Thuot, J.R.; Moos, L.

    1996-01-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized; however, there are significant areas where waste and cost can be reduced by careful planning and execution. Waste reduction can occur in three ways: beneficial reuse or recycling, segregation of waste types, and reducing generation of secondary waste

  6. Quantization of the minimal and non-minimal vector field in curved space

    OpenAIRE

    Toms, David J.

    2015-01-01

    The local momentum space method is used to study the quantized massive vector field (the Proca field) with the possible addition of non-minimal terms. Heat kernel coefficients are calculated and used to evaluate the divergent part of the one-loop effective action. It is shown that the naive expression for the effective action that one would write down based on the minimal coupling case needs modification. We adopt a Faddeev-Jackiw method of quantization and consider the case of an ultrastatic...

  7. Mixed-Methods Design in Biology Education Research: Approach and Uses.

    Science.gov (United States)

    Warfa, Abdi-Rizak M

    Educational research often requires mixing different research methodologies to strengthen findings, better contextualize or explain results, or minimize the weaknesses of a single method. This article provides practical guidelines on how to conduct such research in biology education, with a focus on mixed-methods research (MMR) that uses both quantitative and qualitative inquiries. Specifically, the paper provides an overview of mixed-methods design typologies most relevant in biology education research. It also discusses common methodological issues that may arise in mixed-methods studies and ways to address them. The paper concludes with recommendations on how to report and write about MMR. © 2016 L. A.-R. M. Warfa. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  8. Parameter-free Network Sparsification and Data Reduction by Minimal Algorithmic Information Loss

    KAUST Repository

    Zenil, Hector

    2018-02-16

    The study of large and complex datasets, or big data, organized as networks has emerged as one of the central challenges in most areas of science and technology. Cellular and molecular networks in biology are among the prime examples. Hence, a number of techniques for data dimensionality reduction, especially in the context of networks, have been developed. Yet, current techniques require a predefined metric upon which to minimize the data size. Here we introduce a family of parameter-free algorithms based on (algorithmic) information theory that are designed to minimize the loss of any (enumerable computable) property contributing to the object's algorithmic content and thus important to preserve in a process of data dimension reduction when forcing the algorithm to delete first the least important features. Being independent of any particular criterion, they are universal in a fundamental mathematical sense. Using suboptimal approximations of efficient (polynomial) estimations we demonstrate how to preserve network properties outperforming other (leading) algorithms for network dimension reduction. Our method preserves all graph-theoretic indices measured, ranging from degree distribution, clustering-coefficient, edge betweenness, and degree and eigenvector centralities. We conclude and demonstrate numerically that our parameter-free, Minimal Information Loss Sparsification (MILS) method is robust, has the potential to maximize the preservation of all recursively enumerable features in data and networks, and achieves equal to significantly better results than other data reduction and network sparsification methods.
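
    A crude, metric-based analogue of this sparsification goal can be sketched as follows: greedily remove the edge whose deletion perturbs a chosen graph property (here the average clustering coefficient) the least. Unlike MILS, this sketch depends on a predefined metric and a fixed removal budget; it is intended only to illustrate the idea of property-preserving edge removal, not the parameter-free, algorithmic-information-based method described above.

```python
# Greedy, metric-based network sparsification sketch (not the MILS algorithm).
import networkx as nx

def sparsify(G, n_edges_to_remove):
    H = G.copy()
    for _ in range(n_edges_to_remove):
        base = nx.average_clustering(H)
        best_edge, best_delta = None, None
        for e in list(H.edges()):
            H.remove_edge(*e)                       # tentatively delete edge
            delta = abs(nx.average_clustering(H) - base)
            H.add_edge(*e)                          # restore it
            if best_delta is None or delta < best_delta:
                best_edge, best_delta = e, delta
        H.remove_edge(*best_edge)                   # commit the least-damaging deletion
    return H

G = nx.karate_club_graph()
H = sparsify(G, 10)
print("edges:", G.number_of_edges(), "->", H.number_of_edges())
print("clustering:", round(nx.average_clustering(G), 3), "->",
      round(nx.average_clustering(H), 3))
```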

  9. Minimal and careful processing

    OpenAIRE

    Nielsen, Thorkild

    2004-01-01

    In several standards, guidelines and publications, organic food processing is strongly associated with "minimal processing" and "careful processing". The term "minimal processing" is nowadays often used in the general food processing industry and described in literature. The term "careful processing" is used more specifically within organic food processing but is not yet clearly defined. The concept of carefulness seems to fit very well with the processing of organic foods, especially if it i...

  10. Non-minimal Wu-Yang monopole

    International Nuclear Information System (INIS)

    Balakin, A.B.; Zayats, A.E.

    2007-01-01

    We discuss new exact spherically symmetric static solutions to non-minimally extended Einstein-Yang-Mills equations. The obtained solution to the Yang-Mills subsystem is interpreted as a non-minimal Wu-Yang monopole solution. We focus on the analysis of two classes of the exact solutions to the gravitational field equations. Solutions of the first class belong to the Reissner-Nordstroem type, i.e., they are characterized by horizons and by the singularity at the point of origin. The solutions of the second class are regular ones. The horizons and singularities of a new type, the non-minimal ones, are indicated

  11. Autonomy and Fear of Synthetic Biology: How Can Patients' Autonomy Be Enhanced in the Field of Synthetic Biology? A Qualitative Study with Stable Patients.

    Science.gov (United States)

    Rakic, Milenko; Wienand, Isabelle; Shaw, David; Nast, Rebecca; Elger, Bernice S

    2017-04-01

    We analyzed stable patients' views regarding synthetic biology in general, the medical application of synthetic biology, and their potential participation in trials of synthetic biology in particular. The aim of the study was to find out whether patients' views and preferences change after receiving more detailed information about synthetic biology and its clinical applications. The qualitative study was carried out with a purposive sample of 36 stable patients, who suffered from diabetes or gout. Interviews were transcribed verbatim, translated and fully anonymized. Thematic analysis was applied in order to examine stable patients' attitudes towards synthetic biology, its medical application, and their participation in trials. When patients were asked about synthetic biology in general, most of them were anxious that something uncontrollable could be created. After a concrete example of possible future treatment options, patients started to see synthetic biology in a more positive way. Our study constitutes an important first empirical insight into stable patients' views on synthetic biology and into the kind of fears triggered by the term "synthetic biology." Our results show that clear and concrete information can change patients' initial negative feelings towards synthetic biology. Information should thus be transmitted with great accuracy and transparency in order to reduce irrational fears of patients and to minimize the risk that researchers present facts too positively for the purposes of persuading patients to participate in clinical trials. Potential participants need to be adequately informed in order to be able to autonomously decide whether to participate in human subject research involving synthetic biology.

  12. Wilson loops in minimal surfaces

    International Nuclear Information System (INIS)

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-01-01

    The AdS/CFT correspondence suggests that the Wilson loop of the large N gauge theory with N = 4 supersymmetry in 4 dimensions is described by a minimal surface in AdS_5 × S^5. The authors examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which the authors call BPS loops, whose expectation values are free from ultra-violet divergence. They formulate the loop equation for such loops. To the extent that they have checked, the minimal surface in AdS_5 × S^5 gives a solution of the equation. The authors also discuss the zig-zag symmetry of the loop operator. In the N = 4 gauge theory, they expect the zig-zag symmetry to hold when the loop does not couple the scalar fields in the supermultiplet. They will show how this is realized for the minimal surface

  13. Wilson loops and minimal surfaces

    International Nuclear Information System (INIS)

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-01-01

    The AdS-CFT correspondence suggests that the Wilson loop of the large N gauge theory with N=4 supersymmetry in four dimensions is described by a minimal surface in AdS_5 × S^5. We examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which we call BPS loops, whose expectation values are free from ultraviolet divergence. We formulate the loop equation for such loops. To the extent that we have checked, the minimal surface in AdS_5 × S^5 gives a solution of the equation. We also discuss the zigzag symmetry of the loop operator. In the N=4 gauge theory, we expect the zigzag symmetry to hold when the loop does not couple the scalar fields in the supermultiplet. We will show how this is realized for the minimal surface. (c) 1999 The American Physical Society

  14. β-Catenin transcriptional activity is minimal in canine osteosarcoma and its targeted inhibition results in minimal changes to cell line behaviour.

    Science.gov (United States)

    Piskun, Caroline M; Stein, Timothy J

    2016-06-01

    Canine osteosarcoma (OS) is an aggressive malignancy associated with poor outcomes. Therapeutic improvements are likely to develop from an improved understanding of signalling pathways contributing to OS development and progression. The Wnt signalling pathway is of interest for its role in osteoblast differentiation, its dysregulation in numerous cancer types, and the relative frequency of cytoplasmic accumulation of β-catenin in canine OS. This study aimed to determine the biological impact of inhibiting canonical Wnt signalling in canine OS, by utilizing either β-catenin siRNA or a dominant-negative T-cell factor (TCF) construct. There were no consistent, significant changes in cell line behaviour with either method compared to parental cell lines. Interestingly, β-catenin transcriptional activity was three-fold higher in normal canine primary osteoblasts compared to canine OS cell lines. These results suggest canonical Wnt signalling is minimally active in canine OS and its targeted inhibition is not a relevant therapeutic strategy. © 2013 John Wiley & Sons Ltd.

  15. Minimally invasive and targeted therapeutic cell delivery to the skin using microneedle devices.

    Science.gov (United States)

    Gualeni, B; Coulman, S A; Shah, D; Eng, P F; Ashraf, H; Vescovo, P; Blayney, G J; Piveteau, L-D; Guy, O J; Birchall, J C

    2018-03-01

    Translation of cell therapies to the clinic is accompanied by numerous challenges, including controlled and targeted delivery of the cells to their site of action, without compromising cell viability and functionality. The aim was to explore the use of hollow microneedle devices (to date only used for the delivery of drugs and vaccines into the skin and for the extraction of biological fluids) to deliver cells into skin in a minimally invasive, user-friendly and targeted fashion. Melanocyte, keratinocyte and mixed epidermal cell suspensions were passed through various types of microneedles and subsequently delivered into the skin. Cell viability and functionality are maintained after injection through hollow microneedles with a bore size ≥ 75 μm. Healthy cells are delivered into the skin at clinically relevant depths. Hollow microneedles provide an innovative and minimally invasive method for delivering functional cells into the skin. Microneedle cell delivery represents a potential new treatment option for cell therapy approaches including skin repigmentation, wound repair, scar and burn remodelling, immune therapies and cancer vaccines. © 2017 British Association of Dermatologists.

  16. Minimizing casualties in biological and chemical threats (war and terrorism): the importance of information to the public in a prevention program.

    Science.gov (United States)

    Noy, Shabtai

    2004-01-01

    The most effective means of defending against biological or chemical warfare, whether in war or as a result of terror, is the use of primary prevention. The main goal of such a prevention program is to minimize the human loss by reducing the number of casualties (fatalities, physical wounds, and psychological injury). A secondary objective is to prevent the widespread sense of helplessness in the general population. These two aims complement each other. The more the public is active in defending itself, rather than viewing itself as helpless, the lesser the expected number of casualties of any kind. In order to achieve these two goals, educating the civilian population about risk factors and pointing out appropriate defensive strategies is critical. In the absence of an effective prevention program and active participation by the public, there is a high risk for massive numbers of physical and psychological casualties. An essential ingredient of any preventive program, which ultimately may determine the success or failure of all other protective actions, is early, gradual dissemination of information and guidance to the public, so that citizens can become active participants in the program. The public needs to be given information concerning the nature of the threat and effective methods of coping with it, should an unconventional attack occur. Lack of such adaptive behavior (such as wearing protective gear) is likely to bring about vast numbers of physical and psychological casualties. These large numbers may burden the medical, political, and public safety systems beyond their ability to manage. Failure to provide reasonable prevention and effective interventions can lead to a destruction of the social and emotional fabric of individuals and the society. Furthermore, inadequate preparation, education, and communication can result in the development of damaging mistrust of the political and military leadership, disintegration of social and political structures

  17. Minimally invasive transcriptome profiling in salmon: Detection of biological response in rainbow trout caudal fin following exposure to environmental chemical contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Veldhoen, Nik; Stevenson, Mitchel R. [Department of Biochemistry and Microbiology, University of Victoria, P.O. Box 3055, STN CSC, Victoria, BC V8W 3P6 (Canada); Skirrow, Rachel C. [Pacific and Yukon Laboratory for Environmental Testing, Pacific Environmental Science Centre, Environment Canada, 2645 Dollarton Highway, North Vancouver, BC V7H 1B1 (Canada); Rieberger, Kevin J. [Environmental Sustainability and Strategic Policy Division, Water Protection and Sustainability Branch, British Columbia Ministry of Environment, P.O. Box 9362 Stn Prov Govt, Victoria, BC V8W 9M2 (Canada); Aggelen, Graham van [Pacific and Yukon Laboratory for Environmental Testing, Pacific Environmental Science Centre, Environment Canada, 2645 Dollarton Highway, North Vancouver, BC V7H 1B1 (Canada); Meays, Cynthia L. [Environmental Sustainability and Strategic Policy Division, Water Protection and Sustainability Branch, British Columbia Ministry of Environment, P.O. Box 9362 Stn Prov Govt, Victoria, BC V8W 9M2 (Canada); Helbing, Caren C., E-mail: chelbing@uvic.ca [Department of Biochemistry and Microbiology, University of Victoria, P.O. Box 3055, STN CSC, Victoria, BC V8W 3P6 (Canada)

    2013-10-15

    Highlights: •A minimally-invasive tail fin biopsy assay was developed for use in fish. •Quantitative real-time polymerase chain reaction provided gene expression readout. •Results were comparable to classical liver tissue responses. •The approach was used on two salmonid species and can be coupled with genomic sex determination using an additional biopsy for maximal information. -- Abstract: An increasing number of anthropogenic chemicals have demonstrated potential for disruption of biological processes critical to normal growth and development of wildlife species. Both anadromous and freshwater salmon species are at risk of exposure to environmental chemical contaminants that may affect migratory behavior, environmental fitness, and reproductive success. A sensitive metric in determination of the presence and impact of such environmental chemical contaminants is through detection of changes in the status of gene transcript levels using a targeted quantitative real-time polymerase chain reaction assay. Ideally, the wildlife assessment strategy would incorporate conservation-centered non-lethal practices. Herein, we describe the development of such an assay for rainbow trout, Oncorhynchus mykiss, following an acute 96 h exposure to increasing concentrations of either 17α-ethinyl estradiol or cadmium. The estrogenic screen included measurement of mRNA encoding estrogen receptor α and β isoforms, vitellogenin, vitelline envelope protein γ, cytochrome p450 family 19 subfamily A, aryl hydrocarbon receptor, and the stress indicator, catalase. The metal exposure screen included evaluation of the latter two mRNA transcripts along with those encoding the metallothionein A and B isoforms. Exposure-dependent transcript abundance profiles were detected in both liver and caudal fin supporting the use of the caudal fin as a non-lethally obtained tissue source. The potential for both transcriptome profiling and genotypic sex determination from fin biopsy was extended, in

  18. Minimally invasive transcriptome profiling in salmon: Detection of biological response in rainbow trout caudal fin following exposure to environmental chemical contaminants

    International Nuclear Information System (INIS)

    Veldhoen, Nik; Stevenson, Mitchel R.; Skirrow, Rachel C.; Rieberger, Kevin J.; Aggelen, Graham van; Meays, Cynthia L.; Helbing, Caren C.

    2013-01-01

    Highlights: •A minimally-invasive tail fin biopsy assay was developed for use in fish. •Quantitative real-time polymerase chain reaction provided gene expression readout. •Results were comparable to classical liver tissue responses. •The approach was used on two salmonid species and can be coupled with genomic sex determination using an additional biopsy for maximal information. -- Abstract: An increasing number of anthropogenic chemicals have demonstrated potential for disruption of biological processes critical to normal growth and development of wildlife species. Both anadromous and freshwater salmon species are at risk of exposure to environmental chemical contaminants that may affect migratory behavior, environmental fitness, and reproductive success. A sensitive metric in determination of the presence and impact of such environmental chemical contaminants is through detection of changes in the status of gene transcript levels using a targeted quantitative real-time polymerase chain reaction assay. Ideally, the wildlife assessment strategy would incorporate conservation-centered non-lethal practices. Herein, we describe the development of such an assay for rainbow trout, Oncorhynchus mykiss, following an acute 96 h exposure to increasing concentrations of either 17α-ethinyl estradiol or cadmium. The estrogenic screen included measurement of mRNA encoding estrogen receptor α and β isoforms, vitellogenin, vitelline envelope protein γ, cytochrome p450 family 19 subfamily A, aryl hydrocarbon receptor, and the stress indicator, catalase. The metal exposure screen included evaluation of the latter two mRNA transcripts along with those encoding the metallothionein A and B isoforms. Exposure-dependent transcript abundance profiles were detected in both liver and caudal fin supporting the use of the caudal fin as a non-lethally obtained tissue source. The potential for both transcriptome profiling and genotypic sex determination from fin biopsy was extended, in

  19. Systems biology definition of the core proteome of metabolism and expression is consistent with high-throughput data.

    Science.gov (United States)

    Yang, Laurence; Tan, Justin; O'Brien, Edward J; Monk, Jonathan M; Kim, Donghyuk; Li, Howard J; Charusanti, Pep; Ebrahim, Ali; Lloyd, Colton J; Yurkovich, James T; Du, Bin; Dräger, Andreas; Thomas, Alex; Sun, Yuekai; Saunders, Michael A; Palsson, Bernhard O

    2015-08-25

    Finding the minimal set of gene functions needed to sustain life is of both fundamental and practical importance. Minimal gene lists have been proposed by using comparative genomics-based core proteome definitions. A definition of a core proteome that is supported by empirical data, is understood at the systems-level, and provides a basis for computing essential cell functions is lacking. Here, we use a systems biology-based genome-scale model of metabolism and expression to define a functional core proteome consisting of 356 gene products, accounting for 44% of the Escherichia coli proteome by mass based on proteomics data. This systems biology core proteome includes 212 genes not found in previous comparative genomics-based core proteome definitions, accounts for 65% of known essential genes in E. coli, and has 78% gene function overlap with minimal genomes (Buchnera aphidicola and Mycoplasma genitalium). Based on transcriptomics data across environmental and genetic backgrounds, the systems biology core proteome is significantly enriched in nondifferentially expressed genes and depleted in differentially expressed genes. Compared with the noncore, core gene expression levels are also similar across genetic backgrounds (two times higher Spearman rank correlation) and exhibit significantly more complex transcriptional and posttranscriptional regulatory features (40% more transcription start sites per gene, 22% longer 5'UTR). Thus, genome-scale systems biology approaches rigorously identify a functional core proteome needed to support growth. This framework, validated by using high-throughput datasets, facilitates a mechanistic understanding of systems-level core proteome function through in silico models; it de facto defines a paleome.
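
    The expression-consistency comparison reported above (core genes showing roughly twice the Spearman rank correlation of noncore genes across genetic backgrounds) can be illustrated with a small, self-contained sketch. All gene counts, expression values and noise levels below are synthetic placeholders, not data from the study; the point is only to show how such a comparison is computed.

        # Hedged sketch: compare expression consistency of "core" vs "noncore" gene
        # sets across two genetic backgrounds with Spearman rank correlation.
        # All values are synthetic illustrations, not data from the study.
        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)
        n_core, n_noncore = 300, 1200

        # Simulate log-expression in background A, then background B as a noisy copy;
        # assume (hypothetically) that core genes vary less between backgrounds.
        core_a = rng.normal(8.0, 1.0, n_core)
        core_b = core_a + rng.normal(0.0, 0.3, n_core)
        noncore_a = rng.normal(6.0, 1.5, n_noncore)
        noncore_b = noncore_a + rng.normal(0.0, 1.5, n_noncore)

        rho_core, _ = spearmanr(core_a, core_b)
        rho_noncore, _ = spearmanr(noncore_a, noncore_b)
        print(f"Spearman rho, core genes:    {rho_core:.2f}")
        print(f"Spearman rho, noncore genes: {rho_noncore:.2f}")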

  20. Topological gravity with minimal matter

    International Nuclear Information System (INIS)

    Li Keke

    1991-01-01

    Topological minimal matter, obtained by twisting the minimal N = 2 superconformal field theory, is coupled to two-dimensional topological gravity. The free field formulation of the coupled system allows explicit representations of BRST charge, physical operators and their correlation functions. The contact terms of the physical operators may be evaluated by extending the argument used in a recent solution of topological gravity without matter. The consistency of the contact terms in correlation functions implies recursion relations which coincide with the Virasoro constraints derived from the multi-matrix models. Topological gravity with minimal matter thus provides the field theoretic description for the multi-matrix models of two-dimensional quantum gravity. (orig.)

  1. Minimizing waste in environmental restoration

    International Nuclear Information System (INIS)

    Moos, L.; Thuot, J.R.

    1996-01-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized. In fact, however, there are three significant areas where waste and cost can be reduced. Waste reduction can occur in three ways: beneficial reuse or recycling; segregation of waste types; and reducing generation of secondary waste. This paper will discuss several examples of reuse, recycle, segregation, and secondary waste reduction at ANL restoration programs.

  2. On minimizers of causal variational principles

    International Nuclear Information System (INIS)

    Schiefeneder, Daniela

    2011-01-01

    Causal variational principles are a class of nonlinear minimization problems which arise in a formulation of relativistic quantum theory referred to as the fermionic projector approach. This thesis is devoted to a numerical and analytic study of the minimizers of a general class of causal variational principles. We begin with a numerical investigation of variational principles for the fermionic projector in discrete space-time. It is shown that for sufficiently many space-time points, the minimizing fermionic projector induces non-trivial causal relations on the space-time points. We then generalize the setting by introducing a class of causal variational principles for measures on a compact manifold. In our main result we prove under general assumptions that the support of a minimizing measure is either completely timelike, or it is singular in the sense that its interior is empty. In the examples of the circle, the sphere and certain flag manifolds, the general results are supplemented by a more detailed analysis of the minimizers. (orig.)

  3. Cooperation through Competition-Dynamics and Microeconomics of a Minimal Nutrient Trade System in Arbuscular Mycorrhizal Symbiosis.

    Science.gov (United States)

    Schott, Stephan; Valdebenito, Braulio; Bustos, Daniel; Gomez-Porras, Judith L; Sharma, Tripti; Dreyer, Ingo

    2016-01-01

    In arbuscular mycorrhizal (AM) symbiosis, fungi and plants exchange nutrients (sugars and phosphate, for instance) for reciprocal benefit. Until now it is not clear how this nutrient exchange system works. Here, we used computational cell biology to simulate the dynamics of a network of proton pumps and proton-coupled transporters that are upregulated during AM formation. We show that this minimal network is sufficient to describe accurately and realistically the nutrient trade system. By applying basic principles of microeconomics, we link the biophysics of transmembrane nutrient transport with the ecology of organismic interactions and straightforwardly explain macroscopic scenarios of the relations between plant and AM fungus. This computational cell biology study allows drawing far reaching hypotheses about the mechanism and the regulation of nutrient exchange and proposes that the "cooperation" between plant and fungus can be in fact the result of a competition between both for the same resources in the tiny periarbuscular space. The minimal model presented here may serve as benchmark to evaluate in future the performance of more complex models of AM nutrient exchange. As a first step toward this goal, we included SWEET sugar transporters in the model and show that their co-occurrence with proton-coupled sugar transporters results in a futile carbon cycle at the plant plasma membrane proposing that two different pathways for the same substrate should not be active at the same time.
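
    To give a flavour of what such a computational cell biology simulation involves, the sketch below integrates a deliberately stripped-down two-variable model of sugar and phosphate levels in the periarbuscular space. It is not the authors' transporter network: the rate constants, the linear kinetics and the variable names are illustrative assumptions only.

        # Hedged toy model (not the authors' network): dynamics of apoplastic
        # sugar (S) and phosphate (P) in the periarbuscular space. The plant
        # releases sugar and takes up phosphate; the fungus does the reverse.
        # All rate constants are arbitrary illustrative values.
        from scipy.integrate import solve_ivp

        k_sugar_release = 1.0   # plant-side sugar efflux (pump-energized)
        k_sugar_uptake  = 0.8   # fungal proton-coupled sugar uptake
        k_phos_release  = 0.6   # fungal phosphate efflux
        k_phos_uptake   = 0.9   # plant proton-coupled phosphate uptake

        def rhs(t, y):
            S, P = y
            dS = k_sugar_release - k_sugar_uptake * S
            dP = k_phos_release - k_phos_uptake * P
            return [dS, dP]

        sol = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0])
        S_end, P_end = sol.y[:, -1]
        print(f"quasi-steady apoplastic sugar ~ {S_end:.2f}, phosphate ~ {P_end:.2f}")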

  4. [Minimally invasive approach for cervical spondylotic radiculopathy].

    Science.gov (United States)

    Ding, Liang; Sun, Taicun; Huang, Yonghui

    2010-01-01

    The aim was to summarize recent minimally invasive approaches for cervical spondylotic radiculopathy (CSR). The recent domestic and international literature concerning minimally invasive approaches for CSR was reviewed and summarized. Two minimally invasive techniques are currently used for CSR: percutaneous puncture techniques and endoscopic techniques. When CSR is caused by mild or moderate intervertebral disc herniation, the degenerated intervertebral disc can be resected or treated by nucleolysis via percutaneous puncture. Cervical microendoscopic discectomy and foraminotomy is an effective minimally invasive approach that provides a clear operative view. Endoscopic techniques are suitable for treating CSR caused by foraminal osteophytes, lateral disc herniations, local ligamentum flavum thickening and spondylotic foraminal stenosis. These minimally invasive procedures offer simple handling, limited invasiveness and a low incidence of complications, but their scope of indications remains relatively narrow at present.

  5. Opportunity of interventional radiology: advantages and application of interventional technique in biological target therapy

    International Nuclear Information System (INIS)

    Teng Gaojun; Lu Qin

    2007-01-01

    Interventional techniques not only provide treatment opportunities for many diseases, but also alter traditional therapeutic patterns. With the wide application of biological therapies in the new century, interventional techniques are also taking on extensive roles. Current biological therapies, including gene therapy, cell transplantation therapy, immunobiological molecule therapy (cytokines, tumor antibodies or vaccines, and recombinant proteins), radioactive particles and targeting materials therapy, can be administered locally by interventional techniques. The combination of targeted biological therapies with highly targeted interventional techniques offers the advantages of minimal invasion, accurate delivery, a strong local effect, and fewer systemic adverse reactions. The authors believe that biological therapy presents a great opportunity for interventional radiology, and interventional colleagues should therefore engage promptly in the development and extension of this field. (authors)

  6. Guidelines for mixed waste minimization

    International Nuclear Information System (INIS)

    Owens, C.

    1992-02-01

    Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment options for commercial mixed waste are limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of these management limitations. This document provides a guide to mixed waste minimization.

  7. Minimization of the blank values in the neutron activation analysis of biological samples considering the whole procedure

    International Nuclear Information System (INIS)

    Lux, F.; Bereznai, T.; Trebert Haeberlin, S.

    1987-01-01

    During the determination of trace element contents of animal tissue by neutron activation analysis, in the course of structure-activity relationship studies on platinum-containing cancer drugs and wound healing, the authors tried to minimize the blank values that are caused by different sources of contamination during surgery, sampling and the activation analysis procedure. The following topics were investigated: abrasions from scalpels made of stainless steel, titanium or quartz, the type of surgery, the surface contaminations of the quartz ampoules, etc. The measures to be taken to reduce the blank values are described. (author) 19 refs.; 4 tables

  8. Universal biology and the statistical mechanics of early life

    Science.gov (United States)

    Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid

    2017-11-01

    All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These examples illustrate concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media. This article is part of the themed issue 'Reconceptualizing the origins of life'.
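
    The "minimal models of autocatalysis" mentioned above are often introduced through Frank-type kinetics, in which each enantiomer catalyzes its own production while the two inhibit one another. The sketch below integrates such a generic textbook model with arbitrary rate constants; it is only an illustration of chiral symmetry breaking, not the specific stochastic model analyzed in the article.

        # Hedged illustration: a Frank-type model of chiral symmetry breaking.
        # Each enantiomer (L, D) autocatalyzes its own production and the two
        # inactivate each other; a small initial imbalance is amplified toward
        # homochirality. Rates are arbitrary illustrative values.
        from scipy.integrate import solve_ivp

        k_auto = 1.0    # autocatalytic production rate
        k_inhib = 1.0   # mutual (heterochiral) inactivation rate

        def frank(t, y):
            L, D = y
            dL = k_auto * L - k_inhib * L * D
            dD = k_auto * D - k_inhib * L * D
            return [dL, dD]

        sol = solve_ivp(frank, (0.0, 10.0), [0.101, 0.100])  # tiny initial L excess
        L_end, D_end = sol.y[:, -1]
        ee = (L_end - D_end) / (L_end + D_end)  # enantiomeric excess
        print(f"final enantiomeric excess ~ {ee:.2f}")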

  9. Minimal string theories and integrable hierarchies

    Science.gov (United States)

    Iyer, Ramakrishnan

    Well-defined, non-perturbative formulations of the physics of string theories in specific minimal or superminimal model backgrounds can be obtained by solving matrix models in the double scaling limit. They provide us with the first examples of completely solvable string theories. Despite being relatively simple compared to higher dimensional critical string theories, they furnish non-perturbative descriptions of interesting physical phenomena such as geometrical transitions between D-branes and fluxes, tachyon condensation and holography. The physics of these theories in the minimal model backgrounds is succinctly encoded in a non-linear differential equation known as the string equation, along with an associated hierarchy of integrable partial differential equations (PDEs). The bosonic string in (2,2m-1) conformal minimal model backgrounds and the type 0A string in (2,4 m) superconformal minimal model backgrounds have the Korteweg-de Vries system, while type 0B in (2,4m) backgrounds has the Zakharov-Shabat system. The integrable PDE hierarchy governs flows between backgrounds with different m. In this thesis, we explore this interesting connection between minimal string theories and integrable hierarchies further. We uncover the remarkable role that an infinite hierarchy of non-linear differential equations plays in organizing and connecting certain minimal string theories non-perturbatively. We are able to embed the type 0A and 0B (A,A) minimal string theories into this single framework. The string theories arise as special limits of a rich system of equations underpinned by an integrable system known as the dispersive water wave hierarchy. We find that there are several other string-like limits of the system, and conjecture that some of them are type IIA and IIB (A,D) minimal string backgrounds. We explain how these and several other string-like special points arise and are connected. In some cases, the framework endows the theories with a non

  10. Waste minimization at Chalk River Laboratories

    Energy Technology Data Exchange (ETDEWEB)

    Kranz, P.; Wong, P.C.F. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2011-07-01

    Waste minimization supports Atomic Energy of Canada Limited (AECL) Environment Policy with regard to pollution prevention and has positive impacts on the environment, human health and safety, and economy. In accordance with the principle of pollution prevention, the quantities and degree of hazard of wastes requiring storage or disposition at facilities within or external to AECL sites shall be minimized, following the principles of Prevent, Reduce, Reuse, and Recycle, to the extent practical. Waste minimization is an important element in the Waste Management Program. The Waste Management Program has implemented various initiatives for waste minimization since 2007. The key initiatives have focused on waste reduction, segregation and recycling, and included: 1) developed waste minimization requirements and recycling procedure to establish the framework for applying the Waste Minimization Hierarchy; 2) performed waste minimization assessments for the facilities, which generate significant amounts of waste, to identify the opportunities for waste reduction and assist the waste generators to develop waste reduction targets and action plans to achieve the targets; 3) implemented the colour-coded, standardized waste and recycling containers to enhance waste segregation; 4) established partnership with external agents for recycling; 5) extended the likely clean waste and recyclables collection to selected active areas; 6) provided on-going communications to promote waste reduction and increase awareness for recycling; and 7) continually monitored performance, with respect to waste minimization, to identify opportunities for improvement and to communicate these improvements. After implementation of waste minimization initiatives at CRL, the solid waste volume generated from routine operations at CRL has significantly decreased, while the amount of recyclables diverted from the onsite landfill has significantly increased since 2007. The overall refuse volume generated at

  11. Waste minimization at Chalk River Laboratories

    International Nuclear Information System (INIS)

    Kranz, P.; Wong, P.C.F.

    2011-01-01

    Waste minimization supports Atomic Energy of Canada Limited (AECL) Environment Policy with regard to pollution prevention and has positive impacts on the environment, human health and safety, and economy. In accordance with the principle of pollution prevention, the quantities and degree of hazard of wastes requiring storage or disposition at facilities within or external to AECL sites shall be minimized, following the principles of Prevent, Reduce, Reuse, and Recycle, to the extent practical. Waste minimization is an important element in the Waste Management Program. The Waste Management Program has implemented various initiatives for waste minimization since 2007. The key initiatives have focused on waste reduction, segregation and recycling, and included: 1) developed waste minimization requirements and recycling procedure to establish the framework for applying the Waste Minimization Hierarchy; 2) performed waste minimization assessments for the facilities, which generate significant amounts of waste, to identify the opportunities for waste reduction and assist the waste generators to develop waste reduction targets and action plans to achieve the targets; 3) implemented the colour-coded, standardized waste and recycling containers to enhance waste segregation; 4) established partnership with external agents for recycling; 5) extended the likely clean waste and recyclables collection to selected active areas; 6) provided on-going communications to promote waste reduction and increase awareness for recycling; and 7) continually monitored performance, with respect to waste minimization, to identify opportunities for improvement and to communicate these improvements. After implementation of waste minimization initiatives at CRL, the solid waste volume generated from routine operations at CRL has significantly decreased, while the amount of recyclables diverted from the onsite landfill has significantly increased since 2007. The overall refuse volume generated at

  12. Predicting Consensus Structures for RNA Alignments Via Pseudo-Energy Minimization

    Directory of Open Access Journals (Sweden)

    Junilda Spirollari

    2009-01-01

    Full Text Available Thermodynamic processes with free energy parameters are often used in algorithms that solve the free energy minimization problem to predict secondary structures of single RNA sequences. While results from these algorithms are promising, an observation is that single sequence-based methods have moderate accuracy and more information is needed to improve on RNA secondary structure prediction, such as covariance scores obtained from multiple sequence alignments. We present in this paper a new approach to predicting the consensus secondary structure of a set of aligned RNA sequences via pseudo-energy minimization. Our tool, called RSpredict, takes into account sequence covariation and employs effective heuristics for accuracy improvement. RSpredict accepts, as input data, a multiple sequence alignment in FASTA or ClustalW format and outputs the consensus secondary structure of the input sequences in both the Vienna style Dot Bracket format and the Connectivity Table format. Our method was compared with some widely used tools including KNetFold, Pfold and RNAalifold. A comprehensive test on different datasets including Rfam sequence alignments and a multiple sequence alignment obtained from our study on the Drosophila X chromosome reveals that RSpredict is competitive with the existing tools on the tested datasets. RSpredict is freely available online as a web server and also as a jar file for download at http://datalab.njit.edu/biology/RSpredict.
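
    For readers unfamiliar with folding by minimization, the sketch below implements the classic Nussinov dynamic program, which maximizes complementary base pairs for a single sequence. It is a deliberately simplified stand-in: RSpredict's pseudo-energy scoring additionally uses covariance information from the alignment, so treat this only as an illustration of the underlying folding recursion, not of the tool itself.

        # Hedged sketch: Nussinov base-pair maximization, a simplified stand-in
        # for energy/pseudo-energy minimization in RNA secondary structure
        # prediction. It does not reproduce RSpredict's covariance-aware scoring.
        def nussinov(seq, min_loop=3):
            pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
            n = len(seq)
            dp = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):
                for i in range(n - span):
                    j = i + span
                    best = max(dp[i + 1][j], dp[i][j - 1])   # i or j left unpaired
                    if (seq[i], seq[j]) in pairs:            # i pairs with j
                        best = max(best, dp[i + 1][j - 1] + 1)
                    for k in range(i + 1, j):                # bifurcation into two segments
                        best = max(best, dp[i][k] + dp[k + 1][j])
                    dp[i][j] = best
            return dp[0][n - 1]

        print(nussinov("GGGAAAUCC"))  # 3 base pairs for this toy hairpin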

  13. The missing link between sleep disorders and age-related dementia: recent evidence and plausible mechanisms.

    Science.gov (United States)

    Zhang, Feng; Zhong, Rujia; Li, Song; Chang, Raymond Chuen-Chung; Le, Weidong

    2017-05-01

    Sleep disorders are among the most common clinical problems and pose a significant concern for the geriatric population. More importantly, while around 40% of elderly adults have sleep-related complaints, sleep disorders are more frequently associated with co-morbidities including age-related neurodegenerative diseases and mild cognitive impairment. Recently, increasing evidence has indicated that disturbed sleep may not only serve as the consequence of brain atrophy, but also contribute to the pathogenesis of dementia and, therefore, significantly increase dementia risk. Since the current therapeutic interventions lack efficacy to prevent, delay or reverse the pathological progress of dementia, a better understanding of underlying mechanisms by which sleep disorders interact with the pathogenesis of dementia will provide possible targets for the prevention and treatment of dementia. In this review, we briefly describe the physiological roles of sleep in learning/memory, and specifically update the recent research evidence demonstrating the association between sleep disorders and dementia. Plausible mechanisms are further discussed. Moreover, we also evaluate the possibility of sleep therapy as a potential intervention for dementia.

  14. Non-minimal inflation revisited

    International Nuclear Information System (INIS)

    Nozari, Kourosh; Shafizadeh, Somayeh

    2010-01-01

    We reconsider an inflationary model in which the inflaton field is non-minimally coupled to gravity. We study the parameter space of the model up to the second (and in some cases third) order of the slow-roll parameters. We calculate the inflation parameters in both the Jordan and Einstein frames, and the results are compared between these two frames and also with observations. Using the recent observational data from combined WMAP5+SDSS+SNIa datasets, we study the constraints imposed on our model parameters, especially the non-minimal coupling ξ.

  15. Minimal quantization and confinement

    International Nuclear Information System (INIS)

    Ilieva, N.P.; Kalinowskij, Yu.L.; Nguyen Suan Han; Pervushin, V.N.

    1987-01-01

    A ''minimal'' version of the Hamiltonian quantization based on the explicit solution of the Gauss equation and on the gauge-invariance principle is considered. Using the example of the one-particle Green function, we show that the requirement of gauge invariance leads to relativistic covariance of the theory and to a more precise definition of the Faddeev-Popov integral that does not depend on the gauge choice. The ''minimal'' quantization is applied to the gauge-ambiguity problem and to a new topological mechanism of confinement.

  16. Null-polygonal minimal surfaces in AdS4 from perturbed W minimal models

    International Nuclear Information System (INIS)

    Hatsuda, Yasuyuki; Ito, Katsushi; Satoh, Yuji

    2012-11-01

    We study the null-polygonal minimal surfaces in AdS_4, which correspond to the gluon scattering amplitudes/Wilson loops in N=4 super Yang-Mills theory at strong coupling. The area of the minimal surfaces with n cusps is characterized by the thermodynamic Bethe ansatz (TBA) integral equations or the Y-system of the homogeneous sine-Gordon model, which is regarded as the SU(n-4)_4/U(1)^(n-5) generalized parafermion theory perturbed by the weight-zero adjoint operators. Based on the relation to the TBA systems of the perturbed W minimal models, we solve the TBA equations by using the conformal perturbation theory, and obtain the analytic expansion of the remainder function around the UV/regular-polygonal limit for n = 6 and 7. We compare the rescaled remainder function for n=6 with the two-loop one, to observe that they are close to each other similarly to the AdS_3 case.

  17. A survey on classical minimal surface theory

    CERN Document Server

    Meeks, William H

    2012-01-01

    Meeks and Pérez present a survey of recent spectacular successes in classical minimal surface theory. The classification of minimal planar domains in three-dimensional Euclidean space provides the focus of the account. The proof of the classification depends on the work of many currently active leading mathematicians, thus making contact with much of the most important results in the field. Through the telling of the story of the classification of minimal planar domains, the general mathematician may catch a glimpse of the intrinsic beauty of this theory and the authors' perspective of what is happening at this historical moment in a very classical subject. This book includes an updated tour through some of the recent advances in the theory, such as Colding-Minicozzi theory, minimal laminations, the ordering theorem for the space of ends, conformal structure of minimal surfaces, minimal annular ends with infinite total curvature, the embedded Calabi-Yau problem, local pictures on the scale of curvature and t...

  18. Minimalism and Speakers’ Intuitions

    Directory of Open Access Journals (Sweden)

    Matías Gariazzo

    2011-08-01

    Full Text Available Minimalism proposes a semantics that does not account for speakers’ intuitions about the truth conditions of a range of sentences or utterances. Thus, a challenge for this view is to offer an explanation of how its assignment of semantic contents to these sentences is grounded in their use. Such an account was mainly offered by Soames, but also suggested by Cappelen and Lepore. The article criticizes this explanation by presenting four kinds of counterexamples to it, and arrives at the conclusion that minimalism has not successfully answered the above-mentioned challenge.

  19. Biological substantiation of antipsychotic-associated pneumonia: Systematic literature review and computational analyses.

    Science.gov (United States)

    Sultana, Janet; Calabró, Marco; Garcia-Serna, Ricard; Ferrajolo, Carmen; Crisafulli, Concetta; Mestres, Jordi; Trifirò, Gianluca

    2017-01-01

    Antipsychotic (AP) safety has been widely investigated. However, mechanisms underlying AP-associated pneumonia are not well-defined. The aim of this study was to investigate the known mechanisms of AP-associated pneumonia through a systematic literature review, confirm these mechanisms using an independent data source on drug targets and attempt to identify novel AP drug targets potentially linked to pneumonia. A search was conducted in Medline and Web of Science to identify studies exploring the association between pneumonia and antipsychotic use, from which information on hypothesized mechanism of action was extracted. All studies had to be in English and had to concern AP use as an intervention in persons of any age and for any indication, provided that the outcome was pneumonia. Information on the study design, population, exposure, outcome, risk estimate and mechanism of action was tabulated. Public repositories of pharmacology and drug safety data were used to identify the receptor binding profile and AP safety events. Cytoscape was then used to map biological pathways that could link AP targets and off-targets to pneumonia. The literature search yielded 200 articles; 41 were included in the review. Thirty studies reported a hypothesized mechanism of action, most commonly activation/inhibition of cholinergic, histaminergic and dopaminergic receptors. In vitro pharmacology data confirmed receptor affinities identified in the literature review. Two targets, thromboxane A2 receptor (TBXA2R) and platelet activating factor receptor (PTAFR) were found to be novel AP target receptors potentially associated with pneumonia. Biological pathways constructed using Cytoscape identified plausible biological links potentially leading to pneumonia downstream of TBXA2R and PTAFR. Innovative approaches for biological substantiation of drug-adverse event associations may strengthen evidence on drug safety profiles and help to tailor pharmacological therapies to patient risk
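
    The pathway-mapping step described above can be pictured with a toy graph search: starting from the two candidate receptors (TBXA2R and PTAFR) and walking toward a pneumonia phenotype node. The intermediate nodes and edges below are hypothetical placeholders, not the curated interactions assembled in the study, and networkx simply stands in for the Cytoscape-based analysis.

        # Hedged sketch: toy target-to-phenotype graph in the spirit of the
        # pathway mapping above. Intermediate nodes and edges are hypothetical
        # placeholders, not the curated pathways used in the study.
        import networkx as nx

        g = nx.DiGraph()
        g.add_edges_from([
            ("antipsychotic", "TBXA2R"),
            ("antipsychotic", "PTAFR"),
            ("TBXA2R", "platelet_activation"),       # hypothetical link
            ("PTAFR", "inflammatory_signalling"),    # hypothetical link
            ("platelet_activation", "pneumonia"),
            ("inflammatory_signalling", "pneumonia"),
        ])

        # List one candidate path from each receptor to the adverse-event node.
        for receptor in ("TBXA2R", "PTAFR"):
            path = nx.shortest_path(g, source=receptor, target="pneumonia")
            print(" -> ".join(path))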

  20. Origin of microbial life: Nano- and molecular events, thermodynamics/entropy, quantum mechanisms and genetic instructions.

    Science.gov (United States)

    Trevors, J T

    2011-03-01

    Currently, there are no agreed-upon mechanisms, with supporting evidence, for the origin of the first microbial cells on the Earth. However, some hypotheses have been proposed with minimal supporting evidence and experimentation/observations. The approach taken in this article is that life originated at the nano- and molecular levels of biological organization, using quantum mechanical principles that became manifested as classical microbial cell(s). This allowed the origin of microbial life on the Earth, about 4 billion years ago, with a core or minimal organic genetic code containing the correct instructions for cell growth and division, in a micron-scale environment with a local entropy range conducive to life, and obeying the laws of thermodynamics. An integrated approach that explores all encompassing factors necessary for the origin of life may bring forth plausible hypotheses (and mechanisms), with much needed supporting experimentation and observations, for an origin of life theory. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. The re-emergence of the minimal running shoe.

    Science.gov (United States)

    Davis, Irene S

    2014-10-01

    The running shoe has gone through significant changes since its inception. The purpose of this paper is to review these changes, the majority of which have occurred over the past 50 years. Running footwear began as very minimal, then evolved to become highly cushioned and supportive. However, over the past 5 years, there has been a reversal of this trend, with runners seeking more minimal shoes that allow their feet more natural motion. This abrupt shift toward footwear without cushioning and support has led to reports of injuries associated with minimal footwear. In response to this, the running footwear industry shifted again toward the development of lightweight, partial minimal shoes that offer some support and cushioning. In this paper, studies comparing the mechanics between running in minimal, partial minimal, and traditional shoes are reviewed. The implications for injuries in all 3 conditions are examined. The use of minimal footwear in other populations besides runners is discussed. Finally, areas for future research into minimal footwear are suggested.

  2. Effects of biological control agents and exotic plant invasion on deer mouse populations

    Science.gov (United States)

    Yvette K. Ortega; Dean E. Pearson; Kevin S. McKelvey

    2004-01-01

    Exotic insects are commonly introduced as biological control agents to reduce densities of invasive exotic plants. Although current biocontrol programs for weeds take precautions to minimize ecological risks, little attention is paid to the potential nontarget effects of introduced food subsidies on native consumers. Previous research demonstrated that two gall flies (...

  3. 10 CFR 20.1406 - Minimization of contamination.

    Science.gov (United States)

    2010-01-01

    10 CFR Part 20, § 20.1406 Minimization of contamination (License Termination): (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and the...

  4. A minimal architecture for joint action

    DEFF Research Database (Denmark)

    Vesper, Cordula; Butterfill, Stephen; Knoblich, Günther

    2010-01-01

    What kinds of processes and representations make joint action possible? In this paper we suggest a minimal architecture for joint action that focuses on representations, action monitoring and action prediction processes, as well as ways of simplifying coordination. The architecture spells out minimal requirements for an individual agent to engage in a joint action. We discuss existing evidence in support of the architecture as well as open questions that remain to be empirically addressed. In addition, we suggest possible interfaces between the minimal architecture and other approaches to joint action. The minimal architecture has implications for theorizing about the emergence of joint action, for human-machine interaction, and for understanding how coordination can be facilitated by exploiting relations between multiple agents’ actions and between actions and the environment.

  5. Minimal Walking Technicolor

    DEFF Research Database (Denmark)

    Foadi, Roshan; Frandsen, Mads Toudal; A. Ryttov, T.

    2007-01-01

    Different theoretical and phenomenological aspects of the Minimal and Nonminimal Walking Technicolor theories have recently been studied. The goal here is to make the models ready for collider phenomenology. We do this by constructing the low energy effective theory containing scalars, pseudoscalars, vector mesons and other fields predicted by the minimal walking theory. We construct their self-interactions and interactions with standard model fields. Using the Weinberg sum rules, opportunely modified to take into account the walking behavior of the underlying gauge theory, we find interesting relations for the spin-one spectrum. We derive the electroweak parameters using the newly constructed effective theory and compare the results with the underlying gauge theory. Our analysis is sufficiently general such that the resulting model can be used to represent a generic walking technicolor...

  6. Minimal Marking: A Success Story

    Directory of Open Access Journals (Sweden)

    Anne McNeilly

    2014-11-01

    Full Text Available The minimal-marking project conducted in Ryerson’s School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The “minimal-marking” concept (Haswell, 1983, which requires dramatically more student engagement, resulted in more successful learning outcomes for surface-level knowledge acquisition than the more traditional approach of “teacher-corrects-all.” Results suggest it would be effective, not just for grammar, punctuation, and word usage, the objective here, but for any material that requires rote-memory learning, such as the Associated Press or Canadian Press style rules used by news publications across North America.

  7. Biological monitoring results for cadmium exposed workers.

    Science.gov (United States)

    McDiarmid, M A; Freeman, C S; Grossman, E A; Martonik, J

    1996-11-01

    As part of a settlement agreement with the Occupational Safety and Health Administration (OSHA) involving exposure to cadmium (Cd), a battery production facility provided medical surveillance data to OSHA for review. Measurements of cadmium in blood, cadmium in urine, and beta 2-microglobulin in urine were obtained for more than 100 workers over an 18-month period. Some airborne Cd exposure data were also made available. Two subpopulations of this cohort were of primary interest in evaluating compliance with the medical surveillance provisions of the Cadmium Standard. These were a group of 16 workers medically removed from cadmium exposure due to elevations in some biological parameter, and a group of platemakers. Platemaking had presented a particularly high exposure opportunity and had recently undergone engineering interventions to minimize exposure. The effect on three biological monitoring parameters of medical removal protection in the first group and engineering controls in platemakers is reported. Results reveal that both medical removal from cadmium exposures and exposure abatement through the use of engineering and work practice controls generally result in declines in biological monitoring parameters of exposed workers. Implications for the success of interventions are discussed.

  8. Minimizing the cost of keeping options open for conservation in a changing climate.

    Science.gov (United States)

    Mills, Morena; Nicol, Sam; Wells, Jessie A; Lahoz-Monfort, José J; Wintle, Brendan; Bode, Michael; Wardrop, Martin; Walshe, Terry; Probert, William J M; Runge, Michael C; Possingham, Hugh P; Madden, Eve McDonald

    2014-06-01

    Policy documents advocate that managers should keep their options open while planning to protect coastal ecosystems from climate-change impacts. However, the actual costs and benefits of maintaining flexibility remain largely unexplored, and alternative approaches for decision making under uncertainty may lead to better joint outcomes for conservation and other societal goals. For example, keeping options open for coastal ecosystems incurs opportunity costs for developers. We devised a decision framework that integrates these costs and benefits with probabilistic forecasts for the extent of sea-level rise to find a balance between coastal ecosystem protection and moderate coastal development. Here, we suggest that instead of keeping their options open managers should incorporate uncertain sea-level rise predictions into a decision-making framework that evaluates the benefits and costs of conservation and development. In our example, based on plausible scenarios for sea-level rise and assuming a risk-neutral decision maker, we found that substantial development could be accommodated with negligible loss of environmental assets. Characterization of the Pareto efficiency of conservation and development outcomes provides valuable insight into the intensity of trade-offs between development and conservation. However, additional work is required to improve understanding of the consequences of alternative spatial plans and the value judgments and risk preferences of decision makers and stakeholders. © 2014 Society for Conservation Biology.
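
    The decision logic sketched above - weighing development and conservation payoffs against probabilistic sea-level rise forecasts for a risk-neutral decision maker - can be reduced to a few lines of expected-value arithmetic. Scenario probabilities, plan names and payoffs below are invented placeholders chosen only to show the shape of the calculation, not values from the study, which also examines the Pareto efficiency of the conservation-development trade-off.

        # Hedged sketch: choosing a coastal plan by expected value over uncertain
        # sea-level rise scenarios (risk-neutral decision maker). Probabilities
        # and payoffs are invented placeholders, not values from the study.
        scenarios = {"low_rise": 0.5, "medium_rise": 0.3, "high_rise": 0.2}

        # payoff[plan][scenario]: combined conservation + development benefit (arbitrary units)
        payoff = {
            "keep_options_open":    {"low_rise": 60,  "medium_rise": 60, "high_rise": 60},
            "moderate_development": {"low_rise": 90,  "medium_rise": 75, "high_rise": 40},
            "full_development":     {"low_rise": 110, "medium_rise": 50, "high_rise": 5},
        }

        def expected_value(plan):
            return sum(p * payoff[plan][s] for s, p in scenarios.items())

        for plan in payoff:
            print(f"{plan:22s} expected value = {expected_value(plan):6.1f}")
        print("preferred plan (risk-neutral):", max(payoff, key=expected_value))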

  9. Theories of minimalism in architecture: Post scriptum

    Directory of Open Access Journals (Sweden)

    Stevanović Vladimir

    2012-01-01

    Full Text Available Owing to its intensive development in the last decade of the XX century, the architectural phenomenon called Minimalism in Architecture is remembered as the Style of the Nineties, characterized, morphologically speaking, by simplicity and formal reduction. Simultaneously with its development in practice, several dominant interpretative models established themselves on a theoretical level. The new millennium and the distance of time bring new problems; this paper therefore discusses theorizations of Minimalism in Architecture that can bear the designation post scriptum, because their development starts after the constitutive period of architectural minimalist discourse. In XXI century theories, the problem of defining minimalism remains an important topic, approached by theorists along the axis Modernism - Minimal Art - Postmodernism - Minimalism in Architecture. The analyzed texts can be categorized into two groups: 1) texts of an affirmative nature and historical-associative approach, in which minimalism is identified, in an idealizing manner, with anything that is simple and reduced, relying mostly on existing hypotheses; 2) critically oriented texts, in which the authors reconsider the adequacy of the very term 'minimalism' in the context of architecture and take a metacritical attitude towards previous texts.

  10. Understanding schizophrenia as a disorder of consciousness: biological correlates and translational implications from quantum theory perspectives.

    Science.gov (United States)

    Venkatasubramanian, Ganesan

    2015-04-30

    From neurophenomenological perspectives, schizophrenia has been conceptualized as "a disorder with heterogeneous manifestations that can be integrally understood to involve fundamental perturbations in consciousness". While these theoretical constructs based on consciousness facilitate understanding the 'gestalt' of schizophrenia, systematic research to unravel translational implications of these models is warranted. To address this, one needs to begin with exploration of plausible biological underpinnings of "perturbed consciousness" in schizophrenia. In this context, an attractive proposition to understand the biology of consciousness is "the orchestrated object reduction (Orch-OR) theory" which invokes quantum processes in the microtubules of neurons. The Orch-OR model is particularly important for understanding schizophrenia especially due to the shared 'scaffold' of microtubules. The initial sections of this review focus on the compelling evidence to support the view that "schizophrenia is a disorder of consciousness" through critical summary of the studies that have demonstrated self-abnormalities, aberrant time perception as well as dysfunctional intentional binding in this disorder. Subsequently, these findings are linked with 'Orch-OR theory' through the research evidence for aberrant neural oscillations as well as microtubule abnormalities observed in schizophrenia. Further sections emphasize the applicability and translational implications of Orch-OR theory in the context of schizophrenia and elucidate the relevance of quantum biology to understand the origins of this puzzling disorder as "fundamental disturbances in consciousness".

  11. Large-scale association analyses identify new loci influencing glycemic traits and provide insight into the underlying biological pathways

    Science.gov (United States)

    Scott, Robert A; Lagou, Vasiliki; Welch, Ryan P; Wheeler, Eleanor; Montasser, May E; Luan, Jian’an; Mägi, Reedik; Strawbridge, Rona J; Rehnberg, Emil; Gustafsson, Stefan; Kanoni, Stavroula; Rasmussen-Torvik, Laura J; Yengo, Loïc; Lecoeur, Cecile; Shungin, Dmitry; Sanna, Serena; Sidore, Carlo; Johnson, Paul C D; Jukema, J Wouter; Johnson, Toby; Mahajan, Anubha; Verweij, Niek; Thorleifsson, Gudmar; Hottenga, Jouke-Jan; Shah, Sonia; Smith, Albert V; Sennblad, Bengt; Gieger, Christian; Salo, Perttu; Perola, Markus; Timpson, Nicholas J; Evans, David M; Pourcain, Beate St; Wu, Ying; Andrews, Jeanette S; Hui, Jennie; Bielak, Lawrence F; Zhao, Wei; Horikoshi, Momoko; Navarro, Pau; Isaacs, Aaron; O’Connell, Jeffrey R; Stirrups, Kathleen; Vitart, Veronique; Hayward, Caroline; Esko, Tönu; Mihailov, Evelin; Fraser, Ross M; Fall, Tove; Voight, Benjamin F; Raychaudhuri, Soumya; Chen, Han; Lindgren, Cecilia M; Morris, Andrew P; Rayner, Nigel W; Robertson, Neil; Rybin, Denis; Liu, Ching-Ti; Beckmann, Jacques S; Willems, Sara M; Chines, Peter S; Jackson, Anne U; Kang, Hyun Min; Stringham, Heather M; Song, Kijoung; Tanaka, Toshiko; Peden, John F; Goel, Anuj; Hicks, Andrew A; An, Ping; Müller-Nurasyid, Martina; Franco-Cereceda, Anders; Folkersen, Lasse; Marullo, Letizia; Jansen, Hanneke; Oldehinkel, Albertine J; Bruinenberg, Marcel; Pankow, James S; North, Kari E; Forouhi, Nita G; Loos, Ruth J F; Edkins, Sarah; Varga, Tibor V; Hallmans, Göran; Oksa, Heikki; Antonella, Mulas; Nagaraja, Ramaiah; Trompet, Stella; Ford, Ian; Bakker, Stephan J L; Kong, Augustine; Kumari, Meena; Gigante, Bruna; Herder, Christian; Munroe, Patricia B; Caulfield, Mark; Antti, Jula; Mangino, Massimo; Small, Kerrin; Miljkovic, Iva; Liu, Yongmei; Atalay, Mustafa; Kiess, Wieland; James, Alan L; Rivadeneira, Fernando; Uitterlinden, Andre G; Palmer, Colin N A; Doney, Alex S F; Willemsen, Gonneke; Smit, Johannes H; Campbell, Susan; Polasek, Ozren; Bonnycastle, Lori L; Hercberg, Serge; Dimitriou, Maria; Bolton, Jennifer L; Fowkes, Gerard R; Kovacs, Peter; Lindström, Jaana; Zemunik, Tatijana; Bandinelli, Stefania; Wild, Sarah H; Basart, Hanneke V; Rathmann, Wolfgang; Grallert, Harald; Maerz, Winfried; Kleber, Marcus E; Boehm, Bernhard O; Peters, Annette; Pramstaller, Peter P; Province, Michael A; Borecki, Ingrid B; Hastie, Nicholas D; Rudan, Igor; Campbell, Harry; Watkins, Hugh; Farrall, Martin; Stumvoll, Michael; Ferrucci, Luigi; Waterworth, Dawn M; Bergman, Richard N; Collins, Francis S; Tuomilehto, Jaakko; Watanabe, Richard M; de Geus, Eco J C; Penninx, Brenda W; Hofman, Albert; Oostra, Ben A; Psaty, Bruce M; Vollenweider, Peter; Wilson, James F; Wright, Alan F; Hovingh, G Kees; Metspalu, Andres; Uusitupa, Matti; Magnusson, Patrik K E; Kyvik, Kirsten O; Kaprio, Jaakko; Price, Jackie F; Dedoussis, George V; Deloukas, Panos; Meneton, Pierre; Lind, Lars; Boehnke, Michael; Shuldiner, Alan R; van Duijn, Cornelia M; Morris, Andrew D; Toenjes, Anke; Peyser, Patricia A; Beilby, John P; Körner, Antje; Kuusisto, Johanna; Laakso, Markku; Bornstein, Stefan R; Schwarz, Peter E H; Lakka, Timo A; Rauramaa, Rainer; Adair, Linda S; Smith, George Davey; Spector, Tim D; Illig, Thomas; de Faire, Ulf; Hamsten, Anders; Gudnason, Vilmundur; Kivimaki, Mika; Hingorani, Aroon; Keinanen-Kiukaanniemi, Sirkka M; Saaristo, Timo E; Boomsma, Dorret I; Stefansson, Kari; van der Harst, Pim; Dupuis, Josée; Pedersen, Nancy L; Sattar, Naveed; Harris, Tamara B; Cucca, Francesco; Ripatti, Samuli; Salomaa, Veikko; Mohlke, Karen L; Balkau, Beverley; Froguel, Philippe; Pouta, 
Anneli; Jarvelin, Marjo-Riitta; Wareham, Nicholas J; Bouatia-Naji, Nabila; McCarthy, Mark I; Franks, Paul W; Meigs, James B; Teslovich, Tanya M; Florez, Jose C; Langenberg, Claudia; Ingelsson, Erik; Prokopenko, Inga; Barroso, Inês

    2012-01-01

    Through genome-wide association meta-analyses of up to 133,010 individuals of European ancestry without diabetes, including individuals newly genotyped using the Metabochip, we have raised the number of confirmed loci influencing glycemic traits to 53, of which 33 also increase type 2 diabetes risk (q fasting insulin showed association with lipid levels and fat distribution, suggesting impact on insulin resistance. Gene-based analyses identified further biologically plausible loci, suggesting that additional loci beyond those reaching genome-wide significance are likely to represent real associations. This conclusion is supported by an excess of directionally consistent and nominally significant signals between discovery and follow-up studies. Functional follow-up of these newly discovered loci will further improve our understanding of glycemic control. PMID:22885924

  12. Increased scientific rigor will improve reliability of research and effectiveness of management

    Science.gov (United States)

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and

  13. Technology applications for radioactive waste minimization

    International Nuclear Information System (INIS)

    Devgun, J.S.

    1994-01-01

    The nuclear power industry has achieved one of the most successful examples of waste minimization. The annual volume of low-level radioactive waste shipped for disposal per reactor has decreased to approximately one-fifth of the volume shipped about a decade ago. In addition, the curie content of the total waste shipped for disposal has decreased. This paper will discuss the regulatory drivers and economic factors for waste minimization and describe the application of technologies for achieving waste minimization for low-level radioactive waste, with examples from the nuclear power industry

  14. Transience and capacity of minimal submanifolds

    DEFF Research Database (Denmark)

    Markvorsen, Steen; Palmer, V.

    2003-01-01

    We prove explicit lower bounds for the capacity of annular domains of minimal submanifolds P^m in ambient Riemannian spaces N^n with sectional curvatures bounded from above. We characterize the situations in which the lower bounds for the capacity are actually attained. Furthermore, we apply these bounds to prove that Brownian motion defined on a complete minimal submanifold is transient when the ambient space is a negatively curved Hadamard-Cartan manifold. The proof stems directly from the capacity bounds and also covers the case of minimal submanifolds of dimension m > 2 in Euclidean spaces.

  15. Defining Biological Networks for Noise Buffering and Signaling Sensitivity Using Approximate Bayesian Computation

    Directory of Open Access Journals (Sweden)

    Shuqiang Wang

    2014-01-01

    Full Text Available Reliable information processing in cells requires high sensitivity to changes in the input signal but low sensitivity to random fluctuations in the transmitted signal. There are often many alternative biological circuits qualifying for this biological function. Distinguishing these biological models and finding the most suitable one are essential, as such model ranking, by experimental evidence, will help to judge the support of the working hypotheses forming each model. Here, we employ the approximate Bayesian computation (ABC) method based on sequential Monte Carlo (SMC) to search for biological circuits that can maintain signaling sensitivity while minimizing noise propagation, focusing on cases where the noise is characterized by rapid fluctuations. By systematically analyzing three-component circuits, we rank these biological circuits and identify three basic biological motifs that buffer noise while maintaining sensitivity to long-term changes in input signals. We discuss in detail a particular implementation in the control of nutrient homeostasis in yeast. A principal component analysis of the posterior provides insight into the nature of the reactions between nodes.
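
    As a hedged illustration of the ABC idea described above (a toy sketch, not the authors' ABC-SMC implementation), the snippet below ranks two hypothetical candidate circuits by how often draws from their parameter priors reproduce an observed summary statistic within a tolerance; the model names, priors, and simulators are invented for illustration:

    ```python
    import random

    def abc_rejection(models, observed, n_draws=5000, tol=0.1):
        """Toy ABC model ranking: acceptance frequency stands in for model evidence."""
        scores = {}
        for name, (prior, simulate) in models.items():
            accepted = sum(
                1 for _ in range(n_draws)
                if abs(simulate(prior()) - observed) < tol)
            scores[name] = accepted / n_draws
        return scores

    # hypothetical three-node circuits summarized by a single "noise gain" statistic
    models = {
        "negative_feedback": (lambda: random.uniform(0, 2),
                              lambda k: 1.0 / (1.0 + k)),   # attenuates fluctuations
        "open_loop":         (lambda: random.uniform(0, 2),
                              lambda k: 1.0),               # passes fluctuations through
    }
    print(abc_rejection(models, observed=0.4))
    ```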

  16. On the isoperimetric rigidity of extrinsic minimal balls

    DEFF Research Database (Denmark)

    Markvorsen, Steen; Palmer, V.

    2003-01-01

    We consider an m-dimensional minimal submanifold P and a metric R-sphere in the Euclidean space R^n. If the sphere has its center p on P, then it will cut out a well defined connected component of P which contains this center point. We call this connected component an extrinsic minimal R-ball of P. The quotient of the volume of the extrinsic ball and the volume of its boundary is not larger than the corresponding quotient obtained in the space form standard situation, where the minimal submanifold is the totally geodesic linear subspace R^m. Here we show that if the minimal submanifold has dimension larger than 3, if P is not too curved along the boundary of an extrinsic minimal R-ball, and if the inequality alluded to above is an equality for the extrinsic minimal ball, then the minimal submanifold is totally geodesic.
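
    A hedged restatement of the comparison in formulas (assuming only the standard facts about balls in the model space R^m): for an extrinsic minimal R-ball D_R ⊂ P, the quotient inequality described above reads

    \[
      \frac{\operatorname{vol}(D_R)}{\operatorname{vol}(\partial D_R)}
      \;\le\;
      \frac{\operatorname{vol}\!\left(B_R^{\mathbb{R}^m}\right)}{\operatorname{vol}\!\left(\partial B_R^{\mathbb{R}^m}\right)}
      = \frac{\omega_m R^m}{m\,\omega_m R^{m-1}}
      = \frac{R}{m},
    \]

    where \omega_m is the volume of the unit ball in R^m; the rigidity statement above says that, for dimension larger than 3 and under the stated curvature condition along \partial D_R, equality forces P to be totally geodesic.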

  17. Procedure for developing biological input for the design, location, or modification of water-intake structures

    Energy Technology Data Exchange (ETDEWEB)

    Neitzel, D.A.; McKenzie, D.H.

    1981-12-01

    To minimize adverse impact on aquatic ecosystems resulting from the operation of water intake structures, design engineers must have relevant information on the behavior, physiology and ecology of local fish and shellfish. Identification of stimulus/response relationships and the environmental factors that influence them is the first step in incorporating biological information in the design, location or modification of water intake structures. A procedure is presented in this document for providing biological input to engineers who are designing, locating or modifying a water intake structure. The authors discuss sources of stimuli at water intakes, historical approaches in assessing potential/actual impact and review biological information needed for intake design.

  18. Safety control and minimization of radioactive wastes

    International Nuclear Information System (INIS)

    Wang Jinming; Rong Feng; Li Jinyan; Wang Xin

    2010-01-01

    Compared with the developed countries, the safety control and minimization of radwastes in China are under-developed. Research on measures for the safety control and minimization of radwastes is important for controlling radwastes safely and for reducing treatment and disposal costs and environmental radiation hazards. This paper systematically discusses the safety control and minimization of the radwastes produced in the nuclear fuel cycle, in nuclear technology applications, and in the decommissioning of nuclear facilities, and provides some measures and methods for the safety control and minimization of radwastes. (authors)

  19. The evidence for biologic immunotherapy in Sarcoidosis: A systematic review

    Directory of Open Access Journals (Sweden)

    Pooja Shah

    2017-09-01

    Full Text Available Background Sarcoidosis is a chronic inflammatory disease with a myriad of clinical manifestations. Treatment involves immunosuppression with corticosteroids or steroid-sparing agents. A proportion of patients do not respond to, or are intolerant of, therapy. Targeted immunotherapy with biologic agents has emerged as a novel approach, with plausible mechanistic reasons to warrant study. Aims The aim of this review was to evaluate the evidence for the efficacy of biological therapy in sarcoidosis. Methods We conducted a systematic literature review and meta-analysis of all published randomised controlled trials (RCTs) evaluating biological therapy in sarcoidosis, using the MEDLINE and Embase databases, through to September 2017. The search terms included sarcoidosis, infliximab, adalimumab, etanercept, golimumab, certolizumab, rituximab, abatacept, tocilizumab, anakinra, ustekinumab, secukinumab. Only articles reporting RCTs were selected. Improvements in respiratory disease were assessed as changes in forced vital capacity (FVC), pooled as a weighted mean difference (WMD). There were insufficient data on outcome measures in other organ systems to comparatively assess efficacy. Results The search identified 2,324 studies, of which only 5 provided relevant and original data. These comprised a total of 364 patients and evaluated pulmonary, cutaneous and ocular sarcoidosis. One study in pulmonary disease and one study in cutaneous disease demonstrated improvements in the primary outcome. In pulmonary disease, meta-analysis of the treatment effect of anti-TNF therapy versus placebo on FVC revealed a WMD of 1.69 per cent (95 per cent confidence interval, 1.44–1.94). Conclusion There are insufficient data to suggest long-term efficacy of anti-TNFα inhibitors in the treatment of sarcoidosis. This may be due to heterogeneity, small sample sizes and the lack of consistent reporting of outcome measures.
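
    A hedged sketch of how a weighted mean difference of this kind is typically pooled under a fixed-effect, inverse-variance model (illustrative only; the per-trial numbers below are hypothetical, not the review's data):

    ```python
    import math

    def fixed_effect_wmd(studies):
        """Pool per-study mean differences (d, se) with inverse-variance weights."""
        weights = [1.0 / se ** 2 for _, se in studies]
        pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
        se_pooled = math.sqrt(1.0 / sum(weights))
        return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

    # hypothetical (mean difference in %FVC, standard error) for three trials
    wmd, (lo, hi) = fixed_effect_wmd([(1.5, 0.4), (2.1, 0.6), (1.2, 0.9)])
    print(f"WMD = {wmd:.2f}% (95% CI {lo:.2f} to {hi:.2f})")
    ```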

  20. Cooperation through Competition—Dynamics and Microeconomics of a Minimal Nutrient Trade System in Arbuscular Mycorrhizal Symbiosis

    Science.gov (United States)

    Schott, Stephan; Valdebenito, Braulio; Bustos, Daniel; Gomez-Porras, Judith L.; Sharma, Tripti; Dreyer, Ingo

    2016-01-01

    In arbuscular mycorrhizal (AM) symbiosis, fungi and plants exchange nutrients (sugars and phosphate, for instance) for reciprocal benefit. Until now it is not clear how this nutrient exchange system works. Here, we used computational cell biology to simulate the dynamics of a network of proton pumps and proton-coupled transporters that are upregulated during AM formation. We show that this minimal network is sufficient to describe accurately and realistically the nutrient trade system. By applying basic principles of microeconomics, we link the biophysics of transmembrane nutrient transport with the ecology of organismic interactions and straightforwardly explain macroscopic scenarios of the relations between plant and AM fungus. This computational cell biology study allows drawing far reaching hypotheses about the mechanism and the regulation of nutrient exchange and proposes that the “cooperation” between plant and fungus can be in fact the result of a competition between both for the same resources in the tiny periarbuscular space. The minimal model presented here may serve as benchmark to evaluate in future the performance of more complex models of AM nutrient exchange. As a first step toward this goal, we included SWEET sugar transporters in the model and show that their co-occurrence with proton-coupled sugar transporters results in a futile carbon cycle at the plant plasma membrane proposing that two different pathways for the same substrate should not be active at the same time. PMID:27446142

  1. At the biological modeling and simulation frontier.

    Science.gov (United States)

    Hunt, C Anthony; Ropella, Glen E P; Lam, Tai Ning; Tang, Jonathan; Kim, Sean H J; Engelberg, Jesse A; Sheikh-Bahaei, Shahab

    2009-11-01

    We provide a rationale for and describe examples of synthetic modeling and simulation (M&S) of biological systems. We explain how synthetic methods are distinct from familiar inductive methods. Synthetic M&S is a means to better understand the mechanisms that generate normal and disease-related phenomena observed in research, and how compounds of interest interact with them to alter phenomena. An objective is to build better, working hypotheses of plausible mechanisms. A synthetic model is an extant hypothesis: execution produces an observable mechanism and phenomena. Mobile objects representing compounds carry information enabling components to distinguish between them and react accordingly when different compounds are studied simultaneously. We argue that the familiar inductive approaches contribute to the general inefficiencies being experienced by pharmaceutical R&D, and that use of synthetic approaches accelerates and improves R&D decision-making and thus the drug development process. A reason is that synthetic models encourage and facilitate abductive scientific reasoning, a primary means of knowledge creation and creative cognition. When synthetic models are executed, we observe different aspects of knowledge in action from different perspectives. These models can be tuned to reflect differences in experimental conditions and individuals, making translational research more concrete while moving us closer to personalized medicine.

  2. Minimal string theory is logarithmic

    International Nuclear Information System (INIS)

    Ishimoto, Yukitaka; Yamaguchi, Shun-ichi

    2005-01-01

    We study the simplest examples of minimal string theory whose worldsheet description is the unitary (p,q) minimal model coupled to two-dimensional gravity (Liouville field theory). In the Liouville sector, we show that four-point correlation functions of 'tachyons' exhibit logarithmic singularities, and that the theory turns out to be logarithmic. The relation with Zamolodchikov's logarithmic degenerate fields is also discussed. Our result holds for generic values of (p,q)

  3. Barrett's esophagus: cancer and molecular biology.

    Science.gov (United States)

    Gibson, Michael K; Dhaliwal, Arashinder S; Clemons, Nicholas J; Phillips, Wayne A; Dvorak, Katerina; Tong, Daniel; Law, Simon; Pirchi, E Daniel; Räsänen, Jari; Krasna, Mark J; Parikh, Kaushal; Krishnadath, Kausilia K; Chen, Yu; Griffiths, Leonard; Colleypriest, Benjamin J; Farrant, J Mark; Tosh, David; Das, Kiron M; Bajpai, Manisha

    2013-10-01

    The following paper on the molecular biology of Barrett's esophagus (BE) includes commentaries on signaling pathways central to the development of BE including Hh, NF-κB, and IL-6/STAT3; surgical approaches for esophagectomy and classification of lesions by appropriate therapy; the debate over the merits of minimally invasive esophagectomy versus open surgery; outcomes for patients with pharyngolaryngoesophagectomy; the applications of neoadjuvant chemotherapy and chemoradiotherapy; animal models examining the surgical models of BE and esophageal adenocarcinoma; the roles of various morphogens and Cdx2 in BE; and the use of in vitro BE models for chemoprevention studies. © 2013 New York Academy of Sciences.

  4. The role of biological fertility in predicting family size.

    Science.gov (United States)

    Joffe, M; Key, J; Best, N; Jensen, T K; Keiding, N

    2009-08-01

    It is plausible that a couple's ability to achieve the desired number of children is limited by biological fertility, especially if childbearing is postponed. Family size has declined and semen quality may have deteriorated in much of Europe, although studies have found an increase rather than a decrease in couple fertility. Using four high-quality European datasets, we took the reported time to pregnancy (TTP) as the predictor variable; births reported as following contraceptive failure were an additional category. The outcome variable was final or near-final family size. Potential confounders were maternal age when unprotected sex began prior to the first birth, and maternal smoking. Desired family size was available in only one of the datasets. Couples with a TTP of at least 12 months tended to have smaller families, with odds ratios for the risk of not having a second child approximately 1.8, and for the risk of not having a third child approximately 1.6. Below 12 months no association was observed. Findings were generally consistent across datasets. There was also a more than 2-fold risk of not achieving the desired family size if TTP was 12 months or more for the first child. Within the limits of the available data quality, family size appears to be predicted by biological fertility, even after adjustment for maternal age, if the woman was at least 20 years old when the couple's first attempt at conception started. The contribution of behavioural factors to this result also needs to be investigated.

  5. Preliminary assessment of geologic materials to minimize biological intrusion of low-level waste trench covers and plans for the future

    International Nuclear Information System (INIS)

    Hakonson, T.E.; White, G.C.; Gladney, E.S.; Muller, M.

    1981-01-01

    The long-term integrity of low-level waste shallow land burial sites is dependent on the interaction of physical, chemical, and biological factors that modify the waste containment system. Past research on low-level waste shallow land burial methods has emphasized physical (i.e., water infiltration, soil erosion) and chemical (radionuclide leaching) processes that can cause radionuclide transport from a waste site. Preliminary results demonstrate that a sandy backfill material offers little resistance to root and animal intrusion through the cover profile. In contrast, bentonite clay, cobble, and cobble-gravel combinations do reduce plant root and animal intrusion through cover profiles compared with sandy backfill soil. However, bentonite clay barrier systems appear to be degraded by plant roots over time. Desiccation of the clay barrier by invading plant roots may limit the usefulness of bentonite clay as a moisture and/or biological barrier unless due consideration is given to this interaction. Future experiments are described that further examine the effect of plant roots on clay barrier systems and that determine the effectiveness of proposed biological barriers on larger scales and under various stress conditions

  6. Sequential unconstrained minimization algorithms for constrained optimization

    International Nuclear Information System (INIS)

    Byrne, Charles

    2008-01-01

    The problem of minimizing a function f(x): ℝ^J → ℝ, subject to constraints on the vector variable x, occurs frequently in inverse problems. Even without constraints, finding a minimizer of f(x) may require iterative methods. We consider here a general class of iterative algorithms that find a solution to the constrained minimization problem as the limit of a sequence of vectors, each solving an unconstrained minimization problem. Our sequential unconstrained minimization algorithm (SUMMA) is an iterative procedure for constrained minimization. At the kth step we minimize the function G_k(x) = f(x) + g_k(x) to obtain x^k. The auxiliary functions g_k(x): D ⊂ ℝ^J → ℝ_+ are nonnegative on the set D, each x^k is assumed to lie within D, and the objective is to minimize the continuous function f: ℝ^J → ℝ over x in the set C = D̄, the closure of D. We assume that such minimizers exist and denote one of them by x̂. We assume that the functions g_k(x) satisfy the inequalities 0 ≤ g_k(x) ≤ G_{k-1}(x) − G_{k-1}(x^{k-1}) for k = 2, 3, .... Using this assumption, we show that the sequence {f(x^k)} is decreasing and converges to f(x̂). If the restriction of f(x) to D has bounded level sets, which happens if x̂ is unique and f(x) is closed, proper and convex, then the sequence {x^k} is bounded and f(x*) = f(x̂) for any cluster point x*. Therefore, if x̂ is unique, x* = x̂ and {x^k} → x̂. When x̂ is not unique, convergence can still be obtained in particular cases. The SUMMA includes, as particular cases, the well-known barrier- and penalty-function methods, the simultaneous multiplicative algebraic reconstruction technique (SMART), the proximal minimization algorithm of Censor and Zenios, the entropic proximal methods of Teboulle, as well as certain cases of gradient descent and the Newton–Raphson method. The proof techniques used for SUMMA can be extended to obtain related results
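
    The abstract notes that the classical penalty-function methods arise as particular cases of SUMMA. The following is a minimal, hedged sketch of that special case only (a toy quadratic-penalty loop for a one-dimensional inequality constraint; it is not claimed to verify the SUMMA inequality on g_k verbatim): each outer step minimizes G_k(x) = f(x) + mu_k * violation(x)^2 with an increasing penalty weight mu_k.

    ```python
    def penalty_minimize(f, violation, x0, mus=(1.0, 10.0, 100.0, 1000.0),
                         step=5e-4, iters=20000):
        """Sequentially minimize G_k(x) = f(x) + mu_k * violation(x)**2 (1-D gradient descent)."""
        x = x0
        for mu in mus:                                  # increasing penalty weights
            G = lambda y, mu=mu: f(y) + mu * violation(y) ** 2
            for _ in range(iters):                      # crude unconstrained inner solve
                grad = (G(x + 1e-6) - G(x - 1e-6)) / 2e-6
                x -= step * grad
        return x

    # minimize f(x) = (x - 3)^2 subject to x <= 1; violation = max(0, x - 1)
    f = lambda x: (x - 3.0) ** 2
    viol = lambda x: max(0.0, x - 1.0)
    print(penalty_minimize(f, viol, x0=0.0))            # approaches the constrained minimizer x = 1
    ```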

  7. Comparative Study on Interaction of Form and Motion Processing Streams by Applying Two Different Classifiers in Mechanism for Recognition of Biological Movement

    Science.gov (United States)

    2014-01-01

    Research in psychophysics, neurophysiology, and functional imaging indicates a particular representation of biological movement involving two pathways. The visual perception of biological movement is formed through two processing streams of the visual system, the dorsal and the ventral stream. The ventral processing stream is associated with the extraction of form information; the dorsal processing stream provides motion information. The active basic model (ABM), a hierarchical representation of the human figure, introduced a novelty in the form pathway by applying a Gabor-based supervised object recognition method, which increases biological plausibility while preserving similarity to the original model. A fuzzy inference system is used for motion pattern information in the motion pathway, making the recognition process more robust. The interaction of these pathways is intriguing and has been considered in many studies across various fields. Here, the interaction of the pathways is investigated with the aim of obtaining more appropriate results. An extreme learning machine (ELM) is employed as the classification unit of the model because it retains the main properties of artificial neural networks while largely avoiding their long training times. Two different configurations of the interaction, one using a synergetic neural network and one using an ELM, are compared in terms of accuracy and compatibility. PMID:25276860

  8. Comparative Study on Interaction of Form and Motion Processing Streams by Applying Two Different Classifiers in Mechanism for Recognition of Biological Movement

    Directory of Open Access Journals (Sweden)

    Bardia Yousefi

    2014-01-01

    Full Text Available Research in psychophysics, neurophysiology, and functional imaging indicates a particular representation of biological movement involving two pathways. The visual perception of biological movement is formed through two processing streams of the visual system, the dorsal and the ventral stream. The ventral processing stream is associated with the extraction of form information; the dorsal processing stream provides motion information. The active basic model (ABM), a hierarchical representation of the human figure, introduced a novelty in the form pathway by applying a Gabor-based supervised object recognition method, which increases biological plausibility while preserving similarity to the original model. A fuzzy inference system is used for motion pattern information in the motion pathway, making the recognition process more robust. The interaction of these pathways is intriguing and has been considered in many studies across various fields. Here, the interaction of the pathways is investigated with the aim of obtaining more appropriate results. An extreme learning machine (ELM) is employed as the classification unit of the model because it retains the main properties of artificial neural networks while largely avoiding their long training times. Two different configurations of the interaction, one using a synergetic neural network and one using an ELM, are compared in terms of accuracy and compatibility.
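
    A hedged, minimal sketch of the extreme learning machine idea used as the classification unit above (illustrative only, not the authors' configuration): the hidden-layer weights are drawn at random and only the linear readout is fitted in closed form, which is why training is fast.

    ```python
    import numpy as np

    class ELM:
        """Minimal extreme learning machine: random hidden layer + least-squares readout."""
        def __init__(self, n_hidden=40, seed=0):
            self.n_hidden = n_hidden
            self.rng = np.random.default_rng(seed)

        def fit(self, X, Y):
            self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
            self.b = self.rng.normal(size=self.n_hidden)
            H = np.tanh(X @ self.W + self.b)       # random nonlinear feature map
            self.beta = np.linalg.pinv(H) @ Y      # closed-form readout weights
            return self

        def predict(self, X):
            return np.tanh(X @ self.W + self.b) @ self.beta

    # toy usage on hypothetical form/motion feature vectors with one-hot action labels
    X = np.random.rand(100, 12)
    Y = np.eye(3)[np.random.randint(0, 3, 100)]
    labels = ELM().fit(X, Y).predict(X).argmax(axis=1)
    ```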

  9. Pengaruh Pelapis Bionanokomposit terhadap Mutu Mangga Terolah Minimal

    Directory of Open Access Journals (Sweden)

    Ata Aditya Wardana

    2017-04-01

    Full Text Available Abstract Minimally processed mango is a perishable product due to high respiration and transpiration rates and to microbial decay. Edible coating is one alternative method to maintain the quality of minimally processed mango. The objective of this study was to evaluate the effects of a bionanocomposite edible coating made from tapioca and ZnO nanoparticles (NP-ZnO) on the quality of minimally processed mango cv. Arumanis stored for 12 days at 8°C. Combinations of tapioca and NP-ZnO (0, 1, 2% by weight of tapioca) were used to coat the minimally processed mango. The results showed that the bionanocomposite edible coatings were able to maintain the quality of minimally processed mango during storage. The bionanocomposite of tapioca + NP-ZnO (2% by weight of tapioca) was the most effective in limiting weight loss and changes in firmness, browning index, total acidity, total soluble solids, respiration, and microbial counts. Thus, a bionanocomposite edible coating may provide an alternative method to maintain the storage quality of minimally processed mango. Abstract (Indonesian original, translated): Minimally processed mango is a product that deteriorates quickly because of rapid respiration, transpiration, and microbial damage. Edible coating is one alternative method for maintaining the quality of minimally processed mango. The aim of this study was to evaluate the effect of a bionanocomposite coating of tapioca and ZnO nanoparticles (NP-ZnO) on the quality of minimally processed mango cv. Arumanis stored for 12 days at 8°C. Combinations of tapioca and NP-ZnO (0, 1, 2% w/w tapioca) were used to coat the minimally processed mango. The results showed that the bionanocomposite coating was able to maintain the quality of minimally processed mango during storage. The bionanocomposite of tapioca + NP-ZnO (2% w/w tapioca) was the most effective in limiting weight loss and the decline in firmness, browning index, total acid, total soluble solids, respiration and total

  10. Hazardous waste minimization tracking system

    International Nuclear Information System (INIS)

    Railan, R.

    1994-01-01

    Under RCRA sections 3002(b) and 3005(h), hazardous waste generators and owners/operators of treatment, storage, and disposal facilities (TSDFs) are required to certify that they have a program in place to reduce the volume or quantity and toxicity of hazardous waste to the degree determined to be economically practicable. In many cases, there are environmental, as well as economic, benefits for agencies that pursue pollution prevention options. Several state governments have already enacted waste minimization legislation (e.g., the Massachusetts Toxic Use Reduction Act of 1989, and the Oregon Toxic Use Reduction Act and Hazardous Waste Reduction Act, July 2, 1989). About twenty-six other states have established legislation that will mandate some type of waste minimization program and/or facility planning. The need to address the HAZMIN (Hazardous Waste Minimization) Program at government agencies and private industries has prompted us to identify the importance of managing the HAZMIN program, tracking various aspects of the program, and tracking the progress made in this area. "WASTE" is a tracking system which can be used and modified to maintain the information related to a Hazardous Waste Minimization Program in a manageable fashion. This program maintains, modifies, and retrieves information related to hazardous waste minimization and recycling, and provides automated report generating capabilities. It has a built-in menu, which can be printed either in part or in full. There are instructions on preparing the Annual Waste Report and the Annual Recycling Report. The program is very user friendly. This program is available on 3.5 inch or 5 1/4 inch floppy disks. A computer with 640K memory is required

  11. The Quest for Minimal Quotients for Probabilistic Automata

    DEFF Research Database (Denmark)

    Eisentraut, Christian; Hermanns, Holger; Schuster, Johann

    2013-01-01

    One of the prevailing ideas in applied concurrency theory and verification is the concept of automata minimization with respect to strong or weak bisimilarity. The minimal automata can be seen as canonical representations of the behaviour modulo the bisimilarity considered. Together with congruence results wrt. process algebraic operators, this can be exploited to alleviate the notorious state space explosion problem. In this paper, we aim at identifying minimal automata and canonical representations for concurrent probabilistic models. We present minimality and canonicity results for probabilistic automata wrt. strong and weak bisimilarity, together with polynomial time minimization algorithms.
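
    A hedged sketch of the general minimize-by-bisimulation idea for a plain (non-probabilistic) labelled transition system; the probabilistic and weak variants treated in the paper require considerably more machinery. States are merged once partition refinement can no longer tell them apart:

    ```python
    def bisimulation_quotient(states, transitions):
        """Coarsest strong-bisimulation partition of a labelled transition system.

        transitions: dict mapping (state, label) -> set of successor states.
        Returns a list of blocks (frozensets of mutually bisimilar states)."""
        labels = sorted({lbl for (_, lbl) in transitions})
        partition = [frozenset(states)]
        while True:
            block_of = {s: i for i, blk in enumerate(partition) for s in blk}
            sig = lambda s: tuple(
                frozenset(block_of[t] for t in transitions.get((s, lbl), ()))
                for lbl in labels)
            refined = []
            for blk in partition:
                groups = {}
                for s in blk:
                    groups.setdefault(sig(s), set()).add(s)
                refined.extend(frozenset(g) for g in groups.values())
            if len(refined) == len(partition):
                return refined
            partition = refined

    # toy LTS: states 1 and 2 are bisimilar, state 3 is not
    trans = {(0, 'a'): {1, 2}, (1, 'b'): {3}, (2, 'b'): {3}}
    print(bisimulation_quotient({0, 1, 2, 3}, trans))
    ```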

  12. Supersymmetric hybrid inflation with non-minimal Kahler potential

    International Nuclear Information System (INIS)

    Bastero-Gil, M.; King, S.F.; Shafi, Q.

    2007-01-01

    Minimal supersymmetric hybrid inflation based on a minimal Kahler potential predicts a spectral index n_s ≳ 0.98. On the other hand, WMAP three-year data prefer a central value n_s ≈ 0.95. We propose a class of supersymmetric hybrid inflation models based on the same minimal superpotential but with a non-minimal Kahler potential. Including radiative corrections using the one-loop effective potential, we show that the prediction for the spectral index is sensitive to the small non-minimal corrections, and can lead to a significantly red-tilted spectrum, in agreement with WMAP

  13. Null-polygonal minimal surfaces in AdS_4 from perturbed W minimal models

    Energy Technology Data Exchange (ETDEWEB)

    Hatsuda, Yasuyuki [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Ito, Katsushi [Tokyo Institute of Technology (Japan). Dept. of Physics; Satoh, Yuji [Tsukuba Univ., Sakura, Ibaraki (Japan). Inst. of Physics

    2012-11-15

    We study the null-polygonal minimal surfaces in AdS_4, which correspond to the gluon scattering amplitudes/Wilson loops in N=4 super Yang-Mills theory at strong coupling. The area of the minimal surfaces with n cusps is characterized by the thermodynamic Bethe ansatz (TBA) integral equations or the Y-system of the homogeneous sine-Gordon model, which is regarded as the SU(n-4)_4/U(1)^(n-5) generalized parafermion theory perturbed by the weight-zero adjoint operators. Based on the relation to the TBA systems of the perturbed W minimal models, we solve the TBA equations by using the conformal perturbation theory, and obtain the analytic expansion of the remainder function around the UV/regular-polygonal limit for n = 6 and 7. We compare the rescaled remainder function for n = 6 with the two-loop one, to observe that they are close to each other similarly to the AdS_3 case.

  14. Assessment of LANL waste minimization plan

    International Nuclear Information System (INIS)

    Davis, K.D.; McNair, D.A.; Jennrich, E.A.; Lund, D.M.

    1991-04-01

    The objective of this report is to evaluate the Los Alamos National Laboratory (LANL) Waste Minimization Plan to determine if it meets applicable internal (DOE) and regulatory requirements. The intent of the effort is to assess the higher-level elements of the documentation to determine if they have been addressed, rather than the detailed mechanics of the program's implementation. The requirement for a Waste Minimization Plan is based on several DOE Orders as well as environmental laws and regulations. Table 2-1 provides a list of the major documents or regulations that require waste minimization efforts. The table also summarizes the applicable requirements

  15. Blackfolds, plane waves and minimal surfaces

    Science.gov (United States)

    Armas, Jay; Blau, Matthias

    2015-07-01

    Minimal surfaces in Euclidean space provide examples of possible non-compact horizon geometries and topologies in asymptotically flat space-time. On the other hand, the existence of limiting surfaces in the space-time provides a simple mechanism for making these configurations compact. Limiting surfaces appear naturally in a given space-time by making minimal surfaces rotate but they are also inherent to plane wave or de Sitter space-times in which case minimal surfaces can be static and compact. We use the blackfold approach in order to scan for possible black hole horizon geometries and topologies in asymptotically flat, plane wave and de Sitter space-times. In the process we uncover several new configurations, such as black helicoids and catenoids, some of which have an asymptotically flat counterpart. In particular, we find that the ultraspinning regime of singly-spinning Myers-Perry black holes, described in terms of the simplest minimal surface (the plane), can be obtained as a limit of a black helicoid, suggesting that these two families of black holes are connected. We also show that minimal surfaces embedded in spheres rather than Euclidean space can be used to construct static compact horizons in asymptotically de Sitter space-times.
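
    For orientation, the two classical minimal surfaces invoked above, the helicoid and the catenoid, admit the standard parametrizations (quoted here as a hedged aside, not as the blackfold embeddings used in the paper)

    \[
      \text{helicoid: } \mathbf{x}(u,v) = (u\cos v,\; u\sin v,\; c\,v),
      \qquad
      \text{catenoid: } \mathbf{x}(u,v) = \left(c\cosh\tfrac{u}{c}\cos v,\; c\cosh\tfrac{u}{c}\sin v,\; u\right),
    \]

    both of which have vanishing mean curvature in Euclidean space; the plane, described above as the simplest minimal surface, is recovered from the helicoid in the limit c → 0.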

  16. Blackfolds, plane waves and minimal surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Armas, Jay [Physique Théorique et Mathématique, Université Libre de Bruxelles and International Solvay Institutes, ULB-Campus Plaine CP231, B-1050 Brussels (Belgium); Albert Einstein Center for Fundamental Physics, University of Bern,Sidlerstrasse 5, 3012 Bern (Switzerland); Blau, Matthias [Albert Einstein Center for Fundamental Physics, University of Bern,Sidlerstrasse 5, 3012 Bern (Switzerland)

    2015-07-29

    Minimal surfaces in Euclidean space provide examples of possible non-compact horizon geometries and topologies in asymptotically flat space-time. On the other hand, the existence of limiting surfaces in the space-time provides a simple mechanism for making these configurations compact. Limiting surfaces appear naturally in a given space-time by making minimal surfaces rotate but they are also inherent to plane wave or de Sitter space-times in which case minimal surfaces can be static and compact. We use the blackfold approach in order to scan for possible black hole horizon geometries and topologies in asymptotically flat, plane wave and de Sitter space-times. In the process we uncover several new configurations, such as black helicoids and catenoids, some of which have an asymptotically flat counterpart. In particular, we find that the ultraspinning regime of singly-spinning Myers-Perry black holes, described in terms of the simplest minimal surface (the plane), can be obtained as a limit of a black helicoid, suggesting that these two families of black holes are connected. We also show that minimal surfaces embedded in spheres rather than Euclidean space can be used to construct static compact horizons in asymptotically de Sitter space-times.

  17. Minimal modification to tribimaximal mixing

    International Nuclear Information System (INIS)

    He Xiaogang; Zee, A.

    2011-01-01

    We explore some ways of minimally modifying the neutrino mixing matrix from tribimaximal, characterized by introducing at most one mixing angle and a CP violating phase, thus extending our earlier work. One minimal modification, motivated to some extent by group theoretic considerations, is a simple case with the elements V_α2 of the second column in the mixing matrix equal to 1/√3. Modifications by keeping one of the columns or one of the rows unchanged from tribimaximal mixing all belong to the class of minimal modification. Some of the cases have interesting experimentally testable consequences. In particular, the T2K and MINOS collaborations have recently reported indications of a nonzero θ_13. For the cases we consider, the new data sharply constrain the CP violating phase angle δ, with δ close to 0 (in some cases) and π disfavored.

  18. Biological intrusion of low-level-waste trench covers

    International Nuclear Information System (INIS)

    Hakonson, T.E.; Gladney, E.S.

    1981-01-01

    The long-term integrity of low-level waste shallow land burial sites is dependent on the interaction of physical, chemical, and biological factors that modify the waste containment system. Past research on low-level waste shallow land burial methods has emphasized physical (i.e., water infiltration, soil erosion) and chemical (radionuclide leaching) processes that can cause waste site failure and subsequent radionuclide transport. The purpose of this paper is to demonstrate the need to consider biological processes as being potentially important in reducing the integrity of waste burial site cover treatments. Plants and animals not only can transport radionuclides to the ground surface via root systems and soil excavated from the cover profile by animal burrowing activities, but they modify physical and chemical processes within the cover profile by changing the water infiltration rates, soil erosion rates and chemical composition of the soil. One approach to limiting biological intrusion through the waste cover is to apply a barrier within the profile to limit root and animal penetration with depth. Experiments in the Los Alamos Experimental Engineered Test Facility were initiated to develop and evaluate biological barriers that are effective in minimizing intrusion into waste trenches. The experiments that are described employ four different candidate barrier materials of geologic origin. Experimental variables that will be evaluated, in addition to barrier type, are barrier depth and soil overburden depth. The rate of biological intrusion through the various barrier materials is being evaluated through the use of activatable stable tracers

  19. Hazardous waste minimization

    International Nuclear Information System (INIS)

    Freeman, H.

    1990-01-01

    This book presents an overview of waste minimization. Covers applications of technology to waste reduction, techniques for implementing programs, incorporation of programs into R and D, strategies for private industry and the public sector, and case studies of programs already in effect

  20. Minimal constrained supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Cribiori, N. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Dall' Agata, G., E-mail: dallagat@pd.infn.it [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Farakos, F. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Porrati, M. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States)

    2017-01-10

    We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  1. Minimal constrained supergravity

    International Nuclear Information System (INIS)

    Cribiori, N.; Dall'Agata, G.; Farakos, F.; Porrati, M.

    2017-01-01

    We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  2. Molecular biology of mycoplasmas: from the minimum cell concept to the artificial cell.

    Science.gov (United States)

    Cordova, Caio M M; Hoeltgebaum, Daniela L; Machado, Laís D P N; Santos, Larissa Dos

    2016-01-01

    Mycoplasmas are a large group of bacteria, sorted into different genera in the class Mollicutes, whose main characteristic in common, besides the small genome, is the absence of a cell wall. They are considered models for the study of cellular and molecular biology. We present an updated review of the molecular biology of these model microorganisms and of the development of replicative vectors for the transformation of mycoplasmas. Synthetic biology studies inspired by these pioneering works became possible and won the attention of the mainstream media. For the first time, an artificial genome was synthesized (a minimal genome produced from consensus sequences obtained from mycoplasmas). For the first time, a functional artificial cell was constructed by introducing a completely synthesized genome into the cell envelope of a mycoplasma obtained by transformation techniques. Therefore, this article offers an updated insight into the state of the art of the molecular biology of these peculiar organisms.

  3. The minimal melanogenesis dose/minimal erythema dose ratio declines with increasing skin pigmentation using solar simulator and narrowband ultraviolet B exposure

    DEFF Research Database (Denmark)

    Ravnbak, Mette H; Philipsen, Peter A; Wulf, Hans Christian

    2010-01-01

    To investigate the relation between pre-exposure skin pigmentation and the minimal melanogenesis dose (MMD)/minimal erythema dose (MED) ratio after a single narrowband ultraviolet B (nUVB) and solar simulator (Solar) exposure.

  4. Qualifying and quantifying minimal hepatic encephalopathy

    DEFF Research Database (Denmark)

    Morgan, Marsha Y; Amodio, Piero; Cook, Nicola A

    2016-01-01

    Minimal hepatic encephalopathy is the term applied to the neuropsychiatric status of patients with cirrhosis who are unimpaired on clinical examination but show alterations in neuropsychological tests exploring psychomotor speed/executive function and/or in neurophysiological variables. ... analytical techniques may provide better diagnostic information, while the advent of portable wireless headsets may facilitate more widespread use. A large number of other diagnostic tools have been validated for the diagnosis of minimal hepatic encephalopathy, including the Critical Flicker Frequency, the Inhibitory Control Test, the Stroop test, the Scan package and the Continuous Reaction Time; each has its pros and cons, strengths and weaknesses, protagonists and detractors. Recent AASLD/EASL Practice Guidelines suggest that the diagnosis of minimal hepatic encephalopathy should be based on the PHES test

  5. A Systems Biology Analysis Unfolds the Molecular Pathways and Networks of Two Proteobacteria in Spaceflight and Simulated Microgravity Conditions.

    Science.gov (United States)

    Roy, Raktim; Shilpa, P Phani; Bagh, Sangram

    2016-09-01

    Bacteria are important organisms for space missions due to their increased pathogenesis in microgravity that poses risks to the health of astronauts and for projected synthetic biology applications at the space station. We understand little about the effect, at the molecular systems level, of microgravity on bacteria, despite their significant incidence. In this study, we proposed a systems biology pipeline and performed an analysis on published gene expression data sets from multiple seminal studies on Pseudomonas aeruginosa and Salmonella enterica serovar Typhimurium under spaceflight and simulated microgravity conditions. By applying gene set enrichment analysis on the global gene expression data, we directly identified a large number of new, statistically significant cellular and metabolic pathways involved in response to microgravity. Alteration of metabolic pathways in microgravity has rarely been reported before, whereas in this analysis metabolic pathways are prevalent. Several of those pathways were found to be common across studies and species, indicating a common cellular response in microgravity. We clustered genes based on their expression patterns using consensus non-negative matrix factorization. The genes from different mathematically stable clusters showed protein-protein association networks with distinct biological functions, suggesting the plausible functional or regulatory network motifs in response to microgravity. The newly identified pathways and networks showed connection with increased survival of pathogens within macrophages, virulence, and antibiotic resistance in microgravity. Our work establishes a systems biology pipeline and provides an integrated insight into the effect of microgravity at the molecular systems level. Systems biology-Microgravity-Pathways and networks-Bacteria. Astrobiology 16, 677-689.
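
    A hedged sketch of the non-negative matrix factorization step described above (illustrative only; the study applied consensus NMF to the published expression data sets, whereas the matrix below is random): each gene is assigned to the metagene on which it loads most strongly.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    expression = rng.gamma(shape=2.0, scale=1.0, size=(200, 12))  # hypothetical genes x conditions

    k = 4                                    # number of expression patterns (clusters)
    model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(expression)      # gene loadings on k metagenes
    H = model.components_                    # k basis patterns across conditions

    gene_cluster = W.argmax(axis=1)          # cluster = dominant metagene per gene
    for c in range(k):
        print(f"cluster {c}: {np.sum(gene_cluster == c)} genes")
    ```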

  6. Matthew Arnold and Minimal Competency Testing.

    Science.gov (United States)

    Tuman, Myron C.

    1979-01-01

    Presents arguments by Robert Lowe and Matthew Arnold on the 19th century British "Payment by Results" Plan, whereby schools received funds for students who passed minimal competency tests. Emphasizes that the Victorian experience produced acrimonious teachers with low morale and encourages contemporary minimal testing advocates not to…

  7. Minimal constrained supergravity

    Directory of Open Access Journals (Sweden)

    N. Cribiori

    2017-01-01

    Full Text Available We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  8. Corporate tax minimization and stock price reactions

    OpenAIRE

    Blaufus, Kay; Möhlmann, Axel; Schwäbe, Alexander

    2016-01-01

    Tax minimization strategies may lead to significant tax savings, which could, in turn, increase firm value. However, such strategies are also associated with significant costs, such as expected penalties and planning, agency, and reputation costs. The overall impact of firms' tax minimization strategies on firm value is, therefore, unclear. To investigate whether corporate tax minimization increases firm value, we analyze the stock price reaction to news concerning corporate tax avoidance or ...

  9. Annual Waste Minimization Summary Report

    International Nuclear Information System (INIS)

    Haworth, D.M.

    2011-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC, for the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year 2010. The NNSA/NSO Pollution Prevention Program establishes a process to reduce the volume and toxicity of waste generated by NNSA/NSO activities and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment.

  10. Minimal free resolutions over complete intersections

    CERN Document Server

    Eisenbud, David

    2016-01-01

    This book introduces a theory of higher matrix factorizations for regular sequences and uses it to describe the minimal free resolutions of high syzygy modules over complete intersections. Such resolutions have attracted attention ever since the elegant construction of the minimal free resolution of the residue field by Tate in 1957. The theory extends the theory of matrix factorizations of a non-zero divisor, initiated by Eisenbud in 1980, which yields a description of the eventual structure of minimal free resolutions over a hypersurface ring. Matrix factorizations have had many other uses in a wide range of mathematical fields, from singularity theory to mathematical physics.

  11. Minimal-change nephropathy and malignant thymoma.

    Science.gov (United States)

    Varsano, S; Bruderman, I; Bernheim, J L; Rathaus, M; Griffel, B

    1980-05-01

    A 56-year-old man had fever, precordial pain, and a mediastinal mass. The mass disappeared two months later and the patient remained asymptomatic for 2 1/2 years. At that time a full-blown nephrotic syndrome developed, with minimal-change glomerulopathy. The chest x-ray film showed the reappearance of a giant mediastinal mass. On biopsy of the mass, malignant thymoma was diagnosed. Association between minimal-change disease and Hodgkin's disease is well known, while the association with malignant thymoma has not been previously reported. The relationship between malignant thymoma and minimal-change disease is discussed, and a possible pathogenic mechanism involving cell-mediated immunity is proposed.

  12. Minimally invasive spine surgery: Hurdles to be crossed

    Directory of Open Access Journals (Sweden)

    Mahesh Bijjawara

    2014-01-01

    Full Text Available MISS as a concept is noble, and all surgeons need to address and minimize surgical morbidity for better results. However, we need to be cautious and not fall into accepting that minimally invasive spine surgery can be done only when certain metal access systems are used. Minimally invasive spine surgery (MISS) has come a long way since the description of endoscopic discectomy in 1997 and minimally invasive TLIF (mTLIF) in 2003. Today there is credible evidence (though not level-I) that MISS has comparable results to open spine surgery, with the advantages of early postoperative recovery and decreased blood loss and infection rates. However, apart from decreasing muscle trauma and muscle dissection during multilevel open spinal instrumentation, existing MISS technologies have contributed little to the other morbidity parameters, such as operative time, blood loss, access to decompression, and atraumatic neural tissue handling. Since all these parameters contribute more than posterior muscle trauma to the overall surgical morbidity, we as surgeons need to introspect before we accept the concept of minimally invasive spine surgery being reduced to surgeries performed with a few tubular retractors. A spine surgeon needs to constantly improve his skills and techniques so that he can minimize blood loss, minimize traumatic neural tissue handling, and minimize operative time without compromising the surgical goals. These measures actually contribute far more to decreasing morbidity than approach-related muscle damage alone. Minimally invasive spine surgery, though it has come a long way, needs to provide technical solutions to minimize all the morbidity parameters involved in spine surgery before it can replace most open spine surgeries, as has happened with laparoscopic surgery or arthroscopic surgery.

  13. Minimal knotted polygons in cubic lattices

    International Nuclear Information System (INIS)

    Van Rensburg, E J Janse; Rechnitzer, A

    2011-01-01

    In this paper we examine numerically the properties of minimal length knotted lattice polygons in the simple cubic, face-centered cubic, and body-centered cubic lattices by sieving minimal length polygons from a data stream of a Monte Carlo algorithm, implemented as described in Aragão de Carvalho and Caracciolo (1983 Phys. Rev. B 27 1635), Aragão de Carvalho et al (1983 Nucl. Phys. B 215 209) and Berg and Foester (1981 Phys. Lett. B 106 323). The entropy, mean writhe, and mean curvature of minimal length polygons are computed (in some cases exactly). While the minimal length and mean curvature are found to be lattice dependent, the mean writhe is found to be only weakly dependent on the lattice type. Comparison of our results to numerical results for the writhe obtained elsewhere (see Janse van Rensburg et al 1999 Contributed to Ideal Knots (Series on Knots and Everything vol 19) ed Stasiak, Katritch and Kauffman (Singapore: World Scientific), Portillo et al 2011 J. Phys. A: Math. Theor. 44 275004) shows that the mean writhe is also insensitive to the length of a knotted polygon. Thus, while these results for the mean writhe and mean absolute writhe at minimal length are not universal, our results demonstrate that these values are quite close to those of long polygons regardless of the underlying lattice and length

  14. LLNL Waste Minimization Program Plan

    International Nuclear Information System (INIS)

    1990-01-01

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The Waste Minimization Policy field has undergone continuous changes since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions was made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally-harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs

  15. Improving the performance of minimizers and winnowing schemes.

    Science.gov (United States)

    Marçais, Guillaume; Pellow, David; Bork, Daniel; Orenstein, Yaron; Shamir, Ron; Kingsford, Carl

    2017-07-15

    The minimizers scheme is a method for selecting k-mers from sequences. It is used in many bioinformatics software tools to bin comparable sequences or to sample a sequence in a deterministic fashion at approximately regular intervals, in order to reduce memory consumption and processing time. Although very useful, the minimizers selection procedure has undesirable behaviors (e.g. too many k-mers are selected when processing certain sequences). Some of these problems were already known to the authors of the minimizers technique, and the natural lexicographic ordering of k-mers used by minimizers was recognized as their origin. Many software tools using minimizers employ ad hoc variations of the lexicographic order to alleviate those issues. We provide an in-depth analysis of the effect of k-mer ordering on the performance of the minimizers technique. By using small universal hitting sets (a recently defined concept), we show how to significantly improve the performance of minimizers and avoid some of its worse behaviors. Based on these results, we encourage bioinformatics software developers to use an ordering based on a universal hitting set or, if not possible, a randomized ordering, rather than the lexicographic order. This analysis also settles negatively a conjecture (by Schleimer et al.) on the expected density of minimizers in a random sequence. The software used for this analysis is available on GitHub: https://github.com/gmarcais/minimizers.git. gmarcais@cs.cmu.edu or carlk@cs.cmu.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
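
    A hedged sketch of the basic minimizers selection scheme analyzed above (a simplified illustration, not the authors' implementation): from every window of w consecutive k-mers the smallest k-mer under a chosen ordering is kept, and swapping the lexicographic order for a randomized (hashed) order is the kind of change whose effect the paper quantifies.

    ```python
    import hashlib

    def minimizers(seq, k, w, order=None):
        """Return the positions of window minimizers of seq for k-mer size k, window w."""
        if order is None:
            order = lambda kmer: kmer                      # lexicographic ordering
        km = [seq[i:i + k] for i in range(len(seq) - k + 1)]
        selected = set()
        for start in range(len(km) - w + 1):               # slide a window of w k-mers
            window = range(start, start + w)
            selected.add(min(window, key=lambda i: order(km[i])))
        return sorted(selected)

    def hashed_order(kmer):
        """Pseudo-random ordering via a hash, as an alternative to lexicographic order."""
        return hashlib.sha1(kmer.encode()).hexdigest()

    seq = "ACGTACGTTGCAACGTT"
    print(minimizers(seq, k=5, w=4))                       # lexicographic ordering
    print(minimizers(seq, k=5, w=4, order=hashed_order))   # randomized ordering
    ```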

  16. Danish Sixties Avant-Garde and American Minimal Art

    Directory of Open Access Journals (Sweden)

    Max Ipsen

    2007-05-01

    Full Text Available Denmark is peripheral in the history of minimalism in the arts. In an international perspective Danish artists made almost no contributions to minimalism, according to art historians. But the fact is that Danish artists made minimalist works of art, and they did it very early. Art historians tend to describe minimal art as an entirely American phenomenon. America is the centre, Europe the periphery that lagged behind the centre, imitating American art. I will try to query this view with examples from Danish minimalism. I will discuss minimalist tendencies in Danish art and literature in the 1960s, and I will examine whether one can claim that Danish artists were influenced by American minimal art.

  17. Theories of minimalism in architecture: When prologue becomes palimpsest

    Directory of Open Access Journals (Sweden)

    Stevanović Vladimir

    2014-01-01

    Full Text Available This paper examines the modus and conditions of constituting and establishing the architectural discourse on minimalism. One of the key topics in this discourse is the historical line of development and the analysis of theoretical influences, which comprises the connections of recent minimalism with theorizations of various minimal architectural and artistic forms and concepts from the past. The paper particularly discusses those theoretical relations which, in a unitary way, link minimalism in architecture with its artistic nominal counterpart, minimal art. These are relations founded on interpretative models of self-referentiality, phenomenological experience and contextualism, which are, superficially observed, common to both the artistic and the architectural minimalist discourses. It seems that in this constellation certain relations on the historical line of minimalism in architecture are questionable, while some others are overlooked. Precisely, postmodern fundamentalism is the architectural direction: (1) in which these three interpretations also existed; (2) from which architectural theorists retroactively appropriated many architects, proclaiming them minimalists; and (3) which established relations with modern and postmodern theoretical and socio-historical contexts identical to those later established in minimalism. In spite of this, the theoretical field of postmodern fundamentalism is surprisingly neglected in the discourse of minimalism in architecture. Instead of being understood as a kind of prologue to minimalism in architecture, postmodern fundamentalism becomes an erased palimpsest over which a different history of minimalism is rewritten, a history in which minimal art occupies a central place.

  18. BIOLOGICAL MONITORING PROGRAM FOR EAST FORK POPLAR CREEK

    Energy Technology Data Exchange (ETDEWEB)

    ADAMS, S.M.; ASHWOOD, T.L.; BEATY, T.W.; BRANDT, C.C.

    1997-10-24

    In May 1985, a National Pollutant Discharge Elimination System (NPDES) permit was issued for the Oak Ridge Y-12 Plant. As a condition of the permit, a Biological Monitoring and Abatement Program (BMAP) was developed to demonstrate that the effluent limitations established for the Y-12 Plant protect the classified uses of the receiving stream (East Fork Poplar Creek; EFPC), in particular, the growth and propagation of aquatic life (Lear et al. 1989). A second objective of the BMAP is to document the ecological effects resulting from the implementation of a water pollution control program designed to eliminate direct discharges of wastewaters to EFPC and to minimize the inadvertent release of pollutants to the environment. Because of the complex nature of the discharges to EFPC and the temporal and spatial variability in the composition of the discharges, a comprehensive, integrated approach to biological monitoring was developed. A new permit was issued to the Y-12 Plant on April 28, 1995 and became effective on July 1, 1995. Biological monitoring continues to be required under the new permit. The BMAP consists of four major tasks that reflect different but complementary approaches to evaluating the effects of the Y-12 Plant discharges on the aquatic integrity of EFPC. These tasks are (1) toxicity monitoring, (2) biological indicator studies, (3) bioaccumulation studies, and (4) ecological surveys of the periphyton, benthic macroinvertebrate, and fish communities.

  19. Biological monitoring program for East Fork Poplar Creek

    Energy Technology Data Exchange (ETDEWEB)

    Adams, S.M.; Ashwood, T.L.; Beaty, T.W.; Brandt, C.C.; Christensen, S.W.; Cicerone, D.S.; Greeley, M.S. Jr.; Hill, W.R.; Kszos, L.S.

    1997-04-18

    In May 1985, a National Pollutant Discharge Elimination System (NPDES) permit was issued for the Oak Ridge Y-12 Plant. As a condition of the permit, a Biological Monitoring and Abatement Program (BMAP) was developed to demonstrate that the effluent limitations established for the Y-12 Plant protect the classified uses of the receiving stream (East Fork Poplar Creek; EFPC), in particular, the growth and propagation of aquatic life (Lear et al. 1989). A second objective of the BMAP is to document the ecological effects resulting from the implementation of a water pollution control program designed to eliminate direct discharges of wastewaters to EFPC and to minimize the inadvertent release of pollutants to the environment. Because of the complex nature of the discharges to EFPC and the temporal and spatial variability in the composition of the discharges, a comprehensive, integrated approach to biological monitoring was developed. A new permit was issued to the Y-12 Plant on April 28, 1995 and became effective on July 1, 1995. Biological monitoring continues to be required under the new permit. The BMAP consists of four major tasks that reflect different but complementary approaches to evaluating the effects of the Y-12 Plant discharges on the aquatic integrity of EFPC. These tasks are (1) toxicity monitoring, (2) biological indicator studies, (3) bioaccumulation studies, and (4) ecological surveys of the periphyton, benthic macroinvertebrate, and fish communities.

  20. Waste minimization and pollution prevention awareness plan

    International Nuclear Information System (INIS)

    1991-01-01

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with the Department of Energy's (DOE) policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs

  1. Waste minimization and pollution prevention awareness plan

    Energy Technology Data Exchange (ETDEWEB)

    1991-05-31

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with the Department of Energy's (DOE) policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs.

  2. Inflationary models with non-minimally derivative coupling

    International Nuclear Information System (INIS)

    Yang, Nan; Fei, Qin; Gong, Yungui; Gao, Qing

    2016-01-01

    We derive the general formulae for the scalar and tensor spectral tilts to the second order for the inflationary models with non-minimally derivative coupling without taking the high friction limit. The non-minimal kinetic coupling to the Einstein tensor brings the energy scale of the inflationary models down to sub-Planckian values. In the high friction limit, the Lyth bound is modified with an extra suppression factor, so that the field excursion of the inflaton is sub-Planckian. The inflationary models with non-minimally derivative coupling are more consistent with observations in the high friction limit. In particular, with the help of the non-minimally derivative coupling, the quartic power law potential is consistent with the observational constraint at 95% CL. (paper)

  3. Reliable gene expression analysis by reverse transcription-quantitative PCR: reporting and minimizing the uncertainty in data accuracy.

    Science.gov (United States)

    Remans, Tony; Keunen, Els; Bex, Geert Jan; Smeets, Karen; Vangronsveld, Jaco; Cuypers, Ann

    2014-10-01

    Reverse transcription-quantitative PCR (RT-qPCR) has been widely adopted to measure differences in mRNA levels; however, biological and technical variation strongly affects the accuracy of the reported differences. RT-qPCR specialists have warned that, unless researchers minimize this variability, they may report inaccurate differences and draw incorrect biological conclusions. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines describe procedures for conducting and reporting RT-qPCR experiments. The MIQE guidelines enable others to judge the reliability of reported results; however, a recent literature survey found low adherence to these guidelines. Additionally, even experiments that use appropriate procedures remain subject to individual variation that statistical methods cannot correct. For example, since ideal reference genes do not exist, the widely used method of normalizing RT-qPCR data to reference genes generates background noise that affects the accuracy of measured changes in mRNA levels. However, current RT-qPCR data reporting styles ignore this source of variation. In this commentary, we direct researchers to appropriate procedures, outline a method to present the remaining uncertainty in data accuracy, and propose an intuitive way to select reference genes to minimize uncertainty. Reporting the uncertainty in data accuracy also serves for quality assessment, enabling researchers and peer reviewers to confidently evaluate the reliability of gene expression data. © 2014 American Society of Plant Biologists. All rights reserved.
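
    The normalization step the commentary refers to is easy to make concrete. Purely as an illustration, and not as the commentary's own method, the sketch below computes a 2^-ΔΔCt fold change against the mean Ct of three reference genes and reports a crude error band from replicate scatter; the Ct values, gene names, and the simple uncertainty treatment are all hypothetical.

```python
import numpy as np

# Hypothetical Ct values (cycles) for one target gene and three candidate
# reference genes, measured in control and treated samples (triplicates).
ct = {
    "target": {"control": [24.1, 24.3, 24.0], "treated": [22.8, 23.0, 22.7]},
    "ref1":   {"control": [18.2, 18.4, 18.3], "treated": [18.5, 18.6, 18.4]},
    "ref2":   {"control": [21.0, 21.2, 21.1], "treated": [21.3, 21.1, 21.2]},
    "ref3":   {"control": [16.9, 17.1, 17.0], "treated": [17.4, 17.2, 17.3]},
}

def normalization_factor(sample):
    """Mean Ct of the reference genes; on the log2 Ct scale this is a geometric mean."""
    return np.mean([np.mean(ct[g][sample]) for g in ("ref1", "ref2", "ref3")])

def fold_change():
    """Classic 2^-ddCt estimate with a crude uncertainty from target-gene replicate SDs."""
    d_ct = {s: np.mean(ct["target"][s]) - normalization_factor(s)
            for s in ("control", "treated")}
    dd_ct = d_ct["treated"] - d_ct["control"]
    sd = np.sqrt(np.var(ct["target"]["treated"], ddof=1) +
                 np.var(ct["target"]["control"], ddof=1))
    return 2 ** -dd_ct, (2 ** -(dd_ct + sd), 2 ** -(dd_ct - sd))

estimate, (low, high) = fold_change()
print(f"fold change ~ {estimate:.2f} (rough range {low:.2f} to {high:.2f})")
```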

  4. Characterization of microbial communities in pest colonized books by molecular biology tools

    OpenAIRE

    Franco Palla

    2011-01-01

    This work presents the identification of bacterial and fungal colonies in insect-infested books by culture-independent methodologies based on molecular biology techniques. Microbial genomic DNA extraction, in vitro amplification of specific target sequences by polymerase chain reactions (PCR), sequencing and sequence analysis were performed. These procedures minimized the amount of sample required, optimized the diagnostic studies on bacterial and fungal colonization and allowed the identification of man...

  5. Minimally inconsistent reasoning in Semantic Web.

    Science.gov (United States)

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for Semantic Web as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, which show that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed, as a framework for multi-valued DL, to allow for different underlying paraconsistent semantics, with the mere difference in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning.

  6. Waste Minimization and Pollution Prevention Awareness Plan

    International Nuclear Information System (INIS)

    1992-01-01

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) and other legal requirements that are discussed in Section C, below. The Pollution Prevention Awareness Program is included with the Waste Minimization Program as suggested by DOE Order 5400.1. The intent of this plan is to respond to and comply with the Department's policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Directorate-, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Directorates, Programs and Departments. Several Directorates have been reorganized, necessitating changes in the Directorate plans that were published in 1991

  7. Non-minimally coupled tachyon and inflation

    International Nuclear Information System (INIS)

    Piao Yunsong; Huang Qingguo; Zhang Xinmin; Zhang Yuanzhong

    2003-01-01

    In this Letter, we consider a model of a tachyon with a non-minimal coupling to gravity and study its cosmological effects. Regarding inflation, we show that only for a specific coupling of the tachyon to gravity does this model satisfy observations and solve various problems which exist in the single and multi tachyon inflation models. But noting that in string theory the coupling coefficient of the tachyon to gravity is of order g_s, which in general is very small, we can hardly expect that the non-minimal coupling of the tachyon to gravity could provide a reasonable tachyon inflation scenario. Our work may be a meaningful exploration of the cosmological effects of a tachyon non-minimally coupled to gravity

  8. One-dimensional Gromov minimal filling problem

    International Nuclear Information System (INIS)

    Ivanov, Alexandr O; Tuzhilin, Alexey A

    2012-01-01

    The paper is devoted to a new branch in the theory of one-dimensional variational problems with branching extremals, the investigation of one-dimensional minimal fillings introduced by the authors. On the one hand, this problem is a one-dimensional version of a generalization of Gromov's minimal fillings problem to the case of stratified manifolds. On the other hand, this problem is interesting in itself and also can be considered as a generalization of another classical problem, the Steiner problem on the construction of a shortest network connecting a given set of terminals. Besides the statement of the problem, we discuss several properties of the minimal fillings and state several conjectures. Bibliography: 38 titles.

  9. BIOLOGIC JOINT RECONSTRUCTION: ALTERNATIVES TO ARTHROPLASTY

    Directory of Open Access Journals (Sweden)

    Brian J. Cole

    2009-06-01

    Full Text Available A comprehensive source of information in the management of cartilage lesions of major joints using nonoperative or surgical techniques other than total joint replacement. The text also includes chapters in basic sciences, imaging and rehabilitation. The editors are aiming to provide a reference about the latest concepts and techniques in the treatment of cartilage lesions, including future aspects, by a comprehensive approach to the alternative joint restoration procedures such as biological, pharmacological and surgical techniques of cartilage repair and partial resurfacing, etc. Orthopedic surgeons in sports medicine, orthopedic surgeons performing joint replacements, and orthopedic residents and fellows will be the main audiences. The text is 349 pages, divided into 34 chapters in 7 sections. Section I is "Background-articular cartilage and allograft processing", including chapters about pathology, patient evaluation, imaging and allograft processing. Section II is "Nonoperative treatment", including chapters about nutraceuticals, pharmacological treatment and rehabilitation. Section III is "Operative treatment-knee", including chapters about arthroscopic debridement, microfracture, osteochondral autograft transplantation, mosaicplasty, osteochondral autograft transfer, osteochondral allografts, autologous chondrocyte implantation, existing cell-based technologies, minimally invasive second-generation autologous chondrocyte implantation, future development in cartilage repair, meniscus transplantation, management of OCD, patellofemoral chondral disease, proximal tibial and distal femoral osteotomies, unicompartmental arthritis current techniques, unicompartmental knee replacement. Section IV is "Operative treatment-Hip", including chapters about hip arthroscopy and arthroscopic partial resurfacing, related osteotomies. Section V is "Operative treatment-shoulder", including chapters about arthroscopic debridement and release, biologic resurfacing and

  10. Minimization of rad waste production in NPP Dukovany

    International Nuclear Information System (INIS)

    Kulovany, J.

    2001-01-01

    A whole range of measures connected with the minimization of radioactive waste has been taken at the power plant; these will lead to the set goals. Procedures that prevent any possible endangering of plant operation take precedence when the minimization measures are introduced. Furthermore, economically undemanding procedures that bring about minimization in an effective way are implemented. In accordance with the EMS principles, it can be expected that the minimizing measures will also be implemented in areas where their greatest contribution will be to the environment

  11. H.R. 5051: a bill to authorize funding for research on the potential atmospheric, climatic, biological, health, and environmental consequences of nuclear explosions and nuclear exchanges... Introduced in the House of Representatives, Ninety-Ninth Congress, Second Session, June 18, 1986

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    The Nuclear Winter Research Act of 1986 authorizes funding to study the potential consequences of nuclear explosions. The research will cover possible atmospheric, climatic, biological, health, or environmental changes to see if the nuclear winter theory is plausible. The bill authorizes $8.5 million over a five-year period for the Department of Defense study. It also establishes a Nuclear Winter Study Commission to determine and evaluate what implications these potential effects have for defense policy

  12. Creating biological nanomaterials using synthetic biology

    International Nuclear Information System (INIS)

    Rice, MaryJoe K; Ruder, Warren C

    2014-01-01

    Synthetic biology is a new discipline that combines science and engineering approaches to precisely control biological networks. These signaling networks are especially important in fields such as biomedicine and biochemical engineering. Additionally, biological networks can also be critical to the production of naturally occurring biological nanomaterials, and as a result, synthetic biology holds tremendous potential in creating new materials. This review introduces the field of synthetic biology, discusses how biological systems naturally produce materials, and then presents examples and strategies for incorporating synthetic biology approaches in the development of new materials. In particular, strategies for using synthetic biology to produce both organic and inorganic nanomaterials are discussed. Ultimately, synthetic biology holds the potential to dramatically impact biological materials science with significant potential applications in medical systems. (review)

  13. Creating biological nanomaterials using synthetic biology.

    Science.gov (United States)

    Rice, MaryJoe K; Ruder, Warren C

    2014-02-01

    Synthetic biology is a new discipline that combines science and engineering approaches to precisely control biological networks. These signaling networks are especially important in fields such as biomedicine and biochemical engineering. Additionally, biological networks can also be critical to the production of naturally occurring biological nanomaterials, and as a result, synthetic biology holds tremendous potential in creating new materials. This review introduces the field of synthetic biology, discusses how biological systems naturally produce materials, and then presents examples and strategies for incorporating synthetic biology approaches in the development of new materials. In particular, strategies for using synthetic biology to produce both organic and inorganic nanomaterials are discussed. Ultimately, synthetic biology holds the potential to dramatically impact biological materials science with significant potential applications in medical systems.

  14. The Effects of Minimal Length, Maximal Momentum, and Minimal Momentum in Entropic Force

    Directory of Open Access Journals (Sweden)

    Zhong-Wen Feng

    2016-01-01

    Full Text Available The modified entropic force law is studied by using a new kind of generalized uncertainty principle which contains a minimal length, a minimal momentum, and a maximal momentum. Firstly, the quantum corrections to the thermodynamics of a black hole are investigated. Then, according to Verlinde’s theory, the generalized uncertainty principle (GUP) corrected entropic force is obtained. The result shows that the GUP corrected entropic force is related not only to the properties of the black holes but also to the Planck length and the dimensionless constants α₀ and β₀. Moreover, based on the GUP corrected entropic force, we also derive the modified Einstein’s field equation (EFE) and the modified Friedmann equation.

  15. Minimal DBM Substraction

    DEFF Research Database (Denmark)

    David, Alexandre; Håkansson, John; G. Larsen, Kim

    In this paper we present an algorithm to compute DBM subtractions with a guaranteed minimal number of splits and disjoint DBMs to avoid any redundancy. The subtraction is one of the few operations that result in a non-convex zone and thus requires splitting. It is of prime importance to reduce...

  16. Abelian groups with a minimal generating set | Ruzicka ...

    African Journals Online (AJOL)

    We study the existence of minimal generating sets in Abelian groups. We prove that Abelian groups with minimal generating sets are not closed under quotients, nor under subgroups, nor under infinite products. We give necessary and sufficient conditions for existence of a minimal generating set providing that the Abelian ...

  17. Application of genetic algorithms for determination of the biological half-life of 137Cs in milk

    International Nuclear Information System (INIS)

    Pantelic, G.

    1998-01-01

    Genetic algorithm, an optimization method involving natural selection mechanisms, was used to determine the biological half-life of 137Cs in milk after the Chernobyl accident, based on a two-compartment linear system model. Genetic algorithms operate on populations of strings. Reproduction, crossover and mutation are applied to successive string populations to create new string populations. Model parameter estimation is performed by minimizing the squared differences between the fitting function and the experimental data. The calculated biological half-life of 137Cs in milk is (32 ± …) days (author)
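
    The fitting procedure described above can be sketched in a few lines. The code below fits a two-exponential (two-compartment) response by a simple genetic algorithm that minimizes the sum of squared differences; the activity data, parameter ranges, and GA settings are made-up illustrations, not the study's values.

```python
import random
import numpy as np

# Hypothetical weekly 137Cs activity in milk (Bq/L); not the study's data.
t = np.array([0, 7, 14, 21, 28, 42, 56, 84], dtype=float)
y = np.array([95, 62, 41, 28, 20, 11, 6.5, 2.4])

def model(params, t):
    # Two-exponential response of a two-compartment linear system.
    a1, lam1, a2, lam2 = params
    return a1 * np.exp(-lam1 * t) + a2 * np.exp(-lam2 * t)

def fitness(params):
    return -np.sum((model(params, t) - y) ** 2)   # maximize negative SSE

def random_individual():
    return [random.uniform(0, 100), random.uniform(0.01, 0.5),
            random.uniform(0, 100), random.uniform(0.001, 0.1)]

def evolve(pop_size=200, generations=300, mutation=0.2):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, 4)              # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < mutation:            # Gaussian mutation of one gene
                i = random.randrange(4)
                child[i] *= random.gauss(1.0, 0.1)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
slow_rate = min(best[1], best[3])                     # slower decay component
print("best parameters:", [round(p, 4) for p in best])
print("half-life of the slow component: %.1f days" % (np.log(2) / slow_rate))
```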

  18. Biological variation, reference change value (RCV) and minimal important difference (MID) of inspiratory muscle strength (PImax) in patients with stable chronic heart failure.

    Science.gov (United States)

    Täger, Tobias; Schell, Miriam; Cebola, Rita; Fröhlich, Hanna; Dösch, Andreas; Franke, Jennifer; Katus, Hugo A; Wians, Frank H; Frankenstein, Lutz

    2015-10-01

    Despite the widespread application of measurements of respiratory muscle force (PImax) in clinical trials there is no data on biological variation, reference change value (RCV), or the minimal important difference (MID) for PImax irrespective of the target cohort. We addressed this issue for patients with chronic stable heart failure. From the outpatients' clinic of the University of Heidelberg we retrospectively selected three groups of patients with stable systolic chronic heart failure (CHF). Each group had two measurements of PImax: 90 days apart in Group A (n = 25), 180 days apart in Group B (n = 93), and 365 days apart in Group C (n = 184). Stability was defined as (a) no change in NYHA class between visits and (b) absence of cardiac decompensation 3 months prior, during, and 3 months after measurements. For each group, we determined within-subject (CVI), between-subject (CVG), and total (CVT) coefficient of variation (CV), the index of individuality (II), RCV, reliability coefficient, and MID of PImax. CVT was 8.7, 7.5, and 6.9 % for groups A, B, and C, respectively. The II and RCV were 0.21, 0.20, 0.16 and 13.6, 11.6, 10.8 %, respectively. The reliability coefficient and MID were 0.83, 0.87, 0.88 and 1.44, 1.06, 1.12 kPa, respectively. Results were similar between age, gender, and aetiology subgroups. In patients with stable CHF, measurements of PImax are highly stable for intervals up to 1 year. The low values for II suggest that evaluation of change in PImax should be performed on an individual (per patient) basis. Individually significant change can be assumed beyond 14 % (RCV) or 1.12 kPa (MID).
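
    For readers who want to reproduce the derived quantities, the standard formulas for the reference change value and the index of individuality are simple to compute. The sketch below uses hypothetical coefficients of variation rather than the study's values (the paper derives its CVs from the repeated PImax measurements themselves):

```python
import math

# Hypothetical inputs; the published group values differ.
cv_analytical = 4.0   # analytical (within-run) CV, %
cv_within = 6.0       # within-subject biological CV (CVI), %
cv_between = 35.0     # between-subject biological CV (CVG), %
z = 1.96              # two-sided 95% significance

# Reference change value: smallest % difference between two serial results
# that exceeds the combined analytical and within-subject variation.
rcv = math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_within**2)

# Index of individuality: low values (below ~0.6) mean population reference
# intervals are insensitive and change should be judged per patient.
ii = math.sqrt(cv_analytical**2 + cv_within**2) / cv_between

print(f"RCV ~ {rcv:.1f} %, index of individuality ~ {ii:.2f}")
```

    A low index of individuality, as reported in the abstract, is exactly the situation in which per-patient evaluation of change (against the RCV or MID) is preferable to comparison with population reference intervals.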

  19. Transfer closed and transfer open multimaps in minimal spaces

    International Nuclear Information System (INIS)

    Alimohammady, M.; Roohi, M.; Delavar, M.R.

    2009-01-01

    This paper is devoted to introducing the concepts of transfer closed and transfer open multimaps in minimal spaces. Also, some characterizations of them are considered. Further, the notion of minimal local intersection property will be introduced and characterized. Moreover, some maximal element theorems via minimal transfer closed multimaps and minimal local intersection property are given.

  20. 6th European Conference of the International Federation for Medical and Biological Engineering

    CERN Document Server

    Vasic, Darko

    2015-01-01

    This volume presents the Proceedings of the 6th European Conference of the International Federation for Medical and Biological Engineering (MBEC2014), held in Dubrovnik September 7 – 11, 2014. The general theme of MBEC 2014 is "Towards new horizons in biomedical engineering" The scientific discussions in these conference proceedings include the following themes: - Biomedical Signal Processing - Biomedical Imaging and Image Processing - Biosensors and Bioinstrumentation - Bio-Micro/Nano Technologies - Biomaterials - Biomechanics, Robotics and Minimally Invasive Surgery - Cardiovascular, Respiratory and Endocrine Systems Engineering - Neural and Rehabilitation Engineering - Molecular, Cellular and Tissue Engineering - Bioinformatics and Computational Biology - Clinical Engineering and Health Technology Assessment - Health Informatics, E-Health and Telemedicine - Biomedical Engineering Education

  1. Laparoscopic colonic resection in inflammatory bowel disease: minimal surgery, minimal access and minimal hospital stay.

    LENUS (Irish Health Repository)

    Boyle, E

    2008-11-01

    Laparoscopic surgery for inflammatory bowel disease (IBD) is technically demanding but can offer improved short-term outcomes. The introduction of minimally invasive surgery (MIS) as the default operative approach for IBD, however, may have inherent learning curve-associated disadvantages. We hypothesise that the establishment of MIS as the standard operative approach does not increase patient morbidity as assessed in the initial period of its introduction into a specialised unit, and that it confers earlier postoperative gastrointestinal recovery and reduced hospitalisation compared with conventional open resection.

  2. Minimal covariant observables identifying all pure states

    Energy Technology Data Exchange (ETDEWEB)

    Carmeli, Claudio, E-mail: claudio.carmeli@gmail.com [D.I.M.E., Università di Genova, Via Cadorna 2, I-17100 Savona (Italy); I.N.F.N., Sezione di Genova, Via Dodecaneso 33, I-16146 Genova (Italy); Heinosaari, Teiko, E-mail: teiko.heinosaari@utu.fi [Turku Centre for Quantum Physics, Department of Physics and Astronomy, University of Turku (Finland); Toigo, Alessandro, E-mail: alessandro.toigo@polimi.it [Dipartimento di Matematica, Politecnico di Milano, Piazza Leonardo da Vinci 32, I-20133 Milano (Italy); I.N.F.N., Sezione di Milano, Via Celoria 16, I-20133 Milano (Italy)

    2013-09-02

    It has been recently shown by Heinosaari, Mazzarella and Wolf (2013) [1] that an observable that identifies all pure states of a d-dimensional quantum system has minimally 4d−4 outcomes or slightly less (the exact number depending on d). However, no simple construction of this type of minimal observable is known. We investigate covariant observables that identify all pure states and have minimal number of outcomes. It is shown that the existence of this kind of observables depends on the dimension of the Hilbert space.

  3. Graphical approach for multiple values logic minimization

    Science.gov (United States)

    Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.

    1999-03-01

    Multiple valued logic (MVL) is sought for designing high complexity, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system is dependent on optimization of cost, which directly affects the optical setup. We propose a minimization technique for MVL logic optimization based on graphical visualization, such as a Karnaugh map. The proposed method is utilized to solve signed-digit binary and trinary logic minimization problems. The usefulness of the minimization technique is demonstrated for the optical implementation of MVL circuits.

  4. Harm minimization among teenage drinkers

    DEFF Research Database (Denmark)

    Jørgensen, Morten Hulvej; Curtis, Tine; Christensen, Pia Haudrup

    2007-01-01

    AIM: To examine strategies of harm minimization employed by teenage drinkers. DESIGN, SETTING AND PARTICIPANTS: Two periods of ethnographic fieldwork were conducted in a rural Danish community of approximately 2000 inhabitants. The fieldwork included 50 days of participant observation among 13....... In regulating the social context of drinking they relied on their personal experiences more than on formalized knowledge about alcohol and harm, which they had learned from prevention campaigns and educational programmes. CONCLUSIONS: In this study we found that teenagers may help each other to minimize alcohol...

  5. Minimal entropy approximation for cellular automata

    International Nuclear Information System (INIS)

    Fukś, Henryk

    2014-01-01

    We present a method for the construction of approximate orbits of measures under the action of cellular automata which is complementary to the local structure theory. The local structure theory is based on the idea of Bayesian extension, that is, construction of a probability measure consistent with given block probabilities and maximizing entropy. If instead of maximizing entropy one minimizes it, one can develop another method for the construction of approximate orbits, at the heart of which is the iteration of finite-dimensional maps, called minimal entropy maps. We present numerical evidence that the minimal entropy approximation sometimes outperforms the local structure theory in characterizing the properties of cellular automata. The density response curve for elementary CA rule 26 is used to illustrate this claim. (paper)
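
    The density response curve mentioned for rule 26 can also be estimated directly by simulation. The following sketch is only an illustration of that empirical curve, not of the local structure or minimal entropy machinery itself; lattice width, step count, and seed are arbitrary choices.

```python
import numpy as np

RULE = 26  # elementary CA rule whose density response curve is discussed above

def step(cells, rule=RULE):
    """One synchronous update of an elementary CA with periodic boundaries."""
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    neighborhood = 4 * left + 2 * cells + right            # values 0..7
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    return table[neighborhood]

def density_response(p, width=10_000, steps=100, seed=0):
    """Density of ones after `steps` updates, starting from a Bernoulli(p) configuration."""
    rng = np.random.default_rng(seed)
    cells = (rng.random(width) < p).astype(np.uint8)
    for _ in range(steps):
        cells = step(cells)
    return cells.mean()

for p in np.linspace(0.1, 0.9, 9):
    print(f"initial density {p:.1f} -> final density {density_response(p):.3f}")
```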

  6. Minimal Invasive Urologic Surgery and Postoperative Ileus

    Directory of Open Access Journals (Sweden)

    Fouad Aoun

    2015-07-01

    Full Text Available Postoperative ileus (POI) is the most common cause of prolonged length of hospital stay (LOS) and associated healthcare costs. The advent of minimally invasive techniques was a major breakthrough in the urologic landscape, with great potential to progress in the future. In the field of gastrointestinal surgery, several studies have reported lower incidence rates of POI following minimally invasive surgery compared to conventional open procedures. In contrast, little is known about the effect of the minimally invasive approach on the recovery of bowel motility after urologic surgery. We performed an overview of the potential benefit of the minimally invasive approach on POI for urologic procedures. The mechanisms and risk factors responsible for the onset of POI are discussed with emphasis on the advantages of the minimally invasive approach. In the urologic field, POI is the main complication following radical cystectomy, but it is rarely of clinical significance for other minimally invasive interventions. Laparoscopic or robot-assisted laparoscopic techniques, when studied individually, may reduce the duration of POI or prevent its onset in a subset of procedures. The potential influence of age and urinary diversion type on postoperative ileus is contradictory in the literature. There is some evidence suggesting that BMI, blood loss, urinary extravasation, existence of a major complication, bowel resection, operative time and transperitoneal approach are independent risk factors for POI. Treatment of POI remains elusive. One of the most important and effective management strategies for patients undergoing radical cystectomy has been the development and use of enhanced recovery programs. An optimal rational strategy to shorten the duration of POI should incorporate the minimally invasive approach, when appropriate, into multimodal fast-track programs designed to reduce POI and shorten LOS.

  7. Minimalism in Art, Medical Science and Neurosurgery.

    Science.gov (United States)

    Okten, Ali Ihsan

    2018-01-01

    The word "minimalism" is a word derived from French the word "minimum". Whereas the lexical meaning of minimum is "the least or the smallest quantity necessary for something", its expression in mathematics can be described as "the lowest step a variable number can descend, least, minimal". Minimalism, which advocates an extreme simplicity of the artistic form, is a current in modern art and music whose origins go to 1960s and which features simplicity and objectivity. Although art, science and philosophy are different disciplines, they support each other from time to time, sometimes they intertwine and sometimes they copy each other. A periodic schools or teaching in one of them can take the others into itself, so, they proceed on their ways empowering each other. It is also true for the minimalism in art and the minimal invasive surgical approaches in science. Concepts like doing with less, avoiding unnecessary materials and reducing the number of the elements in order to increase the effect in the expression which are the main elements of the minimalism in art found their equivalents in medicine and neurosurgery. Their equivalents in medicine or neurosurgery have been to protect the physical integrity of the patient with less iatrogenic injury, minimum damage and the same therapeutic effect in the most effective way and to enable the patient to regain his health in the shortest span of time. As an anticipation, we can consider that the minimal approaches started by Richard Wollheim and Barbara Rose in art and Lars Leksell, Gazi Yaşargil and other neurosurgeons in neurosurgery in the 1960s are the present day equivalents of the minimalist approaches perhaps unconsciously started by Kazimir Malevich in art and Victor Darwin L"Espinasse in neurosurgery in the early 1900s. We can also consider that they have developed interacting with each other, not by chance.

  8. Minimally conscious state or cortically mediated state?

    Science.gov (United States)

    Naccache, Lionel

    2018-04-01

    Durable impairments of consciousness are currently classified in three main neurological categories: comatose state, vegetative state (also recently coined unresponsive wakefulness syndrome) and minimally conscious state. While the introduction of minimally conscious state, in 2002, was a major progress to help clinicians recognize complex non-reflexive behaviours in the absence of functional communication, it raises several problems. The most important issue related to minimally conscious state lies in its criteria: while behavioural definition of minimally conscious state lacks any direct evidence of patient's conscious content or conscious state, it includes the adjective 'conscious'. I discuss this major problem in this review and propose a novel interpretation of minimally conscious state: its criteria do not inform us about the potential residual consciousness of patients, but they do inform us with certainty about the presence of a cortically mediated state. Based on this constructive criticism review, I suggest three proposals aiming at improving the way we describe the subjective and cognitive state of non-communicating patients. In particular, I present a tentative new classification of impairments of consciousness that combines behavioural evidence with functional brain imaging data, in order to probe directly and univocally residual conscious processes.

  9. Minimally inconsistent reasoning in Semantic Web.

    Directory of Open Access Journals (Sweden)

    Xiaowang Zhang

    Full Text Available Reasoning with inconsistencies is an important issue for Semantic Web as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, which show that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed, as a framework for multi-valued DL, to allow for different underlying paraconsistent semantics, with the mere difference in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning.

  10. Embodied learning of a generative neural model for biological motion perception and inference.

    Science.gov (United States)

    Schrodt, Fabian; Layher, Georg; Neumann, Heiko; Butz, Martin V

    2015-01-01

    Although an action observation network and mirror neurons for understanding the actions and intentions of others have been under deep, interdisciplinary consideration over recent years, it remains largely unknown how the brain manages to map visually perceived biological motion of others onto its own motor system. This paper shows how such a mapping may be established, even if the biological motion is visually perceived from a new vantage point. We introduce a learning artificial neural network model and evaluate it on full body motion tracking recordings. The model implements an embodied, predictive inference approach. It first learns to correlate and segment multimodal sensory streams of own bodily motion. In doing so, it becomes able to anticipate motion progression, to complete missing modal information, and to self-generate learned motion sequences. When biological motion of another person is observed, this self-knowledge is utilized to recognize similar motion patterns and predict their progress. Due to the relative encodings, the model shows strong robustness in recognition despite observing rather large varieties of body morphology and posture dynamics. By additionally equipping the model with the capability to rotate its visual frame of reference, it is able to deduce the visual perspective onto the observed person, establishing full consistency to the embodied self-motion encodings by means of active inference. In further support of its neuro-cognitive plausibility, we also model typical bistable perceptions when crucial depth information is missing. In sum, the introduced neural model proposes a solution to the problem of how the human brain may establish correspondence between observed bodily motion and its own motor system, thus offering a mechanism that supports the development of mirror neurons.

  11. Embodied Learning of a Generative Neural Model for Biological Motion Perception and Inference

    Directory of Open Access Journals (Sweden)

    Fabian Schrodt

    2015-07-01

    Full Text Available Although an action observation network and mirror neurons for understanding the actions and intentions of others have been under deep, interdisciplinary consideration over recent years, it remains largely unknown how the brain manages to map visually perceived biological motion of others onto its own motor system. This paper shows how such a mapping may be established, even if the biological motion is visually perceived from a new vantage point. We introduce a learning artificial neural network model and evaluate it on full body motion tracking recordings. The model implements an embodied, predictive inference approach. It first learns to correlate and segment multimodal sensory streams of own bodily motion. In doing so, it becomes able to anticipate motion progression, to complete missing modal information, and to self-generate learned motion sequences. When biological motion of another person is observed, this self-knowledge is utilized to recognize similar motion patterns and predict their progress. Due to the relative encodings, the model shows strong robustness in recognition despite observing rather large varieties of body morphology and posture dynamics. By additionally equipping the model with the capability to rotate its visual frame of reference, it is able to deduce the visual perspective onto the observed person, establishing full consistency to the embodied self-motion encodings by means of active inference. In further support of its neuro-cognitive plausibility, we also model typical bistable perceptions when crucial depth information is missing. In sum, the introduced neural model proposes a solution to the problem of how the human brain may establish correspondence between observed bodily motion and its own motor system, thus offering a mechanism that supports the development of mirror neurons.

  12. Minimal Dark Matter in the sky

    International Nuclear Information System (INIS)

    Panci, P.

    2016-01-01

    We discuss some theoretical and phenomenological aspects of the Minimal Dark Matter (MDM) model proposed in 2006, which is a theoretical framework highly appreciated for its minimality and yet its predictivity. We first critically review the theoretical requirements of MDM pointing out generalizations of this framework. Then we review the phenomenology of the originally proposed fermionic hyperchargeless electroweak quintuplet showing its main γ-ray tests.

  13. Supraorbital Versus Endoscopic Endonasal Approaches for Olfactory Groove Meningiomas: A Cost-Minimization Study.

    Science.gov (United States)

    Gandhoke, Gurpreet S; Pease, Matthew; Smith, Kenneth J; Sekula, Raymond F

    2017-09-01

    To perform a cost-minimization study comparing the supraorbital and endoscopic endonasal (EEA) approach with or without craniotomy for the resection of olfactory groove meningiomas (OGMs). We built a decision tree using probabilities of gross total resection (GTR) and cerebrospinal fluid (CSF) leak rates with the supraorbital approach versus EEA with and without additional craniotomy. The cost (not charge or reimbursement) at each "stem" of this decision tree for both surgical options was obtained from our hospital's finance department. After a base case calculation, we applied plausible ranges to all parameters and carried out multiple 1-way sensitivity analyses. Probabilistic sensitivity analyses confirmed our results. The probabilities of GTR (0.8) and CSF leak (0.2) for the supraorbital craniotomy were obtained from our series of 5 patients who underwent a supraorbital approach for the resection of an OGM. The mean tumor volume was 54.6 cm³ (range, 17-94.2 cm³). Literature-reported rates of GTR (0.6) and CSF leak (0.3) with EEA were applied to our economic analysis. Supraorbital craniotomy was the preferred strategy, with an expected value of $29,423, compared with an EEA cost of $83,838. On multiple 1-way sensitivity analyses, supraorbital craniotomy remained the preferred strategy, with a minimum cost savings of $46,000 and a maximum savings of $64,000. Probabilistic sensitivity analysis found the lowest cost difference between the 2 surgical options to be $37,431. Compared with EEA, supraorbital craniotomy provides substantial cost savings in the treatment of OGMs. Given the potential differences in effectiveness between approaches, a cost-effectiveness analysis should be undertaken. Copyright © 2017 Elsevier Inc. All rights reserved.
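
    The expected-value calculation behind such a decision tree is straightforward to reproduce. In the sketch below, only the GTR and CSF-leak probabilities come from the abstract; the per-stem costs (base_cost, leak_cost, residual_cost) are placeholder figures, not the hospital's data, so the printed totals will not match the published values.

```python
# Branch probabilities reported in the abstract.
p_gtr = {"supraorbital": 0.8, "eea": 0.6}      # probability of gross total resection
p_leak = {"supraorbital": 0.2, "eea": 0.3}     # probability of CSF leak

# Placeholder per-stem costs (hypothetical; the actual hospital costs are not given here).
base_cost = {"supraorbital": 25_000, "eea": 60_000}
leak_cost = 20_000       # added cost of managing a CSF leak
residual_cost = 15_000   # added cost of follow-up or re-treatment after subtotal resection

def expected_cost(approach):
    """Expected value of one branch of the cost-minimization decision tree."""
    cost = base_cost[approach]
    cost += p_leak[approach] * leak_cost
    cost += (1 - p_gtr[approach]) * residual_cost
    return cost

for approach in ("supraorbital", "eea"):
    print(f"{approach:12s} expected cost: ${expected_cost(approach):,.0f}")
```

    Sensitivity analysis then amounts to sweeping each probability or cost over its plausible range and checking whether the preferred branch changes.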

  14. Staphylococcus aureus Survives with a Minimal Peptidoglycan Synthesis Machine but Sacrifices Virulence and Antibiotic Resistance.

    Directory of Open Access Journals (Sweden)

    Patricia Reed

    2015-05-01

    Full Text Available Many important cellular processes are performed by molecular machines, composed of multiple proteins that physically interact to execute biological functions. An example is the bacterial peptidoglycan (PG) synthesis machine, responsible for the synthesis of the main component of the cell wall and the target of many contemporary antibiotics. One approach for the identification of essential components of a cellular machine involves the determination of its minimal protein composition. Staphylococcus aureus is a Gram-positive pathogen, renowned for its resistance to many commonly used antibiotics and prevalence in hospitals. Its genome encodes a low number of proteins with PG synthesis activity (9 proteins) when compared to other model organisms, and is therefore a good model for the study of a minimal PG synthesis machine. We deleted seven of the nine genes encoding PG synthesis enzymes from the S. aureus genome without affecting normal growth or cell morphology, generating a strain capable of PG biosynthesis catalyzed only by two penicillin-binding proteins, PBP1 and the bi-functional PBP2. However, multiple PBPs are important in clinically relevant environments, as bacteria with a minimal PG synthesis machinery became highly susceptible to cell wall-targeting antibiotics and host lytic enzymes, and displayed impaired virulence in a Drosophila infection model which is dependent on the presence of specific peptidoglycan receptor proteins, namely PGRP-SA. The fact that S. aureus can grow and divide with only two active PG synthesizing enzymes shows that most of these enzymes are redundant in vitro and identifies the minimal PG synthesis machinery of S. aureus. However, a complex molecular machine is important in environments other than in vitro growth, as the expendable PG synthesis enzymes play an important role in the pathogenicity and antibiotic resistance of S. aureus.

  15. Minimal model holography

    International Nuclear Information System (INIS)

    Gaberdiel, Matthias R; Gopakumar, Rajesh

    2013-01-01

    We review the duality relating 2D W_N minimal model conformal field theories, in a large-N ’t Hooft-like limit, to higher spin gravitational theories on AdS_3. This article is part of a special issue of Journal of Physics A: Mathematical and Theoretical devoted to ‘Higher spin theories and holography’. (review)

  16. Supersymmetric Hybrid Inflation with Non-Minimal Kähler potential

    CERN Document Server

    Bastero-Gil, M; Shafi, Q

    2007-01-01

    Minimal supersymmetric hybrid inflation based on a minimal Kähler potential predicts a spectral index n_s ≳ 0.98. On the other hand, WMAP three year data prefers a central value n_s ≈ 0.95. We propose a class of supersymmetric hybrid inflation models based on the same minimal superpotential but with a non-minimal Kähler potential. Including radiative corrections using the one-loop effective potential, we show that the prediction for the spectral index is sensitive to the small non-minimal corrections, and can lead to a significantly red-tilted spectrum, in agreement with WMAP.

  17. Self-focused and other-focused resiliency: Plausible mechanisms linking early family adversity to health problems in college women.

    Science.gov (United States)

    Coleman, Sulamunn R M; Zawadzki, Matthew J; Heron, Kristin E; Vartanian, Lenny R; Smyth, Joshua M

    2016-01-01

    This study examined whether self-focused and other-focused resiliency help explain how early family adversity relates to perceived stress, subjective health, and health behaviors in college women. Female students (N = 795) participated between October 2009 and May 2010. Participants completed self-report measures of early family adversity, self-focused (self-esteem, personal growth initiative) and other-focused (perceived social support, gratitude) resiliency, stress, subjective health, and health behaviors. Using structural equation modeling, self-focused resiliency associated with less stress, better subjective health, more sleep, less smoking, and less weekend alcohol consumption. Other-focused resiliency associated with more exercise, greater stress, and more weekend alcohol consumption. Early family adversity was indirectly related to all health outcomes, except smoking, via self-focused and other-focused resiliency. Self-focused and other-focused resiliency represent plausible mechanisms through which early family adversity relates to stress and health in college women. This highlights areas for future research in disease prevention and management.

  18. Low doses of ionizing radiation: Biological effects and regulatory control. Invited papers and discussions. Proceedings of an international conference

    International Nuclear Information System (INIS)

    1998-01-01

    The levels and biological effects resulting from exposure to ionizing radiation are continuously reviewed by the United Nations Committee on the Effects of Atomic Radiation (UNSCEAR). Since its creation in 1928, the International Commission on Radiological Protection (ICRP) has issued recommendations on protection against ionizing radiation. The UNSCEAR estimates and the ICRP recommendations have served as the basis for national and international safety standards on radiation safety, including those developed by the International Atomic Energy Agency (IAEA) and the World Health Organization (WHO). Concerning health effects of low doses of ionizing radiation, the international standards are based on the plausible assumption that, above the unavoidable background radiation dose, the probability of effects increases linearly with dose, i.e. on a 'linear, no threshold' (LNT) assumption. However, in recent years the biological estimates of health effects of low doses of ionizing radiation and the regulatory approach to the control of low level radiation exposure have been much debated. To foster information exchange on the relevant issues, an International Conference on Low Doses of Ionizing Radiation: Biological Effects and Regulatory Control, jointly sponsored by the IAEA and WHO in co-operation with UNSCEAR, was held from 17-21 November 1997 at Seville, Spain. These Proceedings contain the invited special reports, keynote papers, summaries of discussions, session summaries and addresses presented at the opening and closing of the Conference

  19. Minimizing electrode contamination in an electrochemical cell

    Science.gov (United States)

    Kim, Yu Seung; Zelenay, Piotr; Johnston, Christina

    2014-12-09

    An electrochemical cell assembly that is expected to prevent or at least minimize electrode contamination includes one or more getters that trap a component or components leached from a first electrode and prevents or at least minimizes them from contaminating a second electrode.

  20. Oxidative decontamination of chemical and biological warfare agents using L-Gel.

    Science.gov (United States)

    Raber, Ellen; McGuire, Raymond

    2002-08-05

    A decontamination method has been developed using a single reagent that is effective both against chemical warfare (CW) and biological warfare (BW) agents. The new reagent, "L-Gel", consists of an aqueous solution of a mild commercial oxidizer, Oxone, together with a commercial fumed silica gelling agent, Cab-O-Sil EH-5. L-Gel is non-toxic, environmentally friendly, relatively non-corrosive, maximizes contact time because of its thixotropic nature, clings to walls and ceilings, and does not harm carpets or painted surfaces. The new reagent also addresses the most demanding requirements for decontamination in the civilian sector, including availability, low maintenance, ease of application and deployment by a variety of dispersal mechanisms, minimal training and acceptable expense. Experiments to test the effectiveness of L-Gel were conducted at Lawrence Livermore National Laboratory and independently at four other locations. L-Gel was tested against all classes of chemical warfare agents and against various biological warfare agent surrogates, including spore-forming bacteria and non-virulent strains of real biological agents. Testing showed that L-Gel is as effective against chemical agents and biological materials, including spores, as the best military decontaminants.

  1. Minimal Flavor Constraints for Technicolor

    DEFF Research Database (Denmark)

    Sakuma, Hidenori; Sannino, Francesco

    2010-01-01

    We analyze the constraints on the vacuum polarization of the standard model gauge bosons from a minimal set of flavor observables valid for a general class of models of dynamical electroweak symmetry breaking. We will show that the constraints have a strong impact on the self-coupling and mas...

  2. Microbial Degradation of Forensic Samples of Biological Origin: Potential Threat to Human DNA Typing.

    Science.gov (United States)

    Dash, Hirak Ranjan; Das, Surajit

    2018-02-01

    Forensic biology is a sub-discipline of biological science with an amalgam of other branches of science used in the criminal justice system. Any nucleated cell or tissue harbouring DNA, either live or dead, can be used as a forensic exhibit and a source of investigation through DNA typing. These biological materials of human origin are rich sources of proteins, carbohydrates, lipids, trace elements as well as water and thus provide a favourable milieu for the growth of microbes. This persistent microbial growth augments the degradation process and is amplified with the passage of time and improper storage of the biological materials. Degradation of these biological materials poses a huge challenge in the downstream processes of forensic DNA typing techniques, such as short tandem repeat (STR) DNA typing. Microbial degradation yields improper or no PCR amplification, heterozygous peak imbalance, DNA contamination from non-human sources, degradation of DNA by microbial by-products, etc. Consequently, the most precise STR DNA typing technique is nullified and a definite opinion can hardly be given with degraded forensic exhibits. Thus, suitable precautionary measures should be taken for proper storage and processing of the biological exhibits to minimize their decay by micro-organisms.

  3. A Prospective Randomized Study on Operative Treatment for Simple Distal Tibial Fractures-Minimally Invasive Plate Osteosynthesis Versus Minimal Open Reduction and Internal Fixation.

    Science.gov (United States)

    Kim, Ji Wan; Kim, Hyun Uk; Oh, Chang-Wug; Kim, Joon-Woo; Park, Ki Chul

    2018-01-01

    To compare the radiologic and clinical results of minimally invasive plate osteosynthesis (MIPO) and minimal open reduction and internal fixation (ORIF) for simple distal tibial fractures. Randomized prospective study. Three level 1 trauma centers. Fifty-eight patients with simple distal tibial fractures were randomized into a MIPO group (treatment with MIPO; n = 29) or a minimal group (treatment with minimal ORIF; n = 29). These numbers were designed to define the rate of soft tissue complication; therefore, validation of superiority in union time or determination of differences in rates of delayed union was limited in this study. Simple distal tibial fractures treated with MIPO or minimal ORIF. The clinical outcome measurements included operative time, radiation exposure time, and soft tissue complications. To evaluate a patient's function, the American Orthopedic Foot and Ankle Society ankle score (AOFAS) was used. Radiologic measurements included fracture alignment, delayed union, and union time. All patients achieved bone union without any secondary intervention. The mean union time was 17.4 weeks and 16.3 weeks in the MIPO and minimal groups, respectively. There was 1 case of delayed union and 1 case of superficial infection in each group. The radiation exposure time was shorter in the minimal group than in the MIPO group. Coronal angulation differed between the two groups. The American Orthopedic Foot and Ankle Society ankle scores were 86.0 and 86.7 in the MIPO and minimal groups, respectively. Minimal ORIF resulted in similar outcomes, with no increased rate of soft tissue problems compared to MIPO. Both MIPO and minimal ORIF have high union rates and good functional outcomes for simple distal tibial fractures. Minimal ORIF did not result in increased rates of infection and wound dehiscence. Therapeutic Level II. See Instructions for Authors for a complete description of levels of evidence.

  4. Minimally Invasive Surgery (MIS) Approaches to Thoracolumbar Trauma.

    Science.gov (United States)

    Kaye, Ian David; Passias, Peter

    2018-03-01

    Minimally invasive surgical (MIS) techniques offer promising improvements in the management of thoracolumbar trauma. Recent advances in MIS techniques and instrumentation for degenerative conditions have heralded a growing interest in employing these techniques for thoracolumbar trauma. Specifically, surgeons have applied these techniques to help manage flexion- and extension-distraction injuries, neurologically intact burst fractures, and cases of damage control. Minimally invasive surgical techniques offer a means to decrease blood loss, shorten operative time, reduce infection risk, and shorten hospital stays. Herein, we review thoracolumbar minimally invasive surgery with an emphasis on thoracolumbar trauma classification, minimally invasive spinal stabilization, surgical indications, patient outcomes, technical considerations, and potential complications.

  5. Analysis of working conditions focusing on biological risk: firefighters in Campo Grande, MS, Brazil.

    Science.gov (United States)

    Contrera-Moreno, Luciana; de Andrade, Sonia Maria Oliveira; Motta-Castro, Ana Rita Coimbra; Pinto, Alexandra Maria Almeida Carvalho; Salas, Frederico Reis Pouso; Stief, Alcione Cavalheiros Faro

    2012-01-01

    Firefighters are exposed to a wide range of risks, among them biological risk. The objective was to analyze the working conditions of firefighters in the city of Campo Grande, MS, Brazil, focusing on conditions of exposure to biological material. Three hundred and seven (307) firefighters were interviewed for data collection and observed for ergonomic job analysis (AET). Of these firefighters, 63.5% had suffered some kind of job-related accident involving blood or body fluids. A statistically significant association was found between having suffered accidents at work and incomplete use of personal protective equipment (PPE). Regarding the AET of biological risks, 57.1% of all patients attended presented blood or secretions, which corresponds on average to 16.0% of the total work time, based on a 24-hour working day. Besides biological risks, other stress factors were identified: emergencies and complex decision-making, high responsibility for patients and the environment, and conflicts. Health promotion and accident prevention actions must be emphasized as measures to minimize these risks.

  6. Specialized minimal PDFs for optimized LHC calculations

    NARCIS (Netherlands)

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-01-01

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct

  7. The minimal manual: is less really more?

    NARCIS (Netherlands)

    Lazonder, Adrianus W.; van der Meij, Hans

    1993-01-01

    Carroll, Smith-Kerker, Ford and Mazur-Rimetz (The minimal manual, Human-Computer Interaction , 3, 123-153, 1987) have introduced the minimal manual as an alternative to standard self-instruction manuals. While their research indicates strong gains, only a few attempts have been made to validate

  8. First biological measurements of deep-sea corals from the Red Sea

    OpenAIRE

    C. Roder; M. L. Berumen; J. Bouwmeester; E. Papathanassiou; A. Al-Suwailem; C. R. Voolstra

    2013-01-01

    It is usually assumed that metabolic constraints restrict deep-sea corals to cold-water habitats, with "deep-sea" and "cold-water" corals often used as synonyms. Here we report on the first measurements of biological characteristics of deep-sea corals from the central Red Sea, where they occur at temperatures exceeding 20°C in highly oligotrophic and oxygen-limited waters. Low respiration rates, low calcification rates, and minimized tissue cover indicate that a reduced metabolism is one of the ...

  9. Biological Control of Bacterial Wilt in South East Asia

    Directory of Open Access Journals (Sweden)

    Triwidodo Arwiyanto

    2014-12-01

    Full Text Available Bacterial wilt disease caused by Ralstonia solanacearum destroys crops of many different plant families in South East Asia, despite extensive research on the disease and the availability of control methods developed in other parts of the world. No chemical is available against the bacterial wilt pathogen, so biological control has been chosen as an alternative to save the crops. Most biological control studies have been based on antagonism between the biological control agent and the pathogen, with the agents intended to reduce the initial inoculum of the pathogen. Efforts to minimize the initial inoculum by baiting with a hypersensitive host plant have proved reliable only in greenhouse experiments. Various microorganisms have been investigated as possible biological control agents, for instance avirulent forms of the pathogen, soil or rhizosphere bacteria (Bacillus spp. and fluorescent pseudomonads), actinomycetes (Streptomyces spp.), yeasts (Pichia guilliermondii, Candida ethanolica), and a consortium of microorganisms known as effective microorganisms (EM). None of these biological control agents has been used in field application, and they need further investigation in order to effectively control bacterial wilt. Opportunities and challenges in developing biological control to combat bacterial wilt are discussed in the paper. Bacterial wilt disease caused by Ralstonia solanacearum destroys many plants of different families in South East Asia despite much research on its control methods. The disease is difficult to control because of the high variability of the pathogen and the lack of established sources of resistance. In addition, no chemical is currently available against this bacterial wilt pathogen, so biological control has been chosen as an alternative way to save the crops. Most biological control research has been based

  10. Evaluation of smoothing in an iterative lp-norm minimization algorithm for surface-based source localization of MEG

    Science.gov (United States)

    Han, Jooman; Sic Kim, June; Chung, Chun Kee; Park, Kwang Suk

    2007-08-01

    The imaging of neural sources of magnetoencephalographic data based on distributed source models requires additional constraints on the source distribution in order to overcome ill-posedness and obtain a plausible solution. The minimum lp norm (0 temporal gyrus.

  11. Minimization of the blank values in the neutron activation analysis of biological samples considering the whole procedure

    International Nuclear Information System (INIS)

    Lux, F.; Bereznai, T.; Haeberlin, T.H.

    1986-01-01

    In our determination of trace element contents of animal tissue by neutron activation analysis, in the course of structure-activity relationship studies on platinum-containing cancer drugs and wound healing, we have tried to minimize the blank values that are caused by different sources of contamination during surgery, sampling and the activation analysis procedure. The following topics have been investigated: the abrasions from scalpels made of stainless steel, titanium or quartz; the type of surgery; the homogenisation of the samples before irradiation by use of a ball mill; the surface contaminations of the quartz ampoules that pass into the digestion solution of the irradiated samples. The appropriate measures taken in order to reduce the blank values are described. The results of analyses performed under these conditions indicate the effectiveness of the given measures, especially shown by the low values obtained for the chromium contents of the analysed muscle samples. (author)

  12. Minimally processed fruit salad enriched with Lactobacillus ...

    African Journals Online (AJOL)

    paula

    2015-06-17

    Minimal processing promotes browning of some vegetal tissues due to cell membrane disruption, which results in the release of oxidative enzymes. This study evaluated the efficiency of citric acid, ascorbic acid, sodium metabisulfite and L-cysteine hydrochloride to retard enzymatic browning of minimally ...

  13. Minimally processed fruit salad enriched with Lactobacillus ...

    African Journals Online (AJOL)

    Minimal processing promotes browning of some vegetal tissues due to cell membrane disruption, which results in the release of oxidative enzymes. This study evaluated the efficiency of citric acid, ascorbic acid, sodium metabisulfite and L-cysteine hydrochloride to retard enzymatic browning of minimally processed fruit ...

  14. Touch-down reverse transcriptase-PCR detection of IgV(H) rearrangement and Sybr-Green-based real-time RT-PCR quantitation of minimal residual disease in patients with chronic lymphocytic leukemia

    Czech Academy of Sciences Publication Activity Database

    Peková, Soňa; Marková, J.; Pajer, Petr; Dvořák, Michal; Cetkovský, P.; Schwarz, J.

    2005-01-01

    Roč. 9, č. 1 (2005), s. 23-34 ISSN 1084-8592 Institutional research plan: CEZ:AV0Z50520514 Keywords : minimal residual disease * chronic lymphocytic leukaemia * IgV (H) rearrangement Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 2.562, year: 2003

  15. Responsiveness and minimal clinically important change

    DEFF Research Database (Denmark)

    Christiansen, David Høyrup; Frost, Poul; Falla, Deborah

    2015-01-01

    Study Design A prospective cohort study nested in a randomized controlled trial. Objectives To determine and compare responsiveness and minimal clinically important change of the modified Constant score (CS) and the Oxford Shoulder Score (OSS). Background The OSS and the CS are commonly used to assess shoulder outcomes. However, few studies have evaluated the measurement properties of the OSS and CS in terms of responsiveness and minimal clinically important change. Methods The study included 126 patients who reported having difficulty returning to usual activities 8 to 12 weeks after ... were observed for the CS and the OSS. Minimal clinically important change ROC values were 6 points for the OSS and 11 points for the CS, with upper 95% cutoff limits of 12 and 22 points, respectively. Conclusion The CS and the OSS were both suitable for assessing improvement after decompression surgery.

  16. Approximate error conjugation gradient minimization methods

    Science.gov (United States)

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
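    As a rough illustration of the approach described in this record (evaluating the error on a sampled subset of rays inside a constrained conjugate-gradient loop), the following Python sketch minimizes a least-squares objective over synthetic rays. The subset fraction, Fletcher-Reeves update and exact one-dimensional line search are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def cg_with_approximate_error(A, b, x0, n_iter=50, subset_frac=0.2, seed=0):
    """Conjugate-gradient-style minimization of ||Ax - b||^2 where the
    residual (error) is evaluated on a random subset of the rays (rows of A).
    Illustrative sketch only; subset_frac and the line search are assumptions."""
    rng = np.random.default_rng(seed)
    m = A.shape[0]
    k = max(1, int(subset_frac * m))
    x = x0.copy()
    d, g_prev = None, None
    for _ in range(n_iter):
        rows = rng.choice(m, size=k, replace=False)      # subset of rays
        As, bs = A[rows], b[rows]
        r = As @ x - bs                                   # approximate error
        g = As.T @ r                                      # approximate gradient
        if d is None:
            d = -g
        else:
            beta = (g @ g) / (g_prev @ g_prev)            # Fletcher-Reeves update
            d = -g + beta * d
        Ad = As @ d
        denom = Ad @ Ad
        alpha = -(r @ Ad) / denom if denom > 0 else 0.0   # exact 1-D minimum along d
        x = x + alpha * d
        g_prev = g
    return x

# Tiny usage example with synthetic "rays"
A = np.random.randn(200, 30)
x_true = np.random.randn(30)
b = A @ x_true
x_hat = cg_with_approximate_error(A, b, np.zeros(30))
print(np.linalg.norm(A @ x_hat - b))
```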

  17. New trends in minimally invasive urological surgery

    Directory of Open Access Journals (Sweden)

    Prabhakar Rajan

    2009-10-01

    Full Text Available Purpose: The perceived benefits of minimally-invasive surgery include less postoperative pain, shorter hospitalization, reduced morbidity and better cosmesis while maintaining diagnostic accuracy and therapeutic outcome. We review the new trends in minimally-invasive urological surgery. Materials and method: We reviewed the English language literature using the National Library of Medicine database to identify the latest technological advances in minimally-invasive surgery with particular reference to urology. Results: Amongst other advances, studies incorporating needlescopic surgery, laparoendoscopic single-site surgery, magnetic anchoring and guidance systems, natural orifice transluminal endoscopic surgery and flexible robots were considered of interest. The results from initial animal and human studies are also outlined. Conclusion: Minimally-invasive surgery continues to evolve to meet the demands of the operators and patients. Many novel technologies are still in the testing phase, whilst others have entered clinical practice. Further evaluation is required to confirm the safety and efficacy of these techniques and validate the published reports.

  18. Towards an optimal experimental design for N2O model calibration during biological nitrogen removal

    DEFF Research Database (Denmark)

    Domingo Felez, Carlos; Valverde Pérez, Borja; Plósz, Benedek G.

    Process models describing nitrous oxide (N2O) production during biological nitrogen removal allow for the development of mitigation strategies for this potent greenhouse gas. N2O is an intermediate of nitrogen removal; hence, its prediction is negatively affected by the uncertainty associated with it ... of strategies to minimize the carbon footprint of wastewater treatment plants.

  19. Tactical Mobility of the Medium Weight Force in Urban Terrain

    National Research Council Canada - National Science Library

    Johnson, Scott

    2001-01-01

    The potential for urban combat grows more plausible and probable as the world's population migrates toward cities and adversaries attempt to minimize the technological advantage of the U.S. military...

  20. Gravitino problem in minimal supergravity inflation

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, Fuminori [Institute for Cosmic Ray Research, The University of Tokyo, Kashiwa, Chiba 277-8582 (Japan); Mukaida, Kyohei [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba 277-8583 (Japan); Nakayama, Kazunori [Department of Physics, Faculty of Science, The University of Tokyo, Bunkyo-ku, Tokyo 133-0033 (Japan); Terada, Takahiro, E-mail: terada@kias.re.kr [School of Physics, Korea Institute for Advanced Study (KIAS), Seoul 02455 (Korea, Republic of); Yamada, Yusuke [Stanford Institute for Theoretical Physics and Department of Physics, Stanford University, Stanford, CA 94305 (United States)

    2017-04-10

    We study non-thermal gravitino production in the minimal supergravity inflation. In this minimal model utilizing orthogonal nilpotent superfields, the particle spectrum includes only graviton, gravitino, inflaton, and goldstino. We find that a substantial fraction of the cosmic energy density can be transferred to the longitudinal gravitino due to non-trivial change of its sound speed. This implies either a breakdown of the effective theory after inflation or a serious gravitino problem.

  1. Gravitino problem in minimal supergravity inflation

    Directory of Open Access Journals (Sweden)

    Fuminori Hasegawa

    2017-04-01

    Full Text Available We study non-thermal gravitino production in the minimal supergravity inflation. In this minimal model utilizing orthogonal nilpotent superfields, the particle spectrum includes only graviton, gravitino, inflaton, and goldstino. We find that a substantial fraction of the cosmic energy density can be transferred to the longitudinal gravitino due to non-trivial change of its sound speed. This implies either a breakdown of the effective theory after inflation or a serious gravitino problem.

  2. Minimal Function Graphs are not Instrumented

    DEFF Research Database (Denmark)

    Mycroft, Alan; Rosendahl, Mads

    1992-01-01

    The minimal function graph semantics of Jones and Mycroft is a standard denotational semantics modified to include only `reachable' parts of a program. We show that it may be expressed directly in terms of the standard semantics without the need for instrumentation at the expression level and, in doing so, bring out a connection with strictness. This also makes it possible to prove a stronger theorem of correctness for the minimal function graph semantics.
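    A toy illustration of the "reachable calls only" idea behind minimal function graphs: the decorator below records just the (argument, result) pairs actually demanded by an initial call. This is an operational caricature under assumed semantics, not the denotational construction of Jones and Mycroft.

```python
from functools import wraps

def minimal_function_graph(f):
    """Decorator that records, for the decorated function, only the
    (argument, result) pairs actually reached from an initial call --
    a rough operational analogue of a minimal function graph."""
    graph = {}

    @wraps(f)
    def wrapper(*args):
        if args not in graph:
            graph[args] = f(*args)
        return graph[args]

    wrapper.graph = graph
    return wrapper

@minimal_function_graph
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(6)
# Only calls reachable from fib(6) appear; e.g. fib(100) is absent.
print(sorted(fib.graph.items()))
```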

  3. Search for Minimal Standard Model and Minimal Supersymmetric Model Higgs Bosons in e+ e- Collisions with the OPAL detector at LEP

    International Nuclear Information System (INIS)

    Ganel, Ofer

    1993-06-01

    When the LEP machine was turned on in August 1989, a new era opened. For the first time, direct, model-independent searches for the Higgs boson could be carried out. The Minimal Standard Model Higgs boson is expected to be produced in e+e- collisions via the H0Z0 process. The Minimal Supersymmetric Model Higgs bosons are expected to be produced in the analogous e+e- -> h0Z0 process or in pairs via the process e+e- -> h0A0. In this thesis we describe the search for Higgs bosons within the framework of the Minimal Standard Model and the Minimal Supersymmetric Model, using the data accumulated by the OPAL detector at LEP in the 1989, 1990, 1991 and part of the 1992 running periods, at and around the Z0 pole. A Minimal Supersymmetric Model Higgs boson generator is described, as well as its use in several different searches. As a result of this work, the Minimal Standard Model Higgs boson mass is bounded from below by 54.2 GeV/c^2 at 95% C.L. This is, at present, the highest such bound. A novel method of overcoming the m_τ and m_s dependence of Minimal Supersymmetric Higgs boson production and decay introduced by one-loop radiative corrections is used to obtain model-independent exclusions. The thesis also describes an algorithm for offline identification of calorimeter noise in the OPAL detector. (author)

  4. Inverse Problems in Systems Biology: A Critical Review.

    Science.gov (United States)

    Guzzi, Rodolfo; Colombo, Teresa; Paci, Paola

    2018-01-01

    Systems biology can be seen as a symbiotic, cyclic interplay between forward and inverse problems. Computational models need to be continuously refined through experiments, and in turn they help us make limited experimental resources more efficient. Every time one does an experiment, we know that there will be some noise that can disrupt our measurements. Although noise is certainly a problem, inverse problems already involve the inference of missing information even when the data are entirely reliable, so the addition of a limited amount of noise does not fundamentally change the situation; it can be used in addressing the so-called ill-posed problem, as defined by Hadamard, and can be seen as an extra source of information. Recent studies have shown that complex systems, among them those of systems biology, are poorly constrained and ill-conditioned, because it is difficult to use experimental data to fully estimate their parameters. For these reasons the concept of sloppy models was born: a sequence of models of increasing complexity that become sloppy in the limit of microscopic accuracy. The concept of sloppy models also encompasses non-identifiability, because such models are characterized by many parameters that are poorly constrained by experimental data. A strategy therefore needs to be designed to infer, analyze, and understand biological systems. The aim of this work is to provide a critical review of inverse problems in systems biology, defining a strategy to determine the minimal set of information needed to overcome the problems arising from dynamic biological models that generally have many unknown, non-measurable parameters.
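    As a hedged illustration of the sloppiness and non-identifiability discussed above, the sketch below does not fit data at all; it simply computes the sensitivity (Jacobian) of a toy two-exponential model and inspects the eigenvalue spread of the approximate Fisher information JᵀJ. The model, time grid and parameter values are invented for illustration only.

```python
import numpy as np

# Toy "sloppy" model: y(t) = a*exp(-k1*t) + b*exp(-k2*t)
def model(theta, t):
    a, k1, b, k2 = theta
    return a * np.exp(-k1 * t) + b * np.exp(-k2 * t)

def jacobian(theta, t, eps=1e-6):
    """Finite-difference sensitivity matrix d y / d theta."""
    J = np.zeros((len(t), len(theta)))
    for j in range(len(theta)):
        dp = np.array(theta, dtype=float)
        dp[j] += eps
        J[:, j] = (model(dp, t) - model(theta, t)) / eps
    return J

t = np.linspace(0.0, 5.0, 40)
theta = np.array([1.0, 0.9, 1.0, 1.1])    # nearly degenerate decay rates
J = jacobian(theta, t)
eigvals = np.linalg.eigvalsh(J.T @ J)     # approximate Fisher information spectrum
print(eigvals / eigvals.max())
# A spread over many orders of magnitude indicates sloppy, poorly
# identifiable parameter combinations.
```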

  5. Making the Most of Minimalism in Music.

    Science.gov (United States)

    Geiersbach, Frederick J.

    1998-01-01

    Describes the minimalist movement in music. Discusses generations of minimalist musicians and, in general, the minimalist approach. Considers various ways that minimalist strategies can be integrated into the music classroom focusing on (1) minimalism and (2) student-centered composition and principles of minimalism for use with elementary band…

  6. Development of biological and nonbiological explanations for the Viking label release data. [hydrogen peroxide theory

    Science.gov (United States)

    1980-01-01

    The plausibility that hydrogen peroxide, widely distributed within the Mars surface material, was responsible for the evocative response obtained by the Viking Labeled Release (LR) experiment on Mars was investigated. Although a mixture of gamma Fe2O3 and silica sand stimulated the LR nutrient reaction with hydrogen peroxide and reduced the rate of hydrogen peroxide decomposition under various storage conditions, the Mars analog soil prepared by the Viking Inorganic Analysis Team to match the Mars analytical data does not cause such effects, nor does it show adequate resistance to UV irradiation. On the basis of the results and considerations presented, the hydrogen peroxide theory remains the most, if not the only, attractive chemical explanation of the LR data, but it is unconvincing on critical points. Until problems concerning the formation and stabilization of hydrogen peroxide on the surface of Mars can be overcome, adherence to the scientific evidence requires serious consideration of the biological theory.

  7. High molecular weight DNA assembly in vivo for synthetic biology applications.

    Science.gov (United States)

    Juhas, Mario; Ajioka, James W

    2017-05-01

    DNA assembly is the key technology of the emerging interdisciplinary field of synthetic biology. While the assembly of smaller DNA fragments is usually performed in vitro, high molecular weight DNA molecules are assembled in vivo via homologous recombination in the host cell. Escherichia coli, Bacillus subtilis and Saccharomyces cerevisiae are the main hosts used for DNA assembly in vivo. Progress in DNA assembly over the last few years has paved the way for the construction of whole genomes. This review provides an update on recent synthetic biology advances with particular emphasis on high molecular weight DNA assembly in vivo in E. coli, B. subtilis and S. cerevisiae. Special attention is paid to the assembly of whole genomes, such as those of the first synthetic cell, synthetic yeast and minimal genomes.

  8. Life's Biological Chemistry: A Destiny or Destination Starting from Prebiotic Chemistry?

    Science.gov (United States)

    Krishnamurthy, Ramanarayanan

    2018-06-05

    Research into understanding the origins -and evolution- of life has long been dominated by the concept of taking clues from extant biology and extrapolating its molecules and pathways backwards in time. This approach has also guided the search for solutions to the problem of how contemporary biomolecules would have arisen directly from prebiotic chemistry on early earth. However, the continuing difficulties in finding universally convincing solutions in connecting prebiotic chemistry to biological chemistry should give us pause, and prompt us to rethink this concept of treating extant life's chemical processes as the sole end goal and, therefore, focusing only -and implicitly- on the respective extant chemical building blocks. Rather, it may be worthwhile "to set aside the goal" and begin with what would have been plausible prebiotic reaction mixtures (which may have no obvious or direct connection to life's chemical building blocks and processes) - and allow their chemistries and interactions, under different geochemical constraints, to guide and illuminate as to what processes and systems can emerge. Such a conceptual approach gives rise to the prospect that chemistry of life-as-we-know-it is not the only result (not a "destiny"), but one that has emerged among many potential possibilities (a "destination"). This postulate, in turn, could impact the way we think about chemical signatures and criteria used in the search for alternative and extraterrestrial "life". As a bonus, we may discover the chemistries and pathways naturally that led to the emergence of life as we know it. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Minimizing Exposure at Work

    Science.gov (United States)

    Pesticide health and safety information on minimizing exposure at work, covering safe use practices and personal protective equipment (Pennsylvania State University Cooperative Extension).

  10. Minimalism. Clip and Save.

    Science.gov (United States)

    Hubbard, Guy

    2002-01-01

    Provides background information on the art movement called "Minimalism" discussing why it started and its characteristics. Includes learning activities and information on the artist, Donald Judd. Includes a reproduction of one of his art works and discusses its content. (CMK)

  11. Development of a minimal growth medium for Lactobacillus plantarum

    NARCIS (Netherlands)

    Wegkamp, H.B.A.; Teusink, B.; Vos, de W.M.; Smid, E.J.

    2010-01-01

    Aim: A medium with minimal requirements for the growth of Lactobacillus plantarum WCFS was developed. The composition of the minimal medium was compared to a genome-scale metabolic model of L. plantarum. Methods and Results: By repetitive single omission experiments, two minimal media were

  12. Support minimized inversion of acoustic and elastic wave scattering

    International Nuclear Information System (INIS)

    Safaeinili, A.

    1994-01-01

    This report discusses the following topics on support minimized inversion of acoustic and elastic wave scattering: Minimum support inversion; forward modelling of elastodynamic wave scattering; minimum support linearized acoustic inversion; support minimized nonlinear acoustic inversion without absolute phase; and support minimized nonlinear elastic inversion

  13. Noether symmetry for non-minimally coupled fermion fields

    International Nuclear Information System (INIS)

    Souza, Rudinei C de; Kremer, Gilberto M

    2008-01-01

    A cosmological model where a fermion field is non-minimally coupled with the gravitational field is studied. By applying Noether symmetry the possible functions for the potential density of the fermion field and for the coupling are determined. Cosmological solutions are found showing that the non-minimally coupled fermion field behaves as an inflaton describing an inflationary scenario, whereas the minimally coupled fermion field describes a decelerated period, behaving as a standard matter field

  14. Plateau inflation from random non-minimal coupling

    International Nuclear Information System (INIS)

    Broy, Benedict J.; Roest, Diederik

    2016-06-01

    A generic non-minimal coupling can push any higher-order terms of the scalar potential sufficiently far out in field space to yield observationally viable plateau inflation. We provide analytic and numerical evidence that this generically happens for a non-minimal coupling strength ξ of the order N_e^2. In this regime, the non-minimally coupled field is sub-Planckian during inflation and is thus protected from most higher-order terms. For larger values of ξ, the inflationary predictions converge towards the sweet spot of PLANCK. The latter includes ξ ≅ 10^4 obtained from CMB normalization arguments, thus providing a natural explanation for the inflationary observables measured.
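    To make the mechanism concrete, the schematic relations below show how a non-minimal coupling ξφ²R rescales the potential in the Einstein frame so that higher-order terms are pushed out in field space. The quartic example is a generic illustration, not the specific model analysed in the paper.

```latex
% Jordan-frame action with a non-minimal coupling (schematic)
S_J = \int d^4x \,\sqrt{-g}\left[\frac{M_P^2 + \xi\varphi^2}{2}\,R
      - \frac{1}{2}(\partial\varphi)^2 - V(\varphi)\right]

% Einstein-frame potential after the conformal rescaling
V_E(\varphi) = \frac{V(\varphi)}{\bigl(1 + \xi\varphi^2/M_P^2\bigr)^2}

% Example: for V = \lambda\varphi^4 and \xi\varphi^2 \gg M_P^2,
% V_E \to \lambda M_P^4/\xi^2, i.e. an asymptotically flat plateau.
```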

  15. Minimizing the regrets of long-term urban floodplain management decisions under deeply uncertain climate change

    Science.gov (United States)

    Hecht, J. S.; Kirshen, P. H.; Vogel, R. M.

    2016-12-01

    Making long-term floodplain management decisions under uncertain climate change is a major urban planning challenge of the 21st century. To support these efforts, we introduce a screening-level optimization model that identifies adaptation portfolios by minimizing the regrets associated with their flood-control and damage costs under different climate change trajectories that are deeply uncertain, i.e. have probabilities that cannot be specified plausibly. This mixed integer program explicitly considers the coupled damage-reduction impacts of different floodwall designs and property-scale investments (first-floor elevation, wet floodproofing of basements, permanent retreat and insurance), recommends implementation schedules, and assesses impacts to stakeholders residing in three types of homes. An application to a stylized municipality illuminates many nonlinear system dynamics stemming from large fixed capital costs, infrastructure design thresholds, and discharge-depth-damage relationships. If stakeholders tolerate mild damage, floodwalls that fully protect a community from large design events are less cost-effective than portfolios featuring both smaller floodwalls and property-scale measures. Potential losses of property tax revenue from permanent retreat motivate municipal property-tax initiatives for adaptation financing. Yet, insurance incentives for first-floor elevation may discourage locally financed floodwalls, in turn making lower-income residents more vulnerable to severe flooding. A budget constraint analysis underscores the benefits of flexible floodwall designs with low incremental expansion costs while near-optimal solutions demonstrate the scheduling flexibility of many property-scale measures. Finally, an equity analysis shows the importance of evaluating the overpayment and under-design regrets of recommended adaptation portfolios for each stakeholder and contrasts them to single-scenario model results.
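    A hedged, highly simplified sketch of the regret-minimization idea described above: evaluate each candidate adaptation portfolio's total cost under several deeply uncertain climate scenarios and pick the portfolio with the smallest worst-case regret. The portfolios, scenarios and cost numbers are invented placeholders, not values from the study.

```python
# Hypothetical total costs (construction + expected flood damage, $M)
# for each portfolio under each climate scenario.
costs = {
    "small floodwall + elevation": {"mild": 12.0, "moderate": 18.0, "severe": 30.0},
    "full-protection floodwall":   {"mild": 25.0, "moderate": 26.0, "severe": 28.0},
    "property measures only":      {"mild":  8.0, "moderate": 22.0, "severe": 45.0},
}
scenarios = ["mild", "moderate", "severe"]

# Regret of a portfolio in a scenario = its cost minus the best achievable
# cost in that scenario; minimax regret picks the smallest worst case.
best_in_scenario = {s: min(c[s] for c in costs.values()) for s in scenarios}
max_regret = {
    p: max(c[s] - best_in_scenario[s] for s in scenarios)
    for p, c in costs.items()
}

chosen = min(max_regret, key=max_regret.get)
print(max_regret)
print("minimax-regret choice:", chosen)
```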

  16. Minimization of radioactive solid wastes from uranium mining and metallurgy

    International Nuclear Information System (INIS)

    Zhang Xueli; Xu Lechang; Wei Guangzhi; Gao Jie; Wang Erqi

    2010-01-01

    The concept and contents of radioactive waste minimization are introduced. The principles of radioactive waste minimization, involving administrative optimization, source reduction, recycling and reuse, as well as volume reduction, are discussed. The strategies and methods to minimize radioactive solid wastes from uranium mining and metallurgy are summarized. In addition, the benefit from applying radioactive waste minimization is analyzed. Prospects for research on radioactive solid waste minimization are given at the end. (authors)

  17. A Color-Opponency Based Biological Model for Color Constancy

    Directory of Open Access Journals (Sweden)

    Yongjie Li

    2011-05-01

    Full Text Available Color constancy is the ability of the human visual system to adaptively correct color-biased scenes under different illuminants. Most of the existing color constancy models are not physiologically plausible. Among the limited biological models, the great majority are Retinex and its variations, and only two or three models directly simulate the feature of color-opponency, but only of the very earliest stages of the visual pathway, i.e., the single-opponent mechanisms involved at the levels of retinal ganglion cells and lateral geniculate nucleus (LGN) neurons. Considering the extensive physiological evidence supporting that both the single-opponent cells in the retina and LGN and the double-opponent neurons in primary visual cortex (V1) are the building blocks for color constancy, in this study we construct a color-opponency based color constancy model by simulating the opponent fashions of both the single-opponent and double-opponent cells in a forward manner. As for the spatial structure of the receptive fields (RF), both the classical RF (CRF) center and the nonclassical RF (nCRF) surround are taken into account for all the cells. The proposed model was tested on several typical image databases commonly used for performance evaluation of color constancy methods, and exciting results were achieved.
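    A minimal sketch of the single-opponent/double-opponent feature computation that such models build on: cone-opponent channels followed by a centre-surround (CRF centre minus nCRF surround) difference of Gaussians. Channel weights and Gaussian scales are illustrative assumptions; the full model's readout and normalization stages are not reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def opponent_responses(img, sigma_center=1.0, sigma_surround=3.0):
    """Compute schematic single-opponent channels and crude centre-surround
    ("double-opponent"-style) responses for an RGB image in [0, 1].
    Weights and scales are illustrative, not the parameters of the paper."""
    R, G, B = img[..., 0], img[..., 1], img[..., 2]
    rg = R - G                      # red-green opponent channel
    by = B - 0.5 * (R + G)          # blue-yellow opponent channel

    def center_surround(x):
        # classical-RF centre minus non-classical-RF surround
        return gaussian_filter(x, sigma_center) - gaussian_filter(x, sigma_surround)

    return {"rg": rg, "by": by,
            "rg_cs": center_surround(rg), "by_cs": center_surround(by)}

# usage: feats = opponent_responses(np.random.rand(64, 64, 3))
```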

  18. Image denoising by a direct variational minimization

    Directory of Open Access Journals (Sweden)

    Pilipović Stevan

    2011-01-01

    Full Text Available Abstract In this article we introduce a novel method for image de-noising which combines the mathematical well-posedness of variational modeling with the efficiency of a patch-based approach in the field of image processing. It is based on a direct minimization of an energy functional containing a minimal surface regularizer that uses the fractional gradient. The minimization is performed on every predefined patch of the image independently. By doing so, we avoid the use of an artificial time PDE model with its inherent problems of finding the optimal stopping time, as well as the optimal time step. Moreover, we control the level of image smoothing on each patch (and thus on the whole image) by adapting the Lagrange multiplier using information on the level of discontinuities on a particular patch, which we obtain by pre-processing. In order to reduce the average number of vectors in the approximation generator and still obtain minimal degradation, we combine a Ritz variational method for the actual minimization on a patch with a complementary fractional variational principle. Thus, the proposed method becomes computationally feasible and applicable for practical purposes. We confirm our claims with experimental results, by comparing the proposed method with a couple of PDE-based methods, where we get significantly better denoising results, especially on oscillatory regions.
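    A hedged sketch of the patch-wise energy minimization idea: gradient descent on a minimal-surface-type functional with a per-patch fidelity weight chosen from a crude discontinuity measure. It uses an ordinary (not fractional) gradient and plain gradient descent rather than the Ritz construction, so it only illustrates the structure of the method; all parameter values are assumptions.

```python
import numpy as np

def denoise_patch(f, lam, n_iter=200, tau=0.1, eps=1e-3):
    """Gradient descent on a minimal-surface-type energy for one patch:
    E(u) = sum sqrt(eps^2 + |grad u|^2) + (lam/2) * sum (u - f)^2."""
    u = f.copy()
    for _ in range(n_iter):
        ux = np.diff(u, axis=1, append=u[:, -1:])   # forward differences
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(eps**2 + ux**2 + uy**2)
        px, py = ux / mag, uy / mag
        # divergence of the normalized gradient field (periodic boundary for simplicity)
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u = u + tau * (div - lam * (u - f))
    return u

def denoise_image(f, patch=16, lam_smooth=0.05, lam_edge=0.3):
    """Process the image patch by patch, using a larger fidelity weight
    (less smoothing) on patches with strong discontinuities."""
    out = np.zeros_like(f)
    for i in range(0, f.shape[0], patch):
        for j in range(0, f.shape[1], patch):
            p = f[i:i+patch, j:j+patch]
            lam = lam_edge if p.std() > 0.15 else lam_smooth
            out[i:i+patch, j:j+patch] = denoise_patch(p, lam)
    return out

# usage: clean = denoise_image(noisy)  # noisy: 2-D array scaled to [0, 1]
```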

  19. Neural nets for the plausibility check of measured values in the integrated measurement and information system for the surveillance of environmental radioactivity (IMIS)

    International Nuclear Information System (INIS)

    Haase, G.

    2003-01-01

    Neural nets for the plausibility check of measured values in the "integrated measurement and information system for the surveillance of environmental radioactivity (IMIS)" is a research project supported by the Federal Minister for the Environment, Nature Conservation and Nuclear Safety. A goal of this project was the automatic recognition of implausible measured values in the ORACLE database, which contains measured values from the surveillance of environmental radioactivity of the most diverse environmental media. The project [1] was carried out by the Institute of Logic, Complexity and Deduction Systems of the University of Karlsruhe under the direction of Professor Dr. Menzel, Dr. Martin Riedmueller and Martin Lauer. (orig.)

  20. Minimally invasive distal pancreatectomy

    NARCIS (Netherlands)

    Røsok, Bård I.; de Rooij, Thijs; van Hilst, Jony; Diener, Markus K.; Allen, Peter J.; Vollmer, Charles M.; Kooby, David A.; Shrikhande, Shailesh V.; Asbun, Horacio J.; Barkun, Jeffrey; Besselink, Marc G.; Boggi, Ugo; Conlon, Kevin; Han, Ho Seong; Hansen, Paul; Kendrick, Michael L.; Kooby, David; Montagnini, Andre L.; Palanivelu, Chinnasamy; Wakabayashi, Go; Zeh, Herbert J.

    2017-01-01

    The first International conference on Minimally Invasive Pancreas Resection was arranged in conjunction with the annual meeting of the International Hepato-Pancreato-Biliary Association (IHPBA), in Sao Paulo, Brazil on April 19th 2016. The presented evidence and outcomes resulting from the session

  1. Maximizing biomarker discovery by minimizing gene signatures

    Directory of Open Access Journals (Sweden)

    Chang Chang

    2011-12-01

    Full Text Available Abstract Background The use of gene signatures can potentially be of considerable value in the field of clinical diagnosis. However, gene signatures defined with different methods can vary considerably even when applied to the same disease and the same endpoint. Previous studies have shown that the correct selection of subsets of genes from microarray data is key for the accurate classification of disease phenotypes, and a number of methods have been proposed for this purpose. However, these methods refine the subsets by considering each feature individually, and they do not confirm the association between the genes identified in each gene signature and the phenotype of the disease. We proposed an innovative new method termed Minimize Feature's Size (MFS), based on multiple-level similarity analyses and association between the genes and disease for breast cancer endpoints, by comparing classifier models generated from the second phase of MicroArray Quality Control (MAQC-II), trying to develop effective meta-analysis strategies to transform the MAQC-II signatures into a robust and reliable set of biomarkers for clinical applications. Results We analyzed the similarity of the multiple gene signatures in an endpoint and between the two endpoints of breast cancer at probe and gene levels; the results indicate that disease-related genes are preferentially selected as components of a gene signature, and that the gene signatures for the two endpoints could be interchangeable. The minimized signatures were built at probe level by using MFS for each endpoint. By applying the approach, we generated a much smaller gene signature with similar predictive power compared with those gene signatures from MAQC-II. Conclusions Our results indicate that gene signatures of both large and small sizes could perform equally well in clinical applications. Besides, consistency and biological significance can be detected among different gene signatures, reflecting the
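    A generic sketch of signature minimization in the spirit described above (not the MFS algorithm itself): greedy backward elimination that drops genes while cross-validated accuracy stays within a tolerance of the full signature. The classifier, tolerance and data interface are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression

def minimize_signature(X, y, genes, tol=0.02, cv=5):
    """Greedy backward elimination: remove genes one at a time as long as
    cross-validated accuracy stays within `tol` of the current signature."""
    clf = LogisticRegression(max_iter=1000)
    keep = list(range(len(genes)))
    base = cross_val_score(clf, X[:, keep], y, cv=cv).mean()
    improved = True
    while improved and len(keep) > 1:
        improved = False
        for i in list(keep):
            trial = [j for j in keep if j != i]
            score = cross_val_score(clf, X[:, trial], y, cv=cv).mean()
            if score >= base - tol:
                keep, base, improved = trial, max(base, score), True
                break
    return [genes[i] for i in keep], base

# usage: signature, acc = minimize_signature(expr_matrix, labels, gene_names)
```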

  2. Operant Conditioning: A Minimal Components Requirement in Artificial Spiking Neurons Designed for Bio-Inspired Robot’s Controller

    Directory of Open Access Journals (Sweden)

    André eCyr

    2014-07-01

    Full Text Available We demonstrate the operant conditioning (OC) learning process within a basic bio-inspired robot controller paradigm, using an artificial spiking neural network (ASNN) with minimal component count as the artificial brain. In biological agents, OC results in behavioral changes that are learned from the consequences of previous actions, using progressive prediction adjustment triggered by reinforcers. In a robotics context, virtual and physical robots may benefit from a similar learning skill when facing unknown environments with no supervision. In this work, we demonstrate that a simple ASNN can efficiently realise many OC scenarios. The elementary learning kernel that we describe relies on a few critical neurons, synaptic links and the integration of habituation and spike-timing dependent plasticity (STDP) as learning rules. Using four tasks of incremental complexity, our experimental results show that such a minimal set of neural components may be sufficient to implement many OC procedures. Hence, with the described bio-inspired module, OC can be implemented in a wide range of robot controllers, including those with limited computational resources.
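    A toy sketch of the two learning rules named above, habituation and spike-timing dependent plasticity, wired into a reinforcer-gated update loop that captures the flavour of operant conditioning. All constants, the timing distribution and the reward rule are invented placeholders, not values from the paper.

```python
import numpy as np

def stdp(dt, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes the
    postsynaptic spike (dt > 0), depress otherwise. Constants are illustrative."""
    return a_plus * np.exp(-dt / tau) if dt > 0 else -a_minus * np.exp(dt / tau)

def habituate(response, decrement=0.1, floor=0.2):
    """Habituation: repeated stimulation progressively weakens the response."""
    return max(floor, response - decrement)

rng = np.random.default_rng(0)
w, response = 0.5, 1.0                     # synaptic weight, reflex strength
for trial in range(10):
    dt = rng.uniform(-30, 30)              # pre/post spike-time difference (ms)
    reinforced = dt > 0                    # hypothetical task feedback
    if reinforced:                         # the reinforcer gates the plastic change,
        w = float(np.clip(w + stdp(dt), 0.0, 1.0))  # which is the core of operant conditioning
    response = habituate(response)
print(w, response)
```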

  3. Cost-minimization analysis of subcutaneous abatacept in the treatment of rheumatoid arthritis in Spain

    Directory of Open Access Journals (Sweden)

    R. Ariza

    2014-07-01

    Full Text Available Objective: To compare the cost of treating rheumatoid arthritis patients who have failed initial treatment with methotrexate, with subcutaneous abatacept versus other first-line biologic disease-modifying antirheumatic drugs. Method: Subcutaneous abatacept was considered comparable to intravenous abatacept, adalimumab, certolizumab pegol, etanercept, golimumab, infliximab and tocilizumab, based on indirect comparison using mixed treatment analysis. A cost-minimization analysis was therefore considered appropriate. The Spanish Health System perspective and a 3-year time horizon were selected. Pharmaceutical and administration costs (€, 2013) of all available first-line biological disease-modifying antirheumatic drugs were considered. Administration costs were obtained from a local costs database. Patients were assumed to weigh 70 kg. A 3% annual discount rate was applied. Deterministic and probabilistic sensitivity analyses were performed. Results: In the base case, subcutaneous abatacept proved to be less costly than all other biologic antirheumatic drugs (ranging from −€831.42 versus infliximab to −€9,741.69 versus tocilizumab). Subcutaneous abatacept was associated with a cost of €10,760.41 per patient during the first year of treatment and €10,261.29 in subsequent years. The total 3-year cost of subcutaneous abatacept was €29,953.89 per patient. Sensitivity analyses proved the model to be robust. Subcutaneous abatacept remained cost-saving in 100% of probabilistic sensitivity analysis simulations versus adalimumab, certolizumab, etanercept and golimumab, in more than 99.6% versus intravenous abatacept and tocilizumab, and in 62.3% versus infliximab. Conclusions: Treatment with subcutaneous abatacept is cost-saving versus intravenous abatacept, adalimumab, certolizumab, etanercept, golimumab, infliximab and tocilizumab in the management of rheumatoid arthritis patients initiating

  4. Benefit from the minimally invasive sinus technique.

    Science.gov (United States)

    Salama, N; Oakley, R J; Skilbeck, C J; Choudhury, N; Jacob, A

    2009-02-01

    Sinus drainage is impeded by the transition spaces that the anterior paranasal sinuses drain into, not the ostia themselves. Addressing the transition spaces and leaving the ostia intact, using the minimally invasive sinus technique, should reverse chronic rhinosinusitis. To assess patient benefit following use of the minimally invasive sinus technique for chronic rhinosinusitis. One hundred and forty-three consecutive patients underwent the minimally invasive sinus technique for chronic rhinosinusitis. Symptoms (i.e. blocked nose, poor sense of smell, rhinorrhoea, post-nasal drip, facial pain and sneezing) were recorded using a visual analogue scale, pre-operatively and at six and 12 weeks post-operatively. Patients were also surveyed using the Glasgow benefit inventory, one and three years post-operatively. We found a significant reduction in all nasal symptom scores at six and 12 weeks post-operatively, and increased total quality of life scores at one and three years post-operatively (25.2 and 14.8, respectively). The patient benefits of treatment with the minimally invasive sinus technique compare with the published patient benefits for functional endoscopic sinus surgery.

  5. Polycyclic Aromatic Hydrocarbons as Plausible Prebiotic Membrane Components

    Science.gov (United States)

    Groen, Joost; Deamer, David W.; Kros, Alexander; Ehrenfreund, Pascale

    2012-08-01

    Aromatic molecules delivered to the young Earth during the heavy bombardment phase in the early history of our solar system were likely to be among the most abundant and stable organic compounds available. The Aromatic World hypothesis suggests that aromatic molecules might function as container elements, energy transduction elements and templating genetic components for early life forms. To investigate the possible role of aromatic molecules as container elements, we incorporated different polycyclic aromatic hydrocarbons (PAH) in the membranes of fatty acid vesicles. The goal was to determine whether PAH could function as a stabilizing agent, similar to the role that cholesterol plays in membranes today. We studied vesicle size distribution, critical vesicle concentration and permeability of the bilayers using C6-C10 fatty acids mixed with amphiphilic PAH derivatives such as 1-hydroxypyrene, 9-anthracene carboxylic acid and 1,4 chrysene quinone. Dynamic Light Scattering (DLS) spectroscopy was used to measure the size distribution of vesicles and incorporation of PAH species was established by phase-contrast and epifluorescence microscopy. We employed conductimetric titration to determine the minimal concentration at which fatty acids could form stable vesicles in the presence of PAHs. We found that oxidized PAH derivatives can be incorporated into decanoic acid (DA) vesicle bilayers in mole ratios up to 1:10 (PAH:DA). Vesicle size distribution and critical vesicle concentration were largely unaffected by PAH incorporation, but 1-hydroxypyrene and 9-anthracene carboxylic acid lowered the permeability of fatty acid bilayers to small solutes up to 4-fold. These data represent the first indication of a cholesterol-like stabilizing effect of oxidized PAH derivatives in a simulated prebiotic membrane.

  6. Principle of minimal work fluctuations.

    Science.gov (United States)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
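    A hedged numerical illustration of why work fluctuations matter for the Jarzynski average: the exponential estimator converges slowly when the spread of W is large. The Gaussian work distributions below are toy assumptions (for a Gaussian, ΔF = ⟨W⟩ − βσ²/2), not the Landau-Zener protocol of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
beta = 1.0

def jarzynski_estimate(mean_w, sigma_w, n_samples):
    """Estimate the free-energy difference from work samples via
    -1/beta * log < exp(-beta W) >. For Gaussian work distributions the
    exact value is mean_w - beta*sigma_w**2/2."""
    w = rng.normal(mean_w, sigma_w, n_samples)
    return -np.log(np.mean(np.exp(-beta * w))) / beta

for sigma in (0.5, 2.0):                    # small vs large work fluctuations
    exact = 1.0 - beta * sigma**2 / 2
    est = jarzynski_estimate(1.0, sigma, 10_000)
    print(f"sigma={sigma}: estimate={est:.3f}, exact={exact:.3f}")
# The large-sigma case shows a visibly bigger error at fixed sample size,
# illustrating the value of minimizing fluctuations in e^(-beta W).
```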

  7. [Minimally invasive coronary artery surgery].

    Science.gov (United States)

    Zalaquett, R; Howard, M; Irarrázaval, M J; Morán, S; Maturana, G; Becker, P; Medel, J; Sacco, C; Lema, G; Canessa, R; Cruz, F

    1999-01-01

    There is a growing interest to perform a left internal mammary artery (LIMA) graft to the left anterior descending coronary artery (LAD) on a beating heart through a minimally invasive access to the chest cavity. To report the experience with minimally invasive coronary artery surgery. Analysis of 11 patients aged 48 to 79 years old with single vessel disease that, between 1996 and 1997, had a LIMA graft to the LAD performed through a minimally invasive left anterior mediastinotomy, without cardiopulmonary bypass. A 6 to 10 cm left parasternal incision was done. The LIMA to the LAD anastomosis was done after pharmacological heart rate and blood pressure control and a period of ischemic pre conditioning. Graft patency was confirmed intraoperatively by standard Doppler techniques. Patients were followed for a mean of 11.6 months (7-15 months). All patients were extubated in the operating room and transferred out of the intensive care unit on the next morning. Seven patients were discharged on the third postoperative day. Duplex scanning confirmed graft patency in all patients before discharge; in two patients, it was confirmed additionally by arteriography. There was no hospital mortality, no perioperative myocardial infarction and no bleeding problems. After follow up, ten patients were free of angina, in functional class I and pleased with the surgical and cosmetic results. One patient developed atypical angina on the seventh postoperative month and a selective arteriography confirmed stenosis of the anastomosis. A successful angioplasty of the original LAD lesion was carried out. A minimally invasive left anterior mediastinotomy is a good surgical access to perform a successful LIMA to LAD graft without cardiopulmonary bypass, allowing a shorter hospital stay and earlier postoperative recovery. However, a larger experience and a longer follow up is required to define its role in the treatment of coronary artery disease.

  8. Minimal genera of open 4-manifolds

    OpenAIRE

    Gompf, Robert E.

    2013-01-01

    We study exotic smoothings of open 4-manifolds using the minimal genus function and its analog for end homology. While traditional techniques in open 4-manifold smoothing theory give no control of minimal genera, we make progress by using the adjunction inequality for Stein surfaces. Smoothings can be constructed with much more control of these genus functions than the compact setting seems to allow. As an application, we expand the range of 4-manifolds known to have exotic smoothings (up to ...

  9. Minimally Invasive Surgery in Thymic Malignancies

    Directory of Open Access Journals (Sweden)

    Wentao FANG

    2018-04-01

    Full Text Available Surgery is the most important therapy for thymic malignancies. The last decade has seen increasing adoption of minimally invasive surgery (MIS) for thymectomy. MIS for early-stage thymoma patients has been shown to yield similar oncological results while helping to minimize surgical trauma, improve postoperative recovery, and reduce incisional pain. Meanwhile, with advances in surgical techniques, patients with locally advanced thymic tumors, preoperative induction therapies, or recurrent disease may also benefit from MIS in selected cases.

  10. Direct analysis of biological samples by total reflection X-ray fluorescence

    International Nuclear Information System (INIS)

    Lue M, Marco P.; Hernandez-Caraballo, Edwin A.

    2004-01-01

    The technique of total reflection X-ray fluorescence (TXRF) is well suited for the direct analysis of biological samples due to its low matrix interferences and simultaneous multi-element nature. Nevertheless, biological organic samples are frequently analysed after digestion procedures. The direct determination of analytes requires a shorter analysis time, consumes fewer reagents and simplifies the whole analysis process. On the other hand, biological/clinical samples are often available in minimal amounts, and routine studies require the analysis of a large number of samples. To overcome the difficulties associated with the analysis of organic samples, particularly solid ones, different sample preparation and calibration procedures approaching direct analysis have been evaluated: (1) slurry sampling, (2) Compton peak standardization, (3) in situ microwave digestion, (4) in situ chemical modification and (5) direct analysis with internal standardization. Examples of analytical methods developed by our research group are discussed. Some of them have not been previously published, illustrating alternative strategies for coping with various problems that may be encountered in direct analysis by total reflection X-ray fluorescence spectrometry.

  11. Strong Sector in non-minimal SUSY model

    Directory of Open Access Journals (Sweden)

    Costantini Antonio

    2016-01-01

    Full Text Available We investigate the squark sector of a supersymmetric theory with an extended Higgs sector. We give the mass matrices of the stop and sbottom, comparing the Minimal Supersymmetric Standard Model (MSSM) case and the non-minimal case. We discuss the impact of the extra superfields on the decay channels of the stop searched for at the LHC.

  12. Waste Minimization Measurement and Progress Reporting

    International Nuclear Information System (INIS)

    Stone, K.A.

    1995-01-01

    Westinghouse Savannah River Company is implementing productivity improvement concepts into the Waste Minimization Program by focusing on the positive initiatives taken to reduce waste generation at the Savannah River Site. Previous performance measures, based only on waste generation rates, proved to be an ineffective metric for measuring performance and promoting continuous improvements within the Program. Impacts of mission changes and non-routine operations impeded development of baseline waste generation rates and often negated waste generation trending reports. A system was developed to quantify, document and track innovative activities that impact waste volume and radioactivity/toxicity reductions. This system coupled with Management-driven waste disposal avoidance goals is proving to be a powerful tool to promote waste minimization awareness and the implementation of waste reduction initiatives. Measurement of waste not generated, in addition to waste generated, increases the credibility of the Waste Minimization Program, improves sharing of success stories, and supports development of regulatory and management reports

  13. Opportunity-based age replacement policy with minimal repair

    International Nuclear Information System (INIS)

    Jhang, J.P.; Sheu, S.H.

    1999-01-01

    This paper proposes an opportunity-based age replacement policy with minimal repair. The system has two types of failures. Type I failures (minor failures) are removed by minimal repairs, whereas type II failures are removed by replacements. Type I and type II failures are age-dependent. A system is replaced at type II failure (catastrophic failure) or at the opportunity after age T, whichever occurs first. The cost of the minimal repair of the system at age z depends on the random part C(z) and the deterministic part c(z). The opportunity arises according to a Poisson process, independent of failures of the component. The expected cost rate is obtained. The optimal T * which would minimize the cost rate is discussed. Various special cases are considered. Finally, a numerical example is given
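    A hedged numerical sketch of the kind of cost-rate minimization discussed in this record, reduced to the classical special case with no replacement opportunities and no type II failures: replace at age T, perform minimal repairs in between, and choose T to minimize the long-run expected cost rate. The Weibull intensity and cost figures are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simplified special case: periodic replacement at age T with minimal repairs
# in between. Failures follow a non-homogeneous Poisson process with a
# Weibull intensity; shape/scale and costs are assumptions, not the paper's model.
beta, eta = 2.5, 100.0            # Weibull shape / scale of the failure intensity
c_minimal, c_replace = 1.0, 20.0  # cost of a minimal repair / of a replacement

def expected_minimal_repairs(T):
    # Mean number of NHPP failures up to T: integral of the intensity
    return (T / eta) ** beta

def cost_rate(T):
    return (c_replace + c_minimal * expected_minimal_repairs(T)) / T

res = minimize_scalar(cost_rate, bounds=(1.0, 1000.0), method="bounded")
print(f"T* = {res.x:.1f}, cost rate = {res.fun:.4f}")
```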

  14. Systems biology elucidates common pathogenic mechanisms between nonalcoholic and alcoholic-fatty liver disease.

    Directory of Open Access Journals (Sweden)

    Silvia Sookoian

    Full Text Available The abnormal accumulation of fat in the liver is often related either to metabolic risk factors associated with metabolic syndrome in the absence of alcohol consumption (nonalcoholic fatty liver disease, NAFLD) or to chronic alcohol consumption (alcoholic fatty liver disease, AFLD). Clinical and histological studies suggest that NAFLD and AFLD share pathogenic mechanisms. Nevertheless, current data are still inconclusive as to whether the underlying biological processes and disease pathways of NAFLD and AFLD are alike. Our primary aim was to integrate omics and physiological data to answer the question of whether NAFLD and AFLD share molecular processes that lead to disease development. We also explored the extent to which insulin resistance (IR) is a distinctive feature of NAFLD. To answer these questions, we used systems biology approaches, such as gene enrichment analysis, protein-protein interaction networks, and gene prioritization, based on multi-level data extracted by computational data mining. We observed that the leading disease pathways associated with NAFLD did not differ significantly from those of AFLD. However, systems biology revealed the importance of each molecular process behind each of the two diseases and dissected distinctive molecular NAFLD and AFLD signatures. Comparative co-analysis of NAFLD and AFLD clarified the participation of NAFLD, but not AFLD, in cardiovascular disease, and showed that insulin signaling is impaired in fatty liver regardless of the noxa; the putative regulatory mechanisms associated with NAFLD seem to encompass a complex network of genes and proteins, plausibly subject to epigenetic modification. Gene prioritization showed a cancer-related functional map, suggesting that the fatty transformation of liver tissue is, regardless of the cause, an emerging mechanism of ubiquitous oncogenic activation. In conclusion, similar underlying disease mechanisms lead to NAFLD and AFLD, but specific ones depict a

  15. Implementation of Complex Biological Logic Circuits Using Spatially Distributed Multicellular Consortia

    Science.gov (United States)

    Urrios, Arturo; de Nadal, Eulàlia; Solé, Ricard; Posas, Francesc

    2016-01-01

    Engineered synthetic biological devices have been designed to perform a variety of functions, from sensing molecules and bioremediation to energy production and biomedicine. Notwithstanding, a major limitation of in vivo circuit implementation is the constraint associated with the use of standard methodologies for circuit design. Thus, future success of these devices depends on obtaining circuits with scalable complexity and reusable parts. Here we show how to build complex computational devices using multicellular consortia and space as key computational elements. This spatial modular design grants scalability, since its general architecture is independent of the circuit's complexity, minimizes wiring requirements, and allows component reusability with minimal genetic engineering. The potential of this approach is demonstrated by the implementation of complex logic functions with up to six inputs, illustrating the scalability and flexibility of the method. The potential implications of our results are outlined. PMID:26829588

  16. Abrasive water jet cutting technique for biological shield concrete dismantlement

    International Nuclear Information System (INIS)

    Konno, T.; Narazaki, T.; Yokota, M.; Yoshida, H.; Miura, M.; Miyazaki, Y.

    1987-01-01

    The Japan Atomic Energy Research Institute (JAERI) is developing the abrasive-water jet cutting system to be applied to dismantling the biological shield walls of the JPDR as a part of the reactor dismantling technology development project. This is a total system for dismantling highly activated concrete. The concrete biological shield wall is cut into blocks by driving the abrasive-water jet nozzle, which is operated with a remote, automated control system. In this system, the concrete blocks are removed to a container, while the slurry and dust/mist which are generated during cutting are collected and treated, both automatically. It is a very practical method and will quite probably be used for actual dismantling of commercial power reactors in the future because it can minimize workers' exposure to radioactivity during dismantling, contributes to preventing diffusion of radiation, and reduces the volume of contaminated secondary waste.

  17. Minimizing species extinctions through strategic planning for conservation fencing.

    Science.gov (United States)

    Ringma, Jeremy L; Wintle, Brendan; Fuller, Richard A; Fisher, Diana; Bode, Michael

    2017-10-01

    Conservation fences are an increasingly common management action, particularly for species threatened by invasive predators. However, unlike many conservation actions, fence networks are expanding in an unsystematic manner, generally as a reaction to local funding opportunities or threats. We conducted a gap analysis of Australia's large predator-exclusion fence network by examining translocation of Australian mammals relative to their extinction risk. To address gaps identified in species representation, we devised a systematic prioritization method for expanding the conservation fence network that explicitly incorporated population viability analysis and minimized expected species extinctions. The approach was applied to New South Wales, Australia, where the state government intends to expand the existing conservation fence network. Existing protection of species in fenced areas was highly uneven; 67% of predator-sensitive species were unrepresented in the fence network. Our systematic prioritization yielded substantial efficiencies, reducing the expected number of species extinctions up to 17 times more effectively than ad hoc approaches. The outcome illustrates the importance of governance in coordinating management action when multiple projects have similar objectives, and of relying on systematic methods rather than expanding networks opportunistically. © 2017 Society for Conservation Biology.
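
    A minimal way to see why systematic selection beats ad hoc expansion is to treat fence sites as a set-cover-like problem over species persistence. The Python sketch below uses entirely hypothetical species and site data, not the published analysis: each fenced population is assumed to persist independently with probability FENCE_SUCCESS, so the expected number of extinctions is the sum over species of the probability that the wild population and every fenced population fail, and sites are chosen greedily by marginal reduction per unit cost.

```python
# Hypothetical baseline extinction probabilities (no fencing)
baseline = {"bilby": 0.6, "bettong": 0.8, "numbat": 0.5, "quoll": 0.4, "stick-nest rat": 0.9}

# Hypothetical candidate fences: (species they could hold, construction cost)
sites = {
    "site_A": ({"bilby", "bettong"}, 2.0),
    "site_B": ({"numbat", "quoll"}, 1.5),
    "site_C": ({"bettong", "stick-nest rat"}, 2.5),
    "site_D": ({"bilby", "numbat", "stick-nest rat"}, 3.0),
}
FENCE_SUCCESS = 0.8   # assumed persistence probability of one fenced population
BUDGET = 5.0

def expected_extinctions(n_fenced):
    """Expected extinctions if species sp has n_fenced[sp] independent fenced populations."""
    return sum(p_wild * (1.0 - FENCE_SUCCESS) ** n_fenced.get(sp, 0)
               for sp, p_wild in baseline.items())

chosen, spent, counts = [], 0.0, {}
while True:
    best, best_gain = None, 0.0
    for name, (spp, cost) in sites.items():
        if name in chosen or spent + cost > BUDGET:
            continue
        trial = dict(counts)
        for sp in spp:
            trial[sp] = trial.get(sp, 0) + 1
        gain = (expected_extinctions(counts) - expected_extinctions(trial)) / cost
        if gain > best_gain:
            best, best_gain = name, gain
    if best is None:
        break
    chosen.append(best)
    spent += sites[best][1]
    for sp in sites[best][0]:
        counts[sp] = counts.get(sp, 0) + 1

print("chosen sites:", chosen)
print("expected extinctions:", round(expected_extinctions(counts), 3))
```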

  18. The Effect of Bionanocomposite Coating on the Quality of Minimally Processed Mango (Pengaruh Pelapis Bionanokomposit terhadap Mutu Mangga Terolah Minimal)

    OpenAIRE

    Wardana, Ata Aditya; Suyatma, Nugraha Edhi; Muchtadi, Tien Ruspriatin; Yuliani, Sri

    2017-01-01

    Minimally processed mango is a perishable product due to its high respiration and transpiration rates and microbial decay. Edible coating is one of the alternative methods to maintain the quality of minimally processed mango. The objective of this study was to evaluate the effects of a bionanocomposite edible coating made from tapioca and ZnO nanoparticles (NP-ZnO) on the quality of minimally processed mango cv. Arumanis, stored for 12 days at 8°C. The combination of tapioca and NP-ZnO (0, 1, 2% b...

  19. Is synthetic biology mechanical biology?

    Science.gov (United States)

    Holm, Sune

    2015-12-01

    A widespread and influential characterization of synthetic biology emphasizes that synthetic biology is the application of engineering principles to living systems. Furthermore, there is a strong tendency to express the engineering approach to organisms in terms of what seems to be an ontological claim: organisms are machines. In the paper I investigate the ontological and heuristic significance of the machine analogy in synthetic biology. I argue that the use of the machine analogy and the aim of producing rationally designed organisms does not necessarily imply a commitment to mechanical biology. The ideal of applying engineering principles to biology is best understood as expressing recognition of the machine-unlikeness of natural organisms and the limits of human cognition. The paper suggests an interpretation of the identification of organisms with machines in synthetic biology according to which it expresses a strategy for representing, understanding, and constructing living systems that are more machine-like than natural organisms.

  20. Synchronous volcanic eruptions and abrupt climate change ∼17.7 ka plausibly linked by stratospheric ozone depletion.

    Science.gov (United States)

    McConnell, Joseph R; Burke, Andrea; Dunbar, Nelia W; Köhler, Peter; Thomas, Jennie L; Arienzo, Monica M; Chellman, Nathan J; Maselli, Olivia J; Sigl, Michael; Adkins, Jess F; Baggenstos, Daniel; Burkhart, John F; Brook, Edward J; Buizert, Christo; Cole-Dai, Jihong; Fudge, T J; Knorr, Gregor; Graf, Hans-F; Grieman, Mackenzie M; Iverson, Nels; McGwire, Kenneth C; Mulvaney, Robert; Paris, Guillaume; Rhodes, Rachael H; Saltzman, Eric S; Severinghaus, Jeffrey P; Steffensen, Jørgen Peder; Taylor, Kendrick C; Winckler, Gisela

    2017-09-19

    Glacial-state greenhouse gas concentrations and Southern Hemisphere climate conditions persisted until ∼17.7 ka, when a nearly synchronous acceleration in deglaciation was recorded in paleoclimate proxies in large parts of the Southern Hemisphere, with many changes ascribed to a sudden poleward shift in the Southern Hemisphere westerlies and subsequent climate impacts. We used high-resolution chemical measurements in the West Antarctic Ice Sheet Divide, Byrd, and other ice cores to document a unique, ∼192-y series of halogen-rich volcanic eruptions exactly at the start of accelerated deglaciation, with tephra identifying the nearby Mount Takahe volcano as the source. Extensive fallout from these massive eruptions has been found >2,800 km from Mount Takahe. Sulfur isotope anomalies and marked decreases in ice core bromine consistent with increased surface UV radiation indicate that the eruptions led to stratospheric ozone depletion. Rather than a highly improbable coincidence, circulation and climate changes extending from the Antarctic Peninsula to the subtropics-similar to those associated with modern stratospheric ozone depletion over Antarctica-plausibly link the Mount Takahe eruptions to the onset of accelerated Southern Hemisphere deglaciation ∼17.7 ka.

  1. Bilinguals' Plausibility Judgments for Phrases with a Literal vs. Non-literal Meaning: The Influence of Language Brokering Experience

    Directory of Open Access Journals (Sweden)

    Belem G. López

    2017-09-01

    Full Text Available Previous work has shown that prior experience in language brokering (informal translation) may facilitate the processing of meaning within and across language boundaries. The present investigation examined the influence of brokering on bilinguals' processing of two-word collocations with either a literal or a figurative meaning in each language. Proficient Spanish-English bilinguals classified as brokers or non-brokers were asked to judge if adjective+noun phrases presented in each language made sense or not. Phrases with a literal meaning (e.g., stinging insect) were interspersed with phrases with a figurative meaning (e.g., stinging insult) and non-sensical phrases (e.g., stinging picnic). It was hypothesized that plausibility judgments would be facilitated for literal relative to figurative meanings in each language but that experience in language brokering would be associated with a more equivalent pattern of responding across languages. These predictions were confirmed. The findings add to the body of empirical work on individual differences in language processing in bilinguals associated with prior language brokering experience.

  2. Automated economic analysis model for hazardous waste minimization

    International Nuclear Information System (INIS)

    Dharmavaram, S.; Mount, J.B.; Donahue, B.A.

    1990-01-01

    The US Army has established a policy of achieving a 50 percent reduction in hazardous waste generation by the end of 1992. To assist the Army in reaching this goal, the Environmental Division of the US Army Construction Engineering Research Laboratory (USACERL) designed the Economic Analysis Model for Hazardous Waste Minimization (EAHWM). The EAHWM was designed to allow the user to evaluate the life cycle costs for various techniques used in hazardous waste minimization and to compare them to the life cycle costs of current operating practices. The program was developed in C language on an IBM compatible PC and is consistent with other pertinent models for performing economic analyses. The potential hierarchical minimization categories used in EAHWM include source reduction, recovery and/or reuse, and treatment. Although treatment is no longer an acceptable minimization option, its use is widespread and has therefore been addressed in the model. The model allows for economic analysis for minimization of the Army's six most important hazardous waste streams. These include solvents, paint stripping wastes, metal plating wastes, industrial waste sludges, used oils, and batteries and battery electrolytes. The EAHWM also includes a general application which can be used to calculate and compare the life cycle costs for minimization alternatives of any waste stream, hazardous or non-hazardous. The EAHWM has been fully tested and implemented in more than 60 Army installations in the United States.
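
    The core calculation in this kind of model is a discounted life-cycle cost comparison between current practice and a minimization alternative. The Python sketch below is a generic net-present-value comparison with invented figures, not the EAHWM code itself.

```python
def life_cycle_cost(capital, annual_costs, discount_rate):
    """Present value of an upfront capital cost plus a stream of annual costs."""
    return capital + sum(c / (1.0 + discount_rate) ** (t + 1)
                         for t, c in enumerate(annual_costs))

YEARS, RATE = 10, 0.05
# Hypothetical figures ($k/yr): disposal-heavy current practice vs. a solvent recovery unit
current  = life_cycle_cost(capital=0.0,   annual_costs=[120.0] * YEARS, discount_rate=RATE)
recovery = life_cycle_cost(capital=250.0, annual_costs=[45.0] * YEARS,  discount_rate=RATE)

print(f"current practice : {current:8.1f} $k")
print(f"solvent recovery : {recovery:8.1f} $k")
print(f"net savings      : {current - recovery:8.1f} $k over {YEARS} years")
```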

  3. The relative volume growth of minimal submanifolds

    DEFF Research Database (Denmark)

    Markvorsen, Steen; Palmer, V.

    2002-01-01

    The volume growth of certain well-defined subsets of minimal submanifolds in Riemannian spaces is compared with the volume growth of balls and spheres in space forms of constant curvature.

  4. Pathways of DNA unlinking: A story of stepwise simplification.

    Science.gov (United States)

    Stolz, Robert; Yoshida, Masaaki; Brasher, Reuben; Flanner, Michelle; Ishihara, Kai; Sherratt, David J; Shimokawa, Koya; Vazquez, Mariel

    2017-09-29

    In Escherichia coli, DNA replication yields interlinked chromosomes. Controlling topological changes associated with replication and returning the newly replicated chromosomes to an unlinked monomeric state is essential to cell survival. In the absence of the topoisomerase topoIV, the site-specific recombination complex XerCD-dif-FtsK can remove replication links by local reconnection. We previously showed mathematically that there is a unique minimal pathway of unlinking replication links by reconnection while stepwise reducing the topological complexity. However, the possibility that reconnection preserves or increases topological complexity is biologically plausible. In this case, are there other unlinking pathways? Which is the most probable? We consider these questions in an analytical and numerical study of minimal unlinking pathways. We use a Markov Chain Monte Carlo algorithm with Multiple Markov Chain sampling to model local reconnection on 491 different substrate topologies, 166 knots and 325 links, and distinguish between pathways connecting a total of 881 different topologies. We conclude that the minimal pathway of unlinking replication links that was found under more stringent assumptions is the most probable. We also present exact results on unlinking a 6-crossing replication link. These results point to a general process of topology simplification by local reconnection, with applications going beyond DNA.

  5. Constituent period in theoretization of minimalism in architecture

    Directory of Open Access Journals (Sweden)

    Stevanović Vladimir

    2012-01-01

    Full Text Available The paper analyzes the architectural discourse that formed around the term minimalism between 1976 and 1999, a period that I consider constitutive for the theorization of the term. The presentation is guided by two hypotheses: (I) minimalism in architecture does not have a continuous stream of origin and development, and is not a style, direction, movement, school, genre or trend in the sense in which these are defined in disciplines such as art history, aesthetics and art theory; (II) the fact that it is rare for an architect to declare himself a minimalist suggests that minimalism in architecture is actually a product or construct of an architectural discourse that emerged from the need to consolidate an existing, obvious and widespread formal idiom in architecture, partly during and after post-modernism. It is indicative that the writing of the history of minimalism in architecture, in its most intensive period - the nineties - took place mainly in three cities: London, Barcelona and Milan. In this sense, we can examine how each of these centers emphasized its role, through the ambition of minimalism in architecture to appear as an authentic local creation.

  6. Algorithm for finding minimal cut sets in a fault tree

    International Nuclear Information System (INIS)

    Rosenberg, Ladislav

    1996-01-01

    This paper presents several algorithms that have been used in a computer code for fault-tree analysis by the minimal cut sets method. The main algorithm is a more efficient version of the new CARA algorithm, which finds minimal cut sets using an auxiliary dynamical structure. The presented algorithm enables minimal cut sets to be found according to defined requirements - by the order of the minimal cut sets, by their number, or both. This algorithm is three to six times faster than the primary version of the CARA algorithm.
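
    The abstract does not reproduce the CARA algorithm itself, but the classical top-down expansion it improves upon is easy to sketch. The toy Python implementation below expands OR gates into alternative cut sets and AND gates into larger ones, then removes non-minimal (absorbed) sets; the example fault tree is invented.

```python
def minimal_cut_sets(gates, top):
    """gates: {gate_name: ("AND"|"OR", [children])}; names not in `gates` are basic events."""
    cut_sets = [frozenset([top])]
    done = False
    while not done:
        done = True
        expanded = []
        for cs in cut_sets:
            gate = next((e for e in cs if e in gates), None)
            if gate is None:
                expanded.append(cs)
                continue
            done = False
            kind, children = gates[gate]
            rest = cs - {gate}
            if kind == "AND":                       # all children are needed together
                expanded.append(rest | frozenset(children))
            else:                                   # OR: each child yields its own cut set
                expanded.extend(rest | {c} for c in children)
        cut_sets = expanded
    unique = set(cut_sets)
    # absorption: drop any cut set that strictly contains another cut set
    return [cs for cs in unique if not any(other < cs for other in unique)]

if __name__ == "__main__":
    tree = {
        "TOP": ("OR",  ["G1", "G2"]),
        "G1":  ("AND", ["A", "B"]),
        "G2":  ("AND", ["A", "G3"]),
        "G3":  ("OR",  ["B", "C"]),
    }
    for cs in sorted(minimal_cut_sets(tree, "TOP"), key=sorted):
        print(sorted(cs))       # prints ['A', 'B'] and ['A', 'C']
```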

  7. Development of a waste minimization plan for a Department of Energy remedial action program: Ideas for minimizing waste in remediation scenarios

    International Nuclear Information System (INIS)

    Hubbard, Linda M.; Galen, Glen R.

    1992-01-01

    Waste minimization has become an important consideration in the management of hazardous waste because of regulatory as well as cost considerations. Waste minimization techniques are often process specific or industry specific and generally are not applicable to site remediation activities. This paper will examine ways in which waste can be minimized in a remediation setting such as the U.S. Department of Energy's Formerly Utilized Sites Remedial Action Program, where the bulk of the waste produced results from remediating existing contamination, not from generating new waste. (author)

  8. Solving inverse problems for biological models using the collage method for differential equations.

    Science.gov (United States)

    Capasso, V; Kunze, H E; La Torre, D; Vrscay, E R

    2013-07-01

    In the first part of this paper we show how inverse problems for differential equations can be solved using the so-called collage method. Inverse problems can be solved by minimizing the collage distance in an appropriate metric space. We then provide several numerical examples in mathematical biology. We consider applications of this approach to the following areas: population dynamics, mRNA and protein concentration, bacteria and amoeba cells interaction, tumor growth.
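
    The justification for minimizing the collage distance is the standard collage theorem: if T is a contraction with factor c in [0,1) on a complete metric space and x̄ is its fixed point, then for any target element x,

        d(x, \bar{x}) \;\le\; \frac{1}{1 - c}\, d(x, Tx),

    so choosing, within a parametrized family of operators, the one that minimizes the collage distance d(x, Tx) also bounds the true approximation error. This is the generic collage bound; the ODE inverse problems in the paper use an analogous bound for the associated contraction (Picard-type) operator.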

  9. Predicting the minimal translation apparatus: lessons from the reductive evolution of mollicutes.

    Directory of Open Access Journals (Sweden)

    Henri Grosjean

    2014-05-01

    reconstruction of a minimal cell by synthetic biology approaches.

  10. Helium Ion Microscopy (HIM) for the imaging of biological samples at sub-nanometer resolution

    Science.gov (United States)

    Joens, Matthew S.; Huynh, Chuong; Kasuboski, James M.; Ferranti, David; Sigal, Yury J.; Zeitvogel, Fabian; Obst, Martin; Burkhardt, Claus J.; Curran, Kevin P.; Chalasani, Sreekanth H.; Stern, Lewis A.; Goetze, Bernhard; Fitzpatrick, James A. J.

    2013-12-01

    Scanning Electron Microscopy (SEM) has long been the standard in imaging the sub-micrometer surface ultrastructure of both hard and soft materials. In the case of biological samples, it has provided great insights into their physical architecture. However, three of the fundamental challenges in the SEM imaging of soft materials are that of limited imaging resolution at high magnification, charging caused by the insulating properties of most biological samples and the loss of subtle surface features by heavy metal coating. These challenges have recently been overcome with the development of the Helium Ion Microscope (HIM), which boasts advances in charge reduction, minimized sample damage, high surface contrast without the need for metal coating, increased depth of field, and 5 angstrom imaging resolution. We demonstrate the advantages of HIM for imaging biological surfaces as well as compare and contrast the effects of sample preparation techniques and their consequences on sub-nanometer ultrastructure.

  11. GeLC-MS: A Sample Preparation Method for Proteomics Analysis of Minimal Amount of Tissue.

    Science.gov (United States)

    Makridakis, Manousos; Vlahou, Antonia

    2017-10-10

    Various proteomics methodologies have been implemented for the global and targeted proteome analysis of many different types of biological samples, such as tissue, urine, plasma, serum, blood, and cell lines. Among these, tissue has an exceptional role in clinical research and practice. Disease initiation and progression are usually located at the tissue level of different organs, making the analysis of this material very important for understanding disease pathophysiology. Despite the significant advances in mass spectrometry instrumentation, tissue proteomics still faces several challenges, mainly due to increased sample complexity and heterogeneity. The most prominent challenge, however, is the invasive procedure of tissue sampling, which restricts the availability of fresh frozen tissue to minimal amounts and a limited number of samples. Application of the GeLC-MS sample preparation protocol for tissue proteomics analysis can help overcome these difficulties. In this chapter, a step-by-step guide for the proteomics analysis of minute amounts of tissue samples using the GeLC-MS sample preparation protocol, as applied by our group in the analysis of multiple different types of tissues (vessels, kidney, bladder, prostate, heart), is provided.

  12. Protein Intake and Muscle Health in Old Age: From Biological Plausibility to Clinical Evidence

    Directory of Open Access Journals (Sweden)

    Francesco Landi

    2016-05-01

    Full Text Available The provision of sufficient amounts of dietary proteins is central to muscle health as it ensures the supply of essential amino acids and stimulates protein synthesis. Older persons, in particular, are at high risk of insufficient protein ingestion. Furthermore, the current recommended dietary allowance for protein (0.8 g/kg/day might be inadequate for maintaining muscle health in older adults, probably as a consequence of “anabolic resistance” in aged muscle. Older individuals therefore need to ingest a greater quantity of protein to maintain muscle function. The quality of protein ingested is also essential to promoting muscle health. Given the role of leucine as the master dietary regulator of muscle protein turnover, the ingestion of protein sources enriched with this essential amino acid, or its metabolite β-hydroxy β-methylbutyrate, is thought to offer the greatest benefit in terms of preservation of muscle mass and function in old age.

  13. Minimalism and the Pragmatic Frame

    Directory of Open Access Journals (Sweden)

    Ana Falcato

    2016-02-01

    Full Text Available In the debate between literalism and contextualism in semantics, Kent Bach’s project is often taken to stand on the latter side of the divide. In this paper I argue this is a misleading assumption and justify it by contrasting Bach’s assessment of the theoretical eliminability of minimal propositions arguably expressed by well-formed sentences with standard minimalist views, and by further contrasting his account of the division of interpretative processes ascribable to the semantics and pragmatics of a language with a parallel analysis carried out by the most radical opponent to semantic minimalism, i.e., by occasionalism. If my analysis proves right, the sum of its conclusions amounts to a refusal of Bach’s main dichotomies.

  14. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
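
    For a binary output, the functional form described here (a logistic function whose argument contains first- and second-order stimulus terms) can be illustrated with a toy fit. The Python sketch below generates synthetic data from a known second-order logistic model in a 2-D stimulus space and recovers it by ordinary logistic regression on quadratic features; it illustrates the form of the model only, not the maximum-noise-entropy fitting procedure or the retinal/thalamic data used in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
X = rng.standard_normal((5000, 2))                    # 2-D stimulus projections

# "True" second-order logistic response: P(spike) = sigma(a + b.x + x'Cx)
a, b = -1.0, np.array([1.5, -0.5])
C = np.array([[0.8, 0.2], [0.2, -0.6]])
logit = a + X @ b + np.einsum("ni,ij,nj->n", X, C, X)
y = rng.random(5000) < 1.0 / (1.0 + np.exp(-logit))   # Bernoulli spike / no-spike

# Fit a logistic model with first- and second-order features
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LogisticRegression(max_iter=1000).fit(quad.fit_transform(X), y)

names = ["x1", "x2", "x1^2", "x1*x2", "x2^2"]         # feature order for degree-2 expansion
print("fitted bias   :", round(model.intercept_[0], 2))
for name, w in zip(names, np.round(model.coef_[0], 2)):
    print(f"fitted weight {name:6s}: {w}")
```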

  15. A videoscope for use in minimally invasive periodontal surgery.

    Science.gov (United States)

    Harrel, Stephen K; Wilson, Thomas G; Rivera-Hidalgo, Francisco

    2013-09-01

    Minimally invasive periodontal procedures have been reported to produce excellent clinical results. Visualization during minimally invasive procedures has traditionally been obtained by the use of surgical telescopes, surgical microscopes, glass fibre endoscopes or a combination of these devices. All of these methods for visualization are less than fully satisfactory due to problems with access, magnification and blurred imaging. A videoscope for use with minimally invasive periodontal procedures has been developed to overcome some of the difficulties that exist with current visualization approaches. This videoscope incorporates a gas shielding technology that eliminates the problems of fogging and fouling of the optics of the videoscope that has previously prevented the successful application of endoscopic visualization to periodontal surgery. In addition, as part of the gas shielding technology the videoscope also includes a moveable retractor specifically adapted for minimally invasive surgery. The clinical use of the videoscope during minimally invasive periodontal surgery is demonstrated and discussed. The videoscope with gas shielding alleviates many of the difficulties associated with visualization during minimally invasive periodontal surgery. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Environmental Restoration Program waste minimization and pollution prevention self-assessment

    International Nuclear Information System (INIS)

    1994-10-01

    The Environmental Restoration (ER) Program within Martin Marietta Energy Systems, Inc. is currently developing a more active waste minimization and pollution prevention program. To determine areas of programmatic improvements within the ER Waste Minimization and Pollution Prevention Awareness Program, the ER Program required an evaluation of the program across the Oak Ridge K-25 Site, the Oak Ridge National Laboratory, the Oak Ridge Y-12 Plant, the Paducah Environmental Restoration and Waste Minimization Site, and the Portsmouth Environmental Restoration and Waste Minimization Site. This document presents the status of the overall program as of fourth quarter FY 1994, presents pollution prevention cost avoidance data associated with FY 1994 activities, and identifies areas for improvement. Results of this assessment indicate that the ER Waste Minimization and Pollution Prevention Awareness Program is firmly established and is developing rapidly. Several procedural goals were met in FY 1994 and many of the sites implemented ER waste minimization options. Additional growth is needed, however, for the ER Waste Minimization and Pollution Prevention Awareness Program

  17. Local Risk-Minimization for Defaultable Claims with Recovery Process

    International Nuclear Information System (INIS)

    Biagini, Francesca; Cretarola, Alessandra

    2012-01-01

    We study the local risk-minimization approach for defaultable claims with random recovery at default time, seen as payment streams on the random interval [0,τ∧T], where T denotes the fixed time-horizon. We find the pseudo-locally risk-minimizing strategy in the case when the agent's information takes into account the possibility of a default event (local risk-minimization with G-strategies), and we provide an application in the case of a corporate bond. We also discuss the problem of finding a pseudo-locally risk-minimizing strategy if we suppose the agent obtains her information only by observing the non-defaultable assets.

  18. Flattening the inflaton potential beyond minimal gravity

    Directory of Open Access Journals (Sweden)

    Lee Hyun Min

    2018-01-01

    Full Text Available We review the status of the Starobinsky-like models for inflation beyond minimal gravity and discuss the unitarity problem due to the presence of a large non-minimal gravity coupling. We show that the induced gravity models allow for a self-consistent description of inflation and discuss the implications of the inflaton couplings to the Higgs field in the Standard Model.
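
    The flattening mechanism the title refers to can be summarized schematically (conventions vary between papers, so the expressions below are the generic textbook form rather than the specific model reviewed). A scalar with a large non-minimal coupling ξ to the Ricci scalar,

        S_J = \int d^4x \sqrt{-g}\left[\frac{M_P^2 + \xi\phi^2}{2}\,R - \frac{1}{2}(\partial\phi)^2 - \frac{\lambda}{4}\phi^4\right],

    is brought to the Einstein frame by the Weyl rescaling with \Omega^2 = 1 + \xi\phi^2/M_P^2, which divides the potential by \Omega^4:

        V_E(\phi) = \frac{\lambda\phi^4/4}{\left(1 + \xi\phi^2/M_P^2\right)^2} \;\longrightarrow\; \frac{\lambda M_P^4}{4\xi^2} \quad\text{for } \xi\phi^2 \gg M_P^2.

    The plateau at large field values is what yields Starobinsky-like predictions, while the same large ξ is the source of the unitarity concerns discussed in the review.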

  19. Fast nonconvex nonsmooth minimization methods for image restoration and reconstruction.

    Science.gov (United States)

    Nikolova, Mila; Ng, Michael K; Tam, Chi-Pan

    2010-12-01

    Nonconvex nonsmooth regularization has advantages over convex regularization for restoring images with neat edges. However, its practical interest used to be limited by the difficulty of the computational stage, which requires a nonconvex nonsmooth minimization. In this paper, we deal with nonconvex nonsmooth minimization methods for image restoration and reconstruction. Our theoretical results show that the solution of the nonconvex nonsmooth minimization problem is composed of constant regions surrounded by closed contours and neat edges. The main goal of this paper is to develop fast minimization algorithms to solve the nonconvex nonsmooth minimization problem. Our experimental results show the effectiveness and efficiency of the proposed algorithms.
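
    A generic form of the model class studied here (the paper's specific data terms and potentials may differ) is the regularized objective

        \min_{x}\; \tfrac{1}{2}\,\|Ax - y\|_2^2 + \lambda \sum_{i} \varphi\!\left(\|G_i x\|\right),

    where A models blur or sampling, the G_i are finite-difference operators, and φ is nonconvex and nonsmooth at zero, e.g. φ(t) = αt/(1+αt) or the truncated penalty φ(t) = min(αt, 1). It is the combination of nonconvexity and the nonsmooth kink at zero that produces the constant regions and neat edges described in the abstract.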

  20. Home visiting and the biology of toxic stress: opportunities to address early childhood adversity.

    Science.gov (United States)

    Garner, Andrew S

    2013-11-01

    Home visiting is an important mechanism for minimizing the lifelong effects of early childhood adversity. To do so, it must be informed by the biology of early brain and child development. Advances in neuroscience, epigenetics, and the physiology of stress are revealing the biological mechanisms underlying well-established associations between early childhood adversity and suboptimal life-course trajectories. Left unchecked, mediators of physiologic stress become toxic, alter both genome and brain, and lead to a vicious cycle of chronic stress. This so-called "toxic stress" results in a wide array of behavioral attempts to blunt the stress response, a process known as "behavioral allostasis." Although behaviors like smoking, overeating, promiscuity, and substance abuse decrease stress transiently, over time they become maladaptive and result in the unhealthy lifestyles and noncommunicable diseases that are the leading causes of morbidity and mortality. The biology of toxic stress and the concept of behavioral allostasis shed new light on the developmental origins of lifelong disease and highlight opportunities for early intervention and prevention. Future efforts to minimize the effects of childhood adversity should focus on expanding the capacity of caregivers and communities to promote (1) the safe, stable, and nurturing relationships that buffer toxic stress, and (2) the rudimentary but foundational social-emotional, language, and cognitive skills needed to develop healthy, adaptive coping skills. Building these critical caregiver and community capacities will require a public health approach with unprecedented levels of collaboration and coordination between the healthcare, childcare, early education, early intervention, and home visiting sectors.

  1. Minimal Length Scale Scenarios for Quantum Gravity.

    Science.gov (United States)

    Hossenfelder, Sabine

    2013-01-01

    We review the question of whether the fundamental laws of nature limit our ability to probe arbitrarily short distances. First, we examine what insights can be gained from thought experiments for probes of shortest distances, and summarize what can be learned from different approaches to a theory of quantum gravity. Then we discuss some models that have been developed to implement a minimal length scale in quantum mechanics and quantum field theory. These models have entered the literature as the generalized uncertainty principle or the modified dispersion relation, and have allowed the study of the effects of a minimal length scale in quantum mechanics, quantum electrodynamics, thermodynamics, black-hole physics and cosmology. Finally, we touch upon the question of ways to circumvent the manifestation of a minimal length scale in short-distance physics.
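
    The simplest way such a minimal length enters the models reviewed here is through the generalized uncertainty principle. In its most common form (conventions for the parameter β vary),

        \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2\right),

    so that \Delta x \ge \hbar/(2\Delta p) + \hbar\beta\Delta p/2, which is minimized at \Delta p = 1/\sqrt{\beta} and gives an irreducible position uncertainty \Delta x_{\min} = \hbar\sqrt{\beta}, typically identified with a length of the order of the Planck length.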

  2. Minimal Self-Models and the Free Energy Principle

    Directory of Open Access Journals (Sweden)

    Jakub eLimanowski

    2013-09-01

    Full Text Available The term "minimal phenomenal selfhood" describes the basic, pre-reflective experience of being a self (Blanke & Metzinger, 2009). Theoretical accounts of the minimal self have long recognized the importance and the ambivalence of the body as both part of the physical world, and the enabling condition for being in this world (Gallagher, 2005; Grafton, 2009). A recent account of minimal phenomenal selfhood (MPS; Metzinger, 2004a) centers on the consideration that minimal selfhood emerges as the result of basic self-modeling mechanisms, thereby being founded on pre-reflective bodily processes. The free energy principle (FEP; Friston, 2010) is a novel unified theory of cortical function that builds upon the imperative that self-organizing systems entail hierarchical generative models of the causes of their sensory input, which are optimized by minimizing free energy as an approximation of the log-likelihood of the model. The implementation of the FEP via predictive coding mechanisms and in particular the active inference principle emphasizes the role of embodiment for predictive self-modeling, which has been appreciated in recent publications. In this review, we provide an overview of these conceptions and illustrate thereby the potential power of the FEP in explaining the mechanisms underlying minimal selfhood and its key constituents, multisensory integration, interoception, agency, perspective, and the experience of mineness. We conclude that the conceptualization of MPS can be well mapped onto a hierarchical generative model furnished by the free energy principle and may constitute the basis for higher-level, cognitive forms of self-referral, as well as the understanding of other minds.
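
    The quantity being minimized under the FEP can be stated compactly; this is the standard formulation, not notation specific to the review. For sensory data s, hidden causes ψ, a generative model p(s, ψ) and a recognition density q(ψ), the variational free energy is

        F[q] = \mathbb{E}_{q(\psi)}\!\left[\ln q(\psi) - \ln p(s,\psi)\right] = D_{\mathrm{KL}}\!\left[q(\psi)\,\|\,p(\psi\mid s)\right] - \ln p(s) \;\ge\; -\ln p(s),

    so minimizing F both improves the approximate posterior q and bounds the surprise −ln p(s); active inference extends the same minimization to action, by changing s itself.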

  3. A plausible (overlooked) super-luminous supernova in the Sloan digital sky survey stripe 82 data

    International Nuclear Information System (INIS)

    Kostrzewa-Rutkowska, Zuzanna; Kozłowski, Szymon; Wyrzykowski, Łukasz; Djorgovski, S. George; Mahabal, Ashish A.; Glikman, Eilat; Koposov, Sergey

    2013-01-01

    We present the discovery of a plausible super-luminous supernova (SLSN), found in the archival data of Sloan Digital Sky Survey (SDSS) Stripe 82, called PSN 000123+000504. The supernova (SN) peaked at m_g < 19.4 mag in the second half of 2005 September, but was missed by the real-time SN hunt. The observed part of the light curve (17 epochs) showed that the rise to the maximum took over 30 days, while the decline time lasted at least 70 days (observed frame), closely resembling other SLSNe of SN 2007bi type. The spectrum of the host galaxy reveals a redshift of z = 0.281 and the distance modulus of μ = 40.77 mag. Combining this information with the SDSS photometry, we found the host galaxy to be an LMC-like irregular dwarf galaxy with an absolute magnitude of M_B = –18.2 ± 0.2 mag and an oxygen abundance of 12 + log[O/H] = 8.3 ± 0.2; hence, the SN peaked at M_g < –21.3 mag. Our SLSN follows the relation for the most energetic/super-luminous SNe exploding in low-metallicity environments, but we found no clear evidence for SLSNe to explode in low-luminosity (dwarf) galaxies only. The available information on the PSN 000123+000504 light curve suggests the magnetar-powered model as a likely scenario of this event. This SLSN is a new addition to a quickly growing family of super-luminous SNe.
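
    The quoted absolute magnitude follows directly from the apparent magnitude and the distance modulus. Neglecting K-corrections and extinction,

        M_g = m_g - \mu \;<\; 19.4 - 40.77 \;\approx\; -21.4\ \mathrm{mag},

    which is consistent with (slightly brighter than) the more conservative M_g < −21.3 mag quoted above once the small corrections at z = 0.281 are taken into account.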

  4. Minimally-aggressive gestational trophoblastic neoplasms.

    Science.gov (United States)

    Cole, Laurence A

    2012-04-01

    We have previously defined a new syndrome, "minimally-aggressive gestational trophoblastic neoplasms," in which choriocarcinoma or persistent hydatidiform mole has a minimal growth rate and becomes chemorefractory. Previously we described a new treatment protocol: waiting for hCG to rise to >3000 mIU/ml and for the disease to become more advanced, then using combination chemotherapy. Initially we found this treatment successful in 8 of 8 cases; here we find this protocol appropriate in a further 16 cases. Initially we used hyperglycosylated hCG, a limited-availability test, to identify this syndrome. Here we propose also using the hCG doubling rate to detect this syndrome. Minimally aggressive gestational trophoblastic disease can be detected by chemotherapy resistance or low hyperglycosylated hCG, disease by hyperglycosylated hCG and by the hCG doubling test. All were recommended to hold off further chemotherapy until hCG >3000 mIU/ml. One case died prior to the start of the study, one case withdrew because of a lung nodule, and one withdrew, refusing the suggested combination chemotherapy. The remaining 16 women were all successfully treated. A total of 8 plus 16, or 24 of 24, women were successfully treated using the proposed protocol, holding back on chemotherapy until hCG >3000 mIU/ml. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Developmental biology, the stem cell of biological disciplines.

    Science.gov (United States)

    Gilbert, Scott F

    2017-12-01

    Developmental biology (including embryology) is proposed as "the stem cell of biological disciplines." Genetics, cell biology, oncology, immunology, evolutionary mechanisms, neurobiology, and systems biology each has its ancestry in developmental biology. Moreover, developmental biology continues to roll on, budding off more disciplines, while retaining its own identity. While its descendant disciplines differentiate into sciences with a restricted set of paradigms, examples, and techniques, developmental biology remains vigorous, pluripotent, and relatively undifferentiated. In many disciplines, especially in evolutionary biology and oncology, the developmental perspective is being reasserted as an important research program.

  6. Simulating biological processes: stochastic physics from whole cells to colonies

    Science.gov (United States)

    Earnest, Tyler M.; Cole, John A.; Luthey-Schulten, Zaida

    2018-05-01

    The last few decades have revealed the living cell to be a crowded spatially heterogeneous space teeming with biomolecules whose concentrations and activities are governed by intrinsically random forces. It is from this randomness, however, that a vast array of precisely timed and intricately coordinated biological functions emerge that give rise to the complex forms and behaviors we see in the biosphere around us. This seemingly paradoxical nature of life has drawn the interest of an increasing number of physicists, and recent years have seen stochastic modeling grow into a major subdiscipline within biological physics. Here we review some of the major advances that have shaped our understanding of stochasticity in biology. We begin with some historical context, outlining a string of important experimental results that motivated the development of stochastic modeling. We then embark upon a fairly rigorous treatment of the simulation methods that are currently available for the treatment of stochastic biological models, with an eye toward comparing and contrasting their realms of applicability, and the care that must be taken when parameterizing them. Following that, we describe how stochasticity impacts several key biological functions, including transcription, translation, ribosome biogenesis, chromosome replication, and metabolism, before considering how the functions may be coupled into a comprehensive model of a ‘minimal cell’. Finally, we close with our expectation for the future of the field, focusing on how mesoscopic stochastic methods may be augmented with atomic-scale molecular modeling approaches in order to understand life across a range of length and time scales.
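
    The workhorse of the stochastic simulation methods reviewed here is Gillespie's direct method. The Python sketch below applies it to a minimal two-stage gene expression model (transcription, translation, and first-order degradation of mRNA and protein); the rate constants are illustrative, not taken from the review.

```python
import numpy as np

# Illustrative rate constants (per minute)
K_M, K_P, G_M, G_P = 2.0, 10.0, 0.2, 0.05

def gillespie(t_end, seed=0):
    """Direct-method SSA for: 0 -> mRNA -> mRNA + protein, with degradation of both."""
    rng = np.random.default_rng(seed)
    t, m, p = 0.0, 0, 0
    trajectory = [(t, m, p)]
    while t < t_end:
        propensities = np.array([K_M, K_P * m, G_M * m, G_P * p])
        total = propensities.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)                  # time to next reaction
        reaction = rng.choice(4, p=propensities / total)   # which reaction fires
        if reaction == 0:   m += 1                         # transcription
        elif reaction == 1: p += 1                         # translation
        elif reaction == 2: m -= 1                         # mRNA decay
        else:               p -= 1                         # protein decay
        trajectory.append((t, m, p))
    return trajectory

if __name__ == "__main__":
    traj = gillespie(200.0)
    t, m, p = traj[-1]
    print(f"after {t:.1f} min: {m} mRNA, {p} protein molecules ({len(traj)} events)")
```

    Running many replicates of such a simulation yields the copy-number distributions and intrinsic noise that the review discusses for transcription and translation.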

  7. Biological sex affects the neurobiology of autism

    Science.gov (United States)

    Lombardo, Michael V.; Suckling, John; Ruigrok, Amber N. V.; Chakrabarti, Bhismadev; Ecker, Christine; Deoni, Sean C. L.; Craig, Michael C.; Murphy, Declan G. M.; Bullmore, Edward T.; Baron-Cohen, Simon

    2013-01-01

    In autism, heterogeneity is the rule rather than the exception. One obvious source of heterogeneity is biological sex. Since autism was first recognized, males with autism have disproportionately skewed research. Females with autism have thus been relatively overlooked, and have generally been assumed to have the same underlying neurobiology as males with autism. Growing evidence, however, suggests that this is an oversimplification that risks obscuring the biological base of autism. This study seeks to answer two questions about how autism is modulated by biological sex at the level of the brain: (i) is the neuroanatomy of autism different in males and females? and (ii) does the neuroanatomy of autism fit predictions from the ‘extreme male brain’ theory of autism, in males and/or in females? Neuroanatomical features derived from voxel-based morphometry were compared in a sample of equal-sized high-functioning male and female adults with and without autism (n = 120, n = 30/group). The first question was investigated using a 2 × 2 factorial design, and by spatial overlap analyses of the neuroanatomy of autism in males and females. The second question was tested through spatial overlap analyses of specific patterns predicted by the extreme male brain theory. We found that the neuroanatomy of autism differed between adult males and females, evidenced by minimal spatial overlap (not different from that occurred under random condition) in both grey and white matter, and substantially large white matter regions showing significant sex × diagnosis interactions in the 2 × 2 factorial design. These suggest that autism manifests differently by biological sex. Furthermore, atypical brain areas in females with autism substantially and non-randomly (P males with autism. How differences in neuroanatomy relate to the similarities in cognition between males and females with autism remains to be understood. Future research should stratify by biological sex to reduce

  8. Relations between Intuitive Biological Thinking and Biological Misconceptions in Biology Majors and Nonmajors

    Science.gov (United States)

    Coley, John D.; Tanner, Kimberly

    2015-01-01

    Research and theory development in cognitive psychology and science education research remain largely isolated. Biology education researchers have documented persistent scientifically inaccurate ideas, often termed misconceptions, among biology students across biological domains. In parallel, cognitive and developmental psychologists have described intuitive conceptual systems—teleological, essentialist, and anthropocentric thinking—that humans use to reason about biology. We hypothesize that seemingly unrelated biological misconceptions may have common origins in these intuitive ways of knowing, termed cognitive construals. We presented 137 undergraduate biology majors and nonmajors with six biological misconceptions. They indicated their agreement with each statement, and explained their rationale for their response. Results indicate frequent agreement with misconceptions, and frequent use of construal-based reasoning among both biology majors and nonmajors in their written explanations. Moreover, results also show associations between specific construals and the misconceptions hypothesized to arise from those construals. Strikingly, such associations were stronger among biology majors than nonmajors. These results demonstrate important linkages between intuitive ways of thinking and misconceptions in discipline-based reasoning, and raise questions about the origins, persistence, and generality of relations between intuitive reasoning and biological misconceptions. PMID:25713093

  9. Minimization of mixed waste in explosive testing operations

    International Nuclear Information System (INIS)

    Gonzalez, M.A.; Sator, F.E.; Simmons, L.F.

    1993-02-01

    In the 1970s and 1980s, efforts to manage mixed waste and reduce pollution focused largely on post-process measures. In the late 1980s, the approach to waste management and pollution control changed, focusing on minimization and prevention rather than abatement, treatment, and disposal. The new approach, and the formulated guidance from the US Department of Energy, was to take all necessary measures to minimize waste and prevent the release of pollutants to the environment. Two measures emphasized in particular were source reduction (reducing the volume and toxicity of the waste source) and recycling. In 1988, a waste minimization and pollution prevention program was initiated at Site 300, where the Lawrence Livermore National Laboratory (LLNL) conducts explosives testing. LLNL's Defense Systems/Nuclear Design (DS/ND) Program has adopted a variety of conservation techniques to minimize waste generation and cut disposal costs associated with ongoing operations. The techniques include minimizing the generation of depleted uranium and lead mixed waste through inventory control and material substitution measures and through developing a management system to recycle surplus explosives. The changes implemented have reduced annual mixed waste volumes by more than 95% and reduced overall radioactive waste generation (low-level and mixed) by more than 75%. The measures employed were cost-effective and easily implemented

  10. Environmental Restoration Progam Waste Minimization and Pollution Prevention Awareness Program Plan

    Energy Technology Data Exchange (ETDEWEB)

    Grumski, J. T.; Swindle, D. W.; Bates, L. D.; DeLozier, M. F.P.; Frye, C. E.; Mitchell, M. E.

    1991-09-30

    In response to DOE Order 5400.1 this plan outlines the requirements for a Waste Minimization and Pollution Prevention Awareness Program for the Environmental Restoration (ER) Program at Martin Marietta Energy System, Inc. Statements of the national, Department of Energy, Energy Systems, and Energy Systems ER Program policies on waste minimization are included and reflect the attitudes of these organizations and their commitment to the waste minimization effort. Organizational responsibilities for the waste minimization effort are clearly defined and discussed, and the program objectives and goals are set forth. Waste assessment is addressed as being a key element in developing the waste generation baseline. There are discussions on the scope of ER-specific waste minimization techniques and approaches to employee awareness and training. There is also a discussion on the process for continual evaluation of the Waste Minimization Program. Appendixes present an implementation schedule for the Waste Minimization and Pollution Prevention Program, the program budget, an organization chart, and the ER waste minimization policy.

  11. Charge and energy minimization in electrical/magnetic stimulation of nervous tissue.

    Science.gov (United States)

    Jezernik, Saso; Sinkjaer, Thomas; Morari, Manfred

    2010-08-01

    In this work we address the problem of stimulating nervous tissue with the minimal necessary energy at reduced/minimal charge. Charge minimization is related to a valid safety concern (avoidance and reduction of stimulation-induced tissue and electrode damage). Energy minimization plays a role in battery-driven electrical or magnetic stimulation systems (increased lifetime, repetition rates, reduction of power requirements, thermal management). Extensive new theoretical results are derived by employing an optimal control theory framework. These results include derivation of the optimal electrical stimulation waveform for a mixed energy/charge minimization problem, derivation of the charge-balanced energy-minimal electrical stimulation waveform, solutions of a pure charge minimization problem with and without a constraint on the stimulation amplitude, and derivation of the energy-minimal magnetic stimulation waveform. Depending on the set stimulus pulse duration, energy and charge reductions of up to 80% are deemed possible. Results are verified in simulations with an active, mammalian-like nerve fiber model.
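
    Schematically, and as a generic statement of this kind of problem rather than the paper's exact formulation, the mixed objective treated by optimal control is a weighted combination of stimulus energy and delivered charge over the pulse duration τ,

        J[i] = \alpha \int_0^{\tau} i(t)^2\,dt \;+\; (1-\alpha)\int_0^{\tau} |i(t)|\,dt, \qquad 0 \le \alpha \le 1,

    minimized subject to a nerve-fiber membrane model reaching its firing threshold by the end of the pulse, with an additional constraint \int_0^{\tau} i(t)\,dt = 0 in the charge-balanced variants. Setting α = 1 corresponds to a purely energy-minimal waveform and α = 0 to a purely charge-minimal one.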

  12. Environmental Restoration Progam Waste Minimization and Pollution Prevention Awareness Program Plan

    International Nuclear Information System (INIS)

    1991-01-01

    In response to DOE Order 5400.1 this plan outlines the requirements for a Waste Minimization and Pollution Prevention Awareness Program for the Environmental Restoration (ER) Program at Martin Marietta Energy System, Inc. Statements of the national, Department of Energy, Energy Systems, and Energy Systems ER Program policies on waste minimization are included and reflect the attitudes of these organizations and their commitment to the waste minimization effort. Organizational responsibilities for the waste minimization effort are clearly defined and discussed, and the program objectives and goals are set forth. Waste assessment is addressed as being a key element in developing the waste generation baseline. There are discussions on the scope of ER-specific waste minimization techniques and approaches to employee awareness and training. There is also a discussion on the process for continual evaluation of the Waste Minimization Program. Appendixes present an implementation schedule for the Waste Minimization and Pollution Prevention Program, the program budget, an organization chart, and the ER waste minimization policy

  13. National Institutes of Health: Mixed waste minimization and treatment

    International Nuclear Information System (INIS)

    1995-08-01

    The Appalachian States Low-Level Radioactive Waste Commission requested the US Department of Energy's National Low-Level Waste Management Program (NLLWMP) to assist the biomedical community in becoming more knowledgeable about its mixed waste streams, to help minimize the mixed waste stream generated by the biomedical community, and to identify applicable treatment technologies for these mixed waste streams. As the first step in the waste minimization process, liquid low-level radioactive mixed waste (LLMW) streams generated at the National Institutes of Health (NIH) were characterized and combined into similar process categories. This report identifies possible waste minimization and treatment approaches for the LLMW generated by the biomedical community identified in DOE/LLW-208. In development of the report, on site meetings were conducted with NIH personnel responsible for generating each category of waste identified as lacking disposal options. Based on the meetings and general waste minimization guidelines, potential waste minimization options were identified

  14. National Institutes of Health: Mixed waste minimization and treatment

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-01

    The Appalachian States Low-Level Radioactive Waste Commission requested the US Department of Energy's National Low-Level Waste Management Program (NLLWMP) to assist the biomedical community in becoming more knowledgeable about its mixed waste streams, to help minimize the mixed waste stream generated by the biomedical community, and to identify applicable treatment technologies for these mixed waste streams. As the first step in the waste minimization process, liquid low-level radioactive mixed waste (LLMW) streams generated at the National Institutes of Health (NIH) were characterized and combined into similar process categories. This report identifies possible waste minimization and treatment approaches for the LLMW generated by the biomedical community identified in DOE/LLW-208. In development of the report, on site meetings were conducted with NIH personnel responsible for generating each category of waste identified as lacking disposal options. Based on the meetings and general waste minimization guidelines, potential waste minimization options were identified.

  15. Smartphone-assisted minimally invasive neurosurgery.

    Science.gov (United States)

    Mandel, Mauricio; Petito, Carlo Emanuel; Tutihashi, Rafael; Paiva, Wellingson; Abramovicz Mandel, Suzana; Gomes Pinto, Fernando Campos; Ferreira de Andrade, Almir; Teixeira, Manoel Jacobsen; Figueiredo, Eberval Gadelha

    2018-03-13

    OBJECTIVE Advances in video and fiber optics since the 1990s have led to the development of several commercially available high-definition neuroendoscopes. This technological improvement, however, has been surpassed by the smartphone revolution. With the increasing integration of smartphone technology into medical care, the introduction of these high-quality computerized communication devices with built-in digital cameras offers new possibilities in neuroendoscopy. The aim of this study was to investigate the usefulness of smartphone-endoscope integration in performing different types of minimally invasive neurosurgery. METHODS The authors present a new surgical tool that integrates a smartphone with an endoscope by use of a specially designed adapter, thus eliminating the need for the video system customarily used for endoscopy. The authors used this novel combined system to perform minimally invasive surgery on patients with various neuropathological disorders, including cavernomas, cerebral aneurysms, hydrocephalus, subdural hematomas, contusional hematomas, and spontaneous intracerebral hematomas. RESULTS The new endoscopic system featuring smartphone-endoscope integration was used by the authors in the minimally invasive surgical treatment of 42 patients. All procedures were successfully performed, and no complications related to the use of the new method were observed. The quality of the images obtained with the smartphone was high enough to provide adequate information to the neurosurgeons, as smartphone cameras can record images in high definition or 4K resolution. Moreover, because the smartphone screen moves along with the endoscope, surgical mobility was enhanced with the use of this method, facilitating more intuitive use. In fact, this increased mobility was identified as the greatest benefit of the use of the smartphone-endoscope system compared with the use of the neuroendoscope with the standard video set. CONCLUSIONS Minimally invasive approaches

  16. QuEChERS extraction of benzodiazepines in biological matrices

    Directory of Open Access Journals (Sweden)

    Jessica L. Westland

    2013-12-01

    Full Text Available Two common analytical chemical problems often encountered when using chromatographic techniques in drug analysis are matrix interferences and ion suppression. Common sample preparation often involves the dilution of the sample prior to injection onto an instrument, especially for liquid chromatography–mass spectrometry (LC–MS) analyses. This practice frequently does not minimize or eliminate conditions that may cause ion suppression and therefore suffers from reduced method robustness. In order to achieve higher quality results and minimize possible interferences, various sample preparation techniques may be considered. Through the use of QuEChERS (pronounced "catchers"), a novel sample preparation technique used for high-aqueous-content samples, benzodiazepines can be extracted from biological fluids such as blood and urine. This approach has shown increased recoveries of target compounds when using quantification by both external and internal standard. This increase in the recoveries has been attributed to a matrix enhancement and was determined through the use of the method of standard addition. While improving the overall analytical method for gas chromatography–mass spectrometry (GC–MS) analysis, it is not clear if this approach represents an overall benefit for laboratories that have both GC–MS and high performance liquid chromatography tandem mass spectrometry (HPLC–MS/MS) capability. Given the evidence of variable ionization (enhancement, ion source inertness, etc.), the method of quantification should be focused on in future studies. Keywords: Forensic science, QuEChERS, Drug analysis, Benzodiazepines, Gas Chromatography–Mass Spectrometry, Biological samples

  17. Role of Muramyl Dipeptide in Lipopolysaccharide-Mediated Biological Activity and Osteoclast Activity

    Directory of Open Access Journals (Sweden)

    Hideki Kitaura

    2018-01-01

    Full Text Available Lipopolysaccharide (LPS) is an endotoxin and bacterial cell wall component that is capable of inducing inflammation and immunological activity. Muramyl dipeptide (MDP), the minimal essential structural unit responsible for the immunological activity of peptidoglycans, is another inflammation-inducing molecule that is ubiquitously expressed by bacteria. Several studies have shown that inflammation-related biological activities were synergistically induced by interactions between LPS and MDP. MDP synergistically enhances production of proinflammatory cytokines that are induced by LPS exposure. Injection of MDP induces lethal shock in mice challenged with LPS. LPS also induces osteoclast formation and pathological bone resorption; MDP enhances LPS induction of both processes. Furthermore, MDP enhances the LPS-induced receptor activator of NF-κB ligand (RANKL) expression and toll-like receptor 4 (TLR4) expression both in vivo and in vitro. Additionally, MDP enhances LPS-induced mitogen-activated protein kinase (MAPK) signaling in stromal cells. Taken together, these findings suggest that MDP plays an important role in LPS-induced biological activities. This review discusses the role of MDP in LPS-mediated biological activities, primarily in relation to osteoclastogenesis.

  18. On the convergence of nonconvex minimization methods for image recovery.

    Science.gov (United States)

    Xiao, Jin; Ng, Michael Kwok-Po; Yang, Yu-Fei

    2015-05-01

    Nonconvex nonsmooth regularization methods have been shown to be effective for restoring images with neat edges. Fast alternating minimization schemes have also been proposed and developed to solve the nonconvex nonsmooth minimization problem. The main contribution of this paper is to show the convergence of these alternating minimization schemes, based on the Kurdyka-Łojasiewicz property. In particular, we show that the iterates generated by the alternating minimization scheme converge to a critical point of this nonconvex nonsmooth objective function. We also extend the analysis to a nonconvex nonsmooth regularization model with box constraints, and obtain similar convergence results for the related minimization algorithm. Numerical examples are given to illustrate our convergence analysis.
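
    To make the alternating-minimization idea in this record concrete, here is a toy sketch in Python: a small nonconvex objective is minimized exactly in one variable at a time, and the iterates settle at a critical point. This only illustrates the structure of such a scheme, not the paper's image-recovery model or its Kurdyka-Łojasiewicz analysis; the objective and starting point are arbitrary choices for the demo.

```python
# Toy sketch only (not the paper's image-recovery model): alternating exact
# minimization of the nonconvex objective
#     f(x, z) = (x*z - 1)^2 + mu*(x^2 + z^2).
# Each subproblem is a 1-D quadratic with a closed-form minimizer, and the
# iterates settle at a critical point of f, mirroring the convergence
# behaviour the paper establishes via the Kurdyka-Lojasiewicz property.

mu = 0.1

def argmin_first(other):
    # minimize (x*other - 1)^2 + mu*x^2 over x  =>  x = other / (other^2 + mu)
    return other / (other * other + mu)

def f(x, z):
    return (x * z - 1.0) ** 2 + mu * (x * x + z * z)

x, z = 2.0, -3.0                      # arbitrary starting point
for _ in range(100):
    x = argmin_first(z)               # exact minimization in x with z fixed
    z = argmin_first(x)               # same closed form for z with x fixed

print(f"approximate critical point: x={x:.4f}, z={z:.4f}, f={f(x, z):.6f}")
```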

  19. Signals of non-minimal Higgs sectors at future colliders

    International Nuclear Information System (INIS)

    Akeroyd, A.G.

    1996-08-01

    This thesis concerns the study of extended Higgs sectors at future colliders. Such studies are well motivated, since enlarged Higgs models are a necessity in many extensions of the Standard Model (SM), although these structures may also be considered purely in the context of the SM, in which case they are called the 'non-minimal SM'. The continuous theme of the thesis is the task of distinguishing between the (many) theoretically sound non-minimal Higgs sectors at forthcoming colliders. If a Higgs boson is found, it is imperative to know from which model it originates. In particular, the possible differences between the Higgs sectors of the Minimal Supersymmetric Standard Model (MSSM) and the non-minimal SM are highlighted. (author)

  20. Evolved Minimal Frustration in Multifunctional Biomolecules.

    Science.gov (United States)

    Röder, Konstantin; Wales, David J

    2018-05-25

    Protein folding is often viewed in terms of a funnelled potential or free energy landscape. A variety of experiments now indicate the existence of multifunnel landscapes, associated with multifunctional biomolecules. Here, we present evidence that these systems have evolved to exhibit the minimal number of funnels required to fulfil their cellular functions, suggesting an extension to the principle of minimum frustration. We find that minimal disruptive mutations result in additional funnels, and the associated structural ensembles become more diverse. The same trends are observed in an atomic cluster. These observations suggest guidelines for rational design of engineered multifunctional biomolecules.

  1. Energy-efficient ECG compression on wireless biosensors via minimal coherence sensing and weighted ℓ₁ minimization reconstruction.

    Science.gov (United States)

    Zhang, Jun; Gu, Zhenghui; Yu, Zhu Liang; Li, Yuanqing

    2015-03-01

    Low energy consumption is crucial for body area networks (BANs). In BAN-enabled ECG monitoring, continuous monitoring requires the sensor nodes to transmit a huge amount of data to the sink node, which leads to excessive energy consumption. To reduce airtime over energy-hungry wireless links, this paper presents an energy-efficient compressed sensing (CS)-based approach for on-node ECG compression. First, an algorithm called minimal mutual coherence pursuit is proposed to construct sparse binary measurement matrices, which can be used to encode the ECG signals with superior performance and extremely low complexity. Second, in order to minimize the data rate required for faithful reconstruction, a weighted ℓ1 minimization model is derived by exploiting multisource prior knowledge in the wavelet domain. Experimental results on the MIT-BIH arrhythmia database reveal that the proposed approach can obtain a higher compression ratio than state-of-the-art CS-based methods. Together with its low encoding complexity, our approach can achieve significant energy savings in both the encoding process and wireless transmission.
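
    The reconstruction step described in this record can be illustrated with a small, self-contained sketch. The Python code below builds a sparse binary measurement matrix, takes compressed measurements of a synthetic sparse signal, and recovers it by weighted ℓ1 minimization cast as a linear program. It assumes sparsity in the canonical basis and uniform weights; the paper's matrix construction and wavelet-domain weighting are more elaborate, so treat this as a schematic only.

```python
# Sketch under simplifying assumptions: the signal is sparse in the canonical
# basis (not a wavelet basis) and the weights are uniform. The paper's sparse
# binary matrix design and multisource priors are not reproduced here.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 128, 48, 6                     # signal length, measurements, sparsity

# Sparse binary measurement matrix: d ones per column at random rows.
d = 4
Phi = np.zeros((m, n))
for j in range(n):
    Phi[rng.choice(m, d, replace=False), j] = 1.0

x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
y = Phi @ x_true

# Weighted l1 minimization: min sum_i w_i |x_i|  s.t.  Phi x = y,
# written as an LP with the split x = u - v, u >= 0, v >= 0.
w = np.ones(n)                           # prior knowledge would set nonuniform weights
c = np.concatenate([w, w])
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]
print("relative recovery error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

    With enough measurements the equality-constrained LP typically recovers the sparse signal accurately; nonuniform weights w are where the wavelet-domain prior knowledge discussed in the abstract would enter.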

  2. A Brief Introduction to Chinese Biological Abstracts

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Chinese Biological Abstracts, sponsored by the Library, the Shanghai Institutes for Biological Sciences, and the Biological Documentation and Information Network, all of the Chinese Academy of Sciences, commenced publication in 1987 to provide access to Chinese information in the field of biology.

  3. Adoption of waste minimization technology to benefit electroplaters

    Energy Technology Data Exchange (ETDEWEB)

    Ching, E.M.K.; Li, C.P.H.; Yu, C.M.K. [Hong Kong Productivity Council, Kowloon (Hong Kong)

    1996-12-31

    Because of increasingly stringent environmental legislation and enhanced environmental awareness, electroplaters in Hong Kong are paying more heed to protect the environment. To comply with the array of environmental controls, electroplaters can no longer rely solely on the end-of-pipe approach as a means for abating their pollution problems under the particular local industrial environment. The preferred approach is to adopt waste minimization measures that yield both economic and environmental benefits. This paper gives an overview of electroplating activities in Hong Kong, highlights their characteristics, and describes the pollution problems associated with conventional electroplating operations. The constraints of using pollution control measures to achieve regulatory compliance are also discussed. Examples and case studies are given on some low-cost waste minimization techniques readily available to electroplaters, including dragout minimization and water conservation techniques. Recommendations are given as to how electroplaters can adopt and exercise waste minimization techniques in their operations. 1 tab.

  4. Qualitative reasoning for biological network inference from systematic perturbation experiments.

    Science.gov (United States)

    Badaloni, Silvana; Di Camillo, Barbara; Sambo, Francesco

    2012-01-01

    The systematic perturbation of the components of a biological system has been proven among the most informative experimental setups for the identification of causal relations between the components. In this paper, we present Systematic Perturbation-Qualitative Reasoning (SPQR), a novel Qualitative Reasoning approach to automate the interpretation of the results of systematic perturbation experiments. Our method is based on a qualitative abstraction of the experimental data: for each perturbation experiment, measured values of the observed variables are modeled as lower, equal or higher than the measurements in the wild type condition, when no perturbation is applied. The algorithm exploits a set of IF-THEN rules to infer causal relations between the variables, analyzing the patterns of propagation of the perturbation signals through the biological network, and is specifically designed to minimize the rate of false positives among the inferred relations. Tested on both simulated and real perturbation data, SPQR indeed exhibits a significantly higher precision than the state of the art.
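
    A minimal sketch of the qualitative abstraction underlying this kind of reasoning is shown below: each measurement is discretized as lower, equal, or higher relative to the wild type, and a perturbed node is linked to every node whose qualitative value changes. The data, tolerance, and single IF-THEN rule are hypothetical; SPQR's actual rule set is richer and is designed to separate direct from indirect effects and to keep false positives low.

```python
# Minimal sketch of the qualitative-abstraction idea (not the SPQR rule set).
WT = {"A": 1.0, "B": 2.0, "C": 0.5}            # wild-type levels (hypothetical)
experiments = {                                 # knock-down of one node each
    "A": {"A": 0.1, "B": 0.8, "C": 0.5},
    "B": {"A": 1.0, "B": 0.2, "C": 0.5},
}
TOL = 0.2                                       # relative tolerance for "equal"

def qualitative(value, reference):
    if abs(value - reference) <= TOL * reference:
        return "equal"
    return "lower" if value < reference else "higher"

edges = set()
for perturbed, data in experiments.items():
    for node, value in data.items():
        if node != perturbed and qualitative(value, WT[node]) != "equal":
            # IF perturbing `perturbed` changes `node` THEN infer an influence.
            edges.add((perturbed, node))

print(edges)   # {('A', 'B')}: knocking down A lowered B, so A affects B (possibly indirectly)
```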

  5. Cost-effectiveness analysis in minimally invasive spine surgery.

    Science.gov (United States)

    Al-Khouja, Lutfi T; Baron, Eli M; Johnson, J Patrick; Kim, Terrence T; Drazin, Doniel

    2014-06-01

    Medical care has been evolving with the increased influence of a value-based health care system. As a result, more emphasis is being placed on ensuring cost-effectiveness and utility in the services provided to patients. This study examines this development with respect to minimally invasive spine surgery (MISS) costs. A literature review using PubMed, the Cost-Effectiveness Analysis (CEA) Registry, and the National Health Service Economic Evaluation Database (NHS EED) was performed. Papers were included in the study if they reported costs associated with minimally invasive spine surgery (MISS). If there was no mention of cost, CEA, cost-utility analysis (CUA), quality-adjusted life years (QALY), quality, or outcomes, then the article was excluded. Fourteen studies reporting costs associated with MISS in 12,425 patients (3675 undergoing minimally invasive procedures and 8750 undergoing open procedures) were identified through PubMed, the CEA Registry, and NHS EED. The percent cost difference between minimally invasive and open approaches ranged from 2.54% to 33.68%, all indicating cost savings with a minimally invasive surgical approach. Average length of stay (LOS) for minimally invasive surgery ranged from 0.93 days to 5.1 days, compared with 1.53 days to 12 days for an open approach. All studies reporting estimated blood loss (EBL) reported lower volume loss with an MISS approach (range 10-392.5 ml) than with an open approach (range 55-535.5 ml). There are currently an insufficient number of published studies reporting the costs of MISS. Of the studies published, none have followed a standardized method of reporting and analyzing cost data. Preliminary findings analyzing the 14 studies showed both cost savings and better outcomes in MISS compared with an open approach. However, more Level I CEA/CUA studies including cost/QALY evaluations with specifics of the techniques utilized need to be reported in a standardized manner to make more accurate conclusions on the cost effectiveness of

  6. Relations between intuitive biological thinking and biological misconceptions in biology majors and nonmajors.

    Science.gov (United States)

    Coley, John D; Tanner, Kimberly

    2015-03-02

    Research and theory development in cognitive psychology and science education research remain largely isolated. Biology education researchers have documented persistent scientifically inaccurate ideas, often termed misconceptions, among biology students across biological domains. In parallel, cognitive and developmental psychologists have described intuitive conceptual systems--teleological, essentialist, and anthropocentric thinking--that humans use to reason about biology. We hypothesize that seemingly unrelated biological misconceptions may have common origins in these intuitive ways of knowing, termed cognitive construals. We presented 137 undergraduate biology majors and nonmajors with six biological misconceptions. They indicated their agreement with each statement, and explained their rationale for their response. Results indicate frequent agreement with misconceptions, and frequent use of construal-based reasoning among both biology majors and nonmajors in their written explanations. Moreover, results also show associations between specific construals and the misconceptions hypothesized to arise from those construals. Strikingly, such associations were stronger among biology majors than nonmajors. These results demonstrate important linkages between intuitive ways of thinking and misconceptions in discipline-based reasoning, and raise questions about the origins, persistence, and generality of relations between intuitive reasoning and biological misconceptions. © 2015 J. D. Coley and K. Tanner. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  7. From microarray to biology: an integrated experimental, statistical and in silico analysis of how the extracellular matrix modulates the phenotype of cancer cells

    OpenAIRE

    Centola Michael B; Dozmorov Igor; Buethe David D; Saban Ricardo; Hauser Paul J; Kyker Kimberly D; Dozmorov Mikhail G; Culkin Daniel J; Hurst Robert E

    2008-01-01

    Abstract A statistically robust and biologically-based approach for analysis of microarray data is described that integrates independent biological knowledge and data with a global F-test for finding genes of interest that minimizes the need for replicates when used for hypothesis generation. First, each microarray is normalized to its noise level around zero. The microarray dataset is then globally adjusted by robust linear regression. Second, genes of interest that capture significant respo...

  8. Minimal Length Scale Scenarios for Quantum Gravity

    Directory of Open Access Journals (Sweden)

    Sabine Hossenfelder

    2013-01-01

    Full Text Available We review the question of whether the fundamental laws of nature limit our ability to probe arbitrarily short distances. First, we examine what insights can be gained from thought experiments for probes of shortest distances, and summarize what can be learned from different approaches to a theory of quantum gravity. Then we discuss some models that have been developed to implement a minimal length scale in quantum mechanics and quantum field theory. These models have entered the literature as the generalized uncertainty principle or the modified dispersion relation, and have allowed the study of the effects of a minimal length scale in quantum mechanics, quantum electrodynamics, thermodynamics, black-hole physics and cosmology. Finally, we touch upon the question of ways to circumvent the manifestation of a minimal length scale in short-distance physics.
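
    For orientation, one widely studied one-parameter form of the generalized uncertainty principle mentioned in this record can be written as follows (conventions for the sign and numerical factor of the correction term vary across the literature):

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\,\bigl(1 + \beta\,(\Delta p)^{2}\bigr),
\qquad \beta \sim \frac{\ell_{\mathrm{Pl}}^{2}}{\hbar^{2}},
```

    which implies a smallest resolvable position uncertainty of Δx_min = ħ√β, of the order of the Planck length.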

  9. Developmental biology, the stem cell of biological disciplines

    OpenAIRE

    Gilbert, Scott F.

    2017-01-01

    Developmental biology (including embryology) is proposed as "the stem cell of biological disciplines.” Genetics, cell biology, oncology, immunology, evolutionary mechanisms, neurobiology, and systems biology each has its ancestry in developmental biology. Moreover, developmental biology continues to roll on, budding off more disciplines, while retaining its own identity. While its descendant disciplines differentiate into sciences with a restricted set of paradigms, examples, and techniques, ...

  10. Minimizing TLD-DRD differences

    International Nuclear Information System (INIS)

    Riley, D.L.; McCoy, R.A.; Connell, W.D.

    1987-01-01

    When substantial differences exist in the exposures recorded by TLDs and DRDs, it is often necessary to perform an exposure investigation to reconcile the difference. In working with several operating plants, the authors have observed a number of causes for these differences. This paper outlines these observations and discusses procedures that can be used to minimize them.

  11. Biological intrusion barriers for large-volume waste-disposal sites

    International Nuclear Information System (INIS)

    Hakonson, T.E.; Cline, J.F.; Rickard, W.H.

    1982-01-01

    Intrusion of plants and animals into shallow land burial sites, with subsequent mobilization of toxic and radiotoxic materials, has occurred. Based on recent pathway modeling studies, such intrusions can contribute to the dose received by man. This paper describes past work on developing biological intrusion barrier systems for application to large-volume waste site stabilization. State-of-the-art concepts employing rock and chemical barriers are discussed relative to long-term serviceability and cost of application. The interaction of bio-intrusion barrier systems with other processes affecting trench cover stability is discussed to ensure that trench cover designs minimize the potential dose to man. 3 figures, 6 tables

  12. Perturbed Yukawa textures in the minimal seesaw model

    Energy Technology Data Exchange (ETDEWEB)

    Rink, Thomas; Schmitz, Kai [Max Planck Institute for Nuclear Physics (MPIK),69117 Heidelberg (Germany)

    2017-03-29

    We revisit the minimal seesaw model, i.e., the type-I seesaw mechanism involving only two right-handed neutrinos. This model represents an important minimal benchmark scenario for future experimental updates on neutrino oscillations. It features four real parameters that cannot be fixed by the current data: two CP-violating phases, δ and σ, as well as one complex parameter, z, that is experimentally inaccessible at low energies. The parameter z controls the structure of the neutrino Yukawa matrix at high energies, which is why it may be regarded as a label or index for all UV completions of the minimal seesaw model. The fact that z encompasses only two real degrees of freedom allows us to systematically scan the minimal seesaw model over all of its possible UV completions. In doing so, we address the following question: suppose δ and σ should be measured at particular values in the future — to what extent is one then still able to realize approximate textures in the neutrino Yukawa matrix? Our analysis, thus, generalizes previous studies of the minimal seesaw model based on the assumption of exact texture zeros. In particular, our study allows us to assess the theoretical uncertainty inherent to the common texture ansatz. One of our main results is that a normal light-neutrino mass hierarchy is, in fact, still consistent with a two-zero Yukawa texture, provided that the two texture zeros receive corrections at the level of O(10 %). While our numerical results pertain to the minimal seesaw model only, our general procedure appears to be applicable to other neutrino mass models as well.
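
    As background for this record, the light-neutrino mass matrix in the type-I seesaw with two right-handed neutrinos follows the standard seesaw relation (a textbook formula, not a result specific to this paper):

```latex
m_{\nu} \;\simeq\; -\, m_{D} \, M_{R}^{-1} \, m_{D}^{\mathsf{T}},
```

    where m_D is the 3×2 Dirac mass matrix generated by the neutrino Yukawa couplings and M_R is the 2×2 Majorana mass matrix of the two heavy states; with only two right-handed neutrinos one light neutrino remains essentially massless, which is what makes the minimal model so predictive.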

  13. Bioprecipitation: a feedback cycle linking earth history, ecosystem dynamics and land use through biological ice nucleators in the atmosphere.

    Science.gov (United States)

    Morris, Cindy E; Conen, Franz; Alex Huffman, J; Phillips, Vaughan; Pöschl, Ulrich; Sands, David C

    2014-02-01

    Landscapes influence precipitation via the water vapor and energy fluxes they generate. Biologically active landscapes also generate aerosols containing microorganisms, some being capable of catalyzing ice formation and crystal growth in clouds at temperatures near 0 °C. The resulting precipitation is beneficial for the growth of plants and microorganisms. Mounting evidence from observations and numerical simulations support the plausibility of a bioprecipitation feedback cycle involving vegetated landscapes and the microorganisms they host. Furthermore, the evolutionary history of ice nucleation-active bacteria such as Pseudomonas syringae supports that they have been part of this process on geological time scales since the emergence of land plants. Elucidation of bioprecipitation feedbacks involving landscapes and their microflora could contribute to appraising the impact that modified landscapes have on regional weather and biodiversity, and to avoiding inadvertent, negative consequences of landscape management. © 2013 John Wiley & Sons Ltd.

  14. Statistically Efficient Construction of α-Risk-Minimizing Portfolio

    Directory of Open Access Journals (Sweden)

    Hiroyuki Taniai

    2012-01-01

    Full Text Available We propose a semiparametrically efficient estimator for α-risk-minimizing portfolio weights. Based on the work of Bassett et al. (2004), an α-risk-minimizing portfolio optimization is formulated as a linear quantile regression problem. The quantile regression method uses a pseudolikelihood based on an asymmetric Laplace reference density, and asymptotic properties such as consistency and asymptotic normality are obtained. We apply the results of Hallin et al. (2008) to the problem of constructing α-risk-minimizing portfolios using residual signs and ranks and a general reference density. Monte Carlo simulations assess the performance of the proposed method. Empirical applications are also investigated.
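
    The optimization behind this record can be sketched concretely. The Python snippet below minimizes the α-risk (expected shortfall) of a long-only portfolio over simulated return scenarios, using the linear-programming form of the pinball/check-function objective that the quantile-regression formulation builds on. The return data, tail level, and long-only constraint are hypothetical, and none of the paper's semiparametric rank-based estimation is reproduced.

```python
# Sketch of alpha-risk (CVaR) minimization as a linear program, i.e. the
# pinball-loss structure behind the quantile-regression formulation.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
T, n, alpha = 250, 4, 0.05                 # scenarios, assets, tail level
R = rng.normal(0.001, 0.02, size=(T, n))   # simulated daily returns (hypothetical)

# Variables: weights w (n), VaR level zeta (1), shortfalls u (T).
c = np.concatenate([np.zeros(n), [1.0], np.full(T, 1.0 / (alpha * T))])

# u_t >= -R_t w - zeta   <=>   -R_t w - zeta - u_t <= 0
A_ub = np.hstack([-R, -np.ones((T, 1)), -np.eye(T)])
b_ub = np.zeros(T)

A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(T)]).reshape(1, -1)  # sum w = 1
b_eq = np.array([1.0])

bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * T             # long-only
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
print("alpha-risk-minimizing weights:", np.round(res.x[:n], 3))
```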

  15. Geometry of minimal rational curves on Fano manifolds

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, J -M [Korea Institute for Advanced Study, Seoul (Korea, Republic of)

    2001-12-15

    This lecture is an introduction to my joint project with N. Mok where we develop a geometric theory of Fano manifolds of Picard number 1 by studying the collection of tangent directions of minimal rational curves through a generic point. After a sketch of some historical background, the fundamental object of this project, the variety of minimal rational tangents, is defined and various examples are examined. Then some results on the variety of minimal rational tangents are discussed including an extension theorem for holomorphic maps preserving the geometric structure. Some applications of this theory to the stability of the tangent bundles and the rigidity of generically finite morphisms are given. (author)

  16. Lawrence Livermore National Laboratory (LLNL) Waste Minimization Program Plan

    International Nuclear Information System (INIS)

    Heckman, R.A.; Tang, W.R.

    1989-01-01

    This Program Plan document describes the background of the waste minimization field at Lawrence Livermore National Laboratory (LLNL) and refers to the significant studies that have influenced legislative efforts at both the federal and state levels. A short history of formal LLNL waste minimization efforts is provided. Also included are general findings from analysis of work to date, with emphasis on source reduction findings. A short summary is provided of current regulations and probable future legislation that may affect waste minimization methodology. The LLNL Waste Minimization Program Plan is designed to be dynamic and flexible, so as to meet current regulations while remaining able to respond to an ever-changing regulatory environment. 19 refs., 12 figs., 8 tabs

  17. Rewiring cells: synthetic biology as a tool to interrogate the organizational principles of living systems.

    Science.gov (United States)

    Bashor, Caleb J; Horwitz, Andrew A; Peisajovich, Sergio G; Lim, Wendell A

    2010-01-01

    The living cell is an incredibly complex entity, and the goal of predictively and quantitatively understanding its function is one of the next great challenges in biology. Much of what we know about the cell concerns its constituent parts, but to a great extent we have yet to decode how these parts are organized to yield complex physiological function. Classically, we have learned about the organization of cellular networks by disrupting them through genetic or chemical means. The emerging discipline of synthetic biology offers an additional, powerful approach to study systems. By rearranging the parts that comprise existing networks, we can gain valuable insight into the hierarchical logic of the networks and identify the modular building blocks that evolution uses to generate innovative function. In addition, by building minimal toy networks, one can systematically explore the relationship between network structure and function. Here, we outline recent work that uses synthetic biology approaches to investigate the organization and function of cellular networks, and describe a vision for a synthetic biology toolkit that could be used to interrogate the design principles of diverse systems.

  18. Foraging site selection of two subspecies of Bar-tailed Godwit Limosa lapponica: time minimizers accept greater predation danger than energy minimizers

    NARCIS (Netherlands)

    Duijns, S.; Dijk, van J.G.B.; Spaans, B.; Jukema, J.; Boer, de W.F.; Piersma, Th.

    2009-01-01

    Different spatial distributions of food abundance and predators may urge birds to make a trade-off between food intake and danger. Such a trade-off might be solved in different ways in migrant birds that either follow a time-minimizing or energy-minimizing strategy; these strategies have been

  19. Foraging site selection of two subspecies of Bar-tailed Godwit Limosa lapponica : time minimizers accept greater predation danger than energy minimizers

    NARCIS (Netherlands)

    Duijns, Sjoerd; van Dijk, Jacintha G. B.; Spaans, Bernard; Jukema, Joop; de Boer, Willem F.; Piersma, Theunis

    2009-01-01

    Different spatial distributions of food abundance and predators may urge birds to make a trade-off between food intake and danger. Such a trade-off might be solved in different ways in migrant birds that either follow a time-minimizing or energy-minimizing strategy; these strategies have been

  20. SAR image regularization with fast approximate discrete minimization.

    Science.gov (United States)

    Denis, Loïc; Tupin, Florence; Darbon, Jérôme; Sigelle, Marc

    2009-07-01

    Synthetic aperture radar (SAR) images, like other coherent imaging modalities, suffer from speckle noise. The presence of this noise makes the automatic interpretation of images a challenging task, and noise reduction is often a prerequisite for successful use of classical image processing algorithms. Numerous approaches have been proposed to filter speckle noise. Markov random field (MRF) modelization provides a convenient way to express both data fidelity constraints and desirable properties of the filtered image. In this context, total variation minimization has been extensively used to constrain the oscillations in the regularized image while preserving its edges. Speckle noise follows heavy-tailed distributions, and the MRF formulation leads to a minimization problem involving nonconvex log-likelihood terms. Such a minimization can be performed efficiently by computing minimum cuts on weighted graphs. Due to memory constraints, exact minimization, although theoretically possible, is not achievable on the large images required by remote sensing applications. The computational burden of the state-of-the-art algorithm for approximate minimization (namely the alpha-expansion) is too heavy, especially when considering joint regularization of several images. We show that a satisfying solution can be reached, in a few iterations, by performing a graph-cut-based combinatorial exploration of large trial moves. This algorithm is applied to joint regularization of the amplitude and interferometric phase in urban-area SAR images.
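
    To give a feel for the kind of discrete energy minimized in this record, the sketch below solves a 1-D analogue exactly by dynamic programming: a data term plus a truncated pairwise penalty over a discretized label set, applied to a piecewise-constant signal corrupted by multiplicative (speckle-like) noise. The cost functions and noise model are simplified stand-ins, and the paper's 2-D problem requires graph cuts or alpha-expansion rather than a chain recursion.

```python
# Toy 1-D analogue of discrete MRF regularization: exact minimization of
# data cost + truncated pairwise penalty on a chain via dynamic programming.
import numpy as np

rng = np.random.default_rng(2)
labels = np.linspace(0.0, 1.0, 16)                 # discretized amplitudes
clean = np.repeat([0.2, 0.8, 0.4], 40)             # piecewise-constant signal
noisy = clean * rng.gamma(3.0, 1.0 / 3.0, clean.size)   # speckle-like noise

beta, tau = 0.15, 0.3                              # smoothness weight, truncation

def data_cost(y, x):
    return (y - x) ** 2                            # stand-in for the SAR likelihood

n, L = noisy.size, labels.size
cost = np.array([[data_cost(noisy[i], labels[j]) for j in range(L)] for i in range(n)])
pair = beta * np.minimum(np.abs(labels[:, None] - labels[None, :]), tau)

# Forward (Viterbi) recursion: best energy of any labelling ending in label j.
M = cost[0].copy()
back = np.zeros((n, L), dtype=int)
for i in range(1, n):
    total = M[:, None] + pair                      # predecessor cost for each label
    back[i] = np.argmin(total, axis=0)
    M = cost[i] + total[back[i], np.arange(L)]

# Backtrack the globally optimal labelling.
path = np.zeros(n, dtype=int)
path[-1] = int(np.argmin(M))
for i in range(n - 1, 0, -1):
    path[i - 1] = back[i, path[i]]
restored = labels[path]

print("RMSE noisy vs clean:   ", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMSE restored vs clean:", np.sqrt(np.mean((restored - clean) ** 2)))
```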

  1. Physical activity, mediating factors and risk of colon cancer: insights into adiposity and circulating biomarkers from the EPIC cohort.

    NARCIS (Netherlands)

    Aleksandrova, Krasimira; Jenab, Mazda; Leitzmann, Michael; Bueno-de-Mesquita, Bas; Kaaks, Rudolf; Trichopoulou, Antonia; Bamia, Christina; Lagiou, Pagona; Rinaldi, Sabina; Freisling, Heinz; Carayol, Marion; Pischon, Tobias; Drogan, Dagmar; Weiderpass, Elisabete; Jakszyn, Paula; Overvad, Kim; Dahm, Christina C; Tjønneland, Anne; Bouton-Ruault, Marie-Christine; Kühn, Tilman; Peppa, Eleni; Valanou, Elissavet; La Vecchia, Carlo; Palli, Domenico; Panico, Salvatore; Sacerdote, Carlotta; Agnoli, Claudia; Tumino, Rosario; May, Anne; van Vulpen, Jonna; Benjaminsen Borch, Kristin; Oluwafemi Oyeyemi, Sunday; Quirós, J Ramón; Bonet, Catalina; Sánchez, María-José; Dorronsoro, Miren; Navarro, Carmen; Barricarte, Aurelio; van Guelpen, Bethany; Wennberg, Patrik; Key, Timothy J; Khaw, Kay-Tee; Wareham, Nicholas; Assi, Nada; Ward, Heather A; Aune, Dagfinn; Riboli, Elio; Boeing, Heiner

    2017-01-01

    There is convincing evidence that high physical activity lowers the risk of colon cancer; however, the underlying biological mechanisms remain largely unknown. We aimed to determine the extent to which body fatness and biomarkers of various biologically plausible pathways account for the association

  2. The effect of membrane-regulated actin polymerization on a two-phase flow model for cell motility

    KAUST Repository

    Kimpton, L. S.

    2014-07-23

    Two-phase flow models have been widely used to model cell motility, and we have previously demonstrated that even the simplest, stripped-down, 1D model displays many observed features of cell motility [Kimpton, L.S., Whiteley, J.P., Waters, S.L., King, J.R. & Oliver, J.M. (2013) Multiple travelling-wave solutions in a minimal model for cell motility. Math. Med. Biol. 30, 241 - 272]. In this paper, we address a limitation of the previous model. We show that the two-phase flow framework can exhibit travelling-wave solutions with biologically plausible actin network profiles in two simple models that enforce polymerization or depolymerization of the actin network at the ends of the travelling, 1D strip of cytoplasm. © The Authors 2014. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.

  3. Classical strings and minimal surfaces

    International Nuclear Information System (INIS)

    Urbantke, H.

    1986-01-01

    Real Lorentzian forms of some complex or complexified Euclidean minimal surfaces are obtained as an application of H.A. Schwarz' solution to the initial value problem or a search for surfaces admitting a group of Poincare transformations. (Author)

  4. Separations: The path to waste minimization

    International Nuclear Information System (INIS)

    Bell, J.T.

    1992-01-01

    Waste materials usually are composed of large amounts of innocuous and frequently useful components mixed with lesser amounts of one or more hazardous components. The ultimate path to waste minimization is the separation of the lesser quantities of hazardous components from the innocuous components, followed by recycling of the useful components. This vision is so simple that everyone would be expected to properly manage waste. Several parameters interfere with this proper waste management, which encourages the "sweep it under the rug" or the "bury it all" attitudes, both of which delay and complicate proper waste management. The two primary parameters that interfere with proper waste management are: economics drives a process to a product without concern for waste minimization, and emergency needs for immediate production of a product usually delay proper waste management. A third parameter in recent years is also interfering with proper waste management: quick relief of waste insults to political and public perceptions is promoting the "bury it all" attitude. A fourth parameter can promote better waste management for any scenario that suffers from any or all of the first three parameters: separations technology can minimize wastes when the application of this technology is not voided by the influence of the first three parameters. The US Department of Energy's management of nuclear waste has been seriously affected by the above four parameters. This paper includes several points about how the generation and management of DOE wastes have been, and continue to be, affected by these parameters. Particular separations technologies for minimizing the DOE wastes that must be stored for long periods are highlighted

  5. On Beyond Star Trek, the Role of Synthetic Biology in Nasa's Missions

    Science.gov (United States)

    Rothschild, Lynn J.

    2016-01-01

    The time has come for NASA to exploit the nascent field of synthetic biology in pursuit of its mission, including aeronautics, earth science, astrobiology and, notably, human exploration. Conversely, NASA advances the fundamental technology of synthetic biology as no one else can because of its unique expertise in the origin of life and life in extreme environments, including the potential for alternate life forms. This enables unique, creative "game changing" advances. NASA's requirement for minimizing upmass in flight will also drive the field toward miniaturization and automation. These drivers will greatly increase the utility of synthetic biology solutions for military, remote-area health, and commercial purposes. To this end, we have begun a program at NASA to explore the use of synthetic biology in NASA's missions, particularly space exploration. As part of this program, we began hosting an iGEM team of undergraduates drawn from Brown and Stanford Universities to conduct synthetic biology research at NASA Ames Research Center. The 2011 team (http://2011.igem.org/Team:Brown-Stanford) produced an award-winning project on using synthetic biology as a basis for a human Mars settlement, and the 2012 team has expanded the use of synthetic biology to estimate the potential for life in the clouds of other planets (http://2012.igem.org/Team:Stanford-Brown; http://www.calacademy.org/sciencetoday/igem-competition/). More recent projects from the Stanford-Brown team have expanded our ideas of how synthetic biology can aid NASA's missions, from "Synthetic BioCommunication" (http://2013.igem.org/Team:Stanford-Brown) to a "Biodegradable UAS (drone)" in collaboration with Spelman College (http://2014.igem.org/Team:StanfordBrownSpelman#SBS%20iGEM) and, most recently, "Self-Folding Origami" (http://2015.igem.org/Team:Stanford-Brown), the winner of the 2015 award for Manufacturing.

  6. Dimensionality of Local Minimizers of the Interaction Energy

    KAUST Repository

    Balagué , D.; Carrillo, J. A.; Laurent, T.; Raoul, G.

    2013-01-01

    In this work we consider local minimizers (in the topology of transport distances) of the interaction energy associated with a repulsive-attractive potential. We show how the dimensionality of the support of local minimizers is related to the repulsive strength of the potential at the origin. © 2013 Springer-Verlag Berlin Heidelberg.

  7. Dimensionality of Local Minimizers of the Interaction Energy

    KAUST Repository

    Balagué, D.

    2013-05-22

    In this work we consider local minimizers (in the topology of transport distances) of the interaction energy associated with a repulsive-attractive potential. We show how the dimensionality of the support of local minimizers is related to the repulsive strength of the potential at the origin. © 2013 Springer-Verlag Berlin Heidelberg.

  8. Conscious visual memory with minimal attention.

    Science.gov (United States)

    Pinto, Yair; Vandenbroucke, Annelinde R; Otten, Marte; Sligte, Ilja G; Seth, Anil K; Lamme, Victor A F

    2017-02-01

    Is conscious visual perception limited to the locations that a person attends? The remarkable phenomenon of change blindness, which shows that people miss nearly all unattended changes in a visual scene, suggests the answer is yes. However, change blindness is found after visual interference (a mask or a new scene), so that subjects have to rely on working memory (WM), which has limited capacity, to detect the change. Before such interference, however, a much larger capacity store, called fragile memory (FM), which is easily overwritten by newly presented visual information, is present. Whether these different stores depend equally on spatial attention is central to the debate on the role of attention in conscious vision. In 2 experiments, we found that minimizing spatial attention almost entirely erases visual WM, as expected. Critically, FM remains largely intact. Moreover, minimally attended FM responses yield accurate metacognition, suggesting that conscious memory persists with limited spatial attention. Together, our findings help resolve the fundamental issue of how attention affects perception: Both visual consciousness and memory can be supported by only minimal attention. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. Stabilization of a locally minimal forest

    Science.gov (United States)

    Ivanov, A. O.; Mel'nikova, A. E.; Tuzhilin, A. A.

    2014-03-01

    The method of partial stabilization of locally minimal networks, which was invented by Ivanov and Tuzhilin to construct examples of shortest trees with given topology, is developed. According to this method, boundary vertices of degree 2 are not added to all edges of the original locally minimal tree, but only to some of them. The problem of partial stabilization of locally minimal trees in a finite-dimensional Euclidean space is solved completely in the paper, that is, without any restrictions imposed on the number of edges remaining free of subdivision. A criterion for the realizability of such stabilization is established. In addition, the general problem of searching for the shortest forest connecting a finite family of boundary compact sets in an arbitrary metric space is formalized; it is shown that such forests exist for any family of compact sets if and only if for any finite subset of the ambient space there exists a shortest tree connecting it. The theory developed here allows us to establish further generalizations of the stabilization theorem both for arbitrary metric spaces and for metric spaces with some special properties. Bibliography: 10 titles.

  10. Minimal Liouville gravity correlation numbers from Douglas string equation

    International Nuclear Information System (INIS)

    Belavin, Alexander; Dubrovin, Boris; Mukhametzhanov, Baur

    2014-01-01

    We continue the study of (q,p) Minimal Liouville Gravity with the help of the Douglas string equation. We generalize the results of http://dx.doi.org/10.1016/0550-3213(91)90548-C and http://dx.doi.org/10.1088/1751-8113/42/30/304004, where the Lee-Yang series (2,2s+1) was studied, to (3,3s+p_0) Minimal Liouville Gravity, where p_0 = 1,2. We demonstrate that there exist coordinates τ_{m,n} on the space of the perturbed Minimal Liouville Gravity theories in which the partition function of the theory is determined by the Douglas string equation. The coordinates τ_{m,n} are related in a non-linear fashion to the natural coupling constants λ_{m,n} of the perturbations of Minimal Liouville Gravity by the physical operators O_{m,n}. We find this relation from the requirement that the correlation numbers in Minimal Liouville Gravity must satisfy the conformal and fusion selection rules. After fixing this relation we compute three- and four-point correlation numbers when they are non-zero. The results are in agreement with the direct calculations in Minimal Liouville Gravity available in the literature: http://dx.doi.org/10.1103/PhysRevLett.66.2051, http://dx.doi.org/10.1007/s11232-005-0003-3, http://dx.doi.org/10.1007/s11232-006-0075-8

  11. Minimal canonical comprehensive Gröbner systems

    OpenAIRE

    Manubens, Montserrat; Montes, Antonio

    2009-01-01

    This is the continuation of Montes' paper "On the canonical discussion of polynomial systems with parameters''. In this paper, we define the Minimal Canonical Comprehensive Gröbner System of a parametric ideal and fix under which hypothesis it exists and is computable. An algorithm to obtain a canonical description of the segments of the Minimal Canonical CGS is given, thus completing the whole MCCGS algorithm (implemented in Maple and Singular). We show its high utility for applications, suc...

  12. Advanced pyrochemical technologies for minimizing nuclear waste

    International Nuclear Information System (INIS)

    Bronson, M.C.; Dodson, K.E.; Riley, D.C.

    1994-01-01

    The Department of Energy (DOE) is seeking to reduce the size of the current nuclear weapons complex and consequently minimize operating costs. To meet this DOE objective, the national laboratories have been asked to develop advanced technologies that take uranium and plutonium from retired weapons and prepare them for new weapons, long-term storage, and/or final disposition. Current pyrochemical processes generate residue salts and ceramic wastes that require aqueous processing to remove and recover the actinides. However, the aqueous treatment of these residues generates an estimated 100 liters of acidic transuranic (TRU) waste per kilogram of plutonium in the residue. Lawrence Livermore National Laboratory (LLNL) is developing pyrochemical techniques to eliminate, minimize, or more efficiently treat these residue streams. This paper will present technologies being developed at LLNL on advanced materials for actinide containment, reactors that minimize residues, and pyrochemical processes that remove actinides from waste salts

  13. The non-minimal heterotic pure spinor string in a curved background

    Energy Technology Data Exchange (ETDEWEB)

    Chandia, Osvaldo [Facultad de Artes Liberales and Facultad de Ingeniería y Ciencias, Universidad Adolfo Ibáñez,Diagonal Las Torres 2640, Peñalolén, Santiago (Chile)

    2014-03-21

    We study the non-minimal pure spinor string in a curved background. We find that the minimal BRST invariance implies the existence of a non-trivial stress-energy tensor for the minimal and non-minimal variables in the heterotic curved background. We find constraint equations for the b ghost. We construct the b ghost as a solution of these constraints.

  14. Waste minimization applications at a remediation site

    International Nuclear Information System (INIS)

    Allmon, L.A.

    1995-01-01

    The Fernald Environmental Management Project (FEMP) owned by the Department of Energy was used for the processing of uranium. In 1989 Fernald suspended production of uranium metals and was placed on the National Priorities List (NPL). The site's mission has changed from one of production to environmental restoration. Many groups necessary for producing a product were deemed irrelevant for remediation work, including Waste Minimization. Waste Minimization does not readily appear to be applicable to remediation work. Environmental remediation is designed to correct adverse impacts to the environment from past operations and generates significant amounts of waste requiring management. The premise of pollution prevention is to avoid waste generation, thus remediation is in direct conflict with this premise. Although greater amounts of waste will be generated during environmental remediation, treatment capacities are not always available and disposal is becoming more difficult and costly. This creates the need for pollution prevention and waste minimization. Applying waste minimization principles at a remediation site is an enormous challenge. If the remediation site is also radiologically contaminated it is even a bigger challenge. Innovative techniques and ideas must be utilized to achieve reductions in the amount of waste that must be managed or dispositioned. At Fernald the waste minimization paradigm was shifted from focusing efforts on source reduction to focusing efforts on recycle/reuse by inverting the EPA waste management hierarchy. A fundamental difference at remediation sites is that source reduction has limited applicability to legacy wastes but can be applied successfully on secondary waste generation. The bulk of measurable waste reduction will be achieved by the recycle/reuse of primary wastes and by segregation and decontamination of secondary wastestreams. Each effort must be measured in terms of being economically and ecologically beneficial

  15. [Decontamination of chemical and biological warfare agents].

    Science.gov (United States)

    Seto, Yasuo

    2009-01-01

    Chemical and biological warfare agents (CBWAs) are diverse in nature: volatile acute low-molecular-weight toxic compounds, chemical warfare agents (CWAs; gaseous choking and blood agents, volatile nerve gases and blister agents, nonvolatile vomit agents and lacrymators), biological toxins (nonvolatile low-molecular-weight toxins, proteinous toxins), and microbes (bacteria, viruses, rickettsiae). In consequence management after chemical and biological terrorism, speedy decontamination of victims, facilities, and equipment is required to minimize the damage. At present, washing victims and contaminated materials with large volumes of water is the basic approach, with hypochlorite salt solution additionally used for decomposition of CWAs. However, how to dispose of large volumes of waste water remains unsolved, and the decontamination reagents have serious limitations: high toxicity, harm to the environment, long processing times, and lack of durable decontamination effect. In short, the existing decontamination system is not effective and nonspecifically affects surrounding non-target materials. It is therefore urgent to build a usable decontamination system surpassing the present technologies. The symposiast presents an ongoing joint project on research and development of a novel decontamination system against CBWAs, with the aim of realizing nontoxic, fast, specific, effective, and economical on-site decontamination in terrorism incidents. The project consists of (1) establishment of decontamination evaluation methods and verification of the existing technologies and adaptation of bacterial organophosphorus hydrolase, (2) development of adsorptive elimination technologies using molecular recognition tools, and (4) development of deactivation technologies using photocatalysis.

  16. Minimally processed vegetable salads: microbial quality evaluation.

    Science.gov (United States)

    Fröder, Hans; Martins, Cecília Geraldes; De Souza, Katia Leani Oliveira; Landgraf, Mariza; Franco, Bernadette D G M; Destro, Maria Teresa

    2007-05-01

    The increasing demand for fresh fruits and vegetables and for convenience foods is causing an expansion of the market share for minimally processed vegetables. Among the more common pathogenic microorganisms that can be transmitted to humans by these products are Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella. The aim of this study was to evaluate the microbial quality of a selection of minimally processed vegetables. A total of 181 samples of minimally processed leafy salads were collected from retailers in the city of Sao Paulo, Brazil. Counts of total coliforms, fecal coliforms, Enterobacteriaceae, psychrotrophic microorganisms, and Salmonella were conducted for 133 samples. L. monocytogenes was assessed in 181 samples using the BAX System and by plating the enrichment broth onto Palcam and Oxford agars. Suspected Listeria colonies were submitted to classical biochemical tests. Populations of psychrotrophic microorganisms >10^6 CFU/g were found in 51% of the 133 samples, and Enterobacteriaceae populations between 10^5 and 10^6 CFU/g were found in 42% of the samples. Fecal coliform concentrations higher than 10^2 CFU/g (the Brazilian standard) were found in 97 (73%) of the samples, and Salmonella was detected in 4 (3%) of the samples. Two of the Salmonella-positive samples had [...]. The minimally processed vegetables had poor microbiological quality, and these products could be a vehicle for pathogens such as Salmonella and L. monocytogenes.

  17. Bio-physically plausible visualization of highly scattering fluorescent neocortical models for in silico experimentation

    KAUST Repository

    Abdellah, Marwan; Bilgili, Ahmet; Eilemann, Stefan; Shillcock, Julian; Markram, Henry; Schü rmann, Felix

    2017-01-01

    to visualize the results of their virtual experiments that are performed in computer simulations, or in silico. The impact of the presented pipeline opens novel avenues for assisting the neuroscientists to build biologically accurate models of the brain

  18. Minimally Invasive Parathyroidectomy

    Directory of Open Access Journals (Sweden)

    Lee F. Starker

    2011-01-01

    Full Text Available Minimally invasive parathyroidectomy (MIP) is an operative approach for the treatment of primary hyperparathyroidism (pHPT). Currently, routine use of improved preoperative localization studies, cervical block anesthesia in the conscious patient, and intraoperative parathyroid hormone analyses aid in guiding surgical therapy. MIP requires less surgical dissection causing decreased trauma to tissues, can be performed safely in the ambulatory setting, and is at least as effective as standard cervical exploration. This paper reviews advances in preoperative localization, anesthetic techniques, and intraoperative management of patients undergoing MIP for the treatment of pHPT.

  19. Membangun Sistem Linux Mandrake Minimal Menggunakan Inisial Disk Ram [Building a Minimal Mandrake Linux System Using an Initial RAM Disk]

    OpenAIRE

    Wagito, Wagito

    2006-01-01

    A minimal Linux system is commonly used for special-purpose systems such as routers, gateways, Linux installers, and diskless Linux systems. A minimal Linux system is a Linux system that uses only a small subset of the facilities of a full Linux installation. Mandrake Linux, as one Linux distribution, can be used to build a minimal Linux system. RAM is a computer resource used primarily as main memory. Part of the RAM can be repurposed as a disk, called a RAM disk, and this RAM disk can be used to run the Linux system. This ...

  20. Subspace Correction Methods for Total Variation and ℓ₁-Minimization

    KAUST Repository

    Fornasier, Massimo

    2009-01-01

    This paper is concerned with the numerical minimization of energy functionals in Hilbert spaces involving convex constraints coinciding with a seminorm for a subspace. The optimization is realized by alternating minimizations of the functional on a sequence of orthogonal subspaces. On each subspace an iterative proximity-map algorithm is implemented via oblique thresholding, which is the main new tool introduced in this work. We provide convergence conditions for the algorithm in order to compute minimizers of the target energy. Analogous results are derived for a parallel variant of the algorithm. Applications are presented in domain decomposition methods for degenerate elliptic PDEs arising in total variation minimization and in accelerated sparse recovery algorithms based on ℓ1-minimization. We include numerical examples which show efficient solutions to classical problems in signal and image processing. © 2009 Society for Industrial and Applied Mathematics.
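
    As a rough illustration of the alternating-subspace idea in this last record, the Python sketch below solves an ℓ1-penalized least-squares toy problem by splitting the coordinates into two blocks and applying a soft-thresholding (proximal-gradient) step to each block in turn. The problem sizes, splitting, and step size are arbitrary choices for the demo; the paper's oblique thresholding operator and its convergence theory for total variation functionals are not reproduced here.

```python
# Block-alternating soft-thresholding sketch for l1-regularized least squares,
# loosely in the spirit of subspace correction (toy illustration only).
import numpy as np

rng = np.random.default_rng(3)
m, n, lam = 60, 100, 0.1
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, 8, replace=False)] = rng.normal(0, 2, 8)
b = A @ x_true + 0.01 * rng.normal(size=m)

def soft(v, t):
    # soft-thresholding, the proximity map of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

tau = 1.0 / np.linalg.norm(A, 2) ** 2          # step size <= 1 / ||A||^2
x = np.zeros(n)
blocks = [np.arange(0, n // 2), np.arange(n // 2, n)]
for _ in range(500):
    for idx in blocks:                          # alternate over the two subspaces
        grad = A.T @ (A @ x - b)
        x[idx] = soft(x[idx] - tau * grad[idx], tau * lam)

print("recovered support:", np.nonzero(np.abs(x) > 1e-3)[0])
print("true support:     ", np.sort(np.nonzero(x_true)[0]))
```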