WorldWideScience

Sample records for multiple point principle

  1. Coexistence of different vacua in the effective quantum field theory and multiple point principle

    International Nuclear Information System (INIS)

    Volovik, G.E.

    2004-01-01

According to the multiple point principle, our Universe is on the coexistence curve of two or more phases of the quantum vacuum. The coexistence of different quantum vacua can be regulated by the exchange of global fermionic charges between the vacua. If the coexistence is regulated by the baryonic charge, all the coexisting vacua exhibit baryonic asymmetry. Due to the exchange of baryonic charge between the vacuum and matter, which occurs above the electroweak transition, the baryonic asymmetry of the vacuum induces the baryonic asymmetry of matter in our Standard-Model phase of the quantum vacuum.

  2. Set Partitions and the Multiplication Principle

    Science.gov (United States)

    Lockwood, Elise; Caughman, John S., IV

    2016-01-01

    To further understand student thinking in the context of combinatorial enumeration, we examine student work on a problem involving set partitions. In this context, we note some key features of the multiplication principle that were often not attended to by students. We also share a productive way of thinking that emerged for several students who…
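    For readers unfamiliar with the principle discussed above, a standard textbook statement with a small worked example follows; it is a generic formulation added for orientation only, not one of the study's own tasks.

```latex
% Generic statement of the multiplication principle (not the study's own tasks).
% If a counting procedure consists of $k$ successive stages, and stage $i$ can be
% completed in $n_i$ ways regardless of the choices made at earlier stages, then the
% whole procedure can be completed in
\[
  n_1 \times n_2 \times \cdots \times n_k \quad \text{ways.}
\]
% Example: assigning each of 3 labelled balls to one of 2 labelled boxes is a
% 3-stage procedure with 2 options per stage, giving $2^3 = 8$ outcomes. The
% hypotheses (independent stages, with a fixed number of options at each stage)
% are exactly the kind of features that are easy to overlook.
```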

  3. Le Chatelier's principle with multiple relaxation channels

    Science.gov (United States)

    Gilmore, R.; Levine, R. D.

    1986-05-01

    Le Chatelier's principle is discussed within the constrained variational approach to thermodynamics. The formulation is general enough to encompass systems not in thermal (or chemical) equilibrium. Particular attention is given to systems with multiple constraints which can be relaxed. The moderation of the initial perturbation increases as additional constraints are removed. This result is studied in particular when the (coupled) relaxation channels have widely different time scales. A series of inequalities is derived which describes the successive moderation as each successive relaxation channel opens up. These inequalities are interpreted within the metric-geometry representation of thermodynamics.

  4. PRINCIPLE OF POINT MAKING OF MUTUALLY ACCEPTABLE MULTIPROJECTION DECISION

    Directory of Open Access Journals (Sweden)

    Olga N. Lapaeva

    2015-01-01

    The principle of point making of a mutually acceptable multi-projection decision in economics is set forth in the article. The principle envisages a search for the best variant by each stakeholder, with the result formed by intersecting the individual sets.
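    A toy sketch of the decision rule summarized above, added for illustration: each stakeholder ("projection") selects its own set of acceptable variants, and the mutually acceptable decision is taken from the intersection of those sets. The variant names and projections are illustrative assumptions only.

```python
# Toy illustration: mutually acceptable decision = intersection of the individual
# sets of acceptable variants chosen by each projection (stakeholder view).
acceptable = {
    "economic":      {"A", "B", "C"},
    "environmental": {"B", "C", "D"},
    "social":        {"B", "E"},
}

mutually_acceptable = set.intersection(*acceptable.values())
print(mutually_acceptable)   # {'B'}: the only variant every projection accepts
```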

  5. Supporting Multiple Pointing Devices in Microsoft Windows

    DEFF Research Database (Denmark)

    Westergaard, Michael

    2002-01-01

    In this paper the implementation of a Microsoft Windows driver, including APIs, supporting multiple pointing devices is presented. Microsoft Windows does not natively support multiple pointing devices controlling independent cursors, and a number of solutions to this have been implemented by us and others. Here we motivate and describe a general solution and show how user applications can use it by means of a framework. The device driver and the supporting APIs will be made available free of charge; interested parties can contact the author for more information.

  6. Analogue of Pontryagin's maximum principle for multiple integrals minimization problems

    OpenAIRE

    Mikhail, Zelikin

    2016-01-01

    A theorem analogous to Pontryagin's maximum principle for multiple integrals is proved. Unlike the usual maximum principle, the maximum is taken not over all matrices but only over matrices of rank one. Examples are given.

  7. Ten Anchor Points for Teaching Principles of Marketing

    Science.gov (United States)

    Tomkovick, Chuck

    2004-01-01

    Effective marketing instructors commonly share a love for their students, an affinity for the subject matter, and a devotion to continuous quality improvement. The purpose of this article is to highlight 10 anchor points for teaching Principles of Marketing, which are designed to better engage students in the learning process. These anchor…

  8. Evaluation of multiple emission point facilities

    International Nuclear Information System (INIS)

    Miltenberger, R.P.; Hull, A.P.; Strachan, S.; Tichler, J.

    1988-01-01

    In 1970, the New York State Department of Environmental Conservation (NYSDEC) assumed responsibility for the environmental aspect of the state's regulatory program for by-product, source, and special nuclear material. The major objective of this study was to provide consultation to NYSDEC and the US NRC to assist NYSDEC in determining if broad-based licensed facilities with multiple emission points were in compliance with NYCRR Part 380. Under this contract, BNL would evaluate a multiple emission point facility, identified by NYSDEC, as a case study. The review would be a nonbinding evaluation of the facility to determine likely dispersion characteristics, compliance with specified release limits, and implementation of the ALARA philosophy regarding effluent release practices. From the data collected, guidance as to areas of future investigation and the impact of new federal regulations were to be developed. Reported here is the case study for the University of Rochester, Strong Memorial Medical Center and Riverside Campus

  9. Point defects in thorium nitride: A first-principles study

    Energy Technology Data Exchange (ETDEWEB)

    Pérez Daroca, D., E-mail: pdaroca@tandar.cnea.gov.ar [Gerencia de Investigación y Aplicaciones, Comisión Nacional de Energía Atómica (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas (Argentina); Llois, A.M. [Gerencia de Investigación y Aplicaciones, Comisión Nacional de Energía Atómica (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas (Argentina); Mosca, H.O. [Gerencia de Investigación y Aplicaciones, Comisión Nacional de Energía Atómica (Argentina); Instituto de Tecnología Jorge A. Sabato, UNSAM-CNEA (Argentina)

    2016-11-15

    Thorium and its compounds (carbides and nitrides) are being investigated as possible materials for use as nuclear fuels in Generation-IV reactors. As a first step in the research of these materials under irradiation, we study the formation energies and stability of point defects in thorium nitride by means of first-principles calculations within the framework of density functional theory. We focus on vacancies, interstitials, Frenkel pairs and Schottky defects. We found that N and Th vacancies have almost the same formation energy and that the most energetically favorable defects of all studied in this work are N interstitials. To the best of the authors' knowledge, such results for ThN have not been obtained previously, either experimentally or theoretically.

  10. First-principles study of point defects in thorium carbide

    International Nuclear Information System (INIS)

    Pérez Daroca, D.; Jaroszewicz, S.; Llois, A.M.; Mosca, H.O.

    2014-01-01

    Thorium-based materials are currently being investigated in relation to their potential utilization as nuclear fuels in Generation-IV reactors. One of the most important issues to be studied is their behavior under irradiation. A first approach to this goal is the study of point defects. By means of first-principles calculations within the framework of density functional theory, we study the stability and formation energies of vacancies, interstitials and Frenkel pairs in thorium carbide. We find that isolated C vacancies are the most likely defects, while C interstitials are energetically favored as compared to Th ones. To the best of the authors' knowledge, such results for ThC have not been obtained previously, either experimentally or theoretically. For this reason, we compare with results on other compounds with the same NaCl-type structure.

  11. Point defects in thorium nitride: A first-principles study

    International Nuclear Information System (INIS)

    Pérez Daroca, D.; Llois, A.M.; Mosca, H.O.

    2016-01-01

    Thorium and its compounds (carbides and nitrides) are being investigated as possible materials for use as nuclear fuels in Generation-IV reactors. As a first step in the research of these materials under irradiation, we study the formation energies and stability of point defects in thorium nitride by means of first-principles calculations within the framework of density functional theory. We focus on vacancies, interstitials, Frenkel pairs and Schottky defects. We found that N and Th vacancies have almost the same formation energy and that the most energetically favorable defects of all studied in this work are N interstitials. To the best of the authors' knowledge, such results for ThN have not been obtained previously, either experimentally or theoretically.

  12. First-principles study of point defects in thorium carbide

    Energy Technology Data Exchange (ETDEWEB)

    Pérez Daroca, D., E-mail: pdaroca@tandar.cnea.gov.ar [Gerencia de Investigación y Aplicaciones, Comisión Nacional de Energía Atómica, Av. General Paz 1499, (1650) San Martin, Buenos Aires (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, (1033) Buenos Aires (Argentina); Jaroszewicz, S. [Gerencia de Investigación y Aplicaciones, Comisión Nacional de Energía Atómica, Av. General Paz 1499, (1650) San Martin, Buenos Aires (Argentina); Instituto de Tecnología Jorge A. Sabato, UNSAM-CNEA, Av. General Paz 1499, (1650) San Martin, Buenos Aires (Argentina); Llois, A.M. [Gerencia de Investigación y Aplicaciones, Comisión Nacional de Energía Atómica, Av. General Paz 1499, (1650) San Martin, Buenos Aires (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, (1033) Buenos Aires (Argentina); Mosca, H.O. [Gerencia de Investigación y Aplicaciones, Comisión Nacional de Energía Atómica, Av. General Paz 1499, (1650) San Martin, Buenos Aires (Argentina); Instituto de Tecnología Jorge A. Sabato, UNSAM-CNEA, Av. General Paz 1499, (1650) San Martin, Buenos Aires (Argentina)

    2014-11-15

    Thorium-based materials are currently being investigated in relation to their potential utilization as nuclear fuels in Generation-IV reactors. One of the most important issues to be studied is their behavior under irradiation. A first approach to this goal is the study of point defects. By means of first-principles calculations within the framework of density functional theory, we study the stability and formation energies of vacancies, interstitials and Frenkel pairs in thorium carbide. We find that isolated C vacancies are the most likely defects, while C interstitials are energetically favored as compared to Th ones. To the best of the authors' knowledge, such results for ThC have not been obtained previously, either experimentally or theoretically. For this reason, we compare with results on other compounds with the same NaCl-type structure.

  13. 41 CFR Appendix A to Subpart C of... - 3-Key Points and Principles

    Science.gov (United States)

    2010-07-01

    ... Principles A Appendix A to Subpart C of Part 102 Public Contracts and Property Management Federal Property... 102-3—Key Points and Principles This appendix provides additional guidance in the form of answers to frequently asked questions and identifies key points and principles that may be applied to situations not...

  14. 41 CFR Appendix A to Subpart D of... - 3-Key Points and Principles

    Science.gov (United States)

    2010-07-01

    ... Principles A Appendix A to Subpart D of Part 102 Public Contracts and Property Management Federal Property... Subpart D of Part 102-3—Key Points and Principles This appendix provides additional guidance in the form of answers to frequently asked questions and identifies key points and principles that may be applied...

  15. Multiple point statistical simulation using uncertain (soft) conditional data

    Science.gov (United States)

    Hansen, Thomas Mejer; Vu, Le Thanh; Mosegaard, Klaus; Cordua, Knud Skou

    2018-05-01

    Geostatistical simulation methods have been used to quantify the spatial variability of reservoir models since the 1980s. In the last two decades, state-of-the-art simulation methods have changed from being based on covariance-based two-point statistics to multiple-point statistics (MPS), which allow simulation of more realistic Earth structures. In addition, increasing amounts of geo-information (geophysical, geological, etc.) from multiple sources are being collected. This poses the problem of integrating these different sources of information, such that decisions related to reservoir models can be taken on as informed a basis as possible. In principle, though difficult in practice, this can be achieved using computationally expensive Monte Carlo methods. Here we investigate the use of sequential-simulation-based MPS methods conditional to uncertain (soft) data as a computationally efficient alternative. First, it is demonstrated that current implementations of sequential simulation based on MPS (e.g. SNESIM, ENESIM and Direct Sampling) do not account properly for uncertain conditional information, due to a combination of using only co-located information and a random simulation path. Then, we suggest two approaches that better account for the available uncertain information. The first makes use of a preferential simulation path, where more informed model parameters are visited before less informed ones. The second approach involves using non-co-located uncertain information. For different types of available data, these approaches are demonstrated to produce simulation results similar to those obtained by the general Monte Carlo based approach. These methods allow MPS simulation to condition properly to uncertain (soft) data, and hence provide a computationally attractive approach for integration of information about a reservoir model.
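    A minimal sketch of the "preferential simulation path" idea described above: instead of visiting grid cells in a random order, visit the cells with the most informative soft data first. The entropy-based ranking and the helper name are illustrative assumptions, not the authors' implementation.

```python
# Order grid cells so that the most informed cells (lowest entropy of their local
# facies probabilities) are simulated first; ties are broken by a small random jitter.
import numpy as np

def preferential_path(soft_probs, rng=None):
    """Return an ordering of cell indices, most informed cells first.

    soft_probs : array of shape (n_cells, n_facies) with local facies probabilities
                 (uninformed cells have roughly uniform probabilities).
    """
    eps = 1e-12
    p = np.clip(soft_probs, eps, 1.0)
    p = p / p.sum(axis=1, keepdims=True)
    entropy = -(p * np.log(p)).sum(axis=1)        # low entropy = well informed
    if rng is not None:                            # jitter to break ties randomly
        entropy = entropy + 1e-6 * rng.random(entropy.shape)
    return np.argsort(entropy)

# Example: 5 cells, 2 facies; cells 1 and 3 carry informative soft data.
rng = np.random.default_rng(0)
soft = np.array([[0.5, 0.5], [0.9, 0.1], [0.5, 0.5], [0.05, 0.95], [0.6, 0.4]])
print(preferential_path(soft, rng))   # the informed cells (3, 1) come first
```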

  16. Eta Carinae: Viewed from Multiple Vantage Points

    Science.gov (United States)

    Gull, Theodore

    2007-01-01

    The central source of Eta Carinae and its ejecta is a massive binary system buried within a massive interacting wind structure which envelops the two stars. However, the hot, less massive companion blows a small cavity in the very massive primary wind and ionizes a portion of the massive wind just beyond the wind-wind boundary. We gain insight into this complex structure by examining the spatially resolved Space Telescope Imaging Spectrograph (STIS) spectra of the central source (0.1") together with the wind structure, which extends out to nearly an arcsecond (2300 AU), the wind-blown boundaries, and the ejecta of the Little Homunculus. Moreover, the spatially resolved Very Large Telescope/UltraViolet Echelle Spectrograph (VLT/UVES) stellar spectrum (one arcsecond) and spatially sampled spectra across the foreground lobe of the Homunculus provide us with vantage points from different angles relative to the line of sight. Examples of wind line profiles of Fe II, the highly excited [Fe III], [Ne III], [Ar III] and [S III], plus other lines will be presented.

  17. 41 CFR Appendix A to Subpart B of... - 3-Key Points and Principles

    Science.gov (United States)

    2010-07-01

    ... Principles A Appendix A to Subpart B of Part 102 Public Contracts and Property Management Federal Property.... B, App. A Appendix A to Subpart B of Part 102-3—Key Points and Principles This appendix provides... principles that may be applied to situations not covered elsewhere in this subpart. The guidance follows: Key...

  18. 41 CFR Appendix A to Subpart A of... - 3-Key Points and Principles

    Science.gov (United States)

    2010-07-01

    ... Principles A Appendix A to Subpart A of Part 102 Public Contracts and Property Management Federal Property..., Subpt. A, App. A Appendix A to Subpart A of Part 102-3—Key Points and Principles This appendix provides... principles that may be applied to situations not covered elsewhere in this subpart. The guidance follows: Key...

  19. Tripled Fixed Point in Ordered Multiplicative Metric Spaces

    Directory of Open Access Journals (Sweden)

    Laishram Shanjit

    2017-06-01

    In this paper, we present some tripled fixed point theorems in partially ordered multiplicative metric spaces dependent on another function. Our results generalise the results of [6] and [5].

  20. An evolutionary reduction principle for mutation rates at multiple Loci.

    Science.gov (United States)

    Altenberg, Lee

    2011-06-01

    A model of mutation rate evolution for multiple loci under arbitrary selection is analyzed. Results are obtained using techniques from Karlin (Evolutionary Biology, vol. 14, pp. 61-204, 1982) that overcome the weak selection constraints needed for tractability in prior studies of multilocus event models. A multivariate form of the reduction principle is found: reduction results at individual loci combine topologically to produce a surface of mutation rate alterations that are neutral for a new modifier allele. New mutation rates survive if and only if they fall below this surface, a generalization of the hyperplane found by Zhivotovsky et al. (Proc. Natl. Acad. Sci. USA 91, 1079-1083, 1994) for a multilocus recombination modifier. Increases in mutation rates at some loci may evolve if compensated for by decreases at other loci. The strength of selection on the modifier scales in proportion to the number of germline cell divisions, and increases with the number of loci affected. Loci that do not make a difference to marginal fitnesses at equilibrium are not subject to the reduction principle, and under fine tuning of mutation rates would be expected to have higher mutation rates than loci in mutation-selection balance. Other results include the nonexistence of 'viability analogous, Hardy-Weinberg' modifier polymorphisms under multiplicative mutation, and the sufficiency of average transmission rates to encapsulate the effect of modifier polymorphisms on the transmission of loci under selection. A conjecture is offered regarding situations, like recombination in the presence of mutation, that exhibit departures from the reduction principle. Constraints for tractability are: tight linkage of all loci, initial fixation at the modifier locus, and mutation distributions comprising transition probabilities of reversible Markov chains.

  1. A micro dew point sensor with a thermal detection principle

    Science.gov (United States)

    Kunze, M.; Merz, J.; Hummel, W.-J.; Glosch, H.; Messner, S.; Zengerle, R.

    2012-01-01

    We present a dew point temperature sensor based on the thermal detection of condensed water on a thin membrane fabricated by silicon micromachining. The membrane (600 × 600 × ~1 µm³) is part of a silicon chip and contains a heating element as well as a thermopile for temperature measurement. By dynamically heating the membrane and simultaneously analyzing the transient increase of its temperature, it is detected whether condensed water is on the membrane or not. To cool the membrane down, a Peltier cooler is used and electronically controlled in such a way that the temperature of the membrane is constantly held at the value where condensation of water begins. This temperature is measured and output as the dew point temperature. The sensor system works over a wide range of dew point temperatures, from 1 K to 44 K below air temperature. Experimental investigations showed that the deviation of the measured dew point temperatures from reference values is below ±0.2 K over an air temperature range of 22 to 70 °C. At low dew point temperatures of -20 °C (air temperature = 22 °C) the deviation increases to nearly -1 K.

  2. A micro dew point sensor with a thermal detection principle

    International Nuclear Information System (INIS)

    Kunze, M; Merz, J; Glosch, H; Messner, S; Zengerle, R; Hummel, W-J

    2012-01-01

    We present a dew point temperature sensor based on the thermal detection of condensed water on a thin membrane fabricated by silicon micromachining. The membrane (600 × 600 × ∼1 µm³) is part of a silicon chip and contains a heating element as well as a thermopile for temperature measurement. By dynamically heating the membrane and simultaneously analyzing the transient increase of its temperature, it is detected whether condensed water is on the membrane or not. To cool the membrane down, a Peltier cooler is used and electronically controlled in such a way that the temperature of the membrane is constantly held at the value where condensation of water begins. This temperature is measured and output as the dew point temperature. The sensor system works over a wide range of dew point temperatures, from 1 K to 44 K below air temperature. Experimental investigations showed that the deviation of the measured dew point temperatures from reference values is below ±0.2 K over an air temperature range of 22 to 70 °C. At low dew point temperatures of −20 °C (air temperature = 22 °C) the deviation increases to nearly −1 K.

  3. Point source reconstruction principle of linear inverse problems

    International Nuclear Information System (INIS)

    Terazono, Yasushi; Matani, Ayumu; Fujimaki, Norio; Murata, Tsutomu

    2010-01-01

    Exact point source reconstruction for underdetermined linear inverse problems with a block-wise structure was studied. In a block-wise problem, elements of a source vector are partitioned into blocks. Accordingly, a leadfield matrix, which represents the forward observation process, is also partitioned into blocks. A point source is a source having only one nonzero block. An example of such a problem is current distribution estimation in electroencephalography and magnetoencephalography, where a source vector represents a vector field and a point source represents a single current dipole. In this study, the block-wise norm, a block-wise extension of the l_p-norm, was defined as the family of cost functions of the inverse method. The main result is that a set of three conditions was found to be necessary and sufficient for block-wise norm minimization to ensure exact point source reconstruction for any leadfield matrix that admits such reconstruction. The block-wise norm that satisfies the conditions is the sum of the costs of all the observations of source blocks, or in other words, the block-wisely extended leadfield-weighted l_1-norm. Additional results are that minimization of such a norm always provides block-wisely sparse solutions and that its solutions form cones in source space.
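    A hedged sketch of the block-wise norm idea described above: the source vector is partitioned into blocks (e.g. one 3-component current dipole per location) and the cost is the sum over blocks of each block's norm, optionally observed through the corresponding leadfield columns. The function name and the particular weighting are assumptions for illustration, not the paper's exact definition.

```python
# Block-wise norm: sum over blocks of ||x_b||_p, optionally leadfield-weighted.
import numpy as np

def blockwise_norm(x, blocks, L=None, p=2):
    """x: source vector; blocks: list of index arrays, one per block;
    L: optional leadfield matrix used to observe each block through its own columns."""
    total = 0.0
    for b in blocks:
        xb = x[b]
        if L is not None:
            xb = L[:, b] @ xb            # observe the block through the forward model
        total += np.linalg.norm(xb, ord=p)
    return total

# Toy example: 4 blocks of 3 components each; a "point source" has one nonzero block.
blocks = [np.arange(i, i + 3) for i in range(0, 12, 3)]
x_point = np.zeros(12)
x_point[3:6] = [1.0, -2.0, 0.5]
x_spread = np.full(12, 0.6)              # spread-out source of comparable magnitude
print(blockwise_norm(x_point, blocks))   # ≈ 2.29: only one block contributes
print(blockwise_norm(x_spread, blocks))  # ≈ 4.16: every block contributes
```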

  4. Multi-lane detection based on multiple vanishing points detection

    Science.gov (United States)

    Li, Chuanxiang; Nie, Yiming; Dai, Bin; Wu, Tao

    2015-03-01

    Lane detection plays a significant role in Advanced Driver Assistance Systems (ADAS) for intelligent vehicles. In this paper we present a multi-lane detection method based on the detection of multiple vanishing points. A new multi-lane model assumes that a single lane, which has two approximately parallel boundaries, may not be parallel to other lanes on the road plane. Non-parallel lanes are associated with different vanishing points. A biologically plausible model is used to detect multiple vanishing points and fit the lane model. Experimental results show that the proposed method can detect both parallel and non-parallel lanes.
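    A hypothetical sketch of the geometric core of vanishing-point detection: given image-plane lines fitted to lane-boundary edges, the vanishing point is the point closest, in the least-squares sense, to all of them. Grouping lines per lane and repeating the estimate is what yields multiple vanishing points; the helper below shows only the single-point estimate and is an illustrative assumption, not the paper's detector.

```python
# Least-squares intersection of 2D lines given by a point and a direction each.
import numpy as np

def vanishing_point(points, directions):
    """Return the point minimizing the sum of squared distances to all lines."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(2) - np.outer(d, d)   # projector onto the line's normal direction
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Two lane boundaries converging at (320, 150) in pixel coordinates.
pts = [np.array([100.0, 480.0]), np.array([540.0, 480.0])]
dirs = [np.array([320.0, 150.0]) - pts[0], np.array([320.0, 150.0]) - pts[1]]
print(vanishing_point(pts, dirs))        # ~ [320, 150]
```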

  5. Pilot points method for conditioning multiple-point statistical facies simulation on flow data

    Science.gov (United States)

    Ma, Wei; Jafarpour, Behnam

    2018-05-01

    We propose a new pilot points method for conditioning discrete multiple-point statistical (MPS) facies simulation on dynamic flow data. While conditioning MPS simulation on static hard data is straightforward, its calibration against nonlinear flow data is nontrivial. The proposed method generates conditional models from a conceptual model of geologic connectivity, known as a training image (TI), by strategically placing and estimating pilot points. To place pilot points, a score map is generated based on three sources of information: (i) the uncertainty in facies distribution, (ii) the model response sensitivity information, and (iii) the observed flow data. Once the pilot points are placed, the facies values at these points are inferred from production data and then are used, along with available hard data at well locations, to simulate a new set of conditional facies realizations. While facies estimation at the pilot points can be performed using different inversion algorithms, in this study the ensemble smoother (ES) is adopted to update permeability maps from production data, which are then used to statistically infer facies types at the pilot point locations. The developed method combines the information in the flow data and the TI by using the former to infer facies values at selected locations away from the wells and the latter to ensure consistent facies structure and connectivity away from measurement locations. Several numerical experiments are used to evaluate the performance of the developed method and to discuss its important properties.
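    A hedged sketch of the score-map idea described above: combine (i) facies uncertainty across an ensemble of realizations, (ii) model-response sensitivity and (iii) local data mismatch into one score, then place pilot points at the highest-scoring cells. The equal weighting and helper names are illustrative assumptions, not the authors' exact formulation.

```python
# Combine three normalized information maps into a pilot-point placement score.
import numpy as np

def normalize(m):
    rng = m.max() - m.min()
    return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)

def pick_pilot_points(facies_ensemble, sensitivity, data_mismatch, n_points=10):
    """facies_ensemble: (n_realizations, n_cells) array of discrete facies codes."""
    # (i) uncertainty: fraction of realizations disagreeing with the modal facies
    modal = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, facies_ensemble)
    uncertainty = (facies_ensemble != modal).mean(axis=0)
    # combine the three normalized sources of information with equal weights
    score = normalize(uncertainty) + normalize(sensitivity) + normalize(data_mismatch)
    return np.argsort(score)[::-1][:n_points]    # indices of the best-scoring cells

# Toy usage: 100-cell grid, 20 realizations, random sensitivity and mismatch maps.
rng = np.random.default_rng(1)
ens = rng.integers(0, 2, size=(20, 100))
print(pick_pilot_points(ens, rng.random(100), rng.random(100), n_points=5))
```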

  6. Mirrored pyramidal wells for simultaneous multiple vantage point microscopy.

    Science.gov (United States)

    Seale, K T; Reiserer, R S; Markov, D A; Ges, I A; Wright, C; Janetopoulos, C; Wikswo, J P

    2008-10-01

    We report a novel method for obtaining simultaneous images from multiple vantage points of a microscopic specimen using size-matched microscopic mirrors created from anisotropically etched silicon. The resulting pyramidal wells enable bright-field and fluorescent side-view images, and when combined with z-sectioning, provide additional information for 3D reconstructions of the specimen. We have demonstrated the 3D localization and tracking over time of the centrosome of a live Dictyostelium discoideum. The simultaneous acquisition of images from multiple perspectives also provides a five-fold increase in the theoretical collection efficiency of emitted photons, a property which may be useful for low-light imaging modalities such as bioluminescence, or low abundance surface-marker labelling.

  7. Multiple contacts with diversion at the point of arrest.

    Science.gov (United States)

    Riordan, Sharon; Wix, Stuart; Haque, M Sayeed; Humphreys, Martin

    2003-04-01

    A diversion at the point of arrest (DAPA) scheme was set up in five police stations in South Birmingham in 1992. In a study of all referrals made over a four-year period, a sub-group of multiple-contact individuals was identified. During that time four hundred and ninety-two contacts were recorded in total, of which 130 were made by 58 individuals. The latter group was generally no different from the single-contact group but did tend to be younger. This research highlights the need for a re-evaluation of service provision and associated education of police officers and relevant mental health care professionals.

  8. Multiplicative point process as a model of trading activity

    Science.gov (United States)

    Gontis, V.; Kaulakys, B.

    2004-11-01

    Signals consisting of a sequence of pulses show that the inherent origin of 1/f noise is a Brownian fluctuation of the average interevent time between subsequent pulses of the pulse sequence. In this paper, we generalize the model of interevent time to reproduce a variety of self-affine time series exhibiting power spectral density S(f) scaling as a power of the frequency f. Furthermore, we analyze the relation between the power-law correlations and the origin of the power-law probability distribution of the signal intensity. We introduce a stochastic multiplicative model for the time intervals between point events and analyze the statistical properties of the signal analytically and numerically. Such a model system exhibits power-law spectral density S(f) ~ 1/f^β for various values of β, including β = 1/2, 1 and 3/2. Explicit expressions for the power spectra in the low-frequency limit and for the distribution density of the interevent time are obtained. The counting statistics of the events is analyzed analytically and numerically as well. The specific interest of our analysis is related to the financial markets, where long-range correlations of price fluctuations largely depend on the number of transactions. We analyze the spectral density and counting statistics of the number of transactions. The model reproduces spectral properties of the real markets and explains the mechanism of the power-law distribution of trading activity. The study provides evidence that the statistical properties of the financial markets are enclosed in the statistics of the time intervals between trades. A multiplicative point process serves as a consistent model generating these statistics.
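    A minimal illustrative simulation in the spirit of the model described above: interevent times evolve as a bounded multiplicative random walk, and the power spectrum of the resulting event counts is estimated. The specific update rule, bounds and parameter values are assumptions for demonstration; the paper's exact equations and exponents are not reproduced here.

```python
# Simulate a multiplicative stochastic model for interevent times and estimate the
# power spectrum of the event counts; the low-frequency part is expected to rise
# roughly as a power law 1/f^beta for such models.
import numpy as np

rng = np.random.default_rng(42)
n_events = 100_000
tau_min, tau_max = 1e-3, 1.0
noise = rng.standard_normal(n_events)

# Bounded multiplicative random walk of the interevent time tau_k.
tau = np.empty(n_events)
tau[0] = 0.1
for k in range(1, n_events):
    t = tau[k - 1] * (1.0 + 0.01 * noise[k])
    tau[k] = min(max(t, tau_min), tau_max)

events = np.cumsum(tau)                      # event times t_k

# Count events in equal windows and estimate the spectrum of the counts.
dt = 0.5
counts, _ = np.histogram(events, bins=np.arange(0.0, events[-1] + dt, dt))
counts = counts - counts.mean()
spectrum = np.abs(np.fft.rfft(counts)) ** 2
freqs = np.fft.rfftfreq(len(counts), d=dt)
print(freqs[1:6])
print(spectrum[1:6])
```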

  9. Multiplicity: discussion points from the Statisticians in the Pharmaceutical Industry multiplicity expert group.

    Science.gov (United States)

    Phillips, Alan; Fletcher, Chrissie; Atkinson, Gary; Channon, Eddie; Douiri, Abdel; Jaki, Thomas; Maca, Jeff; Morgan, David; Roger, James Henry; Terrill, Paul

    2013-01-01

    In May 2012, the Committee of Health and Medicinal Products issued a concept paper on the need to review the points to consider document on multiplicity issues in clinical trials. In preparation for the release of the updated guidance document, Statisticians in the Pharmaceutical Industry held a one-day expert group meeting in January 2013. Topics debated included multiplicity and the drug development process, the usefulness and limitations of newly developed strategies to deal with multiplicity, multiplicity issues arising from interim decisions and multiregional development, and the need for simultaneous confidence intervals (CIs) corresponding to multiple test procedures. A clear message from the meeting was that multiplicity adjustments need to be considered when the intention is to make a formal statement about efficacy or safety based on hypothesis tests. Statisticians have a key role when designing studies to assess what adjustment really means in the context of the research being conducted. More thought during the planning phase needs to be given to multiplicity adjustments for secondary endpoints given these are increasing in importance in differentiating products in the market place. No consensus was reached on the role of simultaneous CIs in the context of superiority trials. It was argued that unadjusted intervals should be employed as the primary purpose of the intervals is estimation, while the purpose of hypothesis testing is to formally establish an effect. The opposing view was that CIs should correspond to the test decision whenever possible. Copyright © 2013 John Wiley & Sons, Ltd.
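    As one concrete example of the kind of adjustment debated above, a minimal sketch of the Holm step-down procedure follows. It is included only to make concrete what "adjusting for multiple hypothesis tests" means; it is one standard method, not a procedure endorsed by the expert group or the guidance documents.

```python
# Holm step-down adjustment: sort p-values, multiply the i-th smallest by (m - i + 1),
# and enforce monotonicity so adjusted p-values never decrease along the ordering.
from typing import List

def holm_adjust(p_values: List[float]) -> List[float]:
    """Return Holm-adjusted p-values in the original order."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, idx in enumerate(order):
        adj = min(1.0, (m - rank) * p_values[idx])
        running_max = max(running_max, adj)      # enforce monotonicity
        adjusted[idx] = running_max
    return adjusted

raw = [0.011, 0.03, 0.002, 0.2]
print(holm_adjust(raw))   # sorted computation: 0.002*4, 0.011*3, 0.03*2, 0.2*1
```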

  10. Principles of a new treatment algorithm in multiple sclerosis

    DEFF Research Database (Denmark)

    Hartung, Hans-Peter; Montalban, Xavier; Sorensen, Per Soelberg

    2011-01-01

    We are entering a new era in the management of patients with multiple sclerosis (MS). The first oral treatment (fingolimod) has now gained US FDA approval, addressing an unmet need for patients with MS who wish to avoid parenteral administration. A second agent (cladribine) is currently being...

  11. Departing from PowerPoint default mode: Applying Mayer's multimedia principles for enhanced learning of parasitology.

    Science.gov (United States)

    Nagmoti, Jyoti Mahantesh

    2017-01-01

    PowerPoint (PPT™) presentation has become an integral part of day-to-day teaching in medicine. Most often, PPT™ is used in its default mode, which in fact is known to cause boredom and ineffective learning. Research has shown improved short-term memory when multimedia principles are applied to designing and delivering lectures. However, such evidence in medical education is scarce. Therefore, we attempted to evaluate the effect of multimedia principles on enhanced learning of parasitology. Second-year medical students received a series of lectures, half of the lectures using traditionally designed PPT™ slides and the rest using slides designed according to Mayer's multimedia principles. Students answered pre- and post-tests at the end of each lecture (test I) and an essay test after six months (test II), which assessed their short- and long-term knowledge retention, respectively. Students' feedback on the quality and content of the lectures was collected. A statistically significant difference was found between the post-test scores of the traditional and modified lectures (P = 0.019), indicating improved short-term memory after the modified lectures. Similarly, students scored better in test II on the content learnt through the modified lectures, indicating enhanced comprehension and improved long-term memory. Students appreciated learning through multimedia-designed PPT™ and suggested its continued use. It is time to depart from default PPT™ and adopt multimedia principles to enhance comprehension and improve short- and long-term knowledge retention. Further, medical educators may be trained and encouraged to apply multimedia principles in designing and delivering effective lectures.

  12. Regulatory issues with multiplicity in drug approval: Principles and controversies in a changing landscape.

    Science.gov (United States)

    Benda, Norbert; Brandt, Andreas

    2018-01-01

    Recently, new draft guidelines on multiplicity issues in clinical trials have been issued by the European Medicines Agency (EMA) and the Food and Drug Administration (FDA), respectively. Multiplicity is an issue in clinical trials if the probability of a false-positive decision is increased by insufficiently accounting for the testing of multiple hypotheses. We outline the regulatory principles related to multiplicity issues in confirmatory clinical trials intended to support a marketing authorization application in the EU, describe the reasons for the increasing complexity of multiple hypothesis testing, and discuss the specific multiplicity issues that emerge within the regulatory context and are relevant for drug approval.

  13. Principles of a new treatment algorithm in multiple sclerosis

    DEFF Research Database (Denmark)

    Hartung, Hans-Peter; Montalban, Xavier; Sorensen, Per Soelberg

    2011-01-01

    We are entering a new era in the management of patients with multiple sclerosis (MS). The first oral treatment (fingolimod) has now gained US FDA approval, addressing an unmet need for patients with MS who wish to avoid parenteral administration. A second agent (cladribine) is currently being considered for approval. With the arrival of these oral agents, a key question is where they may fit into the existing MS treatment algorithm. This article aims to help answer this question by analyzing the trial data for the new oral therapies, as well as for existing MS treatments, by applying practical clinical experience, and through consideration of our increased understanding of how to define treatment success in MS. This article also provides a speculative look at what the treatment algorithm may look like in 5 years, with the availability of new data, greater experience and, potentially, other novel...

  14. Multiple comparisons in drug efficacy studies: scientific or marketing principles?

    Science.gov (United States)

    Leo, Jonathan

    2004-01-01

    When researchers design an experiment to compare a given medication to another medication, a behavioral therapy, or a placebo, the experiment often involves numerous comparisons. For instance, there may be several different evaluation methods, raters, and time points. Although scientifically justified, such comparisons can be abused in the interests of drug marketing. This article provides two recent examples of such questionable practices. The first involves the arthritis drug celecoxib (Celebrex), where the study lasted 12 months but the authors presented only 6 months of data. The second involves the NIMH Multimodal Treatment Study (MTA), which evaluated the efficacy of stimulant medication for attention-deficit hyperactivity disorder and in which ratings made by several groups are reported in a contradictory fashion. The MTA authors have not clarified the confusion, at least in print, suggesting that the actual findings of the study may have played little role in the authors' reported conclusions.

  15. First-principles study of point-defect production in Si and SiC

    International Nuclear Information System (INIS)

    Windl, W.; Lenosky, T.J.; Kress, J.D.; Voter, A.F.

    1998-03-01

    The authors have calculated the displacement-threshold energy E_d for point-defect production in Si and SiC using empirical potentials, tight-binding, and first-principles methods. They show that, depending on the knock-on direction, 64-atom simulation cells can be sufficient to allow a nearly finite-size-effect-free calculation, thus making the use of first-principles methods possible. They use molecular dynamics (MD) techniques and propose the use of a sudden approximation, which agrees reasonably well with the MD results for selected directions and which allows E_d to be estimated without employing an MD simulation, making the use of computationally demanding first-principles methods practical. Comparing the results with experiment, the authors find the full self-consistent first-principles method in conjunction with the sudden approximation to be a reliable and easy method to predict E_d. Furthermore, they have examined the temperature dependence of E_d for C in SiC and found it to be negligible.

  16. Downscaling remotely sensed imagery using area-to-point cokriging and multiple-point geostatistical simulation

    Science.gov (United States)

    Tang, Yunwei; Atkinson, Peter M.; Zhang, Jingxiong

    2015-03-01

    A cross-scale data integration method was developed and tested based on the theory of geostatistics and multiple-point geostatistics (MPG). The goal was to downscale remotely sensed images while retaining spatial structure by integrating images at different spatial resolutions. During the downscaling process, a rich spatial correlation model in the form of a training image was incorporated to facilitate the reproduction of similar local patterns in the simulated images. Area-to-point cokriging (ATPCK) was used as a locally varying mean (LVM) (i.e., soft data) to deal with the change of support problem (COSP) in cross-scale integration, which MPG cannot achieve alone. Several pairs of spectral bands of remotely sensed images were tested for integration in different cross-scale case studies. The experiments show that MPG can restore the spatial structure of the image at a fine spatial resolution given the training image and conditioning data. The super-resolution image can be predicted using the proposed method, which cannot be realised using most data integration methods. The results show that the ATPCK-MPG approach can achieve greater accuracy than methods which do not account for the change of support issue.

  17. Insight into point defects and impurities in titanium from first principles

    Science.gov (United States)

    Nayak, Sanjeev K.; Hung, Cain J.; Sharma, Vinit; Alpay, S. Pamir; Dongare, Avinash M.; Brindley, William J.; Hebert, Rainer J.

    2018-03-01

    Titanium alloys find extensive use in the aerospace and biomedical industries due to a unique combination of strength, density, and corrosion resistance. Decades of mostly experimental research have led to a large body of knowledge of the processing-microstructure-properties linkages. But much of the existing understanding of the point defects that play a significant role in the mechanical properties of titanium is based on semi-empirical rules. In this work, we present the results of a detailed self-consistent first-principles study developed to determine the formation energies of intrinsic point defects, including vacancies and self-interstitials, and extrinsic point defects, such as interstitial and substitutional impurities/dopants. We find that most elements, regardless of size, prefer substitutional positions, but highly electronegative elements, such as C, N, O, F, S, and Cl, some of which are common impurities in Ti, occupy interstitial positions.

  18. Point defect thermodynamics and diffusion in Fe3C: A first-principles study

    International Nuclear Information System (INIS)

    Chao Jiang; Uberuaga, B.P.; Srinivasan, S.G.

    2008-01-01

    The point defect structure of cementite (Fe3C) is investigated using a combination of the statistical mechanical Wagner-Schottky model and first-principles calculations within the generalized gradient approximation. Large 128-atom supercells are employed to obtain fully converged point defect formation energies. The present study unambiguously shows that carbon vacancies and octahedral carbon interstitials are the structural defects in C-depleted and C-rich cementite, respectively. The dominant thermal defects in C-depleted and stoichiometric cementite are found to be carbon Frenkel pairs. In C-rich cementite, however, the primary thermal excitations are strongly temperature-dependent: interbranch, Schottky and Frenkel defects dominate successively with increasing temperature. Using the nudged elastic band technique, the migration barriers of major point defects in cementite are also determined and compared with available experiments in the literature.

  19. Focal Points Revisited: Team Reasoning, the Principle of Insufficient Reason and Cognitive Hierarchy Theory

    NARCIS (Netherlands)

    Bardsley, N.; Ule, A.

    It is well-established that people can coordinate their behaviour on focal points in games with multiple equilibria, but it is not firmly established how. Much coordination game data might be explained by team reasoning, a departure from individualistic choice theory. However, a less exotic

  20. Turning challenges into design principles: Telemonitoring systems for patients with multiple chronic conditions.

    Science.gov (United States)

    Sultan, Mehwish; Kuluski, Kerry; McIsaac, Warren J; Cafazzo, Joseph A; Seto, Emily

    2018-01-01

    People with multiple chronic conditions often struggle with managing their health. The purpose of this research was to identify specific challenges of patients with multiple chronic conditions and to use the findings to form design principles for a telemonitoring system tailored for these patients. Semi-structured interviews with 15 patients with multiple chronic conditions and 10 clinicians were conducted to gain an understanding of their needs and preferences for a smartphone-based telemonitoring system. The interviews were analyzed using a conventional content analysis technique, resulting in six themes. Design principles developed from the themes included that the system must be modular to accommodate various combinations of conditions, reinforce a routine, consolidate record keeping, as well as provide actionable feedback to the patients. Designing an application for multiple chronic conditions is complex due to variability in patient conditions, and therefore, design principles developed in this study can help with future innovations aimed to help manage this population.

  1. Pressure Points in Reading Comprehension: A Quantile Multiple Regression Analysis

    Science.gov (United States)

    Logan, Jessica

    2017-01-01

    The goal of this study was to examine how selected pressure points or areas of vulnerability are related to individual differences in reading comprehension and whether the importance of these pressure points varies as a function of the level of children's reading comprehension. A sample of 245 third-grade children were given an assessment battery…

  2. Performance analysis of commercial multiple-input-multiple-output access point in distributed antenna system.

    Science.gov (United States)

    Fan, Yuting; Aighobahi, Anthony E; Gomes, Nathan J; Xu, Kun; Li, Jianqiang

    2015-03-23

    In this paper, we experimentally investigate the throughput of IEEE 802.11n 2x2 multiple-input-multiple-output (MIMO) signals in a radio-over-fiber-based distributed antenna system (DAS) with different fiber lengths and power imbalance. Both a MIMO-supported access point (AP) and a spatial-diversity-supported AP were separately employed in the experiments. Throughput measurements were carried out with wireless users at different locations in a typical office environment. Regarding the effect of different fiber lengths, the results indicate that MIMO signals can maintain high throughput when the fiber length difference between the two remote antenna units (RAUs) is under 100 m, and that throughput falls quickly when the length difference is greater. For the spatial diversity signals, high throughput can be maintained even when the difference is 150 m. On the other hand, the separation of the MIMO antennas allows additional freedom in placing the antennas in strategic locations for overall improved system performance, although it may also lead to received power imbalance problems. The results show that the throughput performance drops in specific positions when the received power imbalance is above around 13 dB. Hence, there is a trade-off between the extent of the wireless coverage for moderate bit-rates and the area over which peak bit-rates can be achieved.

  3. Operating principle of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: • Two control modes were developed for a B2B VSC-based SOP. • The SOP operating principle was investigated under various network conditions. • The performance of the SOP using the two control modes was analyzed. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. Two control modes were developed for the operation of an SOP using back-to-back voltage-source converters (VSCs). A power flow control mode with current control provides independent control of real and reactive power. A supply restoration mode with a voltage controller enables power supply to loads isolated by network faults. The operating principle of the back-to-back VSC based SOP was investigated under both normal and abnormal network operating conditions. Studies on a two-feeder medium-voltage distribution network showed the performance of the SOP under different network operating conditions: normal, during a fault, and post-fault supply restoration. During changes of network operating conditions, a mode-switch method based on the phase-locked-loop controller was used to achieve transitions between the two control modes. Hard transitions by direct mode switching were found to be unfavourable, but seamless transitions were obtained by deploying a soft cold-load pickup and voltage synchronization process.
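    A highly simplified supervisory sketch of the two SOP operating modes described above. The actual device uses back-to-back VSC current/voltage control loops and PLL-based synchronization; here those details are reduced to an enum and a switch rule, purely to illustrate the operating principle. All names, references and thresholds are assumptions.

```python
# Supervisory view only: which control mode is active and which references it uses.
from enum import Enum, auto

class SopMode(Enum):
    POWER_FLOW_CONTROL = auto()   # normal operation: control P and Q independently
    SUPPLY_RESTORATION = auto()   # feeder fault: supply the isolated load section

def select_mode(feeder_healthy: bool) -> SopMode:
    """Choose the SOP control mode from the upstream feeder status."""
    return SopMode.POWER_FLOW_CONTROL if feeder_healthy else SopMode.SUPPLY_RESTORATION

def setpoints(mode: SopMode, p_ref_kw: float, q_ref_kvar: float, v_ref_pu: float):
    """Return the references handed to the converter controllers in each mode."""
    if mode is SopMode.POWER_FLOW_CONTROL:
        # current-controlled mode: track active/reactive power references
        return {"P_kW": p_ref_kw, "Q_kvar": q_ref_kvar}
    # voltage-controlled mode: hold voltage and frequency for the islanded loads
    # (ramped up gradually in the real scheme via soft cold-load pickup)
    return {"V_pu": v_ref_pu, "f_Hz": 50.0}

mode = select_mode(feeder_healthy=False)
print(mode, setpoints(mode, p_ref_kw=200.0, q_ref_kvar=50.0, v_ref_pu=1.0))
```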

  4. A MOSUM procedure for the estimation of multiple random change points

    OpenAIRE

    Eichinger, Birte; Kirch, Claudia

    2018-01-01

    In this work, we investigate statistical properties of change point estimators based on moving sum statistics. We extend results for testing in a classical situation with multiple deterministic change points by allowing for random exogenous change points that arise in Hidden Markov or regime switching models among others. To this end, we consider a multiple mean change model with possible time series errors and prove that the number and location of change points are estimated consistently by ...
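    A hedged sketch of the moving sum (MOSUM) idea underlying the estimators above: slide a window of bandwidth G over the series, compare the mean of the G observations to the right of each time point with the mean of the G observations to the left, and report local maxima of the standardized difference that exceed a threshold. The ad-hoc threshold and noise-scale estimate are illustrative assumptions, not the paper's asymptotic critical values.

```python
# MOSUM statistic for multiple mean changes, with a simple local-maximum rule.
import numpy as np

def mosum_statistic(x, G):
    x = np.asarray(x, dtype=float)
    n = len(x)
    stat = np.zeros(n)
    sigma = np.std(np.diff(x)) / np.sqrt(2)          # rough noise-scale estimate
    for k in range(G, n - G):
        diff = x[k:k + G].mean() - x[k - G:k].mean()
        stat[k] = np.sqrt(G / 2.0) * abs(diff) / sigma
    return stat

def estimate_change_points(x, G, threshold=3.0):
    stat = mosum_statistic(x, G)
    cps = []
    for k in range(G, len(x) - G):
        window = stat[max(0, k - G):k + G + 1]
        if stat[k] >= threshold and stat[k] == window.max():   # local maximum
            cps.append(k)
    return cps

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(2, 1, 200), rng.normal(0.5, 1, 200)])
print(estimate_change_points(x, G=50))   # estimates expected near 200 and 400
```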

  5. Multiple Monte Carlo Testing with Applications in Spatial Point Processes

    DEFF Research Database (Denmark)

    Mrkvička, Tomáš; Myllymäki, Mari; Hahn, Ute

    with a function as the test statistic, 3) several Monte Carlo tests with functions as test statistics. The rank test has correct (global) type I error in each case and is accompanied by a p-value and by a graphical interpretation which shows which subtest, or which distances of the used test function(s), lead to rejection at the prescribed significance level of the test. Examples of null hypotheses from point process and random set statistics are used to demonstrate the strength of the rank envelope test. The examples include a goodness-of-fit test with several test functions, goodness-of-fit test...

  6. Effective communication at the point of multiple sclerosis diagnosis.

    Science.gov (United States)

    Solari, Alessandra

    2014-04-01

    As a consequence of the current shortened diagnostic workup, people with multiple sclerosis (PwMS) are rapidly confronted with a disease of uncertain prognosis that requires complex treatment decisions. This paper reviews studies that have assessed the experiences of PwMS in the peri-diagnostic period and have evaluated the efficacy of interventions providing information at this critical moment. The studies found that the emotional burden on PwMS at diagnosis was high, and emphasised the need for careful monitoring and management of mood symptoms (chiefly anxiety). Information provision did not affect anxiety symptoms but improved patients' knowledge of their condition, the achievement of 'informed choice', and satisfaction with the diagnosis communication. It is vital to develop and implement information and decision aids for PwMS, but this is resource intensive, and international collaboration may be a way forward. The use of patient self-assessed outcome measures that appraise the quality of diagnosis communication is also important to allow health services to understand and meet the needs and preferences of PwMS.

  7. A large deviation principle in Hölder norm for multiple fractional integrals

    OpenAIRE

    Sanz-Solé, Marta; Torrecilla-Tarantino, Iván

    2007-01-01

    For a fractional Brownian motion B^H with Hurst parameter H ∈ ]1/4, 1/2[ ∪ ]1/2, 1[, multiple indefinite integrals on a simplex are constructed and the regularity of their sample paths is studied. Then, it is proved that the family of probability laws of the processes obtained by replacing B^H by ε^{1/2} B^H satisfies a large deviation principle in Hölder norm. The definition of the multiple integrals relies upon a representation of the fractional Brownian motion in t...

  8. Point defects in hexagonal germanium carbide monolayer: A first-principles calculation

    International Nuclear Information System (INIS)

    Ersan, Fatih; Gökçe, Aytaç Gürhan; Aktürk, Ethem

    2016-01-01

    Highlights: • Semiconductor GeC turns into a metal by introducing a carbon vacancy. • Semiconductor GeC becomes a half-metal by a single Ge vacancy. • The band gap of the GeC system can be tuned in the range of 0.308–1.738 eV by antisite or Stone–Wales defects. - Abstract: On the basis of first-principles plane-wave calculations, we investigated the electronic and magnetic properties of various point defects, including single Ge and C vacancies, the Ge + C divacancy, Ge↔C antisites and the Stone–Wales (SW) defects, in a GeC monolayer. We found that various periodic vacancy defects in the GeC single layer have crucial effects on the electronic and magnetic properties. The band gaps of the GeC monolayer vary significantly, from 0.308 eV to 1.738 eV, due to the presence of antisites and Stone–Wales defects. While the nonmagnetic ground state of semiconducting GeC turns into a metal upon introducing a carbon vacancy, it becomes a half-metal with a single Ge vacancy, with a high magnetization of 4 μ_B per supercell. All the vacancy types have zero net magnetic moment, except the single Ge vacancy.

  9. Point defects in hexagonal germanium carbide monolayer: A first-principles calculation

    Energy Technology Data Exchange (ETDEWEB)

    Ersan, Fatih [Department of Physics, Adnan Menderes University, 09100 Aydın (Turkey); Gökçe, Aytaç Gürhan [Department of Physics, Adnan Menderes University, 09100 Aydın (Turkey); Department of Physics, Dokuz Eylül University, 35160 İzmir (Turkey); Aktürk, Ethem, E-mail: ethem.akturk@adu.edu.tr [Department of Physics, Adnan Menderes University, 09100 Aydın (Turkey); Nanotechnology Application and Research Center, Adnan Menderes University, 09100 Aydın (Turkey)

    2016-12-15

    Highlights: • Semiconductor GeC turns into a metal by introducing a carbon vacancy. • Semiconductor GeC becomes a half-metal by a single Ge vacancy. • The band gap of the GeC system can be tuned in the range of 0.308–1.738 eV by antisite or Stone–Wales defects. - Abstract: On the basis of first-principles plane-wave calculations, we investigated the electronic and magnetic properties of various point defects, including single Ge and C vacancies, the Ge + C divacancy, Ge↔C antisites and the Stone–Wales (SW) defects, in a GeC monolayer. We found that various periodic vacancy defects in the GeC single layer have crucial effects on the electronic and magnetic properties. The band gaps of the GeC monolayer vary significantly, from 0.308 eV to 1.738 eV, due to the presence of antisites and Stone–Wales defects. While the nonmagnetic ground state of semiconducting GeC turns into a metal upon introducing a carbon vacancy, it becomes a half-metal with a single Ge vacancy, with a high magnetization of 4 μ_B per supercell. All the vacancy types have zero net magnetic moment, except the single Ge vacancy.

  10. First-principles investigation of the energetics of point defects at a grain boundary in tungsten

    Energy Technology Data Exchange (ETDEWEB)

    Chai, Jun; Li, Yu-Hao; Niu, Liang-Liang; Qin, Shi-Yao; Zhou, Hong-Bo, E-mail: hbzhou@buaa.edu.cn; Jin, Shuo; Zhang, Ying; Lu, Guang-Hong

    2017-02-15

    Tungsten (W) and W alloys are considered the most promising candidates for plasma-facing materials in future fusion reactors. Grain boundaries (GBs) play an important role in the self-healing of irradiation defects in W. Here, we investigate the stability of point defects [the vacancy and the self-interstitial atom (SIA)] in a Σ5(3 1 0) [0 0 1] tilt W GB by calculating the energetics using a first-principles method. It is found that both the vacancy and the SIA are energetically favorable to locate at sites neighboring the GB, suggesting that the vacancy and SIA can easily segregate to the GB region, with segregation energies of 1.53 eV and 7.5 eV, respectively. This can be attributed to the special atomic configuration and large available space of the GB. The effective interaction distance between the GB and the SIA is ∼6.19 Å, which is ∼2 Å larger than that between the GB and the vacancy, indicating that the SIA is more likely than the vacancy to locate at the GB. Further, the binding energy of di-vacancies in the W GB is much larger than that in bulk W, suggesting that vacancies energetically prefer to congregate in the GB.

  11. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Science.gov (United States)

    2010-02-24

    ... (HACCP); Approval of Information Collection Request AGENCY: Food and Nutrition Service, USDA. ACTION... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... must be based on the (HACCP) system established by the Secretary of Agriculture. The food safety...

  12. Error analysis of dimensionless scaling experiments with multiple points using linear regression

    International Nuclear Information System (INIS)

    Guercan, Oe.D.; Vermare, L.; Hennequin, P.; Bourdelle, C.

    2010-01-01

    A general method of error estimation in the case of multiple point dimensionless scaling experiments, using linear regression and standard error propagation, is proposed. The method reduces to the previous result of Cordey (2009 Nucl. Fusion 49 052001) in the case of a two-point scan. On the other hand, if the points follow a linear trend, it explains how the estimated error decreases as more points are added to the scan. Based on the analytical expression that is derived, it is argued that for a low number of points, adding points to the ends of the scanned range, rather than the middle, results in a smaller error estimate. (letter)
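    An illustrative sketch of the basic calculation behind the error analysis described above: in a multi-point dimensionless scan, the scaling exponent is the slope of a straight-line fit in log-log space, and its uncertainty follows from standard linear-regression error propagation. The synthetic data and the simple unweighted fit are assumptions for demonstration, not the letter's full treatment.

```python
# Fit a power law in log-log space and report the slope with its standard error.
import numpy as np

def scaling_exponent(x_dimensionless, y_observable):
    """Return (slope, standard error of slope) of the log-log linear fit."""
    lx, ly = np.log(x_dimensionless), np.log(y_observable)
    n = len(lx)
    A = np.vstack([lx, np.ones(n)]).T
    coef, residuals, _, _ = np.linalg.lstsq(A, ly, rcond=None)
    slope = coef[0]
    dof = n - 2
    s2 = residuals[0] / dof if residuals.size else 0.0     # residual variance
    se_slope = np.sqrt(s2 / np.sum((lx - lx.mean()) ** 2))  # Var(slope) = s^2 / Sxx
    return slope, se_slope

# Synthetic 5-point scan: y ~ x^0.7 with 5% multiplicative noise.
rng = np.random.default_rng(3)
x = np.array([1.0, 1.5, 2.2, 3.3, 5.0])
y = x ** 0.7 * np.exp(0.05 * rng.standard_normal(5))
print(scaling_exponent(x, y))   # slope close to 0.7, with its standard error
```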

  13. Exact multiple scattering theory of two-nucleus collisions including the Pauli principle

    International Nuclear Information System (INIS)

    Gurvitz, S.A.

    1981-01-01

    Exact equations for two-nucleus scattering are derived in which the effects of the Pauli principle are fully included. Our method exploits a modified equation for the scattering of two identical nucleons, which is obtained at the beginning. Considering proton-nucleus scattering we found that the resulting amplitude has two components, one resembling a multiple scattering series for distinguishable particles, and the other a distorted (A-1) nucleon cluster exchange. For elastic pA scattering the multiple scattering amplitude is found in the form of an optical potential expansion. We show that the Kerman-McManus-Thaler theory of the optical potential could be easily modified to include the effects of antisymmetrization of the projectile with the target nucleons. Nucleus-nucleus scattering is studied first for distinguishable target and beam nucleus. Afterwards the Pauli principle is included, where only the case of deuteron-nucleus scattering is discussed in detail. The resulting amplitude has four components. Two of them correspond to modified multiple scattering expansions and the others are distorted (A-1)- and (A-2)- nucleon cluster exchange. The result for d-A scattering is extended to the general case of nucleus-nucleus scattering. The equations are simple to use and as such constitute an improvement over existing schemes

  14. History Matching Through a Smooth Formulation of Multiple-Point Statistics

    DEFF Research Database (Denmark)

    Melnikova, Yulia; Zunino, Andrea; Lange, Katrine

    2014-01-01

    We propose a smooth formulation of multiple-point statistics that enables us to solve inverse problems using gradient-based optimization techniques. We introduce a differentiable function that quantifies the mismatch between multiple-point statistics of a training image and of a given model. We show that, by minimizing this function, any continuous image can be gradually transformed into an image that honors the multiple-point statistics of the discrete training image. The solution to an inverse problem is then found by minimizing the sum of two mismatches: the mismatch with data and the mismatch with multiple-point statistics. As a result, in the framework of the Bayesian approach, such a solution belongs to a high posterior region. The methodology, while applicable to any inverse problem with a training-image-based prior, is especially beneficial for problems which require expensive...

  15. Estimation of Multiple Point Sources for Linear Fractional Order Systems Using Modulating Functions

    KAUST Repository

    Belkhatir, Zehor; Laleg-Kirati, Taous-Meriem

    2017-01-01

    This paper proposes an estimation algorithm for the characterization of multiple point inputs for linear fractional order systems. First, using the polynomial modulating functions method and a suitable change of variables, the problem of estimating the locations and the amplitudes of a multi-pointwise input is decoupled into two algebraic systems of equations.

  16. The fundamental principles of the physical protection, the group of six point of view

    International Nuclear Information System (INIS)

    Claeys, M.; Carnas, L.; Robeyns, G.; Rommevaux, G.; Venot, R.; Hagemann, A.; Fontaneda Gonzalez, A.; Gimenez Gonzalez, S.; Isaksson, S.G.; Wager, K.; Price, C.

    2001-01-01

    This paper presents the joint experience of the Group of Six in the field of physical protection against the theft or unauthorized removal of nuclear material and against the sabotage of nuclear material and nuclear facilities, which emerged from the joint discussion. Several fundamental principles stem from this experience. Of course the particular terms and conditions of the implementation of these principles are specific to each country. (authors)

  17. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems.

    Science.gov (United States)

    Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray

    2014-05-13

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
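
    For orientation, the constrained maximization referred to above reads, in its textbook Boltzmann-Gibbs-Shannon form (the paper's generalized (c,d)-entropies replace the entropy functional S[p] but keep the same variational structure),

      \max_{p}\; S[p] = -\sum_{i} p_{i} \ln p_{i} \quad \text{s.t.} \quad \sum_{i} p_{i} = 1,\;\; \sum_{i} p_{i} E_{i} = \langle E \rangle

    whose solution is the familiar exponential family p_{i} \propto e^{-\lambda E_{i}}.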

  18. Common Fixed Points of Generalized Rational Type Cocyclic Mappings in Multiplicative Metric Spaces

    Directory of Open Access Journals (Sweden)

    Mujahid Abbas

    2015-01-01

    The aim of this paper is to present a fixed point result for mappings satisfying a generalized rational contractive condition in the setup of multiplicative metric spaces. As an application, we obtain a common fixed point of a pair of weakly compatible mappings. Some common fixed point results for pairs of rational contractive type mappings involved in a cocyclic representation of a nonempty subset of a multiplicative metric space are also obtained. Some examples are presented to support the results proved herein. Our results generalize and extend various results in the existing literature.
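
    For readers unfamiliar with the setting, the defining feature of a multiplicative metric d on a set X (given here as commonly used background, not as a definition taken from this paper) is that the usual triangle inequality is replaced by a multiplicative one:

      d(x,y) \ge 1,\quad d(x,y) = 1 \iff x = y,\quad d(x,y) = d(y,x),\quad d(x,z) \le d(x,y)\, d(y,z)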

  19. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    Science.gov (United States)

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    The article considers the quality control and safety system implemented at one of the largest flight catering food production plants serving airline passengers and flight crews. The control system is based on the Hazard Analysis and Critical Control Points (HACCP) principles and on the hygienic and anti-epidemic measures developed at the plant. The identification of hazard factors at each stage of the technological process is considered. Monitoring data for 6 critical control points over a five-year period are analyzed. The quality control and safety system reduces the risk of food contamination during the acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated, and further ways of harmonizing and implementing the HACCP principles at the plant are identified.

  20. Parallel point-multiplication architecture using combined group operations for high-speed cryptographic applications.

    Directory of Open Access Journals (Sweden)

    Md Selim Hossain

    In this paper, we propose a novel parallel architecture for fast hardware implementation of elliptic curve point multiplication (ECPM), which is the key operation of an elliptic curve cryptography processor. The point multiplication over binary fields is synthesized on both FPGA and ASIC technology by designing fast elliptic curve group operations in Jacobian projective coordinates. A novel combined point doubling and point addition (PDPA) architecture is proposed for group operations to achieve high speed and low hardware requirements for ECPM. It has been implemented over the binary field which is recommended by the National Institute of Standards and Technology (NIST). The proposed ECPM supports both Koblitz and random curves for key sizes of 233 and 163 bits. For group operations, a finite-field arithmetic operation, e.g. multiplication, is designed on a polynomial basis. The delay of a 233-bit point multiplication is only 3.05 and 3.56 μs, in a Xilinx Virtex-7 FPGA, for Koblitz and random curves, respectively, and 0.81 μs in an ASIC 65-nm technology, which are the fastest hardware implementation results reported in the literature to date. In addition, a 163-bit point multiplication is also implemented in FPGA and ASIC for fair comparison, which takes around 0.33 and 0.46 μs, respectively. The area-time product of the proposed point multiplication is very low compared to similar designs. The performance ([Formula: see text]) and the Area × Time × Energy (ATE) product of the proposed design are far better than those of the most significant studies found in the literature.
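
    The record describes a hardware design over binary fields in Jacobian projective coordinates; none of that is reproduced here. Purely as an illustration of what "point multiplication" computes, the sketch below runs the textbook double-and-add algorithm on a toy short-Weierstrass curve over a small prime field (the curve, prime and base point are made-up toy values, not the NIST B-233/K-233 parameters used in the paper).

      # Toy double-and-add elliptic curve point multiplication (affine coordinates,
      # small prime field). Illustrative only -- real designs such as the one in the
      # abstract use NIST binary-field curves and projective coordinates.

      P_MOD = 97            # toy field prime (assumption, not a NIST parameter)
      A, B = 2, 3           # toy curve y^2 = x^3 + A*x + B over GF(P_MOD)
      INF = None            # point at infinity

      def inv_mod(x, p=P_MOD):
          return pow(x, p - 2, p)  # Fermat inverse, valid since p is prime

      def point_add(P, Q):
          if P is INF: return Q
          if Q is INF: return P
          (x1, y1), (x2, y2) = P, Q
          if x1 == x2 and (y1 + y2) % P_MOD == 0:
              return INF                                        # P + (-P) = O
          if P == Q:
              lam = (3 * x1 * x1 + A) * inv_mod(2 * y1) % P_MOD  # tangent slope
          else:
              lam = (y2 - y1) * inv_mod(x2 - x1) % P_MOD         # chord slope
          x3 = (lam * lam - x1 - x2) % P_MOD
          y3 = (lam * (x1 - x3) - y1) % P_MOD
          return (x3, y3)

      def point_mul(k, P):
          """Left-to-right double-and-add: computes k*P."""
          R = INF
          for bit in bin(k)[2:]:
              R = point_add(R, R)              # doubling in every iteration
              if bit == '1':
                  R = point_add(R, P)          # addition when the key bit is set
          return R

      if __name__ == "__main__":
          G = (3, 6)   # on the toy curve: 3^3 + 2*3 + 3 = 36 = 6^2 (mod 97)
          print(point_mul(13, G))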

  1. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    Science.gov (United States)

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  2. Improving the Pattern Reproducibility of Multiple-Point-Based Prior Models Using Frequency Matching

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2014-01-01

    Some multiple-point-based sampling algorithms, such as the snesim algorithm, rely on sequential simulation. The conditional probability distributions that are used for the simulation are based on statistics of multiple-point data events obtained from a training image. During the simulation, data events with zero probability in the training image statistics may occur. This is handled by pruning the set of conditioning data until an event with non-zero probability is found. The resulting probability distribution sampled by such algorithms is a pruned mixture model. The pruning strategy leads to a probability distribution that lacks some of the information provided by the multiple-point statistics from the training image, which reduces the reproducibility of the training image patterns in the outcome realizations. When pruned mixture models are used as prior models for inverse problems, local re...
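
    A minimal sketch of the pruning step described above, under the assumption that the training-image statistics are stored as counts of conditioning data events (the dictionary layout and helper name below are illustrative, not the snesim implementation):

      # Sketch of the "prune until a non-zero-probability event is found" step.
      # training_counts maps (data event, central facies) to the number of times the
      # combination occurs in the training image; a data event is a tuple of
      # (relative_position, facies_value) pairs ordered nearest-first.

      def conditional_distribution(data_event, training_counts, facies=(0, 1)):
          """Return P(central facies | data event), pruning the farthest
          conditioning datum until the event has non-zero support."""
          event = list(data_event)                        # nearest conditioning data first
          while event:
              key = tuple(event)
              counts = [training_counts.get((key, f), 0) for f in facies]
              total = sum(counts)
              if total > 0:                               # event seen in the training image
                  return [c / total for c in counts]
              event.pop()                                 # prune the farthest datum and retry
          # Empty conditioning set: fall back to the marginal facies proportions.
          marg = [training_counts.get(((), f), 0) for f in facies]
          t = sum(marg)
          return [m / t for m in marg] if t else [1.0 / len(facies)] * len(facies)

      # Toy usage: one conditioning node ("east neighbour has facies 1").
      counts = {
          ((((0, 1), 1),), 1): 8,   # event seen with central facies 1 eight times
          ((((0, 1), 1),), 0): 2,   # ... and with central facies 0 twice
          ((), 0): 5, ((), 1): 5,   # marginal facies counts
      }
      print(conditional_distribution((((0, 1), 1),), counts))   # -> [0.2, 0.8]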

  3. A feature point identification method for positron emission particle tracking with multiple tracers

    Energy Technology Data Exchange (ETDEWEB)

    Wiggins, Cody, E-mail: cwiggin2@vols.utk.edu [University of Tennessee-Knoxville, Department of Physics and Astronomy, 1408 Circle Drive, Knoxville, TN 37996 (United States); Santos, Roque [University of Tennessee-Knoxville, Department of Nuclear Engineering (United States); Escuela Politécnica Nacional, Departamento de Ciencias Nucleares (Ecuador); Ruggles, Arthur [University of Tennessee-Knoxville, Department of Nuclear Engineering (United States)

    2017-01-21

    A novel detection algorithm for Positron Emission Particle Tracking (PEPT) with multiple tracers based on optical feature point identification (FPI) methods is presented. This new method, the FPI method, is compared to a previous multiple PEPT method via analyses of experimental and simulated data. The FPI method outperforms the older method in cases of large particle numbers and fine time resolution. Simulated data show the FPI method to be capable of identifying 100 particles at 0.5 mm average spatial error. Detection error is seen to vary with the inverse square root of the number of lines of response (LORs) used for detection and increases as particle separation decreases. - Highlights: • A new approach to positron emission particle tracking is presented. • Using optical feature point identification analogs, multiple particle tracking is achieved. • Method is compared to previous multiple particle method. • Accuracy and applicability of method is explored.
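
    The error scaling reported in the highlights can be written compactly as

      \sigma_{\mathrm{position}} \approx \frac{c}{\sqrt{N_{\mathrm{LOR}}}}

    with a proportionality constant c that, per the abstract, grows as the particle separation decreases; halving the location error therefore requires roughly four times as many lines of response per particle per time slice.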

  4. Reduction of bias in neutron multiplicity assay using a weighted point model

    Energy Technology Data Exchange (ETDEWEB)

    Geist, W. H. (William H.); Krick, M. S. (Merlyn S.); Mayo, D. R. (Douglas R.)

    2004-01-01

    Accurate assay of most common plutonium samples was the development goal for the nondestructive assay technique of neutron multiplicity counting. Over the past 20 years the technique has been proven for relatively pure oxides and small metal items. Unfortunately, the technique results in large biases when assaying large metal items. Limiting assumptions, such as uniform multiplication, in the point model used to derive the multiplicity equations cause these biases for large dense items. A weighted point model has been developed to overcome some of the limitations in the standard point model. Weighting factors are determined from Monte Carlo calculations using the MCNPX code. Monte Carlo calculations give the dependence of the weighting factors on sample mass and geometry, and simulated assays using Monte Carlo give the theoretical accuracy of the weighted-point-model assay. Measured multiplicity data evaluated with both the standard and weighted point models are compared to reference values to give the experimental accuracy of the assay. Initial results show significant promise for the weighted point model in reducing or eliminating biases in the neutron multiplicity assay of metal items. The negative biases observed in the assay of plutonium metal samples are caused by variations in the neutron multiplication for neutrons originating in various locations in the sample. The bias depends on the mass and shape of the sample and on the amount and energy distribution of the (α,n) neutrons in the sample. When the standard point model is used, this variable-multiplication bias overestimates the multiplication and alpha values of the sample, and underestimates the plutonium mass. The weighted point model potentially can provide assay accuracy of ~2% (1 σ) for cylindrical plutonium metal samples < 4 kg with α < 1 without knowing the exact shape of the samples, provided that the (α,n) source is uniformly distributed throughout the sample.

  5. Method of Fusion Diagnosis for Dam Service Status Based on Joint Distribution Function of Multiple Points

    Directory of Open Access Journals (Sweden)

    Zhenxiang Jiang

    2016-01-01

    The traditional methods of diagnosing dam service status are generally suited to a single measuring point. Such methods reflect only the local status of a dam and do not merge multisource data effectively, so they are not suitable for diagnosing the overall service status. This study proposes a new multiple-point method for diagnosing dam service status based on a joint distribution function. The joint distribution function of the monitoring data from multiple points can be established with a t-copula function. The possibility, which is an important fused value for different measuring combinations, can then be calculated, and the corresponding diagnostic criterion is established with classical small-probability theory. An engineering case study indicates that the fusion diagnosis method can be conducted in real time and that abnormal points can be detected, thereby providing a new early-warning method for engineering safety.

  6. Determination of shell correction energies at saddle point using pre-scission neutron multiplicities

    International Nuclear Information System (INIS)

    Golda, K.S.; Saxena, A.; Mittal, V.K.; Mahata, K.; Sugathan, P.; Jhingan, A.; Singh, V.; Sandal, R.; Goyal, S.; Gehlot, J.; Dhal, A.; Behera, B.R.; Bhowmik, R.K.; Kailas, S.

    2013-01-01

    Pre-scission neutron multiplicities have been measured for the ¹²C + ¹⁹⁴,¹⁹⁸Pt systems at matching excitation energies in the near-Coulomb-barrier region. A statistical model analysis with a modified fission barrier and level density prescription has been carried out to fit the measured pre-scission neutron multiplicities and the available evaporation residue and fission cross sections simultaneously, in order to constrain the statistical model parameters. Simultaneous fitting of the pre-scission neutron multiplicities and the cross section data requires a shell correction at the saddle point.

  7. First-Principles Prediction of Spin-Polarized Multiple Dirac Rings in Manganese Fluoride

    Science.gov (United States)

    Jiao, Yalong; Ma, Fengxian; Zhang, Chunmei; Bell, John; Sanvito, Stefano; Du, Aijun

    2017-07-01

    Spin-polarized materials with Dirac features have sparked great scientific interest due to their potential applications in spintronics. But such a type of structure is very rare and none has been fabricated. Here, we investigate the already experimentally synthesized manganese fluoride (MnF3 ) as a novel spin-polarized Dirac material by using first-principles calculations. MnF3 exhibits multiple Dirac cones in one spin orientation, while it behaves like a large gap semiconductor in the other spin channel. The estimated Fermi velocity for each cone is of the same order of magnitude as that in graphene. The 3D band structure further reveals that MnF3 possesses rings of Dirac nodes in the Brillouin zone. Such a spin-polarized multiple Dirac ring feature is reported for the first time in an experimentally realized material. Moreover, similar band dispersions can be also found in other transition metal fluorides (e.g., CoF3 , CrF3 , and FeF3 ). Our results highlight a new interesting single-spin Dirac material with promising applications in spintronics and information technologies.

  8. First-Principles Prediction of Spin-Polarized Multiple Dirac Rings in Manganese Fluoride.

    Science.gov (United States)

    Jiao, Yalong; Ma, Fengxian; Zhang, Chunmei; Bell, John; Sanvito, Stefano; Du, Aijun

    2017-07-07

    Spin-polarized materials with Dirac features have sparked great scientific interest due to their potential applications in spintronics. But such a type of structure is very rare and none has been fabricated. Here, we investigate the already experimentally synthesized manganese fluoride (MnF3) as a novel spin-polarized Dirac material by using first-principles calculations. MnF3 exhibits multiple Dirac cones in one spin orientation, while it behaves like a large gap semiconductor in the other spin channel. The estimated Fermi velocity for each cone is of the same order of magnitude as that in graphene. The 3D band structure further reveals that MnF3 possesses rings of Dirac nodes in the Brillouin zone. Such a spin-polarized multiple Dirac ring feature is reported for the first time in an experimentally realized material. Moreover, similar band dispersions can be also found in other transition metal fluorides (e.g., CoF3, CrF3, and FeF3). Our results highlight a new interesting single-spin Dirac material with promising applications in spintronics and information technologies.

  9. [Anatomical key points and operative principle of "two planes and four landmarks" in extralevator abdominoperineal excision].

    Science.gov (United States)

    Ye, Yingjiang; Shen, Zhanlong; Wang, Shan

    2014-11-01

    Abdominoperineal resection (APR) is the main approach to the treatment of lower rectal cancer. It has been found that conventional APR has a higher incidence of positive circumferential resection margins (CRM) and of intraoperative perforation (IOP), which are crucial reasons for local recurrence and poor prognosis. The extralevator abdominoperineal excision (ELAPE) procedure was proposed by European panels of surgeons, radiologists and pathologists, and is considered to lower the rates of positive CRM and IOP. Well-defined surgical planes and anatomical landmarks are the core of this procedure and the prerequisite for a safe and smooth operation. Understanding the anatomy of the muscles, fasciae, blood vessels and nerves of the perineal region is the basis for carrying out the ELAPE procedure. In this paper, we introduce the key anatomy related to the ELAPE procedure and summarize its principle as "two planes and four landmarks", which should benefit the popularization and application of the procedure.

  10. 77 FR 34211 - Modification of Multiple Compulsory Reporting Points; Continental United States, Alaska and Hawaii

    Science.gov (United States)

    2012-06-11

    ... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration 14 CFR Part 71 [Docket No. FAA-2012-0130; Airspace Docket No. 12-AWA-2] RIN 2120-AA66 Modification of Multiple Compulsory Reporting Points; Continental United States, Alaska and Hawaii AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Final...

  11. Derivation of the blackbody radiation spectrum from the equivalence principle in classical physics with classical electromagnetic zero-point radiation

    International Nuclear Information System (INIS)

    Boyer, T.H.

    1984-01-01

    A derivation of Planck's spectrum including zero-point radiation is given within classical physics from recent results involving the thermal effects of acceleration through classical electromagnetic zero-point radiation. A harmonic electric-dipole oscillator undergoing a uniform acceleration a through classical electromagnetic zero-point radiation responds as would the same oscillator in an inertial frame when not in zero-point radiation but in a different spectrum of random classical radiation. Since the equivalence principle tells us that the oscillator supported in a gravitational field g = -a will respond in the same way, we see that in a gravitational field we can construct a perpetual-motion machine based on this different spectrum unless the different spectrum corresponds to that of thermal equilibrium at a finite temperature. Therefore, assuming the absence of perpetual-motion machines of the first kind in a gravitational field, we conclude that the response of an oscillator accelerating through classical zero-point radiation must be that of a thermal system. This then determines the blackbody radiation spectrum in an inertial frame which turns out to be exactly Planck's spectrum including zero-point radiation

  12. Photonic crystals possessing multiple Weyl points and the experimental observation of robust surface states

    Science.gov (United States)

    Chen, Wen-Jie; Xiao, Meng; Chan, C. T.

    2016-01-01

    Weyl points, as monopoles of Berry curvature in momentum space, have captured much attention recently in various branches of physics. Realizing topological materials that exhibit such nodal points is challenging and indeed, Weyl points have been found experimentally in transition metal arsenide and phosphide and gyroid photonic crystal whose structure is complex. If realizing even the simplest type of single Weyl nodes with a topological charge of 1 is difficult, then making a real crystal carrying higher topological charges may seem more challenging. Here we design, and fabricate using planar fabrication technology, a photonic crystal possessing single Weyl points (including type-II nodes) and multiple Weyl points with topological charges of 2 and 3. We characterize this photonic crystal and find nontrivial 2D bulk band gaps for a fixed kz and the associated surface modes. The robustness of these surface states against kz-preserving scattering is experimentally observed for the first time. PMID:27703140

  13. Generating and executing programs for a floating point single instruction multiple data instruction set architecture

    Science.gov (United States)

    Gschwind, Michael K

    2013-04-16

    Mechanisms for generating and executing programs for a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA) are provided. A computer program product comprising a computer recordable medium having a computer readable program recorded thereon is provided. The computer readable program, when executed on a computing device, causes the computing device to receive one or more instructions and execute the one or more instructions using logic in an execution unit of the computing device. The logic implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA), based on data stored in a vector register file of the computing device. The vector register file is configured to store both scalar and floating point values as vectors having a plurality of vector elements.

  14. A Point Kinetics Model for Estimating Neutron Multiplication of Bare Uranium Metal in Tagged Neutron Measurements

    International Nuclear Information System (INIS)

    Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.

    2017-01-01

    An extension of the point kinetics model is developed in this paper to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. Finally, the spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.

  15. A Point Kinetics Model for Estimating Neutron Multiplication of Bare Uranium Metal in Tagged Neutron Measurements

    Science.gov (United States)

    Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.

    2017-07-01

    An extension of the point kinetics model is developed to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. The spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.

  16. Molecular dynamics of polarizable point dipole models for molten NaI. Comparison with first principles simulations

    Directory of Open Access Journals (Sweden)

    Trullàs J.

    2011-05-01

    Molecular dynamics simulations of molten NaI at 995 K have been carried out using polarizable ion models based on rigid ion pair potentials to which the anion induced dipole polarization is added. The polarization is added in such a way that point dipoles are induced on the anions both by the local electric field and by deformation short-range damping interactions that oppose the electrically induced dipole moments. The structure and self-diffusion results are compared with those obtained by Galamba and Costa Cabral using first-principles Hellmann-Feynman molecular dynamics simulations and using classical molecular dynamics of a shell model which allows only the iodide polarization.

  17. Intrinsic point defects in inorganic perovskite CsPbI3 from first-principles prediction

    KAUST Repository

    Li, Yifan

    2017-10-19

    Cubic inorganic perovskite CsPbI3 is a direct bandgap semiconductor, which is promising for optoelectronic applications, such as solar cells, light emitting diodes, and lasers. The intrinsic defects in semiconductors play crucial roles in determining carrier conductivity, the efficiency of carrier recombination, and so on. However, the thermodynamic stability and intrinsic defect physics are still unclear for cubic CsPbI3. By using the first-principles calculations, we study the thermodynamic process and find out that the window for CsPbI3 growth is quite narrow and the concentration of Cs is important for cubic CsPbI3 growth. Under Pb-rich conditions, VPb and VI can pin the Fermi energy in the middle of the bandgap, which results in a low carrier concentration. Under Pb-poor conditions, VPb is the dominant defect and the material has a high concentration of hole carriers with a long lifetime. Our present work gives an insight view of the defect physics of cubic CsPbI3 and will be beneficial for optoelectronic applications based on cubic CsPbI3 and other analogous inorganic perovskites.

  18. Intrinsic point defects in inorganic perovskite CsPbI3 from first-principles prediction

    KAUST Repository

    Li, Yifan; Zhang, Chenhui; Zhang, Xixiang; Huang, Dan; Shen, Qian; Cheng, Yingchun; Huang, Wei

    2017-01-01

    Cubic inorganic perovskite CsPbI3 is a direct bandgap semiconductor, which is promising for optoelectronic applications, such as solar cells, light emitting diodes, and lasers. The intrinsic defects in semiconductors play crucial roles in determining carrier conductivity, the efficiency of carrier recombination, and so on. However, the thermodynamic stability and intrinsic defect physics are still unclear for cubic CsPbI3. By using the first-principles calculations, we study the thermodynamic process and find out that the window for CsPbI3 growth is quite narrow and the concentration of Cs is important for cubic CsPbI3 growth. Under Pb-rich conditions, VPb and VI can pin the Fermi energy in the middle of the bandgap, which results in a low carrier concentration. Under Pb-poor conditions, VPb is the dominant defect and the material has a high concentration of hole carriers with a long lifetime. Our present work gives an insight view of the defect physics of cubic CsPbI3 and will be beneficial for optoelectronic applications based on cubic CsPbI3 and other analogous inorganic perovskites.

  19. First-principles study of point defects in solar cell semiconductor CuI

    International Nuclear Information System (INIS)

    Chen, Hui; Wang, Chong-Yu; Wang, Jian-Tao; Wu, Ying; Zhou, Shao-Xiong

    2013-01-01

    Hybrid density functional theory is used to study the formation energies and transition levels of the point defects V_Cu, V_I, I_Cu, Cu_I, and O_I in CuI. It is shown that the Heyd–Scuseria–Ernzerhof (HSE06) method can accurately describe the band gap of bulk CuI. We find that p-type semiconducting CuI, a solar cell material, can be obtained under iodine-rich and copper-poor conditions. Our results are in good agreement with experiment and provide an excellent basis for tuning the structural and electronic properties of CuI.

  20. Simultaneous colour visualizations of multiple ALS point cloud attributes for land cover and vegetation analysis

    Science.gov (United States)

    Zlinszky, András; Schroiff, Anke; Otepka, Johannes; Mandlburger, Gottfried; Pfeifer, Norbert

    2014-05-01

    LIDAR point clouds hold valuable information for land cover and vegetation analysis, not only in the spatial distribution of the points but also in their various attributes. However, LIDAR point clouds are rarely used for visual interpretation, since for most users, the point cloud is difficult to interpret compared to passive optical imagery. Meanwhile, point cloud viewing software is available allowing interactive 3D interpretation, but typically only one attribute at a time. This results in a large number of points with the same colour, crowding the scene and often obscuring detail. We developed a scheme for mapping information from multiple LIDAR point attributes to the Red, Green, and Blue channels of a widely used LIDAR data format, which are otherwise mostly used to add information from imagery to create "photorealistic" point clouds. The possible combinations of parameters are therefore represented in a wide range of colours, but relative differences in individual parameter values of points can be well understood. The visualization was implemented in OPALS software, using a simple and robust batch script, and is viewer independent since the information is stored in the point cloud data file itself. In our case, the following colour channel assignment delivered best results: Echo amplitude in the Red, echo width in the Green and normalized height above a Digital Terrain Model in the Blue channel. With correct parameter scaling (but completely without point classification), points belonging to asphalt and bare soil are dark red, low grassland and crop vegetation are bright red to yellow, shrubs and low trees are green and high trees are blue. Depending on roof material and DTM quality, buildings are shown from red through purple to dark blue. Erroneously high or low points, or points with incorrect amplitude or echo width usually have colours contrasting from terrain or vegetation. This allows efficient visual interpretation of the point cloud in planar
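
    A minimal numpy sketch of the colour-channel assignment that worked best in this study (amplitude to Red, echo width to Green, height above the DTM to Blue); the scaling percentiles and the attribute array layout are assumptions for illustration, not the OPALS batch script itself:

      import numpy as np

      def scale_to_byte(values, lo_pct=2, hi_pct=98):
          """Linearly stretch an attribute to 0-255 between two percentiles
          (clipping outliers), so relative differences stay interpretable."""
          lo, hi = np.percentile(values, [lo_pct, hi_pct])
          stretched = np.clip((values - lo) / max(hi - lo, 1e-9), 0.0, 1.0)
          return (stretched * 255).astype(np.uint8)

      def attributes_to_rgb(amplitude, echo_width, height_above_dtm):
          """Map three ALS point attributes onto the R, G, B colour channels."""
          return np.column_stack([
              scale_to_byte(amplitude),          # R: echo amplitude
              scale_to_byte(echo_width),         # G: echo width
              scale_to_byte(height_above_dtm),   # B: normalized height above DTM
          ])

      # Example with synthetic attributes for 5 points.
      rng = np.random.default_rng(0)
      rgb = attributes_to_rgb(rng.uniform(0, 400, 5),
                              rng.uniform(1, 10, 5),
                              rng.uniform(0, 30, 5))
      print(rgb)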

  1. Testing to fulfill HACCP (Hazard Analysis Critical Control Points) requirements: principles and examples.

    Science.gov (United States)

    Gardner, I A

    1997-12-01

    On-farm HACCP (hazard analysis critical control points) monitoring requires cost-effective, yet accurate and reproducible tests that can determine the status of cows, milk, and the dairy environment. Tests need to be field-validated, and their limitations need to be established so that appropriate screening strategies can be initiated and test results can be rationally interpreted. For infections and residues of low prevalence, tests or testing strategies that are highly specific help to minimize false-positive results and excessive costs to the dairy industry. The determination of the numbers of samples to be tested in HACCP monitoring programs depends on the specific purpose of the test and the likely prevalence of the agent or residue at the critical control point. The absence of positive samples from a herd test should not be interpreted as freedom from a particular agent or residue unless the entire herd has been tested with a test that is 100% sensitive. The current lack of field-validated tests for most of the chemical and infectious agents of concern makes it difficult to ensure that the stated goals of HACCP programs are consistently achieved.
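
    The sample-size point in the last two sentences can be made quantitative with the standard detection-of-disease calculation (general background, not a computation from this paper): for a perfectly sensitive and specific test and a large herd, the number of negative-testing animals n needed to be confident at level 1 - \alpha that the within-herd prevalence is below p satisfies

      n \ge \frac{\ln \alpha}{\ln (1 - p)}

    e.g. ruling out a 5% prevalence with 95% confidence requires about \ln(0.05)/\ln(0.95) \approx 59 negative animals; with an imperfectly sensitive test the detectable prevalence p is effectively multiplied by the test sensitivity, so n grows accordingly.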

  2. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Science.gov (United States)

    Bednar, David; Beerens, Koen; Sebestova, Eva; Bendl, Jaroslav; Khare, Sagar; Chaloupkova, Radka; Prokop, Zbynek; Brezovsky, Jan; Baker, David; Damborsky, Jiri

    2015-11-01

    There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.

  3. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Directory of Open Access Journals (Sweden)

    David Bednar

    2015-11-01

    There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.

  4. Surface Tension of Multi-phase Flow with Multiple Junctions Governed by the Variational Principle

    International Nuclear Information System (INIS)

    Matsutani, Shigeki; Nakano, Kota; Shinjo, Katsuhiko

    2011-01-01

    We explore a computational model of an incompressible fluid with a multi-phase field in three-dimensional Euclidean space. By investigating an incompressible fluid with a two-phase field geometrically, we reformulate the expression of the surface tension for the two-phase field found by Lafaurie et al. (J Comput Phys 113:134–147, 1994) as a variational problem related to an infinite-dimensional Lie group, the volume-preserving diffeomorphisms. The variational principle applied to the action integral with the surface energy reproduces their Euler equation of the two-phase field with surface tension. Since the surface energy of multiple interfaces, even with singularities, is not difficult to evaluate in general, and the variational formulation works for every action integral, the new formulation enables us to extend their expression to that of a multi-phase (N-phase, N ≥ 2) flow and to obtain a novel Euler equation with the surface tension of the multi-phase field. The obtained Euler equation governs the motion of the multi-phase field with different surface tension coefficients without any difficulties at the singularities of multiple junctions. In other words, we unify the theory of multi-phase fields, which expresses low-dimensional interface geometry, and the theory of incompressible fluid dynamics on infinite-dimensional geometry as a single variational problem. We apply the equation to contact angle problems at triple junctions. We compute the fluid dynamics for a two-phase field with a wall numerically and show that, for given surface tension coefficients, the contact angles are generated by the surface tension as a result of the balance between the kinetic energy and the surface energy.

  5. Universal principles governing multiple random searchers on complex networks: The logarithmic growth pattern and the harmonic law

    Science.gov (United States)

    Weng, Tongfeng; Zhang, Jie; Small, Michael; Harandizadeh, Bahareh; Hui, Pan

    2018-03-01

    We propose a unified framework to evaluate and quantify the search time of multiple random searchers traversing independently and concurrently on complex networks. We find that the intriguing behaviors of multiple random searchers are governed by two basic principles—the logarithmic growth pattern and the harmonic law. Specifically, the logarithmic growth pattern characterizes how the search time increases with the number of targets, while the harmonic law explores how the search time of multiple random searchers varies relative to that needed by individual searchers. Numerical and theoretical results demonstrate these two universal principles established across a broad range of random search processes, including generic random walks, maximal entropy random walks, intermittent strategies, and persistent random walks. Our results reveal two fundamental principles governing the search time of multiple random searchers, which are expected to facilitate investigation of diverse dynamical processes like synchronization and spreading.
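
    The two quantities the framework compares can be measured directly in a toy simulation. The sketch below is an illustration only, using generic unbiased random walks on an arbitrary graph rather than the full set of search processes analyzed in the paper; it estimates the mean time for the first of k independent walkers to hit a target node. Plotting the output against k gives an empirical handle on how much k concurrent searchers improve on a single one, which is the relation the harmonic law formalizes.

      import random
      import networkx as nx

      def first_hit_time(graph, starts, target, rng, max_steps=100000):
          """Steps until the first of several independent random walkers reaches target."""
          walkers = list(starts)
          for step in range(1, max_steps + 1):
              walkers = [rng.choice(list(graph.neighbors(w))) for w in walkers]
              if any(w == target for w in walkers):
                  return step
          return max_steps

      def mean_search_time(graph, k, target, trials=500, seed=1):
          rng = random.Random(seed)
          nodes = [n for n in graph.nodes if n != target]
          total = 0
          for _ in range(trials):
              starts = [rng.choice(nodes) for _ in range(k)]
              total += first_hit_time(graph, starts, target, rng)
          return total / trials

      if __name__ == "__main__":
          G = nx.erdos_renyi_graph(200, 0.05, seed=7)          # toy network
          G = G.subgraph(max(nx.connected_components(G), key=len)).copy()
          target = list(G.nodes)[0]
          for k in (1, 2, 4, 8):
              print(k, "searchers:", round(mean_search_time(G, k, target), 1), "steps")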

  6. Registration of Aerial Optical Images with LiDAR Data Using the Closest Point Principle and Collinearity Equations.

    Science.gov (United States)

    Huang, Rongyong; Zheng, Shunyi; Hu, Kun

    2018-06-01

    Registration of large-scale optical images with airborne LiDAR data is the basis of the integration of photogrammetry and LiDAR. However, geometric misalignments still exist between some aerial optical images and airborne LiDAR point clouds. To eliminate such misalignments, we extended a method for registering close-range optical images with terrestrial LiDAR data to a variety of large-scale aerial optical images and airborne LiDAR data. The fundamental principle is to minimize the distances from the photogrammetric matching points to the LiDAR data surface. Besides the satisfactory efficiency of about 79 s per 6732 × 8984 image, the experimental results show that the unit-weighted root mean square (RMS) error of the image points reaches a sub-pixel level (0.45 to 0.62 pixel), and that the actual horizontal and vertical accuracy is greatly improved, to 1/4–1/2 (0.17–0.27 m) and 1/8–1/4 (0.10–0.15 m) of the average LiDAR point distance, respectively. The method is thus shown to be accurate, feasible, efficient, and practical for a variety of large-scale aerial optical images and LiDAR data.
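
    The "closest point" idea in the abstract can be illustrated with a stripped-down rigid registration sketch: estimate a 3D rotation and translation that minimizes the distances from a set of matching points to their nearest neighbours in a reference point cloud. This is only a schematic analogue -- the actual method adjusts image exterior orientation through the collinearity equations, which are not reproduced here, and the solver choice below is an assumption.

      import numpy as np
      from scipy.optimize import least_squares
      from scipy.spatial import cKDTree
      from scipy.spatial.transform import Rotation

      def register_points_to_cloud(points, cloud, x0=None):
          """Find a (rotation, translation) minimizing distances from the transformed
          'points' to their closest neighbours in 'cloud' (closest point principle)."""
          tree = cKDTree(cloud)

          def residuals(params):
              rot = Rotation.from_euler("xyz", params[:3])      # small-angle rotation
              shifted = rot.apply(points) + params[3:]          # rigid transform
              dists, _ = tree.query(shifted)                    # closest-point distances
              return dists

          x0 = np.zeros(6) if x0 is None else x0
          sol = least_squares(residuals, x0)
          return Rotation.from_euler("xyz", sol.x[:3]), sol.x[3:]

      if __name__ == "__main__":
          rng = np.random.default_rng(3)
          cloud = rng.uniform(0, 50, size=(2000, 3))            # synthetic reference cloud
          true_rot = Rotation.from_euler("xyz", [0.02, -0.01, 0.03])
          sample = cloud[rng.choice(len(cloud), 200, replace=False)]
          points = true_rot.inv().apply(sample - np.array([0.5, -0.3, 0.2]))
          est_rot, est_t = register_points_to_cloud(points, cloud)
          print("estimated translation:", np.round(est_t, 2))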

  9. First-principles study of point defects in CePO4 monazite

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Yong; Zhao, Xiaofeng [State Key Laboratory Cultivation Base for Nonmetal Composites and Functional Materials, Southwest University of Science and Technology, Mianyang 621010 (China); Teng, Yuancheng, E-mail: tyc239@163.com [State Key Laboratory Cultivation Base for Nonmetal Composites and Functional Materials, Southwest University of Science and Technology, Mianyang 621010 (China); Bi, Beng [Joint Laboratory for Extreme Conditions Matter Properties, Southwest University of Science and Technology, Mianyang 621010 (China); Wang, Lili [Institute of Computer Application, China Academy of Engineering Physics, Mianyang 621900 (China); Wu, Lang; Zhang, Kuibao [State Key Laboratory Cultivation Base for Nonmetal Composites and Functional Materials, Southwest University of Science and Technology, Mianyang 621010 (China)

    2016-12-15

    CePO4 monazite is an important radiation-resistant material that may act as a potential minor actinides waste form. Here, we present calculations of basic radiation defect models in CePO4 crystals, along with an examination of their defect formation energies and the effect of the defect concentrations. This study focused on building a fully relaxed CePO4 model through step-wise iterative optimization from DFT-GGA calculations using the VASP and CASTEP codes. The results show that the Frenkel defect configuration resulting from the center interstitials has a lower energy when compared to two adjacent orthophosphate centers (the saddle point position). High formation energies were found for all types of intrinsic Frenkel and vacancy defects. The formation energies conform to the following trend (given in decreasing order of energy): Ce Frenkel (12.41 eV) > O Frenkel (11.02 eV) > Ce vacancy (9.09 eV) > O vacancy (6.69 eV). We observed almost no effect of the defect concentrations on the defect formation energies.
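
    For context, defect formation energies of the kind tabulated above are conventionally computed from supercell total energies as (this is the generic convention used throughout first-principles defect studies; the paper's exact bookkeeping of chemical potentials is not stated in the abstract)

      E_{f}[X] = E_{tot}[X] - E_{tot}[\mathrm{perfect}] - \sum_{i} n_{i} \mu_{i}

    where n_{i} is the number of atoms of species i added (n_{i} > 0) or removed (n_{i} < 0) to create defect X and \mu_{i} is the corresponding chemical potential; for charged defects a term q(E_{F} + \epsilon_{VBM}) is added. For a stoichiometric Frenkel pair the chemical-potential terms cancel, which is why Frenkel energies can be quoted without reference to growth conditions.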

  8. NEWTONIAN IMPERIALIST COMPETITVE APPROACH TO OPTIMIZING OBSERVATION OF MULTIPLE TARGET POINTS IN MULTISENSOR SURVEILLANCE SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. Afghan-Toloee

    2013-09-01

    The problem of specifying the minimum number of sensors to deploy in a certain area to face multiple targets has been widely studied in the literature. In this paper, we address the multi-sensor deployment problem (MDP). The multi-sensor placement problem can be formulated as minimizing the cost required to cover the multiple target points in the area. We propose a more feasible method for the multi-sensor placement problem. Our method provides the high coverage of grid-based placements while minimizing the cost, as found in perimeter placement techniques. The NICA algorithm, an improved Imperialist Competitive Algorithm (ICA), is used to decrease the time needed to find a sufficiently good solution compared to other meta-heuristic schemes such as GA, PSO and ICA. A three-dimensional area is used to represent the multiple target and placement points, providing x, y, and z computations in the observation algorithm. A model structure for the multi-sensor placement problem is proposed: the problem is constructed as an optimization problem with the objective of minimizing the cost while covering all multiple target points at a given probability-of-observation tolerance.

  9. Modeling of Semiconductors and Correlated Oxides with Point Defects by First Principles Methods

    KAUST Repository

    Wang, Hao

    2014-06-15

    Point defects in silicon, vanadium dioxide, and doped ceria are investigated by density functional theory. Defects involving vacancies and interstitial oxygen and carbon in silicon are often formed in outer space and significantly affect device performance. The screened hybrid functional by Heyd-Scuseria-Ernzerhof is used to calculate formation energies, binding energies, and electronic structures of the defective systems because standard density functional theory underestimates the band gap of silicon. The results indicate a −2 charge state for the A-center. Tin is proposed to be an effective dopant to suppress the formation of A-centers. For the total energy difference between the A- and B-type carbon related G-centers we find close agreement with experiment. The results indicate that the C-type G-center is more stable than both the A- and B-types. The electronic structures of the monoclinic and rutile phases of vanadium dioxide are also studied using the Heyd-Scuseria-Ernzerhof functional. The ground states of the pure phases obtained by calculations including spin polarization disagree with the experimental observations that the monoclinic phase should not be magnetic, the rutile phase should be metallic, and the monoclinic phase should have a lower total energy than the rutile phase. By tuning the Hartree-Fock fraction α to 10% the agreement with experiments is improved in terms of band gaps and relative energies of the phases. A calculation scheme is proposed to simulate the relationship between the transition temperature of the metal-insulator transition and the dopant concentration in tungsten doped vanadium dioxide. We achieve good agreement with the experimental situation. 18.75% and 25% yttrium, lanthanum, praseodymium, samarium, and gadolinium doped ceria supercells generated by the special quasirandom structure approach are employed to investigate the impact of doping on the O diffusion. The experimental behavior of the conductivity for the

  10. Modeling of Semiconductors and Correlated Oxides with Point Defects by First Principles Methods

    KAUST Repository

    Wang, Hao

    2014-01-01

    Point defects in silicon, vanadium dioxide, and doped ceria are investigated by density functional theory. Defects involving vacancies and interstitial oxygen and carbon in silicon are often formed in outer space and significantly affect device performance. The screened hybrid functional by Heyd-Scuseria-Ernzerhof is used to calculate formation energies, binding energies, and electronic structures of the defective systems because standard density functional theory underestimates the band gap of silicon. The results indicate a −2 charge state for the A-center. Tin is proposed to be an effective dopant to suppress the formation of A-centers. For the total energy difference between the A- and B-type carbon related G-centers we find close agreement with experiment. The results indicate that the C-type G-center is more stable than both the A- and B-types. The electronic structures of the monoclinic and rutile phases of vanadium dioxide are also studied using the Heyd-Scuseria-Ernzerhof functional. The ground states of the pure phases obtained by calculations including spin polarization disagree with the experimental observations that the monoclinic phase should not be magnetic, the rutile phase should be metallic, and the monoclinic phase should have a lower total energy than the rutile phase. By tuning the Hartree-Fock fraction α to 10% the agreement with experiments is improved in terms of band gaps and relative energies of the phases. A calculation scheme is proposed to simulate the relationship between the transition temperature of the metal-insulator transition and the dopant concentration in tungsten doped vanadium dioxide. We achieve good agreement with the experimental situation. 18.75% and 25% yttrium, lanthanum, praseodymium, samarium, and gadolinium doped ceria supercells generated by the special quasirandom structure approach are employed to investigate the impact of doping on the O diffusion. The experimental behavior of the conductivity for the

  11. Effect of point defects on the electronic density states of SnC nanosheets: First-principles calculations

    Directory of Open Access Journals (Sweden)

    Soleyman Majidi

    In this work, we investigated the electronic and structural properties of various defects, including single Sn and C vacancies, the Sn + C double vacancy, anti-sites, position exchange and the Stone–Wales (SW) defects, in SnC nanosheets by using density-functional theory (DFT). We found that the various vacancy defects in the SnC monolayer can change the electronic and structural properties. Our results show that SnC is an indirect band gap compound with a band gap of 2.10 eV. The system turns into a metal for both the single Sn and the single C vacancy structures. However, for the double vacancy containing Sn and C atoms, the structure remains a semiconductor with a direct band gap of 0.37 eV at the G point. We also found that for anti-site defects the structure remains a semiconductor, and for the exchange defect the structure becomes an indirect (K–G) semiconductor with a band gap of 0.74 eV. Finally, the SW-defect structure remains a semiconductor with a direct band gap of 0.54 eV at the K point. Keywords: SnC nanosheets, Density-functional theory, First-principles calculations, Electronic density of states, Band gap

  12. MULTIPLE ACCESS POINTS WITHIN THE ONLINE CLASSROOM: WHERE STUDENTS LOOK FOR INFORMATION

    Directory of Open Access Journals (Sweden)

    John STEELE

    2017-01-01

    Full Text Available The purpose of this study is to examine the impact of information placement within the confines of the online classroom architecture. Also reviewed was the impact of other variables such as course design, teaching presence and student patterns in looking for information. The sample population included students from a major online university in their first year course sequence. Students were tasked with completing a survey at the end of the course, indicating their preference for accessing information within the online classroom. The qualitative data indicated that student preference is to receive information from multiple access points and sources within the online classroom architecture. Students also expressed a desire to have information delivered through the usage of technology such as email and text messaging. In addition to receiving information from multiple sources, the qualitative data indicated students were satisfied overall, with the current ways in which they received and accessed information within the online classroom setting. Major findings suggest that instructors teaching within the online classroom should have multiple data access points within the classroom architecture. Furthermore, instructors should use a variety of communication venues to enhance the ability for students to access and receive information pertinent to the course.

  13. Methods of fast, multiple-point in vivo T1 determination

    International Nuclear Information System (INIS)

    Zhang, Y.; Spigarelli, M.; Fencil, L.E.; Yeung, H.N.

    1989-01-01

    Two methods of rapid, multiple-point determination of T1 in vivo have been evaluated with a phantom consisting of vials of gel in different Mn++ concentrations. The first method was an inversion-recovery-on-the-fly technique, and the second method used a variable-tip-angle (α) progressive saturation with two sub-sequences of different repetition times. In the first method, 1/T1 was evaluated by an exponential fit. In the second method, 1/T1 was obtained iteratively with a linear fit and then readjusted together with α to a model equation until self-consistency was reached.
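
    A minimal sketch of the final step of the first approach -- fitting 1/T1 from inversion-recovery signal samples with an exponential model. The three-parameter magnitude model and the use of scipy's curve_fit are assumptions for illustration; the record does not specify the fitting procedure beyond "an exponential fit".

      import numpy as np
      from scipy.optimize import curve_fit

      def ir_signal(ti, s0, t1, eff):
          """Inversion-recovery magnitude signal |S0 * (1 - eff * exp(-TI/T1))|,
          with eff close to 2 for a perfect inversion pulse."""
          return np.abs(s0 * (1.0 - eff * np.exp(-ti / t1)))

      # Synthetic multi-point measurement (TI values in ms) with a known T1 of 800 ms.
      ti = np.array([50, 100, 200, 400, 800, 1600, 3200], dtype=float)
      rng = np.random.default_rng(0)
      signal = ir_signal(ti, s0=1000.0, t1=800.0, eff=1.95) + rng.normal(0, 5, ti.size)

      # Fit S0, T1 and the inversion efficiency; 1/T1 is the relaxation rate R1.
      popt, _ = curve_fit(ir_signal, ti, signal, p0=(signal.max(), 500.0, 2.0))
      s0_fit, t1_fit, eff_fit = popt
      print(f"T1 = {t1_fit:.0f} ms, R1 = {1.0 / t1_fit * 1000:.2f} 1/s")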

  14. Search for Pauli exclusion principle violating atomic transitions and electron decay with a p-type point contact germanium detector

    Energy Technology Data Exchange (ETDEWEB)

    Abgrall, N.; Bradley, A.W.; Chan, Y.D.; Mertens, S.; Poon, A.W.P. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Arnquist, I.J.; Hoppe, E.W.; Kouzes, R.T.; LaFerriere, B.D.; Orrell, J.L. [Pacific Northwest National Laboratory, Richland, WA (United States); Avignone, F.T. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Barabash, A.S.; Konovalov, S.I.; Yumatov, V. [National Research Center ' ' Kurchatov Institute' ' Institute for Theoretical and Experimental Physics, Moscow (Russian Federation); Bertrand, F.E.; Galindo-Uribarri, A.; Radford, D.C.; Varner, R.L.; White, B.R.; Yu, C.H. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Brudanin, V.; Shirchenko, M.; Vasilyev, S.; Yakushev, E.; Zhitnikov, I. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Busch, M. [Duke University, Department of Physics, Durham, NC (United States); Triangle Universities Nuclear Laboratory, Durham, NC (United States); Buuck, M.; Cuesta, C.; Detwiler, J.A.; Gruszko, J.; Guinn, I.S.; Leon, J.; Robertson, R.G.H. [University of Washington, Department of Physics, Center for Experimental Nuclear Physics and Astrophysics, Seattle, WA (United States); Caldwell, A.S.; Christofferson, C.D.; Dunagan, C.; Howard, S.; Suriano, A.M. [South Dakota School of Mines and Technology, Rapid City, SD (United States); Chu, P.H.; Elliott, S.R.; Goett, J.; Massarczyk, R.; Rielage, K. [Los Alamos National Laboratory, Los Alamos, NM (United States); Efremenko, Yu. [University of Tennessee, Department of Physics and Astronomy, Knoxville, TN (United States); Ejiri, H. [Osaka University, Research Center for Nuclear Physics, Ibaraki, Osaka (Japan); Finnerty, P.S.; Gilliss, T.; Giovanetti, G.K.; Henning, R.; Howe, M.A.; MacMullin, J.; Meijer, S.J.; O' Shaughnessy, C.; Rager, J.; Shanks, B.; Trimble, J.E.; Vorren, K.; Xu, W. [Triangle Universities Nuclear Laboratory, Durham, NC (United States); University of North Carolina, Department of Physics and Astronomy, Chapel Hill, NC (United States); Green, M.P. [North Carolina State University, Department of Physics, Raleigh, NC (United States); Oak Ridge National Laboratory, Oak Ridge, TN (United States); Triangle Universities Nuclear Laboratory, Durham, NC (United States); Guiseppe, V.E.; Tedeschi, D.; Wiseman, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Jasinski, B.R. [University of South Dakota, Department of Physics, Vermillion, SD (United States); Keeter, K.J. [Black Hills State University, Department of Physics, Spearfish, SD (United States); Kidd, M.F. [Tennessee Tech University, Cookeville, TN (United States); Martin, R.D. [Queen' s University, Department of Physics, Engineering Physics and Astronomy, Kingston, ON (Canada); Romero-Romero, E. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); University of Tennessee, Department of Physics and Astronomy, Knoxville, TN (United States); Vetter, K. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); University of California, Department of Nuclear Engineering, Berkeley, CA (United States); Wilkerson, J.F. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Triangle Universities Nuclear Laboratory, Durham, NC (United States); University of North Carolina, Department of Physics and Astronomy, Chapel Hill, NC (United States)

    2016-11-15

    A search for Pauli-exclusion-principle-violating Kα electron transitions was performed using 89.5 kg-d of data collected with a p-type point contact high-purity germanium detector operated at the Kimballton Underground Research Facility. A lower limit on the transition lifetime of 5.8 x 10^30 s at 90% C.L. was set by looking for a peak at 10.6 keV resulting from the X-ray and Auger electrons present following the transition. A similar analysis was done to look for the decay of atomic K-shell electrons into neutrinos, resulting in a lower limit of 6.8 x 10^30 s at 90% C.L. It is estimated that the Majorana Demonstrator, a 44 kg array of p-type point contact detectors that will search for the neutrinoless double-beta decay of ^76Ge, could improve upon these exclusion limits by an order of magnitude after three years of operation. (orig.)

  15. Robust set-point regulation for ecological models with multiple management goals.

    Science.gov (United States)

    Guiver, Chris; Mueller, Markus; Hodgson, Dave; Townley, Stuart

    2016-05-01

    Population managers will often have to deal with problems of meeting multiple goals, for example, keeping at specific levels both the total population and population abundances in given stage-classes of a stratified population. In control engineering, such set-point regulation problems are commonly tackled using multi-input, multi-output proportional and integral (PI) feedback controllers. Building on our recent results for population management with single goals, we develop a PI control approach in a context of multi-objective population management. We show that robust set-point regulation is achieved by using a modified PI controller with saturation and anti-windup elements, both described in the paper, and illustrate the theory with examples. Our results apply more generally to linear control systems with positive state variables, including a class of infinite-dimensional systems, and thus have broader appeal.

  16. Estimation of Multiple Point Sources for Linear Fractional Order Systems Using Modulating Functions

    KAUST Repository

    Belkhatir, Zehor

    2017-06-28

    This paper proposes an estimation algorithm for the characterization of multiple point inputs for linear fractional order systems. First, using the polynomial modulating functions method and a suitable change of variables, the problem of estimating the locations and amplitudes of a multi-pointwise input is decoupled into two algebraic systems of equations. The first system is nonlinear and solves for the time locations iteratively, whereas the second system is linear and solves for the input's amplitudes. Second, closed-form formulas for both the time location and the amplitude are provided in the particular case of a single point input. Finally, numerical examples are given to illustrate the performance of the proposed technique in both noise-free and noisy cases. The joint estimation of pointwise input and fractional differentiation orders is also presented. Furthermore, a discussion on the performance of the proposed algorithm is provided.

  17. First principles calculation of point defects and mobility degradation in bulk AlSb for radiation detection application

    International Nuclear Information System (INIS)

    Lordi, V; Aberg, D; Erhart, P; Wu, K J

    2007-01-01

    The development of high resolution, room temperature semiconductor radiation detectors requires the introduction of materials with increased carrier mobility-lifetime (μτ) product, while having a band gap in the 1.4-2.2 eV range. AlSb is a promising material for this application. However, systematic improvements in the material quality are necessary to achieve an adequate μτ product. We are using a combination of simulation and experiment to develop a fundamental understanding of the factors which affect detector material quality. First principles calculations are used to study the microscopic mechanisms of mobility degradation from point defects and to calculate the intrinsic limit of mobility from phonon scattering. We use density functional theory (DFT) to calculate the formation energies of native and impurity point defects, to determine their equilibrium concentrations as a function of temperature and charge state. Perturbation theory via the Born approximation is coupled with Boltzmann transport theory to calculate the contribution toward mobility degradation of each type of point defect, using DFT-computed carrier scattering rates. A comparison is made to measured carrier concentrations and mobilities from AlSb crystals grown in our lab. We find our predictions in good quantitative agreement with experiment, allowing optimized annealing conditions to be deduced. A major result is the determination of oxygen impurity as a severe mobility killer, despite the ability of oxygen to compensation dope AlSb and reduce the net carrier concentration. In this case, increased resistivity is not a good indicator of improved material performance, due to the concomitant sharp reduction in μτ
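
    As a rough illustration of the defect-thermodynamics step described above, the sketch below converts DFT-style formation energies into equilibrium point-defect concentrations with the standard Boltzmann relation c = N_sites * exp(-E_f / kT). The formation energies and site density used here are placeholder values for illustration, not results from the paper.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def defect_concentration(formation_energy_ev, temperature_k, site_density_cm3):
    """Equilibrium concentration of a point defect, c = N_sites * exp(-E_f / kT)."""
    return site_density_cm3 * math.exp(-formation_energy_ev / (K_B * temperature_k))

# Placeholder numbers for illustration only (not values from the paper).
n_sites = 1.76e22            # lattice sites per cm^3, rough order of magnitude for a III-V crystal
for e_f in (0.8, 1.2, 1.6):  # hypothetical formation energies in eV
    c = defect_concentration(e_f, temperature_k=900.0, site_density_cm3=n_sites)
    print(f"E_f = {e_f:.1f} eV  ->  c ~ {c:.2e} cm^-3 at 900 K")
```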

  18. Bridges between multiple-point geostatistics and texture synthesis: Review and guidelines for future research

    Science.gov (United States)

    Mariethoz, Gregoire; Lefebvre, Sylvain

    2014-05-01

    Multiple-Point Simulations (MPS) is a family of geostatistical tools that has received a lot of attention in recent years for the characterization of spatial phenomena in geosciences. It relies on the definition of training images to represent a given type of spatial variability, or texture. We show that the algorithmic tools used are similar in many ways to techniques developed in computer graphics, where there is a need to generate large amounts of realistic textures for applications such as video games and animated movies. Similarly to MPS, these texture synthesis methods use training images, or exemplars, to generate realistic-looking graphical textures. Both domains of multiple-point geostatistics and example-based texture synthesis present similarities in their historic development and share similar concepts. These disciplines have however remained separated, and as a result significant algorithmic innovations in each discipline have not been universally adopted. Texture synthesis algorithms present drastically increased computational efficiency, patterns reproduction and user control. At the same time, MPS developed ways to condition models to spatial data and to produce 3D stochastic realizations, which have not been thoroughly investigated in the field of texture synthesis. In this paper we review the possible links between these disciplines and show the potential and limitations of using concepts and approaches from texture synthesis in MPS. We also provide guidelines on how recent developments could benefit both fields of research, and what challenges remain open.

  19. A Comparison of Combustion Dynamics for Multiple 7-Point Lean Direct Injection Combustor Configurations

    Science.gov (United States)

    Tacina, K. M.; Hicks, Y. R.

    2017-01-01

    The combustion dynamics of multiple 7-point lean direct injection (LDI) combustor configurations are compared. LDI is a fuel-lean combustor concept for aero gas turbine engines in which multiple small fuel-air mixers replace one traditionally-sized fuel-air mixer. This 7-point LDI configuration has a circular cross section, with a center (pilot) fuel-air mixer surrounded by six outer (main) fuel-air mixers. Each fuel-air mixer consists of an axial air swirler followed by a converging-diverging venturi. A simplex fuel injector is inserted through the center of the air swirler, with the fuel injector tip located near the venturi throat. All 7 fuel-air mixers are identical except for the swirler blade angle, which varies with the configuration. Testing was done in a 5-atm flame tube with inlet air temperatures from 600 to 800 F and equivalence ratios from 0.4 to 0.7. Combustion dynamics were measured using a cooled PCB pressure transducer flush-mounted in the wall of the combustor test section.

  20. Improving multiple-point-based a priori models for inverse problems by combining Sequential Simulation with the Frequency Matching Method

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Lange, Katrine

    In order to move beyond simplified covariance-based a priori models, which are typically used for inverse problems, more complex multiple-point-based a priori models have to be considered. By means of marginal probability distributions 'learned' from a training image, sequential simulation has proven to be an efficient way of obtaining multiple realizations that honor the same multiple-point statistics as the training image. The frequency matching method provides an alternative way of formulating multiple-point-based a priori models. In this strategy the pattern frequency distributions (i.e. marginals) of the training image and a subsurface model are matched in order to obtain a solution with the same multiple-point statistics as the training image. Sequential Gibbs sampling is a simulation strategy that provides an efficient way of applying sequential simulation based algorithms as a priori ...

  1. Point spread function due to multiple scattering of light in the atmosphere

    International Nuclear Information System (INIS)

    Pękala, J.; Wilczyński, H.

    2013-01-01

    The atmospheric scattering of light has a significant influence on the results of optical observations of air showers. It causes attenuation of direct light from the shower, but also contributes a delayed signal to the observed light. The scattering of light therefore should be accounted for, both in simulations of air shower detection and reconstruction of observed events. In this work a Monte Carlo simulation of multiple scattering of light has been used to determine the contribution of the scattered light in observations of a point source of light. Results of the simulations and a parameterization of the angular distribution of the scattered light contribution to the observed signal (the point spread function) are presented. -- Author-Highlights: •Analysis of atmospheric scattering of light from an isotropic point source. •Different geometries and atmospheric conditions were investigated. •A parameterization of scattered light distribution has been developed. •The parameterization allows one to easily account for the light scattering in air. •The results will be useful in analyses of observations of extensive air shower

  2. Fundamental principles of nanostructures and multiple exciton generation effect in quantum dots

    International Nuclear Information System (INIS)

    Turaeva, N.; Oksengendler, B.; Rashidova, S.

    2011-01-01

    In this work the theoretical aspects of the multiple exciton generation (MEG) effect in QDs have been studied. A statistical theory of multiple exciton generation in quantum dots is presented, based on the Fermi approach to the problem of multiple generation of elementary particles in nucleon-nucleon collisions. Our calculations show that the quantum efficiencies of multiple exciton generation in various quantum dots upon absorption of a single photon are in good agreement with the experimental data. The microscopic mechanism of this effect is based on the theory of electronic 'shaking'. The deviation of the average MEG multiplicity from the Poisson law of fluctuations has also been investigated. In addition, the role of interface electronic states of the quantum dot and ligand has been considered by means of quantum mechanics. The quantum dot size has been optimized to obtain the maximum MEG multiplicity. (authors)

  3. Multiple-output all-optical header processing technique based on two-pulse correlation principle

    NARCIS (Netherlands)

    Calabretta, N.; Liu, Y.; Waardt, de H.; Hill, M.T.; Khoe, G.D.; Dorren, H.J.S.

    2001-01-01

    A serial all-optical header processing technique based on a two-pulse correlation principle in a semiconductor laser amplifier in a loop mirror (SLALOM) configuration that can have a large number of output ports is presented. The operation is demonstrated experimentally at a 10Gbit/s Manchester

  4. The shooting method and multiple solutions of two/multi-point BVPs of second-order ODE

    Directory of Open Access Journals (Sweden)

    Man Kam Kwong

    2006-06-01

    Within the last decade, there has been growing interest in the study of multiple solutions of two- and multi-point boundary value problems of nonlinear ordinary differential equations as fixed points of a cone mapping. Undeniably many good results have emerged. The purpose of this paper is to point out that, in the special case of second-order equations, the shooting method can be an effective tool, sometimes yielding better results than those obtainable via fixed point techniques.
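
    As an illustration of the shooting idea discussed in this record, the sketch below applies it to a classic second-order two-point BVP with more than one solution, the Bratu problem u'' + lam*exp(u) = 0 with u(0) = u(1) = 0 (chosen here for illustration, not taken from the paper). The boundary condition at x = 1 becomes a residual in the unknown initial slope, and each sign change of that residual is refined into a distinct solution.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Bratu problem: u'' + LAM * exp(u) = 0, u(0) = u(1) = 0.
# For LAM below the critical value (~3.51) it is known to have two solutions,
# which the shooting method finds as two roots of the residual F(s) = u(1; s).
LAM = 1.0

def rhs(x, y):
    u, v = y
    return [v, -LAM * np.exp(u)]

def residual(s):
    """Integrate from x = 0 with u(0) = 0, u'(0) = s and return u(1)."""
    sol = solve_ivp(rhs, (0.0, 1.0), [0.0, s], rtol=1e-9, atol=1e-9)
    return sol.y[0, -1]

# Scan the initial slope for sign changes of the residual, then refine each root.
slopes = np.linspace(0.0, 12.0, 121)
values = [residual(s) for s in slopes]
roots = []
for a, b, fa, fb in zip(slopes[:-1], slopes[1:], values[:-1], values[1:]):
    if fa * fb < 0:
        roots.append(brentq(residual, a, b))

print("initial slopes of the distinct solutions:", [round(r, 4) for r in roots])
```

    For lam = 1 the scan finds two sign changes, i.e. two distinct solutions of the same boundary value problem, which is exactly the multiplicity phenomenon the abstract refers to.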

  5. Use of multiple water surface flow constructed wetlands for non-point source water pollution control.

    Science.gov (United States)

    Li, Dan; Zheng, Binghui; Liu, Yan; Chu, Zhaosheng; He, Yan; Huang, Minsheng

    2018-05-02

    Multiple free water surface flow constructed wetlands (multi-FWS CWs) are a variant of conventional water treatment systems for the interception of pollutants. This review encapsulates their characteristics and applications in the field of ecological non-point source water pollution control technology. The roles of in-series design and operation parameters (hydraulic residence time, hydraulic loading rate, water depth and aspect ratio, influent composition, and plant species) in performance intensification are also analyzed; these parameters are crucial for achieving sustainable and effective contaminant removal, especially nutrient retention. Mechanistic studies of how design and operation parameters govern nitrogen and phosphorus removal are also highlighted. Perspectives for further research on optimizing design/operation parameters and on advanced ecological restoration technologies are outlined to help interpret the functions of multi-FWS CWs.

  6. The positive impact of simultaneous implementation of the BD FocalPoint GS Imaging System and lean principles on the operation of gynecologic cytology.

    Science.gov (United States)

    Wong, Rebecca; Levi, Angelique W; Harigopal, Malini; Schofield, Kevin; Chhieng, David C

    2012-02-01

    Our cytology laboratory, like many others, is under pressure to improve quality and provide test results faster while decreasing costs. We sought to address these issues by introducing new technology and lean principles. Our aim was to determine the combined impact of the FocalPoint Guided Screener (GS) Imaging System (BD Diagnostics-TriPath, Burlington, North Carolina) and lean manufacturing principles on the turnaround time (TAT) and productivity of the gynecologic cytology operation. We established a baseline measure of the TAT for Papanicolaou tests and then compared it with the performance after implementing the FocalPoint GS Imaging System and lean principles; the latter included value-stream mapping, workflow modification, and a first in-first out policy. The mean (SD) TAT for Papanicolaou tests before and after the implementation of the FocalPoint GS Imaging System and lean principles was 4.38 (1.28) days and 3.20 (1.32) days, respectively. This represented a statistically significant 27% improvement in the average TAT. The implementation of the FocalPoint GS Imaging System in conjunction with lean principles resulted in a significant decrease in the average TAT for Papanicolaou tests and a substantial increase in the productivity of cytotechnologists while maintaining the diagnostic quality of gynecologic cytology.
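
    For reference, the quoted 27% figure follows directly from the reported means: (4.38 - 3.20) / 4.38 ≈ 0.27, i.e. roughly a 27% reduction in mean turnaround time.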

  7. Multiple-point statistical prediction on fracture networks at Yucca Mountain

    International Nuclear Information System (INIS)

    Liu, X.Y; Zhang, C.Y.; Liu, Q.S.; Birkholzer, J.T.

    2009-01-01

    In many underground nuclear waste repository systems, such as Yucca Mountain, the water flow rate and the amount of water seepage into the waste emplacement drifts are mainly determined by the hydrological properties of the fracture network in the surrounding rock mass. A natural fracture network system is not easy to describe, especially with respect to its connectivity, which is critically important for simulating the water flow field. In this paper, we introduce a new method for fracture network description and prediction, termed multiple-point statistics (MPS). The MPS method records multiple-point statistics concerning the connectivity patterns of a fracture network from a known fracture map, and reproduces multiple-scale training fracture patterns in a stochastic manner, implicitly and directly. It is applied to fracture data to study flow field behavior at the Yucca Mountain waste repository system. First, the MPS method is used to create a fracture network from an original fracture training image from the Yucca Mountain dataset. After adopting a harmonic and arithmetic average method to upscale the permeability to a coarse grid, a THM simulation is carried out to study near-field water flow around the waste emplacement drifts. Our study shows that the connectivity or patterns of fracture networks can be grasped and reconstructed by MPS methods. In theory, this will lead to better prediction of fracture system characteristics and flow behavior. Meanwhile, we can obtain the variance of the flow field, which gives us a way to quantify model uncertainty even in complicated coupled THM simulations. This indicates that MPS can potentially characterize and reconstruct natural fracture networks in a fractured rock mass, with the advantage of quantifying the connectivity of the fracture system and its simulation uncertainty simultaneously.
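
    The upscaling step mentioned in the abstract (harmonic and arithmetic averaging of fine-grid permeability onto a coarse grid) can be sketched as follows; the permeability field and block size here are synthetic placeholders, not the Yucca Mountain data.

```python
import numpy as np

def upscale_permeability(k_fine, block=4):
    """Upscale a 2-D permeability field to a coarser grid.

    Arithmetic averaging bounds flow parallel to layering, harmonic averaging
    bounds flow across it; here we simply return both bounds for each coarse block.
    """
    ny, nx = k_fine.shape
    blocks = k_fine.reshape(ny // block, block, nx // block, block)
    arithmetic = blocks.mean(axis=(1, 3))
    harmonic = 1.0 / (1.0 / blocks).mean(axis=(1, 3))
    return arithmetic, harmonic

# Synthetic log-normal permeability field (illustrative only).
rng = np.random.default_rng(0)
k = np.exp(rng.normal(0.0, 1.0, size=(16, 16)))
k_arith, k_harm = upscale_permeability(k, block=4)
print("arithmetic-average block permeabilities:\n", k_arith.round(2))
print("harmonic-average block permeabilities:\n", k_harm.round(2))
```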

  8. Systems near a critical point under multiplicative noise and the concept of effective potential

    Science.gov (United States)

    Shapiro, V. E.

    1993-07-01

    This paper presents a general approach to, and elucidates the main features of, the effective potential, friction, and diffusion exerted by systems near a critical point due to the nonlinear influence of noise. The model is that of a general many-dimensional system of coupled nonlinear oscillators with finite damping under frequently alternating influences, multiplicative or additive, with an arbitrary form of the power spectrum, provided the time scales of the system's drift due to noise are large compared to the scales of the unperturbed relaxation behavior. The conventional statistical approach and the widespread deterministic effective-potential concept rely on small-parameter assumptions that are particular cases of the one considered here. We show a close correspondence between the asymptotic methods of these approaches and base the analysis on it. The results include an analytical treatment of the system's long-time behavior as a function of the noise, covering the whole range of its table- and bell-shaped spectra, from the monochromatic limit to white noise. The trend is considered both in the coordinate-momentum space and in the coordinate space of the system. Particular attention is paid to the stabilization behavior forced by multiplicative noise. Intermittency, in a broad area of the control-parameter space, is shown to be an intrinsic feature of these phenomena.

  9. Multiple types of motives don't multiply the motivation of West Point cadets.

    Science.gov (United States)

    Wrzesniewski, Amy; Schwartz, Barry; Cong, Xiangyu; Kane, Michael; Omar, Audrey; Kolditz, Thomas

    2014-07-29

    Although people often assume that multiple motives for doing something will be more powerful and effective than a single motive, research suggests that different types of motives for the same action sometimes compete. More specifically, research suggests that instrumental motives, which are extrinsic to the activities at hand, can weaken internal motives, which are intrinsic to the activities at hand. We tested whether holding both instrumental and internal motives yields negative outcomes in a field context in which various motives occur naturally and long-term educational and career outcomes are at stake. We assessed the impact of the motives of over 10,000 West Point cadets over the period of a decade on whether they would become commissioned officers, extend their officer service beyond the minimum required period, and be selected for early career promotions. For each outcome, motivation internal to military service itself predicted positive outcomes; a relationship that was negatively affected when instrumental motives were also in evidence. These results suggest that holding multiple motives damages persistence and performance in educational and occupational contexts over long periods of time.

  10. Screening of point mutations by multiple SSCP analysis in the dystrophin gene

    Energy Technology Data Exchange (ETDEWEB)

    Lasa, A.; Baiget, M.; Gallano, P. [Hospital Sant Pau, Barcelona (Spain)

    1994-09-01

    Duchenne muscular dystrophy (DMD) is a lethal, X-linked neuromuscular disorder. The population frequency of DMD is one in approximately 3500 boys, of which one third are thought to be new mutants. The DMD gene is the largest known to date, spanning over 2.3 Mb in band Xp21.2; its 79 exons are transcribed into a 14 kb mRNA coding for a 427 kD protein which has been named dystrophin. It has been shown that about 65% of affected boys have a gene deletion with a wide variation in localization and size. The remaining affected individuals who have no detectable deletions or duplications probably carry more subtle mutations that are difficult to detect. These mutations occur in several different exons and seem to be unique to single patients. Their identification represents a formidable goal because of the large size and complexity of the dystrophin gene. SSCP is a very efficient method for the detection of point mutations if the parameters that affect the separation of the strands are optimized for a particular DNA fragment. Multiple SSCP allows the simultaneous study of several exons, and implies the use of different conditions because no single set of conditions will be optimal for all fragments. Seventy-eight DMD patients with no deletion or duplication in the dystrophin gene were selected for the multiple SSCP analysis. Genomic DNA from these patients was amplified using the primers described for the diagnosis procedure (muscle promoter and exons 3, 8, 12, 16, 17, 19, 32, 45, 48 and 51). We have observed different mobility shifts in bands corresponding to exons 8, 12, 43 and 51. In exons 17 and 45, altered electrophoretic patterns were found in different samples, identifying previously described polymorphisms.

  11. Association of a novel point mutation in MSH2 gene with familial multiple primary cancers

    Directory of Open Access Journals (Sweden)

    Hai Hu

    2017-10-01

    Background: Multiple primary cancers (MPC) have been identified as two or more cancers without any subordinate relationship that occur either simultaneously or metachronously in the same or different organs of an individual. Lynch syndrome is an autosomal dominant genetic disorder that increases the risk of many types of cancer. Lynch syndrome patients who suffer more than two cancers can also be considered as having MPC; patients of this kind provide unique resources to learn how genetic mutation causes MPC in different tissues. Methods: We performed whole genome sequencing on blood cells and two tumor samples of a Lynch syndrome patient who was diagnosed with five primary cancers. The mutational landscape of the tumors, including somatic point mutations and copy number alterations, was characterized. We also compared Lynch syndrome with sporadic cancers and proposed a model to illustrate the mutational process by which Lynch syndrome progresses to MPC. Results: We revealed a novel pathologic mutation in the MSH2 gene (G504 splicing) that associates with Lynch syndrome. Systematic comparison of the mutation landscape revealed that the multiple cancers in the proband were evolutionarily independent. Integrative analysis showed that truncating mutations of DNA mismatch repair (MMR) genes were significantly enriched in the patient. A mutation progression model that includes germline mutations of MMR genes, double hits to the MMR system, mutations in tissue-specific driver genes, and rapid accumulation of additional passenger mutations is proposed to illustrate how MPC occurs in Lynch syndrome patients. Conclusion: Our findings demonstrate that both germline and somatic alterations are driving forces of carcinogenesis, which may resolve the carcinogenic theory of Lynch syndrome.

  12. Multiple ECG Fiducial Points-Based Random Binary Sequence Generation for Securing Wireless Body Area Networks.

    Science.gov (United States)

    Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif

    2017-05-01

    Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, and each bit has a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect arrival time of these fiducial points, such as P, Q, R, S, and T peaks. Time intervals between them, including RR, RQ, RS, RP, and RT intervals, are then calculated based on this arrival time, and are used as ECG features to generate random BSes with low latency. According to our analysis on real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with the schemes that solely rely on IPIs to generate BSes, this MFBSG algorithm uses five feature values from one heart beat cycle, and can be up to five times faster than the solely IPI-based methods. So, it achieves a design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
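
    A minimal sketch of the interval-to-bits idea described above is given below, assuming the fiducial arrival times have already been detected (the paper uses discrete wavelet transforms for that step). The toy peak times, the number of bits kept per interval and the quantization to milliseconds are illustrative assumptions; the randomness tests and encoding details of the actual MFBSG algorithm are omitted.

```python
import numpy as np

def intervals_to_bits(fiducial_times, bits_per_interval=4):
    """Turn per-beat fiducial-point times into a binary sequence.

    fiducial_times: dict of arrays of arrival times (seconds) for P, Q, R, S, T,
    one entry per heartbeat. For each beat we form the RQ, RS, RP and RT intervals
    plus the beat-to-beat RR interval, quantize them to milliseconds, and keep the
    least-significant bits, which carry most of the beat-to-beat variability.
    """
    r = fiducial_times["R"]
    feats = [np.diff(r)]  # RR intervals between consecutive beats
    for name in ("Q", "S", "P", "T"):
        feats.append((fiducial_times[name] - r)[1:])  # align length with the RR series
    bits = []
    for feat in feats:
        ms = np.round(np.abs(feat) * 1000).astype(int)
        for value in ms:
            bits.extend((value >> k) & 1 for k in range(bits_per_interval))
    return bits

# Toy data: 5 beats with hand-made fiducial times (illustrative only).
t_r = np.array([0.80, 1.62, 2.41, 3.25, 4.03])
times = {
    "R": t_r,
    "P": t_r - np.array([0.158, 0.163, 0.160, 0.157, 0.162]),
    "Q": t_r - np.array([0.041, 0.039, 0.042, 0.040, 0.038]),
    "S": t_r + np.array([0.043, 0.040, 0.041, 0.044, 0.039]),
    "T": t_r + np.array([0.298, 0.305, 0.301, 0.296, 0.303]),
}
print("".join(str(b) for b in intervals_to_bits(times)))
```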

  13. Optimizing the diagnostic power with gastric emptying scintigraphy at multiple time points

    Directory of Open Access Journals (Sweden)

    Gajewski Byron J

    2011-05-01

    Background: Gastric Emptying Scintigraphy (GES) at intervals over 4 hours after a standardized radio-labeled meal is commonly regarded as the gold standard for diagnosing gastroparesis. The objectives of this study were: 1) to investigate the best time point and the best combination of multiple time points for diagnosing gastroparesis with repeated GES measures, and 2) to contrast and cross-validate Fisher's Linear Discriminant Analysis (LDA), a rank-based Distribution Free (DF) approach, and the Classification And Regression Tree (CART) model. Methods: A total of 320 patients with GES measures at 1, 2, 3, and 4 hours (h) after a standard meal using a standardized method were retrospectively collected. The area under the Receiver Operating Characteristic (ROC) curve and the rate of false classification through jackknife cross-validation were used for model comparison. Results: Due to strong correlation and an abnormality in the data distribution, no substantial improvement in diagnostic power was found with the best linear combination by the LDA approach, even with data transformation. With the DF method, the linear combination of 4-h and 3-h increased the Area Under the Curve (AUC) and decreased the number of false classifications (0.87; 15.0%) over the individual time points (0.83, 0.82; 15.6%, 25.3%, for 4-h and 3-h, respectively) at a higher sensitivity level (sensitivity = 0.9). The CART model using the four hourly GES measurements along with the patient's age was the most accurate diagnostic tool (AUC = 0.88, false classification = 13.8%). Patients having a 4-h gastric retention value >10% were 5 times more likely to have gastroparesis (179/207 = 86.5%) than those with ≤10% (18/113 = 15.9%). Conclusions: With a mixed group of patients either referred with suspected gastroparesis or investigated for other reasons, the CART model is more robust than the LDA and DF approaches, capable of accommodating covariate effects and can be generalized for cross institutional applications, but

  14. Quantifying natural delta variability using a multiple-point geostatistics prior uncertainty model

    Science.gov (United States)

    Scheidt, Céline; Fernandes, Anjali M.; Paola, Chris; Caers, Jef

    2016-10-01

    We address the question of quantifying uncertainty associated with autogenic pattern variability in a channelized transport system by means of a modern geostatistical method. This question has considerable relevance for practical subsurface applications as well, particularly those related to uncertainty quantification relying on Bayesian approaches. Specifically, we show how the autogenic variability in a laboratory experiment can be represented and reproduced by a multiple-point geostatistical prior uncertainty model. The latter geostatistical method requires selection of a limited set of training images from which a possibly infinite set of geostatistical model realizations, mimicking the training image patterns, can be generated. To that end, we investigate two methods to determine how many training images and what training images should be provided to reproduce natural autogenic variability. The first method relies on distance-based clustering of overhead snapshots of the experiment; the second method relies on a rate of change quantification by means of a computer vision algorithm termed the demon algorithm. We show quantitatively that with either training image selection method, we can statistically reproduce the natural variability of the delta formed in the experiment. In addition, we study the nature of the patterns represented in the set of training images as a representation of the "eigenpatterns" of the natural system. The eigenpatterns in the training image sets display patterns consistent with previous physical interpretations of the fundamental modes of this type of delta system: a highly channelized, incisional mode; a poorly channelized, depositional mode; and an intermediate mode between the two.

  15. Accelerating simulation for the multiple-point statistics algorithm using vector quantization

    Science.gov (United States)

    Zuo, Chen; Pan, Zhibin; Liang, Hao

    2018-03-01

    Multiple-point statistics (MPS) is a prominent algorithm to simulate categorical variables based on a sequential simulation procedure. Assuming training images (TIs) as prior conceptual models, MPS extracts patterns from TIs using a template and records their occurrences in a database. However, complex patterns increase the size of the database and require considerable time to retrieve the desired elements. In order to speed up simulation and improve simulation quality over state-of-the-art MPS methods, we propose an accelerating simulation for MPS using vector quantization (VQ), called VQ-MPS. First, a variable representation is presented to make categorical variables applicable for vector quantization. Second, we adopt a tree-structured VQ to compress the database so that stationary simulations are realized. Finally, a transformed template and classified VQ are used to address nonstationarity. A two-dimensional (2D) stationary channelized reservoir image is used to validate the proposed VQ-MPS. In comparison with several existing MPS programs, our method exhibits significantly better performance in terms of computational time, pattern reproductions, and spatial uncertainty. Further demonstrations consist of a 2D four facies simulation, two 2D nonstationary channel simulations, and a three-dimensional (3D) rock simulation. The results reveal that our proposed method is also capable of solving multifacies, nonstationarity, and 3D simulations based on 2D TIs.

  16. Zero-Point Energy Constraint for Unimolecular Dissociation Reactions. Giving Trajectories Multiple Chances To Dissociate Correctly.

    Science.gov (United States)

    Paul, Amit K; Hase, William L

    2016-01-28

    A zero-point energy (ZPE) constraint model is proposed for classical trajectory simulations of unimolecular decomposition and applied to CH4* → H + CH3 decomposition. With this model trajectories are not allowed to dissociate unless they have ZPE in the CH3 product. If not, they are returned to the CH4* region of phase space and, if necessary, given additional opportunities to dissociate with ZPE. The lifetime for dissociation of an individual trajectory is the time it takes to dissociate with ZPE in CH3, including multiple possible returns to CH4*. With this ZPE constraint the dissociation of CH4* is exponential in time as expected for intrinsic RRKM dynamics and the resulting rate constant is in good agreement with the harmonic quantum value of RRKM theory. In contrast, a model that discards trajectories without ZPE in the reaction products gives a CH4* → H + CH3 rate constant that agrees with the classical and not quantum RRKM value. The rate constant for the purely classical simulation indicates that anharmonicity may be important and the rate constant from the ZPE constrained classical trajectory simulation may not represent the complete anharmonicity of the RRKM quantum dynamics. The ZPE constraint model proposed here is compared with previous models for restricting ZPE flow in intramolecular dynamics, and connecting product and reactant/product quantum energy levels in chemical dynamics simulations.
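
    The bookkeeping behind the ZPE constraint can be caricatured as follows: a dissociation attempt only "counts" if the nascent product carries at least its zero-point energy; otherwise the trajectory is returned to the reactant region and the clock keeps running until a later attempt succeeds. The sketch below is a toy stand-in (random attempt times and a uniform product-energy distribution), not a real trajectory integration of CH4*.

```python
import random

def constrained_lifetime(zpe_product, mean_attempt_time, energy_sampler, rng):
    """Lifetime of one 'trajectory' under the ZPE-constraint bookkeeping.

    Each dissociation attempt is accepted only if the sampled product energy is
    at least the product ZPE; rejected attempts send the trajectory back and the
    accumulated time keeps growing, exactly as in the constraint described above.
    """
    t = 0.0
    while True:
        t += rng.expovariate(1.0 / mean_attempt_time)  # waiting time to the next attempt
        if energy_sampler(rng) >= zpe_product:
            return t

rng = random.Random(1)
zpe = 1.0  # toy value, not a CH3 zero-point energy

def sampler(r):
    # Toy product-energy distribution: uniform on [0, 2*ZPE], so roughly half
    # of the attempts are rejected and retried.
    return r.uniform(0.0, 2.0 * zpe)

lifetimes = [constrained_lifetime(zpe, 1.0, sampler, rng) for _ in range(10000)]
print("mean lifetime with the ZPE constraint:", sum(lifetimes) / len(lifetimes))
```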

  17. LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics

    Science.gov (United States)

    Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel

    2017-10-01

    Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of a training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines a technique called Locality Sensitive Hashing (LSH), which accelerates the search for patterns similar to a target one, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produces good-quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.
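
    A generic sketch of the two ingredients named above, locality-sensitive hashing to shortlist candidate patterns and a Hamming-type similarity to rank them, is given below for binary (two-facies) patterns. It uses plain bit-sampling LSH, a standard family for Hamming space, and is not necessarily the exact LSH variant or the RLE optimization implemented in LSHSIM.

```python
import random
from collections import defaultdict

def build_lsh_index(patterns, n_tables=4, bits_per_key=8, seed=0):
    """Index binary patterns with bit-sampling LSH (a standard Hamming-space family)."""
    rng = random.Random(seed)
    dim = len(patterns[0])
    projections = [rng.sample(range(dim), bits_per_key) for _ in range(n_tables)]
    tables = [defaultdict(list) for _ in range(n_tables)]
    for idx, p in enumerate(patterns):
        for proj, table in zip(projections, tables):
            table[tuple(p[i] for i in proj)].append(idx)
    return projections, tables

def query(target, patterns, projections, tables):
    """Shortlist candidates via the hash tables, then rank them by Hamming distance."""
    candidates = set()
    for proj, table in zip(projections, tables):
        candidates.update(table.get(tuple(target[i] for i in proj), []))
    return sorted(candidates, key=lambda i: sum(a != b for a, b in zip(target, patterns[i])))

# Toy database of 1000 random 5x5 binary patterns, flattened to length-25 tuples.
rng = random.Random(42)
db = [tuple(rng.randint(0, 1) for _ in range(25)) for _ in range(1000)]
proj, tab = build_lsh_index(db)
best = query(db[17], db, proj, tab)
print("best matches for pattern 17:", best[:5])
```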

  18. A location-based multiple point statistics method: modelling the reservoir with non-stationary characteristics

    Directory of Open Access Journals (Sweden)

    Yin Yanshu

    2017-12-01

    In this paper, a location-based multiple point statistics method is developed to model a non-stationary reservoir. The proposed method characterizes the relationship between the sedimentary pattern and the deposit location using the relative central position distance function, which alleviates the requirement that the training image and the simulated grids have the same dimension. The weights in every direction of the distance function can be changed to characterize the reservoir heterogeneity in various directions. The local integral replacements of data events, structured random path, distance tolerance and multi-grid strategy are applied to reproduce the sedimentary patterns and obtain a more realistic result. This method is compared with the traditional Snesim method using a synthesized 3-D training image of Poyang Lake and a reservoir model of Shengli Oilfield in China. The results indicate that the new method can reproduce the non-stationary characteristics better than the traditional method and is more suitable for simulation of delta-front deposits. These results show that the new method is a powerful tool for modelling a reservoir with non-stationary characteristics.

  19. Forces, surface finish and friction characteristics in surface engineered single- and multiple-point cutting edges

    International Nuclear Information System (INIS)

    Sarwar, M.; Gillibrand, D.; Bradbury, S.R.

    1991-01-01

    Advanced surface engineering technologies (physical and chemical vapour deposition) have been successfully applied to high speed steel and carbide cutting tools, and the potential benefits in terms of both performance and longer tool life, are now well established. Although major achievements have been reported by many manufacturers and users, there are a number of applications where surface engineering has been unsuccessful. Considerable attention has been given to the film characteristics and the variables associated with its properties; however, very little attention has been directed towards the benefits to the tool user. In order to apply surface engineering technology effectively to cutting tools, the coater needs to have accurate information relating to cutting conditions, i.e. cutting forces, stress and temperature etc. The present paper describes results obtained with single- and multiple-point cutting tools with examples of failures, which should help the surface coater to appreciate the significance of the cutting conditions, and in particular the magnitude of the forces and stresses present during cutting processes. These results will assist the development of a systems approach to cutting tool technology and surface engineering with a view to developing an improved product. (orig.)

  20. A Numerical Study on the Excitation of Guided Waves in Rectangular Plates Using Multiple Point Sources

    Directory of Open Access Journals (Sweden)

    Wenbo Duan

    2017-12-01

    Ultrasonic guided waves are widely used to inspect and monitor the structural integrity of plates and plate-like structures, such as ship hulls and large storage-tank floors. Recently, ultrasonic guided waves have also been used to remove ice and fouling from ship hulls, wind-turbine blades and aeroplane wings. In these applications, the strength of the sound source must be high for scanning a large area, or to break the bond between ice, fouling and plate substrate. More than one transducer may be used to achieve maximum sound power output. However, multiple sources can interact with each other, and form a sound field in the structure with local constructive and destructive regions. Destructive regions are weak regions and shall be avoided. When multiple transducers are used it is important that they are arranged in a particular way so that the desired wave modes can be excited to cover the whole structure. The objective of this paper is to provide a theoretical basis for generating particular wave mode patterns in finite-width rectangular plates whose length is assumed to be infinitely long with respect to its width and thickness. The wave modes have displacements in both width and thickness directions, and are thus different from the classical Lamb-type wave modes. A two-dimensional semi-analytical finite element (SAFE) method was used to study dispersion characteristics and mode shapes in the plate up to ultrasonic frequencies. The modal analysis provided information on the generation of modes suitable for a particular application. The number of point sources and direction of loading for the excitation of a few representative modes was investigated. Based on the SAFE analysis, a standard finite element modelling package, Abaqus, was used to excite the designed modes in a three-dimensional plate. The generated wave patterns in Abaqus were then compared with mode shapes predicted in the SAFE model. Good agreement was observed between the

  1. Development of a Whole-Body Haptic Sensor with Multiple Supporting Points and Its Application to a Manipulator

    Science.gov (United States)

    Hanyu, Ryosuke; Tsuji, Toshiaki

    This paper proposes a whole-body haptic sensing system that has multiple supporting points between the body frame and the end-effector. The system consists of an end-effector and multiple force sensors. Using this mechanism, the position of a contact force on the surface can be calculated without any sensor array. A haptic sensing system with a single supporting point structure has previously been developed by the present authors. However, the system has drawbacks such as low stiffness and low strength. Therefore, in this study, a mechanism with multiple supporting points was proposed and its performance was verified. In this paper, the basic concept of the mechanism is first introduced. Next, an evaluation of the proposed method, performed by conducting some experiments, is presented.
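
    A minimal planar sketch of how a contact position can be recovered from several supporting-point force sensors is shown below: with a single normal contact force, force and moment balance make the contact point the force-weighted centroid of the supports. The sensor layout and readings are made up, and the actual sensor in the paper is three-dimensional and more involved.

```python
import numpy as np

def contact_point_2d(sensor_xy, sensor_forces):
    """Estimate where a single normal force acts on a plate supported at several points.

    With only a normal contact force, force balance gives F = sum(f_i) and moment
    balance about the origin gives F * x_c = sum(x_i * f_i) (and likewise for y),
    so the contact point is the force-weighted centroid of the supporting points.
    """
    f = np.asarray(sensor_forces, dtype=float)
    xy = np.asarray(sensor_xy, dtype=float)
    return (xy * f[:, None]).sum(axis=0) / f.sum()

# Four supports at the corners of a 0.2 m x 0.2 m plate (made-up geometry).
supports = [(0.0, 0.0), (0.2, 0.0), (0.2, 0.2), (0.0, 0.2)]
readings = [2.0, 6.0, 6.0, 2.0]   # N; a push closer to the x = 0.2 edge
print("estimated contact point:", contact_point_2d(supports, readings))
```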

  2. Adaptive aspirations and performance heterogeneity : Attention allocation among multiple reference points

    NARCIS (Netherlands)

    Blettner, D.P.; He, Z.; Hu, S.; Bettis, R.

    Organizations learn and adapt their aspiration levels based on reference points (prior aspiration, prior performance, and prior performance of reference groups). The relative attention that organizations allocate to these reference points impacts organizational search and strategic decisions.

  3. Modeling spatial variability of sand-lenses in clay till settings using transition probability and multiple-point geostatistics

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Nilsson, Bertel; Klint, Knud Erik

    2010-01-01

    ... of sand-lenses in clay till. Sand-lenses mainly account for horizontal transport and are prioritised in this study. Based on field observations, the distribution has been modeled using two different geostatistical approaches. One method uses a Markov chain model calculating the transition probabilities (TPROGS) of alternating geological facies. The second method, multiple-point statistics, uses training images to estimate the conditional probability of sand-lenses at a certain location. Both methods respect field observations such as local stratigraphy; however, only the multiple-point statistics can ...

  4. X-ray diffraction imaging with the Multiple Inverse Fan Beam topology: Principles, performance and potential for security screening

    Energy Technology Data Exchange (ETDEWEB)

    Harding, G., E-mail: Geoffrey.Harding@Morphodetection.com [Morpho Detection Germany GmbH, Heselstuecken 3, 22453 Hamburg (Germany); Fleckenstein, H.; Kosciesza, D.; Olesinski, S.; Strecker, H.; Theedt, T.; Zienert, G. [Morpho Detection Germany GmbH, Heselstuecken 3, 22453 Hamburg (Germany)

    2012-07-15

    The steadily increasing number of explosive threat classes, including home-made explosives (HMEs), liquids, amorphous and gels (LAGs), is forcing up the false-alarm rates of security screening equipment. This development can best be countered by increasing the number of features available for classification. X-ray diffraction intrinsically offers multiple features for both solid and LAGs explosive detection, and is thus becoming increasingly important for false-alarm and cost reduction in both carry-on and checked baggage security screening. Following a brief introduction to X-ray diffraction imaging (XDI), which synthesizes in a single modality the image-forming and material-analysis capabilities of X-rays, the Multiple Inverse Fan Beam (MIFB) XDI topology is described. Physical relationships obtaining in such MIFB XDI components as the radiation source, collimators and room-temperature detectors are presented with experimental performances that have been achieved. Representative X-ray diffraction profiles of threat substances measured with a laboratory MIFB XDI system are displayed. The performance of Next-Generation (MIFB) XDI relative to that of the 2nd Generation XRD 3500 (TM) screener (Morpho Detection Germany GmbH) is assessed. The potential of MIFB XDI, both for reducing the exorbitant cost of false alarms in hold baggage screening (HBS), as well as for combining 'in situ' liquid and solid explosive detection in carry-on luggage screening is outlined. - Highlights: • X-ray diffraction imaging (XDI) synthesizes analysis and imaging in one X-ray modality. • A novel XDI beam topology comprising multiple inverse fan-beams (MIFB) is described. • The MIFB topology is technically easy to realize and has high photon collection efficiency. • Applications are envisaged in checkpoint, hold baggage and cargo screening.

  5. Channel capacity of TDD-OFDM-MIMO for multiple access points in a wireless single-frequency-network

    DEFF Research Database (Denmark)

    Takatori, Y.; Fitzek, Frank; Tsunekawa, K.

    2005-01-01

    The multiple-input-multiple-output (MIMO) technique is the most attractive candidate to improve spectrum efficiency in next-generation wireless communication systems. However, the efficiency of MIMO techniques is reduced in line-of-sight (LOS) environments. In this paper, we propose a new MIMO data transmission scheme, which combines Single-Frequency-Network (SFN) operation with TDD-OFDM-MIMO applied to wireless LAN networks. In our proposal, we advocate the use of SFN for multiple access point (MAP) MIMO data transmission. The goal of this approach is to achieve very high channel capacity in both ...
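
    The capacity comparisons behind such MIMO schemes rest on the standard equal-power channel-capacity formula C = log2 det(I + (SNR/Nt) * H * H^H). The sketch below evaluates it for a synthetic i.i.d. Rayleigh channel and for a rank-one, LOS-like channel, illustrating the LOS capacity loss mentioned above; it is not the SFN/TDD-OFDM channel model of the paper.

```python
import numpy as np

def mimo_capacity_bits(h, snr_linear):
    """Shannon capacity (bits/s/Hz) with equal power allocation:
    C = log2 det(I + (snr / n_tx) * H * H^H)."""
    n_rx, n_tx = h.shape
    m = np.eye(n_rx) + (snr_linear / n_tx) * (h @ h.conj().T)
    _, logabsdet = np.linalg.slogdet(m)   # m is Hermitian positive definite
    return logabsdet / np.log(2.0)

rng = np.random.default_rng(3)
snr = 10 ** (20 / 10)  # 20 dB
# Synthetic 4x4 i.i.d. Rayleigh channel (illustrative only).
h_rayleigh = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
print("4x4 Rayleigh capacity at 20 dB:", round(mimo_capacity_bits(h_rayleigh, snr), 2), "bits/s/Hz")
# A rank-one (strong LOS-like) channel of the same size carries far less.
h_los = np.ones((4, 4), dtype=complex)
print("rank-one LOS-like capacity at 20 dB:", round(mimo_capacity_bits(h_los, snr), 2), "bits/s/Hz")
```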

  6. Multiple-point statistical simulation for hydrogeological models: 3-D training image development and conditioning strategies

    Science.gov (United States)

    Høyer, Anne-Sophie; Vignoli, Giulio; Mejer Hansen, Thomas; Thanh Vu, Le; Keefer, Donald A.; Jørgensen, Flemming

    2017-12-01

    Most studies on the application of geostatistical simulations based on multiple-point statistics (MPS) to hydrogeological modelling focus on relatively fine-scale models and concentrate on the estimation of facies-level structural uncertainty. Much less attention is paid to the use of input data and optimal construction of training images. For instance, even though the training image should capture a set of spatial geological characteristics to guide the simulations, the majority of the research still relies on 2-D or quasi-3-D training images. In the present study, we demonstrate a novel strategy for 3-D MPS modelling characterized by (i) realistic 3-D training images and (ii) an effective workflow for incorporating a diverse group of geological and geophysical data sets. The study covers an area of 2810 km2 in the southern part of Denmark. MPS simulations are performed on a subset of the geological succession (the lower to middle Miocene sediments) which is characterized by relatively uniform structures and dominated by sand and clay. The simulated domain is large and each of the geostatistical realizations contains approximately 45 million voxels with size 100 m × 100 m × 5 m. Data used for the modelling include water well logs, high-resolution seismic data, and a previously published 3-D geological model. We apply a series of different strategies for the simulations based on data quality, and develop a novel method to effectively create observed spatial trends. The training image is constructed as a relatively small 3-D voxel model covering an area of 90 km2. We use an iterative training image development strategy and find that even slight modifications in the training image create significant changes in simulations. Thus, this study shows how to include both the geological environment and the type and quality of input information in order to achieve optimal results from MPS modelling. We present a practical workflow to build the training image and

  7. The collapsed cone algorithm for (192)Ir dosimetry using phantom-size adaptive multiple-scatter point kernels.

    Science.gov (United States)

    Tedgren, Åsa Carlsson; Plamondon, Mathieu; Beaulieu, Luc

    2015-07-07

    The aim of this work was to investigate how dose distributions calculated with the collapsed cone (CC) algorithm depend on the size of the water phantom used in deriving the point kernel for multiple scatter. A research version of the CC algorithm equipped with a set of selectable point kernels for multiple-scatter dose that had initially been derived in water phantoms of various dimensions was used. The new point kernels were generated using EGSnrc in spherical water phantoms of radii 5 cm, 7.5 cm, 10 cm, 15 cm, 20 cm, 30 cm and 50 cm. Dose distributions derived with CC in water phantoms of different dimensions and in a CT-based clinical breast geometry were compared to Monte Carlo (MC) simulations using the Geant4-based brachytherapy specific MC code Algebra. Agreement with MC within 1% was obtained when the dimensions of the phantom used to derive the multiple-scatter kernel were similar to those of the calculation phantom. Doses are overestimated at phantom edges when kernels are derived in larger phantoms and underestimated when derived in smaller phantoms (by around 2% to 7% depending on distance from source and phantom dimensions). CC agrees well with MC in the high dose region of a breast implant and is superior to TG43 in determining skin doses for all multiple-scatter point kernel sizes. Increased agreement between CC and MC is achieved when the point kernel is comparable to breast dimensions. The investigated approximation in multiple scatter dose depends on the choice of point kernel in relation to phantom size and yields a significant fraction of the total dose only at distances of several centimeters from a source/implant which correspond to volumes of low doses. The current implementation of the CC algorithm utilizes a point kernel derived in a comparatively large (radius 20 cm) water phantom. A fixed point kernel leads to predictable behaviour of the algorithm with the worst case being a source/implant located well within a patient

  8. Simultaneous reconstruction of multiple depth images without off-focus points in integral imaging using a graphics processing unit.

    Science.gov (United States)

    Yi, Faliu; Lee, Jieun; Moon, Inkyu

    2014-05-01

    The reconstruction of multiple depth images with a ray back-propagation algorithm in three-dimensional (3D) computational integral imaging is computationally burdensome. Further, a reconstructed depth image consists of a focus and an off-focus area. Focus areas are 3D points on the surface of an object that are located at the reconstructed depth, while off-focus areas include 3D points in free-space that do not belong to any object surface in 3D space. Generally, without being removed, the presence of an off-focus area would adversely affect the high-level analysis of a 3D object, including its classification, recognition, and tracking. Here, we use a graphics processing unit (GPU) that supports parallel processing with multiple processors to simultaneously reconstruct multiple depth images using a lookup table containing the shifted values along the x and y directions for each elemental image in a given depth range. Moreover, each 3D point on a depth image can be measured by analyzing its statistical variance with its corresponding samples, which are captured by the two-dimensional (2D) elemental images. These statistical variances can be used to classify depth image pixels as either focus or off-focus points. At this stage, the measurement of focus and off-focus points in multiple depth images is also implemented in parallel on a GPU. Our proposed method is conducted based on the assumption that there is no occlusion of the 3D object during the capture stage of the integral imaging process. Experimental results have demonstrated that this method is capable of removing off-focus points in the reconstructed depth image. The results also showed that using a GPU to remove the off-focus points could greatly improve the overall computational speed compared with using a CPU.
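
    The variance test described above can be sketched as follows: for each reconstructed 3-D point, gather the intensities back-projected from the elemental images and keep the point only if their variance is low. The synthetic samples and the threshold are illustrative assumptions, and the paper performs this per depth plane in parallel on a GPU rather than on a CPU as here.

```python
import numpy as np

def classify_focus(samples, threshold):
    """Label reconstructed points as focus (True) or off-focus (False).

    samples: array of shape (n_points, n_elemental_images) holding, for each
    reconstructed 3-D point, the intensities back-projected from the elemental
    images. Points on a real surface receive consistent intensities (low
    variance); off-focus points mix unrelated rays (high variance).
    """
    return samples.var(axis=1) < threshold

rng = np.random.default_rng(7)
n_images = 25
surface = 0.6 + 0.02 * rng.normal(size=(100, n_images))     # consistent samples
free_space = rng.uniform(0.0, 1.0, size=(100, n_images))    # unrelated samples
samples = np.vstack([surface, free_space])
labels = classify_focus(samples, threshold=0.01)
print("fraction kept in the surface half:", labels[:100].mean())
print("fraction kept in the free-space half:", labels[100:].mean())
```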

  9. MINIMALISM IN A PSYCHOLINGUISTIC POINT OF VIEW: BINDING PRINCIPLES AND ITS OPERATION IN ON-LINE PROCESSING OF COREFERENCE

    Directory of Open Access Journals (Sweden)

    José Ferrari Neto

    2014-12-01

    This article aims to evaluate how well a formal model of Grammar can be applied to the on-line mental processes underlying sentence processing. To this end, an experiment was carried out to observe how the Binding Principles act in the processing of coreferential relations in Brazilian Portuguese (BP). The results suggest that there is a convergence between linguistic computation and theories of linguistic processing.

  10. Assisting People with Multiple Disabilities by Improving Their Computer Pointing Efficiency with an Automatic Target Acquisition Program

    Science.gov (United States)

    Shih, Ching-Hsiang; Shih, Ching-Tien; Peng, Chin-Ling

    2011-01-01

    This study evaluated whether two people with multiple disabilities would be able to improve their pointing performance through an Automatic Target Acquisition Program (ATAP) and a newly developed mouse driver (i.e. a new mouse driver that replaces the standard mouse driver and is able to monitor mouse movement and intercept click actions). Initially, both…

  11. Multiple memory systems, multiple time points: how science can inform treatment to control the expression of unwanted emotional memories.

    Science.gov (United States)

    Visser, Renée M; Lau-Zhu, Alex; Henson, Richard N; Holmes, Emily A

    2018-03-19

    Memories that have strong emotions associated with them are particularly resilient to forgetting. This is not necessarily problematic, however some aspects of memory can be. In particular, the involuntary expression of those memories, e.g. intrusive memories after trauma, are core to certain psychological disorders. Since the beginning of this century, research using animal models shows that it is possible to change the underlying memory, for example by interfering with its consolidation or reconsolidation. While the idea of targeting maladaptive memories is promising for the treatment of stress and anxiety disorders, a direct application of the procedures used in non-human animals to humans in clinical settings is not straightforward. In translational research, more attention needs to be paid to specifying what aspect of memory (i) can be modified and (ii) should be modified. This requires a clear conceptualization of what aspect of memory is being targeted, and how different memory expressions may map onto clinical symptoms. Furthermore, memory processes are dynamic, so procedural details concerning timing are crucial when implementing a treatment and when assessing its effectiveness. To target emotional memory in its full complexity, including its malleability, science cannot rely on a single method, species or paradigm. Rather, a constructive dialogue is needed between multiple levels of research, all the way 'from mice to mental health'.This article is part of a discussion meeting issue 'Of mice and mental health: facilitating dialogue between basic and clinical neuroscientists'. © 2018 The Authors.

  12. Solving the multiple-set split equality common fixed-point problem of firmly quasi-nonexpansive operators.

    Science.gov (United States)

    Zhao, Jing; Zong, Haili

    2018-01-01

    In this paper, we propose parallel and cyclic iterative algorithms for solving the multiple-set split equality common fixed-point problem of firmly quasi-nonexpansive operators. We also combine the process of cyclic and parallel iterative methods and propose two mixed iterative algorithms. Our several algorithms do not need any prior information about the operator norms. Under mild assumptions, we prove weak convergence of the proposed iterative sequences in Hilbert spaces. As applications, we obtain several iterative algorithms to solve the multiple-set split equality problem.

  13. Roadside Multiple Objects Extraction from Mobile Laser Scanning Point Cloud Based on DBN

    Directory of Open Access Journals (Sweden)

    LUO Haifeng

    2018-02-01

    Full Text Available This paper proposes a novel algorithm that explores deep belief network (DBN) architectures to extract and recognize roadside facilities (trees, cars and traffic poles) from mobile laser scanning (MLS) point clouds. The proposed method first partitions the raw MLS point cloud into blocks and then removes the ground and building points. In order to partition the off-ground objects into individual objects, the off-ground points are organized into an octree structure and clustered into candidate objects based on connected components. To improve segmentation performance on clusters containing overlapping objects, a refining step using a voxel-based normalized cut is then applied. In addition, a multi-view feature descriptor is generated for each individual roadside object based on binary images. Finally, a deep belief network (DBN) is trained to extract tree, car and traffic pole objects. Experiments were undertaken to evaluate the validity of the proposed method with two datasets acquired by a Lynx Mobile Mapper system. The precision of the extraction results was 97.31% for trees, 97.79% for cars and 92.78% for traffic poles; the recall was 98.30%, 98.75% and 96.77%, the quality 95.70%, 93.81% and 90.00%, and the F1 measure 97.80%, 96.81% and 94.73%, respectively.
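
    As a quick consistency check on the reported scores, the F1 and quality measures follow from precision and recall under their usual definitions (assuming here that "quality" denotes the completeness-correctness measure TP/(TP+FP+FN)); a minimal Python check using the tree-class values reported above:

```python
# Precision (P) and recall (R) reported for the tree class.
P, R = 0.9731, 0.9830

f1 = 2 * P * R / (P + R)                   # harmonic mean of precision and recall
quality = 1.0 / (1.0 / P + 1.0 / R - 1.0)  # equivalent to TP / (TP + FP + FN)

print(f"F1 = {100 * f1:.2f}%")             # ~97.80%, matching the reported value
print(f"quality = {100 * quality:.2f}%")   # ~95.70%, matching the reported value
```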

  14. Under digital fluoroscopic guidance multiple-point injection with absolute alcohol and pinyangmycin for the treatment of superficial venous malformations

    International Nuclear Information System (INIS)

    Yang Ming; Xiao Gang; Peng Youlin

    2010-01-01

    Objective: To investigate the therapeutic efficacy of multiple-point injection with absolute alcohol and pinyangmycin under digital fluoroscopic guidance for superficial venous malformations. Methods: Using a disposable venous transfusion needle, the superficial venous malformation was punctured and the contrast medium iohexol was injected to visualize the tumor body; this was followed by the injection of ethanol and pinyangmycin once the needle was confirmed to be in the correct position. The procedure was successfully performed in 31 patients. The clinical results were observed and analyzed. Results: After one treatment, complete cure was achieved in 21 cases and a marked effect was obtained in 8 cases, giving an overall effectiveness rate of 93.5%. Conclusion: Multiple-point injection with ethanol and pinyangmycin under digital fluoroscopic guidance is an effective and safe technique for the treatment of superficial venous malformations, especially for lesions that are deeply located and ill-defined. (authors)

  15. Modeling a Single SEP Event from Multiple Vantage Points Using the iPATH Model

    Science.gov (United States)

    Hu, Junxiang; Li, Gang; Fu, Shuai; Zank, Gary; Ao, Xianzhi

    2018-02-01

    Using the recently extended 2D improved Particle Acceleration and Transport in the Heliosphere (iPATH) model, we model an example gradual solar energetic particle event as observed at multiple locations. Protons and ions that are energized via the diffusive shock acceleration mechanism are followed at a 2D coronal mass ejection-driven shock where the shock geometry varies across the shock front. The subsequent transport of energetic particles, including cross-field diffusion, is modeled by a Monte Carlo code that is based on a stochastic differential equation method. Time intensity profiles and particle spectra at multiple locations and different radial distances, separated in longitudes, are presented. The results shown here are relevant to the upcoming Parker Solar Probe mission.

  16. Variogram based and Multiple - Point Statistical simulation of shallow aquifer structures in the Upper Salzach valley, Austria

    Science.gov (United States)

    Jandrisevits, Carmen; Marschallinger, Robert

    2014-05-01

    Quaternary sediments in overdeepened alpine valleys and basins in the Eastern Alps bear substantial groundwater resources. The associated aquifer systems are generally geometrically complex with highly variable hydraulic properties. 3D geological models provide predictions of both the geometry and the properties of the subsurface required for subsequent modelling of groundwater flow and transport. In hydrology, geostatistical Kriging and Kriging-based conditional simulations are widely used to predict the spatial distribution of hydrofacies. In the course of investigating the shallow aquifer structures in the Zell basin in the Upper Salzach valley (Salzburg, Austria), a benchmark of available geostatistical modelling and simulation methods was performed: traditional variogram-based geostatistical methods, i.e. Indicator Kriging, Sequential Indicator Simulation and Sequential Indicator Co-Simulation, were used as well as Multiple-Point Statistics. The ~6 km2 investigation area is sampled by 56 drillings with depths of 5 to 50 m; in addition, there are 2 geophysical sections with lengths of 2 km and depths of 50 m. Due to clustered drilling sites, Indicator Kriging models failed to consistently model the spatial variability of hydrofacies. Using classical variogram-based geostatistical simulation (SIS), equally probable realizations were generated, with differences among the realizations providing an uncertainty measure. The yielded models are unstructured from a geological point of view - they do not portray the shapes and lateral extensions of the associated sedimentary units. Since variograms consider only two-point spatial correlations, they are unable to capture the spatial variability of complex geological structures. The Multiple-Point Statistics approach overcomes these limitations of two-point statistics as it uses a training image instead of variograms. The 3D training image can be seen as a reference facies model where geological knowledge about depositional

  17. Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model

    Science.gov (United States)

    Musekiwa, Alfred; Manda, Samuel O. M.; Mwambi, Henry G.; Chen, Ding-Geng

    2016-01-01

    Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores the dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes where we contrast different covariance structures for the dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect sizes and utilize a practical example involving a meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results. PMID:27798661

  18. Higher moments of net kaon multiplicity distributions at RHIC energies for the search of QCD Critical Point at STAR

    Directory of Open Access Journals (Sweden)

    Sarkar Amal

    2013-11-01

    Full Text Available In this paper we report measurements of the moments - mean (M), standard deviation (σ), skewness (S) and kurtosis (κ) - of the net-kaon multiplicity distribution at midrapidity from Au+Au collisions at √sNN = 7.7 to 200 GeV in the STAR experiment at RHIC, in an effort to locate the critical point in the QCD phase diagram. These moments and their products are related to the thermodynamic susceptibilities of conserved quantities such as net baryon number, net charge, and net strangeness, as well as to the correlation length of the system. A non-monotonic behavior of these variables would indicate the presence of the critical point. In this work we also present the moment products Sσ and κσ2 of the net-kaon multiplicity distribution as a function of collision centrality and energy. The energy and centrality dependence of the higher moments of net kaons and their products have been compared with the Poisson expectation and with simulations from AMPT, which does not include a critical point. From the measurements at all seven available beam energies, we find no evidence for a critical point in the QCD phase diagram for √sNN below 200 GeV.
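
    As a concrete illustration of the observables involved (not of the STAR analysis chain itself), the sketch below computes M, σ, S, κ and the products Sσ and κσ2 from a toy sample of event-by-event net-kaon numbers; κ is the excess kurtosis, and the Skellam distribution (difference of two independent Poisson variables) is the usual Poisson baseline for a net quantity:

```python
# Moments of an event-by-event net-kaon distribution (toy Skellam sample).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_kplus = rng.poisson(lam=12.0, size=100_000)   # K+ multiplicity per event (toy)
n_kminus = rng.poisson(lam=10.0, size=100_000)  # K- multiplicity per event (toy)
net_k = n_kplus - n_kminus                      # net-kaon number, event by event

M = net_k.mean()
sigma = net_k.std()
S = stats.skew(net_k)
kappa = stats.kurtosis(net_k)                   # excess kurtosis

print(f"M={M:.3f}  sigma={sigma:.3f}  S={S:.3f}  kappa={kappa:.3f}")
print(f"S*sigma       = {S * sigma:.3f}   (Skellam baseline: (12-10)/(12+10) ~ 0.091)")
print(f"kappa*sigma^2 = {kappa * sigma**2:.3f}   (Skellam baseline: 1)")
```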

  19. Variational Principles, Lie Point Symmetries, and Similarity Solutions of the Vector Maxwell Equations in Non-linear Optics

    DEFF Research Database (Denmark)

    Webb, Garry; Sørensen, Mads Peter; Brio, Moysey

    2004-01-01

    the electromagnetic momentum and energy conservation laws, corresponding to the space and time translation invariance symmetries. The symmetries are used to obtain classical similarity solutions of the equations. The traveling wave similarity solutions for the case of a cubic Kerr nonlinearity are shown to reduce...... the properties of Maxwell's equations in nonlinear optics, without resorting to the commonly used nonlinear Schrödinger (NLS) equation approximation in which a high frequency carrier wave is modulated on long length and time scales due to nonlinear sideband wave interactions. This is important in femto-second pulse propagation, in which the NLS approximation is expected to break down. The canonical Hamiltonian description of the equations involves the solution of a polynomial equation for the electric field $E$, in terms of the canonical variables, with possible multiple real roots for $E$. In order...

  20. Clutter-free Visualization of Large Point Symbols at Multiple Scales by Offset Quadtrees

    Directory of Open Access Journals (Sweden)

    ZHANG Xiang

    2016-08-01

    Full Text Available To address the cartographic problems in map mash-up applications in the Web 2.0 context, this paper studies a clutter-free technique for visualizing large symbols on Web maps. Basically, a quadtree is used to select one symbol in each grid cell at each zoom level. To resolve the symbol overlaps between neighboring quad-grids, multiple offsets are applied to the quadtree and a voting strategy is used to compute the significance level of symbols for their selection at multiple scales. The method is able to resolve spatial conflicts without explicit conflict detection, thus enabling highly efficient processing. The resulting map also forms a visual hierarchy of semantic importance. We discuss issues such as relative importance, the symbol-to-grid size ratio, and effective offset schemes, and propose two extensions to make better use of the free space available on the map. Experiments were carried out to validate the technique, which demonstrates its robustness and efficiency (a non-optimized implementation achieves sub-second processing for datasets on the order of 10^5 symbols).
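
    The core selection step - keeping at most one symbol per quad-grid cell at a given zoom level - can be sketched as below; the grid sizing and the significance weights are illustrative assumptions, and the offset quadtrees and voting strategy that resolve overlaps across neighbouring cells are deliberately omitted:

```python
# Sketch: keep at most one point symbol per quadtree grid cell at a zoom level.
# `symbols` is a list of (x, y, weight); a higher weight means a more significant symbol.
def select_per_cell(symbols, zoom, world_size=1.0):
    cell = world_size / (2 ** zoom)            # quad-grid cell size at this zoom level
    best = {}                                  # (i, j) grid index -> retained symbol
    for x, y, w in symbols:
        key = (int(x // cell), int(y // cell))
        if key not in best or w > best[key][2]:
            best[key] = (x, y, w)
        # conflicts between neighbouring cells are what the offset quadtrees and
        # voting strategy of the paper resolve; this sketch ignores them
    return list(best.values())

symbols = [(0.01, 0.34, 3.0), (0.02, 0.35, 5.0), (0.80, 0.20, 1.0)]
print(select_per_cell(symbols, zoom=3))        # the two nearby symbols collapse to one
```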

  1. Tracking an oil slick from multiple natural sources, Coal Oil Point, California

    International Nuclear Information System (INIS)

    Leifer, Ira; Luyendyk, Bruce; Broderick, Kris

    2006-01-01

    Oil slicks on the ocean surface emitted from natural marine hydrocarbon seeps offshore from Coal Oil Point in the Santa Barbara Channel, California were tracked and sampled over a 2-h period. The objectives were to characterize the seep oil and to track its composition over time using a new sampling device, a catamaran drum sampler (CATDRUMS). The sampler was designed and developed at UCSB. Chromatograms showed that oil originating from an informally named, very active seep area, Shane Seep, primarily evolved during the first hour due to mixing with oil originating from a convergence zone slick surrounding Shane Seep. (author)

  2. Tracking an oil slick from multiple natural sources, Coal Oil Point, California

    Energy Technology Data Exchange (ETDEWEB)

    Leifer, Ira [Marine Sciences Institute, University of California, Santa Barbara, CA 93106 (United States); Luyendyk, Bruce [Department of Geological Sciences, University of California, Santa Barbara, CA 93106 (United States); Broderick, Kris [Exxon/Mobil Exploration Company, 13401 N. Freeway, Houston, TX 77060 (United States)

    2006-06-15

    Oil slicks on the ocean surface emitted from natural marine hydrocarbon seeps offshore from Coal Oil Point in the Santa Barbara Channel, California were tracked and sampled over a 2-h period. The objectives were to characterize the seep oil and to track its composition over time using a new sampling device, a catamaran drum sampler (CATDRUMS). The sampler was designed and developed at UCSB. Chromatograms showed that oil originating from an informally named, very active seep area, Shane Seep, primarily evolved during the first hour due to mixing with oil originating from a convergence zone slick surrounding Shane Seep. (author)

  3. Linking data repositories - an illustration of agile data curation principles through robust documentation and multiple application programming interfaces

    Science.gov (United States)

    Benedict, K. K.; Servilla, M. S.; Vanderbilt, K.; Wheeler, J.

    2015-12-01

    The growing volume, variety and velocity of production of Earth science data magnifies the impact of inefficiencies in data acquisition, processing, analysis, and sharing workflows, potentially to the point of impairing the ability of researchers to accomplish their desired scientific objectives. The adaptation of agile software development principles (http://agilemanifesto.org/principles.html) to data curation processes has significant potential to lower barriers to effective scientific data discovery and reuse - barriers that otherwise may force the development of new data to replace existing but unusable data, or require substantial effort to make data usable in new research contexts. This paper outlines a data curation process that was developed at the University of New Mexico that provides a cross-walk of data and associated documentation between the data archive developed by the Long Term Ecological Research (LTER) Network Office (PASTA - http://lno.lternet.edu/content/network-information-system) and UNM's institutional repository (LoboVault - http://repository.unm.edu). The developed automated workflow enables the replication of versioned data objects and their associated standards-based metadata between the LTER system and LoboVault - providing long-term preservation for those data/metadata packages within LoboVault while maintaining the value-added services that the PASTA platform provides. The relative ease with which this workflow was developed is a product of the capabilities independently developed on both platforms - including the simplicity of providing a well-documented application programming interface (API) for each platform enabling scripted interaction and the use of well-established documentation standards (EML in the case of PASTA, Dublin Core in the case of LoboVault) by both systems. These system characteristics, when combined with an iterative process of interaction between the Data Curation Librarian (on the LoboVault side of the process

  4. Analytical solutions of nonlocal Poisson dielectric models with multiple point charges inside a dielectric sphere

    Science.gov (United States)

    Xie, Dexuan; Volkmer, Hans W.; Ying, Jinyong

    2016-04-01

    The nonlocal dielectric approach has led to new models and solvers for predicting electrostatics of proteins (or other biomolecules), but how to validate and compare them remains a challenge. To promote such a study, in this paper, two typical nonlocal dielectric models are revisited. Their analytical solutions are then found as simple series expressions for a dielectric sphere containing any number of point charges. As a special case, the analytical solution of the corresponding Poisson dielectric model is also derived in simple series, which significantly improves on the well-known Kirkwood double series expansion. Furthermore, a convolution of one nonlocal dielectric solution with a commonly used nonlocal kernel function is obtained, along with the reaction parts of these local and nonlocal solutions. To turn these new series solutions into a valuable research tool, they are programmed as a free Fortran software package, which can input point charge data directly from a Protein Data Bank file. Consequently, different validation tests can be quickly done on different proteins. Finally, a test example for a protein with 488 atomic charges is reported to demonstrate the differences between the local and nonlocal models as well as the importance of using the reaction parts to develop local and nonlocal dielectric solvers.

  5. LDPC decoder with a limited-precision FPGA-based floating-point multiplication coprocessor

    Science.gov (United States)

    Moberly, Raymond; O'Sullivan, Michael; Waheed, Khurram

    2007-09-01

    Implementing the sum-product algorithm in an FPGA with an embedded processor invites us to consider a tradeoff between computational precision and computational speed. The algorithm, known outside of the signal processing community as Pearl's belief propagation, is used for iterative soft-decision decoding of LDPC codes. We determined the feasibility of a coprocessor that will perform product computations. Our FPGA-based coprocessor design performs its computations with significantly less precision than the standard (e.g. integer, floating-point) operations of general purpose processors. Using synthesis, targeting a 3,168 LUT Xilinx FPGA, we show that key components of a decoder are feasible and that the full single-precision decoder could be constructed using a larger part. Soft-decision decoding by the iterative belief propagation algorithm is impacted both positively and negatively by a reduction in the precision of the computation. Reducing precision reduces the coding gain, but the limited-precision computation can operate faster. A proposed solution offers custom logic to perform computations with less precision, yet uses the floating-point format to interface with the software. Simulation results show the achievable coding gain. Synthesis results help to project the full capacity and performance of an FPGA-based coprocessor.
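
    For orientation, the product computation that such a coprocessor accelerates sits in the check-node update of the sum-product (belief propagation) decoder. A Python sketch of that update in the log-likelihood-ratio domain is shown below; the crude quantization step is purely illustrative of the precision/speed trade-off, not the paper's fixed-point format:

```python
# Sum-product check-node update (tanh rule), with an optional toy quantization
# step to mimic limited-precision arithmetic.
import numpy as np

def check_node_update(llr_in, bits=None):
    """llr_in: LLR messages arriving at one check node; returns outgoing messages."""
    t = np.tanh(np.asarray(llr_in, dtype=float) / 2.0)
    out = np.empty_like(t)
    for i in range(len(t)):
        prod = np.prod(np.delete(t, i))               # product over all other edges
        out[i] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
    if bits is not None:                              # toy uniform quantizer (assumption)
        step = 2.0 ** -(bits - 4)
        out = np.round(out / step) * step
    return out

print(check_node_update([1.2, -0.8, 2.5]))
print(check_node_update([1.2, -0.8, 2.5], bits=6))    # coarser messages, cheaper hardware
```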

  6. Multiple organ failure in the newborn: the point of view of the pathologist

    Directory of Open Access Journals (Sweden)

    Clara Gerosa

    2014-06-01

    Full Text Available One of the most severe events occurring in critically ill patients admitted to a neonatal intensive care unit (NICU) is multiple organ failure (MOF), a systemic inflammatory response leading to progressive organ dysfunction and mortality in newborns. MOF may occur in newborns primarily affected by multiple single-organ diseases, including respiratory distress syndrome, neonatal sepsis with acute kidney injury, post-asphyxial hypoxic-ischemic encephalopathy and pandemic influenza A (H1N1) infection. In a previous article from our group, based on the histological examination of all organs at autopsy of newborns affected by MOF, none of the organs studied escaped damage, including the thymus and pancreas, which are normally not mentioned in the MOF literature. The aim of this article is to review the most important pathological changes pathologists should look for in every case of MOF occurring in the perinatal period, with particular attention to the systemic endothelial changes occurring in blood vessels in all organs and systems. On the basis of our experience, matching data during the last phases of the clinicopathological diagnosis represents a useful method, much more productive than the method based on giving pathological answers to the clinical questions raised before autopsy. Among the pathological features observed in neonatal MOF, one deserves particular attention: the vascular lesions, and in particular the multiple changes occurring during MOF development in endothelial cells, ending with the loss of the endothelial barrier, probably the most relevant histological lesion, followed by the onset of interstitial edema and disseminated intravascular coagulation. Small vessels should be observed at high power, with particular attention to the size and shape of endothelial nuclei, in order to detect endothelial swelling, probably the initial modification of the endothelial cells leading to their

  7. A Microsoft Kinect-Based Point-of-Care Gait Assessment Framework for Multiple Sclerosis Patients.

    Science.gov (United States)

    Gholami, Farnood; Trojan, Daria A; Kovecses, Jozsef; Haddad, Wassim M; Gholami, Behnood

    2017-09-01

    Gait impairment is a prevalent and important difficulty for patients with multiple sclerosis (MS), a common neurological disorder. An easy to use tool to objectively evaluate gait in MS patients in a clinical setting can assist clinicians in performing an objective assessment. The overall objective of this study is to develop a framework to quantify gait abnormalities in MS patients using the Microsoft Kinect for Windows sensor, an inexpensive, easy to use, portable camera. Specifically, we aim to evaluate its feasibility for utilization in a clinical setting, assess its reliability, evaluate the validity of gait indices obtained, and evaluate a novel set of gait indices based on the concept of dynamic time warping. In this study, ten ambulatory MS patients and ten age- and sex-matched normal controls were studied in a single session in a clinical setting with gait assessment using a Kinect camera. The expanded disability status scale (EDSS) clinical ambulation score was calculated for the MS subjects, and patients completed the Multiple Sclerosis Walking Scale (MSWS). Based on this study, we established the potential feasibility of using a Microsoft Kinect camera in a clinical setting. Seven out of the eight gait indices obtained using the proposed method were reliable, with intraclass correlation coefficients ranging from 0.61 to 0.99. All eight MS gait indices were significantly different from those of the controls (p-values less than 0.05). Finally, seven out of the eight MS gait indices were correlated with the objective and subjective gait measures (Pearson's correlation coefficients greater than 0.40). This study shows that the Kinect camera is an easy to use tool to assess gait in MS patients in a clinical setting.
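
    Since the novel gait indices are built on dynamic time warping, a minimal, generic DTW distance (not the authors' exact index definitions) may help fix ideas:

```python
# Minimal dynamic time warping (DTW) distance between two 1-D sequences,
# e.g. joint-angle time series extracted from Kinect skeleton tracking.
import numpy as np

def dtw_distance(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy usage: compare a patient's gait signal against a reference pattern.
reference = np.sin(np.linspace(0, 4 * np.pi, 80))
patient = 0.9 * np.sin(np.linspace(0, 4 * np.pi, 95))   # slower, damped cycle (toy)
print(f"DTW distance = {dtw_distance(reference, patient):.3f}")
```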

  8. Leading bureaucracies to the tipping point: An alternative model of multiple stable equilibrium levels of corruption

    Science.gov (United States)

    Caulkins, Jonathan P.; Feichtinger, Gustav; Grass, Dieter; Hartl, Richard F.; Kort, Peter M.; Novak, Andreas J.; Seidl, Andrea

    2013-01-01

    We present a novel model of corruption dynamics in the form of a nonlinear optimal dynamic control problem. It has a tipping point, but one whose origins and character are distinct from that in the classic Schelling (1978) model. The decision maker choosing a level of corruption is the chief or some other kind of authority figure who presides over a bureaucracy whose state of corruption is influenced by the authority figure’s actions, and whose state in turn influences the pay-off for the authority figure. The policy interpretation is somewhat more optimistic than in other tipping models, and there are some surprising implications, notably that reforming the bureaucracy may be of limited value if the bureaucracy takes its cues from a corrupt leader. PMID:23565027

  9. Leading bureaucracies to the tipping point: An alternative model of multiple stable equilibrium levels of corruption.

    Science.gov (United States)

    Caulkins, Jonathan P; Feichtinger, Gustav; Grass, Dieter; Hartl, Richard F; Kort, Peter M; Novak, Andreas J; Seidl, Andrea

    2013-03-16

    We present a novel model of corruption dynamics in the form of a nonlinear optimal dynamic control problem. It has a tipping point, but one whose origins and character are distinct from that in the classic Schelling (1978) model. The decision maker choosing a level of corruption is the chief or some other kind of authority figure who presides over a bureaucracy whose state of corruption is influenced by the authority figure's actions, and whose state in turn influences the pay-off for the authority figure. The policy interpretation is somewhat more optimistic than in other tipping models, and there are some surprising implications, notably that reforming the bureaucracy may be of limited value if the bureaucracy takes its cues from a corrupt leader.

  10. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    International Nuclear Information System (INIS)

    Gora, D.; Bernardini, E.; Cruz Silva, A.H.

    2011-04-01

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)
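
    For context, unbinned likelihood point-source searches of this kind are commonly written in the following generic form (quoted as the standard formulation, not necessarily the exact likelihood used here), with the time-clustering extension replacing the single signal term by a sum over candidate flare windows:
    \[ \mathcal{L}(n_s,\gamma) \;=\; \prod_{i=1}^{N}\left[\frac{n_s}{N}\,S_i(\vec{x}_i,E_i,t_i;\gamma) \;+\; \Big(1-\frac{n_s}{N}\Big)\,B_i(\vec{x}_i,E_i,t_i)\right], \]
    where N is the total number of events, n_s the fitted number of signal events, γ the assumed spectral index, and S_i and B_i the signal and background probability densities evaluated for event i; a flare is claimed when the likelihood-ratio test statistic against the background-only hypothesis (n_s = 0) is sufficiently large.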

  11. Multiple Information Fusion Face Recognition Using Key Feature Points

    Directory of Open Access Journals (Sweden)

    LIN Kezheng

    2017-06-01

    Full Text Available After years of face recognition research, the effects of illumination, noise and other conditions have kept recognition rates relatively low, and 2D face recognition technology can no longer keep pace with the state of the art; although 3D face recognition technology is developing step by step, it has a higher complexity. In order to solve this problem, based on the traditional depth-information positioning method and local feature analysis (LFA), an improved 3D face key feature point localization algorithm is proposed, and, on the basis of training samples obtained by complete clustering, a weighted fusion algorithm for global and local feature extraction is further proposed. Comparison and analysis of experimental data from the FRGC and BU-3DFE face databases show that the method achieves higher robustness in 3D face recognition.

  12. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    Energy Technology Data Exchange (ETDEWEB)

    Gora, D. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Institute of Nuclear Physics PAN, Cracow (Poland); Bernardini, E.; Cruz Silva, A.H. [Institute of Nuclear Physics PAN, Cracow (Poland)

    2011-04-15

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)

  13. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    Science.gov (United States)

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

    Identification of the critical source areas of non-point source pollution is an important means to control non-point source pollution within a watershed. In order to further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulated total nitrogen (TN) loss intensity of all 38 subbasins, the spatial distribution characteristics of nitrogen loss and the critical source areas were analyzed at three time scales: yearly average, monthly average and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to analyze the contributions of the natural environment and anthropogenic disturbance to nitrogen loss. The results showed that there were significant spatial differences in TN loss in the Shanmei Reservoir watershed at different time scales, and the degree of spatial differentiation of nitrogen loss was in the order monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At all time scales, land use types (such as farmland and forest) were the dominant factor affecting the spatial distribution of nitrogen loss, whereas the effect of precipitation and runoff on nitrogen loss was evident only in months without fertilization and in several storm flood processes occurring on dates without fertilization. This was mainly due to the significant spatial variation of land use and fertilization, as well as the low spatial variability of precipitation and runoff.

  14. Does the nervous system use equilibrium-point control to guide single and multiple joint movements?

    Science.gov (United States)

    Bizzi, E; Hogan, N; Mussa-Ivaldi, F A; Giszter, S

    1992-12-01

    The hypothesis that the central nervous system (CNS) generates movement as a shift of the limb's equilibrium posture has been corroborated experimentally in studies involving single- and multijoint motions. Posture may be controlled through the choice of muscle length-tension curves that set agonist-antagonist torque-angle curves, determining an equilibrium position for the limb and the stiffness about the joints. Arm trajectories seem to be generated through a control signal defining a series of equilibrium postures. The equilibrium-point hypothesis drastically simplifies the requisite computations for multijoint movements and mechanical interactions with complex dynamic objects in the environment. Because the neuromuscular system is springlike, the instantaneous difference between the arm's actual position and the equilibrium position specified by the neural activity can generate the requisite torques, avoiding the complex "inverse dynamic" problem of computing the torques at the joints. The hypothesis provides a simple, unified description of posture and movement as well as contact control task performance, in which the limb must exert force stably and do work on objects in the environment. The latter is a surprisingly difficult problem, as robotic experience has shown. The prior evidence for the hypothesis came mainly from psychophysical and behavioral experiments. Our recent work has shown that microstimulation of the frog spinal cord's premotoneural network produces leg movements to various positions in the frog's motor space. The hypothesis can now be investigated in the neurophysiological machinery of the spinal cord.

  15. Design and Integration of an All-Magnetic Attitude Control System for FASTSAT-HSV01's Multiple Pointing Objectives

    Science.gov (United States)

    DeKock, Brandon; Sanders, Devon; Vanzwieten, Tannen; Capo-Lugo, Pedro

    2011-01-01

    The FASTSAT-HSV01 spacecraft is a microsatellite with magnetic torque rods as its sole attitude control actuator. FASTSAT's multiple payloads and mission functions require the Attitude Control System (ACS) to maintain Local Vertical Local Horizontal (LVLH)-referenced attitudes without spin-stabilization, while keeping the pointing errors for some attitudes significantly smaller than the best previously demonstrated for this type of control system. The mission requires the ACS to hold multiple stable, unstable, and non-equilibrium attitudes, as well as to eject a 3U CubeSat from an onboard P-POD and recover from the ensuing tumble. This paper describes the Attitude Control System, the reasons for the design choices, how the ACS integrates with the rest of the spacecraft, and gives recommendations for potential future applications of the work.

  16. Monte Carlo full-waveform inversion of crosshole GPR data using multiple-point geostatistical a priori information

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2012-01-01

    We present a general Monte Carlo full-waveform inversion strategy that integrates a priori information described by geostatistical algorithms with Bayesian inverse problem theory. The extended Metropolis algorithm can be used to sample the a posteriori probability density of highly nonlinear...... inverse problems, such as full-waveform inversion. Sequential Gibbs sampling is a method that allows efficient sampling of a priori probability densities described by geostatistical algorithms based on either two-point (e.g., Gaussian) or multiple-point statistics. We outline the theoretical framework......) Based on a posteriori realizations, complicated statistical questions can be answered, such as the probability of connectivity across a layer. (3) Complex a priori information can be included through geostatistical algorithms. These benefits, however, require more computing resources than traditional...
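
    A bare-bones sketch of the extended Metropolis step is given below; the geostatistical prior resampler and the waveform forward model are placeholders (both are assumptions standing in for sequential Gibbs sampling and full-waveform modelling), which is precisely what lets the acceptance ratio involve only the likelihood:

```python
# Sketch of the extended Metropolis algorithm: proposals are drawn by resampling
# part of the model from a geostatistical prior, so only the data misfit
# (likelihood) enters the acceptance ratio.
import numpy as np

rng = np.random.default_rng(1)

def perturb_from_prior(model):
    """Placeholder for sequential Gibbs resampling: re-simulate a random subset
    of model cells conditional on the rest (here: crude Gaussian stand-in)."""
    proposal = model.copy()
    idx = rng.choice(model.size, size=max(1, model.size // 20), replace=False)
    proposal[idx] = rng.normal(model.mean(), model.std() + 1e-9, size=idx.size)
    return proposal

def log_likelihood(model, data, forward, sigma):
    residual = data - forward(model)        # forward(): placeholder waveform modelling
    return -0.5 * np.sum((residual / sigma) ** 2)

def extended_metropolis(m0, data, forward, sigma, n_iter=1000):
    m, logL = m0.copy(), log_likelihood(m0, data, forward, sigma)
    samples = []
    for _ in range(n_iter):
        m_new = perturb_from_prior(m)
        logL_new = log_likelihood(m_new, data, forward, sigma)
        if np.log(rng.uniform()) < logL_new - logL:   # Metropolis acceptance rule
            m, logL = m_new, logL_new
        samples.append(m.copy())
    return samples                          # a posteriori realizations
```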

  17. A novel data mining system points out hidden relationships between immunological markers in multiple sclerosis

    Directory of Open Access Journals (Sweden)

    Gironi Maira

    2013-01-01

    Full Text Available Abstract Background Multiple Sclerosis (MS) is a multi-factorial disease, for which a single biomarker is unlikely to provide comprehensive information. Moreover, due to the non-linearity of biomarkers, traditional statistics is both unsuitable and underpowered to dissect their relationships. Patients affected with primary (PP = 14), secondary (SP = 33), benign (BB = 26) and relapsing-remitting (RR = 30) MS, and 42 sex- and age-matched healthy controls were studied. We performed an in-depth immunophenotypic and functional analysis of peripheral blood mononuclear cells (PBMCs) by flow cytometry. Semantic connectivity maps (AutoCM) were applied to find the natural associations among immunological markers. AutoCM is a special kind of Artificial Neural Network able to find consistent trends and associations among variables. The matrix of connections, visualized through a minimum spanning tree, preserves non-linear associations among variables and captures connection schemes among clusters. Results Complex immunological relationships were shown to be related to different disease courses. A low CD4IL25+ cell level was strongly related (link strength, ls = 0.81) to SP MS. This phenotype was also associated with high CD4ROR+ cell levels (ls = 0.56). BB MS was related to high CD4+IL13 cell levels (ls = 0.90), as well as to a high CD14+IL6 cell percentage (ls = 0.80). RR MS was strongly (ls = 0.87) related to high CD4+IL25 cell levels, as well as, indirectly, to high percentages of CD4+IL13 cells. This latter strong (ls = 0.92) association may reflect the inducing activity of the former cells (CD4+IL25) on the latter (CD4+IL13). Another interesting topographic finding was the isolation of Th9 cells (CD4IL9) from the main part of the immunological network related to MS, suggesting a possible secondary role of this newly described cell phenotype in MS disease. Conclusions This novel application of non-linear mathematical techniques suggests peculiar immunological signatures for different MS phenotypes. Notably, the

  18. Accuracy improvement techniques in Precise Point Positioning method using multiple GNSS constellations

    Science.gov (United States)

    Vasileios Psychas, Dimitrios; Delikaraoglou, Demitris

    2016-04-01

    The future Global Navigation Satellite Systems (GNSS), including modernized GPS, GLONASS, Galileo and BeiDou, offer three or more signal carriers for civilian use and much more redundant observables. The additional frequencies can significantly improve the capabilities of the traditional geodetic techniques based on GPS signals at two frequencies, especially with regard to the availability, accuracy, interoperability and integrity of high-precision GNSS applications. Furthermore, highly redundant measurements can allow for robust simultaneous estimation of static or mobile user states, including more parameters such as real-time tropospheric biases, and more reliable ambiguity resolution estimates. This paper presents an investigation and analysis of accuracy improvement techniques in the Precise Point Positioning (PPP) method using signals from the fully operational (GPS and GLONASS) as well as the emerging (Galileo and BeiDou) GNSS systems. The main aim was to determine the improvement both in the positioning accuracy achieved and in the convergence time needed to reach geodetic-level (10 cm or less) accuracy. To this end, freely available observation data from the recent Multi-GNSS Experiment (MGEX) of the International GNSS Service, as well as the open source program RTKLIB, were used. Following a brief background of the PPP technique and the scope of MGEX, the paper outlines the various observational scenarios that were used in order to test various data processing aspects of PPP solutions with multi-frequency, multi-constellation GNSS systems. Results from the processing of multi-GNSS observation data from selected permanent MGEX stations are presented and useful conclusions and recommendations for further research are drawn. As shown, data fusion from GPS, GLONASS, Galileo and BeiDou systems is becoming increasingly significant nowadays, resulting in a position accuracy increase (mostly in the less favorable East direction) and a large reduction of convergence

  19. Point specificity in acupuncture

    Directory of Open Access Journals (Sweden)

    Choi Emma M

    2012-02-01

    Full Text Available Abstract The existence of point specificity in acupuncture is controversial, because many acupuncture studies using this principle to select control points have found that sham acupoints have similar effects to those of verum acupoints. Furthermore, the results of pain-related studies based on visual analogue scales have not supported the concept of point specificity. In contrast, hemodynamic, functional magnetic resonance imaging and neurophysiological studies evaluating the responses to stimulation of multiple points on the body surface have shown that point-specific actions are present. This review article focuses on clinical and laboratory studies supporting the existence of point specificity in acupuncture and also addresses studies that do not support this concept. Further research is needed to elucidate the point-specific actions of acupuncture.

  20. Comparison of four microfinance markets from the point of view of the effectuation theory, complemented by proposed musketeer principle illustrating forces within village banks

    Directory of Open Access Journals (Sweden)

    Hes Tomáš

    2017-03-01

    Full Text Available Microfinance services are essential tools for the formalization of the shadow economy, leveraging immature entrepreneurship with external capital. Given the importance of the shadow economy for the social balance of developing countries, an answer to the question of how microfinance entities come into existence is rather essential. While the decision-making processes leading to entrepreneurship were explained by the effectuation theory developed in the 1990s, these explanations were concerned with the logic of creation of microenterprises neither in developing countries nor in microfinance village banks. While the abovementioned theories explain the nascence of companies in the environment of developed markets, the importance of a focus on emerging markets is obvious, given the large share of human society represented by microfinance clientele. The study adds a development streak to the effectuation theory, proposing the musketeer principle alongside the five effectuation principles formulated by Sarasvathy. Furthermore, the hitherto unconsidered relationship between social capital and effectuation-related concepts is another proposal of the paper, which focuses on describing the nature of microfinance clientele from the point of view of effectuation theory and social capital, drawing a comparison of microfinance markets in four countries: Turkey, Sierra Leone, Indonesia and Afghanistan.

  1. Assimilation of concentration measurements for retrieving multiple point releases in atmosphere: A least-squares approach to inverse modelling

    Science.gov (United States)

    Singh, Sarvesh Kumar; Rani, Raj

    2015-10-01

    The study addresses the identification of multiple point sources, emitting the same tracer, from their limited set of merged concentration measurements. The identification, here, refers to the estimation of locations and strengths of a known number of simultaneous point releases. The source-receptor relationship is described in the framework of adjoint modelling by using an analytical Gaussian dispersion model. A least-squares minimization framework, free from an initialization of the release parameters (locations and strengths), is presented to estimate the release parameters. This utilizes the distributed source information observable from the given monitoring design and number of measurements. The technique leads to an exact retrieval of the true release parameters when measurements are noise free and exactly described by the dispersion model. The inversion algorithm is evaluated using the real data from multiple (two, three and four) releases conducted during Fusion Field Trials in September 2007 at Dugway Proving Ground, Utah. The release locations are retrieved, on average, within 25-45 m of the true sources with the distance from retrieved to true source ranging from 0 to 130 m. The release strengths are also estimated within a factor of three to the true release rates. The average deviations in retrieval of source locations are observed relatively large in two release trials in comparison to three and four release trials.
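
    Since the measured concentrations are linear in the unknown release strengths once candidate locations are fixed, the strength part of such an inversion reduces to ordinary least squares. The sketch below uses a generic, simplified ground-level Gaussian plume as the forward model; the dispersion coefficients, wind direction and geometry are illustrative assumptions, not the adjoint/analytical model of the study:

```python
# Sketch: retrieve the strengths of multiple point releases by linear least squares,
# given candidate source locations and a simplified Gaussian plume forward model.
import numpy as np

def plume_coupling(src, rec, u=3.0, sigma_y=25.0, sigma_z=10.0):
    """Concentration at receiver `rec` per unit release rate from source `src`
    (ground-level source and receiver, wind along +x; illustrative coefficients)."""
    dx, dy = rec[0] - src[0], rec[1] - src[1]
    if dx <= 0.0:
        return 0.0                                  # receiver upwind of the source
    return np.exp(-0.5 * (dy / sigma_y) ** 2) / (np.pi * u * sigma_y * sigma_z)

sources = [(0.0, 0.0), (50.0, 80.0)]                # candidate release locations (m)
receivers = [(200.0, y) for y in (-60, -20, 0, 20, 60, 100)]

A = np.array([[plume_coupling(s, r) for s in sources] for r in receivers])
q_true = np.array([2.0, 5.0])                       # true release rates (toy)
c_obs = A @ q_true + 1e-7 * np.random.default_rng(2).normal(size=len(receivers))

q_est, *_ = np.linalg.lstsq(A, c_obs, rcond=None)   # least-squares strength estimate
print(q_est)                                        # close to [2.0, 5.0]
```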

  2. Dynamic analysis of multiple nuclear-coupled boiling channels based on a multi-point reactor model

    International Nuclear Information System (INIS)

    Lee, J.D.; Pan Chin

    2005-01-01

    This work investigates the non-linear dynamics and stabilities of a multiple nuclear-coupled boiling channel system based on a multi-point reactor model using the Galerkin nodal approximation method. The nodal approximation method for the multiple boiling channels developed by Lee and Pan [Lee, J.D., Pan, C., 1999. Dynamics of multiple parallel boiling channel systems with forced flows. Nucl. Eng. Des. 192, 31-44] is extended to address the two-phase flow dynamics in the present study. The multi-point reactor model, modified from Uehiro et al. [Uehiro, M., Rao, Y.F., Fukuda, K., 1996. Linear stability analysis on instabilities of in-phase and out-of-phase modes in boiling water reactors. J. Nucl. Sci. Technol. 33, 628-635], is employed to study a multiple-channel system with unequal steady-state neutron density distribution. Stability maps, non-linear dynamics and effects of major parameters on the multiple nuclear-coupled boiling channel system subject to a constant total flow rate are examined. This study finds that the void-reactivity feedback and neutron interactions among subcores are coupled and their competing effects may influence the system stability under different operating conditions. For those cases with strong neutron interaction conditions, by strengthening the void-reactivity feedback, the nuclear-coupled effect on the non-linear dynamics may induce two unstable oscillation modes, the supercritical Hopf bifurcation and the subcritical Hopf bifurcation. Moreover, for those cases with weak neutron interactions, by quadrupling the void-reactivity feedback coefficient, period-doubling and complex chaotic oscillations may appear in a three-channel system under some specific operating conditions. A unique type of complex chaotic attractor may evolve from the Rossler attractor because of the coupled channel-to-channel thermal-hydraulic and subcore-to-subcore neutron interactions. Such a complex chaotic attractor has the imbedding dimension of 5 and the

  3. Modelling a real-world buried valley system with vertical non-stationarity using multiple-point statistics

    DEFF Research Database (Denmark)

    He, Xiulan; Sonnenborg, Torben; Jørgensen, Flemming

    2017-01-01

    Stationarity has traditionally been a requirement of geostatistical simulations. A common way to deal with non-stationarity is to divide the system into stationary sub-regions and subsequently merge the realizations for each region. Recently, the so-called partition approach that has the flexibility to model non-stationary systems directly was developed for multiple-point statistics simulation (MPS). The objective of this study is to apply the MPS partition method with conventional borehole logs and high-resolution airborne electromagnetic (AEM) data, for simulation of a real-world non-stationary geological system characterized by a network of connected buried valleys that incise deeply into layered Miocene sediments (case study in Denmark). The results show that, based on fragmented information of the formation boundaries, the MPS partition method is able to simulate a non-stationary system......

  4. Investigating lithological and geophysical relationships with applications to geological uncertainty analysis using Multiple-Point Statistical methods

    DEFF Research Database (Denmark)

    Barfod, Adrian

    The PhD thesis presents a new method for analyzing the relationship between resistivity and lithology, as well as a method for quantifying the hydrostratigraphic modeling uncertainty related to Multiple-Point Statistical (MPS) methods. Three-dimensional (3D) geological models are im...... is to improve analysis and research of the resistivity-lithology relationship and ensemble geological/hydrostratigraphic modeling. The groundwater mapping campaign in Denmark, beginning in the 1990’s, has resulted in the collection of large amounts of borehole and geophysical data. The data has been compiled...... in two publicly available databases, the JUPITER and GERDA databases, which contain borehole and geophysical data, respectively. The large amounts of available data provided a unique opportunity for studying the resistivity-lithology relationship. The method for analyzing the resistivity...

  5. The traveltime holographic principle

    Science.gov (United States)

    Huang, Yunsong; Schuster, Gerard T.

    2015-01-01

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the `traveltime holographic principle', by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.
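
    A compact way to see how interior times follow from exterior ones is through the traveltime triangle inequality implied by Fermat's principle: for any boundary source s, τsq ≤ τsp + τpq, with equality when the ray from s to q passes through p. Assuming such a stationary source exists on B for the pair (p, q), the interior time can be read off as the largest move-out between the two exterior traveltime fields (a sketch of the idea rather than the paper's full derivation):
    \[ \tau_{pq} \;=\; \max_{s \in B}\,\big(\tau_{sq} - \tau_{sp}\big), \]
    which requires no additional ray tracing inside the volume V.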

  6. Three-dimensional numerical study of heat transfer characteristics of plain plate fin-and-tube heat exchangers from view point of field synergy principle

    International Nuclear Information System (INIS)

    He, Y.L.; Tao, W.Q.; Song, F.Q.; Zhang, W.

    2005-01-01

    In this paper, 3-D numerical simulations were performed for the laminar heat transfer and fluid flow characteristics of a plate fin-and-tube heat exchanger. The effects of five factors were examined: Re number, fin pitch, tube row number, spanwise tube pitch and longitudinal tube pitch. The Reynolds number based on the tube diameter varied from 288 to 5000, the non-dimensional fin pitch based on the tube diameter varied from 0.04 to 0.5, the tube row number from 1 to 4, the spanwise tube pitch S1/d from 1.2 to 3, and the longitudinal tube pitch S2/d from 1.0 to 2.4. The numerical results were analyzed from the viewpoint of the field synergy principle, which says that the reduction of the intersection angle between velocity and fluid temperature gradient is the basic mechanism to enhance convective heat transfer. It is found that the effects of the five parameters on the heat transfer performance of the finned tube banks can be well described by the field synergy principle, i.e., the enhancement or deterioration of the convective heat transfer across the finned tube banks is inherently related to the variation of the intersection angle between the velocity and the fluid temperature gradient. It is also recommended that, to further enhance the convective heat transfer, enhancement techniques such as slotting the fin should be adopted mainly in the rear part of the fin, where the synergy between the local velocity and temperature gradient becomes worse.
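
    For reference, the field synergy principle invoked above is usually stated through the identity obtained by integrating the convective term of the energy equation over the flow domain (written here in its common textbook form as an aid to the reader, not as a quotation from the paper):
    \[ Q \;\propto\; \int_{\Omega} \rho\, c_p\, \mathbf{U}\cdot\nabla T \,\mathrm{d}V \;=\; \int_{\Omega} \rho\, c_p\, |\mathbf{U}|\,|\nabla T|\cos\theta \,\mathrm{d}V, \]
    so that, for given magnitudes of the velocity and the temperature gradient, convective heat transfer is enhanced as the local intersection (synergy) angle θ between U and ∇T is reduced.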

  7. First-principles study on oxidation effects in uranium oxides and high-pressure high-temperature behavior of point defects in uranium dioxide

    Science.gov (United States)

    Geng, Hua Y.; Song, Hong X.; Jin, K.; Xiang, S. K.; Wu, Q.

    2011-11-01

    Formation Gibbs free energies of point defects and oxygen clusters in uranium dioxide at high-pressure, high-temperature conditions are calculated from first principles, using the LSDA+U approach for the electronic structure and the Debye model for the lattice vibrations. The phonon contribution to Frenkel pairs is found to be notable, whereas it is negligible for the Schottky defect. Hydrostatic compression changes the formation energies drastically, making defect concentrations depend more sensitively on pressure. Calculations show that, if no oxygen clusters are considered, the uranium vacancy becomes predominant in overstoichiometric UO2 with the aid of the contribution from lattice vibrations, while compression favors oxygen defects and suppresses the uranium vacancy greatly. At ambient pressure, however, the experimental observation of predominant oxygen defects in this regime can be reproduced only in the form of cuboctahedral clusters, underlining the importance of defect clustering in UO2+x. Making use of the point defect model, an equation of state for nonstoichiometric oxides is established, which is then applied to describe the shock Hugoniot of UO2+x. Furthermore, the oxidation and compression behavior of uranium monoxide, triuranium octoxide, uranium trioxide, and a series of defective UO2 at 0 K are investigated. The evolution of mechanical properties and electronic structures with increasing degree of oxidation is analyzed, revealing the transition of the ground state of uranium oxides from metallic to Mott insulator and then to charge-transfer insulator due to the interplay of strongly correlated effects of 5f orbitals and the shift of electrons from uranium to oxygen atoms.
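
    The formation Gibbs free energies discussed here are, in the standard point-defect formalism (quoted in generic form; the paper's precise reference states and chemical potentials are not reproduced), differences of the type
    \[ G_f(X) \;=\; G_{\mathrm{def}}(X) - G_{\mathrm{perf}} - \sum_i n_i \mu_i, \]
    where G_def(X) and G_perf are the Gibbs free energies of the defective and perfect supercells (including the vibrational, Debye-model contribution), n_i is the number of atoms of species i added (n_i > 0) or removed (n_i < 0) to create defect X, and μ_i is the corresponding chemical potential; combinations of such terms yield the Frenkel-pair and Schottky formation energies whose pressure and temperature dependence is discussed above.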

  8. Multiple Positive Solutions of a Nonlinear Four-Point Singular Boundary Value Problem with a p-Laplacian Operator on Time Scales

    Directory of Open Access Journals (Sweden)

    Shihuang Hong

    2009-01-01

    Full Text Available We present sufficient conditions for the existence of at least twin or triple positive solutions of a nonlinear four-point singular boundary value problem with a p-Laplacian dynamic equation on a time scale. Our results are obtained via some new multiple fixed point theorems.

  9. Structural phases arising from reconstructive and isostructural transitions in high-melting-point oxides under hydrostatic pressure: A first-principles study

    Science.gov (United States)

    Tian, Hao; Kuang, Xiao-Yu; Mao, Ai-Jie; Yang, Yurong; Xu, Changsong; Sayedaghaee, S. Omid; Bellaiche, L.

    2018-01-01

    High-melting-point oxides of chemical formula ABO3 with A = Ca, Sr, Ba and B = Zr, Hf are investigated as a function of hydrostatic pressure up to 200 GPa by combining first-principles calculations with a particle swarm optimization method. Ca- and Sr-based systems: (1) first undergo a reconstructive phase transition from a perovskite state to a novel structure that belongs to the post-post-perovskite family and (2) then experience an isostructural transition to a second, also new post-post-perovskite state at higher pressures, via the sudden formation of a specific out-of-plane B-O bond. In contrast, the studied Ba compounds evolve from a perovskite phase to a third novel post-post-perovskite structure via another reconstructive phase transition. The original characteristics of these three different post-post-perovskite states are emphasized. Unusual electronic properties, including significant piezochromic effects and an insulator-metal transition, are also reported and explained.

  10. Electron interaction and spin effects in quantum wires, quantum dots and quantum point contacts: a first-principles mean-field approach

    International Nuclear Information System (INIS)

    Zozoulenko, I V; Ihnatsenka, S

    2008-01-01

    We have developed a mean-field first-principles approach for studying electronic and transport properties of low dimensional lateral structures in the integer quantum Hall regime. The electron interactions and spin effects are included within the spin density functional theory in the local density approximation where the conductance, the density, the effective potentials and the band structure are calculated on the basis of the Green's function technique. In this paper we present a systematic review of the major results obtained on the energetics, spin polarization, effective g factor, magnetosubband and edge state structure of split-gate and cleaved-edge overgrown quantum wires as well as on the conductance of quantum point contacts (QPCs) and open quantum dots. In particular, we discuss how the spin-resolved subband structure, the current densities, the confining potentials, as well as the spin polarization of the electron and current densities in quantum wires and antidots evolve when an applied magnetic field varies. We also discuss the role of the electron interaction and spin effects in the conductance of open systems focusing our attention on the 0.7 conductance anomaly in the QPCs. Special emphasis is given to the effect of the electron interaction on the conductance oscillations and their statistics in open quantum dots as well as to interpretation of the related experiments on the ultralow temperature saturation of the coherence time in open dots

  11. Using the Direct Sampling Multiple-Point Geostatistical Method for Filling Gaps in Landsat 7 ETM+ SLC-off Imagery

    KAUST Repository

    Yin, Gaohong

    2016-05-01

    Since the failure of the Scan Line Corrector (SLC) instrument on Landsat 7, observable gaps occur in the acquired Landsat 7 imagery, impacting the spatial continuity of the observed imagery. Due to the high geometric and radiometric accuracy provided by Landsat 7, a number of approaches have been proposed to fill the gaps. However, all proposed approaches have evident constraints for universal application. The main issues in gap-filling are an inability to describe continuity features such as meandering streams or roads, or to maintain the shape of small objects, when filling gaps in heterogeneous areas. The aim of the study is to validate the feasibility of using the Direct Sampling multiple-point geostatistical method, which has been shown to reconstruct complicated geological structures satisfactorily, to fill Landsat 7 gaps. The Direct Sampling method uses a conditional stochastic resampling of known locations within a target image to fill gaps and can generate multiple reconstructions for one simulation case. The Direct Sampling method was examined across a range of land cover types including deserts, sparse rural areas, dense farmlands, urban areas, braided rivers and coastal areas to demonstrate its capacity to recover gaps accurately for various land cover types. The prediction accuracy of the Direct Sampling method was also compared with other gap-filling approaches, which have been previously demonstrated to offer satisfactory results, under both homogeneous-area and heterogeneous-area situations. The results show that the Direct Sampling method provides sufficiently accurate predictions for a variety of land cover types, from homogeneous areas to heterogeneous land cover types. Likewise, it exhibits superior performance when used to fill gaps in heterogeneous land cover types without an input image, or with an input image that is temporally far from the target image, in comparison with other gap-filling approaches.
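
    At its core, the Direct Sampling method fills each gap pixel by scanning randomly over known pixels of the same (or a companion) image and copying the value whose surrounding pattern is close enough to the pattern around the gap. A much-simplified single-band sketch is given below; the window size, distance threshold and scan limit are illustrative parameters, and real implementations condition on multiple bands and temporally close input images:

```python
# Much-simplified Direct Sampling gap filling for a single-band 2-D image with
# NaN gaps. Pattern distance = mean absolute difference over a small window.
import numpy as np

def direct_sampling_fill(img, window=2, threshold=0.05, max_scan=500, seed=0):
    rng = np.random.default_rng(seed)
    out = img.copy()
    known = np.argwhere(~np.isnan(img))
    for gy, gx in np.argwhere(np.isnan(img)):
        best_val, best_d = np.nan, np.inf
        scan = known[rng.choice(len(known), size=min(max_scan, len(known)), replace=False)]
        for ky, kx in scan:
            d, n = 0.0, 0
            for dy in range(-window, window + 1):        # compare neighbourhood patterns
                for dx in range(-window, window + 1):
                    gyy, gxx, kyy, kxx = gy + dy, gx + dx, ky + dy, kx + dx
                    if (0 <= gyy < out.shape[0] and 0 <= gxx < out.shape[1]
                            and 0 <= kyy < img.shape[0] and 0 <= kxx < img.shape[1]
                            and not np.isnan(out[gyy, gxx]) and not np.isnan(img[kyy, kxx])):
                        d += abs(out[gyy, gxx] - img[kyy, kxx]); n += 1
            d = d / n if n else np.inf
            if d < best_d:
                best_val, best_d = img[ky, kx], d
            if best_d <= threshold:                      # accept the first good match
                break
        out[gy, gx] = best_val if not np.isnan(best_val) else np.nanmean(img)
    return out
```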

  12. Detection of uterine MMG contractions using a multiple change point estimator and the K-means cluster algorithm.

    Science.gov (United States)

    La Rosa, Patricio S; Nehorai, Arye; Eswaran, Hari; Lowery, Curtis L; Preissl, Hubert

    2008-02-01

    We propose a single channel two-stage time-segment discriminator of uterine magnetomyogram (MMG) contractions during pregnancy. We assume that the preprocessed signals are piecewise stationary, with distributions in a common family with a fixed number of parameters. Therefore, at the first stage, we propose a model-based segmentation procedure, which detects multiple change-points in the parameters of a piecewise constant time-varying autoregressive model using a robust formulation of the Schwarz information criterion (SIC) and a binary search approach. In particular, we propose a test statistic that depends on the SIC, derive its asymptotic distribution, and obtain closed-form optimal detection thresholds in the sense of the Neyman-Pearson criterion; therefore, we control the probability of false alarm and maximize the probability of change-point detection in each stage of the binary search algorithm. We compute and evaluate the relative energy variation [root mean squares (RMS)] and the dominant frequency component [first order zero crossing (FOZC)] in discriminating between time segments with and without contractions. The former consistently detects time segments with contractions. Thus, at the second stage, we apply a nonsupervised K-means cluster algorithm to classify the detected time segments using the RMS values. We apply our detection algorithm to real MMG records obtained from ten patients admitted to the hospital for contractions, with gestational ages between 31 and 40 weeks. We evaluate the performance of our detection algorithm by computing the detection and false-alarm rates, using the patients' feedback as a reference. We also analyze the fusion of the decision signals from all the sensors, as in the parallel distributed detection approach.
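
    A minimal sketch of the second (classification) stage described above, assuming the change-point indices from the first stage are already available: the RMS of each detected segment is computed and an unsupervised K-means split separates contraction segments from baseline ones. Function and variable names are illustrative; the SIC-based binary-search segmentation itself is not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

def classify_segments(signal, change_points):
    """Second-stage discrimination: compute the RMS of each detected time
    segment and split the segments into two clusters; the higher-energy
    cluster is taken to contain the contraction segments."""
    bounds = np.concatenate(([0], np.sort(change_points), [len(signal)]))
    starts, ends = bounds[:-1], bounds[1:]          # assumes valid, in-range indices
    rms = np.array([np.sqrt(np.mean(signal[a:b] ** 2))
                    for a, b in zip(starts, ends)]).reshape(-1, 1)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(rms)
    contraction_label = labels[int(np.argmax(rms))]
    return [(int(a), int(b), bool(lab == contraction_label))
            for a, b, lab in zip(starts, ends, labels)]

# Example with a synthetic piecewise signal: quiet / burst / quiet
rng = np.random.default_rng(0)
sig = np.concatenate([0.1 * rng.standard_normal(500),
                      1.0 * rng.standard_normal(300),
                      0.1 * rng.standard_normal(500)])
print(classify_segments(sig, change_points=[500, 800]))
```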

  13. Identification of multiple cracks in 2D elasticity by means of the reciprocity principle and cluster analysis

    Science.gov (United States)

    Shifrin, Efim I.; Kaptsov, Alexander V.

    2018-01-01

    An inverse 2D elastostatic problem is considered. It is assumed that an isotropic, linear elastic body can contain a finite number of rectilinear, well-separated cracks. The surfaces of the cracks are assumed to be free of loads. A method is developed for reconstructing the cracks from the applied loads and the displacements on the boundary of the body, obtained in a single static test. The method is based on the reciprocity principle, elements of the theory of distributions, and cluster analysis. Numerical examples are considered.

  14. Dissection of Biological Property of Chinese Acupuncture Point Zusanli Based on Long-Term Treatment via Modulating Multiple Metabolic Pathways

    Directory of Open Access Journals (Sweden)

    Guangli Yan

    2013-01-01

    Full Text Available Acupuncture has a history of over 3000 years and is a traditional Chinese medical therapy that uses hair-thin metal needles to puncture the skin at specific points on the body to promote wellbeing, yet its molecular mechanism and the biological pathways involved are still not clear. High-throughput metabolomics is the global assessment of endogenous metabolites within a biologic system and can potentially provide a more accurate snapshot of the actual physiological state. We hypothesize that acupuncture-treated humans would exhibit unique metabolic phenotypes. In this study, UPLC/ESI-HDMS coupled with pattern recognition methods and system analysis was carried out to investigate the mechanism and metabolite biomarkers of acupuncture treatment at the “Zusanli” acupoint (ST-36) as a case study. The top 5 canonical pathways, including alpha-linolenic acid metabolism, d-glutamine and d-glutamate metabolism, the citrate cycle, alanine, aspartate, and glutamate metabolism, and vitamin B6 metabolism, were acutely perturbed, and 53 differential metabolites were identified by chemical profiling that may be useful to clarify the physiological basis and mechanism of ST-36. More importantly, network construction has led to the integration of metabolites associated with the multiple perturbed pathways. Urine metabolic profiling might be a promising method to investigate the molecular mechanism of acupuncture.

  15. Dissection of Biological Property of Chinese Acupuncture Point Zusanli Based on Long-Term Treatment via Modulating Multiple Metabolic Pathways.

    Science.gov (United States)

    Yan, Guangli; Zhang, Aihua; Sun, Hui; Cheng, Weiping; Meng, Xiangcai; Liu, Li; Zhang, Yingzhi; Xie, Ning; Wang, Xijun

    2013-01-01

    Acupuncture has a history of over 3000 years and is a traditional Chinese medical therapy that uses hair-thin metal needles to puncture the skin at specific points on the body to promote wellbeing, yet its molecular mechanism and the biological pathways involved are still not clear. High-throughput metabolomics is the global assessment of endogenous metabolites within a biologic system and can potentially provide a more accurate snapshot of the actual physiological state. We hypothesize that acupuncture-treated humans would exhibit unique metabolic phenotypes. In this study, UPLC/ESI-HDMS coupled with pattern recognition methods and system analysis was carried out to investigate the mechanism and metabolite biomarkers of acupuncture treatment at the "Zusanli" acupoint (ST-36) as a case study. The top 5 canonical pathways, including alpha-linolenic acid metabolism, d-glutamine and d-glutamate metabolism, the citrate cycle, alanine, aspartate, and glutamate metabolism, and vitamin B6 metabolism, were acutely perturbed, and 53 differential metabolites were identified by chemical profiling that may be useful to clarify the physiological basis and mechanism of ST-36. More importantly, network construction has led to the integration of metabolites associated with the multiple perturbed pathways. Urine metabolic profiling might be a promising method to investigate the molecular mechanism of acupuncture.

  16. The traveltime holographic principle

    KAUST Repository

    Huang, Y.; Schuster, Gerard T.

    2014-01-01

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the ‘traveltime holographic principle’, by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.
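
    A toy numerical check of the idea in a constant-velocity medium, where traveltimes reduce to distance over velocity. The stationary relation τpq = max over s of |τsp − τsq|, taken over sources on an enclosing boundary, is used here as the interferometric estimate; this is an assumption made for the illustration and is not a restatement of the paper's algorithm.

```python
import numpy as np

# Exterior traveltimes from boundary sources s to interior points p and q,
# combined via the stationary relation tau_pq ~ max_s |tau_sp - tau_sq|
# (an illustrative assumption for this constant-velocity toy model).
v = 2.0                                                    # velocity (km/s), arbitrary
theta = np.linspace(0.0, 2.0 * np.pi, 3600, endpoint=False)
boundary = np.column_stack([10 * np.cos(theta), 10 * np.sin(theta)])   # circle B
p, q = np.array([1.0, -2.0]), np.array([-3.0, 4.0])        # interior points inside B

tau_sp = np.linalg.norm(boundary - p, axis=1) / v          # exterior traveltimes s -> p
tau_sq = np.linalg.norm(boundary - q, axis=1) / v          # exterior traveltimes s -> q
tau_pq_interferometric = np.max(np.abs(tau_sp - tau_sq))   # no ray tracing between p and q
tau_pq_direct = np.linalg.norm(p - q) / v                  # reference value

print(tau_pq_interferometric, tau_pq_direct)               # the two values nearly coincide
```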

  17. The traveltime holographic principle

    KAUST Repository

    Huang, Y.

    2014-11-06

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the ‘traveltime holographic principle’, by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.

  18. X-ray diffraction imaging with the Multiple Inverse Fan Beam topology: principles, performance and potential for security screening.

    Science.gov (United States)

    Harding, G; Fleckenstein, H; Kosciesza, D; Olesinski, S; Strecker, H; Theedt, T; Zienert, G

    2012-07-01

    The steadily increasing number of explosive threat classes, including home-made explosives (HMEs), liquids, amorphous and gels (LAGs), is forcing up the false-alarm rates of security screening equipment. This development can best be countered by increasing the number of features available for classification. X-ray diffraction intrinsically offers multiple features for the detection of both solid and LAG explosives, and is thus becoming increasingly important for false-alarm and cost reduction in both carry-on and checked baggage security screening. Following a brief introduction to X-ray diffraction imaging (XDI), which synthesizes in a single modality the image-forming and material-analysis capabilities of X-rays, the Multiple Inverse Fan Beam (MIFB) XDI topology is described. Physical relationships governing MIFB XDI components such as the radiation source, collimators and room-temperature detectors are presented, together with the experimental performance achieved. Representative X-ray diffraction profiles of threat substances measured with a laboratory MIFB XDI system are displayed. The performance of Next-Generation (MIFB) XDI relative to that of the 2nd Generation XRD 3500TM screener (Morpho Detection Germany GmbH) is assessed. The potential of MIFB XDI, both for reducing the exorbitant cost of false alarms in hold baggage screening (HBS) and for combining "in situ" liquid and solid explosive detection in carry-on luggage screening, is outlined. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. An efficient method for the prediction of deleterious multiple-point mutations in the secondary structure of RNAs using suboptimal folding solutions

    Directory of Open Access Journals (Sweden)

    Barash Danny

    2008-04-01

    Full Text Available Background: RNAmute is an interactive Java application which, given an RNA sequence, calculates the secondary structure of all single point mutations and organizes them into categories according to their similarity to the predicted structure of the wild type. The secondary structure predictions are performed using the Vienna RNA package. A more efficient implementation of RNAmute is needed, however, to extend from the case of single point mutations to the general case of multiple point mutations, which may often be desired for computational predictions alongside mutagenesis experiments. But analyzing multiple point mutations, a process that requires traversing all possible mutations, becomes highly expensive since the running time is O(n^m) for a sequence of length n with m-point mutations. Using Vienna's RNAsubopt, we present a method that selects only those mutations, based on stability considerations, which are likely to be conformationally rearranging. The approach is best examined using the dot plot representation for RNA secondary structure. Results: Using RNAsubopt, the suboptimal solutions for a given wild-type sequence are calculated once. Then, specific mutations are selected that are most likely to cause a conformational rearrangement. For an RNA sequence of about 100 nts and 3-point mutations (n = 100, m = 3), for example, the proposed method reduces the running time from several hours or even days to several minutes, thus enabling the practical application of RNAmute to the analysis of multiple-point mutations. Conclusion: A highly efficient addition to RNAmute that is as user friendly as the original application but that facilitates the practical analysis of multiple-point mutations is presented. Such an extension can now be exploited prior to site-directed mutagenesis experiments by virologists, for example, who investigate the change of function in an RNA virus via mutations that disrupt important motifs in its secondary structure.
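
    The O(n^m) cost quoted above reflects the combinatorial growth of the mutant space: an m-point mutant is obtained by choosing m positions and one of 3 alternative bases at each, giving C(n, m)·3^m candidates. A short calculation (illustrative only) makes the scale of the problem concrete.

```python
from math import comb

def num_mutants(n, m):
    """Number of distinct m-point mutants of an RNA sequence of length n
    (choose m positions, 3 alternative bases at each position)."""
    return comb(n, m) * 3 ** m

for m in (1, 2, 3):
    print(m, num_mutants(100, m))
# 1 -> 300, 2 -> 44,550, 3 -> 4,365,900: the m = 3 case already requires millions
# of secondary-structure predictions if every mutant is enumerated exhaustively.
```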

  20. Variation in type and frequency of diagnostic imaging during trauma care across multiple time points by patient insurance type

    International Nuclear Information System (INIS)

    Bell, Nathaniel; Repáraz, Laura; Fry, William R.; Smith, R. Stephen; Luis, Alejandro

    2016-01-01

    Research has shown that uninsured patients receive fewer radiographic studies during trauma care, but less is known as to whether differences in care are present among other insurance groups or across different time points during hospitalization. Our objective was to examine the number of radiographic studies administered to a cohort of trauma patients over the entire hospital stay as well as during the first 24 hours of care. Patient data were obtained from an American College of Surgeons (ACS) verified Level I Trauma Center between January 1, 2011 and December 31, 2012. We used negative binomial regression to construct relative risk (RR) ratios for type and frequency of radiographic imaging received among persons with Medicare, Medicaid, no insurance, or government insurance plans in reference to those with commercial indemnity plans. The analysis was adjusted for patient age, sex, race/ethnicity, injury severity score, injury mechanism, comorbidities, complications, hospital length of stay, and Intensive Care Unit (ICU) admission. A total of 3621 records from surviving patients aged ≥ 18 years were assessed. After adjustment for potential confounders, the expected number of radiographic studies decreased by 15 % among Medicare recipients (RR 0.85, 95 % CI 0.78–0.93), 11 % among Medicaid recipients (0.89, 0.81–0.99), 10 % among the uninsured (0.90, 0.85–0.96) and 19 % among government insurance groups (0.81, 0.72–0.90), compared with the reference group. This disparity was observed during the first 24 hours of care among patients with Medicare (0.78, 0.71–0.86) and government insurance plans (0.83, 0.74–0.94). Overall, there were no differences in the number of radiographic studies among the uninsured or among Medicaid patients during the first 24 hours of care compared with the reference group, but differences were observed among the uninsured in a sub-analysis of severely injured patients (ISS > 15). Both uninsured and insured patients treated at a
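
    A hedged sketch of the kind of model described above: a negative binomial GLM whose exponentiated coefficients are interpreted as adjusted rate ratios against a commercial-insurance reference group. The data frame, column names and covariate set below are hypothetical placeholders, not the study's registry variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical trauma-registry extract; columns and covariates are illustrative only.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "n_imaging": rng.poisson(6, n),                                 # count outcome
    "insurance": rng.choice(["commercial", "medicare", "medicaid",
                             "uninsured", "government"], n),
    "age": rng.uniform(18, 90, n),
    "iss": rng.integers(1, 40, n),
})

model = smf.glm(
    "n_imaging ~ C(insurance, Treatment('commercial')) + age + iss",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit()

rr = np.exp(model.params)        # adjusted rate ratios vs. the commercial group
ci = np.exp(model.conf_int())    # 95% confidence intervals on the same scale
print(pd.concat([rr, ci], axis=1))
```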

  1. Multiple-point statistical simulation for hydrogeological models: 3-D training image development and conditioning strategies

    Directory of Open Access Journals (Sweden)

    A.-S. Høyer

    2017-12-01

    Full Text Available Most studies on the application of geostatistical simulations based on multiple-point statistics (MPS) to hydrogeological modelling focus on relatively fine-scale models and concentrate on the estimation of facies-level structural uncertainty. Much less attention is paid to the use of input data and optimal construction of training images. For instance, even though the training image should capture a set of spatial geological characteristics to guide the simulations, the majority of the research still relies on 2-D or quasi-3-D training images. In the present study, we demonstrate a novel strategy for 3-D MPS modelling characterized by (i) realistic 3-D training images and (ii) an effective workflow for incorporating a diverse group of geological and geophysical data sets. The study covers an area of 2810 km² in the southern part of Denmark. MPS simulations are performed on a subset of the geological succession (the lower to middle Miocene sediments) which is characterized by relatively uniform structures and dominated by sand and clay. The simulated domain is large and each of the geostatistical realizations contains approximately 45 million voxels with size 100 m × 100 m × 5 m. Data used for the modelling include water well logs, high-resolution seismic data, and a previously published 3-D geological model. We apply a series of different strategies for the simulations based on data quality, and develop a novel method to effectively create observed spatial trends. The training image is constructed as a relatively small 3-D voxel model covering an area of 90 km². We use an iterative training image development strategy and find that even slight modifications in the training image create significant changes in simulations. Thus, this study shows how to include both the geological environment and the type and quality of input information in order to achieve optimal results from MPS modelling. We present a practical

  2. Collaboration between a human group and artificial intelligence can improve prediction of multiple sclerosis course: a proof-of-principle study.

    Science.gov (United States)

    Tacchella, Andrea; Romano, Silvia; Ferraldeschi, Michela; Salvetti, Marco; Zaccaria, Andrea; Crisanti, Andrea; Grassi, Francesca

    2017-01-01

    Background: Multiple sclerosis has an extremely variable natural course. In most patients, disease starts with a relapsing-remitting (RR) phase, which proceeds to a secondary progressive (SP) form. The duration of the RR phase is hard to predict, and to date predictions on the rate of disease progression remain suboptimal. This limits the opportunity to tailor therapy to an individual patient's prognosis, in spite of the choice of several therapeutic options. Approaches to improve clinical decisions, such as the collective intelligence of human groups and machine learning algorithms, are widely investigated. Methods: Medical students and a machine learning algorithm predicted the course of disease on the basis of randomly chosen clinical records of patients who attended the Multiple Sclerosis service of Sant'Andrea hospital in Rome. Results: A significant improvement in predictive ability was obtained when predictions were combined with a weight that depends on the consistency of human (or algorithm) forecasts on a given clinical record. Conclusions: In this work we present proof-of-principle that human-machine hybrid predictions yield better prognoses than machine learning algorithms or groups of humans alone. To strengthen this preliminary result, we propose a crowdsourcing initiative to collect prognoses by physicians on an expanded set of patients.
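
    One simple way to realize a human-machine hybrid forecast of the kind described above is to weight each source by how internally consistent it is on a given record; the scheme below is an illustrative sketch under that assumption, not the authors' exact weighting.

```python
import numpy as np

def hybrid_prediction(human_probs, machine_prob):
    """Combine individual human forecasts (probabilities that a patient will
    progress to the SP form within a fixed horizon) with a machine-learning
    probability. Humans are weighted by how much they agree with one another,
    the machine by how far it sits from a 0.5 'coin flip'."""
    human_probs = np.asarray(human_probs, dtype=float)
    human_mean = human_probs.mean()
    human_weight = max(1.0 - 2.0 * human_probs.std(), 0.0)    # agreement -> high weight
    machine_weight = max(2.0 * abs(machine_prob - 0.5), 0.0)  # confidence -> high weight
    if human_weight + machine_weight == 0:
        return 0.5                                            # nothing informative
    return (human_weight * human_mean + machine_weight * machine_prob) / (
        human_weight + machine_weight
    )

print(hybrid_prediction([0.8, 0.7, 0.75], 0.9))   # consistent, confident sources
```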

  3. Cosmological principles. II. Physical principles

    International Nuclear Information System (INIS)

    Harrison, E.R.

    1974-01-01

    The discussion of cosmological principles covers the uniformity principle of the laws of physics, the gravitation and cognizability principles, and the Dirac creation, chaos, and bootstrap principles. (U.S.)

  4. Multiscale Analysis of Effects of Additive and Multiplicative Noise on Delay Differential Equations near a Bifurcation Point

    International Nuclear Information System (INIS)

    Klosek, M.M.

    2004-01-01

    We study effects of noisy and deterministic perturbations on oscillatory solutions to delay differential equations. We develop the multiscale technique and derive amplitude equations for noisy oscillations near a critical delay. We investigate effects of additive and multiplicative noise. We show that if the magnitudes of noise and deterministic perturbations are balanced, then the oscillatory behavior persists for long times being sustained by the noise. We illustrate the technique and its results on linear and logistic delay equations. (author)
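
    A minimal sketch of the setting described above: Euler-Maruyama integration of a logistic delay equation with additive noise, with the delay chosen just below the deterministic Hopf point (rτ = π/2) so that oscillations decay without noise but are sustained when noise is present. Parameter values are illustrative.

```python
import numpy as np

# Euler-Maruyama integration of the stochastic delay logistic equation
#   dx(t) = r * x(t) * (1 - x(t - tau)) dt + sigma * dW(t),
# with r*tau slightly below the deterministic Hopf threshold pi/2.
rng = np.random.default_rng(1)
r, tau, sigma = 1.0, 1.5, 0.02          # illustrative parameters
dt, t_end = 0.01, 200.0
n, d = int(t_end / dt), int(tau / dt)

x = np.full(n, 1.05)                    # constant history near the fixed point x* = 1
for i in range(d, n - 1):
    drift = r * x[i] * (1.0 - x[i - d])
    x[i + 1] = x[i] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()

print(x[-2000:].std())                  # non-zero: noise-sustained oscillation
```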

  5. Two-surface Monte Carlo with basin hopping: quantum mechanical trajectory and multiple stationary points of water cluster.

    Science.gov (United States)

    Bandyopadhyay, Pradipta

    2008-04-07

    The efficiency of the two-surface Monte Carlo (TSMC) method depends on the closeness of the actual potential and the biasing potential used to propagate the system of interest. In this work, it is shown that by combining the basin hopping method with TSMC, the efficiency of the method can be increased severalfold. TSMC with basin hopping is used to generate quantum mechanical trajectories and a large number of stationary points of water clusters.
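
    The TSMC biasing itself is beyond a short sketch, but the basin-hopping ingredient can be illustrated directly with scipy.optimize.basinhopping. Below, a small Lennard-Jones cluster stands in for a water-cluster potential surface purely for illustration; the callback collects the distinct stationary-point energies visited.

```python
import numpy as np
from scipy.optimize import basinhopping

def lj_energy(flat_coords):
    """Total Lennard-Jones energy (reduced units) of a small atomic cluster,
    used here only as a cheap stand-in for a water-cluster potential surface."""
    x = flat_coords.reshape(-1, 3)
    e = 0.0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            r = np.linalg.norm(x[i] - x[j])
            e += 4.0 * (r ** -12 - r ** -6)
    return e

rng = np.random.default_rng(0)
x0 = rng.uniform(-1.0, 1.0, size=3 * 7)               # 7 atoms, random start

minima = []                                           # distinct local-minimum energies
def record(x, f, accepted):
    if all(abs(f - g) > 1e-6 for g in minima):
        minima.append(float(f))

result = basinhopping(lj_energy, x0, niter=100, stepsize=0.5,
                      minimizer_kwargs={"method": "L-BFGS-B"}, callback=record)
print(result.fun, sorted(minima)[:5])                 # best energy and a few minima
```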

  6. Screening the Medicines for Malaria Venture Pathogen Box across Multiple Pathogens Reclassifies Starting Points for Open-Source Drug Discovery.

    Science.gov (United States)

    Duffy, Sandra; Sykes, Melissa L; Jones, Amy J; Shelper, Todd B; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E; Avery, Vicky M

    2017-09-01

    Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection is comprised of compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. Copyright © 2017 Duffy et al.

  7. The local domain wall position in ferromagnetic thin wires: simultaneous measurement of resistive and transverse voltages at multiple points

    International Nuclear Information System (INIS)

    Hanada, R.; Sugawara, H.; Aoki, Y.; Sato, H.; Shigeto, K.; Shinjo, T.; Ono, T.; Miyajima, H.

    2002-01-01

    We have simultaneously measured the field dependences of voltages at multiple pairs of resistance and transverse voltage probes in ferromagnetic wires (with either magnetic or non-magnetic voltage probes). Both the resistive (through the giant magnetoresistance and anisotropic magnetoresistance) and transverse voltages (through the planar Hall effect) exhibit abrupt jumps, reflecting discrete motion of domain walls or rotations of magnetization. Voltage probes, even if non-magnetic, are found to affect the jump fields depending on the sample conditions. We demonstrate that the specific information on the domain (wall) motion along a thin ferromagnetic wire could be obtained from the jump fields. (author)

  8. Notice of Violation of IEEE Publication PrinciplesJoint Redundant Residue Number Systems and Module Isolation for Mitigating Single Event Multiple Bit Upsets in Datapath

    Science.gov (United States)

    Li, Lei; Hu, Jianhao

    2010-12-01

    Notice of Violation of IEEE Publication Principles: "Joint Redundant Residue Number Systems and Module Isolation for Mitigating Single Event Multiple Bit Upsets in Datapath" by Lei Li and Jianhao Hu, in the IEEE Transactions on Nuclear Science, vol. 57, no. 6, Dec. 2010, pp. 3779-3786. After careful and considered review of the content and authorship of this paper by a duly constituted expert committee, this paper has been found to be in violation of IEEE's Publication Principles. This paper contains substantial duplication of original text from the papers cited below. The original text was copied without attribution (including appropriate references to the original author(s) and/or paper title) and without permission. Due to the nature of this violation, reasonable effort should be made to remove all past references to this paper, and future references should be made to the following articles: "Multiple Error Detection and Correction Based on Redundant Residue Number Systems" by Vik Tor Goh and M. U. Siddiqi, in the IEEE Transactions on Communications, vol. 56, no. 3, March 2008, pp. 325-330; "A Coding Theory Approach to Error Control in Redundant Residue Number Systems. I: Theory and Single Error Correction" by H. Krishna, K-Y. Lin, and J-D. Sun, in the IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, vol. 39, no. 1, Jan 1992, pp. 8-17. In this paper, we propose a joint scheme which combines redundant residue number systems (RRNS) with module isolation (MI) for mitigating single event multiple bit upsets (SEMBUs) in datapath. The proposed hardening scheme employs redundant residues to improve the fault tolerance of the datapath and module spacings to guarantee that SEMBUs caused by charge sharing do not propagate among the operation channels of different moduli. The features of RRNS, such as independence, parallelism and error correction, are exploited to establish the radiation hardening architecture for the datapath in radiation environments. In the proposed

  9. Bayesian inference as a tool for analysis of first-principles calculations of complex materials: an application to the melting point of Ti2GaN

    International Nuclear Information System (INIS)

    Davis, Sergio; Gutiérrez, Gonzalo

    2013-01-01

    We present a systematic implementation of the recently developed Z-method for computing melting points of solids, augmented by a Bayesian analysis of the data obtained from molecular dynamics simulations. The use of Bayesian inference allows us to extract valuable information from limited data, reducing the computational cost of drawing the isochoric curve. From this Bayesian Z-method we obtain posterior distributions for the melting temperature Tm, the critical superheating temperature TLS and the slopes dT/dE of the liquid and solid phases. The method therefore gives full quantification of the errors in the prediction of the melting point. This procedure is applied to the estimation of the melting point of Ti2GaN (one of the so-called MAX phases), a complex, laminar material, by density functional theory molecular dynamics, finding an estimate Tm of 2591.61 ± 89.61 K, which is in good agreement with melting points of similar ceramics. (paper)

  10. Multiple critical points and liquid-liquid equilibria from the van der Waals like equations of state

    International Nuclear Information System (INIS)

    Artemenko, Sergey; Lozovsky, Taras; Mazur, Victor

    2008-01-01

    The principal aim of this work is a comprehensive analysis of the phase diagram of water via van der Waals-like equations of state (EoSs), which are considered as superpositions of repulsive and attractive forces. We test more extensively the modified van der Waals EoS (MVDW) proposed by Skibinski et al (2004 Phys. Rev. E 69 061206) and refine this model by introducing, instead of the classical van der Waals repulsive term, a very accurate hard sphere EoS over the entire stable and metastable regions (Liu 2006 Preprint cond-mat/0605392). It was detected that the simplest form of the MVDW EoS displays a complex phase behavior, including three critical points, and identifies four fluid phases (gas, low density liquid (LDL), high density liquid (HDL), and very high density liquid (VHDL)). Moreover the experimentally observed (Mallamace et al 2007 Proc. Natl Acad. Sci. USA 104 18387) anomalous behavior of the density of water in the deeply supercooled region (a density minimum) is reproduced by the MVDW EoS. An improvement of the repulsive part does not change the topological picture of the phase behavior of water in the wide range of thermodynamic variables. The new parameter sets for the second and third critical points are identified by a thorough analysis of experimental data for the loci of thermodynamic response function extrema.
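
    The MVDW EoS itself is not reproduced here, but the standard construction used to locate a critical point, solving ∂P/∂V = 0 and ∂²P/∂V² = 0 simultaneously, can be shown symbolically for the classical van der Waals equation; generalized EoSs are treated the same way numerically when searching for additional (liquid-liquid) critical points.

```python
import sympy as sp

# Critical point of the classical van der Waals EoS  P = R*T/(V - b) - a/V**2
# from the two conditions dP/dV = 0 and d2P/dV2 = 0.
V, T, a, b, R = sp.symbols("V T a b R", positive=True)
P = R * T / (V - b) - a / V**2

sol = sp.solve([sp.diff(P, V), sp.diff(P, V, 2)], [V, T], dict=True)[0]
Vc, Tc = sol[V], sol[T]
Pc = P.subs({V: Vc, T: Tc})

print(sp.simplify(Vc), sp.simplify(Tc), sp.simplify(Pc))
# -> 3*b, 8*a/(27*R*b), a/(27*b**2)
```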

  11. Point process-based modeling of multiple debris flow landslides using INLA: an application to the 2009 Messina disaster

    KAUST Repository

    Lombardo, Luigi

    2018-02-13

    We develop a stochastic modeling approach based on spatial point processes of log-Gaussian Cox type for a collection of around 5000 landslide events provoked by a precipitation trigger in Sicily, Italy. Through the embedding into a hierarchical Bayesian estimation framework, we can use the integrated nested Laplace approximation methodology to make inference and obtain the posterior estimates of spatially distributed covariate and random effects. Several mapping units are useful to partition a given study area in landslide prediction studies. These units hierarchically subdivide the geographic space from the highest grid-based resolution to the stronger morphodynamic-oriented slope units. Here we integrate both mapping units into a single hierarchical model, by treating the landslide triggering locations as a random point pattern. This approach diverges fundamentally from the unanimously used presence–absence structure for areal units since we focus on modeling the expected landslide count jointly within the two mapping units. Predicting this landslide intensity provides more detailed and complete information as compared to the classically used susceptibility mapping approach based on relative probabilities. To illustrate the model’s versatility, we compute absolute probability maps of landslide occurrences and check their predictive power over space. While the landslide community typically produces spatial predictive models for landslides only in the sense that covariates are spatially distributed, no actual spatial dependence has been explicitly integrated so far. Our novel approach features a spatial latent effect defined at the slope unit level, allowing us to assess the spatial influence that remains unexplained by the covariates in the model. For rainfall-induced landslides in regions where the raingauge network is not sufficient to capture the spatial distribution of the triggering precipitation event, this latent effect provides valuable imaging support

  12. Bernoulli's Principle

    Science.gov (United States)

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  13. Two-point versus multiple-point geostatistics: the ability of geostatistical methods to capture complex geobodies and their facies associations—an application to a channelized carbonate reservoir, southwest Iran

    International Nuclear Information System (INIS)

    Hashemi, Seyyedhossein; Javaherian, Abdolrahim; Ataee-pour, Majid; Khoshdel, Hossein

    2014-01-01

    Facies models try to explain facies architectures which have a primary control on the subsurface heterogeneities and the fluid flow characteristics of a given reservoir. In the process of facies modeling, geostatistical methods are implemented to integrate different sources of data into a consistent model. The facies models should describe facies interactions and the shape and geometry of the geobodies as they occur in reality. Two distinct categories of geostatistical techniques are two-point and multiple-point (geo)statistics (MPS). In this study, both of the aforementioned categories were applied to generate facies models. A sequential indicator simulation (SIS) and a truncated Gaussian simulation (TGS) represented the two-point geostatistical methods, and a single normal equation simulation (SNESIM) was selected as the MPS representative. The dataset from an extremely channelized carbonate reservoir located in southwest Iran was applied to these algorithms to analyze their performance in reproducing complex curvilinear geobodies. The SNESIM algorithm needs consistent training images (TI) in which all possible facies architectures that are present in the area are included. The TI model was founded on data acquired from modern occurrences. These analogues delivered vital information about the possible channel geometries and facies classes that are typically present in those similar environments. The MPS results were conditioned to both soft and hard data. Soft facies probabilities were acquired from a neural network workflow. In this workflow, seismic-derived attributes were implemented as the input data. Furthermore, MPS realizations were conditioned to hard data to guarantee the exact positioning and continuity of the channel bodies. A geobody extraction workflow was implemented to extract the most certain parts of the channel bodies from the seismic data. These extracted parts of the channel bodies were applied to the simulation workflow as hard data

  14. More Than Just Accuracy: A Novel Method to Incorporate Multiple Test Attributes in Evaluating Diagnostic Tests Including Point of Care Tests.

    Science.gov (United States)

    Thompson, Matthew; Weigl, Bernhard; Fitzpatrick, Annette; Ide, Nicole

    2016-01-01

    Current frameworks for evaluating diagnostic tests are constrained by a focus on diagnostic accuracy, and assume that all aspects of the testing process and test attributes are discrete and equally important. Determining the balance between the benefits and harms associated with new or existing tests has been overlooked. Yet, this is critically important information for stakeholders involved in developing, testing, and implementing tests. This is particularly important for point of care tests (POCTs) where tradeoffs exist between numerous aspects of the testing process and test attributes. We developed a new model that multiple stakeholders (e.g., clinicians, patients, researchers, test developers, industry, regulators, and health care funders) can use to visualize the multiple attributes of tests, the interactions that occur between these attributes, and their impacts on health outcomes. We use multiple examples to illustrate interactions between test attributes (test availability, test experience, and test results) and outcomes, including several POCTs. The model could be used to prioritize research and development efforts, and inform regulatory submissions for new diagnostics. It could potentially provide a way to incorporate the relative weights that various subgroups or clinical settings might place on different test attributes. Our model provides a novel way that multiple stakeholders can use to visualize test attributes, their interactions, and impacts on individual and population outcomes. We anticipate that this will facilitate more informed decision making around diagnostic tests.

  15. An Improved Quantum-Behaved Particle Swarm Optimization Method for Economic Dispatch Problems with Multiple Fuel Options and Valve-Points Effects

    Directory of Open Access Journals (Sweden)

    Hong-Yun Zhang

    2012-09-01

    Full Text Available Quantum-behaved particle swarm optimization (QPSO) is an efficient and powerful population-based optimization technique, which is inspired by the conventional particle swarm optimization (PSO) and quantum mechanics theories. In this paper, an improved QPSO named SQPSO is proposed, which combines QPSO with a selective probability operator to solve the economic dispatch (ED) problems with valve-point effects and multiple fuel options. To show the performance of the proposed SQPSO, it is tested on five standard benchmark functions and two ED benchmark problems, including a 40-unit ED problem with valve-point effects and a 10-unit ED problem with multiple fuel options. The results are compared with differential evolution (DE), particle swarm optimization (PSO) and basic QPSO, as well as a number of other methods reported in the literature, in terms of solution quality, convergence speed and robustness. The simulation results confirm that the proposed SQPSO is effective and reliable for both function optimization and ED problems.
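
    A bare-bones QPSO sketch (without the selective probability operator that defines SQPSO), minimizing a sphere benchmark. The update follows the standard contraction-expansion form, with particles sampled around local attractors built from personal and global bests; all parameter values are illustrative.

```python
import numpy as np

def qpso(objective, dim, n_particles=40, iters=500, lower=-10.0, upper=10.0, seed=0):
    """Basic quantum-behaved PSO: particles are drawn around local attractors
    with a contraction-expansion coefficient beta decreasing from 1.0 to 0.5."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, (n_particles, dim))
    pbest = x.copy()
    pbest_val = np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()

    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters                      # contraction-expansion
        mbest = pbest.mean(axis=0)                        # mean best position
        phi = rng.random((n_particles, dim))
        attractor = phi * pbest + (1.0 - phi) * gbest     # local attractors
        u = rng.random((n_particles, dim)) + 1e-12        # avoid log(1/0)
        sign = np.where(rng.random((n_particles, dim)) < 0.5, 1.0, -1.0)
        x = attractor + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        x = np.clip(x, lower, upper)

        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best_x, best_f = qpso(lambda v: np.sum(v ** 2), dim=10)   # sphere benchmark
print(best_f)
```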

  16. A point-wise fiber Bragg grating displacement sensing system and its application for active vibration suppression of a smart cantilever beam subjected to multiple impact loadings

    International Nuclear Information System (INIS)

    Chuang, Kuo-Chih; Ma, Chien-Ching; Liao, Heng-Tseng

    2012-01-01

    In this work, active vibration suppression of a smart cantilever beam subjected to disturbances from multiple impact loadings is investigated with a point-wise fiber Bragg grating (FBG) displacement sensing system. An FBG demodulator is employed in the proposed fiber sensing system to dynamically demodulate the responses obtained by the FBG displacement sensor with high sensitivity. To investigate the ability of the proposed FBG displacement sensor as a feedback sensor, velocity feedback control and delay control are employed to suppress the vibrations of the first three bending modes of the smart cantilever beam. To improve the control performance for the first bending mode when the cantilever beam is subjected to an impact loading, we improve the conventional velocity feedback controller by tuning the control gain online with the aid of information from a higher vibration mode. Finally, active control of vibrations induced by multiple impact loadings due to a plastic ball is performed with the improved velocity feedback control. The experimental results show that active vibration control of smart structures subjected to disturbances such as impact loadings can be achieved by employing the proposed FBG sensing system to feed back out-of-plane point-wise displacement responses with high sensitivity. (paper)
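
    A minimal single-mode illustration of the velocity feedback idea described above: the first bending mode is modelled as a mass-spring-damper, and the feedback force -g·x' raises the effective damping. The FBG sensing chain, actuator dynamics and online gain tuning are not modelled, and all parameters are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Single-mode model: m*x'' + c*x' + k*x = u, with velocity feedback u = -g*x'.
# Increasing the gain g raises the effective damping and shortens the ring-down
# that follows an impact (modelled as a unit initial velocity).
m, c, k = 0.1, 0.02, 250.0                        # modal mass, damping, stiffness

def ring_down(gain):
    def rhs(t, y):
        x, v = y
        u = -gain * v                             # velocity feedback control force
        return [v, (-c * v - k * x + u) / m]
    sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 1.0], max_step=1e-3)
    return sol.t, sol.y[0]

for g in (0.0, 0.5, 2.0):
    t, x = ring_down(g)
    print(g, np.max(np.abs(x[t > 1.0])))          # residual vibration after 1 s
```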

  17. Point defects in the 1T' and 2H phases of single-layer MoS2: A comparative first-principles study

    Science.gov (United States)

    Pizzochero, Michele; Yazyev, Oleg V.

    2017-12-01

    The metastable 1T' phase of layered transition metal dichalcogenides has recently attracted considerable interest due to its electronic properties, possible topological phases, and catalytic activity. We report a comprehensive theoretical investigation of intrinsic point defects in the 1T' crystalline phase of single-layer molybdenum disulfide (1T'-MoS2) and provide a comparison to the well-studied semiconducting 2H phase. Based on density functional theory calculations, we explore a large number of configurations of vacancy, adatom, and antisite defects and analyze their atomic structure, thermodynamic stability, and electronic and magnetic properties. The emerging picture suggests that, under thermodynamic equilibrium, 1T'-MoS2 is more prone to hosting lattice imperfections than the 2H phase. More specifically, our findings reveal that the S atoms that are closer to the Mo atomic plane are the most reactive sites. Similarly to the 2H phase, S vacancies and adatoms in 1T'-MoS2 are very likely to occur, while Mo adatoms and antisites induce local magnetic moments. Contrary to the 2H phase, Mo vacancies in 1T'-MoS2 are expected to be an abundant defect due to structural relaxation, which plays a major role in lowering the defect formation energy. Overall, our study predicts that the realization of high-quality flakes of 1T'-MoS2 requires very careful laboratory conditions, but at the same time the facile introduction of defects can be exploited to tailor the physical and chemical properties of this polymorph.
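
    The thermodynamic stability discussed above is usually quantified through defect formation energies. The helper below shows the standard bookkeeping for a neutral vacancy, E_f = E_defect − E_pristine + n·μ for n removed atoms; the numbers are placeholders for illustration, not the paper's DFT results.

```python
def vacancy_formation_energy(e_defect, e_pristine, n_removed, mu_removed):
    """Neutral-defect formation energy: E_f = E_defect - E_pristine + n * mu,
    where n atoms with chemical potential mu were removed to create the defect."""
    return e_defect - e_pristine + n_removed * mu_removed

# Placeholder numbers (eV) purely for illustration -- not DFT results.
e_supercell_pristine = -1000.00
e_supercell_with_S_vacancy = -994.20
mu_S = -4.20                       # S chemical potential under the chosen conditions
print(vacancy_formation_energy(e_supercell_with_S_vacancy,
                               e_supercell_pristine, 1, mu_S))   # -> 1.60 eV
```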

  18. Multiple active myofascial trigger points and pressure pain sensitivity maps in the temporalis muscle are related in women with chronic tension type headache.

    Science.gov (United States)

    Fernández-de-las-Peñas, César; Caminero, Ana B; Madeleine, Pascal; Guillem-Mesado, Amparo; Ge, Hong-You; Arendt-Nielsen, Lars; Pareja, Juan A

    2009-01-01

    To describe the common locations of active trigger points (TrPs) in the temporalis muscle and their referred pain patterns in chronic tension type headache (CTTH), and to determine if pressure sensitivity maps of this muscle can be used to describe the spatial distribution of active TrPs. Forty women with CTTH were included. An electronic pressure algometer was used to assess pressure pain thresholds (PPT) at 9 points over each temporalis muscle: 3 points each in the anterior, medial and posterior parts. Both muscles were examined for the presence of active TrPs over each of the 9 points. The referred pain pattern of each active TrP was assessed. Two-way analysis of variance detected significant differences in mean PPT levels between the measurement points (F=30.3; P<0.001), but not between sides (F=2.1; P=0.2). PPT scores decreased from the posterior to the anterior column (P<0.001). No differences were found in the number of active TrPs (F=0.3; P=0.9) between the dominant and the nondominant sides. Significant differences were found in the distribution of the active TrPs (χ2=12.2; P<0.001): active TrPs were mostly found in the anterior column and in the middle of the muscle belly. The analysis of variance did not detect significant differences in the referred pain pattern between active TrPs (F=1.1, P=0.4). The topographical pressure pain sensitivity maps showed the distinct distribution of the TrPs, indicated by locations with low PPTs. Multiple active TrPs in the temporalis muscle were found, particularly in the anterior column and in the middle of the muscle belly. A bilateral posterior-to-anterior decrease of PPTs in the temporalis muscle in women with CTTH was found. The locations of active TrPs in the temporalis muscle corresponded well to the muscle areas with lower PPT, supporting the relationship between multiple active muscle TrPs and topographical pressure sensitivity maps in the temporalis muscle in women with CTTH.

  19. A comparison of multiple indicator kriging and area-to-point Poisson kriging for mapping patterns of herbivore species abundance in Kruger National Park, South Africa.

    Science.gov (United States)

    Kerry, Ruth; Goovaerts, Pierre; Smit, Izak P J; Ingram, Ben R

    Kruger National Park (KNP), South Africa, provides protected habitats for the unique animals of the African savannah. For the past 40 years, annual aerial surveys of herbivores have been conducted to aid management decisions based on (1) the spatial distribution of species throughout the park and (2) total species populations in a year. The surveys are extremely time consuming and costly. For many years, the whole park was surveyed, but in 1998 a transect survey approach was adopted. This is cheaper and less time consuming but leaves gaps in the data spatially. Also the distance method currently employed by the park only gives estimates of total species populations but not their spatial distribution. We compare the ability of multiple indicator kriging and area-to-point Poisson kriging to accurately map species distribution in the park. A leave-one-out cross-validation approach indicates that multiple indicator kriging makes poor estimates of the number of animals, particularly the few large counts, as the indicator variograms for such high thresholds are pure nugget. Poisson kriging was applied to the prediction of two types of abundance data: spatial density and proportion of a given species. Both Poisson approaches had standardized mean absolute errors (St. MAEs) of animal counts at least an order of magnitude lower than multiple indicator kriging. The spatial density, Poisson approach (1), gave the lowest St. MAEs for the most abundant species and the proportion, Poisson approach (2), did for the least abundant species. Incorporating environmental data into Poisson approach (2) further reduced St. MAEs.

  20. Variational principles in physics

    CERN Document Server

    Basdevant, Jean-Louis

    2007-01-01

    Optimization under constraints is an essential part of everyday life. Indeed, we routinely solve problems by striking a balance between contradictory interests, individual desires and material contingencies. This notion of equilibrium was dear to thinkers of the enlightenment, as illustrated by Montesquieu’s famous formulation: "In all magistracies, the greatness of the power must be compensated by the brevity of the duration." Astonishingly, natural laws are guided by a similar principle. Variational principles have proven to be surprisingly fertile. For example, Fermat used variational methods to demonstrate that light follows the fastest route from one point to another, an idea which came to be known as Fermat’s principle, a cornerstone of geometrical optics. Variational Principles in Physics explains variational principles and charts their use throughout modern physics. The heart of the book is devoted to the analytical mechanics of Lagrange and Hamilton, the basic tools of any physicist. Prof. Basdev...

  1. Time-lapse analysis of methane quantity in Mary Lee group of coal seams using filter-based multiple-point geostatistical simulation

    Science.gov (United States)

    Karacan, C. Özgen; Olea, Ricardo A.

    2013-01-01

    Coal seam degasification and its success are important for controlling methane, and thus for the health and safety of coal miners. During the course of degasification, properties of coal seams change. Thus, the changes in coal reservoir conditions and in-place gas content as well as methane emission potential into mines should be evaluated by examining time-dependent changes and the presence of major heterogeneities and geological discontinuities in the field. In this work, time-lapsed reservoir and fluid storage properties of the New Castle coal seam, Mary Lee/Blue Creek seam, and Jagger seam of Black Warrior Basin, Alabama, were determined from gas and water production history matching and production forecasting of vertical degasification wellbores. These properties were combined with isotherm and other important data to compute gas-in-place (GIP) and its change with time at borehole locations. Time-lapsed training images (TIs) of GIP and GIP difference corresponding to each coal and date were generated by using these point-wise data and Voronoi decomposition on the TI grid, which included faults as discontinuities for expansion of Voronoi regions. Filter-based multiple-point geostatistical simulations, which were preferred in this study due to anisotropies and discontinuities in the area, were used to predict time-lapsed GIP distributions within the study area. Performed simulations were used for mapping spatial time-lapsed methane quantities as well as their uncertainties within the study area.

  2. Multiple Time-Point 68Ga-PSMA I&T PET/CT for Characterization of Primary Prostate Cancer: Value of Early Dynamic and Delayed Imaging.

    Science.gov (United States)

    Schmuck, Sebastian; Mamach, Martin; Wilke, Florian; von Klot, Christoph A; Henkenberens, Christoph; Thackeray, James T; Sohns, Jan M; Geworski, Lilli; Ross, Tobias L; Wester, Hans-Juergen; Christiansen, Hans; Bengel, Frank M; Derlin, Thorsten

    2017-06-01

    The aims of this study were to gain mechanistic insights into prostate cancer biology using dynamic imaging and to evaluate the usefulness of multiple time-point 68Ga-prostate-specific membrane antigen (PSMA) I&T PET/CT for the assessment of primary prostate cancer before prostatectomy. Twenty patients with prostate cancer underwent 68Ga-PSMA I&T PET/CT before prostatectomy. The PET protocol consisted of early dynamic pelvic imaging, followed by static scans at 60 and 180 minutes postinjection (p.i.). SUVs, time-activity curves, quantitative analysis based on a 2-tissue compartment model, Patlak analysis, histopathology, and Gleason grading were compared between prostate cancer and benign prostate gland. Primary tumors were identified on both early dynamic and delayed imaging in 95% of patients. Tracer uptake was significantly higher in prostate cancer compared with benign prostate tissue at any time point (P ≤ 0.0003) and increased over time. Consequently, the tumor-to-nontumor ratio within the prostate gland improved over time (2.8 at 10 minutes vs 17.1 at 180 minutes p.i.). Tracer uptake at both 60 and 180 minutes p.i. was significantly higher in patients with higher Gleason scores. Primary prostate cancer can be visualized on both early dynamic and static delayed 68Ga-PSMA ligand PET images. The tumor-to-nontumor ratio in the prostate gland improves over time, supporting a role of delayed imaging for optimal visualization of prostate cancer.

  3. Disaster damage detection through synergistic use of deep learning and 3D point cloud features derived from very high resolution oblique aerial images, and multiple-kernel-learning

    Science.gov (United States)

    Vetrivel, Anand; Gerke, Markus; Kerle, Norman; Nex, Francesco; Vosselman, George

    2018-06-01

    Oblique aerial images offer views of both building roofs and façades, and thus have been recognized as a potential source to detect severe building damages caused by destructive disaster events such as earthquakes. Therefore, they represent an important source of information for first responders or other stakeholders involved in the post-disaster response process. Several automated methods based on supervised learning have already been demonstrated for damage detection using oblique airborne images. However, they often do not generalize well when data from new unseen sites need to be processed, hampering their practical use. Reasons for this limitation include image and scene characteristics, though the most prominent one relates to the image features being used for training the classifier. Recently features based on deep learning approaches, such as convolutional neural networks (CNNs), have been shown to be more effective than conventional hand-crafted features, and have become the state-of-the-art in many domains, including remote sensing. Moreover, often oblique images are captured with high block overlap, facilitating the generation of dense 3D point clouds - an ideal source to derive geometric characteristics. We hypothesized that the use of CNN features, either independently or in combination with 3D point cloud features, would yield improved performance in damage detection. To this end we used CNN and 3D features, both independently and in combination, using images from manned and unmanned aerial platforms over several geographic locations that vary significantly in terms of image and scene characteristics. A multiple-kernel-learning framework, an effective way for integrating features from different modalities, was used for combining the two sets of features for classification. The results are encouraging: while CNN features produced an average classification accuracy of about 91%, the integration of 3D point cloud features led to an additional
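
    A minimal sketch of combining the two feature modalities through kernels: an RBF kernel on CNN features and one on 3D point-cloud features are summed with fixed weights and fed to an SVM with a precomputed kernel. Full multiple-kernel learning also optimizes the weights, which is not shown here; the feature matrices and labels below are random stand-ins.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

def combined_kernel(Xa_cnn, Xb_cnn, Xa_3d, Xb_3d, w=0.5, gamma_cnn=1e-3, gamma_3d=1e-1):
    """Weighted sum of an RBF kernel on CNN features and one on 3D features.
    A convex combination of valid kernels is itself a valid kernel."""
    return (w * rbf_kernel(Xa_cnn, Xb_cnn, gamma=gamma_cnn)
            + (1.0 - w) * rbf_kernel(Xa_3d, Xb_3d, gamma=gamma_3d))

# Assumed inputs: per-patch CNN features, 3D point-cloud features, and
# damaged / undamaged labels; random data stands in for them here.
rng = np.random.default_rng(0)
X_cnn, X_3d = rng.normal(size=(200, 512)), rng.normal(size=(200, 30))
y = rng.integers(0, 2, 200)
tr, te = np.arange(150), np.arange(150, 200)

K_train = combined_kernel(X_cnn[tr], X_cnn[tr], X_3d[tr], X_3d[tr])
K_test = combined_kernel(X_cnn[te], X_cnn[tr], X_3d[te], X_3d[tr])

clf = SVC(kernel="precomputed").fit(K_train, y[tr])
print((clf.predict(K_test) == y[te]).mean())   # accuracy (chance level here, random data)
```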

  4. Research on an uplink carrier sense multiple access algorithm of large indoor visible light communication networks based on an optical hard core point process.

    Science.gov (United States)

    Nan, Zhufen; Chi, Xuefen

    2016-12-20

    The IEEE 802.15.7 protocol suggests that it could coordinate the channel access process based on the competitive method of carrier sensing. However, the directionality of light and randomness of diffuse reflection would give rise to a serious imperfect carrier sense (ICS) problem [e.g., hidden node (HN) problem and exposed node (EN) problem], which brings great challenges in realizing the optical carrier sense multiple access (CSMA) mechanism. In this paper, the carrier sense process implemented by diffuse reflection light is modeled as the choice of independent sets. We establish an ICS model with the presence of ENs and HNs for the multi-point to multi-point visible light communication (VLC) uplink communications system. Considering the severe optical ICS problem, an optical hard core point process (OHCPP) is developed, which characterizes the optical CSMA for the indoor VLC uplink communications system. Due to the limited coverage of the transmitted optical signal, in our OHCPP, the ENs within the transmitters' carrier sense region could be retained provided that they could not corrupt the ongoing communications. Moreover, because of the directionality of both light emitting diode (LED) transmitters and receivers, theoretical analysis of the HN problem becomes difficult. In this paper, we derive the closed-form expression for approximating the outage probability and transmission capacity of VLC networks with the presence of HNs and ENs. Simulation results validate the analysis and also show the existence of an optimal physical carrier-sensing threshold that maximizes the transmission capacity for a given emission angle of LED.
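
    A generic hard-core point process can be illustrated with type-II Matérn thinning: starting from a Poisson pattern of contending transmitters, a point is retained only if no rival within the hard-core radius carries a smaller random mark. This is a simplified stand-in for the optical HCPP of the paper, with illustrative parameters.

```python
import numpy as np

def matern_type2(lambda_parent, r_hard, area=1.0, seed=0):
    """Type-II Matern hard-core thinning: start from a Poisson point process,
    give every point an independent mark, and keep a point only if no other
    point within the hard-core radius r_hard carries a smaller mark."""
    rng = np.random.default_rng(seed)
    n = rng.poisson(lambda_parent * area)
    pts = rng.uniform(0.0, np.sqrt(area), size=(n, 2))
    marks = rng.random(n)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        d = np.hypot(*(pts - pts[i]).T)
        rivals = (d < r_hard) & (d > 0.0)
        if np.any(marks[rivals] < marks[i]):
            keep[i] = False
    return pts[keep]

active = matern_type2(lambda_parent=200, r_hard=0.08)
print(len(active))     # transmitters retained after the carrier-sense thinning
```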

  5. Lattice Boltzmann Simulations of Fluid Flow in Continental Carbonate Reservoir Rocks and in Upscaled Rock Models Generated with Multiple-Point Geostatistics

    Directory of Open Access Journals (Sweden)

    J. Soete

    2017-01-01

    Full Text Available Microcomputed tomography (μCT) and Lattice Boltzmann Method (LBM) simulations were applied to continental carbonates to quantify fluid flow. Fluid flow characteristics in these complex carbonates with multiscale pore networks are unique, and the applied method allows studying their heterogeneity and anisotropy. 3D pore network models were introduced to single-phase flow simulations in Palabos, a software tool for particle-based modelling of classic computational fluid dynamics. In addition, permeability simulations were also performed on rock models generated with multiple-point geostatistics (MPS). This allowed assessing the applicability of MPS in upscaling high-resolution porosity patterns into large rock models that exceed the volume limitations of the μCT. Porosity and tortuosity control fluid flow in these porous media. Micro- and mesopores influence flow properties at larger scales in continental carbonates. Upscaling with MPS is therefore necessary to overcome volume-resolution problems of CT scanning equipment. The presented LBM-MPS workflow is applicable to other lithologies, comprising different pore types, shapes, and pore networks altogether. The lack of straightforward porosity-permeability relationships in complex carbonates highlights the necessity for a 3D approach. 3D fluid flow studies provide the best understanding of flow through porous media, which is of crucial importance in reservoir modelling.
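
    The LBM simulations themselves are well beyond a short sketch, but the porosity-permeability question raised above can be illustrated with the classical Carman-Kozeny relation for a packed bed of spheres, in which tortuosity is lumped into an empirical constant; this is a rough stand-in for illustration, not the paper's method.

```python
def kozeny_carman_permeability(porosity, grain_diameter_m):
    """Carman-Kozeny estimate for a packed bed of spheres:
    k = d^2 * phi^3 / (180 * (1 - phi)^2), with tortuosity lumped into the
    empirical constant 180. Returns permeability in m^2."""
    phi = porosity
    return grain_diameter_m ** 2 * phi ** 3 / (180.0 * (1.0 - phi) ** 2)

for phi in (0.10, 0.20, 0.30):
    k = kozeny_carman_permeability(phi, 250e-6)      # 250-micron grains, illustrative
    print(phi, k, k / 9.869e-13)                     # permeability in m^2 and darcies
```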

  6. Application of multiple-point geostatistics to simulate the effect of small-scale aquifer heterogeneity on the efficiency of aquifer thermal energy storage

    Science.gov (United States)

    Possemiers, Mathias; Huysmans, Marijke; Batelaan, Okke

    2015-08-01

    Adequate aquifer characterization and simulation using heat transport models are indispensable for determining the optimal design for aquifer thermal energy storage (ATES) systems and wells. Recent model studies indicate that meter-scale heterogeneities in the hydraulic conductivity field introduce a considerable uncertainty in the distribution of thermal energy around an ATES system and can lead to a reduction in the thermal recoverability. In a study site in Bierbeek, Belgium, the influence of centimeter-scale clay drapes on the efficiency of a doublet ATES system and the distribution of the thermal energy around the ATES wells are quantified. Multiple-point geostatistical simulation of edge properties is used to incorporate the clay drapes in the models. The results show that clay drapes have an influence both on the distribution of thermal energy in the subsurface and on the efficiency of the ATES system. The distribution of the thermal energy is determined by the strike of the clay drapes, with the major axis of anisotropy parallel to the clay drape strike. The clay drapes have a negative impact (3.3-3.6 %) on the energy output in the models without a hydraulic gradient. In the models with a hydraulic gradient, however, the presence of clay drapes has a positive influence (1.6-10.2 %) on the energy output of the ATES system. It is concluded that it is important to incorporate small-scale heterogeneities in heat transport models to get a better estimate of ATES efficiency and the distribution of thermal energy.

  7. Application of multiple-point geostatistics to simulate the effect of small scale aquifer heterogeneity on the efficiency of Aquifer Thermal Energy Storage (ATES)

    Science.gov (United States)

    Possemiers, Mathias; Huysmans, Marijke; Batelaan, Okke

    2015-04-01

    Adequate aquifer characterization and simulation using heat transport models are indispensible for determining the optimal design for Aquifer Thermal Energy Storage (ATES) systems and wells. Recent model studies indicate that meter scale heterogeneities in the hydraulic conductivity field introduce a considerable uncertainty in the distribution of thermal energy around an ATES system and can lead to a reduction in the thermal recoverability. In this paper, the influence of centimeter scale clay drapes on the efficiency of a doublet ATES system and the distribution of the thermal energy around the ATES wells are quantified. Multiple-point geostatistical simulation of edge properties is used to incorporate the clay drapes in the models. The results show that clay drapes have an influence both on the distribution of thermal energy in the subsurface and on the efficiency of the ATES system. The distribution of the thermal energy is determined by the strike of the clay drapes, with the major axis of anisotropy parallel to the clay drape strike. The clay drapes have a negative impact (3.3 - 3.6%) on the energy output in the models without a hydraulic gradient. In the models with a hydraulic gradient, however, the presence of clay drapes has a positive influence (1.6 - 10.2%) on the energy output of the ATES system. It is concluded that it is important to incorporate small scale heterogeneities in heat transport models to get a better estimate on ATES efficiency and distribution of thermal energy.

  8. Teaching Structure-Property Relationships: Investigating Molecular Structure and Boiling Point

    Science.gov (United States)

    Murphy, Peter M.

    2007-01-01

    A concise, well-organized table of the boiling points of 392 organic compounds has facilitated inquiry-based instruction in multiple scientific principles. Many individual or group learning activities can be derived from the tabulated data of molecular structure and boiling point based on the instructor's education objectives and the students'…
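    One simple classroom exercise suggested by such a table is to plot boiling point against molar mass within a homologous series and fit a trend. The sketch below does this for the n-alkanes using approximate literature boiling points (values rounded; they should be checked against a handbook before quoting), which is enough to start a discussion of dispersion forces growing with chain length.

```python
import numpy as np

# Approximate normal boiling points of n-alkanes C1-C8 (degC), rounded
names = ["methane", "ethane", "propane", "butane",
         "pentane", "hexane", "heptane", "octane"]
molar_mass = np.array([16.04, 30.07, 44.10, 58.12, 72.15, 86.18, 100.2, 114.2])
bp_celsius = np.array([-161.5, -88.6, -42.1, -0.5, 36.1, 68.7, 98.4, 125.6])

# Least-squares linear trend (a crude model, but a useful starting point)
slope, intercept = np.polyfit(molar_mass, bp_celsius, 1)

for name, m, bp in zip(names, molar_mass, bp_celsius):
    predicted = slope * m + intercept
    print(f"{name:8s}  M={m:6.1f} g/mol  bp={bp:7.1f} C  linear fit={predicted:7.1f} C")

print(f"\nslope = {slope:.2f} C per g/mol, intercept = {intercept:.1f} C")
```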

  9. The uncertainty principle

    International Nuclear Information System (INIS)

    Martens, Hans.

    1991-01-01

    The subject of this thesis is the uncertainty principle (UP). The UP is one of the most characteristic points of difference between quantum and classical mechanics. The starting point of this thesis is the work of Niels Bohr; besides being discussed, this work is also analyzed. For the discussion of the different aspects of the UP the formalism of Davies and Ludwig is used instead of the more commonly used formalism of Neumann and Dirac. (author). 214 refs.; 23 figs

  10. A survey of variational principles

    International Nuclear Information System (INIS)

    Lewins, J.D.

    1993-01-01

    In this article a survey of variational principles is given. Variational principles play a significant role in mathematical theory, with emphasis on the physical aspects. There are two principal uses: to represent the equations of the system in a succinct way, and to enable a particular computation in the system to be carried out with greater accuracy. The survey of variational principles has ranged widely from its starting point in the Lagrange multiplier to optimisation principles. In an age of digital computation, these classic methods can be adapted to improve such calculations. We emphasize particularly the advantage of basing finite element methods on variational principles. (A.B.)

  11. A survey of variational principles

    International Nuclear Information System (INIS)

    Lewins, J.D.

    1993-01-01

    The survey of variational principles has ranged widely from its starting point in the Lagrange multiplier to optimisation principles. In an age of digital computation, these classic methods can be adapted to improve such calculations. We emphasize particularly the advantage of basing finite element methods on variational principles, especially if, as maximum and minimum principles, these can provide bounds and hence estimates of accuracy. The non-symmetric (and hence stationary rather than extremum principles) are seen however to play a significant role in optimisation theory. (Orig./A.B.)

  12. Impact of Genomics Platform and Statistical Filtering on Transcriptional Benchmark Doses (BMD) and Multiple Approaches for Selection of Chemical Point of Departure (PoD).

    Directory of Open Access Journals (Sweden)

    A Francina Webster

    Full Text Available Many regulatory agencies are exploring ways to integrate toxicogenomic data into their chemical risk assessments. The major challenge lies in determining how to distill the complex data produced by high-content, multi-dose gene expression studies into quantitative information. It has been proposed that benchmark dose (BMD) values derived from toxicogenomics data be used as point of departure (PoD) values in chemical risk assessments. However, there is limited information regarding which genomics platforms are most suitable and how to select appropriate PoD values. In this study, we compared BMD values modeled from RNA sequencing-, microarray-, and qPCR-derived gene expression data from a single study, and explored multiple approaches for selecting a single PoD from these data. The strategies evaluated include several that do not require prior mechanistic knowledge of the compound for selection of the PoD, thus providing approaches for assessing data-poor chemicals. We used RNA extracted from the livers of female mice exposed to non-carcinogenic (0, 2 mg/kg/day; mkd) and carcinogenic (4, 8 mkd) doses of furan for 21 days. We show that transcriptional BMD values were consistent across technologies and highly predictive of the two-year cancer bioassay-based PoD. We also demonstrate that filtering data based on statistically significant changes in gene expression prior to BMD modeling creates more conservative BMD values. Taken together, this case study on mice exposed to furan demonstrates that high-content toxicogenomics studies produce robust data for BMD modelling that are minimally affected by inter-technology variability and highly predictive of cancer-based PoD doses.
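    The BMD modelling software used in such studies is not shown here; as a minimal, hypothetical illustration of the underlying idea, the sketch below fits a Hill dose-response model to made-up expression data for one gene and solves for the dose producing a chosen benchmark response (here a 10% change from the modelled control level). All data and parameter values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit, brentq

def hill(dose, bottom, top, ec50, n):
    """Four-parameter Hill dose-response model."""
    return bottom + (top - bottom) * dose**n / (ec50**n + dose**n)

# Made-up dose-response data (fold change of one gene); a real analysis would
# loop over the normalised expression matrix
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
response = np.array([1.00, 1.02, 1.10, 1.35, 1.80, 2.05])

p0 = [1.0, 2.0, 2.0, 1.5]                       # initial guesses
params, _ = curve_fit(hill, dose, response, p0=p0,
                      bounds=([0.1, 0.1, 1e-3, 0.1], [10, 10, 100, 10]))
bottom, top, ec50, n = params

# Benchmark response: 10% change relative to the modelled control level
bmr = 1.10 * bottom
bmd = brentq(lambda d: hill(d, *params) - bmr, 1e-6, dose.max())

print(f"fitted Hill parameters: bottom={bottom:.2f}, top={top:.2f}, EC50={ec50:.2f}, n={n:.2f}")
print(f"BMD (10% change)      = {bmd:.2f} mg/kg/day")
```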

  13. Mechanical engineering principles

    CERN Document Server

    Bird, John

    2014-01-01

    A student-friendly introduction to core engineering topics. This book introduces mechanical principles and technology through examples and applications, enabling students to develop a sound understanding of both engineering principles and their use in practice. These theoretical concepts are supported by 400 fully worked problems, 700 further problems with answers, and 300 multiple-choice questions, all of which add up to give the reader a firm grounding on each topic. The new edition is up to date with the latest BTEC National specifications and can also be used on undergraduate courses in mecha

  14. Time-Lapse Analysis of Methane Quantity in the Mary Lee Group of Coal Seams Using Filter-Based Multiple-Point Geostatistical Simulation.

    Science.gov (United States)

    Karacan, C Özgen; Olea, Ricardo A

    2013-08-01

    Coal seam degasification and its success are important for controlling methane, and thus for the health and safety of coal miners. During the course of degasification, properties of coal seams change. Thus, the changes in coal reservoir conditions and in-place gas content as well as methane emission potential into mines should be evaluated by examining time-dependent changes and the presence of major heterogeneities and geological discontinuities in the field. In this work, time-lapsed reservoir and fluid storage properties of the New Castle coal seam, Mary Lee/Blue Creek seam, and Jagger seam of Black Warrior Basin, Alabama, were determined from gas and water production history matching and production forecasting of vertical degasification wellbores. These properties were combined with isotherm and other important data to compute gas-in-place (GIP) and its change with time at borehole locations. Time-lapsed training images (TIs) of GIP and GIP difference corresponding to each coal and date were generated by using these point-wise data and Voronoi decomposition on the TI grid, which included faults as discontinuities for expansion of Voronoi regions. Filter-based multiple-point geostatistical simulations, which were preferred in this study due to anisotropies and discontinuities in the area, were used to predict time-lapsed GIP distributions within the study area. Performed simulations were used for mapping spatial time-lapsed methane quantities as well as their uncertainties within the study area. The systematic approach presented in this paper is the first time in literature that history matching, TIs of GIPs and filter simulations are used for degasification performance evaluation and for assessing GIP for mining safety. Results from this study showed that using production history matching of coalbed methane wells to determine time-lapsed reservoir data could be used to compute spatial GIP and representative GIP TIs generated through Voronoi decomposition
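    The filter-based MPS simulations described above require specialised geostatistics software; the sketch below only illustrates the Voronoi-decomposition step, assigning each training-image grid cell the GIP value of its nearest borehole. Borehole coordinates and values are hypothetical, and the fault-limited expansion of Voronoi regions mentioned in the abstract is omitted.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)

# Hypothetical borehole locations (x, y in metres) and point-wise GIP values
boreholes = rng.uniform(0, 5000, size=(25, 2))
gip_values = rng.uniform(0.5, 3.0, size=25)     # arbitrary units, illustrative

# Training-image grid (cell centres)
nx, ny, cell = 100, 100, 50.0
gx, gy = np.meshgrid((np.arange(nx) + 0.5) * cell,
                     (np.arange(ny) + 0.5) * cell, indexing="ij")
grid_xy = np.column_stack([gx.ravel(), gy.ravel()])

# Voronoi assignment = nearest borehole in Euclidean distance
tree = cKDTree(boreholes)
_, nearest = tree.query(grid_xy)
gip_ti = gip_values[nearest].reshape(nx, ny)

print("training image shape:", gip_ti.shape)
print("mean GIP on the grid:", round(float(gip_ti.mean()), 3))
```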

  15. Fermat principles in general relativity and the existence of light rays on Lorentzian manifolds

    International Nuclear Information System (INIS)

    Fortunato, D.; Masiello, A.

    1995-01-01

    In this paper we review some results on the existence and multiplicity of null geodesics (light rays) joining a point with a timelike curve on a Lorentzian manifold. Moreover, a Morse Theory for such geodesics is presented. A variational principle, which is a variant of the classical Fermat principle in optics, allows one to characterize the null geodesics joining a point with a timelike curve as the critical points of a functional on an infinite-dimensional manifold. Global variational methods are used to obtain the existence results and the Morse Theory. Such results cover a class of Lorentzian manifolds including the Schwarzschild, Reissner-Nordstroem and Kerr space-times. (author)

  16. Consolidated principles for screening based on a systematic review and consensus process.

    Science.gov (United States)

    Dobrow, Mark J; Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda

    2018-04-09

    In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner's seminal publication, and to conduct a Delphi consensus process to assess the review results. We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner's 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. Wilson and Jungner's principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles. Ultimately, this review and consensus process provides a

  17. Consolidated principles for screening based on a systematic review and consensus process

    Science.gov (United States)

    Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda

    2018-01-01

    BACKGROUND: In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner’s seminal publication, and to conduct a Delphi consensus process to assess the review results. METHODS: We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. RESULTS: We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner’s 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. INTERPRETATION: Wilson and Jungner’s principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles

  18. Variational principles

    CERN Document Server

    Moiseiwitsch, B L

    2004-01-01

    This graduate-level text's primary objective is to demonstrate the expression of the equations of the various branches of mathematical physics in the succinct and elegant form of variational principles (and thereby illuminate their interrelationship). Its related intentions are to show how variational principles may be employed to determine the discrete eigenvalues for stationary state problems and to illustrate how to find the values of quantities (such as the phase shifts) that arise in the theory of scattering. Chapter-by-chapter treatment consists of analytical dynamics; optics, wave mecha

  19. Electrical principles 3 checkbook

    CERN Document Server

    Bird, J O

    2013-01-01

    Electrical Principles 3 Checkbook aims to introduce students to the basic electrical principles needed by technicians in electrical engineering, electronics, and telecommunications.The book first tackles circuit theorems, single-phase series A.C. circuits, and single-phase parallel A.C. circuits. Discussions focus on worked problems on parallel A.C. circuits, worked problems on series A.C. circuits, main points concerned with D.C. circuit analysis, worked problems on circuit theorems, and further problems on circuit theorems. The manuscript then examines three-phase systems and D.C. transients

  20. Development of a NIR-based blend uniformity method for a drug product containing multiple structurally similar actives by using the quality by design principles.

    Science.gov (United States)

    Lin, Yiqing; Li, Weiyong; Xu, Jin; Boulas, Pierre

    2015-07-05

    The aim of this study is to develop an at-line near infrared (NIR) method for the rapid and simultaneous determination of four structurally similar active pharmaceutical ingredients (APIs) in powder blends intended for the manufacturing of tablets. Two of the four APIs in the formula are present in relatively small amounts, one at 0.95% and the other at 0.57%. Such small amounts in addition to the similarity in structures add significant complexity to the blend uniformity analysis. The NIR method is developed using spectra from six laboratory-created calibration samples augmented by a small set of spectra from a large-scale blending sample. Applying the quality by design (QbD) principles, the calibration design included concentration variations of the four APIs and a main excipient, microcrystalline cellulose. A bench-top FT-NIR instrument was used to acquire the spectra. The obtained NIR spectra were analyzed by applying principal component analysis (PCA) before calibration model development. Score patterns from the PCA were analyzed to reveal relationship between latent variables and concentration variations of the APIs. In calibration model development, both PLS-1 and PLS-2 models were created and evaluated for their effectiveness in predicting API concentrations in the blending samples. The final NIR method shows satisfactory specificity and accuracy. Copyright © 2015 Elsevier B.V. All rights reserved.
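    The calibration spectra of the study are of course not available here; the sketch below only shows the general shape of such a PLS-2 calibration in scikit-learn, using synthetic "spectra" built as noisy linear mixtures of four component signals. Sample counts, wavelength grid, band positions and noise level are arbitrary assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

n_samples, n_wavelengths, n_apis = 60, 200, 4
wl = np.linspace(0, 1, n_wavelengths)

# Synthetic pure-component "spectra" (Gaussian bands at different positions)
pure = np.stack([np.exp(-((wl - c) / 0.08) ** 2) for c in (0.2, 0.4, 0.6, 0.8)])

# API concentrations (% w/w) and mixture spectra with additive noise
Y = rng.uniform(0.2, 5.0, size=(n_samples, n_apis))
X = Y @ pure + 0.01 * rng.standard_normal((n_samples, n_wavelengths))

X_train, X_test, y_train, y_test = train_test_split(X, Y, random_state=0)

# PLS-2: one model predicting all four APIs simultaneously
pls = PLSRegression(n_components=6)
pls.fit(X_train, y_train)

rmsep = np.sqrt(((pls.predict(X_test) - y_test) ** 2).mean(axis=0))
print("RMSEP per API (% w/w):", rmsep.round(3))
```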

  1. Gyro precession and Mach's principle

    International Nuclear Information System (INIS)

    Eby, P.

    1979-01-01

    The precession of a gyroscope is calculated in a nonrelativistic theory due to Barbour which satisfies Mach's principle. It is shown that the theory predicts both the geodetic and motional precession of general relativity to within factors of order 1. The significance of the gyro experiment is discussed from the point of view of metric theories of gravity and this is contrasted with its significance from the point of view of Mach's principle. (author)

  2. Safety Principles

    Directory of Open Access Journals (Sweden)

    V. A. Grinenko

    2011-06-01

    Full Text Available The material in this article is arranged so that the reader can form a complete picture of the concept of “safety”, its intrinsic characteristics, and the possibilities for its formalization. Principles and possible safety strategies are considered. The article is intended for experts working on problems of safety.

  3. Maquet principle

    Energy Technology Data Exchange (ETDEWEB)

    Levine, R.B.; Stassi, J.; Karasick, D.

    1985-04-01

    Anterior displacement of the tibial tubercle is a well-accepted orthopedic procedure in the treatment of certain patellofemoral disorders. The radiologic appearance of surgical procedures utilizing the Maquet principle has not been described in the radiologic literature. Familiarity with the physiologic and biomechanical basis for the procedure and its postoperative appearance is necessary for appropriate roentgenographic evaluation and the radiographic recognition of complications.

  4. Cosmological principle

    International Nuclear Information System (INIS)

    Wesson, P.S.

    1979-01-01

    The Cosmological Principle states: the universe looks the same to all observers regardless of where they are located. To most astronomers today the Cosmological Principle means the universe looks the same to all observers because the density of the galaxies is the same in all places. A new Cosmological Principle is proposed. It is called the Dimensional Cosmological Principle. It uses the properties of matter in the universe: density (ρ), pressure (p), and mass (m) within some region of space of length (l). The laws of physics require incorporation of constants for gravity (G) and the speed of light (c). After combining the six parameters into dimensionless numbers, the best choices are: 8πGl²ρ/c², 8πGl²ρ/c⁴, and 2Gm/(c²l) (the Schwarzschild factor). The Dimensional Cosmological Principle came about because old ideas conflicted with the rapidly-growing body of observational evidence indicating that galaxies in the universe have a clumpy rather than uniform distribution.

  5. How Many Principles for Public Health Ethics?

    OpenAIRE

    Coughlin, Steven S.

    2008-01-01

    General moral (ethical) principles play a prominent role in certain methods of moral reasoning and ethical decision-making in bioethics and public health. Examples include the principles of respect for autonomy, beneficence, nonmaleficence, and justice. Some accounts of ethics in public health have pointed to additional principles related to social and environmental concerns, such as the precautionary principle and principles of solidarity or social cohesion. This article provides an overview...

  6. The wavelength frame multiplication chopper system for the ESS test beamline at the BER II reactor—A concept study of a fundamental ESS instrument principle

    International Nuclear Information System (INIS)

    Strobl, M.; Bulat, M.; Habicht, K.

    2013-01-01

    Contributing to the design update phase of the European Spallation Source (ESS), scheduled to start operation in 2019, a test beamline is under construction at the BER II research reactor at Helmholtz Zentrum Berlin (HZB). This beamline offers experimental test capabilities of instrument concepts viable for the ESS. The experiments envisaged at this dedicated beamline comprise testing of components as well as of novel experimental approaches and methods taking advantage of the long pulse characteristic of the ESS source. Therefore the test beamline will be equipped with a sophisticated chopper system that provides the specific time structure of the ESS and enables variable wavelength resolutions via wavelength frame multiplication (WFM), a fundamental instrument concept beneficial for a number of instruments at ESS. We describe the unique chopper system developed for these purposes, which allows constant wavelength resolution for a wide wavelength band. Furthermore we discuss the implications for the conceptual design of related instrumentation at the ESS.
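    As a back-of-envelope companion to the chopper description, the sketch below uses the standard neutron time-of-flight relation λ ≈ 3956·t/L (λ in Å, t in s, L in m) to show how, for a long source pulse, the relative wavelength resolution Δλ/λ of an unchopped beam degrades towards short wavelengths, which is the motivation for pulse shaping by wavelength frame multiplication. The pulse length, repetition rate and flight path are nominal assumed values, not the actual test-beamline geometry.

```python
import numpy as np

H_OVER_M = 3956.0       # h / m_n in units of (m * Angstrom) / s
pulse_length = 2.86e-3  # long-pulse duration [s] (nominal ESS-like value)
frequency = 14.0        # source repetition rate [Hz]
flight_path = 30.0      # moderator-to-detector distance [m] (assumed)

# Wavelength band that fits into one source period without frame overlap
band = H_OVER_M / (frequency * flight_path)
print(f"usable wavelength band: {band:.2f} Angstrom")

# Without pulse shaping, the burst-time uncertainty is the full pulse length,
# giving a constant wavelength uncertainty and a lambda-dependent resolution.
d_lambda = H_OVER_M * pulse_length / flight_path
for lam in (2.0, 5.0, 10.0):
    print(f"lambda = {lam:4.1f} A  ->  d(lambda)/lambda = {d_lambda / lam:.3f}")
```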

  7. A Principle of Intentionality.

    Science.gov (United States)

    Turner, Charles K

    2017-01-01

    The mainstream theories and models of the physical sciences, including neuroscience, are all consistent with the principle of causality. Wholly causal explanations make sense of how things go, but are inherently value-neutral, providing no objective basis for true beliefs being better than false beliefs, nor for it being better to intend wisely than foolishly. Dennett (1987) makes a related point in calling the brain a syntactic (procedure-based) engine. He says that you cannot get to a semantic (meaning-based) engine from there. He suggests that folk psychology revolves around an intentional stance that is independent of the causal theories of the brain, and accounts for constructs such as meanings, agency, true belief, and wise desire. Dennett proposes that the intentional stance is so powerful that it can be developed into a valid intentional theory. This article expands Dennett's model into a principle of intentionality that revolves around the construct of objective wisdom. This principle provides a structure that can account for all mental processes, and for the scientific understanding of objective value. It is suggested that science can develop a far more complete worldview with a combination of the principles of causality and intentionality than would be possible with scientific theories that are consistent with the principle of causality alone.

  8. Multiple mononeuropathy

    Science.gov (United States)

    People with multiple mononeuropathy are prone to new nerve injuries at pressure points such as the knees and elbows. They should avoid putting pressure on these areas, for example, by not leaning on the elbows, crossing the knees, ...

  9. Zymography Principles.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2017-01-01

    Zymography, the detection, identification, and even quantification of enzyme activity fractionated by gel electrophoresis, has received increasing attention in the last years, as revealed by the number of articles published. A number of enzymes are routinely detected by zymography, especially with clinical interest. This introductory chapter reviews the major principles behind zymography. New advances of this method are basically focused towards two-dimensional zymography and transfer zymography as will be explained in the rest of the chapters. Some general considerations when performing the experiments are outlined as well as the major troubleshooting and safety issues necessary for correct development of the electrophoresis.

  10. Basic principles

    International Nuclear Information System (INIS)

    Wilson, P.D.

    1996-01-01

    Some basic explanations are given of the principles underlying the nuclear fuel cycle, starting with the physics of atomic and nuclear structure and continuing with nuclear energy and reactors, fuel and waste management and finally a discussion of economics and the future. An important aspect of the fuel cycle concerns the possibility of "closing the back end", i.e. reprocessing the waste or unused fuel in order to re-use it in reactors of various kinds. The alternative, the "once-through" cycle, discards the discharged fuel completely. An interim measure involves the prolonged storage of highly radioactive waste fuel. (UK)

  11. Collaboration between a human group and artificial intelligence can improve prediction of multiple sclerosis course: a proof-of-principle study [version 1; referees: 1 approved, 2 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Andrea Tacchella

    2017-12-01

    Full Text Available Background: Multiple sclerosis has an extremely variable natural course. In most patients, disease starts with a relapsing-remitting (RR) phase, which proceeds to a secondary progressive (SP) form. The duration of the RR phase is hard to predict, and to date predictions on the rate of disease progression remain suboptimal. This limits the opportunity to tailor therapy on an individual patient's prognosis, in spite of the choice of several therapeutic options. Approaches to improve clinical decisions, such as collective intelligence of human groups and machine learning algorithms, are widely investigated. Methods: Medical students and a machine learning algorithm predicted the course of disease on the basis of randomly chosen clinical records of patients that attended the Multiple Sclerosis service of Sant'Andrea hospital in Rome. Results: A significant improvement of predictive ability was obtained when predictions were combined with a weight that depends on the consistency of human (or algorithm) forecasts on a given clinical record. Conclusions: In this work we present proof-of-principle that human-machine hybrid predictions yield better prognoses than machine learning algorithms or groups of humans alone. To strengthen this preliminary result, we propose a crowdsourcing initiative to collect prognoses by physicians on an expanded set of patients.
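    The weighting scheme used in the study is only described qualitatively above; as a hypothetical illustration of the general idea, the sketch below combines binary forecasts from several human raters and one algorithm, leaning on the group vote where the raters are consistent and on the algorithm where they are split. All forecasts are randomly generated placeholders.

```python
import numpy as np

rng = np.random.default_rng(4)

n_records, n_humans = 50, 10

# Hypothetical binary forecasts (1 = predicted conversion to secondary progressive MS)
human_votes = rng.integers(0, 2, size=(n_records, n_humans))
algo_prob = rng.uniform(0, 1, size=n_records)         # algorithm's probability

# Consistency weight: how far the human majority is from a 50/50 split
human_mean = human_votes.mean(axis=1)
consistency = np.abs(human_mean - 0.5) * 2.0           # 0 = split, 1 = unanimous

# Hybrid prediction: weight the crowd by its consistency on each record
hybrid_prob = consistency * human_mean + (1.0 - consistency) * algo_prob
hybrid_pred = (hybrid_prob >= 0.5).astype(int)

print("first 10 hybrid predictions:", hybrid_pred[:10])
```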

  12. The project of documentary space 'ExploRe' Opened pluri-disciplinary exploration of reversibility: multiple-point of view access to exploratory works of Andra on reversibility

    International Nuclear Information System (INIS)

    Cahier, Jean-Pierre; Desfriches, Orelie; Zacklad, Manuel

    2009-01-01

    The authors present a digital space (a web site - 'ExploRe') which would allow a community to share a set of pluri-disciplinary information items concerning reversibility, and in which the community members describe the items by using attributes and themes belonging to different points of view.

  13. Note to Budget Cutters: The Arts Are Good Business--Multiple Studies Point to Arts Education as an Important Economic Engine

    Science.gov (United States)

    Olson, Catherine Applefeld

    2009-01-01

    They say desperate times call for desperate measures. But in this time of economic uncertainty, the desperate cutting of budgets for arts funding and, by extension, all types of arts education, including music, is not prudent. That is the consensus of several national and local studies, which converge on a single point--that the arts actually can…

  14. A min-max variational principle

    International Nuclear Information System (INIS)

    Georgiev, P.G.

    1995-11-01

    In this paper a variational principle for min-max problems is proved that is of the same spirit as Deville-Godefroy-Zizler's variational principle for minimization problems. A localization theorem, in which the min-max points for the perturbed function with respect to a given ε-min-max point are localized, is presented. 3 refs.

  15. Quantum principles in field interactions

    International Nuclear Information System (INIS)

    Shirkov, D.V.

    1986-01-01

    The concept of a quantum principle is introduced as a principle whose formulation is based on specific quantum ideas and notions. We consider three such principles, viz. those of quantizability, local gauge symmetry, and supersymmetry, and their role in the development of quantum field theory (QFT). Concerning the first of these, we analyze the formal aspects and physical contents of the renormalization procedure in QFT and its relation to ultraviolet divergences and the renorm group. The quantizability principle is formulated as an existence condition of a self-consistent quantum version with a given mechanism of the field interaction. It is shown that the consecutive (from a historical point of view) use of these quantum principles puts ever stronger limitations on possible forms of field interactions.

  16. Applications of multiple change point detections to monthly streamflow and rainfall in Xijiang River in southern China, part II: trend and mean

    Science.gov (United States)

    Chen, Yongqin David; Jiang, Jianmin; Zhu, Yuxiang; Huang, Changxing; Zhang, Qiang

    2018-05-01

    This article, as part II, illustrates applications of two further algorithms, i.e., the scanning F test for change points in trend and the scanning t test for change points in mean, to both the normalized streamflow index (NSI) series at the Makou section of the Xijiang River and the normalized precipitation index (NPI) series over the Xijiang River watershed. The results from these two tests show mainly positive coherency of changes between the NSI and NPI. However, some minor patches of negative coherency may reflect impacts of human activities, although these were often associated with nearly normal climate periods. This suggests that runoff still depends closely on precipitation in the Xijiang catchment: anthropogenic disturbances have not yet, on the whole, disrupted the natural relationship in this river.
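    The exact scanning algorithms of the paper are not reproduced here; the sketch below implements a simple moving two-sample t statistic over a synthetic series, which is the basic ingredient of a scanning t test for change points in mean. The window length, threshold-free peak picking, and the synthetic mean shift are arbitrary choices for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic monthly index with a mean shift at t = 240
x = np.concatenate([rng.normal(0.0, 1.0, 240), rng.normal(0.8, 1.0, 240)])

window = 60          # sub-series length on each side of the candidate point
t_stat = np.full(x.size, np.nan)

for i in range(window, x.size - window):
    left, right = x[i - window:i], x[i:i + window]
    t_stat[i] = stats.ttest_ind(left, right, equal_var=False).statistic

candidate = int(np.nanargmax(np.abs(t_stat)))
print(f"strongest candidate change point at index {candidate}, |t| = {abs(t_stat[candidate]):.2f}")
```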

  17. Does the relativity principle violate?

    International Nuclear Information System (INIS)

    Barashenkov, V.S.

    1994-01-01

    Theoretical and experimental data on the possible existence in Nature of a preferred reference frame, with a corresponding violation of the principle of relativity, are considered. Einstein's and Lorentz's points of view are compared. Although some experiments are known which, in the opinion of their authors, indicate a violation of the relativity principle, persuasive evidence supporting this conclusion is so far absent. Proposals for new experiments in this area, particularly with electron spin precession, are discussed. 55 refs., 4 figs.

  18. Principles of Fourier analysis

    CERN Document Server

    Howell, Kenneth B

    2001-01-01

    Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...

  19. Multi-point Shock and Flux Rope Analysis of Multiple Interplanetary Coronal Mass Ejections around 2010 August 1 in the Inner Heliosphere

    Science.gov (United States)

    Möstl, C.; Farrugia, C. J.; Kilpua, E. K. J.; Jian, L. K.; Liu, Y.; Eastwood, J. P.; Harrison, R. A.; Webb, D. F.; Temmer, M.; Odstrcil, D.; Davies, J. A.; Rollett, T.; Luhmann, J. G.; Nitta, N.; Mulligan, T.; Jensen, E. A.; Forsyth, R.; Lavraud, B.; de Koning, C. A.; Veronig, A. M.; Galvin, A. B.; Zhang, T. L.; Anderson, B. J.

    2012-10-01

    We present multi-point in situ observations of a complex sequence of coronal mass ejections (CMEs) which may serve as a benchmark event for numerical and empirical space weather prediction models. On 2010 August 1, instruments on various space missions, Solar Dynamics Observatory/Solar and Heliospheric Observatory/Solar-TErrestrial-RElations-Observatory (SDO/SOHO/STEREO), monitored several CMEs originating within tens of degrees from the solar disk center. We compare their imprints on four widely separated locations, spanning 120° in heliospheric longitude, with radial distances from the Sun ranging from MESSENGER (0.38 AU) to Venus Express (VEX, at 0.72 AU) to Wind, ACE, and ARTEMIS near Earth and STEREO-B close to 1 AU. Calculating shock and flux rope parameters at each location points to a non-spherical shape of the shock, and shows the global configuration of the interplanetary coronal mass ejections (ICMEs), which have interacted, but do not seem to have merged. VEX and STEREO-B observed similar magnetic flux ropes (MFRs), in contrast to structures at Wind. The geomagnetic storm was intense, reaching two minima in the Dst index (≈ - 100 nT), and was caused by the sheath region behind the shock and one of two observed MFRs. MESSENGER received a glancing blow of the ICMEs, and the events missed STEREO-A entirely. The observations demonstrate how sympathetic solar eruptions may immerse at least 1/3 of the heliosphere in the ecliptic with their distinct plasma and magnetic field signatures. We also emphasize the difficulties in linking the local views derived from single-spacecraft observations to a consistent global picture, pointing to possible alterations from the classical picture of ICMEs.

  20. MULTI-POINT SHOCK AND FLUX ROPE ANALYSIS OF MULTIPLE INTERPLANETARY CORONAL MASS EJECTIONS AROUND 2010 AUGUST 1 IN THE INNER HELIOSPHERE

    Energy Technology Data Exchange (ETDEWEB)

    Moestl, C.; Liu, Y.; Luhmann, J. G. [Space Science Laboratory, University of California, Berkeley, CA (United States); Farrugia, C. J. [Space Science Center and Department of Physics, University of New Hampshire, Durham, NH (United States); Kilpua, E. K. J. [Department of Physics, University of Helsinki, FI-00560 Helsinki (Finland); Jian, L. K. [Department of Astronomy, University of Maryland, College Park, MD (United States); Eastwood, J. P.; Forsyth, R. [The Blackett Laboratory, Imperial College, London (United Kingdom); Harrison, R. A.; Davies, J. A. [RAL Space, Harwell Oxford, Didcot (United Kingdom); Webb, D. F. [Institute for Scientific Research, Boston College, Newton, MA (United States); Temmer, M.; Rollett, T.; Veronig, A. M. [Kanzelhoehe Observatory-IGAM, Institute of Physics, University of Graz, A-8010 Graz (Austria); Odstrcil, D. [NASA Goddard Space Flight Center, Greenbelt, MD (United States); Nitta, N. [Solar and Astrophysics Laboratory, Lockheed Martin Advanced Technology Center, Palo Alto, CA (United States); Mulligan, T. [Space Science Applications Laboratory, The Aerospace Corporation, El Segundo, CA (United States); Jensen, E. A. [ACS Consulting, Houston, TX (United States); Lavraud, B. [Institut de Recherche en Astrophysique et Planetologie, Universite de Toulouse (UPS), F-31400 Toulouse (France); De Koning, C. A., E-mail: christian.moestl@uni-graz.at [NOAA/SWPC, Boulder, Colorado (United States); and others

    2012-10-10

    We present multi-point in situ observations of a complex sequence of coronal mass ejections (CMEs) which may serve as a benchmark event for numerical and empirical space weather prediction models. On 2010 August 1, instruments on various space missions, Solar Dynamics Observatory/Solar and Heliospheric Observatory/Solar-TErrestrial-RElations-Observatory (SDO/SOHO/STEREO), monitored several CMEs originating within tens of degrees from the solar disk center. We compare their imprints on four widely separated locations, spanning 120° in heliospheric longitude, with radial distances from the Sun ranging from MESSENGER (0.38 AU) to Venus Express (VEX, at 0.72 AU) to Wind, ACE, and ARTEMIS near Earth and STEREO-B close to 1 AU. Calculating shock and flux rope parameters at each location points to a non-spherical shape of the shock, and shows the global configuration of the interplanetary coronal mass ejections (ICMEs), which have interacted, but do not seem to have merged. VEX and STEREO-B observed similar magnetic flux ropes (MFRs), in contrast to structures at Wind. The geomagnetic storm was intense, reaching two minima in the Dst index (≈ -100 nT), and was caused by the sheath region behind the shock and one of two observed MFRs. MESSENGER received a glancing blow of the ICMEs, and the events missed STEREO-A entirely. The observations demonstrate how sympathetic solar eruptions may immerse at least 1/3 of the heliosphere in the ecliptic with their distinct plasma and magnetic field signatures. We also emphasize the difficulties in linking the local views derived from single-spacecraft observations to a consistent global picture, pointing to possible alterations from the classical picture of ICMEs.

  1. Gesture & Principle

    DEFF Research Database (Denmark)

    Hvejsel, Marie Frier

    2017-01-01

    The development of architecture is governed by an increasing gap between the experienced quality of architectural space, manifest as a sense of interiority on behalf of the user, and the means applied in its realization. Between means and ends, one could say, which calls for a continuous...... and critical development of architectural method. By method, I mean a systematic series of steps taken to acquire knowledge about an architectural problem. As the building industry and the actual construction of architecture is becoming ever more complex moving from craft over industrialization through...... to integration of digital technologies, the mentioned gap is becoming present in the processes and models of collaborations between the multiple parties involved in the realization of architecture, as well as in the work itself. This, as an increase of construction layers, installations and equipment tending...

  2. Principled Missing Data Treatments.

    Science.gov (United States)

    Lang, Kyle M; Little, Todd D

    2018-04-01

    We review a number of issues regarding missing data treatments for intervention and prevention researchers. Many of the common missing data practices in prevention research are still, unfortunately, ill-advised (e.g., use of listwise and pairwise deletion, insufficient use of auxiliary variables). Our goal is to promote better practice in the handling of missing data. We review the current state of missing data methodology and recent missing data reporting in prevention research. We describe antiquated, ad hoc missing data treatments and discuss their limitations. We discuss two modern, principled missing data treatments: multiple imputation and full information maximum likelihood, and we offer practical tips on how to best employ these methods in prevention research. The principled missing data treatments that we discuss are couched in terms of how they improve causal and statistical inference in the prevention sciences. Our recommendations are firmly grounded in missing data theory and well-validated statistical principles for handling the missing data issues that are ubiquitous in biosocial and prevention research. We augment our broad survey of missing data analysis with references to more exhaustive resources.
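    As a small, hypothetical companion to the discussion of principled treatments, the sketch below runs multiple imputation with scikit-learn's IterativeImputer, drawing several stochastic imputations and pooling the estimated mean of one variable across them; a full analysis would also pool standard errors using Rubin's rules. The data, missingness rate and number of imputations are invented for illustration.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(6)

# Hypothetical data: outcome correlated with an auxiliary variable,
# with ~20% of outcome values missing at random
n = 300
aux = rng.normal(size=n)
outcome = 0.7 * aux + rng.normal(scale=0.5, size=n)
data = np.column_stack([outcome, aux])
missing = rng.random(n) < 0.2
data[missing, 0] = np.nan

m_imputations = 20
estimates = []
for m in range(m_imputations):
    imputer = IterativeImputer(sample_posterior=True, random_state=m)
    completed = imputer.fit_transform(data)
    estimates.append(completed[:, 0].mean())    # estimate of interest per imputation

pooled = np.mean(estimates)
between_var = np.var(estimates, ddof=1)
print(f"pooled mean of outcome: {pooled:.3f} (between-imputation variance {between_var:.4f})")
```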

  3. Per-point and per-field contextual classification of multipolarization and multiple incidence angle aircraft L-band radar data

    Science.gov (United States)

    Hoffer, Roger M.; Hussin, Yousif Ali

    1989-01-01

    Multipolarized aircraft L-band radar data are classified using two different image classification algorithms: (1) a per-point classifier, and (2) a contextual, or per-field, classifier. Due to the distinct variations in radar backscatter as a function of incidence angle, the data are stratified into three incidence-angle groupings, and training and test data are defined for each stratum. A low-pass digital mean filter with varied window size (i.e., 3x3, 5x5, and 7x7 pixels) is applied to the data prior to the classification. A predominately forested area in northern Florida was the study site. The results obtained by using these image classifiers are then presented and discussed.
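    Neither the radar data nor the original classifiers are available here; the sketch below only illustrates the per-field idea of smoothing backscatter with mean filters of different window sizes before a simple minimum-distance-to-class-mean classification, on a synthetic two-class image. The image, class means and window sizes (3x3, 5x5, 7x7 as in the abstract) are stand-ins.

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(7)

# Synthetic single-band "backscatter" image: two classes with different means
truth = np.zeros((200, 200), dtype=int)
truth[:, 100:] = 1
image = np.where(truth == 1, 5.0, 2.0) + rng.normal(scale=1.5, size=truth.shape)

# Class means estimated from (here perfectly known) training areas
means = [image[truth == c].mean() for c in (0, 1)]

def min_distance_classify(band):
    dists = np.stack([np.abs(band - m) for m in means])
    return dists.argmin(axis=0)

for win in (1, 3, 5, 7):                       # 1 = per-point, others = per-field
    band = image if win == 1 else uniform_filter(image, size=win)
    acc = (min_distance_classify(band) == truth).mean()
    print(f"window {win}x{win}: overall accuracy = {acc:.3f}")
```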

  4. Multiple positive solutions of nonlinear singular m-point boundary value problem for second-order dynamic equations with sign changing coefficients on time scales

    Directory of Open Access Journals (Sweden)

    Fuyi Xu

    2010-04-01

    … where $1\leq k\leq s\leq m-2$, $a_i, b_i\in(0,+\infty)$ with $0<\sum_{i=1}^{k}b_{i}-\sum_{i=k+1}^{s}b_{i}<1$, $0<\sum_{i=1}^{m-2}a_{i}<1$, $0<\xi_1<\xi_2<\cdots<\xi_{m-2}<\rho(T)$, $f\in C([0,+\infty),[0,+\infty))$, and $a(t)$ may be singular at $t=0$. We show that there exist two positive solutions by using two different fixed point theorems respectively. As an application, some examples are included to illustrate the main results. In particular, our criteria extend and improve some known results.

  5. General principles

    International Nuclear Information System (INIS)

    Hutchison, J.M.S.; Foster, M.A.

    1987-01-01

    NMR characteristics are not unique - T₁ values of tumour tissues overlap with those from multiple sclerosis plaques or from areas of inflammation. Despite this, NMR imaging is an extremely powerful tool for the diagnostician and for other medical uses such as following the course of treatment or planning of surgery or radiotherapy. Magnetic resonance imaging (MRI) is often used solely as an anatomical technique similar to X-ray CT. This is certainly an appropriate use for it and it has certain advantages over X-ray CT such as the greater ease with which sagittal and coronal sections can be obtained (or other views by suitable manipulation of the gradients). NMR is also less bothered by bone-related artefacts. There are disadvantages in terms of resolution (although this is improving) and of speed of acquisition of the image. The NMR signal, however, derives from a complex interaction of biophysical properties and, if properly used, can yield a considerable amount of information about its origin. The NMR image is capable of much more manipulation than that obtained by X-ray methods and, particularly with the addition of spectroscopy to the repertoire, the authors expect in vivo NMR examinations to yield much metabolic and biophysical information in addition to providing a demonstration of the anatomy of the body.

  6. Pharmacometric Analysis of the Relationship Between Absolute Lymphocyte Count and Expanded Disability Status Scale and Relapse Rate, Efficacy End Points, in Multiple Sclerosis Trials.

    Science.gov (United States)

    Novakovic, A M; Thorsted, A; Schindler, E; Jönsson, S; Munafo, A; Karlsson, M O

    2018-05-10

    The aim of this work was to assess the relationship between the absolute lymphocyte count (ALC) and two efficacy endpoints, disability (as measured by the Expanded Disability Status Scale [EDSS]) and the occurrence of relapses, in patients with relapsing-remitting multiple sclerosis. Data for ALC, EDSS, and relapse rate were available from 1319 patients receiving placebo and/or cladribine tablets. Pharmacodynamic models were developed to characterize the time course of the endpoints. ALC-related measures were then evaluated as predictors of the efficacy endpoints. EDSS data were best fitted by a model where the logit-linear disease progression is affected by the dynamics of ALC change from baseline. Relapse rate data were best described by the Weibull hazard function, and the ALC change from baseline was also found to be a significant predictor of time to relapse. The presented models have shown that once cladribine exposure-driven, ALC-derived measures are included in the model, the need for drug-effect components becomes less important (EDSS) or disappears (relapse rate). This simplifies the models and theoretically makes them mechanism specific rather than drug specific. Having a reliable mechanism-specific model would allow leveraging historical data across compounds, to support decision making in drug development and possibly shorten the time to market. © 2018, The American College of Clinical Pharmacology.
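    The full pharmacometric model cannot be reconstructed from this abstract; the sketch below simply evaluates a Weibull baseline hazard and the corresponding survival for time to relapse, with the ALC change from baseline entering as a proportional-hazards covariate. All parameter values (scale, shape, covariate effect) are invented for illustration.

```python
import numpy as np

def weibull_hazard(t, lam, k):
    """Weibull hazard h(t) = (k / lam) * (t / lam)**(k - 1)."""
    return (k / lam) * (t / lam) ** (k - 1)

def survival(t, lam, k, beta, alc_change):
    """Proportional-hazards survival with a cumulative Weibull baseline hazard."""
    h0_cum = (t / lam) ** k                      # integral of the baseline hazard
    return np.exp(-h0_cum * np.exp(beta * alc_change))

# Invented parameters: scale (days), shape, and covariate effect of the
# ALC change from baseline (negative change assumed to lower relapse hazard)
lam, k, beta = 600.0, 1.2, 2.0

print("baseline hazard at t = 180 d:", round(weibull_hazard(180.0, lam, k), 5))

t = np.array([90.0, 180.0, 365.0])
for alc_change in (0.0, -0.4, -0.7):             # fractional change from baseline
    s = survival(t, lam, k, beta, alc_change)
    print(f"ALC change {alc_change:+.1f}: relapse-free prob at 90/180/365 d =",
          np.round(s, 3))
```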

  7. Multiple "buy buttons" in the brain: Forecasting chocolate sales at point-of-sale based on functional brain activation using fMRI.

    Science.gov (United States)

    Kühn, Simone; Strelow, Enrique; Gallinat, Jürgen

    2016-08-01

    We set out to forecast consumer behaviour in a supermarket based on functional magnetic resonance imaging (fMRI). Data were collected while participants viewed six chocolate bar communications and product pictures before and after each communication. Self-reported liking judgements were then collected. fMRI data were extracted from a priori selected brain regions: nucleus accumbens, medial orbitofrontal cortex, amygdala, hippocampus, inferior frontal gyrus, and dorsomedial prefrontal cortex, assumed to contribute positively to sales, while dorsolateral prefrontal cortex and insula were hypothesized to contribute negatively. The resulting values were rank ordered. After our fMRI-based forecast, an in-store test was conducted in a supermarket on n = 63,617 shoppers. Changes in sales were best forecasted by the fMRI signal during communication viewing, second best by a comparison of the brain signal during product viewing before and after communication, and least well by explicit liking judgements. The results demonstrate the feasibility of applying neuroimaging methods in a relatively small sample to correctly forecast sales changes at point-of-sale. Copyright © 2016. Published by Elsevier Inc.

  8. Management Certainly Matters, and There Are Multiple Ways to Conceptualize the Process; Comment on “Management Matters: A Leverage Point for Health Systems Strengthening in Global Health”

    Directory of Open Access Journals (Sweden)

    Beaufort B. Longest

    2015-11-01

    Full Text Available The authors of “Management matters: a leverage point for health systems strengthening in global health” raise a crucial issue. Because more effective management can contribute to better performing health systems, attempts to strengthen health systems require attention to management. As a guide toward management capacity building, the authors outline a comprehensive set of core management competencies needed for managing global health efforts. Although I agree with the authors’ central premise about the important role of management in improving global health and concur that focusing on competencies can guide management capacity building, I think it is important to recognize that a set of relevant competencies is not the only way to conceptualize and organize efforts to teach, learn, practice, or conduct research on management. I argue for the added utility of also viewing management as a set of functions or activities as an alternative paradigm, and suggest that the greatest utility could lie in some hybrid that combines various ways of conceptualizing management for study, practice, and research.

  9. Progress in classical and quantum variational principles

    International Nuclear Information System (INIS)

    Gray, C G; Karl, G; Novikov, V A

    2004-01-01

    We review the development and practical uses of a generalized Maupertuis least action principle in classical mechanics in which the action is varied under the constraint of fixed mean energy for the trial trajectory. The original Maupertuis (Euler-Lagrange) principle constrains the energy at every point along the trajectory. The generalized Maupertuis principle is equivalent to Hamilton's principle. Reciprocal principles are also derived for both the generalized Maupertuis and the Hamilton principles. The reciprocal Maupertuis principle is the classical limit of Schroedinger's variational principle of wave mechanics and is also very useful to solve practical problems in both classical and semiclassical mechanics, in complete analogy with the quantum Rayleigh-Ritz method. Classical, semiclassical and quantum variational calculations are carried out for a number of systems, and the results are compared. Pedagogical as well as research problems are used as examples, which include nonconservative as well as relativistic systems. '... the most beautiful and important discovery of Mechanics.' Lagrange to Maupertuis (November 1756)
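    As a small numerical companion to the remark that the reciprocal Maupertuis principle is the classical limit of Schrödinger's variational principle, the sketch below applies the quantum Rayleigh-Ritz idea to the harmonic oscillator with ħ = m = ω = 1: a one-parameter Gaussian trial function is varied numerically, and the minimal energy reproduces the exact ground-state value 1/2. The grid, trial family and optimizer bounds are choices made here, not taken from the review.

```python
import numpy as np
from scipy.optimize import minimize_scalar

x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]

def energy(a):
    """Rayleigh quotient for the trial function psi(x) = exp(-a x^2),
    with H = -(1/2) d^2/dx^2 + (1/2) x^2 (units hbar = m = omega = 1)."""
    psi = np.exp(-a * x**2)
    dpsi = np.gradient(psi, dx)
    kinetic = 0.5 * np.sum(dpsi**2) * dx        # = <T> after integration by parts
    potential = 0.5 * np.sum(x**2 * psi**2) * dx
    norm = np.sum(psi**2) * dx
    return (kinetic + potential) / norm

res = minimize_scalar(energy, bounds=(0.05, 5.0), method="bounded")
print(f"optimal width parameter a = {res.x:.4f}  (exact 0.5)")
print(f"variational ground-state energy = {res.fun:.6f}  (exact 0.5)")
```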

  10. Radiation chemistry; principles and applications

    International Nuclear Information System (INIS)

    Aziz, F.; Rodgers, M.A.J.

    1994-01-01

    The book presents the fields of application that rest on the principles of radiation chemistry. The first four chapters serve as a prelude, covering how ionizing radiation interacts with matter, the primary results of these interactions, the kinetic laws that these primary interactions follow, and the equipment necessary for qualitative studies. The following chapters cover the principal fields of radiation chemistry. The last six chapters discuss these principles from a physical and chemical point of view; in this connection the fundamentals of the action of radiation on biological systems are emphasised. On the one hand, its importance for hygiene and safety as well as for neoplasm therapy is discussed; on the other hand, its industrial importance is presented.

  11. The gauge principle vs. the equivalence principle

    International Nuclear Information System (INIS)

    Gates, S.J. Jr.

    1984-01-01

    Within the context of field theory, it is argued that the role of the equivalence principle may be replaced by the principle of gauge invariance to provide a logical framework for theories of gravitation

  12. Comments on field equivalence principles

    DEFF Research Database (Denmark)

    Appel-Hansen, Jørgen

    1987-01-01

    It is pointed out that often-used arguments based on a short-circuit concept in presentations of field equivalence principles are not correct. An alternative presentation based on the uniqueness theorem is given. It does not contradict the results obtained by using the short-circuit concept...

  13. Structuring Principles for the Designer

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth; Pedersen, Per Erik Elgård

    1998-01-01

    This paper suggests a list of structuring principles that support the designer in making alternative concepts for product architectures. Different architectures may support different points of diversification in the product life-cycle. The aim is to balance reuse of resources and reduction...

  14. Equivalence principles and electromagnetism

    Science.gov (United States)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  15. Tipping Point

    Medline Plus


  16. Critical point predication device

    International Nuclear Information System (INIS)

    Matsumura, Kazuhiko; Kariyama, Koji.

    1996-01-01

    The conventional operation for predicting a critical point by the existing reverse multiplication method has been complicated, and because the effective multiplication factor could not be plotted directly, the accuracy of the prediction was degraded. The present invention comprises a detector counting memory section for storing the counts sent from a power detector which monitors the reactor power, a reverse multiplication factor calculation section for calculating the reverse multiplication factor based on the initial and current counts of the power detector, and a critical point prediction section for predicting criticality by the reverse multiplication method relative to effective multiplication factors corresponding to the state of the reactor core, determined in advance for each case. In addition, a reactor core characteristic calculation section is added for analyzing the effective multiplication factor depending on the state of the reactor core. Then, if the margin to criticality is reduced below a predetermined value during critical operation, an alarm is generated to stop the critical operation when a period exceeding a predetermined value is predicted for the succeeding critical operation. With such procedures, the critical point can be easily forecast during critical operation, which greatly mitigates the operator's burden and improves handling of the operation. (N.H.)
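    Independently of the patented device, the underlying reverse multiplication (1/M) method is easy to illustrate: the inverse of the measured neutron count rate falls roughly linearly as the core approaches criticality, and extrapolating that line to 1/M = 0 predicts the critical loading. The sketch below does this with invented count-rate data and an arbitrary choice of which points to fit.

```python
import numpy as np

# Invented approach-to-critical data: control parameter (e.g. number of fuel
# assemblies loaded) and the corresponding detector count rates
loading = np.array([10, 20, 30, 40, 50], dtype=float)
count_rate = np.array([120.0, 160.0, 240.0, 480.0, 1400.0])   # counts per second

inverse_m = count_rate[0] / count_rate        # normalised 1/M (starts at 1.0)

# Fit the last few points, where the trend is closest to linear, and
# extrapolate to 1/M = 0 to predict the critical loading
slope, intercept = np.polyfit(loading[-3:], inverse_m[-3:], 1)
critical_loading = -intercept / slope

for l, m in zip(loading, inverse_m):
    print(f"loading {l:4.0f}: 1/M = {m:.3f}")
print(f"predicted critical loading ~ {critical_loading:.1f}")
```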

  17. The several faces of the cosmological principle

    Energy Technology Data Exchange (ETDEWEB)

    Beisbart, Claus [TU Dortmund (Germany). Fakultaet 14, Institut fuer Philosophie und Politikwissenschaft

    2010-07-01

    Much work in relativistic cosmology relies upon the cosmological principle. Very roughly, this principle has it that the universe is spatially homogeneous and isotropic. However, if the principle is to do some work, it has to be rendered more precise. The aim of this talk is to show that such a precisification significantly depends on the theoretical framework adopted and on its ontology. Moreover, it is shown that present-day cosmology uses the principle in different versions that do not fit together nicely. Whereas, in theoretical cosmology, the principle is spelt out as a requirement on space-time manifolds, observational cosmology cashes out the principle using the notion of a random process. I point out some philosophical problems that arise in this context. My conclusion is that the cosmological principle is not a very precise hypothesis, but rather a rough idea that has several faces in contemporary cosmology.

  18. Microhydrodynamics principles and selected applications

    CERN Document Server

    Kim, Sangtae; Brenner, Howard

    1991-01-01

    Microhydrodynamics: Principles and Selected Applications presents analytical and numerical methods for describing motion of small particles suspended in viscous fluids. The text first covers the fundamental principles of low-Reynolds-number flow, including the governing equations and fundamental theorems; the dynamics of a single particle in a flow field; and hydrodynamic interactions between suspended particles. Next, the book deals with the advances in the mathematical and computational aspects of viscous particulate flows that point to innovations for large-scale simulations on parallel co

  19. Fixed Points

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 5; Issue 5. Fixed Points - From Russia with Love - A Primer of Fixed Point Theory. A K Vijaykumar. Book Review Volume 5 Issue 5 May 2000 pp 101-102. Fulltext. Click here to view fulltext PDF. Permanent link:

  20. Tipping Point

    Medline Plus

    Full Text Available The Tipping Point, a 60 Seconds of Safety video on the CPSC OnSafety blog. ... 24 hours a day. For young children whose home is a playground, it’s the best way to ...

  1. Tipping Point

    Medline Plus

    Full Text Available The Tipping Point, by CPSC Blogger, September 22, 2009 (a 60 Seconds of Safety video on the CPSC OnSafety blog).

  2. Acid dew point measurement in flue gases

    Energy Technology Data Exchange (ETDEWEB)

    Struschka, M.; Baumbach, G.

    1986-06-01

    The operation of modern boiler plants requires the continuous measurement of the acid dew point in flue gases. An existing measuring instrument was modified in such a way that it can determine acid dew points reliably, reproducibly and continuously. The authors present the mechanisms of dew point formation, the dew point measuring principle, the modification and the operational results.

  3. Framework for assessing causality in disease management programs: principles.

    Science.gov (United States)

    Wilson, Thomas; MacDowell, Martin

    2003-01-01

    To credibly state that a disease management (DM) program "caused" a specific outcome it is required that metrics observed in the DM population be compared with metrics that would have been expected in the absence of a DM intervention. That requirement can be very difficult to achieve, and epidemiologists and others have developed guiding principles of causality by which credible estimates of DM impact can be made. This paper introduces those key principles. First, DM program metrics must be compared with metrics from a "reference population." This population should be "equivalent" to the DM intervention population on all factors that could independently impact the outcome. In addition, the metrics used in both groups should use the same defining criteria (ie, they must be "comparable" to each other). The degree to which these populations fulfill the "equivalent" assumption and metrics fulfill the "comparability" assumption should be stated. Second, when "equivalence" or "comparability" is not achieved, the DM managers should acknowledge this fact and, where possible, "control" for those factors that may impact the outcome(s). Finally, it is highly unlikely that one study will provide definitive proof of any specific DM program value for all time; thus, we strongly recommend that studies be ongoing, at multiple points in time, and at multiple sites, and, when observational study designs are employed, that more than one type of study design be utilized. Methodologically sophisticated studies that follow these "principles of causality" will greatly enhance the reputation of the important and growing efforts in DM.

  4. Cosmological implications of Heisenberg's principle

    CERN Document Server

    Gonzalo, Julio A

    2015-01-01

    The aim of this book is to analyze the all-important implications of Heisenberg's Uncertainty Principle for a finite universe with very large mass-energy content such as ours. The earlier and main contributors to the formulation of Quantum Mechanics are briefly reviewed regarding the formulation of Heisenberg's Principle. After discussing “indeterminacy” versus ”uncertainty”, the universal constants of physics are reviewed and Planck's units are given. Next, a novel set of units, Heisenberg–Lemaitre units, are defined in terms of the large finite mass of the universe. With the help of Heisenberg's principle, the time evolution of the finite zero-point energy of the universe is investigated quantitatively. Next, taking advantage of the rigorous solutions of Einstein's cosmological equation for a flat, open and mixed universe of finite mass, the most recent and accurate data on the “age” (t0) and the expansion rate (H0) of the universe and their implications are reconsidered.

  5. Dew Point

    OpenAIRE

    Goldsmith, Shelly

    1999-01-01

    Dew Point was a solo exhibition originating at PriceWaterhouseCoopers Headquarters Gallery, London, UK and toured to the Centre de Documentacio i Museu Textil, Terrassa, Spain and Gallery Aoyama, Tokyo, Japan.

  6. Tipping Point

    Medline Plus

    Full Text Available The Tipping Point, by CPSC Blogger, September 22, 2009.

  7. Tipping Point

    Science.gov (United States)

    The Tipping Point, by CPSC Blogger, September 22, 2009.

  8. Tipping Point

    Medline Plus

    Full Text Available The Tipping Point, by CPSC Blogger, September 22, 2009. ... see news reports about horrible accidents involving young children and furniture, appliance and tv tip-overs. The ...

  9. Tipping Point

    Medline Plus

    Full Text Available The Tipping Point, by CPSC Blogger, September 22, 2009. ... a TV falls with about the same force as a child falling from the third story of a building. ...

  10. Tipping Point

    Medline Plus

    Full Text Available The Tipping Point, by CPSC Blogger, September 22, 2009. ... about horrible accidents involving young children and furniture, appliance and tv tip-overs. The force of a ...

  11. Graphics and visualization principles & algorithms

    CERN Document Server

    Theoharis, T; Platis, Nikolaos; Patrikalakis, Nicholas M

    2008-01-01

    Computer and engineering collections strong in applied graphics and analysis of visual data via computer will find Graphics & Visualization: Principles and Algorithms makes an excellent classroom text as well as supplemental reading. It integrates coverage of computer graphics and other visualization topics, from shadow generation and particle tracing to spatial subdivision and vector data visualization, and it provides a thorough review of literature from multiple experts, making for a comprehensive review essential to any advanced computer study.-California Bookw

  12. Multiplicity in difference geometry

    OpenAIRE

    Tomasic, Ivan

    2011-01-01

    We prove a first principle of preservation of multiplicity in difference geometry, paving the way for the development of a more general intersection theory. In particular, the fibres of a σ-finite morphism between difference curves are all of the same size, when counted with correct multiplicities.

  13. Existence and Multiplicity Results for Nonlinear Differential Equations Depending on a Parameter in Semipositone Case

    Directory of Open Access Journals (Sweden)

    Hailong Zhu

    2012-01-01

    Full Text Available The existence and multiplicity of solutions for second-order differential equations with a parameter are discussed in this paper. We are mainly concerned with the semipositone case. The analysis relies on the nonlinear alternative principle of Leray-Schauder and Krasnosel'skii's fixed point theorem in cones.

  14. First principles calculations of point defect diffusion in CdS buffer layers: Implications for Cu(In,Ga)(Se,S)₂ and Cu₂ZnSn(Se,S)₄-based thin-film photovoltaics

    Energy Technology Data Exchange (ETDEWEB)

    Varley, J. B.; Lordi, V. [Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); He, X.; Rockett, A. [Department of Materials Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, Illinois 61801 (United States)

    2016-01-14

    We investigate point defects in CdS buffer layers that may arise from intermixing with Cu(In,Ga)Se₂ (CIGSe) or Cu₂ZnSn(S,Se)₄ (CZTSSe) absorber layers in thin-film photovoltaics (PV). Using hybrid functional calculations, we characterize the migration barriers of Cu, In, Ga, Se, Sn, Zn, Na, and K impurities and assess the activation energies necessary for their diffusion into the bulk of the buffer. We find that Cu, In, and Ga are the most mobile of the CIGSe-derived impurities, with diffusion expected to proceed into the buffer via interstitial-hopping and cadmium vacancy-assisted mechanisms at temperatures ∼400 °C. Cu is predicted to strongly favor migration paths within the basal plane of the wurtzite CdS lattice, which may facilitate defect clustering and ultimately the formation of Cu-rich interfacial phases as observed by energy dispersive x-ray spectroscopic elemental maps in real PV devices. Se, Zn, and Sn defects are found to exhibit much larger activation energies and are not expected to diffuse within the CdS bulk at temperatures compatible with typical PV processing temperatures. Lastly, we find that Na interstitials are expected to exhibit slightly lower activation energies than K interstitials despite having a larger migration barrier. Still, we find both alkali species are expected to diffuse via an interstitially mediated mechanism at slightly higher temperatures than enable In, Ga, and Cu diffusion in the bulk. Our results indicate that processing temperatures in excess of ∼400 °C will lead to more interfacial intermixing with CdS buffer layers in CIGSe devices, and less so for CZTSSe absorbers where only Cu is expected to significantly diffuse into the buffer.
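
    As a rough companion to the activation energies discussed above, the sketch below shows how a migration barrier translates into a temperature-dependent hop rate through the Arrhenius relation Γ = ν·exp(−Ea/kBT). It is not part of the cited study; the attempt frequency ν and the example barriers are assumptions chosen for illustration, and 673 K stands in for the ∼400 °C processing temperature mentioned in the abstract.

```python
# Illustrative sketch (not from the cited study): converting an activation
# energy into a temperature-dependent hop rate with the Arrhenius relation
# Gamma = nu * exp(-Ea / (kB * T)).  The attempt frequency nu and the example
# barriers below are assumptions for illustration only.
import math

K_B = 8.617333e-5          # Boltzmann constant, eV/K
NU  = 1.0e13               # assumed attempt frequency, 1/s

def hop_rate(ea_ev, temp_k):
    """Arrhenius hop rate (1/s) for activation energy ea_ev (eV) at temp_k (K)."""
    return NU * math.exp(-ea_ev / (K_B * temp_k))

for ea in (0.8, 1.5, 2.2):                 # hypothetical activation energies (eV)
    for t in (300.0, 673.0):               # room temperature and ~400 degC
        print(f"Ea = {ea:.1f} eV, T = {t:.0f} K: rate ~ {hop_rate(ea, t):.2e} 1/s")
```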

  15. Multi-valued logic gates based on ballistic transport in quantum point contacts.

    Science.gov (United States)

    Seo, M; Hong, C; Lee, S-Y; Choi, H K; Kim, N; Chung, Y; Umansky, V; Mahalu, D

    2014-01-22

    Multi-valued logic gates, which can handle quaternary numbers as inputs, are developed by exploiting the ballistic transport properties of quantum point contacts in series. The principle of a logic gate that finds the minimum of two quaternary number inputs is demonstrated. The device is scalable to allow multiple inputs, which makes it possible to find the minimum of multiple inputs in a single gate operation. Also, the principle of a half-adder for quaternary number inputs is demonstrated. First, an adder that adds up two quaternary numbers and outputs the sum of inputs is demonstrated. Second, a device to express the sum of the adder into two quaternary digits [Carry (first digit) and Sum (second digit)] is demonstrated. All the logic gates presented in this paper can in principle be extended to allow decimal number inputs with high quality QPCs.
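
    The gate behaviour described in the record reduces, at the arithmetic level, to a MIN operation on quaternary digits and a half-adder whose output is split into a Carry and a Sum digit in base 4. The snippet below is only that arithmetic illustration, not a model of the ballistic quantum-point-contact implementation.

```python
# Arithmetic illustration (not the ballistic-transport implementation) of the
# quaternary gates described above: a MIN gate and a half-adder whose output
# is expressed as a Carry (first digit) and Sum (second digit) in base 4.
def q_min(a: int, b: int) -> int:
    """Minimum of two quaternary digits (0..3)."""
    return min(a, b)

def q_half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two quaternary digits and return (carry, sum) in base 4."""
    total = a + b
    return divmod(total, 4)       # carry = total // 4, sum = total % 4

print(q_min(2, 3))                # -> 2
print(q_half_adder(3, 2))         # -> (1, 1), i.e. 3 + 2 = 11 in base 4
```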

  16. Multi-Valued Logic Gates based on Ballistic Transport in Quantum Point Contacts

    Science.gov (United States)

    Seo, M.; Hong, C.; Lee, S.-Y.; Choi, H. K.; Kim, N.; Chung, Y.; Umansky, V.; Mahalu, D.

    2014-01-01

    Multi-valued logic gates, which can handle quaternary numbers as inputs, are developed by exploiting the ballistic transport properties of quantum point contacts in series. The principle of a logic gate that finds the minimum of two quaternary number inputs is demonstrated. The device is scalable to allow multiple inputs, which makes it possible to find the minimum of multiple inputs in a single gate operation. Also, the principle of a half-adder for quaternary number inputs is demonstrated. First, an adder that adds up two quaternary numbers and outputs the sum of inputs is demonstrated. Second, a device to express the sum of the adder into two quaternary digits [Carry (first digit) and Sum (second digit)] is demonstrated. All the logic gates presented in this paper can in principle be extended to allow decimal number inputs with high quality QPCs.

  17. Multiple Perspectives / Multiple Readings

    Directory of Open Access Journals (Sweden)

    Simon Biggs

    2005-01-01

    Full Text Available People experience things from their own physical point of view. What they see is usually a function of where they are and what physical attitude they adopt relative to the subject. With augmented vision (periscopes, mirrors, remote cameras, etc.) we are able to see things from places where we are not present. With time-shifting technologies, such as the video recorder, we can also see things from the past; a time and a place we may never have visited. In recent artistic work I have been exploring the implications of digital technology, interactivity and internet connectivity that allow people to not so much space/time-shift their visual experience of things but rather see what happens when everybody is simultaneously able to see what everybody else can see. This is extrapolated through the remote networking of sites that are actual installation spaces, where the physical movements of viewers in the space generate multiple perspectives, linked to other similar sites at remote locations or to other viewers entering the shared data-space through a web based version of the work. This text explores the processes involved in such a practice and reflects on related questions regarding the non-singularity of being and the sense of self as linked to time and place.

  18. Scalets, wavelets and (complex) turning point quantization

    Science.gov (United States)

    Handy, C. R.; Brooks, H. A.

    2001-05-01

    Despite the many successes of wavelet analysis in image and signal processing, the incorporation of continuous wavelet transform theory within quantum mechanics has lacked a compelling, first principles, motivating analytical framework, until now. For arbitrary one-dimensional rational fraction Hamiltonians, we develop a simple, unified formalism, which clearly underscores the complementary, and mutually interdependent, role played by moment quantization theory (i.e. via scalets, as defined herein) and wavelets. This analysis involves no approximation of the Hamiltonian within the (equivalent) wavelet space, and emphasizes the importance of (complex) multiple turning point contributions in the quantization process. We apply the method to three illustrative examples. These include the (double-well) quartic anharmonic oscillator potential problem, V(x) = Z²x² + gx⁴, the quartic potential, V(x) = x⁴, and the very interesting and significant non-Hermitian potential V(x) = −(ix)³, recently studied by Bender and Boettcher.

  19. Radiation protection principles

    International Nuclear Information System (INIS)

    Ismail Bahari

    2007-01-01

    The presentation outlines the aspects of radiation protection principles. It discusses the following subjects: radiation hazards and risk, the objectives of radiation protection, and the three principles of the system - justification of practice, optimization of protection and safety, and dose limits.

  20. Principles of project management

    Science.gov (United States)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  1. First principles calculations of interstitial and lamellar rhenium nitrides

    Energy Technology Data Exchange (ETDEWEB)

    Soto, G., E-mail: gerardo@cnyn.unam.mx [Universidad Nacional Autonoma de Mexico, Centro de Nanociencias y Nanotecnologia, Km 107 Carretera Tijuana-Ensenada, Ensenada Baja California (Mexico); Tiznado, H.; Reyes, A.; Cruz, W. de la [Universidad Nacional Autonoma de Mexico, Centro de Nanociencias y Nanotecnologia, Km 107 Carretera Tijuana-Ensenada, Ensenada Baja California (Mexico)

    2012-02-15

    Highlights: ▶ The possible structures of rhenium nitride as a function of composition are analyzed. ▶ The alloying energy is favorable for rhenium nitride in lamellar arrangements. ▶ The structures produced by magnetron sputtering are metastable variations. ▶ The structures produced by high-pressure high-temperature synthesis are stable configurations. ▶ The lamellar structures are a new category of interstitial dissolutions. - Abstract: We report here a systematic first-principles study of two classes of variable-composition rhenium nitride: (i) interstitial rhenium nitride as a solid solution and (ii) rhenium nitride in lamellar structures. The compounds in class (i) are cubic and hexagonal close-packed rhenium phases, with nitrogen in the octahedral and tetrahedral interstices of the metal, and they are formed without changes to the structure, except for slight distortions of the unit cells. In the compounds in class (ii), by contrast, the nitrogen inclusion provokes stacking faults in the parent metal structure. These faults create trigonal-prismatic sites where nitrogen residence is energetically favored. This second class of compounds produces lamellar structures, where the nitrogen lamellas are inserted among multiple rhenium layers. The Re₃N and Re₂N phases produced recently by high-temperature and high-pressure synthesis belong to this class. The ratio of the nitrogen layers to the rhenium layers is given by the composition. While the first-principles calculations point to higher stability for the lamellar structures as opposed to the interstitial phases, the experimental evidence presented here demonstrates that the interstitial classes are synthesizable by plasma methods. We conclude that rhenium nitrides possess polymorphism and that the two-dimensional lamellar structures might represent an emerging class of materials.

  2. The certainty principle (review)

    OpenAIRE

    Arbatsky, D. A.

    2006-01-01

    The certainty principle (2005) made it possible to conceptualize on more fundamental grounds both the Heisenberg uncertainty principle (1927) and the Mandelshtam-Tamm relation (1945). In this review I give a detailed explanation and discussion of the certainty principle, oriented to all physicists, both theorists and experimenters.

  3. Quantum Action Principle with Generalized Uncertainty Principle

    OpenAIRE

    Gu, Jie

    2013-01-01

    One of the common features of all promising candidates for quantum gravity is the existence of a minimal length scale, which naturally emerges with a generalized uncertainty principle, or equivalently a modified commutation relation. Schwinger's quantum action principle was modified to incorporate this modified relation, and was applied to the calculation of the kernel of a free particle, partly recovering the result previously obtained using the path integral.

  4. How Many Principles for Public Health Ethics?

    Science.gov (United States)

    Coughlin, Steven S.

    2009-01-01

    General moral (ethical) principles play a prominent role in certain methods of moral reasoning and ethical decision-making in bioethics and public health. Examples include the principles of respect for autonomy, beneficence, nonmaleficence, and justice. Some accounts of ethics in public health have pointed to additional principles related to social and environmental concerns, such as the precautionary principle and principles of solidarity or social cohesion. This article provides an overview of principle-based methods of moral reasoning as they apply to public health ethics including a summary of advantages and disadvantages of methods of moral reasoning that rely upon general principles of moral reasoning. Drawing upon the literature on public health ethics, examples are provided of additional principles, obligations, and rules that may be useful for analyzing complex ethical issues in public health. A framework is outlined that takes into consideration the interplay of ethical principles and rules at individual, community, national, and global levels. Concepts such as the precautionary principle and solidarity are shown to be useful to public health ethics to the extent that they can be shown to provide worthwhile guidance and information above and beyond principles of beneficence, nonmaleficence, and justice, and the clusters of rules and maxims that are linked to these moral principles. Future directions likely to be productive include further work on areas of public health ethics such as public trust, community empowerment, the rights of individuals who are targeted (or not targeted) by public health interventions, individual and community resilience and wellbeing, and further clarification of principles, obligations, and rules in public health disciplines such as environmental science, prevention and control of chronic and infectious diseases, genomics, and global health. PMID:20072707

  5. Dimensional cosmological principles

    International Nuclear Information System (INIS)

    Chi, L.K.

    1985-01-01

    The dimensional cosmological principles proposed by Wesson require that the density, pressure, and mass of cosmological models be functions of the dimensionless variables which are themselves combinations of the gravitational constant, the speed of light, and the spacetime coordinates. The space coordinate is not the comoving coordinate. In this paper, the dimensional cosmological principle and the dimensional perfect cosmological principle are reformulated by using the comoving coordinate. The dimensional perfect cosmological principle is further modified to allow the possibility that mass creation may occur. Self-similar spacetimes are found to be models obeying the new dimensional cosmological principle

  6. The application of pragmatic principles in competitive business writing

    Directory of Open Access Journals (Sweden)

    Wu Haihong

    2016-01-01

    Full Text Available Business English writing, as an important means of communication, plays a vital role in international business communication. Pragmatic principles, as universal principles, exist in all communication situations. This paper gives a brief introduction to the pragmatic principles and business English writing principles and illustrates the high consistency between them. By analyzing samples, it also points out the instructive significance of pragmatic principles in competitive business English writing. To a certain extent, this provides theoretical support for research on business English writing.

  7. The precautionary principle in international environmental law and international jurisprudence

    OpenAIRE

    Tubić, Bojan

    2014-01-01

    This paper analyses the international regulation of the precautionary principle as one of the environmental principles. This principle envisages that, when there are threats of serious and irreparable harm as a consequence of certain economic activity, the lack of scientific evidence and full certainty cannot be used as a reason for postponing efficient measures to prevent environmental harm. From an economic point of view, the application of the precautionary principle is problematic, because it creates...

  8. Track formation. Principles and applications

    International Nuclear Information System (INIS)

    Monnin, M.

    1978-01-01

    The principles and technical aspects of track formation in insulating solids are first described. The characteristics of dielectric track detection are discussed from the technical point of view: the nature of the detectors, the chemical treatment, the sensitivity and the environmental conditions of use. The applications are reviewed. The principle of each type of applied research is described and then the applications are listed. When used as a detector, nuclear tracks can provide valuable information in a number of fields: element content determination and mapping, imaging, radiation dosimetry, environmental studies, technological uses and miscellaneous other applications. The track-formation process can also be used for making well-defined holes; this method allows other applications which are also described. Finally, some possible future applications are mentioned. (author)

  9. Biomechanics principles and practices

    CERN Document Server

    Peterson, Donald R

    2014-01-01

    Presents Current Principles and Applications. Biomedical engineering is considered to be the most expansive of all the engineering sciences. Its function involves the direct combination of core engineering sciences as well as knowledge of nonengineering disciplines such as biology and medicine. Drawing on material from the biomechanics section of The Biomedical Engineering Handbook, Fourth Edition and utilizing the expert knowledge of respected published scientists in the application and research of biomechanics, Biomechanics: Principles and Practices discusses the latest principles and applicat

  10. Fusion research principles

    CERN Document Server

    Dolan, Thomas James

    2013-01-01

    Fusion Research, Volume I: Principles provides a general description of the methods and problems of fusion research. The book contains three main parts: Principles, Experiments, and Technology. The Principles part describes the conditions necessary for a fusion reaction, as well as the fundamentals of plasma confinement, heating, and diagnostics. The Experiments part details about forty plasma confinement schemes and experiments. The last part explores various engineering problems associated with reactor design, vacuum and magnet systems, materials, plasma purity, fueling, blankets, neutronics

  11. Cleaning Massive Sonar Point Clouds

    DEFF Research Database (Denmark)

    Arge, Lars Allan; Larsen, Kasper Green; Mølhave, Thomas

    2010-01-01

    We consider the problem of automatically cleaning massive sonar data point clouds, that is, the problem of automatically removing noisy points that for example appear as a result of scans of (shoals of) fish, multiple reflections, scanner self-reflections, refraction in gas bubbles, and so on. We...
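
    The cleaning task described above, removing spurious returns from fish, multiple reflections and the like, is commonly approached with statistical outlier removal. The sketch below is a generic k-nearest-neighbour distance filter on synthetic data, not the algorithm of the cited paper; the brute-force distance matrix is used only for clarity (real pipelines would use a spatial index), and k and the threshold are assumptions.

```python
# Generic illustration (not the algorithm from the cited paper): flag noisy
# points in a 3D point cloud by comparing each point's mean distance to its
# k nearest neighbours against the global distribution of such distances.
import numpy as np

def remove_outliers(points, k=8, n_std=2.0):
    """Return the subset of points whose mean k-NN distance is not unusually large."""
    points = np.asarray(points, dtype=float)
    diffs = points[:, None, :] - points[None, :, :]          # pairwise differences
    dists = np.sqrt((diffs ** 2).sum(axis=2))                # full distance matrix
    knn = np.sort(dists, axis=1)[:, 1:k + 1]                 # drop the self-distance 0
    mean_knn = knn.mean(axis=1)
    keep = mean_knn < mean_knn.mean() + n_std * mean_knn.std()
    return points[keep]

rng = np.random.default_rng(0)
seabed = rng.normal(size=(500, 3))                           # synthetic "good" soundings
noise = rng.uniform(low=5.0, high=30.0, size=(10, 3))        # synthetic spurious returns
cleaned = remove_outliers(np.vstack([seabed, noise]))
print(len(cleaned), "points kept out of 510")
```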

  12. Database principles programming performance

    CERN Document Server

    O'Neil, Patrick

    2014-01-01

    Database: Principles Programming Performance provides an introduction to the fundamental principles of database systems. This book focuses on database programming and the relationships between principles, programming, and performance. Organized into 10 chapters, this book begins with an overview of database design principles and presents a comprehensive introduction to the concepts used by a DBA. This text then provides grounding in many abstract concepts of the relational model. Other chapters introduce SQL, describing its capabilities and covering the statements and functions of the programmi

  13. Principles of ecotoxicology

    National Research Council Canada - National Science Library

    Walker, C. H

    2012-01-01

    "Now in its fourth edition, this exceptionally accessible text provides students with a multidisciplinary perspective and a grounding in the fundamental principles required for research in toxicology today...

  14. APPLYING THE PRINCIPLES OF ACCOUNTING IN

    OpenAIRE

    NAGY CRISTINA MIHAELA; SABĂU CRĂCIUN; ”Tibiscus” University of Timişoara, Faculty of Economic Science

    2015-01-01

    The application of accounting principles (accounting principle on accrual basis; principle of business continuity; method consistency principle; prudence principle; independence principle; the principle of separate valuation of assets and liabilities; intangibility principle; non-compensation principle; the principle of substance over form; the principle of threshold significance) to companies that are in bankruptcy procedure has a number of particularities. Thus, some principl...

  15. Principles of protein targeting to the nucleolus.

    Science.gov (United States)

    Martin, Robert M; Ter-Avetisyan, Gohar; Herce, Henry D; Ludwig, Anne K; Lättig-Tünnemann, Gisela; Cardoso, M Cristina

    2015-01-01

    The nucleolus is the hallmark of nuclear compartmentalization and has been shown to exert multiple roles in cellular metabolism besides its main function as the place of rRNA synthesis and assembly of ribosomes. Nucleolar proteins dynamically localize and accumulate in this nuclear compartment relative to the surrounding nucleoplasm. In this study, we have assessed the molecular requirements that are necessary and sufficient for the localization and accumulation of peptides and proteins inside the nucleoli of living cells. The data showed that positively charged peptide entities composed of arginines alone and with an isoelectric point at and above 12.6 are necessary and sufficient for mediating significant nucleolar accumulation. A threshold of 6 arginines is necessary for peptides to accumulate in nucleoli, but already 4 arginines are sufficient when fused within 15 amino acid residues of a nuclear localization signal of a protein. Using a pH sensitive dye, we found that the nucleolar compartment is particularly acidic when compared to the surrounding nucleoplasm and, hence, provides the ideal electrochemical environment to bind poly-arginine containing proteins. In fact, we found that oligo-arginine peptides and GFP fusions bind RNA in vitro. Consistent with RNA being the main binding partner for arginines in the nucleolus, we found that the same principles apply to cells from insects to man, indicating that this mechanism is highly conserved throughout evolution.
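
    The thresholds reported in the record (at least 6 arginines for a peptide on its own, or 4 arginines when fused close to a nuclear localization signal) can be phrased as a simple rule of thumb. The toy check below is one interpretation of those numbers, not the authors' analysis pipeline.

```python
# Toy rule-of-thumb check (an interpretation of the thresholds reported in the
# record, not the authors' code): does a peptide carry enough arginines to be
# expected to accumulate in nucleoli?
def expected_nucleolar_accumulation(peptide: str, near_nls: bool = False) -> bool:
    """Apply the reported thresholds: >= 6 Arg alone, or >= 4 Arg when fused
    close to a nuclear localization signal (near_nls=True)."""
    arg_count = peptide.upper().count("R")
    required = 4 if near_nls else 6
    return arg_count >= required

print(expected_nucleolar_accumulation("RRRRRR"))                 # True: 6 arginines
print(expected_nucleolar_accumulation("GRRRRG"))                 # False: only 4, no NLS
print(expected_nucleolar_accumulation("GRRRRG", near_nls=True))  # True
```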

  16. Rotating detectors and Mach's principle

    International Nuclear Information System (INIS)

    Paola, R.D.M. de; Svaiter, N.F.

    2000-08-01

    In this work we consider a quantum version of Newton's bucket experiment in a flat spacetime: we take an Unruh-DeWitt detector in interaction with a real massless scalar field. We calculate the detector's excitation rate when it is uniformly rotating around some fixed point and the field is prepared in the Minkowski vacuum, and also when the detector is inertial and the field is in the Trocheries-Takeno vacuum state. These results are compared and the relations with Mach's principle are discussed. (author)

  17. Physical Consequences of Mathematical Principles

    Directory of Open Access Journals (Sweden)

    Comay E.

    2009-10-01

    Full Text Available Physical consequences are derived from the following mathematical structures: the variational principle, Wigner's classifications of the irreducible representations of the Poincaré group and the duality invariance of the homogeneous Maxwell equations. The analysis is carried out within the validity domain of special relativity. Hierarchical relations between physical theories are used. Some new results are pointed out together with their comparison with experimental data. It is also predicted that a genuine Higgs particle will not be detected.

  18. Variational principle in quantum mechanics

    International Nuclear Information System (INIS)

    Popiez, L.

    1986-01-01

    The variational principle in a standard, path integral formulation of quantum mechanics (as proposed by Dirac and Feynman) appears only in the context of the classical limit ℏ → 0 and manifests itself through the method of abstract stationary phase. Symbolically it means that a probability amplitude averaged over trajectories denotes a classical evolution operator for points in a configuration space. There exists, however, a formulation of quantum dynamics in which the variational principle is one of the basic postulates. It is explained that the translation between stochastic and quantum mechanics in this case can be understood as in Nelson's stochastic mechanics.

  19. Fashion, Paper Dolls and Multiplicatives

    Science.gov (United States)

    Ura, Suzana Kaori; Stein-Barana, Alzira C. M.; Munhoz, Deisy P.

    2011-01-01

    The multiplicative principle is the tool allowing the counting of groups that can be described by a sequence of events. An event is a subset of sample space, i.e. a collection of possible outcomes, which may be equal to or smaller than the sample space as a whole. It is important that students understand this basic principle early on and know how…
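
    A worked illustration of the multiplication principle the article builds on: a sequence of independent choices with n1, n2, ..., nk options yields n1·n2·...·nk combinations. The wardrobe counts below are hypothetical.

```python
# Worked illustration (example values are hypothetical): the multiplication
# principle says that a sequence of independent choices with n1, n2, ..., nk
# options yields n1 * n2 * ... * nk possible combinations.
from math import prod

choices = {"tops": 4, "skirts": 3, "shoes": 2}   # hypothetical paper-doll wardrobe
print(prod(choices.values()))                    # 4 * 3 * 2 = 24 distinct outfits
```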

  20. The genetic difference principle.

    Science.gov (United States)

    Farrelly, Colin

    2004-01-01

    In the newly emerging debates about genetics and justice three distinct principles have begun to emerge concerning what the distributive aim of genetic interventions should be. These principles are: genetic equality, a genetic decent minimum, and the genetic difference principle. In this paper, I examine the rationale of each of these principles and argue that genetic equality and a genetic decent minimum are ill-equipped to tackle what I call the currency problem and the problem of weight. The genetic difference principle is the most promising of the three principles and I develop this principle so that it takes seriously the concerns of just health care and distributive justice in general. Given the strains on public funds for other important social programmes, the costs of pursuing genetic interventions and the nature of genetic interventions, I conclude that a more lax interpretation of the genetic difference principle is appropriate. This interpretation stipulates that genetic inequalities should be arranged so that they are to the greatest reasonable benefit of the least advantaged. Such a proposal is consistent with prioritarianism and provides some practical guidance for non-ideal societies--that is, societies that do not have the endless amount of resources needed to satisfy every requirement of justice.

  1. The principle of equivalence

    International Nuclear Information System (INIS)

    Unnikrishnan, C.S.

    1994-01-01

    Principle of equivalence was the fundamental guiding principle in the formulation of the general theory of relativity. What are its key elements? What are the empirical observations which establish it? What is its relevance to some new experiments? These questions are discussed in this article. (author). 11 refs., 5 figs

  2. The Dutch premium principle

    NARCIS (Netherlands)

    van Heerwaarden, A.E.; Kaas, R.

    1992-01-01

    A premium principle is derived, in which the loading for a risk is the reinsurance loading for an excess-of-loss cover. It is shown that the principle is well-behaved in the sense that it results in larger premiums for risks that are larger in stop-loss order or in stochastic dominance.

  3. A new computing principle

    International Nuclear Information System (INIS)

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954 while reviewing the theory of communication and cybernetics the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle

  4. The anthropic principle

    International Nuclear Information System (INIS)

    Carr, B.J.

    1982-01-01

    The anthropic principle (the conjecture that certain features of the world are determined by the existence of Man) is discussed and the objections to it are listed. It is stated that nearly all the constants of nature may be determined by the anthropic principle, which does not give exact values for the constants but only their orders of magnitude. (J.T.)

  5. Mach's holographic principle

    International Nuclear Information System (INIS)

    Khoury, Justin; Parikh, Maulik

    2009-01-01

    Mach's principle is the proposition that inertial frames are determined by matter. We put forth and implement a precise correspondence between matter and geometry that realizes Mach's principle. Einstein's equations are not modified and no selection principle is applied to their solutions; Mach's principle is realized wholly within Einstein's general theory of relativity. The key insight is the observation that, in addition to bulk matter, one can also add boundary matter. Given a space-time, and thus the inertial frames, we can read off both boundary and bulk stress tensors, thereby relating matter and geometry. We consider some global conditions that are necessary for the space-time to be reconstructible, in principle, from bulk and boundary matter. Our framework is similar to that of the black hole membrane paradigm and, in asymptotically anti-de Sitter space-times, is consistent with holographic duality.

  6. Teaching Statistical Principles with a Roulette Simulation

    Directory of Open Access Journals (Sweden)

    Graham D Barr

    2013-03-01

    Full Text Available This paper uses the game of roulette in a simulation setting to teach students in an introductory Stats course some basic issues in theoretical and empirical probability. Using an Excel spreadsheet with embedded VBA (Visual Basic for Applications), one can simulate the empirical return and empirical standard deviation for a range of bets in Roulette over some predetermined number of plays. In particular, the paper illustrates the difference between playing strategies by contrasting a low payout bet (say a bet on “red”) and a high payout bet (say a bet on a particular number), considering the expected return and volatility associated with each bet. The paper includes an Excel VBA based simulation of the Roulette wheel where students can make bets and monitor the return on the bets for one play or multiple plays. In addition it includes a simulation of the casino house advantage for repeated multiple plays; that is, it allows students to see how casinos may derive a near-certain return equal to the house advantage by entertaining large numbers of bets, which will systematically drive the volatility of the house advantage down to zero. This simulation has been shown to be especially effective at the University of Cape Town for teaching first year Statistics students the subtler points of probability, as well as encouraging discussions around the risk-return trade-off facing gamblers. The program has also been shown to be useful for teaching students the principles of theoretical and empirical probabilities as well as an understanding of volatility.
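
    The comparison the paper draws between a low-payout and a high-payout bet can be reproduced with a few lines of simulation. The sketch below is written in Python rather than the authors' Excel/VBA workbook and assumes a European wheel with a single zero; the "red" bet is approximated by the 18 odd non-zero pockets, which has the same win probability as the true red set.

```python
# Minimal simulation sketch in the spirit of the record (not the authors'
# Excel/VBA workbook): compare the empirical return and volatility of a
# low-payout bet (red, pays 1:1) with a high-payout bet (single number,
# pays 35:1) on a European wheel with one zero.
import random
import statistics

def play(bet: str, n_spins: int = 100_000) -> tuple[float, float]:
    """Return (mean profit per unit stake, standard deviation) for a bet type."""
    outcomes = []
    for _ in range(n_spins):
        pocket = random.randint(0, 36)               # 0 plus 1..36
        if bet == "red":
            win = pocket != 0 and pocket % 2 == 1    # crude stand-in: 18 of 36 pockets
            outcomes.append(1 if win else -1)
        else:                                        # straight-up bet on number 17
            outcomes.append(35 if pocket == 17 else -1)
    return statistics.mean(outcomes), statistics.pstdev(outcomes)

for bet in ("red", "straight"):
    mean, sd = play(bet)
    print(f"{bet:8s} mean return {mean:+.4f}, std dev {sd:.2f}")   # both means ~ -1/37
```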

  7. The principle of proportionality revisited: interpretations and applications.

    Science.gov (United States)

    Hermerén, Göran

    2012-11-01

    The principle of proportionality is used in many different contexts. Some of these uses and contexts are first briefly indicated. This paper focusses on the use of this principle as a moral principle. I argue that under certain conditions the principle of proportionality is helpful as a guide in decision-making. But it needs to be clarified and to be used with some flexibility as a context-dependent principle. Several interpretations of the principle are distinguished, using three conditions as a starting point: importance of objective, relevance of means, and most favourable option. The principle is then tested against an example, which suggests that a fourth condition, focusing on non-excessiveness, needs to be added. I will distinguish between three main interpretations of the principle, some primarily with uses in research ethics, others with uses in other areas of bioethics, for instance in comparisons of therapeutic means and ends. The relations between the principle of proportionality and the precautionary principle are explored in the following section. It is concluded that the principles are different and may even clash. In the next section the principle of proportionality is applied to some medical examples drawn from research ethics and bioethics. In concluding, the status of the principle of proportionality as a moral principle is discussed. What has been achieved so far and what remains to be done is finally summarized.

  8. "Essential Principles of Economics:" A Hypermedia Textbook.

    Science.gov (United States)

    McCain, Roger A.

    2000-01-01

    Discusses an electronic textbook called "Essential Principles of Economics." Explains that economic concepts are found by following links from the table of contents, while each chapter includes both expository information and interactive material including online multiple-choice drill questions. States that the textbook is a "work…

  9. PRINCIPLES AND PROCEDURES ON FISCAL

    Directory of Open Access Journals (Sweden)

    Morar Ioan Dan

    2011-07-01

    Full Text Available Fiscal science appears in most analytical situations, and its principles are reiterated by specialists in the field in various specialized works. The two components of taxation, the tax system on the theoretical side and the practical procedures relating to tax, are marked by frequent references to and invocations of the principles underlying taxation. This paper attempts a return to fiscal equity as a general principle, one often invoked and used to justify tax policies but just as often violated by the tax laws. It also emphasizes the importance of devising procedures that ensure equitable fiscal treatment of taxpayers. The specific approach of this paper is based on the notion that tax equity rests on equality before the tax law, and that social policies of the executive would be more effective than the use of tax instruments for the same purpose. If the scientific justification of unequal treatment in tax law is based on the various social problems of taxpayers, then the argument deviates from the issue of tax fairness towards explaining the need to promote social policies that are usually more attractive to taxpayers. Modern tax techniques are believed to be promoted above all to ensure an increasing level of efficiency at the expense of the obligation to ensure taxpayers' equality before the tax law. On the other hand, reactions to tax inequities affect multiple recipients of the budget, while the consequences of unfair measures cannot be quantified and the timeline of the reaction is usually not known, even though statistics show fluctuations in budgetary revenues and the literature often finds a relevant connection between changes in government policies, budget execution and outcomes. The effects of tax inequality on tax procedures and budgetary revenues are difficult to quantify and are, among other things, the subject of this work, as is the provision of tax equity in combination with the principles of non-discrimination and neutrality.

  10. Determine point-to-point networking interactions using regular expressions

    Directory of Open Access Journals (Sweden)

    Konstantin S. Deev

    2015-06-01

    Full Text Available As the Internet grows and becomes more popular, the number of concurrent data flows keeps increasing, and so does the bandwidth requested. Providers and corporate customers need the ability to identify point-to-point interactions. The best option is to use special software and hardware implementations that distribute the load across the internals of the complex, using the principles and approaches described, in particular, in this paper. The paper presents the principles of building a system that searches for regular expression matches using computation on the graphics adapter of a server station. The significant computing power and parallel execution capability of a modern graphics processor allow large amounts of data to be inspected against sets of rules. Using these characteristics can increase computing power by a factor of 30 to 40 compared to the same setups on the central processing unit. The potential increase in bandwidth capacity could be used in systems that provide packet analysis, firewalls and network anomaly detectors.
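
    The core task described, matching the traffic of point-to-point flows against a set of regular-expression rules, is sketched below on the CPU only; the record's GPU implementation is not reproduced. The rule set and the example flows are made up for illustration.

```python
# CPU-side sketch of the rule-matching task described (the record's GPU
# implementation is not reproduced here): classify point-to-point flows by
# matching their payloads against a set of compiled regular expressions.
# The rules and flows below are made-up examples.
import re

RULES = {
    "http_get":  re.compile(rb"^GET\s+\S+\s+HTTP/1\.[01]"),
    "dns_query": re.compile(rb"\x00\x01\x00\x00\x00\x01"),   # toy signature
    "smtp_helo": re.compile(rb"^(HELO|EHLO)\s+\S+", re.IGNORECASE),
}

def classify(payload: bytes) -> list[str]:
    """Return the names of all rules whose pattern matches the payload."""
    return [name for name, pattern in RULES.items() if pattern.search(payload)]

flows = [
    (("10.0.0.5", 52344), ("93.184.216.34", 80), b"GET /index.html HTTP/1.1\r\n"),
    (("10.0.0.7", 40112), ("10.0.0.1", 25),      b"EHLO mail.example.org\r\n"),
]
for src, dst, payload in flows:
    print(src, "->", dst, classify(payload))
```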

  11. Architectural Principles for Orchestration of Cross-Organizational Service Delivery: Case Studies from the Netherlands

    Science.gov (United States)

    van Veenstra, Anne Fleur; Janssen, Marijn

    One of the main challenges for e-government is to create coherent services for citizens and businesses. Realizing Integrated Service Delivery (ISD) requires government agencies to collaborate across their organizational boundaries. The coordination of processes across multiple organizations to realize ISD is called orchestration. One way of achieving orchestration is to formalize processes using architecture. In this chapter we identify architectural principles for orchestration by looking at three case studies of cross-organizational service delivery chain formation in the Netherlands. In total, six generic principles were formulated and subsequently validated in two workshops with experts. These principles are: (i) build an intelligent front office, (ii) give processes a clear starting point and end, (iii) build a central workflow application keeping track of the process, (iv) differentiate between simple and complex processes, (v) ensure that the decision-making responsibility and the overview of the process are not performed by the same process role, and (vi) create a central point where risk profiles are maintained. Further research should focus on how organizations can adapt these principles to their own situation.

  12. Limitations of Boltzmann's principle

    International Nuclear Information System (INIS)

    Lavenda, B.H.

    1995-01-01

    The usual form of Boltzmann's principle assures that maximum entropy, or entropy reduction, occurs with maximum probability, implying a unimodal distribution. Boltzmann's principle cannot be applied to nonunimodal distributions, like the arcsine law, because the entropy may be concave only over a limited portion of the interval. The method of subordination shows that the arcsine distribution corresponds to a process with a single degree of freedom, thereby confirming the invalidation of Boltzmann's principle. The fractalization of time leads to a new distribution in which arcsine and Cauchy distributions can coexist simultaneously for nonintegral degrees of freedom between √2 and 2

  13. Biomedical engineering principles

    CERN Document Server

    Ritter, Arthur B; Valdevit, Antonio; Ascione, Alfred N

    2011-01-01

    Introduction: Modeling of Physiological Processes; Cell Physiology and Transport; Principles and Biomedical Applications of Hemodynamics; A Systems Approach to Physiology; The Cardiovascular System; Biomedical Signal Processing; Signal Acquisition and Processing; Techniques for Physiological Signal Processing; Examples of Physiological Signal Processing; Principles of Biomechanics; Practical Applications of Biomechanics; Biomaterials; Principles of Biomedical Capstone Design; Unmet Clinical Needs; Entrepreneurship: Reasons why Most Good Designs Never Get to Market; An Engineering Solution in Search of a Biomedical Problem

  14. Modern electronic maintenance principles

    CERN Document Server

    Garland, DJ

    2013-01-01

    Modern Electronic Maintenance Principles reviews the principles of maintaining modern, complex electronic equipment, with emphasis on preventive and corrective maintenance. Unfamiliar subjects such as the half-split method of fault location, functional diagrams, and fault finding guides are explained. This book consists of 12 chapters and begins by stressing the need for maintenance principles and discussing the problem of complexity as well as the requirements for a maintenance technician. The next chapter deals with the connection between reliability and maintenance and defines the terms fai

  15. [Bioethics of principles].

    Science.gov (United States)

    Pérez-Soba Díez del Corral, Juan José

    2008-01-01

    Bioethics emerges around the technological problems of intervening in human life. The problem of determining moral limits also emerges, because such limits seem external to this practice. Principle-based bioethics takes its rationality from teleological thinking and from autonomism. This divergence manifests the epistemological fragility and the great difficulty of "moral" thinking. This is evident in the determination of the principle of autonomy, which lacks the ethical content of Kant's proposal. We need a new ethical rationality, with a new reflection on new principles that emerge from basic ethical experiences.

  16. Principles of dynamics

    CERN Document Server

    Hill, Rodney

    2013-01-01

    Principles of Dynamics presents classical dynamics primarily as an exemplar of scientific theory and method. This book is divided into three major parts concerned with gravitational theory of planetary systems; general principles of the foundations of mechanics; and general motion of a rigid body. Some of the specific topics covered are Keplerian Laws of Planetary Motion; gravitational potential and potential energy; and fields of axisymmetric bodies. The principles of work and energy, fictitious body-forces, and inertial mass are also looked into. Other specific topics examined are kinematics

  17. Hamilton's principle for beginners

    International Nuclear Information System (INIS)

    Brun, J L

    2007-01-01

    I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a line. Next, students are challenged to add gravity to reinforce the argument and, finally, a two-dimensional motion in a vertical plane is considered. Furthermore these examples force us to be very clear about such an abstract principle

  18. Developing principles of growth

    DEFF Research Database (Denmark)

    Neergaard, Helle; Fleck, Emma

    of the principles of growth among women-owned firms. Using an in-depth case study methodology, data was collected from women-owned firms in Denmark and Ireland, as these countries are similar in contextual terms, e.g. population and business composition, dominated by micro, small and medium-sized enterprises. ... Extending on principles put forward in effectuation theory, we propose that women grow their firms according to five principles which enable women’s enterprises to survive in the face of crises such as the current financial world crisis.

  19. Matter tensor from the Hilbert variational principle

    International Nuclear Information System (INIS)

    Pandres, D. Jr.

    1976-01-01

    We consider the Hilbert variational principle which is conventionally used to derive Einstein's equations for the source-free gravitational field. We show that at least one version of the equivalence principle suggests an alternative way of performing the variation, resulting in a different set of Einstein equations with sources automatically present. This illustrates a technique which may be applied to any theory that is derived from a variational principle and that admits a gauge group. The essential point is that, if one first imposes a gauge condition and then performs the variation, one obtains field equations with source terms which do not appear if one first performs the variation and then imposes the gauge condition. A second illustration is provided by the variational principle conventionally used to derive Maxwell's equations for the source-free electromagnetic field. If one first imposes the Lorentz gauge condition and then performs the variation, one obtains Maxwell's equations with sources present

  20. A multiplicity logic unit

    International Nuclear Information System (INIS)

    Bialkowski, J.; Moszynski, M.; Zagorski, A.

    1981-01-01

    The logic diagram, principle of operation and some details of the design of the multiplicity logic unit are presented. This unit was specially designed to fulfil the requirements of a multidetector arrangement for gamma-ray multiplicity measurements. The unit is equipped with 16 inputs controlled by a common coincidence gate. It delivers a linear output pulse with a height proportional to the multiplicity of coincidences, and logic pulses corresponding to 0, 1, ... up to >= 5-fold coincidences. These last outputs are used to steer the routing unit working with the multichannel analyser. (orig.)

  1. Vaccinology: principles and practice

    National Research Council Canada - National Science Library

    Morrow, John

    2012-01-01

    ... principles to implementation. This is an authoritative textbook that details a comprehensive and systematic approach to the science of vaccinology focusing on not only basic science, but the many stages required to commercialize...

  2. On the invariance principle

    Energy Technology Data Exchange (ETDEWEB)

    Moller-Nielsen, Thomas [University of Oxford (United Kingdom)

    2014-07-01

    Physicists and philosophers have long claimed that the symmetries of our physical theories - roughly speaking, those transformations which map solutions of the theory into solutions - can provide us with genuine insight into what the world is really like. According to this 'Invariance Principle', only those quantities which are invariant under a theory's symmetries should be taken to be physically real, while those quantities which vary under its symmetries should not. Physicists and philosophers, however, are generally divided (or, indeed, silent) when it comes to explaining how such a principle is to be justified. In this paper, I spell out some of the problems inherent in other theorists' attempts to justify this principle, and sketch my own proposed general schema for explaining how - and when - the Invariance Principle can indeed be used as a legitimate tool of metaphysical inference.

  3. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    .... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  4. Minimum entropy production principle

    Czech Academy of Sciences Publication Activity Database

    Maes, C.; Netočný, Karel

    2013-01-01

    Vol. 8, No. 7 (2013), pp. 9664-9677. ISSN 1941-6016. Institutional support: RVO:68378271. Keywords: MINEP. Subject RIV: BE - Theoretical Physics. http://www.scholarpedia.org/article/Minimum_entropy_production_principle

  5. Global ethics and principlism.

    Science.gov (United States)

    Gordon, John-Stewart

    2011-09-01

    This article examines the special relation between common morality and particular moralities in the four-principles approach and its use for global ethics. It is argued that the special dialectical relation between common morality and particular moralities is the key to bridging the gap between ethical universalism and relativism. The four-principles approach is a good model for a global bioethics by virtue of its ability to mediate successfully between universal demands and cultural diversity. The principle of autonomy (i.e., the idea of individual informed consent), however, does need to be revised so as to make it compatible with alternatives such as family- or community-informed consent. The upshot is that the contribution of the four-principles approach to global ethics lies in the so-called dialectical process and its power to deal with cross-cultural issues against the background of universal demands by joining them together.

  6. Gauge principle for hyper(para) fields

    Energy Technology Data Exchange (ETDEWEB)

    Govorkov, A.B. (Joint Inst. for Nuclear Research, Dubna (USSR))

    1983-04-01

    A special representation for parafields is considered which is based on the use of the Clifford hypernumbers. The principle of gauge invariance under hypercomplex phase transformations of parafields is formulated. A special role of quaternion hyperfields and corresponding Yang-Mills lagrangian with the gauge SO(3)-symmetry is pointed out.

  7. Microprocessors principles and applications

    CERN Document Server

    Debenham, Michael J

    1979-01-01

    Microprocessors: Principles and Applications deals with the principles and applications of microprocessors and covers topics ranging from computer architecture and programmed machines to microprocessor programming, support systems and software, and system design. A number of microprocessor applications are considered, including data processing, process control, and telephone switching. This book is comprised of 10 chapters and begins with a historical overview of computers and computing, followed by a discussion on computer architecture and programmed machines, paying particular attention to t

  8. Electrical and electronic principles

    CERN Document Server

    Knight, S A

    1991-01-01

    Electrical and Electronic Principles, 2, Second Edition covers the syllabus requirements of BTEC Unit U86/329, including the principles of control systems and elements of data transmission. The book first tackles series and parallel circuits, electrical networks, and capacitors and capacitance. Discussions focus on flux density, electric force, permittivity, Kirchhoff's laws, superposition theorem, arrangement of resistors, internal resistance, and powers in a circuit. The text then takes a look at capacitors in circuit, magnetism and magnetization, electromagnetic induction, and alternating v

  9. Microwave system engineering principles

    CERN Document Server

    Raff, Samuel J

    1977-01-01

    Microwave System Engineering Principles focuses on the calculus, differential equations, and transforms of microwave systems. This book discusses the basic nature and principles that can be derived from thermal noise; statistical concepts and binomial distribution; incoherent signal processing; basic properties of antennas; and beam widths and useful approximations. The fundamentals of propagation; LaPlace's Equation and Transmission Line (TEM) waves; interfaces between homogeneous media; modulation, bandwidth, and noise; and communications satellites are also deliberated in this text. This bo

  10. Electrical and electronic principles

    CERN Document Server

    Knight, SA

    1988-01-01

    Electrical and Electronic Principles, 3 focuses on the principles involved in electrical and electronic circuits, including impedance, inductance, capacitance, and resistance.The book first deals with circuit elements and theorems, D.C. transients, and the series circuits of alternating current. Discussions focus on inductance and resistance in series, resistance and capacitance in series, power factor, impedance, circuit magnification, equation of charge, discharge of a capacitor, transfer of power, and decibels and attenuation. The manuscript then examines the parallel circuits of alternatin

  11. Remark on Heisenberg's principle

    International Nuclear Information System (INIS)

    Noguez, G.

    1988-01-01

    Application of Heisenberg's principle to inertial frame transformations allows a distinction between three commutative groups of reciprocal transformations along one direction: Galilean transformations, dual transformations, and Lorentz transformations. These are three conjugate groups and for a given direction, the related commutators are all proportional to one single conjugation transformation which compensates for uniform and rectilinear motions. The three transformation groups correspond to three complementary ways of measuring space-time as a whole. Heisenberg's Principle then gets another explanation [fr

  12. The point of 6 sigma

    International Nuclear Information System (INIS)

    An, Yeong Jin

    2000-07-01

    This book describes the essentials of 6 sigma. Its topics are: what 6 sigma is, the concept of sigma, the Motorola 3.4 ppm benchmark, centering error, the purpose of 6 sigma, the 6 sigma principle, the eight steps of the innovation strategy, the easy-system step of the 6 sigma innovation strategy, measurement standards for 6 sigma outcomes, the main roles in 6 sigma, acknowledgment and reward, 6 sigma characteristics, 6 sigma effects, 6 sigma applications, and the problems that arise when 6 sigma is introduced.
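
    The 3.4 ppm figure listed above is usually obtained from the normal distribution by allowing the conventional 1.5-sigma long-term shift of the process mean; the following is a minimal worked sketch of that arithmetic, not material from the book.

    ```python
    # Worked illustration (not from the book): a "6 sigma" process with the
    # conventional 1.5-sigma long-term shift leaves a one-sided 4.5-sigma tail,
    # which is where the familiar 3.4 defects-per-million figure comes from.
    from statistics import NormalDist

    shifted_z = 6.0 - 1.5                          # effective one-sided z-score
    defect_rate = 1 - NormalDist().cdf(shifted_z)  # upper-tail probability
    print(f"defects per million opportunities: {defect_rate * 1e6:.1f}")  # ~3.4
    ```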

  13. On the Bourbaki-Witt principle in toposes

    Science.gov (United States)

    Bauer, Andrej; Lumsdaine, Peter Lefanu

    2013-07-01

    The Bourbaki-Witt principle states that any progressive map on a chain-complete poset has a fixed point above every point. It is provable classically, but not intuitionistically. We study this and related principles in an intuitionistic setting. Among other things, we show that Bourbaki-Witt fails exactly when the trichotomous ordinals form a set, but does not imply that fixed points can always be found by transfinite iteration. Meanwhile, on the side of models, we see that the principle fails in realisability toposes, and does not hold in the free topos, but does hold in all cocomplete toposes.
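
    For reference, the principle quoted in the first sentence can be stated formally as follows (a paraphrase of the abstract, not the authors' exact wording):

    ```latex
    % Bourbaki-Witt, paraphrased from the abstract above.
    Let $(P,\le)$ be a chain-complete poset (every chain has a supremum) and let
    $f\colon P\to P$ be progressive, i.e.\ $x\le f(x)$ for all $x\in P$. Then for
    every $a\in P$ there exists $p\in P$ with $a\le p$ and $f(p)=p$.
    ```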

  14. Multiple solid-phase microextraction

    NARCIS (Netherlands)

    Koster, EHM; de Jong, GJ

    2000-01-01

    Theoretical aspects of multiple solid-phase microextraction are described and the principle is illustrated with the extraction of lidocaine from aqueous solutions. With multiple extraction under non-equilibrium conditions considerably less time is required in order to obtain an extraction yield that
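
    The cumulative bookkeeping behind multiple extraction can be sketched as follows, under the assumption that each step removes a fixed fraction of the analyte remaining in the sample; this is an illustration, not necessarily the authors' exact working.

    ```python
    # Illustration only (assumes each extraction removes the same fraction f of
    # the analyte still in the sample): the extracted amounts then decline
    # geometrically, so the total amount can be extrapolated from the first steps.
    f, a0 = 0.30, 100.0                         # extraction fraction, initial amount (a.u.)
    amounts, remaining = [], a0
    for _ in range(4):                          # four successive extractions
        extracted = f * remaining
        amounts.append(extracted)
        remaining -= extracted

    ratio = amounts[1] / amounts[0]             # equals 1 - f in this model
    total_estimate = amounts[0] / (1 - ratio)   # geometric-series extrapolation
    print([round(a, 1) for a in amounts], round(total_estimate, 1))  # -> 100.0
    ```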

  15. General principles of radiotherapy

    International Nuclear Information System (INIS)

    Easson, E.C.

    1985-01-01

    The daily practice of any established branch of medicine should be based on some acceptable principles. This chapter is concerned with the general principles on which the radiotherapy of the Manchester school is based. Though many radiotherapists in other centres would doubtless accept these principles, there are sufficiently wide differences in practice throughout the world to suggest that some therapists adhere to a fundamentally different philosophy. The authors believe it is important, especially for those beginning their formal training in radiotherapy, to subscribe to an internally consistent school of thought, employing methods of treatment for each type of lesion in each anatomical site that are based on accepted principles and subjected to continuous rigorous scrutiny to test their effectiveness. Not only must each therapeutic technique be evaluated, but the underlying principles too must be questioned if and when this seems indicated. It is a feature of this hospital that similar lesions are all treated by the same technique, so long as statistical evidence justifies such a policy. All members of the staff adhere to the accepted policy until or unless reliable reasons are adduced to change this policy

  16. Principles of Eliminating Access Control Lists within a Domain

    Directory of Open Access Journals (Sweden)

    Vic Grout

    2012-04-01

    The infrastructure of large networks is broken down into areas that share a common security policy, called domains. Security within a domain is commonly implemented at all nodes, but this can degrade performance because it introduces a delay associated with packet filtering. When Access Control Lists (ACLs) are used within a router for this purpose, a significant processing overhead is introduced. Identical checks are likely to be made at multiple points within a domain before a packet reaches its destination. By eliminating ACLs within the domain and giving the ingress/egress points equivalent functionality, the overall performance can therefore be improved. This paper considers the effect of these delays when using router operating systems that offer different levels of functionality. It examines the factors that contribute to the delay, particularly those due to ACLs, and builds a model from theoretical principles adjusted by practical calculation. The paper also provides an example of an optimized solution that reduces the delay through network routers by distributing the security rules to the ingress/egress points of the domain without affecting the security policy.
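
    The performance argument in this abstract can be made concrete with a toy delay model; the numbers and the linear per-rule cost below are assumptions for illustration, not the paper's measured figures.

    ```python
    # Toy model (assumed numbers, not the paper's measurements): if every router
    # on an H-hop path inside the domain applies the same N-rule ACL, the
    # filtering delay is paid H times; moving the rules to the ingress/egress
    # points means it is paid only at the two domain edges.
    def acl_delay_us(rules, per_rule_us=0.2, fixed_us=5.0):
        """Linear toy cost: fixed lookup overhead plus a per-rule matching cost."""
        return fixed_us + rules * per_rule_us

    hops, rules = 6, 500
    every_node = hops * acl_delay_us(rules)   # ACLs enforced at every router
    edge_only  = 2 * acl_delay_us(rules)      # equivalent rules at ingress/egress only
    print(f"per-packet filtering delay: {every_node:.0f} us vs {edge_only:.0f} us")
    ```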

  17. Ethical principles of scientific communication

    Directory of Open Access Journals (Sweden)

    Baranov G. V.

    2017-03-01

    The article presents the principles of ethical management of scientific communication. The author argues for the priority of the ethical principle of the scientist's social responsibility.

  18. Multiple inflation

    International Nuclear Information System (INIS)

    Murphy, P.J.

    1987-01-01

    The Theory of Inflation, namely, that at some point the entropy content of the universe was greatly increased, has much promise. It may solve the puzzles of homogeneity and the creation of structure. However, no particle physics model has yet been found that can successfully drive inflation. The difficulty in satisfying the constraint that the isotropy of the microwave background places on the effective potential of prospective models is immense. In this work we have codified the requirements of such models in a most general form. We have carefully calculated the amounts of inflation the various problems of the Standard Model need for their solution. We have derived a completely model-independent upper bound on the inflationary Hubble parameter. We have developed a general notation with which to probe the possibilities of Multiple Inflation. We have shown that only in very unlikely circumstances will any evidence of an earlier inflation survive the de Sitter period of its successor. In particular, it is demonstrated that it is most unlikely that two bouts of inflation will yield high amplitudes of density perturbations on small scales and low amplitudes on large scales. We conclude that, while multiple inflation will be of great theoretical interest, it is unlikely to have any observational impact.

  19. On the correspondence between quantum and classical variational principles

    International Nuclear Information System (INIS)

    Ruiz, D.E.; Dodin, I.Y.

    2015-01-01

    Classical variational principles can be deduced from quantum variational principles via formal reparameterization of the latter. It is shown that such reparameterization is possible without invoking any assumptions other than classicality and without appealing to dynamical equations. As examples, first principle variational formulations of classical point-particle and cold-fluid motion are derived from their quantum counterparts for Schrödinger, Pauli, and Klein–Gordon particles

  20. The hologram principles and techniques

    CERN Document Server

    Richardson, Martin J

    2018-01-01

    Written by Martin Richardson (an acclaimed leader and pioneer in the field) and John Wiltshire, The Hologram: Principles and Techniques is an important book that explores the various types of hologram in their multiple forms and explains how to create and apply the technology. The authors offer an insightful overview of the currently available recording materials, chemical formulas, and laser technology that includes the history of phase imaging and laser science. Accessible and comprehensive, the text contains a step-by-step guide to the production of holograms. In addition, The Hologram outlines the most common problems encountered in producing satisfactory images in the laboratory, as well as dealing with the wide range of optical and chemical techniques used in commercial holography. The Hologram is a well-designed instructive tool, involving three distinct disciplines: physics, chemistry, and graphic arts. This vital resource offers a guide to the development and understanding of the recording of mater...

  1. Ethical principles and theories.

    Science.gov (United States)

    Schultz, R C

    1993-01-01

    Ethical theory about what is right and good in human conduct lies behind the issues practitioners face and the codes they turn to for guidance; it also provides guidance for actions, practices, and policies. Principles of obligation, such as egoism, utilitarianism, and deontology, offer general answers to the question, "Which acts/practices are morally right?" A re-emerging alternative to using such principles to assess individual conduct is to center normative theory on personal virtues. For structuring society's institutions, principles of social justice offer alternative answers to the question, "How should social benefits and burdens be distributed?" But human concerns about right and good call for more than just theoretical responses. Some critics (eg, the postmodernists and the feminists) charge that normative ethical theorizing is a misguided enterprise. However, that charge should be taken as a caution and not as a refutation of normative ethical theorizing.

  2. Principles of musical acoustics

    CERN Document Server

    Hartmann, William M

    2013-01-01

    Principles of Musical Acoustics focuses on the basic principles in the science and technology of music. Musical examples and specific musical instruments demonstrate the principles. The book begins with a study of vibrations and waves, in that order. These topics constitute the basic physical properties of sound, one of two pillars supporting the science of musical acoustics. The second pillar is the human element, the physiological and psychological aspects of acoustical science. The perceptual topics include loudness, pitch, tone color, and localization of sound. With these two pillars in place, it is possible to go in a variety of directions. The book treats in turn, the topics of room acoustics, audio both analog and digital, broadcasting, and speech. It ends with chapters on the traditional musical instruments, organized by family. The mathematical level of this book assumes that the reader is familiar with elementary algebra. Trigonometric functions, logarithms and powers also appear in the book, but co...

  3. Itch Management: General Principles.

    Science.gov (United States)

    Misery, Laurent

    2016-01-01

    Like pain, itch is a challenging condition that needs to be managed. Within this setting, the first principle of itch management is to get an appropriate diagnosis to perform an etiology-oriented therapy. In several cases it is not possible to treat the cause, the etiology is undetermined, there are several causes, or the etiological treatment is not effective enough to alleviate itch completely. This is also why there is need for symptomatic treatment. In all patients, psychological support and associated pragmatic measures might be helpful. General principles and guidelines are required, yet patient-centered individual care remains fundamental. © 2016 S. Karger AG, Basel.

  4. Principles of Optics

    Science.gov (United States)

    Born, Max; Wolf, Emil

    1999-10-01

    Principles of Optics is one of the classic science books of the twentieth century, and probably the most influential book in optics published in the past forty years. This edition has been thoroughly revised and updated, with new material covering the CAT scan, interference with broad-band light and the so-called Rayleigh-Sommerfeld diffraction theory. This edition also details scattering from inhomogeneous media and presents an account of the principles of diffraction tomography to which Emil Wolf has made a basic contribution. Several new appendices are also included. This new edition will be invaluable to advanced undergraduates, graduate students and researchers working in most areas of optics.

  5. Principles of statistics

    CERN Document Server

    Bulmer, M G

    1979-01-01

    There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo

  6. Indeterminacy and the principle of need.

    Science.gov (United States)

    Herlitz, Anders

    2017-02-01

    The principle of need (the idea that resources should be allocated according to need) is often invoked in priority setting in the health care sector. In this article, I argue that a reasonable principle of need must be indeterminate, and examine three different ways that this can be dealt with: appendicizing the principle with further principles, imposing determinacy, or empowering decision makers. I argue that need must be conceptualized as a composite property composed of at least two factors: health shortfall and capacity to benefit. When one examines how the different factors relate to each other, one discovers that the relation is sometimes indeterminate. I illustrate this indeterminacy by applying the small improvement argument: if the relation between the factors were always determinate, the comparative relation would change under any small adjustment; yet, if two needs are dissimilar but of seemingly equal magnitude, the comparative relation does not change under a small adjustment of one of the factors. I then outline arguments in favor of each of the three strategies for dealing with indeterminacy, but also point out that all strategies have significant shortcomings. More research is needed concerning how to deal with this indeterminacy, and the most promising path seems to be to scrutinize the position of the principle of need among a plurality of relevant principles for priority setting in the health care sector.

  7. Comments on 'On a proposed new test of Heisenberg's principle'

    International Nuclear Information System (INIS)

    Home, D.; Sengupta, S.

    1981-01-01

    A logical fallacy is pointed out in Robinson's analysis (J. Phys. A.; 13:877 (1980)) of a thought experiment purporting to show violation of Heisenberg's uncertainty principle. The real problem concerning the interpretation of Heisenberg's principle is precisely stated. (author)

  8. Achieving integration in mixed methods designs-principles and practices.

    Science.gov (United States)

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. © Health Research and Educational Trust.

  9. Achieving Integration in Mixed Methods Designs—Principles and Practices

    Science.gov (United States)

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  10. Blending the most fundamental Remote-Sensing principles (RS ...

    African Journals Online (AJOL)

    Blending the most fundamental Remote-Sensing principles (RS) with the most functional spatial knowledge (GIS) with the objective of the determination of the accident-prone palms and points (case study: Tehran-Hamadan Highway on Saveh Superhighway)

  11. Selective analysis of power plant operation on the Hudson River with emphasis on the Bowline Point Generating Station. Volume 2. [Multiple impact of power plant once-through cooling systems on fish populations

    Energy Technology Data Exchange (ETDEWEB)

    Barnthouse, L. W.; Cannon, J. B.; Christensen, S. G.

    1977-07-01

    Because of the location of the Bowline, Roseton, and Indian Point power generating facilities in the low-salinity zone of the Hudson estuary, operation of these plants with the present once-through cooling systems will adversely influence the fish populations that use the area for spawning and initial periods of growth and development. Recruitment rates and standing crops of several fish species may be lowered in response to the increased mortality caused by entrainment of nonscreenable eggs and larvae and by impingement of screenable young of the year. Entrainment and impingement data are particularly relevant for assessing which fish species have the greatest potential for being adversely affected by operation of Bowline, Roseton, and Indian Point with once-through cooling. These data from each of these three plants suggest that the six species that merit the greatest consideration are striped bass, white perch, tomcod, alewife, blueback herring, and bay anchovy. Two points of view are available for assessing the relative importance of the fish species in the Hudson River. From the fisheries point of view, the only two species of major importance are striped bass and shad. From the fish-community and ecosystem point of view, the dominant species, as determined by seasonal and regional standing crops (in numbers and biomass per hectare), are the six species most commonly entrained and impinged, namely, striped bass, white perch, tomcod, alewife, blueback herring, and anchovy.

  12. Principles of computational fluid dynamics

    CERN Document Server

    Wesseling, Pieter

    2001-01-01

    The book is aimed at graduate students, researchers, engineers and physicists involved in flow computations. An up-to-date account is given of the present state-of-the-art of numerical methods employed in computational fluid dynamics. The underlying numerical principles are treated with a fair amount of detail, using elementary mathematical analysis. Attention is given to difficulties arising from geometric complexity of the flow domain and of nonuniform structured boundary-fitted grids. Uniform accuracy and efficiency for singular perturbation problems is studied, pointing the way to accurate computation of flows at high Reynolds number. Much attention is given to stability analysis, and useful stability conditions are provided, some of them new, for many numerical schemes used in practice. Unified methods for compressible and incompressible flows are discussed. Numerical analysis of the shallow-water equations is included. The theory of hyperbolic conservation laws is treated. Godunov's order barrier and ho...

  13. The Principles of Readability

    Science.gov (United States)

    DuBay, William H.

    2004-01-01

    The principles of readability are in every style manual. Readability formulas are in every writing aid. What is missing is the research and theory on which they stand. This short review of readability research spans 100 years. The first part covers the history of adult literacy studies in the U.S., establishing the stratified nature of the adult…

  14. Principles of electrodynamics

    CERN Document Server

    Schwartz, Melvin

    1972-01-01

    This advanced undergraduate- and graduate-level text by the 1988 Nobel Prize winner establishes the subject's mathematical background, reviews the principles of electrostatics, then introduces Einstein's special theory of relativity and applies it throughout the book in topics ranging from Gauss' theorem and Coulomb's law to electric and magnetic susceptibility.

  15. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts such as failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown.
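
    The two fundamental concepts named in the abstract are linked by a standard relation, sketched below; the relation P_f = Phi(-beta) is textbook material, not a result specific to this paper, and the value of beta is an arbitrary example.

    ```python
    # Standard textbook relation (not specific to this paper): the reliability
    # index beta and the failure probability P_f satisfy P_f = Phi(-beta),
    # where Phi is the standard normal cumulative distribution function.
    from statistics import NormalDist

    beta = 4.0                                # example reliability index
    p_f = NormalDist().cdf(-beta)             # corresponding failure probability
    print(f"beta = {beta:.1f}  ->  P_f = {p_f:.2e}")   # ~3.2e-05
    ```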

  16. The Idiom Principle Revisited

    Science.gov (United States)

    Siyanova-Chanturia, Anna; Martinez, Ron

    2015-01-01

    John Sinclair's Idiom Principle famously posited that most texts are largely composed of multi-word expressions that "constitute single choices" in the mental lexicon. At the time that assertion was made, little actual psycholinguistic evidence existed in support of that holistic, "single choice," view of formulaic language. In…

  17. The Pauli Exclusion Principle

    Indian Academy of Sciences (India)

    his exclusion principle, the quantum theory was a mess. Moreover, it could ... This is a function of all the coordinates and 'internal variables' such as spin, of all the ... must remain basically the same (ie change by a phase factor at most) if we ...

  18. The Bohr Correspondence Principle

    Indian Academy of Sciences (India)

    IAS Admin

    Deepak Dhar. Keywords: correspondence principle, hydrogen atom, Kepler orbit. Deepak Dhar works at the Tata Institute of Fundamental Research, Mumbai. His research interests are mainly in the area of statistical physics. We consider the quantum-mechanical non-relativistic hydrogen atom. We show that for bound...

  19. Fundamental Safety Principles

    International Nuclear Information System (INIS)

    Abdelmalik, W.E.Y.

    2011-01-01

    This work presents a summary of the IAEA Safety Standards Series publication No. SF-1, entitled Fundamental Safety Principles, published in 2006. This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. Safety measures and security measures have in common the aim of protecting human life and health and the environment. These safety principles are: 1) Responsibility for safety, 2) Role of the government, 3) Leadership and management for safety, 4) Justification of facilities and activities, 5) Optimization of protection, 6) Limitation of risks to individuals, 7) Protection of present and future generations, 8) Prevention of accidents, 9) Emergency preparedness and response, and 10) Protective action to reduce existing or unregulated radiation risks. The safety principles concern the security of facilities and activities to the extent that they apply to measures that contribute to both safety and security. Safety measures and security measures must be designed and implemented in an integrated manner so that security measures do not compromise safety and safety measures do not compromise security.

  20. Principles of Protocol Design

    DEFF Research Database (Denmark)

    Sharp, Robin

    This is a new and updated edition of a book first published in 1994. The book introduces the reader to the principles used in the construction of a large range of modern data communication protocols, as used in distributed computer systems of all kinds. The approach taken is rather a formal one...

  1. Fermat's Principle Revisited.

    Science.gov (United States)

    Kamat, R. V.

    1991-01-01

    A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)

  2. Principles of economics textbooks

    DEFF Research Database (Denmark)

    Madsen, Poul Thøis

    2012-01-01

    Has the financial crisis already changed US principles of economics textbooks? Rather little has changed in individual textbooks, but taken as a whole ten of the best-selling textbooks suggest rather encompassing changes of core curriculum. A critical analysis of these changes shows how individual...

  3. Principle of the determination of neutron multiplication coefficients by the Monte Carlo method. Application. Description of a code for IBM 360-75

    Energy Technology Data Exchange (ETDEWEB)

    Moreau, J.; Parisot, B. [Commissariat a l'Energie Atomique, Saclay (France). Centre d'Etudes Nucleaires]

    1969-07-01

    The determination of neutron multiplication coefficients by the Monte Carlo method can be carried out in different ways; these are examined in turn and compared. From this comparison a fast code is derived for particularly complex geometries; it uses multigroup isotropic cross sections. The performance of this code is illustrated by some examples. (author)
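
    As a very rough indication of the idea (this toy sketch is not the IBM 360-75 code described in the report), a multiplication coefficient can be estimated by Monte Carlo as the mean number of secondary fission neutrons produced per source neutron in a one-group, infinite homogeneous medium; the probabilities below are assumed values.

    ```python
    # Toy sketch, not the report's code: one-group, infinite homogeneous medium.
    # Each source neutron is absorbed; with probability p_fission the absorption
    # is a fission producing nu neutrons on average, so k_inf = nu * p_fission.
    import random

    def estimate_k(n_histories=100_000, p_fission=0.35, nu=2.5, seed=1):
        rng = random.Random(seed)
        secondaries = 0.0
        for _ in range(n_histories):
            if rng.random() < p_fission:      # absorption ends in fission
                secondaries += nu             # average neutrons released
        return secondaries / n_histories      # Monte Carlo estimate of k_inf

    print(f"k_inf estimate: {estimate_k():.3f}   (analytic value: {0.35 * 2.5:.3f})")
    ```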

  4. Extremum principles for irreversible processes

    International Nuclear Information System (INIS)

    Hillert, M.; Agren, J.

    2006-01-01

    Hamilton's extremum principle is a powerful mathematical tool in classical mechanics. Onsager's extremum principle may play a similar role in irreversible thermodynamics and may also become a valuable tool. His principle may formally be regarded as a principle of maximum rate of entropy production but does not have a clear physical interpretation. Prigogine's principle of minimum rate of entropy production has a physical interpretation when it applies, but is not strictly valid except for a very special case

  5. Quantum mechanics and the equivalence principle

    International Nuclear Information System (INIS)

    Davies, P C W

    2004-01-01

    A quantum particle moving in a gravitational field may penetrate the classically forbidden region of the gravitational potential. This raises the question of whether the time of flight of a quantum particle in a gravitational field might deviate systematically from that of a classical particle due to tunnelling delay, representing a violation of the weak equivalence principle. I investigate this using a model quantum clock to measure the time of flight of a quantum particle in a uniform gravitational field, and show that a violation of the equivalence principle does not occur when the measurement is made far from the turning point of the classical trajectory. The results are then confirmed using the so-called dwell time definition of quantum tunnelling. I conclude with some remarks about the strong equivalence principle in quantum mechanics

  6. Babinet's principle in double-refraction systems

    Science.gov (United States)

    Ropars, Guy; Le Floch, Albert

    2014-06-01

    Babinet's principle applied to systems with double refraction is shown to involve spatial interchanges between the ordinary and extraordinary patterns observed through two complementary screens. As in the case of metamaterials, the extraordinary beam does not follow the Snell-Descartes refraction law; the superposition principle therefore has to be applied simultaneously at two points. Surprisingly, and contrary to intuition, in the presence of the screen with an opaque region we observe that the emerging extraordinary photon pattern, which has nevertheless undergone a deviation, remains fixed when a natural birefringent crystal is rotated, while the ordinary pattern rotates with the crystal. The twofold application of Babinet's principle implies intensity and polarization interchanges, but also spatial and dynamic interchanges, which should occur in birefringent metamaterials.

  7. Demonstration of Human-Autonomy Teaming Principles

    Science.gov (United States)

    Shively, Robert Jay

    2016-01-01

    Known problems with automation include lack of mode awareness, automation brittleness, and risk of miscalibrated trust. Human-Autonomy Teaming (HAT) is essential for improving these problems. We have identified some critical components of HAT and ran a part-task study to introduce these components to a ground station that supports flight following of multiple aircraft. Our goal was to demonstrate, evaluate, and refine HAT principles. This presentation provides a brief summary of the study and initial findings.

  8. THE PRINCIPLES OF LAW. PHILOSOPHICAL APPROACH

    Directory of Open Access Journals (Sweden)

    MARIUS ANDREESCU

    2013-05-01

    Any scientific inquiry that has as its objective the understanding of the meanings of the "principle of law" needs to have an interdisciplinary character, the basis for the approach being the philosophy of law. In this study we carry out such an analysis in order to underline the multiple theoretical meanings of this concept, as well as the relationship between juridical principles and norms and the normative value of the principles of law. Extensive references are made to the philosophical and juridical doctrine on the matter. This study is a plea for referring to principles in the work of creating and applying the law. Starting from the difference between the "given" and the "constructed", we propose a distinction between "metaphysical principles" outside the law, which by their content have philosophical significance, and "constructed principles" elaborated inside the law. We emphasize the obligation of the lawmaker, but also of the legal expert, to refer to principles in the work of legislating, interpreting and applying the law. Arguments are offered for updating, within certain limits, natural-law (jusnaturalist) conceptions in the law.

  9. Principles of geodynamics

    CERN Document Server

    Scheidegger, Adrian E

    1982-01-01

    Geodynamics is commonly thought to be one of the subjects which provide the basis for understanding the origin of the visible surface features of the Earth: the latter are usually assumed as having been built up by geodynamic forces originating inside the Earth ("endogenetic" processes) and then as having been degraded by geomorphological agents originating in the atmosphere and ocean ("exogenetic" agents). The modern view holds that the sequence of events is not as neat as it was once thought to be, and that, in effect, both geodynamic and geomorphological processes act simultaneously ("Principle of Antagonism"); however, the division of theoretical geology into the principles of geodynamics and those of theoretical geomorphology seems to be useful for didactic purposes. It has therefore been maintained in the present writer's works. This present treatise on geodynamics is the first part of the author's treatment of theoretical geology, the treatise on Theoretical Geomorphology (also published by the Sprin...

  10. Principles of systems science

    CERN Document Server

    Mobus, George E

    2015-01-01

    This pioneering text provides a comprehensive introduction to systems structure, function, and modeling as applied in all fields of science and engineering. Systems understanding is increasingly recognized as a key to a more holistic education and greater problem solving skills, and is also reflected in the trend toward interdisciplinary approaches to research on complex phenomena. The subject of systems science, as a basis for understanding the components and drivers of phenomena at all scales, should be viewed with the same importance as a traditional liberal arts education. Principles of Systems Science contains many graphs, illustrations, side bars, examples, and problems to enhance understanding. From basic principles of organization, complexity, abstract representations, and behavior (dynamics) to deeper aspects such as the relations between information, knowledge, computation, and system control, to higher order aspects such as auto-organization, emergence and evolution, the book provides an integrated...

  11. Common principles and multiculturalism.

    Science.gov (United States)

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have been discussing the suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute westernized principles would be appropriate tools for ethical decision making in different cultures. We will discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea.

  12. Principles of Mobile Communication

    CERN Document Server

    Stüber, Gordon L

    2012-01-01

    This mathematically rigorous overview of physical layer wireless communications is now in a third, fully revised and updated edition. Along with coverage of basic principles sufficient for novice students, the volume includes plenty of finer details that will satisfy the requirements of graduate students aiming to research the topic in depth. It also has a role as a handy reference for wireless engineers. The content stresses core principles that are applicable to a broad range of wireless standards. Beginning with a survey of the field that introduces an array of issues relevant to wireless communications and which traces the historical development of today’s accepted wireless standards, the book moves on to cover all the relevant discrete subjects, from radio propagation to error probability performance and cellular radio resource management. A valuable appendix provides a succinct and focused tutorial on probability and random processes, concepts widely used throughout the book. This new edition, revised...

  13. Principles of mathematical modeling

    CERN Document Server

    Dym, Clive

    2004-01-01

    Science and engineering students depend heavily on concepts of mathematical modeling. In an age where almost everything is done on a computer, author Clive Dym believes that students need to understand and "own" the underlying mathematics that computers are doing on their behalf. His goal for Principles of Mathematical Modeling, Second Edition, is to engage the student reader in developing a foundational understanding of the subject that will serve them well into their careers. The first half of the book begins with a clearly defined set of modeling principles, and then introduces a set of foundational tools including dimensional analysis, scaling techniques, and approximation and validation techniques. The second half demonstrates the latest applications for these tools to a broad variety of subjects, including exponential growth and decay in fields ranging from biology to economics, traffic flow, free and forced vibration of mechanical and other systems, and optimization problems in biology, structures, an...

  14. Principles of Stellar Interferometry

    CERN Document Server

    Glindemann, Andreas

    2011-01-01

    Over the last decade, stellar interferometry has developed from a specialist tool to a mainstream observing technique, attracting scientists whose research benefits from milliarcsecond angular resolution. Stellar interferometry has become part of the astronomer’s toolbox, complementing single-telescope observations by providing unique capabilities that will advance astronomical research. This carefully written book is intended to provide a solid understanding of the principles of stellar interferometry to students starting an astronomical research project in this field or developing instruments, and to astronomers using interferometry but who are not interferometrists per se. Illustrated by excellent drawings and calculated graphs, the imaging process in stellar interferometers is explained starting from first principles on light propagation and diffraction; wave propagation through turbulence is described in detail using Kolmogorov statistics; the impact of turbulence on the imaging process is discussed both f...

  15. Principles of mobile communication

    CERN Document Server

    Stüber, Gordon L

    2017-01-01

    This mathematically rigorous overview of physical layer wireless communications is now in a 4th, fully revised and updated edition. The new edition features new content on 4G cellular systems, 5G cellular outlook, bandpass signals and systems, and polarization, among many other topics, in addition to a new chapters on channel assignment techniques. Along with coverage of fundamentals and basic principles sufficient for novice students, the volume includes finer details that satisfy the requirements of graduate students aiming to conduct in-depth research. The book begins with a survey of the field, introducing issues relevant to wireless communications. The book moves on to cover relevant discrete subjects, from radio propagation, to error probability performance, and cellular radio resource management. An appendix provides a tutorial on probability and random processes. The content stresses core principles that are applicable to a broad range of wireless standards. New examples are provided throughout the bo...

  16. Principles of photonics

    CERN Document Server

    Liu, Jia-Ming

    2016-01-01

    With this self-contained and comprehensive text, students will gain a detailed understanding of the fundamental concepts and major principles of photonics. Assuming only a basic background in optics, readers are guided through key topics such as the nature of optical fields, the properties of optical materials, and the principles of major photonic functions regarding the generation, propagation, coupling, interference, amplification, modulation, and detection of optical waves or signals. Numerous examples and problems are provided throughout to enhance understanding, and a solutions manual containing detailed solutions and explanations is available online for instructors. This is the ideal resource for electrical engineering and physics undergraduates taking introductory, single-semester or single-quarter courses in photonics, providing them with the knowledge and skills needed to progress to more advanced courses on photonic devices, systems and applications.

  17. Common Principles and Multiculturalism

    Science.gov (United States)

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have been discussing the suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute westernized principles would be appropriate tools for ethical decision making in different cultures. We will discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea. PMID:23908720

  18. Conservation and balance principles approach in NPP decommissioning

    International Nuclear Information System (INIS)

    Anton, V.

    1997-01-01

    In this work some principles of the conservation of mass, energy and activity level are formulated. When conditioning or treatment procedures for High Level Waste (HLW) are applied, the corresponding balance principles operate. It is worth noting that the AECL computing code DECOM provides an analysis of different decommissioning options based on cost considerations. Our approach points out many possibilities which are to be taken into account in NPP decommissioning besides the minimum-cost principle. We also note other circumstances pointing to other conservation principles and to the corresponding balance principles. In our opinion this is the first approach of this kind in the international literature. With the progress expected in decommissioning techniques, some of the considerations presented in this work will have to be developed and detailed. (author)

  19. Principles of (Behavioral) Economics

    OpenAIRE

    David Laibson; John A. List

    2015-01-01

    Behavioral economics has become an important and integrated component of modern economics. Behavioral economists embrace the core principles of economics—optimization and equilibrium—and seek to develop and extend those ideas to make them more empirically accurate. Behavioral models assume that economic actors try to pick the best feasible option and those actors sometimes make mistakes. Behavioral ideas should be incorporated throughout the first-year undergraduate course. Instructors should...

  20. Principles of fluid mechanics

    International Nuclear Information System (INIS)

    Kreider, J.F.

    1985-01-01

    This book is an introduction to fluid mechanics incorporating computer applications. Topics covered are as follows: brief history; what is a fluid; two classes of fluids: liquids and gases; the continuum model of a fluid; methods of analyzing fluid flows; important characteristics of fluids; fundamentals and equations of motion; fluid statics; dimensional analysis and the similarity principle; laminar internal flows; ideal flow; external laminar and channel flows; turbulent flow; compressible flow; fluid flow measurements.

  1. Principles of electrical safety

    CERN Document Server

    Sutherland, Peter E

    2015-01-01

    Principles of Electrical Safety discusses current issues in electrical safety, accompanied by a series of practical applications that can be used by practicing professionals, graduate students, and researchers. The book provides extensive introductions to important topics in electrical safety, gives a comprehensive overview of inductance, resistance, and capacitance as applied to the human body, and serves as a preparatory guide for today's practicing engineers.

  2. PREFERENCE, PRINCIPLE AND PRACTICE

    DEFF Research Database (Denmark)

    Skovsgaard, Morten; Bro, Peter

    2011-01-01

    Legitimacy has become a central issue in journalism, since the understanding of what journalism is and who journalists are has been challenged by developments both within and outside the newsrooms. Nonetheless, little scholarly work has been conducted to aid conceptual clarification as to how jou... distinct, but interconnected categories: preference, principle, and practice. Through this framework, historical attempts to justify journalism and journalists are described and discussed in the light of the present challenges for the profession.

  3. Advertisement without Ethical Principles?

    OpenAIRE

    Wojciech Słomski

    2007-01-01

    The article addresses the question of whether advertising can exist without ethical principles, or whether ethics should be the basis of advertising. One can say that the ethical assessment of an advertisement does not depend exclusively on the content and form of the advertising message, but also on the recipient's consciousness. Advertising appeals to the emotions more than to the intellect, thus restricting the scope of conscious choice based on rational premises, so it is morally bad. It...

  4. General Principles Governing Liability

    International Nuclear Information System (INIS)

    Reyners, P.

    1998-01-01

    This paper contains a brief review of the basic principles which govern the special regime of liability and compensation for nuclear damage originating on nuclear installations, in particular the strict and exclusive liability of the nuclear operator, the provision of a financial security to cover this liability and the limits applicable both in amount and in time. The paper also reviews the most important international agreements currently in force which constitute the foundation of this special regime. (author)

  5. The Principle of Proportionality

    DEFF Research Database (Denmark)

    Bennedsen, Morten; Meisner Nielsen, Kasper

    2005-01-01

    Recent policy initiatives within the harmonization of European company laws have promoted a so-called "principle of proportionality" through proposals that regulate mechanisms opposing a proportional distribution of ownership and control. We scrutinize the foundation for these initiatives ... in relationship to the process of harmonization of the European capital markets. JEL classifications: G30, G32, G34 and G38. Keywords: Ownership Structure, Dual Class Shares, Pyramids, EU company laws.

  6. Common Principles and Multiculturalism

    OpenAIRE

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have been discussing the suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute westernized principles would be appropriate tools for ethical decision making in different cultures. We will discuss this issue by introducing a clinical case. Con...

  7. The Maquet principle

    International Nuclear Information System (INIS)

    Levine, R.B.; Stassi, J.; Karasick, D.

    1985-01-01

    Anterior displacement of the tibial tubercle is a well-accepted orthopedic procedure in the treatment of certain patellofemoral disorders. The radiologic appearance of surgical procedures utilizing the Maquet principle has not been described in the radiologic literature. Familiarity with the physiologic and biochemical basis for the procedure and its postoperative appearance is necessary for appropriate roentgenographic evaluation and the radiographic recognition of complications. (orig.)

  8. Principles of lake sedimentology

    International Nuclear Information System (INIS)

    Janasson, L.

    1983-01-01

    This book presents a comprehensive outline of basic sedimentological principles for lakes, focusing on environmental aspects and matters related to lake management and control, that is, on lake ecology rather than lake geology. It is a guide for those who plan, perform and evaluate lake sedimentological investigations. Contents (abridged): Lake types and sediment types. Sedimentation in lakes and water dynamics. Lake bottom dynamics. Sediment dynamics and sediment age. Sediments in aquatic pollution control programmes. Subject index.

  9. Principles of artificial intelligence

    CERN Document Server

    Nilsson, Nils J

    1980-01-01

    A classic introduction to artificial intelligence intended to bridge the gap between theory and practice, Principles of Artificial Intelligence describes fundamental AI ideas that underlie applications such as natural language processing, automatic programming, robotics, machine vision, automatic theorem proving, and intelligent data retrieval. Rather than focusing on the subject matter of the applications, the book is organized around general computational concepts involving the kinds of data structures used, the types of operations performed on the data structures, and the properties of th

  10. Economic uncertainty principle?

    OpenAIRE

    Alexander Harin

    2006-01-01

    The economic principle of (hidden) uncertainty is presented. New probability formulas are offered. Examples of solutions of three types of fundamental problems are reviewed.

  11. The goal of ape pointing.

    Science.gov (United States)

    Halina, Marta; Liebal, Katja; Tomasello, Michael

    2018-01-01

    Captive great apes regularly use pointing gestures in their interactions with humans. However, the precise function of this gesture is unknown. One possibility is that apes use pointing primarily to direct attention (as in "please look at that"); another is that they point mainly as an action request (such as "can you give that to me?"). We investigated these two possibilities here by examining how the looking behavior of recipients affects pointing in chimpanzees (Pan troglodytes) and bonobos (Pan paniscus). Upon pointing to food, subjects were faced with a recipient who either looked at the indicated object (successful-look) or failed to look at the indicated object (failed-look). We predicted that, if apes point primarily to direct attention, subjects would spend more time pointing in the failed-look condition because the goal of their gesture had not been met. Alternatively, we expected that, if apes point primarily to request an object, subjects would not differ in their pointing behavior between the successful-look and failed-look conditions because these conditions differed only in the looking behavior of the recipient. We found that subjects did differ in their pointing behavior across the successful-look and failed-look conditions, but contrary to our prediction subjects spent more time pointing in the successful-look condition. These results suggest that apes are sensitive to the attentional states of gestural recipients, but their adjustments are aimed at multiple goals. We also found a greater number of individuals with a strong right-hand than left-hand preference for pointing.

  12. On minimizers of causal variational principles

    International Nuclear Information System (INIS)

    Schiefeneder, Daniela

    2011-01-01

    Causal variational principles are a class of nonlinear minimization problems which arise in a formulation of relativistic quantum theory referred to as the fermionic projector approach. This thesis is devoted to a numerical and analytic study of the minimizers of a general class of causal variational principles. We begin with a numerical investigation of variational principles for the fermionic projector in discrete space-time. It is shown that for sufficiently many space-time points, the minimizing fermionic projector induces non-trivial causal relations on the space-time points. We then generalize the setting by introducing a class of causal variational principles for measures on a compact manifold. In our main result we prove under general assumptions that the support of a minimizing measure is either completely timelike, or it is singular in the sense that its interior is empty. In the examples of the circle, the sphere and certain flag manifolds, the general results are supplemented by a more detailed analysis of the minimizers. (orig.)
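
    The generic shape of such a principle, written out here as an assumption based on the abstract's description (the thesis' precise definitions and constraints may differ), is to minimize a double integral of a nonnegative Lagrangian over normalized positive measures:

    ```latex
    % Schematic form only, inferred from the abstract; the precise constraints
    % used in the thesis may differ.
    \[
      \mathcal{S}[\rho] \;=\; \int_{\mathcal{F}}\!\int_{\mathcal{F}}
          \mathcal{L}(x,y)\, d\rho(x)\, d\rho(y)
      \;\longrightarrow\; \min
      \qquad \text{over measures } \rho \ge 0 \text{ on } \mathcal{F}
      \text{ with } \rho(\mathcal{F}) = 1 .
    \]
    ```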

  13. The Alara principle in backfitting Borssele

    International Nuclear Information System (INIS)

    Leurs, C.J.

    1998-01-01

    An extensive backfitting program, the Modifications Project, was carried out at the Borssele Nuclear Power Station. It involved sixteen modifications to technical systems. The scope of activities, and the dose rates encountered in places where work was to be performed, made it obvious from the outset that a high collective dose had to be anticipated. As a consequence, radiation protection within the project was organized in such a way that applicable radiation protection principles were applied in all phases of the project. From the point of view of radiation protection, the Modifications Project had to be subdivided into three phases, i.e., a conceptual design phase in which mainly the justification principle was applied; the engineering phase in which the Alara principle was employed; the execution phase in which management of the (internal) dose limits had to be observed in addition to the Alara principle. Throughout all project phases, radiation protection considerations and results were documented in so-called Alara reports and radiation protection checklists. As a result of the strictest possible observance of radiation protection principles in all phases of the project, a collective dose of 2505 mSv was achieved, which stands for a reduction by a factor of 4 compared to the very first estimate. In view of the scope and complex nature of the activities involved, and the radiation levels in the Borssele Nuclear Power Station, this is an excellent result. (orig.) [de

  14. Food ionization: principles, nutritional aspects and detection

    International Nuclear Information System (INIS)

    Raffi, J.

    1992-01-01

This document reviews the possible applications of ionizing radiations in the food industry, pointing out the principles of the treatment and its consequences on the nutritional value of the product. The last part gives the present status of the research on the identification of irradiated foodstuffs and of the concerted action sponsored by the Community Bureau of Reference from the Commission of the European Communities

  15. Some Key Principles for Developing Grammar Skills

    Institute of Scientific and Technical Information of China (English)

    张威

    2008-01-01

Grammar is sometimes defined as "the way words are put together to make correct sentences" (Ur, 2004, p. 75). The aim of teaching grammar is to raise the accuracy of language use and to help students transfer isolated language points into applied language. In this essay, the author introduces two common methods used in English grammar classes and sets out some key principles for grammar teaching.

  16. Common pitfalls in statistical analysis: The perils of multiple testing

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2016-01-01

    Multiple testing refers to situations where a dataset is subjected to statistical testing multiple times - either at multiple time-points or through multiple subgroups or for multiple end-points. This amplifies the probability of a false-positive finding. In this article, we look at the consequences of multiple testing and explore various methods to deal with this issue. PMID:27141478
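
    A minimal simulation sketch of the false-positive inflation described above, together with a Bonferroni-style correction; the number of end-points (20), the significance level (5%) and the sample sizes are illustrative assumptions, not figures from the article.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_tests, alpha, n_sim = 20, 0.05, 2000   # illustrative choices, not from the article

        false_positive_runs = 0
        for _ in range(n_sim):
            # Both groups are drawn from the SAME distribution, so every
            # "significant" result is a false positive by construction.
            pvals = [stats.ttest_ind(rng.normal(size=30), rng.normal(size=30)).pvalue
                     for _ in range(n_tests)]
            false_positive_runs += any(p < alpha for p in pvals)

        print("P(at least one false positive), uncorrected:", false_positive_runs / n_sim)
        print("expected without correction: about", round(1 - (1 - alpha) ** n_tests, 2))
        print("Bonferroni-adjusted per-test threshold:", alpha / n_tests)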

  17. Enhancing the Therapy Experience Using Principles of Video Game Design.

    Science.gov (United States)

    Folkins, John Wm; Brackenbury, Tim; Krause, Miriam; Haviland, Allison

    2016-02-01

    This article considers the potential benefits that applying design principles from contemporary video games may have on enhancing therapy experiences. Six principles of video game design are presented, and their relevance for enriching clinical experiences is discussed. The motivational and learning benefits of each design principle have been discussed in the education literature as having positive impacts on student motivation and learning and are related here to aspects of clinical practice. The essential experience principle suggests connecting all aspects of the experience around a central emotion or cognitive connection. The discovery principle promotes indirect learning in focused environments. The risk-taking principle addresses the uncertainties clients face when attempting newly learned skills in novel situations. The generalization principle encourages multiple opportunities for skill transfer. The reward system principle directly relates to the scaffolding of frequent and varied feedback in treatment. Last, the identity principle can assist clients in using their newly learned communication skills to redefine self-perceptions. These principles highlight areas for research and interventions that may be used to reinforce or advance current practice.

  18. Principles of Toxicological Interactions Associated with Multiple Chemical Exposures.

    Science.gov (United States)

    1980-12-01

specific carrier systems. Some compounds of high molecular weight, e.g., polysaccharides, neutral fats, cannot be absorbed because they are destroyed by ... degree of ionization of the foreign compound and, hence, the extent of absorption. Certain foreign monosaccharides, amino acids, and pyrimidines are

  19. Efficiency principles of consulting entrepreneurship

    OpenAIRE

    Moroz Yustina S.; Drozdov Igor N.

    2015-01-01

    The article reviews the primary goals and problems of consulting entrepreneurship. The principles defining efficiency of entrepreneurship in the field of consulting are generalized. The special attention is given to the importance of ethical principles of conducting consulting entrepreneurship activity.

  20. Algorithmic Principles of Mathematical Programming

    NARCIS (Netherlands)

    Faigle, Ulrich; Kern, Walter; Still, Georg

    2002-01-01

    Algorithmic Principles of Mathematical Programming investigates the mathematical structures and principles underlying the design of efficient algorithms for optimization problems. Recent advances in algorithmic theory have shown that the traditionally separate areas of discrete optimization, linear

  1. The Playtime Principle

    DEFF Research Database (Denmark)

    Sifa, Rafet; Bauckhage, Christian; Drachen, Anders

    2014-01-01

be derived from this large-scale analysis, notably that playtime as a function of time, across the thousands of games in the dataset, and irrespective of local differences in the playtime frequency distribution, can be modeled using the same model: the Weibull distribution. This suggests...... that there are fundamental properties governing player engagement as it evolves over time, which we here refer to as the Playtime Principle. Additionally, the analysis shows that there are distinct clusters, or archetypes, in the playtime frequency distributions of the investigated games. These archetypal groups correspond...
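
    An illustrative sketch of fitting a Weibull model to playtime data, in the spirit of the analysis described above; the synthetic sample and parameter values are assumptions for demonstration only, not results from the paper.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Synthetic "playtime" sample in hours; shape < 1 mimics a heavy early drop-off
        playtimes = stats.weibull_min.rvs(c=0.8, scale=20.0, size=5000, random_state=rng)

        # Fit a two-parameter Weibull (location pinned at 0, as is usual for durations)
        shape, loc, scale = stats.weibull_min.fit(playtimes, floc=0)
        print(f"fitted shape k = {shape:.2f}, scale = {scale:.1f} hours")

        # Survival function: fraction of players expected to still be playing after t hours
        t = 10.0
        print("P(playtime > 10 h) =", round(stats.weibull_min.sf(t, shape, loc=0, scale=scale), 3))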

  2. Complex Correspondence Principle

    International Nuclear Information System (INIS)

    Bender, Carl M.; Meisinger, Peter N.; Hook, Daniel W.; Wang Qinghai

    2010-01-01

    Quantum mechanics and classical mechanics are distinctly different theories, but the correspondence principle states that quantum particles behave classically in the limit of high quantum number. In recent years much research has been done on extending both quantum and classical mechanics into the complex domain. These complex extensions continue to exhibit a correspondence, and this correspondence becomes more pronounced in the complex domain. The association between complex quantum mechanics and complex classical mechanics is subtle and demonstrating this relationship requires the use of asymptotics beyond all orders.

  3. Principles of chemical kinetics

    CERN Document Server

    House, James E

    2007-01-01

    James House's revised Principles of Chemical Kinetics provides a clear and logical description of chemical kinetics in a manner unlike any other book of its kind. Clearly written with detailed derivations, the text allows students to move rapidly from theoretical concepts of rates of reaction to concrete applications. Unlike other texts, House presents a balanced treatment of kinetic reactions in gas, solution, and solid states. The entire text has been revised and includes many new sections and an additional chapter on applications of kinetics. The topics covered include quantitative rela

  4. RFID design principles

    CERN Document Server

    Lehpamer, Harvey

    2012-01-01

    This revised edition of the Artech House bestseller, RFID Design Principles, serves as an up-to-date and comprehensive introduction to the subject. The second edition features numerous updates and brand new and expanded material on emerging topics such as the medical applications of RFID and new ethical challenges in the field. This practical book offers you a detailed understanding of RFID design essentials, key applications, and important management issues. The book explores the role of RFID technology in supply chain management, intelligent building design, transportation systems, military

  5. Principles of meteoritics

    CERN Document Server

    Krinov, E L

    1960-01-01

    Principles of Meteoritics examines the significance of meteorites in relation to cosmogony and to the origin of the planetary system. The book discusses the science of meteoritics and the sources of meteorites. Scientists study the morphology of meteorites to determine their motion in the atmosphere. The scope of such study includes all forms of meteorites, the circumstances of their fall to earth, their motion in the atmosphere, and their orbits in space. Meteoric bodies vary in sizes; in calculating their motion in interplanetary space, astronomers apply the laws of Kepler. In the region of

  6. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus

  7. Principles of speech coding

    CERN Document Server

    Ogunfunmi, Tokunbo

    2010-01-01

    It is becoming increasingly apparent that all forms of communication-including voice-will be transmitted through packet-switched networks based on the Internet Protocol (IP). Therefore, the design of modern devices that rely on speech interfaces, such as cell phones and PDAs, requires a complete and up-to-date understanding of the basics of speech coding. Outlines key signal processing algorithms used to mitigate impairments to speech quality in VoIP networksOffering a detailed yet easily accessible introduction to the field, Principles of Speech Coding provides an in-depth examination of the

  8. On Weak Markov's Principle

    DEFF Research Database (Denmark)

    Kohlenbach, Ulrich Wilhelm

    2002-01-01

We show that the so-called weak Markov's principle (WMP), which states that every pseudo-positive real number is positive, is underivable in E-HA + AC. Since this system allows one to formalize (at least large parts of) Bishop's constructive mathematics, this makes it unlikely that WMP can be proved within...... the framework of Bishop-style mathematics (which has been open for about 20 years). The underivability even holds if the ineffective schema of full comprehension (in all types) for negated formulas (in particular for -free formulas) is added, which allows one to derive the law of excluded middle...

  9. Principles of quantum chemistry

    CERN Document Server

    George, David V

    2013-01-01

    Principles of Quantum Chemistry focuses on the application of quantum mechanics in physical models and experiments of chemical systems.This book describes chemical bonding and its two specific problems - bonding in complexes and in conjugated organic molecules. The very basic theory of spectroscopy is also considered. Other topics include the early development of quantum theory; particle-in-a-box; general formulation of the theory of quantum mechanics; and treatment of angular momentum in quantum mechanics. The examples of solutions of Schroedinger equations; approximation methods in quantum c

  10. Principles of thermodynamics

    CERN Document Server

    Kaufman, Myron

    2002-01-01

Ideal for one- or two-semester courses that assume elementary knowledge of calculus, this text presents the fundamental concepts of thermodynamics and applies these to problems dealing with properties of materials, phase transformations, chemical reactions, solutions and surfaces. The author utilizes principles of statistical mechanics to illustrate key concepts from a microscopic perspective, as well as to develop equations of kinetic theory. The book provides end-of-chapter question and problem sets, some using Mathcad™ and Mathematica™; a useful glossary containing important symbols, definitions, and units; and appendices covering multivariable calculus and valuable numerical methods.

  11. Principles of fluorescence techniques

    CERN Document Server

    2016-01-01

    Fluorescence techniques are being used and applied increasingly in academics and industry. The Principles of Fluorescence Techniques course will outline the basic concepts of fluorescence techniques and the successful utilization of the currently available commercial instrumentation. The course is designed for students who utilize fluorescence techniques and instrumentation and for researchers and industrial scientists who wish to deepen their knowledge of fluorescence applications. Key scientists in the field will deliver theoretical lectures. The lectures will be complemented by the direct utilization of steady-state and lifetime fluorescence instrumentation and confocal microscopy for FLIM and FRET applications provided by leading companies.

  12. Torsades de Pointes

    Directory of Open Access Journals (Sweden)

    Richard J Chen, MD

    2018-04-01

pacing. Since this patient was asymptomatic, he was given 2 g of magnesium sulfate and placed on an amiodarone infusion, after which the TdP terminated with a resulting sinus rhythm. AICD interrogation showed multiple episodes of ventricular fibrillation. The patient was admitted for further management and to determine why his AICD was not functioning properly. Topics: EKG, ECG, cardiology, ventricular tachycardia, arrhythmia, Torsades de Pointes

  13. The principle of general tovariance

    NARCIS (Netherlands)

    Heunen, C.; Landsman, N.P.; Spitters, B.A.W.; Loja Fernandes, R.; Picken, R.

    2008-01-01

    We tentatively propose two guiding principles for the construction of theories of physics, which should be satisfied by a possible future theory of quantum gravity. These principles are inspired by those that led Einstein to his theory of general relativity, viz. his principle of general covariance

  14. Fermat and the Minimum Principle

    Indian Academy of Sciences (India)

Arguably, least action and minimum principles were offered or applied much earlier. Such principles are among the fundamental, basic, unifying or organizing ones used to describe a variety of natural phenomena. They consider the amount of energy expended in performing a given action to be the least required ...

  15. Fundamental Principle for Quantum Theory

    OpenAIRE

    Khrennikov, Andrei

    2002-01-01

    We propose the principle, the law of statistical balance for basic physical observables, which specifies quantum statistical theory among all other statistical theories of measurements. It seems that this principle might play in quantum theory the role that is similar to the role of Einstein's relativity principle.

  16. Principles for School Drug Education

    Science.gov (United States)

    Meyer, Lois

    2004-01-01

    This document presents a revised set of principles for school drug education. The principles for drug education in schools comprise an evolving framework that has proved useful over a number of decades in guiding the development of effective drug education. The first edition of "Principles for Drug Education in Schools" (Ballard et al.…

  17. Principles of data integration

    CERN Document Server

    Doan, AnHai; Ives, Zachary

    2012-01-01

How do you approach answering queries when your data is stored in multiple databases that were designed independently by different people? This is the first comprehensive book on data integration, written by three of the most respected experts in the field. This book provides an extensive introduction to the theory and concepts underlying today's data integration techniques, with detailed instruction for their application, using concrete examples throughout to explain the concepts. Data integration is the problem of answering queries that span multiple data sources (e.g., databases, web
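
    A toy sketch of the core idea described above: answering one query over two independently designed sources through a tiny mediator. The schemas, field names and join key are hypothetical examples, not taken from the book.

        # Two "sources" with different, independently chosen schemas.
        crm = [{"cust_id": 1, "name": "Ada"}, {"cust_id": 2, "name": "Grace"}]
        billing = [{"customer": 1, "amount": 120.0}, {"customer": 1, "amount": 80.0},
                   {"customer": 2, "amount": 42.5}]

        # A tiny mediator: map both schemas onto one mediated view and join on the key.
        def total_spend_by_name():
            names = {row["cust_id"]: row["name"] for row in crm}
            totals = {}
            for row in billing:
                name = names.get(row["customer"], "unknown")
                totals[name] = totals.get(name, 0.0) + row["amount"]
            return totals

        print(total_spend_by_name())   # {'Ada': 200.0, 'Grace': 42.5}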

  18. Poisson point processes imaging, tracking, and sensing

    CERN Document Server

    Streit, Roy L

    2010-01-01

    This overview of non-homogeneous and multidimensional Poisson point processes and their applications features mathematical tools and applications from emission- and transmission-computed tomography to multiple target tracking and distributed sensor detection.
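
    A short sketch of sampling a non-homogeneous Poisson point process on an interval by thinning (in the Lewis–Shedler style); the intensity function and parameters below are illustrative assumptions, not material from the book.

        import numpy as np

        rng = np.random.default_rng(2)

        def sample_nhpp(intensity, t_max, lam_max):
            """Thinning: draw a homogeneous process at rate lam_max and keep each
            candidate point t with probability intensity(t) / lam_max."""
            t, points = 0.0, []
            while True:
                t += rng.exponential(1.0 / lam_max)
                if t > t_max:
                    return np.array(points)
                if rng.uniform() < intensity(t) / lam_max:
                    points.append(t)

        # Example intensity: a sinusoidal modulation around 5 events per unit time
        events = sample_nhpp(lambda t: 5.0 * (1.0 + 0.8 * np.sin(t)), t_max=10.0, lam_max=9.0)
        print(len(events), "events; first few:", np.round(events[:5], 2))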

  19. Multiple sclerosis

    Science.gov (United States)


  20. Dew point measurement technique utilizing fiber cut reflection

    Science.gov (United States)

    Kostritskii, S. M.; Dikevich, A. A.; Korkishko, Yu. N.; Fedorov, V. A.

    2009-05-01

A fiber-optic dew point hygrometer based on the change of the reflection coefficient at a cleaved fiber end (fiber cut) has been developed and examined. We propose and verify a model of the functioning principle of the condensation detector. Experimental frost point measurements have been performed on air samples with different frost points.
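
    A back-of-the-envelope sketch of why condensation on a cleaved fiber end changes the reflected power: the normal-incidence Fresnel reflectance drops sharply when the external medium changes from air to water. The refractive indices used are typical textbook values, not figures from the paper.

        # Normal-incidence Fresnel reflectance: R = ((n1 - n2) / (n1 + n2))**2
        def reflectance(n1, n2):
            return ((n1 - n2) / (n1 + n2)) ** 2

        n_fiber, n_air, n_water = 1.45, 1.00, 1.33   # typical values (assumption)

        r_dry = reflectance(n_fiber, n_air)    # roughly 3.4 % of the light is reflected
        r_wet = reflectance(n_fiber, n_water)  # roughly 0.19 % once dew forms on the cut

        print(f"dry cut: R = {r_dry:.4f}")
        print(f"wet cut: R = {r_wet:.4f}")
        print(f"reflected power drops by a factor of ~{r_dry / r_wet:.0f} when condensation appears")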

  1. Principles of Mechanical Excavation

    International Nuclear Information System (INIS)

    Lislerud, A.

    1997-12-01

Mechanical excavation of rock today includes several methods such as tunnel boring, raiseboring, roadheading and various continuous mining systems. Of these, raiseboring is one potential technique for excavating shafts in the repository for spent nuclear fuel, and dry blind boring is a promising technique for excavation of deposition holes, as demonstrated in the Research Tunnel at Olkiluoto. In addition, there is potential for use of other mechanical excavation techniques in different parts of the repository. One of the main objectives of this study was to analyze the factors which affect the feasibility of mechanical rock excavation in hard rock conditions and to enhance the understanding of factors which affect rock cutting so as to provide an improved basis for excavator performance prediction modeling. The study included the following four main topics: (a) phenomenological model based on similarity analysis for roller disk cutting, (b) rock mass properties which affect rock cuttability and tool life, (c) principles for linear and field cutting tests and performance prediction modeling and (d) cutter head lacing design procedures and principles. As a conclusion of this study, a test rig was constructed, and field tests were planned and started up. The results of the study can be used to improve the performance prediction models used to assess the feasibility of different mechanical excavation techniques at various repository investigation sites. (orig.)

  2. THE RESPONSIBILITY PRINCIPLE

    Directory of Open Access Journals (Sweden)

    Elena ANGHEL

    2015-07-01

Full Text Available "I'm wishing Law this: all legal obligations should be executed with the scrupulosity with which moral obligations are being performed by those people who feel bound by them ...", as beautifully portrayed by Nicolae Titulescu's words. Life in society means more than a simple coexistence of human beings; it actually means living together, collaborating and cooperating. That is why I always have to relate to other people and to be aware that only by limiting my own freedom of action is the freedom of others feasible. Neminem laedere should be a principle of life for each of us. The individual is a responsible being, but responsibility exceeds legal prescriptions. The Romanian Constitution underlines that I have to exercise my rights and freedoms in good faith, without infringing the rights and freedoms of others. The legal norm, a developer of the constitutional principles, is endowed with sanction, which grants it exigibility. But I wonder: if I choose to obey the law, is my decision essentially determined only by the fear of punishment? Is it not because I am a rational being, who has developed over a lifetime a conscience of values, and who thus understands that the law must be respected and chooses to comply with it?

  3. Principles of Mechanical Excavation

    Energy Technology Data Exchange (ETDEWEB)

    Lislerud, A. [Tamrock Corp., Tampere (Finland)

    1997-12-01

Mechanical excavation of rock today includes several methods such as tunnel boring, raiseboring, roadheading and various continuous mining systems. Of these, raiseboring is one potential technique for excavating shafts in the repository for spent nuclear fuel, and dry blind boring is a promising technique for excavation of deposition holes, as demonstrated in the Research Tunnel at Olkiluoto. In addition, there is potential for use of other mechanical excavation techniques in different parts of the repository. One of the main objectives of this study was to analyze the factors which affect the feasibility of mechanical rock excavation in hard rock conditions and to enhance the understanding of factors which affect rock cutting so as to provide an improved basis for excavator performance prediction modeling. The study included the following four main topics: (a) phenomenological model based on similarity analysis for roller disk cutting, (b) rock mass properties which affect rock cuttability and tool life, (c) principles for linear and field cutting tests and performance prediction modeling and (d) cutter head lacing design procedures and principles. As a conclusion of this study, a test rig was constructed, and field tests were planned and started up. The results of the study can be used to improve the performance prediction models used to assess the feasibility of different mechanical excavation techniques at various repository investigation sites. (orig.). 21 refs.

  4. Comment on "Current fluctuations in non-equilibrium diffusive systems: an additivity principle"

    OpenAIRE

    Sukhorukov, Eugene V.; Jordan, Andrew N.

    2004-01-01

    We point out that the "additivity principle" and "scaling hypothesis" postulated by Bodineau and Derrida in Phys. Rev. Lett 92, 180601 (2004), follow naturally from the saddle point evaluation of a diffusive field theory.

  5. Mach's principle and rotating universes

    International Nuclear Information System (INIS)

    King, D.H.

    1990-01-01

    It is shown that the Bianchi 9 model universe satisfies the Mach principle. These closed rotating universes were previously thought to be counter-examples to the principle. The Mach principle is satisfied because the angular momentum of the rotating matter is compensated by the effective angular momentum of gravitational waves. A new formulation of the Mach principle is given that is based on the field theory interpretation of general relativity. Every closed universe with 3-sphere topology is shown to satisfy this formulation of the Mach principle. It is shown that the total angular momentum of the matter and gravitational waves in a closed 3-sphere topology universe is zero

  6. THE EQUALITY PRINCIPLE REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    CLAUDIA ANDRIŢOI

    2013-05-01

Full Text Available The problem premises and the objectives followed: the idea of placing the equality principle between the freedom and the justice principles is manifested in positive law in two stages, as a general idea underlying all judicial norms and as a requirement on the holder of a subjective right and on the addressees of objective law. Equality before the law and before public authorities cannot involve the idea of standardization, of uniformity, of enlisting all citizens under the same judicial regime regardless of their natural or socio-professional situation. Through the Beijing Platform and the position documents of the European Commission, the integrative approach to equality has been defined as an active and visible integration of the gender perspective in all sectors and at all levels. The research methods used are the conceptualist method, the logical method and the intuitive method, as means of reasoning needed to argue our demonstration. We have to underline that a systemic analysis of the research methods of the judicial phenomenon does not agree with "value ranking", because one value cannot be generalized in relation to another. At the same time, we must guard against methodological extremism. The final purpose of this study is that all individuals reach the stage of perfection/excellence through the promotion of equality and freedom. This presupposes that a frame favourable to non-discrimination (fairness) is a means and a condition of self-determination, that the state of perfection/excellence is a result of this self-determination, and that securing this non-discrimination frame for all of us, in conditions of freedom for all individuals, is the very condition that promotes the state of perfection/excellence. In conclusion, we may state that the equality principle represents a true catalyst of the

  7. The Principles of HACCP

    Science.gov (United States)

    HACCP is an acronym for Hazard Analysis and Critical Control Point and was initially developed by the Pillsbury Company and NASA. They utilized this program to enhance the safety of the food for manned space flights. The USDA-FSIS implemented the HACCP approach to food safety in the meat and pou...

  8. The principles of HACCP.

    Science.gov (United States)

    The Hazard Analysis and Critical Control Point (HACCP) food safety inspection program is utilized by both USDA Food Safety Inspection Service (FSIS) and FDA for many of the products they regulate. This science-based program was implemented by the USDA FSIS to enhance the food safety of meat and pou...

  9. MULTIPLE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. A. Bosov

    2015-04-01

Full Text Available Purpose. The development of complicated techniques of production and management processes, information systems, computer science, applied objects of systems theory and others requires improvement of mathematical methods and new approaches for the research of application systems. The variety and diversity of subject systems makes it necessary to develop a model that generalizes the classical sets and their development – sets of sets. Multiple objects, unlike sets, are constructed by multiple structures and represented by structure and content. The aim of the work is the analysis of the multiple structures generating multiple objects and the further development of operations on these objects in application systems. Methodology. To achieve the objectives of the research, the structure of multiple objects is represented as a constructive trio consisting of media, signatures and axiomatics. A multiple object is determined by its structure and content, and is represented by a hybrid superposition composed of sets, multi-sets, ordered sets (lists) and heterogeneous sets (sequences, corteges). Findings. In this paper we study the properties and characteristics of the components of hybrid multiple objects of complex systems, propose assessments of their complexity, and show the rules of internal and external operations on the objects of implementation. We introduce a relation of arbitrary order over multiple objects, and we define functions and mappings on objects of multiple structures. Originality. In this paper we consider the development of the multiple structures generating multiple objects. Practical value. The transition from abstract multiple structures to subject ones requires the transformation of the system and of the multiple objects. Transformation involves three successive stages: specification (binding to the domain), interpretation (multiple sites) and particularization (goals). The proposed approach to describing systems, based on hybrid sets

  10. No-Hypersignaling Principle

    Science.gov (United States)

    Dall'Arno, Michele; Brandsen, Sarah; Tosini, Alessandro; Buscemi, Francesco; Vedral, Vlatko

    2017-07-01

    A paramount topic in quantum foundations, rooted in the study of the Einstein-Podolsky-Rosen (EPR) paradox and Bell inequalities, is that of characterizing quantum theory in terms of the spacelike correlations it allows. Here, we show that to focus only on spacelike correlations is not enough: we explicitly construct a toy model theory that, while not contradicting classical and quantum theories at the level of spacelike correlations, still displays an anomalous behavior in its timelike correlations. We call this anomaly, quantified in terms of a specific communication game, the "hypersignaling" phenomena. We hence conclude that the "principle of quantumness," if it exists, cannot be found in spacelike correlations alone: nontrivial constraints need to be imposed also on timelike correlations, in order to exclude hypersignaling theories.

  11. Principles of Lasers

    CERN Document Server

    Svelto, Orazio

    2010-01-01

    This new Fifth Edition of Principles of Lasers incorporates corrections to the previous edition. The text’s essential mission remains the same: to provide a wide-ranging yet unified description of laser behavior, physics, technology, and current applications. Dr. Svelto emphasizes the physical rather than the mathematical aspects of lasers, and presents the subject in the simplest terms compatible with a correct physical understanding. Praise for earlier editions: "Professor Svelto is himself a longtime laser pioneer and his text shows the breadth of his broad acquaintance with all aspects of the field … Anyone mastering the contents of this book will be well prepared to understand advanced treatises and research papers in laser science and technology." (Arthur L. Schawlow, 1981 Nobel Laureate in Physics) "Already well established as a self-contained introduction to the physics and technology of lasers … Professor Svelto’s book, in this lucid translation by David Hanna, can be strongly recommended for...

  12. [Principles of PET].

    Science.gov (United States)

    Beuthien-Baumann, B

    2018-05-01

    Positron emission tomography (PET) is a procedure in nuclear medicine, which is applied predominantly in oncological diagnostics. In the form of modern hybrid machines, such as PET computed tomography (PET/CT) and PET magnetic resonance imaging (PET/MRI) it has found wide acceptance and availability. The PET procedure is more than just another imaging technique, but a functional method with the capability for quantification in addition to the distribution pattern of the radiopharmaceutical, the results of which are used for therapeutic decisions. A profound knowledge of the principles of PET including the correct indications, patient preparation, and possible artifacts is mandatory for the correct interpretation of PET results.

  13. Principles of asymmetric synthesis

    CERN Document Server

    Gawley, Robert E; Aube, Jeffrey

    2012-01-01

    The world is chiral. Most of the molecules in it are chiral, and asymmetric synthesis is an important means by which enantiopure chiral molecules may be obtained for study and sale. Using examples from the literature of asymmetric synthesis, this book presents a detailed analysis of the factors that govern stereoselectivity in organic reactions. After an explanation of the basic physical-organic principles governing stereoselective reactions, the authors provide a detailed, annotated glossary of stereochemical terms. A chapter on "Practical Aspects of Asymmetric Synthesis" provides a critical overview of the most common methods for the preparation of enantiomerically pure compounds, techniques for analysis of stereoisomers using chromatographic, spectroscopic, and chiroptical methods. The authors then present an overview of the most important methods in contemporary asymmetric synthesis organized by reaction type. Thus, there are four chapters on carbon-carbon bond forming reactions, one chapter on reductions...

  14. Principles of modern physics

    CERN Document Server

    Saxena, A K

    2014-01-01

    Principles of Modern Physics, divided into twenty one chapters, begins with quantum ideas followed by discussions on special relativity, atomic structure, basic quantum mechanics, hydrogen atom (and Schrodinger equation) and periodic table, the three statistical distributions, X-rays, physics of solids, imperfections in crystals, magnetic properties of materials, superconductivity, Zeeman-, Stark- and Paschen Back- effects, Lasers, Nuclear physics (Yukawa's meson theory and various nuclear models), radioactivity and nuclear reactions, nuclear fission, fusion and plasma, particle accelerators and detectors, the universe, Elementary particles (classification, eight fold way and quark model, standard model and fundamental interactions), cosmic rays, deuteron problem in nuclear physics, and cathode ray oscilloscope. NEW TO THE FOURTH EDITION: The CO2 Laser Theory of magnetic moments on the basis of shell model Geological dating Laser Induced fusion and laser fusion reactor. Hawking radiation The cosmological red ...

  15. Principles & practice of physics

    CERN Document Server

    Mazur, Eric; Dourmashkin, Peter A; Pedigo, Daryl; Bieniek, Ronald J

    2015-01-01

    Putting physics first Based on his storied research and teaching, Eric Mazur's Principles & Practice of Physics builds an understanding of physics that is both thorough and accessible. Unique organization and pedagogy allow you to develop a true conceptual understanding of physics alongside the quantitative skills needed in the course. *New learning architecture: The book is structured to help you learn physics in an organized way that encourages comprehension and reduces distraction.*Physics on a contemporary foundation: Traditional texts delay the introduction of ideas that we now see as unifying and foundational. This text builds physics on those unifying foundations, helping you to develop an understanding that is stronger, deeper, and fundamentally simpler.*Research-based instruction: This text uses a range of research-based instructional techniques to teach physics in the most effective manner possible. The result is a groundbreaking book that puts physics first, thereby making it more accessible to...

  16. Emulsion Science Basic Principles

    CERN Document Server

    Leal-Calderon, Fernando; Schmitt, Véronique

    2007-01-01

Emulsions are generally made out of two immiscible fluids like oil and water, one being dispersed in the second in the presence of surface-active compounds. They are used as intermediate or end products in a huge range of areas including the food, chemical, cosmetic, pharmaceutical, paint, and coating industries. Besides the broad domain of technological interest, emulsions are raising a variety of fundamental questions at the frontier between physics and chemistry. This book aims to give an overview of the most recent advances in emulsion science. The basic principles, covering aspects of emulsions from their preparation to their destruction, are presented in close relation to both the fundamental physics and the applications of these materials. The book is intended to help scientists and engineers in formulating new materials by giving them the basics of emulsion science.

  17. Principles of Bioenergetics

    CERN Document Server

    Skulachev, Vladimir P; Kasparinsky, Felix O

    2013-01-01

    Principles of Bioenergetics summarizes one of the quickly growing branches of modern biochemistry. Bioenergetics concerns energy transductions occurring in living systems and this book pays special attention to molecular mechanisms of these processes. The main subject of the book is the "energy coupling membrane" which refers to inner membranes of intracellular organelles, for example, mitochondria and chloroplasts. Cellular cytoplasmic membranes where respiratory and photosynthetic energy transducers, as well as ion-transporting ATP-synthases (ATPases) are also part of this membrane. Significant attention is paid to the alternative function of mitochondria as generators of reactive oxygen species (ROS) that mediate programmed death of cells (apoptosis and necrosis) and organisms (phenoptosis). The latter process is considered as a key mechanism of aging which may be suppressed by mitochondria-targeted antioxidants.

  18. Gas cell neutralizers (Fundamental principles)

    International Nuclear Information System (INIS)

    Fuehrer, B.

    1985-06-01

    Neutralizing an ion-beam of the size and energy levels involved in the neutral-particle-beam program represents a considerable extension of the state-of-the-art of neutralizer technology. Many different mediums (e.g., solid, liquid, gas, plasma, photons) can be used to strip the hydrogen ion of its extra electron. A large, multidisciplinary R and D effort will no doubt be required to sort out all of the ''pros and cons'' of these various techniques. The purpose of this particular presentation is to discuss some basic configurations and fundamental principles of the gas type of neutralizer cell. Particular emphasis is placed on the ''Gasdynamic Free-Jet'' neutralizer since this configuration has the potential of being much shorter than other type of gas cells (in the beam direction) and it could operate in nearly a continuous mode (CW) if necessary. These were important considerations in the ATSU design which is discussed in some detail in the second presentation entitled ''ATSU Point Design''

  19. Principles of computational fluid dynamics

    International Nuclear Information System (INIS)

    Wesseling, P.

    2001-01-01

The book is aimed at graduate students, researchers, engineers and physicists involved in flow computations. An up-to-date account is given of the present state-of-the-art of numerical methods employed in computational fluid dynamics. The underlying numerical principles are treated with a fair amount of detail, using elementary mathematical analysis. Attention is given to difficulties arising from geometric complexity of the flow domain and of nonuniform structured boundary-fitted grids. Uniform accuracy and efficiency for singular perturbation problems is studied, pointing the way to accurate computation of flows at high Reynolds number. Much attention is given to stability analysis, and useful stability conditions are provided, some of them new, for many numerical schemes used in practice. Unified methods for compressible and incompressible flows are discussed. Numerical analysis of the shallow-water equations is included. The theory of hyperbolic conservation laws is treated. Godunov's order barrier and how to overcome it by means of slope-limited schemes is discussed. An introduction is given to efficient iterative solution methods, using Krylov subspace and multigrid acceleration. Many pointers are given to recent literature, to help the reader to quickly reach the current research frontier. (orig.)

  20. The nonholonomic variational principle

    Energy Technology Data Exchange (ETDEWEB)

    Krupkova, Olga [Department of Algebra and Geometry, Faculty of Science, Palacky University, Tomkova 40, 779 00 Olomouc (Czech Republic); Department of Mathematics, La Trobe University, Bundoora, Victoria 3086 (Australia)], E-mail: krupkova@inf.upol.cz

    2009-05-08

A variational principle for mechanical systems and fields subject to nonholonomic constraints is found, providing Chetaev-reduced equations as equations for extremals. Investigating nonholonomic variations of the Chetaev type and their properties, we develop foundations of the calculus of variations on constraint manifolds, modelled as fibred submanifolds in jet bundles. This setting is appropriate to study general first-order 'nonlinear nonintegrable constraints' that locally are given by a system of first-order ordinary or partial differential equations. We obtain an invariant constrained first variation formula and constrained Euler-Lagrange equations both in intrinsic and coordinate forms, and show that the equations are the same as Chetaev equations 'without Lagrange multipliers', introduced recently by other methods. We pay attention to two possible settings: first, when the constrained system arises from an unconstrained Lagrangian system defined in a neighbourhood of the constraint, and second, more generally, when an 'internal' constrained system on the constraint manifold is given. In the latter case a corresponding unconstrained system need not be a Lagrangian, nor even exist. We also study in detail an important particular case: nonholonomic constraints that can be alternatively modelled by means of (co)distributions in the total space of the fibred manifold; in nonholonomic mechanics this happens whenever constraints affine in velocities are considered. It becomes clear that (and why) if the distribution is completely integrable (= the constraints are semiholonomic), the principle of virtual displacements holds and can be used to obtain the constrained first variational formula by a more or less standard procedure, traditionally used when unconstrained or holonomic systems are concerned. If, however, the constraint is nonintegrable, no significant simplifications are available. Among others, some properties of nonholonomic
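
    For orientation, the classical multiplier form of the Chetaev equations for a single constraint φ(t, q, q̇) = 0 is sketched below in LaTeX; the abstract's point is precisely that the reduced equations can be obtained without the multipliers, so this should be read as standard background notation rather than the paper's own formulation.

        \frac{d}{dt}\frac{\partial L}{\partial \dot q^i} - \frac{\partial L}{\partial q^i}
            = \lambda\,\frac{\partial \phi}{\partial \dot q^i},
        \qquad \phi(t, q, \dot q) = 0,
        \qquad \text{with Chetaev virtual displacements satisfying }\;
            \frac{\partial \phi}{\partial \dot q^i}\,\delta q^i = 0.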

  1. Dynamical principles in neuroscience

    International Nuclear Information System (INIS)

    Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.

    2006-01-01

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

  2. Dynamical principles in neuroscience

    Science.gov (United States)

    Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.

    2006-10-01

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

  3. Fault Management Guiding Principles

    Science.gov (United States)

    Newhouse, Marilyn E.; Friberg, Kenneth H.; Fesq, Lorraine; Barley, Bryan

    2011-01-01

Regardless of the mission type: deep space or low Earth orbit, robotic or human spaceflight, Fault Management (FM) is a critical aspect of NASA space missions. As the complexity of space missions grows, the complexity of supporting FM systems increases in turn. Data on recent NASA missions show that development of FM capabilities is a common driver for significant cost overruns late in the project development cycle. Efforts to understand the drivers behind these cost overruns, spearheaded by NASA's Science Mission Directorate (SMD), indicate that they are primarily caused by the growing complexity of FM systems and the lack of maturity of FM as an engineering discipline. NASA can and does develop FM systems that effectively protect mission functionality and assets. The cost growth results from a lack of FM planning and emphasis by project management, as well as the maturity of FM as an engineering discipline, which lags behind the maturity of other engineering disciplines. As a step towards controlling the cost growth associated with FM development, SMD has commissioned a multi-institution team to develop a practitioner's handbook representing best practices for the end-to-end processes involved in engineering FM systems. While currently concentrating primarily on FM for science missions, the expectation is that this handbook will grow into a NASA-wide handbook, serving as a companion to the NASA Systems Engineering Handbook. This paper presents a snapshot of the principles that have been identified to guide FM development from cradle to grave. The principles range from considerations for integrating FM into the project and SE organizational structure, to the relationship between FM designs and mission risk, and the use of the various tools of FM (e.g., redundancy) to meet the FM goal of protecting mission functionality and assets.

  4. Visualizing Matrix Multiplication

    Science.gov (United States)

    Daugulis, Peteris; Sondore, Anita

    2018-01-01

    Efficient visualizations of computational algorithms are important tools for students, educators, and researchers. In this article, we point out an innovative visualization technique for matrix multiplication. This method differs from the standard, formal approach by using block matrices to make computations more visual. We find this method a…
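
    A small sketch of the block-matrix view of matrix multiplication described above: multiplying 2x2 blocks reproduces the product of the full matrices. The matrices themselves are arbitrary examples, not taken from the article.

        import numpy as np

        rng = np.random.default_rng(3)
        A, B = rng.integers(0, 5, (4, 4)), rng.integers(0, 5, (4, 4))

        # Split each 4x4 matrix into four 2x2 blocks
        A11, A12, A21, A22 = A[:2, :2], A[:2, 2:], A[2:, :2], A[2:, 2:]
        B11, B12, B21, B22 = B[:2, :2], B[:2, 2:], B[2:, :2], B[2:, 2:]

        # Blockwise product: each block of C is a sum of block products
        C_blocks = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
                             [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])

        assert np.array_equal(C_blocks, A @ B)   # identical to the ordinary product
        print(C_blocks)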

  5. Applying bioethical principles to human biomonitoring

    Directory of Open Access Journals (Sweden)

    Harrison Myron

    2008-01-01

Full Text Available Abstract Bioethical principles are widely used as a normative framework in areas of human research and medical care. In recent years there has been increasing formalization of their use in public health decisions. The "traditional bioethical principles" are applied in this discussion to the important issue of human biomonitoring for environmental exposures. They are: (1) Autonomy – Also known as the "respect for humans" principle, people understand their own best interests; (2) Beneficence – "do good" for people; (3) Nonmaleficence – "do no harm"; (4) Justice – fair distribution of benefits and costs (including risks to health) across stakeholders. Some of the points made are: (1) There is not a single generic bioethical analysis applicable to the use of human biomonitoring data, each specific use requires a separate deliberation; (2) Using unidentified, population-based biomonitoring information for risk assessment or population surveillance raises fewer bioethical concerns than personally identified biomonitoring information such as employed in health screening; (3) Companies should proactively apply normative bioethical principles when considering the disposition of products and by-products in the environment and humans; (4) There is a need for more engagement by scholars on the bioethical issues raised by the use of biomarkers of exposure; (5) Though our scientific knowledge of biology will continue to increase, there will always be a role for methods or frameworks to resolve substantive disagreements in the meaning of this data that are matters of belief rather than knowledge.

  6. Applying bioethical principles to human biomonitoring

    Directory of Open Access Journals (Sweden)

    Harrison Myron

    2008-06-01

Full Text Available Abstract Bioethical principles are widely used as a normative framework in areas of human research and medical care. In recent years there has been increasing formalization of their use in public health decisions. The "traditional bioethical principles" are applied in this discussion to the important issue of human biomonitoring for environmental exposures. They are: (1) Autonomy – Also known as the "respect for humans" principle, people understand their own best interests; (2) Beneficence – "do good" for people; (3) Nonmaleficence – "do no harm"; (4) Justice – fair distribution of benefits and costs (including risks to health) across stakeholders. Some of the points made are: (1) There is not a single generic bioethical analysis applicable to the use of human biomonitoring data, each specific use requires a separate deliberation; (2) Using unidentified, population-based biomonitoring information for risk assessment or population surveillance raises fewer bioethical concerns than personally identified biomonitoring information such as employed in health screening; (3) Companies should proactively apply normative bioethical principles when considering the disposition of products and by-products in the environment and humans; (4) There is a need for more engagement by scholars on the bioethical issues raised by the use of biomarkers of exposure; (5) Though our scientific knowledge of biology will continue to increase, there will always be a role for methods or frameworks to resolve substantive disagreements in the meaning of this data that are matters of belief rather than knowledge.

  7. "Point de suspension"

    CERN Multimedia

    2004-01-01

    CERN - Globe of Science and Innovation 20 and 21 October Acrobatics, mime, a cappella singing, projections of images, a magical setting... a host of different tools of a grandeur matching that of the Universe they relate. A camera makes a massive zoom out to reveal the multiple dimensions of Nature. Freeze the frame: half way between the infinitesimally small and the infinitesimally large, a man suspends his everyday life (hence the title "Point de Suspension", which refers to the three dots at the end of an uncompleted sentence) to take a glimpse of the place he occupies in the great history of the Universe. An unusual perspective on what it means to be a human being... This wondrous show in the Globe of Science and Innovation, specially created by the Miméscope* company for the official ceremony marking CERN's fiftieth anniversary, is a gift from the Government of the Republic and Canton of Geneva, which also wishes to share this moment of wonder with the local population. There will be three perfo...

  8. "Point de suspension"

    CERN Multimedia

    2004-01-01

    http://www.cern.ch/cern50/ CERN - Globe of Science and Innovation 20 and 21 October Acrobatics, mime, a cappella singing, projections of images, a magical setting... a host of different tools of a grandeur matching that of the Universe they relate. A camera makes a massive zoom out to reveal the multiple dimensions of Nature. Freeze the frame: half way between the infinitesimally small and the infinitesimally large, a man suspends his everyday life (hence the title "Point de Suspension", which refers to the three dots at the end of an uncompleted sentence) to take a glimpse of the place he occupies in the great history of the Universe. An unusual perspective on what it means to be a human being... This wondrous show in the Globe of Science and Innovation, specially created by the Miméscope* company for the official ceremony marking CERN's fiftieth anniversary, is a gift from the Government of the Republic and Canton of Geneva, which also wishes to share this moment of wonder with the local pop...

  9. "Point de suspension"

    CERN Multimedia

    2004-01-01

    CERN - Globe of Science and Innovation 20 and 21 October Acrobatics, mime, a cappella singing, projections of images, a magical setting... a host of different tools of a grandeur matching that of the Universe they relate. A camera makes a massive zoom out to reveal the multiple dimensions of Nature. Freeze the frame: half way between the infinitesimally small and the infinitesimally large, a man suspends his everyday life (hence the title "Point de Suspension", which refers to the three dots at the end of an uncompleted sentence) to take a glimpse of the place he occupies in the great history of the Universe. An unusual perspective on what it means to be a human being... This spectacle in the Globe of Science and Innovation, specially created by the Miméscope* company for the official ceremony marking CERN's fiftieth anniversary, is a gift from the Government of the Republic and Canton of Geneva, which also wishes to share this moment of wonder with the local population. There will be three performances for...

  10. The Independence of Markov's Principle in Type Theory

    DEFF Research Database (Denmark)

    Coquand, Thierry; Mannaa, Bassel

    2017-01-01

In this paper, we show that Markov's principle is not derivable in dependent type theory with natural numbers and one universe. One way to prove this would be to remark that Markov's principle does not hold in a sheaf model of type theory over Cantor space, since Markov's principle does not hold for the generic point of this model. Instead we design an extension of type theory, which intuitively extends type theory by the addition of a generic point of Cantor space. We then show the consistency of this extension by a normalization argument. Markov's principle does not hold in this extension, and it follows that it cannot be proved in type theory.
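
    For reference, the standard statement of Markov's principle for decidable predicates on the natural numbers, written here in LaTeX; this is the usual general formulation, not a quotation from the paper.

        \mathrm{MP}:\qquad \bigl(\forall n\,(P(n)\vee\neg P(n))\bigr)\;\wedge\;\neg\neg\,\exists n\,P(n)\;\longrightarrow\;\exists n\,P(n).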

  11. Core principles of evolutionary medicine

    Science.gov (United States)

    Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E

    2018-01-01

Abstract Background and objectives Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful to advance recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. Methodology The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. Results Fourteen core principles elicited at least 80% of the panelists to agree or strongly agree that they were important core principles for evolutionary medicine. These principles overlapped with concepts discussed in other articles discussing key concepts in evolutionary medicine. Conclusions and implications This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further. PMID:29493660

  12. Reformulation of a stochastic action principle for irregular dynamics

    International Nuclear Information System (INIS)

    Wang, Q.A.; Bangoup, S.; Dzangue, F.; Jeatsa, A.; Tsobnang, F.; Le Mehaute, A.

    2009-01-01

A stochastic action principle for random dynamics is revisited. Numerical diffusion experiments are carried out to show that the diffusion path probability depends exponentially on the Lagrangian action A = ∫_a^b L dt. This result is then used to derive the Shannon measure for path uncertainty. It is shown that the maximum entropy principle and the least action principle of classical mechanics can be unified into δĀ = 0, where the average is calculated over all possible paths of the stochastic motion between two configuration points a and b. It is argued that this action principle and the maximum entropy principle are a consequence of the mechanical equilibrium condition extended to the case of stochastic dynamics.
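To make the notation in this record explicit, the quantities can be written out as below. The exponential path-probability relation is a schematic restatement of the abstract; the constant η is an assumed symbol, not taken from the paper.

```latex
% Lagrangian action along a path between configuration points a and b,
% the exponential dependence of the path probability reported in the
% abstract (\eta is an assumed constant), and the unified variational
% statement, where the bar denotes the average over all stochastic paths.
\[
  A = \int_a^b L\,\mathrm{d}t,
  \qquad
  p_{\text{path}} \;\propto\; e^{\,\eta A},
  \qquad
  \delta \bar{A} = 0 .
\]
```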

  13. Objective Principles of Economics

    OpenAIRE

    Kakarot-Handtke, Egmont

    2014-01-01

    Economists have the habit of solving the wrong problems. They speculate circumstantially about the behavior of agents and do not come to grips with the behavior of the monetary economy. This is the consequence of the methodological imperative that all explanations must run in terms of the actions and reactions of individuals. The critical point is that no way leads from the understanding of the interaction of the individuals to the understanding of the working of the economy as a whole. The s...

  14. Questões éticas na esclerose múltipla sob o ponto de vista de médicos e pacientes Ethical issues in multiple sclerosis under physicians and patients point of view

    Directory of Open Access Journals (Sweden)

    Antonio Paulo Nassar Junior

    2005-03-01

Full Text Available Multiple sclerosis (MS) is a neurological disorder that mainly affects young adults and generally progresses to varying degrees of physical disability. Caring for these patients therefore confronts the physician with several ethical questions. OBJECTIVE: To identify the perceptions of physicians and patients about the disease and thereby improve the doctor-patient relationship. METHOD: Two questionnaires were used, one answered by 44 physicians and the other by 103 patients, addressing questions about the diagnosis and management of MS. RESULTS: 96.1% of the patients knew their diagnosis; the others would like to know it. Of those who knew, 74.7% thought the way they had been told was appropriate and 90.9% believed that it is the physician who should disclose it. The symptoms that bother them most are fatigue (29.1%) and motor deficits (28.1%). On the other hand, 68% of the patients stated that they suffer because of the disease. The most important reason given by physicians for disclosing the diagnosis was to improve adherence to treatment (56.8%). The presence of a family member at that moment is required by 54.6% of the physicians. When asked about counselling on pregnancy, 50% of the physicians did not answer adequately. Finally, 50% of the physicians declared themselves against complementary therapies. CONCLUSION: Patients want to know their diagnosis, and the physician should disclose it in the most appropriate way and provide more information. A debate on palliative care is also needed.

  15. First principles calculations of interstitial and lamellar rhenium nitrides

    International Nuclear Information System (INIS)

    Soto, G.; Tiznado, H.; Reyes, A.; Cruz, W. de la

    2012-01-01

Highlights: ► The possible structures of rhenium nitride as a function of composition are analyzed. ► The alloying energy is favorable for rhenium nitride in lamellar arrangements. ► The structures produced by magnetron sputtering are metastable variations. ► The structures produced by high-pressure, high-temperature synthesis are stable configurations. ► The lamellar structures are a new category of interstitial dissolutions. - Abstract: We report here a systematic first-principles study of two classes of variable-composition rhenium nitride: (i) interstitial rhenium nitride as a solid solution and (ii) rhenium nitride in lamellar structures. The compounds in class (i) are cubic and hexagonal close-packed rhenium phases, with nitrogen in the octahedral and tetrahedral interstices of the metal, and they are formed without changes to the structure, except for slight distortions of the unit cells. In the compounds in class (ii), by contrast, the nitrogen inclusion provokes stacking faults in the parent metal structure. These faults create trigonal-prismatic sites where the nitrogen residence is energetically favored. This second class of compounds produces lamellar structures, where the nitrogen lamellas are inserted among multiple rhenium layers. The Re3N and Re2N phases produced recently by high-temperature and high-pressure synthesis belong to this class. The ratio of the nitrogen layers to the rhenium layers is given by the composition. While the first-principles calculations point to higher stability for the lamellar structures as opposed to the interstitial phases, the experimental evidence presented here demonstrates that the interstitial classes are synthesizable by plasma methods. We conclude that rhenium nitrides possess polymorphism and that the two-dimensional lamellar structures might represent an emerging class of materials within binary nitride chemistry.

  16. Anthropic principle in biology and radiation biology

    International Nuclear Information System (INIS)

    Akif'ev, A. P.; Degtyarev, S.V.

    1999-01-01

It was suggested that the set of fundamental constants be supplemented with certain biological constants, extending the anthropic principle of the Universe, according to which the physical constants of the fundamental particles of matter and the laws of their interaction are such that the appearance of man and mind becomes possible and necessary. Using the repair of DNA as an example, it was shown how a cell controls some parameters of the Watson-Crick double helix. It was pointed out that the concept of the anthropic principle of the Universe in its full form, including biological constants, is a key to developing a unified theory of the evolution of the Universe within the limits of scientific creationism [ru

  17. Análise dos casos de corrupção na Petrobras sob a ótica dos princípios regulatórios propostos por Joseph Stiglitz / Analysis of the cases of corruption in Petrobras from the point of view of the regulatory principles proposed by Joseph Stiglitz

    Directory of Open Access Journals (Sweden)

    Fernando Antônio da Silva Falcão

    2017-04-01

    Full Text Available Purpose – To find, in light of the principles of regulation proposed by Joseph Stiglitz, how government and market failures contributed to the occurrence of corruption cases in Petrobras. Methodology/approach/design – This article analyzes corruption scandals in Petrobras based on the information gathered in Operation Car Wash and, based on the principles of regulation established by Joseph Stiglitz, critically verifies how market and government failures contributed to the fraud, collusion and corruption in the state-owned enterprise. Findings – Market and government failures contributed decisively to the occurrence of corruption in the state-owned enterprise. Practical implications – The present study can help designing a regulatory framework that effectively discourages the practice of fraud, collusion and corruption in public bids.

  18. A first-principles approach to finite temperature elastic constants

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y; Wang, J J; Zhang, H; Manga, V R; Shang, S L; Chen, L-Q; Liu, Z-K [Department of Materials Science and Engineering, Pennsylvania State University, University Park, PA 16802 (United States)

    2010-06-09

A first-principles approach to calculating the elastic stiffness coefficients at finite temperatures was proposed. It is based on the assumption that the temperature dependence of elastic stiffness coefficients mainly results from volume change as a function of temperature; it combines the first-principles calculations of elastic constants at 0 K and the first-principles phonon theory of thermal expansion. Its applications to elastic constants of Al, Cu, Ni, Mo, Ta, NiAl, and Ni3Al from 0 K up to their respective melting points show excellent agreement between the predicted values and existing experimental measurements.
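The central assumption of this approach lends itself to a very short numerical illustration. The sketch below is not the authors' code and all numbers are placeholders; it simply combines a 0 K elastic-constant table with a quasi-harmonic V(T) curve so that C11(T) is obtained as C11(V(T)).

```python
# Minimal sketch (not the authors' code): the temperature dependence of the
# elastic stiffness enters only through the thermal-expansion volume V(T),
# i.e. C11(T) ~= C11(V(T)). All numerical values are hypothetical placeholders.
import numpy as np

# 0 K first-principles results: C11 tabulated on a volume grid
volumes_0K = np.array([15.8, 16.2, 16.6, 17.0, 17.4])     # A^3 per atom (hypothetical)
c11_0K     = np.array([120.0, 112.0, 104.0, 96.0, 88.0])  # GPa (hypothetical)

# Quasi-harmonic phonon result: equilibrium volume as a function of temperature
temps    = np.array([0.0, 300.0, 600.0, 900.0])            # K
volume_T = np.array([16.2, 16.35, 16.55, 16.8])            # A^3 per atom (hypothetical)

def c11_at_temperature(T):
    """C11(T) obtained by evaluating the 0 K volume dependence at V(T)."""
    v = np.interp(T, temps, volume_T)        # V(T) from thermal expansion
    return np.interp(v, volumes_0K, c11_0K)  # C11(V) from the 0 K calculations

for T in (0.0, 300.0, 900.0):
    print(f"T = {T:6.1f} K  ->  C11 ~ {c11_at_temperature(T):6.1f} GPa")
```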

  19. A first-principles approach to finite temperature elastic constants

    International Nuclear Information System (INIS)

    Wang, Y; Wang, J J; Zhang, H; Manga, V R; Shang, S L; Chen, L-Q; Liu, Z-K

    2010-01-01

A first-principles approach to calculating the elastic stiffness coefficients at finite temperatures was proposed. It is based on the assumption that the temperature dependence of elastic stiffness coefficients mainly results from volume change as a function of temperature; it combines the first-principles calculations of elastic constants at 0 K and the first-principles phonon theory of thermal expansion. Its applications to elastic constants of Al, Cu, Ni, Mo, Ta, NiAl, and Ni3Al from 0 K up to their respective melting points show excellent agreement between the predicted values and existing experimental measurements.

  20. Pedagogical Principles in Online Teaching

    DEFF Research Database (Denmark)

    Beckmann, Suzanne C.; Uth Thomsen, Thyra; von Wallpach, Sylvia

    of the seven pedagogical principles that govern the teaching at our university. We also present a case study that illustrates how both opportunities and challenges were met in two “first-mover” fully online courses during Fall 2014. The experiences from this case study are discussed in terms of to what extent...... they met the pedagogical principles and observations unrelated to the pedagogical principle are shared....

  1. Great Lakes Literacy Principles

    Science.gov (United States)

    Fortner, Rosanne W.; Manzo, Lyndsey

    2011-03-01

    Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These “Great Lakes Literacy Principles” represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.

  2. Advertisement without Ethical Principles?

    Directory of Open Access Journals (Sweden)

    Wojciech Słomski

    2007-10-01

Full Text Available The article addresses the question of whether advertisement can exist without ethical principles or whether ethics should be the basis of advertisement. One can say that the ethical assessment of an advertisement does not depend exclusively on the content and form of the advertising message, but also on the recipients' consciousness. Advertisement appeals to the emotions more than to the intellect, thus restricting the scope of conscious choice based on rational premises, and in that respect it is morally bad. It is not that moral evil is immanent in advertisement itself; rather, it concerns the mechanisms which make advertisement effective. The only admissible form of advertisement would be reliable, complete information about the advantages and flaws of the specific advertised product. The most serious difficulty connected with the ethical assessment of advertisement is the fact that advertisement is an indispensable link in the present economy, and everyone who accepts the free market and perceives the benefits of economic growth should also accept advertisement. Advertisement is an element of economic activity, so the responsibility for its far-reaching results lies first of all with enterprises.

  3. Principles of Bioremediation Assessment

    Science.gov (United States)

    Madsen, E. L.

    2001-12-01

    Although microorganisms have successfully and spontaneously maintained the biosphere since its inception, industrialized societies now produce undesirable chemical compounds at rates that outpace naturally occurring microbial detoxification processes. This presentation provides an overview of both the complexities of contaminated sites and methodological limitations in environmental microbiology that impede the documentation of biodegradation processes in the field. An essential step toward attaining reliable bioremediation technologies is the development of criteria which prove that microorganisms in contaminated field sites are truly active in metabolizing contaminants of interest. These criteria, which rely upon genetic, biochemical, physiological, and ecological principles and apply to both in situ and ex situ bioremediation strategies include: (i) internal conservative tracers; (ii) added conservative tracers; (iii) added radioactive tracers; (iv) added isotopic tracers; (v) stable isotopic fractionation patterns; (vi) detection of intermediary metabolites; (vii) replicated field plots; (viii) microbial metabolic adaptation; (ix) molecular biological indicators; (x) gradients of coreactants and/or products; (xi) in situ rates of respiration; (xii) mass balances of contaminants, coreactants, and products; and (xiii) computer modeling that incorporates transport and reactive stoichiometries of electron donors and acceptors. The ideal goal is achieving a quantitative understanding of the geochemistry, hydrogeology, and physiology of complex real-world systems.

  4. Quantum principles and particles

    CERN Document Server

    Wilcox, Walter

    2012-01-01

QUANTUM PRINCIPLES: Perspective and Principles; Prelude to Quantum Mechanics; Stern-Gerlach Experiment; Idealized Stern-Gerlach Results; Classical Model Attempts; Wave Functions for Two Physical-Outcome Case; Process Diagrams, Operators, and Completeness; Further Properties of Operators/Modulation; Operator Reformulation; Operator Rotation; Bra-Ket Notation/Basis States; Transition Amplitudes; Three-Magnet Setup Example-Coherence; Hermitian Conjugation; Unitary Operators; A Very Special Operator; Matrix Representations; Matrix Wave Function Recovery; Expectation Values; Wrap Up; Problems. Free Particles in One Dimension: Photoelectric Effect; Compton Effect; Uncertainty Relation for Photons; Stability of Ground States; Bohr Model; Fourier Transform and Uncertainty Relations; Schrödinger Equation; Schrödinger Equation Example; Dirac Delta Functions; Wave Functions and Probability; Probability Current; Time Separable Solutions; Completeness for Particle States; Particle Operator Properties; Operator Rules; Time Evolution and Expectation Values; Wrap-Up; Problems. Some One-Dimensional So...

  5. The iceberg principles

    CERN Document Server

    Spencer-Devlin, Marni

    2013-01-01

    The Iceberg Principles connect spirituality and science in a way that proves that the energy, which is the substance of the Universe, really is Love - not sweet, syrupy, candy-and-roses kind of love but the most powerful force in the Universe. Love without expression is meaningless. This is why the Big Bang was the only logical outcome. Love had to become reflected in dimensionality. With the Big Bang a 4:96 ratio was created between the dimensional and non-dimensional realms. This ratio between visibility and invisibility the ratio of an iceberg also applies to human beings. Only four percent of who we are is visible. Our physical DNA describes us but it does not define us. What defines us are our characteristics, our gifts, and talents - the spiritual DNA. This is invisible but makes up ninety-six percent of who we are. Our talents are not accidental; our life purpose is to express them. Just as the Universe emerges into dimensionality, constantly creating galaxies at millions of miles a minute, we are al...

  6. Principles of alternative gerontology

    Science.gov (United States)

    Bilinski, Tomasz; Bylak, Aneta; Zadrag-Tecza, Renata

    2016-01-01

    Surveys of taxonomic groups of animals have shown that contrary to the opinion of most gerontologists aging is not a genuine trait. The process of aging is not universal and its mechanisms have not been widely conserved among species. All life forms are subject to extrinsic and intrinsic destructive forces. Destructive effects of stochastic events are visible only when allowed by the specific life program of an organism. Effective life programs of immortality and high longevity eliminate the impact of unavoidable damage. Organisms that are capable of agametic reproduction are biologically immortal. Mortality of an organism is clearly associated with terminal specialisation in sexual reproduction. The longevity phenotype that is not accompanied by symptoms of senescence has been observed in those groups of animals that continue to increase their body size after reaching sexual maturity. This is the result of enormous regeneration abilities of both of the above-mentioned groups. Senescence is observed when: (i) an organism by principle switches off the expression of existing growth and regeneration programs, as in the case of imago formation in insect development; (ii) particular programs of growth and regeneration of progenitors are irreversibly lost, either partially or in their entirety, in mammals and birds. “We can't solve problems by using the same kind of thinking we used when we created them.” (Ascribed to Albert Einstein) PMID:27017907

  7. Electrical and electronic principles and technology

    CERN Document Server

    John Bird

    2013-01-01

This much-loved textbook introduces electrical and electronic principles and technology to students who are new to the subject. Real-world situations and engineering examples put the theory into context. The inclusion of worked problems with solutions really helps aid your understanding, and further problems then allow you to test and confirm you have mastered each subject. In total the book contains 410 worked problems, 540 further problems, 340 multiple-choice questions, 455 short-answer questions, and 7 revision tests with answers online. This is an ideal text for vocational courses enabling a s

  8. Experimental toxicology: the basic principles

    National Research Council Canada - National Science Library

    Anderson, Diana; Conning, D. M

    1988-01-01

    Principles and methods are discussed in detail, covering experimental design, biochemical issues, animal husbandry, species differences, immunological issues, carcinogenesis, reproductive approaches...

  9. The Principle of General Tovariance

    Science.gov (United States)

    Heunen, C.; Landsman, N. P.; Spitters, B.

    2008-06-01

    We tentatively propose two guiding principles for the construction of theories of physics, which should be satisfied by a possible future theory of quantum gravity. These principles are inspired by those that led Einstein to his theory of general relativity, viz. his principle of general covariance and his equivalence principle, as well as by the two mysterious dogmas of Bohr's interpretation of quantum mechanics, i.e. his doctrine of classical concepts and his principle of complementarity. An appropriate mathematical language for combining these ideas is topos theory, a framework earlier proposed for physics by Isham and collaborators. Our principle of general tovariance states that any mathematical structure appearing in the laws of physics must be definable in an arbitrary topos (with natural numbers object) and must be preserved under so-called geometric morphisms. This principle identifies geometric logic as the mathematical language of physics and restricts the constructions and theorems to those valid in intuitionism: neither Aristotle's principle of the excluded third nor Zermelo's Axiom of Choice may be invoked. Subsequently, our equivalence principle states that any algebra of observables (initially defined in the topos Sets) is empirically equivalent to a commutative one in some other topos.

  10. Using Principles of Programmed Instruction

    Science.gov (United States)

    Huffman, Harry

    1971-01-01

    Although programmed instruction in accounting is available, it is limited in scope and in acceptance. Teachers, however, may apply principles of programming to the individualizing of instruction. (Author)

  11. Fundamental principles of quantum theory

    International Nuclear Information System (INIS)

    Bugajski, S.

    1980-01-01

    After introducing general versions of three fundamental quantum postulates - the superposition principle, the uncertainty principle and the complementarity principle - the question of whether the three principles are sufficiently strong to restrict the general Mackey description of quantum systems to the standard Hilbert-space quantum theory is discussed. An example which shows that the answer must be negative is constructed. An abstract version of the projection postulate is introduced and it is demonstrated that it could serve as the missing physical link between the general Mackey description and the standard quantum theory. (author)

  12. Multiple sclerosis

    International Nuclear Information System (INIS)

    Grunwald, I.Q.; Kuehn, A.L.; Backens, M.; Papanagiotou, P.; Shariat, K.; Kostopoulos, P.

    2008-01-01

    Multiple sclerosis is the most common chronic inflammatory disease of myelin with interspersed lesions in the white matter of the central nervous system. Magnetic resonance imaging (MRI) plays a key role in the diagnosis and monitoring of white matter diseases. This article focuses on key findings in multiple sclerosis as detected by MRI. (orig.) [de

  13. Rehabilitation and multiple sclerosis

    DEFF Research Database (Denmark)

    Dalgas, Ulrik

    2011-01-01

    In a chronic and disabling disease like multiple sclerosis, rehabilitation becomes of major importance in the preservation of physical, psychological and social functioning. Approximately 80% of patients have multiple sclerosis for more than 35 years and most will develop disability at some point......, a paradigm shift is taking place and it is now increasingly acknowledged that exercise therapy is both safe and beneficial. Robot-assisted training is also attracting attention in multiple sclerosis rehabilitation. Several sophisticated commercial robots exist, but so far the number of scientific studies...... promising. This drug has been shown to improve walking ability in some patients with multiple sclerosis, associated with a reduction of patients' self-reported ambulatory disability. Rehabilitation strategies involving these different approaches, or combinations of them, may be of great use in improving...

  14. The principle of life

    International Nuclear Information System (INIS)

    Kelly, P.K.; Leinen, J.

    1982-01-01

    The chapters ''History of the movement'', ''How war against man relates to war against the environment'', ''The hurdles'', ''The strategic controversy'' point out that the slow death of mankind by the destruction of the conditions of life is certain, the sudden one by a nuclear war probable. This, of course, applies only to those who have not fallen victim to starvation before. It is something everybody knows, especially there ''in change''; yet the old jog-trot is going on. The book reports how people who fear the worst are forming a new power and advocating a change. The battle is about new majorities that have to be won. (orig./HSCH) [de

  15. A fixed-point farrago

    CERN Document Server

    Shapiro, Joel H

    2016-01-01

    This text provides an introduction to some of the best-known fixed-point theorems, with an emphasis on their interactions with topics in analysis. The level of exposition increases gradually throughout the book, building from a basic requirement of undergraduate proficiency to graduate-level sophistication. Appendices provide an introduction to (or refresher on) some of the prerequisite material and exercises are integrated into the text, contributing to the volume’s ability to be used as a self-contained text. Readers will find the presentation especially useful for independent study or as a supplement to a graduate course in fixed-point theory. The material is split into four parts: the first introduces the Banach Contraction-Mapping Principle and the Brouwer Fixed-Point Theorem, along with a selection of interesting applications; the second focuses on Brouwer’s theorem and its application to John Nash’s work; the third applies Brouwer’s theorem to spaces of infinite dimension; and the fourth rests ...
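As a concrete illustration of the Banach Contraction-Mapping Principle that opens the book, the short sketch below iterates a contraction on the real line to its unique fixed point. The example map and the code are our own illustration, not material from the text.

```python
# A contraction on a complete metric space has a unique fixed point, reached by
# iterating the map from any starting point (Banach Contraction-Mapping Principle).
import math

def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x_{n+1} = f(x_n) until successive iterates agree within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

# f(x) = cos(x) is a contraction on [0, 1] (|f'(x)| <= sin(1) < 1),
# so the iteration converges to its unique fixed point there.
print(fixed_point(math.cos, x0=1.0))   # approximately 0.7390851332
```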

  16. Acid dew point measurements in combustion gases using the dew point measuring system AH 85100

    Energy Technology Data Exchange (ETDEWEB)

    Fehler, D.

    1984-01-01

Measuring system for continuous monitoring of the SO2/SO3 dew point in the flue gas, characterized by a low failure rate, applicability inside the flue gas duct, maintenance-free continuous operation, and self-cleaning. The measuring principle is the cooling of the sensor element down to the 'onset of condensation' message. Sensor surface temperatures are listed and evaluated as flue gas dew point temperatures. The measuring system is described. (DOMA).
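The measuring principle described here, cooling the sensor surface until the onset of condensation is reported and reading the surface temperature at that moment as the dew point, can be sketched schematically as below. All function names and numbers are hypothetical placeholders and do not describe the actual AH 85100 interface.

```python
# Schematic sketch of the measuring principle only: cooling power is increased
# stepwise until an 'onset of condensation' signal appears, and the sensor
# surface temperature at that moment is taken as the flue-gas acid dew point.
# Every callable and constant here is an assumed placeholder, not the AH 85100 API.

def measure_acid_dew_point(set_cooling_power, condensation_detected,
                           read_surface_temperature, step=1.0, max_power=100.0):
    """Increase cooling until condensation onset; return the surface temperature then."""
    power = 0.0
    while power <= max_power:
        set_cooling_power(power)
        if condensation_detected():
            return read_surface_temperature()   # interpreted as the dew point
        power += step
    return None   # dew point not reached within the available cooling range

# Simulated stand-in for the sensor, just to exercise the loop:
state = {"surface_temp": 180.0}                 # deg C, cools as power rises (assumed)
demo = measure_acid_dew_point(
    set_cooling_power=lambda p: state.update(surface_temp=180.0 - 0.6 * p),
    condensation_detected=lambda: state["surface_temp"] <= 135.0,   # assumed onset
    read_surface_temperature=lambda: state["surface_temp"],
)
print(f"measured acid dew point: {demo:.1f} deg C (simulated)")
```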

  17. Unbounded critical points for a class of lower semicontinuous functionals

    OpenAIRE

    Pellacci, Benedetta; Squassina, Marco

    2003-01-01

    In this paper we prove existence and multiplicity results of unbounded critical points for a general class of weakly lower semicontinuous functionals. We will apply a suitable nonsmooth critical point theory.

  18. The inconstant "principle of constancy".

    Science.gov (United States)

    Kanzer, M

    1983-01-01

    A review of the principle of constancy, as it appeared in Freud's writings, shows that it was inspired by his clinical observations, first with Breuer in the field of cathartic therapy and then through experiences in the early usage of psychoanalysis. The recognition that memories repressed in the unconscious created increasing tension, and that this was relieved with dischargelike phenomena when the unconscious was made conscious, was the basis for his claim to originality in this area. The two principles of "neuronic inertia" Freud expounded in the Project (1895), are found to offer the key to the ambiguous definition of the principle of constancy he was to offer in later years. The "original" principle, which sought the complete discharge of energy (or elimination of stimuli), became the forerunner of the death drive; the "extended" principle achieved balances that were relatively constant, but succumbed in the end to complete discharge. This was the predecessor of the life drives. The relation between the constancy and pleasure-unpleasure principles was maintained for twenty-five years largely on an empirical basis which invoked the concept of psychophysical parallelism between "quantity" and "quality." As the links between the two principles were weakened by clinical experiences attendant upon the growth of ego psychology, a revision of the principle of constancy was suggested, and it was renamed the Nirvana principle. Actually it was shifted from alignment with the "extended" principle of inertia to the original, so that "constancy" was incongruously identified with self-extinction. The former basis for the constancy principle, the extended principle of inertia, became identified with Eros. Only a few commentators seem aware of this radical transformation, which has been overlooked in the Standard Edition of Freud's writings. Physiological biases in the history and conception of the principle of constancy are noted in the Standard Edition. The historical

  19. Principles of animal extrapolation

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, E.J.

    1991-01-01

    Animal Extrapolation presents a comprehensive examination of the scientific issues involved in extrapolating results of animal experiments to human response. This text attempts to present a comprehensive synthesis and analysis of the host of biomedical and toxicological studies of interspecies extrapolation. Calabrese's work presents not only the conceptual basis of interspecies extrapolation, but also illustrates how these principles may be better used in selection of animal experimentation models and in the interpretation of animal experimental results. The book's theme centers around four types of extrapolation: (1) from average animal model to the average human; (2) from small animals to large ones; (3) from high-risk animal to the high risk human; and (4) from high doses of exposure to lower, more realistic, doses. Calabrese attacks the issues of interspecies extrapolation by dealing individually with the factors which contribute to interspecies variability: differences in absorption, intestinal flora, tissue distribution, metabolism, repair mechanisms, and excretion. From this foundation, Calabrese then discusses the heterogeneticity of these same factors in the human population in an attempt to evaluate the representativeness of various animal models in light of interindividual variations. In addition to discussing the question of suitable animal models for specific high-risk groups and specific toxicological endpoints, the author also examines extrapolation questions related to the use of short-term tests to predict long-term human carcinogenicity and birth defects. The book is comprehensive in scope and specific in detail; for those environmental health professions seeking to understand the toxicological models which underlay health risk assessments, Animal Extrapolation is a valuable information source.

  20. The equivalence principle

    International Nuclear Information System (INIS)

    Smorodinskij, Ya.A.

    1980-01-01

The prerelativistic history of the equivalence principle (EP) is presented briefly, and its role in the discovery of the general relativity theory (G.R.T.) is elucidated. According to modern measurements, the ratio of inertial and gravitational masses does not differ from 1 to at least 12 decimal places. Attention is paid to the difference between the gravitational field and the electromagnetic one: the energy of the gravitational field distributed in space is itself a source of the field, so gravitational fields always interact when superposed, whereas electromagnetic fields from different sources simply add together. On the basis of the EP it is established that the Sun's field interacts with the Earth's gravitational energy in the same way as with any other energy, which proves that the gravitational field itself gravitates toward a heavy body. The motion of a gyroscope in the Earth's gravitational field is presented as a paradox. The calculation shows that a gyroscope on a satellite undergoes a positive precession: its axis turns by an angle equal to α during one revolution of the satellite around the Earth, and, because of the space curvature, by an angle twice as large, so the resulting turn equals 3α. On the basis of the EP it is shown that the polarization plane does not turn, in any coordinate system, when a ray of light passes through a gravitational field. Along with the historical value of the EP, the necessity of taking the requirements imposed by the EP into account in describing the physical world is noted

  1. Double meanings will not save the principle of double effect.

    Science.gov (United States)

    Douglas, Charles D; Kerridge, Ian H; Ankeny, Rachel A

    2014-06-01

    In an article somewhat ironically entitled "Disambiguating Clinical Intentions," Lynn Jansen promotes an idea that should be bewildering to anyone familiar with the literature on the intention/foresight distinction. According to Jansen, "intention" has two commonsense meanings, one of which is equivalent to "foresight." Consequently, questions about intention are "infected" with ambiguity-people cannot tell what they mean and do not know how to answer them. This hypothesis is unsupported by evidence, but Jansen states it as if it were accepted fact. In this reply, we make explicit the multiple misrepresentations she has employed to make her hypothesis seem plausible. We also point out the ways in which it defies common sense. In particular, Jansen applies her thesis only to recent empirical research on the intentions of doctors, totally ignoring the widespread confusion that her assertion would imply in everyday life, in law, and indeed in religious and philosophical writings concerning the intention/foresight distinction and the Principle of Double Effect. © The Author 2014. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. A New principle for an all digital preamplifier and equalizer

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1986-01-01

A new principle for an all digital preamplifier and equalizer, to be used together with a compact disc player, is described. The principle makes it possible to obtain an arbitrary gain transfer function together with a linear phase. The gain can be varied 20 dB from point to point, when specified...... on a logarithmic frequency axis with 30 divisions from 20 Hz to 20 kHz. The deviation in the passbands is max. 0.2 dB. Taking advantage of the digital signal from the preamplifier, a high-efficiency power amplifier can be developed. A prototype of the preamplifier built with commercially obtainable components has...
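One way to realize an arbitrary gain curve with exactly linear phase, in the spirit of this record, is a symmetric FIR filter whose magnitude is specified at 30 log-spaced frequencies between 20 Hz and 20 kHz. The sketch below is an illustration of that idea, not the article's actual design; the sampling rate, tap count, and gain values are arbitrary example assumptions.

```python
# Sketch: linear-phase FIR equalizer from a gain specification given at 30
# logarithmically spaced frequencies (illustrative values, not the article's design).
import numpy as np
from scipy.signal import firwin2

fs = 44100.0                                                     # Hz, CD-like rate
freqs_log = np.logspace(np.log10(20.0), np.log10(20000.0), 30)   # 30 log-spaced points
gains_db = np.linspace(+6.0, -6.0, 30)                           # example gain curve, dB
gains_lin = 10.0 ** (gains_db / 20.0)

# firwin2 needs a monotone frequency grid running from 0 to Nyquist.
freq_grid = np.concatenate(([0.0], freqs_log, [fs / 2.0]))
gain_grid = np.concatenate(([gains_lin[0]], gains_lin, [gains_lin[-1]]))

# An odd-length symmetric FIR has exactly linear phase by construction,
# so the magnitude can be shaped freely without phase distortion.
taps = firwin2(numtaps=2047, freq=freq_grid, gain=gain_grid, fs=fs)
print(f"designed {taps.size}-tap linear-phase equalizer")
```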

  3. "Drone Killings in Principle and in Practice"

    DEFF Research Database (Denmark)

    Dige, Morten

    2017-01-01

to argue that what we see in the real-world cases of drone killings is not merely an accidental or contingent use of drone technology. The real-life use reflects to a large extent features that are inherent in the dominant drone systems that have been developed to date. What is being imagined "in principle......" is thus to a large extent drone killings in dreamland. I use an historic example as a point of reference and departure: the debate over the lawfulness of nuclear weapons.

  4. Dynamic performance of maximum power point tracking circuits using sinusoidal extremum seeking control for photovoltaic generation

    Science.gov (United States)

    Leyva, R.; Artillan, P.; Cabal, C.; Estibals, B.; Alonso, C.

    2011-04-01

    The article studies the dynamic performance of a family of maximum power point tracking circuits used for photovoltaic generation. It revisits the sinusoidal extremum seeking control (ESC) technique which can be considered as a particular subgroup of the Perturb and Observe algorithms. The sinusoidal ESC technique consists of adding a small sinusoidal disturbance to the input and processing the perturbed output to drive the operating point at its maximum. The output processing involves a synchronous multiplication and a filtering stage. The filter instance determines the dynamic performance of the MPPT based on sinusoidal ESC principle. The approach uses the well-known root-locus method to give insight about damping degree and settlement time of maximum-seeking waveforms. This article shows the transient waveforms in three different filter instances to illustrate the approach. Finally, an experimental prototype corroborates the dynamic analysis.
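The sinusoidal ESC loop summarized above (add a small dither, demodulate the measured power by synchronous multiplication, low-pass filter, and integrate toward the maximum) can be captured in a short simulation sketch. The PV curve, gains, and filter constants below are illustrative assumptions, not values from the article.

```python
# Minimal simulation of sinusoidal extremum seeking control for MPPT:
# dither -> perturbed power measurement -> synchronous multiplication ->
# low-pass filtering -> integration of the gradient estimate.
import math

def pv_power(v):
    """Toy photovoltaic P(V) curve with a single maximum near V = 17 V (assumed)."""
    return max(0.0, 60.0 - 0.2 * (v - 17.0) ** 2)

dt, f_dither = 1e-3, 50.0              # time step [s], dither frequency [Hz]
a, k_int, lp = 0.2, 40.0, 0.05         # dither amplitude [V], integrator gain, LP factor
v_hat, grad_est = 10.0, 0.0            # initial operating point [V], filtered gradient
p_dc = pv_power(v_hat)                 # slowly tracked mean power (washout state)

for n in range(20000):                 # simulate 20 s
    t = n * dt
    dither = math.sin(2.0 * math.pi * f_dither * t)
    p = pv_power(v_hat + a * dither)     # perturbed power measurement
    p_dc += 0.01 * (p - p_dc)            # slow low-pass: removes the DC component
    demod = (p - p_dc) * dither          # synchronous multiplication (demodulation)
    grad_est += lp * (demod - grad_est)  # low-pass filter -> gradient estimate
    v_hat += k_int * grad_est * dt       # integrator drives the operating point uphill

print(f"estimated MPP voltage: {v_hat:.2f} V (toy-curve maximum at 17 V)")
```

The filter and integrator constants set the trade-off discussed in the abstract: a slower low-pass gives a smoother but more sluggish climb toward the maximum power point.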

  5. Quantification of the equivalence principle

    International Nuclear Information System (INIS)

    Epstein, K.J.

    1978-01-01

    Quantitative relationships illustrate Einstein's equivalence principle, relating it to Newton's ''fictitious'' forces arising from the use of noninertial frames, and to the form of the relativistic time dilatation in local Lorentz frames. The equivalence principle can be interpreted as the equivalence of general covariance to local Lorentz covariance, in a manner which is characteristic of Riemannian and pseudo-Riemannian geometries

  6. Principles and Criteria for Design

    DEFF Research Database (Denmark)

    Beghin, D.; Cervetto, D.; Hansen, Peter Friis

    1997-01-01

The mandate of ISSC Committee IV.1 on Principles and Criteria for Design is to report on the following: The ongoing concern for quantification of general economic and safety criteria for marine structures and for the development of appropriate principles for rational life cycle design using

  7. The Virtue of Principle Ethics.

    Science.gov (United States)

    Bersoff, Donald N.

    1996-01-01

    Presents arguments against adopting virtue ethics as a guiding concept in developing counseling guidelines: (1) virtue ethics is irrelevant in the resolution of most ethics cases; (2) virtue and principle ethics overlap; (3) principle ethics are more suited to acting and deciding; (4) the emphasis on virtue ethics increases the possibility of…

  8. Gene probes: principles and protocols

    National Research Council Canada - National Science Library

    Aquino de Muro, Marilena; Rapley, Ralph

    2002-01-01

    ... of labeled DNA has allowed genes to be mapped to single chromosomes and in many cases to a single chromosome band, promoting significant advance in human genome mapping. Gene Probes: Principles and Protocols presents the principles for gene probe design, labeling, detection, target format, and hybridization conditions together with detailed protocols, accom...

  9. Multimedia Principle in Teaching Lessons

    Science.gov (United States)

    Kari Jabbour, Khayrazad

    2012-01-01

    Multimedia learning principle occurs when we create mental representations from combining text and relevant graphics into lessons. This article discusses the learning advantages that result from adding multimedia learning principle into instructions; and how to select graphics that support learning. There is a balance that instructional designers…

  10. Legal Principles and Legislative Instrumentalism

    NARCIS (Netherlands)

    Gribnau, J.L.M.; Soeteman, A.

    2003-01-01

    Instrumentalist legislation usually underestimates the importance of legal principles in modern law. Legal principles are the normative core of a value oriented conception of law. They function as essential criteria of evaluation for lawmaking by the legislator and the executive. In fact,

  11. Multiple homicides.

    Science.gov (United States)

    Copeland, A R

    1989-09-01

    A study of multiple homicides or multiple deaths involving a solitary incident of violence by another individual was performed on the case files of the Office of the Medical Examiner of Metropolitan Dade County in Miami, Florida, during 1983-1987. A total of 107 multiple homicides were studied: 88 double, 17 triple, one quadruple, and one quintuple. The 236 victims were analyzed regarding age, race, sex, cause of death, toxicologic data, perpetrator, locale of the incident, and reason for the incident. This article compares this type of slaying with other types of homicide including those perpetrated by serial killers. Suggestions for future research in this field are offered.

  12. Principles of European Contract Law

    DEFF Research Database (Denmark)

    Lando, Ole; Beale, Hugh

    This text provides a comprehensive guide to the principles of European contract law. They have been drawn up by an independent body of experts from each Member State of the EU, under a project supported by the European Commission and many other organizations. The principles are stated in the form...... of articles, with a detailed commentary explaining the purpose and operation of each article and its relation to the remainder. Each article also has extensive comparative notes surveying the national laws and other international provisions on the topic. "The Principles of European Contract Law Parts I &...... in developing a common European legal culture. The European Parliament has twice called for the creation of a European Civil Code. The principles of European contract law are essential steps in these projects. This text provides a comprehensive guide to the Principles of European contract law. They have been...

  13. Two conceptions of legal principles

    Directory of Open Access Journals (Sweden)

    Spaić Bojan

    2017-01-01

Full Text Available The paper discusses the classical understanding of legal principles as the most general norms of a legal order, confronting it with Dworkin's and Alexy's understanding of legal principles as prima facie, unconditional commands. The analysis shows that the common, classical conception brings into question the status of legal principles as norms by disregarding their usefulness in judicial reasoning, while, conversely, the latter has significant import for legal practice and consequently for legal dogmatics. It is argued that the heuristic fruitfulness of understanding principles as optimization commands thus becomes apparent. When we understand the relation of principles to the idea of proportionality, as the specific mode of their application, which is different from the subsumptive mode of applying rules, the theory of legal principles advanced by Dworkin and Alexy therefore appears to be descriptively better than others, but not without its flaws.

  14. The End of Points

    Science.gov (United States)

    Feldman, Jo

    2018-01-01

    Have teachers become too dependent on points? This article explores educators' dependency on their points systems, and the ways that points can distract teachers from really analyzing students' capabilities and achievements. Feldman argues that using a more subjective grading system can help illuminate crucial information about students and what…

  15. Demerit points systems.

    NARCIS (Netherlands)

    2006-01-01

    In 2012, 21 of the 27 EU Member States had some form of demerit points system. In theory, demerit points systems contribute to road safety through three mechanisms: 1) prevention of unsafe behaviour through the risk of receiving penalty points, 2) selection and suspension of the most frequent

  16. Ten guiding principles for youth mental health services.

    Science.gov (United States)

    Hughes, Frank; Hebel, Lisa; Badcock, Paul; Parker, Alexandra G

    2018-06-01

    Guiding principles are arguably central to the development of any health service. The aim of this article is to report on the outcomes of a youth mental health (YMH) community of practice (CoP), which identified a range of guiding principles that provide a clear point of comparison for the only other set of principles for YMH service delivery proposed to date. A YMH CoP was established in 2010 as part of the Victorian State Government approach to improving YMH care. An initial literature search was undertaken to locate articles on YMH service delivery. A number of common themes were identified, which the YMH community of practice (YMHCoP) members then elaborated upon by drawing from their collective experience of the YMH sector. The resultant themes were then refined through subsequent group discussions to derive a definitive set of guiding principles. These principles were then augmented by a second literature search conducted in July 2015. Fifteen key themes were derived from the initial literature search and YMH CoP discussions. These were refined by the YMH CoP to produce 10 guiding principles for YMH service development. These are discussed through reference to the relevant literature, using the only other article on principles of YMH service delivery as a notable point of comparison. The 10 principles identified may be useful for quality improvement and are likely to have international relevance. We suggest the timely pursuit of an international consensus on guiding principles for service delivery under the auspices of a peak body for YMH. © 2017 John Wiley & Sons Australia, Ltd.

  17. Multiple Sclerosis

    Science.gov (United States)

    Multiple sclerosis (MS) is a nervous system disease that affects your brain and spinal cord. It damages the myelin sheath, the material that surrounds and protects your nerve cells. This damage slows down ...

  18. Multiple myeloma.

    LENUS (Irish Health Repository)

    Collins, Conor D

    2012-02-01

    Advances in the imaging and treatment of multiple myeloma have occurred over the past decade. This article summarises the current status and highlights how an understanding of both is necessary for optimum management.

  19. Basic principles of fracture treatment in children.

    Science.gov (United States)

    Ömeroğlu, Hakan

    2018-04-01

    This review aims to summarize the basic treatment principles of fractures according to their types and general management principles of special conditions including physeal fractures, multiple fractures, open fractures, and pathologic fractures in children. Definition of the fracture is needed for better understanding the injury mechanism, planning a proper treatment strategy, and estimating the prognosis. As the healing process is less complicated, remodeling capacity is higher and non-union is rare, the fractures in children are commonly treated by non-surgical methods. Surgical treatment is preferred in children with multiple injuries, in open fractures, in some pathologic fractures, in fractures with coexisting vascular injuries, in fractures which have a history of failed initial conservative treatment and in fractures in which the conservative treatment has no/little value such as femur neck fractures, some physeal fractures, displaced extension and flexion type humerus supracondylar fractures, displaced humerus lateral condyle fractures, femur, tibia and forearm shaft fractures in older children and adolescents and unstable pelvis and acetabulum fractures. Most of the fractures in children can successfully be treated by non-surgical methods.

  20. On Existence of Solutions to the Caputo Type Fractional Order Three-Point Boundary Value Problems

    Directory of Open Access Journals (Sweden)

    B.M.B. Krushna

    2016-10-01

Full Text Available In this paper, we establish the existence of solutions to fractional-order three-point boundary value problems by utilizing the Banach contraction principle and Schaefer's fixed point theorem.
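For orientation, a generic Caputo-type three-point problem of the kind treated in such papers, together with the contraction condition that the Banach principle requires, can be written as follows. The order α, the interior point η, and the coefficient β are illustrative, not necessarily those of the article.

```latex
% A generic Caputo-type three-point boundary value problem (illustrative form):
\[
  {}^{C}D^{\alpha} u(t) = f\bigl(t, u(t)\bigr), \quad t \in (0,1), \quad 1 < \alpha \le 2,
\]
\[
  u(0) = 0, \qquad u(1) = \beta\, u(\eta), \qquad 0 < \eta < 1 .
\]
% Banach contraction principle, as typically applied: rewrite the problem as a
% fixed-point equation u = Tu with (Tu)(t) = \int_0^1 G(t,s)\, f\bigl(s,u(s)\bigr)\,ds,
% where G is the Green's function of the linear problem. If f is Lipschitz in u
% with constant L and \kappa bounds \int_0^1 |G(t,s)|\,ds, then
% \|Tu - Tv\| \le L\kappa \|u - v\|, and L\kappa < 1 gives a unique solution.
```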