WorldWideScience

Sample records for approaching fundamental limits

  1. Limits on fundamental limits to computation.

    Science.gov (United States)

    Markov, Igor L

    2014-08-14

    An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.

  2. Fundamental gravitational limitations to quantum computing

    International Nuclear Information System (INIS)

    Gambini, R.; Porto, A.; Pullin, J.

    2006-01-01

    Lloyd has considered the ultimate limitations the fundamental laws of physics place on quantum computers. He concludes in particular that for an 'ultimate laptop' (a computer of one liter of volume and one kilogram of mass) the maximum number of operations per second is bounded by 10^51. The limit is derived considering ordinary quantum mechanics. Here we consider additional limits that are placed by quantum gravity ideas, namely the use of a relational notion of time and fundamental gravitational limits that exist on time measurements. We then particularize for the case of an ultimate laptop and show that the maximum number of operations is further constrained to 10^47 per second. (authors)
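
    As a rough cross-check of the figure quoted above, the Margolus-Levitin theorem bounds the number of elementary operations per second by 2E/(πħ); taking E = mc² for a one-kilogram mass reproduces Lloyd's order of magnitude. The short Python sketch below is illustrative only (the constants and the bound are standard physics, not taken from this record).

        import math

        # Physical constants (SI units)
        hbar = 1.054571817e-34   # reduced Planck constant, J*s
        c = 2.99792458e8         # speed of light, m/s
        m = 1.0                  # mass of the "ultimate laptop", kg

        E = m * c**2                               # total energy available, ~9e16 J
        ops_per_second = 2 * E / (math.pi * hbar)  # Margolus-Levitin bound

        print(f"Margolus-Levitin bound: {ops_per_second:.2e} ops/s")
        # ~5.4e50 ops/s, i.e. of order 10^51, matching Lloyd's estimate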

  3. Fundamental Limitations for Imaging GEO Satellites

    Science.gov (United States)

    2015-10-18

    Fundamental limitations for imaging GEO satellites. D. Mozurkewich (Seabrook Engineering, Seabrook, MD 20706, USA); H. R. Schmitt, J. T. Armstrong (Naval...); ...higher spatial frequency.

  4. Queueing networks a fundamental approach

    CERN Document Server

    Dijk, Nico

    2011-01-01

    This handbook aims to highlight fundamental, methodological and computational aspects of networks of queues to provide insights and to unify results that can be applied in a more general manner. The handbook is organized into five parts: Part 1 considers exact analytical results such as of product form type. Topics include characterization of product forms by physical balance concepts and simple traffic flow equations, classes of service and queue disciplines that allow a product form, a unified description of product forms for discrete time queueing networks, insights for insensitivity, and aggregation and decomposition results that allow subnetworks to be aggregated into single nodes to reduce computational burden. Part 2 looks at monotonicity and comparison results such as for computational simplification by either of two approaches: stochastic monotonicity and ordering results based on the ordering of the process generators, and comparison results and explicit error bounds based on an underlying Markov r...

  5. Fundamental limits of repeaterless quantum communications

    Science.gov (United States)

    Pirandola, Stefano; Laurenza, Riccardo; Ottaviani, Carlo; Banchi, Leonardo

    2017-01-01

    Quantum communications promises reliable transmission of quantum information, efficient distribution of entanglement and generation of completely secure keys. For all these tasks, we need to determine the optimal point-to-point rates that are achievable by two remote parties at the ends of a quantum channel, without restrictions on their local operations and classical communication, which can be unlimited and two-way. These two-way assisted capacities represent the ultimate rates that are reachable without quantum repeaters. Here, by constructing an upper bound based on the relative entropy of entanglement and devising a dimension-independent technique dubbed 'teleportation stretching', we establish these capacities for many fundamental channels, namely bosonic lossy channels, quantum-limited amplifiers, dephasing and erasure channels in arbitrary dimension. In particular, we exactly determine the fundamental rate-loss tradeoff affecting any protocol of quantum key distribution. Our findings set the limits of point-to-point quantum communications and provide precise and general benchmarks for quantum repeaters. PMID:28443624

  6. Fundamental limit of light trapping in grating structures

    KAUST Repository

    Yu, Zongfu

    2010-08-11

    We use a rigorous electromagnetic approach to analyze the fundamental limit of light-trapping enhancement in grating structures. This limit can exceed the bulk limit of 4n^2, but has significant angular dependency. We explicitly show that 2D gratings provide more enhancement than 1D gratings. We also show the effects of the grating profile's symmetry on the absorption enhancement limit. Numerical simulations are applied to support the theory. Our findings provide general guidance for the design of grating structures for light-trapping solar cells.

  7. 33 CFR 86.03 - Limits of fundamental frequencies.

    Science.gov (United States)

    2010-07-01

    Section 86.03, Limits of fundamental frequencies (Title 33, Navigation and Navigable Waters; Coast Guard, Department of Homeland Security). To ensure a wide variety of whistle characteristics, the fundamental...

  8. From fundamental limits to radioprotection practice

    International Nuclear Information System (INIS)

    Henry, P.; Chassany, J.

    1980-01-01

    The individual dose limits fixed by present French legislation for different categories of people refer to dose equivalents received by or delivered to the whole body or to certain tissues or organs over given periods of time. The values concerning personnel engaged directly in work under radiations are summed up in a table. These are the limits which radioprotection authorities must impose, while ensuring that exposure levels are kept as low as possible. With the means available in practical radioprotection it is not possible to measure dose equivalents directly, but information may be obtained on dose rates, absorbed doses, particle fluxes, activities per unit volume and per surface area. An interpretation of these measurements is necessary if an efficient supervision of worker exposure is to be achieved [fr

  9. Fundamental limits of positron emission mammography

    International Nuclear Information System (INIS)

    Moses, William W.; Qi, Jinyi

    2001-01-01

    We explore the causes of performance limitation in positron emission mammography cameras. We compare two basic camera geometries containing the same volume of 511 keV photon detectors, one with a parallel plane geometry and another with a rectangular geometry. We find that both geometries have similar performance for the phantom imaged (in Monte Carlo simulation), even though the solid angle coverage of the rectangular camera is about 50 percent higher than the parallel plane camera. The reconstruction algorithm used significantly affects the resulting image; iterative methods significantly outperform the commonly used focal plane tomography. Finally, the characteristics of the tumor itself, specifically the absolute amount of radiotracer taken up by the tumor, will significantly affect the imaging performance

  10. Fundamental limits of scintillation detector timing precision

    International Nuclear Information System (INIS)

    Derenzo, Stephen E; Choong, Woon-Seng; Moses, William W

    2014-01-01

    In this paper we review the primary factors that affect the timing precision of a scintillation detector. Monte Carlo calculations were performed to explore the dependence of the timing precision on the number of photoelectrons, the scintillator decay and rise times, the depth of interaction uncertainty, the time dispersion of the optical photons (modeled as an exponential decay), the photodetector rise time and transit time jitter, the leading-edge trigger level, and electronic noise. The Monte Carlo code was used to estimate the practical limits on the timing precision for an energy deposition of 511 keV in 3 mm × 3 mm × 30 mm Lu2SiO5:Ce and LaBr3:Ce crystals. The calculated timing precisions are consistent with the best experimental literature values. We then calculated the timing precision for 820 cases that sampled scintillator rise times from 0 to 1.0 ns, photon dispersion times from 0 to 0.2 ns, photodetector time jitters from 0 to 0.5 ns fwhm, and A from 10 to 10 000 photoelectrons per ns decay time. Since the timing precision R was found to depend on A^(-1/2) more than any other factor, we tabulated the parameter B, where R = B·A^(-1/2). An empirical analytical formula was found that fit the tabulated values of B with an rms deviation of 2.2% of the value of B. The theoretical lower bound of the timing precision was calculated for the example of 0.5 ns rise time, 0.1 ns photon dispersion, and 0.2 ns fwhm photodetector time jitter. The lower bound was at most 15% lower than leading-edge timing discrimination for A from 10 to 10 000 photoelectrons ns^-1. A timing precision of 8 ps fwhm should be possible for an energy deposition of 511 keV using currently available photodetectors if a theoretically possible scintillator were developed that could produce 10 000 photoelectrons ns^-1. (paper)
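
    The dominant scaling in the abstract above, R = B·A^(-1/2), is easy to illustrate numerically. The Python sketch below plugs in hypothetical values of B and A; the record does not give the tabulated B values or the fitted formula, so the numbers here are placeholders, not data from the paper.

        import math

        def timing_precision_fwhm(B_ps, A_pe_per_ns):
            """Timing precision R = B * A^(-1/2), with A in photoelectrons per ns
            of decay time and B a scintillator/detector-dependent coefficient
            (placeholder value below, not from the paper)."""
            return B_ps / math.sqrt(A_pe_per_ns)

        B = 800.0  # hypothetical coefficient, ps * sqrt(photoelectrons/ns)
        for A in (10, 100, 1000, 10000):
            print(f"A = {A:6d} pe/ns  ->  R ~ {timing_precision_fwhm(B, A):6.1f} ps fwhm")
        # The 1/sqrt(A) dependence means a 100x increase in photoelectron
        # rate improves the timing precision by a factor of 10.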

  11. Fundamental limit of nanophotonic light trapping in solar cells.

    Science.gov (United States)

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2010-10-12

    Establishing the fundamental limit of nanophotonic light-trapping schemes is of paramount importance and is becoming increasingly urgent for current solar cell research. The standard theory of light trapping demonstrated that absorption enhancement in a medium cannot exceed a factor of 4n^2/sin^2(θ), where n is the refractive index of the active layer, and θ is the angle of the emission cone in the medium surrounding the cell. This theory, however, is not applicable in the nanophotonic regime. Here we develop a statistical temporal coupled-mode theory of light trapping based on a rigorous electromagnetic approach. Our theory reveals that the conventional limit can be substantially surpassed when optical modes exhibit deep-subwavelength-scale field confinement, opening new avenues for highly efficient next-generation solar cells.
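
    For orientation, the conventional limit quoted above is simple to evaluate. The sketch below assumes a refractive index of n = 3.5 (a typical value for silicon, chosen here purely for illustration) and computes the 4n^2/sin^2(θ) enhancement factor for a full and a restricted emission cone.

        import math

        def classical_light_trapping_limit(n, theta_deg=90.0):
            """Classical absorption-enhancement limit 4*n^2 / sin^2(theta).
            theta is the half-angle of the emission cone; theta = 90 deg
            recovers the familiar 4*n^2 bulk limit."""
            return 4.0 * n**2 / math.sin(math.radians(theta_deg))**2

        n = 3.5  # assumed refractive index of the active layer (illustrative)
        print(classical_light_trapping_limit(n))                   # ~49 for a full emission cone
        print(classical_light_trapping_limit(n, theta_deg=10.0))   # much larger for a narrow cone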

  12. Fundamental limits of radio interferometers: calibration and source parameter estimation

    OpenAIRE

    Trott, Cathryn M.; Wayth, Randall B.; Tingay, Steven J.

    2012-01-01

    We use information theory to derive fundamental limits on the capacity to calibrate next-generation radio interferometers, and measure parameters of point sources for instrument calibration, point source subtraction, and data deconvolution. We demonstrate the implications of these fundamental limits, with particular reference to estimation of the 21cm Epoch of Reionization power spectrum with next-generation low-frequency instruments (e.g., the Murchison Widefield Array -- MWA, Precision Arra...

  13. Fundamental Limit of Nanophotonic Light-trapping in Solar Cells

    OpenAIRE

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2010-01-01

    Establishing the fundamental limit of nanophotonic light-trapping schemes is of paramount importance and is becoming increasingly urgent for current solar cell research. The standard theory of light trapping demonstrated that absorption enhancement in a medium cannot exceed a factor of 4n^2/sin^2(θ), where n is the refractive index of the active layer, and θ is the angle of the emission cone in the medium surrounding the cell. This theory, however, is not applicable in the nanophot...

  14. Investigation of fundamental limits to beam brightness available from photoinjectors

    International Nuclear Information System (INIS)

    Bazarov, Ivan

    2015-01-01

    The goal of this project was investigation of fundamental limits to beam brightness available from photoinjectors. This basic research in accelerator physics spanned over 5 years aiming to extend the fundamental understanding of high average current, low emittance sources of relativistic electrons based on photoemission guns, a necessary prerequisite for a new generation of coherent X-ray synchrotron radiation facilities based on continuous duty superconducting linacs. The program focused on two areas critical to making advances in the electron source performance: 1) the physics of photocathodes for the production of low emittance electrons and 2) control of space charge forces in the immediate vicinity to the cathode via 3D laser pulse shaping.

  15. Investigation of fundamental limits to beam brightness available from photoinjectors

    Energy Technology Data Exchange (ETDEWEB)

    Bazarov, Ivan [Cornell Univ., Ithaca, NY (United States)

    2015-07-09

    The goal of this project was investigation of fundamental limits to beam brightness available from photoinjectors. This basic research in accelerator physics spanned over 5 years aiming to extend the fundamental understanding of high average current, low emittance sources of relativistic electrons based on photoemission guns, a necessary prerequisite for a new generation of coherent X-ray synchrotron radiation facilities based on continuous duty superconducting linacs. The program focused on two areas critical to making advances in the electron source performance: 1) the physics of photocathodes for the production of low emittance electrons and 2) control of space charge forces in the immediate vicinity to the cathode via 3D laser pulse shaping.

  16. Fundamental size limitations of micro four-point probes

    DEFF Research Database (Denmark)

    Ansbæk, Thor; Petersen, Dirch Hjorth; Hansen, Ole

    2009-01-01

    The continued down-scaling of integrated circuits and magnetic tunnel junctions (MTJ) for hard disc read heads presents a challenge to current metrology technology. The four-point probes (4PP), currently used for sheet resistance characterization in these applications, therefore must be down-scaled as well in order to correctly characterize the extremely thin films used. This presents a four-point probe design and fabrication challenge. We analyze the fundamental limitation on down-scaling of a generic micro four-point probe (M4PP) in a comprehensive study, where mechanical, thermal, and electrical...

  17. Updates on tetanus toxin: a fundamental approach

    Directory of Open Access Journals (Sweden)

    Md. Ahaduzzaman

    2015-03-01

    Clostridium tetani is an anaerobic bacterium that produces the second most poisonous protein toxin of any bacterium. Tetanus in animals is sporadic in nature but difficult to combat even by using antibiotics and antiserum. It is crucial to understand the fundamental mechanisms and signals that control toxin production for advanced research and medicinal uses. This review was intended to give readers in related fields a better understanding of the basic pathophysiology of tetanus and the tetanus neurotoxin (TeNT).

  18. Fundamental limits to position determination by concentration gradients.

    Directory of Open Access Journals (Sweden)

    Filipe Tostevin

    2007-04-01

    Position determination in biological systems is often achieved through protein concentration gradients. Measuring the local concentration of such a protein with a spatially varying distribution allows the measurement of position within the system. For these systems to work effectively, position determination must be robust to noise. Here, we calculate fundamental limits to the precision of position determination by concentration gradients due to unavoidable biochemical noise perturbing the gradients. We focus on gradient proteins with first-order reaction kinetics. Systems of this type have been experimentally characterised in both developmental and cell biology settings. For a single gradient we show that, through time-averaging, great precision potentially can be achieved even with very low protein copy numbers. As a second example, we investigate the ability of a system with oppositely directed gradients to find its centre. With this mechanism, positional precision close to the centre improves more slowly with increasing averaging time, and so longer averaging times or higher copy numbers are required for high precision. For both single and double gradients, we demonstrate the existence of optimal length scales for the gradients for which precision is maximized, as well as analyze how precision depends on the size of the concentration-measuring apparatus. These results provide fundamental constraints on the positional precision supplied by concentration gradients in various contexts, including both in developmental biology and also within a single cell.
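
    To make the idea concrete, the sketch below (a minimal illustration, not the paper's model) assumes an exponential gradient c(x) = c0·exp(-x/λ), inverts it to read out position, and propagates a relative concentration-measurement error into a positional error δx ≈ λ·δc/c.

        import math

        def position_from_concentration(c, c0, lam):
            """Invert an assumed exponential gradient c(x) = c0*exp(-x/lam)."""
            return lam * math.log(c0 / c)

        def position_error(lam, relative_conc_error):
            """First-order error propagation: dx = lam * (dc/c)."""
            return lam * relative_conc_error

        c0, lam = 100.0, 50.0          # illustrative units: molecules/um^3, um
        x_true = 120.0                 # um
        c_meas = c0 * math.exp(-x_true / lam) * 1.05   # 5% measurement noise
        print(position_from_concentration(c_meas, c0, lam))  # ~117.6 um
        print(position_error(lam, 0.05))                     # ~2.5 um uncertainty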

  19. Fundamental limits to the velocity of solid armatures in railguns

    International Nuclear Information System (INIS)

    Long, G.C. Jr.

    1987-01-01

    The fundamental limits to the velocity of solid armatures in railguns are dependent upon the increase in temperature, which melts the conducting medium or lowers the yield strength of the material. A two-dimensional transient finite-element electrothermal model is developed to determine the magnetic and temperature fields in the rails and armature of a railgun. The solution for the magnetic and temperature fields is based upon the fundamentals of Maxwell's equations and Fourier's law of heat conduction, with no a priori assumptions about the current-density distribution in the rails or the armature. The magnetic-field and temperature-field spatial variations are calculated using finite-element techniques, while the time variations are calculated using finite-differencing methods. A thermal-diffusion iteration is performed between each magnetic-diffusion iteration. Joule heating information is provided by solving the magnetic diffusion problem, while temperature data for calculating material properties such as the electrical resistivity, thermal conductivity, and specific heat are provided by solving the thermal diffusion problem. Various types of rail and armature designs are simulated, including solid armatures consisting of different homogeneous materials, resistive rails, and a graded-resistance armature

  20. Fundamental limitations of cavity-assisted atom interferometry

    Science.gov (United States)

    Dovale-Álvarez, M.; Brown, D. D.; Jones, A. W.; Mow-Lowry, C. M.; Miao, H.; Freise, A.

    2017-11-01

    Atom interferometers employing optical cavities to enhance the beam splitter pulses promise significant advances in science and technology, notably for future gravitational wave detectors. Long cavities, on the scale of hundreds of meters, have been proposed in experiments aiming to observe gravitational waves with frequencies below 1 Hz, where laser interferometers, such as LIGO, have poor sensitivity. Alternatively, short cavities have also been proposed for enhancing the sensitivity of more portable atom interferometers. We explore the fundamental limitations of two-mirror cavities for atomic beam splitting, and establish upper bounds on the temperature of the atomic ensemble as a function of cavity length and three design parameters: the cavity g factor, the bandwidth, and the optical suppression factor of the first and second order spatial modes. A lower bound to the cavity bandwidth is found which avoids elongation of the interaction time and maximizes power enhancement. An upper limit to cavity length is found for symmetric two-mirror cavities, restricting the practicality of long baseline detectors. For shorter cavities, an upper limit on the beam size was derived from the geometrical stability of the cavity. These findings aim to aid the design of current and future cavity-assisted atom interferometers.

  1. Fundamental image quality limits for microcomputed tomography in small animals

    International Nuclear Information System (INIS)

    Ford, N.L.; Thornton, M.M.; Holdsworth, D.W.

    2003-01-01

    ... resolution to improve, by decreasing the detector element size to tens of microns or less, high quality images will be limited by the x-ray dose administered. For the highest quality images, these doses will approach the lethal dose or LD50 for the animals. Approaching the lethal dose will affect the way experiments are planned, and may reduce opportunities for experiments involving imaging the same animal over time. Dose considerations will become much more important for live small-animal imaging as the limits of resolution are tested

  2. Towards the Fundamental Quantum Limit of Linear Measurements of Classical Signals.

    Science.gov (United States)

    Miao, Haixing; Adhikari, Rana X; Ma, Yiqiu; Pang, Belinda; Chen, Yanbei

    2017-08-04

    The quantum Cramér-Rao bound (QCRB) sets a fundamental limit for the measurement of classical signals with detectors operating in the quantum regime. Using linear-response theory and the Heisenberg uncertainty relation, we derive a general condition for achieving such a fundamental limit. When applied to classical displacement measurements with a test mass, this condition leads to an explicit connection between the QCRB and the standard quantum limit that arises from a tradeoff between the measurement imprecision and quantum backaction; the QCRB can be viewed as an outcome of a quantum nondemolition measurement with the backaction evaded. Additionally, we show that the test mass is more a resource for improving measurement sensitivity than a victim of the quantum backaction, which suggests a new approach to enhancing the sensitivity of a broad class of sensors. We illustrate these points with laser interferometric gravitational-wave detectors.

  3. Fundamental limits to frequency estimation: a comprehensive microscopic perspective

    Science.gov (United States)

    Haase, J. F.; Smirne, A.; Kołodyński, J.; Demkowicz-Dobrzański, R.; Huelga, S. F.

    2018-05-01

    We consider a metrology scenario in which qubit-like probes are used to sense an external field that affects their energy splitting in a linear fashion. Following the frequency estimation approach in which one optimizes the state and sensing time of the probes to maximize the sensitivity, we provide a systematic study of the attainable precision under the impact of noise originating from independent bosonic baths. Specifically, we invoke an explicit microscopic derivation of the probe dynamics using the spin-boson model with weak coupling of arbitrary geometry. We clarify how the secular approximation leads to a phase-covariant (PC) dynamics, where the noise terms commute with the field Hamiltonian, while the inclusion of non-secular contributions breaks the PC. Moreover, unless one restricts to a particular (i.e., Ohmic) spectral density of the bath modes, the noise terms may contain relevant information about the frequency to be estimated. Thus, by considering general evolutions of a single probe, we study regimes in which these two effects have a non-negligible impact on the achievable precision. We then consider baths of Ohmic spectral density yet fully accounting for the lack of PC, in order to characterize the ultimate attainable scaling of precision when N probes are used in parallel. Crucially, we show that beyond the semigroup (Lindbladian) regime the Zeno limit imposing the 1/N^(3/2) scaling of the mean squared error, recently derived assuming PC, generalises to any dynamics of the probes, unless the latter are coupled to the baths in the direction perfectly transversal to the frequency encoding, when a novel scaling of 1/N^(7/4) arises. As our microscopic approach covers all classes of dissipative dynamics, from semigroup to non-Markovian ones (each of them potentially non-phase-covariant), it provides an exhaustive picture, in which all the different asymptotic scalings of precision naturally emerge.

  4. Fundamental limits to imaging resolution for focused ion beams

    International Nuclear Information System (INIS)

    Orloff, J.; Swanson, L.W.; Utlaut, M.

    1996-01-01

    This article investigates the limitations on the formation of focused ion beam images from secondary electrons. We use the notion of the information content of an image to account for the effects of resolution, contrast, and signal-to-noise ratio and show that there is a competition between the rate at which small features are sputtered away by the primary beam and the rate of collection of secondary electrons. We find that for small features, sputtering is the limit to imaging resolution, and that for extended small features (e.g., layered structures), rearrangement, redeposition, and differential sputtering rates may limit the resolution in some cases. copyright 1996 American Vacuum Society

  5. Fundamental limits on beam stability at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Decker, G. A.

    1998-01-01

    Orbit correction is now routinely performed at the few-micron level in the Advanced Photon Source (APS) storage ring. Three diagnostics are presently in use to measure and control both AC and DC orbit motions: broad-band turn-by-turn rf beam position monitors (BPMs), narrow-band switched heterodyne receivers, and photoemission-style x-ray beam position monitors. Each type of diagnostic has its own set of systematic error effects that place limits on the ultimate pointing stability of x-ray beams supplied to users at the APS. Limiting sources of beam motion at present are magnet power supply noise, girder vibration, and thermal timescale vacuum chamber and girder motion. This paper will investigate the present limitations on orbit correction, and will delve into the upgrades necessary to achieve true sub-micron beam stability

  6. Fundamental limits on quantum dynamics based on entropy change

    Science.gov (United States)

    Das, Siddhartha; Khatri, Sumeet; Siopsis, George; Wilde, Mark M.

    2018-01-01

    It is well known in the realm of quantum mechanics and information theory that the entropy is non-decreasing for the class of unital physical processes. However, in general, the entropy does not exhibit monotonic behavior. This has restricted the use of entropy change in characterizing evolution processes. Recently, a lower bound on the entropy change was provided in the work of Buscemi, Das, and Wilde [Phys. Rev. A 93(6), 062314 (2016)]. We explore the limit that this bound places on the physical evolution of a quantum system and discuss how these limits can be used as witnesses to characterize quantum dynamics. In particular, we derive a lower limit on the rate of entropy change for memoryless quantum dynamics, and we argue that it provides a witness of non-unitality. This limit on the rate of entropy change leads to definitions of several witnesses for testing memory effects in quantum dynamics. Furthermore, from the aforementioned lower bound on entropy change, we obtain a measure of non-unitarity for unital evolutions.

  7. Fundamental limitation of electrocatalytic methane conversion to methanol

    DEFF Research Database (Denmark)

    Arnarson, Logi; Schmidt, Per Simmendefeldt; Pandey, Mohnish

    2018-01-01

    binding energies on the surface. Based on a simple kinetic model we can conclude that, in order to obtain sufficient activity, oxygen has to bind weakly to the surface, but there is an upper limit to retain selectivity. A few potentially interesting candidates are found, but this relatively simple description...

  8. Secret Key Agreement: Fundamental Limits and Practical Challenges

    KAUST Repository

    Rezki, Zouheir

    2017-02-15

    Despite the tremendous progress made toward establishing PLS as a new paradigm to guarantee security of communication systems at the physical layer, there is a common belief among researchers and industry practitioners that many practical challenges prevent PLS from flourishing at the industrial scale. Most secure message transmission constructions available to date are tied to strong assumptions on CSI, consider simple channel models and underestimate eavesdropping capabilities, thus compromising their practical interest to a large extent. Perhaps arguably, the most likely reasonable way to leverage the potential of PLS in securing modern wireless communication systems is via secret-key agreement. In the latter setting, the legitimate parties try to agree on a key by exploiting the availability of a public channel with high capacity which is also accessible to the eavesdropper. Once a key is shared by the legitimate parties, they may use it in a one-time pad encryption, for instance. In this article, we investigate two performance limits of secret-key agreement communications, namely, the secret-key diversity-multiplexing trade-off and the effect of transmit correlation on the secret-key capacity. We show via examples how secret-key agreement offers more flexibility than secure message transmission. Finally, we explore a few challenges of the secret-key agreement concept and propose a few guidelines to overcome them.

  9. Fundamental-mode sources in approach to critical experiments

    International Nuclear Information System (INIS)

    Goda, J.; Busch, R.

    2000-01-01

    An equivalent fundamental-mode source is an imaginary source that is distributed identically in space, energy, and angle to the fundamental-mode fission source. Therefore, it produces the same neutron multiplication as the fundamental-mode fission source. Even if two source distributions produce the same number of spontaneous fission neutrons, they will not necessarily contribute equally toward the multiplication of a given system. A method of comparing the relative importance of source distributions is needed. A factor, denoted as g* and defined as the ratio of the fixed-source multiplication to the fundamental-mode multiplication, is used to convert a given source strength to its equivalent fundamental-mode source strength. This factor is of interest to criticality safety as it relates to the 1/M method of approach to critical. Ideally, a plot of 1/M versus κ_eff is linear. However, since 1/M = (1 - κ_eff)/g*, the plot will be linear only if g* is constant with κ_eff. When g* increases with κ_eff, the 1/M plot is said to be conservative because the critical mass is underestimated. However, it is possible for g* to decrease with κ_eff, yielding a nonconservative 1/M plot. A better understanding of g* would help predict whether a given approach to critical will be conservative or nonconservative. The equivalent fundamental-mode source strength g*S can be predicted by experiment. The experimental method was tested on the XIX-1 core on the Fast Critical Assembly at the Japan Atomic Energy Research Institute. The results showed a 30% difference between measured and calculated values. However, the XIX-1 reactor had significant intermediate-energy neutrons. The presence of intermediate-energy neutrons may have made the cross-section set used for predicted values less than ideal for the system
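
    The role of g* in the 1/M plot can be illustrated with a few lines of arithmetic. The Python sketch below uses illustrative values only (not data from the experiment described above) and simply shows that a constant g* gives a strictly linear 1/M versus κ_eff, while an increasing or decreasing g* bends the curve.

        import numpy as np

        def inverse_multiplication(k_eff, g_star):
            """1/M = (1 - k_eff) / g*, as in the record above."""
            return (1.0 - k_eff) / g_star

        k = np.linspace(0.5, 0.99, 6)

        # Three hypothetical behaviours of g* during the approach to critical
        g_constant   = np.full_like(k, 1.0)
        g_increasing = 1.0 + 0.5 * (k - 0.5)   # g* grows with k_eff (conservative case)
        g_decreasing = 1.0 - 0.5 * (k - 0.5)   # g* falls with k_eff (nonconservative case)

        for label, g in [("constant", g_constant),
                         ("increasing", g_increasing),
                         ("decreasing", g_decreasing)]:
            print(label, np.round(inverse_multiplication(k, g), 3))
        # Only the constant-g* case gives a strictly linear 1/M versus k_eff.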

  10. Limiting value definition in radiation protection physics, legislation and toxicology. Fundamentals, contrasts, perspectives

    International Nuclear Information System (INIS)

    Smeddinck, Ulrich; Koenig, Claudia

    2016-01-01

    The volume is the documentation of an ENTRIA workshop discussion on limiting value definition in radiation protection, including the following contributions: introduction to radiation protection - fundamental concepts of limiting values, heterogeneity; evaluation standards for dose in radiation protection in the context of the search for a final repository; definition of limiting values in toxicology; public participation in limiting value definition - a perspective for radiation protection regulation; and current developments in radiation protection.

  11. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear if one principle allows one to achieve lower uncertainties or if both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an indirect proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
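
    The two proportionalities stated in the abstract can be combined into a single schematic scaling, σ_v ∝ |v|^(3/2) / sqrt(P), where P is the scattered light power. The snippet below is purely illustrative: the proportionality constant is a placeholder, since the record gives only the scaling behaviour.

        def velocity_uncertainty(v, scattered_power, C=1.0):
            """Schematic scaling sigma_v ~ C * |v|^(3/2) / sqrt(P).
            C is a placeholder constant; only the scaling is meaningful."""
            return C * abs(v) ** 1.5 / scattered_power ** 0.5

        # Doubling the velocity raises the uncertainty by 2^(3/2) ~ 2.83,
        # while quadrupling the scattered power halves it.
        print(velocity_uncertainty(2.0, 1.0) / velocity_uncertainty(1.0, 1.0))  # ~2.83
        print(velocity_uncertainty(1.0, 4.0) / velocity_uncertainty(1.0, 1.0))  # 0.5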

  12. Some Fundamental Limits on SAW RFID Tag Information Capacity and Collision Resolution

    Science.gov (United States)

    Barton, Richard J.

    2013-01-01

    In this paper, we apply results from multi-user information theory to study the limits of information capacity and collision resolution for SAW RFID tags. In particular, we derive bounds on the achievable data rate per tag as a function of fundamental parameters such as tag time-bandwidth product, tag signal-to-noise ratio (SNR), and number of tags in the environment. We also discuss the implications of these bounds for tag waveform design and tag interrogation efficiency
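
    As a rough orientation (not the bounds derived in the paper, which are specific to SAW tag physics), a Shannon-style multiple-access estimate already shows how the per-tag rate falls with the number of tags: with N equal-power tags sharing bandwidth B, the sum rate is at most B·log2(1 + N·SNR), and the symmetric per-tag rate is at most that divided by N. The sketch below uses hypothetical numbers.

        import math

        def per_tag_rate_bound(bandwidth_hz, snr_linear, num_tags):
            """Illustrative symmetric-rate bound from the Gaussian multiple-access
            channel: R_per_tag <= B * log2(1 + N*SNR) / N. This is a generic
            information-theoretic estimate, not the SAW-specific bound of the paper."""
            return bandwidth_hz * math.log2(1 + num_tags * snr_linear) / num_tags

        B = 40e6      # hypothetical tag bandwidth, Hz
        snr = 10.0    # hypothetical per-tag SNR (10 dB)
        for n in (1, 4, 16, 64):
            print(f"{n:3d} tags -> <= {per_tag_rate_bound(B, snr, n)/1e6:7.2f} Mbit/s per tag")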

  13. A fundamentally new approach to air-cooled heat exchangers.

    Energy Technology Data Exchange (ETDEWEB)

    Koplow, Jeffrey P.

    2010-01-01

    We describe breakthrough results obtained in a feasibility study of a fundamentally new architecture for air-cooled heat exchangers. A longstanding but largely unrealized opportunity in energy efficiency concerns the performance of air-cooled heat exchangers used in air conditioners, heat pumps, and refrigeration equipment. In the case of residential air conditioners, for example, the typical performance of the air-cooled heat exchangers used for condensers and evaporators is at best marginal from the standpoint of achieving the maximum possible coefficient of performance (COP). If by some means it were possible to reduce the thermal resistance of these heat exchangers to a negligible level, a typical energy savings of order 30% could be immediately realized. It has long been known that a several-fold increase in heat exchanger size, in conjunction with the use of much higher volumetric flow rates, provides a straightforward path to this goal but is not practical from the standpoint of real-world applications. The tension in the marketplace between the need for energy efficiency and logistical considerations such as equipment size, cost and operating noise has resulted in a compromise that is far from ideal. This is the reason that a typical residential air conditioner exhibits significant sensitivity to reductions in fan speed and/or fouling of the heat exchanger surface. The prevailing wisdom is that little can be done to improve this situation; the 'fan-plus-finned-heat-sink' heat exchanger architecture used throughout the energy sector represents an extremely mature technology for which there is little opportunity for further optimization. But the fact remains that conventional fan-plus-finned-heat-sink technology simply doesn't work that well. Its primary physical limitation to performance (i.e. low thermal resistance) is the boundary layer of motionless air that adheres to and envelops all surfaces of the heat exchanger. Within this

  14. A systems approach to theoretical fluid mechanics: Fundamentals

    Science.gov (United States)

    Anyiwo, J. C.

    1978-01-01

    A preliminary application of the underlying principles of the investigator's general system theory to the description and analyses of the fluid flow system is presented. An attempt is made to establish practical models, or elements of the general fluid flow system from the point of view of the general system theory fundamental principles. Results obtained are applied to a simple experimental fluid flow system, as test case, with particular emphasis on the understanding of fluid flow instability, transition and turbulence.

  15. Heat-Assisted Magnetic Recording: Fundamental Limits to Inverse Electromagnetic Design

    Science.gov (United States)

    Bhargava, Samarth

    In this dissertation, we address the burgeoning fields of diffractive optics, metal-optics and plasmonics, and computational inverse problems in the engineering design of electromagnetic structures. We focus on the application of the optical nano-focusing system that will enable Heat-Assisted Magnetic Recording (HAMR), a higher density magnetic recording technology that will fulfill the exploding worldwide demand for digital data storage. The heart of HAMR is a system that focuses light to a nanoscale, sub-diffraction-limit spot with an extremely high power density via an optical antenna. We approach this engineering problem by first discussing the fundamental limits of nano-focusing and the material limits for metal-optics and plasmonics. Then, we use efficient gradient-based optimization algorithms to computationally design shapes of 3D nanostructures that outperform human designs on the basis of mass-market product requirements. In 2014, the world manufactured ~1 zettabyte (ZB), i.e., 1 billion terabytes (TB), of data storage devices, including ~560 million magnetic hard disk drives (HDDs). Global demand for storage will likely increase by 10x in the next 5-10 years, and manufacturing capacity cannot keep up with demand alone. We discuss the state-of-the-art HDD and why industry invented Heat-Assisted Magnetic Recording (HAMR) to overcome the data density limitations. HAMR leverages the temperature sensitivity of magnets, in which the coercivity suddenly and non-linearly falls at the Curie temperature. Data recording to high-density hard disks can be achieved by locally heating one bit of information while co-applying a magnetic field. The heating can be achieved by focusing 100 microW of light to a 30 nm diameter spot on the hard disk. This is an enormous light intensity, roughly ~100,000,000x the intensity of sunlight on the earth's surface! This power density is ~1,000x the output of gold-coated tapered optical fibers used in Near-field Scanning Optical Microscopes
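
    The intensity comparison quoted above is easy to verify with a few lines of arithmetic. The sketch below assumes ~1 kW/m^2 for surface sunlight (a standard reference value, not stated in the record) and reproduces the ~100,000,000x figure.

        import math

        power_w = 100e-6          # 100 microwatts delivered to the spot
        spot_diameter_m = 30e-9   # 30 nm spot
        solar_irradiance = 1.0e3  # ~1 kW/m^2, typical surface sunlight (assumed)

        spot_area = math.pi * (spot_diameter_m / 2) ** 2
        intensity = power_w / spot_area
        print(f"{intensity:.2e} W/m^2, ~{intensity / solar_irradiance:.0e}x sunlight")
        # ~1.4e11 W/m^2, i.e. roughly 1e8 times the solar irradiance,
        # consistent with the ~100,000,000x figure quoted above.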

  16. Fundamental phenomena affecting low temperature combustion and HCCI engines, high load limits and strategies for extending these limits

    KAUST Repository

    Saxena, Samveg; Bedoya, Ivá n D.

    2013-01-01

    Low temperature combustion (LTC) engines are an emerging engine technology that offers an alternative to spark-ignited and diesel engines. One type of LTC engine, the homogeneous charge compression ignition (HCCI) engine, uses a well-mixed fuel–air charge like spark-ignited engines and relies on compression ignition like diesel engines. Similar to diesel engines, the use of high compression ratios and removal of the throttling valve in HCCI allow for high efficiency operation, thereby allowing lower CO2 emissions per unit of work delivered by the engine. The use of a highly diluted well-mixed fuel–air charge allows for low emissions of nitrogen oxides, soot and particulate matters, and the use of oxidation catalysts can allow low emissions of unburned hydrocarbons and carbon monoxide. As a result, HCCI offers the ability to achieve high efficiencies comparable with diesel while also allowing clean emissions while using relatively inexpensive aftertreatment technologies. HCCI is not, however, without its challenges. Traditionally, two important problems prohibiting market penetration of HCCI are 1) inability to achieve high load, and 2) difficulty in controlling combustion timing. Recent research has significantly mitigated these challenges, and thus HCCI has a promising future for automotive and power generation applications. This article begins by providing a comprehensive review of the physical phenomena governing HCCI operation, with particular emphasis on high load conditions. Emissions characteristics are then discussed, with suggestions on how to inexpensively enable low emissions of all regulated emissions. The operating limits that govern the high load conditions are discussed in detail, and finally a review of recent research which expands the high load limits of HCCI is discussed. Although this article focuses on the fundamental phenomena governing HCCI operation, it is also useful for understanding the fundamental phenomena in reactivity controlled

  17. Fundamental phenomena affecting low temperature combustion and HCCI engines, high load limits and strategies for extending these limits

    KAUST Repository

    Saxena, Samveg

    2013-10-01

    Low temperature combustion (LTC) engines are an emerging engine technology that offers an alternative to spark-ignited and diesel engines. One type of LTC engine, the homogeneous charge compression ignition (HCCI) engine, uses a well-mixed fuel–air charge like spark-ignited engines and relies on compression ignition like diesel engines. Similar to diesel engines, the use of high compression ratios and removal of the throttling valve in HCCI allow for high efficiency operation, thereby allowing lower CO2 emissions per unit of work delivered by the engine. The use of a highly diluted well-mixed fuel–air charge allows for low emissions of nitrogen oxides, soot and particulate matters, and the use of oxidation catalysts can allow low emissions of unburned hydrocarbons and carbon monoxide. As a result, HCCI offers the ability to achieve high efficiencies comparable with diesel while also allowing clean emissions while using relatively inexpensive aftertreatment technologies. HCCI is not, however, without its challenges. Traditionally, two important problems prohibiting market penetration of HCCI are 1) inability to achieve high load, and 2) difficulty in controlling combustion timing. Recent research has significantly mitigated these challenges, and thus HCCI has a promising future for automotive and power generation applications. This article begins by providing a comprehensive review of the physical phenomena governing HCCI operation, with particular emphasis on high load conditions. Emissions characteristics are then discussed, with suggestions on how to inexpensively enable low emissions of all regulated emissions. The operating limits that govern the high load conditions are discussed in detail, and finally a review of recent research which expands the high load limits of HCCI is discussed. Although this article focuses on the fundamental phenomena governing HCCI operation, it is also useful for understanding the fundamental phenomena in reactivity controlled

  18. Fundamental problem in the relativistic approach to atomic structure theory

    International Nuclear Information System (INIS)

    Kagawa, Takashi

    1987-01-01

    It is known that relativistic atomic structure theory contains a serious fundamental problem, the so-called Brown-Ravenhall (BR) problem or variational collapse. This problem arises from the fact that the energy spectrum of the relativistic Hamiltonian for many-electron systems is not bounded from below, because the negative-energy solutions as well as the positive-energy ones are obtained from the relativistic equation. This report outlines two methods to avoid the BR problem in relativistic calculations, that is, the projection operator method and the general variation method. The former method is described first. The use of a modified Hamiltonian containing a projection operator which projects onto the positive-energy solutions of the relativistic wave equation has been proposed to remove the BR difficulty. The problem in the use of the projection operator method is that the projection operator for the system cannot be determined uniquely. The final part of this report outlines the general variation method. This method can be applied to any system, such as relativistic ones whose Hamiltonian is not bounded from below. (Nogami, K.)

  19. Standardless quantification approach of TXRF analysis using fundamental parameter method

    International Nuclear Information System (INIS)

    Szaloki, I.; Taniguchi, K.

    2000-01-01

    A new standardless evaluation procedure based on the fundamental parameter method (FPM) has been developed for TXRF analysis. The theoretical calculation describes the relationship between the characteristic intensities and the geometrical parameters of the excitation and detection system and the specimen parameters: size, thickness, angle of the excitation beam to the surface, and the optical properties of the specimen holder. Most TXRF methods apply empirical calibration, which requires the application of a special preparation technique. However, the characteristic lines of the specimen holder (Si Kα,β) carry information about the local excitation and geometrical conditions on the substrate surface. On the basis of the theoretical calculation of the substrate characteristic intensity, the excitation beam flux can be approximated. Taking into consideration the elements present in the specimen material, a system of non-linear equations can be written involving the unknown concentration values and the geometrical and detection parameters. In order to solve this mathematical problem, PASCAL software was written which calculates the sample composition and the average sample thickness by a gradient algorithm. Therefore, this quantitative estimation of the specimen composition requires neither an external nor an internal standard sample. For verification of the theoretical calculation and the numerical procedure, several experiments were carried out using a mixed standard solution containing the elements K, Sc, V, Mn, Co and Cu in the 0.1-10 ppm concentration range. (author)
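
    The last step described above, solving a non-linear system for the unknown concentrations with a gradient method, can be sketched generically. The snippet below is a toy illustration only: the predicted_intensities function stands in for the real FPM forward model (which the record does not give), and scipy's least-squares routine plays the role of the gradient algorithm.

        import numpy as np
        from scipy.optimize import least_squares

        # Hypothetical forward model standing in for the FPM calculation:
        # measured intensity of each element = flux factor * sensitivity * concentration,
        # with the flux factor assumed known from the substrate (Si K) line.
        sensitivities = np.array([1.2, 0.8, 1.5])   # placeholder per-element factors
        flux_factor = 0.9                           # assumed known from the Si K line

        def predicted_intensities(concentrations):
            return flux_factor * sensitivities * concentrations

        measured = np.array([1.08, 0.36, 2.03])     # synthetic "measurements"

        def residuals(concentrations):
            return predicted_intensities(concentrations) - measured

        fit = least_squares(residuals, x0=np.ones(3))   # gradient-based solver
        print(fit.x)   # recovered concentrations, ~[1.0, 0.5, 1.5] for this toy model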

  20. Financial fluctuations anchored to economic fundamentals: A mesoscopic network approach.

    Science.gov (United States)

    Sharma, Kiran; Gopalakrishnan, Balagopal; Chakrabarti, Anindya S; Chakraborti, Anirban

    2017-08-14

    We demonstrate the existence of an empirical linkage between nominal financial networks and the underlying economic fundamentals, across countries. We construct the nominal return correlation networks from daily data to encapsulate sector-level dynamics and infer the relative importance of the sectors in the nominal network through measures of centrality and clustering algorithms. Eigenvector centrality robustly identifies the backbone of the minimum spanning tree defined on the return networks as well as the primary cluster in the multidimensional scaling map. We show that the sectors that are relatively large in size, defined with three metrics, viz., market capitalization, revenue and number of employees, constitute the core of the return networks, whereas the periphery is mostly populated by relatively smaller sectors. Therefore, sector-level nominal return dynamics are anchored to the real size effect, which ultimately shapes the optimal portfolios for risk management. Our results are reasonably robust across 27 countries of varying degrees of prosperity and across periods of market turbulence (2008-09) as well as periods of relative calmness (2012-13 and 2015-16).
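
    A minimal version of the pipeline described above (correlation matrix, distance-weighted network, minimum spanning tree, eigenvector centrality) can be sketched with numpy and networkx. The data here are random placeholders; the distance transform d = sqrt(2(1 - ρ)) is the standard choice for correlation-based market graphs and is an assumption, not something quoted from the record.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(0)
        returns = rng.normal(size=(250, 8))          # placeholder: 250 days x 8 sectors
        corr = np.corrcoef(returns, rowvar=False)    # sector-sector return correlations
        dist = np.sqrt(2.0 * (1.0 - corr))           # correlation -> distance metric
        np.fill_diagonal(dist, 0.0)

        G = nx.from_numpy_array(dist)                        # fully connected weighted graph
        mst = nx.minimum_spanning_tree(G, weight="weight")   # backbone of the network

        # Centrality on the correlation graph (weights = correlations, clipped at 0)
        C = nx.from_numpy_array(np.clip(corr, 0.0, None))
        centrality = nx.eigenvector_centrality_numpy(C, weight="weight")
        print(sorted(centrality, key=centrality.get, reverse=True))  # most central sectors first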

  1. Coherence-limited solar power conversion: the fundamental thermodynamic bounds and the consequences for solar rectennas

    Science.gov (United States)

    Mashaal, Heylal; Gordon, Jeffrey M.

    2014-10-01

    Solar rectifying antennas constitute a distinct solar power conversion paradigm where sunlight's spatial coherence is a basic constraining factor. In this presentation, we derive the fundamental thermodynamic limit for coherence-limited blackbody (principally solar) power conversion. Our results represent a natural extension of the eponymous Landsberg limit, originally derived for converters that are not constrained by the radiation's coherence, and are irradiated at maximum concentration (i.e., with a view factor of unity to the solar disk). We proceed by first expanding Landsberg's results to arbitrary solar view factor (i.e., arbitrary concentration and/or angular confinement), and then demonstrate how the results are modified when the converter can only process coherent radiation. The results are independent of the specific power conversion mechanism, and hence are valid for diffraction-limited as well as quantum converters (and not just classical heat engines or in the geometric optics regime). The derived upper bounds bode favorably for the potential of rectifying antennas as potentially high-efficiency solar converters.
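
    For reference, the Landsberg limit mentioned above has a closed form for a converter at ambient temperature T_a receiving fully concentrated blackbody radiation at T_s: η_L = 1 - (4/3)(T_a/T_s) + (1/3)(T_a/T_s)^4. The snippet below evaluates it for commonly assumed temperatures (T_s = 6000 K, T_a = 300 K); these numbers are illustrative and not taken from the record.

        def landsberg_efficiency(T_sun=6000.0, T_ambient=300.0):
            """Landsberg limit for fully concentrated blackbody radiation."""
            r = T_ambient / T_sun
            return 1.0 - (4.0 / 3.0) * r + (1.0 / 3.0) * r ** 4

        print(f"{landsberg_efficiency():.4f}")   # ~0.9333 for 6000 K / 300 K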

  2. The separate universe approach to soft limits

    Energy Technology Data Exchange (ETDEWEB)

    Kenton, Zachary; Mulryne, David J., E-mail: z.a.kenton@qmul.ac.uk, E-mail: d.mulryne@qmul.ac.uk [School of Physics and Astronomy, Queen Mary University of London, Mile End Road, London, E1 4NS (United Kingdom)

    2016-10-01

    We develop a formalism for calculating soft limits of n-point inflationary correlation functions using separate universe techniques. Our method naturally allows for multiple fields and leads to an elegant diagrammatic approach. As an application we focus on the trispectrum produced by inflation with multiple light fields, giving explicit formulae for all possible single- and double-soft limits. We also investigate consistency relations and present an infinite tower of inequalities between soft correlation functions which generalise the Suyama-Yamaguchi inequality.

  3. An approach to fundamental study of beam loss minimization

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1999-01-01

    The accelerator design rules involving rms matching, developed at CERN in the 1970's, are discussed. An additional rule, for equipartitioning the beam energy among its degrees of freedom, may be added to ensure an rms equilibrium condition. If the strong stochasticity threshold is avoided, as it is in realistic accelerator designs, the dynamics is characterized by extremely long transient settling times, making the role of equipartitioning hard to explain. An approach to systematic study using the RFQ accelerator as a simulation testbed is discussed. New methods are available from recent advances in research on complexity, nonlinear dynamics, and chaos

  4. Limits of the endoscopic transnasal transtubercular approach.

    Science.gov (United States)

    Gellner, Verena; Tomazic, Peter V

    2018-06-01

    The endoscopic transnasal trans-sphenoidal transtubercular approach has become a standard alternative to transcranial neurosurgical routes for lesions of the anterior skull base, in particular pathologies of the anterior tubercle, sphenoid plane, and midline lesions up to the interpeduncular cistern. For both the endoscopic and the transcranial approach, indications must be strictly evaluated and tailored to the patients' morphology and condition. The purpose of this review was to evaluate the evidence in the literature on the limitations of the endoscopic transtubercular approach. A PubMed/Medline search was conducted in January 2018 entering the following keywords. Upon initial screening, 7 papers were included in this review. There are several other papers describing the endoscopic transtubercular approach (ETTA). We tried to list the limiting factors according to the existing literature as cited. The main limiting factors are laterally extending lesions in relation to the optic canal and vascular encasement and/or unfavorable tumor tissue consistency. The ETTA is considered a high-level transnasal endoscopic extended skull base approach and requires excellent training, skills and experience.

  5. Roothaan approach in the thermodynamic limit

    Science.gov (United States)

    Gutierrez, G.; Plastino, A.

    1982-02-01

    A systematic method for the solution of the Hartree-Fock equations in the thermodynamic limit is presented. The approach is seen to be a natural extension of the one usually employed in the finite-fermion case, i.e., that developed by Roothaan. The new techniques developed here are applied, as an example, to neutron matter, employing the so-called V1 Bethe "homework" potential. The results obtained are, by far, superior to those that the ordinary plane-wave Hartree-Fock theory yields. NUCLEAR STRUCTURE Hartree-Fock approach; nuclear and neutron matter.

  6. Fundamental limitations on V/STOL terminal guidance due to aircraft characteristics

    Science.gov (United States)

    Wolkovitch, J.; Lamont, C. W.; Lochtie, D. W.

    1971-01-01

    A review is given of limitations on approach flight paths of V/STOL aircraft, including limits on descent angle due to maximum drag/lift ratio. A method of calculating maximum drag/lift ratio of tilt-wing and deflected slipstream aircraft is presented. Derivatives and transfer functions for the CL-84 tilt-wing and X-22A tilt-duct aircraft are presented. For the unaugmented CL-84 in steep descents the transfer function relating descent angle to thrust contains a right-half plane zero. Using optimal control theory, it is shown that this zero causes a serious degradation in the accuracy with which steep flight paths can be followed in the presence of gusts.

  7. The Limits of Existential Autonomy and the Fundamental Legal Duties of Preserving Unconscious People's Lives

    Directory of Open Access Journals (Sweden)

    Ana Stela Vieira Mendes Câmara

    2016-12-01

    In the face of factual, conceptual and scientific uncertainties surrounding the finitude of life, and assuming the search for the ideal of a dignified, natural and proper death without undue anticipation or prolongation, this research investigates the reasonableness of the parameters that establish limitations on existential autonomy due to the preservation of the lives of unconscious people. It identifies, based on the heteronomous component of human dignity, the existence of a bundle of basic legal duties to protect these individuals, whose ownership rests with the family and the state. The methodology is qualitative, interdisciplinary, bibliographic and documentary, using a hypothetical-deductive approach.

  8. Roothaan approach in the thermodynamic limit

    International Nuclear Information System (INIS)

    Gutierrez, G.; Plastino, A.

    1982-01-01

    A systematic method for the solution of the Hartree-Fock equations in the thermodynamic limit is presented. The approach is seen to be a natural extension of the one usually employed in the finite-fermion case, i.e., that developed by Roothaan. The new techniques developed here are applied, as an example, to neutron matter, employing the so-called V1 Bethe homework potential. The results obtained are, by far, superior to those that the ordinary plane-wave Hartree-Fock theory yields

  9. The point of no return: A fundamental limit on the ability to control thought and action.

    Science.gov (United States)

    Logan, Gordon D

    2015-01-01

    Bartlett (1958. Thinking. New York: Basic Books) described the point of no return as a point of irrevocable commitment to action, which was preceded by a period of gradually increasing commitment. As such, the point of no return reflects a fundamental limit on the ability to control thought and action. I review the literature on the point of no return, taking three perspectives. First, I consider the point of no return from the perspective of the controlled act, as a locus in the architecture and anatomy of the underlying processes. I review experiments from the stop-signal paradigm that suggest that the point of no return is located late in the response system. Then I consider the point of no return from the perspective of the act of control that tries to change the controlled act before it becomes irrevocable. From this perspective, the point of no return is a point in time that provides enough "lead time" for the act of control to take effect. I review experiments that measure the response time to the stop signal as the lead time required for response inhibition in the stop-signal paradigm. Finally, I consider the point of no return in hierarchically controlled tasks, in which there may be many points of no return at different levels of the hierarchy. I review experiments on skilled typing that suggest different points of no return for the commands that determine what is typed and the countermands that inhibit typing, with increasing commitment to action the lower the level in the hierarchy. I end by considering the point of no return in perception and thought as well as action.
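
    The "lead time" idea in the stop-signal work described above is usually quantified as the stop-signal reaction time (SSRT). A common way to estimate it is the integration method from the independent race model: find the go-RT quantile equal to the probability of responding on stop trials and subtract the mean stop-signal delay. The sketch below illustrates that recipe on made-up numbers; it is a generic textbook estimator, not code from the article.

        import numpy as np

        def ssrt_integration(go_rts_ms, p_respond_given_stop, mean_ssd_ms):
            """Integration-method SSRT estimate from the independent race model:
            the go-RT quantile at p(respond | stop signal), minus the mean
            stop-signal delay (SSD)."""
            nth_rt = np.quantile(np.sort(go_rts_ms), p_respond_given_stop)
            return nth_rt - mean_ssd_ms

        rng = np.random.default_rng(1)
        go_rts = rng.normal(500, 80, size=400)   # hypothetical go reaction times, ms
        print(ssrt_integration(go_rts, p_respond_given_stop=0.45, mean_ssd_ms=230))
        # A typical estimate lands in the 200-300 ms range for these made-up inputs.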

  10. Surface chemistry and fundamental limitations on the plasma cleaning of metals

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Bin, E-mail: bindong@my.unt.edu [Department of Chemistry, University of North Texas, 1155 Union Circle 305070, Denton, TX, 76203 (United States); Driver, M. Sky, E-mail: Marcus.Driver@unt.edu [Department of Chemistry, University of North Texas, 1155 Union Circle 305070, Denton, TX, 76203 (United States); Emesh, Ismail, E-mail: Ismail_Emesh@amat.com [Applied Materials Inc., 3050 Bowers Ave, Santa Clara, CA, 95054 (United States); Shaviv, Roey, E-mail: Roey_Shaviv@amat.com [Applied Materials Inc., 3050 Bowers Ave, Santa Clara, CA, 95054 (United States); Kelber, Jeffry A., E-mail: Jeffry.Kelber@unt.edu [Department of Chemistry, University of North Texas, 1155 Union Circle 305070, Denton, TX, 76203 (United States)

    2016-10-30

    Highlights: • O₂-free plasma treatment of air-exposed Co or Cu surfaces yields remnant C layers inert to further plasma cleaning. • The remnant C layer is carbidic on Co and graphitic on Cu. • The formation of a remnant C layer is linked to plasma cleaning of a metal surface. - Abstract: In-situ X-ray photoelectron spectroscopy (XPS) studies reveal that plasma cleaning of air-exposed Co or Cu transition metal surfaces results in the formation of a remnant C film 1–3 monolayers thick, which is not reduced upon extensive further plasma exposure. This effect is observed for H₂ or NH₃ plasma cleaning of Co, and He or NH₃ plasma cleaning of Cu, with both inductively coupled (ICP) and capacitively coupled plasma (CCP). Changes in C 1s XPS spectra indicate that this remnant film formation is accompanied by the formation of carbidic C on Co and of graphitic C on Cu. This is in contrast to published work showing no such remnant carbidic/carbon layer after similar treatments of Si oxynitride surfaces. The observation of the remnant carbidic C film on Co and graphitic film on Cu, but not on silicon oxynitride (SiOxNy), regardless of plasma chemistry or type, indicates that this effect is due to plasma-induced secondary electron emission from the metal surface, resulting in transformation of sp³ adventitious C to either a metal carbide or graphite. These results suggest fundamental limitations to plasma-based surface cleaning procedures on metal surfaces.

  11. Ultrafast mid-IR laser scalpel: protein signals of the fundamental limits to minimally invasive surgery.

    Science.gov (United States)

    Amini-Nik, Saeid; Kraemer, Darren; Cowan, Michael L; Gunaratne, Keith; Nadesan, Puviindran; Alman, Benjamin A; Miller, R J Dwayne

    2010-09-28

    Lasers have in principle the capability to cut at the level of a single cell, the fundamental limit to minimally invasive procedures and restructuring of biological tissues. To date, this limit has not been achieved because of collateral damage on the macroscale arising from thermal and shock-wave effects in the surrounding tissue. Here, we report on a novel concept using a specifically designed Picosecond IR Laser (PIRL) that selectively energizes water molecules in the tissue to drive the ablation or cutting process faster than thermal exchange of energy and shock wave propagation, without plasma formation or ionizing radiation effects. The targeted laser process imparts the least amount of energy to the remaining tissue, without any of the deleterious photochemical or photothermal effects that accompany other laser wavelengths and pulse parameters. Full thickness incisional and excisional wounds were generated in CD1 mice using the Picosecond IR Laser, a conventional surgical laser (DELight Er:YAG) or mechanical surgical tools. Transmission and scanning electron microscopy showed that the PIRL laser produced minimal tissue ablation with less damage of surrounding tissues than wounds formed using the other modalities. The width of scars formed by wounds made with the PIRL laser was half that of the scars produced using either a conventional surgical laser or a scalpel. Aniline blue staining showed higher levels of collagen in the early stage of the wounds produced using the PIRL laser, suggesting that these wounds mature faster. More viable cells were extracted from skin using the PIRL laser, suggesting less cellular damage. β-catenin and TGF-β signalling, which are activated during the proliferative phase of wound healing and whose level of activation correlates with the size of wounds, was lower in wounds generated by the PIRL system. Wounds created with the PIRL system also showed a lower rate of cell proliferation. Direct comparison of wound

  12. Ultrafast mid-IR laser scalpel: protein signals of the fundamental limits to minimally invasive surgery.

    Directory of Open Access Journals (Sweden)

    Saeid Amini-Nik

    2010-09-01

    Full Text Available Lasers have in principle the capability to cut at the level of a single cell, the fundamental limit to minimally invasive procedures and restructuring of biological tissues. To date, this limit has not been achieved because of collateral damage on the macroscale arising from thermal and shock-wave effects in the surrounding tissue. Here, we report on a novel concept using a specifically designed Picosecond IR Laser (PIRL) that selectively energizes water molecules in the tissue to drive the ablation or cutting process faster than thermal exchange of energy and shock wave propagation, without plasma formation or ionizing radiation effects. The targeted laser process imparts the least amount of energy to the remaining tissue, without any of the deleterious photochemical or photothermal effects that accompany other laser wavelengths and pulse parameters. Full thickness incisional and excisional wounds were generated in CD1 mice using the Picosecond IR Laser, a conventional surgical laser (DELight Er:YAG) or mechanical surgical tools. Transmission and scanning electron microscopy showed that the PIRL laser produced minimal tissue ablation with less damage of surrounding tissues than wounds formed using the other modalities. The width of scars formed by wounds made with the PIRL laser was half that of the scars produced using either a conventional surgical laser or a scalpel. Aniline blue staining showed higher levels of collagen in the early stage of the wounds produced using the PIRL laser, suggesting that these wounds mature faster. More viable cells were extracted from skin using the PIRL laser, suggesting less cellular damage. β-catenin and TGF-β signalling, which are activated during the proliferative phase of wound healing and whose level of activation correlates with the size of wounds, was lower in wounds generated by the PIRL system. Wounds created with the PIRL system also showed a lower rate of cell proliferation. Direct

  13. Quantum cryptography approaching the classical limit.

    Science.gov (United States)

    Weedbrook, Christian; Pirandola, Stefano; Lloyd, Seth; Ralph, Timothy C

    2010-09-10

    We consider the security of continuous-variable quantum cryptography as we approach the classical limit, i.e., when the unknown preparation noise at the sender's station becomes significantly noisy or thermal (even by as much as 10⁴ times greater than the variance of the vacuum mode). We show that, provided the channel transmission losses do not exceed 50%, the security of quantum cryptography is not dependent on the channel transmission, and is therefore incredibly robust against significant amounts of excess preparation noise. We extend these results to consider for the first time quantum cryptography at wavelengths considerably longer than optical and find that regions of security still exist all the way down to the microwave.

  14. Fundamental limitations of non-thermal plasma processing for internal combustion engine NOx control

    International Nuclear Information System (INIS)

    Penetrante, B.M.

    1993-01-01

    This paper discusses the physics and chemistry of non-thermal plasma processing for post-combustion NOₓ control in internal combustion engines. A comparison of electron beam and electrical discharge processing is made regarding their power consumption, radical production, NOₓ removal mechanisms, and by-product formation. Can non-thermal deNOₓ operate efficiently without additives or catalysts? How much electrical power does it cost to operate? What are the by-products of the process? This paper addresses these fundamental issues based on an analysis of the electron-molecule processes and chemical kinetics.

  15. Approach to DOE threshold guidance limits

    International Nuclear Information System (INIS)

    Shuman, R.D.; Wickham, L.E.

    1984-01-01

    The need for less restrictive criteria governing the disposal of extremely low-level radioactive waste has long been recognized. The Low-Level Waste Management Program has been directed by the Department of Energy (DOE) to aid in the development of a threshold guidance limit for DOE low-level waste facilities. Project objectives are concerned with the definition of a threshold limit dose and pathway analysis of radionuclide transport within selected exposure scenarios at DOE sites. Results of the pathway analysis will be used to determine waste radionuclide concentration guidelines that meet the defined threshold limit dose. Methods of measurement and verification of concentration limits round out the project's goals. Work on defining a threshold limit dose is nearing completion. Pathway analysis of sanitary landfill operations at the Savannah River Plant and the Idaho National Engineering Laboratory is in progress using the DOSTOMAN computer code. Concentration limit calculations and determination of implementation procedures shall follow completion of the pathways work. 4 references

  16. Probing the fundamental limit of niobium in high radiofrequency fields by dual mode excitation in superconducting radiofrequency cavities

    International Nuclear Information System (INIS)

    Eremeev, Grigory; Geng, Rongli; Palczewski, Ari

    2011-01-01

    We have studied thermal breakdown in several multicell superconducting radiofrequency cavities by simultaneous excitation of two TM₀₁₀ passband modes. Unlike measurements done in the past, which indicated a clear thermal nature of the breakdown, our measurements present a more complex picture with an interplay of both thermal and magnetic effects. The JLab LG-1 cavity that we studied was limited at 40.5 MV/m, corresponding to B_peak = 173 mT, in the 8π/9 mode. Dual mode measurements on this quench indicate that it is not purely magnetic, and so we conclude that this field is not the fundamental limit in SRF cavities.

  17. Fundamentals of a graded approach to safety-related equipment setpoints

    International Nuclear Information System (INIS)

    Woodruff, B.A.; Cash, J.S. Jr.; Bockhorst, R.M.

    1993-01-01

    The concept of using a graded approach to reconstitute instrument setpoints associated with safety-related equipment was first presented to the industry by the U.S. Nuclear Regulatory Commission during the 1992 ISA/POWID Symposium in Kansas City, Missouri. The graded approach establishes that the manner in which a utility analyzes and documents setpoints is related to each setpoint's relative importance to safety. This allows a utility to develop separate requirements for setpoints of varying levels of safety significance. A graded approach to setpoints is a viable strategy that minimizes extraneous effort expended in resolving difficult issues that arise when formal setpoint methodology is applied blindly to all setpoints. Close examination of setpoint methodology reveals that the application of a graded approach is fundamentally dependent on the analytical basis of each individual setpoint

  18. Shotgun approaches to gait analysis : insights & limitations

    NARCIS (Netherlands)

    Kaptein, Ronald G.; Wezenberg, Daphne; IJmker, Trienke; Houdijk, Han; Beek, Peter J.; Lamoth, Claudine J. C.; Daffertshofer, Andreas

    2014-01-01

    Background: Identifying features for gait classification is a formidable problem. The number of candidate measures is legion. This calls for proper, objective criteria when ranking their relevance. Methods: Following a shotgun approach we determined a plenitude of kinematic and physiological gait

  19. Limits on variations in fundamental constants from 21-cm and ultraviolet Quasar absorption lines.

    Science.gov (United States)

    Tzanavaris, P; Webb, J K; Murphy, M T; Flambaum, V V; Curran, S J

    2005-07-22

    Quasar absorption spectra at 21-cm and UV rest wavelengths are used to estimate the time variation of x ≡ α²gₚμ, where α is the fine structure constant, gₚ the proton g factor, and μ ≡ mₑ/mₚ the electron/proton mass ratio. Over a redshift range 0.24 ≤ z_abs ≤ 2.04, the weighted total is Δx/x = (1.17 ± 1.01) × 10⁻⁵. A linear fit gives ẋ/x = (−1.43 ± 1.27) × 10⁻¹⁵ yr⁻¹. Two previous results on varying α yield the strong limits Δμ/μ = (2.31 ± 1.03) × 10⁻⁵ and Δμ/μ = (1.29 ± 1.01) × 10⁻⁵. Our sample, 8× larger than any previous, provides the first direct estimate of the intrinsic 21-cm and UV velocity differences, about 6 km s⁻¹.
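    As a point of orientation only (this first-order expansion is not quoted from the record): since x is defined as the product α²gₚμ, small fractional variations in the constants combine additively, which is how independent limits on Δα/α can be turned into limits on Δμ/μ:

        \[
          x \equiv \alpha^{2} g_{p}\,\mu
          \quad\Longrightarrow\quad
          \frac{\Delta x}{x} \simeq 2\,\frac{\Delta\alpha}{\alpha}
            + \frac{\Delta g_{p}}{g_{p}} + \frac{\Delta\mu}{\mu}.
        \]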

  20. The voluntary offset - approaches and limitations

    International Nuclear Information System (INIS)

    2012-06-01

    After briefly presenting the voluntary offset mechanism, which aims at funding projects that reduce or capture greenhouse gas emissions, this document describes the approach to be followed to adopt voluntary offsetting, for individuals as well as for companies, communities or event organisers. It covers other important contextual issues (projects developed under voluntary offsetting, the actors in the voluntary offset market, the state of that market, offset labels) and explains how to proceed in practice (defining objectives and expectations, identifying the applicable requirements, ensuring that those requirements are met in line with the expectations). It also addresses the case of voluntary offsetting in France (its difficult establishment, possible solutions)

  1. Fundamental Limits to Coherent Scattering and Photon Coalescence from Solid-State Quantum Emitters [arXiv]

    DEFF Research Database (Denmark)

    Iles-Smith, Jake; McCutcheon, Dara; Mørk, Jesper

    2016-01-01

    a substantial suppression of detrimental interactions between the source and its phonon environment. Nevertheless, we demonstrate here that this reasoning is incomplete, and phonon interactions continue to play a crucial role in determining solid-state emission characteristics even for very weak excitation. We...... find that the sideband resulting from non-Markovian relaxation of the phonon environment leads to a fundamental limit to the fraction of coherently scattered light and to the visibility of two-photon coalescence at weak driving, both of which are absent for atomic systems or within simpler Markovian...

  2. The thermodynamic limit and the finite-size behaviour of the fundamental Sp(2N) spin chain

    International Nuclear Information System (INIS)

    Martins, M.J.

    2002-01-01

    This paper is concerned with the study of the fundamental integrable Sp(2N) spin chain. The Bethe ansatz equations are solved by a special string structure which allows us to determine the bulk-limit properties. We present evidence that the critical properties of the system are governed by the product of N c = 1 conformal field theories and are therefore different from those of the Sp(2N) Wess-Zumino-Witten theory. We argue that many of our findings can be generalized to include anisotropic symplectic spin chains. The possible relevance of our results to the physics of spin-orbital spin chains is also discussed.

  3. Fire protection for nuclear power plants. Part 1. Fundamental approaches. Version 6/99

    International Nuclear Information System (INIS)

    1999-06-01

    The KTA nuclear safety code sets out the fundamental approaches and principles for the prevention of fires in nuclear power plants, addressing aspects such as the initiation, spreading, and effects of a fire: (a) fire loads and ignition sources, (b) structural and plant engineering conditions, (c) ways and means relating to fire detection, alarm and fire fighting. Relevant technical and organisational measures are defined. The scope and quality of the fire prevention measures to be taken, as well as the relevant in-service inspection activities, are determined according to the protective goals pursued in each case. (orig./CB) [de]

  4. Fundamentals, financial factors and firm investment in India: A Panel VAR approach

    OpenAIRE

    Das, Pranab Kumar

    2008-01-01

    This study analyses the role of fundamentals and financial factors in determining firm investment in India under an imperfect capital market, in a panel VAR framework. Previous research in this area is based on the test of significance (or some variant of this) of the cash flow variable in the investment equation. In this strand of research, cash flow is considered to be a financial factor. The major theoretical problem of this approach is that in a forward-looking model cash flow might be cor...

  5. Extending the fundamental imaging-depth limit of multi-photon microscopy by imaging with photo-activatable fluorophores.

    Science.gov (United States)

    Chen, Zhixing; Wei, Lu; Zhu, Xinxin; Min, Wei

    2012-08-13

    It is highly desirable to be able to optically probe biological activities deep inside live organisms. By employing a spatially confined excitation via a nonlinear transition, multiphoton fluorescence microscopy has become indispensable for imaging scattering samples. However, as the incident laser power drops exponentially with imaging depth due to scattering loss, the out-of-focus fluorescence eventually overwhelms the in-focal signal. The resulting loss of imaging contrast defines a fundamental imaging-depth limit, which cannot be overcome by increasing excitation intensity. Herein we propose to significantly extend this depth limit by multiphoton activation and imaging (MPAI) of photo-activatable fluorophores. The imaging contrast is drastically improved due to the created disparity of bright-dark quantum states in space. We demonstrate this new principle by both analytical theory and experiments on tissue phantoms labeled with synthetic caged fluorescein dye or genetically encodable photoactivatable GFP.

  6. When fast is better: protein folding fundamentals and mechanisms from ultrafast approaches.

    Science.gov (United States)

    Muñoz, Victor; Cerminara, Michele

    2016-09-01

    Protein folding research stalled for decades because conventional experiments indicated that proteins fold slowly and in single strokes, whereas theory predicted a complex interplay between dynamics and energetics resulting in myriad microscopic pathways. Ultrafast kinetic methods turned the field upside down by providing the means to probe fundamental aspects of folding, test theoretical predictions and benchmark simulations. Accordingly, experimentalists could measure the timescales for all relevant folding motions, determine the folding speed limit and confirm that folding barriers are entropic bottlenecks. Moreover, a catalogue of proteins that fold extremely fast (microseconds) could be identified. Such fast-folding proteins cross shallow free energy barriers or fold downhill, and thus unfold with minimal co-operativity (gradually). A new generation of thermodynamic methods has exploited this property to map folding landscapes, interaction networks and mechanisms at nearly atomic resolution. In parallel, modern molecular dynamics simulations have finally reached the timescales required to watch fast-folding proteins fold and unfold in silico. All of these findings have buttressed the fundamentals of protein folding predicted by theory, and are now offering the first glimpses at the underlying mechanisms. Fast folding appears to also have functional implications as recent results connect downhill folding with intrinsically disordered proteins, their complex binding modes and ability to moonlight. These connections suggest that the coupling between downhill (un)folding and binding enables such protein domains to operate analogically as conformational rheostats. © 2016 The Author(s).

  7. A Fundamental Approach to Developing Aluminium based Bulk Amorphous Alloys based on Stable Liquid Metal Structures and Electronic Equilibrium - 154041

    Science.gov (United States)

    2017-03-28

    AFRL-AFOSR-JP-TR-2017-0027: A Fundamental Approach to Developing Aluminium-based Bulk Amorphous Alloys based on Stable Liquid-Metal Structures and Electronic Equilibrium (final report; performance period ending 16 Dec 2016). The work aimed at providing the Air Force Research Laboratory with a means of accurately predicting compositions of new amorphous alloys, specifically based on aluminium, with properties superior

  8. Fundamental parameters approach applied to focal construct geometry for X-ray diffraction

    International Nuclear Information System (INIS)

    Rogers, K.; Evans, P.; Prokopiou, D.; Dicken, A.; Godber, S.; Rogers, J.

    2012-01-01

    A novel geometry for the acquisition of powder X-ray diffraction data, referred to as focal construct geometry (FCG), is presented. Diffraction data obtained by FCG have been shown to possess significantly enhanced intensity due to the hollow tube beam arrangement utilized. In contrast to conventional diffraction, the detector is translated to collect images along a primary axis and record the location of Bragg maxima. These high intensity condensation foci are unique to FCG and appear due to the convergence of Debye cones at single points on the primary axis. This work focuses on a two-dimensional, fundamental parameters approach to simulate experimental data and subsequently aid with interpretation. This convolution method is shown to favorably reproduce the experimental diffractograms and can also accommodate preferred orientation effects in some circumstances.

  9. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    International Nuclear Information System (INIS)

    Egan, James; McMillan, Normal; Denieffe, David

    2011-01-01

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology for the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement. These are the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures and discussion is given into the advantages of using these fundamental quantitations over existing methods.

  10. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    Energy Technology Data Exchange (ETDEWEB)

    Egan, James; McMillan, Normal; Denieffe, David, E-mail: eganj@itcarlow.ie [IT Carlow (Ireland)

    2011-08-17

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology for the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement. These are the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures and discussion is given into the advantages of using these fundamental quantitations over existing methods.

  11. The fundamental parameter approach of quantitative XRFA- investigation of photoelectric absorption coefficients

    International Nuclear Information System (INIS)

    Shaltout, A.

    2003-06-01

    The present work describes some current problems of quantitative x-ray fluorescence analysis by means of the fundamental parameter approach. To perform this task, some of the main parameters are discussed in detail. These parameters are photoelectric cross sections, coherent and incoherent scattering cross sections, mass absorption cross sections and the variation of the x-ray tube voltage. Photoelectric cross sections, coherent and incoherent scattering cross sections and mass absorption cross sections in the energy range from 1 to 300 keV for the elements from Z=1 to 94 are studied, considering ten different data bases. These are the data bases given by Hubbell, McMaster, Mucall, Scofield, Xcom, Elam, Sasaki, Henke, Cullen and Chantler. These data bases have also been developed for application in fundamental parameter programs for quantitative x-ray analysis (Energy Dispersive X-Ray Fluorescence Analysis (EDXRFA), Electron Probe Microanalysis (EPMA), X-Ray Photoelectron Spectroscopy (XPS) and Total Electron Yield (TEY)). In addition, a comparison is performed between the different data bases. In McMaster's data base, the missing elements (Z=84, 85, 87, 88, 89, 91, and 93) are added by using photoelectric cross sections of Scofield's data base, coherent as well as incoherent scattering cross sections of Elam's data base and the absorption edges of Bearden. Also, the N-fit coefficients of the elements from Z=61 to 69 are wrong in the McMaster data base; therefore, linear least squares fits are used to recalculate the N-fit coefficients of these elements. Additionally, in the McMaster tables the positions of the M- and N-edges of all elements, with the exception of the M1- and N1-edges, are not defined, nor are the jump ratios of the edges. In the present work, the M- and N-edges and the related jump ratios are calculated. To include the missing N-edges, Bearden's values of the edge energies are used. In Scofield's data base, modifications include check and correction

  12. Fundamental approaches for analysis thermal hydraulic parameter for Puspati Research Reactor

    International Nuclear Information System (INIS)

    Hashim, Zaredah; Lanyau, Tonny Anak; Farid, Mohamad Fairus Abdul; Kassim, Mohammad Suhaimi; Azhar, Noraishah Syahirah

    2016-01-01

    The 1-MW PUSPATI Research Reactor (RTP) is the only nuclear pool-type research reactor in Malaysia, developed by General Atomic (GA). It was installed at the Malaysian Nuclear Agency and reached first criticality on 8 June 1982. Based on the initial core, which comprised 80 standard TRIGA fuel elements, the fundamental thermal hydraulic model was investigated during steady state operation using the PARET-code. The main objective of this paper is to determine the variation of temperature profiles and the Departure from Nucleate Boiling Ratio (DNBR) of RTP at full power operation. The second objective is to confirm that the values obtained from the PARET-code are in agreement with the Safety Analysis Report (SAR) for RTP. The code was employed for the hot and average channels in the core in order to calculate the fuel centre and surface, cladding and coolant temperatures, as well as the DNBR values. In this study, it was found that the results obtained from the PARET-code showed that the thermal hydraulic parameters related to safety for the initial core, which was cooled by natural convection, were in agreement with the designed values and the safety limit in the SAR.
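    For orientation only (a standard textbook definition, not quoted from the record): the Departure from Nucleate Boiling Ratio compares the predicted critical heat flux with the actual local heat flux along the hot channel,

        \[
          \mathrm{DNBR}(z) = \frac{q''_{\mathrm{CHF}}(z)}{q''_{\mathrm{local}}(z)},
        \]

    and the safety criterion is that its minimum over the channel remains above the design limit stated in the SAR.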

  13. Fundamental approaches for analysis thermal hydraulic parameter for Puspati Research Reactor

    Science.gov (United States)

    Hashim, Zaredah; Lanyau, Tonny Anak; Farid, Mohamad Fairus Abdul; Kassim, Mohammad Suhaimi; Azhar, Noraishah Syahirah

    2016-01-01

    The 1-MW PUSPATI Research Reactor (RTP) is the only nuclear pool-type research reactor in Malaysia, developed by General Atomic (GA). It was installed at the Malaysian Nuclear Agency and reached first criticality on 8 June 1982. Based on the initial core, which comprised 80 standard TRIGA fuel elements, the fundamental thermal hydraulic model was investigated during steady state operation using the PARET-code. The main objective of this paper is to determine the variation of temperature profiles and the Departure from Nucleate Boiling Ratio (DNBR) of RTP at full power operation. The second objective is to confirm that the values obtained from the PARET-code are in agreement with the Safety Analysis Report (SAR) for RTP. The code was employed for the hot and average channels in the core in order to calculate the fuel centre and surface, cladding and coolant temperatures, as well as the DNBR values. In this study, it was found that the results obtained from the PARET-code showed that the thermal hydraulic parameters related to safety for the initial core, which was cooled by natural convection, were in agreement with the designed values and the safety limit in the SAR.

  14. Fundamental approaches for analysis thermal hydraulic parameter for Puspati Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Hashim, Zaredah, E-mail: zaredah@nm.gov.my; Lanyau, Tonny Anak, E-mail: tonny@nm.gov.my; Farid, Mohamad Fairus Abdul; Kassim, Mohammad Suhaimi [Reactor Technology Centre, Technical Support Division, Malaysia Nuclear Agency, Ministry of Science, Technology and Innovation, Bangi, 43000, Kajang, Selangor Darul Ehsan (Malaysia); Azhar, Noraishah Syahirah [Universiti Teknologi Malaysia, 80350, Johor Bahru, Johor Darul Takzim (Malaysia)

    2016-01-22

    The 1-MW PUSPATI Research Reactor (RTP) is the only nuclear pool-type research reactor in Malaysia, developed by General Atomic (GA). It was installed at the Malaysian Nuclear Agency and reached first criticality on 8 June 1982. Based on the initial core, which comprised 80 standard TRIGA fuel elements, the fundamental thermal hydraulic model was investigated during steady state operation using the PARET-code. The main objective of this paper is to determine the variation of temperature profiles and the Departure from Nucleate Boiling Ratio (DNBR) of RTP at full power operation. The second objective is to confirm that the values obtained from the PARET-code are in agreement with the Safety Analysis Report (SAR) for RTP. The code was employed for the hot and average channels in the core in order to calculate the fuel centre and surface, cladding and coolant temperatures, as well as the DNBR values. In this study, it was found that the results obtained from the PARET-code showed that the thermal hydraulic parameters related to safety for the initial core, which was cooled by natural convection, were in agreement with the designed values and the safety limit in the SAR.

  15. [95/95] Approach for design limits analysis in WWER

    International Nuclear Information System (INIS)

    Shishkov, L.; Tsyganov, S.

    2008-01-01

    The paper discusses a well-known condition, [95%/95%], which is important for monitoring some limits of core parameters in the course of designing reactors such as PWR or WWER. The condition ensures the postulate that 'there is at least a 95 % probability at a 95 % confidence level that' some parameter does not exceed the limit. Such conditions are stated, for instance, in US standards and IAEA norms as recommendations for DNBR and fuel temperature. A question may arise: why is such an approach applied only to these parameters, and not normally to any others? And how can the limits be ensured in design practice? Using general results of mathematical statistics, the authors interpret the [95/95] approach as applied to WWER design limits. (Authors)
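    Purely as an illustrative aside (not taken from the paper): one common nonparametric route to a [95/95] statement is Wilks' formula, by which the largest of n independent best-estimate code runs bounds the 95th percentile with 95% confidence once 1 − 0.95ⁿ ≥ 0.95. A minimal Python sketch, with hypothetical sample values:

        def wilks_sample_size(coverage=0.95, confidence=0.95):
            """Smallest n for which the sample maximum of n independent runs is a
            one-sided upper tolerance bound (first-order Wilks: 1 - coverage**n >= confidence)."""
            n = 1
            while 1.0 - coverage ** n < confidence:
                n += 1
            return n

        def upper_95_95_bound(samples):
            """One-sided upper [95/95] bound: the maximum of at least wilks_sample_size()
            independent samples of the parameter of interest (e.g. peak fuel temperature)."""
            n_required = wilks_sample_size()
            if len(samples) < n_required:
                raise ValueError(f"need at least {n_required} runs, got {len(samples)}")
            return max(samples)

        print(wilks_sample_size())  # 59 runs for the 95%/95% criterion

    The parametric alternative often used for normally distributed parameters instead takes the sample mean plus a one-sided tolerance factor times the sample standard deviation.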

  16. MAKING THE NEIGHBOURHOOD A BETTER PLACE TO LIVE. A SWB APPROACH IMPLEMENTING FUNDAMENTAL HUMAN NEEDS

    Directory of Open Access Journals (Sweden)

    Ioanna Anna Papachristou

    2015-10-01

    Full Text Available Subjective well-being (SWB) studies have been at the centre of researchers' attention during the last years. With the majority of people now living in cities, the necessity for a more anthropocentric approach for the study and betterment of urban environments is constantly increasing. In this sense, defining and measuring SWB in urban contexts can be of particular benefit in urban design and planning processes. In this article, a method for measuring SWB for urban places based on the accomplishment of fundamental human needs is presented and applied to a neighbourhood of Barcelona, that of Vila de Gràcia. For the measurement, a survey was constructed based on the specific geographical and socio-economic characteristics of the study case. Retrieved from Max-Neef's Human Scale Development Paradigm (Max-Neef et al. 1991), human needs correspond to the domains of study of the suggested method. The matching of the survey's questions to each need is the outcome of two consecutive processes: a first qualitative one, involving the work of an expert group, and a second quantitative one, involving the definition of weights among the questions that affect the same need. Although the final result is positive (though low) for this study case, results for each need show considerable differences in their level of accomplishment. At the same time people seem to truly believe that most of their feelings are affected by their living environment, with stress and calmness leading the list. In summary, the method defines and applies a simple tool to quantify and evaluate current levels of SWB at different urban scales and to determine more holistic urban indexes in order to improve decision making processes, policies and plans. The classification of the questions per need favours the identification of a potential problem in the urban grid and consequently can be used as a process for implementing related measures of improvement. The method can also be seen
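    A minimal sketch of the kind of aggregation the method describes (illustrative only; the survey items, weights and aggregation rule below are hypothetical, not those used in the study):

        from collections import defaultdict

        def need_scores(answers, question_need, question_weight):
            """answers: {question_id: value in [0, 1]}
            question_need: {question_id: need name, e.g. 'subsistence', 'protection'}
            question_weight: {question_id: weight within the need it affects}"""
            totals, weights = defaultdict(float), defaultdict(float)
            for q, value in answers.items():
                need = question_need[q]
                totals[need] += question_weight[q] * value
                weights[need] += question_weight[q]
            return {need: totals[need] / weights[need] for need in totals}

        def swb_index(scores):
            # Simple unweighted mean over the needs; the paper's exact aggregation may differ.
            return sum(scores.values()) / len(scores)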

  17. Analysis of Budget Deficits and Macroeconomic Fundamentals: A VAR-VECM Approach

    Directory of Open Access Journals (Sweden)

    Manamba Epaphra

    2017-10-01

    Full Text Available Aim/purpose - This paper examines the relationship between budget deficits and selected macroeconomic variables in Tanzania for the period spanning from 1966 to 2015. Design/methodology/approach - The paper uses vector autoregression (VAR), vector error correction model (VECM) and variance decomposition techniques. Johansen's test is applied to examine the long run relationship among the variables under study. Findings - Johansen's cointegration test indicates that the variables are cointegrated and thus have a long run relationship. The results based on the VAR-VECM estimation show that real GDP and exchange rate have a negative and significant relationship with the budget deficit, whereas inflation, money supply and lending interest rate have a positive one. Variance decomposition results show that variances in the budget deficits are mostly explained by real GDP, followed by inflation and the real exchange rate. Research implications/limitations - Results are very indicative, but highlight the importance of containing inflation and money supply to check their effects on budget deficits over the short run and long-run periods. The policy recommendation also calls for fiscal authorities in Tanzania to adopt efficient and effective methods of tax collection and public sector spending. Originality/value/contribution - Tanzania has been experiencing budget deficits since the 1970s, and these deficits have been blamed for high indebtedness, inflation and poor investment and growth. The paper contributes to the empirical debate on the causal relationship between budget deficits and macroeconomic variables by employing VAR-VECM and variance decomposition approaches.
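    For readers unfamiliar with the workflow, a compact sketch of a Johansen cointegration test followed by a VECM fit using statsmodels; the file name, column names and lag/rank choices are hypothetical, not the authors' data or specification:

        import pandas as pd
        from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM, select_order

        df = pd.read_csv("tanzania_macro.csv", index_col="year")  # hypothetical file
        data = df[["deficit", "real_gdp", "inflation", "m2", "lending_rate", "reer"]]

        # Choose the lag order, then test for cointegration (Johansen trace statistics).
        lags = select_order(data, maxlags=4, deterministic="ci").aic
        jres = coint_johansen(data, det_order=0, k_ar_diff=lags)
        print("trace statistics:", jres.lr1)
        print("5% critical values:", jres.cvt[:, 1])

        # Fit a VECM with the chosen cointegration rank and inspect the long-run relations.
        vecm = VECM(data, k_ar_diff=lags, coint_rank=1, deterministic="ci").fit()
        print(vecm.summary())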

  18. Approaching the Shockley-Queisser limit: General assessment of the main limiting mechanisms in photovoltaic cells

    International Nuclear Information System (INIS)

    Vossier, Alexis; Gualdi, Federico; Dollet, Alain; Ares, Richard; Aimez, Vincent

    2015-01-01

    In principle, the upper efficiency limit of any solar cell technology can be determined using the detailed-balance limit formalism. However, “real” solar cells show efficiencies which are always below this theoretical value due to several limiting mechanisms. We study the ability of a solar cell architecture to approach its own theoretical limit, using a novel index introduced in this work, and the amplitude with which the different limiting mechanisms affect the cell efficiency is scrutinized as a function of the electronic gap and the illumination level to which the cell is submitted. The implications for future generations of solar cells aiming at an improved conversion of the solar spectrum are also addressed
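    For context, a textbook sketch (not taken from the article) of the detailed-balance limit the authors use as the reference point: the cell is assumed to absorb every photon above the gap E_g and to emit as a generalized Planck radiator at its own temperature, so that

        \[
          J(V) = q\int_{E_g}^{\infty}\bigl[\Phi_{\mathrm{sun}}(E) - \Phi_{\mathrm{cell}}(E,V)\bigr]\,\mathrm{d}E,
          \qquad
          \eta = \max_{V}\,\frac{J(V)\,V}{P_{\mathrm{in}}},
        \]

    with \(\Phi_{\mathrm{cell}}(E,V) \propto E^{2}\exp[-(E-qV)/k_{B}T_{c}]\) in the Boltzmann approximation; the limiting mechanisms discussed in the record enter as departures from these idealized absorption and emission terms.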

  19. Bayesian analysis of rotating machines - A statistical approach to estimate and track the fundamental frequency

    DEFF Research Database (Denmark)

    Pedersen, Thorkild Find

    2003-01-01

    Rotating and reciprocating mechanical machines emit acoustic noise and vibrations when they operate. Typically, the noise and vibrations are concentrated in narrow frequency bands related to the running speed of the machine. The frequency of the running speed is referred to as the fundamental frequency and the related frequencies as orders of the fundamental frequency. When analyzing rotating or reciprocating machines it is important to know the running speed. Usually this requires direct access to the rotating parts in order to mount a dedicated tachometer probe. In this thesis different...
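    As an illustration of the underlying signal-processing problem only (a much simpler harmonic-sum estimator, not the Bayesian scheme developed in the thesis): candidate fundamental frequencies are scored by the spectral energy found at their orders, and the best-scoring candidate is taken as the running-speed estimate.

        import numpy as np

        def estimate_fundamental(signal, fs, f_min=5.0, f_max=200.0, n_orders=5):
            # Magnitude spectrum of the windowed signal.
            spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            candidates = np.arange(f_min, f_max, 0.1)
            scores = []
            for f0 in candidates:
                # Sum the energy at the first n_orders harmonics of the candidate.
                idx = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, n_orders + 1)]
                scores.append(spectrum[idx].sum())
            return candidates[int(np.argmax(scores))]

        # Synthetic test: 25 Hz shaft speed plus harmonics and noise.
        fs = 1000.0
        t = np.arange(0, 2.0, 1 / fs)
        x = sum(np.sin(2 * np.pi * 25 * k * t) / k for k in range(1, 4))
        x += 0.1 * np.random.randn(t.size)
        print(estimate_fundamental(x, fs))  # approximately 25.0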

  20. THE NECESSITY OF APPROACHING THE ENTERPRISE PERFORMANCE CONCEPT THROUGH A THEORETICAL FUNDAMENTAL SYSTEM

    Directory of Open Access Journals (Sweden)

    DEAC VERONICA

    2017-10-01

    Full Text Available The purpose of this paper is to justify the necessity of building a theoretical-fundamental system to define and delimit the integrated notions applicable to the concept of enterprise performance. As a piece of fundamental research, the present paper argues and shows that both the literature in this field and the applied environment require a clearer delimitation, respectively an increased specificity, of the concept of "enterprise performance", considering that, on one hand, it is not unanimously defined and, especially, on the other hand, that it represents a widely used key concept which ultimately has to be measured in order to be useful. Moreover, the present paper should be useful to scholars working in the field of firm performance who wish to understand this concept and to develop future research on enterprise performance measurement.

  1. Enhanced defence in depth: a fundamental approach for innovative nuclear systems recommended by INPRO

    International Nuclear Information System (INIS)

    Kuczera, B.; Juhn, P.E.

    2004-01-01

    In May 2001, the IAEA initiated the 'International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO)'. Bearing in mind that nuclear power will be an important option for meeting future electricity needs, the scope of INPRO covers nuclear reactors expected to come into service in the next fifty years, together with their associated fuel cycles. This article deals with the enhanced defence in depth (DID) strategy recommended by INPRO. This strategy is twofold: first, to prevent accidents and second, if prevention fails, to limit their potential consequences and prevent any evolution to more serious conditions. Accident prevention is the first priority. For innovative nuclear systems, the effectiveness of preventive measures should be enhanced compared with existing systems. DID is generally structured in 5 levels of protection, including successive barriers preventing the release of radioactive material to the environment. These levels are: 1) prevention of abnormal operation and failures, 2) control of abnormal operation and detection of failures, 3) control of accidents within the design basis, 4) control of severe plant conditions, including prevention and mitigation of the consequences of severe accidents, and 5) mitigation of radiological consequences of significant releases of radioactive materials. In the area of nuclear safety, INPRO has set 5 principles: 1) incorporate DID as a part of the safety approach and make the 5 levels of DID more independent from each other than in current installations; 2) prevent, reduce or contain releases of radioactive or hazardous materials in any normal or abnormal plant operation; 3) incorporate increased emphasis on inherent safety characteristics and passive safety features; 4) include research and development work to bring the capability of computer codes used for the safety of innovative nuclear systems to the standard of codes used for the safety of current reactors; and 5) include a holistic life

  2. Fundamental ecology is fundamental.

    Science.gov (United States)

    Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E

    2015-01-01

    The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we develop why we believe it is essential to promote basic ecological research, despite increased impetus for ecologists to conduct and present their research in the light of potential applications. This includes the understanding of our environment, for intellectual, economical, social, and political reasons, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.

  3. Stability of rigid rotors supported by air foil bearings: Comparison of two fundamental approaches

    DEFF Research Database (Denmark)

    Larsen, Jon Steffen; Santos, Ilmar; von Osmanski, Alexander Sebastian

    2016-01-01

    High speed direct drive motors enable the use of Air Foil Bearings (AFB) in a wide range of applications due to the elimination of gear forces. Unfortunately, AFB supported rotors are lightly damped, and an accurate prediction of their Onset Speed of Instability (OSI) is therefore important. This paper compares two fundamental methods for predicting the OSI. One is based on a nonlinear time domain simulation and another is based on a linearised frequency domain method and a perturbation of the Reynolds equation. Both methods are based on equivalent models and should predict similar results...

  4. The pair potential approach for interfaces: Fundamental problems and practical solutions

    International Nuclear Information System (INIS)

    Maggs, A.C.; Ashcroft, N.W.

    1987-09-01

    A fundamental problem in the use of a central pair-force model for defect problems is that it omits three-body and higher terms which are necessarily present in real systems. Electronic fluctuation effects are also usually omitted. While these can be small in the simple metals, they are significant in noble and transition metals, as shown by a simple real space argument. To gauge the importance of their effects in interface problems, the structure of a simple Σ5 twist boundary is examined, with the atoms described by both pair- and three-center interactions and as a function of the relative strength of the two. 15 refs

  5. Reliability analysis - systematic approach based on limited data

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1975-11-01

    The initial approaches required for reliability analysis are outlined. These approaches highlight the system boundaries, examine the conditions under which the system is required to operate, and define the overall performance requirements. The discussion is illustrated by a simple example of an automatic protective system for a nuclear reactor. It is then shown how the initial approach leads to a method of defining the system, establishing performance parameters of interest and determining the general form of reliability models to be used. The overall system model and the availability of reliability data at the system level are next examined. An iterative process is then described whereby the reliability model and data requirements are systematically refined at progressively lower hierarchic levels of the system. At each stage, the approach is illustrated with examples from the protective system previously described. The main advantages of the approach put forward are the systematic process of analysis, the concentration of assessment effort in the critical areas and the maximum use of limited reliability data. (author)

  6. Arguing against fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    This paper aims to open up discussion on the relationship between fundamentality and naturalism, and in particular on the question of whether fundamentality may be denied on naturalistic grounds. A historico-inductive argument for an anti-fundamentalist conclusion, prominent within the contemporary metaphysical literature, is examined; finding it wanting, an alternative 'internal' strategy is proposed. By means of an example from the history of modern physics - namely S-matrix theory - it is demonstrated that (1) this strategy can generate similar (though not identical) anti-fundamentalist conclusions on more defensible naturalistic grounds, and (2) that fundamentality questions can be empirical questions. Some implications and limitations of the proposed approach are discussed.

  7. Promoting fundamental clinical skills: a competency-based college approach at the University of Washington.

    Science.gov (United States)

    Goldstein, Erika A; Maclaren, Carol F; Smith, Sherilyn; Mengert, Terry J; Maestas, Ramoncita R; Foy, Hugh M; Wenrich, Marjorie D; Ramsey, Paul G

    2005-05-01

    The focus on fundamental clinical skills in undergraduate medical education has declined over the last several decades. Dramatic growth in the number of faculty involved in teaching and increasing clinical and research commitments have contributed to depersonalization and declining individual attention to students. In contrast to the close teaching and mentoring relationship between faculty and students 50 years ago, today's medical students may interact with hundreds of faculty members without the benefit of a focused program of teaching and evaluating clinical skills to form the core of their four-year curriculum. Bedside teaching has also declined, which may negatively affect clinical skills development. In response to these and other concerns, the University of Washington School of Medicine has created an integrated developmental curriculum that emphasizes bedside teaching and role modeling, focuses on enhancing fundamental clinical skills and professionalism, and implements these goals via a new administrative structure, the College system, which consists of a core of clinical teachers who spend substantial time teaching and mentoring medical students. Each medical student is assigned a faculty mentor within a College for the duration of his or her medical school career. Mentors continuously teach and reflect with students on clinical skills development and professionalism and, during the second year, work intensively with them at the bedside. They also provide an ongoing personal faculty contact. Competency domains and benchmarks define skill areas in which deepening, progressive attention is focused throughout medical school. This educational model places primary focus on the student.

  8. Fundamental length

    International Nuclear Information System (INIS)

    Pradhan, T.

    1975-01-01

    The concept of fundamental length was first put forward by Heisenberg from purely dimensional reasons. From a study of the observed masses of the elementary particles known at that time, it is surmised that this length should be of the order of magnitude l ≈ 10⁻¹³ cm. It was Heisenberg's belief that the introduction of such a fundamental length would eliminate the divergence difficulties from relativistic quantum field theory by cutting off the high energy regions of the 'proper fields'. Since the divergence difficulties arise primarily due to an infinite number of degrees of freedom, one simple remedy would be the introduction of a principle that limits these degrees of freedom by removing the effectiveness of the waves with a frequency exceeding a certain limit, without destroying the relativistic invariance of the theory. The principle can be stated as follows: It is in principle impossible to invent an experiment of any kind that will permit a distinction between the positions of two particles at rest, the distance between which is below a certain limit. A more elegant way of introducing fundamental length into quantum theory is through commutation relations between two position operators. In quantum field theory such as quantum electrodynamics, it can be introduced through the commutation relation between two interpolating photon fields (vector potentials). (K.B.)

  9. Fundamental (f) oscillations in a magnetically coupled solar interior-atmosphere system - An analytical approach

    Science.gov (United States)

    Pintér, Balázs; Erdélyi, R.

    2018-01-01

    Solar fundamental (f) acoustic mode oscillations are investigated analytically in a magnetohydrodynamic (MHD) model. The model consists of three layers in planar geometry, representing the solar interior, the magnetic atmosphere, and a transitional layer sandwiched between them. Since we focus on the fundamental mode here, we assume the plasma is incompressible. A horizontal, canopy-like, magnetic field is introduced to the atmosphere, in which degenerated slow MHD waves can exist. The global (f-mode) oscillations can couple to local atmospheric Alfvén waves, resulting, e.g., in a frequency shift of the oscillations. The dispersion relation of the global oscillation mode is derived, and is solved analytically for the thin-transitional layer approximation and for the weak-field approximation. Analytical formulae are also provided for the frequency shifts due to the presence of a thin transitional layer and a weak atmospheric magnetic field. The analytical results generally indicate that, compared to the fundamental value (ω = √(gk)), the mode frequency is reduced by the presence of an atmosphere by a few per cent. A thin transitional layer reduces the eigen-frequencies further by about an additional hundred microhertz. Finally, a weak atmospheric magnetic field can slightly, by a few per cent, increase the frequency of the eigen-mode. Stronger magnetic fields, however, can increase the f-mode frequency by even up to ten per cent, which cannot be seen in observed data. The presence of a magnetic atmosphere in the three-layer model also introduces non-permitted propagation windows in the frequency spectrum; here, f-mode oscillations cannot exist with certain values of the harmonic degree. The eigen-frequencies can be sensitive to the background physical parameters, such as an atmospheric density scale-height or the rate of the plasma density drop at the photosphere. Such information, if ever observed with high-resolution instrumentation and inverted, could help to

  10. A Unique Mathematical Derivation of the Fundamental Laws of Nature Based on a New Algebraic-Axiomatic (Matrix) Approach

    Directory of Open Access Journals (Sweden)

    Ramin Zahedi

    2017-09-01

    Full Text Available In this article, as a new mathematical approach to the origin of the laws of nature, using a new basic algebraic axiomatic (matrix) formalism based on the ring theory and Clifford algebras (presented in Section 2), "it is shown that certain mathematical forms of fundamental laws of nature, including laws governing the fundamental forces of nature (represented by a set of two definite classes of general covariant massive field equations, with new matrix formalisms), are derived uniquely from only a very few axioms." In agreement with the rational Lorentz group, it is also basically assumed that the components of relativistic energy-momentum can only take rational values. In essence, the main scheme of this new mathematical axiomatic approach to the fundamental laws of nature is as follows: First, based on the assumption of the rationality of D-momentum and by linearization (along with a parameterization procedure) of the Lorentz invariant energy-momentum quadratic relation, a unique set of Lorentz invariant systems of homogeneous linear equations (with matrix formalisms compatible with certain Clifford and symmetric algebras) is derived. Then, by an initial quantization (followed by a basic procedure of minimal coupling to space-time geometry) of these determined systems of linear equations, a set of two classes of general covariant massive (tensor) field equations (with matrix formalisms compatible with certain Clifford and Weyl algebras) is derived uniquely as well.

  11. First-Principles Approach to Model Electrochemical Reactions: Understanding the Fundamental Mechanisms behind Mg Corrosion

    Science.gov (United States)

    Surendralal, Sudarsan; Todorova, Mira; Finnis, Michael W.; Neugebauer, Jörg

    2018-06-01

    Combining concepts of semiconductor physics and corrosion science, we develop a novel approach that allows us to perform ab initio calculations under controlled potentiostat conditions for electrochemical systems. The proposed approach can be straightforwardly applied in standard density functional theory codes. To demonstrate the performance and the opportunities opened by this approach, we study the chemical reactions that take place during initial corrosion at the water-Mg interface under anodic polarization. Based on this insight, we derive an atomistic model that explains the origin of the anodic hydrogen evolution.

  12. Fundamental principles of the cultural-activity approach in the psychology of giftedness

    OpenAIRE

    Babaeva, Julia

    2013-01-01

    This article examines the cultural-activity approach to the study of giftedness, which is based on the ideas of L. S. Vygotsky, A. N. Leontiev, and O. K. Tikhomirov. Three basic principles of this approach are described: the principle of polymorphism, the dynamic principle, and the principle of the holistic analysis of the giftedness phenomenon. The article introduces the results of empirical research (including a 10-year longitudinal study), which verifies the efficacy of the cultural-activi...

  13. Social use of alcohol among adolescent offenders: a fundamental approach toward human needs

    Directory of Open Access Journals (Sweden)

    Gustavo D?Andrea

    2014-02-01

    Full Text Available This study examined some basic health care approaches toward human needs, with a particular focus on nursing. We aimed to incorporate these approaches into the discussion of the mental health of adolescent offenders who consume alcohol. We discuss specific needs of the delinquent group, critique policies that prioritize coercion of adolescent offenders, and the role that nurses could play in the sphere of juvenile delinquency.

  14. Treatment for spasmodic dysphonia: limitations of current approaches

    Science.gov (United States)

    Ludlow, Christy L.

    2009-01-01

    Purpose of review Although botulinum toxin injection is the gold standard for treatment of spasmodic dysphonia, surgical approaches aimed at providing long-term symptom control have been advancing over recent years. Recent findings When surgical approaches provide greater long-term benefits to symptom control, they also increase the initial period of side effects of breathiness and swallowing difficulties. However, recent analyses of quality-of-life questionnaires in patients undergoing regular injections of botulinum toxin demonstrate that a large proportion of patients have limited relief for relatively short periods due to early breathiness and loss-of-benefit before reinjection. Summary Most medical and surgical approaches to the treatment of spasmodic dysphonia have been aimed at denervation of the laryngeal muscles to block symptom expression in the voice, and have both adverse effects as well as treatment benefits. Research is needed to identify the central neuropathophysiology responsible for the laryngeal muscle spasms in order target treatment towards the central neurological abnormality responsible for producing symptoms. PMID:19337127

  15. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 2: Fundamentals and application to counting measurements with the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) on samples are counted after treating them (e.g. aliquotation, solution, enrichment, separation). It considers, besides the random character of radioactive decay and of pulse counting, all other influences arising from sample treatment (e.g. weighing, enrichment, calibration or the instability of the test setup). ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG 2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non-radioactive waste, and ISO 11929-1, ISO 11929-3 and ISO 11929-4 and is, consequently, complementary to these documents

  16. Pedagogical Approaches to and Effects of Fundamental Movement Skill Interventions on Health Outcomes: A Systematic Review.

    Science.gov (United States)

    Tompsett, Claire; Sanders, Ross; Taylor, Caitlin; Cobley, Stephen

    2017-09-01

    Fundamental movement skills (FMS) are assumed to be the basic prerequisite motor movements underpinning coordination of more integrated and advanced movement capabilities. FMS development and interventions have been associated with several beneficial health outcomes in individual studies. The primary aim of this review was to identify FMS intervention characteristics that could be optimised to attain beneficial outcomes in children and adolescents, while the secondary aim was to update the evidence as to the efficacy of FMS interventions on physiological, psychological and behavioural health outcomes. A systematic search [adhering to Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines] was conducted in seven databases. Studies were included if they conducted an FMS intervention and targeted at least one physiological, behavioural or psychological outcome in school-aged children (5-18 years). Twenty-nine studies examining the effect of FMS interventions relative to controls were identified. Specialist-led interventions, taught in conjunction with at-home practice and parent involvement, appeared more efficacious in enhancing FMS proficiency than school physical education alone. Intervention environments encouraging psychological autonomy were likely to enhance perceived and actual competence in FMS alongside physical activity. FMS interventions had little influence on overweight/obesity reduction, strength or flexibility. In 93% of studies, evidence indicated interventions improved FMS motor proficiency. Favourable specific physiological, psychological and behavioural outcomes were also identified across a variety of interventions. With reference to clinical and normative school-age populations, future studies should be directed toward determining validated standard FMS assessments to enable accurate effect estimates, permit intervention comparisons and improve the efficacy of FMS development.

  17. A macrothermodynamic approach to the limit of reversible capillary condensation.

    Science.gov (United States)

    Trens, Philippe; Tanchoux, Nathalie; Galarneau, Anne; Brunel, Daniel; Fubini, Bice; Garrone, Edoardo; Fajula, François; Di Renzo, Francesco

    2005-08-30

    The threshold of reversible capillary condensation is a well-defined thermodynamic property, as evidenced by corresponding states treatment of literature and experimental data on the lowest closure point of the hysteresis loop in capillary condensation-evaporation cycles for several adsorbates. The nonhysteretical filling of small mesopores presents the properties of a first-order phase transition, confirming that the limit of condensation reversibility does not coincide with the pore critical point. The enthalpy of reversible capillary condensation can be calculated by a Clausius-Clapeyron approach and is consistently larger than the condensation heat in unconfined conditions. Calorimetric data on the capillary condensation of tert-butyl alcohol in MCM-41 silica confirm a 20% increase of condensation heat in small mesopores. This enthalpic advantage makes it easier for the capillary forces to overcome the adhesion forces and justifies the disappearance of the hysteresis loop.
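    As a concrete illustration of the Clausius-Clapeyron route mentioned above, the condensation enthalpy can be extracted from the temperature dependence of the capillary condensation pressure. The relation below is the standard form of that argument, shown as a sketch rather than the authors' exact working.

```latex
% Standard Clausius-Clapeyron estimate (sketch): p_c(T) is the capillary
% condensation pressure measured at several temperatures; the slope of
% ln p_c versus 1/T gives the condensation enthalpy Delta H_c.
\[
  \frac{\mathrm{d}\,\ln p_{\mathrm{c}}}{\mathrm{d}\,(1/T)}
  = -\,\frac{\Delta H_{\mathrm{c}}}{R}
  \quad\Longrightarrow\quad
  \Delta H_{\mathrm{c}} = -R\,\frac{\mathrm{d}\,\ln p_{\mathrm{c}}}{\mathrm{d}\,(1/T)} .
\]
```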

  18. MAKING THE NEIGHBOURHOOD A BETTER PLACE TO LIVE. A SWB APPROACH IMPLEMENTING FUNDAMENTAL HUMAN NEEDS

    OpenAIRE

    Papachristou, Ioanna Anna; Rosas-Casals, Martí

    2015-01-01

    Subjective well-being (SWB) studies have been at the centre of researchers’ attention during the last years. With the majority of people now living in cities, the necessity for a more anthropocentric approach for the study and betterment of urban environments is constantly increasing. In this sense, defining and measuring SWB in urban contexts can be of particular benefit in urban design and planning processes. In this article, a method for measuring SWB for urban places based on the accompli...

  19. Approaching the Hole Mobility Limit of GaSb Nanowires.

    Science.gov (United States)

    Yang, Zai-xing; Yip, SenPo; Li, Dapan; Han, Ning; Dong, Guofa; Liang, Xiaoguang; Shu, Lei; Hung, Tak Fu; Mo, Xiaoliang; Ho, Johnny C

    2015-09-22

    In recent years, high-mobility GaSb nanowires have received tremendous attention for high-performance p-type transistors; however, due to the difficulty in achieving thin and uniform nanowires (NWs), there have been limited reports to date addressing their diameter-dependent properties and their hole mobility limit in this important one-dimensional material system, all of which is essential information for the deployment of GaSb NWs in various applications. Here, by employing the newly developed surfactant-assisted chemical vapor deposition, high-quality and uniform GaSb NWs with controllable diameters, spanning from 16 to 70 nm, are successfully prepared, enabling the direct assessment of their growth orientation and hole mobility as a function of diameter while elucidating the role of the sulfur surfactant and the interplay between surface and interface energies of the NWs on their electrical properties. The sulfur passivation is found to efficiently stabilize the high-energy NW sidewalls of (111) and (311) in order to yield the thin NWs, while thicker NWs (above ∼40 nm in diameter) would grow along the most energy-favorable close-packed planes with the orientation of ⟨111⟩, as supported by approximate atomic models. Importantly, through the reliable control of sulfur passivation, growth orientation and surface roughness, GaSb NWs with a peak hole mobility of ∼400 cm² V⁻¹ s⁻¹ for a diameter of 48 nm, approaching the theoretical limit under a hole concentration of ∼2.2 × 10¹⁸ cm⁻³, can be achieved for the first time. All these results indicate their promising potential for use in different technological domains.

  20. A genetic approach to shape reconstruction in limited data tomography

    International Nuclear Information System (INIS)

    Turcanu, C.; Craciunescu, T.

    2001-01-01

    The paper proposes a new method for shape reconstruction in computerized tomography. Unlike nuclear medicine applications, in physical science problems we are often confronted with limited data sets: constraints on the number of projections or limited view angles. The problem of image reconstruction from projections may be considered as a problem of finding an image (solution) whose projections match the experimental ones. In our approach, we choose a statistical correlation coefficient to evaluate the fitness of any potential solution. The optimization process is carried out by a genetic algorithm. The algorithm has some features common to all genetic algorithms but also some problem-oriented characteristics. One of them is that a chromosome, representing a potential solution, is not linear but is coded as a matrix of pixels corresponding to a two-dimensional image. This kind of internal representation reflects the structure of the problem: slight differences between two points in the original problem space give rise to similarly slight differences once they are coded. Another particular feature is a newly built crossover operator: the grid-based crossover, suitable for high-dimension two-dimensional chromosomes. Except for the population size and the dimension of the cutting grid for the grid-based crossover, all the other parameters of the algorithm are independent of the geometry of the tomographic reconstruction. The performance of the method is evaluated on a phantom typical of an application with limited data sets: the determination of neutron energy spectra with time resolution in the case of short-pulsed neutron emission. A genetic reconstruction is presented. Both the qualitative assessment and the quantitative one, based on several figures of merit, indicate that the proposed method ensures an improved reconstruction of shapes, sizes and resolution in the image, even in the presence of noise. (authors)
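    Since the record describes the algorithm in some detail (image-shaped chromosomes, a correlation-coefficient fitness against the measured projections, and a grid-based crossover), a compact sketch may help fix ideas. Everything below, including the crude projector, the population size, the mutation step and all function names, is an illustrative assumption rather than the authors' code.

```python
# Illustrative sketch of the genetic reconstruction loop described above:
# image-shaped chromosomes, a correlation-coefficient fitness against the
# measured projections, and a grid-based crossover.
import numpy as np
from scipy.ndimage import rotate

def project(image, angles_deg):
    """Very crude parallel-beam projector: column sums of the rotated image."""
    return np.stack([rotate(image, a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def fitness(image, measured, angles_deg):
    """Statistical correlation coefficient between computed and measured projections."""
    simulated = project(image, angles_deg).ravel()
    return np.corrcoef(simulated, measured.ravel())[0, 1]

def grid_crossover(parent_a, parent_b, cells=4):
    """Swap rectangular blocks defined by a cutting grid (grid-based crossover)."""
    child = parent_a.copy()
    step = parent_a.shape[0] // cells
    for i in range(cells):
        for j in range(cells):
            if np.random.rand() < 0.5:
                child[i*step:(i+1)*step, j*step:(j+1)*step] = \
                    parent_b[i*step:(i+1)*step, j*step:(j+1)*step]
    return child

def reconstruct(measured, angles_deg, size=64, pop=40, generations=200):
    population = [np.random.rand(size, size) for _ in range(pop)]
    for _ in range(generations):
        scores = [fitness(im, measured, angles_deg) for im in population]
        order = np.argsort(scores)[::-1]
        elite = [population[k] for k in order[:pop // 2]]        # keep the best half
        children = [grid_crossover(elite[np.random.randint(len(elite))],
                                   elite[np.random.randint(len(elite))])
                    for _ in range(pop - len(elite))]
        children = [np.clip(c + 0.02 * np.random.randn(size, size), 0.0, 1.0)
                    for c in children]                           # small mutation
        population = elite + children
    return max(population, key=lambda im: fitness(im, measured, angles_deg))
```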

  1. Fundamental approach to the analysis of radionuclide transport resulting from fluid flow through jointed media

    International Nuclear Information System (INIS)

    Erickson, K.L.

    1981-02-01

    A theoretical and experimental basis is being developed for analysis of radionuclide transport in jointed geologic media. Batch equilibration and rate experiments involving samples of Eleana argillite and Tertiary silicic tuffs in contact with solutions containing Cs, Sr or Pm indicated that most radionuclide sorption is associated with the surfaces of very small intergranular regions and that the rate of sorption is controlled by diffusion of the nuclides into such regions. Based on these experimental results, the continuity equations for radionuclides in the mobile and immobile phases were reduced to a model analogous to Rosen's equations for packed beds and were solved similarly to Rosen's solutions. Using the model and experimental data, limited radionuclide transport analyses were made which indicated that important parameters controlling transport include the intergranular porosity and nuclide penetration depth, fracture plate spacing and length, fluid velocity, and sorption distribution coefficient

  2. Wavelength dispersive X-ray fluorescence analysis using fundamental parameter approach of Catha edulis and other related plant samples

    Energy Technology Data Exchange (ETDEWEB)

    Shaltout, Abdallah A., E-mail: shaltout_a@hotmail.com [Spectroscopy Department, Physics Division, National Research Center, El Behooth Str., 12622 Dokki, Cairo (Egypt); Faculty of science, Taif University, 21974 Taif, P.O. Box 888 (Saudi Arabia); Moharram, Mohammed A. [Spectroscopy Department, Physics Division, National Research Center, El Behooth Str., 12622 Dokki, Cairo (Egypt); Mostafa, Nasser Y. [Faculty of science, Taif University, 21974 Taif, P.O. Box 888 (Saudi Arabia); Chemistry Department, Faculty of Science, Suez Canal University, Ismailia (Egypt)

    2012-01-15

    This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a famous drug plant in east Africa and Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation as well as trifolium leaves were included as a non-related plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approaches. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on the differential thermal analysis, was successfully used for the organic material determination. The obtained results based on this approach were in a good agreement with the commonly used methods. Depending on the developed method, quantitative analysis results of eighteen elements including; Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn were obtained for each plant. The results of the certified reference materials of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method. - Highlights: ► Quantitative analysis of Catha edulis was carried out using standardless WDXRF. ► Differential thermal analysis was used for determination of the loss of ignition. ► The existence of hydroxyapatite in Catha edulis plant has been confirmed. ► The CRM results confirmed the validity of the developed method.

  3. Wavelength dispersive X-ray fluorescence analysis using fundamental parameter approach of Catha edulis and other related plant samples

    International Nuclear Information System (INIS)

    Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.

    2012-01-01

    This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a famous drug plant in east Africa and Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation as well as trifolium leaves were included as a non-related plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approaches. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on the differential thermal analysis, was successfully used for the organic material determination. The obtained results based on this approach were in a good agreement with the commonly used methods. Depending on the developed method, quantitative analysis results of eighteen elements including; Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn were obtained for each plant. The results of the certified reference materials of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method. - Highlights: ► Quantitative analysis of Catha edulis was carried out using standardless WDXRF. ► Differential thermal analysis was used for determination of the loss of ignition. ► The existence of hydroxyapatite in Catha edulis plant has been confirmed. ► The CRM results confirmed the validity of the developed method.

  4. Observation of the fundamental Nyquist noise limit in an ultra-high Q-factor cryogenic bulk acoustic wave cavity

    Energy Technology Data Exchange (ETDEWEB)

    Goryachev, Maxim, E-mail: maxim.goryachev@uwa.edu.au; Ivanov, Eugene N.; Tobar, Michael E. [ARC Centre of Excellence for Engineered Quantum Systems, University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia); Kann, Frank van [School of Physics, University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia); Galliou, Serge [Department of Time and Frequency, FEMTO-ST Institute, ENSMM, 26 Chemin de l' Épitaphe, 25000 Besançon (France)

    2014-10-13

    Thermal Nyquist noise fluctuations of high-Q bulk acoustic wave cavities have been observed at cryogenic temperatures with a DC superconducting quantum interference device amplifier. High-Q modes with bandwidths of a few tens of millihertz produce thermal fluctuations with a signal-to-noise ratio of up to 23 dB. The estimated effective temperature from the Nyquist noise is in good agreement with the physical temperature of the device, confirming the validity of the equivalent circuit model and the non-existence of any excess resonator self-noise. The measurements also confirm that the quality factor remains extremely high (Q > 10^8 at low-order overtones) for very weak (thermal) system motion at low temperatures, when compared to values measured with relatively strong external excitation. This result represents an enabling step towards operating such a high-Q acoustic device at the standard quantum limit.

  5. 100 nm scale low-noise sensors based on aligned carbon nanotube networks: overcoming the fundamental limitation of network-based sensors

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Minbaek; Lee, Joohyung; Kim, Tae Hyun; Lee, Hyungwoo; Lee, Byung Yang; Hong, Seunghun [Department of Physics and Astronomy, Seoul National University, Shilim-Dong, Kwanak-Gu, Seoul 151-742 (Korea, Republic of); Park, June; Seong, Maeng-Je [Department of Physics, Chung-Ang University, Heukseok-Dong, Dongjak-Gu, Seoul 156-756 (Korea, Republic of); Jhon, Young Min, E-mail: mseong@cau.ac.kr, E-mail: shong@phya.snu.ac.kr [Korea Institute of Science and Technology, Hawolgok-Dong, Seongbuk-Gu, Seoul 136-791 (Korea, Republic of)

    2010-02-05

    Nanoscale sensors based on single-walled carbon nanotube (SWNT) networks have been considered impractical due to several fundamental limitations such as a poor sensitivity and small signal-to-noise ratio. Herein, we present a strategy to overcome these fundamental problems and build highly-sensitive low-noise nanoscale sensors simply by controlling the structure of the SWNT networks. In this strategy, we prepared nanoscale width channels based on aligned SWNT networks using a directed assembly strategy. Significantly, the aligned network-based sensors with narrower channels exhibited even better signal-to-noise ratio than those with wider channels, which is opposite to conventional random network-based sensors. As a proof of concept, we demonstrated 100 nm scale low-noise sensors to detect mercury ions with the detection limit of ∼1 pM, which is superior to any state-of-the-art portable detection system and is below the allowable limit of mercury ions in drinking water set by most government environmental protection agencies. This is the first demonstration of 100 nm scale low-noise sensors based on SWNT networks. Considering the increased interests in high-density sensor arrays for healthcare and environmental protection, our strategy should have a significant impact on various industrial applications.

  6. 100 nm scale low-noise sensors based on aligned carbon nanotube networks: overcoming the fundamental limitation of network-based sensors

    International Nuclear Information System (INIS)

    Lee, Minbaek; Lee, Joohyung; Kim, Tae Hyun; Lee, Hyungwoo; Lee, Byung Yang; Hong, Seunghun; Park, June; Seong, Maeng-Je; Jhon, Young Min

    2010-01-01

    Nanoscale sensors based on single-walled carbon nanotube (SWNT) networks have been considered impractical due to several fundamental limitations such as a poor sensitivity and small signal-to-noise ratio. Herein, we present a strategy to overcome these fundamental problems and build highly-sensitive low-noise nanoscale sensors simply by controlling the structure of the SWNT networks. In this strategy, we prepared nanoscale width channels based on aligned SWNT networks using a directed assembly strategy. Significantly, the aligned network-based sensors with narrower channels exhibited even better signal-to-noise ratio than those with wider channels, which is opposite to conventional random network-based sensors. As a proof of concept, we demonstrated 100 nm scale low-noise sensors to detect mercury ions with the detection limit of ∼1 pM, which is superior to any state-of-the-art portable detection system and is below the allowable limit of mercury ions in drinking water set by most government environmental protection agencies. This is the first demonstration of 100 nm scale low-noise sensors based on SWNT networks. Considering the increased interests in high-density sensor arrays for healthcare and environmental protection, our strategy should have a significant impact on various industrial applications.

  7. 100 nm scale low-noise sensors based on aligned carbon nanotube networks: overcoming the fundamental limitation of network-based sensors

    Science.gov (United States)

    Lee, Minbaek; Lee, Joohyung; Kim, Tae Hyun; Lee, Hyungwoo; Lee, Byung Yang; Park, June; Jhon, Young Min; Seong, Maeng-Je; Hong, Seunghun

    2010-02-01

    Nanoscale sensors based on single-walled carbon nanotube (SWNT) networks have been considered impractical due to several fundamental limitations such as a poor sensitivity and small signal-to-noise ratio. Herein, we present a strategy to overcome these fundamental problems and build highly-sensitive low-noise nanoscale sensors simply by controlling the structure of the SWNT networks. In this strategy, we prepared nanoscale width channels based on aligned SWNT networks using a directed assembly strategy. Significantly, the aligned network-based sensors with narrower channels exhibited even better signal-to-noise ratio than those with wider channels, which is opposite to conventional random network-based sensors. As a proof of concept, we demonstrated 100 nm scale low-noise sensors to detect mercury ions with the detection limit of ~1 pM, which is superior to any state-of-the-art portable detection system and is below the allowable limit of mercury ions in drinking water set by most government environmental protection agencies. This is the first demonstration of 100 nm scale low-noise sensors based on SWNT networks. Considering the increased interests in high-density sensor arrays for healthcare and environmental protection, our strategy should have a significant impact on various industrial applications.

  8. Fundamental approach to TRIGA steady-state thermal-hydraulic CHF analysis

    International Nuclear Information System (INIS)

    Feldman, E.

    2008-01-01

    Methods are investigated for predicting the power at which critical heat flux (CHF) occurs in TRIGA reactors that rely on natural convection for primary flow. For a representative TRIGA reactor, two sets of functions are created. For the first set, the General Atomics STAT code and the more widely-used RELAP5-3D code are each employed to obtain reactor flow rate as a function of power. For the second set, the Bernath correlation, the 2006 Groeneveld table, the Hall and Mudawar outlet correlation, and each of the four PG-CHF correlations for rod bundles are used to predict the power at which CHF occurs as a function of channel flow rate. The two sets of functions are combined to yield predictions of the power at which CHF occurs in the reactor. A combination of the RELAP5-3D code and the 2006 Groeneveld table predicts 67% more CHF power than does a combination of the STAT code and the Bernath correlation. Replacing the 2006 Groeneveld table with the Bernath CHF correlation (while using the RELAP5-3D code flow solution) causes the increase to be 23% instead of 67%. Additional RELAP5-3D flow-versus-power solutions obtained from Reference 1 and presented in Appendix B for four specific TRIGA reactors further demonstrate that the Bernath correlation predicts CHF to occur at considerably lower power levels than does the 2006 Groeneveld table. Because of the lack of measured CHF data in the region of interest to TRIGA reactors, none of the CHF correlations considered can be assumed to provide the definitive CHF power. It is recommended, however, to compare the power levels of the potential limiting rods with the power levels at which the Bernath and 2006 Groeneveld CHF correlations predict CHF to occur.
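    The combination step described above, intersecting a flow-versus-power curve from a systems code with a CHF-power-versus-flow curve from a correlation, can be sketched numerically as follows. The tabulated numbers are placeholders, not output of STAT, RELAP5-3D or any CHF correlation.

```python
# Illustrative combination of the two curve families described above: (1) the
# natural-circulation flow produced at a given reactor power and (2) the power
# at which a CHF correlation predicts CHF for a given channel flow.
import numpy as np
from scipy.interpolate import interp1d
from scipy.optimize import brentq

power_kw  = np.array([200., 400., 600., 800., 1000.])  # reactor power
flow_kgps = np.array([1.2,  1.8,  2.3,  2.7,  3.0])    # channel flow at that power
flow_grid = np.array([1.0,  1.5,  2.0,  2.5,  3.0])    # channel flow
chf_power = np.array([350., 500., 650., 800., 950.])   # CHF power at that flow

flow_of_power     = interp1d(power_kw, flow_kgps, fill_value="extrapolate")
chf_power_of_flow = interp1d(flow_grid, chf_power, fill_value="extrapolate")

def margin(p):
    # CHF is predicted where the operating power equals the CHF power evaluated
    # at the natural-circulation flow that this same power produces
    return float(chf_power_of_flow(flow_of_power(p))) - p

p_chf = brentq(margin, power_kw[0], power_kw[-1])
print(f"predicted CHF power ~ {p_chf:.0f} kW")
```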

  9. The Principles of Proportionality, Legal Argumentation and the Discretionary Power of the Public Administration: An Analysis from the Limits on Fundamental Rights and Guarantees

    Directory of Open Access Journals (Sweden)

    Yezid Carrillo-de la Rosa

    2017-06-01

    Full Text Available This paper examines the implications of the principle of proportionality with regard to administrative decisions that limit civil liberties and fundamental rights. The hypothesis we intend to demonstrate is that the discretionary power of the Public Administration to issue measures that restrict individual rights and liberties is only apparent, since the reach of agency discretion in choosing time, means and place conditions is very narrow. As the following research shows, the principle of proportionality obliges administrative agencies to implement effective means to attain the purposes of their intervention, while minimizing the impact on constitutionally protected rights and liberties.

  10. Approaching conversion limit with all-dielectric solar cell reflectors.

    Science.gov (United States)

    Fu, Sze Ming; Lai, Yi-Chun; Tseng, Chi Wei; Yan, Sheng Lun; Zhong, Yan Kai; Shen, Chang-Hong; Shieh, Jia-Min; Li, Yu-Ren; Cheng, Huang-Chung; Chi, Gou-chung; Yu, Peichen; Lin, Albert

    2015-02-09

    Metallic back reflectors have been used for thin-film and wafer-based solar cells for a very long time. Nonetheless, metallic mirrors might not be the best choice for photovoltaics. In this work, we show that solar cells with all-dielectric reflectors can surpass the best-configured metal-backed devices. Theoretical and experimental results both show that superior large-angle light scattering capability can be achieved by diffuse medium reflectors, and the solar cell J-V enhancement is higher for solar cells using all-dielectric reflectors. Specifically, the measured diffused scattering efficiency (D.S.E.) of a diffuse medium reflector is >0.8 for the light-trapping spectral range (600 nm-1000 nm), and the measured reflectance of a diffuse medium can be as high as that of silver if the geometry of the embedded titanium oxide (TiO2) nanoparticles is optimized. Moreover, the diffuse medium reflectors have the additional advantages of room-temperature processing, low cost, and very high throughput. We believe that using all-dielectric solar cell reflectors is a way to approach the thermodynamic conversion limit by completely excluding metallic dissipation.

  11. Promoting physical activity among children and adolescents: the strengths and limitations of school-based approaches.

    Science.gov (United States)

    Booth, Michael; Okely, Anthony

    2005-04-01

    Paediatric overweight and obesity is recognised as one of Australia's most significant health problems and effective approaches to increasing physical activity and reducing energy consumption are being sought urgently. Every potential approach and setting should be subjected to critical review in an attempt to maximise the impact of policy and program initiatives. This paper identifies the strengths and limitations of schools as a setting for promoting physical activity. The strengths are: most children and adolescents attend school; most young people are likely to see teachers as credible sources of information; schools provide access to the facilities, infrastructure and support required for physical activity; and schools are the workplace of skilled educators. Potential limitations are: those students who like school the least are the most likely to engage in health-compromising behaviours and the least likely to be influenced by school-based programs; there are about 20 more hours per week available for physical activity outside school hours than during school hours; enormous demands are already being made on schools; many primary school teachers have low levels of perceived competence in teaching physical education and fundamental movement skills; and opportunities for being active at school may not be consistent with how and when students prefer to be active.

  12. Entropy-limited hydrodynamics: a novel approach to relativistic hydrodynamics

    Science.gov (United States)

    Guercilena, Federico; Radice, David; Rezzolla, Luciano

    2017-07-01

    We present entropy-limited hydrodynamics (ELH): a new approach for the computation of numerical fluxes arising in the discretization of hyperbolic equations in conservation form. ELH is based on the hybridisation of an unfiltered high-order scheme with the first-order Lax-Friedrichs method. The activation of the low-order part of the scheme is driven by a measure of the locally generated entropy inspired by the artificial-viscosity method proposed by Guermond et al. (J. Comput. Phys. 230(11):4248-4267, 2011, doi: 10.1016/j.jcp.2010.11.043). Here, we present ELH in the context of high-order finite-differencing methods and of the equations of general-relativistic hydrodynamics. We study the performance of ELH in a series of classical astrophysical tests in general relativity involving isolated, rotating and nonrotating neutron stars, and including a case of gravitational collapse to black hole. We present a detailed comparison of ELH with the fifth-order monotonicity preserving method MP5 (Suresh and Huynh in J. Comput. Phys. 136(1):83-99, 1997, doi: 10.1006/jcph.1997.5745), one of the most common high-order schemes currently employed in numerical-relativity simulations. We find that ELH achieves comparable and, in many of the cases studied here, better accuracy than more traditional methods at a fraction of the computational cost (up to ~50% speedup). Given its accuracy and its simplicity of implementation, ELH is a promising framework for the development of new special- and general-relativistic hydrodynamics codes well adapted for massively parallel supercomputers.
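    A toy one-dimensional version of the hybridisation idea (blend a high-order flux with the Lax-Friedrichs flux where the locally generated entropy is large) is sketched below for Burgers' equation. The entropy indicator, the weighting function and the constants are illustrative assumptions, not the scheme of the paper.

```python
# Toy 1D sketch of the flux hybridisation idea for Burgers' equation: a
# high-order interface flux is blended with the first-order Lax-Friedrichs
# flux wherever a crude local entropy-production indicator is large.
import numpy as np

def hybrid_flux(u, dx, c_e=1.0):
    """Return blended interface fluxes for Burgers' equation, f(u) = u^2/2."""
    f = 0.5 * u**2
    a = np.abs(u).max()                                   # maximum wave speed

    # unlimited fourth-order central interface flux
    f_high = (-f[:-3] + 7.0*f[1:-2] + 7.0*f[2:-1] - f[3:]) / 12.0

    # first-order Lax-Friedrichs flux on the same interfaces
    f_lf = 0.5*(f[1:-2] + f[2:-1]) - 0.5*a*(u[2:-1] - u[1:-2])

    # crude cell-wise indicator of locally generated entropy, s(u) = u^2/2
    s_flux = np.gradient(u * 0.5 * u**2, dx)
    theta_cell = np.clip(c_e * np.abs(s_flux) / (np.abs(s_flux).max() + 1e-12),
                         0.0, 1.0)
    theta = np.maximum(theta_cell[1:-2], theta_cell[2:-1])  # per interface

    return (1.0 - theta) * f_high + theta * f_lf
```

    A full solver would then update the cell averages with the divergence of these interface fluxes; the point of the sketch is only the entropy-weighted switch between the two flux families.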

  13. A fundamental parameters approach to calibration of the Mars Exploration Rover Alpha Particle X-ray Spectrometer

    Science.gov (United States)

    Campbell, J. L.; Lee, M.; Jones, B. N.; Andrushenko, S. M.; Holmes, N. G.; Maxwell, J. A.; Taylor, S. M.

    2009-04-01

    The detection sensitivities of the Alpha Particle X-ray Spectrometer (APXS) instruments on the Mars Exploration Rovers for a wide range of elements were experimentally determined in 2002 using spectra of geochemical reference materials. A flight spare instrument was similarly calibrated, and the calibration exercise was then continued for this unit with an extended set of geochemical reference materials together with pure elements and simple chemical compounds. The flight spare instrument data are examined in detail here using a newly developed fundamental parameters approach which takes precise account of all the physics inherent in the two X-ray generation techniques involved, namely, X-ray fluorescence and particle-induced X-ray emission. The objectives are to characterize the instrument as fully as possible, to test this new approach, and to determine the accuracy of calibration for major, minor, and trace elements. For some of the lightest elements the resulting calibration exhibits a dependence upon the mineral assemblage of the geological reference material; explanations are suggested for these observations. The results will assist in designing the overall calibration approach for the APXS on the Mars Science Laboratory mission.

  14. Population pressure on coral atolls: trends and approaching limits.

    Science.gov (United States)

    Rapaport, M

    1990-09-01

    Trends and approaching limits of population pressure on coral atolls are discussed by examining the atoll environment in terms of the physical geography, the production systems, and resource distribution. Atoll populations are grouped as dependent and independent, and demographic trends in population growth, migration, urbanization, and political dependency are reviewed. Examination of the carrying capacity includes a dynamic model, the influences of the West, and philosophical considerations. The carrying capacity is the "maximal population supportable in a given area". Traditional models are criticized because of a lack in accounting for external linkages. The proposed model is dynamic and considers perceived needs and overseas linkages. It also explains regional disparities in population distribution, and provides a continuing model for population movement from outer islands to district centers and mainland areas. Because of increased expectations and perceived needs, there is a lower carrying capacity for outlying areas, and expanded capacity in district centers. This leads to urbanization, emigration, and carrying capacity overshoot in regional and mainland areas. Policy intervention is necessary at the regional and island community level. Atolls, which are islands surrounding deep lagoons, exist in archipelagoes across the oceans, and are rich in aquatic life. The balance in this small land area with a vulnerable ecosystem may be easily disturbed by scarce water supplies, barren soils, rising sea levels in the future, hurricanes, and tsunamis. Traditionally, fisheries and horticulture (pit-taro, coconuts, and breadfruit) have sustained populations, but modern influences such as blasting, reef mining, new industrial technologies, population pressure, and urbanization threaten the balance. Population pressure, which has led to pollution, epidemics, malnutrition, crime, social disintegration, and foreign dependence, is evidenced in the areas of Tuvalu, Kiribati

  15. Fundamental energy limits of SET-based Brownian NAND and half-adder circuits. Preliminary findings from a physical-information-theoretic methodology

    Science.gov (United States)

    Ercan, İlke; Suyabatmaz, Enes

    2018-06-01

    The saturation in the efficiency and performance scaling of conventional electronic technologies brings about the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One of the potential realizations of such circuits is via single electron tunneling (SET) devices, since SET technology enables simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations of Brownian NAND and half-adder circuits implemented using SET technology. The method we employed here establishes a solid ground that enables studying computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insights into how far its efficiency can be improved in principle. In order to provide a basis for comparison, we also analyze a NAND gate and half-adder circuit implemented in complementary metal oxide semiconductor technology to show how the fundamental bound of the Brownian circuit compares against a conventional paradigm.

  16. 33 CFR 401.52 - Limit of approach to a bridge.

    Science.gov (United States)

    2010-07-01

    401.52 Limit of approach to a bridge. (a) No vessel shall pass the limit of approach sign at any movable bridge until the bridge is in a fully open position and the signal light shows green. (b) No vessel shall pass the limit...

  17. Whole-genome sequencing approaches for conservation biology: Advantages, limitations and practical recommendations.

    Science.gov (United States)

    Fuentes-Pardo, Angela P; Ruzzante, Daniel E

    2017-10-01

    Whole-genome resequencing (WGR) is a powerful method for addressing fundamental evolutionary biology questions that have not been fully resolved using traditional methods. WGR includes four approaches: the sequencing of individuals to a high depth of coverage with either unresolved or resolved haplotypes, the sequencing of population genomes to a high depth by mixing equimolar amounts of unlabelled-individual DNA (Pool-seq) and the sequencing of multiple individuals from a population to a low depth (lcWGR). These techniques require the availability of a reference genome. This, along with the still high cost of shotgun sequencing and the large demand for computing resources and storage, has limited their implementation in nonmodel species with scarce genomic resources and in fields such as conservation biology. Our goal here is to describe the various WGR methods, their pros and cons and potential applications in conservation biology. WGR offers an unprecedented marker density and surveys a wide diversity of genetic variations not limited to single nucleotide polymorphisms (e.g., structural variants and mutations in regulatory elements), increasing their power for the detection of signatures of selection and local adaptation as well as for the identification of the genetic basis of phenotypic traits and diseases. Currently, though, no single WGR approach fulfils all requirements of conservation genetics, and each method has its own limitations and sources of potential bias. We discuss proposed ways to minimize such biases. We envision a not distant future where the analysis of whole genomes becomes a routine task in many nonmodel species and fields including conservation biology. © 2017 John Wiley & Sons Ltd.

  18. Credit card spending limit and personal finance: system dynamics approach

    Directory of Open Access Journals (Sweden)

    Mirjana Pejić Bach

    2014-03-01

    Full Text Available Credit cards have become one of the major ways of conducting cashless transactions. However, they have a long-term impact on the well-being of their owner through the debt generated by credit card usage. Credit card issuers approve high credit limits to credit card owners, thereby influencing their credit burden. A system dynamics model has been used to model the behaviour of a credit card owner in different scenarios according to the size of the credit limit. Experiments with the model demonstrated that a higher credit limit approved on the credit card decreases the budget available for spending in the long run. This is a contribution toward the evaluation of actions for credit limit control based on their consequences.
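    A minimal stock-and-flow sketch of the behaviour reported above (a higher approved limit lowering the budget available for spending in the long run) might look like the following. All rates, the repayment rule and the limits are invented for illustration and are not taken from the article's model.

```python
# Toy stock-and-flow sketch: monthly income, part of spending charged to the
# card up to the approved limit, interest on the revolving balance, and a
# minimum repayment that eats into the owner's available budget.
def simulate(credit_limit, months=120, income=2000.0, spend_rate=0.9,
             interest=0.02, min_repay=0.05):
    debt = 0.0
    available = []
    for _ in range(months):
        planned_spending = spend_rate * income
        # a fixed share of planned spending goes onto the card, capped by the limit
        card_spending = min(0.3 * planned_spending, max(credit_limit - debt, 0.0))
        debt += card_spending
        debt *= (1.0 + interest)                      # monthly interest on the balance
        repayment = min(debt, max(min_repay * debt, 50.0))
        debt -= repayment
        available.append(income - repayment)          # budget left after servicing debt
    return debt, sum(available) / months

for limit in (1000, 5000, 20000):
    final_debt, avg_budget = simulate(limit)
    print(f"limit {limit:>6}: final debt {final_debt:8.0f}, "
          f"average available budget {avg_budget:7.0f}")
```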

  19. Chapman--Enskog approach to flux-limited diffusion theory

    International Nuclear Information System (INIS)

    Levermore, C.D.

    1979-01-01

    Using the technique developed by Chapman and Enskog for deriving the Navier-Stokes equations from the Boltzmann equation, a framework is set up for deriving diffusion theories from the transport equation. The procedure is first applied to give a derivation of isotropic diffusion theory and then of a completely new theory which is naturally flux-limited. This new flux-limited diffusion theory is then compared with asymptotic diffusion theory
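    To make the phrase "naturally flux-limited" concrete, the generic structure that flux-limited diffusion closures take in the radiative transfer literature is summarised below. This is shown only as an illustration of what flux limiting means, not as the exact closure derived in the paper.

```latex
% Generic flux-limited diffusion closure (illustration only): the flux limiter
% lambda(R) reduces to the ordinary diffusion value 1/3 for small gradients
% and caps the flux at the free-streaming value cE for large gradients.
\[
  \mathbf{F} = -\,\frac{c\,\lambda(R)}{\sigma}\,\nabla E ,
  \qquad R = \frac{|\nabla E|}{\sigma E},
  \qquad
  \lambda \to \tfrac{1}{3}\ (R \to 0),
  \qquad
  \lambda \to \tfrac{1}{R}\ (R \to \infty)\ \Rightarrow\ |\mathbf{F}| \le c E .
\]
```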

  20. A Practical Approach for Parameter Identification with Limited Information

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Yang, Guangya; Tarnowski, Germán Claudio

    2014-01-01

    A practical parameter estimation procedure for a real excitation system is reported in this paper. The core algorithm is based on a genetic algorithm (GA), which estimates the parameters of a real AC brushless excitation system with limited information about the system. Practical considerations are ...... parameters. The whole methodology is described and the estimation strategy is presented in this paper....

  1. Fracture mechanics approach to estimate rail wear limits

    Science.gov (United States)

    2009-10-01

    This paper describes a systematic methodology to estimate allowable limits for rail head wear in terms of vertical head-height loss, gage-face side wear, and/or the combination of the two. This methodology is based on the principles of engineering fr...

  2. Fundamental degradation mechanisms of layered oxide Li-ion battery cathode materials: Methodology, insights and novel approaches

    International Nuclear Information System (INIS)

    Hausbrand, R.; Cherkashinin, G.; Ehrenberg, H.; Gröting, M.; Albe, K.; Hess, C.; Jaegermann, W.

    2015-01-01

    Graphical abstract: - Highlights: • Description of recent in operando and in situ analysis methodology. • Surface science approach using photoemission for analysis of cathode surfaces and interfaces. • Ageing and fatigue of layered oxide Li-ion battery cathode materials from the atomistic point of view. • Defect formation and electronic structure evolution as causes for cathode degradation. • Significance of interfacial energy alignment and contact potential for side reactions. - Abstract: This overview addresses the atomistic aspects of degradation of layered LiMO2 (M = Ni, Co, Mn) oxide Li-ion battery cathode materials, aiming to shed light on the fundamental degradation mechanisms especially inside active cathode materials and at their interfaces. It includes recent results obtained by novel in situ/in operando diffraction methods, modelling, and quasi in situ surface science analysis. Degradation of the active cathode material occurs upon overcharge, resulting from a positive potential shift of the anode. Oxygen loss and eventual phase transformation resulting in dead regions are ascribed to changes in electronic structure and defect formation. The anode potential shift results from loss of free lithium due to side reactions occurring at electrode/electrolyte interfaces. Such side reactions are caused by electron transfer, and depend on the electron energy level alignment at the interface. Side reactions at electrode/electrolyte interfaces and capacity fade may be overcome by the use of suitable solid-state electrolytes and Li-containing anodes

  3. Accuracy, precision, and lower detection limits (a deficit reduction approach)

    International Nuclear Information System (INIS)

    Bishop, C.T.

    1993-01-01

    The evaluation of the accuracy, precision and lower detection limits of the determination of trace radionuclides in environmental samples can become quite sophisticated and time consuming. This in turn could add significant cost to the analyses being performed. In the present method, a "deficit reduction approach" has been taken to keep costs low, but at the same time provide defensible data. In order to measure the accuracy of a particular method, reference samples are measured over the time period that the actual samples are being analyzed. Using a Lotus spreadsheet, data are compiled and an average accuracy is computed. If pairs of reference samples are analyzed, then precision can also be evaluated from the duplicate data sets. The standard deviation can be calculated if the reference concentrations of the duplicates are all in the same general range. Laboratory blanks are used to estimate the lower detection limits. The lower detection limit is calculated as 4.65 times the standard deviation of a set of blank determinations made over a given period of time. A Lotus spreadsheet is again used to compile data and LDLs over different periods of time can be compared
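    The bookkeeping described above translates directly into a few lines of code. The example numbers and variable names below are illustrative, while the 4.65 factor and the accuracy, duplicate-based precision, and lower detection limit definitions follow the text.

```python
# Sketch of the spreadsheet-style bookkeeping described above.
import statistics

reference_known    = [10.0, 10.0, 12.5, 12.5]        # certified reference values
reference_measured = [ 9.6, 10.3, 12.1, 13.0]        # lab results on those references
blanks             = [0.12, 0.08, 0.15, 0.10, 0.11]  # blank determinations over time

# accuracy: average ratio of measured to known values for the reference samples
accuracy = statistics.mean(m / k for m, k in zip(reference_measured, reference_known))

# precision: standard deviation of duplicate measurements made on references
# whose concentrations lie in the same general range (here, the two 10.0 refs)
precision = statistics.stdev(reference_measured[:2])

# lower detection limit: 4.65 times the standard deviation of the blank set
ldl = 4.65 * statistics.stdev(blanks)

print(f"average accuracy {accuracy:.2f}, duplicate precision {precision:.2f}, LDL {ldl:.3f}")
```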

  4. Geometrical Optimization Approach to Isomerization: Models and Limitations.

    Science.gov (United States)

    Chang, Bo Y; Shin, Seokmin; Engel, Volker; Sola, Ignacio R

    2017-11-02

    We study laser-driven isomerization reactions through an excited electronic state using the recently developed Geometrical Optimization procedure. Our goal is to analyze whether an initial wave packet in the ground state, with optimized amplitudes and phases, can be used to enhance the yield of the reaction at faster rates, driven by a single picosecond pulse or a pair of femtosecond pulses resonant with the electronic transition. We show that the symmetry of the system imposes limitations in the optimization procedure, such that the method rediscovers the pump-dump mechanism.

  5. A general approach to total repair cost limit replacement policies

    Directory of Open Access Journals (Sweden)

    F. Beichelt

    2014-01-01

    Full Text Available A common replacement policy for technical systems consists in replacing a system by a new one after its economic lifetime, i.e. at that moment when its long-run maintenance cost rate is minimal. However, the strict application of the economic lifetime does not take into account the individual deviations of maintenance cost rates of single systems from the average cost development. Hence, Beichelt proposed the total repair cost limit replacement policy: the system is replaced by a new one as soon as its total repair cost reaches or exceeds a given level. He modelled the repair cost development by functions of the Wiener process with drift. Here the same policy is considered under the assumption that the one-dimensional probability distribution of the process describing the repair cost development is given. In the examples analysed, applying the total repair cost limit replacement policy instead of the economic lifetime leads to cost savings of between 4% and 30%. Finally, it is illustrated how to include the reliability aspect into the policy.

  6. Data Smearing: An Approach to Disclosure Limitation for Tabular Data

    Directory of Open Access Journals (Sweden)

    Toth Daniell

    2014-12-01

    Full Text Available Statistical agencies often collect sensitive data for release to the public at aggregated levels in the form of tables. To protect confidential data, some cells are suppressed in the publicly released data. One problem with this method is that many cells of interest must be suppressed in order to protect a much smaller number of sensitive cells. Another problem is that the covariates used for aggregation and the level of aggregation must be fixed before the data are released. Both of these restrictions can severely limit the utility of the data. We propose a new disclosure limitation method that replaces the full set of microdata with synthetic data for use in producing released data in tabular form. This synthetic data set is obtained by replacing each unit's values with a weighted average of sampled values from the surrounding area. The synthetic data are produced in a way that gives asymptotically unbiased estimates for aggregate cells as the number of units in the cell increases. The method is applied to the U.S. Bureau of Labor Statistics Quarterly Census of Employment and Wages data, which are released to the public quarterly in tabular form and aggregated across varying scales of time, area, and economic sector.
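    A simplified sketch of the smearing step (each unit's value replaced by a weighted average of values sampled from nearby units, after which any table can be built from the smeared microdata) is given below. The nearest-neighbour donor rule, the weights and the toy data are assumptions for illustration, not the procedure applied to the QCEW.

```python
# Illustrative data-smearing sketch: replace each establishment's value by a
# mix of its own value and values sampled from nearby units, then tabulate.
import numpy as np

rng = np.random.default_rng(0)

def smear(values, coords, k=5, self_weight=0.5):
    """Replace each value by a mix of itself and k sampled nearby donors."""
    smeared = np.empty_like(values, dtype=float)
    for i in range(len(values)):
        dist = np.linalg.norm(coords - coords[i], axis=1)
        neighbours = np.argsort(dist)[1:k + 1]              # k nearest other units
        donors = rng.choice(neighbours, size=k, replace=True)
        smeared[i] = self_weight * values[i] + (1 - self_weight) * values[donors].mean()
    return smeared

# toy microdata: 200 establishments with coordinates and employment counts
coords = rng.uniform(0.0, 100.0, size=(200, 2))
employment = rng.lognormal(mean=3.0, sigma=1.0, size=200)
employment_smeared = smear(employment, coords)

# aggregates built from the smeared data stay close to the true aggregates
print(round(employment.sum()), round(employment_smeared.sum()))
```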

  7. Spectroscopy of 211Rn approaching the valence limit

    International Nuclear Information System (INIS)

    Davidson, P.M.; Dracoulis, G.D.; Kibedi, T.; Fabricius, B.; Baxter, A.M.; Stuchbery, A.E.; Poletti, A.R.; Schiffer, K.J.

    1993-02-01

    High spin states in 211 Rn were populated using the reaction 198 Pt( 18 O,5n) at 96 MeV. The decay was studied using γ-ray and electron spectroscopy. The known level scheme is extended up to a spin of greater than 69/2 and many non-yrast states are added. Semi-empirical shell model calculations and the properties of related states in 210 Rn and 212 Rn are used to assign configurations to some of the non-yrast states. The properties of the high spin states observed are compared to the predictions of the Multi-Particle Octupole Coupling model and the semi-empirical shell model. The maximum reasonable spin available from the valence particles and holes is 77/2 and states are observed to near this limit. 12 refs., 4 tabs., 8 figs

  8. Spectroscopy of 211Rn approaching the valence limit

    International Nuclear Information System (INIS)

    Davidson, P.M.; Dracoulis, G.D.; Byrne, A.P.; Kibedi, T.; Fabricus, B.; Baxter, A.M.; Stuchbery, A.E.; Poletti, A.R.; Schiffer, K.J.

    1993-01-01

    High-spin states in 211 Rn were populated using the reaction 198 Pt( 18 O, 5n) at 96 MeV. Their decay was studied using γ-ray and electron spectroscopy. The known level scheme is extended up to a spin of greater than 69/2 and many non-yrast states are added. Semi-empirical shell-model calculations and the properties of related states in 210 Rn and 212 Rn are used to assign configurations to some of the non-yrast states. The properties of the high-spin states observed are compared to the predictions of the multi-particle octupole-coupling model and the semi-empirical shell model. The maximum reasonable spin available from the valence particles and holes is 77/2 and states are observed to near this limit. (orig.)

  9. [Limitation of therapeutic effort: Approach to a combined view].

    Science.gov (United States)

    Bueno Muñoz, M J

    2013-01-01

    Over the past few decades, we have been witnessing that increasingly fewer people pass away at home and increasingly more do so within the hospital. More specifically, 20% of deaths now occur in an intensive care unit (ICU). However, death in the ICU has become a highly technical process. This sometimes gives rise to excesses, because the resources used are not proportionate to the purposes pursued (futility). It may create situations that do not respect the person's dignity throughout the death process. It is within this context that the clinical procedure called "limitation of the therapeutic effort" (LTE) is reviewed. This has become a true bridge between Intensive Care and Palliative Care. Its final goal is to guarantee a dignified and painless death for the terminally ill. Copyright © 2012 Elsevier España, S.L. y SEEIUC. All rights reserved.

  10. Stochastic resonance a mathematical approach in the small noise limit

    CERN Document Server

    Herrmann, Samuel; Pavlyukevich, Ilya; Peithmann, Dierk

    2013-01-01

    Stochastic resonance is a phenomenon arising in a wide spectrum of areas in the sciences ranging from physics through neuroscience to chemistry and biology. This book presents a mathematical approach to stochastic resonance which is based on a large deviations principle (LDP) for randomly perturbed dynamical systems with a weak inhomogeneity given by an exogenous periodicity of small frequency. Resonance, the optimal tuning between period length and noise amplitude, is explained by optimizing the LDP's rate function. The authors show that not all physical measures of tuning quality are robust with respect to dimension reduction. They propose measures of tuning quality based on exponential transition rates explained by large deviations techniques and show that these measures are robust. The book sheds some light on the shortcomings and strengths of different concepts used in the theory and applications of stochastic resonance without attempting to give a comprehensive overview of the many facets of stochastic ...

  11. Multicore in Production: Advantages and Limits of the Multiprocess Approach

    CERN Document Server

    Binet, S; The ATLAS collaboration; Lavrijsen, W; Leggett, Ch; Lesny, D; Jha, M K; Severini, H; Smith, D; Snyder, S; Tatarkhanov, M; Tsulaia, V; van Gemmeren, P; Washbrook, A

    2011-01-01

    The shared memory architecture of multicore CPUs provides HENP developers with the opportunity to reduce the memory footprint of their applications by sharing memory pages between the cores in a processor. ATLAS pioneered the multi-process approach to parallelizing HENP applications. Using Linux fork() and the Copy On Write mechanism, we implemented a simple event task farm which allows sharing of up to 50% of memory pages among event worker processes with negligible CPU overhead. By leaving the task of managing shared memory pages to the operating system, we have been able to run in parallel large reconstruction and simulation applications originally written to be run in a single thread of execution with little to no change to the application code. In spite of this, the process of validating athena multi-process for production took ten months of concentrated effort and is expected to continue for several more months. In general terms, we had two classes of problems in the multi-process port: merging the output fil...
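    A stripped-down illustration of the fork()/copy-on-write event task farm described above is shown below. The overall pattern (allocate read-only state, fork workers, wait for them) follows the description; the function names and the toy event loop are assumptions, not the athena implementation.

```python
# Sketch of a fork()-based event task farm exploiting copy-on-write: the parent
# allocates the large, read-only state once, then forks workers that share
# those memory pages until (and unless) they write to them.
import os

def load_big_readonly_state():
    # stands in for detector geometry, conditions data, etc.
    return [float(i) for i in range(1_000_000)]

def process_events(events, shared_state):
    for evt in events:
        _ = shared_state[evt % len(shared_state)]   # read-only access: pages stay shared
    os._exit(0)                                     # child never returns to parent code

def run_farm(n_workers=4, n_events=1000):
    shared_state = load_big_readonly_state()        # allocated once, before fork()
    events = list(range(n_events))
    pids = []
    for w in range(n_workers):
        pid = os.fork()
        if pid == 0:                                # child inherits pages copy-on-write
            process_events(events[w::n_workers], shared_state)
        pids.append(pid)
    for pid in pids:
        os.waitpid(pid, 0)                          # parent waits for all workers

if __name__ == "__main__":
    run_farm()
```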

  12. From the Kohn-Sham band gap to the fundamental gap in solids. An integer electron approach.

    Science.gov (United States)

    Baerends, E J

    2017-06-21

    It is often stated that the Kohn-Sham occupied-unoccupied gap in both molecules and solids is "wrong". We argue that this is not a correct statement. The KS theory does not allow one to interpret the exact KS HOMO-LUMO gap as the fundamental gap (the difference (I - A) of the ionization energy (I) and the electron affinity (A), twice the chemical hardness), from which it indeed differs, strongly in molecules and moderately in solids. The exact Kohn-Sham HOMO-LUMO gap in molecules is much below the fundamental gap and very close to the much smaller optical gap (first excitation energy), and LDA/GGA yield very similar gaps. In solids the situation is different: the excitation energy to delocalized excited states and the fundamental gap (I - A) are very similar, not so disparate as in molecules. Again, the Kohn-Sham and LDA/GGA band gaps do not represent (I - A) but are significantly smaller. However, the special properties of an extended system like a solid make it very easy to calculate the fundamental gap from ground-state (neutral-system) band structure calculations entirely within a density functional framework. The correction Δ from the KS gap to the fundamental gap originates from the response part v_resp of the exchange-correlation potential and can be calculated very simply using an approximation to v_resp. This affords a calculation of the fundamental gap at the same level of accuracy as other properties of crystals at little extra cost beyond the ground-state band structure calculation. The method is based on integer-electron systems; fractional-electron systems (an ensemble of N- and (N + 1)-electron systems) and the derivative discontinuity are not invoked.
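    For readers who want the bookkeeping spelled out, the relation implied by the abstract can be written compactly as below. The notation (I, A, Δ, v_resp) follows the abstract; the identification of Δ with the contribution of the response part of the exchange-correlation potential is the approximation described there.

```latex
% Gap bookkeeping implied by the abstract: the fundamental gap I - A equals
% the Kohn-Sham HOMO-LUMO gap plus a correction Delta, obtained here from
% (an approximation to) the response part v_resp of the xc potential.
\[
  E_{\mathrm{gap}}^{\mathrm{fund}} = I - A
  = \left(\varepsilon_{\mathrm{LUMO}} - \varepsilon_{\mathrm{HOMO}}\right) + \Delta .
\]
```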

  13. The Application of Electrochemical and Surface Analysis Approaches to Studying Copper Corrosion in Water: Fundamentals, Limitations, and Examples

    Science.gov (United States)

    Corrosion control is a concern for many drinking water utilities. The Lead and Copper Rule established a regulatory need to maintain a corrosion control program. Other corrosion-related issues such as “red” water resulting from excessive iron corrosion and copper pinhole leaks ...

  14. From the Kohn-Sham band gap to the fundamental gap in solids. An integer electron approach

    NARCIS (Netherlands)

    Baerends, E. J.

    2017-01-01

    It is often stated that the Kohn-Sham occupied-unoccupied gap in both molecules and solids is "wrong". We argue that this is not a correct statement. The KS theory does not allow one to interpret the exact KS HOMO-LUMO gap as the fundamental gap (difference (I - A) of electron affinity (A) and

  15. Radiology fundamentals

    CERN Document Server

    Singh, Harjit

    2011-01-01

    ""Radiology Fundamentals"" is a concise introduction to the dynamic field of radiology for medical students, non-radiology house staff, physician assistants, nurse practitioners, radiology assistants, and other allied health professionals. The goal of the book is to provide readers with general examples and brief discussions of basic radiographic principles and to serve as a curriculum guide, supplementing a radiology education and providing a solid foundation for further learning. Introductory chapters provide readers with the fundamental scientific concepts underlying the medical use of imag

  16. Fundamental Astronomy

    CERN Document Server

    Karttunen, Hannu; Oja, Heikki; Poutanen, Markku; Donner, Karl Johan

    2007-01-01

    Fundamental Astronomy gives a well-balanced and comprehensive introduction to the topics of classical and modern astronomy. While emphasizing both the astronomical concepts and the underlying physical principles, the text provides a sound basis for more profound studies in the astronomical sciences. The fifth edition of this successful undergraduate textbook has been extensively modernized and extended in the parts dealing with the Milky Way, extragalactic astronomy and cosmology as well as with extrasolar planets and the solar system (as a consequence of recent results from satellite missions and the new definition by the International Astronomical Union of planets, dwarf planets and small solar-system bodies). Furthermore a new chapter on astrobiology has been added. Long considered a standard text for physical science majors, Fundamental Astronomy is also an excellent reference and entrée for dedicated amateur astronomers.

  17. Fundamental safety principles. Safety fundamentals

    International Nuclear Information System (INIS)

    2006-01-01

    This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. The fundamental safety objective - to protect people and the environment from harmful effects of ionizing radiation - applies to all circumstances that give rise to radiation risks. The safety principles are applicable, as relevant, throughout the entire lifetime of all facilities and activities - existing and new - utilized for peaceful purposes, and to protective actions to reduce existing radiation risks. They provide the basis for requirements and measures for the protection of people and the environment against radiation risks and for the safety of facilities and activities that give rise to radiation risks, including, in particular, nuclear installations and uses of radiation and radioactive sources, the transport of radioactive material and the management of radioactive waste

  18. Fundamental safety principles. Safety fundamentals

    International Nuclear Information System (INIS)

    2007-01-01

    This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. The fundamental safety objective - to protect people and the environment from harmful effects of ionizing radiation - applies to all circumstances that give rise to radiation risks. The safety principles are applicable, as relevant, throughout the entire lifetime of all facilities and activities - existing and new - utilized for peaceful purposes, and to protective actions to reduce existing radiation risks. They provide the basis for requirements and measures for the protection of people and the environment against radiation risks and for the safety of facilities and activities that give rise to radiation risks, including, in particular, nuclear installations and uses of radiation and radioactive sources, the transport of radioactive material and the management of radioactive waste

  19. Upper limit for Poisson variable incorporating systematic uncertainties by Bayesian approach

    International Nuclear Information System (INIS)

    Zhu, Yongsheng

    2007-01-01

    To calculate the upper limit for a Poisson observable at a given confidence level, with inclusion of systematic uncertainties in the background expectation and the signal efficiency, formulations have been established along the lines of the Bayesian approach. A FORTRAN program, BPULE, has been developed to implement the upper limit calculation.
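
    A minimal numerical sketch of such a Bayesian upper-limit calculation (flat prior on the signal, Gaussian nuisance parameters for background and efficiency marginalised by Monte Carlo; this illustrates the general recipe only and is not the BPULE program, and all inputs in the example call are made up):

        import numpy as np
        from scipy import stats

        def bayes_poisson_upper_limit(n_obs, b_mean, b_sigma, eff_mean, eff_sigma,
                                      cl=0.90, s_max=50.0, n_grid=1000, n_nuis=2000, seed=0):
            # Flat prior on the signal strength s; Gaussian nuisances for the background
            # expectation b and the signal efficiency eff, truncated at zero.
            rng = np.random.default_rng(seed)
            s = np.linspace(0.0, s_max, n_grid)
            b = np.clip(rng.normal(b_mean, b_sigma, n_nuis), 0.0, None)
            eff = np.clip(rng.normal(eff_mean, eff_sigma, n_nuis), 1e-6, None)
            # Likelihood P(n_obs | s) marginalised over the nuisance samples.
            mu = eff[None, :] * s[:, None] + b[None, :]
            like = stats.poisson.pmf(n_obs, mu).mean(axis=1)
            # Normalise to a posterior and read off the CL quantile as the upper limit.
            cdf = np.cumsum(like / like.sum())
            return s[np.searchsorted(cdf, cl)]

        # Made-up example: 3 observed events, background 1.2 +/- 0.3, efficiency 0.80 +/- 0.05.
        print(bayes_poisson_upper_limit(3, 1.2, 0.3, 0.80, 0.05))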

  20. Marketing fundamentals.

    Science.gov (United States)

    Redmond, W H

    2001-01-01

    This chapter outlines current marketing practice from a managerial perspective. The role of marketing within an organization is discussed in relation to efficiency and adaptation to changing environments. Fundamental terms and concepts are presented in an applied context. The implementation of marketing plans is organized around the four P's of marketing: product (or service), promotion (including advertising), place of delivery, and pricing. These are the tools with which marketers seek to better serve their clients and form the basis for competing with other organizations. Basic concepts of strategic relationship management are outlined. Lastly, alternate viewpoints on the role of advertising in healthcare markets are examined.

  1. A "Fundamentals" Train-the-Trainer Approach to Building Pediatric Critical Care Expertise in the Developing World.

    Science.gov (United States)

    Crow, Sheri S; Ballinger, Beth A; Rivera, Mariela; Tsibadze, David; Gakhokidze, Nino; Zavrashvili, Nino; Ritter, Matthew J; Arteaga, Grace M

    2018-01-01

    Pediatric Fundamental Critical Care Support (PFCCS) is an educational tool for training non-intensivists, nurses, and critical care practitioners in diverse health-care settings to deal with the acute deterioration of pediatric patients. Our objective was to evaluate the PFCCS course as a tool for developing a uniform, reproducible, and sustainable model for educating local health-care workers in the optimal management of critically ill children in the Republic of Georgia. Over a period of 18 months and four visits to the country, we worked with Georgian pediatric critical care leadership to complete the following tasks: (1) survey health-care needs within the Republic of Georgia, (2) present representative PFCCS lectures and simulation scenarios to evaluate interest and obtain "buy-in" from key stakeholders throughout the Georgian educational infrastructure, and (3) identify PFCCS instructor candidates. Georgian PFCCS instructor training included the following steps: (1) US PFCCS consultant and content experts presented PFCCS course to Georgian instructor candidates. (2) Simulation learning principles were taught and basic equipment was acquired. (3) Instructor candidates presented PFCCS to Georgian learners, mentored by PFCCS course consultants. Objective evaluation and debriefing with instructor candidates concluded each visit. Between training visits Georgian instructors translated PFCCS slides to the Georgian language. Six candidates were identified and completed PFCCS instructor training. These Georgian instructors independently presented the PFCCS course to 15 Georgian medical students. Student test scores improved significantly from pretest results ( n  = 14) (pretest: 38.7 ± 7 vs. posttest 62.7 ± 6, p  fundamentals of pediatric critical care. Future collaborations will evaluate the clinical impact of PFCCS throughout the Georgian health-care system.

  2. Censoring: a new approach for detection limits in total-reflection X-ray fluorescence

    International Nuclear Information System (INIS)

    Pajek, M.; Kubala-Kukus, A.; Braziewicz, J.

    2004-01-01

    It is shown that the detection limits in total-reflection X-ray fluorescence (TXRF), which restrict quantification of very low concentrations of trace elements in the samples, can be accounted for using the statistical concept of censoring. We demonstrate that the incomplete TXRF measurements containing the so-called 'nondetects', i.e. the non-measured concentrations falling below the detection limits and represented by the estimated detection limit values, can be viewed as left random-censored data, which can be further analyzed using the Kaplan-Meier (KM) method correcting for nondetects. Within this approach, which uses the Kaplan-Meier product-limit estimator to obtain the cumulative distribution function corrected for the nondetects, the mean value and median of the detection limit censored concentrations can be estimated in a non-parametric way. The Monte Carlo simulations performed show that the Kaplan-Meier approach yields highly accurate estimates for the mean and median concentrations, being within a few percent with respect to the simulated, uncensored data. This means that the uncertainties of the KM-estimated mean value and median are in fact limited only by the number of studied samples and not by the applied correction procedure for nondetects itself. On the other hand, it is observed that, in cases where the concentration of a given element is not measured in all the samples, simple approaches to estimate a mean concentration value from the data yield erroneous, systematically biased results. The discussed random-left censoring approach was applied to analyze the TXRF detection-limit-censored concentration measurements of trace elements in biomedical samples. We emphasize that the Kaplan-Meier approach allows one to estimate mean concentrations lying substantially below the mean level of detection limits. Consequently, this approach effectively lowers the detection limits of the TXRF method, which is of prime interest for
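
    A compact sketch of the Kaplan-Meier treatment of nondetects described above (left censoring converted to right censoring by flipping about a constant, then the product-limit estimator; this is an illustration of the idea, not the authors' code, and the example concentrations are invented):

        import numpy as np

        def km_mean_left_censored(values, detected):
            # `values`: measured concentration where detected, otherwise the detection limit.
            # Flip about a constant M larger than every value so that left-censored data
            # become right-censored "survival times", apply Kaplan-Meier, then flip back.
            values = np.asarray(values, float)
            detected = np.asarray(detected, bool)
            M = values.max() + 1.0
            order = np.argsort(M - values)
            t, event = (M - values)[order], detected[order]
            n = len(t)
            at_risk = n - np.arange(n)
            surv = np.cumprod(1.0 - event / at_risk)          # product-limit estimator
            # Restricted mean of the flipped variable = area under the survival curve.
            steps = np.diff(np.concatenate(([0.0], t)))
            mean_flipped = np.sum(np.concatenate(([1.0], surv[:-1])) * steps)
            return M - mean_flipped

        conc = [0.5, 0.8, 1.2, 0.3, 0.3]          # last two entries are detection limits
        det  = [True, True, True, False, False]   # nondetects marked False
        print(km_mean_left_censored(conc, det))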

  3. Estimation of fundamental kinetic parameters of polyhydroxybutyrate fermentation process of Azohydromonas australica using statistical approach of media optimization.

    Science.gov (United States)

    Gahlawat, Geeta; Srivastava, Ashok K

    2012-11-01

    Polyhydroxybutyrate or PHB is a biodegradable and biocompatible thermoplastic with many interesting applications in medicine, food packaging, and tissue engineering materials. The present study deals with the enhanced production of PHB by Azohydromonas australica using sucrose and the estimation of fundamental kinetic parameters of PHB fermentation process. The preliminary culture growth inhibition studies were followed by statistical optimization of medium recipe using response surface methodology to increase the PHB production. Later on batch cultivation in a 7-L bioreactor was attempted using optimum concentration of medium components (process variables) obtained from statistical design to identify the batch growth and product kinetics parameters of PHB fermentation. A. australica exhibited a maximum biomass and PHB concentration of 8.71 and 6.24 g/L, respectively in bioreactor with an overall PHB production rate of 0.75 g/h. Bioreactor cultivation studies demonstrated that the specific biomass and PHB yield on sucrose was 0.37 and 0.29 g/g, respectively. The kinetic parameters obtained in the present investigation would be used in the development of a batch kinetic mathematical model for PHB production which will serve as launching pad for further process optimization studies, e.g., design of several bioreactor cultivation strategies to further enhance the biopolymer production.

  4. [Perioperative management of transthoracic oesophagectomies : Fundamentals of interdisciplinary care and new approaches to accelerated recovery after surgery].

    Science.gov (United States)

    Lambertz, R; Drinhaus, H; Schedler, D; Bludau, M; Schröder, W; Annecke, T

    2016-06-01

    Locally advanced carcinomas of the oesophagus require multimodal treatment. The core element of curative therapy is transthoracic en bloc oesophagectomy, which is the standard procedure carried out in most specialized centres. Reconstruction of intestinal continuity is usually achieved with a gastric sleeve, which is anastomosed either intrathoracically or cervically to the remaining oesophagus. This thoraco-abdominal operation is associated with significant postoperative morbidity, not least because of a vast array of pre-existing illnesses in the surgical patient. For an optimal outcome, the careful interdisciplinary selection of patients, preoperative risk evaluation and conditioning are essential. The caseload of the centres correlates inversely with the complication rate. The leading surgical complication is anastomotic leakage, which is diagnosed endoscopically and usually treated with the aid of endoscopic procedures. Pulmonary infections are the most frequent non-surgical complication. Thoracic epidural anaesthesia and perfusion-orientated fluid management can reduce the rate of pulmonary complications. Patients are ventilated protecting the lungs and are extubated as early as possible. Oesophagectomies should only be performed in high-volume centres with the close cooperation of surgeons and anaesthesia/intensive care specialists. Programmes of enhanced recovery after surgery (ERAS) hold further potential for the patient's quicker postoperative recovery. In this review article the fundamental aspects of the interdisciplinary perioperative management of transthoracic oesophagectomy are described.

  5. Cerebellum: from Fundamentals to Translational Approaches. The Seventh International Symposium of the Society for Research on the Cerebellum.

    Science.gov (United States)

    Manto, Mario; Mariën, Peter

    2016-02-01

    In terms of cerebellar research and ataxiology, a most fascinating period is currently going on. Numerous academic groups are now focusing their innovative research on the so-called little brain, hidden at the bottom of our brain. Indeed, its unique anatomical features make the cerebellum a wonderful window to address major questions about the central nervous system. The seventh international symposium of the SRC was held in Brussels at the Palace of Academies from May 8 to 10, 2015. The main goal of this dense symposium was to gather in a 2-day meeting senior researchers of exceptional scientific quality and talented junior scientists from all over the world working in the multidisciplinary field of cerebellar research. Fundamental and clinical researchers shared the latest knowledge and developments in this rapidly growing field. New ideas, addressed in a variety of inspiring talks, provoked a vivid debate. Advances in genetics, development, electrophysiology, neuroimaging, neurocognition and affect, as well as in the cerebellar ataxias and the controversies on the roles and functions of the cerebellum were presented. The Ferdinando Rossi lecture and the key-note lecture were delivered by Jan Voogd and Chris De Zeeuw, respectively. Contacts between researchers of different neuroscientific disciplines established a robust basis for novel trends and promising new cooperations between researchers and their centers spread all over the world.

  6. Fundamentalism and science

    Directory of Open Access Journals (Sweden)

    Massimo Pigliucci

    2006-06-01

    Full Text Available The many facets of fundamentalism. There has been much talk about fundamentalism of late. While most people's thoughts on the topic go to the 9/11 attacks against the United States, or to the ongoing war in Iraq, fundamentalism is affecting science and its relationship to society in a way that may have dire long-term consequences. Of course, religious fundamentalism has always had a history of antagonism with science, and – before the birth of modern science – with philosophy, the age-old vehicle of the human attempt to exercise critical thinking and rationality to solve problems and pursue knowledge. “Fundamentalism” is defined by the Oxford Dictionary of the Social Sciences1 as “A movement that asserts the primacy of religious values in social and political life and calls for a return to a 'fundamental' or pure form of religion.” In its broadest sense, however, fundamentalism is a form of ideological intransigence which is not limited to religion, but includes political positions as well (for example, in the case of some extreme forms of “environmentalism”).

  7. A novel approach to derive halo-independent limits on dark matter properties

    OpenAIRE

    Ferrer, Francesc; Ibarra, Alejandro; Wild, Sebastian

    2015-01-01

    We propose a method that allows one to place an upper limit on the dark matter elastic scattering cross section with nucleons which is independent of the velocity distribution. Our approach combines null results from direct detection experiments with indirect searches at neutrino telescopes, and goes beyond previous attempts to remove astrophysical uncertainties in that it directly constrains the particle physics properties of the dark matter. The resulting halo-independent upper limits on the sc...

  8. Understanding small biomolecule-biomaterial interactions: a review of fundamental theoretical and experimental approaches for biomolecule interactions with inorganic surfaces.

    Science.gov (United States)

    Costa, Dominique; Garrain, Pierre-Alain; Baaden, Marc

    2013-04-01

    Interactions between biomolecules and inorganic surfaces play an important role in natural environments and in industry, including a wide variety of conditions: marine environment, ship hulls (fouling), water treatment, heat exchange, membrane separation, soils, mineral particles at the earth's surface, hospitals (hygiene), art and buildings (degradation and biocorrosion), paper industry (fouling) and more. To better control the first steps leading to adsorption of a biomolecule on an inorganic surface, it is mandatory to understand the adsorption mechanisms of biomolecules of several sizes at the atomic scale, that is, the nature of the chemical interaction between the biomolecule and the surface and the resulting biomolecule conformations once adsorbed at the surface. This remains a challenging and unsolved problem. Here, we review the state of the art in experimental and theoretical approaches. We focus on metallic biomaterial surfaces such as TiO(2) and stainless steel, mentioning some remarkable results on hydroxyapatite. Experimental techniques include atomic force microscopy, surface plasmon resonance, quartz crystal microbalance, X-ray photoelectron spectroscopy, fluorescence microscopy, polarization modulation infrared reflection absorption spectroscopy, sum frequency generation and time of flight secondary ion mass spectroscopy. Theoretical models range from detailed quantum mechanical representations to classical forcefield-based approaches. Copyright © 2012 Wiley Periodicals, Inc.

  9. Censoring approach to the detection limits in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Pajek, M.; Kubala-Kukus, A.

    2004-01-01

    We demonstrate that the effect of detection limits in X-ray fluorescence analysis (XRF), which limits the determination of very low concentrations of trace elements and results in the appearance of the so-called 'nondetects', can be accounted for using the statistical concept of censoring. More precisely, the results of such measurements can be viewed as left random-censored data, which can further be analyzed using the Kaplan-Meier method correcting the data for the presence of nondetects. Using this approach, the results of measured, detection limit censored concentrations can be interpreted in a nonparametric manner including the correction for the nondetects, i.e. the measurements in which the concentrations were found to be below the actual detection limits. Moreover, using the Monte Carlo simulation technique we show that by using the Kaplan-Meier approach the corrected mean concentrations for a population of the samples can be estimated to within a few percent with respect to the simulated, uncensored data. This practically means that the final uncertainties of estimated mean values are in fact limited by the number of studied samples and not by the correction procedure itself. The discussed random-left censoring approach was applied to analyze the XRF detection-limit-censored concentration measurements of trace elements in biomedical samples.

  10. Ethics fundamentals.

    Science.gov (United States)

    Chambers, David W

    2011-01-01

    Ethics is about studying the right and the good; morality is about acting as one should. Although there are differences among what is legal, charitable, professional, ethical, and moral, these desirable characteristics tend to cluster and are treasured in dentistry. The traditional approach to professionalism in dentistry is based on a theory of biomedical ethics advanced 30 years ago. Known as the principles approach, general ideals such as respect for autonomy, nonmaleficence, beneficence, justice, and veracity, are offered as guides. Growth in professionalism consists in learning to interpret the application of these principles as one's peers do. Moral behavior is conceived as a continuous cycle of sensitivity to situations requiring moral response, moral reasoning, the moral courage to take action when necessary, and integration of habits of moral behavior into one's character. This essay is the first of two papers that provide the backbone for the IDEA Project of the College--an online, multiformat, interactive "textbook" of ethics for the profession.

  11. Novel approaches to a study of the fundamental organic chemistry of coal. Final report, September 1, 1977-September 1, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Giam, C.S.; Goodwin, T.E.; Tabor, R.L.; Neff, G.; Smith, S.; Ionescu, F.; Trujillo, D.

    1979-01-01

    The studies are preliminary in nature, and the following conclusions are tentative. (1) The results with mixed carboxylic-sulfonic anhydrides seem to indicate an increase in polymerization of the coal molecular structure, when based on the results of pyrolysis/gas chromatographic analyses. The mixed anhydrides are such powerful acylating reagents that they should be capable of causing profound and dramatic structural modifications of coal and the results suggest sub-optimal reaction conditions. The results may also be due to the presence of only a small number of ether linkages connecting large molecular units together. Possibly, at elevated pressures and larger concentrations of mixed anhydride, a greater extent of depolymerization would occur, coupled perhaps with acylation. (2) The Nimz lignin degradation reaction has now been fully implemented and good conditions have been found for lignite reaction. The products from this degradation were basically hydrocarbon in nature. Thus, in the absence of monolignols, we postulate that such phenolic linkages of the type found in lignin are not found to a large degree in Texas lignites. (3) Our recently developed technique of analyzing methylene to methyl ratios by IR spectroscopy represents a useful method for characterization of both soluble and insoluble coal-derived products. The technique is less expensive than mass spectroscopy and not limited by solubility as in the case of NMR spectroscopy. (4) From the measurements of the acidic hydrogen content of the lignites studies, we have formed a postulate as to the involvement of heteroatoms (especially oxygen) in the lignite structure. We feel that heteroatoms in Texas lignites are involved mainly in carbonyl, low molecular weight alkoxy and/or heterocyclic units. (5) Conditions for depolymerizing and solubilizing lignites by use of t-butyllithium have been developed and utilized successfully.

  12. New approaches to deriving limits of the release of radioactive material into the environment

    International Nuclear Information System (INIS)

    Lindell, B.

    1977-01-01

    During the last few years, new principles have been developed for the limitation of the release of radioactive material into the environment. It is no longer considered appropriate to base the limitation on limits for the concentrations of the various radionuclides in air and water effluents. Such limits would not prevent large amounts of radioactive material from reaching the environment should effluent rates be high. A common practice has been to identify critical radionuclides and critical pathways and to base the limitation on authorized dose limits for local ''critical groups''. If this were the only limitation, however, larger releases could be permitted after installing either higher stacks or equipment to retain the more short-lived radionuclides for decay before release. Continued release at such limits would then lead to considerably higher exposure at a distance than if no such installation had been made. Accordingly there would be no immediate control of overlapping exposures from several sources, nor would the system guarantee control of the future situation. The new principles described in this paper take the future into account by limiting the annual dose commitments rather than the annual doses. They also offer means of controlling the global situation by limiting not only doses in critical groups but also global collective doses. Their objective is not only to ensure that individual dose limits will always be respected but also to meet the requirement that ''all doses be kept as low as reasonably achievable''. The new approach is based on the most recent recommendations by the ICRP and has been described in a report by an IAEA panel (Procedures for establishing limits for the release of radioactive material into the environment). It has been applied in the development of new Swedish release regulations, which illustrate some of the problems which arise in the practical application

  13. New approaches to deriving limits of the release of radioactive material into the environment

    International Nuclear Information System (INIS)

    Lindell, B.

    1977-01-01

    During the last few years, new principles have been developed for the limitation of the release of radioactive material into the environment. It is no longer considered appropriate to base the limitation on limits for the concentrations of the various radionuclides in air and water effluents. Such limits would not prevent large amounts of radioactive material from reaching the environment should effluent rates be high. A common practice has been to identify critical radionuclides and critical pathways and to base the limitation on authorized dose limits for local ''critical groups''. If this were the only limitation, however, larger releases could be permitted after installing either higher stacks or equipment to retain the more shortlived radionuclides for decay before release. Continued release at such limits would then lead to considerably higher exposure at a distance than if no such installation had been made. Accordingly there would be no immediate control of overlapping exposures from several sources, nor would the system guarantee control of the future situation. The new principles described in this paper take the future into account by limiting the annual dose commitments rather than the annual doses. They also offer means of controlling the global situation by limiting not only doses in critical groups but also global collective doses. Their objective is not only to ensure that individual dose limits will always be respected but also to meet the requirement that ''all doses be kept as low as reasonably achievable''. The new approach is based on the most recent recommendations by the ICRP and has been described in a report by an IAEA panel (Procedures for Establishing Limits for the Release of Radioactive Material into the Environment). It has been applied in the development of new Swedish release regulations, which illustrate some of the problems which arise in the practical application. (author)

  14. Relevance of plastic limit loads to reference stress approach for surface cracked cylinder problems

    International Nuclear Information System (INIS)

    Kim, Yun-Jae; Shim, Do-Jun

    2005-01-01

    To investigate the relevance of the definition of the reference stress to estimate J and C* for surface crack problems, this paper compares finite element (FE) J and C* results for surface cracked pipes with those estimated according to the reference stress approach using various definitions of the reference stress. Pipes with part circumferential inner surface cracks and finite internal axial cracks are considered, subject to internal pressure and global bending. The crack depth and aspect ratio are systematically varied. The reference stress is defined in four different ways using (i) a local limit load (ii), a global limit load, (iii) a global limit load determined from the FE limit analysis, and (iv) the optimised reference load. It is found that the reference stress based on a local limit load gives overall excessively conservative estimates of J and C*. Use of a global limit load clearly reduces the conservatism, compared to that of a local limit load, although it can sometimes provide non-conservative estimates of J and C*. The use of the FE global limit load gives overall non-conservative estimates of J and C*. The reference stress based on the optimised reference load gives overall accurate estimates of J and C*, compared to other definitions of the reference stress. Based on the present findings, general guidance on the choice of the reference stress for surface crack problems is given
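
    For orientation, the standard reference-stress definitions that the comparison above varies over (our notation; the paper's point is how the choice of the limit load $P_L$ entering $\sigma_{\mathrm{ref}}$ affects the estimates):

        $\sigma_{\mathrm{ref}} = \frac{P}{P_L}\,\sigma_y, \qquad C^* \approx \sigma_{\mathrm{ref}}\,\dot{\varepsilon}_{\mathrm{ref}}\left(\frac{K}{\sigma_{\mathrm{ref}}}\right)^2$

    Here $P$ is the applied load, $\sigma_y$ the yield stress, $K$ the elastic stress intensity factor and $\dot{\varepsilon}_{\mathrm{ref}}$ the creep strain rate at $\sigma_{\mathrm{ref}}$; a local, global, FE-based or optimised limit load changes $\sigma_{\mathrm{ref}}$ and hence the estimated J and C*.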

  15. Forming Limits in Sheet Metal Forming for Non-Proportional Loading Conditions - Experimental and Theoretical Approach

    International Nuclear Information System (INIS)

    Ofenheimer, Aldo; Buchmayr, Bruno; Kolleck, Ralf; Merklein, Marion

    2005-01-01

    The influence of strain paths (loading history) on material formability is well known in sheet forming processes. Sophisticated experimental methods are used to determine the entire shape of strain paths of forming limits for aluminum AA6016-T4 alloy. Forming limits for sheet metal in as-received condition as well as for different pre-deformation are presented. A theoretical approach based on Arrieux's intrinsic Forming Limit Stress Curve (FLSC) concept is employed to numerically predict the influence of loading history on forming severity. The detailed experimental strain paths are used in the theoretical study instead of any linear or bilinear simplified loading histories to demonstrate the predictive quality of forming limits in the state of stress

  16. Limited endoscopic transsphenoidal approach for cavernous sinus biopsy: illustration of 3 cases and discussion.

    Science.gov (United States)

    Graillon, T; Fuentes, S; Metellus, P; Adetchessi, T; Gras, R; Dufour, H

    2014-01-01

    Advances in transsphenoidal surgery and endoscopic techniques have opened new perspectives for cavernous sinus (CS) approaches. The aim of this study was to assess the advantages and disadvantages of a limited endoscopic transsphenoidal approach, as performed in pituitary adenoma surgery, for CS tumor biopsy, illustrated with three clinical cases. The first case was a 46-year-old woman with a prior medical history of parotid adenocarcinoma successfully treated 10 years previously. The cavernous sinus tumor was revealed by right third and sixth nerve palsy and increased over the past three years. A tumor biopsy using a limited endoscopic transsphenoidal approach revealed an adenocarcinoma metastasis. Complementary radiosurgery was performed. The second case was a 36-year-old woman who consulted for diplopia with right sixth nerve palsy and amenorrhea with hyperprolactinemia. Dopamine agonist treatment was used to restore the patient's menstrual cycle. Cerebral magnetic resonance imaging (MRI) revealed a right-sided CS tumor. CS biopsy, via a limited endoscopic transsphenoidal approach, confirmed a meningothelial grade 1 meningioma. Complementary radiosurgery was performed. The third case was a 63-year-old woman with progressive onset of left third nerve palsy and visual acuity loss, revealing a left cavernous sinus tumor invading the optic canal. Surgical biopsy was performed using an enlarged endoscopic transsphenoidal approach to decompress the optic nerve. Biopsy results revealed a meningothelial grade 1 meningioma. Complementary radiotherapy was performed. In these three cases, no complications were observed. Mean hospitalization duration was 4 days. Reported anatomical studies and clinical series have shown the feasibility of reaching the cavernous sinus using an endoscopic endonasal approach. Trans-foramen ovale CS percutaneous biopsy is an interesting procedure but only provides cell analysis results, and not tissue analysis. However, radiotherapy and

  17. Fundamentals of Monte Carlo

    International Nuclear Information System (INIS)

    Wollaber, Allan Benton

    2016-01-01

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
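
    A minimal version of the "estimating π" example from the outline (hit-or-miss sampling in the unit square; the Law of Large Numbers gives convergence and the Central Limit Theorem the error bar):

        import numpy as np

        # Estimate pi: the fraction of uniform points in the unit square that fall
        # inside the quarter circle of radius 1 converges to pi/4 (Law of Large Numbers).
        rng = np.random.default_rng(0)
        n = 1_000_000
        x, y = rng.random(n), rng.random(n)
        inside = (x * x + y * y) <= 1.0
        pi_hat = 4.0 * inside.mean()
        # The Central Limit Theorem gives the ~1/sqrt(n) statistical error bar.
        err = 4.0 * inside.std(ddof=1) / np.sqrt(n)
        print(f"pi ~ {pi_hat:.4f} +/- {err:.4f}")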

  18. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.

  19. Problem of data quality and the limitations of the infrastructure approach

    Science.gov (United States)

    Behlen, Fred M.; Sayre, Richard E.; Rackus, Edward; Ye, Dingzhong

    1998-07-01

    The 'Infrastructure Approach' is a PACS implementation methodology wherein the archive, network and information systems interfaces are acquired first, and workstations are installed later. The approach allows building a history of archived image data, so that most prior examinations are available in digital form when workstations are deployed. A limitation of the Infrastructure Approach is that the deferred use of digital image data defeats many data quality management functions that are provided automatically by human mechanisms when data is immediately used for the completion of clinical tasks. If the digital data is used solely for archiving while reports are interpreted from film, the radiologist serves only as a check against lost films, and another person must be designated as responsible for the quality of the digital data. Data from the Radiology Information System and the PACS were analyzed to assess the nature and frequency of system and data quality errors. The error level was found to be acceptable if supported by auditing and error resolution procedures requiring additional staff time, and in any case was better than the loss rate of a hardcopy film archive. It is concluded that the problem of data quality compromises but does not negate the value of the Infrastructure Approach. The Infrastructure Approach should best be employed only to a limited extent, and that any phased PACS implementation should have a substantial complement of workstations dedicated to softcopy interpretation for at least some applications, and with full deployment following not long thereafter.

  20. Fundamentals of Geophysics

    Science.gov (United States)

    Frohlich, Cliff

    Choosing an intermediate-level geophysics text is always problematic: What should we teach students after they have had introductory courses in geology, math, and physics, but little else? Fundamentals of Geophysics is aimed specifically at these intermediate-level students, and the author's stated approach is to construct a text “using abundant diagrams, a simplified mathematical treatment, and equations in which the student can follow each derivation step-by-step.” Moreover, for Lowrie, the Earth is round, not flat—the “fundamentals of geophysics” here are the essential properties of our Earth the planet, rather than useful techniques for finding oil and minerals. Thus this book is comparable in both level and approach to C. M. R. Fowler's The Solid Earth (Cambridge University Press, 1990).

  1. Quantum mechanics a fundamental approach

    CERN Document Server

    Wan, K Kong

    2018-01-01

    The mathematical formalism of quantum theory in terms of vectors and operators in infinite-dimensional complex vector spaces is very abstract. The definitions of many mathematical quantities used do not seem to have an intuitive meaning. This makes it difficult to appreciate the mathematical formalism and hampers the understanding of quantum mechanics. This book provides intuition and motivation to the mathematics of quantum theory, introducing the mathematics in its simplest and familiar form, for instance, with three-dimensional vectors and operators, which can be readily understood. Feeling confident about and comfortable with the mathematics used helps readers appreciate and understand the concepts and formalism of quantum mechanics. Quantum mechanics is presented in six groups of postulates. A chapter is devoted to each group of postulates with a detailed discussion. Systems with superselection rules, and some conceptual issues such as quantum paradoxes and measurement, are also discussed. The book conc...

  2. Stochastic approach to the derivation of emission limits for wastewater treatment plants.

    Science.gov (United States)

    Stransky, D; Kabelkova, I; Bares, V

    2009-01-01

    A stochastic approach to the derivation of WWTP emission limits meeting probabilistically defined environmental quality standards (EQS) is presented. The stochastic model is based on the mixing equation with input data defined by probability density distributions and solved by Monte Carlo simulations. The approach was tested on a study catchment for total phosphorus (P(tot)). The model assumes independence of the input variables, which was verified for the dry-weather situation. Discharges and P(tot) concentrations both in the study creek and WWTP effluent follow a log-normal probability distribution. Variation coefficients of P(tot) concentrations differ considerably along the stream (c(v)=0.415-0.884). The selected value of the variation coefficient (c(v)=0.420) affects the derived mean value (C(mean)=0.13 mg/l) of the P(tot) EQS (C(90)=0.2 mg/l). Even after an assumed improvement of water quality upstream of the WWTP to the level of the P(tot) EQS, the WWTP emission limits calculated would be lower than the values of the best available technology (BAT). Thus, minimum dilution ratios for the meaningful application of the combined approach to the derivation of P(tot) emission limits for Czech streams are discussed.
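
    A toy Monte Carlo version of the mixing-equation calculation described above (all distribution parameters below are invented placeholders, not the study catchment values; the EQS is taken as the 90th-percentile value quoted in the abstract):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        # Log-normal dry-weather inputs: stream discharge Q_s, stream P_tot concentration C_s,
        # and WWTP effluent discharge Q_e (units: L/s and mg/L; values are placeholders).
        Q_s = rng.lognormal(np.log(50.0), 0.4, n)
        C_s = rng.lognormal(np.log(0.10), 0.42, n)
        Q_e = rng.lognormal(np.log(10.0), 0.2, n)

        def downstream_p90(C_e):
            # Mixing equation for a constant effluent concentration C_e.
            C_mix = (Q_s * C_s + Q_e * C_e) / (Q_s + Q_e)
            return np.percentile(C_mix, 90)

        # Largest constant effluent concentration whose downstream 90th percentile
        # still meets the EQS (C90 = 0.2 mg/L), found by bisection.
        eqs, lo, hi = 0.2, 0.0, 5.0
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if downstream_p90(mid) <= eqs else (lo, mid)
        print(f"derived emission limit: {lo:.3f} mg/L")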

  3. The risk to be tolerated and the limits of practical rationality - problems involved in nuclear licensing. - Are there prerogatives of the administration in decision-making? Fundamental criticism of the undefined legal concept in the law pertaining to plant licensing

    International Nuclear Information System (INIS)

    Wolf, R.

    1986-01-01

    This chapter discusses in detail the litigation and court decisions in nuclear energy matters, with particular attention being given to the scope and distinctness of juristic interpretations of vaguely defined legal concepts, and to the definition of the 'risk to be tolerated'. Especially the court decisions on the nuclear power plant licences for the installations at Wuergassen, Wyhl, Grafenrheinfeld and Kalkar are reviewed under the following aspects: How safe is safe enough - bursting resistance - risk prevention and practical rationality - limits of scientific research into risk probability - fundamental criticism to be raised against vaguely defined legal terms and concepts in the law governing the licensing of nuclear installations. (HSCH) [de

  4. Approach to the thermodynamic limit in lattice QCD at μ≠0

    International Nuclear Information System (INIS)

    Splittorff, K.; Verbaarschot, J. J. M.

    2008-01-01

    The expectation value of the complex phase factor of the fermion determinant is computed to leading order in the p expansion of the chiral Lagrangian. The computation is valid for μ < m_π/2 and determines the dependence of the sign problem on the volume and on the geometric shape of the volume. In the thermodynamic limit with L_i → ∞ at fixed temperature 1/L_0, the average phase factor vanishes. In the low temperature limit where L_i/L_0 is fixed as L_i becomes large, the average phase factor approaches 1 for μ < m_π/2. The results for a finite volume compare well with lattice results obtained by Allton et al. After taking appropriate limits, we reproduce previously derived results for the ε regime and for one-dimensional QCD. The distribution of the phase itself is also computed.

  5. La operación analítica: límites y fundamentos (The analytical operation: limits and fundamentals)

    Directory of Open Access Journals (Sweden)

    David Laznik

    2009-12-01

    Full Text Available The construction of the theoretical corpus of psychoanalysis, as the theory of a praxis, undergoes throughout Freud's work various rectifications that affect the delimitation of the concepts and operations inherent to its field. From this perspective, the question of the scope and limits of the psychoanalytic method persists, articulated with the successive reformulations. After establishing the second topography, Freud systematized, in 1926, the different types of resistance. Later, at various moments, he returned to the problem of the obstacles that complicate analytic work. These considerations introduce new questions and outline the influence of new factors which, even without crystallizing into a finished formalization, render the status and scope of the analytic operation more complex.

  6. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) are counted by high resolution gamma spectrometry registering a pulse-height distribution (acquisition of a multichannel spectrum), for example on samples. It considers exclusively the random character of radioactive decay and of pulse counting and ignores all other influences (e.g. arising from sample treatment, weighing, enrichment or the instability of the test setup). It assumes that the distance between neighbouring gamma-line peaks is not smaller than four times the full width at half maximum (FWHM) of a gamma line and that the background near a gamma line is nearly a straight line. Otherwise ISO 11929-1 or ISO 11929-2 should be used. ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as nonradioactive waste, and ISO 11929-1, ISO 11929-2 and ISO 11929-4, and is, consequently, complementary to these documents.
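
    For orientation, a sketch of the decision threshold y* and detection limit y# in the simplest counting case (gross counts n_g in time t_g, separately measured background n_0 in time t_0; this illustrates the concept in the spirit of Part 1 and is not the Part 3 spectrometry formulae):

        $y^{*} = k_{1-\alpha}\sqrt{\frac{n_0}{t_0}\left(\frac{1}{t_g}+\frac{1}{t_0}\right)}, \qquad y^{\#} = y^{*} + k_{1-\beta}\sqrt{\frac{y^{\#}}{t_g}+\frac{n_0}{t_0}\left(\frac{1}{t_g}+\frac{1}{t_0}\right)}$

    where the net count rate estimate is $n_g/t_g - n_0/t_0$, $k_{1-\alpha}$ and $k_{1-\beta}$ are standard normal quantiles for the errors of the first and second kind, and the implicit equation for $y^{\#}$ is solved iteratively.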

  7. The emergence of the dimensions and fundamental forces in the universe, an information-theoretical approach for the expaining of the quantity ratios of the fundamental interactions. 2. rev. and enl. ed.

    International Nuclear Information System (INIS)

    Ganter, Bernd

    2013-01-01

    After a description of the four fundamental interactions and the connection of information with energy, the principle of fast maximization together with the Ganter tableau is described. Then, as an example, the derivation of the value of the fine-structure constant from the Ganter tableau is described. Thereafter the extension of the Ganter tableau, further properties of the Ganter tableau, and the persuasiveness of the Ganter tableau are considered. (HSI)

  8. Novel approach to epicardial pacemaker implantation in patients with limited venous access.

    Science.gov (United States)

    Costa, Roberto; Scanavacca, Mauricio; da Silva, Kátia Regina; Martinelli Filho, Martino; Carrillo, Roger

    2013-11-01

    Limited venous access in certain patients increases the procedural risk and complexity of conventional transvenous pacemaker implantation. The purpose of this study was to determine a minimally invasive epicardial approach using pericardial reflections for dual-chamber pacemaker implantation in patients with limited venous access. Between June 2006 and November 2011, 15 patients underwent epicardial pacemaker implantation. Procedures were performed through a minimally invasive subxiphoid approach and pericardial window with subsequent fluoroscopy-assisted lead placement. Mean patient age was 46.4 ± 15.3 years (9 male [(60.0%], 6 female [40.0%]). The new surgical approach was used in patients determined to have limited venous access due to multiple abandoned leads in 5 (33.3%), venous occlusion in 3 (20.0%), intravascular retention of lead fragments from prior extraction in 3 (20.0%), tricuspid valve vegetation currently under treatment in 2 (13.3%), and unrepaired intracardiac defects in 2 (13.3%). All procedures were successful with no perioperative complications or early deaths. Mean operating time for isolated pacemaker implantation was 231.7 ± 33.5 minutes. Lead placement on the superior aspect of right atrium, through the transverse sinus, was possible in 12 patients. In the remaining 3 patients, the atrial lead was implanted on the left atrium through the oblique sinus, the postcaval recess, or the left pulmonary vein recess. None of the patients displayed pacing or sensing dysfunction, and all parameters remained stable throughout the follow-up period of 36.8 ± 25.1 months. Epicardial pacemaker implantation through pericardial reflections is an effective alternative therapy for those patients requiring physiologic pacing in whom venous access is limited. © 2013 Heart Rhythm Society. All rights reserved.

  9. An Adaptive Approach to Mitigate Background Covariance Limitations in the Ensemble Kalman Filter

    KAUST Repository

    Song, Hajoon

    2010-07-01

    A new approach is proposed to address the background covariance limitations arising from undersampled ensembles and unaccounted model errors in the ensemble Kalman filter (EnKF). The method enhances the representativeness of the EnKF ensemble by augmenting it with new members chosen adaptively to add missing information that prevents the EnKF from fully fitting the data to the ensemble. The vectors to be added are obtained by back projecting the residuals of the observation misfits from the EnKF analysis step onto the state space. The back projection is done using an optimal interpolation (OI) scheme based on an estimated covariance of the subspace missing from the ensemble. In the experiments reported here, the OI uses a preselected stationary background covariance matrix, as in the hybrid EnKF–three-dimensional variational data assimilation (3DVAR) approach, but the resulting correction is included as a new ensemble member instead of being added to all existing ensemble members. The adaptive approach is tested with the Lorenz-96 model. The hybrid EnKF–3DVAR is used as a benchmark to evaluate the performance of the adaptive approach. Assimilation experiments suggest that the new adaptive scheme significantly improves the EnKF behavior when it suffers from small size ensembles and neglected model errors. It was further found to be competitive with the hybrid EnKF–3DVAR approach, depending on ensemble size and data coverage.
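
    A schematic of the back-projection step described above (an OI correction built from a preselected static covariance, appended as an extra ensemble member; variable names, shapes and the random test data are ours, not the authors' implementation):

        import numpy as np

        def adaptive_member(X_a, y, H, R, B_static):
            # X_a: (n, m) analysis ensemble; y: (p,) observations; H: (p, n) observation
            # operator; R: (p, p) observation-error covariance; B_static: (n, n) static
            # background covariance used by the OI step.
            x_mean = X_a.mean(axis=1)
            resid = y - H @ x_mean                        # misfit left after the EnKF analysis
            S = H @ B_static @ H.T + R
            K_oi = B_static @ H.T @ np.linalg.solve(S, np.eye(len(y)))
            correction = K_oi @ resid                     # OI back-projection of the residual
            new_member = x_mean + correction
            return np.column_stack([X_a, new_member])     # augmented ensemble

        # Tiny synthetic usage example with random matrices.
        rng = np.random.default_rng(1)
        n, m, p = 8, 5, 3
        X_a = rng.normal(size=(n, m))
        H = np.zeros((p, n)); H[np.arange(p), [0, 3, 6]] = 1.0
        y = rng.normal(size=p)
        R = 0.1 * np.eye(p)
        B = np.eye(n)
        print(adaptive_member(X_a, y, H, R, B).shape)     # (8, 6)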

  10. An ICMP-Based Mobility Management Approach Suitable for Protocol Deployment Limitation

    Directory of Open Access Journals (Sweden)

    Jeng-Yueng Chen

    2009-01-01

    Full Text Available Mobility management is one of the important tasks in wireless networks. Many approaches have been proposed in the past, but none of them have been widely deployed so far. Mobile IP (MIP) and Route Optimization (ROMIP), respectively, suffer from the triangular routing problem and from requiring binding-cache support on each node on the entire Internet. One step toward a solution is the Mobile Routing Table (MRT), which enables edge routers to take over address binding. However, this approach demands that all the edge routers on the Internet support MRT, resulting in protocol deployment difficulties. To address this problem and to offset the limitation of the original MRT approach, we propose two different schemes, an ICMP echo scheme and an ICMP destination-unreachable scheme. These two schemes work with the MRT to efficiently find MRT-enabled routers, thereby greatly reducing the number of triangular routes. In this paper, we analyze and compare the standard MIP and the proposed approaches. Simulation results have shown that the proposed approaches reduce transmission delay, with only a few routers supporting MRT.

  11. Fundamental aspects of plasma chemical physics Thermodynamics

    CERN Document Server

    Capitelli, Mario; D'Angola, Antonio

    2012-01-01

    Fundamental Aspects of Plasma Chemical Physics - Thermodynamics develops basic and advanced concepts of plasma thermodynamics from both classical and statistical points of view. After a refreshment of classical thermodynamics applied to the dissociation and ionization regimes, the book invites the reader to discover the role of electronic excitation in affecting the properties of plasmas, a topic often overlooked by the thermal plasma community. Particular attention is devoted to the problem of the divergence of the partition function of atomic species and the state-to-state approach for calculating the partition function of diatomic and polyatomic molecules. The limit of ideal gas approximation is also discussed, by introducing Debye-Huckel and virial corrections. Throughout the book, worked examples are given in order to clarify concepts and mathematical approaches. This book is a first of a series of three books to be published by the authors on fundamental aspects of plasma chemical physics.  The next bo...

  12. EFFICACY OF SUBMUCOSAL DELIVERY THROUGH A PARAPHARYNGEAL APPROACH IN THE TREATMENT OF LIMITED CRICOID CHONDROMA

    Directory of Open Access Journals (Sweden)

    M.T. Khorsi Y. Amidi

    2008-05-01

    Full Text Available Cartilaginous tumors comprise 1% of all laryngeal masses. Since they grow slowly and metastasis is rare, long-term survival is expected in cases of chondroma and chondrosarcoma. Thus, based on these facts and the fact that total salvage surgery after recurrence of the previous tumor does not influence treatment outcomes, "Quality of Life" must be taken into great consideration. Based on 3 cases of limited chondrosarcoma that we have successfully operated on using submucosal delivery through a parapharyngeal approach, after several years of recurrence-free follow-ups, the authors regard this technique as an efficient approach to these tumors. Since this technique takes less time, there is no need for glottic incision, and the patient is discharged in 2 days without insertion of an endolaryngeal stent, we believe this method is superior to laryngofissure or total laryngectomy.

  13. An Optimization-Based Impedance Approach for Robot Force Regulation with Prescribed Force Limits

    Directory of Open Access Journals (Sweden)

    R. de J. Portillo-Vélez

    2015-01-01

    Full Text Available An optimization-based approach for the regulation of excessive or insufficient forces at the end-effector level is introduced. The objective is to minimize the interaction force error at the robot end effector, while constraining undesired interaction forces. To that end, a dynamic optimization problem (DOP) is formulated considering a dynamic robot impedance model. Penalty functions are considered in the DOP to handle the constraints on the interaction force. The optimization problem is solved online through the gradient-flow approach. Convergence properties are presented and stability is established when the force limits are considered in the analysis. The effectiveness of our proposal is validated via experimental results for a robotic grasping task.
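
    A one-dimensional toy sketch of the penalty-plus-gradient-flow idea described above (quasi-static contact with a linear environment stiffness; all gains, limits and the model itself are invented for illustration and are not the paper's formulation):

        k_env = 500.0    # environment stiffness (N/m), hypothetical
        f_des = 10.0     # desired contact force (N)
        f_max = 15.0     # prescribed force limit (N)
        rho = 50.0       # penalty weight on violating the limit
        gamma = 1.0e-6   # gradient-flow step size (small enough for stability)
        x = 0.0          # commanded penetration of the impedance reference (m)

        def grad(x):
            f = k_env * x                      # quasi-static interaction force
            g = (f - f_des) * k_env            # gradient of 0.5 * (f - f_des)**2
            if f > f_max:                      # quadratic penalty above the force limit
                g += rho * (f - f_max) * k_env
            return g

        for _ in range(500):                   # discretised gradient flow: x_dot = -grad
            x -= gamma * grad(x)

        print(f"force = {k_env * x:.2f} N (desired {f_des} N, limit {f_max} N)")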

  14. Fundamental limitations on 'warp drive' spacetimes

    International Nuclear Information System (INIS)

    Lobo, Francisco S N; Visser, Matt

    2004-01-01

    'Warp drive' spacetimes are useful as 'gedanken-experiments' that force us to confront the foundations of general relativity, and among other things, to precisely formulate the notion of 'superluminal' communication. After carefully formulating the Alcubierre and Natario warp drive spacetimes, and verifying their non-perturbative violation of the classical energy conditions, we consider a more modest question and apply linearized gravity to the weak-field warp drive, testing the energy conditions to first and second orders of the warp-bubble velocity, v. Since we take the warp-bubble velocity to be non-relativistic, v << c, we are not primarily interested in the 'superluminal' features of the warp drive. Instead we focus on a secondary feature of the warp drive that has not previously been remarked upon-the warp drive (if it could be built) would be an example of a 'reaction-less drive'. For both the Alcubierre and Natario warp drives we find that the occurrence of significant energy condition violations is not just a high-speed effect, but that the violations persist even at arbitrarily low speeds. A particularly interesting feature of this construction is that it is now meaningful to think of placing a finite mass spaceship at the centre of the warp bubble, and then see how the energy in the warp field compares with the mass-energy of the spaceship. There is no hope of doing this in Alcubierre's original version of the warp field, since by definition the point at the centre of the warp bubble moves on a geodesic and is 'massless'. That is, in Alcubierre's original formalism and in the Natario formalism the spaceship is always treated as a test particle, while in the linearized theory we can treat the spaceship as a finite mass object. For both the Alcubierre and Natario warp drives we find that even at low speeds the net (negative) energy stored in the warp fields must be a significant fraction of the mass of the spaceship

  15. Collaborative Filtering: Fundamental Limits and Good Practices

    Indian Academy of Sciences (India)

    [Slide excerpt: examples of recommender-system settings (shopping, services, travel, events, media, social networks, web search) that suggest related items, e.g. "people who bought this also bought...", Google News, movie recommendation from a user rating matrix, Facebook; the analysis considers the limit in which the number of rows and columns of the rating matrix tends to infinity.]

  16. The "Food Polymer Science" approach to the practice of industrial R&D, leading to patent estates based on fundamental starch science and technology.

    Science.gov (United States)

    Slade, Louise; Levine, Harry

    2018-04-13

    This article reviews the application of the "Food Polymer Science" approach to the practice of industrial R&D, leading to patent estates based on fundamental starch science and technology. The areas of patents and patented technologies reviewed here include: (a) soft-from-the-freezer ice creams and freezer-storage-stable frozen bread dough products, based on "cryostabilization technology" of frozen foods, utilizing commercial starch hydrolysis products (SHPs); (b) glassy-matrix encapsulation technology for flavors and other volatiles, based on structure-function relationships for commercial SHPs; (c) production of stabilized whole-grain wheat flours for biscuit products, based on the application of "solvent retention capacity" technology to develop flours with reduced damaged starch; (d) production of improved-quality, low-moisture cookies and crackers, based on pentosanase enzyme technology; (e) production of "baked-not-fried," chip-like, starch-based snack products, based on the use of commercial modified-starch ingredients with selected functionality; (f) accelerated staling of a starch-based food product from baked bread crumb, based on the kinetics of starch retrogradation, treated as a crystallization process for a partially crystalline glassy polymer system; and (g) a process for producing an enzyme-resistant starch, for use as a reduced-calorie flour replacer in a wide range of grain-based food products, including cookies, extruded expanded snacks, and breakfast cereals.

  17. Scientific and technological fundamentals

    International Nuclear Information System (INIS)

    Roethemeyer, H.

    1991-01-01

    Specific ultimate repositories in a given geological formation have to be assessed on the basis of a safety analysis, taking into account the site specifics of the repository system 'Overall geological situation - ultimate disposal facility - waste forms'. The fundamental possibilities and limits of waste disposal are outlined. Orientation values up to about 10^6 years are derived for the isolation potential of ultimate disposal mines, and about 10^4 years for the calculation of effects of emplaced radioactive wastes also on man. (DG) [de]

  18. DOE fundamentals handbook: Chemistry

    International Nuclear Information System (INIS)

    1993-01-01

    The Chemistry Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of chemistry. The handbook includes information on the atomic structure of matter; chemical bonding; chemical equations; chemical interactions involved with corrosion processes; water chemistry control, including the principles of water treatment; the hazards of chemicals and gases; and basic gaseous diffusion processes. This information will provide personnel with a foundation for understanding the chemical properties of materials and the way these properties can impose limitations on the operation of equipment and systems.

  19. Fundamentals of Structural Geology

    Science.gov (United States)

    Pollard, David D.; Fletcher, Raymond C.

    2005-09-01

    Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. It is supported by a website hosting images from the book, additional colour images, student exercises and MATLAB scripts. Solutions to the exercises are available to instructors. The book integrates field mapping using modern technology with the analysis of structures based on a complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The supplementary website hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations. The textbook website also includes student exercises designed to instill the fundamental relationships, and to encourage the visualization of the evolution of geological structures; solutions are available to instructors.

  20. Mathematical analysis fundamentals

    CERN Document Server

    Bashirov, Agamirza

    2014-01-01

    The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to advanced coursework. The curriculum of all mathematics (pure or applied) and physics programs includes a compulsory course in mathematical analysis. This book can serve as a main textbook for such (one-semester) courses. The book can also serve as additional reading for such courses as real analysis, functional analysis, harmonic analysis etc. For non-math major students requiring math beyond calculus, this is a more friendly approach than many math-centric o

  1. Fundamentals of Project Management

    CERN Document Server

    Heagney, Joseph

    2011-01-01

    With sales of more than 160,000 copies, Fundamentals of Project Management has helped generations of project managers navigate the ins and outs of every aspect of this complex discipline. Using a simple step-by-step approach, the book is the perfect introduction to project management tools, techniques, and concepts. Readers will learn how to: • Develop a mission statement, vision, goals, and objectives • Plan the project • Create the work breakdown structure • Produce a workable schedule • Understand earned value analysis • Manage a project team • Control and evaluate progress at every stage.

  2. Fundamentals of Cavitation

    CERN Document Server

    Franc, Jean-Pierre

    2005-01-01

    The present book is aimed at providing a comprehensive presentation of cavitation phenomena in liquid flows. It is further backed up by the experience, both experimental and theoretical, of the authors whose expertise has been internationally recognized. A special effort is made to place the various methods of investigation in strong relation with the fundamental physics of cavitation, enabling the reader to treat specific problems independently. Furthermore, it is hoped that a better knowledge of the cavitation phenomenon will allow engineers to create systems using it positively. Examples in the literature show the feasibility of this approach.

  3. Physiology-based modelling approaches to characterize fish habitat suitability: Their usefulness and limitations

    Science.gov (United States)

    Teal, Lorna R.; Marras, Stefano; Peck, Myron A.; Domenici, Paolo

    2018-02-01

    empirical relationships which can be found consistently within physiological data across the animal kingdom. The advantages of the DEB models are that they make use of the generalities found in terms of animal physiology and can therefore be applied to species for which little data or empirical observations are available. In addition, the limitations as well as useful potential refinements of these and other physiology-based modelling approaches are discussed. Inclusion of the physiological response of various life stages and modelling the patterns of extreme events observed in nature are suggested for future work.

  4. Serious limitations of the QTL/Microarray approach for QTL gene discovery

    Directory of Open Access Journals (Sweden)

    Warden Craig H

    2010-07-01

    Background: It has been proposed that the use of gene expression microarrays in nonrecombinant parental or congenic strains can accelerate the process of isolating individual genes underlying quantitative trait loci (QTL). However, the effectiveness of this approach has not been assessed. Results: Thirty-seven studies that have implemented the QTL/microarray approach in rodents were reviewed. About 30% of studies showed enrichment for QTL candidates, mostly in comparisons between congenic and background strains. Three studies led to the identification of an underlying QTL gene. To complement the literature results, a microarray experiment was performed using three mouse congenic strains isolating the effects of at least 25 biometric QTL. Results show that genes in the congenic donor regions were preferentially selected. However, within donor regions, the distribution of differentially expressed genes was homogeneous once gene density was accounted for. Genes within identical-by-descent (IBD) regions were less likely to be differentially expressed in chromosome 2, but not in chromosomes 11 and 17. Furthermore, expression of QTL regulated in cis (cis-eQTL) showed higher expression in the background genotype, which was partially explained by the presence of single nucleotide polymorphisms (SNPs). Conclusions: The literature shows limited successes from the QTL/microarray approach to identify QTL genes. Our own results from microarray profiling of three congenic strains revealed a strong tendency to select cis-eQTL over trans-eQTL. IBD regions had little effect on the rate of differential expression, and we provide several reasons why IBD should not be used to discard eQTL candidates. In addition, mismatch probes produced false cis-eQTL that could not be completely removed with the current strain genotypes and low-probe-density microarrays. The reviewed studies did not account for lack of coverage from the platforms used and therefore removed genes

  5. STEP and fundamental physics

    Science.gov (United States)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-09-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels.

  6. STEP and fundamental physics

    International Nuclear Information System (INIS)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-01-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels. (paper)

  7. Laparoscopic approach of hepatic hydatid double cyst in pediatric patient: difficulties, indications and limitations

    Directory of Open Access Journals (Sweden)

    Isabela M. Drăghici

    2016-05-01

    Purpose: Laparoscopic management analysis of a rare condition having potentially severe evolution, seen in pediatric surgical pathology. Aims: Outlining the optimal surgical approach method for hepatic hydatid double cysts and the laparoscopic method's limitations. Materials and Methods: The patient is a 6-year-old girl who presented with two simultaneous giant hepatic hydatid cysts (segments VII-VIII), in close vicinity to the right branch of the portal vein and to the hepatic veins; she underwent a single-stage partial pericystectomy (Lagrot technique) performed by laparoscopy. Results: The procedure had no intraoperative accidents or incidents. The postoperative evolution was good, without immediate or late complications. Trocar positioning was adapted to the patient's size and cyst topography. Conclusions: The laparoscopic treatment is feasible and safe, but it is not yet the gold standard for hepatic hydatid disease due to certain inconveniences.

  8. Limitations Of The Current State Space Modelling Approach In Multistage Machining Processes Due To Operation Variations

    Science.gov (United States)

    Abellán-Nebot, J. V.; Liu, J.; Romero, F.

    2009-11-01

    The State Space modelling approach has been recently proposed as an engineering-driven technique for part quality prediction in Multistage Machining Processes (MMP). Current State Space models incorporate fixture and datum variations in the multi-stage variation propagation, without explicitly considering common operation variations such as machine-tool thermal distortions, cutting-tool wear, cutting-tool deflections, etc. This paper shows the limitations of the current State Space model through an experimental case study where the effect of the spindle thermal expansion, cutting-tool flank wear and locator errors are introduced. The paper also discusses the extension of the current State Space model to include operation variations and its potential benefits.
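    As a generic illustration of the state-space form this record builds on (a sketch with assumed matrices, not the paper's calibrated model), part deviations x_k after stage k can be propagated as x_k = A_k x_{k-1} + B_k u_k + w_k, where u_k collects fixture/datum errors, and in the proposed extension also operation errors such as thermal drift and tool wear, and w_k is unmodelled noise.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative 2-stage propagation of a 3-element deviation vector:
# x_k = A_k x_{k-1} + B_k u_k + w_k   (all matrices and values are placeholders)
A = [np.eye(3), np.array([[1.0, 0.1, 0.0],
                          [0.0, 1.0, 0.0],
                          [0.0, 0.0, 1.0]])]
B = [np.eye(3), np.eye(3)]
u = [np.array([0.02, -0.01, 0.00]),   # stage-1 fixture/datum + operation errors (mm)
     np.array([0.00,  0.03, 0.01])]   # stage-2 errors, e.g. thermal drift, tool wear

x = np.zeros(3)                        # incoming stock assumed nominal
for k in range(2):
    w = rng.normal(scale=0.005, size=3)           # unmodelled noise
    x = A[k] @ x + B[k] @ u[k] + w
    print(f"stage {k + 1}: deviation = {np.round(x, 4)}")
```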

  9. Estimated effect of an integrated approach to suspected deep venous thrombosis using limited-compression ultrasound.

    Science.gov (United States)

    Poley, Rachel A; Newbigging, Joseph L; Sivilotti, Marco L A

    2014-09-01

    Deep vein thrombosis (DVT) is both common and serious, yet the desire to never miss the diagnosis, coupled with the low specificity of D-dimer testing, results in high imaging rates, return visits, and empirical anticoagulation. The objective of this study was to evaluate a new approach incorporating bedside limited-compression ultrasound (LC US) by emergency physicians (EPs) into the workup strategy for DVT. This was a cross-sectional observational study of emergency department (ED) patients with suspected DVT. Patients on anticoagulants; those with chronic DVT, leg cast, or amputation; or when the results of comprehensive imaging were already known were excluded. All patients were treated in the usual fashion based on the protocol in use at the center, including comprehensive imaging based on the modified Wells score and serum D-dimer testing. Seventeen physicians were trained and performed LC US in all subjects. The authors identified a priori an alternate workup strategy in which DVT would be ruled out in "DVT unlikely" (Wells score return visits for imaging and 10 (4.4%) cases of unnecessary anticoagulation. In 19% of cases, the treating and scanning physician disagreed whether the patient was DVT likely or DVT unlikely based on Wells score (κ = 0.62; 95% CI = 0.48 to 0.77). Limited-compression US holds promise as one component of the diagnostic approach to DVT, but should not be used as a stand-alone test due to imperfect sensitivity. Tradeoffs in diagnostic efficiency for the sake of perfect sensitivity remain a difficult issue collectively in emergency medicine (EM), but need to be scrutinized carefully in light of the costs of overinvestigation, delays in diagnosis, and risks of empirical anticoagulation. © 2014 by the Society for Academic Emergency Medicine.

  10. Risk newsboy: approach for addressing uncertainty in developing action levels and cleanup limits

    International Nuclear Information System (INIS)

    Cooke, Roger; MacDonell, Margaret

    2007-01-01

    Site cleanup decisions involve developing action levels and residual limits for key contaminants, to assure health protection during the cleanup period and into the long term. Uncertainty is inherent in the toxicity information used to define these levels, based on incomplete scientific knowledge regarding dose-response relationships across various hazards and exposures at environmentally relevant levels. This problem can be addressed by applying principles used to manage uncertainty in operations research, as illustrated by the newsboy dilemma. Each day a newsboy must balance the risk of buying more papers than he can sell against the risk of not buying enough. Setting action levels and cleanup limits involves a similar concept of balancing and distributing risks and benefits in the face of uncertainty. The newsboy approach can be applied to develop health-based target concentrations for both radiological and chemical contaminants, with stakeholder input being crucial to assessing 'regret' levels. Associated tools include structured expert judgment elicitation to quantify uncertainty in the dose-response relationship, and mathematical techniques such as probabilistic inversion and iterative proportional fitting. (authors)
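    For context, the newsvendor (newsboy) optimum referred to above has a standard closed form: with a unit regret c_u for setting a limit too high (under-protection) and c_o for setting it too low (over-cleanup), the limit that minimizes expected regret sits at the c_o/(c_u + c_o) quantile of the uncertainty distribution for the truly protective concentration. The sketch below is a minimal numerical illustration with made-up regrets and an assumed lognormal uncertainty distribution, not the paper's expert-elicited inputs.

```python
from scipy import stats

# Made-up unit regrets: c_u for under-protection (limit above the truly safe
# level), c_o for over-cleanup (limit below it).
c_u, c_o = 4.0, 1.0

# Assumed lognormal uncertainty about the truly safe concentration (arbitrary units).
safe_conc = stats.lognorm(s=0.6, scale=1.0)

# Newsvendor logic: minimize E[c_u*(L - S)+ + c_o*(S - L)+] over the limit L.
# The optimum sits at the c_o/(c_u + c_o) quantile of S, the mirror image of the
# classical critical fractile c_u/(c_u + c_o) used for order quantities.
target_quantile = c_o / (c_u + c_o)
limit = safe_conc.ppf(target_quantile)
print(f"target quantile = {target_quantile:.2f}, cleanup limit ≈ {limit:.3f}")
```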

  11. An approach to criteria, design limits and monitoring in nuclear fuel waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, G R; Baumgartner, P; Bird, G A; Davison, C C; Johnson, L H; Tamm, J A

    1994-12-01

    The Nuclear Fuel Waste Management Program has been established to develop and demonstrate the technology for safe geological disposal of nuclear fuel waste. One objective of the program is to show that a disposal system (i.e., a disposal centre and associated transportation system) can be designed and that it would be safe. Therefore the disposal system must be shown to comply with safety requirements specified in guidelines, standards, codes and regulations. The components of the disposal system must also be shown to operate within the limits specified in their design. Compliance and performance of the disposal system would be assessed on a site-specific basis by comparing estimates of the anticipated performance of the system and its components with compliance or performance criteria. A monitoring program would be developed to consider the effects of the disposal system on the environment and would include three types of monitoring: baseline monitoring, compliance monitoring, and performance monitoring. This report presents an approach to establishing compliance and performance criteria, limits for use in disposal system component design, and the main elements of a monitoring program for a nuclear fuel waste disposal system. (author). 70 refs., 9 tabs., 13 figs.

  12. An approach to criteria, design limits and monitoring in nuclear fuel waste disposal

    International Nuclear Information System (INIS)

    Simmons, G.R.; Baumgartner, P.; Bird, G.A.; Davison, C.C.; Johnson, L.H.; Tamm, J.A.

    1994-12-01

    The Nuclear Fuel Waste Management Program has been established to develop and demonstrate the technology for safe geological disposal of nuclear fuel waste. One objective of the program is to show that a disposal system (i.e., a disposal centre and associated transportation system) can be designed and that it would be safe. Therefore the disposal system must be shown to comply with safety requirements specified in guidelines, standards, codes and regulations. The components of the disposal system must also be shown to operate within the limits specified in their design. Compliance and performance of the disposal system would be assessed on a site-specific basis by comparing estimates of the anticipated performance of the system and its components with compliance or performance criteria. A monitoring program would be developed to consider the effects of the disposal system on the environment and would include three types of monitoring: baseline monitoring, compliance monitoring, and performance monitoring. This report presents an approach to establishing compliance and performance criteria, limits for use in disposal system component design, and the main elements of a monitoring program for a nuclear fuel waste disposal system. (author). 70 refs., 9 tabs., 13 figs

  13. Analysis of enamel development using murine model systems: approaches and limitations.

    Directory of Open Access Journals (Sweden)

    Megan K Pugach

    2014-09-01

    A primary goal of enamel research is to understand and potentially treat or prevent enamel defects related to amelogenesis imperfecta (AI). Rodents are ideal models to assist our understanding of how enamel is formed because they are easily genetically modified, and their continuously erupting incisors display all stages of enamel development and mineralization. While numerous methods have been developed to generate and analyze genetically modified rodent enamel, it is crucial to understand the limitations and challenges associated with these methods in order to draw appropriate conclusions that can be applied translationally, to AI patient care. We have highlighted methods involved in generating and analyzing rodent enamel and potential approaches to overcoming limitations of these methods: (1) generating transgenic, knockout and knockin mouse models, and (2) analyzing rodent enamel mineral density and functional properties (structure, mechanics) of mature enamel. There is a need for a standardized workflow to analyze enamel phenotypes in rodent models so that investigators can compare data from different studies. These methods include analyses of gene and protein expression, developing enamel histology, enamel pigment, degree of mineralization, enamel structure and mechanical properties. Standardization of these methods with regard to stage of enamel development and sample preparation is crucial, and ideally investigators can use correlative and complementary techniques with the understanding that developing mouse enamel is dynamic and complex.

  14. A qualitative risk assessment approach for Swiss dairy products: opportunities and limitations.

    Science.gov (United States)

    Menéndez González, S; Hartnack, S; Berger, T; Doherr, M; Breidenbach, E

    2011-05-01

    Switzerland implemented a risk-based monitoring of Swiss dairy products in 2002 based on a risk assessment (RA) that considered the probability of exceeding a microbiological limit value set by law. A new RA was launched in 2007 to review and further develop the previous assessment, and to make recommendations for future risk-based monitoring according to current risks. The resulting qualitative RA was designed to ascertain the risk to human health from the consumption of Swiss dairy products. The products and microbial hazards to be considered in the RA were determined based on a risk profile. The hazards included Campylobacter spp., Listeria monocytogenes, Salmonella spp., Shiga toxin-producing Escherichia coli, coagulase-positive staphylococci and Staphylococcus aureus enterotoxin. The release assessment considered the prevalence of the hazards in bulk milk samples, the influence of the process parameters on the microorganisms, and the influence of the type of dairy. The exposure assessment was linked to the production volume. An overall probability was estimated combining the probabilities of release and exposure for each combination of hazard, dairy product and type of dairy. This overall probability represents the likelihood of a product from a certain type of dairy exceeding the microbiological limit value and being passed on to the consumer. The consequences could not be fully assessed due to lack of detailed information on the number of disease cases caused by the consumption of dairy products. The results were expressed as a ranking of overall probabilities. Finally, recommendations for the design of the risk-based monitoring programme and for filling the identified data gaps were given. The aims of this work were (i) to present the qualitative RA approach for Swiss dairy products, which could be adapted to other settings and (ii) to discuss the opportunities and limitations of the qualitative method. © 2010 Blackwell Verlag GmbH.

  15. Virtual and composite fundamentals in the ERM

    NARCIS (Netherlands)

    Knot, KHW; Sturm, JE

    1999-01-01

    A latent-variable approach is applied to identify the appropriate driving process for fundamental exchange rates in the ERM. From the time-series characteristics of so-called "virtual fundamentals" and "composite fundamentals", a significant degree of mean reversion can be asserted. The relative

  16. FUNDAMENTALS OF BIOMECHANICS

    Directory of Open Access Journals (Sweden)

    Duane Knudson

    2007-09-01

    DESCRIPTION: This book provides a broad and in-depth theoretical and practical description of the fundamental concepts in understanding biomechanics in the qualitative analysis of human movement. PURPOSE: The aim is to bring together up-to-date biomechanical knowledge with expert application knowledge. Extensive referencing for students is also provided. FEATURES: This textbook is divided into 12 chapters within four parts, including a lab activities section at the end. The division is as follows: Part 1 Introduction: 1. Introduction to biomechanics of human movement; 2. Fundamentals of biomechanics and qualitative analysis; Part 2 Biological/Structural Bases: 3. Anatomical description and its limitations; 4. Mechanics of the musculoskeletal system; Part 3 Mechanical Bases: 5. Linear and angular kinematics; 6. Linear kinetics; 7. Angular kinetics; 8. Fluid mechanics; Part 4 Application of Biomechanics in Qualitative Analysis: 9. Applying biomechanics in physical education; 10. Applying biomechanics in coaching; 11. Applying biomechanics in strength and conditioning; 12. Applying biomechanics in sports medicine and rehabilitation. AUDIENCE: This is important reading for both students and educators in the medicine, sport and exercise-related fields. For the researcher and lecturer it would be a helpful guide to plan and prepare more detailed experimental designs or lecture and/or laboratory classes in exercise and sport biomechanics. ASSESSMENT: The text provides a constructive fundamental resource for biomechanics, exercise and sport-related students, teachers and researchers, as well as anyone interested in understanding motion. It is also very useful since it is clearly written and presents several examples of the application of biomechanics to help teach and apply biomechanical variables and concepts, including sport-related ones.

  17. Handling limited datasets with neural networks in medical applications: A small-data approach.

    Science.gov (United States)

    Shaikhina, Torgyn; Khovanova, Natalia A

    2017-01-01

    Single-centre studies in the medical domain are often characterised by limited samples due to the complexity and high costs of patient data collection. Machine learning methods for regression modelling of small datasets (less than 10 observations per predictor variable) remain scarce. Our work bridges this gap by developing a novel framework for application of artificial neural networks (NNs) for regression tasks involving small medical datasets. In order to address the sporadic fluctuations and validation issues that appear in regression NNs trained on small datasets, the methods of multiple runs and surrogate data analysis were proposed in this work. The approach was compared to the state-of-the-art ensemble NNs; the effect of dataset size on NN performance was also investigated. The proposed framework was applied for the prediction of compressive strength (CS) of femoral trabecular bone in patients suffering from severe osteoarthritis. The NN model was able to estimate the CS of osteoarthritic trabecular bone from its structural and biological properties with a standard error of 0.85 MPa. When evaluated on independent test samples, the NN achieved accuracy of 98.3%, outperforming an ensemble NN model by 11%. We reproduce this result on CS data of another porous solid (concrete) and demonstrate that the proposed framework allows for an NN modelled with as few as 56 samples to generalise on 300 independent test samples with 86.5% accuracy, which is comparable to the performance of an NN developed with 18 times larger dataset (1030 samples). The significance of this work is two-fold: the practical application allows for non-destructive prediction of bone fracture risk, while the novel methodology extends beyond the task considered in this study and provides a general framework for application of regression NNs to medical problems characterised by limited dataset sizes. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
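    A minimal sketch of the "multiple runs" idea described above (illustrative only; it uses scikit-learn, synthetic data and made-up hyperparameters, and omits the surrogate data analysis): several small networks are trained from different random initializations and their predictions averaged to damp run-to-run fluctuations on a small dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for a small medical dataset: 56 samples, 4 predictors.
X = rng.normal(size=(56, 4))
y = X @ np.array([0.8, -0.5, 0.3, 0.1]) + 0.1 * rng.normal(size=56)

# "Multiple runs": train many small networks from different random initial weights.
models = []
for seed in range(25):
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                     random_state=seed))
    models.append(net.fit(X, y))

# Average the runs' predictions on new samples.
X_new = rng.normal(size=(5, 4))
pred = np.mean([m.predict(X_new) for m in models], axis=0)
print(np.round(pred, 3))
```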

  18. Experiencing a probabilistic approach to clarify and disclose uncertainties when setting occupational exposure limits.

    Science.gov (United States)

    Vernez, David; Fraize-Frontier, Sandrine; Vincent, Raymond; Binet, Stéphane; Rousselle, Christophe

    2018-03-15

    Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors take into account variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variabilities or exposure duration extrapolation or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the no-observed-adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts and, often, a conservative value is used as a default choice. A probabilistic framework to better take into account uncertainties and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. Each AF is considered as a random variable with a probabilistic distribution. A short literature review was conducted before setting default distribution ranges and shapes for each AF commonly used. A random sampling, using Monte Carlo techniques, is then used for propagating the identified uncertainties and computing the final OEL distribution. Starting from the broad default distributions obtained, experts narrow it to its most likely range, according to the scientific knowledge available for a specific chemical. Introducing distributions rather than single deterministic values allows disclosing and clarifying variability and/or uncertainties inherent to the OEL construction process. This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides a better support in decision-making and improves transparency. This work is available under an Open Access model and licensed under a CC BY-NC 3.0 PL license.
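    A minimal Monte Carlo sketch of the propagation step described above, assuming a made-up point of departure and placeholder distributions for the assessment factors (not the defaults proposed in the paper): each AF is sampled, the candidate OEL is the point of departure divided by the product of the sampled AFs, and percentiles of the resulting distribution are reported.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative point of departure (e.g. a NOAEL) in mg/m^3; the value is made up.
pod = 50.0

# Assessment factors treated as random variables. Distribution families and
# parameters below are placeholders, not the defaults proposed in the paper.
af_interspecies = rng.lognormal(mean=np.log(2.5), sigma=0.4, size=n)
af_intraspecies = rng.lognormal(mean=np.log(3.0), sigma=0.5, size=n)
af_loael_to_noael = rng.uniform(1.0, 3.0, size=n)
af_duration = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)

# Propagate: the candidate OEL is the POD divided by the product of the AFs.
oel = pod / (af_interspecies * af_intraspecies * af_loael_to_noael * af_duration)

for q in (5, 50, 95):
    print(f"{q:>2}th percentile of the OEL distribution: {np.percentile(oel, q):.2f} mg/m^3")
```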

  19. Hankin and Reeves' approach to estimating fish abundance in small streams: limitations and potential options; TOPICAL

    International Nuclear Information System (INIS)

    Thompson, William L.

    2000-01-01

    Hankin and Reeves' (1988) approach to estimating fish abundance in small streams has been applied in stream-fish studies across North America. However, as with any method of population estimation, there are important assumptions that must be met for estimates to be minimally biased and reasonably precise. Consequently, I investigated effects of various levels of departure from these assumptions via simulation based on results from an example application in Hankin and Reeves (1988) and a spatially clustered population. Coverage of 95% confidence intervals averaged about 5% less than nominal when removal estimates equaled true numbers within sampling units, but averaged 62%-86% less than nominal when they did not, with the exception where detection probabilities of individuals were >0.85 and constant across sampling units (95% confidence interval coverage = 90%). True total abundances averaged far (20%-41%) below the lower confidence limit when not included within intervals, which implies large negative bias. Further, the average coefficient of variation was about 1.5 times higher when removal estimates did not equal true numbers within sampling units (mean CV = 0.27 [SE = 0.0004]) than when they did (mean CV = 0.19 [SE = 0.0002]). A potential modification to Hankin and Reeves' approach is to include environmental covariates that affect detection rates of fish into the removal model or other mark-recapture model. A potential alternative is to use snorkeling in combination with line transect sampling to estimate fish densities. Regardless of the method of population estimation, a pilot study should be conducted to validate the enumeration method, which requires a known (or nearly so) population of fish to serve as a benchmark to evaluate bias and precision of population estimates.
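    The approach rests on removal (depletion) estimates within sampling units. As a reference point, the classical two-pass removal estimator, with catches C1 and C2 and a constant capture probability, is p̂ = 1 − C2/C1 and N̂ = C1²/(C1 − C2); the sketch below applies it to made-up catches (a generic illustration, not necessarily the exact estimator used in the cited studies).

```python
def two_pass_removal(c1: int, c2: int) -> tuple[float, float]:
    """Classical two-pass removal (depletion) estimate of abundance.

    Assumes a closed population and equal capture probability on both passes;
    only valid when the second-pass catch is smaller than the first.
    """
    if c1 <= c2:
        raise ValueError("removal estimator requires c1 > c2")
    p_hat = 1.0 - c2 / c1            # estimated per-pass capture probability
    n_hat = c1 ** 2 / (c1 - c2)      # estimated abundance in the sampling unit
    return n_hat, p_hat

# Made-up catches from one stream unit: 40 fish on pass 1, 12 on pass 2.
n_hat, p_hat = two_pass_removal(40, 12)
print(f"estimated abundance ≈ {n_hat:.1f}, capture probability ≈ {p_hat:.2f}")
```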

  20. Fundamentals of PIXE analysis

    International Nuclear Information System (INIS)

    Ishii, Keizo

    1997-01-01

    Elemental analysis based on the particle induced x-ray emission (PIXE) is a novel technique to analyze trace elements. It is a very simple method, its sensitivity is very high, multiple elements in a sample can be simultaneously analyzed, and a few tens of micrograms of a sample are enough to be analyzed. Owing to these characteristics, the PIXE analysis is now used in many fields (e.g. biology, medicine, dentistry, environmental pollution, archaeology, culture assets etc.). Fundamentals of the PIXE analysis are described here: the production of characteristic x-rays and inner shell ionization by heavy charged particles, the continuous background in the PIXE spectrum, quantitative formulae of the PIXE analysis, the detection limit of PIXE analysis, etc. (author)

  1. Nonlinear flowering responses to climate: are species approaching their limits of phenological change?

    Science.gov (United States)

    Iler, Amy M.; Høye, Toke T.; Inouye, David W.; Schmidt, Niels M.

    2013-01-01

    Many alpine and subalpine plant species exhibit phenological advancements in association with earlier snowmelt. While the phenology of some plant species does not advance beyond a threshold snowmelt date, the prevalence of such threshold phenological responses within plant communities is largely unknown. We therefore examined the shape of flowering phenology responses (linear versus nonlinear) to climate using two long-term datasets from plant communities in snow-dominated environments: Gothic, CO, USA (1974–2011) and Zackenberg, Greenland (1996–2011). For a total of 64 species, we determined whether a linear or nonlinear regression model best explained interannual variation in flowering phenology in response to increasing temperatures and advancing snowmelt dates. The most common nonlinear trend was for species to flower earlier as snowmelt advanced, with either no change or a slower rate of change when snowmelt was early (average 20% of cases). By contrast, some species advanced their flowering at a faster rate over the warmest temperatures relative to cooler temperatures (average 5% of cases). Thus, some species seem to be approaching their limits of phenological change in response to snowmelt but not temperature. Such phenological thresholds could either be a result of minimum springtime photoperiod cues for flowering or a slower rate of adaptive change in flowering time relative to changing climatic conditions. PMID:23836793

  2. An approach to the derivation of radionuclide intake limits for members of the public

    International Nuclear Information System (INIS)

    Thompson, R.C.

    1980-01-01

    The modification of occupational exposure limits for application to general populations is discussed. First, the permitted radiation dose needs to be modified from that considered appropriate for occupational exposure, to that considered appropriate for the particular general population exposure of concern. This is a problem of optimization and is considered only briefly. The second modification allows for the different physical, biological, and societal parameters applicable to general populations as contrasted with occupational populations. These differences derive from the heterogeneity of the general population particularly in terms of age and state-of-health, as these affect radionuclide deposition, absorption, distribution, and retention, and as they affect basic sensitivity to the development of detrimental effects. Environmental factors will influence physical availability and may alter the chemical and physical form of the radionuclide, and hence biological availability to the general population. Societal factors may modify the potential for exposure of different segments of the general population. This complex modifying factor will be different for each radioelement. The suggested approach is illustrated using plutonium as an example. (H.K.)

  3. A computational approach to achieve situational awareness from limited observations of a complex system

    Science.gov (United States)

    Sherwin, Jason

    At the start of the 21st century, the topic of complexity remains a formidable challenge in engineering, science and other aspects of our world. It seems that when disaster strikes it is because some complex and unforeseen interaction causes the unfortunate outcome. Why did the financial system of the world meltdown in 2008--2009? Why are global temperatures on the rise? These questions and other ones like them are difficult to answer because they pertain to contexts that require lengthy descriptions. In other words, these contexts are complex. But we as human beings are able to observe and recognize this thing we call 'complexity'. Furthermore, we recognize that there are certain elements of a context that form a system of complex interactions---i.e., a complex system. Many researchers have even noted similarities between seemingly disparate complex systems. Do sub-atomic systems bear resemblance to weather patterns? Or do human-based economic systems bear resemblance to macroscopic flows? Where do we draw the line in their resemblance? These are the kinds of questions that are asked in complex systems research. And the ability to recognize complexity is not only limited to analytic research. Rather, there are many known examples of humans who, not only observe and recognize but also, operate complex systems. How do they do it? Is there something superhuman about these people or is there something common to human anatomy that makes it possible to fly a plane? Or to drive a bus? Or to operate a nuclear power plant? Or to play Chopin's etudes on the piano? In each of these examples, a human being operates a complex system of machinery, whether it is a plane, a bus, a nuclear power plant or a piano. What is the common thread running through these abilities? The study of situational awareness (SA) examines how people do these types of remarkable feats. It is not a bottom-up science though because it relies on finding general principles running through a host of varied

  4. Fundamentals of Structural Engineering

    CERN Document Server

    Connor, Jerome J

    2013-01-01

    Fundamentals of Structural Engineering provides a balanced, seamless treatment of both classic, analytic methods and contemporary, computer-based techniques for conceptualizing and designing a structure. The book's principal goal is to foster an intuitive understanding of structural behavior based on problem solving experience for students of civil engineering and architecture who have been exposed to the basic concepts of engineering mechanics and mechanics of materials. Making it distinct from many other undergraduate textbooks, the authors of this text recognize the notion that engineers reason about behavior using simple models and intuition they acquire through problem solving. The approach adopted in this text develops this type of intuition by presenting extensive, realistic problems and case studies together with computer simulation, which allows rapid exploration of how a structure responds to changes in geometry and physical parameters. This book also: Emphasizes problem-based understanding of...

  5. Fundamentals of sustainable neighbourhoods

    CERN Document Server

    Friedman, Avi

    2015-01-01

    This book introduces architects, engineers, builders, and urban planners to a range of design principles of sustainable communities and illustrates them with outstanding case studies. Drawing on the author’s experience as well as local and international case studies, Fundamentals of Sustainable Neighbourhoods presents planning concepts that minimize developments' carbon footprint through compact communities, adaptable and expandable dwellings, adaptable landscapes, and smaller-sized yet quality-designed housing. This book also: Examines in-depth global strategies for minimizing the residential carbon footprint, including district heating, passive solar gain, net-zero residences, as well as preserving the communities' natural assets Reconsiders conceptual approaches in building design and urban planning to promote a better connection between communities and nature Demonstrates practical applications of green architecture Focuses on innovative living spaces in urban environments

  6. Calibrating Fundamental British Values: How Head Teachers Are Approaching Appraisal in the Light of the Teachers' Standards 2012, Prevent and the Counter-Terrorism and Security Act, 2015

    Science.gov (United States)

    Revell, Lynn; Bryan, Hazel

    2016-01-01

    In requiring that teachers should "not undermine fundamental British values (FBV)," a phrase originally articulated in the Home Office counter-terrorism document, Prevent, the Teachers' Standards has brought into focus the nature of teacher professionalism. Teachers in England are now required to promote FBV within and outside school,…

  7. A Weakest-Link Approach for Fatigue Limit of 30CrNiMo8 Steels (Preprint)

    Science.gov (United States)

    2011-03-01

    [Report documentation residue; recoverable details only: report AFRL-RX-WP-TP-2011-4206, "A Weakest-Link Approach for Fatigue Limit of 30CrNiMo8 Steels (Preprint)", by S. Ekwaro-Osire and H.V. Kulkarni, Texas..., March 2011; contract: in-house. The fragment also cites "Application of a Weakest-Link Concept to the Fatigue Limit of the Bearing Steel SAE 52100 in a Bainitic Condition," Fatigue and Fracture of...]

  8. Exchange Rates and Fundamentals.

    Science.gov (United States)

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I(1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…

  9. Price Limit and Volatility in Taiwan Stock Exchange: Some Additional Evidence from the Extreme Value Approach

    OpenAIRE

    Aktham I. Maghyereh; Haitham A. Al Zoubi; Haitham Nobanee

    2007-01-01

    We reexamine the effects of price limits on stock volatility of Taiwan Stock Exchange using a new methodology based on the Extreme-Value technique. Consistent with the advocates of price limits, we find that stock market volatility is sharply moderated under more restrictive price limits.

  10. A new approach to define acceptance limits for hematology in external quality assessment schemes.

    Science.gov (United States)

    Soumali, Mohamed Rida; Van Blerk, Marjan; Akharif, Abdelhadi; Albarède, Stéphanie; Kesseler, Dagmar; Gutierrez, Gabriela; de la Salle, Barbara; Plum, Inger; Guyard, Anne; Favia, Ana Paula; Coucke, Wim

    2017-10-26

    A study performed in 2007 comparing the evaluation procedures used in European external quality assessment schemes (EQAS) for hemoglobin and leukocyte concentrations showed that acceptance criteria vary widely. For this reason, the Hematology working group from the European Organisation for External Quality Assurance Providers in Laboratory Medicine (EQALM) decided to perform a statistical study with the aim of establishing appropriate acceptance limits (AL) allowing harmonization between the evaluation procedures of European EQAS organizers. Eight EQAS organizers from seven European countries provided their hematology survey results from 2010 to 2012 for red blood cells (RBC), hemoglobin, hematocrit, mean corpuscular volume (MCV), white blood cells (WBC), platelets and reticulocytes. More than 440,000 data points were collected. The relation between the absolute value of the relative differences between reported EQA results and their corresponding assigned value (U-scores) was modeled by means of an adaptation of Thompson's "characteristic function". Quantile regression was used to investigate the percentiles of the U-scores for each target concentration range. For deriving ALs, focus was mainly on the upper percentiles (90th, 95th and 99th). For RBC, hemoglobin, hematocrit and MCV, no relation was found between the U-scores and the target concentrations for any of the percentiles. For WBC, platelets and reticulocytes, a relation with the target concentrations was found and concentration-dependent ALs were determined. The approach enabled the determination of state-of-the-art-based ALs that were concentration-dependent when necessary and usable by various EQA providers. It could also easily be applied to other domains.
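    A simplified stand-in for the procedure described above (synthetic data; binned percentiles in place of the quantile regression actually used): compute U-scores as absolute relative differences from the assigned value, then take an upper percentile per concentration range as the candidate acceptance limit.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic EQA results: assigned target concentrations and reported values whose
# relative scatter shrinks as concentration grows (all values are made up).
target = rng.uniform(50, 400, size=5000)             # e.g. platelets, 10^9/L
reported = target * (1 + rng.normal(scale=0.12 / np.sqrt(target / 50), size=5000))

# U-score: absolute relative difference between reported result and assigned value.
u = np.abs(reported - target) / target

# Simplified stand-in for the quantile regression used in the paper: the 95th
# percentile of the U-scores within each concentration bin.
bins = np.linspace(50, 400, 8)
for lo, hi in zip(bins[:-1], bins[1:]):
    in_bin = (target >= lo) & (target < hi)
    al = np.percentile(u[in_bin], 95)
    print(f"{lo:6.0f}-{hi:6.0f}: candidate acceptance limit ≈ ±{100 * al:.1f} %")
```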

  11. Summary: fundamental interactions and processes

    International Nuclear Information System (INIS)

    Koltun, D.S.

    1982-01-01

    The subjects of the talks of the first day of the workshop are discussed in terms of fundamental interactions, dynamical theory, and relevant degrees of freedom. Some general considerations are introduced and are used to confront the various approaches taken in the earlier talks

  12. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    fundamental assumptions. A recent focus set in the Astrophysical Journal Letters, titled Focus on Exploring Fundamental Physics with Extragalactic Transients, consists of multiple published studies doing just that. Testing General Relativity: Several of the articles focus on the 4th point above. By assuming that the delay in photon arrival times is only due to the gravitational potential of the Milky Way, these studies set constraints on the deviation of our galaxy's gravitational potential from what GR would predict. The study by He Gao et al. uses the different photon arrival times from gamma-ray bursts to set constraints at eV-GeV energies, and the study by Jun-Jie Wei et al. complements this by setting constraints at keV-TeV energies using photons from high-energy blazar emission. [Figure caption: photons or neutrinos from different extragalactic transients each set different upper limits on delta gamma, the post-Newtonian parameter, versus particle energy or frequency; this tests Einstein's equivalence principle, since if the principle is correct delta gamma is exactly zero, meaning that photons of different energies move at the same velocity through a vacuum. Tingay & Kaplan 2016.] S.J. Tingay and D.L. Kaplan make the case that measuring the time delay of photons from fast radio bursts (FRBs; transient radio pulses that last only a few milliseconds) will provide even tighter constraints if we are able to accurately determine distances to these FRBs. And Adi Musser argues that the large-scale structure of the universe plays an even greater role than the Milky Way gravitational potential, allowing for even stricter testing of Einstein's equivalence principle. The ever-narrower constraints from these studies all support GR as a correct set of rules through which to interpret our universe. Other Tests of Fundamental Physics: In addition to the above tests, Xue-Feng Wu et al. show that FRBs can be used to provide severe constraints on the rest mass of the photon, and S. Croft et al. even touches on what we

  13. The fundamentals of mathematical analysis

    CERN Document Server

    Fikhtengol'ts, G M

    1965-01-01

    The Fundamentals of Mathematical Analysis, Volume 1 is a textbook that provides a systematic and rigorous treatment of the fundamentals of mathematical analysis. Emphasis is placed on the concept of limit which plays a principal role in mathematical analysis. Examples of the application of mathematical analysis to geometry, mechanics, physics, and engineering are given. This volume is comprised of 14 chapters and begins with a discussion on real numbers, their properties and applications, and arithmetical operations over real numbers. The reader is then introduced to the concept of function, i

  14. Chandrasekhar Limit: An Elementary Approach Based on Classical Physics and Quantum Theory

    Science.gov (United States)

    Pinochet, Jorge; Van Sint Jan, Michael

    2016-01-01

    In a brief article published in 1931, Subrahmanyan Chandrasekhar made public an important astronomical discovery. In his article, the then young Indian astrophysicist introduced what is now known as the "Chandrasekhar limit." This limit establishes the maximum mass of a stellar remnant beyond which the repulsion force between electrons…

  15. Islamic fundamentalism in Indonesia

    OpenAIRE

    Nagy, Sandra L.

    1996-01-01

    This is a study of Islamic fundamentalism in Indonesia. Islamic fundamentalism is defined as the return to the foundations and principles of Islam including all movements based on the desire to create a more Islamic society. After describing the practices and beliefs of Islam, this thesis examines the three aspects of universal Islamic fundamentalism: revivalism, resurgence, and radicalism. It analyzes the role of Islam in Indonesia under Dutch colonial rule, an alien Christian imperialist po...

  16. A neoteric multidrug combination: novel approach to limited cutaneous systemic scleroderma involving the face

    Science.gov (United States)

    Kumar, M Hari; Kumar, M Siva; Kumar, Sabitha Hari; Kumar, Kingsly Selva

    2016-01-01

    Limited cutaneous scleroderma is a subtype of scleroderma limited to the skin of the face, hands, feet and forearms. We present a case of a 45-year-old woman affected by limited cutaneous systemic scleroderma involving the orofacial region and causing restricted mouth opening. The patient showed noteworthy improvement of the skin lesion by use of a combination of intralesional corticosteroid with hyaluronidase and various multiantioxidants, resulting in amelioration of her mouth opening problem. The patient gave her full informed written consent to this report being published. PMID:27033280

  17. A neoteric multidrug combination: novel approach to limited cutaneous systemic scleroderma involving the face.

    Science.gov (United States)

    Kumar, M Hari; Kumar, M Siva; Kumar, Sabitha Hari; Kumar, Kingsly Selva

    2016-03-31

    Limited cutaneous scleroderma is a subtype of scleroderma limited to the skin of the face, hands, feet and forearms. We present a case of a 45-year-old woman affected by limited cutaneous systemic scleroderma involving the orofacial region and causing restricted mouth opening. The patient showed noteworthy improvement of the skin lesion by use of a combination of intralesional corticosteroid with hyaluronidase and various multiantioxidants, resulting in amelioration of her mouth opening problem. The patient gave her full informed written consent to this report being published. 2016 BMJ Publishing Group Ltd.

  18. Fundamentals of gas dynamics

    CERN Document Server

    Babu, V

    2014-01-01

    Fundamentals of Gas Dynamics, Second Edition is a comprehensively updated new edition and now includes a chapter on the gas dynamics of steam. It covers the fundamental concepts and governing equations of different flows, and includes end-of-chapter exercises based on practical applications. A number of useful tables on the thermodynamic properties of steam are also included. Fundamentals of Gas Dynamics, Second Edition begins with an introduction to compressible and incompressible flows before covering the fundamentals of one-dimensional flows and normal shock wav

  19. Limitations of implementing sustainable construction principles in the conventional South African design approach

    CSIR Research Space (South Africa)

    Sebake, TN

    2008-06-01

    professionals, particularly by architects, in the implementation of sustainability principles in the development of building projects. The aim of the paper is to highlight the limitations of introducing sustainability aspects into the existing South African...

  20. Fundamental Safety Principles

    International Nuclear Information System (INIS)

    Abdelmalik, W.E.Y.

    2011-01-01

    This work presents a summary of the IAEA Safety Standards Series publication No. SF-1, entitled "Fundamental Safety Principles", published in 2006. This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purposes. Safety measures and security measures have in common the aim of protecting human life and health and the environment. These safety principles are: 1) Responsibility for safety, 2) Role of the government, 3) Leadership and management for safety, 4) Justification of facilities and activities, 5) Optimization of protection, 6) Limitation of risks to individuals, 7) Protection of present and future generations, 8) Prevention of accidents, 9) Emergency preparedness and response and 10) Protective action to reduce existing or unregulated radiation risks. The safety principles concern the security of facilities and activities to the extent that they apply to measures that contribute to both safety and security. Safety measures and security measures must be designed and implemented in an integrated manner so that security measures do not compromise safety and safety measures do not compromise security.

  1. Modeling the evolution of natural cliffs subject to weathering. 1, Limit analysis approach

    OpenAIRE

    Utili, Stefano; Crosta, Giovanni B.

    2011-01-01

    Retrogressive landsliding evolution of natural slopes subjected to weathering has been modeled by assuming Mohr-Coulomb material behavior and by using an analytical method. The case of weathering-limited slope conditions, with complete erosion of the accumulated debris, has been modeled. The limit analysis upper-bound method is used to study slope instability induced by a homogeneous decrease of material strength in space and time. The only assumption required in the model concerns the degree...

  2. DNA isolation protocols affect the detection limit of PCR approaches of bacteria in samples from the human gastrointestinal tract

    NARCIS (Netherlands)

    Zoetendal, E.G.; Ben-Amor, K.; Akkermans, A.D.L.; Abee, T.; Vos, de W.M.

    2001-01-01

    A major concern in molecular ecological studies is the lysis efficiency of different bacteria in a complex ecosystem. We used a PCR-based 16S rDNA approach to determine the effect of two DNA isolation protocols (i.e. the bead beating and Triton-X100 method) on the detection limit of seven

  3. Challenges and Limitations of Applying an Emotion-driven Design Approach on Elderly Users

    DEFF Research Database (Denmark)

    Andersen, Casper L.; Gudmundsson, Hjalte P.; Achiche, Sofiane

    2011-01-01

    a competitive advantage for companies. In this paper, challenges of applying an emotion-driven design approach applied on elderly people, in order to identify their user needs towards walking frames, are discussed. The discussion will be based on the experiences and results obtained from the case study...... related to the participants’ age and cognitive abilities. The challenges encountered are discussed and guidelines on what should be taken into account to facilitate an emotion-driven design approach for elderly people are proposed....

  4. A risk modelling approach for setting microbiological limits using enterococci as indicator for growth potential of Salmonella in pork

    DEFF Research Database (Denmark)

    Bollerslev, Anne Mette; Nauta, Maarten; Hansen, Tina Beck

    2017-01-01

    Microbiological limits are widely used in food processing as an aid to reduce the exposure to hazardous microorganisms for the consumers. However, in pork, the prevalence and concentrations of Salmonella are generally low and microbiological limits are not considered an efficient tool to support...... for this purpose includes the dose-response relationship for Salmonella and a reduction factor to account for preparation of the fresh pork. By use of the risk model, it was estimated that the majority of salmonellosis cases, caused by the consumption of pork in Denmark, is caused by the small fraction of pork... products that has enterococci concentrations above 5 log CFU/g. This illustrates that our approach can be used to evaluate the potential effect of different microbiological limits and therefore, the perspective of this novel approach is that it can be used for definition of a risk-based microbiological...

  5. Fundamental neutron physics

    International Nuclear Information System (INIS)

    Deslattes, R.; Dombeck, T.; Greene, G.; Ramsey, N.; Rauch, H.; Werner, S.

    1984-01-01

    Fundamental physics experiments of merit can be conducted at the proposed intense neutron sources. Areas of interest include: neutron particle properties, neutron wave properties, and fundamental physics utilizing reactor produced γ-rays. Such experiments require intense, full-time utilization of a beam station for periods ranging from several months to a year or more

  6. Chandrasekhar limit: an elementary approach based on classical physics and quantum theory

    Science.gov (United States)

    Pinochet, Jorge; Van Sint Jan, Michael

    2016-05-01

    In a brief article published in 1931, Subrahmanyan Chandrasekhar made public an important astronomical discovery. In his article, the then young Indian astrophysicist introduced what is now known as the Chandrasekhar limit. This limit establishes the maximum mass of a stellar remnant beyond which the repulsion force between electrons due to the exclusion principle can no longer stop the gravitational collapse. In the present article, we create an elemental approximation to the Chandrasekhar limit, accessible to non-graduate science and engineering students. The article focuses especially on clarifying the origins of Chandrasekhar’s discovery and the underlying physical concepts. Throughout the article, only basic algebra is used as well as some general notions of classical physics and quantum theory.
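
    For reference, the limiting mass that such an elementary treatment aims to recover is usually quoted in the standard textbook form below; it is stated here for orientation and is not reproduced from the article itself:

```latex
M_{\mathrm{Ch}}
  = \frac{\omega_3^0 \sqrt{3\pi}}{2}
    \left(\frac{\hbar c}{G}\right)^{3/2} \frac{1}{(\mu_e m_{\mathrm{H}})^2}
  \;\approx\; \frac{5.7}{\mu_e^{2}}\, M_{\odot}
  \;\approx\; 1.4\, M_{\odot} \quad (\mu_e = 2)
```

    Here $\omega_3^0 \approx 2.018$ is the Lane-Emden constant for a polytrope of index 3, $\mu_e$ the mean molecular weight per electron and $m_{\mathrm{H}}$ the hydrogen mass.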

  7. Developing Guided Worksheet for Cognitive Apprenticeship Approach in teaching Formal Definition of The Limit of A Function

    Science.gov (United States)

    Oktaviyanthi, R.; Dahlan, J. A.

    2018-04-01

    This study aims to develop student worksheets that correspond to the Cognitive Apprenticeship learning approach. The main subject in this student worksheet is Functions and Limits with the branch of the main subject is Continuity and Limits of Functions. There are two indicators of the achievement of this learning that are intended to be developed in the student worksheet (1) the student can explain the concept of limit by using the formal definition of limit and (2) the student can evaluate the value of limit of a function using epsilon and delta. The type of research used is development research that refers to the development of Plomp products. The research flow starts from literature review, observation, interviews, work sheet design, expert validity test, and limited trial on first-year students in academic year 2016-2017 in Universitas Serang Raya, STKIP Pelita Pratama Al-Azhar Serang, and Universitas Mathla’ul Anwar Pandeglang. Based on the product development result obtained the student worksheets that correspond to the Cognitive Apprenticeship learning approach are valid and reliable.
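
    For completeness, the formal definition that indicator (1) targets is the standard epsilon-delta statement, given here as background rather than quoted from the worksheet:

```latex
\lim_{x \to a} f(x) = L
\quad \Longleftrightarrow \quad
\forall \varepsilon > 0 \;\, \exists \delta > 0 :\;
0 < |x - a| < \delta \;\Rightarrow\; |f(x) - L| < \varepsilon
```

    Evaluating a limit "using epsilon and delta" (indicator 2) then amounts to exhibiting, for each given epsilon, a concrete delta(epsilon) that satisfies the implication.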

  8. Universality and the approach to the continuum limit in lattice gauge theory

    CERN Document Server

    De Divitiis, G M; Guagnelli, M; Lüscher, Martin; Petronzio, Roberto; Sommer, Rainer; Weisz, P; Wolff, U; de Divitiis, G; Frezzotti, R; Guagnelli, M; Luescher, M; Petronzio, R; Sommer, R; Weisz, P; Wolff, U

    1995-01-01

    The universality of the continuum limit and the applicability of renormalized perturbation theory are tested in the SU(2) lattice gauge theory by computing two different non-perturbatively defined running couplings over a large range of energies. The lattice data (which were generated on the powerful APE computers at Rome II and DESY) are extrapolated to the continuum limit by simulating sequences of lattices with decreasing spacings. Our results confirm the expected universality at all energies to a precision of a few percent. We find, however, that perturbation theory must be used with care when matching different renormalized couplings at high energies.

  9. A large deviations approach to limit theory for heavy-tailed time series

    DEFF Research Database (Denmark)

    Mikosch, Thomas Valentin; Wintenberger, Olivier

    2016-01-01

    and vanishing in some neighborhood of the origin. We study a variety of such functionals, including large deviations of random walks, their suprema, the ruin functional, and further derive weak limit theory for maxima, point processes, cluster functionals and the tail empirical process. One of the main results...

  10. Exercise testing, limitation and training in patients with cystic fibrosis. A personalized approach

    NARCIS (Netherlands)

    Werkman, M.S.

    2014-01-01

    Exercise testing and training are cornerstones in regular CF care. However, no consensus exists in literature about which exercise test protocol should be used for individual patients. Furthermore, divergence exists in insights about both the dominant exercise limiting mechanisms and the

  11. A Spectral Approach for Quenched Limit Theorems for Random Expanding Dynamical Systems

    Science.gov (United States)

    Dragičević, D.; Froyland, G.; González-Tokman, C.; Vaienti, S.

    2018-01-01

    We prove quenched versions of (i) a large deviations principle (LDP), (ii) a central limit theorem (CLT), and (iii) a local central limit theorem for non-autonomous dynamical systems. A key advance is the extension of the spectral method, commonly used in limit laws for deterministic maps, to the general random setting. We achieve this via multiplicative ergodic theory and the development of a general framework to control the regularity of Lyapunov exponents of twisted transfer operator cocycles with respect to a twist parameter. While some versions of the LDP and CLT have previously been proved with other techniques, the local central limit theorem is, to our knowledge, a completely new result, and one that demonstrates the strength of our method. Applications include non-autonomous (piecewise) expanding maps, defined by random compositions of the form $T_{\sigma^{n-1}\omega} \circ \cdots \circ T_{\sigma\omega} \circ T_{\omega}$. An important aspect of our results is that we only assume ergodicity and invertibility of the random driving $\sigma: \Omega \to \Omega$; in particular no expansivity or mixing properties are required.

  12. An Effective Approach to Biomedical Information Extraction with Limited Training Data

    Science.gov (United States)

    Jonnalagadda, Siddhartha

    2011-01-01

    In the current millennium, extensive use of computers and the internet caused an exponential increase in information. Few research areas are as important as information extraction, which primarily involves extracting concepts and the relations between them from free text. Limitations in the size of training data, lack of lexicons and lack of…

  13. On the limitations of standard statistical modeling in biological systems: a full Bayesian approach for biology.

    Science.gov (United States)

    Gomez-Ramirez, Jaime; Sanz, Ricardo

    2013-09-01

    One of the most important scientific challenges today is the quantitative and predictive understanding of biological function. Classical mathematical and computational approaches have been enormously successful in modeling inert matter, but they may be inadequate to address inherent features of biological systems. We address the conceptual and methodological obstacles that lie in the inverse problem in biological systems modeling. We introduce a full Bayesian approach (FBA), a theoretical framework to study biological function, in which probability distributions are conditional on biophysical information that physically resides in the biological system that is studied by the scientist.

  14. An open simulation approach to identify chances and limitations for vulnerable road user (VRU) active safety.

    Science.gov (United States)

    Seiniger, Patrick; Bartels, Oliver; Pastor, Claus; Wisch, Marcus

    2013-01-01

    It is commonly agreed that active safety will have a significant impact on reducing accident figures for pedestrians and probably also bicyclists. However, chances and limitations for active safety systems have only been derived based on accident data and the current state of the art, based on proprietary simulation models. The objective of this article is to investigate these chances and limitations by developing an open simulation model. This article introduces a simulation model, incorporating accident kinematics, driving dynamics, driver reaction times, pedestrian dynamics, performance parameters of different autonomous emergency braking (AEB) generations, as well as legal and logical limitations. The level of detail for available pedestrian accident data is limited. Relevant variables, especially timing of the pedestrian appearance and the pedestrian's moving speed, are estimated using assumptions. The model in this article uses the fact that a pedestrian and a vehicle in an accident must have been in the same spot at the same time and defines the impact position as a relevant accident parameter, which is usually available from accident data. The calculations done within the model identify the possible timing available for braking by an AEB system as well as the possible speed reduction for different accident scenarios as well as for different system configurations. The simulation model identifies the lateral impact position of the pedestrian as a significant parameter for system performance, and the system layout is designed to brake when the accident becomes unavoidable by the vehicle driver. Scenarios with a pedestrian running from behind an obstruction are the most demanding scenarios and will very likely never be avoidable for all vehicle speeds due to physical limits. Scenarios with an unobstructed person walking will very likely be treatable for a wide speed range for next generation AEB systems.
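
    The kinematic reasoning behind such operability estimates can be illustrated with a first-order braking calculation. The sketch below is a generic illustration, not the authors' simulation model, and the trigger latency and braking deceleration values are assumptions:

```python
def impact_speed(v0, t_available, t_trigger=0.3, a_brake=9.0):
    """First-order estimate of the remaining speed (m/s) at the predicted impact:
    the vehicle travels at v0 until the AEB system triggers (t_trigger seconds
    after the pedestrian becomes detectable) and then brakes at a_brake (m/s^2).
    Neglects the extra time gained because braking itself delays the impact,
    so it underestimates the achievable speed reduction.  t_trigger and a_brake
    are assumed values, not parameters from the article."""
    t_braking = max(0.0, t_available - t_trigger)
    return max(0.0, v0 - a_brake * t_braking)

v0 = 50.0 / 3.6  # 50 km/h expressed in m/s
for t in (0.4, 0.8, 1.2, 1.6):
    print(f"detectable {t:.1f} s before impact -> impact speed {impact_speed(v0, t) * 3.6:.1f} km/h")
```

    Even this crude estimate reproduces the qualitative conclusion above: obstructed, late-appearing pedestrians leave too little time for any braking system to avoid the impact at higher vehicle speeds.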

  15. Advantages and limitations of the use of optogenetic approach in studying fast-scale spike encoding.

    Directory of Open Access Journals (Sweden)

    Aleksey Malyshev

    Full Text Available Understanding single-neuron computations and encoding performed by spike-generation mechanisms of cortical neurons is one of the central challenges for cell electrophysiology and computational neuroscience. An established paradigm to study spike encoding in controlled conditions in vitro uses intracellular injection of a mixture of signals with fluctuating currents that mimic in vivo-like background activity. However this technique has two serious limitations: it uses current injection, while synaptic activation leads to changes of conductance, and current injection is technically most feasible in the soma, while the vast majority of synaptic inputs are located on the dendrites. Recent progress in optogenetics provides an opportunity to circumvent these limitations. Transgenic expression of light-activated ionic channels, such as Channelrhodopsin2 (ChR2, allows induction of controlled conductance changes even in thin distant dendrites. Here we show that photostimulation provides a useful extension of the tools to study neuronal encoding, but it has its own limitations. Optically induced fluctuating currents have a low cutoff (~70 Hz, thus limiting the dynamic range of frequency response of cortical neurons. This leads to severe underestimation of the ability of neurons to phase-lock their firing to high frequency components of the input. This limitation could be worked around by using short (2 ms light stimuli which produce membrane potential responses resembling EPSPs by their fast onset and prolonged decay kinetics. We show that combining application of short light stimuli to different parts of dendritic tree for mimicking distant EPSCs with somatic injection of fluctuating current that mimics fluctuations of membrane potential in vivo, allowed us to study fast encoding of artificial EPSPs photoinduced at different distances from the soma. We conclude that dendritic photostimulation of ChR2 with short light pulses provides a powerful tool to

  16. Hankin and Reeves' approach to estimating fish abundance in small streams: limitations and alternatives

    Science.gov (United States)

    William L. Thompson

    2003-01-01

    Hankin and Reeves' (1988) approach to estimating fish abundance in small streams has been applied in stream fish studies across North America. However, their population estimator relies on two key assumptions: (1) removal estimates are equal to the true numbers of fish, and (2) removal estimates are highly correlated with snorkel counts within a subset of sampled...

  17. Limitations of the toxic equivalency factor (TEF) approach for risk assessment of halogenated aromatic hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Safe, S. [Texas A and M Univ., College Station, TX (United States). Dept. of Veterinary Physiology and Pharmacology

    1995-12-31

    2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) and related halogenated aromatic hydrocarbons (HAHs) are present as complex mixtures of polychlorinated dibenzo-p-dioxins (PCDDs), dibenzofurans (PCDFs) and biphenyls (PCBs) in most environmental matrices. Risk management of these mixtures utilizes the toxic equivalency factor (TEF) approach, in which the TCDD (dioxin) toxic equivalents (TEQ) of a mixture are the summation of each congener concentration (C_i) times its TEF_i (potency relative to TCDD): TEQ_mixture = Σ (C_i × TEF_i). TEQs are determined only for those HAHs which are aryl hydrocarbon (Ah) receptor agonists and this approach assumes that the toxic or biochemical effects of individual compounds in a mixture are additive. Several in vivo and in vitro laboratory and field studies with different HAH mixtures have been utilized to validate the TEF approach. For some responses, the calculated toxicities of PCDD/PCDF and PCB mixtures predict the observed toxic potencies. However, for fetal cleft palate and immunotoxicity in mice, nonadditive (antagonistic) responses are observed using complex PCB mixtures or binary mixtures containing an Ah receptor agonist with 2,2′,4,4′,5,5′-hexachlorobiphenyl (PCB153). The potential interactive effects of PCBs and other dietary Ah receptor antagonists suggest that the TEF approach for risk management of HAHs requires further refinement and should be used selectively.
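
    The additive TEQ computation that the TEF approach rests on is straightforward; a minimal sketch is given below. Congener names, concentrations and TEF values are illustrative placeholders, not data from the abstract:

```python
# Toxic equivalents (TEQ) under the additivity assumption:
#   TEQ_mixture = sum_i C_i * TEF_i
def teq(concentrations, tefs):
    """concentrations and tefs are dicts keyed by congener name;
    concentrations e.g. in pg/g, TEFs dimensionless (relative to TCDD)."""
    return sum(c * tefs.get(name, 0.0) for name, c in concentrations.items())

# Illustrative values only (not taken from the abstract)
tefs = {"2,3,7,8-TCDD": 1.0, "1,2,3,7,8-PeCDD": 1.0, "PCB126": 0.1}
conc = {"2,3,7,8-TCDD": 0.5, "1,2,3,7,8-PeCDD": 1.2, "PCB126": 10.0}
print(f"TEQ = {teq(conc, tefs):.2f} pg TEQ/g")
```

    The nonadditive (antagonistic) responses described above are precisely the cases in which this simple weighted sum over- or under-predicts the observed potency of the mixture.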

  18. Relativities of fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    2017-08-01

    S-dualities have been held to have radical implications for our metaphysics of fundamentality. In particular, it has been claimed that they make the fundamentality status of a physical object theory-relative in an important new way. But what physicists have had to say on the issue has not been clear or consistent, and in particular seems to be ambiguous between whether S-dualities demand an anti-realist interpretation of fundamentality talk or merely a revised realism. This paper is an attempt to bring some clarity to the matter. After showing that even antecedently familiar fundamentality claims are true only relative to a raft of metaphysical, physical, and mathematical assumptions, I argue that the relativity of fundamentality inherent in S-duality nevertheless represents something new, and that part of the reason for this is that it has both realist and anti-realist implications for fundamentality talk. I close by discussing the broader significance that S-dualities have for structuralist metaphysics and for fundamentality metaphysics more generally.

  19. Fundamental Work Cost of Quantum Processes

    Science.gov (United States)

    Faist, Philippe; Renner, Renato

    2018-04-01

    Information-theoretic approaches provide a promising avenue for extending the laws of thermodynamics to the nanoscale. Here, we provide a general fundamental lower limit, valid for systems with an arbitrary Hamiltonian and in contact with any thermodynamic bath, on the work cost for the implementation of any logical process. This limit is given by a new information measure—the coherent relative entropy—which accounts for the Gibbs weight of each microstate. The coherent relative entropy enjoys a collection of natural properties justifying its interpretation as a measure of information and can be understood as a generalization of a quantum relative entropy difference. As an application, we show that the standard first and second laws of thermodynamics emerge from our microscopic picture in the macroscopic limit. Finally, our results have an impact on understanding the role of the observer in thermodynamics: Our approach may be applied at any level of knowledge—for instance, at the microscopic, mesoscopic, or macroscopic scales—thus providing a formulation of thermodynamics that is inherently relative to the observer. We obtain a precise criterion for when the laws of thermodynamics can be applied, thus making a step forward in determining the exact extent of the universality of thermodynamics and enabling a systematic treatment of Maxwell-demon-like situations.

  20. Fundamental Work Cost of Quantum Processes

    Directory of Open Access Journals (Sweden)

    Philippe Faist

    2018-04-01

    Full Text Available Information-theoretic approaches provide a promising avenue for extending the laws of thermodynamics to the nanoscale. Here, we provide a general fundamental lower limit, valid for systems with an arbitrary Hamiltonian and in contact with any thermodynamic bath, on the work cost for the implementation of any logical process. This limit is given by a new information measure—the coherent relative entropy—which accounts for the Gibbs weight of each microstate. The coherent relative entropy enjoys a collection of natural properties justifying its interpretation as a measure of information and can be understood as a generalization of a quantum relative entropy difference. As an application, we show that the standard first and second laws of thermodynamics emerge from our microscopic picture in the macroscopic limit. Finally, our results have an impact on understanding the role of the observer in thermodynamics: Our approach may be applied at any level of knowledge—for instance, at the microscopic, mesoscopic, or macroscopic scales—thus providing a formulation of thermodynamics that is inherently relative to the observer. We obtain a precise criterion for when the laws of thermodynamics can be applied, thus making a step forward in determining the exact extent of the universality of thermodynamics and enabling a systematic treatment of Maxwell-demon-like situations.

  1. A QMU approach for characterizing the operability limits of air-breathing hypersonic vehicles

    International Nuclear Information System (INIS)

    Iaccarino, Gianluca; Pecnik, Rene; Glimm, James; Sharp, David

    2011-01-01

    The operability limits of a supersonic combustion engine for an air-breathing hypersonic vehicle are characterized using numerical simulations and an uncertainty quantification methodology. The time-dependent compressible flow equations with heat release are solved in a simplified configuration. Verification, calibration and validation are carried out to assess the ability of the model to reproduce the flow/thermal interactions that occur when the engine unstarts due to thermal choking. Quantification of margins and uncertainty (QMU) is used to determine the safe operation region for a range of fuel flow rates and combustor geometries. - Highlights: → In this work we introduce a method to study the operability limits of hypersonic scramjet engines. → The method is based on a calibrated heat release model. → It accounts explicitly for uncertainties due to flight conditions and model correlations. → We examine changes due to the combustor geometry and fuel injection.

  2. Revegetation in China’s Loess Plateau is approaching sustainable water resource limits

    Science.gov (United States)

    Feng, Xiaoming; Fu, Bojie; Piao, Shilong; Wang, Shuai; Ciais, Philippe; Zeng, Zhenzhong; Lü, Yihe; Zeng, Yuan; Li, Yue; Jiang, Xiaohui; Wu, Bingfang

    2016-11-01

    Revegetation of degraded ecosystems provides opportunities for carbon sequestration and bioenergy production. However, vegetation expansion in water-limited areas creates potentially conflicting demands for water between the ecosystem and humans. Current understanding of these competing demands is still limited. Here, we study the semi-arid Loess Plateau in China, where the `Grain to Green’ large-scale revegetation programme has been in operation since 1999. As expected, we found that the new planting has caused both net primary productivity (NPP) and evapotranspiration (ET) to increase. Also the increase of ET has induced a significant (p develop a new conceptual framework to determine the critical carbon sequestration that is sustainable in terms of both ecological and socio-economic resource demands in a coupled anthropogenic-biological system.

  3. Liquidity dynamics in an electronic open limit order book: An event study approach

    OpenAIRE

    Gomber, Peter; Schweickert, Uwe; Theissen, Erik

    2011-01-01

    We analyze the dynamics of liquidity in Xetra, an electronic open limit order book. We use the Exchange Liquidity Measure (XLM), a measure of the cost of a roundtrip trade of given size V. This measure captures the price and the quantity dimension of liquidity. We present descriptive statistics, analyze the cross-sectional determinants of the XLM measure and document its intraday pattern. Our main contribution is an analysis of the dynamics of the XLM measure around liquidity shocks. We use i...

  4. Implementation of upper limit calculation for a poisson variable by bayesian approach

    International Nuclear Information System (INIS)

    Zhu Yongsheng

    2008-01-01

    The calculation of Bayesian confidence upper limit for a Poisson variable including both signal and background with and without systematic uncertainties has been formulated. A Fortran 77 routine, BPULE, has been developed to implement the calculation. The routine can account for systematic uncertainties in the background expectation and signal efficiency. The systematic uncertainties may be separately parameterized by a Gaussian, Log-Gaussian or flat probability density function (pdf). Some technical details of BPULE have been discussed. (authors)
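
    BPULE itself is a Fortran 77 routine whose interface is not reproduced here; the sketch below only illustrates the underlying Bayesian calculation for the simplest case of a flat prior on the signal, an exactly known background and no systematic uncertainties:

```python
# Bayesian upper limit s_up for a Poisson signal s with known background b,
# flat prior on s >= 0, after observing n events.  With this prior the
# credibility condition reduces to
#   P(n; s_up + b) / P(n; b) = 1 - CL,   where P(n; mu) is the Poisson CDF at n.
from scipy.stats import poisson
from scipy.optimize import brentq

def bayes_upper_limit(n_obs, background, cl=0.90):
    f = lambda s: poisson.cdf(n_obs, s + background) / poisson.cdf(n_obs, background) - (1.0 - cl)
    return brentq(f, 0.0, 100.0 + 10.0 * n_obs)  # bracket is comfortably wide in practice

s_up = bayes_upper_limit(n_obs=3, background=0.5, cl=0.90)
print(f"90% CL upper limit on the signal mean: {s_up:.2f}")
```

    Incorporating systematic uncertainties, as BPULE does, amounts to smearing the background expectation and signal efficiency with the chosen probability density (Gaussian, log-Gaussian or flat) before normalizing the posterior.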

  5. Execution techniques for high-level radioactive waste disposal. 2. Fundamental concept of geological disposal and implementing approach of disposal project

    International Nuclear Information System (INIS)

    Kawanishi, Motoi; Komada, Hiroya; Tsuchino, Susumu; Shiozaki, Isao; Kitayama, Kazumi; Akasaka, Hidenari; Inagaki, Yusuke; Kawamura, Hideki

    1999-01-01

    The making high activity of the high-level radioactive waste disposal business shall be fully started after establishing of the implementing organization which is planned around 2000. Considering each step of disposal business, in this study, the implementation procedure for a series of disposal business such as the selection of the disposal site, the construction and operation of the disposal facility, the closure and decommissioning of the disposal facility and the management after closure, which are carried forward by the implementation body is discussed in detail from the technical viewpoint and an example of the master schedule is proposed. Furthermore, we investigate and propose the concept of the geological disposal which becomes important in carrying forward to making of the business of the disposal, such as the present site selection smoothly, the fundamental idea of the safe securing for disposal, the basic idea to get trust to the disposal technique and the geological environmental condition which is the basic condition of this whole study for the disposal business making. (author)

  6. A labview approach to instrumentation for the TFTR bumper limiter alignment project

    International Nuclear Information System (INIS)

    Skelly, G.N.; Owens, D.K.

    1992-01-01

    This paper reports on a project recently undertaken to measure the alignment of the TFTR bumper limiter in relation to the toroidal magnetic field axis. The process involved the measurement of the toroidal magnetic field, and the positions of the tiles that make up the bumper limiter. The basis for the instrument control and data acquisition system was National Instruments' LabVIEW 2. LabVIEW is a graphical programming system for developing scientific and engineering applications on a Macintosh. For this project, a Macintosh IIci controlled the IEEE-488 GPIB programmable instruments via an interface box connected to the SCSI port of the computer. With LabVIEW, users create graphical software modules called virtual instruments instead of writing conventional text-based code. To measure the magnetic field, the control system acquired data from two nuclear magnetic resonance magnetometers while the toroidal field coils were pulsed. To measure the position of the tiles on the limiter, an instrumented mechanical arm was used inside the vessel.

  7. Approaches to the calculation of limitations on nuclear detonations for peaceful purposes

    Energy Technology Data Exchange (ETDEWEB)

    Whipple, G H [School of Public Health, University of Michigan, Ann Arbor, MI (United States)

    1969-07-01

    The long-term equilibrium levels of tritium, krypton-85 and carbon-14 which are acceptable in the environment have been estimated on the following premises: 1) the three isotopes reach the environment and equilibrate throughout it in periods shorter than their half lives, 2) nuclear detonations and nuclear power constitute the dominant sources of these isotopes, 3) the doses from these three isotopes add to one another and to the doses from other radioactive isotopes released to the environment, and 4) the United States, by virtue of its population, is entitled to 6% of the world's capacity to accept radioactive wastes. These premises lead to the conclusion that U.S. nuclear detonations are limited by carbon-14 to 60 megatons per year. The corresponding limit for U.S. nuclear power appears to be set by krypton-85 at 100,000 electrical megawatts, although data for carbon-14 production by nuclear power are not available. It is noted that if the equilibration assumed in these estimates does not occur, the limits will in general be lower than those given above. (author)

  8. Approaches to the calculation of limitations on nuclear detonations for peaceful purposes

    International Nuclear Information System (INIS)

    Whipple, G.H.

    1969-01-01

    The long-term equilibrium levels of tritium, krypton-85 and carbon-14 which are acceptable in the environment have been estimated on the following premises: 1) the three isotopes reach the environment and equilibrate throughout it in periods shorter than their half lives, 2) nuclear detonations and nuclear power constitute the dominant sources of these isotopes, 3) the doses from these three isotopes add to one another and to the doses from other radioactive isotopes released to the environment, and 4) the United States, by virtue of its population, is entitled to 6% of the world's capacity to accept radioactive wastes. These premises lead to the conclusion that U.S. nuclear detonations are limited by carbon-14 to 60 megatons per year. The corresponding limit for U.S. nuclear power appears to be set by krypton-85 at 100,000 electrical megawatts, although data for carbon-14 production by nuclear power are not available. It is noted that if the equilibration assumed in these estimates does not occur, the limits will in general be lower than those given above. (author)

  9. Fundamentals - longitudinal motion

    International Nuclear Information System (INIS)

    Weng, W.T.

    1989-01-01

    There are many ways to accelerate charged particles to high energy for physics research. Each has served its purpose but eventually has encountered fundamental limitations of one kind or another. Looking at the famous Livingston curve, the initial birth and final level-off of all types of accelerators is seen. In fact, in the mid-80s we personally witnessed the creation of a new type of collider - the Stanford Linear Collider. Also witnessed, was the resurgence of study into novel methods of acceleration. This paper will cover acceleration and longitudinal motion in a synchrotron. A synchrotron is a circular accelerator with the following three characteristics: (1) Magnetic guiding (dipole) and confinement (quadrupole) components are placed in a small neighborhood around the equilibrium orbit. (2) Particles are kept in resonance with the radio-frequency electric field indefinitely to achieve acceleration to higher energies. (3) Magnetic fields are varied adiabatically with the energy of the particle. D. Edwards described the transverse oscillations of particles in a synchrotron. Here the author talks about the longitudinal oscillations of particles. The phase stability principle was invented by V. Veksler and E. McMillan independently in 1945. The phase stability and strong focusing principle, invented by Courant and Livingston in 1952, enabled the steady energy gain of accelerators and storage rings witnessed during the past 30 years. This paper is a unified overview of the related rf subjects in an accelerator and a close coupling between accelerator physics and engineering practices, which is essential for the major progress in areas such as high intensity synchrotrons, a multistage accelerator complex, and anti-proton production and cooling, made possible in the past 20 years
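
    Characteristics (2) and (3) above can be summarized by two standard relations, quoted here as background rather than from the text: the RF frequency is an integer multiple (the harmonic number h) of the revolution frequency, and the magnetic rigidity tracks the particle momentum,

```latex
f_{\mathrm{rf}} = h\, f_{\mathrm{rev}},
\qquad
B\rho = \frac{p}{q}.
```

    Phase stability then refers to the fact that particles arriving slightly off the synchronous phase receive a correcting energy kick and oscillate stably about it.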

  10. Fundamentals of electronics

    CERN Document Server

    Schubert, Thomas F

    2015-01-01

    This book, Electronic Devices and Circuit Application, is the first of four books of a larger work, Fundamentals of Electronics. It is comprised of four chapters describing the basic operation of each of the four fundamental building blocks of modern electronics: operational amplifiers, semiconductor diodes, bipolar junction transistors, and field effect transistors. Attention is focused on the reader obtaining a clear understanding of each of the devices when it is operated in equilibrium. Ideas fundamental to the study of electronic circuits are also developed in the book at a basic level to

  11. Search for fundamental 'God Particle' speeds up

    CERN Multimedia

    Spotts, P N

    2000-01-01

    This month researchers at CERN are driving the accelerator to its limits and beyond to find the missing Higgs boson. Finding it would confirm a 30-yr-old theory about why matter's most fundamental particles have mass (1 page).

  12. A discussion of the limitations of the psychometric and cultural theory approaches to risk perception

    International Nuclear Information System (INIS)

    Sjoeberg, L.

    1996-01-01

    Risk perception has traditionally been conceived as a cognitive phenomenon, basically a question of information processing. The very term perception suggests that information processing is involved and of crucial importance. Kahneman and Tversky suggested that the use of 'heuristics' in the intuitive estimation of probabilities accounts for biased probability perception, hence claiming to explain risk perception as well. The psychometric approach of Slovic et al, a further step in in the cognitive tradition, conceives of perceived risk as a function of general properties of a hazard. However, the psychometric approach is shown here to explain only about 20% of the variance of perceived risk, even less of risk acceptability. Its claim to explanatory power is based on a statistical illusion: mean values were investigated and accounted for, across hazards. A currently popular alternative to the psychometric tradition, Cultural Theory, is even less successful and explains only about 5% of the variance of perceived risk. The claims of this approach were also based on a statistical illusion: 'significant' results were reported and interpreted as being of substantial importance. The present paper presents a new approach: attitude to the risk generating technology, general sensitivity to risks and specific risk explained well over 60% of the variance of perceived risk of nuclear waste, in a study of extensive data from a representative sample of the Swedish population. The attitude component functioning as an explanatory factor of perceived risk, rather than as a consequence of perceived risk, suggests strongly that perceived risk is something other than cognition. Implications for risk communication are discussed. (author)

  13. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    OpenAIRE

    Xiao, Ning-Cong; Li, Yan-Feng; Wang, Zhonglai; Peng, Weiwen; Huang, Hong-Zhong

    2013-01-01

    In this paper the combinations of maximum entropy method and Bayesian inference for reliability assessment of deteriorating system is proposed. Due to various uncertainties, less data and incomplete information, system parameters usually cannot be determined precisely. These uncertainty parameters can be modeled by fuzzy sets theory and the Bayesian inference which have been proved to be useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to cal...

  14. An approach to the determination of physical-chemical limits of energy consumption for the transition to a stationary state

    International Nuclear Information System (INIS)

    Zimen, K.E.

    1975-02-01

    The paper gives a model of energy consumption and a programme for its application. Previous models are mainly criticized on the grounds that new technological developments as well as adjustments due to learning processes of homo sapiens are generally not sufficiently accounted for in these models. The approach of this new model is therefore an attempt at the determination of the physical-chemical limiting values for the capacity of the global HST (homo sapiens - Tellus) system or of individual regions with respect to certain critical factors. These limiting values determined by the physical-chemical system of the earth are independent of human ingenuity and flexibility. (orig./AK) [de

  15. Advantages and limitations of quantitative PCR (Q-PCR)-based approaches in microbial ecology.

    Science.gov (United States)

    Smith, Cindy J; Osborn, A Mark

    2009-01-01

    Quantitative PCR (Q-PCR or real-time PCR) approaches are now widely applied in microbial ecology to quantify the abundance and expression of taxonomic and functional gene markers within the environment. Q-PCR-based analyses combine 'traditional' end-point detection PCR with fluorescent detection technologies to record the accumulation of amplicons in 'real time' during each cycle of the PCR amplification. By detection of amplicons during the early exponential phase of the PCR, this enables the quantification of gene (or transcript) numbers when these are proportional to the starting template concentration. When Q-PCR is coupled with a preceding reverse transcription reaction, it can be used to quantify gene expression (RT-Q-PCR). This review firstly addresses the theoretical and practical implementation of Q-PCR and RT-Q-PCR protocols in microbial ecology, highlighting key experimental considerations. Secondly, we review the applications of (RT)-Q-PCR analyses in environmental microbiology and evaluate the contribution and advances gained from such approaches. Finally, we conclude by offering future perspectives on the application of (RT)-Q-PCR in furthering understanding in microbial ecology, in particular, when coupled with other molecular approaches and more traditional investigations of environmental systems.
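
    As a concrete illustration of the quantification step described above, absolute Q-PCR quantification typically interpolates an unknown sample's quantification cycle (Cq) on a standard curve built from a dilution series. All numbers in the sketch below are hypothetical:

```python
import numpy as np

# Standard curve: Cq versus log10(starting copies) from a dilution series
log10_copies = np.array([7, 6, 5, 4, 3], dtype=float)     # hypothetical standards
cq_standards = np.array([13.1, 16.5, 19.8, 23.2, 26.6])    # hypothetical Cq values

slope, intercept = np.polyfit(log10_copies, cq_standards, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 corresponds to 100% amplification efficiency

def copies_from_cq(cq):
    """Interpolate the starting copy number of an unknown sample from its Cq."""
    return 10 ** ((cq - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}, unknown ~ {copies_from_cq(21.0):.2e} copies")
```

    For RT-Q-PCR the same interpolation is applied to the cDNA produced by the reverse transcription step, so transcript numbers inherit any bias of that reaction.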

  16. Different Variants of Fundamental Portfolio

    Directory of Open Access Journals (Sweden)

    Tarczyński Waldemar

    2014-06-01

    Full Text Available The paper proposes the fundamental portfolio of securities. This portfolio is an alternative to the classic Markowitz model, which combines fundamental analysis with portfolio analysis. The method’s main idea is based on the use of the TMAI synthetic measure and, in limiting conditions, the use of risk and the portfolio’s rate of return in the objective function. Different variants of fundamental portfolio have been considered under an empirical study. The effectiveness of the proposed solutions has been related to the classic portfolio constructed with the help of the Markowitz model and the WIG20 market index’s rate of return. All portfolios were constructed with data on rates of return for 2005. Their effectiveness in 2006-2013 was then evaluated. The studied period comprises the end of the bull market, the 2007-2009 crisis, the 2010 bull market and the 2011 crisis. This allows for the evaluation of the solutions’ flexibility in various extreme situations. For the construction of the fundamental portfolio’s objective function and the TMAI, the study made use of financial and economic data on selected indicators retrieved from Notoria Serwis for 2005.

  17. FY 2000 research and development of fundamental technologies for AC superconducting power devices. R and D of fundamental technologies for superconducting power cables and faults current limiters, R and D of superconducting magnets for power applications, and study on the total systems and related subjects; 2000 nendo koryu chodendo denryoku kiki kiban gijutsu kenkyu kaihatsu seika hokokusho. Chodendo soden cable kiban gijutsu no kenkyu kaihatsu, chodendo genryuki kiban gijutsu no kenkyu kaihatsu, denryokuyo chodendo magnet no kenkyu kaihatsu, total system nado no kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    The project for research and development of fundamental technologies for AC superconducting power devices has been started, and the FY 2000 results are reported. The R and D of fundamental technologies for superconducting power cables include grasping the mechanical characteristics associated with integration necessary for fabrication of large current capacity and long cables; development of barrier cable materials by various methods; and development of short insulated tubes as cooling technology for long superconducting cables, and grasping its thermal/mechanical characteristics. The R and D of faults current limiters include introduction of the unit for superconducting film fabrication, determination of the structures and layouts for large currents, and improvement of performance of each device for high voltages. R and D of superconducting magnets for power applications include grasping the fundamental characteristics of insulation at cryogenic temperature, completion of the insulation designs for high voltage/current lead bushing, and development of prototype sub-cooled nitrogen cooling unit for cooling each AC power device. Study on the total systems and related subjects include analysis for stabilization of the group model systems, to confirm improved voltage stability when the superconducting cable is in service. (NEDO)

  18. Quantum Uncertainty and Fundamental Interactions

    Directory of Open Access Journals (Sweden)

    Tosto S.

    2013-04-01

    Full Text Available The paper proposes a simplified theoretical approach to infer some essential concepts on the fundamental interactions between charged particles and their relative strengths at comparable energies by exploiting the quantum uncertainty only. The worth of the present approach relies on the way of obtaining the results, rather than on the results themselves: concepts today acknowledged as fingerprints of the electroweak and strong interactions appear indeed rooted in the same theoretical frame including also the basic principles of special and general relativity along with the gravity force.

  19. Approaches for the development of occupational exposure limits for man-made mineral fibres (MMMFs)

    International Nuclear Information System (INIS)

    Ziegler-Skylakakis, Kyriakoula

    2004-01-01

    Occupational exposure limits (OELs) are an essential tool in the control of exposure to hazardous chemical agents, and serve to minimise the occurrence of occupational diseases associated with such exposure. The setting of OELs, together with other associated measures, forms an essential part of the European Community's strategy on health and safety at work, upon which the legislative framework for the protection of workers from risks related to chemical agents is based. The European Commission is assisted by the Scientific Committee on Occupational Exposure Limits (SCOEL) in its work of setting OELs for hazardous chemical agents. The procedure for setting OELs requires information on the toxic mechanisms of an agent that should make it possible to differentiate between thresholded and non-thresholded mechanisms. In the first case, a no-observed adverse effect level (NOAEL) can be defined, which can be the basis for a derivation of an OEL. In the latter case, any exposure is correlated with a certain risk. If adequate scientific data are available, SCOEL estimates the risk associated with a series of exposure levels. This can then be used for guidance when setting OELs at European level. Man-made mineral fibres (MMMFs) are widely used at different worksites. MMMF products can release airborne respirable fibres during their production, use and removal. According to the classification of the EU system, all MMMF fibres are considered to be irritants and are classified for carcinogenicity. EU legislation foresees the use of limit values as one of the provisions for the protection of workers from the risks related to exposure to carcinogens. In the following paper, the research requirements identified by SCOEL for the development of OELs for MMMFs will be presented.

  20. Fundamentals of electrochemical science

    CERN Document Server

    Oldham, Keith

    1993-01-01

    Key Features* Deals comprehensively with the basic science of electrochemistry* Treats electrochemistry as a discipline in its own right and not as a branch of physical or analytical chemistry* Provides a thorough and quantitative description of electrochemical fundamentals

  1. Fundamentals of ion exchange

    International Nuclear Information System (INIS)

    Townsend, R.P.

    1993-01-01

    In this paper the fundamentals of ion exchange mechanisms and their thermodynamics are described. A range of ion exchange materials is considered and problems of communication and technology transfer between scientists working in the field are discussed. (UK)

  2. Land Prices and Fundamentals

    OpenAIRE

    Koji Nakamura; Yumi Saita

    2007-01-01

    This paper examines the long-term relationship between macro economic fundamentals and the weighted-average land price indicators, which are supposed to be more appropriate than the official land price indicators when analyzing their impacts on the macro economy. In many cases, we find the cointegrating relationships between the weighted-average land price indicators and the discounted present value of land calculated based on the macro economic fundamentals indicators. We also find that the ...

  3. Fundamentals of structural dynamics

    CERN Document Server

    Craig, Roy R

    2006-01-01

    From theory and fundamentals to the latest advances in computational and experimental modal analysis, this is the definitive, updated reference on structural dynamics.This edition updates Professor Craig's classic introduction to structural dynamics, which has been an invaluable resource for practicing engineers and a textbook for undergraduate and graduate courses in vibrations and/or structural dynamics. Along with comprehensive coverage of structural dynamics fundamentals, finite-element-based computational methods, and dynamic testing methods, this Second Edition includes new and e

  4. Information security fundamentals

    CERN Document Server

    Peltier, Thomas R

    2013-01-01

    Developing an information security program that adheres to the principle of security as a business enabler must be the first step in an enterprise's effort to build an effective security program. Following in the footsteps of its bestselling predecessor, Information Security Fundamentals, Second Edition provides information security professionals with a clear understanding of the fundamentals of security required to address the range of issues they will experience in the field.The book examines the elements of computer security, employee roles and r

  5. Religious fundamentalism and conflict

    OpenAIRE

    Muzaffer Ercan Yılmaz

    2006-01-01

    This study provides an analytical discussion for the issue of religious fundamentalism and its relevance to conflict, in its broader sense. It is stressed that religious fundamentalism manifests itself in two ways: nonviolent intolerance and violent intolerance. The sources of both types of intolerance and their connection to conflict are addressed and discussed in detail. Further research is also suggested on conditions connecting religion to nonviolent intolerance so as to cope with the problem...

  6. Fundamentals of statistics

    CERN Document Server

    Mulholland, Henry

    1968-01-01

    Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial and Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of normal distribution; and tests involving the normal or Student's t distributions. The use of control charts for sample means; the ranges

  7. Approaching the Ultimate Limits of Communication Efficiency with a Photon-Counting Detector

    Science.gov (United States)

    Erkmen, Baris; Moision, Bruce; Dolinar, Samuel J.; Birnbaum, Kevin M.; Divsalar, Dariush

    2012-01-01

    Coherent states achieve the Holevo capacity of a pure-loss channel when paired with an optimal measurement, but a physical realization of this measurement is as yet unknown, and it is also likely to be of high complexity. In this paper, we focus on the photon-counting measurement and study the photon and dimensional efficiencies attainable with modulations over classical- and nonclassical-state alphabets. We first review the state-of-the-art coherent on-off-keying (OOK) with a photon-counting measurement, illustrating its asymptotic inefficiency relative to the Holevo limit. We show that a commonly made Poisson approximation in thermal noise leads to unbounded photon information efficiencies, violating the conjectured Holevo limit. We analyze two binary-modulation architectures that improve upon the dimensional versus photon efficiency tradeoff achievable with conventional OOK. We show that at high photon efficiency these architectures achieve an efficiency tradeoff that differs from the best possible tradeoff, determined by the Holevo capacity, by only a constant factor. The first architecture we analyze is a coherent-state transmitter that relies on feedback from the receiver to control the transmitted energy. The second architecture uses a single-photon number-state source.
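
    For context on the Holevo benchmark referred to above, the Holevo capacity of a pure-loss channel and the corresponding photon information efficiency can be evaluated directly from the standard expression g(x) = (x+1)log2(x+1) - x log2(x). The parameter values in the sketch below are illustrative only:

```python
import numpy as np

def g(x):
    """Entropy of a thermal state with mean photon number x, in bits."""
    return (x + 1) * np.log2(x + 1) - x * np.log2(x) if x > 0 else 0.0

def holevo_pure_loss(n_transmit, eta):
    """Holevo capacity (bits per mode) of a pure-loss channel with
    transmissivity eta and mean transmitted photon number n_transmit."""
    return g(eta * n_transmit)

n_bar, eta = 0.01, 1.0                 # illustrative: dim signal, lossless channel
c_mode = holevo_pure_loss(n_bar, eta)  # bits per mode (dimensional efficiency)
c_photon = c_mode / (eta * n_bar)      # bits per received photon (photon efficiency)
print(f"{c_mode:.4f} bits/mode, {c_photon:.2f} bits/photon")
```

    The tradeoff discussed in the abstract is visible here: driving the mean photon number down raises the bits-per-photon figure while the bits-per-mode figure shrinks.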

  8. Utility approach to decision-making in extended T1 and limited T2 glottic carcinoma.

    Science.gov (United States)

    van Loon, Yda; Stiggelbout, Anne M; Hakkesteegt, Marieke M; Langeveld, Ton P M; de Jong, Rob J Baatenburg; Sjögren, Elisabeth V

    2017-04-01

    It is still undecided if endoscopic laser surgery or radiotherapy is the preferable treatment in extended T1 and limited T2 glottic tumors. Health utilities assessed from patients can aid in decision-making. Patients treated for extended T1 or limited T2 glottic carcinoma by laser surgery (n = 12) or radiotherapy (n = 14) assigned health utilities using a visual analog scale (VAS), time tradeoff (TTO) technique and scored their voice handicap using the Voice Handicap Index (VHI). VAS and TTO scores were slightly lower for the laser group compared to the radiotherapy group, however, not significantly so. The VHI showed a correlation with the VAS score, which was very low in both groups and can be considered (near) normal. Patients show no clear preference for the outcomes of laser surgery or radiotherapy from a quality of life (QOL) or voice handicap point of view. These data can now be incorporated into decision-making models. © 2017 Wiley Periodicals, Inc. Head Neck 39: 779-785, 2017.

  9. Visual Approach and Landing Aids for Aircraft. A Theoretical Analysis of Some Fundamental Aspects of the Problem by Means of Perspective Diagrams

    Science.gov (United States)

    1947-01-01

    ...noted that the pilot is not tied to an ideal approach path fixed in space. An important advantage of this pattern is that it resolves the

  10. New approach to the theory of coupled πNN-NN system. III. A three-body limit

    International Nuclear Information System (INIS)

    Avishai, Y.; Mizutani, T.

    1980-01-01

    In the limit where the pion is restricted to be emitted only by the nucleon that first absorbed it, it is shown that the equations previously developed to describe the coupled πNN (πd) - NN system reduce to conventional three-body equations. Specifically, it is found in this limit that the input πN P11 amplitude is the one which, put on-shell, is directly related to the experimental phase shift, contrary to the original equations, where the direct (dressed) nucleon pole term and the non-pole part of this partial wave enter separately. The present study clarifies the limitations of a pure three-body approach to the πNN-NN problem, as well as suggesting a rare opportunity to observe possible resonance behavior in the non-pole part of the πN P11 amplitude through πd experiments.

  11. A Cultural Psychological Approach to Analyze Intercultural Learning: Potential and Limits of the Structure Formation Technique

    Directory of Open Access Journals (Sweden)

    Doris Weidemann

    2009-01-01

    Full Text Available Despite the huge interest in sojourner adjustment, there is still a lack of qualitative as well as of longitudinal research that would offer more detailed insights into intercultural learning processes during overseas stays. The present study aims to partly fill that gap by documenting changes in knowledge structures and general living experiences of fifteen German sojourners in Taiwan in a longitudinal, cultural-psychological study. As part of a multimethod design a structure formation technique was used to document subjective theories on giving/losing face and their changes over time. In a second step results from this study are compared to knowledge-structures of seven long-term German residents in Taiwan, and implications for the conceptualization of intercultural learning will be proposed. Finally, results from both studies serve to discuss the potential and limits of structure formation techniques in the field of intercultural communication research. URN: urn:nbn:de:0114-fqs0901435

  12. Overcoming the Time Limitation in Molecular Dynamics Simulation of Crystal Nucleation: A Persistent-Embryo Approach.

    Science.gov (United States)

    Sun, Yang; Song, Huajing; Zhang, Feng; Yang, Lin; Ye, Zhuo; Mendelev, Mikhail I; Wang, Cai-Zhuang; Ho, Kai-Ming

    2018-02-23

    The crystal nucleation from liquid in most cases is too rare to be accessed within the limited time scales of the conventional molecular dynamics (MD) simulation. Here, we developed a "persistent embryo" method to facilitate crystal nucleation in MD simulations by preventing small crystal embryos from melting using external spring forces. We applied this method to the pure Ni case for a moderate undercooling where no nucleation can be observed in the conventional MD simulation, and obtained nucleation rate in good agreement with the experimental data. Moreover, the method is applied to simulate an even more sluggish event: the nucleation of the B2 phase in a strong glass-forming Cu-Zr alloy. The nucleation rate was found to be 8 orders of magnitude smaller than Ni at the same undercooling, which well explains the good glass formability of the alloy. Thus, our work opens a new avenue to study solidification under realistic experimental conditions via atomistic computer simulation.
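
    The restraint at the heart of the persistent-embryo idea, harmonic springs that pin a small crystalline seed to its ideal lattice sites and are released as the nucleus grows, can be sketched schematically as below. The spring constant k0 and cutoff size n_c are user-chosen parameters for illustration, not values taken from the paper:

```python
import numpy as np

def embryo_restraint_forces(positions, ref_positions, n_largest_nucleus,
                            k0=1.0, n_c=100):
    """Harmonic restraints that tether embryo atoms to their ideal lattice sites.

    The spring constant is ramped down as the largest nucleus in the system
    grows, and vanishes once it reaches n_c, so the later stages of nucleation
    and growth proceed without bias.  Schematic only: k0 and n_c are assumed
    values, not parameters from the cited work.
    """
    k = k0 * max(0.0, 1.0 - n_largest_nucleus / n_c)
    return -k * (positions - ref_positions)   # array of shape (N_embryo, 3)

# usage inside an MD loop (schematic):
# forces[embryo_atoms] += embryo_restraint_forces(x[embryo_atoms], x_ref,
#                                                 largest_cluster_size(x))
```

    Because the restraint only acts while the nucleus is subcritical, the unbiased dynamics of the critical and post-critical nucleus are what determine the measured nucleation rate.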

  13. Overcoming the Time Limitation in Molecular Dynamics Simulation of Crystal Nucleation: A Persistent-Embryo Approach

    Science.gov (United States)

    Sun, Yang; Song, Huajing; Zhang, Feng; Yang, Lin; Ye, Zhuo; Mendelev, Mikhail I.; Wang, Cai-Zhuang; Ho, Kai-Ming

    2018-02-01

    The crystal nucleation from liquid in most cases is too rare to be accessed within the limited time scales of the conventional molecular dynamics (MD) simulation. Here, we developed a "persistent embryo" method to facilitate crystal nucleation in MD simulations by preventing small crystal embryos from melting using external spring forces. We applied this method to the pure Ni case for a moderate undercooling where no nucleation can be observed in the conventional MD simulation, and obtained nucleation rate in good agreement with the experimental data. Moreover, the method is applied to simulate an even more sluggish event: the nucleation of the B2 phase in a strong glass-forming Cu-Zr alloy. The nucleation rate was found to be 8 orders of magnitude smaller than Ni at the same undercooling, which well explains the good glass formability of the alloy. Thus, our work opens a new avenue to study solidification under realistic experimental conditions via atomistic computer simulation.

  14. Breaking the theoretical scaling limit for predicting quasiparticle energies: the stochastic GW approach.

    Science.gov (United States)

    Neuhauser, Daniel; Gao, Yi; Arntsen, Christopher; Karshenas, Cyrus; Rabani, Eran; Baer, Roi

    2014-08-15

    We develop a formalism to calculate the quasiparticle energy within the GW many-body perturbation correction to the density functional theory. The occupied and virtual orbitals of the Kohn-Sham Hamiltonian are replaced by stochastic orbitals used to evaluate the Green function G, the polarization potential W, and, thereby, the GW self-energy. The stochastic GW (sGW) formalism relies on novel theoretical concepts such as stochastic time-dependent Hartree propagation, stochastic matrix compression, and spatial or temporal stochastic decoupling techniques. Beyond the theoretical interest, the formalism enables linear scaling GW calculations breaking the theoretical scaling limit for GW as well as circumventing the need for energy cutoff approximations. We illustrate the method for silicon nanocrystals of varying sizes with N_e > 3000 electrons.

  15. Design and modeling of an SJ infrared solar cell approaching upper limit of theoretical efficiency

    Science.gov (United States)

    Sahoo, G. S.; Mishra, G. P.

    2018-01-01

    Recent trends of photovoltaics account for the conversion efficiency limit making them more cost effective. To achieve this we have to leave the golden era of silicon cell and make a path towards III-V compound semiconductor groups to take advantages like bandgap engineering by alloying these compounds. In this work we have used a low bandgap GaSb material and designed a single junction (SJ) cell with a conversion efficiency of 32.98%. SILVACO ATLAS TCAD simulator has been used to simulate the proposed model using both Ray Tracing and Transfer Matrix Method (under 1 sun and 1000 sun of AM1.5G spectrum). A detailed analyses of photogeneration rate, spectral response, potential developed, external quantum efficiency (EQE), internal quantum efficiency (IQE), short-circuit current density (JSC), open-circuit voltage (VOC), fill factor (FF) and conversion efficiency (η) are discussed. The obtained results are compared with previously reported SJ solar cell reports.
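
    The headline conversion efficiency quoted above follows from the usual relation eta = (J_SC * V_OC * FF) / P_in; a minimal check of that arithmetic is shown below, with placeholder device values rather than the simulated GaSb results:

```python
def efficiency(jsc_mA_cm2, voc_V, ff, p_in_mW_cm2=100.0):
    """Conversion efficiency from short-circuit current density (mA/cm^2),
    open-circuit voltage (V), fill factor (0-1) and incident power
    (mW/cm^2; 100 mW/cm^2 corresponds to 1-sun AM1.5G illumination)."""
    return jsc_mA_cm2 * voc_V * ff / p_in_mW_cm2

# placeholder numbers for illustration only
print(f"eta = {efficiency(60.0, 0.60, 0.80):.1%}")   # -> 28.8%
```

    Under concentration (the 1000-sun case mentioned above) both P_in and, logarithmically, V_OC increase, which is why concentrator efficiencies exceed the 1-sun figure.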

  16. Limited BRC rulemaking: Regulatory approach and experience in Texas for short-lived radioactive waste

    International Nuclear Information System (INIS)

    McBurney, Ruth E.; Pollard, Christine G.

    1992-01-01

    In 1987, the Texas Department of Health (TDH) implemented a rule to allow, under certain conditions, wastes containing limited concentrations of short-lived radionuclides (less than 300-day half-life) to be disposed of in Type I sanitary landfills. The rule was based on a technical analysis that demonstrated the degree of safety for approximately 340 m³ of radioactive waste generated annually in Texas and identified major restrictions and conditions for disposal. TDH's Bureau of Radiation Control staff have been able to maintain an account of licensees utilizing the rule during the past years. Several research and industrial facilities in the state have saved significantly on waste disposal expenses. Public concerns and economic impacts for licensees as well as other regulatory aspects and experiences with the rule are discussed. (author)

  17. Stochastic modelling of a single ion channel: an alternating renewal approach with application to limited time resolution.

    Science.gov (United States)

    Milne, R K; Yeo, G F; Edeson, R O; Madsen, B W

    1988-04-22

    Stochastic models of ion channels have been based largely on Markov theory where individual states and transition rates must be specified, and sojourn-time densities for each state are constrained to be exponential. This study presents an approach based on random-sum methods and alternating-renewal theory, allowing individual states to be grouped into classes provided the successive sojourn times in a given class are independent and identically distributed. Under these conditions Markov models form a special case. The utility of the approach is illustrated by considering the effects of limited time resolution (modelled by using a discrete detection limit, xi) on the properties of observable events, with emphasis on the observed open-time (xi-open-time). The cumulants and Laplace transform for a xi-open-time are derived for a range of Markov and non-Markov models; several useful approximations to the xi-open-time density function are presented. Numerical studies show that the effects of limited time resolution can be extreme, and also highlight the relative importance of the various model parameters. The theory could form a basis for future inferential studies in which parameter estimation takes account of limited time resolution in single channel records. Appendixes include relevant results concerning random sums and a discussion of the role of exponential distributions in Markov models.
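
    The effect of a detection limit on observed open times is easy to demonstrate by simulation. The Python sketch below treats only the simplest two-state Markov case (the paper covers far more general alternating-renewal models): sojourn times are drawn from exponential distributions and any closure shorter than the dead time xi is missed, so the flanking openings fuse into one observed xi-open-time. Rate values are illustrative.

        import numpy as np

        def apparent_open_times(rate_open_to_closed, rate_closed_to_open, xi,
                                n_events=10000, seed=0):
            """Simulate an alternating open/closed channel with exponential sojourns and
            merge any closure shorter than the detection limit xi into its neighbours.
            Returns the observed (xi-)open times."""
            rng = np.random.default_rng(seed)
            opens = rng.exponential(1.0 / rate_open_to_closed, n_events)
            closes = rng.exponential(1.0 / rate_closed_to_open, n_events)
            observed, current = [], opens[0]
            for o, c in zip(opens[1:], closes):
                if c < xi:                 # closure too brief to detect: openings fuse
                    current += c + o
                else:
                    observed.append(current)
                    current = o
            observed.append(current)
            return np.array(observed)

        times = apparent_open_times(100.0, 500.0, xi=1e-3)
        print(times.mean())   # noticeably longer than the true mean open time of 0.01 s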

  18. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    Directory of Open Access Journals (Sweden)

    Ning-Cong Xiao

    2013-12-01

    Full Text Available In this paper, a combination of the maximum entropy method and Bayesian inference is proposed for the reliability assessment of deteriorating systems. Because of various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have proved useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to calculate the maximum entropy density function of the uncertain parameters more accurately, because it does not need any additional information or assumptions. Finally, two optimization models are presented that can be used to determine the lower and upper bounds of the system's probability of failure under vague environmental conditions. Two numerical examples are investigated to demonstrate the proposed method.

  19. Limitations of the endonasal endoscopic approach in treating olfactory groove meningiomas. A systematic review.

    Science.gov (United States)

    Shetty, Sathwik Raviraj; Ruiz-Treviño, Armando S; Omay, Sacit Bulent; Almeida, Joao Paulo; Liang, Buqing; Chen, Yu-Ning; Singh, Harminder; Schwartz, Theodore H

    2017-10-01

    To review current management strategies for olfactory groove meningiomas (OGMs) and the recent literature comparing endoscopic endonasal (EEA) with traditional transcranial (TCA) approaches. A PubMed search of the recent literature (2011-2016) was performed to examine outcomes following EEA and TCA for OGM. The extent of resection, visual outcome, postoperative complications and recurrence rates were analyzed using percentages and proportions, the Fisher exact test and Student's t-test using GraphPad PRISM 7.0Aa (San Diego, CA) software. There were 444 patients in the TCA group with a mean diameter of 4.61 (±1.17) cm and 101 patients in the EEA group with a mean diameter of 3.55 (±0.58) cm (p = 0.0589). GTR was achieved in 90.9% (404/444) in the TCA group and 70.2% (71/101) in the EEA group (p OGMs.

  20. A pharmacology guided approach for setting limits on product-related impurities for bispecific antibody manufacturing.

    Science.gov (United States)

    Rajan, Sharmila; Sonoda, Junichiro; Tully, Timothy; Williams, Ambrose J; Yang, Feng; Macchi, Frank; Hudson, Terry; Chen, Mark Z; Liu, Shannon; Valle, Nicole; Cowan, Kyra; Gelzleichter, Thomas

    2018-04-13

    bFKB1 is a humanized bispecific IgG1 antibody, created by conjoining an anti-Fibroblast Growth Factor Receptor 1 (FGFR1) half-antibody to an anti-Klothoβ (KLB) half-antibody, using the knobs-into-holes strategy. bFKB1 acts as a highly selective agonist for the FGFR1/KLB receptor complex and is intended to ameliorate obesity-associated metabolic defects by mimicking the activity of the hormone FGF21. An important aspect of the biologics product manufacturing process is to establish meaningful product specifications regarding the tolerable levels of impurities that copurify with the drug product. The aim of the current study was to determine acceptable levels of product-related impurities for bFKB1. To determine the tolerable levels of these impurities, we dosed obese mice with bFKB1 enriched with various levels of either HMW impurities or anti-FGFR1-related impurities, and measured biomarkers for KLB-independent FGFR1 signaling. Here, we show that product-related impurities of bFKB1, in particular, high molecular weight (HMW) impurities and anti-FGFR1-related impurities, when purposefully enriched, stimulate FGFR1 in a KLB-independent manner. By taking this approach, the tolerable levels of product-related impurities were successfully determined. Our study demonstrates a general pharmacology-guided approach to setting a product specification for a bispecific antibody whose homomultimer-related impurities could lead to undesired biological effects. Copyright © 2018. Published by Elsevier Inc.

  1. Approach to simultaneously denoise and invert backscatter and extinction from photon-limited atmospheric lidar observations.

    Science.gov (United States)

    Marais, Willem J; Holz, Robert E; Hu, Yu Hen; Kuehn, Ralph E; Eloranta, Edwin E; Willett, Rebecca M

    2016-10-10

    Atmospheric lidar observations provide a unique capability to directly observe the vertical column of cloud and aerosol scattering properties. Detector and solar-background noise, however, hinder the ability of lidar systems to provide reliable backscatter and extinction cross-section estimates. Standard methods for solving this inverse problem are most effective with high signal-to-noise ratio observations that are only available at low resolution in uniform scenes. This paper describes a novel method for solving the inverse problem with high-resolution, lower signal-to-noise ratio observations that are effective in non-uniform scenes. The novelty is twofold. First, the inferences of the backscatter and extinction are applied to images, whereas current lidar algorithms only use the information content of single profiles. Hence, the latent spatial and temporal information in noisy images are utilized to infer the cross-sections. Second, the noise associated with photon-counting lidar observations can be modeled using a Poisson distribution, and state-of-the-art tools for solving Poisson inverse problems are adapted to the atmospheric lidar problem. It is demonstrated through photon-counting high spectral resolution lidar (HSRL) simulations that the proposed algorithm yields inverted backscatter and extinction cross-sections (per unit volume) with smaller mean squared error values at higher spatial and temporal resolutions, compared to the standard approach. Two case studies of real experimental data are also provided where the proposed algorithm is applied on HSRL observations and the inverted backscatter and extinction cross-sections are compared against the standard approach.
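
    At the core of the approach is the replacement of a Gaussian-noise assumption by a Poisson observation model for the photon counts, combined with regularization. The toy Python sketch below illustrates that ingredient on a single 1-D profile using a penalized Poisson negative log-likelihood smoothed along range; the paper itself works on full images with far more sophisticated spatio-temporal regularizers, so the quadratic penalty and all names here are assumptions for illustration only.

        import numpy as np
        from scipy.optimize import minimize

        def penalized_poisson_nll(log_rate, counts, lam):
            """Poisson negative log-likelihood (up to a constant) for intensities
            exp(log_rate), plus a quadratic smoothness penalty along range."""
            rate = np.exp(log_rate)
            nll = np.sum(rate - counts * log_rate)
            penalty = lam * np.sum(np.diff(log_rate) ** 2)
            return nll + penalty

        # Toy example: recover a smooth, decaying backscatter-like profile from photon counts.
        truth = 50.0 * np.exp(-np.linspace(0.0, 3.0, 80))
        counts = np.random.default_rng(0).poisson(truth)
        result = minimize(penalized_poisson_nll, x0=np.log(counts + 1.0), args=(counts, 5.0))
        estimate = np.exp(result.x)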

  2. Fundamental investigations of catalyst nanoparticles

    DEFF Research Database (Denmark)

    Elkjær, Christian Fink

    and economic development in the 20th century. There is however a downside to this development and we are seeing significant pollution and pressure on resources. Catalysis therefore has an increasingly important role in limiting pollution and optimizing the use of resources. This development will depend on our...... fundamental understanding of catalytic processes and our ability to make use of that understanding. This thesis presents fundamental studies of catalyst nanoparticles with particular focus on dynamic processes. Such studies often require atomic-scale characterization, because the catalytic conversion takes...... important that we only study intrinsic structures and phenomena and not those that may be induced by the high energy electrons used to image the specimen. This requires careful consideration of the influence of the electron beam in order to understand, control and minimize that influence. I present four...

  3. [Substitutive and dietetic approaches in childhood autistic disorder: interests and limits].

    Science.gov (United States)

    Hjiej, H; Doyen, C; Couprie, C; Kaye, K; Contejean, Y

    2008-10-01

    Autism is a developmental disorder that requires specialized therapeutic approaches. Influenced by various theoretical hypotheses, therapeutic programs are typically structured on a psychodynamic, biological or educational basis. At present, educational strategies are recommended in the treatment of autism, without excluding other approaches when they are necessary. Some authors recommend dietetic or complementary approaches to the treatment of autism, which often arouse great interest among parents but provoke controversy among professionals. Nevertheless, professionals must be informed about these approaches because parents actively ask about them. First, enzymatic disorders and metabolic errors are those most frequently evoked in the literature. The well-known phenylalanine hydroxylase deficit responsible for phenylketonuria has been described as being associated with autism; in this case, an adapted diet prevents mental retardation and autistic symptoms. Some enzymatic errors are also corrected by supplementation with, for example, uridine or ribose, but such supplementation is the responsibility of specialized medical teams in the domain of neurology and cannot be applied by parents alone. Secondly, increased opioid activity due to an excess of peptides is also supposed to be at the origin of some autistic symptoms. Gluten-free or casein-free diets have thus been tested in controlled studies, with contradictory results. With such diets, some studies show symptom regression but others report negative side effects, essentially protein malnutrition. Methodological bias, small sample sizes, the use of various diagnostic criteria and heterogeneity of evaluation interfere with data analysis and interpretation, which has prompted professionals to be cautious with such diets. The third hypothesis emphasized in the literature concerns amino acids. Some autistic children lack certain amino acids, such as glutamic or aspartic acid, and this deficiency

  4. Limitation of Socio-Economic Rights in the 2010 Kenyan Constitution: A Proposal for the Adoption of a Proportionality Approach in the Judicial Adjudication of Socio-Economic Rights Disputes

    Directory of Open Access Journals (Sweden)

    Nicholas Wasonga Orago

    2013-12-01

    Full Text Available On 27 August 2010 Kenya adopted a transformative Constitution with the objective of fighting poverty and inequality as well as improving the standards of living of all people in Kenya. One of the mechanisms in the 2010 Constitution aimed at achieving this egalitarian transformation is the entrenchment of justiciable socio-economic rights (SERs, an integral part of the Bill of Rights. The entrenched SERs require the State to put in place a legislative, policy and programmatic framework to enhance the realisation of its constitutional obligations to respect, protect and fulfill these rights for all Kenyans. These SER obligations, just like any other fundamental human rights obligations, are, however, not absolute and are subject to legitimate limitation by the State. Two approaches have been used in international and comparative national law jurisprudence to limit SERs: the proportionality approach, using a general limitation clause that has found application in international and regional jurisprudence on the one hand; and the reasonableness approach, using internal limitations contained in the standard of progressive realisation, an approach that has found application in the SER jurisprudence of the South African Courts, on the other hand. This article proposes that if the entrenched SERs are to achieve their transformative objectives, Kenyan courts must adopt a proportionality approach in the judicial adjudication of SER disputes. This proposal is based on the reasoning that for the entrenched SERs to have a substantive positive impact on the lives of the Kenyan people, any measure by the government aimed at their limitation must be subjected to strict scrutiny by the courts, a form of scrutiny that can be achieved only by using the proportionality standard entrenched in the article 24 general limitation clause.

  5. Multivariate approaches for stability control of the olive oil reference materials for sensory analysis - part I: framework and fundamentals.

    Science.gov (United States)

    Valverde-Som, Lucia; Ruiz-Samblás, Cristina; Rodríguez-García, Francisco P; Cuadros-Rodríguez, Luis

    2018-02-09

    Virgin olive oil is the only food product for which sensory analysis is regulated in order to classify it into different quality categories. To harmonize the results of the sensory method, the use of standards or reference materials is crucial. The stability of sensory reference materials is required to enable their suitable control, aiming to confirm that their specific target values are maintained on an ongoing basis. Currently, such stability is monitored by means of sensory analysis, and the sensory panels are in the paradoxical situation of controlling the standards that are devoted to controlling the panels. In the present study, several approaches based on similarity analysis are exploited. For each approach, the specific methodology to build a proper multivariate control chart to monitor the stability of the sensory properties is explained and discussed. The normalized Euclidean and Mahalanobis distances, the so-called nearness and hardiness indices respectively, have been defined as new similarity indices to range the values from 0 to 1. Also, the squared mean from Hotelling's T²-statistic and Q²-statistic has been proposed as another similarity index. © 2018 Society of Chemical Industry.
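
    How distance measures can be turned into 0-1 similarity indices for a multivariate control chart is illustrated below in Python/NumPy. The mapping 1/(1 + d) and the variable names are assumptions made for illustration; the exact normalizations behind the nearness and hardiness indices are those defined in the paper.

        import numpy as np

        def similarity_indices(x, reference):
            """Euclidean- and Mahalanobis-based similarity of a new profile x to a
            reference set (rows = control measurements, columns = attributes),
            mapped onto a 0-1 scale where 1 means identical to the reference mean."""
            mu = reference.mean(axis=0)
            cov = np.cov(reference, rowvar=False)
            d_euclid = np.linalg.norm(x - mu)
            d_mahal = float(np.sqrt((x - mu) @ np.linalg.inv(cov) @ (x - mu)))
            return 1.0 / (1.0 + d_euclid), 1.0 / (1.0 + d_mahal)

        rng = np.random.default_rng(0)
        ref = rng.normal(size=(30, 4))                     # 30 control measurements, 4 attributes
        print(similarity_indices(ref.mean(axis=0), ref))   # (1.0, 1.0) at the reference centre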

  6. Development of a multilocus-based approach for sponge (phylum Porifera) identification: refinement and limitations.

    Science.gov (United States)

    Yang, Qi; Franco, Christopher M M; Sorokin, Shirley J; Zhang, Wei

    2017-02-02

    For sponges (phylum Porifera), there is no reliable molecular protocol available for species identification. To address this gap, we developed a multilocus-based Sponge Identification Protocol (SIP) validated by a sample of 37 sponge species belonging to 10 orders from South Australia. The universal barcode COI mtDNA, 28S rRNA gene (D3-D5), and the nuclear ITS1-5.8S-ITS2 region were evaluated for their suitability and capacity for sponge identification. The highest Bit Score was applied to infer the identity. The reliability of SIP was validated by phylogenetic analysis. The 28S rRNA gene and COI mtDNA performed better than the ITS region in classifying sponges at various taxonomic levels. A major limitation is that the databases are not well populated and possess low diversity, making it difficult to conduct the molecular identification protocol. The identification is also impacted by the accuracy of the morphological classification of the sponges whose sequences have been submitted to the database. Re-examination of the morphological identification further demonstrated and improved the reliability of sponge identification by SIP. Integrated with morphological identification, the multilocus-based SIP offers an improved protocol for more reliable and effective sponge identification, by coupling the accuracy of different DNA markers.

  7. Yeast biomass production: a new approach in glucose-limited feeding strategy

    Directory of Open Access Journals (Sweden)

    Érika Durão Vieira

    2013-01-01

    Full Text Available The aim of this work was to implement experimentally a simple glucose-limited feeding strategy for yeast biomass production in a bubble column reactor, based on a spreadsheet simulator suitable for industrial application. In biomass production processes using Saccharomyces cerevisiae strains, one of the constraints is the strong tendency of these species to metabolize sugars anaerobically due to catabolite repression, leading to low values of biomass yield on substrate. The usual strategy to control this metabolic tendency is the use of a fed-batch process in which the sugar source is fed incrementally and the total sugar concentration in the broth is maintained below a determined value. The simulator presented in this work was developed to control molasses feeding on the basis of a simple theoretical model that takes into account the nutritional growth needs of the yeast cell and two input data: the theoretical specific growth rate and the initial cell biomass. In the experimental assay, a commercial baker's yeast strain and molasses as the sugar source were used. Experimental results showed an overall biomass yield on substrate of 0.33, a biomass increase of 6.4-fold and a specific growth rate of 0.165 h-1, in contrast to the predicted value of 0.180 h-1 in the second stage simulation.
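
    A glucose-limited feeding strategy of this kind is usually implemented as an exponential feed that just matches the sugar demand of the growing biomass. The Python sketch below shows the standard relation F(t) = (mu/Y_xs + m_s) · X0 · V0 · exp(mu·t) / S_feed; the yield, maintenance and feed-concentration values are illustrative assumptions, not the parameters of the cited spreadsheet simulator.

        import numpy as np

        def molasses_feed_rate(t_h, mu_set=0.165, x0_g_per_L=5.0, v0_L=1.0,
                               yield_xs=0.5, maintenance=0.01, s_feed_g_per_L=250.0):
            """Exponential glucose-limited feed rate F(t) in L/h that supports a chosen
            specific growth rate mu_set (1/h) without letting sugar accumulate."""
            demand = mu_set / yield_xs + maintenance      # g sugar per g biomass per h
            biomass0 = x0_g_per_L * v0_L                  # total initial biomass, g
            return demand * biomass0 * np.exp(mu_set * t_h) / s_feed_g_per_L

        print(molasses_feed_rate(np.arange(0.0, 12.0, 2.0)))   # feed profile over 12 h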

  8. Exploring the Obstacles and the Limits of Sustainable Development. A Theoretical Approach

    Directory of Open Access Journals (Sweden)

    Paula-Carmen Roșca

    2017-03-01

    Full Text Available The term “sustainable” or “sustainability” is currently used so much and in so many fields that it has become basically part of our everyday lives. It has been connected and linked to almost everything related to our living, to our lifestyle: energy, transport, housing, diet, clothing etc. But what does the term “sustainable” really mean? Many people may have heard about sustainable development or sustainability and may have even tried to have a sustainable living but their efforts might not be enough. The present paper is meant to bring forward a few of the limits of “sustainability” concept. Moreover, it is focused on revealing some arguments from the “other side” along with disagreements regarding some of the principles of “sustainable development” and even critics related to its progress, to its achievements. Another purpose of this paper is to draw attention over some of the issues and obstacles which may threaten the future of sustainability. The paper is also meant to highlight the impact that some stakeholders might have on the evolution of sustainable development due to their financial power, on a global scale.

  9. The quasi-classical limit of scattering amplitude - L2-approach for short range potentials

    International Nuclear Information System (INIS)

    Yajima, K.; Vienna Univ.

    1984-01-01

    We are concerned with the asymptotic behaviour, as Planck's constant h → 0, of the scattering operator S^h associated with the pair of Schroedinger equations iℏ ∂u/∂t = -(ℏ²/2m)Δu + V(x)u ≡ H^h u and iℏ ∂u/∂t = -(ℏ²/2m)Δu ≡ H₀^h u, where ℏ = h/2π. We shall show, under certain conditions, that the scattering matrix Ŝ^h(p,q), the distribution kernel of S^h in the momentum representation, may be expressed in terms of a Fourier integral operator. Then, applying the stationary phase method to it, we shall prove that Ŝ^h has an asymptotic expansion in powers of ℏ up to any order in L²-space, and that, generically, the limit as h → 0 of the total cross section is twice that of classical mechanics. (Author)

  10. Multicore in production: advantages and limits of the multiprocess approach in the ATLAS experiment

    International Nuclear Information System (INIS)

    Binet, S; Calafiura, P; Lavrijsen, W; Leggett, C; Tatarkhanov, M; Tsulaia, V; Jha, M K; Lesny, D; Severini, H; Smith, D; Snyder, S; VanGemmeren, P; Washbrook, A

    2012-01-01

    The shared memory architecture of multicore CPUs provides HEP developers with the opportunity to reduce the memory footprint of their applications by sharing memory pages between the cores in a processor. ATLAS pioneered the multi-process approach to parallelize HEP applications. Using Linux fork() and the Copy On Write mechanism we implemented a simple event task farm, which allowed us to achieve sharing of almost 80% of memory pages among event worker processes for certain types of reconstruction jobs with negligible CPU overhead. By leaving the task of managing shared memory pages to the operating system, we have been able to parallelize large reconstruction and simulation applications originally written to be run in a single thread of execution with little to no change to the application code. The process of validating AthenaMP for production took ten months of concentrated effort and is expected to continue for several more months. Besides validating the software itself, an important and time-consuming aspect of running multicore applications in production was to configure the ATLAS distributed production system to handle multicore jobs. This entailed defining multicore batch queues, where the unit resource is not a core, but a whole computing node; monitoring the output of many event workers; and adapting the job definition layer to handle computing resources with different event throughputs. We will present scalability and memory usage studies, based on data gathered both on dedicated hardware and at the CERN Computer Center.
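
    The core mechanism described here, fork() plus copy-on-write page sharing, can be illustrated with a few lines of Python. The sketch below is a deliberately minimal event task farm: the parent builds a large read-mostly structure once and the forked workers read it without duplicating the pages. It only illustrates the mechanism; AthenaMP itself adds work queues, output merging and monitoring, and the names here are not taken from its code. (Requires a POSIX system for os.fork.)

        import os, sys

        def run_event_farm(n_workers, events):
            """Minimal fork-based event task farm: read-only data initialised in the
            parent stays shared with the workers via copy-on-write."""
            shared_geometry = [0.0] * 10_000_000          # stand-in for detector conditions/geometry
            chunks = [events[i::n_workers] for i in range(n_workers)]
            pids = []
            for chunk in chunks:
                pid = os.fork()
                if pid == 0:                              # child: process its share of events
                    for evt in chunk:
                        _ = shared_geometry[evt % len(shared_geometry)]   # read-only access stays shared
                    sys.exit(0)
                pids.append(pid)
            for pid in pids:                              # parent: wait for all workers
                os.waitpid(pid, 0)

        run_event_farm(4, list(range(1000)))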

  11. Angular plasmon response of gold nanoparticles arrays: approaching the Rayleigh limit

    Directory of Open Access Journals (Sweden)

    Marae-Djouda Joseph

    2016-07-01

    Full Text Available The regular arrangement of metal nanoparticles influences their plasmonic behavior. It has previously been demonstrated that the coupling between diffracted waves and plasmon modes can give rise to extremely narrow plasmon resonances. This is the case when the single-particle localized surface plasmon resonance wavelength (λLSP) is very close in value to the Rayleigh anomaly wavelength (λRA) of the nanoparticle array. In this paper, we performed angle-resolved extinction measurements on a 2D array of gold nano-cylinders designed to fulfil the condition λRA<λLSP. Varying the angle of excitation offers a unique possibility to finely tune the value of λRA, thus gradually approaching the condition of coupling between diffracted waves and plasmon modes. The experimental observation of a collective dipolar resonance has been interpreted with a simplified model based on the coupling of evanescent diffracted waves with plasmon modes. Among other plasmon modes, the measurement technique has also revealed, and allowed the study of, a vertical plasmon mode that is only visible in TM polarization at off-normal excitation incidence. The results of numerical simulations, based on the periodic Green's tensor formalism, match the experimental transmission spectra well and show fine details that could go unnoticed if only the experimental data were considered.
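
    Why tilting the illumination tunes λRA can be seen from the standard grating relation along a principal lattice direction, λRA = (a/m)(n ± sinθ) for pitch a, diffraction order m and surrounding index n. The Python sketch below evaluates both branches; it is the generic first-order relation rather than the full 2D dispersion used in the paper, and the 400 nm pitch is an assumed example value.

        import numpy as np

        def rayleigh_anomaly_wavelength(period_nm, theta_deg, n_medium=1.0, order=1):
            """Both branches of the first-order Rayleigh anomaly condition
            lambda_RA = (a/m) * (n_medium +/- sin(theta)) along a principal direction."""
            s = np.sin(np.radians(theta_deg))
            a_over_m = period_nm / order
            return a_over_m * (n_medium + s), a_over_m * abs(n_medium - s)

        for theta in (0.0, 10.0, 20.0):               # degrees off normal
            print(theta, rayleigh_anomaly_wavelength(400.0, theta))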

  12. Revisiting Pocos de Caldas. Application of the co-precipitation approach to establish realistic solubility limits for performance assessment

    International Nuclear Information System (INIS)

    Bruno, J.; Duro, L.; Jordana, S.; Cera, E.

    1996-02-01

    Solubility limits constitute a critical parameter for the determination of the mobility of radionuclides in the near field and the geosphere, and consequently for the performance assessment of nuclear waste repositories. Mounting evidence from natural system studies indicates that trace elements, and consequently radionuclides, are associated with the dynamic cycling of major geochemical components. We have recently developed a thermodynamic approach that takes into consideration the co-precipitation and co-dissolution processes that mainly control this linkage. The approach has been tested in various natural system studies with encouraging results. The Pocos de Caldas natural analogue was one of the sites where a full test of our predictive geochemical modelling capabilities was carried out during the analogue project. We have revisited the Pocos de Caldas data and expanded the trace element solubility calculations by considering the documented trace metal/major ion interactions. This has been done by using the co-precipitation/co-dissolution approach. The outcome is as follows: A satisfactory modelling of the behaviour of U, Zn and REEs is achieved by assuming co-precipitation with ferrihydrite. Strontium concentrations are apparently controlled by its co-dissolution from Sr-rich fluorites. From the performance assessment point of view, the present work indicates that solubility limits calculated using the co-precipitation approach are in close agreement with the actual trace element concentrations. Furthermore, the calculated radionuclide concentrations are 2-4 orders of magnitude lower than conservative solubility limits calculated by assuming equilibrium with individual trace element phases. 34 refs, 18 figs, 13 tabs

  13. Revisiting Pocos de Caldas. Application of the co-precipitation approach to establish realistic solubility limits for performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bruno, J.; Duro, L.; Jordana, S.; Cera, E. [QuantiSci, Barcelona (Spain)

    1996-02-01

    Solubility limits constitute a critical parameter for the determination of the mobility of radionuclides in the near field and the geosphere, and consequently for the performance assessment of nuclear waste repositories. Mounting evidence from natural system studies indicates that trace elements, and consequently radionuclides, are associated with the dynamic cycling of major geochemical components. We have recently developed a thermodynamic approach that takes into consideration the co-precipitation and co-dissolution processes that mainly control this linkage. The approach has been tested in various natural system studies with encouraging results. The Pocos de Caldas natural analogue was one of the sites where a full test of our predictive geochemical modelling capabilities was carried out during the analogue project. We have revisited the Pocos de Caldas data and expanded the trace element solubility calculations by considering the documented trace metal/major ion interactions. This has been done by using the co-precipitation/co-dissolution approach. The outcome is as follows: A satisfactory modelling of the behaviour of U, Zn and REEs is achieved by assuming co-precipitation with ferrihydrite. Strontium concentrations are apparently controlled by its co-dissolution from Sr-rich fluorites. From the performance assessment point of view, the present work indicates that solubility limits calculated using the co-precipitation approach are in close agreement with the actual trace element concentrations. Furthermore, the calculated radionuclide concentrations are 2-4 orders of magnitude lower than conservative solubility limits calculated by assuming equilibrium with individual trace element phases. 34 refs, 18 figs, 13 tabs.

  14. Maximum Entropy Fundamentals

    Directory of Open Access Journals (Sweden)

    F. Topsøe

    2001-09-01

    Full Text Available Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with the available information, which maximizes the entropy. However, this principle focuses only on distributions, and it appears advantageous to bring information-theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we follow up on here. It leads to the consideration of a certain game, the Code Length Game, and, via standard game-theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of the fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model, which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature, with hundreds of applications pertaining to several different fields, and will also serve here as an important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
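
    For the Mean Energy Model the maximum-entropy distribution has the familiar Gibbs form p_i ∝ exp(-β E_i), with β fixed by the moment constraint. The Python sketch below finds β by root-finding on a finite state space; the energy values and the prescribed mean are illustrative only.

        import numpy as np
        from scipy.optimize import brentq

        def maxent_distribution(energies, mean_energy):
            """Maximum-entropy distribution over finitely many states subject to a
            mean-'energy' constraint: p_i proportional to exp(-beta*E_i), with beta
            chosen so that sum_i p_i * E_i equals the prescribed mean."""
            energies = np.asarray(energies, dtype=float)

            def constraint_gap(beta):
                w = np.exp(-beta * (energies - energies.min()))   # shifted for stability
                p = w / w.sum()
                return p @ energies - mean_energy

            beta = brentq(constraint_gap, -50.0, 50.0)
            w = np.exp(-beta * (energies - energies.min()))
            return w / w.sum(), beta

        p, beta = maxent_distribution([1.0, 2.0, 3.0, 4.0], mean_energy=1.7)
        print(p, beta)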

  15. Phase behaviour of symmetric binary mixtures with partially miscible components in slit-like pores. Application of the fundamental measure density functional approach

    CERN Document Server

    Martínez, A; Patrykiejew, A; Sokolowski, S

    2003-01-01

    We investigate adsorption in slit-like pores of model symmetric binary mixtures exhibiting demixing in bulk phase, by using a density functional approach. Our focus is on the evaluation of the first-order phase transitions in adsorbed fluids and the lines separating mixed and demixed phases. The scenario for phase transitions is sensitive to the pore width and to the energy of adsorption. Both these parameters can change the phase diagrams of the confined fluid. In particular, for relatively wide pores and for strong wall-fluid interactions, the demixing line can precede the first-order transition. Moreover, a competition between layering transitions and demixing within particular layers also leads to further enrichment of the phase diagram.

  16. Fundamentals of turbomachines

    CERN Document Server

    Dick, Erik

    2015-01-01

    This book explores the working principles of all kinds of turbomachines. The same theoretical framework is used to analyse the different machine types. Fundamentals are first presented and theoretical concepts are then elaborated for particular machine types, starting with the simplest ones. For each machine type, the author strikes a balance between building basic understanding and exploring knowledge of practical aspects. Readers are invited through challenging exercises to consider how the theory applies to particular cases and how it can be generalised. The book is primarily meant as a course book. It teaches fundamentals and explores applications. It will appeal to senior undergraduate and graduate students in mechanical engineering and to professional engineers seeking to understand the operation of turbomachines. Readers will gain a fundamental understanding of turbomachines. They will also be able to make a reasoned choice of turbomachine for a particular application and to understand its operation...

  17. Fundamentals of piping design

    CERN Document Server

    Smith, Peter

    2013-01-01

    Written for the piping engineer and designer in the field, this two-part series helps to fill a void in piping literature, since the Rip Weaver books of the '90s were taken out of print at the advent of the Computer Aided Design (CAD) era. Technology may have changed, however the fundamentals of piping rules still apply in the digital representation of process piping systems. The Fundamentals of Piping Design is an introduction to the design of piping systems, various processes and the layout of pipe work connecting the major items of equipment for the new hire, the engineering student and the veteran

  18. Infosec management fundamentals

    CERN Document Server

    Dalziel, Henry

    2015-01-01

    Infosec Management Fundamentals is a concise overview of the Information Security management concepts and techniques, providing a foundational template for both experienced professionals and those new to the industry. This brief volume will also appeal to business executives and managers outside of infosec who want to understand the fundamental concepts of Information Security and how it impacts their business decisions and daily activities. Teaches ISO/IEC 27000 best practices on information security management Discusses risks and controls within the context of an overall information securi

  19. Homeschooling and religious fundamentalism

    Directory of Open Access Journals (Sweden)

    Robert Kunzman

    2010-10-01

    Full Text Available This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to contemporary culture; suspicion of institutional authority and professional expertise; parental control and centrality of the family; and interweaving of faith and academics. It is important to recognize, however, that fundamentalism exists on a continuum; conservative religious homeschoolers resist liberal democratic values to varying degrees, and efforts to foster dialogue and accommodation with religious homeschoolers can ultimately help strengthen the broader civic fabric.

  20. Fundamentals of continuum mechanics

    CERN Document Server

    Rudnicki, John W

    2014-01-01

    A concise introductory course text on continuum mechanics Fundamentals of Continuum Mechanics focuses on the fundamentals of the subject and provides the background for formulation of numerical methods for large deformations and a wide range of material behaviours. It aims to provide the foundations for further study, not just of these subjects, but also of the formulations for much more complex material behaviour and their implementation computationally. This book is divided into 5 parts, covering mathematical preliminaries, stress, motion and deformation, balance of mass, momentum and energy

  1. Pragmatic electrical engineering fundamentals

    CERN Document Server

    Eccles, William

    2011-01-01

    Pragmatic Electrical Engineering: Fundamentals introduces the fundamentals of the energy-delivery part of electrical systems. It begins with a study of basic electrical circuits and then focuses on electrical power. Three-phase power systems, transformers, induction motors, and magnetics are the major topics.All of the material in the text is illustrated with completely-worked examples to guide the student to a better understanding of the topics. This short lecture book will be of use at any level of engineering, not just electrical. Its goal is to provide the practicing engineer with a practi

  2. Fundamentals of reactor chemistry

    International Nuclear Information System (INIS)

    Akatsu, Eiko

    1981-12-01

    In the Nuclear Engineering School of JAERI, many courses are presented for people working in and around nuclear reactors. The curricula of the courses also contain the subject material of chemistry. With reference to foreign curricula, a plan for the educational subject material of chemistry in the Nuclear Engineering School of JAERI was considered, and the fundamental part of reactor chemistry is reviewed in this report. Since the students of the Nuclear Engineering School are not chemists, the knowledge necessary in and around nuclear reactors is emphasized in order to familiarize the students with reactor chemistry. Teaching experience of the fundamentals of reactor chemistry is also given. (author)

  3. A risk modelling approach for setting microbiological limits using enterococci as indicator for growth potential of Salmonella in pork.

    Science.gov (United States)

    Bollerslev, Anne Mette; Nauta, Maarten; Hansen, Tina Beck; Aabo, Søren

    2017-01-02

    Microbiological limits are widely used in food processing as an aid to reduce the exposure of consumers to hazardous microorganisms. However, in pork, the prevalence and concentrations of Salmonella are generally low, and microbiological limits are not considered an efficient tool to support hygiene interventions. The objective of the present study was to develop an approach which could make it possible to define potential risk-based microbiological limits for an indicator, enterococci, in order to evaluate the risk from potential growth of Salmonella. A positive correlation between the concentration of enterococci and the prevalence and concentration of Salmonella was shown for 6640 pork samples taken at Danish cutting plants and retail butchers. The samples were collected in five different studies in 2001, 2002, 2010, 2011 and 2013. The observations that both Salmonella and enterococci are carried in the intestinal tract, contaminate pork by the same mechanisms and share similar growth characteristics (lag phase and maximum specific growth rate) at temperatures around 5-10°C suggest that enterococci have potential as an indicator of the potential growth of Salmonella in pork. Elevated temperatures during processing will lead to growth of enterococci and, if present, also of Salmonella. By combining the correlation between enterococci and Salmonella with risk modelling, it is possible to predict the risk of salmonellosis based on the level of enterococci. The risk model used for this purpose includes the dose-response relationship for Salmonella and a reduction factor to account for preparation of the fresh pork. By use of the risk model, it was estimated that the majority of salmonellosis cases caused by the consumption of pork in Denmark is caused by the small fraction of pork products that has enterococci concentrations above 5 log CFU/g. This illustrates that our approach can be used to evaluate the potential effect of different microbiological limits.
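
    The chain from concentration to risk in such a model can be sketched as: concentration, reduced by a fixed log-reduction for preparation, multiplied by serving size to give the dose, which is then passed through a Salmonella dose-response curve. The Python sketch below uses a generic beta-Poisson dose-response with commonly quoted literature parameters purely for illustration; they are not the values fitted in this study, and the serving size and log-reduction are assumed example values.

        def salmonellosis_risk(log10_conc_cfu_g, serving_g=100.0, log10_reduction=2.0,
                               alpha=0.1324, beta=51.45):
            """Illness risk per serving: dose after preparation losses, then a
            beta-Poisson dose-response P = 1 - (1 + dose/beta)**(-alpha)."""
            dose = serving_g * 10.0 ** (log10_conc_cfu_g - log10_reduction)
            return 1.0 - (1.0 + dose / beta) ** (-alpha)

        for log_c in (-2.0, 0.0, 2.0):                # log10 CFU/g in the raw product
            print(log_c, salmonellosis_risk(log_c))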

  4. A Life-cycle Approach to Improve the Sustainability of Rural Water Systems in Resource-Limited Countries

    Directory of Open Access Journals (Sweden)

    Nicholas Stacey

    2012-11-01

    Full Text Available A WHO and UNICEF joint report states that in 2008, 884 million people lacked access to potable drinking water. A life-cycle approach to developing potable water systems may improve the sustainability of such systems; however, a review of the literature shows that such an approach has primarily been used for urban systems located in resourced countries. Although urbanization is increasing globally, over 40 percent of the world's population is currently rural, with many considered poor. In this paper, we present a first step towards using life-cycle assessment to develop sustainable rural water systems in resource-limited countries while pointing out the needs. For example, while there are few differences in costs and environmental impacts among many improved rural water system options, a system that uses groundwater with community standpipes is substantially lower in cost than other alternatives, with a somewhat lower environmental inventory. However, an LCA approach shows that from institutional as well as community and managerial perspectives, sustainability includes many other factors besides cost and environment that are a function of the interdependent decision process used across the life cycle of a water system by aid organizations, water user committees, and household users. These factors often present the biggest challenge to designing sustainable rural water systems for resource-limited countries.

  5. Development of system based code for integrity of FBR. Fundamental probabilistic approach, Part 1: Model calculation of creep-fatigue damage (Research report)

    International Nuclear Information System (INIS)

    Kawasaki, Nobuchika; Asayama, Tai

    2001-09-01

    Both reliability and safety have to be further improved for the successful commercialization of FBRs. At the same time, construction and operation costs need to be reduced to the level of future LWRs. To realize compatibility among reliability, safety, and cost, the Structural Mechanics Research Group in JNC started the development of a System Based Code for Integrity of FBR. This code extends the present structural design standard to include the areas of fabrication, installation, plant system design, safety design, operation and maintenance, and so on. A quantitative index is necessary to connect the different partial standards in this code, and failure probability is considered a candidate index. We therefore decided to make a model calculation using failure probability and judge its applicability. We first investigated other probabilistic standards, such as ASME Code Case N-578. A probabilistic approach to structural integrity evaluation was created based on these results, and an evaluation flow was also proposed. According to this flow, a model calculation of creep-fatigue damage was performed. This trial calculation was for a vessel in a sodium-cooled FBR. As a result of this model calculation, the crack initiation probability and the crack penetration probability were found to be effective indices. Lastly, we discuss the merits of this System Based Code, which are presented in this report. Furthermore, this report presents future development tasks. (author)

  6. DOE fundamentals handbook: Material science

    International Nuclear Information System (INIS)

    1993-01-01

    This handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of the structure and properties of metals. This volume contains the following modules: thermal shock (thermal stress, pressurized thermal shock), brittle fracture (mechanism, minimum pressurization-temperature curves, heatup/cooldown rate limits), and plant materials (properties considered when selecting materials, fuel materials, cladding and reflectors, control materials, nuclear reactor core problems, plant material problems, atomic displacement due to irradiation, thermal and displacement spikes due to irradiation, neutron capture effect, radiation effects in organic compounds, reactor use of aluminum)

  7. Fundamentals of astrodynamics

    NARCIS (Netherlands)

    Wakker, K.F.

    2015-01-01

    This book deals with the motion of the center of mass of a spacecraft; this discipline is generally called astrodynamics. The book focuses on an analytical treatment of the motion of spacecraft and provides insight into the fundamentals of spacecraft orbit dynamics. A large number of topics are

  8. Safety analysis fundamentals

    International Nuclear Information System (INIS)

    Wright, A.C.D.

    2002-01-01

    This paper discusses safety analysis fundamentals in reactor design. It covers safety analysis done to show that the consequences of postulated accidents are acceptable. Safety analysis is also used to set the design of special safety systems and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, to maintain an operating license, and to support changes in plant operations

  9. Fundamentals and Optimal Institutions

    DEFF Research Database (Denmark)

    Gonzalez-Eiras, Martin; Harmon, Nikolaj Arpe; Rossi, Martín

    2016-01-01

    of regulatory institutions such as revenue sharing, salary caps or luxury taxes. We show, theoretically and empirically, that these large differences in adopted institutions can be rationalized as optimal responses to differences in the fundamental characteristics of the sports being played. This provides...

  10. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  11. Industrial separation processes : fundamentals

    NARCIS (Netherlands)

    Haan, de A.B.; Bosch, Hans

    2013-01-01

    Separation processes on an industrial scale comprise well over half of the capital and operating costs. They are basic knowledge in every chemical engineering and process engineering study. This book provides comprehensive and fundamental knowledge of university teaching in this discipline,

  12. Fundamental partial compositeness

    DEFF Research Database (Denmark)

    Sannino, Francesco; Strumia, Alessandro; Tesi, Andrea

    2016-01-01

    We construct renormalizable Standard Model extensions, valid up to the Planck scale, that give a composite Higgs from a new fundamental strong force acting on fermions and scalars. Yukawa interactions of these particles with Standard Model fermions realize the partial compositeness scenario. Unde...

  13. Grenoble Fundamental Research Department

    International Nuclear Information System (INIS)

    1979-01-01

    A summary of the various activities of the Fundamental Research Institute, Grenoble, France is given. The following fields are covered: Nuclear physics, solid state physics, physical chemistry, biology and advanced techniques. For more detailed descriptions readers are referred to scientific literature [fr

  14. Fundamentals of Fire Phenomena

    DEFF Research Database (Denmark)

    Quintiere, James

    analyses. Fire phenomena encompass everything about the scientific principles behind fire behaviour. Combining the principles of chemistry, physics, heat and mass transfer, and fluid dynamics necessary to understand the fundamentals of fire phenomena, this book integrates the subject into a clear...

  15. Fundamental Metallurgy of Solidification

    DEFF Research Database (Denmark)

    Tiedje, Niels

    2004-01-01

    The text takes the reader through some fundamental aspects of solidification, with focus on understanding the basic physics that govern solidification in casting and welding. It is described how the first solid is formed and which factors affect nucleation. It is described how crystals grow from...

  16. Fundamentals of Diesel Engines.

    Science.gov (United States)

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…

  17. Introduction and fundamentals

    International Nuclear Information System (INIS)

    Thomas, R.H.

    1980-01-01

    This introduction discusses advances in the fundamental sciences which underlie the applied science of health physics and radiation protection. Risk assessments in nuclear medicine are made by defining the conditions of exposure, identification of adverse effects, relating exposure with effect, and estimation of the overall risk for ionizing radiations

  18. Fundamentals of plasma physics

    CERN Document Server

    Bittencourt, J A

    1986-01-01

    A general introduction designed to present a comprehensive, logical and unified treatment of the fundamentals of plasma physics based on statistical kinetic theory. Its clarity and completeness make it suitable for self-learning and self-paced courses. Problems are included.

  19. Fast fundamental frequency estimation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm; Jensen, Jesper Rindom

    2017-01-01

    Modelling signals as being periodic is common in many applications. Such periodic signals can be represented by a weighted sum of sinusoids with frequencies being an integer multiple of the fundamental frequency. Due to its widespread use, numerous methods have been proposed to estimate the funda...
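
    The harmonic model in this abstract can be illustrated with a brute-force estimator: for every candidate fundamental frequency, project the signal onto its first few harmonics and pick the candidate with the largest summed power. The Python sketch below does exactly that; it illustrates the signal model only, not the fast algorithms that are the subject of the paper, and all parameter values are illustrative.

        import numpy as np

        def estimate_f0(x, fs, f0_grid, n_harmonics=5):
            """Approximate NLS / harmonic-summation pitch estimate: sum the signal power
            at the first n_harmonics multiples of each candidate fundamental frequency
            and return the candidate with the largest sum."""
            n = len(x)
            t = np.arange(n) / fs
            scores = []
            for f0 in f0_grid:
                harmonics = np.exp(-2j * np.pi * f0 * np.outer(np.arange(1, n_harmonics + 1), t))
                scores.append(np.sum(np.abs(harmonics @ x) ** 2))
            return f0_grid[int(np.argmax(scores))]

        fs = 8000.0
        t = np.arange(0, 0.1, 1 / fs)
        signal = sum(np.sin(2 * np.pi * 220.0 * k * t) / k for k in range(1, 4))
        print(estimate_f0(signal, fs, np.arange(80.0, 400.0, 1.0)))   # about 220 Hz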

  20. Fundamentals of ultrasonic phased arrays

    CERN Document Server

    Schmerr, Lester W

    2014-01-01

    This book describes in detail the physical and mathematical foundations of ultrasonic phased array measurements.?The book uses linear systems theory to develop a comprehensive model of the signals and images that can be formed with phased arrays. Engineers working in the field of ultrasonic nondestructive evaluation (NDE) will find in this approach a wealth of information on how to design, optimize and interpret ultrasonic inspections with phased arrays. The fundamentals and models described in the book will also be of significant interest to other fields, including the medical ultrasound and

  1. Benefits and limitations of a multidisciplinary approach to individualized management of Cornelia de Lange syndrome and related diagnoses.

    Science.gov (United States)

    January, Kathleen; Conway, Laura J; Deardorff, Matthew; Harrington, Ann; Krantz, Ian D; Loomes, Kathleen; Pipan, Mary; Noon, Sarah E

    2016-06-01

    Given the clinical complexities of Cornelia de Lange Syndrome (CdLS), the Center for CdLS and Related Diagnoses at The Children's Hospital of Philadelphia (CHOP) and The Multidisciplinary Clinic for Adolescents and Adults at Greater Baltimore Medical Center (GBMC) were established to develop a comprehensive approach to clinical management and research issues relevant to CdLS. Little work has been done to evaluate the general utility of a multispecialty approach to patient care. Previous research demonstrates several advantages and disadvantages of multispecialty care. This research aims to better understand the benefits and limitations of a multidisciplinary clinic setting for individuals with CdLS and related diagnoses. Parents of children with CdLS and related diagnoses who have visited a multidisciplinary clinic (N = 52) and who have not visited a multidisciplinary clinic (N = 69) were surveyed to investigate their attitudes. About 90.0% of multispecialty clinic attendees indicated a preference for multidisciplinary care. However, some respondents cited a need for additional clinic services including more opportunity to meet with other specialists (N = 20), such as behavioral health, and increased information about research studies (N = 15). Travel distance and expenses often prevented families' multidisciplinary clinic attendance (N = 41 and N = 35, respectively). Despite identified limitations, these findings contribute to the evidence demonstrating the utility of a multispecialty approach to patient care. This approach ultimately has the potential to not just improve healthcare for individuals with CdLS but for those with medically complex diagnoses in general. © 2016 Wiley Periodicals, Inc.

  2. Fundamental Properties of Salts

    Energy Technology Data Exchange (ETDEWEB)

    Toni Y Gutknecht; Guy L Fredrickson

    2012-11-01

    Thermal properties of molten salt systems are of interest to electrorefining operations pertaining to both the Fuel Cycle Research & Development Program (FCR&D) and the Spent Fuel Treatment Mission currently being pursued by the Department of Energy (DOE). The phase stability of molten salts in an electrorefiner may be adversely impacted by the build-up of fission products in the electrolyte. Potential situations that need to be avoided during electrorefining operations include (i) build-up of fissile elements in the salt approaching the criticality limits specified for the vessel, (ii) electrolyte freezing at the operating temperature of the electrorefiner due to changes in the liquidus temperature, and (iii) phase separation (non-homogeneous solution). The stability (and homogeneity) of the phases can be monitored by studying the thermal characteristics of the molten salts as a function of impurity concentration. Simulated salt compositions consisting of the selected rare earth and alkaline earth chlorides, with a eutectic mixture of LiCl-KCl as the carrier electrolyte, were studied to determine the melting points (thermal characteristics) using a Differential Scanning Calorimeter (DSC). The experimental data were used to model the liquidus temperature. On the basis of this data, it became possible to predict a spent fuel treatment processing scenario under which electrorefining could no longer be performed as a result of increasing liquidus temperatures of the electrolyte.

  3. Fundamental Structure of Loop Quantum Gravity

    Science.gov (United States)

    Han, Muxin; Ma, Yongge; Huang, Weiming

    In the last twenty years, loop quantum gravity, a background independent approach to unifying general relativity and quantum mechanics, has been widely investigated. The aim of loop quantum gravity is to construct a mathematically rigorous, background independent, non-perturbative quantum theory for a Lorentzian gravitational field on a four-dimensional manifold. In this approach, the principles of quantum mechanics are combined with those of general relativity naturally. Such a combination provides us with a picture of, so-called, quantum Riemannian geometry, which is discrete at the fundamental scale. Imposing the quantum constraints in analogy with the classical ones, the quantum dynamics of gravity is being studied as one of the most important issues in loop quantum gravity. On the other hand, semi-classical analysis is being carried out to test the classical limit of the quantum theory. In this review, the fundamental structure of loop quantum gravity is presented pedagogically. Our main aim is to help non-experts understand the motivations, basic structures, as well as general results. It may also be beneficial to practitioners seeking to gain insights from different perspectives on the theory. We will focus on the theoretical framework itself, rather than its applications, and do our best to write it in modern and precise language while keeping the presentation accessible for beginners. After reviewing the classical connection dynamical formalism of general relativity, as a foundation, the construction of the kinematical Ashtekar-Isham-Lewandowski representation is introduced in the context of quantum kinematics. The algebraic structure of quantum kinematics is also discussed. In the context of quantum dynamics, we mainly introduce the construction of a Hamiltonian constraint operator and the master constraint project. Finally, some applications and recent advances are outlined. It should be noted that this strategy of quantizing gravity can also be extended to

  4. Fundamentals of differential beamforming

    CERN Document Server

    Benesty, Jacob; Pan, Chao

    2016-01-01

    This book provides a systematic study of the fundamental theory and methods of beamforming with differential microphone arrays (DMAs), or differential beamforming in short. It begins with a brief overview of differential beamforming and some popularly used DMA beampatterns such as the dipole, cardioid, hypercardioid, and supercardioid, before providing essential background knowledge on orthogonal functions and orthogonal polynomials, which form the basis of differential beamforming. From a physical perspective, a DMA of a given order is defined as an array that measures the differential acoustic pressure field of that order; such an array has a beampattern in the form of a polynomial whose degree is equal to the DMA order. Therefore, the fundamental and core problem of differential beamforming boils down to the design of beampatterns with orthogonal polynomials. But certain constraints also have to be considered so that the resulting beamformer does not seriously amplify the sensors’ self noise and the mism...
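
    First-order differential beampatterns of the kind listed in this abstract are all of the polynomial form B(θ) = α + (1 − α)cos θ. The Python sketch below evaluates this family for the standard textbook choices of α; it serves only as an illustration of the polynomial-beampattern idea discussed in the book.

        import numpy as np

        def first_order_pattern(theta_rad, alpha):
            """First-order differential beampattern B(theta) = alpha + (1 - alpha)*cos(theta):
            alpha = 0 gives the dipole, 0.5 the cardioid, 0.25 the hypercardioid and
            roughly 0.366 the supercardioid."""
            return alpha + (1.0 - alpha) * np.cos(theta_rad)

        patterns = {"dipole": 0.0, "cardioid": 0.5, "hypercardioid": 0.25, "supercardioid": 0.366}
        rear = {name: round(float(abs(first_order_pattern(np.pi, a))), 3)
                for name, a in patterns.items()}
        print(rear)   # response magnitude toward 180 degrees (the rear) for each pattern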

  5. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, A.; Murthy, S.

    2007-06-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS_3 x S^{d-1} x T^{8-d} (d = 3, ..., 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings. (author)

  6. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, Atish; Murthy, Sameer

    2008-01-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS_3 x S^{d-1} x T^{8-d} (d = 3, ..., 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings

  7. What is Fundamental?

    CERN Multimedia

    2004-01-01

    Discussing what is fundamental in a variety of fields, biologist Richard Dawkins, physicist Gerardus 't Hooft, and mathematician Alain Connes spoke to a packed Main Auditorium at CERN 15 October. Dawkins, Professor of the Public Understanding of Science at Oxford University, explained simply the logic behind Darwinian natural selection, and how it would seem to apply anywhere in the universe that had the right conditions. 't Hooft, winner of the 1999 Physics Nobel Prize, outlined some of the main problems in physics today, and said he thinks physics is so fundamental that even alien scientists from another planet would likely come up with the same basic principles, such as relativity and quantum mechanics. Connes, winner of the 1982 Fields Medal (often called the Nobel Prize of Mathematics), explained how physics is different from mathematics, which he described as a "factory for concepts," unfettered by connection to the physical world. On 16 October, anthropologist Sharon Traweek shared anecdotes from her ...

  8. Approaching the limits

    African Journals Online (AJOL)

    From the 4th – 17th December 2016, the parties of the Convention for Biodiversity held their 13th conference in Cancún, Mexico. At the event, a revised red list was produced. On the list are some species featured for the first time. Others were down-listed, or moved into categories more dire than previously was the case.

  9. Fundamentals of gas counters

    International Nuclear Information System (INIS)

    Bateman, J.E.

    1994-01-01

    The operation of gas counters used for detecting radiation is explained in terms of the four fundamental physical processes which govern their operation. These are 1) conversion of neutral radiation into charged particles, 2) ionization of the host gas by a fast charged particle, 3) transport of the gas ions to the electrodes, and 4) amplification of the electrons in a region of enhanced electric field. Practical implications of these are illustrated. (UK)
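    As a brief illustration of the fourth process (a standard relation, not taken from this report), the avalanche multiplication of electrons in the high-field region is governed by the first Townsend coefficient \alpha:

      M = \exp\!\left(\int_{x_1}^{x_2}\alpha(x)\,\mathrm{d}x\right), \qquad M = e^{\alpha d}\ \text{for a uniform field over a gap } d,

    so the gas gain grows exponentially with the avalanche path length and with \alpha, which itself depends strongly on the reduced electric field E/p.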

  10. Fundamentals of Filament Interaction

    Science.gov (United States)

    2017-05-19

    AFRL-AFOSR-VA-TR-2017-0110, Fundamentals of Filament Interaction; Martin Richardson, University of Central Florida; final report, 06/02/2017, under grant FA95501110001 (only the report documentation page is available for this record).

  11. Fundamentals of radiological protection

    International Nuclear Information System (INIS)

    Wells, J.; Mill, A.J.; Charles, M.W.

    1978-05-01

    The basic processes of living cells which are relevant to an understanding of the interaction of ionizing radiation with man are described. Particular reference is made to cell death, cancer induction and genetic effects. This is the second of a series of reports which present the fundamentals necessary for an understanding of the bases of regulatory criteria such as those recommended by the International Commission on Radiological Protection (ICRP). Others consider basic radiation physics and the biological effects of ionizing radiation. (author)

  12. Fundamentals of linear algebra

    CERN Document Server

    Dash, Rajani Ballav

    2008-01-01

    Fundamentals of Linear Algebra is a comprehensive textbook that can be used by students and teachers of all Indian universities. The text is written in an easy, understandable form and covers all topics of the UGC curriculum. There are many worked-out examples that help students solve problems on their own. The problem sets have been designed with the questions asked in different examinations in mind.

  13. Fundamentals of queueing theory

    CERN Document Server

    Gross, Donald; Thompson, James M; Harris, Carl M

    2013-01-01

    Praise for the Third Edition: "This is one of the best books available. Its excellent organizational structure allows quick reference to specific models and its clear presentation . . . solidifies the understanding of the concepts being presented." -IIE Transactions on Operations Engineering. Thoroughly revised and expanded to reflect the latest developments in the field, Fundamentals of Queueing Theory, Fourth Edition continues to present the basic statistical principles that are necessary to analyze the probabilistic nature of queues. Rather than pre

  14. High voltage engineering fundamentals

    CERN Document Server

    Kuffel, E; Hammond, P

    1984-01-01

    Provides a comprehensive treatment of high voltage engineering fundamentals at the introductory and intermediate levels. It covers: techniques used for generation and measurement of high direct, alternating and surge voltages for general application in industrial testing and selected special examples found in basic research; analytical and numerical calculation of electrostatic fields in simple practical insulation system; basic ionisation and decay processes in gases and breakdown mechanisms of gaseous, liquid and solid dielectrics; partial discharges and modern discharge detectors; and over

  15. Biomedical engineering fundamentals

    CERN Document Server

    Bronzino, Joseph D

    2014-01-01

    Known as the bible of biomedical engineering, The Biomedical Engineering Handbook, Fourth Edition, sets the standard against which all other references of this nature are measured. As such, it has served as a major resource for both skilled professionals and novices to biomedical engineering.Biomedical Engineering Fundamentals, the first volume of the handbook, presents material from respected scientists with diverse backgrounds in physiological systems, biomechanics, biomaterials, bioelectric phenomena, and neuroengineering. More than three dozen specific topics are examined, including cardia

  16. Fundamental concepts on energy

    International Nuclear Information System (INIS)

    Rodriguez, M.H.

    1998-01-01

    The fundamental concepts of energy and the different forms in which it is manifested are presented. Since energy can be transformed from one form to another, the laws that govern these transformations are discussed. Energy transformation processes are an essential component of humankind's capacity to survive and develop. Energy use carries important economic, technical and political implications. For this reason, any decision on how to manage an energy system will be key to our future life

  17. Fundamentals of powder metallurgy

    International Nuclear Information System (INIS)

    Khan, I.H.; Qureshi, K.A.; Minhas, J.I.

    1988-01-01

    This book is presented to introduce the fundamentals of powder metallurgy technology. An attempt has been made to present an overall view of powder metallurgy technology in the first chapter, whereas chapters 2 to 8 deal with the production of metal powders. The basic commercial methods of powder production are briefly described with illustrations. Chapters 9 to 12 briefly describe metal powder characteristics and the principles of testing, mixing, blending, conditioning, compaction and sintering. (orig./A.B.)

  18. Fundamentals of Physical Volcanology

    Science.gov (United States)

    Marsh, Bruce

    2010-04-01

    Fundamentals haunt me. Certain words ignite unavoidable trains of thought, trains that begin in a cascade, unexpectedly leaping chasm after chasm, rushing from single words to whole paragraphs to full books to men's lives. So it is with me with seeing the word “fundamental” in print. I cannot evade the euphoric excitement of thinking that someone has found something terribly original and simple, understandable by every journeyman, explaining everything.

  19. Fundamentals of radiological protection

    International Nuclear Information System (INIS)

    Mill, A.J.; Charles, M.W.; Wells, J.

    1978-04-01

    A review is presented of basic radiation physics with particular relevance to radiological protection. The processes leading to the production and absorption of ionising radiation are outlined, together with the important dosimetric quantities and their units of measurement. The review is the first of a series of reports presenting the fundamentals necessary for an understanding of the basis of regulatory criteria such as those recommended by the ICRP. (author)

  20. Value of Fundamental Science

    Science.gov (United States)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of the fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by the technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the ``mystery of the universe''. Why do these words sound so attractive? What is implied by and what is incompatible with them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of value of pure science on metaphysics? If not, how can this issue be addressed in the public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  1. Individual differences in fundamental social motives.

    Science.gov (United States)

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. The role of dose limitation and optimization in intervention. Approaches to the remediation of contaminated sites in Germany

    International Nuclear Information System (INIS)

    Goldammer, W.; Helming, M.; Kuehnel, G.; Landfermann, H.-H.

    2000-01-01

    The clean-up of contaminated sites requires appropriate and efficient methodologies for decision-making about the priorities and extent of remedial measures, aiming at the two, usually conflicting, goals of protecting people and the environment and of saving money and other resources. Finding the cost-effective balance between these two primary objectives is often complicated by several factors. Sensible decision-making in this situation requires the use of appropriate methodologies and tools which assist in identifying and implementing the optimal solution. The paper discusses an approach developed in Germany to achieve environmentally sound and cost-effective solutions. A basic requirement within the German approach is the limitation of individual doses in order to limit inequity between people exposed. An Action Level of 1 mSv per annum is used in this sense for the identification of sites that require further investigation and, potentially, remediation. On the basis of this individual dose related criterion, secondary reference levels for directly measurable quantities such as activity concentrations have been derived, facilitating the practical application of the Action Level Concept. Decisions on remedial action, in particular for complex sites, are based on justification and optimization analyses. These take into consideration a variety of different contaminants and risks to humans and the environment arising on various exposure pathways. The optimization analyses, carried out to identify optimal remediation options, address radiological risks as well as short and long term costs within a cost-benefit analysis framework. Other relevant factors of influence, e.g. chemical risks or ecological damage, are incorporated as well. Comprehensive methodologies utilizing probabilistic methods have been developed to assess site conditions and possible remediation options on this basis. The approaches developed are applied within the German uranium mine rehabilitation program

  3. The role of dose limitation and optimization in intervention. Approaches to the remediation of contaminated sites in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Goldammer, W. [Brenk Systemplanung GmbH, Aachen (Germany); Helming, M.; Kuehnel, G.; Landfermann, H.-H. [Federal Ministry for the Environment, Nature Conservation and Nuclear Safety, Bonn (Germany)

    2000-05-01

    The clean-up of contaminated sites requires appropriate and efficient methodologies for decision-making about the priorities and extent of remedial measures, aiming at the two, usually conflicting, goals of protecting people and the environment and of saving money and other resources. Finding the cost-effective balance between these two primary objectives is often complicated by several factors. Sensible decision-making in this situation requires the use of appropriate methodologies and tools which assist in identifying and implementing the optimal solution. The paper discusses an approach developed in Germany to achieve environmentally sound and cost-effective solutions. A basic requirement within the German approach is the limitation of individual doses in order to limit inequity between people exposed. An Action Level of 1 mSv per annum is used in this sense for the identification of sites that require further investigation and, potentially, remediation. On the basis of this individual dose related criterion, secondary reference levels for directly measurable quantities such as activity concentrations have been derived, facilitating the practical application of the Action Level Concept. Decisions on remedial action, in particular for complex sites, are based on justification and optimization analyses. These take into consideration a variety of different contaminants and risks to humans and the environment arising on various exposure pathways. The optimization analyses, carried out to identify optimal remediation options, address radiological risks as well as short and long term costs within a cost-benefit analysis framework. Other relevant factors of influence, e.g. chemical risks or ecological damage, are incorporated as well. Comprehensive methodologies utilizing probabilistic methods have been developed to assess site conditions and possible remediation options on this basis. The approaches developed are applied within the German uranium mine rehabilitation program

  4. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to an important novel CI technology: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNNs, including a new class of FNN, cascade neo-fuzzy neural networks, are considered and their training algorithms are described and analyzed. The applications of FNN to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty, a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, as well as an application to portfolio optimization on the Ukrainian, Russian and American stock exchanges. The book also presents the problem of corporate bankruptcy risk forecasting under incomplete and fuzzy information, as well as new methods based on fuzzy sets theory and fuzzy neural networks and results of their application for bankruptcy ris...

  5. Fundamentals of GPS Receivers A Hardware Approach

    CERN Document Server

    Doberstein, Dan

    2012-01-01

    While much of the current literature on GPS receivers is aimed at those intimately familiar with their workings, this volume summarizes the basic principles using as little mathematics as possible, and details the necessary specifications and circuits for constructing a GPS receiver that is accurate to within 300 meters. Dedicated sections deal with the features of the GPS signal and its data stream, the details of the receiver (using a hybrid design as exemplar), and more advanced receivers and topics including time and frequency measurements. Later segments discuss the Zarlink GPS receiver chip set, as well as providing a thorough examination of the TurboRogue receiver, one of the most accurate yet made. Guiding the reader through the concepts and circuitry, from the antenna to the solution of user position, the book’s deployment of a hybrid receiver as a basis for discussion allows for extrapolation of the core ideas to more complex, and more accurate designs. Digital methods are used, but any analogue c...

  6. Fundamentals of semiconductor devices

    CERN Document Server

    Lindmayer, Joseph

    1965-01-01

    Semiconductor properties; semiconductor junctions or diodes; transistor fundamentals; inhomogeneous impurity distributions, drift or graded-base transistors; high-frequency properties of transistors; band structure of semiconductors; high current densities and mechanisms of carrier transport; transistor transient response and recombination processes; surfaces, field-effect transistors, and composite junctions; additional semiconductor characteristics; additional semiconductor devices and microcircuits; more metal, insulator, and semiconductor combinations for devices; four-pole parameters and configuration rotation; four-poles of combined networks and devices; equivalent circuits; the error function and its properties; Fermi-Dirac statistics; useful physical constants.

  7. Fundamentals of radiological protection

    International Nuclear Information System (INIS)

    Charles, M.W.; Wells, J.; Mill, A.J.

    1978-04-01

    A brief review is presented of the early and late effects of ionising radiation on man, with particular emphasis on those aspects of importance in radiological protection. The terminology and dose response curves are explained. Early effects on cells, tissues and whole organs are discussed. Late somatic effects considered include cancer and life-span shortening. Genetic effects are examined. The review is the third of a series of reports which present the fundamentals necessary for an understanding of the basis of regulatory criteria, such as those of the ICRP. (UK)

  8. Fundamental concepts of mathematics

    CERN Document Server

    Goodstein, R L

    Fundamental Concepts of Mathematics, 2nd Edition provides an account of some basic concepts in modern mathematics. The book is primarily intended for mathematics teachers and lay people who want to improve their skills in mathematics. Among the concepts and problems presented in the book are the determination of which integral polynomials have integral solutions; sentence logic and informal set theory; and why four colors are enough to color a map. Unlike the first edition, the second edition provides detailed solutions to exercises contained in the text. Mathematics teachers and people

  9. Fundamental composite electroweak dynamics

    DEFF Research Database (Denmark)

    Arbey, Alexandre; Cacciapaglia, Giacomo; Cai, Haiying

    2017-01-01

    Using the recent joint results from the ATLAS and CMS collaborations on the Higgs boson, we determine the current status of composite electroweak dynamics models based on the expected scalar sector. Our analysis can be used as a minimal template for a wider class of models between the two limitin...... space at the effective Lagrangian level. We show that a wide class of models of fundamental composite electroweak dynamics are still compatible with the present constraints. The results are relevant for the ongoing and future searches at the Large Hadron Collider....

  10. Fundamentals of calculus

    CERN Document Server

    Morris, Carla C

    2015-01-01

    Fundamentals of Calculus encourages students to use power, quotient, and product rules for solutions as well as stresses the importance of modeling skills. In addition to core integral and differential calculus coverage, the book features finite calculus, which lends itself to modeling and spreadsheets. Specifically, finite calculus is applied to marginal economic analysis, finance, growth, and decay. Includes: Linear Equations and Functions; The Derivative; Using the Derivative; Exponential and Logarithmic Functions; Techniques of Differentiation; Integral Calculus; Integration Techniques; Functions

  11. Fundamentals of attosecond optics

    CERN Document Server

    Chang, Zenghu

    2011-01-01

    Attosecond optical pulse generation, along with the related process of high-order harmonic generation, is redefining ultrafast physics and chemistry. A practical understanding of attosecond optics requires significant background information and foundational theory to make full use of these cutting-edge lasers and advance the technology toward the next generation of ultrafast lasers. Fundamentals of Attosecond Optics provides the first focused introduction to the field. The author presents the underlying concepts and techniques required to enter the field, as well as recent research advances th

  12. Fundamental of biomedical engineering

    CERN Document Server

    Sawhney, GS

    2007-01-01

    About the Book: A well set out textbook explains the fundamentals of biomedical engineering in the areas of biomechanics, biofluid flow, biomaterials, bioinstrumentation and use of computing in biomedical engineering. All these subjects form a basic part of an engineer's education. The text is admirably suited to meet the needs of the students of mechanical engineering, opting for the elective of Biomedical Engineering. Coverage of bioinstrumentation, biomaterials and computing for biomedical engineers can meet the needs of the students of Electronic & Communication, Electronic & Instrumenta

  13. Fundamental formulas of physics

    CERN Document Server

    1960-01-01

    The republication of this book, unabridged and corrected, fills the need for a comprehensive work on fundamental formulas of mathematical physics. It ranges from simple operations to highly sophisticated ones, all presented most lucidly with terms carefully defined and formulas given completely. In addition to basic physics, pertinent areas of chemistry, astronomy, meteorology, biology, and electronics are also included.This is no mere listing of formulas, however. Mathematics is integrated into text, for the most part, so that each chapter stands as a brief summary or even short textbook of

  14. Fundamentals of magnetism

    CERN Document Server

    Getzlaff, Mathias

    2007-01-01

    In the last decade tremendous progress has taken place in understanding the basis of magnetism, especially in reduced dimensions. In the first part, the fundamentals of magnetism are conveyed for atoms and bulk-like solid-state systems, providing a basis for the understanding of new phenomena which occur exclusively in low-dimensional systems, such as giant magnetoresistance. This wide field is discussed in the second part and illustrated by copious examples. This textbook is particularly suitable for graduate students in physical and materials sciences. It includes numerous examples, exercises, and references.

  15. Electronic circuits fundamentals & applications

    CERN Document Server

    Tooley, Mike

    2015-01-01

    Electronics explained in one volume, using both theoretical and practical applications. New chapter on Raspberry Pi. Companion website contains free electronic tools to aid learning for students and a question bank for lecturers. Practical investigations and questions within each chapter help reinforce learning. Mike Tooley provides all the information required to get to grips with the fundamentals of electronics, detailing the underpinning knowledge necessary to appreciate the operation of a wide range of electronic circuits, including amplifiers, logic circuits, power supplies and oscillators. The

  16. Nanomachines fundamentals and applications

    CERN Document Server

    Wang, Joseph

    2013-01-01

    This first-hand account by one of the pioneers of nanobiotechnology brings together a wealth of valuable material in a single source. It allows fascinating insights into motion at the nanoscale, showing how the proven principles of biological nanomotors are being transferred to artificial nanodevices.As such, the author provides engineers and scientists with the fundamental knowledge surrounding the design and operation of biological and synthetic nanomotors and the latest advances in nanomachines. He addresses such topics as nanoscale propulsions, natural biomotors, molecular-scale machin

  17. Fundamentals of photonics

    CERN Document Server

    Saleh, Bahaa E A

    2007-01-01

    Now in a new full-color edition, Fundamentals of Photonics, Second Edition is a self-contained and up-to-date introductory-level textbook that thoroughly surveys this rapidly expanding area of engineering and applied physics. Featuring a logical blend of theory and applications, coverage includes detailed accounts of the primary theories of light, including ray optics, wave optics, electromagnetic optics, and photon optics, as well as the interaction of photons and atoms, and semiconductor optics. Presented at increasing levels of complexity, preliminary sections build toward more advan

  18. DOE fundamentals handbook: Chemistry

    International Nuclear Information System (INIS)

    1993-01-01

    This handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of chemistry. This volume contains the following modules: reactor water chemistry (effects of radiation on water chemistry, chemistry parameters), principles of water treatment (purpose; treatment processes [ion exchange]; dissolved gases, suspended solids, and pH control; water purity), and hazards of chemicals and gases (corrosives [acids, alkalies], toxic compounds, compressed gases, flammable/combustible liquids)

  19. Protein biomarkers on tissue as imaged via MALDI mass spectrometry: A systematic approach to study the limits of detection.

    Science.gov (United States)

    van de Ven, Stephanie M W Y; Bemis, Kyle D; Lau, Kenneth; Adusumilli, Ravali; Kota, Uma; Stolowitz, Mark; Vitek, Olga; Mallick, Parag; Gambhir, Sanjiv S

    2016-06-01

    MALDI mass spectrometry imaging (MSI) is emerging as a tool for protein and peptide imaging across tissue sections. Despite extensive study, there does not yet exist a baseline study evaluating the potential capabilities for this technique to detect diverse proteins in tissue sections. In this study, we developed a systematic approach for characterizing MALDI-MSI workflows in terms of limits of detection, coefficients of variation, spatial resolution, and the identification of endogenous tissue proteins. Our goal was to quantify these figures of merit for a number of different proteins and peptides, in order to gain more insight into the feasibility of protein biomarker discovery efforts using this technique. Control proteins and peptides were deposited in serial dilutions on thinly sectioned mouse xenograft tissue. Using our experimental setup, coefficients of variation were determined for these control proteins and peptides. The study provides a baseline for protein biomarker discovery and a new benchmarking strategy that can be used for comparing diverse MALDI-MSI workflows. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
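    As an illustrative sketch only (not the authors' pipeline; the dilution amounts, intensities and the 3.3*sigma/slope convention below are assumed for illustration), a limit of detection can be estimated from a serial-dilution calibration of the kind described:

      import numpy as np

      # Hypothetical calibration data: a control protein spotted on tissue in a
      # serial dilution (pmol) and the mean MALDI-MSI ion intensity at each spot.
      amount = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
      intensity = np.array([2.1, 3.0, 4.2, 6.1, 10.3, 18.9])

      slope, intercept = np.polyfit(amount, intensity, 1)
      residual_sd = np.std(intensity - (slope * amount + intercept), ddof=2)

      lod = 3.3 * residual_sd / slope   # amount at which signal clears the noise floor
      print(f"slope = {slope:.2f} a.u./pmol, estimated LOD ~ {lod:.2f} pmol")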

  20. Observations on the perspectives and limits of the evidence-based approach in the evaluation of gamification processes

    Directory of Open Access Journals (Sweden)

    Bruni Filippo

    2015-12-01

    Full Text Available As ever greater attention is given to gamification processes, the evaluation dimension must also be brought into focus in order to avoid ineffective, trivialised forms. With reference to the evidence-based approach proposed by Mayer, and highlighting its possibilities and limits, an experiment related to teacher training is presented in which we attempt to connect some traits of gamification processes to a first evaluation screen. The data obtained seem, on the one hand, to indicate an overall positive perception on the part of the attendees; on the other, they point to forms of resistance and of saturation with respect to both the excessively competitive mechanisms and the peer evaluation procedures.

  1. Analysis and mitigation of systematic errors in spectral shearing interferometry of pulses approaching the single-cycle limit [Invited

    International Nuclear Information System (INIS)

    Birge, Jonathan R.; Kaertner, Franz X.

    2008-01-01

    We derive an analytical approximation for the measured pulse width error in spectral shearing methods, such as spectral phase interferometry for direct electric-field reconstruction (SPIDER), caused by an anomalous delay between the two sheared pulse components. This analysis suggests that, as pulses approach the single-cycle limit, the resulting requirements on the calibration and stability of this delay become significant, requiring precision orders of magnitude higher than the scale of a wavelength. This is demonstrated by numerical simulations of SPIDER pulse reconstruction using actual data from a sub-two-cycle laser. We briefly propose methods to minimize the effects of this sensitivity in SPIDER and review variants of spectral shearing that attempt to avoid this difficulty
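    For orientation, a standard spectral-shearing relation (not reproduced from the paper; here \phi is the spectral phase, \Omega the spectral shear, \tau the delay between the sheared replicas and \delta\tau an uncalibrated delay error): the extracted interferogram phase and the spurious phase left after concatenation are approximately

      \Gamma(\omega) = \phi(\omega+\Omega) - \phi(\omega) + \omega\tau, \qquad \delta\phi(\omega) \approx \frac{\delta\tau\,\omega^{2}}{2\,\Omega},

    i.e. a delay miscalibration masquerades as a quadratic spectral phase (chirp). Because the error grows with the square of the bandwidth, pulses approaching the single-cycle limit demand delay calibration far below the scale of a wavelength, consistent with the sensitivity analyzed in the record.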

  2. Beyond the futility argument: the fair process approach and time-limited trials for managing dialysis conflict.

    Science.gov (United States)

    Rinehart, Ann

    2013-11-01

    Futility is an ancient concept arising from Greek mythology that was resurrected for its medical application in the 1980s with the proliferation of many lifesaving technologies, including dialysis and renal transplantation. By that time, the domineering medical paternalism that characterized the pre-1960s physician-patient relationship morphed into assertive patient autonomy, and some patients began to claim the right to demand aggressive, high-technology interventions, despite physician disapproval. To counter this power struggle, the establishment of a precise definition of futility offered hope for a futility policy that would allow physicians to justify withholding or withdrawing treatment, despite patient and family objections. This article reviews the various attempts made to define medical futility and describes their limited applicability to dialysis. When futility concerns arise, physicians should recognize the opportunity to address conflict, using best practice communication skills. Physicians would also benefit from understanding the ethical principles of respect for patient autonomy, beneficence, nonmaleficence, justice, and professional integrity that underlie medical decision-making. Also reviewed is the use of a fair process approach or time-limited trial when conflict resolution cannot be achieved. These topics are addressed in the Renal Physician Association's clinical practice guideline Shared Decision-Making in the Appropriate Initiation and Withdrawal from Dialysis, with which nephrologists should be well versed. A case presentation of intractable calciphylaxis in a new dialysis patient illustrates the pitfalls of physicians not fully appreciating the ethics of medical decision-making and failing to use effective conflict management approaches in the clinical practice guideline.

  3. Beyond the Futility Argument: The Fair Process Approach and Time-Limited Trials for Managing Dialysis Conflict

    Science.gov (United States)

    2013-01-01

    Summary Futility is an ancient concept arising from Greek mythology that was resurrected for its medical application in the 1980s with the proliferation of many lifesaving technologies, including dialysis and renal transplantation. By that time, the domineering medical paternalism that characterized the pre-1960s physician–patient relationship morphed into assertive patient autonomy, and some patients began to claim the right to demand aggressive, high-technology interventions, despite physician disapproval. To counter this power struggle, the establishment of a precise definition of futility offered hope for a futility policy that would allow physicians to justify withholding or withdrawing treatment, despite patient and family objections. This article reviews the various attempts made to define medical futility and describes their limited applicability to dialysis. When futility concerns arise, physicians should recognize the opportunity to address conflict, using best practice communication skills. Physicians would also benefit from understanding the ethical principles of respect for patient autonomy, beneficence, nonmaleficence, justice, and professional integrity that underlie medical decision-making. Also reviewed is the use of a fair process approach or time-limited trial when conflict resolution cannot be achieved. These topics are addressed in the Renal Physician Association’s clinical practice guideline Shared Decision-Making in the Appropriate Initiation and Withdrawal from Dialysis, with which nephrologists should be well versed. A case presentation of intractable calciphylaxis in a new dialysis patient illustrates the pitfalls of physicians not fully appreciating the ethics of medical decision-making and failing to use effective conflict management approaches in the clinical practice guideline. PMID:23868900

  4. Quivers, words and fundamentals

    International Nuclear Information System (INIS)

    Mattioli, Paolo; Ramgoolam, Sanjaye

    2015-01-01

    A systematic study of holomorphic gauge invariant operators in general N=1 quiver gauge theories, with unitary gauge groups and bifundamental matter fields, was recently presented in http://dx.doi.org/10.1007/JHEP04(2013)094. For large ranks a simple counting formula in terms of an infinite product was given. We extend this study to quiver gauge theories with fundamental matter fields, deriving an infinite product form for the refined counting in these cases. The infinite products are found to be obtained from substitutions in a simple building block expressed in terms of the weighted adjacency matrix of the quiver. In the case without fundamentals, it is a determinant which itself is found to have a counting interpretation in terms of words formed from partially commuting letters associated with simple closed loops in the quiver. This is a new relation between counting problems in gauge theory and the Cartier-Foata monoid. For finite ranks of the unitary gauge groups, the refined counting is given in terms of expressions involving Littlewood-Richardson coefficients.

  5. Impact of a New Law to Reduce the Legal Blood Alcohol Concentration Limit - A Poisson Regression Analysis and Descriptive Approach.

    Science.gov (United States)

    Nistal-Nuño, Beatriz

    2017-03-31

    In Chile, a new law introduced in March 2012 lowered the blood alcohol concentration (BAC) limit for impaired drivers from 0.1% to 0.08% and the BAC limit for driving under the influence of alcohol from 0.05% to 0.03%, but its effectiveness remains uncertain. The goal of this investigation was to evaluate the effects of this enactment on road traffic injuries and fatalities in Chile. This was a retrospective cohort study. Data were analyzed using a descriptive approach and generalized linear models, specifically Poisson regression, modelling deaths and injuries in a series of additive log-linear models accounting for the effects of law implementation, month influence, a linear time trend and population exposure. A review of national databases in Chile was conducted from 2003 to 2014 to evaluate the monthly rates of traffic fatalities and injuries associated with alcohol and in total. A decrease of 28.1 percent was observed in the monthly rate of alcohol-related traffic fatalities compared to the period before the law implemented in 2012. Chile experienced a significant reduction in alcohol-related traffic fatalities and injuries, making this a successful public health intervention.
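    A minimal sketch of such an additive log-linear Poisson model (not the authors' code; the file and column names are hypothetical):

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # Hypothetical monthly series: alcohol-related fatality counts, a 0/1
      # post-law indicator, calendar month, a linear time index, and population.
      df = pd.read_csv("chile_monthly_crashes.csv")

      model = smf.glm(
          "alcohol_fatalities ~ law + C(month) + time_trend",
          data=df,
          family=sm.families.Poisson(),
          offset=np.log(df["population"]),   # population exposure
      ).fit()

      rate_ratio = np.exp(model.params["law"])
      print(model.summary())
      print(f"Post-law rate ratio: {rate_ratio:.2f}")  # ~0.72 would match a 28% drop

    The exponentiated coefficient of the law indicator gives the multiplicative change in the monthly fatality rate after the law, adjusted for seasonality, trend and population.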

  6. RF tissue-heating near metallic implants during magnetic resonance examinations: an approach in the ac limit.

    Science.gov (United States)

    Ballweg, Verena; Eibofner, Frank; Graf, Hansjorg

    2011-10-01

    The state of the art for assessing radiofrequency (RF) heating near implants is computer modeling of the devices and solving Maxwell's equations for the specific setup. For a set of input parameters, a fixed result is obtained. This work presents a theoretical approach in the alternating current (ac) limit, which can potentially render closed formulas for the basic behavior of tissue heating near metallic structures. Dedicated experiments were performed to support the theory. For the ac calculations, the implant was modeled as an RLC parallel circuit, with L being the secondary of a transformer and the RF transmission coil being its primary. Parameters influencing coupling, power matching, and specific absorption rate (SAR) were determined and formula relations were established. Experiments on a copper ring with a radial gap as capacitor for inductive coupling (at 1.5 T) and on needles for capacitive coupling (at 3 T) were carried out. The temperature rise in the embedding dielectric was observed as a function of its specific resistance using an infrared (IR) camera. Closed formulas containing the parameters of the setup were obtained for the frequency dependence of the transmitted power at fixed load resistance, for the calculation of the resistance for optimum power transfer, and for the calculation of the transmitted power as a function of the load resistance. Good qualitative agreement was found between the course of the experimentally obtained heating curves and the theoretically determined power curves. Power matching proved to be a critical parameter, especially if the sample was resonant close to the Larmor frequency. The presented ac approach to RF heating near an implant, which mimics specific values for R, L, and C, allows for closed formulas to estimate the potential of RF energy transfer. A first reference point for worst-case determination in MR testing procedures can be obtained. Numerical approaches, necessary to determine spatially resolved heating maps, can
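    A generic sketch of the power-matching behaviour described (not the authors' model; all component values are assumed): the implant loop is treated here as a series R-L-C circuit with an induced EMF, a fixed coil loss resistance and a variable tissue load resistance.

      import numpy as np

      def load_power(R_load, w, emf=1.0, L=50e-9, r_coil=1.0, w0=2 * np.pi * 63.9e6):
          # Series R-L-C loop tuned to resonate at w0 (C chosen accordingly);
          # returns the power dissipated in the tissue load when driven at w.
          C = 1.0 / (w0 ** 2 * L)
          reactance = w * L - 1.0 / (w * C)
          current = emf / np.sqrt((R_load + r_coil) ** 2 + reactance ** 2)
          return current ** 2 * R_load

      w_larmor = 2 * np.pi * 63.9e6                     # ~1.5 T proton Larmor frequency
      for r in [0.1, 0.5, 1.0, 2.0, 10.0, 100.0]:
          print(f"R_load = {r:6.1f} ohm -> P = {load_power(r, w_larmor):.3e} W")
      print(f"Detuned by 5 percent: P = {load_power(1.0, 1.05 * w_larmor):.3e} W")

    In this toy model the transferred power peaks when the load resistance equals the loss resistance (power matching) and falls off once the loop is detuned from the driving frequency, mirroring the qualitative behaviour reported above.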

  7. Fundaments of plant cybernetics.

    Science.gov (United States)

    Zucconi, F

    2001-01-01

    A systemic approach is proposed for analyzing plants' physiological organization and cybernesis. To this end, the plant is inspected as a system, starting from the integration of crown and root systems, and its impact on a number of basic epigenetic events. The approach proves to be axiomatic and facilitates the definition of the principles behind the plant's autonomous control of growth and reproduction.

  8. Materials Fundamentals of Gate Dielectrics

    CERN Document Server

    Demkov, Alexander A

    2006-01-01

    This book presents materials fundamentals of novel gate dielectrics that are being introduced into semiconductor manufacturing to ensure the continuous scaling of CMOS devices. This is a very fast evolving field of research, so we choose to focus on the basic understanding of the structure, thermodynamics, and electronic properties of these materials that determine their performance in device applications. Most of these materials are transition metal oxides. Ironically, the d-orbitals responsible for the high dielectric constant cause severe integration difficulties, thus intrinsically limiting high-k dielectrics. Though new in the electronics industry, many of these materials are well known in the field of ceramics, and we describe this unique connection. The complexity of the structure-property relations in TM oxides makes the use of state-of-the-art first-principles calculations necessary. Several chapters give a detailed description of the modern theory of polarization, and heterojunction band discont...

  9. Fundamental aspects of quantum theory

    International Nuclear Information System (INIS)

    Gorini, V.; Frigerio, A.

    1986-01-01

    This book presents information on the following topics: general problems and crucial experiments; the classical behavior of measuring instruments; quantum interference effect for two atoms radiating a single photon; quantization and stochastic processes; quantum Markov processes driven by Bose noise; chaotic behavior in quantum mechanics; quantum ergodicity and chaos; microscopic and macroscopic levels of description; fundamental properties of the ground state of atoms and molecules; n-level systems interacting with Bosons - semiclassical limits; general aspects of gauge theories; adiabatic phase shifts for neutrons and photons; the spins of cyons and dyons; round-table discussion on the Aharonov-Bohm effect; gravity in quantum mechanics; the gravitational phase transition; anomalies and their cancellation; a new gauge without any ghost for Yang-Mills Theory; and energy density and roughening in the 3-D Ising ferromagnet

  10. Making physics more fundamental

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1988-07-15

    The stellar death throes of supernovae have been seen and admired since time immemorial. However last year's was the first to come under the combined scrutiny of space-borne radiation detectors and underground neutrino monitors as well as terrestrial optical telescopes and even gravity wave antennae. The remarkable results underline the power of modern physics to explain and interrelate processes in the furthest reaches of the cosmos and the deep interior of nuclear particles. In recent years this common ground between 'Big Bang' cosmology and particle physics has been regularly trodden and retrodden in the light of fresh new insights and new experimental results, and thinking has steadily converged. In 1983, the first Symposium on Astronomy, Cosmology and Fundamental Physics, organized by CERN and the European Southern Observatory (ESO), was full of optimism, with new ideas ('inflation') to explain how the relatively small variations in the structure of the Universe could have arisen through the quantum structure of the initial cataclysm.

  11. Digital Fourier analysis fundamentals

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations.  These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
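    The rotating-vector picture mentioned above is simply Euler's identity applied to a real sine (a standard relation rather than anything specific to this book):

      \sin(\omega t) = \frac{1}{2j}\,e^{\,j\omega t} - \frac{1}{2j}\,e^{-j\omega t},

    i.e. the superposition of a counter-clockwise and a clockwise rotating phasor of equal magnitude; changing the amplitude or frequency rescales or speeds up both vectors together, which is what the book's interactive applets animate.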

  12. Fundamental partial compositeness

    CERN Document Server

    Sannino, Francesco

    2016-11-07

    We construct renormalizable Standard Model extensions, valid up to the Planck scale, that give a composite Higgs from a new fundamental strong force acting on fermions and scalars. Yukawa interactions of these particles with Standard Model fermions realize the partial compositeness scenario. Successful models exist because gauge quantum numbers of Standard Model fermions admit a minimal enough 'square root'. Furthermore, right-handed SM fermions have an SU(2)$_R$-like structure, yielding a custodially-protected composite Higgs. Baryon and lepton numbers arise accidentally. Standard Model fermions acquire mass at tree level, while the Higgs potential and flavor violations are generated by quantum corrections. We further discuss accidental symmetries and other dynamical features stemming from the new strongly interacting scalars. If the same phenomenology can be obtained from models without our elementary scalars, they would reappear as composite states.

  13. Theory of fundamental interactions

    International Nuclear Information System (INIS)

    Pestov, A.B.

    1992-01-01

    In the present article the theory of fundamental interactions is derived in a systematic way from first principles. In the developed theory there is no separation between space-time and internal gauge space. The main equations for the basic fields are derived. It is shown that the theory satisfies the correspondence principle and gives rise to new notions in the considered region. In particular, the conclusion is drawn that there exist particles which are characterized not only by mass, spin and charge but also by a moment of inertia. These are rotating particles, particles which represent the notion of a rigid body at the microscopic level and give the key to understanding strong interactions. The main concepts and dynamical laws for these particles are formulated. The basic principles of the theory may be examined experimentally in the not-too-distant future. 29 refs

  14. Fundamentals of Geophysics

    Science.gov (United States)

    Lowrie, William

    1997-10-01

    This unique textbook presents a comprehensive overview of the fundamental principles of geophysics. Unlike most geophysics textbooks, it combines both the applied and theoretical aspects to the subject. The author explains complex geophysical concepts using abundant diagrams, a simplified mathematical treatment, and easy-to-follow equations. After placing the Earth in the context of the solar system, he describes each major branch of geophysics: gravitation, seismology, dating, thermal and electrical properties, geomagnetism, paleomagnetism and geodynamics. Each chapter begins with a summary of the basic physical principles, and a brief account of each topic's historical evolution. The book will satisfy the needs of intermediate-level earth science students from a variety of backgrounds, while at the same time preparing geophysics majors for continued study at a higher level.

  15. Automotive electronics design fundamentals

    CERN Document Server

    Zaman, Najamuz

    2015-01-01

    This book explains the topology behind automotive electronics architectures and examines how they can be profoundly augmented with embedded controllers. These controllers serve as the core building blocks of today’s vehicle electronics. Rather than simply teaching electrical basics, this unique resource focuses on the fundamental concepts of vehicle electronics architecture, and details the wide variety of Electronic Control Modules (ECMs) that enable the increasingly sophisticated "bells & whistles" of modern designs.  A must-have for automotive design engineers, technicians working in automotive electronics repair centers and students taking automotive electronics courses, this guide bridges the gap between academic instruction and industry practice with clear, concise advice on how to design and optimize automotive electronics with embedded controllers.

  16. Fundamental partial compositeness

    International Nuclear Information System (INIS)

    Sannino, Francesco; Strumia, Alessandro; Tesi, Andrea; Vigiani, Elena

    2016-01-01

    We construct renormalizable Standard Model extensions, valid up to the Planck scale, that give a composite Higgs from a new fundamental strong force acting on fermions and scalars. Yukawa interactions of these particles with Standard Model fermions realize the partial compositeness scenario. Under certain assumptions on the dynamics of the scalars, successful models exist because gauge quantum numbers of Standard Model fermions admit a minimal enough ‘square root’. Furthermore, right-handed SM fermions have an SU(2)_R-like structure, yielding a custodially-protected composite Higgs. Baryon and lepton numbers arise accidentally. Standard Model fermions acquire mass at tree level, while the Higgs potential and flavor violations are generated by quantum corrections. We further discuss accidental symmetries and other dynamical features stemming from the new strongly interacting scalars. If the same phenomenology can be obtained from models without our elementary scalars, they would reappear as composite states.

  17. Fundamentals of quantum mechanics

    CERN Document Server

    House, J E

    2017-01-01

    Fundamentals of Quantum Mechanics, Third Edition is a clear and detailed introduction to quantum mechanics and its applications in chemistry and physics. All required math is clearly explained, including intermediate steps in derivations, and concise review of the math is included in the text at appropriate points. Most of the elementary quantum mechanical models-including particles in boxes, rigid rotor, harmonic oscillator, barrier penetration, hydrogen atom-are clearly and completely presented. Applications of these models to selected “real world” topics are also included. This new edition includes many new topics such as band theory and heat capacity of solids, spectroscopy of molecules and complexes (including applications to ligand field theory), and small molecules of astrophysical interest.

  18. Fundamentals of phosphors

    CERN Document Server

    Yen, William M; Yamamoto, Hajime

    2006-01-01

    Drawing from the second edition of the best-selling Handbook of Phosphors, Fundamentals of Phosphors covers the principles and mechanisms of luminescence in detail and surveys the primary phosphor materials as well as their optical properties. The book addresses cutting-edge developments in phosphor science and technology including oxynitride phosphors and the impact of lanthanide level location on phosphor performance.Beginning with an explanation of the physics underlying luminescence mechanisms in solids, the book goes on to interpret various luminescence phenomena in inorganic and organic materials. This includes the interpretation of the luminescence of recently developed low-dimensional systems, such as quantum wells and dots. The book also discusses the excitation mechanisms by cathode-ray and ionizing radiation and by electric fields to produce electroluminescence. The book classifies phosphor materials according to the type of luminescence centers employed or the class of host materials used and inte...

  19. Fundamentals of thinking, patterns

    Science.gov (United States)

    Gafurov, O. M.; Gafurov, D. O.; Syryamkin, V. I.

    2018-05-01

    The authors analyze the fundamentals of thinking and propose a model of the brain based on the magnetic properties of gliacytes (Schwann cells) arising from their oxygen saturation (oxygen has paramagnetic properties). The authors also propose to take into account the motion of electrical discharges through synapses, which causes electric and magnetic fields as well as additional effects such as paramagnetic resonance, allowing multisensory object-related information located in different parts of the brain to be combined. The events of the surrounding world are thus reflected and remembered in the cortical columns, creating isolated subnets with altered magnetic properties (patterns) that subsequently participate in the recognition of objects, the formation of memory, and so on. The possibilities of pattern-based thinking are grounded in the practical experience of applying methods and technologies of artificial neural networks in the form of a neuroemulator and neuromorphic computing devices.

  20. Fluid mechanics fundamentals and applications

    CERN Document Server

    Cengel, Yunus

    2013-01-01

    Cengel and Cimbala's Fluid Mechanics Fundamentals and Applications communicates directly with tomorrow's engineers in a simple yet precise manner. The text covers the basic principles and equations of fluid mechanics in the context of numerous and diverse real-world engineering examples. The text helps students develop an intuitive understanding of fluid mechanics by emphasizing the physics, using figures, numerous photographs and visual aids to reinforce the physics. The highly visual approach enhances the learning of fluid mechanics by students. This text distinguishes itself from others by the way the material is presented - in a progressive order from simple to more difficult, building each chapter upon foundations laid down in previous chapters. In this way, even the traditionally challenging aspects of fluid mechanics can be learned effectively. McGraw-Hill is also proud to offer ConnectPlus powered by Maple with the third edition of Cengel/Cimbala, Fluid Mechanics. This innovative and powerful new sy...

  1. Green Manufacturing Fundamentals and Applications

    CERN Document Server

    2013-01-01

    Green Manufacturing: Fundamentals and Applications introduces the basic definitions and issues surrounding green manufacturing at the process, machine and system (including supply chain) levels. It also shows, by way of several examples from different industry sectors, the potential for substantial improvement and the paths to achieve the improvement. Additionally, this book discusses regulatory and government motivations for green manufacturing and outlines the path for making manufacturing more green as well as making production more sustainable. This book also: • Discusses new engineering approaches for manufacturing and provides a path from traditional manufacturing to green manufacturing • Addresses regulatory and economic issues surrounding green manufacturing • Details new supply chains that need to be in place before going green • Includes state-of-the-art case studies in the automotive, semiconductor and medical areas as well as in the supply chain and packaging areas Green Manufactu...

  2. Limits to magnetic resonance microscopy

    International Nuclear Information System (INIS)

    Glover, Paul; Mansfield, Peter

    2002-01-01

    The last quarter of the twentieth century saw the development of magnetic resonance imaging (MRI) grow from a laboratory demonstration to a multi-billion dollar worldwide industry. There is a clinical body scanner in almost every hospital of the developed nations. The field of magnetic resonance microscopy (MRM), after mostly being abandoned by researchers in the first decade of MRI, has become an established branch of the science. This paper reviews the development of MRM over the last decade with an emphasis on the current state of the art. The fundamental principles of imaging and signal detection are examined to determine the physical principles which limit the available resolution. The limits are discussed with reference to liquid, solid and gas phase microscopy. In each area, the novel approaches employed by researchers to push back the limits of resolution are discussed. Although the limits to resolution are well known, the developments and applications of MRM have not reached their limit. (author)

  3. Anterior versus posterior approach in reconstruction of infected nonunion of the tibia using the vascularized fibular graft: potentialities and limitations.

    Science.gov (United States)

    Amr, Sherif M; El-Mofty, Aly O; Amin, Sherif N

    2002-01-01

    The potentialities, limitations, and technical pitfalls of vascularized fibular grafting in infected nonunions of the tibia are outlined on the basis of 14 patients approached anteriorly or posteriorly. An infected nonunion of the tibia together with a large exposed area over the shin of the tibia is better approached anteriorly. The anastomosis is placed in an end-to-end or end-to-side fashion onto the anterior tibial vessels. To locate the site of the nonunion, the tibialis anterior muscle should be retracted laterally and the proximal and distal ends of the site of the nonunion debrided up to healthy bleeding bone. All the scarred skin over the anterior tibia should be excised, because it becomes devitalized as a result of the exposure. To cover the exposed area, the fibula has to be harvested with a large skin paddle, incorporating the first septocutaneous branch originating from the peroneal vessels before they gain the upper end of the flexor hallucis longus muscle. A disadvantage of harvesting the free fibula together with a skin paddle is that its pedicle is short. The skin paddle lies at the antimesenteric border of the graft, the site of incising and stripping the periosteum. In addition, it has to be sutured to the skin at the recipient site, so the soft tissues (together with the peroneal vessels) cannot be stripped off the graft to prolong its pedicle. Vein grafts should be resorted to if the pedicle does not reach a healthy segment of the anterior tibial vessels. Defects with limited exposed areas of skin, especially in questionable patency of the vessels of the leg, require primarily a fibula with a long pedicle that could easily reach the popliteal vessels and are thus better approached posteriorly. In this approach, the site of the nonunion is exposed medial to the flexor digitorum muscle and the proximal and distal ends of the site of the nonunion debrided up to healthy bleeding bone. No attempt should be made to strip the scarred skin off

  4. Hankin and Reeves' Approach to Estimating Fish Abundance in Small Streams : Limitations and Potential Options.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, William L. [Bonneville Power Administration, Portland, OR (US). Environment, Fish and Wildlife

    2000-11-01

    Hankin and Reeves' (1988) approach to estimating fish abundance in small streams has been applied in stream-fish studies across North America. However, as with any method of population estimation, there are important assumptions that must be met for estimates to be minimally biased and reasonably precise. Consequently, I investigated effects of various levels of departure from these assumptions via simulation based on results from an example application in Hankin and Reeves (1988) and a spatially clustered population. Coverage of 95% confidence intervals averaged about 5% less than nominal when removal estimates equaled true numbers within sampling units, but averaged 62% to 86% less than nominal when they did not, with the exception where detection probabilities of individuals were >0.85 and constant across sampling units (95% confidence interval coverage = 90%). True total abundances averaged far (20% to 41%) below the lower confidence limit when not included within intervals, which implies large negative bias. Further, the average coefficient of variation was about 1.5 times higher when removal estimates did not equal true numbers within sampling units (mean CV = 0.27 [SE = 0.0004]) than when they did (mean CV = 0.19 [SE = 0.0002]). A potential modification to Hankin and Reeves' approach is to include environmental covariates that affect detection rates of fish into the removal model or other mark-recapture model. A potential alternative is to use snorkeling in combination with line transect sampling to estimate fish densities. Regardless of the method of population estimation, a pilot study should be conducted to validate the enumeration method, which requires a known (or nearly so) population of fish to serve as a benchmark to evaluate bias and precision of population estimates.
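
    The removal estimates referred to above can be illustrated with a simple two-pass removal estimator. The Python sketch below is a generic illustration with hypothetical catch counts and a naive expansion to unsampled units; it is not Hankin and Reeves' full two-stage design nor the simulation used in this study.

        def two_pass_removal(c1, c2):
            """Seber-Le Cren two-pass removal estimate for one sampling unit.

            c1, c2 are the numbers of fish removed on the first and second pass;
            the estimator is defined only when c1 > c2.
            """
            if c1 <= c2:
                raise ValueError("removal estimator undefined when c1 <= c2")
            p_hat = (c1 - c2) / c1            # estimated per-pass capture probability
            n_hat = c1 ** 2 / (c1 - c2)       # estimated abundance in the unit
            return n_hat, p_hat

        # Hypothetical catches from three surveyed units; the summed estimate is
        # expanded to the whole stream by the fraction of units sampled.
        catches = [(34, 12), (21, 8), (15, 4)]
        units_sampled, units_total = len(catches), 10
        unit_estimates = [two_pass_removal(c1, c2)[0] for c1, c2 in catches]
        total_estimate = sum(unit_estimates) * units_total / units_sampled
        print(f"estimated total abundance: {total_estimate:.0f}")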

  5. Heat exchanger versus regenerator: A fundamental comparison

    NARCIS (Netherlands)

    Will, M.E.; Waele, de A.T.A.M.

    2005-01-01

    Irreversible processes in regenerators and heat exchangers limit the performance of cryocoolers. In this paper we compare the performance of cryocoolers, operating with regenerators and heat exchangers from a fundamental point of view. The losses in the two systems are calculated from the entropy

  6. Fundamentals of ergonomic exoskeleton robots

    NARCIS (Netherlands)

    Schiele, A.

    2008-01-01

    This thesis is the first to provide the fundamentals of ergonomic exoskeleton design. The fundamental theory as well as technology necessary to analyze and develop ergonomic wearable robots interacting with humans is established and validated by experiments and prototypes. The fundamentals are (1) a

  7. Fundamental Physics with Antihydrogen

    Science.gov (United States)

    Hangst, J. S.

    Antihydrogen—the antimatter equivalent of the hydrogen atom—is of fundamental interest as a test bed for universal symmetries—such as CPT and the Weak Equivalence Principle for gravitation. Invariance under CPT requires that hydrogen and antihydrogen have the same spectrum. Antimatter is of course intriguing because of the observed baryon asymmetry in the universe—currently unexplained by the Standard Model. At the CERN Antiproton Decelerator (AD) [1], several groups have been working diligently since 1999 to produce, trap, and study the structure and behaviour of the antihydrogen atom. One of the main thrusts of the AD experimental program is to apply precision techniques from atomic physics to the study of antimatter. Such experiments complement the high-energy searches for physics beyond the Standard Model. Antihydrogen is the only atom of antimatter to be produced in the laboratory. This is not so unfortunate, as its matter equivalent, hydrogen, is one of the most well-understood and accurately measured systems in all of physics. It is thus very compelling to undertake experimental examinations of the structure of antihydrogen. As experimental spectroscopy of antihydrogen has yet to begin in earnest, I will give here a brief introduction to some of the ion and atom trap developments necessary for synthesizing and trapping antihydrogen, so that it can be studied.

  8. Strings and fundamental physics

    International Nuclear Information System (INIS)

    Baumgartl, Marco; Brunner, Ilka; Haack, Michael

    2012-01-01

    The basic idea, simple and revolutionary at the same time, to replace the concept of a point particle with a one-dimensional string, has opened up a whole new field of research. Even today, four decades later, its multifaceted consequences are still not fully conceivable. Up to now string theory has offered a new way to view particles as different excitations of the same fundamental object. It has celebrated success in discovering the graviton in its spectrum, and it has naturally led scientists to posit space-times with more than four dimensions - which in turn has triggered numerous interesting developments in fields as varied as condensed matter physics and pure mathematics. This book collects pedagogical lectures by leading experts in string theory, introducing the non-specialist reader to some of the newest developments in the field. The carefully selected topics are at the cutting edge of research in string theory and include new developments in topological strings, AdS/CFT dualities, as well as newly emerging subfields such as doubled field theory and holography in the hydrodynamic regime. The contributions to this book have been selected and arranged in such a way as to form a self-contained, graduate level textbook. (orig.)

  9. Fundamentals of precision medicine

    Science.gov (United States)

    Divaris, Kimon

    2018-01-01

    Imagine a world where clinicians make accurate diagnoses and provide targeted therapies to their patients according to well-defined, biologically-informed disease subtypes, accounting for individual differences in genetic make-up, behaviors, cultures, lifestyles and the environment. This is not as utopic as it may seem. Relatively recent advances in science and technology have led to an explosion of new information on what underlies health and what constitutes disease. These novel insights emanate from studies of the human genome and microbiome, their associated transcriptomes, proteomes and metabolomes, as well as epigenomics and exposomics—such ‘omics data can now be generated at unprecedented depth and scale, and at rapidly decreasing cost. Making sense and integrating these fundamental information domains to transform health care and improve health remains a challenge—an ambitious, laudable and high-yield goal. Precision dentistry is no longer a distant vision; it is becoming part of the rapidly evolving present. Insights from studies of the human genome and microbiome, their associated transcriptomes, proteomes and metabolomes, and epigenomics and exposomics have reached an unprecedented depth and scale. Much more needs to be done, however, for the realization of precision medicine in the oral health domain. PMID:29227115

  10. Strings and fundamental physics

    Energy Technology Data Exchange (ETDEWEB)

    Baumgartl, Marco [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Brunner, Ilka; Haack, Michael (eds.) [Muenchen Univ. (Germany). Fakultaet fuer Physik

    2012-07-01

    The basic idea, simple and revolutionary at the same time, to replace the concept of a point particle with a one-dimensional string, has opened up a whole new field of research. Even today, four decades later, its multifaceted consequences are still not fully conceivable. Up to now string theory has offered a new way to view particles as different excitations of the same fundamental object. It has celebrated success in discovering the graviton in its spectrum, and it has naturally led scientists to posit space-times with more than four dimensions - which in turn has triggered numerous interesting developments in fields as varied as condensed matter physics and pure mathematics. This book collects pedagogical lectures by leading experts in string theory, introducing the non-specialist reader to some of the newest developments in the field. The carefully selected topics are at the cutting edge of research in string theory and include new developments in topological strings, AdS/CFT dualities, as well as newly emerging subfields such as doubled field theory and holography in the hydrodynamic regime. The contributions to this book have been selected and arranged in such a way as to form a self-contained, graduate level textbook. (orig.)

  11. Fundamentals of klystron testing

    International Nuclear Information System (INIS)

    Caldwell, J.W. Jr.

    1978-08-01

    Fundamentals of klystron testing is a text primarily intended for the indoctrination of new klystron group test stand operators. It should significantly reduce the familiarization time of a new operator, making him an asset to the group sooner than has been experienced in the past. The new employee must appreciate the mission of SLAC before he can rightfully be expected to make a meaningful contribution to the group's effort. Thus, the introductory section acquaints the reader with basic concepts of accelerators in general, then briefly describes major physical aspects of the Stanford Linear Accelerator. Only then is his attention directed to the klystron, with its auxiliary systems, and the rudiments of klystron tube performance checks. It is presumed that the reader is acquainted with basic principles of electronics and scientific notation. However, to preserve the integrity of an indoctrination guide, tedious technical discussions and mathematical analysis have been studiously avoided. It is hoped that the new operator will continue to use the text for reference long after his indoctrination period is completed. Even the more experienced operator should find that particular sections will refresh his understanding of basic principles of klystron testing

  12. Making physics more fundamental

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    The stellar death throes of supernovae have been seen and admired since time immemorial. However last year's was the first to come under the combined scrutiny of space-borne radiation detectors and underground neutrino monitors as well as terrestrial optical telescopes and even gravity wave antennae. The remarkable results underline the power of modern physics to explain and interrelate processes in the furthest reaches of the cosmos and the deep interior of nuclear particles. In recent years this common ground between 'Big Bang' cosmology and particle physics has been regularly trodden and retrodden in the light of fresh new insights and new experimental results, and thinking has steadily converged. In 1983, the first Symposium on Astronomy, Cosmology and Fundamental Physics, organized by CERN and the European Southern Observatory (ESO), was full of optimism, with new ideas ('inflation') to explain how the relatively small variations in the structure of the Universe could have arisen through the quantum structure of the initial cataclysm

  13. Fundamentals of Quantum Mechanics

    Science.gov (United States)

    Tang, C. L.

    2005-06-01

    Quantum mechanics has evolved from a subject of study in pure physics to one with a wide range of applications in many diverse fields. The basic concepts of quantum mechanics are explained in this book in a concise and easy-to-read manner emphasising applications in solid state electronics and modern optics. Following a logical sequence, the book is focused on the key ideas and is conceptually and mathematically self-contained. The fundamental principles of quantum mechanics are illustrated by showing their application to systems such as the hydrogen atom, multi-electron ions and atoms, the formation of simple organic molecules and crystalline solids of practical importance. It leads on from these basic concepts to discuss some of the most important applications in modern semiconductor electronics and optics. Containing many homework problems and worked examples, the book is suitable for senior-level undergraduate and graduate level students in electrical engineering, materials science and applied physics. Clear exposition of quantum mechanics written in a concise and accessible style. Precise physical interpretation of the mathematical foundations of quantum mechanics. Illustrates the important concepts and results by reference to real-world examples in electronics and optoelectronics. Contains homework problems and worked examples, with solutions available for instructors.

  14. Fundamentals of nuclear chemistry

    International Nuclear Information System (INIS)

    Majer, V.

    1982-01-01

    The author of the book has had 25 years of experience at the Nuclear Chemistry of Prague Technical University. In consequence, the book is intended as a basic textbook for students of this field. Its main objectives are an easily understandable presentation of the complex subject and - in spite of the uncertainty which still characterizes the definition and subjects of nuclear chemistry - a systematic classification and logical structure. Contents: 1. Introduction (history and definition); 2. General nuclear chemistry (physical fundamentals, hot atom chemistry, interaction of nuclear radiation with matter, radioactive elements, isotope effects, isotope exchange, chemistry of radioactive trace elements); 3. Methods of nuclear chemistry (radiochemical methods, activation, separation and enrichment chemistry); 4. Preparative nuclear chemistry (isotope production, labelled compounds); 5. Analytical nuclear chemistry; 6. Applied nuclear chemistry (isotope applications in general physical and analytical chemistry). The book is supplemented by an annex with tables, a name catalogue and a subject index which will facilitate access to important information. (RB)

  15. an aid to mastering fundamental calculus concepts

    African Journals Online (AJOL)

    Erna Kinsey

    Department of Educational Psychology, University of Pretoria, Pretoria, 0002 South Africa ... according to a well thought-out didactical approach is necessary in order to incorporate technology ... developing new hypotheses instead of testing hypotheses. ... mastering fundamental concepts of two-dimensional functions.

  16. Some aspects of fundamental symmetries and interactions

    NARCIS (Netherlands)

    Jungmann, KP; Grzonka, D; Czyzykiewicz, R; Oelert, W; Rozek, T; Winter, P

    2005-01-01

    The known fundamental symmetries and interactions are well described by the Standard Model. Features of this powerful theory, which are described but not explained at a deeper level, are addressed in a variety of speculative models. Experimental tests of the predictions in such approaches can be either through

  17. Fundamental problems in provable security and cryptography.

    Science.gov (United States)

    Dent, Alexander W

    2006-12-15

    This paper examines methods for formally proving the security of cryptographic schemes. We show that, despite many years of active research and dozens of significant results, there are fundamental problems which have yet to be solved. We also present a new approach to one of the more controversial aspects of provable security, the random oracle model.

  18. Fundamentals of C

    CERN Document Server

    Guruprasad, N

    2009-01-01

    The book presents a contemporary approach to programming. Complete C programs are presented as and when required. This book is not a cookbook. To get the maximum benefit from this book, you should take as active a role as possible. Don't just read the examples. Enter them into your system and try them out.

  19. O enfoque qualitativo na avaliação do consumo alimentar: fundamentos, aplicações e considerações operacionais The qualitative approach in the evaluation of food consumption: fundamentals, applications and operational considerations

    Directory of Open Access Journals (Sweden)

    Maria Lúcia Magalhães Bosi

    2011-12-01

    Largely unfounded disputes between advocates of the qualitative and quantitative approaches have hindered the recognition of the benefits of the combined application of both methods in the same study, that is, of a multidimensional and integrated approach. Nevertheless, in recent years the field of Nutrition in Public Health has experienced an increase in the conduct of studies guided not only by measurement but by the combination of qualitative and quantitative methods. Indeed, the qualitative approach has much to contribute to research on food consumption, among many other objects and themes, in which the importance of deepening the understanding of subjective production, expressed in beliefs, attitudes and behaviors, stands out. This article summarizes the nature, fundamentals and usefulness of the qualitative approach in research on food and nutrition, clarifying how these methods have been or can be used to study the complex problems that arise in this field, and confining the discussion to studies of food consumption. The integration of both methods, qualitative and quantitative, through methodological complementarity, can minimize the limitations of employing either approach in isolation.

  20. Fundamentals of nanomechanical resonators

    CERN Document Server

    Schmid, Silvan; Roukes, Michael Lee

    2016-01-01

    This authoritative book introduces and summarizes the latest models and skills required to design and optimize nanomechanical resonators, taking a top-down approach that uses macroscopic formulas to model the devices. The authors cover the electrical and mechanical aspects of nano electromechanical system (NEMS) devices. The introduced mechanical models are also key to the understanding and optimization of nanomechanical resonators used e.g. in optomechanics. Five comprehensive chapters address: The eigenmodes derived for the most common continuum mechanical structures used as nanomechanical resonators; The main sources of energy loss in nanomechanical resonators; The responsiveness of micro and nanomechanical resonators to mass, forces, and temperature; The most common underlying physical transduction mechanisms; The measurement basics, including amplitude and frequency noise. The applied approach found in this book is appropriate for engineering students and researchers working with micro and nanomechanical...

  1. A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach.

    Science.gov (United States)

    Tipton, Elizabeth; Shuster, Jonathan

    2017-10-15

    Bland-Altman method comparison studies are common in the medical sciences and are used to compare a new measure to a gold-standard (often costlier or more invasive) measure. The distribution of these differences is summarized by two statistics, the 'bias' and standard deviation, and these measures are combined to provide estimates of the limits of agreement (LoA). When these LoA are within the bounds of clinically insignificant differences, the new non-invasive measure is preferred. Very often, multiple Bland-Altman studies have been conducted comparing the same two measures, and random-effects meta-analysis provides a means to pool these estimates. We provide a framework for the meta-analysis of Bland-Altman studies, including methods for estimating the LoA and measures of uncertainty (i.e., confidence intervals). Importantly, these LoA are likely to be wider than those typically reported in Bland-Altman meta-analyses. Frequently, Bland-Altman studies report results based on repeated measures designs but do not properly adjust for this design in the analysis. Meta-analyses of Bland-Altman studies frequently exclude these studies for this reason. We provide a meta-analytic approach that allows inclusion of estimates from these studies. This includes adjustments to the estimate of the standard deviation and a method for pooling the estimates based upon robust variance estimation. An example is included based on a previously published meta-analysis. Copyright © 2017 John Wiley & Sons, Ltd.
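
    As a rough illustration of the quantities being pooled, the Python sketch below computes each study's Bland-Altman bias, standard deviation and 95% limits of agreement, and then combines the biases with a conventional DerSimonian-Laird random-effects weighting. The data are simulated, and the pooling shown is the standard textbook estimator, not the robust-variance method proposed by the authors.

        import numpy as np

        def limits_of_agreement(new, gold):
            """Per-study Bland-Altman bias, SD of differences and 95% limits of agreement."""
            d = np.asarray(new, float) - np.asarray(gold, float)
            bias, sd = d.mean(), d.std(ddof=1)
            return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

        def pool_biases(biases, variances):
            """DerSimonian-Laird random-effects pooling of per-study bias estimates."""
            b, v = np.asarray(biases, float), np.asarray(variances, float)
            w = 1.0 / v
            fixed = np.sum(w * b) / w.sum()
            q = np.sum(w * (b - fixed) ** 2)
            tau2 = max(0.0, (q - (len(b) - 1)) / (w.sum() - np.sum(w ** 2) / w.sum()))
            w_star = 1.0 / (v + tau2)
            return np.sum(w_star * b) / w_star.sum(), np.sqrt(1.0 / w_star.sum())

        rng = np.random.default_rng(1)
        biases, variances = [], []
        for n in (25, 40, 60):                       # three simulated studies
            gold = rng.normal(50.0, 5.0, size=n)     # hypothetical gold-standard values
            new = gold + rng.normal(0.4, 1.0, size=n)
            bias, sd, loa = limits_of_agreement(new, gold)
            biases.append(bias)
            variances.append(sd ** 2 / n)            # variance of the bias estimate
        pooled, se = pool_biases(biases, variances)
        print(f"pooled bias = {pooled:.2f} (SE = {se:.2f})")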

  2. Rural eHealth nutrition education for limited-income families: an iterative and user-centered design approach.

    Science.gov (United States)

    Atkinson, Nancy L; Saperstein, Sandra L; Desmond, Sharon M; Gold, Robert S; Billing, Amy S; Tian, Jing

    2009-06-22

    Adult women living in rural areas have high rates of obesity. Although rural populations have been deemed hard to reach, Internet-based programming is becoming a viable strategy as rural Internet access increases. However, when people are able to get online, they may not find information designed for them and their needs, especially harder to reach populations. This results in a "content gap" for many users. User-centered design is a methodology that can be used to create appropriate online materials. This research was conducted to apply a user-centered approach to the design and development of a health promotion website for low-income mothers living in rural Maryland. Three iterative rounds of concept testing were conducted to (1) identify the name and content needs of the site and assess concerns about registering on a health-related website; (2) determine the tone and look of the website and confirm content and functionality; and (3) determine usability and acceptability. The first two rounds involved focus group and small group discussions, and the third round involved usability testing with individual women as they used the prototype system. The formative research revealed that women with limited incomes were enthusiastic about a website providing nutrition and physical activity information targeted to their incomes and tailored to their personal goals and needs. Other priority content areas identified were budgeting, local resources and information, and content that could be used with their children. Women were able to use the prototype system effectively. This research demonstrated that user-centered design strategies can help close the "content gap" for at-risk audiences.

  3. Fundamentals of Space Medicine

    Science.gov (United States)

    Clément, Gilles

    2005-03-01

    A total of more than 240 human space flights have been completed to date, involving about 450 astronauts from various countries, for a combined total presence in space of more than 70 years. The seventh long-duration expedition crew is currently in residence aboard the International Space Station, continuing a permanent presence in space that began in October 2000. During that time, investigations have been conducted on both humans and animal models to study the bone demineralization and muscle deconditioning, space motion sickness, the causes and possible treatment of postflight orthostatic intolerance, the changes in immune function, crew and crew-ground interactions, and the medical issues of living in a space environment, such as the effects of radiation or the risk of developing kidney stones. Some results of these investigations have led to fundamental discoveries about the adaptation of the human body to the space environment. Gilles Clément has been active in this research. This readable text presents the findings from the life science experiments conducted during and after space missions. Topics discussed in this book include: adaptation of sensory-motor, cardio-vascular, bone, and muscle systems to the microgravity of spaceflight; psychological and sociological issues of living in a confined, isolated, and stressful environment; operational space medicine, such as crew selection, training and in-flight health monitoring, countermeasures and support; results of space biology experiments on individual cells, plants, and animal models; and the impact of long-duration missions such as the human mission to Mars. The author also provides a detailed description of how to fly a space experiment, based on his own experience with research projects conducted onboard Salyut-7, Mir, Spacelab, and the Space Shuttle. Now is the time to look at the future of human spaceflight and what comes next. The future human exploration of Mars captures the imagination of both the

  4. Physics fundamentals for ITER

    International Nuclear Information System (INIS)

    Rosenbluth, M.N.

    1999-01-01

    The design of an experimental thermonuclear reactor requires both cutting-edge technology and physics predictions precise enough to carry forward the design. The past few years of worldwide physics studies have seen great progress in understanding, innovation and integration. We will discuss this progress and the remaining issues in several key physics areas. (1) Transport and plasma confinement. A worldwide database has led to an 'empirical scaling law' for tokamaks which predicts adequate confinement for the ITER fusion mission, albeit with considerable but acceptable uncertainty. The ongoing revolution in computer capabilities has given rise to new gyrofluid and gyrokinetic simulations of microphysics which may be expected in the near future to attain predictive accuracy. Important databases on H-mode characteristics and helium retention have also been assembled. (2) Divertors, heat removal and fuelling. A novel concept for heat removal - the radiative, baffled, partially detached divertor - has been designed for ITER. Extensive two-dimensional (2D) calculations have been performed and agree qualitatively with recent experiments. Preliminary studies of the interaction of this configuration with core confinement are encouraging and the success of inside pellet launch provides an attractive alternative fuelling method. (3) Macrostability. The ITER mission can be accomplished well within ideal magnetohydrodynamic (MHD) stability limits, except for internal kink modes. Comparisons with JET, as well as a theoretical model including kinetic effects, predict such sawteeth will be benign in ITER. Alternative scenarios involving delayed current penetration or off-axis current drive may be employed if required. The recent discovery of neoclassical beta limits well below ideal MHD limits poses a threat to performance. Extrapolation to reactor scale is as yet unclear. In theory such modes are controllable by current drive profile control or feedback and experiments should

  5. Fundamentals of quantum mechanics

    CERN Document Server

    Erkoc, Sakir

    2006-01-01

    HISTORICAL EXPERIMENTS AND THEORIES: Dates of Important Discoveries and Events; Blackbody Radiation; Photoelectric Effect; Quantum Theory of Spectra; The Compton Effect; Matter Waves, the de Broglie Hypothesis; The Davisson-Germer Experiment; Heisenberg's Uncertainty Principle; Difference Between Particles and Waves; Interpretation of the Wavefunction. AXIOMATIC STRUCTURE OF QUANTUM MECHANICS: The Necessity of Quantum Theory; Function Spaces; Postulates of Quantum Mechanics; The Kronecker Delta and the Dirac Delta Function; Dirac Notation. OBSERVABLES AND SUPERPOSITION: Free Particle; Particle in a Box; Ensemble Average; Hilbert-Space Interpretation; The Initial Square Wave; Particle Beam; Superposition and Uncertainty; Degeneracy of States; Commutators and Uncertainty. TIME DEVELOPMENT AND CONSERVATION THEOREMS: Time Development of State Functions, the Discrete Case; The Continuous Case, Wave Packets; Particle Beam; Gaussian Wave Packet; Free Particle Propagator; The Limiting Cases of the Gaussian Wave Packets; Time Development of Expectation Val...

  6. Technology fundamentals: photovoltaic systems

    International Nuclear Information System (INIS)

    Quaschning, V.

    2006-01-01

    The generation of electric power from photovoltaic systems is described in detail. The mechanism of operation of solar cells is described in terms of photons, electrons, charge carriers and charge separation. The various cells, modules, technical terms and related technology are discussed. The chemical elements used in solar cells are mentioned and the manufacturing processes described. The technical advantages of the newer thin-film modules over the traditional silicon cells are given but at present manufacturing cost is limiting their production. Both stand-alone and grid-connected PV systems are described. The potential market for PV systems is discussed. It is suggested that PV could eventually meet the total global electric power demand. (author)

  7. Intrauterine Insemination: Fundamentals Revisited.

    Science.gov (United States)

    Allahbadia, Gautam N

    2017-12-01

    Intrauterine insemination (IUI) is an assisted conception technique that involves the deposition of a processed semen sample in the upper uterine cavity, overcoming natural barriers to sperm ascent in the female reproductive tract. It is a cost-effective, noninvasive first-line therapy for selected patients with functionally normal tubes, and infertility due to a cervical factor, anovulation, moderate male factor, unexplained factors, immunological factor, and ejaculatory disorders with clinical pregnancy rates per cycle ranging from 10 to 20%. It, however, has limited use in patients with endometriosis, severe male factor infertility, tubal factor infertility, and advanced maternal age ≥ 35 years. IUI may be performed with or without ovarian stimulation. Controlled ovarian stimulation, particularly with low-dose gonadotropins, with IUI offers significant benefit in terms of pregnancy outcomes compared with natural cycle or timed intercourse, while reducing associated COH complications such as multiple pregnancies and ovarian hyperstimulation syndrome. Important prognostic indicators of success with IUI include age of patient, duration of infertility, stimulation protocol, infertility etiology, number of cycles, timing of insemination, number of preovulatory follicles on the day of hCG, processed total motile sperm > 10 million, and insemination count > 1 × 10^6 with > 4% normal spermatozoa. Alternative insemination techniques, such as Fallopian tube sperm perfusion, intracervical insemination, and intratubal insemination, provide no additional benefit compared to IUI. A complete couple workup that includes patient history, physical examination, and clinical and laboratory investigations is mandatory to justify the choice in favor of IUI and guide alternative patient management, while individualizing the treatment protocol according to the patient characteristics with a strict cancelation policy to limit multi-follicular development may help optimize IUI

  8. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2010-01-01

    New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update and Fundamentals is the single best source for the latest developments, trends, and issues in communication technology. Featuring the fundamental framework along with the history and background of communication technologies, Communication Technology Update and Fundamentals, 12th edition helps you stay ahead of these ever-changing and emerging technologies.As always, every chapter ha

  9. Fundamentals of ergonomic exoskeleton robots

    OpenAIRE

    Schiele, A.

    2008-01-01

    This thesis is the first to provide the fundamentals of ergonomic exoskeleton design. The fundamental theory as well as technology necessary to analyze and develop ergonomic wearable robots interacting with humans is established and validated by experiments and prototypes. The fundamentals are (1) a new theoretical framework for analyzing physical human robot interaction (pHRI) with exoskeletons, and (2) a clear set of design rules of how to build wearable, portable exoskeletons to easily and...

  10. Quench limits

    International Nuclear Information System (INIS)

    Sapinski, M.

    2012-01-01

    With thirteen beam-induced quenches and numerous Machine Development tests, the current knowledge of LHC magnet quench limits still contains many unknowns. Various approaches to determining the quench limits are reviewed and the results of the tests are presented. An attempt is made to reconstruct a coherent picture emerging from these results. The available methods for computing the quench levels are presented, together with the dedicated particle shower simulations which are necessary to understand the tests. The future experiments needed to reach a better understanding of the quench limits, as well as of the limits for machine operation, are investigated. The possible strategies for setting BLM (Beam Loss Monitor) thresholds are discussed. (author)

  11. Fundamentals of structural engineering

    CERN Document Server

    Connor, Jerome J

    2016-01-01

    This book presents new methods and tools for the integration and simulation of smart devices. The design approach described in this book explicitly accounts for integration of Smart Systems components and subsystems as a specific constraint. It includes methodologies and EDA tools to enable multi-disciplinary and multi-scale modeling and design, simulation of multi-domain systems, subsystems and components at all levels of abstraction, system integration and exploration for optimization of functional and non-functional metrics. By covering theoretical and practical aspects of smart device design, this book targets people who are working and studying on hardware/software modelling, component integration and simulation under different positions (system integrators, designers, developers, researchers, teachers, students etc.). In particular, it is a good introduction for people who have an interest in managing heterogeneous components in an efficient and effective way on different domains and different abstraction l...

  12. Fundamental limitations on 'warp drive' spacetimes

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Francisco S N [Centro de Astronomia e AstrofIsica da Universidade de Lisboa, Campo Grande, Ed. C8 1749-016 Lisbon (Portugal); Visser, Matt [School of Mathematical and Computing Sciences, Victoria University of Wellington, PO Box 600, Wellington (New Zealand)

    2004-12-21

    'Warp drive' spacetimes are useful as 'gedanken-experiments' that force us to confront the foundations of general relativity and, among other things, to precisely formulate the notion of 'superluminal' communication. After carefully formulating the Alcubierre and Natario warp drive spacetimes, and verifying their non-perturbative violation of the classical energy conditions, we consider a more modest question and apply linearized gravity to the weak-field warp drive, testing the energy conditions to first and second orders of the warp-bubble velocity, v. Since we take the warp-bubble velocity to be non-relativistic, v << c, we are not primarily interested in the 'superluminal' features of the warp drive. Instead we focus on a secondary feature of the warp drive that has not previously been remarked upon: the warp drive (if it could be built) would be an example of a 'reaction-less drive'. For both the Alcubierre and Natario warp drives we find that the occurrence of significant energy condition violations is not just a high-speed effect, but that the violations persist even at arbitrarily low speeds. A particularly interesting feature of this construction is that it is now meaningful to think of placing a finite mass spaceship at the centre of the warp bubble, and then see how the energy in the warp field compares with the mass-energy of the spaceship. There is no hope of doing this in Alcubierre's original version of the warp field, since by definition the point at the centre of the warp bubble moves on a geodesic and is 'massless'. That is, in Alcubierre's original formalism and in the Natario formalism the spaceship is always treated as a test particle, while in the linearized theory we can treat the spaceship as a finite mass object. For both the Alcubierre and Natario warp drives we find that even at low speeds the net (negative) energy stored in the warp fields must be a significant fraction of the mass of the spaceship.

  13. Secret Key Agreement: Fundamental Limits and Practical Challenges

    KAUST Repository

    Rezki, Zouheir; Zorgui, Marwen; Alomair, Basel; Alouini, Mohamed-Slim

    2017-01-01

    that prevent PLS from flourishing at the industrial scale. Most secure message transmission constructions available to date are tied to strong assumptions on CSI, consider simple channel models and undermine eavesdropping capabilities; thus compromising

  14. Searching methods for biometric identification systems: Fundamental limits

    NARCIS (Netherlands)

    Willems, F.M.J.

    2009-01-01

    We study two-stage search procedures for biometric identification systems in an information-theoretical setting. Our main conclusion is that clustering based on vector-quantization achieves the optimum trade-off between the number of clusters (cluster rate) and the number of individuals within a
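
    The two-stage search idea can be sketched generically: vector-quantize the enrollment database into clusters, then search only within the cluster nearest to the probe. The Python sketch below uses synthetic templates and a toy k-means loop; it only illustrates the cluster-then-refine structure, not the information-theoretic trade-off analysed in the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        n_people, dim, n_clusters = 2_000, 16, 32
        enrolled = rng.normal(size=(n_people, dim))          # hypothetical enrollment templates

        # Stage 1: vector-quantize the database with a few k-means iterations.
        centers = enrolled[rng.choice(n_people, n_clusters, replace=False)]
        for _ in range(10):
            dists = ((enrolled[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            labels = dists.argmin(axis=1)
            centers = np.array([enrolled[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(n_clusters)])

        def identify(probe):
            """Two-stage search: nearest cluster first, then a scan of its members only."""
            j = ((centers - probe) ** 2).sum(-1).argmin()
            members = np.flatnonzero(labels == j)
            return members[((enrolled[members] - probe) ** 2).sum(-1).argmin()]

        probe = enrolled[1234] + 0.05 * rng.normal(size=dim)  # noisy observation of person 1234
        print("identified index:", identify(probe))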

  15. Fundamental Limitations to Gain Enhancement in Periodic Media and Waveguides

    DEFF Research Database (Denmark)

    Grgic, Jure; Ott, Johan Raunkjær; Wang, Fengwen

    2012-01-01

    A common strategy to compensate for losses in optical nanostructures is to add gain material in the system. By exploiting slow-light effects it is expected that the gain may be enhanced beyond its bulk value. Here we show that this route cannot be followed uncritically: inclusion of gain inevitably...

  16. Cr3+-doped fluorides and oxides: role of internal fields and limitations of the Tanabe-Sugano approach.

    Science.gov (United States)

    Trueba, A; García-Lastra, J M; Garcia-Fernandez, P; Aramburu, J A; Barriuso, M T; Moreno, M

    2011-11-24

    This work is aimed at clarifying the changes on optical spectra of Cr(3+) impurities due to either a host lattice variation or a hydrostatic pressure, which can hardly be understood by means of the usual Tanabe-Sugano (TS) approach assuming that the Racah parameter, B, grows when covalency decreases. For achieving this goal, the optical properties of Cr(3+)-doped LiBaF(3) and KMgF(3) model systems have been explored by means of high level ab initio calculations on CrF(6)(3-) units subject to the electric field, E(R)(r), created by the rest of the lattice ions. These calculations, which reproduce available experimental data, indicate that the energy, E((2)E), of the (2)E(t(2g)(3)) → (4)A(2)(t(2g)(3)) emission transition is nearly independent of the host lattice. By contrast, the energy difference corresponding to (4)A(2)(t(2g)(3)) → (4)T(1)(t(2g)(2)e(g)(1)) and (4)A(2)(t(2g)(3)) → (4)T(2)(t(2g)(2)e(g)(1)) excitations, Δ((4)T(1); (4)T(2)), is shown to increase on passing from the normal to the inverted perovskite host lattice despite the increase in covalency, a fact which cannot be accounted for through the usual TS model. Similarly, when the Cr(3+)-F(-) distance, R, is reduced both Δ((4)T(1); (4)T(2)) and the covalency are found to increase. By analyzing the limitations of the usual model, we found surprising results that are shown to arise from the deformation of both 3d(Cr) and ligand orbitals in the antibonding e(g) orbital, which has a σ character and is more extended than the π t(2g) orbital. By contrast, because of the higher stiffness of the t(2g) orbital, the dependence of E((2)E) with R basically follows the corresponding variation of covalency in that level. Bearing in mind the similarities of the optical properties displayed by Cr(3+) impurities in oxides and fluorides, the present results can be useful for understanding experimental data on Cr(3+)-based gemstones where the local symmetry is lower than cubic.
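
    For orientation, the conventional Tanabe-Sugano-type analysis whose limitations the paper discusses extracts 10Dq and the Racah parameter B from the two spin-allowed bands of a d3 ion. The Python sketch below applies the standard textbook relations to hypothetical ruby-like band positions; it is not the ab initio treatment used in this work.

        def ligand_field_parameters(nu1, nu2):
            """Conventional d3 octahedral analysis from the two spin-allowed bands.

            nu1: energy of the 4A2 -> 4T2 band in cm^-1 (equal to 10Dq).
            nu2: energy of the 4A2 -> 4T1(F) band in cm^-1.
            Returns (Dq, B) from the usual Tanabe-Sugano-type relations.
            """
            dq = nu1 / 10.0
            b = (2 * nu1 ** 2 + nu2 ** 2 - 3 * nu1 * nu2) / (15 * nu2 - 27 * nu1)
            return dq, b

        # Hypothetical ruby-like band positions (cm^-1).
        dq, b = ligand_field_parameters(18_150.0, 24_700.0)
        print(f"Dq = {dq:.0f} cm^-1, B = {b:.0f} cm^-1")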

  17. Overview of the fundamental safety principles

    International Nuclear Information System (INIS)

    Chishinga, Milton Mulenga

    2015-02-01

    The primary objective of this work was to provide an overview of the International Atomic Energy Agency (IAEA) document 'Fundamental Safety Principles, SF.1'. The document outlines ten (10) fundamental principles which provide the basis for an effective radiation protection framework. The document is the topmost in the hierarchy of the IAEA Safety Standards Series. These principles are the foundation of nuclear safety and put stringent obligations on Parties under the Convention on Nuclear Safety. The fundamental safety objective is to protect people and the environment from harmful effects of ionizing radiation. This objective of protecting people individually and collectively, and the environment, has to be achieved without unduly limiting the operation of facilities or the conduct of activities that give rise to risks. The thematic areas covered are: responsibility for safety, role of government, leadership and management for safety, justification of facilities and activities, optimization of protection, limitation of risks to individuals, protection of present and future generations, prevention of accidents, emergency preparedness and response, and protective actions to reduce existing or unregulated radiation risks. Appropriate recommendations have been provided for effective application of the principles by Governments, Regulatory Bodies and Operating Organizations of facilities and nuclear installations that give rise to radiation risks. (au)

  18. Fundamentals of the DIGES code

    Energy Technology Data Exchange (ETDEWEB)

    Simos, N.; Philippacopoulos, A.J.

    1994-08-01

    Recently the authors have completed the development of the DIGES code (Direct GEneration of Spectra) for the US Nuclear Regulatory Commission. This paper presents the fundamental theoretical aspects of the code. The basic modeling involves a representation of typical building-foundation configurations as multi-degree-of-freedom dynamic systems which are subjected to dynamic inputs in the form of applied forces or pressures at the superstructure or in the form of ground motions. Both the deterministic as well as the probabilistic aspects of DIGES are described. Alternate ways of defining the seismic input for the estimation of in-structure spectra, and their consequences in terms of realistically appraising the variability of the structural response, are discussed in detail. These include definitions of the seismic input by ground acceleration time histories, ground response spectra, Fourier amplitude spectra or power spectral densities. Conversions of one of these forms to another, due to requirements imposed by certain analysis techniques, have been shown to lead, in certain cases, to controversial results. Further considerations include the definition of the seismic input as the excitation which is directly applied at the foundation of a structure or as the ground motion of the site of interest at a given point. In the latter case, issues related to transferring this motion to the foundation through convolution/deconvolution and, more generally, through kinematic interaction approaches are considered.
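
    One of the input conversions mentioned above, turning a ground-acceleration time history into a response spectrum, can be sketched with a sweep over single-degree-of-freedom oscillators. The Python sketch below uses a synthetic record and the standard average-acceleration Newmark scheme; it is a generic textbook calculation, not the DIGES implementation.

        import numpy as np

        def pseudo_acceleration_spectrum(ag, dt, periods, damping=0.05):
            """Pseudo-acceleration response spectrum of a ground-acceleration record.

            ag: acceleration samples, dt: time step (s), periods: SDOF periods (s).
            Each oscillator is integrated with the average-acceleration Newmark scheme.
            """
            spectrum = []
            for T in periods:
                wn = 2.0 * np.pi / T
                c, k = 2.0 * damping * wn, wn ** 2          # damping and stiffness per unit mass
                keff = k + 2.0 * c / dt + 4.0 / dt ** 2
                u, v, a = 0.0, 0.0, -ag[0]
                umax = 0.0
                for p in -ag[1:]:
                    rhs = p + (4.0 / dt ** 2) * u + (4.0 / dt) * v + a + c * ((2.0 / dt) * u + v)
                    un = rhs / keff
                    vn = 2.0 * (un - u) / dt - v
                    an = 4.0 * (un - u) / dt ** 2 - 4.0 * v / dt - a
                    u, v, a = un, vn, an
                    umax = max(umax, abs(u))
                spectrum.append(wn ** 2 * umax)             # pseudo-acceleration Sa = wn^2 * Sd
            return np.array(spectrum)

        # Synthetic "ground motion": amplitude-modulated noise, 20 s sampled at 100 Hz.
        dt = 0.01
        t = np.arange(0.0, 20.0, dt)
        ag = np.random.default_rng(0).normal(size=t.size) * np.exp(-((t - 5.0) / 4.0) ** 2)
        periods = np.linspace(0.05, 3.0, 30)
        print(pseudo_acceleration_spectrum(ag, dt, periods)[:5])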

  19. Nanostructured metals. Fundamentals to applications

    International Nuclear Information System (INIS)

    Grivel, J.-C.; Hansen, N.; Huang, X.; Juul Jensen, D.; Mishin, O.V.; Nielsen, S.F.; Pantleon, W.; Toftegaard, H.; Winther, G.; Yu, T.

    2009-01-01

    In today's world, materials science and engineering must, like other technical fields, focus on sustainability. Raw materials and energy have to be conserved, and metals with improved or new structural and functional properties must be invented, developed and brought to application. In this endeavour a very promising route is to reduce the structural scale of metallic materials, thereby bridging industrial metals of today with emerging nanometals of tomorrow, i.e. structural scales ranging from a few micrometres to the nanometre regime. While taking a focus on metals with structures in this scale regime, the symposium spans from fundamental aspects towards applications, uniting materials scientists and technologists. A holistic approach characterizes the themes of the symposium, encompassing synthesis, characterization, modelling and performance, where in each area significant progress has been made in recent years. Synthesis now covers top-down processes, e.g. plastic deformation, and bottom-up processes, e.g. chemical and physical synthesis. In the area of structural and mechanical characterization, advanced techniques are now widely applied, and in-situ techniques for structural characterization under mechanical or thermal loading are under rapid development in both 2D and 3D. Progress in characterization techniques has led to a precise description of different boundaries (grain, dislocation, twin, phase), and of how they form and evolve, also including theoretical modelling and simulations of structures, properties and performance. (au)

  20. Connecting Fundamental Constants

    International Nuclear Information System (INIS)

    Di Mario, D.

    2008-01-01

    A model for a black hole electron is built from three basic constants only: h, c and G. The result is a description of the electron with its mass and charge. The nature of this black hole seems to fit the properties of the Planck particle, and new relationships among basic constants are possible. The time dilation factor in a black hole associated with a variable gravitational field would appear to us as a charge; on the other hand, the Planck time acts as a time gap that drastically limits what we are able to measure, and its dimension will appear in some quantities. This is why the Planck time is numerically very close to the gravitational/electric force ratio in an electron: the difference, disregarding a π√(2) factor, is only 0.2%. This is not a coincidence: it is always the same particle, and the small difference is between a rotating and a non-rotating particle. The determination of its rotational speed yields accurate numbers for many quantities, including the fine structure constant and the electron magnetic moment
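
    The numerical comparison claimed above can be checked directly with standard constants. The Python sketch below compares the SI numerical value of the (ħ-based) Planck time with the dimensionless gravitational-to-electric force ratio for two electrons, including the π√2 factor quoted in the abstract; it merely illustrates the stated numerical coincidence, using rounded CODATA-style values.

        import math

        G    = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
        hbar = 1.0546e-34     # reduced Planck constant, J s
        c    = 2.9979e8       # speed of light, m s^-1
        ke   = 8.9875e9       # Coulomb constant, N m^2 C^-2
        m_e  = 9.1094e-31     # electron mass, kg
        e    = 1.6022e-19     # elementary charge, C

        t_planck = math.sqrt(hbar * G / c ** 5)        # Planck time in seconds
        force_ratio = G * m_e ** 2 / (ke * e ** 2)     # gravitational / electric force for two electrons

        print(f"Planck time                : {t_planck:.3e} s")
        print(f"force ratio                : {force_ratio:.3e}")
        print(f"force ratio / (pi*sqrt(2)) : {force_ratio / (math.pi * math.sqrt(2)):.3e}")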

  1. Fundamentals of quantum information

    International Nuclear Information System (INIS)

    Zeilinger, A.

    1998-01-01

    The fact that information is physical means that the laws of quantum mechanics can be used to process and transmit it in ways that are not possible with existing systems. Ever since its invention in the 1920s, quantum physics has given rise to countless discussions about its meaning and about how to interpret the theory correctly. These discussions focus on issues like the Einstein-Podolsky-Rosen paradox, quantum non-locality and the role of measurement in quantum physics. In recent years, however, research into the very foundations of quantum mechanics has also led to a new field: quantum information technology. The use of quantum physics could revolutionize the way we communicate and process information. The important new observation is that information is not independent of the physical laws used to store and process it (see Landauer in further reading). Although modern computers rely on quantum mechanics to operate, the information itself is still encoded classically. A new approach is to treat information as a quantum concept and to ask what new insights can be gained by encoding this information in individual quantum systems. In other words, what happens when both the transmission and processing of information are governed by quantum laws? (UK)

  2. Fundamental volatility is regime specific

    NARCIS (Netherlands)

    Arnold, I.J.M.; MacDonald, R.; Vries, de C.G.

    2006-01-01

    A widely held notion is that freely floating exchange rates are excessively volatile when judged against fundamentals and when moving from fixed to floating exchange rates. We re-examine the data and conclude that the disparity between the fundamentals and exchange rate volatility is more

  3. The emergence of the dimensions and fundamental forces in the universe, an information-theoretical approach for explaining the quantity ratios of the fundamental interactions. 2. rev. and enl. ed.; Die Entstehung der Dimensionen und Grundkraefte im Universum, ein informationstheoretischer Ansatz zur Erklaerung der Groessenverhaeltnisse der Fundamentalen Wechselwirkungen

    Energy Technology Data Exchange (ETDEWEB)

    Ganter, Bernd

    2013-06-01

    After a description of the four fundamental interactions and the connection of information with energy, the principle of fast maximization together with the Ganter tableau is described. Then, as an example, the derivation of the value of the fine-structure constant from the Ganter tableau is described. Thereafter, the extension of the Ganter tableau, further properties of the Ganter tableau, and the persuasiveness of the Ganter tableau are considered. (HSI)

  4. Fundamental plasma emission involving ion sound waves

    International Nuclear Information System (INIS)

    Cairns, I.H.

    1987-01-01

    The theory for fundamental plasma emission by the three-wave processes L ± S → T (where L, S and T denote Langmuir, ion sound and transverse waves, respectively) is developed. Kinematic constraints on the characteristics and growth lengths of waves participating in the wave processes are identified. In addition the rates, path-integrated wave temperatures, and limits on the brightness temperature of the radiation are derived. (author)
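
    The kinematic constraints follow from frequency and wavenumber matching, ω_T = ω_L ± ω_S and k_T = k_L ± k_S, combined with the dispersion relations of the three wave modes. The Python sketch below checks the matching for the L + S → T branch using hypothetical solar-corona-type parameters; the numbers are illustrative only and are not taken from the paper.

        import numpy as np

        # Hypothetical solar-corona-type plasma parameters (SI units).
        n_e, T_e, T_i = 1e14, 2e6, 1e6
        e, m_e, m_i = 1.602e-19, 9.109e-31, 1.673e-27
        eps0, kB, c = 8.854e-12, 1.381e-23, 2.998e8

        w_p = np.sqrt(n_e * e ** 2 / (eps0 * m_e))        # electron plasma frequency (rad/s)
        v_e = np.sqrt(kB * T_e / m_e)                     # electron thermal speed (m/s)
        c_s = np.sqrt(kB * (T_e + 3.0 * T_i) / m_i)       # ion sound speed (m/s)

        k_L = 0.1 * w_p / v_e                             # a representative Langmuir wavenumber
        w_L = np.sqrt(w_p ** 2 + 3.0 * k_L ** 2 * v_e ** 2)

        # For L + S -> T the transverse wave carries almost no momentum, since
        # k_T = sqrt(w_T^2 - w_p^2)/c << k_L, so the ion sound wave takes up k_S ~ k_L.
        k_S = k_L
        w_S = k_S * c_s
        w_T = w_L + w_S                                   # frequency matching
        k_T = np.sqrt(w_T ** 2 - w_p ** 2) / c            # transverse-wave dispersion

        print(f"w_T / w_p = {w_T / w_p:.4f}   (emission just above the plasma frequency)")
        print(f"k_T / k_L = {k_T / k_L:.3f}   (transverse wavenumber is comparatively negligible)")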

  5. Fundamental of cryogenics (for superconducting RF technology)

    CERN Document Server

    Pierini, Paolo

    2013-01-01

    This review briefly illustrates a few fundamental concepts of cryogenic engineering, the technological practice that allows reaching and maintaining the low-temperature operating conditions of the superconducting devices needed in particle accelerators. To limit the scope of the task, and not to duplicate coverage of cryogenic engineering concepts particularly relevant to superconducting magnets that can be found in previous CAS editions, the overview presented in this course focuses on superconducting radio-frequency cavities.

  6. A Case for Flexible Epistemology and Metamethodology in Religious Fundamentalism Research

    Directory of Open Access Journals (Sweden)

    Carter J. Haynes

    2010-07-01

    After reviewing a representative sample of current and historical research in religious fundamentalism, the author addresses the epistemological presuppositions supporting both quantitative and qualitative methodologies and argues for epistemological flexibility and metamethodology, both of which support and are supported by metatheoretical thinking. Habermas’ concept of the scientistic self-understanding of the sciences is used to point up the limitations of positivist epistemology, especially in the context of fundamentalism research. A metamethodological approach, supported by epistemological flexibility, makes dialogical engagement between researchers and those they research possible, and an example of how this would look in an actual research design is provided. The article concludes with a theoretical statement and graphic representation of a model for dialogical engagement between Western scholars and non-Western religious fundamentalists. Such engagement, the author argues, is necessary before any real progress on the “problem” of radicalized fundamentalism can be made.

  7. Sensitivity and uncertainty analyses applied to one-dimensional radionuclide transport in a layered fractured rock: Evaluation of the Limit State approach, Iterative Performance Assessment, Phase 2

    International Nuclear Information System (INIS)

    Wu, Y.T.; Gureghian, A.B.; Sagar, B.; Codell, R.B.

    1992-12-01

    The Limit State approach is based on partitioning the parameter space into two parts: one in which the performance measure is smaller than a chosen value (called the limit state), and the other in which it is larger. Through a Taylor expansion at a suitable point, the partitioning surface (called the limit state surface) is approximated as either a linear or quadratic function. The success and efficiency of the limit state method depend upon choosing an optimum point for the Taylor expansion. The point in the parameter space that has the highest probability of producing the value chosen as the limit state is optimal for expansion. When the parameter space is transformed into a standard Gaussian space, the optimal expansion point, known as the Most Probable Point (MPP), has the property that its location on the Limit State surface is closest to the origin. Additionally, the projections onto the parameter axes of the vector from the origin to the MPP are the sensitivity coefficients. Once the MPP is determined and the Limit State surface approximated, formulas (see Equations 4-7 and 4-8) are available for determining the probability of the performance measure being less than the limit state. By choosing a succession of limit states, the entire cumulative distribution of the performance measure can be determined. Methods for determining the MPP and also for improving the estimate of the probability are discussed in this report.
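
    As an aside for readers unfamiliar with the method, the sketch below illustrates the generic first-order version of this idea: find the MPP as the point on the limit-state surface closest to the origin in standard Gaussian space, then read off the reliability index, the sensitivity coefficients, and a first-order failure probability. The performance function g() is a hypothetical stand-in, not the report's radionuclide-transport model, and scipy is used for the constrained minimization.

```python
# Minimal FORM-style sketch of the Limit State idea described above.
import numpy as np
from scipy.optimize import NonlinearConstraint, minimize
from scipy.stats import norm

def g(u):
    # Hypothetical performance function in standard Gaussian space;
    # negative values mean the performance measure exceeds the limit state.
    return 4.0 - u[0] - 0.8 * u[1] ** 2

# Most Probable Point (MPP): the point on the limit-state surface g(u) = 0
# that lies closest to the origin in standard Gaussian space.
res = minimize(lambda u: np.dot(u, u), x0=np.array([0.1, 0.1]),
               constraints=NonlinearConstraint(g, 0.0, 0.0))
u_mpp = res.x
beta = np.linalg.norm(u_mpp)   # reliability index (distance to the surface)
alpha = u_mpp / beta           # sensitivity coefficients (projections of the MPP vector)
p_fail = norm.cdf(-beta)       # first-order estimate of P(performance exceeds the limit state)

print("MPP:", u_mpp, "beta:", beta, "alpha:", alpha, "P_f ~", p_fail)
```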

  8. Fundamentals and Techniques of Nonimaging

    Energy Technology Data Exchange (ETDEWEB)

    O' Gallagher, J. J.; Winston, R.

    2003-07-10

    other system parameter permits the construction of whole new classes of devices with greatly expanded capabilities compared to conventional approaches. These ''tailored edge-ray'' designs have dramatically broadened the range of geometries in which nonimaging optics can provide a significant performance improvement. Considerable progress continues to be made in furthering the incorporation of nonimaging secondaries into practical high concentration and ultra-high concentration solar collector systems. In parallel with the continuing development of nonimaging geometrical optics, our group has been working to develop an understanding of certain fundamental physical optics concepts in the same context. In particular, our study of the behavior of classical radiance in nonimaging systems, has revealed some fundamentally important new understandings that we have pursued both theoretically and experimentally. The field is still relatively new and is rapidly gaining widespread recognition because it fuels many industrial applications. Because of this, during the final years of the project, our group at Chicago has been working more closely with a team of industrial scientists from Science Applications International Corporation (SAIC) at first informally, and later more formally, beginning in 1998, under a formal program initiated by the Department of Energy and incrementally funded through this existing grant. This collaboration has been very fruitful and has led to new conceptual breakthroughs which have provided the foundation for further exciting growth. Many of these concepts are described in some detail in the report.

  9. The Continuum Limit of Causal Fermion Systems

    OpenAIRE

    Finster, Felix

    2016-01-01

    This monograph introduces the basic concepts of the theory of causal fermion systems, a recent approach to the description of fundamental physics. The theory yields quantum mechanics, general relativity and quantum field theory as limiting cases and is therefore a candidate for a unified physical theory. From the mathematical perspective, causal fermion systems provide a general framework for describing and analyzing non-smooth geometries and "quantum geometries." The dynamics is described by...

  10. Correlation or Limits of Agreement? Applying the Bland-Altman Approach to the Comparison of Cognitive Screening Instruments.

    Science.gov (United States)

    Larner, A J

    2016-01-01

    Calculation of correlation coefficients is often undertaken as a way of comparing different cognitive screening instruments (CSIs). However, test scores may correlate but not agree, and high correlation may mask lack of agreement between scores. The aim of this study was to use the methodology of Bland and Altman to calculate limits of agreement between the scores of selected CSIs and contrast the findings with Pearson's product moment correlation coefficients between the test scores of the same instruments. Datasets from three pragmatic diagnostic accuracy studies which examined the Mini-Mental State Examination (MMSE) vs. the Montreal Cognitive Assessment (MoCA), the MMSE vs. the Mini-Addenbrooke's Cognitive Examination (M-ACE), and the M-ACE vs. the MoCA were analysed to calculate correlation coefficients and limits of agreement between test scores. Although test scores were highly correlated (all >0.8), calculated limits of agreement were broad (all >10 points), and in one case, MMSE vs. M-ACE, was >15 points. Correlation is not agreement. Highly correlated test scores may conceal broad limits of agreement, consistent with the different emphases of different tests with respect to the cognitive domains examined. Routine incorporation of limits of agreement into diagnostic accuracy studies which compare different tests merits consideration, to enable clinicians to judge whether or not their agreement is close. © 2016 S. Karger AG, Basel.
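
    A minimal sketch of the contrast drawn in this record, using mock paired test scores (not the study's data): the two instruments correlate strongly yet show a systematic bias and broad limits of agreement.

```python
# Pearson correlation versus Bland-Altman limits of agreement for two
# hypothetical cognitive test scores.
import numpy as np

rng = np.random.default_rng(0)
mmse = rng.normal(24, 4, 100)                # hypothetical MMSE-like scores
moca = mmse - 4 + rng.normal(0, 2, 100)      # correlated but systematically lower

r = np.corrcoef(mmse, moca)[0, 1]            # Pearson correlation

diff = mmse - moca
bias = diff.mean()                           # mean difference between the two tests
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement

print(f"Pearson r = {r:.2f}")
print(f"bias = {bias:.1f}, limits of agreement = {loa[0]:.1f} to {loa[1]:.1f}")
```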

  11. Quantum Limits of Space-to-Ground Optical Communications

    Science.gov (United States)

    Hemmati, H.; Dolinar, S.

    2012-01-01

    For a pure loss channel, the ultimate capacity can be achieved with classical coherent states (i.e., ideal laser light): (1) the capacity-achieving receiver (measurement) is yet to be determined; (2) heterodyne detection approaches the ultimate capacity at high mean photon numbers; (3) photon counting approaches the ultimate capacity at low mean photon numbers. A number of current technology limits drive the achievable performance of free-space communication links. Approaching fundamental limits in the bandwidth-limited regime: (1) heterodyne detection with high-order coherent-state modulation approaches the ultimate limits; SOA improvements to laser phase noise and adaptive optics systems for atmospheric transmission would help. (2) High-order intensity modulation and photon counting can approach heterodyne detection within approximately a factor of 2; this may have advantages over coherent detection in the presence of turbulence. Approaching fundamental limits in the photon-limited regime: (1) low-duty-cycle binary coherent-state modulation (OOK, PPM) approaches the ultimate limits; SOA improvements to laser extinction ratio, receiver dark noise, jitter, and blocking would help. (2) In some link geometries (near-field links) number-state transmission could improve over coherent-state transmission.
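
    The comparison above rests on standard capacity expressions for the pure-loss bosonic channel from the quantum-information literature (these formulas are not taken from this record); a minimal sketch, with the mean received photon number as a free parameter:

```python
# Ultimate (Holevo) capacity of the pure-loss channel versus ideal heterodyne
# detection, in bits per mode, as a function of mean received photon number.
import numpy as np

def g(x):
    # Entropy of a thermal state with mean photon number x, in bits.
    return (x + 1) * np.log2(x + 1) - x * np.log2(x) if x > 0 else 0.0

def ultimate_capacity(n_mean):
    # Holevo limit of the pure-loss channel, achievable with coherent states.
    return g(n_mean)

def heterodyne_capacity(n_mean):
    # Shannon capacity with ideal heterodyne detection.
    return np.log2(1 + n_mean)

for n in (0.01, 0.1, 1.0, 10.0):
    print(f"n_mean = {n:5.2f}: ultimate {ultimate_capacity(n):.3f} bits/mode, "
          f"heterodyne {heterodyne_capacity(n):.3f} bits/mode")
```

    The gap between the two widens at low photon numbers (the photon-limited regime, where photon counting helps) and narrows at high photon numbers, matching the qualitative statements in the abstract.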

  12. Approach to high stability beta limit and its control by fast wave current drive in reversed field pinch plasma

    International Nuclear Information System (INIS)

    Kusano, K.; Kondoh, Y.; Gesso, H.; Osanai, Y.; Saito, K.N.; Ukai, R.; Nanba, T.; Nagamine, Y.; Shiina, S.

    2001-01-01

    Before generating a steady-state, dynamo-free RFP configuration by an rf current-driving scheme, it is necessary to find an optimum configuration with a high stability beta limit against m=1 resonant resistive MHD modes and a reduced nonlinear turbulence level with less rf power. As a first step in the optimization study, we are interested in the partially relaxed state model (PRSM) RFP configuration, which is considered to be closer to a relaxed state at finite beta since it has force-free fields in the poloidal direction with a relatively shorter characteristic length of relaxation and a relatively higher stability beta limit to m=1 resonant ideal MHD modes. The stability beta limit to m=1 resonant resistive MHD modes is predicted to be relatively high among RFP models and to be enhanced by current density profile control using fast magnetosonic waves (FMW), which are accessible to the high-density region with a strong absorption rate. (author)

  13. Fundamental principles of heat transfer

    CERN Document Server

    Whitaker, Stephen

    1977-01-01

    Fundamental Principles of Heat Transfer introduces the fundamental concepts of heat transfer: conduction, convection, and radiation. It presents theoretical developments and example and design problems and illustrates the practical applications of fundamental principles. The chapters in this book cover various topics such as one-dimensional and transient heat conduction, energy and turbulent transport, forced convection, thermal radiation, and radiant energy exchange. There are example problems and solutions at the end of every chapter dealing with design problems. This book is a valuable int

  14. Fundamental number theory with applications

    CERN Document Server

    Mollin, Richard A

    2008-01-01

    An update of the most accessible introductory number theory text available, Fundamental Number Theory with Applications, Second Edition presents a mathematically rigorous yet easy-to-follow treatment of the fundamentals and applications of the subject. The substantial amount of reorganizing makes this edition clearer and more elementary in its coverage. New to the Second Edition: removal of all advanced material to be even more accessible in scope; new fundamental material, including partition theory, generating functions, and combinatorial number theory; expa

  15. Design of the impact limiters of a Type B(U) package. Drop tests and validation of the analytical model; Diseno de los Limitadores de impacto de un Bulto Tipo B(U). Ensayos de Caida y validacion del Modelo Analitico

    Energy Technology Data Exchange (ETDEWEB)

    Garrido Quevedo, D.

    2013-07-01

    In the design of a container for the transportation of spent fuel, the impact limiters are a fundamental part of compliance with regulatory requirements. The aim is to confirm through real tests that the design and the results obtained through simulation conform to reality with a high degree of confidence... The combination of tests on scale models and the validation of the calculation methods are necessary tools for the design of the impact limiters of a spent fuel transport container.

  16. A Risk Based Approach to Limit the Effects of Covert Channels for Internet Sensor Data Aggregators for Sensor Privacy

    Science.gov (United States)

    Viecco, Camilo H.; Camp, L. Jean

    Effective defense against Internet threats requires data on global real time network status. Internet sensor networks provide such real time network data. However, an organization that participates in a sensor network risks providing a covert channel to attackers if that organization’s sensor can be identified. While there is benefit for every party when any individual participates in such sensor deployments, there are perverse incentives against individual participation. As a result, Internet sensor networks currently provide limited data. Ensuring anonymity of individual sensors can decrease the risk of participating in a sensor network without limiting data provision.

  17. Optofluidic bioanalysis: fundamentals and applications

    Directory of Open Access Journals (Sweden)

    Ozcelik Damla

    2017-03-01

    Full Text Available Over the past decade, optofluidics has established itself as a new and dynamic research field for exciting developments at the interface of photonics, microfluidics, and the life sciences. The strong desire for developing miniaturized bioanalytic devices and instruments, in particular, has led to novel and powerful approaches to integrating optical elements and biological fluids on the same chip-scale system. Here, we review the state-of-the-art in optofluidic research with emphasis on applications in bioanalysis and a focus on waveguide-based approaches that represent the most advanced level of integration between optics and fluidics. We discuss recent work in photonically reconfigurable devices and various application areas. We show how optofluidic approaches have been pushing the performance limits in bioanalysis, e.g. in terms of sensitivity and portability, satisfying many of the key requirements for point-of-care devices. This illustrates how the requirements for bioanalysis instruments are increasingly being met by the symbiotic integration of novel photonic capabilities in a miniaturized system.

  18. Medical approaches to suffering are limited, so why critique Improving Access to Psychological Therapies from the same ideology.

    Science.gov (United States)

    Binnie, James

    2018-04-01

    Although the article by Scott rightly questions the dynamics of the Improving Access to Psychological Therapies system and re-examines the recovery rates, finding quite shocking results, his recommendations are ultimately flawed. There is a strong critique of the diagnostic procedures in Improving Access to Psychological Therapies services, but the answer is not to diagnose more rigorously and to adhere more strictly to a manualised approach to psychotherapy. The opposite may be required. Alternatives to the medical model of distress offer a less stigmatising and more human approach to helping people with their problems. Perhaps psychological therapists and the people they work alongside would be better served by a psychological approach rather than a psychiatric one.

  19. [Rationalization, rationing, prioritization: terminology and ethical approaches to the allocation of limited resources in hematology/oncology].

    Science.gov (United States)

    Winkler, Eva

    2011-01-01

    The field of oncology with its numerous high-priced innovations contributes considerably to the fact that medical progress is expensive. Additionally, due to the demographic changes and the increasing life expectancy, a growing number of cancer patients want to profit from this progress. Since resources are limited also in the health system, the fair distribution of the available resources urgently needs to be addressed. Dealing with scarcity is a typical problem in the domain of justice theory; therefore, this article first discusses different strategies to manage limited resources: rationalization, rationing, and prioritization. It then presents substantive as well as procedural criteria that assist in the just distribution of effective health benefits. There are various strategies to reduce the utilization of limited resources: Rationalization means that efficiency reserves are being exhausted; by means of rationing, effective health benefits are withheld due to cost considerations. Rationing can occur implicitly and thus covertly, e.g. through budgeting or the implementation of waiting periods, or explicitly, through transparent rules or policies about healthcare coverage. Ranking medical treatments according to their importance (prioritization) is often a prerequisite for rationing decisions. In terms of requirements of justice, both procedural and substantive criteria (e.g. equality, urgency, benefit) are relevant for the acceptance and quality of a decision to limit access to effective health benefits. Copyright © 2011 S. Karger AG, Basel.

  20. Quantum mechanics I the fundamentals

    CERN Document Server

    Rajasekar, S

    2015-01-01

    Quantum Mechanics I: The Fundamentals provides a graduate-level account of the behavior of matter and energy at the molecular, atomic, nuclear, and sub-nuclear levels. It covers basic concepts, mathematical formalism, and applications to physically important systems.

  1. Are fundamental constants really constant

    International Nuclear Information System (INIS)

    Norman, E.B.

    1986-01-01

    Reasons for suspecting that fundamental constants might change with time are reviewed. Possible consequences of such variations are examined. The present status of experimental tests of these ideas is discussed

  2. Fundamentals of modern unsteady aerodynamics

    CERN Document Server

    Gülçat, Ülgen

    2010-01-01

    This introduction to the principles of unsteady aerodynamics covers all the core concepts, provides readers with a review of the fundamental physics, terminology and basic equations, and covers hot new topics such as the use of flapping wings for propulsion.

  3. Fundamentals of electronic image processing

    CERN Document Server

    Weeks, Arthur R

    1996-01-01

    This book is directed to practicing engineers and scientists who need to understand the fundamentals of image processing theory and algorithms to perform their technical tasks. It is intended to fill the gap between existing high-level texts dedicated to specialists in the field and the need for a more practical, fundamental text on image processing. A variety of example images are used to enhance reader understanding of how particular image processing algorithms work.

  4. Qualitative insights on fundamental mechanics

    OpenAIRE

    Mardari, G. N.

    2002-01-01

    The gap between classical mechanics and quantum mechanics has an important interpretive implication: the Universe must have an irreducible fundamental level, which determines the properties of matter at higher levels of organization. We show that the main parameters of any fundamental model must be theory-independent. They cannot be predicted, because they cannot have internal causes. However, it is possible to describe them in the language of classical mechanics. We invoke philosophical reas...

  5. Cr3+-Doped Fluorides and Oxides: Role of Internal Fields and Limitations of the Tanabe–Sugano Approach

    DEFF Research Database (Denmark)

    Trueba, A.; García Lastra, Juan Maria; Garcia-Fernandez, P.

    2011-01-01

    This work is aimed at clarifying the changes in the optical spectra of Cr3+ impurities due to either a host lattice variation or a hydrostatic pressure, which can hardly be understood by means of the usual Tanabe-Sugano (TS) approach assuming that the Racah parameter, B, grows when covalency decreases...

  6. Degeneration of penicillin production in ethanol-limited chemostat cultivations of Penicillium chrysogenum : A systems biology approach

    NARCIS (Netherlands)

    Douma, Rutger D.; Batista, Joana M.; Touw, Kai M.; Kiel, Jan A. K. W.; Zhao, Zheng; Veiga, Tania; Klaassen, Paul; Bovenberg, Roel A. L.; Daran, Jean-Marc; van Gulik, Walter M.; Heijnen, J.J.; Krikken, Arjen

    2011-01-01

    Background: In microbial production of non-catabolic products such as antibiotics a loss of production capacity upon long-term cultivation (for example chemostat), a phenomenon called strain degeneration, is often observed. In this study a systems biology approach, monitoring changes from gene to

  7. With Iterative and Bosonized Coupling towards Fundamental Particle Properties

    CERN Document Server

    Binder, B

    2003-01-01

    Previous results have shown that the linear topological potential-to-phase relationship (well known from Josephson junctions) is the key to iterative coupling and non-perturbative bosonization of the 2 two-spinor Dirac equation. In this paper those results are combined to approach the nature of proton, neutron, and electron via extrapolations from Planck units to the System of Units (SI). The electron acts as a bosonizing bridge between opposite parity topological currents. The resulting potentials and masses are based on a fundamental soliton mass limit and two iteratively obtained coupling constants, where one is the fine structure constant. The simple non-perturbative and relativistic results are within measurement uncertainty and show a very high significance. The deviations for the proton and electron masses are approximately 1 ppb ($10^{-9}$), and for the neutron 4 ppb.

  8. With Iterative and Bosonized Coupling towards Fundamental Particle Properties

    CERN Document Server

    Binder, B

    2002-01-01

    Previous results have shown that the linear topological potential-to-phase relationship (well known from Josephson junctions) is the key to iterative coupling and non-perturbative bosonization of the 2 two-spinor Dirac equation. In this paper those results are combined to approach the nature of proton, neutron, and electron via extrapolations from the Planck scale to the System of Units (SI). The electron acts as a bosonizing bridge between opposite parity topological currents. The resulting potentials and masses are based on a fundamental soliton mass limit and two iteratively obtained coupling constants, where one is the fine structure constant. The simple non-perturbative and relativistic results are within measurement uncertainty and show a very high significance. The deviations for the proton and electron masses are approximately 1 ppb (10^-9), and for the neutron 4 ppb.

  9. Estimating the snowfall limit in alpine and pre-alpine valleys: A local evaluation of operational approaches

    Science.gov (United States)

    Fehlmann, Michael; Gascón, Estíbaliz; Rohrer, Mario; Schwarb, Manfred; Stoffel, Markus

    2018-05-01

    The snowfall limit has important implications for different hazardous processes associated with prolonged or heavy precipitation such as flash floods, rain-on-snow events and freezing precipitation. To increase preparedness and to reduce risk in such situations, early warning systems are frequently used to monitor and predict precipitation events at different temporal and spatial scales. However, in alpine and pre-alpine valleys, the estimation of the snowfall limit remains rather challenging. In this study, we characterize uncertainties related to snowfall limit for different lead times based on local measurements of a vertically pointing micro rain radar (MRR) and a disdrometer in the Zulg valley, Switzerland. Regarding the monitoring, we show that the interpolation of surface temperatures tends to overestimate the altitude of the snowfall limit and can thus lead to highly uncertain estimates of liquid precipitation in the catchment. This bias is much smaller in the Integrated Nowcasting through Comprehensive Analysis (INCA) system, which integrates surface station and remotely sensed data as well as outputs of a numerical weather prediction model. To reduce systematic error, we perform a bias correction based on local MRR measurements and thereby demonstrate the added value of such measurements for the estimation of liquid precipitation in the catchment. Regarding the nowcasting, we show that the INCA system provides good estimates up to 6 h ahead and is thus considered promising for operational hydrological applications. Finally, we explore the medium-range forecasting of precipitation type, especially with respect to rain-on-snow events. We show for a selected case study that the probability for a certain precipitation type in an ensemble-based forecast is more persistent than the respective type in the high-resolution forecast (HRES) of the European Centre for Medium Range Weather Forecasts Integrated Forecasting System (ECMWF IFS). In this case study, the

  10. Approaching the basis set limit for DFT calculations using an environment-adapted minimal basis with perturbation theory: Formulation, proof of concept, and a pilot implementation

    International Nuclear Information System (INIS)

    Mao, Yuezhi; Horn, Paul R.; Mardirossian, Narbe; Head-Gordon, Teresa; Skylaris, Chris-Kriton; Head-Gordon, Martin

    2016-01-01

    Recently developed density functionals have good accuracy for both thermochemistry (TC) and non-covalent interactions (NC) if very large atomic orbital basis sets are used. To approach the basis set limit with potentially lower computational cost, a new self-consistent field (SCF) scheme is presented that employs minimal adaptive basis (MAB) functions. The MAB functions are optimized on each atomic site by minimizing a surrogate function. High accuracy is obtained by applying a perturbative correction (PC) to the MAB calculation, similar to dual basis approaches. Compared to exact SCF results, using this MAB-SCF (PC) approach with the same large target basis set produces <0.15 kcal/mol root-mean-square deviations for most of the tested TC datasets, and <0.1 kcal/mol for most of the NC datasets. The performance of density functionals near the basis set limit can be even better reproduced. With further improvement to its implementation, MAB-SCF (PC) is a promising lower-cost substitute for conventional large-basis calculations as a method to approach the basis set limit of modern density functionals.

  11. Good Administration as a Fundamental Right

    Directory of Open Access Journals (Sweden)

    Margrét Vala Kristjánsdóttir

    2013-06-01

    Full Text Available The EU Charter of Fundamental Rights lists good administration as a fundamental right. The scope of this right, as defined in Article 41 of the EU Charter, is limited to situations in which persons are dealing with the institutions and bodies of the European Union; this gives it a narrower scope than that of the Charter as a whole. This paper discusses the status of this right as a subjective, fundamental right and a codified principle of EU law. The focus is on the question of applicability of the right to situations in which persons are dealing with the institutions and bodies of Member States, and questions are raised regarding the implications of Article 41 in this respect. The paper concludes that Article 41 of the Charter in fact limits the applicability of good administration to the institutions and bodies of the EU. This does not, however, preclude the applicability of a general principle of good administration, as established by the European Court of Justice, to Member States, and the formal recognition of this principle in the EU Charter seems to affect legal reasoning and contribute to some extent to the protection of administrative rules in the implementation of EU law.

  12. Latent Fundamentals Arbitrage with a Mixed Effects Factor Model

    Directory of Open Access Journals (Sweden)

    Andrei Salem Gonçalves

    2012-09-01

    Full Text Available We propose a single-factor mixed effects panel data model to create an arbitrage portfolio that identifies differences in firm-level latent fundamentals. Furthermore, we show that even though the characteristics that affect returns are unknown variables, it is possible to identify the strength of the combination of these latent fundamentals for each stock by following a simple approach using historical data. As a result, a trading strategy that bought the stocks with the best fundamentals (strong fundamentals portfolio) and sold the stocks with the worst ones (weak fundamentals portfolio) realized significant risk-adjusted returns in the U.S. market for the period between July 1986 and June 2008. To ensure robustness, we performed subperiod and seasonal analyses and adjusted for trading costs, and we found further empirical evidence that using a simple investment rule, which identified these latent fundamentals from the structure of past returns, can lead to profit.
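
    Purely as an illustration of the ranking idea (not the paper's exact mixed-effects estimator), the sketch below shrinks each stock's mean excess return toward zero as a crude stand-in for a random firm-level effect and forms a long-short portfolio from the extremes; all data are simulated.

```python
# Illustrative long-short ranking on shrunken stock-level effects.
import numpy as np

rng = np.random.default_rng(1)
n_stocks, n_months = 200, 60
latent = rng.normal(0, 0.002, n_stocks)          # latent firm-level fundamental
market = rng.normal(0.005, 0.04, n_months)       # common factor
returns = latent[:, None] + market[None, :] + rng.normal(0, 0.06, (n_stocks, n_months))

# Empirical-Bayes style shrinkage of each stock's mean excess return toward zero,
# a crude stand-in for the mixed-effects random effect.
stock_mean = returns.mean(axis=1) - returns.mean()
var_within = returns.var(axis=1, ddof=1).mean() / n_months
var_between = max(stock_mean.var(ddof=1) - var_within, 1e-12)
shrinkage = var_between / (var_between + var_within)
effect = shrinkage * stock_mean

top = np.argsort(effect)[-20:]                   # "strong fundamentals" leg
bottom = np.argsort(effect)[:20]                 # "weak fundamentals" leg
print("long-short spread of estimated effects:",
      effect[top].mean() - effect[bottom].mean())
```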

  13. On the Use of Time-Limited Information for Maintenance Decision Support: A Predictive Approach under Maintenance Constraints

    Directory of Open Access Journals (Sweden)

    E. Khoury

    2013-01-01

    Full Text Available This paper deals with a gradually deteriorating system operating under an uncertain environment whose state is only known on a finite rolling horizon. As such, the system is subject to constraints. Maintenance actions can only be planned at imposed times called maintenance opportunities that are available on a limited visibility horizon. This system can, for example, be a commercial vehicle with a monitored critical component that can be maintained only in some specific workshops. Based on the considered system, we aim to use the monitoring data and the time-limited information for maintenance decision support in order to reduce its costs. We propose two predictive maintenance policies based, respectively, on cost and reliability criteria. Classical age-based and condition-based policies are considered as benchmarks. The performance assessment shows the value of the different types of information and the best way to use them in maintenance decision making.

  14. Response approach to the squeezed-limit bispectrum: application to the correlation of quasar and Lyman-α forest power spectrum

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Chi-Ting [C.N. Yang Institute for Theoretical Physics, Stony Brook University, Stony Brook, NY 11794 (United States); Cieplak, Agnieszka M.; Slosar, Anže [Brookhaven National Laboratory, Blgd 510, Upton, NY 11375 (United States); Schmidt, Fabian, E-mail: chi-ting.chiang@stonybrook.edu, E-mail: acieplak@bnl.gov, E-mail: fabians@mpa-garching.mpg.de, E-mail: anze@bnl.gov [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching (Germany)

    2017-06-01

    The squeezed-limit bispectrum, which is generated by nonlinear gravitational evolution as well as inflationary physics, measures the correlation of three wavenumbers in the configuration where one wavenumber is much smaller than the other two. Since the squeezed-limit bispectrum encodes the impact of a large-scale fluctuation on the small-scale power spectrum, it can be understood as how the small-scale power spectrum "responds" to the large-scale fluctuation. Viewed in this way, the squeezed-limit bispectrum can be calculated using the response approach even in the cases which do not submit to perturbative treatment. To illustrate this point, we apply this approach to the cross-correlation between the large-scale quasar density field and the small-scale Lyman-α forest flux power spectrum. In particular, using separate universe simulations which implement changes in the large-scale density, velocity gradient, and primordial power spectrum amplitude, we measure how the Lyman-α forest flux power spectrum responds to the local, long-wavelength quasar overdensity, and equivalently their squeezed-limit bispectrum. We perform a Fisher forecast for the ability of future experiments to constrain local non-Gaussianity using the bispectrum of quasars and the Lyman-α forest. Combining with quasar and Lyman-α forest power spectra to constrain the biases, we find that for DESI the expected 1-σ constraint is err[f_NL] ∼ 60. The ability of DESI to measure f_NL through this channel is limited primarily by the aliasing and instrumental noise of the Lyman-α forest flux power spectrum. The combination of the response approach and separate universe simulations provides a novel technique to explore the constraints from the squeezed-limit bispectrum between different observables.
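
    A minimal numerical sketch of the response idea described above: estimate d ln P(k)/d delta_L by a finite difference between two mock "separate universe" power spectra and combine it with a long-wavelength linear power to obtain the leading-order squeezed-limit bispectrum. All arrays and values are illustrative, not simulation outputs.

```python
# Response-approach sketch: squeezed-limit bispectrum from a power-spectrum response.
import numpy as np

k = np.logspace(-1, 1, 50)        # small-scale wavenumbers [h/Mpc]
delta_L = 0.01                    # amplitude of the long-wavelength mode

# Mock small-scale power spectra measured in +delta_L and -delta_L universes.
P0 = 1e3 * k ** -1.5
P_plus = P0 * (1 + 2.5 * delta_L)
P_minus = P0 * (1 - 2.5 * delta_L)

# Response d ln P(k) / d delta_L, estimated by a central finite difference.
response = (np.log(P_plus) - np.log(P_minus)) / (2 * delta_L)

# Leading-order squeezed-limit bispectrum for a soft mode q with linear power P_L(q):
# B(q -> 0, k) ~ P_L(q) * dP(k)/d delta_L = P_L(q) * P(k) * d ln P(k)/d delta_L.
q, P_L_q = 0.005, 2e4
B_squeezed = P_L_q * response * P0

print("response (first bins):", response[:3])
print("squeezed bispectrum (first bins):", B_squeezed[:3])
```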

  15. [Overcoming the limitations of the descriptive and categorical approaches in psychiatric diagnosis: a proposal based on Bayesian networks].

    Science.gov (United States)

    Sorias, Soli

    2015-01-01

    Efforts to overcome the problems of descriptive and categorical approaches have not yielded results. In the present article, psychiatric diagnosis using Bayesian networks is proposed. Instead of a yes/no decision, Bayesian networks give the probability of diagnostic category inclusion, thereby yielding both a graded, i.e., dimensional diagnosis, and a value of the certainty of the diagnosis. With the use of Bayesian networks in the diagnosis of mental disorders, information about etiology, associated features, treatment outcome, and laboratory results may be used in addition to clinical signs and symptoms, with each of these factors contributing proportionally to their own specificity and sensitivity. Furthermore, a diagnosis (albeit one with a lower probability) can be made even with incomplete, uncertain, or partially erroneous information, and patients whose symptoms are below the diagnostic threshold can be evaluated. Lastly, there is no need of NOS or "unspecified" categories, and comorbid disorders become different dimensions of the diagnostic evaluation. Bayesian diagnoses allow the preservation of current categories and assessment methods, and may be used concurrently with criteria-based diagnoses. Users need not put in extra effort except to collect more comprehensive information. Unlike the Research Domain Criteria (RDoC) project, the Bayesian approach neither increases the diagnostic validity of existing categories nor explains the pathophysiological mechanisms of mental disorders. It, however, can be readily integrated to present classification systems. Therefore, the Bayesian approach may be an intermediate phase between criteria-based diagnosis and the RDoC ideal.
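
    A toy sketch of the graded-diagnosis idea (a simple Bayes computation with a naive-Bayes simplification, not a full clinical network; all numbers are illustrative): evidence updates a prior into a probability of category inclusion, and missing information simply drops out rather than blocking the diagnosis.

```python
# Graded, probabilistic "diagnosis" from partial evidence via Bayes' rule.
prior = {"disorder": 0.15, "no_disorder": 0.85}

# P(symptom present | category); symptoms treated as independent (naive-Bayes simplification).
likelihood = {
    "disorder":    {"low_mood": 0.8, "insomnia": 0.6, "weight_loss": 0.4},
    "no_disorder": {"low_mood": 0.2, "insomnia": 0.3, "weight_loss": 0.1},
}

def posterior(observed):
    # observed: dict symptom -> True / False / None (None = information unavailable,
    # so that symptom simply drops out of the product).
    scores = {}
    for cat, p in prior.items():
        score = p
        for symptom, present in observed.items():
            if present is None:
                continue
            p_s = likelihood[cat][symptom]
            score *= p_s if present else (1 - p_s)
        scores[cat] = score
    total = sum(scores.values())
    return {cat: s / total for cat, s in scores.items()}

# Incomplete information still yields a graded (dimensional) diagnosis.
print(posterior({"low_mood": True, "insomnia": None, "weight_loss": False}))
```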

  16. An Alternative Approach to Overcome the Limitation of HRUs in Analyzing Hydrological Processes Based on Land Use/Cover Change

    Directory of Open Access Journals (Sweden)

    Fanhao Meng

    2018-04-01

    Full Text Available Since the concept of hydrological response units (HRUs) is widely used in hydrological modeling, land use change scenario analysis based on HRUs may directly influence the simulated hydrological processes due to the simplified flow routing and HRU spatial distribution. This paper intends to overcome this issue with a new analysis approach to explaining the impact of land use/cover change on hydrological processes (LUCCIHP), and to compare whether differences exist between the conventional approach and the improved approach. Therefore, we proposed a sub-basin segmentation approach to obtain a more reasonable impact assessment of the LUCC scenario by re-discretizing the HRUs and prolonging the flow path in which the LUCC occurs. As a scenario study, the SWAT model is used in the Aksu River Basin, China, to simulate the response of hydrological processes to LUCC over ten years. Moreover, the impacts of LUCC on hydrological processes before and after model modification are compared and analyzed at three levels (catchment scale, sub-basin scale and HRU scale). Comparative analysis of the Nash-Sutcliffe coefficient (NSE), RSR and Pbias of model simulations before and after model improvement shows that NSE increased by up to 2%, RSR decreased from 0.73 to 0.72, and Pbias decreased from 0.13 to 0.05. The major LUCCs affecting hydrological elements in this basin are related to the degradation of grassland and snow/ice and the expansion of farmland and bare land. Model simulations before and after model improvement show that the average variations of flow components in typical sub-basins (surface runoff, lateral flow and groundwater flow) changed by +11.09%, −4.51%, and −6.58%, and +10.53%, −1.55%, and −8.98% from the base period model scenario, respectively. Moreover, the spatial response of surface runoff at the HRU level reveals clear spatial differences between before and after model improvement. This alternative approach illustrates the potential

  17. Fundamentals of semiconductor manufacturing and process control

    CERN Document Server

    May, Gary S

    2006-01-01

    A practical guide to semiconductor manufacturing from process control to yield modeling and experimental design Fundamentals of Semiconductor Manufacturing and Process Control covers all issues involved in manufacturing microelectronic devices and circuits, including fabrication sequences, process control, experimental design, process modeling, yield modeling, and CIM/CAM systems. Readers are introduced to both the theory and practice of all basic manufacturing concepts. Following an overview of manufacturing and technology, the text explores process monitoring methods, including those that focus on product wafers and those that focus on the equipment used to produce wafers. Next, the text sets forth some fundamentals of statistics and yield modeling, which set the foundation for a detailed discussion of how statistical process control is used to analyze quality and improve yields. The discussion of statistical experimental design offers readers a powerful approach for systematically varying controllable p...

  18. Fundamentals and advanced techniques in derivatives hedging

    CERN Document Server

    Bouchard, Bruno

    2016-01-01

    This book covers the theory of derivatives pricing and hedging as well as techniques used in mathematical finance. The authors use a top-down approach, starting with fundamentals before moving to applications, and present theoretical developments alongside various exercises, providing many examples of practical interest. A large spectrum of concepts and mathematical tools that are usually found in separate monographs are presented here. In addition to the no-arbitrage theory in full generality, this book also explores models and practical hedging and pricing issues. Fundamentals and Advanced Techniques in Derivatives Hedging further introduces advanced methods in probability and analysis, including Malliavin calculus and the theory of viscosity solutions, as well as the recent theory of stochastic targets and its use in risk management, making it the first textbook covering this topic. Graduate students in applied mathematics with an understanding of probability theory and stochastic calculus will find this b...

  19. Effect of yield to tensile (Y/T) ratio on the structural integrity of offshore pipeline: advanced engineering assessment using limit state design approach

    Energy Technology Data Exchange (ETDEWEB)

    Malatesta, G; Mannucci, G; Demofonti, G [Centro Sviluppo Materiali S.p.A., Rome (Italy); Cumino, G [TenarisDalmine (Italy); Izquierdo, A; Tivelli, M [Tenaris Group (Mexico); Quintanilla, H [TENARIS Group (Mexico). TAMSA

    2005-07-01

    Nowadays, specifications require a strict yield-to-tensile (Y/T) ratio limitation; nevertheless, a fully accepted engineering assessment of its influence on pipeline integrity is still lacking. A probabilistic analysis based on a structural reliability approach (Limit State Design) aimed at quantifying the influence of the Y/T ratio on the failure probabilities of offshore pipelines was made. In particular, Tenaris seamless pipe data were used as input for the probabilistic failure analysis. The LSD approach has been applied to two actual deep water design cases selected on purpose, and the most relevant failure modes have been considered. The main result of the work is that the quantitative effect of the Y/T ratio on the failure probabilities of a deep water pipeline turned out to be smaller than expected; it has a minor effect, especially when failure modes are governed by Y only. (author)

  20. Time-domain simulations for metallic nano-structures - a Krylov-subspace approach beyond the limitations of FDTD

    Energy Technology Data Exchange (ETDEWEB)

    Koenig, Michael [Institut fuer Theoretische Festkoerperphysik, Universitaet Karlsruhe (Germany); Karlsruhe School of Optics and Photonics (KSOP), Universitaet Karlsruhe (Germany); Niegemann, Jens; Tkeshelashvili, Lasha; Busch, Kurt [Institut fuer Theoretische Festkoerperphysik, Universitaet Karlsruhe (Germany); DFG Forschungszentrum Center for Functional Nanostructures (CFN), Universitaet Karlsruhe (Germany); Karlsruhe School of Optics and Photonics (KSOP), Universitaet Karlsruhe (Germany)

    2008-07-01

    Numerical simulations of metallic nano-structures are crucial for the efficient design of plasmonic devices. Conventional time-domain solvers such as FDTD introduce large numerical errors especially at metallic surfaces. Our approach combines a discontinuous Galerkin method on an adaptive mesh for the spatial discretisation with a Krylov-subspace technique for the time-stepping procedure. Thus, the higher-order accuracy in both time and space is supported by unconditional stability. As illustrative examples, we compare numerical results obtained with our method against analytical reference solutions and results from FDTD calculations.
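
    A generic sketch of the Krylov-subspace time-stepping ingredient mentioned above: approximate exp(dt*A) applied to a state vector using a small Arnoldi basis instead of the exponential of the full operator. The operator, step size, and dimensions below are illustrative; in the actual method this would be coupled to the discontinuous Galerkin spatial discretisation.

```python
# Krylov (Arnoldi) approximation of y(t + dt) = exp(dt * A) @ y.
import numpy as np
from scipy.linalg import expm

def krylov_expm_apply(A, y, dt, m=20):
    n = y.size
    m = min(m, n)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(y)
    V[:, 0] = y / beta
    for j in range(m):                       # Arnoldi iteration
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:              # happy breakdown: subspace is invariant
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m); e1[0] = 1.0
    # Project, exponentiate the small Hessenberg matrix, and lift back.
    return beta * V[:, :m] @ (expm(dt * H[:m, :m]) @ e1)

# Quick check against the dense matrix exponential on a small random operator.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 200)) / 20
y0 = rng.normal(size=200)
approx = krylov_expm_apply(A, y0, dt=0.5, m=30)
exact = expm(0.5 * A) @ y0
print("relative error:", np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```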

  1. Unified approach to the entropy of an extremal rotating BTZ black hole: Thin shells and horizon limits

    Science.gov (United States)

    Lemos, José P. S.; Minamitsuji, Masato; Zaslavskii, Oleg B.

    2017-10-01

    Using a thin shell, the first law of thermodynamics, and a unified approach, we study the thermodynamics and find the entropy of a (2+1)-dimensional extremal rotating Bañados-Teitelboim-Zanelli (BTZ) black hole. The shell in (2+1) dimensions, i.e., a ring, is taken to be circularly symmetric and rotating, with the inner region being a ground state of the anti-de Sitter spacetime and the outer region being the rotating BTZ spacetime. The extremal rotating BTZ black hole can be obtained in three different ways depending on the way the shell approaches its own gravitational or horizon radius. These ways are explicitly worked out. The resulting three cases give that the BTZ black hole entropy is either the Bekenstein-Hawking entropy, S = A_+/(4G), or an arbitrary function of A_+, S = S(A_+), where A_+ = 2π r_+ is the area, i.e., the perimeter, of the event horizon in (2+1) dimensions. We speculate that the entropy of an extremal black hole should obey 0 ≤ S(A_+) ≤ A_+/(4G). We also show that the contributions from the various thermodynamic quantities, namely the mass, the circular velocity, and the temperature, to the entropy in all three cases are distinct. This study complements previous studies in thin shell thermodynamics and entropy for BTZ black holes. It also corroborates the results found for a (3+1)-dimensional extremal electrically charged Reissner-Nordström black hole.

  2. Equations of viscous flow of silicate liquids with different approaches for universality of high temperature viscosity limit

    Directory of Open Access Journals (Sweden)

    Ana F. Kozmidis-Petrović

    2014-06-01

    Full Text Available The Vogel-Fulcher-Tammann (VFT), Avramov and Milchev (AM), and Mauro, Yue, Ellison, Gupta and Allan (MYEGA) functions of viscous flow are analysed when a compositionally independent high-temperature viscosity limit is introduced instead of the compositionally dependent parameter η∞. Two different approaches are adopted. In the first approach, it is assumed that each model should have its own (average) high-temperature viscosity parameter η∞. In that case, η∞ is different for each of these three models. In the second approach, it is assumed that the high-temperature viscosity is a truly universal value, independent of the model. In this case, the parameter η∞ would be the same and would have the same value: log η∞ = −1.93 dPa·s for all three models. 3D diagrams can successfully predict the difference in behaviour of the viscous functions when the average or the universal high-temperature limit is applied in the calculations. The values of the AM functions depend, to a greater extent, on whether the average or the universal value for η∞ is used, which is not the case with the VFT model. Our tests and the values of the standard error of estimate (SEE) show that there are no general rules on whether the average or the universal high-temperature viscosity limit should be applied to get the best agreement with the experimental functions.
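
    A minimal sketch of the comparison described above, assuming the classic three-parameter VFT form log10 η = log10 η∞ + A/(T − T0) and mock viscosity data: the curve is fitted once with a freely fitted high-temperature limit and once with it pinned to the universal value log η∞ = −1.93 dPa·s.

```python
# VFT fits with a free versus a fixed (universal) high-temperature viscosity limit.
import numpy as np
from scipy.optimize import curve_fit

def vft(T, log_eta_inf, A, T0):
    # VFT equation: log10 eta(T) = log10 eta_inf + A / (T - T0)
    return log_eta_inf + A / (T - T0)

# Mock measurements (temperatures in K, log10 viscosity in dPa.s) for illustration only.
T = np.array([750., 800., 850., 900., 1000., 1100., 1300., 1500.])
log_eta = np.array([12.1, 10.2, 8.7, 7.5, 5.7, 4.4, 2.7, 1.6])

# Free eta_inf: all three VFT parameters fitted.
p_free, _ = curve_fit(vft, T, log_eta, p0=(-2.0, 4000.0, 450.0))

# Universal eta_inf: log10(eta_inf) fixed at -1.93, only A and T0 fitted.
p_fixed, _ = curve_fit(lambda T, A, T0: vft(T, -1.93, A, T0),
                       T, log_eta, p0=(4000.0, 450.0))

print("free  fit: log_eta_inf=%.2f A=%.0f T0=%.0f" % tuple(p_free))
print("fixed fit: log_eta_inf=-1.93 A=%.0f T0=%.0f" % tuple(p_fixed))
```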

  3. Fundamental physics in particle traps

    International Nuclear Information System (INIS)

    Quint, Wolfgang; Vogel, Manuel

    2014-01-01

    The individual topics are covered by leading experts in the respective fields of research. Provides readers with present theory and experiments in this field. A useful reference for researchers. This volume provides detailed insight into the field of precision spectroscopy and fundamental physics with particles confined in traps. It comprises experiments with electrons and positrons, protons and antiprotons, antimatter and highly charged ions, together with corresponding theoretical background. Such investigations represent stringent tests of quantum electrodynamics and the Standard model, antiparticle and antimatter research, test of fundamental symmetries, constants, and their possible variations with time and space. They are key to various aspects within metrology such as mass measurements and time standards, as well as promising to further developments in quantum information processing. The reader obtains a valuable source of information suited for beginners and experts with an interest in fundamental studies using particle traps.

  4. A new analytical approach for limit cycles and quasi-periodic solutions of nonlinear oscillators: the example of the forced Van der Pol Duffing oscillator

    International Nuclear Information System (INIS)

    Shukla, Anant Kant; Ramamohan, T R; Srinivas, S

    2014-01-01

    In this paper we propose a technique to obtain limit cycles and quasi-periodic solutions of forced nonlinear oscillators. We apply this technique to the forced Van der Pol oscillator and the forced Van der Pol Duffing oscillator and obtain for the first time their limit cycles (periodic) and quasi-periodic solutions analytically. We introduce a modification of the homotopy analysis method to obtain these solutions. We minimize the square residual error to obtain accurate approximations to these solutions. The obtained analytical solutions are convergent and agree well with numerical solutions even at large times. Time trajectories of the solution, its first derivative and phase plots are presented to confirm the validity of the proposed approach. We also provide rough criteria for the determination of parameter regimes which lead to limit cycle or quasi-periodic behaviour. (papers)
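
    As a purely numerical companion to the analytical solutions discussed above, one can integrate the oscillator and inspect the long-time attractor. The equation form x'' − μ(1 − x²)x' + x + βx³ = F cos(ωt) and the parameter values below are assumptions for illustration, not the paper's cases.

```python
# Numerical integration of a forced Van der Pol-Duffing oscillator; after the
# transient the trajectory settles onto a limit cycle or quasi-periodic attractor.
import numpy as np
from scipy.integrate import solve_ivp

mu, beta, F, omega = 0.2, 0.5, 0.5, 1.1

def rhs(t, y):
    x, v = y
    return [v, mu * (1 - x**2) * v - x - beta * x**3 + F * np.cos(omega * t)]

t_end = 400.0
sol = solve_ivp(rhs, (0.0, t_end), [0.1, 0.0], max_step=0.05, dense_output=True)

# Discard the transient and sample the attractor for a phase plot (x versus x').
t = np.linspace(t_end / 2, t_end, 4000)
x, v = sol.sol(t)
print("x range on the attractor:", x.min(), x.max())
```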

  5. RFID design fundamentals and applications

    CERN Document Server

    Lozano-Nieto, Albert

    2010-01-01

    RFID is an increasingly pervasive tool that is now used in a wide range of fields. It is employed to substantiate adherence to food preservation and safety standards, combat the circulation of counterfeit pharmaceuticals, and verify the authenticity and history of critical parts used in aircraft and other machinery - and these are just a few of its uses. Going beyond deployment and focusing on exactly how RFID actually works, RFID Design Fundamentals and Applications systematically explores the fundamental principles involved in the design and characterization of RFID technologies. The RFID market is expl

  6. Fundamentals of multicore software development

    CERN Document Server

    Pankratius, Victor; Tichy, Walter F

    2011-01-01

    With multicore processors now in every computer, server, and embedded device, the need for cost-effective, reliable parallel software has never been greater. By explaining key aspects of multicore programming, Fundamentals of Multicore Software Development helps software engineers understand parallel programming and master the multicore challenge. Accessible to newcomers to the field, the book captures the state of the art of multicore programming in computer science. It covers the fundamentals of multicore hardware, parallel design patterns, and parallel programming in C++, .NET, and Java. It

  7. Qualitative insights on fundamental mechanics

    International Nuclear Information System (INIS)

    Mardari, Ghenadie N

    2007-01-01

    The gap between classical mechanics and quantum mechanics has an important interpretive implication: the Universe must have an irreducible fundamental level, which determines the properties of matter at higher levels of organization. We show that the main parameters of any fundamental model must be theory-independent. Moreover, such models must also contain discrete identical entities with constant properties. These conclusions appear to support the work of Kaniadakis on subquantum mechanics. A qualitative analysis is offered to suggest compatibility with relevant phenomena, as well as to propose new means for verification

  8. Astrophysical probes of fundamental physics

    International Nuclear Information System (INIS)

    Martins, C.J.A.P.

    2009-01-01

    I review the motivation for varying fundamental couplings and discuss how these measurements can be used to constrain fundamental physics scenarios that would otherwise be inaccessible to experiment. I highlight the current controversial evidence for varying couplings and present some new results. Finally I focus on the relation between varying couplings and dark energy, and explain how varying coupling measurements might be used to probe the nature of dark energy, with some advantages over standard methods. In particular I discuss what can be achieved with future spectrographs such as ESPRESSO and CODEX.

  9. Astrophysical probes of fundamental physics

    Energy Technology Data Exchange (ETDEWEB)

    Martins, C.J.A.P. [Centro de Astrofisica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); DAMTP, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom)

    2009-10-15

    I review the motivation for varying fundamental couplings and discuss how these measurements can be used to constrain fundamental physics scenarios that would otherwise be inaccessible to experiment. I highlight the current controversial evidence for varying couplings and present some new results. Finally I focus on the relation between varying couplings and dark energy, and explain how varying coupling measurements might be used to probe the nature of dark energy, with some advantages over standard methods. In particular I discuss what can be achieved with future spectrographs such as ESPRESSO and CODEX.

  10. The fundamental interactions of matter

    International Nuclear Information System (INIS)

    Falla, D.F.

    1977-01-01

    Elementary particles are here discussed, in the context of the extent to which the fundamental interactions are related to the elementary constituents of matter. The field quanta related to the four fundamental interactions (electromagnetic, strong, weak and gravitational) are discussed within an historical context beginning with the conception of the photon. The discovery of the mesons and discoveries relevant to the nature of the heavy vector boson are considered. Finally a few recent speculations on the properties of the graviton are examined. (U.K.)

  11. A Swiss Village in the Dutch Tropics: The Limitations of Empire-Centred Approaches to the Early Modern Atlantic World

    Directory of Open Access Journals (Sweden)

    Karwan Fatah-Black

    2013-03-01

    Full Text Available This article considers what the migration circuits to and from Suriname can tell us about Dutch early modern colonisation in the Atlantic world. Did the Dutch have an Atlantic empire that can be studied by treating it as an integrated space, as suggested by New Imperial Historians, or did colonisation rely on circuits outside Dutch control, stretching beyond its imperial space? An empire-centred approach has dominated the study of Suriname’s history and has largely glossed over the routes taken by European migrants to and from the colony. When the empire-centred perspective is transcended it becomes possible to see that colonists arrived in Suriname from a range of different places around the Atlantic and the European hinterland. The article takes an Atlantic or global perspective to demonstrate the choices available to colonists and the networks through which they moved.

  12. Long range surface plasmon resonance with ultra-high penetration depth for self-referenced sensing and ultra-low detection limit using diverging beam approach

    Energy Technology Data Exchange (ETDEWEB)

    Isaacs, Sivan, E-mail: sivan.isaacs@gmail.com; Abdulhalim, Ibrahim [Department of Electro-Optical Engineering and TheIlse Katz Institute for Nanoscale Science and Technology, Ben Gurion University of the Negev, Beer Sheva 84105 (Israel); NEW CREATE Programme, School of Materials Science and Engineering, 1 CREATE Way, Research Wing, #02-06/08, Singapore 138602 (Singapore)

    2015-05-11

    Using an insulator-metal-insulator structure with a dielectric having a refractive index (RI) larger than that of the analyte, a long-range surface plasmon (SP) resonance exhibiting ultra-high penetration depth is demonstrated for sensing applications of large bioentities at wavelengths in the visible range. Based on the diverging beam approach in the Kretschmann-Raether configuration, one of the SP resonances is shown to shift in response to changes in the analyte RI while the other is fixed; thus, it can be used as a built-in reference. The combination of the high sensitivity, high penetration depth, self-reference, and the diverging beam approach, in which a dark line is detected using a large number of camera pixels with a smart algorithm for sub-pixel resolution, yields a sensor with an ultra-low detection limit suitable for large bioentities.

  13. Amorphous Phase Mediated Crystallization: Fundamentals of Biomineralization

    Directory of Open Access Journals (Sweden)

    Wenjing Jin

    2018-01-01

    Full Text Available Many biomineralization systems start from transient amorphous precursor phases, but the exact crystallization pathways and mechanisms remain largely unknown. The study of a well-defined biomimetic crystallization system is key for elucidating the possible mechanisms of biomineralization and monitoring the detailed crystallization pathways. In this review, we focus on amorphous phase mediated crystallization (APMC pathways and their crystallization mechanisms in bio- and biomimetic-mineralization systems. The fundamental questions of biomineralization as well as the advantages and limitations of biomimetic model systems are discussed. This review could provide a full landscape of APMC systems for biomineralization and inspire new experiments aimed at some unresolved issues for understanding biomineralization.

  14. A verdade como um problema fundamental em Kant / Kant on truth as a fundamental problem

    Directory of Open Access Journals (Sweden)

    Adriano Perin

    2010-01-01

    Full Text Available The main point of disagreement about Kant's approach to the problem of truth is whether it can be understood, within the apparatus of contemporary philosophy, as a coherence or a correspondence theory. By favoring a systematic consideration of Kant's argumentation in light of the available literature on the problem, this paper argues for the latter alternative. It sustains the thesis that the definition of truth as "the agreement of cognition with its object" is cogent throughout Kant's thought and that, in this sense, truth ends up being approached not from an established theory but as a problem whose solution cannot be given within the limits of critical-transcendental philosophy. The paper first considers the literature that situates Kant either as a coherentist or as a correspondentist and systematizes the latter alternative into four groups: the ontological reading, the isomorphic reading, the "consequentialist" reading, and the regulative reading. Second, with respect to the pre-critical period, it argues that the coherentist alternative already fails to be confirmed in that period and that, in the 1750s, Kant discards a supposed isomorphic correspondence theory. Finally, it considers the critical argumentation and defends that it conceives truth as a fundamental problem that cannot be handled by a correspondence theory conceived in a "consequentialist" or regulative manner.

  15. Flux Limiter Lattice Boltzmann Scheme Approach to Compressible Flows with Flexible Specific-Heat Ratio and Prandtl Number

    International Nuclear Information System (INIS)

    Gan Yanbiao; Li Yingjun; Xu Aiguo; Zhang Guangcai

    2011-01-01

    We further develop the lattice Boltzmann (LB) model [Physica A 382 (2007) 502] for compressible flows in two respects. Firstly, we modify the Bhatnagar-Gross-Krook (BGK) collision term in the LB equation, which makes the model suitable for simulating flows with different Prandtl numbers. Secondly, the flux limiter finite difference (FLFD) scheme is employed to calculate the convection term of the LB equation, which effectively suppresses unphysical oscillations at discontinuities and significantly diminishes numerical dissipation. The proposed model is validated by recovering results of some well-known benchmarks, including (i) the thermal Couette flow and (ii) one- and two-dimensional Riemann problems. Good agreement is obtained between the LB results and the exact ones or previously reported solutions. The flexibility, together with the high accuracy, of the new model endows it with considerable potential for tracking some long-standing problems and for investigating nonlinear nonequilibrium complex systems. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)
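
    To illustrate the flux-limiter ingredient in isolation (a generic minmod-limited upwind update for linear advection, not the paper's lattice-Boltzmann implementation), a short sketch:

```python
# Minmod-limited upwind step for u_t + a u_x = 0 on a periodic grid; the limiter
# keeps the advected discontinuity free of spurious oscillations.
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def limited_advection_step(u, a, dx, dt):
    # Piecewise-linear reconstruction with minmod-limited slopes.
    slope = minmod(np.roll(u, -1) - u, u - np.roll(u, 1)) / dx
    # Upwind interface values (assumes a > 0): flux through the right face of each cell.
    u_face = u + 0.5 * dx * slope
    flux = a * u_face
    return u - dt / dx * (flux - np.roll(flux, 1))

nx, a = 200, 1.0
x = np.linspace(0, 1, nx, endpoint=False)
dx = x[1] - x[0]
u = np.where((x > 0.2) & (x < 0.4), 1.0, 0.0)   # square pulse with discontinuities
dt = 0.4 * dx / a                               # CFL number 0.4
for _ in range(int(0.5 / dt)):
    u = limited_advection_step(u, a, dx, dt)
print("min/max after advection:", u.min(), u.max())
```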

  16. An Approach to the Prototyping of an Optimized Limited Stroke Actuator to Drive a Low Pressure Exhaust Gas Recirculation Valve.

    Science.gov (United States)

    Gutfrind, Christophe; Dufour, Laurent; Liebart, Vincent; Vannier, Jean-Claude; Vidal, Pierre

    2016-05-20

    The purpose of this article is to describe the design of a limited stroke actuator and the corresponding prototype to drive a Low Pressure (LP) Exhaust Gas Recirculation (EGR) valve for use in Internal Combustion Engines (ICEs). The direct drive actuator topology is an axial flux machine with two air gaps in order to minimize the rotor inertia and a bipolar surface-mounted permanent magnet in order to respect an 80° angular stroke. Firstly, the actuator will be described and optimized under constraints of a 150 ms time response, a 0.363 N·m minimal torque on an angular range from 0° to 80° and prototyping constraints. Secondly, the finite element method (FEM) using the FLUX-3D(®) software (CEDRAT, Meylan, France) will be used to check the actuator performances with consideration of the nonlinear effect of the iron material. Thirdly, a prototype will be made and characterized to compare its measurement results with the analytical model and the FEM model results. With these electromechanical behavior measurements, a numerical model is created with Simulink(®) in order to simulate an EGR system with this direct drive actuator under all operating conditions. Last but not least, the energy consumption of this machine will be estimated to evaluate the efficiency of the proposed EGR electromechanical system.
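
    As a rough sanity check on the figures quoted in the abstract (a minimum torque of 0.363 N·m over an 80° stroke within 150 ms), one can ask how quickly a constant torque would sweep the stroke for a given rotor inertia. The snippet below is a hypothetical back-of-the-envelope estimate, not the authors' FEM or Simulink model; the inertia value is an illustrative assumption, not a figure from the paper.

        # Rigid-body check: constant torque acting on an assumed lumped inertia.
        import math

        torque = 0.363                  # N*m, minimum torque stated in the abstract
        stroke = math.radians(80.0)     # rad, angular stroke stated in the abstract
        t_req = 0.150                   # s, required response time
        inertia = 2.0e-4                # kg*m^2, assumed rotor + valve inertia (not from the paper)

        # theta(t) = 0.5 * (torque / inertia) * t^2  ->  solve for the time to cover the stroke
        t_stroke = math.sqrt(2.0 * stroke * inertia / torque)
        print(f"time to sweep 80 deg: {t_stroke * 1e3:.1f} ms "
              f"({'meets' if t_stroke <= t_req else 'misses'} the {t_req * 1e3:.0f} ms target)")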

  17. Preclinical Torsades-de-Pointes screens: advantages and limitations of surrogate and direct approaches in evaluating proarrhythmic risk.

    Science.gov (United States)

    Gintant, Gary A

    2008-08-01

    The successful development of novel drugs requires the ability to detect (and avoid) compounds that may provoke Torsades-de-Pointes (TdeP) arrhythmia while endorsing those compounds with minimal torsadogenic risk. As TdeP is a rare arrhythmia not readily observed during clinical or post-marketing studies, numerous preclinical models are employed to assess delayed or altered ventricular repolarization (surrogate markers linked to enhanced proarrhythmic risk). This review evaluates the advantages and limitations of selected preclinical models (ranging from the simplest cellular hERG current assay to the more complex in vitro perfused ventricular wedge and Langendorff heart preparations and in vivo chronic atrio-ventricular (AV)-node block model). Specific attention is paid to the utility of concentration-response relationships and "risk signatures" derived from these studies, with the intention of moving beyond predicting clinical QT prolongation and towards prediction of TdeP risk. While the more complex proarrhythmia models may be suited to addressing questionable or conflicting proarrhythmic signals obtained with simpler preclinical assays, further benchmarking of proarrhythmia models is required for their use in the robust evaluation of safety margins. In the future, these models may be able to reduce unwarranted attrition of evolving compounds while becoming pivotal in the balanced integrated risk assessment of advancing compounds.
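
    The concentration-response relationships mentioned above are commonly summarized by a Hill fit to the fractional hERG current block, with a simple safety margin expressed as the IC50 over the free therapeutic plasma concentration. The sketch below is a hypothetical illustration of that kind of calculation using made-up numbers; it is not a method or data set from the review.

        # Hill fit to illustrative hERG block data and a simple IC50/Cmax safety margin.
        import numpy as np
        from scipy.optimize import curve_fit

        def hill(conc, ic50, n):
            """Fractional current block at concentration conc (same units as ic50)."""
            return conc ** n / (ic50 ** n + conc ** n)

        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])        # uM, test concentrations (illustrative)
        block = np.array([0.04, 0.12, 0.33, 0.62, 0.88])   # fractional block (illustrative)

        (ic50, n), _ = curve_fit(hill, conc, block, p0=(1.0, 1.0))
        cmax_free = 0.05                                   # uM, assumed free therapeutic Cmax
        print(f"IC50 ~ {ic50:.2f} uM, Hill coefficient ~ {n:.2f}, margin ~ {ic50 / cmax_free:.0f}x")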

  18. An Approach to the Prototyping of an Optimized Limited Stroke Actuator to Drive a Low Pressure Exhaust Gas Recirculation Valve

    Science.gov (United States)

    Gutfrind, Christophe; Dufour, Laurent; Liebart, Vincent; Vannier, Jean-Claude; Vidal, Pierre

    2016-01-01

    The purpose of this article is to describe the design of a limited stroke actuator and the corresponding prototype to drive a Low Pressure (LP) Exhaust Gas Recirculation (EGR) valve for use in Internal Combustion Engines (ICEs). The direct drive actuator topology is an axial flux machine with two air gaps in order to minimize the rotor inertia and a bipolar surface-mounted permanent magnet in order to respect an 80° angular stroke. Firstly, the actuator will be described and optimized under constraints of a 150 ms time response, a 0.363 N·m minimal torque on an angular range from 0° to 80° and prototyping constraints. Secondly, the finite element method (FEM) using the FLUX-3D® software (CEDRAT, Meylan, France) will be used to check the actuator performances with consideration of the nonlinear effect of the iron material. Thirdly, a prototype will be made and characterized to compare its measurement results with the analytical model and the FEM model results. With these electromechanical behavior measurements, a numerical model is created with Simulink® in order to simulate an EGR system with this direct drive actuator under all operating conditions. Last but not least, the energy consumption of this machine will be estimated to evaluate the efficiency of the proposed EGR electromechanical system. PMID:27213398

  19. Maximizing the benefit of health workforce secondment in Botswana: an approach for strengthening health systems in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Grignon JS

    2014-05-01

    Full Text Available Jessica S Grignon,1,2 Jenny H Ledikwe,1,2 Ditsapelo Makati,2 Robert Nyangah,2 Baraedi W Sento,2 Bazghina-werq Semo1,2 1Department of Global Health, University of Washington, Seattle, WA, USA; 2International Training and Education Center for Health, Gaborone, Botswana Abstract: To address health systems challenges in limited-resource settings, global health initiatives, particularly the President's Emergency Plan for AIDS Relief, have seconded health workers to the public sector. Implementation considerations for secondment as a health workforce development strategy are not well documented. The purpose of this article is to present outcomes, best practices, and lessons learned from a President's Emergency Plan for AIDS Relief-funded secondment program in Botswana. Outcomes are documented across four World Health Organization health systems' building blocks. Best practices include documentation of joint stakeholder expectations, collaborative recruitment, and early identification of counterparts. Lessons learned include inadequate ownership, a two-tier employment system, and ill-defined position duration. These findings can inform program and policy development to maximize the benefit of health workforce secondment. Secondment requires substantial investment, and emphasis should be placed on high-level technical positions responsible for building systems, developing health workers, and strengthening government to translate policy into programs. Keywords: human resources, health policy, health worker, HIV/AIDS, PEPFAR

  20. Current Treatment Limitations in Age-Related Macular Degeneration and Future Approaches Based on Cell Therapy and Tissue Engineering

    Science.gov (United States)

    Fernández-Robredo, P.; Sancho, A.; Johnen, S.; Recalde, S.; Gama, N.; Thumann, G.; Groll, J.; García-Layana, A.

    2014-01-01

    Age-related macular degeneration (AMD) is the leading cause of blindness in the Western world. With an ageing population, it is anticipated that the number of AMD cases will increase dramatically, making a solution to this debilitating disease an urgent requirement for the socioeconomic future of the European Union and worldwide. The present paper reviews the limitations of the current therapies as well as the socioeconomic impact of AMD. There is currently no cure available for AMD, and even palliative treatments are rare. Treatment options show several side effects, are of high cost, and only treat the consequence, not the cause of the pathology. For that reason, many options involving cell therapy, mainly based on retinal and iris pigment epithelium cells as well as stem cells, are being tested. Moreover, tissue engineering strategies to design and manufacture scaffolds that mimic Bruch's membrane are very diverse and under investigation. Both alternative therapies aim to prevent and/or cure AMD and are reviewed herein. PMID:24672707

  1. Atom counting in HAADF STEM using a statistical model-based approach: methodology, possibilities, and inherent limitations.

    Science.gov (United States)

    De Backer, A; Martinez, G T; Rosenauer, A; Van Aert, S

    2013-11-01

    In the present paper, a statistical model-based method to count the number of atoms of monotype crystalline nanostructures from high resolution high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) images is discussed in detail together with a thorough study on the possibilities and inherent limitations. In order to count the number of atoms, it is assumed that the total scattered intensity scales with the number of atoms per atom column. These intensities are quantitatively determined using model-based statistical parameter estimation theory. The distribution describing the probability that intensity values are generated by atomic columns containing a specific number of atoms is inferred on the basis of the experimental scattered intensities. Finally, the number of atoms per atom column is quantified using this estimated probability distribution. The number of atom columns available in the observed STEM image, the number of components in the estimated probability distribution, the width of the components of the probability distribution, and the typical shape of a criterion to assess the number of components in the probability distribution directly affect the accuracy and precision with which the number of atoms in a particular atom column can be estimated. It is shown that single atom sensitivity is feasible taking the latter aspects into consideration. © 2013 Elsevier B.V. All rights reserved.
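
    In the spirit of the statistical approach described above (though not the authors' implementation), column intensities can be modelled as a finite mixture of Gaussian components, one per atom count, with the number of components chosen by an order-selection criterion. The sketch below uses simulated intensities and scikit-learn's GaussianMixture with BIC as a stand-in for the criterion discussed in the paper.

        # Illustrative atom-counting sketch on simulated column intensities (not the authors' code).
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        true_counts = rng.integers(3, 7, size=400)                               # atoms per column (simulated)
        intensities = (true_counts + rng.normal(0.0, 0.12, 400)).reshape(-1, 1)  # assumed ~linear in atom count

        # Fit mixtures with 1..8 components and keep the one preferred by BIC.
        fits = [GaussianMixture(k, random_state=0).fit(intensities) for k in range(1, 9)]
        best = min(fits, key=lambda m: m.bic(intensities))

        # Assign each column to a component and order components by mean intensity,
        # so class 0 corresponds to the columns with the fewest atoms.
        labels = best.predict(intensities)
        order = np.argsort(best.means_.ravel())
        rank = np.empty_like(order)
        rank[order] = np.arange(order.size)
        print("components selected:", best.n_components)
        print("columns per estimated class:", np.bincount(rank[labels]))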

  2. Another argument against fundamental scalars

    International Nuclear Information System (INIS)

    Joglekar, S.D.

    1990-01-01

    An argument, perhaps not as strong, based on including the interaction with external gravity in a theory describing the strong, electromagnetic and weak interactions, is presented. The argument relates to the basis of the common belief that favours a renormalizable action over a non-renormalizable action as a candidate for a fundamental theory. (author). 12 refs

  3. Fundamentals of Welding. Teacher Edition.

    Science.gov (United States)

    Fortney, Clarence; And Others

    These instructional materials assist teachers in improving instruction on the fundamentals of welding. The following introductory information is included: use of this publication; competency profile; instructional/task analysis; related academic and workplace skills list; tools, materials, and equipment list; and 27 references. Seven units of…

  4. Composing Europe's Fundamental Rights Area

    DEFF Research Database (Denmark)

    Storgaard, Louise Halleskov

    2015-01-01

    The article offers a perspective on how the objective of a strong and coherent European protection standard pursued by the fundamental rights amendments of the Lisbon Treaty can be achieved, as it proposes a discursive pluralistic framework to understand and guide the relationship between the EU...

  5. Fundamental Composite (Goldstone) Higgs Dynamics

    DEFF Research Database (Denmark)

    Cacciapaglia, G.; Sannino, Francesco

    2014-01-01

    We provide a unified description, both at the effective and fundamental Lagrangian level, of models of composite Higgs dynamics where the Higgs itself can emerge, depending on the way the electroweak symmetry is embedded, either as a pseudo-Goldstone boson or as a massive excitation of the conden...... searches of new physics at the Large Hadron Collider....

  6. Fundamentals of Biomass pellet production

    DEFF Research Database (Denmark)

    Holm, Jens Kai; Henriksen, Ulrik Birk; Hustad, Johan Einar

    2005-01-01

    Pelletizing experiments along with modelling of the pelletizing process have been carried out with the aim of understanding the fundamental physico-chemical mechanisms that control the quality and durability of biomass pellets. A small-scale California pellet mill (25 kg/h) located with the Biomass...

  7. Energy informatics: Fundamentals and standardization

    Directory of Open Access Journals (Sweden)

    Biyao Huang

    2017-06-01

    Full Text Available Based on international standardization and power utility practices, this paper presents a preliminary and systematic study of the field of energy informatics, analyzing the boundary expansion of information and energy systems and the convergence of energy systems and ICT. A comprehensive introduction to the fundamentals and standardization of energy informatics is provided, and several key open issues are identified.

  8. Experiments in Fundamental Neutron Physics

    OpenAIRE

    Nico, J. S.; Snow, W. M.

    2006-01-01

    Experiments using slow neutrons address a growing range of scientific issues spanning nuclear physics, particle physics, astrophysics, and cosmology. The field of fundamental physics using neutrons has experienced a significant increase in activity over the last two decades. This review summarizes some of the recent developments in the field and outlines some of the prospects for future research.

  9. Brake Fundamentals. Automotive Articulation Project.

    Science.gov (United States)

    Cunningham, Larry; And Others

    Designed for secondary and postsecondary auto mechanics programs, this curriculum guide contains learning exercises in seven areas: (1) brake fundamentals; (2) brake lines, fluid, and hoses; (3) drum brakes; (4) disc brake system and service; (5) master cylinder, power boost, and control valves; (6) parking brakes; and (7) trouble shooting. Each…

  10. FUNdamental Movement in Early Childhood.

    Science.gov (United States)

    Campbell, Linley

    2001-01-01

    Noting that the development of fundamental movement skills is basic to children's motor development, this booklet provides a guide for early childhood educators in planning movement experiences for children between 4 and 8 years. The booklet introduces a wide variety of appropriate practices to promote movement skill acquisition and increased…

  11. Fundamentals: IVC and Computer Science

    NARCIS (Netherlands)

    Gozalvez, Javier; Haerri, Jerome; Hartenstein, Hannes; Heijenk, Geert; Kargl, Frank; Petit, Jonathan; Scheuermann, Björn; Tieler, Tessa; Altintas, O.; Dressler, F.; Hartenstein, H.; Tonguz, O.K.

    The working group on “Fundamentals: IVC and Computer Science” discussed the lasting value of achieved research results as well as potential future directions in the field of inter-vehicular communication. Two major themes ‘with variations’ were the dependence on a specific technology (particularly

  12. Credit cycles and macro fundamentals

    NARCIS (Netherlands)

    Koopman, S.J.; Kraeussl, R.G.W.; Lucas, A.; Monteiro, A.

    2009-01-01

    We use an intensity-based framework to study the relation between macroeconomic fundamentals and cycles in defaults and rating activity. Using Standard and Poor's U.S. corporate rating transition and default data over the period 1980-2005, we directly estimate the default and rating cycle from micro

  13. Fundamental length and relativistic length

    International Nuclear Information System (INIS)

    Strel'tsov, V.N.

    1988-01-01

    It is noted that the introduction of a fundamental length contradicts the conventional representations concerning the contraction of the longitudinal size of fast-moving objects. The use of the concept of relativistic length and the following "elongation formula" permits one to solve this problem.

  14. Experimental tests of fundamental symmetries

    NARCIS (Netherlands)

    Jungmann, K. P.

    2014-01-01

    Ongoing experiments and projects to test our understanding of fundamental interactions and symmetries in nature have progressed significantly in the past few years. At high energies the long searched for Higgs boson has been found; tests of gravity for antimatter have come closer to reality;

  15. Leaders' limitations and approaches to creating conditions for interaction and communication in parental groups: A qualitative study.

    Science.gov (United States)

    Frykedal, Karin Forslund; Rosander, Michael; Barimani, Mia; Berlin, Anita

    2018-01-01

    The aim of this study was to describe and understand parental group (PG) leaders' experiences of creating conditions for interaction and communication. The data consisted of 10 interviews with 14 leaders. The transcribed interviews were analysed using thematic analysis. The results showed that the leaders' ambition was to create a parent-centred learning environment by establishing conditions for interaction and communication between the parents in the PGs. However, the leaders' experience was that their professional competencies were insufficient and that they lacked pedagogical tools to create constructive group discussions. Nevertheless, they found other ways to facilitate interactive processes. Based on their experience in the PG, the leaders constructed informal socio-emotional roles for themselves (e.g. caring role and personal role) and let their more formal task roles (e.g. professional role, group leader and consulting role) recede into the background, so as to remove the imbalance of power between the leaders and the parents. They believed this would make the parents feel more confident and make it easier for them to start communicating and interacting. This personal approach places them in a vulnerable position in the PG, in which it is easy for them to feel offended by parents' criticism, questioning or silence.

  16. Escola de ensino fundamental(s) em movimento – movimento na escola de ensino fundamental [Elementary school(s) in motion – movement in the elementary school]

    Directory of Open Access Journals (Sweden)

    Reiner Hildebrandt-Stramann

    2007-12-01

    Full Text Available The elementary school in Germany has been set in motion over the past 15 years because, among other reasons, movement has entered these schools. This play on words draws attention to two lines of work that shape the discussion in current school pedagogy. The present paper presents these two perspectives. One line is related to the current process of change in school pedagogy, which holds that the elementary school should be a place of learning and of lived experience for children. The other line, anchored in these processes by the play on words, concerns movement pedagogy, which is gaining ever greater importance. The elementary school should be seen from the perspective of movement and transformed into a place of movement.

  17. Coherence and diffraction limited resolution in microscopic OCT by a unified approach for the correction of dispersion and aberrations

    Science.gov (United States)

    Schulz-Hildebrandt, H.; Münter, Michael; Ahrens, M.; Spahr, H.; Hillmann, D.; König, P.; Hüttmann, G.

    2018-03-01

    Optical coherence tomography (OCT) images scattering tissues with 5 to 15 μm resolution. This is usually not sufficient to distinguish cellular and subcellular structures. Achieving cellular and subcellular resolution requires increased axial and lateral resolution and compensation of artifacts caused by dispersion and aberrations, including the defocus that limits the usable depth of field at high lateral resolution. OCT gives access to the phase of the scattered light, so dispersion and aberrations can be corrected by numerical algorithms. Here we present a unified dispersion/aberration correction that is based on a polynomial parameterization of the phase error and an optimization of image quality using Shannon's entropy. For validation, a supercontinuum light source and a custom-made spectrometer with 400 nm bandwidth were combined with a high-NA microscope objective in a setup for tissue and small-animal imaging. Using this setup and computational correction, volumetric imaging at 1.5 μm resolution is possible. Cellular and near-cellular resolution is demonstrated in porcine cornea and the drosophila larva when computational correction of dispersion and aberrations is used. Owing to the excellent correction of the microscope objective, defocus was the main contribution to the aberrations. In addition, higher-order aberrations caused by the sample itself were successfully corrected. Dispersion and aberrations are closely related artifacts in microscopic OCT imaging, and hence they can be corrected in the same way, by optimization of image quality. In this manner, microscopic resolution is readily achieved in OCT imaging of static biological tissues.
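
    As a toy illustration of the idea of optimizing image quality over a parameterized phase error (not the authors' algorithm or their polynomial parameterization), the sketch below applies a single quadratic, defocus-like phase term to the spectrum of a complex en-face OCT image and picks the coefficient that minimizes the Shannon entropy of the resulting intensity image; `field` is assumed to be a 2D complex array reconstructed from the OCT data.

        # Entropy-guided defocus correction for a complex en-face image (toy sketch).
        import numpy as np
        from scipy.optimize import minimize_scalar

        def apply_defocus(field, a):
            """Multiply the spatial spectrum by a quadratic (defocus-like) phase a*(kx^2 + ky^2)."""
            ky, kx = np.meshgrid(np.fft.fftfreq(field.shape[0]),
                                 np.fft.fftfreq(field.shape[1]), indexing="ij")
            return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * a * (kx ** 2 + ky ** 2)))

        def entropy(field):
            """Shannon entropy of the normalized intensity image; sharper images score lower."""
            p = np.abs(field) ** 2
            p = p / p.sum()
            return float(-np.sum(p * np.log(p + 1e-12)))

        def correct_defocus(field):
            """Search for the defocus coefficient that minimizes image entropy and apply it."""
            res = minimize_scalar(lambda a: entropy(apply_defocus(field, a)),
                                  bounds=(-200.0, 200.0), method="bounded")
            return apply_defocus(field, res.x), res.x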

  18. arXiv Minimal Fundamental Partial Compositeness

    CERN Document Server

    Cacciapaglia, Giacomo; Sannino, Francesco; Thomsen, Anders Eller

    Building upon the fundamental partial compositeness framework we provide consistent and complete composite extensions of the standard model. These are used to determine the effective operators emerging at the electroweak scale in terms of the standard model fields upon consistently integrating out the heavy composite dynamics. We exhibit the first effective field theories matching these complete composite theories of flavour and analyse their physical consequences for the third generation quarks. Relations with other approaches, ranging from effective analyses for partial compositeness to extra dimensions as well as purely fermionic extensions, are briefly discussed. Our methodology is applicable to any composite theory of dynamical electroweak symmetry breaking featuring a complete theory of flavour.

  19. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique. Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr
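
    A minimal illustration of the reversibility idea the book builds on (not an example taken from the book itself): the Toffoli gate is a bijection on three-bit states and is its own inverse, so no information is erased by applying it.

        # The Toffoli (controlled-controlled-NOT) gate as a reversible Boolean primitive.
        def toffoli(a, b, c):
            """Return (a, b, c XOR (a AND b))."""
            return a, b, c ^ (a & b)

        states = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
        assert sorted(toffoli(*s) for s in states) == sorted(states)   # bijective: nothing is lost
        assert all(toffoli(*toffoli(*s)) == s for s in states)         # applying it twice undoes it
        print("Toffoli is reversible on all", len(states), "three-bit states")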

  20. Modelling floods in the Ammer catchment: limitations and challenges with a coupled meteo-hydrological model approach

    Directory of Open Access Journals (Sweden)

    R. Ludwig

    2003-01-01

    Full Text Available Numerous applications of hydrological models have shown their capability to simulate hydrological processes with a reasonable degree of certainty. For flood modelling, the quality of precipitation data — the key input parameter — is very important but often remains questionable. This paper presents a critical review of experience in the EU-funded RAPHAEL project. Different meteorological data sources were evaluated to assess their applicability for flood modelling and forecasting in the Bavarian pre-alpine catchment of the Ammer river (709 km²), for which the hydrological aspects of runoff production are described as well as the complex nature of floods. Apart from conventional rain gauge data, forecasts from several Numerical Weather Prediction Models (NWPs) as well as rain radar data are examined, scaled and applied within the framework of a GIS-structured and physically based hydrological model. Multi-scenario results are compared and analysed. The synergetic approach leads to promising results under certain meteorological conditions but emphasises various drawbacks. At present, NWPs are the only source of rainfall forecasts (up to 96 hours) with large spatial coverage and high temporal resolution. On the other hand, the coarse spatial resolution of NWP grids cannot yet adequately address the heterogeneous structures of orographic rainfields in complex convective situations; hence, a major downscaling problem for mountain catchment applications is introduced. As shown for two selected Ammer flood events, a high variability in prediction accuracy still has to be accepted at present. Sensitivity analyses of both the meteo-data input and the hydrological model performance in terms of process description are discussed, and positive conclusions have been drawn for future applications of an advanced meteo-hydro model synergy. Keywords: RAPHAEL, modelling, forecasting, model coupling, PROMET-D, TOPMODEL