WorldWideScience

Sample records for approaching fundamental limits

  1. Fundamental limit of light trapping in grating structures

    KAUST Repository

    Yu, Zongfu

    2010-08-11

    We use a rigorous electromagnetic approach to analyze the fundamental limit of light-trapping enhancement in grating structures. This limit can exceed the bulk limit of 4n², but has significant angular dependency. We explicitly show that 2D gratings provide more enhancement than 1D gratings. We also show the effects of the grating profile’s symmetry on the absorption enhancement limit. Numerical simulations are applied to support the theory. Our findings provide general guidance for the design of grating structures for light-trapping solar cells.

  2. Limits on fundamental limits to computation.

    Science.gov (United States)

    Markov, Igor L

    2014-08-14

    An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.

  3. Fundamental gravitational limitations to quantum computing

    International Nuclear Information System (INIS)

    Gambini, R.; Porto, A.; Pullin, J.

    2006-01-01

    Lloyd has considered the ultimate limitations the fundamental laws of physics place on quantum computers. He concludes in particular that for an 'ultimate laptop' (a computer of one liter of volume and one kilogram of mass) the maximum number of operations per second is bounded by 10⁵¹. The limit is derived considering ordinary quantum mechanics. Here we consider additional limits placed by quantum-gravity ideas, namely the use of a relational notion of time and the fundamental gravitational limits that exist on time measurements. We then particularize to the case of an ultimate laptop and show that the maximum number of operations is further constrained to 10⁴⁷ per second. (authors)
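
    The 10⁵¹ figure follows from the Margolus-Levitov theorem, which bounds the number of elementary operations per second by 2E/(πħ). A quick sketch of the arithmetic for a 1 kg laptop (the formula and constants are standard physics, not taken from the abstract):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
m = 1.0                 # mass of the "ultimate laptop", kg

E = m * c**2                            # total energy budget, J
ops_per_sec = 2 * E / (math.pi * hbar)  # Margolus-Levitov bound on ops/s

print(f"{ops_per_sec:.2e}")  # ~5.4e50, i.e. of order 10^51
```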

  4. Towards the Fundamental Quantum Limit of Linear Measurements of Classical Signals.

    Science.gov (United States)

    Miao, Haixing; Adhikari, Rana X; Ma, Yiqiu; Pang, Belinda; Chen, Yanbei

    2017-08-04

    The quantum Cramér-Rao bound (QCRB) sets a fundamental limit for the measurement of classical signals with detectors operating in the quantum regime. Using linear-response theory and the Heisenberg uncertainty relation, we derive a general condition for achieving such a fundamental limit. When applied to classical displacement measurements with a test mass, this condition leads to an explicit connection between the QCRB and the standard quantum limit that arises from a tradeoff between the measurement imprecision and quantum backaction; the QCRB can be viewed as an outcome of a quantum nondemolition measurement with the backaction evaded. Additionally, we show that the test mass is more a resource for improving measurement sensitivity than a victim of the quantum backaction, which suggests a new approach to enhancing the sensitivity of a broad class of sensors. We illustrate these points with laser interferometric gravitational-wave detectors.

  5. 33 CFR 86.03 - Limits of fundamental frequencies.

    Science.gov (United States)

    2010-07-01

    33 Navigation and Navigable Waters 1 (2010-07-01). Limits of fundamental frequencies, Section 86.03 - Navigation and Navigable Waters, COAST GUARD, DEPARTMENT OF HOMELAND SECURITY... To ensure a wide variety of whistle characteristics, the fundamental...

  6. Fundamental Limitations for Imaging GEO Satellites

    Science.gov (United States)

    2015-10-18

    Fundamental limitations for imaging GEO satellites. D. Mozurkewich (Seabrook Engineering, Seabrook, MD 20706 USA); H. R. Schmitt, J. T. Armstrong (Naval...) ...higher spatial frequency.

  7. Fundamental limit of nanophotonic light trapping in solar cells.

    Science.gov (United States)

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2010-10-12

    Establishing the fundamental limit of nanophotonic light-trapping schemes is of paramount importance and is becoming increasingly urgent for current solar cell research. The standard theory of light trapping demonstrated that absorption enhancement in a medium cannot exceed a factor of 4n²/sin²θ, where n is the refractive index of the active layer, and θ is the angle of the emission cone in the medium surrounding the cell. This theory, however, is not applicable in the nanophotonic regime. Here we develop a statistical temporal coupled-mode theory of light trapping based on a rigorous electromagnetic approach. Our theory reveals that the conventional limit can be substantially surpassed when optical modes exhibit deep-subwavelength-scale field confinement, opening new avenues for highly efficient next-generation solar cells.
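
    For reference, the conventional (ray-optics) limit quoted above is easy to evaluate. The function below is an illustrative sketch; silicon's n ≈ 3.5 is an assumed example value, not from the abstract:

```python
import math

def conventional_limit(n, theta_deg=90.0):
    """Conventional absorption-enhancement limit 4n^2/sin^2(theta)."""
    theta = math.radians(theta_deg)
    return 4 * n**2 / math.sin(theta)**2

# Silicon active layer (n ~ 3.5), full emission hemisphere (theta = 90 deg):
print(conventional_limit(3.5))        # 49.0 (the classic 4n^2 bulk limit)
# Restricting the emission cone raises the limit:
print(conventional_limit(3.5, 30.0))  # 196.0
```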

  8. Fundamental limits of repeaterless quantum communications

    Science.gov (United States)

    Pirandola, Stefano; Laurenza, Riccardo; Ottaviani, Carlo; Banchi, Leonardo

    2017-01-01

    Quantum communications promises reliable transmission of quantum information, efficient distribution of entanglement and generation of completely secure keys. For all these tasks, we need to determine the optimal point-to-point rates that are achievable by two remote parties at the ends of a quantum channel, without restrictions on their local operations and classical communication, which can be unlimited and two-way. These two-way assisted capacities represent the ultimate rates that are reachable without quantum repeaters. Here, by constructing an upper bound based on the relative entropy of entanglement and devising a dimension-independent technique dubbed ‘teleportation stretching', we establish these capacities for many fundamental channels, namely bosonic lossy channels, quantum-limited amplifiers, dephasing and erasure channels in arbitrary dimension. In particular, we exactly determine the fundamental rate-loss tradeoff affecting any protocol of quantum key distribution. Our findings set the limits of point-to-point quantum communications and provide precise and general benchmarks for quantum repeaters. PMID:28443624
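
    The rate-loss tradeoff established here is commonly written as K(η) = −log₂(1−η) secret bits per channel use for a lossy channel of transmissivity η. A small sketch evaluating it (the 0.2 dB/km fiber-loss figure is an assumed typical value, not from the abstract):

```python
import math

def repeaterless_bound(eta):
    """Secret-key capacity of a pure-loss channel, bits per channel use."""
    return -math.log2(1 - eta)

# Assumed standard fiber attenuation of ~0.2 dB/km:
for km in (10, 50, 100):
    eta = 10 ** (-0.2 * km / 10)   # transmissivity after `km` kilometers
    print(km, repeaterless_bound(eta))
```

At high loss the bound falls off linearly in η (≈ 1.44 η bits per use), which is the rate-loss scaling that quantum repeaters are needed to beat.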

  9. Fundamental limits of radio interferometers: calibration and source parameter estimation

    OpenAIRE

    Trott, Cathryn M.; Wayth, Randall B.; Tingay, Steven J.

    2012-01-01

    We use information theory to derive fundamental limits on the capacity to calibrate next-generation radio interferometers, and measure parameters of point sources for instrument calibration, point source subtraction, and data deconvolution. We demonstrate the implications of these fundamental limits, with particular reference to estimation of the 21cm Epoch of Reionization power spectrum with next-generation low-frequency instruments (e.g., the Murchison Widefield Array -- MWA, Precision Arra...

  10. Fundamental-mode sources in approach to critical experiments

    International Nuclear Information System (INIS)

    Goda, J.; Busch, R.

    2000-01-01

    An equivalent fundamental-mode source is an imaginary source that is distributed identically in space, energy, and angle to the fundamental-mode fission source. Therefore, it produces the same neutron multiplication as the fundamental-mode fission source. Even if two source distributions produce the same number of spontaneous fission neutrons, they will not necessarily contribute equally toward the multiplication of a given system, so a method of comparing the relative importance of source distributions is needed. A factor, denoted g* and defined as the ratio of the fixed-source multiplication to the fundamental-mode multiplication, is used to convert a given source strength to its equivalent fundamental-mode source strength. This factor is of interest to criticality safety as it relates to the 1/M method of approach to critical. Ideally, a plot of 1/M versus k_eff is linear. However, since 1/M = (1 - k_eff)/g*, the plot will be linear only if g* is constant with k_eff. When g* increases with k_eff, the 1/M plot is said to be conservative because the critical mass is underestimated. However, it is possible for g* to decrease with k_eff, yielding a nonconservative 1/M plot. A better understanding of g* would help predict whether a given approach to critical will be conservative or nonconservative. The equivalent fundamental-mode source strength g*S can be measured by experiment. The experimental method was tested on the XIX-1 core of the Fast Critical Assembly at the Japan Atomic Energy Research Institute. The results showed a 30% difference between measured and calculated values. However, the XIX-1 reactor had significant intermediate-energy neutrons, whose presence may have made the cross-section set used for the predicted values less than ideal for the system.
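
    The relation 1/M = (1 - k_eff)/g* can be illustrated numerically. The g*(k_eff) trends below are hypothetical shapes chosen only to show how a rising or a falling g* bends the 1/M curve relative to the linear ideal:

```python
def inverse_M(k_eff, g_star):
    """1/M = (1 - k_eff)/g*, as in the abstract."""
    return (1.0 - k_eff) / g_star

ks = [0.5, 0.6, 0.7, 0.8, 0.9, 0.95]
linear = [1.0 - k for k in ks]  # ideal 1/M with constant g* = 1

# Hypothetical trend: g* rising with k_eff -> 1/M falls below the linear
# ideal, so a linear extrapolation underestimates critical mass (conservative).
conservative = [inverse_M(k, 1.0 + 0.5 * k) for k in ks]

# Hypothetical trend: g* falling with k_eff -> 1/M stays above the linear
# ideal, so the extrapolation overestimates critical mass (nonconservative).
nonconservative = [inverse_M(k, 1.0 - 0.3 * k) for k in ks]

for row in zip(ks, linear, conservative, nonconservative):
    print(*row)
```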

  11. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potential regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in practice with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
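
    The scaling stated above (uncertainty ∝ |v|^(3/2), ∝ 1/√P) can be sketched as follows; the prefactor k is a placeholder absorbing wavelength and measurement geometry, not a value from the paper:

```python
def velocity_uncertainty(v, p_scattered, k=1.0):
    """Scaling form of the fundamental velocity-uncertainty limit:
    sigma_v = k * |v|^(3/2) / sqrt(P_scattered)."""
    return k * abs(v) ** 1.5 / p_scattered ** 0.5

# Doubling the velocity raises the limit by 2^(3/2) ~ 2.83x:
print(velocity_uncertainty(2.0, 1.0) / velocity_uncertainty(1.0, 1.0))  # ~2.83
# Quadrupling the scattered light power halves it:
print(velocity_uncertainty(1.0, 4.0) / velocity_uncertainty(1.0, 1.0))  # 0.5
```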

  12. Fundamental size limitations of micro four-point probes

    DEFF Research Database (Denmark)

    Ansbæk, Thor; Petersen, Dirch Hjorth; Hansen, Ole

    2009-01-01

    The continued down-scaling of integrated circuits and magnetic tunnel junctions (MTJ) for hard disc read heads presents a challenge to current metrology technology. The four-point probes (4PP) currently used for sheet resistance characterization in these applications must therefore be down-scaled as well in order to correctly characterize the extremely thin films used. This presents a four-point probe design and fabrication challenge. We analyze the fundamental limitation on down-scaling of a generic micro four-point probe (M4PP) in a comprehensive study, where mechanical, thermal, and electrical...

  13. Fundamental limitations of cavity-assisted atom interferometry

    Science.gov (United States)

    Dovale-Álvarez, M.; Brown, D. D.; Jones, A. W.; Mow-Lowry, C. M.; Miao, H.; Freise, A.

    2017-11-01

    Atom interferometers employing optical cavities to enhance the beam splitter pulses promise significant advances in science and technology, notably for future gravitational wave detectors. Long cavities, on the scale of hundreds of meters, have been proposed in experiments aiming to observe gravitational waves with frequencies below 1 Hz, where laser interferometers, such as LIGO, have poor sensitivity. Alternatively, short cavities have also been proposed for enhancing the sensitivity of more portable atom interferometers. We explore the fundamental limitations of two-mirror cavities for atomic beam splitting, and establish upper bounds on the temperature of the atomic ensemble as a function of cavity length and three design parameters: the cavity g factor, the bandwidth, and the optical suppression factor of the first and second order spatial modes. A lower bound to the cavity bandwidth is found which avoids elongation of the interaction time and maximizes power enhancement. An upper limit to cavity length is found for symmetric two-mirror cavities, restricting the practicality of long baseline detectors. For shorter cavities, an upper limit on the beam size was derived from the geometrical stability of the cavity. These findings aim to aid the design of current and future cavity-assisted atom interferometers.

  14. Arguing against fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    This paper aims to open up discussion on the relationship between fundamentality and naturalism, and in particular on the question of whether fundamentality may be denied on naturalistic grounds. A historico-inductive argument for an anti-fundamentalist conclusion, prominent within the contemporary metaphysical literature, is examined; finding it wanting, an alternative 'internal' strategy is proposed. By means of an example from the history of modern physics - namely S-matrix theory - it is demonstrated that (1) this strategy can generate similar (though not identical) anti-fundamentalist conclusions on more defensible naturalistic grounds, and (2) that fundamentality questions can be empirical questions. Some implications and limitations of the proposed approach are discussed.

  15. Investigation of fundamental limits to beam brightness available from photoinjectors

    International Nuclear Information System (INIS)

    Bazarov, Ivan

    2015-01-01

    The goal of this project was investigation of fundamental limits to beam brightness available from photoinjectors. This basic research in accelerator physics spanned over 5 years aiming to extend the fundamental understanding of high average current, low emittance sources of relativistic electrons based on photoemission guns, a necessary prerequisite for a new generation of coherent X-ray synchrotron radiation facilities based on continuous duty superconducting linacs. The program focused on two areas critical to making advances in the electron source performance: 1) the physics of photocathodes for the production of low emittance electrons and 2) control of space charge forces in the immediate vicinity to the cathode via 3D laser pulse shaping.

  16. Investigation of fundamental limits to beam brightness available from photoinjectors

    Energy Technology Data Exchange (ETDEWEB)

    Bazarov, Ivan [Cornell Univ., Ithaca, NY (United States)

    2015-07-09

    The goal of this project was investigation of fundamental limits to beam brightness available from photoinjectors. This basic research in accelerator physics spanned over 5 years aiming to extend the fundamental understanding of high average current, low emittance sources of relativistic electrons based on photoemission guns, a necessary prerequisite for a new generation of coherent X-ray synchrotron radiation facilities based on continuous duty superconducting linacs. The program focused on two areas critical to making advances in the electron source performance: 1) the physics of photocathodes for the production of low emittance electrons and 2) control of space charge forces in the immediate vicinity to the cathode via 3D laser pulse shaping.

  17. Limiting value definition in radiation protection physics, legislation and toxicology. Fundamentals, contrasts, perspectives

    International Nuclear Information System (INIS)

    Smeddinck, Ulrich; Koenig, Claudia

    2016-01-01

    The volume is the documentation of an ENTRIA workshop discussion on limiting-value definition in radiation protection, including the following contributions: introduction to radiation protection - fundamental concepts of limiting values, heterogeneity; evaluation standards for dose in radiation protection in the context of the final repository search; definition of limiting values in toxicology; public participation in limiting-value definition - a perspective for radiation protection regulation; current developments in radiation protection.

  18. Fundamental Limit of Nanophotonic Light-trapping in Solar Cells

    OpenAIRE

    Yu, Zongfu; Raman, Aaswath; Fan, Shanhui

    2010-01-01

    Establishing the fundamental limit of nanophotonic light-trapping schemes is of paramount importance and is becoming increasingly urgent for current solar cell research. The standard theory of light trapping demonstrated that absorption enhancement in a medium cannot exceed a factor of 4n²/sin²(θ), where n is the refractive index of the active layer, and θ is the angle of the emission cone in the medium surrounding the cell. This theory, however, is not applicable in the nanophot...

  19. Fundamental limits to position determination by concentration gradients.

    Directory of Open Access Journals (Sweden)

    Filipe Tostevin

    2007-04-01

    Position determination in biological systems is often achieved through protein concentration gradients. Measuring the local concentration of such a protein with a spatially varying distribution allows the measurement of position within the system. For these systems to work effectively, position determination must be robust to noise. Here, we calculate fundamental limits to the precision of position determination by concentration gradients due to unavoidable biochemical noise perturbing the gradients. We focus on gradient proteins with first-order reaction kinetics. Systems of this type have been experimentally characterised in both developmental and cell biology settings. For a single gradient we show that, through time-averaging, great precision potentially can be achieved even with very low protein copy numbers. As a second example, we investigate the ability of a system with oppositely directed gradients to find its centre. With this mechanism, positional precision close to the centre improves more slowly with increasing averaging time, and so longer averaging times or higher copy numbers are required for high precision. For both single and double gradients, we demonstrate the existence of optimal length scales for the gradients for which precision is maximized, as well as analyze how precision depends on the size of the concentration-measuring apparatus. These results provide fundamental constraints on the positional precision supplied by concentration gradients in various contexts, including both in developmental biology and also within a single cell.
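
    The time-averaging argument can be illustrated with a toy Monte Carlo: an exponential gradient read out through Poisson-distributed counts, with position estimated by inverting the mean of n averaged samples. All parameter values (decay length, copy number, trial counts) are arbitrary illustrative choices, not taken from the paper:

```python
import math
import random

random.seed(0)

def poisson(mu):
    """Knuth's algorithm for Poisson-distributed counts."""
    limit = math.exp(-mu)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def estimate_position(x_true, lam=1.0, c0=50.0, n_avg=1):
    """Read c(x) = c0*exp(-x/lam) n_avg times with Poisson noise,
    then invert the averaged count to recover position."""
    mean_count = c0 * math.exp(-x_true / lam)
    avg = sum(poisson(mean_count) for _ in range(n_avg)) / n_avg
    return -lam * math.log(max(avg, 0.5) / c0)

def rmse(n_avg, trials=300):
    errs = [(estimate_position(1.0, n_avg=n_avg) - 1.0) ** 2 for _ in range(trials)]
    return math.sqrt(sum(errs) / trials)

print(rmse(1), rmse(25))  # longer averaging tightens the estimate (~1/sqrt(n))
```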

  20. Queueing networks a fundamental approach

    CERN Document Server

    Dijk, Nico

    2011-01-01

    This handbook aims to highlight fundamental, methodological and computational aspects of networks of queues to provide insights and to unify results that can be applied in a more general manner. The handbook is organized into five parts: Part 1 considers exact analytical results such as those of product-form type. Topics include characterization of product forms by physical balance concepts and simple traffic flow equations, classes of service and queue disciplines that allow a product form, a unified description of product forms for discrete-time queueing networks, insights for insensitivity, and aggregation and decomposition results that allow subnetworks to be aggregated into single nodes to reduce computational burden. Part 2 looks at monotonicity and comparison results, such as for computational simplification, by either of two approaches: stochastic monotonicity and ordering results based on the ordering of the process generators, and comparison results and explicit error bounds based on an underlying Markov r...

  1. Fundamental limits to the velocity of solid armatures in railguns

    International Nuclear Information System (INIS)

    Long, G.C. Jr.

    1987-01-01

    The fundamental limits to the velocity of solid armatures in railguns are dependent upon the increase in temperature which melts the conducting medium or lowers the yield strength of the material. A two-dimensional transient finite-element electrothermal model is developed to determine the magnetic and temperature fields in the rails and armature of a railgun. The solution for the magnetic and temperature fields is based upon the fundamentals of Maxwell's equations and Fourier's law of heat conduction with no a priori assumptions about the current-density distribution in the rails or the armature. The magnetic-field and temperature-field spatial variations are calculated using finite-element techniques, while the time variations are calculated using finite-differencing methods. A thermal-diffusion iteration is performed between each magnetic diffusion iteration. Joule heating information is provided by solving the magnetic diffusion problem and temperature data for calculating material properties such as the electrical resistivity, thermal conductivity, and specific heat is provided by solving the thermal diffusion problem. Various types of rail and armature designs are simulated to include solid armatures consisting of different homogeneous materials, resistive rails, and a graded-resistance armature

  2. Some Fundamental Limits on SAW RFID Tag Information Capacity and Collision Resolution

    Science.gov (United States)

    Barton, Richard J.

    2013-01-01

    In this paper, we apply results from multi-user information theory to study the limits of information capacity and collision resolution for SAW RFID tags. In particular, we derive bounds on the achievable data rate per tag as a function of fundamental parameters such as tag time-bandwidth product, tag signal-to-noise ratio (SNR), and number of tags in the environment. We also discuss the implications of these bounds for tag waveform design and tag interrogation efficiency.

  3. Fundamentals of a graded approach to safety-related equipment setpoints

    International Nuclear Information System (INIS)

    Woodruff, B.A.; Cash, J.S. Jr.; Bockhorst, R.M.

    1993-01-01

    The concept of using a graded approach to reconstitute instrument setpoints associated with safety-related equipment was first presented to the industry by the U.S. Nuclear Regulatory Commission during the 1992 ISA/POWID Symposium in Kansas City, Missouri. The graded approach establishes that the manner in which a utility analyzes and documents setpoints is related to each setpoint's relative importance to safety. This allows a utility to develop separate requirements for setpoints of varying levels of safety significance. A graded approach to setpoints is a viable strategy that minimizes extraneous effort expended in resolving difficult issues that arise when formal setpoint methodology is applied blindly to all setpoints. Close examination of setpoint methodology reveals that the application of a graded approach is fundamentally dependent on the analytical basis of each individual setpoint

  4. Heat-Assisted Magnetic Recording: Fundamental Limits to Inverse Electromagnetic Design

    Science.gov (United States)

    Bhargava, Samarth

    In this dissertation, we address the burgeoning fields of diffractive optics, metal optics and plasmonics, and computational inverse problems in the engineering design of electromagnetic structures. We focus on the optical nano-focusing system that will enable Heat-Assisted Magnetic Recording (HAMR), a higher-density magnetic recording technology that will help meet the exploding worldwide demand for digital data storage. The heart of HAMR is a system that focuses light to a sub-diffraction-limit spot with an extremely high power density via an optical antenna. We approach this engineering problem by first discussing the fundamental limits of nano-focusing and the material limits of metal optics and plasmonics. Then, we use efficient gradient-based optimization algorithms to computationally design shapes of 3D nanostructures that outperform human designs on the basis of mass-market product requirements. In 2014, the world manufactured ~1 zettabyte (ZB), i.e., 1 billion terabytes (TB), of data storage devices, including ~560 million magnetic hard disk drives (HDDs). Global demand for storage will likely increase by 10x in the next 5-10 years, and manufacturing capacity cannot keep up with demand alone. We discuss the state-of-the-art HDD and why industry invented Heat-Assisted Magnetic Recording (HAMR) to overcome the data density limitations. HAMR leverages the temperature sensitivity of magnets, in which the coercivity suddenly and non-linearly falls at the Curie temperature. Data recording to high-density hard disks can be achieved by locally heating one bit of information while co-applying a magnetic field. The heating can be achieved by focusing 100 µW of light to a 30 nm diameter spot on the hard disk. This is an enormous light intensity, roughly ~100,000,000x the intensity of sunlight on the earth's surface! This power density is ~1,000x the output of the gold-coated tapered optical fibers used in near-field scanning optical microscopes.
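
    The quoted intensity comparison checks out with back-of-the-envelope arithmetic (1000 W/m² is an assumed round figure for surface solar irradiance):

```python
import math

power = 100e-6         # 100 microwatts focused on the medium, W
spot_diameter = 30e-9  # 30 nm focal spot, m

area = math.pi * (spot_diameter / 2) ** 2  # spot area, m^2
intensity = power / area                   # power density, W/m^2
sunlight = 1000.0                          # assumed surface solar irradiance, W/m^2

print(intensity / sunlight)  # ~1.4e8, i.e. the quoted ~100,000,000x
```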

  5. Quantum Limits of Space-to-Ground Optical Communications

    Science.gov (United States)

    Hemmati, H.; Dolinar, S.

    2012-01-01

    For a pure loss channel, the ultimate capacity can be achieved with classical coherent states (i.e., ideal laser light): (1) the capacity-achieving receiver (measurement) is yet to be determined; (2) heterodyne detection approaches the ultimate capacity at high mean photon numbers; (3) photon counting approaches the ultimate capacity at low mean photon numbers. A number of current technology limits drive the achievable performance of free-space communication links. Approaching fundamental limits in the bandwidth-limited regime: (1) heterodyne detection with high-order coherent-state modulation approaches the ultimate limits; state-of-the-art improvements to laser phase noise and adaptive optics systems for atmospheric transmission would help. (2) High-order intensity modulation and photon counting can approach heterodyne detection within approximately a factor of 2; this may have advantages over coherent detection in the presence of turbulence. Approaching fundamental limits in the photon-limited regime: (1) low-duty-cycle binary coherent-state modulation (OOK, PPM) approaches the ultimate limits; state-of-the-art improvements to laser extinction ratio, receiver dark noise, jitter, and blocking would help. (2) In some link geometries (near-field links) number-state transmission could improve over coherent-state transmission.
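
    The crossover described above can be made concrete with the standard capacity formulas for a pure-loss channel: the ultimate (Holevo) limit g(n̄) versus the heterodyne capacity log₂(1+n̄), in bits per mode, where n̄ is the mean received photon number. This is a sketch using textbook formulas, not code from the source:

```python
import math

def g(x):
    """Entropy of a thermal state with mean photon number x, in bits."""
    if x <= 0:
        return 0.0
    return (1 + x) * math.log2(1 + x) - x * math.log2(x)

def ultimate_capacity(n_mean):
    """Holevo limit of a pure-loss channel, bits per mode."""
    return g(n_mean)

def heterodyne_capacity(n_mean):
    """Coherent-state heterodyne detection, bits per mode."""
    return math.log2(1 + n_mean)

# Heterodyne closes the gap at high photon number, but is far from the
# ultimate limit in the photon-starved regime:
for n in (0.01, 1.0, 100.0):
    print(n, ultimate_capacity(n), heterodyne_capacity(n))
```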

  6. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    International Nuclear Information System (INIS)

    Egan, James; McMillan, Normal; Denieffe, David

    2011-01-01

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology to the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement: the critical level, the detection limit, and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental quantitations over existing methods are discussed.

  7. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    Energy Technology Data Exchange (ETDEWEB)

    Egan, James; McMillan, Normal; Denieffe, David, E-mail: eganj@itcarlow.ie [IT Carlow (Ireland)

    2011-08-17

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology to the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement: the critical level, the detection limit, and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental quantitations over existing methods are discussed.

  8. Fundamental aspects of plasma chemical physics Thermodynamics

    CERN Document Server

    Capitelli, Mario; D'Angola, Antonio

    2012-01-01

    Fundamental Aspects of Plasma Chemical Physics - Thermodynamics develops basic and advanced concepts of plasma thermodynamics from both classical and statistical points of view. After a refresher on classical thermodynamics applied to the dissociation and ionization regimes, the book invites the reader to discover the role of electronic excitation in affecting the properties of plasmas, a topic often overlooked by the thermal plasma community. Particular attention is devoted to the problem of the divergence of the partition function of atomic species and the state-to-state approach for calculating the partition function of diatomic and polyatomic molecules. The limits of the ideal-gas approximation are also discussed, by introducing Debye-Hückel and virial corrections. Throughout the book, worked examples are given in order to clarify concepts and mathematical approaches. This book is the first of a series of three books to be published by the authors on fundamental aspects of plasma chemical physics. The next bo...

  9. Coherence-limited solar power conversion: the fundamental thermodynamic bounds and the consequences for solar rectennas

    Science.gov (United States)

    Mashaal, Heylal; Gordon, Jeffrey M.

    2014-10-01

    Solar rectifying antennas constitute a distinct solar power conversion paradigm in which sunlight's spatial coherence is a basic constraining factor. In this presentation, we derive the fundamental thermodynamic limit for coherence-limited blackbody (principally solar) power conversion. Our results represent a natural extension of the eponymous Landsberg limit, originally derived for converters that are not constrained by the radiation's coherence and are irradiated at maximum concentration (i.e., with a view factor of unity to the solar disk). We proceed by first expanding Landsberg's results to arbitrary solar view factor (i.e., arbitrary concentration and/or angular confinement), and then demonstrate how the results are modified when the converter can only process coherent radiation. The results are independent of the specific power conversion mechanism, and hence are valid for diffraction-limited as well as quantum converters (and not just classical heat engines or in the geometric optics regime). The derived upper bounds bode favorably for rectifying antennas as potentially high-efficiency solar converters.
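
    For context, the Landsberg limit referenced above evaluates, at full concentration, to about 93% for the assumed standard temperatures T_sun ≈ 5800 K and T_ambient ≈ 300 K, slightly below the Carnot efficiency at the same temperatures:

```python
def landsberg_efficiency(t_ambient=300.0, t_sun=5800.0):
    """Landsberg limit for fully concentrated blackbody radiation conversion:
    1 - (4/3)(Ta/Ts) + (1/3)(Ta/Ts)^4."""
    r = t_ambient / t_sun
    return 1 - (4.0 / 3.0) * r + (1.0 / 3.0) * r**4

def carnot_efficiency(t_ambient=300.0, t_sun=5800.0):
    """Carnot efficiency between the same two temperatures."""
    return 1 - t_ambient / t_sun

print(landsberg_efficiency())  # ~0.931
print(carnot_efficiency())     # ~0.948
```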

  10. Limits of the endoscopic transnasal transtubercular approach.

    Science.gov (United States)

    Gellner, Verena; Tomazic, Peter V

    2018-06-01

    The endoscopic transnasal trans-sphenoidal transtubercular approach has become a standard alternative to neurosurgical transcranial routes for lesions of the anterior skull base, in particular pathologies of the anterior tubercle, sphenoid plane, and midline lesions up to the interpeduncular cistern. For both the endoscopic and the transcranial approach, indications must be strictly evaluated and tailored to the patient's morphology and condition. The purpose of this review was to evaluate the evidence in the literature on the limitations of the endoscopic transtubercular approach. A PubMed/Medline search was conducted in January 2018 entering the following keywords. Upon initial screening, 7 papers were included in this review. There are several other papers describing the endoscopic transtubercular approach (ETTA). We tried to list the limiting factors according to the existing literature as cited. The main limiting factors are laterally extending lesions in relation to the optic canal, vascular encasement, and/or unfavorable tumor tissue consistency. The ETTA is considered a high-level transnasal endoscopic extended skull base approach and requires excellent training, skills and experience.

  11. Fundamental length

    International Nuclear Information System (INIS)

    Pradhan, T.

    1975-01-01

    The concept of a fundamental length was first put forward by Heisenberg for purely dimensional reasons. From a study of the observed masses of the elementary particles known at that time, it is surmised that this length should be of the order of magnitude of 10⁻¹³ cm. It was Heisenberg's belief that the introduction of such a fundamental length would eliminate the divergence difficulties from relativistic quantum field theory by cutting off the high energy regions of the 'proper fields'. Since the divergence difficulties arise primarily from an infinite number of degrees of freedom, one simple remedy would be the introduction of a principle that limits these degrees of freedom by removing the effectiveness of the waves with a frequency exceeding a certain limit, without destroying the relativistic invariance of the theory. The principle can be stated as follows: it is in principle impossible to invent an experiment of any kind that will permit a distinction between the positions of two particles at rest, the distance between which is below a certain limit. A more elegant way of introducing a fundamental length into quantum theory is through commutation relations between two position operators. In a quantum field theory such as quantum electrodynamics, it can be introduced through the commutation relation between two interpolating photon fields (vector potentials). (K.B.)
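    One schematic illustration of the commutation-relation route mentioned above, following Snyder's quantized space-time (a concrete realization, not necessarily the one intended in this abstract):

```latex
% Snyder-type commutator: spatial coordinates fail to commute at the
% scale l, with L_{ij} the angular-momentum operators, so positions
% cannot be resolved below distances of order l.
[\hat{x}_i, \hat{x}_j] = \frac{i\,l^2}{\hbar}\,\hat{L}_{ij}
```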

  12. Fundamental phenomena affecting low temperature combustion and HCCI engines, high load limits and strategies for extending these limits

    KAUST Repository

    Saxena, Samveg; Bedoya, Iván D.

    2013-01-01

    Low temperature combustion (LTC) engines are an emerging engine technology that offers an alternative to spark-ignited and diesel engines. One type of LTC engine, the homogeneous charge compression ignition (HCCI) engine, uses a well-mixed fuel–air charge like spark-ignited engines and relies on compression ignition like diesel engines. Similar to diesel engines, the use of high compression ratios and removal of the throttling valve in HCCI allow for high efficiency operation, thereby allowing lower CO2 emissions per unit of work delivered by the engine. The use of a highly diluted well-mixed fuel–air charge allows for low emissions of nitrogen oxides, soot, and particulate matter, and the use of oxidation catalysts can allow low emissions of unburned hydrocarbons and carbon monoxide. As a result, HCCI offers the ability to achieve high efficiencies comparable with diesel while also allowing clean emissions while using relatively inexpensive aftertreatment technologies. HCCI is not, however, without its challenges. Traditionally, two important problems prohibiting market penetration of HCCI are 1) the inability to achieve high loads, and 2) difficulty in controlling combustion timing. Recent research has significantly mitigated these challenges, and thus HCCI has a promising future for automotive and power generation applications. This article begins by providing a comprehensive review of the physical phenomena governing HCCI operation, with particular emphasis on high load conditions. Emissions characteristics are then discussed, with suggestions on how to inexpensively enable low emissions of all regulated emissions. The operating limits that govern the high load conditions are discussed in detail, and finally a review of recent research which expands the high load limits of HCCI is discussed. Although this article focuses on the fundamental phenomena governing HCCI operation, it is also useful for understanding the fundamental phenomena in reactivity controlled

  14. A Fundamental Approach to Developing Aluminium based Bulk Amorphous Alloys based on Stable Liquid Metal Structures and Electronic Equilibrium - 154041

    Science.gov (United States)

    2017-03-28

    AFRL-AFOSR-JP-TR-2017-0027: A Fundamental Approach to Developing Aluminium-based Bulk Amorphous Alloys based on Stable Liquid-Metal Structures and Electronic Equilibrium; reporting period ... to 16 Dec 2016. ... Air Force Research Laboratory for accurately predicting compositions of new amorphous alloys specifically based on aluminium with properties superior ...

  15. Whole-genome sequencing approaches for conservation biology: Advantages, limitations and practical recommendations.

    Science.gov (United States)

    Fuentes-Pardo, Angela P; Ruzzante, Daniel E

    2017-10-01

    Whole-genome resequencing (WGR) is a powerful method for addressing fundamental evolutionary biology questions that have not been fully resolved using traditional methods. WGR includes four approaches: the sequencing of individuals to a high depth of coverage with either unresolved or resolved haplotypes, the sequencing of population genomes to a high depth by mixing equimolar amounts of unlabelled individual DNA (Pool-seq), and the sequencing of multiple individuals from a population to a low depth (lcWGR). These techniques require the availability of a reference genome. This, along with the still high cost of shotgun sequencing and the large demand for computing resources and storage, has limited their implementation in nonmodel species with scarce genomic resources and in fields such as conservation biology. Our goal here is to describe the various WGR methods, their pros and cons and their potential applications in conservation biology. WGR offers an unprecedented marker density and surveys a wide diversity of genetic variation not limited to single nucleotide polymorphisms (e.g., structural variants and mutations in regulatory elements), increasing its power for the detection of signatures of selection and local adaptation as well as for the identification of the genetic basis of phenotypic traits and diseases. Currently, though, no single WGR approach fulfils all requirements of conservation genetics, and each method has its own limitations and sources of potential bias. We discuss proposed ways to minimize such biases. We envision a not-too-distant future where the analysis of whole genomes becomes a routine task in many nonmodel species and fields including conservation biology. © 2017 John Wiley & Sons Ltd.
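    The high-depth versus low-depth (lcWGR) trade-off described in this abstract can be sketched with the classic Lander-Waterman coverage model; the genome size, read length, and read budget below are hypothetical illustration values, not from the paper:

```python
# Sketch: sequencing-depth budgeting for WGR designs (hypothetical numbers).
# Under the Lander-Waterman model, the mean depth is c = L*N/G and the
# expected fraction of the genome covered at least once is 1 - exp(-c).
import math

def mean_depth(read_len, n_reads, genome_size):
    return read_len * n_reads / genome_size

def frac_covered(depth):
    return 1.0 - math.exp(-depth)

G = 1e9           # genome size (bp), hypothetical
L = 150           # read length (bp)
budget = 2e8      # total reads available for the study, hypothetical

# Spend the whole budget on one individual, or split it across 20 (lcWGR).
hi = mean_depth(L, budget, G)
lo = mean_depth(L, budget / 20, G)
print(f"high-depth: {hi:.0f}x, fraction covered {frac_covered(hi):.4f}")
print(f"lcWGR per individual: {lo:.1f}x, fraction covered {frac_covered(lo):.4f}")
```

The same budget that gives one individual 30x coverage leaves each of 20 individuals at 1.5x, where roughly a quarter of each genome goes unobserved; lcWGR methods compensate statistically across individuals.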

  16. Fundamental Limits to Coherent Scattering and Photon Coalescence from Solid-State Quantum Emitters [arXiv

    DEFF Research Database (Denmark)

    Iles-Smith, Jake; McCutcheon, Dara; Mørk, Jesper

    2016-01-01

    ...a substantial suppression of detrimental interactions between the source and its phonon environment. Nevertheless, we demonstrate here that this reasoning is incomplete, and phonon interactions continue to play a crucial role in determining solid-state emission characteristics even for very weak excitation. We find that the sideband resulting from non-Markovian relaxation of the phonon environment leads to a fundamental limit to the fraction of coherently scattered light and to the visibility of two-photon coalescence at weak driving, both of which are absent for atomic systems or within simpler Markovian...

  17. Fundamentalism and science

    Directory of Open Access Journals (Sweden)

    Massimo Pigliucci

    2006-06-01

    Full Text Available The many facets of fundamentalism. There has been much talk about fundamentalism of late. While most people's thoughts on the topic go to the 9/11 attacks against the United States, or to the ongoing war in Iraq, fundamentalism is affecting science and its relationship to society in a way that may have dire long-term consequences. Of course, religious fundamentalism has always had a history of antagonism with science, and – before the birth of modern science – with philosophy, the age-old vehicle of the human attempt to exercise critical thinking and rationality to solve problems and pursue knowledge. “Fundamentalism” is defined by the Oxford Dictionary of the Social Sciences as “A movement that asserts the primacy of religious values in social and political life and calls for a return to a 'fundamental' or pure form of religion.” In its broadest sense, however, fundamentalism is a form of ideological intransigence which is not limited to religion, but includes political positions as well (for example, in the case of some extreme forms of “environmentalism”).

  18. Fundamental limitations on V/STOL terminal guidance due to aircraft characteristics

    Science.gov (United States)

    Wolkovitch, J.; Lamont, C. W.; Lochtie, D. W.

    1971-01-01

    A review is given of limitations on approach flight paths of V/STOL aircraft, including limits on descent angle due to maximum drag/lift ratio. A method of calculating maximum drag/lift ratio of tilt-wing and deflected slipstream aircraft is presented. Derivatives and transfer functions for the CL-84 tilt-wing and X-22A tilt-duct aircraft are presented. For the unaugmented CL-84 in steep descents the transfer function relating descent angle to thrust contains a right-half plane zero. Using optimal control theory, it is shown that this zero causes a serious degradation in the accuracy with which steep flight paths can be followed in the presence of gusts.

  19. Fundamentals, financial factors and firm investment in India: A Panel VAR approach

    OpenAIRE

    Das, Pranab Kumar

    2008-01-01

    This study analyses the role of fundamentals and financial factors in determining firm investment in India with imperfect capital market in a panel VAR framework. Previous research in this area is based on the test of significance (or some variant of this) of the cash flow variable in the investment equation. In this strand of research, cash flow is considered to be a financial factor. The major theoretical problem of this approach is that in a forward-looking model cash flow might be cor...

  20. 33 CFR 401.52 - Limit of approach to a bridge.

    Science.gov (United States)

    2010-07-01

    (a) No vessel shall pass the limit of approach sign at any movable bridge until the bridge is in a fully open position and the signal light shows green. (b) No vessel shall pass the limit...

  1. Probing the fundamental limit of niobium in high radiofrequency fields by dual mode excitation in superconducting radiofrequency cavities

    International Nuclear Information System (INIS)

    Eremeev, Grigory; Geng, Rongli; Palczewski, Ari

    2011-01-01

    We have studied thermal breakdown in several multicell superconducting radiofrequency cavities by the simultaneous excitation of two TM010 passband modes. Unlike measurements done in the past, which indicated a clearly thermal nature of the breakdown, our measurements present a more complex picture with an interplay of both thermal and magnetic effects. The JLab LG-1 cavity that we studied was limited at 40.5 MV/m, corresponding to B_peak = 173 mT, in the 8π/9 mode. Dual mode measurements on this quench indicate that it is not purely magnetic, and so we conclude that this field is not the fundamental limit in SRF cavities

  2. Fundamental Work Cost of Quantum Processes

    Science.gov (United States)

    Faist, Philippe; Renner, Renato

    2018-04-01

    Information-theoretic approaches provide a promising avenue for extending the laws of thermodynamics to the nanoscale. Here, we provide a general fundamental lower limit, valid for systems with an arbitrary Hamiltonian and in contact with any thermodynamic bath, on the work cost for the implementation of any logical process. This limit is given by a new information measure—the coherent relative entropy—which accounts for the Gibbs weight of each microstate. The coherent relative entropy enjoys a collection of natural properties justifying its interpretation as a measure of information and can be understood as a generalization of a quantum relative entropy difference. As an application, we show that the standard first and second laws of thermodynamics emerge from our microscopic picture in the macroscopic limit. Finally, our results have an impact on understanding the role of the observer in thermodynamics: Our approach may be applied at any level of knowledge—for instance, at the microscopic, mesoscopic, or macroscopic scales—thus providing a formulation of thermodynamics that is inherently relative to the observer. We obtain a precise criterion for when the laws of thermodynamics can be applied, thus making a step forward in determining the exact extent of the universality of thermodynamics and enabling a systematic treatment of Maxwell-demon-like situations.

  3. Fundamental Work Cost of Quantum Processes

    Directory of Open Access Journals (Sweden)

    Philippe Faist

    2018-04-01

    Full Text Available Information-theoretic approaches provide a promising avenue for extending the laws of thermodynamics to the nanoscale. Here, we provide a general fundamental lower limit, valid for systems with an arbitrary Hamiltonian and in contact with any thermodynamic bath, on the work cost for the implementation of any logical process. This limit is given by a new information measure—the coherent relative entropy—which accounts for the Gibbs weight of each microstate. The coherent relative entropy enjoys a collection of natural properties justifying its interpretation as a measure of information and can be understood as a generalization of a quantum relative entropy difference. As an application, we show that the standard first and second laws of thermodynamics emerge from our microscopic picture in the macroscopic limit. Finally, our results have an impact on understanding the role of the observer in thermodynamics: Our approach may be applied at any level of knowledge—for instance, at the microscopic, mesoscopic, or macroscopic scales—thus providing a formulation of thermodynamics that is inherently relative to the observer. We obtain a precise criterion for when the laws of thermodynamics can be applied, thus making a step forward in determining the exact extent of the universality of thermodynamics and enabling a systematic treatment of Maxwell-demon-like situations.

  4. Fundamentals of Geophysics

    Science.gov (United States)

    Frohlich, Cliff

    Choosing an intermediate-level geophysics text is always problematic: What should we teach students after they have had introductory courses in geology, math, and physics, but little else? Fundamentals of Geophysics is aimed specifically at these intermediate-level students, and the author's stated approach is to construct a text “using abundant diagrams, a simplified mathematical treatment, and equations in which the student can follow each derivation step-by-step.” Moreover, for Lowrie, the Earth is round, not flat—the “fundamentals of geophysics” here are the essential properties of our Earth the planet, rather than useful techniques for finding oil and minerals. Thus this book is comparable in both level and approach to C. M. R. Fowler's The Solid Earth (Cambridge University Press, 1990).

  5. Fundamental image quality limits for microcomputed tomography in small animals

    International Nuclear Information System (INIS)

    Ford, N.L.; Thornton, M.M.; Holdsworth, D.W.

    2003-01-01

    resolution to improve, by decreasing the detector element size to tens of microns or less, high quality images will be limited by the x-ray dose administered. For the highest quality images, these doses will approach the lethal dose or LD50 for the animals. Approaching the lethal dose will affect the way experiments are planned, and may reduce opportunities for experiments involving imaging the same animal over time. Dose considerations will become much more important for live small-animal imaging as the limits of resolution are tested

  6. Virtual and composite fundamentals in the ERM

    NARCIS (Netherlands)

    Knot, KHW; Sturm, JE

    1999-01-01

    A latent-variable approach is applied to identify the appropriate driving process for fundamental exchange rates in the ERM. From the time-series characteristics of so-called "virtual fundamentals" and "composite fundamentals", a significant degree of mean reversion can be asserted. The relative

  7. Promoting physical activity among children and adolescents: the strengths and limitations of school-based approaches.

    Science.gov (United States)

    Booth, Michael; Okely, Anthony

    2005-04-01

    Paediatric overweight and obesity is recognised as one of Australia's most significant health problems and effective approaches to increasing physical activity and reducing energy consumption are being sought urgently. Every potential approach and setting should be subjected to critical review in an attempt to maximise the impact of policy and program initiatives. This paper identifies the strengths and limitations of schools as a setting for promoting physical activity. The strengths are: most children and adolescents attend school; most young people are likely to see teachers as credible sources of information; schools provide access to the facilities, infrastructure and support required for physical activity; and schools are the workplace of skilled educators. Potential limitations are: those students who like school the least are the most likely to engage in health-compromising behaviours and the least likely to be influenced by school-based programs; there are about 20 more hours per week available for physical activity outside school hours than during school hours; enormous demands are already being made on schools; many primary school teachers have low levels of perceived competence in teaching physical education and fundamental movement skills; and opportunities for being active at school may not be consistent with how and when students prefer to be active.

  8. From fundamental limits to radioprotection practice

    International Nuclear Information System (INIS)

    Henry, P.; Chassany, J.

    1980-01-01

    The individual dose limits fixed by present French legislation for different categories of people refer to dose equivalents received by or delivered to the whole body or to certain tissues or organs over given periods of time. The values concerning personnel directly engaged in radiation work are summed up in a table. These are the limits which radioprotection authorities must enforce, while ensuring that exposure levels are kept as low as possible. With the means available in practical radioprotection it is not possible to measure dose equivalents directly, but information may be obtained on dose rates, absorbed doses, particle fluxes, and activities per unit volume and per surface area. An interpretation of these measurements is necessary if efficient supervision of worker exposure is to be achieved
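    The interpretation step described above, going from a measured dose rate to a dose equivalent compared against a legal limit, amounts to simple bookkeeping. A minimal sketch follows; the 50 mSv/yr figure matches the ICRP-26-era occupational whole-body limit, but the actual values in the legislation discussed here are in the paper's table, and the dose rate and hours below are hypothetical:

```python
# Illustrative only: convert a measured dose rate into an annual dose
# equivalent and check it against a whole-body occupational limit.
# 50 mSv/yr reflects the ICRP-26-era recommendation; real legal limits
# depend on jurisdiction and worker category.
ANNUAL_LIMIT_MSV = 50.0

def annual_dose_msv(dose_rate_usv_per_h, hours_per_year):
    # microsieverts/hour * hours -> microsieverts; divide by 1000 for mSv
    return dose_rate_usv_per_h * hours_per_year / 1000.0

dose = annual_dose_msv(10.0, 1600)     # 10 uSv/h over 1600 working hours
print(dose, dose <= ANNUAL_LIMIT_MSV)  # 16.0 mSv, within the limit
```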

  9. 100 nm scale low-noise sensors based on aligned carbon nanotube networks: overcoming the fundamental limitation of network-based sensors

    Science.gov (United States)

    Lee, Minbaek; Lee, Joohyung; Kim, Tae Hyun; Lee, Hyungwoo; Lee, Byung Yang; Park, June; Jhon, Young Min; Seong, Maeng-Je; Hong, Seunghun

    2010-02-01

    Nanoscale sensors based on single-walled carbon nanotube (SWNT) networks have been considered impractical due to several fundamental limitations such as a poor sensitivity and small signal-to-noise ratio. Herein, we present a strategy to overcome these fundamental problems and build highly-sensitive low-noise nanoscale sensors simply by controlling the structure of the SWNT networks. In this strategy, we prepared nanoscale width channels based on aligned SWNT networks using a directed assembly strategy. Significantly, the aligned network-based sensors with narrower channels exhibited even better signal-to-noise ratio than those with wider channels, which is opposite to conventional random network-based sensors. As a proof of concept, we demonstrated 100 nm scale low-noise sensors to detect mercury ions with the detection limit of ~1 pM, which is superior to any state-of-the-art portable detection system and is below the allowable limit of mercury ions in drinking water set by most government environmental protection agencies. This is the first demonstration of 100 nm scale low-noise sensors based on SWNT networks. Considering the increased interests in high-density sensor arrays for healthcare and environmental protection, our strategy should have a significant impact on various industrial applications.

  10. 100 nm scale low-noise sensors based on aligned carbon nanotube networks: overcoming the fundamental limitation of network-based sensors

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Minbaek; Lee, Joohyung; Kim, Tae Hyun; Lee, Hyungwoo; Lee, Byung Yang; Hong, Seunghun [Department of Physics and Astronomy, Seoul National University, Shilim-Dong, Kwanak-Gu, Seoul 151-742 (Korea, Republic of); Park, June; Seong, Maeng-Je [Department of Physics, Chung-Ang University, Heukseok-Dong, Dongjak-Gu, Seoul 156-756 (Korea, Republic of); Jhon, Young Min, E-mail: mseong@cau.ac.kr, E-mail: shong@phya.snu.ac.kr [Korea Institute of Science and Technology, Hawolgok-Dong, Seongbuk-Gu, Seoul 136-791 (Korea, Republic of)

    2010-02-05

    Nanoscale sensors based on single-walled carbon nanotube (SWNT) networks have been considered impractical due to several fundamental limitations such as a poor sensitivity and small signal-to-noise ratio. Herein, we present a strategy to overcome these fundamental problems and build highly-sensitive low-noise nanoscale sensors simply by controlling the structure of the SWNT networks. In this strategy, we prepared nanoscale width channels based on aligned SWNT networks using a directed assembly strategy. Significantly, the aligned network-based sensors with narrower channels exhibited even better signal-to-noise ratio than those with wider channels, which is opposite to conventional random network-based sensors. As a proof of concept, we demonstrated 100 nm scale low-noise sensors to detect mercury ions with the detection limit of ∼1 pM, which is superior to any state-of-the-art portable detection system and is below the allowable limit of mercury ions in drinking water set by most government environmental protection agencies. This is the first demonstration of 100 nm scale low-noise sensors based on SWNT networks. Considering the increased interests in high-density sensor arrays for healthcare and environmental protection, our strategy should have a significant impact on various industrial applications.

  11. 100 nm scale low-noise sensors based on aligned carbon nanotube networks: overcoming the fundamental limitation of network-based sensors

    International Nuclear Information System (INIS)

    Lee, Minbaek; Lee, Joohyung; Kim, Tae Hyun; Lee, Hyungwoo; Lee, Byung Yang; Hong, Seunghun; Park, June; Seong, Maeng-Je; Jhon, Young Min

    2010-01-01

    Nanoscale sensors based on single-walled carbon nanotube (SWNT) networks have been considered impractical due to several fundamental limitations such as a poor sensitivity and small signal-to-noise ratio. Herein, we present a strategy to overcome these fundamental problems and build highly-sensitive low-noise nanoscale sensors simply by controlling the structure of the SWNT networks. In this strategy, we prepared nanoscale width channels based on aligned SWNT networks using a directed assembly strategy. Significantly, the aligned network-based sensors with narrower channels exhibited even better signal-to-noise ratio than those with wider channels, which is opposite to conventional random network-based sensors. As a proof of concept, we demonstrated 100 nm scale low-noise sensors to detect mercury ions with the detection limit of ∼1 pM, which is superior to any state-of-the-art portable detection system and is below the allowable limit of mercury ions in drinking water set by most government environmental protection agencies. This is the first demonstration of 100 nm scale low-noise sensors based on SWNT networks. Considering the increased interests in high-density sensor arrays for healthcare and environmental protection, our strategy should have a significant impact on various industrial applications.

  12. Fire protection for nuclear power plants. Part 1. Fundamental approaches. Version 6/99

    International Nuclear Information System (INIS)

    1999-06-01

    The KTA nuclear safety code sets out the fundamental approaches and principles for the prevention of fires in nuclear power plants, addressing the initiation, spreading, and effects of a fire: (a) fire loads and ignition sources, (b) structural and plant engineering conditions, (c) ways and means of fire alarm and fire fighting. Relevant technical and organisational measures are defined. The scope and quality of the fire prevention measures to be taken, as well as the relevant in-service inspection activities, are determined according to the protective goals pursued in each case. (orig./CB) [de

  13. Extending the fundamental imaging-depth limit of multi-photon microscopy by imaging with photo-activatable fluorophores.

    Science.gov (United States)

    Chen, Zhixing; Wei, Lu; Zhu, Xinxin; Min, Wei

    2012-08-13

    It is highly desirable to be able to optically probe biological activities deep inside live organisms. By employing a spatially confined excitation via a nonlinear transition, multiphoton fluorescence microscopy has become indispensable for imaging scattering samples. However, as the incident laser power drops exponentially with imaging depth due to scattering loss, the out-of-focus fluorescence eventually overwhelms the in-focal signal. The resulting loss of imaging contrast defines a fundamental imaging-depth limit, which cannot be overcome by increasing excitation intensity. Herein we propose to significantly extend this depth limit by multiphoton activation and imaging (MPAI) of photo-activatable fluorophores. The imaging contrast is drastically improved due to the created disparity of bright-dark quantum states in space. We demonstrate this new principle by both analytical theory and experiments on tissue phantoms labeled with synthetic caged fluorescein dye or genetically encodable photoactivatable GFP.
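    The contrast-limited depth described in this abstract can be captured with a toy model (my construction, not the paper's): ballistic excitation decays as exp(-z/ls) with scattering length ls, so the in-focus two-photon signal falls as exp(-2z/ls), while out-of-focus background generated near the surface stays roughly constant; the depth where the two become equal is the imaging-depth limit. The numerical values below are hypothetical:

```python
# Toy model of the two-photon contrast-limited imaging depth. The
# photoactivation strategy (MPAI) in the abstract effectively shrinks the
# background term, which pushes this crossover depth deeper.
import math

def signal(z_um, ls_um):
    # relative in-focus two-photon signal: (ballistic attenuation)^2
    return math.exp(-2.0 * z_um / ls_um)

def depth_limit(ls_um, background):
    # depth z* where signal(z*) equals the relative background level
    return 0.5 * ls_um * math.log(1.0 / background)

ls = 200.0   # scattering mean free path in tissue (um), hypothetical
bg = 1e-4    # relative out-of-focus background, hypothetical
print(f"contrast-limited depth ~ {depth_limit(ls, bg):.0f} um")
```

Lowering `bg` by two orders of magnitude (as sparse photoactivation aims to do) adds about 2.3 scattering lengths of usable depth, illustrating why the limit is set by contrast rather than laser power.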

  14. Roothaan approach in the thermodynamic limit

    Science.gov (United States)

    Gutierrez, G.; Plastino, A.

    1982-02-01

    A systematic method for the solution of the Hartree-Fock equations in the thermodynamic limit is presented. The approach is seen to be a natural extension of the one usually employed in the finite-fermion case, i.e., that developed by Roothaan. The new techniques developed here are applied, as an example, to neutron matter, employing the so-called V1 Bethe "homework" potential. The results obtained are, by far, superior to those that the ordinary plane-wave Hartree-Fock theory yields. NUCLEAR STRUCTURE Hartree-Fock approach; nuclear and neutron matter.

  15. A Case for Flexible Epistemology and Metamethodology in Religious Fundamentalism Research

    Directory of Open Access Journals (Sweden)

    Carter J. Haynes

    2010-07-01

    Full Text Available After reviewing a representative sample of current and historical research in religious fundamentalism, the author addresses the epistemological presuppositions supporting both quantitative and qualitative methodologies and argues for epistemological flexibility and metamethodology, both of which support and are supported by metatheoretical thinking. Habermas’ concept of the scientistic self-understanding of the sciences is used to point up the limitations of positivist epistemology, especially in the context of fundamentalism research. A metamethodological approach, supported by epistemological flexibility, makes dialogical engagement between researchers and those they research possible, and an example of how this would look in an actual research design is provided. The article concludes with a theoretical statement and graphic representation of a model for dialogical engagement between Western scholars and non-Western religious fundamentalists. Such engagement, the author argues, is necessary before any real progress on the “problem” of radicalized fundamentalism can be made.

  16. [95/95] Approach for design limits analysis in WWER

    International Nuclear Information System (INIS)

    Shishkov, L.; Tsyganov, S.

    2008-01-01

    The paper discusses the well-known [95%/95%] condition, which is important for monitoring certain limits of core parameters when designing reactors such as PWR or WWER. The condition ensures the postulate that 'there is at least a 95% probability at a 95% confidence level' that some parameter does not exceed the limit. Such conditions are stated, for instance, in US standards and IAEA norms as recommendations for DNBR and fuel temperature. A question may arise: why is such an approach applied to these parameters, but not normally to any others? And how are the limits ensured in design practice? Using general statements of mathematical statistics, the authors interpret the [95/95] approach as applied to WWER design limits. (Authors)
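
In practice, 95/95 criteria of this kind are often implemented with Wilks' nonparametric tolerance limits. A minimal sketch (standard statistics, not code from the paper; the function name is ours) computes the smallest sample size for which the largest of n observations bounds the 95th percentile of the population with 95% confidence:

```python
import math

def wilks_sample_size(p=0.95, conf=0.95):
    """Smallest n such that the maximum of n i.i.d. samples exceeds the
    p-quantile of the population with probability >= conf.

    The confidence that the sample maximum bounds the p-quantile is
    1 - p**n, so we need the smallest n with p**n <= 1 - conf.
    """
    return math.ceil(math.log(1.0 - conf) / math.log(p))

# Classic result: 59 samples give a one-sided 95%/95% tolerance bound.
print(wilks_sample_size())  # 59
```

This is why "run 59 cases and take the worst one" appears so often in reactor safety statistics: with 59 independent samples, the largest observed DNBR (or fuel temperature) is a valid 95/95 bound.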

  17. The thermodynamic limit and the finite-size behaviour of the fundamental Sp(2N) spin chain

    International Nuclear Information System (INIS)

    Martins, M.J.

    2002-01-01

    This paper is concerned with the study of the fundamental integrable Sp(2N) spin chain. The Bethe ansatz equations are solved by a special string structure which allows us to determine the bulk limit properties. We present evidence that the critical properties of the system are governed by the product of N c=1 conformal field theories and are therefore different from those of the Sp(2N) Wess-Zumino-Witten theory. We argue that many of our findings can be generalized to include anisotropic symplectic spin chains. The possible relevance of our results to the physics of spin-orbital spin chains is also discussed.

  18. The separate universe approach to soft limits

    Energy Technology Data Exchange (ETDEWEB)

    Kenton, Zachary; Mulryne, David J., E-mail: z.a.kenton@qmul.ac.uk, E-mail: d.mulryne@qmul.ac.uk [School of Physics and Astronomy, Queen Mary University of London, Mile End Road, London, E1 4NS (United Kingdom)

    2016-10-01

    We develop a formalism for calculating soft limits of n-point inflationary correlation functions using separate universe techniques. Our method naturally allows for multiple fields and leads to an elegant diagrammatic approach. As an application we focus on the trispectrum produced by inflation with multiple light fields, giving explicit formulae for all possible single- and double-soft limits. We also investigate consistency relations and present an infinite tower of inequalities between soft correlation functions which generalise the Suyama-Yamaguchi inequality.

  19. Fundamental limits to frequency estimation: a comprehensive microscopic perspective

    Science.gov (United States)

    Haase, J. F.; Smirne, A.; Kołodyński, J.; Demkowicz-Dobrzański, R.; Huelga, S. F.

    2018-05-01

    We consider a metrology scenario in which qubit-like probes are used to sense an external field that affects their energy splitting in a linear fashion. Following the frequency estimation approach in which one optimizes the state and sensing time of the probes to maximize the sensitivity, we provide a systematic study of the attainable precision under the impact of noise originating from independent bosonic baths. Specifically, we invoke an explicit microscopic derivation of the probe dynamics using the spin-boson model with weak coupling of arbitrary geometry. We clarify how the secular approximation leads to a phase-covariant (PC) dynamics, where the noise terms commute with the field Hamiltonian, while the inclusion of non-secular contributions breaks the PC. Moreover, unless one restricts to a particular (i.e., Ohmic) spectral density of the bath modes, the noise terms may contain relevant information about the frequency to be estimated. Thus, by considering general evolutions of a single probe, we study regimes in which these two effects have a non-negligible impact on the achievable precision. We then consider baths of Ohmic spectral density yet fully accounting for the lack of PC, in order to characterize the ultimate attainable scaling of precision when N probes are used in parallel. Crucially, we show that beyond the semigroup (Lindbladian) regime the Zeno limit imposing the 1/N^(3/2) scaling of the mean squared error, recently derived assuming PC, generalises to any dynamics of the probes, unless the latter are coupled to the baths in the direction perfectly transversal to the frequency encoding, when a novel scaling of 1/N^(7/4) arises. As our microscopic approach covers all classes of dissipative dynamics, from semigroup to non-Markovian ones (each of them potentially non-phase-covariant), it provides an exhaustive picture, in which all the different asymptotic scalings of precision naturally emerge.

  20. When fast is better: protein folding fundamentals and mechanisms from ultrafast approaches.

    Science.gov (United States)

    Muñoz, Victor; Cerminara, Michele

    2016-09-01

    Protein folding research stalled for decades because conventional experiments indicated that proteins fold slowly and in single strokes, whereas theory predicted a complex interplay between dynamics and energetics resulting in myriad microscopic pathways. Ultrafast kinetic methods turned the field upside down by providing the means to probe fundamental aspects of folding, test theoretical predictions and benchmark simulations. Accordingly, experimentalists could measure the timescales for all relevant folding motions, determine the folding speed limit and confirm that folding barriers are entropic bottlenecks. Moreover, a catalogue of proteins that fold extremely fast (microseconds) could be identified. Such fast-folding proteins cross shallow free energy barriers or fold downhill, and thus unfold with minimal co-operativity (gradually). A new generation of thermodynamic methods has exploited this property to map folding landscapes, interaction networks and mechanisms at nearly atomic resolution. In parallel, modern molecular dynamics simulations have finally reached the timescales required to watch fast-folding proteins fold and unfold in silico. All of these findings have buttressed the fundamentals of protein folding predicted by theory, and are now offering the first glimpses at the underlying mechanisms. Fast folding appears to also have functional implications as recent results connect downhill folding with intrinsically disordered proteins, their complex binding modes and ability to moonlight. These connections suggest that the coupling between downhill (un)folding and binding enables such protein domains to operate analogically as conformational rheostats. © 2016 The Author(s).

  1. Updates on tetanus toxin: a fundamental approach

    Directory of Open Access Journals (Sweden)

    Md. Ahaduzzaman

    2015-03-01

    Clostridium tetani is an anaerobic bacterium that produces the second most poisonous protein toxin known from any bacterium. Tetanus in animals is sporadic in nature but difficult to combat, even with antibiotics and antiserum. It is crucial to understand the fundamental mechanisms and signals that control toxin production for advanced research and medicinal uses. This review is intended to give readers in related fields a better understanding of the basic pathophysiology of tetanus and its neurotoxin (TeNT).

  2. Photonic Design: From Fundamental Solar Cell Physics to Computational Inverse Design

    Science.gov (United States)

    Miller, Owen Dennis

    Photonic innovation is becoming ever more important in the modern world. Optical systems are dominating shorter and shorter communications distances, LEDs are rapidly emerging for a variety of applications, and solar cells show potential to be a mainstream technology in the energy space. The need for novel, energy-efficient photonic and optoelectronic devices will only increase. This work unites fundamental physics and a novel computational inverse design approach towards such innovation. The first half of the dissertation is devoted to the physics of high-efficiency solar cells. As solar cells approach fundamental efficiency limits, their internal physics transforms. Photonic considerations, instead of electronic ones, are the key to reaching the highest voltages and efficiencies. Proper photon management led to Alta Devices' recent dramatic increase of the solar cell efficiency record to 28.3%. Moreover, approaching the Shockley-Queisser limit for any solar cell technology will require light extraction to become a part of all future designs. The second half of the dissertation introduces inverse design as a new computational paradigm in photonics. An assortment of techniques (FDTD, FEM, etc.) have enabled quick and accurate simulation of the "forward problem" of finding fields for a given geometry. However, scientists and engineers are typically more interested in the inverse problem: for a desired functionality, what geometry is needed? Answering this question breaks from the emphasis on the forward problem and forges a new path in computational photonics. The framework of shape calculus enables one to quickly find superior, non-intuitive designs. Novel designs for optical cloaking and sub-wavelength solar cell applications are presented.

  3. Fundamental parameters approach applied to focal construct geometry for X-ray diffraction

    International Nuclear Information System (INIS)

    Rogers, K.; Evans, P.; Prokopiou, D.; Dicken, A.; Godber, S.; Rogers, J.

    2012-01-01

    A novel geometry for the acquisition of powder X-ray diffraction data, referred to as focal construct geometry (FCG), is presented. Diffraction data obtained by FCG have been shown to possess significantly enhanced intensity due to the hollow tube beam arrangement utilized. In contrast to conventional diffraction, the detector is translated to collect images along a primary axis and record the location of Bragg maxima. These high intensity condensation foci are unique to FCG and appear due to the convergence of Debye cones at single points on the primary axis. This work focuses on a two-dimensional fundamental parameters approach to simulate experimental data and subsequently aid with interpretation. This convolution method is shown to favorably reproduce the experimental diffractograms and can also accommodate preferred orientation effects in some circumstances.

  4. STEP and fundamental physics

    Science.gov (United States)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-09-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels.
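
The sensitivities quoted in this record are values of the Eötvös ratio, whose standard definition (not spelled out in the abstract) is:

```latex
% Eotvos ratio comparing the free-fall accelerations a_A, a_B of two test bodies
\eta(A,B) \;=\; 2\,\frac{\lvert a_A - a_B \rvert}{\lvert a_A + a_B \rvert}
```

Exact equivalence means η = 0 for every pair of materials; present torsion-balance and lunar-laser-ranging limits sit near η ~ 10^-13, and STEP aims to push this to 10^-18.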

  5. Ice limit of Coulomb gauge Yang-Mills theory

    International Nuclear Information System (INIS)

    Heinzl, T.; Ilderton, A.; Langfeld, K.; Lavelle, M.; McMullan, D.

    2008-01-01

    In this paper we describe gauge invariant multiquark states generalizing the path integral framework developed by Parrinello, Jona-Lasinio, and Zwanziger to amend the Faddeev-Popov approach. This allows us to produce states such that, in a limit which we call the ice limit, fermions are dressed with glue exclusively from the fundamental modular region associated with Coulomb gauge. The limit can be taken analytically without difficulties, avoiding the Gribov problem. This is illustrated by an unambiguous construction of gauge invariant mesonic states for which we simulate the static quark-antiquark potential.

  6. The fundamental parameter approach of quantitative XRFA- investigation of photoelectric absorption coefficients

    International Nuclear Information System (INIS)

    Shaltout, A.

    2003-06-01

    The present work describes some actual problems of quantitative X-ray fluorescence analysis by means of the fundamental parameter approach. To perform this task, some of the main parameters are discussed in detail: photoelectric cross sections, coherent and incoherent scattering cross sections, mass absorption cross sections, and the variation of the X-ray tube voltage. Photoelectric, coherent and incoherent scattering, and mass absorption cross sections in the energy range from 1 to 300 keV for the elements from Z=1 to 94 are studied across ten different databases: those of Hubbell, McMaster, Mucall, Scofield, Xcom, Elam, Sasaki, Henke, Cullen and Chantler. These databases have also been developed for application in fundamental parameter programs for quantitative X-ray analysis (energy dispersive X-ray fluorescence analysis (EDXRFA), electron probe microanalysis (EPMA), X-ray photoelectron spectroscopy (XPS) and total electron yield (TEY)). In addition, a comparison is performed between the different databases. In McMaster's database, the missing elements (Z=84, 85, 87, 88, 89, 91, and 93) are added by using photoelectric cross sections from Scofield's database, coherent and incoherent scattering cross sections from Elam's database, and the absorption edges of Bearden. The N-fit coefficients of the elements from Z=61 to 69 are wrong in the McMaster database; therefore, linear least-squares fits are used to recalculate them. Additionally, in the McMaster tables the positions of the M- and N-edges of all elements, with the exception of the M1- and N1-edges, are not defined, nor are the jump ratios of the edges. In the present work, the M- and N-edges and the related jump ratios are calculated; to include the missing N-edges, Bearden's values of the edge energies are used. In Scofield's database, modifications include checks and corrections.

  7. Overview of the fundamental safety principles

    International Nuclear Information System (INIS)

    Chishinga, Milton Mulenga

    2015-02-01

    The primary objective of this work was to provide an overview of the International Atomic Energy Agency (IAEA) document 'Fundamental Safety Principles, SF-1'. The document outlines ten (10) fundamental principles which provide the basis for an effective radiation protection framework, and it stands at the top of the hierarchy of the IAEA Safety Standards Series. These principles are the foundation of nuclear safety and underpin the stringent obligations placed on Parties to the Convention on Nuclear Safety. The fundamental safety objective is to protect people and the environment from the harmful effects of ionizing radiation. This objective of protecting people, individually and collectively, and the environment has to be achieved without unduly limiting the operation of facilities or the conduct of activities that give rise to risks. The thematic areas covered are: responsibility for safety; role of government; leadership and management for safety; justification of facilities and activities; optimization of protection; limitation of risks to individuals; protection of present and future generations; prevention of accidents; emergency preparedness and response; and protective actions to reduce existing or unregulated radiation risks. Appropriate recommendations are provided for effective application of the principles by governments, regulatory bodies and operating organizations of facilities and nuclear installations that give rise to radiation risks. (au)

  8. The fundamentals of mathematical analysis

    CERN Document Server

    Fikhtengol'ts, G M

    1965-01-01

    The Fundamentals of Mathematical Analysis, Volume 1 is a textbook that provides a systematic and rigorous treatment of the fundamentals of mathematical analysis. Emphasis is placed on the concept of limit, which plays a principal role in mathematical analysis. Examples of the application of mathematical analysis to geometry, mechanics, physics, and engineering are given. This volume is comprised of 14 chapters and begins with a discussion on real numbers, their properties and applications, and arithmetical operations over real numbers. The reader is then introduced to the concept of function.
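
The limit concept emphasized in this volume is the classical ε-δ definition (standard material, included here only as a reminder):

```latex
\lim_{x \to a} f(x) = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\; \exists \delta > 0 :\;
0 < \lvert x - a \rvert < \delta \;\Longrightarrow\; \lvert f(x) - L \rvert < \varepsilon
```

Continuity, derivatives, and integrals in the rest of the volume are all built on this single definition.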

  9. Approaching the Shockley-Queisser limit: General assessment of the main limiting mechanisms in photovoltaic cells

    International Nuclear Information System (INIS)

    Vossier, Alexis; Gualdi, Federico; Dollet, Alain; Ares, Richard; Aimez, Vincent

    2015-01-01

    In principle, the upper efficiency limit of any solar cell technology can be determined using the detailed-balance limit formalism. However, “real” solar cells show efficiencies which are always below this theoretical value due to several limiting mechanisms. We study the ability of a solar cell architecture to approach its own theoretical limit, using a novel index introduced in this work, and the amplitude with which the different limiting mechanisms affect the cell efficiency is scrutinized as a function of the electronic gap and the illumination level to which the cell is submitted. The implications for future generations of solar cells aiming at an improved conversion of the solar spectrum are also addressed
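
The detailed-balance formalism mentioned in this record starts from Shockley and Queisser's "ultimate efficiency", which is easy to evaluate numerically. The sketch below is an illustration under a 6000 K blackbody-sun assumption, not code from the paper; it scans the band gap in dimensionless units x = E/(kT_s):

```python
import math

# Ultimate efficiency: u(x_g) = x_g * Int_{x_g}^inf x^2/(e^x-1) dx
#                               / Int_0^inf x^3/(e^x-1) dx
# (every absorbed photon delivers exactly the gap energy E_g).
dx = 1e-3
xs = [i * dx for i in range(1, 50001)]        # x from 0.001 to 50
f = [x * x / math.expm1(x) for x in xs]       # blackbody photon-flux integrand

# Tail integrals tail[i] ~ Int_{xs[i]}^inf f(x) dx, by the trapezoid rule.
tail = [0.0] * (len(xs) + 1)
for i in range(len(xs) - 1, -1, -1):
    nxt = f[i + 1] if i + 1 < len(xs) else 0.0
    tail[i] = tail[i + 1] + 0.5 * (f[i] + nxt) * dx

total_energy = math.pi ** 4 / 15              # analytic Int_0^inf x^3/(e^x-1) dx

best_u, best_xg = 0.0, 0.0
for i, xg in enumerate(xs):
    u = xg * tail[i] / total_energy
    if u > best_u:
        best_u, best_xg = u, xg

kT_sun_eV = 8.617e-5 * 6000                   # k T_s in eV for T_s = 6000 K
print(f"ultimate efficiency ~ {best_u:.2f} at E_g ~ {best_xg * kT_sun_eV:.2f} eV")
```

The scan peaks near 44% at a gap around 1.1 eV; adding radiative recombination and the other loss channels the record discusses brings this down to the familiar ~33% detailed-balance limit.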

  10. Fundamental limits of positron emission mammography

    International Nuclear Information System (INIS)

    Moses, William W.; Qi, Jinyi

    2001-01-01

    We explore the causes of performance limitation in positron emission mammography cameras. We compare two basic camera geometries containing the same volume of 511 keV photon detectors, one with a parallel plane geometry and another with a rectangular geometry. We find that both geometries have similar performance for the phantom imaged (in Monte Carlo simulation), even though the solid angle coverage of the rectangular camera is about 50 percent higher than the parallel plane camera. The reconstruction algorithm used significantly affects the resulting image; iterative methods significantly outperform the commonly used focal plane tomography. Finally, the characteristics of the tumor itself, specifically the absolute amount of radiotracer taken up by the tumor, will significantly affect the imaging performance

  11. Roothaan approach in the thermodynamic limit

    International Nuclear Information System (INIS)

    Gutierrez, G.; Plastino, A.

    1982-01-01

    A systematic method for the solution of the Hartree-Fock equations in the thermodynamic limit is presented. The approach is seen to be a natural extension of the one usually employed in the finite-fermion case, i.e., that developed by Roothaan. The new techniques developed here are applied, as an example, to neutron matter, employing the so-called V1 Bethe "homework" potential. The results obtained are, by far, superior to those that the ordinary plane-wave Hartree-Fock theory yields.

  12. Surface chemistry and fundamental limitations on the plasma cleaning of metals

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Bin, E-mail: bindong@my.unt.edu [Department of Chemistry, University of North Texas, 1155 Union Circle 305070, Denton, TX, 76203 (United States); Driver, M. Sky, E-mail: Marcus.Driver@unt.edu [Department of Chemistry, University of North Texas, 1155 Union Circle 305070, Denton, TX, 76203 (United States); Emesh, Ismail, E-mail: Ismail_Emesh@amat.com [Applied Materials Inc., 3050 Bowers Ave, Santa Clara, CA, 95054 (United States); Shaviv, Roey, E-mail: Roey_Shaviv@amat.com [Applied Materials Inc., 3050 Bowers Ave, Santa Clara, CA, 95054 (United States); Kelber, Jeffry A., E-mail: Jeffry.Kelber@unt.edu [Department of Chemistry, University of North Texas, 1155 Union Circle 305070, Denton, TX, 76203 (United States)

    2016-10-30

    Highlights: • O2-free plasma treatment of air-exposed Co or Cu surfaces yields remnant C layers inert to further plasma cleaning. • The remnant C layer is graphitic (Cu) or carbidic (Co). • The formation of a remnant C layer is linked to plasma cleaning of a metal surface. - Abstract: In-situ X-ray photoelectron spectroscopy (XPS) studies reveal that plasma cleaning of air-exposed Co or Cu transition metal surfaces results in the formation of a remnant C film 1–3 monolayers thick, which is not reduced upon extensive further plasma exposure. This effect is observed for H2 or NH3 plasma cleaning of Co, and He or NH3 plasma cleaning of Cu, and is observed with both inductively coupled (ICP) and capacitively coupled (CCP) plasmas. Changes in C 1s XPS spectra indicate that this remnant film formation is accompanied by the formation of carbidic C on Co and of graphitic C on Cu. This is in contrast to published work showing no such remnant carbidic/carbon layer after similar treatments of Si oxynitride surfaces. The observation of the remnant carbidic C film on Co and graphitic film on Cu, but not on silicon oxynitride (SiOxNy), regardless of plasma chemistry or type, indicates that this effect is due to plasma-induced secondary electron emission from the metal surface, resulting in transformation of sp3 adventitious C to either a metal carbide or graphite. These results suggest fundamental limitations to plasma-based surface cleaning procedures on metal surfaces.

  13. STEP and fundamental physics

    International Nuclear Information System (INIS)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-01-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels. (paper)

  14. Censoring approach to the detection limits in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Pajek, M.; Kubala-Kukus, A.

    2004-01-01

    We demonstrate that the effect of detection limits in X-ray fluorescence analysis (XRF), which restricts the determination of very low concentrations of trace elements and results in the appearance of so-called 'nondetects', can be accounted for using the statistical concept of censoring. More precisely, the results of such measurements can be viewed as left randomly censored data, which can be analyzed with the Kaplan-Meier method to correct for the presence of nondetects. Using this approach, the measured, detection-limit-censored concentrations can be interpreted in a nonparametric manner, including the correction for the nondetects, i.e. the measurements in which the concentrations were found to be below the actual detection limits. Moreover, using the Monte Carlo simulation technique we show that with the Kaplan-Meier approach the corrected mean concentrations for a population of samples can be estimated to within a few percent with respect to the simulated, uncensored data. In practice this means that the final uncertainties of the estimated mean values are limited by the number of studied samples and not by the correction procedure itself. The discussed random left-censoring approach was applied to analyze XRF detection-limit-censored concentration measurements of trace elements in biomedical samples.
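
The usual trick for applying Kaplan-Meier to left-censored (nondetect) data is to flip the values so the censoring becomes right censoring, then apply the standard product-limit estimator. The sketch below is a generic illustration of that method, not the authors' code (the function name and interface are made up here):

```python
def km_mean_left_censored(values, censored):
    """Kaplan-Meier (product-limit) estimate of the mean for left-censored data.

    values[i]   -- the measurement, or the detection limit if censored
    censored[i] -- True means the true value is only known to be below values[i]

    Left censoring is turned into right censoring by flipping the data about
    its largest value; the product-limit estimator is then applied as usual.
    """
    M = max(values)
    # (flipped time, event observed?); at ties, observed events are processed first
    flipped = sorted(((M - v, not c) for v, c in zip(values, censored)),
                     key=lambda te: (te[0], not te[1]))
    at_risk = len(flipped)
    S, mean_flipped, prev_t = 1.0, 0.0, 0.0
    for t, event in flipped:
        mean_flipped += S * (t - prev_t)  # restricted mean = integral of S(t)
        prev_t = t
        if event:
            S *= (at_risk - 1) / at_risk
        at_risk -= 1
    return M - mean_flipped

# With no nondetects this reduces to the ordinary sample mean:
print(km_mean_left_censored([1.0, 2.0, 3.0], [False, False, False]))
```

Unlike substituting zero or half the detection limit for each nondetect, this estimator uses the censored observations nonparametrically, which is exactly the property the record exploits.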

  15. A Unique Mathematical Derivation of the Fundamental Laws of Nature Based on a New Algebraic-Axiomatic (Matrix Approach

    Directory of Open Access Journals (Sweden)

    Ramin Zahedi

    2017-09-01

    In this article, as a new mathematical approach to the origin of the laws of nature, using a new basic algebraic axiomatic (matrix) formalism based on ring theory and Clifford algebras (presented in Section 2), "it is shown that certain mathematical forms of fundamental laws of nature, including laws governing the fundamental forces of nature (represented by a set of two definite classes of general covariant massive field equations, with new matrix formalisms), are derived uniquely from only a very few axioms." In agreement with the rational Lorentz group, it is also basically assumed that the components of relativistic energy-momentum can only take rational values. In essence, the main scheme of this new mathematical axiomatic approach to the fundamental laws of nature is as follows: first, based on the assumption of the rationality of D-momentum and by linearization (along with a parameterization procedure) of the Lorentz-invariant energy-momentum quadratic relation, a unique set of Lorentz-invariant systems of homogeneous linear equations (with matrix formalisms compatible with certain Clifford and symmetric algebras) is derived. Then, by an initial quantization (followed by a basic procedure of minimal coupling to space-time geometry) of these determined systems of linear equations, a set of two classes of general covariant massive (tensor) field equations (with matrix formalisms compatible with certain Clifford and Weyl algebras) is derived uniquely as well.

  16. Individual differences in fundamental social motives.

    Science.gov (United States)

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation.

  17. A systems approach to theoretical fluid mechanics: Fundamentals

    Science.gov (United States)

    Anyiwo, J. C.

    1978-01-01

    A preliminary application of the underlying principles of the investigator's general system theory to the description and analyses of the fluid flow system is presented. An attempt is made to establish practical models, or elements, of the general fluid flow system from the point of view of the general system theory fundamental principles. Results obtained are applied to a simple experimental fluid flow system, as a test case, with particular emphasis on the understanding of fluid flow instability, transition and turbulence.

  18. Limits to magnetic resonance microscopy

    International Nuclear Information System (INIS)

    Glover, Paul; Mansfield, Peter

    2002-01-01

    The last quarter of the twentieth century saw the development of magnetic resonance imaging (MRI) grow from a laboratory demonstration to a multi-billion dollar worldwide industry. There is a clinical body scanner in almost every hospital of the developed nations. The field of magnetic resonance microscopy (MRM), after mostly being abandoned by researchers in the first decade of MRI, has become an established branch of the science. This paper reviews the development of MRM over the last decade with an emphasis on the current state of the art. The fundamental principles of imaging and signal detection are examined to determine the physical principles which limit the available resolution. The limits are discussed with reference to liquid, solid and gas phase microscopy. In each area, the novel approaches employed by researchers to push back the limits of resolution are discussed. Although the limits to resolution are well known, the developments and applications of MRM have not reached their limit. (author)

  19. Fundamental approaches for analysis thermal hydraulic parameter for Puspati Research Reactor

    International Nuclear Information System (INIS)

    Hashim, Zaredah; Lanyau, Tonny Anak; Farid, Mohamad Fairus Abdul; Kassim, Mohammad Suhaimi; Azhar, Noraishah Syahirah

    2016-01-01

    The 1-MW PUSPATI Research Reactor (RTP) is the only pool-type nuclear research reactor in Malaysia, developed by General Atomic (GA). It was installed at the Malaysian Nuclear Agency and reached first criticality on 8 June 1982. Based on the initial core, which comprised 80 standard TRIGA fuel elements, a fundamental thermal hydraulic model was investigated during steady-state operation using the PARET code. The main objective of this paper is to determine the variation of temperature profiles and the Departure from Nucleate Boiling Ratio (DNBR) of RTP at full-power operation. The second objective is to confirm that the values obtained from the PARET code are in agreement with the Safety Analysis Report (SAR) for RTP. The code was employed for the hot and average channels in the core in order to calculate the fuel centerline and surface, cladding, and coolant temperatures, as well as the DNBR values. It was found that the thermal hydraulic parameters related to safety for the initial core, which was cooled by natural convection, were in agreement with the design values and safety limits in the SAR.
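
For reference, the DNBR figure of merit used throughout this record has the standard definition (a textbook reminder, not a formula from the paper):

```latex
% Departure from nucleate boiling ratio along the channel axis z
\mathrm{DNBR}(z) \;=\; \frac{q''_{\mathrm{CHF}}(z)}{q''(z)},
\qquad
\min_z \, \mathrm{DNBR}(z) \;\ge\; \mathrm{DNBR}_{\mathrm{limit}}
```

where q''_CHF is the critical heat flux predicted by a correlation and q'' is the local heat flux; the safety case requires the minimum along the hot channel to stay above the design limit.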

  20. Fundamental approaches for analysis thermal hydraulic parameter for Puspati Research Reactor

    Science.gov (United States)

    Hashim, Zaredah; Lanyau, Tonny Anak; Farid, Mohamad Fairus Abdul; Kassim, Mohammad Suhaimi; Azhar, Noraishah Syahirah

    2016-01-01

    The 1-MW PUSPATI Research Reactor (RTP) is the only pool-type nuclear research reactor in Malaysia, developed by General Atomic (GA). It was installed at the Malaysian Nuclear Agency and reached first criticality on 8 June 1982. Based on the initial core, which comprised 80 standard TRIGA fuel elements, a fundamental thermal hydraulic model was investigated during steady-state operation using the PARET code. The main objective of this paper is to determine the variation of temperature profiles and the Departure from Nucleate Boiling Ratio (DNBR) of RTP at full-power operation. The second objective is to confirm that the values obtained from the PARET code are in agreement with the Safety Analysis Report (SAR) for RTP. The code was employed for the hot and average channels in the core in order to calculate the fuel centerline and surface, cladding, and coolant temperatures, as well as the DNBR values. It was found that the thermal hydraulic parameters related to safety for the initial core, which was cooled by natural convection, were in agreement with the design values and safety limits in the SAR.

  1. Fundamental approaches for analysis thermal hydraulic parameter for Puspati Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Hashim, Zaredah, E-mail: zaredah@nm.gov.my; Lanyau, Tonny Anak, E-mail: tonny@nm.gov.my; Farid, Mohamad Fairus Abdul; Kassim, Mohammad Suhaimi [Reactor Technology Centre, Technical Support Division, Malaysia Nuclear Agency, Ministry of Science, Technology and Innovation, Bangi, 43000, Kajang, Selangor Darul Ehsan (Malaysia); Azhar, Noraishah Syahirah [Universiti Teknologi Malaysia, 80350, Johor Bahru, Johor Darul Takzim (Malaysia)

    2016-01-22

    The 1-MW PUSPATI Research Reactor (RTP) is the only pool-type nuclear research reactor in Malaysia, developed by General Atomics (GA). It was installed at the Malaysian Nuclear Agency and reached first criticality on 8 June 1982. For the initial core, which comprised 80 standard TRIGA fuel elements, a fundamental thermal-hydraulic model was investigated during steady-state operation using the PARET code. The main objective of this paper is to determine the variation of the temperature profiles and the Departure from Nucleate Boiling Ratio (DNBR) of RTP at full-power operation. The second objective is to confirm that the values obtained from the PARET code agree with the Safety Analysis Report (SAR) for RTP. The code was employed for the hot and average channels in the core to calculate fuel centerline, fuel surface, cladding, and coolant temperatures as well as DNBR values. The results obtained from the PARET code showed that the safety-related thermal-hydraulic parameters of the initial core, which was cooled by natural convection, were in agreement with the design values and safety limits in the SAR.
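The safety figure of merit in the record above, the DNBR, is simply the ratio of the critical heat flux to the local heat flux along the hot channel. The sketch below shows how a minimum DNBR is located for a purely illustrative cosine axial flux shape; the numbers and the extrapolated-length factor are invented for illustration and are not RTP design data or the actual PARET model.

```python
import math

def dnbr_profile(q_max, q_chf, core_height, n_nodes=21):
    """Illustrative DNBR(z) = q''_CHF / q''(z) for a cosine axial
    heat-flux shape q''(z) = q_max * cos(pi * z / H_e).
    All inputs are hypothetical, not RTP design data."""
    h = core_height
    zs = [(-h / 2) + i * h / (n_nodes - 1) for i in range(n_nodes)]
    profile = []
    for z in zs:
        # assumed extrapolated length H_e = 1.2 * H (illustrative only)
        q_local = q_max * math.cos(math.pi * z / (1.2 * h))
        profile.append((z, q_chf / q_local))
    return profile

# The minimum DNBR occurs at the axial flux peak (core mid-plane, z = 0).
profile = dnbr_profile(q_max=45.0, q_chf=140.0, core_height=0.38)  # W/cm^2, m
min_z, min_dnbr = min(profile, key=lambda p: p[1])
```

In a real steady-state analysis the critical heat flux itself varies along the channel (via a CHF correlation evaluated at local conditions); holding it constant here keeps the sketch minimal.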

  2. Fundamental limits to imaging resolution for focused ion beams

    International Nuclear Information System (INIS)

    Orloff, J.; Swanson, L.W.; Utlaut, M.

    1996-01-01

    This article investigates the limitations on the formation of focused-ion-beam images from secondary electrons. We use the notion of the information content of an image to account for the effects of resolution, contrast, and signal-to-noise ratio, and show that there is a competition between the rate at which small features are sputtered away by the primary beam and the rate of collection of secondary electrons. We find that for small features, sputtering is the limit to imaging resolution, and that for extended small features (e.g., layered structures), rearrangement, redeposition, and differential sputtering rates may limit the resolution in some cases. Copyright 1996 American Vacuum Society.

  3. Summary: fundamental interactions and processes

    International Nuclear Information System (INIS)

    Koltun, D.S.

    1982-01-01

    The subjects of the talks on the first day of the workshop are discussed in terms of fundamental interactions, dynamical theory, and relevant degrees of freedom. Some general considerations are introduced and used to confront the various approaches taken in the earlier talks.

  4. Fundamental limits on quantum dynamics based on entropy change

    Science.gov (United States)

    Das, Siddhartha; Khatri, Sumeet; Siopsis, George; Wilde, Mark M.

    2018-01-01

    It is well known in the realm of quantum mechanics and information theory that the entropy is non-decreasing for the class of unital physical processes. However, in general, the entropy does not exhibit monotonic behavior. This has restricted the use of entropy change in characterizing evolution processes. Recently, a lower bound on the entropy change was provided in the work of Buscemi, Das, and Wilde [Phys. Rev. A 93(6), 062314 (2016)]. We explore the limit that this bound places on the physical evolution of a quantum system and discuss how these limits can be used as witnesses to characterize quantum dynamics. In particular, we derive a lower limit on the rate of entropy change for memoryless quantum dynamics, and we argue that it provides a witness of non-unitality. This limit on the rate of entropy change leads to definitions of several witnesses for testing memory effects in quantum dynamics. Furthermore, from the aforementioned lower bound on entropy change, we obtain a measure of non-unitarity for unital evolutions.

  5. Different Variants of Fundamental Portfolio

    Directory of Open Access Journals (Sweden)

    Tarczyński Waldemar

    2014-06-01

    The paper proposes a fundamental portfolio of securities. This portfolio is an alternative to the classic Markowitz model that combines fundamental analysis with portfolio analysis. The method's main idea is based on the use of the TMAI synthetic measure and, in the limiting conditions, the use of risk and the portfolio's rate of return in the objective function. Different variants of the fundamental portfolio have been considered in an empirical study. The effectiveness of the proposed solutions has been compared with the classic portfolio constructed with the help of the Markowitz model and with the rate of return of the WIG20 market index. All portfolios were constructed with data on rates of return for 2005, and their effectiveness in 2006-2013 was then evaluated. The studied period comprises the end of the bull market, the 2007-2009 crisis, the 2010 bull market and the 2011 crisis. This allows for the evaluation of the solutions' flexibility in various extreme situations. For the construction of the fundamental portfolio's objective function and the TMAI, the study used financial and economic data on selected indicators retrieved from Notoria Serwis for 2005.
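The classic Markowitz benchmark referenced above has a closed form in the unconstrained global-minimum-variance case, which makes a compact sketch possible. The covariance matrix below is invented for illustration and is unrelated to the study's WIG20 data or its TMAI-based objective function.

```python
import numpy as np

def min_variance_weights(cov):
    """Global minimum-variance portfolio of the Markowitz model:
    w = inv(Cov) @ 1 / (1' inv(Cov) 1), the unconstrained closed form."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)   # solve Cov w = 1 instead of inverting
    return w / w.sum()

# Hypothetical 3-asset covariance matrix (annualized, illustrative only).
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = min_variance_weights(cov)
# Weights sum to 1; the lowest-variance asset receives the largest weight,
# and the portfolio variance w' Cov w is below every individual variance.
```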

  6. The Continuum Limit of Causal Fermion Systems

    OpenAIRE

    Finster, Felix

    2016-01-01

    This monograph introduces the basic concepts of the theory of causal fermion systems, a recent approach to the description of fundamental physics. The theory yields quantum mechanics, general relativity and quantum field theory as limiting cases and is therefore a candidate for a unified physical theory. From the mathematical perspective, causal fermion systems provide a general framework for describing and analyzing non-smooth geometries and "quantum geometries." The dynamics is described by...

  7. Censoring: a new approach for detection limits in total-reflection X-ray fluorescence

    International Nuclear Information System (INIS)

    Pajek, M.; Kubala-Kukus, A.; Braziewicz, J.

    2004-01-01

    It is shown that the detection limits in total-reflection X-ray fluorescence (TXRF), which restrict quantification of very low concentrations of trace elements in samples, can be accounted for using the statistical concept of censoring. We demonstrate that incomplete TXRF measurements containing so-called 'nondetects', i.e. non-measured concentrations falling below the detection limits and represented by the estimated detection limit values, can be viewed as left-censored data, which can be analyzed using the Kaplan-Meier (KM) method to correct for nondetects. Within this approach, which uses the Kaplan-Meier product-limit estimator to obtain the cumulative distribution function corrected for nondetects, the mean and median of the detection-limit-censored concentrations can be estimated in a non-parametric way. Monte Carlo simulations show that the Kaplan-Meier approach yields highly accurate estimates for the mean and median concentrations, within a few percent of the simulated, uncensored data. This means that the uncertainties of the KM-estimated mean and median are in fact limited only by the number of studied samples and not by the correction procedure for nondetects itself. On the other hand, when the concentration of a given element is not measured in all samples, simple approaches to estimating a mean concentration from the data yield erroneous, systematically biased results. The discussed left-censoring approach was applied to analyze TXRF detection-limit-censored concentration measurements of trace elements in biomedical samples. We emphasize that the Kaplan-Meier approach allows one to estimate mean concentrations substantially below the mean level of the detection limits. Consequently, this approach gives new access to lowering the effective detection limits for the TXRF method, which is of prime interest for
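The Kaplan-Meier correction for nondetects described above can be sketched in a few lines using the standard flipping trick: left censoring of a concentration X becomes right censoring of Y = M − X for a constant M above all observations, after which the ordinary product-limit estimator applies. This is a simplified illustration, not the authors' implementation, and the example concentrations are invented.

```python
import numpy as np

def km_mean_left_censored(values, detected):
    """Kaplan-Meier (product-limit) mean for left-censored data, e.g.
    TXRF concentrations where nondetects are reported at their
    detection limits.  Flip Y = M - X so nondetects become
    right-censored, run KM on Y, and integrate the survival curve."""
    x = np.asarray(values, dtype=float)
    d = np.asarray(detected, dtype=bool)
    M = x.max()
    y = M - x                          # right-censored pseudo-data
    order = np.argsort(y)
    y, d = y[order], d[order]
    n = len(y)
    s, mean_y, prev = 1.0, 0.0, 0.0
    for i in range(n):
        if d[i] and y[i] > prev:       # area under S(t) up to this event
            mean_y += s * (y[i] - prev)
            prev = y[i]
        if d[i]:                       # product-limit step: S *= (n_i - 1)/n_i
            s *= (n - i - 1) / (n - i)
    if y[-1] > prev:                   # tail up to the last (censored) point
        mean_y += s * (y[-1] - prev)
    return M - mean_y                  # E[X] = M - E[Y]

# One nondetect whose detection limit (1.5) exceeds a detected value (1.0):
conc     = [1.5, 1.0, 2.0, 3.0]
detected = [False, True, True, True]
km_mean = km_mean_left_censored(conc, detected)   # 1.75
# Substituting the detection limit would give (1.5+1+2+3)/4 = 1.875,
# biased high -- exactly the systematic error the abstract warns about.
```

With no censoring the estimator reduces to the arithmetic mean, which is a convenient sanity check.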

  8. Reliability analysis - systematic approach based on limited data

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1975-11-01

    The initial approaches required for reliability analysis are outlined. These approaches highlight the system boundaries, examine the conditions under which the system is required to operate, and define the overall performance requirements. The discussion is illustrated by a simple example of an automatic protective system for a nuclear reactor. It is then shown how the initial approach leads to a method of defining the system, establishing performance parameters of interest and determining the general form of reliability models to be used. The overall system model and the availability of reliability data at the system level are next examined. An iterative process is then described whereby the reliability model and data requirements are systematically refined at progressively lower hierarchic levels of the system. At each stage, the approach is illustrated with examples from the protective system previously described. The main advantages of the approach put forward are the systematic process of analysis, the concentration of assessment effort in the critical areas and the maximum use of limited reliability data. (author)

  9. Good Administration as a Fundamental Right

    Directory of Open Access Journals (Sweden)

    Margrét Vala Kristjánsdóttir

    2013-06-01

    The EU Charter of Fundamental Rights lists good administration as a fundamental right. The scope of this right, as defined in Article 41 of the EU Charter, is limited to situations in which persons are dealing with the institutions and bodies of the European Union; this gives it a narrower scope than that of the Charter as a whole. This paper discusses the status of this right as a subjective, fundamental right and a codified principle of EU law. The focus is on the question of the applicability of the right to situations in which persons are dealing with the institutions and bodies of Member States, and questions are raised regarding the implications of Article 41 in this respect. The paper concludes that Article 41 of the Charter in fact limits the applicability of good administration to the institutions and bodies of the EU. This does not, however, preclude the applicability of a general principle of good administration, as established by the European Court of Justice, to Member States, and the formal recognition of this principle in the EU Charter seems to affect legal reasoning and contribute to some extent to the protection of administrative rules in the implementation of EU law.

  10. Wavelength dispersive X-ray fluorescence analysis using fundamental parameter approach of Catha edulis and other related plant samples

    Energy Technology Data Exchange (ETDEWEB)

    Shaltout, Abdallah A., E-mail: shaltout_a@hotmail.com [Spectroscopy Department, Physics Division, National Research Center, El Behooth Str., 12622 Dokki, Cairo (Egypt); Faculty of science, Taif University, 21974 Taif, P.O. Box 888 (Saudi Arabia); Moharram, Mohammed A. [Spectroscopy Department, Physics Division, National Research Center, El Behooth Str., 12622 Dokki, Cairo (Egypt); Mostafa, Nasser Y. [Faculty of science, Taif University, 21974 Taif, P.O. Box 888 (Saudi Arabia); Chemistry Department, Faculty of Science, Suez Canal University, Ismailia (Egypt)

    2012-01-15

    This work is the first attempt to quantify trace elements in the Catha edulis plant (khat) with a fundamental parameter approach. C. edulis is a well-known drug plant in East Africa and the Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation, as well as trifolium leaves as a non-related plant. The elemental analyses of the plants were done by wavelength dispersive X-ray fluorescence (WDXRF) spectroscopy. Standardless quantitative WDXRF analysis was carried out based on the fundamental parameter approach. According to the standardless analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on differential thermal analysis, was successfully used for the organic material determination. The results obtained with this approach were in good agreement with the commonly used methods. Using the developed method, quantitative analysis results for eighteen elements, including Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn, were obtained for each plant. The results for the certified reference material of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method. - Highlights: ► Quantitative analysis of Catha edulis was carried out using standardless WDXRF. ► Differential thermal analysis was used for determination of the loss on ignition. ► The existence of hydroxyapatite in the Catha edulis plant has been confirmed. ► The CRM results confirmed the validity of the developed method.

  11. Wavelength dispersive X-ray fluorescence analysis using fundamental parameter approach of Catha edulis and other related plant samples

    International Nuclear Information System (INIS)

    Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.

    2012-01-01

    This work is the first attempt to quantify trace elements in the Catha edulis plant (khat) with a fundamental parameter approach. C. edulis is a well-known drug plant in East Africa and the Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation, as well as trifolium leaves as a non-related plant. The elemental analyses of the plants were done by wavelength dispersive X-ray fluorescence (WDXRF) spectroscopy. Standardless quantitative WDXRF analysis was carried out based on the fundamental parameter approach. According to the standardless analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on differential thermal analysis, was successfully used for the organic material determination. The results obtained with this approach were in good agreement with the commonly used methods. Using the developed method, quantitative analysis results for eighteen elements, including Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn, were obtained for each plant. The results for the certified reference material of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method. - Highlights: ► Quantitative analysis of Catha edulis was carried out using standardless WDXRF. ► Differential thermal analysis was used for determination of the loss on ignition. ► The existence of hydroxyapatite in the Catha edulis plant has been confirmed. ► The CRM results confirmed the validity of the developed method.

  12. Fundamentals of Monte Carlo

    International Nuclear Information System (INIS)

    Wollaber, Allan Benton

    2016-01-01

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.

  13. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
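Two of the outline's topics condense into a few lines of code: estimating π by the hit-or-miss method, and inverse-transform sampling. This is a minimal sketch of those standard techniques, not the lecture's own material.

```python
import math
import random

def estimate_pi(n_samples, seed=0):
    """Hit-or-miss Monte Carlo: the fraction of uniform points in the unit
    square that land inside the quarter circle converges to pi/4 by the
    Law of Large Numbers; by the CLT the error shrinks like 1/sqrt(N)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n_samples

def sample_exponential(rate, rng):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then
    X = -ln(1 - U) / rate has CDF F(x) = 1 - exp(-rate * x)."""
    return -math.log(1.0 - rng.random()) / rate

pi_hat = estimate_pi(200_000)          # close to 3.14159
rng = random.Random(1)
mean_x = sum(sample_exponential(2.0, rng) for _ in range(100_000)) / 100_000
# mean_x is close to the true mean 1/rate = 0.5
```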

  14. Statistical competencies for medical research learners: What is fundamental?

    Science.gov (United States)

    Enders, Felicity T; Lindsell, Christopher J; Welty, Leah J; Benn, Emma K T; Perkins, Susan M; Mayo, Matthew S; Rahbar, Mohammad H; Kidwell, Kelley M; Thurston, Sally W; Spratt, Heidi; Grambow, Steven C; Larson, Joseph; Carter, Rickey E; Pollock, Brad H; Oster, Robert A

    2017-06-01

    It is increasingly essential for medical researchers to be literate in statistics, but the requisite degree of literacy is not the same for every statistical competency in translational research. Statistical competency can range from 'fundamental' (necessary for all) to 'specialized' (necessary for only some). In this study, we determine the degree to which each competency is fundamental or specialized. We surveyed members of 4 professional organizations, targeting doctorally trained biostatisticians and epidemiologists who taught statistics to medical research learners in the past 5 years. Respondents rated 24 educational competencies on a 5-point Likert scale anchored by 'fundamental' and 'specialized.' There were 112 responses. Nineteen of 24 competencies were fundamental. The competencies considered most fundamental were assessing sources of bias and variation (95%), recognizing one's own limits with regard to statistics (93%), and identifying the strengths and limitations of study designs (93%). The least endorsed items were meta-analysis (34%) and stopping rules (18%). We have identified the statistical competencies needed by all medical researchers. These competencies should be considered when designing statistical curricula for medical researchers and should inform which topics are taught in graduate programs and evidence-based medicine courses where learners need to read and understand the medical research literature.

  15. Fundamental ecology is fundamental.

    Science.gov (United States)

    Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E

    2015-01-01

    The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we develop why we believe it is essential to promote basic ecological research, despite increased impetus for ecologists to conduct and present their research in the light of potential applications. This includes the understanding of our environment, for intellectual, economical, social, and political reasons, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Fundamental limitation of electrocatalytic methane conversion to methanol

    DEFF Research Database (Denmark)

    Arnarson, Logi; Schmidt, Per Simmendefeldt; Pandey, Mohnish

    2018-01-01

    binding energies on the surface. Based on a simple kinetic model we can conclude that in order to obtain sufficient activity oxygen has to bind weakly to the surface but there is an upper limit to retain selectivity. Few potentially interesting candidates are found but this relatively simple description...

  17. The Holy Text and Violence : Levinas and Fundamentalism

    NARCIS (Netherlands)

    Poorthuis, Marcel; Breitlin, Andris; Bremmers, Chris; Cools, Arthur

    2015-01-01

    Levinas' rejection of a historical-critical approach to sacred texts, as well as his depreciation of Spinoza's view of the Bible, might bring him close to fundamentalism. A thorough analysis is necessary to demonstrate the essential differences.

  18. Limitation of Socio-Economic Rights in the 2010 Kenyan Constitution: A Proposal for the Adoption of a Proportionality Approach in the Judicial Adjudication of Socio-Economic Rights Disputes

    Directory of Open Access Journals (Sweden)

    Nicholas Wasonga Orago

    2013-12-01

    On 27 August 2010 Kenya adopted a transformative Constitution with the objective of fighting poverty and inequality and improving the standard of living of all people in Kenya. One of the mechanisms in the 2010 Constitution aimed at achieving this egalitarian transformation is the entrenchment of justiciable socio-economic rights (SERs), an integral part of the Bill of Rights. The entrenched SERs require the State to put in place a legislative, policy and programmatic framework to enhance the realisation of its constitutional obligations to respect, protect and fulfill these rights for all Kenyans. These SER obligations, just like any other fundamental human rights obligations, are, however, not absolute and are subject to legitimate limitation by the State. Two approaches have been used in international and comparative national law jurisprudence to limit SERs: the proportionality approach, using a general limitation clause, which has found application in international and regional jurisprudence; and the reasonableness approach, using internal limitations contained in the standard of progressive realisation, which has found application in the SER jurisprudence of the South African courts. This article proposes that if the entrenched SERs are to achieve their transformative objectives, Kenyan courts must adopt a proportionality approach in the judicial adjudication of SER disputes. This proposal is based on the reasoning that for the entrenched SERs to have a substantive positive impact on the lives of the Kenyan people, any measure by the government aimed at their limitation must be subjected to strict scrutiny by the courts, a form of scrutiny that can be achieved only by using the proportionality standard entrenched in the Article 24 general limitation clause.

  19. Upper limit for Poisson variable incorporating systematic uncertainties by Bayesian approach

    International Nuclear Information System (INIS)

    Zhu, Yongsheng

    2007-01-01

    To calculate the upper limit for a Poisson observable at a given confidence level, with inclusion of systematic uncertainties in the background expectation and signal efficiency, formulations have been established along the lines of the Bayesian approach. A FORTRAN program, BPULE, has been developed to implement the upper limit calculation.
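The BPULE program itself is not reproduced here, but the simplest special case of the Bayesian construction (flat prior on the signal s ≥ 0, exactly known background b, no systematic uncertainties) fits in a few lines: the posterior is p(s|n) ∝ (s+b)^n e^−(s+b), and the upper limit is where its normalized integral reaches the confidence level. The grid parameters below are arbitrary choices for the sketch.

```python
import math

def bayesian_upper_limit(n_obs, b, cl=0.95, s_max=60.0, n_grid=60001):
    """Bayesian upper limit on a Poisson signal with known background b
    and a flat prior on s >= 0 -- the no-systematics special case of the
    approach implemented in BPULE.  Grid integration of the posterior
    p(s | n) proportional to (s + b)**n * exp(-(s + b))."""
    ds = s_max / (n_grid - 1)
    post = [(i * ds + b) ** n_obs * math.exp(-(i * ds + b))
            for i in range(n_grid)]
    total = sum(post)
    acc = 0.0
    for i, p in enumerate(post):
        acc += p
        if acc >= cl * total:       # normalized posterior CDF reaches CL
            return i * ds
    return s_max

# n = 0 observed, no background: recovers the familiar 95% CL limit
ul95 = bayesian_upper_limit(0, 0.0)   # close to -ln(0.05) = 2.996
# A larger expected background tightens the limit on the signal.
```

Folding in systematic uncertainties on b and on the signal efficiency amounts to averaging this posterior over their prior distributions, which is the part BPULE automates.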

  20. Quantum cryptography approaching the classical limit.

    Science.gov (United States)

    Weedbrook, Christian; Pirandola, Stefano; Lloyd, Seth; Ralph, Timothy C

    2010-09-10

    We consider the security of continuous-variable quantum cryptography as we approach the classical limit, i.e., when the unknown preparation noise at the sender's station becomes significantly noisy or thermal (even by as much as 10^4 times greater than the variance of the vacuum mode). We show that, provided the channel transmission losses do not exceed 50%, the security of quantum cryptography is not dependent on the channel transmission, and is therefore incredibly robust against significant amounts of excess preparation noise. We extend these results to consider for the first time quantum cryptography at wavelengths considerably longer than optical and find that regions of security still exist all the way down to the microwave.

  1. A fundamental parameters approach to calibration of the Mars Exploration Rover Alpha Particle X-ray Spectrometer

    Science.gov (United States)

    Campbell, J. L.; Lee, M.; Jones, B. N.; Andrushenko, S. M.; Holmes, N. G.; Maxwell, J. A.; Taylor, S. M.

    2009-04-01

    The detection sensitivities of the Alpha Particle X-ray Spectrometer (APXS) instruments on the Mars Exploration Rovers for a wide range of elements were experimentally determined in 2002 using spectra of geochemical reference materials. A flight spare instrument was similarly calibrated, and the calibration exercise was then continued for this unit with an extended set of geochemical reference materials together with pure elements and simple chemical compounds. The flight spare instrument data are examined in detail here using a newly developed fundamental parameters approach which takes precise account of all the physics inherent in the two X-ray generation techniques involved, namely, X-ray fluorescence and particle-induced X-ray emission. The objectives are to characterize the instrument as fully as possible, to test this new approach, and to determine the accuracy of calibration for major, minor, and trace elements. For some of the lightest elements the resulting calibration exhibits a dependence upon the mineral assemblage of the geological reference material; explanations are suggested for these observations. The results will assist in designing the overall calibration approach for the APXS on the Mars Science Laboratory mission.

  2. Quantum Uncertainty and Fundamental Interactions

    Directory of Open Access Journals (Sweden)

    Tosto S.

    2013-04-01

    The paper proposes a simplified theoretical approach to infer some essential concepts about the fundamental interactions between charged particles and their relative strengths at comparable energies by exploiting quantum uncertainty alone. The worth of the present approach relies on the way of obtaining the results rather than on the results themselves: concepts today acknowledged as fingerprints of the electroweak and strong interactions appear indeed rooted in the same theoretical frame, which also includes the basic principles of special and general relativity along with the gravity force.

  3. Search for fundamental 'God Particle' speeds up

    CERN Multimedia

    Spotts, P N

    2000-01-01

    This month researchers at CERN are driving the accelerator to its limits and beyond to find the missing Higgs boson. Finding it would confirm a 30-yr-old theory about why matter's most fundamental particles have mass (1 page).

  4. Material Limitations on the Detection Limit in Refractometry

    OpenAIRE

    Skafte-Pedersen, Peder; Nunes, Pedro S.; Xiao, Sanshui; Mortensen, Niels Asger

    2009-01-01

    We discuss the detection limit for refractometric sensors relying on high-Q optical cavities and show that the ultimate classical detection limit is given by min{Δn} ≳ η, with n + iη being the complex refractive index of the material under refractometric investigation. Taking finite Q factors and filling fractions into account, the detection limit deteriorates. As an example we discuss the fundamental limits of silicon-based high-Q resonators, such as photonic crystal resonators, for sensing in a...

  5. Mathematical analysis fundamentals

    CERN Document Server

    Bashirov, Agamirza

    2014-01-01

    The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to advanced coursework. The curricula of all mathematics (pure or applied) and physics programs include a compulsory course in mathematical analysis, and this book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for courses such as real analysis, functional analysis, and harmonic analysis. For non-math-major students requiring math beyond calculus, this is a more friendly approach than many math-centric o

  6. Treatment for spasmodic dysphonia: limitations of current approaches

    Science.gov (United States)

    Ludlow, Christy L.

    2009-01-01

    Purpose of review Although botulinum toxin injection is the gold standard for treatment of spasmodic dysphonia, surgical approaches aimed at providing long-term symptom control have been advancing over recent years. Recent findings When surgical approaches provide greater long-term benefits to symptom control, they also increase the initial period of side effects of breathiness and swallowing difficulties. However, recent analyses of quality-of-life questionnaires in patients undergoing regular injections of botulinum toxin demonstrate that a large proportion of patients have limited relief for relatively short periods due to early breathiness and loss of benefit before reinjection. Summary Most medical and surgical approaches to the treatment of spasmodic dysphonia have been aimed at denervation of the laryngeal muscles to block symptom expression in the voice, and have adverse effects as well as treatment benefits. Research is needed to identify the central neuropathophysiology responsible for the laryngeal muscle spasms in order to target treatment towards the central neurological abnormality responsible for producing symptoms. PMID:19337127

  7. Lattice gravity near the continuum limit

    International Nuclear Information System (INIS)

    Feinberg, G.; Friedberg, R.; Lee, T.D.; Ren, H.C.

    1984-01-01

    We prove that lattice gravity always approaches the usual continuum limit when the link length l → 0, provided that certain general boundary conditions are satisfied. This result holds for any lattice, regular or irregular. Furthermore, for a given lattice, the deviation from its continuum limit can be expressed as a power series in l². General formulas for such a perturbative calculation are given, together with a number of illustrative examples, including the graviton propagator. Lattice gravity satisfies all the invariance properties of Einstein's theory of general relativity. In addition, it is symmetric under a new class of transformations that are absent in the usual continuum theory. The possibility that the lattice theory (with nonzero l) may be more fundamental is discussed. (orig.)

  8. Fundamental Structure of Loop Quantum Gravity

    Science.gov (United States)

    Han, Muxin; Ma, Yongge; Huang, Weiming

    Over the past twenty years, loop quantum gravity, a background-independent approach to unifying general relativity and quantum mechanics, has been widely investigated. The aim of loop quantum gravity is to construct a mathematically rigorous, background-independent, non-perturbative quantum theory for a Lorentzian gravitational field on a four-dimensional manifold. In this approach, the principles of quantum mechanics are combined naturally with those of general relativity. Such a combination provides us with a picture of so-called quantum Riemannian geometry, which is discrete on the fundamental scale. Imposing the quantum constraints in analogy with the classical ones, the quantum dynamics of gravity is being studied as one of the most important issues in loop quantum gravity. On the other hand, semi-classical analysis is being carried out to test the classical limit of the quantum theory. In this review, the fundamental structure of loop quantum gravity is presented pedagogically. Our main aim is to help non-experts understand the motivations, basic structures, and general results. It may also be beneficial to practitioners to gain insights from different perspectives on the theory. We will focus on the theoretical framework itself, rather than its applications, and do our best to write it in modern and precise language while keeping the presentation accessible for beginners. After reviewing the classical connection dynamical formalism of general relativity, as a foundation, the construction of the kinematical Ashtekar-Isham-Lewandowski representation is introduced in the context of quantum kinematics. The algebraic structure of quantum kinematics is also discussed. In the context of quantum dynamics, we mainly introduce the construction of a Hamiltonian constraint operator and the master constraint project. At last, some applications and recent advances are outlined. It should be noted that this strategy of quantizing gravity can also be extended to

  9. With Iterative and Bosonized Coupling towards Fundamental Particle Properties

    CERN Document Server

    Binder, B

    2003-01-01

    Previous results have shown that the linear topological potential-to-phase relationship (well known from Josephson junctions) is the key to iterative coupling and non-perturbative bosonization of the 2 two-spinor Dirac equation. In this paper those results are combined to approach the nature of the proton, neutron, and electron via extrapolations from Planck units to the International System of Units (SI). The electron acts as a bosonizing bridge between opposite-parity topological currents. The resulting potentials and masses are based on a fundamental soliton mass limit and two iteratively obtained coupling constants, one of which is the fine structure constant. The simple non-perturbative and relativistic results are within measurement uncertainty and show a very high significance. The deviations for the proton and electron masses are approximately 1 ppb ($10^{-9}$), and for the neutron 4 ppb.

  10. Fundamental energy limits of SET-based Brownian NAND and half-adder circuits. Preliminary findings from a physical-information-theoretic methodology

    Science.gov (United States)

    Ercan, İlke; Suyabatmaz, Enes

    2018-06-01

    The saturation in the efficiency and performance scaling of conventional electronic technologies brings about the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One potential realization of such circuits is via single-electron tunneling (SET) devices, since SET technology enables the simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations of Brownian NAND and half-adder circuits implemented using SET technology. The method employed here establishes a solid foundation for studying the computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insights into how far its efficiency can be improved in principle. To provide a basis for comparison, we also analyze a NAND gate and half-adder circuit implemented in complementary metal-oxide-semiconductor (CMOS) technology to show how the fundamental bound of the Brownian circuit compares against a conventional paradigm.
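    The paper's SET-specific bounds are not reproduced in the abstract; as a generic illustration of the physical-information-theoretic style of argument, the sketch below computes the Landauer-type lower bound on dissipation for a logically irreversible NAND gate with uniform inputs. The function names and the room-temperature choice are illustrative assumptions, not taken from the paper.

```python
import math

def landauer_bound_joules(bits_erased, temperature_kelvin=300.0):
    """Minimum dissipation for erasing `bits_erased` bits (Landauer's principle)."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return bits_erased * k_B * temperature_kelvin * math.log(2)

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# NAND maps 2 uniform input bits (entropy 2 bits) to 1 output bit that is
# 0 with probability 1/4 and 1 with probability 3/4.
h_in = shannon_entropy([0.25] * 4)        # 2 bits
h_out = shannon_entropy([0.25, 0.75])     # ~0.811 bits
bits_lost = h_in - h_out                  # logical irreversibility of NAND

e_min = landauer_bound_joules(bits_lost)  # lower bound at room temperature
print(f"{bits_lost:.3f} bits lost, >= {e_min:.3e} J per operation")
```

    The same accounting extends to the half-adder by comparing input and output entropies of its truth table.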

  11. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 2: Fundamentals and application to counting measurements with the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) on samples are counted after treating them (e.g. aliquotation, solution, enrichment, separation). It considers, besides the random character of radioactive decay and of pulse counting, all other influences arising from sample treatment (e.g. weighing, enrichment, calibration or the instability of the test setup). ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG 2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non-radioactive waste, and ISO 11929-1, ISO 11929-3 and ISO 11929-4, and is, consequently, complementary to these documents.
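    ISO 11929 specifies the full procedure, including the sample-treatment uncertainties this part adds; the sketch below shows only the simplest special case, a decision threshold for a gross/background counting measurement under pure Poisson statistics. Function names and the example numbers are hypothetical.

```python
import math

def decision_threshold(background_rate, t_gross, t_background, k=1.645):
    """Decision threshold (counts/s) for a net count rate, Poisson statistics only.

    Simplified from the ISO 11929 scheme: under the null hypothesis (no activity)
    the variance of the net-rate estimate is r0/t_g + r0/t_0.
    k = 1.645 corresponds to a false-positive probability alpha of 5 %.
    """
    var_null = background_rate / t_gross + background_rate / t_background
    return k * math.sqrt(var_null)

# Example: 0.5 cps background, 3600 s counting time for both sample and background.
thr = decision_threshold(0.5, 3600.0, 3600.0)
print(f"decision threshold: {thr:.4f} cps")
```

    Sample-treatment influences (weighing, calibration, instability) would enter as extra terms in the null-hypothesis variance.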

  12. Latent Fundamentals Arbitrage with a Mixed Effects Factor Model

    Directory of Open Access Journals (Sweden)

    Andrei Salem Gonçalves

    2012-09-01

    Full Text Available We propose a single-factor mixed effects panel data model to create an arbitrage portfolio that identifies differences in firm-level latent fundamentals. Furthermore, we show that even though the characteristics that affect returns are unknown variables, it is possible to identify the strength of the combination of these latent fundamentals for each stock by following a simple approach using historical data. As a result, a trading strategy that bought the stocks with the best fundamentals (the strong fundamentals portfolio) and sold the stocks with the worst ones (the weak fundamentals portfolio) realized significant risk-adjusted returns in the U.S. market for the period between July 1986 and June 2008. To ensure robustness, we performed subperiod and seasonal analyses and adjusted for trading costs, and we found further empirical evidence that a simple investment rule, identifying these latent fundamentals from the structure of past returns, can lead to profit.
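    As a hedged illustration of the idea, not the authors' estimation procedure, the sketch below simulates a toy single-factor panel, estimates each stock's latent "fundamental" as the intercept of a factor regression, and forms a long-short portfolio from the extreme deciles. All parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy panel: monthly returns for N stocks over T months, generated as
# r[i, t] = alpha[i] + beta[i] * f[t] + noise, where alpha[i] plays the role
# of the firm-level latent fundamental. (The paper uses a formal
# single-factor mixed-effects panel model instead.)
N, T = 200, 120
alpha = rng.normal(0.0, 0.002, N)          # latent fundamental strength
beta = rng.normal(1.0, 0.3, N)
f = rng.normal(0.005, 0.04, T)             # common factor (e.g. market)
r = alpha[:, None] + beta[:, None] * f[None, :] + rng.normal(0, 0.05, (N, T))

# Estimate each stock's alpha by OLS against the factor, using history only.
X = np.column_stack([np.ones(T), f])
coef, *_ = np.linalg.lstsq(X, r.T, rcond=None)   # coef has shape (2, N)
alpha_hat = coef[0]

# Long the top decile ("strong fundamentals"), short the bottom decile.
order = np.argsort(alpha_hat)
short_leg, long_leg = order[:N // 10], order[-N // 10:]
spread = r[long_leg].mean() - r[short_leg].mean()
print(f"average long-short return: {spread:.4f} per month")
```

    An honest backtest would estimate the ranking on a trailing window and evaluate returns out of sample; here the in-sample spread merely shows the mechanics.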

  13. Muonium-Physics of a most Fundamental Atom

    NARCIS (Netherlands)

    Jungmann, KP

    The hydrogen-like muonium atom (M=mu(+)e(-)) offers possibilities to measure fundamental constants most precisely and to search sensitively for new physics. All experiments on muonium at the presently most intense muon sources are statistics limited. New and intense muon sources are indispensable.

  14. Heat exchanger versus regenerator: A fundamental comparison

    NARCIS (Netherlands)

    Will, M.E.; Waele, de A.T.A.M.

    2005-01-01

    Irreversible processes in regenerators and heat exchangers limit the performance of cryocoolers. In this paper we compare the performance of cryocoolers, operating with regenerators and heat exchangers from a fundamental point of view. The losses in the two systems are calculated from the entropy

  15. Information-theoretical approach to control of quantum-mechanical systems

    International Nuclear Information System (INIS)

    Kawabata, Shiro

    2003-01-01

    Fundamental limits on the controllability of quantum mechanical systems are discussed in the light of quantum information theory. It is shown that the amount of entropy reduction that can be extracted from a quantum system by a feedback controller is upper bounded by the sum of the entropy decrease achievable in open-loop control and the mutual information between the quantum system and the controller. This upper bound sets a fundamental limit on the performance of any quantum controller whose design is based on the possibility of attaining low-entropy states. An application of this approach to quantum error correction is also discussed.

  16. Knowledge claim evaluation : a fundamental issue for knowledge management

    NARCIS (Netherlands)

    Peters, K.; Maruster, L.; Jorna, R.J.J.M.

    2010-01-01

    Purpose - This paper aims to present a classification of approaches toward knowledge claim evaluation (KCE), which is the process of evaluating and testing knowledge claims in organizations, and to position KCE as a fundamental research issue for KM. Design/methodology/approach - The paper draws

  17. an aid to mastering fundamental calculus concepts

    African Journals Online (AJOL)

    Erna Kinsey

    Department of Educational Psychology, University of Pretoria, Pretoria, 0002 South Africa ... according to a well thought-out didactical approach is necessary in order to incorporate technology ... developing new hypotheses instead of testing hypotheses. ... mastering fundamental concepts of two-dimensional functions.

  18. Scientific and technological fundamentals

    International Nuclear Information System (INIS)

    Roethemeyer, H.

    1991-01-01

    Specific ultimate repositories in a given geological formation have to be assessed on the basis of a safety analysis, taking into account the site specifics of the repository system 'Overall geological situation - ultimate disposal facility - waste forms'. The fundamental possibilities and limits of waste disposal are outlined. Orientation values up to about 10^6 years are derived for the isolation potential of ultimate disposal mines, and about 10^4 years for the calculation of effects of emplaced radioactive wastes also on man. (DG)

  19. Fundamentals of Cavitation

    CERN Document Server

    Franc, Jean-Pierre

    2005-01-01

    The present book is aimed at providing a comprehensive presentation of cavitation phenomena in liquid flows. It is further backed up by the experience, both experimental and theoretical, of the authors, whose expertise has been internationally recognized. A special effort is made to place the various methods of investigation in strong relation with the fundamental physics of cavitation, enabling the reader to treat specific problems independently. Furthermore, it is hoped that a better knowledge of the cavitation phenomenon will allow engineers to design systems that use it positively. Examples in the literature show the feasibility of this approach.

  20. Fundamental limits on beam stability at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Decker, G. A.

    1998-01-01

    Orbit correction is now routinely performed at the few-micron level in the Advanced Photon Source (APS) storage ring. Three diagnostics are presently in use to measure and control both AC and DC orbit motions: broad-band turn-by-turn rf beam position monitors (BPMs), narrow-band switched heterodyne receivers, and photoemission-style x-ray beam position monitors. Each type of diagnostic has its own set of systematic error effects that place limits on the ultimate pointing stability of x-ray beams supplied to users at the APS. Limiting sources of beam motion at present are magnet power supply noise, girder vibration, and thermal timescale vacuum chamber and girder motion. This paper will investigate the present limitations on orbit correction, and will delve into the upgrades necessary to achieve true sub-micron beam stability

  1. Fundamental investigations of catalyst nanoparticles

    DEFF Research Database (Denmark)

    Elkjær, Christian Fink

    and economic development in the 20th century. There is however a downside to this development and we are seeing significant pollution and pressure on resources. Catalysis therefore has an increasingly important role in limiting pollution and optimizing the use of resources. This development will depend on our ... fundamental understanding of catalytic processes and our ability to make use of that understanding. This thesis presents fundamental studies of catalyst nanoparticles with particular focus on dynamic processes. Such studies often require atomic-scale characterization, because the catalytic conversion takes ... important that we only study intrinsic structures and phenomena and not those that may be induced by the high energy electrons used to image the specimen. This requires careful consideration of the influence of the electron beam in order to understand, control and minimize that influence. I present four ...

  2. Fundamental measure theory for hard-sphere mixtures: a review

    International Nuclear Information System (INIS)

    Roth, Roland

    2010-01-01

    Hard-sphere systems are one of the fundamental model systems of statistical physics and represent an important reference system for molecular or colloidal systems with soft repulsive or attractive interactions in addition to hard-core repulsion at short distances. Density functional theory for classical systems, as one of the core theoretical approaches of the statistical physics of fluids and solids, has to be able to treat such an important system successfully and accurately. Fundamental measure theory is to date the most successful and most accurate density functional theory for hard-sphere mixtures. Since its introduction, fundamental measure theory has been applied to many problems, tested against computer simulations, and further developed in many respects. The literature on fundamental measure theory is already large and is growing fast. This review aims to provide a starting point for readers new to fundamental measure theory and an overview of important developments. (topical review)

  3. With Iterative and Bosonized Coupling towards Fundamental Particle Properties

    CERN Document Server

    Binder, B

    2002-01-01

    Previous results have shown that the linear topological potential-to-phase relationship (well known from Josephson junctions) is the key to iterative coupling and non-perturbative bosonization of the 2 two-spinor Dirac equation. In this paper those results are combined to approach the nature of the proton, neutron, and electron via extrapolations from the Planck scale to the International System of Units (SI). The electron acts as a bosonizing bridge between opposite-parity topological currents. The resulting potentials and masses are based on a fundamental soliton mass limit and two iteratively obtained coupling constants, one of which is the fine structure constant. The simple non-perturbative and relativistic results are within measurement uncertainty and show a very high significance. The deviations for the proton and electron masses are approximately 1 ppb (10^-9), and for the neutron 4 ppb.

  4. Fundamental Rights and the EU Internal Market: Just how Fundamental are the EU Treaty Freedoms?
    A Normative Enquiry Based on John Rawls’ Political Philosophy

    Directory of Open Access Journals (Sweden)

    Nik J. de Boer

    2013-01-01

    Full Text Available This article assesses whether the EU Treaty freedoms - the free movement of goods, persons, services and capital - should be considered fundamental rights which are hierarchically equal to other fundamental rights. It uses the political philosophy of John Rawls to assess why we should attach priority to certain rights and which rights should therefore be considered fundamental rights. On this basis it is argued that we should recognise two main types of fundamental rights, namely the basic rights and liberties associated with Rawls' first principle of justice and the rights associated with the principle of fair equality of opportunity. This is followed by an analysis of the interpretation that the European Court of Justice (CJEU) gives to the Treaty freedoms. On the basis of the normative framework, it is argued that the Treaty freedoms can be seen as fundamental rights insofar as they embody the value of equality of opportunity. Nonetheless, the CJEU increasingly seems to rely on a broader market-access approach rather than an equal-treatment approach in interpreting the Treaty freedoms. It is argued that where equal treatment is not at stake, the Treaty freedoms should not be seen as fundamental rights. Therefore, in cases where there is a conflict between a fundamental right and a Treaty freedom, the CJEU should carefully distinguish between these two different interpretations of the Treaty freedoms. In cases where merely market access is at stake, the CJEU should regard the protection of fundamental rights as more important, and be very careful in allowing a restriction of fundamental rights in order to protect the exercise of a Treaty freedom. On the other hand, in cases where the Treaty freedoms can be seen as protecting equality of opportunity and where they conflict with other fundamental rights, the Court is justified in construing the conflict as a right-right conflict in which a fair balance has to be sought.

  5. Some aspects of fundamental symmetries and interactions

    NARCIS (Netherlands)

    Jungmann, KP; Grzonka, D; Czyzykiewicz, R; Oelert, W; Rozek, T; Winter, P

    2005-01-01

    The known fundamental symmetries and interactions are well described by the Standard Model. Features of this powerful theory, which are described but not deeper explained, are addressed in a variety of speculative models. Experimental tests of the predictions in such approaches can be either through

  6. African Union approaches to peacebuilding: Efforts at shifting the ...

    African Journals Online (AJOL)

    Without these conditions, the approach leads to extended peace enforcement rather than peacebuilding. Yet, whatever the conditions that prevail, peacebuilding in Africa has experienced limited success due to the failure to fundamentally transform the inherited post-colonial state, society and politics. The neo-colonial ...

  7. FUNDAMENTALS OF BIOMECHANICS

    Directory of Open Access Journals (Sweden)

    Duane Knudson

    2007-09-01

    Full Text Available DESCRIPTION This book provides a broad and in-depth theoretical and practical description of the fundamental concepts in understanding biomechanics in the qualitative analysis of human movement. PURPOSE The aim is to bring together up-to-date biomechanical knowledge with expert application knowledge. Extensive referencing for students is also provided. FEATURES This textbook is divided into 12 chapters within four parts, including a lab activities section at the end. The division is as follows: Part 1 Introduction: 1. Introduction to biomechanics of human movement; 2. Fundamentals of biomechanics and qualitative analysis; Part 2 Biological/Structural Bases: 3. Anatomical description and its limitations; 4. Mechanics of the musculoskeletal system; Part 3 Mechanical Bases: 5. Linear and angular kinematics; 6. Linear kinetics; 7. Angular kinetics; 8. Fluid mechanics; Part 4 Application of Biomechanics in Qualitative Analysis: 9. Applying biomechanics in physical education; 10. Applying biomechanics in coaching; 11. Applying biomechanics in strength and conditioning; 12. Applying biomechanics in sports medicine and rehabilitation. AUDIENCE This is important reading for both students and educators in medicine, sport and exercise-related fields. For the researcher and lecturer it would be a helpful guide to plan and prepare more detailed experimental designs or lecture and/or laboratory classes in exercise and sport biomechanics. ASSESSMENT The text provides a constructive fundamental resource for biomechanics, exercise and sport-related students, teachers and researchers, as well as anyone interested in understanding motion. It is also very useful, being clearly written and presenting several examples of the application of biomechanics to help teach and apply biomechanical variables and concepts, including sport-related ones.

  8. Balancing Fundamental Rights with Economic Freedoms According to the European Court of Justice

    Directory of Open Access Journals (Sweden)

    Sybe A. de Vries

    2013-01-01

    Full Text Available The development of fundamental rights within the EU legal order came to a climax with the entry into force of the Treaty of Lisbon in December 2009. Article 6 of the EU Treaty now recognizes the binding force of the EU Charter of Fundamental Rights, embraces the intention to accede to the European Convention on the Protection of Human Rights and Fundamental Freedoms, and codifies the European Court of Justice's (ECJ) case law that fundamental rights shall constitute general principles of Union law. The question is how these changes made by the Lisbon Treaty, which mark a new stage in the shaping of the EU's commitment to the protection of fundamental rights, inform the relationship between fundamental rights and the classic Treaty economic freedoms, which have been vital in building Europe's 'economic constitution'. This contribution addresses the conflict that may arise between the Treaty economic freedoms and fundamental rights and assesses how the ECJ should balance these conflicting interests, considering the changed EU legal framework. The approach of the European Court of Human Rights (ECtHR), which has to decide in cases where fundamental rights conflict with each other, is also briefly touched upon and compared with the Court of Justice's approach.

  9. Stochastic approach to the derivation of emission limits for wastewater treatment plants.

    Science.gov (United States)

    Stransky, D; Kabelkova, I; Bares, V

    2009-01-01

    A stochastic approach to the derivation of WWTP emission limits meeting probabilistically defined environmental quality standards (EQS) is presented. The stochastic model is based on the mixing equation, with input data defined by probability density distributions, and is solved by Monte Carlo simulation. The approach was tested on a study catchment for total phosphorus (P(tot)). The model assumes independence of the input variables, which was verified for the dry-weather situation. Discharges and P(tot) concentrations both in the study creek and in the WWTP effluent follow a log-normal probability distribution. Variation coefficients of P(tot) concentrations differ considerably along the stream (c(v)=0.415-0.884). The selected value of the variation coefficient (c(v)=0.420) affects the derived mean value (C(mean)=0.13 mg/l) of the P(tot) EQS (C(90)=0.2 mg/l). Even after a supposed improvement of water quality upstream of the WWTP to the level of the P(tot) EQS, the WWTP emission limits calculated would be lower than the values achievable by the best available technology (BAT). Thus, minimum dilution ratios for the meaningful application of the combined approach to the derivation of P(tot) emission limits for Czech streams are discussed.
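    A minimal sketch of such a Monte Carlo mixing-equation calculation, with invented discharge and concentration statistics (the paper derives its inputs from catchment data; all numbers below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)

def lognormal_from_mean_cv(mean, cv, size):
    """Sample a log-normal given its arithmetic mean and coefficient of variation."""
    sigma2 = np.log(1.0 + cv**2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mu, np.sqrt(sigma2), size)

n = 100_000
# Hypothetical inputs (units: m3/s and mg/l).
q_river = lognormal_from_mean_cv(0.50, 0.6, n)    # upstream discharge
c_river = lognormal_from_mean_cv(0.10, 0.42, n)   # upstream P_tot concentration
q_wwtp = lognormal_from_mean_cv(0.05, 0.3, n)     # WWTP effluent discharge

def exceedance_prob(c_effluent, eqs_c90=0.2):
    """P(downstream P_tot > EQS) for a constant effluent concentration."""
    c_mix = (q_river * c_river + q_wwtp * c_effluent) / (q_river + q_wwtp)
    return np.mean(c_mix > eqs_c90)

# Find the largest effluent limit keeping the exceedance probability <= 10 %,
# i.e. meeting the C90-style standard probabilistically.
candidates = np.linspace(0.1, 5.0, 50)
feasible = [c for c in candidates if exceedance_prob(c) <= 0.10]
if feasible:
    print(f"max effluent limit meeting the C90 standard: {max(feasible):.2f} mg/l")
```

    Correlated inputs (e.g. discharge and concentration) would require sampling from a joint distribution instead; the paper's dry-weather independence test justifies the simpler product form.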

  10. New constraints on time-dependent variations of fundamental constants using Planck data

    Science.gov (United States)

    Hart, Luke; Chluba, Jens

    2018-02-01

    Observations of the cosmic microwave background (CMB) today allow us to answer detailed questions about the properties of our Universe, targeting both standard and non-standard physics. In this paper, we study the effects of varying fundamental constants (i.e. the fine-structure constant, αEM, and electron rest mass, me) around last scattering using the recombination codes COSMOREC and RECFAST++. We approach the problem in a pedagogical manner, illustrating the importance of various effects on the free electron fraction, Thomson visibility function and CMB power spectra, highlighting various degeneracies. We demonstrate that the simpler RECFAST++ treatment (based on a three-level atom approach) can be used to accurately represent the full computation of COSMOREC. We also include explicit time-dependent variations using a phenomenological power-law description. We reproduce previous Planck 2013 results in our analysis. Assuming constant variations relative to the standard values, we find the improved constraints αEM/αEM, 0 = 0.9993 ± 0.0025 (CMB only) and me/me, 0 = 1.0039 ± 0.0074 (including BAO) using Planck 2015 data. For a redshift-dependent variation, αEM(z) = αEM(z0) [(1 + z)/1100]p with αEM(z0) ≡ αEM, 0 at z0 = 1100, we obtain p = 0.0008 ± 0.0025. Allowing simultaneous variations of αEM(z0) and p yields αEM(z0)/αEM, 0 = 0.9998 ± 0.0036 and p = 0.0006 ± 0.0036. We also discuss combined limits on αEM and me. Our analysis shows that existing data are not only sensitive to the value of the fundamental constants around recombination but also its first time derivative. This suggests that a wider class of varying fundamental constant models can be probed using the CMB.
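    The redshift-dependent parameterisation quoted above is straightforward to evaluate; the snippet below shows, under the paper's power-law form, how the quoted 1-sigma bound on p translates into a fractional drift of αEM between recombination and today.

```python
def alpha_ratio(z, p):
    """alpha_EM(z)/alpha_EM(z0) with pivot z0 = 1100, per the power-law model
    alpha_EM(z) = alpha_EM(z0) * [(1 + z)/1100]^p quoted in the abstract."""
    return ((1.0 + z) / 1100.0) ** p

# With the 1-sigma bound p = 0.0025, the fractional variation between
# recombination (z ~ 1100) and today (z = 0) is (1/1100)^p - 1.
drift = alpha_ratio(0.0, 0.0025) - 1.0
print(f"fractional change in alpha from z=1100 to z=0: {drift:.5f}")
```

    So even the allowed p corresponds to at most a ~1.7 % drift over the full redshift range, which is why the CMB constrains the first time derivative as well as the value.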

  11. The Limits of Existential Autonomy and the Fundamental Law Duties of Preserving Inconscious People Lives

    Directory of Open Access Journals (Sweden)

    Ana Stela Vieira Mendes Câmara

    2016-12-01

    Full Text Available In the face of factual, conceptual and scientific uncertainties surrounding the finitude of life, and assuming the search for the ideal of a dignified, natural and proper death without prepayments or undue extensions, this research has the scope to investigate the reasonableness of the parameters that establish limitations on existential autonomy, due to the preservation of life of unconscious people. Identifies, based on heteronomous component of human dignity, the existence of a bundle of basic legal duties of protection of these individuals whose ownership rests with the family and the state. The methodology is qualitative, interdisciplinary bibliographic and documentary, in which it is used hypothetical-deductive approach.

  12. Quantum limit on time measurement in a gravitational field

    International Nuclear Information System (INIS)

    Sinha, Supurna; Samuel, Joseph

    2015-01-01

    Good clocks are of importance both to fundamental physics and for applications in astronomy, metrology and global positioning systems. In a recent technological breakthrough, researchers at NIST have been able to achieve a stability of one part in 10^18 using an ytterbium clock. This naturally raises the question of whether there are fundamental limits to timekeeping. In this article we point out that gravity and quantum mechanics set a fundamental limit on the fractional frequency uncertainty of clocks. This limit comes from a combination of the uncertainty relation, the gravitational redshift and the relativistic time dilation effect. For example, a single-ion aluminium clock in a terrestrial gravitational field cannot achieve a fractional frequency uncertainty better than one part in 10^22. This fundamental limit explores the interaction between gravity and quantum mechanics on a laboratory scale. (paper)
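    The abstract's combined quantum-gravitational bound is not derived here; the snippet below illustrates only the gravitational-redshift ingredient, which already shows why vertical position control matters for clocks at the quoted one-part-in-10^18 stability.

```python
# Gravitational redshift alone: two clocks separated by height dh in Earth's
# field differ in rate by g*dh/c^2, so centimetre-level height uncertainty
# is already visible at the 1e-18 stability quoted for the NIST clock.
g = 9.81          # surface gravitational acceleration, m/s^2
c = 2.99792458e8  # speed of light, m/s

def redshift_per_height(dh_metres):
    """Fractional frequency shift between clocks separated vertically by dh."""
    return g * dh_metres / c**2

shift_1cm = redshift_per_height(0.01)
print(f"fractional shift over 1 cm: {shift_1cm:.2e}")
```

    The paper's stronger 10^-22 limit comes from combining this effect with the time-energy uncertainty relation and time dilation, which is not reproduced in this sketch.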

  13. Problem of data quality and the limitations of the infrastructure approach

    Science.gov (United States)

    Behlen, Fred M.; Sayre, Richard E.; Rackus, Edward; Ye, Dingzhong

    1998-07-01

    The 'Infrastructure Approach' is a PACS implementation methodology wherein the archive, network and information systems interfaces are acquired first, and workstations are installed later. The approach allows building a history of archived image data, so that most prior examinations are available in digital form when workstations are deployed. A limitation of the Infrastructure Approach is that the deferred use of digital image data defeats many data quality management functions that are provided automatically by human mechanisms when data is immediately used for the completion of clinical tasks. If the digital data is used solely for archiving while reports are interpreted from film, the radiologist serves only as a check against lost films, and another person must be designated as responsible for the quality of the digital data. Data from the Radiology Information System and the PACS were analyzed to assess the nature and frequency of system and data quality errors. The error level was found to be acceptable if supported by auditing and error resolution procedures requiring additional staff time, and in any case was better than the loss rate of a hardcopy film archive. It is concluded that the problem of data quality compromises, but does not negate, the value of the Infrastructure Approach. The Infrastructure Approach is best employed only to a limited extent; any phased PACS implementation should have a substantial complement of workstations dedicated to softcopy interpretation for at least some applications, with full deployment following not long thereafter.

  14. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems ... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological ... limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its ...

  15. Material limitations on the detection limit in refractometry.

    Science.gov (United States)

    Skafte-Pedersen, Peder; Nunes, Pedro S; Xiao, Sanshui; Mortensen, Niels Asger

    2009-01-01

    We discuss the detection limit for refractometric sensors relying on high-Q optical cavities and show that the ultimate classical detection limit is given by min {Δn} ≳ η, with n + iη being the complex refractive index of the material under refractometric investigation. Taking finite Q factors and filling fractions into account, the detection limit deteriorates. As an example we discuss the fundamental limits of silicon-based high-Q resonators, such as photonic crystal resonators, for sensing in a bio-liquid environment, such as a water buffer. In the transparency window (λ ≳ 1100 nm) of silicon the detection limit becomes almost independent of the filling fraction, while in the visible, the detection limit depends strongly on the filling fraction because the silicon absorbs strongly.
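    A small worked example of the min {Δn} ≳ η bound: the imaginary part η of the refractive index follows from an intensity absorption coefficient α via η = αλ/(4π). The absorption value used below is hypothetical, purely to show the order-of-magnitude conversion.

```python
import math

def extinction_coefficient(alpha_per_m, wavelength_m):
    """Imaginary part eta of the refractive index from the intensity
    absorption coefficient alpha: eta = alpha * lambda / (4 * pi)."""
    return alpha_per_m * wavelength_m / (4.0 * math.pi)

# Hypothetical absorption of 10 1/m at lambda = 1.55 um (illustrative only);
# by the bound above, min{dn} is then of order eta.
eta = extinction_coefficient(10.0, 1.55e-6)
print(f"eta ~ {eta:.1e}  ->  classical detection limit min dn >~ {eta:.1e}")
```

    The same conversion, applied to the analyte's real absorption spectrum, is what makes the bound wavelength-dependent in the paper's silicon/water example.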

  16. Relativities of fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    2017-08-01

    S-dualities have been held to have radical implications for our metaphysics of fundamentality. In particular, it has been claimed that they make the fundamentality status of a physical object theory-relative in an important new way. But what physicists have had to say on the issue has not been clear or consistent, and in particular seems to be ambiguous between whether S-dualities demand an anti-realist interpretation of fundamentality talk or merely a revised realism. This paper is an attempt to bring some clarity to the matter. After showing that even antecedently familiar fundamentality claims are true only relative to a raft of metaphysical, physical, and mathematical assumptions, I argue that the relativity of fundamentality inherent in S-duality nevertheless represents something new, and that part of the reason for this is that it has both realist and anti-realist implications for fundamentality talk. I close by discussing the broader significance that S-dualities have for structuralist metaphysics and for fundamentality metaphysics more generally.

  17. DOE fundamentals handbook: Chemistry

    International Nuclear Information System (INIS)

    1993-01-01

    The Chemistry Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of chemistry. The handbook includes information on the atomic structure of matter; chemical bonding; chemical equations; chemical interactions involved with corrosion processes; water chemistry control, including the principles of water treatment; the hazards of chemicals and gases, and basic gaseous diffusion processes. This information will provide personnel with a foundation for understanding the chemical properties of materials and the way these properties can impose limitations on the operation of equipment and systems

  18. From incremental to fundamental substitution in chemical alternatives assessment

    DEFF Research Database (Denmark)

    Fantke, Peter; Weber, Roland; Scheringer, Martin

    2015-01-01

…to similarity in chemical structures and, hence, similar hazard profiles between phase-out and substitute chemicals, leading to an incremental rather than a fundamental substitution. A hampered phase-out process, the lack of implementation of Green Chemistry principles in chemicals design, and a lack of Sustainable… an integrated approach by all stakeholders involved toward more fundamental and function-based substitution with greener and more sustainable alternatives. Our recommendations finally constitute a starting point for identifying further research needs and for improving current alternatives assessment practice.

  19. A verdade como um problema fundamental em Kant Kant on truth as a fundamental problem

    Directory of Open Access Journals (Sweden)

    Adriano Perin

    2010-01-01

Full Text Available The main point of disagreement about Kant's approach to the problem of truth is whether it can be understood, in the terms of contemporary philosophy, as a coherence or a correspondence theory. Favoring a systematic consideration of Kant's argumentation in light of the existing literature on the problem, this paper defends the latter alternative. It sustains the thesis that the definition of truth as "the agreement of cognition with its object" is cogent throughout Kant's thought and that, in this sense, truth ends up being treated not from the standpoint of an established theory, but as a problem whose solution cannot be given within the limits of critical-transcendental philosophy. First, the literature that situates Kant either as a coherentist or as a correspondentist is examined, and the latter alternative is organized into four groups: the ontological reading, the isomorphic reading, the "consequentialist" reading, and the regulative reading. Second, with regard to the pre-critical period, it is argued that the coherentist alternative fails already at that stage and that, in the 1750s, Kant discards a supposed isomorphic correspondence theory. Finally, the critical argumentation is considered, and it is argued that it conceives truth as a fundamental problem that cannot be handled by a correspondence theory conceived in a "consequentialist" or regulative manner.

  20. Fundamental understanding and practical challenges of anionic redox activity in Li-ion batteries

    Science.gov (United States)

    Assat, Gaurav; Tarascon, Jean-Marie

    2018-05-01

    Our increasing dependence on lithium-ion batteries for energy storage calls for continual improvements in the performance of their positive electrodes, which have so far relied solely on cationic redox of transition-metal ions for driving the electrochemical reactions. Great hopes have recently been placed on the emergence of anionic redox—a transformational approach for designing positive electrodes as it leads to a near-doubling of capacity. But questions have been raised about the fundamental origins of anionic redox and whether its full potential can be realized in applications. In this Review, we discuss the underlying science that triggers a reversible and stable anionic redox activity. Furthermore, we highlight its practical limitations and outline possible approaches for improving such materials and designing new ones. We also summarize their chances for market implementation in the face of the competing nickel-based layered cathodes that are prevalent today.

  1. A Weakest-Link Approach for Fatigue Limit of 30CrNiMo8 Steels (Preprint)

    Science.gov (United States)

    2011-03-01

Report AFRL-RX-WP-TP-2011-4206, "A Weakest-Link Approach for Fatigue Limit of 30CrNiMo8 Steels (Preprint)," S. Ekwaro-Osire and H.V. Kulkarni, Texas… (in-house contract). The report cites "Application of a Weakest-Link Concept to the Fatigue Limit of the Bearing Steel SAE 52100 in a Bainitic Condition," Fatigue and Fracture of…

  2. Spreadsheet design and validation for characteristic limits determination in gross alpha and beta measurement

    International Nuclear Information System (INIS)

    Prado, Rodrigo G.P. do; Dalmazio, Ilza

    2013-01-01

The identification and detection of ionizing radiation are essential requisites of radiation protection. Gross alpha and beta measurements are widely applied as a screening method in radiological characterization, environmental monitoring and industrial applications. As in any other analytical technique, test performance depends on the quality of the instrumental measurements and the reliability of the calculations. Characteristic limits refer to three specific statistics, namely the decision threshold, the detection limit and the limits of the confidence interval, which are fundamental to ensuring the quality of determinations. This work describes a way to calculate characteristic limits for measurements of gross alpha and beta activity using spreadsheets. The approaches used to determine the decision threshold, the detection limit and the limits of the confidence interval, as well as the mathematical expressions for the measurands and their uncertainties, followed standard guidelines. A succinct overview of this approach and examples are presented, and the spreadsheets were validated using specific software. Furthermore, these spreadsheets could be used as a tool for instructing beginner users of methods for ionizing radiation measurement. (author)
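The characteristic limits the abstract refers to can be reproduced outside a spreadsheet. Below is a minimal Python sketch of the ISO 11929-style propagation for a gross count against a background; the counting times, background counts, and the k = 1.645 quantile (one-sided 5% error probability) are illustrative assumptions, not values from the paper.

```python
import math

def characteristic_limits(n_b, t_b, t_g, k=1.645):
    """Decision threshold and detection limit (as count rates, 1/s) for a
    gross measurement against a background, ISO 11929-style.
    n_b: background counts observed over t_b seconds;
    t_g: counting time of the gross measurement; k: Gaussian quantile."""
    r_b = n_b / t_b
    # Standard uncertainty of the net rate when the true net rate is zero
    u0 = math.sqrt(r_b / t_g + r_b / t_b)
    decision_threshold = k * u0
    # Detection limit solves y# = y* + k*u(y#); simple fixed-point iteration
    y = 2.0 * decision_threshold
    for _ in range(50):
        y = decision_threshold + k * math.sqrt(y / t_g + r_b / t_g + r_b / t_b)
    return decision_threshold, y

y_star, y_hash = characteristic_limits(n_b=600, t_b=3600, t_g=3600)
print(f"decision threshold = {y_star:.5f} 1/s, detection limit = {y_hash:.5f} 1/s")
```

A measured net rate below the decision threshold is treated as "not detected"; the detection limit is the smallest true rate that the procedure will reliably detect, and it is always larger than the decision threshold.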

  3. Material Limitations on the Detection Limit in Refractometry

    Directory of Open Access Journals (Sweden)

    Niels Asger Mortensen

    2009-10-01

Full Text Available We discuss the detection limit for refractometric sensors relying on high-Q optical cavities and show that the ultimate classical detection limit is given by min {Δn} ≳ η, with n + iη being the complex refractive index of the material under refractometric investigation. When finite Q factors and filling fractions are taken into account, the detection limit deteriorates. As an example we discuss the fundamental limits of silicon-based high-Q resonators, such as photonic crystal resonators, for sensing in a bio-liquid environment, such as a water buffer. In the transparency window (λ ≳ 1100 nm) of silicon the detection limit becomes almost independent of the filling fraction, while in the visible it depends strongly on the filling fraction, because silicon absorbs strongly there.
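The ultimate limit min {Δn} ≳ η can be put into numbers with the standard relation between the extinction coefficient and the intensity absorption coefficient, η = αλ/(4π). The sketch below evaluates it for two order-of-magnitude absorption values for silicon (strongly absorbing in the visible, nearly transparent beyond 1100 nm); these α values are assumptions for illustration, not data from the paper.

```python
import math

def extinction_coefficient(alpha_per_cm, wavelength_nm):
    """Imaginary part of the refractive index, eta = alpha * lambda / (4*pi),
    from the intensity absorption coefficient alpha."""
    alpha = alpha_per_cm * 1e2     # convert 1/cm -> 1/m
    lam = wavelength_nm * 1e-9     # convert nm -> m
    return alpha * lam / (4.0 * math.pi)

# Illustrative, order-of-magnitude absorption coefficients for silicon:
eta_visible = extinction_coefficient(1e4, 500)     # strong visible absorption
eta_infrared = extinction_coefficient(1e-2, 1550)  # transparency window
print(eta_visible, eta_infrared)
```

The ultimate refractometric detection limit min {Δn} tracks η, so it is many orders of magnitude better in the transparency window than in the visible.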

  4. Fundamental limitations of non-thermal plasma processing for internal combustion engine NOx control

    International Nuclear Information System (INIS)

    Penetrante, B.M.

    1993-01-01

This paper discusses the physics and chemistry of non-thermal plasma processing for post-combustion NOx control in internal combustion engines. A comparison of electron beam and electrical discharge processing is made regarding their power consumption, radical production, NOx removal mechanisms, and by-product formation. Can non-thermal deNOx operate efficiently without additives or catalysts? How much electrical power does it cost to operate? What are the by-products of the process? This paper addresses these fundamental issues based on an analysis of the electron-molecule processes and chemical kinetics.

  5. Prediction of the Fundamental Period of Infilled RC Frame Structures Using Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Panagiotis G. Asteris

    2016-01-01

Full Text Available The fundamental period is one of the most critical parameters for the seismic design of structures. Several approaches for its estimation exist in the literature, but they often conflict with each other, making their use questionable. Furthermore, the majority of these approaches do not take into account the presence of infill walls in the structure, despite the fact that infill walls increase the stiffness and mass of the structure, leading to significant changes in the fundamental period. In the present paper, artificial neural networks (ANNs) are used to predict the fundamental period of infilled reinforced concrete (RC) structures. For the training and the validation of the ANN, a large data set is used, based on a detailed investigation of the parameters that affect the fundamental period of RC structures. The comparison of the predicted values with analytical ones indicates the potential of using ANNs for the prediction of the fundamental period of infilled RC frame structures, taking into account the crucial parameters that influence its value.
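As a toy version of this regression task, the sketch below trains a single-hidden-layer network by stochastic gradient descent on synthetic (height, infill-fraction) data generated from an assumed T = 0.075·H^0.75 relation attenuated by infill. The relation, network size, and training settings are illustrative assumptions, not the paper's database or architecture.

```python
import math, random

random.seed(0)

# Synthetic data (assumed relation, not the paper's database): the period
# grows like H^0.75 and drops as the infill fraction w rises.
def true_period(h, w):
    return 0.075 * h ** 0.75 * (1.0 - 0.4 * w)

data = [((h / 30.0, w), true_period(h, w))
        for h in range(3, 31, 3) for w in (0.0, 0.25, 0.5, 0.75, 1.0)]

# One hidden layer of tanh units, linear output.
H = 8
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    hid = [math.tanh(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(H)]
    return hid, b2 + sum(w2[j] * hid[j] for j in range(H))

lr = 0.05
for _ in range(3000):                       # stochastic gradient descent
    for x, t in data:
        hid, y = forward(x)
        err = y - t
        for j in range(H):
            gh = err * w2[j] * (1.0 - hid[j] ** 2)   # backprop through tanh
            w2[j] -= lr * err * hid[j]
            b1[j] -= lr * gh
            w1[j][0] -= lr * gh * x[0]
            w1[j][1] -= lr * gh * x[1]
        b2 -= lr * err

mse = sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)
print(f"training MSE = {mse:.5f}")
```

In practice one would hold out a validation set, as the paper does; here the point is only that a small network recovers the smooth period surface from the two inputs.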

  6. Fundamentals of Project Management

    CERN Document Server

    Heagney, Joseph

    2011-01-01

With sales of more than 160,000 copies, Fundamentals of Project Management has helped generations of project managers navigate the ins and outs of every aspect of this complex discipline. Using a simple step-by-step approach, the book is the perfect introduction to project management tools, techniques, and concepts. Readers will learn how to: • Develop a mission statement, vision, goals, and objectives • Plan the project • Create the work breakdown structure • Produce a workable schedule • Understand earned value analysis • Manage a project team • Control and evaluate progress at every stage.

  7. Quantity and quality limit detritivore growth: mechanisms revealed by ecological stoichiometry and co-limitation theory.

    Science.gov (United States)

    Halvorson, Halvor M; Sperfeld, Erik; Evans-White, Michelle A

    2017-12-01

Resource quantity and quality are fundamental bottom-up constraints on consumers. Best understood in autotroph-based systems, co-occurrence of these constraints may be common but remains poorly studied in detrital-based systems. Here, we used a laboratory growth experiment to test limitation of the detritivorous caddisfly larvae Pycnopsyche lepida across a concurrent gradient of oak litter quantity (food supply) and quality (phosphorus:carbon [P:C] ratios). Growth increased simultaneously with quantity and quality, indicating co-limitation across the resource gradients. We merged the approaches of ecological stoichiometry and co-limitation theory, showing how co-limitation reflected shifts in C and P acquisition under homeostatic regulation. Increased growth was best explained by elevated consumption rates and improved P assimilation, which both increased with elevated quantity and quality. Notably, C assimilation efficiencies remained unchanged, reaching a maximum of 18% at low quantity despite pronounced C limitation. Detrital C recalcitrance and substantive post-assimilatory C losses probably set a minimum quantity threshold for achieving a positive C balance. Above this threshold, greater quality enhanced larval growth, probably by improving P assimilation toward P-intensive growth. We suggest this interplay of C and P acquisition contributes to detritivore co-limitation, highlighting quantity and quality as potential simultaneous bottom-up controls in detrital-based ecosystems, including under anthropogenic change such as nutrient enrichment. © 2017 by the Ecological Society of America.
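The distinction between single-resource limitation and the simultaneous co-limitation reported here can be illustrated with a toy growth model contrasting Liebig's law of the minimum with a multiplicative formulation. The Monod functional forms and half-saturation constants below are illustrative assumptions, not parameters fitted to the Pycnopsyche data.

```python
def monod(x, k):
    """Monod saturation term for a single resource."""
    return x / (k + x)

def liebig(c, p, gmax=1.0, kc=1.0, kp=0.1):
    """Growth under Liebig's law of the minimum: only one resource limits."""
    return gmax * min(monod(c, kc), monod(p, kp))

def multiplicative(c, p, gmax=1.0, kc=1.0, kp=0.1):
    """Growth under multiplicative co-limitation: both resources act at once."""
    return gmax * monod(c, kc) * monod(p, kp)

# At a carbon-limited point, Liebig growth ignores extra phosphorus,
# whereas the multiplicative model still responds to quality (P:C):
print(liebig(0.2, 0.5), liebig(0.2, 1.0))                  # identical
print(multiplicative(0.2, 0.5), multiplicative(0.2, 1.0))  # second is larger
```

A growth response that rises along both the quantity and the quality axes, as observed in the experiment, is the signature of the multiplicative (simultaneous) form rather than a strict single-resource minimum.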

  8. Limited endoscopic transsphenoidal approach for cavernous sinus biopsy: illustration of 3 cases and discussion.

    Science.gov (United States)

    Graillon, T; Fuentes, S; Metellus, P; Adetchessi, T; Gras, R; Dufour, H

    2014-01-01

Advances in transsphenoidal surgery and endoscopic techniques have opened new perspectives for cavernous sinus (CS) approaches. The aim of this study was to assess the advantages and disadvantages of the limited endoscopic transsphenoidal approach, as performed in pituitary adenoma surgery, for CS tumor biopsy, illustrated with three clinical cases. The first case was a 46-year-old woman with a prior medical history of parotid adenocarcinoma successfully treated 10 years previously. The cavernous sinus tumor was revealed by right third and sixth nerve palsies and had increased in size over the previous three years. A tumor biopsy using a limited endoscopic transsphenoidal approach revealed an adenocarcinoma metastasis. Complementary radiosurgery was performed. The second case was a 36-year-old woman who consulted for diplopia with right sixth nerve palsy and amenorrhea with hyperprolactinemia. Dopamine agonist treatment was used to restore the patient's menstrual cycle. Cerebral magnetic resonance imaging (MRI) revealed a right-sided CS tumor. CS biopsy, via a limited endoscopic transsphenoidal approach, confirmed a meningothelial grade 1 meningioma. Complementary radiosurgery was performed. The third case was a 63-year-old woman with progressive installation of left third nerve palsy and visual acuity loss, revealing a left cavernous sinus tumor invading the optic canal. Surgical biopsy was performed using an enlarged endoscopic transsphenoidal approach to decompress the optic nerve. Biopsy results revealed a meningothelial grade 1 meningioma. Complementary radiotherapy was performed. In these three cases, no complications were observed. Mean hospitalization duration was 4 days. Reported anatomical studies and clinical series have shown the feasibility of reaching the cavernous sinus using an endoscopic endonasal approach. Trans-foramen ovale CS percutaneous biopsy is an interesting procedure but only provides cell analysis results, and not tissue analysis. However, radiotherapy and

  9. Fundamentals of Structural Geology

    Science.gov (United States)

    Pollard, David D.; Fletcher, Raymond C.

    2005-09-01

Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. The book integrates field mapping using modern technology with the analysis of structures based on a complete treatment of mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the supporting website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The website also hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations, together with student exercises designed to instill the fundamental relationships and to encourage visualization of the evolution of geological structures; solutions to the exercises are available to instructors.

  10. A fundamentally new approach to air-cooled heat exchangers.

    Energy Technology Data Exchange (ETDEWEB)

    Koplow, Jeffrey P.

    2010-01-01

We describe breakthrough results obtained in a feasibility study of a fundamentally new architecture for air-cooled heat exchangers. A longstanding but largely unrealized opportunity in energy efficiency concerns the performance of air-cooled heat exchangers used in air conditioners, heat pumps, and refrigeration equipment. In the case of residential air conditioners, for example, the typical performance of the air-cooled heat exchangers used for condensers and evaporators is at best marginal from the standpoint of achieving the maximum possible coefficient of performance (COP). If by some means it were possible to reduce the thermal resistance of these heat exchangers to a negligible level, a typical energy savings of order 30% could be immediately realized. It has long been known that a several-fold increase in heat exchanger size, in conjunction with the use of much higher volumetric flow rates, provides a straightforward path to this goal, but this is not practical from the standpoint of real-world applications. The tension in the marketplace between the need for energy efficiency and logistical considerations such as equipment size, cost and operating noise has resulted in a compromise that is far from ideal. This is the reason that a typical residential air conditioner exhibits significant sensitivity to reductions in fan speed and/or fouling of the heat exchanger surface. The prevailing wisdom is that little can be done to improve this situation; the 'fan-plus-finned-heat-sink' heat exchanger architecture used throughout the energy sector represents an extremely mature technology for which there is little opportunity for further optimization. But the fact remains that conventional fan-plus-finned-heat-sink technology simply doesn't work that well. The primary physical limitation to performance (i.e., to achieving low thermal resistance) is the boundary layer of motionless air that adheres to and envelops all surfaces of the heat exchanger. Within this

  11. Fundamental problems in provable security and cryptography.

    Science.gov (United States)

    Dent, Alexander W

    2006-12-15

    This paper examines methods for formally proving the security of cryptographic schemes. We show that, despite many years of active research and dozens of significant results, there are fundamental problems which have yet to be solved. We also present a new approach to one of the more controversial aspects of provable security, the random oracle model.

  12. An Information-Based Approach to Precision Analysis of Indoor WLAN Localization Using Location Fingerprint

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2015-12-01

Full Text Available In this paper, we propose a novel information-based approach to precision analysis of indoor wireless local area network (WLAN) localization using location fingerprints. First, by using the Fisher information matrix (FIM), we derive the fundamental limit of WLAN fingerprint-based localization precision, considering different signal distributions in characterizing the variation of received signal strengths (RSSs) in the target environment. We then explore the relationship between the localization precision and access point (AP) placement, which can provide valuable suggestions for the design of highly precise localization systems. Second, we adopt the heuristic simulated annealing (SA) algorithm to optimize the AP locations so as to approach the fundamental limit of localization precision. Finally, extensive simulations and experiments are conducted in both regular line-of-sight (LOS) and irregular non-line-of-sight (NLOS) environments to demonstrate that the proposed approach can not only effectively improve the WLAN fingerprint-based localization precision, but also reduce the time overhead.
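The Fisher-information step can be sketched numerically. Under a log-distance path-loss model with Gaussian shadowing (a standard textbook model, simpler than the fingerprint distributions considered in the paper), the FIM for the 2-D position is Jᵀ J / σ², where J stacks the gradients of the mean RSS from each AP, and the trace of its inverse lower-bounds the mean-square position error. Comparing two AP layouts shows why placement matters; all coordinates and parameters below are illustrative assumptions.

```python
import math

def rss_crlb(target, aps, n=3.0, sigma_db=4.0):
    """Cramer-Rao lower bound on the RMS 2-D position error for RSS-based
    localization: mean RSS from AP i is P0 - 10*n*log10(d_i), shadowing
    std is sigma_db; FIM = J^T J / sigma^2."""
    k = 10.0 * n / math.log(10.0)
    fxx = fyy = fxy = 0.0
    for ax, ay in aps:
        dx, dy = target[0] - ax, target[1] - ay
        d2 = dx * dx + dy * dy
        # gradient of the mean RSS w.r.t. (x, y) is -k * (dx, dy) / d^2
        gx, gy = -k * dx / d2, -k * dy / d2
        fxx += gx * gx
        fyy += gy * gy
        fxy += gx * gy
    det_f = fxx * fyy - fxy * fxy
    trace_inv = sigma_db ** 2 * (fxx + fyy) / det_f   # trace of FIM^-1
    return math.sqrt(trace_inv)                       # RMS error bound (m)

aps_spread = [(0, 0), (10, 0), (0, 10), (10, 10)]     # good geometry
aps_clustered = [(0, 0), (1, 0), (0, 1), (1, 1)]      # poor geometry
print(rss_crlb((5, 5), aps_spread), rss_crlb((5, 5), aps_clustered))
```

The spread layout surrounds the target and yields a much smaller bound than the clustered one, which is the kind of geometric effect the paper's AP-placement optimization exploits.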

  13. Thermodynamic limits set relevant constraints to the soil-plant-atmosphere system and to optimality in terrestrial vegetation

    Science.gov (United States)

    Kleidon, Axel; Renner, Maik

    2016-04-01

The soil-plant-atmosphere system is a complex system that is strongly shaped by interactions between the physical environment and vegetation. This complexity appears to demand equally complex models to fully capture the dynamics of the coupled system. What we describe here is an alternative approach that is based on thermodynamics and allows for comparatively simple formulations free of empirical parameters, by assuming that the system is so complex that its emergent dynamics are constrained only by the thermodynamics of the system. This approach specifically makes use of the second law of thermodynamics, a fundamental physical law that is typically not considered in Earth system science. Its relevance to land surface processes is that it fundamentally sets a direction, as well as limits, to energy conversions and the associated rates of mass exchange, but it requires us to formulate land surface processes as thermodynamic processes driven by energy conversions. We describe an application of this approach to the surface energy balance partitioning at the diurnal scale. In this application the turbulent fluxes of sensible and latent heat are described as the result of a convective heat engine that is driven by solar radiative heating of the surface and that operates at its thermodynamic limit. The fluxes predicted by this approach compare very well to observations at several sites. This suggests that the turbulent exchange fluxes between the surface and the atmosphere operate at their thermodynamic limit, so that thermodynamics imposes a relevant constraint on the land surface-atmosphere system. Yet, thermodynamic limits do not entirely determine the soil-plant-atmosphere system, because vegetation affects these limits, for instance by affecting the magnitude of surface heating through absorption of solar radiation in the canopy layer. These effects are likely to make the conditions at the land surface more favorable for photosynthetic activity
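The "convective heat engine at its thermodynamic limit" idea can be captured in a few lines. In the toy model below, absorbed radiation R is balanced by the turbulent flux J plus a linearized radiative feedback k_r·(T_s − T_a), and the engine converts J to work with a Carnot-like efficiency (T_s − T_a)/T_s; all parameter values are illustrative assumptions, not those used by the authors. Pushing more heat through the engine cools the surface and shrinks the efficiency, so maximum power sits near J = R/2, the hallmark of this class of maximum-power limits.

```python
def power(J, R=500.0, kr=5.0, Ta=288.0):
    """Power output (W/m^2) of a surface heat engine driven by turbulent
    flux J, with surface temperature set by the linearized energy balance
    R = J + kr * (Ts - Ta) and a Carnot-like efficiency (Ts - Ta)/Ts."""
    Ts = Ta + (R - J) / kr
    return J * (Ts - Ta) / Ts

# Sweep the flux from 0 to R and locate the maximum-power point:
best_J = max(range(0, 501), key=power)
print(best_J, round(power(best_J), 1))
```

The optimum lands slightly above R/2 here only because the Carnot factor itself depends weakly on J; the trade-off between flux and driving temperature difference is the essential feature.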

  14. Novel approach to epicardial pacemaker implantation in patients with limited venous access.

    Science.gov (United States)

    Costa, Roberto; Scanavacca, Mauricio; da Silva, Kátia Regina; Martinelli Filho, Martino; Carrillo, Roger

    2013-11-01

Limited venous access in certain patients increases the procedural risk and complexity of conventional transvenous pacemaker implantation. The purpose of this study was to determine a minimally invasive epicardial approach using pericardial reflections for dual-chamber pacemaker implantation in patients with limited venous access. Between June 2006 and November 2011, 15 patients underwent epicardial pacemaker implantation. Procedures were performed through a minimally invasive subxiphoid approach and pericardial window with subsequent fluoroscopy-assisted lead placement. Mean patient age was 46.4 ± 15.3 years (9 male [60.0%], 6 female [40.0%]). The new surgical approach was used in patients determined to have limited venous access due to multiple abandoned leads in 5 (33.3%), venous occlusion in 3 (20.0%), intravascular retention of lead fragments from prior extraction in 3 (20.0%), tricuspid valve vegetation currently under treatment in 2 (13.3%), and unrepaired intracardiac defects in 2 (13.3%). All procedures were successful with no perioperative complications or early deaths. Mean operating time for isolated pacemaker implantation was 231.7 ± 33.5 minutes. Lead placement on the superior aspect of the right atrium, through the transverse sinus, was possible in 12 patients. In the remaining 3 patients, the atrial lead was implanted on the left atrium through the oblique sinus, the postcaval recess, or the left pulmonary vein recess. None of the patients displayed pacing or sensing dysfunction, and all parameters remained stable throughout the follow-up period of 36.8 ± 25.1 months. Epicardial pacemaker implantation through pericardial reflections is an effective alternative therapy for those patients requiring physiologic pacing in whom venous access is limited. © 2013 Heart Rhythm Society. All rights reserved.

  15. Design approach for safe tritium handling in ITER

    International Nuclear Information System (INIS)

    Ohira, Shigeru

    2002-01-01

Outlines for tritium handling and a fundamental approach for ensuring safety are presented. The amount of tritium stored and processed in the ITER facility will be much larger than that in existing facilities for fusion research, though the processing methods and processing conditions (e.g., concentration, pressure, etc.) will be similar to those used in such facilities. Therefore, the considerations to be taken for tritium handling, such as limitation of tritium permeation and leaks, provision of an appropriate ventilation/detritiation system for maintenance, measures to ensure mechanical integrity, etc., can be provided based on the knowledge obtained in those facilities. The Technical Advisory Committee of the Science and Technology Agency established a fundamental approach in 2000, and set out the basic safety principles and approaches as technical requirements for safety design and assessment, which were derived from the safety characteristics of the ITER plant. Sufficient prevention of accidents can be achieved by ensuring and maintaining the structural integrity of the enclosures containing radioactive materials against the loads anticipated during operation, and a low hazard potential of radioactive materials, sufficiently within prescribed limits, can be maintained by the ventilation and clean-up system even if a large release is postulated. (author)

  16. The Principles of Proportionality, Legal Argumentation and the Discretionary Power of the Public Administration: An Analysis from the Limits on Fundamental Rights and Guarantees

    Directory of Open Access Journals (Sweden)

    Yezid Carrillo-de la Rosa

    2017-06-01

Full Text Available This paper examines the implications of the principle of proportionality with regard to administrative decisions that limit civil liberties and fundamental rights. The hypothesis we intend to demonstrate is that the discretionary power of the Public Administration to issue measures that restrict individual rights and liberties is merely apparent, since the reach of agency discretion in choosing time, means and place conditions is very narrow. As the following research shows, the principle of proportionality obliges administrative agencies to implement effective means to attain the purposes of their intervention while minimizing the impact on constitutionally protected rights and liberties.

  17. A novel approach to derive halo-independent limits on dark matter properties

    OpenAIRE

    Ferrer, Francesc; Ibarra, Alejandro; Wild, Sebastian

    2015-01-01

We propose a method that makes it possible to place an upper limit on the dark matter elastic scattering cross section with nucleons which is independent of the velocity distribution. Our approach combines null results from direct detection experiments with indirect searches at neutrino telescopes, and goes beyond previous attempts to remove astrophysical uncertainties in that it directly constrains the particle physics properties of the dark matter. The resulting halo-independent upper limits on the sc...

  18. Higher-order harmonics of limited diffraction Bessel beams

    Science.gov (United States)

    Ding; Lu

    2000-03-01

    We investigate theoretically the nonlinear propagation of the limited diffraction Bessel beam in nonlinear media, under the successive approximation of the KZK equation. The result shows that the nth-order harmonic of the Bessel beam, like its fundamental component, is radially limited diffracting, and that the main beamwidth of the nth-order harmonic is exactly 1/n times that of the fundamental.

  19. Predicting fundamental and realized distributions based on thermal niche: A case study of a freshwater turtle

    Science.gov (United States)

    Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco; Ribeiro, Bruno R.

    2018-04-01

Species distribution models (SDMs) have been broadly used in ecology to address theoretical and practical problems. Currently, there are two main approaches to generating SDMs: (i) correlative models, which are based on species occurrences and environmental predictor layers, and (ii) process-based models, which are constructed from species' functional traits and physiological tolerances. The distributions estimated by each approach are based on different components of the species' niche. Predictions of correlative models approximate species' realized niches, while predictions of process-based models are more akin to species' fundamental niches. Here, we integrated predictions of the fundamental and realized distributions of the freshwater turtle Trachemys dorbigni. The fundamental distribution was estimated using data on T. dorbigni's egg incubation temperature, and the realized distribution was estimated using species occurrence records. Both types of distribution were estimated using the same regression approaches (logistic regression and support vector machines), considering both macroclimatic and microclimatic temperatures. The realized distribution of T. dorbigni was generally nested in its fundamental distribution, reinforcing the theoretical assumption that a species' realized niche is a subset of its fundamental niche. Both modelling algorithms produced similar results, but microtemperature generated better results than macrotemperature for the incubation model. Finally, our results reinforce the conclusion that species' realized distributions are constrained by factors other than thermal tolerances alone.
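The two modelling routes can be contrasted in a minimal one-predictor sketch: a process-based rule derived from an assumed incubation-temperature window, and a correlative logistic regression fitted to synthetic occurrence records drawn from a narrower occupied band. All numbers are illustrative, not T. dorbigni's measured tolerances; the point is that the realized prediction comes out nested inside the fundamental window.

```python
import math

# Process-based ("fundamental") rule: an assumed viable incubation window
# (illustrative bounds, not the species' measured tolerances).
T_MIN, T_MAX = 22.0, 33.0
def fundamental(temp):
    return T_MIN <= temp <= T_MAX

# Synthetic occurrence records: the occupied ("realized") band is narrower
# than the tolerance window, standing in for biotic/dispersal constraints.
temps = [15.0 + 0.25 * i for i in range(101)]              # 15..40 degC
records = [(t, 1 if 25.0 <= t <= 30.0 else 0) for t in temps]

# Correlative model: logistic regression with a quadratic logit, so the
# predicted suitability is unimodal in temperature.
b0 = b1 = b2 = 0.0
for _ in range(2000):                                       # SGD epochs
    for t, y in records:
        z = (t - 27.5) / 7.0                                # standardized temp
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * z + b2 * z * z)))
        g = 0.05 * (p - y)                                  # lr * residual
        b0 -= g
        b1 -= g * z
        b2 -= g * z * z

def realized(temp):
    z = (temp - 27.5) / 7.0
    return b0 + b1 * z + b2 * z * z > 0.0                   # predicted presence

nested = all(fundamental(t) for t in temps if realized(t))
print("realized distribution nested in fundamental:", nested)
```

Because the occurrence band sits strictly inside the tolerance window, every temperature the correlative model flags as suitable also passes the process-based rule, mirroring the nesting the paper reports.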

  20. Ultrafast mid-IR laser scalpel: protein signals of the fundamental limits to minimally invasive surgery.

    Science.gov (United States)

    Amini-Nik, Saeid; Kraemer, Darren; Cowan, Michael L; Gunaratne, Keith; Nadesan, Puviindran; Alman, Benjamin A; Miller, R J Dwayne

    2010-09-28

Lasers have in principle the capability to cut at the level of a single cell, the fundamental limit to minimally invasive procedures and restructuring biological tissues. To date, this limit has not been achieved due to collateral damage on the macroscale that arises from thermal and shock-wave-induced damage of surrounding tissue. Here, we report on a novel concept using a specifically designed Picosecond IR Laser (PIRL) that selectively energizes water molecules in the tissue to drive the ablation or cutting process faster than thermal exchange of energy and shock wave propagation, without plasma formation or ionizing radiation effects. The targeted laser process imparts the least amount of energy to the remaining tissue, without any of the deleterious photochemical or photothermal effects that accompany other laser wavelengths and pulse parameters. Full-thickness incisional and excisional wounds were generated in CD1 mice using the Picosecond IR Laser, a conventional surgical laser (DELight Er:YAG) or mechanical surgical tools. Transmission and scanning electron microscopy showed that the PIRL produced minimal tissue ablation, with less damage of surrounding tissues than wounds formed using the other modalities. The width of scars formed by wounds made with the PIRL was half that of the scars produced using either a conventional surgical laser or a scalpel. Aniline blue staining showed higher levels of collagen in the early stage of the wounds produced using the PIRL, suggesting that these wounds mature faster. More viable cells were extracted from skin using the PIRL, suggesting less cellular damage. β-catenin and TGF-β signalling, which are activated during the proliferative phase of wound healing and whose levels of activation correlate with wound size, were lower in wounds generated by the PIRL system. Wounds created with the PIRL system also showed a lower rate of cell proliferation. Direct comparison of wound

  1. Ultrafast mid-IR laser scalpel: protein signals of the fundamental limits to minimally invasive surgery.

    Directory of Open Access Journals (Sweden)

    Saeid Amini-Nik

    2010-09-01

    Full Text Available Lasers have in principle the capability to cut at the level of a single cell, the fundamental limit to minimally invasive procedures and restructuring biological tissues. To date, this limit has not been achieved due to collateral damage on the macroscale that arises from thermal and shock-wave-induced damage of surrounding tissue. Here, we report on a novel concept using a specifically designed Picosecond IR Laser (PIRL) that selectively energizes water molecules in the tissue to drive the ablation or cutting process faster than thermal exchange of energy and shock wave propagation, without plasma formation or ionizing radiation effects. The targeted laser process imparts the least amount of energy in the remaining tissue without any of the deleterious photochemical or photothermal effects that accompany other laser wavelengths and pulse parameters. Full-thickness incisional and excisional wounds were generated in CD1 mice using the Picosecond IR Laser, a conventional surgical laser (DELight Er:YAG) or mechanical surgical tools. Transmission and scanning electron microscopy showed that the PIRL laser produced minimal tissue ablation with less damage of surrounding tissues than wounds formed using the other modalities. The width of scars formed by wounds made by the PIRL laser was half that of the scars produced using either a conventional surgical laser or a scalpel. Aniline blue staining showed higher levels of collagen in the early stage of the wounds produced using the PIRL laser, suggesting that these wounds mature faster. More viable cells could be extracted from skin using the PIRL laser, suggesting less cellular damage. β-catenin and TGF-β signalling, which are activated during the proliferative phase of wound healing, and whose level of activation correlates with the size of wounds, were lower in wounds generated by the PIRL system. Wounds created with the PIRL system also showed a lower rate of cell proliferation. Direct comparison of wound

  2. Fundamental safety principles. Safety fundamentals

    International Nuclear Information System (INIS)

    2007-01-01

    This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. The fundamental safety objective - to protect people and the environment from harmful effects of ionizing radiation - applies to all circumstances that give rise to radiation risks. The safety principles are applicable, as relevant, throughout the entire lifetime of all facilities and activities - existing and new - utilized for peaceful purposes, and to protective actions to reduce existing radiation risks. They provide the basis for requirements and measures for the protection of people and the environment against radiation risks and for the safety of facilities and activities that give rise to radiation risks, including, in particular, nuclear installations and uses of radiation and radioactive sources, the transport of radioactive material and the management of radioactive waste.

  3. Fundamental safety principles. Safety fundamentals

    International Nuclear Information System (INIS)

    2006-01-01

    This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purpose. The fundamental safety objective - to protect people and the environment from harmful effects of ionizing radiation - applies to all circumstances that give rise to radiation risks. The safety principles are applicable, as relevant, throughout the entire lifetime of all facilities and activities - existing and new - utilized for peaceful purposes, and to protective actions to reduce existing radiation risks. They provide the basis for requirements and measures for the protection of people and the environment against radiation risks and for the safety of facilities and activities that give rise to radiation risks, including, in particular, nuclear installations and uses of radiation and radioactive sources, the transport of radioactive material and the management of radioactive waste.

  4. From the Kohn-Sham band gap to the fundamental gap in solids. An integer electron approach.

    Science.gov (United States)

    Baerends, E J

    2017-06-21

    It is often stated that the Kohn-Sham occupied-unoccupied gap in both molecules and solids is "wrong". We argue that this is not a correct statement. KS theory does not allow one to interpret the exact KS HOMO-LUMO gap as the fundamental gap (the difference I - A between the ionization energy I and the electron affinity A, i.e. twice the chemical hardness), from which it indeed differs, strongly in molecules and moderately in solids. The exact Kohn-Sham HOMO-LUMO gap in molecules lies much below the fundamental gap and very close to the much smaller optical gap (the first excitation energy), and LDA/GGA yield very similar gaps. In solids the situation is different: the excitation energy to delocalized excited states and the fundamental gap (I - A) are very similar, not as disparate as in molecules. Again, the Kohn-Sham and LDA/GGA band gaps do not represent (I - A) but are significantly smaller. However, the special properties of an extended system like a solid make it very easy to calculate the fundamental gap from ground-state (neutral-system) band structure calculations entirely within a density functional framework. The correction Δ from the KS gap to the fundamental gap originates from the response part v_resp of the exchange-correlation potential and can be calculated very simply using an approximation to v_resp. This affords a calculation of the fundamental gap at the same level of accuracy as other properties of crystals, at little extra cost beyond the ground-state band structure calculation. The method is based on integer-electron systems; fractional-electron systems (an ensemble of N- and (N + 1)-electron systems) and the derivative discontinuity are not invoked.
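The quantities in this abstract are related by a standard Kohn-Sham decomposition (restated here for orientation; the Δ notation follows the abstract, not the paper's exact equations):

```latex
E_{\text{gap}}^{\text{fund}} \;=\; I - A
  \;=\; \underbrace{\varepsilon_{\text{LUMO}}^{\text{KS}} - \varepsilon_{\text{HOMO}}^{\text{KS}}}_{\text{KS band gap}} \;+\; \Delta
```

where Δ is the correction the abstract attributes to the response part v_resp of the exchange-correlation potential.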

  5. Escola de ensino fundamental(s) em movimento – movimento na escola de ensino fundamental

    Directory of Open Access Journals (Sweden)

    Reiner Hildebrandt-Stramann

    2007-12-01

    Full Text Available The primary school (escola de ensino fundamental) in Germany has been set in motion over the last 15 years because, among other reasons, movement has entered these schools. This play on words points to two lines of work that shape the debate in current school pedagogy. The present paper presents these two perspectives. One line relates to the current process of change in school pedagogy, which holds that the primary school should be a place of learning and of lived experience for children. The other line concerns the play on words anchored in these processes of movement pedagogy, which is gaining ever greater importance. The primary school should be viewed from the perspective of movement and transformed into a place of movement.

  6. Abelianization of the F-divided fundamental group scheme

    Indian Academy of Sciences (India)

    INDRANIL BISWAS

    Restrict the Poincaré bundle to X × Pic^0_red(X). Viewing this restriction as a line bundle on Pic^0_red(X) parametrized by X, we ... which gives rise to an exact sequence of the projective systems considered in Definition 2.3. Applying the projective limit functor ...

  7. Fundamentals of ultrasonic phased arrays

    CERN Document Server

    Schmerr, Lester W

    2014-01-01

    This book describes in detail the physical and mathematical foundations of ultrasonic phased array measurements. The book uses linear systems theory to develop a comprehensive model of the signals and images that can be formed with phased arrays. Engineers working in the field of ultrasonic nondestructive evaluation (NDE) will find in this approach a wealth of information on how to design, optimize and interpret ultrasonic inspections with phased arrays. The fundamentals and models described in the book will also be of significant interest to other fields, including the medical ultrasound and

  8. Human Needs as an Approach to Designed Landscapes

    Directory of Open Access Journals (Sweden)

    Dalia Aly

    2018-03-01

    Full Text Available The traditional approach of landscape architecture has always focused on the aesthetic and visual aspects of landscapes while giving less attention to other aspects. This view has limited the benefits that can be derived from designed landscapes, despite the wide-ranging potential they carry for humans: socially, environmentally and economically. As a result, many researchers and practitioners are currently challenging this view in order to develop a more holistic and multidimensional approach. The present research therefore aims at proposing a new perspective on public designed landscapes based on fundamental human needs. The study methodology comprised critical content analysis of three main domains: sustainable development, human needs in specific relation to public landscapes, and significant approaches to fundamental human needs. Reconciliation among these domains was achieved based on a modified version of Max-Neef's matrix of fundamental human needs. Human needs in public landscapes were merged into the matrix to reach a comprehensive yet specific perspective. The study concluded with a conceptual framework that can provide a wider perspective on human needs in designed landscapes. It proposes a new tool for the analysis of the benefits of public landscapes and their value for humans, which can be further used in various applications.

  9. An ICMP-Based Mobility Management Approach Suitable for Protocol Deployment Limitation

    Directory of Open Access Journals (Sweden)

    Jeng-Yueng Chen

    2009-01-01

    Full Text Available Mobility management is one of the important tasks on wireless networks. Many approaches have been proposed in the past, but none of them has been widely deployed so far. Mobile IP (MIP) and Route Optimization (ROMIP) suffer, respectively, from the triangular routing problem and from the need for binding-cache support on every node across the entire Internet. One step toward a solution is the Mobile Routing Table (MRT), which enables edge routers to take over address binding. However, this approach demands that all edge routers on the Internet support the MRT, resulting in protocol deployment difficulties. To address this problem and to offset the limitation of the original MRT approach, we propose two different schemes, an ICMP echo scheme and an ICMP destination-unreachable scheme. These two schemes work with the MRT to efficiently find MRT-enabled routers, which greatly reduces the number of triangular routes. In this paper, we analyze and compare the standard MIP and the proposed approaches. Simulation results have shown that the proposed approaches reduce transmission delay, with only a few routers supporting the MRT.
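The gain the abstract claims can be illustrated with a toy hop-count comparison of a triangular (home-agent) route against a direct route through an MRT-enabled edge router. The topology and node names below are invented for illustration; this is not the paper's simulation setup.

```python
from collections import deque

# Hypothetical topology: CN = correspondent node, HA = home agent,
# MR = MRT-enabled edge router near CN, MN = mobile node's foreign
# attachment point, R1/R2 = plain routers.
EDGES = {
    "CN": ["MR", "R1"],
    "MR": ["CN", "R2"],
    "R1": ["CN", "HA"],
    "HA": ["R1", "R2"],
    "R2": ["MR", "HA", "MN"],
    "MN": ["R2"],
}

def hops(path):
    """Number of links along a node sequence."""
    return len(path) - 1

def shortest(src, dst):
    """Breadth-first search for a shortest path in the unweighted graph."""
    seen, queue = {src}, deque([[src]])
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in EDGES[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])

# Triangular (MIP-style) route: CN -> home agent, then HA -> mobile node.
triangular = hops(shortest("CN", "HA")) + hops(shortest("HA", "MN"))
# MRT route: the edge router holds the binding, so packets go directly.
direct = hops(shortest("CN", "MN"))
```

Even in this tiny graph the detour through the home agent costs extra hops, which is the delay the MRT schemes aim to remove.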

  10. Fundamentals of PIXE analysis

    International Nuclear Information System (INIS)

    Ishii, Keizo

    1997-01-01

    Elemental analysis based on particle-induced X-ray emission (PIXE) is a novel technique for analyzing trace elements. It is a very simple method, its sensitivity is very high, multiple elements in a sample can be analyzed simultaneously, and a few tens of μg of a sample are enough for analysis. Owing to these characteristics, PIXE analysis is now used in many fields (e.g. biology, medicine, dentistry, environmental pollution, archaeology, cultural assets, etc.). Fundamentals of PIXE analysis are described here: the production of characteristic X-rays and inner-shell ionization by heavy charged particles, the continuous background in the PIXE spectrum, quantitative formulae of PIXE analysis, the detection limit of PIXE analysis, etc. (author)
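The detection limit mentioned above is conventionally tied to counting statistics: a peak is considered detectable when its net counts exceed three standard deviations of the background under the peak (the standard 3σ criterion; this is a common convention, not necessarily the author's exact formulation):

```latex
N_{\text{net}}^{\min} = 3\sqrt{N_B}, \qquad
c_{\text{DL}} = \frac{3\sqrt{N_B}}{S}
```

where N_B is the background count integrated under the peak region and S is the sensitivity (counts per unit concentration).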

  11. Fundamental plasma emission involving ion sound waves

    International Nuclear Information System (INIS)

    Cairns, I.H.

    1987-01-01

    The theory for fundamental plasma emission by the three-wave processes L ± S → T (where L, S and T denote Langmuir, ion sound and transverse waves, respectively) is developed. Kinematic constraints on the characteristics and growth lengths of waves participating in the wave processes are identified. In addition the rates, path-integrated wave temperatures, and limits on the brightness temperature of the radiation are derived. (author)

  12. Inductively coupled plasma emission spectroscopy. Part II: applications and fundamentals. Volume 2

    International Nuclear Information System (INIS)

    Boumans, P.W.J.M.

    1987-01-01

    This is the second part of the two-volume treatise by this well-known and respected author. This volume reviews applications of inductively coupled plasma atomic emission spectroscopy (ICP-AES), summarizes fundamental studies, and compares ICP-AES methods with other methods of analysis. The first six chapters are devoted to specific fields of application, including the following: metals and other industrial materials, geology, the environment, agriculture and food, biology and clinical analysis, and organic materials. The chapter on the analysis of organic materials also covers the special instrumental considerations required when organic solvents are introduced into an inductively coupled plasma. A chapter on the direct analysis of solids completes the first part of this volume. Each of the applications chapters begins with a summary of the types of samples that are encountered in that field, and the kinds of problems that an elemental analysis can help to solve. This is followed by a tutorial approach covering applicability, advantages, and limitations of the methods. The coverage is thorough, including sample handling, storage, and preparation; acid and fusion dissolution; avoiding contamination; methods of preconcentration; the types of interferences that can be expected and ways to reduce them; and the types of ICP plasmas that are used. The second half of the volume covers fundamental studies of ICP-AES: basic processes of aerosol generation, plasma modeling and computer simulation, spectroscopic diagnostics, excitation mechanisms, and discharge characteristics. This section introduces the experimental and modeling methods that have been used to obtain fundamental information about ICPs

  13. Fundamental volatility and stock returns : does fundamental volatility explain stock returns?

    OpenAIRE

    Selboe, Guner K.; Virdee, Jaspal Singh

    2017-01-01

    In this thesis, we investigate whether fundamental uncertainty can explain the cross-section of stock returns. To measure the fundamental uncertainty, we estimate rolling standard deviations and accounting betas of four different fundamentals: revenues, gross profit, earnings and cash flows. The standard deviation and the beta of revenues significantly explain returns in the Fama-Macbeth procedure, but appear significant only among smaller stocks in the portfolio formation ...
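The "rolling standard deviation of a fundamental" proxy described above can be sketched in a few lines. The window length and the revenue figures below are invented illustrations, not the thesis's exact specification:

```python
import math

def rolling_std(series, window):
    """Trailing-window sample standard deviation of a fundamental series."""
    out = []
    for end in range(window, len(series) + 1):
        w = series[end - window:end]
        mean = sum(w) / window
        var = sum((x - mean) ** 2 for x in w) / (window - 1)  # sample variance
        out.append(math.sqrt(var))
    return out

# Hypothetical quarterly revenues for one firm.
revenues = [100, 104, 98, 110, 107, 115, 103, 120]
vol = rolling_std(revenues, window=4)
```

Each entry of `vol` is the firm's "fundamental volatility" as of that quarter, which would then be carried into a Fama-MacBeth regression or a portfolio sort.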

  14. Analysis of Budget Deficits and Macroeconomic Fundamentals: A VAR-VECM Approach

    Directory of Open Access Journals (Sweden)

    Manamba Epaphra

    2017-10-01

    Full Text Available Aim/purpose - This paper examines the relationship between budget deficits and selected macroeconomic variables in Tanzania for the period spanning from 1966 to 2015. Design/methodology/approach - The paper uses vector autoregression (VAR) - vector error correction model (VECM) and variance decomposition techniques. Johansen's test is applied to examine the long-run relationship among the variables under study. Findings - Johansen's cointegration test indicates that the variables are cointegrated and thus have a long-run relationship. The results based on the VAR-VECM estimation show that real GDP and the exchange rate have a negative and significant relationship with the budget deficit, whereas inflation, money supply and the lending interest rate have a positive one. Variance decomposition results show that variances in the budget deficit are mostly explained by real GDP, followed by inflation and the real exchange rate. Research implications/limitations - Results are very indicative, but highlight the importance of containing inflation and money supply to check their effects on budget deficits over the short-run and long-run periods. Also, the policy recommendation calls for fiscal authorities in Tanzania to adopt efficient and effective methods of tax collection and public sector spending. Originality/value/contribution - Tanzania has been experiencing budget deficits since the 1970s, and this budget deficit has been blamed for high indebtedness, inflation and poor investment and growth. The paper contributes to the empirical debate on the causal relationship between budget deficits and macroeconomic variables by employing VAR-VECM and variance decomposition approaches.
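The error-correction idea behind a VECM can be shown in miniature: when two series are cointegrated, the change in one responds to the previous period's deviation from the long-run relation. The sketch below uses invented parameters and assumes the cointegrating coefficient is known (in practice it is estimated, e.g. by Johansen's procedure); it recovers the adjustment speed by least squares:

```python
import random

random.seed(42)
beta, alpha = 2.0, -0.5  # invented long-run coefficient and adjustment speed

# Simulate: x is a random walk; y error-corrects toward beta * x.
x, y = [0.0], [0.0]
for _ in range(500):
    x.append(x[-1] + random.gauss(0, 1.0))
    ect = y[-1] - beta * x[-2]          # last period's disequilibrium
    y.append(y[-1] + alpha * ect + random.gauss(0, 0.1))

# OLS (no intercept) of the change in y on the lagged error-correction term.
dy = [y[t] - y[t - 1] for t in range(1, len(y))]
ects = [y[t - 1] - beta * x[t - 1] for t in range(1, len(y))]
alpha_hat = sum(d * c for d, c in zip(dy, ects)) / sum(c * c for c in ects)
```

A significantly negative `alpha_hat` is the VECM signature of a long-run relationship: deviations from equilibrium are corrected over time.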

  15. Unmaking the bomb: Verifying limits on the stockpiles of nuclear weapons

    Science.gov (United States)

    Glaser, Alexander

    2017-11-01

    Verifying limits on the stockpiles of nuclear weapons may require the ability of international inspectors to account for individual warheads, even when non-deployed, and to confirm the authenticity of nuclear warheads prior to dismantlement. These are fundamentally new challenges for nuclear verification, and they have been known for some time; unfortunately, due to a lack of a sense of urgency, research in this area has not made substantial progress over the past 20 years. This chapter explores the central outstanding issues and offers a number of possible paths forward. In the case of confirming numerical limits, these include innovative tagging techniques and approaches based solely on declarations using modern cryptographic escrow schemes; with regard to warhead confirmation, there has recently been increasing interest in developing fundamentally new measurement approaches where, in one form or another, sensitive information is not acquired in the first place. Overall, new international R&D efforts could more usefully focus on non-intrusive technologies and approaches, which may show more promise for early demonstration and adoption. In the meantime, while warhead dismantlements remain unverified, nuclear weapon states ought to begin to document warhead assembly, refurbishment, and dismantlement activities and movements of warheads and warhead components through the weapons complex in ways that international inspectors will find credible at a later time. Again, such a process could be enabled by modern cryptographic techniques such as blockchaining. Finally, it is important to recognize that the main reason for the complexity of technologies and approaches needed for nuclear disarmament verification is the requirement to protect information that nuclear weapon states consider sensitive. Ultimately, if information security concerns cannot be resolved to the satisfaction of all stakeholders, an alternative would be to "reveal the
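The "blockchaining" the chapter mentions can be sketched as a hash-chained declaration log: each entry commits to its predecessor, so later tampering with any record is detectable. A minimal standard-library sketch; the record contents and field names are invented for illustration:

```python
import hashlib
import json

GENESIS = "0" * 64

def chain(declarations):
    """Link each declaration to its predecessor via a SHA-256 digest."""
    log, prev = [], GENESIS
    for decl in declarations:
        entry = {"decl": decl, "prev": prev}
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        log.append({**entry, "hash": digest})
        prev = digest
    return log

def verify(log):
    """Recompute every digest; any altered record breaks the chain."""
    prev = GENESIS
    for entry in log:
        expected = hashlib.sha256(
            json.dumps({"decl": entry["decl"], "prev": prev},
                       sort_keys=True).encode()).hexdigest()
        if entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

log = chain(["warhead W-001 moved to dismantlement queue",
             "warhead W-001 dismantled"])
ok_before = verify(log)
log[0]["decl"] = "warhead W-001 redeployed"   # tampering
ok_after = verify(log)
```

The point, as in the chapter, is that states can commit to a record now while revealing its contents only later, with inspectors able to confirm that nothing was rewritten in the meantime.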

  16. Teaching SIMS fundamentals using the FIB ion microscope

    International Nuclear Information System (INIS)

    Chater, Richard J.; McPhail, David S.

    2008-01-01

    The use of liquid metal source ion beams for microscopy and ion milling applications has increased dramatically in recent years. This paper explores the teaching of ion-solid sputtering and ionization phenomena without the facility to mass analyse the ionised yield available in dedicated SIMS instrumentation. Fundamental parameters can be demonstrated during the limited period of an undergraduate laboratory teaching session

  17. THE NECESSITY OF APPROACHING THE ENTERPRISE PERFORMANCE CONCEPT THROUGH A THEORETICAL FUNDAMENTAL SYSTEM

    Directory of Open Access Journals (Sweden)

    DEAC VERONICA

    2017-10-01

    Full Text Available The purpose of this paper is to justify the necessity of building a theoretical-fundamental system to define and delimit the integrated notions applicable to the concept of enterprise performance. Standing as fundamental research, the present paper argues and shows that both the literature in this field and the applied environment require a clearer segregation, and greater specificity, of the concept of "enterprise performance", considering that, on one hand, it is not unanimously defined and, on the other hand, it represents a widely used key concept which ultimately has to be measured in order to be helpful. Moreover, the present paper should be useful to scholars working in the field of firm performance who wish to understand this concept and to develop future research on enterprise performance measurement.

  18. Practical roadmap and limits to nanostructured photovoltaics

    Energy Technology Data Exchange (ETDEWEB)

    Lunt, Richard R. [Department of Chemical Engineering and Materials Science, Michigan State University, East Lansing, MI 48824 (United States); Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Osedach, Timothy P. [School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138 (United States); Brown, Patrick R. [Department of Physics, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Rowehl, Jill A. [Department of Materials Science and Engineering, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Bulovic, Vladimir [Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States)

    2011-12-22

    The significant research interest in the engineering of photovoltaic (PV) structures at the nanoscale is directed toward enabling reductions in PV module fabrication and installation costs as well as improving cell power conversion efficiency (PCE). With the emergence of a multitude of nanostructured photovoltaic (nano-PV) device architectures, the question has arisen of where both the practical and the fundamental limits of performance reside in these new systems. Here, the former is addressed a posteriori. The specific challenges associated with improving the electrical power conversion efficiency of various nano-PV technologies are discussed and several approaches to reduce their thermal losses beyond the single bandgap limit are reviewed. Critical considerations related to the module lifetime and cost that are unique to nano-PV architectures are also addressed. The analysis suggests that a practical single-junction laboratory power conversion efficiency limit of 17% and a two-cell tandem power conversion efficiency limit of 24% are possible for nano-PVs, which, when combined with operating lifetimes of 10 to 15 years, could position them as a transformational technology for solar energy markets. (Copyright 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

  19. Fundamental of cryogenics (for superconducting RF technology)

    CERN Document Server

    Pierini, Paolo

    2013-01-01

    This review briefly illustrates a few fundamental concepts of cryogenic engineering, the technological practice that allows reaching and maintaining the low-temperature operating conditions of the superconducting devices needed in particle accelerators. To limit the scope of the task, and not to duplicate coverage of cryogenic engineering concepts particularly relevant to superconducting magnets that can be found in previous CAS editions, the overview presented in this course focuses on superconducting radio-frequency cavities.

  20. MARKOWITZ' MODEL WITH FUNDAMENTAL AND TECHNICAL ANALYSIS – COMPLEMENTARY METHODS OR NOT

    Directory of Open Access Journals (Sweden)

    Branka Marasović

    2011-02-01

    Full Text Available As is well known, there are a few "starting points" in the portfolio optimization process, i.e. in the stock selection process. The famous Markowitz optimization model is unavoidable in this job. On the other side, someone may say that the indicators of fundamental analysis must be the starting point. Besides that, the suggestions of technical analysis must be taken into consideration. There are numerous studies of each approach separately, but it is almost impossible to find research combining these approaches into a logical and efficient unity. The main task of the paper is to find out whether these approaches are complementary and, if they are, how to apply them as one efficient unified process. The empirical part of the study uses a share sample from the Croatian stock market. Besides Markowitz's MV model and fundamental and technical analysis, an original multi-criteria approach also plays a big role in the paper.
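As a reminder of the Markowitz MV machinery the paper builds on, the two-asset minimum-variance portfolio has a textbook closed form. The variances and covariance below are invented illustrations, not the Croatian-market sample used in the study:

```python
def min_variance_weights(var1, var2, cov12):
    """Closed-form weights of the two-asset minimum-variance portfolio."""
    w1 = (var2 - cov12) / (var1 + var2 - 2 * cov12)
    return w1, 1.0 - w1

def portfolio_variance(w1, w2, var1, var2, cov12):
    return w1 ** 2 * var1 + w2 ** 2 * var2 + 2 * w1 * w2 * cov12

# Illustrative annualized return variances and covariance.
w1, w2 = min_variance_weights(var1=0.04, var2=0.09, cov12=0.01)
pv = portfolio_variance(w1, w2, 0.04, 0.09, 0.01)
```

Diversification shows up directly: the minimum-variance portfolio's variance is below that of either asset alone, which is the property fundamental or technical screens are then layered on top of.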

  1. Bayesian analysis of rotating machines - A statistical approach to estimate and track the fundamental frequency

    DEFF Research Database (Denmark)

    Pedersen, Thorkild Find

    2003-01-01

    Rotating and reciprocating mechanical machines emit acoustic noise and vibrations when they operate. Typically, the noise and vibrations are concentrated in narrow frequency bands related to the running speed of the machine. The frequency of the running speed is referred to as the fundamental frequency and the related frequencies as orders of the fundamental frequency. When analyzing rotating or reciprocating machines it is important to know the running speed. Usually this requires direct access to the rotating parts in order to mount a dedicated tachometer probe. In this thesis different ...
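A tachometer-free estimate of the fundamental frequency can be sketched with plain autocorrelation. This stands in for, and is far simpler than, the Bayesian estimators the thesis develops; the sample rate, running speed and order content below are invented:

```python
import math

FS = 1000.0   # sample rate, Hz
F0 = 25.0     # true running speed, Hz (to be recovered from the signal)
N = 2000

# Vibration signal: fundamental plus a second-order component.
signal = [math.sin(2 * math.pi * F0 * n / FS)
          + 0.5 * math.sin(2 * math.pi * 2 * F0 * n / FS)
          for n in range(N)]

def estimate_f0(x, fs, fmin=5.0, fmax=200.0):
    """Pick the autocorrelation peak within a plausible lag range."""
    lags = range(int(fs / fmax), int(fs / fmin) + 1)

    def acf(lag):
        return sum(x[n] * x[n + lag] for n in range(len(x) - lag))

    best_lag = max(lags, key=acf)
    return fs / best_lag

f0_hat = estimate_f0(signal, FS)
```

The autocorrelation peaks at the signal's period (40 samples here), and the orders reinforce rather than confuse the estimate because their periods divide the fundamental's.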

  2. DISCRETE MATHEMATICS AS FUNDAMENTAL DISCIPLINE IS IN SYSTEM OF MATHEMATICAL PREPARATION OF FUTURE SOFTWARE ENGINEER

    OpenAIRE

    D. Shchedrolosev

    2010-01-01

    Fundamental mathematical background is an important part of training future engineers and programmers. The paper considers existing approaches to teaching the fundamentals of discrete mathematics to IT-profile specialists; a comparative analysis of modern textbooks on discrete mathematics for IT professionals was conducted

  3. Fundamentals of boiling water reactor (BWR)

    International Nuclear Information System (INIS)

    Bozzola, S.

    1982-01-01

    These lectures on fundamentals of BWR reactor physics are a synthesis of known and established concepts. These lectures are intended to be a comprehensive (even though descriptive in nature) presentation, which would give the basis for a fair understanding of power operation, fuel cycle and safety aspects of the boiling water reactor. The fundamentals of BWR reactor physics are oriented to design and operation. In the first lecture general description of BWR is presented, with emphasis on the reactor physics aspects. A survey of methods applied in fuel and core design and operation is presented in the second lecture in order to indicate the main features of the calculational tools. The third and fourth lectures are devoted to review of BWR design bases, reactivity requirements, reactivity and power control, fuel loading patterns. Moreover, operating limits are reviewed, as the actual limits during power operation and constraints for reactor physics analyses (design and operation). The basic elements of core management are also presented. The constraints on control rod movements during the achieving of criticality and low power operation are illustrated in the fifth lecture. Some considerations on plant transient analyses are also presented in the fifth lecture, in order to show the impact between core and fuel performance and plant/system performance. The last (sixth) lecture is devoted to the open vessel testing during the startup of a commercial BWR. A control rod calibration is also illustrated. (author)

  4. COMPARATIVE ANALYSIS BETWEEN THE FUNDAMENTAL AND TECHNICAL ANALYSIS OF STOCKS

    Directory of Open Access Journals (Sweden)

    Nada Petrusheva

    2016-04-01

    Full Text Available In the world of investing and trading, in order to have a definite advantage and constantly create profit, you need to have a strategic approach. Generally speaking, the two main schools of thought and strategies in financial markets are fundamental and technical analysis. Fundamental and technical analysis differ in several aspects, such as their way of functioning and execution, the time horizon used, the tools used and their objectives. These differences lead to certain advantages and disadvantages of each of the analyses. Fundamental and technical analysis are also the subject of critical review by the academic and scientific community, and many of these reviews concern the methods of their application, i.e. the possibility of combining the two analyses and using them complementarily to fully utilize their strengths and advantages.

  5. Fundamentals of semiconductor manufacturing and process control

    CERN Document Server

    May, Gary S

    2006-01-01

    A practical guide to semiconductor manufacturing from process control to yield modeling and experimental design Fundamentals of Semiconductor Manufacturing and Process Control covers all issues involved in manufacturing microelectronic devices and circuits, including fabrication sequences, process control, experimental design, process modeling, yield modeling, and CIM/CAM systems. Readers are introduced to both the theory and practice of all basic manufacturing concepts. Following an overview of manufacturing and technology, the text explores process monitoring methods, including those that focus on product wafers and those that focus on the equipment used to produce wafers. Next, the text sets forth some fundamentals of statistics and yield modeling, which set the foundation for a detailed discussion of how statistical process control is used to analyze quality and improve yields. The discussion of statistical experimental design offers readers a powerful approach for systematically varying controllable p...
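The statistical process control discussion summarized above rests on control limits of the Shewhart type: a process measurement is flagged when it falls outside the mean ± 3σ band established from in-control baseline data. A minimal sketch (the measurement values are invented, not from the book):

```python
import statistics

def control_limits(baseline, k=3.0):
    """Shewhart-style lower/upper control limits from in-control data."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)   # sample standard deviation
    return mu - k * sigma, mu + k * sigma

# Hypothetical in-control measurements of, say, an oxide thickness (nm).
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
lcl, ucl = control_limits(baseline)

# Flag new measurements that fall outside the control band.
out_of_control = [x for x in [10.0, 10.05, 11.5] if not (lcl <= x <= ucl)]
```

Points inside the band are treated as common-cause variation; a point outside it triggers investigation, which is the mechanism the book connects to yield improvement.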

  6. Exactly soluble dynamics of (p,q) string near macroscopic fundamental strings

    International Nuclear Information System (INIS)

    Bak, Dongsu; Rey, Soojong; Yee, Houng

    2004-01-01

    We study the dynamics of a type-IIB bound state of a Dirichlet string and n fundamental strings in the background of N fundamental strings. Because of the supergravity potential, the bound-state string is pulled toward the background fundamental strings, and its motion is described by an open-string rolling radion field. The string coupling can be made controllably weak and, in the limit 1 ≪ g_st²n ≪ g_st²N, the bound-state energy involved is small compared to the string scale. We thus propose the rolling dynamics of the open-string radion in this system as an exactly solvable analog for the rolling dynamics of the open-string tachyon in a decaying D-brane. The dynamics bears a novel feature that the worldsheet electric field increases monotonically to the critical value as the bound-state string falls into the background string. Close to the background string, the D-string constituent inside the bound-state string decouples from the fundamental string constituents. (author)

  7. DISCRETE MATHEMATICS AS FUNDAMENTAL DISCIPLINE IS IN SYSTEM OF MATHEMATICAL PREPARATION OF FUTURE SOFTWARE ENGINEER

    Directory of Open Access Journals (Sweden)

    D. Shchedrolosev

    2010-04-01

    Full Text Available Fundamental mathematical background is an important part of training future engineers and programmers. The paper considers existing approaches to teaching the fundamentals of discrete mathematics to IT-profile specialists; a comparative analysis of modern textbooks on discrete mathematics for IT professionals was conducted

  8. DOE fundamentals handbook: Material science

    International Nuclear Information System (INIS)

    1993-01-01

    This handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of the structure and properties of metals. This volume contains the following modules: thermal shock (thermal stress, pressurized thermal shock), brittle fracture (mechanism, minimum pressurization-temperature curves, heatup/cooldown rate limits), and plant materials (properties considered when selecting materials, fuel materials, cladding and reflectors, control materials, nuclear reactor core problems, plant material problems, atomic displacement due to irradiation, thermal and displacement spikes due to irradiation, neutron capture effect, radiation effects in organic compounds, reactor use of aluminum)

  9. Fundamentals of electronics

    CERN Document Server

    Schubert, Thomas F

    2015-01-01

    This book, Electronic Devices and Circuit Application, is the first of four books of a larger work, Fundamentals of Electronics. It is comprised of four chapters describing the basic operation of each of the four fundamental building blocks of modern electronics: operational amplifiers, semiconductor diodes, bipolar junction transistors, and field effect transistors. Attention is focused on the reader obtaining a clear understanding of each of the devices when it is operated in equilibrium. Ideas fundamental to the study of electronic circuits are also developed in the book at a basic level to

  10. Approach to DOE threshold guidance limits

    International Nuclear Information System (INIS)

    Shuman, R.D.; Wickham, L.E.

    1984-01-01

    The need for less restrictive criteria governing disposal of extremely low-level radioactive waste has long been recognized. The Low-Level Waste Management Program has been directed by the Department of Energy (DOE) to aid in the development of a threshold guidance limit for DOE low-level waste facilities. Project objectives are concerned with the definition of a threshold limit dose and pathway analysis of radionuclide transport within selected exposure scenarios at DOE sites. Results of the pathway analysis will be used to determine waste radionuclide concentration guidelines that meet the defined threshold limit dose. Methods of measurement and verification of concentration limits round out the project's goals. Work on defining a threshold limit dose is nearing completion. Pathway analysis of sanitary landfill operations at the Savannah River Plant and the Idaho National Engineering Laboratory is in progress using the DOSTOMAN computer code. Concentration limit calculations and determination of implementation procedures shall follow completion of the pathways work. 4 references

  11. Fundamentals of gas dynamics

    CERN Document Server

    Babu, V

    2014-01-01

    Fundamentals of Gas Dynamics, Second Edition is a comprehensively updated new edition and now includes a chapter on the gas dynamics of steam. It covers the fundamental concepts and governing equations of different flows, and includes end-of-chapter exercises based on practical applications. A number of useful tables on the thermodynamic properties of steam are also included. Fundamentals of Gas Dynamics, Second Edition begins with an introduction to compressible and incompressible flows before covering the fundamentals of one-dimensional flows and normal shock waves

  12. A Fundamental Parameter-Based Calibration Model for an Intrinsic Germanium X-Ray Fluorescence Spectrometer

    DEFF Research Database (Denmark)

    Christensen, Leif Højslet; Pind, Niels

    1982-01-01

    A matrix-independent fundamental parameter-based calibration model for an energy-dispersive X-ray fluorescence spectrometer has been developed. This model, which is part of a fundamental parameter approach quantification method, accounts for both the excitation and detection probability. For each secondary target a number of relative calibration constants are calculated on the basis of knowledge of the irradiation geometry, the detector specifications, and tabulated fundamental physical parameters. The absolute calibration of the spectrometer is performed by measuring one pure element standard per...

  13. Secret Key Agreement: Fundamental Limits and Practical Challenges

    KAUST Repository

    Rezki, Zouheir

    2017-02-15

    Despite the tremendous progress made toward establishing PLS as a new paradigm to guarantee security of communication systems at the physical layer, there is a common belief among researchers and industry that many practical challenges prevent PLS from flourishing at the industrial scale. Most secure message transmission constructions available to date are tied to strong assumptions on CSI, consider simple channel models, and underestimate eavesdropping capabilities, thus compromising their practical interest to a large extent. Arguably, the most reasonable way to leverage the potential of PLS in securing modern wireless communication systems is via secret-key agreement. In the latter setting, the legitimate parties try to agree on a key by exploiting the availability of a high-capacity public channel that is also accessible to the eavesdropper. Once a key is shared by the legitimate parties, they may use it in a one-time pad encryption, for instance. In this article, we investigate two performance limits of secret-key agreement communications, namely the secret-key diversity-multiplexing trade-off and the effect of transmit correlation on the secret-key capacity. We show via examples how secret-key agreement offers more flexibility than secure message transmission. Finally, we explore a few challenges of the secret-key agreement concept and propose a few guidelines to overcome them.
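Once the legitimate parties share a secret key, the one-time pad mentioned above is a single XOR of message and key bytes. A minimal sketch, illustrative only:

```python
import secrets

def otp_encrypt(key: bytes, message: bytes) -> bytes:
    # One-time pad: XOR each message byte with a key byte. The key must be
    # at least as long as the message and must never be reused.
    assert len(key) >= len(message)
    return bytes(m ^ k for m, k in zip(message, key))

# the same XOR operation decrypts
message = b"agreed at the physical layer"
key = secrets.token_bytes(len(message))  # stands in for the agreed secret key
ciphertext = otp_encrypt(key, message)
assert otp_encrypt(key, ciphertext) == message
```

With a truly random, single-use key of message length, the scheme is information-theoretically secure, which is why key agreement is the hard part.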

  14. Nutrient co-limitation at the boundary of an oceanic gyre

    Science.gov (United States)

    Browning, Thomas J.; Achterberg, Eric P.; Rapp, Insa; Engel, Anja; Bertrand, Erin M.; Tagliabue, Alessandro; Moore, C. Mark

    2017-11-01

    Nutrient limitation of oceanic primary production exerts a fundamental control on marine food webs and the flux of carbon into the deep ocean. The extensive boundaries of the oligotrophic sub-tropical gyres collectively define the most extreme transition in ocean productivity, but little is known about nutrient limitation in these zones. Here we present the results of full-factorial nutrient amendment experiments conducted at the eastern boundary of the South Atlantic gyre. We find extensive regions in which the addition of nitrogen or iron individually resulted in no significant phytoplankton growth over 48 hours. However, the addition of both nitrogen and iron increased concentrations of chlorophyll a by up to approximately 40-fold, led to diatom proliferation, and reduced community diversity. Once nitrogen-iron co-limitation had been alleviated, the addition of cobalt or cobalt-containing vitamin B12 could further enhance chlorophyll a yields by up to threefold. Our results suggest that nitrogen-iron co-limitation is pervasive in the ocean, with other micronutrients also approaching co-deficiency. Such multi-nutrient limitations potentially increase phytoplankton community diversity.

  15. Fundamental limits of scintillation detector timing precision

    International Nuclear Information System (INIS)

    Derenzo, Stephen E; Choong, Woon-Seng; Moses, William W

    2014-01-01

    In this paper we review the primary factors that affect the timing precision of a scintillation detector. Monte Carlo calculations were performed to explore the dependence of the timing precision on the number of photoelectrons, the scintillator decay and rise times, the depth of interaction uncertainty, the time dispersion of the optical photons (modeled as an exponential decay), the photodetector rise time and transit time jitter, the leading-edge trigger level, and electronic noise. The Monte Carlo code was used to estimate the practical limits on the timing precision for an energy deposition of 511 keV in 3 mm × 3 mm × 30 mm Lu₂SiO₅:Ce and LaBr₃:Ce crystals. The calculated timing precisions are consistent with the best experimental literature values. We then calculated the timing precision for 820 cases that sampled scintillator rise times from 0 to 1.0 ns, photon dispersion times from 0 to 0.2 ns, photodetector time jitters from 0 to 0.5 ns fwhm, and A from 10 to 10 000 photoelectrons per ns decay time. Since the timing precision R was found to depend on A^(−1/2) more than any other factor, we tabulated the parameter B, where R = B·A^(−1/2). An empirical analytical formula was found that fit the tabulated values of B with an rms deviation of 2.2% of the value of B. The theoretical lower bound of the timing precision was calculated for the example of 0.5 ns rise time, 0.1 ns photon dispersion, and 0.2 ns fwhm photodetector time jitter. The lower bound was at most 15% lower than leading-edge timing discrimination for A from 10 to 10 000 photoelectrons ns⁻¹. A timing precision of 8 ps fwhm should be possible for an energy deposition of 511 keV using currently available photodetectors if a theoretically possible scintillator were developed that could produce 10 000 photoelectrons ns⁻¹. (paper)
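The main ingredients of such a Monte Carlo can be sketched as follows: emission times drawn from the standard biexponential pulse shape (the sum of an exponential rise variate and an exponential decay variate), Gaussian photodetector transit-time jitter, and a leading-edge trigger on the k-th photoelectron. All parameter values below are illustrative assumptions, not the paper's:

```python
import math, random

def trigger_time_fwhm(n_pe, decay_ns=40.0, rise_ns=0.5,
                      jitter_fwhm_ns=0.2, k_trigger=3,
                      n_events=2000, seed=1):
    """Monte Carlo FWHM of a leading-edge trigger (k-th photoelectron).

    Each photoelectron's arrival is an exponential decay variate plus an
    exponential rise variate (this sum has the biexponential pulse shape)
    plus Gaussian transit-time jitter.
    """
    rng = random.Random(seed)
    sigma_j = jitter_fwhm_ns / 2.3548  # convert FWHM to Gaussian sigma
    triggers = []
    for _ in range(n_events):
        arrivals = sorted(
            rng.expovariate(1.0 / decay_ns) + rng.expovariate(1.0 / rise_ns)
            + rng.gauss(0.0, sigma_j)
            for _ in range(n_pe)
        )
        triggers.append(arrivals[k_trigger - 1])  # k-th photoelectron triggers
    mu = sum(triggers) / len(triggers)
    var = sum((t - mu) ** 2 for t in triggers) / (len(triggers) - 1)
    return 2.3548 * math.sqrt(var)  # Gaussian-equivalent FWHM

# timing improves roughly as 1/sqrt(N) with photoelectron statistics
print(trigger_time_fwhm(400), trigger_time_fwhm(4000))
```

Sweeping `n_pe` reproduces the qualitative A^(−1/2) scaling discussed in the abstract, down to the floor set by the jitter term.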

  16. Exchange Rates and Fundamentals.

    Science.gov (United States)

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I(1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…

  17. On weighted spaces without a fundamental sequence of bounded sets

    International Nuclear Information System (INIS)

    Olaleru, J.O.

    2001-09-01

    The problem of the countable quasibarrelledness of weighted spaces of continuous functions, for which there are no results in the general setting of weighted spaces, is tackled in this paper. This leads to the study of the quasibarrelledness of weighted spaces in which, unlike Ernst and Schnettler, though with a similar approach, we drop the assumption that the weighted space has a fundamental sequence of bounded sets. The study of the countable quasibarrelledness of weighted spaces naturally leads to definite results on weighted (DF)-spaces for those weighted spaces with a fundamental sequence of bounded sets. (author)

  18. Fundamentals and advanced techniques in derivatives hedging

    CERN Document Server

    Bouchard, Bruno

    2016-01-01

    This book covers the theory of derivatives pricing and hedging as well as techniques used in mathematical finance. The authors use a top-down approach, starting with fundamentals before moving to applications, and present theoretical developments alongside various exercises, providing many examples of practical interest. A large spectrum of concepts and mathematical tools that are usually found in separate monographs are presented here. In addition to the no-arbitrage theory in full generality, this book also explores models and practical hedging and pricing issues. Fundamentals and Advanced Techniques in Derivatives Hedging further introduces advanced methods in probability and analysis, including Malliavin calculus and the theory of viscosity solutions, as well as the recent theory of stochastic targets and its use in risk management, making it the first textbook covering this topic. Graduate students in applied mathematics with an understanding of probability theory and stochastic calculus will find this b...

  19. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, M.; Frehen, R.; Schotman, P.C.; Bauer, R.

    2010-01-01

    This paper proposes a novel approach for estimating time-varying betas of individual stocks that incorporates prior information based on fundamentals. We shrink the rolling window estimate of beta towards a firm-specific prior that is motivated by asset pricing theory. The prior captures structural
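Shrinking a rolling-window estimate toward a prior can be sketched as a precision-weighted average (a generic Bayesian-style estimator, not the authors' exact specification; all numbers below are hypothetical):

```python
def shrink_beta(beta_rolling, var_rolling, beta_prior, var_prior):
    """Precision-weighted shrinkage of a rolling-window beta toward a prior.

    The combined estimate weights the sample beta and the fundamentals-based
    prior by their inverse variances (precisions): the noisier the rolling
    estimate, the harder it is pulled toward the prior.
    """
    w = (1.0 / var_rolling) / (1.0 / var_rolling + 1.0 / var_prior)
    return w * beta_rolling + (1.0 - w) * beta_prior

# noisy rolling estimate of 1.8 pulled toward a firm-specific prior of 1.1
print(shrink_beta(1.8, var_rolling=0.25, beta_prior=1.1, var_prior=0.05))
```

The result always lies between the rolling estimate and the prior, with the weight controlled entirely by the two variances.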

  20. The fundamental determinants of financial integration in the European Union

    NARCIS (Netherlands)

    Lemmen, J.J.G.; Eijffinger, S.C.W.

    1995-01-01

    This paper focuses on the fundamental determinants of the degree of financial integration in the European Union over the period 1973-1993. Using closed interest differentials to measure the intensity of capital controls and applying a panel data approach, we find realized inflation rates, government

  1. Catalyst design for enhanced sustainability through fundamental surface chemistry.

    Science.gov (United States)

    Personick, Michelle L; Montemore, Matthew M; Kaxiras, Efthimios; Madix, Robert J; Biener, Juergen; Friend, Cynthia M

    2016-02-28

    Decreasing energy consumption in the production of platform chemicals is necessary to improve the sustainability of the chemical industry, which is the largest consumer of delivered energy. The majority of industrial chemical transformations rely on catalysts, and therefore designing new materials that catalyse the production of important chemicals via more selective and energy-efficient processes is a promising pathway to reducing energy use by the chemical industry. Efficiently designing new catalysts benefits from an integrated approach involving fundamental experimental studies and theoretical modelling in addition to evaluation of materials under working catalytic conditions. In this review, we outline this approach in the context of a particular catalyst, nanoporous gold (npAu), which is an unsupported, dilute AgAu alloy catalyst that is highly active for the selective oxidative transformation of alcohols. Fundamental surface science studies on Au single crystals and AgAu thin-film alloys in combination with theoretical modelling were used to identify the principles which define the reactivity of npAu and subsequently enabled prediction of new reactive pathways on this material. Specifically, weak van der Waals interactions are key to the selectivity of Au materials, including npAu. We also briefly describe other systems in which this integrated approach was applied. © 2016 The Author(s).

  2. Radiology fundamentals

    CERN Document Server

    Singh, Harjit

    2011-01-01

    "Radiology Fundamentals" is a concise introduction to the dynamic field of radiology for medical students, non-radiology house staff, physician assistants, nurse practitioners, radiology assistants, and other allied health professionals. The goal of the book is to provide readers with general examples and brief discussions of basic radiographic principles and to serve as a curriculum guide, supplementing a radiology education and providing a solid foundation for further learning. Introductory chapters provide readers with the fundamental scientific concepts underlying the medical use of imaging

  3. Fundamentals of Coherent Synchrotron Radiation in Storage Rings

    International Nuclear Information System (INIS)

    Sannibale, F.; Byrd, J.M.; Loftsdottir, A.; Martin, M.C.; Venturini, M.

    2004-01-01

    We present the fundamental concepts for producing stable broadband coherent synchrotron radiation (CSR) in the terahertz frequency region in an electron storage ring. The analysis includes distortion of bunch shape from the synchrotron radiation (SR), enhancing higher frequency coherent emission and limits to stable emission due to a microbunching instability excited by the SR. We use these concepts to optimize the performance of a source for CSR emission

  4. Fundamental care guided by the Careful Nursing Philosophy and Professional Practice Model©.

    Science.gov (United States)

    Meehan, Therese Connell; Timmins, Fiona; Burke, Jacqueline

    2018-02-05

    To propose the Careful Nursing Philosophy and Professional Practice Model © as a conceptual and practice solution to current fundamental nursing care erosion and deficits. There is growing awareness of the crucial importance of fundamental care. Efforts are underway to heighten nurses' awareness of values that motivate fundamental care and thereby increase their attention to effective provision of fundamental care. However, there remains a need for nursing frameworks which motivate nurses to bring fundamental care values to life in their practice and strengthen their commitment to provide fundamental care. This descriptive position paper builds on the Careful Nursing Philosophy and Professional Practice Model © (Careful Nursing). Careful Nursing elaborates explicit nursing values and addresses both relational and pragmatic aspects of nursing practice, offering an ideal guide to provision of fundamental nursing care. A comparative alignment approach is used to review the capacity of Careful Nursing to address fundamentals of nursing care. Careful Nursing provides a value-based comprehensive and practical framework which can strengthen clinical nurses' ability to articulate and control their practice and, thereby, more effectively fulfil their responsibility to provide fundamental care and measure its effectiveness. This explicitly value-based nursing philosophy and professional practice model offers nurses a comprehensive, pragmatic and engaging framework designed to strengthen their control over their practice and ability to provide high-quality fundamental nursing care. © 2018 John Wiley & Sons Ltd.

  5. Continuum mechanics using Mathematica fundamentals, methods, and applications

    CERN Document Server

    Romano, Antonio

    2014-01-01

    This textbook's methodological approach familiarizes readers with the mathematical tools required to correctly define and solve problems in continuum mechanics. Covering essential principles and fundamental applications, this second edition of Continuum Mechanics using Mathematica® provides a solid basis for a deeper study of more challenging and specialized problems related to nonlinear elasticity, polar continua, mixtures, piezoelectricity, ferroelectricity, magneto-fluid mechanics, and state changes (see A. Romano, A. Marasco, Continuum Mechanics: Advanced Topics and Research Trends, Springer (Birkhäuser), 2010, ISBN 978-0-8176-4869-5). Key topics and features: * Concise presentation strikes a balance between fundamentals and applications * Requisite mathematical background carefully collected in two introductory chapters and one appendix * Recent developments highlighted through coverage of more significant applications to areas such as wave propagation, fluid mechanics, porous media, linear elasticity....

  6. Skill-Based and Planned Active Play Versus Free-Play Effects on Fundamental Movement Skills in Preschoolers.

    Science.gov (United States)

    Roach, Lindsay; Keats, Melanie

    2018-01-01

    Fundamental movement skill interventions are important for promoting physical activity, but the optimal intervention model for preschool children remains unclear. We compared two 8-week interventions, a structured skill-station and a planned active play approach, to a free-play control condition on pre- and postintervention fundamental movement skills. We also collected data regarding program attendance and perceived enjoyment. We found a significant interaction effect between intervention type and time. A Tukey honest significant difference analysis supported a positive intervention effect showing a significant difference between both interventions and the free-play control condition. There was a significant between-group difference in group attendance such that mean attendance was higher for both the free-play and planned active play groups relative to the structured skill-based approach. There were no differences in attendance between free-play and planned active play groups, and there were no differences in enjoyment ratings between the two intervention groups. In sum, while both interventions led to improved fundamental movement skills, the active play approach offered several logistical advantages. Although these findings should be replicated, they can guide feasible and sustainable fundamental movement skill programs within day care settings.

  7. Correlated motions are a fundamental property of β-sheets

    Science.gov (United States)

    Fenwick, R. Bryn; Orellana, Laura; Esteban-Martín, Santi; Orozco, Modesto; Salvatella, Xavier

    2014-06-01

    Correlated motions in proteins can mediate fundamental biochemical processes such as signal transduction and allostery. The mechanisms that underlie these processes remain largely unknown due mainly to limitations in their direct detection. Here, based on a detailed analysis of protein structures deposited in the protein data bank, as well as on state-of-the art molecular simulations, we provide general evidence for the transfer of structural information by correlated backbone motions, mediated by hydrogen bonds, across β-sheets. We also show that the observed local and long-range correlated motions are mediated by the collective motions of β-sheets and investigate their role in large-scale conformational changes. Correlated motions represent a fundamental property of β-sheets that contributes to protein function.

  8. The pair potential approach for interfaces: Fundamental problems and practical solutions

    International Nuclear Information System (INIS)

    Maggs, A.C.; Ashcroft, N.W.

    1987-09-01

    A fundamental problem in the use of a central pair-force model for defect problems is that it omits three-body and higher terms which are necessarily present in real systems. Electronic fluctuation effects are also usually omitted. While these can be small in the simple metals, they are significant in noble and transition metals, as shown by a simple real-space argument. To gauge the importance of their effects in interface problems, the structure of a simple Σ5 twist boundary is examined, with the atoms described by both pair- and three-center interactions and as a function of the relative strength of the two. 15 refs

  9. Electromagnetic Scattering by a Morphologically Complex Object: Fundamental Concepts and Common Misconceptions

    Science.gov (United States)

    Mischenko, Michael I.; Travis, Larry D.; Cairns, Brian; Tishkovets, Victor P.; Dlugach, Janna M.; Rosenbush, Vera K.; Kiselev, Nikolai N.

    2011-01-01

    Following Keller (Proc Symp Appl Math 1962;13:227-46), we classify all theoretical treatments of electromagnetic scattering by a morphologically complex object into first-principle (or "honest" in Keller's terminology) and phenomenological (or "dishonest") categories. This helps us identify, analyze, and dispel several profound misconceptions widespread in the discipline of electromagnetic scattering by solitary particles and discrete random media. Our goal is not to call for a complete renunciation of phenomenological approaches but rather to encourage a critical and careful evaluation of their actual origin, virtues, and limitations. In other words, we do not intend to deter creative thinking in terms of phenomenological short-cuts, but we do want to raise awareness when we stray (often for practical reasons) from the fundamentals. The main results and conclusions are illustrated by numerically-exact data based on direct numerical solutions of the macroscopic Maxwell equations.

  10. Interface Induced Carbonate Mineralization: A Fundamental Geochemical Process Relevant to Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Teng, H. Henry [PI, The George Washington University; Xu, Huifang [Co-PI, University of Wisconsin-Madison

    2013-07-17

    We have approached the long-standing geochemical question why anhydrous high-Mg carbonate minerals (i.e., magnesite and dolomite) cannot be formed at ambient conditions from a new perspective by exploring the formation of MgCO₃ and MgₓCa₁₋ₓCO₃ in non-aqueous solutions. Data collected from our experiments in this funding period suggest that a fundamental barrier, other than cation hydration, exists that prevents Mg²⁺ and CO₃²⁻ ions from forming long-range ordered structures. We propose that this barrier mainly stems from the lattice limitation on the spatial configuration of CO₃ groups in magnesite crystals. On the other hand, the measured higher distribution coefficients of Mg between magnesian calcites formed in the absence and presence of water give us a first direct proof to support and quantify the cation hydration effect.

  11. Approach to the thermodynamic limit in lattice QCD at μ≠0

    International Nuclear Information System (INIS)

    Splittorff, K.; Verbaarschot, J. J. M.

    2008-01-01

    The expectation value of the complex phase factor of the fermion determinant is computed to leading order in the p-expansion of the chiral Lagrangian. The computation is valid for μ < m_π/2 and determines the dependence of the sign problem on the volume and on the geometric shape of the volume. In the thermodynamic limit with L_i → ∞ at fixed temperature 1/L_0, the average phase factor vanishes. In the low-temperature limit where L_i/L_0 is fixed as L_i becomes large, the average phase factor approaches 1 for μ < m_π/2. The results for a finite volume compare well with lattice results obtained by Allton et al. After taking appropriate limits, we reproduce previously derived results for the ε regime and for one-dimensional QCD. The distribution of the phase itself is also computed
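Why the average phase factor must vanish in the thermodynamic limit can be stated schematically; this is the standard free-energy argument (here written for the phase-quenched ensemble), not the paper's explicit one-loop expression:

```latex
% The average phase factor is a ratio of two partition functions, and
% therefore exponential in the free-energy density difference times the
% volume V; it vanishes as V -> infinity unless the densities coincide.
\left\langle e^{2i\theta} \right\rangle_{\mathrm{pq}}
  \;=\; \frac{Z_{\mathrm{full}}}{Z_{\mathrm{pq}}}
  \;=\; e^{-V\,\Delta F(\mu, T)},
\qquad \Delta F \ge 0 .
```

When ΔF → 0, the factor approaches 1, consistent with the low-temperature behavior for μ < m_π/2 quoted in the abstract.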

  12. The detection and stabilisation of limit cycle for deterministic finite automata

    Science.gov (United States)

    Han, Xiaoguang; Chen, Zengqiang; Liu, Zhongxin; Zhang, Qing

    2018-04-01

    In this paper, the topological structure properties of deterministic finite automata (DFA), under the framework of the semi-tensor product of matrices, are investigated. First, the dynamics of DFA are converted into a new algebraic form as a discrete-time linear system by means of Boolean algebra. Using this algebraic description, an approach for calculating the limit cycles of different lengths is given. Second, we present two fundamental concepts, namely the domain of attraction of a limit cycle and the prereachability set. Based on the prereachability set, an explicit solution for calculating the domain of attraction of a limit cycle is completely characterised. Third, we define the globally attractive limit cycle, and then give a necessary and sufficient condition for verifying whether all state trajectories of a DFA enter a given limit cycle in a finite number of transitions. Fourth, the problem of whether a DFA can be stabilised to a limit cycle by a state feedback controller is discussed, and criteria for limit-cycle stabilisation are established. All state feedback controllers that implement the minimal-length trajectories from each state to the limit cycle are obtained by using the proposed algorithm. Finally, an illustrative example is presented to show the theoretical results.
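Under a fixed state-feedback law, a DFA's closed-loop dynamics reduce to a function on a finite state set, so every trajectory eventually enters a limit cycle. The sketch below (a plain functional-graph traversal, not the paper's semi-tensor product machinery) computes all limit cycles and their domains of attraction for a hypothetical six-state automaton:

```python
def limit_cycles_and_basins(step, states):
    """Find all limit cycles of a deterministic transition map and the
    domain of attraction (basin) of each.

    `step` maps each state to its unique successor, so every trajectory
    eventually closes into a cycle.
    """
    cycles = []
    cycle_of = {}  # state -> index of the cycle its trajectory reaches
    for s in states:
        path, seen = [], {}
        x = s
        while x not in cycle_of and x not in seen:
            seen[x] = len(path)
            path.append(x)
            x = step(x)
        if x in seen:                 # closed a brand-new cycle
            cycles.append(tuple(path[seen[x]:]))
            target = len(cycles) - 1
        else:                         # merged into an already-classified state
            target = cycle_of[x]
        for y in path:
            cycle_of[y] = target
    basins = [[] for _ in cycles]
    for s in states:
        basins[cycle_of[s]].append(s)
    return cycles, basins

# toy automaton: 0 -> 1 -> 2 -> 0 is a limit cycle; 3, 4, 5 feed into it
succ = {0: 1, 1: 2, 2: 0, 3: 2, 4: 3, 5: 4}
cycles, basins = limit_cycles_and_basins(succ.__getitem__, range(6))
print(cycles)  # [(0, 1, 2)]
print(basins)  # [[0, 1, 2, 3, 4, 5]]
```

Here the single cycle is globally attractive because its basin is the whole state set, mirroring the paper's notion of a globally attractive limit cycle.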

  13. The point of no return: A fundamental limit on the ability to control thought and action.

    Science.gov (United States)

    Logan, Gordon D

    2015-01-01

    Bartlett (1958. Thinking. New York: Basic Books) described the point of no return as a point of irrevocable commitment to action, which was preceded by a period of gradually increasing commitment. As such, the point of no return reflects a fundamental limit on the ability to control thought and action. I review the literature on the point of no return, taking three perspectives. First, I consider the point of no return from the perspective of the controlled act, as a locus in the architecture and anatomy of the underlying processes. I review experiments from the stop-signal paradigm that suggest that the point of no return is located late in the response system. Then I consider the point of no return from the perspective of the act of control that tries to change the controlled act before it becomes irrevocable. From this perspective, the point of no return is a point in time that provides enough "lead time" for the act of control to take effect. I review experiments that measure the response time to the stop signal as the lead time required for response inhibition in the stop-signal paradigm. Finally, I consider the point of no return in hierarchically controlled tasks, in which there may be many points of no return at different levels of the hierarchy. I review experiments on skilled typing that suggest different points of no return for the commands that determine what is typed and the countermands that inhibit typing, with increasing commitment to action the lower the level in the hierarchy. I end by considering the point of no return in perception and thought as well as action.

  14. DISSOLVED CONCENTRATION LIMITS OF RADIOACTIVE ELEMENTS

    Energy Technology Data Exchange (ETDEWEB)

    NA

    2004-11-22

    The purpose of this study is to evaluate dissolved concentration limits (also referred to as solubility limits) of elements with radioactive isotopes under probable repository conditions, based on geochemical modeling calculations using geochemical modeling tools, thermodynamic databases, field measurements, and laboratory experiments. The scope of this modeling activity is to predict dissolved concentrations or solubility limits for 14 elements with radioactive isotopes (actinium, americium, carbon, cesium, iodine, lead, neptunium, plutonium, protactinium, radium, strontium, technetium, thorium, and uranium) important to calculated dose. Model outputs for uranium, plutonium, neptunium, thorium, americium, and protactinium are in the form of tabulated functions with pH and log fCO₂ as independent variables, plus one or more uncertainty terms. The solubility limits for the remaining elements are either in the form of distributions or single values. The output data from this report are fundamental inputs for Total System Performance Assessment for the License Application (TSPA-LA) to determine the estimated release of these elements from waste packages and the engineered barrier system. Consistent modeling approaches and environmental conditions were used to develop solubility models for all of the actinides. These models cover broad ranges of environmental conditions so that they are applicable to both waste packages and the invert. Uncertainties from thermodynamic data, water chemistry, temperature variation, and activity coefficients have been quantified or otherwise addressed.
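A tabulated function of two independent variables plus an additive uncertainty term can be evaluated by bilinear interpolation. The grid and values below are hypothetical placeholders for illustration, not values from the report:

```python
import bisect

def solubility_log10(ph, log_fco2, ph_grid, f_grid, table, uncertainty=0.0):
    """Bilinear interpolation of a tabulated log10 solubility limit.

    `table[i][j]` holds log10(mol/L) at (ph_grid[i], f_grid[j]);
    `uncertainty` is an additive term in log10 units.
    """
    def bracket(grid, v):
        # index of the cell containing v, clamped to the grid, plus the
        # fractional position of v within that cell
        i = min(max(bisect.bisect_right(grid, v) - 1, 0), len(grid) - 2)
        t = (v - grid[i]) / (grid[i + 1] - grid[i])
        return i, t

    i, tx = bracket(ph_grid, ph)
    j, ty = bracket(f_grid, log_fco2)
    v = (table[i][j] * (1 - tx) * (1 - ty)
         + table[i + 1][j] * tx * (1 - ty)
         + table[i][j + 1] * (1 - tx) * ty
         + table[i + 1][j + 1] * tx * ty)
    return v + uncertainty

# hypothetical 2x2 grid for illustration only (not report values)
ph_grid, f_grid = [6.0, 8.0], [-4.0, -2.0]
table = [[-7.0, -6.0], [-8.0, -7.0]]
print(solubility_log10(7.0, -3.0, ph_grid, f_grid, table))  # -> -7.0
```

In a performance-assessment setting, the uncertainty term would typically be sampled from a distribution rather than fixed.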

  15. Correlator of fundamental and anti-symmetric Wilson loops in AdS/CFT correspondence

    International Nuclear Information System (INIS)

    Tai, T.-S.; Yamaguchi, Satoshi

    2007-01-01

    We study the correlator of two circular Wilson loops, one in the anti-symmetric representation and the other in the fundamental representation, in 4-dimensional N = 4 super Yang-Mills theory. This correlator has a good AdS dual, which is a system of a D5-brane and a fundamental string. We calculate the on-shell action of the string and clarify the Gross-Ooguri transition in this correlator. Some limiting cases are also examined

  16. Fundamental neutron physics

    International Nuclear Information System (INIS)

    Deslattes, R.; Dombeck, T.; Greene, G.; Ramsey, N.; Rauch, H.; Werner, S.

    1984-01-01

    Fundamental physics experiments of merit can be conducted at the proposed intense neutron sources. Areas of interest include: neutron particle properties, neutron wave properties, and fundamental physics utilizing reactor produced γ-rays. Such experiments require intense, full-time utilization of a beam station for periods ranging from several months to a year or more

  17. The fundamental problem of the Russian economy: human capital as a panacea from the raw disease of a country

    Directory of Open Access Journals (Sweden)

    Azyrkina Alexandra, S.

    2015-03-01

    Full Text Available The paper outlines the urgent problems of modern Russia and identifies the fundamental problem hindering the successful development of the national economy: the lack of diversification of the Russian economy. Two global solutions to this problem are proposed, one intensive and one extensive in nature, and a preference for the intensive approach to resolving the pressing problems is justified. The reference point of the intensive approach is the development of human capital. The paper characterizes human capital in general terms and traces the evolution of the concept up to its modern sense. The necessity of regarding human potential as an advantage of Russia is argued. Problems that hinder the development of human capital are identified and analyzed, and some methods for solving them are presented. The research also identifies Russia's advantages, from the point of view of human capital development, in comparison with other countries, and provides an analysis of the current situation with respect to the limitations and factors hindering successful human development.

  18. Serious limitations of the QTL/Microarray approach for QTL gene discovery

    Directory of Open Access Journals (Sweden)

    Warden Craig H

    2010-07-01

    Full Text Available Abstract Background It has been proposed that the use of gene expression microarrays in nonrecombinant parental or congenic strains can accelerate the process of isolating individual genes underlying quantitative trait loci (QTL). However, the effectiveness of this approach has not been assessed. Results Thirty-seven studies that have implemented the QTL/microarray approach in rodents were reviewed. About 30% of studies showed enrichment for QTL candidates, mostly in comparisons between congenic and background strains. Three studies led to the identification of an underlying QTL gene. To complement the literature results, a microarray experiment was performed using three mouse congenic strains isolating the effects of at least 25 biometric QTL. Results show that genes in the congenic donor regions were preferentially selected. However, within donor regions, the distribution of differentially expressed genes was homogeneous once gene density was accounted for. Genes within identical-by-descent (IBD) regions were less likely to be differentially expressed in chromosome 2, but not in chromosomes 11 and 17. Furthermore, QTL regulated in cis (cis eQTL) showed higher expression in the background genotype, which was partially explained by the presence of single nucleotide polymorphisms (SNP). Conclusions The literature shows limited successes from the QTL/microarray approach to identify QTL genes. Our own results from microarray profiling of three congenic strains revealed a strong tendency to select cis-eQTL over trans-eQTL. IBD regions had little effect on the rate of differential expression, and we provide several reasons why IBD should not be used to discard eQTL candidates. In addition, mismatch probes produced false cis-eQTL that could not be completely removed with the current strain genotypes and low-probe-density microarrays. The reviewed studies did not account for lack of coverage from the platforms used and therefore removed genes

  19. Fundamentals of sustainable neighbourhoods

    CERN Document Server

    Friedman, Avi

    2015-01-01

    This book introduces architects, engineers, builders, and urban planners to a range of design principles of sustainable communities and illustrates them with outstanding case studies. Drawing on the author’s experience as well as local and international case studies, Fundamentals of Sustainable Neighbourhoods presents planning concepts that minimize developments' carbon footprint through compact communities, adaptable and expandable dwellings, adaptable landscapes, and smaller-sized yet quality-designed housing. This book also: Examines in-depth global strategies for minimizing the residential carbon footprint, including district heating, passive solar gain, net-zero residences, as well as preserving the communities' natural assets Reconsiders conceptual approaches in building design and urban planning to promote a better connection between communities and nature Demonstrates practical applications of green architecture Focuses on innovative living spaces in urban environments

  20. Approaching the Hole Mobility Limit of GaSb Nanowires.

    Science.gov (United States)

    Yang, Zai-xing; Yip, SenPo; Li, Dapan; Han, Ning; Dong, Guofa; Liang, Xiaoguang; Shu, Lei; Hung, Tak Fu; Mo, Xiaoliang; Ho, Johnny C

    2015-09-22

    In recent years, high-mobility GaSb nanowires have received tremendous attention for high-performance p-type transistors; however, due to the difficulty in achieving thin and uniform nanowires (NWs), there have been limited reports until now addressing their diameter-dependent properties and their hole mobility limit in this important one-dimensional material system, all of which is essential information for the deployment of GaSb NWs in various applications. Here, by employing the newly developed surfactant-assisted chemical vapor deposition, high-quality and uniform GaSb NWs with controllable diameters, spanning from 16 to 70 nm, are successfully prepared, enabling the direct assessment of their growth orientation and hole mobility as a function of diameter while elucidating the role of the sulfur surfactant and the interplay between surface and interface energies of NWs on their electrical properties. The sulfur passivation is found to efficiently stabilize the high-energy NW sidewalls of (111) and (311) in order to yield the thin NWs, while thicker NWs (i.e., >40 nm in diameter) would grow along the most energy-favorable close-packed planes with the orientation of ⟨111⟩, supported by the approximate atomic models. Importantly, through the reliable control of sulfur passivation, growth orientation and surface roughness, GaSb NWs with a peak hole mobility of ∼400 cm(2) V(-1) s(-1) for the diameter of 48 nm, approaching the theoretical limit under the hole concentration of ∼2.2 × 10(18) cm(-3), can be achieved for the first time. All these indicate their promising potency for utilization in different technological domains.

  1. Islamic fundamentalism in Indonesia

    OpenAIRE

    Nagy, Sandra L.

    1996-01-01

    This is a study of Islamic fundamentalism in Indonesia. Islamic fundamentalism is defined as the return to the foundations and principles of Islam including all movements based on the desire to create a more Islamic society. After describing the practices and beliefs of Islam, this thesis examines the three aspects of universal Islamic fundamentalism: revivalism, resurgence, and radicalism. It analyzes the role of Islam in Indonesia under Dutch colonial rule, an alien Christian imperialist po...

  2. Developing Guided Worksheet for Cognitive Apprenticeship Approach in teaching Formal Definition of The Limit of A Function

    Science.gov (United States)

    Oktaviyanthi, R.; Dahlan, J. A.

    2018-04-01

    This study aims to develop student worksheets that correspond to the Cognitive Apprenticeship learning approach. The main subject in this student worksheet is Functions and Limits, with Continuity and Limits of Functions as its branch. Two indicators of learning achievement are intended to be developed in the student worksheet: (1) the student can explain the concept of limit by using the formal definition of limit, and (2) the student can evaluate the value of the limit of a function using epsilon and delta. The type of research used is development research that follows the Plomp model of product development. The research flow starts from literature review, observation, interviews, worksheet design, expert validity testing, and a limited trial on first-year students in academic year 2016-2017 in Universitas Serang Raya, STKIP Pelita Pratama Al-Azhar Serang, and Universitas Mathla’ul Anwar Pandeglang. Based on the product development results, the student worksheets that correspond to the Cognitive Apprenticeship learning approach are valid and reliable.
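
The second indicator, evaluating limits with epsilon and delta, rests on the formal definition, which can be stated as:

```latex
\lim_{x \to c} f(x) = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\; \exists \delta > 0 :\;\;
0 < |x - c| < \delta \;\Rightarrow\; |f(x) - L| < \varepsilon
```

A typical worksheet exercise then asks the student to exhibit a concrete δ(ε) for a given f, c and L.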

  3. Analysis on 'new fundamentals' and range of oil price trend in the long run

    Energy Technology Data Exchange (ETDEWEB)

    Rui, Chen

    2010-09-15

    The range of the long-run oil price trend will be decided by the marginal production cost of crude oil and the production cost of alternative energy consumed as transportation fuel on a large scale. The former factor determines the lower limit and the latter determines the upper limit of the oil price. Financial factors and the value of the USD will not only affect short-term changes in the oil price; they may also become fundamental factors that exert influence on the mid- to long-term change of the oil price, namely, 'New Fundamentals', which will determine the fluctuation degree of the oil price in the long run.

  4. Fundamentals of ergonomic exoskeleton robots

    NARCIS (Netherlands)

    Schiele, A.

    2008-01-01

    This thesis is the first to provide the fundamentals of ergonomic exoskeleton design. The fundamental theory as well as technology necessary to analyze and develop ergonomic wearable robots interacting with humans is established and validated by experiments and prototypes. The fundamentals are (1) a

  5. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  6. Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control

    Science.gov (United States)

    Deffner, Sebastian; Campbell, Steve

    2017-11-01

    One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader.
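
For orientation, the two named bounds on the minimal time τ for a quantum state to evolve into an orthogonal one are commonly written as:

```latex
\tau \;\geq\; \tau_{\mathrm{MT}} = \frac{\pi \hbar}{2\,\Delta E},
\qquad
\tau \;\geq\; \tau_{\mathrm{ML}} = \frac{\pi \hbar}{2\,\langle E \rangle}
```

where ΔE is the energy uncertainty (standard deviation) of the state and ⟨E⟩ its mean energy above the ground state; the unified quantum speed limit is the larger of the two bounds.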

  8. Nanothermodynamics: a subdivision potential approach

    Directory of Open Access Journals (Sweden)

    R. Moussavi

    2005-12-01

    Full Text Available Classical thermodynamic laws and relations have been developed for macroscopic systems that satisfy the thermodynamic limit. These relations are challenged as the system size decreases to the scale of nano-systems, in which thermodynamic properties are overshadowed by system size, and the usual classical concepts of extensivity and intensivity are no longer valid. The challenges that small systems pose to classical thermodynamics are demonstrated, and, via the approach introduced by Hill, the concept of subdivision potential is clarified in detail. The fundamental thermodynamic relations are obtained using a rational-based method.
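
Hill's construction can be summarised by the fundamental relation for an ensemble of N identical small systems; the notation below is the one commonly used in the nanothermodynamics literature, not taken from this paper:

```latex
\mathrm{d}U_t = T\,\mathrm{d}S_t - p\,\mathrm{d}V_t + \mu\,\mathrm{d}N_t + \mathcal{E}\,\mathrm{d}\mathcal{N}
```

Here the subscript t denotes ensemble totals and the subdivision potential \(\mathcal{E}\), conjugate to the number of subsystems \(\mathcal{N}\), carries the non-extensive contributions; it vanishes in the thermodynamic limit, where classical extensivity is recovered.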

  9. Fundamental Safety Principles

    International Nuclear Information System (INIS)

    Abdelmalik, W.E.Y.

    2011-01-01

    This work presents a summary of the IAEA Safety Standards Series publication No. SF-1, entitled Fundamental Safety Principles, published in 2006. This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purposes. Safety measures and security measures have in common the aim of protecting human life and health and the environment. These safety principles are: 1) Responsibility for safety, 2) Role of the government, 3) Leadership and management for safety, 4) Justification of facilities and activities, 5) Optimization of protection, 6) Limitation of risks to individuals, 7) Protection of present and future generations, 8) Prevention of accidents, 9) Emergency preparedness and response and 10) Protective action to reduce existing or unregulated radiation risks. The safety principles concern the security of facilities and activities to the extent that they apply to measures that contribute to both safety and security. Safety measures and security measures must be designed and implemented in an integrated manner so that security measures do not compromise safety and safety measures do not compromise security.

  10. The inductively coupled plasma as a source for the measurement of fundamental spectroscopic constants

    International Nuclear Information System (INIS)

    Farnsworth, P.B.

    1993-01-01

    Inductively coupled plasmas (ICPs) are stable, robust sources for the generation of spectra from neutral and singly ionized atoms. They are used extensively for analytical spectrometry, but have seen limited use for the measurement of fundamental spectroscopic constants. Several properties of the ICP affect its suitability for such fundamental measurements. They include: spatial structure, spectral background, noise characteristics, electron densities and temperatures, and the state of equilibrium in the plasma. These properties are particularly sensitive to the means by which foreign atoms are introduced into the plasma. With some departures from the operating procedures normally used in analytical measurements, the ICP promises to be a useful source for the measurement of fundamental atomic constants. (orig.)

  11. Forming Limits in Sheet Metal Forming for Non-Proportional Loading Conditions - Experimental and Theoretical Approach

    International Nuclear Information System (INIS)

    Ofenheimer, Aldo; Buchmayr, Bruno; Kolleck, Ralf; Merklein, Marion

    2005-01-01

    The influence of strain paths (loading history) on material formability is well known in sheet forming processes. Sophisticated experimental methods are used to determine the entire shape of strain paths of forming limits for aluminum AA6016-T4 alloy. Forming limits for sheet metal in as-received condition as well as for different pre-deformation are presented. A theoretical approach based on Arrieux's intrinsic Forming Limit Stress Curve (FLSC) concept is employed to numerically predict the influence of loading history on forming severity. The detailed experimental strain paths are used in the theoretical study instead of any linear or bilinear simplified loading histories to demonstrate the predictive quality of forming limits in the state of stress

  12. Financial fluctuations anchored to economic fundamentals: A mesoscopic network approach.

    Science.gov (United States)

    Sharma, Kiran; Gopalakrishnan, Balagopal; Chakrabarti, Anindya S; Chakraborti, Anirban

    2017-08-14

    We demonstrate the existence of an empirical linkage between nominal financial networks and the underlying economic fundamentals, across countries. We construct the nominal return correlation networks from daily data to encapsulate sector-level dynamics and infer the relative importance of the sectors in the nominal network through measures of centrality and clustering algorithms. Eigenvector centrality robustly identifies the backbone of the minimum spanning tree defined on the return networks as well as the primary cluster in the multidimensional scaling map. We show that the sectors that are relatively large in size, defined with three metrics, viz., market capitalization, revenue and number of employees, constitute the core of the return networks, whereas the periphery is mostly populated by relatively smaller sectors. Therefore, sector-level nominal return dynamics are anchored to the real size effect, which ultimately shapes the optimal portfolios for risk management. Our results are reasonably robust across 27 countries of varying degrees of prosperity and across periods of market turbulence (2008-09) as well as periods of relative calmness (2012-13 and 2015-16).
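
The pipeline described, a correlation network whose backbone is a minimum spanning tree ranked by eigenvector centrality, can be sketched on a toy matrix. The four sectors and their correlations below are invented, and plain Prim and power-iteration routines stand in for library calls.

```python
import math

SECTORS = ["finance", "energy", "tech", "retail"]
RHO = [  # hypothetical daily-return correlations between sectors
    [1.0, 0.6, 0.5, 0.4],
    [0.6, 1.0, 0.3, 0.2],
    [0.5, 0.3, 1.0, 0.3],
    [0.4, 0.2, 0.3, 1.0],
]

def mst_edges(rho):
    """Prim's algorithm on the usual metric d_ij = sqrt(2 * (1 - rho_ij))."""
    n = len(rho)
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        _, i, j = min(
            (math.sqrt(2 * (1 - rho[i][j])), i, j)
            for i in in_tree for j in range(n) if j not in in_tree
        )
        edges.append((i, j))
        in_tree.add(j)
    return edges

def eigenvector_centrality(rho, iters=200):
    """Power iteration on |rho| with a zeroed diagonal."""
    n = len(rho)
    a = [[abs(rho[i][j]) if i != j else 0.0 for j in range(n)] for i in range(n)]
    x = [1.0] * n
    for _ in range(iters):
        y = [sum(a[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(v * v for v in y))
        x = [v / norm for v in y]
    return dict(zip(SECTORS, x))

centrality = eigenvector_centrality(RHO)
backbone = mst_edges(RHO)
```

On this toy matrix the most correlated sector ("finance") sits at the centre of both the tree and the centrality ranking, mirroring the paper's core-periphery finding.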

  13. Fundamental problem in the relativistic approach to atomic structure theory

    International Nuclear Information System (INIS)

    Kagawa, Takashi

    1987-01-01

    It is known that the relativistic atomic structure theory contains a serious fundamental problem, the so-called Brown-Ravenhall (BR) problem or variational collapse. This problem arises from the fact that the energy spectrum of the relativistic Hamiltonian for many-electron systems is not bounded from below because the negative-energy solutions as well as the positive-energy ones are obtained from the relativistic equation. This report outlines two methods to avoid the BR problem in the relativistic calculation, namely, the projection operator method and the general variation method. The former method is described first. The use of a modified Hamiltonian containing a projection operator which projects onto the positive-energy solutions of the relativistic wave equation has been proposed to remove the BR difficulty. The problem in the use of the projection operator method is that the projection operator for the system cannot be determined uniquely. The final part of this report outlines the general variation method. This method can be applied to any system, such as relativistic ones whose Hamiltonian is not bounded from below. (Nogami, K.)

  14. Fundamental aspects of quantum theory

    International Nuclear Information System (INIS)

    Gorini, V.; Frigerio, A.

    1986-01-01

    This book presents information on the following topics: general problems and crucial experiments; the classical behavior of measuring instruments; quantum interference effect for two atoms radiating a single photon; quantization and stochastic processes; quantum Markov processes driven by Bose noise; chaotic behavior in quantum mechanics; quantum ergodicity and chaos; microscopic and macroscopic levels of description; fundamental properties of the ground state of atoms and molecules; n-level systems interacting with Bosons - semiclassical limits; general aspects of gauge theories; adiabatic phase shifts for neutrons and photons; the spins of cyons and dyons; a round-table discussion on the Aharonov-Bohm effect; gravity in quantum mechanics; the gravitational phase transition; anomalies and their cancellation; a new gauge without any ghost for Yang-Mills Theory; and energy density and roughening in the 3-D Ising ferromagnet

  15. Microelectronics from fundamentals to applied design

    CERN Document Server

    Di Paolo Emilio, Maurizio

    2016-01-01

    This book serves as a practical guide for practicing engineers who need to design analog circuits for microelectronics. Readers will develop a comprehensive understanding of the basic techniques of modern analog electronic circuit design, discrete and integrated, applications such as sensors and control and data acquisition systems, and techniques of PCB design. The book describes the fundamentals of microelectronics design in an accessible manner; takes a problem-solving approach to the topic, offering a hands-on guide for practicing engineers; provides realistic examples to inspire a thorough understanding of system-level issues before going into the detail of components and devices; and uses a new approach that helps engineers and designers retain key and advanced concepts.

  16. Fundamentals of turbomachines

    CERN Document Server

    Dick, Erik

    2015-01-01

    This book explores the working principles of all kinds of turbomachines. The same theoretical framework is used to analyse the different machine types. Fundamentals are first presented and theoretical concepts are then elaborated for particular machine types, starting with the simplest ones.For each machine type, the author strikes a balance between building basic understanding and exploring knowledge of practical aspects. Readers are invited through challenging exercises to consider how the theory applies to particular cases and how it can be generalised.   The book is primarily meant as a course book. It teaches fundamentals and explores applications. It will appeal to senior undergraduate and graduate students in mechanical engineering and to professional engineers seeking to understand the operation of turbomachines. Readers will gain a fundamental understanding of turbomachines. They will also be able to make a reasoned choice of turbomachine for a particular application and to understand its operation...

  17. A neural algorithm for a fundamental computing problem.

    Science.gov (United States)

    Dasgupta, Sanjoy; Stevens, Charles F; Navlakha, Saket

    2017-11-10

    Similarity search, for example identifying similar images in a database or similar documents on the web, is a fundamental computing problem faced by large-scale information retrieval systems. We discovered that the fruit fly olfactory circuit solves this problem with a variant of a computer science algorithm (called locality-sensitive hashing). The fly circuit assigns similar neural activity patterns to similar odors, so that behaviors learned from one odor can be applied when a similar odor is experienced. The fly algorithm, however, uses three computational strategies that depart from traditional approaches. These strategies can be translated to improve the performance of computational similarity searches. This perspective helps illuminate the logic supporting an important sensory function and provides a conceptually new algorithm for solving a fundamental computational problem. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
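
A minimal sketch of a fly-style locality-sensitive hash: sparse random projections followed by a winner-take-all cut that keeps the top-k most active "cells". All dimensions below are illustrative, and details such as the exact projection sparsity do not follow the paper.

```python
import random

random.seed(0)
DIM, CELLS, TOP_K = 10, 200, 16
# Random binary projection matrix standing in for the fly's
# projection-neuron-to-Kenyon-cell connectivity.
PROJ = [[random.choice((0.0, 1.0)) for _ in range(DIM)] for _ in range(CELLS)]

def fly_hash(vec):
    """Project, then keep only the indices of the TOP_K most active cells."""
    activity = [sum(w * v for w, v in zip(row, vec)) for row in PROJ]
    ranked = sorted(range(CELLS), key=lambda i: activity[i], reverse=True)
    return frozenset(ranked[:TOP_K])

def overlap(a, b):
    """Similarity of two hashes = number of shared active cells."""
    return len(a & b)

base = [random.random() for _ in range(DIM)]
similar = [x + 0.01 * random.random() for x in base]   # small perturbation
different = [random.random() for _ in base]            # unrelated vector

h_base, h_sim, h_diff = fly_hash(base), fly_hash(similar), fly_hash(different)
```

Similar inputs land on largely overlapping cell sets while unrelated inputs overlap less, which is the locality-sensitive property the paper exploits.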

  18. Fundamentals of Structural Engineering

    CERN Document Server

    Connor, Jerome J

    2013-01-01

    Fundamentals of Structural Engineering provides a balanced, seamless treatment of both classic, analytic methods and contemporary, computer-based techniques for conceptualizing and designing a structure. The book’s principal goal is to foster an intuitive understanding of structural behavior based on problem solving experience for students of civil engineering and architecture who have been exposed to the basic concepts of engineering mechanics and mechanics of materials. Making it distinct from many other undergraduate textbooks, the authors of this text recognize the notion that engineers reason about behavior using simple models and intuition they acquire through problem solving. The approach adopted in this text develops this type of intuition by presenting extensive, realistic problems and case studies together with computer simulation, which allows rapid exploration of how a structure responds to changes in geometry and physical parameters. This book also: Emphasizes problem-based understanding of...

  19. Fundamental Frequency Extraction Method using Central Clipping and its Importance for the Classification of Emotional State

    Directory of Open Access Journals (Sweden)

    Pavol Partila

    2012-01-01

    Full Text Available The paper deals with the classification of emotional state. We implemented a method for extracting the fundamental frequency of the speech signal by means of central clipping and examined the correlation between emotional state and fundamental speech frequency. For this purpose, we applied an exploratory data analysis approach. An ANOVA (analysis of variance) test confirmed that a change in the speaker's emotional state alters the fundamental frequency of the human vocal tract. The main contribution of the paper lies in the investigation of the central clipping method by means of ANOVA.
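
The extraction step can be sketched as central clipping followed by an autocorrelation peak pick. The clipping ratio (30% of the peak) and the search band (80-400 Hz) below are illustrative choices, not the paper's settings, and a pure sine stands in for voiced speech.

```python
import math

FS = 8000                       # sampling rate, Hz
F0_TRUE = 200.0                 # synthetic "voice" fundamental, Hz
signal = [math.sin(2 * math.pi * F0_TRUE * n / FS) for n in range(800)]

def central_clip(x, ratio=0.3):
    """Zero everything inside +/- ratio*max; shift the residue toward zero."""
    cl = ratio * max(abs(v) for v in x)
    return [v - cl if v > cl else v + cl if v < -cl else 0.0 for v in x]

def f0_autocorr(x, fs, fmin=80.0, fmax=400.0):
    """Pick the lag with maximal autocorrelation inside the [fmin, fmax] band."""
    lags = range(int(fs / fmax), int(fs / fmin) + 1)
    def r(lag):
        return sum(x[n] * x[n - lag] for n in range(lag, len(x)))
    best = max(lags, key=r)
    return fs / best

f0 = f0_autocorr(central_clip(signal), FS)
```

Central clipping suppresses the low-amplitude formant structure so that the autocorrelation peak at the pitch period stands out more clearly, which is why it is a common preprocessing step for F0 trackers.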

  20. NATO Advanced Research Workshop on Fundamental and Applied Electromagnetics

    CERN Document Server

    Maksimenko, Sergey

    2016-01-01

    This book presents the most relevant and recent results in the study of “Nanoelectromagnetics”, a recently born fascinating research discipline, whose popularity is fast arising with the intensive penetration of nanotechnology in the world of electronics applications. Studying nanoelectromagnetics means describing the interaction between electromagnetic radiation and quantum mechanical low-dimensional systems: this requires a full interdisciplinary approach, the reason why this book hosts contributions from the fields of fundamental and applied electromagnetics, of chemistry and technology of nanostructures and nanocomposites, of physics of nano-structures systems, etc. The book is aimed at providing the reader with the state of the art in Nanoelectromagnetics, from theoretical modelling to experimental characterization, from design to synthesis, from DC to microwave and terahertz applications, from the study of fundamental material properties to the analysis of complex systems and devices, from commercia...

  1. Information security fundamentals

    CERN Document Server

    Peltier, Thomas R

    2013-01-01

    Developing an information security program that adheres to the principle of security as a business enabler must be the first step in an enterprise's effort to build an effective security program. Following in the footsteps of its bestselling predecessor, Information Security Fundamentals, Second Edition provides information security professionals with a clear understanding of the fundamentals of security required to address the range of issues they will experience in the field.The book examines the elements of computer security, employee roles and r

  2. Fundamentals of statistics

    CERN Document Server

    Mulholland, Henry

    1968-01-01

    Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial and Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of normal distribution; and tests involving the normal or student's 't' distributions. The use of control charts for sample means; the ranges

  3. Clustering-based approaches to SAGE data mining

    Directory of Open Access Journals (Sweden)

    Wang Haiying

    2008-07-01

    Full Text Available Abstract Serial analysis of gene expression (SAGE) is one of the most powerful tools for global gene expression profiling. It has led to several biological discoveries and biomedical applications, such as the prediction of new gene functions and the identification of biomarkers in human cancer research. Clustering techniques have become fundamental approaches in these applications. This paper reviews relevant clustering techniques specifically designed for this type of data. It places an emphasis on current limitations and opportunities in this area for supporting biologically-meaningful data mining and visualisation.
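
As a sketch of the kind of clustering such reviews cover, here is a bare k-means applied to invented two-library tag-count profiles; the six profiles and the number of clusters are hypothetical, for illustration only.

```python
def dist2(a, b):
    """Squared Euclidean distance between two profiles."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    """Component-wise mean of a non-empty list of profiles."""
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(len(pts[0])))

def kmeans(points, k, iters=20):
    """Lloyd's algorithm with a naive first-k initialisation."""
    centroids = points[:k]
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: dist2(p, centroids[c]))
            groups[j].append(p)
        centroids = [mean(g) if g else centroids[j]
                     for j, g in enumerate(groups)]
    return centroids, groups

# Tag counts in (normal library, tumour library) for six hypothetical tags:
# three tumour-enriched, three normal-enriched.
profiles = [(5, 50), (4, 44), (6, 55), (40, 6), (38, 4), (45, 5)]
centroids, clusters = kmeans(profiles, 2)
```

Real SAGE analyses layer count-aware distance measures and validation on top of this skeleton, which is exactly where the limitations discussed in the review arise.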

  4. Infosec management fundamentals

    CERN Document Server

    Dalziel, Henry

    2015-01-01

    Infosec Management Fundamentals is a concise overview of the Information Security management concepts and techniques, providing a foundational template for both experienced professionals and those new to the industry. This brief volume will also appeal to business executives and managers outside of infosec who want to understand the fundamental concepts of Information Security and how it impacts their business decisions and daily activities. Teaches ISO/IEC 27000 best practices on information security management Discusses risks and controls within the context of an overall information securi

  5. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) are counted by high resolution gamma spectrometry registering a pulse-height distribution (acquisition of a multichannel spectrum), for example on samples. It considers exclusively the random character of radioactive decay and of pulse counting and ignores all other influences (e.g. arising from sample treatment, weighing, enrichment or the instability of the test setup). It assumes that the distance between neighbouring peaks of gamma lines is not smaller than four times the full width at half maximum (FWHM) of the gamma line and that the background near the gamma line is nearly a straight line. Otherwise ISO 11929-1 or ISO 11929-2 should be used. ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG 2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non-radioactive waste, and ISO 11929-1, ISO 11929-2 and ISO 11929-4, and is, consequently, complementary to these documents
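
As a hedged illustration of the quantities involved, the simplest counting case of Part 1 (gross counts n_g in time t_g against background counts n_0 in time t_0, with no sample-treatment influences) leads to a decision threshold for the net count rate of the form:

```latex
y^{*} = k_{1-\alpha}\,\sqrt{\frac{r_0}{t_g} + \frac{r_0}{t_0}},
\qquad r_0 = \frac{n_0}{t_0}
```

with k_{1-α} the (1-α) quantile of the standardised normal distribution; a measured net rate above y* is taken to indicate a real sample contribution. In the gamma-spectrometric case of this part, the background under the peak is instead estimated from channels on either side of it, under the straight-line assumption stated above.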

  6. Fundamental (f) oscillations in a magnetically coupled solar interior-atmosphere system - An analytical approach

    Science.gov (United States)

    Pintér, Balázs; Erdélyi, R.

    2018-01-01

Solar fundamental (f) acoustic mode oscillations are investigated analytically in a magnetohydrodynamic (MHD) model. The model consists of three layers in planar geometry, representing the solar interior, the magnetic atmosphere, and a transitional layer sandwiched between them. Since we focus on the fundamental mode here, we assume the plasma is incompressible. A horizontal, canopy-like magnetic field is introduced to the atmosphere, in which degenerated slow MHD waves can exist. The global (f-mode) oscillations can couple to local atmospheric Alfvén waves, resulting, e.g., in a frequency shift of the oscillations. The dispersion relation of the global oscillation mode is derived and solved analytically in the thin-transitional-layer approximation and in the weak-field approximation. Analytical formulae are also provided for the frequency shifts due to the presence of a thin transitional layer and a weak atmospheric magnetic field. The analytical results generally indicate that, compared to the fundamental value ω = √(gk), the mode frequency is reduced by the presence of an atmosphere by a few per cent. A thin transitional layer reduces the eigenfrequencies further by about an additional hundred microhertz. Finally, a weak atmospheric magnetic field can slightly, by a few per cent, increase the frequency of the eigenmode. Stronger magnetic fields, however, can increase the f-mode frequency by up to ten per cent, which cannot be seen in observed data. The presence of a magnetic atmosphere in the three-layer model also introduces non-permitted propagation windows in the frequency spectrum; here, f-mode oscillations cannot exist at certain values of the harmonic degree. The eigenfrequencies can be sensitive to the background physical parameters, such as the atmospheric density scale-height or the rate of the plasma density drop at the photosphere. Such information, if ever observed with high-resolution instrumentation and inverted, could help to
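For orientation, the unperturbed dispersion relation ω = √(gk) quoted in the abstract can be evaluated directly. This is a hedged sketch, not material from the paper: the solar surface gravity, solar radius, and the wavenumber relation k = √(l(l+1))/R are standard assumed values.

```python
import math

g = 274.0   # solar surface gravity [m s^-2] (standard value, assumed here)
R = 6.96e8  # solar radius [m] (standard value, assumed here)

def f_mode_frequency_uHz(l):
    """Unperturbed f-mode cyclic frequency [microhertz] at harmonic degree l,
    from omega = sqrt(g*k) with horizontal wavenumber k = sqrt(l*(l+1))/R."""
    k = math.sqrt(l * (l + 1)) / R        # horizontal wavenumber [m^-1]
    omega = math.sqrt(g * k)              # angular frequency [rad/s]
    return omega / (2.0 * math.pi) * 1e6  # convert Hz -> microhertz

for l in (200, 600, 1000):
    print(l, round(f_mode_frequency_uHz(l)))
```

At l = 1000 this gives roughly 3160 microhertz, so the hundred-microhertz transitional-layer shift and the few-per-cent magnetic shift discussed above are both comfortably within helioseismic measurement precision.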

  7. The efficiency limit of CH3NH3PbI3 perovskite solar cells

    International Nuclear Information System (INIS)

    Sha, Wei E. I.; Ren, Xingang; Chen, Luzhou; Choy, Wallace C. H.

    2015-01-01

With the consideration of the photon recycling effect, the efficiency limit of methylammonium lead iodide (CH3NH3PbI3) perovskite solar cells is predicted by a detailed balance model. To obtain convincing predictions, both the AM 1.5 solar spectrum and the experimentally measured complex refractive index of the perovskite material are employed in the detailed balance model. The roles of light trapping and angular restriction in improving the maximal output power of thin-film perovskite solar cells are also clarified. The efficiency limit of perovskite cells (without the angular restriction) is about 31%, which approaches the Shockley-Queisser limit (33%) achievable by gallium arsenide (GaAs) cells. Moreover, the Shockley-Queisser limit could be reached with a 200 nm-thick perovskite solar cell by integrating a wavelength-dependent angular-restriction design with a textured light-trapping structure. Additionally, the influence of trap-assisted nonradiative recombination on the device efficiency is investigated. The work is fundamentally important to high-performance perovskite photovoltaics.
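The detailed-balance reasoning can be sketched numerically. The toy below is a hedged, simplified Shockley-Queisser-style estimate, not the paper's model: it replaces the measured AM 1.5 spectrum and perovskite optical constants with a 5760 K blackbody sun diluted by the geometric factor of direct sunlight, and assumes full absorption above a single band gap (the 1.6 eV gap used for CH3NH3PbI3 is an assumption here). It lands near 28-29%, the same ballpark as the quoted 31% limit.

```python
import math

KB = 8.617e-5            # Boltzmann constant [eV/K]
TS, TC = 5760.0, 300.0   # sun and cell temperatures [K]
FS = 2.18e-5             # geometric dilution factor of unconcentrated sunlight

def photon_flux_above(eg, temp, emax=12.0, de=1e-3):
    """Blackbody photon flux above energy eg, in units where flux ~ E^2/(e^{E/kT}-1)."""
    kt = KB * temp
    total, e = 0.0, eg
    while e < emax:
        total += e * e / math.expm1(e / kt) * de
        e += de
    return total

def power_flux(temp, emax=12.0, de=1e-3):
    """Total blackbody power flux, in units where flux ~ E^3/(e^{E/kT}-1)."""
    kt = KB * temp
    total, e = 0.0, de
    while e < emax:
        total += e ** 3 / math.expm1(e / kt) * de
        e += de
    return total

def detailed_balance_efficiency(eg):
    """Max electrical power out / solar power in for an ideal absorber with gap eg [eV]."""
    jsc = FS * photon_flux_above(eg, TS)   # absorbed solar photon flux
    j0 = photon_flux_above(eg, TC)         # radiative recombination (dark) flux
    pin = FS * power_flux(TS)
    ktc = KB * TC
    best, v = 0.0, 0.0
    while v < eg:                          # scan the J-V curve for the max-power point
        best = max(best, v * (jsc - j0 * math.exp(v / ktc)))
        v += 1e-3
    return best / pin

print(detailed_balance_efficiency(1.6))
```

The paper's extra ingredients (the real AM 1.5 spectrum, the measured absorption edge, photon recycling, and angular restriction) are exactly what closes the gap between this crude figure and the 31-33% values quoted above.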

  8. A Fundamental Scale of Descriptions for Analyzing Information Content of Communication Systems

    Directory of Open Access Journals (Sweden)

    Gerardo Febres

    2015-03-01

Full Text Available The complexity of the description of a system is a function of the entropy of its symbolic description. Prior to computing the entropy of the system's description, an observation scale has to be assumed. In texts written in artificial and natural languages, typical scales are binary symbols, characters, and words. However, treating languages as structures built around a preconceived set of symbols, such as words or characters, limits the level of complexity that can be revealed analytically. This study introduces the notion of the fundamental description scale to analyze the essence of the structure of a language. The concept of the Fundamental Scale is tested for English and musical instrument digital interface (MIDI) music texts, using an algorithm developed to split a text into a collection of sets of symbols that minimizes the observed entropy of the system. This Fundamental Scale reflects more details of the complexity of the language than bits, characters, or words do. Results show that the Fundamental Scale allows completely different languages, such as English and MIDI-coded music, to be compared with respect to their structural entropy. This comparative power facilitates the study of the complexity of the structure of different communication systems.
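The scale-dependence the abstract describes is easy to demonstrate: the Shannon entropy of the same text changes with the tokenization chosen as the observation scale. This is a hedged sketch of that dependence only, not the authors' entropy-minimizing splitting algorithm; the sample sentence is invented for illustration.

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy in bits per symbol of a sequence of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

text = "the cat sat on the mat the cat ran"
char_scale = list(text)    # character observation scale
word_scale = text.split()  # word observation scale
print(entropy(char_scale), entropy(word_scale))
```

The two scales yield different entropies for identical content; the paper's Fundamental Scale is the symbol partition that minimizes this observed entropy rather than one fixed in advance.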

  9. Fundamentals of ergonomic exoskeleton robots

    OpenAIRE

    Schiele, A.

    2008-01-01

    This thesis is the first to provide the fundamentals of ergonomic exoskeleton design. The fundamental theory as well as technology necessary to analyze and develop ergonomic wearable robots interacting with humans is established and validated by experiments and prototypes. The fundamentals are (1) a new theoretical framework for analyzing physical human robot interaction (pHRI) with exoskeletons, and (2) a clear set of design rules of how to build wearable, portable exoskeletons to easily and...

  10. An Optimization-Based Impedance Approach for Robot Force Regulation with Prescribed Force Limits

    Directory of Open Access Journals (Sweden)

    R. de J. Portillo-Vélez

    2015-01-01

Full Text Available An optimization-based approach for the regulation of excessive or insufficient forces at the end-effector level is introduced. The objective is to minimize the interaction force error at the robot end effector while constraining undesired interaction forces. To that end, a dynamic optimization problem (DOP) is formulated considering a dynamic robot impedance model. Penalty functions are included in the DOP to handle the constraints on the interaction force. The optimization problem is solved online through the gradient flow approach. Convergence properties are presented, and stability is established when the force limits are considered in the analysis. The effectiveness of the proposal is validated via experimental results for a robotic grasping task.
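The penalty-plus-gradient-flow idea can be illustrated on a scalar toy problem. This is a hedged sketch under invented parameters, not the authors' impedance controller: it minimizes a force-error objective with a quadratic penalty on violations of a prescribed force limit by integrating the gradient flow with explicit Euler steps.

```python
# Toy objective: J(f) = (f - f_des)^2 + w * max(0, f - f_max)^2
def regulate_force(f_des, f_max, w=100.0, dt=1e-3, steps=10000):
    """Integrate the gradient flow df/dt = -dJ/df from f = 0."""
    f = 0.0
    for _ in range(steps):
        grad = 2.0 * (f - f_des)
        if f > f_max:                  # penalty term active only on violation
            grad += 2.0 * w * (f - f_max)
        f -= dt * grad                 # explicit Euler step of the flow
    return f

# Feasible target: settles at f_des. Infeasible target: settles just above
# f_max, at the penalized minimizer (f_des + w * f_max) / (1 + w).
print(regulate_force(3.0, 4.0), regulate_force(5.0, 4.0))
```

Increasing the penalty weight w pushes the infeasible equilibrium arbitrarily close to the force limit, which is the design trade-off a penalty-function formulation of the DOP has to manage.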

  11. Evidence for the Fundamental Difference Hypothesis or Not?: Island Constraints Revisited

    Science.gov (United States)

    Belikova, Alyona; White, Lydia

    2009-01-01

    This article examines how changes in linguistic theory affect the debate between the fundamental difference hypothesis and the access-to-Universal Grammar (UG) approach to SLA. With a focus on subjacency (Chomsky, 1973), a principle of UG that places constraints on "wh"-movement and that has frequently been taken as a test case for verifying…

  12. arXiv Minimal Fundamental Partial Compositeness

    CERN Document Server

    Cacciapaglia, Giacomo; Sannino, Francesco; Thomsen, Anders Eller

    Building upon the fundamental partial compositeness framework we provide consistent and complete composite extensions of the standard model. These are used to determine the effective operators emerging at the electroweak scale in terms of the standard model fields upon consistently integrating out the heavy composite dynamics. We exhibit the first effective field theories matching these complete composite theories of flavour and analyse their physical consequences for the third generation quarks. Relations with other approaches, ranging from effective analyses for partial compositeness to extra dimensions as well as purely fermionic extensions, are briefly discussed. Our methodology is applicable to any composite theory of dynamical electroweak symmetry breaking featuring a complete theory of flavour.

  13. Mass Transfer From Fundamentals to Modern Industrial Applications

    CERN Document Server

    Asano, Koichi

    2006-01-01

    This didactic approach to the principles and modeling of mass transfer as it is needed in modern industrial processes is unique in combining a step-by-step introduction to all important fundamentals with the most recent applications. Based upon the renowned author's successful new modeling method as used for the O-18 process, the exemplary exercises included in the text are fact-proven, taken directly from existing chemical plants. Fascinating reading for chemists, graduate students, chemical and process engineers, as well as thermodynamics physicists.

  14. Quantum Mechanics - Fundamentals and Applications to Technology

    Science.gov (United States)

    Singh, Jasprit

    1996-10-01

Explore the relationship between quantum mechanics and information-age applications. This volume takes an altogether unique approach to quantum mechanics. Providing an in-depth exposition of quantum mechanics fundamentals, it shows how these concepts are applied to most of today's information technologies, whether they are electronic devices or materials. No other text makes this critical, essential leap from theory to real-world applications. The book's lively discussion of the mathematics involved fits right in with contemporary multidisciplinary trends in education: Once the basic formulation has been derived in a given chapter, the connection to important technological problems is summarily described. The many helpful features include * Twenty-eight application-oriented sections that focus on lasers, transistors, magnetic memories, superconductors, nuclear magnetic resonance (NMR), and other important technology-driving materials and devices * One hundred solved examples, with an emphasis on numerical results and the connection between the physics and its applications * End-of-chapter problems that ground the student in both fundamental and applied concepts * Numerous figures and tables to clarify the various topics and provide a global view of the problems under discussion * Over two hundred illustrations to highlight problems and text A book for the information age, Quantum Mechanics: Fundamentals and Applications to Technology promises to become a standard in departments of electrical engineering, applied physics, and materials science, as well as physics. It is an excellent text for senior undergraduate and graduate students, and a helpful reference for practicing scientists, engineers, and chemists in the semiconductor and electronic industries.

  15. Relevance of plastic limit loads to reference stress approach for surface cracked cylinder problems

    International Nuclear Information System (INIS)

    Kim, Yun-Jae; Shim, Do-Jun

    2005-01-01

To investigate the relevance of the definition of the reference stress to estimate J and C* for surface crack problems, this paper compares finite element (FE) J and C* results for surface cracked pipes with those estimated according to the reference stress approach using various definitions of the reference stress. Pipes with part circumferential inner surface cracks and finite internal axial cracks are considered, subject to internal pressure and global bending. The crack depth and aspect ratio are systematically varied. The reference stress is defined in four different ways using (i) a local limit load, (ii) a global limit load, (iii) a global limit load determined from the FE limit analysis, and (iv) the optimised reference load. It is found that the reference stress based on a local limit load gives overall excessively conservative estimates of J and C*. Use of a global limit load clearly reduces the conservatism, compared to that of a local limit load, although it can sometimes provide non-conservative estimates of J and C*. The use of the FE global limit load gives overall non-conservative estimates of J and C*. The reference stress based on the optimised reference load gives overall accurate estimates of J and C*, compared to other definitions of the reference stress. Based on the present findings, general guidance on the choice of the reference stress for surface crack problems is given.

  16. Continuum Mechanics using Mathematica® Fundamentals, Applications and Scientific Computing

    CERN Document Server

    Romano, Antonio; Marasco, Addolorata

    2006-01-01

    This book's methodological approach familiarizes readers with the mathematical tools required to correctly define and solve problems in continuum mechanics. The book covers essential principles and fundamental applications, and provides a solid basis for a deeper study of more challenging and specialized problems related to elasticity, fluid mechanics, plasticity, materials with memory, piezoelectricity, ferroelectricity, magneto-fluid mechanics, and state changes. Key topics and features: * Concise presentation strikes a balance between fundamentals and applications * Requisite mathematical background carefully collected in two introductory chapters and two appendices * Recent developments highlighted through coverage of more significant applications to areas such as porous media, electromagnetic fields, and phase transitions Continuum Mechanics using Mathematica® is aimed at advanced undergraduates, graduate students, and researchers in applied mathematics, mathematical physics, and engineering. It may ser...

  17. Hybrid Fundamental Solution Based Finite Element Method: Theory and Applications

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2015-01-01

Full Text Available An overview of the development of the hybrid fundamental solution based finite element method (HFS-FEM) and its application in engineering problems is presented in this paper. The framework and formulations of HFS-FEM for the potential problem, plane elasticity, three-dimensional elasticity, thermoelasticity, anisotropic elasticity, and plane piezoelectricity are presented. In this method, two independent assumed fields (intraelement field and auxiliary frame field) are employed. The formulations for all cases are derived from the modified variational functionals and the fundamental solutions to a given problem. Generation of elemental stiffness equations from the modified variational principle is also described. Typical numerical examples are given to demonstrate the validity and performance of the HFS-FEM. Finally, a brief summary of the approach is provided and future trends in this field are identified.

  18. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2010-01-01

New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update and Fundamentals is the single best source for the latest developments, trends, and issues in communication technology. Featuring the fundamental framework along with the history and background of communication technologies, Communication Technology Update and Fundamentals, 12th edition helps you stay ahead of these ever-changing and emerging technologies. As always, every chapter ha

  19. Amorphous Phase Mediated Crystallization: Fundamentals of Biomineralization

    Directory of Open Access Journals (Sweden)

    Wenjing Jin

    2018-01-01

    Full Text Available Many biomineralization systems start from transient amorphous precursor phases, but the exact crystallization pathways and mechanisms remain largely unknown. The study of a well-defined biomimetic crystallization system is key for elucidating the possible mechanisms of biomineralization and monitoring the detailed crystallization pathways. In this review, we focus on amorphous phase mediated crystallization (APMC pathways and their crystallization mechanisms in bio- and biomimetic-mineralization systems. The fundamental questions of biomineralization as well as the advantages and limitations of biomimetic model systems are discussed. This review could provide a full landscape of APMC systems for biomineralization and inspire new experiments aimed at some unresolved issues for understanding biomineralization.

  20. Adam Smith and the objective use of economy as a way to obtain the fundamental right to liberty

    Directory of Open Access Journals (Sweden)

    Luiz Edmundo Celso Borba

    2017-06-01

Full Text Available The contribution of Adam Smith to liberalism was great, but it is not limited to an unrestrained quest for freedom pursued at any cost. Adam Smith marks the birthplace of fundamental rights in calling for the recognition of individual liberties, while predicting problems in the exercise of such powers without limits. The idea of the text is to present the initial outlook for the construction of fundamental rights from an individual perspective, then their weighing in a collective dimension, as well as the subsequent discussion of which should prevail.

  1. Quantum operations: technical or fundamental challenge?

    International Nuclear Information System (INIS)

    Mielnik, Bogdan

    2013-01-01

    A class of unitary operations generated by idealized, semiclassical fields is studied. The operations implemented by sharp potential kicks are revisited and the possibility of performing them by softly varying external fields is examined. The possibility of using the ion traps as ‘operation factories’ transforming quantum states is discussed. The non-perturbative algorithms indicate that the results of abstract δ-pulses of oscillator potentials can become real. Some of them, if empirically achieved, could be essential to examine certain atypical quantum ideas. In particular, simple dynamical manipulations might contribute to the Aharonov–Bohm criticism of the time–energy uncertainty principle, while some others may verify the existence of fundamental precision limits of the position measurements or the reality of ‘non-commutative geometries’. (paper)

  2. Fundamentals of tribology at the atomic level

    Science.gov (United States)

    Ferrante, John; Pepper, Stephen V.

    1989-01-01

Tribology, the science and engineering of solid surfaces in moving contact, is a field that encompasses many disciplines: solid state physics, chemistry, materials science, and mechanical engineering. In spite of the practical importance and maturity of the field, the fundamental understanding of basic phenomena has only recently been attacked. An attempt to define some of these problems and indicate some profitable directions for future research is presented. There are three broad classifications: (1) fluid properties (compression, rheology, additives and particulates); (2) material properties of the solids (deformation, defect formation and energy loss mechanisms); and (3) interfacial properties (adhesion, friction, chemical reactions, and boundary films). Research in these categories has traditionally been approached by considering macroscopic material properties. Recent activity has shown that some issues can be approached at the atomic level: the atoms in the materials can be manipulated both experimentally and theoretically, and can produce results related to macroscopic phenomena.

  3. Fundamental Design Principles for Transcription-Factor-Based Metabolite Biosensors.

    Science.gov (United States)

    Mannan, Ahmad A; Liu, Di; Zhang, Fuzhong; Oyarzún, Diego A

    2017-10-20

    Metabolite biosensors are central to current efforts toward precision engineering of metabolism. Although most research has focused on building new biosensors, their tunability remains poorly understood and is fundamental for their broad applicability. Here we asked how genetic modifications shape the dose-response curve of biosensors based on metabolite-responsive transcription factors. Using the lac system in Escherichia coli as a model system, we built promoter libraries with variable operator sites that reveal interdependencies between biosensor dynamic range and response threshold. We developed a phenomenological theory to quantify such design constraints in biosensors with various architectures and tunable parameters. Our theory reveals a maximal achievable dynamic range and exposes tunable parameters for orthogonal control of dynamic range and response threshold. Our work sheds light on fundamental limits of synthetic biology designs and provides quantitative guidelines for biosensor design in applications such as dynamic pathway control, strain optimization, and real-time monitoring of metabolism.
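The dose-response quantities discussed above can be made concrete with a generic Hill-type response. This is a hedged illustration, not the authors' mechanistic model of the lac system: the parameter names (basal leaky expression, maximal output vmax, response threshold K, cooperativity n) and their values are assumptions for the sketch.

```python
def response(x, basal=1.0, vmax=100.0, K=10.0, n=2.0):
    """Biosensor output at inducer concentration x (arbitrary units):
    leaky basal expression plus an inducible Hill-type component."""
    return basal + (vmax - basal) * x**n / (K**n + x**n)

dynamic_range = response(1e6) / response(0.0)  # saturated / basal output, ~ vmax/basal
threshold_output = response(10.0)              # half-maximal induction occurs at x = K
print(dynamic_range, threshold_output)
```

In this toy form, basal and vmax set the dynamic range while K sets the response threshold, which mirrors the paper's question of which genetic modifications move which feature of the curve, and why dynamic range and threshold can be interdependent when one genetic element affects several parameters at once.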

  4. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

fundamental assumptions. A recent focus set in the Astrophysical Journal Letters, titled Focus on Exploring Fundamental Physics with Extragalactic Transients, consists of multiple published studies doing just that. Testing General Relativity: Several of the articles focus on the 4th point above. By assuming that the delay in photon arrival times is only due to the gravitational potential of the Milky Way, these studies set constraints on the deviation of our galaxy's gravitational potential from what GR would predict. The study by He Gao et al. uses the different photon arrival times from gamma-ray bursts to set constraints at eV-GeV energies, and the study by Jun-Jie Wei et al. complements this by setting constraints at keV-TeV energies using photons from high-energy blazar emission. Photons or neutrinos from different extragalactic transients each set different upper limits on delta gamma, the post-Newtonian parameter, vs. particle energy or frequency. This is a test of Einstein's equivalence principle: if the principle is correct, delta gamma would be exactly zero, meaning that photons of different energies move at the same velocity through a vacuum. [Tingay & Kaplan 2016] S. J. Tingay & D. L. Kaplan make the case that measuring the time delay of photons from fast radio bursts (FRBs; transient radio pulses that last only a few milliseconds) will provide even tighter constraints if we are able to accurately determine distances to these FRBs. And Adi Musser argues that the large-scale structure of the universe plays an even greater role than the Milky Way gravitational potential, allowing for even stricter testing of Einstein's equivalence principle. The ever-narrower constraints from these studies all support GR as a correct set of rules through which to interpret our universe. Other Tests of Fundamental Physics: In addition to the above tests, Xue-Feng Wu et al. show that FRBs can be used to provide severe constraints on the rest mass of the photon, and S. Croft et al. even touch on what we

  5. Teaching the Politics of Islamic Fundamentalism.

    Science.gov (United States)

    Kazemzadeh, Masoud

    1998-01-01

    Argues that the rise of Islamic fundamentalism since the Iranian Revolution has generated a number of issues of analytical significance for political science. Describes three main models in teaching and research on Islamic fundamentalism: Islamic exceptionalism, comparative fundamentalisms, and class analysis. Discusses the construction of a…

  6. Fundamental principles of heat transfer

    CERN Document Server

    Whitaker, Stephen

    1977-01-01

    Fundamental Principles of Heat Transfer introduces the fundamental concepts of heat transfer: conduction, convection, and radiation. It presents theoretical developments and example and design problems and illustrates the practical applications of fundamental principles. The chapters in this book cover various topics such as one-dimensional and transient heat conduction, energy and turbulent transport, forced convection, thermal radiation, and radiant energy exchange. There are example problems and solutions at the end of every chapter dealing with design problems. This book is a valuable int

  7. Failure strength and elastic limit for concrete

    International Nuclear Information System (INIS)

    Robutti, G.; Ronzoni, E.; Ottosen, N.S.

    1979-01-01

Due to the increased demand for realistic analysis of structures such as prestressed concrete reactor vessels and reactor containments, the formulation of general constitutive equations for concrete is of considerable importance. In the field of constitutive equations, the correct definition of the limit state represented by the concrete failure surface is a fundamental need. In this paper, carried out through a Danish-Italian cooperation, several failure criteria proposed by different authors are compared with experimental failure data obtained from triaxial tests on concrete specimens. Such comparisons allow conclusive considerations to be drawn on the characteristics of the concrete failure surface and on the advantages and disadvantages of the different criteria. Considerations are also reported on the definition of an elastic limit surface, whose knowledge is of fundamental importance for designers of complex concrete structures. (orig.)

  8. Fundamental Astronomy

    CERN Document Server

    Karttunen, Hannu; Oja, Heikki; Poutanen, Markku; Donner, Karl Johan

    2007-01-01

    Fundamental Astronomy gives a well-balanced and comprehensive introduction to the topics of classical and modern astronomy. While emphasizing both the astronomical concepts and the underlying physical principles, the text provides a sound basis for more profound studies in the astronomical sciences. The fifth edition of this successful undergraduate textbook has been extensively modernized and extended in the parts dealing with the Milky Way, extragalactic astronomy and cosmology as well as with extrasolar planets and the solar system (as a consequence of recent results from satellite missions and the new definition by the International Astronomical Union of planets, dwarf planets and small solar-system bodies). Furthermore a new chapter on astrobiology has been added. Long considered a standard text for physical science majors, Fundamental Astronomy is also an excellent reference and entrée for dedicated amateur astronomers.

  9. Systems biology approach to bioremediation

    Energy Technology Data Exchange (ETDEWEB)

    Chakraborty, Romy; Wu, Cindy H.; Hazen, Terry C.

    2012-06-01

Bioremediation has historically been approached as a 'black box' in terms of our fundamental understanding. Thus it succeeds and fails, seldom with a complete understanding of why. Systems biology is an integrated research approach to study complex biological systems, by investigating interactions and networks at the molecular, cellular, community, and ecosystem level. Knowledge of the interactions within individual components is fundamental to understanding the dynamics of the ecosystem under investigation. Understanding and modeling functional microbial community structure and stress responses in environments at all levels have tremendous implications for our fundamental understanding of hydrobiogeochemical processes and the potential for making bioremediation breakthroughs and illuminating the 'black box'.

  10. To the field theory with a fundamental mass

    International Nuclear Information System (INIS)

    Ibadov, R.M.; Kadyshevskij, V.G.

    1986-01-01

This paper is a continuation of investigations along the lines of constructing a consistent field theory with a fundamental mass M, a hypothetical universal scale in the ultrahigh energy region. Earlier, the key role in the developed approach was played by the de Sitter momentum space of radius M. In this paper a quantum version of this idea is worked out: p-space is assumed to be a de Sitter one as before; however, the four-momentum p_μ is treated as a quantum mechanical operator in δ/δx_μ only

  11. Nuclear Structure at the Limits

    International Nuclear Information System (INIS)

    Nazarewicz, W.

    1998-01-01

One of the frontiers of today's nuclear science is the journey to the limits of atomic charge and nuclear mass, of neutron-to-proton ratio, and of angular momentum. The tour to the limits is not only a quest for new, exciting phenomena; the new data are expected, as well, to bring qualitatively new information about the fundamental properties of the nucleonic many-body system, the nature of the nuclear interaction, and nucleonic correlations at various energy-distance scales. In this series of lectures, current developments in nuclear structure at the limits are discussed from a theoretical perspective, mainly concentrating on medium-mass and heavy nuclei.

  12. Materials Fundamentals of Gate Dielectrics

    CERN Document Server

    Demkov, Alexander A

    2006-01-01

This book presents the materials fundamentals of novel gate dielectrics that are being introduced into semiconductor manufacturing to ensure the continued scaling of CMOS devices. This is a very fast evolving field of research, so we choose to focus on the basic understanding of the structure, thermodynamics, and electronic properties of these materials that determine their performance in device applications. Most of these materials are transition metal oxides. Ironically, the d-orbitals responsible for the high dielectric constant cause severe integration difficulties, thus intrinsically limiting high-k dielectrics. Though new in the electronics industry, many of these materials are well known in the field of ceramics, and we describe this unique connection. The complexity of the structure-property relations in TM oxides makes the use of state-of-the-art first-principles calculations necessary. Several chapters give a detailed description of the modern theory of polarization, and heterojunction band discont...

  13. Country Fundamentals and Currency Excess Returns

    Directory of Open Access Journals (Sweden)

    Daehwan Kim

    2014-06-01

    Full Text Available We examine whether country fundamentals help explain the cross-section of currency excess returns. For this purpose, we consider fundamental variables such as default risk, foreign exchange rate regime, capital control as well as interest rate in the multi-factor model framework. Our empirical results show that fundamental factors explain a large part of the cross-section of currency excess returns. The zero-intercept restriction of the factor model is not rejected for most currencies. They also reveal that our factor model with country fundamentals performs better than a factor model with usual investment-style factors. Our main empirical results are based on 2001-2010 balanced panel data of 19 major currencies. This paper may fill the gap between country fundamentals and practitioners' strategies on currency investment.

  14. New approaches to deriving limits of the release of radioactive material into the environment

    International Nuclear Information System (INIS)

    Lindell, B.

    1977-01-01

    During the last few years, new principles have been developed for the limitation of the release of radioactive material into the environment. It is no longer considered appropriate to base the limitation on limits for the concentrations of the various radionuclides in air and water effluents. Such limits would not prevent large amounts of radioactive material from reaching the environment should effluent rates be high. A common practice has been to identify critical radionuclides and critical pathways and to base the limitation on authorized dose limits for local ''critical groups''. If this were the only limitation, however, larger releases could be permitted after installing either higher stacks or equipment to retain the more short-lived radionuclides for decay before release. Continued release at such limits would then lead to considerably higher exposure at a distance than if no such installation had been made. Accordingly there would be no immediate control of overlapping exposures from several sources, nor would the system guarantee control of the future situation. The new principles described in this paper take the future into account by limiting the annual dose commitments rather than the annual doses. They also offer means of controlling the global situation by limiting not only doses in critical groups but also global collective doses. Their objective is not only to ensure that individual dose limits will always be respected but also to meet the requirement that ''all doses be kept as low as reasonably achievable''. The new approach is based on the most recent recommendations by the ICRP and has been described in a report by an IAEA panel (Procedures for establishing limits for the release of radioactive material into the environment). It has been applied in the development of new Swedish release regulations, which illustrate some of the problems which arise in the practical application

  15. New approaches to deriving limits of the release of radioactive material into the environment

    International Nuclear Information System (INIS)

    Lindell, B.

    1977-01-01

During the last few years, new principles have been developed for the limitation of the release of radioactive material into the environment. It is no longer considered appropriate to base the limitation on limits for the concentrations of the various radionuclides in air and water effluents. Such limits would not prevent large amounts of radioactive material from reaching the environment should effluent rates be high. A common practice has been to identify critical radionuclides and critical pathways and to base the limitation on authorized dose limits for local ''critical groups''. If this were the only limitation, however, larger releases could be permitted after installing either higher stacks or equipment to retain the more short-lived radionuclides for decay before release. Continued release at such limits would then lead to considerably higher exposure at a distance than if no such installation had been made. Accordingly there would be no immediate control of overlapping exposures from several sources, nor would the system guarantee control of the future situation. The new principles described in this paper take the future into account by limiting the annual dose commitments rather than the annual doses. They also offer means of controlling the global situation by limiting not only doses in critical groups but also global collective doses. Their objective is not only to ensure that individual dose limits will always be respected but also to meet the requirement that ''all doses be kept as low as reasonably achievable''. The new approach is based on the most recent recommendations by the ICRP and has been described in a report by an IAEA panel (Procedures for Establishing Limits for the Release of Radioactive Material into the Environment). It has been applied in the development of new Swedish release regulations, which illustrate some of the problems which arise in the practical application. (author)
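The dose-commitment principle in this record can be illustrated numerically. The sketch below (all values hypothetical, not from the source) shows that for a practice continued at a constant annual release, the dose commitment of a single year's release equals the eventual equilibrium annual dose, so limiting the former also controls the future situation:

```python
# Illustration of the dose-commitment principle (assumed values throughout).
import math

TAU = 12.0   # environmental mean life of the dominant nuclide, years (assumed)
D0 = 1.0     # initial dose rate produced by one year's release, mSv/yr (assumed)

def dose_in_year(age):
    """Dose delivered during year [age, age+1) by a release made at age 0,
    assuming the dose rate decays as D0*exp(-t/TAU)."""
    return D0 * TAU * (math.exp(-age / TAU) - math.exp(-(age + 1) / TAU))

# Dose commitment of a single year's release: all doses it will ever deliver.
commitment = sum(dose_in_year(k) for k in range(5000))

def annual_dose(T):
    """Dose received in year T of a practice releasing every year since year 0:
    contributions from releases aged 0, 1, ..., T."""
    return sum(dose_in_year(age) for age in range(T + 1))

print(commitment, annual_dose(5))   # early years stay below the commitment
print(annual_dose(4999))            # at equilibrium the two coincide
```

Because the equilibrium annual dose is just the commitment sum rearranged, capping the annual dose commitment bounds the exposure at all future times, which a cap on current annual doses alone does not.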

  16. Using holistic interpretive synthesis to create practice-relevant guidance for person-centred fundamental care delivered by nurses.

    Science.gov (United States)

    Feo, Rebecca; Conroy, Tiffany; Marshall, Rhianon J; Rasmussen, Philippa; Wiechula, Richard; Kitson, Alison L

    2017-04-01

    Nursing policy and healthcare reform are focusing on two, interconnected areas: person-centred care and fundamental care. Each initiative emphasises a positive nurse-patient relationship. For these initiatives to work, nurses require guidance for how they can best develop and maintain relationships with their patients in practice. Although empirical evidence on the nurse-patient relationship is increasing, findings derived from this research are not readily or easily transferable to the complexities and diversities of nursing practice. This study describes a novel methodological approach, called holistic interpretive synthesis (HIS), for interpreting empirical research findings to create practice-relevant recommendations for nurses. Using HIS, umbrella review findings on the nurse-patient relationship are interpreted through the lens of the Fundamentals of Care Framework. The recommendations for the nurse-patient relationship created through this approach can be used by nurses to establish, maintain and evaluate therapeutic relationships with patients to deliver person-centred fundamental care. Future research should evaluate the validity and impact of these recommendations and test the feasibility of using HIS for other areas of nursing practice and further refine the approach. © 2016 John Wiley & Sons Ltd.

  17. Religious fundamentalism and conflict

    OpenAIRE

    Muzaffer Ercan Yılmaz

    2006-01-01

This study provides an analytical discussion for the issue of religious fundamentalism and its relevance to conflict, in its broader sense. It is stressed that religious fundamentalism manifests itself in two ways: nonviolent intolerance and violent intolerance. The sources of both types of intolerance and their connection to conflict are addressed and discussed in detail. Further research is also suggested on conditions connecting religion to nonviolent intolerance so as to cope with the problem...

  18. Theoretical Limiting Potentials in Mg/O2 Batteries

    DEFF Research Database (Denmark)

    Smith, Jeffrey G.; Naruse, Junichi; Hiramatsu, Hidehiko

    2016-01-01

    A rechargeable battery based on a multivalent Mg/O2 couple is an attractive chemistry due to its high theoretical energy density and potential for low cost. Nevertheless, metal-air batteries based on alkaline earth anodes have received limited attention and generally exhibit modest performance....... In addition, many fundamental aspects of this system remain poorly understood, such as the reaction mechanisms associated with discharge and charging. The present study aims to close this knowledge gap and thereby accelerate the development of Mg/O2 batteries by employing first-principles calculations...... by the presence of large thermodynamic overvoltages. In contrast, MgO2-based cells are predicted to be much more efficient: superoxide-terminated facets on MgO2 crystallites enable low overvoltages and round-trip efficiencies approaching 90%. These data suggest that the performance of Mg/O2 batteries can...

  19. From fundamental supramolecular chemistry to self-assembled nanomaterials and medicines and back again - how Sam inspired SAMul.

    Science.gov (United States)

    Smith, David K

    2018-05-08

    This feature article provides a personal insight into the research from my group over the past 10 years. In particular, the article explains how, inspired in 2005 by meeting my now-husband, Sam, who had cystic fibrosis, and who in 2011 went on to have a double lung transplant, I took an active decision to follow a more applied approach to some of our research, attempting to use fundamental supramolecular chemistry to address problems of medical interest. In particular, our strategy uses self-assembly to fabricate biologically-active nanosystems from simple low-molecular-weight building blocks. These systems can bind biological polyanions in highly competitive conditions, allowing us to approach applications in gene delivery and coagulation control. In the process, however, we have also developed new fundamental principles such as self-assembled multivalency (SAMul), temporary 'on-off' multivalency, and adaptive/shape-persistent multivalent binding. By targeting materials with applications in drug formulation and tissue engineering, we have discovered novel self-assembling low-molecular-weight hydrogelators based on the industrially-relevant dibenzylidenesorbitol framework and developed innovative approaches to spatially-resolved gels and functional multicomponent hybrid hydrogels. In this way, taking an application-led approach to research has also delivered significant academic value and conceptual advances. Furthermore, beginning to translate fundamental supramolecular chemistry into real-world applications, starts to demonstrate the power of this approach, and its potential to transform the world around us for the better.

  20. Existential vulnerability: toward a psychopathology of limit situations.

    Science.gov (United States)

    Fuchs, Thomas

    2013-01-01

    Jaspers' concept of limit situations seems particularly appropriate not only to elucidate outstanding existential situations in general, but also basic preconditions for the occurrence of mental disorders. For this purpose, the concept is first explained in Jaspers' sense and then related to an 'existential vulnerability' of mentally ill persons that makes them experience even inconspicuous events as distressing limit situations. In such situations, an otherwise hidden fundamental condition of existence becomes manifest for them, e.g. the fragility of one's own body, the inevitability of freedom, or the finiteness of life. This fundamental condition is found unbearable and, as a reaction, gives rise to mental illness. This concept of existential vulnerability is illustrated by some psychopathological examples. © 2013 S. Karger AG, Basel.

  1. MAKING THE NEIGHBOURHOOD A BETTER PLACE TO LIVE. A SWB APPROACH IMPLEMENTING FUNDAMENTAL HUMAN NEEDS

    Directory of Open Access Journals (Sweden)

    Ioanna Anna Papachristou

    2015-10-01

Full Text Available Subjective well-being (SWB) studies have been at the centre of researchers’ attention during the last years. With the majority of people now living in cities, the necessity for a more anthropocentric approach for the study and betterment of urban environments is constantly increasing. In this sense, defining and measuring SWB in urban contexts can be of particular benefit in urban design and planning processes. In this article, a method for measuring SWB for urban places based on the accomplishment of the fundamental human needs is presented and applied at a neighbourhood of Barcelona; that of Vila de Gràcia. For the measurement, a survey was constructed based on the specific geographical and socio-economic characteristics of the study case. Retrieved from Max-Neef’s Human Scale Development Paradigm (Max-Neef et al. 1991), human needs correspond to the domains of study of the suggested method. The matching of the survey’s questions to each need is the outcome of two consecutive processes: a first qualitative one, involving the work of an expert group, and a second quantitative one, involving the definition of weights among the questions that affect the same need. Although the final result is positive (though low) for this study case, results for each need show considerable differences in their level of accomplishment. At the same time people seem to truly believe that most of their feelings are affected by their living environment, with stress and calmness leading the list. In summary, the method defines and applies a simple tool to quantify and evaluate current levels of SWB at different urban scales and to determine more holistic urban indexes in order to improve decision making processes, policies and plans. The classification of the questions per need favours the identification of a potential problem at the urban grid and consequently can be used as a process for implementing related measures of improvement. The method can also be seen
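The weighting step described in this record can be sketched as follows (a hypothetical illustration: question names, need names, weights and answers are all invented; only the structure, weighted need scores averaged into an overall index, follows the abstract):

```python
# Hypothetical sketch: survey answers (normalised 0..1) are mapped to needs
# with weights fixed by the expert-group step; each need is a weighted mean
# of its answers; the overall index averages the need scores.

answers = {"q1": 0.8, "q2": 0.4, "q3": 0.6, "q4": 0.9}

mapping = {
    "subsistence": {"q1": 0.7, "q2": 0.3},
    "affection":   {"q3": 0.5, "q4": 0.5},
}

def need_scores(answers, mapping):
    scores = {}
    for need, weights in mapping.items():
        total = sum(weights.values())
        scores[need] = sum(answers[q] * w for q, w in weights.items()) / total
    return scores

scores = need_scores(answers, mapping)
overall = sum(scores.values()) / len(scores)
print(scores, overall)
```

Keeping the per-need scores, rather than only the overall index, is what lets a low-scoring need point at a concrete problem in the urban grid.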

  2. Theoretical prediction and impact of fundamental electric dipole moments

    International Nuclear Information System (INIS)

    Ellis, Sebastian A.R.; Kane, Gordon L.

    2016-01-01

The predicted Standard Model (SM) electric dipole moments (EDMs) of electrons and quarks are tiny, providing an important window to observe new physics. Theories beyond the SM typically allow relatively large EDMs. The EDMs depend on the relative phases of terms in the effective Lagrangian of the extended theory, which are generally unknown. Underlying theories, such as string/M-theories compactified to four dimensions, could predict the phases and thus EDMs in the resulting supersymmetric (SUSY) theory. Earlier one of us, with collaborators, made such a prediction and found, unexpectedly, that the phases were predicted to be zero at tree level in the theory at the unification or string scale ∼O(10^16 GeV). Electroweak (EW) scale EDMs still arise via running from the high scale, and depend only on the SM Yukawa couplings that also give the CKM phase. Here we extend the earlier work by studying the dependence of the low scale EDMs on the constrained but not fully known fundamental Yukawa couplings. The dominant contribution is from two loop diagrams and is not sensitive to the choice of Yukawa texture. The electron EDM should not be found to be larger than about 5×10^−30 e cm, and the neutron EDM should not be larger than about 5×10^−29 e cm. These values are quite a bit smaller than the reported predictions from Split SUSY and typical effective theories, but much larger than the Standard Model prediction. Also, since models with random phases typically give much larger EDMs, it is a significant testable prediction of compactified M-theory that the EDMs should not be above these upper limits. The actual EDMs can be below the limits, so once they are measured they could provide new insight into the fundamental Yukawa couplings of leptons and quarks. We comment also on the role of strong CP violation. EDMs probe fundamental physics near the Planck scale.

  3. Theoretical prediction and impact of fundamental electric dipole moments

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, Sebastian A.R.; Kane, Gordon L. [Michigan Center for Theoretical Physics (MCTP),Department of Physics, University of Michigan,Ann Arbor, MI 48109 (United States)

    2016-01-13

    The predicted Standard Model (SM) electric dipole moments (EDMs) of electrons and quarks are tiny, providing an important window to observe new physics. Theories beyond the SM typically allow relatively large EDMs. The EDMs depend on the relative phases of terms in the effective Lagrangian of the extended theory, which are generally unknown. Underlying theories, such as string/M-theories compactified to four dimensions, could predict the phases and thus EDMs in the resulting supersymmetric (SUSY) theory. Earlier one of us, with collaborators, made such a prediction and found, unexpectedly, that the phases were predicted to be zero at tree level in the theory at the unification or string scale ∼O(10{sup 16} GeV). Electroweak (EW) scale EDMs still arise via running from the high scale, and depend only on the SM Yukawa couplings that also give the CKM phase. Here we extend the earlier work by studying the dependence of the low scale EDMs on the constrained but not fully known fundamental Yukawa couplings. The dominant contribution is from two loop diagrams and is not sensitive to the choice of Yukawa texture. The electron EDM should not be found to be larger than about 5×10{sup −30}e cm, and the neutron EDM should not be larger than about 5×10{sup −29}e cm. These values are quite a bit smaller than the reported predictions from Split SUSY and typical effective theories, but much larger than the Standard Model prediction. Also, since models with random phases typically give much larger EDMs, it is a significant testable prediction of compactified M-theory that the EDMs should not be above these upper limits. The actual EDMs can be below the limits, so once they are measured they could provide new insight into the fundamental Yukawa couplings of leptons and quarks. We comment also on the role of strong CP violation. EDMs probe fundamental physics near the Planck scale.

  4. Testing fundamental physics with gravitational waves

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The landmark detection of gravitational waves (GWs) has opened a new era in physics, giving access to the hitherto unexplored strong-gravity regime, where spacetime curvature is extreme and the relevant speed is close to the speed of light. In parallel to its countless astrophysical applications, this discovery can have also important implications for fundamental physics. In this context, I will discuss some outstanding, cross-cutting problems that can be finally investigated in the GW era: the nature of black holes and of spacetime singularities, the limits of classical gravity, the existence of extra light fields, and the effects of dark matter near compact objects. Future GW measurements will provide unparalleled tests of quantum-gravity effects at the horizon scale, exotic compact objects, ultralight dark matter, and of general relativity in the strong-field regime.

  5. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

Prelude: Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  6. Land Prices and Fundamentals

    OpenAIRE

    Koji Nakamura; Yumi Saita

    2007-01-01

This paper examines the long-term relationship between macroeconomic fundamentals and the weighted-average land price indicators, which are supposed to be more appropriate than the official land price indicators when analyzing their impacts on the macro economy. In many cases, we find cointegrating relationships between the weighted-average land price indicators and the discounted present value of land calculated based on the macroeconomic fundamentals indicators. We also find that the ...

  7. Fundamental volatility is regime specific

    NARCIS (Netherlands)

    Arnold, I.J.M.; MacDonald, R.; Vries, de C.G.

    2006-01-01

A widely held view is that freely floating exchange rates are excessively volatile when judged against fundamentals and when moving from fixed to floating exchange rates. We re-examine the data and conclude that the disparity between the fundamentals and exchange rate volatility is more

  8. A genetic approach to shape reconstruction in limited data tomography

    International Nuclear Information System (INIS)

    Turcanu, C.; Craciunescu, T.

    2001-01-01

The paper proposes a new method for shape reconstruction in computerized tomography. Unlike nuclear medicine applications, in physical science problems we are often confronted with limited data sets: constraints in the number of projections or limited view angles. The problem of image reconstruction from projections may be considered as a problem of finding an image (solution) having projections that match the experimental ones. In our approach, we choose a statistical correlation coefficient to evaluate the fitness of any potential solution. The optimization process is carried out by a genetic algorithm. The algorithm has some features common to all genetic algorithms but also some problem-oriented characteristics. One of them is that a chromosome, representing a potential solution, is not linear but coded as a matrix of pixels corresponding to a two-dimensional image. This kind of internal representation reflects the genuine manifestation of the solution: slight differences between two points in the original problem space give rise to similar differences once they are coded. Another particular feature is a newly built crossover operator: the grid-based crossover, suitable for high-dimension two-dimensional chromosomes. Except for the population size and the dimension of the cutting grid for the grid-based crossover, all the other parameters of the algorithm are independent of the geometry of the tomographic reconstruction. The performances of the method are evaluated on a phantom typical for an application with limited data sets: the determination of the neutron energy spectra with time resolution in case of short-pulsed neutron emission. A genetic reconstruction is presented. The qualitative judgement and also the quantitative one, based on some figures of merit, point out that the proposed method ensures an improved reconstruction of shapes, sizes and resolution in the image, even in the presence of noise. (authors)
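A toy sketch of the scheme described in this record is given below. Every concrete setting is an assumption for illustration, not taken from the paper: an 8×8 binary phantom, row and column sums as the limited projection set, Pearson correlation between projections as the fitness, elitist selection, and a grid-based crossover that swaps rectangular pixel blocks between parents:

```python
# Toy genetic reconstruction from limited projections (all settings assumed).
import random

random.seed(0)
N = 8  # image side

def projections(img):
    rows = [sum(r) for r in img]
    cols = [sum(img[i][j] for i in range(N)) for j in range(N)]
    return rows + cols

def correlation(a, b):
    """Pearson correlation coefficient used as the fitness measure."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return 0.0 if va == 0 or vb == 0 else cov / (va * vb) ** 0.5

def grid_crossover(p1, p2, cell=4):
    """Grid-based crossover: swap whole grid cells between the two parents."""
    child = [row[:] for row in p1]
    for bi in range(0, N, cell):
        for bj in range(0, N, cell):
            if random.random() < 0.5:
                for i in range(bi, bi + cell):
                    for j in range(bj, bj + cell):
                        child[i][j] = p2[i][j]
    return child

def mutate(img, rate=0.02):
    for i in range(N):
        for j in range(N):
            if random.random() < rate:
                img[i][j] ^= 1  # flip pixel
    return img

# Phantom: a filled square; its row/column sums play the "experimental" data.
target = [[1 if 2 <= i < 6 and 2 <= j < 6 else 0 for j in range(N)] for i in range(N)]
target_proj = projections(target)
fitness = lambda img: correlation(projections(img), target_proj)

pop = [[[random.randint(0, 1) for _ in range(N)] for _ in range(N)] for _ in range(30)]
initial_best = max(fitness(ind) for ind in pop)
for gen in range(40):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:6]  # elitism: the best solutions survive unchanged
    children = [mutate(grid_crossover(random.choice(elite), random.choice(elite)))
                for _ in range(len(pop) - len(elite))]
    pop = elite + children
final_best = max(fitness(ind) for ind in pop)
print(initial_best, final_best)
```

With elitism the best fitness is non-decreasing from generation to generation, mirroring the monotone improvement the abstract reports on its figures of merit.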

  9. Evaluating fundamentals of care: The development of a unit-level quality measurement and improvement programme.

    Science.gov (United States)

    Parr, Jenny M; Bell, Jeanette; Koziol-McLain, Jane

    2018-06-01

The project aimed to develop a unit-level quality measurement and improvement programme using evidence-based fundamentals of care. Feedback from patients, families, whānau, staff and audit data in 2014 indicated variability in the delivery of fundamental aspects of care such as monitoring, nutrition, pain management and environmental cleanliness at a New Zealand District Health Board. A general inductive approach was used to explore the fundamentals of care and design a measurement and improvement programme, the Patient and Whānau Centred Care Standards (PWCCS), focused on fundamental care. Five phases were used to explore the evidence, and design and test a measurement and improvement framework. Nine identified fundamental elements of care were used to define expected standards of care and develop and test a measurement and improvement framework. Four six-monthly peer reviews have been undertaken since June 2015. Charge Nurse Managers used results to identify quality improvements. Significant improvement was demonstrated overall, in six of the 27 units, in seven of the nine standards and three of the four measures. In all, 89% (n = 24) of units improved their overall result. The PWCCS measurement and improvement framework makes visible nursing fundamentals of care in line with continuous quality improvement to increase quality of care. Delivering fundamentals of care is described by nurses as getting 'back to basics'. Patient and family feedback supports the centrality of fundamentals of care to their hospital experience. Implementing a unit-level fundamentals of care quality measurement and improvement programme clarifies expected standards of care, highlights the contribution of fundamentals of care to quality and provides a mechanism for ongoing improvements. © 2018 John Wiley & Sons Ltd.

  10. Comparing the accuracy of perturbative and variational calculations for predicting fundamental vibrational frequencies of dihalomethanes

    Science.gov (United States)

    Krasnoshchekov, Sergey V.; Schutski, Roman S.; Craig, Norman C.; Sibaev, Marat; Crittenden, Deborah L.

    2018-02-01

    Three dihalogenated methane derivatives (CH2F2, CH2FCl, and CH2Cl2) were used as model systems to compare and assess the accuracy of two different approaches for predicting observed fundamental frequencies: canonical operator Van Vleck vibrational perturbation theory (CVPT) and vibrational configuration interaction (VCI). For convenience and consistency, both methods employ the Watson Hamiltonian in rectilinear normal coordinates, expanding the potential energy surface (PES) as a Taylor series about equilibrium and constructing the wavefunction from a harmonic oscillator product basis. At the highest levels of theory considered here, fourth-order CVPT and VCI in a harmonic oscillator basis with up to 10 quanta of vibrational excitation in conjunction with a 4-mode representation sextic force field (SFF-4MR) computed at MP2/cc-pVTZ with replacement CCSD(T)/aug-cc-pVQZ harmonic force constants, the agreement between computed fundamentals is closer to 0.3 cm-1 on average, with a maximum difference of 1.7 cm-1. The major remaining accuracy-limiting factors are the accuracy of the underlying electronic structure model, followed by the incompleteness of the PES expansion. Nonetheless, computed and experimental fundamentals agree to within 5 cm-1, with an average difference of 2 cm-1, confirming the utility and accuracy of both theoretical models. One exception to this rule is the formally IR-inactive but weakly allowed through Coriolis-coupling H-C-H out-of-plane twisting mode of dichloromethane, whose spectrum we therefore revisit and reassign. We also investigate convergence with respect to order of CVPT, VCI excitation level, and order of PES expansion, concluding that premature truncation substantially decreases accuracy, although VCI(6)/SFF-4MR results are still of acceptable accuracy, and some error cancellation is observed with CVPT2 using a quartic force field.
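The perturbative-versus-variational comparison in this record can be illustrated on a minimal one-dimensional analogue (this is not the authors' code; the quartic coupling and basis size are arbitrary choices). For H = p²/2 + x²/2 + λx⁴ with ħ = m = ω = 1, the first-order perturbative fundamental is 1 + 3λ, while a variational ("VCI-like") estimate comes from diagonalising H in a truncated harmonic-oscillator basis:

```python
# 1D quartic oscillator: perturbative vs variational fundamental (sketch).
import math

LAM, NBAS = 0.01, 24  # quartic coupling and basis size (arbitrary choices)

# <m|x^2|n> in the harmonic-oscillator basis (nonzero for m - n = 0, ±2)
X2 = [[0.0] * NBAS for _ in range(NBAS)]
for n in range(NBAS):
    X2[n][n] = n + 0.5
    if n + 2 < NBAS:
        X2[n + 2][n] = X2[n][n + 2] = math.sqrt((n + 1) * (n + 2)) / 2.0

# x^4 as the (truncated) matrix square of x^2; H = diag(n + 1/2) + LAM * x^4
X4 = [[sum(X2[i][k] * X2[k][j] for k in range(NBAS)) for j in range(NBAS)]
      for i in range(NBAS)]
H = [[LAM * X4[i][j] + (i + 0.5 if i == j else 0.0) for j in range(NBAS)]
     for i in range(NBAS)]

def jacobi_eigenvalues(A, sweeps=50):
    """Cyclic Jacobi diagonalisation of a symmetric matrix (pure Python)."""
    A = [row[:] for row in A]
    n = len(A)
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p][q]) < 1e-12:
                    continue
                th = 0.5 * math.atan2(2.0 * A[p][q], A[q][q] - A[p][p])
                c, s = math.cos(th), math.sin(th)
                for k in range(n):  # rotate rows p and q
                    apk, aqk = A[p][k], A[q][k]
                    A[p][k] = c * apk - s * aqk
                    A[q][k] = s * apk + c * aqk
                for k in range(n):  # rotate columns p and q
                    akp, akq = A[k][p], A[k][q]
                    A[k][p] = c * akp - s * akq
                    A[k][q] = s * akp + c * akq
    return sorted(A[i][i] for i in range(n))

levels = jacobi_eigenvalues(H)
fundamental_var = levels[1] - levels[0]  # variational estimate
fundamental_pt1 = 1.0 + 3.0 * LAM        # first-order perturbative estimate
print(fundamental_pt1, fundamental_var)
```

For a weak coupling the two estimates agree closely, with the variational value lying slightly below the first-order perturbative one, the same qualitative behaviour as the CVPT/VCI agreement reported in the abstract.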

  11. A macrothermodynamic approach to the limit of reversible capillary condensation.

    Science.gov (United States)

    Trens, Philippe; Tanchoux, Nathalie; Galarneau, Anne; Brunel, Daniel; Fubini, Bice; Garrone, Edoardo; Fajula, François; Di Renzo, Francesco

    2005-08-30

    The threshold of reversible capillary condensation is a well-defined thermodynamic property, as evidenced by corresponding states treatment of literature and experimental data on the lowest closure point of the hysteresis loop in capillary condensation-evaporation cycles for several adsorbates. The nonhysteretical filling of small mesopores presents the properties of a first-order phase transition, confirming that the limit of condensation reversibility does not coincide with the pore critical point. The enthalpy of reversible capillary condensation can be calculated by a Clausius-Clapeyron approach and is consistently larger than the condensation heat in unconfined conditions. Calorimetric data on the capillary condensation of tert-butyl alcohol in MCM-41 silica confirm a 20% increase of condensation heat in small mesopores. This enthalpic advantage makes easier the overcoming of the adhesion forces by the capillary forces and justifies the disappearing of the hysteresis loop.
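The Clausius-Clapeyron approach mentioned in this record amounts to the standard relation between the condensation pressure p_c and temperature (a textbook sketch, not the authors' derivation):

```latex
% Slope of the condensation line yields the enthalpy of capillary condensation:
\frac{\mathrm{d}\,\ln p_c}{\mathrm{d}\,(1/T)} = -\frac{\Delta H_{\mathrm{cap}}}{R}
```

Per the abstract's calorimetric data, the ΔH_cap obtained this way exceeds the unconfined condensation heat by roughly 20%.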

  12. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  13. Quantum limitations on the sensitivity of gravitational wave detectors with free masses

    International Nuclear Information System (INIS)

    Tsyplyaev, S.A.

    1989-01-01

    The problem of recording a classical disturbance by tracking the coordinate of a free particle is examined within the scope of nonrelativistic quantum mechanics. The absence of the fundamental limitation on the sensitivity - the standard quantum limit - is proven. An arbitrarily small disturbance can be recorded with preparation of the system in a quantum state having a negative quantum correlation coefficient between the observable coordinate and momentum. It is shown that it belongs to the collective coherent states - the condensed states. Arguments are presented for the absence of fundamental quantum limits on the magnitude of the recordable disturbance in the measurement of an arbitrary observable with a continuous spectrum
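For reference, the standard quantum limit (SQL) discussed in this record, and the free-evolution variance showing why a negative coordinate-momentum correlation can beat it, can be written as follows (textbook forms, not quoted from the paper):

```latex
% SQL for monitoring the position of a free mass m over a time tau:
\Delta x_{\mathrm{SQL}} = \sqrt{\frac{\hbar\,\tau}{m}}, \qquad
\left(\Delta x(\tau)\right)^2 = (\Delta x)^2
  + \frac{\tau}{m}\left(\langle \hat{x}\hat{p}+\hat{p}\hat{x}\rangle
  - 2\langle \hat{x}\rangle\langle \hat{p}\rangle\right)
  + \frac{\tau^{2}}{m^{2}}\,(\Delta p)^{2}
```

When the symmetrised x-p covariance in the middle term is negative, Δx(τ) can be squeezed below the SQL, which is the mechanism behind the paper's claim that the limit is not fundamental.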

  14. Green Manufacturing Fundamentals and Applications

    CERN Document Server

    2013-01-01

    Green Manufacturing: Fundamentals and Applications introduces the basic definitions and issues surrounding green manufacturing at the process, machine and system (including supply chain) levels. It also shows, by way of several examples from different industry sectors, the potential for substantial improvement and the paths to achieve the improvement. Additionally, this book discusses regulatory and government motivations for green manufacturing and outlines the path for making manufacturing more green as well as making production more sustainable. This book also: • Discusses new engineering approaches for manufacturing and provides a path from traditional manufacturing to green manufacturing • Addresses regulatory and economic issues surrounding green manufacturing • Details new supply chains that need to be in place before going green • Includes state-of-the-art case studies in the areas of automotive, semiconductor and medical areas as well as in the supply chain and packaging areas Green Manufactu...

  15. Limiting values for radioactive materials in food

    International Nuclear Information System (INIS)

    Steiner, Martin

    2014-01-01

The contribution describes the fundamentals of radiation protection: the LNT (linear, no threshold) hypothesis, ALARA (as low as reasonably achievable), and limiting values. Using the example of the Chernobyl nuclear accident, the differences in contamination development in various foodstuffs in Germany are demonstrated, including recommended limiting values and the radiation exposure after 30 years due to consumption of contaminated food. Exposure from natural radioactivity amounts to about 0.3 mSv/year.
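How a limiting value in food translates into an annual dose can be sketched with the usual ingestion-dose product (the activity concentration and consumption figures below are assumptions for illustration; the Cs-137 ingestion dose coefficient is the ICRP adult value):

```python
# Committed effective dose from eating contaminated food (illustrative values):
# dose = activity concentration x annual consumption x ingestion dose coefficient
activity = 600.0     # Bq/kg Cs-137 in the food (assumed, limit-order magnitude)
consumption = 20.0   # kg of that food eaten per year (assumed)
dose_coeff = 1.3e-8  # Sv/Bq, ICRP adult ingestion coefficient for Cs-137

dose_sv = activity * consumption * dose_coeff
dose_msv = dose_sv * 1e3
print(f"{dose_msv:.3f} mSv/year")
```

Even eating 20 kg/year of food at this contamination level yields a dose well below 1 mSv/year, which is the kind of margin the limiting values are designed to guarantee.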

  16. Standardless quantification approach of TXRF analysis using fundamental parameter method

    International Nuclear Information System (INIS)

    Szaloki, I.; Taniguchi, K.

    2000-01-01

A new standardless evaluation procedure based on the fundamental parameter method (FPM) has been developed for TXRF analysis. The theoretical calculation describes the relationship between characteristic intensities and the geometrical parameters of the excitation and detection system and the specimen parameters: size, thickness, angle of the excitation beam to the surface and the optical properties of the specimen holder. Most TXRF methods apply empirical calibration, which requires the application of a special preparation technique. However, the characteristic lines of the specimen holder (Si Kα,β) carry information about the local excitation and geometrical conditions at the substrate surface. On the basis of the theoretical calculation of the substrate characteristic intensity, the excitation beam flux can be approximated. Taking into consideration the elements present in the specimen material, a system of non-linear equations can be formulated involving the unknown concentration values and the geometrical and detection parameters. In order to solve this mathematical problem, PASCAL software was written, which calculates the sample composition and the average sample thickness by a gradient algorithm. Therefore, this quantitative estimation of the specimen composition requires neither an external nor an internal standard sample. For verification of the theoretical calculation and the numerical procedure, several experiments were carried out using a mixed standard solution containing K, Sc, V, Mn, Co and Cu in the 0.1-10 ppm concentration range. (author)
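The fitting step described in this record can be sketched in miniature. This is a hypothetical illustration, not the paper's PASCAL code: intensities are modelled as I_i = G·S_i·c_i, with the flux/geometry factor G taken from the substrate line and the concentrations recovered by gradient descent on the squared residuals (all sensitivities and intensities below are invented):

```python
# Standardless FPM fit, toy version: solve I_i = G * S_i * c_i for the c_i.
S = {"K": 2.0, "Mn": 3.5, "Cu": 5.0}       # assumed FP sensitivities
I_meas = {"K": 1.2, "Mn": 4.2, "Cu": 9.0}  # assumed measured intensities
G = 1.5                                    # flux factor from the Si K-alpha line

def residual_sq(c):
    return sum((I_meas[e] - G * S[e] * c[e]) ** 2 for e in S)

# gradient descent on the unknown concentrations
c = {e: 0.0 for e in S}
lr = 0.01
for _ in range(2000):
    for e in S:
        grad = -2.0 * G * S[e] * (I_meas[e] - G * S[e] * c[e])
        c[e] -= lr * grad

print(c, residual_sq(c))
```

In the real procedure the model is non-linear in the concentrations (absorption depends on the composition), so the gradient iteration must recompute the fundamental-parameter sensitivities at each step; the toy linear model only shows the shape of the fit.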

  17. Approaching conversion limit with all-dielectric solar cell reflectors.

    Science.gov (United States)

    Fu, Sze Ming; Lai, Yi-Chun; Tseng, Chi Wei; Yan, Sheng Lun; Zhong, Yan Kai; Shen, Chang-Hong; Shieh, Jia-Min; Li, Yu-Ren; Cheng, Huang-Chung; Chi, Gou-chung; Yu, Peichen; Lin, Albert

    2015-02-09

Metallic back reflectors have been used for thin-film and wafer-based solar cells for a very long time. Nonetheless, metallic mirrors might not be the best choice for photovoltaics. In this work, we show that solar cells with all-dielectric reflectors can surpass the best-configured metal-backed devices. Theoretical and experimental results all show that superior large-angle light scattering capability can be achieved by the diffuse medium reflectors, and the solar cell J-V enhancement is higher for solar cells using all-dielectric reflectors. Specifically, the measured diffused scattering efficiency (D.S.E.) of a diffuse medium reflector is >0.8 for the light trapping spectral range (600-1000 nm), and the measured reflectance of a diffuse medium can be as high as silver if the geometry of embedded titanium oxide (TiO2) nanoparticles is optimized. Moreover, the diffuse medium reflectors have the additional advantage of room-temperature processing, low cost, and very high throughput. We believe that using all-dielectric solar cell reflectors is a way to approach the thermodynamic conversion limit by completely excluding metallic dissipation.

  18. The Contribution of the Caribbean Court of Justice to the Development of Human and Fundamental Rights

    DEFF Research Database (Denmark)

    Caserta, Salvatore

    2018-01-01

    This article highlights some of the most important legal developments of the CCJ with the goal of clarifying its role as a human and fundamental rights court. The article also assesses these legal developments in the light of the Court’s authority. I argue that, through its case-law, the CCJ has succeeded in formally becoming a central player in the enforcement of human and fundamental rights in the region. In particular, the Court has shown a remarkable capacity to navigate the different jurisdictions of the Caribbean States (i.e., common v civil law systems) as well as the different legal cultures and approaches to international human rights and laws (i.e., dualism v monism as well as British v international approaches to human rights).

  19. Fundamental number theory with applications

    CERN Document Server

    Mollin, Richard A

    2008-01-01

    An update of the most accessible introductory number theory text available, Fundamental Number Theory with Applications, Second Edition presents a mathematically rigorous yet easy-to-follow treatment of the fundamentals and applications of the subject. The substantial amount of reorganizing makes this edition clearer and more elementary in its coverage. New to the Second Edition: • Removal of all advanced material to be even more accessible in scope • New fundamental material, including partition theory, generating functions, and combinatorial number theory • Expa

  20. Fundamental degradation mechanisms of layered oxide Li-ion battery cathode materials: Methodology, insights and novel approaches

    International Nuclear Information System (INIS)

    Hausbrand, R.; Cherkashinin, G.; Ehrenberg, H.; Gröting, M.; Albe, K.; Hess, C.; Jaegermann, W.

    2015-01-01

    Highlights: • Description of recent in operando and in situ analysis methodology. • Surface science approach using photoemission for analysis of cathode surfaces and interfaces. • Ageing and fatigue of layered oxide Li-ion battery cathode materials from the atomistic point of view. • Defect formation and electronic structure evolution as causes for cathode degradation. • Significance of interfacial energy alignment and contact potential for side reactions. Abstract: This overview addresses the atomistic aspects of degradation of layered LiMO2 (M = Ni, Co, Mn) oxide Li-ion battery cathode materials, aiming to shed light on the fundamental degradation mechanisms, especially inside active cathode materials and at their interfaces. It includes recent results obtained by novel in situ/in operando diffraction methods, modelling, and quasi in situ surface science analysis. Degradation of the active cathode material occurs upon overcharge, resulting from a positive potential shift of the anode. Oxygen loss and eventual phase transformation resulting in dead regions are ascribed to changes in electronic structure and defect formation. The anode potential shift results from loss of free lithium due to side reactions occurring at electrode/electrolyte interfaces. Such side reactions are caused by electron transfer, and depend on the electron energy level alignment at the interface. Side reactions at electrode/electrolyte interfaces and capacity fade may be overcome by the use of suitable solid-state electrolytes and Li-containing anodes.

  1. Fundamentals of structural dynamics

    CERN Document Server

    Craig, Roy R

    2006-01-01

    From theory and fundamentals to the latest advances in computational and experimental modal analysis, this is the definitive, updated reference on structural dynamics.This edition updates Professor Craig's classic introduction to structural dynamics, which has been an invaluable resource for practicing engineers and a textbook for undergraduate and graduate courses in vibrations and/or structural dynamics. Along with comprehensive coverage of structural dynamics fundamentals, finite-element-based computational methods, and dynamic testing methods, this Second Edition includes new and e

  2. Promoting fundamental clinical skills: a competency-based college approach at the University of Washington.

    Science.gov (United States)

    Goldstein, Erika A; Maclaren, Carol F; Smith, Sherilyn; Mengert, Terry J; Maestas, Ramoncita R; Foy, Hugh M; Wenrich, Marjorie D; Ramsey, Paul G

    2005-05-01

    The focus on fundamental clinical skills in undergraduate medical education has declined over the last several decades. Dramatic growth in the number of faculty involved in teaching and increasing clinical and research commitments have contributed to depersonalization and declining individual attention to students. In contrast to the close teaching and mentoring relationship between faculty and students 50 years ago, today's medical students may interact with hundreds of faculty members without the benefit of a focused program of teaching and evaluating clinical skills to form the core of their four-year curriculum. Bedside teaching has also declined, which may negatively affect clinical skills development. In response to these and other concerns, the University of Washington School of Medicine has created an integrated developmental curriculum that emphasizes bedside teaching and role modeling, focuses on enhancing fundamental clinical skills and professionalism, and implements these goals via a new administrative structure, the College system, which consists of a core of clinical teachers who spend substantial time teaching and mentoring medical students. Each medical student is assigned a faculty mentor within a College for the duration of his or her medical school career. Mentors continuously teach and reflect with students on clinical skills development and professionalism and, during the second year, work intensively with them at the bedside. They also provide an ongoing personal faculty contact. Competency domains and benchmarks define skill areas in which deepening, progressive attention is focused throughout medical school. This educational model places primary focus on the student.

  3. Quantum-limited heat conduction over macroscopic distances

    Science.gov (United States)

    Partanen, Matti; Tan, Kuan Yen; Govenius, Joonas; Lake, Russell E.; Mäkelä, Miika K.; Tanttu, Tuomo; Möttönen, Mikko

    2016-05-01

    The emerging quantum technological apparatuses, such as the quantum computer, call for extreme performance in thermal engineering. Cold distant heat sinks are needed for the quantized electric degrees of freedom owing to the increasing packaging density and heat dissipation. Importantly, quantum mechanics sets a fundamental upper limit for the flow of information and heat, which is quantified by the quantum of thermal conductance. However, the short distance between the heat-exchanging bodies in previous experiments hinders their applicability in quantum technology. Here, we present experimental observations of quantum-limited heat conduction over macroscopic distances extending to a metre. We achieved this improvement of four orders of magnitude in the distance by utilizing microwave photons travelling in superconducting transmission lines. Thus, it seems that quantum-limited heat conduction has no fundamental distance cutoff. This work establishes the integration of normal-metal components into the framework of circuit quantum electrodynamics, which provides a basis for the superconducting quantum computer. In particular, our results facilitate remote cooling of nanoelectronic devices using faraway in situ-tunable heat sinks. Furthermore, quantum-limited heat conduction is important in contemporary thermodynamics. Here, the long distance may lead to ultimately efficient mesoscopic heat engines with promising practical applications.

  4. An Adaptive Approach to Mitigate Background Covariance Limitations in the Ensemble Kalman Filter

    KAUST Repository

    Song, Hajoon

    2010-07-01

    A new approach is proposed to address the background covariance limitations arising from undersampled ensembles and unaccounted model errors in the ensemble Kalman filter (EnKF). The method enhances the representativeness of the EnKF ensemble by augmenting it with new members chosen adaptively to add missing information that prevents the EnKF from fully fitting the data to the ensemble. The vectors to be added are obtained by back projecting the residuals of the observation misfits from the EnKF analysis step onto the state space. The back projection is done using an optimal interpolation (OI) scheme based on an estimated covariance of the subspace missing from the ensemble. In the experiments reported here, the OI uses a preselected stationary background covariance matrix, as in the hybrid EnKF–three-dimensional variational data assimilation (3DVAR) approach, but the resulting correction is included as a new ensemble member instead of being added to all existing ensemble members. The adaptive approach is tested with the Lorenz-96 model. The hybrid EnKF–3DVAR is used as a benchmark to evaluate the performance of the adaptive approach. Assimilation experiments suggest that the new adaptive scheme significantly improves the EnKF behavior when it suffers from small size ensembles and neglected model errors. It was further found to be competitive with the hybrid EnKF–3DVAR approach, depending on ensemble size and data coverage.
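The back-projection step described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: a 2-D state, a single observation of the first component, and a preselected stationary covariance B, with the OI correction appended as a new ensemble member; all numbers are invented.

```python
# Sketch of the adaptive EnKF augmentation: back-project the observation
# residual through a stationary background covariance B (optimal
# interpolation) and append the result as a new ensemble member.

# Undersampled ensemble of 2-D states (3 members).
ensemble = [[1.0, 0.4], [1.2, 0.6], [0.8, 0.5]]
n = len(ensemble)
mean = [sum(m[k] for m in ensemble) / n for k in range(2)]

y, r = 2.0, 0.1                 # observation of state[0] and its error variance
B = [[1.0, 0.5], [0.5, 1.0]]    # preselected stationary background covariance

# OI gain for observation operator H = [1, 0]: K = B H^T / (H B H^T + R)
denom = B[0][0] + r
gain = [B[0][0] / denom, B[1][0] / denom]

residual = y - mean[0]          # misfit the ensemble could not fit
new_member = [mean[k] + gain[k] * residual for k in range(2)]
ensemble.append(new_member)     # augment rather than correct all members

print(new_member)
```

The contrast with the hybrid EnKF-3DVAR approach is visible in the last line: the correction becomes one additional member instead of being added to every existing member.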

  5. How-to-Do-It: Hands-on Activity for Mitosis, Meiosis and the Fundamentals of Heredity.

    Science.gov (United States)

    Taylor, Mark F.

    1988-01-01

    Described is an exercise which uses inexpensive and easy-to-make materials to demonstrate the fundamentals of heredity. Discusses two approaches using a hypothetical insect to demonstrate inheritance, mitosis, meiosis, and genotypic and phenotypic frequencies. (CW)

  6. Fundamentals of environmental engineering. 2. rev. ed.

    International Nuclear Information System (INIS)

    Bank, M.

    1994-01-01

    'Fundamentals of Environmental Engineering' contains, in compact form, the technical and legal bases for the four environmental areas: water supply and waste water disposal, clean air, waste avoidance and waste disposal, and noise protection. Particular scope was allowed for the description of the linkages between the individual environmental areas - for instance, waste combustion and clean air, waste deposition at landfills and treatment of leachate, and residual products from successful water and air pollution control measures. For all those who have to familiarize themselves with the complex subject of environmental engineering while in training or during continuing education, this book offers a broad approach to the essential general, technical and legal bases. (orig.)

  7. Supramolecular chemistry and chemical warfare agents: from fundamentals of recognition to catalysis and sensing.

    Science.gov (United States)

    Sambrook, M R; Notman, S

    2013-12-21

    Supramolecular chemistry presents many possible avenues for the mitigation of the effects of chemical warfare agents (CWAs), including sensing, catalysis and sequestration. To date, efforts in this field, both to study the fundamental interactions of CWAs and to design and exploit host systems, remain sporadic. In this tutorial review the non-covalent recognition of CWAs is considered from first principles, including taking inspiration from enzymatic systems, and gaps in fundamental knowledge are indicated. Examples of synthetic systems developed for the recognition of CWAs are discussed with a focus on supramolecular complexation behaviour and non-covalent approaches rather than on the proposed applications.

  8. MDL, Collineations and the Fundamental Matrix

    OpenAIRE

    Maybank , Steve; Sturm , Peter

    1999-01-01

    Scene geometry can be inferred from point correspondences between two images. The inference process includes the selection of a model. Four models are considered: background (or null), collineation, affine fundamental matrix and fundamental matrix. It is shown how Minimum Description Length (MDL) can be used to compare the different models. The main result is that there is little reason for preferring the fundamental matrix model over the collineation model, even when ...

  9. Stability of rigid rotors supported by air foil bearings: Comparison of two fundamental approaches

    DEFF Research Database (Denmark)

    Larsen, Jon Steffen; Santos, Ilmar; von Osmanski, Alexander Sebastian

    2016-01-01

    High speed direct drive motors enable the use of Air Foil Bearings (AFB) in a wide range of applications due to the elimination of gear forces. Unfortunately, AFB supported rotors are lightly damped, and an accurate prediction of their Onset Speed of Instability (OSI) is therefore important. This paper compares two fundamental methods for predicting the OSI. One is based on a nonlinear time domain simulation and another is based on a linearised frequency domain method and a perturbation of the Reynolds equation. Both methods are based on equivalent models and should predict similar results.

  10. Ecotoxicological, ecophysiological, and biogeochemical fundamentals of risk assessment

    International Nuclear Information System (INIS)

    Bashkin, V.N.; Kozlov, M.Ya.; Evstafjeva, E.V.

    1993-01-01

    Risk assessment (RA) in radionuclide-polluted regions, where many factors interact, is carried out by determining the biogeochemical structure of a region. Consequently, ecological-biogeochemical regionalization and ecotoxicological and ecophysiological monitoring of human population health are important approaches to RA. These criteria should be combined with LCA of various industrial and agricultural products. Such fundamentals and approaches are needed for areas where traditional pollutants (heavy metals, pesticides, fertilizers, POPs, etc.) are sharply reinforced by radioactive pollution. For the RA of these complex pollutants, methods for assessing human adaptability to a polluted environment have been developed. These techniques include biogeochemical, ecotoxicological, and ecophysiological analyses of risk factors as well as quantitative analysis of uncertainties using expert-modeling systems. Furthermore, modern statistical methods are used for the quantitative assessment of human adaptability to radioactive and nonradioactive pollutants. The results obtained in the Chernobyl regions show the suitability of these methods for risk assessment.

  11. Fundamental Properties of the SHIELD Galaxies

    Science.gov (United States)

    Cannon, John; Adams, Betsey; Giovanelli, Riccardo; Haynes, Martha; Jones, Michael; McQuinn, Kristen; Rhode, Katherine; Salzer, John; Skillman, Evan

    2018-05-01

    The ALFALFA survey has significantly advanced our knowledge of the HI mass function (HIMF), particularly at the low mass end. From the ALFALFA survey, we have constructed a sample of all of the galaxies with HI masses less than 20 million solar masses. Observations of this 82 galaxy sample allow, for the first time, a characterization of the lowest HI mass galaxies at redshift zero. Specifically, this sample can be used to determine the low HI-mass ends of various fundamental scaling relations, including the critical baryonic Tully Fisher relation (BTFR) and the mass-metallicity (M-Z) relation. The M-Z relation and the BTFR are cosmologically important, but current samples leave the low-mass parameter spaces severely underpopulated. A full understanding of these relationships depends critically on accurate stellar masses of this complete sample of uniformly-selected galaxies. Here, we request imaging of the 70 galaxies in our sample that have not been observed with Spitzer. The proposed imaging will allow us to measure stellar masses and inclinations of the sample galaxies using a uniform observational approach. Comparison with (existing and in progress) interferometric HI imaging and with ground-based optical imaging and spectroscopy will enable a robust mass decomposition in each galaxy and accurate placements on the aforementioned scaling relationships. The observations proposed here will allow us to populate the mass continuum between mini-halos and bona fide dwarf galaxies, and to address a range of fundamental questions in galaxy formation and near-field cosmology.

  12. [From fundamental research to clinical development: a review of orthodontics].

    Science.gov (United States)

    Zhao, Zhi-he; Bai, Ding

    2011-11-01

    In recent years, new approaches to the diagnosis and treatment of malocclusion have emerged. The diagnostic and therapeutic techniques of orthodontics have evolved from two dimensions to five dimensions with the development of computer technology, automated machining and imaging. Furthermore, interdisciplinary study has become the driving force for the advancement of fundamental research in orthodontics. The mechanisms of malocclusion and orthodontic tooth movement have been studied in detail at the cellular and molecular levels.

  13. Nano-photonic light trapping near the Lambertian limit in organic solar cell architectures.

    Science.gov (United States)

    Biswas, Rana; Timmons, Erik

    2013-09-09

    A critical step to achieving higher efficiency solar cells is the broad-band harvesting of solar photons. Although considerable progress has recently been achieved in improving the power conversion efficiency of organic solar cells, these cells still do not absorb up to ~50% of the solar spectrum. We have designed and developed an organic solar cell architecture that can boost the absorption of photons by 40% and the photo-current by 50% for organic P3HT-PCBM absorber layers of typical device thicknesses. Our solar cell architecture is based on all layers of the solar cell being patterned in a conformal, two-dimensionally periodic photonic crystal architecture. This results in very strong diffraction of photons, which increases the photon path length in the absorber layer, and in plasmonic light concentration near the patterned organic-metal cathode interface. The absorption approaches the Lambertian limit. The simulations utilize a rigorous scattering matrix approach and provide bounds on the fundamental limits of nano-photonic light absorption in periodically textured organic solar cells. This solar cell architecture has the potential to increase the power conversion efficiency to 10% for single band gap organic solar cells utilizing long-wavelength absorbers.
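For scale, the Lambertian (Yablonovitch) bound of 4n^2 on path-length enhancement, which this record and the grating-structure record above both invoke, can be evaluated directly. The refractive indices below are illustrative values chosen by us, not taken from the paper.

```python
# The Lambertian light-trapping bound: random surface texturing can
# enhance the mean optical path length in a weakly absorbing film by
# up to 4 * n^2, where n is the film's refractive index.
def lambertian_limit(n):
    return 4.0 * n ** 2

print(lambertian_limit(3.5))   # crystalline silicon, n ~ 3.5 -> enhancement 49
print(lambertian_limit(2.0))   # an organic absorber with n ~ 2 -> enhancement 16
```

The lower index of organic absorbers is one reason the bulk 4n^2 figure is less generous there, which motivates the nano-photonic structuring described above.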

  14. Historical-systematic fundaments of the Trinitarian theory of the liturgical event

    Directory of Open Access Journals (Sweden)

    Robert Woźniak

    2011-12-01

    The object of the present research is to develop some fundamental traces of the Trinitarian understanding of the Christian liturgy. The article attempts to point out the fundamental coordinates of a Trinitarian comprehension of the liturgy from the historical perspective. In order to do this, it traces the links between the first formulations of Trinitarian faith and the early development of the Christian liturgy. The argument starts with a consideration of some new biblical approaches to the phenomena of early Christian cult seen in their theological (Christological and Trinitarian) constellation (Bauckham, Hurtado). After this preliminary biblical-theological inquiry, some fundamental patristic texts are taken into account. The last stage of the investigation is a presentation of the Second Vatican Council’s account of the theology of liturgy, which proves itself to be openly Trinitarian.

  15. From the Kohn-Sham band gap to the fundamental gap in solids. An integer electron approach

    NARCIS (Netherlands)

    Baerends, E. J.

    2017-01-01

    It is often stated that the Kohn-Sham occupied-unoccupied gap in both molecules and solids is "wrong". We argue that this is not a correct statement. KS theory does not allow one to interpret the exact KS HOMO-LUMO gap as the fundamental gap (the difference I - A of the ionization energy I and the electron affinity A).

  16. Fundamentals of continuum mechanics

    CERN Document Server

    Rudnicki, John W

    2014-01-01

    A concise introductory course text on continuum mechanics. Fundamentals of Continuum Mechanics focuses on the fundamentals of the subject and provides the background for formulation of numerical methods for large deformations and a wide range of material behaviours. It aims to provide the foundations for further study, not just of these subjects, but also of the formulations for much more complex material behaviour and their implementation computationally. This book is divided into 5 parts, covering mathematical preliminaries, stress, motion and deformation, and balance of mass, momentum and energy

  17. Photonic Design: From Fundamental Solar Cell Physics to Computational Inverse Design

    OpenAIRE

    Miller, Owen Dennis

    2012-01-01

    Photonic innovation is becoming ever more important in the modern world. Optical systems are dominating shorter and shorter communications distances, LED's are rapidly emerging for a variety of applications, and solar cells show potential to be a mainstream technology in the energy space. The need for novel, energy-efficient photonic and optoelectronic devices will only increase. This work unites fundamental physics and a novel computational inverse design approach towards such innovation....

  18. The Fundamental Right to a Decent Work as a Resizing Factor of Company Managers Liability

    Directory of Open Access Journals (Sweden)

    Carla Eugenia Caldas Barros

    2016-11-01

    The present paper aims to point out the value of human labor, based on the idea of decent work as a fundamental right, and analyzes the impact of this idea on the liability of company managers. Initially, using a theoretical and descriptive method, it describes the rise of human dignity as the center of the legal system and its connection to fundamental labor rights as a limiting factor on business activity. Finally, through the deductive method, it shows how the resizing of company managers' liability is conditioned by the principles and values consolidated in the constitutional economic order.

  19. Limiting processes in non-equilibrium classical statistical mechanics

    International Nuclear Information System (INIS)

    Jancel, R.

    1983-01-01

    After recalling the basic principles of statistical mechanics, the results of ergodic theory, the transition to the thermodynamic limit and its link with transport theory near equilibrium are analyzed. The fundamental problems posed by the description of non-equilibrium macroscopic systems are investigated and the kinetic methods are stated. The problems of non-equilibrium statistical mechanics are analyzed: irreversibility and coarse-graining, macroscopic variables and kinetic description, autonomous reduced descriptions, limit processes, the BBGKY hierarchy, and limit theorems.

  20. About limit masses of elementary particles

    International Nuclear Information System (INIS)

    Ibadova, U.R.

    2002-01-01

    when the mass of particles can be compared with the mass of automobiles. The modern QFT does not forbid such a physically meaningless extrapolation. Perhaps this is a principal defect of the theory? In 1965 Markov put forward a hypothesis according to which the spectrum of masses of elementary particles must terminate at the Planck mass m_Planck = √(ℏc/G); the well-known universal constants ℏ and c and the gravitational constant G appear in this expression. Markov named particles of the limiting mass 'maximons'. The concept of the 'maximon' lies at the basis of Markov's scenario of the early Universe. It is necessary to mention that the standard theoretical-field apparatus is used for the description of the 'maximon'. Markov's idea concerning the existence of a finite limit on the mass of elementary particles as a fundamental physical principle, similar to the relativistic and quantum postulates at the basis of QFT, was realized by V.G. Kadyshevsky, who writes Markov's condition as m < M, considering the limiting mass M simply as a new universal constant in the theory of 'fundamental mass'. In the given work, we seek to connect the universal constants (ℏ, c, G), the Planck-mass 'maximon' and the 'fundamental mass' on the basis of spontaneous symmetry breaking
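The limiting mass discussed above can be evaluated numerically; a short check using CODATA values of the constants (the variable names are ours):

```python
import math

# Numerical value of the limiting ("maximon") mass,
# m_Planck = sqrt(hbar * c / G), from CODATA constants.
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m / s
G = 6.67430e-11          # gravitational constant, m^3 / (kg s^2)

m_planck = math.sqrt(hbar * c / G)
print(m_planck)          # ~2.18e-8 kg: tiny on everyday scales,
                         # enormous compared with any known particle
```

The result, about 22 micrograms, makes the point of the abstract concrete: a limiting mass some nineteen orders of magnitude above the proton mass.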

  1. Fluid mechanics fundamentals and applications

    CERN Document Server

    Cengel, Yunus

    2013-01-01

    Cengel and Cimbala's Fluid Mechanics Fundamentals and Applications communicates directly with tomorrow's engineers in a simple yet precise manner. The text covers the basic principles and equations of fluid mechanics in the context of numerous and diverse real-world engineering examples. The text helps students develop an intuitive understanding of fluid mechanics by emphasizing the physics, using figures, numerous photographs and visual aids to reinforce the physics. The highly visual approach enhances the learning of fluid mechanics by students. This text distinguishes itself from others by the way the material is presented - in a progressive order from simple to more difficult, building each chapter upon foundations laid down in previous chapters. In this way, even the traditionally challenging aspects of fluid mechanics can be learned effectively. McGraw-Hill is also proud to offer ConnectPlus powered by Maple with the third edition of Cengel/Cimbala, Fluid Mechanics. This innovative and powerful new sy...

  2. Evaluation of a TRU fundamental criterion and reference TRU waste units

    International Nuclear Information System (INIS)

    Klett, R.

    1993-01-01

    This paper compares two options for regulating transuranic (TRU) waste disposal: (1) fundamental and derived standards developed specifically for TRU waste and (2) a family of procedures that relate a reference TRU waste unit to commercial high-level waste (HLW) criteria. Background information pertaining to both options is covered. A section on criteria specifically for TRU waste suggests a methodology for developing or adapting fundamental and derived criteria that are consistent with all other aspects of the standards. The section on reference TRU waste units covers all the parameter variations that have been suggested for this option. The technical bases of each approach are reviewed, implementation is discussed, and their relative attributes and deficiencies are evaluated

  3. Pragmatic electrical engineering fundamentals

    CERN Document Server

    Eccles, William

    2011-01-01

    Pragmatic Electrical Engineering: Fundamentals introduces the fundamentals of the energy-delivery part of electrical systems. It begins with a study of basic electrical circuits and then focuses on electrical power. Three-phase power systems, transformers, induction motors, and magnetics are the major topics.All of the material in the text is illustrated with completely-worked examples to guide the student to a better understanding of the topics. This short lecture book will be of use at any level of engineering, not just electrical. Its goal is to provide the practicing engineer with a practi

  4. Possible limits of plasma linear colliders

    Science.gov (United States)

    Zimmermann, F.

    2017-07-01

    Plasma linear colliders have been proposed as next or next-next generation energy-frontier machines for high-energy physics. I investigate possible fundamental limits on the energy and luminosity of such colliders, considering acceleration, multiple scattering off plasma ions, intrabeam scattering, bremsstrahlung, and betatron radiation. The question of energy efficiency is also addressed.

  5. DNA data in criminal procedure in the European fundamental rights context.

    Science.gov (United States)

    Soleto, Helena

    2014-01-01

    Despite being one of the most useful and reliable identification tools, DNA profiling in criminal procedure balances on the border between the limitation and violation of Fundamental Rights that can occur beginning with the collection of the sample, its analysis, and its use; and ending with its processing. Throughout this complex process, violation of human or fundamental rights -such as the right to physical and moral integrity, the right not to be subject to degrading treatment, the right not to incriminate oneself, the right to family privacy together with that of not incriminating descendants or relatives in general, the right to personal development and the right to informative self-determination- is possible. This article presents an analysis of all the above-mentioned DNA treating phases in criminal process in the light of possible violations of some Fundamental Rights, while at the same time discarding some of them on the basis of European human rights protection standards. As the case-law of the European Court of Human Rights shows, the legislation on DNA collection and DNA related data processing or its implementation does not always respect all human rights and should be carefully considered before its adoption and during its application.

  6. Arithmetic fundamental groups and moduli of curves

    International Nuclear Information System (INIS)

    Makoto Matsumoto

    2000-01-01

    This is a short note on the algebraic (sometimes called arithmetic) fundamental groups of an algebraic variety, which connect classical fundamental groups with the Galois groups of fields. A large part of this note describes the algebraic fundamental groups in a concrete manner. This note gives only a sketch of the fundamental groups of the algebraic stack of moduli of curves. Some application to a purely topological statement, i.e., an obstruction to the surjectivity of Johnson homomorphisms in the mapping class groups, which comes from the Galois group of Q, is explained. (author)

  7. Constant physics and characteristics of fundamental constant

    International Nuclear Information System (INIS)

    Tarrach, R.

    1998-01-01

    We present some evidence which supports a surprising physical interpretation of the fundamental constants. First, we relate two of them through the renormalization group. This leaves as many fundamental constants as base units. Second, we introduce an adimensional system of units without fundamental constants. Third, and most important, we find, while interpreting the units of the adimensional system, that in all cases accessible to experimentation the fundamental constants indicate either discretization at small values or boundedness at large values of the corresponding physical quantity. (Author) 12 refs

  8. On the fundamental theorem of card counting, with application to the game of trente et quarante

    OpenAIRE

    Ethier, S. N.; Levin, D. A.

    2005-01-01

    A simplified proof of Thorp and Walden's fundamental theorem of card counting is presented, and a corresponding central limit theorem is established. Results are applied to the casino game of trente et quarante, which was studied by Poisson and De Morgan.

  9. Ion beam analysis fundamentals and applications

    CERN Document Server

    Nastasi, Michael; Wang, Yongqiang

    2015-01-01

    Ion Beam Analysis: Fundamentals and Applications explains the basic characteristics of ion beams as applied to the analysis of materials, as well as ion beam analysis (IBA) of art/archaeological objects. It focuses on the fundamentals and applications of ion beam methods of materials characterization. The book explains how ions interact with solids and describes what information can be gained. It starts by covering the fundamentals of ion beam analysis, including kinematics, ion stopping, Rutherford backscattering, channeling, elastic recoil detection, particle induced x-ray emission, and nucle

  10. 2016 TSRC Summer School on Fundamental Science for Alternative Energy

    Energy Technology Data Exchange (ETDEWEB)

    Batista, Victor S. [Yale Univ., New Haven, CT (United States)

    2017-08-25

    The 2016 TSRC Summer School on Fundamental Science for Alternative Energy introduced principles, methods, and approaches relevant to the design of molecular transformations, energy transduction, and current applications for alternative energy. Energy and environment are likely to be key themes that will dominate the way science and engineering develop over the next few decades. Only an interdisciplinary approach with a team-taught structure as presented at the 2016 TSRC Summer School can be expected to succeed in the face of problems of such difficulty. The course inspired a new generation of 24 graduate students and 2 post-docs to continue work in the field, or at least to have something of an insider's point of view as the field develops in the next few decades.

  11. Accuracy Limitations in Optical Linear Algebra Processors

    Science.gov (United States)

    Batsell, Stephen Gordon

    1990-01-01

    One of the limiting factors in applying optical linear algebra processors (OLAPs) to real-world problems has been the poor achievable accuracy of these processors. Little previous research has been done on determining noise sources from a systems perspective which would include noise generated in the multiplication and addition operations, noise from spatial variations across arrays, and from crosstalk. In this dissertation, we propose a second-order statistical model for an OLAP which incorporates all these system noise sources. We now apply this knowledge to determining upper and lower bounds on the achievable accuracy. This is accomplished by first translating the standard definition of accuracy used in electronic digital processors to analog optical processors. We then employ our second-order statistical model. Having determined a general accuracy equation, we consider limiting cases such as for ideal and noisy components. From the ideal case, we find the fundamental limitations on improving analog processor accuracy. From the noisy case, we determine the practical limitations based on both device and system noise sources. These bounds allow system trade-offs to be made both in the choice of architecture and in individual components in such a way as to maximize the accuracy of the processor. Finally, by determining the fundamental limitations, we show the system engineer when the accuracy desired can be achieved from hardware or architecture improvements and when it must come from signal pre-processing and/or post-processing techniques.
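As a rough sketch of how independent system noise sources bound achievable analog accuracy: the quadrature-sum noise model and the k-sigma resolvability criterion below are illustrative assumptions, not the dissertation's actual second-order statistical model:

```python
import math

def achievable_bits(noise_sds, full_scale=1.0, k=3.0):
    """Accuracy (in bits) of an analog processor whose independent noise
    sources add in quadrature; two output levels count as resolvable when
    they are separated by k standard deviations of the total noise."""
    sigma = math.sqrt(sum(s * s for s in noise_sds))
    return math.log2(full_scale / (k * sigma))

# Hypothetical multiplication noise, detector noise and crosstalk terms
# (as fractions of full scale) combine to set the accuracy bound.
bits = achievable_bits([0.001, 0.002, 0.0005])
```

Under this model, reducing the single largest noise source dominates the gain in accuracy, which is the kind of trade-off the dissertation's bounds are meant to expose.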

  12. Revisiting Pocos de Caldas. Application of the co-precipitation approach to establish realistic solubility limits for performance assessment

    International Nuclear Information System (INIS)

    Bruno, J.; Duro, L.; Jordana, S.; Cera, E.

    1996-02-01

    Solubility limits constitute a critical parameter for the determination of the mobility of radionuclides in the near field and the geosphere, and consequently for the performance assessment of nuclear waste repositories. Mounting evidence from natural system studies indicates that trace elements, and consequently radionuclides, are associated with the dynamic cycling of major geochemical components. We have recently developed a thermodynamic approach to take into consideration the co-precipitation and co-dissolution processes that mainly control this linkage. The approach has been tested in various natural system studies with encouraging results. The Pocos de Caldas natural analogue was one of the sites where a full testing of our predictive geochemical modelling capabilities was done during the analogue project. We have revisited the Pocos de Caldas data and expanded the trace element solubility calculations by considering the documented trace metal/major ion interactions. This has been done by using the co-precipitation/co-dissolution approach. The outcome is as follows: A satisfactory modelling of the behaviour of U, Zn and REEs is achieved by assuming co-precipitation with ferrihydrite. Strontium concentrations are apparently controlled by its co-dissolution from Sr-rich fluorites. From the performance assessment point of view, the present work indicates that calculated solubility limits using the co-precipitation approach are in close agreement with the actual trace element concentrations. Furthermore, the calculated radionuclide concentrations are 2-4 orders of magnitude lower than conservative solubility limits calculated by assuming equilibrium with individual trace element phases. 34 refs, 18 figs, 13 tabs

  13. Revisiting Pocos de Caldas. Application of the co-precipitation approach to establish realistic solubility limits for performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bruno, J.; Duro, L.; Jordana, S.; Cera, E. [QuantiSci, Barcelona (Spain)

    1996-02-01

    Solubility limits constitute a critical parameter for the determination of the mobility of radionuclides in the near field and the geosphere, and consequently for the performance assessment of nuclear waste repositories. Mounting evidence from natural system studies indicates that trace elements, and consequently radionuclides, are associated with the dynamic cycling of major geochemical components. We have recently developed a thermodynamic approach to take into consideration the co-precipitation and co-dissolution processes that mainly control this linkage. The approach has been tested in various natural system studies with encouraging results. The Pocos de Caldas natural analogue was one of the sites where a full testing of our predictive geochemical modelling capabilities was done during the analogue project. We have revisited the Pocos de Caldas data and expanded the trace element solubility calculations by considering the documented trace metal/major ion interactions. This has been done by using the co-precipitation/co-dissolution approach. The outcome is as follows: A satisfactory modelling of the behaviour of U, Zn and REEs is achieved by assuming co-precipitation with ferrihydrite. Strontium concentrations are apparently controlled by its co-dissolution from Sr-rich fluorites. From the performance assessment point of view, the present work indicates that calculated solubility limits using the co-precipitation approach are in close agreement with the actual trace element concentrations. Furthermore, the calculated radionuclide concentrations are 2-4 orders of magnitude lower than conservative solubility limits calculated by assuming equilibrium with individual trace element phases. 34 refs, 18 figs, 13 tabs.

  14. Recent Advances and Future Prospects in Fundamental Symmetries

    Science.gov (United States)

    Plaster, Brad

    2017-09-01

    A broad program of initiatives in fundamental symmetries seeks answers to several of the most pressing open questions in nuclear physics, ranging from the scale of the neutrino mass, to the particle-antiparticle nature of the neutrino, to the origin of the matter-antimatter asymmetry, to the limits of Standard Model interactions. Although the experimental program is quite broad, with efforts ranging from precision measurements of neutrino properties; to searches for electric dipole moments; to precision measurements of magnetic dipole moments; and to precision measurements of couplings, particle properties, and decays; all of these seemingly disparate initiatives are unified by several common threads. These include the use and exploitation of symmetry principles, novel cross-disciplinary experimental work at the forefront of the precision frontier, and the need for accompanying breakthroughs in development of the theory necessary for an interpretation of the anticipated results from these experiments. This talk will highlight recent accomplishments and advances in fundamental symmetries and point to the extraordinary level of ongoing activity aimed at realizing the development and interpretation of next-generation experiments. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, under Award Number DE-SC-0014622.

  15. Response approach to the squeezed-limit bispectrum: application to the correlation of quasar and Lyman-α forest power spectrum

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Chi-Ting [C.N. Yang Institute for Theoretical Physics, Stony Brook University, Stony Brook, NY 11794 (United States); Cieplak, Agnieszka M.; Slosar, Anže [Brookhaven National Laboratory, Bldg 510, Upton, NY 11375 (United States); Schmidt, Fabian, E-mail: chi-ting.chiang@stonybrook.edu, E-mail: acieplak@bnl.gov, E-mail: fabians@mpa-garching.mpg.de, E-mail: anze@bnl.gov [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching (Germany)

    2017-06-01

    The squeezed-limit bispectrum, which is generated by nonlinear gravitational evolution as well as inflationary physics, measures the correlation of three wavenumbers in the configuration where one wavenumber is much smaller than the other two. Since the squeezed-limit bispectrum encodes the impact of a large-scale fluctuation on the small-scale power spectrum, it can be understood as how the small-scale power spectrum "responds" to the large-scale fluctuation. Viewed in this way, the squeezed-limit bispectrum can be calculated using the response approach even in cases which do not submit to perturbative treatment. To illustrate this point, we apply this approach to the cross-correlation between the large-scale quasar density field and the small-scale Lyman-α forest flux power spectrum. In particular, using separate universe simulations which implement changes in the large-scale density, velocity gradient, and primordial power spectrum amplitude, we measure how the Lyman-α forest flux power spectrum responds to the local, long-wavelength quasar overdensity, and equivalently their squeezed-limit bispectrum. We perform a Fisher forecast for the ability of future experiments to constrain local non-Gaussianity using the bispectrum of quasars and the Lyman-α forest. Combining with quasar and Lyman-α forest power spectra to constrain the biases, we find that for DESI the expected 1−σ constraint is err[f_NL] ∼ 60. The ability of DESI to measure f_NL through this channel is limited primarily by the aliasing and instrumental noise of the Lyman-α forest flux power spectrum. The combination of the response approach and separate universe simulations provides a novel technique to explore the constraints from the squeezed-limit bispectrum between different observables.
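The response idea reduces, in its simplest form, to a finite difference between a pair of "separate universe" measurements run with opposite long-wavelength overdensities. The sketch below uses a synthetic exponential toy spectrum, not actual simulation output:

```python
import numpy as np

def log_response(p_plus, p_minus, delta):
    """Finite-difference response d ln P / d delta, estimated from a pair of
    separate-universe measurements of the small-scale power spectrum run
    with long-wavelength overdensities +delta and -delta."""
    return (np.log(p_plus) - np.log(p_minus)) / (2.0 * delta)

# Synthetic check: a spectrum that responds as P(delta) = P0 * exp(R * delta)
# should return R for every wavenumber bin.
p0, r_true, delta = np.ones(8), 2.0, 0.01
resp = log_response(p0 * np.exp(r_true * delta), p0 * np.exp(-r_true * delta), delta)
```

The same two-sided difference applies bin by bin to a measured flux power spectrum; the paired ±delta runs cancel the leading even-order errors of the derivative estimate.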

  16. Stochastic modelling of a single ion channel: an alternating renewal approach with application to limited time resolution.

    Science.gov (United States)

    Milne, R K; Yeo, G F; Edeson, R O; Madsen, B W

    1988-04-22

    Stochastic models of ion channels have been based largely on Markov theory where individual states and transition rates must be specified, and sojourn-time densities for each state are constrained to be exponential. This study presents an approach based on random-sum methods and alternating-renewal theory, allowing individual states to be grouped into classes provided the successive sojourn times in a given class are independent and identically distributed. Under these conditions Markov models form a special case. The utility of the approach is illustrated by considering the effects of limited time resolution (modelled by using a discrete detection limit, xi) on the properties of observable events, with emphasis on the observed open-time (xi-open-time). The cumulants and Laplace transform for a xi-open-time are derived for a range of Markov and non-Markov models; several useful approximations to the xi-open-time density function are presented. Numerical studies show that the effects of limited time resolution can be extreme, and also highlight the relative importance of the various model parameters. The theory could form a basis for future inferential studies in which parameter estimation takes account of limited time resolution in single channel records. Appendixes include relevant results concerning random sums and a discussion of the role of exponential distributions in Markov models.
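A minimal simulation of the effect of a discrete detection limit ξ on observed open times, assuming a simple two-state channel; the merging rule below is a simplified reading of the ξ-open-time construction, and all rates are hypothetical:

```python
import random
from statistics import fmean

def sojourns(mean_open, mean_closed, n, seed=0):
    """Alternating exponential open/closed sojourn times of a two-state channel."""
    rng = random.Random(seed)
    opens = [rng.expovariate(1.0 / mean_open) for _ in range(n)]
    gaps = [rng.expovariate(1.0 / mean_closed) for _ in range(n - 1)]
    return opens, gaps

def xi_open_times(opens, gaps, xi):
    """Apply a detection limit xi: closed gaps shorter than xi go unresolved,
    concatenating the flanking openings into one apparent (xi-)open-time,
    and apparent openings shorter than xi are missed entirely."""
    merged, current = [], opens[0]
    for gap, nxt in zip(gaps, opens[1:]):
        if gap < xi:
            current += gap + nxt   # unresolved gap: one apparent opening
        else:
            merged.append(current)
            current = nxt
    merged.append(current)
    return [t for t in merged if t >= xi]

opens, gaps = sojourns(1.0, 1.0, 20000)
biased = fmean(xi_open_times(opens, gaps, 0.2))  # exceeds the true mean open time
```

Both effects (concatenation across missed gaps and censoring of short openings) bias the observed mean open time upward, which is the kind of distortion the paper's cumulant and Laplace-transform results quantify exactly.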

  17. DISSOLVED CONCENTRATION LIMITS OF RADIOACTIVE ELEMENTS

    Energy Technology Data Exchange (ETDEWEB)

    P. Bernot

    2005-07-13

    The purpose of this study is to evaluate dissolved concentration limits (also referred to as solubility limits) of elements with radioactive isotopes under probable repository conditions, based on geochemical modeling calculations using geochemical modeling tools, thermodynamic databases, field measurements, and laboratory experiments. The scope of this activity is to predict dissolved concentrations or solubility limits for elements with radioactive isotopes (actinium, americium, carbon, cesium, iodine, lead, neptunium, plutonium, protactinium, radium, strontium, technetium, thorium, and uranium) relevant to calculated dose. Model outputs for uranium, plutonium, neptunium, thorium, americium, and protactinium are provided in the form of tabulated functions with pH and log fCO₂ as independent variables, plus one or more uncertainty terms. The solubility limits for the remaining elements are either in the form of distributions or single values. Even though selection of an appropriate set of radionuclides documented in Radionuclide Screening (BSC 2002 [DIRS 160059]) includes actinium, transport of Ac is not modeled in the total system performance assessment for the license application (TSPA-LA) model because of its extremely short half-life. Actinium dose is calculated in the TSPA-LA by assuming secular equilibrium with ²³¹Pa (Section 6.10); therefore, Ac is not analyzed in this report. The output data from this report are fundamental inputs for TSPA-LA used to determine the estimated release of these elements from waste packages and the engineered barrier system. Consistent modeling approaches and environmental conditions were used to develop solubility models for the actinides discussed in this report. These models cover broad ranges of environmental conditions so they are applicable to both waste packages and the invert. Uncertainties from thermodynamic data, water chemistry, temperature variation, and activity coefficients have been quantified or
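A tabulated function of pH and log fCO₂ plus an uncertainty term might be evaluated as in the sketch below; the polynomial form and every coefficient are hypothetical stand-ins, not values from the report:

```python
def solubility(ph, log_fco2, coeffs, eps=0.0):
    """Tabulated-function form of a solubility model: log10 of the dissolved
    concentration as a polynomial in pH and log fCO2, shifted by an
    uncertainty term eps (sampled in a performance-assessment run)."""
    a0, a1, a2, a3 = coeffs
    return 10.0 ** (a0 + a1 * ph + a2 * log_fco2 + a3 * ph * log_fco2 + eps)

# Hypothetical coefficients for one element; eps = 0 gives the central value.
c = solubility(7.0, -3.0, (-6.0, 0.5, 0.2, 0.0))
```

Because the model lives in log10 space, a one-unit uncertainty term scales the dissolved concentration by exactly a factor of ten, which is how such uncertainty terms are typically propagated.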

  18. DISSOLVED CONCENTRATION LIMITS OF RADIOACTIVE ELEMENTS

    International Nuclear Information System (INIS)

    P. Bernot

    2005-01-01

    The purpose of this study is to evaluate dissolved concentration limits (also referred to as solubility limits) of elements with radioactive isotopes under probable repository conditions, based on geochemical modeling calculations using geochemical modeling tools, thermodynamic databases, field measurements, and laboratory experiments. The scope of this activity is to predict dissolved concentrations or solubility limits for elements with radioactive isotopes (actinium, americium, carbon, cesium, iodine, lead, neptunium, plutonium, protactinium, radium, strontium, technetium, thorium, and uranium) relevant to calculated dose. Model outputs for uranium, plutonium, neptunium, thorium, americium, and protactinium are provided in the form of tabulated functions with pH and log fCO₂ as independent variables, plus one or more uncertainty terms. The solubility limits for the remaining elements are either in the form of distributions or single values. Even though selection of an appropriate set of radionuclides documented in Radionuclide Screening (BSC 2002 [DIRS 160059]) includes actinium, transport of Ac is not modeled in the total system performance assessment for the license application (TSPA-LA) model because of its extremely short half-life. Actinium dose is calculated in the TSPA-LA by assuming secular equilibrium with ²³¹Pa (Section 6.10); therefore, Ac is not analyzed in this report. The output data from this report are fundamental inputs for TSPA-LA used to determine the estimated release of these elements from waste packages and the engineered barrier system. Consistent modeling approaches and environmental conditions were used to develop solubility models for the actinides discussed in this report. These models cover broad ranges of environmental conditions so they are applicable to both waste packages and the invert. Uncertainties from thermodynamic data, water chemistry, temperature variation, and activity coefficients have been quantified or otherwise

  19. Fundamentals of piping design

    CERN Document Server

    Smith, Peter

    2013-01-01

    Written for the piping engineer and designer in the field, this two-part series helps to fill a void in piping literature, since the Rip Weaver books of the '90s were taken out of print at the advent of the Computer Aided Design (CAD) era. Technology may have changed; however, the fundamentals of piping rules still apply in the digital representation of process piping systems. The Fundamentals of Piping Design is an introduction to the design of piping systems, various processes and the layout of pipe work connecting the major items of equipment for the new hire, the engineering student and the vetera

  20. Safety and design limits

    International Nuclear Information System (INIS)

    Shishkov, L. K.; Gorbaev, V. A.; Tsyganov, S. V.

    2007-01-01

    The paper touches upon the issues of ensuring NPP safety at the stage of fuel load design and operation by applying special limitations to a series of parameters, that is, design limits. Two approaches are compared: the one used by Western specialists for PWR reactors and the Russian approach employed for WWER reactors. The closeness of the approaches is established; the differences noted are mainly peculiarities of terminology. (Authors)

  1. Maximal Predictability Approach for Identifying the Right Descriptors for Electrocatalytic Reactions.

    Science.gov (United States)

    Krishnamurthy, Dilip; Sumaria, Vaidish; Viswanathan, Venkatasubramanian

    2018-02-01

    Density functional theory (DFT) calculations are being routinely used to identify new material candidates that approach activity near fundamental limits imposed by thermodynamics or scaling relations. DFT calculations are associated with inherent uncertainty, which limits the ability to delineate materials (distinguishability) that possess high activity. Development of error-estimation capabilities in DFT has enabled uncertainty propagation through activity-prediction models. In this work, we demonstrate an approach to propagating uncertainty through thermodynamic activity models leading to a probability distribution of the computed activity and thereby its expectation value. A new metric, prediction efficiency, is defined, which provides a quantitative measure of the ability to distinguish activity of materials and can be used to identify the optimal descriptor(s) ΔG_opt. We demonstrate the framework for four important electrochemical reactions: hydrogen evolution, chlorine evolution, oxygen reduction and oxygen evolution. Future studies could utilize expected activity and prediction efficiency to significantly improve the prediction accuracy of highly active material candidates.
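The uncertainty-propagation step can be sketched by Monte Carlo sampling of the descriptor. The linear volcano relation, the Gaussian error model, and all numbers below are hypothetical stand-ins for the paper's thermodynamic activity models:

```python
import random
from statistics import fmean

def activity(dg, dg_opt=0.0, slope=1.0):
    """Toy volcano relation: activity falls off linearly with |dG - dG_opt|."""
    return -slope * abs(dg - dg_opt)

def expected_activity(dg_mean, dg_sd, n=50000, seed=0):
    """Propagate a Gaussian uncertainty on the computed descriptor dG through
    the activity model by Monte Carlo, returning the expectation value of the
    resulting activity distribution."""
    rng = random.Random(seed)
    return fmean(activity(rng.gauss(dg_mean, dg_sd)) for _ in range(n))

# A material sitting exactly at the volcano peak, with a 0.2 (arbitrary
# units) DFT error estimate on its descriptor.
ea = expected_activity(0.0, 0.2)
```

Note that the expectation value lies below the noise-free peak activity: near a volcano maximum, descriptor uncertainty can only lower the expected activity, which is one reason the expectation value rather than the point estimate is the useful ranking quantity.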

  2. The Model of Games to Develop Fundamental Movement of Kindergarten Students

    Directory of Open Access Journals (Sweden)

    Kristanto Adi Nugroho

    2018-05-01

    The study aimed to develop a game model to optimize the achievement of fundamental movement skills of kindergarten students. The results of the research were expected to be a credible and standardized reference for teachers in teaching. The research was conducted using the research and development method, which was divided into two stages, namely the pre-development stage and the development stage. The pre-development stage consisted of literature review, relevant research and preliminary studies. The development stage consisted of drafting, expert validation, limited-scale trials, large-scale trials, and operational trials. Expert validation involved two experts using focus group discussion (FGD) techniques. The limited-scale and extensive tests were conducted to examine the aspects of substantive content, and the implementation of the model was found qualitatively suitable for use in the kindergarten. There were 10 children as research subjects in the limited-scale test and 24 children in the large-scale test. In the operational test, using an experimental method, there were 47 children. The instruments used for data collection in the pre-development stage were interview guides and field notes, while in the development stage the researchers used questionnaires and the Fundamental Motor Pattern Assessment Instrument to measure the level of movement skills of the children. The data analysis techniques used were qualitative and quantitative (statistical) analysis. The development of the game model resulted in ten game models, namely: (1) the flying bird game; (2) the ball relay game; (3) the ball kicking game; (4) the balloon tapping game; (5) the seeking and jumping game; (6) the arranging letters game; (7) the sticking pictures game; (8) the composing names game; (9) the frog counting game; and (10) the numbers adventure game. Based on the content validator's assessment, the content of materials scored 86 points (82%), which was in the very good category, the

  3. A risk modelling approach for setting microbiological limits using enterococci as indicator for growth potential of Salmonella in pork

    DEFF Research Database (Denmark)

    Bollerslev, Anne Mette; Nauta, Maarten; Hansen, Tina Beck

    2017-01-01

    Microbiological limits are widely used in food processing as an aid to reduce the exposure to hazardous microorganisms for the consumers. However, in pork, the prevalence and concentrations of Salmonella are generally low and microbiological limits are not considered an efficient tool to support...... for this purpose includes the dose-response relationship for Salmonella and a reduction factor to account for preparation of the fresh pork. By use of the risk model, it was estimated that the majority of salmonellosis cases, caused by the consumption of pork in Denmark, is caused by the small fraction of pork...... products that has enterococci concentrations above 5 log CFU/g. This illustrates that our approach can be used to evaluate the potential effect of different microbiological limits and therefore, the perspective of this novel approach is that it can be used for definition of a risk-based microbiological...
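A dose-response calculation of the kind such a risk model builds on might look like the following sketch; the exponential dose-response form and every parameter value (serving size, preparation log-reduction, the single-hit probability r) are illustrative assumptions, not the paper's fitted values:

```python
import math

def p_illness(log10_conc, serving_g=100.0, log10_reduction=3.0, r=1e-4):
    """Exponential dose-response sketch: the Salmonella dose (CFU) surviving
    a log10_reduction during preparation of one serving is mapped to a
    probability of illness, P = 1 - exp(-r * dose)."""
    dose = serving_g * 10.0 ** (log10_conc - log10_reduction)
    return 1.0 - math.exp(-r * dose)

# Contrast a heavily contaminated product with a typical one (log10 CFU/g).
high, low = p_illness(5.0), p_illness(2.0)
```

The steep nonlinearity is the point: under such a model a small fraction of highly contaminated products can dominate the total number of cases, which is the mechanism behind setting a concentration-based limit.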

  4. Thermodynamic fluctuations within the Gibbs and Einstein approaches

    International Nuclear Information System (INIS)

    Rudoi, Yurii G; Sukhanov, Alexander D

    2000-01-01

    A comparative analysis of the descriptions of fluctuations in statistical mechanics (the Gibbs approach) and in statistical thermodynamics (the Einstein approach) is given. On this basis solutions are obtained for the Gibbs and Einstein problems that arise in pressure fluctuation calculations for a spatially limited equilibrium (or slightly nonequilibrium) macroscopic system. A modern formulation of the Gibbs approach which allows one to calculate equilibrium pressure fluctuations without making any additional assumptions is presented; to this end the generalized Bogolyubov - Zubarev and Hellmann - Feynman theorems are proved for the classical and quantum descriptions of a macrosystem. A statistical version of the Einstein approach is developed which shows a fundamental difference in pressure fluctuation results obtained within the context of two approaches. Both the 'genetic' relation between the Gibbs and Einstein approaches and the conceptual distinction between their physical grounds are demonstrated. To illustrate the results, which are valid for any thermodynamic system, an ideal nondegenerate gas of microparticles is considered, both classically and quantum mechanically. Based on the results obtained, the correspondence between the micro- and macroscopic descriptions is considered and the prospects of statistical thermodynamics are discussed. (reviews of topical problems)

  5. Studying fundamental physics using quantum enabled technologies with trapped molecular ions

    Science.gov (United States)

    Segal, D. M.; Lorent, V.; Dubessy, R.; Darquié, B.

    2018-03-01

    The text below was written during two visits that Daniel Segal made at Université Paris 13. Danny stayed at Laboratoire de Physique des Lasers the summers of 2008 and 2009 to participate in the exploration of a novel lead in the field of ultra-high resolution spectroscopy. Our idea was to probe trapped molecular ions using Quantum Logic Spectroscopy (QLS) in order to advance our understanding of a variety of fundamental processes in nature. At that time, QLS, a ground-breaking spectroscopic technique, had only been demonstrated with atomic ions. Our ultimate goals were new approaches to the observation of parity violation in chiral molecules and tests of time variations of the fundamental constants. This text is the original research proposal written eight years ago. We have added a series of notes to revisit it in the light of what has been since realized in the field.

  6. Analysis of enamel development using murine model systems: approaches and limitations.

    Directory of Open Access Journals (Sweden)

    Megan K Pugach

    2014-09-01

    A primary goal of enamel research is to understand and potentially treat or prevent enamel defects related to amelogenesis imperfecta (AI). Rodents are ideal models to assist our understanding of how enamel is formed because they are easily genetically modified, and their continuously erupting incisors display all stages of enamel development and mineralization. While numerous methods have been developed to generate and analyze genetically modified rodent enamel, it is crucial to understand the limitations and challenges associated with these methods in order to draw appropriate conclusions that can be applied translationally, to AI patient care. We have highlighted methods involved in generating and analyzing rodent enamel and potential approaches to overcoming limitations of these methods: (1) generating transgenic, knockout and knockin mouse models, and (2) analyzing rodent enamel mineral density and functional properties (structure, mechanics) of mature enamel. There is a need for a standardized workflow to analyze enamel phenotypes in rodent models so that investigators can compare data from different studies. These methods include analyses of gene and protein expression, developing enamel histology, enamel pigment, degree of mineralization, enamel structure and mechanical properties. Standardization of these methods with regard to stage of enamel development and sample preparation is crucial, and ideally investigators can use correlative and complementary techniques with the understanding that developing mouse enamel is dynamic and complex.

  7. Digital holography of particles: benefits of the 'inverse problem' approach

    International Nuclear Information System (INIS)

    Gire, J; Denis, L; Fournier, C; Soulez, F; Ducottet, C; Thiébaut, E

    2008-01-01

    The potential of in-line digital holography to locate and measure the size of particles distributed throughout a volume (in one shot) has been established. These measurements are fundamental for the study of particle trajectories in fluid flow. The most important issues in digital holography today are poor depth positioning accuracy, transverse field-of-view limitations, border artifacts and computational burdens. We recently suggested an 'inverse problem' approach to address some of these issues for the processing of particle digital holograms. The described algorithm improves axial positioning accuracy, gives particle diameters with sub-micrometer accuracy, eliminates border effects and increases the size of the studied volume. This approach for processing particle holograms pushes back some classical constraints. For example, the Nyquist criterion is no longer a restriction for the recording step and the studied volume is no longer confined to the field of view delimited by the sensor borders. In this paper we present a review of the limitations commonly found in digital holography. We then discuss the benefits of the 'inverse problem' approach and the influence of some experimental parameters in this framework
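The contrast with classical reconstruction can be sketched in one dimension: instead of back-propagating the recorded hologram, fit a forward model to the data by least squares, so candidate positions are free to lie outside the sensor's field of view. The Gaussian "diffraction pattern" below is a stand-in for a real hologram model:

```python
import numpy as np

def pattern(grid, center, width=1.0):
    """Stand-in 1-D model of a particle's diffraction pattern on the sensor."""
    return np.exp(-((grid - center) / width) ** 2)

def locate(data, grid, candidates):
    """Inverse-problem detection: pick the candidate position whose forward
    model best explains the recorded data in the least-squares sense.
    Candidates may extend beyond the sensor borders."""
    residuals = [np.sum((data - pattern(grid, c)) ** 2) for c in candidates]
    return candidates[int(np.argmin(residuals))]

grid = np.linspace(0.0, 10.0, 101)            # the sensor spans [0, 10]
found = locate(pattern(grid, 4.3), grid,
               np.linspace(-2.0, 12.0, 281))  # search beyond the borders
```

Because the estimate comes from model fitting rather than from a discrete back-propagated image, its accuracy is set by the noise and the model, not by the sampling criterion, which is the "pushed back constraints" point the paper makes.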

  8. Planck intermediate results. XXIV. Constraints on variation of fundamental constants

    CERN Document Server

    Ade, P A R; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Battaner, E.; Benabed, K.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Burigana, C.; Butler, R.C.; Calabrese, E.; Chamballu, A.; Chiang, H.C.; Christensen, P.R.; Clements, D.L.; Colombo, L.P.L.; Couchot, F.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Diego, J.M.; Dole, H.; Dore, O.; Dupac, X.; Ensslin, T.A.; Eriksen, H.K.; Fabre, O.; Finelli, F.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gregorio, A.; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.L.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Jaffe, A.H.; Jones, W.C.; Keihanen, E.; Keskitalo, R.; Kneissl, R.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lamarre, J.M.; Lasenby, A.; Lawrence, C.R.; Leonardi, R.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Mandolesi, N.; Maris, M.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Mazzotta, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Moss, A.; Munshi, D.; Murphy, J.A.; Naselsky, P.; Nati, F.; Natoli, P.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C.A.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Pratt, G.W.; Prunet, S.; Rachen, J.P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Ristorcelli, I.; 
Rocha, G.; Roudier, G.; Rusholme, B.; Sandri, M.; Savini, G.; Scott, D.; Spencer, L.D.; Stolyarov, V.; Sudiwala, R.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Uzan, J.P.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L.A.; Yvon, D.; Zacchei, A.; Zonca, A.

    2015-01-01

    Any variation of the fundamental physical constants, and more particularly of the fine structure constant, $\alpha$, or of the mass of the electron, $m_e$, would affect the recombination history of the Universe and cause an imprint on the cosmic microwave background angular power spectra. We show that the Planck data allow one to improve the constraint on the time variation of the fine structure constant at redshift $z\sim 10^3$ by about a factor of 5 compared to WMAP data, as well as to break the degeneracy with the Hubble constant, $H_0$. In addition to $\alpha$, we can set a constraint on the variation of the mass of the electron, $m_{\rm e}$, and on the simultaneous variation of the two constants. We examine in detail the degeneracies between fundamental constants and the cosmological parameters, in order to compare the limits obtained from Planck and WMAP and to determine the constraining power gained by including other cosmological probes. We conclude that independent time variations of the fine structu...

  9. An approach to fundamental study of beam loss minimization

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1999-01-01

The accelerator design rules involving rms matching, developed at CERN in the 1970s, are discussed. An additional rule, equipartitioning the beam energy among its degrees of freedom, may be added to ensure an rms equilibrium condition. If the strong stochasticity threshold is avoided, as it is in realistic accelerator designs, the dynamics is characterized by extremely long transient settling times, making the role of equipartitioning hard to explain. An approach to systematic study using the RFQ accelerator as a simulation testbed is discussed. New methods are available from recent advances in research on complexity, nonlinear dynamics, and chaos

  10. A fundamental parameter-based calibration model for an intrinsic germanium X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Christensen, L.H.; Pind, N.

    1982-01-01

    A matrix-independent fundamental parameter-based calibration model for an energy-dispersive X-ray fluorescence spectrometer has been developed. This model, which is part of a fundamental parameter approach quantification method, accounts for both the excitation and detection probability. For each secondary target a number of relative calibration constants are calculated on the basis of knowledge of the irradiation geometry, the detector specifications, and tabulated fundamental physical parameters. The absolute calibration of the spectrometer is performed by measuring one pure element standard per secondary target. For sample systems where all elements can be analyzed by means of the same secondary target the absolute calibration constant can be determined during the iterative solution of the basic equation. Calculated and experimentally determined relative calibration constants agree to within 5-10% of each other and so do the results obtained from the analysis of an NBS certified alloy using the two sets of constants. (orig.)
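The iterative solution of the basic equation mentioned above can be sketched numerically. The loop below is a toy fundamental-parameter quantification: the intensities, calibration constants, and the `matrix_correction` model are all hypothetical stand-ins for the tabulated fundamental parameters and geometry factors a real FP code would use.

```python
# Toy fundamental-parameter iteration. Measured intensities are modeled as
# I_i = K_i * c_i * M_i(c), where K_i is the relative calibration constant
# and M_i(c) is a matrix-correction factor that depends on the (unknown)
# concentrations c. All numbers here are hypothetical.

def matrix_correction(conc):
    # Stand-in for the true absorption/enhancement correction, which a real
    # FP code computes from tabulated fundamental parameters.
    mu = {"Fe": 1.0, "Ni": 1.3, "Cr": 0.8}          # toy attenuation terms
    total = sum(mu[e] * c for e, c in conc.items())
    return {e: 1.0 / (1.0 + 0.5 * mu[e] * total) for e in conc}

def quantify(intensities, K, tol=1e-9, max_iter=100):
    # Start from concentrations proportional to I_i / K_i, then iterate
    # c_i = I_i / (K_i * M_i(c)) and renormalize until self-consistent.
    conc = {e: intensities[e] / K[e] for e in intensities}
    s = sum(conc.values())
    conc = {e: c / s for e, c in conc.items()}
    for _ in range(max_iter):
        M = matrix_correction(conc)
        new = {e: intensities[e] / (K[e] * M[e]) for e in conc}
        s = sum(new.values())
        new = {e: c / s for e, c in new.items()}
        if max(abs(new[e] - conc[e]) for e in conc) < tol:
            return new
        conc = new
    return conc

I = {"Fe": 0.70, "Ni": 0.20, "Cr": 0.10}   # hypothetical measured intensities
K = {"Fe": 1.00, "Ni": 1.10, "Cr": 0.95}   # hypothetical calibration constants
conc = quantify(I, K)
print({e: round(c, 3) for e, c in conc.items()})
```

The fixed-point form mirrors the record's description: relative constants K are fixed per secondary target, and the concentrations are refined until the matrix correction is consistent with the result.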

  11. Proposed experiment to test fundamentally binary theories

    Science.gov (United States)

    Kleinmann, Matthias; Vértesi, Tamás; Cabello, Adán

    2017-09-01

Fundamentally binary theories are nonsignaling theories in which measurements of many outcomes are constructed by selecting from binary measurements. They constitute a sensible alternative to quantum theory and have never been directly falsified by any experiment. Here we show that fundamentally binary theories are experimentally testable with current technology. For that, we identify a feasible Bell-type experiment on pairs of entangled qutrits. In addition, we prove that, for any n, quantum n-ary correlations are not fundamentally (n-1)-ary. For that, we introduce a family of inequalities that hold for fundamentally (n-1)-ary theories but are violated by quantum n-ary correlations.

  12. The Concept of Ideology in Analysis of Fundamental Questions in Science Education

    Science.gov (United States)

    Säther, Jostein

The use of the concept of `ideology' in the interpretation of science education curricula, textbooks and various practises is reviewed, and examples are given by referring to Norwegian curricula and textbooks. The term is proposed to be used in a broad sense for any kind of action-oriented theory based on a system of ideas, or any attempt to approach politics in the light of a system of ideas. Politics in this context concerns the shaping of education, and is related to forces (i.e., the hypothetical impacts of idea systems) which may legitimise, change, or criticise social practices. The focus is (although not in every case) on the hidden, unconscious and critical aspects. The notion of ideological aspects is proposed to be related to metaphysical-ontological, epistemological and axiological claims and connotations. Examples of educational issues concerning e.g., aims, compartmentalisation, integration, and fundamentally different ideas about truth, learning and man are mentioned. Searching for a single, unifying concept for discussing all of science education's fundamental questions seems, however, to be in vain. Therefore a wide range of concepts seems necessary to deepen our understanding of ``the fundamental questions''.

  13. A Life-cycle Approach to Improve the Sustainability of Rural Water Systems in Resource-Limited Countries

    Directory of Open Access Journals (Sweden)

    Nicholas Stacey

    2012-11-01

Full Text Available A WHO and UNICEF joint report states that in 2008, 884 million people lacked access to potable drinking water. A life-cycle approach to developing potable water systems may improve the sustainability of such systems; however, a review of the literature shows that such an approach has primarily been used for urban systems located in resourced countries. Although urbanization is increasing globally, over 40 percent of the world’s population is currently rural, with many considered poor. In this paper, we present a first step towards using life-cycle assessment to develop sustainable rural water systems in resource-limited countries while pointing out the needs. For example, while there are few differences in costs and environmental impacts among many improved rural water system options, a system that uses groundwater with community standpipes is substantially lower in cost than other alternatives, with a somewhat lower environmental inventory. However, an LCA approach shows that, from institutional as well as community and managerial perspectives, sustainability includes many other factors besides cost and environment that are a function of the interdependent decision process used across the life cycle of a water system by aid organizations, water user committees, and household users. These factors often present the biggest challenge to designing sustainable rural water systems for resource-limited countries.

  14. Fundamentals of electronic image processing

    CERN Document Server

    Weeks, Arthur R

    1996-01-01

    This book is directed to practicing engineers and scientists who need to understand the fundamentals of image processing theory and algorithms to perform their technical tasks. It is intended to fill the gap between existing high-level texts dedicated to specialists in the field and the need for a more practical, fundamental text on image processing. A variety of example images are used to enhance reader understanding of how particular image processing algorithms work.

  15. Qualitative insights on fundamental mechanics

    OpenAIRE

    Mardari, G. N.

    2002-01-01

    The gap between classical mechanics and quantum mechanics has an important interpretive implication: the Universe must have an irreducible fundamental level, which determines the properties of matter at higher levels of organization. We show that the main parameters of any fundamental model must be theory-independent. They cannot be predicted, because they cannot have internal causes. However, it is possible to describe them in the language of classical mechanics. We invoke philosophical reas...

  16. Homeschooling and religious fundamentalism

    Directory of Open Access Journals (Sweden)

    Robert Kunzman

    2010-10-01

    Full Text Available This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to contemporary culture; suspicion of institutional authority and professional expertise; parental control and centrality of the family; and interweaving of faith and academics. It is important to recognize, however, that fundamentalism exists on a continuum; conservative religious homeschoolers resist liberal democratic values to varying degrees, and efforts to foster dialogue and accommodation with religious homeschoolers can ultimately help strengthen the broader civic fabric.

  17. Fundamental Perspectives on Supply Chain Management

    NARCIS (Netherlands)

    Omta, S.W.F.; Hoenen, S.J.

    2012-01-01

    The aim of the present literature study is to find the fundamental perspectives/models in the realm of supply chain management and to investigate whether they can be extended based on recent literature findings. The fundamental perspectives were found using a two-tier snowball collection method,

  18. EU criminal law and fundamental rights

    NARCIS (Netherlands)

    de Hert, Paul; Mitsilegas, V.; Bergström, M.; Konstadinides, Th.

    2016-01-01

    The chapter first offers a background analysis to EU fundamental rights law, recalling the historical affirmation of the protection of fundamental rights as a EU concern, and the important innovation brought about by the Lisbon Treaty (section 2) and the multiplicity of actors involved in the system

  19. Fundamental symmetries and interactions-selected topics

    NARCIS (Netherlands)

    Jungmann, Klaus P.

    2015-01-01

    In the field of fundamental interactions and symmetries numerous experiments are underway or planned in order to verify the standard model in particle physics, to search for possible extensions to it or to exploit the standard model for extracting most precise values for fundamental constants. We

  20. Fundamental partial compositeness

    DEFF Research Database (Denmark)

    Sannino, Francesco; Strumia, Alessandro; Tesi, Andrea

    2016-01-01

    We construct renormalizable Standard Model extensions, valid up to the Planck scale, that give a composite Higgs from a new fundamental strong force acting on fermions and scalars. Yukawa interactions of these particles with Standard Model fermions realize the partial compositeness scenario. Unde...

  1. Inadequate environment, resources and values lead to missed nursing care: A focused ethnographic study on the surgical ward using the Fundamentals of Care framework.

    Science.gov (United States)

    Jangland, Eva; Teodorsson, Therese; Molander, Karin; Muntlin Athlin, Åsa

    2018-06-01

To explore the delivery of care from the perspective of patients with acute abdominal pain, focusing on the contextual factors at system level using the Fundamentals of Care framework. The Fundamentals of Care framework describes several contextual and systemic factors that can impact the delivery of care. To deliver high-quality, person-centred care, it is important to understand how these factors affect patients' experiences and care needs. A focused ethnographic approach. A total of 20 observations were performed on two surgical wards at a Swedish university hospital. Data were collected using participant observation and informal interviews and analysed using deductive content analysis. The findings, presented in four categories, reflect the value patients place on the caring relationship and a friendly atmosphere on the ward. Patients had concerns about the environment, particularly the high-tempo culture on the ward and its impact on their integrity, rest and sleep, access to information and planning, and need for support in addressing their existential thoughts. The observers also noted that missed nursing care had serious consequences for patient safety. Patients with acute abdominal pain were cared for in the high-tempo culture of a surgical ward with limited resources, unclear leadership and challenges to patient safety. The findings highlight the crucial importance of prioritising and valuing patients' fundamental care needs for recovery. Nursing leaders and nurses need to take the lead in reconceptualising the value of fundamental care in the acute care setting. To improve clinical practice, the value of the fundamentals of care must be addressed regardless of the patient's clinical condition. Providing a caring relationship is paramount to ensuring a positive impact on the patient's well-being and recovery. © 2017 John Wiley & Sons Ltd.

  2. Fundamentals of successful monitoring, reporting, and verification under a cap-and-trade program

    Energy Technology Data Exchange (ETDEWEB)

    John Schakenbach; Robert Vollaro; Reynaldo Forte [U.S. Environmental Protection Agency, Office of Atmospheric Programs, Washington, DC (United States)

    2006-11-15

The U.S. Environmental Protection Agency (EPA) developed and implemented the Acid Rain Program (ARP) and the NOx Budget Trading Program (NBTP) using several fundamental monitoring, reporting, and verification (MRV) elements: (1) compliance assurance through incentives and automatic penalties; (2) strong quality assurance (QA); (3) a collaborative approach with a petition process; (4) standardized electronic reporting; (5) compliance flexibility for low-emitting sources; (6) a complete emissions data record requirement; (7) centralized administration; (8) a level playing field; (9) publicly available data; (10) a performance-based approach; and (11) reduced conflicts of interest. Each of these elements is discussed in the context of the authors' experience under two U.S. cap-and-trade programs and their potential application to other cap-and-trade programs. The U.S. Office of Management and Budget found that the Acid Rain Program has accounted for the largest quantified human health benefits of any federal regulatory program implemented in the last 10 yr, with annual benefits exceeding costs by > 40 to 1. The authors believe that the elements described in this paper greatly contributed to this success. EPA has used the ARP fundamental elements as a model for other cap-and-trade programs, including the NBTP, which went into effect in 2003, and the recently published Clean Air Interstate Rule and Clean Air Mercury Rule. The authors believe that using these fundamental elements to develop and implement the MRV portion of their cap-and-trade programs has resulted in public confidence in the programs, highly accurate and complete emissions data, and a high compliance rate. 2 refs.

  3. New approach to the theory of coupled πNN-NN system. III. A three-body limit

    International Nuclear Information System (INIS)

    Avishai, Y.; Mizutani, T.

    1980-01-01

In the limit where the pion is restricted to be emitted only by the nucleon that first absorbed it, it is shown that the equations previously developed to describe the coupled πNN (πd)-NN system reduce to conventional three-body equations. Specifically, it is found that in this limit the input πN P11 amplitude, when put on-shell, is directly related to the experimental phase shift, contrary to the original equations where the direct (dressed) nucleon pole term and the non-pole part of this partial wave enter separately. The present study clarifies the limitations of a pure three-body approach to the πNN-NN problems and suggests a rare opportunity of observing a possible resonance behavior in the non-pole part of the πN P11 amplitude through πd experiments

  4. Fundamental flavours, fields and fixed points: a brief account

    Energy Technology Data Exchange (ETDEWEB)

    Kundu, Arnab [Theory Division, Saha Institute of Nuclear Physics,1/AF Bidhannagar, Kolkata 700064 (India); Homi Bhaba National Institute, Training School Complex,Anushakti Nagar, Mumbai 400085 (India); Kundu, Nilay [Center for Gravitational Physics, Yukawa Institute for Theoretical Physics (YITP),Kyoto University,Kyoto 606-8502 (Japan)

    2017-03-13

In this article we report on a preliminary study, via holography, of infrared fixed points in a putative strongly coupled SU(N_c) gauge theory, with N_f fundamental matter, in the presence of additional fields in the fundamental sector, e.g. density or a magnetic field. In an inherently effective, bottom-up approach, we work with a simple system: Einstein gravity with a negative cosmological constant, coupled to Dirac-Born-Infeld (DBI) matter. We obtain a class of exact solutions, dual to candidate ground states in the infrared (IR), with a scaling ansatz for various fields. These solutions are of two kinds: AdS_m × ℝ^n-type, which has appeared in the literature before; and AdS_m × EAdS_n-type, where m and n are suitable integers. Both classes of solutions are non-perturbative in back-reaction. The AdS_m × EAdS_n-type contains examples of Bianchi type-V solutions. We also construct explicit numerical flows from an AdS_5 ultraviolet to both an AdS_2 and an AdS_3 IR.

  5. Optofluidic bioanalysis: fundamentals and applications

    Directory of Open Access Journals (Sweden)

    Ozcelik Damla

    2017-03-01

Full Text Available Over the past decade, optofluidics has established itself as a new and dynamic research field for exciting developments at the interface of photonics, microfluidics, and the life sciences. The strong desire for developing miniaturized bioanalytic devices and instruments, in particular, has led to novel and powerful approaches to integrating optical elements and biological fluids on the same chip-scale system. Here, we review the state-of-the-art in optofluidic research with emphasis on applications in bioanalysis and a focus on waveguide-based approaches that represent the most advanced level of integration between optics and fluidics. We discuss recent work in photonically reconfigurable devices and various application areas. We show how optofluidic approaches have been pushing the performance limits in bioanalysis, e.g. in terms of sensitivity and portability, satisfying many of the key requirements for point-of-care devices. This illustrates how the requirements for bioanalysis instruments are increasingly being met by the symbiotic integration of novel photonic capabilities in a miniaturized system.

  6. Effective Exchange Rates in Central and Eastern European Countries: Cyclicality and Relationship with Macroeconomic Fundamentals

    Directory of Open Access Journals (Sweden)

    Stavárek Daniel

    2015-06-01

    Full Text Available This paper examines the evolution of effective exchange rates in nine Central and Eastern European countries in terms of development trends, volatility and cyclicality. Consequently, it provides direct empirical evidence on the nature of the relationship between effective exchange rates and selected macroeconomic fundamentals, addressing a key precondition of numerous exchange rate determination models and theories that attempt to explain the role of exchange rates in the economy. The results suggest that flexible exchange rate arrangements are reflected in both nominal and real effective exchange rates having higher volatility and variability. Furthermore, the results provide mixed evidence in terms of intensity, direction and cyclicality, but show a weak correlation between exchange rates and fundamentals. Sufficiently high coefficients are found only for money supply. Consequently, using fundamentals for the determination of exchange rates and using the exchange rate to explain economic development may be of limited use for the countries analyzed.

  7. Fundamental physics in particle traps

    International Nuclear Information System (INIS)

    Quint, Wolfgang; Vogel, Manuel

    2014-01-01

The individual topics are covered by leading experts in their respective fields of research, providing readers with the present theory and experiments in the field; the book is a useful reference for researchers. This volume provides detailed insight into the field of precision spectroscopy and fundamental physics with particles confined in traps. It comprises experiments with electrons and positrons, protons and antiprotons, antimatter and highly charged ions, together with the corresponding theoretical background. Such investigations represent stringent tests of quantum electrodynamics and the Standard Model, antiparticle and antimatter research, and tests of fundamental symmetries and constants and their possible variations with time and space. They are key to various aspects of metrology, such as mass measurements and time standards, and promise further developments in quantum information processing. The reader obtains a valuable source of information suited for beginners and experts with an interest in fundamental studies using particle traps.

  8. General Considerations Regarding the Restrictions, Exemptions and Limitations on the Right of Free Movement of Persons

    Directory of Open Access Journals (Sweden)

    Vasilica Negrut

    2013-08-01

Full Text Available This paper addresses a current problem, not only for legal research, but also for practical activity. Through this study we return to a subject that has been analyzed by other authors as well; however, based on analysis and observation, we highlight certain features of the free movement of persons. In the current context of globalization, the free movement of persons has new nuances. Starting from a historical perspective on this principle, we examine the restrictions, exceptions and limitations on the free movement of persons. From the analysis of European legislation and jurisprudence it follows that the exceptions to the free movement of persons must be interpreted strictly, with the limits and the purpose of the restrictions being consistent with the general principles of European Union law (the non-discrimination principle, proportionality and fundamental rights protection).

  9. Fundamental limitations on 'warp drive' spacetimes

    International Nuclear Information System (INIS)

    Lobo, Francisco S N; Visser, Matt

    2004-01-01

'Warp drive' spacetimes are useful as 'gedanken-experiments' that force us to confront the foundations of general relativity and, among other things, to precisely formulate the notion of 'superluminal' communication. After carefully formulating the Alcubierre and Natario warp drive spacetimes, and verifying their non-perturbative violation of the classical energy conditions, we consider a more modest question and apply linearized gravity to the weak-field warp drive, testing the energy conditions to first and second order in the warp-bubble velocity, v. Since we take the warp-bubble velocity to be non-relativistic, v << c, we are not primarily interested in the 'superluminal' features of the warp drive. Instead we focus on a secondary feature of the warp drive that has not previously been remarked upon: the warp drive (if it could be built) would be an example of a 'reaction-less drive'. For both the Alcubierre and Natario warp drives we find that the occurrence of significant energy condition violations is not just a high-speed effect: the violations persist even at arbitrarily low speeds. A particularly interesting feature of this construction is that it is now meaningful to think of placing a finite mass spaceship at the centre of the warp bubble, and then to see how the energy in the warp field compares with the mass-energy of the spaceship. There is no hope of doing this in Alcubierre's original version of the warp field, since by definition the point at the centre of the warp bubble moves on a geodesic and is 'massless'. That is, in Alcubierre's original formalism and in the Natario formalism the spaceship is always treated as a test particle, while in the linearized theory we can treat the spaceship as a finite mass object. For both the Alcubierre and Natario warp drives we find that even at low speeds the net (negative) energy stored in the warp fields must be a significant fraction of the mass of the spaceship.

  10. An Overview of Metallic Nanowire Networks, Promising Building Blocks for Next Generation Transparent Conductors: Emergence, Fundamentals and Challenges

    Science.gov (United States)

    Pirsalami, Sedigheh; Zebarjad, Seyed Mojtaba; Daneshmanesh, Habib

    2017-08-01

Transparent conductors (TCs) have a wide range of applications in numerous electronic and optoelectronic devices. This review provides an overview of the emergence of metallic nanowire networks (MNNs) as promising building blocks for the next generation of transparent conductors. The fundamental aspects, structure-property relations, fabrication techniques and the corresponding challenges are reviewed. Theoretical and experimental research suggests that nanowires with smaller diameter, longer length and higher aspect ratio have higher performance. Yet the development of an efficient synthesis technique for the production of MNNs has remained a challenge. The synthesis method is also crucial to the scalability and the commercial potential of these emerging TCs. The most promising synthesis techniques, together with their advantages, limitations and recent findings, are discussed here. Finally, we outline promising future research trends in MNNs as an approach to designing the next generation of TCs.

  11. Fundamentals of lead-free solder interconnect technology from microstructures to reliability

    CERN Document Server

    Lee, Tae-Kyu; Kim, Choong-Un; Ma, Hongtao

    2015-01-01

    This unique book provides an up-to-date overview of the fundamental concepts behind lead-free solder and interconnection technology. Readers will find a description of the rapidly increasing presence of electronic systems in all aspects of modern life as well as the increasing need for predictable reliability in electronic systems. The physical and mechanical properties of lead-free solders are examined in detail, and building on fundamental science, the mechanisms responsible for damage and failure evolution, which affect reliability of lead-free solder joints are identified based on microstructure evolution.  The continuing miniaturization of electronic systems will increase the demand on the performance of solder joints, which will require new alloy and processing strategies as well as interconnection design strategies. This book provides a foundation on which improved performance and new design approaches can be based.  In summary, this book:  Provides an up-to-date overview on lead-free soldering tech...

  12. Comparison of fundamental and wideband harmonic contrast imaging of liver tumors.

    Science.gov (United States)

    Forsberg, F; Liu, J B; Chiou, H J; Rawool, N M; Parker, L; Goldberg, B B

    2000-03-01

Wideband harmonic imaging (with phase inversion for improved tissue suppression) was compared to fundamental imaging in vivo. Four woodchucks with naturally occurring liver tumors were injected with Imagent (Alliance Pharmaceutical Corp., San Diego, CA). Randomized combinations of dose (0.05, 0.2 and 0.4 ml/kg) and acoustic output power (AO; 5, 25 and 63%) were studied (Siemens Medical Systems, Issaquah, WA). Tumor vascularity, conspicuity and contrast enhancement were rated by three independent observers. Imagent produced marked tumor enhancement and improved depiction of neovascularity at all dosages and AO settings in both modes. Tumor vascularity and enhancement correlated with mode, dose and AO (P < 0.002). Fundamental imaging produced more enhancement (P < 0.05), but tumor vascularity and conspicuity were best appreciated in harmonic mode (P < 0.05). Under the conditions studied here, the best approach was wideband harmonic imaging with 0.2 ml/kg of Imagent at an AO of 25%.

  13. The Carnot cycle and the teaching of thermodynamics: a historical approach

    Science.gov (United States)

    Laranjeiras, Cássio C.; Portela, Sebastião I. C.

    2016-09-01

The Carnot cycle is a topic that is traditionally present in introductory physics courses dedicated to the teaching of thermodynamics, playing an essential role in introducing the concept of entropy and the consequent formulation of the second law. Its effective understanding, and its contribution to the development of thermodynamics, is often hindered, however. Among other things, this is the result of a pragmatic approach, which usually limits itself to presenting the isotherms and adiabatic curves in a P-V diagram and is totally disconnected from the historical foundations of the theory of heat. The purpose of this paper is to reveal the potential of an approach to the subject that recovers the historical and social dimensions of scientific knowledge, and to promote reflections about the nature of science (NOS).
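The quantitative core that such a pedagogical treatment builds on is compact; a minimal sketch with illustrative reservoir temperatures:

```python
# Carnot efficiency and per-cycle entropy bookkeeping for a reversible
# engine operating between a hot and a cold reservoir (illustrative values).

T_hot, T_cold = 500.0, 300.0      # reservoir temperatures in kelvin
Q_hot = 1000.0                    # heat absorbed from the hot reservoir, J

eta = 1.0 - T_cold / T_hot        # Carnot efficiency, the maximum possible
W = eta * Q_hot                   # work extracted per cycle
Q_cold = Q_hot - W                # heat rejected to the cold reservoir

# For a reversible cycle the entropy given up by the hot reservoir equals
# the entropy absorbed by the cold one, so the total change is zero.
dS_total = -Q_hot / T_hot + Q_cold / T_cold

print(eta, W, dS_total)
```

The vanishing total entropy change is exactly the property that makes the cycle the natural vehicle for introducing entropy and the second law.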

  14. Risk newsboy: approach for addressing uncertainty in developing action levels and cleanup limits

    International Nuclear Information System (INIS)

    Cooke, Roger; MacDonell, Margaret

    2007-01-01

    Site cleanup decisions involve developing action levels and residual limits for key contaminants, to assure health protection during the cleanup period and into the long term. Uncertainty is inherent in the toxicity information used to define these levels, based on incomplete scientific knowledge regarding dose-response relationships across various hazards and exposures at environmentally relevant levels. This problem can be addressed by applying principles used to manage uncertainty in operations research, as illustrated by the newsboy dilemma. Each day a newsboy must balance the risk of buying more papers than he can sell against the risk of not buying enough. Setting action levels and cleanup limits involves a similar concept of balancing and distributing risks and benefits in the face of uncertainty. The newsboy approach can be applied to develop health-based target concentrations for both radiological and chemical contaminants, with stakeholder input being crucial to assessing 'regret' levels. Associated tools include structured expert judgment elicitation to quantify uncertainty in the dose-response relationship, and mathematical techniques such as probabilistic inversion and iterative proportional fitting. (authors)
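The newsboy balance can be made concrete with the classical newsvendor result: the optimal choice sits at a critical fractile of the uncertain quantity. The sketch below applies that logic to setting a cleanup limit under an uncertain dose-response threshold; the regret weights and the lognormal uncertainty model are hypothetical, not taken from the paper.

```python
from math import exp
from statistics import NormalDist

# Newsvendor-style balancing of regrets under uncertainty. Setting a
# cleanup limit too high (under-protective) incurs regret c_under per
# unit; setting it too low (needless cleanup) incurs c_over. The optimal
# limit sits at the quantile q = c_over / (c_over + c_under) of the
# uncertain dose-response threshold. All numbers are hypothetical.

c_under = 9.0                     # regret weight for under-protection
c_over = 1.0                      # regret weight for over-cleanup
q = c_over / (c_over + c_under)   # critical fractile: 0.1

# Hypothetical lognormal uncertainty on the true threshold concentration,
# e.g. as elicited from structured expert judgment.
mu, sigma = 0.0, 0.5
z = NormalDist().inv_cdf(q)
limit = exp(mu + sigma * z)       # limit set at the 10th percentile

print(q, round(limit, 3))
```

With under-protection weighted nine times as heavily as over-cleanup, the limit lands well below the median of the threshold distribution, i.e. on the protective side, which is the "regret" asymmetry stakeholders would be asked to assess.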

15. Temperature dependence of the hydrogen-broadening coefficient for the ν9 fundamental of ethane

    Science.gov (United States)

    Halsey, G. W.; Hillman, J. J.; Nadler, Shacher; Jennings, D. E.

    1988-01-01

Experimental results for the temperature dependence of the H2-broadening coefficient for the ν9 fundamental of ethane are reported. Measurements were made over the temperature range 95-300 K using a novel low-temperature absorption cell. These spectra were recorded with the Doppler-limited diode laser spectrometer at NASA Goddard. The results are compared with recent measurements and model predictions.
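Broadening coefficients of this kind are conventionally modeled with a power law in temperature, γ(T) = γ(T0)(T0/T)^n. The record does not give the fitted exponent, so the sketch below fits the model to synthetic data purely to illustrate the procedure.

```python
import math

# Fit the standard power-law model gamma(T) = gamma(T0) * (T0/T)**n to
# synthetic broadening data. Taking logs turns it into a linear
# least-squares problem in n. All data values are illustrative.

T0 = 296.0                                       # reference temperature, K
temps = [95.0, 150.0, 200.0, 250.0, 300.0]       # covers the 95-300 K range
n_true, gamma0 = 0.75, 0.090                     # hypothetical parameters
gammas = [gamma0 * (T0 / T) ** n_true for T in temps]

# Linear regression of ln(gamma) on ln(T0/T):
# slope = n, intercept = ln(gamma0).
xs = [math.log(T0 / T) for T in temps]
ys = [math.log(g) for g in gammas]
xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
n_fit = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
g0_fit = math.exp(ybar - n_fit * xbar)

print(round(n_fit, 3), round(g0_fit, 4))
```

Because the synthetic data are noiseless, the regression recovers the generating exponent exactly; with real cell measurements the same fit yields the reported temperature exponent and its uncertainty.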

  16. What is Fundamental?

    CERN Multimedia

    2004-01-01

    Discussing what is fundamental in a variety of fields, biologist Richard Dawkins, physicist Gerardus 't Hooft, and mathematician Alain Connes spoke to a packed Main Auditorium at CERN 15 October. Dawkins, Professor of the Public Understanding of Science at Oxford University, explained simply the logic behind Darwinian natural selection, and how it would seem to apply anywhere in the universe that had the right conditions. 't Hooft, winner of the 1999 Physics Nobel Prize, outlined some of the main problems in physics today, and said he thinks physics is so fundamental that even alien scientists from another planet would likely come up with the same basic principles, such as relativity and quantum mechanics. Connes, winner of the 1982 Fields Medal (often called the Nobel Prize of Mathematics), explained how physics is different from mathematics, which he described as a "factory for concepts," unfettered by connection to the physical world. On 16 October, anthropologist Sharon Traweek shared anecdotes from her ...

  17. A derivation of the classical limit of quantum mechanics and quantum electrodynamics

    International Nuclear Information System (INIS)

    Ajanapon, P.

    1985-01-01

Instead of regarding the classical limit as the limit h → 0, an alternative view based on the physical interpretation of the elements of the density matrix is proposed. According to this alternative view, taking the classical limit corresponds to taking the diagonal elements and ignoring the off-diagonal elements of the density matrix. As illustrations of this alternative approach, the classical limits of quantum mechanics and quantum electrodynamics are derived. The derivation is carried out in two stages. First, the statistical classical limit is derived. Then, with an appropriate initial condition, the deterministic classical limit is obtained. In the case of quantum mechanics, it is found that the classical limit of Schroedinger's wave mechanics is at best statistical, i.e., Schroedinger's wave mechanics does not reduce to deterministic (Hamilton's or Newton's) classical mechanics. In order to obtain the latter, it is necessary to start out initially with a mixture at the level of statistical quantum mechanics. The derivation hinges on the use of the Feynman path integral rigorously defined with the aid of nonstandard analysis. Nonstandard analysis is also applied to extend the method to the case of quantum electrodynamics. The fundamental decoupling problem arising from the use of Grassmann variables is circumvented by the use of c-number electron fields, but antisymmetrically tagged. The basic classical (deterministic) field equations are obtained in the classical limit with appropriate initial conditions. The result raises the question as to what the corresponding classical field equations obtained in the classical limit from the renormalized Lagrangian containing infinite counterterms really mean
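The central prescription, keeping the diagonal of the density matrix and discarding the off-diagonal coherences, can be illustrated with a two-level toy example (the chosen state is arbitrary):

```python
# Illustration of the classical-limit prescription described above: the
# "classical" counterpart of a quantum state keeps only the diagonal of
# its density matrix, discarding the off-diagonal coherences.

# Density matrix of the pure superposition (|0> + |1>)/sqrt(2):
rho = [[0.5, 0.5], [0.5, 0.5]]

# Classical limit in this picture: zero the off-diagonal elements.
rho_classical = [[rho[i][j] if i == j else 0.0 for j in range(2)]
                 for i in range(2)]

def purity(m):
    # Purity Tr(rho^2): 1 for a pure state, 1/2 for the fully mixed
    # two-level diagonal state.
    return sum(m[i][k] * m[k][i] for i in range(2) for k in range(2))

print(purity(rho), purity(rho_classical))   # 1.0 0.5
```

Dropping the coherences turns the pure superposition into a statistical mixture of the two basis outcomes, which matches the record's claim that the limit obtained this way is at best statistical.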

  18. Fundamentals of reactor chemistry

    International Nuclear Information System (INIS)

    Akatsu, Eiko

    1981-12-01

    In the Nuclear Engineering School of JAERI, many courses are presented for people working in and around nuclear reactors. The curricula of the courses also contain the subject material of chemistry. With reference to foreign curricula, a plan for the educational subject material of chemistry in the Nuclear Engineering School of JAERI was considered, and the fundamental part of reactor chemistry was reviewed in this report. Since the students of the Nuclear Engineering School are not chemists, the knowledge necessary in and around nuclear reactors was emphasized in order to familiarize the students with reactor chemistry. The teaching experience of the fundamentals of reactor chemistry is also given. (author)

  19. Fundamentals of electro-engineering I

    International Nuclear Information System (INIS)

    Rapsik, M.; Smola, M.; Bohac, M.; Mucha, M.

    2004-01-01

    This is a text-book on the fundamentals of electro-engineering. It contains the following chapters: (1) Selected terms in electro-engineering; (2) Fundamental electric values; (3) Energy and its transformations; (4) Water, hydro-energy and the hydro-energetic potential of the Slovak Republic; (5) Nuclear power engineering; (6) Conventional thermal power plants; (7) Heating and cogeneration of electric power and heat production; (8) Equipment of the electricity supply system; (9) Measurements in electro-engineering; (10) Regulation of frequency and voltage, electric power quality

  20. A MACHINE-LEARNING METHOD TO INFER FUNDAMENTAL STELLAR PARAMETERS FROM PHOTOMETRIC LIGHT CURVES

    International Nuclear Information System (INIS)

    Miller, A. A.; Bloom, J. S.; Richards, J. W.; Starr, D. L.; Lee, Y. S.; Butler, N. R.; Tokarz, S.; Smith, N.; Eisner, J. A.

    2015-01-01

    A fundamental challenge for wide-field imaging surveys is obtaining follow-up spectroscopic observations: there are >10^9 photometrically cataloged sources, yet modern spectroscopic surveys are limited to ∼ a few × 10^6 targets. As we approach the Large Synoptic Survey Telescope era, new algorithmic solutions are required to cope with the data deluge. Here we report the development of a machine-learning framework capable of inferring fundamental stellar parameters (T_eff, log g, and [Fe/H]) using photometric-brightness variations and color alone. A training set is constructed from a systematic spectroscopic survey of variables with Hectospec/Multi-Mirror Telescope. In sum, the training set includes ∼9000 spectra, for which stellar parameters are measured using the SEGUE Stellar Parameters Pipeline (SSPP). We employed the random forest algorithm to perform a non-parametric regression that predicts T_eff, log g, and [Fe/H] from photometric time-domain observations. Our final optimized model produces a cross-validated rms error (RMSE) of 165 K, 0.39 dex, and 0.33 dex for T_eff, log g, and [Fe/H], respectively. Examining the subset of sources for which the SSPP measurements are most reliable, the RMSE reduces to 125 K, 0.37 dex, and 0.27 dex, respectively, comparable to what is achievable via low-resolution spectroscopy. For variable stars this represents a ≈12%-20% improvement in RMSE relative to models trained with single-epoch photometric colors. As an application of our method, we estimate stellar parameters for ∼54,000 known variables. We argue that this method may convert photometric time-domain surveys into pseudo-spectrographic engines, enabling the construction of extremely detailed maps of the Milky Way, its structure, and history.
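    The regression setup described above can be sketched with scikit-learn's random forest and cross-validation (illustrative only: synthetic features stand in for the real photometric time-domain features, and the target plays the role of T_eff; column names and figures are not from the paper):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)

    # Synthetic stand-in for photometric features (colors, variability statistics)
    X = rng.normal(size=(500, 8))
    # Synthetic stand-in for a stellar parameter such as T_eff (in K)
    y = 5000.0 + 800.0 * X[:, 0] + 50.0 * rng.normal(size=500)

    # Non-parametric regression with a random forest, evaluated by
    # k-fold cross-validation as in the abstract's RMSE figures
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    y_pred = cross_val_predict(model, X, y, cv=5)

    rmse = np.sqrt(np.mean((y - y_pred) ** 2))
    print(f"cross-validated RMSE: {rmse:.1f} K")
    ```

    The same pattern scales to the real problem by swapping in measured light-curve features and SSPP-derived labels.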

  1. Value of Fundamental Science

    Science.gov (United States)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of mega-science. The measure of this devotion and this support expresses the real value of fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the ``mystery of the universe''. Why do these words sound so attractive? What is implied by, and what is incompatible with, them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of the value of pure science on metaphysics? If not, how can this issue be addressed in public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  2. Nanotechnology inspired advanced engineering fundamentals for optimizing drug delivery.

    Science.gov (United States)

    Kassem, Ahmed Alaa

    2018-02-06

    Drug toxicity and inefficacy are commonly experienced problems with drug therapy failure. To face these problems, extensive research work took place aiming to design new dosage forms for drug delivery especially nanoparticulate systems. These systems are designed to increase the quantity of the therapeutic molecule delivered to the desired site concurrently with reduced side effects. In order to achieve this objective, nanocarriers must principally display suitable drug vehiculization abilities and a controlled biological destiny of drug molecules. Only the intelligent design of the nanomedicine will accomplish these fundamentals. The present review article is dedicated to the discussion of the important fundamentals to be considered in the fabrication of nanomedicines. These include the therapeutic agent, the nanocarrier and the functionalization moieties. Special consideration is devoted to the explanation and compilation of highly potential fabrication approaches assisting how to control the in vivo destiny of the nanomedicine. Finally, some nanotechnology-based drug delivery systems, for the development of nanomedicine, are also discussed. The nanotechnology-based drug delivery systems showed remarkable outcomes based on passive and active targeting as well as improvement of the drug pharmacodynamic and pharmacokinetic profiles. Multifunctional nanocarrier concept affords a revolutionary drug delivery approach for maximizing the efficacy, safety and monitoring the biological fate of the therapeutic molecule. Nanomedicines may enhance the efficacy of therapeutic molecules and reduce their toxic effects. Meanwhile, further research works are required to rightly optimize (and define) the effectiveness, nanotoxicity, in vivo destiny and feasibility of these nanomedicines which, from a preclinical standpoint, are actually promising.

  3. Effective approaches to regulate mobile advertising: moving towards a co-ordinated legal, self-regulatory, and technical response

    DEFF Research Database (Denmark)

    Cleff, Evelyne Beatrix

    2009-01-01

    This article aims to contribute to the ongoing discourse about the issue of privacy in the mobile advertising domain. The article discusses the fundamental principles and information practices used in digital environments for protecting individuals' private data. Major challenges are identified that should be addressed, so that fair information principles can be applied in the context of m-advertising. It also points out the limitations of these principles. Furthermore, the article discusses a range of models available for regulating the collection, use and disclosure of personal data, such as legislation, self-regulation and technical approaches. It is intended to promote an effective approach to improve consumer privacy in the mobile advertising domain.

  4. Introduction and fundamentals

    International Nuclear Information System (INIS)

    Thomas, R.H.

    1980-01-01

    This introduction discusses advances in the fundamental sciences which underlie the applied science of health physics and radiation protection. Risk assessments in nuclear medicine are made by defining the conditions of exposure, identifying adverse effects, relating exposure to effect, and estimating the overall risk for ionizing radiations.

  5. Fundamentals of astrodynamics

    NARCIS (Netherlands)

    Wakker, K.F.

    2015-01-01

    This book deals with the motion of the center of mass of a spacecraft; this discipline is generally called astrodynamics. The book focuses on an analytical treatment of the motion of spacecraft and provides insight into the fundamentals of spacecraft orbit dynamics. A large number of topics are

  6. Fundamentals of multicore software development

    CERN Document Server

    Pankratius, Victor; Tichy, Walter F

    2011-01-01

    With multicore processors now in every computer, server, and embedded device, the need for cost-effective, reliable parallel software has never been greater. By explaining key aspects of multicore programming, Fundamentals of Multicore Software Development helps software engineers understand parallel programming and master the multicore challenge. Accessible to newcomers to the field, the book captures the state of the art of multicore programming in computer science. It covers the fundamentals of multicore hardware, parallel design patterns, and parallel programming in C++, .NET, and Java. It

  7. The fundamental interactions of matter

    International Nuclear Information System (INIS)

    Falla, D.F.

    1977-01-01

    Elementary particles are here discussed, in the context of the extent to which the fundamental interactions are related to the elementary constituents of matter. The field quanta related to the four fundamental interactions (electromagnetic, strong,weak and gravitational) are discussed within an historical context beginning with the conception of the photon. The discovery of the mesons and discoveries relevant to the nature of the heavy vector boson are considered. Finally a few recent speculations on the properties of the graviton are examined. (U.K.)

  8. Actinide targets for fundamental research in nuclear physics

    Science.gov (United States)

    Eberhardt, K.; Düllmann, Ch. E.; Haas, R.; Mokry, Ch.; Runke, J.; Thörle-Pospiech, P.; Trautmann, N.

    2018-05-01

    Thin actinide layers deposited on various substrates are widely used as calibration sources in nuclear spectroscopy. Other applications include fundamental research in nuclear chemistry and physics, e.g., the chemical and physical properties of super-heavy elements (SHE, Z > 103) or nuclear reaction studies with heavy ions. For the design of future nuclear reactors like fast-fission reactors and accelerator-driven systems for the transmutation of nuclear waste, precise data for neutron absorption as well as neutron-induced fission cross sections for 242Pu with neutrons of different energies are of particular importance, requiring suitable Pu targets. Another application includes studies of nuclear transitions in 229Th, harvested as the α-decay recoil product from a thin layer of its 233U precursor. For this, a thin and very smooth layer of 233U is used. We report here on the production of actinide layers, mostly obtained by Molecular Plating (MP). MP is currently the only fabrication method in cases where the desired actinide material is available only in very limited amounts or possesses a high specific activity. Here, deposition is performed from organic solution applying a current density of 1-2 mA/cm². Under these conditions, target thicknesses of 500-1000 μg/cm² are possible in a single deposition step, with deposition yields approaching 100%. For yield determination, α-particle spectroscopy, γ-spectroscopy and neutron activation analysis are routinely used. Layer homogeneity is checked with radiographic imaging. As an alternative technique to MP, the production of thin lanthanide and actinide layers by the so-called "drop on demand" technique applied, e.g., in ink-jet printing is currently under investigation.
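    For a rough feel for the activities such layers imply, a sketch using A = λN (assuming a 242Pu half-life of about 3.75 × 10^5 years and a nominal 1000 μg/cm² layer over 1 cm²; the numbers are illustrative, not from the paper):

    ```python
    import math

    # Assumed values (illustrative)
    half_life_yr = 3.75e5          # 242Pu half-life in years (approximate)
    areal_density = 1000e-6        # deposited layer, g/cm^2 (1000 ug/cm^2)
    area = 1.0                     # target area, cm^2
    molar_mass = 242.0             # g/mol
    avogadro = 6.02214076e23       # atoms/mol

    mass = areal_density * area                      # deposited mass, g
    atoms = mass / molar_mass * avogadro             # number of 242Pu atoms
    lam = math.log(2) / (half_life_yr * 3.1557e7)    # decay constant, 1/s
    activity = lam * atoms                           # activity, Bq

    print(f"{mass * 1e3:.2f} mg of 242Pu -> {activity / 1e3:.0f} kBq")
    ```

    With these inputs the activity comes out in the hundred-kBq range, which is why α-particle spectroscopy is a practical yield-determination method for such targets.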

  9. Chapman--Enskog approach to flux-limited diffusion theory

    International Nuclear Information System (INIS)

    Levermore, C.D.

    1979-01-01

    Using the technique developed by Chapman and Enskog for deriving the Navier--Stokes equations from the Boltzmann equation, a framework is set up for deriving diffusion theories from the transport equation. The procedure is first applied to give a derivation of isotropic diffusion theory and then of a completely new theory which is naturally flux-limited. This new flux-limited diffusion theory is then compared with asymptotic diffusion theory
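    For context, a widely quoted flux limiter associated with this line of work is the Levermore-Pomraning form (shown here in generic notation as an illustration of what "naturally flux-limited" means, not necessarily the exact result of this paper):

    ```latex
    \mathbf{F} \;=\; -\,\frac{c\,\lambda(R)}{\sigma}\,\nabla E ,
    \qquad
    R \;=\; \frac{|\nabla E|}{\sigma E} ,
    \qquad
    \lambda(R) \;=\; \frac{1}{R}\left(\coth R - \frac{1}{R}\right) .
    ```

    In the diffusive regime $R \to 0$ the limiter gives $\lambda \to 1/3$, recovering isotropic diffusion, while for $R \to \infty$ it gives $|\mathbf{F}| \to cE$, so the flux can never exceed the free-streaming value.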

  10. Fundamental principles of the cultural-activity approach in the psychology of giftedness

    OpenAIRE

    Babaeva, Julia

    2013-01-01

    This article examines the cultural-activity approach to the study of giftedness, which is based on the ideas of L. S. Vygotsky, A. N. Leontiev, and O. K. Tikhomirov. Three basic principles of this approach are described: the principle of polymorphism, the dynamic principle, and the principle of the holistic analysis of the giftedness phenomenon. The article introduces the results of empirical research (including a 10-year longitudinal study), which verifies the efficacy of the cultural-activi...

  11. O DIREITO FUNDAMENTAL AO DESENVOLVIMENTO SUSTENTÁVEL

    OpenAIRE

    Fernandes, Jeferson Nogueira

    2012-01-01

    This legal article deals with the fundamental right of the human being to sustainable development, and its objective is to identify sustainable development as a universally recognized fundamental right. The following material was used in its preparation: doctrinal texts, relevant national legislation and international documents. The conclusion indicates that sustainable development is a fundamental human right and that we must keep balanced the relationship between the environment …

  12. RFID design fundamentals and applications

    CERN Document Server

    Lozano-Nieto, Albert

    2010-01-01

    RFID is an increasingly pervasive tool that is now used in a wide range of fields. It is employed to substantiate adherence to food preservation and safety standards, combat the circulation of counterfeit pharmaceuticals, and verify the authenticity and history of critical parts used in aircraft and other machinery, and these are just a few of its uses. Going beyond deployment and focusing on exactly how RFID actually works, RFID Design Fundamentals and Applications systematically explores the fundamental principles involved in the design and characterization of RFID technologies. The RFID market is expl

  13. Astrophysical probes of fundamental physics

    International Nuclear Information System (INIS)

    Martins, C.J.A.P.

    2009-01-01

    I review the motivation for varying fundamental couplings and discuss how these measurements can be used to constrain fundamental physics scenarios that would otherwise be inaccessible to experiment. I highlight the current controversial evidence for varying couplings and present some new results. Finally I focus on the relation between varying couplings and dark energy, and explain how varying coupling measurements might be used to probe the nature of dark energy, with some advantages over standard methods. In particular I discuss what can be achieved with future spectrographs such as ESPRESSO and CODEX.

  14. Astrophysical probes of fundamental physics

    Energy Technology Data Exchange (ETDEWEB)

    Martins, C.J.A.P. [Centro de Astrofisica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); DAMTP, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom)

    2009-10-15

    I review the motivation for varying fundamental couplings and discuss how these measurements can be used to constrain fundamental physics scenarios that would otherwise be inaccessible to experiment. I highlight the current controversial evidence for varying couplings and present some new results. Finally I focus on the relation between varying couplings and dark energy, and explain how varying coupling measurements might be used to probe the nature of dark energy, with some advantages over standard methods. In particular I discuss what can be achieved with future spectrographs such as ESPRESSO and CODEX.

  15. Qualitative insights on fundamental mechanics

    International Nuclear Information System (INIS)

    Mardari, Ghenadie N

    2007-01-01

    The gap between classical mechanics and quantum mechanics has an important interpretive implication: the Universe must have an irreducible fundamental level, which determines the properties of matter at higher levels of organization. We show that the main parameters of any fundamental model must be theory-independent. Moreover, such models must also contain discrete identical entities with constant properties. These conclusions appear to support the work of Kaniadakis on subquantum mechanics. A qualitative analysis is offered to suggest compatibility with relevant phenomena, as well as to propose new means for verification

  16. A combined Raman spectroscopic and theoretical investigation of fundamental vibrational bands of furfuryl alcohol (2-furanmethanol)

    DEFF Research Database (Denmark)

    Barsberg, S.; Berg, Rolf W.

    2006-01-01

    … study of FA in weakly interacting environments. It is the first study of FA vibrational properties based on density functional theory (DFT/B3LYP) and a recently proposed hybrid approach to the calculation of fundamental frequencies, which also includes an anharmonic contribution. FA occupies five different…

  17. Influences of Fundamental Frequency, Formant Frequencies, Aperiodicity, and Spectrum Level on the Perception of Voice Gender

    Science.gov (United States)

    Skuk, Verena G.; Schweinberger, Stefan R.

    2014-01-01

    Purpose: To determine the relative importance of acoustic parameters (fundamental frequency [F0], formant frequencies [FFs], aperiodicity, and spectrum level [SL]) on voice gender perception, the authors used a novel parameter-morphing approach that, unlike spectral envelope shifting, allows the application of nonuniform scale factors to transform…

  18. Java programming fundamentals problem solving through object oriented analysis and design

    CERN Document Server

    Nair, Premchand S

    2008-01-01

    While Java texts are plentiful, it's difficult to find one that takes a real-world approach and encourages novice programmers to build on their Java skills through practical exercise. Written by an expert with 19 years' experience teaching computer programming, Java Programming Fundamentals presents object-oriented programming by employing examples taken from everyday life. It provides a foundation in object-oriented design principles and UML notation, describes common pitfalls and good programming practices, and furnishes supplemental links, documents, and programs on its companion website, www.premnair.net

  19. On the importance of identifying, characterizing, and predicting fundamental phenomena towards microbial electrochemistry applications.

    Science.gov (United States)

    Torres, César Iván

    2014-06-01

    The development of microbial electrochemistry research toward technological applications has increased significantly in the past years, leading to many process configurations. This short review focuses on the need to identify and characterize the fundamental phenomena that control the performance of microbial electrochemical cells (MXCs). Specifically, it discusses the importance of recent efforts to discover and characterize novel microorganisms for MXC applications, as well as recent developments to understand transport limitations in MXCs. As we increase our understanding of how MXCs operate, it is imperative to continue modeling efforts in order to effectively predict their performance, design efficient MXC technologies, and implement them commercially. Thus, the success of MXC technologies largely depends on the path of identifying, understanding, and predicting fundamental phenomena that determine MXC performance.

  20. Astronomers Gain Clues About Fundamental Physics

    Science.gov (United States)

    2005-12-01

    An international team of astronomers has looked at something very big -- a distant galaxy -- to study the behavior of things very small -- atoms and molecules -- to gain vital clues about the fundamental nature of our entire Universe. The team used the National Science Foundation's Robert C. Byrd Green Bank Telescope (GBT) to test whether the laws of nature have changed over vast spans of cosmic time. "The fundamental constants of physics are expected to remain fixed across space and time; that's why they're called constants! Now, however, new theoretical models for the basic structure of matter indicate that they may change. We're testing these predictions." said Nissim Kanekar, an astronomer at the National Radio Astronomy Observatory (NRAO), in Socorro, New Mexico. So far, the scientists' measurements show no change in the constants. "We've put the most stringent limits yet on some changes in these constants, but that's not the end of the story," said Christopher Carilli, another NRAO astronomer. "This is the exciting frontier where astronomy meets particle physics," Carilli explained. The research can help answer fundamental questions about whether the basic components of matter are tiny particles or tiny vibrating strings, how many dimensions the Universe has, and the nature of "dark energy." The astronomers were looking for changes in two quantities: the ratio of the masses of the electron and the proton, and a number physicists call the fine structure constant, a combination of the electron charge, the speed of light and the Planck constant. These values, considered fundamental physical constants, once were "taken as time independent, with values given once and forever" said German particle physicist Christof Wetterich. However, Wetterich explained, "the viewpoint of modern particle theory has changed in recent years," with ideas such as
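    The fine-structure constant mentioned above is dimensionless, which is what makes a claimed variation physically meaningful. Its standard definition can be checked numerically from CODATA values:

    ```python
    import math

    # CODATA values (SI units)
    e = 1.602176634e-19        # elementary charge, C (exact)
    hbar = 1.054571817e-34     # reduced Planck constant, J*s
    c = 299792458.0            # speed of light, m/s (exact)
    eps0 = 8.8541878128e-12    # vacuum permittivity, F/m

    # Fine-structure constant: alpha = e^2 / (4*pi*eps0*hbar*c)
    alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
    print(f"alpha = {alpha:.9f}  (1/alpha = {1/alpha:.3f})")  # 1/alpha ≈ 137.036
    ```

    Because all the units cancel, a measured change in alpha across cosmic time could not be explained away as a change of unit conventions, which is why astronomical tests target it.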

  1. Development of an air coil superconducting fault current limiter

    Energy Technology Data Exchange (ETDEWEB)

    Naeckel, Oliver

    2016-07-01

    Electrical power grids are the lifeline of technical infrastructure and fundamental to industry and modern life. Fault currents can disrupt the continuous supply of electrical energy, cause unstable grid conditions and damage electrical equipment. The Air Coil Superconducting Fault Current Limiter (AC-SFCL) is a measure to effectively limit fault currents. The concept is investigated and proven experimentally by designing, building and successfully testing a 60 kV, 400 V, z = 6% demonstrator.

  2. Fundamental rights, contract law and the protection of the weaker party: a comparative analysis of the constitutionalisation of contract Law, with emphasis on risky financial transactions

    NARCIS (Netherlands)

    Cherednychenko, O.O.

    2007-01-01

    Originally, contract law was considered to be immune from the effect of fundamental rights, the function of which was limited to being individual defences against the vigilant eye of the State. This traditional view, however, has recently been put under pressure as a result of fundamental rights

  3. PRESSURE ULCER PREVENTION: FUNDAMENTALS FOR BEST PRACTICE.

    Science.gov (United States)

    Collier, Mark

    2016-01-01

    This introduction has highlighted both the complex nature of the aetiology of pressure ulcer development and the complex nature of the assessment process intended to identify those patients who are, or might be, at enhanced risk of pressure ulcer development. The latter statement assumes that all patients cared for in any healthcare setting are vulnerable to pressure ulcer development. Whilst it is acknowledged that the use of a risk assessment tool can be important in an overall pressure ulcer prevention strategy, the limitations of these tools must also be acknowledged: they are not a definitive assessment in themselves, and they should be used by a practitioner with a fundamental breadth of relevant knowledge, an appreciation of the range of appropriate preventative equipment and techniques available, and an understanding of the role of the multi-disciplinary team in the prevention of all avoidable pressure ulcers.

  4. Shareholders' Fundamental Rights in Listed Companies

    DEFF Research Database (Denmark)

    Werlauff, Erik

    2017-01-01

    There can be no reasonable doubt that the EU's initiatives in the field of shareholders' fundamental rights in listed companies are among the successful, relevant and necessary provisions under EU corporate and stock exchange law. This also holds true for the main Directive 2007/36. When considering the whole spirit and idea of the EU and its competences, the field of basic shareholders' rights, including cross-border shareholding, is to be regarded as a welcome initiative that has facilitated the exercise of fundamental rights also in cross-border shareholding. The success is further emphasized by the fact that some countries, including Denmark, have regarded a number of the fundamental rights vested in the directive as being so well formulated that the countries have chosen to gold-plate their own legislation by introducing rights which are similar to those in the directive also for non…

  5. Default Bayesian Estimation of the Fundamental Frequency

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2013-01-01

    Joint fundamental frequency and model order estimation is an important problem in several applications. In this paper, a default estimation algorithm based on a minimum of prior information is presented. The algorithm is developed in a Bayesian framework, and it can be applied to both real… Moreover, several approximations of the posterior distributions on the fundamental frequency and the model order are derived, and one of the state-of-the-art joint fundamental frequency and model order estimators is demonstrated to be a special case of one of these approximations. The performance…

  6. LFR safety approach and main ELFR safety analysis results

    International Nuclear Information System (INIS)

    Bubelis, E.; Schikorr, M.; Frogheri, M.; Mansani, L.; Bandini, G.; Burgazzi, L.; Mikityuk, K.; Zhang, Y.; Lo Frano, R.; Forgione, N.

    2013-01-01

    LFR safety approach: → A global safety approach for the LFR reference plant has been assessed and the safety analysis methodology has been developed. → The LFR follows the general guidelines of the Generation IV safety concept recommendations; thus, improved safety and higher reliability are recognized as an essential priority. → The fundamental safety objectives and the Defence-in-Depth (DiD) approach, as described by the IAEA Safety Guides, have been preserved. → The recommendations of the Risk and Safety Working Group (RSWG) of the GEN-IV IF have been taken into account: • safety is to be “built-in” in the fundamental design rather than “added on”; • full implementation of the Defence-in-Depth principles in a manner that is demonstrably exhaustive, progressive, tolerant, forgiving and well-balanced; • a “risk-informed” approach: a deterministic approach complemented with a probabilistic one; • adoption of an integrated methodology (ISAM) that can be used to evaluate and document the safety of Gen IV nuclear systems. In particular, the OPT tool is the fundamental methodology used throughout the design process

  7. Fundamental and semi-global kinetic mechanisms for hydrocarbon combustion. Final report, March 1977-October 1980

    Energy Technology Data Exchange (ETDEWEB)

    Dryer, F L; Glassman, I; Brezinsky, K

    1981-03-01

    Over the past three and one half years, substantial research efforts of the Princeton Fuels Research Group have been directed towards the development of simplified mechanisms which accurately describe the oxidation of hydrocarbon fuels. The objectives of this combustion research included the study of semi-empirical modeling (that is, an overall description) of the chemical kinetic mechanisms of simple hydrocarbon fuels. Such fuels include the alkanes ethane, propane, butane, hexane and octane, as well as the critically important alkenes ethene, propene and butene. As an extension of this work, the study of the detailed radical-species characteristics of combustion systems was initiated as another major aspect of the program, with emphasis on the role of the OH and HO/sub 2/ radicals. Finally, studies of important alternative-fuel problems linked the program to longer-range approaches to the energy supply question. Studies of alternative fuels composed the major elements of this area of the program. The efforts on methanol research were completed, and while the aromatics aspects of the DOE work have been a direct extension of efforts supported by the Air Force Office of Scientific Research, they represented a significant part of the overall research effort. The emphasis in the proposed program is to provide further fundamental understanding of the oxidation of hydrocarbon fuels which will be useful in guiding engineering approaches. Although the scope of the program ranges from the fundamentals of chemical kinetics to alternative-fuel combustion, the objective is to provide insight and guidance for the understanding of practical combustion environments. The key to our approach has been our understanding of the fundamental combustion chemistry and its relation to the important practical combustion problems which exist in implementing energy-efficient, alternative-fuel technologies.

  8. On the fundamentals of nuclear reactor safety assessment. Inherent threats and their implications

    Energy Technology Data Exchange (ETDEWEB)

    Hyvaerinen, J. [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland). Nuclear Safety Dept.

    1996-12-01

    The thesis addresses some fundamental questions related to implementation and assessment of nuclear safety. The safety principles and assessment methods are described, followed by descriptions of selected novel technical challenges to nuclear safety. The novel challenges encompass a wide variety of technical issues, thus providing insights on the limitations of conventional safety assessment methods. Study of the limitations suggests means to improve nuclear reactor design criteria and safety assessment practices. The novel safety challenges discussed are (1) inherent boron dilution in PWRs, (2) metallic insulation performance with respect to total loss of emergency cooling systems in a loss-of-coolant accident, and (3) horizontal steam generator heat transfer performance at natural circulation conditions. (50 refs.).

  9. On the fundamentals of nuclear reactor safety assessment. Inherent threats and their implications

    International Nuclear Information System (INIS)

    Hyvaerinen, J.

    1996-12-01

    The thesis addresses some fundamental questions related to implementation and assessment of nuclear safety. The safety principles and assessment methods are described, followed by descriptions of selected novel technical challenges to nuclear safety. The novel challenges encompass a wide variety of technical issues, thus providing insights on the limitations of conventional safety assessment methods. Study of the limitations suggests means to improve nuclear reactor design criteria and safety assessment practices. The novel safety challenges discussed are (1) inherent boron dilution in PWRs, (2) metallic insulation performance with respect to total loss of emergency cooling systems in a loss-of-coolant accident, and (3) horizontal steam generator heat transfer performance at natural circulation conditions. (50 refs.)

  10. Unified composite model of all fundamental particles and forces

    International Nuclear Information System (INIS)

    Terazawa, H.

    2000-01-01

    The unified supersymmetric composite model of all fundamental particles (and forces) including not only the fundamental fermions (quarks and leptons) but also the fundamental bosons (gauge bosons and Higgs scalars) is reviewed in detail

  11. The Fundamental Principle of Human Dignity and the Right to Life : Collision Any of These Fundamental Principles The Perspective of Abortion

    Directory of Open Access Journals (Sweden)

    Érika do Amaral Véras

    2016-12-01

    Full Text Available This legal article addresses the collision of fundamental principles, especially the principle of human dignity and the right to life, from the perspective of abortion. First, fundamental rights are discussed, bringing their definition and observing the distinction between human rights and fundamental rights. Then the super-principle of human dignity is covered and, soon after, the right to life is highlighted through its relevant elements. Finally, the article discusses a possible collision between the fundamental right to life and the principle of human dignity, with a special focus on the issue of abortion.

  12. Quench limits

    International Nuclear Information System (INIS)

    Sapinski, M.

    2012-01-01

    With thirteen beam-induced quenches and numerous Machine Development tests, the current knowledge of LHC magnet quench limits still contains many unknowns. Various approaches to determining the quench limits are reviewed and the results of the tests are presented. An attempt is made to reconstruct a coherent picture emerging from these results. The available methods for computing quench levels are presented, together with the dedicated particle-shower simulations that are necessary to understand the tests. The future experiments needed to reach a better understanding of quench limits, as well as limits for machine operation, are investigated. Possible strategies for setting BLM (Beam Loss Monitor) thresholds are discussed. (author)

  13. An approach to criteria, design limits and monitoring in nuclear fuel waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, G R; Baumgartner, P; Bird, G A; Davison, C C; Johnson, L H; Tamm, J A

    1994-12-01

    The Nuclear Fuel Waste Management Program has been established to develop and demonstrate the technology for safe geological disposal of nuclear fuel waste. One objective of the program is to show that a disposal system (i.e., a disposal centre and associated transportation system) can be designed and that it would be safe. Therefore the disposal system must be shown to comply with safety requirements specified in guidelines, standards, codes and regulations. The components of the disposal system must also be shown to operate within the limits specified in their design. Compliance and performance of the disposal system would be assessed on a site-specific basis by comparing estimates of the anticipated performance of the system and its components with compliance or performance criteria. A monitoring program would be developed to consider the effects of the disposal system on the environment and would include three types of monitoring: baseline monitoring, compliance monitoring, and performance monitoring. This report presents an approach to establishing compliance and performance criteria, limits for use in disposal system component design, and the main elements of a monitoring program for a nuclear fuel waste disposal system. (author). 70 refs., 9 tabs., 13 figs.

  14. An approach to criteria, design limits and monitoring in nuclear fuel waste disposal

    International Nuclear Information System (INIS)

    Simmons, G.R.; Baumgartner, P.; Bird, G.A.; Davison, C.C.; Johnson, L.H.; Tamm, J.A.

    1994-12-01

    The Nuclear Fuel Waste Management Program has been established to develop and demonstrate the technology for safe geological disposal of nuclear fuel waste. One objective of the program is to show that a disposal system (i.e., a disposal centre and associated transportation system) can be designed and that it would be safe. Therefore the disposal system must be shown to comply with safety requirements specified in guidelines, standards, codes and regulations. The components of the disposal system must also be shown to operate within the limits specified in their design. Compliance and performance of the disposal system would be assessed on a site-specific basis by comparing estimates of the anticipated performance of the system and its components with compliance or performance criteria. A monitoring program would be developed to consider the effects of the disposal system on the environment and would include three types of monitoring: baseline monitoring, compliance monitoring, and performance monitoring. This report presents an approach to establishing compliance and performance criteria, limits for use in disposal system component design, and the main elements of a monitoring program for a nuclear fuel waste disposal system. (author). 70 refs., 9 tabs., 13 figs

  15. Scaling theory for the quasideterministic limit of continuous bifurcations.

    Science.gov (United States)

    Kessler, David A; Shnerb, Nadav M

    2012-05-01

    Deterministic rate equations are widely used in the study of stochastic, interacting particle systems. This approach assumes that the inherent noise, associated with the discreteness of the elementary constituents, may be neglected when the number of particles N is large. Accordingly, it fails close to the extinction transition, when the amplitude of stochastic fluctuations is comparable with the size of the population. Here we present a general scaling theory of the transition regime for spatially extended systems. We demonstrate this through a detailed study of two fundamental models for out-of-equilibrium phase transitions: the Susceptible-Infected-Susceptible (SIS) model, which belongs to the directed percolation equivalence class, and the Susceptible-Infected-Recovered (SIR) model, which belongs to the dynamic percolation class. Implementing the Ginzburg criterion, we show that the width of the fluctuation-dominated region scales like N^{-κ}, where N is the number of individuals per site, κ=2/(d_{u}-d), and d_{u} is the upper critical dimension. Other exponents that control the approach to the deterministic limit are shown to be calculable once κ is known. The theory is extended to include the corrections to the front velocity above the transition. It is supported by the results of extensive numerical simulations for systems of various dimensionalities.
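    The scaling relation quoted in the abstract is simple enough to evaluate directly. A minimal sketch, assuming the standard upper critical dimensions for the two universality classes named above (d_u = 4 for directed percolation/SIS, d_u = 6 for dynamic percolation/SIR):

    ```python
    # Width of the fluctuation-dominated region near the transition,
    # following the abstract's relation: width ~ N**(-kappa),
    # kappa = 2 / (d_u - d). The d_u values used in the example are
    # the standard ones for the named universality classes (an
    # assumption here, not taken from the record itself).

    def kappa(d, d_u):
        """Exponent kappa = 2 / (d_u - d), valid below the upper critical dimension."""
        if d >= d_u:
            raise ValueError("relation holds only for d < d_u")
        return 2.0 / (d_u - d)

    def fluctuation_width(N, d, d_u):
        """Width of the fluctuation-dominated region, ~ N**(-kappa)."""
        return N ** -kappa(d, d_u)

    # Example: the 1D SIS model (directed percolation, d_u = 4)
    print(kappa(1, 4))                      # -> 0.6666666666666666
    print(fluctuation_width(10**4, 1, 4))   # ~ 2.15e-3, i.e. 10**(-8/3)
    ```

    With N = 10^4 individuals per site in one dimension, the fluctuation-dominated window shrinks to about 0.2% of the control-parameter range, illustrating how quickly the deterministic description takes over as N grows.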

  16. 48 CFR 9904.416-40 - Fundamental requirement.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirement..., OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.416-40 Fundamental requirement. (a) The amount of...

  17. 48 CFR 9904.413-40 - Fundamental requirement.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirement..., OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.413-40 Fundamental requirement. (a) Assignment of...

  18. 48 CFR 9904.417-40 - Fundamental requirement.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirement... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.417-40 Fundamental requirement. The cost of money applicable to the investment in tangible and intangible capital assets being constructed, fabricated, or...

  19. 48 CFR 9905.501-40 - Fundamental requirement.

    Science.gov (United States)

    2010-10-01

    ... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS FOR EDUCATIONAL INSTITUTIONS 9905.501-40 Fundamental... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirement. 9905.501-40 Section 9905.501-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD...

  20. 48 CFR 9905.502-40 - Fundamental requirement.

    Science.gov (United States)

    2010-10-01

    ... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS FOR EDUCATIONAL INSTITUTIONS 9905.502-40 Fundamental... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirement. 9905.502-40 Section 9905.502-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD...

  1. 48 CFR 9905.506-40 - Fundamental requirement.

    Science.gov (United States)

    2010-10-01

    ... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS FOR EDUCATIONAL INSTITUTIONS 9905.506-40 Fundamental... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirement. 9905.506-40 Section 9905.506-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD...

  2. EFFICACY OF SUBMUCOSAL DELIVERY THROUGH A PARAPHARYNGEAL APPROACH IN THE TREATMENT OF LIMITED CRICOID CHONDROMA

    Directory of Open Access Journals (Sweden)

    M.T. Khorsi; Y. Amidi

    2008-05-01

    Full Text Available Cartilaginous tumors comprise 1% of all laryngeal masses. Since they grow slowly and metastasis is rare, long-term survival is expected in cases of chondroma and chondrosarcoma. Based on these facts, and the fact that total salvage surgery after recurrence of a previous tumor does not influence treatment outcomes, "Quality of Life" must be taken into great consideration. Based on 3 cases of limited chondrosarcoma that we have successfully operated on using submucosal delivery through a parapharyngeal approach, with several years of recurrence-free follow-up, the authors consider this technique an efficient approach to these tumors. Since this technique takes less time, there is no need for a glottic incision, and the patient is discharged in 2 days without insertion of an endolaryngeal stent, we believe this method is superior to laryngofissure or total laryngectomy.

  3. Equations of viscous flow of silicate liquids with different approaches for universality of high temperature viscosity limit

    Directory of Open Access Journals (Sweden)

    Ana F. Kozmidis-Petrović

    2014-06-01

    Full Text Available The Vogel-Fulcher-Tammann (VFT), Avramov and Milchev (AM), as well as Mauro, Yue, Ellison, Gupta and Allan (MYEGA) functions of viscous flow are analysed when the compositionally independent high-temperature viscosity limit is introduced instead of the compositionally dependent parameter η∞. Two different approaches are adopted. In the first approach, it is assumed that each model should have its own (average) high-temperature viscosity parameter η∞. In that case, η∞ is different for each of these three models. In the second approach, it is assumed that the high-temperature viscosity is a truly universal value, independent of the model. In this case, the parameter η∞ would have the same value, log η∞ = −1.93 dPa·s, for all three models. 3D diagrams can successfully predict the difference in behaviour of the viscous functions when the average or the universal high-temperature limit is applied in calculations. The values of the AM functions depend, to a greater extent, on whether the average or the universal value for η∞ is used, which is not the case with the VFT model. Our tests and values of the standard error of estimate (SEE) show that there are no general rules on whether the average or the universal high-temperature viscosity limit should be applied to get the best agreement with the experimental functions.

  4. A qualitative risk assessment approach for Swiss dairy products: opportunities and limitations.

    Science.gov (United States)

    Menéndez González, S; Hartnack, S; Berger, T; Doherr, M; Breidenbach, E

    2011-05-01

    Switzerland implemented a risk-based monitoring of Swiss dairy products in 2002 based on a risk assessment (RA) that considered the probability of exceeding a microbiological limit value set by law. A new RA was launched in 2007 to review and further develop the previous assessment, and to make recommendations for future risk-based monitoring according to current risks. The resulting qualitative RA was designed to ascertain the risk to human health from the consumption of Swiss dairy products. The products and microbial hazards to be considered in the RA were determined based on a risk profile. The hazards included Campylobacter spp., Listeria monocytogenes, Salmonella spp., Shiga toxin-producing Escherichia coli, coagulase-positive staphylococci and Staphylococcus aureus enterotoxin. The release assessment considered the prevalence of the hazards in bulk milk samples, the influence of the process parameters on the microorganisms, and the influence of the type of dairy. The exposure assessment was linked to the production volume. An overall probability was estimated combining the probabilities of release and exposure for each combination of hazard, dairy product and type of dairy. This overall probability represents the likelihood of a product from a certain type of dairy exceeding the microbiological limit value and being passed on to the consumer. The consequences could not be fully assessed due to lack of detailed information on the number of disease cases caused by the consumption of dairy products. The results were expressed as a ranking of overall probabilities. Finally, recommendations for the design of the risk-based monitoring programme and for filling the identified data gaps were given. The aims of this work were (i) to present the qualitative RA approach for Swiss dairy products, which could be adapted to other settings and (ii) to discuss the opportunities and limitations of the qualitative method. © 2010 Blackwell Verlag GmbH.
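    The abstract describes combining a release probability and an exposure probability into an overall qualitative probability for each hazard/product/dairy-type combination, then ranking the results. A minimal sketch of that kind of combination; the level names, the conservative "take the lower level" rule, and the example entries are all assumptions of ours, not the study's actual scoring scheme:

    ```python
    # Illustrative qualitative risk combination: an overall probability
    # is derived from a release level and an exposure level, and the
    # combinations are ranked. Hazard/product entries are hypothetical.

    LEVELS = ["negligible", "low", "medium", "high"]

    def combine(release, exposure):
        """Overall probability = the lower of the two qualitative levels
        (a common convention for combining sequential conditions; an
        assumption here, not taken from the study)."""
        return LEVELS[min(LEVELS.index(release), LEVELS.index(exposure))]

    combos = {
        ("L. monocytogenes", "soft cheese", "artisanal"): ("high", "medium"),
        ("S. aureus enterotoxin", "hard cheese", "industrial"): ("low", "high"),
    }
    ranking = sorted(
        ((combine(r, e), hazard) for (hazard, _, _), (r, e) in combos.items()),
        key=lambda t: -LEVELS.index(t[0]),
    )
    print(ranking[0])  # -> ('medium', 'L. monocytogenes')
    ```

    Expressing the output as a ranking of overall probabilities, rather than absolute risk estimates, matches the abstract's note that the consequence side could not be fully assessed.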

  5. Examining the conflation of multiculturalism, sexism, and religious fundamentalism through Taylor and Bakhtin: expanding post-colonial feminist epistemology.

    Science.gov (United States)

    Racine, Louise

    2009-01-01

    In this post-9/11 era marked by religious and ethnic conflicts and the rise of cultural intolerance, ambiguities arising from the conflation of multiculturalism, sexism, and religious fundamentalism jeopardize the delivery of culturally safe nursing care to non-Western populations. This new social reality requires nurses to develop a heightened awareness of health issues pertaining to racism and ethnocentrism to provide culturally safe care to non-Western immigrants or refugees. Through the lens of post-colonial feminism, this paper explores the challenge of providing culturally safe nursing care in post-9/11 Canadian healthcare settings. A critical appraisal of the literature demonstrates that post-colonial feminism, despite some limitations, remains a valuable theoretical perspective to apply in cultural nursing research and in developing culturally safe nursing practice. Post-colonial feminism offers the analytical lens to understand how health, social and cultural context, race and gender intersect to impact on non-Western populations' health. However, an uncritical application of post-colonial feminism may not serve racialized men's and women's interests because of its essentialist risk. Post-colonial feminism must expand its epistemological assumptions to integrate Taylor's concept of identity and recognition and Bakhtin's concepts of dialogism and unfinalizability to explore non-Western populations' health issues and the context of nursing practice. This would strengthen the theoretical adequacy of post-colonial feminist approaches in unveiling the process of racialization that arises from the conflation of multiculturalism, sexism, and religious fundamentalism in Western healthcare settings.

  6. A Kalman-based Fundamental Frequency Estimation Algorithm

    DEFF Research Database (Denmark)

    Shi, Liming; Nielsen, Jesper Kjær; Jensen, Jesper Rindom

    2017-01-01

    Fundamental frequency estimation is an important task in speech and audio analysis. Harmonic model-based methods typically have superior estimation accuracy. However, such methods usually assume that the fundamental frequency and amplitudes are stationary over a short time frame. In this pape...

  7. 48 CFR 9904.420-40 - Fundamental requirement.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirement..., OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.420-40 Fundamental requirement. (a) The basic unit for...

  8. 48 CFR 9904.411-40 - Fundamental requirement.

    Science.gov (United States)

    2010-10-01

    ... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.411-40 Fundamental requirement. (a) The contractor shall... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirement. 9904.411-40 Section 9904.411-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD...

  9. 48 CFR 9904.402-40 - Fundamental requirement.

    Science.gov (United States)

    2010-10-01

    ... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.402-40 Fundamental requirement. All costs incurred for... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirement. 9904.402-40 Section 9904.402-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD...

  10. 48 CFR 9904.418-40 - Fundamental requirements.

    Science.gov (United States)

    2010-10-01

    ... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.418-40 Fundamental requirements. (a) A business unit... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Fundamental requirements. 9904.418-40 Section 9904.418-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD...

  11. Design principles for single standing nanowire solar cells: going beyond the planar efficiency limits.

    Science.gov (United States)

    Zeng, Yang; Ye, Qinghao; Shen, Wenzhong

    2014-05-09

    Semiconductor nanowires (NWs) have long been used in photovoltaic applications, but have been restricted to approaching the fundamental efficiency limits of planar devices with less material. However, recent research on standing NWs has started to reveal their potential to surpass these limits when their unique optical properties are utilized in novel ways. Here, we present a theoretical guideline for maximizing the conversion efficiency of a single standing NW cell based on a detailed study of its optical absorption mechanism. Under normal incidence, a standing NW behaves as a dielectric resonator antenna, and its optical cross-section is maximized when the lowest hybrid mode (HE11δ) is excited along with the presence of a back-reflector. The promotion of the cell efficiency beyond the planar limits is attributed to two effects: the built-in concentration caused by the enlarged optical cross-section, and the shifting of the absorption front resulting from the excited mode profile. By choosing an optimal NW radius to support the HE11δ mode within the main absorption spectrum, we demonstrate a relative conversion-efficiency enhancement of 33% above the planar-cell limit for exemplary a-Si solar cells. This work provides a new basis for designing and analyzing standing-NW-based solar cells.

  12. Physiology-based modelling approaches to characterize fish habitat suitability: Their usefulness and limitations

    Science.gov (United States)

    Teal, Lorna R.; Marras, Stefano; Peck, Myron A.; Domenici, Paolo

    2018-02-01

    empirical relationships which can be found consistently within physiological data across the animal kingdom. The advantages of the DEB models are that they make use of the generalities found in terms of animal physiology and can therefore be applied to species for which little data or empirical observations are available. In addition, the limitations as well as useful potential refinements of these and other physiology-based modelling approaches are discussed. Inclusion of the physiological response of various life stages and modelling the patterns of extreme events observed in nature are suggested for future work.

  13. M(atrix) theory: matrix quantum mechanics as a fundamental theory

    International Nuclear Information System (INIS)

    Taylor, Washington

    2001-01-01

    This article reviews the matrix model of M theory. M theory is an 11-dimensional quantum theory of gravity that is believed to underlie all superstring theories. M theory is currently the most plausible candidate for a theory of fundamental physics which reconciles gravity and quantum field theory in a realistic fashion. Evidence for M theory is still only circumstantial -- no complete background-independent formulation of the theory exists as yet. Matrix theory was first developed as a regularized theory of a supersymmetric quantum membrane. More recently, it has appeared in a different guise as the discrete light-cone quantization of M theory in flat space. These two approaches to matrix theory are described in detail and compared. It is shown that matrix theory is a well-defined quantum theory that reduces to a supersymmetric theory of gravity at low energies. Although its fundamental degrees of freedom are essentially pointlike, higher-dimensional fluctuating objects (branes) arise through the non-Abelian structure of the matrix degrees of freedom. The problem of formulating matrix theory in a general space-time background is discussed, and the connections between matrix theory and other related models are reviewed

  14. Brownian Motion as a Limit to Physical Measuring Processes

    DEFF Research Database (Denmark)

    Niss, Martin

    2016-01-01

    In this paper, we examine the history of the idea that noise presents a fundamental limit to physical measuring processes. This idea had its origins in research aimed at improving the accuracy of instruments for electrical measurements. Out of these endeavors, the Swedish physicist Gustaf A. Ising...

  15. Evaluating the fundamental qualities of a nuclear medicine radiographer for the provision of an optimal clinical service

    International Nuclear Information System (INIS)

    Griffiths, Marc; King, Simon; Stewart, Rob; Dawson, Gary

    2010-01-01

    The developing nature of nuclear medicine practice highlights the need for an evaluation of the fundamental qualities of a Radiographer working within this discipline. Existing guidelines appear to be in place for clinical technologists working within nuclear medicine. However, limited guidance has been provided for Radiographers practicing within this discipline. This article aims to discuss the fundamental qualities that are considered essential for optimal service delivery, following consultation with various stakeholders. Areas such as technical expertise and knowledge, appropriate use of imaging equipment and current models of safe working practice will be discussed. Patient care and ethical considerations will also be evaluated, along with some core recommendations for future advanced practice.

  16. Technological Fundamentalism? The Use of Unmanned Aerial Vehicles in the Conduct of War

    OpenAIRE

    Futrell, Doris J.

    2004-01-01

    There is an ongoing battle in the Department of Defense between reason and faith in technology. Those subscribing to technological fundamentalism are blind to the empirical evidence that their faith in technology obscures the technological limitations that are evident. The desire for information dominance, reaching a state of total transparency of the opponent in order to win the war, is untenable. The reasoning voiced by skeptics should be heeded, but the technological fundamentalis...

  17. Processes at superhigh energies and hypothesis on fundamental length

    International Nuclear Information System (INIS)

    Mateev, M.D.

    1977-01-01

    The possibility of introducing a fundamental length (FL) into the apparatus of relativistic quantum field theory (QFT) without contradiction is considered. The approach connected with a change in the space-time geometry is given in detail. The most adequate apparatus for describing phenomena in high-energy physics is considered to be QFT in momentum space. The analysis of the basic quantities of the theory is carried out in terms of the momentum representation. Consideration of free particles, the Feynman propagator of free particles and its properties, the uncertainty relation and the Planck formula shows that a quite new physics of processes at superhigh energies appears

  18. Structure functions at small xBj in a Euclidean field theory approach

    International Nuclear Information System (INIS)

    Hebecker, A.; Meggiolaro, E.; Nachtmann, O.

    2000-01-01

    The small-x_Bj limit of deep inelastic scattering is related to the high-energy limit of the forward Compton amplitude in a familiar way. We show that the analytic continuation of this amplitude in the energy variable is calculable from a matrix element in Euclidean field theory. This matrix element can be written as a Euclidean functional integral in an effective field theory. Its effective Lagrangian has a simple expression in terms of the original Lagrangian. The functional integral expression obtained can, at least in principle, be evaluated using genuinely non-perturbative methods, e.g., on the lattice. Thus, a fundamentally new approach to the long-standing problem of structure functions at very small x_Bj seems possible. We give arguments that the limit x_Bj → 0 corresponds to a critical point of the effective field theory where the correlation length becomes infinite in one direction

  19. Fundamental physics issues of multilevel logic in developing a parallel processor.

    Science.gov (United States)

    Bandyopadhyay, Anirban; Miki, Kazushi

    2007-06-01

    In the last century, On and Off physical switches were equated with the two decisions 0 and 1 to express all information in terms of binary digits and to realize it physically in terms of switches connected in a circuit. Apart from increasing memory density significantly, having more possible choices in a given space makes pattern-logic a reality, and manipulation of the pattern would allow controlling logic, generating a new kind of processor. Von Neumann's computer is based on sequential logic, processing bits one by one. But as pattern-logic is generated on a surface, viewing the whole pattern at a time is truly parallel processing. Following von Neumann's and Shannon's fundamental thermodynamic approaches, we have built a compatible model based on a series of single-molecule-based multibit logic systems of 4-12 bits in a UHV-STM. Multilevel communication and pattern formation is experimentally verified on their monolayer. Furthermore, the developed intelligent monolayer is trained by an Artificial Neural Network. Therefore, fundamental weak interactions for the building of a truly parallel processor are explored here physically and theoretically.

  20. Current limiting experiment with 600 V/100A rectification type superconducting fault current limiter; 600 V-100A kyu seiryugata chodendo genryuki no genryu shiken

    Energy Technology Data Exchange (ETDEWEB)

    Matsuzaki, J.; Tsurunaga, K.; Urata, M. [Toshiba Corp., Tokyo (Japan)]; Okuma, T.; Sato, Y.; Iwata, Y. [Tokyo Electric Power Co., Inc., Tokyo (Japan)]

    1999-06-07

    A rectification-type current limiter with a new current-limiting scheme, which combines rectifier circuits with a direct-current reactor, has been proposed, and a current-limiting test with a normal-conducting reactor succeeded in a 6.6 kV class model vessel. Because direct current always flows in the reactor of this current limiter, using superconducting wire makes the conductor loss essentially zero, so low-loss operation becomes possible. This report describes the current-limiting characteristics of a 600 V/100 A rectification-type superconducting current limiter using a metallic superconducting conductor. (NEDO)

  1. Fundamentals of electrochemical science

    CERN Document Server

    Oldham, Keith

    1993-01-01

    Key Features: deals comprehensively with the basic science of electrochemistry; treats electrochemistry as a discipline in its own right and not as a branch of physical or analytical chemistry; provides a thorough and quantitative description of electrochemical fundamentals

  2. Fundamentals of ion exchange

    International Nuclear Information System (INIS)

    Townsend, R.P.

    1993-01-01

    In this paper the fundamentals of ion exchange mechanisms and their thermodynamics are described. A range of ion exchange materials is considered and problems of communication and technology transfer between scientists working in the field are discussed. (UK)

  3. Fundamental Concepts in Biophysics Volume 1

    CERN Document Server

    Jue, Thomas

    2009-01-01

    HANDBOOK OF MODERN BIOPHYSICS Series Editor Thomas Jue, PhD Handbook of Modern Biophysics brings current biophysics topics into focus, so that biology, medical, engineering, mathematics, and physical-science students or researchers can learn fundamental concepts and the application of new techniques in addressing biomedical challenges. Chapters explicate the conceptual framework of the physics formalism and illustrate the biomedical applications. With the addition of problem sets, guides to further study, and references, the interested reader can continue to explore independently the ideas presented. Volume I: Fundamental Concepts in Biophysics Editor Thomas Jue, PhD In Fundamental Concepts in Biophysics, prominent professors have established a foundation for the study of biophysics related to the following topics: Mathematical Methods in Biophysics Quantum Mechanics Basic to Biophysical Methods Computational Modeling of Receptor–Ligand Binding and Cellular Signaling Processes Fluorescence Spectroscopy Elec...

  4. Safety analysis fundamentals

    International Nuclear Information System (INIS)

    Wright, A.C.D.

    2002-01-01

    This paper discusses safety analysis fundamentals in reactor design. The study includes safety analysis done to show that the consequences of postulated accidents are acceptable. Safety analysis is also used to set the design of special safety systems, and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor and maintaining an operating license, and it supports changes in plant operations

  5. Fundamentals of semiconductors physics and materials properties

    CERN Document Server

    Yu, Peter Y

    2010-01-01

    This fourth edition of the well-established Fundamentals of Semiconductors serves to fill the gap between a general solid-state physics textbook and research articles by providing detailed explanations of the electronic, vibrational, transport, and optical properties of semiconductors. The approach is physical and intuitive rather than formal and pedantic. Theories are presented to explain experimental results. This textbook has been written with both students and researchers in mind. Its emphasis is on understanding the physical properties of Si and similar tetrahedrally coordinated semiconductors. The explanations are based on physical insights. Each chapter is enriched by an extensive collection of tables of material parameters, figures, and problems. Many of these problems "lead the student by the hand" to arrive at the results. The major changes made in the fourth edition include: an extensive appendix about the important and by now well-established deep center known as the DX center, additional problems...

  6. Fast LCMV-based Methods for Fundamental Frequency Estimation

    DEFF Research Database (Denmark)

    Jensen, Jesper Rindom; Glentis, George-Othon; Christensen, Mads Græsbøll

    2013-01-01

peaks and require matrix inversions for each point in the search grid. In this paper, we therefore consider fast implementations of LCMV-based fundamental frequency estimators, exploiting the estimators' inherently low displacement rank of the used Toeplitz-like data covariance matrices, using...... as such either the classic time domain averaging covariance matrix estimator, or, if aiming for an increased spectral resolution, the covariance matrix resulting from the application of the recent iterative adaptive approach (IAA). The proposed exact implementations reduce the required computational complexity...... with several orders of magnitude, but, as we show, further computational savings can be obtained by the adoption of an approximative IAA-based data covariance matrix estimator, reminiscent of the recently proposed Quasi-Newton IAA technique. Furthermore, it is shown how the considered pitch estimators can......

  7. A refractory metal gate approach for micronic CMOS technology

    International Nuclear Information System (INIS)

    Lubowiecki, V.; Ledys, J.L.; Plossu, C.; Balland, B.

    1987-01-01

In the future, the scaling down of devices, integration density and performance improvements are going to bring a number of conventional circuit design and process techniques to their fundamental limits. To avoid severe limitations in MOS ULSI (Ultra Large Scale Integration) technologies, new interconnection materials and schemes are required to emerge in order to face the megabit memory field. Among those, the gate approach will obviously take a key role when the operating speed of ULSI chips reaches the practical upper limits imposed by the parasitic resistances and capacitances which stem from the circuit interconnect wiring. Even if fairly suitable for MOS processes, doped polycrystalline silicon is gradually being replaced by refractory metal silicide or polycide structures, which match better with low-resistivity requirements. However, as we approach submicron ICs, higher-conductivity materials will be paid more and more attention. Recently, works have been devoted and published on refractory metal gate technologies. Molybdenum or tungsten, deposited either by CVD or PVD methods, is currently reported, even if some drawbacks in their process integration still remain. This paper presents such an approach based on tungsten (more reliable than molybdenum), deposited by LPCVD (giving more conductive and more stable films than PVD). The deposition process is first described. Then the CMOS process flow allows us to focus on specific refractory metal gate issues. Finally, electrical and physical properties are assessed, which demonstrates the feasibility of such a technology as well as the compatibility of tungsten with most of the usual techniques

  8. A Survey on Trust-Based Web Service Provision Approaches

    DEFF Research Database (Denmark)

    Dragoni, Nicola

    2010-01-01

The basic tenet of Service-Oriented Computing (SOC) is the possibility of building distributed applications on the Web by using Web Services as fundamental building blocks. The proliferation of such services is considered the second wave of evolution in the Internet age, moving the Web from...... a collection of pages to a collection of services. Consensus is growing that this Web Service “revolution” won't eventuate until we resolve trust-related issues. Indeed, the intrinsic openness of the SOC vision makes it crucial to locate useful services and recognize them as trustworthy. In this paper we review...... the field of trust-based Web Service selection, providing a structured classification of current approaches and highlighting the main limitations of each class and of the overall field. As a result, we claim that a soft notion of trust lies behind such weaknesses and we advocate the need of a new approach...

  9. How to increase treatment effectiveness and efficiency in psychiatry: creative psychopharmacotherapy - part 1: definition, fundamental principles and higher effectiveness polypharmacy.

    Science.gov (United States)

    Jakovljević, Miro

    2013-09-01

Psychopharmacotherapy is a fascinating field that can be understood in many different ways. It is both a science and an art of communication with a heavily subjective dimension. The advent of a significant number of effective and well-tolerated mental health medicines during and after the 1990s, the "decade of the brain", has increased our possibilities to treat major mental disorders more successfully, with much better treatment outcomes including full recovery. However, there is a huge gap between our possibilities for achieving high treatment effectiveness and the unsatisfying results of day-to-day clinical practice. A creative approach to psychopharmacotherapy could advance everyday clinical practice and bridge this gap. Creative psychopharmacotherapy is a concept that incorporates creativity as its fundamental tool. Creativity involves the intention and ability to transcend limiting traditional ideas, rules, patterns and relationships and to create meaningful new ideas, interpretations, contexts and methods in clinical psychopharmacology.

  10. Factors influencing the delivery of the fundamentals of care: Perceptions of nurses, nursing leaders and healthcare consumers.

    Science.gov (United States)

    Conroy, Tiffany

    2017-11-17

    To explore the factors described by nurses and consumer representatives influencing the delivery of the fundamentals of care. An ongoing challenge facing nursing is ensuring the "basics" or fundamentals of care are delivered optimally. The way nurses and patients perceive the delivery of the fundamentals of care had not been explored. Once identified, the factors that promote the delivery of the fundamentals of care may be facilitated. Inductive content analysis of scenario based focus groups. A qualitative approach was taken using three stages, including direct observation, focus groups and interviews. This paper reports the second stage. Focus groups discussed four patient care scenarios derived from the observational data. Focus groups were conducted separately for registered nurses, nurses in leadership roles and consumer representatives. Content analysis was used. The analysis of the focus group data resulted in three themes: Organisational factors; Individual nurse or patient factors; and Interpersonal factors. Organisational factors include nursing leadership, the context of care delivery and the availability of time. Individual nurse and patient factors include the specific care needs of the patient and the individual nurse and patient characteristics. Interpersonal factors include the nurse-patient relationship; involving the patient in their care, ensuring understanding and respecting choices; communication; and setting care priorities. Seeking the perspective of the people involved in delivering and receiving the fundamentals of care showed a shared understanding of the factors influencing the delivery of the fundamentals of care. The influence of nursing leadership and the quality of the nurse-patient relationship were perceived as important factors. Nurses and consumers share a common perspective of the factors influencing the delivery of the fundamentals of care and both value a therapeutic nurse-patient relationship. Clinical nursing leaders must

  11. Fundamentals of Fire Phenomena

    DEFF Research Database (Denmark)

    Quintiere, James

    analyses. Fire phenomena encompass everything about the scientific principles behind fire behaviour. Combining the principles of chemistry, physics, heat and mass transfer, and fluid dynamics necessary to understand the fundamentals of fire phenomena, this book integrates the subject into a clear...

  12. Industrial separation processes : fundamentals

    NARCIS (Netherlands)

    Haan, de A.B.; Bosch, Hans

    2013-01-01

    Separation processes on an industrial scale comprise well over half of the capital and operating costs. They are basic knowledge in every chemical engineering and process engineering study. This book provides comprehensive and fundamental knowledge of university teaching in this discipline,

  13. Fundamentals of plasma physics

    CERN Document Server

    Bittencourt, J A

    1986-01-01

    A general introduction designed to present a comprehensive, logical and unified treatment of the fundamentals of plasma physics based on statistical kinetic theory. Its clarity and completeness make it suitable for self-learning and self-paced courses. Problems are included.

  14. Fundaments for creation of national radiation protection standard for nuclear gauges

    International Nuclear Information System (INIS)

    Ferreira, Luiz Cavalcante

    2016-01-01

The present work aims to provide fundaments for the creation of a national standard for the practice, safety and responsible use of nuclear gauges in accordance with existing national and international recommendations. The work deals with protection against ionizing radiation and outlines a proposal for a standard that sets out, in its articles and paragraphs, the basic principles of safety and security, as well as other relevant points such as the responsibilities of those involved in the acquisition and operation of nuclear gauges, storage, maintenance, testing and emergency situations. The result is to provide a means to limit the dose to operators and members of the public and to maintain these doses within the limits recommended by CNEN, reducing exposure to ionizing radiation and giving greater control over the operation of the equipment. (author)

  15. Shotgun approaches to gait analysis : insights & limitations

    NARCIS (Netherlands)

    Kaptein, Ronald G.; Wezenberg, Daphne; IJmker, Trienke; Houdijk, Han; Beek, Peter J.; Lamoth, Claudine J. C.; Daffertshofer, Andreas

    2014-01-01

    Background: Identifying features for gait classification is a formidable problem. The number of candidate measures is legion. This calls for proper, objective criteria when ranking their relevance. Methods: Following a shotgun approach we determined a plenitude of kinematic and physiological gait

  16. Fundamentals - longitudinal motion

    International Nuclear Information System (INIS)

    Weng, W.T.

    1989-01-01

    There are many ways to accelerate charged particles to high energy for physics research. Each has served its purpose but eventually has encountered fundamental limitations of one kind or another. Looking at the famous Livingston curve, the initial birth and final level-off of all types of accelerators is seen. In fact, in the mid-80s we personally witnessed the creation of a new type of collider - the Stanford Linear Collider. Also witnessed, was the resurgence of study into novel methods of acceleration. This paper will cover acceleration and longitudinal motion in a synchrotron. A synchrotron is a circular accelerator with the following three characteristics: (1) Magnetic guiding (dipole) and confinement (quadrupole) components are placed in a small neighborhood around the equilibrium orbit. (2) Particles are kept in resonance with the radio-frequency electric field indefinitely to achieve acceleration to higher energies. (3) Magnetic fields are varied adiabatically with the energy of the particle. D. Edwards described the transverse oscillations of particles in a synchrotron. Here the author talks about the longitudinal oscillations of particles. The phase stability principle was invented by V. Veksler and E. McMillan independently in 1945. The phase stability and strong focusing principle, invented by Courant and Livingston in 1952, enabled the steady energy gain of accelerators and storage rings witnessed during the past 30 years. This paper is a unified overview of the related rf subjects in an accelerator and a close coupling between accelerator physics and engineering practices, which is essential for the major progress in areas such as high intensity synchrotrons, a multistage accelerator complex, and anti-proton production and cooling, made possible in the past 20 years

  17. Investigation of density limit processes in DIII-D

    International Nuclear Information System (INIS)

    Maingi, R.; Mahdavi, M.A.; Petrie, T.W.

    1999-02-01

    A series of experiments has been conducted in DIII-D to investigate density-limiting processes. The authors have studied divertor detachment and MARFEs on closed field lines and find semi-quantitative agreement with theoretical calculations of onset conditions. They have shown that the critical density for MARFE onset at low edge temperature scales as I p /a 2 , i.e. similar to Greenwald scaling. They have also shown that the scaling of the critical separatrix density with heating power at partial detachment onset agrees with Borass' model. Both of these processes yield high edge density limits for reactors such as ITER. By using divertor pumping and pellet fueling they have avoided these and other processes and accessed densities > 1.5x Greenwald limit scaling with H-mode confinement, demonstrating that the Greenwald limit is not a fundamental limit on the core density
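The Greenwald scaling invoked in this abstract (critical density ∝ I_p/a²) can be made concrete with a short sketch. The standard form is n_G = I_p / (π a²), with plasma current I_p in MA, minor radius a in m, and n_G in units of 10^20 m^-3; the parameter values below are illustrative placeholders, not taken from the paper.

```python
import math

def greenwald_limit(i_p_ma: float, a_m: float) -> float:
    """Greenwald density limit n_G = I_p / (pi * a^2).

    i_p_ma: plasma current in MA; a_m: minor radius in m.
    Returns n_G in units of 10^20 m^-3.
    """
    return i_p_ma / (math.pi * a_m ** 2)

# Illustrative (hypothetical) DIII-D-like parameters:
n_g = greenwald_limit(i_p_ma=1.5, a_m=0.6)  # about 1.33 x 10^20 m^-3
```

The abstract's point is that operating above this value (here, >1.5x n_G with pellet fueling and divertor pumping) shows the scaling is empirical, not a hard limit on core density.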

  18. Investigation of density limit processes in DIII-D

    International Nuclear Information System (INIS)

    Maingi, R.; Baylor, L.R.; Jernigan, T.

    2001-01-01

    A series of experiments has been conducted in DIII-D to investigate density-limiting processes. We have studied divertor detachment and MARFEs on closed field lines and find semi-quantitative agreement with theoretical calculations of onset conditions. We have shown that the critical density for MARFE onset at low edge temperature scales as I p /a 2 , i.e. similar to Greenwald scaling. We have also shown that the scaling of the critical separatrix density with heating power at partial detachment onset agrees with Borass' model. Both of these processes yield high edge density limits for reactors such as ITER. By using divertor pumping and pellet fueling we have avoided these and other processes and accessed densities >1.5x Greenwald limit scaling with H-mode confinement, demonstrating that the Greenwald limit is not a fundamental limit on the core density. (author)

  19. Investigation of density limit processes in DIII-D

    International Nuclear Information System (INIS)

    Maingi, R.; Mahdavi, M.A.; Petrie, T.W.

    1999-01-01

    A series of experiments has been conducted in DIII-D to investigate density-limiting processes. We have studied divertor detachment and MARFEs on closed field lines and find semi-quantitative agreement with theoretical calculations of onset conditions. We have shown that the critical density for MARFE onset at low edge temperature scales as I p /a 2 , i.e. similar to Greenwald scaling. We have also shown that the scaling of the critical separatrix density with heating power at partial detachment onset agrees with Borass' model. Both of these processes yield high edge density limits for reactors such as ITER. By using divertor pumping and pellet fueling we have avoided these and other processes and accessed densities > 1.5x Greenwald limit scaling with H-mode confinement, demonstrating that the Greenwald limit is not a fundamental limit on the core density. (author)

  20. Benefits and limitations of a multidisciplinary approach to individualized management of Cornelia de Lange syndrome and related diagnoses.

    Science.gov (United States)

    January, Kathleen; Conway, Laura J; Deardorff, Matthew; Harrington, Ann; Krantz, Ian D; Loomes, Kathleen; Pipan, Mary; Noon, Sarah E

    2016-06-01

Given the clinical complexities of Cornelia de Lange Syndrome (CdLS), the Center for CdLS and Related Diagnoses at The Children's Hospital of Philadelphia (CHOP) and The Multidisciplinary Clinic for Adolescents and Adults at Greater Baltimore Medical Center (GBMC) were established to develop a comprehensive approach to clinical management and research issues relevant to CdLS. Little work has been done to evaluate the general utility of a multispecialty approach to patient care. Previous research demonstrates several advantages and disadvantages of multispecialty care. This research aims to better understand the benefits and limitations of a multidisciplinary clinic setting for individuals with CdLS and related diagnoses. Parents of children with CdLS and related diagnoses who have visited a multidisciplinary clinic (N = 52) and who have not visited a multidisciplinary clinic (N = 69) were surveyed to investigate their attitudes. About 90.0% of multispecialty clinic attendees indicated a preference for multidisciplinary care. However, some respondents cited a need for additional clinic services including more opportunity to meet with other specialists (N = 20), such as behavioral health, and increased information about research studies (N = 15). Travel distance and expenses often prevented families' multidisciplinary clinic attendance (N = 41 and N = 35, respectively). Despite identified limitations, these findings contribute to the evidence demonstrating the utility of a multispecialty approach to patient care. This approach ultimately has the potential to not just improve healthcare for individuals with CdLS but for those with medically complex diagnoses in general. © 2016 Wiley Periodicals, Inc.

  1. Fundamental Scientific Problems in Magnetic Recording

    Energy Technology Data Exchange (ETDEWEB)

    Schulthess, T.C.; Miller, M.K.

    2007-06-27

Magnetic data storage technology is presently leading the high-tech industry in advancing device integration, doubling the storage density every 12 months. To continue these advancements and to achieve terabit-per-square-inch recording densities, new approaches to store and access data will be needed in about 3-5 years. In this project, a collaboration between Oak Ridge National Laboratory (ORNL), the Center for Materials for Information Technology (MINT) at the University of Alabama (UA), Imago Scientific Instruments, and Seagate Technologies was undertaken to address the fundamental scientific problems confronted by the industry in meeting the upcoming challenges. The focus of this study was to: (1) develop atom probe tomography for atomic-scale imaging of the magnetic heterostructures used in magnetic data storage technology; (2) develop first-principles-based tools for the study of exchange bias, aimed at finding new antiferromagnetic materials to reduce the thickness of the pinning layer in the read head; (3) develop high-moment magnetic materials and tools to study magnetic switching in nanostructures, aimed at developing improved writers for high-anisotropy magnetic storage media.

  2. Grenoble Fundamental Research Department

    International Nuclear Information System (INIS)

    1979-01-01

A summary of the various activities of the Fundamental Research Institute, Grenoble, France is given. The following fields are covered: nuclear physics, solid state physics, physical chemistry, biology and advanced techniques. For more detailed descriptions readers are referred to the scientific literature [fr

  3. Fundamentals and Optimal Institutions

    DEFF Research Database (Denmark)

    Gonzalez-Eiras, Martin; Harmon, Nikolaj Arpe; Rossi, Martín

    2016-01-01

    of regulatory institutions such as revenue sharing, salary caps or luxury taxes. We show, theoretically and empirically, that these large differences in adopted institutions can be rationalized as optimal responses to differences in the fundamental characteristics of the sports being played. This provides...

  4. PREFACE: Fundamental Constants in Physics and Metrology

    Science.gov (United States)

    Klose, Volkmar; Kramer, Bernhard

    1986-01-01

This volume contains the papers presented at the 70th PTB Seminar, the second on the subject "Fundamental Constants in Physics and Metrology", which was held at the Physikalisch-Technische Bundesanstalt in Braunschweig from October 21 to 22, 1985. About 100 participants from the universities and various research institutes of the Federal Republic of Germany participated in the meeting. Besides a number of review lectures on various broader subjects there was a poster session which contained a variety of topical contributed papers ranging from the theory of the quantum Hall effect to reports on the status of the metrological experiments at the PTB. In addition, the participants were also offered the possibility to visit the PTB laboratories during the course of the seminar. During the preparation of the meeting we noticed that even most of the general subjects which were going to be discussed in the lectures are of great importance in connection with metrological experiments and should be made accessible to the scientific community. This eventually resulted in the idea of the publication of the papers in a regular journal. We are grateful to the editor of Metrologia for providing this opportunity. We have included quite a number of papers from basic physical research. For example, certain aspects of high-energy physics and quantum optics, as well as the many-faceted role of Sommerfeld's fine-structure constant, are covered. We think that questions such as "What are the intrinsic fundamental parameters of nature?" or "What are we doing when we perform an experiment?" can shed new light on the art of metrology, and do, potentially, lead to new ideas. This appears to be especially necessary when we notice the increasing importance of the role of the fundamental constants and macroscopic quantum effects for the definition and the realization of the physical units.
In some cases we have reached a point where the limitations of our knowledge of a fundamental constant and

  5. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, A.; Murthy, S.

    2007-06-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS 3 x S d-1 x T 8-d ( d = 3, . . . , 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings. (author)

  6. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, Atish; Murthy, Sameer

    2008-01-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS 3 x S d-1 x T 8-d (d = 3, ..., 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings

  7. ORGANIZATIONAL WORK GROUPS AND WORK TEAMS – APPROACHES AND DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Raluca ZOLTAN

    2015-02-01

Full Text Available Work groups and work teams represent basic structures of traditional and modern organizations, and over time they have been intensively researched. However, managers do not always consider the fundamental differences between groups and teams, which leads to unrealistic goals and results below expectations. Thus, in the present paper we propose a review of the main research approaches to groups and teams (the psychosocial, socio-technical and behavioral approaches); in the third part of the paper the fundamental differences between groups and teams are detailed in the light of these approaches.

  8. Fundamental theories of waves and particles formulated without classical mass

    Science.gov (United States)

    Fry, J. L.; Musielak, Z. E.

    2010-12-01

    Quantum and classical mechanics are two conceptually and mathematically different theories of physics, and yet they do use the same concept of classical mass that was originally introduced by Newton in his formulation of the laws of dynamics. In this paper, physical consequences of using the classical mass by both theories are explored, and a novel approach that allows formulating fundamental (Galilean invariant) theories of waves and particles without formally introducing the classical mass is presented. In this new formulation, the theories depend only on one common parameter called 'wave mass', which is deduced from experiments for selected elementary particles and for the classical mass of one kilogram. It is shown that quantum theory with the wave mass is independent of the Planck constant and that higher accuracy of performing calculations can be attained by such theory. Natural units in connection with the presented approach are also discussed and justification beyond dimensional analysis is given for the particular choice of such units.

  9. Portfolio optimization using fundamental indicators based on multi-objective EA

    CERN Document Server

    Silva, Antonio Daniel; Horta, Nuno

    2016-01-01

This work presents a new approach to portfolio composition in the stock market. It combines a fundamental approach, using financial ratios and technical indicators, with Multi-Objective Evolutionary Algorithms to choose the portfolio composition with two objectives: return and risk. Two different chromosomes are used for representing different investment models, with real constraints equivalent to the ones faced by managers of mutual funds, hedge funds, and pension funds. To validate the present solution, two case studies are presented for the S&P 500 for the period June 2010 until the end of 2012. The simulations demonstrate that stock selection based on financial ratios is a combination that can be used to choose the best companies in operational terms, obtaining returns above the market average with low variances in their returns. In this case the optimizer found stocks with high return on investment in conjunction with a high rate of growth of the net income and a high profit margin. To obtain s...
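The two-objective selection described in this abstract (maximize return, minimize risk) rests on Pareto dominance, the core comparison inside any multi-objective EA. The sketch below uses hypothetical portfolio values, not the book's data.

```python
# Pareto-dominance test for portfolios represented as (return, risk) pairs:
# p dominates q if p is no worse in both objectives and strictly better in one.

def dominates(p, q):
    """True if portfolio p = (ret, risk) Pareto-dominates q."""
    ret_p, risk_p = p
    ret_q, risk_q = q
    no_worse = ret_p >= ret_q and risk_p <= risk_q
    strictly_better = ret_p > ret_q or risk_p < risk_q
    return no_worse and strictly_better

def pareto_front(portfolios):
    """Keep only the non-dominated portfolios (candidates for the investor)."""
    return [p for p in portfolios
            if not any(dominates(q, p) for q in portfolios if q != p)]

# Hypothetical candidate portfolios (annual return, annualized risk):
candidates = [(0.08, 0.12), (0.10, 0.15), (0.07, 0.15), (0.10, 0.10)]
front = pareto_front(candidates)  # only (0.10, 0.10) survives here
```

An EA such as NSGA-II repeatedly applies this test to rank a population of chromosome-encoded portfolios; the survivors form the risk-return trade-off curve offered to the investor.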

  10. Proceedings of ISEC 2008, International Solvent Extraction Conference - Solvent Extraction: Fundamentals to Industrial Applications

    International Nuclear Information System (INIS)

    Moyer, Bruce A.

    2008-01-01

    The North American industry has employed major solvent-extraction processes to support a wide range of separations including but not limited to chemical, metallurgical, nuclear, biochemical, pharmaceutical, and petroleum applications. The knowledge enabling these separations has been obtained through fundamental studies in academe, government and industry. The International Solvent Extraction Conferences have been and continue to be a major gathering of scientists, engineers, operators, and vendors from around the world, who present new findings since the last meeting, exchange ideas, make business contacts, and conduct collegial discussions. The ISEC 2008 program emphasizes fundamentals to industrial applications of solvent extraction, particularly how this broad spectrum of activities is interconnected and has led to the implementation of novel processes. The oral and poster sessions have been organized into seven topics: Fundamentals; Novel Reagents, Materials and Techniques; Nuclear Fuel Reprocessing; Hydrometallurgy and Metals Extraction; Analytical and Preparative Applications; Biotechnology, Pharmaceuticals, Life-Science Products, and Organic Products; and Process Chemistry and Engineering. Over 350 abstracts were received, resulting in more than 260 manuscripts published in these proceedings. Five outstanding plenary presentations have been identified, with five parallel sessions for oral presentations and posters. In recognition of the major role solvent extraction (SX) plays in the hydrometallurgical and nuclear industries, these proceedings begin with sections focusing on hydrometallurgy, process chemistry, and engineering. More fundamental topics follow, including sections on novel reagents, materials, and techniques, featuring novel applications in analytical and biotechnology areas. Despite the diversity of topics and ideas represented, however, the primary focus of the ISEC community continues to be metals extraction. Four papers from these

  11. Can we push the fundamental Planck scale above $10^{19}$ GeV?

    CERN Document Server

    Stojkovic, Dejan

    2014-01-01

The value of the quantum gravity scale is $M_{Pl} = 10^{19}$ GeV. However, this is inherently a three-dimensional quantity. We know that we can bring this scale all the way down to TeV if we introduce extra dimensions with large volume. This will solve the hierarchy problem by destroying the desert between the electroweak and gravity scales, but will also introduce a host of new problems since some things (e.g. proton stability, neutrino masses etc) have their natural habitat in this desert. In contrast, we can also solve the hierarchy problem by reducing the number of dimensions at high energies. If the fundamental theory (which does not have to be gravity as we understand it today) is lower dimensional, then the fundamental energy scale might be much greater than $10^{19}$ GeV. Then, some experimental and observational limits (e.g. on Lorentz invariance violation) which are coming close to or even exceeding the scale of $10^{19}$ GeV can be evaded. In addition, scattering of particles at transplanckian energies will not p...

  12. Variation of nonlinearity parameter at low fundamental amplitudes

    Science.gov (United States)

    Barnard, Daniel J.

    1999-04-01

    Recent harmonic generation measurements of the nonlinearity parameter β in polycrystalline Cu-Al alloys have shown a transition to lower values at low fundamental amplitude levels. Values for β at high (>10 Å) fundamental levels are in the range predicted by single-crystal second- and third-order elastic constants while lower fundamental levels (alloy by others. The source of the effect is unclear but initial results may require a reexamination of current methods for measurement of third-order elastic constants.

  13. A risk modelling approach for setting microbiological limits using enterococci as indicator for growth potential of Salmonella in pork.

    Science.gov (United States)

    Bollerslev, Anne Mette; Nauta, Maarten; Hansen, Tina Beck; Aabo, Søren

    2017-01-02

    Microbiological limits are widely used in food processing as an aid to reduce the exposure to hazardous microorganisms for the consumers. However, in pork, the prevalence and concentrations of Salmonella are generally low and microbiological limits are not considered an efficient tool to support hygiene interventions. The objective of the present study was to develop an approach which could make it possible to define potential risk-based microbiological limits for an indicator, enterococci, in order to evaluate the risk from potential growth of Salmonella. A positive correlation between the concentration of enterococci and the prevalence and concentration of Salmonella was shown for 6640 pork samples taken at Danish cutting plants and retail butchers. The samples were collected in five different studies in 2001, 2002, 2010, 2011 and 2013. The observations that both Salmonella and enterococci are carried in the intestinal tract, contaminate pork by the same mechanisms and share similar growth characteristics (lag phase and maximum specific growth rate) at temperatures around 5-10°C suggest a potential for enterococci to be used as an indicator of potential growth of Salmonella in pork. Elevated temperatures during processing will lead to growth of enterococci and, if present, also Salmonella. By combining the correlation between enterococci and Salmonella with risk modelling, it is possible to predict the risk of salmonellosis based on the level of enterococci. The risk model used for this purpose includes the dose-response relationship for Salmonella and a reduction factor to account for preparation of the fresh pork. By use of the risk model, it was estimated that the majority of salmonellosis cases caused by the consumption of pork in Denmark are caused by the small fraction of pork products that has enterococci concentrations above 5 log CFU/g. This illustrates that our approach can be used to evaluate the potential effect of different microbiological
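
    The prediction chain described in this abstract — an observed enterococci level, an assumed ratio to Salmonella, a log-reduction for preparation, and a dose-response step — can be sketched as follows. All parameter values here (the log-ratio, the reduction factor, the dose-response parameter r, and the serving size) are hypothetical placeholders, not the fitted values from the study:

```python
import math

def p_illness(dose_cfu, r=1e-5):
    """Exponential dose-response model: P(ill) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose_cfu)

def risk_given_enterococci(log_enterococci, serving_g=100.0,
                           log_ratio=-3.0, log_prep_reduction=2.0):
    """Per-serving salmonellosis risk inferred from an enterococci level
    (log CFU/g), via an assumed Salmonella/enterococci log-ratio and a
    log-reduction accounting for preparation of the fresh pork.
    All defaults are illustrative assumptions."""
    log_salmonella = log_enterococci + log_ratio - log_prep_reduction
    dose = (10.0 ** log_salmonella) * serving_g   # ingested CFU per serving
    return p_illness(dose)

for level in (3, 4, 5, 6):        # enterococci concentration, log CFU/g
    print(level, risk_given_enterococci(level))
```

    With these placeholder numbers the risk grows roughly tenfold per extra log of enterococci until the dose-response saturates, which is the qualitative behaviour that motivates a limit near 5 log CFU/g.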

  14. Fundamentals of Counting Statistics in Digital PCR: I Just Measured Two Target Copies-What Does It Mean?

    Science.gov (United States)

    Tzonev, Svilen

    2018-01-01

    Current commercially available digital PCR (dPCR) systems and assays are capable of detecting individual target molecules with considerable reliability. As tests are developed and validated for use on clinical samples, the need to understand and develop robust statistical analysis routines increases. This chapter covers the fundamental processes and limitations of detecting and reporting on single-molecule detection. We cover the basics of quantification of targets and sources of imprecision. We describe the basic test concepts: sensitivity, specificity, limit of blank, limit of detection, and limit of quantification in the context of dPCR. We provide basic guidelines on how to determine these, how to choose and interpret the operating point, and what factors may influence overall test performance in practice.
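
    The counting statistics involved are Poissonian: if k of n partitions are positive, the estimated mean number of copies per partition is λ = -ln(1 - k/n), and the imprecision follows from the binomial sampling of k. A minimal sketch (the 0.85 nL droplet volume is an illustrative assumption, not a property of any particular instrument):

```python
import math

def dpcr_quantify(n_total, n_positive, v_partition_ul=0.00085):
    """Poisson estimate of target concentration (copies/uL) from dPCR
    partition counts, with an approximate 95% confidence interval."""
    p = n_positive / n_total
    lam = -math.log(1.0 - p)                 # mean copies per partition
    # delta-method standard error of lambda: var(lam) ~ p / (n * (1 - p))
    se = math.sqrt(p / (n_total * (1.0 - p)))
    to_conc = lambda x: x / v_partition_ul
    return to_conc(lam), to_conc(lam - 1.96 * se), to_conc(lam + 1.96 * se)

conc, lo, hi = dpcr_quantify(n_total=20000, n_positive=2000)
print(f"{conc:.1f} copies/uL (95% CI {lo:.1f}-{hi:.1f})")
```

    With only two positive partitions among 20,000 (the situation in the chapter's title), λ ≈ 1×10⁻⁴ and the same formula gives an interval wider than the estimate itself, which is why single-digit counts demand careful interpretation.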

  15. A unified flight control methodology for a compound rotorcraft in fundamental and aerobatic maneuvering flight

    Science.gov (United States)

    Thorsen, Adam

    This study investigates a novel approach to flight control for a compound rotorcraft in a variety of maneuvers ranging from fundamental to aerobatic in nature. Fundamental maneuvers are a class of maneuvers with design significance that are useful for testing and tuning flight control systems along with uncovering control law deficiencies. Aerobatic maneuvers are a class of aggressive and complex maneuvers with more operational significance. The process culminating in a unified approach to flight control includes various control allocation studies for redundant controls in trim and maneuvering flight, an efficient methodology to simulate non-piloted maneuvers with varying degrees of complexity, and the setup of an unconventional control inceptor configuration along with the use of a flight simulator to gather pilot feedback in order to improve the unified control architecture. A flight path generation algorithm was developed to calculate control inceptor commands required for a rotorcraft in aerobatic maneuvers. This generalized algorithm was tailored to generate flight paths through optimization methods in order to satisfy target terminal position coordinates or to minimize the total time of a particular maneuver. Six aerobatic maneuvers were developed drawing inspiration from air combat maneuvers of fighter jet aircraft: Pitch-Back Turn (PBT), Combat Ascent Turn (CAT), Combat Descent Turn (CDT), Weaving Pull-up (WPU), Combat Break Turn (CBT), and Zoom and Boom (ZAB). These aerobatic maneuvers were simulated at moderate to high advance ratios while fundamental maneuvers of the compound including level accelerations/decelerations, climbs, descents, and turns were investigated across the entire flight envelope to evaluate controller performance. The unified control system was developed to allow controls to seamlessly transition between manual and automatic allocations while ensuring that the axis of control for a particular inceptor remained constant with flight

  16. Fundamental characteristics and simplified evaluation method of dynamic earth pressure

    International Nuclear Information System (INIS)

    Nukui, Y.; Inagaki, Y.; Ohmiya, Y.

    1989-01-01

    In Japan, a conventional method is commonly used to evaluate the dynamic earth pressure acting on the underground walls of a deeply embedded nuclear reactor building. However, since this method was developed on the basis of the limit state of soil supported by retaining walls, the behavior of dynamic earth pressure acting on the embedded part of a nuclear reactor building may differ from that estimated by this method. This paper examines the fundamental characteristics of dynamic earth pressure through dynamic soil-structure interaction analysis. A simplified method to evaluate dynamic earth pressure for the design of underground walls of a nuclear reactor building is described. Here, dynamic earth pressure refers to the fluctuating earth pressure during an earthquake.

  17. THE PHENOMENON OF (ISLAMIC) RELIGIOUS FUNDAMENTALISM IN A NON-‘RELIGIOUS’ CAMPUS: A CASE STUDY AT HASANUDDIN UNIVERSITY MAKASSAR

    Directory of Open Access Journals (Sweden)

    Taufani Taufani

    2014-06-01

    This research aims to describe and examine the phenomenon of Islamic fundamentalism on the campus of Hasanuddin University (UNHAS). Islamic fundamentalism is a phenomenon that emerged after the Reform era and is commonly encountered in the campus world. The trend shows that Islamic fundamentalism grows on campuses that have no particular religious affiliation and is often driven by the propagation of the Campus Dakwah Organization (LDK). This research tests whether that thesis is relevant to the context of Hasanuddin University, which is in fact not a religiously affiliated campus, or not. Data were collected through observation of the activities of LDK activists at the UNHAS Campus Dakwah Organization's Musholla Lovers (LDK-MPM), in-depth interviews, and documentation/review of previous research and papers. This research shows that Islamic fundamentalism led by LDK-MPM is growing at Hasanuddin University. The phenomenon emerged as an implication of post-Reform freedom: activists exploited these opportunities to channel their euphoria, because under the New Order their propagation had a fairly limited space. Another factor behind the rise of Islamic fundamentalism at Hasanuddin University is that the students did not have comprehensive Islamic references, so they had no checklist for critiquing and examining the ideology of Islamic fundamentalism. In addition, the emergence of modernity, seen as bringing negative excesses, serves as another cause: the ideology of Islamic fundamentalism emerged as an alternative to counteract those negative excesses. Keywords: Islamic Fundamentalism, LDK-MPM, Hasanuddin University.

  18. The observational approach in environmental restoration

    International Nuclear Information System (INIS)

    Smyth, J.D.; Quinn, R.D.

    1991-07-01

    The US Department of Energy (DOE) has committed to completing environmental restoration of its facilities within the next 28 years (DOE 1990b). In order to achieve this, DOE must ensure that its restoration activities are both effective and efficient. A key aspect of fulfilling this commitment is the recognition and management of the uncertainty that is inherent in waste-site clean-up actions. The DOE Office of Environmental Restoration (DOE-ER) requested Pacific Northwest Laboratory (PNL) to investigate the applicability and implementation of what is known as the "observational approach" to better address these needs. PNL's initial investigation resulted in the positive conclusion that the observational approach had potential benefit to DOE during environmental restoration. In a follow-on effort, PNL, supported by CH2M HILL, has been providing guidance to DOE field offices on observational approach fundamentals, implementation, and application to waste-site remediation. This paper outlines the fundamentals of the observational approach and discusses the progress in integrating the observational approach into DOE's environmental restoration efforts. 9 refs., 2 figs

  19. THE FUNDAMENTS OF EXPLANATORY CAUSES

    Directory of Open Access Journals (Sweden)

    Lavinia Mihaela VLĂDILĂ

    2015-07-01

    The new Criminal Code introduced into legal life the division of the causes removing the criminal feature of the offence into explanatory causes and non-attributable causes. This dichotomy is not without legal and factual fundaments and has been subject to doctrinal debate since the period when the Criminal Code of 1969 was still in force. From our perspective, one of the possible legal fundaments of the explanatory causes results from the fact that the act committed is based on the protection of a right at least equal to the one prejudiced by the act of aggression or rescue, by the legal obligation imposed, or by the victim's consent.

  20. O sofrimento psíquico na perspectiva da psicopatologia fundamental / Psychic suffering from the fundamental psychopathology perspective

    Directory of Open Access Journals (Sweden)

    Paulo Ceccarelli

    2005-12-01

    From the word PSYCHOPATHOLOGY, the author briefly shows how each historical context had its own way of decomposing psychic suffering into its basic elements in order to classify it, study it, and search for its cure. After a short discussion of psychopathology in contemporaneity, the author introduces the theoretical bases of Fundamental Psychopathology and its contributions to understanding psychic suffering. Although this text does not claim to take part in the current debate on the curricular guidelines that orient the training of psychologists, the author takes the study of the knowledge (logos) of the soul (psyche) — psychology — as an example of one of the fields of application of Fundamental Psychopathology.

  1. Social use of alcohol among adolescent offenders: a fundamental approach toward human needs

    Directory of Open Access Journals (Sweden)

    Gustavo D'Andrea

    2014-02-01

    This study examined some basic health care approaches toward human needs, with a particular focus on nursing. We aimed to incorporate these approaches into the discussion of the mental health of adolescent offenders who consume alcohol. We discuss the specific needs of the delinquent group, critique policies that prioritize coercion of adolescent offenders, and consider the role that nurses could play in the sphere of juvenile delinquency.

  2. Fundamental U-Theory of Time. Part 1

    Directory of Open Access Journals (Sweden)

    Yuvraj J. Gopaul

    2016-02-01

    The Fundamental U-Theory of Time (Part 1) is an original theory that aims to unravel the mystery of what exactly 'time' is. To date, very few explanations from the branches of physics or cosmology have succeeded in providing an accurate and comprehensive depiction of time. Most explanations have only managed to provide partial understanding or, at best, glimpses of its true nature. The U-Theory uses 'thought experiments' to uncover the determining characteristics of time. In Part 1 of this theory, the focus is not so much on the mathematics as on the accuracy of the depiction of time. Moreover, it challenges current views in theoretical physics, particularly on the idea of 'time travel'. Notably, it is a theory seeking to present a fresh approach for reviewing Einstein's Theory of Relativity, while unlocking new pathways for upcoming research in the fields of physics and cosmology.

  3. Fundamentals of colour awareness: a literature review

    Directory of Open Access Journals (Sweden)

    A. Rubin

    2005-12-01

    A description of some of the basic or fundamental aspects of the colour sensory mechanism will be provided here, based on modern ideas and literature, with reference specifically to the likely origins and evolution of colour vision. The molecular basis for colour awareness and the human colour pathway will also be considered in some detail. This paper intends to provide the theoretical and philosophical basis for further papers that will introduce a modern and original computer-based method for more comprehensive colour vision assessment. This new approach, to be fully described in later manuscripts, may contribute towards improvements in understanding and knowledge of human colour perception and its measurement, still perhaps a relatively under-explored or neglected field of study within optometry and ophthalmology.

  4. Kant and the Critique of the Ethics-First Approach to Politics

    DEFF Research Database (Denmark)

    Rostbøll, Christian F.

    2017-01-01

    of an "ethics-first approach to politics," in which political theory is a mere application of moral principles. But what does this ethics-first approach have to do with Kant himself? Very little. This article shows how Kant's approach to political theory at a fundamental level includes political institutions, power, and coercion as well as disagreement, security, and coordination problems. In contrast to realists, Kant has a fundamental principle, which can explain why and guide how we ought to approach the political question, namely the norm of equal freedom. Yet, Kant's theory does not take the form...

  5. Is G a conversion factor or a fundamental unit?

    OpenAIRE

    Fiorentini, G.; Okun, L.; Vysotsky, M.

    2001-01-01

    By using the fundamental units c, h, G as conversion factors one can easily transform the dimensions of all observables. In particular one can make them all "geometrical", or dimensionless. However, this has no impact on the fact that there are three fundamental units, G being one of them. Only experiment can tell us whether G is basically fundamental.

  6. Fundamental Metallurgy of Solidification

    DEFF Research Database (Denmark)

    Tiedje, Niels

    2004-01-01

    The text takes the reader through some fundamental aspects of solidification, with focus on understanding the basic physics that govern solidification in casting and welding. It is described how the first solid is formed and which factors affect nucleation. It is described how crystals grow from...

  7. Fast fundamental frequency estimation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm; Jensen, Jesper Rindom

    2017-01-01

    Modelling signals as being periodic is common in many applications. Such periodic signals can be represented by a weighted sum of sinusoids with frequencies being integer multiples of the fundamental frequency. Due to its widespread use, numerous methods have been proposed to estimate the fundamental frequency.
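
    As one concrete instance of such methods, a harmonic-summation estimator picks the candidate fundamental whose integer multiples capture the most spectral power. The grid, window, and harmonic count below are illustrative choices, not the specific fast method of this paper:

```python
import numpy as np

def estimate_f0(x, fs, f0_grid, n_harmonics=5):
    """Harmonic summation: score each candidate f0 by the spectral power
    found at its first n_harmonics integer multiples."""
    n = len(x)
    spec = np.abs(np.fft.rfft(x * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    def score(f0):
        bins = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, n_harmonics + 1)]
        return spec[bins].sum()
    return max(f0_grid, key=score)

fs = 8000
t = np.arange(4096) / fs
x = sum(np.sin(2 * np.pi * 220 * k * t) / k for k in range(1, 4))  # f0 = 220 Hz
grid = np.arange(100.0, 400.0, 1.0)
print(estimate_f0(x, fs, grid))  # estimate near 220 Hz
```

    A subharmonic candidate (e.g. 110 Hz) hits fewer of the true harmonics here, but in noisier settings harmonic-summation methods need explicit handling of such octave errors.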

  8. Entropy-limited hydrodynamics: a novel approach to relativistic hydrodynamics

    Science.gov (United States)

    Guercilena, Federico; Radice, David; Rezzolla, Luciano

    2017-07-01

    We present entropy-limited hydrodynamics (ELH): a new approach for the computation of numerical fluxes arising in the discretization of hyperbolic equations in conservation form. ELH is based on the hybridisation of an unfiltered high-order scheme with the first-order Lax-Friedrichs method. The activation of the low-order part of the scheme is driven by a measure of the locally generated entropy inspired by the artificial-viscosity method proposed by Guermond et al. (J. Comput. Phys. 230(11):4248-4267, 2011, doi: 10.1016/j.jcp.2010.11.043). Here, we present ELH in the context of high-order finite-differencing methods and of the equations of general-relativistic hydrodynamics. We study the performance of ELH in a series of classical astrophysical tests in general relativity involving isolated, rotating and nonrotating neutron stars, and including a case of gravitational collapse to black hole. We present a detailed comparison of ELH with the fifth-order monotonicity-preserving method MP5 (Suresh and Huynh in J. Comput. Phys. 136(1):83-99, 1997, doi: 10.1006/jcph.1997.5745), one of the most common high-order schemes currently employed in numerical-relativity simulations. We find that ELH achieves comparable and, in many of the cases studied here, better accuracy than more traditional methods at a fraction of the computational cost (up to ~50% speedup). Given its accuracy and its simplicity of implementation, ELH is a promising framework for the development of new special- and general-relativistic hydrodynamics codes well adapted for massively parallel supercomputers.
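
    The hybridisation idea can be caricatured in one dimension for Burgers' equation: compute a high-order (here, plain central) flux and a Lax-Friedrichs flux, then blend them with a weight driven by a local entropy-like indicator. This is a toy sketch only — the indicator, its normalisation, the blending floor, and all grid parameters are invented here and are not those of ELH, which uses high-order finite differences and a Guermond-style entropy residual:

```python
import numpy as np

def step(u, dx, dt):
    """One forward-Euler step of u_t + (u^2/2)_x = 0 with a blended flux."""
    f = 0.5 * u ** 2
    up = np.roll(u, -1)                                   # u_{i+1} (periodic)
    f_central = 0.5 * (f + np.roll(f, -1))                # "high-order" central flux
    a = np.maximum(np.abs(u), np.abs(up))                 # local wave speed
    f_lf = f_central - 0.5 * a * (up - u)                 # Lax-Friedrichs flux
    # crude entropy-production proxy: |u| * |u_x|, normalised to [0, 1]
    res = np.abs(u) * np.abs((up - np.roll(u, 1)) / (2 * dx))
    theta = np.clip(res / (res.max() + 1e-12), 0.2, 1.0)  # 1 -> fully low-order
    f_face = (1.0 - theta) * f_central + theta * f_lf
    return u - dt / dx * (f_face - np.roll(f_face, 1))

n = 200
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
u = 1.5 + np.sin(x)                                       # steepens into a shock
dx = x[1] - x[0]
for _ in range(150):
    u = step(u, dx, 0.4 * dx / np.abs(u).max())
print(u.min(), u.max())
```

    The low-order flux switches on where the indicator is large (the steepening front), so the solution stays bounded close to its initial range instead of developing the oscillations a pure central scheme would produce.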

  9. Learning to Recognize Actions From Limited Training Examples Using a Recurrent Spiking Neural Model

    Science.gov (United States)

    Panda, Priyadarshini; Srinivasa, Narayan

    2018-01-01

    A fundamental challenge in machine learning today is to build a model that can learn from few examples. Here, we describe a reservoir based spiking neural model for learning to recognize actions with a limited number of labeled videos. First, we propose a novel encoding, inspired by how microsaccades influence visual perception, to extract spike information from raw video data while preserving the temporal correlation across different frames. Using this encoding, we show that the reservoir generalizes its rich dynamical activity toward signature action/movements enabling it to learn from few training examples. We evaluate our approach on the UCF-101 dataset. Our experiments demonstrate that our proposed reservoir achieves 81.3/87% Top-1/Top-5 accuracy, respectively, on the 101-class data while requiring just 8 video examples per class for training. Our results establish a new benchmark for action recognition from limited video examples for spiking neural models while yielding competitive accuracy with respect to state-of-the-art non-spiking neural models. PMID:29551962
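
    The reservoir idea — a fixed random recurrent network whose rich dynamics separate inputs, with only a linear readout trained on the few labeled examples — can be sketched in a rate-based (non-spiking) caricature. The sizes, spectral radius, and toy two-class "action" signals below are all illustrative stand-ins, not the paper's spiking model or encoding:

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 100, 200                                   # reservoir size, signal length
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state property: radius < 1
w_in = rng.normal(0.0, 1.0, N)

def final_state(signal):
    """Drive the fixed random reservoir with a 1-D signal; return its final state."""
    x = np.zeros(N)
    for u in signal:
        x = np.tanh(W @ x + w_in * u)
    return x

# toy "actions": two classes distinguished only by temporal frequency
t = np.arange(T)
trials = [np.sin(0.1 * t + rng.uniform(0, 6.28)) for _ in range(8)] + \
         [np.sin(0.3 * t + rng.uniform(0, 6.28)) for _ in range(8)]
labels = np.array([0] * 8 + [1] * 8)
X = np.stack([final_state(s) for s in trials])

# only this linear readout is trained (ridge regression on 16 examples)
w = np.linalg.solve(X.T @ X + 1e-2 * np.eye(N), X.T @ (2 * labels - 1))
pred = (X @ w > 0).astype(int)
print("training accuracy:", (pred == labels).mean())
```

    The recurrent weights are never trained; all learning happens in the cheap linear readout, which is what makes reservoir approaches attractive when labeled examples are scarce.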

  10. Resonance spectrum of near-extremal Kerr black holes in the eikonal limit

    International Nuclear Information System (INIS)

    Hod, Shahar

    2012-01-01

    The fundamental resonances of rapidly rotating Kerr black holes in the eikonal limit are derived analytically. We show that there exists a critical value, μ_c = √((15-√193)/2), for the dimensionless ratio μ ≡ m/l between the azimuthal harmonic index m and the spheroidal harmonic index l of the perturbation mode, above which the perturbations become long lived. In particular, it is proved that above μ_c the imaginary parts of the quasinormal frequencies scale like the black-hole temperature: ω_I(n; μ>μ_c) = 2πT_BH(n+1/2). This implies that for perturbation modes in the interval μ_c < μ ≤ 1 the relaxation time ~1/ω_I of the black hole becomes extremely long as the extremal limit T_BH → 0 is approached. A generalization of the results to the case of scalar quasinormal resonances of near-extremal Kerr-Newman black holes is also provided. In particular, we prove that only black holes that rotate fast enough (with MΩ ≥ 2/5, where M and Ω are the black-hole mass and angular velocity, respectively) possess this family of remarkably long-lived perturbation modes.
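
    The quoted scaling is easy to evaluate numerically. In geometric units (G = c = ħ = k_B = 1) the Kerr temperature and horizon angular velocity follow from the horizon radii; the snippet below simply evaluates these textbook formulas together with the ω_I = 2πT_BH(n+1/2) family:

```python
import math

def kerr_longlived_modes(M, a, n_max=3):
    """Hawking temperature, horizon angular velocity, and the imaginary
    parts omega_I = 2*pi*T_BH*(n + 1/2) of the long-lived mode family,
    for a Kerr hole of mass M and spin a (geometric units)."""
    delta = math.sqrt(M * M - a * a)
    r_plus, r_minus = M + delta, M - delta            # outer/inner horizon radii
    denom = r_plus ** 2 + a ** 2
    T_bh = (r_plus - r_minus) / (4.0 * math.pi * denom)   # Hawking temperature
    Omega = a / denom                                     # horizon angular velocity
    return T_bh, Omega, [2.0 * math.pi * T_bh * (n + 0.5) for n in range(n_max)]

T_bh, Omega, omegas = kerr_longlived_modes(M=1.0, a=0.999)
print(T_bh, Omega, omegas)
```

    For a = 0.999 M this gives MΩ ≈ 0.48, satisfying the abstract's MΩ ≥ 2/5 condition, and the ω_I values shrink with T_BH as the extremal limit a → M is approached.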

  11. Fundamentals and applications of ultrasonic waves

    CERN Document Server

    Cheeke, J David N

    2002-01-01

    Ultrasonics. A subject with applications across all the basic sciences, engineering, medicine, and oceanography, yet even the broader topic of acoustics is now rarely offered at undergraduate levels. Ultrasonics is addressed primarily at the doctoral level, and texts appropriate for beginning graduate students or newcomers to the field are virtually nonexistent.Fundamentals and Applications of Ultrasonic Waves fills that void. Designed specifically for senior undergraduates, beginning graduate students, and those just entering the field, it begins with the fundamentals, but goes well beyond th

  12. Fundamentals of differential beamforming

    CERN Document Server

    Benesty, Jacob; Pan, Chao

    2016-01-01

    This book provides a systematic study of the fundamental theory and methods of beamforming with differential microphone arrays (DMAs), or differential beamforming in short. It begins with a brief overview of differential beamforming and some popularly used DMA beampatterns such as the dipole, cardioid, hypercardioid, and supercardioid, before providing essential background knowledge on orthogonal functions and orthogonal polynomials, which form the basis of differential beamforming. From a physical perspective, a DMA of a given order is defined as an array that measures the differential acoustic pressure field of that order; such an array has a beampattern in the form of a polynomial whose degree is equal to the DMA order. Therefore, the fundamental and core problem of differential beamforming boils down to the design of beampatterns with orthogonal polynomials. But certain constraints also have to be considered so that the resulting beamformer does not seriously amplify the sensors’ self noise and the mism...
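
    For the first-order case, those beampatterns are degree-one polynomials in cos θ: B(θ) = a + (1 - a) cos θ, with standard choices of a giving each named pattern. A quick numerical check of the null directions (the 181-point grid is just for 1° resolution):

```python
import numpy as np

# standard first-order coefficients for B(theta) = a + (1 - a) * cos(theta)
patterns = {"dipole": 0.0, "cardioid": 0.5,
            "hypercardioid": 0.25, "supercardioid": 0.366}

theta = np.radians(np.arange(181))            # 0..180 degrees
for name, a in patterns.items():
    B = a + (1.0 - a) * np.cos(theta)
    null_deg = np.degrees(theta[np.argmin(np.abs(B))])
    print(f"{name:14s} null near {null_deg:.0f} deg")
```

    This reproduces the textbook null directions: 90° for the dipole, 180° for the cardioid, about 109° for the hypercardioid, and about 125° for the supercardioid.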

  13. The triune brain and ethical intelligence: the fundamental matrix of multifocal intelligence

    Directory of Open Access Journals (Sweden)

    C. Seijo

    2013-10-01

    This study aims to offer an analysis of the triune brain and ethical intelligence as the fundamental matrix of multifocal intelligence, drawing on the theory of multiple intelligences established by Beauport and Cury (2004). The theoretical grounding is based on the work of Martin (2005), Belohlavek (2007), Galicians (2002), and Beauport and Cury (2004), among others, and the study was carried out under a symbolic-interpretive approach, using a qualitative, descriptive methodology with a non-experimental, documentary-analysis design. It is found that ethical intelligence is a mental mechanism that constructs the preconceptions and rules of the game with which an individual approaches reality; that is, it is a capacity for general formation that predicts behavior directed at the achievement of organizational goals. The final considerations focus on obtaining broader knowledge within organizations, allowing them to reflect on their weaknesses by thinking about the triune brain and applying multifocal intelligence, the fundamental matrix of ethical intelligence, and on how reasoning makes the strengths visible despite the weaknesses present.

  14. Limit lines for risk

    International Nuclear Information System (INIS)

    Cox, D.C.; Baybutt, P.

    1982-01-01

    Approaches to the regulation of risk from technological systems, such as nuclear power plants or chemical process plants, in which potential accidents may result in a broad range of adverse consequences must take into account several different aspects of risk. These include overall or average risk, accidents posing high relative risks, the rate at which accident probability decreases with increasing accident consequences, and the impact of high frequency, low consequence accidents. A hypothetical complementary cumulative distribution function (CCDF), with appropriately chosen parametric form, meets all these requirements. The Farmer limit line, by contrast, places limits on the risks due to individual accident sequences, and cannot adequately account for overall risk. This reduces its usefulness as a regulatory tool. In practice, the CCDF is used in the Canadian nuclear licensing process, while the Farmer limit line approach, supplemented by separate qualitative limits on overall risk, is employed in the United Kingdom
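
    The distinction can be made concrete with a toy calculation: a CCDF aggregates all accident sequences into one exceedance-frequency curve, which can then be checked point-by-point against a Farmer-style limit line. The accident list and limit-line parameters below are invented for illustration; a real assessment would use plant-specific PRA results:

```python
# (frequency per year, consequence in arbitrary units) -- invented sequences
accidents = [(1e-2, 1.0), (1e-4, 50.0), (1e-6, 2000.0)]

def ccdf(c):
    """Complementary cumulative frequency of exceeding consequence c."""
    return sum(f for f, cons in accidents if cons > c)

def farmer_limit(c, f0=1e-1, slope=1.5):
    """Hypothetical Farmer-style limit line f = f0 * c**(-slope) (log-log line)."""
    return f0 * c ** (-slope)

for c in (0.5, 10.0, 100.0, 1000.0):
    print(f"c={c:>7}: CCDF={ccdf(c):.2e}, within limit: {ccdf(c) <= farmer_limit(c)}")
```

    Here the aggregated curve happens to stay below the line everywhere; the abstract's point is that a per-sequence Farmer check can pass even when the aggregate curve does not, which is why the CCDF is the more complete regulatory tool.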

  15. The island of knowledge the limits of science and the search for meaning

    CERN Document Server

    Gleiser, Marcelo

    2014-01-01

    Do all questions have answers? How much can we know about the world? Is there such a thing as an ultimate truth? To be human is to want to know, but what we are able to observe is only a tiny portion of what’s “out there.” In The Island of Knowledge, physicist Marcelo Gleiser traces our search for answers to the most fundamental questions of existence. In so doing, he reaches a provocative conclusion: science, the main tool we use to find answers, is fundamentally limited. These limits to our knowledge arise both from our tools of exploration and from the nature of physical reality: the speed of light, the uncertainty principle, the impossibility of seeing beyond the cosmic horizon, the incompleteness theorem, and our own limitations as an intelligent species. Recognizing limits in this way, Gleiser argues, is not a deterrent to progress or a surrendering to religion. Rather, it frees us to question the meaning and nature of the universe while affirming the central role of life and ourselves in it. Sc...

  16. Approaching the basis set limit for DFT calculations using an environment-adapted minimal basis with perturbation theory: Formulation, proof of concept, and a pilot implementation

    International Nuclear Information System (INIS)

    Mao, Yuezhi; Horn, Paul R.; Mardirossian, Narbe; Head-Gordon, Teresa; Skylaris, Chris-Kriton; Head-Gordon, Martin

    2016-01-01

    Recently developed density functionals have good accuracy for both thermochemistry (TC) and non-covalent interactions (NC) if very large atomic orbital basis sets are used. To approach the basis set limit with potentially lower computational cost, a new self-consistent field (SCF) scheme is presented that employs minimal adaptive basis (MAB) functions. The MAB functions are optimized on each atomic site by minimizing a surrogate function. High accuracy is obtained by applying a perturbative correction (PC) to the MAB calculation, similar to dual basis approaches. Compared to exact SCF results, using this MAB-SCF (PC) approach with the same large target basis set produces <0.15 kcal/mol root-mean-square deviations for most of the tested TC datasets, and <0.1 kcal/mol for most of the NC datasets. The performance of density functionals near the basis set limit can be even better reproduced. With further improvement to its implementation, MAB-SCF (PC) is a promising lower-cost substitute for conventional large-basis calculations as a method to approach the basis set limit of modern density functionals.

  17. Approaches to proton single-event rate calculations

    International Nuclear Information System (INIS)

    Petersen, E.L.

    1996-01-01

    This article discusses the fundamentals of proton-induced single-event upsets and of the various methods that have been developed to calculate upset rates. Two types of approaches are used based on nuclear-reaction analysis. Several aspects can be analyzed using analytic methods, but a complete description is not available. The paper presents an analytic description for the component due to elastic-scattering recoils. There have been a number of studies made using Monte Carlo methods. These can completely describe the reaction processes, including the effect of nuclear reactions occurring outside the device-sensitive volume. They have not included the elastic-scattering processes. The article describes the semiempirical approaches that are most widely used. The quality of previous upset predictions relative to space observations is discussed and leads to comments about the desired quality of future predictions. Brief sections treat the possible testing limitation due to total ionizing dose effects, the relationship of proton and heavy-ion upsets, upsets due to direct proton ionization, and relative proton and cosmic-ray upset rates
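
    Whatever the method, the quantity ultimately being computed is the fold of a device cross-section with the proton spectrum, rate = ∫ σ(E) φ(E) dE. The Weibull-shaped σ(E) and the power-law flux below are invented placeholders, not a fitted device model or a real environment:

```python
import math

def sigma(E, E_th=20.0, W=40.0, s=2.0, sigma_sat=1e-10):
    """Toy Weibull-shaped upset cross-section (cm^2/bit) vs proton energy (MeV);
    all four shape parameters are hypothetical."""
    if E <= E_th:
        return 0.0
    return sigma_sat * (1.0 - math.exp(-(((E - E_th) / W) ** s)))

def phi(E):
    """Toy differential proton flux, protons/(cm^2 s MeV)."""
    return 1e2 * E ** -1.5

# trapezoidal rule over 20-500 MeV
dE = 0.5
Es = [20.0 + i * dE for i in range(961)]
vals = [sigma(E) * phi(E) for E in Es]
rate = sum(0.5 * (vals[i] + vals[i + 1]) * dE for i in range(960))
print(rate, "upsets per bit-second")
```

    Both the semiempirical and Monte Carlo approaches discussed in the article amount to better-grounded choices of σ(E) and φ(E) inside this same integral.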

  18. Are fundamental constants really constant

    International Nuclear Information System (INIS)

    Norman, E.B.

    1986-01-01

    Reasons for suspecting that fundamental constants might change with time are reviewed. Possible consequences of such variations are examined. The present status of experimental tests of these ideas is discussed

  19. Religious Fundamentalism/Religious Modernism: Conceptual Adversaries or Ambivalent Phenomena?

    Directory of Open Access Journals (Sweden)

    D. GOLOVUSHKIN

    2015-02-01

    Both religious modernism and religious fundamentalism appeared as problems in academic and theological literature at the beginning of the 20th century. They came about as the result of the dynamic development of modernistic ideology in Russia, the United States, Western Europe and the Islamic world. Today, the concepts of religious modernism and religious fundamentalism are widely used to describe religious processes and phenomena which are the result of interaction between religion (as a dynamic spiritual and social subsystem) and society (as a social system experiencing evolution). The concept of religious modernism is traditionally associated with religious renewal, the contemporary world, and innovation. Fundamentalism, on the contrary, is an ideological commitment to the "roots and origins" of religion. Under the aegis of fundamentalism, any religious idea, value or concept has a right to exist. Religious Studies, during the course of time and the production of ever new material, encountered a serious theoretical and methodological problem: how can various religious movements and religious traditions be organized into groups when some of them combine elements of religious modernism and of religious fundamentalism? Already at the end of the nineteen-eighties, the well-established view defining "fundamentalism-modernism" as contrary positions had to be rethought. Studies dating from the nineteen-nineties and the beginning of the new millennium concentrated on noting the social origins and the political character of these phenomena. They demonstrated that neither fundamentalism nor modernism presents the whole picture. The lines dividing them are so blurred that they become confluent. Consequently, the author concludes that religious fundamentalism and religious modernism are ambivalent phenomena, which can, on occasion, interact with each other.

  20. Fundamentals of EUV resist-inorganic hardmask interactions

    Science.gov (United States)

    Goldfarb, Dario L.; Glodde, Martin; De Silva, Anuja; Sheshadri, Indira; Felix, Nelson M.; Lionti, Krystelle; Magbitang, Teddie

    2017-03-01

    High resolution Extreme Ultraviolet (EUV) patterning is currently limited by EUV resist thickness and pattern collapse, thus impacting the faithful image transfer into the underlying stack. Such limitation requires the investigation of improved hardmasks (HMs) as etch transfer layers for EUV patterning. Ultrathin (<5nm) inorganic HMs can provide higher etch selectivity, lower post-etch LWR, decreased defectivity and wet strippability compared to spin-on hybrid HMs (e.g., SiARC), however such novel layers can induce resist adhesion failure and resist residue. Therefore, a fundamental understanding of EUV resist-inorganic HM interactions is needed in order to optimize the EUV resist interfacial behavior. In this paper, novel materials and processing techniques are introduced to characterize and improve the EUV resist-inorganic HM interface. HM surface interactions with specific EUV resist components are evaluated for open-source experimental resist formulations dissected into its individual additives using EUV contrast curves as an effective characterization method to determine post-development residue formation. Separately, an alternative adhesion promoter platform specifically tailored for a selected ultrathin inorganic HM based on amorphous silicon (aSi) is presented and the mitigation of resist delamination is exemplified for the cases of positive-tone and negative-tone development (PTD, NTD). Additionally, original wafer priming hardware for the deposition of such novel adhesion promoters is unveiled. The lessons learned in this work can be directly applied to the engineering of EUV resist materials and processes specifically designed to work on such novel HMs.