WorldWideScience

Sample records for approaching fundamental limits

  1. Fundamental Approaches to Software Engineering

    NARCIS (Netherlands)

    Gnesi, S.; Rensink, Arend

    This volume contains the proceedings of FASE 2014, the 17th International Conference on Fundamental Approaches to Software Engineering, which was held in Grenoble, France, in April 2014 as part of the annual European Joint Conferences on Theory and Practice of Software (ETAPS).

  2. Queueing networks a fundamental approach

    CERN Document Server

    Dijk, Nico

    2011-01-01

    This handbook aims to highlight fundamental, methodological and computational aspects of networks of queues to provide insights and to unify results that can be applied in a more general manner.  The handbook is organized into five parts: Part 1 considers exact analytical results such as those of product-form type. Topics include characterization of product forms by physical balance concepts and simple traffic flow equations, classes of service and queue disciplines that allow a product form, a unified description of product forms for discrete time queueing networks, insights for insensitivity, and aggregation and decomposition results that allow subnetworks to be aggregated into single nodes to reduce computational burden. Part 2 looks at monotonicity and comparison results for computational simplification by either of two approaches: stochastic monotonicity and ordering results based on the ordering of the process generators, and comparison results and explicit error bounds based on an underlying Markov r...
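
    As a point of reference for the product-form results surveyed in Part 1, a standard textbook statement (not quoted from the handbook itself) is Jackson's theorem for open networks of exponential single-server queues:

    ```latex
    % Jackson's theorem: for an open network of J exponential single-server
    % stations, where the \lambda_j solve the traffic equations
    % \lambda_j = \gamma_j + \sum_i \lambda_i r_{ij} and \rho_j = \lambda_j/\mu_j < 1,
    % the stationary queue-length distribution factorizes:
    \pi(n_1,\ldots,n_J) \;=\; \prod_{j=1}^{J} (1-\rho_j)\,\rho_j^{\,n_j}
    ```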

  3. Fundamental limitations in filtering and control

    CERN Document Server

    Seron, Maria M

    1997-01-01

    The issue of fundamental limitations in filtering and control lies at the very heart of any feedback system design, since it reveals what is and is not achievable on the basis of that system's structural and dynamic characteristics. Alongside new succinct treatments of Bode's original results from the 1940s, this book presents a comprehensive analysis of modern results, featuring contemporary developments in multivariable systems, sampled-data, periodic and nonlinear problems. The text gives particular prominence to sensitivity functions which measure the fundamental qualities of the system, including performance and robustness. With extensive appendices covering the necessary background on complex variable theory, this book is an ideal self-contained resource for researchers and practitioners in this field.
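
    A concrete instance of such a limitation is Bode's sensitivity integral, quoted here in a standard textbook form (assuming an open-loop transfer function of relative degree at least two, with unstable open-loop poles p_k):

    ```latex
    % Bode sensitivity integral: S(s) = 1/(1 + L(s)) is the sensitivity
    % function. The integral of its log magnitude is conserved, so pushing
    % |S| down at some frequencies necessarily pushes it up elsewhere
    % (the "waterbed effect").
    \int_0^{\infty} \ln\lvert S(j\omega)\rvert \, d\omega \;=\; \pi \sum_{k} \operatorname{Re}(p_k)
    ```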

  4. Fundamental limits of repeaterless quantum communications

    Science.gov (United States)

    Pirandola, Stefano; Laurenza, Riccardo; Ottaviani, Carlo; Banchi, Leonardo

    2017-04-01

    Quantum communications promises reliable transmission of quantum information, efficient distribution of entanglement and generation of completely secure keys. For all these tasks, we need to determine the optimal point-to-point rates that are achievable by two remote parties at the ends of a quantum channel, without restrictions on their local operations and classical communication, which can be unlimited and two-way. These two-way assisted capacities represent the ultimate rates that are reachable without quantum repeaters. Here, by constructing an upper bound based on the relative entropy of entanglement and devising a dimension-independent technique dubbed 'teleportation stretching', we establish these capacities for many fundamental channels, namely bosonic lossy channels, quantum-limited amplifiers, dephasing and erasure channels in arbitrary dimension. In particular, we exactly determine the fundamental rate-loss tradeoff affecting any protocol of quantum key distribution. Our findings set the limits of point-to-point quantum communications and provide precise and general benchmarks for quantum repeaters.
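
    The exact rate-loss tradeoff referred to here has a closed form for the bosonic lossy channel: as published in this paper, the two-way assisted secret-key capacity of a channel with transmissivity η is

    ```latex
    % PLOB bound (Pirandola, Laurenza, Ottaviani, Banchi 2017): secret-key
    % capacity of the lossy bosonic channel with transmissivity \eta.
    K(\eta) \;=\; -\log_2(1-\eta) \;\approx\; 1.44\,\eta \quad \text{bits per channel use for } \eta \ll 1
    ```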

  5. Fundamental limits of repeaterless quantum communications

    Science.gov (United States)

    Pirandola, Stefano; Laurenza, Riccardo; Ottaviani, Carlo; Banchi, Leonardo

    2017-01-01

    Quantum communications promises reliable transmission of quantum information, efficient distribution of entanglement and generation of completely secure keys. For all these tasks, we need to determine the optimal point-to-point rates that are achievable by two remote parties at the ends of a quantum channel, without restrictions on their local operations and classical communication, which can be unlimited and two-way. These two-way assisted capacities represent the ultimate rates that are reachable without quantum repeaters. Here, by constructing an upper bound based on the relative entropy of entanglement and devising a dimension-independent technique dubbed 'teleportation stretching', we establish these capacities for many fundamental channels, namely bosonic lossy channels, quantum-limited amplifiers, dephasing and erasure channels in arbitrary dimension. In particular, we exactly determine the fundamental rate-loss tradeoff affecting any protocol of quantum key distribution. Our findings set the limits of point-to-point quantum communications and provide precise and general benchmarks for quantum repeaters. PMID:28443624

  6. Fundamental limits of repeaterless quantum communications.

    Science.gov (United States)

    Pirandola, Stefano; Laurenza, Riccardo; Ottaviani, Carlo; Banchi, Leonardo

    2017-04-26

    Quantum communications promises reliable transmission of quantum information, efficient distribution of entanglement and generation of completely secure keys. For all these tasks, we need to determine the optimal point-to-point rates that are achievable by two remote parties at the ends of a quantum channel, without restrictions on their local operations and classical communication, which can be unlimited and two-way. These two-way assisted capacities represent the ultimate rates that are reachable without quantum repeaters. Here, by constructing an upper bound based on the relative entropy of entanglement and devising a dimension-independent technique dubbed 'teleportation stretching', we establish these capacities for many fundamental channels, namely bosonic lossy channels, quantum-limited amplifiers, dephasing and erasure channels in arbitrary dimension. In particular, we exactly determine the fundamental rate-loss tradeoff affecting any protocol of quantum key distribution. Our findings set the limits of point-to-point quantum communications and provide precise and general benchmarks for quantum repeaters.

  7. Fundamental limit of light trapping in grating structures

    KAUST Repository

    Yu, Zongfu

    2010-08-11

    We use a rigorous electromagnetic approach to analyze the fundamental limit of light-trapping enhancement in grating structures. This limit can exceed the bulk limit of 4n², but has significant angular dependency. We explicitly show that 2D gratings provide more enhancement than 1D gratings. We also show the effects of the grating profile's symmetry on the absorption enhancement limit. Numerical simulations are applied to support the theory. Our findings provide general guidance for the design of grating structures for light-trapping solar cells.
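
    For reference, the bulk limit cited above is the classical (Yablonovitch) light-trapping limit; in its standard ray-optic form, the absorption-enhancement factor F for a medium of refractive index n is:

    ```latex
    % Classical light-trapping limits (standard forms, not quoted from the
    % abstract): full-hemisphere (isotropic) response, and the generalization
    % when emission is restricted to a cone of half-angle \theta.
    F_{\max} = 4n^2, \qquad F_{\max} = \frac{4n^2}{\sin^2\theta}
    ```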

  8. Measurement of quantum memory effects and its fundamental limitations

    Science.gov (United States)

    Wittemer, Matthias; Clos, Govinda; Breuer, Heinz-Peter; Warring, Ulrich; Schaetz, Tobias

    2018-02-01

    We discuss how the nature of projective measurements in quantum mechanics can lead to a nontrivial bias in non-Markovianity measures, which quantify the flow of information between a system and its environment. Consequently, in their current form, envisioned applications are fundamentally limited. In our trapped-ion system, we precisely quantify such bias and perform local quantum probing to demonstrate the corresponding limitations. The combination of extended measures and our scalable experimental approach can provide a versatile reference, relevant for understanding more complex systems.
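
    For context, the non-Markovianity measures at issue are exemplified by the trace-distance (BLP) measure co-introduced by one of the authors; in standard notation (not quoted from this record):

    ```latex
    % Trace distance D(\rho_1,\rho_2) = \tfrac{1}{2}\lVert\rho_1-\rho_2\rVert_1;
    % the information flow is \sigma(t) = \frac{d}{dt} D(\rho_1(t),\rho_2(t)).
    % Non-Markovianity is the total information backflow, maximized over
    % pairs of initial states:
    \mathcal{N} \;=\; \max_{\rho_{1,2}(0)} \int_{\sigma>0} \sigma\bigl(t,\rho_{1,2}(0)\bigr)\, dt
    ```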

  9. Collaborative Filtering: Fundamental Limits and Good Practices

    Indian Academy of Sciences (India)

    Slide fragments: recover entire matrix (Aditya, Dabeer, Dey, IEEE Trans. Inform. Theory, Apr. 2011); clustering algorithm fails; limits not known; 0.5 log(cluster size); threshold Θ(α log n + log log n); Pe → 0 for cluster + majority; Pe → 1 for any scheme; threshold determined by majority decoding ...

  10. Collaborative Filtering: Fundamental Limits and Good Practices

    Indian Academy of Sciences (India)

    Slide fragments: beyond search (shopping, services, travel, events, media, social networks); web search returns a long list of related items, while recommendations return a few "likable" items in a limited domain using a rating matrix; examples: Facebook and LinkedIn connection suggestions, and the RichRelevance recommendation engine (Disney Stores, Sears, Office Depot, etc.).

  11. From fundamental limits to radioprotection practice

    International Nuclear Information System (INIS)

    Henry, P.; Chassany, J.

    1980-01-01

    The individual dose limits fixed by present French legislation for different categories of people refer to dose equivalents received by or delivered to the whole body or to certain tissues or organs over given periods of time. The values concerning personnel directly engaged in radiation work are summarized in a table. These are the limits which radioprotection authorities must impose, while ensuring that exposure levels are kept as low as possible. With the means available in practical radioprotection it is not possible to measure dose equivalents directly, but information may be obtained on dose rates, absorbed doses, particle fluxes, and activities per unit volume and per surface area. An interpretation of these measurements is necessary if efficient supervision of worker exposure is to be achieved. (Original in French)

  12. Fundamental limits of positron emission mammography

    International Nuclear Information System (INIS)

    Moses, William W.; Qi, Jinyi

    2001-01-01

    We explore the causes of performance limitation in positron emission mammography cameras. We compare two basic camera geometries containing the same volume of 511 keV photon detectors, one with a parallel plane geometry and another with a rectangular geometry. We find that both geometries have similar performance for the phantom imaged (in Monte Carlo simulation), even though the solid angle coverage of the rectangular camera is about 50 percent higher than the parallel plane camera. The reconstruction algorithm used significantly affects the resulting image; iterative methods significantly outperform the commonly used focal plane tomography. Finally, the characteristics of the tumor itself, specifically the absolute amount of radiotracer taken up by the tumor, will significantly affect the imaging performance.

  13. Fundamental limits of positron emission mammography

    Energy Technology Data Exchange (ETDEWEB)

    Moses, William W.; Qi, Jinyi

    2001-06-01

    We explore the causes of performance limitation in positron emission mammography cameras. We compare two basic camera geometries containing the same volume of 511 keV photon detectors, one with a parallel plane geometry and another with a rectangular geometry. We find that both geometries have similar performance for the phantom imaged (in Monte Carlo simulation), even though the solid angle coverage of the rectangular camera is about 50 percent higher than the parallel plane camera. The reconstruction algorithm used significantly affects the resulting image; iterative methods significantly outperform the commonly used focal plane tomography. Finally, the characteristics of the tumor itself, specifically the absolute amount of radiotracer taken up by the tumor, will significantly affect the imaging performance.

  14. Fundamental limits of scintillation detector timing precision

    International Nuclear Information System (INIS)

    Derenzo, Stephen E; Choong, Woon-Seng; Moses, William W

    2014-01-01

    In this paper we review the primary factors that affect the timing precision of a scintillation detector. Monte Carlo calculations were performed to explore the dependence of the timing precision on the number of photoelectrons, the scintillator decay and rise times, the depth of interaction uncertainty, the time dispersion of the optical photons (modeled as an exponential decay), the photodetector rise time and transit time jitter, the leading-edge trigger level, and electronic noise. The Monte Carlo code was used to estimate the practical limits on the timing precision for an energy deposition of 511 keV in 3 mm × 3 mm × 30 mm Lu₂SiO₅:Ce and LaBr₃:Ce crystals. The calculated timing precisions are consistent with the best experimental literature values. We then calculated the timing precision for 820 cases that sampled scintillator rise times from 0 to 1.0 ns, photon dispersion times from 0 to 0.2 ns, photodetector time jitters from 0 to 0.5 ns fwhm, and A from 10 to 10,000 photoelectrons per ns decay time. Since the timing precision R was found to depend on A^(-1/2) more than any other factor, we tabulated the parameter B, where R = B·A^(-1/2). An empirical analytical formula was found that fit the tabulated values of B with an rms deviation of 2.2% of the value of B. The theoretical lower bound of the timing precision was calculated for the example of 0.5 ns rise time, 0.1 ns photon dispersion, and 0.2 ns fwhm photodetector time jitter. The lower bound was at most 15% lower than leading-edge timing discrimination for A from 10 to 10,000 photoelectrons ns⁻¹. A timing precision of 8 ps fwhm should be possible for an energy deposition of 511 keV using currently available photodetectors if a theoretically possible scintillator were developed that could produce 10,000 photoelectrons ns⁻¹.
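
    The abstract's central scaling result lends itself to a one-line calculation; a minimal sketch follows (the prefactor B below is a placeholder, not one of the paper's tabulated values):

    ```python
    # Illustrative sketch of the scaling reported in the abstract:
    # timing precision R (fwhm) = B * A**-0.5, where A is the photoelectron
    # rate per ns of decay time and B is the tabulated prefactor.

    def timing_precision_fwhm(A_pe_per_ns: float, B: float) -> float:
        """Timing precision R = B / sqrt(A), per the abstract's empirical fit."""
        return B * A_pe_per_ns ** -0.5

    # Example: quadrupling the photoelectron rate halves the timing jitter.
    B = 1.0  # placeholder prefactor (ns units), set by rise/decay/jitter times
    for A in (10, 100, 1_000, 10_000):
        print(f"A = {A:>6} pe/ns  ->  R = {timing_precision_fwhm(A, B):.4f} ns")
    ```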

  15. Exact Fundamental Limits of the First and Second Hyperpolarizabilities

    Science.gov (United States)

    Lytel, Rick; Mossman, Sean; Crowell, Ethan; Kuzyk, Mark G.

    2017-08-01

    Nonlinear optical interactions of light with materials originate in the microscopic response of the molecular constituents to excitation by an optical field, and are expressed by the first (β ) and second (γ ) hyperpolarizabilities. Upper bounds to these quantities were derived seventeen years ago using approximate, truncated state models that violated completeness and unitarity, and far exceed those achieved by potential optimization of analytical systems. This Letter determines the fundamental limits of the first and second hyperpolarizability tensors using Monte Carlo sampling of energy spectra and transition moments constrained by the diagonal Thomas-Reiche-Kuhn (TRK) sum rules and filtered by the off-diagonal TRK sum rules. The upper bounds of β and γ are determined from these quantities by applying error-refined extrapolation to perfect compliance with the sum rules. The method yields the largest diagonal component of the hyperpolarizabilities for an arbitrary number of interacting electrons in any number of dimensions. The new method provides design insight to the synthetic chemist and nanophysicist for approaching the limits. This analysis also reveals that the special cases which lead to divergent nonlinearities in the many-state catastrophe are not physically realizable.

  16. Fundamental Limits of Data Analytics in Sociotechnical Systems

    Directory of Open Access Journals (Sweden)

    Lav R. Varshney

    2016-02-01

    In the Big Data era, informational systems involving humans and machines are being deployed in multifarious societal settings. Many use data analytics as subcomponents for descriptive, predictive, and prescriptive tasks, often trained using machine learning. Yet when analytics components are placed in large-scale sociotechnical systems, it is often difficult to characterize how well the systems will act, measured with criteria relevant in the world. Here, we propose a system modeling technique that treats data analytics components as 'noisy black boxes' or stochastic kernels, which together with elementary stochastic analysis provides insight into fundamental performance limits. An example application is helping prioritize people's limited attention, where learning algorithms rank tasks using noisy features and people sequentially select from the ranked list. This paper demonstrates the general technique by developing a stochastic model of analytics-enabled sequential selection, derives fundamental limits using concomitants of order statistics, and assesses limits in terms of system-wide performance metrics like screening cost and value of objects selected. Connections to sample complexity for bipartite ranking are also made.
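
    A toy simulation of the selection model described above; all parameters (number of objects, noise level, list length) are illustrative assumptions, not the paper's:

    ```python
    # Minimal simulation of analytics-enabled sequential selection: a learner
    # ranks objects by a noisy feature, the top of the ranked list is
    # selected, and the value captured is compared against an oracle ranking.
    import numpy as np

    rng = np.random.default_rng(0)
    n_objects, k_selected, noise_sd = 1000, 50, 0.8

    true_value = rng.normal(size=n_objects)             # latent object values
    noisy_score = true_value + rng.normal(scale=noise_sd, size=n_objects)

    oracle_pick = np.argsort(true_value)[-k_selected:]  # best possible picks
    ranked_pick = np.argsort(noisy_score)[-k_selected:] # picks via noisy ranking

    efficiency = true_value[ranked_pick].sum() / true_value[oracle_pick].sum()
    print(f"fraction of oracle value captured: {efficiency:.2f}")
    ```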

  17. Investigation of fundamental limits to beam brightness available from photoinjectors

    International Nuclear Information System (INIS)

    Bazarov, Ivan

    2015-01-01

    The goal of this project was investigation of fundamental limits to beam brightness available from photoinjectors. This basic research in accelerator physics spanned over 5 years aiming to extend the fundamental understanding of high average current, low emittance sources of relativistic electrons based on photoemission guns, a necessary prerequisite for a new generation of coherent X-ray synchrotron radiation facilities based on continuous duty superconducting linacs. The program focused on two areas critical to making advances in the electron source performance: 1) the physics of photocathodes for the production of low emittance electrons and 2) control of space charge forces in the immediate vicinity to the cathode via 3D laser pulse shaping.

  18. Investigation of fundamental limits to beam brightness available from photoinjectors

    Energy Technology Data Exchange (ETDEWEB)

    Bazarov, Ivan [Cornell Univ., Ithaca, NY (United States)

    2015-07-09

    The goal of this project was investigation of fundamental limits to beam brightness available from photoinjectors. This basic research in accelerator physics spanned over 5 years aiming to extend the fundamental understanding of high average current, low emittance sources of relativistic electrons based on photoemission guns, a necessary prerequisite for a new generation of coherent X-ray synchrotron radiation facilities based on continuous duty superconducting linacs. The program focused on two areas critical to making advances in the electron source performance: 1) the physics of photocathodes for the production of low emittance electrons and 2) control of space charge forces in the immediate vicinity to the cathode via 3D laser pulse shaping.

  19. Fundamental size limitations of micro four-point probes

    DEFF Research Database (Denmark)

    Ansbæk, Thor; Petersen, Dirch Hjorth; Hansen, Ole

    2009-01-01

    The continued down-scaling of integrated circuits and magnetic tunnel junctions (MTJ) for hard disc read heads presents a challenge to current metrology technology. The four-point probes (4PP), currently used for sheet resistance characterization in these applications, therefore must be down-scaled as well in order to correctly characterize the extremely thin films used. This presents a four-point probe design and fabrication challenge. We analyze the fundamental limitation on down-scaling of a generic micro four-point probe (M4PP) in a comprehensive study, where mechanical, thermal, and electrical ...

  20. Fundamental limitation of electrocatalytic methane conversion to methanol.

    Science.gov (United States)

    Arnarson, Logi; Schmidt, Per S; Pandey, Mohnish; Bagger, Alexander; Thygesen, Kristian S; Stephens, Ifan E L; Rossmeisl, Jan

    2018-04-09

    The electrochemical oxidation of methane to methanol at remote oil fields where methane is flared is the ultimate solution for harnessing this valuable energy resource. In this study we identify a fundamental surface-catalytic limitation of this process in terms of a compromise between selectivity and activity, as oxygen evolution is a competing reaction. By investigating two classes of materials, rutile oxides and two-dimensional transition metal nitrides and carbides (MXenes), we find a linear relationship between the energy needed to activate methane, i.e. to break the first C-H bond, and the oxygen binding energy on the surface. Based on a simple kinetic model, we conclude that in order to obtain sufficient activity, oxygen has to bind weakly to the surface, but there is an upper limit to retain selectivity. A few potentially interesting candidates are found, but this relatively simple description enables future large-scale screening studies for more optimal candidates.
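
    Schematically, the linear relationship described in the abstract can be written as follows (symbolic form only; the slope, intercept, and selectivity threshold are fitted in the paper and not reproduced here):

    ```latex
    % E_{C-H}: energy needed to break the first C-H bond of methane;
    % \Delta E_O: oxygen binding energy on the surface.
    E_{\mathrm{C\text{-}H}} \;\approx\; a\,\Delta E_{\mathrm{O}} + b
    % Per the abstract: oxygen must bind weakly for sufficient activity,
    % but only up to an upper limit if selectivity against the competing
    % oxygen evolution reaction is to be retained.
    ```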

  1. Updates on tetanus toxin: a fundamental approach

    Directory of Open Access Journals (Sweden)

    Md. Ahaduzzaman

    2015-03-01

    Clostridium tetani is an anaerobic bacterium that produces the second most poisonous protein toxin of any bacterium. Tetanus in animals is sporadic in nature but difficult to combat even by using antibiotics and antiserum. It is crucial to understand the fundamental mechanisms and signals that control toxin production for advanced research and medicinal uses. This review is intended to give readers in related fields a better understanding of the basic pathophysiology of tetanus and the tetanus neurotoxin (TeNT).

  2. Fundamental limits to the velocity of solid armatures in railguns

    International Nuclear Information System (INIS)

    Long, G.C. Jr.

    1987-01-01

    The fundamental limits to the velocity of solid armatures in railguns are dependent upon the increase in temperature which melts the conducting medium or lowers the yield strength of the material. A two-dimensional transient finite-element electrothermal model is developed to determine the magnetic and temperature fields in the rails and armature of a railgun. The solution for the magnetic and temperature fields is based upon the fundamentals of Maxwell's equations and Fourier's law of heat conduction, with no a priori assumptions about the current-density distribution in the rails or the armature. The magnetic-field and temperature-field spatial variations are calculated using finite-element techniques, while the time variations are calculated using finite-differencing methods. A thermal-diffusion iteration is performed between each magnetic-diffusion iteration: solving the magnetic diffusion problem provides the Joule heating information, while solving the thermal diffusion problem provides the temperature data used to calculate material properties such as electrical resistivity, thermal conductivity, and specific heat. Various rail and armature designs are simulated, including solid armatures consisting of different homogeneous materials, resistive rails, and a graded-resistance armature.
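
    For reference, the coupled field equations behind such an electrothermal model are, in their standard quasi-static forms (consistent with the abstract's description, not quoted from it):

    ```latex
    % Magnetic diffusion (from Maxwell's equations with Ohm's law, uniform
    % permeability \mu and conductivity \sigma) coupled to heat conduction
    % with a Joule source J^2/\sigma:
    \nabla^2 \mathbf{B} = \mu\sigma\,\frac{\partial \mathbf{B}}{\partial t},
    \qquad
    \rho c_p\,\frac{\partial T}{\partial t} = \nabla\cdot\left(k\,\nabla T\right) + \frac{J^2}{\sigma}
    ```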

  3. Fundamental limits to high-contrast wavefront control

    Science.gov (United States)

    Mazoyer, Johan; Pueyo, Laurent

    2017-09-01

    The current generation of ground-based coronagraphic instruments uses deformable mirrors to correct for phase errors and to improve contrast levels at small angular separations. Building on these techniques, several space- and ground-based instruments are currently being developed that use two deformable mirrors to correct for both phase and amplitude errors. However, as wavefront control techniques improve, more complex telescope pupil geometries (support structures, segmentation) will soon be a limiting factor for these next-generation coronagraphic instruments. In this paper we discuss fundamental limits associated with wavefront control with deformable mirrors in high-contrast coronagraphs. We start with an analytic prescription of wavefront errors, along with their wavelength dependence, and propagate them through coronagraph models. We then consider a few wavefront control architectures, numbers of deformable mirrors and their placement in the optical train of the instrument, and algorithms that can be used to cancel the starlight scattered by these wavefront errors over a finite bandpass. For each configuration we derive the residual contrast as a function of bandwidth and of the properties of the incoming wavefront. This result has consequences when setting the wavefront requirements and the wavefront control architecture of future high-contrast instruments, both on the ground and in space. In particular, we show that these limits can severely affect the effective Outer Working Angle that can be achieved by a given coronagraph instrument.

  4. Towards the Fundamental Quantum Limit of Linear Measurements of Classical Signals.

    Science.gov (United States)

    Miao, Haixing; Adhikari, Rana X; Ma, Yiqiu; Pang, Belinda; Chen, Yanbei

    2017-08-04

    The quantum Cramér-Rao bound (QCRB) sets a fundamental limit for the measurement of classical signals with detectors operating in the quantum regime. Using linear-response theory and the Heisenberg uncertainty relation, we derive a general condition for achieving such a fundamental limit. When applied to classical displacement measurements with a test mass, this condition leads to an explicit connection between the QCRB and the standard quantum limit that arises from a tradeoff between the measurement imprecision and quantum backaction; the QCRB can be viewed as an outcome of a quantum nondemolition measurement with the backaction evaded. Additionally, we show that the test mass is more a resource for improving measurement sensitivity than a victim of the quantum backaction, which suggests a new approach to enhancing the sensitivity of a broad class of sensors. We illustrate these points with laser interferometric gravitational-wave detectors.
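
    For orientation, the standard quantum limit that the paper connects to the QCRB is commonly quoted, in one convention, as a displacement noise spectral density for a free test mass m at measurement frequency Ω (textbook form, not taken from the paper):

    ```latex
    % Free-mass standard quantum limit: the minimum of the sum of shot-noise
    % (imprecision) and radiation-pressure (backaction) contributions,
    % balanced at each frequency \Omega.
    S_x^{\mathrm{SQL}}(\Omega) \;=\; \frac{2\hbar}{m\,\Omega^{2}}
    ```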

  5. Health (care) and human rights: a fundamental conditions approach.

    Science.gov (United States)

    Liao, S Matthew

    2016-08-01

    Many international declarations state that human beings have a human right to health care. However, is there a human right to health care? What grounds this right, and who has the corresponding duties to promote this right? Elsewhere, I have argued that human beings have human rights to the fundamental conditions for pursuing a good life. Drawing on this fundamental conditions approach of human rights, I offer a novel way of grounding a human right to health care.

  6. Fundamental limits to imaging resolution for focused ion beams

    International Nuclear Information System (INIS)

    Orloff, J.; Swanson, L.W.; Utlaut, M.

    1996-01-01

    This article investigates the limitations on the formation of focused ion beam images from secondary electrons. We use the notion of the information content of an image to account for the effects of resolution, contrast, and signal-to-noise ratio and show that there is a competition between the rate at which small features are sputtered away by the primary beam and the rate of collection of secondary electrons. We find that for small features, sputtering is the limit to imaging resolution, and that for extended small features (e.g., layered structures), rearrangement, redeposition, and differential sputtering rates may limit the resolution in some cases.
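
    A back-of-envelope version of that competition; every number below is an illustrative assumption, not a value from the article:

    ```python
    # A small feature is sputtered away by the primary ion beam while
    # secondary electrons (SE) are collected; the finite atom budget caps
    # the achievable signal-to-noise ratio.
    atoms_per_nm3 = 60      # typical solid atomic density (assumed)
    d_nm = 5.0              # cubic feature size (assumed)
    sputter_yield = 2.0     # atoms removed per incident ion (assumed)
    se_yield = 1.5          # SEs emitted per incident ion (assumed)

    n_atoms = atoms_per_nm3 * d_nm**3
    max_ions = n_atoms / sputter_yield   # ion dose before the feature is gone
    max_se = max_ions * se_yield         # total SE budget for the image
    snr = max_se ** 0.5                  # Poisson-limited SNR if all SEs are used
    print(f"{d_nm} nm feature: ~{max_ions:.0f} ions available, SNR <= {snr:.0f}")
    ```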

  7. Fundamental limits on quantum dynamics based on entropy change

    Science.gov (United States)

    Das, Siddhartha; Khatri, Sumeet; Siopsis, George; Wilde, Mark M.

    2018-01-01

    It is well known in the realm of quantum mechanics and information theory that the entropy is non-decreasing for the class of unital physical processes. However, in general, the entropy does not exhibit monotonic behavior. This has restricted the use of entropy change in characterizing evolution processes. Recently, a lower bound on the entropy change was provided in the work of Buscemi, Das, and Wilde [Phys. Rev. A 93(6), 062314 (2016)]. We explore the limit that this bound places on the physical evolution of a quantum system and discuss how these limits can be used as witnesses to characterize quantum dynamics. In particular, we derive a lower limit on the rate of entropy change for memoryless quantum dynamics, and we argue that it provides a witness of non-unitality. This limit on the rate of entropy change leads to definitions of several witnesses for testing memory effects in quantum dynamics. Furthermore, from the aforementioned lower bound on entropy change, we obtain a measure of non-unitarity for unital evolutions.

  8. Fundamental statistical limitations of future dark matter direct detection experiments

    NARCIS (Netherlands)

    Strege, C.; Trotta, F.; Bertone, G.; Peter, A.H.G.; Scott, P.

    2012-01-01

    We discuss irreducible statistical limitations of future ton-scale dark matter direct detection experiments. We focus in particular on the coverage of confidence intervals, which quantifies the reliability of the statistical method used to reconstruct the dark matter parameters, and the bias of the ...

  9. Fundamental Limits of Blind Deconvolution Part I: Ambiguity Kernel

    OpenAIRE

    Choudhary, Sunav; Mitra, Urbashi

    2014-01-01

    Blind deconvolution is a ubiquitous non-linear inverse problem in applications like wireless communications and image processing. This problem is generally ill-posed, and there have been efforts to use sparse models for regularizing blind deconvolution to promote signal identifiability. Part I of this two-part paper characterizes the ambiguity space of blind deconvolution and shows unidentifiability of this inverse problem for almost every pair of unconstrained input signals. The approach in...

  10. Fundamental limitations in developing computer-aided detection for mammography

    Science.gov (United States)

    Nishikawa, Robert M.; Pesce, Lorenzo L.

    2011-08-01

    While asymptomatic screening with mammography has been proven to reduce breast cancer mortality, radiologists miss cancers when reading screening mammograms. Computer-aided detection (CADe) is being developed to help radiologists avoid overlooking a cancer. In this paper, we describe two overarching issues that limit the current development of CADe schemes: the inability to optimize a scheme for clinical impact (current methods only optimize for how well the CADe scheme works in the absence of a radiologist), and the lack of a figure of merit that quantifies the performance efficiency of the CADe scheme. Such a figure of merit could be used to determine how much better a CADe scheme could perform, at least in theory, and which of the several techniques employed in the CADe scheme is the weakest link.

  11. Secret Key Agreement: Fundamental Limits and Practical Challenges

    KAUST Repository

    Rezki, Zouheir

    2017-02-15

    Despite the tremendous progress made toward establishing PLS as a new paradigm to guarantee the security of communication systems at the physical layer, there is a common belief among researchers and industry practitioners that many practical challenges prevent PLS from flourishing at the industrial scale. Most secure message transmission constructions available to date are tied to strong assumptions on CSI, consider simple channel models, and underestimate eavesdropping capabilities, thus compromising their practical interest to a large extent. Arguably, the most reasonable way to leverage the potential of PLS in securing modern wireless communication systems is via secret-key agreement. In the latter setting, the legitimate parties try to agree on a key by exploiting the availability of a public channel with high capacity which is also accessible to the eavesdropper. Once a key is shared by the legitimate parties, they may use it in a one-time pad encryption, for instance. In this article, we investigate two performance limits of secret-key agreement communications, namely, the secret-key diversity-multiplexing trade-off and the effect of transmit correlation on the secret-key capacity. We show via examples how secret-key agreement offers more flexibility than secure message transmission. Finally, we explore a few challenges of the secret-key agreement concept and propose a few guidelines to overcome them.

  12. Limiting value definition in radiation protection physics, legislation and toxicology. Fundamentals, contrasts, perspectives

    International Nuclear Information System (INIS)

    Smeddinck, Ulrich; Koenig, Claudia

    2016-01-01

    This volume documents an ENTRIA workshop discussion on limit value definition in radiation protection, including the following contributions: introduction to radiation protection, fundamental concepts of limit values, and their heterogeneity; evaluation standards for dose in radiation protection in the context of the search for a final repository; definition of limit values in toxicology; public participation in limit value definition as a perspective for radiation protection regulation; and current developments in radiation protection.

  13. A fundamentally new approach to air-cooled heat exchangers.

    Energy Technology Data Exchange (ETDEWEB)

    Koplow, Jeffrey P.

    2010-01-01

    We describe breakthrough results obtained in a feasibility study of a fundamentally new architecture for air-cooled heat exchangers. A longstanding but largely unrealized opportunity in energy efficiency concerns the performance of air-cooled heat exchangers used in air conditioners, heat pumps, and refrigeration equipment. In the case of residential air conditioners, for example, the typical performance of the air-cooled heat exchangers used for condensers and evaporators is at best marginal from the standpoint of achieving the maximum possible coefficient of performance (COP). If by some means it were possible to reduce the thermal resistance of these heat exchangers to a negligible level, a typical energy savings on the order of 30% could be immediately realized. It has long been known that a several-fold increase in heat exchanger size, in conjunction with the use of much higher volumetric flow rates, provides a straightforward path to this goal, but it is not practical from the standpoint of real-world applications. The tension in the marketplace between the need for energy efficiency and logistical considerations such as equipment size, cost, and operating noise has resulted in a compromise that is far from ideal. This is the reason that a typical residential air conditioner exhibits significant sensitivity to reductions in fan speed and/or fouling of the heat exchanger surface. The prevailing wisdom is that little can be done to improve this situation; the 'fan-plus-finned-heat-sink' heat exchanger architecture used throughout the energy sector represents an extremely mature technology for which there is little opportunity for further optimization. But the fact remains that conventional fan-plus-finned-heat-sink technology simply does not work that well. The primary physical limitation to performance (i.e., low thermal resistance) is the boundary layer of motionless air that adheres to and envelops all surfaces of the heat exchanger. ...

  14. Some Fundamental Limits on SAW RFID Tag Information Capacity and Collision Resolution

    Science.gov (United States)

    Barton, Richard J.

    2013-01-01

    In this paper, we apply results from multi-user information theory to study the limits of information capacity and collision resolution for SAW RFID tags. In particular, we derive bounds on the achievable data rate per tag as a function of fundamental parameters such as the tag time-bandwidth product, tag signal-to-noise ratio (SNR), and number of tags in the environment. We also discuss the implications of these bounds for tag waveform design and tag interrogation efficiency.

  15. Fundamental x-ray interaction limits in diagnostic imaging detectors: spatial resolution.

    Science.gov (United States)

    Hajdok, G; Battista, J J; Cunningham, I A

    2008-07-01

    The practice of diagnostic x-ray imaging has been transformed with the emergence of digital detector technology. Although digital systems offer many practical advantages over conventional film-based systems, their spatial resolution performance can be a limitation. The authors present a Monte Carlo study to determine fundamental resolution limits caused by x-ray interactions in four converter materials: amorphous silicon (a-Si), amorphous selenium, cesium iodide, and lead iodide. The "x-ray interaction" modulation transfer function (MTF) was determined for each material and compared in terms of the 50% MTF spatial frequency and Wagner's effective aperture for incident photon energies between 10 and 150 keV and various converter thicknesses. Several conclusions can be drawn from their Monte Carlo study: (i) in low-Z (a-Si) converters, reabsorption of Compton-scattered x rays limits spatial resolution, with a sharp MTF drop at very low spatial frequencies; (ii) [portion of the record lost here] the x-ray interaction MTF; (iii) the spread of energy due to secondary-electron (e.g., photoelectron) transport is significant only at very high spatial frequencies; (iv) unlike the spread of optical light in phosphors, the spread of absorbed energy from x-ray interactions does not significantly degrade spatial resolution as converter thickness is increased; (v) the effective aperture results reported here represent fundamental spatial resolution limits of the materials tested and serve as target benchmarks for the design and development of future digital x-ray detectors.

  16. Heat-Assisted Magnetic Recording: Fundamental Limits to Inverse Electromagnetic Design

    Science.gov (United States)

    Bhargava, Samarth

    In this dissertation, we address the burgeoning fields of diffractive optics, metal-optics and plasmonics, and computational inverse problems in the engineering design of electromagnetic structures. We focus on the application of the optical nano-focusing system that will enable Heat-Assisted Magnetic Recording (HAMR), a higher-density magnetic recording technology that will fulfill the exploding worldwide demand for digital data storage. The heart of HAMR is a system that focuses light to a sub-diffraction-limit nanoscale spot with an extremely high power density via an optical antenna. We approach this engineering problem by first discussing the fundamental limits of nano-focusing and the material limits for metal-optics and plasmonics. Then, we use efficient gradient-based optimization algorithms to computationally design shapes of 3D nanostructures that outperform human designs on the basis of mass-market product requirements. In 2014, the world manufactured ~1 zettabyte (ZB), i.e., 1 billion terabytes (TB), of data storage devices, including ~560 million magnetic hard disk drives (HDDs). Global demand for storage will likely increase by 10x in the next 5-10 years, and manufacturing capacity cannot keep up with demand alone. We discuss the state-of-the-art HDD and why industry invented Heat-Assisted Magnetic Recording (HAMR) to overcome its data density limitations. HAMR leverages the temperature sensitivity of magnets, in which the coercivity suddenly and non-linearly falls at the Curie temperature. Data recording to high-density hard disks can be achieved by locally heating one bit of information while co-applying a magnetic field. The heating can be achieved by focusing 100 μW of light to a 30 nm diameter spot on the hard disk. This is an enormous light intensity, roughly ~100,000,000x the intensity of sunlight on the earth's surface! This power density is ~1,000x the output of gold-coated tapered optical fibers used in near-field scanning optical microscopes ...

  17. Limited Approach in Endoscopic Dacryocystorhinostomy of Pediatrics

    OpenAIRE

    Hashemi, Seyyed Mostafa; Eshaghian, Afrooz

    2017-01-01

    Background: The limited nasal cavity space in children makes pediatric dacryocystorhinostomy (DCR) a difficult surgical procedure. We apply a limited approach to pediatric DCR and follow the patients for their outcomes. Materials and Methods: An experimental study was done in pediatric DCR with a limited approach (age < 14 years old). After written consent, with general anesthesia, with nasal endoscopic surgery, the lacrimal bone is exposed and extruded. In contrast with the routine procedure, the ascending proce...

  18. Prediction of injury by limited and asymmetrical fundamental movement patterns in american football players.

    Science.gov (United States)

    Kiesel, Kyle B; Butler, Robert J; Plisky, Philip J

    2014-05-01

    Previous injury is the strongest risk factor for future injury in sports. It has been proposed that motor-control changes such as movement limitation and asymmetry associated with injury and pain may be perpetuated as part of an individual's movement strategy. Motor control of fundamental 1 × body-weight tasks can reliably and efficiently be measured in the field. Objective: to determine whether the motor control of fundamental movement patterns and pattern asymmetry have a relationship with time-loss injury over the course of the preseason in professional football. Design: injury-risk study. Setting: American professional football facilities. Participants: 238 American professional football players. To measure the motor control of 1 × body-weight fundamental movement patterns, Functional Movement Screen (FMS) scores were obtained before the start of training camp. The previously established cutoff score of ≤14 and the presence of any asymmetries on the FMS were examined using relative risk to determine whether a relationship exists with time-loss injury, defined as any time loss from practice or competition due to musculoskeletal injury. Players who scored ≤14 exhibited a relative risk of 1.87 (95% CI 1.20-2.96). Similarly, players with at least 1 asymmetry displayed a relative risk of 1.80 (95% CI 1.11-2.74). The combination of scoring below the threshold and exhibiting a movement asymmetry was highly specific for injury, with a specificity of 0.87 (95% CI 0.84-0.90). The results of this study suggest that fundamental movement patterns and pattern asymmetry are identifiable risk factors for time-loss injury during the preseason in professional football players.
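
    The risk statistics above come from standard 2×2-table formulas; a minimal sketch with hypothetical counts (only the formula, not the data, reflects the study):

    ```python
    # Relative risk from a 2x2 table
    # [[a, b], [c, d]] = [[exposed injured, exposed uninjured],
    #                     [unexposed injured, unexposed uninjured]].
    def relative_risk(a: int, b: int, c: int, d: int) -> float:
        """RR = risk in the exposed group / risk in the unexposed group."""
        return (a / (a + b)) / (c / (c + d))

    # Hypothetical counts: players scoring <= 14 on the screen vs. > 14.
    print(f"RR = {relative_risk(30, 70, 20, 118):.2f}")
    ```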

  19. Stimulated Raman Scattering Imposes Fundamental Limits to the Duration and Bandwidth of Temporal Cavity Solitons

    Science.gov (United States)

    Wang, Yadong; Anderson, Miles; Coen, Stéphane; Murdoch, Stuart G.; Erkintalo, Miro

    2018-02-01

    Temporal cavity solitons (CSs) are optical pulses that can persist in passive resonators, and they play a key role in the generation of coherent microresonator frequency combs. In resonators made of amorphous materials, such as fused silica, they can exhibit a spectral redshift due to stimulated Raman scattering. Here we show that this Raman-induced self-frequency shift imposes a fundamental limit on the duration and bandwidth of temporal CSs. Specifically, we theoretically predict that stimulated Raman scattering introduces a previously unidentified Hopf bifurcation that leads to destabilization of CSs at large pump-cavity detunings, limiting the range of detunings over which they can exist. We have confirmed our theoretical predictions by performing extensive experiments in synchronously driven fiber ring resonators, obtaining results in excellent agreement with numerical simulations. Our results could have significant implications for the future design of Kerr frequency comb systems based on amorphous microresonators.

  20. Standardless quantification approach of TXRF analysis using fundamental parameter method

    International Nuclear Information System (INIS)

    Szaloki, I.; Taniguchi, K.

    2000-01-01

    A new standardless evaluation procedure based on the fundamental parameter method (FPM) has been developed for TXRF analysis. The theoretical calculation describes the relationship between characteristic intensities and the geometrical parameters of the excitation and detection system and the specimen parameters: size, thickness, angle of the excitation beam to the surface, and the optical properties of the specimen holder. Most TXRF methods apply empirical calibration, which requires the application of a special preparation technique. However, the characteristic lines of the specimen holder (Si Kα,β) carry information about the local excitation and geometrical conditions on the substrate surface. On the basis of the theoretical calculation of the substrate characteristic intensity, the excitation beam flux can be approximated. Taking into consideration the elements present in the specimen material, a system of non-linear equations can be written involving the unknown concentration values and the geometrical and detection parameters. In order to solve this mathematical problem, Pascal software was written which calculates the sample composition and the average sample thickness by a gradient algorithm. Therefore, this quantitative estimation of the specimen composition requires neither an external nor an internal standard sample. For verification of the theoretical calculation and the numerical procedure, several experiments were carried out using a mixed standard solution containing the elements K, Sc, V, Mn, Co and Cu in the 0.1-10 ppm concentration range.
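
    A minimal sketch of the fitting step described above, using a toy forward model in place of the real fundamental-parameter equations; the sensitivities, absorption coefficients, intensities, and flux value are all made-up illustrations:

    ```python
    # Fit unknown concentrations and sample thickness to measured line
    # intensities with a gradient-based solver, mirroring the abstract's
    # "composition + average thickness via gradient algorithm" idea.
    import numpy as np
    from scipy.optimize import least_squares

    s = np.array([1.0, 0.8, 1.3, 0.6])        # toy per-element sensitivities
    mu = np.array([0.5, 1.1, 0.3, 0.9])       # toy self-absorption coefficients
    I_meas = np.array([0.40, 0.18, 0.55, 0.05])
    flux = 2.0                                # assumed known, e.g. from the Si K substrate line

    def residuals(p: np.ndarray) -> np.ndarray:
        c, t = p[:4], p[4]                    # concentrations and thickness
        model = flux * s * c * np.exp(-mu * t)  # toy thin-film intensity model
        return np.append(model - I_meas, c.sum() - 1.0)  # closure constraint

    fit = least_squares(residuals, x0=[0.25, 0.25, 0.25, 0.25, 0.1],
                        bounds=(0, np.inf))
    print("concentrations:", np.round(fit.x[:4], 3), " thickness:", round(fit.x[4], 3))
    ```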

  1. Fundamental phenomena affecting low temperature combustion and HCCI engines, high load limits and strategies for extending these limits

    KAUST Repository

    Saxena, Samveg

    2013-10-01

    Low temperature combustion (LTC) engines are an emerging engine technology that offers an alternative to spark-ignited and diesel engines. One type of LTC engine, the homogeneous charge compression ignition (HCCI) engine, uses a well-mixed fuel-air charge like spark-ignited engines and relies on compression ignition like diesel engines. Similar to diesel engines, the use of high compression ratios and removal of the throttling valve in HCCI allow for high-efficiency operation, thereby allowing lower CO2 emissions per unit of work delivered by the engine. The use of a highly diluted well-mixed fuel-air charge allows for low emissions of nitrogen oxides, soot and particulate matter, and the use of oxidation catalysts can allow low emissions of unburned hydrocarbons and carbon monoxide. As a result, HCCI offers the ability to achieve high efficiencies comparable with diesel while also allowing clean emissions using relatively inexpensive aftertreatment technologies. HCCI is not, however, without its challenges. Traditionally, two important problems prohibiting market penetration of HCCI are 1) inability to achieve high load, and 2) difficulty in controlling combustion timing. Recent research has significantly mitigated these challenges, and thus HCCI has a promising future for automotive and power generation applications. This article begins by providing a comprehensive review of the physical phenomena governing HCCI operation, with particular emphasis on high load conditions. Emissions characteristics are then discussed, with suggestions on how to inexpensively enable low emissions of all regulated pollutants. The operating limits that govern high load conditions are discussed in detail, and finally recent research which expands the high load limits of HCCI is reviewed. Although this article focuses on the fundamental phenomena governing HCCI operation, it is also useful for understanding the fundamental phenomena in reactivity-controlled ...

  2. An approach to fundamental study of beam loss minimization

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1999-01-01

    The accelerator design rules involving rms matching, developed at CERN in the 1970s, are discussed. An additional rule, for equipartitioning the beam energy among its degrees of freedom, may be added to ensure an rms equilibrium condition. If the strong stochasticity threshold is avoided, as it is in realistic accelerator designs, the dynamics is characterized by extremely long transient settling times, making the role of equipartitioning hard to explain. An approach to systematic study using the RFQ accelerator as a simulation testbed is discussed. New methods are available from recent advances in research on complexity, nonlinear dynamics, and chaos.

  3. Photon routing in cavity QED: Beyond the fundamental limit of photon blockade

    Energy Technology Data Exchange (ETDEWEB)

    Rosenblum, Serge; Dayan, Barak [Department of Chemical Physics, Weizmann Institute of Science, Rehovot 76100 (Israel); Parkins, Scott [Department of Physics, University of Auckland, Private Bag 92019, Auckland (New Zealand)

    2011-09-15

    The simplest and seemingly most straightforward application of the photon blockade effect, in which the transport of one photon prevents the transport of others, would be to separate two incoming indistinguishable photons into different output ports. We show that time-energy uncertainty relations inherently prevent this ideal situation when the blockade is implemented by a two-level system. The fundamental nature of this limit is revealed in the fact that photon blockade in the strong coupling regime of cavity QED, resulting from the nonlinearity of the Jaynes-Cummings energy level structure, exhibits efficiency and temporal behavior identical to those of photon blockade in the bad cavity regime, where the underlying nonlinearity is that of the atom itself. We demonstrate that this limit can be exceeded, yet not avoided, by exploiting time-energy entanglement between the incident photons. Finally, we show how this limit can be circumvented completely by using a three-level atom coupled to a single-sided cavity, enabling an ideal and robust photon routing mechanism.
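
    For reference, the Jaynes-Cummings nonlinearity invoked here is the √n anharmonicity of the dressed-state ladder (standard form, on resonance; not quoted from this record):

    ```latex
    % Jaynes-Cummings ladder: dressed-state energies of the n-excitation
    % manifold for atom-cavity coupling g; the \sqrt{n} spacing is the
    % nonlinearity that produces photon blockade.
    E_{\pm}(n) \;=\; n\hbar\omega \;\pm\; \sqrt{n}\,\hbar g
    ```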

  4. Limited Approach in Endoscopic Dacryocystorhinostomy of Pediatrics.

    Science.gov (United States)

    Hashemi, Seyyed Mostafa; Eshaghian, Afrooz

    2017-01-01

    The limited nasal cavity space in children makes pediatric dacryocystorhinostomy (DCR) a difficult surgical procedure. We apply a limited approach to pediatric DCR and follow the patients for their outcomes. An experimental study was done in pediatric DCR with a limited approach (age < 14 years old). After written consent, with general anesthesia, with nasal endoscopic surgery, the lacrimal bone is exposed and extruded. In contrast with the routine procedure, the ascending process of the maxillary sinus is preserved; marsupialization and wide exposure of the lacrimal sac were done only through the lacrimal bone defect; and cannulation was preserved with a temporary silicone tube. Between 2006 and 2012, 16 pediatric DCRs were done by a single surgeon in 2 otorhinolaryngologic centers. Before surgery, 14 (87.5%) had epiphora, 3 (18.8%) had eye discharge, and 3 (18.8%) had sticky eye. Two (12.5%) had a history of facial trauma, and 10 (62.5%) had congenital nasolacrimal duct insufficiency. Five (31.3%) had a history of dacryocystitis. Patients were followed for 17 ± 9 months. The silicone tube stayed for 4 ± 2.5 months. We could follow 7 patients; minimal improvement or need for revision surgery was considered technical failure. After surgery, 3 patients had no epiphora with complete improvement; 2 had very good improvement with the confidence of the patients and parents; 2 cases had unsuccessful surgery and needed another operation, one of whom had had several probings and operations before our endoscopic DCR. A limited approach in endoscopic DCR of pediatrics can be done in noncomplicated patients, with minimal manipulation, more confidence, and acceptable results.

  5. Roothaan approach in the thermodynamic limit

    Science.gov (United States)

    Gutierrez, G.; Plastino, A.

    1982-02-01

    A systematic method for the solution of the Hartree-Fock equations in the thermodynamic limit is presented. The approach is seen to be a natural extension of the one usually employed in the finite-fermion case, i.e., that developed by Roothaan. The new techniques developed here are applied, as an example, to neutron matter, employing the so-called V1 Bethe "homework" potential. The results obtained are, by far, superior to those that the ordinary plane-wave Hartree-Fock theory yields. Keywords: Hartree-Fock approach; nuclear and neutron matter.
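
    For reference, the finite-basis Hartree-Fock scheme being extended here is summarized by the Roothaan(-Hall) equations in their standard matrix form:

    ```latex
    % Roothaan-Hall equations: expanding the HF orbitals in a fixed basis
    % turns the integro-differential HF problem into a generalized
    % eigenproblem, with F the Fock matrix, S the basis overlap matrix,
    % C the orbital coefficients, and \varepsilon the orbital energies.
    F\,C \;=\; S\,C\,\varepsilon
    ```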

  6. Limited Approach in Endoscopic Dacryocystorhinostomy of Pediatrics

    Directory of Open Access Journals (Sweden)

    Seyyed Mostafa Hashemi

    2017-01-01

    Background: The limited nasal cavity space in children makes pediatric dacryocystorhinostomy (DCR) a difficult surgical procedure. We apply a limited approach to pediatric DCR and follow the patients for their outcomes. Materials and Methods: An experimental study was done in pediatric DCR with a limited approach (age < 14 years old). After written consent, with general anesthesia, with nasal endoscopic surgery, the lacrimal bone is exposed and extruded. In contrast with the routine procedure, the ascending process of the maxillary sinus is preserved; marsupialization and wide exposure of the lacrimal sac were done only through the lacrimal bone defect; and cannulation was preserved with a temporary silicone tube. Results: Between 2006 and 2012, 16 pediatric DCRs were done by a single surgeon in 2 otorhinolaryngologic centers. Before surgery, 14 (87.5%) had epiphora, 3 (18.8%) had eye discharge, and 3 (18.8%) had sticky eye. Two (12.5%) had a history of facial trauma, and 10 (62.5%) had congenital nasolacrimal duct insufficiency. Five (31.3%) had a history of dacryocystitis. Patients were followed for 17 ± 9 months. The silicone tube stayed for 4 ± 2.5 months. We could follow 7 patients; minimal improvement or need for revision surgery was considered technical failure. After surgery, 3 patients had no epiphora with complete improvement; 2 had very good improvement with the confidence of the patients and parents; 2 cases had unsuccessful surgery and needed another operation, one of whom had had several probings and operations before our endoscopic DCR. Conclusions: A limited approach in endoscopic DCR of pediatrics can be done in noncomplicated patients, with minimal manipulation, more confidence, and acceptable results.

  7. Roothaan approach in the thermodynamic limit

    International Nuclear Information System (INIS)

    Gutierrez, G.; Plastino, A.

    1982-01-01

    A systematic method for the solution of the Hartree-Fock equations in the thermodynamic limit is presented. The approach is seen to be a natural extension of the one usually employed in the finite-fermion case, i.e., that developed by Roothaan. The new techniques developed here are applied, as an example, to neutron matter, employing the so-called V1 Bethe "homework" potential. The results obtained are, by far, superior to those that the ordinary plane-wave Hartree-Fock theory yields.

  8. Estimation of fundamental frequencies in polyphonic music sound using subspace-based approach

    Science.gov (United States)

    Lee, Jong H.; Chun, Joohwan

    2001-11-01

    Music is a sum of several instrumental sounds whose individual fundamental frequencies are based on the musical score. Conversely, musical sound contains information about the score, such as the instruments played and their fundamental frequencies. Automatic identification of scores from musical sound is called automatic transcription. There are many items to be estimated: the type of instruments, the fundamental frequencies, and the notes. Among these, the fundamental frequency estimation (FFE) problem has been widely studied; it has been investigated for more than thirty years, and there are many algorithms for the estimation of monophonic and polyphonic sounds. In this paper we propose a new estimation method for musical sound using the subspace approach. Our algorithm can be used to estimate polyphonic and poly-instrumental sounds. This subspace approach is based on the autocorrelation of sounds and the orthogonality property. First, we gather subspaces of various instruments with different fundamental frequencies; we call these subspaces the sound manifold. Next, we compare the sound manifold with the subspace of the measured musical sound. We use the noise subspace of the measured sound and apply a MUSIC-like algorithm which uses the orthogonality of the signal subspace and the noise subspace. We test our algorithm with MIDI signals and show good identification capability.
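
    A minimal sketch of a MUSIC-like fundamental-frequency estimator in the spirit of the abstract, applied to a synthetic 220 Hz tone; frame size, subspace dimension, and harmonic count are illustrative choices, not the authors':

    ```python
    # Build a covariance matrix from signal frames, split signal/noise
    # subspaces by eigendecomposition, and score candidate f0 values by how
    # orthogonal their harmonics are to the noise subspace.
    import numpy as np

    fs = 8000.0
    t = np.arange(0, 1.0, 1 / fs)
    x = sum(np.sin(2 * np.pi * 220 * k * t) / k for k in range(1, 4))  # 3 partials
    x += 0.05 * np.random.default_rng(1).normal(size=t.size)           # additive noise

    m = 64                                   # covariance (frame) dimension
    frames = np.lib.stride_tricks.sliding_window_view(x, m)[::m]
    R = frames.T @ frames / len(frames)      # sample autocorrelation matrix

    eigvals, eigvecs = np.linalg.eigh(R)     # eigenvalues in ascending order
    noise_subspace = eigvecs[:, :-12]        # keep 12 dims for the signal subspace

    def pseudospectrum(f0: float, harmonics: int = 3) -> float:
        score = 0.0
        for k in range(1, harmonics + 1):
            steer = np.exp(2j * np.pi * f0 * k * np.arange(m) / fs)
            score += np.linalg.norm(noise_subspace.conj().T @ steer) ** 2
        return 1.0 / score                   # peaks where harmonics are orthogonal to noise subspace

    cands = np.arange(100.0, 400.0, 1.0)
    print("estimated f0:", cands[np.argmax([pseudospectrum(f) for f in cands])], "Hz")
    ```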

  9. COMPETENCE-BASED APPROACH WHILE TEACHING FUNDAMENTAL SCIENCE SUBJECTS AT MEDICAL UNIVERSITY

    Directory of Open Access Journals (Sweden)

    V. Y. Gelman

    2016-01-01

    The aim of the study is to analyze the features of applying the competence approach in teaching theoretical and natural science subjects in medical school. Methods: The method of expert estimation is used to identify the main tendencies and problems arising in the course of professional competency development, and a problem-oriented method is applied in teaching fundamental disciplines. Results: The effect of the competence approach on the educational process is shown through teaching experience in pathological anatomy and statistics. The problems and complicating factors faced by teachers are identified. Basic approaches are proposed that facilitate the development of professional competencies within the fixed time allotted for studying a particular discipline. Scientific novelty and practical significance: The proposed approaches and practical recommendations will enable competencies in fundamental disciplines to be formed more efficiently. The implementation of the competence approach in teaching fundamental science subjects requires adjustments at the substantive and methodological levels of training, including the active use of information technologies and a substantial increase in the role of seminars and self-study. The advantage of the proposed approaches lies in their universality: with some adjustment, they can be used in teaching other subjects, regardless of the specifics and type of educational institution.

  10. Quantum cryptography approaching the classical limit.

    Science.gov (United States)

    Weedbrook, Christian; Pirandola, Stefano; Lloyd, Seth; Ralph, Timothy C

    2010-09-10

    We consider the security of continuous-variable quantum cryptography as we approach the classical limit, i.e., when the unknown preparation noise at the sender's station becomes significantly noisy or thermal (even as much as 10⁴ times greater than the variance of the vacuum mode). We show that, provided the channel transmission losses do not exceed 50%, the security of quantum cryptography is not dependent on the channel transmission and is therefore remarkably robust against significant amounts of excess preparation noise. We extend these results to consider, for the first time, quantum cryptography at wavelengths considerably longer than optical, and find that regions of security still exist all the way down to the microwave regime.

  11. Prospects and fundamental limitations of room temperature, non-avalanche, semiconductor photon-counting sensors (Conference Presentation)

    Science.gov (United States)

    Ma, Jiaju; Zhang, Yang; Wang, Xiaoxin; Ying, Lei; Masoodian, Saleh; Wang, Zhiyuan; Starkey, Dakota A.; Deng, Wei; Kumar, Rahul; Wu, Yang; Ghetmiri, Seyed Amir; Yu, Zongfu; Yu, Shui-Qing; Salamo, Gregory J.; Fossum, Eric R.; Liu, Jifeng

    2017-05-01

    This research investigates the fundamental limits and trade-space of quantum semiconductor photodetectors using the Schrödinger equation and the laws of thermodynamics. We envision that, to optimize the metrics of single photon detection, it is critical to simultaneously maximize the optical absorption in a minimal volume and minimize the carrier transit process. Integration of photon management with quantum charge transport/redistribution upon optical excitation can be engineered to maximize the quantum efficiency (QE) and data rate while minimizing timing jitter. Due to the ultra-low capacitance of these quantum devices, even a single photoelectron transfer can induce a notable change in the voltage, enabling non-avalanche single photon detection at room temperature, as has recently been demonstrated in Si quanta image sensors (QIS). In this research, uniform III-V quantum dots (QDs) and Si QIS are used as model systems to test the theory experimentally. Based on this fundamental understanding, we also propose proof-of-concept, photon-managed quantum capacitance photodetectors. Built upon the concepts of the QIS and the single electron transistor (SET), this novel device structure provides a model system to synergistically test the fundamental limits and trade-space predicted by the theory for semiconductor detectors. This project is sponsored under DARPA/ARO's DETECT Program: Fundamental Limits of Quantum Semiconductor Photodetectors.

  12. Fundamental limitations of non-thermal plasma processing for internal combustion engine NOx control

    International Nuclear Information System (INIS)

    Penetrante, B.M.

    1993-01-01

    This paper discusses the physics and chemistry of non-thermal plasma processing for post-combustion NOx control in internal combustion engines. Electron beam and electrical discharge processing are compared with regard to power consumption, radical production, NOx removal mechanisms, and by-product formation. Can non-thermal deNOx operate efficiently without additives or catalysts? How much electrical power does it cost to operate? What are the by-products of the process? This paper addresses these fundamental issues based on an analysis of the electron-molecule processes and chemical kinetics.

  13. Probing the fundamental limit of niobium in high radiofrequency fields by dual mode excitation in superconducting radiofrequency cavities

    Energy Technology Data Exchange (ETDEWEB)

    Eremeev, Grigory; Geng, Rongli; Palczewski, Ari

    2011-07-01

    We have studied thermal breakdown in several multicell superconducting radiofrequency cavities by simultaneous excitation of two TM₀₁₀ passband modes. Unlike measurements done in the past, which indicated a clearly thermal nature of the breakdown, our measurements present a more complex picture with an interplay of both thermal and magnetic effects. The JLab LG-1 cavity that we studied was limited at 40.5 MV/m, corresponding to B_peak = 173 mT, in the 8π/9 mode. Dual-mode measurements on this quench indicate that it is not purely magnetic, and so we conclude that this field is not the fundamental limit in SRF cavities.

  14. Network Synchronization in a Noisy Environment with Time Delays: Fundamental Limits and Trade-Offs

    Science.gov (United States)

    Hunt, D.; Korniss, G.; Szymanski, B. K.

    2010-08-01

    We study the effects of nonzero time delays in stochastic synchronization problems with linear couplings in an arbitrary network. Using the known exact threshold value from the theory of differential equations with delays, we provide the synchronizability threshold for arbitrary network topologies. Further, by constructing the scaling theory of the underlying fluctuations, we establish the absolute limit of synchronization efficiency in a noisy environment with uniform time delays, i.e., the minimum attainable value of the width of the synchronization landscape. Our results also have strong implications for optimization and trade-offs in network synchronization with delays.
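
    The threshold referred to above can be illustrated directly: for linear diffusive coupling with a uniform delay, each Laplacian eigenmode obeys a scalar delay equation whose classical stability condition yields the network threshold. A sketch under these assumptions (all values illustrative):

        import numpy as np

        def critical_delay(adjacency, coupling=1.0):
            # Graph Laplacian of the (undirected) network.
            A = np.asarray(adjacency, float)
            L = np.diag(A.sum(axis=1)) - A
            lam_max = np.linalg.eigvalsh(L).max()
            # Each Laplacian mode obeys x' = -coupling*lam*x(t - tau); the
            # classical delay-ODE result gives stability iff
            # coupling*lam*tau < pi/2, so the largest eigenvalue sets the
            # network-wide synchronizability threshold.
            return np.pi / (2.0 * coupling * lam_max)

        # Example: a ring of 6 nodes (lam_max = 4, so tau_c = pi/8).
        ring = np.roll(np.eye(6), 1, axis=1) + np.roll(np.eye(6), -1, axis=1)
        print(critical_delay(ring))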

  15. Fire protection for nuclear power plants. Part 1. Fundamental approaches. Version 6/99

    International Nuclear Information System (INIS)

    1999-06-01

    The KTA nuclear safety code sets out the fundamental approaches and principles for the prevention of fires in nuclear power plants, addressing aspects such as the initiation, spreading, and effects of a fire: (a) fire loads and ignition sources, (b) structural and plant engineering conditions, (c) ways and means of fire detection and fire fighting. Relevant technical and organisational measures are defined. The scope and quality of the fire prevention measures to be taken, as well as the relevant in-service inspection activities, are determined according to the protective goals pursued in each case. (orig./CB) [de]

  16. Fundamental limits on wavelength, efficiency and yield of the charge separation triad.

    Directory of Open Access Journals (Sweden)

    Alexander Punnoose

    Full Text Available In an attempt to optimize a high-yield, high-efficiency artificial photosynthetic protein we have discovered unique energy and spatial architecture limits which apply to all light-activated photosynthetic systems. We have derived an analytical solution for the time behavior of the core three-cofactor charge separation element in photosynthesis, the photosynthetic cofactor triad, and explored the functional consequences of its makeup, including its architecture, the reduction potentials of its components, and the absorption energy of the light-absorbing primary-donor cofactor. Our primary findings are twofold: first, a high-efficiency, high-yield triad will have an absorption energy more than twice the reorganization energy of the first electron transfer; and second, the relative distances of the acceptor and the donor from the primary donor play an important role in determining the yields, with the highest-efficiency, highest-yield architecture having the light-absorbing cofactor closest to the acceptor. Surprisingly, despite the increased complexity found in natural solar energy conversion proteins, we find that the construction of this central triad in natural systems matches these predictions. Our analysis thus not only suggests explanations for some aspects of the makeup of natural photosynthetic systems, it also provides specific design criteria necessary to create high-efficiency, high-yield artificial protein-based triads.
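
    The energy bookkeeping behind the first finding can be made concrete with the standard Marcus expression for nonadiabatic electron transfer; a minimal sketch, with all parameter values hypothetical and the formula being textbook Marcus theory rather than the authors' full analytical solution:

        import numpy as np

        HBAR = 6.582e-16   # eV*s
        KB_T = 0.0257      # eV at room temperature

        def marcus_rate(coupling_eV, dG_eV, lambda_eV):
            # Nonadiabatic Marcus rate: k = (2*pi/hbar)|V|^2 (4*pi*lambda*kT)^(-1/2)
            #                               * exp(-(dG + lambda)^2 / (4*lambda*kT))
            prefac = (2 * np.pi / HBAR) * coupling_eV**2 / np.sqrt(4 * np.pi * lambda_eV * KB_T)
            return prefac * np.exp(-(dG_eV + lambda_eV)**2 / (4 * lambda_eV * KB_T))

        E_photon = 1.8   # absorption energy of the primary donor (eV, invented)
        lambda1  = 0.7   # reorganization energy of the first transfer (eV, invented)

        # The design rule quoted above: a high-yield triad wants E_photon > 2*lambda1.
        print(E_photon > 2 * lambda1, marcus_rate(0.01, -lambda1, lambda1))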

  17. Fundamental Limits to Coherent Scattering and Photon Coalescence from Solid-State Quantum Emitters [arXiv

    DEFF Research Database (Denmark)

    Iles-Smith, Jake; McCutcheon, Dara; Mørk, Jesper

    2016-01-01

    The desire to produce high-quality single photons for applications in quantum information science has led to renewed interest in exploring solid-state emitters in the weak excitation regime. Under these conditions it is expected that photons are coherently scattered, and so benefit from... We find that the sideband resulting from non-Markovian relaxation of the phonon environment leads to a fundamental limit to the fraction of coherently scattered light and to the visibility of two-photon coalescence at weak driving, both of which are absent for atomic systems or within simpler Markovian...

  18. Shotgun approaches to gait analysis : insights & limitations

    NARCIS (Netherlands)

    Kaptein, Ronald G.; Wezenberg, Daphne; IJmker, Trienke; Houdijk, Han; Beek, Peter J.; Lamoth, Claudine J. C.; Daffertshofer, Andreas

    2014-01-01

    Background: Identifying features for gait classification is a formidable problem. The number of candidate measures is legion. This calls for proper, objective criteria when ranking their relevance. Methods: Following a shotgun approach we determined a plenitude of kinematic and physiological gait

  19. The voluntary offset - approaches and limitations

    International Nuclear Information System (INIS)

    2012-06-01

    After briefly presenting the voluntary offset mechanism, which aims at funding projects that reduce or capture greenhouse gas emissions, this document describes the approach to be followed to adopt voluntary offsetting, for individuals as well as for companies, communities or event organisers. It describes other important contextual issues (projects developed under voluntary offsetting, actors in the voluntary offset market, market status, offset labels) and how to proceed in practice (defining objectives and expectations, identifying the necessary requirements, ensuring that the requirements match the expectations). It also addresses the case of voluntary offsetting in France (difficult establishment, possible solutions).

  20. A Fundamental Approach to Developing Aluminium based Bulk Amorphous Alloys based on Stable Liquid Metal Structures and Electronic Equilibrium - 154041

    Science.gov (United States)

    2017-03-28

    Report AFRL-AFOSR-JP-TR-2017-0027 (reporting period ending 16 Dec 2016): A Fundamental Approach to Developing Aluminium-based Bulk Amorphous Alloys based on Stable Liquid-Metal Structures and Electronic Equilibrium. The work, carried out with the Air Force Research Laboratory, is aimed at accurately predicting compositions of new amorphous alloys specifically based on aluminium with properties superior

  1. The fundamental parameter approach of quantitative XRFA- investigation of photoelectric absorption coefficients

    International Nuclear Information System (INIS)

    Shaltout, A.

    2003-06-01

    The present work describes some current problems of quantitative X-ray fluorescence analysis by means of the fundamental parameter approach. To perform this task, some of the main parameters are discussed in detail: photoelectric cross sections, coherent and incoherent scattering cross sections, mass absorption cross sections, and the variation of the X-ray tube voltage. Photoelectric, coherent and incoherent scattering, and mass absorption cross sections in the energy range from 1 to 300 keV for the elements from Z=1 to 94 are studied across ten different databases: those of Hubbell, McMaster, Mucall, Scofield, Xcom, Elam, Sasaki, Henke, Cullen and Chantler. These databases have also been developed for application in fundamental parameter programs for quantitative X-ray analysis (energy dispersive X-ray fluorescence analysis (EDXRFA), electron probe microanalysis (EPMA), X-ray photoelectron spectroscopy (XPS) and total electron yield (TEY)), and a comparison between the different databases is performed. In McMaster's database, the missing elements (Z=84, 85, 87, 88, 89, 91, and 93) are added by using photoelectric cross sections from Scofield's database, coherent and incoherent scattering cross sections from Elam's database, and the absorption edges of Bearden. Also, the N-fit coefficients of the elements from Z=61 to 69 are wrong in the McMaster database; therefore, linear least squares fits are used to recalculate the N-fit coefficients of these elements. Additionally, in the McMaster tables the positions of the M- and N-edges of all elements, with the exception of the M1- and N1-edges, are not defined, nor are the jump ratios of the edges. In the present work, the M- and N-edges and the related jump ratios are calculated; to include the missing N-edges, Bearden's values of the edge energies are used. In Scofield's database, modifications include check and correction
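
    The database entries discussed here are typically stored as log-log polynomial fits between absorption edges; a sketch of evaluating and re-fitting such coefficients (the fit form follows the McMaster convention, while the coefficient values and helper names are placeholders):

        import numpy as np

        # McMaster-style parameterization: tabulated cross sections are stored
        # as polynomial fits in log-log space,
        #     ln(sigma) = sum_i a_i * (ln E)^i,
        # with separate coefficient sets between absorption edges.
        def cross_section(energy_keV, coeffs):
            # coeffs = [a0, a1, a2, a3]; polyval wants highest degree first.
            return np.exp(np.polyval(coeffs[::-1], np.log(energy_keV)))

        # Re-fitting coefficients from tabulated data by linear least squares,
        # as done above for the erroneous Z=61..69 N-fit entries.
        def refit(energies_keV, sigmas, order=3):
            return np.polyfit(np.log(energies_keV), np.log(sigmas), order)[::-1]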

  2. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    International Nuclear Information System (INIS)

    Egan, James; McMillan, Norman; Denieffe, David

    2011-01-01

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the field of chemical metrology to telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits: the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental quantitations over existing methods are discussed.
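
    Currie's three limits have simple closed forms under the usual Gaussian blank-noise assumption; a minimal sketch with illustrative numbers:

        import numpy as np

        # Currie's limits (alpha = beta = 0.05), expressed as multiples of the
        # standard deviation of the blank (background) measurement.
        def currie_limits(sigma_blank):
            critical_level      = 1.645 * sigma_blank  # decision: "was anything detected?"
            detection_limit     = 3.29  * sigma_blank  # a priori detectability
            determination_limit = 10.0  * sigma_blank  # ~10% relative precision
            return critical_level, detection_limit, determination_limit

        # e.g. the noise floor of an optical power meter, in nW (invented value):
        print(currie_limits(0.8))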

  3. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    Energy Technology Data Exchange (ETDEWEB)

    Egan, James; McMillan, Norman; Denieffe, David, E-mail: eganj@itcarlow.ie [IT Carlow (Ireland)]

    2011-08-17

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the field of chemical metrology to telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits: the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental quantitations over existing methods are discussed.

  4. Fundamental approaches for analysis thermal hydraulic parameter for Puspati Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Hashim, Zaredah, E-mail: zaredah@nm.gov.my; Lanyau, Tonny Anak, E-mail: tonny@nm.gov.my; Farid, Mohamad Fairus Abdul; Kassim, Mohammad Suhaimi [Reactor Technology Centre, Technical Support Division, Malaysia Nuclear Agency, Ministry of Science, Technology and Innovation, Bangi, 43000, Kajang, Selangor Darul Ehsan (Malaysia); Azhar, Noraishah Syahirah [Universiti Teknologi Malaysia, 80350, Johor Bahru, Johor Darul Takzim (Malaysia)

    2016-01-22

    The 1-MW PUSPATI Research Reactor (RTP), developed by General Atomics (GA), is the only nuclear pool-type research reactor in Malaysia. It was installed at the Malaysian Nuclear Agency and reached first criticality on 8 June 1982. Based on the initial core, which comprised 80 standard TRIGA fuel elements, the fundamental thermal hydraulic model was investigated during steady state operation using the PARET code. The main objective of this paper is to determine the variation of temperature profiles and the Departure from Nucleate Boiling Ratio (DNBR) of RTP at full power operation. The second objective is to confirm that the values obtained from the PARET code are in agreement with the Safety Analysis Report (SAR) for RTP. The code was employed for the hot and average channels in the core in order to calculate the fuel centreline and surface, cladding, and coolant temperatures, as well as the DNBR values. It was found that the thermal hydraulic safety parameters for the initial core, which was cooled by natural convection, were in agreement with the design values and the safety limits in the SAR.
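
    For orientation, the DNBR figure of merit reduces to a ratio of heat fluxes evaluated along the channel, with the axial minimum compared against the safety limit; a toy sketch with invented profiles (PARET derives the actual fluxes from the core model):

        import numpy as np

        z = np.linspace(0.0, 0.38, 50)                    # axial position (m), invented
        q_local = 4e5 * np.sin(np.pi * z / z[-1])         # local heat flux (W/m^2), invented shape
        q_chf = 1.2e6 * np.ones_like(z)                   # critical heat flux (W/m^2), invented

        # DNBR(z) = q''_CHF(z) / q''_local(z); guard against the zero flux at the ends.
        dnbr = q_chf / np.maximum(q_local, 1.0)
        print("minimum DNBR along the channel:", dnbr.min())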

  5. Fundamental efficiency of limited cone-beam X-ray CT (3DX Multi image micro CT) for practical use

    Energy Technology Data Exchange (ETDEWEB)

    Arai, Yoshinori; Hashimoto, Koji; Iwai, Kazuo; Shinoda, Koji [Nihon Univ., Tokyo (Japan). School of Dentistry

    2000-06-01

    The limited cone-beam CT known as Ortho-CT has been used in clinical examination since 1997. On the basis of this experience, we developed a new limited cone-beam CT for practical use, called "3DX Multi image micro CT" (3DX). The purpose of this study was to introduce this new system by comparing its fundamental efficiency to that of the prototype (Ortho-CT). The skin doses of 3DX and Ortho-CT were measured using TLD on a Rando phantom, and the resolutions of both systems were evaluated with the MTF (modulation transfer function). The subjective image quality was evaluated on the following anatomical landmarks: the inner ear, the temporomandibular joint (TMJ), the maxillary first molar, and the mandibular first molar. Five dental radiologists and two otolaryngologists rated the quality of 3DX images against Ortho-CT images for the same observation points, on a five-point scale ranging from one (inferior) to five (superior). The skin doses were 1.07 mSv with 3DX and 1.19 mSv with Ortho-CT, i.e., almost the same. The resolutions of Ortho-CT were 0.6 line pairs/mm (horizontal) and 0.9 line pairs/mm (vertical) at MTF = 0.5; those of 3DX were 1.1 line pairs/mm (horizontal) and 1.3 line pairs/mm (vertical). The subjective image quality of 3DX was better than that of Ortho-CT at every observation point, with a minimum score of 3.46 and a maximum of 4.17; the differences were significant at every observation point (p<0.05). The images of 3DX show very high resolution compared to those of the prototype system, while the skin dose is almost the same. We think that this system is very useful for the diagnosis of hard tissue

  6. MAKING THE NEIGHBOURHOOD A BETTER PLACE TO LIVE. A SWB APPROACH IMPLEMENTING FUNDAMENTAL HUMAN NEEDS

    Directory of Open Access Journals (Sweden)

    Ioanna Anna Papachristou

    2015-10-01

    Full Text Available Subjective well-being (SWB) studies have been at the centre of researchers' attention in recent years. With the majority of people now living in cities, the need for a more anthropocentric approach to the study and betterment of urban environments is constantly increasing. In this sense, defining and measuring SWB in urban contexts can be of particular benefit to urban design and planning processes. In this article, a method for measuring SWB in urban places, based on the accomplishment of fundamental human needs, is presented and applied to a neighbourhood of Barcelona, Vila de Gràcia. For the measurement, a survey was constructed based on the specific geographical and socio-economic characteristics of the study case. Taken from Max-Neef's Human Scale Development paradigm (Max-Neef et al. 1991), human needs correspond to the domains of study of the suggested method. The matching of the survey's questions to each need is the outcome of two consecutive processes: a qualitative one, involving the work of an expert group, and a quantitative one, involving the definition of weights among the questions that affect the same need. Although the final result for this study case is positive (though low), the results for each need show considerable differences in their level of accomplishment. At the same time, people seem to truly believe that most of their feelings are affected by their living environment, with stress and calmness leading the list. In summary, the method provides a simple tool to quantify and evaluate current levels of SWB at different urban scales and to determine more holistic urban indexes in order to improve decision-making processes, policies and plans. The classification of the questions per need favours the identification of potential problems in the urban grid and can consequently be used as a process for implementing related measures of improvement. The method can also be seen

  7. Simple approach to sediment provenance tracing using element analysis and fundamental principles

    Science.gov (United States)

    Matys Grygar, Tomas; Elznicova, Jitka; Popelka, Jan

    2016-04-01

    Common sediment fingerprinting techniques use either (1) extensive analytical datasets, sometimes nearly complete with respect to accessible characterization techniques, processed by multidimensional statistics that rest on certain assumptions about the distribution functions of the analytical results and the conservativeness/additivity of some components, or (2) analytically demanding characteristics, such as isotope ratios, assumed to be unequivocal "labels" of the parent material unaltered by any catchment process. The inherent problem of approach (1) is that the interpretation of statistical components ("sources") is done ex post and remains purely formal. The problem of approach (2) is that catchment processes (weathering, transport, deposition) can modify most geochemical parameters of soils and sediments; in other words, the idea that some geochemical parameters are "conservative" may be idealistic. Grain-size effects and sediment provenance have a joint influence on the chemical composition of fluvial sediments that is not easy to disentangle. Attempts to separate those two main components using statistics alone seem risky and equivocal, because the grain-size dependence of element composition is nearly element-specific and reflects sediment maturity and catchment-specific formation and transport processes. We suppose that the use of less extensive datasets of analytical results, interpreted with respect for fundamental principles, should be more robust than statistical tools alone applied to overwhelming datasets. We examined sediment compositions, both published by other researchers and gathered by us, and found some general principles which are in our opinion relevant for fingerprinting: (1) concentrations of all elements are grain-size sensitive, i.e., there are no "conservative" elements in the conventional sense of tracing provenance or transport pathways; (2) fractionation by catchment processes and fluvial transport changes

  8. Setting Win Limits: An Alternative Approach to "Responsible Gambling"?

    Science.gov (United States)

    Walker, Douglas M; Litvin, Stephen W; Sobel, Russell S; St-Pierre, Renée A

    2015-09-01

    Social scientists, governments, and the casino industry have all emphasized the need for casino patrons to "gamble responsibly." Strategies for responsible gambling include self-imposed time limits and loss limits on gambling. Such strategies help prevent people from losing more than they can afford and may help prevent excessive gambling behavior. Yet, loss limits also make it more likely that casino patrons leave when they are losing. Oddly, the literature makes no mention of "win limits" as a potential approach to responsible gambling. A win limit would be similar to a loss limit, except the gambler would leave the casino upon reaching a pre-set level of winnings. We anticipate that a self-imposed win limit will reduce the gambler's average loss and, by default, also reduce the casino's profit. We test the effect of a self-imposed win limit by running slot machine simulations in which the treatment group of players has self-imposed and self-enforced win and loss limits, while the control group has a self-imposed loss limit or no limit. We find that the results conform to our expectations: the win limit results in improved player performance and reduced casino profits. Additional research is needed, however, to determine whether win limits could be a useful component of a responsible gambling strategy.
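
    A minimal Monte Carlo sketch of the win-limit comparison (the machine's payback structure and the limits below are invented, not those used in the study):

        import numpy as np

        rng = np.random.default_rng(0)

        # Simplified slot machine with ~90% payback: a $1 spin returns $10
        # with probability 0.09, otherwise nothing.
        def session(win_limit=None, loss_limit=100, bankroll=100, max_spins=1000):
            start = bankroll
            for _ in range(max_spins):
                bankroll -= 1
                if rng.random() < 0.09:
                    bankroll += 10
                if win_limit is not None and bankroll - start >= win_limit:
                    break                 # self-enforced win limit: leave while ahead
                if start - bankroll >= loss_limit or bankroll <= 0:
                    break                 # leave when the loss limit is hit
            return bankroll - start       # net outcome of the session

        for policy in [dict(win_limit=None), dict(win_limit=20)]:
            outcomes = [session(**policy) for _ in range(20000)]
            print(policy, "mean outcome:", np.mean(outcomes))

    Because every spin has negative expected value, stopping sessions early while ahead shortens exposure and shrinks the average loss, which is the direction of the effect reported above.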

  9. [95/95] Approach for design limits analysis in WWER

    International Nuclear Information System (INIS)

    Shishkov, L.; Tsyganov, S.

    2008-01-01

    The paper discusses the well-known [95%/95%] condition, which is important for monitoring certain limits of core parameters in the course of designing reactors (such as PWR or WWER). The condition ensures the postulate that "there is at least a 95% probability at a 95% confidence level that" some parameter does not exceed the limit. Such conditions are stated, for instance, in US standards and IAEA norms as recommendations for DNBR and fuel temperature. A question may arise: why is such an approach to the limits applied only to these parameters, and not normally to any others? And how are the limits ensured in design practice? Using the general statements of mathematical statistics, the authors interpret the [95/95] approach as applied to WWER design limits. (Authors)
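
    One common way such [95/95] statements are demonstrated in practice is Wilks' nonparametric tolerance-limit argument; a minimal sketch (its connection to any specific WWER methodology is an assumption):

        # Wilks' order-statistics result: with n independent code runs, the
        # sample maximum is a one-sided 95%/95% tolerance bound for a
        # parameter once 1 - 0.95**n >= 0.95, regardless of the distribution.
        def wilks_sample_size(coverage=0.95, confidence=0.95):
            n = 1
            while 1 - coverage**n < confidence:
                n += 1
            return n

        print(wilks_sample_size())   # -> 59, the classical first-order answer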

  10. Analysis of Budget Deficits and Macroeconomic Fundamentals: A VAR-VECM Approach

    Directory of Open Access Journals (Sweden)

    Manamba Epaphra

    2017-10-01

    Full Text Available Aim/purpose - This paper examines the relationship between budget deficits and selected macroeconomic variables in Tanzania for the period spanning from 1966 to 2015. Design/methodology/approach - The paper uses Vector Autoregression (VAR) - Vector Error Correction Model (VECM) and variance decomposition techniques. Johansen's test is applied to examine the long-run relationship among the variables under study. Findings - Johansen's cointegration test indicates that the variables are cointegrated and thus have a long-run relationship. The results based on the VAR-VECM estimation show that real GDP and the exchange rate have a negative and significant relationship with the budget deficit, whereas inflation, money supply and the lending interest rate have a positive one. Variance decomposition results show that variances in the budget deficit are mostly explained by real GDP, followed by inflation and the real exchange rate. Research implications/limitations - The results are indicative but highlight the importance of containing inflation and money supply to check their effects on budget deficits over the short and long run. Policy recommendations call for fiscal authorities in Tanzania to adopt efficient and effective methods of tax collection and public sector spending. Originality/value/contribution - Tanzania has been experiencing budget deficits since the 1970s, and these deficits have been blamed for high indebtedness, inflation and poor investment and growth. The paper contributes to the empirical debate on the causal relationship between budget deficits and macroeconomic variables by employing VAR-VECM and variance decomposition approaches.
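
    A sketch of the same VAR-VECM workflow with statsmodels, run here on synthetic stand-in data (the column names, deterministic terms and lag choices are illustrative, not those of the paper):

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

        # Hypothetical stand-in for the annual Tanzanian series.
        df = pd.DataFrame(np.random.default_rng(1).standard_normal((200, 5)),
                          columns=["deficit", "gdp", "inflation", "m2", "fx"])

        # 1) Johansen test for the cointegration rank (constant term, 1 lagged diff):
        jo = coint_johansen(df, det_order=0, k_ar_diff=1)
        rank = sum(jo.lr1 > jo.cvt[:, 1])   # trace statistics vs 5% critical values

        # 2) Fit the VECM at that rank and inspect the long-run (beta) relations.
        res = VECM(df, k_ar_diff=1, coint_rank=max(rank, 1), deterministic="co").fit()
        print(rank, res.beta)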

  11. Fiber Contraction Approaches for Improving CMC Proportional Limit

    Science.gov (United States)

    DiCarlo, James A.; Yun, Hee Mann

    1997-01-01

    The fact that the service life of ceramic matrix composites (CMC) decreases dramatically at stresses above the CMC proportional limit has triggered a variety of research activities to develop microstructural approaches that can significantly improve this limit. As discussed in a previous report, both local and global approaches exist for hindering the propagation of cracks through the CMC matrix, the physical source of the proportional limit. Local approaches include: (1) minimizing fiber diameter and matrix modulus; (2) maximizing fiber volume fraction, fiber modulus, and matrix toughness; and (3) optimizing fiber-matrix interfacial shear strength; all of which should reduce the stress concentration at the tips of cracks pre-existing in, or created in, the matrix during CMC service. Global approaches, as with pre-stressed concrete, center on seeking mechanisms for using the reinforcing fiber to subject the matrix to in-situ compressive stresses which remain stable during CMC service. Demonstrated CMC examples of the viability of this residual stress approach are based on strain mismatches between the fiber and matrix in their free states, such as thermal expansion mismatch and creep mismatch. However, these particular mismatch approaches are application-limited in that the residual stresses from expansion mismatch are optimal only at low CMC service temperatures, and the residual stresses from creep mismatch are typically unidirectional and difficult to implement in complex-shaped CMC.

  12. Bayesian analysis of rotating machines - A statistical approach to estimate and track the fundamental frequency

    DEFF Research Database (Denmark)

    Pedersen, Thorkild Find

    2003-01-01

    Rotating and reciprocating mechanical machines emit acoustic noise and vibrations when they operate. Typically, the noise and vibrations are concentrated in narrow frequency bands related to the running speed of the machine. The frequency of the running speed is referred to as the fundamental frequency, and the related frequencies as orders of the fundamental frequency. When analyzing rotating or reciprocating machines it is important to know the running speed. Usually this requires direct access to the rotating parts in order to mount a dedicated tachometer probe. In this thesis different...
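
    Not the thesis's estimator, but a minimal grid "posterior" over candidate fundamental frequencies illustrates the idea of scoring a fundamental jointly with its orders (all parameters and the flat prior are illustrative assumptions):

        import numpy as np

        def f0_posterior(signal, fs, f0_grid, n_orders=5):
            # Magnitude spectrum of one vibration/noise frame.
            spec = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
            log_like = np.zeros(len(f0_grid))
            for i, f0 in enumerate(f0_grid):
                # Collect spectral energy at the first few orders of f0.
                idx = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, n_orders + 1)]
                log_like[i] = np.sum(np.log(spec[idx] + 1e-12))
            post = np.exp(log_like - log_like.max())   # flat prior assumed
            return post / post.sum()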

  13. THE NECESSITY OF APPROACHING THE ENTERPRISE PERFORMANCE CONCEPT THROUGH A THEORETICAL FUNDAMENTAL SYSTEM

    Directory of Open Access Journals (Sweden)

    DEAC VERONICA

    2017-10-01

    Full Text Available The purpose of this paper is to justify the necessity of building a theoretical-fundamental system to define and delimit the integrated notions applicable to the concept of enterprise performance. As a piece of fundamental research, the present paper argues that both the literature in this field and the applied environment require a clearer segregation, and a greater specificity, of the concept of "enterprise performance", considering, on the one hand, that it is not unanimously defined and, on the other, that it represents a widely used key concept which ultimately has to be measured in order to be useful. The paper should also be of use to scholars working in the field of firm performance who wish to understand this concept and to develop future research on enterprise performance measurement.

  14. Fundamental ecology is fundamental.

    Science.gov (United States)

    Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E

    2015-01-01

    The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we set out why we believe it is essential to promote basic ecological research, despite the increased impetus for ecologists to conduct and present their research in the light of potential applications. This includes the understanding of our environment, for intellectual, economic, social, and political reasons, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences.

  15. Enhanced defence in depth: a fundamental approach for innovative nuclear systems recommended by INPRO

    International Nuclear Information System (INIS)

    Kuczera, B.; Juhn, P.E.

    2004-01-01

    In May 2001, the IAEA initiated the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). Bearing in mind that nuclear power will be an important option for meeting future electricity needs, the scope of INPRO covers nuclear reactors expected to come into service in the next fifty years, together with their associated fuel cycles. This article deals with the enhanced defence-in-depth (DID) strategy recommended by INPRO. This strategy is twofold: first, to prevent accidents and, second, if prevention fails, to limit their potential consequences and prevent any evolution to more serious conditions. Accident prevention is the first priority, and for innovative nuclear systems the effectiveness of preventive measures should be enhanced compared with existing systems. DID is generally structured in five levels of protection, including successive barriers preventing the release of radioactive material to the environment: 1) prevention of abnormal operation and failures; 2) control of abnormal operation and detection of failures; 3) control of accidents within the design basis; 4) control of severe plant conditions, including prevention and mitigation of the consequences of severe accidents; and 5) mitigation of the radiological consequences of significant releases of radioactive materials. In the area of nuclear safety, INPRO has set five principles: 1) incorporate DID as part of the safety approach and make the five levels of DID more independent from each other than in current installations; 2) prevent, reduce or contain releases of radioactive or hazardous materials in any normal or abnormal plant operation; 3) place increased emphasis on inherent safety characteristics and passive safety features; 4) include research and development work to bring the capability of computer codes used for the safety of innovative nuclear systems to the standard of codes used for the safety of current reactors; and 5) include a holistic life

  16. The pair potential approach for interfaces: Fundamental problems and practical solutions

    International Nuclear Information System (INIS)

    Maggs, A.C.; Ashcroft, N.W.

    1987-09-01

    A fundamental problem in the use of a central pair-force model for defect problems is that it omits three-body and higher terms which are necessarily present in real systems. Electronic fluctuation effects are also usually omitted. While these can be small in the simple metals, they are significant in the noble and transition metals, as shown by a simple real-space argument. To gauge the importance of their effects in interface problems, the structure of a simple Σ5 twist boundary is examined, with the atoms described by both pair- and three-center interactions, as a function of the relative strength of the two. 15 refs.

  17. Stability of rigid rotors supported by air foil bearings: Comparison of two fundamental approaches

    DEFF Research Database (Denmark)

    Larsen, Jon Steffen; Santos, Ilmar; von Osmanski, Alexander Sebastian

    2016-01-01

    High speed direct drive motors enable the use of Air Foil Bearings (AFB) in a wide range of applications due to the elimination of gear forces. Unfortunately, AFB supported rotors are lightly damped, and an accurate prediction of their Onset Speed of Instability (OSI) is therefore important. This paper compares two fundamental methods for predicting the OSI. One is based on a nonlinear time domain simulation and another is based on a linearised frequency domain method and a perturbation of the Reynolds equation. Both methods are based on equivalent models and should predict similar results.

  18. Fundamental length

    International Nuclear Information System (INIS)

    Pradhan, T.

    1975-01-01

    The concept of a fundamental length was first put forward by Heisenberg on purely dimensional grounds. From a study of the observed masses of the elementary particles known at that time, it is surmised that this length should be of the order of magnitude l ≈ 10⁻¹³ cm. It was Heisenberg's belief that the introduction of such a fundamental length would eliminate the divergence difficulties from relativistic quantum field theory by cutting off the high energy regions of the 'proper fields'. Since the divergence difficulties arise primarily from an infinite number of degrees of freedom, one simple remedy would be the introduction of a principle that limits these degrees of freedom by removing the effectiveness of waves with a frequency exceeding a certain limit, without destroying the relativistic invariance of the theory. The principle can be stated as follows: it is in principle impossible to devise an experiment of any kind that will permit a distinction between the positions of two particles at rest, the distance between which is below a certain limit. A more elegant way of introducing a fundamental length into quantum theory is through commutation relations between two position operators. In quantum field theory, such as quantum electrodynamics, it can be introduced through the commutation relation between two interpolating photon fields (vector potentials). (K.B.)

  19. Arguing against fundamentality

    Science.gov (United States)

    McKenzie, Kerry

    This paper aims to open up discussion on the relationship between fundamentality and naturalism, and in particular on the question of whether fundamentality may be denied on naturalistic grounds. A historico-inductive argument for an anti-fundamentalist conclusion, prominent within the contemporary metaphysical literature, is examined; finding it wanting, an alternative 'internal' strategy is proposed. By means of an example from the history of modern physics - namely S-matrix theory - it is demonstrated that (1) this strategy can generate similar (though not identical) anti-fundamentalist conclusions on more defensible naturalistic grounds, and (2) that fundamentality questions can be empirical questions. Some implications and limitations of the proposed approach are discussed.

  20. Fundamental (f) oscillations in a magnetically coupled solar interior-atmosphere system - An analytical approach

    Science.gov (United States)

    Pintér, Balázs; Erdélyi, R.

    2018-01-01

    Solar fundamental (f) acoustic mode oscillations are investigated analytically in a magnetohydrodynamic (MHD) model. The model consists of three layers in planar geometry, representing the solar interior, the magnetic atmosphere, and a transitional layer sandwiched between them. Since we focus on the fundamental mode here, we assume the plasma is incompressible. A horizontal, canopy-like magnetic field is introduced into the atmosphere, in which degenerate slow MHD waves can exist. The global (f-mode) oscillations can couple to local atmospheric Alfvén waves, resulting, e.g., in a frequency shift of the oscillations. The dispersion relation of the global oscillation mode is derived, and is solved analytically in the thin-transitional-layer and weak-field approximations. Analytical formulae are also provided for the frequency shifts due to the presence of a thin transitional layer and a weak atmospheric magnetic field. The analytical results generally indicate that, compared to the fundamental value (ω = √(gk)), the mode frequency is reduced by the presence of an atmosphere by a few per cent, and a thin transitional layer reduces the eigenfrequencies by about an additional hundred microhertz. Finally, a weak atmospheric magnetic field can slightly, by a few per cent, increase the frequency of the eigenmode; stronger magnetic fields, however, can increase the f-mode frequency by up to ten per cent, which is not seen in observed data. The presence of a magnetic atmosphere in the three-layer model also introduces non-permitted propagation windows in the frequency spectrum, where f-mode oscillations cannot exist for certain values of the harmonic degree. The eigenfrequencies can be sensitive to the background physical parameters, such as the atmospheric density scale-height or the rate of the plasma density drop at the photosphere. Such information, if ever observed with high-resolution instrumentation and inverted, could help to
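
    For orientation, the unperturbed frequencies against which these shifts are measured follow directly from ω = √(gk) with k = √(l(l+1))/R; a quick numerical check (solar values rounded):

        import numpy as np

        G_SUN = 274.0    # solar surface gravity, m/s^2
        R_SUN = 6.96e8   # solar radius, m

        def f_mode_freq_mHz(l):
            # Horizontal wavenumber for harmonic degree l, then omega = sqrt(g*k).
            k = np.sqrt(l * (l + 1)) / R_SUN
            return np.sqrt(G_SUN * k) / (2 * np.pi) * 1e3

        for l in (200, 600, 1000):
            print(l, round(f_mode_freq_mHz(l), 3))   # ~1.4, ~2.4, ~3.2 mHz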

  1. Stability of rigid rotors supported by air foil bearings: Comparison of two fundamental approaches

    Science.gov (United States)

    Larsen, Jon S.; Santos, Ilmar F.; von Osmanski, Sebastian

    2016-10-01

    High speed direct drive motors enable the use of Air Foil Bearings (AFB) in a wide range of applications due to the elimination of gear forces. Unfortunately, AFB supported rotors are lightly damped, and an accurate prediction of their Onset Speed of Instability (OSI) is therefore important. This paper compares two fundamental methods for predicting the OSI. One is based on a nonlinear time domain simulation and another is based on a linearised frequency domain method and a perturbation of the Reynolds equation. Both methods are based on equivalent models and should predict similar results. Significant discrepancies are observed leading to the question, is the classical frequency domain method sufficiently accurate? The discrepancies and possible explanations are discussed in detail.

  2. A Unique Mathematical Derivation of the Fundamental Laws of Nature Based on a New Algebraic-Axiomatic (Matrix Approach

    Directory of Open Access Journals (Sweden)

    Ramin Zahedi

    2017-09-01

    Full Text Available In this article, as a new mathematical approach to the origin of the laws of nature, using a new basic algebraic axiomatic (matrix) formalism based on ring theory and Clifford algebras (presented in Section 2), "it is shown that certain mathematical forms of the fundamental laws of nature, including the laws governing the fundamental forces of nature (represented by a set of two definite classes of general covariant massive field equations, with new matrix formalisms), are derived uniquely from only a very few axioms." In agreement with the rational Lorentz group, it is also basically assumed that the components of relativistic energy-momentum can only take rational values. In essence, the main scheme of this new mathematical axiomatic approach to the fundamental laws of nature is as follows: first, based on the assumption of the rationality of the D-momentum, and by linearization (along with a parameterization procedure) of the Lorentz invariant energy-momentum quadratic relation, a unique set of Lorentz invariant systems of homogeneous linear equations (with matrix formalisms compatible with certain Clifford and symmetric algebras) is derived. Then, by an initial quantization (followed by a basic procedure of minimal coupling to space-time geometry) of these determined systems of linear equations, a set of two classes of general covariant massive (tensor) field equations (with matrix formalisms compatible with certain Clifford and Weyl algebras) is derived uniquely as well.

  3. Social use of alcohol among adolescent offenders: a fundamental approach toward human needs

    Directory of Open Access Journals (Sweden)

    Gustavo D'Andrea

    2014-02-01

    Full Text Available This study examined some basic health care approaches toward human needs, with a particular focus on nursing. We aimed to incorporate these approaches into the discussion of the mental health of adolescent offenders who consume alcohol. We discuss the specific needs of the delinquent group, critique policies that prioritize coercion of adolescent offenders, and consider the role that nurses could play in the sphere of juvenile delinquency.

  4. Fundamental principles of the cultural-activity approach in the psychology of giftedness

    OpenAIRE

    Babaeva, Julia

    2013-01-01

    This article examines the cultural-activity approach to the study of giftedness, which is based on the ideas of L. S. Vygotsky, A. N. Leontiev, and O. K. Tikhomirov. Three basic principles of this approach are described: the principle of polymorphism, the dynamic principle, and the principle of the holistic analysis of the giftedness phenomenon. The article introduces the results of empirical research (including a 10-year longitudinal study), which verifies the efficacy of the cultural-activi...

  5. Limits to the role of a common fundamental frequency in the fusion of two sounds with different spatial cues

    Science.gov (United States)

    Darwin, C. J.; Hukin, R. W.

    2004-07-01

    Two experiments establish constraints on the ability of a common fundamental frequency (F0) to perceptually fuse low-pass filtered and complementary high-pass filtered speech presented to different ears. In experiment 1 the filter cut-off is set at 1 kHz. When the filters are sharp, giving little overlap in frequency between the two sounds, listeners report hearing two sounds even when the sounds at the two ears are on the same F0. Shallower filters give more fusion. In experiment 2, the filters' cut-off frequency is varied together with their slope. Fusion becomes more frequent when the signals at the two ears share low-frequency components. This constraint mirrors the natural filtering by head-shadow of sound sources presented to one side. The mechanisms underlying perceptual fusion may thus be similar to those underlying auditory localization.

  6. Treatment for spasmodic dysphonia: limitations of current approaches

    Science.gov (United States)

    Ludlow, Christy L.

    2009-01-01

    Purpose of review: Although botulinum toxin injection is the gold standard for treatment of spasmodic dysphonia, surgical approaches aimed at providing long-term symptom control have been advancing over recent years. Recent findings: While surgical approaches provide greater long-term benefits for symptom control, they also increase the initial period of side effects of breathiness and swallowing difficulties. However, recent analyses of quality-of-life questionnaires in patients undergoing regular injections of botulinum toxin demonstrate that a large proportion of patients have limited relief for relatively short periods, due to early breathiness and loss of benefit before reinjection. Summary: Most medical and surgical approaches to the treatment of spasmodic dysphonia have been aimed at denervation of the laryngeal muscles to block symptom expression in the voice, and have adverse effects as well as treatment benefits. Research is needed to identify the central neuropathophysiology responsible for the laryngeal muscle spasms, in order to target treatment towards the central neurological abnormality that produces the symptoms. PMID:19337127

  7. Lean flammability limit as a fundamental refrigerant property: Phase 3. Final technical report, February 1997--February 1998

    Energy Technology Data Exchange (ETDEWEB)

    Grosshandler, W.; Donnelly, M.; Womeldorf, C.

    1998-08-01

    Alternative refrigerants are being developed by industry to prevent the further destruction of stratospheric ozone by chlorofluorocarbons (CFCs), which had been the working fluids of choice for many air-conditioning and refrigeration machines. Hydrofluorocarbons (HFCs) are one class of compounds being pursued as replacements because their ozone depletion potential is zero. In general, the exchange of fluorine atoms on an HFC molecule with hydrogen atoms decreases its atmospheric lifetime, and it may also increase the efficiency of the working fluid. Both of these effects are highly desirable from environmental considerations, since they act to mitigate global warming. Unfortunately, more hydrogen on an HFC is usually associated with an increase in flammability. An accepted method for determining the flammability limits of gaseous fuels is ASTM Standard E 681: the minimum and maximum concentrations of the fuel in air for flame propagation are based upon the observed ignition and growth of a flame in a vessel filled with a quiescent fuel/air mixture. A clear distinction is sought between a non-propagating flicker and a flame which has enough horizontal propagation to be hazardous. This report reviews past work on premixed, counter-flowing flames, describes the current counter-flow burner facility and operating procedures, presents the experimental results with the analysis that yields the above flammability limits, and recommends further activities that could lead to a science-based methodology for assessing the risk of fire from refrigeration machine working fluids. 30 figs.

  8. Observation of the fundamental Nyquist noise limit in an ultra-high Q-factor cryogenic bulk acoustic wave cavity

    Energy Technology Data Exchange (ETDEWEB)

    Goryachev, Maxim, E-mail: maxim.goryachev@uwa.edu.au; Ivanov, Eugene N.; Tobar, Michael E. [ARC Centre of Excellence for Engineered Quantum Systems, University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia); Kann, Frank van [School of Physics, University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia); Galliou, Serge [Department of Time and Frequency, FEMTO-ST Institute, ENSMM, 26 Chemin de l' Épitaphe, 25000 Besançon (France)

    2014-10-13

    Thermal Nyquist noise fluctuations of high-Q bulk acoustic wave cavities have been observed at cryogenic temperatures with a DC superconducting quantum interference device (SQUID) amplifier. High-Q modes with bandwidths of a few tens of millihertz produce thermal fluctuations with a signal-to-noise ratio of up to 23 dB. The effective temperature estimated from the Nyquist noise is in good agreement with the physical temperature of the device, confirming the validity of the equivalent circuit model and the non-existence of any excess resonator self-noise. The measurements also confirm that the quality factor remains extremely high (Q > 10⁸ at low order overtones) for very weak (thermal) system motion at low temperatures, when compared to values measured with relatively strong external excitation. This result represents an enabling step towards operating such a high-Q acoustic device at the standard quantum limit.
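
    The underlying Nyquist relation is elementary; a sketch with illustrative numbers (not the parameters of the experiment above):

        import numpy as np

        KB = 1.380649e-23   # Boltzmann constant, J/K

        # Nyquist noise: the voltage spectral density across a resistance R at
        # temperature T is S_v = 4*kB*T*R, so the rms voltage in a bandwidth B
        # is sqrt(4*kB*T*R*B).
        def nyquist_vrms(R_ohm, T_kelvin, bandwidth_hz):
            return np.sqrt(4 * KB * T_kelvin * R_ohm * bandwidth_hz)

        # e.g. an effective motional resistance of 10 ohm at 4 K, read out over
        # a 50 mHz mode bandwidth (invented values):
        print(nyquist_vrms(10.0, 4.0, 50e-3))   # ~1e-11 V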

  9. Hand-made cloning approach: potentials and limitations.

    Science.gov (United States)

    Vajta, G; Kragh, P M; Mtango, N R; Callesen, H

    2005-01-01

    Two major drawbacks hamper the advancement of somatic cell nuclear transfer in domestic animals. The first is a biological problem that has been studied extensively by many scientists and from many viewpoints, including the cell, molecular and developmental biology, morphology, biochemistry and tissue culture. The second is a technical problem that may be responsible for 50% or more of quantitative and/or qualitative failures of routine cloning experiments and is partially the result of the demanding and complicated procedure. However, even the relatively rare documented efforts focusing on technique are usually restricted to details and accept the principles of the micromanipulator-based approach, with its inherent limitations. Over the past decade, a small alternative group of procedures, called hand-made cloning (HMC), has emerged that has the common feature of removal of the zona pellucida prior to enucleation and fusion, resulting in a limited (or no) requirement for micromanipulators. The benefits of HMC are low equipment costs, a simple and rapid procedure and an in vitro efficiency comparable with or higher than that of traditional nuclear transfer. Embryos created by the zona-free techniques can be cryopreserved and, although data are still sparse, are capable of establishing pregnancies and resulting in the birth of calves. Hand-made cloning may also open the way to partial or full automation of somatic cell nuclear transfer. Consequently, the zona- and micromanipulator-free approach may become a useful alternative to traditional cloning, either in special situations or generally for the standardisation and widespread application of somatic cell nuclear transfer.

  10. 100 nm scale low-noise sensors based on aligned carbon nanotube networks: overcoming the fundamental limitation of network-based sensors

    International Nuclear Information System (INIS)

    Lee, Minbaek; Lee, Joohyung; Kim, Tae Hyun; Lee, Hyungwoo; Lee, Byung Yang; Hong, Seunghun; Park, June; Seong, Maeng-Je; Jhon, Young Min

    2010-01-01

    Nanoscale sensors based on single-walled carbon nanotube (SWNT) networks have been considered impractical due to several fundamental limitations such as a poor sensitivity and small signal-to-noise ratio. Herein, we present a strategy to overcome these fundamental problems and build highly-sensitive low-noise nanoscale sensors simply by controlling the structure of the SWNT networks. In this strategy, we prepared nanoscale width channels based on aligned SWNT networks using a directed assembly strategy. Significantly, the aligned network-based sensors with narrower channels exhibited even better signal-to-noise ratio than those with wider channels, which is opposite to conventional random network-based sensors. As a proof of concept, we demonstrated 100 nm scale low-noise sensors to detect mercury ions with the detection limit of ∼1 pM, which is superior to any state-of-the-art portable detection system and is below the allowable limit of mercury ions in drinking water set by most government environmental protection agencies. This is the first demonstration of 100 nm scale low-noise sensors based on SWNT networks. Considering the increased interests in high-density sensor arrays for healthcare and environmental protection, our strategy should have a significant impact on various industrial applications.

  11. 100 nm scale low-noise sensors based on aligned carbon nanotube networks: overcoming the fundamental limitation of network-based sensors

    Science.gov (United States)

    Lee, Minbaek; Lee, Joohyung; Kim, Tae Hyun; Lee, Hyungwoo; Lee, Byung Yang; Park, June; Jhon, Young Min; Seong, Maeng-Je; Hong, Seunghun

    2010-02-01

    Nanoscale sensors based on single-walled carbon nanotube (SWNT) networks have been considered impractical due to several fundamental limitations such as a poor sensitivity and small signal-to-noise ratio. Herein, we present a strategy to overcome these fundamental problems and build highly-sensitive low-noise nanoscale sensors simply by controlling the structure of the SWNT networks. In this strategy, we prepared nanoscale width channels based on aligned SWNT networks using a directed assembly strategy. Significantly, the aligned network-based sensors with narrower channels exhibited even better signal-to-noise ratio than those with wider channels, which is opposite to conventional random network-based sensors. As a proof of concept, we demonstrated 100 nm scale low-noise sensors to detect mercury ions with the detection limit of ~1 pM, which is superior to any state-of-the-art portable detection system and is below the allowable limit of mercury ions in drinking water set by most government environmental protection agencies. This is the first demonstration of 100 nm scale low-noise sensors based on SWNT networks. Considering the increased interests in high-density sensor arrays for healthcare and environmental protection, our strategy should have a significant impact on various industrial applications.

  12. Cancer: fundamentals behind pH targeting and the double-edged approach

    Science.gov (United States)

    Koltai, Tomas

    2016-01-01

    targeting this vulnerable side of cancer development. It also analyzes the double-edged approach, which consists in pharmacologically increasing intracellular proton production and simultaneously decreasing proton extrusion creating intracellular acidity, acid stress, and eventual apoptosis. PMID:27799782

  13. A probabilistic analysis reveals fundamental limitations with the environmental impact quotient and similar systems for rating pesticide risks

    Directory of Open Access Journals (Sweden)

    Robert K.D. Peterson

    2014-04-01

    Full Text Available Comparing risks among pesticides has substantial utility for decision makers. However, if rating schemes are to be used to compare risks, they must be conceptually and mathematically sound. We address the limitations of pesticide risk rating schemes by examining in particular the Environmental Impact Quotient (EIQ) using, for the first time, a probabilistic analytic technique. To demonstrate the consequences of mapping discrete risk ratings to probabilities, adjusted EIQs were calculated for a group of 20 insecticides in four chemical classes. Using Monte Carlo simulation, adjusted EIQs were determined under different hypothetical scenarios by incorporating probability ranges. The analysis revealed that pesticides that have different EIQs, and therefore different putative environmental effects, may actually be no different once uncertainty is incorporated. The EIQ equation, as structured, cannot take uncertainty into account and provide reliable quotients of pesticide impact. The EIQ is also inconsistent with the accepted notion of risk as a joint probability of toxicity and exposure. Our results therefore suggest that the EIQ and other similar schemes be discontinued in favor of conceptually sound schemes to estimate risk that rely on proper integration of toxicity and exposure information.
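
    The core of the probabilistic argument can be sketched in a few lines: map each discrete rating to a plausible range, sample, and compare two products' score distributions (the ratings, ranges and aggregation rule below are invented and far simpler than the actual EIQ formula):

        import numpy as np

        rng = np.random.default_rng(42)

        def sample_score(ratings, n=100_000):
            # Each discrete 1/3/5-style rating r is treated as uniform on
            # [r-1, r+1] to represent the uncertainty hidden by the rating.
            draws = [rng.uniform(r - 1, r + 1, n) for r in ratings]
            return np.mean(draws, axis=0)   # toy aggregation, not the EIQ equation

        a = sample_score([3, 5, 1, 3])      # insecticide A's ratings (invented)
        b = sample_score([3, 3, 3, 3])      # insecticide B's ratings (invented)
        # If this probability is near 0.5, the two "different" scores are
        # statistically indistinguishable once uncertainty is included.
        print("P(score_A > score_B) =", np.mean(a > b))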

  14. The Principles of Proportionality, Legal Argumentation and the Discretionary Power of the Public Administration: An Analysis from the Limits on Fundamental Rights and Guarantees

    Directory of Open Access Journals (Sweden)

    Yezid Carrillo-de la Rosa

    2017-06-01

Full Text Available This paper examines the implications of the principle of proportionality with regard to administrative decisions that limit civil liberties and fundamental rights. The hypothesis we intend to demonstrate is that the discretionary power of the Public Administration to issue measures that restrict individual rights and liberties is only apparent, since the reach of agency discretion in choosing time, means and place conditions is very narrow. As the following research shows, the principle of proportionality obliges administrative agencies to implement effective means to attain the purposes of their intervention while minimizing the impact on constitutionally protected rights and liberties.

  15. Argüição de descumprimento de preceito fundamental : limites e finalidades do Instituto no Direito Constitucional Brasileiro

    OpenAIRE

    Santos, Marcos André Couto

    2003-01-01

The objective of this dissertation is to assess the constitutional limits and purposes of the Argüição de Descumprimento de Preceito Fundamental (claim of non-compliance with a fundamental precept), a specific instrument of concentrated constitutionality review in Brazilian positive constitutional law, currently provided for in paragraph 1 of article 102 of the 1988 Federal Constitution and regulated by Federal Law No. 9.882/99. This eminently constitutional study is justified by the need to attest...

  16. Fundamental Limitations on Image Restoration

    Science.gov (United States)

    1975-04-01


  17. Entropy-limited hydrodynamics: a novel approach to relativistic hydrodynamics

    Science.gov (United States)

    Guercilena, Federico; Radice, David; Rezzolla, Luciano

    2017-07-01

We present entropy-limited hydrodynamics (ELH): a new approach for the computation of numerical fluxes arising in the discretization of hyperbolic equations in conservation form. ELH is based on the hybridisation of an unfiltered high-order scheme with the first-order Lax-Friedrichs method. The activation of the low-order part of the scheme is driven by a measure of the locally generated entropy inspired by the artificial-viscosity method proposed by Guermond et al. (J. Comput. Phys. 230(11):4248-4267, 2011, doi: 10.1016/j.jcp.2010.11.043). Here, we present ELH in the context of high-order finite-differencing methods and of the equations of general-relativistic hydrodynamics. We study the performance of ELH in a series of classical astrophysical tests in general relativity involving isolated, rotating and nonrotating neutron stars, and including a case of gravitational collapse to black hole. We present a detailed comparison of ELH with the fifth-order monotonicity preserving method MP5 (Suresh and Huynh in J. Comput. Phys. 136(1):83-99, 1997, doi: 10.1006/jcph.1997.5745), one of the most common high-order schemes currently employed in numerical-relativity simulations. We find that ELH achieves comparable and, in many of the cases studied here, better accuracy than more traditional methods at a fraction of the computational cost (up to ~50% speedup). Given its accuracy and its simplicity of implementation, ELH is a promising framework for the development of new special- and general-relativistic hydrodynamics codes well adapted for massively parallel supercomputers.
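
A schematic sketch of the hybridisation idea for the scalar Burgers equation, u_t + (u^2/2)_x = 0: a second-order Lax-Wendroff flux stands in for the unfiltered high-order scheme and is blended with first-order Lax-Friedrichs according to a discrete entropy-production indicator. The indicator, the blending constant c_e, and the grid parameters are assumptions of this sketch, not the published ELH algorithm.

```python
import numpy as np

def elh_step(u, dx, dt, c_e=5.0):
    """One step for Burgers' equation on a periodic grid, blending a
    Lax-Wendroff flux (oscillatory near shocks) with Lax-Friedrichs
    (diffusive), weighted by a local entropy-production indicator."""
    f = 0.5 * u ** 2
    up, um = np.roll(u, -1), np.roll(u, 1)
    fp, fm = np.roll(f, -1), np.roll(f, 1)
    alpha = np.abs(u).max()                      # max characteristic speed
    a_face = 0.5 * (u + up)
    flux_ho = 0.5 * (f + fp) - 0.5 * (dt / dx) * a_face * (fp - f)  # Lax-Wendroff
    flux_lf = 0.5 * (f + fp) - 0.5 * alpha * (up - u)               # Lax-Friedrichs
    # Entropy residual for eta = u^2/2, q = u^3/3: R = q_x - u*f_x vanishes
    # for smooth flows and grows where entropy is produced (shocks).
    dq = (up ** 3 - um ** 3) / (6.0 * dx)
    df = (fp - fm) / (2.0 * dx)
    R = np.abs(dq - u * df)
    theta = np.minimum(1.0, c_e * R / (R.max() + 1e-30))
    theta_face = np.maximum(theta, np.roll(theta, -1))  # max over each interface
    flux = (1.0 - theta_face) * flux_ho + theta_face * flux_lf
    return u - dt / dx * (flux - np.roll(flux, 1))

# Example: a sine wave steepening into a shock (forms around t = 1).
x = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
dx = x[1] - x[0]
u = np.sin(x)
for _ in range(300):
    u = elh_step(u, dx, dt=0.3 * dx / max(np.abs(u).max(), 1e-12))
print("after the shock forms: min %.3f, max %.3f" % (u.min(), u.max()))
```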

  18. Population pressure on coral atolls: trends and approaching limits.

    Science.gov (United States)

    Rapaport, M

    1990-09-01

Trends and approaching limits of population pressure on coral atolls are discussed by examining the atoll environment in terms of physical geography, production systems, and resource distribution. Atoll populations are grouped as dependent and independent, and demographic trends in population growth, migration, urbanization, and political dependency are reviewed. Examination of the carrying capacity includes a dynamic model, the influences of the West, and philosophical considerations. The carrying capacity is the "maximal population supportable in a given area". Traditional models are criticized for failing to account for external linkages. The proposed model is dynamic and considers perceived needs and overseas linkages. It also explains regional disparities in population distribution, and provides a continuing model for population movement from outer islands to district centers and mainland areas. Because of increased expectations and perceived needs, there is a lower carrying capacity for outlying areas, and expanded capacity in district centers. This leads to urbanization, emigration, and carrying capacity overshoot in regional and mainland areas. Policy intervention is necessary at the regional and island community level. Atolls, which are islands surrounding deep lagoons, exist in archipelagoes across the oceans, and are rich in aquatic life. The balance in this small land area with a vulnerable ecosystem may be easily disturbed by scarce water supplies, barren soils, rising sea levels in the future, hurricanes, and tsunamis. Traditionally, fisheries and horticulture (pit-taro, coconuts, and breadfruit) have sustained populations, but modern influences such as blasting, reef mining, new industrial technologies, population pressure, and urbanization threaten the balance. Population pressure, which has led to pollution, epidemics, malnutrition, crime, social disintegration, and foreign dependence, is evidenced in the areas of Tuvalu, Kiribati

  19. 33 CFR 401.52 - Limit of approach to a bridge.

    Science.gov (United States)

    2010-07-01

§ 401.52 Limit of approach to a bridge. (a) No vessel shall pass the limit of approach sign at any movable bridge until the bridge is in a fully open position and the signal light shows green. (b) No vessel shall pass the limit...

  20. Whole-genome sequencing approaches for conservation biology: Advantages, limitations and practical recommendations.

    Science.gov (United States)

    Fuentes-Pardo, Angela P; Ruzzante, Daniel E

    2017-10-01

    Whole-genome resequencing (WGR) is a powerful method for addressing fundamental evolutionary biology questions that have not been fully resolved using traditional methods. WGR includes four approaches: the sequencing of individuals to a high depth of coverage with either unresolved or resolved haplotypes, the sequencing of population genomes to a high depth by mixing equimolar amounts of unlabelled-individual DNA (Pool-seq) and the sequencing of multiple individuals from a population to a low depth (lcWGR). These techniques require the availability of a reference genome. This, along with the still high cost of shotgun sequencing and the large demand for computing resources and storage, has limited their implementation in nonmodel species with scarce genomic resources and in fields such as conservation biology. Our goal here is to describe the various WGR methods, their pros and cons and potential applications in conservation biology. WGR offers an unprecedented marker density and surveys a wide diversity of genetic variations not limited to single nucleotide polymorphisms (e.g., structural variants and mutations in regulatory elements), increasing their power for the detection of signatures of selection and local adaptation as well as for the identification of the genetic basis of phenotypic traits and diseases. Currently, though, no single WGR approach fulfils all requirements of conservation genetics, and each method has its own limitations and sources of potential bias. We discuss proposed ways to minimize such biases. We envision a not distant future where the analysis of whole genomes becomes a routine task in many nonmodel species and fields including conservation biology. © 2017 John Wiley & Sons Ltd.
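
The trade-off between the WGR designs surveyed above is ultimately a sequencing-budget calculation. A back-of-the-envelope sketch using Lander-Waterman-style coverage arithmetic follows; the genome size and per-lane yield are hypothetical placeholders, not values from the review.

```python
import math

genome_size_bp = 1.0e9     # assumed reference genome size, bases
yield_per_lane = 3.0e11    # assumed usable bases per sequencing lane

def lanes_needed(n_individuals, depth):
    """Lanes required to sequence n individuals at a target mean depth.
    Coverage = total bases sequenced / genome size (Lander-Waterman)."""
    total_bases = n_individuals * depth * genome_size_bp
    return math.ceil(total_bases / yield_per_lane)

print("high-depth WGR, 30x, 50 individuals:", lanes_needed(50, 30), "lanes")
print("lcWGR, 2x, 500 individuals:         ", lanes_needed(500, 2), "lanes")
print("Pool-seq, 100x, one pooled library: ", lanes_needed(1, 100), "lanes")
```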

  1. Credit card spending limit and personal finance: system dynamics approach

    Directory of Open Access Journals (Sweden)

    Mirjana Pejić Bach

    2014-03-01

Full Text Available Credit cards have become one of the major ways of conducting cashless transactions. However, they have a long-term impact on the well-being of their owners through the debt generated by credit card usage. Credit card issuers approve high credit limits for credit card owners, thereby influencing their credit burden. A system dynamics model has been used to model the behavior of a credit card owner in different scenarios according to the size of the credit limit. Experiments with the model demonstrated that a higher credit limit approved on the credit card decreases the budget available for spending in the long run, as illustrated by the sketch below. This contributes toward evaluating credit limit control actions on the basis of their consequences.
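
A minimal stock-and-flow simulation in the spirit of the model described above. The structure (monthly interest, a minimum payment, spending proportional to remaining credit head-room) and all parameter values are assumptions for illustration only, not the authors' model; whether the toy reproduces the paper's finding depends on these assumptions.

```python
def simulate(credit_limit, months=120, income=2000.0, apr=0.24):
    """Simulate a card holder: debt is the stock; interest, repayment and
    new spending are the flows. Returns final debt and cumulative spending."""
    debt, spent_total = 0.0, 0.0
    for _ in range(months):
        interest = debt * apr / 12.0
        repayment = min(debt + interest, max(0.03 * debt, 25.0))  # minimum payment
        debt += interest - repayment
        budget = income - repayment                    # cash left after servicing debt
        spending = max(0.0, min(budget, 0.1 * (credit_limit - debt)))
        debt += spending                               # purchases go on the card
        spent_total += spending
    return debt, spent_total

for limit in (1000.0, 5000.0, 20000.0):
    debt, spent = simulate(limit)
    print(f"limit {limit:>7.0f}: final debt {debt:>8.0f}, total spending {spent:>9.0f}")
```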

  2. The limit fold change model: A practical approach for selecting differentially expressed genes from microarray data

    Directory of Open Access Journals (Sweden)

    Rytz Andreas

    2002-06-01

Full Text Available Abstract Background The biomedical community is developing new methods of data analysis to more efficiently process the massive data sets produced by microarray experiments. Systematic and global mathematical approaches that can be readily applied to a large number of experimental designs become fundamental to correctly handle the otherwise overwhelming data sets. Results The gene selection model presented herein is based on the observation that: (1) variance of gene expression is a function of absolute expression; (2) one can model this relationship in order to set an appropriate lower fold change limit of significance; and (3) this relationship defines a function that can be used to select differentially expressed genes. The model first evaluates fold change (FC) across the entire range of absolute expression levels for any number of experimental conditions. Genes are systematically binned, and those genes within the top X% of highest FCs for each bin are evaluated both with and without the use of replicates. A function is fitted through the top X% of each bin, thereby defining a limit fold change. All genes selected by the 5% FC model lie above measurement variability using a within standard deviation (SDwithin) confidence level of 99.9%. Real-time PCR (RT-PCR) analysis demonstrated 85.7% concordance with microarray data selected by the limit function. Conclusion The FC model can confidently select differentially expressed genes as corroborated by variance data and RT-PCR. The simplicity of the overall process permits selecting model limits that best describe experimental data by extracting information on gene expression patterns across the range of expression levels. Genes selected by this process can be consistently compared between experiments and enable the user to globally extract information with a high degree of confidence.
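
The selection procedure can be sketched on synthetic data as follows. The noise model, the 95th-percentile bin threshold, and the reciprocal form of the fitted limit function are assumptions of this sketch rather than the published model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-condition data: fold-change variance shrinks as absolute
# expression grows, which is the model's key observation.
n = 5000
level = rng.uniform(5, 15, n)                   # log2 absolute expression
noise = rng.normal(0, 2.0 / level, (2, n))      # noise decreasing with level
abs_fc = np.abs(noise[0] - noise[1])            # null |log2 fold changes|

# Bin genes by absolute expression; take the top 5% of |FC| in each bin.
n_bins = 20
edges = np.quantile(level, np.linspace(0, 1, n_bins + 1))
centers, limits = [], []
for lo, hi in zip(edges[:-1], edges[1:]):
    in_bin = (level >= lo) & (level < hi)
    if in_bin.sum() >= 10:
        centers.append(level[in_bin].mean())
        limits.append(np.quantile(abs_fc[in_bin], 0.95))

# Fit a simple limit function |FC|_lim(level) = a + b/level through the
# per-bin thresholds (this functional form is an assumption of the sketch).
A = np.vstack([np.ones(len(centers)), 1.0 / np.asarray(centers)]).T
a, b = np.linalg.lstsq(A, np.asarray(limits), rcond=None)[0]
print(f"limit fold change curve: |log2 FC| > {a:.3f} + {b:.3f}/level")

# A gene is called differentially expressed when it exceeds the curve:
selected = abs_fc > (a + b / level)
print("fraction selected under the null:", selected.mean())
```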

  4. Fracture mechanics approach to estimate rail wear limits

    Science.gov (United States)

    2009-10-01

    This paper describes a systematic methodology to estimate allowable limits for rail head wear in terms of vertical head-height loss, gage-face side wear, and/or the combination of the two. This methodology is based on the principles of engineering fr...

  5. The Preliminary Pollutant Limit Value Approach: Procedures and Data Base.

    Science.gov (United States)

    1984-06-01

Recoverable fragments of the report's pollutant limit table include: Dinitrophenol 2.0x10-3; Dimethyl phthalate 10; Diethyl phthalate 12; Dibutyl phthalate 1.2; Di-2-ethylhexyl phthalate 0.61; Thallium salts ... The report notes that environmental substances are not simply limited to a division between "conventional" and carcinogenic considerations. Some substances are teratogens ... Environmental Quality identifies several compounds that are known teratogens in humans or mammals; in several cases, the dose levels involved are of the

  6. Limited sinus tarsi approach for intra-articular calcaneus fractures.

    Science.gov (United States)

    Kikuchi, Christian; Charlton, Timothy P; Thordarson, David B

    2013-12-01

Operative treatment of calcaneal fractures has a historically high rate of wound complications, so the optimal operative approach has been a topic of investigation. This study reviews the radiographic and clinical outcomes of the sinus tarsi approach for operative fixation of these fractures, with attention to the rate of infection and the restoration of angular measurements. The radiographs and charts of 20 patients with 22 calcaneal fractures were reviewed to assess restoration of the angular and linear dimensions of the calcaneus as well as time to radiographic union. Secondary outcome measures included the rate of postoperative infection, osteomyelitis, revision surgeries, and nonunion. We found a statistically significant restoration of Böhler's angle and calcaneal width. Three of the 22 cases had a superficial wound infection. One patient had revision surgery for symptomatic hardware removal. There were no events of osteomyelitis, deep infection, malunion, or nonunion. We found that the sinus tarsi approach yielded outcomes similar to those reported in the literature. Level IV, retrospective case series.

  7. A Practical Approach for Parameter Identification with Limited Information

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Yang, Guangya; Tarnowski, Germán Claudio

    2014-01-01

    A practical parameter estimation procedure for a real excitation system is reported in this paper. The core algorithm is based on genetic algorithm (GA) which estimates the parameters of a real AC brushless excitation system with limited information about the system. Practical considerations...... are integrated in the estimation procedure to reduce the complexity of the problem. The effectiveness of the proposed technique is demonstrated via real measurements. Besides, it is seen that GA can converge to a satisfactory solution even when starting from large initial variation ranges of the estimated...

  8. A general approach to total repair cost limit replacement policies

    Directory of Open Access Journals (Sweden)

    F. Beichelt

    2014-01-01

Full Text Available A common replacement policy for technical systems consists in replacing a system by a new one after its economic lifetime, i.e. at the moment when its long-run maintenance cost rate is minimal. However, strict application of the economic lifetime does not take into account the individual deviations of the maintenance cost rates of single systems from the average cost development. Hence, Beichelt proposed the total repair cost limit replacement policy: the system is replaced by a new one as soon as its total repair cost reaches or exceeds a given level. He modelled the repair cost development by functions of the Wiener process with drift. Here the same policy is considered under the assumption that the one-dimensional probability distribution of the process describing the repair cost development is given. In the examples analysed, applying the total repair cost limit replacement policy instead of the economic lifetime leads to cost savings of between 4% and 30%. Finally, it is illustrated how to include the reliability aspect in the policy.
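
A Monte Carlo sketch of the policy: cumulative repair cost is modelled as a Wiener process with drift (as in the formulation above), the system is replaced when the total repair cost first reaches the limit c, and the renewal-reward theorem gives the long-run cost rate as (repair cost paid per cycle + replacement cost) / E[cycle length]. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 1.0, 0.8      # drift and volatility of the repair cost process
c_new = 20.0              # cost of a replacement system

def long_run_cost_rate(c_limit, n_systems=5_000, dt=0.01):
    """Estimate (c_limit + c_new) / E[T], with T the first time the
    cumulative repair cost hits c_limit, by simulating many systems."""
    hit_time = np.zeros(n_systems)
    cost = np.zeros(n_systems)
    alive = np.ones(n_systems, dtype=bool)
    t = 0.0
    while alive.any():
        t += dt
        cost[alive] += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(alive.sum())
        hit = alive & (cost >= c_limit)          # systems reaching the limit now
        hit_time[hit] = t
        alive &= ~hit
    return (c_limit + c_new) / hit_time.mean()

for c in (5.0, 10.0, 20.0, 40.0):
    print(f"repair-cost limit {c:>5.1f}: long-run cost rate {long_run_cost_rate(c):.3f}")
```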

  9. Data Smearing: An Approach to Disclosure Limitation for Tabular Data

    Directory of Open Access Journals (Sweden)

    Toth Daniell

    2014-12-01

Full Text Available Statistical agencies often collect sensitive data for release to the public at aggregated levels in the form of tables. To protect confidential data, some cells are suppressed in the publicly released data. One problem with this method is that many cells of interest must be suppressed in order to protect a much smaller number of sensitive cells. Another problem is that the covariates used for aggregation and the level of aggregation must be fixed before the data are released. Both of these restrictions can severely limit the utility of the data. We propose a new disclosure limitation method that replaces the full set of microdata with synthetic data for use in producing released data in tabular form. This synthetic data set is obtained by replacing each unit's values with a weighted average of sampled values from the surrounding area. The synthetic data are produced in a way that gives asymptotically unbiased estimates for aggregate cells as the number of units in the cell increases. The method is applied to the U.S. Bureau of Labor Statistics Quarterly Census of Employment and Wages data, which is released to the public quarterly in tabular form and aggregated across varying scales of time, area, and economic sector.
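
A toy version of the smearing step described above: each unit's value is replaced by an average of values sampled from its geographic neighbourhood with distance-decaying weights. The kernel, the neighbourhood size, and the simulated data are assumptions of this sketch, not the published method.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 1000
xy = rng.uniform(0, 100, (n, 2))                            # unit locations
wages = 50_000 + 200 * xy[:, 0] + rng.normal(0, 5_000, n)   # true microdata

def smear(values, coords, k=15, bandwidth=10.0):
    """Replace each value with the mean of k draws from its k nearest
    neighbours (including itself), sampled with kernel-weighted odds."""
    out = np.empty_like(values)
    for i in range(len(values)):
        d = np.linalg.norm(coords - coords[i], axis=1)
        nbrs = np.argsort(d)[:k]                 # k nearest units
        w = np.exp(-d[nbrs] / bandwidth)         # decaying kernel weights
        take = rng.choice(nbrs, size=k, p=w / w.sum())
        out[i] = values[take].mean()
    return out

synthetic = smear(wages, xy)
# Aggregate cells built from the synthetic data stay close to the truth:
cell = xy[:, 0] < 50
print("true cell mean:      %.0f" % wages[cell].mean())
print("synthetic cell mean: %.0f" % synthetic[cell].mean())
```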

  10. Spectroscopy of 211Rn approaching the valence limit

    International Nuclear Information System (INIS)

    Davidson, P.M.; Dracoulis, G.D.; Kibedi, T.; Fabricius, B.; Baxter, A.M.; Stuchbery, A.E.; Poletti, A.R.; Schiffer, K.J.

    1993-02-01

High spin states in 211Rn were populated using the reaction 198Pt(18O,5n) at 96 MeV. The decay was studied using γ-ray and electron spectroscopy. The known level scheme is extended up to a spin of greater than 69/2 and many non-yrast states are added. Semi-empirical shell model calculations and the properties of related states in 210Rn and 212Rn are used to assign configurations to some of the non-yrast states. The properties of the high spin states observed are compared to the predictions of the Multi-Particle Octupole Coupling model and the semi-empirical shell model. The maximum reasonable spin available from the valence particles and holes is 77/2, and states are observed to near this limit. 12 refs., 4 tabs., 8 figs

  11. [Limitation of therapeutic effort: Approach to a combined view].

    Science.gov (United States)

    Bueno Muñoz, M J

    2013-01-01

Over the past few decades, fewer and fewer people have passed away at home, and more and more do so in the hospital. More specifically, 20% of deaths now occur in an intensive care unit (ICU). However, death in the ICU has become a highly technical process. This sometimes leads to excesses, because the resources used are not proportionate to the purposes pursued (futility), and it may create situations that do not respect the person's dignity throughout the death process. It is within this context that the clinical procedure called "limitation of the therapeutic effort" (LTE) is reviewed. LTE has become a true bridge between intensive care and palliative care. Its final goal is to guarantee a dignified and painless death for the terminally ill. Copyright © 2012 Elsevier España, S.L. y SEEIUC. All rights reserved.

  12. Limiter

    Science.gov (United States)

    Cohen, S.A.; Hosea, J.C.; Timberlake, J.R.

    1984-10-19

A limiter with a specially contoured front face is provided. The front face of the limiter (the plasma-side face) is flat with a central indentation. In addition, the limiter shape is cylindrically symmetric so that the limiter can be rotated for greater heat distribution. This limiter shape accommodates the various power scrape-off distances λp, which depend on the parallel velocity, V∥, of the impacting particles.

  13. A probabilistic approach to uncertainty quantification with limited information

    International Nuclear Information System (INIS)

    Red-Horse, J.R.; Benjamin, A.S.

    2004-01-01

Many safety assessments depend upon models that rely on probabilistic characterizations about which there is incomplete knowledge. For example, a system model may depend upon the time to failure of a piece of equipment for which no failures have actually been observed. The analysts in this case are faced with the task of developing a failure model for the equipment in question, having very limited knowledge about either the correct form of the failure distribution or the statistical parameters that characterize the distribution. They may assume that the process conforms to a Weibull or log-normal distribution or that it can be characterized by a particular mean or variance, but those assumptions impart more knowledge to the analysis than is actually available. To address this challenge, we propose a method where random variables comprising equivalence classes constrained by the available information are approximated using polynomial chaos expansions (PCEs). The PCE approximations are based on rigorous mathematical concepts developed from functional analysis and measure theory. The method has been codified in a computational tool, AVOCET, and has been applied successfully to example problems. Results indicate that it should be applicable to a broad range of engineering problems that are characterized by both irreducible and reducible uncertainty
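
A small, self-contained PCE illustration: the lognormal variable Y = exp(G), G ~ N(0,1), has Hermite-chaos coefficients known in closed form, y_k = e^{1/2}/k! (probabilists' Hermite polynomials), so a truncated expansion can be checked against exact sampling. This demonstrates the expansion machinery only, not the equivalence-class construction used in AVOCET.

```python
import math
import numpy as np

def hermite_He(k, x):
    """Probabilists' Hermite polynomial He_k via the three-term recurrence
    He_{n+1}(x) = x*He_n(x) - n*He_{n-1}(x)."""
    if k == 0:
        return np.ones_like(x)
    h0, h1 = np.ones_like(x), x
    for n in range(1, k):
        h0, h1 = h1, x * h1 - n * h0
    return h1

order = 6
coeff = [math.exp(0.5) / math.factorial(k) for k in range(order + 1)]

g = np.random.default_rng(4).standard_normal(100_000)
y_exact = np.exp(g)
y_pce = sum(c * hermite_He(k, g) for k, c in enumerate(coeff))

print("mean  exact %.4f  PCE %.4f" % (y_exact.mean(), y_pce.mean()))
print("var   exact %.4f  PCE %.4f" % (y_exact.var(), y_pce.var()))
```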

  14. Multicore in Production: Advantages and Limits of the Multiprocess Approach

    CERN Document Server

    Binet, S; The ATLAS collaboration; Lavrijsen, W; Leggett, Ch; Lesny, D; Jha, M K; Severini, H; Smith, D; Snyder, S; Tatarkhanov, M; Tsulaia, V; van Gemmeren, P; Washbrook, A

    2011-01-01

The shared memory architecture of multicore CPUs provides HENP developers with the opportunity to reduce the memory footprint of their applications by sharing memory pages between the cores in a processor. ATLAS pioneered the multi-process approach to parallelizing HENP applications. Using Linux fork() and the Copy-On-Write mechanism, we implemented a simple event task farm which allows up to 50% of memory pages to be shared among event worker processes with negligible CPU overhead. By leaving the task of managing shared memory pages to the operating system, we have been able to run in parallel large reconstruction and simulation applications originally written to be run in a single thread of execution with little to no change to the application code. In spite of this, the process of validating athena multi-process for production took ten months of concentrated effort and is expected to continue for several more months. In general terms, we had two classes of problems in the multi-process port: merging the output fil...
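
A minimal sketch of a fork()-based event task farm (POSIX only): the parent initializes a large read-only structure once, and Copy-On-Write lets every worker read the same physical pages until a write occurs. The event collection and "geometry" here are placeholders, not the athena framework.

```python
import os

events = list(range(1000))            # stand-in for the event collection
big_geometry = [0.0] * 10_000_000     # large read-only structure, shared via COW

n_workers = 4
pids = []
for w in range(n_workers):
    pid = os.fork()
    if pid == 0:                       # child: process its slice of events
        my_events = events[w::n_workers]
        checksum = sum(my_events) + int(big_geometry[0])  # reads hit shared pages
        print(f"worker {w} (pid {os.getpid()}): {len(my_events)} events,"
              f" checksum {checksum}", flush=True)
        os._exit(0)                    # skip parent cleanup in the child
    pids.append(pid)

for pid in pids:                       # parent: wait for all workers
    os.waitpid(pid, 0)
print("all workers done")
```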

  15. Analysis of operational limit of an aircraft: An aeroelastic approach

    Science.gov (United States)

    Hasan, Md. Mehedi; Hassan, M. D. Mehedi; Sarrowar, S. M. Bayazid; Faisal, Kh. Md.; Ahmed, Sheikh Reaz, Dr.

    2017-06-01

In the classical theory of elasticity, the external loading acting on a body is independent of the deformation of the body. In aeroelasticity, by contrast, aerodynamic forces depend on the attitude of the body relative to the flow. Aircraft are subjected to a range of static loads resulting from equilibrium or steady flight maneuvers such as a coordinated level turn, steady pitch and bank rates, and steady level flight. The interaction of these loads with the elastic forces of the aircraft structure creates aeroelastic phenomena. In this paper, we summarize recent developments in the area of aeroelasticity. A numerical approach has been applied to find the divergence speed, a static aeroelastic phenomenon, of a typical aircraft. This paper also presents graphical representations of the constraints on load factor and bank angle during different steady flight maneuvers, taking flexibility into account and comparing the results with the values obtained without flexibility. The effect of wing skin thickness, spar web thickness, and the position of the flexural axis of the wing on this divergence speed, as well as on load factor and bank angle, has also been studied using MATLAB.
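
For the divergence-speed part, the standard typical-section estimate is a useful reference: with torsional stiffness K_alpha, lift-curve slope a, chord c, and elastic axis a distance e behind the aerodynamic centre, divergence occurs at dynamic pressure q_D = K_alpha/(e c a). The sketch below uses this textbook formula with hypothetical numbers; it is not the paper's structural model.

```python
import numpy as np

K_alpha = 5.0e4       # torsional stiffness per unit span, N*m/rad per m (assumed)
a = 2.0 * np.pi      # lift-curve slope per rad (thin-aerofoil value)
c = 1.5              # chord, m (assumed)
e = 0.12 * c         # AC-to-elastic-axis offset, m (assumed)
rho = 1.225          # sea-level air density, kg/m^3

q_D = K_alpha / (e * c * a)        # divergence dynamic pressure, Pa
U_D = np.sqrt(2.0 * q_D / rho)     # corresponding divergence speed, m/s
print(f"divergence dynamic pressure: {q_D / 1000:.1f} kPa")
print(f"divergence speed:            {U_D:.1f} m/s")
```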

  16. Protecting Fundamental (Social) Rights through the Lens of the EU Single Market: the Quest for a More 'Holistic Approach'

    NARCIS (Netherlands)

    de Vries, S.A.

    2016-01-01

    In this article, four trajectories will be followed with a view to further developing the linkages that exist between the EU Single Market and fundamental (social) rights and to examining to what extent the EU Single Market, apart from putting constraints on the realization of social rights, offers

  17. The Gibbs energy form of the Fundamental Equation for multi-phase multi-reaction systems within the physical approach

    CERN Document Server

    Luetich, J J

    2001-01-01

    A general Gibbs energy representation of the fundamental equation for multi-phase systems in chemical equilibrium is presented. No difference is made between physical and chemical transitions, i.e. between phase and combination changes. This paper is the first member of a tetralogy conceived to give insight into the concept of microscopic reversibility.

  18. Radiology fundamentals

    CERN Document Server

    Singh, Harjit

    2011-01-01

    ""Radiology Fundamentals"" is a concise introduction to the dynamic field of radiology for medical students, non-radiology house staff, physician assistants, nurse practitioners, radiology assistants, and other allied health professionals. The goal of the book is to provide readers with general examples and brief discussions of basic radiographic principles and to serve as a curriculum guide, supplementing a radiology education and providing a solid foundation for further learning. Introductory chapters provide readers with the fundamental scientific concepts underlying the medical use of imag

  19. Fundamental Astronomy

    CERN Document Server

    Karttunen, Hannu; Oja, Heikki; Poutanen, Markku; Donner, Karl Johan

    2007-01-01

    Fundamental Astronomy gives a well-balanced and comprehensive introduction to the topics of classical and modern astronomy. While emphasizing both the astronomical concepts and the underlying physical principles, the text provides a sound basis for more profound studies in the astronomical sciences. The fifth edition of this successful undergraduate textbook has been extensively modernized and extended in the parts dealing with the Milky Way, extragalactic astronomy and cosmology as well as with extrasolar planets and the solar system (as a consequence of recent results from satellite missions and the new definition by the International Astronomical Union of planets, dwarf planets and small solar-system bodies). Furthermore a new chapter on astrobiology has been added. Long considered a standard text for physical science majors, Fundamental Astronomy is also an excellent reference and entrée for dedicated amateur astronomers.

  20. Metrological Array of Cyber-Physical Systems. Part 15. Approach to the Creation of Temperature Standard on Basis of Fundamental Physical Constants

    Directory of Open Access Journals (Sweden)

    Bohdan STADNYK

    2016-04-01

Full Text Available After proving the existence of the Temperature Quantum, the next step is to study the possibility of creating a Temperature Standard. We consider the general principles of design and operation of such an advanced Temperature Standard constructed on the basis of the Quantum Temperature Unit, which is determined solely via the fundamental physical constants. An approach to this Standard is developed in this paper.

  1. Marketing fundamentals.

    Science.gov (United States)

    Redmond, W H

    2001-01-01

    This chapter outlines current marketing practice from a managerial perspective. The role of marketing within an organization is discussed in relation to efficiency and adaptation to changing environments. Fundamental terms and concepts are presented in an applied context. The implementation of marketing plans is organized around the four P's of marketing: product (or service), promotion (including advertising), place of delivery, and pricing. These are the tools with which marketers seek to better serve their clients and form the basis for competing with other organizations. Basic concepts of strategic relationship management are outlined. Lastly, alternate viewpoints on the role of advertising in healthcare markets are examined.

  2. Upper limit for Poisson variable incorporating systematic uncertainties by Bayesian approach

    International Nuclear Information System (INIS)

    Zhu, Yongsheng

    2007-01-01

To calculate the upper limit for a Poisson observable at a given confidence level with inclusion of systematic uncertainties in the background expectation and signal efficiency, formulations have been established along the lines of the Bayesian approach. A FORTRAN program, BPULE, has been developed to implement the upper limit calculation.
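
A numerical sketch of this type of calculation (not the BPULE program): Gaussian priors on background and efficiency are marginalized by Monte Carlo, and the flat-prior posterior in the signal strength is integrated up to the desired credibility. All inputs are illustrative assumptions.

```python
import numpy as np
from scipy.stats import poisson

n_obs = 3                  # observed events
b0, db = 1.2, 0.3          # background expectation and its uncertainty
e0, de = 0.9, 0.1          # signal efficiency and its uncertainty
cl = 0.90                  # credibility level

rng = np.random.default_rng(5)
b = np.clip(rng.normal(b0, db, 10_000), 0.0, None)    # Gaussian nuisance priors,
eff = np.clip(rng.normal(e0, de, 10_000), 1e-6, None)  # truncated at zero

# Marginal likelihood L(n | s), averaged over the nuisance parameters,
# on a grid for the signal strength s (flat prior).
s_grid = np.linspace(0.0, 25.0, 501)
like = np.array([poisson.pmf(n_obs, s * eff + b).mean() for s in s_grid])
post = like / np.trapz(like, s_grid)                  # normalised posterior

cdf = np.cumsum(post) * (s_grid[1] - s_grid[0])
s_up = s_grid[np.searchsorted(cdf, cl)]
print(f"{int(cl * 100)}% CL Bayesian upper limit: s < {s_up:.2f} events")
```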

  3. Fundamental Fluid Mechanics

    Indian Academy of Sciences (India)

BOOK REVIEW. Fundamental Fluid Mechanics: Good Text Book Material. Fluid Mechanics for Engineers, by P N Chatterjee. MacMillan India Limited. Vol. 1, pp. 367, Rs. 143; Vol. 2, pp. 306, Rs. 130. Reviewed by V H Arakeri. Fluid Mechanics for Engineers in two volumes by P N Chatterjee contains standard material for a first level ...

  4. Censoring: a new approach for detection limits in total-reflection X-ray fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Pajek, M. [Institute of Physics, Swietokrzyska Academy, Swietokrzyska 15, 25-406 Kielce (Poland)]. E-mail: pajek@pu.kielce.pl; Kubala-Kukus, A. [Institute of Physics, Swietokrzyska Academy, Swietokrzyska 15, 25-406 Kielce (Poland); Braziewicz, J. [Institute of Physics, Swietokrzyska Academy, Swietokrzyska 15, 25-406 Kielce (Poland)

    2004-08-31

It is shown that the detection limits in total-reflection X-ray fluorescence (TXRF), which restrict the quantification of very low concentrations of trace elements in samples, can be accounted for using the statistical concept of censoring. We demonstrate that incomplete TXRF measurements containing so-called 'nondetects', i.e. non-measured concentrations falling below the detection limits and represented by the estimated detection limit values, can be viewed as left random-censored data, which can be further analyzed using the Kaplan-Meier (KM) method correcting for nondetects. Within this approach, which uses the Kaplan-Meier product-limit estimator to obtain the cumulative distribution function corrected for the nondetects, the mean value and median of the detection limit censored concentrations can be estimated in a non-parametric way. The Monte Carlo simulations performed show that the Kaplan-Meier approach yields highly accurate estimates for the mean and median concentrations, lying within a few percent of the simulated, uncensored data. This means that the uncertainties of the KM-estimated mean value and median are limited in fact only by the number of studied samples and not by the applied correction procedure for nondetects itself. On the other hand, it is observed that, in the case when the concentration of a given element is not measured in all the samples, simple approaches to estimating a mean concentration value from the data yield erroneous, systematically biased results. The discussed random-left censoring approach was applied to analyze the TXRF detection-limit-censored concentration measurements of trace elements in biomedical samples. We emphasize that the Kaplan-Meier approach allows one to estimate mean concentrations lying substantially below the mean level of the detection limits. Consequently, this approach provides a new way to lower the effective detection limits of the TXRF method, which is of prime interest
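
A minimal sketch of the KM treatment of nondetects on simulated data: left-censored concentrations are "flipped" (t -> M - t) into a right-censoring problem, the product-limit estimator is applied, and the mean is flipped back. The usual restricted-mean caveat applies when the largest flipped observation is censored; the data below are simulated, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(6)
true_conc = rng.lognormal(mean=0.0, sigma=1.0, size=200)
det_limit = rng.uniform(0.5, 1.5, size=200)          # sample-dependent DLs
detected = true_conc >= det_limit                    # detected vs nondetect
value = np.where(detected, true_conc, det_limit)     # nondetects -> DL value

M = value.max() * 1.001
flipped = M - value                                  # left- -> right-censoring

def km_mean(times, events):
    """Kaplan-Meier product-limit estimator; returns the restricted mean,
    i.e. the area under the estimated survival curve."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    s, mean, prev_t = 1.0, 0.0, 0.0
    for i in range(n):
        mean += s * (times[i] - prev_t)   # area of the step function
        prev_t = times[i]
        if events[i]:                     # an uncensored (detected) value
            s *= 1.0 - 1.0 / (n - i)      # n - i units still at risk
    return mean

mean_conc = M - km_mean(flipped, detected)           # flip the mean back
print("naive mean (nondetects at DL): %.3f" % value.mean())
print("Kaplan-Meier corrected mean:   %.3f" % mean_conc)
print("true mean:                     %.3f" % true_conc.mean())
```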

  5. Understanding small biomolecule-biomaterial interactions: a review of fundamental theoretical and experimental approaches for biomolecule interactions with inorganic surfaces.

    Science.gov (United States)

    Costa, Dominique; Garrain, Pierre-Alain; Baaden, Marc

    2013-04-01

Interactions between biomolecules and inorganic surfaces play an important role in natural environments and in industry, including a wide variety of conditions: marine environment, ship hulls (fouling), water treatment, heat exchange, membrane separation, soils, mineral particles at the earth's surface, hospitals (hygiene), art and buildings (degradation and biocorrosion), paper industry (fouling) and more. To better control the first steps leading to adsorption of a biomolecule on an inorganic surface, it is mandatory to understand the adsorption mechanisms of biomolecules of several sizes at the atomic scale, that is, the nature of the chemical interaction between the biomolecule and the surface and the resulting biomolecule conformations once adsorbed at the surface. This remains a challenging and unsolved problem. Here, we review the state of art in experimental and theoretical approaches. We focus on metallic biomaterial surfaces such as TiO2 and stainless steel, mentioning some remarkable results on hydroxyapatite. Experimental techniques include atomic force microscopy, surface plasmon resonance, quartz crystal microbalance, X-ray photoelectron spectroscopy, fluorescence microscopy, polarization modulation infrared reflection absorption spectroscopy, sum frequency generation and time of flight secondary ion mass spectroscopy. Theoretical models range from detailed quantum mechanical representations to classical forcefield-based approaches. Copyright © 2012 Wiley Periodicals, Inc.

  6. A fundamental approach to specify thermal and pressure loadings on containment buildings of sodium cooled fast reactors during a core disruptive accident

    International Nuclear Information System (INIS)

    Velusamy, K.; Chellapandi, P.; Satpathy, K.; Verma, Neeraj; Raviprasan, G.R.; Rajendrakumar, M.; Chetal, S.C.

    2011-01-01

    Highlights: → An approach to quantify thermal and pressure loadings on RCB is presented. → Scaling laws to determine sodium release from water experiments are proposed. → Potential of in-vessel sodium fire after a CDA is assessed. → The proposed approach is applied to Indian Prototype Fast Breeder Reactor. - Abstract: Reactor Containment Building (RCB) is the ultimate barrier to the environment against activity release in any nuclear power plant. It has to be designed to withstand both positive and negative pressures that are credible. Core Disruptive Accident (CDA) is an important event that specifies the design basis for RCB in sodium cooled fast reactors. In this paper, a fundamental approach towards quantification of thermal and pressure loadings on RCB during a CDA, has been described. Mathematical models have been derived from fundamental conservation principles towards determination of sodium release during a CDA, subsequent sodium fire inside RCB, building up of positive and negative pressures inside RCB, potential of in-vessel sodium fire due to failed seals and temperature evolution in RCB walls during extended period of containment isolation. Various heating sources for RCB air and RCB wall and their potential have been identified. Scaling laws for conducting CDA experiments in small-scale water models by chemical explosives and the rule for extrapolation of water leak to quantify sodium leak in reactor are proposed. Validation of the proposed models and experimental simulation rules has been demonstrated by applying them to Indian prototype fast breeder reactor. Finally, it is demonstrated that in-vessel sodium fire potential is very weak and no special containment cooling system is essential.

  7. A novel approach to derive halo-independent limits on dark matter properties

    OpenAIRE

    Ferrer, Francesc; Ibarra, Alejandro; Wild, Sebastian

    2015-01-01

    We propose a method that allows to place an upper limit on the dark matter elastic scattering cross section with nucleons which is independent of the velocity distribution. Our approach combines null results from direct detection experiments with indirect searches at neutrino telescopes, and goes beyond previous attempts to remove astrophysical uncertainties in that it directly constrains the particle physics properties of the dark matter. The resulting halo-independent upper limits on the sc...

  8. Approaches to assessment in time-limited Mentalization-Based Therapy for Children (MBT-C)

    OpenAIRE

    Muller, Nicole; Midgley, Nick

    2015-01-01

    In this article we describe our clinical approach to assessment, formulation and the identification of a therapeutic focus in the context of time-limited Mentalization-Based Treatment for Children (MBT-C) aged between 6 and 12. Rather than seeing the capacity to mentalize as a global construct, we set out an approach to assessing the developmental ‘building blocks’ of the capacity to mentalize the self and others, including the capacity for attention regulation, emotion regulation, and explic...

  9. Censoring approach to the detection limits in X-ray fluorescence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pajek, M. [Institute of Physics, Swietokrzyska Academy, Swietokrzyska 15, 25-406 Kielce (Poland)]. E-mail: pajek@pu.kielce.pl; Kubala-Kukus, A. [Institute of Physics, Swietokrzyska Academy, Swietokrzyska 15, 25-406 Kielce (Poland)

    2004-10-08

We demonstrate that the effect of detection limits in X-ray fluorescence analysis (XRF), which limits the determination of very low concentrations of trace elements and results in the appearance of so-called 'nondetects', can be accounted for using the statistical concept of censoring. More precisely, the results of such measurements can be viewed as left random censored data, which can further be analyzed using the Kaplan-Meier method, correcting the data for the presence of nondetects. Using this approach, the results of measured, detection limit censored concentrations can be interpreted in a nonparametric manner including the correction for the nondetects, i.e. the measurements in which the concentrations were found to be below the actual detection limits. Moreover, using the Monte Carlo simulation technique we show that with the Kaplan-Meier approach the corrected mean concentrations for a population of samples can be estimated to within a few percent uncertainty with respect to the simulated, uncensored data. This practically means that the final uncertainties of the estimated mean values are limited in fact by the number of studied samples and not by the correction procedure itself. The discussed random-left censoring approach was applied to analyze the XRF detection-limit-censored concentration measurements of trace elements in biomedical samples.

  10. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
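
Two of the lecture's staples, sketched in a few lines (written to match the outline above, not the slides themselves): estimating pi, with the Law of Large Numbers giving convergence and the Central Limit Theorem giving the error bar, and inverse transform sampling of an exponential distribution.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

# (1) Estimating pi: the fraction of uniform points in the unit square that
# land inside the quarter circle tends to pi/4 (Law of Large Numbers).
x, y = rng.uniform(size=n), rng.uniform(size=n)
p_hat = np.mean(x * x + y * y <= 1.0)
pi_hat = 4.0 * p_hat
# Central Limit Theorem: the statistical error shrinks like 1/sqrt(n).
sigma = 4.0 * np.sqrt(p_hat * (1.0 - p_hat) / n)
print(f"pi ~ {pi_hat:.4f} +- {sigma:.4f}")

# (2) Inverse transform sampling: if U ~ Uniform(0,1), then F^{-1}(U) has
# distribution F. For an exponential with rate lam, F^{-1}(u) = -ln(1-u)/lam.
lam = 2.0
samples = -np.log(1.0 - rng.uniform(size=n)) / lam
print(f"exponential mean: {samples.mean():.4f} (exact {1.0 / lam})")
```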

  11. New approaches to deriving limits of the release of radioactive material into the environment

    International Nuclear Information System (INIS)

    Lindell, B.

    1977-01-01

    During the last few years, new principles have been developed for the limitation of the release of radioactive material into the environment. It is no longer considered appropriate to base the limitation on limits for the concentrations of the various radionuclides in air and water effluents. Such limits would not prevent large amounts of radioactive material from reaching the environment should effluent rates be high. A common practice has been to identify critical radionuclides and critical pathways and to base the limitation on authorized dose limits for local ''critical groups''. If this were the only limitation, however, larger releases could be permitted after installing either higher stacks or equipment to retain the more short-lived radionuclides for decay before release. Continued release at such limits would then lead to considerably higher exposure at a distance than if no such installation had been made. Accordingly there would be no immediate control of overlapping exposures from several sources, nor would the system guarantee control of the future situation. The new principles described in this paper take the future into account by limiting the annual dose commitments rather than the annual doses. They also offer means of controlling the global situation by limiting not only doses in critical groups but also global collective doses. Their objective is not only to ensure that individual dose limits will always be respected but also to meet the requirement that ''all doses be kept as low as reasonably achievable''. The new approach is based on the most recent recommendations by the ICRP and has been described in a report by an IAEA panel (Procedures for establishing limits for the release of radioactive material into the environment). It has been applied in the development of new Swedish release regulations, which illustrate some of the problems which arise in the practical application

  12. The emergence of the dimensions and fundamental forces in the universe, an information-theoretical approach for explaining the quantity ratios of the fundamental interactions. 2. rev. and enl. ed.

    International Nuclear Information System (INIS)

    Ganter, Bernd

    2013-01-01

After a description of the four fundamental interactions and the connection of information with energy, the principle of fast maximization together with the Ganter tableau is described. Then, as an example, the derivation of the value of the fine-structure constant from the Ganter tableau is described. Thereafter the extension of the Ganter tableau, further properties of the Ganter tableau, and the persuasiveness of the Ganter tableau are considered. (HSI)

  13. Use of a fundamental approach to spray-drying formulation design to facilitate the development of multi-component dry powder aerosols for respiratory drug delivery.

    Science.gov (United States)

    Hoe, Susan; Ivey, James W; Boraey, Mohammed A; Shamsaddini-Shahrbabak, Abouzar; Javaheri, Emadeddin; Matinkhoo, Sadaf; Finlay, Warren H; Vehring, Reinhard

    2014-02-01

    A fundamental approach incorporating current theoretical models into aerosol formulation design potentially reduces experimental work for complex formulations. A D-amino acid mixture containing D-Leucine (D-Leu), D-Methionine, D-Tryptophan, and D-Tyrosine was selected as a model formulation for this approach. Formulation design targets were set, with the aim of producing a highly dispersible D-amino acid aerosol. Particle formation theory and a spray dryer process model were applied with boundary conditions to the design targets, resulting in a priori predictions of particle morphology and necessary spray dryer process parameters. Two formulations containing 60% w/w trehalose, 30% w/w D-Leu, and 10% w/w remaining D-amino acids were manufactured. The design targets were met. The formulations had rugose and hollow particles, caused by deformation of a crystalline D-Leu shell while trehalose remained amorphous, as predicted by particle formation theory. D-Leu acts as a dispersibility enhancer, ensuring that both formulations: 1) delivered over 40% of the loaded dose into the in vitro lung region, and 2) achieved desired values of lung airway surface liquid concentrations based on lung deposition simulations. Theoretical models were applied to successfully achieve complex formulations with design challenges a priori. No further iterations to the design process were required.
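
A sketch of the particle-formation arithmetic behind such designs: the Peclet number Pe_i = kappa/(8 D_i) compares the droplet surface recession rate (evaporation rate kappa from the d²-law) with solute diffusion, and Vehring's series E ≈ 1 + Pe/5 + Pe²/100 − Pe³/4000 estimates the steady-state surface enrichment. The property values below are rough assumptions; note that the D-Leu shell formation reported above is driven largely by early surface saturation and crystallization, which this simple enrichment estimate does not capture.

```python
# Evaporation rate from the d^2-law, d(t)^2 = d0^2 - kappa*t (assumed value
# for illustrative drying conditions), and assumed aqueous diffusivities.
kappa = 2.0e-9                                      # m^2/s
D = {"trehalose": 4.0e-10, "D-leucine": 7.0e-10}    # m^2/s

for solute, Di in D.items():
    Pe = kappa / (8.0 * Di)                         # Peclet number
    E = 1.0 + Pe / 5.0 + Pe**2 / 100.0 - Pe**3 / 4000.0  # surface enrichment
    print(f"{solute:10s}: Pe = {Pe:.3f}, surface enrichment E = {E:.3f}")
```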

  14. La operación analítica: límites y fundamentos The analytical operation: limits and fundaments

    Directory of Open Access Journals (Sweden)

    David Laznik

    2009-12-01

    Full Text Available La construcción del corpus teórico psicoanalítico, en tanto teoría de una praxis, experimenta a lo largo de la obra freudiana diversas rectificaciones que inciden en la delimitación de los conceptos y de las operaciones inherentes a su campo. Desde esa perspectiva, la pregunta por el alcance y los límites del método psicoanalítico subsiste en articulación con las sucesivas reformulaciones. Luego de establecer la segunda tópica, Freud sistematiza, en 1926, los diferentes tipos de resistencias. Posteriormente y en diversos momentos, retoma la problemática en torno a los obstáculos que complican el trabajo analítico. Estas consideraciones introducen nuevos interrogantes y recortan la incidencia de nuevos factores que, aún sin precipitar en una formalización acabada, complejizan el estatuto y el alcance de la operación analítica.The construction of the theoretical psychoanalytic corpus, as a theory of a praxis, experiences along the Freudian work diverse rectifications that affect in the delimiting of the concepts and of the operations inherent to his field. From this perspective, the question for the range and the limits of the psychoanalytic method survives in joint with the successive reformulations. After establishing the second topic, Freud systematizes, in 1926, the different types of resistances. Later, and in diverse moments, he recaptures the problematics around the obstacles that complicate the analytical work. These considerations introduce new questions and delimits the incident of new factors that, still without precipitating in a finished formalization, complex the statute and the range of the analytical operation.

  15. A large deviations approach to limit theory for heavy-tailed time series

    DEFF Research Database (Denmark)

    Mikosch, Thomas Valentin; Wintenberger, Olivier

    2016-01-01

    In this paper we propagate a large deviations approach for proving limit theory for (generally) multivariate time series with heavy tails. We make this notion precise by introducing regularly varying time series. We provide general large deviation results for functionals acting on a sample path...... and vanishing in some neighborhood of the origin. We study a variety of such functionals, including large deviations of random walks, their suprema, the ruin functional, and further derive weak limit theory for maxima, point processes, cluster functionals and the tail empirical process. One of the main results...

  16. Fundamental composite electroweak dynamics

    DEFF Research Database (Denmark)

    Arbey, Alexandre; Cacciapaglia, Giacomo; Cai, Haiying

    2017-01-01

    Using the recent joint results from the ATLAS and CMS collaborations on the Higgs boson, we determine the current status of composite electroweak dynamics models based on the expected scalar sector. Our analysis can be used as a minimal template for a wider class of models between the two limiting...... cases of composite Goldstone Higgs and Technicolor-like ones. This is possible due to the existence of a unified description, both at the effective and fundamental Lagrangian levels, of models of composite Higgs dynamics where the Higgs boson itself can emerge, depending on the way the electroweak...... space at the effective Lagrangian level. We show that a wide class of models of fundamental composite electroweak dynamics are still compatible with the present constraints. The results are relevant for the ongoing and future searches at the Large Hadron Collider....

  17. Allylmagnesium Halides Do Not React Chemoselectively Because Reaction Rates Approach the Diffusion Limit.

    Science.gov (United States)

    Read, Jacquelyne A; Woerpel, K A

    2017-02-17

    Competition experiments demonstrate that additions of allylmagnesium halides to carbonyl compounds, unlike additions of other organomagnesium reagents, occur at rates approaching the diffusion rate limit. Whereas alkylmagnesium and alkyllithium reagents could differentiate between electronically or sterically different carbonyl compounds, allylmagnesium reagents reacted with most carbonyl compounds at similar rates. Even additions to esters occurred at rates competitive with additions to aldehydes. Only in the case of particularly sterically hindered substrates, such as those bearing tertiary alkyl groups, were additions slower.
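
Competition experiments of this kind reduce to a simple relative-rate formula when both substrates are consumed with first-order dependence: k_A/k_B = ln([A]_f/[A]_0) / ln([B]_f/[B]_0). A sketch with made-up recoveries (the numbers are illustrative, not the paper's data):

```python
import numpy as np

A0, B0 = 1.0, 1.0      # initial equivalents of the two carbonyl substrates
Af, Bf = 0.40, 0.55    # unreacted fractions recovered after the reaction

# Relative rate constant from the competition expression above.
k_rel = np.log(Af / A0) / np.log(Bf / B0)
print(f"k_A / k_B = {k_rel:.2f}")
# A selective reagent gives k_rel far from 1; for additions near the
# diffusion limit, k_rel collapses towards ~1 for most substrate pairs.
```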

  18. Fundamental aspects of plasma chemical physics Thermodynamics

    CERN Document Server

    Capitelli, Mario; D'Angola, Antonio

    2012-01-01

Fundamental Aspects of Plasma Chemical Physics - Thermodynamics develops basic and advanced concepts of plasma thermodynamics from both classical and statistical points of view. After a refresher on classical thermodynamics applied to the dissociation and ionization regimes, the book invites the reader to discover the role of electronic excitation in affecting the properties of plasmas, a topic often overlooked by the thermal plasma community. Particular attention is devoted to the problem of the divergence of the partition function of atomic species and the state-to-state approach for calculating the partition function of diatomic and polyatomic molecules. The limits of the ideal gas approximation are also discussed, by introducing Debye-Huckel and virial corrections. Throughout the book, worked examples are given in order to clarify concepts and mathematical approaches. This book is the first of a series of three books to be published by the authors on fundamental aspects of plasma chemical physics. The next bo...

  19. Advantages and limitations of the 'worst case scenario' approach in IMPT treatment planning.

    Science.gov (United States)

    Casiraghi, M; Albertini, F; Lomax, A J

    2013-03-07

    The 'worst case scenario' (also known as the minimax approach in optimization terms) is a common approach to model the effect of delivery uncertainties in proton treatment planning. Using the 'dose-error-bar distribution' previously reported by our group as an example, we have investigated in more detail one of the underlying assumptions of this method. That is, the dose distributions calculated for a limited number of worst case patient positioning scenarios (i.e. limited number of shifts sampled on a spherical surface) represent the worst dose distributions that can occur during the patient treatment under setup uncertainties. By uniformly sampling patient shifts from anywhere within a spherical error-space, a number of treatment scenarios have been simulated and dose deviations from the nominal dose distribution have been computed. The dose errors from these simulations (comprehensive approach) have then been compared to the dose-error-bar approach previously reported (surface approximation) using both point-by-point and dose- and error-volume-histogram analysis (DVH/EVHs). This comparison has been performed for two different clinical cases treated using intensity modulated proton therapy (IMPT): a skull-base and a spinal-axis tumor. Point-by-point evaluation shows that the surface approximation leads to a correct estimation (95% accuracy) of the potential dose errors for the 96% and 85% of the irradiated voxels, for the two investigated cases respectively. We also found that the voxels for which the surface approximation fails are generally localized close to sharp soft tissue-bone interfaces and air cavities. Moreover, analysis of EVHs and DVHs for the two cases shows that the percentage of voxels of a given volume of interest potentially affected by a certain maximum dose error is correctly estimated using the surface approximation and that this approach also accurately predicts the upper and lower bounds of the DVH curves that can occur under positioning

  1. Strengths and Limitations of New Approaches for Graphical Presentation of Blood Glucose Monitoring System Accuracy Data.

    Science.gov (United States)

    Pleus, Stefan; Flacke, Frank; Sieber, Jochen; Haug, Cornelia; Freckmann, Guido

    2017-11-01

    Graphical presentation of blood glucose monitoring systems' (BGMSs) accuracy typically includes difference plots (DPs). Recently, 3 new approaches were presented: radar plots (RPs), rectangle target plots (RTPs), and surveillance error grids (SEGs). BGMS data were modeled based on 3 scenarios that can be encountered in real life to highlight strengths and limitations of these approaches. Detailed assessment of BGMS data may be easier in plots with individual data points (DPs, RPs, SEGs), whereas RTPs may facilitate display of large amounts of data or comparison of BGMS. SEGs have the advantage of assessing clinical risk. The selection of a specific type depends mostly on the kind of information sought (eg, accuracy in specific concentration intervals, lot-to-lot variability, clinical risk) as there is no "absolute best" approach.

  2. DOE fundamentals handbook: Chemistry

    International Nuclear Information System (INIS)

    1993-01-01

    The Chemistry Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of chemistry. The handbook includes information on the atomic structure of matter; chemical bonding; chemical equations; chemical interactions involved with corrosion processes; water chemistry control, including the principles of water treatment; the hazards of chemicals and gases, and basic gaseous diffusion processes. This information will provide personnel with a foundation for understanding the chemical properties of materials and the way these properties can impose limitations on the operation of equipment and systems

  3. Novel approach to epicardial pacemaker implantation in patients with limited venous access.

    Science.gov (United States)

    Costa, Roberto; Scanavacca, Mauricio; da Silva, Kátia Regina; Martinelli Filho, Martino; Carrillo, Roger

    2013-11-01

Limited venous access in certain patients increases the procedural risk and complexity of conventional transvenous pacemaker implantation. The purpose of this study was to determine a minimally invasive epicardial approach using pericardial reflections for dual-chamber pacemaker implantation in patients with limited venous access. Between June 2006 and November 2011, 15 patients underwent epicardial pacemaker implantation. Procedures were performed through a minimally invasive subxiphoid approach and pericardial window with subsequent fluoroscopy-assisted lead placement. Mean patient age was 46.4 ± 15.3 years (9 male [60.0%], 6 female [40.0%]). The new surgical approach was used in patients determined to have limited venous access due to multiple abandoned leads in 5 (33.3%), venous occlusion in 3 (20.0%), intravascular retention of lead fragments from prior extraction in 3 (20.0%), tricuspid valve vegetation currently under treatment in 2 (13.3%), and unrepaired intracardiac defects in 2 (13.3%). All procedures were successful with no perioperative complications or early deaths. Mean operating time for isolated pacemaker implantation was 231.7 ± 33.5 minutes. Lead placement on the superior aspect of the right atrium, through the transverse sinus, was possible in 12 patients. In the remaining 3 patients, the atrial lead was implanted on the left atrium through the oblique sinus, the postcaval recess, or the left pulmonary vein recess. None of the patients displayed pacing or sensing dysfunction, and all parameters remained stable throughout the follow-up period of 36.8 ± 25.1 months. Epicardial pacemaker implantation through pericardial reflections is an effective alternative therapy for those patients requiring physiologic pacing in whom venous access is limited. © 2013 Heart Rhythm Society. All rights reserved.

  4. The "Food Polymer Science" approach to the practice of industrial R&D, leading to patent estates based on fundamental starch science and technology.

    Science.gov (United States)

    Slade, Louise; Levine, Harry

    2018-04-13

    This article reviews the application of the "Food Polymer Science" approach to the practice of industrial R&D, leading to patent estates based on fundamental starch science and technology. The areas of patents and patented technologies reviewed here include: (a) soft-from-the-freezer ice creams and freezer-storage-stable frozen bread dough products, based on "cryostabilization technology" of frozen foods, utilizing commercial starch hydrolysis products (SHPs); (b) glassy-matrix encapsulation technology for flavors and other volatiles, based on structure-function relationships for commercial SHPs; (c) production of stabilized whole-grain wheat flours for biscuit products, based on the application of "solvent retention capacity" technology to develop flours with reduced damaged starch; (d) production of improved-quality, low-moisture cookies and crackers, based on pentosanase enzyme technology; (e) production of "baked-not-fried," chip-like, starch-based snack products, based on the use of commercial modified-starch ingredients with selected functionality; (f) accelerated staling of a starch-based food product from baked bread crumb, based on the kinetics of starch retrogradation, treated as a crystallization process for a partially crystalline glassy polymer system; and (g) a process for producing an enzyme-resistant starch, for use as a reduced-calorie flour replacer in a wide range of grain-based food products, including cookies, extruded expanded snacks, and breakfast cereals.

  5. Fundamental limitation on quantum broadcast networks

    Science.gov (United States)

    Bäuml, Stefan; Azuma, Koji

    2017-06-01

    The ability to distribute entanglement over complex quantum networks is an important step towards a quantum internet. Recently, there has been significant theoretical effort, mainly focusing on the distribution of bipartite entanglement via a simple quantum network composed only of bipartite quantum channels. There are, however, a number of quantum information processing protocols based on multipartite rather than bipartite entanglement. Whereas multipartite entanglement can be distributed by means of a network of such bipartite channels, a more natural way is to use a more general network, that is, a quantum broadcast network including quantum broadcast channels. In this work, we present a general framework for deriving upper bounds on the rates at which GHZ states or multipartite private states can be distributed among a number of different parties over an arbitrary quantum broadcast network. Our upper bounds are written in terms of the multipartite squashed entanglement, corresponding to a generalisation of recently derived bounds (Azuma et al. (2016), Nat. Commun. 7, 13523). We also discuss how lower bounds can be obtained by combining a generalisation of an aggregated quantum repeater protocol with graph theoretic concepts.

  6. An Adaptive Approach to Mitigate Background Covariance Limitations in the Ensemble Kalman Filter

    KAUST Repository

    Song, Hajoon

    2010-07-01

    A new approach is proposed to address the background covariance limitations arising from undersampled ensembles and unaccounted model errors in the ensemble Kalman filter (EnKF). The method enhances the representativeness of the EnKF ensemble by augmenting it with new members chosen adaptively to add missing information that prevents the EnKF from fully fitting the data to the ensemble. The vectors to be added are obtained by back projecting the residuals of the observation misfits from the EnKF analysis step onto the state space. The back projection is done using an optimal interpolation (OI) scheme based on an estimated covariance of the subspace missing from the ensemble. In the experiments reported here, the OI uses a preselected stationary background covariance matrix, as in the hybrid EnKF–three-dimensional variational data assimilation (3DVAR) approach, but the resulting correction is included as a new ensemble member instead of being added to all existing ensemble members. The adaptive approach is tested with the Lorenz-96 model. The hybrid EnKF–3DVAR is used as a benchmark to evaluate the performance of the adaptive approach. Assimilation experiments suggest that the new adaptive scheme significantly improves the EnKF behavior when it suffers from small size ensembles and neglected model errors. It was further found to be competitive with the hybrid EnKF–3DVAR approach, depending on ensemble size and data coverage.
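
    As a concrete illustration of the scheme sketched above, the following minimal NumPy code (our own toy reconstruction, with hypothetical names and dimensions, not the authors' implementation) computes an EnKF analysis of the ensemble mean, back-projects the residual observation misfit with an OI gain built from a preselected static covariance B, and appends the resulting correction as a new ensemble member:

        import numpy as np

        def enkf_adaptive_member(X, y, H, R, B):
            # X: (n, N) ensemble (columns are members); y: (m,) observations
            # H: (m, n) observation operator; R: (m, m) obs-error covariance
            # B: (n, n) preselected static covariance, as in hybrid EnKF-3DVAR
            n, N = X.shape
            xm = X.mean(axis=1, keepdims=True)
            A = (X - xm) / np.sqrt(N - 1)          # scaled ensemble anomalies
            Pe = A @ A.T                           # low-rank sample covariance

            # EnKF analysis of the ensemble mean (sketch; perturbations omitted)
            K = Pe @ H.T @ np.linalg.inv(H @ Pe @ H.T + R)
            xa = xm + K @ (y[:, None] - H @ xm)

            # Residual misfit the ensemble subspace could not fit
            r = y[:, None] - H @ xa

            # Back-project the residual with an OI gain based on the static B
            K_oi = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
            dx = K_oi @ r

            # Append the correction as one new member, not to all members
            return np.hstack([X, xa + dx])

        # Toy usage with random numbers, purely to show the shapes involved
        rng = np.random.default_rng(0)
        n, N, m = 40, 10, 20
        X = rng.normal(size=(n, N))
        H = np.eye(m, n)
        X_aug = enkf_adaptive_member(X, rng.normal(size=m), H, np.eye(m), np.eye(n))
        print(X_aug.shape)  # (40, 11)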

  7. An ICMP-Based Mobility Management Approach Suitable for Protocol Deployment Limitation

    Directory of Open Access Journals (Sweden)

    Jeng-Yueng Chen

    2009-01-01

    Mobility management is one of the important tasks on wireless networks. Many approaches have been proposed in the past, but none of them has been widely deployed so far. Mobile IP (MIP) and Route Optimization (ROMIP) suffer, respectively, from the triangular routing problem and from requiring binding-cache support on every node on the entire Internet. One step toward a solution is the Mobile Routing Table (MRT), which enables edge routers to take over address binding. However, this approach demands that all edge routers on the Internet support MRT, resulting in protocol deployment difficulties. To address this problem and to offset the limitation of the original MRT approach, we propose two different schemes: an ICMP echo scheme and an ICMP destination-unreachable scheme. These two schemes work with the MRT to efficiently find MRT-enabled routers, greatly reducing the number of triangular routes. In this paper, we analyze and compare the standard MIP and the proposed approaches. Simulation results have shown that the proposed approaches reduce transmission delay, with only a few routers supporting MRT.

  8. Approaches to assessment in time-limited Mentalization Based Therapy for Children (MBT-C)

    Directory of Open Access Journals (Sweden)

    Nick eMidgley

    2015-07-01

    In this article we describe our clinical approach to assessment, formulation and the identification of a therapeutic focus in the context of time-limited Mentalization Based Treatment for children (MBT-C) aged between 6 and 12. Rather than seeing the capacity to mentalize as a global construct, we set out an approach to assessing the developmental 'building blocks' of the capacity to mentalize the self and others, including the capacity for attention regulation, emotion regulation and explicit mentalization. Assessing the child's strengths and vulnerabilities in each of these domains provides a more nuanced picture of the child's mentalizing capacities and difficulties, and can provide a useful approach to case formulation. The article sets out an approach to assessment that includes a consideration of mentalizing strengths and difficulties in both the child and the parents, and shows how this can be used to help develop a mutually agreed treatment focus. A clinical vignette illustrates the approach taken to assessment and connects it to routine clinical practice.

  9. Approaches to assessment in time-limited Mentalization-Based Therapy for Children (MBT-C).

    Science.gov (United States)

    Muller, Nicole; Midgley, Nick

    2015-01-01

    In this article we describe our clinical approach to assessment, formulation and the identification of a therapeutic focus in the context of time-limited Mentalization-Based Treatment for Children (MBT-C) aged between 6 and 12. Rather than seeing the capacity to mentalize as a global construct, we set out an approach to assessing the developmental 'building blocks' of the capacity to mentalize the self and others, including the capacity for attention regulation, emotion regulation, and explicit mentalization. Assessing the child's strengths and vulnerabilities in each of these domains provides a more nuanced picture of the child's mentalizing capacities and difficulties, and can provide a useful approach to case formulation. The article sets out an approach to assessment that includes a consideration of mentalizing strengths and difficulties in both the child and the parents, and shows how this can be used to help develop a mutually agreed treatment focus. A clinical vignette illustrates the approach taken to assessment and connects it to routine clinical practice.

  10. Fundamentals of Structural Geology

    Science.gov (United States)

    Pollard, David D.; Fletcher, Raymond C.

    2005-09-01

    Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills, necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. It is supported by a website hosting images from the book, additional colour images, student exercises and MATLAB scripts. Solutions to the exercises are available to instructors. The book integrates field mapping using modern technology with the analysis of structures based on a complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The supplementary website hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations. The textbook website also includes student exercises designed to instill the fundamental relationships, and to encourage the visualization of the evolution of geological structures; solutions are available to instructors.

  11. Fundamentals of Project Management

    CERN Document Server

    Heagney, Joseph

    2011-01-01

    With sales of more than 160,000 copies, Fundamentals of Project Management has helped generations of project managers navigate the ins and outs of every aspect of this complex discipline. Using a simple step-by-step approach, the book is the perfect introduction to project management tools, techniques, and concepts. Readers will learn how to: • Develop a mission statement, vision, goals, and objectives • Plan the project • Create the work breakdown structure • Produce a workable schedule • Understand earned value analysis • Manage a project team • Control and evaluate progress at every stage.

  12. Fundamentals of Cavitation

    CERN Document Server

    Franc, Jean-Pierre

    2005-01-01

    The present book is aimed at providing a comprehensive presentation of cavitation phenomena in liquid flows. It is further backed up by the experience, both experimental and theoretical, of the authors whose expertise has been internationally recognized. A special effort is made to place the various methods of investigation in strong relation with the fundamental physics of cavitation, enabling the reader to treat specific problems independently. Furthermore, it is hoped that a better knowledge of the cavitation phenomenon will allow engineers to create systems using it positively. Examples in the literature show the feasibility of this approach.

  13. Man as the measure of all things: a limiting approach to urban regeneration?

    Science.gov (United States)

    Hugentobler, Margrit

    2006-01-01

    Urban planning and change in the last century has been guided by concepts of Modernity rooted in the Age of Enlightenment that placed the needs of "rational man" at the core of human endeavors of all kinds. Yet, rather than leading to aesthetically beautiful cities characterized by sustainable resource utilization processes, the anthropocentric approach to urban and economic development has created global problems of depletion of natural resources, massive pollution and growing social imbalances within and between nation states. The widely heralded concept of (economically, environmentally, and socially) sustainable development has not yet produced a fundamental rethinking of our patterns of production and consumption. A multi-systems level framework with which to think about sustainable urban development and regeneration is outlined. It is based on an evolutionary perspective of systems and their emergent qualitatively different properties. A distinction between chemical/physical, biological, human/individual, social and cultural systems levels is made. Broadly framed guiding questions at each system's level are proposed as the basis for the development of sustainability criteria and indicators that can be tailored to any type of project in the planning or evaluation stage. A case study addressing the renewal of urban villages in the mega city of Guangzhou in Southern China illustrates the application potential of the framework to the challenge of urban regeneration.

  14. An approach for modeling sediment budgets in supply-limited rivers

    Science.gov (United States)

    Wright, Scott A.; Topping, David J.; Rubin, David M.; Melis, Theodore S.

    2010-01-01

    was to develop an approach complex enough to capture the processes related to sediment supply limitation but simple enough to allow for rapid calculations of multi-year sediment budgets. The approach relies on empirical relations between suspended sediment concentration and discharge but on a particle size specific basis and also tracks and incorporates the particle size distribution of the bed sediment. We have applied this approach to the Colorado River below Glen Canyon Dam (GCD), a reach that is particularly suited to such an approach because it is substantially sediment supply limited such that transport rates are strongly dependent on both water discharge and sediment supply. The results confirm the ability of the approach to simulate the effects of supply limitation, including periods of accumulation and bed fining as well as erosion and bed coarsening, using a very simple formulation. Although more empirical in nature than standard one-dimensional morphodynamic models, this alternative approach is attractive because its simplicity allows for rapid evaluation of multi-year sediment budgets under a range of flow regimes and sediment supply conditions, and also because it requires substantially less data for model setup and use.
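
    A minimal Python sketch of the kind of particle-size-specific rating-curve budget described above (the size classes, rating coefficients, and the bed-fraction feedback below are illustrative assumptions, not values from the Colorado River application):

        import numpy as np

        size_classes = ["fine_sand", "medium_sand"]
        a = {"fine_sand": 2e-6, "medium_sand": 1e-6}    # rating coefficients (made up)
        b = {"fine_sand": 2.2, "medium_sand": 2.0}      # rating exponents (made up)
        bed = {"fine_sand": 1.0e6, "medium_sand": 2.0e6}     # bed storage, tonnes
        supply = {"fine_sand": 500.0, "medium_sand": 200.0}  # upstream supply, t/day

        for Q in np.full(365, 300.0):                   # a year of daily discharge, m^3/s
            total = sum(bed.values())
            for k in size_classes:
                frac = bed[k] / total                   # bed fining/coarsening feedback
                export = a[k] * Q ** b[k] * frac * 86400.0  # tonnes/day leaving the reach
                bed[k] = max(bed[k] + supply[k] - export, 0.0)

        print({k: round(v) for k, v in bed.items()})    # depleted, coarsened bed

    Because export of each size class is throttled by the fraction of that class remaining on the bed, the fine class depletes toward a supply-limited equilibrium while the bed coarsens, which is the qualitative behavior the abstract describes.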

  15. An Optimization-Based Impedance Approach for Robot Force Regulation with Prescribed Force Limits

    Directory of Open Access Journals (Sweden)

    R. de J. Portillo-Vélez

    2015-01-01

    An optimization-based approach for the regulation of excessive or insufficient forces at the end-effector level is introduced. The objective is to minimize the interaction force error at the robot end effector, while constraining undesired interaction forces. To that end, a dynamic optimization problem (DOP) is formulated considering a dynamic robot impedance model. Penalty functions are considered in the DOP to handle the constraints on the interaction force. The optimization problem is solved online through the gradient flow approach. Convergence properties are presented and stability is established when the force limits are considered in the analysis. The effectiveness of our proposal is validated via experimental results for a robotic grasping task.
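
    As a toy illustration of solving such a force-regulation problem online by a gradient flow with a penalty on excessive force (a one-dimensional sketch under assumed gains, not the paper's DOP):

        # Toy 1-D force regulation via gradient flow: minimize the squared force
        # error, with a quadratic penalty that activates above a prescribed
        # force limit. All gains and values here are illustrative assumptions.
        f_des, f_max = 7.0, 6.0       # desired force exceeds the prescribed limit (N)
        k_env = 2.0                   # environment stiffness: f = k_env * x
        mu = 50.0                     # penalty weight
        eta, dt = 0.2, 0.01           # gradient-flow gain and Euler step

        x = 0.0                       # commanded penetration of the end effector
        for _ in range(2000):
            f = k_env * x
            grad = 2 * (f - f_des) * k_env              # d/dx of (f - f_des)^2
            if f > f_max:                               # penalty term, active set only
                grad += 2 * mu * (f - f_max) * k_env
            x -= eta * dt * grad                        # Euler step of the flow

        print(f"force = {k_env * x:.3f} N")             # ~6.02 N: held near the limit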

  16. Limiting Approach to Generalized Gamma Bessel Model via Fractional Calculus and Its Applications in Various Disciplines

    Directory of Open Access Journals (Sweden)

    Nicy Sebastian

    2015-08-01

    We establish the essentials of fractional calculus according to different approaches that are useful for applications in the theory of probability and stochastic processes. In addition, from this fractional integral one can list out almost all of the extended densities for the pathway parameter q < 1 and q → 1. Here, we bring out the idea of thicker- or thinner-tailed models associated with a gamma-type distribution as a limiting case of the pathway operator. Applications of this extended gamma model in statistical mechanics, input-output models, solar spectral irradiance modeling, etc., are established.
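
    The q → 1 limit invoked above rests on a standard identity; in our notation (the paper's parametrization may differ):

        \[
          \lim_{q \to 1} \bigl[\,1 - (1-q)\,a x\bigr]^{\frac{1}{1-q}} = e^{-a x},
          \qquad
          f_q(x) \propto x^{\gamma - 1}\bigl[\,1 - (1-q)\,a x\bigr]^{\frac{1}{1-q}}
          \;(q < 1)
          \;\longrightarrow\; x^{\gamma - 1} e^{-a x},
        \]

    so the pathway-type density with parameter q < 1 collapses to the gamma-type form in the limit, while q away from 1 yields the thicker- or thinner-tailed alternatives.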

  17. [The concept of the "limit": A metapsychological approach of the Freudian theory].

    Science.gov (United States)

    Tsipa, N; Houssier, F

    2017-01-01

    on this theoretical approach, the psychopathological expression of the concept is studied as well. As a conclusion of this review it is shown that S. Freud was the first who has studied, at least indirectly, the concept of the "limit" by introducing the theory of instincts and throughout his later works.

  18. Physiology-based modelling approaches to characterize fish habitat suitability: Their usefulness and limitations

    Science.gov (United States)

    Teal, Lorna R.; Marras, Stefano; Peck, Myron A.; Domenici, Paolo

    2018-02-01

    empirical relationships which can be found consistently within physiological data across the animal kingdom. The advantages of the DEB models are that they make use of the generalities found in terms of animal physiology and can therefore be applied to species for which little data or empirical observations are available. In addition, the limitations as well as useful potential refinements of these and other physiology-based modelling approaches are discussed. Inclusion of the physiological response of various life stages and modelling the patterns of extreme events observed in nature are suggested for future work.

  19. FUNDAMENTALS OF BIOMECHANICS

    Directory of Open Access Journals (Sweden)

    Duane Knudson

    2007-09-01

    DESCRIPTION: This book provides a broad and in-depth theoretical and practical description of the fundamental concepts in understanding biomechanics in the qualitative analysis of human movement. PURPOSE: The aim is to bring together up-to-date biomechanical knowledge with expert application knowledge. Extensive referencing for students is also provided. FEATURES: This textbook is divided into 12 chapters within four parts, including a lab activities section at the end. The division is as follows: Part 1 Introduction: 1. Introduction to biomechanics of human movement; 2. Fundamentals of biomechanics and qualitative analysis; Part 2 Biological/Structural Bases: 3. Anatomical description and its limitations; 4. Mechanics of the musculoskeletal system; Part 3 Mechanical Bases: 5. Linear and angular kinematics; 6. Linear kinetics; 7. Angular kinetics; 8. Fluid mechanics; Part 4 Application of Biomechanics in Qualitative Analysis: 9. Applying biomechanics in physical education; 10. Applying biomechanics in coaching; 11. Applying biomechanics in strength and conditioning; 12. Applying biomechanics in sports medicine and rehabilitation. AUDIENCE: This is important reading for both students and educators in the medicine, sport and exercise-related fields. For the researcher and lecturer it would be a helpful guide for planning and preparing more detailed experimental designs or lecture and/or laboratory classes in exercise and sport biomechanics. ASSESSMENT: The text provides a constructive fundamental resource for biomechanics, exercise and sport-related students, teachers and researchers, as well as anyone interested in understanding motion. It is also very useful since it is clearly written and presents several examples of the application of biomechanics to help teach and apply biomechanical variables and concepts, including sport-related ones.

  20. Serious limitations of the QTL/Microarray approach for QTL gene discovery

    Directory of Open Access Journals (Sweden)

    Warden Craig H

    2010-07-01

    Background: It has been proposed that the use of gene expression microarrays in nonrecombinant parental or congenic strains can accelerate the process of isolating individual genes underlying quantitative trait loci (QTL). However, the effectiveness of this approach has not been assessed. Results: Thirty-seven studies that have implemented the QTL/microarray approach in rodents were reviewed. About 30% of studies showed enrichment for QTL candidates, mostly in comparisons between congenic and background strains. Three studies led to the identification of an underlying QTL gene. To complement the literature results, a microarray experiment was performed using three mouse congenic strains isolating the effects of at least 25 biometric QTL. Results show that genes in the congenic donor regions were preferentially selected. However, within donor regions, the distribution of differentially expressed genes was homogeneous once gene density was accounted for. Genes within identical-by-descent (IBD) regions were less likely to be differentially expressed in chromosome 2, but not in chromosomes 11 and 17. Furthermore, expression of QTL regulated in cis (cis eQTL) showed higher expression in the background genotype, which was partially explained by the presence of single nucleotide polymorphisms (SNPs). Conclusions: The literature shows limited successes from the QTL/microarray approach to identify QTL genes. Our own results from microarray profiling of three congenic strains revealed a strong tendency to select cis-eQTL over trans-eQTL. IBD regions had little effect on the rate of differential expression, and we provide several reasons why IBD should not be used to discard eQTL candidates. In addition, mismatch probes produced false cis-eQTL that could not be completely removed with the current strains' genotypes and low-probe-density microarrays. The reviewed studies did not account for lack of coverage from the platforms used and therefore removed genes

  1. Fundamental enabling issues in nanotechnology :

    Energy Technology Data Exchange (ETDEWEB)

    Floro, Jerrold Anthony [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Foiles, Stephen Martin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hearne, Sean Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoyt, Jeffrey John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Seel, Steven Craig [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Webb III, Edmund Blackburn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Morales, Alfredo Martin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zimmerman, Jonathan A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2007-10-01

    To effectively integrate nanotechnology into functional devices, fundamental aspects of material behavior at the nanometer scale must be understood. Stresses generated during thin film growth strongly influence component lifetime and performance; stress has also been proposed as a mechanism for stabilizing supported nanoscale structures. Yet the intrinsic connections between the evolving morphology of supported nanostructures and stress generation are still a matter of debate. This report presents results from a combined experiment and modeling approach to study stress evolution during thin film growth. Fully atomistic simulations are presented predicting stress generation mechanisms and magnitudes during all growth stages, from island nucleation to coalescence and film thickening. Simulations are validated by electrodeposition growth experiments, which establish the dependence of microstructure and growth stresses on process conditions and deposition geometry. Sandia is one of the few facilities with the resources to combine experiments and modeling/theory in this close a fashion. Experiments revealed an ongoing coalescence process that generates significant tensile stress. Data from deposition experiments also support the existence of a kinetically limited compressive stress generation mechanism. Atomistic simulations explored island coalescence and deposition onto surfaces intersected by grain boundary structures to permit investigation of stress evolution during later growth stages, e.g., continual island coalescence and adatom incorporation into grain boundaries. The predictive capabilities of simulation permit direct determination of fundamental processes active in stress generation at the nanometer scale while connecting those processes, via new theory, to continuum models for much larger island and film structures. Our combined experiment and simulation results reveal the necessary materials science to tailor stress, and therefore performance, in

  2. Laparoscopic approach of hepatic hydatid double cyst in pediatric patient: difficulties, indications and limitations

    Directory of Open Access Journals (Sweden)

    Isabela M. Drăghici

    2016-05-01

    Purpose: To analyse the laparoscopic management of a rare condition with a potentially severe evolution seen in pediatric surgical pathology. Aims: To outline the optimal surgical approach to a hepatic hydatid double cyst and the limitations of the laparoscopic method. Materials and Methods: The patient is a 6-year-old girl who presented with two simultaneous giant hepatic hydatid cysts (segments VII-VIII) in close vicinity to the right branch of the portal vein and to the hepatic veins; she underwent a single-stage partial pericystectomy (Lagrot) performed by laparoscopy. Results: The procedure had no intraoperative accidents or incidents, and the patient had a good postoperative evolution without immediate or late complications. Trocar positioning was adapted to the patient's size and the cysts' topography. Conclusions: Laparoscopic treatment is feasible and safe, but it is not yet the gold standard for hepatic hydatid disease due to certain inconveniences.

  3. Risk newsboy: approach for addressing uncertainty in developing action levels and cleanup limits

    International Nuclear Information System (INIS)

    Cooke, Roger; MacDonell, Margaret

    2007-01-01

    Site cleanup decisions involve developing action levels and residual limits for key contaminants, to assure health protection during the cleanup period and into the long term. Uncertainty is inherent in the toxicity information used to define these levels, based on incomplete scientific knowledge regarding dose-response relationships across various hazards and exposures at environmentally relevant levels. This problem can be addressed by applying principles used to manage uncertainty in operations research, as illustrated by the newsboy dilemma. Each day a newsboy must balance the risk of buying more papers than he can sell against the risk of not buying enough. Setting action levels and cleanup limits involves a similar concept of balancing and distributing risks and benefits in the face of uncertainty. The newsboy approach can be applied to develop health-based target concentrations for both radiological and chemical contaminants, with stakeholder input being crucial to assessing 'regret' levels. Associated tools include structured expert judgment elicitation to quantify uncertainty in the dose-response relationship, and mathematical techniques such as probabilistic inversion and iterative proportional fitting. (authors)
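
    For reference, the classical newsboy (newsvendor) solution that the analogy draws on balances the unit costs of under- and over-provision at a critical fractile; in our notation (not the paper's), with c_u the unit cost of under-stocking, c_o the unit cost of over-stocking, and F the cumulative distribution of uncertain demand:

        \[
          q^{*} = F^{-1}\!\left(\frac{c_u}{c_u + c_o}\right).
        \]

    By analogy, an action level or cleanup limit would be set where the stakeholder-weighted regret of under-protection balances the regret of over-protection, with the uncertain dose-response relationship playing the role of the demand distribution.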

  4. An approach to criteria, design limits and monitoring in nuclear fuel waste disposal

    International Nuclear Information System (INIS)

    Simmons, G.R.; Baumgartner, P.; Bird, G.A.; Davison, C.C.; Johnson, L.H.; Tamm, J.A.

    1994-12-01

    The Nuclear Fuel Waste Management Program has been established to develop and demonstrate the technology for safe geological disposal of nuclear fuel waste. One objective of the program is to show that a disposal system (i.e., a disposal centre and associated transportation system) can be designed and that it would be safe. Therefore the disposal system must be shown to comply with safety requirements specified in guidelines, standards, codes and regulations. The components of the disposal system must also be shown to operate within the limits specified in their design. Compliance and performance of the disposal system would be assessed on a site-specific basis by comparing estimates of the anticipated performance of the system and its components with compliance or performance criteria. A monitoring program would be developed to consider the effects of the disposal system on the environment and would include three types of monitoring: baseline monitoring, compliance monitoring, and performance monitoring. This report presents an approach to establishing compliance and performance criteria, limits for use in disposal system component design, and the main elements of a monitoring program for a nuclear fuel waste disposal system. (author). 70 refs., 9 tabs., 13 figs

  5. Fundamentals of Structural Engineering

    CERN Document Server

    Connor, Jerome J

    2013-01-01

    Fundamentals of Structural Engineering provides a balanced, seamless treatment of both classic, analytic methods and contemporary, computer-based techniques for conceptualizing and designing a structure. The book’s principal goal is to foster an intuitive understanding of structural behavior based on problem solving experience for students of civil engineering and architecture who have been exposed to the basic concepts of engineering mechanics and mechanics of materials. Making it distinct from many other undergraduate textbooks, the authors of this text recognize the notion that engineers reason about behavior using simple models and intuition they acquire through problem solving. The approach adopted in this text develops this type of intuition by presenting extensive, realistic problems and case studies together with computer simulation, which allows rapid exploration of how a structure responds to changes in geometry and physical parameters. This book also: Emphasizes problem-based understanding of...

  6. Fundamentals of sustainable neighbourhoods

    CERN Document Server

    Friedman, Avi

    2015-01-01

    This book introduces architects, engineers, builders, and urban planners to a range of design principles of sustainable communities and illustrates them with outstanding case studies. Drawing on the author’s experience as well as local and international case studies, Fundamentals of Sustainable Neighbourhoods presents planning concepts that minimize developments' carbon footprint through compact communities, adaptable and expandable dwellings, adaptable landscapes, and smaller-sized yet quality-designed housing. This book also: Examines in-depth global strategies for minimizing the residential carbon footprint, including district heating, passive solar gain, net-zero residences, as well as preserving the communities' natural assets Reconsiders conceptual approaches in building design and urban planning to promote a better connection between communities and nature Demonstrates practical applications of green architecture Focuses on innovative living spaces in urban environments

  7. MDI Biological Laboratory Arsenic Summit: Approaches to Limiting Human Exposure to Arsenic.

    Science.gov (United States)

    Stanton, Bruce A; Caldwell, Kathleen; Congdon, Clare Bates; Disney, Jane; Donahue, Maria; Ferguson, Elizabeth; Flemings, Elsie; Golden, Meredith; Guerinot, Mary Lou; Highman, Jay; James, Karen; Kim, Carol; Lantz, R Clark; Marvinney, Robert G; Mayer, Greg; Miller, David; Navas-Acien, Ana; Nordstrom, D Kirk; Postema, Sonia; Rardin, Laurie; Rosen, Barry; SenGupta, Arup; Shaw, Joseph; Stanton, Elizabeth; Susca, Paul

    2015-09-01

    This report is the outcome of the meeting "Environmental and Human Health Consequences of Arsenic" held at the MDI Biological Laboratory in Salisbury Cove, Maine, August 13-15, 2014. Human exposure to arsenic represents a significant health problem worldwide that requires immediate attention according to the World Health Organization (WHO). One billion people are exposed to arsenic in food, and more than 200 million people ingest arsenic via drinking water at concentrations greater than international standards. Although the US Environmental Protection Agency (EPA) has set a limit of 10 μg/L in public water supplies and the WHO has recommended an upper limit of 10 μg/L, recent studies indicate that these limits are not protective enough. In addition, there are currently few standards for arsenic in food. Those who participated in the Summit support citizens, scientists, policymakers, industry, and educators at the local, state, national, and international levels to (1) establish science-based evidence for setting standards at the local, state, national, and global levels for arsenic in water and food; (2) work with government agencies to set regulations for arsenic in water and food, to establish and strengthen non-regulatory programs, and to strengthen collaboration among government agencies, NGOs, academia, the private sector, industry, and others; (3) develop novel and cost-effective technologies for identification and reduction of exposure to arsenic in water; (4) develop novel and cost-effective approaches to reduce arsenic exposure in juice, rice, and other relevant foods; and (5) develop an Arsenic Education Plan to guide the development of science curricula as well as community outreach and education programs that serve to inform students and consumers about arsenic exposure and engage them in well water testing and development of remediation strategies.

  8. A qualitative risk assessment approach for Swiss dairy products: opportunities and limitations.

    Science.gov (United States)

    Menéndez González, S; Hartnack, S; Berger, T; Doherr, M; Breidenbach, E

    2011-05-01

    Switzerland implemented a risk-based monitoring of Swiss dairy products in 2002 based on a risk assessment (RA) that considered the probability of exceeding a microbiological limit value set by law. A new RA was launched in 2007 to review and further develop the previous assessment, and to make recommendations for future risk-based monitoring according to current risks. The resulting qualitative RA was designed to ascertain the risk to human health from the consumption of Swiss dairy products. The products and microbial hazards to be considered in the RA were determined based on a risk profile. The hazards included Campylobacter spp., Listeria monocytogenes, Salmonella spp., Shiga toxin-producing Escherichia coli, coagulase-positive staphylococci and Staphylococcus aureus enterotoxin. The release assessment considered the prevalence of the hazards in bulk milk samples, the influence of the process parameters on the microorganisms, and the influence of the type of dairy. The exposure assessment was linked to the production volume. An overall probability was estimated combining the probabilities of release and exposure for each combination of hazard, dairy product and type of dairy. This overall probability represents the likelihood of a product from a certain type of dairy exceeding the microbiological limit value and being passed on to the consumer. The consequences could not be fully assessed due to lack of detailed information on the number of disease cases caused by the consumption of dairy products. The results were expressed as a ranking of overall probabilities. Finally, recommendations for the design of the risk-based monitoring programme and for filling the identified data gaps were given. The aims of this work were (i) to present the qualitative RA approach for Swiss dairy products, which could be adapted to other settings and (ii) to discuss the opportunities and limitations of the qualitative method. © 2010 Blackwell Verlag GmbH.

  9. The Limitations of Existing Approaches in Improving MicroRNA Target Prediction Accuracy.

    Science.gov (United States)

    Loganantharaj, Rasiah; Randall, Thomas A

    2017-01-01

    MicroRNAs (miRNAs) are small (18-24 nt) endogenous RNAs found across diverse phyla involved in posttranscriptional regulation, primarily downregulation of mRNAs. Experimentally determining miRNA-mRNA interactions can be expensive and time-consuming, making the accurate computational prediction of miRNA targets a high priority. Since miRNA-mRNA base pairing in mammals is not perfectly complementary and only a fraction of the identified motifs are real binding sites, accurately predicting miRNA targets remains challenging. The limitations and bottlenecks of existing algorithms and approaches are discussed in this chapter.A new miRNA-mRNA interaction algorithm was implemented in Python (TargetFind) to capture three different modes of association and to maximize detection sensitivity to around 95% for mouse (mm9) and human (hg19) reference data. For human (hg19) data, the prediction accuracy with any one feature among evolutionarily conserved score, multiple targets in a UTR or changes in free energy varied within a close range from 63.5% to 66%. When the results of these features are combined with majority voting, the expected prediction accuracy increases to 69.5%. When all three features are used together, the average best prediction accuracy with tenfold cross validation from the classifiers naïve Bayes, support vector machine, artificial neural network, and decision tree were, respectively, 66.5%, 67.1%, 69%, and 68.4%. The results reveal the advantages and limitations of these approaches.When comparing different sets of features on their strength in predicting true hg19 targets, evolutionarily conserved score slightly outperformed all other features based on thermostability, and target multiplicity. The sophisticated supervised learning algorithms did not improve the prediction accuracy significantly compared to a simple threshold based approach on conservation score or combining the results of each feature with majority agreements. The targets from randomly
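
    A minimal sketch of the majority-voting combination described above, where each of the three features alone makes a threshold call and a candidate is accepted when at least two agree (the thresholds and feature encodings here are hypothetical placeholders, not those of TargetFind):

        import numpy as np

        def majority_vote(conservation, n_sites, delta_g,
                          cons_thresh=0.5, sites_thresh=2, dg_thresh=-15.0):
            # Each feature votes 'real target' via a simple threshold
            # (thresholds are illustrative). Accept on >= 2 of 3 votes.
            votes = (
                (conservation >= cons_thresh).astype(int)
                + (n_sites >= sites_thresh).astype(int)
                + (delta_g <= dg_thresh).astype(int)
            )
            return votes >= 2

        # Toy candidates: (conservation score, #sites in the UTR, duplex free energy)
        cons = np.array([0.9, 0.2, 0.6])
        sites = np.array([3, 1, 1])
        ddg = np.array([-20.0, -5.0, -18.0])
        print(majority_vote(cons, sites, ddg))   # [ True False  True]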

  10. Calibrating Fundamental British Values: How Head Teachers Are Approaching Appraisal in the Light of the Teachers' Standards 2012, Prevent and the Counter-Terrorism and Security Act, 2015

    Science.gov (United States)

    Revell, Lynn; Bryan, Hazel

    2016-01-01

    In requiring that teachers should "not undermine fundamental British values (FBV)," a phrase originally articulated in the Home Office counter-terrorism document, Prevent, the Teachers' Standards has brought into focus the nature of teacher professionalism. Teachers in England are now required to promote FBV within and outside school,…

  11. Exchange Rates and Fundamentals.

    Science.gov (United States)

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I(1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…

  12. Handling limited datasets with neural networks in medical applications: A small-data approach.

    Science.gov (United States)

    Shaikhina, Torgyn; Khovanova, Natalia A

    2017-01-01

    Single-centre studies in the medical domain are often characterised by limited samples due to the complexity and high costs of patient data collection. Machine learning methods for regression modelling of small datasets (less than 10 observations per predictor variable) remain scarce. Our work bridges this gap by developing a novel framework for application of artificial neural networks (NNs) for regression tasks involving small medical datasets. In order to address the sporadic fluctuations and validation issues that appear in regression NNs trained on small datasets, the methods of multiple runs and surrogate data analysis were proposed in this work. The approach was compared to the state-of-the-art ensemble NNs; the effect of dataset size on NN performance was also investigated. The proposed framework was applied for the prediction of compressive strength (CS) of femoral trabecular bone in patients suffering from severe osteoarthritis. The NN model was able to estimate the CS of osteoarthritic trabecular bone from its structural and biological properties with a standard error of 0.85 MPa. When evaluated on independent test samples, the NN achieved accuracy of 98.3%, outperforming an ensemble NN model by 11%. We reproduce this result on CS data of another porous solid (concrete) and demonstrate that the proposed framework allows for an NN modelled with as few as 56 samples to generalise on 300 independent test samples with 86.5% accuracy, which is comparable to the performance of an NN developed with an 18 times larger dataset (1030 samples). The significance of this work is two-fold: the practical application allows for non-destructive prediction of bone fracture risk, while the novel methodology extends beyond the task considered in this study and provides a general framework for application of regression NNs to medical problems characterised by limited dataset sizes. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
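
    The multiple-runs idea can be sketched in a few lines: train the same small network repeatedly from different random initialisations on the same small dataset and aggregate the runs, damping the sporadic fluctuations a single run shows. The following Python snippet is a toy reconstruction on synthetic data (the paper's surrogate-data analysis is omitted):

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(42)
        X = rng.uniform(0, 1, size=(56, 3))      # ~56 samples, as in the paper
        y = 2.0 * X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=56)

        # Multiple runs: same architecture, different random initialisations
        runs = [
            MLPRegressor(hidden_layer_sizes=(4,), max_iter=5000,
                         random_state=seed).fit(X, y)
            for seed in range(10)
        ]

        # Aggregate the runs' predictions on unseen inputs
        X_test = rng.uniform(0, 1, size=(5, 3))
        preds = np.mean([m.predict(X_test) for m in runs], axis=0)
        print(preds)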

  13. Experiencing a probabilistic approach to clarify and disclose uncertainties when setting occupational exposure limits.

    Science.gov (United States)

    Vernez, David; Fraize-Frontier, Sandrine; Vincent, Raymond; Binet, Stéphane; Rousselle, Christophe

    2018-03-15

    Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors take into account variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variabilities, exposure-duration extrapolation, or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the no-observed-adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts, and often a conservative value is used as a default choice. A probabilistic framework to better take into account uncertainties and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. Each AF is considered as a random variable with a probabilistic distribution. A short literature review was conducted before setting default distribution ranges and shapes for each commonly used AF. Random sampling, using Monte Carlo techniques, is then used to propagate the identified uncertainties and compute the final OEL distribution. Starting from the broad default distributions obtained, experts narrow them to the most likely range, according to the scientific knowledge available for a specific chemical. Introducing distributions rather than single deterministic values allows disclosing and clarifying the variability and/or uncertainties inherent to the OEL construction process. This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides better support for decision-making and improves transparency. This work is available in Open Access and licensed under a CC BY-NC 3.0 PL license.
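
    A minimal Monte Carlo sketch of the propagation step described above (the point of departure and all distribution choices below are illustrative assumptions, not the paper's defaults):

        import numpy as np

        # Treat each assessment factor as a random variable, propagate by
        # Monte Carlo, and read the OEL off the resulting distribution.
        rng = np.random.default_rng(1)
        n = 100_000
        pod = 50.0                                   # point of departure, mg/m^3

        af_interspecies = rng.lognormal(mean=np.log(3.0), sigma=0.5, size=n)
        af_intraspecies = rng.lognormal(mean=np.log(3.0), sigma=0.5, size=n)
        af_loael_to_noael = rng.uniform(1.0, 10.0, size=n)  # if the POD were a LOAEL

        oel = pod / (af_interspecies * af_intraspecies * af_loael_to_noael)

        # Report the distribution rather than a single deterministic value
        print(f"median OEL: {np.median(oel):.2f} mg/m^3")
        print(f"5th percentile (conservative choice): {np.percentile(oel, 5):.2f} mg/m^3")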

  14. Hankin and Reeves' approach to estimating fish abundance in small streams: limitations and potential options

    International Nuclear Information System (INIS)

    Thompson, William L.

    2000-01-01

    Hankin and Reeves' (1988) approach to estimating fish abundance in small streams has been applied in stream-fish studies across North America. However, as with any method of population estimation, there are important assumptions that must be met for estimates to be minimally biased and reasonably precise. Consequently, I investigated effects of various levels of departure from these assumptions via simulation based on results from an example application in Hankin and Reeves (1988) and a spatially clustered population. Coverage of 95% confidence intervals averaged about 5% less than nominal when removal estimates equaled true numbers within sampling units, but averaged 62%-86% less than nominal when they did not, with the exception where detection probabilities of individuals were >0.85 and constant across sampling units (95% confidence interval coverage = 90%). True total abundances averaged far (20%-41%) below the lower confidence limit when not included within intervals, which implies large negative bias. Further, the average coefficient of variation was about 1.5 times higher when removal estimates did not equal true numbers within sampling units (CV = 0.27 [SE = 0.0004]) than when they did (CV = 0.19 [SE = 0.0002]). A potential modification to Hankin and Reeves' approach is to include environmental covariates that affect detection rates of fish into the removal model or other mark-recapture model. A potential alternative is to use snorkeling in combination with line transect sampling to estimate fish densities. Regardless of the method of population estimation, a pilot study should be conducted to validate the enumeration method, which requires a known (or nearly so) population of fish to serve as a benchmark to evaluate bias and precision of population estimates.
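
    For context, the removal estimates referred to above are typically computed with the classical two-pass removal estimator; in standard notation, with C_1 and C_2 the catches on the first and second passes:

        \[
          \hat{N} = \frac{C_1^{\,2}}{C_1 - C_2},
          \qquad
          \hat{p} = \frac{C_1 - C_2}{C_1},
        \]

    where \hat{p} is the estimated per-pass capture probability. The simulations above concern what happens when \hat{N} departs from the true number within sampling units.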

  15. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    fundamental assumptions. A recent focus set in the Astrophysical Journal Letters, titled Focus on Exploring Fundamental Physics with Extragalactic Transients, consists of multiple published studies doing just that. Testing General Relativity: Several of the articles focus on the 4th point above. By assuming that the delay in photon arrival times is only due to the gravitational potential of the Milky Way, these studies set constraints on the deviation of our galaxy's gravitational potential from what GR would predict. The study by He Gao et al. uses the different photon arrival times from gamma-ray bursts to set constraints at eV-GeV energies, and the study by Jun-Jie Wei et al. complements this by setting constraints at keV-TeV energies using photons from high-energy blazar emission. Photons or neutrinos from different extragalactic transients each set different upper limits on Δγ, the post-Newtonian parameter, vs. particle energy or frequency. This is a test of Einstein's equivalence principle: if the principle is correct, Δγ would be exactly zero, meaning that photons of different energies move at the same velocity through a vacuum. [Tingay & Kaplan 2016] S. J. Tingay & D. L. Kaplan make the case that measuring the time delay of photons from fast radio bursts (FRBs; transient radio pulses that last only a few milliseconds) will provide even tighter constraints, if we are able to accurately determine distances to these FRBs. And Adi Nusser argues that the large-scale structure of the universe plays an even greater role than the Milky Way's gravitational potential, allowing for even stricter testing of Einstein's equivalence principle. The ever-narrower constraints from these studies all support GR as a correct set of rules through which to interpret our universe. Other Tests of Fundamental Physics: In addition to the above tests, Xue-Feng Wu et al. show that FRBs can be used to provide severe constraints on the rest mass of the photon, and S. Croft et al. even touch on what we

  16. 40 CFR Appendix A to Subpart Kk of... - Data Quality Objective and Lower Confidence Limit Approaches for Alternative Capture Efficiency...

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 10 2010-07-01 2010-07-01 false Data Quality Objective and Lower... to Subpart KK of Part 63—Data Quality Objective and Lower Confidence Limit Approaches for Alternative...) protocols and test methods that satisfy the criteria of either the data quality objective (DQO) approach or...

  17. A computational approach to achieve situational awareness from limited observations of a complex system

    Science.gov (United States)

    Sherwin, Jason

    At the start of the 21st century, the topic of complexity remains a formidable challenge in engineering, science and other aspects of our world. It seems that when disaster strikes it is because some complex and unforeseen interaction causes the unfortunate outcome. Why did the financial system of the world meltdown in 2008--2009? Why are global temperatures on the rise? These questions and other ones like them are difficult to answer because they pertain to contexts that require lengthy descriptions. In other words, these contexts are complex. But we as human beings are able to observe and recognize this thing we call 'complexity'. Furthermore, we recognize that there are certain elements of a context that form a system of complex interactions---i.e., a complex system. Many researchers have even noted similarities between seemingly disparate complex systems. Do sub-atomic systems bear resemblance to weather patterns? Or do human-based economic systems bear resemblance to macroscopic flows? Where do we draw the line in their resemblance? These are the kinds of questions that are asked in complex systems research. And the ability to recognize complexity is not only limited to analytic research. Rather, there are many known examples of humans who, not only observe and recognize but also, operate complex systems. How do they do it? Is there something superhuman about these people or is there something common to human anatomy that makes it possible to fly a plane? Or to drive a bus? Or to operate a nuclear power plant? Or to play Chopin's etudes on the piano? In each of these examples, a human being operates a complex system of machinery, whether it is a plane, a bus, a nuclear power plant or a piano. What is the common thread running through these abilities? The study of situational awareness (SA) examines how people do these types of remarkable feats. It is not a bottom-up science though because it relies on finding general principles running through a host of varied

  18. An approach to the derivation of radionuclide intake limits for members of the public

    International Nuclear Information System (INIS)

    Thompson, R.C.

    1980-01-01

    The modification of occupational exposure limits for application to general populations is discussed. First, the permitted radiation dose needs to be modified from that considered appropriate for occupational exposure, to that considered appropriate for the particular general population exposure of concern. This is a problem of optimization and is considered only briefly. The second modification allows for the different physical, biological, and societal parameters applicable to general populations as contrasted with occupational populations. These differences derive from the heterogeneity of the general population particularly in terms of age and state-of-health, as these affect radionuclide deposition, absorption, distribution, and retention, and as they affect basic sensitivity to the development of detrimental effects. Environmental factors will influence physical availability and may alter the chemical and physical form of the radionuclide, and hence biological availability to the general population. Societal factors may modify the potential for exposure of different segments of the general population. This complex modifying factor will be different for each radioelement. The suggested approach is illustrated using plutonium as an example. (H.K.)

  19. Nonlinear flowering responses to climate: are species approaching their limits of phenological change?

    Science.gov (United States)

    Iler, Amy M.; Høye, Toke T.; Inouye, David W.; Schmidt, Niels M.

    2013-01-01

    Many alpine and subalpine plant species exhibit phenological advancements in association with earlier snowmelt. While the phenology of some plant species does not advance beyond a threshold snowmelt date, the prevalence of such threshold phenological responses within plant communities is largely unknown. We therefore examined the shape of flowering phenology responses (linear versus nonlinear) to climate using two long-term datasets from plant communities in snow-dominated environments: Gothic, CO, USA (1974–2011) and Zackenberg, Greenland (1996–2011). For a total of 64 species, we determined whether a linear or nonlinear regression model best explained interannual variation in flowering phenology in response to increasing temperatures and advancing snowmelt dates. The most common nonlinear trend was for species to flower earlier as snowmelt advanced, with either no change or a slower rate of change when snowmelt was early (average 20% of cases). By contrast, some species advanced their flowering at a faster rate over the warmest temperatures relative to cooler temperatures (average 5% of cases). Thus, some species seem to be approaching their limits of phenological change in response to snowmelt but not temperature. Such phenological thresholds could either be a result of minimum springtime photoperiod cues for flowering or a slower rate of adaptive change in flowering time relative to changing climatic conditions. PMID:23836793
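
    The linear-versus-nonlinear comparison at the heart of this analysis can be illustrated with a toy model-selection exercise: fit a linear and a threshold ("broken-stick") response to synthetic snowmelt data and choose between them by AIC (our sketch, not the authors' code):

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(7)
        snowmelt = rng.uniform(120, 180, size=38)           # snowmelt day of year
        true_bp = 140.0                                     # threshold snowmelt date
        flower = 100 + 0.8 * np.maximum(snowmelt - true_bp, 0) \
                 + rng.normal(0, 2, 38)                     # flowering day, synthetic

        def broken_stick(x, b0, b1, bp):
            # No change before the breakpoint bp, linear response after it
            return b0 + b1 * np.maximum(x - bp, 0.0)

        def aic(rss, n, k):
            return n * np.log(rss / n) + 2 * k

        n = len(snowmelt)
        lin = np.polyfit(snowmelt, flower, 1)
        rss_lin = np.sum((flower - np.polyval(lin, snowmelt)) ** 2)

        popt, _ = curve_fit(broken_stick, snowmelt, flower, p0=[100.0, 1.0, 150.0])
        rss_bs = np.sum((flower - broken_stick(snowmelt, *popt)) ** 2)

        print(f"AIC linear: {aic(rss_lin, n, 2):.1f}  "
              f"AIC broken-stick: {aic(rss_bs, n, 3):.1f}")
        print(f"estimated threshold: day {popt[2]:.1f}")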

  20. Multi-variable flood damage modelling with limited data using supervised learning approaches

    Directory of Open Access Journals (Sweden)

    D. Wagenaar

    2017-09-01

    Flood damage assessment is usually done with damage curves only dependent on the water depth. Several recent studies have shown that supervised learning techniques applied to a multi-variable data set can produce significantly better flood damage estimates. However, creating and applying a multi-variable flood damage model requires an extensive data set, which is rarely available, and this is currently holding back the widespread application of these techniques. In this paper we enrich a data set of residential building and contents damage from the Meuse flood of 1993 in the Netherlands, to make it suitable for multi-variable flood damage assessment. Results from 2-D flood simulations are used to add information on flow velocity, flood duration and the return period to the data set, and cadastre data are used to add information on building characteristics. Next, several statistical approaches are used to create multi-variable flood damage models, including regression trees, bagging regression trees, random forest, and a Bayesian network. Validation on data points from a test set shows that the enriched data set in combination with the supervised learning techniques delivers a 20 % reduction in the mean absolute error, compared to a simple model only based on the water depth, despite several limitations of the enriched data set. We find that with our data set, the tree-based methods perform better than the Bayesian network.

  1. Multi-variable flood damage modelling with limited data using supervised learning approaches

    Science.gov (United States)

    Wagenaar, Dennis; de Jong, Jurjen; Bouwer, Laurens M.

    2017-09-01

    Flood damage assessment is usually done with damage curves only dependent on the water depth. Several recent studies have shown that supervised learning techniques applied to a multi-variable data set can produce significantly better flood damage estimates. However, creating and applying a multi-variable flood damage model requires an extensive data set, which is rarely available, and this is currently holding back the widespread application of these techniques. In this paper we enrich a data set of residential building and contents damage from the Meuse flood of 1993 in the Netherlands, to make it suitable for multi-variable flood damage assessment. Results from 2-D flood simulations are used to add information on flow velocity, flood duration and the return period to the data set, and cadastre data are used to add information on building characteristics. Next, several statistical approaches are used to create multi-variable flood damage models, including regression trees, bagging regression trees, random forest, and a Bayesian network. Validation on data points from a test set shows that the enriched data set in combination with the supervised learning techniques delivers a 20 % reduction in the mean absolute error, compared to a simple model only based on the water depth, despite several limitations of the enriched data set. We find that with our data set, the tree-based methods perform better than the Bayesian network.
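
    The depth-only versus multi-variable comparison can be illustrated with a small supervised-learning sketch on synthetic data (the real study uses enriched observations from the 1993 Meuse flood; the variable names and the toy damage relationship below are our assumptions):

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.metrics import mean_absolute_error
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n = 2000
        depth = rng.gamma(2.0, 0.5, n)              # water depth (m)
        velocity = rng.gamma(1.5, 0.3, n)           # flow velocity (m/s)
        duration = rng.gamma(2.0, 12.0, n)          # flood duration (h)
        damage = 10_000 * depth + 4_000 * velocity + 50 * duration \
                 + rng.normal(0, 2_000, n)          # EUR, toy relationship

        X = np.column_stack([depth, velocity, duration])
        X_tr, X_te, y_tr, y_te = train_test_split(X, damage, random_state=0)

        # Depth-only baseline vs. multi-variable model, same learner
        depth_only = RandomForestRegressor(random_state=0).fit(X_tr[:, :1], y_tr)
        multi_var = RandomForestRegressor(random_state=0).fit(X_tr, y_tr)

        print("MAE depth-only:", mean_absolute_error(y_te, depth_only.predict(X_te[:, :1])))
        print("MAE multi-var :", mean_absolute_error(y_te, multi_var.predict(X_te)))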

  2. Inferring gas-phase metallicity gradients of galaxies at the seeing limit: a forward modelling approach

    Science.gov (United States)

    Carton, David; Brinchmann, Jarle; Shirazi, Maryam; Contini, Thierry; Epinat, Benoît; Erroz-Ferrer, Santiago; Marino, Raffaella A.; Martinsson, Thomas P. K.; Richard, Johan; Patrício, Vera

    2017-06-01

    We present a method to recover the gas-phase metallicity gradients from integral field spectroscopic (IFS) observations of barely resolved galaxies. We take a forward modelling approach and compare our models to the observed spatial distribution of emission-line fluxes, accounting for the degrading effects of seeing and spatial binning. The method is flexible and is not limited to particular emission lines or instruments. We test the model through comparison to synthetic observations and use downgraded observations of nearby galaxies to validate this work. As a proof of concept, we also apply the model to real IFS observations of high-redshift galaxies. From our testing, we show that the inferred metallicity gradients and central metallicities are fairly insensitive to the assumptions made in the model and that they are reliably recovered for galaxies with sizes approximately equal to the half width at half-maximum of the point spread function. However, we also find that the presence of star-forming clumps can significantly complicate the interpretation of metallicity gradients in moderately resolved high-redshift galaxies. Therefore, we emphasize that care should be taken when comparing nearby well-resolved observations to high-redshift observations of partially resolved galaxies.
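
    To make the forward-modelling idea concrete, the sketch below projects an assumed linear metallicity gradient onto a flux-weighted image and degrades it by a Gaussian point spread function before "observation". All sizes, profiles, and the 0.6 arcsec seeing are assumptions for illustration, not values from the paper.

        # Illustrative beam-smearing demo: the seeing-convolved, flux-weighted
        # metallicity map is flatter than the input gradient.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        npix, scale = 64, 0.2          # image size, arcsec per pixel (assumed)
        y, x = np.mgrid[:npix, :npix]
        r = np.hypot(x - npix / 2, y - npix / 2) * scale   # radius in arcsec

        flux = np.exp(-r / 0.5)                 # exponential light profile (assumed)
        logZ = 8.7 - 0.1 * r                    # model: central 12+log(O/H) and gradient

        # Blur the line-flux-weighted metallicity by the PSF (seeing).
        psf_sigma = 0.6 / scale / 2.355         # 0.6" FWHM seeing -> sigma in pixels
        num = gaussian_filter(flux * logZ, psf_sigma)
        den = gaussian_filter(flux, psf_sigma)
        logZ_seen = num / den                   # what a seeing-limited IFU would infer

        print("true central metallicity:", logZ[npix // 2, npix // 2])
        print("seen central metallicity:", logZ_seen[npix // 2, npix // 2])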

  3. The fundamentals of mathematical analysis

    CERN Document Server

    Fikhtengol'ts, G M

    1965-01-01

    The Fundamentals of Mathematical Analysis, Volume 1 is a textbook that provides a systematic and rigorous treatment of the fundamentals of mathematical analysis. Emphasis is placed on the concept of limit which plays a principal role in mathematical analysis. Examples of the application of mathematical analysis to geometry, mechanics, physics, and engineering are given. This volume is comprised of 14 chapters and begins with a discussion on real numbers, their properties and applications, and arithmetical operations over real numbers. The reader is then introduced to the concept of function, i

  4. Benefits and Limitations of Block Periodized Training Approaches to Athletes' Preparation: A Review.

    Science.gov (United States)

    Issurin, Vladimir B

    2016-03-01

    The present review introduces innovative concepts of training periodization and summarizes a large body of findings characterizing their potential benefits and possible limitations. Evidence-based analysis of the traditional periodization model led to elaboration of alternative versions of athletic preparation. These alternative versions postulated the superiority of training programs with a high concentration of selected workloads compared with traditionally designed plans directed at the concurrent development of many athletic abilities at low/medium workload concentration. The training cycles of highly concentrated specialized workloads were coined "training blocks" by experts and practitioners; correspondingly, the alternative versions were termed "block periodized (BP) preparation systems" by their presenters. Ultimately, two BP training models were proposed: a concentrated unidirectional training model (CU) and a multi-targeted BP approach to athletes' preparation. The first innovative version postulated administration of highly concentrated training means for enhancement of one leading fitness component, whereas the second version proposed the development of many targeted abilities within sequenced block mesocycles containing a minimal number of compatible training modalities. Both versions differ in their methodological background, duration and content of training blocks, possibilities of providing multi-peak performances, and applicability to various sports. In recent decades, many studies have evaluated the effects of both BP training versions in different sports. Examination of the training effects produced by the CU model in combat and team sports has found significant gains in various fitness estimates but not in sport-specific performances. Similarly, utilization of a CU program by elite swimmers did not lead to substantial enhancement of their peak performances. In contrast, studies of multi-targeted BP training programs have revealed their distinct

  5. Plant nutrition between chemical and physiological limitations: is a sustainable approach possible?

    Directory of Open Access Journals (Sweden)

    Roberto Pinton

    2008-04-01

    Full Text Available The estimate of world population growth and the extent of malnutrition problems due to lack of food or to deficits of specific micronutrients highlight the importance of plant nutrition in the context of sustainable development. Besides these aspects, which compel the use of fertilizers, the issue of nutrient use efficiency by plants is far from being solved: recent estimates for world cereal production indicate that the use efficiency of nitrogen fertilizers is not higher than 35%. These values are even smaller for phosphorus fertilizers (estimated use efficiency between 10 and 30%), worsened by the fact that, with present technology and on the basis of present knowledge, the phosphorus reserves used for fertilizer production are expected to be sufficient for less than 100 years. Efficiency problems have also been raised recently concerning the use of synthetic chelates to alleviate micronutrient deficiencies: these compounds have been shown to be extremely mobile along the soil profile and to be only partially utilizable by plants. The low uptake efficiency of nutrients from soil is, on the one hand, caused by several intrinsic characteristics of the biogeochemical cycle of nutrients and, on the other, seems to be limited by biochemical and physiological aspects of nutrient absorption. Only recently has the complexity of these aspects been appreciated, and it has been realized that breeding programs had neglected these problems. In this review, aspects related to the acquisition of a macro- (N) and a micro- (Fe) nutrient will be discussed. The aim is to show that improvements in mineral nutrient use efficiency can be achieved only through a scientific approach that considers the whole soil-plant system. Particular emphasis will be put on aspects of molecular physiology relevant to the improvement of nutrient capture efficiency; furthermore, the role of naturally occurring organic molecules in optimizing the nutritional capacity of

  6. Plant nutrition between chemical and physiological limitations: is a sustainable approach possible?

    Directory of Open Access Journals (Sweden)

    Roberto Pinton

    2011-02-01

    Full Text Available The estimate of world population growth and the extent of malnutrition problems due to lack of food or to deficits of specific micronutrients highlight the importance of plant nutrition in the context of sustainable development. Besides these aspects, which compel the use of fertilizers, the issue of nutrient use efficiency by plants is far from being solved: recent estimates for world cereal production indicate that the use efficiency of nitrogen fertilizers is not higher than 35%. These values are even smaller for phosphorus fertilizers (estimated use efficiency between 10 and 30%), worsened by the fact that, with present technology and on the basis of present knowledge, the phosphorus reserves used for fertilizer production are expected to be sufficient for less than 100 years. Efficiency problems have also been raised recently concerning the use of synthetic chelates to alleviate micronutrient deficiencies: these compounds have been shown to be extremely mobile along the soil profile and to be only partially utilizable by plants. The low uptake efficiency of nutrients from soil is, on the one hand, caused by several intrinsic characteristics of the biogeochemical cycle of nutrients and, on the other, seems to be limited by biochemical and physiological aspects of nutrient absorption. Only recently has the complexity of these aspects been appreciated, and it has been realized that breeding programs had neglected these problems. In this review, aspects related to the acquisition of a macro- (N) and a micro- (Fe) nutrient will be discussed. The aim is to show that improvements in mineral nutrient use efficiency can be achieved only through a scientific approach that considers the whole soil-plant system. Particular emphasis will be put on aspects of molecular physiology relevant to the improvement of nutrient capture efficiency; furthermore, the role of naturally occurring organic molecules in optimizing the nutritional capacity of

  7. A new approach to define acceptance limits for hematology in external quality assessment schemes.

    Science.gov (United States)

    Soumali, Mohamed Rida; Van Blerk, Marjan; Akharif, Abdelhadi; Albarède, Stéphanie; Kesseler, Dagmar; Gutierrez, Gabriela; de la Salle, Barbara; Plum, Inger; Guyard, Anne; Favia, Ana Paula; Coucke, Wim

    2017-10-26

    A study performed in 2007 comparing the evaluation procedures used in European external quality assessment schemes (EQAS) for hemoglobin and leukocyte concentrations showed that acceptance criteria vary widely. For this reason, the Hematology working group of the European Organisation for External Quality Assurance Providers in Laboratory Medicine (EQALM) decided to perform a statistical study with the aim of establishing appropriate acceptance limits (ALs) allowing harmonization between the evaluation procedures of European EQAS organizers. Eight EQAS organizers from seven European countries provided their hematology survey results from 2010 to 2012 for red blood cells (RBC), hemoglobin, hematocrit, mean corpuscular volume (MCV), white blood cells (WBC), platelets and reticulocytes. More than 440,000 data points were collected. The relation between the absolute values of the relative differences between reported EQA results and their corresponding assigned values (U-scores) and the target concentration was modeled by means of an adaptation of Thompson's "characteristic function". Quantile regression was used to investigate the percentiles of the U-scores for each target concentration range. For deriving ALs, focus was mainly on the upper percentiles (90th, 95th and 99th). For RBC, hemoglobin, hematocrit and MCV, no relation was found between the U-scores and the target concentrations for any of the percentiles. For WBC, platelets and reticulocytes, a relation with the target concentrations was found and concentration-dependent ALs were determined. The approach made it possible to determine state-of-the-art-based ALs that are concentration-dependent when necessary and usable by various EQA providers. It could also easily be applied to other domains.
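
    A minimal sketch of the percentile-modelling step follows, assuming a simple concentration-dependent basis and synthetic data; the study adapted Thompson's characteristic function, which this toy basis does not reproduce.

        # Regress the 95th percentile of U-scores on target concentration with
        # quantile regression; all data and the 1/concentration basis are invented.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        conc = rng.uniform(50, 500, 2000)            # e.g. platelet counts (invented)
        u = np.abs(rng.normal(0, 0.05 + 5 / conc))   # spread grows at low concentration

        X = sm.add_constant(1.0 / conc)              # simple concentration-dependent basis
        res = sm.QuantReg(u, X).fit(q=0.95)          # 95th-percentile curve

        for c in (50, 200, 500):
            print(f"AL at concentration {c}: {res.params[0] + res.params[1] / c:.3f}")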

  8. Fundamentals of gas dynamics

    CERN Document Server

    Babu, V

    2014-01-01

    Fundamentals of Gas Dynamics, Second Edition is a comprehensively updated new edition and now includes a chapter on the gas dynamics of steam. It covers the fundamental concepts and governing equations of different flows, and includes end-of-chapter exercises based on practical applications. A number of useful tables on the thermodynamic properties of steam are also included. Fundamentals of Gas Dynamics, Second Edition begins with an introduction to compressible and incompressible flows before covering the fundamentals of one-dimensional flows and normal shock waves

  9. Fundamental Safety Principles

    International Nuclear Information System (INIS)

    Abdelmalik, W.E.Y.

    2011-01-01

    This work presents a summary of the IAEA Safety Standards Series publication No. SF-1, entitled "Fundamental Safety Principles", published in 2006. This publication states the fundamental safety objective and ten associated safety principles, and briefly describes their intent and purposes. Safety measures and security measures have in common the aim of protecting human life and health and the environment. These safety principles are: 1) Responsibility for safety, 2) Role of the government, 3) Leadership and management for safety, 4) Justification of facilities and activities, 5) Optimization of protection, 6) Limitation of risks to individuals, 7) Protection of present and future generations, 8) Prevention of accidents, 9) Emergency preparedness and response and 10) Protective action to reduce existing or unregulated radiation risks. The safety principles concern the security of facilities and activities to the extent that they apply to measures that contribute to both safety and security. Safety measures and security measures must be designed and implemented in an integrated manner so that security measures do not compromise safety and safety measures do not compromise security.

  10. Fundamental neutron physics

    International Nuclear Information System (INIS)

    Deslattes, R.; Dombeck, T.; Greene, G.; Ramsey, N.; Rauch, H.; Werner, S.

    1984-01-01

    Fundamental physics experiments of merit can be conducted at the proposed intense neutron sources. Areas of interest include: neutron particle properties, neutron wave properties, and fundamental physics utilizing reactor produced γ-rays. Such experiments require intense, full-time utilization of a beam station for periods ranging from several months to a year or more

  11. Dependence and Fundamentality

    Directory of Open Access Journals (Sweden)

    Justin Zylstra

    2014-12-01

    Full Text Available I argue that dependence is neither necessary nor sufficient for relative fundamentality. I then introduce the notion of 'likeness in nature' and provide an account of relative fundamentality in terms of it and the notion of dependence. Finally, I discuss some puzzles that arise in Aristotle's Categories, to which the theory developed is applied.

  12. Chandrasekhar Limit: An Elementary Approach Based on Classical Physics and Quantum Theory

    Science.gov (United States)

    Pinochet, Jorge; Van Sint Jan, Michael

    2016-01-01

    In a brief article published in 1931, Subrahmanyan Chandrasekhar made public an important astronomical discovery. In his article, the then young Indian astrophysicist introduced what is now known as the "Chandrasekhar limit." This limit establishes the maximum mass of a stellar remnant beyond which the repulsion force between electrons…

  13. Fundamental statistical theories

    International Nuclear Information System (INIS)

    Demopoulos, W.

    1976-01-01

    Einstein argued that since quantum mechanics is not a fundamental theory it cannot be regarded as in any sense final. The pure statistical states of the quantum theory are not dispersion-free. In this sense, the theory is significantly statistical. The problem investigated in this paper is to determine under what conditions a significantly statistical theory is correctly regarded as fundamental. The solution developed in this paper is that a statistical theory is fundamental only if it is complete; moreover the quantum theory is complete. (B.R.H.)

  14. Fundamentals of electronics

    CERN Document Server

    Schubert, Thomas F

    2015-01-01

    This book, Electronic Devices and Circuit Application, is the first of four books of a larger work, Fundamentals of Electronics. It is comprised of four chapters describing the basic operation of each of the four fundamental building blocks of modern electronics: operational amplifiers, semiconductor diodes, bipolar junction transistors, and field effect transistors. Attention is focused on the reader obtaining a clear understanding of each of the devices when it is operated in equilibrium. Ideas fundamental to the study of electronic circuits are also developed in the book at a basic level to

  15. Fundamental Work Cost of Quantum Processes

    Directory of Open Access Journals (Sweden)

    Philippe Faist

    2018-04-01

    Full Text Available Information-theoretic approaches provide a promising avenue for extending the laws of thermodynamics to the nanoscale. Here, we provide a general fundamental lower limit, valid for systems with an arbitrary Hamiltonian and in contact with any thermodynamic bath, on the work cost for the implementation of any logical process. This limit is given by a new information measure—the coherent relative entropy—which accounts for the Gibbs weight of each microstate. The coherent relative entropy enjoys a collection of natural properties justifying its interpretation as a measure of information and can be understood as a generalization of a quantum relative entropy difference. As an application, we show that the standard first and second laws of thermodynamics emerge from our microscopic picture in the macroscopic limit. Finally, our results have an impact on understanding the role of the observer in thermodynamics: Our approach may be applied at any level of knowledge—for instance, at the microscopic, mesoscopic, or macroscopic scales—thus providing a formulation of thermodynamics that is inherently relative to the observer. We obtain a precise criterion for when the laws of thermodynamics can be applied, thus making a step forward in determining the exact extent of the universality of thermodynamics and enabling a systematic treatment of Maxwell-demon-like situations.

  16. Limitations of implementing sustainable construction principles in the conventional South African design approach

    CSIR Research Space (South Africa)

    Sebake, TN

    2008-06-01

    Full Text Available professionals, particularly by architects, in the implementation of sustainability principles in the development of building projects. The aim of the paper is to highlight the limitations of introducing sustainability aspects into the existing South African...

  17. DNA isolation protocols affect the detection limit of PCR approaches of bacteria in samples from the human gastrointestinal tract

    NARCIS (Netherlands)

    Zoetendal, E.G.; Ben-Amor, K.; Akkermans, A.D.L.; Abee, T.; Vos, de W.M.

    2001-01-01

    A major concern in molecular ecological studies is the lysis efficiency of different bacteria in a complex ecosystem. We used a PCR-based 16S rDNA approach to determine the effect of two DNA isolation protocols (i.e. the bead beating and Triton-X100 method) on the detection limit of seven

  18. Execution techniques for high-level radioactive waste disposal. 2. Fundamental concept of geological disposal and implementing approach of disposal project

    International Nuclear Information System (INIS)

    Kawanishi, Motoi; Komada, Hiroya; Tsuchino, Susumu; Shiozaki, Isao; Kitayama, Kazumi; Akasaka, Hidenari; Inagaki, Yusuke; Kawamura, Hideki

    1999-01-01

    Full-scale high-level radioactive waste disposal activities are to begin after the establishment of the implementing organization, which is planned for around 2000. Considering each step of the disposal business, this study discusses in detail, from the technical viewpoint, the implementation procedure for the series of disposal activities carried out by the implementing body, such as selection of the disposal site, construction and operation of the disposal facility, closure and decommissioning of the disposal facility, and post-closure management, and proposes an example of the master schedule. Furthermore, we investigate and propose concepts of geological disposal that become important in carrying the disposal business forward, such as smooth site selection at present, the fundamental idea of ensuring safety in disposal, the basic idea of building trust in the disposal technology, and the geological environmental conditions that form the basic premise of this whole study for the disposal business. (author)

  19. Quantum Uncertainty and Fundamental Interactions

    Directory of Open Access Journals (Sweden)

    Tosto S.

    2013-04-01

    Full Text Available The paper proposes a simplified theoretical approach to infer some essential concepts on the fundamental interactions between charged particles and their relative strengths at comparable energies by exploiting the quantum uncertainty only. The worth of the present approach lies in the way the results are obtained, rather than in the results themselves: concepts today acknowledged as fingerprints of the electroweak and strong interactions appear indeed rooted in the same theoretical frame, which also includes the basic principles of special and general relativity along with the gravity force.

  20. Fundamentals of crystallography

    CERN Document Server

    2011-01-01

    Crystallography is a basic tool for scientists in many diverse disciplines. This text offers a clear description of fundamentals and of modern applications. It supports curricula in crystallography at undergraduate level.

  1. Fundamentals of electrochemical science

    CERN Document Server

    Oldham, Keith

    1993-01-01

    Key Features* Deals comprehensively with the basic science of electrochemistry* Treats electrochemistry as a discipline in its own right and not as a branch of physical or analytical chemistry* Provides a thorough and quantitative description of electrochemical fundamentals

  2. Information security fundamentals

    CERN Document Server

    Peltier, Thomas R

    2013-01-01

    Developing an information security program that adheres to the principle of security as a business enabler must be the first step in an enterprise's effort to build an effective security program. Following in the footsteps of its bestselling predecessor, Information Security Fundamentals, Second Edition provides information security professionals with a clear understanding of the fundamentals of security required to address the range of issues they will experience in the field.The book examines the elements of computer security, employee roles and r

  3. Fundamentals of structural dynamics

    CERN Document Server

    Craig, Roy R

    2006-01-01

    From theory and fundamentals to the latest advances in computational and experimental modal analysis, this is the definitive, updated reference on structural dynamics.This edition updates Professor Craig's classic introduction to structural dynamics, which has been an invaluable resource for practicing engineers and a textbook for undergraduate and graduate courses in vibrations and/or structural dynamics. Along with comprehensive coverage of structural dynamics fundamentals, finite-element-based computational methods, and dynamic testing methods, this Second Edition includes new and e

  4. Religious fundamentalism and conflict

    OpenAIRE

    Muzaffer Ercan Yılmaz

    2006-01-01

    This study provides an analytical discussion for the issue of religious fundamentalism and its relevance to conflict, in its broader sense. It is stressed that religious fundamentalism manifests itself in two ways: nonviolent intolerance and violent intolerance. The sources of both types of intolerance and their connection to conflict are addressed and discussed in detail. Further research is also suggested on conditions connecting religion to nonviolent intolerance so as to cope with the problem...

  5. Chandrasekhar limit: an elementary approach based on classical physics and quantum theory

    Science.gov (United States)

    Pinochet, Jorge; Van Sint Jan, Michael

    2016-05-01

    In a brief article published in 1931, Subrahmanyan Chandrasekhar made public an important astronomical discovery. In his article, the then young Indian astrophysicist introduced what is now known as the Chandrasekhar limit. This limit establishes the maximum mass of a stellar remnant beyond which the repulsion force between electrons due to the exclusion principle can no longer stop the gravitational collapse. In the present article, we create an elemental approximation to the Chandrasekhar limit, accessible to non-graduate science and engineering students. The article focuses especially on clarifying the origins of Chandrasekhar’s discovery and the underlying physical concepts. Throughout the article, only basic algebra is used as well as some general notions of classical physics and quantum theory.
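
    The heart of the estimate fits in a few lines; the following order-of-magnitude version, with simplified prefactors, is a standard textbook sketch and not necessarily the article's exact algebra.

        % For N electrons confined within radius R, relativistic degeneracy gives an
        % energy per electron ~ \hbar c N^{1/3}/R, while gravity contributes
        % ~ -G N m_H^2/R. Both scale as 1/R, so beyond a critical N no radius helps:
        \begin{align*}
          \frac{\hbar c\, N^{1/3}}{R} \sim \frac{G N m_H^2}{R}
          \;\Longrightarrow\;
          N_{\max} \sim \left(\frac{\hbar c}{G m_H^2}\right)^{3/2} \approx 2 \times 10^{57},
          \qquad
          M_{\mathrm{Ch}} \sim N_{\max}\, m_H \approx 2\, M_{\odot}.
        \end{align*}
        % The full polytropic treatment tightens the prefactor to the familiar
        % value of about 1.4 solar masses.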

  6. FY 2000 research and development of fundamental technologies for AC superconducting power devices. R and D of fundamental technologies for superconducting power cables and faults current limiters, R and D of superconducting magnets for power applications, and study on the total systems and related subjects; 2000 nendo koryu chodendo denryoku kiki kiban gijutsu kenkyu kaihatsu seika hokokusho. Chodendo soden cable kiban gijutsu no kenkyu kaihatsu, chodendo genryuki kiban gijutsu no kenkyu kaihatsu, denryokuyo chodendo magnet no kenkyu kaihatsu, total system nado no kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    The project for research and development of fundamental technologies for AC superconducting power devices has been started, and the FY 2000 results are reported. The R and D of fundamental technologies for superconducting power cables includes grasping the mechanical characteristics associated with the integration needed to fabricate large-current-capacity, long cables; development of barrier cable materials by various methods; and development of short insulated tubes as a cooling technology for long superconducting cables, together with grasping their thermal/mechanical characteristics. The R and D of fault current limiters includes introduction of a unit for superconducting film fabrication, determination of structures and layouts for large currents, and improvement of the performance of each device at high voltages. R and D of superconducting magnets for power applications includes grasping the fundamental characteristics of insulation at cryogenic temperature, completion of the insulation designs for high-voltage/current lead bushings, and development of a prototype sub-cooled nitrogen cooling unit for cooling each AC power device. The study on the total systems and related subjects includes analysis for stabilization of the group model systems, to confirm improved voltage stability when the superconducting cable is in service. (NEDO)

  7. Options and Limitations of the Cognitive Psychological Approach to the Treatment of Dyslexia.

    Science.gov (United States)

    Tonnessen, Finn Egil

    1999-01-01

    Analyzes how cognitive psychology defines and treats dyslexia. Shows how behaviorism and connectionism can function as supplements in areas in which cognitive psychology has displayed weaknesses and limitations. Characteristics of cognitive psychology, cognitive treatment, and behavioristic and connectionistic treatment are discussed. (CR)

  8. A Spectral Approach for Quenched Limit Theorems for Random Expanding Dynamical Systems

    Science.gov (United States)

    Dragičević, D.; Froyland, G.; González-Tokman, C.; Vaienti, S.

    2018-01-01

    We prove quenched versions of (i) a large deviations principle (LDP), (ii) a central limit theorem (CLT), and (iii) a local central limit theorem for non-autonomous dynamical systems. A key advance is the extension of the spectral method, commonly used in limit laws for deterministic maps, to the general random setting. We achieve this via multiplicative ergodic theory and the development of a general framework to control the regularity of Lyapunov exponents of twisted transfer operator cocycles with respect to a twist parameter. While some versions of the LDP and CLT have previously been proved with other techniques, the local central limit theorem is, to our knowledge, a completely new result, and one that demonstrates the strength of our method. Applications include non-autonomous (piecewise) expanding maps, defined by random compositions of the form $T_{\sigma^{n-1}\omega} \circ \cdots \circ T_{\sigma\omega} \circ T_{\omega}$. An important aspect of our results is that we only assume ergodicity and invertibility of the random driving $\sigma: \Omega \to \Omega$; in particular no expansivity or mixing properties are required.
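
    Schematically, the quenched CLT asserts the following; hypotheses and centring are abbreviated here, and the precise statement is the paper's, not this sketch's.

        % For P-a.e. realization \omega of the driving and a suitably regular,
        % fibrewise-centred observable \phi,
        \[
          \frac{1}{\sqrt{n}} \sum_{k=0}^{n-1}
            \phi\bigl(T_{\sigma^{k-1}\omega} \circ \cdots \circ T_{\omega}\, x\bigr)
          \;\xrightarrow{\;d\;}\; \mathcal{N}(0, \Sigma^2),
        \]
        % where the variance \Sigma^2 is read off from the behaviour of the leading
        % Lyapunov exponent of the twisted transfer-operator cocycle near twist
        % parameter zero.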

  9. Understanding the Limits of Marxist Approaches to Sociocultural Studies of Science Education

    Science.gov (United States)

    Lima, Paulo, Jr.; Ostermann, Fernanda; Rezende, Flavia

    2014-01-01

    In the first three sections of this paper we comment on some of the ideas developed in the forum papers, pointing out possible misunderstandings and constructing new explanations that clarify arguments we made in the original article. In the last section we expand the discussion raised in the original paper, elaborating on the limits of the use of…

  10. Approach to setting occupational exposure limits for sensory irritants in the Netherlands

    NARCIS (Netherlands)

    Feron, V.J.; Arts, J.H.E.; Mojet, J.

    2001-01-01

    This article describes how scientists in the Netherlands set occupational exposure limits (OELs) for sensory irritants. When they tackle this issue, a number of key questions need to be answered. For example, did the studies indeed measure sensory irritation and not cytotoxicity? When the irritant

  11. Setting limits for acceptable change in sediment particle size composition: testing a new approach to managing marine aggregate dredging.

    Science.gov (United States)

    Cooper, Keith M

    2013-08-15

    A baseline dataset from 2005 was used to identify the spatial distribution of macrofaunal assemblages across the eastern English Channel. The range of sediment composition found in association with each assemblage was used to define limits for acceptable change at ten licensed marine aggregate extraction areas. Sediment data acquired in 2010, 4 years after the onset of dredging, were used to assess whether conditions remained within the acceptable limits. Despite the observed changes in sediment composition, the composition of sediments in and around nine extraction areas remained within pre-defined acceptable limits. At the tenth site, some of the observed changes within the licence area were judged to have gone beyond the acceptable limits. Implications of the changes are discussed, and appropriate management measures identified. The approach taken in this study offers a simple, objective and cost-effective method for assessing the significance of change, and could simplify the existing monitoring regime. Copyright © 2013 Elsevier Ltd. All rights reserved.
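
    The envelope logic can be sketched as follows. Assemblage labels, the gravel-fraction variable, and all numbers are invented, and the percentile choice merely stands in for however the study defined its baseline range of acceptable change.

        # Flag survey samples whose sediment composition falls outside the
        # baseline envelope defined per faunal assemblage (all data invented).
        import numpy as np

        baseline = {  # assemblage -> gravel-fraction samples from the 2005 baseline
            "A": np.array([0.10, 0.15, 0.22, 0.30, 0.35, 0.41]),
            "B": np.array([0.55, 0.60, 0.66, 0.70, 0.78]),
        }

        def limits(samples, lo=2.5, hi=97.5):
            # acceptable-change envelope as central percentiles of the baseline
            return np.percentile(samples, [lo, hi])

        survey_2010 = [("A", 0.33), ("A", 0.55), ("B", 0.72)]
        for assemblage, gravel in survey_2010:
            lo, hi = limits(baseline[assemblage])
            status = "within" if lo <= gravel <= hi else "BEYOND"
            print(f"assemblage {assemblage}: gravel={gravel:.2f} {status} [{lo:.2f}, {hi:.2f}]")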

  12. An open simulation approach to identify chances and limitations for vulnerable road user (VRU) active safety.

    Science.gov (United States)

    Seiniger, Patrick; Bartels, Oliver; Pastor, Claus; Wisch, Marcus

    2013-01-01

    It is commonly agreed that active safety will have a significant impact on reducing accident figures for pedestrians and probably also bicyclists. However, chances and limitations for active safety systems have so far been derived only from accident data and the current state of the art, based on proprietary simulation models. The objective of this article is to investigate these chances and limitations by developing an open simulation model. This article introduces a simulation model incorporating accident kinematics, driving dynamics, driver reaction times, pedestrian dynamics, performance parameters of different autonomous emergency braking (AEB) generations, as well as legal and logical limitations. The level of detail of the available pedestrian accident data is limited. Relevant variables, especially the timing of the pedestrian's appearance and the pedestrian's moving speed, are estimated using assumptions. The model in this article uses the fact that a pedestrian and a vehicle in an accident must have been in the same spot at the same time and defines the impact position as a relevant accident parameter, which is usually available from accident data. The calculations done within the model identify the possible timing available for braking by an AEB system as well as the possible speed reduction for different accident scenarios and system configurations. The simulation model identifies the lateral impact position of the pedestrian as a significant parameter for system performance, and the system layout is designed to brake when the accident becomes unavoidable by the vehicle driver. Scenarios with a pedestrian running from behind an obstruction are the most demanding scenarios and will very likely never be avoidable for all vehicle speeds due to physical limits. Scenarios with an unobstructed walking person will very likely be treatable over a wide speed range for next-generation AEB systems.
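
    The timing argument lends itself to a toy calculation: since the pedestrian and vehicle must share the impact point, the pedestrian's lateral travel time bounds the braking window. The function below is a hypothetical simplification (constant deceleration, fixed system latency), not the article's model.

        # Toy AEB kinematics: braking window = pedestrian lateral travel time
        # minus system latency; all parameter values are assumptions.
        def impact_speed(v0_kmh, ped_offset_m, ped_speed_ms,
                         latency_s=0.3, decel_ms2=9.0):
            """Vehicle speed at the impact point (km/h in, km/h out)."""
            v0 = v0_kmh / 3.6
            t_avail = ped_offset_m / ped_speed_ms - latency_s  # braking window [s]
            if t_avail <= 0:
                return v0_kmh                                  # no time to brake
            return max(v0 - decel_ms2 * t_avail, 0.0) * 3.6

        # pedestrian steps out 2 m from behind an obstruction at 2 m/s
        for v in (30, 50, 70):
            print(v, "->", round(impact_speed(v, ped_offset_m=2.0, ped_speed_ms=2.0), 1), "km/h")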

  13. Advantages and limitations of the use of optogenetic approach in studying fast-scale spike encoding.

    Directory of Open Access Journals (Sweden)

    Aleksey Malyshev

    Full Text Available Understanding single-neuron computations and encoding performed by spike-generation mechanisms of cortical neurons is one of the central challenges for cell electrophysiology and computational neuroscience. An established paradigm to study spike encoding in controlled conditions in vitro uses intracellular injection of a mixture of signals with fluctuating currents that mimic in vivo-like background activity. However this technique has two serious limitations: it uses current injection, while synaptic activation leads to changes of conductance, and current injection is technically most feasible in the soma, while the vast majority of synaptic inputs are located on the dendrites. Recent progress in optogenetics provides an opportunity to circumvent these limitations. Transgenic expression of light-activated ionic channels, such as Channelrhodopsin2 (ChR2, allows induction of controlled conductance changes even in thin distant dendrites. Here we show that photostimulation provides a useful extension of the tools to study neuronal encoding, but it has its own limitations. Optically induced fluctuating currents have a low cutoff (~70 Hz, thus limiting the dynamic range of frequency response of cortical neurons. This leads to severe underestimation of the ability of neurons to phase-lock their firing to high frequency components of the input. This limitation could be worked around by using short (2 ms light stimuli which produce membrane potential responses resembling EPSPs by their fast onset and prolonged decay kinetics. We show that combining application of short light stimuli to different parts of dendritic tree for mimicking distant EPSCs with somatic injection of fluctuating current that mimics fluctuations of membrane potential in vivo, allowed us to study fast encoding of artificial EPSPs photoinduced at different distances from the soma. We conclude that dendritic photostimulation of ChR2 with short light pulses provides a powerful tool to

  14. Determination of the Ultimate Limit States of Shallow Foundations using Gene Expression Programming (GEP) Approach

    DEFF Research Database (Denmark)

    Tahmasebi poor, A; Barari, Amin; Behnia, M

    2015-01-01

    In this study, a gene expression programming (GEP) approach was employed to develop modified expressions for predicting the bearing capacity of shallow foundations founded on granular material. The model was validated against the results of load tests on full-scale and model footings obtained from
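
    For readers who want to experiment, the sketch below sets up a comparable symbolic-regression run with the gplearn library, which implements tree-based genetic programming, a close relative of gene expression programming rather than the authors' GEP tool. Features, target relation, and hyperparameters are invented placeholders.

        # Evolve a closed-form bearing-capacity expression from synthetic data.
        import numpy as np
        from gplearn.genetic import SymbolicRegressor

        rng = np.random.default_rng(0)
        # footing width B [m], depth D [m], friction angle phi [deg] (assumed features)
        X = np.column_stack([rng.uniform(0.5, 3, 300),
                             rng.uniform(0, 2, 300),
                             rng.uniform(30, 42, 300)])
        qu = 0.5 * X[:, 0] * 18 * np.exp(0.15 * X[:, 2]) + 10 * X[:, 1]  # synthetic target

        est = SymbolicRegressor(population_size=2000, generations=20,
                                function_set=("add", "sub", "mul", "div", "sqrt", "log"),
                                parsimony_coefficient=0.001, random_state=0)
        est.fit(X, qu)
        print(est._program)   # the evolved closed-form expression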

  15. Hankin and Reeves' approach to estimating fish abundance in small streams: limitations and alternatives

    Science.gov (United States)

    William L. Thompson

    2003-01-01

    Hankin and Reeves' (1988) approach to estimating fish abundance in small streams has been applied in stream fish studies across North America. However, their population estimator relies on two key assumptions: (1) removal estimates are equal to the true numbers of fish, and (2) removal estimates are highly correlated with snorkel counts within a subset of sampled...
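
    For context on assumption (1), the classic two-pass removal estimator (Seber-Le Cren form) is shown below; this is a generic textbook formula, offered only to illustrate what a "removal estimate" is, not Hankin and Reeves' full estimator.

        # Two-pass removal estimate: N = C1^2 / (C1 - C2), p = (C1 - C2) / C1.
        def two_pass_removal(c1: int, c2: int) -> tuple[float, float]:
            """Abundance estimate and capture probability from two removal passes."""
            if c2 >= c1:
                raise ValueError("estimator undefined unless catches decline (C2 < C1)")
            n_hat = c1 ** 2 / (c1 - c2)
            p_hat = (c1 - c2) / c1
            return n_hat, p_hat

        print(two_pass_removal(45, 15))   # -> (67.5, 0.666...)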

  16. Limitations of the toxic equivalency factor (TEF) approach for risk assessment of halogenated aromatic hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Safe, S. [Texas A and M Univ., College Station, TX (United States). Dept. of Veterinary Physiology and Pharmacology

    1995-12-31

    2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) and related halogenated aromatic hydrocarbons (HAHs) are present as complex mixtures of polychlorinated dibenzo-p-dioxins (PCDDs), dibenzofurans (PCDFs) and biphenyls (PCBs) in most environmental matrices. Risk management of these mixtures utilizes the toxic equivalency factor (TEF) approach, in which the TCDD (dioxin) toxic equivalents (TEQ) of a mixture are the summation of each congener concentration ($C_i$) times its $\mathrm{TEF}_i$ (potency relative to TCDD): $\mathrm{TEQ}_{\mathrm{mixture}} = \sum_i C_i \times \mathrm{TEF}_i$. TEQs are determined only for those HAHs which are aryl hydrocarbon (Ah) receptor agonists, and this approach assumes that the toxic or biochemical effects of individual compounds in a mixture are additive. Several in vivo and in vitro laboratory and field studies with different HAH mixtures have been utilized to validate the TEF approach. For some responses, the calculated toxicities of PCDD/PCDF and PCB mixtures predict the observed toxic potencies. However, for fetal cleft palate and immunotoxicity in mice, nonadditive (antagonistic) responses are observed using complex PCB mixtures or binary mixtures containing an Ah receptor agonist with 2,2',4,4',5,5'-hexachlorobiphenyl (PCB153). The potential interactive effects of PCBs and other dietary Ah receptor antagonists suggest that the TEF approach for risk management of HAHs requires further refinement and should be used selectively.
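
    The TEQ summation is easy to state in code; the TEF values below are illustrative placeholders, not the WHO-assigned factors.

        # Direct implementation of TEQ_mixture = sum_i C_i * TEF_i.
        def teq(concentrations: dict, tefs: dict) -> float:
            """Toxic equivalents of a mixture (same mass units as the concentrations)."""
            return sum(c * tefs[congener] for congener, c in concentrations.items())

        tefs = {"TCDD": 1.0, "PeCDF": 0.3, "PCB126": 0.1}     # placeholder TEFs
        sample = {"TCDD": 0.2, "PeCDF": 1.5, "PCB126": 4.0}   # pg/g, invented

        print(teq(sample, tefs))   # 0.2*1.0 + 1.5*0.3 + 4.0*0.1 = 1.05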

  17. Limited-Form Wegener Granulomatosis Case: Anaesthetic Approach and Literature Review.

    Science.gov (United States)

    Sarıtaş, Tuba Berra; Şahin, Osman; Borazan, Hale; Otelcioğlu, Şeref

    2014-12-01

    Wegener granulomatosis (WG) is a kind of vasculitis that affects small and medium-sized arteries. Necrotizing granulomatous vasculitis of the upper and lower respiratory tracts and necrotizing glomerulonephritis of the kidneys are present. WG affects mainly Caucasian individuals between 15-75 years old, with a mean age of onset of 41 years. It affects both males and females equally. Kidney involvement is not present in the limited form of WG. Peripheral nerve blocks are good alternatives when general anaesthesia is risky. Popliteal block is blockade of the sciatic nerve at the popliteal region. Popliteal block is a kind of peripheral block for surgeries below the knee level. In this article, we report on the anaesthesia management of a 61-year-old limited-form WG patient for whom general anaesthesia was risky because of lung involvement.

  18. Failure prediction for titanium alloys using a superplastic forming limit diagram approach

    Energy Technology Data Exchange (ETDEWEB)

    Leen, S.B.; Kroehn, M.A.; Hyde, T.H. [School Mechanical, Materials and Manufacturing Engineering, University of Nottingham (United Kingdom)

    2008-04-15

    Superplastic forming limit diagrams (SPFLDs) are presented for both Ti-6Al-4V and Ti-6Al-2Sn-4Zr-2Mo alloys. FE-predicted $\varepsilon_1$-$\varepsilon_3$-$\varepsilon_{eq}$ paths for key points on the forming blank are then plotted on the SPFLD to predict failure. A key factor for reliable SPF forming limit prediction is the incorporation of a mechanisms-based constitutive model, which includes microstructural effects, such as static and dynamic grain growth and associated hardening, and material constants independent of the forming strain-rate. The sinh model of Dunne and co-workers is thus employed. Results from forming trials for both materials are used to assess the failure predictions. (Abstract Copyright [2008], Wiley Periodicals, Inc.)

  19. A QMU approach for characterizing the operability limits of air-breathing hypersonic vehicles

    International Nuclear Information System (INIS)

    Iaccarino, Gianluca; Pecnik, Rene; Glimm, James; Sharp, David

    2011-01-01

    The operability limits of a supersonic combustion engine for an air-breathing hypersonic vehicle are characterized using numerical simulations and an uncertainty quantification methodology. The time-dependent compressible flow equations with heat release are solved in a simplified configuration. Verification, calibration and validation are carried out to assess the ability of the model to reproduce the flow/thermal interactions that occur when the engine unstarts due to thermal choking. Quantification of margins and uncertainty (QMU) is used to determine the safe operation region for a range of fuel flow rates and combustor geometries. - Highlights: → In this work we introduce a method to study the operability limits of hypersonic scramjet engines. → The method is based on a calibrated heat release model. → It accounts explicitly for uncertainties due to flight conditions and model correlations. → We examine changes due to the combustor geometry and fuel injection.

  20. A conjugate directions approach to improve the limited-memory BFGS method

    Czech Academy of Sciences Publication Activity Database

    Vlček, Jan; Lukšan, Ladislav

    2012-01-01

    Roč. 219, č. 3 (2012), s. 800-809 ISSN 0096-3003 R&D Projects: GA ČR GA201/09/1957 Institutional research plan: CEZ:AV0Z10300504 Keywords: unconstrained minimization * variable metric methods * limited-memory methods * the BFGS update * conjugate directions * numerical results Subject RIV: BA - General Mathematics Impact factor: 1.349, year: 2012

  1. Implementation of upper limit calculation for a poisson variable by bayesian approach

    International Nuclear Information System (INIS)

    Zhu Yongsheng

    2008-01-01

    The calculation of Bayesian confidence upper limit for a Poisson variable including both signal and background with and without systematic uncertainties has been formulated. A Fortran 77 routine, BPULE, has been developed to implement the calculation. The routine can account for systematic uncertainties in the background expectation and signal efficiency. The systematic uncertainties may be separately parameterized by a Gaussian, Log-Gaussian or flat probability density function (pdf). Some technical details of BPULE have been discussed. (authors)
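
    BPULE itself is Fortran 77, but the core calculation for the simplest case (flat prior on the signal, no systematic uncertainties) can be reproduced numerically as below; this is a hedged sketch of the underlying formula rather than a port of the routine.

        # Bayesian upper limit s_up for a Poisson count n with known background b:
        #   integral_0^{s_up} P(n | s + b) ds = CL * integral_0^inf P(n | s + b) ds,
        # assuming a flat prior on the signal s >= 0.
        import math
        from scipy.integrate import quad
        from scipy.optimize import brentq

        def upper_limit(n: int, b: float, cl: float = 0.90) -> float:
            like = lambda s: math.exp(-(s + b)) * (s + b) ** n / math.factorial(n)
            s_max = 50 + 10 * n                       # effectively infinity here
            norm, _ = quad(like, 0, s_max)
            cdf = lambda s_up: quad(like, 0, s_up)[0] / norm - cl
            return brentq(cdf, 0.0, s_max)

        print(round(upper_limit(n=3, b=1.0), 2))   # 90% CL limit for 3 observed, b = 1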

  2. How Limited Systematicity Emerges: A Computational Cognitive Neuroscience Approach (Author’s Manuscript)

    Science.gov (United States)

    2014-09-01

    plausibility. It should be pointed out though that systematicity in ACT-R is limited both by the need to acquire the skills and knowledge needed to... inferential chain. All tokens of a symbol in logic must have identical meaning throughout the proof or else it is not a valid proof. Despite their natural...First, neurons do not communicate with symbols, despite the inevitable urge to think of them in this way (O’Reilly, 2010). Spikes are completely

  3. Approaches to the calculation of limitations on nuclear detonations for peaceful purposes

    International Nuclear Information System (INIS)

    Whipple, G.H.

    1969-01-01

    The long-term equilibrium levels of tritium, krypton-85 and carbon-14 which are acceptable in the environment have been estimated on the following premises: 1) the three isotopes reach the environment and equilibrate throughout it in periods shorter than their half lives, 2) nuclear detonations and nuclear power constitute the dominant sources of these isotopes, 3) the doses from these three isotopes add to one another and to the doses from other radioactive isotopes released to the environment, and 4) the United States, by virtue of its population, is entitled to 6% of the world's capacity to accept radioactive wastes. These premises lead to the conclusion that U.S. nuclear detonations are limited by carbon-14 to 60 megatons per year. The corresponding limit for U.S. nuclear power appears to be set by krypton-85 at 100,000 electrical megawatts, although data for carbon-14 production by nuclear power are not available. It is noted that if the equilibration assumed in these estimates does not occur, the limits will in general be lower than those given above. (author)

  4. A labview approach to instrumentation for the TFTR bumper limiter alignment project

    International Nuclear Information System (INIS)

    Skelly, G.N.; Owens, D.K.

    1992-01-01

    This paper reports on a project recently undertaken to measure the alignment of the TFTR bumper limiter in relation to the toroidal magnetic field axis. The process involved the measurement of the toroidal magnetic field, and the positions of the tiles that make up the bumper limiter. The basis for the instrument control and data acquisition system was National Instruments' LabVIEW 2. LabVIEW is a graphical programming system for developing scientific and engineering applications on a Macintosh. For this project, a Macintosh IIci controlled the IEEE-488 GPIB programmable instruments via an interface box connected to the SCSI port of the computer. With LabVIEW, users create graphical software modules called virtual instruments instead of writing conventional text-based code. To measure the magnetic field, the control system acquired data from two nuclear magnetic resonance magnetometers while the toroidal field coils were pulsed. To measure the position of the tiles on the limiter, an instrumented mechanical arm was used inside the vessel

  5. An approach to the determination of physical-chemical limits of energy consumption for the transition to a stationary state

    International Nuclear Information System (INIS)

    Zimen, K.E.

    1975-02-01

    The paper gives a model of energy consumption and a programme for its application. Previous models are mainly criticized on the grounds that new technological developments as well as adjustments due to learning processes of homo sapiens are generally not sufficiently accounted for in these models. The approach of this new model is therefore an attempt at the determination of the physical-chemical limiting values for the capacity of the global HST (homo sapiens - Tellus) system or of individual regions with respect to certain critical factors. These limiting values determined by the physical-chemical system of the earth are independent of human ingenuity and flexibility. (orig./AK) [de

  6. The accountability for reasonableness approach to guide priority setting in health systems within limited resources

    DEFF Research Database (Denmark)

    Byskov, Jens; Marchal, Bruno; Maluka, Stephen

    2014-01-01

    BACKGROUND: Priority-setting decisions are based on an important, but not sufficient set of values and thus lead to disagreement on priorities. Accountability for Reasonableness (AFR) is an ethics-based approach to a legitimate and fair priority-setting process that builds upon four conditions......-aligned with general values expressed by both service providers and community representatives. There was some variation in the interpretations and actual use of the AFR in the decision-making processes in the three districts, and its effect ranged from an increase in awareness of the importance of fairness...... researchers was formed to implement, and continually assess and improve the application of the four conditions. Researchers evaluated the intervention using qualitative and quantitative data collection and analysis methods. RESULTS: The values underlying the AFR approach were in all three districts well...

  7. Comparative animal models for the study of lymphohematopoietic tumors: strengths and limitations of present approaches.

    Science.gov (United States)

    O'Connor, Owen A; Toner, Lorraine E; Vrhovac, Radovan; Budak-Alpdogan, Tulin; Smith, Emily A; Bergman, Philip

    2005-07-01

    The lymphomas probably represent the most complex and heterogeneous set of malignancies known to cancer medicine. Underneath the single term lymphoma exist some of the fastest growing cancers known to science (i.e., Burkitt's and lymphoblastic lymphoma), as well as some of the slowest growing (i.e., small lymphocytic lymphoma [SLL] and follicular lymphoma). It is this very biology that can dictate the selection of drugs and treatment approaches for managing these patients, strategies that can range from very aggressive combination chemotherapy administered in an intensive care unit (for example, patients with Burkitt's lymphoma), to watch-and-wait approaches that may go on for years in patients with SLL. This impressive spectrum of biology emerges from a relatively restricted number of molecular defects. The importance of these different molecular defects is of course greatly influenced by the intrinsic biology that defines the lymphocyte at its different stages of differentiation and maturation. It is precisely this molecular understanding that is beginning to form the basis for a new approach to thinking about lymphoma, and novel approaches to its management. Unfortunately, while our understanding of human lymphoma has blossomed, our ability to generate appropriate animal models reflective of this biology has not. Most preclinical models of these diseases still rely upon subcutaneous xenograft models of only the most aggressive lymphomas like Burkitt's lymphoma. While these models clearly serve an important role in understanding biology, and perhaps more importantly, in identifying promising new drugs for these diseases, they fall short in truly representing the broader, more heterogeneous biology found in patients. Clearly, depending upon the questions being posed, or the types of drugs being studied, the best model to employ may vary from situation to situation. In this article, we will review the numerous complexities associated with various animal models of

  8. A discussion of the limitations of the psychometric and cultural theory approaches to risk perception

    International Nuclear Information System (INIS)

    Sjoeberg, L.

    1996-01-01

    Risk perception has traditionally been conceived as a cognitive phenomenon, basically a question of information processing. The very term perception suggests that information processing is involved and of crucial importance. Kahneman and Tversky suggested that the use of 'heuristics' in the intuitive estimation of probabilities accounts for biased probability perception, hence claiming to explain risk perception as well. The psychometric approach of Slovic et al., a further step in the cognitive tradition, conceives of perceived risk as a function of general properties of a hazard. However, the psychometric approach is shown here to explain only about 20% of the variance of perceived risk, even less of risk acceptability. Its claim to explanatory power is based on a statistical illusion: mean values were investigated and accounted for, across hazards. A currently popular alternative to the psychometric tradition, Cultural Theory, is even less successful and explains only about 5% of the variance of perceived risk. The claims of this approach were also based on a statistical illusion: 'significant' results were reported and interpreted as being of substantial importance. The present paper presents a new approach: attitude to the risk generating technology, general sensitivity to risks and specific risk explained well over 60% of the variance of perceived risk of nuclear waste, in a study of extensive data from a representative sample of the Swedish population. The attitude component functioning as an explanatory factor of perceived risk, rather than as a consequence of perceived risk, suggests strongly that perceived risk is something other than cognition. Implications for risk communication are discussed. (author)

  9. Fundamentals of turbomachines

    CERN Document Server

    Dick, Erik

    2015-01-01

    This book explores the working principles of all kinds of turbomachines. The same theoretical framework is used to analyse the different machine types. Fundamentals are first presented and theoretical concepts are then elaborated for particular machine types, starting with the simplest ones.For each machine type, the author strikes a balance between building basic understanding and exploring knowledge of practical aspects. Readers are invited through challenging exercises to consider how the theory applies to particular cases and how it can be generalised.   The book is primarily meant as a course book. It teaches fundamentals and explores applications. It will appeal to senior undergraduate and graduate students in mechanical engineering and to professional engineers seeking to understand the operation of turbomachines. Readers will gain a fundamental understanding of turbomachines. They will also be able to make a reasoned choice of turbomachine for a particular application and to understand its operation...

  10. Formulation approaches to pediatric oral drug delivery: benefits and limitations of current platforms.

    Science.gov (United States)

    Lopez, Felipe L; Ernest, Terry B; Tuleu, Catherine; Gul, Mine Orlu

    2015-01-01

    Most conventional drug delivery systems are not acceptable for pediatric patients as they differ in their developmental status and dosing requirements from other subsets of the population. Technology platforms are required to aid the development of age-appropriate medicines to maximize patient acceptability while maintaining safety, efficacy, accessibility and affordability. The current approaches and novel developments in the field of age-appropriate drug delivery for pediatric patients are critically discussed including patient-centric formulations, administration devices and packaging systems. Despite the incentives provided by recent regulatory modifications and the efforts of formulation scientists, there is still a need for implementation of pharmaceutical technologies that enable the manufacture of licensed age-appropriate formulations. Harmonization of endeavors from regulators, industry and academia by sharing learning associated with data obtained from pediatric investigation plans, product development pathways and scientific projects would be the way forward to speed up bench-to-market age appropriate formulation development. A collaborative approach will benefit not only pediatrics, but other patient populations such as geriatrics would also benefit from an accelerated patient-centric approach to drug delivery.

  11. Homeschooling and religious fundamentalism

    Directory of Open Access Journals (Sweden)

    Robert Kunzman

    2010-10-01

    Full Text Available This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to contemporary culture; suspicion of institutional authority and professional expertise; parental control and centrality of the family; and interweaving of faith and academics. It is important to recognize, however, that fundamentalism exists on a continuum; conservative religious homeschoolers resist liberal democratic values to varying degrees, and efforts to foster dialogue and accommodation with religious homeschoolers can ultimately help strengthen the broader civic fabric.

  12. Fundamentals of nonlinear optics

    CERN Document Server

    Powers, Peter E

    2011-01-01

    Peter Powers's rigorous but simple description of a difficult field keeps the reader's attention throughout. … All chapters contain a list of references and large numbers of practice examples to be worked through. … By carefully working through the proposed problems, students will develop a sound understanding of the fundamental principles and applications. … the book serves perfectly for an introductory-level course for second- and third-order nonlinear optical phenomena. The author's writing style is refreshing and original. I expect that Fundamentals of Nonlinear Optics will fast become pop

  13. Fundamentals of piping design

    CERN Document Server

    Smith, Peter

    2013-01-01

    Written for the piping engineer and designer in the field, this two-part series helps to fill a void in piping literature, since the Rip Weaver books of the '90s were taken out of print at the advent of the Computer Aided Design (CAD) era. Technology may have changed, however the fundamentals of piping rules still apply in the digital representation of process piping systems. The Fundamentals of Piping Design is an introduction to the design of piping systems, various processes and the layout of pipe work connecting the major items of equipment for the new hire, the engineering student and the vetera

  14. Pragmatic electrical engineering fundamentals

    CERN Document Server

    Eccles, William

    2011-01-01

    Pragmatic Electrical Engineering: Fundamentals introduces the fundamentals of the energy-delivery part of electrical systems. It begins with a study of basic electrical circuits and then focuses on electrical power. Three-phase power systems, transformers, induction motors, and magnetics are the major topics. All of the material in the text is illustrated with completely worked examples to guide the student to a better understanding of the topics. This short lecture book will be of use at any level of engineering, not just electrical. Its goal is to provide the practicing engineer with a practical…

  15. Fundamentals of continuum mechanics

    CERN Document Server

    Rudnicki, John W

    2014-01-01

    A concise introductory course text on continuum mechanics. Fundamentals of Continuum Mechanics focuses on the fundamentals of the subject and provides the background for formulation of numerical methods for large deformations and a wide range of material behaviours. It aims to provide the foundations for further study, not just of these subjects, but also of the formulations for much more complex material behaviour and their computational implementation. This book is divided into 5 parts, covering mathematical preliminaries, stress, motion and deformation, balance of mass, momentum and energy…

  16. Fundamentals of magnetism

    CERN Document Server

    Reis, Mario

    2013-01-01

    The Fundamentals of Magnetism is a truly unique reference text that explores the study of magnetism and magnetic behavior with a depth that no other book can provide. It offers the most detailed descriptions of the fundamentals of magnetism, with an emphasis on statistical mechanics, which is absolutely critical for understanding magnetic behavior. The book covers the classical areas of basic magnetism, including Landau theory and magnetic interactions, but features a more concise and easy-to-read style. Perfect for upper-level graduate students and industry researchers, The Fundamentals of Magnetism…

  17. Fundamentals of reactor chemistry

    International Nuclear Information System (INIS)

    Akatsu, Eiko

    1981-12-01

    In the Nuclear Engineering School of JAERI, many courses are presented for the people working in and around the nuclear reactors. The curricula of the courses contain also the subject material of chemistry. With reference to the foreign curricula, a plan of educational subject material of chemistry in the Nuclear Engineering School of JAERI was considered, and the fundamental part of reactor chemistry was reviewed in this report. Since the students of the Nuclear Engineering School are not chemists, the knowledge necessary in and around the nuclear reactors was emphasized in order to familiarize the students with the reactor chemistry. The teaching experience of the fundamentals of reactor chemistry is also given. (author)

  18. Fundamentals of fluid lubrication

    Science.gov (United States)

    Hamrock, Bernard J.

    1991-01-01

    The aim is to coordinate the topics of design, engineering dynamics, and fluid dynamics in order to aid researchers in the area of fluid film lubrication. The lubrication principles that are covered can serve as a basis for the engineering design of machine elements. The fundamentals of fluid film lubrication are presented clearly so that students that use the book will have confidence in their ability to apply these principles to a wide range of lubrication situations. Some guidance on applying these fundamentals to the solution of engineering problems is also provided.

  19. Formulando uma Psicopatologia Fundamental

    OpenAIRE

    Pereira, Mario Eduardo Costa

    1998-01-01

    This paper seeks to situate Fundamental Psychopathology in relation to the current context of psychopathology and to delimit its scientific scope with respect to what it brings that is original to the psychopathological discussion. First, the field of psychopathology is examined in relation to the formalization proposed by Karl Jaspers in terms of a general psychopathology. Next, the specific bearing of psychoanalysis on this debate is discussed. It is proposed that the task of fundamental psychopathology has three main fronts…

  20. Fundamentals and Optimal Institutions

    DEFF Research Database (Denmark)

    Gonzalez-Eiras, Martin; Harmon, Nikolaj Arpe; Rossi, Martín

    2016-01-01

    To shed light on the relation between fundamentals and adopted institutions, we examine institutional choice across the "Big Four" US sports leagues. Despite having very similar business models and facing the same economic and legal environment, these leagues exhibit large differences in their use of regulatory institutions such as revenue sharing, salary caps or luxury taxes. We show, theoretically and empirically, that these large differences in adopted institutions can be rationalized as optimal responses to differences in the fundamental characteristics of the sports being played. This provides…

  1. Infosec management fundamentals

    CERN Document Server

    Dalziel, Henry

    2015-01-01

    Infosec Management Fundamentals is a concise overview of information security management concepts and techniques, providing a foundational template for both experienced professionals and those new to the industry. This brief volume will also appeal to business executives and managers outside of infosec who want to understand the fundamental concepts of information security and how it impacts their business decisions and daily activities. Teaches ISO/IEC 27000 best practices on information security management. Discusses risks and controls within the context of an overall information security…

  2. Approaches for the development of occupational exposure limits for man-made mineral fibres (MMMFs)

    International Nuclear Information System (INIS)

    Ziegler-Skylakakis, Kyriakoula

    2004-01-01

    Occupational exposure limits (OELs) are an essential tool in the control of exposure to hazardous chemical agents, and serve to minimise the occurrence of occupational diseases associated with such exposure. The setting of OELs, together with other associated measures, forms an essential part of the European Community's strategy on health and safety at work, upon which the legislative framework for the protection of workers from risks related to chemical agents is based. The European Commission is assisted by the Scientific Committee on Occupational Exposure Limits (SCOEL) in its work of setting OELs for hazardous chemical agents. The procedure for setting OELs requires information on the toxic mechanisms of an agent that should make it possible to differentiate between threshold and non-threshold mechanisms. In the first case, a no-observed adverse effect level (NOAEL) can be defined, which can serve as the basis for the derivation of an OEL. In the latter case, any exposure is correlated with a certain risk. If adequate scientific data are available, SCOEL estimates the risk associated with a series of exposure levels. This can then be used for guidance when setting OELs at European level. Man-made mineral fibres (MMMFs) are widely used at different worksites. MMMF products can release airborne respirable fibres during their production, use and removal. According to the classification of the EU system, all MMMF fibres are considered to be irritants and are classified for carcinogenicity. EU legislation foresees the use of limit values as one of the provisions for the protection of workers from the risks related to exposure to carcinogens. In the following paper, the research requirements identified by SCOEL for the development of OELs for MMMFs will be presented.
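
    For readers who want the threshold logic above in concrete terms, the sketch below shows a generic NOAEL-based limit derivation. It is only an illustration: the assessment factors, their values and the example NOAEL are hypothetical, and SCOEL's actual weighing of evidence is considerably more involved than a single division.

```python
# Illustrative threshold-based OEL derivation from a NOAEL.
# The assessment factors and the example NOAEL are hypothetical;
# they are not SCOEL's actual methodology or values.

def derive_oel(noael_mg_m3: float, interspecies: float = 1.0,
               intraspecies: float = 5.0, data_quality: float = 2.0) -> float:
    """Divide the NOAEL by the product of assessment factors."""
    overall_factor = interspecies * intraspecies * data_quality
    return noael_mg_m3 / overall_factor

# Example: a hypothetical NOAEL of 10 mg/m3 from a worker inhalation study.
print(f"OEL = {derive_oel(10.0):.2f} mg/m3")  # -> OEL = 1.00 mg/m3
```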

  3. Phase behaviour of symmetric binary mixtures with partially miscible components in slit-like pores. Application of the fundamental measure density functional approach

    CERN Document Server

    Martínez, A; Patrykiejew, A; Sokolowski, S

    2003-01-01

    We investigate adsorption in slit-like pores of model symmetric binary mixtures exhibiting demixing in bulk phase, by using a density functional approach. Our focus is on the evaluation of the first-order phase transitions in adsorbed fluids and the lines separating mixed and demixed phases. The scenario for phase transitions is sensitive to the pore width and to the energy of adsorption. Both these parameters can change the phase diagrams of the confined fluid. In particular, for relatively wide pores and for strong wall-fluid interactions, the demixing line can precede the first-order transition. Moreover, a competition between layering transitions and demixing within particular layers also leads to further enrichment of the phase diagram.

  4. Approaching Quantum-Limited Amplification with Large Gain Catalyzed by Optical Parametric Amplifier Medium

    Science.gov (United States)

    Zheng, Qiang; Li, Kai

    2017-07-01

    Amplifiers are at the heart of experiments that carry out precise measurements of weak signals. An ideal quantum amplifier should have a large gain and minimum added noise simultaneously. Here, we consider the quantum measurement properties of a cavity with an OPA medium, operated in the op-amp mode to amplify an input signal. We show that our nonlinear-cavity quantum amplifier has a large gain in the single-value stable regime and achieves the quantum limit unconditionally. Supported by the National Natural Science Foundation of China under Grant Nos. 11365006, 11364006, and the Natural Science Foundation of Guizhou Province QKHLHZ [2015]7767

  5. Flow-through SIP - A novel stable isotope probing approach limiting cross-feeding

    Science.gov (United States)

    Mooshammer, Maria; Kitzinger, Katharina; Schintlmeister, Arno; Kjedal, Henrik; Nielsen, Jeppe Lund; Nielsen, Per; Wagner, Michael

    2017-04-01

    Stable isotope probing (SIP) is a widely applied tool to link specific microbial populations to metabolic processes in the environment without the prerequisite of cultivation, which has greatly advanced our understanding of the role of microorganisms in biogeochemical cycling. SIP relies on tracing specific isotopically labeled substrates (e.g., 13C, 15N, 18O) into cellular biomarkers, such as DNA, RNA or phospholipid fatty acids, and is considered to be a robust technique to identify microbial populations that assimilate the labeled substrate. However, cross-feeding can occur when labeled metabolites are released from a primary consumer and then used by other microorganisms. This leads to erroneous identification of organisms that are not directly responsible for the process of interest, but are rather connected to primary consumers via a microbial food web. Here, we introduce a new approach that has the potential to eliminate the effect of cross-feeding in SIP studies and can thus also be used to distinguish primary consumers from other members of microbial food webs. In this approach, a monolayer of microbial cells is placed on a filter membrane, and labeled substrates are supplied by a continuous flow. By means of flow-through, labeled metabolites and degradation products are constantly removed, preventing secondary consumption of the substrate. We present results from a proof-of-concept experiment using nitrifiers from activated sludge as model system, in which we used fluorescence in situ hybridization (FISH) with rRNA-targeted oligonucleotide probes for identification of nitrifiers in combination with nanoscale secondary ion mass spectrometry (NanoSIMS) for visualization of isotope incorporation at the single-cell level. Our results show that flow-through SIP is a promising approach to significantly reduce cross-feeding and secondary substrate consumption in SIP experiments.

  6. PRINZMETAL ANGINA PECTORIS IN CLINICAL PRACTICE: POSSIBILITY OF CHRONO-THERAPEUTIC APPROACH AND LIMITATIONS OF COMORBIDITY

    Directory of Open Access Journals (Sweden)

    M. V. Baeva

    2013-12-01

    Full Text Available Prinzmetal angina (synonyms: vasospastic angina, variant angina) is, by definition, caused by coronary artery spasm which occurs during sleep at night, between midnight and early morning, and is manifested by ST-segment elevation on the ECG. The frequent confinement of attacks to a certain phase of the sleep period offers the opportunity to use a chronomedical approach in the treatment of patients suffering from it, as demonstrated by our observation. On the other hand, comorbidity is characteristic of adulthood, and Prinzmetal angina is no exception, which we wanted to show by studying this clinical case.

  7. Challenges and Limitations of Applying an Emotion-driven Design Approach on Elderly Users

    DEFF Research Database (Denmark)

    Andersen, Casper L.; Gudmundsson, Hjalte P.; Achiche, Sofiane

    2011-01-01

    Population ageing is without parallel in human history and the twenty-first century will witness even more rapid ageing than did the century just past. Understanding the user needs of the elderly and how to design better products for this segment of the population is crucial, as it can offer a competitive advantage for companies. In this paper, the challenges of applying an emotion-driven design approach to elderly people, in order to identify their user needs towards walking frames, are discussed. The discussion is based on the experiences and results obtained from a case study. To measure the emotional responses of the elderly, a questionnaire was designed and adapted from P.M.A. Desmet’s product-emotion measurement instrument, PrEmo. During the case study it was observed that there were several challenges when carrying out the user survey, and that those challenges were particularly related to the participants’ age and cognitive abilities. The challenges encountered are discussed, and guidelines on what should be taken into account to facilitate an emotion-driven design approach for elderly people are proposed.

  8. DOE fundamentals handbook: Material science

    International Nuclear Information System (INIS)

    1993-01-01

    This handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of the structure and properties of metals. This volume contains the following modules: thermal shock (thermal stress, pressurized thermal shock), brittle fracture (mechanism, minimum pressurization-temperature curves, heatup/cooldown rate limits), and plant materials (properties considered when selecting materials, fuel materials, cladding and reflectors, control materials, nuclear reactor core problems, plant material problems, atomic displacement due to irradiation, thermal and displacement spikes due to irradiation, neutron capture effect, radiation effects in organic compounds, reactor use of aluminum)

  9. Safety analysis fundamentals

    International Nuclear Information System (INIS)

    Wright, A.C.D.

    2002-01-01

    This paper discusses the fundamentals of safety analysis in reactor design. Safety analysis is done to show that the consequences of postulated accidents are acceptable. It is also used to set the design of special safety systems, and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, maintaining an operating license, and supporting changes in plant operations.

  10. Fundamentals of Diesel Engines.

    Science.gov (United States)

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…

  11. Introduction and fundamentals

    International Nuclear Information System (INIS)

    Thomas, R.H.

    1980-01-01

    This introduction discusses advances in the fundamental sciences which underlie the applied science of health physics and radiation protection. Risk assessments in nuclear medicine are made by defining the conditions of exposure, identifying adverse effects, relating exposure to effect, and estimating the overall risk for ionizing radiations.

  12. Fundamentals of plasma physics

    CERN Document Server

    Bittencourt, J A

    1986-01-01

    A general introduction designed to present a comprehensive, logical and unified treatment of the fundamentals of plasma physics based on statistical kinetic theory. Its clarity and completeness make it suitable for self-learning and self-paced courses. Problems are included.

  13. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  14. Fundamentals of astrodynamics

    NARCIS (Netherlands)

    Wakker, K.F.

    2015-01-01

    This book deals with the motion of the center of mass of a spacecraft; this discipline is generally called astrodynamics. The book focuses on an analytical treatment of the motion of spacecraft and provides insight into the fundamentals of spacecraft orbit dynamics. A large number of topics are covered…

  15. Fundamental partial compositeness

    DEFF Research Database (Denmark)

    Sannino, Francesco; Strumia, Alessandro; Tesi, Andrea

    2016-01-01

    We construct renormalizable Standard Model extensions, valid up to the Planck scale, that give a composite Higgs from a new fundamental strong force acting on fermions and scalars. Yukawa interactions of these particles with Standard Model fermions realize the partial compositeness scenario. Unde...

  16. Fundamental research data base

    Science.gov (United States)

    1983-01-01

    A fundamental research data base containing ground truth, image, and Badhwar profile feature data for 17 North Dakota, South Dakota, and Minnesota agricultural sites is described. Image data was provided for a minimum of four acquisition dates for each site and all four images were registered to one another.

  17. Fast fundamental frequency estimation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm; Jensen, Jesper Rindom

    2017-01-01

    Modelling signals as being periodic is common in many applications. Such periodic signals can be represented by a weighted sum of sinusoids with frequencies being an integer multiple of the fundamental frequency. Due to its widespread use, numerous methods have been proposed to estimate the fundamental frequency…
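
    As an illustration of the harmonic model described above, the sketch below estimates the fundamental frequency by brute-force harmonic summation over a candidate grid. It is a minimal stand-in, not the fast estimator proposed in this record, and the signal parameters are made up.

```python
import numpy as np

def harmonic_summation_f0(x, fs, f0_grid, n_harmonics=5):
    """Estimate f0 by maximizing summed periodogram power at harmonics.

    A brute-force stand-in for the fast estimators discussed in the
    literature; cost is O(len(f0_grid) * n_harmonics) grid evaluations.
    """
    n = len(x)
    spectrum = np.abs(np.fft.rfft(x, n)) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    scores = []
    for f0 in f0_grid:
        # Sum periodogram power at the first n_harmonics multiples of f0.
        bins = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, n_harmonics + 1)]
        scores.append(spectrum[bins].sum())
    return f0_grid[int(np.argmax(scores))]

# Synthetic periodic signal: f0 = 110 Hz with three harmonics plus noise.
fs = 8000
t = np.arange(0, 0.5, 1.0 / fs)
x = sum(a * np.sin(2 * np.pi * k * 110 * t) for k, a in [(1, 1.0), (2, 0.5), (3, 0.25)])
x += 0.1 * np.random.randn(len(t))
print(harmonic_summation_f0(x, fs, np.arange(60.0, 400.0, 1.0)))  # ~110.0
```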

  18. Fundamental Metallurgy of Solidification

    DEFF Research Database (Denmark)

    Tiedje, Niels

    2004-01-01

    The text takes the reader through some fundamental aspects of solidification, with focus on understanding the basic physics that govern solidification in casting and welding. It is described how the first solid is formed and which factors affect nucleation. It is described how crystals grow from ...

  19. Evaluation of potential approaches to arms control limitations of intermediate-range nuclear weapon systems. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Paolucci, D.A.; Trapold, A.C.; Lyding, J.F.

    1983-05-02

    This report presents the results of an evaluation of potential approaches to arms control limitations of intermediate-range nuclear forces (INF). This analysis focused on U.S./NATO and Soviet land-based nuclear weapon systems deployed or scheduled for deployment in the European area. The weapon systems were the longer range missiles (1,000 to 5,000 km), shorter range missiles (500 to 925 km), and nuclear capable aircraft. An analysis was conducted to evaluate the relative effectiveness of the U.S./NATO and Soviet weapon systems and forces. In conjunction with this analysis nuclear threat target lists were developed in order to evaluate the relative capabilities and limitations of these forces to attack high priority nuclear threat targets. A series of potential alternative approaches to limitations and reductions of these nuclear weapon systems and forces were developed and evaluated. Particular attention was paid to U.S. and Soviet draft proposals and negotiating positions which have been developed during the ongoing INF negotiations. In addition to the analysis of these weapon systems to attack the nuclear threat target lists, a number of measures were selected to provide a basis for a comparison and evaluation of the alternative limitations and reductions of missile systems. These measures were number of launchers and warheads, missile range, equivalent megatons (EMT), counter military potential (CMP), hard target kill capability, and capability against airfields.

  20. Development of system based code for integrity of FBR. Fundamental probabilistic approach, Part 1: Model calculation of creep-fatigue damage (Research report)

    International Nuclear Information System (INIS)

    Kawasaki, Nobuchika; Asayama, Tai

    2001-09-01

    Both reliability and safety have to be further improved for the successful commercialization of FBRs. At the same time, construction and operation costs need to be reduced to the same level as those of future LWRs. To realize compatibility among reliability, safety, and cost, the Structural Mechanics Research Group in JNC started the development of a System Based Code for Integrity of FBR. This code extends the present structural design standard to include the areas of fabrication, installation, plant system design, safety design, operation and maintenance, and so on. A quantitative index is necessary to connect different partial standards in this code. Failure probability is considered as a candidate index. Therefore we decided to make a model calculation using failure probability and judge its applicability. We first investigated other probabilistic standards like ASME Code Case N-578. A probabilistic approach to structural integrity evaluation was created based on these results, and an evaluation flow was also proposed. According to this flow, a model calculation of creep-fatigue damage was performed. This trial calculation was for a vessel in a sodium-cooled FBR. As the result of this model calculation, a crack initiation probability and a crack penetration probability were found to be effective indices. Lastly, the merits of this System Based Code are discussed and presented in this report. Furthermore, this report presents future development tasks. (author)
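
    The report's actual model is not reproduced here, but a failure-probability index of the kind discussed can be illustrated with a minimal Monte Carlo sketch, assuming a linear creep-fatigue damage summation D = n/Nf + t/tr with lognormal scatter on the material lives; the distributions and loading numbers below are hypothetical, not values from the JNC study.

```python
import numpy as np

rng = np.random.default_rng(0)

def crack_initiation_probability(n_cycles, hold_hours, trials=100_000):
    """Monte Carlo estimate of P(creep-fatigue damage >= 1).

    Linear damage summation D = n/Nf + t/tr, with lognormal scatter on
    the fatigue life Nf and creep rupture time tr (hypothetical values).
    """
    Nf = rng.lognormal(mean=np.log(2.0e4), sigma=0.4, size=trials)   # cycles
    tr = rng.lognormal(mean=np.log(3.0e5), sigma=0.5, size=trials)   # hours
    damage = n_cycles / Nf + n_cycles * hold_hours / tr
    return (damage >= 1.0).mean()

# 5000 thermal cycles, each with a 30 h hold at temperature.
print(crack_initiation_probability(n_cycles=5000, hold_hours=30.0))
```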

  1. Culturally sensitive social practice, reality or fiction? A theoretical and comparative approach to the fundamentals of cultural competence and its development in the Social Work degree

    Directory of Open Access Journals (Sweden)

    Joaquín Guerrero-Muñoz

    2017-12-01

    Full Text Available Cultural competence in Social Work has been recognised by the National Association of Social Workers of the United States as a basic standard of the profession. The objective of this standard is for social workers to have a wide range of knowledge of the client’s culture and develop skills to provide social care and culturally sensitive services. However, cultural competence in Social Work faces important theoretical, practical and educational difficulties. This article offers an exhaustive evaluation of the different criticisms that culturally sensitive social work has received. A crucial aspect of social practice is the teaching-learning process of cultural competence. After analysing the Spanish social work degrees, this paper outlines a multidimensional, holistic and integrated approach that responds to the educational needs of our future professionals. Based on these findings, we suggest the systematic inclusion of cultural competence in curricula and the generation of learning outcomes in basic components such as self-awareness, effective communication, decision making, coping with situations in intercultural contexts and the understanding of different socio-cultural realities.

  2. Design and modeling of an SJ infrared solar cell approaching upper limit of theoretical efficiency

    Science.gov (United States)

    Sahoo, G. S.; Mishra, G. P.

    2018-01-01

    Recent trends in photovoltaics focus on approaching the theoretical conversion-efficiency limit while making cells more cost-effective. To achieve this, we have to move beyond the golden era of the silicon cell and take a path towards the III-V compound semiconductor groups, to exploit advantages such as bandgap engineering by alloying these compounds. In this work we have used the low-bandgap material GaSb and designed a single-junction (SJ) cell with a conversion efficiency of 32.98%. The SILVACO ATLAS TCAD simulator has been used to simulate the proposed model using both the Ray Tracing and Transfer Matrix Methods (under 1 sun and 1000 sun of the AM1.5G spectrum). Detailed analyses of the photogeneration rate, spectral response, potential developed, external quantum efficiency (EQE), internal quantum efficiency (IQE), short-circuit current density (JSC), open-circuit voltage (VOC), fill factor (FF) and conversion efficiency (η) are presented. The obtained results are compared with previously reported SJ solar cells.
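
    The paper's results come from TCAD simulations, which are not reproduced here; the sketch below only illustrates how JSC, VOC, FF and η relate to one another in an ideal single-diode model. The parameter values are illustrative assumptions, not the GaSb cell's figures.

```python
import numpy as np

q, kB, T = 1.602e-19, 1.381e-23, 300.0
Vt = kB * T / q  # thermal voltage, ~25.9 mV at 300 K

def jv_metrics(jsc, j0, p_in=1000.0):
    """Voc, FF and efficiency of an ideal single-diode cell.

    J(V) = Jsc - J0*(exp(V/Vt) - 1); currents in A/m^2, p_in in W/m^2.
    """
    voc = Vt * np.log(jsc / j0 + 1.0)          # open-circuit voltage
    v = np.linspace(0.0, voc, 2000)
    j = jsc - j0 * np.expm1(v / Vt)            # diode J-V curve
    p_max = (v * j).max()                      # maximum power point
    ff = p_max / (voc * jsc)                   # fill factor
    return voc, ff, p_max / p_in

voc, ff, eta = jv_metrics(jsc=450.0, j0=1e-4)  # illustrative values only
print(f"Voc={voc:.3f} V, FF={ff:.3f}, efficiency={eta:.1%}")
```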

  3. A Cultural Psychological Approach to Analyze Intercultural Learning: Potential and Limits of the Structure Formation Technique

    Directory of Open Access Journals (Sweden)

    Doris Weidemann

    2009-01-01

    Full Text Available Despite the huge interest in sojourner adjustment, there is still a lack of qualitative as well as of longitudinal research that would offer more detailed insights into intercultural learning processes during overseas stays. The present study aims to partly fill that gap by documenting changes in knowledge structures and general living experiences of fifteen German sojourners in Taiwan in a longitudinal, cultural-psychological study. As part of a multimethod design a structure formation technique was used to document subjective theories on giving/losing face and their changes over time. In a second step results from this study are compared to knowledge-structures of seven long-term German residents in Taiwan, and implications for the conceptualization of intercultural learning will be proposed. Finally, results from both studies serve to discuss the potential and limits of structure formation techniques in the field of intercultural communication research. URN: urn:nbn:de:0114-fqs0901435

  4. Exploring the aphasiac's naming disturbances; a new approach using the neighbourhood limited classification method.

    Science.gov (United States)

    Moerman, C; Corluy, R; Meersman, W

    1983-12-01

    The patterns of errors observed in eighteen patients with aphasia naming 42 different items presented 5 times in random succession were analyzed by the N.L.C. (Neighbourhood Limited Classification) method. The initial and last responses for each item were considered. The procedure allowed the computation of an improvement coefficient for each patient, illustrating the capacity to modify an initially erroneous response. For each type of response a cluster analysis was performed and a grouping of patients was obtained; in both cases some patients had close to normal scores. Clustering was chiefly a function of severity of word-finding disturbance. Phonemic paraphasias follow a more independent course, allowing additional clustering. In the future, more importance should be given to their discriminative value in the taxonomy of naming disturbances.
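
    The abstract does not give the formula behind the improvement coefficient, so the sketch below uses one plausible operationalization, labeled as such: the fraction of initially erroneous items that are correct by the last presentation.

```python
def improvement_coefficient(initial_correct, last_correct):
    """Fraction of initially erroneous items corrected by the last trial.

    A hypothetical operationalization; the paper's exact formula is not
    reproduced in the abstract.
    """
    errors = [i for i, ok in enumerate(initial_correct) if not ok]
    if not errors:
        return 1.0  # nothing left to improve
    fixed = sum(1 for i in errors if last_correct[i])
    return fixed / len(errors)

# 42 items presented 5 times; compare the first and fifth responses.
initial = [True] * 30 + [False] * 12
last = [True] * 30 + [True] * 8 + [False] * 4
print(improvement_coefficient(initial, last))  # -> 0.666...
```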

  5. Comparison of Generative and Discriminative Approaches for Speaker Recognition with Limited Data

    Directory of Open Access Journals (Sweden)

    J. Silovsky

    2009-09-01

    Full Text Available This paper presents a comparison of three different speaker recognition methods deployed in a broadcast news processing system. We focus on how the generative and discriminative nature of these methods affects the speaker recognition framework, and we also deal with intersession variability compensation techniques in more detail, which are of great interest in the broadcast processing domain. The experiments performed are specific particularly in the very limited amount of data used for both speaker enrollment (typically ranging from 30 to 60 seconds) and recognition (typically ranging from 5 to 15 seconds). Our results show that the system based on Gaussian Mixture Models (GMMs) outperforms both systems based on Support Vector Machines (SVMs), but its drawback is higher computational cost.
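
    A minimal sketch of the generative GMM approach is given below, using scikit-learn; the feature dimensionality, component count and synthetic "MFCC-like" data are assumptions for illustration, not the paper's configuration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

def enroll(features, n_components=8):
    """Fit a diagonal-covariance GMM on one speaker's feature frames."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag",
                          max_iter=100, random_state=0)
    return gmm.fit(features)

def recognize(test_features, models):
    """Pick the enrolled model with the highest average log-likelihood."""
    scores = {name: gmm.score(test_features) for name, gmm in models.items()}
    return max(scores, key=scores.get)

# Toy 12-dimensional "MFCC-like" frames for two speakers (synthetic).
spk_a = rng.normal(0.0, 1.0, size=(3000, 12))
spk_b = rng.normal(0.8, 1.2, size=(3000, 12))
models = {"A": enroll(spk_a), "B": enroll(spk_b)}
test = rng.normal(0.8, 1.2, size=(500, 12))  # short segment from speaker B
print(recognize(test, models))  # -> "B"
```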

  6. Limited enzymic degradation of proteins: a new approach in the industrial application of hydrolases

    Energy Technology Data Exchange (ETDEWEB)

    Adler-Nissen, J.

    1982-01-01

    The industrial importance of hydrolases exceeds that of other classes of enzymes. A major application area for hydrolases is the dissolution of biopolymers such as starch, pectin, cellulose and protein; in many cases the desire has been to achieve as complete a solubilization as possible. However, with food proteins, it has been demonstrated that a limited, controlled hydrolysis may give rise to particularly interesting functional and organoleptic properties. The degree of hydrolysis (DH) is defined as the percentage of peptide bonds cleaved and is used as the controlling index for the hydrolysis of food proteins. For a given enzyme-substrate system, at least five independent indices can be defined: S (substrate concentration), E/S (enzyme/substrate ratio), pH, T (temperature) and t (time). The advantage of the DH concept is that, of these five variables, four (S, E/S, T, t) can be replaced by DH, i.e. within certain limits of S, E/S, T and t, the properties of a particular protein-enzyme system depend solely on DH and the pH of the hydrolysis. Empirically, this is demonstrated for soya-protein isolate hydrolyzed with Alcalase, and theoretically the same result can be derived from the fact that there is substrate saturation throughout the reaction. These theoretical calculations are the basis for the so-called theta(h) method, by which the significance of a particular hydrolysis index can be studied. For each empirically derived hydrolysis curve, the hydrolysis time corresponding to any DH is found. Over a complete DH interval the ratio between the hydrolysis times for each DH is then calculated. If this term, denoted theta(h), is the same for all DH, the properties of the hydrolysates are independent of variations in the hydrolysis index under study. A statistical procedure must be used to determine whether theta(h) is constant or not. (Refs. 20).
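
    To make the theta(h) test concrete, the sketch below interpolates the hydrolysis time at matched DH values for two hypothetical curves (here differing only in E/S) and reports their ratio; a near-constant ratio over the DH interval is the criterion described above. The curve data are invented for illustration.

```python
import numpy as np

def time_at_dh(t, dh, dh_targets):
    """Interpolate hydrolysis time at given DH values from one curve."""
    return np.interp(dh_targets, dh, t)  # dh must be increasing

def theta_h(curve_ref, curve_alt, dh_targets):
    """Ratio of hydrolysis times at matched DH for two curves.

    curve_* are (t, dh) pairs measured at two settings of the index
    under study (e.g. two E/S ratios). A near-constant ratio suggests
    hydrolysate properties depend on DH and pH alone.
    """
    t_ref = time_at_dh(*curve_ref, dh_targets)
    t_alt = time_at_dh(*curve_alt, dh_targets)
    return t_alt / t_ref

# Hypothetical curves: halving E/S roughly doubles the time to reach a DH.
t = np.array([5, 10, 20, 40, 80, 160], dtype=float)    # minutes
dh = np.array([2, 4, 7, 10, 13, 15], dtype=float)      # % bonds cleaved
print(theta_h((t, dh), (2 * t, dh), dh_targets=np.array([4.0, 8.0, 12.0])))
# -> [2. 2. 2.]  (constant theta(h))
```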

  7. Limitation of Socio-Economic Rights in the 2010 Kenyan Constitution: A Proposal for the Adoption of a Proportionality Approach in the Judicial Adjudication of Socio-Economic Rights Disputes

    Directory of Open Access Journals (Sweden)

    Nicholas Wasonga Orago

    2013-12-01

    Full Text Available On 27 August 2010 Kenya adopted a transformative Constitution with the objective of fighting poverty and inequality as well as improving the standards of living of all people in Kenya. One of the mechanisms in the 2010 Constitution aimed at achieving this egalitarian transformation is the entrenchment of justiciable socio-economic rights (SERs), an integral part of the Bill of Rights. The entrenched SERs require the State to put in place a legislative, policy and programmatic framework to enhance the realisation of its constitutional obligations to respect, protect and fulfill these rights for all Kenyans. These SER obligations, just like any other fundamental human rights obligations, are, however, not absolute and are subject to legitimate limitation by the State. Two approaches have been used in international and comparative national law jurisprudence to limit SERs: the proportionality approach, using a general limitation clause that has found application in international and regional jurisprudence on the one hand; and the reasonableness approach, using internal limitations contained in the standard of progressive realisation, an approach that has found application in the SER jurisprudence of the South African Courts, on the other hand. This article proposes that if the entrenched SERs are to achieve their transformative objectives, Kenyan courts must adopt a proportionality approach in the judicial adjudication of SER disputes. This proposal is based on the reasoning that for the entrenched SERs to have a substantive positive impact on the lives of the Kenyan people, any measure by the government aimed at their limitation must be subjected to strict scrutiny by the courts, a form of scrutiny that can be achieved only by using the proportionality standard entrenched in the article 24 general limitation clause.

  8. Fostering Interculturality in Urban Ethnic Neighbourhoods: Opportunities and Limits of the Responsible Tourism Approach

    Directory of Open Access Journals (Sweden)

    Melissa Moralli

    2016-12-01

    Full Text Available In the last decades, new forms of interculturality have been emerging as a consequence of economic and cultural globalisation. Within these experiences, tourism in ethnic neighbourhoods can represent an innovative way to create new spaces for intercultural encounter in western cities. The aim of this paper is to analyse this new social phenomenon, drawing on the results of exploratory ethnographic research focused on Mygrantour, an intercultural network which aims to create new forms of intercultural encounter between migrant communities and tourists in some Mediterranean cities. Adopting a responsible tourism approach, this paper analyses the social and cultural consequences of this growing phenomenon in terms of social integration and intercultural encounter.

  9. A Case of Sublingual Dermoid Cyst: Extending the Limits of the Oral Approach

    Directory of Open Access Journals (Sweden)

    Nobuo Ohta

    2012-01-01

    Full Text Available We present the case of a dermoid cyst with an oral and a submental component in a 21-year-old Japanese woman who presented with complaints of a mass in the oral cavity and difficulty in chewing and swallowing solid foods for about 2 years. MRI showed a 55 × 65 mm well-circumscribed cystic mass extending from the sublingual area to the mylohyoid muscle. Under general anesthesia and with nasotracheal intubation, the patient underwent surgical removal of the mass. Although the cyst was large and extended beyond the mylohyoid muscle, an intraoral midline incision was made through the mucosa overlying the swelling, and the cyst was separated from the surrounding tissues with appropriate traction and countertraction and successfully removed without an extraoral incision. The oral approach to surgical enucleation is a useful procedure for avoiding cosmetic problems, even for large cysts extending beyond the mylohyoid muscle.

  10. SPECTRUS: A Dimensionality Reduction Approach for Identifying Dynamical Domains in Protein Complexes from Limited Structural Datasets.

    Science.gov (United States)

    Ponzoni, Luca; Polles, Guido; Carnevale, Vincenzo; Micheletti, Cristian

    2015-08-04

    Identifying dynamical, quasi-rigid domains in proteins provides a powerful means for characterizing functionally oriented structural changes via a parsimonious set of degrees of freedom. In fact, the relative displacements of few dynamical domains usually suffice to rationalize the mechanics underpinning biological functionality in proteins and can even be exploited for structure determination or refinement purposes. Here we present SPECTRUS, a general scheme that, by solely using amino acid distance fluctuations, can pinpoint the innate quasi-rigid domains of single proteins or large complexes in a robust way. Consistent domains are usually obtained by using either a pair of representative structures or thousands of conformers. The functional insights offered by the approach are illustrated for biomolecular systems of very different size and complexity such as kinases, ion channels, and viral capsids. The decomposition tool is available as a software package and web server at spectrus.sissa.it. Copyright © 2015 Elsevier Ltd. All rights reserved.
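
    In the spirit of the method, though not a reimplementation of it, the sketch below turns pairwise distance fluctuations across conformers into a similarity matrix and spectral-clusters residues into quasi-rigid domains. The similarity kernel and the toy two-block hinge system are assumptions; SPECTRUS's published pipeline differs in its details.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def quasi_rigid_domains(conformers, n_domains=2):
    """Cluster residues into quasi-rigid domains from distance fluctuations.

    conformers: array (n_frames, n_residues, 3). Residue pairs whose mutual
    distance fluctuates little are treated as belonging to the same rigid
    unit; the exact kernel of the published method is not reproduced here.
    """
    d = np.linalg.norm(conformers[:, :, None, :] - conformers[:, None, :, :],
                       axis=-1)                    # (frames, n, n) distances
    fluct = d.std(axis=0)                          # pairwise distance std-dev
    sim = np.exp(-fluct / (fluct.mean() + 1e-12))  # high when fluctuation low
    return SpectralClustering(n_clusters=n_domains, affinity="precomputed",
                              random_state=0).fit_predict(sim)

# Toy system: two rigid 10-residue blocks hinging relative to each other.
rng = np.random.default_rng(2)
base = rng.normal(size=(20, 3))
frames = []
for ang in np.linspace(0.0, 0.8, 50):              # hinge angle per frame
    c, s = np.cos(ang), np.sin(ang)
    rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    conf = base.copy()
    conf[10:] = conf[10:] @ rot.T                  # rotate the second block
    frames.append(conf)
print(quasi_rigid_domains(np.array(frames)))       # two blocks of equal labels
```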

  11. Limitations of the endonasal endoscopic approach in treating olfactory groove meningiomas. A systematic review.

    Science.gov (United States)

    Shetty, Sathwik Raviraj; Ruiz-Treviño, Armando S; Omay, Sacit Bulent; Almeida, Joao Paulo; Liang, Buqing; Chen, Yu-Ning; Singh, Harminder; Schwartz, Theodore H

    2017-10-01

    To review current management strategies for olfactory groove meningiomas (OGMs) and the recent literature comparing endoscopic endonasal (EEA) with traditional transcranial (TCA) approaches. A PubMed search of the recent literature (2011-2016) was performed to examine outcomes following EEA and TCA for OGM. The extent of resection, visual outcome, postoperative complications and recurrence rates were analyzed using percentages and proportions, the Fisher exact test and the Student's t-test using Graphpad PRISM 7.0Aa (San Diego, CA) software. There were 444 patients in the TCA group with a mean diameter of 4.61 (±1.17) cm and 101 patients in the EEA group with a mean diameter of 3.55 (±0.58) cm (p = 0.0589). GTR was achieved in 90.9% (404/444) in the TCA group and 70.2% (71/101) in the EEA group (p … OGMs.

  12. Fundamentals of ultrasonic phased arrays

    CERN Document Server

    Schmerr, Lester W

    2014-01-01

    This book describes in detail the physical and mathematical foundations of ultrasonic phased array measurements.?The book uses linear systems theory to develop a comprehensive model of the signals and images that can be formed with phased arrays. Engineers working in the field of ultrasonic nondestructive evaluation (NDE) will find in this approach a wealth of information on how to design, optimize and interpret ultrasonic inspections with phased arrays. The fundamentals and models described in the book will also be of significant interest to other fields, including the medical ultrasound and

  13. Fundamental Properties of Salts

    Energy Technology Data Exchange (ETDEWEB)

    Toni Y Gutknecht; Guy L Fredrickson

    2012-11-01

    Thermal properties of molten salt systems are of interest to electrorefining operations, pertaining to both the Fuel Cycle Research & Development Program (FCR&D) and the Spent Fuel Treatment Mission, currently being pursued by the Department of Energy (DOE). The phase stability of molten salts in an electrorefiner may be adversely impacted by the build-up of fission products in the electrolyte. Potential situations that need to be avoided during electrorefining operations include (i) build-up of fissile elements in the salt that might approach the criticality limits specified for the vessel, (ii) electrolyte freezing at the operating temperature of the electrorefiner due to changes in the liquidus temperature, and (iii) phase separation (non-homogeneous solution). The stability (and homogeneity) of the phases can be monitored by studying the thermal characteristics of the molten salts as a function of impurity concentration. Simulated salt compositions consisting of the selected rare earth and alkaline earth chlorides, with a eutectic mixture of LiCl-KCl as the carrier electrolyte, were studied to determine the melting points (thermal characteristics) using a Differential Scanning Calorimeter (DSC). The experimental data were used to model the liquidus temperature. On the basis of this data, it became possible to predict a spent fuel treatment processing scenario under which electrorefining could no longer be performed as a result of increasing liquidus temperatures of the electrolyte.

  14. Exploring the Obstacles and the Limits of Sustainable Development. A Theoretical Approach

    Directory of Open Access Journals (Sweden)

    Paula-Carmen Roșca

    2017-03-01

    Full Text Available The term “sustainable” or “sustainability” is currently used so much and in so many fields that it has become basically part of our everyday lives. It has been linked to almost everything related to our living, to our lifestyle: energy, transport, housing, diet, clothing etc. But what does the term “sustainable” really mean? Many people may have heard about sustainable development or sustainability and may have even tried to have a sustainable living, but their efforts might not be enough. The present paper is meant to bring forward a few of the limits of the “sustainability” concept. Moreover, it is focused on revealing some arguments from the “other side”, along with disagreements regarding some of the principles of “sustainable development” and even criticism related to its progress and its achievements. Another purpose of this paper is to draw attention to some of the issues and obstacles which may threaten the future of sustainability. The paper is also meant to highlight the impact that some stakeholders might have on the evolution of sustainable development due to their financial power, on a global scale.

  15. Yeast biomass production: a new approach in glucose-limited feeding strategy

    Directory of Open Access Journals (Sweden)

    Érika Durão Vieira

    2013-01-01

    Full Text Available The aim of this work was to implement experimentally a simple glucose-limited feeding strategy for yeast biomass production in a bubble column reactor, based on a spreadsheet simulator suitable for industrial application. In biomass production processes using Saccharomyces cerevisiae strains, one of the constraints is the strong tendency of these species to metabolize sugars anaerobically due to catabolite repression, leading to low values of biomass yield on substrate. The usual strategy to control this metabolic tendency is the use of a fed-batch process in which the sugar source is fed incrementally and the total sugar concentration in the broth is maintained below a determined value. The simulator presented in this work was developed to control molasses feeding on the basis of a simple theoretical model that takes into account the nutritional growth needs of the yeast cell and two input data: the theoretical specific growth rate and the initial cell biomass. In the experimental assay, a commercial baker's yeast strain and molasses as sugar source were used. Experimental results showed an overall biomass yield on substrate of 0.33, a biomass increase of 6.4-fold and a specific growth rate of 0.165 h-1, in contrast to the predicted value of 0.180 h-1 in the second stage simulation.
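
    A minimal sketch of such a glucose-limited feeding schedule is given below, assuming the classical exponential-feed relation F(t) = μ·X(t)/(Y_xs·S_feed) with X(t) = X0·exp(μt); the yield, feed sugar concentration and starting biomass are hypothetical, and maintenance metabolism and broth volume change are ignored, so this is not the paper's spreadsheet model.

```python
import numpy as np

def molasses_feed_profile(mu, x0_g, y_xs=0.33, s_feed_g_l=250.0,
                          hours=12.0, step_h=0.5):
    """Exponential glucose-limited feed schedule for a fed-batch culture.

    F(t) = mu * X(t) / (Y_xs * S_feed), with X(t) = X0 * exp(mu * t).
    mu: target specific growth rate (1/h); x0_g: initial biomass (g);
    y_xs: biomass yield on sugar; s_feed_g_l: sugar in the feed (g/L).
    """
    t = np.arange(0.0, hours + step_h, step_h)
    biomass = x0_g * np.exp(mu * t)                 # g, total in broth
    feed_rate = mu * biomass / (y_xs * s_feed_g_l)  # L/h of molasses feed
    return t, biomass, feed_rate

t, x, f = molasses_feed_profile(mu=0.165, x0_g=50.0)
print(f"biomass increase: {x[-1] / x[0]:.1f}-fold over {t[-1]:.0f} h")
# -> ~7.2-fold over 12 h at mu = 0.165 1/h
```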

  16. The quasi-classical limit of scattering amplitude - L2-approach for short range potentials

    International Nuclear Information System (INIS)

    Yajima, K.; Vienna Univ.

    1984-01-01

    We are concerned with the asymptotic behaviour, as Planck's constant ℏ → 0, of the scattering operator S^ℏ associated with the pair of Schroedinger equations iℏ ∂u/∂t = −(ℏ²/2m)Δu + V(x)u ≡ H^ℏ u and iℏ ∂u/∂t = −(ℏ²/2m)Δu ≡ H₀^ℏ u. We shall show, under certain conditions, that the scattering matrix Ŝ^ℏ(p,q), the distribution kernel of S^ℏ in the momentum representation, may be expressed in terms of a Fourier integral operator. Then, applying the stationary phase method to it, we shall prove that Ŝ^ℏ has an asymptotic expansion in powers of ℏ up to any order in L²-space, and that the limit as ℏ → 0 of the total cross section is, generically, twice that of classical mechanics. (Author)

  17. Revisiting Pocos de Caldas. Application of the co-precipitation approach to establish realistic solubility limits for performance assessment

    International Nuclear Information System (INIS)

    Bruno, J.; Duro, L.; Jordana, S.; Cera, E.

    1996-02-01

    Solubility limits constitute a critical parameter for the determination of the mobility of radionuclides in the near field and the geosphere, and consequently for the performance assessment of nuclear waste repositories. Mounting evidence from natural system studies indicates that trace elements, and consequently radionuclides, are associated with the dynamic cycling of major geochemical components. We have recently developed a thermodynamic approach to take into consideration the co-precipitation and co-dissolution processes that mainly control this linkage. The approach has been tested in various natural system studies with encouraging results. The Pocos de Caldas natural analogue was one of the sites where a full testing of our predictive geochemical modelling capabilities was done during the analogue project. We have revisited the Pocos de Caldas data and expanded the trace element solubility calculations by considering the documented trace metal/major ion interactions. This has been done by using the co-precipitation/co-dissolution approach. The outcome is as follows: A satisfactory modelling of the behaviour of U, Zn and REEs is achieved by assuming co-precipitation with ferrihydrite. Strontium concentrations are apparently controlled by its co-dissolution from Sr-rich fluorites. From the performance assessment point of view, the present work indicates that calculated solubility limits using the co-precipitation approach are in close agreement with the actual trace element concentrations. Furthermore, the calculated radionuclide concentrations are 2-4 orders of magnitude lower than conservative solubility limits calculated by assuming equilibrium with individual trace element phases. 34 refs, 18 figs, 13 tabs

  19. Angular plasmon response of gold nanoparticles arrays: approaching the Rayleigh limit

    Directory of Open Access Journals (Sweden)

    Marae-Djouda Joseph

    2016-07-01

    Full Text Available The regular arrangement of metal nanoparticles influences their plasmonic behavior. It has been previously demonstrated that the coupling between diffracted waves and plasmon modes can give rise to extremely narrow plasmon resonances. This is the case when the single-particle localized surface plasmon resonance (λLSP) is very close in value to the Rayleigh anomaly wavelength (λRA) of the nanoparticle array. In this paper, we performed angle-resolved extinction measurements on a 2D array of gold nano-cylinders designed to fulfil the condition λRA<λLSP. Varying the angle of excitation offers a unique possibility to finely modify the value of λRA, thus gradually approaching the condition of coupling between diffracted waves and plasmon modes. The experimental observation of a collective dipolar resonance has been interpreted by exploiting a simplified model based on the coupling of evanescent diffracted waves with plasmon modes. Among other plasmon modes, the measurement technique has also evidenced and allowed the study of a vertical plasmon mode, only visible in TM polarization at off-normal excitation incidence. The results of numerical simulations, based on the periodic Green’s tensor formalism, match well with the experimental transmission spectra and show fine details that could go unnoticed by considering only experimental data.
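
    The angle dependence exploited in the experiment can be sketched with the first-order Rayleigh anomaly condition of a 1D grating, λRA = (a/m)(n + sin θ), used here as a simplified stand-in for the 2D array; the pitch and effective index below are hypothetical, not the sample's parameters.

```python
import numpy as np

def rayleigh_anomaly_nm(pitch_nm, theta_deg, n_medium=1.5, order=1):
    """Rayleigh anomaly wavelength of a 1D grating at oblique incidence.

    lambda_RA = (pitch / order) * (n_medium + sin(theta)); a 1D
    approximation of the 2D array, with hypothetical pitch and index.
    """
    return pitch_nm / order * (n_medium + np.sin(np.radians(theta_deg)))

for theta in (0.0, 5.0, 10.0, 15.0):
    print(theta, round(rayleigh_anomaly_nm(400.0, theta), 1))
# lambda_RA shifts from 600 nm to ~703 nm as theta goes 0 -> 15 deg,
# showing how the excitation angle finely tunes lambda_RA toward lambda_LSP.
```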

  20. What is Fundamental?

    CERN Multimedia

    2004-01-01

    Discussing what is fundamental in a variety of fields, biologist Richard Dawkins, physicist Gerardus 't Hooft, and mathematician Alain Connes spoke to a packed Main Auditorium at CERN 15 October. Dawkins, Professor of the Public Understanding of Science at Oxford University, explained simply the logic behind Darwinian natural selection, and how it would seem to apply anywhere in the universe that had the right conditions. 't Hooft, winner of the 1999 Physics Nobel Prize, outlined some of the main problems in physics today, and said he thinks physics is so fundamental that even alien scientists from another planet would likely come up with the same basic principles, such as relativity and quantum mechanics. Connes, winner of the 1982 Fields Medal (often called the Nobel Prize of Mathematics), explained how physics is different from mathematics, which he described as a "factory for concepts," unfettered by connection to the physical world. On 16 October, anthropologist Sharon Traweek shared anecdotes from her ...

  1. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, A.; Murthy, S.

    2007-06-01

    The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS₃ × S^(d−1) × T^(8−d), d = 3, …, 8, only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings. (author)

  2. Fundamentals of differential beamforming

    CERN Document Server

    Benesty, Jacob; Pan, Chao

    2016-01-01

    This book provides a systematic study of the fundamental theory and methods of beamforming with differential microphone arrays (DMAs), or differential beamforming in short. It begins with a brief overview of differential beamforming and some popularly used DMA beampatterns such as the dipole, cardioid, hypercardioid, and supercardioid, before providing essential background knowledge on orthogonal functions and orthogonal polynomials, which form the basis of differential beamforming. From a physical perspective, a DMA of a given order is defined as an array that measures the differential acoustic pressure field of that order; such an array has a beampattern in the form of a polynomial whose degree is equal to the DMA order. Therefore, the fundamental and core problem of differential beamforming boils down to the design of beampatterns with orthogonal polynomials. But certain constraints also have to be considered so that the resulting beamformer does not seriously amplify the sensors’ self noise and the mismatch…

  3. Fundamentals of nuclear physics

    CERN Document Server

    Takigawa, Noboru

    2017-01-01

    This book introduces the current understanding of the fundamentals of nuclear physics by referring to key experimental data and by providing a theoretical understanding of principal nuclear properties. It primarily covers the structure of nuclei at low excitation in detail. It also examines nuclear forces and decay properties. In addition to fundamentals, the book treats several new research areas such as non-relativistic as well as relativistic Hartree–Fock calculations, the synthesis of super-heavy elements, the quantum chromodynamics phase diagram, and nucleosynthesis in stars, to convey to readers the flavor of current research frontiers in nuclear physics. The authors explain semi-classical arguments and derivation of its formulae. In these ways an intuitive understanding of complex nuclear phenomena is provided. The book is aimed at graduate school students as well as junior and senior undergraduate students and postdoctoral fellows. It is also useful for researchers to update their knowledge of diver...

  4. Frontiers of Fundamental Physics

    CERN Document Server

    2014-01-01

    The 14th annual international symposium “Frontiers of Fundamental Physics” (FFP14) was organized by the OCEVU Labex. It was held in Marseille, on the Saint-Charles Campus of Aix Marseille University (AMU) and had over 280 participants coming from all over the world. FFP Symposium began in India in 1997 and it became itinerant in 2004, through Europe, Canada and Australia. It covers topics in fundamental physics with the objective to enable scholars working in related areas to meet on a single platform and exchange ideas. In addition to highlighting the progress in these areas, the symposium invites the top researchers to reflect on the educational aspects of our discipline. Moreover, the scientific concepts are also discussed through philosophical and epistemological viewpoints. Several eminent scientists, such as the laureates of prestigious awards (Nobel Prize, Fields Medal,…), have already participated in these meetings. The FFP14 Symposium developed around seven main themes, namely: Astroparticle Physics…

  5. Fundamental physics constants

    International Nuclear Information System (INIS)

    Cohen, E.R.; Taylor, B.N.

    1995-01-01

    Present technological applications require the values used for the fundamental physical and chemical constants to be more and more precise and at the same time coherent. Great importance is then attached to the task of coordinating and comparing the most recent experimental data, extracting from them as a whole, by means of a least-squares fit, a set of values for the fundamental constants as precise and coherent as possible. The set of values at present in use derives from a fit performed in 1986, but new experimental results already promise a large reduction in the uncertainties of various constants. A new global fit that will implement such reductions is scheduled for completion in 1995 or 1996

  6. Fundamental concepts on energy

    International Nuclear Information System (INIS)

    Rodriguez, M.H.

    1998-01-01

    The fundamental concepts of energy and the different forms in which it is manifested are presented. Since it is possible to transform energy from one form to another, the laws that govern these transformations are discussed. Energy transformation processes are an essential component of humanity's capacity to survive and develop. Energy use involves important economic, technical and political aspects. Because of this, any decision on how to manage an energy system will be key to our future life.

  7. Fundamentals of gas counters

    International Nuclear Information System (INIS)

    Bateman, J.E.

    1994-01-01

    The operation of gas counters used for detecting radiation is explained in terms of the four fundamental physical processes which govern their operation. These are: 1) conversion of neutral radiation into charged particles, 2) ionization of the host gas by a fast charged particle, 3) transport of the gas ions to the electrodes, and 4) amplification of the electrons in a region of enhanced electric field. Practical implications of these are illustrated. (UK)
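
    As a worked equation for process 4, the electron avalanche in the enhanced-field region is conventionally described by the first Townsend coefficient; the standard gas-gain formula below is added for orientation and is not quoted from the report:

        M = \exp\!\left( \int_{x_1}^{x_2} \alpha(x)\,\mathrm{d}x \right),

    which for a uniform field (constant \alpha over a gap d) reduces to M = e^{\alpha d}.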

  8. Fundamentals of queueing theory

    CERN Document Server

    Gross, Donald; Thompson, James M; Harris, Carl M

    2013-01-01

    Praise for the Third Edition: "This is one of the best books available. Its excellent organizational structure allows quick reference to specific models and its clear presentation . . . solidifies the understanding of the concepts being presented." (IIE Transactions on Operations Engineering). Thoroughly revised and expanded to reflect the latest developments in the field, Fundamentals of Queueing Theory, Fourth Edition continues to present the basic statistical principles that are necessary to analyze the probabilistic nature of queues. Rather than pre
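
    As one concrete instance of the models such texts analyze, here is a minimal check of the single-server M/M/1 queue against its textbook mean time in system, W = 1/(mu - lambda). The parameter values are arbitrary and the code is illustrative, not taken from the book.

        import random

        lam, mu, n = 0.8, 1.0, 200_000   # arrival rate, service rate, customers
        random.seed(1)

        wq, total = 0.0, 0.0             # current wait in queue, summed sojourns
        for _ in range(n):
            service = random.expovariate(mu)
            total += wq + service                    # sojourn = wait + service
            inter = random.expovariate(lam)
            wq = max(0.0, wq + service - inter)      # Lindley recursion

        print("simulated W:", total / n, "  theory:", 1.0 / (mu - lam))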

  9. Fundamentals of linear algebra

    CERN Document Server

    Dash, Rajani Ballav

    2008-01-01

    Fundamentals of Linear Algebra is a comprehensive textbook which can be used by students and teachers of all Indian universities. The text is written in an easy, understandable form and covers all topics of the UGC curriculum. There are many worked-out examples, which help students solve problems without outside help. The problem sets have been designed with the questions asked in different examinations in mind.

  10. Fundamentals of radiological protection

    International Nuclear Information System (INIS)

    Wells, J.; Mill, A.J.; Charles, M.W.

    1978-05-01

    The basic processes of living cells which are relevant to an understanding of the interaction of ionizing radiation with man are described. Particular reference is made to cell death, cancer induction and genetic effects. This is the second of a series of reports which present the fundamentals necessary for an understanding of the bases of regulatory criteria such as those recommended by the International Commission on Radiological Protection (ICRP). Others consider basic radiation physics and the biological effects of ionizing radiation. (author)

  11. Biomedical engineering fundamentals

    CERN Document Server

    Bronzino, Joseph D

    2014-01-01

    Known as the bible of biomedical engineering, The Biomedical Engineering Handbook, Fourth Edition, sets the standard against which all other references of this nature are measured. As such, it has served as a major resource for both skilled professionals and novices to biomedical engineering. Biomedical Engineering Fundamentals, the first volume of the handbook, presents material from respected scientists with diverse backgrounds in physiological systems, biomechanics, biomaterials, bioelectric phenomena, and neuroengineering. More than three dozen specific topics are examined, including cardia

  12. Fundamentals of Fire Phenomena

    DEFF Research Database (Denmark)

    Quintiere, James

    ... analyses. Fire phenomena encompass everything about the scientific principles behind fire behaviour. Combining the principles of chemistry, physics, heat and mass transfer, and fluid dynamics necessary to understand the fundamentals of fire phenomena, this book integrates the subject into a clear ... as a visiting professor at BYG.DTU, financed by the Larsen and Nielsen Foundation; entered into the research database by Kristian Hertz, responsible for the visiting professorship.

  13. Value of Fundamental Science

    Science.gov (United States)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the "mystery of the universe". Why do these words sound so attractive? What is implied by, and what is incompatible with, them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of the value of pure science on metaphysics? If not, how can this issue be addressed in public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  14. A Life-cycle Approach to Improve the Sustainability of Rural Water Systems in Resource-Limited Countries

    Directory of Open Access Journals (Sweden)

    Nicholas Stacey

    2012-11-01

    Full Text Available A WHO and UNICEF joint report states that in 2008, 884 million people lacked access to potable drinking water. A life-cycle approach to developing potable water systems may improve the sustainability of such systems; however, a review of the literature shows that such an approach has primarily been used for urban systems located in resourced countries. Although urbanization is increasing globally, over 40 percent of the world’s population is currently rural, with many considered poor. In this paper, we present a first step towards using life-cycle assessment to develop sustainable rural water systems in resource-limited countries while pointing out the remaining needs. For example, while there are few differences in costs and environmental impacts for many improved rural water system options, a system that uses groundwater with community standpipes is substantially lower in cost than other alternatives, with a somewhat lower environmental inventory. However, an LCA approach shows that from institutional as well as community and managerial perspectives, sustainability includes many other factors besides cost and environment that are a function of the interdependent decision process used across the life cycle of a water system by aid organizations, water user committees, and household users. These factors often present the biggest challenge to designing sustainable rural water systems for resource-limited countries.

  15. Individual differences in fundamental social motives.

    Science.gov (United States)

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Limitation of Socio-Economic Rights in the 2010 Kenyan Constitution

    African Journals Online (AJOL)

    These SER obligations, just like any other fundamental human rights obligations, are, however, not absolute and are subject to legitimate limitation by the State. Two approaches have been used in international and comparative national law jurisprudence to limit SERs: the proportionality approach, using a general limitation ...

  17. Benefits and limitations of a multidisciplinary approach to individualized management of Cornelia de Lange syndrome and related diagnoses.

    Science.gov (United States)

    January, Kathleen; Conway, Laura J; Deardorff, Matthew; Harrington, Ann; Krantz, Ian D; Loomes, Kathleen; Pipan, Mary; Noon, Sarah E

    2016-06-01

    Given the clinical complexities of Cornelia de Lange Syndrome (CdLS), the Center for CdLS and Related Diagnoses at The Children's Hospital of Philadelphia (CHOP) and The Multidisciplinary Clinic for Adolescents and Adults at Greater Baltimore Medical Center (GBMC) were established to develop a comprehensive approach to clinical management and research issues relevant to CdLS. Little work has been done to evaluate the general utility of a multispecialty approach to patient care. Previous research demonstrates several advantages and disadvantages of multispecialty care. This research aims to better understand the benefits and limitations of a multidisciplinary clinic setting for individuals with CdLS and related diagnoses. Parents of children with CdLS and related diagnoses who have visited a multidisciplinary clinic (N = 52) and who have not visited a multidisciplinary clinic (N = 69) were surveyed to investigate their attitudes. About 90.0% of multispecialty clinic attendees indicated a preference for multidisciplinary care. However, some respondents cited a need for additional clinic services including more opportunity to meet with other specialists (N = 20), such as behavioral health, and increased information about research studies (N = 15). Travel distance and expenses often prevented families' multidisciplinary clinic attendance (N = 41 and N = 35, respectively). Despite identified limitations, these findings contribute to the evidence demonstrating the utility of a multispecialty approach to patient care. This approach ultimately has the potential to not just improve healthcare for individuals with CdLS but for those with medically complex diagnoses in general. © 2016 Wiley Periodicals, Inc.

  18. Fundamentals of GPS Receivers A Hardware Approach

    CERN Document Server

    Doberstein, Dan

    2012-01-01

    While much of the current literature on GPS receivers is aimed at those intimately familiar with their workings, this volume summarizes the basic principles using as little mathematics as possible, and details the necessary specifications and circuits for constructing a GPS receiver that is accurate to within 300 meters. Dedicated sections deal with the features of the GPS signal and its data stream, the details of the receiver (using a hybrid design as exemplar), and more advanced receivers and topics including time and frequency measurements. Later segments discuss the Zarlink GPS receiver chip set, as well as providing a thorough examination of the TurboRogue receiver, one of the most accurate yet made. Guiding the reader through the concepts and circuitry, from the antenna to the solution of user position, the book’s deployment of a hybrid receiver as a basis for discussion allows for extrapolation of the core ideas to more complex, and more accurate designs. Digital methods are used, but any analogue c...
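
    The "solution of user position" mentioned above is, at its core, an iterative least-squares fit of position to measured ranges. The sketch below shows only that geometric core, with invented satellite coordinates in consistent units; the receiver clock bias that a real GPS solver must estimate as a fourth unknown is deliberately ignored.

        import numpy as np

        # Invented satellite positions and a 'true' user position (same units).
        sats = np.array([[15600., 7540., 20140.],
                         [18760., 2750., 18610.],
                         [17610., 14630., 13480.],
                         [19170., 610., 18390.]])
        truth = np.array([1000., 2000., 3000.])
        ranges = np.linalg.norm(sats - truth, axis=1)   # noiseless 'measurements'

        x = np.zeros(3)                                 # initial guess
        for _ in range(10):                             # Gauss-Newton iterations
            diff = x - sats
            rho = np.linalg.norm(diff, axis=1)          # predicted ranges
            J = diff / rho[:, None]                     # Jacobian d(rho)/dx
            x = x + np.linalg.lstsq(J, ranges - rho, rcond=None)[0]

        print(x)                                        # recovers 'truth'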

  19. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to a systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to an important novel CI technology: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNNs, including a new class of FNN, the cascade neo-fuzzy neural networks, are considered, and their training algorithms are described and analyzed. The applications of FNNs to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty, a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, as well as an application to portfolio optimization at the Ukrainian, Russian and American stock exchanges. The book also presents the problem of corporate bankruptcy risk forecasting under incomplete and fuzzy information, as well as new methods based on fuzzy set theory and fuzzy neural networks, and results of their application to bankruptcy ris...

  20. Fundamentals of radiological protection

    International Nuclear Information System (INIS)

    Charles, M.W.; Wells, J.; Mill, A.J.

    1978-04-01

    A brief review is presented of the early and late effects of ionising radiation on man, with particular emphasis on those aspects of importance in radiological protection. The terminology and dose-response curves are explained. Early effects on cells, tissues and whole organs are discussed. Late somatic effects considered include cancer and life-span shortening. Genetic effects are examined. The review is the third of a series of reports which present the fundamentals necessary for an understanding of the basis of regulatory criteria, such as those of the ICRP. (UK)

  1. Fundamentals of microwave photonics

    CERN Document Server

    Urick, V J; McKinney, Jason D

    2015-01-01

    A comprehensive resource for designing and constructing analog photonic links capable of high RF performance. Fundamentals of Microwave Photonics provides a comprehensive description of analog optical links from basic principles to applications. The book is organized into four parts. The first begins with a historical perspective of microwave photonics, listing the advantages of fiber optic links and delineating analog vs. digital links. The second section covers basic principles associated with microwave photonics in both the RF and optical domains. The third focuses on analog modulation formats-starti

  2. Fundamental of biomedical engineering

    CERN Document Server

    Sawhney, GS

    2007-01-01

    About the Book: A well set out textbook explains the fundamentals of biomedical engineering in the areas of biomechanics, biofluid flow, biomaterials, bioinstrumentation and use of computing in biomedical engineering. All these subjects form a basic part of an engineer's education. The text is admirably suited to meet the needs of the students of mechanical engineering, opting for the elective of Biomedical Engineering. Coverage of bioinstrumentation, biomaterials and computing for biomedical engineers can meet the needs of the students of Electronic & Communication, Electronic & Instrumenta

  3. Nanomachines fundamentals and applications

    CERN Document Server

    Wang, Joseph

    2013-01-01

    This first-hand account by one of the pioneers of nanobiotechnology brings together a wealth of valuable material in a single source. It allows fascinating insights into motion at the nanoscale, showing how the proven principles of biological nanomotors are being transferred to artificial nanodevices.As such, the author provides engineers and scientists with the fundamental knowledge surrounding the design and operation of biological and synthetic nanomotors and the latest advances in nanomachines. He addresses such topics as nanoscale propulsions, natural biomotors, molecular-scale machin

  4. Fundamentals of semiconductor devices

    CERN Document Server

    Lindmayer, Joseph

    1965-01-01

    Semiconductor properties; semiconductor junctions or diodes; transistor fundamentals; inhomogeneous impurity distributions, drift or graded-base transistors; high-frequency properties of transistors; band structure of semiconductors; high current densities and mechanisms of carrier transport; transistor transient response and recombination processes; surfaces, field-effect transistors, and composite junctions; additional semiconductor characteristics; additional semiconductor devices and microcircuits; more metal, insulator, and semiconductor combinations for devices; four-pole parameters and configuration rotation; four-poles of combined networks and devices; equivalent circuits; the error function and its properties; Fermi-Dirac statistics; useful physical constants.

  5. DOE fundamentals handbook: Chemistry

    International Nuclear Information System (INIS)

    1993-01-01

    This handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of chemistry. This volume contains the following modules: reactor water chemistry (effects of radiation on water chemistry, chemistry parameters), principles of water treatment (purpose; treatment processes [ion exchange]; dissolved gases, suspended solids, and pH control; water purity), and hazards of chemicals and gases (corrosives [acids, alkalies], toxic compounds, compressed gases, flammable/combustible liquids)

  6. Fundamentals of calculus

    CERN Document Server

    Morris, Carla C

    2015-01-01

    Fundamentals of Calculus encourages students to use power, quotient, and product rules for solutions as well as stresses the importance of modeling skills. In addition to core integral and differential calculus coverage, the book features finite calculus, which lends itself to modeling and spreadsheets. Specifically, finite calculus is applied to marginal economic analysis, finance, growth, and decay. Includes: Linear Equations and Functions; The Derivative; Using the Derivative; Exponential and Logarithmic Functions; Techniques of Differentiation; Integral Calculus; Integration Techniques; Functions

  7. Fundamental concepts of mathematics

    CERN Document Server

    Goodstein, R L

    Fundamental Concepts of Mathematics, 2nd Edition, provides an account of some basic concepts in modern mathematics. The book is primarily intended for mathematics teachers and lay people who want to improve their skills in mathematics. Among the concepts and problems presented in the book are the determination of which integral polynomials have integral solutions; sentence logic and informal set theory; and why four colors are enough to color a map. Unlike the first edition, the second edition provides detailed solutions to exercises contained in the text. Mathematics teachers and people

  8. Fundamentals of magnetism

    CERN Document Server

    Getzlaff, Mathias

    2007-01-01

    In the last decade tremendous progress has taken place in understanding the basis of magnetism, especially in reduced dimensions. In the first part, the fundamentals of magnetism are conveyed for atoms and bulk-like solid-state systems, providing a basis for the understanding of new phenomena which occur exclusively in low-dimensional systems, such as giant magnetoresistance. This wide field is discussed in the second part and illustrated by copious examples. This textbook is particularly suitable for graduate students in the physical and materials sciences. It includes numerous examples, exercises, and references.

  9. Electronic circuits fundamentals & applications

    CERN Document Server

    Tooley, Mike

    2015-01-01

    Electronics explained in one volume, using both theoretical and practical applications. New chapter on Raspberry Pi. Companion website contains free electronic tools to aid learning for students and a question bank for lecturers. Practical investigations and questions within each chapter help reinforce learning. Mike Tooley provides all the information required to get to grips with the fundamentals of electronics, detailing the underpinning knowledge necessary to appreciate the operation of a wide range of electronic circuits, including amplifiers, logic circuits, power supplies and oscillators. The

  10. El grupo fundamental

    Directory of Open Access Journals (Sweden)

    Carlos A. Robles Corbalá

    2015-12-01

    Full Text Available This article addresses a classical problem: detecting whether or not two topological spaces are homeomorphic. To this end, each topological space is associated with an algebraic group, in such a way that if the spaces are homeomorphic, then the associated groups are isomorphic. A construction of the fundamental group of a topological space is presented, with the focus on proving that it is indeed a group.
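
    In standard notation (added here as a reminder; the article itself should be consulted for the construction), the group associated to a space X with base point x_0 is the set of homotopy classes of loops at x_0, with concatenation as the product:

        \pi_1(X, x_0) = \{\, [\gamma] \mid \gamma : [0,1] \to X,\ \gamma(0) = \gamma(1) = x_0 \,\}, \qquad [\gamma]\cdot[\delta] = [\gamma * \delta],

    and a homeomorphism f : X \to Y induces an isomorphism \pi_1(X, x_0) \cong \pi_1(Y, f(x_0)).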

  11. Fundamentals of photonics

    CERN Document Server

    Saleh, Bahaa E A

    2007-01-01

    Now in a new full-color edition, Fundamentals of Photonics, Second Edition is a self-contained and up-to-date introductory-level textbook that thoroughly surveys this rapidly expanding area of engineering and applied physics. Featuring a logical blend of theory and applications, coverage includes detailed accounts of the primary theories of light, including ray optics, wave optics, electromagnetic optics, and photon optics, as well as the interaction of photons and atoms, and semiconductor optics. Presented at increasing levels of complexity, preliminary sections build toward more advan

  12. Quivers, words and fundamentals

    International Nuclear Information System (INIS)

    Mattioli, Paolo; Ramgoolam, Sanjaye

    2015-01-01

    A systematic study of holomorphic gauge invariant operators in general N=1 quiver gauge theories, with unitary gauge groups and bifundamental matter fields, was recently presented in http://dx.doi.org/10.1007/JHEP04(2013)094. For large ranks a simple counting formula in terms of an infinite product was given. We extend this study to quiver gauge theories with fundamental matter fields, deriving an infinite product form for the refined counting in these cases. The infinite products are found to be obtained from substitutions in a simple building block expressed in terms of the weighted adjacency matrix of the quiver. In the case without fundamentals, it is a determinant which itself is found to have a counting interpretation in terms of words formed from partially commuting letters associated with simple closed loops in the quiver. This is a new relation between counting problems in gauge theory and the Cartier-Foata monoid. For finite ranks of the unitary gauge groups, the refined counting is given in terms of expressions involving Littlewood-Richardson coefficients.
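
    For orientation, the Cartier–Foata identity referred to above can be stated in its standard form (quoted from the general literature, not from the paper): in a trace monoid M, where certain pairs of letters commute, the word-counting series is the inverse of the clique polynomial,

        \sum_{w \in M} w \;=\; \Big( \sum_{C} (-1)^{|C|}\, C \Big)^{-1},

    the sum on the right running over all finite sets C of pairwise commuting letters (including the empty set). This is the identity behind the counting interpretation of the determinant mentioned in the abstract.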

  13. An Unified Approach to Limits on Power Generation and Power Consumption in Thermo-Electro-Chemical Systems

    Directory of Open Access Journals (Sweden)

    Stanisław Sieniutycz

    2013-02-01

    Full Text Available This research presents a unified approach to power limits in power-producing and power-consuming systems, in particular those using renewable resources. As a benchmark system which generates or consumes power, a well-known standardized arrangement is considered, in which two different reservoirs are separated by an engine or a heat pump. Either of these units is located between a resource fluid (‘upper’ fluid, 1) and the environmental fluid (‘lower’ fluid, 2). Power yield or power consumption is determined in terms of conductivities, reservoir temperatures and an internal irreversibility coefficient, F. While the bulk temperatures Ti of the reservoirs are the only state coordinates needed to describe purely thermal units, in chemical (electrochemical) engines, heat pumps or separators it is necessary to use both temperatures and chemical potentials μk. Methods of mathematical programming and dynamic optimization are applied to determine limits on power yield or power consumption in various energy systems, such as thermal engines, heat pumps, solar dryers, electrolysers, fuel cells, etc. Methodological similarities in treating power limits in engines, separators and heat pumps are shown. Numerical approaches to multistage systems are based on methods of dynamic programming (DP) or on Pontryagin’s maximum principle. The first method searches for properties of optimal work and is limited to systems with a low-dimensional state vector, whereas the second investigates properties of the differential (canonical) equations derived from the process Hamiltonian. A relatively unknown symmetry in the behaviour of power producers (engines) and power consumers is enunciated in this paper. An approximate evaluation shows that at least ¼ of the power dissipated in the natural transfer process must be added to a separator or a heat pump in order to assure a required process rate. Applications focus on drying systems which, by nature, require a large amount of thermal
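
    For the purely thermal benchmark unit, the classical endoreversible result (a standard formula given here for context; the paper's expressions generalize it with the irreversibility coefficient) is that power is maximal at the Chambadal–Novikov–Curzon–Ahlborn efficiency:

        \eta_{\mathrm{CA}} = 1 - \sqrt{T_2 / T_1}, \qquad P_{\max} = \frac{g_1 g_2}{g_1 + g_2}\left(\sqrt{T_1} - \sqrt{T_2}\right)^2,

    where g_1 and g_2 are the heat conductances coupling the engine to the upper and lower reservoirs.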

  14. Approaching the limits

    African Journals Online (AJOL)

    From the 4th to the 17th of December 2016, the parties to the Convention on Biological Diversity held their 13th conference in Cancún, Mexico. At the event, a revised red list was produced. On the list are some species featured for the first time. Others were down-listed, or moved into categories more dire than was previously the case.

  15. The role of dose limitation and optimization in intervention. Approaches to the remediation of contaminated sites in Germany

    International Nuclear Information System (INIS)

    Goldammer, W.; Helming, M.; Kuehnel, G.; Landfermann, H.-H.

    2000-01-01

    The clean-up of contaminated sites requires appropriate and efficient methodologies for decision-making about the priorities and extent of remedial measures, aiming at two usually conflicting goals: to protect people and the environment, and to save money and other resources. Finding the cost-effective balance between these two primary objectives is often complicated by several factors. Sensible decision-making in this situation requires the use of appropriate methodologies and tools which assist in identifying and implementing the optimal solution. The paper discusses an approach developed in Germany to achieve environmentally sound and cost-effective solutions. A basic requirement within the German approach is the limitation of individual doses in order to limit inequity between people exposed. An Action Level of 1 mSv per annum is used in this sense for the identification of sites that require further investigation and potentially remediation. On the basis of this individual-dose-related criterion, secondary reference levels for directly measurable quantities such as activity concentrations have been derived, facilitating the practical application of the Action Level Concept. Decisions on remedial action, in particular for complex sites, are based on justification and optimization analyses. These take into consideration a variety of different contaminants and risks to humans and the environment arising via various exposure pathways. The optimization analyses, carried out to identify optimal remediation options, address radiological risks as well as short- and long-term costs within a cost-benefit analysis framework. Other relevant factors of influence, e.g. chemical risks or ecological damage, are incorporated as well. Comprehensive methodologies utilizing probabilistic methods have been developed to assess site conditions and possible remediation options on this basis. The approaches developed are applied within the German uranium mine rehabilitation program
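
    Schematically, a secondary reference level of the kind described above is obtained by dividing the Action Level by the dose delivered per unit of the measurable quantity, summed over exposure pathways (an illustrative relation with generic symbols, not the German regulatory formula):

        C_{\mathrm{ref}} = \frac{D_{\mathrm{AL}}}{\sum_p e_p\, U_p}, \qquad D_{\mathrm{AL}} = 1\ \mathrm{mSv/a},

    where e_p is the dose per unit activity concentration on pathway p and U_p the annual exposure or intake factor for that pathway.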

  16. Materials Fundamentals of Gate Dielectrics

    CERN Document Server

    Demkov, Alexander A

    2006-01-01

    This book presents the materials fundamentals of novel gate dielectrics that are being introduced into semiconductor manufacturing to ensure the continued scaling of CMOS devices. This is a very fast-evolving field of research, so we chose to focus on the basic understanding of the structure, thermodynamics, and electronic properties of these materials that determine their performance in device applications. Most of these materials are transition metal oxides. Ironically, the d-orbitals responsible for the high dielectric constant cause severe integration difficulties, thus intrinsically limiting high-k dielectrics. Though new in the electronics industry, many of these materials are well known in the field of ceramics, and we describe this unique connection. The complexity of the structure-property relations in TM oxides makes the use of state-of-the-art first-principles calculations necessary. Several chapters give a detailed description of the modern theory of polarization, and heterojunction band discont...

  17. Fundamentals of quantum mechanics

    CERN Document Server

    House, J E

    2017-01-01

    Fundamentals of Quantum Mechanics, Third Edition is a clear and detailed introduction to quantum mechanics and its applications in chemistry and physics. All required math is clearly explained, including intermediate steps in derivations, and a concise review of the math is included in the text at appropriate points. Most of the elementary quantum mechanical models, including particles in boxes, the rigid rotor, the harmonic oscillator, barrier penetration, and the hydrogen atom, are clearly and completely presented. Applications of these models to selected “real world” topics are also included. This new edition includes many new topics such as band theory and the heat capacity of solids, spectroscopy of molecules and complexes (including applications to ligand field theory), and small molecules of astrophysical interest.

  18. Lasers Fundamentals and Applications

    CERN Document Server

    Thyagarajan, K

    2010-01-01

    Lasers: Fundamentals and Applications, serves as a vital textbook to accompany undergraduate and graduate courses on lasers and their applications. Ever since their invention in 1960, lasers have assumed tremendous importance in the fields of science, engineering and technology because of their diverse uses in basic research and countless technological applications. This book provides a coherent presentation of the basic physics behind the way lasers work, and presents some of their most important applications in vivid detail. After reading this book, students will understand how to apply the concepts found within to practical, tangible situations. This textbook includes worked-out examples and exercises to enhance understanding, and the preface shows lecturers how to most beneficially match the textbook with their course curricula. The book includes several recent Nobel Lectures, which will further expose students to the emerging applications and excitement of working with lasers. Students who study lasers, ...

  19. Theory of fundamental interactions

    International Nuclear Information System (INIS)

    Pestov, A.B.

    1992-01-01

    In the present article the theory of fundamental interactions is derived in a systematic way from first principles. In the developed theory there is no separation between space-time and an internal gauge space. The main equations for the basic fields are derived. It is shown that the theory satisfies the correspondence principle and gives rise to new notions in the domain considered. In particular, the conclusion is reached that there exist particles which are characterized not only by mass, spin and charge but also by a moment of inertia. These are rotating particles, particles which represent the notion of the rigid body at the microscopic level and give a key to understanding the strong interactions. The main concepts and dynamical laws for these particles are formulated. The basic principles of the theory may be tested experimentally in the not too distant future. 29 refs

  20. Fundamentals of optical waveguides

    CERN Document Server

    Okamoto, Katsunari

    2006-01-01

    Fundamentals of Optical Waveguides is an essential resource for any researcher, professional or student involved in optics and communications engineering. Any reader interested in designing or actively working with optical devices must have a firm grasp of the principles of lightwave propagation. Katsunari Okamoto has presented this difficult technology clearly and concisely with several illustrations and equations. Optical theory encompassed in this reference includes coupled mode theory, nonlinear optical effects, finite element method, beam propagation method, staircase concatenation method, along with several central theorems and formulas. Since the publication of the well-received first edition of this book, planar lightwave circuits and photonic crystal fibers have fully matured. With this second edition the advances of these fibers along with other improvements on existing optical technologies are completely detailed. This comprehensive volume enables readers to fully analyze, design and simulate opti...

  1. Fundamental partial compositeness

    Energy Technology Data Exchange (ETDEWEB)

    Sannino, Francesco [CP-Origins and Danish IAS, University of Southern Denmark, Campusvej 55 (Denmark)]; Strumia, Alessandro [Dipartimento di Fisica dell’Università di Pisa and INFN, Pisa (Italy); Theory Division, CERN, Geneva (Switzerland)]; Tesi, Andrea [Department of Physics, Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States)]; Vigiani, Elena [Dipartimento di Fisica dell’Università di Pisa and INFN, Pisa (Italy)]

    2016-11-07

    We construct renormalizable Standard Model extensions, valid up to the Planck scale, that give a composite Higgs from a new fundamental strong force acting on fermions and scalars. Yukawa interactions of these particles with Standard Model fermions realize the partial compositeness scenario. Under certain assumptions on the dynamics of the scalars, successful models exist because gauge quantum numbers of Standard Model fermions admit a minimal enough ‘square root’. Furthermore, right-handed SM fermions have an SU(2)_R-like structure, yielding a custodially-protected composite Higgs. Baryon and lepton numbers arise accidentally. Standard Model fermions acquire mass at tree level, while the Higgs potential and flavor violations are generated by quantum corrections. We further discuss accidental symmetries and other dynamical features stemming from the new strongly interacting scalars. If the same phenomenology can be obtained from models without our elementary scalars, they would reappear as composite states.

  2. Fundamentals of phosphors

    CERN Document Server

    Yen, William M; Yamamoto, Hajime

    2006-01-01

    Drawing from the second edition of the best-selling Handbook of Phosphors, Fundamentals of Phosphors covers the principles and mechanisms of luminescence in detail and surveys the primary phosphor materials as well as their optical properties. The book addresses cutting-edge developments in phosphor science and technology including oxynitride phosphors and the impact of lanthanide level location on phosphor performance.Beginning with an explanation of the physics underlying luminescence mechanisms in solids, the book goes on to interpret various luminescence phenomena in inorganic and organic materials. This includes the interpretation of the luminescence of recently developed low-dimensional systems, such as quantum wells and dots. The book also discusses the excitation mechanisms by cathode-ray and ionizing radiation and by electric fields to produce electroluminescence. The book classifies phosphor materials according to the type of luminescence centers employed or the class of host materials used and inte...

  3. Psychology fundaments: considerations

    OpenAIRE

    Cambaúva, Lenita Gama

    2000-01-01

    This text is a reflection on some of the foundations of Psychology. Because subjectivity is a fundamental concept for human knowledge of the subject-object relation, the constitution of the concept of subjectivity in the history of thought is addressed from a socio-historical conception of man. It outlines, in a preliminary way, how the advent of the Modern Sciences came about and, with them, the emphasis on the Natural Sciences. Likewise, it also focuses on the critique levelled by the Human Sciences at the ...

  4. Automotive electronics design fundamentals

    CERN Document Server

    Zaman, Najamuz

    2015-01-01

    This book explains the topology behind automotive electronics architectures and examines how they can be profoundly augmented with embedded controllers. These controllers serve as the core building blocks of today’s vehicle electronics. Rather than simply teaching electrical basics, this unique resource focuses on the fundamental concepts of vehicle electronics architecture, and details the wide variety of Electronic Control Modules (ECMs) that enable the increasingly sophisticated "bells & whistles" of modern designs.  A must-have for automotive design engineers, technicians working in automotive electronics repair centers and students taking automotive electronics courses, this guide bridges the gap between academic instruction and industry practice with clear, concise advice on how to design and optimize automotive electronics with embedded controllers.

  5. Superconductivity fundamentals and applications

    CERN Document Server

    Buckel, Werner

    2004-01-01

    This is the second English edition of what has become one of the definitive works on superconductivity in German -- currently in its sixth edition. Comprehensive and easy to understand, this introductory text is written especially with the non-specialist in mind. The authors, both long-term experts in this field, present the fundamental considerations without the need for extensive mathematics, describing the various phenomena connected with the superconducting state, with liberal insertion of experimental facts and examples for modern applications. While all fields of superconducting phenomena are dealt with in detail, this new edition pays particular attention to the groundbreaking discovery of magnesium diboride and the current developments in this field. In addition, a new chapter provides an overview of the elements, alloys and compounds where superconductivity has been observed in experiments, together with their major characteristics. The chapter on technical applications has been considerably expanded...

  6. Digital Fourier analysis fundamentals

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations.  These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
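
    The rotating-vector picture described above is Euler's formula in action; a few lines of Python (an illustration in the spirit of the book's applets, not code from the book) confirm that a real sine is the sum of two counter-rotating complex vectors:

        import numpy as np

        t = np.linspace(0.0, 1.0, 1000)
        f = 5.0                                  # frequency in Hz (arbitrary)
        ccw = np.exp(2j * np.pi * f * t)         # counter-clockwise rotating vector
        cw = np.exp(-2j * np.pi * f * t)         # clockwise rotating vector
        sine = ((ccw - cw) / 2j).real            # sin = (e^{jwt} - e^{-jwt}) / 2j

        assert np.allclose(sine, np.sin(2 * np.pi * f * t))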

  7. Fundamental partial compositeness

    CERN Document Server

    Sannino, Francesco

    2016-01-01

    We construct renormalizable Standard Model extensions, valid up to the Planck scale, that give a composite Higgs from a new fundamental strong force acting on fermions and scalars. Yukawa interactions of these particles with Standard Model fermions realize the partial compositeness scenario. Successful models exist because gauge quantum numbers of Standard Model fermions admit a minimal enough 'square root'. Furthermore, right-handed SM fermions have an SU(2)$_R$-like structure, yielding a custodially-protected composite Higgs. Baryon and lepton numbers arise accidentally. Standard Model fermions acquire mass at tree level, while the Higgs potential and flavor violations are generated by quantum corrections. We further discuss accidental symmetries and other dynamical features stemming from the new strongly interacting scalars. If the same phenomenology can be obtained from models without our elementary scalars, they would reappear as composite states.

  8. Fluid mechanics fundamentals and applications

    CERN Document Server

    Cengel, Yunus

    2013-01-01

    Cengel and Cimbala's Fluid Mechanics: Fundamentals and Applications communicates directly with tomorrow's engineers in a simple yet precise manner. The text covers the basic principles and equations of fluid mechanics in the context of numerous and diverse real-world engineering examples. The text helps students develop an intuitive understanding of fluid mechanics by emphasizing the physics, using figures, numerous photographs and visual aids to reinforce the physics. The highly visual approach enhances the learning of fluid mechanics by students. This text distinguishes itself from others by the way the material is presented - in a progressive order from simple to more difficult, building each chapter upon foundations laid down in previous chapters. In this way, even the traditionally challenging aspects of fluid mechanics can be learned effectively. McGraw-Hill is also proud to offer ConnectPlus powered by Maple with the third edition of Cengel/Cimbala, Fluid Mechanics. This innovative and powerful new sy...

  9. Green Manufacturing Fundamentals and Applications

    CERN Document Server

    2013-01-01

    Green Manufacturing: Fundamentals and Applications introduces the basic definitions and issues surrounding green manufacturing at the process, machine and system (including supply chain) levels. It also shows, by way of several examples from different industry sectors, the potential for substantial improvement and the paths to achieve the improvement. Additionally, this book discusses regulatory and government motivations for green manufacturing and outlines the path for making manufacturing more green as well as making production more sustainable. This book also: • Discusses new engineering approaches for manufacturing and provides a path from traditional manufacturing to green manufacturing • Addresses regulatory and economic issues surrounding green manufacturing • Details new supply chains that need to be in place before going green • Includes state-of-the-art case studies in the areas of automotive, semiconductor and medical areas as well as in the supply chain and packaging areas Green Manufactu...

  10. Protein biomarkers on tissue as imaged via MALDI mass spectrometry: A systematic approach to study the limits of detection.

    Science.gov (United States)

    van de Ven, Stephanie M W Y; Bemis, Kyle D; Lau, Kenneth; Adusumilli, Ravali; Kota, Uma; Stolowitz, Mark; Vitek, Olga; Mallick, Parag; Gambhir, Sanjiv S

    2016-06-01

    MALDI mass spectrometry imaging (MSI) is emerging as a tool for protein and peptide imaging across tissue sections. Despite extensive study, there does not yet exist a baseline study evaluating the potential capabilities for this technique to detect diverse proteins in tissue sections. In this study, we developed a systematic approach for characterizing MALDI-MSI workflows in terms of limits of detection, coefficients of variation, spatial resolution, and the identification of endogenous tissue proteins. Our goal was to quantify these figures of merit for a number of different proteins and peptides, in order to gain more insight into the feasibility of protein biomarker discovery efforts using this technique. Control proteins and peptides were deposited in serial dilutions on thinly sectioned mouse xenograft tissue. Using our experimental setup, coefficients of variation were ... biomarkers, and a new benchmarking strategy that can be used for comparing diverse MALDI-MSI workflows. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
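
    The figures of merit named above can be made concrete with a toy calculation (all numbers invented; the paper's actual workflow, instrumentation and values are not reproduced here): a limit of detection from the calibration slope and blank noise, and a coefficient of variation across replicate spots.

        import numpy as np

        conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])           # amount deposited (a.u.)
        signal = np.array([3.0, 55.0, 102.0, 197.0, 405.0])  # measured intensity

        slope, intercept = np.polyfit(conc, signal, 1)       # linear calibration
        blank_sd = 4.0                                       # sd of blank spots (invented)
        lod = 3.3 * blank_sd / slope                         # common LOD convention

        reps = np.array([98.0, 105.0, 101.0, 110.0])         # replicate intensities
        cv = 100.0 * reps.std(ddof=1) / reps.mean()          # coefficient of variation

        print(f"LOD ~ {lod:.2f} a.u., CV ~ {cv:.1f} %")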

  11. A note on 'A simple approach to search for all d-MCs of a limited-flow network'

    International Nuclear Information System (INIS)

    Salehi Fathabadi, H.; Forghani-elahabadi, M.

    2009-01-01

    Many real-world systems are multi-state systems composed of multi-state components, in which the reliability can be computed in terms of the lower bound points of level d, called d-Mincuts (d-MCs). Such systems (electric power, transportation, etc.) may be considered as network flows whose arcs have independent and discrete random capacities. In this comment, we investigate the algorithm proposed by Yeh [A simple approach to search for all d-MCs of a limited-flow network. Reliability Engineering and System Safety 2001;71(1):15-19]. We show that the algorithm misses some real d-MCs; we then pinpoint the algorithm's failure and suggest a correction.
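
    A candidate vector can be checked directly against one common formulation of the d-MC definition: its maximum s-t flow equals d, and raising any single arc capacity (for an arc below its maximal capacity) raises the maximum flow. The sketch below uses networkx on an invented toy network; it is only the verification step, not the search algorithm of the cited papers.

        import networkx as nx

        def max_flow(caps, s, t):
            G = nx.DiGraph()
            for (u, v), c in caps.items():
                G.add_edge(u, v, capacity=c)
            return nx.maximum_flow_value(G, s, t)

        def is_d_mc(caps, limits, s, t, d):
            if max_flow(caps, s, t) != d:
                return False
            for arc, c in caps.items():
                if c < limits[arc]:                  # arc not at its maximal capacity
                    bumped = dict(caps)
                    bumped[arc] = c + 1
                    if max_flow(bumped, s, t) <= d:  # bump must raise the max flow
                        return False
            return True

        limits = {("s", "a"): 2, ("a", "t"): 2, ("s", "t"): 1}   # maximal capacities
        X = {("s", "a"): 1, ("a", "t"): 2, ("s", "t"): 1}        # candidate vector
        print(is_d_mc(X, limits, "s", "t", 2))                   # -> True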

  12. Observations on the perspectives and limits of the evidence-based approach in the evaluation of gamification processes

    Directory of Open Access Journals (Sweden)

    Bruni Filippo

    2015-12-01

    Full Text Available As ever greater attention is given to the processes of gamification, the evaluation dimension must also be brought into focus, in order to avoid ineffective, trivialised forms. With reference to the evidence-based approach proposed by Mayer, and highlighting its possibilities and limits, an experiment related to teacher training is presented here, in which we attempt to couple some traits of gamification processes with a first evaluation screen. The data obtained seem, on the one hand, to indicate an overall positive perception on the part of the attendees; on the other, they indicate forms of resistance and of saturation with respect to both the excessively competitive mechanisms and the peer evaluation procedures.

  13. Beyond the Futility Argument: The Fair Process Approach and Time-Limited Trials for Managing Dialysis Conflict

    Science.gov (United States)

    2013-01-01

    Summary Futility is an ancient concept arising from Greek mythology that was resurrected for its medical application in the 1980s with the proliferation of many lifesaving technologies, including dialysis and renal transplantation. By that time, the domineering medical paternalism that characterized the pre-1960s physician–patient relationship had morphed into assertive patient autonomy, and some patients began to claim the right to demand aggressive, high-technology interventions, despite physician disapproval. To counter this power struggle, the establishment of a precise definition of futility offered hope for a futility policy that would allow physicians to justify withholding or withdrawing treatment, despite patient and family objections. This article reviews the various attempts made to define medical futility and describes their limited applicability to dialysis. When futility concerns arise, physicians should recognize the opportunity to address conflict, using best-practice communication skills. Physicians would also benefit from understanding the ethical principles of respect for patient autonomy, beneficence, nonmaleficence, justice, and professional integrity that underlie medical decision-making. Also reviewed is the use of a fair process approach or time-limited trial when conflict resolution cannot be achieved. These topics are addressed in the Renal Physicians Association's clinical practice guideline Shared Decision-Making in the Appropriate Initiation of and Withdrawal from Dialysis, with which nephrologists should be well versed. A case presentation of intractable calciphylaxis in a new dialysis patient illustrates the pitfalls of physicians not fully appreciating the ethics of medical decision-making and failing to use the effective conflict management approaches in the clinical practice guideline. PMID:23868900

  14. The limitations of the Transnationalised State thesis in neo-Gramscian IR: the grounds for a Strategic-Relational Approach

    Directory of Open Access Journals (Sweden)

    Marco Antonio de Meneses Silva

    2010-11-01

    Full Text Available This article addresses recent critical literature in International Relations on the transnationalisation of the state. It identifies a trend within neo-Gramscian thinking on the matter that awards excessive credence to the agential nature of hegemony with regard to the transnational state, class formation and alliances. In order to correct the imbalance implicit in the inherent instrumentalism of such accounts, a dialectical approach is suggested. This entails re-examining the Gramscian notion of the historic bloc on the one hand, and the employment of the Strategic-Relational Approach (SRA) of Jessop on the other. As a consequence, both moves challenge the phenomenon of transnationalisation, indicating the need for alternative conceptualisations in the debates on the state, globalisation, and hegemony. The article finds that structural readings of the historic bloc, and the SRA initiatives, reveal the limitations of the current literature in conventional neo-Gramscian thinking, in addition to the need for further developing readings on the state and hegemony.

  15. Making physics more fundamental

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    The stellar death throes of supernovae have been seen and admired since time immemorial. However, last year's was the first to come under the combined scrutiny of space-borne radiation detectors and underground neutrino monitors, as well as terrestrial optical telescopes and even gravity-wave antennae. The remarkable results underline the power of modern physics to explain and interrelate processes in the furthest reaches of the cosmos and the deep interior of nuclear particles. In recent years this common ground between 'Big Bang' cosmology and particle physics has been regularly trodden and retrodden in the light of fresh insights and new experimental results, and thinking has steadily converged. In 1983, the first Symposium on Astronomy, Cosmology and Fundamental Physics, organized by CERN and the European Southern Observatory (ESO), was full of optimism, with new ideas ('inflation') to explain how the relatively small variations in the structure of the Universe could have arisen through the quantum structure of the initial cataclysm

  16. Fundamentals of klystron testing

    International Nuclear Information System (INIS)

    Caldwell, J.W. Jr.

    1978-08-01

    Fundamentals of klystron testing is a text primarily intended for the indoctrination of new klystron group test stand operators. It should significantly reduce the familiarization time of a new operator, making him an asset to the group sooner than has been experienced in the past. The new employee must appreciate the mission of SLAC before he can rightfully be expected to make a meaningful contribution to the group's effort. Thus, the introductory section acquaints the reader with basic concepts of accelerators in general, then briefly describes major physical aspects of the Stanford Linear Accelerator. Only then is his attention directed to the klystron, with its auxiliary systems, and the rudiments of klystron tube performance checks. It is presumed that the reader is acquainted with basic principles of electronics and scientific notation. However, to preserve the integrity of an indoctrination guide, tedious technical discussions and mathematical analysis have been studiously avoided. It is hoped that the new operator will continue to use the text for reference long after his indoctrination period is completed. Even the more experienced operator should find that particular sections will refresh his understanding of basic principles of klystron testing

  17. Revisiting energy efficiency fundamentals

    Energy Technology Data Exchange (ETDEWEB)

    Perez-Lombard, L.; Velazquez, D. [Grupo de Termotecnia, Escuela Superior de Ingenieros, Universidad de Sevilla, Camino de los Descubrimientos s/n, 41092 Seville (Spain); Ortiz, J. [Building Research Establishment (BRE), Garston, Watford, WD25 9XX (United Kingdom)

    2013-05-15

    Energy efficiency is a central target for energy policy and a keystone for mitigating climate change and achieving sustainable development. Although great efforts have been made during the last four decades to investigate the issue (focusing on measuring energy efficiency, understanding its trends and impacts on energy consumption, and designing effective energy efficiency policies), many energy efficiency-related concepts, some methodological problems in the construction of energy efficiency indicators (EEI), and even some of the potential energy efficiency gains are often ignored or misunderstood, causing no little confusion and controversy not only for laymen but even for specialists. This paper aims to revisit, analyse and discuss some fundamental efficiency topics that could improve the understanding and critical judgement of efficiency stakeholders and help avoid unfounded judgements and misleading statements. Firstly, we address the problem of measuring energy efficiency in both qualitative and quantitative terms. Secondly, the main methodological problems standing in the way of the construction of EEI are discussed, and a sequence of actions is proposed to tackle them in an ordered fashion. Finally, two key topics are discussed in detail: the links between energy efficiency and energy savings, and the border between energy efficiency improvement and the promotion of renewable sources.
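
    The link between efficiency improvement and energy savings that the paper dissects is usually drawn through intensity indicators; schematically (a textbook-style decomposition, not a formula from the paper):

        I_t = \frac{E_t}{Q_t}, \qquad \Delta E_{\mathrm{saved}} = (I_0 - I_1)\, Q_1,

    where E is energy use and Q the activity level, so consumption can grow even while intensity falls, whenever activity grows faster than the efficiency gain.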

  18. Fundamentals of nuclear chemistry

    International Nuclear Information System (INIS)

    Majer, V.

    1982-01-01

    The author of the book has 25 years of experience in nuclear chemistry at Prague Technical University. In consequence, the book is intended as a basic textbook for students of this field. Its main objectives are an easily understandable presentation of the complex subject and, in spite of the uncertainty which still characterizes the definition and subjects of nuclear chemistry, a systematic classification and logical structure. Contents: 1. Introduction (history and definition); 2. General nuclear chemistry (physical fundamentals, hot atom chemistry, interaction of nuclear radiation with matter, radioactive elements, isotope effects, isotope exchange, chemistry of radioactive trace elements); 3. Methods of nuclear chemistry (radiochemical methods, activation, separation and enrichment chemistry); 4. Preparative nuclear chemistry (isotope production, labelled compounds); 5. Analytical nuclear chemistry; 6. Applied nuclear chemistry (isotope applications in general physical and analytical chemistry). The book is supplemented by an annex with tables, a name catalogue and a subject index which will facilitate access to important information. (RB) [de]

  19. Fundamentals of Quantum Mechanics

    Science.gov (United States)

    Tang, C. L.

    2005-06-01

    Quantum mechanics has evolved from a subject of study in pure physics to one with a wide range of applications in many diverse fields. The basic concepts of quantum mechanics are explained in this book in a concise and easy-to-read manner emphasising applications in solid state electronics and modern optics. Following a logical sequence, the book is focused on the key ideas and is conceptually and mathematically self-contained. The fundamental principles of quantum mechanics are illustrated by showing their application to systems such as the hydrogen atom, multi-electron ions and atoms, the formation of simple organic molecules and crystalline solids of practical importance. It leads on from these basic concepts to discuss some of the most important applications in modern semiconductor electronics and optics. Containing many homework problems and worked examples, the book is suitable for senior-level undergraduate and graduate level students in electrical engineering, materials science and applied physics. Features: clear exposition of quantum mechanics written in a concise and accessible style; precise physical interpretation of the mathematical foundations of quantum mechanics; illustration of the important concepts and results by reference to real-world examples in electronics and optoelectronics; homework problems and worked examples, with solutions available for instructors.

  20. Fundamental problems in provable security and cryptography.

    Science.gov (United States)

    Dent, Alexander W

    2006-12-15

    This paper examines methods for formally proving the security of cryptographic schemes. We show that, despite many years of active research and dozens of significant results, there are fundamental problems which have yet to be solved. We also present a new approach to one of the more controversial aspects of provable security, the random oracle model.

  1. Fundamentals of C

    CERN Document Server

    Guruprasad, N

    2009-01-01

    The book presents a contemporary approach to programming. Complete C programs are presented as and when they are required. This book is not a cookbook. To get the maximum benefit from it, you should take as active a role as possible: don't just read the examples, enter them into your system and try them out.

  2. Plasmonics fundamentals and applications

    CERN Document Server

    Maier, Stefan Alexander

    2007-01-01

    Considered a major field of photonics, plasmonics offers the potential to confine and guide light below the diffraction limit and promises a new generation of highly miniaturized photonic devices. This book combines a comprehensive introduction with an extensive overview of the current state of the art. Coverage includes plasmon waveguides, cavities for field-enhancement, nonlinear processes and the emerging field of active plasmonics studying interactions of surface plasmons with active media.

  3. Fundamentals of nanomechanical resonators

    CERN Document Server

    Schmid, Silvan; Roukes, Michael Lee

    2016-01-01

    This authoritative book introduces and summarizes the latest models and skills required to design and optimize nanomechanical resonators, taking a top-down approach that uses macroscopic formulas to model the devices. The authors cover the electrical and mechanical aspects of nano electromechanical system (NEMS) devices. The introduced mechanical models are also key to the understanding and optimization of nanomechanical resonators used e.g. in optomechanics. Five comprehensive chapters address: The eigenmodes derived for the most common continuum mechanical structures used as nanomechanical resonators; The main sources of energy loss in nanomechanical resonators; The responsiveness of micro and nanomechanical resonators to mass, forces, and temperature; The most common underlying physical transduction mechanisms; The measurement basics, including amplitude and frequency noise. The applied approach found in this book is appropriate for engineering students and researchers working with micro and nanomechanical...

  4. O enfoque qualitativo na avaliação do consumo alimentar: fundamentos, aplicações e considerações operacionais The qualitative approach in the evaluation of food consumption: fundamentals, applications and operational considerations

    Directory of Open Access Journals (Sweden)

    Maria Lúcia Magalhães Bosi

    2011-12-01

    Largely unfounded disputes between advocates of the qualitative and quantitative approaches have hindered recognition of the benefits of applying both methods in combination within the same study, that is, of a multidimensional and integrated approach. Nevertheless, in recent years the field of Nutrition in Public Health has experienced an increase in studies guided not only by measurement but by the combination of qualitative and quantitative methods. Indeed, the qualitative approach has much to contribute to research on food consumption, among many other objects and themes, in which the importance of deepening the understanding of subjective production, expressed in beliefs, attitudes and behaviours, stands out. This article summarizes the nature, fundamentals and usefulness of the qualitative approach in research on food and nutrition, clarifying how these methods have been or can be used to study the complex problems that arise in this field, and confines the discussion to studies of food consumption. The integration of both methods, qualitative and quantitative, through methodological complementarity, can minimize the limitations of employing each approach in isolation.

  5. Degeneration of penicillin production in ethanol-limited chemostat cultivations of Penicillium chrysogenum: A systems biology approach

    Directory of Open Access Journals (Sweden)

    Daran Jean-Marc

    2011-08-01

    Background: In the microbial production of non-catabolic products such as antibiotics, a loss of production capacity upon long-term cultivation (for example in chemostat), a phenomenon called strain degeneration, is often observed. In this study a systems biology approach, monitoring changes from gene to produced flux, was used to study degeneration of penicillin production in a high-producing Penicillium chrysogenum strain during prolonged ethanol-limited chemostat cultivations. Results: During these cultivations, the biomass-specific penicillin production rate decreased more than 10-fold in less than 22 generations. No evidence was obtained for a decrease of the copy number of the penicillin gene cluster, nor for a significant down-regulation of the expression of the penicillin biosynthesis genes. However, a strong down-regulation of the biosynthesis pathway of cysteine, one of the precursors of penicillin, was observed. Furthermore, the protein levels of the penicillin pathway enzymes δ-(L-α-aminoadipyl)-L-cysteinyl-D-valine synthetase (ACVS) and isopenicillin-N synthase (IPNS) decreased significantly. Re-cultivation of fully degenerated cells in unlimited batch culture and subsequent C-limited chemostats resulted in only a slight recovery of penicillin production. Conclusions: Our findings indicate that the observed degeneration is attributable to a significant decrease of the levels of the first two enzymes of the penicillin biosynthesis pathway, ACVS and IPNS. This decrease is not caused by genetic instability of the penicillin amplicon, nor by down-regulation of the penicillin biosynthesis pathway. Furthermore, no indications were obtained for degradation of these enzymes as a result of autophagy. Possible causes for the decreased enzyme levels could be a decrease of the translation efficiency of ACVS and IPNS during degeneration, or the presence of a culture variant impaired in the biosynthesis of functional forms of these enzymes.

  6. Dual-Frequency Alternating Current Designer Waveform for Reliable Voltammetric Determination of Electrode Kinetics Approaching the Reversible Limit.

    Science.gov (United States)

    Li, Jiezhen; Bentley, Cameron L; Bond, Alan M; Zhang, Jie

    2016-02-16

    Alternating current (ac) voltammetry provides access to faster electrode kinetics than direct current (dc) methods. However, difficulties in ac and other methods arise when the heterogeneous electron-transfer rate constant (k⁰) approaches the reversible limit, because the voltammetric characteristics become insensitive to electrode kinetics. Thus, in this near-reversible regime, even small uncertainties associated with bulk concentration (C), diffusion coefficient (D), electrode area (A), and uncompensated resistance (Ru) can lead to significant systematic error in the determination of k⁰. In this study, we have introduced a kinetically sensitive dual-frequency designer waveform into the Fourier-transformed large-amplitude alternating current (FTAC) voltammetric method, made up of two sine waves having the same amplitude but different frequencies (e.g., 37 and 615 Hz) superimposed onto a dc ramp, to quantify the close-to-reversible Fc⁰/⁺ process (Fc = ferrocene) in two nonhaloaluminate ionic liquids. The concept is that, from a single experiment, the lower-frequency data set, collected on a time scale where the target process is reversible, can be used as an internal reference to calibrate A, D, C, and Ru. These calibrated values are then used to calculate k⁰ from analysis of the harmonics of the higher-frequency data set, where the target process is quasi-reversible. With this approach, k⁰ values of 0.28 and 0.11 cm·s⁻¹ have been obtained at a 50 μm diameter platinum microdisk electrode for the close-to-diffusion-controlled Fc⁰/⁺ process in two ionic liquids, 1-ethyl-3-methylimidazolium bis(trifluoromethanesulfonyl)imide and 1-butyl-3-methylimidazolium bis(trifluoromethanesulfonyl)imide, respectively.
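
    The waveform itself is simple to reproduce. The sketch below is a minimal illustration: only the 37 and 615 Hz frequencies are taken from the abstract, while the amplitude, start potential, scan rate and duration are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a dual-frequency ac voltammetric waveform: two sines of
# equal amplitude but different frequencies superimposed on a dc ramp.
# Amplitude, start potential, scan rate and duration are assumptions,
# not values taken from the study.
def dual_frequency_waveform(t, e_start=-0.2, scan_rate=0.1,
                            amplitude=0.08, f_low=37.0, f_high=615.0):
    """Applied potential (V) as a function of time t (s)."""
    dc_ramp = e_start + scan_rate * t
    ac = amplitude * (np.sin(2 * np.pi * f_low * t)
                      + np.sin(2 * np.pi * f_high * t))
    return dc_ramp + ac

t = np.linspace(0.0, 8.0, 800_000)   # dense sampling to resolve the 615 Hz component
e_applied = dual_frequency_waveform(t)
```

    In the experiment, the current response would then be Fourier transformed and the harmonics of each frequency analysed separately, with the low-frequency set serving as the internal reference.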

  7. Hankin and Reeves' Approach to Estimating Fish Abundance in Small Streams : Limitations and Potential Options.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, William L. [Bonneville Power Administration, Portland, OR (US). Environment, Fish and Wildlife

    2000-11-01

    Hankin and Reeves' (1988) approach to estimating fish abundance in small streams has been applied in stream-fish studies across North America. However, as with any method of population estimation, there are important assumptions that must be met for estimates to be minimally biased and reasonably precise. Consequently, I investigated the effects of various levels of departure from these assumptions via simulation, based on results from an example application in Hankin and Reeves (1988) and a spatially clustered population. Coverage of 95% confidence intervals averaged about 5% less than nominal when removal estimates equaled true numbers within sampling units, but averaged 62%-86% less than nominal when they did not, with the exception that coverage was 90% where detection probabilities of individuals were >0.85 and constant across sampling units. True total abundances averaged far (20%-41%) below the lower confidence limit when not included within intervals, which implies large negative bias. Further, the average coefficient of variation was about 1.5 times higher when removal estimates did not equal true numbers within sampling units (mean CV = 0.27 [SE = 0.0004]) than when they did (mean CV = 0.19 [SE = 0.0002]). A potential modification to Hankin and Reeves' approach is to include environmental covariates that affect detection rates of fish in the removal model or another mark-recapture model. A potential alternative is to use snorkeling in combination with line transect sampling to estimate fish densities. Regardless of the method of population estimation, a pilot study should be conducted to validate the enumeration method, which requires a known (or nearly so) population of fish to serve as a benchmark for evaluating the bias and precision of population estimates.
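
    The fragility described above, removal estimates degrading as detection probability falls, is easy to reproduce. Below is a minimal simulation sketch (our construction, not the paper's simulation design) using the standard two-pass removal estimator N̂ = n₁²/(n₁ − n₂):

```python
import numpy as np

rng = np.random.default_rng(1)

def two_pass_removal(n1, n2):
    """Standard two-pass removal estimator: N_hat = n1^2 / (n1 - n2)."""
    n1 = n1.astype(float)
    denom = n1 - n2
    est = np.full(n1.shape, np.nan)
    ok = denom > 0                      # estimator undefined when n2 >= n1
    est[ok] = n1[ok] ** 2 / denom[ok]
    return est

def mean_estimate(true_n=100, p=0.5, n_units=200, reps=2000):
    # First pass removes n1 fish per unit; second pass samples the remainder.
    n1 = rng.binomial(true_n, p, size=(reps, n_units))
    n2 = rng.binomial(true_n - n1, p)
    return np.nanmean(two_pass_removal(n1, n2))

for p in (0.85, 0.5, 0.3):
    print(f"detection p = {p}: mean N_hat = {mean_estimate(p=p):6.1f} (true N = 100)")
```

    Even in this idealized setting, the estimator's distribution becomes heavy-tailed and unreliable at low detection probabilities, the mechanism behind the degraded confidence-interval coverage reported above.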

  8. Physiological Capacity and Training Routines of Elite Cross-Country Skiers: Approaching the Upper Limits of Human Endurance.

    Science.gov (United States)

    Sandbakk, Øyvind; Holmberg, Hans-Christer

    2017-09-01

    Cross-country (XC) skiing is one of the most demanding of endurance sports, involving protracted competitions on varying terrain employing a variety of skiing techniques that require upper- and/or lower-body work to different extents. Through more effective training and extensive improvements in equipment and track preparation, the speed of cross-country ski races has increased more than that of any other winter Olympic sport, and, in addition, new types of racing events have been introduced. To a certain extent this has altered the optimal physiological capacity required to win, and the training routines of successful skiers have evolved accordingly. The long-standing tradition of researchers working closely with XC-ski coaches and athletes to monitor progress, improve training, and refine skiing techniques has provided unique physiological insights revealing how these athletes are approaching the upper limits of human endurance. This review summarizes current scientific knowledge concerning the demands involved in elite XC skiing, as well as the physiological capacity and training routines of the best athletes.

  9. Physics fundamentals for ITER

    International Nuclear Information System (INIS)

    Rosenbluth, M.N.

    1999-01-01

    The design of an experimental thermonuclear reactor requires both cutting-edge technology and physics predictions precise enough to carry forward the design. The past few years of worldwide physics studies have seen great progress in understanding, innovation and integration. We will discuss this progress and the remaining issues in several key physics areas. (1) Transport and plasma confinement. A worldwide database has led to an 'empirical scaling law' for tokamaks which predicts adequate confinement for the ITER fusion mission, albeit with considerable but acceptable uncertainty. The ongoing revolution in computer capabilities has given rise to new gyrofluid and gyrokinetic simulations of microphysics which may be expected in the near future to attain predictive accuracy. Important databases on H-mode characteristics and helium retention have also been assembled. (2) Divertors, heat removal and fuelling. A novel concept for heat removal - the radiative, baffled, partially detached divertor - has been designed for ITER. Extensive two-dimensional (2D) calculations have been performed and agree qualitatively with recent experiments. Preliminary studies of the interaction of this configuration with core confinement are encouraging and the success of inside pellet launch provides an attractive alternative fuelling method. (3) Macrostability. The ITER mission can be accomplished well within ideal magnetohydrodynamic (MHD) stability limits, except for internal kink modes. Comparisons with JET, as well as a theoretical model including kinetic effects, predict such sawteeth will be benign in ITER. Alternative scenarios involving delayed current penetration or off-axis current drive may be employed if required. The recent discovery of neoclassical beta limits well below ideal MHD limits poses a threat to performance. Extrapolation to reactor scale is as yet unclear. In theory such modes are controllable by current drive profile control or feedback and experiments should

  10. Fundamentals of quantum mechanics

    CERN Document Server

    Erkoc, Sakir

    2006-01-01

    HISTORICAL EXPERIMENTS AND THEORIES: Dates of Important Discoveries and Events; Blackbody Radiation; Photoelectric Effect; Quantum Theory of Spectra; The Compton Effect; Matter Waves, the de Broglie Hypothesis; The Davisson-Germer Experiment; Heisenberg's Uncertainty Principle; Difference Between Particles and Waves; Interpretation of the Wavefunction. AXIOMATIC STRUCTURE OF QUANTUM MECHANICS: The Necessity of Quantum Theory; Function Spaces; Postulates of Quantum Mechanics; The Kronecker Delta and the Dirac Delta Function; Dirac Notation. OBSERVABLES AND SUPERPOSITION: Free Particle; Particle in a Box; Ensemble Average; Hilbert-Space Interpretation; The Initial Square Wave; Particle Beam; Superposition and Uncertainty; Degeneracy of States; Commutators and Uncertainty. TIME DEVELOPMENT AND CONSERVATION THEOREMS: Time Development of State Functions, the Discrete Case; the Continuous Case, Wave Packets; Particle Beam; Gaussian Wave Packet; Free Particle Propagator; The Limiting Cases of the Gaussian Wave Packets; Time Development of Expectation Val...

  11. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2010-01-01

    New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update and Fundamentals is the single best source for the latest developments, trends, and issues in communication technology. Featuring the fundamental framework along with the history and background of communication technologies, Communication Technology Update and Fundamentals, 12th edition helps you stay ahead of these ever-changing and emerging technologies.As always, every chapter ha

  12. Fundamentals of structural engineering

    CERN Document Server

    Connor, Jerome J

    2016-01-01

    This book presents new methods and tools for the integration and simulation of smart devices. The design approach described in this book explicitly accounts for the integration of Smart Systems components and subsystems as a specific constraint. It includes methodologies and EDA tools to enable multi-disciplinary and multi-scale modeling and design, simulation of multi-domain systems, subsystems and components at all levels of abstraction, and system integration and exploration for optimization of functional and non-functional metrics. By covering theoretical and practical aspects of smart device design, this book targets people who are working on and studying hardware/software modelling, component integration and simulation in different roles (system integrators, designers, developers, researchers, teachers, students etc.). In particular, it is a good introduction for people interested in managing heterogeneous components in an efficient and effective way across different domains and different abstraction l...

  13. Fundamental Pyrolysis Studies

    Energy Technology Data Exchange (ETDEWEB)

    Milne, T. A.; Evans, R. J.; Soltys, M. N.

    1983-03-01

    Progress on the direct mass spectrometric sampling of pyrolysis products from wood and its constituents is described for the period from June 1982 to February 1983. A brief summary of, and references to detailed reports on, the qualitative demonstration of our approach to studying the separate processes of primary and secondary pyrolysis is presented. Improvements and additions to the pyrolysis and data acquisition systems are discussed and typical results shown. Chief among these are a heated-grid pyrolysis system for controlled primary pyrolysis and a sheathed flame arrangement for secondary cracking studies. Qualitative results of the secondary cracking of cellulose, lignin, and wood are shown, as are comparisons with the literature for the pyrolysis spectra of cellulose, lignin, and levoglucosan. 'Fingerprints' for a number of materials are shown, with spectra taken under carefully controlled conditions so that sensitivity calibrations for different compounds, now being determined, can be applied.

  14. Fundamental Scaling Laws in Nanophotonics

    Science.gov (United States)

    Liu, Ke; Sun, Shuai; Majumdar, Arka; Sorger, Volker J.

    2016-11-01

    The success of information technology has clearly demonstrated that miniaturization often leads to unprecedented performance and unanticipated applications. This hypothesis of “smaller-is-better” has motivated optical engineers to build various nanophotonic devices, although an understanding of the fundamental scaling behavior for this new class of devices is missing. Here we analyze scaling laws for optoelectronic devices operating at micro- and nanometer length scales. We show that optoelectronic device performance scales non-monotonically with device length due to various device tradeoffs, and analyze how both optical and electrical constraints influence device power consumption and operating speed. Specifically, we investigate the direct influence of scaling on the performance of four classes of photonic devices, namely laser sources, electro-optic modulators, photodetectors, and all-optical switches, based on three types of optical resonators: microring, Fabry-Perot cavity, and plasmonic metal nanoparticle. Results show that while microrings and Fabry-Perot cavities can outperform plasmonic cavities at larger length scales, they stop working when the device length drops below 100 nanometers, due to insufficient functionality such as feedback (laser), index modulation (modulator), absorption (detector) or field density (optical switch). Our results provide a detailed understanding of the limits of nanophotonics, towards establishing an opto-electronics roadmap, akin to the International Technology Roadmap for Semiconductors.

  15. Connecting Fundamental Constants

    International Nuclear Information System (INIS)

    Di Mario, D.

    2008-01-01

    A model for a black hole electron is built from three basic constants only: h, c and G. The result is a description of the electron with its mass and charge. The nature of this black hole seems to fit the properties of the Planck particle, and new relationships among basic constants become possible. The time dilation factor in a black hole associated with a variable gravitational field would appear to us as a charge; on the other hand, the Planck time acts as a time gap, drastically limiting what we are able to measure, and its dimension will appear in some quantities. This is why the Planck time is numerically very close to the gravitational/electric force ratio in an electron: the difference, disregarding a π√2 factor, is only 0.2%. This is not a coincidence: it is always the same particle, and the small difference is that between a rotating and a non-rotating particle. The determination of its rotational speed yields accurate numbers for many quantities, including the fine structure constant and the electron magnetic moment.

  16. Nanostructured metals. Fundamentals to applications

    International Nuclear Information System (INIS)

    Grivel, J.-C.; Hansen, N.; Huang, X.; Juul Jensen, D.; Mishin, O.V.; Nielsen, S.F.; Pantleon, W.; Toftegaard, H.; Winther, G.; Yu, T.

    2009-01-01

    In today's world, materials science and engineering must, like other technical fields, focus on sustainability. Raw materials and energy have to be conserved, and metals with improved or new structural and functional properties must be invented, developed and brought to application. In this endeavour a very promising route is to reduce the structural scale of metallic materials, thereby bridging the industrial metals of today with the emerging nanometals of tomorrow, i.e. structural scales ranging from a few micrometres down to the nanometre regime. While focusing on metals with structures in this scale regime, the symposium spans from fundamental aspects towards applications, uniting materials scientists and technologists. A holistic approach characterizes the themes of the symposium, encompassing synthesis, characterization, modelling and performance, in each of which significant progress has been made in recent years. Synthesis now covers top-down processes, e.g. plastic deformation, and bottom-up processes, e.g. chemical and physical synthesis. In the area of structural and mechanical characterization, advanced techniques are now widely applied, and in-situ techniques for structural characterization under mechanical or thermal loading are under rapid development in both 2D and 3D. Progress in characterization techniques has led to a precise description of different boundaries (grain, dislocation, twin, phase), and of how they form and evolve, also including theoretical modelling and simulations of structures, properties and performance. (au)

  17. Fundamentals of surfactant sputtering

    International Nuclear Information System (INIS)

    Hofsaess, Hans; Zhang Kun

    2009-01-01

    We introduce a new sputter technique, utilizing the steady-state coverage of a substrate surface with up to 10¹⁶ cm⁻² of foreign atoms simultaneously during sputter erosion by combined ion irradiation and atom deposition. These atoms strongly modify the substrate sputter yield on atomic to macroscopic length scales and therefore act as surfactant atoms (a blend of 'surface active agent'). Depending on the surfactant-substrate combination, the novel technique allows enhanced surface smoothing, generation of novel surface patterns, shaping of surfaces and formation of ultra-thin films. Sputter yield attenuation is demonstrated for sputtering of Si and Fe substrates and different surfactant species using 5 keV Xe ions at different incidence angles and fluences up to 10¹⁷ cm⁻². Analytical approaches and Monte Carlo simulations are used to predict the sputtering yield attenuation as a function of surfactant coverage. For sputtering of Si with Au surfactants we observe high sputter yields despite a steady-state surfactant coverage, which can be explained by strong ion-induced interdiffusion of substrate and surfactant atoms and the formation of a buried AuₓSi surfactant layer in dynamic equilibrium.

  18. Fundamentals of technology roadmapping

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, M.L.; Bray, O.H.

    1997-04-01

    Technology planning is important for many reasons. Globally, companies are facing many competitive problems. Technology roadmapping, a form of technology planning, can help deal with this increasingly competitive environment. While it has been used by some companies and industries, the focus has always been on the technology roadmap as a product, not on the process. This report focuses on formalizing the process so that it can be more broadly and easily used. As a DOE national security laboratory with R&D as a major product, Sandia must do effective technology planning to identify and develop the technologies required to meet its national security mission. Once identified, technology enhancements or new technologies may be developed internally or collaboratively with external partners. For either approach, technology roadmapping, as described in this report, is an effective tool for technology planning and coordination, which fits within a broader set of planning activities. This report, the second in a series on technology roadmapping, develops and documents a technology roadmapping process that can be used by Sandia, other national labs, universities, and industry. The main benefit of technology roadmapping is that it provides the information to make better technology investment decisions, by identifying critical technologies and technology gaps and by identifying ways to leverage R&D investments. It can also be used as a marketing tool. Technology roadmapping is critical when the technology investment decision is not straightforward: when it is not clear which alternative to pursue, how quickly the technology is needed, or when there is a need to coordinate the development of multiple technologies. The technology roadmapping process consists of three phases: preliminary activity, development of the technology roadmap, and follow-up activity.

  19. A Model Based Deconvolution Approach for Creating Surface Composition Maps of Irregularly Shaped Bodies from Limited Orbiting Nuclear Spectrometer Measurements

    Science.gov (United States)

    Dallmann, N. A.; Carlsten, B. E.; Stonehill, L. C.

    2017-12-01

    Orbiting nuclear spectrometers have contributed significantly to our understanding of the composition of solar system bodies. Gamma rays and neutrons are produced within the surfaces of bodies by impacting galactic cosmic rays (GCR) and by intrinsic radionuclide decay. Measuring the flux and energy spectrum of these products at one point in an orbit elucidates the elemental content of the area in view. Deconvolution of measurements from many spatially registered orbit points can produce detailed maps of elemental abundances. In applying these well-established techniques to small and irregularly shaped bodies like Phobos, one encounters unique challenges beyond those of a large spheroid. Polar mapping orbits are not possible for Phobos, and quasistatic orbits will realize only modest inclinations, unavoidably limiting surface coverage and creating North-South ambiguities in deconvolution. The irregular shape causes self-shadowing of the body, both with respect to the spectrometer and with respect to the incoming GCR. The view angle to the surface normal, as well as the distance between the surface and the spectrometer, is highly variable. These characteristics can be synthesized into a complicated and continuously changing measurement-system point spread function. We have begun to explore different model-based, statistically rigorous, iterative deconvolution methods to produce elemental abundance maps for a proposed future investigation of Phobos. By incorporating the satellite orbit, the existing high-accuracy shape models of Phobos, and the spectrometer response function, a detailed and accurate system model can be constructed. Many aspects of this model formation are particularly well suited to modern graphics processing techniques and parallel processing. We will present the current status and preliminary visualizations of the Phobos measurement system model. We will also discuss different deconvolution strategies and their relative merit in statistical rigor, stability
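
    One family of statistically rigorous iterative deconvolution methods of the kind alluded to above is maximum-likelihood expectation-maximization for Poisson count data (the Richardson-Lucy update). The sketch below is a toy stand-in, not the authors' system model: the matrix A, which would encode orbit geometry, the shape model and the instrument response, is simply random here.

```python
import numpy as np

def mlem_deconvolve(A, y, n_iter=50):
    """Estimate surface abundances x from orbit-point counts y ~ Poisson(A @ x).

    Richardson-Lucy / ML-EM update: x <- x * (A^T (y / A x)) / (A^T 1).
    """
    x = np.full(A.shape[1], y.mean() / A.sum(axis=0).mean())  # flat start
    sens = A.sum(axis=0)                  # per-pixel sensitivity, A^T 1
    for _ in range(n_iter):
        forward = A @ x                   # predicted counts at each orbit point
        ratio = np.where(forward > 0, y / forward, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# Toy usage: 300 orbit measurements of a 100-pixel surface map.
rng = np.random.default_rng(0)
A = rng.random((300, 100)) ** 3           # peaked, spatially varying footprints
x_true = rng.gamma(2.0, 1.0, size=100)
y = rng.poisson(A @ x_true)
x_hat = mlem_deconvolve(A, y)
```

    The multiplicative update preserves non-negativity of the abundance map, one reason this family of methods suits counting instruments.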

  20. Rural eHealth nutrition education for limited-income families: an iterative and user-centered design approach.

    Science.gov (United States)

    Atkinson, Nancy L; Saperstein, Sandra L; Desmond, Sharon M; Gold, Robert S; Billing, Amy S; Tian, Jing

    2009-06-22

    Adult women living in rural areas have high rates of obesity. Although rural populations have been deemed hard to reach, Internet-based programming is becoming a viable strategy as rural Internet access increases. However, when people are able to get online, they may not find information designed for them and their needs, especially harder to reach populations. This results in a "content gap" for many users. User-centered design is a methodology that can be used to create appropriate online materials. This research was conducted to apply a user-centered approach to the design and development of a health promotion website for low-income mothers living in rural Maryland. Three iterative rounds of concept testing were conducted to (1) identify the name and content needs of the site and assess concerns about registering on a health-related website; (2) determine the tone and look of the website and confirm content and functionality; and (3) determine usability and acceptability. The first two rounds involved focus group and small group discussions, and the third round involved usability testing with individual women as they used the prototype system. The formative research revealed that women with limited incomes were enthusiastic about a website providing nutrition and physical activity information targeted to their incomes and tailored to their personal goals and needs. Other priority content areas identified were budgeting, local resources and information, and content that could be used with their children. Women were able to use the prototype system effectively. This research demonstrated that user-centered design strategies can help close the "content gap" for at-risk audiences.

  1. A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach.

    Science.gov (United States)

    Tipton, Elizabeth; Shuster, Jonathan

    2017-10-15

    Bland-Altman method comparison studies are common in the medical sciences and are used to compare a new measure to a gold-standard (often costlier or more invasive) measure. The distribution of the differences is summarized by two statistics, the 'bias' and the standard deviation, and these measures are combined to provide estimates of the limits of agreement (LoA). When these LoA are within the bounds of clinically insignificant differences, the new non-invasive measure is preferred. Very often, multiple Bland-Altman studies have been conducted comparing the same two measures, and random-effects meta-analysis provides a means to pool these estimates. We provide a framework for the meta-analysis of Bland-Altman studies, including methods for estimating the LoA and measures of uncertainty (i.e., confidence intervals). Importantly, these LoA are likely to be wider than those typically reported in Bland-Altman meta-analyses. Frequently, Bland-Altman studies report results based on repeated-measures designs but do not properly adjust for this design in the analysis; meta-analyses of Bland-Altman studies frequently exclude such studies for this reason. We provide a meta-analytic approach that allows inclusion of estimates from these studies, including adjustments to the estimate of the standard deviation and a method for pooling the estimates based upon robust variance estimation. An example is included based on a previously published meta-analysis. Copyright © 2017 John Wiley & Sons, Ltd.
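
    To make the ingredients concrete, here is a deliberately simplified sketch of pooling Bland-Altman summaries across studies: a DerSimonian-Laird random-effects pool of the per-study bias, combined with a degrees-of-freedom-weighted pool of the within-study standard deviations. The actual framework goes further (it propagates the uncertainty of both components, which is why its LoA come out wider, and it adjusts for repeated-measures designs); none of that refinement is reproduced here.

```python
import numpy as np

def pooled_loa(bias, sd, n):
    """Naive pooled 95% limits of agreement from per-study summaries.

    bias, sd, n: per-study mean difference, SD of differences, sample size.
    This is a simplified sketch, not the estimator proposed in the paper.
    """
    bias, sd, n = (np.asarray(a, dtype=float) for a in (bias, sd, n))
    var = sd**2 / n                       # within-study variance of each bias
    w = 1.0 / var
    mu_fe = np.sum(w * bias) / np.sum(w)  # fixed-effect mean
    q = np.sum(w * (bias - mu_fe) ** 2)   # heterogeneity statistic
    k = len(bias)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (var + tau2)             # random-effects weights
    mu = np.sum(w_re * bias) / np.sum(w_re)
    sd_pool = np.sqrt(np.sum((n - 1) * sd**2) / np.sum(n - 1))
    return mu - 1.96 * sd_pool, mu + 1.96 * sd_pool

lo, hi = pooled_loa(bias=[0.4, -0.1, 0.7], sd=[1.1, 0.9, 1.3], n=[40, 55, 32])
print(f"pooled 95% limits of agreement: ({lo:.2f}, {hi:.2f})")
```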

  2. Fundamental Limitations to Gain Enhancement in Periodic Media and Waveguides

    DEFF Research Database (Denmark)

    Grgic, Jure; Ott, Johan Raunkjær; Wang, Fengwen

    2012-01-01

    A common strategy to compensate for losses in optical nanostructures is to add gain material to the system. By exploiting slow-light effects it is expected that the gain may be enhanced beyond its bulk value. Here we show that this route cannot be followed uncritically: inclusion of gain inevitably modifies the underlying dispersion law, and thereby may degrade the slow-light properties underlying the device operation and the anticipated gain enhancement itself. This degradation is generic; we demonstrate it for three different systems of current interest (coupled-resonator optical waveguides, Bragg...

  3. Unified presentation of four fundamental inequalities

    Science.gov (United States)

    Lajzerowicz, Joseph; Lehoucq, Roland; Graner, François

    2018-03-01

    We suggest a unified presentation for teaching fundamental constants to graduate students, by introducing four lower limits to observed phenomena. The reduced Planck constant, ℏ, is the lowest classically definable action. The inverse of the invariant speed, s = 1/c, is the lowest observable slowness. The Planck time, t_P, is the lowest observable time scale. The Boltzmann constant, k, determines the lowest coherent degree of freedom; we recall an Einstein criterion on the fluctuations of small thermal systems and show that it has far-reaching implications, such as demonstrating the relations between critical exponents. Each of these four fundamental limits enters an inequality which marks a horizon of the Universe we can perceive. This compact presentation can resolve some difficulties encountered when trying to define the epistemological status of these constants, and emphasizes their useful role in shaping our intuitive vision of the Universe.
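
    A plausible way to write out the four inequalities sketched in this abstract (our rendering and notation; the article's exact formulation may differ) is:

```latex
% Four lower-limit inequalities, one per fundamental constant (our rendering):
\[
\begin{aligned}
  \text{action:}      \quad & A \gtrsim \hbar
     && \text{(no classically definable action below } \hbar\text{)} \\
  \text{slowness:}    \quad & \frac{\Delta t}{\Delta x} \geq s = \frac{1}{c}
     && \text{(nothing propagates faster than } c\text{)} \\
  \text{time:}        \quad & \Delta t \gtrsim t_{\mathrm{P}}
     && \text{(no observable time scale below the Planck time)} \\
  \text{fluctuation:} \quad & C \gtrsim k
     && \text{(Einstein criterion: a coherent degree of freedom carries at least } k\text{)}
\end{aligned}
\]
```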

  4. [Public health genetics. Fundamental rights aspects].

    Science.gov (United States)

    Pestalozza, C

    2009-07-01

    The fundamental right of "informational self-determination" (protection of personal data) protects the individual against collection, storage, use and disclosure of her/his personal data - including genetic data - without her/his informed consent. However, in cases of overriding public interest, limitations of this right are deemed legitimate. Public health, expressly guaranteed in some German state constitutions, may constitute such overriding public interest and justify corresponding state measures as long as they respect the principle of proportionality.

  5. Fundamental plasma emission involving ion sound waves

    Science.gov (United States)

    Cairns, Iver H.

    1987-01-01

    The theory for fundamental plasma emission by the three-wave processes L ± S → T (where L, S and T denote Langmuir, ion sound and transverse waves, respectively) is developed. Kinematic constraints on the characteristics and growth lengths of waves participating in the wave processes are identified. In addition, the rates, path-integrated wave temperatures, and limits on the brightness temperature of the radiation are derived.
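
    For reference, the kinematic constraints the abstract mentions follow from the standard three-wave resonance conditions (our notation, with ω_p the local electron plasma frequency):

```latex
% Three-wave matching conditions for the processes L ± S → T (our notation):
\[
\begin{aligned}
  \omega_T &= \omega_L \pm \omega_S, \\
  \mathbf{k}_T &= \mathbf{k}_L \pm \mathbf{k}_S.
\end{aligned}
\]
% Since \omega_L \approx \omega_p and \omega_S \ll \omega_L, the transverse
% wave is emitted near the fundamental, \omega_T \approx \omega_p.
```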

  6. The emergence of the dimensions and fundamental forces in the universe, an information-theoretical approach to explaining the magnitude ratios of the fundamental interactions. 2. rev. and enl. ed.; Die Entstehung der Dimensionen und Grundkraefte im Universum, ein informationstheoretischer Ansatz zur Erklaerung der Groessenverhaeltnisse der Fundamentalen Wechselwirkungen

    Energy Technology Data Exchange (ETDEWEB)

    Ganter, Bernd

    2013-06-01

    After a description of the four fundamental interactions and the connection of information with energy, the principle of fast maximization together with the Ganter tableau is described. As an example, the derivation of the value of the fine-structure constant from the Ganter tableau is then presented. Thereafter, the extension of the Ganter tableau, further properties of the Ganter tableau, and its persuasiveness are considered. (HSI)

  7. Quench limits

    International Nuclear Information System (INIS)

    Sapinski, M.

    2012-01-01

    With thirteen beam-induced quenches and numerous Machine Development tests, the current knowledge of LHC magnet quench limits still contains many unknowns. Various approaches to determining the quench limits are reviewed, the results of the tests are presented, and an attempt is made to reconstruct a coherent picture from these results. The available methods for computing quench levels are presented, together with the dedicated particle-shower simulations that are necessary to understand the tests. Future experiments, needed to reach a better understanding of quench limits as well as limits for machine operation, are investigated. Possible strategies for setting BLM (Beam Loss Monitor) thresholds are discussed. (author)

  8. Cr3+-doped fluorides and oxides: role of internal fields and limitations of the Tanabe-Sugano approach.

    Science.gov (United States)

    Trueba, A; García-Lastra, J M; Garcia-Fernandez, P; Aramburu, J A; Barriuso, M T; Moreno, M

    2011-11-24

    This work is aimed at clarifying the changes in the optical spectra of Cr³⁺ impurities due to either a host-lattice variation or a hydrostatic pressure, which can hardly be understood by means of the usual Tanabe-Sugano (TS) approach assuming that the Racah parameter, B, grows when covalency decreases. To achieve this goal, the optical properties of Cr³⁺-doped LiBaF₃ and KMgF₃ model systems have been explored by means of high-level ab initio calculations on CrF₆³⁻ units subject to the electric field, E_R(r), created by the rest of the lattice ions. These calculations, which reproduce the available experimental data, indicate that the energy, E(²E), of the ²E(t₂g³) → ⁴A₂(t₂g³) emission transition is nearly independent of the host lattice. By contrast, the energy difference corresponding to the ⁴A₂(t₂g³) → ⁴T₁(t₂g²eg¹) and ⁴A₂(t₂g³) → ⁴T₂(t₂g²eg¹) excitations, Δ(⁴T₁; ⁴T₂), is shown to increase on passing from the normal to the inverted perovskite host lattice despite the increase in covalency, a fact which cannot be accounted for through the usual TS model. Similarly, when the Cr³⁺-F⁻ distance, R, is reduced, both Δ(⁴T₁; ⁴T₂) and the covalency are found to increase. By analyzing the limitations of the usual model, we found surprising results that are shown to arise from the deformation of both the 3d(Cr) and ligand orbitals in the antibonding eg orbital, which has σ character and is more extended than the π t₂g orbital. By contrast, because of the higher stiffness of the t₂g orbital, the dependence of E(²E) on R basically follows the corresponding variation of covalency in that level. Bearing in mind the similarities of the optical properties displayed by Cr³⁺ impurities in oxides and fluorides, the present results can be useful for understanding experimental data on Cr³⁺-based gemstones where the local symmetry is lower than cubic.

  9. Integrating the Fundamentals of Care framework in baccalaureate nursing education

    DEFF Research Database (Denmark)

    Voldbjerg, Siri; Laugesen, Britt; Bahnsen, Iben Bøgh

    2018-01-01

    such as supporting a holistic approach to evidence-based integrative patient care, and challenges such as skepticism among the faculty, are discussed. CONCLUSION: It is suggested how integration of the Fundamentals of Care framework in lectures, case-based work and the simulation lab can make fundamental nursing care more...

  10. Fundamental principles of heat transfer

    CERN Document Server

    Whitaker, Stephen

    1977-01-01

    Fundamental Principles of Heat Transfer introduces the fundamental concepts of heat transfer: conduction, convection, and radiation. It presents theoretical developments and example and design problems and illustrates the practical applications of fundamental principles. The chapters in this book cover various topics such as one-dimensional and transient heat conduction, energy and turbulent transport, forced convection, thermal radiation, and radiant energy exchange. There are example problems and solutions at the end of every chapter dealing with design problems. This book is a valuable int

  11. Fundamental number theory with applications

    CERN Document Server

    Mollin, Richard A

    2008-01-01

    An update of the most accessible introductory number theory text available, Fundamental Number Theory with Applications, Second Edition presents a mathematically rigorous yet easy-to-follow treatment of the fundamentals and applications of the subject. The substantial amount of reorganizing makes this edition clearer and more elementary in its coverage. New to the Second Edition: removal of all advanced material to be even more accessible in scope; new fundamental material, including partition theory, generating functions, and combinatorial number theory; expa...

  12. Math Majors' Visual Proofs in a Dynamic Environment: The Case of the Limit of a Function and the ε-δ Approach

    Science.gov (United States)

    Caglayan, Günhan

    2015-01-01

    Despite a few limitations, GeoGebra as a dynamic geometry software package stood out as a powerful instrument in helping university math majors understand, explore, and gain experience in visualizing the limits of functions and the ε-δ formalism. During the process of visualizing a theorem, the order mattered in the sequence of constituents. Students made use…

  13. Optofluidic bioanalysis: fundamentals and applications

    Directory of Open Access Journals (Sweden)

    Ozcelik Damla

    2017-03-01

    Over the past decade, optofluidics has established itself as a new and dynamic research field for exciting developments at the interface of photonics, microfluidics, and the life sciences. The strong desire for developing miniaturized bioanalytic devices and instruments, in particular, has led to novel and powerful approaches to integrating optical elements and biological fluids on the same chip-scale system. Here, we review the state of the art in optofluidic research with emphasis on applications in bioanalysis and a focus on waveguide-based approaches that represent the most advanced level of integration between optics and fluidics. We discuss recent work in photonically reconfigurable devices and various application areas. We show how optofluidic approaches have been pushing the performance limits in bioanalysis, e.g. in terms of sensitivity and portability, satisfying many of the key requirements for point-of-care devices. This illustrates how the requirements for bioanalysis instruments are increasingly being met by the symbiotic integration of novel photonic capabilities in a miniaturized system.

  14. Fundamentals of modern unsteady aerodynamics

    CERN Document Server

    Gülçat, Ülgen

    2010-01-01

    This introduction to the principles of unsteady aerodynamics covers all the core concepts, provides readers with a review of the fundamental physics, terminology and basic equations, and covers hot new topics such as the use of flapping wings for propulsion.

  15. Quantum mechanics I the fundamentals

    CERN Document Server

    Rajasekar, S

    2015-01-01

    Quantum Mechanics I: The Fundamentals provides a graduate-level account of the behavior of matter and energy at the molecular, atomic, nuclear, and sub-nuclear levels. It covers basic concepts, mathematical formalism, and applications to physically important systems.

  16. Composing Europe's Fundamental Rights Area

    DEFF Research Database (Denmark)

    Storgaard, Louise Halleskov

    2015-01-01

    The article offers a perspective on how the objective of a strong and coherent European protection standard pursued by the fundamental rights amendments of the Lisbon Treaty can be achieved, as it proposes a discursive pluralistic framework to understand and guide the relationship between the EU… The article ends by addressing three of the most pertinent challenges to European fundamental rights protection through the prism of the proposed framework.

  17. Fundamentals of electronic image processing

    CERN Document Server

    Weeks, Arthur R

    1996-01-01

    This book is directed to practicing engineers and scientists who need to understand the fundamentals of image processing theory and algorithms to perform their technical tasks. It is intended to fill the gap between existing high-level texts dedicated to specialists in the field and the need for a more practical, fundamental text on image processing. A variety of example images are used to enhance reader understanding of how particular image processing algorithms work.

  18. Measurement and Fundamental Processes in Quantum Mechanics

    Science.gov (United States)

    Jaeger, Gregg

    2015-07-01

    In the standard mathematical formulation of quantum mechanics, measurement is an additional, exceptional fundamental process rather than an often complex, but ordinary process which happens also to serve a particular epistemic function: during a measurement of one of its properties which is not already determined by a preceding measurement, a measured system, even if closed, is taken to change its state discontinuously rather than continuously as is usual. Many, including Bell, have been concerned about the fundamental role thus given to measurement in the foundation of the theory. Others, including the early Bohr and Schwinger, have suggested that quantum mechanics naturally incorporates the unavoidable uncontrollable disturbance of physical state that accompanies any local measurement without the need for an exceptional fundamental process or a special measurement theory. Disturbance is unanalyzable for Bohr, but for Schwinger it is due to physical interactions' being borne by fundamental particles having discrete properties and behavior which is beyond physical control. Here, Schwinger's approach is distinguished from more well known treatments of measurement, with the conclusion that, unlike most, it does not suffer under Bell's critique of quantum measurement. Finally, Schwinger's critique of measurement theory is explicated as a call for a deeper investigation of measurement processes that requires the use of a theory of quantum fields.

  19. 'Close to' a palliative approach: nurses' and care aides' descriptions of caring for people with advancing chronic life-limiting conditions.

    Science.gov (United States)

    Reimer-Kirkham, Sheryl; Sawatzky, Richard; Roberts, Della; Cochrane, Marie; Stajduhar, Kelli

    2016-08-01

    The purpose of the study was to examine nurses' and nursing assistants' perspectives on a palliative approach in a variety of nursing care settings that do not specialise in palliative care. Ageing populations worldwide are drawing increasing attention to palliative care. In particular, people with advancing chronic life-limiting conditions often have unmet needs and may die in acute medical, residential care and home health settings without access to palliative care. A palliative approach offers an upstream orientation to adopt palliative care principles to meet the needs of people with life-limiting chronic conditions, adapt palliative care knowledge to other chronic illness conditions, and integrate and contextualise this knowledge within the healthcare system (Sawatzky et al. 2016). A qualitative study using the method of interpretive description was carried out by a nursing research-practice collaborative, Initiative for a Palliative Approach: Evidence and Leadership (iPANEL). Twenty-five nurses and five nursing assistants from across British Columbia, Canada participated in interviews and focus groups. Thematic analysis was used to analyse the data. The overarching theme was that participants were 'close to' a palliative approach: they cared for people who would benefit from a palliative approach, they were committed to providing better end-of-life care, and they understood a palliative approach as an extension of specialised palliative care services. Participants varied in their self-reported capacity to integrate a palliative approach, as they were influenced by role clarity, interprofessional collaboration and knowledge. Integration of a palliative approach requires a conceptual shift and can be enhanced through interpersonal relationships and communication, role clarification and education. Nurses care for people with advancing chronic life-limiting conditions in a variety of settings who would benefit from a palliative approach. © 2016 John Wiley & Sons

  20. Sensitivity and uncertainty analyses applied to one-dimensional radionuclide transport in a layered fractured rock: Evaluation of the Limit State approach, Iterative Performance Assessment, Phase 2

    International Nuclear Information System (INIS)

    Wu, Y.T.; Gureghian, A.B.; Sagar, B.; Codell, R.B.

    1992-12-01

    The Limit State approach is based on partitioning the parameter space into two parts: one in which the performance measure is smaller than a chosen value (called the limit state), and the other in which it is larger. Through a Taylor expansion at a suitable point, the partitioning surface (called the limit state surface) is approximated as either a linear or quadratic function. The success and efficiency of the limit state method depend upon choosing an optimum point for the Taylor expansion. The point in the parameter space that has the highest probability of producing the value chosen as the limit state is optimal for expansion. When the parameter space is transformed into a standard Gaussian space, the optimal expansion point, known as the Most Probable Point (MPP), has the property that its location on the limit state surface is closest to the origin. Additionally, the projections onto the parameter axes of the vector from the origin to the MPP are the sensitivity coefficients. Once the MPP is determined and the limit state surface approximated, formulas (see Equations 4-7 and 4-8) are available for determining the probability of the performance measure being less than the limit state. By choosing a succession of limit states, the entire cumulative distribution of the performance measure can be determined. Methods for determining the MPP and also for improving the estimate of the probability are discussed in this report.
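
    The geometric search described above is what first-order reliability methods implement. The sketch below uses the classic Hasofer-Lind/Rackwitz-Fiessler iteration in standard Gaussian space; the quadratic limit-state function g is purely illustrative and is not the report's radionuclide-transport performance measure.

```python
import numpy as np
from scipy.stats import norm

def form_mpp(g, grad_g, u0, tol=1e-8, max_iter=100):
    """Find the Most Probable Point of g(u) = 0 in standard Gaussian space.

    Classic HL-RF iteration: repeatedly project onto the linearized
    limit-state surface. Returns the MPP, the reliability index beta,
    the sensitivity coefficients (projections of the origin-to-MPP
    vector onto the axes), and the first-order probability Phi(-beta).
    """
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        grad = grad_g(u)
        u_new = (grad @ u - g(u)) / (grad @ grad) * grad
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)
    alpha = u / beta          # sensitivity coefficients, as described above
    return u, beta, alpha, norm.cdf(-beta)

# Illustrative limit-state function (an assumption, not from the report):
g = lambda u: 3.0 - u[0] - 0.5 * u[1] ** 2
grad_g = lambda u: np.array([-1.0, -u[1]])
mpp, beta, alpha, p_exceed = form_mpp(g, grad_g, u0=[0.1, 0.1])
print(f"beta = {beta:.3f}, first-order probability ~ {p_exceed:.4f}")
```

    Repeating this search for a succession of limit states traces out the cumulative distribution of the performance measure, as the abstract describes.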

  1. Design of the impact limiters of a Type B(U) package: drop tests and validation of the analytical model. In the design of a container for the transportation of spent fuel, the impact limiters are a fundamental part of compliance with regulatory requirements; Diseno de los Limitadores de impacto de un Bulto Tipo B(U). Ensayos de Caida y validacion del Modelo Analitico

    Energy Technology Data Exchange (ETDEWEB)

    Garrido Quevedo, D.

    2013-07-01

    The aim is to confirm through real tests that the design, and the results obtained through simulation, conform to reality with a high degree of confidence... The combination of tests on scale models and the validation of the calculation methods are necessary tools for the design of the impact limiters of a spent fuel transport container.

  2. Quantum Limits of Space-to-Ground Optical Communications

    Science.gov (United States)

    Hemmati, H.; Dolinar, S.

    2012-01-01

    For a pure-loss channel, the ultimate capacity can be achieved with classical coherent states (i.e., ideal laser light): (1) the capacity-achieving receiver (measurement) is yet to be determined; (2) heterodyne detection approaches the ultimate capacity at high mean photon numbers; (3) photon counting approaches the ultimate capacity at low mean photon numbers. A number of current technology limits drive the achievable performance of free-space communication links. Approaching fundamental limits in the bandwidth-limited regime: (1) heterodyne detection with high-order coherent-state modulation approaches the ultimate limits; state-of-the-art improvements to laser phase noise and adaptive optics systems for atmospheric transmission would help. (2) High-order intensity modulation and photon counting can approach heterodyne detection within approximately a factor of 2; this may have advantages over coherent detection in the presence of turbulence. Approaching fundamental limits in the photon-limited regime: (1) low-duty-cycle binary coherent-state modulation (OOK, PPM) approaches the ultimate limits; state-of-the-art improvements to laser extinction ratio, receiver dark noise, jitter, and blocking would help. (2) In some link geometries (near-field links), number-state transmission could improve over coherent-state transmission.
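
    As a numerical companion to these statements, the sketch below compares the ultimate (Holevo) capacity of a pure-loss channel with the ideal heterodyne capacity, in bits per mode. The formulas are the standard pure-loss-channel expressions from the literature, quoted here as our own illustration rather than taken from this abstract; eta is the channel transmissivity and nbar the mean transmitted photon number.

```python
import numpy as np

def g_entropy(x):
    """g(x) = (x+1) log2(x+1) - x log2 x: entropy of a thermal state."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0,
                    (x + 1) * np.log2(x + 1) - x * np.log2(np.maximum(x, 1e-300)),
                    0.0)

def ultimate_capacity(eta, nbar):
    # Pure-loss channel: C = g(eta * nbar), achievable with coherent states.
    return g_entropy(eta * nbar)

def heterodyne_capacity(eta, nbar):
    # Ideal heterodyne detection of coherent states.
    return np.log2(1.0 + eta * nbar)

for nbar in (0.01, 1.0, 100.0):
    c_ult = ultimate_capacity(0.1, nbar)
    c_het = heterodyne_capacity(0.1, nbar)
    print(f"nbar = {nbar:6.2f}: ultimate = {c_ult:.4f} b/mode, heterodyne = {c_het:.4f} b/mode")
```

    Running it shows the heterodyne rate approaching the ultimate capacity (in ratio) as the photon number grows, while at low photon numbers it falls far short, which is the regime where photon counting wins.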

  3. Microscale and nanoscale heat transfer fundamentals and engineering applications

    CERN Document Server

    Sobhan, CB

    2008-01-01

    Preface; Introduction to Microscale Heat Transfer; Microscale Heat Transfer: A Recent Avenue in Energy Transport; State of the Art: Some Introductory Remarks; Overview of Microscale Transport Phenomena; Discussions on Size-Effect Behavior; Fundamental Approach for Microscale Heat Transfer; Introduction to Engineering Applications of Microscale Heat Transfer. Microscale Heat Conduction: Review of Conduction Heat Transfer; Conduction at the Microscale; Space and Timescales; Fundamental Approach; Thermal Conductivity; Boltzmann Equation and Phonon Transport; Conduction in Thin Films

  4. Hip resurfacing using a modified lateral approach with limited splitting of the gluteus medius muscle results in significant impairment of hip abductor strength.

    Science.gov (United States)

    Moussazadeh, A J; Kohlhof, H; Wirtz, D C; Wimmer, M D; Randau, T M; Wölk, T; Gravius, S

    2013-01-01

    A lateral, transgluteal approach for hip resurfacing carries the risk of approach-related weakening of the hip abductors, due either to unsuccessful re-adaptation of the gluteal muscles to the greater trochanter or to injury to the inferior nerve branch of the superior gluteal nerve. We investigated whether hip resurfacing using a soft tissue-sparing, modified transgluteal approach with limited cranial splitting of the gluteus medius muscle reduces hip abductor strength and the risk of approach-related injury to the superior gluteal nerve. Thirty-one patients (14 female, 17 male; mean age 53.5 ± 5.2 years) underwent hip resurfacing using a modified transgluteal approach with limited cranial splitting of the gluteus medius muscle. Nerve conduction signals were measured by surface electromyography (EMG), and hip abductor strength by isokinetic testing, a mean of 36.2 (± 11) months after surgery. The unoperated side was used as control. Surface EMG disclosed no neural lesions of the inferior branch of the superior gluteal nerve. Isokinetic testing revealed a significant reduction in muscle strength on the operated versus the contralateral side. Even a limited incision of the gluteus medius muscle resulted in significant impairment of hip abductor strength 2.5 years after surgery.

  5. Fundamentals of semiconductor manufacturing and process control

    CERN Document Server

    May, Gary S

    2006-01-01

    A practical guide to semiconductor manufacturing from process control to yield modeling and experimental design Fundamentals of Semiconductor Manufacturing and Process Control covers all issues involved in manufacturing microelectronic devices and circuits, including fabrication sequences, process control, experimental design, process modeling, yield modeling, and CIM/CAM systems. Readers are introduced to both the theory and practice of all basic manufacturing concepts. Following an overview of manufacturing and technology, the text explores process monitoring methods, including those that focus on product wafers and those that focus on the equipment used to produce wafers. Next, the text sets forth some fundamentals of statistics and yield modeling, which set the foundation for a detailed discussion of how statistical process control is used to analyze quality and improve yields. The discussion of statistical experimental design offers readers a powerful approach for systematically varying controllable p...

  6. Religious quest orientation: Rising against fundamentalism

    Directory of Open Access Journals (Sweden)

    Reeshma Haji

    2014-06-01

    Full Text Available Quest, or a journey-oriented approach to religion, is one dimension of religiosity that has been consistently related to positive outgroup attitudes. The present research assessed the extent to which individual differences in quest religiosity moderated the effects of a religiosity prime on attitudes toward an outgroup religion. Christian-identifying participants (N = 55) completed a scale measure of quest religiosity. They then read a vignette that primed quest religiosity or religious fundamentalism. Attitudes toward Muslims and Jews were assessed with evaluation thermometers. Quest religiosity interacted with the prime such that those high in quest appeared to react against the fundamentalism prime by expressing particularly positive outgroup attitudes. Trait quest religiosity appears to buffer against situational factors that are typically associated with negative outgroup attitudes. In addition, implications for research on intergroup relations of religious groups are discussed.

  7. Fundamentals and advanced techniques in derivatives hedging

    CERN Document Server

    Bouchard, Bruno

    2016-01-01

    This book covers the theory of derivatives pricing and hedging as well as techniques used in mathematical finance. The authors use a top-down approach, starting with fundamentals before moving to applications, and present theoretical developments alongside various exercises, providing many examples of practical interest. A large spectrum of concepts and mathematical tools that are usually found in separate monographs are presented here. In addition to the no-arbitrage theory in full generality, this book also explores models and practical hedging and pricing issues. Fundamentals and Advanced Techniques in Derivatives Hedging further introduces advanced methods in probability and analysis, including Malliavin calculus and the theory of viscosity solutions, as well as the recent theory of stochastic targets and its use in risk management, making it the first textbook covering this topic. Graduate students in applied mathematics with an understanding of probability theory and stochastic calculus will find this b...

  8. [Rationalization, rationing, prioritization: terminology and ethical approaches to the allocation of limited resources in hematology/oncology].

    Science.gov (United States)

    Winkler, Eva

    2011-01-01

    The field of oncology with its numerous high-priced innovations contributes considerably to the fact that medical progress is expensive. Additionally, due to the demographic changes and the increasing life expectancy, a growing number of cancer patients want to profit from this progress. Since resources are limited also in the health system, the fair distribution of the available resources urgently needs to be addressed. Dealing with scarcity is a typical problem in the domain of justice theory; therefore, this article first discusses different strategies to manage limited resources: rationalization, rationing, and prioritization. It then presents substantive as well as procedural criteria that assist in the just distribution of effective health benefits. There are various strategies to reduce the utilization of limited resources: Rationalization means that efficiency reserves are being exhausted; by means of rationing, effective health benefits are withheld due to cost considerations. Rationing can occur implicitly and thus covertly, e.g. through budgeting or the implementation of waiting periods, or explicitly, through transparent rules or policies about healthcare coverage. Ranking medical treatments according to their importance (prioritization) is often a prerequisite for rationing decisions. In terms of requirements of justice, both procedural and substantive criteria (e.g. equality, urgency, benefit) are relevant for the acceptance and quality of a decision to limit access to effective health benefits. Copyright © 2011 S. Karger AG, Basel.

  9. Fundamental Composite (Goldstone) Higgs Dynamics

    DEFF Research Database (Denmark)

    Cacciapaglia, G.; Sannino, Francesco

    2014-01-01

    We provide a unified description, both at the effective and fundamental Lagrangian level, of models of composite Higgs dynamics where the Higgs itself can emerge, depending on the way the electroweak symmetry is embedded, either as a pseudo-Goldstone boson or as a massive excitation of the condensate... transforming according to the fundamental representation of the gauge group. This minimal choice enables us to use recent first-principle lattice results to make the first predictions for the massive spectrum for models of composite (Goldstone) Higgs dynamics. These results are of the utmost relevance to guide...

  10. Image restoration fundamentals and advances

    CERN Document Server

    Gunturk, Bahadir Kursat

    2012-01-01

    Image Restoration: Fundamentals and Advances responds to the need to update most existing references on the subject, many of which were published decades ago. Providing a broad overview of image restoration, this book explores breakthroughs in related algorithm development and their role in supporting real-world applications associated with various scientific and engineering fields. These include astronomical imaging, photo editing, and medical imaging, to name just a few. The book examines how such advances can also lead to novel insights into the fundamental properties of image sources. Addr

  11. RFID design fundamentals and applications

    CERN Document Server

    Lozano-Nieto, Albert

    2010-01-01

    RFID is an increasingly pervasive tool that is now used in a wide range of fields. It is employed to substantiate adherence to food preservation and safety standards, combat the circulation of counterfeit pharmaceuticals, and verify the authenticity and history of critical parts used in aircraft and other machinery - and these are just a few of its uses. Going beyond deployment and focusing on exactly how RFID actually works, RFID Design Fundamentals and Applications systematically explores the fundamental principles involved in the design and characterization of RFID technologies. The RFID market is expl

  12. Degeneration of penicillin production in ethanol-limited chemostat cultivations of Penicillium chrysogenum : A systems biology approach

    NARCIS (Netherlands)

    Douma, Rutger D.; Batista, Joana M.; Touw, Kai M.; Kiel, Jan A. K. W.; Zhao, Zheng; Veiga, Tania; Klaassen, Paul; Bovenberg, Roel A. L.; Daran, Jean-Marc; van Gulik, Walter M.; Heijnen, J.J.; Krikken, Arjen

    2011-01-01

    Background: In microbial production of non-catabolic products such as antibiotics a loss of production capacity upon long-term cultivation (for example chemostat), a phenomenon called strain degeneration, is often observed. In this study a systems biology approach, monitoring changes from gene to

  13. Cr3+-Doped Fluorides and Oxides: Role of Internal Fields and Limitations of the Tanabe–Sugano Approach

    DEFF Research Database (Denmark)

    Trueba, A.; García Lastra, Juan Maria; Garcia-Fernandez, P.

    2011-01-01

    This work is aimed at clarifying the changes in the optical spectra of Cr3+ impurities due to either a host lattice variation or a hydrostatic pressure, which can hardly be understood by means of the usual Tanabe-Sugano (TS) approach assuming that the Racah parameter, B, grows when covalency decre...

  14. A risk modelling approach for setting microbiological limits using enterococci as indicator for growth potential of Salmonella in pork

    DEFF Research Database (Denmark)

    Bollerslev, Anne Mette; Nauta, Maarten; Hansen, Tina Beck

    2017-01-01

    Microbiological limits are widely used in food processing as an aid to reduce the exposure to hazardous microorganisms for the consumers. However, in pork, the prevalence and concentrations of Salmonella are generally low and microbiological limits are not considered an efficient tool to support... The facts that enterococci and Salmonella are carried in the intestinal tract, contaminate pork by the same mechanisms and share similar growth characteristics (lag phase and maximum specific growth rate) at temperatures around 5-10 °C suggest a potential of enterococci to be used as an indicator of potential growth of Salmonella in pork. Elevated... The risk model developed for this purpose includes the dose-response relationship for Salmonella and a reduction factor to account for preparation of the fresh pork. By use of the risk model, it was estimated that the majority of salmonellosis cases, caused by the consumption of pork in Denmark, is caused by the small fraction of pork...

  15. On the Use of Time-Limited Information for Maintenance Decision Support: A Predictive Approach under Maintenance Constraints

    Directory of Open Access Journals (Sweden)

    E. Khoury

    2013-01-01

    Full Text Available This paper deals with a gradually deteriorating system operating under an uncertain environment whose state is only known on a finite rolling horizon. As such, the system is subject to constraints. Maintenance actions can only be planned at imposed times called maintenance opportunities that are available on a limited visibility horizon. This system can, for example, be a commercial vehicle with a monitored critical component that can be maintained only in some specific workshops. Based on the considered system, we aim to use the monitoring data and the time-limited information for maintenance decision support in order to reduce its costs. We propose two predictive maintenance policies based, respectively, on cost and reliability criteria. Classical age-based and condition-based policies are considered as benchmarks. The performance assessment shows the value of the different types of information and the best way to use them in maintenance decision making.
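
    The decision rule described in the abstract can be illustrated with a toy condition-based policy: the system degrades stochastically, maintenance is possible only at the opportunity times known on the limited visibility horizon, and at each opportunity the component is renewed when the estimated probability of failing before the next known opportunity exceeds a risk limit. The sketch below is a generic illustration under an assumed gamma-process degradation model, not the authors' model; every constant is a placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

FAILURE_LEVEL = 100.0   # degradation level at which the system fails (assumed)
SHAPE_PER_TIME = 4.0    # assumed gamma-process shape increment per time unit
SCALE = 1.0             # assumed gamma scale parameter
C_PREVENTIVE, C_FAILURE, RISK_LIMIT = 1.0, 10.0, 0.1

def failure_prob(x_now, horizon, n_mc=2000):
    """Monte-Carlo estimate of P(failure before the next opportunity),
    given the monitored degradation level x_now."""
    increments = rng.gamma(SHAPE_PER_TIME * horizon, SCALE, size=n_mc)
    return float(np.mean(x_now + increments >= FAILURE_LEVEL))

opportunities = [5.0, 12.0, 20.0, 31.0]  # times known on the visibility horizon
x, t, cost = 0.0, 0.0, 0.0

for i, t_opp in enumerate(opportunities):
    x += rng.gamma(SHAPE_PER_TIME * (t_opp - t), SCALE)  # degrade until t_opp
    t = t_opp
    if x >= FAILURE_LEVEL:        # failure occurred between opportunities
        cost += C_FAILURE         # corrective replacement
        x = 0.0
        continue
    if i + 1 < len(opportunities):
        gap = opportunities[i + 1] - t
        if failure_prob(x, gap) > RISK_LIMIT:
            cost += C_PREVENTIVE  # maintain now rather than risk failure
            x = 0.0

print(f"total maintenance cost over the horizon: {cost:.1f}")
```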

  16. Fundamentals of Welding. Teacher Edition.

    Science.gov (United States)

    Fortney, Clarence; And Others

    These instructional materials assist teachers in improving instruction on the fundamentals of welding. The following introductory information is included: use of this publication; competency profile; instructional/task analysis; related academic and workplace skills list; tools, materials, and equipment list; and 27 references. Seven units of…

  17. Ecological fundamentals of environmental protection

    International Nuclear Information System (INIS)

    Haber, W.

    1993-01-01

    The book reviews the state of the art of ecological knowledge. The emphasis is on ecosystem theory and on the interpretation of our environment with its irreversible anthropogenic changes. It is an important contribution to deeper knowledge about the ecological fundamentals of environmental protection and the factors that constitute nature's potential. (orig./BBR) [de

  18. Fundamentals: IVC and Computer Science

    NARCIS (Netherlands)

    Gozalvez, Javier; Haerri, Jerome; Hartenstein, Hannes; Heijenk, Geert; Kargl, Frank; Petit, Jonathan; Scheuermann, Björn; Tieler, Tessa; Altintas, O.; Dressler, F.; Hartenstein, H.; Tonguz, O.K.

    The working group on “Fundamentals: IVC and Computer Science” discussed the lasting value of achieved research results as well as potential future directions in the field of inter-vehicular communication. Two major themes ‘with variations’ were the dependence on a specific technology (particularly

  19. Credit cycles and macro fundamentals

    NARCIS (Netherlands)

    Koopman, S.J.; Kraeussl, R.G.W.; Lucas, A.; Monteiro, A.

    2009-01-01

    We use an intensity-based framework to study the relation between macroeconomic fundamentals and cycles in defaults and rating activity. Using Standard and Poor's U.S. corporate rating transition and default data over the period 1980-2005, we directly estimate the default and rating cycle from micro

  20. Fundamental length and relativistic length

    International Nuclear Information System (INIS)

    Strel'tsov, V.N.

    1988-01-01

    It is noted that the introduction of a fundamental length contradicts the conventional representations concerning the contraction of the longitudinal size of fast-moving objects. The use of the concept of relativistic length and the following 'elongation formula' permits one to solve this problem

  1. Fundamentals of EU VAT law

    NARCIS (Netherlands)

    van Doesum, A.; van Kesteren, H.W.M.; van Norden, G.J.

    Fundamentals of EU VAT Law aims at providing a deep insight into the systematics, the functioning and the principles of the European Value Added Tax (VAT) system. VAT is responsible for generating approximately EUR 903 billion per year in tax revenues across the European Union – revenues that play a

  2. Energy informatics: Fundamentals and standardization

    Directory of Open Access Journals (Sweden)

    Biyao Huang

    2017-06-01

    Full Text Available Based on international standardization and power utility practices, this paper presents a preliminary and systematic study of the field of energy informatics and analyzes the boundary expansion of information and energy systems and the convergence of energy systems and ICT. A comprehensive introduction to the fundamentals and standardization of energy informatics is provided, and several key open issues are identified.

  3. Energy informatics: Fundamentals and standardization

    OpenAIRE

    Biyao Huang; Xiaomin Bai; Zhenyu Zhou; Quansheng Cui; Daohua Zhu; Ruwei Hu

    2017-01-01

    Based on international standardization and power utility practices, this paper presents a preliminary and systematic study of the field of energy informatics and analyzes the boundary expansion of information and energy systems and the convergence of energy systems and ICT. A comprehensive introduction to the fundamentals and standardization of energy informatics is provided, and several key open issues are identified.

  4. Experimental tests of fundamental symmetries

    NARCIS (Netherlands)

    Jungmann, K. P.

    2014-01-01

    Ongoing experiments and projects to test our understanding of fundamental interactions and symmetries in nature have progressed significantly in the past few years. At high energies the long-searched-for Higgs boson has been found; tests of gravity for antimatter have come closer to reality;

  5. Fundamentals of bladder tissue engineering

    African Journals Online (AJOL)

    W. Mahfouz

    promote angiogenesis and neurogenesis of the regenerated organs. The choice of the scaffold and the type of cells is a crucial and fundamental step in regenerative medicine. In this review article, we demonstrated these three crucial factors of bladder tissue engineering, with the pros and cons of each scaffold type and.

  6. A verdade como um problema fundamental em Kant / Kant on truth as a fundamental problem

    Directory of Open Access Journals (Sweden)

    Adriano Perin

    2010-01-01

    Full Text Available The main point of disagreement about Kant's approach to the problem of truth is whether it can be understood, within the apparatus of contemporary philosophy, as a coherence or a correspondence theory. Favoring a systematic consideration of Kant's argumentation in light of the available literature on the problem, this paper argues for the latter alternative. It is sustained that the definition of truth as "the agreement of cognition with its object" is cogent throughout Kant's thought and that, on this understanding, truth ends up being approached not from an established theory, but as a problem whose solution cannot be given within the limits of critical-transcendental philosophy. First, the literature that situates Kant either as a coherentist or as a correspondentist is reviewed, and the latter alternative is systematized into four groups: the ontological reading, the isomorphic reading, the "consequentialist" reading, and the regulative reading. Second, regarding the pre-critical period, it is argued that the coherentist alternative already fails to be confirmed at that time and that, in the 1750s, Kant discards a supposed isomorphic correspondence theory. Finally, the critical argumentation is considered, and it is argued that it conceives truth as a fundamental problem that cannot be treated by a correspondence theory conceived in a "consequentialist" or regulative manner.

  7. [Overcoming the limitations of the descriptive and categorical approaches in psychiatric diagnosis: a proposal based on Bayesian networks].

    Science.gov (United States)

    Sorias, Soli

    2015-01-01

    Efforts to overcome the problems of descriptive and categorical approaches have not yielded results. In the present article, psychiatric diagnosis using Bayesian networks is proposed. Instead of a yes/no decision, Bayesian networks give the probability of diagnostic category inclusion, thereby yielding both a graded, i.e., dimensional, diagnosis and a value of the certainty of the diagnosis. With the use of Bayesian networks in the diagnosis of mental disorders, information about etiology, associated features, treatment outcome, and laboratory results may be used in addition to clinical signs and symptoms, with each of these factors contributing proportionally to its own specificity and sensitivity. Furthermore, a diagnosis (albeit one with a lower probability) can be made even with incomplete, uncertain, or partially erroneous information, and patients whose symptoms are below the diagnostic threshold can be evaluated. Lastly, there is no need for NOS or "unspecified" categories, and comorbid disorders become different dimensions of the diagnostic evaluation. Bayesian diagnoses allow the preservation of current categories and assessment methods, and may be used concurrently with criteria-based diagnoses. Users need not put in extra effort except to collect more comprehensive information. Unlike the Research Domain Criteria (RDoC) project, the Bayesian approach neither increases the diagnostic validity of existing categories nor explains the pathophysiological mechanisms of mental disorders. It can, however, be readily integrated into present classification systems. Therefore, the Bayesian approach may be an intermediate phase between criteria-based diagnosis and the RDoC ideal.
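
    As an illustration of the graded, probability-valued output described above, the sketch below uses a naive-Bayes model, the simplest special case of a Bayesian network. All categories, findings, priors, and conditional probabilities are invented placeholders rather than clinical values; the point is that the evidence may be partial and the result is a probability over categories instead of a yes/no decision.

```python
# Invented priors and likelihoods for illustration only -- not clinical values.
PRIOR = {"disorder_A": 0.05, "disorder_B": 0.10, "none": 0.85}
P_FINDING = {   # P(finding present | category)
    "low_mood":   {"disorder_A": 0.90, "disorder_B": 0.40, "none": 0.10},
    "insomnia":   {"disorder_A": 0.70, "disorder_B": 0.60, "none": 0.20},
    "lab_marker": {"disorder_A": 0.60, "disorder_B": 0.10, "none": 0.05},
}

def posterior(evidence):
    """Graded diagnosis from partial evidence.

    evidence maps a finding to True/False; findings that were never
    assessed are simply omitted, so incomplete information still
    yields a (less certain) probability over categories.
    """
    post = dict(PRIOR)
    for finding, present in evidence.items():
        for cat in post:
            p = P_FINDING[finding][cat]
            post[cat] *= p if present else (1.0 - p)
    z = sum(post.values())
    return {cat: round(v / z, 3) for cat, v in post.items()}

print(posterior({"low_mood": True}))                      # partial evidence
print(posterior({"low_mood": True, "lab_marker": True}))  # more evidence
```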

  8. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique. Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  9. Quantum Mechanics Fundamentals and Applications to Technology

    CERN Document Server

    Singh, Jasprit

    1996-01-01

    Explore the relationship between quantum mechanics and information-age applications. This volume takes an altogether unique approach to quantum mechanics. Providing an in-depth exposition of quantum mechanics fundamentals, it shows how these concepts are applied to most of today's information technologies, whether they are electronic devices or materials. No other text makes this critical, essential leap from theory to real-world applications. The book's lively discussion of the mathematics involved fits right in with contemporary multidisciplinary trends in education: Once the basic formulati

  10. arXiv Minimal Fundamental Partial Compositeness

    CERN Document Server

    Cacciapaglia, Giacomo; Sannino, Francesco; Thomsen, Anders Eller

    Building upon the fundamental partial compositeness framework we provide consistent and complete composite extensions of the standard model. These are used to determine the effective operators emerging at the electroweak scale in terms of the standard model fields upon consistently integrating out the heavy composite dynamics. We exhibit the first effective field theories matching these complete composite theories of flavour and analyse their physical consequences for the third generation quarks. Relations with other approaches, ranging from effective analyses for partial compositeness to extra dimensions as well as purely fermionic extensions, are briefly discussed. Our methodology is applicable to any composite theory of dynamical electroweak symmetry breaking featuring a complete theory of flavour.

  11. BOOK REVIEWS: Quantum Mechanics: Fundamentals

    Science.gov (United States)

    Whitaker, A.

    2004-02-01

    mechanics, which is assumed, but to examine whether it gives a consistent account of measurement. The conclusion is that after a measurement, interference terms are ‘effectively’ absent; the set of ‘one-to-one correlations between states of the apparatus and the object’ has the same form as that of everyday statistics and is thus a probability distribution. This probability distribution refers to potentialities, only one of which is actually realized in any one trial. Opinions may differ on whether their treatment is any less vulnerable to criticisms such as those of Bell. To sum up, Gottfried and Yan’s book contains a vast amount of knowledge and understanding. As well as explaining the way in which quantum theory works, it attempts to illuminate fundamental aspects of the theory. A typical example is the ‘fable’ elaborated in Gottfried’s article in Nature cited above, that if Newton were shown Maxwell’s equations and the Lorentz force law, he could deduce the meaning of E and B, but if Maxwell were shown Schrödinger’s equation, he could not deduce the meaning of Psi. For use with a well-constructed course (and, of course, this is the avowed purpose of the book; a useful range of problems is provided for each chapter), or for the relative expert getting to grips with particular aspects of the subject or aiming for a deeper understanding, the book is certainly ideal. It might be suggested, though, that, even compared to the first edition, the isolated learner might find the wide range of topics, and the very large number of mathematical and conceptual techniques, introduced in necessarily limited space, somewhat overwhelming. The second book under consideration, that of Schwabl, contains ‘Advanced’ elements of quantum theory; it is designed for a course following on from one for which Gottfried and Yan, or Schwabl’s own `Quantum Mechanics' might be recommended. It is the second edition in English, and is a translation of the third German edition

  12. Limitations and opportunities of combining Cradle to Grave and Cradle-to-Cradle approaches to support the circular economy

    OpenAIRE

    Niero, Monia; Hauschild, Michael Zwicky; Olsen, Stig Irving

    2016-01-01

    Both Life Cycle Assessment (LCA) with its “Cradle to Grave” approach and the Cradle to Cradle® (C2C) design framework based on the eco-effectiveness concept can support the implementation of circular economy. Based on the insights gained in the packaging sector, we perform a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis of the combined use of LCA and “C2C tools”, i.e. the C2C design protocol and the C2C Certified™ product standard, in the implementation of circularity str...

  13. Synthesis of silver nanoparticles through green approach using Dioscorea alata and their characterization on antibacterial activities and optical limiting behavior.

    Science.gov (United States)

    Pugazhendhi, S; Sathya, P; Palanisamy, P K; Gopalakrishnan, R

    2016-06-01

    In this work, we have successfully synthesized highly biocompatible and functionalized Dioscorea alata (D. alata)-mediated silver nanoparticles with different quantities of its extract for the evaluation of proficient bactericidal activity and optical limiting behavior. The crystalline nature of the synthesized silver nanoparticles was confirmed by powder X-ray diffraction (XRD) analysis and further confirmed from the SAED pattern of the HRTEM analysis. The surface plasmon resonance band was measured and monitored by UV-Visible spectral studies. The functional groups present in the extract for the reduction and stabilization of the nanoparticles were analyzed by the Fourier transform infrared spectroscopy (FTIR) technique. Surface morphology and size of particles were determined by high-resolution transmission electron microscopy (HRTEM) analysis. The elemental analysis was made by energy dispersive X-ray spectroscopy (EDX). The synthesized silver nanoparticles (AgNPs) in colloidal form were found to exhibit third-order optical nonlinearity, as studied by the closed- and open-aperture Z-scan techniques using a 532 nm Nd:YAG (SHG) CW laser beam (COHERENT-Compass 215M-50 diode-pumped) as the source. The negative nonlinearity observed was well utilized for the study of the optical limiting behavior of the silver nanoparticles. D. alata-mediated silver nanoparticles possess very good antimicrobial activity, which was confirmed by the agar well diffusion assay method. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Fundamental research in developing countries

    International Nuclear Information System (INIS)

    Moravesik, M.J.

    1964-01-01

    Technical assistance is today a widespread activity. Large numbers of persons with special qualifications in the applied sciences go to the developing countries to work on specific research and development projects, as do educationists on Fulbright or other programmes - usually to teach elementary or intermediate courses. But I believe that until now it has been rare for a person primarily interested in fundamental research to go to one of these countries to help build up advanced education and pure research work. Having recently returned from such an assignment, and having found it a most stimulating and enlightening experience, I feel moved to urge strongly upon others who may be in a position to do so that they should seek similar experience themselves. The first step is to show that advanced education and fundamental research are badly needed in the under-developed countries.

  15. DOE Fundamentals Handbook: Classical Physics

    International Nuclear Information System (INIS)

    1992-06-01

    The Classical Physics Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of physical forces and their properties. The handbook includes information on the units used to measure physical properties; vectors, and how they are used to show the net effect of various forces; Newton's Laws of motion, and how to use these laws in force and motion applications; and the concepts of energy, work, and power, and how to measure and calculate the energy involved in various applications. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility systems and equipment

  16. Modern measurements fundamentals and applications

    CERN Document Server

    Petri, D; Carbone, P; Catelani, M

    2015-01-01

    This book explores the modern role of measurement science for both the technically most advanced applications and in everyday life, and will help readers gain the necessary skills to specialize their knowledge for a specific field in measurement. Modern Measurements is divided into two parts. Part I (Fundamentals) presents a model of the modern measurement activity and the already recalled fundamental bricks. It starts with a general description that introduces these bricks and the uncertainty concept. The next chapters provide an overview of these bricks and finish (Chapter 7) with a more general and complex model that encompasses both traditional (hard) measurements and (soft) measurements, aimed at quantifying non-physical concepts, such as quality, satisfaction, comfort, etc. Part II (Applications) is aimed at showing how the concepts presented in Part I can be usefully applied to design and implement measurements in some very important and broad fields. The editors cover System Identification (Chapter 8...

  17. Fundamentals of electronic systems design

    CERN Document Server

    Lienig, Jens

    2017-01-01

    This textbook covers the design of electronic systems from the ground up, from drawing and CAD essentials to recycling requirements. Chapter by chapter, it deals with the challenges any modern system designer faces: the design process and its fundamentals, such as technical drawings and CAD, electronic system levels, assembly and packaging issues and appliance protection classes, reliability analysis, thermal management and cooling, electromagnetic compatibility (EMC), all the way to recycling requirements and environmentally friendly design principles. The book enables readers to face the various challenges of designing electronic systems, including coverage from various engineering disciplines; is written to be accessible to readers of varying backgrounds; uses illustrations extensively to reinforce fundamental concepts; and is organized to follow the essential design process, although chapters are self-contained and can be read in any order.

  18. Fundamentals of condensed matter physics

    CERN Document Server

    Cohen, Marvin L

    2016-01-01

    Based on an established course and covering the fundamentals, central areas, and contemporary topics of this diverse field, Fundamentals of Condensed Matter Physics is a much-needed textbook for graduate students. The book begins with an introduction to the modern conceptual models of a solid from the points of view of interacting atoms and elementary excitations. It then provides students with a thorough grounding in electronic structure as a starting point to understand many properties of condensed matter systems - electronic, structural, vibrational, thermal, optical, transport, magnetic and superconductivity - and methods to calculate them. Taking readers through the concepts and techniques, the text gives both theoretically and experimentally inclined students the knowledge needed for research and teaching careers in this field. It features 200 illustrations, 40 worked examples and 150 homework problems for students to test their understanding. Solutions to the problems for instructors are available at w...

  19. Protection of fundamental rights today

    International Nuclear Information System (INIS)

    Meyer-Abich, K.M.

    1984-01-01

    Technical developments can both change the methods of dealing with existing conflicts and cause new conflicts. Meyer-Abich analyzes five conflicts caused by technological development, in the solution of which the constitutional, liberal, and democratic protection of fundamental rights is not at all guaranteed. Meyer-Abich argues that these new conflicts can be solved within the framework of the liberal constitutional state if legal and political consequences are drawn in order to guarantee the unchanged protection of fundamental rights under changing conditions. The necessary reforms can, however, only be realized if the way state and science see themselves changes. Both have to give up the one-sidedness into which they have been pushed by conflicts caused by scientific and technical development. Only then will it be possible to solve the emerging conflicts without jeopardizing the integrity of society. (orig.) [de

  20. Fundamentals of estuarine physical oceanography

    CERN Document Server

    Bruner de Miranda, Luiz; Kjerfve, Björn; Castro Filho, Belmiro Mendes de

    2017-01-01

    This book provides an introduction to the complex system functions, variability and human interference in ecosystem between the continent and the ocean. It focuses on circulation, transport and mixing of estuarine and coastal water masses, which is ultimately related to an understanding of the hydrographic and hydrodynamic characteristics (salinity, temperature, density and circulation), mixing processes (advection and diffusion), transport timescales such as the residence time and the exposure time. In the area of physical oceanography, experiments using these water bodies as a natural laboratory and interpreting their circulation and mixing processes using theoretical and semi-theoretical knowledge are of fundamental importance. Small-scale physical models may also be used together with analytical and numerical models. The book highlights the fact that research and theory are interactive, and the results provide the fundamentals for the development of the estuarine research.

  1. THE FUNDAMENTS OF EXPLANATORY CAUSES

    Directory of Open Access Journals (Sweden)

    Lavinia Mihaela VLĂDILĂ

    2015-07-01

    Full Text Available The new Criminal Code brought into legal life the division of the causes removing the criminal feature of the offence into explanatory causes and non-attributable causes. This dichotomy is not without legal and factual fundaments and has been subjected to doctrinaire debates even since the period when the Criminal Code of 1969 was still in force. From our perspective, one of the possible legal fundaments of the explanatory causes results from the fact that the offence committed is based on the protection of a right at least equal to the one prejudiced by the action of aggression or salvation, by the legal obligation imposed, or by the victim's consent.

  2. Limitations and opportunities of combining Cradle to Grave and Cradle-to-Cradle approaches to support the circular economy

    DEFF Research Database (Denmark)

    Niero, Monia; Hauschild, Michael Zwicky; Olsen, Stig Irving

    2016-01-01

    Both Life Cycle Assessment (LCA) with its “Cradle to Grave” approach and the Cradle to Cradle® (C2C) design framework based on the eco-effectiveness concept can support the implementation of circular economy. Based on the insights gained in the packaging sector, we perform a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis of the combined use of LCA and “C2C tools”, i.e. the C2C design protocol and the C2C Certified™ product standard, in the implementation of circularity strategies at the product level. Moreover, we discuss the challenges which need to be addressed in order to move from a relative to an absolute environmental sustainability perspective at the company level, and define a framework for implementing circularity strategies at the company level, considering an absolute environmental sustainability perspective and the business dimension.

  3. Multi-target pharmacology: Possibilities and limitations of the skeleton key approach from a medicinal chemist perspective

    Directory of Open Access Journals (Sweden)

    Alan eTalevi

    2015-09-01

    Full Text Available Multi-target drugs have raised considerable interest in the last decade owing to their advantages in the treatment of complex diseases and health conditions linked to drug resistance issues. Prospective drug repositioning to treat comorbid conditions is an additional, overlooked application of multi-target ligands. While medicinal chemists usually rely on some version of the lock and key paradigm to design novel therapeutics, modern pharmacology has recognized that the long-term effects of a given drug on a biological system may depend not only on the specific ligand-target recognition events but also on the influence of the chronic administration of a drug on the cell gene signature. The design of multi-target agents also poses challenging restrictions on either the topology or the flexibility of the candidate drugs, which are briefly discussed in the present article. Finally, computational strategies to approach the identification of novel multi-target agents are overviewed.

  4. A Swiss Village in the Dutch Tropics: The Limitations of Empire-Centred Approaches to the Early Modern Atlantic World

    Directory of Open Access Journals (Sweden)

    Karwan Fatah-Black

    2013-03-01

    Full Text Available This article considers what the migration circuits to and from Suriname can tell us about Dutch early modern colonisation in the Atlantic world. Did the Dutch have an Atlantic empire that can be studied by treating it as an integrated space, as suggested by New Imperial Historians, or did colonisation rely on circuits outside Dutch control, stretching beyond its imperial space? An empire-centred approach has dominated the study of Suriname’s history and has largely glossed over the routes taken by European migrants to and from the colony. When the empire-centred perspective is transcended, it becomes possible to see that colonists arrived in Suriname from a range of different places around the Atlantic and the European hinterland. The article takes an Atlantic or global perspective to demonstrate the choices available to colonists and the networks through which they moved.

  5. Fundamental Study on Laser Interaction with Metal Matrix Nanocomposites

    OpenAIRE

    Ma, Chao

    2015-01-01

    The objective of this study is to significantly advance the fundamental understanding of laser interaction with metal matrix nanocomposites (MMNCs) and to overcome the fundamental limits of current laser processing techniques by tuning heat transfer and fluid flow using nanoparticles. Ultrasonic-assisted electrocodeposition was used to prepare MMNC samples (e.g., Ni/Al2O3) for laser melting experiments. Microstructural study showed that uniform distribution and dispersion of nanoparticles wer...

  6. Fundamentals of boiling water reactor (BWR)

    International Nuclear Information System (INIS)

    Bozzola, S.

    1982-01-01

    These lectures on fundamentals of BWR reactor physics are a synthesis of known and established concepts. These lectures are intended to be a comprehensive (even though descriptive in nature) presentation, which would give the basis for a fair understanding of power operation, fuel cycle and safety aspects of the boiling water reactor. The fundamentals of BWR reactor physics are oriented to design and operation. In the first lecture general description of BWR is presented, with emphasis on the reactor physics aspects. A survey of methods applied in fuel and core design and operation is presented in the second lecture in order to indicate the main features of the calculational tools. The third and fourth lectures are devoted to review of BWR design bases, reactivity requirements, reactivity and power control, fuel loading patterns. Moreover, operating limits are reviewed, as the actual limits during power operation and constraints for reactor physics analyses (design and operation). The basic elements of core management are also presented. The constraints on control rod movements during the achieving of criticality and low power operation are illustrated in the fifth lecture. Some considerations on plant transient analyses are also presented in the fifth lecture, in order to show the impact between core and fuel performance and plant/system performance. The last (sixth) lecture is devoted to the open vessel testing during the startup of a commercial BWR. A control rod calibration is also illustrated. (author)

  7. DOE fundamentals handbook: Material science

    International Nuclear Information System (INIS)

    1993-01-01

    This handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of the structure and properties of metals. This volume contains two modules: structure of metals (bonding, common lattice types, grain structure/boundary, polymorphism, alloys, imperfections in metals) and properties of metals (stress, strain, Young's modulus, stress-strain relation, physical properties, working of metals, corrosion, hydrogen embrittlement, tritium/material compatibility)

  8. Fundamentals of plastic optical fibers

    CERN Document Server

    Koike, Yasuhiro

    2014-01-01

    Polymer photonics is an interdisciplinary field which demands excellence both in optics (photonics) and materials science (polymer). However, these disciplines have developed independently, and therefore the demand for a comprehensive work featuring the fundamentals of photonic polymers is greater than ever. This volume focuses on polymer optical fibers and their applications. The first part of the book introduces typical optical fibers according to their classifications of material, propagating mode, and structure. Optical properties, the high-bandwidth POF and transmission loss are discussed,

  9. Fundamental requirements for petrochemical development

    International Nuclear Information System (INIS)

    Flint, G. B.

    1999-01-01

    The development of NOVA Chemicals over the past 20 years is described as an illustration of how the petrochemical industry provides markets for natural gas, natural gas liquids and the products of crude oil distillation, and functions as a conduit for upgrading products which would otherwise be sold into the fuel market. Some fundamental characteristics of the business which are foundations for competitiveness are reviewed in the process. These fundamentals help to understand why the industry locates in certain geographic regions of the world, which are often remote from end-use markets. Chief among these fundamentals is access to an adequate supply of appropriately priced feedstock; this is the single most important reason why chemical companies continue to emphasize developments in areas of the world where feedstocks are advantageously priced. The cost of operations is equally significant. Cost depends not so much on location but on the scale of operations, hence the tendency towards large-scale plants. Plant and product rationalization, technology and product development synergies and leverage with suppliers are all opportunities for cost reduction throughout the product supply chain. The combination of lower natural gas cost in Alberta, the lower fixed cost of extraction and the economies of scale achieved by large-scale operation (five billion pounds per year of polyethylene production capacity) are the crucial factors that will enable NOVA Chemicals to maintain its competitive position and to weather the highs and lows in industry price fluctuations

  10. The Holy Text and Violence : Levinas and Fundamentalism

    NARCIS (Netherlands)

    Poorthuis, Marcel; Breitlin, Andris; Bremmers, Chris; Cools, Arthur

    2015-01-01

    Levinas' rejection of a historical-critical approach to sacred texts, as well as his depreciation of Spinoza's view of the Bible, might bring him close to fundamentalism. A thorough analysis is necessary to demonstrate the essential differences.

  11. An Approach to the Prototyping of an Optimized Limited Stroke Actuator to Drive a Low Pressure Exhaust Gas Recirculation Valve

    Directory of Open Access Journals (Sweden)

    Christophe Gutfrind

    2016-05-01

    Full Text Available The purpose of this article is to describe the design of a limited stroke actuator and the corresponding prototype to drive a Low Pressure (LP Exhaust Gas Recirculation (EGR valve for use in Internal Combustion Engines (ICEs. The direct drive actuator topology is an axial flux machine with two air gaps in order to minimize the rotor inertia and a bipolar surface-mounted permanent magnet in order to respect an 80° angular stroke. Firstly, the actuator will be described and optimized under constraints of a 150 ms time response, a 0.363 N·m minimal torque on an angular range from 0° to 80° and prototyping constraints. Secondly, the finite element method (FEM using the FLUX-3D® software (CEDRAT, Meylan, France will be used to check the actuator performances with consideration of the nonlinear effect of the iron material. Thirdly, a prototype will be made and characterized to compare its measurement results with the analytical model and the FEM model results. With these electromechanical behavior measurements, a numerical model is created with Simulink® in order to simulate an EGR system with this direct drive actuator under all operating conditions. Last but not least, the energy consumption of this machine will be estimated to evaluate the efficiency of the proposed EGR electromechanical system.

  12. A limited sampling approach in bioequivalence studies: application to long half-life drugs and replicate design studies.

    Science.gov (United States)

    Mahmood, I; Mahayni, H

    1999-06-01

    The objective of this study was to develop a limited sampling model (LSM) to predict the area under the curve (AUC) and the maximum plasma concentration (Cmax) for the assessment of bioequivalence studies. Two drugs (A and B) were selected for this purpose. Drug A was chosen to test the bioequivalence of two formulations with a long half-life (> 35 hours), whereas drug B was chosen to test the bioequivalence of two formulations (half-life = 12 hrs) with a replicate design study. The LSM for both drugs was developed using 5 blood samples each from 15 healthy subjects. The relationship between the plasma concentrations (independent variables) at selected time points and the AUC or Cmax (dependent variable) was evaluated by multiple linear regression analysis. The multiple linear regression which gave the best correlation coefficient (r) for 5 sampling times vs AUC or Cmax was chosen as the LSM. The predicted AUC and Cmax from the LSM were then used to assess the bioequivalence of two different formulations of each drug following a single oral dose. The model provided good estimates of both AUC and Cmax for both drugs. The 90% confidence intervals on log-transformed observed and predicted AUC and Cmax were comparable for both drugs. The method described here may be used to estimate AUC and Cmax in bioequivalence studies for drugs with long half-lives or for highly variable drugs which may require replicate design studies, without detailed blood sampling.
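
    The model-building step described here is ordinary multiple linear regression of AUC (or Cmax) on the concentrations measured at the five chosen sampling times. A minimal sketch on synthetic data (all numbers are invented; a real analysis would substitute the 15 subjects' observed concentrations and their reference AUC values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: concentrations at the 5 sampling times
# (columns) for 15 subjects (rows) and their reference AUC values.
conc = rng.lognormal(mean=1.0, sigma=0.3, size=(15, 5))
auc_ref = conc @ np.array([2.0, 4.0, 6.0, 8.0, 5.0]) + rng.normal(0.0, 1.0, 15)

# Limited sampling model: AUC = b0 + b1*C(t1) + ... + b5*C(t5),
# fitted by ordinary least squares.
X = np.column_stack([np.ones(15), conc])
coef, *_ = np.linalg.lstsq(X, auc_ref, rcond=None)

auc_pred = X @ coef
r = np.corrcoef(auc_ref, auc_pred)[0, 1]
print(f"correlation coefficient r = {r:.3f}")  # model-selection criterion
```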

  13. Flux Limiter Lattice Boltzmann Scheme Approach to Compressible Flows with Flexible Specific-Heat Ratio and Prandtl Number

    International Nuclear Information System (INIS)

    Gan Yanbiao; Li Yingjun; Xu Aiguo; Zhang Guangcai

    2011-01-01

    We further develop the lattice Boltzmann (LB) model [Physica A 382 (2007) 502] for compressible flows in two aspects. Firstly, we modify the Bhatnagar-Gross-Krook (BGK) collision term in the LB equation, which makes the model suitable for simulating flows with different Prandtl numbers. Secondly, the flux limiter finite difference (FLFD) scheme is employed to calculate the convection term of the LB equation, which effectively suppresses unphysical oscillations at discontinuities and significantly diminishes numerical dissipation. The proposed model is validated by recovering results of some well-known benchmarks, including (i) the thermal Couette flow, and (ii) one- and two-dimensional Riemann problems. Good agreement is obtained between the LB results and the exact ones or previously reported solutions. The flexibility, together with the high accuracy, of the new model endows it with considerable potential for tracking some long-standing problems and for investigating nonlinear nonequilibrium complex systems. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)
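
    The role of the flux limiter can be seen in a stripped-down setting: a minmod-limited second-order upwind update for linear advection keeps a discontinuity sharp without the over- and undershoots of an unlimited scheme. The sketch below is a generic flux-limiter ingredient of FLFD-type schemes, not the paper's full lattice Boltzmann model:

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: the smaller-magnitude slope when the signs
    agree, zero at extrema, suppressing spurious oscillations."""
    return np.where(a * b > 0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def advect_step(f, c):
    """One flux-limited update of df/dt + a*df/dx = 0 on a periodic
    grid, with Courant number c = a*dt/dx (0 < c <= 1, a > 0)."""
    df_minus = f - np.roll(f, 1)            # backward differences
    df_plus = np.roll(f, -1) - f            # forward differences
    slope = minmod(df_minus, df_plus)       # limited slope in each cell
    f_face = f + 0.5 * (1.0 - c) * slope    # upwind-biased face values
    return f - c * (f_face - np.roll(f_face, 1))

x = np.arange(200)
f = np.where(np.abs(x - 50) < 10, 1.0, 0.0)  # square pulse
for _ in range(100):
    f = advect_step(f, c=0.5)
print(f"min = {f.min():.4f}, max = {f.max():.4f}")  # stays within [0, 1]
```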

  14. Maximizing the benefit of health workforce secondment in Botswana: an approach for strengthening health systems in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Grignon JS

    2014-05-01

    Full Text Available To address health systems challenges in limited-resource settings, global health initiatives, particularly the President's Emergency Plan for AIDS Relief (PEPFAR), have seconded health workers to the public sector. Implementation considerations for secondment as a health workforce development strategy are not well documented. The purpose of this article is to present outcomes, best practices, and lessons learned from a PEPFAR-funded secondment program in Botswana. Outcomes are documented across four World Health Organization health systems' building blocks. Best practices include documentation of joint stakeholder expectations, collaborative recruitment, and early identification of counterparts. Lessons learned include inadequate ownership, a two-tier employment system, and ill-defined position duration. These findings can inform program and policy development to maximize the benefit of health workforce secondment. Secondment requires substantial investment, and emphasis should be placed on high-level technical positions responsible for building systems, developing health workers, and strengthening government to translate policy into programs. Keywords: human resources, health policy, health worker, HIV/AIDS, PEPFAR

  15. The specific diagnosis of gastrointestinal nematode infections in livestock: larval culture technique, its limitations and alternative DNA-based approaches.

    Science.gov (United States)

    Roeber, Florian; Kahn, Lewis

    2014-10-15

    The specific diagnosis of gastrointestinal nematode infections in ruminants is routinely based on the larval culture technique and on the morphological identification of developed third-stage larvae. However, research on the ecology and developmental requirements of different species suggests that the environmental conditions (e.g., temperature and humidity) for optimal development vary between species. Thus, employing a common culture protocol for all species will favour the development of certain species over others and can bias the result, in particular when species proportions in a mixed infection are to be determined. Furthermore, the morphological identification of L3 larvae is complicated by a lack of distinctive, obvious features that would allow the identification of all key species. In the present paper we review in detail the potential limitations of the larval culture technique and morphological identification, and provide an account of some modern molecular alternatives for the specific diagnosis of gastrointestinal nematode infection in ruminants. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Atom counting in HAADF STEM using a statistical model-based approach: methodology, possibilities, and inherent limitations.

    Science.gov (United States)

    De Backer, A; Martinez, G T; Rosenauer, A; Van Aert, S

    2013-11-01

    In the present paper, a statistical model-based method to count the number of atoms of monotype crystalline nanostructures from high resolution high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) images is discussed in detail together with a thorough study on the possibilities and inherent limitations. In order to count the number of atoms, it is assumed that the total scattered intensity scales with the number of atoms per atom column. These intensities are quantitatively determined using model-based statistical parameter estimation theory. The distribution describing the probability that intensity values are generated by atomic columns containing a specific number of atoms is inferred on the basis of the experimental scattered intensities. Finally, the number of atoms per atom column is quantified using this estimated probability distribution. The number of atom columns available in the observed STEM image, the number of components in the estimated probability distribution, the width of the components of the probability distribution, and the typical shape of a criterion to assess the number of components in the probability distribution directly affect the accuracy and precision with which the number of atoms in a particular atom column can be estimated. It is shown that single atom sensitivity is feasible taking the latter aspects into consideration. © 2013 Elsevier B.V. All rights reserved.
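
    The counting step described above amounts to fitting a mixture of Gaussian components to the column intensities, selecting the number of components with a model-selection criterion, and mapping each component, ordered by mean, to an atom count. A simplified sketch on synthetic intensities, using BIC in place of the dedicated criterion studied in the paper (all values invented; assumes scikit-learn is available):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic column intensities: columns of 1, 2, or 3 atoms, with the
# total scattered intensity increasing with thickness (values invented).
rng = np.random.default_rng(2)
intensities = np.concatenate([
    rng.normal(1.0, 0.08, 40),   # 1-atom columns
    rng.normal(1.9, 0.08, 30),   # 2-atom columns
    rng.normal(2.7, 0.08, 20),   # 3-atom columns
]).reshape(-1, 1)

# Select the number of mixture components with an information criterion.
models = [GaussianMixture(n, random_state=0).fit(intensities)
          for n in range(1, 7)]
best = min(models, key=lambda m: m.bic(intensities))
print("estimated number of components:", best.n_components)

# Order components by mean so the component index maps to an atom count.
order = np.argsort(best.means_.ravel())
rank = np.empty_like(order)
rank[order] = np.arange(best.n_components)
atom_counts = rank[best.predict(intensities)] + 1
print("first ten column counts:", atom_counts[:10])
```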

  17. Microelectronics from fundamentals to applied design

    CERN Document Server

    Di Paolo Emilio, Maurizio

    2016-01-01

    This book serves as a practical guide for practicing engineers who need to design analog circuits for microelectronics. Readers will develop a comprehensive understanding of the basic techniques of modern analog electronic circuit design, discrete and integrated, applications such as sensors and control and data acquisition systems, and techniques of PCB design. It describes the fundamentals of microelectronics design in an accessible manner; takes a problem-solving approach to the topic, offering a hands-on guide for practicing engineers; provides realistic examples to inspire a thorough understanding of system-level issues, before going into the detail of components and devices; and uses a new approach that provides several skills that help engineers and designers retain key and advanced concepts.

  18. Determination of lead and cadmium concentration limits in agricultural soil and municipal solid waste compost through an approach of zero tolerance to food contamination.

    Science.gov (United States)

    Saha, Jayanta Kumar; Panwar, N R; Singh, M V

    2010-09-01

    Cadmium and lead are important environmental pollutants with high toxicity to animals and humans. Though soils have considerable metal-immobilizing capability, they can contaminate the food chain via plants grown upon them when these metals build up to a large extent. The present experiment was carried out with the objective of quantifying the limits of Pb and Cd loading in soil for the purpose of preventing food chain contamination beyond background concentration levels. Two separate sets of pot experiments were carried out for these two heavy metals, with graded application doses of Pb at 0.4-150 mg/kg and Cd at 0.02-20 mg/kg to an acidic light-textured alluvial soil. A spinach crop was grown for 50 days on these treated soils after a stabilization period of 2 months. Upper limits of the background concentration levels (C(ul)) of these metals were calculated through a statistical approach from the heavy metal concentration values in leaves of spinach crops grown in farmers' fields. Lead and Cd concentration limits in soil were calculated by dividing C(ul) by the uptake response slope obtained from the pot experiment. Cumulative loading limits (concentration limits in soil minus contents in uncontaminated soil) for the experimental soil were estimated to be 170 kg Pb/ha and 0.8 kg Cd/ha. Based on certain assumptions on application rate and the computed cumulative loading limit values, maximum permissible Pb and Cd concentration values in municipal solid waste (MSW) compost were proposed as 170 mg Pb/kg and 0.8 mg Cd/kg, respectively. In view of these limiting values, about 56% and 47% of the MSW compost samples from different cities are found to contain Pb and Cd in the safe range.
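
    The arithmetic behind these limits is a division followed by a unit conversion: the soil concentration limit is C(ul) divided by the uptake response slope, and the cumulative loading limit converts the concentration headroom into a per-hectare mass using an assumed plough-layer soil mass. A back-of-envelope sketch in which every number is a placeholder, not a value from the study:

```python
# All numbers below are assumed placeholders for illustration only.
C_UL = 0.3             # upper background limit in spinach leaves, mg/kg
UPTAKE_SLOPE = 0.004   # slope of leaf conc. vs soil conc., (mg/kg)/(mg/kg)
BACKGROUND_SOIL = 0.2  # metal already present in uncontaminated soil, mg/kg
SOIL_MASS_HA = 2.24e6  # kg of soil per hectare plough layer (common assumption)

soil_limit = C_UL / UPTAKE_SLOPE                # mg metal per kg soil
loading_kg_ha = (soil_limit - BACKGROUND_SOIL) * SOIL_MASS_HA * 1e-6
print(f"soil concentration limit: {soil_limit:.1f} mg/kg")
print(f"cumulative loading limit: {loading_kg_ha:.1f} kg/ha")
```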

  19. Approach to Recover Hydrocarbons from Currently Off-Limit Areas of the Antrim Formation, MI Using Low-Impact Technologies

    Energy Technology Data Exchange (ETDEWEB)

    James Wood; William Quinlan

    2008-09-30

    The goal of this project was to develop and execute a novel drilling and completion program in the Antrim Shale near the western shoreline of Northern Michigan. The target was the gas in the Lower Antrim Formation (Upper Devonian). Another goal was to see if drilling permits could be obtained from the Michigan DNR that would allow exploitation of reserves currently off-limits to exploration. This project met both of these goals: the DNR (Michigan Department of Natural Resources) issued permits that allow drilling the shallow subsurface for exploration and production. This project obtained drilling permits for the original demonstration well AG-A-MING 4-12 HD (API: 21-009-58153-0000) and AG-A-MING 4-12 HD1 (API: 21-009-58153-0100), as well as for similar Antrim wells in Benzie County, MI, the Colfax 3-28 HD and nearby Colfax 2-28 HD, which were substituted for the AG-A-MING well. This project also developed successful techniques and strategies for producing the shallow gas. In addition to the project demonstration well, over 20 wells have been drilled to date into the shallow Antrim as a result of this project's findings. Further, fracture stimulation has proven to be a vital step in improving the deliverability of wells to make them commercial. Our initial plan was very simple: the 'J-well' design. We proposed to drill a vertical or slant well 30.48 meters (100 feet) below the glacial drift, set the required casing, then angle back up to tap the resource lying between the base of the drift and the conventional vertical well. The 'J-well' design was tested at Mancelona Township in Antrim County in February of 2007 with the St. Mancelona 2-12 HD 3.

  20. Coherence and diffraction limited resolution in microscopic OCT by a unified approach for the correction of dispersion and aberrations

    Science.gov (United States)

    Schulz-Hildebrandt, H.; Münter, Michael; Ahrens, M.; Spahr, H.; Hillmann, D.; König, P.; Hüttmann, G.

    2018-03-01

    Optical coherence tomography (OCT) images scattering tissues with 5 to 15 μm resolution. This is usually not sufficient for a distinction of cellular and subcellular structures. Increased axial and lateral resolution and compensation of artifacts caused by dispersion and aberrations are required to achieve cellular and subcellular resolution. This includes defocus, which limits the usable depth of field at high lateral resolution. OCT gives access to the phase of the scattered light, and hence correction of dispersion and aberrations is possible by numerical algorithms. Here we present a unified dispersion/aberration correction which is based on a polynomial parameterization of the phase error and an optimization of the image quality using Shannon's entropy. For validation, a supercontinuum light source and a custom-made spectrometer with 400 nm bandwidth were combined with a high-NA microscope objective in a setup for tissue and small animal imaging. Using this setup and computational corrections, volumetric imaging at 1.5 μm resolution is possible. Cellular and near-cellular resolution is demonstrated in porcine cornea and the Drosophila larva when computational correction of dispersion and aberrations is used. Due to the excellent correction of the microscope objective used, defocus was the main contribution to the aberrations. In addition, higher-order aberrations caused by the sample itself were successfully corrected. Dispersion and aberrations are closely related artifacts in microscopic OCT imaging; hence they can be corrected in the same way by optimization of the image quality. This way, microscopic resolution is easily achieved in OCT imaging of static biological tissues.
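
    The strategy described here, parameterizing the phase error by a polynomial and minimizing the Shannon entropy of the resulting image, can be sketched as follows (Python; the quadratic-plus-tilt phase model, the random test field and the optimizer choice are illustrative assumptions, not the authors' implementation):

        # Sketch: estimate phase-error coefficients by minimizing the Shannon
        # entropy of the corrected image; a sharp image has low entropy.
        import numpy as np
        from scipy.optimize import minimize

        def correct(field_ft, coeffs, kx, ky):
            """Apply a (quadratic + tilt) phase-error model in Fourier space."""
            phase = coeffs[0] * (kx**2 + ky**2) + coeffs[1] * kx + coeffs[2] * ky
            return np.fft.ifft2(field_ft * np.exp(1j * phase))

        def entropy(coeffs, field_ft, kx, ky):
            img = np.abs(correct(field_ft, coeffs, kx, ky))**2
            p = img / img.sum()
            return -(p * np.log(p + 1e-12)).sum()

        n = 64                                   # toy complex en-face plane
        rng = np.random.default_rng(1)
        field = rng.normal(size=(n, n)) * np.exp(1j * rng.uniform(0, 2*np.pi, (n, n)))
        kx, ky = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n))
        field_ft = np.fft.fft2(field)

        res = minimize(entropy, x0=np.zeros(3), args=(field_ft, kx, ky),
                       method="Nelder-Mead")
        print(res.x)                             # estimated phase coefficients

    Because defocus and dispersion both enter as phase polynomials, the same objective corrects both, which is the unification the title refers to.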

  1. Selection of optimal recording sites for limited lead body surface potential mapping: A sequential selection based approach

    Directory of Open Access Journals (Sweden)

    McCullagh Paul J

    2006-02-01

    Background: In this study we propose the development of a new algorithm for selecting optimal recording sites for limited lead body surface potential mapping. The proposed algorithm differs from previously reported methods in that it is based upon a simple and intuitive data-driven technique that does not make any presumptions about deterministic characteristics of the data. It uses a forward-selection-based search technique to find the best combination of electrocardiographic leads. Methods: The study was conducted using a dataset consisting of body surface potential maps (BSPM) recorded from 116 subjects, which included 59 normals and 57 subjects exhibiting evidence of old Myocardial Infarction (MI). The performance of the algorithm was evaluated using spatial RMS voltage error and correlation coefficient to compare original and reconstructed map frames. Results: In all, three configurations of the algorithm were evaluated, and it was concluded that there was little difference in the performance of the various configurations. In addition to observing the performance of the selection algorithm, several lead subsets of 32 electrodes as chosen by the various configurations of the algorithm were evaluated. The rationale for choosing this number of recording sites was to allow comparison with a previous study that used a different algorithm, where 32 leads were deemed to provide an acceptable level of reconstruction performance. Conclusion: It was observed that although the lead configurations suggested in this study were not identical to that suggested in the previous work, the systems did bear similar characteristics in that recording sites were chosen with greatest density in the precordial region.
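
    The forward-selection search described above can be sketched as a greedy loop scored by reconstruction error (Python; the synthetic data and the linear least-squares reconstruction step are assumptions for illustration, not necessarily the paper's scoring function):

        # Sketch: greedy forward selection of recording sites; each candidate
        # set is scored by least-squares reconstruction of the full map.
        import numpy as np

        def forward_select(maps, n_leads):
            """maps: (n_subjects, n_electrodes) array of BSPM frames."""
            chosen, remaining = [], list(range(maps.shape[1]))
            for _ in range(n_leads):
                def rms(cand):
                    sel = maps[:, chosen + [cand]]
                    coef, *_ = np.linalg.lstsq(sel, maps, rcond=None)
                    return np.sqrt(np.mean((sel @ coef - maps) ** 2))
                best = min(remaining, key=rms)
                chosen.append(best)
                remaining.remove(best)
            return chosen

        maps = np.random.default_rng(3).normal(size=(116, 192))  # synthetic
        print(forward_select(maps, 8))

    Greedy selection is not guaranteed to find the globally best subset, which is one reason the paper compares several configurations of the algorithm.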

  2. SU-E-T-378: Limits and Possibilities of a Simplistic Approach to Whole Breast Radiation Therapy Planning

    Energy Technology Data Exchange (ETDEWEB)

    Hipp, E; Osa, E; No, H; Jozsef, G [NYULMC Clinical Cancer Center, NY, NY (United States); Rosman, M [NYU School of Medicine, NY, NY (United States); Formenti, S [NYULMC Clinical Cancer Center, NY, NY (United States); NYU School of Medicine, NY, NY (United States)

    2014-06-01

    Purpose: Challenges for radiation therapy in developing countries include unreliable infrastructure and high patient load. We propose a system to treat the whole breast in the prone position without computed tomography and/or planning software. Methods: Six parameters are measured using calipers and levels with the patient prone in the treatment position: (1) the largest separation; (2) the angle that separation makes with the horizontal; (3) the separation 2 cm posterior to the nipple; (4) the vertical distance between these two separations; (5) the sup/inf length; and (6) the angle of the desired posterior field edge. The data in (5), (6) and (2) provide field length, collimator and gantry angles. The isocenter is set to the midpoint of (1), the anterior jaw setting is 20 cm (half-beam setup), and the dose is prescribed to a point 1.5 cm anterior to the isocenter. MUs and wedge angles are calculated using an MU calculator, by requiring 100% dose at that point and 100-105% at the midpoint of (3). Measurements on 30 CT scans were taken to obtain data (1)-(6). To test the resulting MU/wedge combinations, they were entered into Eclipse (Varian) and dose distributions were calculated. The MU/wedge combinations were recorded and tabulated. Results: In a dose volume histogram analysis, the contoured breast V95 was 90.5%, and the average V90 was 94.1%. The maximum dose never exceeded 114.5% (average 108%). The lung V20 was <5% for 96.7% of our sample, and the heart V5 was <10% for 93.3%. Conclusion: A method to provide prone whole breast treatment without CT planning was developed. The method provides reasonable coverage and normal tissue sparing. This approach is not recommended if imaging and planning capabilities are available; it was designed specifically to avoid the need for CT planning and should be reserved for clinics that need to avoid that step.
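
    The mapping from the six caliper/level measurements to beam parameters is simple bookkeeping; a sketch (Python; the field names, numeric example and derived prescription depth are illustrative, and the MU/wedge step is deliberately left to an MU calculator as in the abstract):

        # Sketch of the geometric bookkeeping only (no dosimetry): the six
        # measurements listed above mapped to beam parameters.
        from dataclasses import dataclass

        @dataclass
        class ProneBreastSetup:
            largest_sep_cm: float        # (1)
            sep_angle_deg: float         # (2) angle of (1) with the horizontal
            nipple_sep_cm: float         # (3) separation 2 cm posterior to nipple
            vert_dist_cm: float          # (4) vertical distance between (1) and (3)
            si_length_cm: float          # (5) sup/inf length
            post_edge_angle_deg: float   # (6) angle of posterior field edge

            def beam_parameters(self):
                return {
                    "field_length_cm": self.si_length_cm,        # from (5)
                    "collimator_deg": self.post_edge_angle_deg,  # from (6)
                    "gantry_deg": self.sep_angle_deg,            # from (2)
                    "anterior_jaw_cm": 20.0,                     # half-beam setup
                    # isocenter at the midpoint of (1); dose prescribed
                    # 1.5 cm anterior to it:
                    "rx_depth_cm": self.largest_sep_cm / 2 - 1.5,
                }

        print(ProneBreastSetup(22, 10, 16, 4, 18, 12).beam_parameters())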

  3. A spectral expansion approach for geodetic slip inversion: implications for the downdip rupture limits of oceanic and continental megathrust earthquakes

    Science.gov (United States)

    Xu, Xiaohua; Sandwell, David T.; Bassett, Dan

    2018-01-01

    We have developed a data-driven spectral expansion inversion method to place bounds on the downdip rupture depth of large megathrust earthquakes having good InSAR and GPS coverage. This inverse theory approach is used to establish the set of models that are consistent with the observations. In addition, the inverse theory method demonstrates that the spatial resolution of the slip models depends on two factors: the spatial coverage and accuracy of the surface deformation measurements, and the slip depth. Application of this method to the 2010 Mw 8.8 Maule earthquake shows a slip maximum at 19 km depth tapering to zero at ~40 km depth. In contrast, continent-continent megathrust earthquakes of the Himalayas, for example the 2015 Mw 7.8 Gorkha earthquake, show a slip maximum at 9 km depth tapering to zero at ~18 km depth. The main question is why the maximum slip depth of the continental megathrust earthquakes is only 50 per cent of that observed in oceanic megathrust earthquakes. To understand this difference, we have developed a simple 1-D heat conduction model that includes the effects of uplift and surface erosion. The relatively low erosion rates above the ocean megathrust result in a geotherm where the 450-600 °C transition is centred at ~40 km depth. In contrast, the relatively high average erosion rates in the Himalayas of ~1 mm yr-1 result in a geotherm where the 450-600 °C transition is centred at ~20 km. Based on these new observations and models, we suggest that the effect of erosion rate on temperature explains the difference in the maximum depth of the seismogenic zone between Chile and the Himalayas.
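
    The erosion effect invoked here follows from a steady 1-D advection-diffusion geotherm, kappa*T'' + v*T' = 0, with rock advected toward the eroding surface at speed v. A sketch (Python; the diffusivity, basal temperature and the neglect of heat production are simplifying assumptions, not the authors' full model):

        # Sketch: steady geotherm with erosion, T(0) = 0 at the surface and a
        # fixed temperature at the base of the column (erosion rate must be > 0).
        import numpy as np

        def geotherm(erosion_m_per_yr, kappa=1e-6, depth_km=100.0, t_base=1300.0):
            v = erosion_m_per_yr / 3.15576e7          # m/yr -> m/s, toward surface
            z = np.linspace(0.0, depth_km * 1e3, 2001)
            pe = v * z / kappa
            # analytic solution; erosion advects hot rock toward the surface
            t = t_base * np.expm1(-pe) / np.expm1(-pe[-1])
            return z / 1e3, t

        z, t = geotherm(1e-3)                         # ~1 mm/yr, Himalaya-like
        for iso in (450.0, 600.0):
            print(iso, "C reached near", round(float(np.interp(iso, t, z)), 1), "km")

    Re-running with a much smaller erosion rate pushes the 450-600 °C band to roughly twice the depth, which is the sense of the contrast drawn in the abstract.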

  4. XPS depth profiling of derivatized amine and anhydride plasma polymers: Evidence of limitations of the derivatization approach

    Energy Technology Data Exchange (ETDEWEB)

    Manakhov, Anton, E-mail: ant-manahov@ya.ru [National University of Science and Technology “MISiS”, Leninsky pr. 4, Moscow 119049 (Russian Federation); RG Plasma Technologies, CEITEC – Masaryk University, Purkyňova 123, Brno 61200 (Czech Republic); Michlíček, Miroslav [RG Plasma Technologies, CEITEC – Masaryk University, Purkyňova 123, Brno 61200 (Czech Republic); Department of Physical Electronics, Faculty of Science, Masaryk University, Kotlářská, 2, Brno 61137 (Czech Republic); Felten, Alexandre; Pireaux, Jean-Jacques [LISE, Department of Physics, University of Namur, Rue de Bruxelles, 61, Namur B5000 (Belgium); Nečas, David [RG Plasma Technologies, CEITEC – Masaryk University, Purkyňova 123, Brno 61200 (Czech Republic); Zajíčková, Lenka [RG Plasma Technologies, CEITEC – Masaryk University, Purkyňova 123, Brno 61200 (Czech Republic); Department of Physical Electronics, Faculty of Science, Masaryk University, Kotlářská, 2, Brno 61137 (Czech Republic)

    2017-02-01

    … the smaller trifluoroethylamine (TFEA) led to a more homogeneous depth profile. The data analysis suggests that the size of the derivatizing molecule is the main factor, showing that the very limited permeation of the TFBA molecule can lead to underestimated densities of primary amines if the XPS analysis is solely carried out at a low take-off angle. In contrast, TFEA is found to be an efficient derivatization agent for anhydride groups, with high permeability through the carboxyl-anhydride layer.

  5. Country-wide assessment of estuary health: An approach for integrating pressures and ecosystem response in a data limited environment

    Science.gov (United States)

    Van Niekerk, L.; Adams, J. B.; Bate, G. C.; Forbes, A. T.; Forbes, N. T.; Huizinga, P.; Lamberth, S. J.; MacKay, C. F.; Petersen, C.; Taljaard, S.; Weerts, S. P.; Whitfield, A. K.; Wooldridge, T. H.

    2013-09-01

    Population and development pressures increase the need for proactive strategic management on a regional or country-wide scale - reactively protecting ecosystems on an estuary-by-estuary basis against multiple pressures is 'resource hungry' and not feasible. Proactive management requires a strategic assessment of health so that the most suitable return on investment can be made. A country-wide assessment of the nearly 300 functional South African estuaries examined both key pressures (freshwater inflow modification, water quality, artificial breaching of temporarily open/closed systems, habitat modification and exploitation of living resources) and health state. The method used to assess the type and level of the different pressures, as well as the ecological health status, of a large number of estuaries in a data-limited environment is described in this paper. Key pressures and the ecological condition of estuaries on a national scale are summarised. The template may also be used to provide guidance to coastal researchers attempting to inform management in other developing countries. The assessment was primarily aimed at decision makers both inside and outside the biodiversity sector. A key starting point was to delineate spatially the estuary functional zone (area) for every system. In addition, available data on pressures impacting estuaries on a national scale were collated. A desktop national health assessment, based on an Estuarine Health Index developed for South African ecological water requirement studies, was then applied systematically. National experts, all familiar with the index, evaluated the estuaries in their region. Individual estuarine health assessment scores were then translated into health categories that reflect the overall status of South Africa's estuaries. The results showed that estuaries in the warm-temperate biogeographical zone are healthier than those in the cool-temperate and subtropical zones, largely reflecting the country…

  6. Fundamental solutions of singular SPDEs

    Energy Technology Data Exchange (ETDEWEB)

    Selesi, Dora, E-mail: dora@dmi.uns.ac.rs [Department of Mathematics and Informatics, University of Novi Sad (Serbia)

    2011-07-15

    Highlights: • Fundamental solutions of linear SPDEs are constructed. • Wick-convolution product is introduced for the first time. • Fourier transformation maps Wick-convolution into Wick product. • Solutions of linear SPDEs are expressed via Wick-convolution with fundamental solutions. • Stochastic Helmholtz equation is solved. - Abstract: This paper deals with some models of mathematical physics, where random fluctuations are modeled by white noise or other singular Gaussian generalized processes. White noise, as the distributional derivative of Brownian motion, which is the most important case of a Lévy process, is defined in the framework of Hida distribution spaces. The Fourier transformation in the framework of singular generalized stochastic processes is introduced, and its applications to solving stochastic differential equations involving Wick products and singularities such as the Dirac delta distribution are presented. Explicit solutions are obtained in the form of a chaos expansion in the Kondratiev white noise space, while the coefficients of the expansion are tempered distributions. Stochastic differential equations of the form P(ω, D) ◊ u(x, ω) = A(x, ω) are considered, where A is a singular generalized stochastic process and P(ω, D) is a partial differential operator with random coefficients. We introduce the Wick-convolution operator * which enables us to express the solution as u = s*A ◊ I^◊(-1), where s denotes the fundamental solution and I is the unit random variable. In particular, the stochastic Helmholtz equation is solved, which in physical interpretation describes waves propagating with a random speed from randomly appearing point sources.
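
    In display form, the equation class and its solution read (transcribing the abstract's notation, with ◊ the Wick product, * the Wick convolution and I^{◊(-1)} the Wick inverse of the unit random variable):

        \[
          P(\omega, D)\, \lozenge\, u(x, \omega) = A(x, \omega),
          \qquad
          u = s \ast A \,\lozenge\, I^{\lozenge(-1)}.
        \]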

  7. Fundamental solutions of singular SPDEs

    International Nuclear Information System (INIS)

    Selesi, Dora

    2011-01-01

    Highlights: → Fundamental solutions of linear SPDEs are constructed. → Wick-convolution product is introduced for the first time. → Fourier transformation maps Wick-convolution into Wick product. → Solutions of linear SPDEs are expressed via Wick-convolution with fundamental solutions. → Stochastic Helmholtz equation is solved. - Abstract: This paper deals with some models of mathematical physics, where random fluctuations are modeled by white noise or other singular Gaussian generalized processes. White noise, as the distributional derivative of Brownian motion, which is the most important case of a Lévy process, is defined in the framework of Hida distribution spaces. The Fourier transformation in the framework of singular generalized stochastic processes is introduced, and its applications to solving stochastic differential equations involving Wick products and singularities such as the Dirac delta distribution are presented. Explicit solutions are obtained in the form of a chaos expansion in the Kondratiev white noise space, while the coefficients of the expansion are tempered distributions. Stochastic differential equations of the form P(ω, D) ◊ u(x, ω) = A(x, ω) are considered, where A is a singular generalized stochastic process and P(ω, D) is a partial differential operator with random coefficients. We introduce the Wick-convolution operator * which enables us to express the solution as u = s*A ◊ I^◊(-1), where s denotes the fundamental solution and I is the unit random variable. In particular, the stochastic Helmholtz equation is solved, which in physical interpretation describes waves propagating with a random speed from randomly appearing point sources.

  8. Frontiers of Fundamental Physics 14

    Science.gov (United States)

    The 14th annual international symposium "Frontiers of Fundamental Physics" (FFP14) was organized by the OCEVU Labex. It was held in Marseille, on the Saint-Charles Campus of Aix Marseille University (AMU), and had over 280 participants coming from all over the world. The FFP symposium series began in India in 1997 and became itinerant in 2004, moving through Europe, Canada and Australia. It covers topics in fundamental physics with the objective of enabling scholars working in related areas to meet on a single platform and exchange ideas. In addition to highlighting the progress in these areas, the symposium invites top researchers to reflect on the educational aspects of our discipline. Moreover, the scientific concepts are also discussed through philosophical and epistemological viewpoints. Several eminent scientists, such as laureates of prestigious awards (Nobel Prize, Fields Medal, …), have already participated in these meetings. The FFP14 symposium developed around seven main themes, namely: Astroparticle Physics, Cosmology, High Energy Physics, Quantum Gravity, Mathematical Physics, Physics Education, and Epistemology and Philosophy. The morning was devoted to the plenary session, with talks for a broad audience of physicists in its first half (9:00-10:30) and more specialized talks in its second half (11:00-12:30); this part was held in three amphitheaters. The parallel sessions of the symposium took place during the afternoon (14:30-18:30), with seven thematic conferences and an additional conference on open topics named "Frontiers of Fundamental Physics". These eight conferences were organized around the contributions of participants, in addition to those of invited speakers. Altogether, there were some 250 contributions to the symposium (talks and posters). The plenary talks were webcast live and recorded. The slides of the talks and the videos of the plenary talks are available from the symposium web site: http://ffp14.cpt.univ-mrs.fr/

  9. Photovoltaics fundamentals, technology and practice

    CERN Document Server

    Mertens, Konrad

    2013-01-01

    Concise introduction to the basic principles of solar energy, photovoltaic systems, photovoltaic cells, photovoltaic measurement techniques, and grid-connected systems, overviewing the potential of photovoltaic electricity for students and engineers new to the topic. After a brief introduction to the history of photovoltaics and the most important facts, Chapter 1 presents the subject of radiation, covering the properties of solar radiation, the available radiation, and world energy consumption. Chapter 2 looks at the fundamentals of semiconductor physics. It discusses the build-up of semiconductors…

  10. Solid Lubrication Fundamentals and Applications

    Science.gov (United States)

    Miyoshi, Kazuhisa

    2001-01-01

    Solid Lubrication Fundamentals and Applications provides a description of the adhesion, friction, abrasion, and wear behavior of solid film lubricants and related tribological materials, including diamond and diamond-like solid films. The book details the properties of solid surfaces, clean surfaces, and contaminated surfaces, as well as discussing the structures and mechanical properties of natural and synthetic diamonds; chemical-vapor-deposited diamond film; and surface design and engineering toward wear-resistant, self-lubricating diamond films and coatings. The author provides selection and design criteria as well as applications for synthetic and natural coatings in the commercial, industrial and aerospace industries.

  11. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2014-01-01

    A classic now in its 14th edition, Communication Technology Update and Fundamentals is the single best resource for students and professionals looking to brush up on how these technologies have developed, grown, and converged, as well as what's in store for the future. It begins by developing the communication technology framework - the history, ecosystem, and structure - then delves into each type of technology, including everything from mass media, to computers and consumer electronics, to networking technologies. Each chapter is written by faculty and industry experts …

  12. Fundamentals of gas particle flow

    CERN Document Server

    Rudinger, G

    1980-01-01

    Fundamentals of Gas-Particle Flow is an edited, updated, and expanded version of a number of lectures presented on the "Gas-Solid Suspensions" course organized by the von Karman Institute for Fluid Dynamics. Materials presented in this book are mostly analytical in nature, but some experimental techniques are included. The book focuses on relaxation processes, including the viscous drag of single particles, drag in gas-particle flow, gas-particle heat transfer, equilibrium, and frozen flow. It also discusses the dynamics of single particles, such as particles in an arbitrary flow …

  13. Computing fundamentals digital literacy edition

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    Computing Fundamentals has been tailor-made to help you get up to speed on your computing basics and help you become proficient in entry-level computing skills. Covering all the key topics, it starts at the beginning and takes you through basic set-up so that you'll be competent on a computer in no time. You'll cover: computer basics and hardware; software; introduction to Windows 7; Microsoft Office; word processing with Microsoft Word 2010; creating spreadsheets with Microsoft Excel; creating presentation graphics with PowerPoint; connectivity and communication; web basics; and network and Internet privacy and security.

  14. Fundamental principles of quantum theory

    International Nuclear Information System (INIS)

    Bugajski, S.

    1980-01-01

    After introducing general versions of three fundamental quantum postulates - the superposition principle, the uncertainty principle and the complementarity principle - the question of whether the three principles are sufficiently strong to restrict the general Mackey description of quantum systems to the standard Hilbert-space quantum theory is discussed. An example which shows that the answer must be negative is constructed. An abstract version of the projection postulate is introduced and it is demonstrated that it could serve as the missing physical link between the general Mackey description and the standard quantum theory. (author)

  15. Plasma expansion: fundamentals and applications

    International Nuclear Information System (INIS)

    Engeln, R; Mazouffre, S; Vankan, P; Bakker, I; Schram, D C

    2002-01-01

    The study of plasma expansion is interesting from a fundamental point of view as well as from a more applied point of view. Here we give a short overview of the way properties like density, velocity and temperature behave in an expanding thermal plasma. Experimental data show that the basic phenomena of plasma expansion are to some extent similar to those of the expansion of a hot neutral gas. From the application point of view, we present first results on the use of an expanding thermal plasma in the plasma-activated catalysis of ammonia from N2-H2 mixtures.

  16. Fundamentals of liquid crystal devices

    CERN Document Server

    Yang, Deng-Ke

    2014-01-01

    Revised throughout to cover the latest developments in the fast-moving area of display technology, this 2nd edition of Fundamentals of Liquid Crystal Devices will continue to be a valuable resource for those wishing to understand the operation of liquid crystal displays. Significant updates include new material on display components, 3D LCDs and blue-phase displays, one of the most promising new technologies within the field of displays; it is expected that this new LC technology will reduce the response time and the number of optical components of LC modules. Prof. Yang is a pioneer …

  17. Foam engineering fundamentals and applications

    CERN Document Server

    2012-01-01

    Containing contributions from leading academic and industrial researchers, this book provides a much-needed update of foam science research. The first section of the book presents an accessible summary of the theory and fundamentals of foams. This includes chapters on morphology, drainage, Ostwald ripening, coalescence, rheology, and pneumatic foams. The second section demonstrates how this theory is used in a wide range of industrial applications, including foam fractionation, froth flotation and foam mitigation. It includes chapters on suprafroths, flotation of oil sands, foams in enhancing petroleum recovery, gas-liquid mass transfer in foams, foams in glass manufacturing, fire-fighting foam technology and consumer product foams.

  18. Testing Fundamental Gravitation in Space

    Energy Technology Data Exchange (ETDEWEB)

    Turyshev, Slava G.

    2013-10-15

    The general theory of relativity is the standard theory of gravitation; as such, it is used to describe gravity in problems of astronomy, astrophysics, cosmology, and fundamental physics. The theory is also relied upon in many modern applications involving spacecraft navigation, geodesy, and time transfer. Here we review the foundations of general relativity and discuss its current empirical status. We describe both the theoretical motivation and the scientific progress that may result from the new generation of high-precision tests that are anticipated in the near future.

  19. Fundamental Physics with Space Experiments

    Science.gov (United States)

    Vitale, S.

    I review a category of experiments in fundamental physics that need space as a laboratory. All these experiments have in common the need for a very low gravity environment to achieve as ideal a free fall as possible: LISA, the gravitational wave observatory, and its technology demonstrator SMART-2; the satellite tests of the equivalence principle, Microscope and, at the ultimate sensitivity, STEP, with its close heritage from GP-B, the experiment to measure the gravito-magnetic field of the Earth; and finally the entirely new field of cold atoms in space, with its promise to produce the next generation of gravitational and inertial sensors for general relativity experiments.

  20. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    Science.gov (United States)

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta; Ahmed, Lucky; Rasulev, Bakhtiyor; Avramopoulos, Aggelos; Papadopoulos, Manthos G.; Leszczynski, Jerzy; Puzyn, Tomasz

    2016-09-01

    In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure-Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure-Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches were compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulties in interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users, enabling them to determine a proper and efficient methodology for investigating the biological activity of nanoparticles, in order to describe the underlying interactions in the most reliable and useful manner.

  1. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    Energy Technology Data Exchange (ETDEWEB)

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta [University of Gdansk, Laboratory of Environmental Chemometrics, Faculty of Chemistry, Institute for Environmental and Human Health Protection (Poland); Ahmed, Lucky; Rasulev, Bakhtiyor [Jackson State University, Interdisciplinary Nanotoxicity Center, Department of Chemistry and Biochemistry (United States); Avramopoulos, Aggelos; Papadopoulos, Manthos G. [National Hellenic Research Foundation, Institute of Biology, Pharmaceutical Chemistry and Biotechnology (Greece); Leszczynski, Jerzy [Jackson State University, Interdisciplinary Nanotoxicity Center, Department of Chemistry and Biochemistry (United States); Puzyn, Tomasz, E-mail: t.puzyn@qsar.eu.org [University of Gdansk, Laboratory of Environmental Chemometrics, Faculty of Chemistry, Institute for Environmental and Human Health Protection (Poland)

    2016-09-15

    In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches were compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulties in interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users, enabling them to determine a proper and efficient methodology for investigating the biological activity of nanoparticles, in order to describe the underlying interactions in the most reliable and useful manner.

  2. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    International Nuclear Information System (INIS)

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta; Ahmed, Lucky; Rasulev, Bakhtiyor; Avramopoulos, Aggelos; Papadopoulos, Manthos G.; Leszczynski, Jerzy; Puzyn, Tomasz

    2016-01-01

    In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches were compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulties in interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users, enabling them to determine a proper and efficient methodology for investigating the biological activity of nanoparticles, in order to describe the underlying interactions in the most reliable and useful manner.

  3. Study on Application of New Approach of Fault Current Limiters in Fault Ride through Capability Improvement of DFIG Based Wind Turbine

    DEFF Research Database (Denmark)

    Naderi, Seyed Behzad; Davari, Pooya; Zhou, Dao

    2018-01-01

    Due to its salient advantages, the Doubly-Fed Induction Generator (DFIG) has found wider application in power networks than the Fixed Speed Wind Turbine. Because back-to-back converters are employed, one of the important studies regarding new grid code requirements is the Fault Ride-Through (FRT) capability of the DFIG. One approach to improving the FRT capability is Fault Current Limiters (FCLs). In this paper, a new approach to the application of FCLs is studied, and their location and impedance type are discussed. Then, a resistive FCL located on the stator side is proposed. The proposed FCL can insert a controllable resistance into the fault current path, not only to restrict the fault current level and compensate the voltage sag in the stator, but also to consume the pre-fault output active power of the DFIG under wind speed variation. Simulation results and analysis are presented to prove …
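
    The limiting action of a series resistive FCL can be illustrated with a first-order R-L short-circuit model (Python; the parameter values and the single-phase model are placeholders, since the paper's study concerns a full DFIG with back-to-back converters):

        # Sketch: transient short-circuit current of a series R-L branch, with
        # and without an inserted FCL resistance (fault at a voltage zero).
        import numpy as np

        def fault_current(t, v_peak=563.0, r=0.05, l=1e-3, r_fcl=0.0,
                          omega=2 * np.pi * 50):
            r_tot = r + r_fcl
            z = np.hypot(r_tot, omega * l)
            phi = np.arctan2(omega * l, r_tot)
            tau = l / r_tot
            # steady-state sinusoid plus decaying DC offset
            return (v_peak / z) * (np.sin(omega * t - phi)
                                   + np.sin(phi) * np.exp(-t / tau))

        t = np.linspace(0.0, 0.1, 2001)
        for r_fcl in (0.0, 0.5):
            peak = np.abs(fault_current(t, r_fcl=r_fcl)).max()
            print(f"R_FCL = {r_fcl} ohm -> peak current {peak:.0f} A")

    The added resistance both lowers the current magnitude and dissipates active power during the fault, the two roles the abstract assigns to the proposed FCL.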

  4. A laboratory scale fundamental time?

    International Nuclear Information System (INIS)

    Mendes, R.V.

    2012-01-01

    The existence of a fundamental time (or fundamental length) has been conjectured in many contexts. However, the 'stability of physical theories principle' seems to be the one that provides, through the tools of algebraic deformation theory, an unambiguous derivation of the stable structures that Nature might have chosen for its algebraic framework. It is well known that c and ħ are the deformation parameters that stabilize the Galilean and the Poisson algebra. When the stability principle is applied to the Poincaré-Heisenberg algebra, two deformation parameters emerge, which define two time (or length) scales. In addition there is, for each of them, a plus or minus sign possibility in the relevant commutators. One of the deformation length scales, related to the non-commutativity of momenta, is probably related to the Planck length scale, but the other might be much larger and already detectable in laboratory experiments. In this paper, this is used as a working hypothesis to look for physical effects that might settle this question. Phase-space modifications, resonances, interference, electron spin resonance and non-commutative QED are considered. (orig.)

  5. Fundamental Enabling Issues in Nanotechnology: Stress at the Atomic Scale

    Energy Technology Data Exchange (ETDEWEB)

    Floro, Jerrold Anthony [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Univ. of Virginia, Charlottesville, VA (United States). Dept. of Materials Science and Engineering; Foiles, Stephen Martin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hearne, Sean Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoyt, Jeffrey John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McMaster Univ., Hamilton, ON (Canada); Seel, Steven Craig [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Emcore Corporation, Albuquerque, NM (United States); Webb, Edmund Blackburn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Morales, Alfredo Martin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Zimmerman, Jonathan A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2007-10-01

    To effectively integrate nanotechnology into functional devices, fundamental aspects of material behavior at the nanometer scale must be understood. Stresses generated during thin film growth strongly influence component lifetime and performance; stress has also been proposed as a mechanism for stabilizing supported nanoscale structures. Yet the intrinsic connections between the evolving morphology of supported nanostructures and stress generation are still a matter of debate. This report presents results from a combined experiment and modeling approach to study stress evolution during thin film growth. Fully atomistic simulations are presented predicting stress generation mechanisms and magnitudes during all growth stages, from island nucleation to coalescence and film thickening. Simulations are validated by electrodeposition growth experiments, which establish the dependence of microstructure and growth stresses on process conditions and deposition geometry. Sandia is one of the few facilities with the resources to combine experiments and modeling/theory in this close a fashion. Experiments revealed an ongoing coalescence process that generates significant tensile stress. Data from deposition experiments also support the existence of a kinetically limited compressive stress generation mechanism. Atomistic simulations explored island coalescence and deposition onto surfaces intersected by grain boundary structures to permit investigation of stress evolution during later growth stages, e.g., continual island coalescence and adatom incorporation into grain boundaries. The predictive capabilities of simulation permit direct determination of fundamental processes active in stress generation at the nanometer scale while connecting those processes, via new theory, to continuum models for much larger island and film structures. Our combined experiment and simulation results reveal the necessary materials science to tailor stress, and therefore performance, in …

  6. Hormesis: a fundamental concept in biology

    Directory of Open Access Journals (Sweden)

    Edward J. Calabrese

    2014-04-01

    This paper assesses the hormesis dose-response concept, including its historical foundations, frequency, generality, quantitative features, mechanistic basis and biomedical, pharmaceutical and environmental health implications. The hormetic dose response is highly generalizable, being independent of biological model (i.e., common from plants to humans), level of biological organization (i.e., cell, organ and organism), endpoint, inducing agent and mechanism, providing the first general and quantitative description of plasticity. The hormetic dose response describes the limits to which integrative endpoints (e.g., cell proliferation, cell migration, growth patterns, tissue repair, aging processes, complex behaviors such as anxiety, learning, memory, and stress, preconditioning responses, and numerous adaptive responses) can be modulated (i.e., enhanced or diminished) by pharmaceutical, chemical and physical means. Thus, the hormesis concept is a fundamental concept in biology, with a wide range of biological implications and biomedical applications.
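
    A common quantitative form for such biphasic curves is the Brain-Cousens modification of the log-logistic model, in which a linear low-dose stimulation term f*x is grafted onto the usual sigmoid (Python; the parameter values are illustrative, and this particular model is one standard choice, not necessarily the paper's):

        # Sketch: Brain-Cousens hormetic dose-response; f > 0 grafts a low-dose
        # stimulation onto a log-logistic decline toward the lower limit c.
        import numpy as np

        def brain_cousens(x, b, c, d, e, f):
            return c + (d - c + f * x) / (1.0 + np.exp(b * (np.log(x) - np.log(e))))

        doses = np.array([0.01, 0.1, 0.5, 1.0, 5.0, 20.0])
        print(brain_cousens(doses, b=2.0, c=0.0, d=1.0, e=4.0, f=0.8))

    Low doses rise above the control response d before the curve falls toward c, producing the inverted-U (or J-shaped) profile that defines hormesis.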

  7. Testing fundamental physics with gravitational waves

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The landmark detection of gravitational waves (GWs) has opened a new era in physics, giving access to the hitherto unexplored strong-gravity regime, where spacetime curvature is extreme and the relevant speed is close to the speed of light. In parallel to its countless astrophysical applications, this discovery can also have important implications for fundamental physics. In this context, I will discuss some outstanding, cross-cutting problems that can finally be investigated in the GW era: the nature of black holes and of spacetime singularities, the limits of classical gravity, the existence of extra light fields, and the effects of dark matter near compact objects. Future GW measurements will provide unparalleled tests of quantum-gravity effects at the horizon scale, exotic compact objects, ultralight dark matter, and general relativity in the strong-field regime.

  8. Quantum operations: technical or fundamental challenge?

    International Nuclear Information System (INIS)

    Mielnik, Bogdan

    2013-01-01

    A class of unitary operations generated by idealized, semiclassical fields is studied. The operations implemented by sharp potential kicks are revisited and the possibility of performing them by softly varying external fields is examined. The possibility of using the ion traps as ‘operation factories’ transforming quantum states is discussed. The non-perturbative algorithms indicate that the results of abstract δ-pulses of oscillator potentials can become real. Some of them, if empirically achieved, could be essential to examine certain atypical quantum ideas. In particular, simple dynamical manipulations might contribute to the Aharonov–Bohm criticism of the time–energy uncertainty principle, while some others may verify the existence of fundamental precision limits of the position measurements or the reality of ‘non-commutative geometries’. (paper)

  9. Optical Metamaterials Fundamentals and Applications

    CERN Document Server

    Cai, Wenshan

    2010-01-01

    Metamaterials (artificially structured materials with engineered electromagnetic properties) have enabled unprecedented flexibility in manipulating electromagnetic waves and producing new functionalities. In just a few years, the field of optical metamaterials has emerged as one of the most exciting topics in the science of light, with stunning and unexpected outcomes that have fascinated scientists and the general public alike. This volume details recent advances in the study of optical metamaterials, ranging from fundamental aspects to up-to-date implementations, in one unified treatment. Important recent developments and applications such as superlenses and cloaking devices are also treated in detail and made understandable. Optical Metamaterials will serve as a very timely book for both newcomers and advanced researchers in this rapidly evolving field. Early praise for Optical Metamaterials: "...this book is timely, bringing to students and other new entrants to the field the most up-to-date concepts. …"

  10. Phononic crystals fundamentals and applications

    CERN Document Server

    Adibi, Ali

    2016-01-01

    This book provides an in-depth analysis as well as an overview of phononic crystals. This book discusses numerous techniques for the analysis of phononic crystals and covers, among other material, sonic and ultrasonic structures, hypersonic planar structures and their characterization, and novel applications of phononic crystals. This is an ideal book for those working with micro and nanotechnology, MEMS (microelectromechanical systems), and acoustic devices. This book also: presents an introduction to the fundamentals and properties of phononic crystals; covers simulation techniques for the analysis of phononic crystals; discusses sonic and ultrasonic, hypersonic and planar, and three-dimensional phononic crystal structures; and illustrates how phononic crystal structures are being deployed in communication and sensing systems.

  11. Fundamentals of modern unsteady aerodynamics

    CERN Document Server

    Gülçat, Ülgen

    2016-01-01

    In this book, the author introduces the concept of unsteady aerodynamics and its underlying principles. He provides the readers with a comprehensive review of the fundamental physics of free and forced unsteadiness, the terminology and basic equations of aerodynamics ranging from incompressible flow to hypersonics. The book also covers modern topics related to the developments made in recent years, especially in relation to wing flapping for propulsion. The book is written for graduate and senior year undergraduate students in aerodynamics and also serves as a reference for experienced researchers. Each chapter includes ample examples, questions, problems and relevant references. The treatment of these modern topics has been completely revised and expanded for the new edition. It now includes new numerical examples, a section on the ground effect, and state-space representation.

  12. Fundamental Laser Welding Process Investigations

    DEFF Research Database (Denmark)

    Bagger, Claus; Olsen, Flemming Ove

    1998-01-01

    In a number of systematic laboratory investigations, the fundamental behavior of the laser welding process was analyzed by the use of normal video (30 Hz), high-speed video (100 and 400 Hz) and photodiodes. Sensors were positioned to monitor the welding process from both the top side and the rear side of the specimen. Special attention has been given to the dynamic nature of the laser welding process, especially during unstable welding conditions. In one series of experiments, the stability of the process was varied by changing the gap distance in lap welding. In another series … video pictures (400 Hz), a clear impact on the seam characteristics has been identified when a hump occurs. Finally, a clear correlation between the position of the focus point, the resultant process type and the corresponding signal intensity and signal variation has been found for sheets welded …

  13. Fundamentals of Protein NMR Spectroscopy

    CERN Document Server

    Rule, Gordon S

    2006-01-01

    NMR spectroscopy has proven to be a powerful technique to study the structure and dynamics of biological macromolecules. Fundamentals of Protein NMR Spectroscopy is a comprehensive textbook that guides the reader from a basic understanding of the phenomenological properties of magnetic resonance to the application and interpretation of modern multi-dimensional NMR experiments on 15N/13C-labeled proteins. Beginning with elementary quantum mechanics, a set of practical rules is presented and used to describe many commonly employed multi-dimensional, multi-nuclear NMR pulse sequences. A modular analysis of NMR pulse sequence building blocks also provides a basis for understanding and developing novel pulse programs. This text not only covers topics from chemical shift assignment to protein structure refinement, as well as the analysis of protein dynamics and chemical kinetics, but also provides a practical guide to many aspects of modern spectrometer hardware, sample preparation, experimental set-up, and data processing.

  14. Neuroethics as a Fundamental Ethic

    Directory of Open Access Journals (Sweden)

    Josep Corcó

    2017-08-01

    In her book Braintrust, the neurophilosopher Patricia Churchland puts forward her ideas about what neuroscience has contributed so far in the study of the neural bases of ethical behaviour in human beings. The main thesis of Churchland’s book is that morality has its origins in the neurobiology of attachment and bonding; she stresses the importance of oxytocin in the cooperative behaviour of human beings, and proposes that neuroethics might eventually come to be regarded as a fundamental ethic. In my opinion, however, Churchland’s proposal raises some pertinent questions, such as, Why should we behave ethically? or, What are moral values? In this paper we assess Churchland’s main ideas in an attempt to show whether neuroscience can be of help in answering these questions.

  15. Silicon photonics fundamentals and devices

    CERN Document Server

    Deen, M Jamal

    2012-01-01

    The creation of affordable high speed optical communications using standard semiconductor manufacturing technology is a principal aim of silicon photonics research. This would involve replacing copper connections with optical fibres or waveguides, and electrons with photons. With applications such as telecommunications and information processing, light detection, spectroscopy, holography and robotics, silicon photonics has the potential to revolutionise electronic-only systems. Providing an overview of the physics, technology and device operation of photonic devices using exclusively silicon and related alloys, the book includes: * Basic Properties of Silicon * Quantum Wells, Wires, Dots and Superlattices * Absorption Processes in Semiconductors * Light Emitters in Silicon * Photodetectors , Photodiodes and Phototransistors * Raman Lasers including Raman Scattering * Guided Lightwaves * Planar Waveguide Devices * Fabrication Techniques and Material Systems Silicon Photonics: Fundamentals and Devices outlines ...

  16. Fundamental Travel Demand Model Example

    Science.gov (United States)

    Hanssen, Joel

    2010-01-01

    Instances of transportation models are abundant and detailed "how to" instruction is available in the form of transportation software help documentation. The purpose of this paper is to look at the fundamental inputs required to build a transportation model by developing an example passenger travel demand model. The example model reduces the scale to a manageable size for the purpose of illustrating the data collection and analysis required before the first step of the model begins. This aspect of the model development would not reasonably be discussed in software help documentation (it is assumed the model developer comes prepared). Recommendations are derived from the example passenger travel demand model to suggest future work regarding the data collection and analysis required for a freight travel demand model.

  17. Molecular imaging. Fundamentals and applications

    International Nuclear Information System (INIS)

    Tian, Jie

    2013-01-01

    Covers a wide range of new theory, new techniques and new applications. Contributed by many experts in China. The editor has obtained the National Science and Technology Progress Award twice. ''Molecular Imaging: Fundamentals and Applications'' is a comprehensive monograph which describes not only the theory of the underlying algorithms and key technologies but also introduces a prototype system and its applications, bringing together theory, technology and applications. By explaining the basic concepts and principles of molecular imaging, imaging techniques, as well as research and applications in detail, the book provides both detailed theoretical background information and technical methods for researchers working in medical imaging and the life sciences. Clinical doctors and graduate students will also benefit from this book.

  18. Multiphase flow dynamics 1 fundamentals

    CERN Document Server

    Kolev, Nikolay Ivanov

    2004-01-01

    Multi-phase flows are part of our natural environment such as tornadoes, typhoons, air and water pollution and volcanic activities as well as part of industrial technology such as power plants, combustion engines, propulsion systems, or chemical and biological industry. The industrial use of multi-phase systems requires analytical and numerical strategies for predicting their behavior. In its third extended edition this monograph contains theory, methods and practical experience for describing complex transient multi-phase processes in arbitrary geometrical configurations, providing a systematic presentation of the theory and practice of numerical multi-phase fluid dynamics. In the present first volume the fundamentals of multiphase dynamics are provided. This third edition includes various updates, extensions and improvements in all book chapters.

  19. Fundamental Laser Welding Process Investigations

    DEFF Research Database (Denmark)

    Bagger, Claus; Olsen, Flemming Ove

    1998-01-01

    In a number of systematic laboratory investigations, the fundamental behavior of the laser welding process was analyzed by the use of normal video (30 Hz), high-speed video (100 and 400 Hz) and photodiodes. Sensors were positioned to monitor the welding process from both the top side and the rear side of the specimen. Special attention has been given to the dynamic nature of the laser welding process, especially during unstable welding conditions. In one series of experiments, the stability of the process was varied by changing the gap distance in lap welding. In another series … normal video and high-speed video (100 Hz) cannot reveal any instability in the process when humping occurs. Contrary to this, photodiode signals (sampled at 3 kHz) clearly indicate a characteristic signal when humps occur. When the seam area and seam width have been measured manually on high-speed …
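
    The 3 kHz photodiode signature reported here lends itself to a simple online hump detector, flagging excursions above a rolling-median baseline (Python; the synthetic signal, window length and threshold are assumptions for illustration):

        # Sketch: flag humping as excursions of the 3 kHz photodiode signal
        # above a rolling-median baseline; signal and threshold are synthetic.
        import numpy as np

        fs = 3000                                     # Hz, as in the abstract
        rng = np.random.default_rng(4)
        t = np.arange(0, 2.0, 1 / fs)
        signal = 1.0 + 0.02 * rng.normal(size=t.size)
        signal[(t > 0.80) & (t < 0.85)] += 0.5        # injected hump signature

        win = 151                                     # ~50 ms window
        pad = np.pad(signal, win // 2, mode="edge")
        baseline = np.array([np.median(pad[i:i + win])
                             for i in range(signal.size)])
        humps = np.flatnonzero(signal - baseline > 0.25)
        print("hump near t =", t[humps[0]] if humps.size else None, "s")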

  20. Fundamental processes in ion plating

    International Nuclear Information System (INIS)

    Mattox, D.M.

    1980-01-01

    Ion plating is a generic term applied to film deposition processes in which the substrate surface and/or the depositing film is subjected to a flux of high-energy particles sufficient to cause changes in the interfacial region and in film properties compared to a non-bombarded deposition. Ion plating is being accepted as an alternative coating technique to sputter deposition, vacuum evaporation and electroplating. In order to intelligently choose between the various deposition techniques, the fundamental mechanisms relating to ion plating must be understood. This paper reviews the effects of low-energy ion bombardment on surfaces, interface formation and film development as they apply to ion plating, and the implementation and applications of the ion plating process.

  1. Multiphase flow dynamics 1 fundamentals

    CERN Document Server

    Kolev, Nikolay Ivanov

    2007-01-01

    Multi-phase flows are part of our natural environment such as tornadoes, typhoons, air and water pollution and volcanic activities as well as part of industrial technology such as power plants, combustion engines, propulsion systems, or chemical and biological industry. The industrial use of multi-phase systems requires analytical and numerical strategies for predicting their behavior. In its third extended edition this monograph contains theory, methods and practical experience for describing complex transient multi-phase processes in arbitrary geometrical configurations, providing a systematic presentation of the theory and practice of numerical multi-phase fluid dynamics. In the present first volume the fundamentals of multiphase dynamics are provided. This third edition includes various updates, extensions and improvements in all book chapters.

  2. Fundamentals of reversible flowchart languages

    DEFF Research Database (Denmark)

    Yokoyama, Tetsuo; Axelsen, Holger Bock; Glück, Robert

    2016-01-01

    This paper presents the fundamentals of reversible flowcharts. They are intended to naturally represent the structure and control flow of reversible (imperative) programming languages in a simple computation model, in the same way classical flowcharts do for conventional languages. … Structured reversible flowcharts are as expressive as unstructured ones, as shown by a reversible version of the classic Structured Program Theorem. We illustrate how reversible flowcharts can be concretized with two example programming languages, complete with syntax and semantics: a low-level unstructured language and a high-level structured language. We introduce concrete tools such as program inverters and translators for both languages, which follow the structure suggested by the flowchart model. To further illustrate the different concepts and tools brought together in this paper, we present two major …
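
    The essence of reversibility, that every step must be injective so that a program inverter can mechanically produce the backward program, can be illustrated with a toy pair of forward/backward loops (Python standing in for the paper's flowchart languages; the example is ours, not from the paper):

        # Sketch: a reversible update and its mechanically derived inverse.
        def forward(x, y, n):
            """Apply n reversible steps: (x, y) -> (y, x + y)."""
            for _ in range(n):
                x, y = y, x + y      # injective on (x, y), hence invertible
            return x, y

        def backward(x, y, n):
            """Exact inverse of forward: undo n steps."""
            for _ in range(n):
                x, y = y - x, x      # inverse of the update above
            return x, y

        assert backward(*forward(2, 3, 10), 10) == (2, 3)

    Note that the inverse runs the inverted update the same number of times in reverse; no history needs to be recorded, which is what distinguishes reversible control flow from checkpoint-and-replay.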

  3. Microwave engineering concepts and fundamentals

    CERN Document Server

    Khan, Ahmad Shahid

    2014-01-01

    Detailing the active and passive aspects of microwaves, Microwave Engineering: Concepts and Fundamentals covers everything from wave propagation to reflection and refraction, guided waves, and transmission lines, providing a comprehensive understanding of the underlying principles at the core of microwave engineering. This encyclopedic text not only encompasses nearly all facets of microwave engineering, but also gives all topics, including microwave generation, measurement, and processing, equal emphasis. Packed with illustrations to aid in comprehension, the book: describes the mathematical theory of waveguides and ferrite devices, devoting an entire chapter to the Smith chart and its applications; discusses different types of microwave components, antennas, tubes, transistors, diodes, and parametric devices; examines various attributes of cavity resonators, semiconductor and RF/microwave devices, and microwave integrated circuits; and addresses scattering parameters and their properties, as well as …

  4. Islamic Fundamentalism in Modern Russia

    Directory of Open Access Journals (Sweden)

    Elena F. Parubochaya

    2017-09-01

    Islam is currently undergoing a revival, accompanied by problems peculiar to Muslim society. These are expressed in the spread of Islamic fundamentalist ideas and in the confrontation between their supporters and the rest of the world. This process has affected Russian Muslims as well: after the collapse of the Soviet Union, post-Soviet Muslims began to see themselves as part of the Muslim Ummah, coming into conflict with the secular law of the Russian Federation. After the Soviet Union's disintegration, radical Islamic ideas began to appear in Russia, and amid growing nationalism these ideas found fertile ground. One such idea was the construction of a Sharia state in the Muslim autonomous republics of the Russian Federation and their subsequent withdrawal from Russia. The situation for the Russian state in the Muslim republics was aggravated by the war in Chechnya. Through Chechnya, mercenaries from Arab countries began to penetrate Russian territory, bringing with them money to destabilize the internal situation in Russia. Nevertheless, separatism did not find mass support in neighboring regions such as Dagestan, Kabardino-Balkaria, Karachay-Cherkessia and Ingushetia. It is evident that international jihadist ideas were supported financially from abroad; the issue of funding is a key part of the development of Islamic fundamentalism in Russia, with international Islamic funds and organizations providing substantial financial assistance. At present the Russian authorities are waging a fruitful and successful fight against terrorism. In the future, after the completion of the antiterrorist operation in the Middle East, hundreds of experienced terrorists may return to Russia and threaten the security of the Russian state.

  5. Functional renormalization group approach to SU(N ) Heisenberg models: Momentum-space renormalization group for the large-N limit

    Science.gov (United States)

    Roscher, Dietrich; Buessen, Finn Lasse; Scherer, Michael M.; Trebst, Simon; Diehl, Sebastian

    2018-02-01

    In frustrated magnetism, making a stringent connection between microscopic spin models and macroscopic properties of spin liquids remains an important challenge. A recent step towards this goal has been the development of the pseudofermion functional renormalization group approach (pf-FRG) which, building on a fermionic parton construction, enables the numerical detection of the onset of spin liquid states as temperature is lowered. In this work, focusing on the SU(N) Heisenberg model at large N, we extend this approach in a way that allows us to directly enter the low-temperature spin liquid phase and to probe its character. Our approach proceeds in momentum space, making it possible to keep the truncation minimalistic while also avoiding the bias introduced by an explicit decoupling of the fermionic parton interactions into a given channel. We benchmark our findings against exact mean-field results in the large-N limit, and show that even without prior knowledge the pf-FRG approach identifies the correct mean-field decoupling channel. On a technical level, we introduce an alternative finite-temperature regularization scheme that is needed to access the spin liquid ordered phase. In a companion paper [Buessen et al., Phys. Rev. B 97, 064415 (2018), 10.1103/PhysRevB.97.064415] we present a different set of modifications of the pf-FRG scheme that allow us to study SU(N) Heisenberg models (using a real-space RG approach) for arbitrary values of N, albeit only up to the phase transition towards spin liquid physics.
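
    For orientation, the SU(N) Heisenberg model is conventionally written in the fermionic-parton language roughly as follows; this is the standard large-N construction from the literature, not an equation reproduced from the paper:

        H = \frac{J}{N} \sum_{\langle i,j \rangle} \sum_{\alpha,\beta=1}^{N}
            f_{i\alpha}^{\dagger} f_{i\beta} \, f_{j\beta}^{\dagger} f_{j\alpha},
        \qquad
        \sum_{\alpha=1}^{N} f_{i\alpha}^{\dagger} f_{i\alpha} = \frac{N}{2}
        \quad \text{on every site } i.

    In the N → ∞ limit the quartic parton interaction decouples exactly into a mean-field hopping channel, which is the benchmark against which the pf-FRG results are compared.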

  6. Detector Fundamentals for Reachback Analysts

    Energy Technology Data Exchange (ETDEWEB)

    Karpius, Peter Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Myers, Steven Charles [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-03

    This presentation is a part of the DHS LSS spectroscopy course and provides an overview of the following concepts: detector system components, intrinsic and absolute efficiency, resolution and linearity, and operational issues and limits.
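
    The efficiency concepts in this outline are commonly defined as follows (standard textbook definitions, e.g. from Knoll's Radiation Detection and Measurement, assumed here rather than taken from the slides):

        \varepsilon_{\text{abs}} = \frac{\text{counts recorded}}{\text{quanta emitted by the source}},
        \qquad
        \varepsilon_{\text{int}} = \frac{\text{counts recorded}}{\text{quanta incident on the detector}},
        \qquad
        \varepsilon_{\text{int}} = \varepsilon_{\text{abs}} \, \frac{4\pi}{\Omega},

    where Ω is the solid angle subtended by the detector at the source. Energy resolution is likewise usually quoted as R = FWHM/H_0, the full width at half maximum of a peak divided by its centroid position.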

  7. Fundamentals of environmental engineering. 2. rev. ed.

    International Nuclear Information System (INIS)

    Bank, M.

    1994-01-01

    'Fundamentals of Environmental Engineering' presents in compact form the technical and legal bases for the four environmental areas: water supply and wastewater disposal, clean air, waste avoidance and waste disposal, and noise protection. Particular scope is given to the description of the linkages between the individual environmental areas - for instance, waste combustion and clean air, waste deposition at landfills and the treatment of leachate, and the residual products of successful water and air pollution control measures. For all those who have to familiarize themselves with the complex subject of environmental engineering during training or continuing education, this book offers a broad approach to the essential general, technical and legal bases. (orig.) [de]

  8. Fundamentals of semiconductors physics and materials properties

    CERN Document Server

    Yu, Peter Y

    2010-01-01

    This fourth edition of the well-established Fundamentals of Semiconductors serves to fill the gap between a general solid-state physics textbook and research articles by providing detailed explanations of the electronic, vibrational, transport, and optical properties of semiconductors. The approach is physical and intuitive rather than formal and pedantic. Theories are presented to explain experimental results. This textbook has been written with both students and researchers in mind. Its emphasis is on understanding the physical properties of Si and similar tetrahedrally coordinated semiconductors. The explanations are based on physical insights. Each chapter is enriched by an extensive collection of tables of material parameters, figures, and problems. Many of these problems "lead the student by the hand" to arrive at the results. The major changes made in the fourth edition include: an extensive appendix about the important and by now well-established deep center known as the DX center, additional problems...

  9. Fundamentals of semiconductors physics and materials properties

    CERN Document Server

    Yu, Peter Y

    1996-01-01

    Fundamentals of Semiconductors attempts to fill the gap between a general solid-state physics textbook and research articles by providing detailed explanations of the electronic, vibrational, transport, and optical properties of semiconductors. The approach is physical and intuitive rather than formal and pedantic. Theories are presented to explain experimental results. This textbook has been written with both students and researchers in mind. Its emphasis is on understanding the physical properties of Si and similar tetrahedrally coordinated semiconductors. The explanations are based on physical insights. Each chapter is enriched by an extensive collection of tables of material parameters, figures and problems. Many of these problems 'lead the student by the hand' to arrive at the results.

  10. Inappropriate Use of the Quasi-Reversible Electrode Kinetic Model in Simulation-Experiment Comparisons of Voltammetric Processes That Approach the Reversible Limit

    KAUST Repository

    Simonov, Alexandr N.

    2014-08-19

    Many electrode processes that approach the "reversible" (infinitely fast) limit under voltammetric conditions have been inappropriately analyzed by comparison of experimental data and theory derived from the "quasi-reversible" model. Simulations based on "reversible" and "quasi-reversible" models have been fitted to an extensive series of a.c. voltammetric experiments undertaken at macrodisk glassy carbon (GC) electrodes for oxidation of ferrocene (Fc0/+) in CH3CN (0.10 M (n-Bu)4NPF6) and reduction of [Ru(NH3)6]3+ and [Fe(CN)6]3- in 1 M KCl aqueous electrolyte. The confidence with which parameters such as standard formal potential (E0), heterogeneous electron transfer rate constant at E0 (k0), charge transfer coefficient (α), uncompensated resistance (Ru), and double layer capacitance (CDL) can be reported using the "quasi-reversible" model has been assessed using bootstrapping and parameter sweep (contour plot) techniques. Underparameterization, such as that which occurs when modeling CDL with a potential-independent value, results in a less than optimal level of experiment-theory agreement. Overparameterization may improve the agreement but easily results in generation of physically meaningful but incorrect values of the recovered parameters, as is the case with the very fast Fc0/+ and [Ru(NH3)6]3+/2+ processes. In summary, for fast electrode kinetics approaching the "reversible" limit, it is recommended that the "reversible" model be used for theory-experiment comparisons, with only E0, Ru, and CDL being quantified and a lower limit of k0 being reported; e.g., k0 ≥ 9 cm s-1 for the Fc0/+ process. © 2014 American Chemical Society.
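
    The bootstrapping step can be sketched generically as residual resampling around a best-fit model. In the sketch below the exponential-decay model and synthetic data are placeholders, since the actual analysis fits a.c. voltammetric simulations:

        # Generic residual-resampling bootstrap for assessing the confidence of
        # fitted parameters. The exponential-decay 'model' and the synthetic data
        # are placeholders: the paper fits a.c. voltammetric simulations instead.
        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, amplitude, rate):
            return amplitude * np.exp(-rate * t)

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 2.0, 100)
        y = model(t, 2.0, 1.5) + rng.normal(0.0, 0.05, t.size)  # synthetic data

        popt, _ = curve_fit(model, t, y, p0=(1.0, 1.0))          # best fit
        residuals = y - model(t, *popt)

        boot = []
        for _ in range(500):  # refit to resampled pseudo-data sets
            y_star = model(t, *popt) + rng.choice(residuals, size=t.size, replace=True)
            p_star, _ = curve_fit(model, t, y_star, p0=popt)
            boot.append(p_star)

        print("estimates:", popt)
        print("bootstrap standard errors:", np.std(boot, axis=0, ddof=1))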

  11. Math majors' visual proofs in a dynamic environment: the case of limit of a function and the ɛ-δ approach

    Science.gov (United States)

    Caglayan, Günhan

    2015-08-01

    Despite a few limitations, GeoGebra as a dynamic geometry software stood as a powerful instrument in helping university math majors understand, explore, and gain experience in visualizing the limits of functions and the ε-δ formalism. During the process of visualizing a theorem, the order of the constituents in the sequence mattered. Students made use of such rich constituents as finger-hand gestures and cursor gestures in an attempt to keep a record of the visual demonstration in progress, while being aware of the interrelationships among these constituents and the transformational aspect of the visual proving process. Covariational reasoning along with interval mapping structures proved to be the key constituents in the visualizing and sense-making of a limit theorem using the delta-epsilon formalism. Pedagogical approaches and teaching strategies based on the experimental mathematics - mindtool - constituent visual proofs trio would permit students to study, construct, and meaningfully connect new knowledge to previously mastered concepts and skills in a manner that makes sense to them.
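
    The formalism being visualized is the standard ε-δ definition of the limit:

        \lim_{x \to a} f(x) = L
        \quad \Longleftrightarrow \quad
        \forall \varepsilon > 0 \; \exists \delta > 0 : \;
        0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.

    The covariational reasoning described above amounts to tracking how the admissible δ-interval on the x-axis must shrink as the ε-band around L narrows.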

  12. Assessment of Current Process Modeling Approaches to Determine Their Limitations, Applicability and Developments Needed for Long-Fiber Thermoplastic Injection Molded Composites

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep; Holbery, Jim; Smith, Mark T.; Kunc, Vlastimil; Norris, Robert E.; Phelps, Jay; Tucker III, Charles L.

    2006-11-30

    This report describes the status of current process modeling approaches for predicting the behavior and flow of fiber-filled thermoplastics under injection molding conditions. Models have previously been developed to simulate the injection molding of short-fiber thermoplastics, predicting the as-formed composite part or component and the microstructure that results from the constituents' material properties and characteristics as well as the processing parameters. Our objective is to assess these models in order to determine their capabilities and limitations, and the developments needed for long-fiber injection-molded thermoplastics (LFTs). First, the concentration regimes are summarized to facilitate the understanding of the different types of fiber-fiber interaction that can occur for a given fiber volume fraction. After the formulation of the fiber suspension flow problem and the simplification leading to the Hele-Shaw approach, the interaction mechanisms are discussed. Next, the establishment of the rheological constitutive equation is presented, reflecting the coupled flow/orientation nature of the problem. The decoupled flow/orientation approach is also discussed, which constitutes a good simplification for many applications involving flows in thin cavities. Finally, before outlining the necessary developments for LFTs, some applications of the current orientation model and the so-called modified Folgar-Tucker model are illustrated through fiber orientation predictions for selected LFT samples.
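
    For reference, the Folgar-Tucker model named above is usually quoted in its Advani-Tucker second-order orientation tensor form; the following is the standard literature form in one common sign convention, not an equation copied from this report:

        \frac{DA}{Dt} = (W \cdot A - A \cdot W)
        + \xi \, (D \cdot A + A \cdot D - 2\, \mathbb{A} : D)
        + 2 C_I \dot{\gamma} \, (I - 3A),

    where A and 𝔸 are the second- and fourth-order orientation tensors (the latter requires a closure approximation), W and D are the vorticity and rate-of-deformation tensors, ξ is a particle shape factor, C_I is the fiber-interaction coefficient, and γ̇ is the scalar strain rate.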

  13. Fundamental Neutron Physics: Theory and Analysis

    International Nuclear Information System (INIS)

    Gudkov, Vladimir

    2016-01-01

    The goal of the proposal was to study the possibility of searching for manifestations of new physics beyond the Standard Model in fundamental neutron physics experiments. This involves detailed theoretical analyses of parity- and time-reversal-invariance-violating processes in neutron-induced reactions, properties of neutron β-decay, and the precise description of properties of neutron interactions with nuclei. To describe neutron-nuclear interactions, we use both the effective field theory approach and the theory of nuclear reactions with phenomenological nucleon potentials, for a systematic and consistent description of parity- and time-reversal-violating effects. A major emphasis of our research during the funding period has been the study of parity violation (PV) and time reversal invariance violation (TRIV) in few-body systems. We studied PV effects in inelastic processes in the three-nucleon system using both "DDH-like" and effective field theory (EFT) approaches. The wave functions were obtained by solving three-body Faddeev equations in configuration space for a number of realistic strong potentials. The observed model dependence for the DDH approach indicates an intrinsic difficulty in the description of nuclear PV effects, and it could be the reason for the observed discrepancies in the nuclear PV data analysis. It shows that the DDH approach can be reasonable for the analysis of PV effects only if exactly the same strong and weak potentials are used in calculating all PV observables in all nuclei. However, the existing calculations of nuclear PV effects were performed using different potentials; therefore, strictly speaking, one cannot compare the existing results of these calculations among themselves.

  14. Muonium-Physics of a most Fundamental Atom

    NARCIS (Netherlands)

    Jungmann, KP

    The hydrogen-like muonium atom (M = μ+e−) offers possibilities to measure fundamental constants most precisely and to search sensitively for new physics. All experiments on muonium at the presently most intense muon sources are statistics limited. New and intense muon sources are indispensable.

  15. The impact of traffic dynamics on macroscopic fundamental diagram

    NARCIS (Netherlands)

    Knoop, V.L.; Hoogendoorn, S.P.; Van Lint, J.W.C.

    2013-01-01

    Literature shows that – under specific conditions – the macroscopic fundamental diagram (MFD) describes a crisp relationship between the average flow (production) and the average density in an entire network. The limiting condition is that traffic conditions must be homogeneous over the whole network.
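
    In the MFD literature, the network-level variables in this relationship are typically defined as length-weighted averages over the links of the network (standard definitions, assumed here rather than taken from the paper):

        Q = \frac{\sum_i q_i l_i}{\sum_i l_i},
        \qquad
        K = \frac{\sum_i k_i l_i}{\sum_i l_i},

    where q_i, k_i and l_i are the flow, density and length of link i; the MFD then asserts a well-defined relation Q = f(K) as long as congestion is evenly distributed over the network.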

  16. Teaching the Politics of Islamic Fundamentalism.

    Science.gov (United States)

    Kazemzadeh, Masoud

    1998-01-01

    Argues that the rise of Islamic fundamentalism since the Iranian Revolution has generated a number of issues of analytical significance for political science. Describes three main models in teaching and research on Islamic fundamentalism: Islamic exceptionalism, comparative fundamentalisms, and class analysis. Discusses the construction of a…

  17. Journal of Fundamental and Applied Sciences

    African Journals Online (AJOL)

    Journal of Fundamental and Applied Sciences is an international journal reporting significant new results in all aspects of fundamental and applied sciences research. We welcome experimental, computational (including simulation and modelling) and theoretical studies of fundamental and applied sciences. The work must ...

  18. Estimation of the limit of detection with a bootstrap-derived standard error by a partly non-parametric approach. Application to HPLC drug assays

    DEFF Research Database (Denmark)

    Linnet, Kristian

    2005-01-01

    Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors.
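
    A bootstrap-derived standard error for a detection limit can be sketched as follows, using common textbook definitions (a non-parametric limit of blank as the 95th percentile of blank replicates, plus a parametric step for the low-level sample spread); the data and the exact recipe are placeholders, not the paper's:

        # Bootstrap-derived standard error for a limit of detection (LoD), using
        # a partly non-parametric recipe: a non-parametric limit of blank (LoB)
        # from blank replicates plus a parametric step for the low-level spread.
        # Generic illustration with placeholder data -- not the paper's method.
        import numpy as np

        rng = np.random.default_rng(1)
        blanks = rng.normal(0.02, 0.01, 40)       # placeholder blank HPLC responses
        low_samples = rng.normal(0.10, 0.02, 40)  # placeholder low-level samples

        def lod(blank, low):
            lob = np.percentile(blank, 95)            # ~5% type I error, non-parametric
            return lob + 1.645 * np.std(low, ddof=1)  # ~5% type II error, parametric

        estimate = lod(blanks, low_samples)
        boot = [lod(rng.choice(blanks, blanks.size, replace=True),
                    rng.choice(low_samples, low_samples.size, replace=True))
                for _ in range(2000)]
        print(f"LoD = {estimate:.4f}, bootstrap SE = {np.std(boot, ddof=1):.4f}")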

  19. The water, fundamental ecological base?

    International Nuclear Information System (INIS)

    Bolivar, Luis Humberto

    1994-01-01

    To speak of ecology and man's interaction with the environment implicitly involves many elements that, acting in harmony, generate conditions conducive to well-being; it is nevertheless necessary to rank the importance of these elements. Water not only constitutes sixty-five percent of the total volume of the planet and sixty percent of the human body, but is also well called the molecule of life, and it is therefore the main element to consider in the study of ecology. Water circulates continually through the endless hydrological cycle of condensation, precipitation, filtration, retention and evaporation. However, the rapid growth of cities, their expansion into green areas and borderlands as a result of demographic pressure and inadequate settlement, and excessive industrial development all produce irreparable alterations in the continuous processes of water renewal; for this reason it is fundamental to understand the problems inherent to water sources. Water, the most important of the renewable natural resources, essential for life and for the achievement of a good part of man's goals in his productive function, is directly or indirectly the natural resource most threatened by human action.

  20. Fundamental studies of fusion plasmas

    Science.gov (United States)

    Aamodt, R. E.; Catto, P. J.; Dippolito, D. A.; Myra, J. R.; Russell, D. A.

    1992-05-01

    The major portion of this program is devoted to critical ICH phenomena. The topics include edge physics, fast wave propagation, ICH induced high frequency instabilities, and a preliminary antenna design for Ignitor. This research was strongly coordinated with the world's experimental and design teams at JET, Culham, ORNL, and Ignitor. The results have been widely publicized at both general scientific meetings and topical workshops including the specialty workshop on ICRF design and physics sponsored by Lodestar in April 1992. The combination of theory, empirical modeling, and engineering design in this program makes this research particularly important for the design of future devices and for the understanding and performance projections of present tokamak devices. Additionally, the development of a diagnostic of runaway electrons on TEXT has proven particularly useful for the fundamental understanding of energetic electron confinement. This work has led to a better quantitative basis for quasilinear theory and the role of magnetic vs. electrostatic field fluctuations on electron transport. An APS invited talk was given on this subject and collaboration with PPPL personnel was also initiated. Ongoing research on these topics will continue for the remainder of the contract period and the strong collaborations are expected to continue, enhancing both the relevance of the work and its immediate impact on areas needing critical understanding.