WorldWideScience

Sample records for techniques provide detailed

  1. Level of detail technique for plant models

    Institute of Scientific and Technical Information of China (English)

    Xiaopeng ZHANG; Qingqiong DENG; Marc JAEGER

    2006-01-01

    Realistic modelling and interactive rendering of forestry and landscapes is a challenge in computer graphics and virtual reality. Recent developments in plant growth modelling and simulation lead to plant models faithful to botanical structure and development, representing not only the complex architecture of a real plant but also its functioning in interaction with its environment. The complex geometry and material of a large group of plants is a heavy burden even for high-performance computers, often overwhelming both numerical calculation power and graphic rendering power. Thus, software techniques are often developed to accelerate the rendering of groups of plants. In this paper, we focus on plant organs, i.e. leaves, flowers, fruits and internodes. Our approach is a simplification process applied to all sparse organs at the same time, i.e. Level of Detail (LOD) and multi-resolution models for plants. We explain here the principle and construction of plant simplification, used to build LOD and multi-resolution models of sparse organs and branches of big trees. These approaches benefit from basic knowledge of plant architecture, clustering tree organs according to biological structures. We illustrate the potential of our approach on several large virtual plants for geometric compression and LOD model definition. Finally, we demonstrate the efficiency of the proposed LOD models for realistic rendering with a virtual scene composed of 184 mature trees.
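
    As a minimal illustration of the LOD idea the abstract builds on (distance-driven selection of coarser models), a level selector might look like the sketch below; the thresholds are illustrative, and the paper's clustering-by-organ strategy is not reproduced here.

```python
import math

def lod_level(distance, base_distance=10.0, max_level=4):
    """Pick a level of detail from viewer distance: each doubling of
    distance past base_distance drops one level (0 = full detail).
    Thresholds are illustrative, not those of the paper."""
    if distance <= base_distance:
        return 0
    return min(max_level, int(math.log2(distance / base_distance)) + 1)

# Organs farther from the camera are drawn with coarser models.
for d in (5, 15, 40, 200):
    print(d, lod_level(d))
```

    In a renderer, each cluster of organs would be swapped for its simplified version once its level exceeds zero, which is how the scene of 184 trees stays tractable.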

  2. Arthroscopic optical coherence tomography provides detailed information on articular cartilage lesions in horses.

    Science.gov (United States)

    te Moller, N C R; Brommer, H; Liukkonen, J; Virén, T; Timonen, M; Puhakka, P H; Jurvelin, J S; van Weeren, P R; Töyräs, J

    2013-09-01

    Arthroscopy enables direct inspection of the articular surface, but provides no information on deeper cartilage layers. Optical coherence tomography (OCT), based on measurement of reflection and backscattering of light, is a diagnostic technique used in cardiovascular surgery and ophthalmology. It provides cross-sectional images at resolutions comparable to that of low-power microscopy. The aim of this study was to determine if OCT is feasible for advanced clinical assessment of lesions in equine articular cartilage during diagnostic arthroscopy. Diagnostic arthroscopy of 36 metacarpophalangeal joints was carried out ex vivo. Of these, 18 joints with varying degrees of cartilage damage were selected, wherein OCT arthroscopy was conducted using an OCT catheter (diameter 0.9 mm) inserted through standard instrument portals. Five sites of interest, occasionally supplemented with other locations where defects were encountered, were arthroscopically graded according to the International Cartilage Repair Society (ICRS) classification system. The same sites were evaluated qualitatively (ICRS classification and morphological description of the lesions) and quantitatively (measurement of cartilage thickness) on OCT images. OCT provided high resolution images of cartilage enabling determination of cartilage thickness. Comparing ICRS grades determined by both arthroscopy and OCT revealed poor agreement. Furthermore, OCT visualised a spectrum of lesions, including cavitation, fibrillation, superficial and deep clefts, erosion, ulceration and fragmentation. In addition, with OCT the arthroscopically inaccessible area between the dorsal MC3 and P1 was reachable in some cases. Arthroscopically-guided OCT provided more detailed and quantitative information on the morphology of articular cartilage lesions than conventional arthroscopy. OCT could therefore improve the diagnostic value of arthroscopy in equine orthopaedic surgery.

  3. Mathematical modeling provides kinetic details of the human immune response to vaccination

    Directory of Open Access Journals (Sweden)

    Dustin Le

    2015-01-01

    With major advances in experimental techniques to track antigen-specific immune responses, many basic questions on the kinetics of virus-specific immunity in humans remain unanswered. To gain insights into the kinetics of T and B cell responses in human volunteers, we combine mathematical models and experimental data from recent studies employing vaccines against yellow fever and smallpox. The yellow fever virus-specific CD8 T cell population expanded slowly, with an average doubling time of 2 days, peaking 2.5 weeks post-immunization. Interestingly, we found that the peak of the yellow fever-specific CD8 T cell response is determined by the rate of T cell proliferation and not by the precursor frequency of antigen-specific cells, as has been suggested in several studies in mice. We also found that while the frequency of virus-specific T cells increases slowly, this slow increase can still accurately explain clearance of yellow fever virus in the blood. An additional mathematical model describes well the kinetics of the virus-specific antibody-secreting cell and antibody response to vaccinia virus in vaccinated individuals, suggesting that most antibodies at 3 months post-immunization are derived from the population of circulating antibody-secreting cells. Taken together, our analysis provides novel insights into the mechanisms by which live vaccines induce immunity to viral infections and highlights the challenges of applying mathematical modelling methods to current, state-of-the-art yet limited immunological data.
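
    The reported expansion kinetics can be sketched with the textbook exponential-growth relation N(t) = N0 * 2^(t/Td); the doubling time (~2 days) and peak timing (~2.5 weeks) come from the abstract, while the initial frequency below is an arbitrary placeholder.

```python
def expansion(n0, doubling_time_days, t_days):
    """Exponential clonal expansion N(t) = n0 * 2**(t/Td).
    Td ~ 2 days and peak ~ 2.5 weeks are taken from the abstract;
    n0 is illustrative."""
    return n0 * 2.0 ** (t_days / doubling_time_days)

# Fold expansion at the reported peak (2.5 weeks = 17.5 days):
fold = expansion(1.0, 2.0, 17.5)
print(f"{fold:.0f}-fold expansion at peak")
```

    A doubling time of 2 days sustained to the peak implies roughly a 400-fold expansion of the starting population, which is why the peak size is governed by the proliferation rate rather than the precursor frequency.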

  4. Sparse regularization techniques provide novel insights into outcome integration processes.

    Science.gov (United States)

    Mohr, Holger; Wolfensteller, Uta; Frimmel, Steffi; Ruge, Hannes

    2015-01-01

    probabilistic learning, rule integration and reward processing. Additionally, a detailed post-hoc analysis of these regions revealed that distinct activation dynamics underlay the processing of ambiguous relative to differential outcomes. Together, these results show that L1-regularization can improve classification performance while simultaneously providing highly specific and interpretable discriminative activation patterns.
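
    The sparsity-inducing behaviour of L1-regularization that the abstract credits for specific, interpretable activation patterns comes from the soft-thresholding proximal operator, which zeroes out small weights exactly. A minimal sketch follows (not the authors' classifier, which was fit to neuroimaging data).

```python
def soft_threshold(w, lam):
    """Proximal operator of the L1 penalty: shrinks weights toward
    zero and sets those with magnitude below lam exactly to zero,
    which is what produces sparse discriminative patterns."""
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0

weights = [0.9, -0.05, 0.02, -0.7]
sparse = [soft_threshold(w, 0.1) for w in weights]
print(sparse)  # small weights are zeroed out
```

    Iterating this operator inside a gradient scheme (ISTA) yields L1-regularized solutions in which only a few voxels carry non-zero weight, hence the interpretability claimed above.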

  5. Accuracy of Implant Position Transfer and Surface Detail Reproduction with Different Impression Materials and Techniques

    Directory of Open Access Journals (Sweden)

    Marzieh Alikhasi

    2016-03-01

    Objectives: The purpose of this study was to compare the accuracy of implant position transfer and surface detail reproduction using two impression techniques and materials. Materials and Methods: A metal model with two implants and three grooves of 0.25, 0.50 and 0.75 mm in depth on the flat superior surface of a die was fabricated. Ten regular-body polyether (PE) and 10 regular-body polyvinyl siloxane (PVS) impressions with square and conical transfer copings using open tray and closed tray techniques were made for each group. Impressions were poured with type IV stone, and linear and angular displacements of the replica heads were evaluated using a coordinate measuring machine (CMM). Also, accurate reproduction of the grooves was evaluated by a video measuring machine (VMM). These measurements were compared with the measurements calculated on the reference model that served as control, and the data were analyzed with two-way ANOVA and t-test at P = 0.05. Results: There was less linear displacement for PVS and less angular displacement for PE in the closed-tray technique, and less linear displacement for PE in the open tray technique (P<0.001). Also, the open tray technique showed less angular displacement with the use of PVS impression material. Detail reproduction accuracy was the same in all the groups (P>0.05). Conclusion: The open tray technique was more accurate using PE, and both closed tray and open tray techniques had acceptable results with the use of PVS. The choice of impression material and technique made no significant difference in surface detail reproduction. Keywords: Dental Implants; Dental Impression Materials; Dental Impression Technique

  6. Accuracy of Implant Position Transfer and Surface Detail Reproduction with Different Impression Materials and Techniques

    Science.gov (United States)

    Alikhasi, Marzieh; Siadat, Hakimeh; Kharazifard, Mohammad Javad

    2015-01-01

    Objectives: The purpose of this study was to compare the accuracy of implant position transfer and surface detail reproduction using two impression techniques and materials. Materials and Methods: A metal model with two implants and three grooves of 0.25, 0.50 and 0.75 mm in depth on the flat superior surface of a die was fabricated. Ten regular-body polyether (PE) and 10 regular-body polyvinyl siloxane (PVS) impressions with square and conical transfer copings using open tray and closed tray techniques were made for each group. Impressions were poured with type IV stone, and linear and angular displacements of the replica heads were evaluated using a coordinate measuring machine (CMM). Also, accurate reproduction of the grooves was evaluated by a video measuring machine (VMM). These measurements were compared with the measurements calculated on the reference model that served as control, and the data were analyzed with two-way ANOVA and t-test at P = 0.05. Results: There was less linear displacement for PVS and less angular displacement for PE in the closed-tray technique, and less linear displacement for PE in the open tray technique (P<0.001). Also, the open tray technique showed less angular displacement with the use of PVS impression material. Detail reproduction accuracy was the same in all the groups (P>0.05). Conclusion: The open tray technique was more accurate using PE, and both closed tray and open tray techniques had acceptable results with the use of PVS. The choice of impression material and technique made no significant difference in surface detail reproduction. PMID:27252761

  7. Molecular Details of Olfactomedin Domains Provide Pathway to Structure-Function Studies.

    Directory of Open Access Journals (Sweden)

    Shannon E Hill

    detailed functional characterization of these biomedically important protein domains.

  8. Wind Turbine Rotor Simulation via CFD Based Actuator Disc Technique Compared to Detailed Measurement

    Directory of Open Access Journals (Sweden)

    Esmail Mahmoodi

    2015-10-01

    In this paper, a generalized Actuator Disc (AD) is used to model the wind turbine rotor of the MEXICO experiment, a collaborative European wind turbine project. The AD model, a combination of the CFD technique and User Defined Function (UDF) codes, the so-called UDF/AD model, is used to simulate loads and performance of the rotor in three different wind speed tests. The distributed force on the blade and the thrust and power production of the rotor, important design parameters of wind turbine rotors, are the focus of the modelling. A developed Blade Element Momentum (BEM) code-based numerical technique as well as a full rotor simulation, both from the literature, are included in the results for comparison and discussion. The output of all techniques is compared to detailed measurements for validation, which leads to the final conclusions.
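
    The actuator disc abstraction treats the rotor as a permeable disc extracting momentum from the flow. As a minimal sketch using textbook one-dimensional momentum theory (not the CFD-coupled UDF/AD implementation of the paper), thrust and power coefficients follow directly from the axial induction factor a:

```python
def thrust_coefficient(a):
    """1D momentum theory: CT = 4a(1 - a)."""
    return 4.0 * a * (1.0 - a)

def power_coefficient(a):
    """1D momentum theory: CP = 4a(1 - a)^2, maximal at a = 1/3
    (the Betz limit, 16/27)."""
    return 4.0 * a * (1.0 - a) ** 2

def thrust(a, rho, area, u_inf):
    """Rotor thrust T = 0.5 * rho * A * U^2 * CT for free-stream speed U."""
    return 0.5 * rho * area * u_inf ** 2 * thrust_coefficient(a)

# Betz-optimal induction factor gives the maximum power coefficient:
print(power_coefficient(1.0 / 3.0))
```

    A CFD-based AD model distributes this momentum sink over disc cells instead of using a single induction factor, which is what the UDF code supplies to the flow solver.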

  9. A Multi-Channel Salience Based Detail Exaggeration Technique for 3D Relief Surfaces

    Institute of Scientific and Technical Information of China (English)

    Yong-Wei Miao; Jie-Qing Feng; Jin-Rong Wang; Renato Pajarola

    2012-01-01

    Visual saliency can draw the viewer's attention to the fine-scale mesostructure of complex 3D shapes. Based on a multi-channel salience measure and a salience-domain shape modeling technique, a novel visual-saliency-based shape depiction scheme is presented to exaggerate salient geometric details of the underlying relief surface. Our multi-channel salience measure is calculated by combining three feature maps, i.e., the 0-order feature map of local height distribution, the 1-order feature map of normal difference, and the 2-order feature map of mean curvature variation. The original relief surface is first manipulated by a salience-domain enhancement function, and the detail exaggeration surface can then be obtained by adjusting the surface normals of the original surface to the corresponding final normals of the manipulated surface. The advantage of our detail exaggeration technique is that it can adaptively alter the shading of the original shape to reveal visually salient features while keeping the desired appearance unimpaired. The experimental results demonstrate that our non-photorealistic shading scheme can enhance the surface mesostructure effectively and thus improve the shape depiction of relief surfaces.
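
    The multi-channel salience measure described above is a weighted combination of three per-vertex feature maps, followed by a salience-domain enhancement. A toy sketch, with illustrative equal weights and a made-up enhancement function rather than the paper's:

```python
def salience(height_map, normal_map, curvature_map, weights=(1/3, 1/3, 1/3)):
    """Per-vertex multi-channel salience as a weighted sum of the three
    feature maps named in the abstract (height distribution, normal
    difference, mean-curvature variation). Equal weights are an
    illustrative choice."""
    return [weights[0] * h + weights[1] * n + weights[2] * c
            for h, n, c in zip(height_map, normal_map, curvature_map)]

def enhance(s, gain=2.0):
    """Hypothetical salience-domain enhancement: amplify already
    salient vertices more strongly than flat ones."""
    return [v * (1.0 + gain * v) for v in s]

s = salience([0.2, 0.9], [0.1, 0.8], [0.3, 0.7])
print(enhance(s))
```

    The enhanced salience would then drive the normal adjustment that exaggerates shading around salient features while leaving flat regions untouched.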

  10. Accuracy of Implant Position Transfer and Surface Detail Reproduction with Different Impression Materials and Techniques

    OpenAIRE

    Marzieh Alikhasi; Hakimeh Siadat; Elaheh Beyabanaki; Mohammad Javad Kharazifard

    2015-01-01

    Objectives: The purpose of this study was to compare the accuracy of implant position transfer and surface detail reproduction using two impression techniques and materials.Materials and Methods: A metal model with two implants and three grooves of 0.25, 0.50 and 0.75 mm in depth on the flat superior surface of a die was fabricated. Ten regular-body polyether (PE) and 10 regular-body polyvinyl siloxane (PVS) impressions with square and conical transfer copings using open tray and closed tray ...

  11. Reformulation: A Technique for Providing Advanced Feedback in Writing.

    Science.gov (United States)

    Cohen, Andrew D.

    1989-01-01

    The typical writing feedback situation in a language classroom is examined, followed by a discussion of reformulation as an alternative approach. An example is provided. Issues to be considered as well as benefits of the reformulation approach are described for both classroom and individual settings. (six references) (LB)

  12. A 1.4-Mb interval RH map of horse chromosome 17 provides detailed comparison with human and mouse homologues.

    Science.gov (United States)

    Lee, Eun-Joon; Raudsepp, Terje; Kata, Srinivas R; Adelson, David; Womack, James E; Skow, Loren C; Chowdhary, Bhanu P

    2004-02-01

    Comparative genomics has served as a backbone for the rapid development of gene maps in domesticated animals. The integration of this approach with radiation hybrid (RH) analysis provides one of the most direct ways to obtain physically ordered comparative maps across evolutionarily diverged species. We herein report the development of a detailed RH and comparative map for horse chromosome 17 (ECA17). With markers distributed at an average interval of every 1.4 Mb, the map is currently the most informative among the equine chromosomes. It comprises 75 markers (56 genes and 19 microsatellites), of which 50 gene specific and 5 microsatellite markers were generated in this study and typed to our 5000-rad horse x hamster whole genome RH panel. The markers are dispersed over six RH linkage groups and span 825 cR(5000). The map is among the most comprehensive whole chromosome comparative maps currently available for domesticated animals. It finely aligns ECA17 to human and mouse homologues (HSA13 and MMU1, 3, 5, 8, and 14, respectively) and homologues in other domesticated animals. Comparisons provide insight into their relative organization and help to identify evolutionarily conserved segments. The new ECA17 map will serve as a template for the development of clusters of BAC contigs in regions containing genes of interest. Sequencing of these regions will help to initiate studies aimed at understanding the molecular mechanisms for various diseases and inherited disorders in horse as well as human.

  13. Inversion Techniques for Retrieving Detailed Aerosol Properties from Remote Sensing Observations: Achievements and Perspectives

    Science.gov (United States)

    Dubovik, O.

    2010-12-01

    The ability of aerosol particles to interact strongly with electromagnetic radiation makes aerosol one of the most climatically important atmospheric components. Remote sensing, which uses this same ability to characterize the properties of atmospheric aerosol, is probably the most adequate observational approach for assessing aerosol effects in climatic studies. Indeed, satellite remote sensing is a unique technique allowing monitoring of the time variability of aerosol at regional and global scales. Compared to in situ and laboratory measurements, remote methods do not use aerosol sampling and give access to the properties of unperturbed ambient aerosol in the atmosphere. However, interpretation of remote sensing observations involves data inversion, which in practice often appears to be a sophisticated procedure leading to rather ambiguous results. Numerous publications offer a wide diversity of approaches suggesting somewhat different inversion methods. Such uncertainty in methodological guidance leads to excessive dependence of retrieval algorithms on the personalized input and preferences of the developer. This presentation highlights a continuing effort to develop a concept clarifying the differences between various methods and outlining unified principles addressing such important aspects of inversion optimization as accounting for errors in the data used, inverting data with different levels of accuracy, accounting for a priori and ancillary information, estimating retrieval errors, etc. The developed concept uses the principles of statistical estimation and suggests a generalized multi-term Least Squares type formulation that complementarily unites the advantages of a variety of practical inversion approaches, such as Phillips-Tikhonov-Twomey constrained inversion, the Kalman filter, Newton-Gauss and Levenberg-Marquardt iterations, optimal estimation, etc. The concept will be demonstrated by successful implementations in several challenging aerosol remote sensing applications.
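
    The multi-term least-squares idea at the heart of the concept can be illustrated with its simplest special case, Tikhonov-regularized inversion of a tiny linear system; this is a hand-rolled two-parameter sketch, far from a real aerosol retrieval with error covariances and many constraint terms.

```python
def tikhonov_2x2(A, b, lam):
    """Solve min ||Ax - b||^2 + lam*||x||^2 for a 2-parameter problem
    via the normal equations (A^T A + lam*I) x = A^T b, inverting the
    2x2 system by Cramer's rule."""
    ata = [[sum(A[k][i] * A[k][j] for k in range(len(A))) + (lam if i == j else 0.0)
            for j in range(2)] for i in range(2)]
    atb = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x0 = (atb[0] * ata[1][1] - atb[1] * ata[0][1]) / det
    x1 = (ata[0][0] * atb[1] - ata[1][0] * atb[0]) / det
    return [x0, x1]

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
print(tikhonov_2x2(A, b, 0.0))   # unregularized fit
print(tikhonov_2x2(A, b, 10.0))  # regularization pulls x toward zero
```

    In the multi-term formulation, additional quadratic terms (a priori estimates, smoothness constraints, ancillary data) are simply appended to the same sum of squares, each with its own weight.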

  14. Semantic Web Techniques for Yellow Page Service Providers

    Directory of Open Access Journals (Sweden)

    Raghu Anantharangachar

    2012-08-01

    Applications providing "yellow pages information" for use over the web should ideally be based on structured information. Use of web pages providing unstructured information poses a variety of problems to the user, such as arbitrary formats, unsuitability for machine processing and likely incompleteness of information. Structured data alleviates these problems, but we require more. Capturing the semantics of a domain in the form of an ontology is necessary to ensure that unforeseen applications can easily be created at a later date. Very often, yellow page systems are implemented using a centralized database. In some cases, human intermediaries accessible over the phone network examine a centralized database and use their reasoning ability to deal with the user's need for information. Centralized operation and considerable central administration make these systems expensive to operate, and scaling them up is difficult. They behave like isolated systems, and it is common for such systems to be highly domain specific, for instance systems dealing with accommodation and travel. This paper explores an alternative: a highly distributed system design meeting a variety of needs, considerably reducing the effort required at a central organization, enabling large numbers of vendors to enter information about their own products and services, enabling end-users to contribute information such as their own ratings, using an ontology to describe each domain of application in a flexible manner for uses foreseen and unforeseen, enabling distributed search and mashups, using vendor-independent standards, using reasoning to find the best matches to a given query, supporting geospatial reasoning, and providing a simple, interactive mobile application/interface. We view this design as one in which vendors and end-users do the bulk of the work in building large distributed collections of information in a Web 2.0 style. We give importance to geo-spatial information and

  15. Multiple-endpoint assay provides a detailed mechanistic view of responses to herbicide exposure in Chlamydomonas reinhardtii

    Energy Technology Data Exchange (ETDEWEB)

    Nestler, Holger [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Ueberlandstrasse 133, 8600 Duebendorf (Switzerland); ETH Zurich, Swiss Federal Institute of Technology, Institute of Biogeochemistry and Pollutant Dynamics, Universitaetstrasse 16, 8092 Zurich (Switzerland); Groh, Ksenia J.; Schoenenberger, Rene; Behra, Renata [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Ueberlandstrasse 133, 8600 Duebendorf (Switzerland); Schirmer, Kristin [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Ueberlandstrasse 133, 8600 Duebendorf (Switzerland); ETH Zurich, Swiss Federal Institute of Technology, Institute of Biogeochemistry and Pollutant Dynamics, Universitaetstrasse 16, 8092 Zurich (Switzerland); EPF Lausanne, School of Architecture, Civil and Environmental Engineering, 1015 Lausanne (Switzerland); Eggen, Rik I.L. [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Ueberlandstrasse 133, 8600 Duebendorf (Switzerland); ETH Zurich, Swiss Federal Institute of Technology, Institute of Biogeochemistry and Pollutant Dynamics, Universitaetstrasse 16, 8092 Zurich (Switzerland); Suter, Marc J.-F., E-mail: suter@eawag.ch [Eawag, Swiss Federal Institute of Aquatic Science and Technology, Ueberlandstrasse 133, 8600 Duebendorf (Switzerland); ETH Zurich, Swiss Federal Institute of Technology, Institute of Biogeochemistry and Pollutant Dynamics, Universitaetstrasse 16, 8092 Zurich (Switzerland)

    2012-04-15

    The release of herbicides into the aquatic environment raises concerns about potential detrimental effects on ecologically important non-target species, such as unicellular algae, necessitating ecotoxicological risk assessment. Algal toxicity tests based on growth, a commonly assessed endpoint, are integrative, and hence do not provide information about underlying toxic mechanisms and effects. This limitation may be overcome by measuring more specific biochemical and physiological endpoints. In the present work, we developed and applied a novel multiple-endpoint assay, and analyzed the effects of the herbicides paraquat, diuron and norflurazon, each representing a specific mechanism of toxic action, on the single celled green alga Chlamydomonas reinhardtii. The endpoints added to assessment of growth were pigment content, maximum and effective photosystem II quantum yield, ATP content, esterase and oxidative activity. All parameters were measured at 2, 6 and 24 h of exposure, except for growth and pigment content, which were determined after 6 and 24 h only. Effective concentrations causing 50% of response (EC50s) and lowest observable effect concentrations (LOECs) were determined for all endpoints and exposure durations where possible. The assay provided a detailed picture of the concentration- and time-dependent development of effects elicited by the analyzed herbicides, thus improving the understanding of the underlying toxic mechanisms. Furthermore, the response patterns were unique to the respective herbicide and reflected the different mechanisms of toxicity. The comparison of the endpoint responses and sensitivities revealed that several physiological and biochemical parameters reacted earlier or stronger to disturbances than growth. Overall, the presented multiple-endpoint assay constitutes a promising basis for investigating stressor and toxicant effects in green algae.
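
    The EC50 values reported per endpoint are typically obtained by fitting a sigmoidal concentration-response model. A minimal sketch using a Hill curve and bisection; the parameter values are illustrative, not results from the study.

```python
def hill_response(c, ec50, n=1.0):
    """Fractional response of the Hill (log-logistic) model commonly
    used to summarize concentration-response data; ec50 and the Hill
    slope n here are illustrative."""
    return c ** n / (ec50 ** n + c ** n)

def find_ec50(response_fn, target=0.5, lo=1e-6, hi=1e6, iters=60):
    """Bisect a monotonic response curve for the concentration giving
    the target response (0.5 for an EC50)."""
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if response_fn(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

print(round(find_ec50(lambda c: hill_response(c, ec50=3.0)), 3))  # recovers 3.0
```

    Repeating such a fit per endpoint and per exposure time yields the EC50 matrix from which endpoint sensitivities can be compared.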

  16. An automated technique for detailed μ-FTIR mapping of diamond and spectral deconvolution

    Science.gov (United States)

    Howell, Dan; Griffin, Bill; O'Neill, Craig; O'Reilly, Suzanne; Pearson, Norman; Handley, Heather

    2010-05-01

    other commonly found defects and impurities, whether intrinsic defects like platelets, extrinsic defects like hydrogen or boron atoms, or inclusions of minerals or fluids. Recent technological developments in the field of spectroscopy allow detailed μ-FTIR analysis to be performed rapidly in an automated fashion. The Nicolet iN10 microscope has an integrated design that maximises signal throughput and allows spectra to be collected with greater efficiency than is possible with conventional μ-FTIR spectrometer-microscope systems. Combining this with a computer-controlled x-y stage allows the automated measurement of several thousand spectra in only a few hours. This affords us the ability to record 2D IR maps of diamond plates with minimal effort, but has created the need for an automated technique to process the large quantities of IR spectra and obtain quantitative data from them. We will present new software routines that can process large batches of IR spectra, including baselining, conversion to absorption coefficient, and deconvolution to identify and quantify the various nitrogen components. Possible sources of error in each step of the process will be highlighted so that the data produced can be critically assessed. The end result is the production of false-colour 2D maps showing the distribution of nitrogen concentrations and aggregation states, as well as other identifiable components.
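
    Of the processing steps listed (baselining, conversion, deconvolution), the simplest, baseline removal, can be sketched as a straight-line correction anchored at the spectrum endpoints; this is a stand-in, as the authors' batch routines are more elaborate.

```python
def subtract_linear_baseline(spectrum):
    """Remove a straight-line baseline anchored at the first and last
    points of a spectrum, leaving only the peaks above that line; a
    minimal stand-in for the batch baselining step described above."""
    n = len(spectrum)
    y0, y1 = spectrum[0], spectrum[-1]
    return [y - (y0 + (y1 - y0) * i / (n - 1)) for i, y in enumerate(spectrum)]

# A baseline tilted upward with one absorption peak at index 2:
corrected = subtract_linear_baseline([1.0, 1.5, 4.0, 2.5, 3.0])
print([round(v, 2) for v in corrected])
```

    After baselining, each spectrum would be scaled to absorption coefficient and decomposed into known nitrogen-defect components, which is the deconvolution step the routines automate.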

  17. Developing Detailed Foresight Narratives: a Participatory Technique from the Mekong Region

    Directory of Open Access Journals (Sweden)

    Tira Foran

    2013-12-01

    Narratives that explore uncertain events are central to a variety of future-oriented approaches ranging from planning to community visioning. Techniques to create interesting narratives, however, have been overlooked in the peer-reviewed environmental foresight literature. We describe a participatory, multidimensional, pragmatic technique to generate qualitative foresight ("scenario") narratives. We applied this technique in the Mekong region of Southeast Asia during 11 workshops conducted in 5 countries and 1 regional setting. To improve consideration of systemic connections, narratives were shared between the six settings, allowing participants to understand cross-scale enablers and inhibitors of desired development outcomes. A second innovative feature is the elaboration of character-oriented narratives. We discuss how the technique responds to ongoing methodological challenges of critical inquiry, policy salience, and agency.

  18. On Technique of Expressing the Consciousness and the Detail Description in Katherine Mansfield’s A Dill Pickle

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper analyzes the writing style of A Dill Pickle, especially the techniques of expressing consciousness and of detail description, to discuss Mansfield's artistic characteristics. It focuses on the theme of loneliness and the disillusionment of women, and on the story's distinctive Modernist tendency.

  19. A Detailed look of Audio Steganography Techniques using LSB and Genetic Algorithm Approach

    OpenAIRE

    Gunjan Nehru; Puja Dhar

    2012-01-01

    This paper is a study of various techniques of audio steganography using different algorithms, such as the genetic algorithm approach and the LSB approach. We have tried some approaches that help in audio steganography. As we know, it is the art and science of writing hidden messages in such a way that no one, apart from the sender and intended recipient, suspects the existence of the message, a form of security through obscurity. In steganography, the message used to hide the secret message is called the hos...

  20. Combined geophysical techniques for detailed groundwater flow investigation in tectonically deformed fractured rocks

    Directory of Open Access Journals (Sweden)

    John Alexopoulos

    2014-02-01

    In this paper we present a combination of several near-surface geophysical investigation techniques with high resolution remote sensing image interpretations, in order to define the groundwater flow paths and whether they can be affected by future seismic events. A seasonal spring (Amvrakia), located at the foot of the Meteora pillars near the village of Kastraki (Greece), was chosen as a test site. The Meteora conglomeratic formations crop out throughout the study area and are characterized by large discontinuities caused by post-Miocene to present tectonic deformation [Ferriere et al. 2011, Royden and Papanikolaou 2011]. A network of groundwater pathways has developed above the impermeable marls underlying the conglomeratic strata. Our research aims to define these water pathways in order to investigate and understand the exact mechanism of the spring, by mapping the exposed discontinuity network with classic field mapping and remote sensing image interpretation and defining their underground continuity with the contribution of near-surface geophysical techniques. Five Very Low Frequency (VLF) profiles were conducted in different directions around the spring, aiming to detect possible conductive zones in the conglomeratic formations of which the study area consists. Moreover, two Electrical Resistivity Tomography (ERT) sections of a total length of 140 m were carried out parallel to the VLF profiles for cross-checking and verifying the geophysical information. Both techniques revealed important conductive zones (<200 Ohm m) within the conglomerate strata, which we interpret as discontinuities filled with water supplying the spring; these are quite vulnerable to displacements, as the hydraulic connections between them might easily be disturbed by a future seismic event.

  1. A Detailed look of Audio Steganography Techniques using LSB and Genetic Algorithm Approach

    Directory of Open Access Journals (Sweden)

    Gunjan Nehru

    2012-01-01

    This paper is a study of various techniques of audio steganography using different algorithms, such as the genetic algorithm approach and the LSB approach. We have tried some approaches that help in audio steganography. As we know, it is the art and science of writing hidden messages in such a way that no one, apart from the sender and intended recipient, suspects the existence of the message, a form of security through obscurity. In steganography, the message used to hide the secret message is called the host message or cover message. Once the contents of the host or cover message are modified, the resultant message is known as the stego message; in other words, the stego message is the combination of the host message and the secret message. Audio steganography requires a text or audio secret message to be embedded within a cover audio message. Due to the availability of redundancy, the cover audio message before steganography and the stego message after steganography remain perceptually the same, making audio well suited for information hiding.
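
    The LSB approach described above can be sketched in a few lines: each secret bit replaces the least significant bit of one cover sample, changing that sample by at most one quantization step. This is a minimal sketch of plain LSB embedding; no genetic-algorithm bit placement is attempted here.

```python
def embed_lsb(samples, message_bits):
    """Hide one message bit in the least significant bit of each
    integer audio sample of the cover message."""
    out = list(samples)
    for i, bit in enumerate(message_bits):
        out[i] = (out[i] & ~1) | bit
    return out

def extract_lsb(samples, n_bits):
    """Recover the hidden bits from the stego message."""
    return [s & 1 for s in samples[:n_bits]]

cover = [100, 57, 204, 33, 18, 91]   # cover message (PCM sample values)
secret = [1, 0, 1, 1, 0, 0]          # secret message bits
stego = embed_lsb(cover, secret)
assert extract_lsb(stego, 6) == secret  # round-trip recovers the message
```

    Because each sample changes by at most 1, the stego audio is perceptually indistinguishable from the cover, which is the redundancy the abstract refers to.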

  2. LOCKE Detailed Specification Tables

    CERN Document Server

    Menezo, Lucia G; Gregorio, Jose-Angel

    2012-01-01

    This document shows the detailed specification of LOCKE coherence protocol for each cache controller, using a table-based technique. This representation provides clear, concise visual information yet includes sufficient detail (e.g., transient states) arguably lacking in the traditional, graphical form of state diagrams.

  3. Technique for direct measurement of thermal conductivity of elastomers and a detailed uncertainty analysis

    Science.gov (United States)

    Ralphs, Matthew I.; Smith, Barton L.; Roberts, Nicholas A.

    2016-11-01

    High thermal conductivity thermal interface materials (TIMs) are needed to extend the life and performance of electronic circuits. A stepped bar apparatus has been shown to work well for thermal resistance measurements with rigid materials, but most TIMs are elastic. This work studies the uncertainty of using a stepped bar apparatus to measure the thermal resistance, and a tensile/compression testing machine to estimate the compressed thickness, of polydimethylsiloxane for a measurement of the thermal conductivity, k_eff. An a priori, zeroth-order analysis is used to estimate the random uncertainty from the instrumentation; a first-order analysis is used to estimate the statistical variation in samples; and an a posteriori, Nth-order analysis is used to provide an overall uncertainty on k_eff for this measurement method. Bias uncertainty in the thermocouples is found to be the largest single source of uncertainty. The a posteriori uncertainty of the proposed method is 6.5% relative uncertainty (68% confidence), but could be reduced through calibration and correlated biases in the temperature measurements.
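
    The combination of independent uncertainty sources into an overall relative uncertainty follows the standard root-sum-square rule of propagation-of-uncertainty analyses; the component values below are illustrative, not the paper's actual uncertainty budget.

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-square combination of independent relative uncertainty
    components, u_c = sqrt(sum(u_i^2)), as in a standard
    propagation-of-uncertainty analysis."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical budget: a dominant thermocouple bias plus two smaller sources.
u = combined_relative_uncertainty([0.060, 0.020, 0.015])
print(round(u, 3))  # combined relative uncertainty, ~6.5%
```

    Note how the largest component dominates the quadrature sum, which is why reducing thermocouple bias is the most effective route to a tighter bound on k_eff.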

  4. Evidence for the efficacy of the MORI technique: viewers do not notice or implicitly remember details from the alternate movie version.

    Science.gov (United States)

    French, Lauren; Gerrie, Matthew P; Garry, Maryanne; Mori, Kazuo

    2009-11-01

    The MORI technique provides a unique way to research social influences on memory. The technique allows people to watch different movies on the same screen at the same time without realizing that each of them sees something different. As a result, researchers can create a situation in which people feel as though they share an experience, but systematic differences are introduced into their memories, and the effect of those differences can be tracked through a discussion. Despite its methodological advances, the MORI technique has been met with criticism, mostly because reviewers are worried that the MORI technique might not completely block the alternate movie version from view, leading people in these studies to see their partner's version of the movie as well as their own. We addressed these concerns in two experiments. We found no evidence that subjects noticed the alternate movie version while watching a movie via the MORI technique (Experiment 1) and no evidence that subjects remembered details from the alternate movie version (Experiment 2). Taken together, the results provide support for the MORI technique as a valuable research tool.

  5. Accuracy of implant transfer and surface detail reproduction with polyether and polyvinyl siloxane using closed-tray impression technique

    Directory of Open Access Journals (Sweden)

    Marzieh Alikhasi

    2013-10-01

    Full Text Available   Background and Aims: Making accurate impressions of prepared teeth when they are adjacent to dental implants is of great importance. In these situations, disregarding the selection of an appropriate impression material and technique can not only affect the accuracy of transferring the 3-dimensional spatial status of the implant, but can also jeopardize the accurate recording of the tooth. In the present study, the accuracy of two impression materials with taper impression copings for recording implant position and surface details was evaluated.   Materials and Methods: One metal reference model with 2 implants (Implantium) and a tooth prepared with three grooves according to ADA no. 19 standard was fabricated. 10 medium-consistency polyether (PE) impressions using custom trays and 10 polyvinyl siloxane (PVS) putty wash impressions using prefabricated trays with conical impression copings were made. Impressions were poured with ADA type IV stone. A Coordinate Measuring Machine (CMM) evaluated x, y and angular displacement of the implant analog heads, and the accuracy of groove reproduction was measured using a Video Measuring Machine (VMM). These measurements were compared to the ones from the reference model. Data were analyzed using one-way ANOVA and t-test.   Results: Putty wash PVS had less linear discrepancy compared with the reference model (P < 0.001). There was no significant difference in the surface detail reproduction (P = 0.15).   Conclusion: Putty wash PVS had better results for linear displacement compared with medium-consistency PE. There was no significant difference in surface detail reproduction between the two impression materials.
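
The CMM comparisons in this record reduce to simple vector arithmetic: a 3-D distance for linear discrepancy and an angle between implant axes for angular discrepancy. A minimal sketch with invented coordinates (real machine outputs differ):

```python
import math

def linear_discrepancy(ref, measured):
    """3-D Euclidean distance between reference and measured implant head centers (mm)."""
    return math.dist(ref, measured)

def angular_discrepancy(ref_axis, meas_axis):
    """Angle (degrees) between reference and measured implant axis vectors."""
    dot = sum(a * b for a, b in zip(ref_axis, meas_axis))
    norm = math.sqrt(sum(a * a for a in ref_axis)) * math.sqrt(sum(b * b for b in meas_axis))
    # clamp to [-1, 1] to guard against floating-point drift before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Hypothetical CMM readings (mm) for one implant analog
ref_center  = (0.000, 0.000, 0.000)
meas_center = (0.032, -0.018, 0.011)
print(round(linear_discrepancy(ref_center, meas_center), 3))   # ~0.038 mm

ref_axis  = (0.0, 0.0, 1.0)
meas_axis = (0.01, 0.00, 0.9999)
print(round(angular_discrepancy(ref_axis, meas_axis), 2))      # ~0.57 degrees
```

Comparing these per-implant numbers across material/technique groups is what the ANOVA and t-tests in such studies operate on.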

  6. Swarm Intelligence: New Techniques for Adaptive Systems to Provide Learning Support

    Science.gov (United States)

    Wong, Lung-Hsiang; Looi, Chee-Kit

    2012-01-01

    The notion of a system adapting itself to provide support for learning has always been an important issue of research for technology-enabled learning. One approach to provide adaptivity is to use social navigation approaches and techniques which involve analysing data of what was previously selected by a cluster of users or what worked for…

  7. Endoscopic Surgical Treatment of Lumbar Synovial Cyst: Detailed Account of Surgical Technique and Report of 11 Consecutive Patients.

    Science.gov (United States)

    Oertel, Joachim M; Burkhardt, Benedikt W

    2017-07-01

    Lumbar synovial cysts (LSCs) are an uncommon cause of radiculopathy and back pain. Open surgical treatment is associated with extensive bone resection and muscle trauma. The endoscopic tubular-assisted LSC resection has not been described in detail. Here the authors assessed the effectiveness of this technique for LSC resection. Eleven patients (4 female and 7 male patients) were operated on via an ipsilateral approach for resection of LSC using an endoscopic tubular retractor system. Preoperative magnetic resonance imaging was evaluated for signs of degeneration and instability. At follow-up a standardized questionnaire including the Oswestry Disability Index and functional outcome according to MacNab criteria was conducted. Additionally, a personal examination with particular reference to back and leg pain was performed. The mean follow-up was 10.5 months. Preoperatively, spondylolisthesis grade 1 was noted in 4 patients (36.4%). Ten patients had bilateral facet joint effusion (90.9%). At follow-up 10 patients reported being free of leg pain (90.9%), eight patients reported no back pain (72.7%), ten patients had full motor strength (90.9%), and 9 patients had no sensory deficit (81.8%). Nine patients reported an excellent or a good clinical outcome (81.8%). The mean Oswestry Disability Index was 4.7%. None of the patients developed new mechanical low back pain or required subsequent fusion procedure. The endoscopic tubular-assisted procedure is a safe way to treat LSC. It offers complete resection of LSC and achieves good clinical outcome by preserving muscle and ligamentous and bony structures, which prevents delayed instability. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Application of an anatomically-detailed finite element thorax model to investigate pediatric cardiopulmonary resuscitation techniques on hard bed.

    Science.gov (United States)

    Jiang, Binhui; Mao, Haojie; Cao, Libo; Yang, King H

    2014-09-01

    Improved Cardiopulmonary Resuscitation (CPR) approaches will largely benefit the children in need. The constant peak displacement and constant peak force loading methods were analyzed on a hard bed for pediatric CPR using an anatomically detailed 10-year-old (YO) child thorax finite element (FE) model. The chest compression and rib injury risk were studied for children with various levels of thorax stiffness. We created three thorax models with different chest stiffness. Simulated CPRs under the above two conditions were performed. Three different compression rates were considered under the constant peak displacement condition. The model-calculated deflections and forces were analyzed. The rib maximum principal strains (MPSs) were used to predict the potential risk of rib injury. Under the constant peak force condition, the chest deflection ranged from 34.2 to 42.2 mm. The highest rib MPS was 0.75%, predicted by the compliant thorax model. Under the normal constant peak displacement condition, the highest rib MPS was 0.52%, predicted by the compliant thorax model. The compression rate did not affect the highest rib MPS. Results revealed that the thoracic stiffness had great effects on the quality of CPR. To maintain CPR quality for various children, the constant peak displacement technique is recommended when CPR is performed on a hard bed. Furthermore, the outcome of CPR in terms of rib strains and total work is not sensitive to the compression rate. The FE model predicted high strains in the ribs, which have been found to be vulnerable to CPR in the literature. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Brief: market research techniques--a synopsis for continuing education providers.

    Science.gov (United States)

    Wilson, S G

    1990-01-01

    Any organization that attempts to attract and hold customers must face the necessity of determining what people want and value, and then cater to those wants and values (Levitt, 1983). Educational providers can use these market research techniques to strengthen the design and implementation of nursing CE offerings. In addition to alleviating the monotony of pen and paper questionnaires for our learners, these techniques can strengthen our programming by gathering a wealth of information about the qualitative and quantitative nature of the needs of our customers. The marketing knowledge gained from these tools can help to ensure the continued success of our educational endeavors despite growing fiscal constraints.

  10. Using public health detailing and a family-centered ecological approach to promote patient-provider-parent action for reducing childhood obesity.

    Science.gov (United States)

    Sealy, Yvette M; Zarcadoolas, Christina; Dresser, Michelle; Wedemeyer, Laura; Short, Leslie; Silver, Lynn

    2012-04-01

    This paper describes the research and development of the Obesity in Children Action Kit, a paper-based chronic disease management tool of the Public Health Detailing Program (PHD) at the New York City (NYC) Department of Health and Mental Hygiene (DOHMH). It also describes PHD's process for developing the Obesity in Children detailing campaign (targeting healthcare providers working with children aged 2-18) and its results, during which the Action Kit materials were a focal point. The campaign goals were to impact healthcare provider clinical behaviors, improve the health literacy of parents and children, instigate patient-provider-parent dialogue, and change family practices to prevent obesity. Qualitative research methods consisted of healthcare provider in-depth interviews and parent focus groups to aid campaign development. Evaluation of the Obesity in Children campaign included self-reported data on uptake and usage of clinical tools and action steps of matched assessments from 237 healthcare provider initial and follow-up visits, material stock counts, and DOHMH representative qualitative visit excerpts. Key themes identified in parent focus groups were concerns about childhood diabetes and high blood pressure, awareness of cultural pressure and our "supersize" culture, frustration with family communication around overweight and obesity, lack of knowledge about food quality and portion size, economic pressures, and the availability of healthy and nutritious foods. During the Obesity in Children campaign, six representatives reached 161 practices with 1,588 one-on-one interactions, and an additional 461 contacts were made through group presentations. 
After these interactions, there was a significant increase in the percentage of physicians' self-reported use of key recommended practices: use of BMI percentile-for-age to assess for overweight or obesity at every visit increased from 77% to 88%, as did use of tools such as a soda bottle showing sugar content, pediatric plate

  11. Optimum radiotherapy schedule for uterine cervical cancer based-on the detailed information of dose fractionation and radiotherapy technique

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jae Ho; Kim, Hyun Chang; Suh, Chang Ok [Yonsei University Medical School, Seoul (Korea, Republic of)] (and others)

    2005-09-15

    The best dose-fractionation regimen of the definitive radiotherapy for cervix cancer remains to be clearly determined, which seems partially attributable to the complexity of the affecting factors and the lack of detailed information on external and intracavitary fractionation. To find optimal practice guidelines, our experiences of the combination of external beam radiotherapy (EBRT) and high-dose-rate intracavitary brachytherapy (HDR-ICBT) were reviewed with detailed information on the various treatment parameters, obtained from a large cohort of women treated homogeneously at a single institute. The subjects were 743 cervical cancer patients (Stage IB 198, IIA 77, IIB 364, IIIA 7, IIIB 89 and IVA 8) treated by radiotherapy alone between 1990 and 1996. A total external beam radiotherapy (EBRT) dose of 23.4 to 59.4 Gy (median 45.0) was delivered to the whole pelvis. High-dose-rate intracavitary brachytherapy (HDR-ICBT) was also performed using various fractionation schemes. A midline block (MLB) was initiated after the delivery of 14.4 to 43.2 Gy (median 36.0) of EBRT in 495 patients, while in the other 248 patients the MLB could not be used due to slow tumor regression or the huge initial bulk of the tumor. The point A, actual bladder and rectal doses were individually assessed in all patients. The biologically effective dose (BED) to the tumor (α/β = 10) and late-responding tissues (α/β = 3) was calculated for both EBRT and HDR-ICBT. The total BED values at point A and at the actual bladder and rectal reference points were the summation of the EBRT and HDR-ICBT contributions. In addition to all the details on dose fractionation, the other factors (i.e. the overall treatment time, physicians' preference) that can affect the schedule of the definitive radiotherapy were also thoroughly analyzed. The association between MD-BED Gy₃ and the risk of complication was assessed using serial multiple logistic regression models. 
The associations between R
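
The BED arithmetic in this record follows the standard linear-quadratic model, BED = n·d·(1 + d/(α/β)). A small sketch; the fractionation numbers below are illustrative, not this cohort's:

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose (Gy) under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# Hypothetical schedule: 45 Gy EBRT in 25 x 1.8 Gy plus 30 Gy HDR-ICBT in 6 x 5 Gy
ebrt = bed(25, 1.8, 10.0)   # tumor effect, alpha/beta = 10
icbt = bed(6, 5.0, 10.0)
total = ebrt + icbt         # total BED (Gy_10) at point A = EBRT + ICBT summation
print(total)
```

The same function with alpha_beta = 3 gives the late-responding-tissue BED (Gy₃) that the abstract relates to complication risk.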

  12. Accuracy of implant transfer with open-tray and closed-tray impression techniques and surface detail reproduction of the tooth during impression

    Directory of Open Access Journals (Sweden)

    Hakimeh Siadat

    2012-01-01

    Full Text Available Background and Aims: Accurate recording of implant location is required to achieve passive fit and leave the implants free of stress concentration. The aim of this in-vitro study was to evaluate the dimensional and angular accuracy of open-tray and closed-tray impression techniques using polyether impression material, and also to assess the surface detail reproduction of the tooth during impression making. Materials and Methods: One reference metal model with 2 implants (Implantium) in the positions of the maxillary second premolar and first molar, and one molar tooth for evaluation of surface details, was prepared. 27 polyether impressions of these models were made (9 using the open-tray technique, 9 using the closed-tray technique, and 9 of the tooth surface alone without any implants). Impressions were poured with ADA type IV stone. A coordinate measuring machine was used for measuring the dimensional accuracy and a video measuring machine for surface detail reproduction. All of these measurements were compared with the measurements on the reference model. Data were analyzed and compared by t-test and one-way ANOVA. Results: There was a statistically significant difference between the open-tray and closed-tray techniques, whereas surface detail reproduction did not differ significantly (P > 0.05). Conclusion: The accuracy of the open-tray impression technique was greater than that of the closed-tray technique. The surface detail reproduction of the tooth was not affected by the impression technique.

  13. A cone-beam CT based technique to augment the 3D virtual skull model with a detailed dental surface.

    NARCIS (Netherlands)

    Swennen, G.R.; Mommaerts, M.Y.; Abeloos, J.V.S.; Clercq, C. De; Lamoral, P.; Neyt, N.; Casselman, J.W.; Schutyser, F.A.C.

    2009-01-01

    Cone-beam computed tomography (CBCT) is used for maxillofacial imaging. 3D virtual planning of orthognathic and facial orthomorphic surgery requires detailed visualisation of the interocclusal relationship. This study aimed to introduce and evaluate the use of a double CBCT scan procedure with a modified wax bite wafer to augment the 3D virtual skull model with a detailed dental surface.

  14. A detailed review of hip reduction maneuvers: a focus on physician safety and introduction of the Waddell technique

    Directory of Open Access Journals (Sweden)

    Bradford S. Waddell

    2016-03-01

    Full Text Available Dislocation of the hip is a well-described event that occurs in conjunction with high-energy trauma or postoperatively after total hip arthroplasty. Bigelow first described closed treatment of a dislocated hip in 1870, and in the last decade many reduction techniques have been proposed. In this article, we review all described techniques for the reduction of hip dislocation while focusing on physician safety. Furthermore, we introduce a modified technique for the reduction of posterior hip dislocation that allows the physician to adhere to the back safety principles set forth by the Occupational Safety and Health Administration.

  15. A cone-beam CT based technique to augment the 3D virtual skull model with a detailed dental surface.

    Science.gov (United States)

    Swennen, G R J; Mommaerts, M Y; Abeloos, J; De Clercq, C; Lamoral, P; Neyt, N; Casselman, J; Schutyser, F

    2009-01-01

    Cone-beam computed tomography (CBCT) is used for maxillofacial imaging. 3D virtual planning of orthognathic and facial orthomorphic surgery requires detailed visualisation of the interocclusal relationship. This study aimed to introduce and evaluate the use of a double CBCT scan procedure with a modified wax bite wafer to augment the 3D virtual skull model with a detailed dental surface. The impressions of the dental arches and the wax bite wafer were scanned separately for ten patients using a high resolution standardized CBCT scanning protocol. Surface-based rigid registration using ICP (iterative closest points) was used to fit the virtual models on the wax bite wafer. Automatic rigid point-based registration of the wax bite wafer on the patient scan was performed to implement the digital virtual dental arches into the patient's skull model. Probability error histograms showed low registration errors, and the double scan procedure with the modified wax bite wafer allowed the set-up of a 3D virtual augmented model of the skull with a detailed dental surface.
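
The rigid point-based registration step described in this record is classically solved in closed form with the Kabsch/Procrustes algorithm (the least-squares core that ICP also iterates). A self-contained sketch with hypothetical fiducial points standing in for wafer landmarks:

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rigid transform (Kabsch): find R, t with R @ p_i + t ~ q_i."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)            # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])           # guard against an improper (reflected) solution
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Hypothetical fiducial points on a wax bite wafer, seen in two CBCT scans
P = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0,              0,             1]])
Q = P @ R_true.T + np.array([1.0, -2.0, 3.0])   # second-scan coordinates

R, t = rigid_register(P, Q)
print(np.allclose(R, R_true), np.allclose(t, [1, -2, 3]))
```

ICP alternates this closed-form solve with nearest-point correspondence search, which is how the surface-based fitting in the abstract proceeds.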

  16. Android-Stego: A Novel Service Provider Imperceptible MMS Steganography Technique Robust to Message Loss

    Directory of Open Access Journals (Sweden)

    Avinash Srinivasan

    2015-08-01

    Full Text Available Information hiding techniques, especially steganography, have been extensively researched for over two decades. Nonetheless, steganography on smartphones over cellular carrier networks is yet to be fully explored. Today, smartphones, which are at the epitome of ubiquitous and pervasive computing, make steganography an easily accessible covert communication channel. In this paper, we propose Android-Stego - a framework for steganography employing smartphones. Android-Stego has been evaluated and confirmed to achieve covert communication over real-world cellular service providers' communication networks such as Verizon and Sprint. A key contribution of our research presented in this paper is the benchmark results we have provided by analyzing real-world cellular carriers' network restrictions on MMS message size. We have also analyzed the actions the carriers take - such as compression and/or format conversion - on MMS messages that fall outside the established MMS communication norm, which varies for each service provider. Finally, we have used these benchmark results in implementing Android-Stego such that it is sensitive to carrier restrictions and robust to message loss.

  17. Current status on the detailed design and development of fabrication techniques for the ITER blanket shield block in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Duck-Hoi [National Fusion Research Institute, 52 Yeoeun-dong, Yuseong-gu, Daejeon 305-333 (Korea, Republic of)], E-mail: kdwh@nfri.re.kr; Cho, Seungyon; Ahn, Mu-Young; Lee, Eun-Seok; Jung, Ki Jung [National Fusion Research Institute, 52 Yeoeun-dong, Yuseong-gu, Daejeon 305-333 (Korea, Republic of); Kim, Do-Hyeong [ANST, Inc., 222-7 Guro3-dong, Guro-gu, Seoul 152-848 (Korea, Republic of)

    2008-12-15

    Recent activities and progress on the design and fabrication of the ITER blanket shield block in Korea are described in this paper. Hydraulic analyses, using a flow driver model for determining the gap between the radial cooling passages and flow drivers inside the shield block, were performed. The thermo-hydraulic analysis of half of a shield block was also conducted to investigate the uniformity of the flow stream in cooling passages and to evaluate the temperature distribution in the structure. The maximum temperature is below the allowable value, although hot spots occurred in the corner edge in the shield block. A manufacturing feasibility study for the development of the blanket shield block was performed in cooperation with KO industries. It was found that specific techniques would be required for the successful fabrication of an ITER blanket shield block, specifically electron-beam welding at a thickness up to 110 mm. The development of joining and drilling technologies for the thick shield block and lid joints is in progress. In addition, a full scale mock-up fabrication and the development of NDT techniques are planned in the near future.

  18. A simple standard technique for labyrinthectomy in the rat: A methodical communication with a detailed description of the surgical process.

    Science.gov (United States)

    Nádasy, G L; Raffai, G; Fehér, E; Schaming, G; Monos, E

    2016-09-01

    Aims Labyrinthectomized rats are suitable models to test consequences of vestibular lesion and are widely used to study neural plasticity. We describe a combined microsurgical-chemical technique that can be routinely performed with minimum damage. Methods Caudal leaflet of the parotis is elevated. The tendinous fascia covering the bulla is opened frontally from the sternomastoid muscle's tendon while sparing facial nerve branches. A 4 mm diameter hole is drilled into the bulla's hind lower lateral wall to open the common (in rodents) mastoid-tympanic cavity. The cochlear crista (promontory) at the lower posterior part of its medial wall is identified as a bony prominence. A 1 mm diameter hole is drilled into its lower part. The perilymphatic/endolymphatic fluids with tissue debris of the Corti organ are suctioned. Ethanol is injected into the hole. Finally, 10 µL of sodium arsenite solution (50 µM/mL) is pumped into the labyrinth and left in place for 15 min. Simple closure in two layers (fascia and skin) is sufficient. Results and conclusion All rats had neurological symptoms specific for labyrinthectomy (muscle tone, body position, rotatory movements, nystagmus, central deafness). Otherwise, their behavior was unaffected, drinking and eating normally. After a few days, they learned to balance relying on visual and somatic stimuli (neuroplasticity).

  19. Applying BI Techniques To Improve Decision Making And Provide Knowledge Based Management

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2015-07-01

    Full Text Available The paper focuses on BI techniques, especially data mining algorithms, that can support and improve the decision making process, with applications within the financial sector. We consider data mining techniques to be the more efficient approach and thus applied several of them, both supervised and unsupervised learning algorithms. The case study in which these algorithms have been implemented concerns the activity of a banking institution, with focus on the management of lending activities.
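
An unsupervised example in the spirit of this record: clustering borrower records with plain k-means (Lloyd's algorithm). The data, features, and initialization are invented for illustration; the paper's actual algorithms and banking data are not reproduced here.

```python
import numpy as np

# Toy borrower records: [debt-to-income ratio, payment delays per year]
X = np.array([[0.10, 0], [0.15, 1], [0.12, 0],     # low-risk-looking profiles
              [0.60, 6], [0.55, 7], [0.65, 5]],    # high-risk-looking profiles
             dtype=float)

def kmeans(X, init_idx, iters=20):
    """Plain Lloyd's algorithm with a fixed initialization for reproducibility."""
    centers = X[list(init_idx)].copy()
    for _ in range(iters):
        # assign each point to its nearest center (squared Euclidean distance)
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        # move each center to the mean of its assigned points
        centers = np.array([X[labels == j].mean(axis=0) for j in range(len(centers))])
    return labels, centers

labels, centers = kmeans(X, init_idx=(0, 3))
print(labels)   # the two risk profiles separate into two clusters
```

In a lending workflow, such cluster labels can seed segment-specific credit policies or feed a downstream supervised model.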

  20. A comprehensive and novel predictive modeling technique using detailed pathology factors in men with localized prostate carcinoma.

    Science.gov (United States)

    Potters, Louis; Purrazzella, Rosemary; Brustein, Sheryl; Fearn, Paul; Leibel, Steven A; Kattan, Michael W

    2002-10-01

    The purpose of the current study was to evaluate modeling strategies using sextant core prostate biopsy specimen data that would best predict biochemical control in patients with localized prostate carcinoma treated with permanent prostate brachytherapy (PPB). One thousand four hundred seventy-seven patients underwent PPB between 1992 and 2000. The authors restricted analysis to those patients who had sextant biopsies (n = 1073). A central pathology review was undertaken on all specimens. Treatment consisted of PPB with either I-125 or Pd-103 prescribed to 144 Gy or 140 Gy, respectively. Two hundred twenty-eight patients (21%) received PPB in combination with external radiotherapy and 333 patients (31%) received neoadjuvant hormones. In addition to clinical stage, biopsy Gleason sum, and pretreatment prostate specific antigen (pretx-PSA), the following detailed biopsy variables were considered: mean percentage of cancer in an involved core; maximum percentage of cancer; mean primary and secondary Gleason grades; maximum Gleason grade (primary or secondary); percentage of cancer in the apex, mid, and base; percent of cores positive; maximum primary and secondary Gleason grades in apex, mid, and base; maximum percent cancer in apex, mid, and base; maximum Gleason grade in apex, mid, and base; maximum primary Gleason grade; and maximum secondary Gleason grade. In all, 23 biopsy variables were considered. Four modeling strategies were compared. As a base model, the authors considered the pretx-PSA, clinical stage, and biopsy Gleason sum as predictors. For the second model, the authors added percent of cores positive. The third modeling strategy was to use stepwise variable selection to select only those variables (from the total pool of 26) that were statistically significant. The fourth strategy was to apply principal components analysis, which has theoretical advantages over the other strategies. 
Principal components analysis creates component scores that account for
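
The principal components strategy in this record can be sketched with plain linear algebra: center the variables, eigendecompose the covariance matrix, and project onto the leading eigenvectors to get component scores. The data below are synthetic stand-ins for the correlated biopsy variables:

```python
import numpy as np

# Synthetic stand-in for correlated biopsy variables
# (e.g., percent cancer at apex/mid/base): 5 noisy copies of one latent factor.
rng = np.random.default_rng(42)
latent = rng.normal(size=(100, 1))
X = np.hstack([latent + 0.1 * rng.normal(size=(100, 1)) for _ in range(5)])

Xc = X - X.mean(axis=0)                      # center each variable
cov = Xc.T @ Xc / (len(X) - 1)               # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)       # eigh returns ascending eigenvalues
order = eigvals.argsort()[::-1]              # sort components by variance, descending
explained = eigvals[order] / eigvals.sum()   # fraction of variance per component
scores = Xc @ eigvecs[:, order]              # component scores, one row per patient

print(f"PC1 explains {100 * explained[0]:.0f}% of the variance")
```

Because the 23 biopsy variables overlap heavily, a handful of component scores like these can replace the raw pool as model predictors, which is the theoretical advantage the abstract alludes to.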

  1. Providing an Approach to Locating the Semantic Error of Application Using Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Abdollah Rahimi

    2016-12-01

    Full Text Available Regardless of the effort invested in producing a computer program, the program may still contain bugs and defects; in fact, larger and more complex programs are more likely to contain errors. The purpose of this paper is to present an approach to detecting erroneous behavior of an application using a clustering technique. Because the program follows different execution paths depending on its inputs, it is impossible to discover all errors in the program before the software is delivered: monitoring every execution path beforehand is very difficult or even impossible, so many errors remain hidden in the program and are revealed only after delivery. Previously proposed solutions compare information from successful and unsuccessful executions of the program in terms of predicates (determinants) and report the program points suspected of error to the programmer. Their main problem is that they analyze each predicate's runtime information in isolation, disregarding the dependencies between predicates, which leaves these methods unable to detect certain types of errors. To solve this problem, this paper provides a new solution based on analyzing the runtime behavior of execution paths while taking the interactions between predicates into account. For this purpose, a clustering method is used to classify execution-path graphs by similarity and ultimately identify the regions of the erroneous code paths suspected of error. Assessment of the proposed strategy on a collection of real programs shows that the proposed approach detects errors more accurately than previous ones.
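
As a simpler relative of the pass/fail comparison this record builds on, spectrum-based fault localization scores each program point by how disproportionately it appears in failing runs (a Tarantula-style score). This is an illustrative sketch, not the paper's clustering method; the statements and coverage counts are invented:

```python
# Spectrum-based fault localization sketch (Tarantula-style suspiciousness).
# For each statement: (times covered by failing runs, times covered by passing runs).
coverage = {
    "s1": (10, 90),   # covered almost everywhere -> weakly suspicious
    "s2": (10, 2),    # covered mostly in failing runs -> highly suspicious
    "s3": (0, 50),    # never covered by failing runs -> not suspicious
}
total_fail, total_pass = 10, 100

def suspiciousness(failed, passed):
    """Fraction of failing coverage relative to failing plus passing coverage."""
    f = failed / total_fail if total_fail else 0.0
    p = passed / total_pass if total_pass else 0.0
    return f / (f + p) if (f + p) else 0.0

ranked = sorted(coverage, key=lambda s: suspiciousness(*coverage[s]), reverse=True)
print(ranked)   # statements ordered from most to least suspicious
```

The paper's approach goes further by clustering whole execution-path graphs rather than scoring predicates independently, which is exactly the dependency information a per-statement score like this one ignores.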

  2. Detailed design package for design of a video system providing optimal visual information for controlling payload and experiment operations with television

    Science.gov (United States)

    1975-01-01

    A detailed description of a video system for controlling space shuttle payloads and experiments is presented in the preliminary design review and critical design review, first and second engineering design reports respectively, and in the final report submitted jointly with the design package. The material contained in the four subsequent sections of the package contains system descriptions, design data, and specifications for the recommended 2-view system. Section 2 contains diagrams relating to the simulation test configuration of the 2-view system. Section 3 contains descriptions and drawings of the deliverable breadboard equipment. A description of the recommended system is contained in Section 4 with equipment specifications in Section 5.

  3. Could lymphatic mapping and sentinel node biopsy provide oncological providence for local resectional techniques for colon cancer? A review of the literature

    Directory of Open Access Journals (Sweden)

    Leroy Joel

    2008-09-01

    Full Text Available Abstract Background Endoscopic resectional techniques for colon cancer are undermined by their inability to determine lymph node status. This limits their application to only those lesions at the most minimal risk of lymphatic dissemination, whereas their technical capacity could allow intraluminal or even transluminal address of larger lesions. Sentinel node biopsy may theoretically address this breach, although the variability of its reported results for this disease is worrisome. Methods Medline, EMBASE and Cochrane databases were interrogated back to 1999 to identify all publications concerning lymphatic mapping for colon cancer, with reference cross-checking for completeness. All reports were examined from the perspective of in vivo technique accuracy selectively in early stage disease (i.e. lesions potentially within the technical capacity of endoscopic resection). Results Fifty-two studies detailing the experiences of 3390 patients were identified. Considerable variation in patient characteristics as well as in surgical and histological quality assurances were however evident among the studies identified. In addition, considerable contamination of the studies by inclusion of rectal cancer without subgroup separation was frequent. Indeed, such is the heterogeneity of the publications to date that formal meta-analysis to pool patient cohorts in order to definitively ascertain technique accuracy in those with T1 and/or T2 cancer is not possible. Although lymphatic mapping in early stage neoplasia alone has rarely been specifically studied, those studies that included examination of false negative rates identified high T3/4 patient proportions and larger tumor size as being important confounders. Under selected circumstances however the technique seems to perform sufficiently reliably to allow it prompt consideration of its use to tailor operative extent. Conclusion The specific question of whether sentinel node biopsy can augment the oncological

  4. Vis-A-Plan /visualize a plan/ management technique provides performance-time scale

    Science.gov (United States)

    Ranck, N. H.

    1967-01-01

    Vis-A-Plan is a bar-charting technique for representing and evaluating project activities on a performance-time basis. This rectilinear method presents the logic diagram of a project as a series of horizontal time bars. It may be used supplementary to PERT or independently.

  5. Android-Stego: A Novel Service Provider Imperceptible MMS Steganography Technique Robust to Message Loss

    OpenAIRE

    Avinash Srinivasan; Jie Wu; Justin Shi

    2015-01-01

    Information hiding techniques, especially steganography, have been extensively researched for over two decades. Nonetheless, steganography on smartphones over cellular carrier networks is yet to be fully explored. Today, smartphones, which are at the epitome of ubiquitous and pervasive computing, make steganography an easily accessible covert communication channel. In this paper, we propose Android-Stego - a framework for steganography employing smart-phones. Android-Stego has been evaluated ...

  9. High Altitude Platforms for Disaster Recovery: Capabilities, Strategies, and Techniques for Providing Emergency Telecommunications

    Energy Technology Data Exchange (ETDEWEB)

    Juan D. Deaton

    2008-05-01

    Natural disasters and terrorist acts have significant potential to disrupt emergency communication systems. These emergency communication networks include first-responder, cellular, landline, and emergency answering services such as 911, 112, or 999. Without these essential emergency communications capabilities, search, rescue, and recovery operations during a catastrophic event will be severely debilitated. High altitude platforms (HAPs) could be fitted with telecommunications equipment and used to support these critical communications missions once the catastrophic event occurs. With the ability to be continuously on station, HAPs provide excellent options for providing emergency coverage over high-risk areas before catastrophic incidents occur. HAPs could also provide enhanced 911 capabilities using either GPS or reference stations. This paper proposes a potential emergency communications architecture and presents a method for estimating emergency communications systems traffic patterns for a catastrophic event.

  10. Providing Nutritional Care in the Office Practice: Teams, Tools, and Techniques.

    Science.gov (United States)

    Kushner, Robert F

    2016-11-01

    Provision of dietary counseling in the office setting is enhanced by using team-based care and electronic tools. Effective provider-patient communication is essential for fostering behavior change: the key component of lifestyle medicine. The principles of communication and behavior change are skill-based and grounded in scientific theories and models. Motivational interviewing and shared decision making (a collaborative process in which patients and their providers reach agreement about a health decision) are important processes in counseling. The stages of change, self-determination, health belief model, social cognitive model, theory of planned behavior, and cognitive behavioral therapy are used in the counseling process. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Safety and health instruction in mechanics provided using Job Safety Analysis technique

    Directory of Open Access Journals (Sweden)

    Seyed Shamseddin Alizadeh

    2015-02-01

    Full Text Available Annually a large number of workers in different countries are injured or die. According to statistics provided by the ILO in 2000, the number of occupational accidents worldwide was about 25 million, one million of which resulted in death. In Iran more than 17 million people are working in two million workshops. According to studies, about 80% to 90% of accidents are caused by employees' unsafe behaviors and only 10% to 20% by unsafe conditions. According to statistics provided in 2005, each year 2.2 million people, men and women, are deprived of their rights due to work-related accidents and illnesses. Work-related deaths and injuries impose heavy costs on societies, especially in developing countries.

  12. Sequencing of chloroplast genomes from wheat, barley, rye and their relatives provides a detailed insight into the evolution of the Triticeae tribe.

    Directory of Open Access Journals (Sweden)

    Christopher P Middleton

    Full Text Available Using Roche/454 technology, we sequenced the chloroplast genomes of 12 Triticeae species, including bread wheat, barley and rye, as well as the diploid progenitors and relatives of bread wheat Triticum urartu, Aegilops speltoides and Ae. tauschii. Two wild tetraploid taxa, Ae. cylindrica and Ae. geniculata, were also included. Additionally, we incorporated wild Einkorn wheat Triticum boeoticum and its domesticated form T. monococcum and two Hordeum spontaneum (wild barley) genotypes. Chloroplast genomes were used for overall sequence comparison, phylogenetic analysis and dating of divergence times. We estimate that barley diverged from rye and wheat approximately 8-9 million years ago (MYA). The genome donors of hexaploid wheat diverged between 2.1-2.9 MYA, while rye diverged from Triticum aestivum approximately 3-4 MYA, more recently than previously estimated. Interestingly, the A genome taxa T. boeoticum and T. urartu were estimated to have diverged approximately 570,000 years ago. As these two have a reproductive barrier, the divergence time estimate also provides an upper limit for the time required for the formation of a species boundary between the two. Furthermore, we conclusively show that the chloroplast genome of hexaploid wheat was contributed by the B genome donor and that this unknown species diverged from Ae. speltoides about 980,000 years ago. Additionally, sequence alignments identified a translocation of a chloroplast segment to the nuclear genome which is specific to the rye/wheat lineage. We propose the presented phylogeny and divergence time estimates as a reference framework for future studies on Triticeae.

  13. Multimodal Imaging Techniques for the Extraction of Detailed Geometrical and Physiological Information for Use in Multi-Scale Models of Colorectal Cancer and Treatment of Individual Patients

    Directory of Open Access Journals (Sweden)

    Joe Pitt-Francis

    2006-01-01

    Full Text Available A vast array of mathematical models has been proposed for all stages of cancer formation across a wide range of spatio-temporal scales. Attention is now turning to coupling these models across scales and building models of “virtual tumours” for use in in silico testing of novel drugs and treatment regimes. This leads naturally to the requirement for detailed knowledge of the underlying geometry and physiological properties of individual tumours for use in: (i) multi-scale mathematical models of in vivo tumour growth and development; (ii) fusion of multi-scale, multimodal medical imaging techniques to improve the diagnosis and treatment of individual patients; and (iii) training of cancer specialists and surgeons.

  14. Detail and survey radioautographs

    Energy Technology Data Exchange (ETDEWEB)

    Wainwright, Wm.W.

    1949-04-19

    The much used survey or contact type of radioautograph is indispensable for a study of the gross distribution of radioactive materials. A detail radioautograph is equally indispensable. The radioautograph makes possible the determination of plutonium with respect to cells. Outlines of survey and detail techniques are given.

  15. Enhanced conformational sampling technique provides an energy landscape view of large-scale protein conformational transitions.

    Science.gov (United States)

    Shao, Qiang

    2016-10-26

    Large-scale conformational changes in proteins are important for their functions. Tracking the conformational change in real time at the level of a single protein molecule, however, remains a great challenge. In this article, we present a novel in silico approach with the combination of normal mode analysis and integrated-tempering-sampling molecular simulation (NMA-ITS) to give quantitative data for exploring the conformational transition pathway in multi-dimensional energy landscapes starting only from the knowledge of the two endpoint structures of the protein. The open-to-closed transitions of three proteins, including nCaM, AdK, and HIV-1 PR, were investigated using NMA-ITS simulations. The three proteins have varied structural flexibilities and domain communications in their respective conformational changes. The transition state structure in the conformational change of nCaM and the associated free-energy barrier are in agreement with those measured in a standard explicit-solvent REMD simulation. The experimentally measured transition intermediate structures of the intrinsically flexible AdK are captured by the conformational transition pathway measured here. The dominant transition pathways between the closed and fully open states of HIV-1 PR are very similar to those observed in recent REMD simulations. Finally, the evaluated relaxation times of the conformational transitions of three proteins are roughly at the same level as reported experimental data. Therefore, the NMA-ITS method is applicable for a variety of cases, providing both qualitative and quantitative insights into the conformational changes associated with the real functions of proteins.

  16. Author Details

    African Journals Online (AJOL)

    Attitude towards, and likelihood of, complaining in the banking, domestic airline and restaurant industries ... Relationship intention and satisfaction as predictors of wholesale and retail customers' loyalty towards their training providers

  17. Psychophysical evaluation of the image quality of a dynamic flat-panel digital x-ray image detector using the threshold contrast detail detectability (TCDD) technique

    Science.gov (United States)

    Davies, Andrew G.; Cowen, Arnold R.; Bruijns, Tom J. C.

    1999-05-01

    We are currently in an era of active development of the digital X-ray imaging detectors that will serve the radiological communities in the new millennium. The rigorous comparative physical evaluations of such devices are therefore becoming increasingly important from both the technical and clinical perspectives. The authors have been actively involved in the evaluation of a clinical demonstration version of a flat-panel dynamic digital X-ray image detector (or FDXD). Results of the objective physical evaluation of this device have been presented elsewhere at this conference. The imaging performance of FDXD under radiographic exposure conditions has been previously reported, and in this paper a psychophysical evaluation of the FDXD detector operating under continuous fluoroscopic conditions is presented. The evaluation technique employed was the threshold contrast detail detectability (TCDD) technique, which enables image quality to be measured on devices operating in the clinical environment. This approach addresses image quality in the context of both the image acquisition and display processes, and uses human observers to measure performance. The Leeds test objects TO[10] and TO[10+] were used to obtain comparative measurements of performance on the FDXD and two digital spot fluorography (DSF) systems, one utilizing a Plumbicon camera and the other a state-of-the-art CCD camera. Measurements were taken at a range of detector entrance exposure rates, namely 6, 12, 25 and 50 µR/s. In order to facilitate comparisons between the systems, all fluoroscopic image processing, such as noise reduction algorithms, was disabled during the experiments. At the highest dose rate FDXD significantly outperformed the DSF comparison systems in the TCDD comparisons. At 25 and 12 µR/s all three systems performed in an equivalent manner and at the lowest exposure rate FDXD was inferior to the two DSF systems. At standard fluoroscopic exposures, FDXD performed in an equivalent

  18. Assessing treatment-as-usual provided to control groups in adherence trials: Exploring the use of an open-ended questionnaire for identifying behaviour change techniques

    NARCIS (Netherlands)

    Oberjé, E.J.M.; Dima, A.L.; Pijnappel, F.J.; Prins, J.M.; Bruin, M. de

    2015-01-01

    OBJECTIVE: Reporting guidelines call for descriptions of control group support in equal detail as for interventions. However, how to assess the active content (behaviour change techniques (BCTs)) of treatment-as-usual (TAU) delivered to control groups in trials remains unclear. The objective of this

  19. What should primary care providers know about pediatric skin conditions? A modified Delphi technique for curriculum development.

    Science.gov (United States)

    Feigenbaum, Dana F; Boscardin, Christy K; Frieden, Ilona J; Mathes, Erin F D

    2014-10-01

    There is limited access to pediatric dermatology in the United States, resulting in inadequate education and patient care. This Delphi study aimed to identify important objectives for a pediatric dermatology curriculum for general practitioners. A modified, 2-round Delphi technique was used to develop consensus on objectives developed by expert pediatric dermatologists. A panel of 20 experts (pediatric dermatologists, family practitioners, and general pediatricians) rated objectives using a 5-point Likert-type scale. Items with group medians 4.0 or greater with at least 70% agreement reached consensus. In round 1, the expert panel rated 231 objectives from 16 categories for inclusion in an online curriculum. In round 2, experts were given group feedback and rated 235 objectives. A total of 170 items met consensus. Generally, objectives surrounding common conditions including acne, molluscum, warts, atopic dermatitis, and newborn skin met consensus whereas objectives on rare growths, birthmarks, and inherited conditions failed to meet consensus. The Delphi panel consisted of US-based physicians, most in urban areas with a dedicated pediatric specialist at their institution. The accepted objectives encompass management of common conditions and referral of potentially dangerous diseases and can be used to develop a pediatric dermatology curriculum for primary care providers. Copyright © 2014 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
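
    The consensus rule described above (group median of at least 4.0 with at least 70% agreement, read here as 70% of ratings being 4 or 5 on the 5-point Likert scale) can be sketched as follows. The panel ratings are made up for illustration.

    ```python
    # Hypothetical illustration of the modified Delphi consensus rule:
    # accept an objective when the group median rating is >= 4.0 AND at
    # least 70% of experts rate it 4 or 5 on a 5-point Likert scale.
    from statistics import median

    def reaches_consensus(ratings, median_cutoff=4.0, agreement_cutoff=0.70):
        """Return True if the panel's ratings meet both consensus criteria."""
        agree = sum(1 for r in ratings if r >= 4) / len(ratings)
        return median(ratings) >= median_cutoff and agree >= agreement_cutoff

    # Invented example: a 20-expert panel, 16 of whom rate the objective 4 or 5.
    panel = [5, 4, 4, 5, 4, 3, 4, 5, 4, 4, 2, 4, 5, 4, 3, 4, 4, 5, 3, 4]
    print(reaches_consensus(panel))  # True
    ```

    Note that "70% agreement" is one plausible reading of the abstract; the study's exact operational definition may differ.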

  20. Main: Clone Detail [KOME

    Lifescience Database Archive (English)

    Full Text Available Clone Detail: Mapping Pseudomolecule data detail. Detail information: Mapping to the TIGR japonica Pseudomolecules. kome_mapping_pseudomolecule_data_detail.zip kome_mapping_pseudomolecule_data_detail

  1. Effect of a network system for providing proper inhalation technique by community pharmacists on clinical outcomes in COPD patients

    Directory of Open Access Journals (Sweden)

    Takemura M

    2013-05-01

    Full Text Available Masaya Takemura,1 Katsumi Mitsui,2 Masako Ido,2 Masataka Matsumoto,1 Misuzu Koyama,3 Daiki Inoue,1 Kazufumi Takamatsu,1 Ryo Itotani,1 Manabu Ishitoko,1 Shinko Suzuki,1 Kensaku Aihara,1 Minoru Sakuramoto,1 Hitoshi Kagioka,1 Motonari Fukui1 (1Respiratory Disease Center, Kitano-Hospital, The Tazuke Kofukai Medical Research Institute, Osaka, Japan; 2Division of Pharmacy, Kitano-Hospital, The Tazuke Kofukai Medical Research Institute, Osaka, Japan; 3Kita-ku Pharmaceutical Association, Osaka, Japan) Introduction: Nonadherence to inhalation therapy is very common in patients with chronic obstructive pulmonary disease (COPD). Few data are available to support the role of community pharmacists in optimizing inhalation therapy in COPD patients. Since 2007, the Kitano Hospital and the Kita-ku Pharmaceutical Association have provided a network system for delivering correct inhalation techniques through certified community pharmacists. The effects of this network system on clinical outcomes in COPD patients were examined. Methods: A total of 88 consecutive outpatients with COPD at baseline and 82 of those 4 years later were recruited from the respiratory clinic of the Kitano Hospital Medical Research Institute. Measurements included the frequency of COPD exacerbations, patients' adherence to inhalation therapy using a five-point Likert scale questionnaire, and patients' health status both prior to this system and 4 years later. Results: Usable information was obtained from 55 patients with COPD at baseline, and from 51 patients 4 years later. Compared with baseline values, a significant decrease was observed in the frequency of COPD exacerbations (1.5 ± 1.6 versus 0.8 ± 1.4 times/year, P = 0.017). Adherence to the inhalation regimen increased significantly (4.1 ± 0.7 versus 4.4 ± 0.8, P = 0.024), but health status was unchanged. At 4 years, of 51 COPD patients, 39 (76%) patients who visited the certified pharmacies showed significantly higher medication adherence

  2. Computed tomography: the details.

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin Walter

    2007-07-01

    Computed Tomography (CT) is a well established technique, particularly in medical imaging, but also applied in Synthetic Aperture Radar (SAR) imaging. Basic CT imaging via back-projection is treated in many texts, but often with insufficient detail to appreciate subtleties such as the role of non-uniform sampling densities. Herein are given some details often neglected.

  3. Utility of chromatographic and spectroscopic techniques for a detailed characterization of poly(styrene-b-isoprene) miktoarm star copolymers with complex architecture

    KAUST Repository

    Šmigovec Ljubič, Tina

    2012-09-25

    We analyzed various miktoarm star copolymers of the PS(PI)x type (x = 2, 3, 5, 7), which consist of one long polystyrene (PS) arm (82 or 105 kDa) and various numbers of short polyisoprene (PI) arms (from 11.3 to 39.7 kDa), prepared by anionic polymerization and selective chlorosilane chemistry. The length of the PI arm in the stars decreases with the number of arms, so that the chemical compositions of all PS(PI)x samples were comparable. Our aim was to determine the purity of the samples and to identify exactly the constituents of individual samples. For this purpose we used a variety of separation techniques (size-exclusion chromatography (SEC), reversed-phase liquid-adsorption chromatography (RP-LAC), and two-dimensional liquid chromatography (2D-LC)) and characterization techniques (UV-MALS-RI multidetection SEC system, NMR, and MALDI-TOF MS). The best separation and identification of the samples' constituents were achieved by RP-LAC, which separates macromolecules according to their chemical composition, and a subsequent analysis of the off-line collected fractions from the RP-C18 column by the SEC/UV-MALS-RI multidetection system. The results showed that all PS(PI)x samples contained the homo-PS and homo-PI in minor amounts and the high-molar-mass (PS)y(PI)z (y > 1) species, the content of which is higher in the samples PS(PI)5 and PS(PI)7 than in the samples PS(PI)2 and PS(PI)3. The major constituent of the PS(PI)2 sample was the one with the predicted structure. On the other hand, the major components of the PS(PI)x (x = 3, 5, and 7) samples were the stars consisting of a smaller number of PI arms than predicted from the functionalities of the chlorosilane coupling agents. These results are in agreement with the average chemical composition of samples determined by proton NMR spectroscopy and characterization of the constituents by MALDI-TOF MS. © 2012 American Chemical Society.

  4. SU-F-T-647: Linac-Based Stereotactic Radiosurgery (SRS) in the Treatment of Trigeminal Neuralgia: Detailed Description of SRS Procedural Technique and Reported Clinical Outcomes

    Energy Technology Data Exchange (ETDEWEB)

    Pokhrel, D; Sood, S; Badkul, R; Jiang, H; Stepp, T; Camarata, P; Wang, F [University of Kansas Hospital, Kansas City, KS (United States)

    2016-06-15

    Overall, 20 patients (77%) responded to treatment: 5 (19%) achieved complete pain relief without medication (BNI score: I); 5 (19%) had no pain with decreased medication (BNI score: II); 2 (7.7%) had no pain but continued medication (BNI score: IIIA); and 8 (30.8%) had pain that was well controlled by medication (BNI score: IIIB). Six patients (23.0%) did not respond to treatment (BNI score: IV-V). Neither cranial nerve deficit nor radio-necrosis of the temporal lobe was clinically observed. Conclusion: Linac-based SRS for medically/surgically refractory TNR provided an effective treatment option for pain resolution/control with very minimal if any normal tissue toxicity. Longer follow-up of these patients is anticipated/needed to confirm our observations.

  5. Using data mining techniques to explore physicians' therapeutic decisions when clinical guidelines do not provide recommendations: methods and example for type 2 diabetes.

    Science.gov (United States)

    Toussi, Massoud; Lamy, Jean-Baptiste; Le Toumelin, Philippe; Venot, Alain

    2009-06-10

    Clinical guidelines carry medical evidence to the point of practice. As evidence is not always available, many guidelines do not provide recommendations for all clinical situations encountered in practice. We propose an approach for identifying knowledge gaps in guidelines and for exploring physicians' therapeutic decisions with data mining techniques to fill these knowledge gaps. We demonstrate our method by an example in the domain of type 2 diabetes. We analyzed the French national guidelines for the management of type 2 diabetes to identify clinical conditions that are not covered or those for which the guidelines do not provide recommendations. We extracted patient records corresponding to each clinical condition from a database of type 2 diabetic patients treated at Avicenne University Hospital of Bobigny, France. We explored physicians' prescriptions for each of these profiles using the C5.0 decision-tree learning algorithm. We developed decision trees for different levels of detail of the therapeutic decision, namely the type of treatment, the pharmaco-therapeutic class, the international nonproprietary name, and the dose of each medication. We compared the rules generated with those added to the guidelines in a newer version, to examine their similarity. We extracted 27 rules from the analysis of a database of 463 patient records. Eleven rules were about the choice of the type of treatment and thirteen rules about the choice of the pharmaco-therapeutic class of each drug. For the choice of the international nonproprietary name and the dose, we could extract only a few rules because the number of patient records was too low for these factors. The extracted rules showed similarities with those added to the newer version of the guidelines. Our method showed its usefulness for completing guidelines recommendations with rules learnt automatically from physicians' prescriptions.
It could be used during the development of guidelines as a complementary source from
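
    The core of the decision-tree learning step can be illustrated with a minimal split-selection function. This is not the C5.0 algorithm the study used (C5.0 selects splits by information gain ratio); it is a simplified Gini-based sketch over invented toy records, with made-up feature names (HbA1c, BMI) and treatment labels.

    ```python
    # Illustrative sketch only: choosing the feature/threshold split that
    # best separates treatment decisions, the core step a decision-tree
    # learner repeats recursively. Data and labels are invented.
    def gini(labels):
        """Gini impurity of a list of class labels."""
        n = len(labels)
        return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

    def best_split(rows, labels):
        """Find (feature_index, threshold) minimising weighted child impurity."""
        best = (None, None, float("inf"))
        for f in range(len(rows[0])):
            for t in sorted({r[f] for r in rows}):
                left = [l for r, l in zip(rows, labels) if r[f] <= t]
                right = [l for r, l in zip(rows, labels) if r[f] > t]
                if not left or not right:
                    continue  # degenerate split, skip
                score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
                if score < best[2]:
                    best = (f, t, score)
        return best

    # Toy records: [HbA1c (%), BMI]; labels are hypothetical treatment types.
    X = [[6.8, 24], [7.5, 31], [9.1, 29], [8.4, 35], [6.5, 22], [9.8, 33]]
    y = ["diet", "oral", "insulin", "oral", "diet", "insulin"]
    feature, threshold, _ = best_split(X, y)
    print(feature, threshold)  # splits on HbA1c (feature 0) at 6.8
    ```

    Recursing on each side of the chosen split, then reading the root-to-leaf paths, yields "if HbA1c <= ... then treatment ..." rules of the kind the study extracted.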

  6. Ultrasound-Guided Suprainguinal Fascia Iliaca Technique Provides Benefit as an Analgesic Adjunct for Patients Undergoing Total Hip Arthroplasty.

    Science.gov (United States)

    Bullock, W Michael; Yalamuri, Suraj M; Gregory, Stephen H; Auyong, David B; Grant, Stuart A

    2017-02-01

    Analgesia after total hip arthroplasty is often accomplished by the fascia iliaca compartment block, traditionally performed below the inguinal ligament, to anesthetize both femoral and lateral femoral cutaneous nerves. The course of the lateral femoral cutaneous nerve below the inguinal ligament is variable as opposed to consistent above the inguinal ligament in the pelvis. In this case series including 5 patients, we demonstrate that an ultrasound-guided suprainguinal fascia iliaca approach would consistently anesthetize the lateral femoral cutaneous nerve along with anterior cutaneous femoral nerve branches and provide cutaneous analgesia after total hip arthroplasty, as shown by decreased opioid consumption.

  7. Water Distribution Lines, Water distribution system details Including pumps, storage tanks, valves, and mains, Published in Not Provided, 1:600 (1in=50ft) scale, Town of Franklin.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Water Distribution Lines dataset, published at 1:600 (1in=50ft) scale, was produced all or in part from Field Survey/GPS information as of Not Provided. It is...

  8. Global detailed gravimetric geoid

    Science.gov (United States)

    Vincent, S.; Marsh, J. G.

    1974-01-01

    A global detailed gravimetric geoid has been computed by combining the Goddard Space Flight Center GEM-4 gravity model derived from satellite and surface gravity data and surface 1 x 1-deg mean free-air gravity anomaly data. The accuracy of the geoid is plus or minus 2 meters on continents, 5 to 7 meters in areas where surface gravity data are sparse, and 10 to 15 meters in areas where no surface gravity data are available. Comparisons have been made with the astrogeodetic data provided by Rice (United States), Bomford (Europe), and Mather (Australia). Comparisons have also been carried out with geoid heights derived from satellite solutions for geocentric station coordinates in North America, the Caribbean, Europe and Australia.

  9. Visual overview, oral detail

    DEFF Research Database (Denmark)

    Hertzum, Morten; Simonsen, Jesper

    2015-01-01

    and with the coordinating nurse, who is the main keeper of the whiteboard. On the basis of observations, we find that coordination is accomplished through a highly intertwined process of technologically mediated visual overview combined with orally communicated details. The oral details serve to clarify and elaborate...

  10. Crowdsourcing detailed flood data

    Science.gov (United States)

    Walliman, Nicholas; Ogden, Ray; Amouzad*, Shahrzhad

    2015-04-01

    Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced it is least reliable in urban and physically complex geographies where often the need for precise estimation is most acute. Crowdsourced data of actual flood events is a potentially critical component of this allowing improved accuracy in situations and identifying the effects of local landscape and topography where the height of a simple kerb, or discontinuity in a boundary wall can have profound importance. Mobile 'App' based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow up calls to get more information through structured scripts for each strand. Through this local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. 
    This paper will describe this pioneering approach that will develop flood event data in support of systems that will advance existing approaches such as those developed in the UK
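
    A crowdsourced flood report of the kind described (photo, GPS position, time, free-text description) might be represented by a record like the following sketch. All field names are invented for illustration, not taken from the actual app.

    ```python
    # Hypothetical record structure for one crowdsourced flood observation,
    # combining camera, GPS, time, and descriptive fields as described above.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class FloodReport:
        latitude: float
        longitude: float
        photo_path: str
        description: str
        depth_cm: Optional[float] = None  # optional observer estimate
        reported_at: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc))

    # Invented example report.
    report = FloodReport(51.75, -1.26, "IMG_0042.jpg",
                         "water over kerb, blocked drain", depth_cm=12.0)
    print(report.latitude, report.depth_cm)
    ```

    Records of this shape could then be loaded into a GIS layer (the paper mentions ArcView GIS) for use in flood models.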

  11. Detailed Chemical Abundances of Extragalactic Globular Clusters

    CERN Document Server

    Bernstein, R A

    2005-01-01

    We outline a method to measure the detailed chemical composition of extragalactic (unresolved) globular clusters (GCs) from echelle spectra of their integrated light. Our goal is to use this method to measure abundance patterns of GCs in distant spiral and elliptical galaxies to constrain their formation histories. To develop this technique we have obtained a "training set" of integrated-light spectra of resolved GCs in the Milky Way and LMC by scanning across the clusters during exposures. Our training set also includes spectra of individual stars in those GCs, from which abundances can be obtained in the normal way to provide a check on our integrated-light results. We present here the preliminary integrated-light analysis of one GC in our training set, NGC 104 (47 Tuc), and outline some of the techniques utilized and problems encountered in that analysis.

  12. Detailed Soils 24K

    Data.gov (United States)

    Kansas Data Access and Support Center — This data set is a digital soil survey and is the most detailed level of soil geographic data developed by the National Cooperative Soil Survey. The information was...

  13. The Comparison of Organisational pay structures as a salary survey technique in providing a unified Non-Racial market Wage curve

    Directory of Open Access Journals (Sweden)

    R. J. Snelgar

    1983-11-01

    Full Text Available The development and maintenance of an equitable and uniform pay structure is complicated by the existence of the "wage-gap". Choice of a job evaluation plan which does not perpetuate discrimination already found in the market place, and which itself is not discriminatory, has become a topic of debate. Results of this study suggest that it is possible to use a technique for conducting salary surveys which does not rely on subjective techniques such as job evaluation. A comparison of total organisational pay structures, rather than actual salaries, thus provides the basis for a uniform non-racial market wage curve according to which internal pay systems may be competitively structured. Summary (translated from Afrikaans): The development and maintenance of an equitable and uniform remuneration structure is complicated by the existence of a "wage gap". The choice of a job evaluation plan that does not perpetuate existing discrimination, and that is not itself discriminatory, is a subject of considerable debate. The results of this study indicate that it is possible to use a salary survey technique that does not rely on subjective techniques such as job evaluation. A comparison that takes into account the organisation's total remuneration structure, rather than actual salaries, provides a basis for a uniform market salary curve that does not reflect racial differences and according to which the internal remuneration system can be competitively structured.

  14. Three Latin Phonological Details

    DEFF Research Database (Denmark)

    Olsen, Birgit Anette

    2006-01-01

    The present paper deals with three minor details of Latin phonology: 1) the development of the initial sequence *u̯l̥-, where it is suggested that an apparent vacillation between ul- and vol-/vul- represents sandhi variants going back to the proto-language, 2) the adjectives amārus ‘bitter' and ...

  15. Detail in architecture: Between arts

    Directory of Open Access Journals (Sweden)

    Dulencin Juraj

    2016-06-01

    Full Text Available Architectural detail represents an important part of architecture. Not only can it be used as an identifier of a specific building but at the same time it enhances the experience of the realized project. Within it lie the signs of a great architect and clues to understanding his or her way of thinking. It is therefore the central topic of a seminar offered to architecture students at the Brno University of Technology. During the course of the semester-long class the students acquaint themselves with atypical architectural details of domestic and international architects by learning to read them, understand them and subsequently draw them by creating architectural blueprints. In other words, by general analysis of a detail the students learn the theoretical thinking of its architect who, depending on the nature of the design, had to incorporate a variety of techniques and crafts. Students apply this analytical part to their own architectural detail design. The methodology of the seminar consists of experiential learning by project management and is complemented by a series of lectures discussing a diversity of details as well as the materials and technologies required to implement them. The architectural detail design is also part of the students' bachelor's theses, therefore the realistic nature of their blueprints can be verified in the production process of their physical counterparts. Based on their own documentation the students choose the most suitable manufacturing process, whether it is supplied by a specific technology or a craftsman. Students actively participate in the production and correct their design proposals at real scale with the actual material. A student, as a future architect, stands somewhere between a client and an artisan, materializes his or her idea and adjusts the manufacturing process so that the final detail achieves aesthetic consistency and is in harmony with its initial concept. One of the very important aspects of the design is its

  16. Z-Spectrum analysis provides proton environment data (ZAPPED: a new two-pool technique for human gray and white matter.

    Directory of Open Access Journals (Sweden)

    Mitsue Miyazaki

    Full Text Available A new technique - Z-spectrum Analysis Provides Proton Environment Data (ZAPPED) - was used to map cross-relaxing free and restricted protons in nine healthy subjects plus two brain tumor patients at 3T. First, MT data were acquired over a wide symmetric range of frequency offsets, and then a trio of quantitative biomarkers, i.e., the apparent spin-spin relaxation times (T2,f, T2,r) in both free and restricted proton pools as well as the restricted pool fraction Fr, were mapped by fitting the measured Z-spectra to a simple two-Lorentzian compartment model on a voxel-by-voxel basis. The mean restricted exchangeable proton fraction, Fr, was found to be 0.17 in gray matter (GM) and 0.28 in white matter (WM) in healthy subjects. Corresponding mean values for apparent spin-spin relaxation times were 785 µs (T2,f) and 17.7 µs (T2,r) in GM, 672 µs (T2,f) and 23.4 µs (T2,r) in WM. The percentages of Ff and Fr in GM are similar for all ages, whereas Fr shows a tendency to decrease with age in WM among healthy subjects. The patient ZAPPED images show higher contrast between tumor and normal tissues than traditional T2-weighted and T1-weighted images. The ZAPPED method provides a simple phenomenological approach to estimating fractions and apparent T2 values of free and restricted MT-active protons, and it may offer clinically useful information.
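
    The voxel-wise two-Lorentzian fit described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the pool fractions and apparent T2 values are free parameters of a sum of two Lorentzian saturation lines fitted to the Z-spectrum. The offset range, the lineshape convention, and the use of the abstract's mean white-matter values as synthetic ground truth are all assumptions; NumPy and SciPy are assumed available.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(offset_hz, fraction, t2_s):
    """Lorentzian saturation line for one proton pool.

    The linewidth scales as 1/T2, so the short apparent T2 of the
    restricted pool produces a very broad component.
    """
    return fraction / (1.0 + (2.0 * np.pi * offset_hz * t2_s) ** 2)

def z_spectrum(offset_hz, f_free, t2_free, f_restr, t2_restr):
    """Two-compartment Z-spectrum: unity minus the two Lorentzian dips."""
    return (1.0
            - lorentzian(offset_hz, f_free, t2_free)
            - lorentzian(offset_hz, f_restr, t2_restr))

# Symmetric frequency offsets, e.g. -20 kHz .. +20 kHz (assumed sweep).
offsets = np.linspace(-20e3, 20e3, 81)

# Synthetic "white matter" spectrum built from the abstract's mean values:
# F_r = 0.28, T2,f = 672 us, T2,r = 23.4 us.
truth = (0.72, 672e-6, 0.28, 23.4e-6)
measured = z_spectrum(offsets, *truth)

# Voxel-wise fit: recover (F_f, T2,f, F_r, T2,r) from the measured spectrum.
p0 = (0.5, 500e-6, 0.5, 50e-6)
popt, _ = curve_fit(z_spectrum, offsets, measured, p0=p0)
f_free, t2_free, f_restr, t2_restr = popt
```

On noiseless synthetic data the fit recovers the generating parameters; on real data one would add noise handling and bounds.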

  17. Detailed Debunking of Denial

    Science.gov (United States)

    Enting, I. G.; Abraham, J. P.

    2012-12-01

    The disinformation campaign against climate science has been compared to a guerilla war whose tactics undermine the traditional checks and balances of science. One comprehensive approach has to been produce archives of generic responses such as the websites of RealClimate and SkepticalScience. We review our experiences with an alternative approach of detailed responses to a small number of high profile cases. Our particular examples were Professor Ian Plimer and Christopher Monckton, the Third Viscount Monckton of Brenchley, each of whom has been taken seriously by political leaders in our respective countries. We relate our experiences to comparable examples such as John Mashey's analysis of the Wegman report and the formal complaints about Lomborg's "Skeptical Environmentalist" and Durkin's "Great Global Warming Swindle". Our two approaches used contrasting approaches: an on-line video of a lecture vs an evolving compendium of misrepresentations. Additionally our approaches differed in the emphasis. The analysis of Monckton concentrated on the misrepresentation of the science, while the analysis of Plimer concentrated on departures from accepted scientific practice: fabrication of data, misrepresentation of cited sources and unattributed use of the work of others. Benefits of an evolving compendium were the ability to incorporate contributions from members of the public who had identified additional errors and the scope for addressing new aspects as they came to public attention. `Detailed debunking' gives non-specialists a reference point for distinguishing non-science when engaging in public debate.

  18. Consider the details

    DEFF Research Database (Denmark)

    Rasmussen, Rasmus; Hertzum, Morten

    2013-01-01

    Electronic whiteboards are replacing dry-erase whiteboards in many contexts. In this study we compare electronic and dry-erase whiteboards in emergency departments (EDs) with respect to reading distance and revision time. We find inferior reading accuracy for the electronic whiteboard at all three levels of distance in our study. For revision time, the electronic whiteboard is slower on one subtask but there is no difference on another subtask. Participants prefer the electronic whiteboard. Given the font size of the electronic whiteboard, the inferior reading accuracy is unsurprising, but the reduced possibilities for acquiring information at a glance when clinicians pass the whiteboard may adversely affect their overview. Conversely, the similar revision times for one subtask show that logon may be done quickly. We discuss how details such as font size and logon may impact the high-level...

  19. Detailed IR aperture measurements

    CERN Document Server

    Bruce, Roderik; Garcia Morales, Hector; Giovannozzi, Massimo; Hermes, Pascal Dominik; Mirarchi, Daniele; Quaranta, Elena; Redaelli, Stefano; Rossi, Carlo; Skowronski, Piotr Krzysztof; Wretborn, Sven Joel; CERN. Geneva. ATS Department

    2016-01-01

    MD 1673 was carried out on October 5 2016, in order to investigate in more detail the available aperture in the LHC high-luminosity insertions at 6.5 TeV and β∗=40 cm. Previous aperture measurements in 2016 during commissioning had shown that the available aperture is at the edge of protection, and that the aperture bottleneck at β∗=40 cm in certain cases is found in the separation plane instead of in the crossing plane. Furthermore, the bottlenecks were consistently found close to the upstream end of Q3 on the side of the incoming beam, and not in Q2 on the outgoing beam as expected from calculations. Therefore, this MD aimed at measuring IR1 and IR5 separately (at 6.5 TeV and β∗=40 cm, for 185 µrad half crossing angle), to further localize the bottlenecks longitudinally using newly installed BLMs, to investigate the difference in aperture between Q2 and Q3, and to see if any aperture can be gained using special orbit bumps.

  20. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  2. Detailing 'measures that matter'.

    Science.gov (United States)

    Heavisides, Bob

    2010-04-01

    In a paper originally presented at last October's Healthcare Estates conference in Harrogate, Bob Heavisides, director of facilities at the Milton Keynes NHS Foundation Trust, explains how estates and facilities directors can provide a package of information based on a number of "measures that matter" to demonstrate to their boards that safe systems of work, operational efficiency and effectiveness, and operational parameters are within, or better than, those of equivalent-sized Trusts.

  3. Morphological details in bloodstain particles.

    Science.gov (United States)

    De Wael, K; Lepot, L

    2015-01-01

    During the commission of crimes blood can be transferred to the clothing of the offender or on other crime related objects. Bloodstain particles are sub-millimetre sized flakes that are lost from dried bloodstains. The nature of these red particles is easily confirmed using spectroscopic methods. In casework, bloodstain particles showing highly detailed morphological features were observed. These provided a rationale for a series of experiments described in this work. It was found that the "largest" particles are shed from blood deposited on polyester and polyamide woven fabrics. No particles are lost from the stains made on absorbent fabrics and from those made on knitted fabrics. The morphological features observed in bloodstain particles can provide important information on the substrates from which they were lost. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Exploration of networks using overview+detail with constraint-based cooperative layout.

    Science.gov (United States)

    Dwyer, Tim; Marriott, Kim; Schreiber, Falk; Stuckey, Peter; Woodward, Michael; Wybrow, Michael

    2008-01-01

    A standard approach to large network visualization is to provide an overview of the network and a detailed view of a small component of the graph centred around a focal node. The user explores the network by changing the focal node in the detailed view or by changing the level of detail of a node or cluster. For scalability, fast force-based layout algorithms are used for the overview and the detailed view. However, using the same layout algorithm in both views is problematic since layout for the detailed view has different requirements to that in the overview. Here we present a model in which constrained graph layout algorithms are used for layout in the detailed view. This means the detailed view has high-quality layout including sophisticated edge routing and is customisable by the user who can add placement constraints on the layout. Scalability is still ensured since the slower layout techniques are only applied to the small subgraph shown in the detailed view. The main technical innovations are techniques to ensure that the overview and detailed view remain synchronized, and modifying constrained graph layout algorithms to support smooth, stable layout. The key innovation supporting stability are new dynamic graph layout algorithms that preserve the topology or structure of the network when the user changes the focus node or the level of detail by in situ semantic zooming. We have built a prototype tool and demonstrate its use in two application domains, UML class diagrams and biological networks.
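
    The scalability argument above rests on handing the expensive constrained layout only the small subgraph around the focal node, while the fast force-based layout keeps handling the full overview. A minimal, hypothetical sketch of that extraction step (not of the layout algorithms themselves), in plain Python:

```python
from collections import deque

def focal_subgraph(adjacency, focus, radius):
    """Return the nodes and edges within `radius` hops of `focus`.

    `adjacency` maps node -> iterable of neighbours. Only this small
    subgraph would be passed to the slower constrained layout used
    for the detailed view.
    """
    depth = {focus: 0}
    queue = deque([focus])
    while queue:
        node = queue.popleft()
        if depth[node] == radius:
            continue  # do not expand past the requested radius
        for nb in adjacency[node]:
            if nb not in depth:
                depth[nb] = depth[node] + 1
                queue.append(nb)
    nodes = set(depth)
    # keep each undirected edge once, both endpoints inside the subgraph
    edges = {(a, b) for a in nodes for b in adjacency[a]
             if b in nodes and a < b}
    return nodes, edges

# Toy network: two clusters bridged by node "d" (hypothetical data).
graph = {
    "a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"],
    "d": ["c", "e"], "e": ["d", "f", "g"], "f": ["e", "g"], "g": ["e", "f"],
}
nodes, edges = focal_subgraph(graph, "d", 1)
```

Changing the focus node or the radius re-runs only this cheap extraction plus the layout of the small result, which is what keeps the detailed view interactive.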

  5. Image fusion theories, techniques and applications

    CERN Document Server

    Mitchell, HB

    2010-01-01

    This text provides a comprehensive introduction to the theories, techniques and applications of image fusion. It examines in detail many real-life examples of image fusion, including panchromatic sharpening and ensemble color image segmentation.

  6. Interactive data visualization foundations, techniques, and applications

    CERN Document Server

    Ward, Matthew; Keim, Daniel

    2015-01-01

    Interactive Data Visualization: Foundations, Techniques, and Applications, Second Edition provides all the theory, details, and tools necessary to build visualizations and systems involving the visualization of data. In color throughout, it explains basic terminology and concepts, algorithmic and software engineering issues, and commonly used techniques and high-level algorithms. Full source code is provided for completing implementations.

  7. Medicare Provider Data - Hospice Providers

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Hospice Utilization and Payment Public Use File provides information on services provided to Medicare beneficiaries by hospice providers. The Hospice PUF...

  8. Tectonic Inversion Along the Algerian and Ligurian Margins: On the Insight Provided By Latest Seismic Processing Techniques Applied to Recent and Vintage 2D Offshore Multichannel Seismic Data

    Science.gov (United States)

    Schenini, L.; Beslier, M. O.; Sage, F.; Badji, R.; Galibert, P. Y.; Lepretre, A.; Dessa, J. X.; Aidi, C.; Watremez, L.

    2014-12-01

    Recent studies on the Algerian and the North-Ligurian margins in the Western Mediterranean have evidenced inversion-related superficial structures, such as folds and asymmetric sedimentary perched basins whose geometry hints at deep compressive structures dipping towards the continent. Deep seismic imaging of these margins is difficult due to steep slope and superficial multiples, and, in the Mediterranean context, to the highly diffractive Messinian evaporitic series in the basin. During the Algerian-French SPIRAL survey (2009, R/V Atalante), 2D marine multi-channel seismic (MCS) reflection data were collected along the Algerian Margin using a 4.5 km, 360 channel digital streamer and a 3040 cu. in. air-gun array. An advanced processing workflow has been laid out using Geocluster CGG software, which includes noise attenuation, 2D SRME multiple attenuation, surface consistent deconvolution, Kirchhoff pre-stack time migration. This processing produces satisfactory seismic images of the whole sedimentary cover, and of southward dipping reflectors in the acoustic basement along the central part of the margin offshore Great Kabylia, that are interpreted as inversion-related blind thrusts as part of flat-ramp systems. We applied this successful processing workflow to old 2D marine MCS data acquired on the North-Ligurian Margin (Malis survey, 1995, R/V Le Nadir), using a 2.5 km, 96 channel streamer and a 1140 cu. in. air-gun array. Particular attention was paid to multiple attenuation in adapting our workflow. The resulting reprocessed seismic images, interpreted with a coincident velocity model obtained by wide-angle data tomography, provide (1) enhanced imaging of the sedimentary cover down to the top of the acoustic basement, including the base of the Messinian evaporites and the sub-salt Miocene series, which appear to be tectonized as far as in the mid-basin, and (2) new evidence of deep crustal structures in the margin which the initial processing had failed to

  9. Practical hacking techniques and countermeasures

    CERN Document Server

    Spivey, Mark D

    2006-01-01

    Examining computer security from the hacker's perspective, Practical Hacking Techniques and Countermeasures employs virtual computers to illustrate how an attack is executed, including the script, compilation, and results. It provides detailed screen shots in each lab for the reader to follow along in a step-by-step process in order to duplicate and understand how the attack works. It enables experimenting with hacking techniques without fear of corrupting computers or violating any laws. Written in a lab manual style, the book begins with the installation of the VMware® Workstation product and guides the users through detailed hacking labs enabling them to experience what a hacker actually does during an attack. It covers social engineering techniques, footprinting techniques, and scanning tools. Later chapters examine spoofing techniques, sniffing techniques, password cracking, and attack tools. Identifying wireless attacks, the book also explores Trojans, Man-in-the-Middle (MTM) attacks, and Denial of S...

  10. Exploring Architectural Details Through a Wearable Egocentric Vision Device

    Directory of Open Access Journals (Sweden)

    Stefano Alletto

    2016-02-01

    Full Text Available Augmented user experiences in the cultural heritage domain are in increasing demand from the new digital-native tourists of the 21st century. In this paper, we propose a novel solution that aims at assisting the visitor during an outdoor tour of a cultural site using the unique first-person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve the details by proposing a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution can localize the user with respect to the 3D point cloud of the historical landmark and provide him with information about the details at which he is currently looking. Experimental results validate the method both in terms of accuracy and computational effort. Furthermore, user evaluation based on real-world experiments shows that the proposal is deemed effective in enriching a cultural experience.
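
    The "descriptor based on the covariance of local features" admits a compact sketch: stack a region's local feature vectors into a matrix, take their covariance, and compare descriptors with a metric suited to symmetric positive-definite matrices. The feature choice, the log-Euclidean metric, and all numbers below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def covariance_descriptor(features):
    """Compact region descriptor: the d x d covariance of local features.

    `features` is an (n, d) array, one row per local feature (e.g. pixel
    position, intensity, gradients -- the exact feature set is an
    assumption here).
    """
    f = np.asarray(features, dtype=float)
    return np.cov(f, rowvar=False)

def log_euclidean_distance(c1, c2, eps=1e-6):
    """Distance between covariance descriptors via the matrix logarithm.

    Covariances live on the manifold of symmetric positive-definite
    matrices, so plain Euclidean distance is a poor fit; the
    log-Euclidean metric is one common choice.
    """
    def logm_spd(c):
        w, v = np.linalg.eigh(c + eps * np.eye(c.shape[0]))
        return (v * np.log(w)) @ v.T  # v diag(log w) v^T
    return float(np.linalg.norm(logm_spd(c1) - logm_spd(c2)))

rng = np.random.default_rng(0)
patch_a = rng.normal(size=(200, 5))
patch_b = rng.normal(size=(200, 5)) * 2.0  # different feature statistics
d_same = log_euclidean_distance(covariance_descriptor(patch_a),
                                covariance_descriptor(patch_a))
d_diff = log_euclidean_distance(covariance_descriptor(patch_a),
                                covariance_descriptor(patch_b))
```

A matching detail is then the stored descriptor with the smallest such distance to the query region's descriptor.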

  11. Phonetic Detail in American English

    Institute of Scientific and Technical Information of China (English)

    Ray Freeze

    1987-01-01

    In the course of teaching general phonetics and phonological analysis in the past few years, I have found some phonetic detail which some native speakers as well as non-native speakers were unaware of. This subtle detail will be the focus of this presentation. Some of this detail many of you will already be aware of because of your experience in learning, teaching, and thinking about English. If anything is new to you, I hope you might enjoy hearing about it even if it turns out not to be useful in your work.

  12. Brachytherapy applications and techniques

    CERN Document Server

    Devlin, Phillip M

    2015-01-01

    Written by the foremost experts in the field, this volume is a comprehensive text and practical reference on contemporary brachytherapy. The book provides detailed, site-specific information on applications and techniques of brachytherapy in the head and neck, central nervous system, breast, thorax, gastrointestinal tract, and genitourinary tract, as well as on gynecologic brachytherapy, low dose rate and high dose rate sarcoma brachytherapy, vascular brachytherapy, and pediatric applications. The book thoroughly describes and compares the four major techniques used in brachytherapy-intraca

  13. Wireless communications algorithmic techniques

    CERN Document Server

    Vitetta, Giorgio; Colavolpe, Giulio; Pancaldi, Fabrizio; Martin, Philippa A

    2013-01-01

    This book introduces the theoretical elements at the basis of various classes of algorithms commonly employed in the physical layer (and, in part, in the MAC layer) of wireless communications systems. It focuses on single user systems, so ignoring multiple access techniques. Moreover, emphasis is put on single-input single-output (SISO) systems, although some relevant topics about multiple-input multiple-output (MIMO) systems are also illustrated. Comprehensive wireless-specific guide to algorithmic techniques. Provides a detailed analysis of channel equalization and channel coding for wi

  14. Survey of semantic modeling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Smith, C.L.

    1975-07-01

    The analysis of the semantics of programming languages was attempted with numerous modeling techniques. By providing a brief survey of these techniques together with an analysis of their applicability for answering semantic issues, this report attempts to illuminate the state-of-the-art in this area. The intent is to be illustrative rather than thorough in the coverage of semantic models. A bibliography is included for the reader who is interested in pursuing this area of research in more detail.

  16. Clinical techniques of artificial insemination in dogs.

    Science.gov (United States)

    Makloski, Chelsea L

    2012-05-01

    This article provides an overview of the current breeding techniques used in small animal reproduction today with an emphasis on artificial insemination techniques such as transvaginal and transcervical insemination as well as surgical deposition of semen in the uterus and oviduct. Breeding management and ovulation timing will be mentioned but are discussed in further detail in another article in this issue.

  17. A technique for determining the optimum mix of logistics service providers of a make-to-order supply chain by formulating and solving a constrained nonlinear cost optimization problem

    Directory of Open Access Journals (Sweden)

    Mrityunjoy Roy

    2013-04-01

    Full Text Available In this paper, a technique has been developed to determine the optimum mix of logistics service providers of a make-to-order (MTO) supply chain. A serial MTO supply chain with different stages/processes has been considered. For each stage, different logistics service providers with different mean processing lead times but the same lead-time variances are available. A realistic assumption has been made in our study that, for each stage, the logistics service provider who charges more for his service consumes less processing lead time and vice versa. Thus for each stage, for each service provider, a combination of cost and mean processing lead time is available. Using these combinations, for each stage, a polynomial curve expressing the cost of that stage as a function of mean processing lead time is fit. Cumulating all such cost expressions for the different stages, along with the incorporation of suitable constraints arising out of timely delivery, results in the formulation of a constrained nonlinear cost optimization problem. On solving the problem using Mathematica, the optimum processing lead time for each stage is obtained. Using these optimum processing lead times and by employing a simple technique, the optimum logistics service provider mix of the supply chain, along with the corresponding total cost of processing, is determined. Finally, to examine the effect of changes in different parameters on the optimum total processing cost of the supply chain, sensitivity analysis has been carried out graphically.
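
    The pipeline described above (fit a polynomial cost curve per stage from provider quotes, minimize total cost under a delivery-time constraint, then map the optimum lead times back to providers) can be sketched as follows. The quotes, deadline, polynomial degree, and SLSQP solver are illustrative assumptions; the paper itself solves the problem in Mathematica:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical provider quotes per stage: (mean lead time in days, cost).
# Within a stage, faster service costs more, matching the paper's assumption.
stage_quotes = [
    [(2.0, 110.0), (3.0, 80.0), (4.0, 65.0), (5.0, 60.0)],
    [(1.0, 90.0), (2.0, 55.0), (3.0, 40.0), (4.0, 35.0)],
]
DEADLINE = 6.0  # total processing lead time allowed (days)

# Step 1: per stage, fit cost as a polynomial of mean processing lead time.
polys = []
for quotes in stage_quotes:
    t, c = np.array(quotes).T
    polys.append(np.polynomial.Polynomial.fit(t, c, deg=2))

def total_cost(times):
    return sum(p(t) for p, t in zip(polys, times))

# Step 2: minimize total cost subject to the timely-delivery constraint.
bounds = [(2.0, 5.0), (1.0, 4.0)]  # each stage within its quoted range
cons = [{"type": "ineq", "fun": lambda t: DEADLINE - np.sum(t)}]
res = minimize(total_cost, x0=np.array([3.5, 2.5]),
               bounds=bounds, constraints=cons, method="SLSQP")
opt_times = res.x

# Step 3: per stage, pick the quoted provider closest to the optimum time.
chosen = [min(quotes, key=lambda q: abs(q[0] - t))
          for quotes, t in zip(stage_quotes, opt_times)]
```

Because each fitted cost curve is decreasing over the quoted range, the delivery constraint binds and the solver splits the deadline so that marginal costs are balanced across stages.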

  18. Review of Ship Structural Details

    Science.gov (United States)

    1977-01-01

    [Garbled scan; recoverable fragments of a contents listing:] 4.3 Knee and Beam Brackets; 4.3.1 Brackets for Girders and Deep Webs; 4.3.2 Brackets Connecting Rolled Sections; 4.4 Tripping... Examples are shell stringers penetrating deep web frames and longitudinal girders penetrating deep transverses. This is not a common detail. If double... [Figure caption fragment:] Detail Type: STANCHION END

  19. Transformative Dynamics in Detailed Planning

    DEFF Research Database (Denmark)

    Quitzau, Maj-Britt; Poulsen, Naja; Gustavsson, Ted

    that the translation process relies heavily on integration of impositions in the detailed plan, although this has clear limitations, since some sustainable strategies are more difficult to impose than others. It also shows how strategic navigation may represent an alternative translation strategy to promote more difficult sustainable strategies that address the project design more directly. In conclusion, the paper argues that strategic navigation represents a stronger mediator of change compared to the detailed plan, but that especially timing issues in the coordination between formal planning and design processes...

  20. DAGAL: Detailed Anatomy of Galaxies

    Science.gov (United States)

    Knapen, Johan H.

    2017-03-01

    The current IAU Symposium is closely connected to the EU-funded network DAGAL (Detailed Anatomy of Galaxies), with the final annual network meeting of DAGAL being at the core of this international symposium. In this short paper, we give an overview of DAGAL, its training activities, and some of the scientific advances that have been made under its umbrella.

  1. DAGAL: Detailed Anatomy of Galaxies

    CERN Document Server

    Knapen, Johan H

    2016-01-01

    The current IAU Symposium is closely connected to the EU-funded network DAGAL (Detailed Anatomy of Galaxies), with the final annual network meeting of DAGAL being at the core of this international symposium. In this short paper, we give an overview of DAGAL, its training activities, and some of the scientific advances that have been made under its umbrella.

  2. On Detailing in Contemporary Architecture

    DEFF Research Database (Denmark)

    Kristensen, Claus; Kirkegaard, Poul Henning

    2010-01-01

    tactility can blur the meaning of the architecture and turn it into an empty statement. The present paper will outline detailing in contemporary architecture and discuss the issue with respect to architectural quality. Architectural cases considered as sublime pieces of architecture will be presented...

  3. Detail in architecture: Between arts & crafts

    Science.gov (United States)

    Dulencin, Juraj

    2016-06-01

    Architectural detail represents an important part of architecture. Not only can it be used as an identifier of a specific building, but it also enhances the experience of the realized project. Within it lie the signs of a great architect and clues to understanding his or her way of thinking. It is therefore the central topic of a seminar offered to architecture students at the Brno University of Technology. During the course of the semester-long class the students acquaint themselves with atypical architectural details of domestic and international architects by learning to read them, understand them and subsequently draw them by creating architectural blueprints. In other words, by general analysis of a detail the students learn the theoretical thinking of its architect who, depending on the nature of the design, had to incorporate a variety of techniques and crafts. Students apply this analytical part to their own architectural detail design. The methodology of the seminar consists of experiential learning by project management and is complemented by a series of lectures discussing a diversity of details as well as the materials and technologies required to implement them. The architectural detail design is also part of the students' bachelor's theses; therefore, the realistic nature of their blueprints can be verified in the production process of its physical counterpart. Based on their own documentation the students choose the most suitable manufacturing process, whether it is supplied by a specific technology or by a craftsman. Students actively participate in the production and correct their design proposals in real scale with the actual material. A student, as a future architect, stands somewhere between a client and an artisan, materializes his or her idea and adjusts the manufacturing process so that the final detail fulfills aesthetic consistency and is in harmony with its initial concept. One of the very important aspects of the design is its economic cost, an

  4. Experimental techniques; Techniques experimentales

    Energy Technology Data Exchange (ETDEWEB)

    Roussel-Chomaz, P. [GANIL CNRS/IN2P3, CEA/DSM, 14 - Caen (France)

    2007-07-01

    This lecture presents the experimental techniques developed in the last 10 or 15 years in order to perform a new class of experiments with exotic nuclei, where the reactions induced by these nuclei allow one to get information on their structure. A brief review of the secondary beam production methods will be given, with some examples of facilities in operation or under project. The important developments performed recently on cryogenic targets will be presented. The different detection systems will be reviewed, both the beam detectors before the targets and the many kinds of detectors necessary to detect all outgoing particles after the reaction: magnetic spectrometers for the heavy fragments, detection systems for the target recoil nucleus, and {gamma} detectors. Finally, several typical examples of experiments will be detailed, in order to illustrate the use of each detector either alone or in coincidence with others. (author)

  5. Ultrasonic techniques for fluids characterization

    CERN Document Server

    Povey, Malcolm J W

    1997-01-01

    This book is a comprehensive and practical guide to the use of ultrasonic techniques for the characterization of fluids. Focusing on ultrasonic velocimetry, the author covers the basic topics and techniques necessary for successful ultrasound measurements on emulsions, dispersions, multiphase media, and viscoelastic/viscoplastic materials. Advanced techniques such as scattering, particle sizing, and automation are also presented. As a handbook for industrial and scientific use, Ultrasonic Techniques for Fluids Characterization is an indispensable guide to chemists and chemical engineers using ultrasound for research or process monitoring in the chemical, food processing, pharmaceutical, cosmetic, biotechnology, and fuels industries. Key Features * Appeals to anyone using ultrasound to study fluids * Provides the first detailed description of the ultrasound profiling technique for dispersions * Describes new techniques for measuring phase transitions and nucleation, such as water/ice and oil/fat * Presents the l...

  6. A DETAILED REVIEW ON ORAL MUCOSAL DRUG DELIVERY SYSTEM

    Directory of Open Access Journals (Sweden)

    Radha Bhati

    2012-03-01

    Full Text Available The oral mucosal drug delivery system is widely applicable as a novel site for the administration of drugs for immediate and controlled release action, preventing first-pass metabolism and enzymatic degradation due to GI microbial flora. Oral mucosal drug delivery provides local and systemic action. In this review, attention is focused on the physiology of the oral mucosa, including tissue permeability, barriers to permeation and routes of permeation, the biopharmaceutics of buccal and sublingual absorption, factors affecting drug absorption, detailed information on penetration enhancers, the design of oral mucosal drug delivery systems, and the role of mucoadhesion and various theories of bioadhesion. Evaluation techniques and the selection of animal models for in-vivo studies are also discussed.

  7. Space Telecommunications Radio System (STRS) Architecture, Tutorial Part 2 - Detailed

    Science.gov (United States)

    Handler, Louis

    2014-01-01

    The STRS architecture detail presentation covers each requirement in the STRS Architecture Standard, with examples and supporting information. The purpose is to give a platform provider, application provider, or application integrator a better, more detailed understanding of the STRS Architecture Standard and its use.

  8. The effect of providing feedback on inhaler technique and adherence from an electronic audio recording device, INCA®, in a community pharmacy setting: study protocol for a randomised controlled trial.

    Science.gov (United States)

    O'Dwyer, Susan Mary; MacHale, Elaine; Sulaiman, Imran; Holmes, Martin; Hughes, Cian; D'Arcy, Shona; Rapcan, Viliam; Taylor, Terence; Boland, Fiona; Bosnic-Anticevich, Sinthia; Reilly, Richard B; Ryder, Sheila A; Costello, Richard W

    2016-05-04

    Poor adherence to inhaled medication may lead to inadequate symptom control in patients with respiratory disease. In practice it can be difficult to identify poor adherence. We designed an acoustic recording device, the INCA® (INhaler Compliance Assessment) device, which, when attached to an inhaler, identifies and records the time and technique of inhaler use, thereby providing objective longitudinal data on an individual's adherence to inhaled medication. This study will test the hypothesis that providing objective, personalised, visual feedback on adherence to patients in combination with a tailored educational intervention in a community pharmacy setting, improves adherence more effectively than education alone. The study is a prospective, cluster randomised, parallel-group, multi-site study conducted over 6 months. The study is designed to compare current best practice in care (i.e. routine inhaler technique training) with the use of the INCA® device for respiratory patients in a community pharmacy setting. Pharmacies are the unit of randomisation and on enrolment to the study they will be allocated by the lead researcher to one of the three study groups (intervention, comparator or control groups) using a computer-generated list of random numbers. Given the nature of the intervention neither pharmacists nor participants can be blinded. The intervention group will receive feedback from the acoustic recording device on inhaler technique and adherence three times over a 6-month period along with inhaler technique training at each of these times. The comparator group will also receive training in inhaler use three times over the 6-month study period but no feedback on their habitual performance. The control group will receive usual care (i.e. the safe supply of medicines and advice on their use). The primary outcome is the rate of participant adherence to their inhaled medication, defined as the proportion of correctly taken doses of medication at the correct
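
    The allocation step above (pharmacies, not patients, are the unit of randomisation, assigned to three groups from a computer-generated list of random numbers) can be sketched as balanced cluster randomisation. The seed and the round-robin dealing are illustrative assumptions, not the trial's actual procedure:

```python
import random

def allocate_clusters(pharmacies, arms, seed=2016):
    """Balanced cluster randomisation: shuffle units, deal them round-robin.

    Each pharmacy (the unit of randomisation) receives exactly one arm,
    and arm sizes differ by at most one. The fixed seed stands in for the
    trial's pre-generated random-number list.
    """
    rng = random.Random(seed)
    shuffled = pharmacies[:]
    rng.shuffle(shuffled)
    return {arm: shuffled[i::len(arms)] for i, arm in enumerate(arms)}

# Hypothetical pharmacy identifiers and the study's three arms.
arms = ["intervention", "comparator", "control"]
pharmacies = [f"pharmacy_{i:02d}" for i in range(12)]
allocation = allocate_clusters(pharmacies, arms)
```

Randomising whole pharmacies rather than individual patients avoids contamination between arms within a single pharmacy, at the cost of needing a cluster-adjusted analysis.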

  9. Structural concepts and details for seismic design

    Energy Technology Data Exchange (ETDEWEB)

    1991-09-01

This manual discusses building and building component behavior during earthquakes, and provides suggested details for seismic resistance which have been shown by experience to provide adequate performance during earthquakes. Special design and construction practices are also described which, although they might be common in some high-seismic regions, may not be common in low and moderate seismic-hazard regions of the United States. Special attention is given to describing the level of detailing appropriate for each seismic region. The UBC seismic criteria for all seismic zones are carefully examined, and many examples of connection details are given. The general scope of discussion is limited to materials and construction types common to Department of Energy (DOE) sites. Although the manual is primarily written for professional engineers engaged in performing seismic-resistant design for DOE facilities, the first two chapters, plus the introductory sections of succeeding chapters, contain descriptions which are also directed toward project engineers who authorize, review, or supervise the design and construction of DOE facilities. 88 refs., 188 figs.

  10. A Generalized Detailed Balance Relation

    Science.gov (United States)

    Ruelle, David

    2016-08-01

    Given a system M in a thermal bath we obtain a generalized detailed balance relation for the ratio r=π _τ (K→ J)/π _τ (J→ K) of the transition probabilities M:J→ K and M:K→ J in time τ . We assume an active bath, containing solute molecules in metastable states. These molecules may react with M and the transition J→ K occurs through different channels α involving different reactions with the bath. We find that r=sum p^α r^α , where p^α is the probability that channel α occurs, and r^α depends on the amount of heat (more precisely enthalpy) released to the bath in channel α.
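The relation r = sum p^α r^α lends itself to a direct numerical illustration. The sketch below assumes Boltzmann-type channel factors r^α = exp(-ΔH^α / k_B T) in the enthalpy released to the bath; the channel probabilities and enthalpies are invented for illustration and do not come from the paper:

```python
import math

K_B_T = 1.0  # work in units where k_B * T = 1


def channel_ratio(delta_h):
    """Ratio r^alpha for one channel, assuming a Boltzmann factor in the
    enthalpy released to the bath (an illustrative assumption)."""
    return math.exp(-delta_h / K_B_T)


def transition_ratio(channels):
    """Generalized detailed balance: r = sum_alpha p^alpha * r^alpha,
    where p^alpha is the probability that channel alpha occurs.

    `channels` is a list of (p_alpha, delta_h_alpha) pairs."""
    return sum(p * channel_ratio(dh) for p, dh in channels)


# Two hypothetical channels: (probability, enthalpy released to the bath)
channels = [(0.7, 0.5), (0.3, 2.0)]
r = transition_ratio(channels)
```

With a single channel releasing zero enthalpy the ratio reduces to 1, as ordinary detailed balance would require.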

  11. A generalized detailed balance relation

    CERN Document Server

    Ruelle, David

    2015-01-01

    Given a system $M$ in a thermal bath we obtain a generalized detailed balance relation for the ratio $r=\\pi_\\tau(K\\to J)/\\pi_\\tau(J\\to K)$ of the transition probabilities $M:J\\to K$ and $M:K\\to J$ in time $\\tau$. We assume an active bath, containing solute molecules in metastable states. These molecules may react with $M$ and the transition $J\\to K$ occurs through different channels $\\alpha$ involving different reactions with the bath. We find that $r=\\sum p^\\alpha r^\\alpha$, where $p^\\alpha$ is the probability that channel $\\alpha$ occurs, and $r^\\alpha$ depends on the amount of heat (more precisely enthalpy) released to the bath in channel $\\alpha$.

  12. Detailed gravimetric geoid computations in North America

    Science.gov (United States)

    Marsh, J. G.; Chang, E. S.

    1976-01-01

    A detailed gravimetric geoid has been computed for the Eastern United States and the Northwestern Atlantic Ocean by combining the Goddard Space Flight Center GEM-8 earth gravity model with the available 15 x 15 arcmin and 1 x 1 deg mean free air surface gravity observations. The short wavelength undulations were computed by applying Stokes' formula to the 15 x 15 arcmin and 1 x 1 deg surface data. The long wavelength undulations were provided by the GEM-8 model. The gravimetric geoid has been compared with Geoceiver derived and astrogeodetically determined geoid heights in the United States and the rms agreement is on the order of 1.5 meters. Excellent agreement in shape has been found between the detailed geoid and geoidal profiles derived from GEOS-III altimeter data in the Northwest Atlantic Ocean.

  13. Medicare Referring Provider DMEPOS PUF

    Data.gov (United States)

U.S. Department of Health & Human Services — This dataset, which is part of CMS's Medicare Provider Utilization and Payment Data, details information on Durable Medical Equipment, Prosthetics, Orthotics and...

  14. Whole Genome Amplification of Day 3 or Day 5 Human Embryos Biopsies Provides a Suitable DNA Template for PCR-Based Techniques for Genotyping, a Complement of Preimplantation Genetic Testing

    Directory of Open Access Journals (Sweden)

    Elizabeth Schaeffer

    2017-01-01

Our objective was to determine if whole genome amplification (WGA) provides suitable DNA for qPCR-based genotyping of human embryos. Single blastomeres (Day 3) or trophoblastic cells (Day 5) were isolated from 342 embryos for WGA. Comparative Genomic Hybridization determined embryo sex as well as Trisomy 18 or Trisomy 21. To determine the embryo’s sex, qPCR melting curve analysis for SRY and DYS14 was used. Logistic regression indicated a 4.4%, 57.1%, or 98.8% probability of a male embryo when neither gene, SRY only, or both genes were detected, respectively (accuracy = 94.1%, kappa = 0.882, and p<0.001). Fluorescent Capillary Electrophoresis for the amelogenin genes (AMEL) was also used to determine sex. The AMELY peak’s height was higher, and this peak’s presence was highly predictive of male embryos (AUC = 0.93, accuracy = 81.7%, kappa = 0.974, and p<0.001). Trisomy 18 and Trisomy 21 were determined using the threshold cycle difference for RPL17 and TTC3, respectively, which were significantly lower in the corresponding embryos. The Ct difference for TTC3 specifically determined Trisomy 21 (AUC = 0.89) and that for RPL17 determined Trisomy 18 (AUC = 0.94). Here, WGA provides adequate DNA for PCR-based techniques for preimplantation genotyping.
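In use, the logistic-regression result reduces to a lookup from marker detection to the probability of a male embryo. A minimal sketch using only the probabilities quoted above (the function name and the handling of the unreported DYS14-only case are our own):

```python
def prob_male(sry_detected: bool, dys14_detected: bool) -> float:
    """Probability of a male embryo given qPCR detection of SRY and DYS14,
    using the values reported in the study: 4.4% (neither gene detected),
    57.1% (SRY only), 98.8% (both genes detected)."""
    if dys14_detected and not sry_detected:
        # This combination was not reported in the study's results.
        raise ValueError("DYS14-only detection was not reported in the study")
    if sry_detected and dys14_detected:
        return 0.988
    if sry_detected:
        return 0.571
    return 0.044
```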

  15. Detailed modeling of mountain wave PSCs

    Directory of Open Access Journals (Sweden)

    S. Fueglistaler

    2003-01-01

Polar stratospheric clouds (PSCs) play a key role in polar ozone depletion. In the Arctic, PSCs can occur on the mesoscale due to orographically induced gravity waves. Here we present a detailed study of a mountain wave PSC event on 25-27 January 2000 over Scandinavia. The mountain wave PSCs were intensively observed by in-situ and remote-sensing techniques during the second phase of the SOLVE/THESEO-2000 Arctic campaign. We use these excellent data of PSC observations on 3 successive days to analyze the PSCs and to perform a detailed comparison with modeled clouds. We simulated the 3-dimensional PSC structure on all 3 days with a mesoscale numerical weather prediction (NWP) model and a microphysical box model (using best available nucleation rates for ice and nitric acid trihydrate particles). We show that the combined mesoscale/microphysical model is capable of reproducing the PSC measurements within the uncertainty of data interpretation with respect to spatial dimensions, temporal development and microphysical properties, without manipulating temperatures or using other tuning parameters. In contrast, microphysical modeling based upon coarser scale global NWP data, e.g. current ECMWF analysis data, cannot reproduce observations, in particular the occurrence of ice and nitric acid trihydrate clouds. Combined mesoscale/microphysical modeling may be used for detailed a posteriori PSC analysis and for future Arctic campaign flight and mission planning. The fact that remote sensing alone cannot further constrain model results, due to uncertainties in the interpretation of measurements, underlines the need for synchronous in-situ PSC observations in campaigns.

  16. Detailed measurement on a HESCO diffuser

    DEFF Research Database (Denmark)

    Jensen, Rasmus Lund; Holm, Dorte; Nielsen, Peter V.

    2007-01-01

This paper focuses on measuring the inlet velocity from a HESCO diffuser used in the IEA Annex 20 work as a function of the volume flow it provides. The aim of the present work is to establish a relation between the inlet velocity, the effective area and the airflow. This is important because the inlet velocity is a very important boundary condition both in CFD calculation and general flow measurements. If only the volume flow and the geometrical area are used, a relatively large error in the inlet velocity may result. From the detailed measurements it was possible to establish an expression...

  17. A detailed phylogeny for the Methanomicrobiales

    Science.gov (United States)

    Rouviere, P.; Mandelco, L.; Winker, S.; Woese, C. R.

    1992-01-01

    The small subunit rRNA sequence of twenty archaea, members of the Methanomicrobiales, permits a detailed phylogenetic tree to be inferred for the group. The tree confirms earlier studies, based on far fewer sequences, in showing the group to be divided into two major clusters, temporarily designated the "methanosarcina" group and the "methanogenium" group. The tree also defines phylogenetic relationships within these two groups, which in some cases do not agree with the phylogenetic relationships implied by current taxonomic names--a problem most acute for the genus Methanogenium and its relatives. The present phylogenetic characterization provides the basis for a consistent taxonomic restructuring of this major methanogenic taxon.

  19. Research on the Hotel Image Based on the Detail Service

    Science.gov (United States)

    Li, Ban; Shenghua, Zheng; He, Yi

Detail service management, initially developed as a marketing program to enhance customer loyalty, has now become an important part of customer relation strategy. This paper analyzes the critical factors of detail service and their influence on the hotel image. We establish a theoretical model of the factors influencing hotel image and propose corresponding hypotheses. We apply statistical methods to test and verify these hypotheses. This paper provides a foundation for further study of detail service design and planning issues.

  20. Academic Detailing in Diabetes: Using Outreach Education to Improve the Quality of Care.

    Science.gov (United States)

    Fischer, Michael A

    2016-10-01

    Most diabetes care is provided in primary care settings, but typical primary care clinicians struggle to keep up with the latest evidence on diabetes screening, pharmacotherapy, and monitoring. Accordingly, many patients with diabetes are not receiving optimal guideline-based therapy. Relying on front-line clinicians on their own to assess the huge volume of new literature and incorporate it into their practice is unrealistic, and conventional continuing medical education has not proven adequate to address gaps in care. Academic detailing, direct educational outreach to clinicians that uses social marketing techniques to provide specific evidence-based recommendations, has been proven in clinical trials to improve the quality of care for a range of conditions. By directly engaging with clinicians to assess their needs, identify areas for change in practice, and provide them with specific tools to implement these changes, academic detailing can serve as a tool to improve care processes and outcomes for patients with diabetes.

  1. Detailed black hole state counting in loop quantum gravity

    Science.gov (United States)

    Agullo, Ivan; Barbero G., J. Fernando; Borja, Enrique F.; Diaz-Polo, Jacobo; Villaseñor, Eduardo J. S.

    2010-10-01

    We give a complete and detailed description of the computation of black hole entropy in loop quantum gravity by employing the most recently introduced number-theoretic and combinatorial methods. The use of these techniques allows us to perform a detailed analysis of the precise structure of the entropy spectrum for small black holes, showing some relevant features that were not discernible in previous computations. The ability to manipulate and understand the spectrum up to the level of detail that we describe in the paper is a crucial step toward obtaining the behavior of entropy in the asymptotic (large horizon area) regime.

  2. Detailed black hole state counting in loop quantum gravity

    CERN Document Server

    Agullo, Ivan; Borja, Enrique F; Diaz-Polo, Jacobo; Villaseñor, Eduardo J S; 10.1103/PhysRevD.82.084029

    2011-01-01

    We give a complete and detailed description of the computation of black hole entropy in loop quantum gravity by employing the most recently introduced number-theoretic and combinatorial methods. The use of these techniques allows us to perform a detailed analysis of the precise structure of the entropy spectrum for small black holes, showing some relevant features that were not discernible in previous computations. The ability to manipulate and understand the spectrum up to the level of detail that we describe in the paper is a crucial step towards obtaining the behavior of entropy in the asymptotic (large horizon area) regime.

  3. Large Terrain Continuous Level of Detail 3D Visualization Tool

    Science.gov (United States)

    Myint, Steven; Jain, Abhinandan

    2012-01-01

    This software solved the problem of displaying terrains that are usually too large to be displayed on standard workstations in real time. The software can visualize terrain data sets composed of billions of vertices, and can display these data sets at greater than 30 frames per second. The Large Terrain Continuous Level of Detail 3D Visualization Tool allows large terrains, which can be composed of billions of vertices, to be visualized in real time. It utilizes a continuous level of detail technique called clipmapping to support this. It offloads much of the work involved in breaking up the terrain into levels of details onto the GPU (graphics processing unit) for faster processing.
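Clipmapping keeps a fixed-size vertex grid per level of detail, with each coarser level covering twice the extent of the finer one, so rendering cost grows with the number of levels rather than with terrain size. A minimal sketch of that level-selection arithmetic (the grid size, extents and function names are illustrative, not the tool's actual API):

```python
import math


def clip_level(distance, finest_extent=64.0):
    """Pick the clipmap ring for geometry at `distance` from the viewer:
    level 0 covers `finest_extent` world units, and each coarser level
    doubles the covered extent."""
    if distance <= finest_extent:
        return 0
    return int(math.ceil(math.log2(distance / finest_extent)))


def vertices_rendered(num_levels, grid_size=255):
    """Each clipmap level holds a fixed grid_size x grid_size vertex window,
    so total vertex work grows linearly in levels, not in terrain size."""
    return num_levels * grid_size * grid_size
```

Ten nested levels at a 255x255 window would cover a span 2^9 times the finest extent while keeping the vertex count fixed per level.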

  4. 46 CFR 194.05-7 - Explosives-Detail requirements.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Explosives-Detail requirements. 194.05-7 Section 194.05... HANDLING, USE, AND CONTROL OF EXPLOSIVES AND OTHER HAZARDOUS MATERIALS Stowage and Marking § 194.05-7 Explosives—Detail requirements. (a) Except as otherwise provided by this part, Division 1.1 and 1.2...

  5. Commentary: The Perils of Seduction: Distracting Details or Incomprehensible Abstractions?

    Science.gov (United States)

    Goetz, Ernest T.; Sadoski, Mark

    1995-01-01

    Reviews studies that have explicitly investigated the "seductive detail" effect (in which a reader's attention is diverted toward the interesting but unimportant seductive details and away from the uninteresting but important main ideas). Concludes that these studies do not provide convincing evidence for the existence of the effect. Argues that…

  6. Fun While Showing, Not Telling: Crafting Vivid Detail in Writing

    Science.gov (United States)

    Del Nero, Jennifer Renner

    2017-01-01

    This teaching tip highlights three writing minilessons that help students construct vivid sensory detail (textual detail related to the five senses) in their fiction and creative nonfiction writing. Learning to show, not tell, is a difficult task for novice writers. The author explores reasons why this is the case and provides directions for the…

  7. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  8. Scanning ion-selective electrode technique and X-ray microanalysis provide direct evidence of contrasting Na+ transport ability from root to shoot in salt-sensitive cucumber and salt-tolerant pumpkin under NaCl stress.

    Science.gov (United States)

    Lei, Bo; Huang, Yuan; Sun, Jingyu; Xie, Junjun; Niu, Mengliang; Liu, Zhixiong; Fan, Molin; Bie, Zhilong

    2014-12-01

Grafting onto salt-tolerant pumpkin rootstock can increase cucumber salt tolerance. Previous studies have suggested that this can be attributed to pumpkin roots having a higher capacity to limit the transport of Na(+) to the shoot than cucumber roots. However, the mechanism remains unclear. This study investigated the transport of Na(+) in salt-tolerant pumpkin and salt-sensitive cucumber plants under high (200 mM) or moderate (90 mM) NaCl stress. Scanning ion-selective electrode technique showed that pumpkin roots exhibited a higher capacity to extrude Na(+), and a correspondingly increased H(+) influx, under 200 or 90 mM NaCl stress. The 200 mM NaCl-induced Na(+)/H(+) exchange in the root was inhibited by amiloride (a Na(+)/H(+) antiporter inhibitor) or vanadate [a plasma membrane (PM) H(+)-ATPase inhibitor], indicating that Na(+) exclusion in salt-stressed pumpkin and cucumber roots was the result of an active Na(+)/H(+) antiporter across the PM, and that the Na(+)/H(+) antiporter system in salt-stressed pumpkin roots was sufficient to exclude Na(+). X-ray microanalysis showed higher Na(+) in the cortex, but lower Na(+) in the stele, of pumpkin roots than in cucumber roots under 90 mM NaCl stress, suggesting that the highly vacuolated root cortical cells of pumpkin roots could sequester more Na(+), limit the radial transport of Na(+) to the stele and thus restrict the transport of Na(+) to the shoot. These results provide direct evidence that pumpkin roots have a higher capacity than cucumber roots to limit the transport of Na(+) to the shoot.

  9. Report Details Solar Radiation Alert and Recommendations

    Science.gov (United States)

    Staedter, Tracy

    2006-06-01

High-energy particles from the Sun and from regions beyond the solar system constantly bombard Earth. Thanks to the planet's atmosphere and magnetic field, cosmic radiation is not a significant threat to those rooted on terra firma. But airline crew and passengers flying at high altitudes, or over the poles where the Earth's magnetic field provides no protection, are particularly vulnerable to unpredictable flares on the Sun's surface that launch streams of sub-atomic particles toward Earth. The report, ``Solar Radiation Alert System,'' published by the Federal Aviation Administration (FAA) and the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado, Boulder, in July 2005 (www.faa.gov/library/reports/medical/oamtechreports/2000s/media/0514.pdf) details an alert system designed to estimate the ionizing radiation at aircraft flight altitudes and, depending on the resulting dose rate, issue a warning.

  10. Details of tetrahedral anisotropic mesh adaptation

    Science.gov (United States)

    Jensen, Kristian Ejlebjerg; Gorman, Gerard

    2016-04-01

We have implemented tetrahedral anisotropic mesh adaptation using the local operations of coarsening, swapping, refinement and smoothing in MATLAB without the use of any for-loops, i.e. the script is fully vectorised. In the process of doing so, we have made three observations related to details of the implementation: 1. restricting refinement to a single edge split per element not only simplifies the code, it also improves mesh quality, 2. face to edge swapping is unnecessary, and 3. optimising for the Vassilevski functional tends to give a slightly higher value for the mean condition number functional than optimising for the condition number functional directly. These observations have been made for a uniform and a radial shock metric field, both starting from a structured mesh in a cube. Finally, we compare two coarsening techniques and demonstrate the importance of applying smoothing in the mesh adaptation loop. The results pertain to a unit cube geometry, but we also show the effect of corners and edges by applying the implementation in a spherical geometry.

  11. Monte Carlo methods beyond detailed balance

    NARCIS (Netherlands)

    Schram, Raoul D.; Barkema, Gerard T.

    2015-01-01

    Monte Carlo algorithms are nearly always based on the concept of detailed balance and ergodicity. In this paper we focus on algorithms that do not satisfy detailed balance. We introduce a general method for designing non-detailed balance algorithms, starting from a conventional algorithm satisfying
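As a baseline for what such algorithms give up, the conventional Metropolis rule does satisfy detailed balance, π(a)P(a→b) = π(b)P(b→a). A minimal sketch verifying this numerically for Boltzmann weights and a symmetric proposal (the two-state energies are arbitrary illustrative values):

```python
import math


def metropolis_acceptance(e_from, e_to, beta=1.0):
    """Metropolis rule: accept the move with probability min(1, exp(-beta*dE))."""
    return min(1.0, math.exp(-beta * (e_to - e_from)))


def check_detailed_balance(e_a, e_b, beta=1.0):
    """Verify pi(a) * A(a->b) == pi(b) * A(b->a) for unnormalised Boltzmann
    weights pi(x) ~ exp(-beta * E_x), assuming a symmetric proposal
    distribution (so only the acceptance probabilities matter)."""
    pi_a, pi_b = math.exp(-beta * e_a), math.exp(-beta * e_b)
    lhs = pi_a * metropolis_acceptance(e_a, e_b, beta)
    rhs = pi_b * metropolis_acceptance(e_b, e_a, beta)
    return abs(lhs - rhs) < 1e-12
```

Non-detailed-balance algorithms deliberately break this per-pair equality while preserving the stationary distribution overall (global balance).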

  12. Fabricating an Accurate Implant Master Cast: A Technique Report.

    Science.gov (United States)

    Balshi, Thomas J; Wolfinger, Glenn J; Alfano, Stephen G; Cacovean, Jeannine N; Balshi, Stephen F

    2015-12-01

    The technique for fabricating an accurate implant master cast following the 12-week healing period after Teeth in a Day® dental implant surgery is detailed. The clinical, functional, and esthetic details captured during the final master impression are vital to creating an accurate master cast. This technique uses the properties of the all-acrylic resin interim prosthesis to capture these details. This impression captures the relationship between the remodeled soft tissue and the interim prosthesis. This provides the laboratory technician with an accurate orientation of the implant replicas in the master cast with which a passive fitting restoration can be fabricated.

  13. The study of bronze statuettes with the help of neutron-imaging techniques

    NARCIS (Netherlands)

    Van Langh, R.; Lehmann, E.; Hartmann, S.; Kaestner, A.; Scholten, F.

    2009-01-01

    Until recently fabrication techniques of Renaissance bronzes have been studied only with the naked eye, microscopically, videoscopically and with X-radiography. These techniques provide information on production techniques, yet much important detail remains unclear. As part of an interdisciplinary s

  14. Manufacturing details by Neutron Radiography of Archaeological Pottery

    Energy Technology Data Exchange (ETDEWEB)

    Bernedo, Alfredo Victor Bellido; Latini, Rose Mary [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil); Souza, Maria Ines Silvani; Vinagre Filho, Ubirajara Maribondo [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2011-07-01

The aim of the present work was to investigate manufacturing details of archaeological ceramic potsherds by neutron radiography. Pottery is perhaps the most important artefact found in excavations. Its archaeological importance relies on the fact that it can reveal cultural traditions and commercial influences in ancient communities. This pottery was recently discovered in archaeological earth circular structure sites in Acre state, Brazil, and the characteristics of the clay used in its manufacture have been studied by modern scientific techniques such as Instrumental Neutron Activation Analysis (INAA), thermoluminescence dating and Moessbauer spectroscopy. Different fragments of pottery were submitted to a neutron flux of the order of 10{sup 5} n.cm{sup -2}.s{sup -1} for 3 minutes in the research reactor Argonauta at the Instituto de Engenharia Nuclear/CNEN. Digital processing techniques using an imaging plate were applied to the image of each selected sample. The neutron radiographs show two different manufacturing techniques: palette and rollers. The fragment made by the palette technique shows a homogeneous mass, while in the radiographs of fragments made by the roller technique (funeral pottery) horizontal traces of the junction of rollers can be seen, overlapping and forming layers supported on each other. This technique allows more stable structures to be created. Thus, both the palette and the roller techniques can be characterized by neutron radiography. (author)

  15. Nabokov's Details: Making Sense of Irrational Standards

    OpenAIRE

    2012-01-01

    Vladimir Nabokov's passion for detail is well-known, central to our very idea of the "Nabokovian." Yet Nabokov's most important claims for detail pose a challenge for the reader who would take them seriously. Startlingly extreme and deliberately counterintuitive -- Nabokov called them his "irrational standards" -- these claims push the very limits of reason and belief. Nabokov's critics have tended to treat his more extravagant claims for detail -- including his assertion that the "capacity t...

  16. Clinical professional governance for detailed clinical models.

    Science.gov (United States)

    Goossen, William; Goossen-Baremans, Anneke

    2013-01-01

This chapter describes the need for Detailed Clinical Models in contemporary electronic health systems, data exchange and data reuse. It starts with an explanation of the components related to Detailed Clinical Models, with a brief summary of knowledge representation, including terminologies representing clinically relevant "things" in the real world, and information models that abstract these in order to let computers process data about them. Next, Detailed Clinical Models are defined and their purpose is described. The work builds on existing developments around the world and culminates in current work to create a technical specification at the level of the International Organization for Standardization. The core components of properly expressed Detailed Clinical Models are illustrated, including clinical knowledge and context, data element specification, code bindings to terminologies, and meta-information about authors and versioning, among others. Detailed Clinical Models to date are heavily based on user requirements and specify the conceptual and logical levels of modelling. This is not precise enough for specific implementations, which require an additional step. However, it allows Detailed Clinical Models to serve as specifications for many different kinds of implementations. Examples of Detailed Clinical Models are presented both in text and in Unified Modelling Language. Detailed Clinical Models can be positioned in health information architectures, where they serve at the most detailed granular level. The chapter ends with examples of projects that create and deploy Detailed Clinical Models. All have in common that they can often reuse materials from earlier projects, and that strict governance of these models is essential to use them safely in health care information and communication technology. Clinical validation is one point of such governance, and model testing another. The Plan-Do-Check-Act cycle can be applied for the governance of Detailed Clinical Models.

  17. Displaying of Details in Subvoxel Accuracy

    Institute of Scientific and Technical Information of China (English)

蔡文立; 陈天洲; et al.

    1996-01-01

Under volume segmentation in voxel space, a lot of details, such as fine and thin objects, are ignored. In order to accurately display these details, this paper develops a methodology for volume segmentation in subvoxel space. In the subvoxel space, most of the "bridges" between adjacent layers are broken down. Based on the subvoxel space, an automatic segmentation algorithm preserving details is discussed. After segmentation, volume data in subvoxel space are reduced to the original voxel space. Thus, the details with a width of only one or several voxels are extracted and displayed.

  18. Dynamical Twisted Mass Fermions with Light Quarks: Simulation and Analysis Details

    CERN Document Server

    Boucaud, Ph; Farchioni, F; Frezzotti, R; Giménez, V; Herdoiza, G; Jansen, K; Lubicz, V; Michael, C; Münster, G; Palao, D; Rossi, G C; Scorzato, L; Shindler, A; Simula, S; Sudmann, T; Urbach, C; Wenger, U

    2008-01-01

    In a recent paper [hep-lat/0701012] we presented precise lattice QCD results of our European Twisted Mass Collaboration (ETMC). They were obtained by employing two mass-degenerate flavours of twisted mass fermions at maximal twist. In the present paper we give details on our simulations and the computation of physical observables. In particular, we discuss the problem of tuning to maximal twist, the techniques we have used to compute correlators and error estimates. In addition, we provide more information on the algorithm used, the autocorrelation times and scale determination, the evaluation of disconnected contributions and the description of our data by means of chiral perturbation theory formulae.

  19. Compact Models and Measurement Techniques for High-Speed Interconnects

    CERN Document Server

    Sharma, Rohit

    2012-01-01

    Compact Models and Measurement Techniques for High-Speed Interconnects provides detailed analysis of issues related to high-speed interconnects from the perspective of modeling approaches and measurement techniques. Particular focus is laid on the unified approach (variational method combined with the transverse transmission line technique) to develop efficient compact models for planar interconnects. This book will give a qualitative summary of the various reported modeling techniques and approaches and will help researchers and graduate students with deeper insights into interconnect models in particular and interconnect in general. Time domain and frequency domain measurement techniques and simulation methodology are also explained in this book.

  20. Optoelectronic pH Meter: Further Details

    Science.gov (United States)

    Jeevarajan, Antony S.; Anderson, Mejody M.; Macatangay, Ariel V.

    2009-01-01

    A collection of documents provides further detailed information about an optoelectronic instrument that measures the pH of an aqueous cell-culture medium to within 0.1 unit in the range from 6.5 to 7.5. The instrument at an earlier stage of development was reported in Optoelectronic Instrument Monitors pH in a Culture Medium (MSC-23107), NASA Tech Briefs, Vol. 28, No. 9 (September 2004), page 4a. To recapitulate: The instrument includes a quartz cuvette through which the medium flows as it is circulated through a bioreactor. The medium contains some phenol red, which is an organic pH-indicator dye. The cuvette sits between a light source and a photodetector. [The light source in the earlier version comprised red (625 nm) and green (558 nm) light-emitting diodes (LEDs); the light source in the present version comprises a single green- (560 nm)-or-red (623 nm) LED.] The red and green are repeatedly flashed in alternation. The responses of the photodiode to the green and red are processed electronically to obtain the ratio between the amounts of green and red light transmitted through the medium. The optical absorbance of the phenol red in the green light varies as a known function of pH. Hence, the pH of the medium can be calculated from the aforesaid ratio.
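The pH calculation from the green/red transmission ratio follows from Beer-Lambert absorbance plus the indicator's Henderson-Hasselbalch response. A hedged sketch with invented calibration constants (the instrument's actual calibration is not given in the source; the function name and limit ratios are ours):

```python
import math

PKA_PHENOL_RED = 7.9  # approximate literature pKa of phenol red


def ph_from_ratio(green_to_red_ratio, ratio_acid=0.9, ratio_base=0.2):
    """Convert the measured green/red transmission ratio to pH.

    `ratio_acid` and `ratio_base` are the ratios at the fully protonated
    and fully deprotonated limits of the indicator -- invented calibration
    values. The basic form absorbs green strongly, so a lower ratio means
    a higher pH."""
    # Fraction of the basic (green-absorbing) form, by linear interpolation
    f = (ratio_acid - green_to_red_ratio) / (ratio_acid - ratio_base)
    f = min(max(f, 1e-6), 1 - 1e-6)  # clamp away from the log singularities
    # Henderson-Hasselbalch: pH = pKa + log10([base]/[acid])
    return PKA_PHENOL_RED + math.log10(f / (1.0 - f))
```

At the midpoint ratio the two forms are equal and the computed pH equals the pKa, as expected.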

  1. Detailed Analysis of Motor Unit Activity

    DEFF Research Database (Denmark)

    Nikolic, Mile; Sørensen, John Aasted; Dahl, Kristian

    1997-01-01

System for decomposition of EMG signals into their constituent motor unit potentials and their firing patterns. The aim of the system is detailed analysis of motor unit variability.

  2. Stiilne detail vanalinnas / Jüri Kuuskemaa

    Index Scriptorium Estoniae

    Kuuskemaa, Jüri, 1942-

    2000-01-01

    The exhibition "Stylish Detail in the Old Town" at the Rotermann salt storage presents details that have enlivened Tallinn's skyline (weathervanes), streetscape (from lock plates and door knockers to niche figures of saints) and interiors (from window handles to stove tiles, floor tiles, interior doors, and stucco and stone reliefs). Curator J. Kuuskemaa, designer M. Agabush.

  4. Understanding brains: details, intuition, and big data.

    Directory of Open Access Journals (Sweden)

    Eve Marder

    2015-05-01

    Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important.

  5. Acquired Techniques

    DEFF Research Database (Denmark)

    Lunde Nielsen, Espen; Halse, Karianne

    2013-01-01

    Acquired Techniques - a Leap into the Archive, at Aarhus School of Architecture. In collaboration with Karianne Halse, James Martin and Mika K. Friis. Following in the footsteps of past travelers, this is a journey into the tools and techniques of the architectural process. The workshop will focus upon architectural production as a conglomerate of various analogue and digital methods, and provide the basics, the tips/tricks - and how the tools themselves become operational for spatial/thematic investigations. Eventually, this will become a city, exhibition and pamphlet inhabited by the (by...

  6. Detailed reduction of reaction mechanisms for flame modeling

    Science.gov (United States)

    Wang, Hai; Frenklach, Michael

    1991-01-01

    A method for the reduction of detailed chemical reaction mechanisms, introduced earlier for ignition systems, was extended to laminar premixed flames. The reduction is based on testing the reaction and reaction-enthalpy rates of the 'full' reaction mechanism using a zero-dimensional model with the flame temperature profile as a constraint. The technique is demonstrated with numerical tests performed on the mechanism of methane combustion.

  7. Results of Detailed Hydrologic Characterization Tests - Fiscal Year 1999

    Energy Technology Data Exchange (ETDEWEB)

    Spane, Frank A.; Thorne, Paul D.; Newcomer, Darrell R.

    2001-01-19

    This report provides the results of detailed hydrologic characterization tests conducted within newly constructed Hanford Site wells during FY 1999. Detailed characterization tests performed during FY 1999 included: groundwater flow characterization, barometric response evaluation, slug tests, single-well tracer tests, constant-rate pumping tests, and in-well vertical flow tests. Hydraulic property estimates obtained from the detailed hydrologic tests include: transmissivity, hydraulic conductivity, specific yield, effective porosity, in-well lateral flow velocity, aquifer flow velocity, vertical distribution of hydraulic conductivity (within the well-screen section) and in-well vertical flow velocity. In addition, local groundwater flow characteristics (i.e., hydraulic gradient and flow direction) were determined for four sites where detailed well testing was performed.
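
    As one example of how a constant-rate pumping test yields a transmissivity estimate, the classical Cooper-Jacob straight-line method uses the drawdown change per log cycle of time. This is a standard textbook analysis, not necessarily the exact procedure used in the report; the input numbers below are made up for illustration.

```python
import math

def transmissivity_cooper_jacob(q_m3_per_day: float, ds_per_log_cycle_m: float) -> float:
    """Cooper-Jacob straight-line estimate: T = 2.3 * Q / (4 * pi * delta_s),
    where delta_s is the drawdown change over one log cycle of time.
    Returns transmissivity in m^2/day for Q in m^3/day and delta_s in m."""
    return 2.3 * q_m3_per_day / (4.0 * math.pi * ds_per_log_cycle_m)

# Made-up example: 500 m^3/day pumping rate, 0.6 m drawdown per log cycle.
t = transmissivity_cooper_jacob(500.0, 0.6)
# Hydraulic conductivity then follows as K = T / b for an aquifer thickness b.
```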

  8. Results of Detailed Hydrologic Characterization Tests - Fiscal Year 2000

    Energy Technology Data Exchange (ETDEWEB)

    Spane, Frank A.; Thorne, Paul D.; Newcomer, Darrell R.

    2001-05-15

    This report provides the results of detailed hydrologic characterization tests conducted within eleven Hanford Site wells during fiscal year 2000. Detailed characterization tests performed included groundwater-flow characterization; barometric response evaluation; slug tests; single-well tracer tests; constant-rate pumping tests; and in-well, vertical flow tests. Hydraulic property estimates obtained from the detailed hydrologic tests include transmissivity; hydraulic conductivity; specific yield; effective porosity; in-well, lateral flow velocity; aquifer-flow velocity; vertical distribution of hydraulic conductivity (within the well-screen section); and in-well, vertical flow velocity. In addition, local groundwater-flow characteristics (i.e., hydraulic gradient and flow direction) were determined for four sites where detailed well testing was performed.

  9. Post Entitlement Management Information - Detail Database

    Data.gov (United States)

    Social Security Administration — Contains data that supports the detailed and aggregate receipt, pending and clearance data, as well as other strategic and tactical MI for many Title II and Title...

  10. Template Assembly for Detailed Urban Reconstruction

    KAUST Repository

    Nan, Liangliang

    2015-05-04

    We propose a new framework to reconstruct building details by automatically assembling 3D templates on coarse textured building models. In a preprocessing step, we generate an initial coarse model to approximate a point cloud computed using Structure from Motion and Multi View Stereo, and we model a set of 3D templates of facade details. Next, we optimize the initial coarse model to enforce consistency between geometry and appearance (texture images). Then, building details are reconstructed by assembling templates on the textured faces of the coarse model. The 3D templates are automatically chosen and located by our optimization-based template assembly algorithm that balances image matching and structural regularity. In the results, we demonstrate how our framework can enrich the details of coarse models using various data sets.

  11. A Detailed Modeling Study of Propane Oxidation

    Energy Technology Data Exchange (ETDEWEB)

    Westbrook, C K; Jayaweera, T M; Pitz, W J; Curran, H J

    2004-03-19

    A detailed chemical kinetic mechanism has been used to simulate ignition delay times recorded by a number of experimental shock tube studies over the temperature range 900 ≤ T ≤ 1800 K, in the pressure range 0.75-40 atm and in the equivalence ratio range 0.5 ≤ φ ≤ 2.0. Flame speed measurements at 1 atm in the equivalence ratio range 0.4 ≤ φ ≤ 1.8 have also been simulated. Both of these data sets, particularly those recorded at high pressure, are of particular importance in validating a kinetic mechanism, as internal combustion engines operate at elevated pressures and temperatures and rates of fuel oxidation are critical to efficient system operation. Experiments in which reactant, intermediate and product species were quantitatively recorded, versus temperature in a jet-stirred reactor (JSR) and versus time in a flow reactor, are also simulated. These data provide a stringent test of the kinetic mechanism, as it must reproduce accurate quantitative profiles for all reactant, intermediate and product species. The JSR experiments were performed in the temperature range 1000-1110 K, in the equivalence ratio range 0.5 ≤ φ ≤ 4.0, at a pressure of 5 atm. These experiments are complemented by those carried out in a flow reactor in the temperature range 660-820 K, at 10 atm and at an equivalence ratio of 0.4. In addition, burner-stabilized flames were simulated, where chemical species profiles were measured at atmospheric pressure for two propane-air flat flames. Overall, reasonably good agreement is observed between the model simulations and the experimental results.
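
    Shock-tube ignition delays of the kind used here for validation are often summarized by an empirical correlation of the form τ = A · p^n · φ^m · exp(Ea/(R·T)). The sketch below evaluates such a correlation; every coefficient is a placeholder chosen for illustration, not a fitted value from this study, and real mechanism validation integrates the full kinetics rather than a single correlation.

```python
import math

R_CAL = 1.987  # gas constant in cal/(mol*K)

def ignition_delay(temp_k: float, pressure_atm: float, phi: float,
                   a: float = 1.0e-4, n: float = -0.5,
                   m: float = 0.3, ea: float = 40000.0) -> float:
    """Empirical correlation tau = A * p^n * phi^m * exp(Ea / (R*T)).

    All coefficients (a, n, m, ea) are illustrative placeholders; the
    returned value is in arbitrary units."""
    return a * pressure_atm**n * phi**m * math.exp(ea / (R_CAL * temp_k))

# Classic Arrhenius behaviour: the delay shortens as temperature rises.
tau_hot = ignition_delay(1800.0, 10.0, 1.0)
tau_cold = ignition_delay(900.0, 10.0, 1.0)
```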

  12. Automated medical image segmentation techniques

    Directory of Open Access Journals (Sweden)

    Sharma Neeraj

    2010-01-01

    Accurate segmentation of medical images is a key step in contouring during radiotherapy planning. Computed tomography (CT) and magnetic resonance (MR) imaging are the most widely used radiographic techniques in diagnosis, clinical studies and treatment planning. This review provides details of automated segmentation methods, specifically discussed in the context of CT and MR images. The motive is to discuss the problems encountered in segmentation of CT and MR images, and the relative merits and limitations of the methods currently available for segmentation of medical images.
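
    As a concrete example of the kind of automated method such reviews cover, Otsu's global thresholding picks the intensity cut that maximizes between-class variance. This is a generic textbook technique offered for illustration, not one singled out by the review.

```python
def otsu_threshold(pixels):
    """Exhaustive Otsu threshold for 8-bit intensities (0..255).

    Returns the threshold t that maximizes the between-class variance
    w_b * w_f * (mean_b - mean_f)^2 of the background/foreground split."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b, sum_b = 0, 0.0
    for t in range(256):
        w_b += hist[t]           # background weight grows with t
        if w_b == 0:
            continue
        w_f = total - w_b        # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mean_b = sum_b / w_b
        mean_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (mean_b - mean_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

    For a clearly bimodal intensity histogram the returned threshold separates the two modes; real CT/MR pipelines combine such global cues with region- and model-based refinement.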

  13. Genetic techniques for the archaea.

    Science.gov (United States)

    Farkas, Joel A; Picking, Jonathan W; Santangelo, Thomas J

    2013-01-01

    Genetic techniques for the Archaea have undergone a rapid expansion in complexity, resulting in increased exploration of the role of Archaea in the environment and detailed analyses of the molecular physiology and information-processing systems in the third domain of life. Complementary gains in describing the ever-increasing diversity of archaeal organisms have allowed these techniques to be leveraged in new and imaginative ways to elucidate shared and unique aspects of archaeal diversity and metabolism. In this review, we introduce the four archaeal clades for which advanced genetic techniques are available--the methanogens, halophiles, Sulfolobales, and Thermococcales--with the aim of providing an overall profile of the advantages and disadvantages of working within each clade, as essentially all of the genetically accessible archaeal organisms require unique culturing techniques that present real challenges. We discuss the full repertoire of techniques possible within these clades while highlighting the recent advances that have been made by taking advantage of the most prominent techniques and approaches.

  14. Pharmaceutical crystallography: is there a devil in the details?

    DEFF Research Database (Denmark)

    Bond, A. D.

    2012-01-01

    Modern instruments for small-molecule crystallography continue to become more sophisticated and more automated. This technical progress provides a basis for frontier research in chemical and pharmaceutical crystallography, but it also encourages analytical crystallographers to become more...... are presented for pharmaceutical compounds, and the potential importance of the "details" in pharmaceutical crystallography is discussed....

  15. Syllabus Detail and Students' Perceptions of Teacher Effectiveness

    Science.gov (United States)

    Saville, Bryan K.; Zinn, Tracy E.; Brown, Allison R.; Marchuk, Kimberly A.

    2010-01-01

    Although syllabi provide students with important course information, they can also affect perceptions of teaching effectiveness. To test this idea, we distributed 2 versions of a hypothetical course syllabus, a brief version and a detailed version, and asked students to rate the teacher of the course on qualities associated with master teaching.…

  16. NMR crystallography of enzyme active sites: probing chemically detailed, three-dimensional structure in tryptophan synthase.

    Science.gov (United States)

    Mueller, Leonard J; Dunn, Michael F

    2013-09-17

    NMR crystallography--the synergistic combination of X-ray diffraction, solid-state NMR spectroscopy, and computational chemistry--offers unprecedented insight into three-dimensional, chemically detailed structure. Initially, researchers used NMR crystallography to refine diffraction data from organic and inorganic solids. Now we are applying this technique to explore active sites in biomolecules, where it reveals chemically rich detail concerning the interactions between enzyme site residues and the reacting substrate. Researchers cannot achieve this level of detail from X-ray, NMR, or computational methodologies in isolation. For example, typical X-ray crystal structures (1.5-2.5 Å resolution) of enzyme-bound intermediates identify possible hydrogen-bonding interactions between site residues and substrate but do not directly identify the protonation states. Solid-state NMR can provide chemical shifts for selected atoms of enzyme-substrate complexes, but without a larger structural framework in which to interpret them, only empirical correlations with local chemical structure are possible. Ab initio calculations and molecular mechanics can build models for enzymatic processes, but they rely on researcher-specified chemical details. Together, however, X-ray diffraction, solid-state NMR spectroscopy, and computational chemistry can provide consistent and testable models for structure and function of enzyme active sites: X-ray crystallography provides a coarse framework upon which scientists can develop models of the active site using computational chemistry; they can then distinguish these models by comparing calculated NMR chemical shifts with the results of solid-state NMR spectroscopy experiments. Conceptually, each technique is a puzzle piece offering a generous view of the big picture. Only when correctly pieced together, however, can they reveal the big picture at the highest possible resolution. In this Account, we detail our first steps in the development of

  17. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  18. Radiologic imaging technique

    Energy Technology Data Exchange (ETDEWEB)

    Bushong, S.C. (Dept. of Radiology, Baylor College of Medicine, Houston, TX (US)); Eastman, T.R. (Agfagavert Inc., Irving, TX (US))

    1990-01-01

    The authors focus on the subject of clinical radiographic technique. Emphasizing correct radiographic technique, the book is heavily illustrated with radiographs that demonstrate proper exposure and show what happens when exposure variables are changed. A key feature is a discussion and evaluation of radiographic technique charts. Basic technique charts are provided for every body part examined.

  19. Fatigue-Prone Details in Steel Bridges

    Directory of Open Access Journals (Sweden)

    Mohsen Heshmati

    2012-11-01

    This paper reviews the results of a comprehensive investigation including more than 100 fatigue damage cases, reported for steel and composite bridges. The damage cases are categorized according to types of detail. The mechanisms behind fatigue damage in each category are identified and studied. It was found that more than 90% of all reported damage cases are of deformation-induced type and generated by some kind of unintentional or otherwise overlooked interaction between different load-carrying members or systems in the bridge. Poor detailing, with unstiffened gaps and abrupt changes in stiffness at the connections between different members were also found to contribute to fatigue cracking in many details.

  20. Memory for details with self-referencing.

    Science.gov (United States)

    Serbun, Sarah J; Shih, Joanne Y; Gutchess, Angela H

    2011-11-01

    Self-referencing benefits item memory, but little is known about the ways in which referencing the self affects memory for details. Experiment 1 assessed whether the effects of self-referencing operate only at the item, or general, level or whether they also enhance memory for specific visual details of objects. Participants incidentally encoded objects by making judgements in reference to the self, a close other (one's mother), or a familiar other (Bill Clinton). Results indicate that referencing the self or a close other enhances both specific and general memory. Experiments 2 and 3 assessed verbal memory for source in a task that relied on distinguishing between different mental operations (internal sources). The results indicate that self-referencing disproportionately enhances source memory, relative to conditions referencing other people, semantic, or perceptual information. We conclude that self-referencing not only enhances specific memory for both visual and verbal information, but can also disproportionately improve memory for specific internal source details.

  1. Local address and emergency contact details

    CERN Multimedia

    2013-01-01

    The HR Department would like to remind members of the personnel that they are responsible for ensuring that their personal data concerning local address and preferred emergency contact details remains valid and up-to-date.   Both are easily accessible via the links below: Local address: https://edh.cern.ch/Document/Personnel/LocalAddressChange   Emergency contacts: https://edh.cern.ch/Document/Personnel/EC   Please take a few minutes to check your details and modify if necessary. Thank you in advance. HR Department Head Office

  2. Separation, culture and identification of rabbit bone marrow mesenchymal stem cells by iliac puncture:operation details and techniques%幼兔髂骨穿刺抽取骨髓间充质干细胞分离培养鉴定:注意的细节与技术

    Institute of Scientific and Technical Information of China (English)

    张聪; 刘洪美; 李庆伟; 陈国武; 梁啸; 孟纯阳

    2014-01-01

    BACKGROUND: Bone marrow mesenchymal stem cells are considered commonly used seed cells for constructing tissue-engineered bone to repair bone and cartilage defects. Noting the common problems in the basic procedures and avoiding them in a timely manner is of great significance for subsequent cytology and tissue-engineering experiments. OBJECTIVE: To summarize and analyze the problems encountered during the authors' experimental work, in order to provide beginners and researchers with a reliable method for the separation, culture and identification of bone marrow mesenchymal stem cells, and to reduce operator error and common pitfalls. METHODS: Sixteen young New Zealand white rabbits were used as experimental subjects, and bone marrow was aspirated by iliac puncture. Cells were screened and purified in vitro by density gradient centrifugation combined with adherent culture; their morphology and growth curves were observed under an inverted phase-contrast microscope, and the phenotype of the bone marrow mesenchymal stem cells was identified by flow cytometry. RESULTS AND CONCLUSION: Various problems and difficulties were encountered during bone marrow aspiration and cell separation in the first five rabbits; after careful summary and analysis, aspiration and separation succeeded in the remaining eleven rabbits. No bacterial contamination or cell senescence was found during culture. Passage-3 bone marrow mesenchymal stem cells highly expressed CD29 and CD44 antigens, with low expression of CD14 and CD34; MTT growth curves showed relatively high proliferative activity at passages 3 and 5. Although the techniques for separating, culturing and identifying bone marrow mesenchymal stem cells are fairly mature, neglecting the details of the procedure can make the experiment very difficult or cause it to fail. Strictly following the routine steps yields bone marrow mesenchymal stem cells of high purity, raises the success rate, and lays the groundwork for subsequent cell and animal experiments.

  3. Medicare Referring Provider DMEPOS PUF CY2013

    Data.gov (United States)

    U.S. Department of Health & Human Services — This dataset, which is part of CMSs Medicare Provider Utilization and Payment Data, details information on Durable Medical Equipment, Prosthetics, Orthotics and...

  4. Persian fencing techniques

    Directory of Open Access Journals (Sweden)

    Manouchehr Moshtagh Khorasani

    2012-07-01

    There are numerous manuscripts, poems and stories that describe, specifically and in detail, the different techniques used in Persian swordsmanship. The present article explains the origins and the techniques of Persian swordsmanship. The article also describes the traditional code of conduct for Persian warriors. Additionally, it describes an array of techniques that were deployed in actual combat in Iran’s history. Some of these techniques are represented via the miniatures that are reproduced herein. This is the first article on Persian swordsmanship published in any periodical.

  5. Study of engineering surfaces using laser-scattering techniques

    Indian Academy of Sciences (India)

    C Babu Rao; Baldev Raj

    2003-06-01

    Surface roughness parameters are described. Various surface characterization techniques are reviewed briefly. Interaction of light with the surface is discussed. Laser-scattering methods to characterise the surface are detailed. Practical cases, where laser-scattering methods have provided useful information about surface characteristics, are illustrated.

  6. Computational techniques of the simplex method

    CERN Document Server

    Maros, István

    2003-01-01

    Computational Techniques of the Simplex Method is a systematic treatment focused on the computational issues of the simplex method. It provides a comprehensive coverage of the most important and successful algorithmic and implementation techniques of the simplex method. It is a unique source of essential, never discussed details of algorithmic elements and their implementation. On the basis of the book the reader will be able to create a highly advanced implementation of the simplex method which, in turn, can be used directly or as a building block in other solution algorithms.
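
    The computational core the book dissects can be illustrated with a bare-bones dense tableau simplex for problems of the form maximize c·x subject to Ax ≤ b, x ≥ 0 with b ≥ 0 (so the slack basis is immediately feasible). This toy sketch deliberately omits everything the book is actually about: pricing strategies, sparse LU updates, anti-cycling rules and numerical tolerances.

```python
def simplex_max(c, A, b):
    """Tiny dense tableau simplex: maximize c.x  s.t.  A x <= b, x >= 0, b >= 0.
    Educational sketch only; no degeneracy or cycling safeguards."""
    m, n = len(A), len(c)
    # Tableau rows: each constraint row gets an identity slack column plus RHS.
    T = [A[i][:] + [1.0 if j == i else 0.0 for j in range(m)] + [b[i]]
         for i in range(m)]
    T.append([-ci for ci in c] + [0.0] * m + [0.0])  # objective row (as -c)
    basis = list(range(n, n + m))                    # slacks start basic
    while True:
        # Entering variable: most negative reduced cost (Dantzig rule).
        piv_col = min(range(n + m), key=lambda j: T[-1][j])
        if T[-1][piv_col] >= -1e-9:
            break  # optimal
        # Leaving variable: minimum ratio test.
        ratios = [(T[i][-1] / T[i][piv_col], i)
                  for i in range(m) if T[i][piv_col] > 1e-9]
        if not ratios:
            raise ValueError("problem is unbounded")
        _, piv_row = min(ratios)
        basis[piv_row] = piv_col
        p = T[piv_row][piv_col]
        T[piv_row] = [v / p for v in T[piv_row]]     # normalize pivot row
        for i in range(m + 1):                       # eliminate pivot column
            if i != piv_row and abs(T[i][piv_col]) > 1e-12:
                f = T[i][piv_col]
                T[i] = [a - f * r for a, r in zip(T[i], T[piv_row])]
    x = [0.0] * n
    for i, bi in enumerate(basis):
        if bi < n:
            x[bi] = T[i][-1]
    return x, T[-1][-1]

# maximize 3x + 2y  s.t.  x + y <= 4,  x + 3y <= 6
x, opt = simplex_max([3.0, 2.0], [[1.0, 1.0], [1.0, 3.0]], [4.0, 6.0])
```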

  7. On detailed 3D reconstruction of large indoor environments

    Science.gov (United States)

    Bondarev, Egor

    2015-03-01

    In this paper we present techniques for highly detailed 3D reconstruction of extra-large indoor environments. We discuss the benefits and drawbacks of low-range, far-range and hybrid sensing and reconstruction approaches. The proposed techniques for low-range and hybrid reconstruction, enabling a reconstruction density of 125 points/cm³ on large 100,000 m³ models, are presented in detail. The techniques tackle the core challenges of the above requirements, such as multi-modal data fusion (fusion of LIDAR data with Kinect data), accurate sensor pose estimation, high-density scanning and depth-data noise filtering. Other important aspects of extra-large 3D indoor reconstruction are point cloud decimation and real-time rendering. In this paper, we present a method for planar-based point cloud decimation, allowing reduction of a point cloud size by 80-95%. Besides this, we introduce a method for online rendering of extra-large point clouds enabling real-time visualization of huge cloud spaces in conventional web browsers.
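
    A minimal illustration of planar-based decimation, under our own simplifying assumption (not necessarily the authors' algorithm) that points near an already-detected plane are thinned to one representative per grid cell while off-plane detail points are kept untouched:

```python
def decimate_planar(points, normal, d, eps=0.01, cell=0.05):
    """Thin points lying on the plane n.p + d = 0 to one point per grid cell;
    keep every off-plane (detail) point. `normal` is assumed unit length.
    A simplified sketch of planar-based decimation, not the paper's method."""
    kept, seen = [], set()
    for p in points:
        dist = abs(sum(n * c for n, c in zip(normal, p)) + d)
        if dist > eps:
            kept.append(p)                  # off-plane detail: always keep
        else:
            key = tuple(int(c // cell) for c in p)
            if key not in seen:             # one representative per cell
                seen.add(key)
                kept.append(p)
    return kept
```

    On dense planar regions (walls, floors) this removes the bulk of the points, which is where most of an indoor scan's redundancy lives.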

  8. New details emerge from the Einstein files

    CERN Multimedia

    Overbye, D

    2002-01-01

    For many years the FBI spied on Einstein. New details of this surveillance are emerging in "The Einstein File: J. Edgar Hoover's Secret War Against the World's Most Famous Scientist," by Fred Jerome, who sued the government with the help of the Public Citizen Litigation Group to obtain a less censored version of the file (1 page).

  9. Detailed Balancing and the Structure of Proton

    CERN Document Server

    Zhang, Y Z

    2001-01-01

    The protons are taken as an ensemble of Fock states. Using the detailed-balance principle, the ensemble density matrix in the basis of parton number is calculated, and some information about intrinsic gluons and intrinsic sea quarks is gained without any parameter.
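
    The detailed-balance condition invoked here can be stated generically as follows; the notation is ours, and the paper's specific parton splitting and recombination rates are not reproduced:

```latex
% Generic detailed-balance condition between neighbouring Fock states:
% \rho_n is the ensemble probability of the n-parton state and
% R(n \to n') the transition rate between states (notation assumed).
\rho_n \, R(n \to n+1) \;=\; \rho_{n+1} \, R(n+1 \to n)
```

    Imposing this balance for every pair of neighbouring states fixes all the ratios ρ_{n+1}/ρ_n, which is why the parton-number distribution follows without free parameters.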

  10. Detailed numerical simulations of laser cooling processes

    Science.gov (United States)

    Ramirez-Serrano, J.; Kohel, J.; Thompson, R.; Yu, N.

    2001-01-01

    We developed a detailed semiclassical numerical code of the forces applied on atoms in optical and magnetic fields to increase the understanding of the different roles that light, atomic collisions, background pressure, and number of particles play in experiments with laser cooled and trapped atoms.

  11. Constructing Overview + Detail Dendrogram-Matrix Views

    Science.gov (United States)

    Chen, Jin; MacEachren, Alan M.; Peuquet, Donna J.

    2011-01-01

    A dendrogram that visualizes a clustering hierarchy is often integrated with a reorderable matrix for pattern identification. The method is widely used in many research fields including biology, geography, statistics, and data mining. However, most dendrograms do not scale up well, particularly with respect to problems of graphical and cognitive information overload. This research proposes a strategy that links an overview dendrogram and a detail-view dendrogram, each integrated with a reorderable matrix. The overview displays only a user-controlled, limited number of nodes that represent the “skeleton” of a hierarchy. The detail view displays the sub-tree represented by a selected meta-node in the overview. The research presented here focuses on constructing a concise overview dendrogram and its coordination with a detail view. The proposed method has the following benefits: dramatic alleviation of information overload, enhanced scalability and data abstraction quality on the dendrogram, and the support of data exploration at arbitrary levels of detail. The contribution of the paper includes a new metric to measure the “importance” of nodes in a dendrogram; a method to construct the concise overview dendrogram from the dynamically identified, important nodes; and a measure for evaluating the data abstraction quality for dendrograms. We evaluate and compare the proposed method to some related existing methods, and demonstrate how the proposed method can help users find interesting patterns through a case study on county-level U.S. cervical cancer mortality and demographic data. PMID:19834151
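
    The overview construction can be sketched as: score every internal node, then keep the k highest-scoring nodes as the skeleton. The importance metric used below (merge height times subtree size) is a stand-in of our own; the paper defines its own measure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    height: float                 # merge distance of this internal node
    size: int                     # number of leaves in its subtree
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def importance(n: Node) -> float:
    # Assumed surrogate metric: tall merges over many leaves matter most.
    return n.height * n.size

def overview_nodes(root: Node, k: int):
    """Collect every internal node and return the k most 'important' ones,
    forming the skeleton of the overview dendrogram."""
    out, stack = [], [root]
    while stack:
        n = stack.pop()
        if n.left and n.right:    # internal node (leaves have no children)
            out.append(n)
            stack += [n.left, n.right]
    return sorted(out, key=importance, reverse=True)[:k]
```

    Selecting a skeleton node in the overview would then open its full subtree in the detail view.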

  12. Towards cleaner combustion engines through groundbreaking detailed chemical kinetic models.

    Science.gov (United States)

    Battin-Leclerc, Frédérique; Blurock, Edward; Bounaceur, Roda; Fournet, René; Glaude, Pierre-Alexandre; Herbinet, Olivier; Sirjean, Baptiste; Warth, V

    2011-09-01

    In the context of limiting the environmental impact of transportation, this critical review discusses new directions which are being followed in the development of more predictive and more accurate detailed chemical kinetic models for the combustion of fuels. In the first part, the performance of current models, especially in terms of the prediction of pollutant formation, is evaluated. In the next parts, recent methods and ways to improve these models are described. An emphasis is given on the development of detailed models based on elementary reactions, on the production of the related thermochemical and kinetic parameters, and on the experimental techniques available to produce the data necessary to evaluate model predictions under well defined conditions (212 references). This journal is © The Royal Society of Chemistry 2011

  13. Developments in remote sensing technology enable more detailed urban flood risk analysis.

    Science.gov (United States)

    Denniss, A.; Tewkesbury, A.

    2009-04-01

    Spaceborne remote sensors have been allowing us to build up a profile of planet Earth for many years. With each new satellite launched we see the capabilities improve: new bands of data, higher-resolution imagery, and the ability to derive better elevation information. The combination of these geospatial data to create land-cover and land-use maps helps inform catastrophe modelling systems. From Landsat 30 m resolution to 2.44 m QuickBird multispectral imagery, and the 1 m radar data collected by TerraSAR-X, which enables rapid tracking of the rise and fall of a flood event and will shortly be joined by a twin satellite enabling elevation data creation, we are spoilt for choice in available data. However, just what is cost effective? It is always a question of choosing the appropriate level of input data detail for modelling, depending on the value of the risk. In the summer of 2007, the cost of the flooding in the UK was approximately £3bn and affected over 58,000 homes and businesses. When it comes to flood risk, we have traditionally considered rising river levels and surge tides, but with climate change and variations in our own construction behaviour, there are other factors to be taken into account. During those summer 2007 events, the Environment Agency suggested that around 70% of the properties damaged were the result of pluvial flooding, where high localised rainfall events overload localised drainage infrastructure, causing widespread flooding of properties and infrastructure. Creating a risk model able to simulate such an event requires much more accurate source data than satellite or radar can provide. Because these flood events cause considerable damage within relatively small, complex urban environments, new high-resolution remote sensing techniques have to be applied to model them better. Detailed terrain data of England and Wales, plus cities in Scotland, have been produced by combining terrain measurements from the latest

  14. A detailed BWR recirculation loop model for RELAP

    Energy Technology Data Exchange (ETDEWEB)

    Araiza-Martínez, Enrique, E-mail: enrique.araiza@inin.gob.mx; Ortiz-Villafuerte, Javier, E-mail: javier.ortiz@inin.gob.mx; Castillo-Durán, Rogelio, E-mail: rogelio.castillo@inin.gob.mx

    2017-01-15

    Highlights: • A new detailed BWR recirculation loop model was developed for RELAP. • All jet pumps, risers, manifold, suction and control valves, and the recirculation pump are modeled. • The model is tested against data from partial blockage of two jet pumps. • For practical applications, simulation results showed good agreement with available data. - Abstract: A new detailed geometric model of the whole recirculation loop of a BWR has been developed for the code RELAP. This detailed model includes the 10 jet pumps, 5 risers, manifold, suction and control valves, and the recirculation pump, per recirculation loop. The model is tested against data from an event of partial blockage at the entrance nozzle of one jet pump in both recirculation loops. For practical applications, simulation results showed good agreement with data. Then, values of parameters considered as figures of merit (reactor power, dome pressure, core flow, among others) for this event are compared against those from the common 1-jet-pump-per-loop model. The results show that the new detailed model led to a closer prediction of the reported power change. The detailed recirculation loop model can provide more reliable boundary-condition data to CFD models for studies of, for example, flow-induced vibration, wear, and crack initiation.

  15. Biomedical images texture detail denoising based on PDE

    Science.gov (United States)

    Chen, Guan-nan; Pan, Jian-ji; Li, Chao; Chen, Rong; Lin, Ju-qiang; Yan, Kun-tao; Huang, Zu-fang

    2009-08-01

    Biomedical image denoising methods based on partial differential equations (PDE) are well known for their good processing results. General PDE-based denoising methods can remove the noise of images with gentle changes and preserve more structural details of edges, but are less effective at denoising biomedical images with many texture details. This paper attempts to give an overview of biomedical image texture-detail denoising based on PDE. Three kinds of important image denoising schemes are introduced: the first is image denoising based on the adaptive parameter estimation total variation model, which denoises images based on region energy distribution; the second uses the G norm on the perception scale, which provides a more intuitive understanding of this norm; the third is multi-scale denoising decomposition. The above methods can preserve more structures of biomedical image texture detail. Furthermore, this paper demonstrates the applications of these three methods. In the end, the future trend of biomedical image texture-detail denoising based on PDE is pointed out.
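
    A classic edge-preserving scheme from the same PDE family is Perona-Malik anisotropic diffusion, sketched below on a plain 2D array. It is a textbook baseline for comparison, not the adaptive-parameter total variation model of the paper; the parameter values are illustrative.

```python
import math

def perona_malik(img, iters=10, k=15.0, lam=0.2):
    """Perona-Malik anisotropic diffusion on a 2D grayscale list of lists.

    The edge-stopping function g(s) = exp(-(s/k)^2) slows diffusion across
    strong gradients, so edges and coarse texture survive while small
    fluctuations are smoothed. Borders are left untouched for simplicity."""
    h, w = len(img), len(img[0])
    u = [row[:] for row in img]
    g = lambda s: math.exp(-(s / k) ** 2)
    for _ in range(iters):
        v = [row[:] for row in u]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                dn = u[y - 1][x] - u[y][x]   # differences to the 4 neighbours
                ds = u[y + 1][x] - u[y][x]
                de = u[y][x + 1] - u[y][x]
                dw = u[y][x - 1] - u[y][x]
                v[y][x] = u[y][x] + lam * (g(abs(dn)) * dn + g(abs(ds)) * ds
                                           + g(abs(de)) * de + g(abs(dw)) * dw)
        u = v
    return u
```

    A small isolated spike (gradient well below k) is progressively diffused into its neighbourhood, while a jump much larger than k would be treated as an edge and preserved.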

  16. Study on detailed geological modelling for fluvial sandstone reservoir in Daqing oil field

    Energy Technology Data Exchange (ETDEWEB)

    Zhao Hanqing; Fu Zhiguo; Lu Xiaoguang [Institute of Petroleum Exploration and Development, Daqing (China)

    1997-08-01

    Guided by sedimentation theory and knowledge of modern and ancient fluvial deposition, and utilizing the abundant information on sedimentary series, microfacies types and petrophysical parameters from the well logging curves of thousands of closely spaced wells located in a large area, a new method for establishing detailed sedimentation and permeability distribution models for fluvial reservoirs has been developed. This study aimed at the geometry and internal architecture of sandbodies, in accordance with their hierarchical levels of heterogeneity, building up sedimentation and permeability distribution models of fluvial reservoirs and describing the reservoir heterogeneity in the light of fluvial sedimentary rules. The results and methods obtained in outcrop and modern sedimentation studies have successfully supported this work. Taking advantage of this method, the major producing layers (PI{sub 1-2}), which have been considered as heterogeneous and thick fluvial reservoirs extending widely laterally, are researched in detail. These layers are subdivided into single sedimentary units vertically and the microfacies are identified horizontally. Furthermore, a complex system is recognized according to its hierarchical levels from large to small: meander belt, single channel sandbody, meander scroll, point bar, and lateral accretion bodies of the point bar. The achieved results improve the description of the areal distribution of point bar sandbodies and provide an accurate and detailed framework model for establishing a high-resolution predictive model. By using geostatistical techniques, it also plays an important role in searching for enriched zones of residual oil distribution.

  17. A Unified Detail-Preserving Liquid Simulation by Two-Phase Lattice Boltzmann Modeling.

    Science.gov (United States)

    Guo, Yulong; Liu, Xiaopei; Xu, Xuemiao

    2016-02-19

    Traditional methods in graphics to simulate liquid-air dynamics under different scenarios usually employ separate approaches with sophisticated interface tracking/reconstruction techniques. In this paper, we propose a novel unified approach that easily and effectively produces a variety of liquid-air interface phenomena. These phenomena, such as complex surface splashes, bubble interactions, as well as surface tension effects, can co-exist in one single simulation, and are created within the same computational framework. Such a framework is unique in that it is free from any complicated interface tracking/reconstruction procedures. Our approach is developed from the two-phase lattice Boltzmann method with the mean-field model, which provides a unified framework for interface dynamics but is numerically unstable under turbulent conditions. Considering the drawbacks of the existing approaches, we propose techniques to suppress oscillation for significant stability enhancement, and derive a new subgrid-scale model to further improve stability, faithfully preserving liquid-air interface details without excessive diffusion by taking the density variation into account. The whole framework is highly parallel, enabling very efficient implementation. Comparisons to related approaches show superiority in stable simulation with detail preservation and simultaneously involved multiphase phenomena. A set of animation results demonstrates the effectiveness of our method.

  18. Providing resilience for carrier ethernet multicast traffic

    DEFF Research Database (Denmark)

    Ruepp, Sarah Renée; Wessing, Henrik; Zhang, Jiang

    2009-01-01

    This paper presents an overview of the Carrier Ethernet technology with specific focus on resilience. In particular, we detail how multicast traffic, which is essential for e.g. IPTV, can be protected. We present Carrier Ethernet resilience methods for linear and ring networks and show by simulation that the availability of a multicast connection can be significantly increased by applying relevant resilience techniques.

  19. Therapy Provider Phase Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Therapy Provider Phase Information dataset is a tool for providers to search by their National Provider Identifier (NPI) number to determine their phase for...

  20. 3D Urban Visualization with LOD Techniques

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In 3D urban visualization, large data volumes related to buildings are a major factor that limits the delivery and browsing speed in a web-based computer system. This paper proposes a new approach based on the level of detail (LOD) technique developed for 3D visualization in computer graphics. The key idea of the LOD technique is to generalize the details of object surfaces, without perceptible loss of detail, for delivering and displaying objects. This technique has been successfully used in visualizing one or a few objects in films and other industries. However, applying the technique to 3D urban visualization requires an effective generalization method for urban buildings. Conventional two-dimensional (2D) generalization methods at different scales provide a good generalization reference for 3D urban visualization. Yet, it is difficult to determine when and where to retrieve data for displaying buildings. To solve this problem, this paper defines an imaging scale point and an imaging scale region for judging when and where to get the right data for visualization. The results show that the average response time of view transformations is much decreased.
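    The idea of switching a building's level of detail by a projected imaging-scale threshold can be sketched as follows. The threshold values, level names and the pinhole projection are illustrative assumptions, not the paper's actual criteria:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Building:
        position: tuple        # world coordinates (x, y, z), metres
        footprint_size: float  # rough diameter of the building, metres

    # Hypothetical thresholds: projected size in pixels above which a level is used.
    LOD_THRESHOLDS = [(40.0, "full_detail"), (10.0, "simplified"), (0.0, "footprint_only")]

    def select_lod(building, camera_pos, focal_px=1000.0):
        """Pick a level of detail from the building's projected screen size.

        Minimal pinhole approximation: projected pixels ~ focal_px * size / distance.
        """
        dx = [b - c for b, c in zip(building.position, camera_pos)]
        distance = max(sum(d * d for d in dx) ** 0.5, 1e-6)
        projected_px = focal_px * building.footprint_size / distance
        for threshold, level in LOD_THRESHOLDS:
            if projected_px >= threshold:
                return level
        return "footprint_only"
    ```

    A 20 m building viewed from 100 m projects to about 200 px and keeps full detail, while the same building seen from 10 km projects to about 2 px and collapses to its footprint.
    
    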

  1. Thirty Meter Telescope Detailed Science Case: 2015

    CERN Document Server

    Skidmore, Warren; Fukugawa, Misato; Goswami, Aruna; Hao, Lei; Jewitt, David; Laughlin, Greg; Steidel, Charles; Hickson, Paul; Simard, Luc; Schöck, Matthias; Treu, Tommaso; Cohen, Judith; Anupama, G C; Dickinson, Mark; Harrison, Fiona; Kodama, Tadayuki; Lu, Jessica R; Macintosh, Bruce; Malkan, Matt; Mao, Shude; Narita, Norio; Sekiguchi, Tomohiko; Subramaniam, Annapurni; Tanaka, Masaomi; Tian, Feng; A'Hearn, Michael; Akiyama, Masayuki; Ali, Babar; Aoki, Wako; Bagchi, Manjari; Barth, Aaron; Bhalerao, Varun; Bradac, Marusa; Bullock, James; Burgasser, Adam J; Chapman, Scott; Chary, Ranga-Ram; Chiba, Masashi; Cooray, Asantha; Crossfield, Ian; Currie, Thayne; Das, Mousumi; Dewangan, G C; de Grijs, Richard; Do, Tuan; Dong, Subo; Evslin, Jarah; Fang, Taotao; Fang, Xuan; Fassnacht, Christopher; Fletcher, Leigh; Gaidos, Eric; Gal, Roy; Ghez, Andrea; Giavalisco, Mauro; Grady, Carol A; Greathouse, Thomas; Gogoi, Rupjyoti; Guhathakurta, Puragra; Ho, Luis; Hasan, Priya; Herczeg, Gregory J; Honda, Mitsuhiko; Imanishi, Masa; Inami, Hanae; Iye, Masanori; Kamath, U S; Kane, Stephen; Kashikawa, Nobunari; Kasliwal, Mansi; Kasliwal, Vishal; Kirby, Evan; Konopacky, Quinn M; Lepine, Sebastien; Li, Di; Li, Jianyang; Liu, Junjun; Liu, Michael C; Lopez-Rodriguez, Enrique; Lotz, Jennifer; Lubin, Philip; Macri, Lucas; Maeda, Keiichi; Marchis, Franck; Marois, Christian; Marscher, Alan; Martin, Crystal; Matsuo, Taro; Max, Claire; McConnachie, Alan; McGough, Stacy; Melis, Carl; Meyer, Leo; Mumma, Michael; Muto, Takayuki; Nagao, Tohru; Najita, Joan R; Navarro, Julio; Pierce, Michael; Prochaska, Jason X; Oguri, Masamune; Ojha, Devendra K; Okamoto, Yoshiko K; Orton, Glenn; Otarola, Angel; Ouchi, Masami; Packham, Chris; Padgett, Deborah L; Pandey, Shashi Bhushan; Pilachowski, Catherine; Pontoppidan, Klaus M; Primack, Joel; Puthiyaveettil, Shalima; Ramirez-Ruiz, Enrico; Reddy, Naveen; Rich, Michael; Richter, Matthew J; Schombert, James; Sen, Anjan Ananda; Shi, Jianrong; Sheth, Kartik; Srianand, R; 
Tan, Jonathan C; Tanaka, Masayuki; Tanner, Angelle; Tominaga, Nozomu; Tytler, David; U, Vivian; Wang, Lingzhi; Wang, Xiaofeng; Wang, Yiping; Wilson, Gillian; Wright, Shelley; Wu, Chao; Wu, Xufeng; Xu, Renxin; Yamada, Toru; Yang, Bin; Zhao, Gongbo; Zhao, Hongsheng

    2015-01-01

    The TMT Detailed Science Case describes the transformational science that the Thirty Meter Telescope will enable. Planned to begin science operations in 2024, TMT will open up opportunities for revolutionary discoveries in essentially every field of astronomy, astrophysics and cosmology, seeing much fainter objects much more clearly than existing telescopes. Per this capability, TMT's science agenda fills all of space and time, from nearby comets and asteroids, to exoplanets, to the most distant galaxies, and all the way back to the very first sources of light in the Universe. More than 150 astronomers from within the TMT partnership and beyond offered input in compiling the new 2015 Detailed Science Case. The contributing astronomers represent the entire TMT partnership, including the California Institute of Technology (Caltech), the Indian Institute of Astrophysics (IIA), the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC), the National Astronomical Observatory of Japan (NAOJ),...

  2. Reserving by detailed conditioning on individual claim

    Science.gov (United States)

    Kartikasari, Mujiati Dwi; Effendie, Adhitya Ronnie; Wilandari, Yuciana

    2017-03-01

    The estimation of claim reserves is an important activity in insurance companies to fulfill their liabilities. Recently, reserving methods based on individual claims have attracted a lot of interest in actuarial science, as they overcome some deficiencies of aggregated claim methods. This paper explores the Reserving by Detailed Conditioning (RDC) method, which uses all available claim information, on individual liability insurance claims from an Indonesian general insurance company. Furthermore, we compare it to the Chain Ladder and Bornhuetter-Ferguson methods.

  3. Metastability for Markov processes with detailed balance.

    Science.gov (United States)

    Larralde, Hernán; Leyvraz, François

    2005-04-29

    We present a definition for metastable states applicable to arbitrary finite state Markov processes satisfying detailed balance. In particular, we identify a crucial condition that distinguishes metastable states from other slow decaying modes and which allows us to show that our definition has several desirable properties similar to those postulated in the restricted ensemble approach. The intuitive physical meaning of this condition is simply that the total equilibrium probability of finding the system in the metastable state is negligible.
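    Detailed balance for a finite-state Markov chain means pi_i P_ij = pi_j P_ji for the stationary distribution pi. A small numerical check of this condition (a generic sketch, not the metastability construction of the paper) can be written as:

    ```python
    import numpy as np

    def stationary_distribution(P):
        """Stationary distribution of a row-stochastic transition matrix P."""
        vals, vecs = np.linalg.eig(P.T)
        # eigenvector for the eigenvalue closest to 1, normalised to sum to 1
        pi = np.real(vecs[:, np.argmax(np.real(vals))])
        return pi / pi.sum()

    def satisfies_detailed_balance(P, tol=1e-10):
        """Check pi_i * P_ij == pi_j * P_ji for all pairs of states i, j."""
        pi = stationary_distribution(P)
        flux = pi[:, None] * P          # flux[i, j] = pi_i * P_ij
        return np.allclose(flux, flux.T, atol=tol)
    ```

    A birth-death (tridiagonal) chain passes this test, while a deterministic cycle, which carries a net probability current, fails it.
    
    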

  4. Endoscopic thyroidectomy: Our technique

    Directory of Open Access Journals (Sweden)

    Puntambekar Shailesh

    2007-01-01

    Minimally invasive surgery is widely employed for the treatment of thyroid diseases. Several minimal access approaches to the thyroid gland have been described. The commonly performed surgeries have been endoscopic lobectomies. We have performed endoscopic total thyroidectomy by the anterior chest wall approach. In this study, we describe our technique and evaluate the feasibility and efficacy of this procedure. Materials and Methods: From June 2005 to August 2006, 15 cases of endoscopic thyroidectomy were done at our institute. Five patients were male and 10 were female. Mean age was 45 years (range 23 to 71 years). Four patients had multinodular goiter and underwent near-total thyroidectomy; four patients had follicular adenoma and underwent hemithyroidectomy. Of the seven patients with papillary carcinoma, four were low-risk and so a hemithyroidectomy was performed, while the three patients in the high-risk group underwent total thyroidectomy. A detailed description of the surgical technique is provided. Results: The mean nodule size was 48 mm (range 20-80 mm) and the mean operating time was 85 min (range 60-120 min). In all cases, the recurrent laryngeal nerve was identified and preserved intact; the superior and inferior parathyroids were also identified in all patients. No patient required conversion to an open cervicotomy. All patients were discharged the day after surgery. All thyroidectomies were completed successfully. No recurrent laryngeal nerve palsies or postoperative tetany occurred. The postoperative course was significantly less painful and all patients were satisfied with the cosmetic results. Conclusions: It is possible to remove large nodules and to perform total thyroidectomies using our endoscopic approach. It is a safe and effective technique in the hands of an appropriately trained surgeon. The patients get a cosmetic benefit without any morbidity.

  5. Educational Outreach to Opioid Prescribers: The Case for Academic Detailing.

    Science.gov (United States)

    Trotter Davis, Margot; Bateman, Brian; Avorn, Jerry

    2017-02-01

    Nonmedical use of opioid medications constitutes a serious health threat as the rates of addiction, overdoses, and deaths have risen in recent years. Increasingly, inappropriate and excessively liberal prescribing of opioids by physicians is understood to be a central part of the crisis. Public health officials, hospital systems, and legislators are developing programs and regulations to address the problem in sustained and systematic ways that both ensure effective treatment of pain and place appropriate limits on the availability of opioids. Three approaches have gained prominence as means of avoiding excessive and inappropriate prescribing: providing financial incentives to physicians to change their clinical decisions through pay-for-performance contracts, monitoring patient medications through Prescription Drug Monitoring Programs, and educational outreach to physicians. A promising form of educational outreach to physicians is an intervention known as "academic detailing." It was developed in the 1980s to provide one-on-one educational outreach to physicians using methods similar to those of the pharmaceutical industry, which sends "detailers" to market its products to physician practices. Core to academic detailing, however, is the idea that medical decisions should be based on evidence-based information, including managing conditions with updated assessment measures and behavioral and nonpharmacological interventions. With the pharmaceutical industry spending billions of dollars to advertise its products, individual practitioners can have difficulty gathering unbiased information, especially as the number of approved medications grows each year. Academic detailing has successfully affected the management of health conditions, such as atrial fibrillation and chronic obstructive pulmonary disease, and has recently targeted physicians who prescribe opioids. 
This article discusses the approach as a potentially effective preventative intervention to address the

  6. Review of geotechnical measurement techniques for a nuclear waste repository in bedded salt

    Energy Technology Data Exchange (ETDEWEB)

    1979-12-01

    This report presents a description of geotechnical measurement techniques that can provide the data necessary for safe development - i.e., location, design, construction, operation, decommissioning and abandonment - of a radioactive waste repository in bedded salt. Geotechnical data obtained by a diversity of measurement techniques are required during all phases of repository evolution. The techniques discussed in this report are grouped in the following categories: geologic, geophysical and geodetic; rock mechanics; hydrologic, hydrogeologic and water quality; and thermal. The major contribution of the report is the presentation of extensive tables that provide a review of available measurement techniques for each of these categories. The techniques are also discussed in the text to the extent necessary to describe the measurements and associated instruments, and to evaluate the applicability or limitations of each method. More detailed discussions of thermal phenomena, creep laws and geophysical methods are contained in the appendices; references to detailed explanations of measurement techniques and instrumentation are included throughout the report.

  7. Academic detailing to teach aging and geriatrics.

    Science.gov (United States)

    Duckett, Ashley; Cuoco, Theresa; Pride, Pamela; Wiley, Kathy; Iverson, Patty J; Marsden, Justin; Moran, William; Caton, Cathryn

    2015-01-01

    Geriatric education is a required component of internal medicine training. Work hour rules and hectic schedules have challenged residency training programs to develop and utilize innovative teaching methods. In this study, the authors examined the use of academic detailing as a teaching intervention in their residents' clinic and on the general medicine inpatient wards to improve clinical knowledge and skills in geriatric care. The authors found that this teaching method enables efficient, directed education without disrupting patient care, and were able to show improvements in medical knowledge as well as self-efficacy across multiple geriatric topics.

  8. Radar interferometry persistent scatterer technique

    CERN Document Server

    Kampes, Bert M

    2006-01-01

    The only book on the Permanent Scatterer technique of radar interferometry. Explains the Permanent Scatterer technique in detail, including possible pitfalls, and details a newly developed stochastic model and estimator algorithm to cope with possible problems in the application of the PS technique. The use of Permanent Scatterers allows very precise measurements of the displacement of hundreds of points per square kilometer. Describes the only technique currently able to perform displacement measurements in the past, utilizing the ERS satellite data archive with data acquired from 1992 to the present.

  9. Thirty Meter Telescope Detailed Science Case: 2015

    Science.gov (United States)

    Skidmore, Warren; TMT International Science Development Teams; Science Advisory Committee, TMT

    2015-12-01

    The TMT Detailed Science Case describes the transformational science that the Thirty Meter Telescope will enable. Planned to begin science operations in 2024, TMT will open up opportunities for revolutionary discoveries in essentially every field of astronomy, astrophysics and cosmology, seeing much fainter objects much more clearly than existing telescopes. Per this capability, TMT's science agenda fills all of space and time, from nearby comets and asteroids, to exoplanets, to the most distant galaxies, and all the way back to the very first sources of light in the universe. More than 150 astronomers from within the TMT partnership and beyond offered input in compiling the new 2015 Detailed Science Case. The contributing astronomers represent the entire TMT partnership, including the California Institute of Technology (Caltech), the Indian Institute of Astrophysics (IIA), the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC), the National Astronomical Observatory of Japan (NAOJ), the University of California, the Association of Canadian Universities for Research in Astronomy (ACURA) and US associate partner, the Association of Universities for Research in Astronomy (AURA). Cover image: artist's rendition of the TMT International Observatory on Mauna Kea opening in the late evening before beginning operations.

  10. A Detailed Chemical Kinetic Model for TNT

    Energy Technology Data Exchange (ETDEWEB)

    Pitz, W J; Westbrook, C K

    2005-01-13

    A detailed chemical kinetic mechanism for 2,4,6-trinitrotoluene (TNT) has been developed to explore problems of explosive performance and soot formation during the destruction of munitions. The TNT mechanism treats only gas-phase reactions. Reactions for the decomposition of TNT and for the consumption of intermediate products formed from TNT are assembled based on information from the literature and on current understanding of aromatic chemistry. Thermodynamic properties of intermediate and radical species are estimated by group additivity. Reaction paths are developed based on similar paths for aromatic hydrocarbons. Reaction-rate constant expressions are estimated from the literature and from analogous reactions where the rate constants are available. The detailed reaction mechanism for TNT is added to existing reaction mechanisms for RDX and for hydrocarbons. Computed results show the effect of oxygen concentration on the amount of soot precursors that are formed in the combustion of RDX and TNT mixtures in N{sub 2}/O{sub 2} mixtures.

  11. Bolivia-Brazil gas line route detailed

    Energy Technology Data Exchange (ETDEWEB)

    1992-05-11

    This paper reports that the state oil companies of Brazil and Bolivia have signed an agreement outlining the route for a 2,270 km pipeline system to deliver natural gas from Bolivian fields to Southeast Brazil. The two sides currently are negotiating details about construction costs as well as contract volumes and prices. Capacity is projected at 283-565 MMcfd. No official details are available, but Roberto Y. Hukai, a director of the Sao Paulo engineering company Jaako Poyry/Technoplan, estimates the transportation cost of the Bolivian gas at 90 cents/MMBTU. That would be competitive with the price of gas delivered to the Sao Paulo gas utility Comgas, he said. Brazil's Petroleos Brasileiro SA estimates construction of the pipeline on the Brazilian side alone will cost $1.2-1.4 billion. Bolivia's Yacimientos Petroliferos Fiscales Bolivianos (YPFB) is negotiating with private domestic and foreign investors for construction of the Bolivian portion of the project.

  12. Dependent rational providers.

    Science.gov (United States)

    Brothers, Kyle B

    2011-04-01

    Provider claims to conscientious objection have generated a great deal of heated debate in recent years. However, the conflicts that arise when providers make claims to the "conscience" are only a subset of the more fundamental challenges that arise in health care practice when patients and providers come into conflict. In this piece, the author provides an account of patient-provider conflict from within the moral tradition of St. Thomas Aquinas. He argues that the practice of health care providers should be understood as a form of practical reasoning and that this practical reasoning must necessarily incorporate both "moral" and "professional" commitments. In order to understand how the practical reasoning of provider should account for the needs and commitments of the patient and vice versa, he explores the account of dependence provided by Alasdair MacIntyre in his book Dependent Rational Animals. MacIntyre argues that St. Thomas' account of practical reasoning should be extended and adapted to account for the embodied vulnerability of all humans. In light of this insight, providers must view patients not only as the subjects of their moral reflection but also as fellow humans upon whom the provider depends for feedback on the effectiveness and relevance of her practical reasoning. The author argues that this account precludes responsive providers from adopting either moral or professional conclusions on the appropriateness of interventions outside the individual circumstances that arise in particular situations. The adoption of this orientation toward patients will neither eradicate provider-patient conflict nor compel providers to perform interventions to which they object. But this account does require that providers attend meaningfully to the suffering of patients and seek feedback on whether their intervention has effectively addressed that suffering.

  13. The diagnostic contribution of CT volumetric rendering techniques in routine practice

    OpenAIRE

    Perandini Simone; Faccioli N; Zaccarella A; Re T; Mucelli R

    2010-01-01

    Computed tomography (CT) volumetric rendering techniques such as maximum intensity projection (MIP), minimum intensity projection (MinIP), shaded surface display (SSD), volume rendering (VR), and virtual endoscopy (VE) provide added diagnostic capabilities. The diagnostic value of such reconstruction techniques is well documented in literature. These techniques permit the exploration of fine anatomical detail that would be difficult to evaluate using axial reconstructions alone. Although thes...

  14. Surface science techniques

    CERN Document Server

    Bracco, Gianangelo

    2013-01-01

    The book describes the experimental techniques employed to study surfaces and interfaces. The emphasis is on the experimental method. Therefore all chapters start with an introduction of the scientific problem, the theory necessary to understand how the technique works and how to understand the results. Descriptions of real experimental setups, experimental results at different systems are given to show both the strength and the limits of the technique. In a final part the new developments and possible extensions of the techniques are presented. The included techniques provide microscopic as well as macroscopic information. They cover most of the techniques used in surface science.

  15. Detailed Electrochemical Characterisation of Large SOFC Stacks

    DEFF Research Database (Denmark)

    Mosbæk, Rasmus Rode; Hjelm, Johan; Barfod, R.

    2012-01-01

    Fuel Cell A/S was characterised in detail using electrochemical impedance spectroscopy. An investigation of the optimal geometrical placement of the current probes and voltage probes was carried out in order to minimise measurement errors caused by stray impedances. Unwanted stray impedances are particularly problematic at high frequencies. Stray impedances may be caused by mutual inductance and stray capacitance in the geometrical set-up and do not describe the fuel cell. Three different stack geometries were investigated by electrochemical impedance spectroscopy. Impedance measurements were carried out at a range of ac perturbation amplitudes in order to investigate linearity of the response and the signal-to-noise ratio. Separation of the measured impedance into series and polarisation resistances was possible.
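    The separation of a measured impedance into series and polarisation resistance can be illustrated with a minimal sketch that reads the high- and low-frequency real-axis intercepts of a spectrum. This is a textbook approximation under an assumed well-behaved spectrum, not the authors' actual analysis:

    ```python
    import numpy as np

    def series_polarisation_split(freq, z):
        """Split an impedance spectrum into series and polarisation resistance.

        Minimal sketch: R_series is taken as Re(Z) at the highest measured
        frequency, R_polarisation as the difference between the low- and
        high-frequency real-axis intercepts.
        """
        order = np.argsort(freq)           # sort points from low to high frequency
        z_sorted = np.asarray(z)[order]
        r_series = z_sorted[-1].real       # high-frequency intercept
        r_total = z_sorted[0].real         # low-frequency intercept
        return r_series, r_total - r_series
    ```

    For a simple R + (R parallel C) circuit the two intercepts recover the resistor values, provided the measured frequency range extends well beyond both time constants.
    
    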

  16. Detailed gravimetric geoid for the United States.

    Science.gov (United States)

    Strange, W. E.; Vincent, S. F.; Berry, R. H.; Marsh, J. G.

    1972-01-01

    A detailed gravimetric geoid was computed for the United States using a combination of satellite-derived spherical harmonic coefficients and 1 by 1 deg mean gravity values from surface gravimetry. Comparisons of this geoid with astrogeodetic geoid data indicate that a precision of plus or minus 2 meters has been obtained. Translations only were used to convert the NAD astrogeodetic geoid heights to geocentric astrogeodetic heights. On the basis of the agreement between the geocentric astrogeodetic geoid heights and the gravimetric geoid heights, no evidence is found for rotation in the North American datum. The value of the zero-order undulation can vary by 10 to 20 meters, depending on which investigator's station positions are used to establish it.

  17. Most Detailed Image of the Crab Nebula

    Science.gov (United States)

    2005-01-01

    This new Hubble image -- one of the largest ever produced with the Earth-orbiting observatory -- shows the most detailed view of the entire Crab Nebula yet made. The Crab is arguably the single most interesting object, as well as one of the most studied, in all of astronomy. The image is the largest ever taken with Hubble's WFPC2 workhorse camera. The Crab Nebula is one of the most intricately structured and highly dynamical objects ever observed. The new Hubble image of the Crab was assembled from 24 individual exposures taken with the NASA/ESA Hubble Space Telescope and is the highest resolution image of the entire Crab Nebula ever made.

  18. A meaningful expansion around detailed balance

    CERN Document Server

    Colangeli, Matteo; Wynants, Bram

    2011-01-01

    We consider Markovian dynamics modeling open mesoscopic systems which are driven away from detailed balance by a nonconservative force. A systematic expansion is obtained of the stationary distribution around an equilibrium reference, in orders of the nonequilibrium forcing. The first order around equilibrium has been known since the work of McLennan (1959), and involves the transient irreversible entropy flux. The expansion generalizes the McLennan formula to higher orders, complementing the entropy flux with the dynamical activity. The latter is more kinetic than thermodynamic and is a possible realization of Landauer's insight (1975) that, for nonequilibrium, the relative occupation of states also depends on the noise along possible escape routes. In that way nonlinear response around equilibrium can be meaningfully discussed in terms of two main quantities only, the entropy flux and the dynamical activity. The expansion makes mathematical sense as shown in the simplest cases from exponential ergodicity.

  19. Picornavirus uncoating intermediate captured in atomic detail

    Science.gov (United States)

    Ren, Jingshan; Wang, Xiangxi; Hu, Zhongyu; Gao, Qiang; Sun, Yao; Li, Xuemei; Porta, Claudine; Walter, Thomas S.; Gilbert, Robert J.; Zhao, Yuguang; Axford, Danny; Williams, Mark; McAuley, Katherine; Rowlands, David J.; Yin, Weidong; Wang, Junzhi; Stuart, David I.; Rao, Zihe; Fry, Elizabeth E.

    2013-01-01

    It remains largely mysterious how the genomes of non-enveloped eukaryotic viruses are transferred across a membrane into the host cell. Picornaviruses are simple models for such viruses, and initiate this uncoating process through particle expansion, which reveals channels through which internal capsid proteins and the viral genome presumably exit the particle, although this has not been clearly seen until now. Here we present the atomic structure of an uncoating intermediate for the major human picornavirus pathogen CAV16, which reveals VP1 partly extruded from the capsid, poised to embed in the host membrane. Together with previous low-resolution results, we are able to propose a detailed hypothesis for the ordered egress of the internal proteins, using two distinct sets of channels through the capsid, and suggest a structural link to the condensed RNA within the particle, which may be involved in triggering RNA release. PMID:23728514

  20. Detailed Chromospheric Activity Nature of KIC 9641031

    CERN Document Server

    Yoldaş, Ezgi

    2016-01-01

    This study concerns the KIC 9641031 eclipsing binary system, which has a chromospherically active component. There are three types of variation in the light curves obtained from the Kepler Mission Database: geometrical variations due to eclipses, sinusoidal variations due to rotational modulation, and flares. Taking into account the results obtained from KIC 9641031's observations in the Kepler Mission Database, we present and discuss the details of its chromospheric activity. The sinusoidal light variations due to rotational modulation and the flare events were modelled separately. 92 different data subsets, separated using the analytic models described in the literature, were modelled separately to obtain the cool spot configuration. It is seen that just one component of the system is a chromospherically active star. On this component, there are two active regions separated by about 180 deg longitudinally between the latitudes of +50 deg and +100 deg, whose locations and forms are rapidly cha...

  1. Detailed Design of Intelligent Object Framework

    Directory of Open Access Journals (Sweden)

    Sasa Savic and Hao Shi

    2013-12-01

    The design and implementation of the Intelligent Object Framework (IOF) aims to unite communication and device management through a platform-independent management protocol in conjunction with a management application. The Core Framework is developed using Microsoft Visual Studio, Microsoft's .NET Framework and Microsoft's Windows Mobile SDK. The Secondary Intelligent Object is developed using the Tibbo Integrated Development Environment (TIDE) and the T-BASIC programming language, and is loaded on an EM1026 Embedded Device Platform running the Tibbo Operating System (TiOS). The backend database is based on Microsoft's SQL Server. In this paper, protocols associated with Smart Living are first reviewed. The system architecture and the intelligent object management studio are presented. Then the device application design and database design are detailed. Finally, conclusions are drawn and future work is addressed.

  2. Reliability prediction techniques

    Energy Technology Data Exchange (ETDEWEB)

    Whittaker, B.; Worthington, B.; Lord, J.F.; Pinkard, D.

    1986-01-01

    The paper demonstrates the feasibility of applying reliability assessment techniques to mining equipment. A number of techniques are identified and described, and examples of their use in assessing mining equipment are given. These techniques include reliability prediction, failure analysis, design audit, maintainability, availability and life cycle costing. Specific conclusions regarding the usefulness of each technique are outlined. The choice of technique depends upon both the type of equipment being assessed and its stage of development, with numerical prediction best suited to electronic equipment, and fault analysis and design audit suited to mechanical equipment. Reliability assessments involve much detailed and time-consuming work, but it has been demonstrated that the resulting reliability improvements lead to savings in service costs which more than offset the cost of the evaluation.
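
    The quantitative core of such assessments can be illustrated with two standard figures of merit, steady-state availability and series-system reliability. The sketch below uses hypothetical numbers, not data from the paper.

```python
def availability(mtbf_h: float, mttr_h: float) -> float:
    """Steady-state availability from mean time between failures (MTBF)
    and mean time to repair (MTTR), both in hours."""
    return mtbf_h / (mtbf_h + mttr_h)


def series_reliability(component_reliabilities) -> float:
    """A series system works only if every component works, so its
    reliability is the product of the component reliabilities."""
    r = 1.0
    for ri in component_reliabilities:
        r *= ri
    return r


# Illustrative values: a machine with MTBF 400 h and MTTR 8 h,
# and a three-component series system.
a = availability(400.0, 8.0)
r = series_reliability([0.99, 0.98, 0.95])
```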

  3. An alternative measure of solar activity from detailed sunspot datasets

    CERN Document Server

    Muraközy, Judit; Ludmány, András

    2016-01-01

    The sunspot number is analyzed by using detailed sunspot data, including aspects of observability, sunspot sizes, and proper identification of sunspot groups as discrete entities of the solar activity. The tests show that besides the subjective factors there are also objective causes of the ambiguities in the series of sunspot numbers. To introduce an alternative activity measure the physical meaning of the sunspot number has to be reconsidered. It contains two components whose numbers are governed by different physical mechanisms; this is one source of the ambiguity. This article suggests an activity index, which is the amount of emerged magnetic flux. The only long-term proxy measure is the detailed sunspot area dataset with proper calibration to the magnetic flux amount. The Debrecen sunspot databases provide an appropriate source for the establishment of the suggested activity index.

  4. Framework programme for detailed characterisation in connection with construction and operation of a final repository for spent nuclear fuel

    Energy Technology Data Exchange (ETDEWEB)

    2010-10-15

    This report presents a programme for the detailed investigations planned to be applied during construction and operation of the repository for spent nuclear fuel at Forsmark. The report is part of SKB's application according to the Nuclear Activities Act. The detailed investigations shall provide relevant data on and site-descriptive models for the bedrock, soil deposits and eco-system of the site in order to facilitate a step-wise design and construction of the final repository. This shall be implemented in a manner that all demands on long-term safety are fulfilled, including accurate documentation of the construction work, and so that assessments of the environmental impact of the repository can be made. For the operational phase, the detailed investigations should also provide support to the deposition process with related decisions, thereby enabling fulfilment of the design premises for the siting and construction of deposition tunnels and deposition holes, as well as for deposition of canisters, and for the subsequent backfilling and closure of the repository. The Observational Method will be applied during the construction of the repository. This method entails establishing in advance acceptable limits of behaviour regarding selected geoscientific parameters and preparing a plan with measures to keep the outcome within these limits. Predictions of expected rock properties are established for each tunnel section. The outcome after excavation is compared with the acceptable range of outcomes. Information from detailed characterization will be of essential importance for application of the Observational Method and for adapting the repository to the prevailing rock properties. SKB has for the past several decades developed methods for site characterisation, applying both above- and underground investigation techniques. Experiences from this work, put into practice during the site investigations, have resulted in a solid knowledge and understanding of the...

  6. Provider Health and Wellness.

    Science.gov (United States)

    Nanda, Anil; Wasan, Anita; Sussman, James

    2017-07-19

    Provider health and wellness is a significant issue and can impact patient care, including patient satisfaction, quality of care, medical errors, malpractice risk, as well as provider and office staff turnover and early retirement. Health and wellness encompasses various areas including burnout, depression, divorce, and suicide and affects providers of all specialties and at all levels of training. Providers deal with many everyday stresses, including electronic health records, office politics, insurance and billing issues, dissatisfied patients, and their own personal and family issues. Approximately half of all physicians suffer from burnout, and the rate of burnout among physicians of all specialties is increasing. An important first step in dealing with burnout is recognition and then seeking assistance. Strategies to prevent and treat burnout include increasing provider resiliency as well as implementing practical changes in the everyday practice of medicine. There is currently very little data regarding health and wellness specifically in the field of allergy and immunology, and studies are necessary to determine the prevalence of burnout and related issues in this field. Many medical specialties as well as state and national medical associations have health and wellness committees and other resources, which are essential for providers. Health and wellness programs should be introduced early in a provider's training and continued throughout a provider's career. Copyright © 2017 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  7. Passive RF component technology materials, techniques, and applications

    CERN Document Server

    Wang, Guoan

    2012-01-01

    Focusing on novel materials and techniques, this pioneering volume provides you with a solid understanding of the design and fabrication of smart RF passive components. You find comprehensive details on LCP, metal materials, ferrite materials, nano materials, high aspect ratio enabled materials, green materials for RFID, and silicon micromachining techniques. Moreover, this practical book offers expert guidance on how to apply these materials and techniques to design a wide range of cutting-edge RF passive components, from MEMS switch based tunable passives and 3D passives, to metamaterial-bas...

  8. Relaxation Techniques to Manage IBS Symptoms

    Science.gov (United States)

    Relaxation Techniques for IBS. You've been to the doctor ...

  9. A fine-cut technique for permanent laryngeal sectioning.

    Science.gov (United States)

    Roy, S; Lundy, D S; Marcillo, A S; Casiano, R R

    2001-01-01

    A new technique for permanent sectioning of the human spinal cord has provided superior images over those produced with traditional methods. Application of this technique for sections of the human larynx may yield cost-effective, efficient, and accurate laryngeal anatomic dissections. This study was designed to evaluate this technique for dissections of the human larynx. Laryngeal sections from cadavers were submerged in a celloidin solution, a derivative of wallpaper plaster, and frozen to -15 degrees C. After preparation, axial and coronal cuts of 100 microm were made with a Macrocut Tome sectioning system. Sections were completed in approximately 30 hours. Digitized photographs of the laryngeal sections provide detailed images of precise anatomic relationships. Celloidin-based sectioning of the human larynx yields precise anatomic information beyond standard radiographic imaging and previous permanent laryngeal sectioning techniques in a cost-efficient and timely manner. Black and white fine-section photographs are provided.

  10. Industrial fouling: problem characterization, economic assessment, and review of prevention, mitigation, and accommodation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Garrett-Price, B.A.; Smith, S.A.; Watts, R.L.

    1984-02-01

    A comprehensive overview of heat exchanger fouling in the manufacturing industries is provided. Specifically, this overview addresses: the characteristics of industrial fouling problems; the mitigation and accommodation techniques currently used by industry; and the types and magnitude of costs associated with industrial fouling. A detailed review of the fouling problems, costs and mitigation techniques is provided for the food, textile, pulp and paper, chemical, petroleum, cement, glass and primary metals industries.
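
    The economic impact of fouling is usually quantified through the extra thermal resistance a deposit adds to a heat exchanger; a minimal sketch of that standard arithmetic, with illustrative values rather than figures from the report:

```python
def fouled_u(u_clean: float, rf: float) -> float:
    """Overall heat-transfer coefficient after fouling, from the
    standard resistance-in-series relation:
        1/U_fouled = 1/U_clean + Rf
    u_clean in W/(m^2 K), rf (fouling resistance) in (m^2 K)/W."""
    return 1.0 / (1.0 / u_clean + rf)


# Illustrative: a clean coefficient of 2000 W/(m^2 K) degraded by a
# hypothetical fouling resistance of 0.0002 (m^2 K)/W loses roughly
# a quarter of its heat-transfer performance.
u = fouled_u(2000.0, 0.0002)
```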

  11. Pileup Mitigation Techniques

    CERN Document Server

    Klein, Matthew Henry; The ATLAS collaboration

    2016-01-01

    We report on recent progress in the ATLAS experiment in developing tools to mitigate the effects of pile-up. Forward pile-up jet tagging techniques, as well as constituent-level pile-up suppression algorithms, are discussed in detail. The impacts of these approaches on jet energy and angular resolution, as well as on jet substructure and boosted-object tagging performance, are presented. Improvements to various physics channels of interest are discussed, and the potential future of such algorithms — both online and offline, and both at the current LHC and at a future high-luminosity LHC and beyond — is considered in detail.

  12. Getting the Most Out of Dual-Listed Courses: Involving Undergraduate Students in Discussion through Active Learning Techniques

    Science.gov (United States)

    Duncan, Leslie Lyons; Burkhardt, Bethany L.; Benneyworth, Laura M.; Tasich, Christopher M.; Duncan, Benjamin R.

    2015-01-01

    This article provides readers with details concerning the implementation of four active learning techniques used to help undergraduate students critically discuss primary literature. On the basis of undergraduate and graduate student perceptions and experiences, the authors suggest techniques to enhance the quality of dual-listed courses and…

  13. Multisensor image fusion techniques in remote sensing

    Science.gov (United States)

    Ehlers, Manfred

    Current and future remote sensing programs such as Landsat, SPOT, MOS, ERS, JERS, and the space platform's Earth Observing System (Eos) are based on a variety of imaging sensors that will provide timely and repetitive multisensor earth observation data on a global scale. Visible, infrared and microwave images of high spatial and spectral resolution will eventually be available for all parts of the earth. It is essential that efficient processing techniques be developed to cope with the large multisensor data volumes. This paper discusses data fusion techniques that have proved successful for synergistic merging of SPOT HRV, Landsat TM and SIR-B images. It is demonstrated that these techniques can be used to improve rectification accuracies, to depict greater cartographic detail, and to enhance spatial resolution in multisensor image data sets.
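
    One widely used merging method of the kind surveyed here is the Brovey transform, which rescales each multispectral band by the ratio of the high-resolution panchromatic value to the sum of the band values. The sketch below is a generic illustration on toy pixel values, not necessarily the specific technique of the paper.

```python
def brovey_fuse(ms_bands, pan):
    """Brovey-transform pan-sharpening.

    ms_bands: list of bands, each a flat list of pixel values
              (assumed co-registered and resampled to the pan grid).
    pan:      the high-resolution panchromatic band, same length.
    Each fused pixel is band * pan / (sum of all bands at that pixel)."""
    fused = []
    for band in ms_bands:
        out = []
        for i, v in enumerate(band):
            total = sum(b[i] for b in ms_bands)
            out.append(v * pan[i] / total if total else 0.0)
        fused.append(out)
    return fused


# Toy 2-pixel example: three multispectral bands sharpened by a pan band.
fused = brovey_fuse([[10.0, 20.0], [20.0, 20.0], [10.0, 10.0]],
                    [80.0, 100.0])
```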

  14. Wave Forecasting Using Neuro Wavelet Technique

    Directory of Open Access Journals (Sweden)

    Pradnya Dixit

    2014-12-01

    Full Text Available In the present work a hybrid Neuro-Wavelet Technique (NWT) is used for forecasting waves 6 hr, 12 hr, 18 hr and 24 hr in advance, using hourly measured significant wave heights at NDBC station 41004 near the east coast of the USA. The NWT combines two methods, the Discrete Wavelet Transform (DWT) and Artificial Neural Networks. Hourly measurements of significant wave height spanning the two years 2010 and 2011 are used to calibrate and test the models. The discrete wavelet transform analyzes the frequency content of a signal with respect to time at different scales. It decomposes the time series into low-frequency (approximate) and high-frequency (detail) components. The decomposition of the approximate component can be carried out to as many further levels as desired, yielding additional detail and approximate components whose amplitudes vary relatively smoothly. The neural network is trained with the decorrelated approximate and detail wavelet coefficients. The network outputs during testing are reconstructed back using the inverse DWT. The results were judged using wave plots, scatter plots and other error measures. The developed models show reasonable accuracy in predicting significant wave heights from 6 to 24 hours ahead. For comparison, traditional ANN models were also developed at the same location using the same data and the same time intervals.
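
    The decomposition step can be sketched with a single level of the Haar wavelet, the simplest DWT: the signal is split into approximate (low-frequency) and detail (high-frequency) coefficients, and the inverse transform reconstructs it exactly. The sample values are illustrative; the paper itself does not state which mother wavelet was used.

```python
def haar_dwt(x):
    """One level of the Haar discrete wavelet transform: pairwise
    sums give the approximate (low-frequency) coefficients, pairwise
    differences give the detail (high-frequency) coefficients."""
    s = 2 ** 0.5
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x) - 1, 2)]
    return approx, detail


def haar_idwt(approx, detail):
    """Inverse one-level Haar transform: reconstruct the signal."""
    s = 2 ** 0.5
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) / s)
        x.append((a - d) / s)
    return x


# Decompose a toy series of wave heights and reconstruct it.
a, d = haar_dwt([1.0, 2.0, 3.0, 4.0])
rebuilt = haar_idwt(a, d)
```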

  15. Detailed Chemical Kinetic Modeling of Cyclohexane Oxidation

    Energy Technology Data Exchange (ETDEWEB)

    Silke, E J; Pitz, W J; Westbrook, C K; Ribaucour, M

    2006-11-10

    A detailed chemical kinetic mechanism has been developed and used to study the oxidation of cyclohexane at both low and high temperatures. Reaction rate constant rules are developed for the low temperature combustion of cyclohexane. These rules can be used in chemical kinetic mechanisms for other cycloalkanes. Since cyclohexane produces only one type of cyclohexyl radical, much of the low temperature chemistry of cyclohexane is described in terms of one potential energy diagram showing the reaction of cyclohexyl radical + O2 through five-, six- and seven-membered ring transition states. The direct elimination of cyclohexene and HO2 from RO2 is included in the treatment using a modified rate constant of Cavallotti et al. Published and unpublished data from the Lille rapid compression machine, as well as jet-stirred reactor data, are used to validate the mechanism. The effect of heat loss is included in the simulations, an improvement on previous studies of cyclohexane. Calculations indicated that the production of 1,2-epoxycyclohexane observed in the experiments cannot be simulated based on the current understanding of low temperature chemistry. Possible 'alternative' H-atom isomerizations leading to different products from the parent O2QOOH radical were included in the low temperature chemical kinetic mechanism and were found to play a significant role.
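
    Rate constant rules in mechanisms of this kind are conventionally expressed in modified Arrhenius form, k = A * T**n * exp(-Ea/(R*T)). The sketch below just evaluates that expression; the parameter values are hypothetical placeholders, not rate rules from this mechanism.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1


def arrhenius(A: float, n: float, Ea: float, T: float) -> float:
    """Modified Arrhenius rate constant k = A * T**n * exp(-Ea/(R*T)).

    A in units consistent with the reaction order, Ea in J/mol, T in K.
    The values passed in below are illustrative only."""
    return A * T ** n * math.exp(-Ea / (R * T))


# Hypothetical parameters: with Ea > 0 the rate constant rises with T.
k_low = arrhenius(1e13, 0.0, 1.2e5, 700.0)
k_high = arrhenius(1e13, 0.0, 1.2e5, 1200.0)
```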

  16. Detailed Aerosol Characterization using Polarimetric Measurements

    Science.gov (United States)

    Hasekamp, Otto; di Noia, Antonio; Stap, Arjen; Rietjens, Jeroen; Smit, Martijn; van Harten, Gerard; Snik, Frans

    2016-04-01

    Anthropogenic aerosols are believed to cause the second most important anthropogenic forcing of climate change after greenhouse gases. In contrast to the climate effect of greenhouse gases, which is understood relatively well, the negative forcing (cooling effect) caused by aerosols represents the largest reported uncertainty in the most recent assessment of the Intergovernmental Panel on Climate Change (IPCC). To reduce the large uncertainty in the effects of aerosols on cloud formation and climate, accurate satellite measurements of aerosol optical properties (optical thickness, single scattering albedo, phase function) and microphysical properties (size distribution, refractive index, shape) are essential. There is growing consensus in the aerosol remote sensing community that multi-angle measurements of intensity and polarization are essential to unambiguously determine all relevant aerosol properties. This presentation addresses the different aspects of polarimetric remote sensing of atmospheric aerosols, including retrieval algorithm development, validation, and data needs for climate and air quality applications. During the past years, retrieval algorithms that make full use of the capabilities of polarimetric measurements have been developed at SRON Netherlands Institute for Space Research. We will show results of detailed aerosol properties from ground-based (groundSPEX), airborne (NASA Research Scanning Polarimeter), and satellite (POLDER) measurements. We will also discuss observational needs for future instrumentation in order to improve our understanding of the role of aerosols in climate change and air quality.
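
    The basic polarimetric quantity underlying such measurements is the degree of linear polarization derived from the Stokes parameters (I, Q, U) that multi-angle polarimeters record; a minimal sketch with illustrative values:

```python
import math


def degree_of_linear_polarization(I: float, Q: float, U: float) -> float:
    """DoLP = sqrt(Q^2 + U^2) / I, from the Stokes parameters:
    I is total intensity, Q and U describe linear polarization."""
    return math.sqrt(Q * Q + U * U) / I


# Illustrative Stokes vector: 50% linearly polarized light.
dolp = degree_of_linear_polarization(1.0, 0.3, 0.4)
```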

  17. Some articulatory details of emotional speech

    Science.gov (United States)

    Lee, Sungbok; Yildirim, Serdar; Bulut, Murtaza; Kazemzadeh, Abe; Narayanan, Shrikanth

    2005-09-01

    Differences in speech articulation among four emotion types, neutral, anger, sadness, and happiness, are investigated by analyzing tongue tip, jaw, and lip movement data collected from one male and one female speaker of American English. The data were collected using an electromagnetic articulography (EMA) system while subjects produced simulated emotional speech. Pitch, root-mean-square (rms) energy and the first three formants were estimated for vowel segments. For both speakers, angry speech exhibited the largest rms energy and the largest articulatory activity in terms of displacement range and movement speed. Happy speech is characterized by the largest pitch variability. It has higher rms energy than neutral speech, but its articulatory activity is comparable to, or less than, that of neutral speech. That is, happy speech is more prominent in voicing activity than in articulation. Sad speech exhibits the longest sentence duration and lower rms energy. However, its articulatory activity is no less than that of neutral speech. Interestingly, for the male speaker, articulation of vowels in sad speech is consistently more peripheral (i.e., more forwarded displacements) when compared to other emotions. However, this does not hold for the female subject. These and other results will be discussed in detail with associated acoustics and perceived emotional qualities. [Work supported by NIH.]

  18. Detailed modelling of the 21-cm Forest

    CERN Document Server

    Semelin, Benoit

    2015-01-01

    The 21-cm forest is a promising probe of the Epoch of Reionization. The local state of the intergalactic medium (IGM) is encoded in the spectrum of a background source (radio-loud quasars or gamma ray burst afterglow) by absorption at the local 21-cm wavelength, resulting in a continuous and fluctuating absorption level. Small-scale structures (filaments and minihaloes) in the IGM are responsible for the strongest absorption features. The absorption can also be modulated on large scales by inhomogeneous heating and Wouthuysen-Field coupling. We present the results from a simulation that attempts to preserve the cosmological environment while resolving some of the small-scale structures (a few kpc resolution in a 50 Mpc/h box). The simulation couples the dynamics and the ionizing radiative transfer and includes X-ray and Lyman lines radiative transfer for a detailed physical modelling. As a result we find that soft X-ray self-shielding, Lyman-alpha self-shielding and shock heating all have an impact on the pre...

  19. Detailed ultraviolet asymptotics for AdS scalar field perturbations

    CERN Document Server

    Evnin, Oleg

    2016-01-01

    We present a range of methods suitable for accurate evaluation of the leading asymptotics for integrals of products of Jacobi polynomials in limits when the degrees of some or all polynomials inside the integral become large. The structures in question have recently emerged in the context of effective descriptions of small amplitude perturbations in anti-de Sitter (AdS) spacetime. The limit of high degree polynomials corresponds in this situation to effective interactions involving extreme short-wavelength modes, whose dynamics is crucial for the turbulent instabilities that determine the ultimate fate of small AdS perturbations. We explicitly apply the relevant asymptotic techniques to the case of a self-interacting probe scalar field in AdS and extract a detailed form of the leading large degree behavior, including closed form analytic expressions for the numerical coefficients appearing in the asymptotics.

  20. Detailed ultraviolet asymptotics for AdS scalar field perturbations

    Energy Technology Data Exchange (ETDEWEB)

    Evnin, Oleg [Department of Physics, Faculty of Science, Chulalongkorn University,Thanon Phayathai, Pathumwan, Bangkok 10330 (Thailand); Theoretische Natuurkunde, Vrije Universiteit Brussel and The International Solvay Institutes,Pleinlaan 2, B-1050 Brussels (Belgium); Jai-akson, Puttarak [Department of Physics, Faculty of Science, Chulalongkorn University,Thanon Phayathai, Pathumwan, Bangkok 10330 (Thailand)

    2016-04-11

    We present a range of methods suitable for accurate evaluation of the leading asymptotics for integrals of products of Jacobi polynomials in limits when the degrees of some or all polynomials inside the integral become large. The structures in question have recently emerged in the context of effective descriptions of small amplitude perturbations in anti-de Sitter (AdS) spacetime. The limit of high degree polynomials corresponds in this situation to effective interactions involving extreme short-wavelength modes, whose dynamics is crucial for the turbulent instabilities that determine the ultimate fate of small AdS perturbations. We explicitly apply the relevant asymptotic techniques to the case of a self-interacting probe scalar field in AdS and extract a detailed form of the leading large degree behavior, including closed form analytic expressions for the numerical coefficients appearing in the asymptotics.

  1. Strengthening of competence planning truss through instructional media development details

    Science.gov (United States)

    Handayani, Sri; Nurcahyono, M. Hadi

    2017-03-01

    Competency-based learning is a model of learning in which planning, implementation, and assessment refer to the mastery of competencies. Learning in lectures is conducted within a framework for comprehensively realizing student competency. Competence-oriented learning activities in the classroom must encourage students to learn more actively: to search for information themselves, to explore alone or with friends in pairs or groups, and to use a variety of learning resources, including printed materials, electronic media, and the environment. Analysis of learning in the wooden-structures course revealed a weakness in students' understanding of truss details, hence the need to develop media that give a clear picture of the structure of wooden trusses and their connection details. Development of the instructional media consisted of three phases: planning, production and assessment. Planning of the media should be tailored to the needs and conditions necessary to reinforce mastery of competencies, through a table of material needs. Production of the media was done using hardware and software supporting the creation of the media. Assessment of the media products included feasibility studies by subject-matter experts and media experts, while testing was done according to the students' perception of the product. Analysis of the materials for the instructional aspects yielded 100% (very good); media analysis for the design aspects was rated very good, with a percentage of 88.93%; and analysis of student perceptions was rated very good, with a percentage of 84.84%. The instructional media for truss details are feasible and can be used in teaching wooden structures to provide capacity-building in truss planning.

  2. Alternative SEM techniques for observing pyritised fossil material.

    Science.gov (United States)

    Poole; Lloyd

    2000-11-01

    Two scanning electron microscopy (SEM) electron-specimen interactions that provide images based on sample crystal structure, electron channelling and electron backscattered diffraction, are described. The SEM operating conditions and sample preparation are presented, followed by an example application of these techniques to the study of pyritised plant material. The two approaches provide an opportunity to examine simultaneously, at higher magnifications than normally available optically, detailed specimen anatomy and preservation state. Our investigation suggests that whereas both techniques have their advantages, the electron channelling approach is generally more readily available to most SEM users. However, electron backscattered diffraction does afford the opportunity of automated examination and characterisation of pyritised fossil material.

  3. Detailed chemical kinetic oxidation mechanism for a biodiesel surrogate

    Energy Technology Data Exchange (ETDEWEB)

    Herbinet, O; Pitz, W J; Westbrook, C K

    2007-09-20

    A detailed chemical kinetic mechanism has been developed and used to study the oxidation of methyl decanoate, a surrogate for biodiesel fuels. This model has been built by following the rules established by Curran et al. for the oxidation of n-heptane and it includes all the reactions known to be pertinent to both low and high temperatures. Computed results have been compared with methyl decanoate experiments in an engine and oxidation of rapeseed oil methyl esters in a jet stirred reactor. An important feature of this mechanism is its ability to reproduce the early formation of carbon dioxide that is unique to biofuels and due to the presence of the ester group in the reactant. The model also predicts ignition delay times and OH profiles very close to observed values in shock tube experiments fueled by n-decane. These model capabilities indicate that large n-alkanes can be good surrogates for large methyl esters and biodiesel fuels to predict overall reactivity, but some kinetic details, including early CO2 production from biodiesel fuels, can be predicted only by a detailed kinetic mechanism for a true methyl ester fuel. The present methyl decanoate mechanism provides a realistic kinetic tool for simulation of biodiesel fuels.

  4. Detailed chemical kinetic oxidation mechanism for a biodiesel surrogate

    Energy Technology Data Exchange (ETDEWEB)

    Herbinet, O; Pitz, W J; Westbrook, C K

    2007-09-17

    A detailed chemical kinetic mechanism has been developed and used to study the oxidation of methyl decanoate, a surrogate for biodiesel fuels. This model has been built by following the rules established by Curran et al. for the oxidation of n-heptane and it includes all the reactions known to be pertinent to both low and high temperatures. Computed results have been compared with methyl decanoate experiments in an engine and oxidation of rapeseed oil methyl esters in a jet stirred reactor. An important feature of this mechanism is its ability to reproduce the early formation of carbon dioxide that is unique to biofuels and due to the presence of the ester group in the reactant. The model also predicts ignition delay times and OH profiles very close to observed values in shock tube experiments fueled by n-decane. These model capabilities indicate that large n-alkanes can be good surrogates for large methyl esters and biodiesel fuels to predict overall reactivity, but some kinetic details, including early CO2 production from biodiesel fuels, can be predicted only by a detailed kinetic mechanism for a true methyl ester fuel. The present methyl decanoate mechanism provides a realistic kinetic tool for simulation of biodiesel fuels.

  7. Upstream from OPERA: extreme attention to detail

    CERN Multimedia

    CERN Bulletin

    2011-01-01

    Two weeks ago, at a seminar held at CERN, the OPERA collaboration revealed their astonishing observation: neutrinos might move faster than light. The finding is currently under scrutiny in the scientific community. While the result downstream at Gran Sasso speaks for itself, upstream at CERN things are no less intriguing, with high-tech GPS systems, novel techniques for accurately measuring the time, and unique ways of keeping the initial particle beam stable. Take away one ingredient and the accuracy needed for the final measurement is spoiled.   Underground installations of the CERN Neutrinos to Gran Sasso (CNGS) project. First ingredient: a stable beam CERN produces neutrinos by sending a beam of protons to hit a target. The collisions produce a secondary beam, which mostly consists of pions and kaons that decay in flight within an evacuated tunnel. Their decay products are muons and muon-neutrinos. An absorber stops the pions and kaons that do not decay, while the resulting muons are absorb...

  8. "Reminder: please update your details": Phishing Trends

    CERN Document Server

    Dhinakaran, Cynthia; Nagamalai, Dhinaharan

    2010-01-01

    Spam clutters users' inboxes, consumes resources and spreads attacks like DDoS, MiM, phishing, etc. Phishing is a byproduct of email and causes financial loss to users and loss of reputation to financial institutions. In this paper we study the characteristics of phishing and the technology used by phishers. In order to counter anti-phishing technology, phishers change their mode of operation; therefore continuous evaluation of phishing helps us to combat phishers effectively. We have collected seven hundred thousand spam messages from a corporate server over a period of 13 months, from February 2008 to February 2009. From the collected data, we identified different kinds of phishing scams and their modes of operation. Our observation shows that phishers are dynamic and depend more on social engineering techniques than on software vulnerabilities. We believe that this study would be useful to develop more efficient anti-phishing methodologies.

  9. Detailed observations of the source of terrestrial narrowband electromagnetic radiation

    Science.gov (United States)

    Kurth, W. S.

    1982-01-01

    Detailed observations are presented of a region near the terrestrial plasmapause where narrowband electromagnetic radiation (previously called escaping nonthermal continuum radiation) is being generated. These observations show a direct correspondence between the narrowband radio emissions and electron cyclotron harmonic waves near the upper hybrid resonance frequency. In addition, electromagnetic radiation propagating in the Z-mode is observed in the source region which provides an extremely accurate determination of the electron plasma frequency and, hence, density profile of the source region. The data strongly suggest that electrostatic waves and not Cerenkov radiation are the source of the banded radio emissions and define the coupling which must be described by any viable theory.

  10. Technology of Strengthening Steel Details by Surfacing Composite Coatings

    Science.gov (United States)

    Burov, V. G.; Bataev, A. A.; Rakhimyanov, Kh M.; Mul, D. O.

    2016-04-01

    The article considers the problem of forming wear-resistant metal-ceramic coatings on steel surfaces using the results of our own investigations and the analysis of achievements made in the country and abroad. Increasing the wear resistance of surface layers of steel details is achieved by surfacing composite coatings with carbides or borides of metals as disperse particles in the strengthening phase. The use of surfacing on wearing machine details and mechanisms has a history of more than 100 years, but engineering investigations in this field are still being conducted. The use of heating sources which provide a high density of power allows ensuring temperature and time conditions of surfacing under which composites with peculiar service and functional properties are formed. High concentration of energy in the zone of melt, which is created from powder mixtures and the hardened surface layer, allows producing the transition zone between the main material and the surfaced coating. Surfacing by an electron beam directed from vacuum into the atmosphere offers considerable technological advantages: it gives the possibility of strengthening surface layers of large-sized details by surfacing powder mixtures without their preliminary compacting. A modified layer of the main metal with ceramic particles distributed in it is created as a result of heating surfaced powders and the detail surface layer by the electron beam. The technology of surfacing allows using powders of refractory metals and graphite in the composition of powder mixtures. They interact with one another and form the particles of the hardening phase of the composite coating. The chemical composition of the main and surfaced materials is considered to be the main factor which determines the character of metallurgical processes in local zones of melt as well as the structure and properties of the surfaced composite.

  11. Towards a detailed soot model for internal combustion engines

    Energy Technology Data Exchange (ETDEWEB)

    Mosbach, Sebastian; Celnik, Matthew S.; Raj, Abhijeet; Kraft, Markus [Department of Chemical Engineering and Biotechnology, University of Cambridge, Pembroke Street, Cambridge CB2 3RA (United Kingdom); Zhang, Hongzhi R. [Department of Chemical Engineering, University of Utah, 1495 East 100 South, Kennecott Research Building, Salt Lake City, UT 84112 (United States); Kubo, Shuichi [Frontier Research Center, Toyota Central R and D Labs., Inc., Nagakute, Aichi 480-1192 (Japan); Kim, Kyoung-Oh [Higashifuji Technical Center, Toyota Motor Corporation, Mishuku 1200, Susono, Shizuoka 480-1193 (Japan)

    2009-06-15

    In this work, we present a detailed model for the formation of soot in internal combustion engines describing not only bulk quantities such as soot mass, number density, volume fraction, and surface area but also the morphology and chemical composition of soot aggregates. The new model is based on the Stochastic Reactor Model (SRM) engine code, which uses detailed chemistry and takes into account convective heat transfer and turbulent mixing, and the soot formation is accounted for by SWEEP, a population balance solver based on a Monte Carlo method. In order to couple the gas-phase to the particulate phase, a detailed chemical kinetic mechanism describing the combustion of Primary Reference Fuels (PRFs) is extended to include small Polycyclic Aromatic Hydrocarbons (PAHs) such as pyrene, which function as soot precursor species for particle inception in the soot model. Apart from providing averaged quantities as functions of crank angle like soot mass, volume fraction, aggregate diameter, and the number of primary particles per aggregate for example, the integrated model also gives detailed information such as aggregate and primary particle size distribution functions. In addition, specifics about aggregate structure and composition, including C/H ratio and PAH ring count distributions, and images similar to those produced with Transmission Electron Microscopes (TEMs), can be obtained. The new model is applied to simulate an n-heptane fuelled Homogeneous Charge Compression Ignition (HCCI) engine which is operated at an equivalence ratio of 1.93. In-cylinder pressure and heat release predictions show satisfactory agreement with measurements. Furthermore, simulated aggregate size distributions as well as their time evolution are found to qualitatively agree with those obtained experimentally through snatch sampling. It is also observed both in the experiment as well as in the simulation that aggregates in the trapped residual gases play a vital role in the soot
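
A population balance solver such as SWEEP advances the soot particle ensemble through stochastic events. As a deliberately minimal illustration of one such event type (not the authors' model: a constant coagulation kernel and unit initial masses are simplifying assumptions, whereas the real solver uses size-dependent kernels and detailed aggregate state), a direct-simulation Monte Carlo coagulation step can be sketched as:

```python
import random

def coagulate(masses, steps, rng):
    """Constant-kernel coagulation: each event merges a uniformly chosen
    pair of aggregates into one. Total mass is conserved; particle count
    drops by one per event."""
    masses = list(masses)
    for _ in range(steps):
        i, j = rng.sample(range(len(masses)), 2)  # pick a distinct random pair
        masses[i] += masses[j]                    # merge aggregate j into i
        masses.pop(j)
    return masses

rng = random.Random(42)
pop = coagulate([1.0] * 100, steps=60, rng=rng)
print(len(pop), sum(pop))   # 40 aggregates left; total mass still 100.0
```

The resulting list is a crude aggregate mass distribution; extending the particle state (surface area, PAH content, primary-particle count) is what turns such a sketch into a detailed soot model.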

  12. Structural detailing of openings in sandwich panels

    NARCIS (Netherlands)

    Tomà, T.; Courage, W.

    1996-01-01

    European Recommendations exist which provide calculation rules to determine the strength and stiffness of sandwich panels composed of two metal faces with a foam in between. In case of openings in such panels (e.g. for windows) an influence will appear with regard to the stiffness and loadbearing ca

  13. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  14. What HERA may provide?

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hannes [DESY, Hamburg (Germany); De Roeck, Albert [CERN, Genf (Switzerland); Bartles, Jochen [Univ. Hamburg (DE). Institut fuer Theoretische Physik II] (and others)

    2008-09-15

    More than 100 people participated in a discussion session at the DIS08 workshop on the topic What HERA may provide. A summary of the discussion with a structured outlook and list of desirable measurements and theory calculations is given. (orig.)

  15. Novel information theory techniques for phonon spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Hague, J P [Department of Physics, Loughborough University, Loughborough, LE11 3TU (United Kingdom)

    2007-12-15

    The maximum entropy method (MEM) and spectral reverse Monte Carlo (SRMC) techniques are applied to the determination of the phonon density of states (PDOS) from heat-capacity data. The approach presented here takes advantage of the standard integral transform relating the PDOS with the specific heat at constant volume. MEM and SRMC are highly successful numerical approaches for inverting integral transforms. The formalism and algorithms necessary to carry out the inversion of specific heat curves are introduced, and where possible, I have concentrated on algorithms and experimental details for practical usage. Simulated data are used to demonstrate the accuracy of the approach. The main strength of the techniques presented here is that the resulting spectra are always physical: Computed PDOS is always positive and properly applied information theory techniques only show statistically significant detail. The treatment set out here provides a simple, cost-effective and reliable method to determine phonon properties of new materials. In particular, the new technique is expected to be very useful for establishing where interesting phonon modes and properties can be found, before spending time at large scale facilities.
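
The integral transform that MEM and SRMC invert relates the specific heat to the PDOS through the harmonic-oscillator kernel. As a minimal forward-direction sketch (illustrative only: units with hbar = kB = 1 and a toy Gaussian PDOS, not the paper's data), the transform can be evaluated directly:

```python
import numpy as np

def heat_capacity(g, omega, T, kB=1.0, hbar=1.0):
    """C_V(T) from a phonon DOS g(omega) on a uniform grid, via the standard
    Einstein heat-capacity kernel x^2 e^x / (e^x - 1)^2 with x = hbar*omega/kB*T.
    This is the forward transform whose inversion MEM/SRMC perform."""
    x = hbar * omega / (kB * T)
    kernel = x**2 * np.exp(x) / np.expm1(x)**2
    domega = omega[1] - omega[0]          # uniform grid assumed
    return kB * np.sum(g * kernel) * domega

# Toy PDOS: a single Einstein mode smeared into a narrow Gaussian.
omega = np.linspace(0.01, 5.0, 2000)
omega0, sigma = 1.0, 0.05
g = np.exp(-(omega - omega0)**2 / (2 * sigma**2))
g /= g.sum() * (omega[1] - omega[0])      # normalize to one mode

print(heat_capacity(g, omega, T=0.05))    # low T: exponentially suppressed
print(heat_capacity(g, omega, T=50.0))    # high T: approaches kB per mode
```

The inverse problem, recovering g(omega) from measured C_V(T), is ill-posed, which is why the regularizing information-theory machinery of the paper is needed.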

  16. Comparison between detailed digital and conventional soil maps of an area with complex geology

    Directory of Open Access Journals (Sweden)

    Osmar Bazaglia Filho

    2013-10-01

    Full Text Available Since different pedologists will draw different soil maps of the same area, it is important to compare the differences between mapping by specialists and mapping techniques, such as the currently intensively discussed Digital Soil Mapping. Four detailed soil maps (scale 1:10.000) of a 182-ha sugarcane farm in the county of Rafard, São Paulo State, Brazil, were compared. The area has a large variation of soil formation factors. The maps were drawn independently by four soil scientists and compared with a fifth map obtained by a digital soil mapping technique. All pedologists were given the same set of information. As many field expeditions and soil pits as required by each surveyor were provided to define the mapping units (MUs). For the Digital Soil Map (DSM), spectral data were extracted from Landsat 5 Thematic Mapper (TM) imagery as well as six terrain attributes from the topographic map of the area. These data were summarized by principal component analysis to generate the map designs of groups through Fuzzy K-means clustering. Field observations were made to identify the soils in the MUs and classify them according to the Brazilian Soil Classification System (BSCS). To compare the conventional and digital (DSM) soil maps, they were crossed pairwise to generate confusion matrices that were mapped. The categorical analysis at each classification level of the BSCS showed that the agreement between the maps decreased towards the lower levels of classification and the great influence of the surveyor on both the mapping and definition of MUs in the soil map. The average correspondence between the conventional and DSM maps was similar. Therefore, the method used to obtain the DSM yielded similar results to those obtained by the conventional technique, while providing additional information about the landscape of each soil, useful for applications in future surveys of similar areas.
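
The pairwise crossing of two categorical maps into a confusion matrix, from which overall agreement is read off, can be sketched as follows (the per-cell class lists below are hypothetical placeholders, not the survey data):

```python
import numpy as np

def confusion_matrix(map_a, map_b, classes):
    """Cross-tabulate two categorical maps cell by cell: entry (i, j) counts
    cells that map_a assigns class i and map_b assigns class j."""
    idx = {c: k for k, c in enumerate(classes)}
    cm = np.zeros((len(classes), len(classes)), dtype=int)
    for a, b in zip(map_a, map_b):
        cm[idx[a], idx[b]] += 1
    return cm

def overall_agreement(cm):
    """Fraction of cells on which the two maps agree (the matrix diagonal)."""
    return np.trace(cm) / cm.sum()

# Hypothetical per-cell classes from a conventional map and a DSM map.
conv = ["A", "A", "B", "B", "C", "C", "C", "A"]
dsm  = ["A", "B", "B", "B", "C", "C", "A", "A"]
cm = confusion_matrix(conv, dsm, classes=["A", "B", "C"])
print(overall_agreement(cm))   # 0.75
```

Repeating this per classification level of the BSCS reproduces the categorical analysis described above.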

  17. Internet Medline providers.

    Science.gov (United States)

    Vine, D L; Coady, T R

    1998-01-01

    Each database in this review has features that will appeal to some users. Each provides a credible interface to information available within the Medline database. The major differences are pricing and interface design. In this context, features that cost more and might seem trivial to the occasional searcher may actually save time and money when used by the professional. Internet Grateful Med is free, but Ms. Coady and I agree the availability of only three ANDable search fields is a major functional limitation. PubMed is also free but much more powerful. The command line interface that permits very sophisticated searches requires a commitment that casual users will find intimidating. Ms. Coady did not believe the feedback currently provided during a search was sufficient for sustained professional use. Paper Chase and Knowledge Finder are mature, modestly priced Medline search services. Paper Chase provides a menu-driven interface that is very easy to use, yet permits the user to search virtually all of Medline's data fields. Knowledge Finder emphasizes the use of natural language queries but fully supports more traditional search strategies. The impact of the tradeoff between fuzzy and Boolean strategies offered by Knowledge Finder is unclear and beyond the scope of this review. Additional software must be downloaded to use all of Knowledge Finders' features. Other providers required no software beyond the basic Internet browser, and this requirement prevented Ms. Coady from evaluating Knowledge Finder. Ovid and Silver Platter offer well-designed interfaces that simplify the construction of complex queries. These are clearly services designed for professional users. While pricing eliminates these for casual use, it should be emphasized that Medline citation access is only a portion of the service provided by these high-end vendors. Finally, we should comment that each of the vendors and government-sponsored services provided prompt and useful feedback to e

  18. Strengthening of non-seismically detailed reinforced concrete beam–column joints using SIFCON blocks

    Indian Academy of Sciences (India)

    I S Misir; S Kahraman

    2013-02-01

    This article aims to propose a novel seismic strengthening technique for non-seismically detailed beam–column joints of existing reinforced concrete buildings, typical of the pre-1975 construction practice in Turkey. The technique is based on mounting pre-fabricated SIFCON composite corner and plate blocks on joints with anchorage rods. For the experimental part three 2/3 scale exterior beam–column joint specimens were tested under quasi-static cyclic loading. One of them was a control specimen with non-seismic details, and the remaining two with the same design properties were strengthened with composite blocks with different thickness and anchorage details. Results showed that the control specimen showed brittle shear failure at low drift levels, whereas in the strengthened specimens, plastic hinge formation moved away from column face allowing specimens to fail in flexure. The proposed technique greatly improved lateral strength, stiffness, energy dissipation, and ductility.

  19. Building Service Provider Capabilities

    DEFF Research Database (Denmark)

    Brandl, Kristin; Jaura, Manya; Ørberg Jensen, Peter D.

    In this paper we study whether and how the interaction between clients and the service providers contributes to the development of capabilities in service provider firms. In situations where such a contribution occurs, we analyze how different types of activities in the production process...... of the services, such as sequential or reciprocal task activities, influence the development of different types of capabilities. We study five cases of offshore-outsourced knowledge-intensive business services that are distinguished according to their reciprocal or sequential task activities in their production...... process. We find that clients influence the development of human capital capabilities and management capabilities in reciprocally produced services, while in sequentially produced services clients influence the development of organizational capital capabilities and management capital capabilities.

  20. Achieving Provider Engagement

    Science.gov (United States)

    Greenfield, Geva; Pappas, Yannis; Car, Josip; Majeed, Azeem; Harris, Matthew

    2014-01-01

    The literature on integrated care is limited with respect to practical learning and experience. Although some attention has been paid to organizational processes and structures, not enough is paid to people, relationships, and the importance of these in bringing about integration. Little is known, for example, about provider engagement in the organizational change process, how to obtain and maintain it, and how it is demonstrated in the delivery of integrated care. Based on qualitative data from the evaluation of a large-scale integrated care initiative in London, United Kingdom, we explored the role of provider engagement in effective integration of services. Using thematic analysis, we identified an evolving engagement narrative with three distinct phases: enthusiasm, antipathy, and ambivalence, and argue that health care managers need to be aware of the impact of professional engagement to succeed in advancing the integrated care agenda. PMID:25212855

  1. Detailing of deformation processes in polymeric crystals

    Science.gov (United States)

    Slutsker, A. I.; Vettegren', V. I.; Kulik, V. B.; Hilarov, V. L.; Polikarpov, Yu. I.; Karov, D. D.

    2015-11-01

    Structural changes in polymer crystals (polyethylene, polyimide, and others) have been studied using the X-ray diffraction and Raman spectroscopy methods under different influences: tensile loading along the chain molecule axis and heating from 90 to 350 K. An increase in the molecule axial length under loading and a decrease in the molecule axial length upon heating have been identified and measured using X-ray diffraction. A decrease in the skeletal vibration frequency during loading and heating has been identified and measured using Raman spectroscopy, which indicates an increase in the molecule contour length in both cases. A technique for determining the change in the polyethylene molecule contour length in the crystal from the measured change in the skeletal vibration frequency has been justified. The contributions of two components, namely, skeletal (carbon-carbon) bond stretching and the change (an increase during stretching and a decrease during heating) in the angle between skeletal bonds, to the longitudinal deformation of polyethylene crystals, have been quantitatively estimated. It has been shown that the negative thermal expansion (contraction) of the polymer crystal is caused by the dominant contribution of the decrease in the bond angle.

  2. Rotator cuff tear: A detailed update

    Directory of Open Access Journals (Sweden)

    Vivek Pandey

    2015-01-01

    Full Text Available Rotator cuff tear has been a known entity for orthopaedic surgeons for more than two hundred years. Although the exact pathogenesis is controversial, a combination of intrinsic factors proposed by Codman and extrinsic factors theorized by Neer is likely responsible for most rotator cuff tears. Magnetic resonance imaging remains the gold standard for the diagnosis of rotator cuff tears, but the emergence of ultrasound has revolutionized the diagnostic capability. Even though mini-open rotator cuff repair is still commonly performed, and results are comparable to arthroscopic repair, all-arthroscopic repair of rotator cuff tear is now fast becoming a standard care for rotator cuff repair. Appropriate knowledge of pathology and healing pattern of cuff, strong and biological repair techniques, better suture anchors, and gradual rehabilitation of postcuff repair have led to good to excellent outcome after repair. As the healing of degenerative cuff tear remains unpredictable, the role of biological agents such as platelet-rich plasma and stem cells for postcuff repair augmentation is still under evaluation. The role of scaffolds in massive cuff tear is also being probed.

  3. Imaging of the hip and bony pelvis. Techniques and applications

    Energy Technology Data Exchange (ETDEWEB)

    Davies, A.M. [Royal Orthopaedic Hospital, Birmingham (United Kingdom). MRI Centre; Johnson, K.J. [Princess of Wales Birmingham Children's Hospital (United Kingdom); Whitehouse, R.W. (eds.) [Manchester Royal Infirmary (United Kingdom). Dept. of Clinical Radiology

    2006-07-01

    This is a comprehensive textbook on imaging of the bony pelvis and hip joint that provides a detailed description of the techniques and imaging findings relevant to this complex anatomical region. In the first part of the book, the various techniques and procedures employed for imaging the pelvis and hip are discussed in detail. The second part of the book documents the application of these techniques to the diverse clinical problems and diseases encountered. Among the many topics addressed are congenital and developmental disorders including developmental dysplasia of the hip, irritable hip and septic arthritis, Perthes' disease and avascular necrosis, slipped upper femoral epiphysis, bony and soft tissue trauma, arthritis, tumours and hip prostheses. Each chapter is written by an acknowledged expert in the field, and a wealth of illustrative material is included. This book will be of great value to musculoskeletal and general radiologists, orthopaedic surgeons and rheumatologists. (orig.)

  4. The Effects of Seductive Details in an Inflatable Planetarium

    Science.gov (United States)

    Gillette, Sean

    Astronomy is becoming a forgotten science, which is evident by its relatively low enrollment figures compared to biology, chemistry, and physics. A portable inflatable planetarium brings relevance back to astronomy and offers support to students and educators by simulating realistic astronomical environments. This study sought to determine if learning is improved in an inflatable planetarium by adhering to the design principles of the cognitive theory of multimedia learning (CTML), specifically the coherence principle, in an authentic classroom. Two groups of 5th grade students of similar ability were purposefully assigned using a 1-teacher-to-many-students format with mean lesson lengths of 34 minutes. The experimental group was differentiated with seductive details, defined as interesting but irrelevant facts that can distract learning. The control group ( n = 28), with seductive details excluded, outperformed the experimental group (n = 28), validating the coherence principle and producing a Cohen's effect size of medium practical significance (d = 0.4). These findings suggest that CTML, when applied to planetarium instruction, does increase student learning and that seductive details do have a negative effect on learning. An adult training project was created to instruct educators on the benefits of CTML in astronomy education. This study leads to positive social change by highlighting astronomy education while providing educators with design principles of CTML in authentic settings to maximize learning, aid in the creation of digital media (astronomical simulations/instructional lessons for planetariums) and provide valuable training for owners of inflatable planetariums with the eventual goal of increasing student enrollment of astronomy courses at the local level.

  5. Dynamic Simulations of Combined Transmission and Distribution Systems using Parallel Processing Techniques

    OpenAIRE

    Aristidou, P; Van Cutsem, T

    2014-01-01

    Simulating a power system with both transmission and distribution networks modeled in detail is a huge computational challenge. In this paper, we propose a Schur-complement-based domain decomposition algorithm to provide accurate, detailed dynamic simulations of the combined system. The simulation procedure is accelerated with the use of parallel programming techniques, taking advantage of the parallelization opportunities inherent in domain decomposition algorithms. The proposed algorithm is...
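
The Schur-complement elimination at the heart of such domain decomposition can be illustrated on a small linear system in which two subdomain interiors couple only through shared boundary variables (a numerical sketch with illustrative random matrices, not the authors' power-system implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, nb = 4, 5, 3   # two subdomain interiors + shared boundary

# Block system: interiors couple only to the boundary, not to each other.
A1 = rng.random((n1, n1)) + n1 * np.eye(n1)   # diagonally dominated, invertible
A2 = rng.random((n2, n2)) + n2 * np.eye(n2)
B1, C1 = rng.random((n1, nb)), rng.random((nb, n1))
B2, C2 = rng.random((n2, nb)), rng.random((nb, n2))
Ab = rng.random((nb, nb)) + nb * np.eye(nb)
b1, b2, bb = rng.random(n1), rng.random(n2), rng.random(nb)

# Schur complement on the boundary: S = Ab - sum_i Ci Ai^{-1} Bi.
# Each subdomain solve is independent, hence parallelizable.
S  = Ab - C1 @ np.linalg.solve(A1, B1) - C2 @ np.linalg.solve(A2, B2)
gb = bb - C1 @ np.linalg.solve(A1, b1) - C2 @ np.linalg.solve(A2, b2)
xb = np.linalg.solve(S, gb)                 # boundary unknowns first
x1 = np.linalg.solve(A1, b1 - B1 @ xb)      # back-substitute per subdomain
x2 = np.linalg.solve(A2, b2 - B2 @ xb)

# Verify against a monolithic solve of the full assembled system.
A = np.block([[A1, np.zeros((n1, n2)), B1],
              [np.zeros((n2, n1)), A2, B2],
              [C1, C2, Ab]])
x = np.linalg.solve(A, np.concatenate([b1, b2, bb]))
print(np.allclose(np.concatenate([x1, x2, xb]), x))   # True
```

The parallel speedup in the paper comes from distributing the independent per-subdomain solves (here the `A1` and `A2` solves) across workers.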

  6. Immunological techniques in viral hepatitis.

    Science.gov (United States)

    Rehermann, Barbara; Naoumov, Nikolai V

    2007-03-01

    The need to quantitate and monitor immune responses of large patient cohorts with standardized techniques is increasing due to the growing range of treatment options for hepatitis B and hepatitis C, the development of combination therapies, and candidate experimental vaccines for HCV. In addition, advances in immunological techniques have provided new tools for detailed phenotypic and functional analysis of cellular immune responses. At present, there is substantial variation in laboratory protocols, reagents, controls and analysis and presentation of results. Standardization of immunological assays would therefore allow better comparison of results amongst individual laboratories and patient cohorts. The EASL-sponsored and AASLD-endorsed Monothematic Conference on Clinical Immunology in Viral Hepatitis was held at the University College London, United Kingdom, Oct 7-8, 2006 to bring together investigators with research experience in clinical immunology of hepatitis B virus (HBV) and hepatitis C virus (HCV) infections for in-depth discussion, critical evaluation and standardization of immunological assays. This report summarizes the information presented and discussed at the conference, but is not intended to represent a consensus statement. Our aim is to highlight topics and issues that were supported by general agreement and those that were controversial, as well as to provide suggestions for future work.

  7. Power System Control Study. Phase I. Integrated Control Techniques. Phase II. Detail Design and System Modeling.

    Science.gov (United States)

    1981-03-01

    single engine system. With this hardware, the power flow from the load management (or power distribution) centers to individual utilization equipments... (assuming all loads are HVDC). Other advantages of HVDC are ease of paralleling multiple power sources and ease of... matrices. There are additional coupling terms present in the case of parallel operation due to currents from two sources flowing through the load.

  8. Report details poverty-population-environment link.

    Science.gov (United States)

    1994-01-01

    This summary reports on the state of environmental conditions in 1993 and is a reprint from an ICPD publication. This summary refers to a Roundtable Meeting held in November 1993, preliminary to the 1994 UN Conference on the Environment and Development, and an environmental report by Mary Berberis. The report identifies five regions with serious environmental degradation and resource depletion (the Bay of Bengal; the former forested uplands of Indonesia, Nepal, the Philippines, and Thailand; the forests of Central America; the arid regions of sub-Saharan Africa; and the small South Pacific island states). These regions are not just beset with environmental problems, but those problems are exacerbated by problems with land supply and use, poverty, waste, and lack of technology. The report emphasizes that a focus solely on population growth issues obscures the urgent demand for dealing with poverty alleviation, land reform, waste reduction, and improved technologies. Environmental degradation is also caused to a great extent by unsustainable patterns of consumption by affluent groups and by the processes of urban expansion, deforestation, and cultivation of marginal lands in both developed and developing countries. Mary Barberis in her summary of the literature on the causes of environmental conditions considers that the most serious environmental damage is generated by conditions of poverty and population pressure. Environmentally unsound practices are supported by inappropriate farming and soil management techniques, unequal access to resources, and government policies. The example of Bangladesh illustrates that urban population growth has occurred mostly in poor areas, and the problems of water supply, drainage, solid waste disposal, and sanitation are compounded by population growth. Rivers and marine fisheries are contaminated by urban discharges, untreated industrial waste, and fertilizers. Increased salinization degrades the land. Harvesting of wood depletes

  9. Providing Compassion through Flow

    Directory of Open Access Journals (Sweden)

    Lydia Royeen

    2015-07-01

    Full Text Available Meg Kral, MS, OTR/L, CLT, is the cover artist for the Summer 2015 issue of The Open Journal of Occupational Therapy. Her untitled piece of art is an oil painting and is a re-creation of a photograph taken while on vacation. Meg is currently supervisor of outpatient services at Rush University Medical Center. She is lymphedema certified and has a specific interest in breast cancer lymphedema. Art and occupational therapy serve similar purposes for Meg: both provide a sense of flow. She values the outcomes, whether it is a piece of art or improved functional status

  10. Microsoft Security Bible A Collection of Practical Security Techniques

    CERN Document Server

    Mullen, Timothy "Thor"

    2011-01-01

    Thor's Microsoft® Security Bible provides a "one-stop-shop" for Microsoft-related security techniques and procedures as applied to the typical deployment of a Microsoft-based infrastructure. The book contains detailed security concepts and methodologies described at every level: Server, Client, Organizational Structure, Platform-specific security options, application specific security (IIS, SQL, Active Directory, etc.) and also includes new, never-before-published security tools complete with source code. Detailed technical information on security processes for all major Microsoft applications

  11. Indirect porcelain veneer technique for restoring intrinsically stained teeth.

    Science.gov (United States)

    Cutbirth, S T

    1992-01-01

    Indirect porcelain veneers are often the ideal restoration for intrinsically stained teeth. This article details a step-by-step procedure for esthetically restoring discolored teeth. Porcelain laminate veneers are often indicated when teeth bleaching or direct composite bonding procedures cannot provide the desired esthetic result. Veneers are more appealing to many patients than full coverage crowns because of the more conservative tooth preparation required. If technique details are followed meticulously and cases are appropriately selected, porcelain veneers are not only durable but also promote marvelous gingival health and may be the most esthetic anterior dental restoration.

  12. A detailed study of patent system for protection of inventions

    Directory of Open Access Journals (Sweden)

    Tulasi G

    2008-01-01

    Full Text Available Creations of the mind are called intellect. Since these creations have commercial value, they are treated as property. Inventions are intellectual property and can be protected by patents, provided the invention is novel, non-obvious, useful and enabled. To ensure fair trade among member countries, the World Trade Organisation proposed the TRIPS agreement. India took the necessary initiative by signing the World Trade Organisation agreement and adapting to global needs. The aim of this article is to enlighten pharmaceutical professionals, especially those in research and development, about planning inventions through a thorough review of prior art, which saves time and money. A thorough understanding is made possible by providing details of the origin and the present governing bodies and their roles, along with the Act that safeguards the patent system.

  13. Detailed Burnup Calculations for Research Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Leszczynski, F. [Centro Atomico Bariloche (CNEA), 8400 S. C. de Bariloche (Argentina)

    2011-07-01

    A general method (RRMCQ) has been developed by introducing a microscopic burnup scheme which uses the Monte Carlo calculated spatial power distribution of a research reactor core, together with a depletion code, as a basis for solving the nuclide material balance equations in each spatial region into which the system is divided. Continuous-energy cross-section libraries and the full 3D geometry of the system are input for the calculations. The resulting predictions for the system at successive burnup time steps are thus based on a calculation route in which both geometry and cross-sections are accurately represented, without geometry simplifications and with continuous-energy data. The main advantage of this method over the classical deterministic methods currently in use is that the RRMCQ system is a direct 3D method, free of the limitations and errors introduced by the homogenization of geometry and condensation of energy in deterministic methods. The Monte Carlo and burnup codes adopted so far are the widely used MCNP5 and ORIGEN2 codes, but other codes can be used as well. This method requires a well-characterized set of nuclear data for the isotopes involved in the burnup chains, including burnable poisons, fission products and actinides. To fix the data to be included in this set, a study of the present status of nuclear data was performed as part of the development of the RRMCQ method. This study begins with a review of the available cross-section data for isotopes involved in the burnup chains of research nuclear reactors. The main data needs for burnup calculations are neutron cross-sections, decay constants, branching ratios, fission energies and yields. The present work includes results of selected experimental benchmarks and conclusions about the sensitivity of burnup calculations to different sets of cross-section data, using some of the main available evaluated nuclear data files.
Basically, the RRMCQ detailed burnup method includes four
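
    The nuclide material balance step described above reduces, for a linear transmutation chain with constant one-group reaction rates over a time step, to the classical Bateman solution. Below is a minimal pure-Python sketch of that analytic solution; the chain and the effective removal constants (lumping decay and flux-weighted absorption) are hypothetical illustrations, not the RRMCQ implementation, and distinct constants are assumed.

    ```python
    import math

    def bateman(n0, lambdas, t):
        """Analytic Bateman solution for a linear decay/transmutation chain.

        n0: initial atoms of the first chain member; lambdas: effective removal
        constants per nuclide (assumed distinct); t: elapsed time.
        Returns the atom count of each chain member at time t.
        """
        n = []
        for i in range(len(lambdas)):
            total = 0.0
            for j in range(i + 1):
                # product over the other constants in the partial-fraction term
                denom = 1.0
                for k in range(i + 1):
                    if k != j:
                        denom *= (lambdas[k] - lambdas[j])
                total += math.exp(-lambdas[j] * t) / denom
            # production coefficient: product of upstream removal constants
            coeff = n0
            for k in range(i):
                coeff *= lambdas[k]
            n.append(coeff * total)
        return n
    ```

    For a single nuclide this collapses to simple exponential decay, which makes the sketch easy to sanity-check against hand calculation.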

  14. Using data mining techniques for building fusion models

    Science.gov (United States)

    Zhang, Zhongfei; Salerno, John J.; Regan, Maureen A.; Cutler, Debra A.

    2003-03-01

    Over the past decade many techniques have been developed which attempt to predict possible events through the use of given models or patterns of activity. These techniques work quite well when one has a model or a valid representation of the activity. In reality, however, this is usually not the case. Many existing models were hand-crafted, required many man-hours to develop, and are brittle in the dynamic world in which we live. Data mining techniques have shown some promise in providing a set of solutions. In this paper we provide the details of our motivation, theory and the techniques we have developed, as well as the results of a set of experiments.

  15. Devil in the Details? Developmental Dyslexia and Visual Long-Term Memory for Details

    Directory of Open Access Journals (Sweden)

    Lynn Huestegge

    2014-07-01

    Full Text Available Cognitive theories on causes of developmental dyslexia can be divided into language-specific and general accounts. While the former assume that words are special in that associated processing problems are rooted in language-related cognition (e.g., phonology deficits), the latter propose that dyslexia is rather rooted in a general impairment of cognitive (e.g., visual and/or auditory) processing streams. In the present study, we examined to what extent dyslexia (typically characterized by poor orthographic representations) may be associated with a general deficit in visual long-term memory for details. We compared object- and detail-related visual long-term memory performance (and phonological skills) between dyslexic primary school children and IQ-, age- and gender-matched controls. The results revealed that while the overall amount of long-term memory errors was comparable between groups, dyslexic children exhibited a greater portion of detail-related errors. The results suggest that not only phonological, but also general visual resolution deficits in long-term memory may play an important role in developmental dyslexia.

  16. Trajectories for Novel and Detailed Traffic Information

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Torp, Kristian

    2012-01-01

    the central metric free-flow speed from trajectories, instead of using point-based measurements such as induction-loops. This free-flow speed is widely used to compute and monitor the congestion level. The paper argues that the actual travel-time is a more accurate metric. The paper suggests a novel approach...... are correctly coordinated, and navigational device manufacturers to advice drivers in real-time on expected behavior of signalized intersections. The main conclusion is that trajectories can provide novel insight into the actual traffic situation that is not possible using existing approaches. Further...... to analyzing individual intersections that enables traffic analysts to compute queue lengths and estimated time to pass an intersection. Finally, the paper uses associative rule mining for evaluating green waves on road stretches. Such information can be used to verify that signalized intersections...
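
    As a rough illustration of deriving the free-flow metric from trajectories rather than induction loops, the sketch below takes per-vehicle (timestamp, position) samples along a stretch and returns a high percentile of the observed segment speeds. The 85th-percentile convention and the flat data layout are illustrative assumptions, not the paper's method.

    ```python
    def free_flow_speed(trajectories, percentile=0.85):
        """Estimate free-flow speed from vehicle trajectories.

        Each trajectory is a list of (timestamp_s, position_m) samples along a
        road stretch. Segment speeds are computed between consecutive samples
        and the requested percentile of the sorted speeds is returned.
        """
        speeds = []
        for traj in trajectories:
            for (t0, x0), (t1, x1) in zip(traj, traj[1:]):
                if t1 > t0:  # skip duplicate or out-of-order timestamps
                    speeds.append((x1 - x0) / (t1 - t0))
        if not speeds:
            return None
        speeds.sort()
        idx = min(int(percentile * len(speeds)), len(speeds) - 1)
        return speeds[idx]
    ```

    The same per-vehicle segments could be aggregated into actual travel times per road link, the metric the paper argues is more accurate.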

  17. Unstable total hip arthroplasty: detailed overview.

    Science.gov (United States)

    Berry, D J

    2001-01-01

    Hip dislocation is one of the most common complications of THA. Good preoperative planning, good postoperative patient education, accurate intraoperative component positioning, rigorous intraoperative testing of hip stability, and good repair of soft tissues during closure all help prevent dislocation. Early postoperative dislocations and first or second dislocations usually are treated with closed reduction and a hip guide brace or hip spica cast, but when dislocation becomes recurrent, surgical treatment usually is needed. When possible, surgical treatment is based on identifying and treating a specific problem leading to the dislocation, such as implant malposition, inadequate soft-tissue tension, or impingement. In selected circumstances, constrained implants or bipolar or tripolar implants provide powerful tools to restore hip stability.

  18. Detailed weather data generator for building simulations

    CERN Document Server

    Adelard, L; Garde, F; Gatina, J -C

    2012-01-01

    Thermal building simulation software needs meteorological files for thermal comfort and energy evaluation studies. Few tools can make significant meteorological data available, such as generated typical years, representative days, or artificial meteorological databases. This paper presents a new software tool, RUNEOLE, used to provide weather data for building applications with a method adapted to all kinds of climates. RUNEOLE associates three modules for the description, modelling and generation of weather data. The statistical description of an existing meteorological database makes typical representative days available and leads to the creation of model libraries. The generation module produces non-existing sequences. The software aims to be usable by researchers and designers through interactivity, ease of use and easy communication. The conceptual basis of this tool is exposed and we propose two examples of applications in building physics for tropical hu...

  19. Persistence of Performance Details in Music and Speech

    OpenAIRE

    Jungers, Melissa K.

    2004-01-01

    What aspects of music and speech are retained in memory? How do remembered performance details influence future performances? This paper focuses on memory for performance details in music and speech and the influence of these elements from perception to performance. Listeners form a memory for a sentence or melody that includes timing and intensity details. These details then influence performance. Musicians persist in the tempo of a melody they have just heard. They also incorporate details ...

  20. RFCM Techniques Chamber Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Provides the capability to develop radio-frequency countermeasure (RFCM) techniques in a controlled environment from 2.0 to 40.0 GHz. The configuration of...

  1. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    National Research Council Canada - National Science Library

    Caescu Stefan Claudiu; Popescu Andrei; Ploesteanu Mara Gabriela

    2011-01-01

    .... Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization...

  2. What HERA May Provide?

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hannes; /DESY; De Roeck, Albert; /CERN; Bartels, Jochen; /Hamburg U., Inst. Theor. Phys. II; Behnke, Olaf; Blumlein, Johannes; /DESY; Brodsky, Stanley; /SLAC /Durham U., IPPP; Cooper-Sarkar, Amanda; /Oxford U.; Deak, Michal; /DESY; Devenish, Robin; /Oxford U.; Diehl, Markus; /DESY; Gehrmann, Thomas; /Zurich U.; Grindhammer, Guenter; /Munich, Max Planck Inst.; Gustafson, Gosta; /CERN /Lund U., Dept. Theor. Phys.; Khoze, Valery; /Durham U., IPPP; Knutsson, Albert; /DESY; Klein, Max; /Liverpool U.; Krauss, Frank; /Durham U., IPPP; Kutak, Krzysztof; /DESY; Laenen, Eric; /NIKHEF, Amsterdam; Lonnblad, Leif; /Lund U., Dept. Theor. Phys.; Motyka, Leszek; /Hamburg U., Inst. Theor. Phys. II /Birmingham U. /Southern Methodist U. /DESY /Piemonte Orientale U., Novara /CERN /Paris, LPTHE /Hamburg U. /Penn State U.

    2011-11-10

    More than 100 people participated in a discussion session at the DIS08 workshop on the topic of what HERA may provide. A summary of the discussion, with a structured outlook and a list of desirable measurements and theory calculations, is given. The HERA accelerator and the HERA experiments H1, HERMES and ZEUS stopped running at the end of June 2007, after 15 years of very successful operation since the first collisions in 1992. A total luminosity of ~500 pb^-1 was accumulated by each of the collider experiments H1 and ZEUS. Over the years, the increasingly well-understood and upgraded detectors and HERA accelerator contributed significantly to this success. The physics program remains in full swing, and plenty of new results were presented at DIS08 which are approaching the anticipated final precision, fulfilling and exceeding the physics plans and the projections of the upgrade program. Most of the analyses presented at DIS08 were still based on the so-called HERA I data sample, i.e. data taken until 2000, before the shutdown for the luminosity upgrade. This sample has an integrated luminosity of ~100 pb^-1, and the four-times-larger statistics sample from HERA II is still being analyzed.

  3. An Experimental Investigation of Improving Human Problem-Solving Performance by Guiding Attention and Adaptively Providing Details on Information Displays

    Science.gov (United States)

    2007-04-01

    SUBJECT TERMS: cognitive model, empirical studies, information display design, eye tracking. ...based eye-tracking methods. We conducted a second experiment to investigate the strategies of successful and unsuccessful problem solvers in tasks... The eye-tracking laboratory at Auburn is one of the very few research groups in the United States investigating the application of eye tracking to information displays.

  4. Weighted simultaneous iterative reconstruction technique for single-axis tomography

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, D., E-mail: Daniel.Wolf@Triebenberg.de; Lubk, A.; Lichte, H.

    2014-01-15

    Tomographic techniques play a crucial role in imaging methods such as transmission electron microscopy (TEM) due to their unique capabilities to reconstruct three-dimensional object information. However, the accuracy of the two standard tomographic reconstruction techniques, the weighted back-projection (W-BP) and the simultaneous iterative reconstruction technique (SIRT) is reduced under common experimental restrictions, such as limited tilt range or noise. We demonstrate that the combination of W-BP and SIRT leads to an improved tomographic reconstruction technique: the weighted SIRT. Convergence, resolution and reconstruction error of the W-SIRT are analyzed by a detailed analytical, numerical, and experimental comparison with established methods. Our reconstruction technique is not restricted to TEM tomography but can be applied to all problems sharing single axis imaging geometry. - Highlights: • A new tomographic reconstruction technique W-SIRT was developed. • W-SIRT provides better convergence, higher resolution and smaller reconstruction error compared to established tomographic techniques. • This is demonstrated by a detailed analytical, numerical, and experimental comparison.
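
    For readers unfamiliar with the SIRT family, the toy sketch below applies the classical simultaneous update to a small linear system A x = b, where rows of A represent ray sums over pixels. This shows only the standard SIRT normalization by row and column sums; the paper's back-projection weighting (the "W" in W-SIRT) is deliberately omitted, so this is background, not the proposed method.

    ```python
    def sirt(A, b, n_iter=200):
        """Classical SIRT iteration for a small tomography-style system A x = b.

        A: list of rows (ray weights over pixels), b: measured ray sums.
        Update: x += C^-1 A^T R^-1 (b - A x), with R = row sums, C = column sums.
        """
        m, n = len(A), len(A[0])
        x = [0.0] * n
        row_sums = [sum(row) or 1.0 for row in A]
        col_sums = [sum(A[i][j] for i in range(m)) or 1.0 for j in range(n)]
        for _ in range(n_iter):
            # residual of each ray, normalized by its row sum
            r = [(b[i] - sum(A[i][j] * x[j] for j in range(n))) / row_sums[i]
                 for i in range(m)]
            # simultaneous back-projection, normalized by column sums
            for j in range(n):
                x[j] += sum(A[i][j] * r[i] for i in range(m)) / col_sums[j]
        return x
    ```

    On a consistent system the iterates converge to the solution; with limited tilt range or noise the normalizations control how measurement errors are spread over the reconstruction, which is where the weighted variant differs.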

  5. Towards a detailed soot model for internal combustion engines

    Energy Technology Data Exchange (ETDEWEB)

    Kraft, Markus; Mosbach, Sebastian; Celnik, Matthew S. [Univ. of Cambridge (United Kingdom); Zhang, Hongzhi R. [Univ. of Utah (United States); Kubo, Shuichi; Kim, Kyoung-Oh [Toyota (Japan)

    2008-07-01

    In this work, we integrate previously developed models for engine combustion and soot formation. Namely, we combine the stochastic reactor model (SRM) engine code, which uses detailed chemistry and takes into account convective heat transfer and turbulent mixing, with SWEEP, a population balance solver based on a Monte Carlo method. In order to couple the two codes, a detailed chemical kinetic mechanism describing the combustion of primary reference fuels (PRFs) is extended to include small Polycyclic Aromatic Hydrocarbons (PAHs) such as pyrene, which function as soot precursor species for particle inception in the soot model. The integrated model provides not only averaged quantities as functions of crank angle like soot mass, volume fraction, aggregate diameter, and the number of primary particles per aggregate for example, but also more detailed information such as aggregate and primary particle size distribution functions. In addition, specifics about aggregate structure and composition, including C/H ratio and PAH ring count distributions, and images similar to those produced with transmission electron microscopes (TEMs), can be obtained. The combined model is applied to simulate an n-heptane fuelled homogeneous charge compression ignition (HCCI) engine which is operated at an equivalence ratio of 1.93. In-cylinder pressure and heat release predictions show satisfactory agreement with measurements. Furthermore, simulated aggregate size distributions as well as their time evolution are found to qualitatively agree with those obtained experimentally through snatch sampling. It is also seen both in the experiment as well as in the simulation that aggregates in the trapped residual gases play a vital role in the soot formation process. (orig.)

  6. Surgical technique for lung retransplantation in the mouse

    OpenAIRE

    Li, Wenjun; Goldstein, Daniel R.; Bribriesco, Alejandro C.; Nava, Ruben G.; Spahn, Jessica H.; Wang, Xingan; Gelman, Andrew E.; Krupnick, Alexander S.; Kreisel, Daniel

    2013-01-01

    Microsurgical cuff techniques for orthotopic vascularized murine lung transplantation have allowed for the design of studies that examine mechanisms contributing to the high failure rate of pulmonary grafts. Here, we provide a detailed technical description of orthotopic lung retransplantation in mice, which we have thus far performed in 144 animals. The total time of the retransplantation procedure is approximately 55 minutes, 20 minutes for donor harvest and 35 minutes for the implantation,...

  7. Process sequence optimization for digital microfluidic integration using EWOD technique

    Science.gov (United States)

    Yadav, Supriya; Joyce, Robin; Sharma, Akash Kumar; Sharma, Himani; Sharma, Niti Nipun; Varghese, Soney; Akhtar, Jamil

    2016-04-01

    Micro/nano-fluidic MEMS biosensors are devices that detect biomolecules. These emerging micro/nano-fluidic devices provide high throughput and high repeatability with very low response time and reduced device cost compared to traditional devices. This article presents the experimental details of process sequence optimization for digital microfluidics (DMF) using electrowetting-on-dielectric (EWOD). Stress-free thick-film deposition of silicon dioxide using PECVD and the subsequent processes for the EWOD technique have been optimized in this work.

  8. FIM measurement properties and Rasch model details.

    Science.gov (United States)

    Wright, B D; Linacre, J M; Smith, R M; Heinemann, A W; Granger, C V

    1997-12-01

    To summarize, we take issue with the criticisms of Dickson & Köhler for two main reasons: 1. Rasch analysis provides a model from which to approach the analysis of the FIM, an ordinal scale, as an interval scale. The existence of examples of items or individuals which do not fit the model does not disprove the overall efficacy of the model; and 2. the principal components analysis of FIM motor items as presented by Dickson & Köhler tends to undermine rather than support their argument. Their own analyses produce a single major factor explaining between 58.5 and 67.1% of the variance, depending upon the sample, with secondary factors explaining much less variance. Finally, analysis of item response, or latent trait, is a powerful method for understanding the meaning of a measure. However, it presumes that item scores are accurate. Another concern is that Dickson & Köhler do not address the issue of reliability of scoring the FIM items on which they report, a critical point in comparing results. The Uniform Data System for Medical Rehabilitation (UDSMRSM) expends extensive effort in the training of clinicians of subscribing facilities to score items accurately. This is followed up with a credentialing process. Phase 1 involves the testing of individual clinicians who are submitting data to determine if they have achieved mastery over the use of the FIM instrument. Phase 2 involves examining the data for outlying values. When Dickson & Köhler investigate more carefully the application of the Rasch model to their FIM data, they will discover that the results presented in their paper support rather than contradict their application of the Rasch model! This paper is typical of supposed refutations of Rasch model applications. Dickson & Köhler will find that idiosyncrasies in their data and misunderstandings of the Rasch model are the only basis for a claim to have disproven the relevance of the model to FIM data. The Rasch model is a mathematical theorem (like
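
    As background to the discussion above, the Rasch model expresses the probability of passing an item as a logistic function of the difference between person ability and item difficulty, both on a common logit (interval) scale; this is what allows an ordinal instrument like the FIM to be analyzed as an interval measure. A minimal illustrative sketch (not UDSMR's scoring software):

    ```python
    import math

    def rasch_probability(theta, b):
        """Rasch model: probability that a person of ability theta succeeds on a
        dichotomous item of difficulty b, both on the same logit scale."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))
    ```

    At theta == b the probability is exactly 0.5, and swapping ability and difficulty mirrors the probability about 0.5, which makes the interval interpretation of the logit scale concrete.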

  9. Principles of fluorescence techniques

    CERN Document Server

    2016-01-01

    Fluorescence techniques are being used and applied increasingly in academics and industry. The Principles of Fluorescence Techniques course will outline the basic concepts of fluorescence techniques and the successful utilization of the currently available commercial instrumentation. The course is designed for students who utilize fluorescence techniques and instrumentation and for researchers and industrial scientists who wish to deepen their knowledge of fluorescence applications. Key scientists in the field will deliver theoretical lectures. The lectures will be complemented by the direct utilization of steady-state and lifetime fluorescence instrumentation and confocal microscopy for FLIM and FRET applications provided by leading companies.

  10. Hyphenated analytical techniques for materials characterisation

    Science.gov (United States)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  11. Detailed gravity anomalies from GEOS-3 satellite altimetry data

    Science.gov (United States)

    Gopalapillai, G. S.; Mourad, A. G.

    1978-01-01

    A technique for deriving mean gravity anomalies from dense altimetry data was developed, combining deterministic and statistical approaches. The basic mathematical model is based on Stokes' equation, which describes the analytical relationship between mean gravity anomalies and the geoid undulation at a point; this undulation is a linear function of the altimetry data at that point. The overdetermined problem resulting from the abundance of available altimetry data was solved using least-squares principles. These principles enable the simultaneous estimation of the associated standard deviations, reflecting the internal consistency based on the accuracy estimates provided for the altimetry data as well as for the terrestrial anomaly data. Several test computations of the anomalies and their accuracy estimates were made using GEOS-3 data.
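
    The least-squares step described above can be illustrated with a generic weighted normal-equations solver. This is a hedged sketch of the estimation machinery only: the design matrix, observations and weights below are placeholders, not the actual discretization of Stokes' equation used in the paper.

    ```python
    def weighted_least_squares(A, y, w):
        """Solve the weighted normal equations (A^T W A) x = A^T W y.

        A: design matrix (list of rows), y: observations, w: per-observation
        weights (e.g. inverse variances from the data accuracy estimates).
        Returns the parameter estimate x via Gaussian elimination.
        """
        m, n = len(A), len(A[0])
        # normal matrix N = A^T W A and right-hand side u = A^T W y
        N = [[sum(w[k] * A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
             for i in range(n)]
        u = [sum(w[k] * A[k][i] * y[k] for k in range(m)) for i in range(n)]
        # Gaussian elimination with partial pivoting
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(N[r][col]))
            N[col], N[piv] = N[piv], N[col]
            u[col], u[piv] = u[piv], u[col]
            for r in range(col + 1, n):
                f = N[r][col] / N[col][col]
                for c in range(col, n):
                    N[r][c] -= f * N[col][c]
                u[r] -= f * u[col]
        # back substitution
        x = [0.0] * n
        for i in range(n - 1, -1, -1):
            x[i] = (u[i] - sum(N[i][j] * x[j] for j in range(i + 1, n))) / N[i][i]
        return x
    ```

    In the paper's setting, the inverse of the normal matrix would additionally supply the standard deviations that are estimated alongside the anomalies.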

  12. Multimillion-cell SAGD models, opportunity for detailed field analysis

    Energy Technology Data Exchange (ETDEWEB)

    Akram, F. [Schlumberger Canada Limited (Canada)

    2011-07-01

    In the heavy oil industry, steam assisted gravity drainage (SAGD) is a new thermal oil recovery process used to enhance oil recovery. However, operators have trouble reaching their objectives. A better understanding of reservoir response is necessary to reach the desired production rates. The aim of this paper is to present a new methodology for carrying out a reservoir simulation study. A multimillion-cell model was developed with realistic properties, structural complexity and heterogeneities; both full field and thermal simulations were conducted. Results showed that the computing technology thus developed offers greater accuracy in less time than is required when using conventional techniques. In addition, this methodology provides a higher degree of confidence in the results by bridging the gap between geology and engineering. The paper is a first step in producing a framework and the methodology presented herein will need to be further tested to confirm its benefits over conventional techniques.

  13. Tools & techniques--statistics: propensity score techniques.

    Science.gov (United States)

    da Costa, Bruno R; Gahl, Brigitta; Jüni, Peter

    2014-10-01

    Propensity score (PS) techniques are useful if the number of potential confounding pretreatment variables is large and the number of analysed outcome events is rather small so that conventional multivariable adjustment is hardly feasible. Only pretreatment characteristics should be chosen to derive PS, and only when they are probably associated with outcome. A careful visual inspection of PS will help to identify areas of no or minimal overlap, which suggests residual confounding, and trimming of the data according to the distribution of PS will help to minimise residual confounding. Standardised differences in pretreatment characteristics provide a useful check of the success of the PS technique employed. As with conventional multivariable adjustment, PS techniques cannot account for confounding variables that are not or are only imperfectly measured, and no PS technique is a substitute for an adequately designed randomised trial.
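
    The standardized-difference balance check mentioned above is straightforward to compute. A sketch, assuming the usual definition d = (mean_t - mean_c) / sqrt((var_t + var_c) / 2) with sample variances; values near zero after PS adjustment suggest the covariate is balanced.

    ```python
    import math

    def standardized_difference(treated, control):
        """Standardized difference of one pretreatment covariate between the
        treated and control groups (each a list of at least two values)."""
        def mean(xs):
            return sum(xs) / len(xs)

        def var(xs):  # sample variance
            m = mean(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        pooled = math.sqrt((var(treated) + var(control)) / 2.0)
        return (mean(treated) - mean(control)) / pooled
    ```

    A common rule of thumb treats |d| below roughly 0.1 as acceptable balance, though that threshold is a convention rather than part of the definition.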

  14. Detailed description of oil shale organic and mineralogical heterogeneity via Fourier transform infrared microscopy

    Science.gov (United States)

    Washburn, Kathryn E.; Birdwell, Justin E.; Foster, Michael; Gutierrez, Fernando

    2015-01-01

    Mineralogical and geochemical information on reservoir and source rocks is necessary to assess and produce from petroleum systems. The standard methods in the petroleum industry for obtaining these properties are bulk measurements on homogenized, generally crushed, and pulverized rock samples and can take from hours to days to perform. New methods using Fourier transform infrared (FTIR) spectroscopy have been developed to more rapidly obtain information on mineralogy and geochemistry. However, these methods are also typically performed on bulk, homogenized samples. We present a new approach to rock sample characterization incorporating multivariate analysis and FTIR microscopy to provide non-destructive, spatially resolved mineralogy and geochemistry on whole rock samples. We are able to predict bulk mineralogy and organic carbon content within the same margin of error as standard characterization techniques, including X-ray diffraction (XRD) and total organic carbon (TOC) analysis. Validation of the method was performed using two oil shale samples from the Green River Formation in the Piceance Basin with differing sedimentary structures. One sample represents laminated Green River oil shales, and the other is representative of oil shale breccia. The FTIR microscopy results on the oil shales agree with XRD and LECO TOC data from the homogenized samples but also give additional detail regarding sample heterogeneity by providing information on the distribution of mineral phases and organic content. While measurements for this study were performed on oil shales, the method could also be applied to other geological samples, such as other mudrocks, complex carbonates, and soils.

  15. Emerging optical nanoscopy techniques

    Directory of Open Access Journals (Sweden)

    Montgomery PC

    2015-09-01

    Full Text Available Paul C Montgomery, Audrey Leong-Hoi Laboratoire des Sciences de l'Ingénieur, de l'Informatique et de l'Imagerie (ICube), Unistra-CNRS, Strasbourg, France Abstract: To face the challenges of modern health care, new imaging techniques with subcellular resolution or detection over wide fields are required. Far field optical nanoscopy presents many new solutions, providing high resolution or detection at high speed. We present a new classification scheme to help appreciate the growing number of optical nanoscopy techniques. We underline an important distinction between superresolution techniques that provide improved resolving power and nanodetection techniques for characterizing unresolved nanostructures. Some of the emerging techniques within these two categories are highlighted with applications in biophysics and medicine. Recent techniques employing wider angle imaging by digital holography and scattering lens microscopy allow superresolution to be achieved for subcellular and even in vivo, imaging without labeling. Nanodetection techniques are divided into four subcategories using contrast, phase, deconvolution, and nanomarkers. Contrast enhancement is illustrated by means of a polarized light-based technique and with strobed phase-contrast microscopy to reveal nanostructures. Very high sensitivity phase measurement using interference microscopy is shown to provide nanometric surface roughness measurement or to reveal internal nanometric structures. Finally, the use of nanomarkers is illustrated with stochastic fluorescence microscopy for mapping intracellular structures. We also present some of the future perspectives of optical nanoscopy. Keywords: microscopy, imaging, superresolution, nanodetection, biophysics, medical imaging

  16. Emerging optical nanoscopy techniques

    Science.gov (United States)

    Montgomery, Paul C; Leong-Hoi, Audrey

    2015-01-01

    To face the challenges of modern health care, new imaging techniques with subcellular resolution or detection over wide fields are required. Far field optical nanoscopy presents many new solutions, providing high resolution or detection at high speed. We present a new classification scheme to help appreciate the growing number of optical nanoscopy techniques. We underline an important distinction between superresolution techniques that provide improved resolving power and nanodetection techniques for characterizing unresolved nanostructures. Some of the emerging techniques within these two categories are highlighted with applications in biophysics and medicine. Recent techniques employing wider angle imaging by digital holography and scattering lens microscopy allow superresolution to be achieved for subcellular and even in vivo, imaging without labeling. Nanodetection techniques are divided into four subcategories using contrast, phase, deconvolution, and nanomarkers. Contrast enhancement is illustrated by means of a polarized light-based technique and with strobed phase-contrast microscopy to reveal nanostructures. Very high sensitivity phase measurement using interference microscopy is shown to provide nanometric surface roughness measurement or to reveal internal nanometric structures. Finally, the use of nanomarkers is illustrated with stochastic fluorescence microscopy for mapping intracellular structures. We also present some of the future perspectives of optical nanoscopy. PMID:26491270

  17. Organization and Detailed Parcellation of Human Hippocampal Head and Body Regions Based on a Combined Analysis of Cyto- and Chemoarchitecture.

    Science.gov (United States)

    Ding, Song-Lin; Van Hoesen, Gary W

    2015-10-15

    The hippocampal formation (HF) is one of the hottest regions in neuroscience because it is critical to learning, memory, and cognition, while being vulnerable to many neurological and mental disorders. With increasing high-resolution imaging techniques, many scientists have started to use distinct landmarks along the anterior-posterior axis of HF to allow segmentation into individual subfields in order to identify specific functions in both normal and diseased conditions. These studies urgently call for more reliable and accurate segmentation of the HF subfields DG, CA3, CA2, CA1, prosubiculum, subiculum, presubiculum, and parasubiculum. Unfortunately, very limited data are available on detailed parcellation of the HF subfields, especially in the complex, curved hippocampal head region. In this study we revealed detailed organization and parcellation of all subfields of the hippocampal head and body regions on the basis of a combined analysis of multiple cyto- and chemoarchitectural stains and dense sequential section sampling. We also correlated these subfields to macro-anatomical landmarks, which are visible on magnetic resonance imaging (MRI) scans. Furthermore, we created three versions of the detailed anatomic atlas for the hippocampal head region to account for brains with four, three, or two hippocampal digitations. These results will provide a fundamental basis for understanding the organization, parcellation, and anterior-posterior difference of human HF, facilitating accurate segmentation and measurement of HF subfields in the human brain on MRI scans.

  18. Scalable and Detail-Preserving Ground Surface Reconstruction from Large 3D Point Clouds Acquired by Mobile Mapping Systems

    Science.gov (United States)

    Craciun, D.; Serna Morales, A.; Deschaud, J.-E.; Marcotegui, B.; Goulette, F.

    2014-08-01

    The currently existing mobile mapping systems equipped with active 3D sensors make it possible to acquire the environment with high sampling rates at high vehicle velocities. While providing an effective solution for environment sensing over large scale distances, such acquisition provides only a discrete representation of the geometry. Thus, a continuous map of the underlying surface must be built. Mobile acquisition introduces several constraints for the state-of-the-art surface reconstruction algorithms. Smoothing becomes a difficult task for recovering sharp depth features while avoiding mesh shrinkage. In addition, interpolation-based techniques are not suitable for noisy datasets acquired by Mobile Laser Scanning (MLS) systems. Furthermore, scalability is a major concern for enabling real-time rendering over large scale distances while preserving geometric details. This paper presents a fully automatic ground surface reconstruction framework capable of dealing with the aforementioned constraints. The proposed method exploits the quasi-flat geometry of the ground through a morphological segmentation algorithm. Then, a planar Delaunay triangulation is applied in order to reconstruct the ground surface. A smoothing procedure eliminates high frequency peaks, while preserving geometric details in order to provide a regular ground surface. Finally, a decimation step is applied in order to cope with scalability constraints over large scale distances. Experimental results on real data acquired in large urban environments are presented, and a performance evaluation with respect to ground truth measurements demonstrates the effectiveness of our method.
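
    The pipeline described above (segment the quasi-flat ground, triangulate in the plane, smooth, decimate) can be sketched for its middle two stages. This is a minimal illustration, not the authors' implementation: the function `reconstruct_ground` and the synthetic point cloud are invented for the example, and plain Laplacian smoothing of elevations stands in for the paper's detail-preserving smoothing; the morphological segmentation and decimation stages are omitted.

```python
import numpy as np
from scipy.spatial import Delaunay

def reconstruct_ground(points_xyz, smooth_iters=5):
    """Planar (2.5D) Delaunay triangulation of quasi-flat ground points,
    followed by Laplacian smoothing of elevations to suppress
    high-frequency peaks while keeping the XY sampling fixed."""
    pts = np.asarray(points_xyz, dtype=float)
    tri = Delaunay(pts[:, :2])            # triangulate in the ground plane
    z = pts[:, 2].copy()
    # build vertex adjacency from the triangle list
    nbrs = [set() for _ in range(len(pts))]
    for a, b, c in tri.simplices:
        nbrs[a].update((b, c)); nbrs[b].update((a, c)); nbrs[c].update((a, b))
    for _ in range(smooth_iters):          # Laplacian smoothing of z only
        z = np.array([z[list(n)].mean() if n else z[i]
                      for i, n in enumerate(nbrs)])
    return tri.simplices, z

# noisy, nearly flat synthetic "ground" patch
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 10.0, size=(200, 2))
pts = np.c_[xy, 0.05 * rng.standard_normal(200)]
faces, z_smooth = reconstruct_ground(pts)
```

In a real MLS pipeline the triangulation would be restricted to points the segmentation labeled as ground, and the smoothed mesh would then be decimated for scalable rendering.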

  19. The Detailed Chemical Abundance Patterns of M31 Globular Clusters

    CERN Document Server

    Colucci, J E; Cohen, J

    2012-01-01

    We present detailed chemical abundances for $>$20 elements in $\sim$30 globular clusters in M31. These results have been obtained using high resolution ($\lambda/\Delta\lambda\sim$24,000) spectra of their integrated light, analyzed using our original method. The globular clusters have galactocentric radii between 2.5 kpc and 117 kpc, and therefore provide abundance patterns for different phases of galaxy formation recorded in the inner and outer halo of M31. We find that the clusters in our survey span a range in metallicity extending down to [Fe/H]$=-2.2$, and that the clusters beyond 20 kpc have a small range in abundance of [Fe/H]$=-1.6 \pm 0.10$. We also measure abundances of alpha, r- and s-process elements. These results constitute the first abundance pattern constraints for old populations in M31 that are comparable to those known for the Milky Way halo.

  20. Detailed High Frequency Models of Various Winding Types in Power Transformers

    DEFF Research Database (Denmark)

    Pedersen, Kenneth; Lunow, Morten Erlandsson; Holbøll, Joachim

    2005-01-01

    Abstract--In this paper, techniques are described which demonstrate how a highly detailed internal transformer model can be obtained systematically with Matlab and how it can be prepared for subsequent transient analysis. The input of such a model will mainly be the description of the cross secti...

  1. Developing Fighting Technique Through Visualization

    Directory of Open Access Journals (Sweden)

    Tim Lajcik

    2012-07-01

    Full Text Available Visualization is a training technique that involves creating a detailed mental “movie” of successful performance. This article describes a type of visualization called “mental rehearsal” and explains how it can be used to reinforce the neuromuscular pattern of proper fighting technique. Drawing on his experience as a professional fighter and college coach, his studies in sport psychology as a college student, and his exposure to mental training techniques at the U.S. Olympic Training Center, the author reveals how to use mental imagery to facilitate the mastery of martial art technique.    

  2. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of the strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand and also on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components, the internal environment that is made from specific variables within the organization and the external environment that is made from variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The special literature emphasizes that the differences in performance from one organization to another depend primarily not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such

  3. Particle Markov Chain Monte Carlo Techniques of Unobserved Component Time Series Models Using Ox

    DEFF Research Database (Denmark)

    Nonejad, Nima

    This paper details Particle Markov chain Monte Carlo techniques for analysis of unobserved component time series models using several economic data sets. PMCMC combines the particle filter with the Metropolis-Hastings algorithm. Overall PMCMC provides a very compelling, computationally fast...
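
    The combination the abstract names, a particle filter supplying a likelihood estimate inside a Metropolis-Hastings sampler, can be sketched in a few lines. The paper works in Ox on unobserved component models; the sketch below is Python, and the local level model, the function names (`pf_loglik`, `pmmh`) and all tuning constants are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def pf_loglik(y, sigma_x, sigma_y, n_part=200):
    """Bootstrap particle filter estimate of log p(y | sigma_x, sigma_y)
    for the local level model x_t = x_{t-1} + N(0, sigma_x^2),
    y_t = x_t + N(0, sigma_y^2)."""
    x = rng.standard_normal(n_part)                   # initial particles
    ll = 0.0
    for yt in y:
        x = x + sigma_x * rng.standard_normal(n_part)           # propagate
        logw = -0.5 * ((yt - x) / sigma_y) ** 2 - np.log(sigma_y)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                    # log-likelihood increment
        x = rng.choice(x, size=n_part, p=w / w.sum())            # resample
    return ll

def pmmh(y, n_iter=200, step=0.2):
    """Particle marginal Metropolis-Hastings over log(sigma_x),
    with sigma_y fixed at 1 and a flat prior, for brevity."""
    theta = 0.0                                       # log sigma_x
    ll = pf_loglik(y, np.exp(theta), 1.0)
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal()
        ll_p = pf_loglik(y, np.exp(prop), 1.0)
        if np.log(rng.uniform()) < ll_p - ll:         # accept/reject
            theta, ll = prop, ll_p
        chain.append(theta)
    return np.array(chain)

# simulate data from the model with sigma_x = sigma_y = 1, then sample
x = np.cumsum(rng.standard_normal(50))
y = x + rng.standard_normal(50)
chain = pmmh(y)
```

The key property PMCMC exploits is that the noisy particle-filter likelihood estimate still yields a Markov chain with the correct posterior as its target.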

  4. A High-Speed Asynchronous Communication Technique for MOS (Metal-Oxide-Semiconductor) VLSI Systems.

    Science.gov (United States)

    1985-12-01

    by a well controlled amount; rather than use an active delay line, the passive delay inherent in the pc board traces could be used. The transmission...in a synchronous system without a detailed analysis of the actual delays involved. The technique provides phase jitter immunity of close to 1/4 of...

  5. The Delphi Technique: A Research Strategy for Career and Technical Education

    Science.gov (United States)

    Stitt-Gohdes, Wanda L.; Crews, Tena B.

    2004-01-01

    Career and technical education research often centers around quantitative research designs. The Delphi Technique provides a structured communication process designed to produce a detailed examination of a topic and/or problem and discussion from the participating group. The contributions of individuals via this tool produce a group perspective not…

  6. Laparoscopic Heller Myotomy and Dor Fundoplication for Esophageal Achalasia: Technique and Perioperative Management.

    Science.gov (United States)

    Andolfi, Ciro; Fisichella, P Marco

    2016-11-01

    Surgical correction of achalasia using laparoscopic Heller myotomy with Dor fundoplication is argued to be the gold standard treatment for patients with achalasia. The goal of this technical report is to illustrate our preferred approach to patients with achalasia and to provide the reader with a detailed description of our operative technique, its rationale, and our pre- and postoperative management.

  7. www.detail.de/english - the research and service platform

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    DETAIL Archive and Download Centre Every year, DETAIL magazine publishes more than 120 outstanding buildings from all over the world, along with interviews, critiques and articles written by authors from different disciplines.

  8. Cloud Storage and its Secure Overlay Techniques

    Directory of Open Access Journals (Sweden)

    Susheel Kumar

    2014-04-01

    Full Text Available In this paper we explain cloud storage in detail: its need, its importance, and its usefulness for the future. Because it provides the flexibility to store data and access it from anywhere, it is cost-effective and beneficial in many settings. The storage of data, however, raises security issues around access to authorized data, and some data in the cloud must be deleted for certain reasons to maintain confidentiality. Many other issues also remain for the cloud to overcome. This paper discusses several techniques to address these problems in cloud communication and gives an overview of the available methods by which data can be secured. Each architecture has its own shortcomings, but without these techniques it is difficult to maintain a good client-server storage mechanism in cloud computing.

  9. Digital audio watermarking fundamentals, techniques and challenges

    CERN Document Server

    Xiang, Yong; Yan, Bin

    2017-01-01

    This book offers comprehensive coverage on the most important aspects of audio watermarking, from classic techniques to the latest advances, from commonly investigated topics to emerging research subdomains, and from the research and development achievements to date, to current limitations, challenges, and future directions. It also addresses key topics such as reversible audio watermarking, audio watermarking with encryption, and imperceptibility control methods. The book sets itself apart from the existing literature in three main ways. Firstly, it not only reviews classical categories of audio watermarking techniques, but also provides detailed descriptions, analysis and experimental results of the latest work in each category. Secondly, it highlights the emerging research topic of reversible audio watermarking, including recent research trends, unique features, and the potentials of this subdomain. Lastly, the joint consideration of audio watermarking and encryption is also reviewed. With the help of this...

  10. Detailed transcriptome atlas of the pancreatic beta cell.

    Science.gov (United States)

    Kutlu, Burak; Burdick, David; Baxter, David; Rasschaert, Joanne; Flamez, Daisy; Eizirik, Decio L; Welsh, Nils; Goodman, Nathan; Hood, Leroy

    2009-01-15

    Gene expression patterns provide a detailed view of cellular functions. Comparison of profiles in disease vs normal conditions provides insights into the processes underlying disease progression. However, availability and integration of public gene expression datasets remains a major challenge. The aim of the present study was to explore the transcriptome of pancreatic islets and, based on this information, to prepare a comprehensive and open access inventory of insulin-producing beta cell gene expression, the Beta Cell Gene Atlas (BCGA). We performed Massively Parallel Signature Sequencing (MPSS) analysis of human pancreatic islet samples and microarray analyses of purified rat beta cells, alpha cells and INS-1 cells, and compared the information with available array data in the literature. MPSS analysis detected around 7600 mRNA transcripts, of which around a third were of low abundance. We identified 2000 and 1400 transcripts that are enriched/depleted in beta cells compared to alpha cells and INS-1 cells, respectively. Microarray analysis identified around 200 transcription factors that are differentially expressed in either beta or alpha cells. We reanalyzed publicly available gene expression data and integrated these results with the new data from this study to build the BCGA. The BCGA contains basal (untreated conditions) gene expression level estimates in beta cells as well as in different cell types in human, rat and mouse pancreas. Hierarchical clustering of expression profile estimates classifies cell types based on species, while beta cells cluster together. Our gene atlas is a valuable source for detailed information on the gene expression distribution in beta cells and pancreatic islets along with insulin producing cell lines. The BCGA tool, as well as the data and code used to generate the Atlas are available at the T1Dbase website (T1DBase.org).

  11. Detailed transcriptome atlas of the pancreatic beta cell

    Directory of Open Access Journals (Sweden)

    Eizirik Decio L

    2009-01-01

    Full Text Available Abstract Background Gene expression patterns provide a detailed view of cellular functions. Comparison of profiles in disease vs normal conditions provides insights into the processes underlying disease progression. However, availability and integration of public gene expression datasets remains a major challenge. The aim of the present study was to explore the transcriptome of pancreatic islets and, based on this information, to prepare a comprehensive and open access inventory of insulin-producing beta cell gene expression, the Beta Cell Gene Atlas (BCGA). Methods We performed Massively Parallel Signature Sequencing (MPSS) analysis of human pancreatic islet samples and microarray analyses of purified rat beta cells, alpha cells and INS-1 cells, and compared the information with available array data in the literature. Results MPSS analysis detected around 7600 mRNA transcripts, of which around a third were of low abundance. We identified 2000 and 1400 transcripts that are enriched/depleted in beta cells compared to alpha cells and INS-1 cells, respectively. Microarray analysis identified around 200 transcription factors that are differentially expressed in either beta or alpha cells. We reanalyzed publicly available gene expression data and integrated these results with the new data from this study to build the BCGA. The BCGA contains basal (untreated conditions) gene expression level estimates in beta cells as well as in different cell types in human, rat and mouse pancreas. Hierarchical clustering of expression profile estimates classifies cell types based on species, while beta cells cluster together. Conclusion Our gene atlas is a valuable source for detailed information on the gene expression distribution in beta cells and pancreatic islets along with insulin producing cell lines. The BCGA tool, as well as the data and code used to generate the Atlas are available at the T1Dbase website (T1DBase.org).
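
    The hierarchical clustering step mentioned in the results can be illustrated on synthetic profiles. This is a hedged sketch, not the BCGA code: the sample counts, noise level, and the choice of average linkage on correlation distance are assumptions, chosen only because correlation-based linkage is a common default for expression data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(7)

# synthetic expression profiles (hypothetical stand-ins for cell types):
# 4 "beta" samples share one profile, 4 "alpha" samples another
n_genes = 500
beta_mean = rng.normal(0, 1, n_genes)
alpha_mean = rng.normal(0, 1, n_genes)
profiles = np.vstack(
    [beta_mean + 0.1 * rng.standard_normal(n_genes) for _ in range(4)] +
    [alpha_mean + 0.1 * rng.standard_normal(n_genes) for _ in range(4)])

# average-linkage clustering on correlation distance, then cut the
# dendrogram into two flat clusters
Z = linkage(profiles, method="average", metric="correlation")
labels = fcluster(Z, t=2, criterion="maxclust")
```

With real data the rows would be per-sample expression estimates, and the dendrogram rather than a flat cut is usually what gets inspected.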

  12. A new technique of ECG analysis and its application to evaluation of disorders during ventricular tachycardia

    Energy Technology Data Exchange (ETDEWEB)

    Moskalenko, A.V. [Institute of Theoretical and Experimental Biophysics RAS, Institutskaya Street, 3, Pushchino 142290 (Russian Federation)], E-mail: info@avmoskalenko.ru; Rusakov, A.V. [Institute of Theoretical and Experimental Biophysics RAS, Institutskaya Street, 3, Pushchino 142290 (Russian Federation); Elkin, Yu.E. [Institute of Mathematical Problems of Biology RAS, Institutskaya Street, 4, Pushchino 142290 (Russian Federation)

    2008-04-15

    We propose a new technique of ECG analysis to characterize the properties of polymorphic ventricular arrhythmias, potentially life-threatening disorders of cardiac activation. The technique is based on extracting two indices from the ECG fragment. The result is a new detailed quantitative description of polymorphic ECGs. Our observations suggest that the proposed ECG processing algorithm provides information that supplements the traditional visual ECG analysis. The estimates of ECG variation in this study reveal some unexpected details of ventricular activation dynamics, which are possibly useful for diagnosing cardiac rhythm disturbances.

  13. Plasma scattering of electromagnetic radiation theory and measurement techniques

    CERN Document Server

    Froula, Dustin H; Luhmann, Neville C Jr; Sheffield, John

    2011-01-01

    This work presents one of the most powerful methods of plasma diagnosis in exquisite detail to guide researchers in the theory and measurement techniques of light scattering in plasmas. Light scattering in plasmas is essential in the research and development of fusion energy, environmental solutions, and electronics. Referred to as the "Bible" by researchers, the work encompasses fusion and industrial applications essential in plasma research. It is the only comprehensive resource specific to the plasma scattering technique. It provides a wide range of experimental examples and discussion of the

  14. Introduction and Application of Kinematic Wave Routing Techniques Using HEC-1.

    Science.gov (United States)

    1979-05-01

    WAVE ROUTING TECHNIQUES USING HEC-1, Johannes J. DeVries. Chapter 2 provides a discussion of the necessary information required to apply kinematic...solution techniques. The following chapter will present additional background information and the details necessary for the effective application of...procedure for verifying its ability to correctly simulate the behavior of a given basin is strongly discouraged.

  15. Patch Clamp: A Powerful Technique for Studying the Mechanism of Acupuncture

    Directory of Open Access Journals (Sweden)

    D. Zhang

    2012-01-01

    Full Text Available Cellular and molecular events can be investigated using electrophysiological techniques. In particular, the patch-clamp method provides detailed information. In addition, the patch-clamp technique has become a powerful method for investigating the mechanisms underlying the effects of acupuncture. In this paper, recent research on how acupuncture might modulate electrophysiological responses in the central nervous system (CNS) and affect peripheral structures is reviewed.

  16. Halving the Casimir force with conductive oxides: experimental details

    CERN Document Server

    de Man, Sven; Iannuzzi, Davide

    2010-01-01

    This work is an extended version of a paper published last year in Physical Review Letters [S. de Man et al., Phys. Rev. Lett. 103, 040402 (2009)], where we presented measurements of the Casimir force between a gold coated sphere and a plate coated with either gold or an indium-tin-oxide (ITO) layer. The experiment, which was performed in air, showed that ITO is sufficiently conducting to prevent charge accumulation, but still transparent enough to halve the Casimir attraction when compared to gold. Here, we report all the experimental details that, due to the limited space available, were omitted in the previous article. We discuss the performance of our setup in terms of stability of the calibration procedure and reproducibility of the Casimir force measurement. We also introduce and demonstrate a new technique to obtain the spring constant of our force sensor. Furthermore, we present a thorough description of the experimental method, a comprehensive explanation of data elaboration and error analysis, and a...

  17. 46 CFR 90.25-1 - Electrical engineering details.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Electrical engineering details. 90.25-1 Section 90.25-1... PROVISIONS General Electrical Engineering Requirements § 90.25-1 Electrical engineering details. (a) All electrical engineering details and installations shall be designed and installed in accordance with...

  18. 46 CFR 24.20-1 - Marine engineering details.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Marine engineering details. 24.20-1 Section 24.20-1... Engineering Requirements § 24.20-1 Marine engineering details. (a) All marine engineering details relative to... 40 feet in length will be found in subchapter F (Marine Engineering) of this chapter. ...

  19. 46 CFR 188.25-1 - Electrical engineering details.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Electrical engineering details. 188.25-1 Section 188.25... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J (Electrical...

  20. 46 CFR 70.25-1 - Electrical engineering details.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 3 2010-10-01 2010-10-01 false Electrical engineering details. 70.25-1 Section 70.25-1... General Electrical Engineering Requirements § 70.25-1 Electrical engineering details. All electrical engineering details and installations shall be designed and installed in accordance with subchapter J...

  1. 46 CFR 90.20-1 - Marine engineering details.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Marine engineering details. 90.20-1 Section 90.20-1... PROVISIONS General Marine Engineering Requirements § 90.20-1 Marine engineering details. (a) All marine engineering details such as piping, valves, fittings, boilers, pressure vessels, etc., and their appurtenances...

  2. 46 CFR 188.20-1 - Marine engineering details.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter. ...

  3. 20 CFR 422.604 - Request for detailed information.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Request for detailed information. 422.604... whom you have premium responsibility, you may request detailed information as to the work histories of any of the listed miners and the basis for the assignment. Your request for detailed information must...

  4. 14 CFR 23.685 - Control system details.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Control system details. 23.685 Section 23... Control Systems § 23.685 Control system details. (a) Each detail of each control system must be designed... cables or tubes against other parts. (d) Each element of the flight control system must have...

  5. 14 CFR 29.685 - Control system details.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Control system details. 29.685 Section 29... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Design and Construction Control Systems § 29.685 Control system details. (a) Each detail of each control system must be designed to prevent jamming, chafing, and...

  6. 14 CFR 25.685 - Control system details.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Control system details. 25.685 Section 25... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Design and Construction Control Systems § 25.685 Control system details. (a) Each detail of each control system must be designed and installed to prevent jamming...

  7. Detailed exploration of the endothelium: parameterization of flow-mediated dilation through principal component analysis.

    Science.gov (United States)

    Laclaustra, Martin; Frangi, Alejandro F; Garcia, Daniel; Boisrobert, Loïc; Frangi, Andres G; Pascual, Isaac

    2007-03-01

    Endothelial dysfunction is associated with cardiovascular diseases and their risk factors (CVRF), and flow-mediated dilation (FMD) is increasingly used to explore it. In this test, artery diameter changes after post-ischaemic hyperaemia are classically quantified using maximum peak vasodilation (FMDc). To obtain more detailed descriptors of FMD we applied principal component analysis (PCA) to diameter-time curves (absolute), vasodilation-time curves (relative) and blood-velocity-time curves. Furthermore, combined PCA of vessel size and blood-velocity curves allowed exploring links between flow and dilation. Vessel diameter data for PCA (post-ischaemic: 140 s) were acquired from brachial ultrasound image sequences of 173 healthy male subjects using a computerized technique previously reported by our team based on image registration (Frangi et al 2003 IEEE Trans. Med. Imaging 22 1458). PCA provides a set of axes (called eigenmodes) that captures the underlying variation present in a database of waveforms so that the first few eigenmodes retain most of the variation. These eigenmodes can be used to synthesize each waveform analysed by means of only a few parameters, as well as potentially any signal of the same type derived from tests of new patients. The eigenmodes obtained seemed related to visual features of the waveform of the FMD process. Subsequently, we used eigenmodes to parameterize our data. Most of the main parameters (13 out of 15) correlated with FMDc. Furthermore, not all parameters correlated with the same CVRF tested, that is, serum lipids (i.e., high LDL-c associated with slow vessel return to a baseline, while low HDL-c associated with a lower vasodilation in response to similar velocity stimulus), thus suggesting that this parameterization allows a more detailed and factored description of the process than FMDc.
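
    The core of the parameterization, PCA applied to a bank of sampled curves so that each waveform is summarized by a few eigenmode scores, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the synthetic 173-curve data set, the Gaussian-bump "FMD" shape, and the function name `pca_waveforms` are assumptions for the example.

```python
import numpy as np

def pca_waveforms(X, k=3):
    """PCA of a bank of sampled waveforms: rows are curves, columns are
    time samples. Returns the mean curve, the first k eigenmodes, and
    the per-curve scores (parameters) along those modes."""
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    modes = Vt[:k]                  # eigenmodes (principal curves)
    scores = (X - mean) @ modes.T   # a few parameters per waveform
    return mean, modes, scores

# synthetic dilation-time curves: a shared shape plus two modes of
# variation (amplitude and timing), one curve per "subject"
t = np.linspace(0, 140, 141)                    # 140 s post-ischaemia
rng = np.random.default_rng(3)
base = np.exp(-((t - 50) / 25) ** 2)            # hypothetical FMD bump
X = np.array([a * base + b * np.gradient(base)  # amplitude + timing shift
              for a, b in rng.normal([1, 0], [0.2, 5], size=(173, 2))])
mean, modes, scores = pca_waveforms(X, k=2)
recon = mean + scores @ modes                   # synthesis from few params
```

Because the synthetic curves live exactly in a two-mode space, two eigenmodes reconstruct them; real FMD curves would need the first few modes to capture most of the variation, and the scores are the candidate descriptors to correlate with CVRF.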

  8. Encoding techniques for complex information structures in connectionist systems

    Science.gov (United States)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    Two general information encoding techniques called relative position encoding and pattern similarity association are presented. They are claimed to be a convenient basis for the connectionist implementation of complex, short term information processing of the sort needed in common sense reasoning, semantic/pragmatic interpretation of natural language utterances, and other types of high level cognitive processing. The relationships of the techniques to other connectionist information-structuring methods, and also to methods used in computers, are discussed in detail. The rich inter-relationships of these other connectionist and computer methods are also clarified. The particular, simple forms are discussed that the relative position encoding and pattern similarity association techniques take in the author's own connectionist system, called Conposit, in order to clarify some issues and to provide evidence that the techniques are indeed useful in practice.

  9. High dynamic range infrared images detail enhancement based on local edge preserving filter

    Science.gov (United States)

    Song, Qiong; Wang, Yuehuan; Bai, Kun

    2016-07-01

    In the field of infrared (IR) image processing, displaying a high dynamic range (HDR) image on a low dynamic range display equipment with a natural visual effect, clear details in local areas and few artifacts is an important issue. In this paper, we present a new approach to display HDR IR images with contrast enhancement. First, the local edge-preserving filter (LEPF) is utilized to separate the image into a base layer and detail layer(s). After the filtering procedure, we use an adaptive Gamma transformation to adjust the gray distribution of the base layer, and stretch the detail layer based on a human visual effect principle. Then, we recombine the detail layer and base layer to obtain the enhanced output. Finally, we adjust the luminance of the output by applying a multiple exposure fusion method. The experimental results demonstrate that our proposed method enhances details more effectively and introduces fewer artifacts than state-of-the-art methods.
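
    The base/detail decomposition described above can be sketched in a few lines. This is not the paper's method: a uniform (box) filter stands in for the local edge-preserving filter, the adaptive gamma and multiple-exposure fusion steps are reduced to a fixed gamma, and the gain values are arbitrary assumptions for the example.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def enhance_hdr_ir(img, gamma=0.6, detail_gain=2.0, r=5):
    """Base/detail decomposition for HDR IR display: base = smoothed
    image, detail = residual; compress the base with a gamma curve,
    amplify the detail, recombine, and map to 8 bits. A box filter
    stands in for an edge-preserving filter (so halos are possible)."""
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    norm = (img - lo) / (hi - lo + 1e-12)        # HDR range -> [0, 1]
    base = uniform_filter(norm, size=2 * r + 1)  # base layer
    detail = norm - base                          # detail layer
    out = np.clip(base ** gamma + detail_gain * detail, 0, 1)
    return (out * 255).astype(np.uint8)          # 8-bit display range

# synthetic 14-bit-style IR frame: smooth gradient plus a faint hot spot
y, x = np.mgrid[0:64, 0:64]
frame = 4000.0 * x / 63 + 200.0 * ((x - 32) ** 2 + (y - 32) ** 2 < 25)
out = enhance_hdr_ir(frame)
```

Replacing the box filter with a genuinely edge-preserving one is exactly what keeps gradient-reversal halos out of the final image.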

  10. 76 FR 66132 - Proposed Collection; Comment Request for Travel Service Provider and Carrier Service Provider...

    Science.gov (United States)

    2011-10-25

    ... persons traveling on direct flights to Cuba and forward that information to carrier service providers, for... collection techniques or other forms of information technology; and (e) estimates of capital or start-up...

  11. Image Enhancement Techniques for Quantitative Investigations of Morphological Features in Cometary Comae: A Comparative Study

    CERN Document Server

    Samarasinha, Nalin

    2014-01-01

    Many cometary coma features are only a few percent above the ambient coma (i.e., the background) and therefore coma enhancement techniques are needed to discern the morphological structures present in cometary comae. A range of image enhancement techniques widely used by cometary scientists is discussed by categorizing them and carrying out a comparative analysis. The enhancement techniques and the corresponding characteristics are described in detail and the respective mathematical representations are provided. As the comparative analyses presented in this paper make use of simulated images with known coma features, the feature identifications as well as the artifacts caused by enhancement provide an objective and definitive assessment of the various techniques. Examples are provided which highlight contrasting capabilities of different techniques to pick out qualitatively distinct features of widely different strengths and spatial scales. On account of this as well as serious image artifacts and spurious fe...
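
    One enhancement technique of the kind surveyed here, division of the image by its azimuthally averaged radial profile to expose few-percent features, can be sketched directly. The function name and the synthetic coma (a 1/rho brightness profile plus a 3% jet) are assumptions for the example; the paper compares several such techniques rather than prescribing this one.

```python
import numpy as np

def enhance_by_radial_profile(img, center):
    """Divide a coma image by its azimuthally averaged radial profile,
    a common way to reveal jets and shells only a few percent above
    the ambient coma."""
    y, x = np.indices(img.shape)
    r = np.hypot(x - center[0], y - center[1]).astype(int)
    # mean brightness in each integer-radius annulus
    sums = np.bincount(r.ravel(), weights=img.ravel())
    counts = np.bincount(r.ravel())
    profile = sums / np.maximum(counts, 1)
    return img / np.maximum(profile[r], 1e-12)

# synthetic coma: 1/rho brightness plus a 3% jet along +x
y, x = np.indices((101, 101))
rho = np.hypot(x - 50, y - 50) + 1.0
coma = 1.0 / rho
jet = (np.abs(y - 50) < 3) & (x > 50)
img = coma * (1.0 + 0.03 * jet)
enh = enhance_by_radial_profile(img, (50, 50))
```

In the enhanced image the jet pixels stand above the flattened background at the same radius, which is the effect the simulated-image comparisons in the paper are designed to quantify.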

  12. Human Factors Considerations in New Nuclear Power Plants: Detailed Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    OHara,J.; Higgins, J.; Brown, W.; Fink, R.

    2008-02-14

    This Nuclear Regulatory Commission (NRC) sponsored study has identified human-performance issues in new and advanced nuclear power plants. To identify the issues, current industry developments and trends were evaluated in the areas of reactor technology, instrumentation and control technology, human-system integration technology, and human factors engineering (HFE) methods and tools. The issues were organized into seven high-level HFE topic areas: Role of Personnel and Automation, Staffing and Training, Normal Operations Management, Disturbance and Emergency Management, Maintenance and Change Management, Plant Design and Construction, and HFE Methods and Tools. The issues were then prioritized into four categories using a 'Phenomena Identification and Ranking Table' methodology based on evaluations provided by 14 independent subject matter experts. The subject matter experts were knowledgeable in a variety of disciplines. Vendors, utilities, research organizations and regulators all participated. Twenty issues were categorized into the top priority category. This Brookhaven National Laboratory (BNL) technical report provides the detailed methodology, issue analysis, and results. A summary of the results of this study can be found in NUREG/CR-6947. The research performed for this project has identified a large number of human-performance issues for new control stations and new nuclear power plant designs. The information gathered in this project can serve as input to the development of a long-term strategy and plan for addressing human performance in these areas through regulatory research. Addressing human-performance issues will provide the technical basis from which regulatory review guidance can be developed to meet these challenges. The availability of this review guidance will help set clear expectations for how the NRC staff will evaluate new designs, reduce regulatory uncertainty, and provide a well-defined path to new nuclear power plant

  13. Advanced experimental and numerical techniques for cavitation erosion prediction

    CERN Document Server

    Chahine, Georges; Franc, Jean-Pierre; Karimi, Ayat

    2014-01-01

    This book provides a comprehensive treatment of the cavitation erosion phenomenon and state-of-the-art research in the field. It is divided into two parts. Part 1 consists of seven chapters, offering a wide range of computational and experimental approaches to cavitation erosion. It includes a general introduction to cavitation and cavitation erosion, a detailed description of facilities and measurement techniques commonly used in cavitation erosion studies, an extensive presentation of various stages of cavitation damage (including incubation and mass loss), and insights into the contribution of computational methods to the analysis of both fluid and material behavior. The proposed approach is based on a detailed description of impact loads generated by collapsing cavitation bubbles and a physical analysis of the material response to these loads. Part 2 is devoted to a selection of nine papers presented at the International Workshop on Advanced Experimental and Numerical Techniques for Cavitation Erosion (Gr...

  14. Detailed 3-D nuclear analysis of ITER outboard blanket modules

    Energy Technology Data Exchange (ETDEWEB)

    Bohm, Tim, E-mail: tdbohm@wisc.edu [Fusion Technology Institute, University of Wisconsin-Madison, Madison, WI (United States); Davis, Andrew; Sawan, Mohamed; Marriott, Edward; Wilson, Paul [Fusion Technology Institute, University of Wisconsin-Madison, Madison, WI (United States); Ulrickson, Michael; Bullock, James [Formerly, Fusion Technology, Sandia National Laboratories, Albuquerque, NM (United States)

    2015-10-15

    Highlights: • Nuclear analysis was performed on detailed CAD models placed in a 40 degree model of ITER. • The regions examined include BM09, the upper ELM coil region (BM11–13), the neutral beam (NB) region (BM13–16), and BM18. • The results show that VV nuclear heating exceeds limits in the NB and upper ELM coil regions. • The results also show that the level of He production in parts of BM18 exceeds limits. • These calculations are being used to modify the design of the ITER blanket modules. - Abstract: In the ITER design, the blanket modules (BM) provide thermal and nuclear shielding for the vacuum vessel (VV), magnets, and other components. We used the CAD based DAG-MCNP5 transport code to analyze detailed models inserted into a 40 degree partially homogenized ITER global model. The regions analyzed include BM09, BM16 near the heating neutral beam injection (HNB) region, BM11–13 near the upper ELM coil region, and BM18. For the BM16 HNB region, the VV nuclear heating behind the NB region exceeds the design limit by up to 80%. For the BM11–13 region, the nuclear heating of the VV exceeds the design limit by up to 45%. For BM18, the results show that He production does not meet the limit necessary for re-welding. The results presented in this work are being used by the ITER Organization Blanket and Tokamak Integration groups to modify the BM design in the cases where limits are exceeded.

  15. A Review of Image Contrast Enhancement Methods and Techniques

    Directory of Open Access Journals (Sweden)

    G. Maragatham

    2015-02-01

    Full Text Available In this study we aim to provide a survey of existing enhancement techniques with their descriptions and present a detailed analysis of them. Most images are affected at capture by weather, poor lighting, or the acquiring device itself, and therefore suffer from poor contrast. Sufficient contrast in an image makes an object distinguishable from other objects and from the background. Contrast enhancement improves the quality of images for a human observer by expanding the dynamic range of input gray levels. Although a plethora of enhancement techniques has emerged, none has proven universal, and each remains selective in its application. In such a scenario, it has become imperative to provide a comprehensive survey of the contrast enhancement techniques used in digital image processing.
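The two operations named above, expanding the dynamic range and remapping gray levels, can be sketched as minimal NumPy implementations of linear contrast stretching and global histogram equalization (a generic illustration, not code from the surveyed paper):

```python
import numpy as np

def stretch_contrast(img, out_min=0, out_max=255):
    """Linearly expand the dynamic range of gray levels to [out_min, out_max]."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:  # flat image: nothing to stretch
        return np.full(img.shape, out_min, dtype=np.uint8)
    scaled = (img - lo) / (hi - lo) * (out_max - out_min) + out_min
    return scaled.astype(np.uint8)

def equalize_histogram(img, levels=256):
    """Global histogram equalization: map gray levels through the normalized CDF."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum() / hist.sum()
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)
    return lut[img]
```

A low-contrast image whose gray levels span only 100 to 130 is stretched to the full 0 to 255 range; equalization instead flattens the cumulative distribution of levels.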

  16. Detailed thermodynamic analyses of high-speed compressible turbulence

    Science.gov (United States)

    Towery, Colin; Darragh, Ryan; Poludnenko, Alexei; Hamlington, Peter

    2016-11-01

    Interactions between high-speed turbulence and flames (or chemical reactions) are important in the dynamics and description of many different combustion phenomena, including autoignition and deflagration-to-detonation transition. The probability of these phenomena to occur depends on the magnitude and spectral content of turbulence fluctuations, which can impact a wide range of science and engineering problems, from the hypersonic scramjet engine to the onset of Type Ia supernovae. In this talk, we present results from new direct numerical simulations (DNS) of homogeneous isotropic turbulence with turbulence Mach numbers ranging from 0.05 to 1.0 and Taylor-scale Reynolds numbers as high as 700. A set of detailed analyses are described in both Eulerian and Lagrangian reference frames in order to assess coherent (structural) and incoherent (stochastic) thermodynamic flow features. These analyses provide direct insights into the thermodynamics of strongly compressible turbulence. Furthermore, presented results provide a non-reacting baseline for future studies of turbulence-chemistry interactions in DNS with complex chemistry mechanisms. This work was supported by the Air Force Office of Scientific Research (AFOSR) under Award No. FA9550-14-1-0273, and the Department of Defense (DoD) High Performance Computing Modernization Program (HPCMP) under a Frontier project award.
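The two quantities that parameterize these simulations can be computed from a velocity field using their standard definitions: the turbulence Mach number Mt = u_rms/c and the Taylor-scale Reynolds number Re_lambda = u' lambda / nu with lambda^2 = <u^2>/<(du/dx)^2>. The sketch below applies these formulas to a synthetic random field standing in for DNS data; the grid size, viscosity, and sound speed are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic velocity fluctuation field (placeholder for DNS data) on a 32^3 grid.
N, dx = 32, 2 * np.pi / 32
u, v, w = rng.standard_normal((3, N, N, N))

nu = 1e-3  # kinematic viscosity (assumed value)
c = 10.0   # mean sound speed (assumed value)

# rms fluctuation velocity and turbulence Mach number Mt = u_rms / c,
# with u_rms^2 = <u_i u_i> summed over the three components.
u_rms = np.sqrt(np.mean(u**2 + v**2 + w**2))
Mt = u_rms / c

# Taylor microscale from the longitudinal velocity derivative,
# lambda^2 = <u^2> / <(du/dx)^2>, and the Taylor-scale Reynolds number.
dudx = np.gradient(u, dx, axis=0)
lam = np.sqrt(np.mean(u**2) / np.mean(dudx**2))
Re_lambda = np.sqrt(np.mean(u**2)) * lam / nu
```

For a real DNS, the derivative would typically be evaluated spectrally rather than by finite differences, but the definitions are the same.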

  17. Detailed Measurements of Structure Functions from Nucleons and Nuclei

    CERN Multimedia

    2002-01-01

    The experiment will study deep inelastic muon nucleon scattering in a wide range of Q² (1-200 (GeV/c)²) and x (0.005-0.75). The main aims of the experiment are: a) Detailed measurements of the nuclear dependence of the structure function F2^A, of R = σ_L/σ_T and of the cross-section for J/ψ production. They will provide a basis for the understanding of the EMC effect: the modification of quark and gluon distributions due to the nuclear environment. b) A simultaneous high luminosity measurement of the structure function F2 on hydrogen and deuterium. This will provide substantially improved accuracy in the knowledge of the neutron structure function F2^n, of F2^p - F2^n and F2^n/F2^p and their Q² dependence. Furthermore, the data will allow a determination of the strong coupling constant α_s(Q²) with reduced experimental and theoretical uncertainties as well as of the ratio of the down to up quark distributions in the valence region. Due to the large x range covered by the experim...

  18. Translation Techniques

    Directory of Open Access Journals (Sweden)

    Marcia Pinheiro

    2015-05-01

    Full Text Available In this paper, we discuss three translation techniques: literal, cultural, and artistic. Literal translation is a well-known technique, which means that it is quite easy to find sources on the topic. Cultural and artistic translation may be new terms. Whilst cultural translation focuses on matching contexts, artistic translation focuses on matching reactions. Because literal translation matches only words, it is not hard to find situations in which we should not use this technique.  Because artistic translation focuses on reactions, judging the quality of an artistic translation work is one of the most difficult things one can do. We end up having a score of complexity and humanity for each one of the mentioned techniques: Literal translation would be the closest thing we have to the machines world and artistic translation would be the closest thing we have to the purely human world. By creating these classifications and studying the subtleties of each one of them, we are adding degrees of quality to our courses and to translation as a professional field. The main contribution of this paper is then the formalization of such a piece of knowledge. We, however, also lay the foundations for studies of this type.

  19. Experimental Techniques

    CERN Document Server

    Engelfried, J

    1999-01-01

    In this course we will give examples for experimental techniques used in particle physics experiments. After a short introduction, we will discuss applications in silicon microstrip detectors, wire chambers, and single photon detection in Ring Imaging Cherenkov (RICH) counters. A short discussion of the relevant physics processes, mainly different forms of energy loss in matter, is enclosed.

  20. Small teleost fish provide new insights into human skeletal diseases.

    Science.gov (United States)

    Witten, P E; Harris, M P; Huysseune, A; Winkler, C

    2017-01-01

    Small teleost fish such as zebrafish and medaka are increasingly studied as models for human skeletal diseases. Efficient new genome editing tools combined with advances in the analysis of skeletal phenotypes provide new insights into fundamental processes of skeletal development. The skeleton among vertebrates is a highly conserved organ system, but teleost fish and mammals have evolved unique traits or have lost particular skeletal elements in each lineage. Several unique features of the skeleton relate to the extremely small size of early fish embryos and the small size of adult fish used as models. A detailed analysis of the plethora of interesting skeletal phenotypes in zebrafish and medaka pushes available skeletal imaging techniques to their respective limits and promotes the development of new imaging techniques. Impressive numbers of zebrafish and medaka mutants with interesting skeletal phenotypes have been characterized, complemented by transgenic zebrafish and medaka lines. The advent of efficient genome editing tools, such as TALEN and CRISPR/Cas9, allows researchers to introduce targeted deficiencies in genes of model teleosts to generate skeletal phenotypes that resemble human skeletal diseases. This review will also discuss other attractive aspects of the teleost skeleton. This includes the capacity for lifelong tooth replacement and for the regeneration of dermal skeletal elements, such as scales and fin rays, which further increases the value of zebrafish and medaka models for skeletal research. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Computed tomography-based biomarker provides unique signature for diagnosis of COPD phenotypes and disease progression.

    Science.gov (United States)

    Galbán, Craig J; Han, Meilan K; Boes, Jennifer L; Chughtai, Komal A; Meyer, Charles R; Johnson, Timothy D; Galbán, Stefanie; Rehemtulla, Alnawaz; Kazerooni, Ella A; Martinez, Fernando J; Ross, Brian D

    2012-11-01

    Chronic obstructive pulmonary disease (COPD) is increasingly being recognized as a highly heterogeneous disorder, composed of varying pathobiology. Accurate detection of COPD subtypes by image biomarkers is urgently needed to enable individualized treatment, thus improving patient outcome. We adapted the parametric response map (PRM), a voxel-wise image analysis technique, for assessing COPD phenotype. We analyzed whole-lung computed tomography (CT) scans acquired at inspiration and expiration of 194 individuals with COPD from the COPDGene study. PRM identified the extent of functional small airways disease (fSAD) and emphysema as well as provided CT-based evidence that supports the concept that fSAD precedes emphysema with increasing COPD severity. PRM is a versatile imaging biomarker capable of diagnosing disease extent and phenotype while providing detailed spatial information of disease distribution and location. PRM's ability to differentiate between specific COPD phenotypes will allow for more accurate diagnosis of individual patients, complementing standard clinical techniques.
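The voxel-wise PRM idea can be sketched as a joint thresholding of co-registered inspiration/expiration CT attenuation values. The HU cutoffs of -950 (inspiration, emphysema) and -856 (expiration, air trapping) are the commonly cited values for this kind of analysis, used here as assumptions rather than a statement of the study's exact protocol:

```python
import numpy as np

def prm_classify(insp_hu, exp_hu):
    """Voxel-wise PRM-style classification of paired, co-registered
    inspiration/expiration CT attenuation maps (in Hounsfield units).
    Returns an integer map: 0 = normal, 1 = fSAD (functional small
    airways disease), 2 = emphysema. Thresholds are assumed values."""
    emphysema = (insp_hu < -950) & (exp_hu < -856)   # low density at both phases
    fsad = (insp_hu >= -950) & (exp_hu < -856)       # air trapping on expiration only
    out = np.zeros(insp_hu.shape, dtype=np.uint8)
    out[fsad] = 1
    out[emphysema] = 2
    return out
```

The fraction of lung voxels in each class then serves as the per-patient phenotype measure, while the map itself preserves the spatial distribution of disease.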

  2. Optimization techniques in statistics

    CERN Document Server

    Rustagi, Jagdish S

    1994-01-01

    Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimiza
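As a minimal illustration of the statistics-as-optimization theme (not an example from the book), a maximum-likelihood estimate can be found by gradient ascent on the log-likelihood. For an exponential sample the log-likelihood is l(lam) = n log(lam) - lam * sum(x), with gradient n/lam - sum(x) and closed-form optimum 1/mean(x), which gives a built-in check; the learning rate and step count are arbitrary choices:

```python
import numpy as np

def mle_exponential_rate(x, lr=1e-3, steps=5000):
    """Maximum-likelihood estimate of an exponential rate parameter via
    gradient ascent on l(lam) = n*log(lam) - lam*sum(x). The closed-form
    optimum 1/mean(x) serves as a correctness check."""
    n, s = len(x), np.sum(x)
    lam = 1.0                                  # starting guess
    for _ in range(steps):
        grad = n / lam - s                     # dl/dlam
        lam = max(lam + lr * grad / n, 1e-9)   # scaled step, keep lam > 0
    return lam

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=10_000)    # true rate = 0.5
lam_hat = mle_exponential_rate(x)
```

The iterate converges to 1/mean(x), the same answer the Lagrange/stationarity condition gives analytically; numerical optimization earns its keep when no such closed form exists.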

  3. Vadose zone transport field study: Detailed test plan for simulated leak tests

    Energy Technology Data Exchange (ETDEWEB)

    AL Ward; GW Gee

    2000-06-23

    Hanford to: identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project.

  4. Family physicians' perceptions of academic detailing: a quantitative and qualitative study

    Directory of Open Access Journals (Sweden)

    O'Connor Nicolette

    2007-10-01

    Full Text Available Abstract Background The efficacy of academic detailing in changing physicians' knowledge and practice has been the subject of many primary research publications and systematic reviews. However, there is little written about the features of academic detailing that physicians find valuable or that affect their use of it. The goal of our project was to explore family physicians' (FPs) perceptions of academic detailing and the factors that affect their use of it. Methods We used two methods to collect data: a questionnaire and semi-structured telephone interviews. We mailed questionnaires to all FPs in the Dalhousie Office of Continuing Medical Education database and analyzed responses of non-users and users of academic detailing. After a preliminary analysis of questionnaire data, we conducted semi-structured interviews with 7 FPs who did not use academic detailing and 17 who did use it. Results Overall response rate to the questionnaire was 33% (289/869). Response rate of non-users of academic detailing was 15% (60/393), of users was 48% (229/476). The 3 factors that most encouraged use of academic detailing were the topics selected, the evidence-based approach adopted, and the handout material. The 3 factors that most discouraged the use of academic detailing were spending office time doing CME, scheduling time to see the academic detailer, and having CME provided by a non-physician. Users of academic detailing rated it as being more valuable than other forms of CME. Generally, interview data confirmed questionnaire data with the exception that interview informants did not view having CME provided by a non-physician as a barrier. Interview informants mentioned that the evidence-based approach adopted by academic detailing had led them to more critically evaluate information from other CME programs, pharmaceutical representatives, and journal articles, but not advice from specialists. 
Conclusion Users of academic detailing highly value its educational

  5. A Detailed Circuit Analysis of the Lawrence Livermore National Laboratory Building 141 Detonator Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Mayhall, D J; Wilson, M J; Wilson, J H

    2003-10-01

    A detailed electrical equivalent circuit of an as-built utility fault simulator is presented. Standard construction techniques for light industrial facilities were used to build a test-bed for evaluating utility power level faults into unintentional victims. The initial components or victims of interest are commercial detonators. Other possible candidates for fault response analyses include motors, power supplies, control systems, computers, or other electronic equipment. Measured Thevenin parameters of all interconnections provide the selected component values used in the model. Included in the model is an opening 10 HP motor circuit demonstrating the voltage transients commonly seen on branch circuits from inductive loads common to industrial installations. Complex transmission lines were developed to represent real-world transmission line effects possible from the associated branch circuits. To reduce the initial circuit stabilization delay, a set of non-linear resistive elements is employed. The resulting model has assisted in confirming previous detonator safety work and supported the definition of critical parameters needed for continued safety assessment of victims to utility type power sources.

  6. Spectrometric techniques 2

    CERN Document Server

    Vanasse, George A

    2013-01-01

    Spectrometric Techniques, Volume II provides information pertinent to vacuum ultraviolet techniques to complete the demonstration of the diversity of methods available to the spectroscopist interested in the ultraviolet, visible, and infrared spectral regions. This book discusses the specific aspects of the technique of Fourier transform spectroscopy. Organized into five chapters, this volume begins with an overview of the large number of systematic effects in the recording of an interferogram. This text then examines the design approach for a Fourier transform spectrometer with a focus on optics.

  7. Computer techniques for electromagnetics

    CERN Document Server

    Mittra, R

    1973-01-01

    Computer Techniques for Electromagnetics discusses the ways in which computer techniques solve practical problems in electromagnetics. It discusses the impact of the emergence of high-speed computers in the study of electromagnetics. This text provides a brief background on the approaches used by mathematical analysts in solving integral equations. It also demonstrates how to use computer techniques in computing current distribution, radar scattering, waveguide discontinuities, and inverse scattering. This book will be useful for students looking for a comprehensive text on computer techniques

  8. New techniques in digital holography

    CERN Document Server

    Picart, Pascal

    2015-01-01

    A state of the art presentation of important advances in the field of digital holography, detailing advances related to fundamentals of digital holography, in-line holography applied to fluid mechanics, digital color holography, digital holographic microscopy, infrared holography, special techniques in full field vibrometry and inverse problems in digital holography

  9. Solar Pilot Plant Phase I, detailed design report: collector subsystem research experiment. CDRL Item No. 6 (Approved)

    Energy Technology Data Exchange (ETDEWEB)

    1976-08-31

    The configurations of the experimental heliostat, power and control system, and support elements for the Barstow Solar Pilot Plant are described, and the analytical and experimental determination of performance parameters is discussed. A system analysis is presented, including demonstration of pointing accuracy by error analysis, and demonstration of loop performance by simulation. Engineering model test plans are given that are to evaluate subassemblies, processes, and procedures as well as provide insight into best tests for heliostat subsystem testing. Mirror module test data are analyzed. A comprehensive test plan for the experimental model is presented. Appended are: a heliostat power consumption analysis; collector subsystem research experiment detail specification; structural analysis; solar image analysis; computer and software information; breadboard test data; simulation of the heliostat control loop; mirror module reflectance measurements; plywood frame fixed focus mirror module test data; techniques for redirected image characterization; performance of a meteorological measuring system; and heliostat design data. (LEW)

  10. Intensity techniques

    DEFF Research Database (Denmark)

    Jacobsen, Finn

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each section of the Handbook will present topics on signal processing which are important in a specific area of acoustics. These will be of interest to specialists in these areas because they will be presented from their technical perspective, rather than a generic engineering approach to signal processing. Non-specialists, or specialists from different areas, will find the self-contained chapters accessible and will be interested in the similarities and differences between the approaches and techniques used in different areas of acoustics.

  11. Electrochemical Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe

    2008-07-20

    Sensitive and selective detection techniques are of crucial importance for capillary electrophoresis (CE), microfluidic chips, and other microfluidic systems. Electrochemical detectors have attracted considerable interest for microfluidic systems with features that include high sensitivity, inherent miniaturization of both the detection and control instrumentation, low cost and power demands, and high compatibility with microfabrication technology. The commonly used electrochemical detectors can be classified into three general modes: conductimetry, potentiometry, and amperometry.

  12. EPA Enforcement and Compliance History Online: Water Effluent Charts Details

    Data.gov (United States)

    U.S. Environmental Protection Agency — Detailed Discharge Monitoring Report (DMR) data supporting effluent charts for one Clean Water Act discharge permit. Includes effluent parameters, amounts discharged...

  13. Combustion instability detection using the wavelet detail of pressure fluctuations

    Institute of Scientific and Technical Information of China (English)

    Junjie JI; Yonghao LUO

    2008-01-01

    A combustion instability detection method that uses the wavelet detail of combustion pressure fluctuations is put forward. To confirm this method, combustion pressure fluctuations in a stoker boiler were recorded at stable and unstable combustion with a pressure transducer. The Daubechies one-order wavelet is chosen to obtain the wavelet details for comparison. The results show that the wavelet approximation indicates the general pressure change in the furnace, while the wavelet detail magnitude is consistent with the intensity of turbulence and combustion noise. The magnitude of the wavelet detail is nearly constant when combustion is stable, but fluctuates strongly when combustion is unstable.
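The decomposition described above can be sketched in a few lines: the Daubechies one-order wavelet is the Haar wavelet, whose one-level transform splits a signal into pairwise averages (approximation) and pairwise differences (detail). The RMS-of-detail indicator below is a plausible reading of the method, not the paper's exact criterion:

```python
import numpy as np

def haar_detail(signal):
    """One-level Daubechies-1 (Haar) DWT: returns (approximation, detail).
    The approximation tracks the general pressure trend; the detail
    magnitude tracks turbulence and combustion-noise intensity."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:                 # Haar pairs samples, so drop an odd tail
        x = x[:-1]
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def instability_index(pressure):
    """RMS of the wavelet detail; larger, more variable values suggest
    unstable combustion (assumed indicator, for illustration)."""
    _, d = haar_detail(pressure)
    return np.sqrt(np.mean(d**2))
```

A smooth pressure trace yields a small index, while the same trace with strong broadband fluctuations superimposed yields a much larger one.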

  14. Process measuring techniques; Prozessmesstechnik

    Energy Technology Data Exchange (ETDEWEB)

    Freudenberger, A.

    2000-07-01

    This introduction to measurement techniques for chemical and process-engineering research and production plants describes in detail the methods used to measure the basic quantities. Modern measuring techniques using ultrasound, microwaves, and the Coriolis effect are given prominence. The book covers both the physical and metrological fundamentals and the practical applications of the instruments, with calculation examples to illustrate and reinforce the material. It addresses students of physical engineering, process engineering, and environmental engineering at technical universities, as well as engineers of other disciplines wishing to familiarize themselves with process measuring techniques. (orig.)

  15. Framework for detailed studies on the construction and operation of repositories for spent nuclear fuel; Ramprogram foer detaljundersoekningar vid uppfoerande och drift av slutfoervar foer anvaent kaernbraensle

    Energy Technology Data Exchange (ETDEWEB)

    2010-10-15

    This report presents a programme for the detailed investigations planned to be applied during construction and operation of the repository for spent nuclear fuel at Forsmark. The report is part of SKB's application according to the Nuclear Activities Act. The detailed investigations shall provide relevant data on and site-descriptive models for the bedrock, soil deposits and eco-system of the site in order to facilitate a step-wise design and construction of the final repository. This shall be implemented in a manner that all demands on long-term safety are fulfilled, including accurate documentation of the construction work, and so that assessments of the environmental impact of the repository can be made. For the operational phase, the detailed investigations should also provide support to the deposition process with related decisions, thereby enabling fulfilment of the design premises for the siting and construction of deposition tunnels and deposition holes, as well as for deposition of canisters, and for the subsequent backfilling and closure of the repository. The Observational Method will be applied during the construction of the repository. This method entails establishing in advance acceptable limits of behaviour regarding selected geoscientific parameters and preparing a plan with measures to keep the outcome within these limits. Predictions of expected rock properties are established for each tunnel section. The outcome after excavation is compared with the acceptable range of outcomes. Information from detailed characterization will be of essential importance for application of the Observational Method and for adapting the repository to the prevailing rock properties. SKB has for the past several decades developed methods for site characterisation, applying both above- and underground investigation techniques. Experiences from this work, put into practice during the site investigations, has resulted in a solid knowledge and understanding of the


  17. Internet Provider Facilities, Published in Not Provided, US Army.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Internet Provider Facilities dataset as of Not Provided. Data by this publisher are often provided in Not Applicable coordinate system; in a Not Applicable...

  18. Detailed 3-D nuclear analysis of ITER blanket modules

    Energy Technology Data Exchange (ETDEWEB)

    Bohm, T.D., E-mail: tdbohm@wisc.edu [University of Wisconsin-Madison, Madison, WI (United States); Sawan, M.E.; Marriott, E.P.; Wilson, P.P.H. [University of Wisconsin-Madison, Madison, WI (United States); Ulrickson, M.; Bullock, J. [Sandia National Laboratories, Albuquerque, NM (United States)

    2014-10-15

    In ITER, the blanket modules (BM) are arranged around the plasma to provide thermal and nuclear shielding for the vacuum vessel (VV), magnets, and other components. As a part of the BM design process, nuclear analysis is required to determine the level of nuclear heating, helium production, and radiation damage in the BM. Additionally, nuclear heating in the VV is also important for assessing the BM design. We used the CAD based DAG-MCNP5 transport code to analyze detailed models inserted into a 40-degree partially homogenized ITER global model. The regions analyzed include BM01, the neutral beam injection (NB) region, and the upper port region. For BM01, the results show that He production meets the limit necessary for re-welding, and the VV heating behind BM01 is acceptable. For the NBI region, the VV nuclear heating behind the NB region exceeds the design limit by a factor of two. For the upper port region, the nuclear heating of the VV exceeds the design limit by up to 20%. The results presented in this work are being used to modify the BM design in the cases where limits are exceeded.

  19. Urban scale air quality modelling using detailed traffic emissions estimates

    Science.gov (United States)

    Borrego, C.; Amorim, J. H.; Tchepel, O.; Dias, D.; Rafael, S.; Sá, E.; Pimentel, C.; Fontes, T.; Fernandes, P.; Pereira, S. R.; Bandeira, J. M.; Coelho, M. C.

    2016-04-01

    The atmospheric dispersion of NOx and PM10 was simulated with a second generation Gaussian model over a medium-size south-European city. Microscopic traffic models calibrated with GPS data were used to derive typical driving cycles for each road link, while instantaneous emissions were estimated applying a combined Vehicle Specific Power/Co-operative Programme for Monitoring and Evaluation of the Long-range Transmission of Air Pollutants in Europe (VSP/EMEP) methodology. Site-specific background concentrations were estimated using time series analysis and a low-pass filter applied to local observations. Air quality modelling results are compared against measurements at two locations for a 1 week period. 78% of the results are within a factor of two of the observations for 1-h average concentrations, increasing to 94% for daily averages. Correlation significantly improves when background is added, with an average of 0.89 for the 24 h record. The results highlight the potential of detailed traffic and instantaneous exhaust emissions estimates, together with filtered urban background, to provide accurate input data to Gaussian models applied at the urban scale.
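The "within a factor of two" figure quoted above is the standard FAC2 metric for dispersion-model evaluation. A minimal sketch of its computation (generic, not the study's code) is:

```python
import numpy as np

def fac2(model, obs):
    """Fraction of model predictions within a factor of two of the paired
    observations, i.e. 0.5 <= model/obs <= 2. A common air-quality
    model-evaluation metric; assumes strictly positive concentrations."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    ratio = model / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))
```

Applied to hourly concentration pairs this returns, for example, 0.78 for the 1-h averages reported in the abstract; averaging to daily values smooths out short-term errors and raises the score.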

  20. Detailed model for practical pulverized coal furnaces and gasifiers

    Energy Technology Data Exchange (ETDEWEB)

    Philips, S.D.; Smoot, L.D.

    1989-08-01

    The need to improve efficiency and reduce pollutant emissions in commercial furnaces has prompted energy companies to search for optimized operating conditions and improved designs in their fossil-fuel burning facilities. Historically, companies have relied on the use of empirical correlations and pilot-plant data to make decisions about operating conditions and design changes. The high cost of collecting data makes obtaining large amounts of data infeasible. The main objective of the data book is to provide a single source of detailed three-dimensional combustion and combustion-related data suitable for comprehensive combustion model evaluation. Five tasks were identified as requirements to achieve the main objective. First, identify the types of data needed to evaluate comprehensive combustion models, and establish criteria for selecting the data. Second, identify and document available three-dimensional combustion data related to pulverized coal combustion. Third, collect and evaluate three-dimensional data cases, and select suitable cases based on selection criteria. Fourth, organize the data sets into an easy-to-use format. Fifth, evaluate and interpret the nature and quality of the data base. 39 refs., 15 figs., 14 tabs.

  1. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    Science.gov (United States)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurement of Seebeck coefficient and electrical resistivity are critical to the investigation of all thermoelectric systems. Therefore, it stands that the measurement uncertainty must be well understood to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate on the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of +9/−14% at high temperature and ±9% near room temperature.
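
    The geometry- and instrument-driven part of such an analysis is typically a root-sum-square propagation of independent relative errors. As an illustrative sketch (a generic four-probe bar geometry with invented tolerances, not the ZEM-3 vendor procedure), for resistivity ρ = R·w·t/L:

```python
import math

def resistivity_with_uncertainty(R, dR, w, dw, t, dt, L, dL):
    """Root-sum-square error propagation for rho = R * w * t / L
    (four-probe resistance R on a bar of width w, thickness t, and
    probe separation L). Assumes independent error sources."""
    rho = R * w * t / L
    rel = math.sqrt((dR / R) ** 2 + (dw / w) ** 2 + (dt / t) ** 2 + (dL / L) ** 2)
    return rho, rho * rel

# invented values: multimeter resistance plus caliper geometry tolerances
rho, drho = resistivity_with_uncertainty(R=0.010, dR=0.0001,    # ohm
                                         w=2.0e-3, dw=0.02e-3,  # m
                                         t=2.0e-3, dt=0.02e-3,  # m
                                         L=10.0e-3, dL=0.05e-3) # m
print(rho, drho)  # ~4.0e-6 ohm.m with ~1.8% relative uncertainty
```

    Systematic effects such as the cold-finger bias do not follow this independent-error model; that is why the abstract quantifies them separately with a thermal finite element analysis.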

  2. Stream sediment detailed geochemical survey for Date Creek Basin, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Butz, T.R.; Tieman, D.J.; Grimes, J.G.; Bard, C.S.; Helgerson, R.N.; Pritz, P.M.

    1980-06-30

    Results of the Date Creek Basin detailed geochemical survey are reported. Field and laboratory data are reported for 239 stream sediment samples. Statistical and areal distributions of uranium and possible uranium-related variables are displayed. A generalized geologic map of the area is provided, and pertinent geologic factors which may be of significance in evaluating the potential for uranium mineralization are briefly discussed. Based on stream sediment geochemical data, significant concentrations of uranium are restricted to the Anderson Mine area. The 84th percentile concentrations of U-FL, U-NT, and U-FL/U-NT combined with low thorium/U-NT values reflect increased mobility and enrichment of uranium in the carbonate host rocks of that area. Elements characteristically associated with the uranium mineralization include lithium and arsenic. No well defined diffusion halos suggesting outliers of similar uranium mineralization were observed from the stream sediment data in other areas of the Date Creek Basin. Significant concentrations of U-FL or U-NT found outside the mine area are generally coincident with low U-FL/U-NT values and high concentrations of zirconium, titanium, and phosphorus. This suggests that the uranium is related to a resistate mineral assemblage derived from surrounding crystalline igneous and metamorphic rocks.
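
    Percentile-based screening of the kind used above (flagging samples at or above the 84th percentile) can be sketched as follows; the concentrations are invented for illustration:

```python
import numpy as np

def flag_anomalies(values, pct=84):
    """Flag samples at or above the given percentile of the data set
    (the 84th percentile is roughly mean + 1 sigma for a normal sample)."""
    values = np.asarray(values, dtype=float)
    threshold = np.percentile(values, pct)
    return values >= threshold, threshold

# invented uranium concentrations (ppm) for ten stream sediment samples
u_fl = np.array([1.2, 0.8, 1.1, 5.4, 0.9, 1.0, 6.1, 1.3, 0.7, 1.2])
flags, thr = flag_anomalies(u_fl)
print(thr, np.flatnonzero(flags))  # samples 3 and 6 exceed the threshold
```

    In the survey, flagged concentrations were interpreted jointly with ratios (e.g. U-FL/U-NT, thorium/U-NT) and pathfinder elements rather than in isolation.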

  3. Detailed temporal structure of communication networks in groups of songbirds.

    Science.gov (United States)

    Stowell, Dan; Gill, Lisa; Clayton, David

    2016-06-01

    Animals in groups often exchange calls, in patterns whose temporal structure may be influenced by contextual factors such as physical location and the social network structure of the group. We introduce a model-based analysis for temporal patterns of animal call timing, originally developed for networks of firing neurons. This has advantages over cross-correlation analysis in that it can correctly handle common-cause confounds and provides a generative model of call patterns with explicit parameters for the influences between individuals. It also has advantages over standard Markovian analysis in that it incorporates detailed temporal interactions which affect timing as well as sequencing of calls. Further, a fitted model can be used to generate novel synthetic call sequences. We apply the method to calls recorded from groups of domesticated zebra finch (Taeniopygia guttata) individuals. We find that the communication network in these groups has stable structure that persists from one day to the next, and that 'kernels' reflecting the temporal range of influence have a characteristic structure for a calling individual's effect on itself, its partner and on others in the group. We further find characteristic patterns of influences by call type as well as by individual.
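
    A model with pairwise influence kernels between calling individuals, as described above, resembles a mutually exciting point process. A toy discrete-time sketch for two birds, with invented baseline rates and exponential kernels rather than fitted parameters, could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-bird calling model: each bird's call probability per 10 ms bin is a
# baseline rate plus an exponentially decaying boost ("kernel") triggered
# by the other bird's recent calls. All numbers are illustrative.
n_bins, dt = 5000, 0.01              # 50 s of simulated time, 10 ms bins
baseline = np.array([0.20, 0.15])    # baseline call rates (calls/s)
influence = np.array([[0.0, 2.0],    # influence[i, j]: rate boost to bird i
                      [3.0, 0.0]])   # per recent call of bird j (calls/s)
decay = 5.0                          # kernel decay constant (1/s)

excitation = np.zeros(2)
calls = [[], []]
for k in range(n_bins):
    rate = baseline + excitation
    for i in range(2):
        if rng.random() < rate[i] * dt:          # Bernoulli approximation
            calls[i].append(k * dt)
            excitation += influence[:, i]        # a call excites the partner
    excitation *= np.exp(-decay * dt)            # kernels decay between bins

print(len(calls[0]), len(calls[1]))
```

    Fitting such a model to recorded call times (rather than simulating it) yields the influence kernels and network structure the study reports; the same generative form can then synthesize novel call sequences.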

  4. Basis/Detail-ontwerp, Werken met het IT, versie 2

    NARCIS (Netherlands)

    Peek CJ

    1991-01-01

    This report is the result of the Global/Detailed design phase of "The use of IT, version 2". The first version of "The use of IT" has been published as part - chapter 6 - of the Global/Detailed-design Information System on Environmental Technology (IT), 1989, report nr.

  5. 46 CFR 70.20-1 - Marine engineering details.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 3 2010-10-01 2010-10-01 false Marine engineering details. 70.20-1 Section 70.20-1... General Marine Engineering Requirements § 70.20-1 Marine engineering details. All marine engineering... subchapter F (Marine Engineering) of this chapter. ...

  6. 44 CFR 5.27 - Deletion of identifying details.

    Science.gov (United States)

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Deletion of identifying details. 5.27 Section 5.27 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY..., FEMA may delete identifying details when making available or publishing an opinion, statement of...

  7. Processing and Recall of Seductive Details in Scientific Text

    Science.gov (United States)

    Lehman, Stephen; Schraw, Gregory; McCrudden, Matthew T.; Hartley, Kendall

    2007-01-01

    This study examined how seductive details affect on-line processing of a technical, scientific text. In Experiment 1, each sentence from the experimental text was rated for interest and importance. Participants rated seductive details as being more interesting but less important than main ideas. In Experiment 2, we examined the effect of seductive…

  8. Following the masters: portrait viewing and appreciation is guided by selective detail.

    Science.gov (United States)

    DiPaola, Steve; Riebe, Caitlin; Enns, James T

    2013-01-01

    A painted portrait differs from a photo in that selected regions are often rendered in much sharper detail than other regions. Artists believe these choices guide viewer gaze and influence their appreciation of the portrait, but these claims are difficult to test because increased portrait detail is typically associated with greater meaning, stronger lighting, and a more central location in the composition. In three experiments we monitored viewer gaze and recorded viewer preferences for portraits rendered with a parameterised non-photorealistic technique to mimic the style of Rembrandt (DiPaola, 2009 International Journal of Art and Technology 2 82-93). Results showed that viewer gaze was attracted to and held longer by regions of relatively finer detail (experiment 1), and also by textural highlighting (experiment 2), and that artistic appreciation increased when portraits strongly biased gaze (experiment 3). These findings have implications for understanding both human vision science and visual art.

  9. Cooling technique

    Energy Technology Data Exchange (ETDEWEB)

    Salamon, Todd R; Vyas, Brijesh; Kota, Krishna; Simon, Elina

    2017-01-31

    An apparatus and a method are provided. A wick structure is configured to receive a liquid and generate vapor when the wick structure is heated by heat transferred from the heat sources to be cooled. A vapor channel is provided, configured to receive the generated vapor and direct it away from the wick structure. In some embodiments, heat conductors are used to transfer the heat from the heat sources to the liquid in the wick structure.

  10. Detailed source process of the 2007 Tocopilla earthquake.

    Science.gov (United States)

    Peyrat, S.; Madariaga, R.; Campos, J.; Asch, G.; Favreau, P.; Bernard, P.; Vilotte, J.

    2008-05-01

    We investigated the detailed rupture process of the Tocopilla earthquake (Mw 7.7) of 14 November 2007, and of the main aftershocks that occurred in the southern part of the North Chile seismic gap, using strong-motion data. The earthquake happened in the middle of the permanent broadband and strong-motion network IPOC, newly installed by GFZ and IPGP, and of a digital strong-motion network operated by the University of Chile. The Tocopilla earthquake is the latest large thrust subduction earthquake to occur in this gap since the major 1877 Iquique earthquake, which produced a destructive tsunami. The Arequipa (2001) and Antofagasta (1995) earthquakes already ruptured the northern and southern parts of the gap, and the intraplate intermediate-depth Tarapaca earthquake (2005) may have changed the tectonic loading of this part of the Peru-Chile subduction zone. For large earthquakes, the depth of the seismic rupture is bounded by the depth of the seismogenic zone. What controls the horizontal extent of the rupture for large earthquakes is less clear. Factors that influence the extent of the rupture include fault geometry, variations of material properties, and stress heterogeneities inherited from the previous rupture history. For subduction zones where structures are not well known, what may have stopped the rupture is not obvious. One crucial problem raised by the Tocopilla earthquake is to understand why this earthquake didn't extend further north and, to the south, what role is played by the Mejillones Peninsula, which seems to act as a barrier. The focal mechanism was determined using teleseismic waveform inversion and a geodetic analysis (cf. Campos et al.; Bejarpi et al., in the same session). We studied the detailed source process using the available strong-motion data. This earthquake ruptured the interplate seismic zone over more than 150 km and generated several large aftershocks, mainly located south of the rupture area. The strong-motion data show clearly two S

  11. Recent developments in the detailed characterization of polymers by multidimensional chromatography.

    Science.gov (United States)

    Baumgaertel, Anja; Altuntaş, Esra; Schubert, Ulrich S

    2012-06-01

    Synthetic polymers as well as biopolymers reveal complex structures, such as variations in functionality, chain length and architecture. Therefore, combinations of different chromatographic techniques are a prerequisite for a detailed characterization. One possible approach is the combination of high performance liquid chromatography at critical conditions (LCCC) and size-exclusion chromatography, also known as two-dimensional chromatography, which allows the separation of the polymers according to different properties, like molar mass, chemical composition or functionality. In addition, LCCC hyphenated with different mass spectrometry techniques, e.g. MALDI-TOF or ESI-TOF, leads to additional information about molecular details of the polymeric structure. We summarize in this article the recent developments in two-dimensional chromatography of synthetic polymers and biopolymers since 2005. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Single Cell Electrical Characterization Techniques.

    Science.gov (United States)

    Mansor, Muhammad Asraf; Ahmad, Mohd Ridzuan

    2015-06-04

    Electrical properties of living cells have been proven to play significant roles in understanding of various biological activities including disease progression both at the cellular and molecular levels. Since two decades ago, many researchers have developed tools to analyze the cell's electrical states especially in single cell analysis (SCA). In depth analysis and more fully described activities of cell differentiation and cancer can only be accomplished with single cell analysis. This growing interest was supported by the emergence of various microfluidic techniques to fulfill high precisions screening, reduced equipment cost and low analysis time for characterization of the single cell's electrical properties, as compared to classical bulky technique. This paper presents a historical review of single cell electrical properties analysis development from classical techniques to recent advances in microfluidic techniques. Technical details of the different microfluidic techniques are highlighted, and the advantages and limitations of various microfluidic devices are discussed.

  13. Single Cell Electrical Characterization Techniques

    Directory of Open Access Journals (Sweden)

    Muhammad Asraf Mansor

    2015-06-01

    Electrical properties of living cells have been proven to play significant roles in understanding of various biological activities including disease progression both at the cellular and molecular levels. Since two decades ago, many researchers have developed tools to analyze the cell's electrical states especially in single cell analysis (SCA). In depth analysis and more fully described activities of cell differentiation and cancer can only be accomplished with single cell analysis. This growing interest was supported by the emergence of various microfluidic techniques to fulfill high precisions screening, reduced equipment cost and low analysis time for characterization of the single cell's electrical properties, as compared to classical bulky technique. This paper presents a historical review of single cell electrical properties analysis development from classical techniques to recent advances in microfluidic techniques. Technical details of the different microfluidic techniques are highlighted, and the advantages and limitations of various microfluidic devices are discussed.

  14. Strongly correlated systems experimental techniques

    CERN Document Server

    Mancini, Ferdinando

    2015-01-01

    The continuous evolution and development of experimental techniques is at the basis of any fundamental achievement in modern physics. Strongly correlated systems (SCS), more than any other, need to be investigated through the greatest variety of experimental techniques in order to unveil and crosscheck the numerous and puzzling anomalous behaviors characterizing them. The study of SCS fostered the improvement of many old experimental techniques, but also the advent of many new ones just invented in order to analyze the complex behaviors of these systems. Many novel materials, with functional properties emerging from macroscopic quantum behaviors at the frontier of modern research in physics, chemistry and materials science, belong to this class of systems. The volume presents a representative collection of the modern experimental techniques specifically tailored for the analysis of strongly correlated systems. Any technique is presented in great detail by its own inventor or by one of the world-wide recognize...

  15. [New microbiological techniques].

    Science.gov (United States)

    Schubert, S; Wieser, A; Bonkat, G

    2017-06-01

    Microbiological diagnostic procedures have changed rapidly in recent years. This is especially true in the field of molecular diagnostics. Classical culture-based techniques are still the gold standard in many areas; however, they are already complemented by automated and also molecular techniques to guarantee faster and better quality results. The most commonly used techniques include real-time polymerase chain reaction (RT-PCR) based systems and nucleic acid hybridization. These procedures are used most powerfully from direct patient samples or in assays to detect the presence of nonculturable or fastidious organisms. Further techniques such as DNA sequencing are not yet used routinely for urological samples and can be considered experimental. However, in conjunction with dropping prices and further technical developments, these techniques promise to be used much more in the near future. Regarding bacterial identification from culture, mass spectrometry (MALDI-TOF MS) has become the technique of choice in recent years especially in Europe. It has tremendously shortened the time to result. This is now going to be extended to antibiotic susceptibility testing. This is of paramount importance in view of ever rising antimicrobial resistance rates. Techniques described in this review offer a faster and better microbiological diagnosis. Such continuous improvements are critical especially in times of cost pressure and rising antimicrobial resistance rates. It is in our interest to provide the best possible care for patients and in this regard a good and effective communication between the laboratory and the clinician is of vital importance.

  16. Geolocation Techniques Principles and Applications

    CERN Document Server

    Gentile, Camillo; Raulefs, Ronald; Teolis, Carole

    2013-01-01

    Geolocation Techniques: Principles and Applications provides a comprehensive overview of geolocation technologies and techniques, from radio-frequency based to inertial based. The focus of this book is to provide an overview on the different types of infra-structure supported by most commercial localization systems as well as on the most popular computational techniques which these systems employ. This book can serve as a reference for scholarly activities such as teaching, self-learning, or research.

  17. Foramen magnum meningiomas: detailed surgical approaches and technical aspects at Lariboisière Hospital and review of the literature

    Science.gov (United States)

    George, Bernard

    2007-01-01

    Foramen magnum meningiomas are challenging tumors, requiring special considerations because of the vicinity of the medulla oblongata, the lower cranial nerves, and the vertebral artery. After detailing the relevant anatomy of the foramen magnum area, we will explain our classification system based on the compartment of development, the dural insertion, and the relation to the vertebral artery. The compartment of development is most of the time intradural and less frequently extradural or both intra- and extradural. Intradurally, foramen magnum meningiomas are classified as posterior, lateral, and anterior if their insertion is, respectively, posterior to the dentate ligament, anterior to the dentate ligament, and anterior to the dentate ligament with extension over the midline. This classification system helps to define the best surgical approach and the lateral extent of drilling needed and anticipate the relation with the lower cranial nerves. In our department, three basic surgical approaches were used: the posterior midline, the postero-lateral, and the antero-lateral approaches. We will explain in detail our surgical technique. Finally, a review of the literature is provided to allow comparison with the treatment options advocated by other skull base surgeons. PMID:17882459

  18. Teaching antenatal counseling skills to neonatal providers.

    Science.gov (United States)

    Stokes, Theophil A; Watson, Katie L; Boss, Renee D

    2014-02-01

    Counseling a family confronted with the birth of a periviable neonate is one of the most difficult tasks that a neonatologist must perform. The neonatologist's goal is to facilitate an informed, collaborative decision about whether life-sustaining therapies are in the best interest of this baby. Neonatologists are trained to provide families with a detailed account of the morbidity and mortality data they believe are necessary to facilitate a truly informed decision. Yet these complicated and intensely emotional conversations require advanced communication and counseling skills that our current fellowship-training strategies are not adequately providing. We review educational models for training neonatology fellows to provide antenatal counseling at the threshold of viability. We believe that training aimed at teaching these skills should be incorporated into the neonatal-perinatal medicine fellowship. The optimal approaches for teaching these skills remain uncertain, and there is a need for continued innovation and outcomes-based research.

  19. Recovering and Preventing Loss of Detailed Memory: Differential Rates of Forgetting for Detail Types in Episodic Memory

    Science.gov (United States)

    Sekeres, Melanie J.; Bonasia, Kyra; St-Laurent, Marie; Pishdadian, Sara; Winocur, Gordon; Grady, Cheryl; Moscovitch, Morris

    2016-01-01

    Episodic memories undergo qualitative changes with time, but little is known about how different aspects of memory are affected. Different types of information in a memory, such as perceptual detail, and central themes, may be lost at different rates. In patients with medial temporal lobe damage, memory for perceptual details is severely impaired,…

  20. Investigating the Interaction of Graphic Organizers and Seductive Details: Can a Graphic Organizer Mitigate the Seductive-Details Effect?

    Science.gov (United States)

    Rowland-Bryant, Emily; Skinner, Christopher H.; Skinner, Amy L.; Saudargas, Richard; Robinson, Daniel H.; Kirk, Emily R.

    2009-01-01

    The interaction between seductive details (SD) and a graphic organizer (GO) was investigated. Undergraduate students (n = 207) read a target-material passage about Freud's psychosexual stages. Depending on condition, the participants also read a biographical paragraph (SD-only), viewed a graphic organizer that linked the seductive details to the…

  1. Recovering and Preventing Loss of Detailed Memory: Differential Rates of Forgetting for Detail Types in Episodic Memory

    Science.gov (United States)

    Sekeres, Melanie J.; Bonasia, Kyra; St-Laurent, Marie; Pishdadian, Sara; Winocur, Gordon; Grady, Cheryl; Moscovitch, Morris

    2016-01-01

    Episodic memories undergo qualitative changes with time, but little is known about how different aspects of memory are affected. Different types of information in a memory, such as perceptual detail, and central themes, may be lost at different rates. In patients with medial temporal lobe damage, memory for perceptual details is severely impaired,…

  2. Investigating the Interaction of Graphic Organizers and Seductive Details: Can a Graphic Organizer Mitigate the Seductive-Details Effect?

    Science.gov (United States)

    Rowland-Bryant, Emily; Skinner, Christopher H.; Skinner, Amy L.; Saudargas, Richard; Robinson, Daniel H.; Kirk, Emily R.

    2009-01-01

    The interaction between seductive details (SD) and a graphic organizer (GO) was investigated. Undergraduate students (n = 207) read a target-material passage about Freud's psychosexual stages. Depending on condition, the participants also read a biographical paragraph (SD-only), viewed a graphic organizer that linked the seductive details to the…

  3. Assessing Classroom Assessment Techniques

    Science.gov (United States)

    Simpson-Beck, Victoria

    2011-01-01

    Classroom assessment techniques (CATs) are teaching strategies that provide formative assessments of student learning. It has been argued that the use of CATs enhances and improves student learning. Although the various types of CATs have been extensively documented and qualitatively studied, there appears to be little quantitative research…

  4. Techniques for Vocal Health.

    Science.gov (United States)

    Wiest, Lori

    1997-01-01

    Outlines a series of simple yet effective practices, techniques, and tips for improving the singing voice and minimizing stress on the vocal chords. Describes the four components for producing vocal sound: respiration, phonation, resonation, and articulation. Provides exercises for each and lists symptoms of sickness and vocal strain. (MJP)

  5. Merchandising Techniques and Libraries.

    Science.gov (United States)

    Green, Sylvie A.

    1981-01-01

    Proposes that libraries employ modern booksellers' merchandising techniques to improve circulation of library materials. Using displays in various ways, the methods and reasons for weeding out books, replacing worn book jackets, and selecting new books are discussed. Suggestions for learning how to market and 11 references are provided. (RBF)

  6. Combinatorial techniques

    CERN Document Server

    Sane, Sharad S

    2013-01-01

    This is a basic text on combinatorics that deals with all the three aspects of the discipline: tricks, techniques and theory, and attempts to blend them. The book has several distinctive features. Probability and random variables with their interconnections to permutations are discussed. The theme of parity has been specially included and it covers applications ranging from solving the Nim game to the quadratic reciprocity law. Chapters related to geometry include triangulations and Sperner's theorem, classification of regular polytopes, tilings and an introduction to Euclidean Ramsey theory. Material on group actions covers Sylow theory, automorphism groups and a classification of finite subgroups of orthogonal groups. All chapters have a large number of exercises with varying degrees of difficulty, ranging from material suitable for Mathematical Olympiads to research.

  7. Operative technique and pitfalls in donor heart procurement.

    Science.gov (United States)

    Shudo, Yasuhiro; Hiesinger, William; Oyer, Philip E; Woo, Y Joseph

    2017-01-01

    We describe a simple and reproducible donor heart procurement technique in sequential steps. A detailed understanding of procurement and organ preservation techniques should be an essential part of a heart transplant training program.

  8. A detailed gravimetric geoid from North America to Eurasia

    Science.gov (United States)

    Vincent, S. F.; Strange, W. E.; Marsh, J. G.

    1972-01-01

    A detailed gravimetric geoid of the United States, North Atlantic, and Eurasia, which was computed from a combination of satellite derived and surface gravity data, is presented. The precision of this detailed geoid is ±2 to ±3 m in the continents but may be in the range of 5 to 7 m in those areas where data is sparse. Comparisons of the detailed gravimetric geoid with results of Rapp, Fischer, and Rice for the United States, Bomford in Europe, and Heiskanen and Fischer in India are presented. Comparisons are also presented with geoid heights from satellite solutions for geocentric station coordinates in North America, the Caribbean, and Europe.

  9. Inferring electromagnetic ion cyclotron wave intensity from low altitude POES proton flux measurements: A detailed case study with conjugate Van Allen Probes observations

    Science.gov (United States)

    Zhang, Yang; Shi, Run; Ni, Binbin; Gu, Xudong; Zhang, Xianguo; Zuo, Pingbing; Fu, Song; Xiang, Zheng; Wang, Qi; Cao, Xing; Zou, Zhengyang

    2017-03-01

    Electromagnetic ion cyclotron (EMIC) waves play an important role in the magnetospheric particle dynamics and can lead to resonant pitch-angle scattering and ultimate precipitation of ring current protons. Commonly, the statistics of in situ EMIC wave measurements is adopted for quantitative investigation of wave-particle interaction processes, which however becomes questionable for detailed case studies especially during geomagnetic storms and substorms. Here we establish a novel technique to infer EMIC wave amplitudes from low-altitude proton measurements onboard the Polar Operational Environmental Satellites (POES). The detailed procedure is elaborated regarding how to infer the EMIC wave intensity for one specific time point. We then test the technique with a case study comparing the inferred root-mean-square (RMS) EMIC wave amplitude with the conjugate Van Allen Probes EMFISIS wave measurements. Our results suggest that the developed technique can reasonably estimate EMIC wave intensities from low-altitude POES proton flux data, thereby providing a useful tool to construct a data-based, near-real-time, dynamic model of the global distribution of EMIC waves once the proton flux measurements from multiple POES satellites are available for any specific time period.

  10. Advanced Meteor radar at Tirupati: System details and first results

    Science.gov (United States)

    Sunkara, Eswaraiah; Gurubaran, Subramanian; Sundararaman, Sathishkumar; Venkat Ratnam, Madineni; Karanam, Kishore Kumar; Eethamakula, Kosalendra; Vijaya Bhaskara Rao, S.

    An advanced meteor radar, viz. the Enhanced Meteor Detection Radar (EMDR) operating at 35.25 MHz, was installed at Sri Venkateswara University (SVU), Tirupati (13.63°N, 79.4°E), India, in August 2013. This communication describes the need for a meteor radar at this location, the system, its measurement techniques and variables, and a comparison of measured mean winds with contemporary radars over the Indian region. The radar site was selected to fill the blind region of the Gadanki (13.5°N, 79.2°E) MST radar in the mesosphere and lower thermosphere (MLT) region (70-110 km). By modifying the receiving antenna structure and elements, the meteor counting capacity is increased, so that, unlike other similar radars, this radar is capable of providing accurate wind information between 70 and 110 km; its wind estimation limits thus extend below 80 km and above 100 km. In the present study, we also compared horizontal winds in the MLT region with those measured by similar and different (MST and MF radar) techniques over the Indian region, including the model (HWM 07) data sets. The comparison showed very good agreement between the overlapping altitudes (82-98 km) of the different radars, for zonal as well as meridional winds. The observed discrepancies and limitations in the wind measurement are discussed. This new radar is expected to play an important role in understanding the vertical and lateral coupling by forming a unique local network.
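
    An inter-radar wind comparison of the kind reported here usually reduces to a correlation coefficient and a mean bias between the two techniques at overlapping altitudes. A minimal sketch with invented wind values (not the EMDR/MST data):

```python
import numpy as np

# Hypothetical zonal winds (m/s) at one overlapping altitude (~90 km) from
# two co-located radars over eight observation hours; values are invented.
meteor = np.array([12.0, 15.0, 9.0, -4.0, -10.0, 3.0, 8.0, 14.0])
mst    = np.array([10.0, 16.0, 7.0, -2.0, -12.0, 5.0, 6.0, 13.0])

r = np.corrcoef(meteor, mst)[0, 1]   # Pearson correlation between techniques
bias = np.mean(meteor - mst)         # mean difference (m/s)
print(round(r, 2), round(bias, 2))
```

    Repeating this per altitude bin across 82-98 km gives the agreement profile that such comparison studies typically tabulate.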

  11. Scanning tunneling microscopy II further applications and related scanning techniques

    CERN Document Server

    Güntherodt, Hans-Joachim

    1992-01-01

    Scanning Tunneling Microscopy II, like its predecessor, presents detailed and comprehensive accounts of the basic principles and broad range of applications of STM and related scanning probe techniques. The applications discussed in this volume come predominantly from the fields of electrochemistry and biology. In contrast to those described in Vol. I, these studies may be performed in air and in liquids. The extensions of the basic technique to map other interactions are described in chapters on scanning force microscopy, magnetic force microscopy, scanning near-field optical microscopy, together with a survey of other related techniques. Also described here is the use of a scanning proximal probe for surface modification. Together, the two volumes give a comprehensive account of experimental aspects of STM. They provide essential reading and reference material for all students and researchers involved in this field.

  12. Scanning tunneling microscopy II further applications and related scanning techniques

    CERN Document Server

    Güntherodt, Hans-Joachim

    1995-01-01

    Scanning Tunneling Microscopy II, like its predecessor, presents detailed and comprehensive accounts of the basic principles and broad range of applications of STM and related scanning probe techniques. The applications discussed in this volume come predominantly from the fields of electrochemistry and biology. In contrast to those described in STM I, these studies may be performed in air and in liquids. The extensions of the basic technique to map other interactions are described in chapters on scanning force microscopy, magnetic force microscopy, and scanning near-field optical microscopy, together with a survey of other related techniques. Also described here is the use of a scanning proximal probe for surface modification. Together, the two volumes give a comprehensive account of experimental aspects of STM. They provide essential reading and reference material for all students and researchers involved in this field. In this second edition the text has been updated and new methods are discussed.

  13. Active structural control with stable fuzzy PID techniques

    CERN Document Server

    Yu, Wen

    2016-01-01

    This book presents a detailed discussion of intelligent techniques to measure the displacement of buildings when they are subjected to vibration. It shows how these techniques are used to control active devices that can reduce vibration 60–80% more effectively than widely used passive anti-seismic systems. After introducing various structural control devices and building-modeling and active structural control methods, the authors propose offset cancellation and high-pass filtering techniques to solve some common problems of building-displacement measurement using accelerometers. The most popular control algorithms in industrial settings, PD/PID controllers, are then analyzed and combined with fuzzy compensation. The stability of this combination is proven with standard weight-training algorithms. These conditions provide explicit methods for selecting PD/PID controllers. Finally, fuzzy-logic and sliding-mode control are applied to the control of wind-induced vibration. The methods described are support...

  14. Modelling catchment areas for secondary care providers: a case study.

    Science.gov (United States)

    Jones, Simon; Wardlaw, Jessica; Crouch, Susan; Carolan, Michelle

    2011-09-01

    Hospitals need to understand patient flows in an increasingly competitive health economy. New initiatives like Patient Choice and the Darzi Review further increase this demand. Essential to understanding patient flows are demographic and geographic profiles of health care service providers, known as 'catchment areas' and 'catchment populations'. This information helps Primary Care Trusts (PCTs) to review how their populations are accessing services, measure inequalities and commission services; likewise it assists Secondary Care Providers (SCPs) to measure and assess potential gains in market share, redesign services, evaluate admission thresholds and plan financial budgets. Unlike PCTs, SCPs do not operate within fixed geographic boundaries. Traditionally, SCPs have used administrative boundaries or arbitrary drive times to model catchment areas. Neither approach satisfactorily represents current patient flows. Furthermore, these techniques are time-consuming and can be challenging for healthcare managers to exploit. This paper presents three different approaches to define catchment areas, each more detailed than the previous method. The first approach 'First Past the Post' defines catchment areas by allocating a dominant SCP to each Census Output Area (OA). The SCP with the highest proportion of activity within each OA is considered the dominant SCP. The second approach 'Proportional Flow' allocates activity proportionally to each OA. This approach allows for cross-boundary flows to be captured in a catchment area. The third and final approach uses a gravity model to define a catchment area, which incorporates drive or travel time into the analysis. Comparing approaches helps healthcare providers to understand whether using more traditional and simplistic approaches to define catchment areas and populations achieves the same or similar results as complex mathematical modelling. This paper has demonstrated, using a case study of Manchester, that when estimating
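The first two allocation rules described above can be sketched directly from admissions counts. This is a minimal sketch under assumed toy data; the OA codes, provider names, and counts are illustrative and not taken from the case study:

```python
from collections import defaultdict

# Hypothetical admission counts: (output_area, provider) -> number of admissions.
admissions = {
    ("OA1", "Hospital A"): 80, ("OA1", "Hospital B"): 20,
    ("OA2", "Hospital A"): 30, ("OA2", "Hospital B"): 70,
}

def first_past_the_post(admissions):
    """Assign each OA wholly to the provider with the most admissions there."""
    by_oa = defaultdict(dict)
    for (oa, provider), n in admissions.items():
        by_oa[oa][provider] = n
    return {oa: max(counts, key=counts.get) for oa, counts in by_oa.items()}

def proportional_flow(admissions):
    """Split each OA across providers in proportion to admissions,
    so cross-boundary flows are retained in the catchment."""
    totals = defaultdict(int)
    for (oa, _), n in admissions.items():
        totals[oa] += n
    return {(oa, p): n / totals[oa] for (oa, p), n in admissions.items()}

print(first_past_the_post(admissions))  # OA1 -> Hospital A, OA2 -> Hospital B
print(proportional_flow(admissions)[("OA1", "Hospital B")])  # 0.2
```

The gravity-model variant would additionally weight each flow by a decreasing function of drive or travel time, which is why it needs data the first two rules do not.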

  15. The influence of narrative practice techniques on child behaviors in forensic interviews.

    Science.gov (United States)

    Anderson, Gwendolyn D; Anderson, Jennifer N; Gilgun, Jane F

    2014-01-01

    During investigations of child sexual abuse, forensic interviewers must maintain a delicate balance of providing support for the child while collecting forensic evidence about the abuse allegation required for credible evidence for court purposes. The use of narrative practice techniques can achieve both goals by creating conditions that facilitate the possibility that children will feel safe enough to provide detailed descriptions of the alleged abuse. This article reports findings from an evaluation of a change in practice using the CornerHouse Forensic Interview Protocol in which narrative practice techniques were incorporated into the interview format. Findings show that children provided more detailed accounts of abuse when interviewers used open-ended questions and supportive statements through narrative practice.

  16. CDC WONDER: Detailed Mortality - Underlying Cause of Death

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Detailed Mortality - Underlying Cause of Death data on CDC WONDER are county-level national mortality and population data spanning the years 1999-2009. Data are...

  17. Analysis of Common Fatigue Details in Steel Truss Structures

    Institute of Scientific and Technical Information of China (English)

    张玉玲; 潘际炎; 潘际銮

    2004-01-01

    Generally, the number of fatigue cycles, the range of the repeated stresses, and the type of the structural details are the key factors affecting fatigue in large-scale welded structures. Seven types of structure details were tested using a 2000-kN hydraulic-pressure-servo fatigue machine to imitate fatigue behavior in modern steel-truss-structures fabricated using thicker welded steel plates and integral joint technology. The details included longitudinal edge welds, welded attachment affecting detail, integral joint, and weld repairs on plate edges. The fatigue damage locations show that the stress (normal or shear), the shape, and the location of the weld start and end points are three major factors reducing the fatigue strength. The test results can be used for similar large structures.

  18. 42 CFR 401.118 - Deletion of identifying details.

    Science.gov (United States)

    2010-10-01

    ... Deletion of identifying details. When CMS publishes or otherwise makes available an opinion or order, statement of policy, or other record which relates to a private party or parties, the name or names or...

  19. A detailed discussion of superfield supergravity prepotential perturbations

    Science.gov (United States)

    Ovalle, J.

    2011-04-01

    This paper presents a detailed discussion of the issue of supergravity perturbations around the flat five dimensional superspace required for manifest superspace formulations of the supergravity side of the AdS_{5}/CFT_{4} Correspondence.

  20. Enabling Detailed Energy Analyses via the Technology Performance Exchange: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Studer, D.; Fleming, K.; Lee, E.; Livingood, W.

    2014-08-01

    One of the key tenets to increasing adoption of energy efficiency solutions in the built environment is improving confidence in energy performance. Current industry practices make extensive use of predictive modeling, often via the use of sophisticated hourly or sub-hourly energy simulation programs, to account for site-specific parameters (e.g., climate zone, hours of operation, and space type) and arrive at a performance estimate. While such methods are highly precise, they invariably provide less than ideal accuracy due to a lack of high-quality, foundational energy performance input data. The Technology Performance Exchange was constructed to allow the transparent sharing of foundational, product-specific energy performance data, and leverages significant, external engineering efforts and a modular architecture to efficiently identify and codify the minimum information necessary to accurately predict product energy performance. This strongly-typed database resource represents a novel solution to a difficult and established problem. One of the most exciting benefits is the way in which the Technology Performance Exchange's application programming interface has been leveraged to integrate contributed foundational data into the Building Component Library. Via a series of scripts, data is automatically translated and parsed into the Building Component Library in a format that is immediately usable to the energy modeling community. This paper (1) presents a high-level overview of the project drivers and the structure of the Technology Performance Exchange; (2) offers a detailed examination of how technologies are incorporated and translated into powerful energy modeling code snippets; and (3) examines several benefits of this robust workflow.

  1. Energy Issues In Mobile Telecom Network: A Detailed Analysis

    Directory of Open Access Journals (Sweden)

    P. Balagangadhar Rao

    2011-12-01

    Full Text Available Diesel and conventional energy costs are increasing at twice the growth rate of revenues of the mobile telecom network infrastructure industry, so there is an urgent need to reduce operating expenditure (OPEX) on this front. While bridging the rural and urban divide, telecom operators should adopt stronger regulations for climate control by reducing greenhouse gases like CO2. This strengthens the business case for renewable energy technology. Solutions like solar, fuel cells, wind, biomass, and geothermal can be explored and implemented in the energy-starved telecom sector. Such sources provide clean and green energy; they are free and infinitely available. These technologies, which use natural resources, are not only suitable for stand-alone applications but also have a long life span, and their maintenance cost is quite minimal. The most important advantage of using these natural resources is a low carbon footprint, and they are silent energy sources. Among these, solar-based solutions are available as ground- or tower-mounted variants. Hybrid technology solutions like solar-solar, solar-DCDG (Direct Current Diesel Generator), or solar-battery bank can be put into use to cut down OPEX. Further, a single multi-fuel cell can be used, which can run on ethanol, biofuel, compressed natural gas (CNG), liquefied petroleum gas (LPG), or pyrolysis oil. Also, storage solutions like lithium-ion batteries reduce diesel generator run hours, offering about fifty percent savings on the operating expenditure front. A detailed analysis is made in this paper of the energy requirements of a mobile telecom network; minimising operating costs through technologies that harvest natural resources; sharing infrastructure between different operators; and improving energy efficiency by adopting the latest storage backup technologies.

  2. Detailed characterization of the substrate specificity of mouse wax synthase.

    Science.gov (United States)

    Miklaszewska, Magdalena; Kawiński, Adam; Banaś, Antoni

    2013-01-01

    Wax synthases are membrane-associated enzymes catalysing the esterification reaction between fatty acyl-CoA and a long chain fatty alcohol. In living organisms, wax esters function as storage materials or provide protection against harmful environmental influences. In industry, they are used as ingredients for the production of lubricants, pharmaceuticals, and cosmetics. Currently the biological sources of wax esters are limited to jojoba oil. In order to establish large-scale production of desired wax esters in transgenic high-yielding oilseed plants, enzymes involved in wax ester synthesis from different biological resources should be characterized in detail, taking into consideration their substrate specificity. Therefore, this study aims at determining the substrate specificity of one such enzyme, the mouse wax synthase. The gene encoding this enzyme was expressed heterologously in Saccharomyces cerevisiae. In the in vitro assays (using the microsomal fraction from transgenic yeast), we evaluated the preferences of mouse wax synthase towards a set of combinations of 11 acyl-CoAs with 17 fatty alcohols. The highest activity was observed for 14:0-CoA, 12:0-CoA, and 16:0-CoA in combination with medium chain alcohols (up to 5.2, 3.4, and 3.3 nmol wax esters/min/mg microsomal protein, respectively). Unsaturated alcohols longer than 18 carbons were better utilized by the enzyme than the saturated ones. Combinations of all tested alcohols with 20:0-CoA, 22:1-CoA, or Ric-CoA were poorly utilized by the enzyme, and conjugated acyl-CoAs were not utilized at all. Apart from the wax synthase activity, mouse wax synthase also exhibited a very low acyl-CoA:diacylglycerol acyltransferase activity. However, it displayed neither acyl-CoA:monoacylglycerol acyltransferase nor acyl-CoA:sterol acyltransferase activity.

  3. Detailed protein sequence alignment based on Spectral Similarity Score (SSS)

    Directory of Open Access Journals (Sweden)

    Thomas Dina

    2005-04-01

    Full Text Available Abstract Background The chemical property and biological function of a protein are a direct consequence of its primary structure. Several algorithms have been developed which determine alignment and similarity of primary protein sequences. However, character-based similarity cannot provide insight into the structural aspects of a protein. We present a method based on spectral similarity to compare subsequences of amino acids that behave similarly but are not aligned well when amino acids are considered as mere characters. This approach finds a similarity score between sequences based on any given attribute, like hydrophobicity of amino acids, on the basis of spectral information after partial conversion to the frequency domain. Results Distance matrices of various branches of the human kinome, that is, the full complement of human kinases, were developed that matched the phylogenetic tree of the human kinome, establishing the efficacy of the global alignment of the algorithm. PKCd and PKCe kinases share close biological properties and structural similarities but do not give high scores with character-based alignments. Detailed comparison established close similarities between subsequences that do not have any significant character identity. We compared their known 3D structures to establish that the algorithm is able to pick subsequences that are not considered similar by character-based matching algorithms but share structural similarities. Similarly, many subsequences with low character identity were picked between xyna-theau and xyna-clotm F/10 xylanases. Comparison of 3D structures of the subsequences confirmed the claim of similarity in structure. Conclusion An algorithm is developed which is inspired by the successful application of spectral similarity to music sequences. The method captures subsequences that do not align by traditional character-based alignment tools but give rise to similar secondary and tertiary structures. The Spectral
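The core idea, comparing attribute signals in the frequency domain rather than characters, can be sketched as follows. This is a minimal illustration assuming a small hydrophobicity table and plain cosine similarity of magnitude spectra; the paper's actual SSS scoring and alignment machinery is more elaborate:

```python
import cmath

# Illustrative hydrophobicity values for a few residues (Kyte-Doolittle-style);
# the attribute scale used by the paper may differ.
HYDRO = {"A": 1.8, "L": 3.8, "K": -3.9, "E": -3.5, "F": 2.8}

def spectrum(seq):
    """Attribute signal -> magnitude spectrum, mean removed so the
    DC component does not dominate the comparison."""
    x = [HYDRO[a] for a in seq]
    m = sum(x) / len(x)
    x = [v - m for v in x]
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n)))
            for k in range(n // 2 + 1)]

def spectral_similarity(s1, s2):
    """Cosine similarity of the magnitude spectra of two equal-length subsequences."""
    a, b = spectrum(s1), spectrum(s2)
    dot = sum(p * q for p, q in zip(a, b))
    na = sum(p * p for p in a) ** 0.5
    nb = sum(q * q for q in b) ** 0.5
    return dot / (na * nb)

# Two subsequences with zero positional character identity in common residues
# but the same period-3 hydrophobicity pattern score as highly similar.
print(spectral_similarity("ALKALKALK", "FLEFLEFLE"))
```

Here "ALKALKALK" and "FLEFLEFLE" share no aligned characters, yet their hydrophobicity signals have the same dominant frequency, which is the behaviour the abstract describes for structurally similar but character-dissimilar subsequences.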

  4. A New Video Coding Method Based on Improving Detail Regions

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The Moving Pictures Expert Group (MPEG) and H.263 standard coding methods are widely used in video compression. However, for conference telephone or videophone applications, the visual quality of detail regions such as the eyes and mouth at the decoder does not satisfy viewers. A new coding method based on improving detail regions is presented in this paper. Experimental results show that this method can improve the visual quality at the decoder.

  5. Save Energy Now Assessments Results 2008 Detailed Report

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Anthony L [ORNL; Martin, Michaela A [ORNL; Nimbalkar, Sachin U [ORNL; Quinn, James [U.S. Department of Energy; Glatt, Ms. Sandy [DOE Industrial Technologies Program; Orthwein, Mr. Bill [U.S. Department of Energy

    2010-09-01

    independently replicate the assessment process at the company's other facilities. Another important element of the Save Energy Now assessment process is the follow-up process used to identify how many of the recommended savings opportunities from individual assessments have been implemented in the industrial plants. Plant personnel involved with the Save Energy Now assessments are contacted 6 months, 12 months, and 24 months after individual assessments are completed to determine implementation results. A total of 260 Save Energy Now assessments were successfully completed in calendar year 2008. This means that a total of 718 assessments were completed in 2006, 2007, and 2008. As of July 2009, we have received a total of 239 summary reports from the ESAs that were conducted in year 2008. Hence, at the time that this report was prepared, 680 final assessment reports were completed (200 from year 2006, 241 from year 2007, and 239 from year 2008). The total identified potential cost savings from these 680 assessments is $1.1 billion per year, including natural gas savings of about 98 TBtu per year. These results, if fully implemented, could reduce CO{sub 2} emissions by about 8.9 million metric tons annually. When this report was prepared, data on implementation of recommended energy and cost savings measures from 488 Save Energy Now assessments were available. For these 488 plants, measures saving a total of $147 million per year have been implemented, measures that will save $169 million per year are in the process of being implemented, and plants are planning implementation of measures that will save another $239 million per year. The implemented recommendations are already achieving total CO{sub 2} reductions of about 1.8 million metric tons per year. 
This report provides a summary of the key results for the Save Energy Now assessments completed in 2008; details of the 6-month, 12-month, and 24-month implementation results obtained to date; and an evaluation of these

  6. A course on integration theory including more than 150 exercises with detailed answers

    CERN Document Server

    Lerner, Nicolas

    2014-01-01

    This textbook provides a detailed treatment of abstract integration theory, construction of the Lebesgue measure via the Riesz-Markov Theorem and also via the Carathéodory Theorem. It also includes some elementary properties of Hausdorff measures as well as the basic properties of spaces of integrable functions and standard theorems on integrals depending on a parameter. Integration on a product space, change-of-variables formulas as well as the construction and study of classical Cantor sets are treated in detail. Classical convolution inequalities, such as Young's inequality and Hardy-Littlewood-Sobolev inequality, are proven. Further topics include the Radon-Nikodym theorem, notions of harmonic analysis, classical inequalities and interpolation theorems including Marcinkiewicz's theorem, and the definition of Lebesgue points and the Lebesgue differentiation theorem. Each chapter ends with a large number of exercises and detailed solutions. A comprehensive appendix provides the reader with various elements...

  7. Techniques, processes, and measures for software safety and reliability. Version 3.0

    Energy Technology Data Exchange (ETDEWEB)

    Sparkman, D

    1992-05-30

    The purpose of this report is to provide a detailed survey of current recommended practices and measurement techniques for the development of reliable and safe software-based systems. This report is intended to assist the United States Office of Nuclear Reactor Regulation (NRR) in determining the importance and maturity of the available techniques and in assessing the relevance of individual standards for application to instrumentation and control systems in nuclear power generating stations. Lawrence Livermore National Laboratory (LLNL) provides technical support for the Instrumentation and Control System Branch (ICSB) of NRR in advanced instrumentation and control systems, distributed digital systems, software reliability, and the application of verification and validation for the development of software.

  8. Performance of the In Situ Microcosm Technique for Measuring the Degradation of Organic Chemicals in Aquifers

    DEFF Research Database (Denmark)

    Nielsen, Per H.; Christensen, Thomas Højlund; Albrechtsen, Hans-Jørgen

    1996-01-01

    An in situ microcosm (ISM) consists of a stainless steel cylinder isolating about 2 L of the aquifer and is equipped with valves allowing for loading and sampling from the ground surface. During the last five years, this technique has been used frequently to study the degradation of organic chemicals in polluted and pristine aquifers representing different redox environments. The ISM technique has great potential for providing field-relevant degradation potentials and rate constants, but care must be taken in using the equipment and interpreting the results. This paper provides details...

  9. Experimental Techniques

    DEFF Research Database (Denmark)

    Wyer, Jean

    2013-01-01

    Gas-phase ion spectroscopy requires specialised apparatus, both when it comes to measuring photon absorption and light emission (fluorescence). The reason is much lower ion densities compared to solution-phase spectroscopy. In this chapter different setups are described, all based on mass spectrometry... A way to circumvent this is discussed based on a chemical approach, namely tagging of ammonium groups by crown ether. Prompt dissociation can sometimes be identified from the total beam depletion differing from that due to statistical dissociation. Special emphasis in this chapter is on the limitations and pitfalls in data interpretation, and the advantages and disadvantages of the different techniques are clarified. New instrumental developments involving cryo-cooled storage rings, which show great promise for the future, are briefly touched upon.

  10. Detail Enhancement for Infrared Images Based on Propagated Image Filter

    Directory of Open Access Journals (Sweden)

    Yishu Peng

    2016-01-01

    Full Text Available For displaying high-dynamic-range images acquired by thermal camera systems, 14-bit raw infrared data must be mapped to 8-bit gray values. This paper presents a new method for detail enhancement of infrared images that displays the image with satisfactory contrast and brightness, rich detail information, and no artifacts caused by the image processing. We first adopt a propagated image filter to smooth the input image and separate it into a base layer and a detail layer. Then, we refine the base layer by using modified histogram projection for compression. Meanwhile, the adaptive weights derived from the layer decomposition are used as strict gain control for the detail layer. The final display result is obtained by recombining the two modified layers. Experimental results on both cooled and uncooled infrared data verify that the proposed method outperforms methods based on log-power histogram modification and bilateral filter-based detail enhancement in both detail enhancement and visual effect.
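The base/detail pipeline described above can be sketched on a single scanline. This sketch substitutes a plain moving average for the edge-aware propagated image filter and uses fixed illustrative gains in place of the histogram projection and adaptive weights of the paper:

```python
def box_smooth(x, r=1):
    """Moving-average smoother standing in for the propagated image filter
    (which is edge-aware; this simple stand-in is not)."""
    n = len(x)
    return [sum(x[max(0, i - r):min(n, i + r + 1)]) /
            len(x[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

def enhance(row, base_gain=0.5, detail_gain=2.0):
    """Base/detail decomposition: compress the base layer, amplify the
    detail layer, then recombine. Gains are illustrative, not from the paper."""
    base = box_smooth(row)
    detail = [v - b for v, b in zip(row, base)]
    return [base_gain * b + detail_gain * d for b, d in zip(base, detail)]

# A 14-bit-style scanline: a small edge riding on a large thermal offset.
row = [8000, 8000, 8000, 8100, 8100, 8100]
out = enhance(row)
```

The point of the decomposition is visible even in this toy case: the large offset carried by the base layer is compressed, while the small edge carried by the detail layer is amplified.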

  11. Automatic cartography techniques for earth resources research

    Science.gov (United States)

    Edson, D. T.

    1970-01-01

    Progress in developing instrumentation and software for the EROS user facilities is reported. Significant progress has been made in developing the USGS binary-mode scanning digitizer which is described in detail. Other instrumentation and processes discussed include profile-generating techniques, a manual digitizer, image correlation systems, and some new photomechanical data processing techniques.

  12. Surface science techniques

    CERN Document Server

    Walls, JM

    2013-01-01

    This volume provides a comprehensive and up to the minute review of the techniques used to determine the nature and composition of surfaces. Originally published as a special issue of the Pergamon journal Vacuum, it comprises a carefully edited collection of chapters written by specialists in each of the techniques and includes coverage of the electron and ion spectroscopies, as well as the atom-imaging methods such as the atom probe field ion microscope and the scanning tunnelling microscope. Surface science is an important area of study since the outermost surface layers play a crucial role

  13. Modern recording techniques

    CERN Document Server

    Huber, David Miles

    2013-01-01

    As the most popular and authoritative guide to recording, Modern Recording Techniques provides everything you need to master the tools and day-to-day practice of music recording and production. From room acoustics and running a session to mic placement and designing a studio, Modern Recording Techniques will give you a really good grounding in the theory and industry practice. Expanded to include the latest digital audio technology, the 7th edition now includes sections on podcasting, new surround sound formats, and HD audio. If you are just starting out or looking for a step up

  14. Defectoscopic and Clinical Applications of Infrared Technique

    Science.gov (United States)

    Kopal, I.; Koštial, P.; Špička, I.; Pleva, L.; Jančíková, Z.

    2017-02-01

    The article deals with the visualization of inhomogeneities in inorganic materials, such as laminates, as well as organic materials, such as bones. This work also provides a study of the visualization, by the IR technique, of an internal fixation device (nail) introduced into a bone. In the theoretical part, we present thermal wave propagation and a theoretical approach to the visualization of the boundary between two materials with different thermal conductivity. The experimental method is then successfully tested on the detection of artificial defects in glass laminates. In the second part of the article, a successful method for visualizing the position of the internal fixator in a bone under IR excitation is presented. Methods of processing the data measured with an infrared camera are presented in detail.

  15. Improving identification and management of partner violence: examining the process of academic detailing: a qualitative study

    Directory of Open Access Journals (Sweden)

    le Roux Helena D

    2011-06-01

    Full Text Available Abstract Background Many physicians do not routinely inquire about intimate partner violence. Purpose This qualitative study explores the process of academic detailing as an intervention to change physician behavior with regard to intimate partner violence (IPV) identification and documentation. Method A non-physician academic detailer provided a seven-session modular curriculum over a two-and-a-half month period. The detailer noted written details of each training session. Audiotapes of training sessions and semi-structured exit interviews with each physician were recorded and transcribed. Transcriptions were qualitatively and thematically coded and analyzed using Atlas ti®. Results All three study physicians reported increased clarity with regard to the scope of their responsibility to their patients experiencing IPV. They also reported increased levels of comfort in the effective identification and appropriate documentation of IPV and the provision of ongoing support to the patient, including referrals to specialized community services. Conclusion Academic detailing, if presented by a supportive and knowledgeable academic detailer, shows promise to improve physician attitudes and practices with regard to patients in violent relationships.

  16. Detailed statistical contact angle analyses; "slow moving" drops on inclining silicon-oxide surfaces.

    Science.gov (United States)

    Schmitt, M; Groß, K; Grub, J; Heib, F

    2015-06-01

    Contact angle determination by the sessile drop technique is essential to characterise surface properties in science and in industry. Different specific angles can be observed on every solid, correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist, which are often not comprehensible or reproducible. Therefore one of the most important tasks in this area is to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques for analysing dynamic contact angle measurements (sessile drop) in detail, applicable to axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point) but also the dependent analysis is explained in detail for the first time. These approaches lead to contact angle data and different access to specific contact angles which are independent of user skill and the subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is dynamically measured by the sessile drop technique when inclining the sample plate. The triple points, the inclination angles, and the downhill (advancing motion) and uphill (receding motion) angles obtained by high-precision drop shape analysis are independently and dependently statistically analysed. Due to the small distance covered, the dependent analyses (static to the "slow moving" dynamic contact angle determination... are characterised by small deviations of the computed values.
    In addition to the detailed introduction of these novel analytical approaches and the fit solution, special motion relations for the drop on inclined surfaces and detailed relations about the reactivity of the freshly cleaned silicon wafer

  17. Singular perturbation techniques in the gravitational self-force problem

    CERN Document Server

    Pound, Adam

    2010-01-01

    Much of the progress in the gravitational self-force problem has involved the use of singular perturbation techniques. Yet the formalism underlying these techniques is not widely known. I remedy this situation by explicating the foundations and geometrical structure of singular perturbation theory in general relativity. Within that context, I sketch precise formulations of the methods used in the self-force problem: dual expansions (including matched asymptotic expansions), for which I identify precise matching conditions, one of which is a weak condition arising only when multiple coordinate systems are used; multiscale expansions, for which I provide a covariant formulation; and a self-consistent expansion with a fixed worldline, for which I provide a precise statement of the exact problem and its approximation. I then present a detailed analysis of matched asymptotic expansions as they have been utilized in calculating the self-force. Typically, the method has relied on a weak matching condition, which I s...

  18. Kinetic Activation-Relaxation Technique

    CERN Document Server

    Béland, Laurent Karim; El-Mellouhi, Fedwa; Joly, Jean-François; Mousseau, Normand

    2011-01-01

    We present a detailed description of the kinetic Activation-Relaxation Technique (k-ART), an off-lattice, self-learning kinetic Monte Carlo algorithm with on-the-fly event search. Combining a topological classification for local environments and event generation with ART nouveau, an efficient unbiased sampling method for finding transition states, k-ART can be applied to complex materials with atoms in off-lattice positions or with elastic deformations that cannot be handled with standard KMC approaches. In addition to presenting the various elements of the algorithm, we demonstrate the general character of k-ART by applying the algorithm to three challenging systems: self-defect annihilation in c-Si, self-interstitial diffusion in Fe and structural relaxation in amorphous silicon.
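
    k-ART builds on the standard rejection-free (BKL) kinetic Monte Carlo step, which can be sketched as follows. The rates are arbitrary illustrative numbers; k-ART's actual ingredients (topological classification, on-the-fly event search with ART nouveau) are not modeled here.

```python
import numpy as np

def kmc_step(rates, rng):
    """One rejection-free (BKL) kinetic Monte Carlo step: choose an event
    with probability proportional to its rate, then advance the clock by
    an exponentially distributed residence time 1/sum(rates) on average."""
    total = rates.sum()
    event = int(np.searchsorted(np.cumsum(rates), rng.random() * total))
    dt = -np.log(rng.random()) / total
    return event, dt

rng = np.random.default_rng(42)
rates = np.array([1.0, 0.1, 0.01])   # hypothetical barrier-derived rates
counts = np.zeros(3)
t = 0.0
for _ in range(20000):
    e, dt = kmc_step(rates, rng)
    counts[e] += 1
    t += dt

print(counts / counts.sum())  # event frequencies approach rates / rates.sum()
```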

  19. Details and justifications for the MAP concept specification for acceleration above 63 GeV

    Energy Technology Data Exchange (ETDEWEB)

    Berg, J. Scott [Brookhaven National Lab. (BNL), Upton, NY (United States). Collider-Accelerator Dept.]

    2014-02-28

    The Muon Accelerator Program (MAP) requires a concept specification for each of the accelerator systems. The Muon accelerators will bring the beam energy from a total energy of 63 GeV to the maximum energy that will fit on the Fermilab site. Justifications and supporting references are included, providing more detail than will appear in the concept specification itself.

  20. Mind the Gap! Students' Use of Exemplars and Detailed Rubrics as Formative Assessment

    Science.gov (United States)

    Lipnevich, Anastasiya A.; McCallen, Leigh N.; Miles, Katharine Pace; Smith, Jeffrey K.

    2014-01-01

    The current study examined efficient modes for providing standardized feedback to improve performance on an assignment in a second-year college class involving writing a brief research proposal. Two forms of standardized feedback (a detailed rubric and proposal exemplars) were utilized in an experimental design with undergraduate students (N = 100)…

  1. A detailed cost analysis of in vitro fertilization and intracytoplasmic sperm injection treatment.

    NARCIS (Netherlands)

    Bouwmans, C.A.; Lintsen, B.M.; Eijkemans, M.J.; Habbema, J.D.; Braat, D.D.M.; Hakkaart, L.

    2008-01-01

    OBJECTIVE: To provide detailed information about the costs of in vitro fertilization (IVF) and intracytoplasmic sperm injection (ICSI) treatment stages and to estimate the cost per IVF and ICSI treatment cycle and ongoing pregnancy. DESIGN: Descriptive micro-costing study. SETTING: Four Dutch IVF centers

  2. Optical Super-Resolution Imaging of β-Amyloid Aggregation In Vitro and In Vivo: Method and Techniques.

    Science.gov (United States)

    Pinotsi, Dorothea; Kaminski Schierle, Gabriele S; Kaminski, Clemens F

    2016-01-01

    Super-resolution microscopy has emerged as a powerful and non-invasive tool for the study of molecular processes, both in vitro and in live cells. In particular, super-resolution microscopy has proven valuable for research on protein aggregation. In this chapter we present recent advances in this method and the specific techniques enabling the optical study of amyloid-beta aggregation, both in vitro and in cells. First, we show that variants of optical super-resolution microscopy provide the capability to visualize oligomeric and fibrillar structures directly, giving detailed information on species morphology in vitro and even in situ, in the cellular environment. We focus on direct Stochastic Optical Reconstruction Microscopy (dSTORM), which provides morphological detail on spatial scales below 20 nm, and provide detailed protocols for its implementation in the context of amyloid-beta research. Secondly, we present a range of optical techniques that offer super-resolution indirectly, which we call multi-parametric microscopy. The latter offers molecular-scale information on self-assembly reactions via changes in protein or fluorophore spectral signatures. These techniques are empowered by our recent discovery that disease-related amyloid proteins adopt intrinsic energy states upon fibrillisation. We show that fluorescence lifetime imaging provides a particularly sensitive readout of the aggregation state, which is robustly quantifiable for experiments performed either in vitro or in vivo.

  3. Older adults report moderately more detailed autobiographical memories

    Directory of Open Access Journals (Sweden)

    Robert S Gardner

    2015-05-01

    Autobiographical memory (AM) is an essential component of the human mind. Although the amount and types of subjective detail (content) that compose AMs constitute important dimensions of recall, age-related changes in memory content are not well characterized. Previously, we introduced the Cue-Recalled Autobiographical Memory test (CRAM; see http://cramtest.info), an instrument that collects subjective reports of AM content, and applied it to college-aged subjects. CRAM elicits AMs using naturalistic word-cues. Subsequently, subjects date each cued AM to a life period and count the number of remembered details from specified categories (features, e.g., temporal detail, spatial detail, persons, objects, and emotions). The current work applies CRAM to a broad range of individuals (18-78 years old) to quantify the effects of age on AM content. Subject age showed a moderately positive effect on AM content: older compared with younger adults reported ~16% more details (~25 vs. ~21) in typical AMs. This age-related increase in memory content was similarly observed for remote and recent AMs, although content declined with the age of the event among all subjects. In general, the distribution of details across features was largely consistent among younger and older adults. However, certain types of details, i.e., those related to objects and sequences of events, contributed more to the age effect on content. Altogether, this work identifies a moderate age-related, feature-specific alteration in the way life events are subjectively recalled, within an otherwise stable retrieval profile.

  4. Modelling of the photooxidation of toluene: conceptual ideas for validating detailed mechanisms

    Directory of Open Access Journals (Sweden)

    V. Wagner

    2003-01-01

    Toluene photooxidation is chosen as an example to examine how simulations of smog-chamber experiments can be used to unravel shortcomings in detailed mechanisms and to provide information on complex reaction systems that will be crucial for the design of future validation experiments. The mechanism used in this study is extracted from the Master Chemical Mechanism Version 3 (MCM v3) and has been updated with new modules for cresol and γ-dicarbonyl chemistry. Model simulations are carried out for a toluene-NOx experiment undertaken at the European Photoreactor (EUPHORE). The comparison of the simulation with the experimental data reveals two fundamental shortcomings in the mechanism: OH production is too low by about 80%, and the ozone concentration at the end of the experiment is over-predicted by 55%. The radical budget was analysed to identify the key intermediates governing radical transformation in the toluene system. Ring-opening products, particularly conjugated γ-dicarbonyls, were identified as dominant radical sources in the early stages of the experiment. The analysis of the time evolution of radical production points to a missing OH source that peaks when the system reaches its highest reactivity. First-generation products are also of major importance for ozone production in the system. The analysis of the radical budget suggests two options to explain the concurrent under-prediction of OH and over-prediction of ozone in the model: (1) missing oxidation processes that produce or regenerate OH with little or no NO-to-NO2 conversion, or (2) NO3 chemistry that sequesters reactive nitrogen oxides into stable nitrogen compounds and at the same time produces peroxy radicals. Sensitivity analysis was employed to identify significant contributors to ozone production, and it is shown how this technique, in combination with ozone isopleth plots, can be used for the design of validation experiments.

  5. Integral transform techniques for Green's function

    CERN Document Server

    Watanabe, Kazumi

    2015-01-01

    This book describes mathematical techniques for integral transforms in a detailed but concise manner. The techniques are subsequently applied to the standard partial differential equations, such as the Laplace equation, the wave equation and elasticity equations. Green's functions for beams, plates and acoustic media are also shown, along with their mathematical derivations. The Cagniard-de Hoop method for double inversion is described in detail, and 2D and 3D elastodynamic problems are treated in full. This new edition explains in detail how to introduce the branch cut for the multi-valued square root function. Further, an exact closed-form Green's function for torsional waves is presented, as well as an application technique for complex integrals involving the square root function.
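
    As a small worked illustration of the transform route described (a standard textbook computation, not an excerpt from the book): Fourier-transforming the 1D wave equation in space reduces it to an ordinary differential equation in time, and inverting the transform recovers the Green's function.

```latex
% Green's function of the 1D wave equation via a Fourier transform in x
% (standard computation, included only as an illustration)
\left(\partial_t^2 - c^2\,\partial_x^2\right) G(x,t) = \delta(x)\,\delta(t)
\quad\Longrightarrow\quad
\tilde G_{tt}(k,t) + c^2 k^2\,\tilde G(k,t) = \delta(t),
\qquad
\tilde G(k,t) = \frac{\sin(ckt)}{ck}\,H(t).
% Inverting the spatial transform:
G(x,t) = \frac{1}{2\pi}\int_{-\infty}^{\infty}
         \frac{\sin(ckt)}{ck}\,e^{\mathrm{i}kx}\,dk
       = \frac{1}{2c}\,H\!\left(ct-|x|\right).
```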

  6. Novel Foraminal Expansion Technique

    Science.gov (United States)

    Senturk, Salim; Ciplak, Mert; Oktenoglu, Tunc; Sasani, Mehdi; Egemen, Emrah; Yaman, Onur; Suzer, Tuncer

    2016-01-01

    The technique we describe was developed for cervical foraminal stenosis cases so severe that a keyhole foraminotomy would not be effective. The technique outlined in this study provides adequate enlargement of the entire cervical foraminal diameter. This study reports on a novel foraminal expansion technique. Linear drilling was performed in the middle of the facet joint. A small bone graft was placed between the divided lateral masses after distraction, and lateral mass stabilization was performed with screws and rods following the expansion procedure. The cervical foramen was thus drilled linearly from medial to lateral, expanded with small bone grafts, and supplemented with lateral mass instrumentation. The patient did well after the surgery. The novel foraminal expansion is an effective surgical method for severe foraminal stenosis. PMID:27559460

  7. Providing scalable system software for high-end simulations

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-12-31

    Detailed, full-system, complex physics simulations have been shown to be feasible on systems containing thousands of processors. In order to manage these computer systems it has been necessary to create scalable system services. In this talk Sandia's research on scalable systems will be described. The key concepts of low-overhead data movement through portals and of flexible services through multi-partition architectures will be illustrated in detail. The talk will conclude with a discussion of how these techniques can be applied outside of the standard monolithic MPP system.

  8. Applications of the diffraction technique in solid state chemistry from "ab-initio" structure solution to final structure refinement: powder and single crystal

    OpenAIRE

    Napolitano, Emilio

    2011-01-01

    Establishing the crystal structure in solid state chemistry is often a pre-requisite for understanding and predicting the function and technological properties of matter. The single crystal and powder diffraction approaches play a fundamental role in achieving this goal. These two methods are non-destructive analytical techniques which provide detailed information about the internal lattice of crystalline substances: unit cell dimensions, bond-lengths, bond-angles, and details of site-or...

  9. Detailed flow, hydrometeor and lightning characteristics of an isolated thunderstorm during COPS

    Directory of Open Access Journals (Sweden)

    K. Schmidt

    2012-04-01

    The three-hour life-cycle of the isolated thunderstorm on 15 July 2007 during the Convective and Orographically-induced Precipitation Study (COPS) is documented in detail, with special emphasis on the rapid development and mature phases. Remote sensing techniques such as 5-min rapid scans from geostationary satellites, combined velocity retrievals from up to four Doppler radars, polarimetric determination of hydrometeors, and spatio-temporal occurrences of lightning strokes are employed to arrive at a synoptic quantification of the physical parameters of this event, which was rare during the COPS period.

    Inner cloud flow fields are available from radar multiple-Doppler analyses, gridded on a 500 m mesh, at four consecutive times separated by 15-min intervals (14:35, 14:50, 15:05, 15:20; all times are in UTC). They contain horizontal winds of around 15 m s−1 and updrafts exceeding 5 m s−1, the latter collocated with lightning strokes. Reflectivity and polarimetric data indicate the existence of hail at the 2 km level around 14:40. Furthermore, polarimetric and Doppler radar variables indicate intense hydrometeor variability and cloud dynamics, corresponding to an enhanced variance of the retrieved 3-D wind fields. Profiles of flow and hydrometeor statistics over the entire cloud volume provide reference data for high-resolution, episode-type numerical weather prediction runs in research mode.

    The study embarks from two multi-channel time-lapse movie-loops from geostationary satellite imagery (provided as a Supplement), which allow an intuitive distinction of six phases making up the entire life-cycle of the thunderstorm. It concludes with a triple image-loop juxtaposing a close-up of the cloud motion as seen by Meteosat, simulated brightness temperature (as a proxy for clouds seen by the infrared satellite channel), and a perspective view of the model-generated system of cloud cells. By employing the motion-geared human

  10. Unbalance identification using the least angle regression technique

    Science.gov (United States)

    Chatzisavvas, Ioannis; Dohnal, Fadi

    2015-01-01

    The present investigation proposes a robust procedure for unbalance identification using the equivalent load method based on sparse vibration measurements. The procedure is demonstrated and benchmarked on an example rotor at constant speed. Since the number of measuring positions is much smaller than the number of possible fault locations, unbalance identification is an ill-posed problem. Previous attempts to overcome it used modal expansion in the time domain and several linear regressions in the frequency domain; however, since the solution is a sparse equivalent force vector, these methods cannot provide a robust identification procedure, and robust identification could only be achieved by supplying a-priori information on the number of unbalances to be identified. The presently proposed procedure achieves more precise unbalance identification without the need for a-priori information by incorporating a regularization technique. A well-known technique for producing sparse solutions is the Least Absolute Shrinkage and Selection Operator (LASSO). The proposed procedure is based on the generalized technique Least Angle Regression (LAR), which finds all the solutions of LASSO. A comparison of the time-domain approach, the frequency-domain approach and the proposed technique is made, and the superiority of the latter in identifying the number of possible fault locations is highlighted. The selection of the threshold of the convergence algorithm of LAR, as well as the selection of the value of the Lagrangian multiplier, is discussed in some detail.
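
    The core idea — recovering a sparse equivalent-force vector from far fewer measurements than candidate fault locations — can be illustrated with a simple greedy sparse solver (orthogonal matching pursuit) standing in for LAR/LASSO. Matrix sizes, fault locations and amplitudes below are invented for the sketch and are not from the paper.

```python
import numpy as np

def omp(A, y, n_nonzero):
    """Greedy sparse recovery: repeatedly pick the column of A most
    correlated with the residual, then refit by least squares on the
    selected support (a simple stand-in for LAR/LASSO)."""
    residual, support = y.copy(), []
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(20, 40))     # stand-in transfer matrix: 20 sensors, 40 locations
x_true = np.zeros(40)
x_true[[5, 23]] = [2.0, -1.5]     # two unbalances (sparse equivalent force)
y = A @ x_true                    # noiseless measurements

x_hat = omp(A, y, 2)
print(np.flatnonzero(np.abs(x_hat) > 1e-6))  # recovered fault locations
```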

  11. A detailed gravimetric geoid of North America, Eurasia, and Australia

    Science.gov (United States)

    Vincent, S.; Strange, W. E.

    1972-01-01

    A detailed gravimetric geoid of North America, the North Atlantic, Eurasia, and Australia, computed from a combination of satellite-derived and surface 1° x 1° gravity data, is presented. Using a consistent set of parameters, this geoid is referenced to an absolute datum. The precision of this detailed geoid is ±2 meters on the continents but may be in the range of 5 to 7 meters in those areas where data were sparse. Comparisons of the detailed gravimetric geoid with the results of Rice for the United States, Bomford and Fischer in Eurasia, and Mather in Australia are presented. Comparisons are also presented with geoid heights from satellite solutions for geocentric station coordinates in North America, the Caribbean, Europe, and Australia.

  12. Testing for detailed balance in a financial market

    Science.gov (United States)

    Fiebig, H. R.; Musgrove, D. P.

    2015-06-01

    We test a historical price-time series in a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to its usage in prevalent economic theory, the term equilibrium is here tied to the returns rather than to the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set, which is then analyzed by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
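
    The detailed-balance condition being tested, π_i P_ij = π_j P_ji, is easy to check directly when a Markov transition matrix is in hand. The two 3-state chains below are illustrative only; the paper itself works from data via an action functional and simulated annealing rather than an explicit P.

```python
import numpy as np

def detailed_balance_violation(P):
    """Largest violation of pi_i * P[i,j] == pi_j * P[j,i], where pi is
    the stationary distribution of the transition matrix P (row-stochastic)."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])   # eigenvector for eigenvalue 1
    pi = pi / pi.sum()
    flux = pi[:, None] * P                         # probability flux i -> j
    return np.max(np.abs(flux - flux.T))

# A reversible birth-death chain satisfies detailed balance...
P_rev = np.array([[0.50, 0.50, 0.00],
                  [0.25, 0.50, 0.25],
                  [0.00, 0.50, 0.50]])
# ...while a chain with a circulating probability flux does not.
P_irr = np.array([[0.0, 0.9, 0.1],
                  [0.1, 0.0, 0.9],
                  [0.9, 0.1, 0.0]])

vio_rev = detailed_balance_violation(P_rev)
vio_irr = detailed_balance_violation(P_irr)
print(vio_rev, vio_irr)
```

    A violation near zero indicates a reversible (equilibrium) chain; a persistent cyclic flux, as in the second matrix, rules detailed balance out.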

  13. Study on Hierarchical Structure of Detailed Control Planning

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Using case studies, this paper analyzes the characteristics of detailed control planning and its hierarchical controls, the form and composition of plan content, and methodological innovations. It then suggests improvements to the planning structure, oriented toward adaptability, fairness, centrality, and scientific principles, with regard to the content, methods, and results of the planning. Regarding the hierarchical control system, the paper suggests that the detailed control plan should be composed of "block planning" and "plot planning". It is believed that through a combination of block and plot planning, the problem of joining long-term and short-term planning will be solved and it will be possible to address the need for adjustment and revision of the detailed control plan.

  14. Generic Reliability-Based Inspection Planning for Fatigue Sensitive Details

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Straub, Daniel; Faber, Michael Havbro

    2005-01-01

    The generic approach for planning of in-service NDT inspections is extended to cover the case where the fatigue load is modified during the design lifetime of the structure. Generic reliability-based inspection planning has been developed as a practical approach to perform inspection planning of fatigue sensitive details in fixed offshore steel jacket platforms and FPSO ship structures. Inspection and maintenance activities are planned such that code based requirements to the safety of personnel and environment for the considered structure are fulfilled and at the same time such that the overall expected costs for design, inspections, repairs and failures are minimized. The method is based on the assumption of “no-finds” of cracks during inspections. Each fatigue sensitive detail is categorized according to their type of details (SN curves), FDF values, RSR values, inspection, repair and failure...

  15. Surgical technique for lung retransplantation in the mouse

    Science.gov (United States)

    Li, Wenjun; Goldstein, Daniel R.; Bribriesco, Alejandro C.; Nava, Ruben G.; Spahn, Jessica H.; Wang, Xingan; Gelman, Andrew E.; Krupnick, Alexander S.

    2013-01-01

    Microsurgical cuff techniques for orthotopic vascularized murine lung transplantation have allowed for the design of studies that examine mechanisms contributing to the high failure rate of pulmonary grafts. Here, we provide a detailed technical description of orthotopic lung retransplantation in mice, which we have thus far performed in 144 animals. The total time of the retransplantation procedure is approximately 55 minutes, 20 minutes for donor harvest and 35 minutes for the implantation, with a success rate exceeding 95%. The mouse lung retransplantation model represents a novel and powerful tool to examine how cells that reside in or infiltrate pulmonary grafts shape immune responses. PMID:23825768

  16. Stereo-photometric techniques for scanning micrometer scale

    Directory of Open Access Journals (Sweden)

    Rocio Cachero

    2015-11-01

    This paper describes a new methodology, based on the combination of photogrammetric and stereo-photometric techniques, that allows creating virtual replicas reproducing relief at micrometric scale, with a geometric resolution down to 7 microns. The finest details of the texture obtained by photogrammetric methods are translated to the relief of the mesh to provide quality 3D printing by additive manufacturing methods. These results open new possibilities for the virtual and physical reproduction of archeological items that require great accuracy and geometric resolution.

  17. Visualizing Vertebrate Embryos with Episcopic 3D Imaging Techniques

    Directory of Open Access Journals (Sweden)

    Stefan H. Geyer

    2009-01-01

    The creation of highly detailed, three-dimensional (3D) computer models is essential in order to understand the evolution and development of vertebrate embryos, and the pathogenesis of hereditary diseases. A still-increasing number of methods allow for generating digital volume data sets as the basis of virtual 3D computer models. This work aims to provide a brief overview of modern volume data-generation techniques, focusing on episcopic 3D imaging methods. The technical principles, advantages, and problems of episcopic 3D imaging are described. The strengths and weaknesses of its ability to visualize embryo anatomy and, specifically, labeled gene product patterns are discussed.

  18. Transformer-based design techniques for oscillators and frequency dividers

    CERN Document Server

    Luong, Howard Cam

    2016-01-01

    This book provides in-depth coverage of transformer-based design techniques that enable CMOS oscillators and frequency dividers to achieve state-of-the-art performance.  Design, optimization, and measured performance of oscillators and frequency dividers for different applications are discussed in detail, focusing on not only ultra-low supply voltage but also ultra-wide frequency tuning range and locking range.  This book will be an invaluable reference for anyone working or interested in CMOS radio-frequency or mm-Wave integrated circuits and systems.

  19. Glomerular microcapillary thrombosis demonstrated by the new technique of immunocathodoluminescence.

    Science.gov (United States)

    Schmidt, E. H.; Bröcker, W.; Wagner, H.; Pfefferkorn, G.; Beller, F. K.

    1975-01-01

    Fluorescein-labeled antigen-antibody complexes could be made visible by scanning electron microscopy using an intensifying device. This new method of immunocathodoluminescence was demonstrated on cryostat sections of rat kidneys containing glomerular fibrin as the result of endotoxin infusion. The resulting photographs correspond with those obtained by immunofluorescence microscopy; the advantage of this technique, however, is the larger depth of focus. By using thinner cryostat sections it is expected that the higher resolution of scanning microscopy will provide even better detail in three dimensions. PMID:1101704

  20. Dynamic Range Analysis of the Phase Generated Carrier Demodulation Technique

    Directory of Open Access Journals (Sweden)

    M. J. Plotnikov

    2014-01-01

    The dependence of the dynamic range of the phase generated carrier (PGC) demodulation technique on the passbands of the low-pass filters is investigated using a simulation model. A nonlinear character of this dependence, which can lead to dynamic-range limitations or measurement uncertainty, is presented for the first time. A detailed theoretical analysis is provided to verify the simulation results, and the results are consistent with the calculations performed. A method for calculating the low-pass filter passbands from the required upper limit of the dynamic range is proposed.
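
    A minimal sketch of the standard PGC arctangent demodulation scheme the record refers to. The carrier frequency, sample rate, modulation depth and phase value are assumed for illustration, and the low-pass filters are replaced by a full-record mean, which is valid here only because the phase is held constant.

```python
import numpy as np

fs, f0, C = 200_000, 1_000, 2.63   # sample rate, carrier (Hz), modulation depth (rad)
t = np.arange(10_000) / fs         # exactly 50 carrier periods
phi = 0.8                          # interferometric phase to recover (rad)

# Interferometer output with a phase-generated carrier
I = 2.0 + 1.5 * np.cos(C * np.cos(2 * np.pi * f0 * t) + phi)

# Mix with the first and second carrier harmonics, then low-pass
S1 = np.mean(I * np.cos(2 * np.pi * f0 * t))      # proportional to -J1(C) sin(phi)
S2 = np.mean(I * np.cos(2 * np.pi * 2 * f0 * t))  # proportional to -J2(C) cos(phi)

# At C = 2.63 the Bessel factors J1(C) and J2(C) are nearly equal,
# so they cancel in the ratio and the arctangent recovers phi directly.
phi_rec = np.arctan2(-S1, -S2)
print(round(phi_rec, 2))
```

    In a real demodulator the means become low-pass filters, and it is exactly their passbands that the paper's dynamic-range analysis is concerned with.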

  1. Simulation of flame-vortex interaction using detailed and reduced

    Energy Technology Data Exchange (ETDEWEB)

    Hilka, M. [Gaz de France (GDF), 75 - Paris (France)]; Veynante, D. [Ecole Centrale de Paris, Laboratoire EM2C. CNRS, 92 - Chatenay-Malabry (France)]; Baum, M. [CERFACS (France)]; Poinsot, T.J. [Centre National de la Recherche Scientifique (CNRS), 45 - Orleans-la-Source (France). Institut de Mecanique des Fluides de Toulouse]

    1996-12-31

    The interaction between a pair of counter-rotating vortices and a lean premixed CH₄/O₂/N₂ flame (Φ = 0.55) has been studied by direct numerical simulations using detailed and reduced chemical reaction schemes. Results from the complex chemistry simulation are discussed with respect to earlier experiments, and differences between the simulations using detailed and reduced chemistry are investigated. Transient evolutions of the flame surface and the total heat release rate are compared, and modifications in the evolution of the local flame structure are displayed. (authors) 22 refs.

  2. Detailed field test of yaw-based wake steering

    DEFF Research Database (Denmark)

    Fleming, P.; Churchfield, M.; Scholbrock, A.

    2016-01-01

    This paper describes a detailed field-test campaign to investigate yaw-based wake steering. In yaw-based wake steering, an upstream turbine intentionally misaligns its yaw with respect to the inflow to deflect its wake away from a downstream turbine, with the goal of increasing total power production. In the first phase, a nacelle-mounted scanning lidar was used to verify wake deflection of a misaligned turbine and to calibrate wake deflection models. In the second phase, these models were used within a yaw controller to achieve a desired wake deflection. This paper details the experimental...

  3. Benefits of detailed models of muscle activation and mechanics

    Science.gov (United States)

    Lehman, S. L.; Stark, L.

    1981-01-01

    Recent biophysical and physiological studies identified some of the detailed mechanisms involved in excitation-contraction coupling, muscle contraction, and deactivation. Mathematical models incorporating these mechanisms allow independent estimates of key parameters, direct interplay between basic muscle research and the study of motor control, and realistic model behaviors, some of which are not accessible to previous, simpler, models. The existence of previously unmodeled behaviors has important implications for strategies of motor control and identification of neural signals. New developments in the analysis of differential equations make the more detailed models feasible for simulation in realistic experimental situations.

  4. Exoplanet Detection Techniques

    CERN Document Server

    Fischer, Debra A; Laughlin, Greg P; Macintosh, Bruce; Mahadevan, Suvrath; Sahlmann, Johannes; Yee, Jennifer C

    2015-01-01

    We are still in the early days of exoplanet discovery. Astronomers are beginning to model the atmospheres and interiors of exoplanets and have developed a deeper understanding of processes of planet formation and evolution. However, we have yet to map out the full complexity of multi-planet architectures or to detect Earth analogues around nearby stars. Reaching these ambitious goals will require further improvements in instrumentation and new analysis tools. In this chapter, we provide an overview of five observational techniques that are currently employed in the detection of exoplanets: optical and IR Doppler measurements, transit photometry, direct imaging, microlensing, and astrometry. We provide a basic description of how each of these techniques works and discuss forefront developments that will result in new discoveries. We also highlight the observational limitations and synergies of each method and their connections to future space missions.
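
    Two of the techniques surveyed above admit quick order-of-magnitude estimates from textbook formulas; the constants and the Jupiter/Sun example below are standard values, not taken from the chapter.

```python
import math

# Transit photometry: fractional flux drop = (Rp / Rs)**2
R_jup, R_sun = 7.1492e7, 6.957e8          # planet and star radii (m)
depth = (R_jup / R_sun) ** 2              # Jupiter transiting a Sun-like star
print(f"transit depth ~ {depth:.4f}")     # about 1%

# Doppler method: stellar reflex semi-amplitude for a circular orbit,
# K = (2*pi*G / P)**(1/3) * m_p * sin(i) / (M_s + m_p)**(2/3)
G = 6.674e-11                             # gravitational constant (SI)
P = 11.86 * 365.25 * 86400                # Jupiter's orbital period (s)
m_p, M_s = 1.898e27, 1.989e30             # masses (kg), assuming sin(i) = 1
K = (2 * math.pi * G / P) ** (1 / 3) * m_p / (M_s + m_p) ** (2 / 3)
print(f"RV semi-amplitude ~ {K:.1f} m/s") # about 12.5 m/s
```

    These numbers explain the observational challenge the chapter describes: detecting an Earth analogue requires photometric precision well below 0.01% and radial-velocity precision below 10 cm/s.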

  5. MDP challenges from a software provider's perspective

    Science.gov (United States)

    Ohara, Shuichiro

    2014-10-01

    We are a mask data preparation (MDP) software provider and have been supplying MDP systems to mask shops since 1990. Like the rest of the industry, MDP software providers face new challenges every day, and the challenges get tougher as process nodes shrink and data complexity and volume increase. We discuss such MDP challenges and their solutions in this paper from an MDP software provider's perspective. Data volume increases continuously, driven by process-node shrinks. In addition, resolution enhancement techniques (RET) such as optical proximity correction (OPC) and the inverse lithography technique (ILT) induce data complexity, which contributes considerably to the increase in data volume. The growth of data volume and complexity brings challenges to MDP systems in computing speed, shot count, and mask process correction (MPC). New tools (especially mask writers) also bring new challenges. Variable-shaped E-beam (VSB) mask writers demand fracturing with fewer slivers, for CD accuracy, and lower figure counts, for write-time requirements. Multi-beam mask writers are now under development and will certainly bring new challenges of their own.

  6. Providing Services to Survivors of Domestic Violence: A Comparison of Rural and Urban Service Provider Perceptions

    Science.gov (United States)

    Eastman, Brenda J.; Bunch, Shelia Grant

    2007-01-01

    Although there is a considerable body of knowledge about domestic violence, a limited proportion focuses on domestic violence in rural settings. Using a nonprobability purposive sampling technique, 93 providers of domestic violence services from rural and urban localities in North Carolina and Virginia were located and asked to complete a…

  7. State of the art review of radioactive waste volume reduction techniques for commercial nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    1980-04-01

    A review is made of the state of the art of volume reduction techniques for low level liquid and solid radioactive wastes produced as a result of: (1) operation of commercial nuclear power plants, (2) storage of spent fuel in away-from-reactor facilities, and (3) decontamination/decommissioning of commercial nuclear power plants. The types of wastes and their chemical, physical, and radiological characteristics are identified. Methods used by industry for processing radioactive wastes are reviewed and compared to the new techniques for processing and reducing the volume of radioactive wastes. A detailed system description and report on operating experiences follow for each of the new volume reduction techniques. In addition, descriptions of volume reduction methods presently under development are provided. The Appendix records data collected during site surveys of vendor facilities and operating power plants. A Bibliography is provided for each of the various volume reduction techniques discussed in the report.

  8. Crucial role of detailed function, task, timeline, link and human vulnerability analyses in HRA

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, T.G.; Haney, L.N.; Ostrom, L.T.

    1992-10-01

    This paper addresses one major cause of large uncertainties in human reliability analysis (HRA) results: the absence of detailed function, task, timeline, link and human vulnerability analyses. All too often this crucial step in the HRA process is done in a cursory fashion, using word of mouth or written procedures which may themselves incompletely or inaccurately represent the human action sequences and human error vulnerabilities being analyzed. The paper examines the potential contributions these detailed analyses can make in achieving quantitative and qualitative HRA results which are: (1) credible, that is, minimizing uncertainty; (2) auditable, that is, systematically linking quantitative results to the qualitative information from which they are derived; (3) capable of supporting root cause analyses of human reliability factors determined to be major contributors to risk; and (4) capable of repeated measures and of being combined with similar results from other analyses to examine HRA issues transcending individual systems and facilities. Based on experience analyzing test and commercial nuclear reactors and medical applications of nuclear technology, an iterative process is suggested for performing detailed function, task, timeline, link and human vulnerability analyses using documentation reviews, open-ended and structured interviews, direct observations, and group techniques. Finally, the paper concludes that detailed analyses done in this manner by knowledgeable human factors practitioners can contribute significantly to the credibility, auditability, causal factor analysis, and combining goals of the HRA.

  9. SUSY using boosted techniques

    CERN Document Server

    Stark, Giordon; The ATLAS collaboration

    2016-01-01

    In this talk, I present a discussion of techniques used in supersymmetry searches in papers published by the ATLAS Collaboration from late Run 1 to early Run 2. The goal is to highlight concepts the analyses have in common, why/how they work, and possible SUSY searches that could benefit from boosted studies. Theoretical background will be provided for reference to encourage participants to explore in depth on their own time.

  10. Babesiosis for Health Care Providers

    Centers for Disease Control (CDC) Podcasts

    2012-04-25

    This podcast will educate health care providers on diagnosing babesiosis and providing patients at risk with tick bite prevention messages.  Created: 4/25/2012 by Center for Global Health, Division of Parasitic Diseases and Malaria.   Date Released: 4/25/2012.

  11. HMO partnering: the provider dilemma.

    Science.gov (United States)

    Ayers, J; Benson, L; Bonhag, R

    1996-10-01

    While the growth of HMOs has slowed patient visits to doctors, it also has created a deluge of press clippings. On July 16, 1996, three articles on the subject appeared in the front section of the Wall Street Journal. The headlines painted a vivid picture of the forces acting on HMOs and providers alike (Figure 1). The articles portended more change for healthcare. The "shake-out," a term applied to industries in serious transformation, brings shedding of excess capacity and loss of jobs and income. Providers, in particular, find themselves in a difficult dilemma. They must not only cut costs as reimbursement drops, but also retain patients with good outcomes and high-quality service. Patient retention means keeping the individual patient from switching to another provider and keeping the insurer's group of patients by remaining an authorized provider for that insurer. The relationship between provider and HMO lies at the heart of the provider dilemma. The HMO structure, which shifts financial risk for care, is quickly setting the standard for healthcare pricing, medical standards, and management practices. Understanding and responding to HMO needs are vital to competitive advantage and survival. The article discusses the inherent dilemma of HMO and provider partnering and suggests provider responses.

  12. Factors Affecting Two Types of Memory Specificity: Particularization of Episodes and Details.

    Science.gov (United States)

    Willén, Rebecca M; Granhag, Pär Anders; Strömwall, Leif A

    2016-01-01

    Memory for repeated events is relevant to legal investigations about repeated occurrences. We investigated how two measures of specificity (number of events referred to and amount of detail reported about the events) were influenced by interviewees' age, number of experienced events, interviewer, perceived unpleasantness, and memory rehearsal. Transcribed narratives consisting of over 40,000 utterances from 95 dental patients, and the corresponding dental records, were studied. Amount of detail was measured by categorizing the utterances as generic, specific, or specific-extended. We found that the two measures were affected differently by all five factors. For instance, number of experienced events positively influenced number of referred events but had no effect on amount of detail provided about the events. We make suggestions for future research and encourage reanalysis of the present data set and reuse of the material.

  13. 5 CFR 2635.104 - Applicability to employees on detail.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Applicability to employees on detail. 2635.104 Section 2635.104 Administrative Personnel OFFICE OF GOVERNMENT ETHICS GOVERNMENT ETHICS STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE EXECUTIVE BRANCH General Provisions §...

  14. Probabilistic Model for Fatigue Crack Growth in Welded Bridge Details

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard; Yalamas, Thierry

    2013-01-01

    In the present paper a probabilistic model for fatigue crack growth in welded steel details in road bridges is presented. The probabilistic model takes the influence of bending stresses in the joints into account. The bending stresses can either be introduced by e.g. misalignment or redistributio...

  15. Baca geothermal demonstration project. Power plant detail design document

    Energy Technology Data Exchange (ETDEWEB)

    1981-02-01

    This Baca Geothermal Demonstration Power Plant document presents the design criteria and detail design for power plant equipment and systems, as well as discussing the rationale used to arrive at the design. Where applicable, results of in-house evaluations of alternatives are presented.

  16. Spectangular - Spectral Disentangling For Detailed Chemical Analysis Of Binaries

    Science.gov (United States)

    Sablowski, Daniel

    2016-08-01

    Disentangling of spectra helps to improve the orbit parameters and allows detailed chemical analysis. Spectangular is a GUI program written in C++ for the spectral disentangling of spectra of SB1 and SB2 systems. It is based on singular value decomposition in the wavelength space and is coupled to an orbital solution. The results are the component spectra and the orbital parameters.
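    The singular-value-decomposition idea at the heart of such disentangling can be illustrated with a toy sketch (this is not Spectangular's actual algorithm; the spectra and weights below are made up): composite observations of a two-component system form a low-rank matrix, and the singular values reveal how many component spectra are present.

    ```python
    import numpy as np

    # Toy sketch of SVD-based disentangling (not Spectangular itself): composite
    # observations of an SB2 binary are modeled as linear combinations of two
    # component spectra on a common wavelength grid, so the observation matrix
    # has rank 2 and its singular values reveal the number of components.
    rng = np.random.default_rng(0)
    wavelength = np.linspace(0.0, 1.0, 500)

    # Two hypothetical component spectra (continuum with Gaussian absorption lines).
    comp_a = 1.0 - 0.5 * np.exp(-((wavelength - 0.3) ** 2) / 0.001)
    comp_b = 1.0 - 0.3 * np.exp(-((wavelength - 0.7) ** 2) / 0.002)

    # Ten composite observations with varying light-ratio weights.
    weights = rng.uniform(0.2, 0.8, size=(10, 1))
    observations = weights * comp_a + (1.0 - weights) * comp_b

    # Only two singular values are significant, confirming two components.
    singular_values = np.linalg.svd(observations, compute_uv=False)
    print(singular_values[:3])
    ```

    In a real application the component spectra are unknown and Doppler-shifted from epoch to epoch, so the decomposition is coupled to the orbital solution rather than read off directly as here.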

  17. Reliability assessment of welded steel details in bridges using inspection

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard; Yalamas, T.

    2014-01-01

    of the membrane stresses are estimated using a generic bridge structure and traffic measurements. The optimal reliability level for a welded detail in a bridge subjected to fatigue are estimated by cost benefit-analysis taking into account the risk of human lives through the Life Quality Index. Since the optimal...

  18. The Hybrid Motor Prototype: Design Details and Demonstration Results

    Science.gov (United States)

    1998-01-01

    S. Ueha and Y. Tomikawa [3] have published some interesting details of the performance and life of ultrasonic motors with different frictional materials...be published as a technical report of the Institute for Systems Research, University of Maryland at College Park. [3] S. Ueha and Y. Tomikawa, Ultrasonic Motors: Theory and Applications. Clarendon Press, Oxford, 1993.

  19. Video-based facial animation with detailed appearance texture

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Facial shape transformation described by facial animation parameters (FAPs) involves the dynamic movement or deformation of the eyes, brows, mouth, and lips, while detailed facial appearance concerns facial textures such as creases, wrinkles, etc. Video-based facial animation exhibits not only facial shape transformation but also detailed appearance updates. In this paper, a novel algorithm for effectively extracting FAPs from video is proposed. Our system adopts the ICA-enforced direct appearance model (DAM) to track faces in video sequences; FAPs are then extracted from every frame of the video based on an extended model of Wincandidate 3.1. Facial appearance details are transferred from each frame by mapping an expression ratio image to the original image. We adopt wavelets to synthesize expressive details by combining the low-frequency signals of the original face with the high-frequency signals of the expressive face from each frame of the video. Experimental results show that the proposed algorithm is suitable for reproducing realistic, expressive facial animations.
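    The low/high-frequency combination step described above can be sketched with a single-level Haar transform on a 1-D signal (an illustrative stand-in for the paper's wavelet synthesis on face images; the signals below are synthetic):

    ```python
    import numpy as np

    # Sketch of the low/high-frequency blending step using a single-level Haar
    # transform on a 1-D signal; the "faces" here are synthetic stand-ins for
    # the paper's wavelet synthesis on images.

    def haar(x):
        """Single-level Haar transform: returns (approximation, detail)."""
        even, odd = x[0::2], x[1::2]
        return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

    def inverse_haar(approx, detail):
        """Inverse of haar(): interleaves the reconstructed samples."""
        out = np.empty(approx.size * 2)
        out[0::2] = (approx + detail) / np.sqrt(2)
        out[1::2] = (approx - detail) / np.sqrt(2)
        return out

    original = np.sin(np.linspace(0.0, 2.0 * np.pi, 64))     # smooth "original face"
    expressive = original + 0.2 * np.random.default_rng(0).standard_normal(64)

    low_orig, _ = haar(original)        # low-frequency signal of the original
    _, high_expr = haar(expressive)     # high-frequency detail of the expressive signal

    # Combine: original's coarse shape plus the expressive signal's fine detail.
    blended = inverse_haar(low_orig, high_expr)
    ```

    Multi-level 2-D wavelet transforms on images follow the same pattern: keep the approximation coefficients of one source and the detail coefficients of the other before inverting.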

  20. Details from the Dashboard: Charter School Race/Ethnicity Demographics

    Science.gov (United States)

    National Alliance for Public Charter Schools, 2012

    2012-01-01

    This "Details from the Dashboard" report examines race/ethnicity breakouts for public charter schools and traditional public schools at the state and the school district level. The data in this report indicate that in the large majority of states, the race/ethnicity student demographics of charter schools are almost identical to those of the…

  1. Details from the Dashboard: Charter Schools by Geographic Region

    Science.gov (United States)

    National Alliance for Public Charter Schools, 2012

    2012-01-01

    While a majority of charter schools nationwide operate in urban and suburban areas, charter schools exist in all corners of the nation, and are expanding into all types of communities. This "Details from the Dashboard" report presents statistics on the number of charter schools and students enrolled in charter schools by the four geographic…

  2. Flavor Asymmetry of Nucleon Sea from Detailed Balance

    CERN Document Server

    Zhang, Y J; Yang, L M; Zhang, Yong-Jun; Ma, Bo-Qiang; Yang, Li-Ming

    2003-01-01

    In this study, the proton is taken as an ensemble of quark-gluon Fock states. Using the principle of detailed balance, we find $\bar{d}-\bar{u} \approx 0.124$, which is in surprising agreement with the experimental observation.
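    For reference, the detailed-balance principle invoked here takes the standard generic form (this is the textbook condition, not the paper's specific Fock-state derivation): for any two states $i$ and $j$ with occupation probabilities $\rho$ and transition rates $W$,

    $$\rho_i \, W_{i \to j} = \rho_j \, W_{j \to i},$$

    so that in equilibrium the forward and reverse flows between every pair of states balance exactly, which fixes the relative probabilities of the Fock states in the ensemble.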

  3. MIV project: Simulator detailed design and integration for the EUROSIM

    DEFF Research Database (Denmark)

    Thuesen, Gøsta; Parisch, Manlio; Jørgensen, John Leif;

    1997-01-01

    Under the ESA contract #11453/95/NL/JG(SC), aiming at assessing the feasibility of Rendez-vous and docking of unmanned spacecrafts, a reference mission scenario was defined. This report describes the detailed code developed for the contract, the code module interface and the interface to the EURO...

  4. Detailed bathymetric surveys in the central Indian Basin

    Digital Repository Service at National Institute of Oceanography (India)

    Kodagali, V.N.; KameshRaju, K.A.; Ramprasad, T.; George, P.; Jaisankar, S.

    Over 420,000 line kilometers of echo-sounding data was collected in the Central Indian Basin. This data was digitized, merged with navigation data and a detailed bathymetric map of the Basin was prepared. The Basin can be broadly classified...

  5. 18 CFR 2.80 - Detailed environmental statement.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Detailed environmental statement. 2.80 Section 2.80 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES GENERAL POLICY AND INTERPRETATIONS Statement of General Policy...

  6. Details of the battle to control Campeche Bay spill

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-01

    Details of the battle to control Campeche Bay spill from Petroleos Mexicanos' well at Ixtoc 1 are given, including the poor performance of ''Operation Sombrero'' and air and surface monitoring of spill transport, particularly by the US Coast Guard.

  7. Theorists and Techniques: Connecting Education Theories to Lamaze Teaching Techniques.

    Science.gov (United States)

    Podgurski, Mary Jo

    2016-01-01

    Should childbirth educators connect education theory to technique? Is there more to learning about theorists than memorizing facts for an assessment? Are childbirth educators uniquely poised to glean wisdom from theorists and enhance their classes with interactive techniques inspiring participant knowledge and empowerment? Yes, yes, and yes. This article will explore how an awareness of education theory can enhance retention of material through interactive learning techniques. Lamaze International childbirth classes already prepare participants for the childbearing year by using positive group dynamics; theory will empower childbirth educators to address education through well-studied avenues. Childbirth educators can provide evidence-based learning techniques in their classes and create true behavioral change.

  8. Detailed measurement of the flow field in an axial transonic compressor; Mesure detaillee des ecoulements dans un compresseur axial transsonique

    Energy Technology Data Exchange (ETDEWEB)

    Fradin, C.

    1998-07-01

    The prediction of the flow structure and performance of an axial transonic compressor requires accurate solvers. Given the complexity of the flow patterns, it is important to validate codes by comparing them with experimental results; the availability of experimental data is fundamentally important for the improvement of the solvers. An axial transonic compressor has been fitted in the ERECA test facility of ONERA. The rotor of this compressor is isolated from the stators. This experimental configuration produces a steady flow in the relative frame linked to the rotor, which simplifies the experimental tests because the basic phenomena are not hidden by mutual rotor-stator interactions. Measurements have been made in a test section far upstream of the rotor to provide the inlet conditions for the computation domain. Non-intrusive and fast-response measurement techniques give the detailed flow structure in several test sections located in the rotor and far downstream of it. All tests were carried out at four operating conditions of the compressor. The results provide good test cases for numerical prediction methods using three-dimensional Navier-Stokes solvers. (author)

  9. Hospitals, providers collaborate on transitions.

    Science.gov (United States)

    2012-01-01

    Baystate Health, a three-hospital system with headquarters in Springfield, MA, is partnering with post-acute providers to improve transitions as patients move through the continuum of care. A multidisciplinary post-acute performance team partnered with post-acute providers to determine why patients are readmitted to the hospital and to work on ways to avoid readmissions. Facilities share information with the hospitals about how they operate and what they need to ensure patients receive the care they need. The health system's director of post-acute services holds regular meetings with providers to brainstorm on improving patient care.

  10. Narratives of Ghanaian abortion providers

    African Journals Online (AJOL)

    AJRH Managing Editor

    Keywords: Abortion, providers, law, access, reproductive health care ... administrative materials) into the decision-making process between a ... training, research, and outreach efforts of these ... additional economic factors influence abortion.

  11. TERRAIN, PROVIDENCE COUNTY, RHODE ISLAND

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The Providence AOI consists of the coastal portion of the county, and meshes seamlessly with the Kent County AOI directly south. Ground Control is collected...

  12. Lodging Update: Providence, Rhode Island

    Directory of Open Access Journals (Sweden)

    Ragel Roginsky

    2013-04-01

    Full Text Available Each quarter, Pinnacle Advisory Group prepares an analysis of the New England lodging industry, which provides a regional summary and then focuses in depth on a particular market. These reviews look at recent and proposed supply changes, factors affecting demand and growth rates, and the effects of interactions between such supply and demand trends. In this issue, the authors spotlight the lodging market in Providence, Rhode Island.

  13. DETAILED INVESTIGATION OF THE REJUVENATION OF A SPENT ELECTROLESS NICKEL SOLUTION BY ELECTRODIALYSIS WITH A VIEW TO OPTIMIZING ELECTRODIALYSIS PERFORMANCE

    Science.gov (United States)

    The rejuvenation of spent electroless nickel baths by electrodialysis has received a considerable amount of attention over the past decade and the technique is being increasingly employed to extend electroless nickel bath life. However, thus far there has not been a detailed inve...

  14. Optimization techniques for Transportation Problems

    Directory of Open Access Journals (Sweden)

    Gauthaman.P

    2017-06-01

    Full Text Available This paper surveys optimization techniques for various problems in transportation engineering. In pavement engineering the priority issue is maintenance, while in traffic engineering it is signalling. Many optimization methods are discussed, with particular emphasis on the genetic algorithm approach. While established optimization techniques are approaching practicality, research continues on modern techniques that not only simplify problem structure but are also compatible with the present-day problems encountered in transportation engineering. Some modern tools for applying these optimization techniques are discussed; they are quite simple to use and implement once calibrated to the desired objective.
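    The genetic-algorithm approach emphasized here can be illustrated with a minimal sketch; the quadratic "delay" function, the demand profile, and all parameter choices below are hypothetical placeholders, not a traffic-engineering model:

    ```python
    import random

    # Minimal genetic-algorithm sketch (illustrative only): tune four signal
    # green times to minimize a made-up delay function.
    random.seed(1)

    DEMAND = [30, 45, 25, 35]  # hypothetical ideal green times (seconds)

    def delay(greens):
        # Penalize deviation from the hypothetical demand profile.
        return sum((g - d) ** 2 for g, d in zip(greens, DEMAND))

    def evolve(pop_size=30, generations=60):
        pop = [[random.uniform(10, 60) for _ in range(4)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=delay)                  # rank by fitness (lower is better)
            survivors = pop[: pop_size // 2]     # elitist selection
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, 4)
                child = a[:cut] + b[cut:]        # one-point crossover
                if random.random() < 0.3:        # occasional mutation
                    i = random.randrange(4)
                    child[i] += random.gauss(0, 2)
                children.append(child)
            pop = survivors + children
        return min(pop, key=delay)

    best = evolve()
    print([round(g, 1) for g in best], round(delay(best), 2))
    ```

    Real signal-timing optimizers replace the toy objective with a calibrated delay model and add constraints such as cycle length and minimum green times, but the selection/crossover/mutation loop has the same shape.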

  15. Appropriate Levels of Detail in 3-D Visualisation: the House of the Surgeon, Pompeii

    Directory of Open Access Journals (Sweden)

    P.S. Murgatroyd

    2008-02-01

    Full Text Available The use of three-dimensional computer models in archaeological projects is a relatively new but rapidly growing area, both as a means of presenting research to the public and also as an aid to interpretation. This work uses the House of the Surgeon in Pompeii as a case study and seeks to illustrate the most effective ways to display information to a range of audiences. In using the flexibility of computer modelling techniques it seeks to show that the levels of detail in a 3-D computer model can be tailored to individual circumstances to produce the most effective result.

  16. An Investigation of Placement and Type of Seductive Details: The Primacy Effect of Seductive Details on Text Recall

    Science.gov (United States)

    Rowland, Emily; Skinner, Christopher H.; Davis-Richards, Kai; Saudargas, Richard; Robinson, Daniel H.

    2008-01-01

    Seductive details are interesting, but sometimes irrelevant to the target material present in texts and lectures. In the current study, 388 undergraduate students read six paragraphs describing Sigmund Freud's psychosexual stages (i.e., target material). Participants in four groups also read one of two biographical paragraphs. The biographical…

  18. Detailed design report for an operational phase panel-closure system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-01-11

    Under contract to Westinghouse Electric Corporation (Westinghouse), Waste Isolation Division (WID), IT Corporation has prepared a detailed design of a panel-closure system for the Waste Isolation Pilot Plant (WIPP). Preparation of this detailed design of an operational-phase closure system is required to support a Resource Conservation and Recovery Act (RCRA) Part B permit application and a non-migration variance petition. This report describes the detailed design for a panel-closure system specific to the WIPP site. The recommended panel-closure system will adequately isolate the waste-emplacement panels for at least 35 years. This report provides detailed design and material engineering specifications for the construction, emplacement, and interface-grouting associated with a panel-closure system at the WIPP repository, which would ensure that an effective panel-closure system is in place for at least 35 years. The panel-closure system provides assurance that the limit for the migration of volatile organic compounds (VOC) will be met at the point of compliance, the WIPP site boundary. This assurance is obtained through the inherent flexibility of the panel-closure system.

  19. Results of Detailed Hydrologic Characterization Tests—Fiscal and Calendar Year 2005

    Energy Technology Data Exchange (ETDEWEB)

    Spane, Frank A.; Newcomer, Darrell R.

    2008-02-27

    This report provides the results of detailed hydrologic characterization tests conducted within selected Hanford Site wells during fiscal and calendar year 2005. Detailed characterization tests performed included groundwater-flow characterization, barometric response evaluation, slug tests, in-well vertical groundwater-flow assessments, and a single-well tracer and constant-rate pumping test. Hydraulic property estimates obtained from the detailed hydrologic tests include hydraulic conductivity, transmissivity, specific yield, effective porosity, in-well lateral and vertical groundwater-flow velocity, aquifer groundwater-flow velocity, and depth-distribution profiles of hydraulic conductivity. In addition, local groundwater-flow characteristics (i.e., hydraulic gradient and flow direction) were determined for a site where detailed well testing was performed. Results obtained from these tests provide hydrologic information that supports the needs of Resource Conservation and Recovery Act waste management area characterization as well as sitewide groundwater monitoring and modeling programs. These results also reduce the uncertainty of groundwater-flow conditions at selected locations on the Hanford Site.

  20. Advances in molecular marker techniques and their applications in plant sciences.

    Science.gov (United States)

    Agarwal, Milee; Shrivastava, Neeta; Padh, Harish

    2008-04-01

    Detection and analysis of genetic variation can help us to understand the molecular basis of various biological phenomena in plants. Since the entire plant kingdom cannot be covered under sequencing projects, molecular markers and their correlation to phenotypes provide us with requisite landmarks for elucidation of genetic variation. Genetic or DNA based marker techniques such as RFLP (restriction fragment length polymorphism), RAPD (random amplified polymorphic DNA), SSR (simple sequence repeats) and AFLP (amplified fragment length polymorphism) are routinely being used in ecological, evolutionary, taxonomical, phylogenic and genetic studies of plant sciences. These techniques are well established and their advantages as well as limitations have been realized. In recent years, a new class of advanced techniques has emerged, primarily derived from combination of earlier basic techniques. Advanced marker techniques tend to amalgamate advantageous features of several basic techniques. The newer methods also incorporate modifications in the methodology of basic techniques to increase the sensitivity and resolution to detect genetic discontinuity and distinctiveness. The advanced marker techniques also utilize newer classes of DNA elements such as retrotransposons, mitochondrial and chloroplast based microsatellites, thereby revealing genetic variation through increased genome coverage. Techniques such as RAPD and AFLP are also being applied to cDNA-based templates to study patterns of gene expression and uncover the genetic basis of biological responses. The review gives a detailed account of the techniques used in the identification of markers and their applicability in plant sciences.

  1. [Authentication of Trace Material Evidence in Forensic Science Field with Infrared Microscopic Technique].

    Science.gov (United States)

    Jiang, Zhi-quan; Hu, Ke-liang

    2016-03-01

    In the field of forensic science, the conventional infrared spectral analysis technique is usually unable to meet detection requirements, because only a very small amount of trace material evidence, with diverse shapes and complex compositions, can be extracted from a crime scene. Infrared microscopic technique is developed from a combination of Fourier-transform infrared spectroscopy and microscopy. It has many advantages over conventional infrared spectroscopy, such as high detection sensitivity, micro-area analysis, and nondestructive examination, and it has effectively solved the problem of authenticating trace material evidence in the field of forensic science. Additionally, almost no external interference is introduced during measurements by infrared microscopic technique, so it can satisfy the special requirement that the trace material evidence be preserved for witness in court. Real case analyses from this experimental center illustrate in detail the advantages of infrared microscopic technique in the authentication of trace material evidence. In this paper, the vibration features in infrared spectra of material evidence, including paints, plastics, rubbers, fibers, drugs and toxicants, are comparatively analyzed by means of infrared microscopic technique, in an attempt to provide powerful spectroscopic evidence for the qualitative diagnosis of various criminal and traffic accident cases. The experimental results clearly suggest that infrared microscopic technique has an incomparable advantage and has become an effective method for the authentication of trace material evidence in the field of forensic science.

  2. What greater spatial and temporal geochemistry detail can add to geobiology

    Science.gov (United States)

    Druschel, G.; Kafantaris, F. C. A.; Schroth, A. W.; Fike, D. A.; Orphan, V. J.; Schmitt-Kopplin, P.; Dvorski, S.; Clercin, N.

    2016-12-01

    Interaction between life and the surrounding chemical environment is a defining component of the field of geobiology, one where the spatial scales vary from molecular to global and the temporal scales span deep time, with reaction times ranging from femtoseconds to millennia. Sometimes the details of these interactions are as simple as identifying the key microbial species present and changes in key chemical compounds, but many systems have a richness and complexity that only greater detail, and an eye for the unseen, can deliver. Determining the appropriate scale of measurement and the chemical and biological details needed to unravel these sometimes complex interactions is key to continuing the rapid and exciting pace of discovery in the field. Focusing on the geochemical side of this process, we have shown that fine-scale spatial and temporal measurements of redox compounds central to microbial metabolisms can illuminate new avenues of possible interaction between life and its surroundings. New techniques supplying greater chemical and mineralogical detail can likewise shed new light on microbial interactions with earth systems. Measurements that track, through time, the chemical environment a single microbe may experience show us that some systems display a remarkably chaotic and variable chemistry that may place added ecological pressure on microbial function and community structure. New details of the chemical environment, particularly the coupling of different element cycles such as sulfur and carbon, show us that life and its chemical surroundings can interact significantly beyond metabolisms. And the interaction between geomicrobial niches, even in different materials such as sediments and the water column, can be strongly coupled through biotic and abiotic interactions.

  3. Ancillary Services Provided from DER

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.B.

    2005-12-21

    Distributed energy resources (DER) are quickly making their way to industry primarily as backup generation. They are effective at starting and then producing full-load power within a few seconds. The distribution system is aging and transmission system development has not kept up with the growth in load and generation. The nation's transmission system is stressed with heavy power flows over long distances, and many areas are experiencing problems in providing the power quality needed to satisfy customers. Thus, a new market for DER is beginning to emerge. DER can alleviate the burden on the distribution system by providing ancillary services while providing a cost adjustment for the DER owner. This report describes 10 types of ancillary services that distributed generation (DG) can provide to the distribution system. Of these 10 services the feasibility, control strategy, effectiveness, and cost benefits are all analyzed as in the context of a future utility-power market. In this market, services will be provided at a local level that will benefit the customer, the distribution utility, and the transmission company.

  4. Ecosystem services provided by waterbirds.

    Science.gov (United States)

    Green, Andy J; Elmberg, Johan

    2014-02-01

    Ecosystem services are ecosystem processes that directly or indirectly benefit human well-being. There has been much recent literature identifying different services and the communities and species that provide them. This is a vital first step towards management and maintenance of these services. In this review, we specifically address the waterbirds, which play key functional roles in many aquatic ecosystems, including as predators, herbivores and vectors of seeds, invertebrates and nutrients, although these roles have often been overlooked. Waterbirds can maintain the diversity of other organisms, control pests, be effective bioindicators of ecological conditions, and act as sentinels of potential disease outbreaks. They also provide important provisioning (meat, feathers, eggs, etc.) and cultural services to both indigenous and westernized societies. We identify key gaps in the understanding of ecosystem services provided by waterbirds and areas for future research required to clarify their functional role in ecosystems and the services they provide. We consider how the economic value of these services could be calculated, giving some examples. Such valuation will provide powerful arguments for waterbird conservation.

  5. Information Hiding Techniques: A Tutorial Review

    CERN Document Server

    Thampi, Sabu M

    2008-01-01

    The purpose of this tutorial is to present an overview of various information hiding techniques. A brief history of steganography is provided along with techniques that were used to hide information. Text, image and audio based information hiding techniques are discussed. This paper also provides a basic introduction to digital watermarking.

  6. ACCOUNTING TREATMENTS USED FOR ACCOUNTING SERVICES PROVIDERS

    Directory of Open Access Journals (Sweden)

    ŢOGOE GRETI DANIELA

    2014-08-01

    Full Text Available The theme of our research is the bookkeeping methods used by entities that provide services within the accounting profession. This paper draws a parallel between the ways of organizing financial-accounting records used by freelancers and by companies active in the financial-accounting field. The first step in our scientific research is to establish the objectives of the chosen area of scientific knowledge. Our approach seeks to explain, through a thorough and detailed treatment of both the conceptual and the practical sides, the accounting issues related to regulatory developments and practices in the field. The paper addresses the various concepts, accounting treatments, and books and accounting documents used both by freelancers providing accounting services and by legal persons authorized to practice the accounting profession. In terms of methodology and research perspective, the scientific approach combines quantitative and qualitative research, joining a theoretical (descriptive-conceptual) perspective with a practical (empirical) one and analyzing the main contributions of various Romanian and foreign authors to knowledge in the field. Following the survey, we believe that the amendments to the national legislation will support entities providing accounting services by cutting red tape and administrative burdens, and will consequently increase profitability and service quality.

  7. The effect of provider- and workflow-focused strategies for guideline implementation on provider acceptance

    Directory of Open Access Journals (Sweden)

    Ramanujam Rangaraj

    2009-10-01

    Full Text Available Abstract Background The effective implementation of clinical practice guidelines (CPGs) depends critically on the extent to which the strategies that are deployed for implementing the guidelines promote provider acceptance of CPGs. Such implementation strategies can be classified into two types based on whether they primarily target providers (e.g., academic detailing, grand rounds presentations) or the work context (e.g., computer reminders, modifications to forms). This study investigated the independent and joint effects of these two types of implementation strategies on provider acceptance of CPGs. Methods Surveys were mailed to a national sample of providers (primary care physicians, physician assistants, nurses, and nurse practitioners) and quality managers selected from Veterans Affairs Medical Centers (VAMCs). A total of 2,438 providers and 242 quality managers from 123 VAMCs participated. Survey items measured implementation strategies and provider acceptance (e.g., guideline-related knowledge, attitudes, and adherence) for three sets of CPGs--chronic obstructive pulmonary disease, chronic heart failure, and major depressive disorder. The relationships between implementation strategy types and provider acceptance were tested using multi-level analytic models. Results For all three CPGs, provider acceptance increased with the number of implementation strategies of either type. Moreover, the number of workflow-focused strategies compensated (contributing more strongly to provider acceptance) when few provider-focused strategies were used. Conclusion Provider acceptance of CPGs depends on the type of implementation strategies used. Implementation effectiveness can be improved by using both workflow-focused and provider-focused strategies.

  8. Michaelis-Menten equation and detailed balance in enzymatic networks.

    Science.gov (United States)

    Cao, Jianshu

    2011-05-12

    Many enzymatic reactions in biochemistry are far more complex than the celebrated Michaelis-Menten scheme, but the observed turnover rate often obeys the hyperbolic dependence on the substrate concentration, a relation established almost a century ago for the simple Michaelis-Menten mechanism. To resolve the longstanding puzzle, we apply the flux balance method to predict the functional form of the substrate dependence in the mean turnover time of complex enzymatic reactions and identify detailed balance (i.e., the lack of unbalanced conformational current) as a sufficient condition for the Michaelis-Menten equation to describe the substrate concentration dependence of the turnover rate in an enzymatic network. This prediction can be verified in single-molecule event-averaged measurements using the recently proposed signatures of detailed balance violations. The finding helps analyze recent single-molecule studies of enzymatic networks and can be applied to other external variables, such as force-dependence and voltage-dependence.
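
    The hyperbolic substrate dependence in question is the classical Michaelis-Menten form, which in standard notation reads:

```latex
v \;=\; \frac{v_{\max}\,[S]}{K_M + [S]}
```

    Detailed balance then requires every loop in the enzymatic network's conformational state space to carry zero net probability current; under that condition, the abstract argues, the network-level turnover rate retains this functional form.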

  9. DETAILED MODELLING OF CHARGING BEHAVIOUR OF SMART SOLAR TANKS

    DEFF Research Database (Denmark)

    Fan, Jianhua; Andersen, Elsa; Furbo, Simon

    2010-01-01

    The charging behaviour of smart solar tanks for solar combisystems for one-family houses is investigated with detailed Computational Fluid Dynamics (CFD) modelling and Particle Image Velocimetry (PIV) measurements. The smart solar tank can be charged with a variable auxiliary volume fitted...... to the expected future energy demand. Therefore the heat loss from the tank is decreased and the thermal performance of the solar heating system is increased compared to a traditional system with a fixed auxiliary volume. The solar tank can be charged either by an electric heating element situated in the tank...... or by an electric heating element in a side-arm mounted on the side of the tank. Detailed CFD models of the smart tanks are built with different mesh densities in the tank and in the side-arm. The thermal conditions of the tank during charging are calculated with the CFD models. The fluid flow and temperature...

  11. Detailed assessment of homology detection using different substitution matrices

    Institute of Scientific and Technical Information of China (English)

    LI Jing; WANG Wei

    2006-01-01

    Homology detection plays a key role in bioinformatics, and the substitution matrix is one of the most important components in homology detection. Thus, besides the improvement of alignment algorithms, another effective way to enhance the accuracy of homology detection is to use proper substitution matrices or even to construct new matrices. A study of the features of various matrices, and a comparison of the performance of different matrices in homology detection, enables us to choose the most proper or optimal matrix for specific applications. In this paper, taking the BLOSUM matrices as an example, some detailed features of matrices in homology detection are studied by calculating the distributions of the numbers of recognized proteins over different sequence identities and sequence lengths. Our results clearly show that different matrices have different preferences and abilities for the recognition of remote homologous proteins. Furthermore, the detailed features of the various matrices can be used to improve the accuracy of homology detection.
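
    To make concrete how the choice of substitution matrix enters homology scoring, here is a minimal ungapped-scoring sketch with a hand-copied excerpt of BLOSUM62 (the mini-matrix and the toy sequences are ours, for demonstration only; real work should load a full matrix, e.g. via Biopython):

```python
# Score an ungapped alignment using a small excerpt of BLOSUM62.
# The entries below are hand-copied BLOSUM62 values; the toy
# sequences are invented for the demonstration.
B62 = {
    ("A", "A"): 4, ("V", "V"): 4, ("L", "L"): 4, ("I", "I"): 4,
    ("A", "V"): 0, ("I", "L"): 2, ("I", "V"): 3, ("L", "V"): 1,
}

def sub_score(x, y, m):
    """Look up a (symmetric) substitution score."""
    return m[(x, y)] if (x, y) in m else m[(y, x)]

def score(a, b, m):
    """Sum of per-position substitution scores for equal-length sequences."""
    return sum(sub_score(x, y, m) for x, y in zip(a, b))

# A conservative substitution (L -> I) still scores positively, which is
# what lets a matrix recognize remote homologs:
print(score("AVLI", "AVLL", B62))   # 4 + 4 + 4 + 2 = 14
```

    Swapping in a different matrix (e.g. a BLOSUM built at lower identity) changes these scores and hence which remote homologs clear a detection threshold, which is exactly the preference effect the abstract measures.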

  12. Detailed field test of yaw-based wake steering

    Science.gov (United States)

    Fleming, P.; Churchfield, M.; Scholbrock, A.; Clifton, A.; Schreck, S.; Johnson, K.; Wright, A.; Gebraad, P.; Annoni, J.; Naughton, B.; Berg, J.; Herges, T.; White, J.; Mikkelsen, T.; Sjöholm, M.; Angelou, N.

    2016-09-01

    This paper describes a detailed field-test campaign to investigate yaw-based wake steering. In yaw-based wake steering, an upstream turbine intentionally misaligns its yaw with respect to the inflow to deflect its wake away from a downstream turbine, with the goal of increasing total power production. In the first phase, a nacelle-mounted scanning lidar was used to verify wake deflection of a misaligned turbine and calibrate wake deflection models. In the second phase, these models were used within a yaw controller to achieve a desired wake deflection. This paper details the experimental design and setup. All data collected as part of this field experiment will be archived and made available to the public via the U.S. Department of Energy's Atmosphere to Electrons Data Archive and Portal.

  13. Samnett: the EMPS model with power flow constraints: implementation details

    Energy Technology Data Exchange (ETDEWEB)

    Helseth, Arild; Warland, Geir; Mo, Birger; Fosso, Olav B.

    2011-12-15

    This report describes the development and implementation of Samnett. Samnett is a new prototype for solving the coupled market and transmission network problem. The prototype is based on the EMPS model (Samkjoeringsmodellen). Results from the market model are distributed to a detailed transmission network model, where a DC power flow detects whether there are overloads on monitored lines or interconnections. In case of overloads, power flow constraints are generated and added to the market problem. This report is an updated version of TR A6891, 'Implementing Network Constraints in the EMPS model'. It further elaborates on theoretical and implementation details in Samnett, but does not contain the case studies and file descriptions presented in TR A6891.
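
    The overload check described here can be sketched as a DC power flow on a toy network (the three-bus topology, injections, reactances and line limits below are invented for illustration; Samnett itself runs on full EMPS market results):

```python
import numpy as np

# Toy DC power flow with an overload check, in the spirit of the
# market/network coupling described above. All numbers are invented.
# lines: (from bus, to bus, reactance x in p.u., flow limit in MW)
lines = [(0, 1, 0.1, 100.0), (1, 2, 0.2, 25.0), (0, 2, 0.2, 100.0)]
P = np.array([120.0, -30.0, -90.0])   # net injections, MW (bus 0 = slack)

n = 3
B = np.zeros((n, n))                  # DC susceptance matrix
for f, t, x, _ in lines:
    b = 1.0 / x
    B[f, f] += b; B[t, t] += b
    B[f, t] -= b; B[t, f] -= b

theta = np.zeros(n)                   # bus angles; slack angle fixed at 0
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

for f, t, x, limit in lines:          # branch flows and overload detection
    flow = (theta[f] - theta[t]) / x
    status = "OVERLOAD -> add constraint" if abs(flow) > limit else "ok"
    print(f"line {f}-{t}: {flow:6.1f} MW ({status})")
```

    In Samnett's loop, each detected overload would be turned into a flow constraint and fed back into the market problem before re-solving.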

  14. Detailed Performance of the Outer Tracker at LHCb

    CERN Document Server

    Tuning, N

    2014-01-01

    The LHCb Outer Tracker is a gaseous detector covering an area of 5 x 6 m2 with 12 double layers of straw tubes. Based on data from the first LHC running period, from 2010 to 2012, the performance in terms of single-hit resolution and efficiency is presented. Details on the ionization length and subtle effects regarding signal reflections and the subsequent time-walk correction are given. The efficiency to detect a hit in the central half of the straw is estimated to be 99.2%, and the position resolution is determined to be approximately 200 μm, depending on the detailed implementation of the internal alignment of individual detector modules. The Outer Tracker received a dose in the hottest region corresponding to 0.12 C/cm, and no signs of gain deterioration or other ageing effects are observed.

  15. Memory for contextual details: effects of emotion and aging.

    Science.gov (United States)

    Kensinger, Elizabeth A; Piguet, Olivier; Krendl, Anne C; Corkin, Suzanne

    2005-06-01

    When individuals are confronted with a complex visual scene that includes some emotional element, memory for the emotional component often is enhanced, whereas memory for peripheral (nonemotional) details is reduced. The present study examined the effects of age and encoding instructions on this effect. With incidental encoding instructions, young and older adults showed this pattern of results, indicating that both groups focused attention on the emotional aspects of the scene. With intentional encoding instructions, young adults no longer showed the effect: They were just as likely to remember peripheral details of negative images as of neutral images. The older adults, in contrast, did not overcome the attentional bias: They continued to show reduced memory for the peripheral elements of the emotional compared with the neutral scenes, even with the intentional encoding instructions.

  16. An ASIC Low Power Primer Analysis, Techniques and Specification

    CERN Document Server

    Chadha, Rakesh

    2013-01-01

    This book provides an invaluable primer on the techniques utilized in the design of low power digital semiconductor devices. Readers will benefit from the hands-on approach, which starts from the ground up, explaining with basic examples what power is, how it is measured and how it impacts the design process of application-specific integrated circuits (ASICs). The authors use both the Unified Power Format (UPF) and Common Power Format (CPF) to describe in detail the power intent for an ASIC and then guide readers through a variety of architectural and implementation techniques that will help meet the power intent. From analyzing system power consumption, to techniques that can be employed in a low power design, to a detailed description of two alternate standards for capturing the power directives at various phases of the design, this book is filled with information that will give ASIC designers a competitive edge in low-power design. Starts from the ground-up and explains what power is, how it is measur...
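
    The "what power is" starting point such a primer builds on is the standard decomposition of CMOS power into switching and leakage terms (our summary of textbook material, not an excerpt from the book):

```latex
P_{\text{total}}
  \;=\; \underbrace{\alpha\, C_{\text{eff}}\, V_{dd}^{2}\, f}_{\text{dynamic (switching)}}
  \;+\; \underbrace{I_{\text{leak}}\, V_{dd}}_{\text{static (leakage)}}
```

    where α is the switching-activity factor, C_eff the effective switched capacitance, V_dd the supply voltage and f the clock frequency; most low-power techniques (clock gating, voltage scaling, power gating) attack one of these factors.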

  17. Properties of quantum Markovian master equations. [Semigroup law, detailed balance

    Energy Technology Data Exchange (ETDEWEB)

    Gorini, V.; Frigerio, A.; Verri, M.; Kossakowski, A.; Sudarshan, E.C.G.

    1976-11-01

    An essentially self-contained account is given of some general structural properties of the dynamics of quantum open Markovian systems. Some recent results regarding the problem of the classification of quantum Markovian master equations and the limiting conditions under which the dynamical evolution of a quantum open system obeys an exact semigroup law (weak coupling limit and singular coupling limit) are reviewed. A general form of quantum detailed balance and its relation to thermal relaxation and to microreversibility is discussed.
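
    For reference, the general structure these classification results concern is the GKLS (Lindblad) generator; in standard notation (our restatement, not quoted from the paper):

```latex
\frac{d\rho}{dt} \;=\; -\frac{i}{\hbar}\,[H,\rho]
  \;+\; \sum_{k} \gamma_k \Big( L_k \rho L_k^{\dagger}
  \;-\; \tfrac{1}{2}\,\big\{ L_k^{\dagger} L_k,\, \rho \big\} \Big),
  \qquad \gamma_k \ge 0,
```

    with quantum detailed balance imposing an additional symmetry of the generator with respect to the stationary (thermal) state.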

  18. Detailed models for timing and efficiency in resistive plate chambers

    CERN Document Server

    Riegler, Werner

    2003-01-01

    We discuss detailed models for detector physics processes in Resistive Plate Chambers, in particular including the effect of attachment on the avalanche statistics. In addition, we present analytic formulas for average charges and intrinsic RPC time resolution. Using a Monte Carlo simulation including all the steps from primary ionization to the front-end electronics we discuss the dependence of efficiency and time resolution on parameters like primary ionization, avalanche statistics and threshold.
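
    One frequently quoted analytic result from this line of RPC modeling is the intrinsic time resolution set by avalanche statistics; with α the effective Townsend coefficient, η the attachment coefficient and v the electron drift velocity, it is commonly written as:

```latex
\sigma_t \;\approx\; \frac{1.28}{(\alpha - \eta)\, v}
```

    i.e., the resolution is governed by the effective growth rate of the avalanche rather than by the gap geometry alone.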

  19. Detailed thermal-hydraulic computation into a containment building

    Energy Technology Data Exchange (ETDEWEB)

    Caruso, A.; Flour, I.; Simonin, O. [EDF/LNH, Chatou (France)]; Cherbonnel, C. [EDF/SEPTEN, Villeurbanne (France)]

    1995-09-01

    This paper deals with numerical predictions of the influence of water sprays upon stratifications into a containment building using a two-dimensional two-phase flow code. Basic equations and closure assumptions are briefly presented. A test case in a situation involving spray evaporation is first detailed to illustrate the validation step. Then results are presented for a compressible recirculating flow into a containment building with condensation phenomena.

  20. New trends in Internet attacks: Clickjacking in detail

    OpenAIRE

    Thoresen, Torgeir Dahlqvist

    2009-01-01

    While the complexity of web applications and their functionality continually increase, so does the number of opportunities for an attacker to launch successful attacks against a web application's users. In this thesis we investigate and describe clickjacking in great detail. To our knowledge, this work represents the first systematic scientific approach to assess clickjacking that also considers the attack's social consequences for users' security through an experiment and survey. We address the...
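
    A standard countermeasure discussed in the clickjacking literature is to forbid framing at the HTTP layer. The sketch below is a minimal stdlib example of that defense (illustrative only, not code from the thesis):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Headers that tell browsers to refuse rendering the page inside a frame,
# which is what a clickjacking attack relies on.
ANTI_CLICKJACKING_HEADERS = {
    "X-Frame-Options": "DENY",
    "Content-Security-Policy": "frame-ancestors 'none'",
}

class NoFramingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>not frameable</body></html>"
        self.send_response(200)
        for name, value in ANTI_CLICKJACKING_HEADERS.items():
            self.send_header(name, value)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("127.0.0.1", 8000), NoFramingHandler).serve_forever()
```

    Header-based defenses are enforced by the browser, which makes them far more robust than the JavaScript "frame-busting" scripts that older pages relied on.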

  1. Comparison of the FFT/matrix inversion and system matrix techniques for higher-order probe correction in spherical near-field antenna measurements

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Nielsen, Jeppe Majlund; Breinbjerg, Olav

    2011-01-01

    Two higher-order probe-correction techniques for spherical near-field antenna measurements are compared in detail for the accuracy they provide and their computational cost. The investigated techniques are the FFT/matrix inversion and the system matrix inversion. Each of these techniques allows...... correction of general high-order probes, including non-symmetric dual-polarized antennas with independent ports. The investigation was carried out by processing with each technique the same measurement data for a challenging case with an antenna under test significantly offset from the center of rotation...... and a higher-order probe.

  2. Shading-based Surface Detail Recovery under General Unknown Illumination.

    Science.gov (United States)

    Xu, Di; Duan, Qi; Zheng, Jianmin; Zhang, Juyong; Cai, Jianfei; Cham, Tat-Jen

    2017-02-17

    Reconstructing the shape of a 3D object from multi-view images under unknown, general illumination is a fundamental problem in computer vision, and high-quality reconstruction is usually challenging, especially when fine detail is needed and the albedo of the object is non-uniform. This paper introduces vertex overall illumination vectors to model the illumination effect and presents a total variation (TV) based approach for recovering surface details using shading and multi-view stereo (MVS). Behind the approach are two important observations: (1) the illumination over the surface of an object often appears to be piecewise smooth, and (2) the recovery of surface orientation is not sufficient for reconstructing the surface, which was often overlooked previously. Thus we propose to use TV to regularize the overall illumination vectors and to use the visual hull to constrain partial vertices. The reconstruction is formulated as a constrained TV-minimization problem that simultaneously treats the shape and illumination vectors as unknowns. An augmented Lagrangian method is proposed to quickly solve the TV-minimization problem. As a result, our approach is robust and stable, and is able to efficiently recover high-quality surface details even when starting with a coarse model obtained using MVS. These advantages are demonstrated by extensive experiments on the state-of-the-art MVS database, which includes challenging objects with varying albedo.

  3. Knudsen Gas Provides Nanobubble Stability

    NARCIS (Netherlands)

    Seddon, James Richard Thorley; Zandvliet, Henricus J.W.; Lohse, Detlef

    2011-01-01

    We provide a model for the remarkable stability of surface nanobubbles to bulk dissolution. The key to the solution is that the gas in a nanobubble is of Knudsen type. This leads to the generation of a bulk liquid flow which effectively forces the diffusive gas to remain local. Our model predicts

  4. Twitter for travel medicine providers.

    Science.gov (United States)

    Mills, Deborah J; Kohl, Sarah E

    2016-03-01

    Travel medicine practitioners, perhaps more so than medical practitioners working in other areas of medicine, require a constant flow of information to stay up-to-date, and provide best practice information and care to their patients. Many travel medicine providers are unaware of the popularity and potential of the Twitter platform. Twitter use among our travellers, as well as by physicians and health providers, is growing exponentially. There is a rapidly expanding body of published literature on this information tool. This review provides a brief overview of the ways Twitter is being used by health practitioners, the advantages that are peculiar to Twitter as a platform of social media, and how the interested practitioner can get started. Some key points about the dark side of Twitter are highlighted, as well as the potential benefits of using Twitter as a way to disseminate accurate medical information to the public. This article will help readers develop an increased understanding of Twitter as a tool for extracting useful facts and insights from the ever increasing volume of health information. © International Society of Travel Medicine, 2016. All rights reserved. Published by Oxford University Press. For permissions, please e-mail: journals.permissions@oup.com.

  5. Combining Seismic Arrays to Image Detailed Rupture Properties of Large Earthquakes: Evidence for Frequent Triggering of Multiple Faults

    Science.gov (United States)

    Ishii, M.; Kiser, E.

    2010-12-01

    Imaging detailed rupture characteristics using the back-projection method, which time-reverses waveforms to their source, has become feasible in recent years due to the availability of data from large aperture arrays with dense station coverage. In contrast to conventional techniques, this method can quickly and indiscriminately provide the spatio-temporal details of rupture propagation. Though many studies have utilized the back-projection method with a single regional array, the limited azimuthal coverage often leads to skewed resolution. In this study, we enhance the imaging power by combining data from two arrays, i.e., the Transportable Array (TA) in the United States and the High Sensitivity Seismographic Network (Hi-net) in Japan. This approach suppresses artifacts and achieves good lateral resolution by improving distance and azimuthal coverage while maintaining waveform coherence. We investigate four large events using this method: the August 15, 2007 Pisco, Peru earthquake, the September 12, 2007 Southern Sumatra earthquake, the September 29, 2009 Samoa Islands earthquake, and the February 27, 2010 Maule, Chile earthquake. In every case, except the Samoa Islands event, the distance of one of the arrays from the epicenter requires us to use the direct P wave and core phases in the back-projection. One of the common features of the rupture characteristics obtained from the back-projection analysis is spatio-temporal rupture discontinuities, or discrete subevents. Both the size of the gaps and the timing between subevents suggest that multiple segments are involved during giant earthquakes, and that they trigger slip on other faults. For example, the 2009 Samoa Islands event began with a rupture propagating north for about 15 seconds followed by a much larger rupture that originated 30 km northwest of the terminus of the first event and propagated back toward the southeast. The involvement of multiple rupture segments with different slip characteristics
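
    The time-reversal stacking step at the heart of back-projection can be sketched on synthetic data (a 1-D constant-velocity toy problem; the station geometry, velocity and pulse shape are invented, and real analyses such as this one use 3-D travel times and dense array data from TA/Hi-net):

```python
import numpy as np

# Toy back-projection: undo each station's predicted travel-time delay for
# every candidate source location, stack the records, and pick the location
# whose stack is most coherent. All numbers are invented for illustration.
v = 5.0                                       # wave speed, km/s
stations = np.array([0.0, 20.0, 50.0, 80.0])  # station positions, km
src = 40.0                                    # true source position, km
dt = 0.01
t = np.arange(0.0, 30.0, dt)

# synthetic records: a Gaussian pulse arriving at |x - src| / v
records = [np.exp(-((t - abs(x - src) / v) / 0.2) ** 2) for x in stations]

candidates = np.arange(0.0, 100.0, 1.0)
power = np.zeros(len(candidates))
for i, c in enumerate(candidates):
    stack = np.zeros_like(t)
    for x, rec in zip(stations, records):
        k = int(round(abs(x - c) / v / dt))   # predicted delay, in samples
        stack[: len(t) - k] += rec[k:]        # time-reverse, then stack
    power[i] = np.max(stack ** 2)             # coherence of the stack

best = candidates[int(np.argmax(power))]
print("estimated source position:", best, "km")
```

    In the real method this maximization is done per time window over a spatial grid, which is what produces the spatio-temporal rupture images (and reveals the discrete subevents discussed above).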

  6. Evaluation of automatic exposure control performance in full-field digital mammography systems using contrast-detail analysis

    Science.gov (United States)

    Suarez Castellanos, Ivan M.; Kaczmarek, Richard; Brunner, Claudia C.; de Las Heras, Hugo; Liu, Haimo; Chakrabarti, Kish

    2012-03-01

    Full Field Digital Mammography (FFDM) is increasingly replacing screen-film systems for screening and diagnosis of breast abnormalities. All FFDM systems are equipped with an Automatic Exposure Control (AEC) which automatically selects technique factors to optimize dose and image quality. It is therefore crucial that AEC performance is properly adjusted and optimized to different breast thicknesses. In this work, we studied the AEC performance of three widely used FFDM systems using the CDMAM and QUART mam/digi phantoms. We used the CDMAM phantom to generate Contrast-Detail (C-D) curves for each AEC mode available in the FFDM systems under study for phantoms with equivalent X-Ray attenuation properties as 3.2 cm, 6 cm and 7.5 cm thick breasts. Generated C-D curves were compared with ideal C-D curves constructed using a metric referred to as the k-factor which is the product of the thickness and the diameter of the smallest correctly identified disks in the CDMAM phantom. Previous observer studies have indicated that k-factor values of 60 to 80 μm2 are particularly useful in demonstrating the threshold for object detectability for detectors used in digital mammography systems. The QUART mam/digi phantom was used to calculate contrast-to-noise ratio (CNR) values at different phantom thicknesses. The results of the C-D analysis and CNR measurements were used to determine limiting CNR values intended to provide a threshold for proper image quality assessment. The results of the Contrast-Detail analysis show that for two of the three evaluated FFDM systems, at higher phantom thicknesses, low contrast signal detectability gets worse. This agrees with the results obtained with the QUART mam/digi phantom, where CNR decreases below determined limiting CNR values.
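
    The two quantitative metrics used in this evaluation can be illustrated with a short sketch (the disk sizes and pixel statistics below are invented; the 60-80 μm² band follows the abstract, and the CNR definition shown is one common convention, which may differ in detail from the QUART protocol):

```python
import numpy as np

# Illustrative computation of the two image-quality metrics discussed above.
# All numeric inputs are invented for the demonstration.

def k_factor(disk_thickness_um, disk_diameter_um):
    """k-factor = thickness x diameter of the smallest correctly identified
    CDMAM disk; 60-80 um^2 is the useful detectability threshold band."""
    return disk_thickness_um * disk_diameter_um

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: (mean signal - mean background) / noise."""
    return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

# e.g., smallest detected disk: 0.08 um gold thickness, 1000 um diameter
print(k_factor(0.08, 1000))        # 80 um^2, at the edge of the 60-80 band

rng = np.random.default_rng(0)
background = rng.normal(100.0, 5.0, size=10_000)  # flat-field ROI pixels
signal = background + 10.0                        # a 10-unit contrast step
print(round(cnr(signal, background), 2))          # approximately 2
```

    A limiting CNR value of the kind derived in the study is then simply the CNR at which the corresponding Contrast-Detail curve crosses the chosen k-factor threshold.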

  7. Channel coding techniques for wireless communications

    CERN Document Server

    Deergha Rao, K

    2015-01-01

    The book discusses modern channel coding techniques for wireless communications such as turbo codes, low-density parity check (LDPC) codes, space–time (ST) coding, RS (or Reed–Solomon) codes and convolutional codes. Many illustrative examples are included in each chapter for easy understanding of the coding techniques. The text is integrated with MATLAB-based programs to enhance the understanding of the subject’s underlying theories. It includes current topics of increasing importance such as turbo codes, LDPC codes, Luby transform (LT) codes, Raptor codes, and ST coding in detail, in addition to the traditional codes such as cyclic codes, BCH (or Bose–Chaudhuri–Hocquenghem) and RS codes and convolutional codes. Multiple-input and multiple-output (MIMO) communications is a multiple antenna technology, which is an effective method for high-speed or high-reliability wireless communications. PC-based MATLAB m-files for the illustrative examples are provided on the book page on Springer.com for free dow...
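
    As a small taste of the traditional codes covered, here is a minimal rate-1/2 convolutional encoder with the classic (7, 5) octal generator polynomials (an illustrative sketch in Python rather than the book's MATLAB, and not code from the book):

```python
# Minimal rate-1/2 convolutional encoder, constraint length 3, with the
# classic generator polynomials (7, 5) in octal. Illustrative sketch only.

G = (0b111, 0b101)   # taps on (current, previous, before-previous) bits

def conv_encode(bits):
    """Return two output bits per input bit (no tail flush for brevity)."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111          # last three input bits
        for g in G:
            out.append(bin(state & g).count("1") & 1)  # parity of tapped bits
    return out

print(conv_encode([1, 0, 1, 1]))   # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

    Each input bit produces two coded bits, so the code rate is 1/2; a Viterbi decoder would recover the input by finding the most likely path through the 4-state trellis this encoder defines.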

  8. Implementing academic detailing for breast cancer screening in underserved communities

    Directory of Open Access Journals (Sweden)

    Ashford Alfred R

    2007-12-01

    Full Text Available Abstract Background African American and Hispanic women, such as those living in the northern Manhattan and the South Bronx neighborhoods of New York City, are generally underserved with regard to breast cancer prevention and screening practices, even though they are more likely to die of breast cancer than are other women. Primary care physicians (PCPs) are critical for the recommendation of breast cancer screening to their patients. Academic detailing is a promising strategy for improving PCP performance in recommending breast cancer screening, yet little is known about the effects of academic detailing on breast cancer screening among physicians who practice in medically underserved areas. We assessed the effectiveness of an enhanced, multi-component academic detailing intervention in increasing recommendations for breast cancer screening within a sample of community-based urban physicians. Methods Two medically underserved communities were matched and randomized to intervention and control arms. Ninety-four primary care community (i.e., not hospital-based) physicians in northern Manhattan were compared to 74 physicians in the South Bronx neighborhoods of the New York City metropolitan area. Intervention participants received enhanced physician-directed academic detailing, using the American Cancer Society guidelines for the early detection of breast cancer. Control group physicians received no intervention. We conducted interviews to measure primary care physicians' self-reported recommendation of mammography and Clinical Breast Examination (CBE), and whether PCPs taught women how to perform breast self examination (BSE). Results Using multivariate analyses, we found a statistically significant intervention effect on the recommendation of CBE to women patients age 40 and over; mammography and breast self examination reports increased across both arms from baseline to follow-up, according to physician self-report. At post-test, physician

  9. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  10. Nozzle fabrication technique

    Science.gov (United States)

    Wells, Dennis L. (Inventor)

    1988-01-01

    This invention relates to techniques for fabricating hourglass-throat or convergent-divergent nozzle shapes, and more particularly to new and improved techniques for forming rocket nozzles from electrically conductive material and forming cooling channels in the wall thereof. The concept of positioning a block of electrically conductive material so that its axis is set at a predetermined skew angle with relation to a travelling electron discharge machine electrode, and thereafter revolving the body about its own axis to generate a hyperbolic surface of revolution, either internal or external, is novel. The method will generate a rocket nozzle which may be provided with cooling channels using the same control and positioning system. The configuration of the cooling channels so produced is unique and novel. Also, the method is adaptable to nonmetallic material using analogous cutting tools, such as water jet, laser, abrasive wire and hot wire.

  11. Assessing wet snow avalanche activity using detailed physics based snowpack simulations

    Science.gov (United States)

    Wever, N.; Vera Valero, C.; Fierz, C.

    2016-06-01

    Water accumulating on microstructural transitions inside a snowpack is often considered a prerequisite for wet snow avalanches. Recent advances in numerical snowpack modeling allow for an explicit simulation of this process. We analyze detailed snowpack simulations driven by meteorological stations in three different climate regimes (Alps, Central Andes, and Pyrenees), with accompanying wet snow avalanche activity observations. Predicting wet snow avalanche activity based on whether modeled water accumulations inside the snowpack locally exceed 5-6% volumetric liquid water content provides higher prediction skill than using thresholds for daily mean air temperature or the daily sum of the positive snow energy balance. Additionally, the depth of the maximum water accumulation in the simulations showed a significant correlation with observed avalanche size. Direct output from detailed snow cover models is thereby able to provide a better regional assessment of dangerous slope aspects and potential avalanche size than traditional methods.
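
    The decision rule described above reduces to a simple per-profile check (the layer values below are invented; the 5-6% volumetric liquid water content band is the one reported in the abstract):

```python
import numpy as np

# Sketch of the wet snow avalanche indicator: flag a profile if any modeled
# layer's volumetric liquid water content (LWC) exceeds the threshold, and
# report the depth of the maximum accumulation, which the study found to
# correlate with avalanche size. Profile values are invented.
LWC_THRESHOLD = 0.055   # volumetric fraction, midpoint of the 5-6% band

def wet_snow_indicator(lwc_profile, layer_depths_m):
    lwc = np.asarray(lwc_profile)
    i = int(np.argmax(lwc))
    return lwc[i] >= LWC_THRESHOLD, layer_depths_m[i]

flag, depth = wet_snow_indicator([0.01, 0.06, 0.02], [0.10, 0.45, 0.80])
print(flag, depth)   # the 6% layer at 0.45 m depth triggers the flag
```

    Running this check per slope aspect and elevation band on simulated profiles is what yields the regional assessment mentioned at the end of the abstract.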

  12. A consistent approach for mixed detailed and statistical calculation of opacities in hot plasmas

    CERN Document Server

    Porcherot, Quentin; Gilleron, Franck; Blenski, Thomas

    2011-01-01

    Absorption and emission spectra of multicharged-ion plasmas contain transition arrays with a huge number of coalescent electric-dipole (E1) lines, which are well suited for treatment by the unresolved transition array and derivative methods. However, some transition arrays show detailed features whose description requires diagonalization of the Hamiltonian matrix. We developed a hybrid opacity code, called SCORCG, which consistently combines statistical approaches with fine-structure calculations. Data required for the computation of detailed transition arrays (atomic configurations and atomic radial integrals) are calculated by the super-configuration code SCO (Super-Configuration Opacity), which provides an accurate description of plasma screening effects on the wave functions. Level energies as well as positions and strengths of spectral lines are computed by an adapted RCG routine of R. D. Cowan. The resulting code provides opacities for hot plasmas and can handle mid-Z elements. The code is also a po...

  13. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    Energy Technology Data Exchange (ETDEWEB)

    1987-04-01

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  14. The Thoratec system implanted as a modified total artificial heart: the Bad Oeynhausen technique.

    Science.gov (United States)

    Arusoglu, Latif; Reiss, Nils; Morshuis, Michiel; Schoenbrodt, Michael; Hakim-Meibodi, Kavous; Gummert, Jan

    2010-12-01

    The CardioWest™ total artificial heart (SynCardia Systems, Tucson, AZ, USA) is the only FDA-approved total artificial heart, intended as a bridge to human heart transplantation for patients dying of biventricular heart failure. Implantation provides immediate hemodynamic restoration and clinical stabilization, leading to end-organ recovery and thus eventually allowing cardiac transplantation. Occasionally, implantation of a total artificial heart is not feasible for anatomical reasons. For this patient group, we have developed an alternative technique using the paracorporeal Thoratec biventricular support system (Thoratec, Pleasanton, CA, USA) as a modified total artificial heart. A detailed description of the implantation technique is presented.

  15. Application of digital image processing techniques to faint solar flare phenomena

    Science.gov (United States)

    Glackin, D. L.; Martin, S. F.

    1980-01-01

    Digital image processing of eight solar flare events was performed using the Video Information Communication and Retrieval language in order to study moving emission fronts, flare halos, and Moreton waves. The techniques used include contrast enhancement, isointensity contouring, the differencing of images, spatial filtering, and geometrical registration. The spatial extent and temporal behavior of the faint phenomena are examined, along with the relation of the three types of phenomena to one another. The image processing techniques make possible the detailed study of the history of the phenomena and provide clues to their physical nature.

  16. Reduction of Large Detailed Chemical Kinetic Mechanisms for Autoignition Using Joint Analyses of Reaction Rates and Sensitivities

    Energy Technology Data Exchange (ETDEWEB)

    Saylam, A; Ribaucour, M; Pitz, W J; Minetti, R

    2006-11-29

    A new technique for the reduction of detailed autoignition mechanisms, based on two analysis methods, is described. An analysis of reaction rates is coupled to an analysis of reaction sensitivity to detect redundant reactions. The thresholds associated with the two analyses have a great influence on the size and efficiency of the reduced mechanism, and rules for selecting them are defined. The reduction technique has been successfully applied to detailed autoignition mechanisms of two reference hydrocarbons: n-heptane and iso-octane. The efficiency of the technique and the ability of the reduced mechanisms to reproduce the results generated by the full mechanism are discussed. A speedup of calculations by a factor of 5.9 for the n-heptane mechanism and 16.7 for the iso-octane mechanism is obtained without losing accuracy in the prediction of autoignition delay times and concentrations of intermediate species.
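The selection rule implied by the coupled analyses can be sketched as a simple filter: a reaction is retained when either its normalized peak rate or its sensitivity coefficient exceeds the corresponding threshold, and is flagged redundant otherwise. The data layout, threshold values, and reaction names below are illustrative, not taken from the paper:

```python
def reduce_mechanism(reactions, rate_threshold, sens_threshold):
    """Keep a reaction if EITHER its peak normalized rate OR its
    ignition-delay sensitivity exceeds its threshold; reactions falling
    below both thresholds are flagged redundant and removed."""
    kept, removed = [], []
    for rxn in reactions:
        important = (rxn["peak_rate"] >= rate_threshold
                     or abs(rxn["sensitivity"]) >= sens_threshold)
        (kept if important else removed).append(rxn["name"])
    return kept, removed

# Hypothetical entries, rates/sensitivities normalized to the dominant reaction.
mech = [
    {"name": "H+O2=OH+O",     "peak_rate": 1.00, "sensitivity": 0.90},
    {"name": "slow_side_rxn", "peak_rate": 1e-6, "sensitivity": 1e-5},
    {"name": "low_rate_sens", "peak_rate": 1e-4, "sensitivity": 0.20},
]
kept, removed = reduce_mechanism(mech, rate_threshold=1e-3, sens_threshold=1e-2)
print(kept)     # ['H+O2=OH+O', 'low_rate_sens']
print(removed)  # ['slow_side_rxn']
```

The OR-combination is the point of the joint analysis: a slow reaction can still be kept if the ignition delay is sensitive to it, which a rate criterion alone would miss.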

  17. Accurate estimation of solvation free energy using polynomial fitting techniques.

    Science.gov (United States)

    Shyu, Conrad; Ytreberg, F Marty

    2011-01-15

    This report details an approach to improve the accuracy of free energy difference estimates using thermodynamic integration data (the slope of the free energy with respect to the switching variable λ) and its application to calculating solvation free energy. The central idea is to utilize polynomial fitting schemes to approximate the thermodynamic integration data and thus improve the accuracy of the free energy difference estimates. Previously, we introduced the use of a polynomial regression technique to fit thermodynamic integration data (Shyu and Ytreberg, J Comput Chem, 2009, 30, 2297). In this report we introduce polynomial and spline interpolation techniques. Two systems with analytically solvable relative free energies are used to test the accuracy of the interpolation approach. We also use both interpolation and regression methods to determine a small molecule solvation free energy. Our simulations show that, using such polynomial techniques and nonequidistant λ values, the solvation free energy can be estimated with high accuracy without using soft-core scaling and separate simulations for Lennard-Jones and partial charges. The results from our study suggest that these polynomial techniques, especially with nonequidistant λ values, improve the accuracy of ΔF estimates without demanding additional simulations. We also provide general guidelines for use of polynomial fitting to estimate free energy. To allow researchers to immediately utilize these methods, free software and documentation are provided via http://www.phys.uidaho.edu/ytreberg/software. Copyright © 2010 Wiley Periodicals, Inc.
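The interpolation variant can be illustrated compactly: fit the Lagrange polynomial through ⟨dF/dλ⟩ samples at (possibly nonequidistant) λ values, then integrate it over [0, 1] to obtain ΔF. A minimal sketch with illustrative function names, not the authors' released software:

```python
def ti_free_energy(lmbdas, dfdl, n_quad=2000):
    """Interpolate thermodynamic-integration samples <dF/dlambda> at
    (possibly nonequidistant) lambda values with the Lagrange polynomial,
    then integrate it over [0, 1] with a dense trapezoid rule to
    estimate Delta F."""
    def lagrange(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(lmbdas, dfdl)):
            term = yi
            for j, xj in enumerate(lmbdas):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total

    h = 1.0 / n_quad
    ys = [lagrange(k * h) for k in range(n_quad + 1)]
    return h * (0.5 * ys[0] + sum(ys[1:-1]) + 0.5 * ys[-1])

# Analytic check: dF/dlambda = 3*lambda^2 integrates to Delta F = 1,
# and three samples determine the quadratic exactly.
print(ti_free_energy([0.0, 0.5, 1.0], [0.0, 0.75, 3.0]))  # ~1.0
```

With few, well-placed λ points the interpolating polynomial is exact for low-order integrands, which is the mechanism by which nonequidistant λ values buy accuracy without extra simulations.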

  18. Integral transform techniques for Green's function

    CERN Document Server

    Watanabe, Kazumi

    2014-01-01

    In this book mathematical techniques for integral transforms are described in detail but concisely. The techniques are applied to the standard partial differential equations, such as the Laplace equation, the wave equation and the elasticity equations. The Green's functions for beams, plates and acoustic media are also shown along with their mathematical derivations. Lists of Green's functions are presented for future use. The Cagniard-de Hoop method for the double inversion is described in detail, and 2D and 3D elastodynamics problems are fully treated.

  19. An Efficient Image Compression Technique Based on Arithmetic Coding

    Directory of Open Access Journals (Sweden)

    Prof. Rajendra Kumar Patel

    2012-12-01

    The rapid growth of digital imaging applications, including desktop publishing, multimedia, teleconferencing, and high-definition imaging, has increased the need for effective and standardized image compression techniques. Digital images play a very important role in conveying detailed information, but the key obstacle for many applications is the vast amount of data required to represent an image directly. Digitizing images at the quality needed for clear and accurate information demands more storage space and better storage and access mechanisms, in hardware or software. In this paper we concentrate on this problem, aiming at the best reduction of storage with the least loss of image quality. State-of-the-art techniques can compress typical images to between 1/10 and 1/50 of their uncompressed size without visibly affecting image quality. Our study indicates the need for a compression technique that provides better reduction in terms of both storage and quality. Arithmetic coding is among the most effective ways of reducing the size of encoded data, so in this paper we propose an arithmetic-coding-based image compression technique combined with the Walsh transform as an efficient way to achieve this reduction.
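Arithmetic coding, the entropy-coding stage of the proposed scheme, maps an entire message to a single number in [0, 1) by successively narrowing an interval according to symbol probabilities. A minimal float-based sketch (symbol probabilities, message, and function names are illustrative; a production codec uses integer renormalization to avoid precision loss on long messages):

```python
def _cumulative(probs):
    """Assign each symbol a half-open slice of [0, 1) sized by its probability."""
    cum, c = {}, 0.0
    for s, p in probs.items():
        cum[s] = (c, c + p)
        c += p
    return cum

def arithmetic_encode(msg, probs):
    """Narrow [low, high) by each symbol's probability slice; any number
    in the final interval encodes the whole message."""
    cum = _cumulative(probs)
    low, high = 0.0, 1.0
    for s in msg:
        span = high - low
        lo, hi = cum[s]
        low, high = low + span * lo, low + span * hi
    return (low + high) / 2

def arithmetic_decode(code, n, probs):
    """Invert the encoding: locate the slice containing the code,
    emit its symbol, and rescale the code into that slice."""
    cum = _cumulative(probs)
    out = []
    for _ in range(n):
        for s, (lo, hi) in cum.items():
            if lo <= code < hi:
                out.append(s)
                code = (code - lo) / (hi - lo)
                break
    return "".join(out)

probs = {"a": 0.5, "b": 0.3, "c": 0.2}
code = arithmetic_encode("abac", probs)
print(arithmetic_decode(code, 4, probs))  # abac
```

Because the final interval's width is the product of the symbol probabilities, frequent symbols shrink it less, so the code needs fewer bits; this is how arithmetic coding approaches the entropy limit more closely than per-symbol codes such as Huffman.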

  20. Success of Hall technique crowns questioned.

    Science.gov (United States)

    Nainar, S M Hashim

    2012-01-01

    The Hall technique is a method of providing stainless steel crowns for primary molars without tooth preparation and requires no local anesthesia. A literature review showed inconclusive evidence, and therefore this technique should not be used in clinical practice.