WorldWideScience

Sample records for ground survey techniques

  1. TESTING GROUND-BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA, HANFORD, WASHINGTON

    Energy Technology Data Exchange (ETDEWEB)

    PETERSEN SW

    2010-12-02

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected, with the densest coverage occurring adjacent to the Columbia River Corridor. Time-domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency-domain RESOLVE system acquired electromagnetic (EM) data along tighter-spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground-based geophysical methods, and to link results to the underlying geology and/or hydrogeology. Specific goals of this project are as follows: (1) Test ground-based geophysical techniques for their efficacy in delineating underlying geology; (2) Use ground

  2. American Woodcock Singing-ground Survey

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The American Woodcock Singing-Ground Survey, conducted by the U.S. Fish and Wildlife Service, exploits the conspicuous courtship display of the male woodcock. The...

  3. A Survey: Face Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Muhammad Sharif

    2012-12-01

    In this study, the existing techniques of face recognition are reviewed along with their pros and cons to form a brief survey. The most general methods include Eigenface (Eigenfeatures), Hidden Markov Model (HMM), geometric-based and template-matching approaches. This survey analyses these approaches with respect to how they constitute face representations, as discussed below. In the second phase of the survey, factors affecting recognition rates and processes are also discussed, along with the solutions provided by different authors.
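
    The eigenface approach mentioned in this record projects face images onto a low-dimensional subspace found by principal component analysis and classifies a probe image by its nearest neighbour in that subspace. The following minimal sketch illustrates that idea only; the image size, the random stand-in data and the function names are illustrative assumptions, not material from the surveyed paper.

        import numpy as np

        def train_eigenfaces(faces, n_components=5):
            """Learn an eigenface subspace from training images.

            faces: array of shape (n_images, height*width), one flattened image per row.
            Returns the mean face, the top principal components and the projected training set.
            """
            mean = faces.mean(axis=0)
            centered = faces - mean
            # SVD of the centered data; rows of vt are the eigenfaces.
            _, _, vt = np.linalg.svd(centered, full_matrices=False)
            components = vt[:n_components]
            weights = centered @ components.T
            return mean, components, weights

        def recognize(probe, mean, components, weights, labels):
            """Return the label of the training face closest to the probe in eigenface space."""
            w = (probe - mean) @ components.T
            distances = np.linalg.norm(weights - w, axis=1)
            return labels[int(np.argmin(distances))]

        # Toy example with random "images" standing in for a real face database.
        rng = np.random.default_rng(0)
        train = rng.random((10, 32 * 32))
        labels = [f"person_{i}" for i in range(10)]
        mean, comps, wts = train_eigenfaces(train)
        print(recognize(train[3] + 0.01 * rng.random(32 * 32), mean, comps, wts, labels))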

  4. Survey of semantic modeling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Smith, C.L.

    1975-07-01

    Numerous modeling techniques have been applied to the analysis of the semantics of programming languages. By providing a brief survey of these techniques together with an analysis of their applicability for answering semantic issues, this report attempts to illuminate the state of the art in this area. The intent is to be illustrative rather than thorough in the coverage of semantic models. A bibliography is included for the reader who is interested in pursuing this area of research in more detail.

  5. Survey of semantic modeling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Smith, C.L.

    1975-07-01

    Numerous modeling techniques have been applied to the analysis of the semantics of programming languages. By providing a brief survey of these techniques together with an analysis of their applicability for answering semantic issues, this report attempts to illuminate the state of the art in this area. The intent is to be illustrative rather than thorough in the coverage of semantic models. A bibliography is included for the reader who is interested in pursuing this area of research in more detail.

  6. Research on ground heat exchanger of Ground Source Heat Pump technique

    Institute of Scientific and Technical Information of China (English)

    LIU Dong-sheng; SUN You-hong; GAO Ke; WU Xiao-hang

    2004-01-01

    The Ground Source Heat Pump technique and its operating principle are described in this paper. The ground heat exchanger is the key component of a ground source heat pump, and its patterns are discussed. Software is helpful for designing ground heat exchangers. A Chinese Ground Source Heat Pump project is introduced, and its market is becoming more and more extensive.

  7. Aerial surveys adjusted by ground surveys to estimate area occupied by black-tailed prairie dog colonies

    Science.gov (United States)

    Sidle, John G.; Augustine, David J.; Johnson, Douglas H.; Miller, Sterling D.; Cully, Jack F.; Reading, Richard P.

    2012-01-01

    Aerial surveys using line-intercept methods are one approach to estimate the extent of prairie dog colonies in a large geographic area. Although black-tailed prairie dogs (Cynomys ludovicianus) construct conspicuous mounds at burrow openings, aerial observers have difficulty discriminating between areas with burrows occupied by prairie dogs (colonies) versus areas of uninhabited burrows (uninhabited colony sites). Consequently, aerial line-intercept surveys may overestimate prairie dog colony extent unless adjusted by an on-the-ground inspection of a sample of intercepts. We compared aerial line-intercept surveys conducted over 2 National Grasslands in Colorado, USA, with independent ground-mapping of known black-tailed prairie dog colonies. Aerial line-intercepts adjusted by ground surveys using a single activity category adjustment overestimated colonies by ≥94% on the Comanche National Grassland and ≥58% on the Pawnee National Grassland. We present a ground-survey technique that involves 1) visiting on the ground a subset of aerial intercepts classified as occupied colonies plus a subset of intercepts classified as uninhabited colony sites, and 2) based on these ground observations, recording the proportion of each aerial intercept that intersects a colony and the proportion that intersects an uninhabited colony site. Where line-intercept techniques are applied to aerial surveys or remotely sensed imagery, this method can provide more accurate estimates of black-tailed prairie dog abundance and trends.
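
    As a rough illustration of the adjustment described in this record, the sketch below scales each aerial intercept by ground-verified proportions of occupied colony and uninhabited colony site before summing. The intercept lengths and proportions are invented for the example and are not data from the study.

        # Each aerial intercept: (length in km, proportion verified on the ground as occupied
        # colony, proportion verified as uninhabited colony site). Values are hypothetical.
        intercepts = [
            (1.20, 0.60, 0.30),
            (0.45, 1.00, 0.00),
            (2.10, 0.25, 0.55),
        ]

        unadjusted = sum(length for length, _, _ in intercepts)
        occupied = sum(length * p_occ for length, p_occ, _ in intercepts)
        uninhabited = sum(length * p_unin for length, _, p_unin in intercepts)

        print(f"unadjusted intercept length: {unadjusted:.2f} km")
        print(f"adjusted occupied-colony length: {occupied:.2f} km")
        print(f"uninhabited colony-site length: {uninhabited:.2f} km")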

  8. Reliability and Assessment Techniques on Ground Excavation

    Directory of Open Access Journals (Sweden)

    Sanga Tangchawal

    2009-05-01

    Planning and assessment of the excavation of brittle materials (soil or rock) can be done using machinery and/or explosives. A reliability assessment has been proposed to predict the failure of ground during the excavation process. Stability planning for cutting a soil (rock) face by machinery can be compared between the deterministic and the statistical method. The risk of using explosives for rock excavation has to consider the damage and environmental impacts after blasting events.

  9. July 1973 ground survey of active Central American volcanoes

    Science.gov (United States)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1973-01-01

    The author has identified the following significant results. Ground survey has shown that thermal anomalies of various sizes associated with volcanic activity at several Central American volcanoes should be detectable from Skylab. Anomalously hot areas of especially large size (greater than 500 m in diameter) are now found at Santiaguito and Pacaya volcanoes in Guatemala and San Cristobal in Nicaragua. Smaller anomalous areas are to be found at least seven other volcanoes. This report is completed after ground survey of eleven volcanoes and ground-based radiation thermometry mapping at these same points.

  10. Surveying techniques in vibration measurement

    Directory of Open Access Journals (Sweden)

    Kuras Przemyslaw

    2015-01-01

    In order to determine the actual dynamic characteristics of engineering structures, it is necessary to perform direct measurements. The paper focuses on the problem of using various devices to measure vibration, with particular emphasis on surveying instruments. The main tool used in this study is the radar interferometer, which has been compared to a robotic total station, GNSS receivers and sensors (accelerometer and encoder). The results of four dynamic experiments are presented. They were performed on an industrial chimney, a drilling tower, a railway bridge and a pedestrian footbridge. The obtained results are discussed in terms of the requirements imposed by the standard ISO 4866:2010.

  11. The Importance of Local Surveys for Tying Techniques Together

    Science.gov (United States)

    Long, James L.; Bosworth, John M.

    2000-01-01

    The synergistic benefits of combining observations from multiple space geodesy techniques located at a site are a main reason behind the proposal for the establishment of the International Space Geodetic and Gravimetric Network (ISGN). However, the full benefits of inter-comparison are only realized when the spatial relationships between the different space geodetic systems are accurately determined. These spatial relationships are best determined and documented by developing a local reference network of stable ground monuments and conducting periodic surveys to tie together the reference points (for example: the intersection of rotation axes of a VLBI antenna) of the space geodetic systems and the ground monument network. The data obtained from local surveys is vital to helping understand any systematic errors within an individual technique and to helping identify any local movement or deformation of the space geodetic systems over time.

  12. Survey of data compression techniques

    Energy Technology Data Exchange (ETDEWEB)

    Gryder, R.; Hake, K.

    1991-09-01

    PM-AIM must provide to customers in a timely fashion information about Army acquisitions. This paper discusses ways that PM-AIM can reduce the volume of data that must be transmitted between sites. Although this paper primarily discusses techniques of data compression, it also briefly discusses other options for meeting the PM-AIM requirements. The options available to PM-AIM, in addition to hardware and software data compression, include less-frequent updates, distribution of partial updates, distributed data base design, and intelligent network design. Any option that enhances the performance of the PM-AIM network is worthy of consideration. The recommendations of this paper apply to the PM-AIM project in three phases: the current phase, the target phase, and the objective phase. Each recommendation will be identified as (1) appropriate for the current phase, (2) considered for implementation during the target phase, or (3) a feature that should be part of the objective phase of PM-AIM's design. The current phase includes only those measures that can be taken with the installed leased lines. The target phase includes those measures that can be taken in transferring the traffic from the leased lines to the DSNET environment with minimal changes in the current design. The objective phase includes all the things that should be done as a matter of course. The objective phase for PM-AIM appears to be a distributed data base with data for each site stored locally and all sites having access to all data.

  13. Survey of data compression techniques

    Energy Technology Data Exchange (ETDEWEB)

    Gryder, R.; Hake, K.

    1991-09-01

    PM-AIM must provide to customers in a timely fashion information about Army acquisitions. This paper discusses ways that PM-AIM can reduce the volume of data that must be transmitted between sites. Although this paper primarily discusses techniques of data compression, it also briefly discusses other options for meeting the PM-AIM requirements. The options available to PM-AIM, in addition to hardware and software data compression, include less-frequent updates, distribution of partial updates, distributed data base design, and intelligent network design. Any option that enhances the performance of the PM-AIM network is worthy of consideration. The recommendations of this paper apply to the PM-AIM project in three phases: the current phase, the target phase, and the objective phase. Each recommendation will be identified as (1) appropriate for the current phase, (2) considered for implementation during the target phase, or (3) a feature that should be part of the objective phase of PM-AIM's design. The current phase includes only those measures that can be taken with the installed leased lines. The target phase includes those measures that can be taken in transferring the traffic from the leased lines to the DSNET environment with minimal changes in the current design. The objective phase includes all the things that should be done as a matter of course. The objective phase for PM-AIM appears to be a distributed data base with data for each site stored locally and all sites having access to all data.

  14. Status of Ground Motion Mitigation Techniques for CLIC

    CERN Document Server

    Snuverink, J; Collette, C; Duarte Ramos, F; Gaddi, A; Gerwig, H; Janssens, S; Pfingstner, J; Schulte, D; Balik, G; Brunetti, L; Jeremie, A; Burrows, P; Caron, B; Resta-Lopez, J

    2011-01-01

    The Compact Linear Collider (CLIC) accelerator has strong stability requirements on the position of the beam. In particular, the beam position will be sensitive to ground motion. A number of mitigation techniques are proposed - quadrupole stabilisation and positioning, final doublet stabilisation as well as beam based orbit and interaction point (IP) feedback. Integrated studies of the impact of the ground motion on the CLIC Main Linac (ML) and Beam Delivery System (BDS) have been performed, which model the hardware and beam performance in detail. Based on the results future improvements of the mitigation techniques are suggested and simulated. It is shown that with the current design the tight luminosity budget for ground motion effects is fulfilled and accordingly, an essential feasibility issue of CLIC has been addressed.

  15. Identification of Potential Fishing Grounds Using Geospatial Technique

    Science.gov (United States)

    Abdullah, Muhammad

    2016-07-01

    Fishery resource surveys using actual sampling and data collection methods require extensive ship time and sampling time. Informative data from satellites play a vital role in fisheries applications. Satellite remote sensing techniques can be used to detect fish aggregations, much like visual fish identification; ultimately these techniques can be used to predict potential fishing zones by measuring the parameters that affect the distribution of fish. Remote sensing is a time-saving technique for locating fishery resources along the coast. Pakistan has a continental shelf area of 50,270 km2 and a coastline length of 1,120 km. The total maritime zone of Pakistan is over 30 percent of the land area. Fishery plays an important role in the national economy. The marine fisheries sector is the main component, contributing about 57 percent in terms of production. Fishery is the most important economic activity in the villages and towns along the coast, and in most of the coastal villages and settlements it is the sole source of employment and income generation. Fishing by fishermen is done on the sole basis of repeated experiments and information collected from other fishermen. Often they are in doubt about the location of potential fishing zones. This leads to a waste of time and money, adversely affecting fishermen's incomes and causing over- or under-exploitation of fishing zones. The main purpose of this study was to map potential fishing grounds by identifying various environmental parameters which impact fish aggregation along the Pakistan coastline. The primary reason for this study is the fact that the fishing communities of Pakistan's coastal regions are extremely poor and lack knowledge of the modern tools and techniques that may be incorporated to enhance their yield and thus improve their livelihood. Using geospatial techniques in order to accurately map the potential fishing zones based on sea surface temperature (SST) and chlorophyll-a content, in conjunction with
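
    The mapping step described in this record can be reduced, in its simplest form, to combining sea surface temperature and chlorophyll-a grids and flagging cells that fall inside favourable ranges. The sketch below does this with NumPy on synthetic grids; the thresholds and the data are placeholder assumptions, not values from the study.

        import numpy as np

        # Synthetic SST (deg C) and chlorophyll-a (mg/m^3) grids standing in for satellite products.
        rng = np.random.default_rng(1)
        sst = 20 + 8 * rng.random((200, 200))
        chl = 0.05 + 2.0 * rng.random((200, 200))

        # Hypothetical favourable ranges; real values depend on the target species and season.
        sst_ok = (sst >= 24) & (sst <= 28)
        chl_ok = (chl >= 0.2) & (chl <= 1.0)

        potential_fishing_zone = sst_ok & chl_ok
        print(f"{potential_fishing_zone.mean() * 100:.1f}% of cells flagged as potential fishing zones")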

  16. A Survey of Collaborative Filtering Techniques

    Directory of Open Access Journals (Sweden)

    Xiaoyuan Su

    2009-01-01

    As one of the most successful approaches to building recommender systems, collaborative filtering (CF) uses the known preferences of a group of users to make recommendations or predictions of the unknown preferences of other users. In this paper, we first introduce CF tasks and their main challenges, such as data sparsity, scalability, synonymy, gray sheep, shilling attacks, privacy protection, etc., and their possible solutions. We then present three main categories of CF techniques: memory-based, model-based, and hybrid CF algorithms (those that combine CF with other recommendation techniques), with examples of representative algorithms in each category and an analysis of their predictive performance and their ability to address the challenges. From basic techniques to the state of the art, we attempt to present a comprehensive survey of CF techniques, which can serve as a roadmap for research and practice in this area.
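
    A memory-based CF algorithm of the kind surveyed in this record predicts an unknown rating as a similarity-weighted average of the ratings given by similar users. A minimal user-based sketch follows; the tiny rating matrix and the neighbourhood size are invented for illustration.

        import numpy as np

        # Rows are users, columns are items; 0 marks an unknown rating (toy data).
        ratings = np.array([
            [5, 4, 0, 1],
            [4, 5, 1, 0],
            [1, 0, 5, 4],
            [0, 1, 4, 5],
        ], dtype=float)

        def predict(user, item, k=2):
            """Predict ratings[user, item] from the k most similar users who rated the item."""
            mask = ratings[:, item] > 0
            mask[user] = False
            candidates = np.where(mask)[0]
            # Cosine similarity between the target user and each candidate user.
            sims = np.array([
                ratings[user] @ ratings[c]
                / (np.linalg.norm(ratings[user]) * np.linalg.norm(ratings[c]))
                for c in candidates
            ])
            order = np.argsort(sims)[-k:]
            top, top_sims = candidates[order], sims[order]
            return float(top_sims @ ratings[top, item] / top_sims.sum())

        # Expected to be low: user 0 most resembles user 1, who rated item 2 as 1.
        print(predict(user=0, item=2))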

  17. Stylometry of Painting Techniques - A Survey

    OpenAIRE

    Sanaz keshvari; Abdolah Chalechale

    2016-01-01

    Discovering a scientific relationship for art is an interesting and challenging problem. Artist identification and stylometry of paintings have been important issues in artwork analysis. Image processing techniques can be an effective solution for resolving these problems. To the best of our knowledge, these problems have not been widely investigated. This paper presents a comprehensive survey of both proposed stylometry approaches and artist identification. Finally, it also compares and summarizes all reviewed methods.

  18. Stylometry of Painting Techniques - A Survey

    Directory of Open Access Journals (Sweden)

    Sanaz keshvari

    2016-12-01

    Discovering a scientific relationship for art is an interesting and challenging problem. Artist identification and stylometry of paintings have been important issues in artwork analysis. Image processing techniques can be an effective solution for resolving these problems. To the best of our knowledge, these problems have not been widely investigated. This paper presents a comprehensive survey of both proposed stylometry approaches and artist identification. Finally, it also compares and summarizes all reviewed methods.

  19. A Survey of Unstructured Text Summarization Techniques

    Directory of Open Access Journals (Sweden)

    Sherif Elfayoumy

    2014-05-01

    Due to the explosive amounts of text data being created and organizations' increased desire to leverage their data corpora, especially with the availability of Big Data platforms, there is not usually enough time to read and understand each document and make decisions based on document contents. Hence, there is a great demand for summarizing text documents to provide a representative substitute for the original documents. By improving summarization techniques, the precision of document retrieval through search queries against summarized documents is expected to improve in comparison to querying against the full spectrum of original documents. Several generic text summarization algorithms have been developed, each with its own advantages and disadvantages. For example, some algorithms are particularly good for summarizing short documents but not for long ones. Others perform well in identifying and summarizing single-topic documents, but their precision degrades sharply with multi-topic documents. In this article we present a survey of the literature in text summarization. We also surveyed some of the most common methods for evaluating the quality of automated text summarization techniques. Last, we identified some of the challenging problems that are still open, in particular the need for a universal approach that yields good results for mixed types of documents.
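
    A minimal extractive summarizer of the generic kind discussed in this record scores sentences by the frequency of their content words and keeps the top-scoring ones. The sketch below is only a frequency-based illustration, not one of the algorithms reviewed in the article.

        import re
        from collections import Counter

        STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "are", "for", "that", "with"}

        def summarize(text, n_sentences=2):
            """Return the n highest-scoring sentences, kept in their original order."""
            sentences = re.split(r"(?<=[.!?])\s+", text.strip())
            words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
            freq = Counter(words)

            def score(sentence):
                tokens = [w for w in re.findall(r"[a-z']+", sentence.lower()) if w not in STOPWORDS]
                return sum(freq[t] for t in tokens) / (len(tokens) or 1)

            ranked = sorted(sentences, key=score, reverse=True)[:n_sentences]
            return " ".join(s for s in sentences if s in ranked)

        sample = ("Text summarization produces a short substitute for a document. "
                  "Summarization helps retrieval because queries run against short summaries. "
                  "The weather was pleasant yesterday.")
        print(summarize(sample))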

  20. Simulation of a ground-layer adaptive optics system for the Kunlun Dark Universe Survey Telescope

    Institute of Scientific and Technical Information of China (English)

    Peng Jia; Sijiong Zhang

    2013-01-01

    Ground Layer Adaptive Optics (GLAO) is a recently developed technique extensively applied to ground-based telescopes, which mainly compensates for the wavefront errors induced by ground-layer turbulence to obtain an appropriate point spread function over a wide field of view. The compensation results mainly depend on the turbulence distribution. The atmospheric turbulence at Dome A in the Antarctic is mainly distributed below 15 meters, making it an ideal site for applications of GLAO. The GLAO system has been simulated for the Kunlun Dark Universe Survey Telescope, which will be set up at Dome A, and uses a rotating mirror to generate several laser guide stars and a wavefront sensor with a wide field of view to sequentially measure the wavefronts from different laser guide stars. The system is simulated on a computer and the parameters of the system are given, which provide detailed information about the design of a practical GLAO system.

  1. 76 FR 38203 - Proposed Information Collection; North American Woodcock Singing Ground Survey

    Science.gov (United States)

    2011-06-29

    ... Fish and Wildlife Service Proposed Information Collection; North American Woodcock Singing Ground... migratory bird populations. The North American Woodcock Singing Ground Survey is an essential part of the... Woodcock Singing Ground Survey. Service Form Number(s): 3-156. Type of Request: Extension of currently...

  2. Modification of polarization filtering technique in HF ground wave radar

    Institute of Scientific and Technical Information of China (English)

    Zhang Guoyi; Tan Zhongji; Wang Jiantao

    2006-01-01

    The polarization filter using three orthogonal linear polarization antennas can suppress more disturbances than the polarization filter using two orthogonal linear polarization antennas in HF ground wave radar. However, the algorithm of the three-dimensional filter is relatively complicated and not suitable for real-time processing, and it cannot use linear and nonlinear polarization vector translation techniques directly. A modified polarization filter is given which is simple and has the same suppression ability as the three-dimensional polarization filter; it has only half the parameters of the original one. Some problems concerning the estimation of polarization parameters and the selection of disturbances are discussed. A method of holding the phase of the radar backscatter signal constant is put forward so that non-stationary disturbance signals can be processed.

  3. Where am I? Creating spatial awareness in unmanned ground robots using SLAM: A survey

    Indian Academy of Sciences (India)

    Nitin Kumar Dhiman; Dipti Deodhare; Deepak Khemani

    2015-08-01

    This paper presents a survey of Simultaneous Localization And Mapping (SLAM) algorithms for unmanned ground robots. SLAM is the process of creating a map of the environment, sometimes unknown a priori, while at the same time localizing the robot in the same map. The map could be one of different types i.e. metrical, topological, hybrid or semantic. In this paper, the classification of algorithms is done in three classes: (i) Metric map generating approaches, (ii) Qualitative map generating approaches, and (iii) Hybrid map generating approaches. SLAM algorithms for both static and dynamic environments have been surveyed. The algorithms in each class are further divided based on the techniques used. The survey in this paper presents the current state-of-the-art methods, including important landmark works reported in the literature.

  4. IMAGE AUTHENTICATION TECHNIQUES AND ADVANCES SURVEY

    Directory of Open Access Journals (Sweden)

    Derroll David

    2015-10-01

    With the advanced technologies in the area of engineering, the world has become a smaller place and communication is at our fingertips. Multimedia sharing traffic through electronic media has increased tremendously in recent years with the higher use of social networking sites. The number of images uploaded to the internet per day is very large. Digital image security has become vulnerable due to increased transmission over non-secure channels and needs protection. Digital images play a crucial role in medical and military applications, and any tampering with them is a serious issue. Several approaches have been introduced to authenticate multimedia images. These approaches can be categorized into fragile and semi-fragile watermarking, conventional cryptography and digital signatures based on the image content. The aim of this paper is to provide a comparative study and also a survey of emerging techniques for image authentication. The important requirements for an efficient image authentication system design are discussed, along with the classification of image authentication into tamper detection, localization and reconstruction, and robustness against image processing operations. Furthermore, the concept of image content based authentication is explained.

  5. Ground vibration tests of a helicopter structure using OMA techniques

    Science.gov (United States)

    Ameri, N.; Grappasonni, C.; Coppotelli, G.; Ewins, D. J.

    2013-02-01

    This paper is focused on an assessment of the state-of-the-art of operational modal analysis (OMA) methodologies in estimating modal parameters from output responses on helicopter structures. For this purpose, a ground vibration test was performed on a real helicopter airframe. In the following stages, several OMA techniques were applied to the measured data and compared with the results from a typical input-output approach. The results presented are part of a more general research activity carried out in the Group of Aeronautical Research and Technology in Europe (GARTEUR) Action Group 19, helicopter technical activity, whose overall objective is the improvement of structural dynamic finite element models using in-flight test data. The structure considered is a medium-size helicopter, a time-expired Lynx Mk7 (XZ649) airframe. In order to have a comprehensive analysis, the behaviour of both frequency- and time-domain-based OMA techniques is considered for the modal parameter estimates. An accuracy index and the reliability of the OMA methods with respect to the standard EMA procedures, together with the evaluation of the influence of the experimental setup on the estimate of the modal parameters, will be presented in the paper.

  6. Survey of Cataract Surgical Techniques in Nigeria

    African Journals Online (AJOL)

    PROF SABE NWOSU

    Objective: To determine the techniques of cataract surgery as currently being ... include phacoemulsification, manual small incision sutureless cataract surgery (SICS) ... technology automated small incision phacoemulsification technique.

  7. A Survey on Design Pattern Recovery Techniques

    OpenAIRE

    Ghulam Rasool; Detlef Streitfdert

    2011-01-01

    The evaluation of design pattern recovery techniques and tools is significant, as a number of emergent techniques have been presented and used in the past to recover patterns from the source code of legacy applications. The problem of very diverse precision and recall values extracted by different pattern recovery techniques and tools on the same examined applications has not been investigated thoroughly. It is very desirable to compare features of existing techniques as an abundance of techniques supplemented wi...

  8. Adaptive Neuro-Fuzzy Technique for Autonomous Ground Vehicle Navigation

    Directory of Open Access Journals (Sweden)

    Auday Al-Mayyahi

    2014-11-01

    This article proposes an adaptive neuro-fuzzy inference system (ANFIS) for solving navigation problems of an autonomous ground vehicle (AGV). The system consists of four ANFIS controllers: two are used for regulating the left and right angular velocities of the AGV in order to reach the target position, and the other two are used for optimal heading adjustment in order to avoid obstacles. The two velocity controllers receive three sensor inputs, front distance (FD), right distance (RD) and left distance (LD), for the low-level motion control. The two heading controllers use the angle difference (AD) between the heading of the AGV and the angle to the target to choose the optimal direction. Simulation experiments have been carried out under two different scenarios to investigate the feasibility of the proposed ANFIS technique. The simulation results, obtained with the MATLAB software package, show that ANFIS is capable of performing the navigation and path planning task safely and efficiently in a workspace populated with static obstacles.

  9. Ground Penetrating Radar technique for railway track characterization in Portugal

    Science.gov (United States)

    De Chiara, Francesca; Fontul, Simona; Fortunato, Eduardo; D'Andrea, Antonio

    2013-04-01

    Maintenance actions are significant for transport infrastructures but, today, costs necessarily have to be limited. Proper quality control from the construction phase onwards is a key factor for a long life cycle and for a good economic policy. For this reason, suitable techniques have to be chosen, and non-destructive tests represent an efficient solution, as they allow infrastructure characteristics to be evaluated in a continuous or quasi-continuous way, saving time and costs and enabling changes to be made if test results do not comply with the project requirements. Ground Penetrating Radar (GPR) is a quick and effective technique to evaluate infrastructure condition in a continuous manner, replacing or reducing the use of traditional drilling methods. GPR application to railway infrastructures, during the construction and monitoring phases, is relatively recent. It is based on the measurement of layer thicknesses and the detection of structural changes. It also enables the assessment of the properties of the materials that constitute the infrastructure and the evaluation of different types of defects such as ballast pockets, fouled ballast, poor drainage, subgrade settlement and transition problems. These deteriorations are generally the causes of vertical deviations in track geometry and they cannot be detected by the common monitoring procedures, namely the measurements of track geometry. Moreover, the development of new GPR systems with higher antenna frequencies, better data acquisition systems, more user-friendly software and new algorithms for the calculation of material properties can lead to a regular use of GPR. Therefore, it represents a reliable technique to assess track geometry problems and consequently to improve maintenance planning. In Portugal, rail inspection is performed with Plasser & Theurer EM120 equipment and recently 400 MHz IDS antennas were installed on it. GPR tests were performed on the Portuguese rail network and, as a case study in this paper, a renewed track was

  10. Potential Enhancement of Ground Penetrating Surveys with Dispersion Properties

    Science.gov (United States)

    Tsai, C. A.; Ghent, R. R.; Boivin, A.

    2016-12-01

    Ground penetrating radar (GPR) is a nondestructive measurement technique that utilizes the transmission or reflection of electromagnetic waves to locate targets buried under Earth or artificial materials. GPR is now widely used in mining, civil engineering, archaeology and hydrology. One basic premise of surface GPR is that subsurface features will return reflections which are replicas of the transmitted signal. However, phase velocities of electromagnetic waves in real materials vary with frequency. This effect becomes more noticeable in the GPR frequency range with increasing moisture content. Dispersion leads to difficulty in interpreting the received signals because the reflected signals are distorted. However, the effects of dispersion on the signals may provide an opportunity to more fully characterize materials under test than is possible using traditional reflection-mode GPR techniques. In this work we present 3D-FDTD numerical modeling results using gprMax to systematically characterize the effect of dispersion on GPR signals. In addition to numerical results, we assess the feasibility of applying our results to terrestrial geophysical scenarios by measuring the dielectric permittivities of a selection of natural materials, including samples from a massive sulphide mine. Our goal is to establish a parameter space that systematically characterizes the effect of each parameter in the common dispersion models (Debye, Lorentz and Drude) on GPR signals. We begin the experiment by drying the samples completely and then adding water to the samples in 5 wt % increments. We measure the broadband relative permittivity and loss tangent using a coaxial transmission line for each state from 300 kHz to 8.5 GHz. The results provide a database for future GPR signal interpretation.
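
    For reference, the Debye model named in this record gives a complex relative permittivity that varies with frequency, which is what makes reflected GPR wavelets frequency dependent. The sketch below evaluates a single-pole Debye model over the stated measurement band; the relaxation parameters are generic illustrative values, not the measured ones from this work.

        import numpy as np

        def debye_permittivity(freq_hz, eps_inf, delta_eps, tau_s):
            """Complex relative permittivity of a single-pole Debye medium:
            eps(w) = eps_inf + delta_eps / (1 + j*w*tau)."""
            omega = 2 * np.pi * np.asarray(freq_hz)
            return eps_inf + delta_eps / (1 + 1j * omega * tau_s)

        # Illustrative parameters loosely in the range of a moist soil (assumed, not measured).
        freqs = np.logspace(np.log10(300e3), np.log10(8.5e9), 5)
        eps = debye_permittivity(freqs, eps_inf=5.0, delta_eps=15.0, tau_s=1e-10)
        for f, e in zip(freqs, eps):
            loss_tangent = -e.imag / e.real
            print(f"{f:10.3e} Hz  eps' = {e.real:6.2f}  eps'' = {-e.imag:6.2f}  tan(delta) = {loss_tangent:.3f}")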

  11. Application of Ground Penetrating Radar Surveys and GPS Surveys for Monitoring the Condition of Levees and Dykes

    Directory of Open Access Journals (Sweden)

    Tanajewski Dariusz

    2016-08-01

    This paper analyses the possibility of using integrated GPS (Global Positioning System) surveys and ground penetrating radar surveys to precisely locate damages to levees, particularly due to the activity of small fossorial mammals. The technology of intercommunication between ground penetrating radar (GPR) and an RTK (Real-Time Kinematic) survey unit, and the method of data combination, are presented. The errors which may appear during the survey work are also characterized. The procedure for processing the data so that the final results have a spatial character and are ready to be implemented in digital maps and geographic information systems (GIS) is also described.

  12. Application of Ground Penetrating Radar Surveys and GPS Surveys for Monitoring the Condition of Levees and Dykes

    Science.gov (United States)

    Tanajewski, Dariusz; Bakuła, Mieczysław

    2016-08-01

    This paper analyses the possibility of using integrated GPS (Global Positioning System) surveys and ground penetrating radar surveys to precisely locate damages to levees, particularly due to the activity of small fossorial mammals. The technology of intercommunication between ground penetrating radar (GPR) and an RTK (Real-Time Kinematic) survey unit, and the method of data combination, are presented. The errors which may appear during the survey work are also characterized. The procedure for processing the data so that the final results have a spatial character and are ready to be implemented in digital maps and geographic information systems (GIS) is also described.
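
    One simple way to combine the two data streams described in this record is to interpolate the RTK positions, logged at their own rate, onto the timestamps of the individual GPR traces so that every trace receives a coordinate. The sketch below shows that interpolation with NumPy; the timestamps and coordinates are invented, and the real systems may synchronize the instruments differently.

        import numpy as np

        # Hypothetical RTK fixes: time (s), easting (m), northing (m).
        gps_t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
        gps_e = np.array([500100.0, 500101.2, 500102.5, 500103.9, 500105.1])
        gps_n = np.array([5900200.0, 5900200.3, 5900200.7, 5900201.0, 5900201.4])

        # Hypothetical GPR trace timestamps (traces are recorded faster than GPS fixes).
        trace_t = np.arange(0.0, 4.0, 0.25)

        # Linear interpolation assigns an easting and northing to every trace.
        trace_e = np.interp(trace_t, gps_t, gps_e)
        trace_n = np.interp(trace_t, gps_t, gps_n)

        for t, e, n in zip(trace_t[:5], trace_e[:5], trace_n[:5]):
            print(f"t={t:4.2f} s  E={e:10.2f}  N={n:11.2f}")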

  13. Consideration on the restoring plan in the subsidence prone areas through the development of ground stability assessment techniques

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, K.S.; Kim, I.H.; Cho, W.J.; Song, W.K.; Synn, J.H.; Choi, S.O.; Yoon, C.H.; Hong, K.P.; Park, C. [Korea Institute of Geology Mining and Materials, Taejon (Korea)

    1998-12-01

    A ground stability assessment technique for subsidence-prone areas and a corresponding restoring plan need to be developed to secure the ground stability around mines that have been at rest or closed since the 1980s. Up to the present, the assessment of subsidence risk has been conducted only after statements from residents or the observation of symptoms of subsidence. Generally, the first stage of the assessment process is carried out through the analysis of surface and mining maps, geological surveys and interviews with residents. Drilling surveys, rock property tests, geotechnical rock and ground surveys, and numerical analyses belong to the second stage. After completion of this procedure, the stability of buildings and the severity of subsidence are determined. The acquisition of accurate in-situ data, the estimation of the mechanical properties of the rock mass, and the analysis of the basic mechanism greatly affect the assessment of subsidence risk. In this study, the development of the subsidence risk assessment method was combined with GIS techniques, which will be used to produce a subsidence risk information map. Numerical analyses in 2D and 3D using PFC and FLAC have been conducted to estimate the ground stability of the Moo-Geuk Mine area. The displacement behavior of the ground and the development of the failed zone due to the cavity were studied from the numerical modelling. The result of the ground stability assessment for the area in question shows that the subsidence risk is relatively small. It is, however, necessary to fill the cavity with suitable materials when new construction of buildings or roads is planned. Finally, measures to prevent subsidence and some case studies are presented; in particular, a case study on the measurement of ground movement in a mine is described in detail. (author). 27 refs., 27 tabs., 62 figs.

  14. ECG Feature Extraction Techniques - A Survey Approach

    CERN Document Server

    Karpagachelvi, S; Sivakumar, M

    2010-01-01

    ECG feature extraction plays a significant role in diagnosing most cardiac diseases. One cardiac cycle in an ECG signal consists of the P-QRS-T waves. A feature extraction scheme determines the amplitudes and intervals in the ECG signal for subsequent analysis. The amplitude and interval values of the P-QRS-T segment determine the functioning of the heart of every human. Recently, numerous techniques have been developed for analyzing the ECG signal. The proposed schemes were mostly based on Fuzzy Logic Methods, Artificial Neural Networks (ANN), Genetic Algorithms (GA), Support Vector Machines (SVM), and other signal analysis techniques. All these techniques and algorithms have their advantages and limitations. This paper discusses various techniques and transformations proposed earlier in the literature for extracting features from an ECG signal. In addition, this paper also provides a comparative study of various methods proposed by researchers for extracting features from the ECG signal.
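
    As a concrete example of the interval features mentioned in this record, the sketch below detects R peaks in a synthetic ECG-like signal and derives R-R intervals from them. It is a toy illustration using SciPy's peak finder, not one of the extraction schemes reviewed in the paper.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 250  # sampling rate in Hz
        t = np.arange(0, 10, 1 / fs)

        # Synthetic signal: a narrow spike every 0.8 s stands in for the QRS complex, plus noise.
        ecg = np.zeros_like(t)
        for beat in np.arange(0.5, 10, 0.8):
            ecg += np.exp(-((t - beat) ** 2) / (2 * 0.01 ** 2))
        ecg += 0.05 * np.random.default_rng(0).standard_normal(t.size)

        # R peaks: require a minimum height and a refractory distance of 0.4 s between beats.
        peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))

        rr_intervals = np.diff(peaks) / fs
        print(f"detected {peaks.size} R peaks, mean R-R interval {rr_intervals.mean():.3f} s "
              f"(~{60 / rr_intervals.mean():.0f} beats per minute)")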

  15. Survey of Diagnostic Techniques for Dynamic Components

    Science.gov (United States)

    2010-01-01

    This report surveys diagnostic techniques that have been developed for dynamic components such as bearings and gears. Excerpted findings note sensitivity to torque fluctuations, that NP4 deteriorates as the severity of the damage increases on multiple gear teeth, and observations on the crest factor. Topics covered include diagnostic or signal enhancement techniques and the time synchronous average.

  16. 40 CFR 141.403 - Treatment technique requirements for ground water systems.

    Science.gov (United States)

    2010-07-01

    ....403 Treatment technique requirements for ground water systems. (a) Ground water systems with significant deficiencies or source water fecal contamination. (1) The treatment technique requirements of this... requirements of this section. (3) When a significant deficiency is identified at a Subpart H public...

  17. 40 CFR 141.404 - Treatment technique violations for ground water systems.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Treatment technique violations for ground water systems. 141.404 Section 141.404 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS Ground Water Rule § 141.404 Treatment technique violations for...

  18. Confirming Ground Geophysical Survey for Mineralization in Al Bayda Area, Yemen

    Institute of Scientific and Technical Information of China (English)

    Faisal S. Al-Huzaim

    2003-01-01

    The Jabal Mabal prospect of the Al Bayda area is covered by meta-volcano-sedimentary rocks. The importance of the study area comes from previous studies, which proved that the area has mineralization zones at the periphery of Jabal Al-Mabal. These mineralized zones lie along northwest-southeast shear zones, which occur inside the meta-volcanic rocks. The previous studies concluded that the mineralization was mostly associated with sulphides, especially at the reduction zones. These sulphides give a good response to electrical or electromagnetic techniques. The present study uses electrically induced polarization and Genie electromagnetic surveys to explore the extension of the mineralization zones in the subsurface. The self-potential (SP) technique shows numerous mineralized zones. Most of these zones are distributed in the southern area along the contact between the quartz and gabbro. The induced polarization (IP) method has been applied to selected profiles to delineate the subsurface contact between calcite, gabbro and quartz in Al-Jarra Valley and to calculate the shape and depth of the mineralization zones in the subsurface along these profiles. The Genie electromagnetic survey, which has been applied on selected profiles, delineated some weak mineralization occurrences corresponding to the shear zones. The integration of the results obtained using these three techniques, in addition to the different ground geophysical methods previously used, makes it possible to determine the most appropriate zones for further exploration in the area of investigation.

  19. Investigations into near-real-time surveying for geophysical data collection using an autonomous ground vehicle

    Science.gov (United States)

    Phelps, Geoffrey A.; Ippolito, C.; Lee, R.; Spritzer, R.; Yeh, Y.

    2014-01-01

    The U.S. Geological Survey and the National Aeronautics and Space Administration are cooperatively investigating the utility of unmanned vehicles for near-real-time autonomous surveys of geophysical data collection. Initially focused on unmanned ground vehicle collection of magnetic data, this cooperative effort has brought unmanned surveying, precision guidance, near-real-time communication, on-the-fly data processing, and near-real-time data interpretation into the realm of ground geophysical surveying, all of which offer advantages over current methods of manned collection of ground magnetic data. An unmanned ground vehicle mission has demonstrated that these vehicles can successfully complete missions to collect geophysical data, and add advantages in data collection, processing, and interpretation. We view the current experiment as an initial phase in further unmanned vehicle data-collection missions, including aerial surveying.

  20. Audio Steganography Techniques-A Survey

    OpenAIRE

    Navneet Kaur; Sunny Behal

    2014-01-01

    We can communicate with each other by passing messages, which is not secure, but we can keep a communication secret by embedding the message into a carrier or by using special tools such as invisible ink, microdots, etc. Steganography is the science of communicating secret data in an appropriate carrier and has been used for hundreds of years. In the digital age, new techniques of hiding data inside a carrier have been invented, known as digital steganography. Nowadays, t...

  1. Superresolution imaging: a survey of current techniques

    Science.gov (United States)

    Cristóbal, G.; Gil, E.; Šroubek, F.; Flusser, J.; Miravet, C.; Rodríguez, F. B.

    2008-08-01

    Imaging plays a key role in many diverse areas of application, such as astronomy, remote sensing, microscopy, and tomography. Owing to imperfections of measuring devices (e.g., optical degradations, limited size of sensors) and instability of the observed scene (e.g., object motion, media turbulence), acquired images can be indistinct, noisy, and may exhibit insufficient spatial and temporal resolution. In particular, several external effects blur images. Techniques for recovering the original image include blind deconvolution (to remove blur) and superresolution (SR). The stability of these methods depends on having more than one image of the same frame. Differences between images are necessary to provide new information, but they can be almost unperceivable. State-of-the-art SR techniques achieve remarkable results in resolution enhancement by estimating the subpixel shifts between images, but they lack any apparatus for calculating the blurs. In this paper, after introducing a review of current SR techniques we describe two recently developed SR methods by the authors. First, we introduce a variational method that minimizes a regularized energy function with respect to the high resolution image and blurs. In this way we establish a unifying way to simultaneously estimate the blurs and the high resolution image. By estimating blurs we automatically estimate shifts with subpixel accuracy, which is inherent for good SR performance. Second, an innovative learning-based algorithm using a neural architecture for SR is described. Comparative experiments on real data illustrate the robustness and utilization of both methods.
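
    To make the basic SR idea concrete, the sketch below performs a naive shift-and-add reconstruction: several low-resolution frames with known integer offsets on a finer grid are placed onto that grid and averaged. It is a toy illustration of the classical baseline, not the variational or learning-based methods described in the paper.

        import numpy as np

        def shift_and_add(frames, shifts, factor):
            """Naive superresolution: place each low-resolution frame onto a grid `factor`
            times finer at its known (dy, dx) offset, then average overlapping contributions."""
            h, w = frames[0].shape
            acc = np.zeros((h * factor, w * factor))
            cnt = np.zeros_like(acc)
            for frame, (dy, dx) in zip(frames, shifts):
                acc[dy::factor, dx::factor] += frame
                cnt[dy::factor, dx::factor] += 1
            cnt[cnt == 0] = 1  # avoid division by zero where no frame contributed
            return acc / cnt

        # Toy ground truth and four shifted low-resolution observations (factor 2).
        rng = np.random.default_rng(2)
        truth = rng.random((64, 64))
        factor = 2
        shifts = [(0, 0), (0, 1), (1, 0), (1, 1)]
        frames = [truth[dy::factor, dx::factor] for dy, dx in shifts]

        # With these four shifts every high-resolution pixel is observed once, so the
        # noiseless reconstruction is exact and the error below is zero.
        estimate = shift_and_add(frames, shifts, factor)
        print("mean absolute error of the reconstruction:", float(np.abs(estimate - truth).mean()))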

  2. Ground-based intercomparison of two isoprene measurement techniques

    Directory of Open Access Journals (Sweden)

    E. Leibrock

    2003-01-01

    An informal intercomparison of two isoprene (C5H8) measurement techniques was carried out during the fall of 1998 at a field site located approximately 3 km west of Boulder, Colorado, USA. A new chemical ionization mass spectrometric technique (CIMS) was compared to a well-established gas chromatographic technique (GC). The CIMS technique utilized benzene cation chemistry to ionize isoprene. The isoprene levels measured by the CIMS were often larger than those obtained with the GC. The results indicate that the CIMS technique suffered from an anthropogenic interference associated with air masses from the Denver, CO metropolitan area as well as an additional interference occurring in clean conditions. However, the CIMS technique is also demonstrated to be sensitive and fast. Especially after the introduction of a tandem mass spectrometric technique, it is therefore a candidate for isoprene measurements in remote environments near isoprene sources.

  3. The use of continuous improvement techniques: A survey-based ...

    African Journals Online (AJOL)

    International Journal of Engineering, Science and Technology ... The use of continuous improvement techniques: A survey-based study of current practices ... Prior research has focused mainly on the effect of continuous improvement practices ...

  4. A pilot survey of impression materials and techniques used by ...

    African Journals Online (AJOL)

    A pilot survey of impression materials and techniques used by dentists in the fabrication ... Objective: To assess the choice of impression material and impression ...

  5. Monitoring beach changes using GPS surveying techniques

    Science.gov (United States)

    Morton, Robert; Leach, Mark P.; Paine, Jeffrey G.; Cardoza, Michael A.

    1993-01-01

    A need exists for frequent and prompt updating of shoreline positions, rates of shoreline movement, and volumetric nearshore changes. To effectively monitor and predict these beach changes, accurate measurements of beach morphology incorporating both shore-parallel and shore-normal transects are required. Although it is possible to monitor beach dynamics using land-based surveying methods, it is generally not practical to collect data of sufficient density and resolution to satisfy a three-dimensional beach-change model of long segments of the coast. The challenge to coastal scientists is to devise new beach monitoring methods that address these needs and are rapid, reliable, relatively inexpensive, and maintain or improve measurement accuracy.

  6. A Survey on different techniques of steganography

    Directory of Open Access Journals (Sweden)

    Kaur Harpreet

    2016-01-01

    Steganography is important due to the exponential development of the internet and the need of computer users for secret communication. Steganography is the art of invisible communication, keeping secret information inside other information. Steganalysis is the technology that attempts to defeat steganography by detecting and extracting the hidden information. Steganography is the process of embedding data in image, text/document, audio and video files. The paper also highlights how security is improved by applying various techniques of video steganography.

  7. Digital Sky Surveys from the ground: Status and Perspectives

    CERN Document Server

    Shanks, T

    2015-01-01

    I first review the status of Digital Sky Surveys. The focus will be on extragalactic surveys with an area of more than 100 sq.deg. The Sloan Digital Sky Survey is the archetype of such imaging surveys and it is its great success that has prompted great activity in this field. The latest surveys explore wider, fainter and higher resolution and also a longer wavelength range than SDSS. Many of these surveys overlap particularly in the S Hemisphere where we now have Pan-STARRS, DES and the ESO VST surveys, and our aim here is to compare their properties. Since there is no dedicated article on the VST ATLAS in this symposium, we shall especially review the properties of this particular survey. This easily fits onto our other main focus which is to compare overlapping Southern Surveys and see how they best fit with the available NIR imaging data. We conclude that the Southern Hemisphere will soon overtake the North in terms of multiwavelength imaging. However we note that the South has more limited opportunities f...

  8. Artificial Intrusion Detection Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Ashutosh Gupta

    2014-08-01

    Networking has become the most integral part of our cyber society. Everyone wants to connect with each other. With the advancement of network technology, networks are highly vulnerable to breaches that take information, and once information reaches the wrong hands it can do terrible things. In recent years, the number of attacks on networks has increased, which has drawn the attention of many researchers to this field. There has been much research on intrusion detection lately. Many methods have been devised which are very useful, but they can only detect attacks which have already taken place. These methods will always fail whenever there is a new attack which is not yet known to the networking world. In order to detect new intrusions in the network, researchers have devised artificial intelligence techniques for intrusion detection and prevention systems. In this paper we cover the types of evolutionary techniques that have been devised, their significance and their modifications.

  9. A survey of compiler optimization techniques

    Science.gov (United States)

    Schneck, P. B.

    1972-01-01

    Major optimization techniques of compilers are described and grouped into three categories: machine dependent, architecture dependent, and architecture independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code by using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at source-code level is also presented.
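
    As a small example of an architecture-independent, source-level optimization of the kind grouped in this record, the sketch below constant-folds arithmetic expressions in a Python syntax tree. It is only an illustration of the general idea and is not drawn from the surveyed report.

        import ast

        class ConstantFolder(ast.NodeTransformer):
            """Fold binary operations whose operands are literal constants."""
            def visit_BinOp(self, node):
                self.generic_visit(node)  # fold children first so nested expressions collapse
                if isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant):
                    try:
                        value = eval(compile(ast.Expression(node), "<fold>", "eval"))
                    except Exception:
                        return node  # e.g. division by zero: leave the expression unchanged
                    return ast.copy_location(ast.Constant(value=value), node)
                return node

        source = "y = (2 + 3) * 4 + x"
        tree = ast.parse(source)
        folded = ast.fix_missing_locations(ConstantFolder().visit(tree))
        print(ast.unparse(folded))  # -> y = 20 + x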

  10. Audio Steganography Techniques-A Survey

    Directory of Open Access Journals (Sweden)

    Navneet Kaur

    2014-06-01

    We can communicate with each other by passing messages, which is not secure, but we can keep a communication secret by embedding the message into a carrier or by using special tools such as invisible ink, microdots, etc. Steganography is the science of communicating secret data in an appropriate carrier and has been used for hundreds of years. In the digital age, new techniques of hiding data inside a carrier have been invented, known as digital steganography. Nowadays, the carrier of the message can be an image, audio, video or a text file. In this paper we have proposed a method to enhance the security level in audio steganography and also improve the quality by making 2-level steganography.
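
    A classic baseline for the audio embedding discussed in this record is least-significant-bit coding, in which each bit of the secret message replaces the lowest bit of one audio sample. The sketch below operates on a synthetic 16-bit sample buffer; it illustrates plain LSB embedding, not the 2-level scheme proposed in the paper.

        import numpy as np

        def embed_lsb(samples, message):
            """Hide message bytes in the least significant bits of 16-bit audio samples."""
            bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
            if bits.size > samples.size:
                raise ValueError("carrier too small for message")
            stego = samples.copy()
            stego[: bits.size] = (stego[: bits.size] & ~1) | bits
            return stego

        def extract_lsb(stego, n_bytes):
            """Recover n_bytes hidden by embed_lsb."""
            bits = (stego[: n_bytes * 8] & 1).astype(np.uint8)
            return np.packbits(bits).tobytes()

        # A synthetic carrier standing in for real PCM audio samples.
        carrier = np.random.default_rng(3).integers(-30000, 30000, size=44100, dtype=np.int16)
        secret = b"hidden message"
        stego = embed_lsb(carrier, secret)
        print(extract_lsb(stego, len(secret)))  # b'hidden message'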

  11. Survey on Techniques for Detecting Data Leakage

    Directory of Open Access Journals (Sweden)

    Bhosale Pranjali A

    2016-06-01

    In the current business scenario, critical data have to be shared and transferred by organizations to many stakeholders in order to complete particular tasks. Such critical data include intellectual copyright, patient information, etc. Activities like sharing and transferring such critical data involve threats such as leakage of information, misuse of data, illegal access to data and/or alteration of data. It is necessary to deal with this problem efficiently and effectively; popular solutions include the use of firewalls, data loss prevention tools and watermarking. But sometimes the culprit succeeds in overcoming such security measures; hence, if an organization is able to identify the guilty client responsible for the leakage of particular data, the risk of data leakage is reduced. Many systems have been proposed for this purpose, and this paper presents information about the techniques discussed in some of these methodologies.

  12. 2011 & 2012 report of prairie grouse breeding ground survey on Fort Niobrara NWR

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This memorandum summarizes the 2011 and 2012 prairie grouse lek survey on Fort Niobrara National Wildlife Refuge. Annual prairie grouse breeding ground counts were...

  13. 2015 report of prairie grouse breeding ground survey on Fort Niobrara NWR

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This memorandum summarizes the 2011 and 2012 prairie grouse lek survey on Fort Niobrara National Wildlife Refuge. Annual prairie grouse breeding ground counts were...

  14. 2010 report of prairie grouse breeding ground survey on Fort Niobrara NWR

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This memorandum summarizes the 2010 prairie grouse lek survey on Fort Niobrara National Wildlife Refuge. Annual prairie grouse breeding ground counts were conducted...

  15. Ground Survey for Wintering, Migratory Waterfowl on Pocosin Lakes National Wildlife Refuge: November 24, 1999

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The ground waterfowl surveys document the number of wintering, migratory waterfowl by species for each management unit on the Pungo Unit of Pocosin Lakes National...

  16. Ground Survey for Wintering, Migratory Waterfowl on Pocosin Lakes National Wildlife Refuge: November 26, 2002

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The ground waterfowl surveys document the number of wintering, migratory waterfowl by species for each management unit on the Pungo Unit of Pocosin Lakes National...

  17. Ground Survey for Wintering, Migratory Waterfowl on Pocosin Lakes National Wildlife Refuge: January 14, 2000

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The ground waterfowl surveys document the number of wintering, migratory waterfowl by species for each management unit on the Pungo Unit of Pocosin Lakes National...

  18. Ground Survey for Wintering, Migratory Waterfowl on Pocosin Lakes National Wildlife Refuge: December 22, 2003

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The ground waterfowl surveys document the number of wintering, migratory waterfowl by species for each management unit on the Pungo Unit of Pocosin Lakes National...

  19. Techniques for Leakage Power Reduction in Nanoscale Circuits: A Survey

    DEFF Research Database (Denmark)

    Liu, Wei

    This report surveys progress in the field of designing low power especially low leakage CMOS circuits in deep submicron era. The leakage mechanism and various recently proposed run time leakage reduction techniques are presented. Two designs from Cadence and Sony respectively, which can represent...... current industrial application of these techniques, are also illustrated....

  20. A survey of statistical downscaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E.; Storch, H. von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik

    1997-12-31

    The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCMs) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate the regional performance of global climate models by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.)
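
    A minimal sketch of the analog method mentioned above, assuming Euclidean distance between large-scale fields and a paired local observation for each historical field; the arrays, shapes and variables are invented for illustration and are not the authors' setup.

    ```python
    # Analog downscaling sketch: pick the closest observed large-scale pattern and reuse
    # its paired local observation as the downscaled value (illustrative assumptions).
    import numpy as np

    def analog_downscale(gcm_fields, obs_fields, obs_local):
        """gcm_fields: (n_sim, n_grid) fields to downscale;
        obs_fields: (n_obs, n_grid) historical large-scale fields;
        obs_local:  (n_obs,) local observation paired with each historical field."""
        downscaled = np.empty(gcm_fields.shape[0])
        for i, field in enumerate(gcm_fields):
            dist = np.linalg.norm(obs_fields - field, axis=1)  # Euclidean analog distance
            downscaled[i] = obs_local[np.argmin(dist)]         # value of the closest analog
        return downscaled

    rng = np.random.default_rng(0)
    obs_fields = rng.normal(size=(500, 40))            # stand-in historical pressure fields
    obs_local = rng.gamma(2.0, 3.0, size=500)          # stand-in station rainfall
    simulated = analog_downscale(rng.normal(size=(10, 40)), obs_fields, obs_local)
    ```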

  1. Ground Water Arsenic Contamination: A Local Survey in India

    Science.gov (United States)

    Kumar, Arun; Rahman, Md. Samiur; Iqubal, Md. Asif; Ali, Mohammad; Niraj, Pintoo Kumar; Anand, Gautam; Kumar, Prabhat; Abhinav; Ghosh, Ashok Kumar

    2016-01-01

    Background: At present, arsenic contamination of the ground water has caused many health-related problems among the village population residing in the middle Gangetic plain. In Bihar, about 16 districts have been reported to be affected by arsenic poisoning. For the ground water and health assessment, Simri village of Buxar district, a flood plain region of the river Ganga, was studied. Methods: In this study, 322 water samples were collected for arsenic estimation, and their results were analyzed. Furthermore, the correlation of arsenic contamination in ground water with depth and with distance from the river Ganga was analyzed. Results are presented as mean ± standard deviation, and the total variation present in a set of data was analyzed through one-way analysis of variance. The difference among mean values was analyzed by applying Dunnett's test. The criterion for statistical significance was set at P < 0.05. Elevated arsenic concentrations were observed in hand pumps, and the correlation of arsenic concentration with the depth of the hand pumps and with the distance from the river Ganga was also significant. Conclusions: The present study concludes that in Simri village there is high arsenic contamination of the ground water in all the strips. Such a huge population is at very high risk, leaving the village on the verge of a health crisis. Therefore, an immediate strategy is required to combat the present problem. PMID:27625765

  2. Ground water arsenic contamination: A local survey in India

    Directory of Open Access Journals (Sweden)

    Arun Kumar

    2016-01-01

    Conclusions: The present study concludes that in Simri village there is high arsenic contamination of the ground water in all the strips. Such a huge population is at very high risk, leaving the village on the verge of a health crisis. Therefore, an immediate strategy is required to combat the present problem.

  3. Ground Water Arsenic Contamination: A Local Survey in India.

    Science.gov (United States)

    Kumar, Arun; Rahman, Md Samiur; Iqubal, Md Asif; Ali, Mohammad; Niraj, Pintoo Kumar; Anand, Gautam; Kumar, Prabhat; Abhinav; Ghosh, Ashok Kumar

    2016-01-01

    At present, arsenic contamination of the ground water has caused many health-related problems among the village population residing in the middle Gangetic plain. In Bihar, about 16 districts have been reported to be affected by arsenic poisoning. For the ground water and health assessment, Simri village of Buxar district, a flood plain region of the river Ganga, was studied. In this study, 322 water samples were collected for arsenic estimation, and their results were analyzed. Furthermore, the correlation of arsenic contamination in ground water with depth and with distance from the river Ganga was analyzed. Results are presented as mean ± standard deviation, and the total variation present in a set of data was analyzed through one-way analysis of variance. The difference among mean values was analyzed by applying Dunnett's test. The criterion for statistical significance was set at P < 0.05. Elevated arsenic concentrations were observed in hand pumps, and the correlation of arsenic concentration with the depth of the hand pumps and with the distance from the river Ganga was also significant. The present study concludes that in Simri village there is high arsenic contamination of the ground water in all the strips. Such a huge population is at very high risk, leaving the village on the verge of a health crisis. Therefore, an immediate strategy is required to combat the present problem.

  4. Application of Distributed Optical Fiber Sensing Technique in Monitoring the Ground Deformation

    Directory of Open Access Journals (Sweden)

    Jin Liu

    2017-01-01

    The monitoring of ground deformation is important for the prevention and control of geological disasters including land subsidence, ground fissures, surface collapse, and landslides. In this study, a distributed optical fiber sensing technique based on Brillouin Optical Time-Domain Analysis (BOTDA) was used to monitor ground deformation. The principle behind BOTDA is first introduced, and then a laboratory calibration test and a physical model test were carried out. Finally, BOTDA-based monitoring of a ground fissure was carried out at a test site. Experimental results show that the distributed optical fiber can measure soil strain during the ground deformation process, and the strain curves clearly reflected the regions of soil compression and tension. During the field test in Wuxi City, China, the ground fissure deformation area was monitored accurately, and the deformation trend could also be obtained to forecast and warn against ground fissure hazards.
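
    For context, the strain reported by a BOTDA interrogator is, to first order, proportional to the measured Brillouin frequency shift relative to a loose-fiber baseline. The sketch below uses a typical baseline frequency and strain coefficient as stated assumptions; they are not the calibration values obtained in the study.

    ```python
    # Brillouin frequency shift to strain conversion sketch (typical illustrative constants).
    import numpy as np

    def brillouin_shift_to_strain(freq_ghz, baseline_ghz=10.85, coeff_mhz_per_ustrain=0.05):
        """Convert measured Brillouin frequencies (GHz) along the fiber to microstrain."""
        shift_mhz = (freq_ghz - baseline_ghz) * 1e3
        return shift_mhz / coeff_mhz_per_ustrain

    measured = np.array([10.850, 10.853, 10.870, 10.851])   # GHz along the sensing cable
    microstrain = brillouin_shift_to_strain(measured)        # -> 0, 60, 400, 20 microstrain
    ```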

  5. Advanced array techniques for unattended ground sensor applications

    Energy Technology Data Exchange (ETDEWEB)

    Followill, F.E.; Wolford, J.K.; Candy, J.V.

    1997-05-06

    Sensor arrays offer opportunities to beamform, and time-frequency analyses offer additional insights into the wavefield data. Data collected while monitoring three different sources with unattended ground sensors in a 16-element, small-aperture (approximately 5 meters) geophone array are used as examples of model-based seismic signal processing on actual geophone array data. The three sources monitored were: (Source 01) a frequency-modulated chirp of an electromechanical shaker mounted on the floor of an underground bunker, with three 60-second time-windows corresponding to (a) a 50 Hz to 55 Hz sweep, (b) a 60 Hz to 70 Hz sweep, and (c) an 80 Hz to 90 Hz sweep; (Source 02) a single transient impact of a hammer striking the floor of the bunker, with twenty seconds of data and the transient event approximately at the mid-point of the time window; and (Source 11) the transient event of a diesel generator turning on, including a few seconds before the turn-on time and a few seconds after the generator reaches steady-state conditions. The high-frequency seismic array was positioned at the ground surface 150 meters north of the underground bunker. Four Y-shaped subarrays (each with 2-meter apertures) were deployed in a Y-shaped pattern (with a 6-meter aperture) using a total of 16 3-component, high-frequency geophones. These 48 channels of seismic data were recorded at 6000 and 12000 samples per second on 16-bit data loggers. Representative examples of the data and analyses illustrate the results of this experiment.
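
    As an illustration of the beamforming step mentioned above, a minimal time-domain delay-and-sum sketch is given below; the array geometry, wave slowness and sampling rate are invented for illustration and do not reproduce the experiment's processing chain.

    ```python
    # Delay-and-sum beamforming sketch for a small geophone array (illustrative assumptions).
    import numpy as np

    def delay_and_sum(traces, positions, azimuth_deg, slowness_s_per_m, fs_hz):
        """traces: (n_sensors, n_samples); positions: (n_sensors, 2) sensor offsets in metres."""
        direction = np.array([np.cos(np.radians(azimuth_deg)), np.sin(np.radians(azimuth_deg))])
        delays = positions @ direction * slowness_s_per_m    # arrival delay per sensor (s)
        shifts = np.round(delays * fs_hz).astype(int)         # nearest-sample shifts
        beam = np.zeros(traces.shape[1])
        for trace, s in zip(traces, shifts):
            beam += np.roll(trace, -s)                        # undo the delay and stack
        return beam / len(traces)

    rng = np.random.default_rng(0)
    positions = rng.uniform(-3, 3, size=(16, 2))              # 16 geophones, ~6 m aperture
    traces = rng.standard_normal((16, 6000))                  # stand-in 1 s of data at 6 kHz
    beam = delay_and_sum(traces, positions, azimuth_deg=40.0,
                         slowness_s_per_m=1 / 400.0, fs_hz=6000)
    ```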

  6. The detectability of archaeological structures beneath the soil using the ground penetrating radar technique

    Science.gov (United States)

    Ferrara, C.; Barone, P. M.; Pajewski, L.; Pettinelli, E.; Rossi, G.

    2012-04-01

    The traditional excavation tools applied to Archaeology (i.e. trowels, shovels, bulldozers, etc.) produce, generally, a fast and invasive reconstruction of the ancient past. The geophysical instruments, instead, seem to go in the opposite direction giving, rapidly and non-destructively, geo-archaeological information. Moreover, the economic aspect should not be underestimated: where the former invest a lot of money in order to carry out an excavation or restoration, the latter spend much less to manage a geophysical survey, locating precisely the targets. Survey information gathered using non-invasive methods contributes to the creation of site strategies, conservation, preservation and, if necessary, accurate location of excavation and restoration units, without destructive testing methods, also in well-known archaeological sites [1]-[3]. In particular, Ground Penetrating Radar (GPR) has, recently, become the most important physical technique in archaeological investigations, allowing the detection of targets with both very high vertical and horizontal resolution, and has been successfully applied both to archaeological and diagnostic purposes in historical and monumental sites [4]. GPR configuration, antenna frequency and survey modality can be different, depending on the scope of the measurements, the nature of the site or the type of targets. Two-dimensional (2D) time/depth slices and radargrams should be generated and integrated with information obtained from other buried or similar artifacts to provide age, structure and context of the surveyed sites. In the present work, we present three case-histories on well-known Roman archaeological sites in Rome, in which GPR technique has been successfully used. To obtain 2D maps of the explored area, a bistatic GPR (250MHz and 500MHz antennas) was applied, acquiring data along several parallel profiles. The GPR results reveal the presence of similar circular anomalies in all the investigated archaeological sites. In

  7. Two-Stage MAS Technique for Analysis of DRA Elements and Arrays on Finite Ground Planes

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal; Breinbjerg, Olav

    2007-01-01

    A two-stage Method of Auxiliary Sources (MAS) technique is proposed for analysis of dielectric resonator antenna (DRA) elements and arrays on finite ground planes (FGPs). The problem is solved by first analysing the DRA on an infinite ground plane (IGP) and then using this solution to model the FGP...... problem....

  8. Fault Detection Using Polarimetric Single-Input-Multi-Output Ground Penetrating Radar Technique in Mason, Texas

    Science.gov (United States)

    Amara, A.; Everett, M. E.

    2014-12-01

    At the Mason Mountain Wildlife Management Area (MMWMA) near Mason, Texas, we conducted a 2D ground penetrating radar (GPR) survey using a single-input-multi-output (SIMO) acquisition technique to image a Pennsylvanian high-angle normal fault. At the MMWMA, the surface geology is mapped extensively but the subsurface remains largely unknown. The main objectives of our study are to develop a detailed subsurface structural image of the fault, to evaluate existing hypotheses on fault development, and to develop and apply a new methodology based on polarimetric SIMO acquisition geometry. This new methodology allows the subsurface structures to be viewed simultaneously from different angles and can help reduce noise caused by the heterogeneities that affect the electromagnetic waves. We used a pulseEKKO pro 200 GPR with 200 MHz antennae to acquire 8 north-south lines across the fault. Each line is 30 meters long, with the transmitter starting on the Town Mountain Granite (footwall) and the receiver stepping 40 cm at a time until the end of the line, crossing the fault onto the Hickory Sandstone (hanging wall). Each pass consisted of a stationary transmitter antenna and a moving receiver antenna. The data were initially processed with standard steps including a low-cut dewow filter, a background subtraction filter and gain control. Advanced processing techniques include migration, phased-array processing, velocity analysis, and normal moveout. We will compare the GPR results with existing geophysical datasets at the same site, including electromagnetic (EM), seismic, and seismoelectric data.
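
    Two of the standard processing steps named above, background subtraction and a simple time-varying gain, can be sketched as follows; the radargram array and gain law are illustrative assumptions, not the survey's actual parameters.

    ```python
    # GPR radargram processing sketch: mean-trace removal and a time-linear gain
    # (illustrative assumptions, not the study's processing parameters).
    import numpy as np

    def background_subtraction(radargram):
        """radargram: (n_traces, n_samples); remove the mean trace to suppress horizontal banding."""
        return radargram - radargram.mean(axis=0, keepdims=True)

    def linear_gain(radargram, dt_s, alpha_per_s=2e8):
        """Apply a simple time-linear gain, 1 + alpha * t, to boost later (deeper) samples."""
        t = np.arange(radargram.shape[1]) * dt_s
        return radargram * (1.0 + alpha_per_s * t)

    profile = np.random.randn(200, 512)                 # stand-in 200-trace GPR line
    processed = linear_gain(background_subtraction(profile), dt_s=0.4e-9)
    ```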

  9. A Review About SAR Technique for Shallow Water Bathymetry Surveys

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Synthetic Aperture Radar (SAR) has become one of the important tools for shallow water bathymetry surveys, with significant economic efficiency compared with traditional bathymetry surveys. Numerical models have been developed to simulate shallow water bathymetry SAR images, and inversion of these models makes it possible to assess water depths from SAR images. In this paper, these numerical models of the SAR technique are reviewed, and examples are illustrated, including from the coastal areas of China. Some issues concerning the available SAR techniques and future research directions are also discussed.

  10. A Survey of Advanced Microwave Frequency Measurement Techniques

    Directory of Open Access Journals (Sweden)

    Anand Swaroop Khare

    2012-06-01

    Microwaves are radio waves with wavelengths ranging from as long as one meter to as short as one millimeter, or equivalently, with frequencies between 300 MHz and 300 GHz. The science of photonics includes the generation, emission, modulation, signal processing, switching, transmission, amplification, detection and sensing of light. Microwave photonics has been introduced to achieve ultra-broadband signal processing. Instantaneous Frequency Measurement (IFM) receivers play an important role in electronic warfare. Technologies used for signal processing include conventional direct Radio Frequency (RF) techniques, digital techniques, intermediate frequency (IF) techniques and photonic techniques. Direct RF techniques suffer from increased loss, high dispersion, and unwanted radiation problems at high frequencies. Systems that use traditional RF techniques can be bulky and often lack the agility required to perform advanced signal processing in rapidly changing environments. In this paper we present a survey of microwave frequency measurement techniques, categorized by the different approaches adopted. The paper reviews the major advances in microwave frequency measurement research and, using these approaches, identifies the features and categories of the surveyed existing work.

  11. Remote and terrestrial ground monitoring techniques integration for hazard assessment in mountain areas

    Science.gov (United States)

    Chinellato, Giulia; Kenner, Robert; Iasio, Christian; Mair, Volkmar; Mosna, David; Mulas, Marco; Phillips, Marcia; Strada, Claudia; Zischg, Andreas

    2014-05-01

    In high mountain regions the choice of appropriate sites for infrastructure such as roads, railways, cable cars or hydropower dams is often very limited. In parallel, the increasing demand for supply infrastructure in the Alps drives a continuous transformation of the territory. Precautionary monitoring therefore plays a new and fundamental role in risk governance and may surpass the modeling of future events, which has so far been the predominant approach to these issues. Moreover, adopting alternatives to the more exclusive methodologies reduces costs and increases the frequency of measurements, continuously updating the knowledge of existing hazard conditions in the most susceptible territories. The scale of the observed area and the multiple purposes of such regional ordinary surveys make it convenient to adopt radar satellite-based systems, but these need to be integrated with terrestrial systems for validation and eventual early-warning purposes. Significant progress over the past decade in Remote Sensing (RS), proximal sensing and integrated sensor-network systems now provides technologies that allow monitoring systems to be implemented for ordinary surveys of extensive areas or regions affected by active natural processes and slope instability. The Interreg project SloMove aims to provide solutions for such challenges and focuses on remote sensing techniques for monitoring mass movements at two test sites, in South Tyrol (Italy) and in Grisons Canton (Switzerland). The topics faced in this project concern mass movement and slope deformation monitoring techniques, focusing mainly on the integration of multi-temporal interferometry, a new generation of terrestrial technologies for differential digital terrain model elaboration provided by terrestrial laser scanning (TLS), and GNSS-based topographic surveys, which are used not only for validation purposes, but also for

  12. Fault Based Techniques for Testing Boolean Expressions: A Survey

    CERN Document Server

    Badhera, Usha; Taruna, S

    2012-01-01

    Boolean expressions are a major focus of specifications and are very prone to the introduction of faults; this survey presents various fault-based testing techniques. It identifies that the techniques differ in their fault detection capabilities and in the generation of test suites. Techniques such as cause-effect graphing, the meaningful impact strategy, the Branch Operator Strategy (BOR), BOR+MI, MUMCUT, and Modified Condition/Decision Coverage (MCDC) are considered. This survey describes the basic algorithms and fault categories used by these strategies for evaluating their performance. Finally, it contains short summaries of papers that use Boolean expressions to specify requirements for detecting faults. These techniques have been empirically evaluated by various researchers on a simplified safety-related real-time control system.

  13. Survey Study of Moso Bamboo Management Techniques Dissemination in Zhejiang

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Through a PRA survey of 1,245 farmer households in 10 key Moso bamboo (Phyllostachys pubescens) production counties, the sources of and demand for management techniques in Zhejiang were studied. Principal factor analysis revealed that experience and traditional knowledge are currently the major technical sources for farmer households' Moso bamboo forest management techniques and that the demonstrative household is a highly expected technical source, in which the prime factor is interpersonal dissemination ...

  14. Distinguishing grass from ground using LiDAR: Techniques and applications

    Science.gov (United States)

    Pelletier, J. D.; Swetnam, T.; Papuga, S. A.; Nelson, K.; Brooks, P. D.; Harpold, A. A.; Chorover, J.

    2011-12-01

    grass height. A bare-earth DEM that corrects for the effects of dense vegetation can then be constructed by subtracting the estimated mean grass height from the mean of the LiDAR first returns. We illustrate two applications of this method. First, spectral analysis of grass-height raster products of the Valles Caldera reveals fractal patterns that reflect the roles of geomorphology (e.g. height above active channel) and small-scale disturbances on grass growth and hence on the spatial variations in grass height. Second, snow thicknesses mapped by airborne LiDAR in the Valles Caldera systematically under-predict the actual snow thickness in riparian areas because the ground surface in the snow-off DEM fails to represent the true ground surface in areas of tall, dense grass. By comparing a grass-corrected LiDAR-derived snow thickness map to the results of snow survey data acquired during the time of the snow-on LiDAR flight, we show that the techniques we developed minimize this problem.
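
    A minimal sketch of the corrections described above, assuming co-registered rasters and a single mean grass-height estimate; all values are invented for illustration.

    ```python
    # Grass-height correction sketch for bare-earth DEM and snow-thickness estimates
    # (illustrative values, not the study's data).
    import numpy as np

    def bare_earth_dem(first_return_mean, grass_height_mean):
        """Subtract the estimated mean grass height from the mean first-return surface."""
        return first_return_mean - grass_height_mean

    def snow_thickness(snow_on_surface, snow_off_dem, grass_height_mean):
        """Correct the snow-off DEM downward by the grass height before differencing."""
        return snow_on_surface - (snow_off_dem - grass_height_mean)

    first_returns = np.full((50, 50), 2201.4)   # stand-in snow-off first-return raster (m)
    grass = np.full((50, 50), 0.35)             # assumed 35 cm mean grass height
    snow_on = np.full((50, 50), 2201.9)         # stand-in snow-on LiDAR surface (m)

    dem = bare_earth_dem(first_returns, grass)              # 2201.05 m
    depth = snow_thickness(snow_on, first_returns, grass)   # 0.85 m instead of the 0.5 m
                                                            # obtained without the correction
    ```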

  15. Nondestructive Technique Survey for Assessing Integrity of Composite Firing Vessel

    Energy Technology Data Exchange (ETDEWEB)

    Tran, A.

    2000-08-01

    The repeated use and limited lifetime of a composite firing vessel compel a need to survey techniques for monitoring the structural integrity of the vessel in order to determine when it should be retired. Various nondestructive techniques were researched and evaluated based on their applicability to the vessel. The methods were visual inspection, liquid penetrant testing, magnetic particle testing, surface-mounted strain gauges, thermal inspection, acoustic emission, ultrasonic testing, radiography, eddy current testing, and embedded fiber optic sensors. It was determined that embedded fiber optic sensors are the most promising technique due to their ability to be embedded within layers of composites and their immunity to electromagnetic interference.

  16. EMBLA 2002: an Optical and Ground Survey in Hessdalen

    Science.gov (United States)

    Teodorani, M.; Nobili, G.

    2002-10-01

    A two-week scientific expedition to Hessdalen, aimed at investigating mysterious atmospheric light phenomena in the field, was carried out in August 2002 by the physics section of an Italian team of scientists. Results are presented and discussed. Photometric analysis shows that the light phenomenon is able to produce a luminous power of up to 100 kW. A 3-D analysis of photo frames shows that the luminous phenomenon doesn't resemble canonical plasma features (a sharply Gaussian PSF), unless the light phenomenon is caused by one recently discovered natural light-ball of BL type whose light distribution (PSF) might be able to simulate a uniformly illuminated solid. A comparison of the light distribution in different time-sequential frames shows that the apparent slightly exponential wings of the PSF features are probably due to variations of atmospheric turbulence and transparency and not to intrinsic properties. Maximum phases of luminosity of the radiating surface are demonstrated to be due to the sudden apparition of a cluster of co-existing light-balls at constant temperature, while inflation of the light-balls is ruled out. Spectra show no resolved lines but a three-peaked feature which might be attributed either to some kind of artificial illumination system or to a mixture of many blended lines due to several chemical elements (most probably silicon). The results of a lab analysis of ground samples show that some powder collected near a river contains an anomalous iron sphere of micrometric dimensions. A biophysical research proposal aimed at studying the relation between the EM field produced by the phenomenon and the electrical activity of the human body is also presented. On the basis of this third explorative experience, the importance of having a sophisticated portable opto-electronic station at one's disposal (missing at present) is stressed for the future.

  17. Survey on Chatbot Design Techniques in Speech Conversation Systems

    Directory of Open Access Journals (Sweden)

    Sameera A. Abdul-Kader

    2015-07-01

    Human-Computer Speech is gaining momentum as a technique of computer interaction. There has been a recent upsurge in speech-based search engines and assistants such as Siri, Google Chrome and Cortana. Natural Language Processing (NLP) techniques such as NLTK for Python can be applied to analyse speech, and intelligent responses can be found by designing an engine to provide appropriate human-like responses. This type of programme is called a Chatbot, which is the focus of this study. This paper presents a survey on the techniques used to design Chatbots, and a comparison is made between different design techniques from nine carefully selected papers according to the main methods adopted. These papers are representative of the significant improvements in Chatbots in the last decade. The paper discusses the similarities and differences in the techniques and examines in particular the Loebner prize-winning Chatbots.

  18. High precision survey and alignment techniques in accelerator construction

    CERN Document Server

    Gervaise, J

    1974-01-01

    Basic concepts of precision surveying are briefly reviewed, and an historical account is given of instruments and techniques used during the construction of the Proton Synchrotron (1954-59), the Intersecting Storage Rings (1966-71), and the Super Proton Synchrotron (1971). A nylon wire device, distinvar, invar wire and tape, and recent automation of the gyrotheodolite and distinvar as well as auxiliary equipment (polyurethane jacks, Centipede) are discussed in detail. The paper ends summarizing the present accuracy in accelerator metrology, giving an outlook of possible improvement, and some aspects of staffing for the CERN Survey Group. (0 refs).

  19. Assessment of ground-based monitoring techniques applied to landslide investigations

    Science.gov (United States)

    Uhlemann, S.; Smith, A.; Chambers, J.; Dixon, N.; Dijkstra, T.; Haslam, E.; Meldrum, P.; Merritt, A.; Gunn, D.; Mackay, J.

    2016-01-01

    A landslide complex in the Whitby Mudstone Formation at Hollin Hill, North Yorkshire, UK is periodically re-activated in response to rainfall-induced pore-water pressure fluctuations. This paper compares long-term measurements (i.e., 2009-2014) obtained from a combination of monitoring techniques that have been employed together for the first time on an active landslide. The results highlight the relative performance of the different techniques, and can provide guidance for researchers and practitioners for selecting and installing appropriate monitoring techniques to assess unstable slopes. Particular attention is given to the spatial and temporal resolutions offered by the different approaches that include: Real Time Kinematic-GPS (RTK-GPS) monitoring of a ground surface marker array, conventional inclinometers, Shape Acceleration Arrays (SAA), tilt meters, active waveguides with Acoustic Emission (AE) monitoring, and piezometers. High spatial resolution information has allowed locating areas of stability and instability across a large slope. This has enabled identification of areas where further monitoring efforts should be focused. High temporal resolution information allowed the capture of 'S'-shaped slope displacement-time behaviour (i.e. phases of slope acceleration, deceleration and stability) in response to elevations in pore-water pressures. This study shows that a well-balanced suite of monitoring techniques that provides high temporal and spatial resolutions on both measurement and slope scale is necessary to fully understand failure and movement mechanisms of slopes. In the case of the Hollin Hill landslide it enabled detailed interpretation of the geomorphological processes governing landslide activity. It highlights the benefit of regularly surveying a network of GPS markers to determine areas for installation of movement monitoring techniques that offer higher resolution both temporally and spatially. The small sensitivity of tilt meter measurements

  20. The 6-GHz Multibeam Maser Survey I. Techniques

    CERN Document Server

    Green, J A; Fuller, G A; Avison, A; Breen, S L; Brooks, K; Burton, M G; Chrysostomou, A; Cox, J; Diamond, P J; Ellingsen, S P; Gray, M D; Hoare, M G; Masheder, M R W; McClure-Griffiths, N M; Pestalozzi, M; Phillips, C; Quinn, L; Thompson, M A; Voronkov, M; Walsh, A; Ward-Thompson, D; Wong-McSweeney, D; Yates, J A; Cohen, R J

    2008-01-01

    A new 7-beam 6-7 GHz receiver has been built to survey the Galaxy and the Magellanic Clouds for newly forming high-mass stars that are pinpointed by strong methanol maser emission at 6668 MHz. The receiver was jointly constructed by Jodrell Bank Observatory (JBO) and the Australia Telescope National Facility (ATNF) and allows simultaneous coverage at 6668 and 6035 MHz. It was successfully commissioned at Parkes in January 2006 and is now being used to conduct the Parkes-Jodrell multibeam maser survey of the Milky Way. This will be the first systematic survey of the entire Galactic plane for masers of not only 6668-MHz methanol, but also 6035-MHz excited-state hydroxyl. The survey is two orders of magnitude faster than most previous systematic surveys and has an rms noise level of ~0.17 Jy. This paper describes the observational strategy, techniques and reduction procedures of the Galactic and Magellanic Cloud surveys, together with deeper, pointed, follow-up observations and complementary observations with oth...

  1. Surveying co-located space geodesy techniques for ITRF computation

    Science.gov (United States)

    Sarti, P.; Sillard, P.; Vittuari, L.

    2003-04-01

    We present a comprehensive operational methodology, based on classical geodetic triangulation and trilateration, that allows the determination of the reference points of the five space geodesy techniques used in ITRF computation (i.e.: DORIS, GPS, LLR, SLR, VLBI). Most of the time, for a single technique, the reference point is not directly accessible and measurable. Likewise, no mechanically determined ex-centre with respect to an external and measurable point is usually given. In these cases, it is not possible to directly measure the sought reference points, and it is even less straightforward to obtain the statistical information relating these points for different techniques. We outline the most general practical surveying methodology that permits recovery of the reference points of the different techniques regardless of their physical materialization. We also give a detailed analytical approach for less straightforward cases (e.g.: non-geodetic VLBI antennae and SLR/LLR systems). We stress the importance of surveying instrumentation and procedure in achieving the best possible results and outline the impact of the information retrieved with our method on ITRF computation. In particular, we give numerical examples of the computation of the reference point of VLBI antennae (Ny Aalesund and Medicina) and the computation of the ex-centre vector linking the co-located VLBI and GPS techniques in Medicina (Italy). Special attention was paid to the rigorous derivation of statistical elements, which will be presented separately.

  2. Supplementary report on surface-water and ground-water surveys, Nueces River Basin, Texas

    Science.gov (United States)

    Broadhurst, W.L.; Ellsworth, C.E.

    1950-01-01

    A report on the ground-water and surface-water surveys of the Nueces River Basin was included in a report by the Bureau of Reclamation, entitled "Comprehensive plan for water-resources development of the Nueces River Basin project planning report number 5-14.04-3, February 1946".

  3. 40 CFR 141.401 - Sanitary surveys for ground water systems.

    Science.gov (United States)

    2010-07-01

    ...) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS Ground Water Rule § 141.401..., maintenance, and monitoring compliance of a public water system to evaluate the adequacy of the system, its sources and operations and the distribution of safe drinking water. (c) The sanitary survey must include...

  4. Ground survey of active Central American volcanoes in November - December 1973

    Science.gov (United States)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1974-01-01

    The author has identified the following significant results. Thermal anomalies at two volcanoes, Santiaguito and Izalco, have grown in size in the past six months, based on repeated ground surveys. Thermal anomalies at Pacaya volcano have become less intense in the same period. Large (500 m diameter) thermal anomalies presently exist at 3 volcanoes, and smaller-scale anomalies are found at nine other volcanoes.

  5. Ground Data System Risk Mitigation Techniques for Faster, Better, Cheaper Missions

    Science.gov (United States)

    Catena, John J.; Saylor, Rick; Casasanta, Ralph; Weikel, Craig; Powers, Edward I. (Technical Monitor)

    2000-01-01

    With the advent of faster, cheaper, and better missions, NASA Projects acknowledged that a higher level of risk was inherent and accepted with this approach. It was incumbent, however, upon each component of the Project, whether spacecraft, payload, launch vehicle, or ground data system, to ensure that the mission would nevertheless be an unqualified success. The Small Explorer (SMEX) program's ground data system (GDS) team developed risk mitigation techniques to achieve these goals starting in 1989. These techniques have evolved through the SMEX series of missions and are practiced today under the Triana program. These techniques are: (1) Mission Team Organization--empowerment of a close-knit ground data system team comprising system engineering, software engineering, testing, and flight operations personnel; (2) Common Spacecraft Test and Operational Control System--utilization of the pre-launch spacecraft integration system as the post-launch ground data system on-orbit command and control system; (3) Utilization of operations personnel in pre-launch testing--making the flight operations team an integrated member of the spacecraft testing activities at the beginning of the spacecraft fabrication phase; (4) Consolidated Test Team--combined system, mission readiness and operations testing to optimize test opportunities with the ground system and spacecraft; and (5) Reuse of Spacecraft, Systems and People--reuse of people, software and on-orbit spacecraft throughout the SMEX mission series. The SMEX ground system development approach for faster, cheaper, better missions has been very successful. This paper will discuss these risk management techniques in the areas of ground data system design, implementation, test, and operational readiness.

  6. TESTING DIFFERENT SURVEY TECHNIQUES TO MODEL ARCHITECTONIC NARROW SPACES

    Directory of Open Access Journals (Sweden)

    A. Mandelli

    2017-08-01

    In the architectural survey field, there has been a spread of a vast number of automated techniques. However, it is important to underline the gap that exists between the technical specification sheet of a particular instrument and its usability, accuracy and level of automation reachable in real-case scenarios, especially in the Cultural Heritage (CH) field. In fact, even if the technical specifications (range, accuracy and field of view) are known for each instrument, their functioning and features are influenced by the environment, shape and materials of the object. The results depend more on how techniques are employed than on the nominal specifications of the instruments. The aim of this article is to evaluate the real usability, for the 1:50 architectonic restitution scale, of common and not so common survey techniques applied to the complex scenario of dark, intricate and narrow spaces such as service areas, corridors and stairs of Milan's cathedral indoors. Tests have shown that the quality of the results is strongly affected by side issues, such as the impossibility of following the theoretically ideal methodology when surveying such spaces. The tested instruments are: the laser scanner Leica C10, the GeoSLAM ZEB1, the DOT DPI 8 and two photogrammetric setups, a full-frame camera with a fisheye lens and the NCTech iSTAR, a panoramic camera. Each instrument presents advantages and limits concerning both the sensors themselves and the acquisition phase.

  7. Testing Different Survey Techniques to Model Architectonic Narrow Spaces

    Science.gov (United States)

    Mandelli, A.; Fassi, F.; Perfetti, L.; Polari, C.

    2017-08-01

    In the architectural survey field, there has been a spread of a vast number of automated techniques. However, it is important to underline the gap that exists between the technical specification sheet of a particular instrument and its usability, accuracy and level of automation reachable in real-case scenarios, especially in the Cultural Heritage (CH) field. In fact, even if the technical specifications (range, accuracy and field of view) are known for each instrument, their functioning and features are influenced by the environment, shape and materials of the object. The results depend more on how techniques are employed than on the nominal specifications of the instruments. The aim of this article is to evaluate the real usability, for the 1:50 architectonic restitution scale, of common and not so common survey techniques applied to the complex scenario of dark, intricate and narrow spaces such as service areas, corridors and stairs of Milan's cathedral indoors. Tests have shown that the quality of the results is strongly affected by side issues, such as the impossibility of following the theoretically ideal methodology when surveying such spaces. The tested instruments are: the laser scanner Leica C10, the GeoSLAM ZEB1, the DOT DPI 8 and two photogrammetric setups, a full-frame camera with a fisheye lens and the NCTech iSTAR, a panoramic camera. Each instrument presents advantages and limits concerning both the sensors themselves and the acquisition phase.

  8. Digital Survey Techniques for the Documentation of Wooden Shipwrecks

    Science.gov (United States)

    Costa, E.; Balletti, C.; Beltrame, C.; Guerra, F.; Vernier, P.

    2016-06-01

    Nowadays, researchers widely employ the acquisition of point clouds as one of the principal types of documentation for cultural heritage. In this paper, different digital survey techniques are employed to document an ancient wooden shipwreck, a particular and difficult kind of archaeological find due to its material characteristics. The instability of wood and the high costs of restoration do not always offer the opportunity of recovering and showing the hull to researchers and the public, and three-dimensional surveys are fundamental to document the original condition of the wood. The precarious condition of this material in contact with air can modify the structure and the size of the boat, requiring a fast and accurate recording technique. The collaboration between Ca' Foscari University and the Laboratory of Photogrammetry of Iuav University of Venice has made it possible to demonstrate the utility of these technologies. We surveyed a sewn boat of Roman age through multi-image photogrammetry and laser scanning. The point clouds were compared and a residual analysis was performed to verify the characteristics and suitability of the two techniques; both allowed obtaining very precise documentation from a metrical point of view.

  9. Application of InSAR and GIS techniques to ground subsidence assessment in the Nobi Plain, Central Japan

    National Research Council Canada - National Science Library

    Zheng, Minxue; Fukuyama, Kaoru; Sanga-Ngoie, Kazadi

    Spatial variation and temporal changes in ground subsidence over the Nobi Plain, Central Japan, are assessed using GIS techniques and ground level measurements data taken over this area since the 1970s...

  10. A Survey of Structured and Object-Oriented Software Specification Methods and Techniques

    NARCIS (Netherlands)

    Wieringa, R.J.

    1998-01-01

    This article surveys techniques used in structured and object-oriented software specification methods. The techniques are classified as techniques for the specification of external interaction and internal decomposition. The external specification techniques are further subdivided into techniques fo

  11. Literature survey of heat transfer enhancement techniques in refrigeration applications

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, M.K.; Shome, B. [Rensselaer Polytechnic Inst., Troy, NY (United States). Dept. of Mechanical Engineering, Aeronautical Engineering and Mechanics

    1994-05-01

    A survey has been performed of the technical and patent literature on enhanced heat transfer of refrigerants in pool boiling, forced convection evaporation, and condensation. Extensive bibliographies of the technical literature and patents are given. Many passive and active techniques were examined for pure refrigerants, refrigerant-oil mixtures, and refrigerant mixtures. The citations were categorized according to enhancement technique, heat transfer mode, and tube or shell side focus. The effects of the enhancement techniques relative to smooth and/or pure refrigerants were illustrated through the discussion of selected papers. Patented enhancement techniques also are discussed. Enhanced heat transfer has demonstrated significant improvements in performance in many refrigerant applications. However, refrigerant mixtures and refrigerant-oil mixtures have not been studied extensively; no research has been performed with enhanced refrigerant mixtures with oil. Most studies have been of the parametric type; there has been inadequate examination of the fundamental processes governing enhanced refrigerant heat transfer, but some modeling is being done and correlations developed. It is clear that an enhancement technique must be optimized for the refrigerant and operating condition. Fundamental processes governing the heat transfer must be examined if models for enhancement techniques are to be developed; these models could provide the method to optimize a surface. Refrigerant mixtures, with and without oil present, must be studied with enhancement devices; there is too little known to be able to estimate the effects of mixtures (particularly NARMs) with enhanced heat transfer. Other conclusions and recommendations are offered.

  12. Comparing LiDAR-Generated to ground- surveyed channel cross-sectional profiles in a forested mountain stream

    Science.gov (United States)

    Brian C. Dietterick; Russell White; Ryan. Hilburn

    2012-01-01

    Airborne Light Detection and Ranging (LiDAR) holds promise to provide an alternative to traditional ground-based survey methods for stream channel characterization and some change detection purposes, even under challenging landscape conditions. This study compared channel characteristics measured at 53 ground-surveyed and LiDAR-derived crosssectional profiles located...

  13. A Survey of Wireless Sensor Network Security and Routing Techniques

    Directory of Open Access Journals (Sweden)

    Raja Waseem Anwar

    2015-04-01

    The main purpose of this study is to review the evolution of wireless sensor network security and routing techniques. Recent years have seen tremendous growth in Wireless Sensor Networks (WSNs). As WSNs become more and more crucial to everyday life, their security and trust become a primary concern. However, because of the nature of WSNs, security design can be challenging. Trust-aware routing protocols play a vital role in the security of WSNs. This review provides an overview of Wireless Sensor Networks and discusses security issues and routing techniques for high quality of service and efficient performance in a WSN. In order to identify gaps and propose research directions in WSN security and routing techniques, the study surveys the existing body of literature in this area. The main focus is on trust concepts and trust-based approaches for wireless sensor networks. The study also highlights the difference between trust and security in the context of WSNs: the two are often treated as interchangeable when elaborating a secure system, but they are not the same. Various surveys of trust and reputation systems in ad hoc and sensor networks are studied and compared. Finally, we summarize the different trust-aware routing schemes.
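
    As a toy illustration of the trust-aware routing idea discussed above, the sketch below scores candidate next hops by a weighted combination of trust and link cost; the scoring rule, weights and data are assumptions and do not correspond to any specific protocol in the survey.

    ```python
    # Trust-aware next-hop selection sketch (illustrative scoring rule, not a real protocol).
    def choose_next_hop(neighbours, w_trust=0.7, w_cost=0.3):
        """neighbours: list of dicts with 'id', 'trust' in [0, 1] and 'cost' in [0, 1]."""
        def score(n):
            return w_trust * n["trust"] + w_cost * (1.0 - n["cost"])
        return max(neighbours, key=score)["id"]

    candidates = [
        {"id": "A", "trust": 0.9, "cost": 0.6},   # trusted but expensive link
        {"id": "B", "trust": 0.4, "cost": 0.1},   # cheap link, low trust
    ]
    assert choose_next_hop(candidates) == "A"      # trust outweighs cost with these weights
    ```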

  14. A shear wave ground surface vibration technique for the detection of buried pipes

    Science.gov (United States)

    Muggleton, J. M.; Papandreou, B.

    2014-07-01

    A major UK initiative, entitled 'Mapping the Underworld' aims to develop and prove the efficacy of a multi-sensor device for accurate remote buried utility service detection, location and, where possible, identification. One of the technologies to be incorporated in the device is low-frequency vibro-acoustics; the application of this technology for detecting buried infrastructure, in particular pipes, is currently being investigated. Here, a shear wave ground vibration technique for detecting buried pipes is described. For this technique, shear waves are generated at the ground surface, and the resulting ground surface vibrations measured. Time-extended signals are employed to generate the illuminating wave. Generalized cross-correlation functions between the measured ground velocities and a reference measurement adjacent to the excitation are calculated and summed using a stacking method to generate a cross-sectional image of the ground. To mitigate the effects of other potential sources of vibration in the vicinity, the excitation signal can be used as an additional reference when calculating the cross-correlation functions. Measurements have been made at two live test sites to detect a range of buried pipes. Successful detection of the pipes was achieved, with the use of the additional reference signal proving beneficial in the noisier of the two environments.
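
    A minimal sketch of the correlate-and-stack processing described above, assuming a reference channel recorded next to the excitation; the signals and geometry are synthetic and purely illustrative.

    ```python
    # Cross-correlate each surface channel with a source-side reference and stack
    # (synthetic data, illustrative of the processing idea only).
    import numpy as np

    def normalised_cross_correlation(x, ref):
        """Full cross-correlation of one geophone channel with the reference channel."""
        cc = np.correlate(x, ref, mode="full")
        return cc / (np.linalg.norm(x) * np.linalg.norm(ref) + 1e-12)

    def stacked_correlation(channels, ref):
        """channels: (n_sensors, n_samples) array; returns the stacked correlation trace."""
        return np.mean([normalised_cross_correlation(ch, ref) for ch in channels], axis=0)

    rng = np.random.default_rng(1)
    ref = rng.standard_normal(2000)                    # stand-in source-side reference signal
    channels = np.array([np.roll(ref, d) + 0.5 * rng.standard_normal(2000) for d in (30, 32, 31)])
    stack = stacked_correlation(channels, ref)         # peak lag relates to travel time
    ```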

  15. UAS Mapping as an alternative for land surveying techniques?

    Directory of Open Access Journals (Sweden)

    L. Devriendt

    2014-03-01

    Can a UAS mapping technique compete with standard surveying techniques? Since the boom in different RPAS (remotely piloted air systems), UAVs (unmanned aerial vehicles), or UAS (unmanned aerial systems), this has been one of the crucial questions when it comes to UAS mapping. What matters is not look and feel but the reliability, ease of use, and accuracy that you get with a system based on hardware and corresponding software. This was also one of the questions that the Dutch Land Registry raised a few months ago, aimed at achieving an effective and usable system for updating property boundaries in new-build districts. Orbit GT gave them a ready-made answer: a definitive outcome based on years of research and development in UAS mapping technology and software.

  16. Techniques for Surveying Urban Active Faults by Seismic Methods

    Institute of Scientific and Technical Information of China (English)

    Xu Mingcai; Gao Jinghua; Liu Jianxun; Rong Lixin

    2005-01-01

    Using the seismic method to detect active faults directly below cities is an irreplaceable prospecting technique. The seismic method can precisely determine the fault position, but by itself it can hardly determine the geological age of a fault. However, when considered in connection with borehole data and the standard geological cross-section of the surveyed area, the geological age of a reflected wave group can be qualitatively (or semi-quantitatively) determined from the seismic depth profile. To determine the upper terminal point of active faults directly below a city, it is necessary to use the high-resolution seismic reflection technique. To effectively determine the geometric features of deep faults, especially the relation between deep and shallow fracture structures, the seismic reflection method is better than the seismic refraction method.

  17. A Survey on Statistical Based Single Channel Speech Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Sunnydayal. V

    2014-11-01

    Speech enhancement is a long-standing problem with various applications such as hearing aids, automatic recognition and coding of speech signals. Single-channel speech enhancement techniques are used to enhance speech degraded by additive background noise. Background noise can have an adverse impact on our ability to converse smoothly and without hindrance in very noisy environments, such as busy streets, in a car or in the cockpit of an airplane. Such noise can affect the quality and intelligibility of speech. This survey paper provides an overview of speech enhancement algorithms that enhance noisy speech corrupted by additive noise. The algorithms are mainly based on statistical approaches, and different estimators are compared. Challenges and opportunities of speech enhancement are also discussed. This paper helps in choosing the best statistics-based technique for speech enhancement.
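
    One classical single-channel method often covered alongside the statistical estimators surveyed above, magnitude spectral subtraction, can be sketched as follows; the frame length, overlap and noise-estimation rule are illustrative assumptions rather than any particular estimator from the survey.

    ```python
    # Magnitude spectral subtraction sketch (illustrative parameters and noise estimate).
    import numpy as np

    def spectral_subtraction(noisy, frame_len=512, noise_frames=10):
        """Enhance a 1-D signal assuming its first `noise_frames` frames are noise-only."""
        hop = frame_len // 2
        window = np.hanning(frame_len)
        starts = range(0, len(noisy) - frame_len, hop)
        frames = np.array([noisy[i:i + frame_len] * window for i in starts])
        spectra = np.fft.rfft(frames, axis=1)
        noise_mag = np.abs(spectra[:noise_frames]).mean(axis=0)   # noise magnitude estimate
        mag = np.maximum(np.abs(spectra) - noise_mag, 0.0)        # subtract, floor at zero
        cleaned = np.fft.irfft(mag * np.exp(1j * np.angle(spectra)), axis=1)
        out = np.zeros(len(noisy))
        for k, frame in enumerate(cleaned):                       # overlap-add resynthesis
            out[k * hop:k * hop + frame_len] += frame
        return out
    ```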

  18. Monitoring greenhouse gas emissions in Australian landscapes: Comparing ground based mobile surveying data to GOSAT observations

    Science.gov (United States)

    Bashir, S.; Iverach, C.; Kelly, B. F. J.

    2016-12-01

    Climate change is threatening the health and stability of the natural world and human society. Such concerns were emphasized at the COP21 conference in Paris in 2015, which highlighted the global need to improve our knowledge of sources of greenhouse gas and to develop methods to mitigate the effects of their emissions. Ongoing spatial and temporal measurements of greenhouse gases at both point and regional scales are important for clarification of climate change mechanisms and for accounting. The Greenhouse gases Observing SATellite (GOSAT) is designed to monitor the global distribution of carbon dioxide (CO2) and methane (CH4) from orbit. As existing ground monitoring stations are limited and still unevenly distributed, satellite observations provide important frequent, spatially extensive, but low-resolution measurements. Recent developments in portable laser-based greenhouse gas measurement systems have enabled the rapid measurement of greenhouse gases at ppb levels at the ground surface. This study was conducted to map major sources of CO2 and CH4 in the eastern states of Australia at the landscape scale and to compare the results to GOSAT observations. During April 2016 we conducted a regional CH4 and CO2 mobile survey, using an LGR greenhouse gas analyzer. Measurements were made along a 4000 km circuit through major cities, country towns, dry sclerophyll forests, coastal wetlands, coal mining regions, coal seam gas developments, dryland farming and irrigated agricultural landscapes. The ground-based survey data were then compared with the data (L2) from GOSAT. Ground-based mobile surveys showed that there are clear statistical differences in the ground-level atmospheric concentration of CH4 and CO2 associated with all major changes in land use. These changes extend for kilometers, and cover one or more GOSAT pixels. In the coal mining districts the ground-level atmospheric concentration of CH4 exceeded 2 ppm for over 40 km, yet this was not discernible in the retrieved data (L2

  19. A Survey of 2D Face Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Mejda Chihaoui

    2016-09-01

    Despite the existence of various biometric techniques, such as fingerprints, iris scans and hand geometry, the most efficient and most widely used one is face recognition. This is because it is inexpensive, non-intrusive and natural. Therefore, researchers have developed dozens of face recognition techniques over the last few years. These techniques can generally be divided into three categories, based on the face data processing methodology. There are methods that use the entire face as input data for the proposed recognition system, methods that do not consider the whole face but only some features or areas of the face, and methods that use global and local face characteristics simultaneously. In this paper, we present an overview of some well-known methods in each of these categories. First, we expose the benefits of, as well as the challenges to, the use of face recognition as a biometric tool. Then, we present a detailed survey of the well-known methods by expressing each method's principle. After that, a comparison between the three categories of face recognition techniques is provided. Furthermore, the databases used in face recognition are mentioned, and some results of the applications of these methods on face recognition databases are presented. Finally, we highlight some new promising research directions that have recently appeared.

  20. The Gaia Era: synergy between space missions and ground based surveys

    CERN Document Server

    Vallenari, A

    2008-01-01

    The Gaia mission is expected to provide highly accurate astrometric, photometric, and spectroscopic measurements for about $10^9$ objects. Automated classification of detected sources is a key part of the data processing. Here a few aspects of the Gaia classification process are presented. Information from other surveys at longer wavelengths, and from follow-up ground based observations will be complementary to Gaia data especially at faint magnitudes, and will offer a great opportunity to understand our Galaxy.

  1. Unmanned air/ground vehicles heterogeneous cooperative techniques:Current status and prospects

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Heterogeneous cooperation among multiple unmanned air/ground vehicles is a novel and challenging field. Heterogeneous cooperative techniques can widen the application fields of unmanned air or ground vehicles and enhance the effectiveness of detection, search and rescue tasks. This paper focuses on the key issues in multiple unmanned air/ground vehicle heterogeneous cooperation, including heterogeneous flocking, formation control, formation stability, network control, and actual applications. The main problems and future directions in this field are also analyzed in detail. These innovative technologies can significantly enhance the effectiveness of implementing complicated tasks and provide a series of novel breakthroughs for the intelligence, integration and advancement of future robot systems.

  2. Machine learning techniques for gait biometric recognition using the ground reaction force

    CERN Document Server

    Mason, James Eric; Woungang, Isaac

    2016-01-01

    This book focuses on how machine learning techniques can be used to analyze and make use of one particular category of behavioral biometrics known as the gait biometric. A comprehensive Ground Reaction Force (GRF)-based Gait Biometrics Recognition framework is proposed and validated by experiments. In addition, an in-depth analysis of existing recognition techniques that are best suited for performing footstep GRF-based person recognition is also proposed, as well as a comparison of feature extractor, normalizer, and classifier configurations that were never directly compared with one another in any previous GRF recognition research. Finally, a detailed theoretical overview of many existing machine learning techniques is presented, leading to a proposal of two novel data processing techniques developed specifically for the purpose of gait biometric recognition using GRF. This book introduces novel machine-learning-based temporal normalization techniques and bridges research gaps concerning the effect of ...
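
    A toy sketch of the kind of pipeline the book describes, with a temporal normalization step followed by a simple nearest-centroid classifier; the data, resampling length and classifier are illustrative assumptions, not the book's methods.

    ```python
    # GRF footstep recognition sketch: temporal normalization plus nearest-template matching
    # (illustrative data and classifier, not the book's framework).
    import numpy as np

    def normalise_length(grf, n_points=100):
        """Resample one variable-length GRF footstep onto n_points samples (0-100% of stance)."""
        x_old = np.linspace(0.0, 1.0, len(grf))
        x_new = np.linspace(0.0, 1.0, n_points)
        return np.interp(x_new, x_old, grf)

    def nearest_template_predict(templates, sample):
        """templates: dict person -> normalised GRF template; returns the closest person."""
        return min(templates, key=lambda p: np.linalg.norm(templates[p] - sample))

    steps = {"alice": np.sin(np.linspace(0, np.pi, 90)),
             "bob": 1.2 * np.sin(np.linspace(0, np.pi, 110))}
    templates = {person: normalise_length(grf) for person, grf in steps.items()}
    probe = normalise_length(1.18 * np.sin(np.linspace(0, np.pi, 105)))
    assert nearest_template_predict(templates, probe) == "bob"
    ```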

  3. Integration of Geomatic Techniques for the Urban Cavity Survey

    Science.gov (United States)

    Deidda, M.; Sanna, G.

    2013-07-01

    Cagliari, county seat of Sardinia Region (Italy), situated in the southern part of the island, is characterized by a subsoil full of cavities. The excavations in fact, which lasted more than 4000 years, had a great development due also to the special geological characteristics of the city subsoil. The underground voids, which the city is rich in, belong to different classes such as hydraulic structures (aqueducts, cisterns, wells, etc.), settlement works (tunnels, bomb shelters, tombs etc.) and various works (quarries, natural caves, etc.). This paper describes the phases of the survey of a large cavity below a high-traffic square near the Faculty of Engineering in the city of Cagliari, where the research team works. The cave, which is part of a larger complex, is important because it was used in the thirteenth century (known as the Pisan age) as a stone quarry. There are traces of this activity that have to be protected. Moreover, during the last forty years the continuous crossover of vehicles cracked the roof of the cave compromising the stability of the entire area. Consequently a plan was developed to make the whole cavity safe and usable for visits. The study of the safety of the cave has involved different professionals among which geologists, engineers, constructors. The goal of the University of Cagliari geomatic team was to solve two problems: to obtain geometrical information about the void and correctly place the cave in the context of existing maps. The survey and the products, useful for the investigation of the technicians involved, had to comply with tolerances of 3 cm in the horizontal and 5 cm in the vertical component. The approach chosen for this purpose was to integrate different geomatic techniques. The cave was surveyed using a laser scanner (Faro Photon 80) in order to obtain a 3D model of the cave from which all the geometrical information was derived, while both classic topography and GPS techniques were used to include the cave in the

  4. Use of GPR technique in surveying gravel road wearing course

    Science.gov (United States)

    Saarenketo, Timo; Vesa, Heikki

    2000-04-01

    During summer 1998 a series of tests were conducted in Finland in order to find out how Ground Penetrating Radar (GPR) technology can be utilized at both the project and network level, when surveying the wearing course thickness of gravel roads. The second objective was to investigate the possibilities of applying dielectricity information obtained using the GPR surface reflection method when determining the quality of the gravel road wearing course. In this study GPR was tested at the project level on highway 9241 Simo in Northern Finland, where the information provided by the GPR and laboratory research was used in designing and proportioning a new wearing course. In the network level studies, performed in the maintenance areas of Kemi and Karstula in Northern and Central Finland the goal for using GPR was to inspect the condition and thickness of the wearing course and evaluate the need for additional wearing course material. The total length of the roads under survey was approximately 200 km and both a 1.5 GHz ground-coupled antenna and a 1.0 GHz horn antenna were tested in this study. The research results show that GPR can be used to measure the thickness of the wearing course, the average measuring error against reference drilling measurements being 25 mm, which is considerably larger than the error of radar measurements in paved roads. To a great extent this is due to the fact that the thickness of the wearing course varies greatly even in the road's cross-section and thus a single reference thickness does not represent the actual thickness of the area measured with the GPR. The wearing course can often get mixed up with lower layers, which makes it difficult to determine the exact layer interfaces. For this reason reference information must always be used along with the GPR measurement results. Of the two GPR antennae tested, the horn antenna proved to be the more effective in measurements. The dielectric value of the wearing course, measured using the horn
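
    As an aside on the depth conversion that underlies GPR thickness surveys such as this one, the snippet below applies the standard relation d = c·t / (2·√εr) between two-way travel time and layer thickness. The permittivity and travel-time values are illustrative, and this is not necessarily the exact processing used in the study.

```python
# Thickness of a layer from GPR two-way travel time and relative permittivity.
# A standard textbook relation, not necessarily the processing used in the study.
C = 299_792_458.0  # speed of light in vacuum, m/s

def layer_thickness(two_way_time_ns: float, eps_r: float) -> float:
    """d = c * t / (2 * sqrt(eps_r)); t is the two-way travel time."""
    t = two_way_time_ns * 1e-9
    return C * t / (2.0 * eps_r ** 0.5)

# Example: a 2.5 ns two-way reflection in gravel with eps_r ~ 6 gives ~0.15 m.
print(f"{layer_thickness(2.5, 6.0):.3f} m")
```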

  5. Comparative analysis of clutter suppression techniques for landmine detection using ground-penetrating radar

    Science.gov (United States)

    Yoldemir, Ahmet Burak; Gürcan, Rıdvan; Kaplan, Gülay Büyükaksoy; Sezgin, Mehmet

    2011-06-01

    In this study, we provide an extensive comparison of different clutter suppression techniques that are proposed to enhance ground penetrating radar (GPR) data. Unlike previous studies, we directly measure and present the effect of these preprocessing algorithms on the detection performance. Basic linear prediction algorithm is selected as the detection scheme and it is applied to real GPR data after applying each of the available clutter suppression techniques. All methods are tested on an extensive data set of different surrogate mines and other objects that are commonly encountered under the ground. Among several algorithms, singular value decomposition based clutter suppression stands out with its superior performance and low computational cost, which makes it practical to use in real-time applications.
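
    The sketch below illustrates the singular value decomposition idea highlighted in the abstract: the largest singular components of a B-scan (samples × traces) capture the horizontally coherent background and ground bounce, so zeroing them suppresses clutter. The synthetic data and parameter choices are illustrative only.

```python
import numpy as np

def svd_clutter_removal(bscan: np.ndarray, n_remove: int = 1) -> np.ndarray:
    """Suppress clutter by zeroing the largest singular components of a B-scan.

    bscan: (n_samples, n_traces) array; the dominant components typically
    capture the ground bounce and other horizontally coherent clutter.
    """
    u, s, vt = np.linalg.svd(bscan, full_matrices=False)
    s_clean = s.copy()
    s_clean[:n_remove] = 0.0
    return (u * s_clean) @ vt

# Synthetic example: flat clutter plus a small target-like response.
rng = np.random.default_rng(1)
clutter = np.outer(np.exp(-np.arange(128) / 20.0), np.ones(64))
target = np.zeros((128, 64))
target[60:65, 30:34] = 1.0
data = clutter + 0.2 * target + 0.01 * rng.standard_normal((128, 64))
cleaned = svd_clutter_removal(data, n_remove=1)
print(np.abs(cleaned[60:65, 30:34]).mean(), np.abs(cleaned[:20]).mean())
```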

  6. Spaceflight Systems Training: A Comparison and Contrasting of Techniques for Training Ground Operators and Onboard Crewmembers

    Science.gov (United States)

    Balmain, Clinton; Fleming, Mark

    2009-01-01

    When developing techniques and products for instruction on manned spaceflight systems, training organizations are often faced with two very different customers: ground operators and onboard crewmembers. Frequently, instructional development focuses on one of these customers with the assumption that the other's needs will be met by default. Experience teaches us that differing approaches are required when developing training tailored to the specific needs of each customer. As a rule, ground operators require focused instruction on specific areas of expertise. Their knowledge should be of the details of the hardware, software, and operational techniques associated with that system. They often benefit from historical knowledge of how their system has operated over its lifetime. Since several different ground operators may be interfacing with the same system, each individual operator must understand the agreed-to principles by which that system will be run. In contrast, onboard crewmembers require a more broad, hands-on awareness of their operational environment. Their training should be developed with an understanding of the physical environment in which they live and work and the day-to-day tasks they are most likely to perform. Rarely do they require a deep understanding of the details of a system; it is often sufficient to teach them just enough to maintain situational awareness and perform basic tasks associated with maintenance and operation of onboard systems. Crewmembers may also develop unique onboard operational techniques that differ from preceding crews. They should be taught what flexibility they have in systems operations and how their specific habits can be communicated to ground support personnel. This paper will explore the techniques that can be employed when developing training for these unique customers. We will explore the history of International Space Station training development and how past efforts can guide us in creating training for users of

  7. VLBI collimation tower technique for time-delay studies of a large ground station communications antenna

    Science.gov (United States)

    Otoshi, T. Y.; Young, L. E.; Rusch, W. V. T.

    1983-01-01

    A need for an accurate but inexpensive method for measuring and evaluating time delays of large ground antennas for VLBI applications motivated the development of the collimation tower technique. Supporting analytical work which was performed primarily to verify time delay measurement results obtained for a large antenna when the transmitter was at a collimation distance of 1/25 of the usual far field criterion is discussed. Comparisons of theoretical and experimental results are also given.

  8. The influence of cricket fast bowlers' front leg technique on peak ground reaction forces.

    Science.gov (United States)

    Worthington, Peter; King, Mark; Ranson, Craig

    2013-01-01

    High ground reaction forces during the front foot contact phase of the bowling action are believed to be a major contributor to the high prevalence of lumbar stress fractures in fast bowlers. This study aimed to investigate the influence of front leg technique on peak ground reaction forces during the delivery stride. Three-dimensional kinematic data and ground reaction forces during the front foot contact phase were captured for 20 elite male fast bowlers. Eight kinematic parameters were determined for each performance, describing run-up speed and front leg technique, in addition to peak force and time to peak force in the vertical and horizontal directions. There were substantial variations between bowlers in both peak forces (vertical 6.7 ± 1.4 body weights; horizontal (braking) 4.5 ± 0.8 body weights) and times to peak force (vertical 0.03 ± 0.01 s; horizontal 0.03 ± 0.01 s). These differences were found to be linked to the orientation of the front leg at the instant of front foot contact. In particular, a larger plant angle and a heel strike technique were associated with lower peak forces and longer times to peak force during the front foot contact phase, which may help reduce the likelihood of lower back injuries.

  9. Status of aerial survey emergency preparedness and ground support equipment, calibration, and sensitivities

    Energy Technology Data Exchange (ETDEWEB)

    Dahlstrom, T.S.

    1986-01-01

    During the course of EG and G Energy Measurements, Inc.'s history in aerial surveillance, the scope of response has broadened from routine surveys and accident response with aerial systems, to being prepared to respond to any radiological incident with aerial, ground mobile, and hand-held instrumentation. The aerial survey system presently consists of four MBB BO-105 helicopters outfitted with gamma pods and specialized navigation systems (MRS or URS) that allow the operator and pilot to fly well-defined survey lines. Minimum detectable activities (MDA) for various isotopes range from a few tenths of a mCi to 100 mCi for point sources and from 1 to 200 pCi/g for volume sources.

  10. Emerging Technologies and Techniques for Wide Area Radiological Survey and Remediation

    Energy Technology Data Exchange (ETDEWEB)

    Sutton, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zhao, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-03-24

    Technologies to survey and decontaminate wide-area contamination and process the subsequent radioactive waste have been developed and implemented following the Chernobyl nuclear power plant release and the breach of a radiological source resulting in contamination in Goiania, Brazil. These civilian examples of radioactive material releases provided some of the first examples of urban radiological remediation. Many emerging technologies have recently been developed and demonstrated in Japan following the release of radioactive cesium isotopes (Cs-134 and Cs-137) from the Fukushima Dai-ichi nuclear power plant in 2011. Information on technologies reported by several Japanese government agencies, such as the Japan Atomic Energy Agency (JAEA), the Ministry of the Environment (MOE) and the National Institute for Environmental Science (NIES), together with academic institutions and industry are summarized and compared to recently developed, deployed and available technologies in the United States. The technologies and techniques presented in this report may be deployed in response to a wide area contamination event in the United States. In some cases, additional research and testing is needed to adequately validate the technology effectiveness over wide areas. Survey techniques can be deployed on the ground or from the air, allowing a range of coverage rates and sensitivities. Survey technologies also include those useful in measuring decontamination progress and mapping contamination. Decontamination technologies and techniques range from non-destructive (e.g., high pressure washing) and minimally destructive (plowing), to fully destructive (surface removal or demolition). Waste minimization techniques can greatly impact the long-term environmental consequences and cost following remediation efforts. Recommendations on technical improvements to address technology gaps are presented together with observations on remediation in Japan.

  11. Experimental verification of a computational technique for determining ground reactions in human bipedal stance.

    Science.gov (United States)

    Audu, Musa L; Kirsch, Robert F; Triolo, Ronald J

    2007-01-01

    We have developed a three-dimensional (3D) biomechanical model of human standing that enables us to study the mechanisms of posture and balance simultaneously in various directions in space. Since the two feet are on the ground, the system defines a kinematically closed chain, which has redundancy problems that cannot be resolved using the laws of mechanics alone. We have developed a computational (optimization) technique that avoids the problems with the closed-chain formulation, thus giving users of such models the ability to make predictions of joint moments, and potentially, muscle activations using more sophisticated musculoskeletal models. This paper describes the experimental verification of the computational technique that is used to estimate the ground reaction vector acting on an unconstrained foot while the other foot is attached to the ground, thus allowing human bipedal standing to be analyzed as an open-chain system. The computational approach was verified in terms of its ability to predict lower extremity joint moments derived from inverse dynamic simulations performed on data acquired from four able-bodied volunteers standing in various postures on force platforms. Sensitivity analyses performed with model simulations indicated which ground reaction force (GRF) and center of pressure (COP) components were most critical for providing better estimates of the joint moments. Overall, the joint moments predicted by the optimization approach are strongly correlated with the joint moments computed using the experimentally measured GRF and COP (0.78, with near-unity slope; experimental = computational results) for the postures of the four subjects examined. These results indicate that this model-based technique can be relied upon to predict reasonable and consistent estimates of the joint moments using the predicted GRF and COP for most standing postures.

  12. Ground deformation detection of the greater area of Thessaloniki (Northern Greece) using radar interferometry techniques

    Directory of Open Access Journals (Sweden)

    D. Raucoules

    2008-07-01

    Full Text Available In the present study SAR interferometric techniques (stacking of conventional interferograms and Permanent Scatterers), using images from satellites ERS-1 and 2, have been applied to the region of Thessaloniki (northern Greece). The period covered by the images is 1992–2000. Both techniques gave good quantitative and qualitative results. The interferometric products were used to study ground surface deformation phenomena that could be related to the local tectonic context, the exploitation of underground water and sediment compaction.

    The city of Thessaloniki shows relatively stable ground conditions. Subsidence in four locations, mainly in the area surrounding the city of Thessaloniki, has been detected and assessed. Two of the sites (Sindos-Kalochori and Langadhas) were already known from previous studies as subsiding areas, using ground-based measurements. By contrast, the other two sites, in the northern suburbs of Thessaloniki (Oreokastro) and in the south-east (airport area), were not previously known as areas of subsidence. A further investigation based on fieldwork is needed in these two areas. Finally, an attempt to interpret the observed deformation, according to the geological regime of the area and its anthropogenic activities, has been carried out.

  13. A reliable ground bounce noise reduction technique for nanoscale CMOS circuits

    Science.gov (United States)

    Sharma, Vijay Kumar; Pattanaik, Manisha

    2015-11-01

    Power gating is the most effective method to reduce the standby leakage power by adding header/footer high-VTH sleep transistors between actual and virtual power/ground rails. When a power gating circuit transitions from sleep mode to active mode, a large instantaneous charge current flows through the sleep transistors. Ground bounce noise (GBN) is the high voltage fluctuation on real ground rail during sleep mode to active mode transitions of power gating circuits. GBN disturbs the logic states of internal nodes of circuits. A novel and reliable power gating structure is proposed in this article to reduce the problem of GBN. The proposed structure contains low-VTH transistors in place of high-VTH footer. The proposed power gating structure not only reduces the GBN but also improves other performance metrics. A large mitigation of leakage power in both modes eliminates the need of high-VTH transistors. A comprehensive and comparative evaluation of proposed technique is presented in this article for a chain of 5-CMOS inverters. The simulation results are compared to other well-known GBN reduction circuit techniques at 22 nm predictive technology model (PTM) bulk CMOS model using HSPICE tool. Robustness against process, voltage and temperature (PVT) variations is estimated through Monte-Carlo simulations.

  14. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground and a sounded wind profile in order to derive the cloud base height. This method is independent of cloud types, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of clouds' features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (which is a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, thus upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).

  15. Reading Fluency Techniques from the Bottom-up: A Grounded Theory

    Directory of Open Access Journals (Sweden)

    Seyyed Ali Ostovar-Namaghi

    2015-09-01

    Full Text Available In many EFL contexts, including the language education milieu in Iran, reading fluency is usually taken for granted since language education in public high schools mainly focuses on reading comprehension. Taking the detrimental effect of fluency deficiency into account, some practitioners foreground reading fluency and try to develop it early on. To give voice to their theories of practice, this qualitative study interviewed teachers who were willing to share their experience with the researchers. In line with grounded theory, the iterative process of data collection and analysis continued until the conceptualization of fluency development techniques was saturated. The techniques that emerged are conducive to fluency development, and as such the findings have clear implications for practitioners and policy makers nationwide. Keywords: reading theory, reading fluency, reading techniques

  16. Techniques to extend the reach of ground based gravitational wave detectors

    Science.gov (United States)

    Dwyer, Sheila

    2016-03-01

    While the current generation of advanced ground based detectors will open the gravitational wave universe to observation, ground based interferometry has the potential to extend the reach of these observatories to high redshifts. Several techniques have the potential to improve the advanced detectors beyond design sensitivity, including the use of squeezed light, upgraded suspensions, and possibly new optical coatings, new test mass materials, and cryogenic suspensions. To improve the sensitivity by more than a factor of 10 compared to advanced detectors new, longer facilities will be needed. Future observatories capable of hosting interferometers 10s of kilometers long have the potential to extend the reach of gravitational wave astronomy to cosmological distances, enabling detection of binary inspirals from throughout the history of star formation.

  17. A Survey on Terrain Assessment Techniques for Autonomous Operation of Planetary Robots

    Science.gov (United States)

    Sancho-Pradel, D. L.; Gao, Y.

    A key challenge in autonomous planetary surface exploration is the extraction of meaningful information from sensor data, which would allow a good interpretation of the nearby terrain, and a reasonable assessment of more distant areas. In the last decade, the desire to increase the autonomy of unmanned ground vehicles (UGVs), particularly in terms of off-road navigation, has significantly increased the interest in the field of automated terrain classification. Although the field is relatively new, its advances and goals are scattered across different robotic platforms and applications. The objective of this paper is to present a survey of the field from a planetary exploration perspective, bringing together the underlying techniques, existing approaches and relevant applications under a common framework. The aim is to provide a comprehensive overview to the newcomer in the field, and a structured reference for the practitioners.

  18. Monitoring ground subsidence due to underground mining using integrated space geodetic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Linlin Ge; Michael Hsing-Chung Chang; Chris Rizos [University of NSW, NSW (Australia)

    2004-04-01

    Differential radar interferometry (DInSAR) can deliver approximately 1 cm height-change resolution. The combination of regular radar beam scanning and movement of the satellites carrying the radar sensor enables imaging of the mining region in seconds, from which subtle ground movements can be detected. Quantitative validation comparing the DInSAR-derived subsidence profile against ground truth shows a best RMS error of 1.4 cm. A methodology has been developed to use GPS (the Global Positioning System) observations to measure atmospheric disturbances so that the DInSAR results can be corrected. A Geographic Information System (GIS) has been used to post-process InSAR results throughout this project. GIS can be used to present the final results in various formats, for example, profiles for validating with ground truth, subsidence contour maps, and three-dimensional views. Professional looking thematic maps can be generated based on these analyses, lining up with the practice within the mining industry to deliver drawings/maps in a GIS format. Multi-temporal DInSAR results can be analysed using GIS, and the final results compiled into an animation, showing the subsidence region moving as time passes. A virtual reality image has been generated in the GIS, combining DEM, aerial photography, and DInSAR subsidence results. The UNSW InSAR-GPS-GIS Integration Software has been developed to support the seamless flow of data among the three technologies, DInSAR, GPS, and GIS. Several radar satellite missions, some especially designed for InSAR, are scheduled for launch in the near future. Therefore radar data of global coverage with weekly or even daily revisit will be made available at multiple radar bands. With atmospheric disturbances properly accounted for, DInSAR will be a cost-effective, reliable, and operational tool that complements traditional ground survey methods.
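
    For reference, the conversion at the heart of DInSAR height-change measurement maps unwrapped differential phase to line-of-sight displacement as d = −λΔφ/(4π). The sketch below assumes an illustrative C-band wavelength; the sign convention varies between processors.

```python
import numpy as np

def los_displacement(unwrapped_phase: np.ndarray, wavelength_m: float = 0.056) -> np.ndarray:
    """Convert unwrapped differential interferometric phase (radians) to
    line-of-sight displacement in metres: d = -lambda * phi / (4 * pi).
    0.056 m is an illustrative C-band wavelength; the sign convention varies."""
    return -wavelength_m * unwrapped_phase / (4.0 * np.pi)

# One full fringe (2*pi) corresponds to half a wavelength of LOS motion (~2.8 cm).
print(los_displacement(np.array([2.0 * np.pi])))
```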

  19. A technique for estimating ground-water levels at sites in Rhode Island from observation-well data

    Science.gov (United States)

    Socolow, Roy S.; Frimpter, Michael H.; Turtora, Michael; Bell, Richard W.

    1994-01-01

    Estimates of future high, median, and low ground-water levels are needed for engineering and architectural design decisions and for appropriate selection of land uses. For example, the failure of individual underground sewage-disposal systems due to high ground-water levels can be prevented if accurate water-level estimates are available. Estimates of extreme or average conditions are needed because short duration preconstruction observations are unlikely to be adequately representative. Water-level records for 40 U.S. Geological Survey observation wells in Rhode Island were used to describe and interpret water-level fluctuations. The maximum annual range of water levels averages about 6 feet in sand and gravel and 11 feet in till. These data were used to develop equations for estimating future high, median, and low water levels on the basis of any one measurement at a site and records of water levels at observation wells used as indexes. The estimating technique relies on several assumptions about temporal and spatial variations: (1) Water levels will vary in the future as they have in the past, (2) Water levels fluctuate seasonally, (3) Ground-water fluctuations are dependent on site geology, and (4) Water levels throughout Rhode Island are subject to similar precipitation and climate. Comparison of 6,697 estimates of high, median, and low water levels (depth to water level exceeded 95, 50, and 5 percent of the time, respectively) with the actual measured levels exceeded 95, 50, and 5 percent of the time at 14 sites unaffected by pumping or unknown causes, yielded mean squared errors ranging from 0.34 to 1.53 square feet, 0.30 to 1.22 square feet, and 0.32 to 2.55 square feet, respectively. (USGS)
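
    A heavily simplified sketch of the idea behind such estimating equations is given below: the site is assumed to fluctuate in step with an index observation well, scaled by a range ratio that depends on site geology. The function, its arguments and the scaling assumption are illustrative only and are not the published USGS equations.

```python
def estimate_site_level(site_depth_on_date, index_depth_on_date,
                        index_high, index_low, site_range_ratio=1.0):
    """Rough high/low water-level estimate for a site from one measurement.

    Assumes (illustratively) that the site's water level rises and falls in
    step with an index observation well, scaled by site_range_ratio
    (e.g. till sites fluctuate more than sand-and-gravel sites).
    Depths are below land surface, so a smaller number means higher water.
    This is a sketch of the idea, not the published USGS estimating equations.
    """
    est_high = site_depth_on_date - site_range_ratio * (index_depth_on_date - index_high)
    est_low = site_depth_on_date + site_range_ratio * (index_low - index_depth_on_date)
    return est_high, est_low

# Example: index well currently 4 ft below surface, historic high 2 ft, low 9 ft.
print(estimate_site_level(site_depth_on_date=6.0, index_depth_on_date=4.0,
                          index_high=2.0, index_low=9.0, site_range_ratio=1.2))
```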

  20. Surveying and benchmarking techniques to analyse DNA gel fingerprint images.

    Science.gov (United States)

    Heras, Jónathan; Domínguez, César; Mata, Eloy; Pascual, Vico

    2016-11-01

    DNA fingerprinting is a genetic typing technique that allows the analysis of the genomic relatedness between samples, and the comparison of DNA patterns. The analysis of DNA gel fingerprint images usually consists of five consecutive steps: image pre-processing, lane segmentation, band detection, normalization and fingerprint comparison. In this article, we firstly survey the main methods that have been applied in the literature in each of these stages. Secondly, we focus on lane-segmentation and band-detection algorithms (as they are the steps that usually require user intervention) and detect the seven core algorithms used for both tasks. Subsequently, we present a benchmark that includes a data set of images, the gold standards associated with those images and the tools to measure the performance of lane-segmentation and band-detection algorithms. Finally, we implement the core algorithms used both for lane segmentation and band detection, and evaluate their performance using our benchmark. From that study, we conclude that the average profile algorithm is the best starting point for lane segmentation and band detection.
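
    The average profile algorithm singled out above can be sketched very simply: average the gel image along the lane direction to obtain a one-dimensional intensity profile, then take peaks of that profile as lane centres. The following is a toy version of that idea, not the benchmark implementation.

```python
import numpy as np
from scipy.signal import find_peaks

def lane_centres(gel: np.ndarray, min_separation: int = 10) -> np.ndarray:
    """Locate lane centres in a gel image via the average-profile idea.

    gel: 2D array with lanes running vertically (bright bands on a dark
    background). Averaging over rows gives one intensity value per column;
    peaks of that profile are taken as lane centres.
    """
    profile = gel.mean(axis=0)
    peaks, _ = find_peaks(profile, distance=min_separation,
                          prominence=0.1 * np.ptp(profile))
    return peaks

# Synthetic gel: three bright vertical lanes on a dark background.
img = np.zeros((200, 120))
for centre in (20, 60, 100):
    img[:, centre - 3:centre + 4] = 1.0
print(lane_centres(img))
```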

  1. Accuracy assessment of GPS and surveying technique in forest road mapping

    Directory of Open Access Journals (Sweden)

    Ehsan Abdi

    2012-12-01

    Full Text Available Forest road networks provide access to the forest as a source of timber production and tourism services. Moreover, they are considered the main tool to protect forests from fire and smuggling. The prerequisite of road management and maintenance planning is to have the spatial distribution and map of the roads, but newly constructed or some other forest road segments are not available in national maps. Therefore, mapping these networks is a priority for a forest manager. The aim of this study was to assess the accuracy of routine methods in road mapping. For this purpose, the Patom district forest road was selected and the road network map was extracted from the National Cartographic Center maps as the ground truth or base map. The map of the network was acquired using two methods, a GPS receiver and a survey technique. Selecting 70 sample points on the network and considering the National Cartographic Center map as the base map, accuracy was determined for the two methods. The results showed that while the survey method was more accurate at the beginning of the path (first 500 meters), accumulation of errors resulted in higher rates of error in this method (up to 263 meters) compared to GPS. A Mann-Whitney test revealed significant differences in the accuracy of the two methods, with mean accuracies of 38.86 and 147.90 for GPS and surveying, respectively. The results showed that for samples 1-15 there was no significant difference between the survey and GPS data, but for samples 28-42 and 56-70 statistically significant differences existed between the survey and GPS data. Regression analysis showed that the relation between GPS and surveying accuracies and distance was best defined by cubic (R2 adj = 0.65) and linear (R2 adj = 0.83) regression models, respectively. Applying 10 and 5 meter buffers around the base map, 68 and 41% of the GPS-derived road and 44 and 21% of the surveying-derived road overlapped the buffer zones. The time required to complete the survey was found to increase the
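
    The linear and cubic error-versus-distance models mentioned in the abstract can be fitted with ordinary polynomial regression; the sketch below uses illustrative synthetic data in place of the study's measurements.

```python
import numpy as np

# Illustrative data: distance along the road (m) and positional error (m);
# the cubic/linear model comparison follows the abstract, the numbers do not.
rng = np.random.default_rng(2)
distance = np.linspace(0.0, 3000.0, 70)
error_survey = 0.05 * distance + rng.normal(0.0, 5.0, distance.size)

linear = np.polynomial.Polynomial.fit(distance, error_survey, deg=1)
cubic = np.polynomial.Polynomial.fit(distance, error_survey, deg=3)

def adj_r2(y, y_hat, n_params):
    """Adjusted R-squared for a fitted model with n_params coefficients."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    n = y.size
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_params - 1)

print("linear adj R2:", adj_r2(error_survey, linear(distance), 1))
print("cubic  adj R2:", adj_r2(error_survey, cubic(distance), 3))
```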

  2. Evaluation of intra-annual variation in U.S. Geological Survey National Water Quality Assessment ground water quality data.

    Science.gov (United States)

    Rosen, Michael R; Voss, Frank D; Arufe, Jorge A

    2008-01-01

    Assessment of ground-water quality trends under the U.S. Geological Survey National Water-Quality Assessment Program (NAWQA) included the analysis of samples collected on a quarterly basis for 1 yr between 2001 and 2005. The purpose of this quarterly sampling was to test the hypothesis that variations in the concentration of water-quality parameters of selected individual wells could demonstrate that the intra-annual variation was greater or less than the decadal changes observed for a trend network. Evaluation of more than 100 wells over this period indicates that 1 yr of quarterly sampling is not adequate to address the issue of intra-annual variation because variations seem to be random and highly variable between different wells in the same networks and among networks located in different geographical areas of the USA. In addition, the data from only 1 yr makes it impossible to assess whether variations are due to univariate changes caused by land use changes, hydrologic variations due to variable recharge, or variations caused by ground-water pumping. These data indicate that funds allocated to this activity can be directed to the collection of more effective trend data, including age dating of all wells in the NAWQA network using multiple techniques. Continued evaluation of data and updating of monitoring plans of the NAWQA program is important for maintaining relevance to national goals and scientific objectives.

  3. Cleaning of polluted water using biological techniques. [Ground water]. Rensning af forurenet vand ved biologisk teknik

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, M. (Hedeselskabet (Denmark))

    1992-01-01

    Ground-water at many Danish locations has been polluted by organic substances. This pollution has taken place in relation to leaks or spills of, for example, petrol from leaky tanks or oil separators. The article describes a new biological technique for the purification of ground-water polluted by petrol and diesel oils leaked at a petrol station. The technique involves decomposition by bacteria. During decomposition the biomass in the filter increases and carbon dioxide and water is produced, so there is no waste product from this process. The two units consist of an oil-separator which separates the diesel oil and petrol from the water, and a bio-filter which is constructed as an aired-through inverted filter to which nutrient salts are continually added. The filter-material used is in the form of plastic rings on which the oil-decomposing bacteria grow and reproduce themselves. The system is further described. It is claimed that the bio-filter can decompose 7 kg of petrol and diesel oil in one week, larger ones decompose more. The service life of the system is expected to be 4-6 years. Current installation costs are 20,000-100,000 Danish kroner, according to size. (AB).

  4. Preliminary survey on site-adaptation techniques for satellite-derived and reanalysis solar radiation datasets

    Energy Technology Data Exchange (ETDEWEB)

    Polo, J.; Wilbert, S.; Ruiz-Arias, J. A.; Meyer, R.; Gueymard, C.; Súri, M.; Martín, L.; Mieslinger, T.; Blanc, P.; Grant, I.; Boland, J.; Ineichen, P.; Remund, J.; Escobar, R.; Troccoli, A.; Sengupta, M.; Nielsen, K. P.; Renne, D.; Geuder, N.; Cebecauer, T.

    2016-07-01

    At any site, the bankability of a projected solar power plant largely depends on the accuracy and general quality of the solar radiation data generated during the solar resource assessment phase. The term 'site adaptation' has recently started to be used in the framework of solar energy projects to refer to the improvement that can be achieved in satellite-derived solar irradiance and model data when short-term local ground measurements are used to correct systematic errors and bias in the original dataset. This contribution presents a preliminary survey of different possible techniques that can improve long-term satellite-derived and model-derived solar radiation data through the use of short-term on-site ground measurements. The possible approaches that are reported here may be applied in different ways, depending on the origin and characteristics of the uncertainties in the modeled data. This work, which is the first step of a forthcoming in-depth assessment of methodologies for site adaptation, has been done within the framework of the International Energy Agency Solar Heating and Cooling Programme Task 46 'Solar Resource Assessment and Forecasting.'
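
    One of the simplest site-adaptation approaches consistent with the abstract is a linear correction fitted on a short overlapping period of ground measurements and then applied to the long-term satellite series. The sketch below is illustrative only; the survey itself covers a range of more elaborate methods.

```python
import numpy as np

def fit_site_adaptation(ground: np.ndarray, satellite: np.ndarray):
    """Fit a linear correction ground ~ a * satellite + b on the overlap period."""
    a, b = np.polyfit(satellite, ground, deg=1)
    return a, b

def apply_site_adaptation(satellite_longterm: np.ndarray, a: float, b: float):
    return a * satellite_longterm + b

# Illustrative daily GHI values (kWh/m2): satellite slightly biased high.
rng = np.random.default_rng(3)
true_ghi = rng.uniform(2.0, 8.0, 365)
sat = 1.08 * true_ghi + 0.2 + rng.normal(0.0, 0.1, true_ghi.size)
ground_short = true_ghi[:60]                      # two months of measurements
a, b = fit_site_adaptation(ground_short, sat[:60])
corrected = apply_site_adaptation(sat, a, b)
print("bias before:", np.mean(sat - true_ghi), "after:", np.mean(corrected - true_ghi))
```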

  5. Image segmentation techniques for improved processing of landmine responses in ground-penetrating radar data

    Science.gov (United States)

    Torrione, Peter A.; Collins, Leslie

    2007-04-01

    As ground penetrating radar sensor phenomenology improves, more advanced statistical processing approaches become applicable to the problem of landmine detection in GPR data. Most previous studies on landmine detection in GPR data have focused on the application of statistics and physics based prescreening algorithms, new feature extraction approaches, and improved feature classification techniques. In the typical framework, prescreening algorithms provide spatial location information of anomalous responses in down-track / cross-track coordinates, and feature extraction algorithms are then tasked with generating low-dimensional information-bearing feature sets from these spatial locations. However in time-domain GPR, a significant portion of the data collected at prescreener flagged locations may be unrelated to the true anomaly responses - e.g. ground bounce response, responses either temporally "before" or "after" the anomalous response, etc. The ability to segment the information-bearing region of the GPR image from the background of the image may thus provide improved performance for feature-based processing of anomaly responses. In this work we will explore the application of Markov random fields (MRFs) to the problem of anomaly/background segmentation in GPR data. Preliminary results suggest the potential for improved feature extraction and overall performance gains via application of image segmentation approaches prior to feature extraction.
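
    A minimal stand-in for the MRF segmentation idea is iterated conditional modes (ICM) with a Gaussian data term and a Potts smoothness prior, as sketched below. Parameter values and the synthetic image are illustrative; this is not the authors' implementation.

```python
import numpy as np

def icm_segment(img, mu=(0.2, 0.8), sigma=0.15, beta=1.5, n_iter=10):
    """Two-class MRF segmentation via iterated conditional modes.

    Local energy for label k: (img - mu[k])**2 / (2*sigma**2)
    plus beta times the number of 4-neighbours holding the other label.
    """
    labels = (np.abs(img - mu[1]) < np.abs(img - mu[0])).astype(int)
    for _ in range(n_iter):
        # Count, for each pixel, how many 4-neighbours currently carry label 1.
        padded = np.pad(labels, 1, mode="edge")
        n_ones = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                  padded[1:-1, :-2] + padded[1:-1, 2:])
        energies = []
        for k in (0, 1):
            data = (img - mu[k]) ** 2 / (2.0 * sigma ** 2)
            disagree = n_ones if k == 0 else (4 - n_ones)
            energies.append(data + beta * disagree)
        labels = np.argmin(np.stack(energies), axis=0)
    return labels

# Toy GPR-like image: bright anomaly on a darker, noisy background.
rng = np.random.default_rng(4)
img = 0.2 + 0.1 * rng.standard_normal((80, 80))
img[30:50, 35:55] += 0.6
seg = icm_segment(img)
print(seg.sum(), "pixels labelled as anomaly")
```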

  6. A Generic Current Mode Design for Multifunction Grounded Capacitor Filters Employing Log-Domain Technique

    Directory of Open Access Journals (Sweden)

    N. A. Shah

    2011-01-01

    Full Text Available A generic design (GD) for realizing an nth order log-domain multifunction filter (MFF), which can yield four possible stable filter configurations, each offering simultaneously lowpass (LP), highpass (HP), and bandpass (BP) frequency responses, is presented. The filters are very simple, consisting of merely a few exponential transconductor cells and capacitors; all elements are grounded and capable of absorbing the shunt parasitic capacitances, the responses are electronically tuneable, and the design is suitable for monolithic integration. Furthermore, being designed using the log-domain technique, it offers all its advantages. As an example, 5th-order MFFs are designed in each case and their performances are evaluated through simulation. Lastly, a comparative study of the MFFs is also carried out, which helps in selecting a better high-order MFF for a given application.

  7. Ecological survey of M-Field, Edgewood Area Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Downs, J.L.; Eberhardt, L.E.; Fitzner, R.E.; Rogers, L.E.

    1991-12-01

    An ecological survey was conducted on M-Field, at the Edgewood Area, Aberdeen Proving Ground, Maryland. M-Field is used routinely to test army smokes and obscurants, including brass flakes, carbon fibers, and fog oils. The field has been used for testing purposes for the past 40 years, but little documented history is available. Under current environmental regulations, the test field must be assessed periodically to document the presence or potential use of the area by threatened and endangered species. The M-Field area is approximately 370 acres and is part of the US Army's Edgewood Area at Aberdeen Proving Ground in Harford County, Maryland. The grass-covered field is primarily lowlands with elevations from about 1.0 to 8 m above sea level, and several buildings and structures are present on the field. The ecological assessment of M-Field was conducted in three stages, beginning with a preliminary site visit in May to assess sampling requirements. Two field site visits were made June 3--7, and August 12--15, 1991, to identify the biota existing on the site. Data were gathered on vegetation, small mammals, invertebrates, birds, large mammals, amphibians, and reptiles.

  9. THE EFFECTS OF CONTROLLED SKIDDING TECHNIQUE ON RESIDUAL STAND DAMAGE AND GROUND EXPOSURE IN SWAMP FOREST LOGGING

    Directory of Open Access Journals (Sweden)

    Sona Suhartana

    2004-11-01

    • The average ground exposure caused by the controlled skidding technique and the conventional skidding technique was 16.06% and 18.4%, respectively. The difference of 2.34% was significant at the 95% level.

  10. Application of a Modified Universal Design Survey for Evaluation of Ares 1 Ground Crew Worksites

    Science.gov (United States)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for NASA's Ares 1 launch vehicle. Launch site ground operations include several operator tasks to prepare the vehicle for launch or to perform maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To support design evaluation, the Ares 1 Upper Stage (US) element Human Factors Engineering (HFE) group developed a survey based on the Universal Design approach. Universal Design is a process to create products that can be used effectively by as many people as possible. Universal Design per se is not a priority for Ares 1 because launch vehicle processing is a specialized skill and not akin to a consumer product that should be used by all people of all abilities. However, applying principles of Universal Design will increase the probability of an error free and efficient design which is a priority for Ares 1. The Design Quality Evaluation Survey centers on the following seven principles: (1) Equitable use, (2) Flexibility in use, (3) Simple and intuitive use, (4) Perceptible information, (5) Tolerance for error, (6) Low physical effort, (7) Size and space for approach and use. Each principle is associated with multiple evaluation criteria which were rated with the degree to which the statement is true. All statements are phrased in the utmost positive, or the design goal so that the degree to which judgments tend toward "completely agree" directly reflects the degree to which the design is good. The Design Quality Evaluation Survey was employed for several US analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability

  11. Delineation of a landfill leachate plume using shallow electromagnetic and ground-penetrating radar surveys

    Energy Technology Data Exchange (ETDEWEB)

    Nobes, D.C.; Armstrong, M.J. [Univ. of Canterbury, Christchurch (New Zealand); Broadbent, M. [Broadbent (Michael), Christchurch (New Zealand)

    1994-12-31

    Leachate plumes are often more electrically conductive than the surrounding host pore waters, and thus can be detected using shallow electromagnetic (EM) methods. The depth of penetration of ground penetrating radar (GPR) is controlled to a large extent by the electrical conductivity. Conductive leachate plumes will appear as "blank" areas in the radar profiles, because the radar energy is more severely attenuated in the region of the leachate plume. The authors present here the results of EM and GPR surveys carried out in an area adjacent to a landfill site. Previous resistivity surveys indicated the presence of a leachate plume originating from an early stage of the landfill operation. The shallow EM and GPR surveys were carried out, in part, to confirm and refine the resistivity results, and to delineate the spatial extent of the plume. The surficial sediments are coastal sands, and the dune topography has an effect on the EM results, even though the variations in elevation are, in general, no more than 3 m. Besides the leachate plume, numerous conductivity highs and lows are present, which are at least coarsely correlated with topographic lows and highs. Following the empirical procedure outlined by Monier-Williams et al. (1990), the topographic effects have been removed, and the plume is better isolated and delineated. A possible second, weaker leachate plume has been identified, emanating from the current landfill operation. The second plume may follow a channel that was masked by the overlying dune sands. The leading edge of the primary leachate plume is moving to the south-southeast at a rate of 14 to 15 m/yr.

  12. Crosstalk suppression in networked resistive sensor arrays using virtual ground technique

    Science.gov (United States)

    Sahai Saxena, Raghvendra; Semwal, Sushil Kumar; Singh Rana, Pratap; Bhan, R. K.

    2013-11-01

    In 2D resistive sensor arrays, the interconnections are reduced considerably by sharing rows and columns among various sensor elements in such a way that one end of each sensor is connected to a row node and other end connected to a column node. This scheme results in total N + M interconnections for N × M array of sensors. Thus, it simplifies the interconnect complexity but suffers from the crosstalk problem among its elements. We experimentally demonstrate that this problem can be overcome by putting all the row nodes at virtually equal potential using virtual ground of high gain operational amplifiers in negative feedback. Although it requires large number of opamps, it solves the crosstalk problem to a large extent. Additionally, we get the response of all the sensors lying in a column simultaneously, resulting in a faster scanning capability. By performing lock-in-amplifier based measurements on a light dependent resistor at a randomly selected location in a 4 × 4 array of otherwise fixed valued resistors, we have shown that the technique can provide 86 dB crosstalk suppression even with a simple opamp. Finally, we demonstrate the circuit implementation of this technique for a 16 × 16 imaging array of light dependent resistors.
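
    The zero-potential readout idea can be sketched numerically: if the selected column is driven while all other columns and every row node are held at 0 V by ideal opamp virtual grounds, the current into each row node depends only on that row's element in the driven column, so neighbouring elements contribute no crosstalk. The ideal-opamp assumption and the values below are illustrative.

```python
import numpy as np

def read_column(resistances: np.ndarray, col: int, v_drive: float = 1.0) -> np.ndarray:
    """Ideal virtual-ground readout of one column of a shared row/column array.

    Assumptions (illustrative): the selected column is driven at v_drive, all
    other columns are held at 0 V, and every row node sits at an ideal opamp
    virtual ground (0 V). Each element then has either v_drive or 0 V across
    it, so the current into row i's virtual ground is v_drive / R[i, col];
    neighbouring elements carry no current, i.e. no crosstalk.
    """
    return v_drive / resistances[:, col]

def scan_array(resistances: np.ndarray, v_drive: float = 1.0) -> np.ndarray:
    """Estimate every element by scanning the columns one at a time."""
    currents = np.column_stack([read_column(resistances, c, v_drive)
                                for c in range(resistances.shape[1])])
    return v_drive / currents    # back out the resistances

# 4 x 4 array of 10 kohm resistors with one light-dependent resistor at (2, 1).
r = np.full((4, 4), 10e3)
r[2, 1] = 2.5e3
print(np.round(scan_array(r)))
```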

  13. A Survey on Various Image Inpainting Techniques to Restore Image

    Directory of Open Access Journals (Sweden)

    Rajul Suthar,

    2014-02-01

    Full Text Available Image inpainting, or image restoration, is a technique used to recover a damaged image and to fill the regions that are missing in the original image in a visually plausible way. Inpainting, the technique of modifying an image in an undetectable form, is an art that has been practised since early times. Applications of this technique include rebuilding of damaged photographs and films, removal of superimposed text, removal/replacement of unwanted objects, red-eye correction, and image coding. The main goal of inpainting is to restore the damaged region in an image. In this paper we provide a review of different techniques used for image inpainting. We discuss different inpainting techniques such as exemplar-based image inpainting, PDE-based image inpainting, texture synthesis based image inpainting, structural inpainting and textural inpainting.
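
    Of the categories listed above, the PDE/diffusion-based family is the easiest to sketch: masked pixels are repeatedly replaced by the average of their neighbours (a harmonic fill). The code below is a much-simplified illustration of that idea, not a full PDE inpainting method.

```python
import numpy as np

def diffusion_inpaint(img: np.ndarray, mask: np.ndarray, n_iter: int = 500) -> np.ndarray:
    """Fill pixels where mask is True by repeated neighbour averaging.

    A simplified diffusion (harmonic) fill; real PDE inpainting methods use
    more elaborate transport/curvature terms.
    """
    out = img.astype(float).copy()
    out[mask] = out[~mask].mean()          # crude initialisation
    for _ in range(n_iter):
        padded = np.pad(out, 1, mode="edge")
        neighbours = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                      padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        out[mask] = neighbours[mask]
    return out

# Toy example: a smooth ramp image with a square hole punched out.
x = np.linspace(0.0, 1.0, 64)
img = np.outer(np.ones(64), x)
mask = np.zeros_like(img, dtype=bool)
mask[20:40, 25:45] = True
restored = diffusion_inpaint(np.where(mask, 0.0, img), mask)
print(np.abs(restored[mask] - img[mask]).max())
```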

  14. Disease Prediction in Data Mining Technique – A Survey

    Directory of Open Access Journals (Sweden)

    S. Sudha

    2013-01-01

    Full Text Available Data mining is defined as sifting through very large amounts of data for useful information. Some of the most important and popular data mining techniques are association rules, classification, clustering, prediction and sequential patterns. Data mining techniques are used for a variety of applications. In the health care industry, data mining plays an important role in predicting diseases. Detecting a disease normally requires a number of tests from the patient, but using data mining techniques the number of tests can be reduced. This reduction plays an important role in saving time and improving performance. Each technique has its advantages and disadvantages. This research paper analyzes how data mining techniques are used for predicting different types of diseases. It reviews research papers that mainly concentrated on predicting heart disease, diabetes and breast cancer.

  15. Geology, Geochemistry and Ground Magnetic Survey on Kalateh Naser Iron Ore Deposit, Khorasan Jonoubi Province

    Directory of Open Access Journals (Sweden)

    Saeed Saadat

    2017-02-01

    Full Text Available Introduction Ground magnetometer surveys are one of the oldest geophysical exploration methods used in identifying iron reserves. The correct interpretation of ground magnetic surveys, along with geological and geochemical data, will not only reduce costs but also indicate the location, depth and dimensions of hidden iron reserves (Robinson and Coruh, 2005; Calagari, 1992). The Kalateh Naser prospecting area is located at 33° 19' to 33° 19' 42" latitude and 60° 0' to 60° 9' 35" longitude on the western side of the central Ahangaran mountain range, eastern Iran (Fig. 1). Based on primary field evidence, limited outcrops of magnetite mineralization were observed, and upon conducting a ground magnetic survey, evidence for large iron ore deposits was detected (Saadat, 2014). This paper presents the geological and geochemical studies and the results of magnetic measurements in the area of interest and their applicability in the exploration of other potential iron deposits in the neighboring areas. Materials and methods To better understand the geological units of the area, samples were taken and thin sections were studied. Geochemical studies were conducted through XRF, ICP-MS and wet chemistry analysis. The ground magnetic survey was designed to take measurements on a grid of lines 20 m apart with points 10 m apart along the north-south trend. 2000 points were measured during a 6-day field campaign by expert geophysicists. Records were made with a Canadian-manufactured GSM-19T proton magnetometer (Fig. 2); properties of the proton magnetometer used in the magnetic survey of the Kalateh Naser prospecting area are shown in Table 1. A total magnetic intensity map, reduced-to-pole magnetic map, analytic signal map, first vertical derivative map and upward continuation map have been prepared for this area. Results The most significant rock units in the area are Cretaceous carbonate rocks (Fig. 3). The unit turns to shale and thin bedded limestone in the
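
    The upward continuation map mentioned above is typically computed in the wavenumber domain by multiplying the 2D spectrum of the gridded field by exp(−2π|k|h). The sketch below uses the survey's nominal 10 m × 20 m spacing as an illustrative grid and random data in place of real measurements.

```python
import numpy as np

def upward_continue(field: np.ndarray, dx: float, dy: float, height: float) -> np.ndarray:
    """Upward-continue a gridded total-field anomaly by `height` metres.

    Standard wavenumber-domain operator: multiply the 2D FFT by
    exp(-2*pi*|k|*height), with |k| in cycles per metre.
    """
    ny, nx = field.shape
    kx = np.fft.fftfreq(nx, d=dx)
    ky = np.fft.fftfreq(ny, d=dy)
    k = np.sqrt(kx[np.newaxis, :] ** 2 + ky[:, np.newaxis] ** 2)
    spectrum = np.fft.fft2(field)
    continued = np.fft.ifft2(spectrum * np.exp(-2.0 * np.pi * k * height))
    return continued.real

# Illustrative grid: 10 m point spacing along lines spaced 20 m apart.
rng = np.random.default_rng(5)
grid = rng.standard_normal((128, 128))
smooth = upward_continue(grid, dx=10.0, dy=20.0, height=50.0)
print(grid.std(), smooth.std())   # the continued field is strongly smoothed
```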

  16. A Survey of Librarian Perceptions of Information Literacy Techniques

    Science.gov (United States)

    Yearwood, Simone L.; Foasberg, Nancy M.; Rosenberg, Kenneth D.

    2015-01-01

    Teaching research competencies and information literacy is an integral part of the academic librarian's role. There has long been debate among librarians over what are the most effective methods of instruction for college students. Library Faculty members at a large urban university system were surveyed to determine their perceptions of the…

  17. Supply chain simulation tools and techniques: a survey

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    The main contribution of this paper is twofold: it surveys different types of simulation for supply chain management; it discusses several methodological issues. These different types of simulation are spreadsheet simulation, system dynamics, discrete-event simulation and business games. Which simul

  18. Guidelines for a Training Course in Noise Survey Techniques.

    Science.gov (United States)

    Shadley, John; And Others

    The course is designed to train noise survey technicians during a 3-5 day period to make reliable measurements of 75 percent of the noise problems encountered in the community. The more complex noise problems remaining will continue to be handled by experienced specialists. These technicians will be trained to assist State and local governments in…

  19. Survey of resampling techniques using MSS and synthetic imagery

    Science.gov (United States)

    Bauer, Brian P.

    1980-01-01

    The objective of this survey is to investigate the methods of interpolation and deconvolution for image restoration. The methods evaluated are nearest neighbor, bilinear interpolation, cubic convolution, and two-dimensional deconvolution. The effects of these restoration methods are demonstrated using Landsat multispectral scanner (MSS) data and synthetic imagery.
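
    As an illustration of one of the surveyed methods, the sketch below implements plain bilinear interpolation for sampling an image at fractional pixel coordinates; the example array is illustrative.

```python
import numpy as np

def bilinear_sample(img: np.ndarray, x: float, y: float) -> float:
    """Bilinear interpolation of a 2D image at fractional column x, row y."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, img.shape[1] - 1), min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = (1.0 - fx) * img[y0, x0] + fx * img[y0, x1]
    bottom = (1.0 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1.0 - fy) * top + fy * bottom

img = np.arange(16.0).reshape(4, 4)
print(bilinear_sample(img, 1.5, 2.25))   # between rows 2-3 and columns 1-2
```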

  20. STUDY OF INFLUENCE OF EFFLUENT ON GROUND WATER USING REMOTE SENSING, GIS AND MODELING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    S. Pathak

    2012-07-01

    boundaries using specialized software. Establishment of other boundary conditions was based on well data. Calibration and validation of the model were done using ground water modelling software. Change detection analysis indicated areas of impact on land use/cover, particularly agricultural activity. The normalised difference vegetation index was found to have a negative correlation with pollution level. Population dynamics have been studied and were found to be poorly correlated with land degradation. Water levels do not show significant variations over the past twenty years barring normal seasonal fluctuation. Chemical analysis of ground water samples was studied in time series. The water quality, studied through various parameters, shows concentration in the mid-reach of the Bandi river. Analysis of litholog data shows three unconfined aquifers. A pump test and resistivity survey were carried out for initial aquifer properties and local water levels. Modelling contaminant migration helped in predicting the extent of the adversity. Surface flow is checked, allowing more water, but it is proving to be an accumulation point in the absence of good rainfall and flow in the river. Hotspots of dumping/active contamination were identified, with certain remediation efforts and supply of solid waste to the cement industry in addition to a bio-filter for heavy metals.

  1. Study of Influence of Effluent on Ground Water Using Remote Sensing, GIS and Modeling Techniques

    Science.gov (United States)

    Pathak, S.; Bhadra, B. K.; Sharma, J. R.

    2012-07-01

    software. Establishment of other boundary conditions was based on well data. Calibration and validation of the model were done using ground water modelling software. Change detection analysis indicated areas of impact on land use/cover, particularly agricultural activity. The normalised difference vegetation index was found to have a negative correlation with pollution level. Population dynamics have been studied and were found to be poorly correlated with land degradation. Water levels do not show significant variations over the past twenty years barring normal seasonal fluctuation. Chemical analysis of ground water samples was studied in time series. The water quality, studied through various parameters, shows concentration in the mid-reach of the Bandi river. Analysis of litholog data shows three unconfined aquifers. A pump test and resistivity survey were carried out for initial aquifer properties and local water levels. Modelling contaminant migration helped in predicting the extent of the adversity. Surface flow is checked, allowing more water, but it is proving to be an accumulation point in the absence of good rainfall and flow in the river. Hotspots of dumping/active contamination were identified, with certain remediation efforts and supply of solid waste to the cement industry in addition to a bio-filter for heavy metals.
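
    For reference, the normalised difference vegetation index used in the analysis above is computed from red and near-infrared reflectance as NDVI = (NIR − Red)/(NIR + Red); the sketch below uses illustrative reflectance values.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Normalised difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red + eps)

# Illustrative reflectance values: healthy vegetation vs. degraded land.
print(ndvi(np.array([0.45, 0.25]), np.array([0.08, 0.18])))
```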

  2. Survey of Green Radio Communications Networks: Techniques and Recent Advances

    Directory of Open Access Journals (Sweden)

    Mohammed H. Alsharif

    2013-01-01

    Full Text Available Energy efficiency in cellular networks has received significant attention from both academia and industry because of the importance of reducing the operational expenditures and maintaining the profitability of cellular networks, in addition to making these networks “greener.” Because the base station is the primary energy consumer in the network, efforts have been made to study base station energy consumption and to find ways to improve energy efficiency. In this paper, we present a brief review of the techniques that have been used recently to improve energy efficiency, such as energy-efficient power amplifier techniques, time-domain techniques, cell switching, management of the physical layer through multiple-input multiple-output (MIMO) management, heterogeneous network architectures based on Micro-Pico-Femtocells, cell zooming, and relay techniques. In addition, this paper discusses the advantages and disadvantages of each technique to contribute to a better understanding of each of the techniques and thereby offer clear insights to researchers about how to choose the best ways to reduce energy consumption in future green radio networks.

  3. A survey of individual preference for colorectal cancer screening technique

    Directory of Open Access Journals (Sweden)

    Schwartz Alan

    2004-11-01

    Full Text Available Abstract Background Due to the low participation in colorectal cancer screening, public preference for colorectal cancer screening modalities was determined. Methods A cross-sectional survey was performed of healthy ambulatory adults in a pediatrics primary care office and a neighboring church. Overall preference was ranked for each of four colorectal cancer screening modalities: Faecal Occult Blood, Fiberoptic Sigmoidoscopy, Barium Enema and Colonoscopy. Four additional domains of preference also were ranked: suspected discomfort, embarrassment, inconvenience and danger of each exam. Results 80 surveys were analyzed, 57 of which were received from participants who had experienced none of the screening tests. Fecal Occult Blood Testing was significantly preferred over each of the other screening modalities in overall preference and in every domain of preference, among all subjects and among those who had experienced none of the tests. Conclusions Efforts to increase public participation in colorectal cancer screening may be more effective if undertaken in the context of public perceptions of screening choices.

  4. A Survey on Hough Transform, Theory, Techniques and Applications

    OpenAIRE

    Hassanein, Allam Shehata; Mohammad, Sherien; Sameer, Mohamed; Ragab, Mohammad Ehab

    2015-01-01

    For more than half a century, the Hough transform is ever-expanding for new frontiers. Thousands of research papers and numerous applications have evolved over the decades. Carrying out an all-inclusive survey is hardly possible and enormously space-demanding. What we care about here is emphasizing some of the most crucial milestones of the transform. We describe its variations elaborating on the basic ones such as the line and circle Hough transforms. The high demand for storage and computat...
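    The basic line Hough transform named above can be sketched in a few lines: every edge pixel votes for all parameter pairs (rho, theta) satisfying rho = x*cos(theta) + y*sin(theta), and peaks in the accumulator correspond to lines. This is a generic illustration, not code from the survey.

    ```python
    import numpy as np

    def hough_lines(edge_img, n_theta=180):
        """Accumulate votes in (rho, theta) space for a binary edge image."""
        h, w = edge_img.shape
        thetas = np.deg2rad(np.arange(0, 180, 180 / n_theta))
        diag = int(np.ceil(np.hypot(h, w)))
        rhos = np.arange(-diag, diag + 1)                    # one accumulator row per rho
        acc = np.zeros((len(rhos), len(thetas)), dtype=np.int64)

        ys, xs = np.nonzero(edge_img)                        # edge pixel coordinates
        for x, y in zip(xs, ys):
            r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
            acc[r + diag, np.arange(len(thetas))] += 1       # vote for every theta
        return acc, thetas, rhos

    # Tiny example: a diagonal line of edge pixels.
    img = np.eye(50, dtype=np.uint8)
    acc, thetas, rhos = hough_lines(img)
    i, j = np.unravel_index(acc.argmax(), acc.shape)
    print(f"strongest line: rho={rhos[i]}, theta={np.degrees(thetas[j]):.1f} deg")
    ```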

  5. Visualization Techniques for Electrical Grid Smart Metering Data: A Survey

    DEFF Research Database (Denmark)

    Stefan, Maria; Lopez, Jose Manuel Guterrez Lopez; Andreasen, Morten Henius

    2017-01-01

    (GIS) tools are useful to help visualize the collected big data in near-real time. For this reason, a survey of existing GIS software is made so that the choice of the most suitable tool can be justified. Also, the integration of GIS technologies into the Common Information Model (CIM) aims ... to improve visualization efficiency. As a consequence, investigating methods for adapting CIM standards to the GIS platform is also important ...

  6. A survey of GPU-based medical image computing techniques.

    Science.gov (United States)

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly improving performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for beginners and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely, segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine.

  7. Do singing-ground surveys reflect american woodcock abundance in the western Great Lakes region?

    Science.gov (United States)

    Nelson, Matthew R.; Andersen, David E.

    2013-01-01

    The Singing-ground Survey (SGS) is the primary monitoring tool used to assess population status and trends of American woodcock (Scolopax minor). Like most broad-scale surveys, the SGS cannot be directly validated because there are no independent estimates of abundance of displaying male American woodcock at an appropriate spatial scale. Furthermore, because locations of individual SGS routes have generally remained stationary since the SGS was standardized in 1968, it is not known whether routes adequately represent the landscapes they were intended to represent. To indirectly validate the SGS, we evaluated whether 1) counts of displaying male American woodcock on SGS routes related to land-cover types known to be related to American woodcock abundance, 2) changes in counts of displaying male American woodcock through time were related to changes in land cover along SGS routes, and 3) land-cover type composition along SGS routes was similar to land-cover type composition of the surrounding landscape. In Wisconsin and Minnesota, USA, counts along SGS routes reflected known American woodcock-habitat relations. Increases in the number of woodcock heard along SGS routes over a 13-year period in Wisconsin were related to increasing amounts of early successional forest, decreasing amounts of mature forest, and increasing dispersion and interspersion of cover types. Finally, the cover types most strongly associated with American woodcock abundance were represented along SGS routes in proportion to their composition of the broader landscape. Taken together, these results suggest that in the western Great Lakes region, the SGS likely provides a reliable tool for monitoring relative abundance and population trends of breeding, male American woodcock.

  8. Validation of stratospheric temperature profiles from a ground-based microwave radiometer with other techniques

    Science.gov (United States)

    Navas, Francisco; Kämpfer, Niklaus; Haefele, Alexander; Keckhut, Philippe; Hauchecorne, Alain

    2016-04-01

    Vertical profiles of atmospheric temperature trends have become recognized as an important indicator of climate change, because different climate forcing mechanisms exhibit distinct vertical warming and cooling patterns. For example, cooling of the stratosphere is an indicator of climate change, as it provides evidence of natural and anthropogenic climate forcing just as surface warming does. Despite its importance, our understanding of the observed stratospheric temperature trend, and our ability to test simulations of the stratospheric response to emissions of greenhouse gases and ozone-depleting substances, remains limited. One of the main reasons is that long-term stratospheric datasets are sparse and the trends obtained from them differ from one another. Different techniques, such as radiosondes, lidar and satellite instruments, allow stratospheric temperature profiles to be measured. The main advantage of microwave radiometers over these other instruments is high temporal resolution with reasonably good spatial resolution. Moreover, measurement at a fixed location allows local atmospheric dynamics to be observed over a long time period, which is crucial for climate research. This study presents an evaluation of stratospheric temperature profiles from a new ground-based microwave temperature radiometer (TEMPERA), which has been built and designed at the University of Bern. The measurements from TEMPERA are compared with those from other techniques: in-situ (radiosondes), active remote sensing (lidar) and passive remote sensing on board the Aura satellite (MLS). In addition, a statistical analysis of the stratospheric temperature obtained from four years of TEMPERA measurements has been performed. This analysis demonstrated the capability of the TEMPERA radiometer to monitor stratospheric temperature over the long term. The detection of some sudden stratospheric warming (SSW) events during the analyzed period shows the necessity of these

  9. A Survey Paper on Fuzzy Image Segmentation Techniques

    Directory of Open Access Journals (Sweden)

    Ms. R. Saranya Pon Selvi

    2014-03-01

    Full Text Available Image segmentation plays an important role in day-to-day life. New technologies are emerging in the field of image processing, especially in the domain of segmentation. Segmentation is considered one of the main steps in image processing. It divides a digital image into multiple regions in order to analyze them. It is also used to distinguish different objects in the image. Several image segmentation techniques have been developed by researchers in order to make images smooth and easy to evaluate. This paper presents a brief outline of some of the most commonly used segmentation techniques, such as thresholding, region-based, model-based and edge-detection methods, mentioning their advantages as well as their drawbacks. Some of the techniques are suitable for noisy images.
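    One widely used fuzzy segmentation method is fuzzy c-means clustering of pixel intensities; a minimal sketch follows. This is a generic illustration, not an algorithm from the paper, and the synthetic intensities are assumptions.

    ```python
    import numpy as np

    def fuzzy_cmeans(x, n_clusters=3, m=2.0, n_iter=100, seed=0):
        """Cluster 1-D intensities with fuzzy c-means; returns centres and memberships."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x, dtype=float).ravel()
        u = rng.random((n_clusters, x.size))
        u /= u.sum(axis=0)                                  # memberships sum to 1 per pixel
        for _ in range(n_iter):
            um = u ** m
            centres = (um @ x) / um.sum(axis=1)             # fuzzily weighted means
            d = np.abs(x[None, :] - centres[:, None]) + 1e-12
            u = 1.0 / (d ** (2 / (m - 1)))                  # inverse-distance memberships
            u /= u.sum(axis=0)
        return centres, u

    # Hypothetical grey levels: dark background, mid-grey cytoplasm, bright nuclei.
    rng = np.random.default_rng(1)
    img = np.concatenate([rng.normal(40, 5, 500), rng.normal(120, 8, 300), rng.normal(200, 6, 200)])
    centres, u = fuzzy_cmeans(img, n_clusters=3)
    labels = u.argmax(axis=0)                               # hard labels from soft memberships
    print("cluster centres:", np.sort(np.round(centres, 1)))
    ```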

  10. Automated bare earth extraction technique for complex topography in light detection and ranging surveys

    Science.gov (United States)

    Stevenson, Terry H.; Magruder, Lori A.; Neuenschwander, Amy L.; Bradford, Brian

    2013-01-01

    Bare earth extraction is an important component of light detection and ranging (LiDAR) data analysis in terms of terrain classification. The challenge in providing accurate digital surface models is augmented when there is diverse topography within the data set or complex combinations of vegetation and built structures. Few existing algorithms can handle substantial terrain diversity without significant editing or user interaction. This effort presents a newly developed methodology that provides a flexible, adaptable tool capable of integrating multiple LiDAR data attributes for an accurate terrain assessment. The terrain extraction and segmentation (TEXAS) approach uses a third-order spatial derivative for each point in the digital surface model to determine the curvature of the terrain rather than rely solely on the slope. The utilization of the curvature has been shown to successfully preserve ground points in areas of steep terrain, as such areas typically exhibit low curvature. Within the framework of TEXAS, the contiguous sets of points with low curvatures are grouped into regions using an edge-based segmentation method. The process does not require any user inputs and is completely data driven. This technique was tested on a variety of existing LiDAR surveys, each with varying levels of topographic complexity.
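    The core idea, keeping points whose local curvature is low, can be sketched as below. Note that this is a simplified illustration using second derivatives of a gridded DSM, not the TEXAS algorithm itself, which works point-wise with a third-order spatial derivative; the threshold and synthetic surface are assumptions.

    ```python
    import numpy as np

    def low_curvature_mask(dsm, cell=1.0, threshold=0.5):
        """Flag DSM cells whose local surface curvature is low (candidate ground cells).

        Curvature is approximated by the magnitude of the second spatial derivatives;
        treat this only as an illustration of the curvature-thresholding idea.
        """
        dz_dy, dz_dx = np.gradient(dsm, cell)               # first derivatives
        d2z_dy2 = np.gradient(dz_dy, cell, axis=0)          # second derivatives
        d2z_dx2 = np.gradient(dz_dx, cell, axis=1)
        curvature = np.abs(d2z_dx2) + np.abs(d2z_dy2)
        return curvature < threshold                        # True where the surface is smooth

    # Hypothetical 1-m DSM: a gentle slope with a sharp "building" block on top.
    x, y = np.meshgrid(np.arange(100), np.arange(100))
    dsm = 0.05 * x.astype(float)                            # smooth sloping ground
    dsm[40:60, 40:60] += 8.0                                # abrupt built structure
    mask = low_curvature_mask(dsm, cell=1.0, threshold=0.5)
    print(f"fraction of cells flagged as ground candidates: {mask.mean():.2f}")
    ```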

  11. On Advanced Estimation Techniques for Exoplanet Detection and Characterization using Ground-Based Coronagraphs

    Science.gov (United States)

    Lawson, Peter R.; Frazin, Richard; Barrett, Harrison; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gladysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jerome; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Perrin, Marshall; Poyneer, Lisa; Pueyo, Laurent; Savransky, Dmitry; Soummer, Remi

    2012-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We provide a formal comparison of techniques through a blind data challenge and evaluate performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  12. Outlier Detection Techniques For Wireless Sensor Networks: A Survey

    NARCIS (Netherlands)

    Zhang, Y.; Meratnia, Nirvana; Havinga, Paul J.M.

    2008-01-01

    In the field of wireless sensor networks, measurements that significantly deviate from the normal pattern of sensed data are considered as outliers. The potential sources of outliers include noise and errors, events, and malicious attacks on the network. Traditional outlier detection techniques are

  13. USES OF MARKETING TECHNIQUES THE U. S. GEOLOGICAL SURVEY.

    Science.gov (United States)

    McDermott, Michael P.

    1983-01-01

    The use of marketing techniques by government agencies to provide more efficient and effective dissemination of their information is a fairly recent development. A recessive economy and increased scrutiny of operations have become a powerful incentive to maximize revenues and minimize expenses wherever possible, as long as the primary mission of public service is satisfactorily met.

  14. Effective Installations Technique of Grounding Conductors for Metal Oxide Surge Arresters

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B.H.; Kang, S.M. [Inha University, Inchon (Korea); Ryu, I.S. [Korea Electric Power Corporation, Seoul (Korea)

    2002-06-01

    This paper deals with the effects of grounding conductors for metal oxide surge arresters. When surge arresters are improperly installed, the result can be costly damage to electrical equipment. In particular, the routing of the surge arrester connection is very important, because bends and links in the leads increase the impedance to lightning surges and tend to nullify the effectiveness of a grounding conductor. Therefore, there is a need to know how lightning surge arresters should be installed in order to control voltage and to absorb energy at high lightning currents. The effectiveness of grounding conductors and 18 kV metal oxide distribution line arresters was experimentally investigated under lightning and oscillatory impulse voltages. The results are as follows: (1) The induced voltage of a grounding conductor is not drastically affected by the length of the connecting line, but it is very sensitive to the type of grounding conductor. (2) Coaxial cable with a low characteristic impedance is suitable as a grounding conductor. (3) It is also clear from these results that bonding the metal raceway enclosing the grounding conductor to the grounding electrode is very effective because of the skin effect. (4) The induced voltages of grounding conductors under oscillatory impulse voltages are approximately twice as large as those under lightning impulse voltages. (author). 9 refs., 12 figs., 2 tabs.

  15. SURVEY ON PREDICTION OF HEART MORBIDITY USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    K.Srinivas

    2011-05-01

    Full Text Available Data mining is the non-trivial extraction of implicit, previously unknown and potentially useful information from data. Data mining technology provides a user-oriented approach to novel and hidden patterns in data. This paper presents the various existing techniques and the issues and challenges associated with them. The discovered knowledge can be used by healthcare administrators to improve the quality of service, and by medical practitioners to reduce the number of adverse drug effects and to suggest less expensive, therapeutically equivalent alternatives. In this paper we discuss the popular data mining techniques, namely decision trees, naïve Bayes and neural networks, that are used for prediction of disease.
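    A minimal sketch of how the named classifiers are typically applied (scikit-learn, with synthetic stand-in features; this is not the paper's data or its trained models):

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.naive_bayes import GaussianNB

    # Hypothetical feature matrix (e.g. age, cholesterol, resting BP, max heart rate) and labels.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(300, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    for name, clf in [("decision tree", DecisionTreeClassifier(max_depth=4, random_state=0)),
                      ("naive Bayes", GaussianNB())]:
        clf.fit(X_train, y_train)                       # train on the held-in split
        print(f"{name}: test accuracy = {clf.score(X_test, y_test):.2f}")
    ```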

  16. A Survey on Image Segmentation Techniques Used In Leukemia Detection

    Directory of Open Access Journals (Sweden)

    Mashiat Fatma

    2014-04-01

    Full Text Available Image segmentation, commonly known as partitioning of an image, is one of the intrinsic parts of any image processing technique. In this image processing step, the digital image of choice is segregated into sets of pixels on the basis of some predefined and preselected measures or standards. Many algorithms have been presented for segmenting a digital image. This paper presents a general review of algorithms that have been proposed for the purpose of image segmentation.

  17. A survey of computational intelligence techniques in protein function prediction.

    Science.gov (United States)

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    In the recent past, there has been massive growth in knowledge of unknown proteins with the advancement of high-throughput microarray technologies. Protein function prediction is the most challenging problem in bioinformatics. In the past, homology-based approaches were used to predict protein function, but they failed when a new protein differed from previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, used in wide areas of applications such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers to solve these problems by using computational intelligence techniques with appropriate datasets to improve the prediction performance. The summary shows that ensemble classifiers and integration of multiple heterogeneous data are useful for protein function prediction.

  18. A survey of reflectometry techniques with applications to TFTR

    Energy Technology Data Exchange (ETDEWEB)

    Collazo, I.; Stacey, W.M. [Georgia Inst. of Tech., Atlanta, GA (United States); Wilgen, J.; Hanson, G.; Bigelow, T.; Thomas, C.E. [Oak Ridge National Lab., TN (United States); Bretz, N. [Princeton Univ., NJ (United States). Plasma Physics Lab.

    1993-12-01

    This report presents a review of reflectometry with particular attention to eXtraordinary mode (X-mode) reflectometry using the novel technique of dual frequency differential phase. The advantage of using an X-mode wave is that it can probe the edge of the plasma with much higher resolution and using a much smaller frequency range than with the Ordinary mode (O-Mode). The general problem with previous full phase reflectometry techniques is that of keeping track of the phase (on the order of 1000 fringes) as the frequency is swept over the band. The dual frequency phase difference technique has the advantage that since it is keeping track of the phase difference of two frequencies with a constant frequency separation, the fringe counting is on the order of only 3 to 5 fringes. This fringe count, combined with the high resolution of the X-mode wave and the small plasma access requirements of reflectometry, make X-mode reflectometry a very attractive diagnostic for today's experiments and future fusion devices.

  19. Risk Assessment Techniques and Survey Method for COTS Components

    CERN Document Server

    Gupta, Rashmi

    2012-01-01

    The Rational Unified Process (RUP), a software engineering process, is gaining popularity nowadays. RUP delivers best software practices for the component software development life cycle and supports component-based software development. Risk is involved in every component development phase, and neglecting those risks sometimes hampers the growth of the software and leads to negative outcomes. In order to provide appropriate security and protection levels, identifying various risks is vital. Therefore risk identification plays a crucial role in component-based software development. This report addresses the incorporation of the component-based software development cycle into the RUP phases and assesses several categories of risk encountered in component-based software. It also entails a survey method to identify the risk factors and to evaluate the overall severity of the component software development in terms of risk. A formula for determining risk prevention cost and finding the risk probability is also included. The overall go...
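    The report's own cost and probability formulas are not reproduced here; the sketch below only illustrates the standard risk-exposure bookkeeping (exposure = probability x impact, prevention worthwhile when it costs less than the exposure) with invented numbers.

    ```python
    # Generic risk-exposure bookkeeping for component development phases.
    # (Illustrative only; the report's own cost/probability formulas are not reproduced.)
    risks = [
        # (phase, probability of occurrence, impact cost if it occurs, cost of prevention)
        ("requirements", 0.30, 50_000, 8_000),
        ("component selection", 0.20, 120_000, 15_000),
        ("integration", 0.40, 80_000, 20_000),
    ]

    for phase, p, impact, prevention in risks:
        exposure = p * impact                      # classic risk exposure = probability x impact
        worthwhile = prevention < exposure         # prevent only if it costs less than the exposure
        print(f"{phase:20s} exposure={exposure:>9,.0f}  prevent? {worthwhile}")
    ```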

  20. Watermarking techniques used in medical images: a survey.

    Science.gov (United States)

    Mousavi, Seyed Mojtaba; Naghsh, Alireza; Abu-Bakar, S A R

    2014-12-01

    The ever-growing numbers of medical digital images and the need to share them among specialists and hospitals for better and more accurate diagnosis require that patients' privacy be protected. As a result of this, there is a need for medical image watermarking (MIW). However, MIW needs to be performed with special care for two reasons. Firstly, the watermarking procedure cannot compromise the quality of the image. Secondly, confidential patient information embedded within the image should be flawlessly retrievable without risk of error after image decompressing. Despite extensive research undertaken in this area, there is still no method available to fulfill all the requirements of MIW. This paper aims to provide a useful survey on watermarking and offer a clear perspective for interested researchers by analyzing the strengths and weaknesses of different existing methods.

  1. GPR as a Low Impact Paleontological Survey Technique

    Science.gov (United States)

    Sturdevant, G. C.; Leverence, R.; Stewart, R.

    2013-12-01

    The Deweyville Formation, a Pleistocene fluvial sandstone, is a prolific source of megafaunal fossils from periods of low-stand environmental conditions. GPR was employed in an environmentally sensitive area in close proximity to a salt dome in Northwest Harris County, Texas, as a method of evaluating the probable paleo-depositional environment and to prospect for potential further site development of two distinct fossiliferous zones. The primary zone of interest is a lag-gravel-bounded sand responsible for producing a regionally unique fossil assemblage including South American megafauna (Lundelius et al, 2013). The secondary zone of interest contains undisturbed mammoth remains housed in coarse white sand emplaced on top of a clay drape which has been hypothesized to represent an oxbow lake formed by the meandering paleo-Brazos river. With an accurate map of the paleo-channel, future planning can focus on maximizing fossil recovery and minimizing site impact. Pulse EKKO 250 MHz, 400 MHz, and 1 GHz systems were employed in a prospect area proximal to the secondary site to calibrate and evaluate these systems for their resolution and penetration depth in the modern sediments. The data were processed using the EKKO Mapper and EKKO View Deluxe software packages; 3-D volumes were produced and sliced. Preliminary results from the 250 MHz system demonstrate successful imaging of the sand-clay interface. After these surveys were run, a small portion of the site was excavated to confirm the estimated velocities and observed anomalies, refine our modeling and interpretation, and improve grid design for further surveys. It was confirmed that the sand-clay interface was easily observable using GPR; however, the grid spacing proved to be too wide, leading to artifacts in the 3-D volume produced.

  2. Survey of energy efficient tracking and localization techniques in buildings using optical and wireless communication media

    NARCIS (Netherlands)

    Bruintjes, Tom M.; Kokkeler, André B.J.; Karagiannis, Georgios; Smit, Gerard J.M.

    2012-01-01

    This paper presents a survey of beamforming, beamsteering and mobile tracking techniques. The survey was made in the context of the SOWICI project. The aim of this project is to reduce power consumption of data exchanging devices within houses. An optical fiber network is used for data transport to

  3. Data indexing techniques for the EUVE all-sky survey

    Science.gov (United States)

    Lewis, J.; Saba, V.; Dobson, C.

    1992-01-01

    This poster describes techniques developed for manipulating large full-sky data sets for the Extreme Ultraviolet Explorer project. The authors have adapted the quadrilateralized cubic sphere indexing algorithm to efficiently store and process several types of large data sets, such as full-sky maps of photon counts, exposure time, and count rates. A variation of this scheme is used to index sparser data such as individual photon events and viewing times for selected areas of the sky, which are eventually used to create EUVE source catalogs.

  4. Ionosphere-magnetosphere studies using ground based VLF radio propagation technique: an Indian example

    Science.gov (United States)

    Chakravarty, Subhas

    Since IGY period (1957-58), natural and artificially produced Very Low Frequency (VLF) electromagnetic radiations are being recorded at large number of ground stations all over the world and on-board satellites to study various radio wave-thermal/energetic plasma interactive processes related to earth's ionosphere-plasmasphere-magnetosphere environment. The terrestrial propagation of these VLF radio waves are primarily enabled through the earth ionosphere wave guide (EIWG) mode to long horizontal distances around the globe and ducted along the geomagnetic field lines into the conjugate hemisphere through the plasmasphere-magnetosphere regions. The time frequency spectra of the received signals indicate presence of dispersion (wave/group velocities changing with frequency) and various cut-off frequencies based on the width of the EIWG, electron gyro and plasma frequencies etc., providing several types of received signals like whistlers, chorus, tweeks, hiss and hisslers which can be heard on loudspeakers/earphones with distinguishing audio structures. While the VLF technique has been a very effective tool for studying middle and high latitude phenomena, the importance of the similar and anomalous observations over the Indian low latitude stations provide potentially new challenges for their scientific interpretation and modelling. The ducted and non-ducted magnetospheric propagation, pro-longitudinal (PL) mode, low latitude TRIMPI/TLE (Transient Luminous Emissions) or other effects of wave-particle/wave-wave interactions, effects due to ionospheric irregularities and electric fields, full wave solutions to D-region ionisation perturbations due to solar and stellar energetic X- and γ-ray emissions during normal and flaring conditions are a few problems which have been addressed in these low latitude studies over India. Since the conjugate points of Indian stations lie over the Indian oceanic region, the VLF propagation effects would be relatively free from

  5. Condition assessment of concrete pavements using both ground penetrating radar and stress-wave based techniques

    Science.gov (United States)

    Li, Mengxing; Anderson, Neil; Sneed, Lesley; Torgashov, Evgeniy

    2016-12-01

    Two stress-wave based techniques, ultrasonic surface wave (USW) and impact echo (IE), as well as ground penetrating radar (GPR) were used to assess the condition of a segment of concrete pavement that includes a layer of concrete, a granular base and their interface. Core specimens retrieved at multiple locations were used to confirm the accuracy and reliability of each non-destructive testing (NDT) result. Results from this study demonstrate that the GPR method is accurate for estimating the pavement thickness and locating separations (air voids) between the concrete and granular base layers. The USW method is a rapid way to estimate the in-situ elastic modulus (dynamic elastic modulus) of the concrete; however, the existence of air voids at the interface could potentially affect the accuracy and reliability of the USW test results. The estimation of the dynamic modulus and the P-wave velocity of concrete was improved when a shorter wavelength range (3 in. to 8.5 in.) corresponding to the concrete layer thickness was applied instead of the full wavelength range (3 in. to 11 in.) based on the standard spacing of the receiver transducers. The IE method proved to be fairly accurate in estimating the thickness of concrete pavements. However, the flexural mode vibration could affect the accuracy and reliability of the test results. Furthermore, the existence of air voids between the concrete and granular base layers could affect the estimation of the compression wave velocity of concrete when the full wavelength range was applied (3 in. to 11 in.). Future work is needed in order to improve the accuracy and reliability of both USW and IE test results.
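    The impact-echo thickness estimate referred to above is conventionally based on T = beta * Cp / (2 * f_peak), where f_peak is the dominant spectral frequency, Cp the P-wave velocity and beta (about 0.96) a plate shape factor. The sketch below applies that relation to a synthetic record; the signal, velocity and slab thickness are assumptions, not data from the study.

    ```python
    import numpy as np

    def impact_echo_thickness(signal, fs, cp, beta=0.96):
        """Estimate plate thickness from an impact-echo record via its spectral peak."""
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
        f_peak = freqs[1:][spectrum[1:].argmax()]          # skip the DC bin
        return beta * cp / (2.0 * f_peak)

    # Hypothetical record: 0.254 m slab, cp = 4000 m/s -> thickness resonance ~ 7.56 kHz.
    fs, cp, thickness_true = 500_000, 4000.0, 0.254
    t = np.arange(0, 0.002, 1.0 / fs)
    f_res = 0.96 * cp / (2 * thickness_true)
    signal = np.exp(-t * 3000) * np.cos(2 * np.pi * f_res * t)   # damped resonance
    print(f"estimated thickness: {impact_echo_thickness(signal, fs, cp):.3f} m")
    ```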

  6. Survey of Multiple Information Hiding Techniques using Visual Cryptography

    Directory of Open Access Journals (Sweden)

    Bijoy Chhetri

    2015-10-01

    Full Text Available Information nowadays has become abundant, and its secure transmission and visualization have become a challenge. The major security concerns are authentication, confidentiality and data integrity. In this regard, various security methodologies have been introduced, and cryptography is one of the schemes in which information is transferred in disguised form and only an authentic user can reveal the exact information. Various cryptographic techniques have played a vital role here, among which the Visual Cryptographic System (VCS) is one in which the secret data (image, text, etc.) is encoded into multiple images and decoded using the Human Visual System (HVS), without tedious calculations or a sound knowledge of cryptography. VC is one such methodology in which the secret information is bifurcated into many disguised images, and on superimposing these images the original secret information is revealed using the Human Visual System (HVS), unlike traditional cryptography where many complex mathematical and time-consuming calculations must be performed. In this paper, a study of various VC techniques has been carried out based on the number of shares, the number of secret messages and the types of shares in the case of grayscale images.
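    A minimal (2, 2) visual-cryptography sketch illustrates the idea of shares that reveal the secret when stacked. This is a generic illustration with a 2x2 pixel expansion, not a scheme from the surveyed papers.

    ```python
    import numpy as np

    def vc_shares(secret, seed=0):
        """Split a binary image (1 = black) into two noise-like shares.

        White pixels get the same random 2x2 pattern in both shares, black pixels get
        complementary patterns, so stacking (logical OR) the shares reveals the secret.
        """
        rng = np.random.default_rng(seed)
        patterns = np.array([[1, 0, 0, 1], [0, 1, 1, 0]])        # two half-black 2x2 patterns
        h, w = secret.shape
        s1 = np.zeros((2 * h, 2 * w), dtype=np.uint8)
        s2 = np.zeros_like(s1)
        for i in range(h):
            for j in range(w):
                k = rng.integers(2)
                p1 = patterns[k].reshape(2, 2)
                p2 = p1 if secret[i, j] == 0 else patterns[1 - k].reshape(2, 2)
                s1[2*i:2*i+2, 2*j:2*j+2] = p1
                s2[2*i:2*i+2, 2*j:2*j+2] = p2
        return s1, s2

    secret = np.zeros((4, 8), dtype=np.uint8)
    secret[1:3, 2:6] = 1                                         # a small black rectangle
    s1, s2 = vc_shares(secret)
    stacked = np.maximum(s1, s2)                                 # stacking = OR of the shares
    print("black fraction where secret is black:", stacked[2:6, 4:12].mean())   # fully black
    print("black fraction where secret is white:", stacked[:2, :4].mean())      # half black
    ```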

  7. Attacks Prevention and Detection Techniques In MANET: A Survey

    Directory of Open Access Journals (Sweden)

    Pranjali D. Nikam,

    2014-11-01

    Full Text Available A wireless sensor network is a set of distributed sensor nodes that are randomly deployed in a geographical area to capture climatic changes such as temperature, humidity and pressure. In wireless networking, a MANET (Mobile Ad-Hoc Network) is a self-configurable network consisting of a collection of wireless mobile nodes that move dynamically from one location to another. Both active and passive attacks occur in MANETs, and a MANET does not have a static structure. Security for wireless networks is much more difficult than for wired networks. In the last few years, researchers have faced many security and attack issues in MANETs. Attacks such as packet dropping, black-hole, denial-of-service, wormhole and packet modification attacks are found in MANETs. During data communication, all of the above-mentioned attacks can access data easily without permission. To address attacks in MANETs and to secure data communication, intrusion detection systems are used. This paper presents a survey of different kinds of attacks on MANETs and wireless sensor networks. It is intended to help young researchers implement new hybrid algorithms for secure intrusion detection in MANETs.

  8. Ground survey of red lechwe in the Linyanti swamps and Chobe floodplains, northern Botswana

    Directory of Open Access Journals (Sweden)

    Phemelo Gadimang

    2017-05-01

    Full Text Available A ground survey of red lechwe was carried out in the Linyanti swamps and the Chobe floodplains of northern Botswana in the dry and wet seasons of 2012 and 2013, respectively. We documented numbers, sex ratio and age structure of red lechwe within the linear strips of 25 km × 300 m along the Linyanti swamps and the Chobe floodplains. Results indicated a significant difference in the numbers of red lechwe between sites and seasons. About 66 and 755 red lechwe were estimated for Chobe in the dry and wet season, respectively, with 343 and 261 of them estimated for Linyanti in the dry and wet season, respectively. In Chobe, the red lechwe densities varied widely between seasons (9 red lechwe/km2 – 101 red lechwe/km2) compared with Linyanti, where the densities did not vary much between seasons (35 red lechwe/km2 – 46 red lechwe/km2). The lower densities of red lechwe in Chobe in the dry season when compared with the wet season suggest a possible seasonal shift in the distribution of red lechwe to the nearby Zambezi floodplains in Namibia. Conservation implications: The higher number of red lechwe in the Chobe floodplains in the wet season indicates the potential of the floodplains as a habitat for this species in that season. The dry season shift in the distribution of red lechwe in Chobe presents an opportunity for local communities in Namibia to engage in tourism, whereas the return of the red lechwe to the floodplains in the wet season ensures protection of the animals as well as boosts the tourism potential of the Chobe National Park.

  9. A Survey on Dynamic Spectrum Access Techniques for Cognitive Radio

    CERN Document Server

    Garhwal, Anita

    2012-01-01

    Cognitive radio (CR) is a new paradigm that utilizes the available spectrum band. The key characteristic of a CR system is to sense the electromagnetic environment, adapt its operation and dynamically vary its radio operating parameters. The technique of dynamically accessing unused spectrum bands is known as Dynamic Spectrum Access (DSA). Dynamic spectrum access technology helps to minimize unused spectrum bands. In this paper, the main functions of Cognitive Radio (CR), i.e. spectrum sensing, spectrum management, spectrum mobility and spectrum sharing, are discussed. DSA models are then discussed along with different methods of DSA such as Command and Control, Exclusive-Use, Shared Use of Primary Licensed User and Commons methods. A game-theoretic approach using the Bertrand game model, a Markovian queuing model for spectrum allocation in a centralized architecture and a fuzzy-logic-based method are also discussed, and results are shown.

  10. Adaptive Steganography: A survey of Recent Statistical Aware Steganography Techniques

    Directory of Open Access Journals (Sweden)

    Manish Mahajan

    2012-09-01

    Full Text Available Steganography is the science that deals with hiding secret data in some carrier medium, which may be an image, audio, formatted text or video. The main idea behind this is to conceal the very existence of the data. We deal here with image steganography. Many algorithms have been proposed for this purpose in the spatial and frequency domains. But in almost all of these algorithms it has been noticed that as we embed secret data in an image, certain characteristics or statistics of the image get disturbed. Based on these disturbed statistics, steganalysts can infer the existence of secret data, which they then decode with the help of available steganalytic tools. Steganalysis is the science of attacking hidden data to gain unauthorized access. Although steganalysis is not a part of this work, it may sometimes be discussed as part of the literature. Even within steganography we are not purely concerned with the spatial or frequency domain; rather, our main emphasis is on adaptive steganography, or model-based steganography. Adaptive steganography is not entirely a new branch of steganography; rather, it is based upon the spatial and frequency domains with an additional layer of mathematical modelling. So here we deal with adaptive steganography, which takes into account the important characteristics and statistics of the cover image in advance of embedding the secret data, so that the disturbance of image statistics mentioned earlier, which attracts forgery or unauthorized access, can be minimized. In this survey we analyze various steganography algorithms that are based upon a mathematical model, or in other words algorithms that fall under the category of model-based steganography.
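    As a toy illustration of the 'adaptive' idea, embedding only where the cover statistics can better absorb the change, the sketch below restricts LSB embedding to high-variance neighbourhoods. It is not one of the surveyed model-based algorithms; the variance threshold and data are assumptions.

    ```python
    import numpy as np

    def embed_adaptive_lsb(cover, bits, var_threshold=100.0):
        """Embed bits in the LSBs of pixels lying in high-variance 3x3 neighbourhoods.

        Smooth regions are left untouched so the cover statistics are disturbed less;
        this is only a toy illustration of the adaptive idea, not a published scheme.
        """
        stego = cover.copy()
        h, w = cover.shape
        k = 0
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                if k >= len(bits):
                    return stego, k
                patch = cover[i-1:i+2, j-1:j+2].astype(float)
                if patch.var() > var_threshold:              # embed only in textured areas
                    stego[i, j] = (stego[i, j] & 0xFE) | bits[k]
                    k += 1
        return stego, k

    rng = np.random.default_rng(0)
    cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    payload = rng.integers(0, 2, size=200, dtype=np.uint8)
    stego, n = embed_adaptive_lsb(cover, payload)
    print(f"embedded {n} of {len(payload)} bits; max pixel change = "
          f"{int(np.abs(stego.astype(int) - cover).max())}")
    ```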

  11. Complete denture impression techniques practiced by private dental practitioners: a survey.

    Science.gov (United States)

    Kakatkar, Vinay R

    2013-09-01

    Impression making is an important step in fabricating complete dentures. A survey to determine the materials used and techniques practiced while recording complete denture impressions was conducted. It is disheartening that 33% of practitioners still use base plate custom trays to record final impressions, and 8% still use alginate for making final impressions. An acceptable technique for recording CD impressions is suggested.

  12. Comparative Analysis of Automatic Vehicle Classification Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Kanwal Yousaf

    2012-09-01

    Full Text Available Vehicle classification has emerged as a significant field of study because of its importance in a variety of applications such as surveillance, security systems, traffic congestion avoidance and accident prevention. So far numerous algorithms have been implemented for classifying vehicles, each following a different procedure for detecting vehicles in videos. By evaluating some of the commonly used techniques we highlight the most beneficial methodology for classifying vehicles. In this paper we describe the working of several video-based vehicle classification algorithms and compare them on the basis of different performance metrics such as the classifiers used, the classification methodology or principles, and the vehicle detection ratio. After comparing these parameters we conclude that the Hybrid Dynamic Bayesian Network (HDBN) classification algorithm is far better than the other algorithms owing to its way of estimating the simplest features of vehicles from different videos. HDBN detects vehicles by following the important stages of feature extraction, selection and classification. It extracts rear-view information of vehicles rather than other information such as the distance between the wheels and the height of the wheels.

  13. Biomass estimates of Pacific herring Clupea harengus pallasi, in California from the 1985-86 spawning-ground surveys

    OpenAIRE

    Spratt, Jerome D.

    1986-01-01

    The 1985-86 spawning biomass estimate of Pacific herring, Clupea harengus pallasi, in San Francisco Bay is 49,000 tons. The relatively small population increases during 1984 and 1985 indicate that the population is rebuilding slowly from the 1983-84 season when only 40,000 tons of herring spawned. Spawning-ground surveys in Tomales Bay were inconclusive. Herring normally spawn in eelgrass, Zostera marina, beds; this season herring spawned unexpectedly in deeper water, disrupting our...

  14. Mitigative techniques and analysis of generic site conditions for ground-water contamination associated with severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Shafer, J.M.; Oberlander, P.L.; Skaggs, R.L.

    1984-04-01

    The purpose of this study is to evaluate the feasibility of using ground-water contaminant mitigation techniques to control radionuclide migration following a severe commercial nuclear power reactor accident. The two types of severe commercial reactor accidents investigated are: (1) containment basemat penetration of core melt debris which slowly cools and leaches radionuclides to the subsurface environment, and (2) containment basemat penetration of sump water without full penetration of the core mass. Six generic hydrogeologic site classifications are developed from an evaluation of reported data pertaining to the hydrogeologic properties of all existing and proposed commercial reactor sites. One-dimensional radionuclide transport analyses are conducted on each of the individual reactor sites to determine the generic characteristics of a radionuclide discharge to an accessible environment. Ground-water contaminant mitigation techniques that may be suitable, depending on specific site and accident conditions, for severe power plant accidents are identified and evaluated. Feasible mitigative techniques and associated constraints on feasibility are determined for each of the six hydrogeologic site classifications. The first of three case studies is conducted on a site located on the Texas Gulf Coastal Plain. Mitigative strategies are evaluated for their impact on contaminant transport and results show that the techniques evaluated significantly increased ground-water travel times. 31 references, 118 figures, 62 tables.

  15. Assessing slope stability by ground based and remote techniques - a case study of 2015 Tbilisi disaster

    Science.gov (United States)

    Akhalaia, G.; Cakir, Z.; Tsiskarishvili, L.; Otinashvili, M.; Sukhishvili, L.; Merebashvili, G.; Tserodze, M.; Akubardia, D.; Managadze, M.

    2016-12-01

    On the night of 13 June 2015, a complex-type landslide was triggered by heavy rainfall in the river Vere basin, 10 km to the west of the Georgian capital Tbilisi. The flash-flood flow transported the landslide body to the centre of Tbilisi. As a result, 20 people are dead and 2 are still missing, and direct infrastructure damage is about 50 million USD. The landslide is located on the Mtatsminda anticline; its length is 3,600 m and its sliding surface area is estimated at 315,000 m2. Bedrock dips vary from 20 to 80 degrees, and the surface inclination is almost the same. Our group used geodetic, geophysical and UAV survey approaches to estimate the total volume of the landslide body. As a result of the investigation we calculated that 1,300,000 m3 of material was transported, but about 25% of the total amount is still on the sliding surface. As the whole area is prone to landsliding, different approaches were applied to assess slope stability and identify areas of ongoing deformation. The two most challenging factors were steep terrain and forest cover, so we used InSAR techniques, optical remote sensing, RTK measurements and geophysical methods. The detection and assessment of pre- and post-failure deformation represent an important task for understanding the failure mechanism and geometry of the landslide; the ultimate purpose is to evaluate its stability. Interferometric Synthetic Aperture Radar data from the ENVISAT sensor were utilized in the analysis of the pre-/post-event deformation. Also, a network of GNSS Continuously Operating Reference Stations was used for RTK to provide centimetre-precise measurements. After comparing results derived from these different approaches, appropriate methods were selected to identify the most unstable areas within the landslide zone.

  16. 2008 report of prairie grouse breeding ground survey on Fort Niobrara NWR

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This memorandum summarizes the 2008 prairie grouse lek survey on Fort Niobrara National Wildlife Refuge. The main objective of this survey is to monitor trends...

  17. Progress report: Waterfowl breeding ground aerial surveys in southern Saskatchewan: 1960

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes the Waterfowl Breeding Population and Habitat Survey and Waterfowl Production and Habitat Survey for southern Saskatchewan during 1960. The...

  18. Ice thickness profile surveying with ground penetrating radar at Artesonraju Glacier, Peru

    Science.gov (United States)

    Chisolm, Rachel; Rabatel, Antoine; McKinney, Daene; Condom, Thomas; Cochacin, Alejo; Davila Roller, Luzmilla

    2014-05-01

    Tropical glaciers are an essential component of the water resource systems in the mountainous regions where they are located, and a warming climate has resulted in the accelerated retreat of Andean glaciers in recent decades. The shrinkage of Andean glaciers influences the flood risk for communities living downstream as new glacial lakes have begun to form at the termini of some glaciers. As these lakes continue to grow in area and volume, they pose an increasing risk of glacial lake outburst floods (GLOFs). Ice thickness measurements have been a key missing link in studying the tropical glaciers in Peru and how climate change is likely to impact glacial melt and the growth of glacial lakes. Ground penetrating radar (GPR) has rarely been applied to glaciers in Peru to measure ice thickness, and these measurements can tell us a lot about how a warming climate will affect glaciers in terms of thickness changes. In the upper Paron Valley (Cordillera Blanca, Peru), an emerging lake has begun to form at the terminus of the Artesonraju Glacier, and this lake has key features, including overhanging ice and loose rock likely to create slides, that could trigger a catastrophic GLOF if the lake continues to grow. Because the glacier mass balance and lake mass balance are closely linked, ice thickness measurements and measurements of the bed slope of the Artesonraju Glacier and underlying bedrock can give us an idea of how the lake is likely to evolve in the coming decades. This study presents GPR data taken in July 2013 at the Artesonraju Glacier as part of a collaboration between the Unidad de Glaciologia y Recursos Hidricos (UGRH) of Peru, the Institut de Recherche pour le Développement (IRD) of France and the University of Texas at Austin (UT) of the United States of America. Two different GPR units belonging to UGRH and UT were used for subsurface imaging to create ice thickness profiles and to characterize the total volume of ice in the glacier. A common midpoint
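    Converting picked GPR two-way travel times to ice thickness uses d = v * t / 2; the sketch below assumes a typical radar-wave velocity for cold ice (about 0.168 m/ns) and invented travel-time picks, whereas the study itself would use a velocity calibrated on site (for example from the common-midpoint data).

    ```python
    import numpy as np

    # Convert GPR two-way travel times to ice thickness: d = v * t / 2.
    # 0.168 m/ns is a commonly quoted velocity for cold glacier ice (an assumption here).
    V_ICE = 0.168                                  # m/ns

    def ice_thickness(two_way_time_ns, v=V_ICE):
        return v * np.asarray(two_way_time_ns, dtype=float) / 2.0

    twt = np.array([250.0, 600.0, 1100.0])         # hypothetical picks along a profile, in ns
    print("ice thickness (m):", np.round(ice_thickness(twt), 1))
    ```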

  19. Analysis of meteorological variables in the Australasian region using ground- and space-based GPS techniques

    Science.gov (United States)

    Kuleshov, Yuriy; Choy, Suelynn; Fu, Erjiang Frank; Chane-Ming, Fabrice; Liou, Yuei-An; Pavelyev, Alexander G.

    2016-07-01

    Results of analysis of meteorological variables (temperature and moisture) in the Australasian region using the global positioning system (GPS) radio occultation (RO) and GPS ground-based observations verified with in situ radiosonde (RS) data are presented. The potential of using ground-based GPS observations for retrieving column integrated precipitable water vapour (PWV) over the Australian continent has been demonstrated using the Australian ground-based GPS reference stations network. Using data from 15 ground-based GPS stations, the state of the atmosphere over Victoria during a significant weather event, the March 2010 Melbourne storm, has been investigated, and it has been shown that the GPS observations have potential for monitoring the movement of a weather front that has a sharp moisture contrast. Temperature and moisture variability in the atmosphere over various climatic regions (the Indian and the Pacific Oceans, the Antarctic and Australia) has been examined using satellite-based GPS RO and in situ RS observations. Investigating recent atmospheric temperature trends over Antarctica, the time series of the collocated GPS RO and RS data were examined, and strong cooling in the lower stratosphere and warming through the troposphere over Antarctica have been identified, in agreement with outputs of climate models. With further expansion of the Global Navigation Satellite Systems (GNSS), it is expected that GNSS satellite- and ground-based measurements will be able to provide an order of magnitude more data, which in turn could significantly advance weather forecasting services, climate monitoring and analysis in the Australasian region.
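    Ground-based PWV retrieval of the kind mentioned above is commonly done by subtracting a modelled hydrostatic delay from the GPS zenith total delay and scaling the remainder; the sketch below is a heavily simplified version (a Saastamoinen-style hydrostatic delay, a fixed conversion factor of about 0.15, and invented inputs), offered as an illustration rather than the authors' processing.

    ```python
    import math

    def zhd_saastamoinen(pressure_hpa, lat_deg, height_m):
        """Zenith hydrostatic delay (metres) from surface pressure, latitude and height."""
        f = 1 - 0.00266 * math.cos(2 * math.radians(lat_deg)) - 0.00028e-3 * height_m
        return 0.0022768 * pressure_hpa / f

    def pwv_from_ztd(ztd_m, pressure_hpa, lat_deg, height_m, pi_factor=0.15):
        """PWV (m) ~ Pi * ZWD, with ZWD = ZTD - ZHD; Pi ~ 0.15 is a typical value."""
        zwd = ztd_m - zhd_saastamoinen(pressure_hpa, lat_deg, height_m)
        return pi_factor * zwd

    ztd = 2.45                                   # hypothetical zenith total delay, metres
    pwv = pwv_from_ztd(ztd, pressure_hpa=1005.0, lat_deg=-37.8, height_m=50.0)
    print(f"PWV ~ {pwv * 1000:.1f} mm")
    ```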

  20. U.S. Geological Survey ground-water studies in Illinois

    Science.gov (United States)

    Avery, Charles F.

    1994-01-01

    Ground water is an important source of water supply in Illinois. The largest amount of ground-water withdrawal is in the northern one-third of the State where aquifers to a depth of about 1,500 feet below land surface contain large quantities of potable water. Approximately 74 percent of the public water-supply systems in Illinois use ground water to supply potable water to more than 5.5 million people. Ground-water withdrawals account for almost 25 percent of the total water withdrawn for public water supplies in Illinois. Many public water-supply systems in the Chicago area have recently changed from using ground water pumped from wells to using water delivered from Lake Michigan. The major issues related to ground water in Illinois are: Water-quality degradation or contamination from point and nonpoint sources, and Water availability, because of the lowering of ground-water levels in the bedrock aquifers in northeastern Illinois and elsewhere in the State where pumpage has exceeded aquifer recharge and the susceptibility of the limited surface-water supplies in central and southern Illinois to drought.

  1. GWM-a ground-water management process for the U.S. Geological Survey modular ground-water model (MODFLOW-2000)

    Science.gov (United States)

    Ahlfeld, David P.; Barlow, Paul M.; Mulligan, Anne E.

    2005-01-01

    GWM is a Ground-Water Management Process for the U.S. Geological Survey modular three-dimensional ground-water model, MODFLOW-2000. GWM uses a response-matrix approach to solve several types of linear, nonlinear, and mixed-binary linear ground-water management formulations. Each management formulation consists of a set of decision variables, an objective function, and a set of constraints. Three types of decision variables are supported by GWM: flow-rate decision variables, which are withdrawal or injection rates at well sites; external decision variables, which are sources or sinks of water that are external to the flow model and do not directly affect the state variables of the simulated ground-water system (heads, streamflows, and so forth); and binary variables, which have values of 0 or 1 and are used to define the status of flow-rate or external decision variables. Flow-rate decision variables can represent wells that extend over one or more model cells and be active during one or more model stress periods; external variables also can be active during one or more stress periods. A single objective function is supported by GWM, which can be specified to either minimize or maximize the weighted sum of the three types of decision variables. Four types of constraints can be specified in a GWM formulation: upper and lower bounds on the flow-rate and external decision variables; linear summations of the three types of decision variables; hydraulic-head based constraints, including drawdowns, head differences, and head gradients; and streamflow and streamflow-depletion constraints. The Response Matrix Solution (RMS) Package of GWM uses the Ground-Water Flow Process of MODFLOW to calculate the change in head at each constraint location that results from a perturbation of a flow-rate variable; these changes are used to calculate the response coefficients. For linear management formulations, the resulting matrix of response coefficients is then combined with other
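    For the linear case, the response-matrix idea reduces to a small linear program; the sketch below is a toy formulation in the spirit of GWM (maximise total withdrawal subject to drawdown limits), with made-up response coefficients rather than values computed by MODFLOW.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy response-matrix formulation: maximise total withdrawal from three wells
    # subject to drawdown limits at two control locations. The response coefficients
    # (drawdown per unit pumping rate) are invented for illustration only.
    R = np.array([[0.8, 0.3, 0.1],        # drawdown response at control point 1
                  [0.2, 0.6, 0.5]])       # drawdown response at control point 2
    d_max = np.array([3.0, 2.5])          # allowed drawdown (length units)
    q_max = 5.0                           # per-well withdrawal capacity

    # linprog minimises, so minimise the negative of the total withdrawal.
    res = linprog(c=-np.ones(3), A_ub=R, b_ub=d_max, bounds=[(0, q_max)] * 3, method="highs")
    print("optimal withdrawal rates:", np.round(res.x, 2), " total:", round(-res.fun, 2))
    ```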

  2. VLBI collimation-tower technique for time-delay studies of a large ground-station communications antenna

    Science.gov (United States)

    Otoshi, T. Y.; Young, L. E.; Rusch, W. V. T.

    1985-01-01

    The need for an accurate but inexpensive method for measuring and evaluating time delays of large ground antennas for VLBI applications motivated the development of the collimation-tower technique. The supporting analytical work, performed primarily to verify time-delay measurement results obtained for a large antenna when the transmitter was at a collimation distance of 1/25 of the usual far-field criterion, is discussed. Comparisons of theoretical and experimental results are also given.

  3. Aerial Survey Results for 131I Deposition on the Ground after the Fukushima Daiichi Nuclear Power Plant Accident

    Energy Technology Data Exchange (ETDEWEB)

    Torii, Tatsuo [JAEA; Sugita, Takeshi [JAEA; Okada, Colin E. [NSTec; Reed, Michael S. [NSTec; Blumenthal, Daniel J. [NNSA

    2013-08-01

    In March 2011 the second largest accidental release of radioactivity in history occurred at the Fukushima Daiichi nuclear power plant following a magnitude 9.0 earthquake and subsequent tsunami. Teams from the U.S. Department of Energy, National Nuclear Security Administration Office of Emergency Response performed aerial surveys to provide initial maps of the dispersal of radioactive material in Japan. The initial results from the surveys did not report the concentration of 131I. This work reports on analyses performed on the initial survey data by a joint Japan-US collaboration to determine 131I ground concentration. This information is potentially useful in reconstruction of the inhalation and external exposure doses from this short-lived radionuclide. The deposited concentration of 134Cs is also reported.

  4. Shapes of the $^{192,190}$Pb ground states from beta decay studies using the total absorption technique

    CERN Document Server

    Estevez Aguado, M.E.; Agramunt, J.; Rubio, B.; Tain, J.L.; Jordan, D.; Fraile, L.M.; Gelletly, W.; Frank, A.; Csatlos, M.; Csige, L.; Dombradi, Zs.; Krasznahorkay, A.; Nacher, E.; Sarriguren, P.; Borge, M.J.G.; Briz, J.A.; Tengblad, O.; Molina, F.; Moreno, O.; Kowalska, M.; Fedosseev, V.N.; Marsh, B.A.; Fedorov, D.V.; Molkanov, P.L.; Andreyev, A.N.; Seliverstov, M.D.; Burkard, K.; Huller, W.

    2015-01-01

    The beta decay of $^{192,190}$Pb has been studied using the total absorption technique at the ISOLDE(CERN) facility. The beta-decay strength deduced from the measurements, combined with QRPA theoretical calculations, allow us to infer that the ground states of the $^{192,190}$Pb isotopes are spherical. These results represent the first application of the shape determination method using the total absorption technique for heavy nuclei and in a region where there is considerable interest in nuclear shapes and shape effects.

  5. A survey of unmanned ground vehicles with applications to agricultural and environmental sensing

    Science.gov (United States)

    Bonadies, Stephanie; Lefcourt, Alan; Gadsden, S. Andrew

    2016-05-01

    Unmanned ground vehicles have been utilized in the last few decades in an effort to increase the efficiency of agriculture, in particular, by reducing labor needs. Unmanned vehicles have been used for a variety of purposes including: soil sampling, irrigation management, precision spraying, mechanical weeding, and crop harvesting. In this paper, unmanned ground vehicles, implemented by researchers or commercial operations, are characterized through a comparison to other vehicles used in agriculture, namely airplanes and UAVs. An overview of different trade-offs of configurations, control schemes, and data collection technologies is provided. Emphasis is given to the use of unmanned ground vehicles in food crops, and includes a discussion of environmental impacts and economics. Factors considered regarding the future trends and potential issues of unmanned ground vehicles include development, management and performance. Also included is a strategy to demonstrate to farmers the safety and profitability of implementing the technology.

  6. Incomplete categorical data design non-randomized response techniques for sensitive questions in surveys

    CERN Document Server

    Tian, Guo-Liang

    2013-01-01

    Respondents to survey questions involving sensitive information, such as sexual behavior, illegal drug usage, tax evasion, and income, may refuse to answer the questions or provide untruthful answers to protect their privacy. This creates a challenge in drawing valid inferences from potentially inaccurate data. Addressing this difficulty, non-randomized response approaches enable sample survey practitioners and applied statisticians to protect the privacy of respondents and properly analyze the gathered data. Incomplete Categorical Data Design: Non-Randomized Response Techniqu

  7. EPN Chemical ecology and new techniques for below ground sampling and analyses of volatile semiochemicals

    Science.gov (United States)

    It is well established that herbivory-induced plant volatiles (HIPVs) attract natural enemies of the herbivores. Utilizing this plant response has become a fundamental part of above ground IPM programs. We now know that roots can also release HIPVs and that these compounds attract beneficial organis...

  8. ANALYSIS OF DISSOLVED METHANE, ETHANE, AND ETHYLENE IN GROUND WATER BY A STANDARD GAS CHROMATOGRAPHIC TECHNIQUE

    Science.gov (United States)

    The measurement of dissolved gases such as methane, ethane, and ethylene in ground water is important in determining whether intrinsic bioremediation is occurring in a fuel- or solvent-contaminated aquifer. A simple procedure is described for the collection and subsequent analys...

  9. ANALYSIS OF DISSOLVED METHANE, ETHANE, AND ETHYLENE IN GROUND WATER BY A STANDARD GAS CHROMATOGRAPHIC TECHNIQUE

    Science.gov (United States)

    The measurement of dissolved gases such as methane, ethane, and ethylene in ground water is important in determining whether intrinsic bioremediation is occurring in a fuel- or solvent-contaminated aquifer. A simple procedure is described for the collection and subsequent analys...

  10. Measurement of Walking Ground Reactions in Real-Life Environments: A Systematic Review of Techniques and Technologies.

    Science.gov (United States)

    Shahabpoor, Erfan; Pavic, Aleksandar

    2017-09-12

    Monitoring natural human gait in real-life environments is essential in many applications, including quantification of disease progression, monitoring the effects of treatment, and monitoring alteration of performance biomarkers in professional sports. Nevertheless, developing reliable and practical techniques and technologies necessary for continuous real-life monitoring of gait is still an open challenge. A systematic review of English-language articles from scientific databases including Scopus, ScienceDirect, Pubmed, IEEE Xplore, EBSCO and MEDLINE was carried out to analyse the 'accuracy' and 'practicality' of the current techniques and technologies for quantitative measurement of the tri-axial walking ground reactions outside the laboratory environment, and to highlight their strengths and shortcomings. In total, 679 relevant abstracts were identified, 54 full-text papers were included in the paper and the quantitative results of 17 papers were used for meta-analysis and comparison. Three classes of methods were reviewed: (1) methods based on measured kinematic data; (2) methods based on measured plantar pressure; and (3) methods based on direct measurement of ground reactions. It was found that all three classes of methods have competitive accuracy levels with methods based on direct measurement of the ground reactions showing highest accuracy while being least practical for long-term real-life measurement. On the other hand, methods that estimate ground reactions using measured body kinematics show highest practicality of the three classes of methods reviewed. Among the most prominent technical and technological challenges are: (1) reducing the size and price of tri-axial load-cells; (2) improving the accuracy of orientation measurement using IMUs; (3) minimizing the number and optimizing the location of required IMUs for kinematic measurement; (4) increasing the durability of pressure insole sensors, and (5) enhancing the robustness and versatility of the
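
    The kinematics-based class of methods above ultimately rests on Newton's second law applied to the whole body: the total ground reaction is the sum over body segments of mass times (acceleration plus gravity). A minimal sketch of that relation in Python, assuming segment masses and centre-of-mass accelerations are supplied from elsewhere (e.g. IMU-driven kinematics); the function name and interface are illustrative, not taken from the review:

        import numpy as np

        def ground_reaction_from_kinematics(segment_masses, segment_accelerations, g=9.81):
            """Estimate the total tri-axial ground reaction force (N) from whole-body
            kinematics via Newton's second law: F_GR = sum_i m_i * (a_i + g_vec)."""
            g_vec = np.array([0.0, 0.0, g])          # gravity term, z taken as vertical
            m = np.asarray(segment_masses)[:, None]  # (n_segments, 1) for broadcasting
            a = np.asarray(segment_accelerations)    # (n_segments, 3) CoM accelerations, m/s^2
            return (m * (a + g_vec)).sum(axis=0)     # (3,) total ground reaction

        # Example: a 70 kg body treated as one segment, accelerating upward at 2 m/s^2,
        # gives a vertical reaction of about 70 * (2 + 9.81) = 826.7 N.
        print(ground_reaction_from_kinematics([70.0], [[0.0, 0.0, 2.0]]))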

  11. Measurement of Walking Ground Reactions in Real-Life Environments: A Systematic Review of Techniques and Technologies

    Directory of Open Access Journals (Sweden)

    Erfan Shahabpoor

    2017-09-01

    Full Text Available Monitoring natural human gait in real-life environments is essential in many applications, including quantification of disease progression, monitoring the effects of treatment, and monitoring alteration of performance biomarkers in professional sports. Nevertheless, developing reliable and practical techniques and technologies necessary for continuous real-life monitoring of gait is still an open challenge. A systematic review of English-language articles from scientific databases including Scopus, ScienceDirect, Pubmed, IEEE Xplore, EBSCO and MEDLINE was carried out to analyse the ‘accuracy’ and ‘practicality’ of the current techniques and technologies for quantitative measurement of the tri-axial walking ground reactions outside the laboratory environment, and to highlight their strengths and shortcomings. In total, 679 relevant abstracts were identified, 54 full-text papers were included in the paper and the quantitative results of 17 papers were used for meta-analysis and comparison. Three classes of methods were reviewed: (1) methods based on measured kinematic data; (2) methods based on measured plantar pressure; and (3) methods based on direct measurement of ground reactions. It was found that all three classes of methods have competitive accuracy levels with methods based on direct measurement of the ground reactions showing highest accuracy while being least practical for long-term real-life measurement. On the other hand, methods that estimate ground reactions using measured body kinematics show highest practicality of the three classes of methods reviewed. Among the most prominent technical and technological challenges are: (1) reducing the size and price of tri-axial load-cells; (2) improving the accuracy of orientation measurement using IMUs; (3) minimizing the number and optimizing the location of required IMUs for kinematic measurement; (4) increasing the durability of pressure insole sensors, and (5) enhancing the robustness and

  12. Survey of technology for decommissioning of nuclear fuel cycle facilities. 8. Remote handling and cutting techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ogawa, Ryuichiro; Ishijima, Noboru [Japan Nuclear Cycle Development Inst., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1999-03-01

    In nuclear fuel cycle facility decommissioning and refurbishment, remote handling techniques such as dismantling, waste handling and decontamination are needed to reduce personnel radiation exposure. A survey was conducted of the status of R and D activities worldwide on remote handling tools suitable for nuclear facilities, and of existing domestic commercial cutting tools applicable to decommissioning of such facilities. In addition, the drive mechanisms, sensing elements and control systems applicable to remote handling devices were also surveyed. This report presents brief summaries of the survey. (H. Itami)

  13. On-the-Job Ethics – Proximity Morality Forming in Medical School: A grounded theory analysis using survey data

    Directory of Open Access Journals (Sweden)

    Hans O. Thulesius, MD, Ph.D.

    2009-03-01

    Full Text Available On-the-job-ethics exist in all businesses and can also be called proximity morality forming. In this paper we propose that medical students take a proximity morality stance towards ethics education at medical school. This means that they want to form physician morality “on the job” instead of being taught ethics like any other subject. On-the-job-ethics for medical students involves learning ethics that is used when practicing ethics. Learning ethics includes comprehensive ethics courses in which quality lectures provide ethics grammar useful for the ethics practicing in attitude exercises and vignette reflections in tutored group discussions. On-the-job-ethics develops professional identity, handles diversity of religious and existential worldviews, trains students described as ethically naive, processes difficult clinical experiences, and resists negative role modeling by physicians in clinical or teaching situations. This grounded theory analysis was made from a questionnaire survey on attitudes to ethics education with 409 Swedish medical students participating. We analyzed over 8000 words of open-ended responses and multiple-choice questions using classic grounded theory procedures, but also compared questionnaire data using statistics such as multiple regression models. The paper gives an example of how grounded theory can be used with a limited amount of survey data.

  14. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    Science.gov (United States)

    James, M. R.; Robson, S.; d'Oleire-Oltmanns, S.; Niethammer, U.

    2017-03-01

    Structure-from-motion (SfM) algorithms greatly facilitate the production of detailed topographic models from photographs collected using unmanned aerial vehicles (UAVs). However, the survey quality achieved in published geomorphological studies is highly variable, and sufficient processing details are never provided to understand fully the causes of variability. To address this, we show how survey quality and consistency can be improved through a deeper consideration of the underlying photogrammetric methods. We demonstrate the sensitivity of digital elevation models (DEMs) to processing settings that have not been discussed in the geomorphological literature, yet are a critical part of survey georeferencing, and are responsible for balancing the contributions of tie and control points. We provide a Monte Carlo approach to enable geomorphologists to (1) carefully consider sources of survey error and hence increase the accuracy of SfM-based DEMs and (2) minimise the associated field effort by robust determination of suitable lower-density deployments of ground control. By identifying appropriate processing settings and highlighting photogrammetric issues such as over-parameterisation during camera self-calibration, processing artefacts are reduced and the spatial variability of error minimised. We demonstrate such DEM improvements with a commonly-used SfM-based software (PhotoScan), which we augment with semi-automated and automated identification of ground control points (GCPs) in images, and apply to two contrasting case studies - an erosion gully survey (Taroudant, Morocco) and an active landslide survey (Super-Sauze, France). In the gully survey, refined processing settings eliminated step-like artefacts of up to 50 mm in amplitude, and overall DEM variability with GCP selection improved from 37 to 16 mm. In the much more challenging landslide case study, our processing halved planimetric error to 0.1 m, effectively doubling the frequency at which changes in
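
    The Monte Carlo strategy mentioned above can be sketched generically: repeatedly withhold a random subset of ground control points (GCPs), re-process the survey with the remainder, and record the check-point error, so that the spread of DEM error with GCP selection and density can be quantified. The sketch below only illustrates that loop; the process_survey callable standing in for a PhotoScan/Metashape run, and the dummy error model in the demo, are hypothetical:

        import random
        import statistics

        def monte_carlo_gcp_assessment(gcps, n_active, n_trials, process_survey):
            """Draw random GCP subsets, reprocess the survey with each subset, and
            summarise the resulting check-point RMSE (mean and spread)."""
            rmses = []
            for _ in range(n_trials):
                active = random.sample(gcps, n_active)          # GCPs used for georeferencing
                checks = [g for g in gcps if g not in active]   # withheld points for validation
                rmses.append(process_survey(active, checks))
            return statistics.mean(rmses), statistics.stdev(rmses)

        # Dummy stand-in for a real reprocessing run, purely for illustration.
        demo_gcps = list(range(20))
        def demo_process(active, checks):
            return 0.05 + 0.01 * random.random() * (20 - len(active))
        print(monte_carlo_gcp_assessment(demo_gcps, n_active=8, n_trials=50, process_survey=demo_process))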

  15. The high resolution topographic evolution of an active retrogressive thaw slump compiled from a decade of photography, ground surveys, laser scans and satellite imagery

    Science.gov (United States)

    Crosby, B. T.; Barnhart, T. B.; Rowland, J. C.

    2015-12-01

    Remote sensing imagery has enabled the temporal reconstruction of thermal erosion features including lakes, shorelines and hillslope failures in remote Arctic locations, yet these planar data limit analysis to lines and areas. This study explores the application of varying techniques to reconstruct the three dimensional evolution of a single thermal erosion feature using a mixture of opportunistic oblique photos, ground surveys and satellite imagery. At the Selawik River retrogressive thaw slump in northwest Alaska, a bush plane collected oblique aerial photos when the feature was first discovered in 2004 and in subsequent years. These images were recently processed via Structure from Motion to generate georeferenced point clouds for the years prior to the initiation of our research. High resolution ground surveys in 2007, 2009 and 2010 were completed using a robotic total station. Terrestrial laser scans (TLS) were collected in the summers of 2011 and 2012. Analysis of stereo satellite imagery from 2012 and 2015 enables continued monitoring of the feature after ground campaigns ended. As accurate coregistration between point clouds is vital to topographic change detection, all prior and subsequent datasets were georeferenced to stable features observed in the 2012 TLS scan. Though this coregistration introduces uncertainty into each image, the magnitudes of uncertainty are significantly smaller than the topographic changes detected. Upslope retreat of the slump headwall generally decreases over time as the slump floor progresses from a highly dissected gully topography to a low relief, earthflow dominated depositional plane. The decreasing slope of the slump floor diminishes transport capacity, resulting in the progressive burial of the slump headwall, thus decreasing headwall retreat rates. This self-regulation of slump size based on feature relief and transport capacity suggests a capacity to predict the maximum size a given feature can expand to before

  16. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Reliability and safety analysis techniques survey - The reviewed reliability and safety analysis techniques are generally accepted techniques in many industries including the nuclear industry, and we selected a few techniques which are suitable for our robot system. They are fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey on the characteristics of robot systems which are distinguished from other systems and which are important to the analysis. 3. Survey on the nuclear environmental factors which affect the reliability and safety analysis of the robot system. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of reliability and safety of our robot system and also will be used for the formal qualification and certification of our reactor inspection system.
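
    Several of the techniques listed (reliability block diagrams, fault trees with AND/OR gates, the combinational method) reduce to simple probability arithmetic over series and parallel structures. A small illustrative sketch, with invented component reliabilities for a hypothetical reactor-inspection robot:

        from functools import reduce

        def series_reliability(reliabilities):
            """Series blocks (all components must work): R = product of R_i."""
            return reduce(lambda acc, r: acc * r, reliabilities, 1.0)

        def parallel_reliability(reliabilities):
            """Redundant (parallel) blocks: the system fails only if every component fails."""
            return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), reliabilities, 1.0)

        # Hypothetical example: two redundant cameras (0.95 each) in series with a
        # manipulator (0.99) and a controller (0.98).
        cameras = parallel_reliability([0.95, 0.95])          # 0.9975
        system = series_reliability([cameras, 0.99, 0.98])    # about 0.968
        print(round(system, 4))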

  17. Monitoring soil moisture patterns in alpine meadows using ground sensor networks and remote sensing techniques

    Science.gov (United States)

    Bertoldi, Giacomo; Brenner, Johannes; Notarnicola, Claudia; Greifeneder, Felix; Nicolini, Irene; Della Chiesa, Stefano; Niedrist, Georg; Tappeiner, Ulrike

    2015-04-01

    Soil moisture content (SMC) is a key factor for numerous processes, including runoff generation, groundwater recharge, evapotranspiration, soil respiration, and biological productivity. Understanding the controls on the spatial and temporal variability of SMC in mountain catchments is an essential step towards improving quantitative predictions of catchment hydrological processes and related ecosystem services. The interacting influences of precipitation, soil properties, vegetation, and topography on SMC and the influence of SMC patterns on runoff generation processes have been extensively investigated (Vereecken et al., 2014). However, in mountain areas, obtaining reliable SMC estimations is still challenging, because of the high variability in topography, soil and vegetation properties. In the last few years, there has been an increasing interest in the estimation of surface SMC at local scales. On the one hand, low cost wireless sensor networks provide high-resolution SMC time series. On the other hand, active remote sensing microwave techniques, such as Synthetic Aperture Radars (SARs), show promising results (Bertoldi et al. 2014). As these data provide continuous coverage of large spatial extents with high spatial resolution (10-20 m), they are particularly in demand for mountain areas. However, there are still limitations related to the fact that the SAR signal can penetrate only a few centimeters in the soil. Moreover, the signal is strongly influenced by vegetation, surface roughness and topography. In this contribution, we analyse the spatial and temporal dynamics of surface and root-zone SMC (2.5 - 5 - 25 cm depth) of alpine meadows and pastures in the Long Term Ecological Research (LTER) Area Mazia Valley (South Tyrol - Italy) with different techniques: (I) a network of 18 stations; (II) field campaigns with mobile ground sensors; (III) 20-m resolution RADARSAT2 SAR images; (IV) numerical simulations using the GEOtop hydrological model (Rigon et al

  18. Machine learning techniques for astrophysical modelling and photometric redshift estimation of quasars in optical sky surveys

    CERN Document Server

    Kumar, N Daniel

    2008-01-01

    Machine learning techniques are utilised in several areas of astrophysical research today. This dissertation addresses the application of ML techniques to two classes of problems in astrophysics, namely, the analysis of individual astronomical phenomena over time and the automated, simultaneous analysis of thousands of objects in large optical sky surveys. Specifically investigated are (1) techniques to approximate the precise orbits of the satellites of Jupiter and Saturn given Earth-based observations as well as (2) techniques to quickly estimate the distances of quasars observed in the Sloan Digital Sky Survey. Learning methods considered include genetic algorithms, particle swarm optimisation, artificial neural networks, and radial basis function networks. The first part of this dissertation demonstrates that GAs and PSOs can both be efficiently used to model functions that are highly non-linear in several dimensions. It is subsequently demonstrated in the second part that ANNs and RBFNs can be used as ef...

  19. More efficient ground truth ROI image coding technique :implementation and wavelet based application analysis

    Institute of Scientific and Technical Information of China (English)

    KUMARAYAPA Ajith; ZHANG Ye

    2007-01-01

    In this paper, a more efficient, low-complexity and reliable region of interest (ROI) image codec for compressing smooth, low-texture remote sensing images is proposed. We explore the efficiency of the modified ROI codec with respect to the selected set of convenient wavelet filters, which is a novel method. Such an analysis of ROI coding experiments, covering low-bit-rate lossy through high-quality lossless reconstruction together with timing analysis, is useful for improving remote sensing ground truth surveillance efficiency in terms of time and quality. The subjective [i.e. fair, five observer (HVS) evaluations using enhanced 3D picture view Hyper memory display technology] and the objective results revealed that for faster ground truth ROI coding applications, the Symlet-4 adaptation performs better than Biorthogonal 4.4 and Biorthogonal 6.8. However, the discrete Meyer wavelet adaptation is the best solution for delayed ROI image reconstructions.
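
    The filter comparison described above (Symlet-4 versus Biorthogonal 4.4/6.8 versus the discrete Meyer wavelet) can be reproduced in spirit with an off-the-shelf wavelet library. The sketch below uses PyWavelets on a synthetic tile with crude hard-thresholding as a stand-in for the authors' ROI codec, so it illustrates the comparison procedure rather than their actual coder:

        import numpy as np
        import pywt

        image = np.random.rand(256, 256)   # stand-in for a smooth, low-texture remote sensing tile
        threshold = 0.05

        for wavelet in ("sym4", "bior4.4", "bior6.8", "dmey"):
            coeffs = pywt.wavedec2(image, wavelet, level=3)
            # Zero-out small detail coefficients (crude lossy step); keep the approximation intact.
            thresholded = [coeffs[0]] + [
                tuple(pywt.threshold(band, threshold, mode="hard") for band in detail)
                for detail in coeffs[1:]
            ]
            recon = pywt.waverec2(thresholded, wavelet)[: image.shape[0], : image.shape[1]]
            rmse = np.sqrt(np.mean((image - recon) ** 2))
            print(f"{wavelet:>9s}: reconstruction RMSE = {rmse:.4f}")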

  20. High-resolution mapping, modeling, and evolution of subsurface geomorphology using ground-penetrating radar techniques

    Digital Repository Service at National Institute of Oceanography (India)

    Loveson, V.J.; Gujar, A.R.

    data over an area under study. The gaps between sample locations have to be either simulated or manipulated through various statistical methods. Under such conditions, mapping of the area may not yield the reality of the subsurface features in between... continuous profiles along with a 200 MHz antenna and measuring wheel. Sometimes, for confirmation, a 400 MHz antenna was also used. The GPR system was initialized in the field so that the ground reality, related to geo-electrical conditions of the field...

  1. Efficient Calculation of Born Scattering for Fixed-Offset Ground-Penetrating Radar Surveys

    DEFF Research Database (Denmark)

    Meincke, Peter

    2007-01-01

    A formulation is presented for efficient calculation of linear electromagnetic scattering by buried penetrable objects, as involved in the analysis of fixed-offset ground-penetrating radar (GPR) systems. The actual radiation patterns of the GPR antennas are incorporated in the scattering calculation...

  2. Ab initio organic chemistry : a survey of ground- and excited states and aromaticity

    NARCIS (Netherlands)

    Havenith, R.W.A.

    2001-01-01

    This thesis describes the application of quantum mechanical methods to organic chemistry. The ground- and excited states of functionalized oligo(cyclohexylidenes) have been explored as a function of chain length, conformation and substitution. VB theory has been used to study the effect of cyc

  3. Preliminary Analysis of Ground-based Orbit Determination Accuracy for the Wide Field Infrared Survey Telescope (WFIRST)

    Science.gov (United States)

    Sease, Brad

    2017-01-01

    The Wide Field Infrared Survey Telescope is a 2.4-meter telescope planned for launch to the Sun-Earth L2 point in 2026. This paper details a preliminary study of the achievable accuracy for WFIRST from ground-based orbit determination routines. The analysis here is divided into two segments. First, a linear covariance analysis of early mission and routine operations provides an estimate of the tracking schedule required to meet mission requirements. Second, a simulated operations scenario gives insight into the expected behavior of a daily Extended Kalman Filter orbit estimate over the first mission year given a variety of potential momentum unloading schemes.

  4. Aerial survey of barren-ground caribou at Adak and Kagalaska Islands, Alaska in 2012

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Uncertainty surrounded the caribou population trend since the 2005 survey for several reasons. First, reported hunter harvest rates returned to or exceeded pre‐naval...

  5. A Survey and Comparative Study on Video Watermarking Techniques with Reference to Mobile Devices

    Directory of Open Access Journals (Sweden)

    Ankitha.A.Nayak

    2014-12-01

    Full Text Available During the last few years, mobile devices like smartphones and tablets have witnessed rapid growth in terms of hardware and software. The increasing number of apps and the sharing of data, videos and images through the internet require security and intellectual property rights. Developing a watermarking technique for data protection and authentication of shared data on the mobile internet, within limited memory and significant battery consumption constraints, is one of the current challenging fields. In this paper we have performed a survey of available video watermarking techniques and a feasibility study of video watermarking techniques for mobile devices. A comparative study of the features of different video watermarking algorithms is also performed.

  6. Building Common Ground for Environmental Flows using Traditional Techniques and Novel Engagement Approaches

    Science.gov (United States)

    Mott Lacroix, Kelly E.; Xiu, Brittany C.; Megdal, Sharon B.

    2016-04-01

    Despite increased understanding of the science of environmental flows, identification and implementation of effective environmental flow policies remains elusive. Perhaps the greatest barrier to implementing flow policies is the framework for water management. An alternative management approach is needed when legal rights for environmental flows do not exist, or are ineffective at protecting ecosystems. The research presented here, conducted in the U.S. state of Arizona, provides an empirical example of engagement to promote social learning as an approach to finding ways to provide water for the environment where legal rights for environmental flows are inadequate. Based on our engagement process we propose that identifying and then building common ground require attention to the process of analyzing qualitative data and the methods for displaying complex information, two aspects not frequently discussed in the social learning or stakeholder engagement literature. The results and methods from this study can help communities develop an engagement process that will find and build common ground, increase stakeholder involvement, and identify innovative solutions to provide water for the environment that reflect the concerns of current water users.

  7. Prediction of peak ground acceleration of Iran's tectonic regions using a hybrid soft computing technique

    Directory of Open Access Journals (Sweden)

    Mostafa Gandomi

    2016-01-01

    Full Text Available A new model is derived to predict the peak ground acceleration (PGA) utilizing a hybrid method coupling artificial neural network (ANN) and simulated annealing (SA), called SA-ANN. The proposed model relates PGA to earthquake source-to-site distance, earthquake magnitude, average shear-wave velocity, faulting mechanisms, and focal depth. A database of strong ground-motion recordings of 36 earthquakes, which occurred in Iran's tectonic regions, is used to establish the model. For more validity verification, the SA-ANN model is employed to predict the PGA of a part of the database beyond the training data domain. The proposed SA-ANN model is compared with the simple ANN in addition to 10 well-known models proposed in the literature. The proposed model performance is superior to the single ANN and other existing attenuation models. The SA-ANN model is highly correlated to the actual records (R = 0.835 and ρ = 0.0908) and it is subsequently converted into a tractable design equation.
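
    The hybrid idea, using simulated annealing to search the weights of a small feed-forward network, can be sketched on synthetic data. Everything below (network size, cooling schedule, and the synthetic predictors standing in for magnitude, distance, shear-wave velocity, mechanism and depth) is an illustrative assumption rather than the published SA-ANN configuration:

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 5))           # stand-in predictors (standardised)
        y = 0.8 * X[:, 0] - 1.2 * np.log1p(np.abs(X[:, 1])) + 0.1 * rng.normal(size=200)

        n_hidden = 6
        def unpack(theta):
            w1 = theta[: 5 * n_hidden].reshape(5, n_hidden)
            b1 = theta[5 * n_hidden : 6 * n_hidden]
            w2 = theta[6 * n_hidden : 7 * n_hidden]
            return w1, b1, w2, theta[-1]

        def predict(theta, X):
            w1, b1, w2, b2 = unpack(theta)
            return np.tanh(X @ w1 + b1) @ w2 + b2

        def mse(theta):
            return np.mean((predict(theta, X) - y) ** 2)

        theta = rng.normal(scale=0.1, size=7 * n_hidden + 1)
        cost = mse(theta)
        best, best_cost, T = theta.copy(), cost, 1.0
        for _ in range(20000):                                     # simulated annealing loop
            cand = theta + rng.normal(scale=0.05, size=theta.size)
            c = mse(cand)
            if c < cost or rng.random() < np.exp((cost - c) / T):  # Metropolis acceptance
                theta, cost = cand, c
                if c < best_cost:
                    best, best_cost = cand.copy(), c
            T *= 0.9997                                            # geometric cooling schedule
        print("best training MSE:", round(float(mse(best)), 4))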

  8. Visual servoing in medical robotics: a survey. Part II: tomographic imaging modalities--techniques and applications.

    Science.gov (United States)

    Azizian, Mahdi; Najmaei, Nima; Khoshnam, Mahta; Patel, Rajni

    2015-03-01

    Intraoperative application of tomographic imaging techniques provides a means of visual servoing for objects beneath the surface of organs. The focus of this survey is on therapeutic and diagnostic medical applications where tomographic imaging is used in visual servoing. To this end, a comprehensive search of the electronic databases was completed for the period 2000-2013. Existing techniques and products are categorized and studied, based on the imaging modality and their medical applications. This part complements Part I of the survey, which covers visual servoing techniques using endoscopic imaging and direct vision. The main challenges in using visual servoing based on tomographic images have been identified. 'Supervised automation of medical robotics' is found to be a major trend in this field and ultrasound is the most commonly used tomographic modality for visual servoing. Copyright © 2014 John Wiley & Sons, Ltd.

  9. Gastronet survey on the use of one- or two-person technique for colonoscopy insertion

    Directory of Open Access Journals (Sweden)

    Kjellevold Øystein

    2011-06-01

    Full Text Available Abstract Background: Usually, colonoscopy insertion is performed by the colonoscopist (one-person technique). Quite common in the early days of endoscopy, the assisting nurse is now only rarely doing the insertion (two-person technique). Using the Norwegian national endoscopy quality assurance (QA) programme, Gastronet, we wanted to explore the extent of two-person technique practice and look into possible differences in performance and QA output measures. Methods: 100 colonoscopists in 18 colonoscopy centres having reported their colonoscopies to Gastronet between January and December 2009 were asked if they practiced one- or two-person technique during insertion of the colonoscope. They were categorized accordingly for comparative analyses of QA indicators. Results: 75 endoscopists responded to the survey (representing 9368 colonoscopies) - 62 of them (83%) applied one-person technique and 13 (17%) two-person technique. Patients' age and sex distributions and indications for colonoscopy were also similar in the two groups. Caecal intubation was 96% in the two-person group compared to 92% in the one-person group (p … Conclusion: Two-person technique for colonoscope insertion was practiced by a considerable minority of endoscopists (17%). QA indicators were either similar to or better than for the one-person technique. This suggests that there may be some beneficial elements to this technique worth exploring and trying to import into the much preferred one-person insertion technique.

  10. The development of pure β-NQR techniques for measurements of nuclear ground state quadrupole moments in lithium isotopes

    Science.gov (United States)

    Voss, A.; Pearson, M. R.; Billowes, J.; Buchinger, F.; Chow, K. H.; Crawford, J. E.; Hossein, M. D.; Kiefl, R. F.; Levy, C. D. P.; MacFarlane, W. A.; Mané, E.; Morris, G. D.; Parolin, T. J.; Saadaoui, H.; Salman, Z.; Smadella, M.; Song, Q.; Wang, D.

    2011-09-01

    A β-NQR spectrometer becomes a powerful tool to study changes in nuclear ground state properties along isotopic chains when coupled to a laser excitation beamline to polarise the nuclei of interest. Recently, the β-NQR technique in a zero magnetic field has been applied for the first time to measure ratios of static nuclear quadrupole moments of Li isotopes. Preliminary results of the experiment determining the ratios Q9/Q8 and Q11/Q9 show agreement with present literature values with improved precision.

  11. Ab initio organic chemistry : a survey of ground- and excited states and aromaticity

    OpenAIRE

    Havenith, R.W.A.

    2001-01-01

    This thesis describes the application of quantum mechanical methods to organic chemistry. The ground- and excited states of functionalized oligo(cyclohexylidenes) have been explored as a function of chain length, conformation and substitution. VB theory has been used to study the effect of cyclopentafusion on pyrene on its aromatic characteristics. Finally, the relevant part of the C6H6 potential energy surface has been explored to shed light on the reaction mechanism of the thermal elect...

  12. Measuring galaxy [OII] emission line doublet with future ground-based wide-field spectroscopic surveys

    CERN Document Server

    Comparat, Johan; Bacon, Roland; Mostek, Nick J; Newman, Jeffrey A; Schlegel, David J; Yèche, Christophe

    2013-01-01

    The next generation of wide-field spectroscopic redshift surveys will map the large-scale galaxy distribution in the redshift range 0.7 < z < 2 to measure baryonic acoustic oscillations (BAO). The primary optical signature used in this redshift range comes from the [OII] emission line doublet, which provides a unique redshift identification that can minimize confusion with other single emission lines. To derive the required spectrograph resolution for these redshift surveys, we simulate observations of the [OII] (3727,3729) doublet for various instrument resolutions and line velocities. We foresee two strategies regarding the choice of the resolution for future spectrographs for BAO surveys. For bright [OII] emitter surveys ([OII] flux ~30×10^{-17} erg/cm2/s like SDSS-IV/eBOSS), a resolution of R~3300 allows the separation of 90 percent of the doublets. The impact of the sky lines on the completeness in redshift is less than 6 percent. For faint [OII] emitter surveys ([OII] flux ~10×10^{-17} erg/cm2/s like ...
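
    A quick check of why the doublet sets a resolution requirement that does not depend on redshift (numbers rounded; this back-of-the-envelope reasoning is a sketch, not taken from the paper):

        \Delta\lambda_{\mathrm{rest}} \approx 3728.8\,\mathrm{\AA} - 3726.0\,\mathrm{\AA} \approx 2.8\,\mathrm{\AA},
        \qquad
        \Delta v = c\,\frac{\Delta\lambda}{\lambda}
                 \approx 3\times10^{5}\,\mathrm{km\,s^{-1}} \times \frac{2.8}{3727}
                 \approx 225\ \mathrm{km\,s^{-1}}

    Both lines redshift together, so the velocity split is constant; merely separating the line centroids needs R = \lambda/\Delta\lambda \approx 1300, and cleanly resolving the doublet against realistic line widths pushes the requirement towards the R~3300 quoted above.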

  13. Monitoring of ground surface deformation in mining area with InSAR technique

    Institute of Scientific and Technical Information of China (English)

    朱建军; 邢学敏; 胡俊; 李志伟

    2011-01-01

    The application status and research progress of the InSAR technique in monitoring ground surface deformation in mining areas are introduced. Firstly, the advantages of the D-InSAR technique are analyzed by comparison with traditional surveying methods. Then, the limitations of D-InSAR in mining deformation detection are described. In view of the limitations of the traditional D-InSAR method, advanced InSAR techniques, e.g., the small baseline subset (SBAS), permanent scatterer (PS) and corner reflector (CR) techniques, are discussed. Using real mining subsidence monitoring as an example, the characteristics and application status of these advanced InSAR techniques are studied, and the key problems still existing in current research are summarized. Finally, it is indicated that the combination of advanced InSAR techniques and high-resolution SAR images is the development trend for monitoring surface deformation in mining areas.
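
    At the core of all the D-InSAR variants mentioned (SBAS, PS, CR) is the conversion of unwrapped differential phase into line-of-sight displacement, d = -(λ/4π)·Δφ. A minimal sketch, assuming a C-band wavelength of about 5.6 cm; the sensor choice and sign convention are illustrative:

        import numpy as np

        def los_displacement(delta_phase_rad, wavelength_m=0.056):
            """Convert unwrapped differential interferometric phase (radians) into
            line-of-sight displacement (metres): d = -(lambda / (4*pi)) * delta_phi."""
            return -(wavelength_m / (4.0 * np.pi)) * np.asarray(delta_phase_rad)

        # One full fringe (2*pi) of C-band differential phase corresponds to about
        # 2.8 cm of line-of-sight motion.
        print(los_displacement(2.0 * np.pi))   # -> -0.028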

  14. Changes in sample collection and analytical techniques and effects on retrospective comparability of low-level concentrations of trace elements in ground water

    Science.gov (United States)

    Ivahnenko, T.; Szabo, Z.; Gibs, J.

    2001-01-01

    Ground-water sampling techniques were modified to reduce random low-level contamination during collection of filtered water samples for determination of trace-element concentrations. The modified sampling techniques were first used in New Jersey by the US Geological Survey in 1994 along with inductively coupled plasma-mass spectrometry (ICP-MS) analysis to determine the concentrations of 18 trace elements at the one microgram-per-liter (μg/L) level in the oxic water of the unconfined sand and gravel Kirkwood-Cohansey aquifer system. The revised technique tested included a combination of the following: collection of samples (1) with flow rates of about 2L per minute, (2) through acid-washed single-use disposable tubing and (3) a single-use disposable 0.45-μm pore size capsule filter, (4) contained within portable glove boxes, (5) in a dedicated clean sampling van, (6) only after turbidity stabilized at values less than 2 nephelometric turbidity units (NTU), when possible. Quality-assurance data, obtained from equipment blanks and split samples, indicated that trace element concentrations, with the exception of iron, chromium, aluminum, and zinc, measured in the samples collected in 1994 were not subject to random contamination at 1μg/L.Results from samples collected in 1994 were compared to those from samples collected in 1991 from the same 12 PVC-cased observation wells using the available sampling and analytical techniques at that time. Concentrations of copper, lead, manganese and zinc were statistically significantly lower in samples collected in 1994 than in 1991. Sampling techniques used in 1994 likely provided trace-element data that represented concentrations in the aquifer with less bias than data from 1991 when samples were collected without the same degree of attention to sample handling.

  15. Fusion techniques for hybrid ground-penetrating radar: electromagnetic induction landmine detection systems

    Science.gov (United States)

    Laffin, Matt; Mohamed, Magdi A.; Etebari, Ali; Hibbard, Mark

    2010-04-01

    Hybrid ground penetrating radar (GPR) and electromagnetic induction (EMI) sensors have advanced landmine detection far beyond the capabilities of a single sensing modality. Both probability of detection (PD) and false alarm rate (FAR) are impacted by the algorithms utilized by each sensing mode and the manner in which the information is fused. Algorithm development and fusion will be discussed, with an aim at achieving a threshold probability of detection (PD) of 0.98 with a low false alarm rate (FAR) of less than 1 false alarm per 2 square meters. Stochastic evaluation of prescreeners and classifiers is presented with subdivisions determined based on mine type, metal content, and depth. Training and testing of an optimal prescreener on lanes that contain mostly low metal anti-personnel mines is presented. Several fusion operators for pre-screeners and classifiers, including confidence map multiplication, will be investigated and discussed for integration into the algorithm architecture.
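
    One of the fusion operators named above, confidence-map multiplication, amounts to an element-wise product of the co-registered per-pixel confidences produced by the GPR and EMI prescreeners. A naive illustration; the array values and detection threshold are invented, and this is not the authors' algorithm architecture:

        import numpy as np

        def fuse_confidence_maps(gpr_conf, emi_conf, threshold=0.5):
            """Fuse two co-registered confidence maps (values in [0, 1]) by element-wise
            multiplication, then declare detections above a threshold."""
            fused = np.asarray(gpr_conf) * np.asarray(emi_conf)
            return fused, fused > threshold

        gpr = np.array([[0.9, 0.2], [0.8, 0.1]])
        emi = np.array([[0.7, 0.3], [0.9, 0.2]])
        fused, detections = fuse_confidence_maps(gpr, emi)
        print(fused)        # [[0.63 0.06] [0.72 0.02]]
        print(detections)   # only pixels where both sensors are confident survive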

  16. High resolution shallow geologic characterization of a late Pleistocene eolian environment using ground penetrating radar and optically stimulated luminescence techniques: North Carolina, USA

    Science.gov (United States)

    Mallinson, D.; Mahan, S.; Moore, Christine

    2008-01-01

    Geophysical surveys, sedimentology, and optically-stimulated luminescence age analyses were used to assess the geologic development of a coastal system near Swansboro, NC. This area is a significant Woodland Period Native American habitation and is designated the "Broad Reach" archaeological site. 2-D and 3-D subsurface geophysical surveys were performed using a ground penetrating radar system to define the stratigraphic framework and depositional facies. Sediment samples were collected and analyzed for grain-size to determine depositional environments. Samples were acquired and analyzed using optically stimulated luminescence techniques to derive the depositional age of the various features. The data support a low eolian to shallow subtidal coastal depositional setting for this area. LiDAR data reveal ridge and swale topography, most likely related to beach ridges, and eolian features including low-relief, low-angle transverse and parabolic dunes, blowouts, and a low-relief eolian sand sheet. Geophysical data reveal dominantly seaward dipping units, and low-angle mounded features. Sedimentological data reveal mostly moderately-well to well-sorted fine-grained symmetrical to coarse skewed sands, suggesting initial aqueous transport and deposition, followed by eolian reworking and bioturbation. OSL data indicate initial coastal deposition prior to ca. 45,000 yBP, followed by eolian reworking and low dune stabilization at ca. 13,000 to 11,500 yBP, and again at ca. 10,000 yBP (during, and slightly after the Younger Dryas chronozone).

  17. Relationships between ground and airborne gamma-ray spectrometric survey data, North Ras Millan, Southern Sinai Peninsula, Egypt.

    Science.gov (United States)

    Youssef, Mohamed A S

    2016-02-01

    In recent decades, there has been considerable growth in the use of airborne gamma-ray spectrometry. With this growth, there has been an increasing need to standardize airborne measurements, so that they are independent of survey parameters. Acceptable procedures were developed for converting airborne to ground gamma-ray spectrometric measurements of total-count intensity, as well as potassium, equivalent uranium and equivalent thorium concentrations, due to natural sources of radiation. The present study aims mainly to establish relationships between ground and airborne gamma-ray spectrometric data in North Ras Millan, Southern Sinai Peninsula, Egypt. The relationships between airborne and ground gamma-ray spectrometric data were deduced for the original and separated rock units in the study area. The various rocks in the study area, represented by Quaternary wadi sediments, Cambro-Ordovician sandstones, basic dykes and granites, are shown on the detailed geologic map. The structures displayed on the detailed geologic map are compiled from the integration of previous geophysical and surface geological studies.

  18. Fundamental study on airborne electromagnetic survey using grounded source; Chihyo source gata kuchu denji tansa no kisoteki kenkyu. 2

    Energy Technology Data Exchange (ETDEWEB)

    Mogi, T.; Fujimitsu, Y. [Kyushu University, Fukuoka (Japan). Faculty of Engineering; Tanaka, Y. [Kyoto University, Kyoto (Japan). Faculty of Science; Jomori, N. [Chiba Electronics Research Institute, Chiba (Japan); Morikawa, T. [Dowa Engineering Co. Ltd., Okayama (Japan); Kusunoki, K. [Central Research Institute of Electric Power Industry, Tokyo (Japan)

    1997-05-27

    With the objective of developing an airborne electromagnetic survey method capable of exploring greater depths, a discussion was given on an exploration method in which a transmitting device is placed on the ground and signals are received in the air. A prototype exploration apparatus is mounted with a fluxgate magnetometer, an attitude meter, a GPS, and a battery. This exploration apparatus is suspended on a 30 meter long rope from a helicopter to perform the exploration. Two flight tests of this apparatus were carried out in the Unzen area, Nagasaki Prefecture and the Motomiya area, Wakayama Prefecture. The ground source was extended to a distance of 1.5 km, and a current of about 20 A was injected as a quiescent wave having four-second cycles. The helicopter flew nearly horizontally at a ground speed of about 50 km/h, a flight altitude of 450 m above sea level, and terrain clearances of 100 to 400 m. The obtained data had variations corresponding to changes in roll and pitch angles; the variation of about 5000 nT was reduced to about 1000 nT as a result of correction. It was not possible, however, to completely correct the short-period variation, requiring further discussion of the frequency characteristics of the magnetometer. 6 figs., 1 tab.

  19. COST Action TU1208 - Working Group 3 - Electromagnetic modelling, inversion, imaging and data-processing techniques for Ground Penetrating Radar

    Science.gov (United States)

    Pajewski, Lara; Giannopoulos, Antonios; Sesnic, Silvestar; Randazzo, Andrea; Lambot, Sébastien; Benedetto, Francesco; Economou, Nikos

    2017-04-01

    This work aims at presenting the main results achieved by Working Group (WG) 3 "Electromagnetic methods for near-field scattering problems by buried structures; data processing techniques" of the COST (European COoperation in Science and Technology) Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar" (www.GPRadar.eu, www.cost.eu). The main objective of the Action, started in April 2013 and ending in October 2017, is to exchange and increase scientific-technical knowledge and experience of Ground Penetrating Radar (GPR) techniques in civil engineering, whilst promoting in Europe the effective use of this safe non-destructive technique. The Action involves more than 150 Institutions from 28 COST Countries, a Cooperating State, 6 Near Neighbour Countries and 6 International Partner Countries. Among the most interesting achievements of WG3, we wish to mention the following ones: (i) A new open-source version of the finite-difference time-domain simulator gprMax was developed and released. The new gprMax is written in Python and includes many advanced features such as anisotropic and dispersive-material modelling, building of realistic heterogeneous objects with rough surfaces, built-in libraries of antenna models, optimisation of parameters based on Taguchi's method - and more. (ii) A new freeware CAD was developed and released, for the construction of two-dimensional gprMax models. This tool also includes scripts easing the execution of gprMax on multi-core machines or networks of computers and scripts for a basic plotting of gprMax results. (iii) A series of interesting freeware codes were developed and will be released by the end of the Action, implementing differential and integral forward-scattering methods, for the solution of simple electromagnetic problems by buried objects. (iv) An open database of synthetic and experimental GPR radargrams was created, in cooperation with WG2. The idea behind this initiative is to give researchers the

  20. The 1998 November 14 Occultation of GSC 0622-00345 by Saturn. I. Techniques for Ground-Based Stellar Occultations

    CERN Document Server

    Harrington, Joseph; 10.1088/0004-637X/716/1/398

    2010-01-01

    On 1998 November 14, Saturn and its rings occulted the star GSC 0622-00345. We observed atmospheric immersion with NSFCAM at the National Aeronautics and Space Administration's Infrared Telescope Facility on Mauna Kea, Hawaii. Immersion occurred at 55.5° S planetocentric latitude. A 2.3 μm methane-band filter suppressed reflected sunlight. Atmospheric emersion and ring data were not successfully obtained. We describe our observation, light-curve production, and timing techniques, including improvements in aperture positioning, removal of telluric scintillation effects, and timing. Many of these techniques are known within the occultation community, but have not been described in the reviewed literature. We present a light curve whose signal-to-noise ratio per scale height is 267, among the best ground-based signals yet achieved, despite a disadvantage of up to 8 mag in the stellar flux compared to prior work.

  1. Seeking Construct Validity Grounded in Constructivist Epistemology: Development of the Survey of Contemporary Learning Environments

    Science.gov (United States)

    Schuh, Kathy L.; Kuo, Yi-Lung

    2015-01-01

    This study focused on the development of a new classroom environment instrument for late-elementary students. The development of the survey of contemporary learning environments (SoCLE) followed a content analysis of three similar instruments on constructivist learning environments and the literature on characteristics of contemporary learning…

  2. Seeking Construct Validity Grounded in Constructivist Epistemology: Development of the Survey of Contemporary Learning Environments

    Science.gov (United States)

    Schuh, Kathy L.; Kuo, Yi-Lung

    2015-01-01

    This study focused on the development of a new classroom environment instrument for late-elementary students. The development of the survey of contemporary learning environments (SoCLE) followed a content analysis of three similar instruments on constructivist learning environments and the literature on characteristics of contemporary learning…

  3. Assessment of Kalman filter bias-adjustment technique to improve the simulation of ground-level ozone over Spain.

    Science.gov (United States)

    Sicardi, V; Ortiz, J; Rincón, A; Jorba, O; Pay, M T; Gassó, S; Baldasano, J M

    2012-02-01

    The CALIOPE air quality modelling system has been used to diagnose ground-level O3 concentrations for the year 2004 over the Iberian Peninsula. We investigate the improvement in the simulation of the daily O3 maximum through the use of a post-processing step, the Kalman filter bias-adjustment technique. The Kalman filter bias-adjustment technique is a recursive algorithm that optimally estimates bias-adjustment terms from previous measurements and model results. The bias-adjustment technique improved the simulation of the daily O3 maximum for the entire year and for all the stations considered over the whole domain. The corrected simulation presents improvements in statistical indicators such as correlation, root mean square error, mean bias, and gross error. After the post-processing, the exceedances of O3 concentration limits established by the European Directive 2008/50/CE are better reproduced, and the uncertainty of the modelling system, as defined by that Directive, is reduced from 20% to 7.5%, well under the established EU limit of 50%. Significant improvements in the timing and amplitude of the daily O3 cycle are also observed after the post-processing. The systematic improvements in the O3 maximum simulations suggest that the Kalman filter post-processing method is a suitable technique to produce accurate estimates of ground-level O3 concentrations. This study shows that the adjusted O3 concentrations obtained by post-processing the CALIOPE system results are a reliable means for near-real-time O3 forecasts.
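
    The recursive bias adjustment can be illustrated with a scalar Kalman filter whose state is the systematic forecast bias, updated each day from the latest (forecast - observation) pair. The noise-variance ratio and the toy numbers below are illustrative assumptions, not the settings used in the CALIOPE study:

        def kalman_bias_update(prev_bias, prev_var, observed_bias, ratio=0.01, obs_var=1.0):
            """One step of a scalar Kalman-filter bias estimator. 'observed_bias' is
            (forecast - observation) for the latest day; 'ratio' is the assumed ratio of
            state-noise to observation-noise variance."""
            pred_var = prev_var + ratio * obs_var          # predict: bias persists, uncertainty grows
            gain = pred_var / (pred_var + obs_var)         # Kalman gain
            new_bias = prev_bias + gain * (observed_bias - prev_bias)
            new_var = (1.0 - gain) * pred_var
            return new_bias, new_var

        # Running example: correct the next daily O3 maximum forecast by the estimated bias.
        bias, var = 0.0, 1.0
        for raw_forecast, observation in [(92.0, 80.0), (95.0, 83.0), (90.0, 79.0)]:
            bias, var = kalman_bias_update(bias, var, raw_forecast - observation)
            print(f"estimated bias = {bias:5.2f}, corrected forecast = {raw_forecast - bias:6.2f}")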

  4. 3D Survey Techniques for the Architectural Restoration: the Case of St. Agata in Pisa

    Science.gov (United States)

    Bevilacqua, M. G.; Caroti, G.; Piemonte, A.; Ruschi, P.; Tenchini, L.

    2017-05-01

    The historical architectural heritage may be considered as the product of a complex system of interaction between several factors - cultural, socio-economic, technical, aesthetic etc. The restoration and conservation of this important heritage, therefore, necessarily requires a multidisciplinary approach, both in the preliminary phase of knowledge and in the operative one, strictly connected to the first, regarding the development of the restoration works in all their steps, from the project to the realization. The historical-critical analysis of bibliographic, archival and iconographic sources, together with the architectural survey, aims at interpreting all the events that, from the initial project to all the eventual phases of transformation, have led the monument to its current state. This is therefore a multi-temporal and multi-spatial study in which geomatics gives an innovative contribution for its capability of gathering, storing, processing, and delivering different levels of spatially referenced information. The current techniques of architectural survey, supported by specific methodological skills, are therefore not limited to a mere mathematical-geometrical description of the historical building, but are useful also for many other purposes, such as formal-linguistic analysis, interpretation of the historical phases of transformation, description of the state of degradation/conservation etc. In this interdisciplinary perspective, photogrammetry and laser scanning represent the two main techniques, as they offer the greatest potential of performing integrated surveys. In the last decades, we have witnessed the growth and development of these 3D-survey techniques as alternative or complementary tools to the traditional ones. In particular, in the field of architectural restoration, these techniques have made significant improvements not only in terms of measurement precision or reduction of time for survey operations, but also for the possibility to represent

  5. Pan-European survey on the occurrence of selected polar organic persistent pollutants in ground water.

    Science.gov (United States)

    Loos, Robert; Locoro, Giovanni; Comero, Sara; Contini, Serafino; Schwesig, David; Werres, Friedrich; Balsaa, Peter; Gans, Oliver; Weiss, Stefan; Blaha, Ludek; Bolchi, Monica; Gawlik, Bernd Manfred

    2010-07-01

    This study provides the first pan-European reconnaissance of the occurrence of polar organic persistent pollutants in European ground water. In total, 164 individual ground-water samples from 23 European Countries were collected and analysed (among others) for 59 selected organic compounds, comprising pharmaceuticals, antibiotics, pesticides (and their transformation products), perfluorinated acids (PFAs), benzotriazoles, hormones, alkylphenolics (endocrine disrupters), Caffeine, Diethyltoluamide (DEET), and Triclosan. The most relevant compounds in terms of frequency of detection and maximum concentrations detected were DEET (84%; 454 ng/L), Caffeine (83%; 189 ng/L), PFOA (66%; 39 ng/L), Atrazine (56%; 253 ng/L), Desethylatrazine (55%; 487 ng/L), 1H-Benzotriazole (53%; 1032 ng/L), Methylbenzotriazole (52%; 516 ng/L), Desethylterbutylazine (49%; 266 ng/L), PFOS (48%, 135 ng/L), Simazine (43%; 127 ng/L), Carbamazepine (42%; 390 ng/L), nonylphenoxy acetic acid (NPE(1)C) (42%; 11 microg/L), Bisphenol A (40%; 2.3 microg/L), PFHxS (35%; 19 ng/L), Terbutylazine (34%; 716 ng/L), Bentazone (32%; 11 microg/L), Propazine (32%; 25 ng/L), PFHpA (30%; 21 ng/L), 2,4-Dinitrophenol (29%; 122 ng/L), Diuron (29%; 279 ng/L), and Sulfamethoxazole (24%; 38 ng/L). The chemicals which were detected most frequently above the European ground water quality standard for pesticides of 0.1 microg/L were Chloridazon-desphenyl (26 samples), NPE(1)C (20), Bisphenol A (12), Benzotriazole (8), N,N'-Dimethylsulfamid (DMS) (8), Desethylatrazine (6), Nonylphenol (6), Chloridazon-methyldesphenyl (6), Methylbenzotriazole (5), Carbamazepine (4), and Bentazone (4). However, only 1.7% of all single analytical measurements (in total 8000) were above this threshold value of 0.1 microg/L; 7.3% were > than 10 ng/L.

  6. Application of ground-penetrating radar technique to evaluate the waterfront location in hardened concrete

    Science.gov (United States)

    Rodríguez-Abad, Isabel; Klysz, Gilles; Martínez-Sala, Rosa; Balayssac, Jean Paul; Mené-Aparicio, Jesús

    2016-12-01

    The long-term performance of concrete structures is directly tied to two factors: concrete durability and strength. When assessing the durability of concrete structures, the study of the water penetration is paramount, because almost all reactions like corrosion, alkali-silica, sulfate, etc., which produce their deterioration, require the presence of water. Ground-penetrating radar (GPR) has been shown to be very sensitive to water variations. On this basis, the objective of this experimental study is, firstly, to analyze the correlation between the water penetration depth in concrete samples and the GPR wave parameters. To do this, the samples were immersed in water for different time intervals and the wave parameters were obtained from signals registered when the antenna was placed on the immersed surface of the samples. Secondly, a procedure has been developed to be able to determine, from those signals, the reliability in the detection and location of waterfront depths. The results have revealed that GPR may have enormous potential in this field, because excellent agreements were found between the correlated variables. In addition, when comparing the waterfront depths calculated from GPR measurements and those visually registered after breaking the samples, we observed that they totally agreed when the waterfront was more than 4 cm deep.
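
    Relating a GPR reflection to a waterfront depth rests on the two-way travel time of the reflection and the wave velocity in the concrete above the front, v = c/sqrt(eps_r). A small worked sketch; the relative permittivity of dry concrete and the pick time are illustrative values, not measurements from the study:

        C = 0.3  # free-space electromagnetic wave speed, m/ns

        def reflector_depth(two_way_time_ns, rel_permittivity):
            """Depth (m) of a reflector (e.g. a waterfront) from the two-way travel time
            of its GPR reflection: v = c / sqrt(eps_r), d = v * t / 2."""
            v = C / rel_permittivity ** 0.5
            return v * two_way_time_ns / 2.0

        # Dry concrete with eps_r ~ 6: a reflection arriving 0.65 ns after the surface
        # wave corresponds to a waterfront roughly 4 cm deep.
        print(round(reflector_depth(0.65, 6.0), 3))   # ~0.040 m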

  7. Diversity surveys of soil bacterial community by cultivation-based methods and molecular fingerprinting techniques

    Institute of Scientific and Technical Information of China (English)

    LUO Hai-feng; QI Hong-yan; ZHANG Hong-xun

    2004-01-01

    By combining cultivation methods with molecular fingerprinting techniques, diversity surveys of the soil bacterial community in 13 areas of China were carried out. The cultivable heterotrophic diversity was investigated by colony morphology on solid LB medium. Genetic diversity was measured as bands on denaturing gradient gel electrophoresis (DGGE) after extraction and purification of the total soil DNA and amplification of bacterial 16S rDNA fragments by polymerase chain reaction (PCR). The Shannon-Wiener indices of diversity (H), richness (S) and evenness (EH) were employed to estimate the diversity of the soil bacterial community. The results showed that obvious diversification existed in the soils from the different areas. However, the genetic diversity estimated by PCR-DGGE can provide more comprehensive information on the bacterial community than the cultivation-based methods. Therefore, it is suggested to combine the traditional methods with genetic fingerprinting techniques to survey and estimate soil bacterial diversity.
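
    The indices used above follow directly from the band (or colony) abundance data: Shannon-Wiener diversity H = -sum(p_i ln p_i), richness S as the number of distinct bands, and evenness EH = H / ln(S). A small sketch with an invented DGGE profile:

        import math

        def diversity_indices(counts):
            """Shannon-Wiener diversity (H), richness (S) and evenness (EH) from a list
            of abundances, e.g. DGGE band intensities or colony counts per morphotype."""
            total = float(sum(counts))
            props = [c / total for c in counts if c > 0]
            H = -sum(p * math.log(p) for p in props)
            S = len(props)
            EH = H / math.log(S) if S > 1 else 0.0
            return H, S, EH

        # Illustrative DGGE profile with five bands of unequal intensity.
        print(diversity_indices([40, 25, 15, 12, 8]))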

  8. Analysis of Daytime and Nighttime Ground Level Ozone Concentrations Using Boosted Regression Tree Technique

    Directory of Open Access Journals (Sweden)

    Noor Zaitun Yahaya

    2017-01-01

    Full Text Available This paper investigated the use of boosted regression trees (BRTs) to draw an inference about daytime and nighttime ozone formation in a coastal environment. Hourly ground-level ozone data for a full calendar year in 2010 were obtained from the Kemaman (CA 002) air quality monitoring station. A BRT model was developed using hourly ozone data as a response variable and nitric oxide (NO), nitrogen dioxide (NO2), nitrogen oxides (NOx) and meteorological parameters as explanatory variables. The ozone BRT algorithm model was constructed from multiple regression models, and the 'best iteration' of the BRT model was obtained by optimizing prediction performance. Sensitivity testing of the BRT model was conducted to determine the best parameters and good explanatory variables. A number of trees between 2,500 and 3,500, a learning rate of 0.01, and an interaction depth of 5 were found to be the best settings for developing the ozone boosting model. The performance of the O3 boosting models was assessed, and the fraction of predictions within a factor of two (FAC2), coefficient of determination (R2) and index of agreement (IOA) of the models developed are 0.93, 0.69 and 0.73 for daytime and 0.79, 0.55 and 0.69 for nighttime, respectively. Results showed that the model developed was within the acceptable range and could be used to understand ozone formation and identify potential sources of ozone for estimating O3 concentrations during daytime and nighttime. Results indicated that wind speed, wind direction, relative humidity, and temperature were the most dominant variables in terms of influencing ozone formation. Finally, empirical evidence of the production of a high ozone level by wind blowing from coastal areas towards the interior region, especially from industrial areas, was obtained.
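
    The boosting setup reported above (2,500-3,500 trees, learning rate 0.01, interaction depth 5) can be sketched with a generic gradient-boosting implementation. Here scikit-learn's GradientBoostingRegressor stands in for the BRT package used in the study, with max_depth playing the role of interaction depth and synthetic data in place of the Kemaman monitoring record:

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n = 2000
        X = np.column_stack([
            rng.uniform(0, 60, n),      # NO
            rng.uniform(0, 80, n),      # NO2
            rng.uniform(0, 10, n),      # wind speed
            rng.uniform(0, 360, n),     # wind direction
            rng.uniform(40, 100, n),    # relative humidity
            rng.uniform(22, 35, n),     # temperature
        ])
        y = 60 + 2.5 * X[:, 5] - 0.4 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 5, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        brt = GradientBoostingRegressor(n_estimators=3000, learning_rate=0.01,
                                        max_depth=5, subsample=0.5, random_state=0)
        brt.fit(X_tr, y_tr)
        print("held-out R^2:", round(brt.score(X_te, y_te), 3))
        print("relative influence:", np.round(brt.feature_importances_, 3))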

  9. A Survey of XOR as a Digital Obfuscation Technique in a Corpus of Real Data

    Science.gov (United States)

    2014-01-17

    ...that have been obfuscated, most digital forensic tools and malware scanners eschew steganography detection because it is computationally expensive and the... be useful to perform a survey on the prevalence and purpose of other simple obfuscation and steganography techniques, such as rotate and ROT13... Venkata Sai Manoj. Cryptography and steganography, 2010. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.184.5413. Last accessed August 14, 2013
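
    Although the record above survives only as search-snippet fragments, the technique it surveys, single-byte XOR obfuscation, is simple to illustrate; because XOR is its own inverse, the same routine obfuscates and recovers the data, and a scanner can brute-force all 256 keys against known signatures. A minimal sketch (the header bytes and key are made up):

```python
def xor_obfuscate(data: bytes, key: int) -> bytes:
    """Single-byte XOR; applying it twice with the same key restores the input."""
    return bytes(b ^ key for b in data)

plaintext = b"MZ\x90\x00 example header"          # hypothetical file header
obfuscated = xor_obfuscate(plaintext, 0x5A)
assert xor_obfuscate(obfuscated, 0x5A) == plaintext

# A forensic tool might try every possible key and look for a known signature.
candidate_keys = [k for k in range(256)
                  if xor_obfuscate(obfuscated, k).startswith(b"MZ")]
print(candidate_keys)
```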

  10. Very-low-frequency resistivity, self-potential and ground temperature surveys on Taal volcano (Philippines): Implications for future activity

    Science.gov (United States)

    Zlotnicki, J.; Vargemezis, G.; Johnston, M. J. S.; Sasai, Y.; Reniva, P.; Alanis, P.

    2017-06-01

    Taal volcano is one of the most dangerous volcanoes in the Philippines. Thirty-three eruptions have occurred through historical time with several exhibiting cataclysmic phases. Most recent eruptions are confined to Volcano Island located within the prehistoric Taal collapse caldera that is now filled by Taal Lake. The last eruptive activity from 1965 to 1977 took place from Mt. Tabaro, about 2 km to the southwest of the Main Crater center. Since this time, episodes of seismic activity, ground deformation, gas release, surface fissuring, fumarole activity and temperature changes are recorded periodically around the main crater, but no major eruption has occurred. This period of quiescence is the third longest period without eruptive activity since 1572. In March 2010, a campaign based on Very-Low-Frequency (VLF) resistivity surveys together with repeated surveys of self-potential, ground temperature and fissure activity was intensified and the results compared to a large-scale Electrical Resistivity Tomography experiment. This work fortunately occurred before, within and after a new seismovolcanic crisis from late April 2010 to March 2011. The joint analysis of these new data, together with results from previous magnetotelluric soundings, allows a better description of the electrical resistivity and crustal structure beneath the Main Crater down to a depth of several kilometers. No indication of growth of the two geothermal areas located on both sides of the northern crater rim was apparent from 2005 to March 2010. These areas appear controlled by active fissures, opened during the 1992 and 1994 crises, that dip downward towards the core of the hydrothermal system located at about 2.5 km depth beneath the crater. Older mineralized fissures at lower elevations to the North of the geothermal areas also dip downward under the crater. Repeated self-potential and ground temperature surveys completed between 2005 and 2015 show new geothermal and hydrothermal activity in

  11. Effective multilevel teaching techniques on attending rounds: a pilot survey and systematic review of the literature.

    Science.gov (United States)

    Certain, Laura K; Guarino, A J; Greenwald, Jeffrey L

    2011-01-01

    While numerous authors acknowledge the challenge of teaching simultaneously to medical students, interns, and residents, few offer specific advice on how to meet that challenge, and none have studied which techniques are most effective. The purpose of this study was to determine whether multilevel teaching is challenging for attendings, whether trainees feel that teaching on rounds is appropriate to their level, and to define multilevel teaching techniques. We surveyed attendings and trainees on the internal medicine services at two academic medical centers. Attendings were divided about whether teaching to multiple levels posed a challenge. Trainees reported that the teaching they received was usually appropriate to their level of training. The most effective techniques for multilevel teaching were Broadening (asking "what if" questions), Targeting (directing questions at specific team members), and Novelty (teaching newly published information), while the least effective were techniques that taught advanced material unfamiliar to most or all of the team. A systematic literature review yielded no studies that focused on multilevel teaching techniques. This article is the first to define and evaluate specific techniques for multilevel instruction in a medical setting and identifies certain techniques as more effective at engaging multiple levels of learners simultaneously.

  12. The use of guided tissue regeneration techniques among endodontists: a web-based survey.

    Science.gov (United States)

    Naylor, Justin; Mines, Pete; Anderson, Alfred; Kwon, David

    2011-11-01

    The purpose of this study was to determine factors and clinical situations that influence an endodontist's decision to use guided tissue regeneration (GTR) techniques during endodontic root-end surgery. An invitation to participate in a web-based survey was e-mailed to 3,750 members of the American Association of Endodontists. Data were collected from 1,129 participants, representing a 30.1% completion rate. The number of questions varied from 3 to 11 depending on individual responses. 40.7% of respondents who perform root-end surgeries also use GTR techniques. The clinical situation in which GTR techniques are used most often is for transosseous lesions. Barrier membranes and bone replacement grafts are each used by more than 85% of respondents using GTR techniques. Insufficient training and insufficient evidence in support of its use were selected as the predominant reasons for not using GTR techniques at 42.4% and 32%, respectively. Although over 40% of respondents are currently using GTR techniques in conjunction with their root-end surgeries, a majority of those who do not use GTR indicated they would consider using these techniques with better evidence and available training. Published by Elsevier Inc.

  13. [Assessment of cancer RCP meetings in Rhône-Alpes: a survey on the ground].

    Science.gov (United States)

    Descotes, J-L; Guillem, P; Bondil, P; Colombel, M; Chabloz, C

    2010-10-01

    The results of a local survey sent to urologists, oncologists and radiotherapists working in Rhône-Alpes are reported in order to assess the value of multidisciplinary oncological meetings (RCP) in Urology. The results of this short study have been analyzed and compared with the national results published in the Inspection Générale des Affaires Sociales report. Meanwhile, we have created a professional electronic directory collecting all RCP of Rhône-Alpes, which will be accessible soon. Copyright © 2010. Published by Elsevier Masson SAS.

  14. Integrated inversion of ground deformation and magnetic data at Etna volcano using a genetic algorithm technique

    Directory of Open Access Journals (Sweden)

    G. Ganci

    2007-06-01

    Full Text Available Geodetic and magnetic investigations have been playing an increasingly important role in studies of Mt. Etna eruptive processes. During ascent, magma interacts with surrounding rocks and fluids, and inevitably produces crustal deformation and disturbances in the local magnetic field. These effects are generally interpreted separately from each other, and the consistency of interpretations obtained from different methods is only checked qualitatively a posteriori. In order to make the estimation of source parameters more robust, we propose an integrated inversion of deformation and magnetic data that leads to the best possible understanding of the underlying geophysical process. The inversion problem was formulated following a global optimization approach based on the use of genetic algorithms. The proposed inversion technique was applied to field data sets recorded during the onset of the 2002-2003 Etna flank eruption. The deformation pattern and the magnetic anomalies were consistent with a piezomagnetic effect caused by a dyke intrusion propagating along the NE direction.
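
    The integrated inversion amounts to minimizing a combined deformation-plus-magnetic misfit over the source parameters with a genetic algorithm. The sketch below is only a generic illustration of that idea, with toy forward models and arbitrary GA settings standing in for the authors' dyke-deformation and piezomagnetic codes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins for the forward models; the real ones would be dyke deformation
# and piezomagnetic codes. Parameters p = (source depth, opening), both arbitrary.
x_obs = np.linspace(-5.0, 5.0, 30)

def forward_deformation(p):
    return p[1] / (1.0 + (x_obs / p[0]) ** 2)

def forward_magnetic(p):
    return 0.3 * p[1] * x_obs / (1.0 + (x_obs / p[0]) ** 2) ** 2

true_p = np.array([2.0, 1.5])
obs_def = forward_deformation(true_p)
obs_mag = forward_magnetic(true_p)

def joint_misfit(p):
    """Combined deformation + magnetic residual to be minimized."""
    return (np.sum((forward_deformation(p) - obs_def) ** 2)
            + np.sum((forward_magnetic(p) - obs_mag) ** 2))

def genetic_search(objective, bounds, pop_size=40, generations=150, mut=0.05):
    lo, hi = np.array(bounds, float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        order = np.argsort([objective(ind) for ind in pop])
        parents = pop[order[: pop_size // 2]]                   # selection: keep best half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(len(bounds)) < 0.5                # uniform crossover
            child = np.where(mask, a, b)
            child = child + rng.normal(0.0, mut * (hi - lo))    # mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, children])
    return min(pop, key=objective)

print(genetic_search(joint_misfit, bounds=[(0.5, 5.0), (0.1, 5.0)]))
```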

  15. Ground Truth and Application for the Anisotropic Receiver Functions Technique - Test site KTB: the installation campaign

    Science.gov (United States)

    Bianchi, Irene; Anselmi, Mario; Apoloner, Maria-Theresia; Qorbani, Ehsan; Gribovszki, Katalin; Bokelmann, Götz

    2015-04-01

    The project at hand is a field test around the KTB (Kontinentale Tiefbohrung) site in the Oberpfalz, Southeastern Germany, at the northwestern edge of the Bohemian Massif. The region has been extensively studied through the analysis of several seismic reflection lines deployed around the drilling site. The deep borehole was placed in gneiss rocks of the Zone Erbendorf-Vohenstrauss. Drilling activity lasted from 1987 to 1994, and the borehole reaches a depth of 9,101 meters. In our experiment, we aim to recover structural information as well as anisotropy of the upper crust using the receiver function technique. The retrieved information will form the basis for comparing the amount and orientation of the resulting anisotropy with information from rock samples from up to 9 km depth, and with earlier high-frequency seismic experiments around the drill site. For that purpose, we installed 9 seismic stations and recorded seismicity continuously for two years.

  16. Identification Of Ground Water Potential Zones In Tamil Nadu By Remote Sensing And GIS Technique

    Directory of Open Access Journals (Sweden)

    T. Subramani

    2014-12-01

    Full Text Available A case study was conducted to identify the groundwater potential zones in the Salem, Erode and Namakkal districts, Tamil Nadu, India, over an areal extent of 360.60 km2. Thematic maps of geology, geomorphology, soil hydrological group, land use / land cover and drainage were prepared for the study area. The Digital Elevation Model (DEM) was generated from 10 m interval contour lines (derived from the SOI toposheet at 1:25,000 scale) and the slope (%) of the study area was obtained from it. The groundwater potential zones were obtained by overlaying all the thematic maps using the weighted overlay method with the spatial analysis tool in ArcGIS 9.3. During the weighted overlay analysis, a rank was given to each individual parameter of each thematic map and weights were assigned according to influence: soil 25%, geomorphology 25%, land use/land cover 25%, slope 15%, lineament 5% and drainage/streams 5%. The potential zones were classified as good, moderate and poor, with areas of 49.70 km2, 261.61 km2 and 46.04 km2, respectively. The potential zone map was overlaid with the village boundary map to obtain village-wise groundwater potential zones in the three categories good, moderate and poor. This GIS-based output was validated by a field survey of randomly selected wells in different villages using GPS instruments. The coordinates of each well location were obtained by GPS and plotted in the GIS platform, and the well locations coincided with the classified zones.
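
    The weighted overlay itself is a cell-by-cell weighted sum of the reclassified thematic rasters using the stated influences (soil 25%, geomorphology 25%, land use/land cover 25%, slope 15%, lineament 5%, drainage 5%), followed by classification into potential zones; the tiny rasters and class breaks below are synthetic stand-ins, not the study's data.

```python
import numpy as np

# Reclassified thematic layers (rank 1 = poor ... 5 = good); synthetic 3x3 stand-ins.
layers = {
    "soil":          np.array([[3, 4, 5], [2, 3, 4], [1, 2, 3]]),
    "geomorphology": np.array([[4, 4, 5], [3, 3, 4], [2, 2, 3]]),
    "landuse":       np.array([[5, 4, 3], [4, 3, 2], [3, 2, 1]]),
    "slope":         np.array([[5, 5, 4], [4, 4, 3], [3, 3, 2]]),
    "lineament":     np.array([[1, 2, 3], [2, 3, 4], [3, 4, 5]]),
    "drainage":      np.array([[2, 3, 3], [3, 3, 4], [3, 4, 4]]),
}
weights = {"soil": 0.25, "geomorphology": 0.25, "landuse": 0.25,
           "slope": 0.15, "lineament": 0.05, "drainage": 0.05}

# Cell-by-cell weighted sum of the ranked layers.
overlay = sum(weights[name] * layers[name].astype(float) for name in layers)

# Classify the composite score into poor / moderate / good zones (breaks are illustrative).
zones = np.digitize(overlay, bins=[2.5, 3.5])   # 0 = poor, 1 = moderate, 2 = good
print(overlay.round(2))
print(zones)
```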

  17. Optimal spatial sampling techniques for ground truth data in microwave remote sensing of soil moisture

    Science.gov (United States)

    Rao, R. G. S.; Ulaby, F. T.

    1977-01-01

    The paper examines optimal sampling techniques for obtaining accurate spatial averages of soil moisture, at various depths and for cell sizes in the range 2.5-40 acres, with a minimum number of samples. Both simple random sampling and stratified sampling procedures are used to reach a set of recommended sample sizes for each depth and cell size. The major conclusions from the statistical sampling tests are that (1) the number of samples required decreases with increasing depth; (2) when the total number of samples cannot be prespecified or the moisture in only one single layer is of interest, a simple random sampling procedure should be used, based on the observed mean and SD for data from a single field; (3) when the total number of samples can be prespecified and the objective is to measure the soil moisture profile with depth, stratified random sampling based on optimal allocation should be used; and (4) decreasing the sensor resolution cell size leads to fairly large decreases in sample sizes with stratified sampling procedures, whereas only a moderate decrease is obtained with simple random sampling procedures.
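
    Conclusions (2) and (3) correspond to the textbook sample-size formulas: a simple random sample sized from a field's mean and standard deviation, and stratified sampling with optimal (Neyman) allocation, n_h proportional to N_h * sigma_h. The soil-moisture numbers below are invented purely for illustration.

```python
import math

def srs_sample_size(std_dev, margin, z=1.96):
    """Simple random sample size for a target margin of error (large population)."""
    return math.ceil((z * std_dev / margin) ** 2)

def neyman_allocation(strata, n_total):
    """Optimal allocation: n_h proportional to N_h * sigma_h."""
    weights = [n_units * sigma for n_units, sigma in strata]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# Hypothetical strata within one resolution cell: (number of grid points, moisture SD in %).
strata = [(120, 2.0), (60, 4.5), (20, 7.0)]
print(srs_sample_size(std_dev=4.0, margin=1.0))   # samples for a +/- 1 % moisture margin
print(neyman_allocation(strata, n_total=30))      # per-stratum sample sizes
```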

  18. A Study on Preference of Interface Design Techniques for Web Survey

    Directory of Open Access Journals (Sweden)

    Settapong Malisuwan

    2012-09-01

    Full Text Available With the advancement of the internet and web-based applications, surveys via the internet have been increasingly utilized because they are convenient and save time. This article studied the influence of five web-design techniques - screen design, response format, logo type, progress indicator, and image display - on the interest of respondents. Two screen display designs for each design technique were made for selection. Focus group discussions were conducted with four groups of Generation Y participants with different characteristics. Open discussion was used to identify additional design factors that affect interest in the questionnaire. The study found that the design factors can be ranked by degree of influence as screen design, response format, font type, logo type, background color, progress indicator, and image display, respectively.

  19. Sub-Threshold Leakage Current Reduction Techniques In VLSI Circuits -A Survey

    Directory of Open Access Journals (Sweden)

    V.Sri Sai Harsha

    2015-09-01

    Full Text Available There is an increasing demand for battery-powered portable devices, which has led semiconductor manufacturers to scale down feature sizes; this reduces the threshold voltage and enables complex functionality on a single chip. With feature-size scaling, dynamic power dissipation is little affected, but static power dissipation has become equal to or greater than the dynamic power dissipation. In recent CMOS technologies, static power dissipation, i.e. power dissipation due to leakage current, has therefore become a challenging area for VLSI chip designers. In order to prolong battery life and maintain circuit reliability, leakage current reduction is the primary goal. This paper gives a basic overview of the techniques used for reducing sub-threshold leakage. Based on the surveyed techniques, one can choose the required and most appropriate leakage reduction technique.
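
    For context, the sub-threshold leakage these techniques target grows exponentially as the threshold voltage is scaled; a commonly used first-order model (the standard textbook expression, not a formula from the paper) is

    $$ I_{\mathrm{sub}} = I_0 \, e^{\frac{V_{GS} - V_{th}}{n V_T}} \left( 1 - e^{-\frac{V_{DS}}{V_T}} \right), $$

    where $V_T = kT/q$ is the thermal voltage and $n$ the sub-threshold slope factor; this exponential dependence on $V_{th}$ is why the surveyed techniques typically raise the effective threshold voltage of idle logic.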

  20. A Survey on Steganography Techniques in Real Time Audio Signals and Evaluation

    Directory of Open Access Journals (Sweden)

    Abdulaleem Z. Al-Othmani

    2012-01-01

    Full Text Available Steganography has proven to be one of the practical ways of securing data. It is a form of covert communication used mainly to hide secret data inside other innocent digital media. Most existing steganographic techniques use digital multimedia files as cover media for hiding secret data. Audio files and signals make appropriate media for steganography due to their high data transmission rate and high level of redundancy. Hiding data in real-time communication audio signals is not a simple task: steganography requirements as well as real-time communication requirements must be met in order to construct a useful data hiding application. In this paper we survey the general principles of hiding secret information using audio technology and provide an overview of current functions and techniques. These techniques are then evaluated against both steganography and real-time communication requirements.
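
    One of the elementary techniques such surveys cover is least-significant-bit (LSB) embedding in PCM audio samples. The sketch below is a generic illustration of that idea only; it ignores the real-time constraints emphasized above, and the cover signal is random noise rather than real audio.

```python
import numpy as np

def embed_lsb(samples: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide payload bits in the least significant bit of 16-bit PCM samples."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    if bits.size > samples.size:
        raise ValueError("cover signal too short for payload")
    stego = samples.copy()
    stego[: bits.size] = (stego[: bits.size] & ~1) | bits
    return stego

def extract_lsb(samples: np.ndarray, n_bytes: int) -> bytes:
    bits = (samples[: n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(0).integers(-2**15, 2**15, 4000).astype(np.int16)
stego = embed_lsb(cover, b"secret")
print(extract_lsb(stego, 6))            # -> b'secret'
```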

  1. A Survey On Data Mining Techniques In Customer Churn Analysis For Telecom Industry

    Directory of Open Access Journals (Sweden)

    Amal M. Almana

    2014-05-01

    Full Text Available Customer churn prediction in the telecom industry has been a core research topic in recent years. A huge amount of data is generated in the telecom industry every minute and, in parallel, data mining techniques have developed considerably. Customer churn has emerged as one of the major issues in the telecom industry. Telecom research indicates that it is more expensive to gain a new customer than to retain an existing one. In order to retain existing customers, telecom providers need to know the reasons for churn, which can be realized through the knowledge extracted from telecom data. This paper surveys the data mining techniques commonly used to identify customer churn patterns, reviews the recent literature on predictive data mining techniques for customer churn behavior, and offers a discussion of future research directions.

  2. Ground-based infrared surveys: imaging the thermal fields at volcanoes and revealing the controlling parameters.

    Science.gov (United States)

    Pantaleo, Michele; Walter, Thomas

    2013-04-01

    Temperature monitoring is a widespread procedure in the frame of volcano hazard monitoring, as temperature changes are expected to reflect changes in volcanic activity. We propose a new approach within thermal monitoring, which is meant to shed light on the parameters controlling the fluid pathways and the fumarole sites, by using infrared measurements. Ground-based infrared cameras allow one to remotely image the spatial distribution, geometric pattern and amplitude of fumarole fields on volcanoes at metre to centimetre resolution. Infrared mosaics and time series are generated and interpreted, by integrating geological field observations and modeling, to define the setting of the volcanic degassing system at shallow level. We present results for different volcano morphologies and show that lithology, structures and topography control the appearance of fumarole fields by creating permeability contrasts. We also show that the relative importance of those parameters is site-dependent. Deciphering the setting of the degassing system is essential for hazard assessment studies because it improves our understanding of how the system responds to endogenous or exogenous modification.

  3. Occurrence of selected radionuclides in ground water used for drinking water in the United States; a reconnaissance survey, 1998

    Science.gov (United States)

    Focazio, Michael J.; Szabo, Zoltan; Kraemer, Thomas F.; Mullin, Ann H.; Barringer, Thomas H.; dePaul, Vincent T.

    2001-01-01

    The U.S. Geological Survey, in collaboration with the U.S. Environmental Protection Agency, the American Water Works Association, and the American Water Works Service Company, completed a targeted national reconnaissance survey of selected radionuclides in public ground-water supplies. Radionuclides analyzed included radium-224 (Ra-224), radium-226 (Ra-226), radium-228 (Ra-228), polonium-210 (Po-210) and lead-210 (Pb-210).This U.S. Geological Survey reconnaissance survey focused intentionally on areas with known or suspected elevated concentrations of radium in ground water to determine if Ra-224 was also present in the areas where other isotopes of radium had previously been detected and to determine the co-occurrence characteristics of the three radium isotopes (Ra-224, Ra-226, and Ra-228) in those areas. Ninety-nine raw-water samples (before water treatment) were collected once over a 6-month period in 1998 and 1999 from wells (94 of which are used for public drinking water) in 27 States and 8 physiographic provinces. Twenty-one of the 99 samples exceeded the current U.S. Environmental Protection Agency drinking-water maximum contaminant level of 5 picocuries per liter (pCi/L) for combined radium (Ra-226 + Ra-228). Concentrations of Ra-224 were reported to exceed 1 pCi/L in 30 percent of the samples collected, with a maximum concentration of 73.6 pCi/L measured in water from a nontransient, noncommunity, public-supply well in Maryland. Radium-224 concentrations generally were higher than those of the other isotopes of radium. About 5 percent of the samples contained concentrations of Ra-224 greater than 10 pCi/L, whereas only 2 percent exceeded 10 pCi/L for either Ra-226 or Ra-228. Concentrations of Ra-226 greater than 1 pCi/L were reported in 33 percent of the samples, with a maximum concentration of 16.9 pCi/L measured in water from a public-supply well in Iowa. Concentrations of Ra-228 greater than 1 pCi/L were reported in 22 samples, with a maximum

  4. Forsmark site investigation. Detailed ground magnetic survey and lineament interpretation in the Forsmark area, 2006-2007

    Energy Technology Data Exchange (ETDEWEB)

    Isaksson, Hans; Thunehed, Hans; Pitkaenen, Timo; Keisu, Mikael (GeoVista AB, Luleaa (SE))

    2007-12-15

    The report presents detailed ground magnetic measurements carried out on an 11.1 km2 area in the Forsmark site investigation area. The main objective of this activity is to determine a detailed ground magnetic representation of the bedrock. The results from previous surveys were encouraging and have led to a broad geophysical programme for investigation of lineaments at Forsmark. This report comprises the results from the second and final phase of the extended survey programme and a compilation and summary of all phases in the programme. On ground and on lake ice, a grid with parallel lines was staked. Measurements of the magnetic total field were carried out along profiles, perpendicular to the staked lines, with a profile spacing of 10 m and a station spacing of 5 m. Measurements on the ice-covered sea bays were carried out by a two man crew. One crew member walked along the survey lines, carrying a RTK-GPS guiding the other crew member who measured the magnetic total field. No seaborne survey was carried out in the final phase. Previously, using a high accuracy RTK-GPS unit for boat navigation gave a seaborne survey grid of on average 10 m line spacing and 2-3 m station spacing. In total 427,238 magnetic survey stations have been measured and an area of 4.7 km2 has been surveyed from boat. The magnetic pattern in the survey area can be divided into six main areas with different magnetic character. Along the southwest margin of the survey area the magnetic pattern is intensely banded with rapidly changing low and highly magnetic bands striking southeast-northwest, which to the northeast changes to a gentler, banded pattern of low to moderate magnetic intensity. To the northeast, at the SFR office and along the coastline to the southeast, the pattern is again intensely banded with, southeast-northwest trending, rapidly changing low and highly magnetic bands. These two banded structures probably forms fold limbs of a common fold with a northwest oriented fold axis

  6. Using Intelligent Techniques in Construction Project Cost Estimation: 10-Year Survey

    Directory of Open Access Journals (Sweden)

    Abdelrahman Osman Elfaki

    2014-01-01

    Full Text Available Cost estimation is the most important preliminary process in any construction project. Therefore, construction cost estimation has the lion's share of the research effort in construction management. In this paper, we have analysed and studied proposals for construction cost estimation from the last 10 years. To carry out this survey, we proposed and applied a methodology that consists of two parts. The first part concerns data collection, for which we chose specific journals as sources for the surveyed proposals. The second part concerns the analysis of the proposals. To analyse each proposal, the following four questions were set: Which intelligent technique is used? How were data collected? How are the results validated? And which construction cost estimation factors were used? From the results of this survey, two main contributions have been produced. The first is the identification of the research gap in this area, which has not been fully covered by previous proposals for construction cost estimation. The second is the proposal and highlighting of future directions for forthcoming work, aimed ultimately at finding the optimal approach to construction cost estimation. Moreover, we consider the second part of our methodology to be one of the contributions of this paper, as it is proposed as a standard benchmark for construction cost estimation proposals.

  7. Chest physiotherapy techniques in neurological intensive care units of India: A survey.

    Science.gov (United States)

    Bhat, Anup; Chakravarthy, Kalyana; Rao, Bhamini K

    2014-06-01

    Neurological intensive care units (ICUs) are a rapidly developing sub-specialty of neurosciences. Chest physiotherapy techniques are of great value in neurological ICUs in preventing, halting, or reversing the impairments caused by the neurological disorder and the ICU stay. However, chest physiotherapy techniques should be modified to a greater extent in the neurological ICU as compared with general ICUs. The aim of this study is to obtain data on current chest physiotherapy practices in neurological ICUs of India. The study was designed as a cross-sectional survey based at a tertiary care hospital in Karnataka, India. A questionnaire was formulated and content validated to assess the current chest physiotherapy practices in neurological ICUs of India. The questionnaire was constructed online and a link was distributed via E-mail to 185 physiotherapists working in neurological ICUs across India. Descriptive statistics were used. The response rate was 44.3% (n = 82); 31% of the physiotherapists were specialized in cardiorespiratory physiotherapy and 30% were specialized in neurological physiotherapy. Clapping, vibration, postural drainage, aerosol therapy, humidification, and suctioning were the airway clearance (AC) techniques most commonly used by physiotherapists. However, devices for AC techniques such as Flutter, Acapella, and standard positive expiratory pressure devices were used less frequently for AC. Techniques such as autogenic drainage and the active cycle of breathing technique are also frequently used when appropriate for the patients. Lung expansion therapy techniques such as breathing exercises, incentive spirometry exercises, positioning, and proprioceptive neuromuscular facilitation of breathing are used by the majority of physiotherapists. Physiotherapists in this study were using conventional chest physiotherapy techniques more frequently in comparison to the devices available for AC.

  8. Chest physiotherapy techniques in neurological intensive care units of India: A survey

    Science.gov (United States)

    Bhat, Anup; Chakravarthy, Kalyana; Rao, Bhamini K.

    2014-01-01

    Context: Neurological intensive care units (ICUs) are a rapidly developing sub-specialty of neurosciences. Chest physiotherapy techniques are of great value in neurological ICUs in preventing, halting, or reversing the impairments caused by the neurological disorder and the ICU stay. However, chest physiotherapy techniques should be modified to a greater extent in the neurological ICU as compared with general ICUs. Aim: The aim of this study is to obtain data on current chest physiotherapy practices in neurological ICUs of India. Settings and Design: A tertiary care hospital in Karnataka, India, and cross-sectional survey. Subjects and Methods: A questionnaire was formulated and content validated to assess the current chest physiotherapy practices in neurological ICUs of India. The questionnaire was constructed online and a link was distributed via E-mail to 185 physiotherapists working in neurological ICUs across India. Statistical Analysis Used: Descriptive statistics. Results: The response rate was 44.3% (n = 82); 31% of the physiotherapists were specialized in cardiorespiratory physiotherapy and 30% were specialized in neurological physiotherapy. Clapping, vibration, postural drainage, aerosol therapy, humidification, and suctioning were the airway clearance (AC) techniques most commonly used by physiotherapists. However, devices for AC techniques such as Flutter, Acapella, and standard positive expiratory pressure devices were used less frequently for AC. Techniques such as autogenic drainage and the active cycle of breathing technique are also frequently used when appropriate for the patients. Lung expansion therapy techniques such as breathing exercises, incentive spirometry exercises, positioning, and proprioceptive neuromuscular facilitation of breathing are used by the majority of physiotherapists. Conclusions: Physiotherapists in this study were using conventional chest physiotherapy techniques more frequently in comparison to the devices available for

  9. Chest physiotherapy techniques in neurological intensive care units of India: A survey

    Directory of Open Access Journals (Sweden)

    Anup Bhat

    2014-01-01

    Full Text Available Context: Neurological intensive care units (ICUs) are a rapidly developing sub-specialty of neurosciences. Chest physiotherapy techniques are of great value in neurological ICUs in preventing, halting, or reversing the impairments caused by the neurological disorder and the ICU stay. However, chest physiotherapy techniques should be modified to a greater extent in the neurological ICU as compared with general ICUs. Aim: The aim of this study is to obtain data on current chest physiotherapy practices in neurological ICUs of India. Settings and Design: A tertiary care hospital in Karnataka, India, and cross-sectional survey. Subjects and Methods: A questionnaire was formulated and content validated to assess the current chest physiotherapy practices in neurological ICUs of India. The questionnaire was constructed online and a link was distributed via E-mail to 185 physiotherapists working in neurological ICUs across India. Statistical Analysis Used: Descriptive statistics. Results: The response rate was 44.3% (n = 82); 31% of the physiotherapists were specialized in cardiorespiratory physiotherapy and 30% were specialized in neurological physiotherapy. Clapping, vibration, postural drainage, aerosol therapy, humidification, and suctioning were the airway clearance (AC) techniques most commonly used by physiotherapists. However, devices for AC techniques such as Flutter, Acapella, and standard positive expiratory pressure devices were used less frequently for AC. Techniques such as autogenic drainage and the active cycle of breathing technique are also frequently used when appropriate for the patients. Lung expansion therapy techniques such as breathing exercises, incentive spirometry exercises, positioning, and proprioceptive neuromuscular facilitation of breathing are used by the majority of physiotherapists. Conclusions: Physiotherapists in this study were using conventional chest physiotherapy techniques more frequently in comparison to the

  10. The Gaia spectrophotometric standard stars survey -II. Instrumental effects of six ground-based observing campaigns

    CERN Document Server

    Altavilla, G; Pancino, E; Galleti, S; Ragaini, S; Bellazzini, M; Cocozza, G; Bragaglia, A; Carrasco, J M; Castro, A; Di Fabrizio, L; Federici, L; Figueras, F; Gebran, M; Jordi, C; Masana, E; Schuster, W; Valentini, G; Voss, H

    2015-01-01

    The Gaia SpectroPhotometric Standard Stars (SPSS) survey started in 2006; it was awarded almost 450 observing nights and accumulated almost 100,000 raw data frames, with both photometric and spectroscopic observations. Such a large observational effort requires careful, homogeneous, and automated data reduction and quality control procedures. In this paper, we quantitatively evaluate instrumental effects that might have a significant (i.e., $\\geq$1%) impact on the Gaia SPSS flux calibration. The measurements involve six different instruments, monitored over the eight years of observations dedicated to the Gaia flux standards campaigns: DOLORES@TNG in La Palma, EFOSC2@NTT and ROSS@REM in La Silla, CAFOS@2.2m in Calar Alto, BFOSC@Cassini in Loiano, and LaRuca@1.5m in San Pedro Martir. We examine and quantitatively evaluate the following effects: CCD linearity and shutter times, calibration frames stability, lamp flexures, second order contamination, light polarization, and fringing. We present methods to correct ...

  11. Ground Deformation during Papandayan Volcano 2002 Eruption as Detected by GPS Surveys

    Directory of Open Access Journals (Sweden)

    Hasanuddin Z. Abidin

    2003-05-01

    Full Text Available Papandayan is an A-type active volcano located in the southern part of Garut Regency, about 70 km southeast of Bandung, Indonesia. Its earliest recorded eruption, which was also its most violent and devastating outburst, occurred in 1772; the latest eruptions occurred in the period from 11 November to 8 December 2002 and consisted of phreatic, phreatomagmatic and magmatic types of eruption. During the latest eruption period, GPS surveys were conducted at several points inside and around the crater in a radial mode, using a reference point located at the Papandayan observatory around 10 km from the crater. At the points closest to the erupting craters, GPS displacements of up to a few dm were detected, whereas at the points outside the crater the displacements were at the cm level. The magnitude of the displacements observed at each point also shows a temporal variation according to the eruption characteristics. The results show that deformation during the eruption tends to be local, i.e. just around the crater. The pressure source is difficult to model properly from the GPS results, owing to the limited GPS data available and to differences in topography, geological structure and/or rheology at each GPS station.

  12. A Ground-Based Mid-Infrared Imaging Survey of Embedded Young Stellar Objects in the Rho Ophiuchi Cloud Core

    Science.gov (United States)

    Barsony, M.; Ressler, M. E.; Marsh, K. A.

    2004-12-01

    Results of a comprehensive, new, ground-based mid-infrared imaging survey of the young stellar population of the ρ Ophiuchi cloud are presented. Data were acquired at the Palomar 5-m and at the Keck 10-m telescopes with the MIRLIN and LWS instruments, at 0.5'' and 0.25'' resolutions, respectively. Of 172 survey objects, 85 were detected. A plot of the frequency distribution of the detected objects with SED spectral slope shows that YSOs spend ~3 × 10^5 yr in the Flat Spectrum phase, clearing out their remnant infall envelopes. Mid-infrared variability is found among a significant fraction of the surveyed objects and is found to occur for all SED classes with optically thick disks. Large amplitude near-infrared variability, also found for all SED classes with optically thick disks, seems to occur with somewhat higher frequency at the earlier evolutionary stages. The highly variable value of K-band veiling that a single source can exhibit in any of the SED classes in which active disk accretion can take place is striking, and is direct observational evidence for highly time-variable accretion activity in disks. Finally, by comparing mid-infrared vs. near-infrared excesses in a subsample with well-determined effective temperatures and extinction values, disk clearing mechanisms are explored. Financial support for this project through NSF grants AST 00-96087 (CAREER), AST 97-53229 (POWRE), and AST 02-06146 is gratefully acknowledged. MB further thanks the NASA/ASEE Summer Faculty Fellowship program at JPL, that made this work possible.

  13. Dynamical interstellar medium with Gaia and ground-based massive spectroscopic stellar surveys

    CERN Document Server

    Zwitter, Tomaž

    2015-01-01

    The ongoing Gaia mission of ESA will provide accurate spatial and kinematical information for a large fraction of stars in the Galaxy. Interstellar extinction and line absorption studies toward a large number of stars at different distances and directions can give a 3-dimensional distribution map of interstellar absorbers, and thus reach a similar spatial perfection. Under certain morphologies (e.g. geometrically thin absorption curtains) one can infer a complete velocity vector from its radial velocity component and so obtain dynamical information comparable to that of stars. But observations of a large number of stars at different distances are needed to determine the location of the absorption pockets. Therefore, techniques to measure interstellar absorptions towards (abundant) cool stars are needed. A complex mix of colliding absorption clouds is found in the Galactic plane. Thus, one would wish to start with deep observations to detect the weak, but simpler, interstellar absorptions at high Galactic latitudes. ...

  14. Nurse Practitioners' Use of Communication Techniques: Results of a Maryland Oral Health Literacy Survey.

    Directory of Open Access Journals (Sweden)

    Laura W Koo

    Full Text Available We examined nurse practitioners' use and opinions of recommended communication techniques for the promotion of oral health as part of a Maryland state-wide oral health literacy assessment. Use of recommended health-literate and patient-centered communication techniques has demonstrated improved health outcomes. A 27-item self-report survey, containing 17 communication technique items across 5 domains, was mailed to 1,410 licensed nurse practitioners (NPs) in Maryland in 2010. Use of communication techniques and opinions about their effectiveness were analyzed using descriptive statistics. General linear models explored provider and practice characteristics to predict differences in the total number and the mean number of communication techniques routinely used in a week. More than 80% of NPs (N = 194) routinely used 3 of the 7 basic communication techniques: simple language, limiting teaching to 2-3 concepts, and speaking slowly. More than 75% of respondents believed that 6 of the 7 basic communication techniques are effective. Sociodemographic provider characteristics and practice characteristics were not significant predictors of the mean number or the total number of communication techniques routinely used by NPs in a week. Potential predictors for using more of the 7 basic communication techniques, each demonstrating significance in one general linear model, were: assessing the office for user-friendliness and ever taking a communication course in addition to nursing school. NPs in Maryland self-reported routinely using some recommended health-literate communication techniques, and believed in their effectiveness. Our findings suggest that NPs who had assessed the office for patient-friendliness or who had taken a communication course beyond their initial education may use more of the 7 basic communication techniques. These self-reported findings should be validated with observational studies. Graduate and continuing

  15. [Abortion in Brazil: a household survey using the ballot box technique].

    Science.gov (United States)

    Diniz, Debora; Medeiros, Marcelo

    2010-06-01

    This study presents the first results of the National Abortion Survey (PNA, Pesquisa Nacional de Aborto), a household random-sample survey fielded in 2010 covering urban women in Brazil aged 18 to 39 years. The PNA combined two techniques: interviewer-administered questionnaires and self-administered ballot-box questionnaires. The results of the PNA show that by the end of their reproductive life one in five women has had an abortion, with abortions being more frequent at the main reproductive ages, that is, from 18 to 29 years old. No relevant differences were observed in the practice of abortion among religious groups, but abortion was found to be more common among women with lower education. Medical drugs were used to induce abortion in half of the cases, and post-abortion hospitalization was reported by approximately half of the women who had aborted. These results lead to the conclusion that abortion is a priority in the Brazilian public health agenda.

  16. A Survey of Soft-Error Mitigation Techniques for Non-Volatile Memories

    Directory of Open Access Journals (Sweden)

    Sparsh Mittal

    2017-02-01

    Full Text Available Non-volatile memories (NVMs) offer superior density and energy characteristics compared to conventional memories; however, NVMs suffer from severe reliability issues that can easily eclipse their energy efficiency advantages. In this paper, we survey architectural techniques for improving the soft-error reliability of NVMs, specifically PCM (phase change memory) and STT-RAM (spin transfer torque RAM). We focus on soft errors, such as resistance drift and write disturbance in PCM, and read disturbance and write failures in STT-RAM. By classifying the research works based on key parameters, we highlight their similarities and distinctions. We hope that this survey will underline the crucial importance of addressing NVM reliability for ensuring their system integration and will be useful for researchers, computer architects and processor designers.

  17. Minimum detectable concentration as a function of gamma walkover survey technique.

    Science.gov (United States)

    King, David A; Altic, Nickolas; Greer, Colt

    2012-02-01

    Gamma walkover surveys are often performed by swinging the radiation detector (e.g., a 2-inch by 2-inch sodium iodide) in a serpentine pattern at a near constant height above the ground surface. The objective is to survey an approximate 1-m swath with 100% coverage producing an equal probability of detecting contamination at any point along the swing. In reality, however, the detector height will vary slightly along the swing path, and in some cases the detector may follow a pendulum-like motion significantly reducing the detector response and increasing the minimum detectable concentration. This paper quantifies relative detector responses for fixed and variable height swing patterns and demonstrates negative impacts on the minimum detectable concentration. Minimum detectable concentrations are calculated for multiple contaminated surface areas (0.1, 1.0, 3, 10, and 30 m2), multiple contaminants (60Co, 137Cs, 241Am, and 226Ra), and two minimum heights (5 and 10 cm). Exposure rate estimates used in minimum detectable concentration calculations are produced using MicroShield™ v.7.02 (Grove Software, Inc., 4925 Boonsboro Road #257, Lynchberg, VA 24503) and MDCs are calculated as outlined in NUREG-1575. Results confirm a pendulum-like detector motion can significantly increase MDCs relative to a low flat trajectory, especially for small areas of elevated activity--up to a 47% difference is observed under worst-modeled conditions.
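
    The MDC chain sketched below follows the general MARSSIM/NUREG-1575 scan-MDC logic referenced in the abstract: a minimum detectable count rate derived from background counts in one observation interval, corrected for surveyor efficiency, then converted to a concentration through detector- and source-specific factors. All numeric values are assumptions for illustration, not inputs or results from the paper, whose exposure-rate factors came from MicroShield modeling.

```python
import math

def scan_mdc(bkg_cpm, interval_s=1.0, d_prime=1.38, surveyor_eff=0.5,
             cpm_per_uR_h=900.0, uR_h_per_pCi_g=0.01):
    """Rough scan-MDC chain in the spirit of MARSSIM / NUREG-1575.
    Every default here is an illustrative assumption, not a value from the paper."""
    b_i = bkg_cpm * interval_s / 60.0                    # background counts in one observation interval
    mdcr = d_prime * math.sqrt(b_i) * 60.0 / interval_s  # minimum detectable count rate, cpm
    mdcr_surveyor = mdcr / math.sqrt(surveyor_eff)       # account for surveyor efficiency
    mder = mdcr_surveyor / cpm_per_uR_h                  # minimum detectable exposure rate, uR/h
    return mder / uR_h_per_pCi_g                         # concentration producing that exposure rate, pCi/g

print(round(scan_mdc(bkg_cpm=10000), 1), "pCi/g (illustrative only)")
```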

  18. A Survey of Partition-Based Techniques for Copy-Move Forgery Detection

    Directory of Open Access Journals (Sweden)

    Wandji Nanda Nathalie Diane

    2014-01-01

    Full Text Available A copy-move forged image results from a specific type of image tampering procedure carried out by copying a part of an image and pasting it on one or more parts of the same image generally to maliciously hide unwanted objects/regions or clone an object. Therefore, detecting such forgeries mainly consists in devising ways of exposing identical or relatively similar areas in images. This survey attempts to cover existing partition-based copy-move forgery detection techniques.
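
    A bare-bones partition-based detector of the kind covered by such surveys slides fixed-size blocks over the image, indexes them (here by their exact pixel content), and flags blocks that reappear at a different position; practical methods replace the exact match with robust features and filter candidate pairs by a consistent offset. The image and cloned patch below are synthetic.

```python
import numpy as np
from collections import defaultdict

def copy_move_candidates(image, block=8):
    """Return groups of block positions whose pixel content is identical."""
    h, w = image.shape
    seen = defaultdict(list)
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            key = image[y:y + block, x:x + block].tobytes()   # exact-match signature
            seen[key].append((y, x))
    return [positions for positions in seen.values() if len(positions) > 1]

# Synthetic forgery: clone one patch of a random image to another location.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
img[40:48, 40:48] = img[8:16, 8:16]
print(copy_move_candidates(img)[:1])   # e.g. [[(8, 8), (40, 40)]]
```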

  19. Adaptive search techniques for problems in vehicle routing, part I: A survey

    Directory of Open Access Journals (Sweden)

    Kritzinger Stefanie

    2015-01-01

    Full Text Available Research in the field of vehicle routing often focused on finding new ideas and concepts in the development of fast and efficient algorithms for an improved solution process. Early studies introduce static tailor-made strategies, but trends show that algorithms with generic adaptive policies - which emerged in the past years - are more efficient to solve complex vehicle routing problems. In this first part of the survey, we present an overview of recent literature dealing with adaptive or guided search techniques for problems in vehicle routing.

  20. Precision surveying the principles and geomatics practice

    CERN Document Server

    Ogundare, John Olusegun

    2016-01-01

    A comprehensive overview of high precision surveying, including recent developments in geomatics and their applications. This book covers advanced precision surveying techniques, their proper use in engineering and geoscience projects, and their importance in the detailed analysis and evaluation of surveying projects. The early chapters review the fundamentals of precision surveying: the types of surveys; survey observations; standards and specifications; and accuracy assessments for angle, distance and position difference measurement systems. The book also covers network design and 3-D coordinating systems before discussing specialized topics such as structural and ground deformation monitoring techniques and analysis, mining surveys, tunneling surveys, and alignment surveys. Precision Surveying: The Principles and Geomatics Practice covers structural and ground deformation monitoring analysis, advanced techniques in mining and tunneling surveys, and high precision alignment of engineering structures ...

  1. Value of window technique in diagnosis of the ground glass opacities in patients with non-small cell pulmonary cancer.

    Science.gov (United States)

    Yao, Gang

    2016-11-01

    The aim of the present study was to examine the value of the window technique in the qualitative diagnosis of ground glass opacities (GGO) in patients with non-small cell pulmonary cancer. A total of 124 clinically suspected pulmonary cancer patients were analyzed retrospectively. The lesions were confirmed by puncture biopsy, and appeared as GGO on the pulmonary window while being invisible on the mediastinal window. A 64-multi-detector spiral computed tomography scanner was used, with a window width and window level of 1,500 Hounsfield units (HU) and -450 HU on the pulmonary window, and a window width and window level of 400 and 40 HU on the mediastinal window. The window adjustment technique was used to analyze the window width and window level of lesions on the pulmonary and mediastinal windows, searching for the invisibility threshold on 3-megapixel medical displays. The diagnostic accuracy and the cut-off value were compared on a receiver operating characteristic (ROC) curve. The results showed that the window width and window level on the pulmonary and mediastinal windows of malignant lesions were significantly smaller than those of benign ones (P<0.05). The cut-off value on the pulmonary window was a window width and window level of 1,300 and -220 HU, and the area under the ROC curve was 0.830 [sensitivity was 72.5%, specificity was 84.3%; 95% confidence interval (CI), 0.712-0.945]. The cut-off value on the mediastinal window was a window width and window level of 360 and 30 HU, and the area under the ROC curve was 0.623 (sensitivity was 62.0%, specificity was 55.7%; 95% CI, 0.541-0.745). In conclusion, the window technique has high sensitivity and accuracy in the qualitative diagnosis of GGO.
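
    The cut-off search described above is a standard ROC analysis: sweep the candidate threshold, compute sensitivity and specificity at each value, and report the area under the curve and the best operating point (e.g. by Youden's J). The sketch below uses synthetic measurements, not the study's data, and assumes lower values indicate malignancy.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# Synthetic "window level at which the lesion disappears" (HU) for the two groups.
malignant = rng.normal(-250, 60, 40)
benign = rng.normal(-150, 60, 40)
values = np.concatenate([malignant, benign])
labels = np.concatenate([np.ones(40), np.zeros(40)])   # 1 = malignant

# Lower measurements indicate malignancy here, so score with the negated value.
fpr, tpr, thresholds = roc_curve(labels, -values)
auc = roc_auc_score(labels, -values)
best = np.argmax(tpr - fpr)                            # Youden's J = sensitivity + specificity - 1
print(f"AUC = {auc:.3f}, cut-off ~ {-thresholds[best]:.0f} HU, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```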

  2. Upper tails of self-intersection local times of random walks: survey of proof techniques

    CERN Document Server

    König, Wolfgang

    2010-01-01

    The asymptotics of the probability that the self-intersection local time of a random walk on $\\Z^d$ exceeds its expectation by a large amount is a fascinating subject because of its relation to some models from statistical mechanics, to large-deviation theory and variational analysis, and because of the variety of effects that can be observed. However, the proof of the upper bound is notoriously difficult and requires various sophisticated techniques. We survey some heuristics and some recently elaborated techniques and results. This is an extended summary of a talk held at the CIRM conference on {\it Excess self-intersection local times, and related topics} in Luminy, 6-10 Dec., 2010.
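
    For reference, the quantity in question is usually defined as follows (a standard formulation, not a quotation from the paper): for a random walk $(S_i)_{i=1}^n$ on $\mathbb{Z}^d$ with local times $\ell_n(z)=\sum_{i=1}^n \mathbf{1}\{S_i=z\}$, the self-intersection local time is

    $$ \|\ell_n\|_2^2 = \sum_{z\in\mathbb{Z}^d} \ell_n(z)^2 = \#\{(i,j)\in\{1,\dots,n\}^2 \colon S_i=S_j\}, $$

    and the upper-tail problem concerns $\mathbb{P}\big(\|\ell_n\|_2^2 - \mathbb{E}\|\ell_n\|_2^2 \ge r_n\big)$ for $r_n$ much larger than the typical fluctuations.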

  3. Testing river surveying techniques in tidal environments: example from an actively meandering channel surveyed with TLS (Mont Saint-Michel bay, France)

    Science.gov (United States)

    Leroux, J.; Lague, D.

    2013-12-01

    Tidal channels developed in mega-tidal salt marshes offer a unique set of characteristics for studying the interaction between hydraulics, riparian vegetation and sedimentation using a Terrestrial Laser Scanner (TLS). The recession of water allows a nearly complete survey of the channel that is otherwise impossible in rivers. Moreover, the predictability of tide amplitude allows surveys to be targeted at large events. Finally, the hydro-sedimentary processes and peak flow velocities in excess of 2 m/s in mega-tidal estuaries (e.g. the Mont Saint Michel (MSM) bay) allow exploration of conditions that are similar to rivers in flood. This has motivated a 3-year study of a sinuous tidal channel located on the fringe of the marsh, with the aim of understanding its dynamics at daily to annual scales. We have acquired 36 high-resolution topographic surveys with TLS, of which 13 daily surveys were acquired during the largest annual tides. A local reference network of targets is used to yield a high registration accuracy, with uncertainty varying between 1.5 mm and 3.4 mm. We use the CANUPO algorithm for classifying riparian vegetation and ground in 3D data, and use the point cloud comparison algorithm M3C2 to resolve 3D topographic changes down to 5 mm. An ADCP, an ADV and a turbidimeter were installed to constrain flow velocities and suspended sediment concentration (SSC). Our analysis is focused on three active compartments: (1) the inner bar, on which riparian pioneer vegetation is developing and where sedimentation reaches up to 5 cm/tide; (2) the actively eroding outer bank, which exhibits local retreat rates up to 2 m/tide; (3) the channel itself, for which we document fluctuations of up to 0.2 m in elevation at daily to monthly timescales. We find that High Water Level (HWL) is a good predictor of the mean rate of evolution of these compartments, with different empirical relationships. Spatially averaged sedimentation on the inner bend tends to increase linearly with HWL and is increased by a

  4. Telephone survey to investigate relationships between onychectomy or onychectomy technique and house soiling in cats.

    Science.gov (United States)

    Gerard, Amanda F; Larson, Mandy; Baldwin, Claudia J; Petersen, Christine

    2016-09-15

    OBJECTIVE To determine whether associations existed between onychectomy or onychectomy technique and house soiling in cats. DESIGN Cross-sectional study. SAMPLE 281 owners of 455 cats in Polk County, Iowa, identified via a list of randomly selected residential phone numbers of cat owners in that region. PROCEDURES A telephone survey was conducted to collect information from cat owners on factors hypothesized a priori to be associated with house soiling, including cat sex, reproductive status, medical history, and onychectomy history. When cats that had undergone onychectomy were identified, data were collected regarding the cat's age at the time of the procedure and whether a carbon dioxide laser (CDL) had been used. Information on history of house soiling behavior (urinating or defecating outside the litter box) was also collected. RESULTS Onychectomy technique was identified as a risk factor for house soiling. Cats for which a non-CDL technique was used had a higher risk of house soiling than cats for which the CDL technique was used. Cats that had undergone onychectomy and that lived in a multicat (3 to 5 cats) household were more than 3 times as likely to have house soiled as were single-housed cats with intact claws. CONCLUSIONS AND CLINICAL RELEVANCE Results of this cross-sectional study suggested that use of the CDL technique for onychectomy could decrease the risk of house soiling by cats relative to the risk associated with other techniques. This and other findings can be used to inform the decisions of owners and veterinarians when considering elective onychectomy for cats.

  5. Survey on Robot-Assisted Surgical Techniques Utilization in US Pediatric Surgery Fellowships.

    Science.gov (United States)

    Maizlin, Ilan I; Shroyer, Michelle C; Yu, David C; Martin, Colin A; Chen, Mike K; Russell, Robert T

    2017-02-01

    Robotic technology has transformed both practice and education in many adult surgical specialties; no standardized training guidelines in pediatric surgery currently exist. The purpose of our study was to assess the prevalence of robotic procedures and extent of robotic surgery education in US pediatric surgery fellowships. A deidentified survey measured utilization of the robot, perception on the utility of the robot, and its incorporation in training among the program directors of Accreditation Council for Graduate Medical Education (ACGME) pediatric surgery fellowships in the United States. Forty-one of the 47 fellowship programs (87%) responded to the survey. While 67% of respondents indicated the presence of a robot in their facility, only 26% reported its utilizing in their surgical practice. Among programs not utilizing the robot, most common reasons provided were lack of clear supportive evidence, increased intraoperative time, and incompatibility of instrument size to pediatric patients. While 58% of program directors believe that there is a future role for robotic surgery in children, only 18% indicated that robotic training should play a part in pediatric surgery education. Consequently, while over 66% of survey respondents received training in robot-assisted surgical technique, only 29% of fellows receive robot-assisted training during their fellowship. A majority of fellowships have access to a robot, but few utilize the technology in their current practice or as part of training. Further investigation is required into both the technology's potential benefits in the pediatric population and its role in pediatric surgery training.

  6. A state-of-art survey on TQM applications using MCDM techniques

    Directory of Open Access Journals (Sweden)

    Yasaman Mohammadshahi

    2013-07-01

    Full Text Available In today's competitive economy, quality plays an essential role in the success of business units, and considerable efforts are made to control and improve quality characteristics in order to satisfy customers' requirements. However, improving quality normally involves various criteria, and Multi Criteria Decision Making (MCDM) is needed to handle such cases. In this state-of-the-art literature survey, 45 articles focused on solving quality problems by MCDM methods, published between 1994 and 2013, are investigated. Seven areas were selected for categorization: (1) AHP, fuzzy AHP, ANP and fuzzy ANP; (2) DEMATEL and fuzzy DEMATEL; (3) GRA; (4) VIKOR and fuzzy VIKOR; (5) TOPSIS, fuzzy TOPSIS and combinations of TOPSIS and AHP; (6) fuzzy methods; and (7) less frequent and hybrid procedures. According to our survey, fuzzy-based methods were the most popular techniques, accounting for about 40% of the procedures used, while AHP and ANP accounted for almost 20%. This survey ends with recommendations for future research.

  7. A Survey on Internet Service Management Techniques

    Institute of Scientific and Technical Information of China (English)

    袁满; 罗军; 阚志刚; 胡建平; 马健

    2003-01-01

    The Internet is currently evolving from a best-effort-only service towards one that supports different levels of Quality of Service (QoS), especially as IPv6 matures and the telecom, Internet and wireless mobile networks merge into an all-IP network. In the future, the Internet will be network-service-driven. This integrated all-IP network will provide a tremendous range of services to users, so it will be important for both service providers and service users to manage these services effectively. Service providers that deliver high QoS to their users can earn much higher profits, and users can obtain the abundant, different levels of QoS they need, anytime and anywhere. In this paper, progress in service management techniques is surveyed, and the service management models of different organizations are reviewed, including their principles and application scenarios. Finally, all these service management models are compared; interoperability among them is implemented through bridge protocols. The survey lays a foundation for further research on Internet service management.

  8. Lithologic and ground-water-quality data collected using Hoverprobe drilling techniques at the West Branch Canal Creek wetland, Aberdeen Proving Ground, Maryland, April-May 2000

    Science.gov (United States)

    Phelan, Daniel J.; Senus, Michael P.; Olsen, Lisa D.

    2001-01-01

    This report presents lithologic and groundwater-quality data collected during April and May 2000 in the remote areas of the tidal wetland of West Branch Canal Creek, Aberdeen Proving Ground, Maryland. Contamination of the Canal Creek aquifer with volatile organic compounds has been documented in previous investigations of the area. This study was conducted to investigate areas that were previously inaccessible because of deep mud and shallow water, and to support ongoing investigations of the fate and transport of volatile organic compounds in the Canal Creek aquifer. A unique vibracore drill rig mounted on a hovercraft was used for drilling and groundwater sampling. Continuous cores of the wetland sediment and of the Canal Creek aquifer were collected at five sites. Attempts to sample ground water were made by use of a continuous profiler at 12 sites, without well installation, at a total of 81 depths within the aquifer. Of those 81 attempts, only 34 sampling depths produced enough water to collect samples. Ground-water samples from two sites had the highest concentrations of volatile organic compounds, with total volatile organic compound concentrations in the upper part of the aquifer ranging from about 15,000 to 50,000 micrograms per liter. Ground-water samples from five sites had much lower total volatile organic compound concentrations (95 to 2,100 micrograms per liter), whereas two sites were essentially not contaminated, with total volatile organic compound concentrations less than or equal to 5 micrograms per liter.

  9. Survey of agents and techniques applicable to the solidification of low-level radioactive wastes

    Energy Technology Data Exchange (ETDEWEB)

    Fuhrmann, M.; Neilson, R.M. Jr.; Colombo, P.

    1981-12-01

    A review of the various solidification agents and techniques that are currently available or potentially applicable for the solidification of low-level radioactive wastes is presented. An overview of the types and quantities of low-level wastes produced is given. Descriptions of waste form matrix materials, the waste types to which they have been or may be applied, and available information concerning relevant waste form properties and characteristics follow. Also included are descriptions of the processing techniques themselves, with an emphasis on those operating parameters which impact upon waste form properties. The solidification agents considered in this survey include: hydraulic cements, thermoplastic materials, thermosetting polymers, glasses, synthetic minerals and composite materials. This survey is part of a program supported by the United States Department of Energy's Low-Level Waste Management Program (LLWMP). This work provides input into LLWMP efforts to develop and compile information relevant to the treatment and processing of low-level wastes and their disposal by shallow land burial.

  10. The effectiveness of ground-penetrating radar surveys in the location of unmarked burial sites in modern cemeteries

    Science.gov (United States)

    Fiedler, Sabine; Illich, Bernhard; Berger, Jochen; Graw, Matthias

    2009-07-01

    Ground-penetrating radar (GPR) is a geophysical method that is commonly used in archaeological and forensic investigations, including the determination of the exact location of graves. Whilst the method is rapid and does not involve disturbance of the graves, the interpretation of GPR profiles is nevertheless difficult and often leads to incorrect results. Incorrect identifications could hinder criminal investigations and complicate burials in cemeteries that have no information on the location of previously existing graves. In order to increase the number of unmarked graves that are identified, the GPR results need to be verified by comparing them with the soil and vegetation properties of the sites examined. We used a modern cemetery to assess the results obtained with GPR, which we then compared with previously obtained tachymetric data and with an excavation of the graves where doubt existed. Certain soil conditions tended to make the application of GPR difficult on occasions, but a rough estimation of the location of the graves was always possible. The two different methods, GPR survey and tachymetry, both proved suitable for correctly determining the exact location of the majority of graves. The present study thus shows that GPR is a reliable method for determining the exact location of unmarked graves in modern cemeteries. However, the method did not allow statements to be made on the stage of decay of the bodies. Such information would assist in deciding what should be done with graves where ineffective degradation creates a problem for reusing graves following the standard resting time of 25 years.

  11. Geological disaster survey based on Curvelet transform with borehole Ground Penetrating Radar in Tonglushan old mine site.

    Science.gov (United States)

    Tang, Xinjian; Sun, Tao; Tang, Zhijie; Zhou, Zenghui; Wei, Baoming

    2011-06-01

    The Tonglushan old mine site, located in Huangshi City, China, is world famous. However, some of the ruins have suffered from geological disasters, such as local deformation and surface cracking, in recent years. Structural abnormalities of the rock mass deep underground were surveyed with borehole ground penetrating radar (GPR) to find out whether there were any mined galleries or mined-out areas below the ruins. Using both the multiresolution analysis and the sub-band directionality of the Curvelet transform, the feature information of the targets' GPR signals was studied in the Curvelet transform domain. The heterogeneity of the geotechnical media and the clutter jamming from the complicated background of the GPR signals could be overcome well, and the singularity characteristics of typical rock mass signals could be extracted. Random noise was removed by thresholding the Curvelet coefficients according to the statistical characteristics of the wanted signals and the noise; direct-wave suppression and spatial-distribution feature extraction then obtained better results by making use of the directionality of the Curvelet transform. GprMax numerical modeling and analysis of the sample data verified the feasibility and effectiveness of our method. The approach is important and applicable for analyzing the geological structure and disaster development at the Tonglushan old mine site.
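
    The denoising step described above is, at its core, hard thresholding of transform-domain coefficients. The sketch below illustrates only that generic idea: for lack of a standard curvelet package, a 2-D FFT stands in for the curvelet transform (a real implementation such as CurveLab or curvelops would replace the two transform callables), and the noise-level heuristic is an assumption, not the authors' procedure.

        import numpy as np

        def transform_domain_denoise(section, forward, inverse, k_sigma=3.0):
            """Hard-threshold transform coefficients of a 2-D GPR section."""
            coeffs = forward(section)
            # Crude noise-level estimate from coefficient magnitudes (assumed heuristic).
            sigma = np.median(np.abs(coeffs)) / 0.6745
            coeffs[np.abs(coeffs) < k_sigma * sigma] = 0.0
            return np.real(inverse(coeffs))

        # Stand-in transforms: a 2-D FFT in place of a true curvelet transform.
        section = np.random.randn(256, 256)        # placeholder GPR section
        denoised = transform_domain_denoise(section, forward=np.fft.fft2, inverse=np.fft.ifft2)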

  12. [Survey of parental acceptance rate to behavior management techniques used in pediatric dentistry].

    Science.gov (United States)

    Chen, Xu; Jin, Shi-Fu; Liu, Hong-Bo

    2008-10-01

    To investigate the parental acceptance rate of behavior management techniques (BMT) used in pediatric dentistry. Two hundred and eighty-five subjects (mother or father) were included in this survey. Five behavior management techniques commonly used in pediatric dentistry, (1) tell-show-do, (2) voice control, (3) passive restraint, (4) sedation and (5) general anesthesia, were explained to the parents, and a questionnaire was then completed by either parent, covering parental age, gender, educational level and income. The respondents rated their acceptance of each technique using a visual analogue scale (VAS), a continuous scale ranging from 0 to 100 mm; the left end represented "completely acceptable" and the right end represented "completely unacceptable". The subjects were instructed to rate the acceptability of each technique by placing a mark on the VAS response line. The acceptability rating of each technique was determined by measuring the distance along the VAS line from the left end to the mark. SPSS 10.0 software was used for statistical analysis. VAS measurements were analyzed using a factorial analysis of variance (ANOVA). Student's t test was used to analyze the difference between the acceptance rates for respondents of different gender. The correlations between the independent variables (parental gender, education level and income) and the acceptance of the behavior management techniques were analyzed using the Spearman correlation test. The order of decreasing acceptance rate was as follows: tell-show-do, voice control, sedation, general anesthesia and passive restraint. The difference between each of them was statistically significant (F=215.2, P<0.01); the Spearman correlation analysis showed an association between parental factors and the acceptance of general anesthesia (P<0.01). The acceptance rates of tell-show-do and passive restraint were related to parental gender. According to Student's t test, females tended to accept tell-show-do more than males (P=0.011), whereas more males tended to accept passive restraint (P=0.001). No

  13. Dealing with Magnetic Disturbances in Human Motion Capture: A Survey of Techniques

    Directory of Open Access Journals (Sweden)

    Gabriele Ligorio

    2016-03-01

    Full Text Available Magnetic-Inertial Measurement Units (MIMUs) based on microelectromechanical systems (MEMS) technologies are widespread in contexts such as human motion tracking. Although they present several advantages (light weight, small size, low cost), their orientation estimation accuracy might be poor. Indoor magnetic disturbances represent one of the limiting factors for their accuracy, and, therefore, a variety of work has been done to characterize and compensate for them. In this paper, the main compensation strategies included within Kalman-based orientation estimators are surveyed and classified according to which degrees of freedom are affected by the magnetic data and to the magnetic disturbance rejection methods implemented. By selecting a representative method from each category, four algorithms were obtained and compared in two different magnetic environments: (1) a small workspace with an active magnetic source; (2) a large workspace without active magnetic sources. A wrist-worn MIMU was used to acquire data from a healthy subject, whereas a stereophotogrammetric system was adopted to obtain ground-truth data. The results suggested that the model-based approaches represent the best compromise between the two testbeds. This is particularly true when the magnetic data are prevented from affecting the estimation of the angles with respect to the vertical direction.
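
    A common ingredient of the rejection strategies compared in this survey is a per-sample gate that decides whether magnetometer data can be trusted before letting it correct the heading. The sketch below is a generic illustration of such a magnitude/dip-angle gate, not any specific algorithm from the paper; the reference values and tolerances are assumptions.

        import numpy as np

        def magnetometer_gate(mag, acc, ref_norm=48.0, norm_tol=8.0, dip_ref=62.0, dip_tol=8.0):
            """Return True if a magnetometer sample looks undisturbed (all thresholds assumed)."""
            mag, acc = np.asarray(mag, float), np.asarray(acc, float)
            norm_ok = abs(np.linalg.norm(mag) - ref_norm) < norm_tol
            g = acc / np.linalg.norm(acc)                       # gravity direction (quasi-static)
            vert = np.clip(np.dot(mag, g) / np.linalg.norm(mag), -1.0, 1.0)
            dip = np.degrees(np.arcsin(vert))                   # field inclination estimate
            dip_ok = abs(abs(dip) - dip_ref) < dip_tol
            return norm_ok and dip_ok

        # Inside a Kalman-based estimator, a rejected sample would skip the magnetometer
        # measurement update (or be used with inflated measurement noise).
        ok = magnetometer_gate(mag=[22.0, 3.0, 41.0], acc=[0.0, 0.0, 9.81])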

  14. A Geostatistical Data Fusion Technique for Merging Remote Sensing and Ground-Based Observations of Aerosol Optical Thickness

    Science.gov (United States)

    Chatterjee, Abhishek; Michalak, Anna M.; Kahn, Ralph A.; Paradise, Susan R.; Braverman, Amy J.; Miller, Charles E.

    2010-01-01

    Particles in the atmosphere reflect incoming sunlight, tending to cool the Earth below. Some particles, such as soot, also absorb sunlight, which tends to warm the ambient atmosphere. Aerosol optical depth (AOD) is a measure of the amount of particulate matter in the atmosphere, and is a key input to computer models that simulate and predict Earth's changing climate. The global AOD products from the Multi-angle Imaging SpectroRadiometer (MISR) and the MODerate resolution Imaging Spectroradiometer (MODIS), both of which fly on the NASA Earth Observing System's Terra satellite, provide complementary views of the particles in the atmosphere. Whereas MODIS offers global coverage about four times as frequently as MISR, MISR's multi-angle data make it possible to separate the surface and atmospheric contributions to the observed top-of-atmosphere radiances, and also to more effectively discriminate particle type. Surface-based AERONET sun photometers retrieve AOD with smaller uncertainties than the satellite instruments, but only at a few fixed locations. So there are clear reasons to combine these data sets in a way that takes advantage of their respective strengths. This paper represents an effort at combining MISR, MODIS and AERONET AOD products over the continental US, using a common spatial statistical technique called kriging. The technique uses the correlation between the satellite data and the "ground-truth" sun photometer observations to assign uncertainty to the satellite data on a region-by-region basis. The larger the fraction of the sun photometer variance that is reproduced by the satellite data, the higher the confidence assigned to the satellite data in that region. In the Western and Central US, MISR AOD correlations with AERONET are significantly higher than those with MODIS, likely due to bright surfaces in these regions, which pose greater challenges for the single-view MODIS retrievals. In the east, MODIS correlations are higher, due to more frequent sampling
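
    The kriging machinery itself is beyond a short example, but the weighting idea described above (trust each satellite product in proportion to how much of the AERONET variance it reproduces in a region) can be sketched as an inverse-variance combination. Everything below, including the numbers, is an illustrative assumption rather than the authors' implementation.

        import numpy as np

        def fuse_aod(misr, modis, r2_misr, r2_modis, aeronet_var):
            """Inverse-variance fusion of two collocated AOD values for one region.

            Each product's error variance is approximated as the share of AERONET
            variance it fails to reproduce: sigma^2 = (1 - R^2) * Var(AERONET).
            """
            var_misr = (1.0 - r2_misr) * aeronet_var
            var_modis = (1.0 - r2_modis) * aeronet_var
            w = (1.0 / var_misr) / (1.0 / var_misr + 1.0 / var_modis)
            return w * misr + (1.0 - w) * modis

        # Hypothetical western-US region where MISR correlates better with AERONET.
        print(fuse_aod(misr=0.18, modis=0.22, r2_misr=0.8, r2_modis=0.5, aeronet_var=0.01))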

  15. Repair vs replacement of direct composite restorations: a survey of teaching and operative techniques in Oceania.

    Science.gov (United States)

    Brunton, Paul A; Ghazali, Amna; Tarif, Zahidah H; Loch, Carolina; Lynch, Christopher; Wilson, Nairn; Blum, Igor R

    2017-04-01

    To evaluate the teaching and operative techniques for the repair and/or replacement of direct resin-based composite restorations (DCRs) in dental schools in Oceania. A 14-item questionnaire was mailed to the heads of operative dentistry in 16 dental schools in Oceania (Australia, New Zealand, Fiji and Papua New Guinea). The survey asked whether the repair of DCRs was taught within the curriculum; the rationale behind the teaching; how techniques were taught, indications for repair, operative techniques, materials used, patient acceptability, expected longevity and recall systems. All 16 schools participated in the study. Thirteen (81%) reported the teaching of composite repairs as an alternative to replacement. Most schools taught the theoretical and practical aspects of repair at a clinical level only. All 13 schools (100%) agreed on tooth substance preservation being the main reason for teaching repair. The main indications for repair were marginal defects (100%), followed by secondary caries (69%). All 13 schools that performed repairs reported high patient acceptability, and considered it a definitive measure. Only three schools (23%) claimed to have a recall system in place following repair of DCRs. Most respondents either did not know or did not answer when asked about the longevity of DCRs. Repair of DCRs seems to be a viable alternative to replacement, which is actively taught within Oceania. Advantages include it being minimally invasive, preserving tooth structure, and time and money saving. However, standardised guidelines need to be developed and further clinical long-term studies need to be carried out. The decision between replacing or repairing a defective composite restoration tends to be based on what clinicians have been taught, tempered by experience and judgement. This study investigated the current status of teaching and operative techniques of repair of direct composite restorations in dental schools in Oceania.

  16. A Brief History of the use of Electromagnetic Induction Techniques in Soil Survey

    Science.gov (United States)

    Brevik, Eric C.; Doolittle, James

    2017-04-01

    Electromagnetic induction (EMI) has been used to characterize the spatial variability of soil properties since the late 1970s. Initially used to assess soil salinity, the use of EMI in soil studies has expanded to include: mapping soil types; characterizing soil water content and flow patterns; assessing variations in soil texture, compaction, organic matter content, and pH; and determining the depth to subsurface horizons, stratigraphic layers or bedrock, among other uses. In all cases the soil property being investigated must influence soil apparent electrical conductivity (ECa) either directly or indirectly for EMI techniques to be effective. An increasing number and diversity of EMI sensors have been developed in response to users' needs and the availability of allied technologies, which have greatly improved the functionality of these tools and increased the amount and types of data that can be gathered with a single pass. EMI investigations provide several benefits for soil studies. The large amount of georeferenced data that can be rapidly and inexpensively collected with EMI provides more complete characterization of the spatial variations in soil properties than traditional sampling techniques. In addition, compared to traditional soil survey methods, EMI can more effectively characterize diffuse soil boundaries and identify included areas of dissimilar soils within mapped soil units, giving soil scientists greater confidence when collecting spatial soil information. EMI techniques do have limitations; results are site-specific and can vary depending on the complex interactions among multiple and variable soil properties. Despite this, EMI techniques are increasingly being used to investigate the spatial variability of soil properties at field and landscape scales. The future should witness a greater use of multiple-frequency and multiple-coil EMI sensors and integration with other sensors to assess the spatial variability of soil properties. Data analysis

  17. A survey on filter techniques for feature selection in gene expression microarray analysis.

    Science.gov (United States)

    Lazar, Cosmin; Taminau, Jonatan; Meganck, Stijn; Steenhoff, David; Coletta, Alain; Molter, Colin; de Schaetzen, Virginie; Duque, Robin; Bersini, Hugues; Nowé, Ann

    2012-01-01

    A plenitude of feature selection (FS) methods is available in the literature, most of them arising from the need to analyze data of very high dimension, usually hundreds or thousands of variables. Such data sets are now available in various application areas like combinatorial chemistry, text mining, multivariate imaging, or bioinformatics. As a generally accepted rule, these methods are grouped into filters, wrappers, and embedded methods. More recently, a new group of methods has been added to the general framework of FS: ensemble techniques. The focus in this survey is on filter feature selection methods for informative feature discovery in gene expression microarray (GEM) analysis, which is also known as differentially expressed genes (DEGs) discovery, gene prioritization, or biomarker discovery. We present them in a unified framework, using standardized notations in order to reveal their technical details and to highlight their common characteristics as well as their particularities.
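
    As a concrete instance of the filter family this survey covers, the sketch below ranks genes by a two-sample Welch t-statistic between two phenotype groups and keeps the top k; the expression matrix here is a random placeholder.

        import numpy as np
        from scipy import stats

        def t_test_filter(expr, labels, k=50):
            """Rank genes (columns of expr) by |t| between two classes; return top-k column indices."""
            expr, labels = np.asarray(expr, float), np.asarray(labels)
            t, _ = stats.ttest_ind(expr[labels == 0], expr[labels == 1], axis=0, equal_var=False)
            return np.argsort(-np.abs(t))[:k]

        rng = np.random.default_rng(0)
        expr = rng.normal(size=(40, 2000))          # 40 samples x 2000 genes (placeholder data)
        labels = np.array([0] * 20 + [1] * 20)      # two phenotype groups
        selected_genes = t_test_filter(expr, labels, k=50)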

  18. [Gingival displacement techniques in daily practice. Survey among dental surgeons in Abidjan, Ivory Coast].

    Science.gov (United States)

    Pesson, D M; Bakou, O D; Didia, E L E; Kouame, A; Blohoua, M R J J; Djeredou, K B

    2015-12-01

    Access to cervical margins allows the practitioner to record the entire cervical margin in order to provide a true copy to the technician. This requires a gingival displacement obtainable by different techniques. This study aimed to assess the implementation of gingival displacement methods prior to impression taking in fixed prosthodontics. This is a descriptive, cross-sectional survey of a sample of 71 dentists practising in Abidjan, Ivory Coast, which ran from 2 October 2010 to 14 November 2010. A survey form was administered to the dentists. The questionnaire was organised around the following headings: identification of dentists and practice of gingival displacement methods. The data, processed using Epi Info 6 and Excel XP on Windows XP, allowed calculation of frequencies, means and proportions and assessment of associations between variables with the chi-square test. The significance level was set at p < 0.05. The results of the survey indicate that non-surgical methods of gingival displacement, including retraction cords and temporary crowns, are those used most frequently (76.4%), because the vast majority of practitioners (87.22%) believe that surgical methods are the most traumatic to the periodontium. Our study showed that gingival displacement methods are frequently carried out in daily practice, regardless of the topography of the abutment teeth and their number, but with a preference for non-surgical methods, particularly those using retraction cords and temporary crowns. Injectable gingival displacement pastes are not harmful to the periodontal tissues, are easy to use and have a very efficient haemostatic action; they should also be known and practised.

  19. Stereoscopic visualization of diffusion tensor imaging data: a comparative survey of visualization techniques.

    Science.gov (United States)

    Raslan, Osama; Debnam, James Matthew; Ketonen, Leena; Kumar, Ashok J; Schellingerhout, Dawid; Wang, Jihong

    2013-01-01

    Diffusion tensor imaging (DTI) data has traditionally been displayed as a grayscale functional anisotropy map (GSFM) or color coded orientation map (CCOM). These methods use black and white or color with intensity values to map the complex multidimensional DTI data to a two-dimensional image. Alternative visualization techniques, such as Vmax maps, utilize enhanced graphical representation of the principal eigenvector by means of a headless arrow on regular nonstereoscopic (VM) or stereoscopic display (VMS). A survey of clinical utility of patients with intracranial neoplasms was carried out by 8 neuroradiologists using traditional and nontraditional methods of DTI display. Pairwise comparison studies of 5 intracranial neoplasms were performed with a structured questionnaire comparing GSFM, CCOM, VM, and VMS. Six of 8 neuroradiologists favored Vmax maps over traditional methods of display (GSFM and CCOM). When comparing the stereoscopic (VMS) and the non-stereoscopic (VM) modes, 4 favored VMS, 2 favored VM, and 2 had no preference. In conclusion, processing and visualizing DTI data stereoscopically is technically feasible. An initial survey of users indicated that Vmax based display methodology with or without stereoscopic visualization seems to be preferred over traditional methods to display DTI data.
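
    Both display styles ultimately visualize quantities obtained from an eigendecomposition of the 3 x 3 diffusion tensor at each voxel: the anisotropy value shown in a GSFM (the fractional anisotropy, FA, in standard DTI usage) and the principal eigenvector drawn by the Vmax arrows. A minimal per-voxel sketch under those standard definitions, with a made-up tensor, follows.

        import numpy as np

        def dti_fa_and_v1(D):
            """Return (FA, principal eigenvector) for a symmetric 3x3 diffusion tensor."""
            evals, evecs = np.linalg.eigh(D)              # eigenvalues in ascending order
            lam = np.clip(evals, 0.0, None)
            md = lam.mean()                               # mean diffusivity
            fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
            v1 = evecs[:, -1]                             # eigenvector of the largest eigenvalue
            return fa, v1

        # Made-up tensor with strong diffusion along x (units of mm^2/s).
        D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
        fa, v1 = dti_fa_and_v1(D)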

  20. Stereoscopic Visualization of Diffusion Tensor Imaging Data: A Comparative Survey of Visualization Techniques

    Directory of Open Access Journals (Sweden)

    Osama Raslan

    2013-01-01

    Full Text Available Diffusion tensor imaging (DTI) data has traditionally been displayed as a grayscale functional anisotropy map (GSFM) or color coded orientation map (CCOM). These methods use black and white or color with intensity values to map the complex multidimensional DTI data to a two-dimensional image. Alternative visualization techniques, such as Vmax maps, utilize enhanced graphical representation of the principal eigenvector by means of a headless arrow on regular nonstereoscopic (VM) or stereoscopic display (VMS). A survey of clinical utility of patients with intracranial neoplasms was carried out by 8 neuroradiologists using traditional and nontraditional methods of DTI display. Pairwise comparison studies of 5 intracranial neoplasms were performed with a structured questionnaire comparing GSFM, CCOM, VM, and VMS. Six of 8 neuroradiologists favored Vmax maps over traditional methods of display (GSFM and CCOM). When comparing the stereoscopic (VMS) and the non-stereoscopic (VM) modes, 4 favored VMS, 2 favored VM, and 2 had no preference. In conclusion, processing and visualizing DTI data stereoscopically is technically feasible. An initial survey of users indicated that Vmax based display methodology with or without stereoscopic visualization seems to be preferred over traditional methods to display DTI data.

  1. The 21-SPONGE HI Absorption Survey I: Techniques and Initial Results

    CERN Document Server

    Murray, Claire E; Goss, W M; Dickey, John M; Heiles, Carl; Lindner, Robert R; Babler, Brian; Pingel, Nickolas M; Lawrence, Allen; Jencson, Jacob; Hennebelle, Patrick

    2015-01-01

    We present methods and results from "21-cm Spectral Line Observations of Neutral Gas with the EVLA" (21-SPONGE), a large survey for Galactic neutral hydrogen (HI) absorption with the Karl G. Jansky Very Large Array (VLA). With the upgraded capabilities of the VLA, we reach median root-mean-square (RMS) noise in optical depth of $\sigma_{\tau}=9\times 10^{-4}$ per $0.42\,\mathrm{km\,s^{-1}}$ channel for the 31 sources presented here. Upon completion, 21-SPONGE will be the largest HI absorption survey with this high sensitivity. We discuss the observations and data reduction strategies, as well as line fitting techniques. We prove that the VLA bandpass is stable enough to detect broad, shallow lines associated with warm HI, and show that bandpass observations can be combined in time to reduce spectral noise. In combination with matching HI emission profiles from the Arecibo Observatory ($\sim3.5'$ angular resolution), we estimate excitation (or spin) temperatures ($T_{\rm s}$) and column densities for Gaussian componen...
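
    The emission/absorption analysis summarized above rests on the standard isothermal relations between brightness temperature, optical depth, spin temperature, and column density. The sketch below applies those textbook relations to synthetic spectra; it is not the survey's actual fitting pipeline.

        import numpy as np

        def spin_temperature(t_b, tau):
            """Isothermal estimate from T_B(v) = T_s * (1 - exp(-tau(v)))."""
            return t_b / (1.0 - np.exp(-tau))

        def hi_column_density(t_s, tau, dv_kms):
            """N(HI) [cm^-2] = 1.823e18 * sum(T_s * tau) * dv, with dv in km/s."""
            return 1.823e18 * np.sum(t_s * tau) * dv_kms

        # Synthetic Gaussian absorption line and the matching emission profile.
        v = np.arange(-20.0, 20.0, 0.42)                 # km/s channels
        tau = 0.5 * np.exp(-0.5 * (v / 3.0) ** 2)
        t_b = 80.0 * (1.0 - np.exp(-tau))                # emission for a true T_s of 80 K
        print(hi_column_density(spin_temperature(t_b, tau), tau, dv_kms=0.42))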

  2. Microtremor Array Measurement Survey and Strong Ground Motion Observation Activities of The MarDiM (SATREPS) Project

    Science.gov (United States)

    Ozgur Citak, Seckin; Karagoz, Ozlem; Chimoto, Kosuke; Ozel, Oguz; Yamanaka, Hiroaki; Aksahin, Bengi; Arslan, Safa; Hatayama, Ken; Ohori, Michihiro; Hori, Muneo

    2015-04-01

    Since 1939, devastating earthquakes with magnitude greater than seven have ruptured the North Anatolian Fault (NAF) westward, starting from the 1939 Erzincan (Ms=7.9) earthquake in eastern Turkey and including the latest 1999 Izmit-Golcuk (Ms=7.4) and Duzce (Ms=7.2) earthquakes in the eastern Marmara region, Turkey. On the other hand, to the west of the Sea of Marmara, an Mw7.4 earthquake ruptured the NAF's Ganos segment in 1912. The only un-ruptured segments of the NAF in the last century lie within the Sea of Marmara, and are identified as a "seismic gap" zone whose rupture may cause a devastating earthquake. In order to unravel the seismic risks of the Marmara region, a comprehensive multidisciplinary research project, the MarDiM project "Earthquake And Tsunami Disaster Mitigation in The Marmara Region and Disaster Education in Turkey", has been under way since 2003. The project is conducted in the framework of the "Science and Technology Research Partnership for Sustainable Development (SATREPS)" sponsored by the Japan Science and Technology Agency (JST) and the Japan International Cooperation Agency (JICA). One of the main research fields of the project is "Seismic characterization and damage prediction", which aims to improve the accuracy of the estimation of the damage induced by strong ground motions and tsunamis, based on reliable source parameters, detailed deep and shallow velocity structure, and building data. For the detailed deep and shallow velocity structure, microtremor array measurement surveys were conducted in the Zeytinburnu district of Istanbul and in Tekirdag province at about 81 sites in October 2013 and September 2014. Also, in September 2014, 11 accelerometer units were installed, mainly in public buildings, in both the Zeytinburnu and Tekirdag areas and are currently in operation. Each accelerometer unit is composed of a Network Sensor (CV-374A2) by Tokyo Sokushin, a post-processing PC for data storage, and a power supply unit. The Network Sensor (CV-374A2) consists of three servo

  3. Microtremor Array Measurement Survey and Strong Ground Motion observation activities of The SATREPS, MarDiM project -Part 2-

    Science.gov (United States)

    Citak, Seckin; Karagoz, Ozlem; Chimoto, Kosuke; Ozel, Oguz; Yamanaka, Hiroaki; Arslan, Safa; Aksahin, Bengi; Hatayama, Ken; Ohori, Michihiro; Hori, Muneo

    2016-04-01

    Since 1939, devastating earthquakes with magnitude greater than seven have ruptured the North Anatolian Fault (NAF) westward, starting from the 1939 Erzincan (Ms=7.9) earthquake in eastern Turkey and including the latest 1999 Izmit-Golcuk (Ms=7.4) and Duzce (Ms=7.2) earthquakes in the eastern Marmara region, Turkey. On the other hand, to the west of the Sea of Marmara, an Mw7.4 earthquake ruptured the NAF's Ganos segment in 1912. The only un-ruptured segments of the NAF in the last century lie within the Sea of Marmara, and are identified as a "seismic gap" zone whose rupture may cause a devastating earthquake. In order to unravel the seismic risks of the Marmara region, a comprehensive multidisciplinary research project, the MarDiM project "Earthquake And Tsunami Disaster Mitigation in The Marmara Region and Disaster Education in Turkey", has been under way since 2003. The project is conducted in the framework of the "Science and Technology Research Partnership for Sustainable Development (SATREPS)" sponsored by the Japan Science and Technology Agency (JST) and the Japan International Cooperation Agency (JICA). One of the main research fields of the project is "Seismic characterization and damage prediction", which aims to improve the accuracy of the estimation of the damage induced by strong ground motions and tsunamis, based on reliable source parameters, detailed deep and shallow velocity structure, and building data. For the detailed deep and shallow velocity structure, microtremor array measurement surveys were conducted in the Zeytinburnu district of Istanbul and in Tekirdag, Canakkale and Edirne provinces at about 109 sites in October 2013 and September 2014 and 2015. Also, in September 2014, 11 accelerometer units were installed, mainly in public buildings, in both the Zeytinburnu and Tekirdag areas and are currently in operation. Each accelerometer unit is composed of a Network Sensor (CV-374A) by Tokyo Sokushin, a post-processing PC for data storage, and a power supply unit. The Network Sensor (CV-374

  4. Microtremor Array Measurement Survey and Strong Ground Motion observation activities of The SATREPS, MarDiM project -Part 3-

    Science.gov (United States)

    Citak, Seckin; Safa Arslan, Mehmet; Karagoz, Ozlem; Chimoto, Kosuke; Ozel, Oguz; Yamanaka, Hiroaki; Behiye Aksahin, Bengi; Hatayama, Ken; Sahin, Abdurrahman; Ohori, Michihiro; Safak, Erdal; Hori, Muneo

    2017-04-01

    Since 1939, devastating earthquakes with magnitude greater than seven have ruptured the North Anatolian Fault (NAF) westward, starting from the 1939 Erzincan (Ms=7.9) earthquake in eastern Turkey and including the latest 1999 Izmit-Golcuk (Ms=7.4) and Duzce (Ms=7.2) earthquakes in the eastern Marmara region, Turkey. On the other hand, to the west of the Sea of Marmara, an Mw7.4 earthquake ruptured the NAF's Ganos segment in 1912. The only un-ruptured segments of the NAF in the last century lie within the Sea of Marmara, and are identified as a "seismic gap" zone whose rupture may cause a devastating earthquake. In order to unravel the seismic risks of the Marmara region, a comprehensive multidisciplinary research project, the MarDiM project "Earthquake And Tsunami Disaster Mitigation in The Marmara Region and Disaster Education in Turkey", has been under way since 2003. The project is conducted in the framework of the "Science and Technology Research Partnership for Sustainable Development (SATREPS)" sponsored by the Japan Science and Technology Agency (JST) and the Japan International Cooperation Agency (JICA). One of the main research fields of the project is "Seismic characterization and damage prediction", which aims to improve the accuracy of the estimation of the damage induced by strong ground motions and tsunamis, based on reliable source parameters, detailed deep and shallow velocity structure, and building data. For the detailed deep and shallow velocity structure, microtremor array measurement surveys were conducted in the Zeytinburnu district of Istanbul and in Tekirdag, Canakkale and Edirne provinces at about 140 sites in October 2013 and September 2014, 2015 and 2016. Also, in September 2014, 11 accelerometer units were installed, mainly in public buildings, in both the Zeytinburnu and Tekirdag areas and are currently in operation. Each accelerometer unit is composed of a Network Sensor (CV-374A) by Tokyo Sokushin, a post-processing PC for data storage, and a power supply unit. The Network Sensor

  5. Mapping Ground Subsidence Phenomena in Ho Chi Minh City through the Radar Interferometry Technique Using ALOS PALSAR Data

    Directory of Open Access Journals (Sweden)

    Dinh Ho Tong Minh

    2015-07-01

    Full Text Available The rapid urbanization since the last decade of the 20th century has led to extensive groundwater extraction, resulting in subsidence in Ho Chi Minh City, Vietnam. Recent advances in multi-temporal spaceborne SAR interferometry, especially with the persistent scatterer interferometry (PSI) approach, have made it a robust remote sensing technique for measuring large-scale ground subsidence with millimetric accuracy. This work presents an advanced PSI analysis that provides an unprecedented spatial extent and continuous temporal coverage of the subsidence in Ho Chi Minh City from 2006 to 2010. The study shows that subsidence is most severe in the Holocene silt loam areas along the Sai Gon River and in the southwest of the city. The groundwater extraction resulting from urbanization and urban growth is mainly responsible for the subsidence. Subsidence in turn leads to more flooding and water nuisance. The correlation between the reference leveling velocity and the estimated PSI result is R2 = 0.88, and the root mean square error is 4.3 mm/year, confirming their good agreement. From 2006 to 2010, the estimated average subsidence rate is -8.0 mm/year, with maximum values of up to -70 mm/year. After four years, in regions along the Sai Gon River and in the southwest of the city, the land has sunk by up to -12 cm. If not addressed, subsidence will increase inundation, both in frequency and in spatial extent. Finally, regarding climate change, the effects of subsidence should be considered appreciably greater than those resulting from rising sea level. It is essential to consider these two factors together, because the city is inhabited by more than 7.5 million people and subsidence directly impacts urban structures and infrastructure.
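
    The agreement figures quoted above (R2 and RMSE between leveling and PSI velocities) are straightforward to reproduce for any set of collocated velocities; a minimal sketch with placeholder values is given below, using the coefficient-of-determination form of R2.

        import numpy as np

        def validate_velocities(v_leveling, v_psi):
            """Return (R^2, RMSE) between reference leveling and PSI velocities (mm/year)."""
            ref, est = np.asarray(v_leveling, float), np.asarray(v_psi, float)
            r2 = 1.0 - np.sum((ref - est) ** 2) / np.sum((ref - ref.mean()) ** 2)
            rmse = np.sqrt(np.mean((ref - est) ** 2))
            return r2, rmse

        # Placeholder collocated velocities at a few leveling benchmarks.
        r2, rmse = validate_velocities([-5.1, -12.3, -30.0, -2.0], [-6.0, -10.9, -28.2, -3.5])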

  6. Data Cleaning In Data Warehouse: A Survey of Data Pre-processing Techniques and Tools

    Directory of Open Access Journals (Sweden)

    Anosh Fatima

    2017-03-01

    Full Text Available A data warehouse is a computer system designed for storing and analyzing an organization's historical data from its day-to-day operations in an Online Transaction Processing (OLTP) system. Usually, an organization summarizes and copies information from its operational systems to the data warehouse on a regular schedule, and management performs complex queries and analysis on the information without slowing down the operational systems. Data need to be pre-processed to improve their quality before being stored in the data warehouse. This survey paper presents data cleaning problems and the approaches currently in use for preprocessing. The main goal of this paper is to determine which preprocessing technique is best in which scenario for improving the performance of a data warehouse. Many techniques for data cleansing have been analyzed, using certain evaluation attributes, and tested on different kinds of data sets. Data quality tools such as YALE, ALTERYX, and WEKA have been used to reach conclusive results on readying the data for the warehouse and ensuring that only cleaned data populates the warehouse, thus enhancing its usability. The results of the paper can be useful in many future activities such as cleansing, standardizing, correction, matching and transformation. This research can help in data auditing and pattern detection in the data.
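
    To make the surveyed cleansing steps concrete, the sketch below shows a typical pandas pre-processing pass (string standardization, deduplication, and simple missing-value imputation) before loading a staging table; the table and column names are hypothetical, not taken from the paper.

        import pandas as pd

        def clean_customers(df: pd.DataFrame) -> pd.DataFrame:
            """Basic cleansing pass before loading a (hypothetical) customer staging table."""
            out = df.copy()
            out["email"] = out["email"].str.strip().str.lower()        # standardize strings
            out["country"] = out["country"].str.strip().str.upper()
            out = out.drop_duplicates(subset=["customer_id"])          # deduplicate
            out["age"] = out["age"].fillna(out["age"].median())        # impute missing values
            return out.dropna(subset=["customer_id"])                  # reject unusable rows

        raw = pd.DataFrame({
            "customer_id": [1, 1, 2, None],
            "email": [" A@X.COM ", " A@X.COM ", "b@y.com", "c@z.com"],
            "country": ["us", "us", " de", "fr"],
            "age": [34, 34, None, 29],
        })
        clean = clean_customers(raw)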

  7. A survey on the use of techniques, materials in dental implantology practice

    Directory of Open Access Journals (Sweden)

    R Chowdhary

    2012-01-01

    Full Text Available Purpose: To present the results of a survey on the status of implantology among implant-practicing dentists across the world in 2009. Materials and Methods: A questionnaire was sent to the members of the EAO (European Association of Osseointegration), ICOI (International Congress of Osseointegrated Implants), ISOI (Indian Society of Oral Implantologists), Asian Academy of Osseointegration (AAO), Deutsche Gesellschaft für Orale Implantologie (DGOI), Philippines Implant Organization, Korean Society of Oral Implantologists, Japanese Association of Oral Implantologists, Chinese Dental Association and Pakistan Dental Association, asking for personal (anonymous) background data and their implantology concepts. Specific questions dealt with the level of recognition of implants, use of implants, superstructures, techniques followed, and materials used. Results: A total of 1500 (63.6%) of the 2358 questionnaires were answered. Dental implants were the most preferred treatment modality for restoring missing teeth. Threaded implants were the most preferred. Cement-retained implant prostheses were the most preferred restoration procedure. Dentists believe that the general dentist should practice the dental implant treatment modality, preferably as part of a team. Immediate loading was a widely accepted concept among the dentists of the developed nations. Conclusion: Dental implants were a widely accepted treatment modality for the replacement of missing teeth. Most of the dentists follow well-documented techniques and proven materials that have been reported in the literature, an evidence-based practice, thus delivering the best to their patients. Dentists from the developing nations agreed on the need for standardization in implants.

  8. Shape Sensing Techniques for Continuum Robots in Minimally Invasive Surgery: A Survey.

    Science.gov (United States)

    Shi, Chaoyang; Luo, Xiongbiao; Qi, Peng; Li, Tianliang; Song, Shuang; Najdovski, Zoran; Fukuda, Toshio; Ren, Hongliang

    2017-08-01

    Continuum robots provide inherent structural compliance with high dexterity to access surgical target sites along tortuous anatomical paths under constrained environments, and they enable complex and delicate operations to be performed through small incisions in minimally invasive surgery. These advantages enable their broad application with minimal trauma and make challenging clinical procedures possible with miniaturized instrumentation and high curvilinear access capabilities. However, their inherently deformable designs make it difficult to realize 3-D intraoperative real-time shape sensing to accurately model their shape. Solutions to this limitation can in turn support the development of closely associated techniques of closed-loop control, path planning, human-robot interaction, and surgical manipulation safety in minimally invasive surgery. Although extensive model-based research that relies on kinematics and mechanics has been performed, accurate shape sensing of continuum robots remains challenging, particularly in cases of unknown and dynamic payloads. This survey investigates the recent advances in alternative emerging techniques for 3-D shape sensing in this field and focuses on the following categories: fiber-optic-sensor-based, electromagnetic-tracking-based, and intraoperative imaging modality-based shape-reconstruction methods. The limitations of existing technologies and prospects of new technologies are also discussed.
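
    A common thread in the fiber-optic approaches reviewed here is that distributed curvature measurements are integrated along the arc length to recover the backbone shape. The planar sketch below illustrates that reconstruction under a piecewise-constant-curvature assumption; it simplifies the 3-D problem, and all values are made up.

        import numpy as np

        def reconstruct_planar_shape(curvatures, seg_len):
            """Integrate per-segment curvature (1/m) into planar backbone points (x, y) in metres."""
            theta, x, y = 0.0, 0.0, 0.0
            points = [(x, y)]
            for kappa in curvatures:
                mid_theta = theta + 0.5 * kappa * seg_len   # mean tangent angle over the segment
                x += seg_len * np.cos(mid_theta)
                y += seg_len * np.sin(mid_theta)
                theta += kappa * seg_len                    # accumulated bending angle
                points.append((x, y))
            return np.array(points)

        # Four 20 mm segments with increasing curvature (e.g., derived from FBG strain readings).
        shape = reconstruct_planar_shape([2.0, 4.0, 6.0, 8.0], seg_len=0.02)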

  9. A Survey on Face Detection and Recognition Techniques in Different Application Domain

    Directory of Open Access Journals (Sweden)

    Subrat Kumar Rath

    2014-08-01

    Full Text Available With recent technology, the popularity of and demand for image processing are increasing due to its immense number of applications in various fields, most of them related to biometric science, such as face recognition, fingerprint recognition, iris scanning, and speech recognition. Among them, face detection is a very powerful tool for video surveillance, human-computer interfaces, face recognition, and image database management. There is a considerable body of work on this subject. Face recognition is a rapidly evolving technology, which has been widely used in forensics such as criminal identification, secured access, and prison security. In this paper we went through different survey and technical papers in this field, list different techniques such as linear discriminant analysis, Viola-Jones classification with AdaBoost learning, and curvature analysis, and discuss their advantages and disadvantages. We also describe some of the detection and recognition algorithms and mention some application domains along with different challenges in this field. We propose a classification of detection techniques and discuss all the recognition methods as well.
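
    Of the detection techniques listed, the Viola-Jones cascade with AdaBoost-trained Haar features is the easiest to try out, since OpenCV ships pre-trained cascades. A minimal usage sketch follows; the image path is a placeholder.

        import cv2

        # Pre-trained frontal-face Haar cascade bundled with OpenCV (a Viola-Jones style detector).
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        img = cv2.imread("group_photo.jpg")                 # placeholder image path
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5, minSize=(30, 30))

        for (x, y, w, h) in faces:                          # draw the detections
            cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("group_photo_faces.jpg", img)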

  10. Search Techniques for the Web of Things: A Taxonomy and Survey

    Directory of Open Access Journals (Sweden)

    Yuchao Zhou

    2016-04-01

    Full Text Available The Web of Things aims to make physical world objects and their data accessible through standard Web technologies to enable intelligent applications and sophisticated data analytics. Due to the amount and heterogeneity of the data, it is challenging to perform data analysis directly; especially when the data is captured from a large number of distributed sources. However, the size and scope of the data can be reduced and narrowed down with search techniques, so that only the most relevant and useful data items are selected according to the application requirements. Search is fundamental to the Web of Things while challenging by nature in this context, e.g., mobility of the objects, opportunistic presence and sensing, continuous data streams with changing spatial and temporal properties, efficient indexing for historical and real time data. The research community has developed numerous techniques and methods to tackle these problems as reported by a large body of literature in the last few years. A comprehensive investigation of the current and past studies is necessary to gain a clear view of the research landscape and to identify promising future directions. This survey reviews the state-of-the-art search methods for the Web of Things, which are classified according to three different viewpoints: basic principles, data/knowledge representation, and contents being searched. Experiences and lessons learned from the existing work and some EU research projects related to Web of Things are discussed, and an outlook to the future research is presented.

  11. Development and Testing of Techniques for In-Ground Stabilization, Size Reduction and Safe Removal of Radioactive Wastes Stored in Large Containments in Burial Grounds - 13591

    Energy Technology Data Exchange (ETDEWEB)

    Halliwell, Stephen [VJ Technologies Inc, 89 Carlough Road, Bohemia, NY (United States)

    2013-07-01

    Radioactive waste materials, including Transuranic (TRU) wastes from laboratories, have been stored below ground in large containments at a number of sites in the US DOE Complex, and at nuclear sites in Europe. These containments are generally referred to as caissons or shafts. The containments are in a range of sizes and depths below grade. The caissons at the DOE's Hanford site are cylindrical, of the order of 2,500 mm in diameter, 3,050 mm in height and are buried about 6,000 mm below grade. One type of caisson is made out of corrugated pipe, whereas others are made of concrete with standard re-bar. However, the larger shafts in the UK are of the order of 4,600 mm in diameter, 53,500 mm deep, and 12,000 mm below grade. This paper describes the R and D work and testing activities performed to date to evaluate the concept of in-ground size reduction and stabilization of the contents of large containments similar to those at Hanford. In practice, the height of the Test Facility provided for a test cell that was approximately 22' deep. That prevented a 'full scale mockup' test in the sense that the Hanford Caisson configuration would be an identical replication. Therefore, the project was conducted in two phases. The first phase tested a simulated Caisson with surrogate contents, and part of a Chute section, and the second phase tested a full chute section. These tests were performed at VJ Technologies Test Facility located in East Haven, CT, as part of the Proof of Design Concept program for studying the feasibility of an in-situ grout/grind/mix/stabilize technology for the remediation of four caissons at the 618-11 Burial Ground at US Department of Energy Hanford Site. The test site was constructed such that multiple testing areas were provided for the evaluation of various tools, equipment and procedures under conditions that simulated the Hanford site, with representative soils and layout dimensions. (authors)

  12. Simulation of cylindrical flow to a well using the U.S. Geological Survey Modular Finite-Difference Ground-Water Flow Model

    Science.gov (United States)

    Reilly, Thomas E.; Harbaugh, Arlen W.

    1993-01-01

    Cylindrical (axisymmetric) flow to a well is an important specialized topic of ground-water hydraulics and has been applied by many investigators to determine aquifer properties and determine heads and flows in the vicinity of the well. A recent modification to the U.S. Geological Survey Modular Three-Dimensional Finite-Difference Ground-Water Flow Model provides the opportunity to simulate axisymmetric flow to a well. The theory involves the conceptualization of a system of concentric shells that are capable of reproducing the large variations in gradient in the vicinity of the well by decreasing their area in the direction of the well. The computer program presented serves as a preprocessor to the U.S. Geological Survey model by creating the input data file needed to implement the axisymmetric conceptualization. Data input requirements to this preprocessor are described, and a comparison with a known analytical solution indicates that the model functions appropriately.
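
    The concentric-shell idea can be sketched numerically: column widths grow logarithmically away from the well so that steep near-well gradients are resolved, and the Cartesian model's properties are scaled so that each column represents a full annulus. The scaling shown below (multiplying by the circumference at the node radius) is a common simplification offered only as an illustration; the preprocessor's documentation gives the exact factors it writes.

        import numpy as np

        def radial_columns(r_well, r_outer, ncol):
            """Logarithmically spaced radial column edges, node radii, and widths."""
            edges = np.logspace(np.log10(r_well), np.log10(r_outer), ncol + 1)
            nodes = np.sqrt(edges[:-1] * edges[1:])     # geometric-mean node radius per column
            widths = np.diff(edges)
            return edges, nodes, widths

        edges, nodes, widths = radial_columns(r_well=0.1, r_outer=1.0e4, ncol=40)

        # Scale hydraulic conductivity of a one-row Cartesian grid so each column carries
        # the flow of the full ring at its node radius (assumed 2*pi*r scaling).
        k_aquifer = 10.0                                # m/day, placeholder value
        k_model = k_aquifer * 2.0 * np.pi * nodes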

  13. Constraining the DEM and the kinematics of an unstable slope in the Maurienne Valley (French Alps) from remote and ground optical techniques

    Science.gov (United States)

    Valentin, Johann; Donze, Frédéric; Jongmans, Denis; Yvart, Sébastien; Lacroix, Pascal; Brenguier, Ombeline; Baillet, Laurent; Lescurier, Anne; Muller, Nicolas; Larose, Eric

    2015-04-01

    Clay-rich rocks are abundant in the Alps, where they make up a significant part of the sedimentary cover. When they are exposed, these rocks may form reliefs that are affected by rock falls. The failed material quickly transforms into rock debris with a fine-grained matrix, which accumulates on slopes and in gullies. Historical chronicles show that devastating debris flows can be triggered in the Alpine valleys in case of heavy rainfall. An unstable slope located in the Flysch zone (of Eocene age) was chosen in the Maurienne valley (French Alps), which is 1 km wide at this location. The study area (0.3 km2) exhibits a rough topography with a slope varying between 20° at the bottom and 90° at the top, for a difference in elevation of about 500 m. The site has been affected by three rock falls (with a volume of about 30,000 m3) in the last fifteen years. The fallen material has filled two gullies. A debris flow occurred in one of these gullies in January 2012 and covered a road to a thickness of 4 m. Seismic prospecting was carried out in this 400 m long gully and at the cliff top. The results showed that the rock is strongly fractured and deconsolidated over a thickness of at least 10 m and could be affected by further collapses in the near future. A displacement measurement strategy, based on low-cost remote sensing techniques, has been developed in order to obtain spatially distributed information on the kinematics of the slope and to better understand the double mechanism (fall-flow). Different remote sensing optical techniques using various platforms (satellite, helicopter, drone, and ground) were first applied in order to obtain DEMs of the site at various scales. The resolution ranges from 1 cm (close-range terrestrial optical photogrammetry) to 2 m (Pleiades images), with 50 cm for the helicopter. The accuracies of the DEMs were estimated from the results of a differential GPS survey. We discuss the optimum strategy to monitor both the flow and the cliff.

  14. Survey on Ranging Sensors and Cooperative Techniques for Relative Positioning of Vehicles

    Directory of Open Access Journals (Sweden)

    Fabian de Ponte Müller

    2017-01-01

    Full Text Available Future driver assistance systems will rely on accurate, reliable and continuous knowledge of the position of other road participants, including pedestrians, bicycles and other vehicles. The usual approach to tackle this requirement is to use on-board ranging sensors inside the vehicle. Radar, laser scanners or vision-based systems are able to detect objects in their line-of-sight. In contrast to these non-cooperative ranging sensors, cooperative approaches follow a strategy in which other road participants actively support the estimation of the relative position. The limitations of on-board ranging sensors regarding their detection range and angle of view and their susceptibility to blockage can be addressed by using a cooperative approach based on vehicle-to-vehicle communication. The fusion of both cooperative and non-cooperative strategies seems to offer the largest benefits regarding accuracy, availability and robustness. This survey offers the reader a comprehensive review of different techniques for vehicle relative positioning. The reader will learn the important performance indicators when it comes to relative positioning of vehicles, the different technologies that are both commercially available and currently under research, their expected performance and their intrinsic limitations. Moreover, the latest research in the area of vision-based systems for vehicle detection, as well as the latest work on GNSS-based vehicle localization and vehicular communication for relative positioning of vehicles, are reviewed. The survey also includes the research work on the fusion of cooperative and non-cooperative approaches to increase the reliability and the availability.

  15. Survey on Ranging Sensors and Cooperative Techniques for Relative Positioning of Vehicles.

    Science.gov (United States)

    de Ponte Müller, Fabian

    2017-01-31

    Future driver assistance systems will rely on accurate, reliable and continuous knowledge of the position of other road participants, including pedestrians, bicycles and other vehicles. The usual approach to tackle this requirement is to use on-board ranging sensors inside the vehicle. Radar, laser scanners or vision-based systems are able to detect objects in their line-of-sight. In contrast to these non-cooperative ranging sensors, cooperative approaches follow a strategy in which other road participants actively support the estimation of the relative position. The limitations of on-board ranging sensors regarding their detection range and angle of view and their susceptibility to blockage can be addressed by using a cooperative approach based on vehicle-to-vehicle communication. The fusion of both cooperative and non-cooperative strategies seems to offer the largest benefits regarding accuracy, availability and robustness. This survey offers the reader a comprehensive review of different techniques for vehicle relative positioning. The reader will learn the important performance indicators when it comes to relative positioning of vehicles, the different technologies that are both commercially available and currently under research, their expected performance and their intrinsic limitations. Moreover, the latest research in the area of vision-based systems for vehicle detection, as well as the latest work on GNSS-based vehicle localization and vehicular communication for relative positioning of vehicles, are reviewed. The survey also includes the research work on the fusion of cooperative and non-cooperative approaches to increase the reliability and the availability.
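
    In its simplest static form, the fusion of cooperative and non-cooperative estimates advocated above reduces to an inverse-variance combination of a V2V/GNSS-derived relative position and an on-board ranging measurement. The one-dimensional sketch below uses assumed noise levels and is not a full tracking filter.

        def fuse_relative_distance(d_gnss, var_gnss, d_radar, var_radar):
            """Inverse-variance fusion of two independent estimates of the inter-vehicle distance."""
            w = (1.0 / var_gnss) / (1.0 / var_gnss + 1.0 / var_radar)
            fused = w * d_gnss + (1.0 - w) * d_radar
            fused_var = 1.0 / (1.0 / var_gnss + 1.0 / var_radar)
            return fused, fused_var

        # Cooperative estimate: difference of GNSS positions shared over V2V (sigma ~ 3 m, assumed).
        # Non-cooperative estimate: radar range to the lead vehicle (sigma ~ 0.5 m, assumed).
        d, var = fuse_relative_distance(d_gnss=42.0, var_gnss=3.0 ** 2, d_radar=40.8, var_radar=0.5 ** 2)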

  16. Direct determination of ground-state transition widths of low-lying dipole states in 140Ce with the self-absorption technique

    Directory of Open Access Journals (Sweden)

    C. Romig

    2015-05-01

    Full Text Available The technique of self-absorption has been applied for the first time to study the decay pattern of low-lying dipole states of 140Ce. In particular, ground-state transition widths Γ0 and branching ratios Γ0/Γ to the ground state have been investigated in the energy domain of the pygmy dipole resonance. Relative self-absorption measurements allow for a model-independent determination of Γ0. Without the need to perform full spectroscopy of all decay channels, the branching ratio to the ground state can also be determined. The experiment on 140Ce was conducted at the bremsstrahlung facility of the superconducting Darmstadt electron linear accelerator S-DALINAC. In total, the self-absorption and, thus, Γ0 were determined for 104 excited states of 140Ce. The obtained results are presented and discussed with respect to simulations of γ cascades using the DICEBOX code.

  17. Results of detailed ground geophysical surveys for locating and differentiating waste structures in waste management area 'A' at Chalk River Laboratories, Ontario

    Energy Technology Data Exchange (ETDEWEB)

    Tomsons, D.K.; Street, P.J.; Lodha, G.S

    1999-07-01

    Waste Management Area 'A' (WMA 'A'), located in the outer area of the Chalk River Laboratories (CRL) was in use as a waste burial site from 1946 to 1955. Waste management structures include debris-filled trenches, concrete bunkers and miscellaneous contaminated solid materials, and ditches and pits used for liquid dispersal. In order to update historical records, it was proposed to conduct detailed ground geophysical surveys to define the locations of waste management structures in WMA 'A', assist in planning of the drilling and sampling program to provide ground truth for the geophysics investigation and to predict the nature and locations of unknown/undefined shallow structures. A detailed ground geophysical survey grid was established with a total of 127 grid lines, oriented NNE and spaced one metre apart. The geophysical surveys were carried out during August and September, 1996. The combination of geophysical tools used included the Geonics EM61 metal detector, the GSM-19 magnetometer/gradiometer and a RAMAC high frequency ground penetrating radar system. The geophysical surveys were successful in identifying waste management structures and in characterizing to some extent, the composition of the waste. The geophysical surveys are able to determine the presence of most of the known waste management structures, especially in the western and central portions of the grid which contain the majority of the metallic waste. The eastern portion of the grid has a completely different geophysical character. While historical records show that trenches were dug, they are far less evident in the geophysical record. There is clear evidence for a trench running between lines 30E and 63E at 70 m. There are indications from the radar survey of other trench-like structures in the eastern portion. EM61 data clearly show that there is far less metallic debris in the eastern portion. The geophysical surveys were also successful in identifying

  18. NOS/NGS activities to support development of radio interferometric surveying techniques

    Science.gov (United States)

    Carter, W. E.; Dracup, J. F.; Hothem, L. D.; Robertson, D. S.; Strange, W. E.

    1980-01-01

    National Geodetic Survey activities towards the development of operational geodetic survey systems based on radio interferometry are reviewed. Information about the field procedures, data reduction and analysis, and the results obtained to date is presented.

  19. A survey on the state-of-the-technique on software based pipeline leak detection systems

    Energy Technology Data Exchange (ETDEWEB)

    Baptista, Renan Martins [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Centro de Pesquisas. Div. de Explotacao]. E-mail: renan@cenpes.petrobras.com.br

    2000-07-01

    This paper describes a general technical survey of software-based leak detection systems (LDS), covering their main technological features, the operational situations where they are feasible, and the scenarios within the Brazilian pipeline network. The decision on which LDS to choose for a given pipeline is a matter of cost, suitability and feasibility. A simpler, low-cost, less effective product with a fast installation and tuning procedure may be more suitable for a given operational site (pipeline configuration, kind of fluid, quality of instrumentation and communication) than a complex, high-cost, efficient product that takes a long time to be properly installed. Other pipelines may have a level of complexity that requires a more sophisticated system. A small number of them will simply not be suitable for an LDS: this may be caused by the poor quality or absence of instrumentation or, in the worst case, by the lack of technology to approach that specific case, e.g., multiphase flow lines, or lines that commonly operate in slack condition. The paper addresses the general state of the technique and makes some initial comments on the costs. (author)
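
    The simplest class of software-based LDS mentioned above is the compensated flow (volume) balance. The sketch below illustrates that principle only; the threshold, persistence rule and variable names are illustrative assumptions, not details of any PETROBRAS system.

        # Minimal volume-balance leak check (illustrative sketch, not a commercial LDS).
        # A leak is flagged when inflow minus outflow, corrected for line-pack change,
        # exceeds a threshold for several consecutive samples.

        def leak_alarm(q_in, q_out, linepack_change, threshold=5.0, persistence=3):
            """q_in, q_out, linepack_change: per-time-step flow/volume values (m3/h).
            threshold: imbalance (m3/h) above which a sample is suspicious (assumed value).
            persistence: consecutive suspicious samples needed to raise an alarm."""
            consecutive = 0
            for qi, qo, dlp in zip(q_in, q_out, linepack_change):
                imbalance = qi - qo - dlp  # what entered minus what left minus storage change
                consecutive = consecutive + 1 if imbalance > threshold else 0
                if consecutive >= persistence:
                    return True  # probable leak
            return False

        # Example: a sustained 8 m3/h imbalance from the 4th sample onwards triggers the alarm.
        print(leak_alarm([100] * 10, [100, 100, 100, 92, 92, 92, 92, 92, 92, 92], [0] * 10))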

  20. RESOLVE Survey Photometry and Volume-limited Calibration of the Photometric Gas Fractions Technique

    CERN Document Server

    Eckert, Kathleen D; Stark, David V; Moffett, Amanda J; Norris, Mark A; Snyder, Elaine M; Hoversten, Erik A

    2015-01-01

    We present custom-processed UV, optical, and near-IR photometry for the RESOLVE survey, a volume-limited census of stellar, gas, and dynamical mass within two subvolumes of the nearby universe (RESOLVE-A and -B), complete down to baryonic mass ~10^9.1-9.3 Msun. In contrast to standard pipeline photometry (e.g., SDSS), our photometry uses optimal background subtraction, avoids suppressing color gradients, and includes systematic errors. With these improvements, we measure brighter magnitudes, larger radii, bluer colors, and a real increase in scatter around the red sequence. Combining stellar masses from our photometry with the RESOLVE-A HI mass census, we create volume-limited calibrations of the photometric gas fractions (PGF) technique, which predicts gas-to-stellar mass ratios (G/S) from galaxy colors and optional additional parameters. We analyze G/S-color residuals vs. potential third parameters, finding that axial ratio is the best independent and physically meaningful third parameter. We define a "modi...
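
    As an illustration of the photometric gas fractions idea described above (predicting gas-to-stellar mass ratios from a colour), the sketch below fits and applies a single-colour linear calibration to synthetic data; the coefficients and scatter are invented and do not reproduce the RESOLVE calibration.

        # Sketch of the photometric gas fractions (PGF) idea: fit a linear relation between
        # a galaxy colour and log10(G/S), then predict gas-to-stellar mass ratios from colour.
        # Synthetic numbers only; the actual RESOLVE calibration is not reproduced here.
        import numpy as np

        rng = np.random.default_rng(0)
        color = rng.uniform(1.0, 3.5, 200)                    # e.g. a (u - J)-like colour
        log_gs_true = 1.9 - 1.1 * color                       # assumed underlying trend
        log_gs_obs = log_gs_true + rng.normal(0.0, 0.3, 200)  # scatter around the relation

        slope, intercept = np.polyfit(color, log_gs_obs, 1)   # the PGF calibration step
        predicted_gs = 10 ** (intercept + slope * 2.0)        # predict G/S for colour = 2.0
        print(f"fit: log10(G/S) = {intercept:.2f} + {slope:.2f} * colour; G/S(2.0) = {predicted_gs:.2f}")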

  1. Physicians' knowledge about ionizing radiation and radiological imaging techniques: a cross-sectional survey.

    Science.gov (United States)

    Yucel, Aylin; Alyesil, Cansu; Sim, Saadet

    2011-06-01

    Radiological examinations are critical for the evaluation of many disorders in daily practice. The aim was to determine the knowledge of ionizing radiation and radiological imaging techniques among physicians of various grades. A cross-sectional survey was carried out of 55 physicians with a mean age of 35.7 ± 6.0 years (age range 25-52 years) in a university hospital. A questionnaire that tested physicians' knowledge of ionizing radiation and its risks was distributed by medical school students. Among the participants, 32 (58.2%) were consultants and 23 (41.8%) were residents. The mean score was 68.2 ± 11.1 (range 37.8-91.8) out of 100. Consultants' scores were lower than residents' (p = 0.040). Consultants had a significantly higher frequency of incorrect answers than residents on the question of whether a CT scan increases lifetime cancer risk (p = 0.036). Years of medical practice do not enhance the level of awareness regarding ionizing radiation.

  2. Development Survey of US Army Unmanned Ground Vehicles

    Institute of Scientific and Technical Information of China (English)

    陈欣; 王立操; 李联邦; 左志奇

    2012-01-01

    US Army unmanned ground vehicles are briefly introduced. The development course of US Army unmanned ground vehicles is described, and their present status and development trends are given. Several insights for the development of China's unmanned ground vehicles are presented.

  3. A comparison of survey techniques on sensitive sexual behavior in Italy.

    Science.gov (United States)

    Caltabiano, Marcantonio; Dalla-Zuanna, Gianpiero

    2013-01-01

    This article compares two national surveys carried out through the most commonly used procedures in Italy: CATI (computer-assisted telephone interviews) and SAQ-FI (self-answered questionnaires following interviews). Both surveys ask two identical questions concerning sensitive sexual behavior: early age at first intercourse and same-sex attraction. The SAQ-FI survey had both unit non-response and item non-response rates much lower than the CATI survey. Moreover, in the CATI survey, the groups with highest item non-response rates were also the groups with the lowest proportions of early intercourse and homosexual attraction. In addition, a differential analysis of the respondents produced diverse results for the two surveys. This is especially true of results by gender for same-sex attraction: Such behavior is more common among men (3.1%) than women (2.9%), according to the CATI survey, whereas the opposite is true of the SAQ-FI survey (6.1% of men vs. 7.7% women). In Italy at the beginning of the 21st century, CATI surveys reveal a lower level of early intercourse and same-sex attraction than SAQ-FI surveys. This article argues that the CATI survey underestimates the true level of these sensitive sexual behaviors in the Italian population.

  4. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: production techniques.

    Science.gov (United States)

    Berry, J; Nesbit, M; Saberi, S; Petridis, H

    2014-09-01

    The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for large laboratories. This study

  5. Three-dimensional dynamic topographic survey of granular flows using photogrammetric techniques

    Science.gov (United States)

    Dallavalle, D.; Scotton, P.; Tecca, P. R.

    2012-04-01

    In order to better characterize the behavior of fast dry granular mass movements, such as dense snow or rock avalanches, laboratory analyses have been undertaken at model scale (Froude similarity, geometrical scale of the order of 50:1 - 100:1). To this end, an experimental flume, consisting of two planes with adjustable inclination, has been used: the upstream plane, with slope varying from 15% to 60%, simulates the flowing zone, and the downstream plane, with slopes ranging from 0% to 30%, simulates the deposition zone. The experimental apparatus has been completed in order to obtain a three-dimensional dynamic topographic survey of the sliding free surface, using photogrammetric techniques. The experiments are being performed using a maximum of eight industrial digital video-cameras. A full photogrammetric camera calibration process was first conducted to define the parameters of inner orientation of the cameras and of the objective lens distortion, in order to reduce the uncertainties in the collinearity equations. The recording is digitally triggered at the same time for all the cameras. A dedicated acquisition code, based on LabView software, has been written to achieve the best accuracy in frame synchronization. The surface is reconstructed, at different times, using the frames taken at the same instant from the different video-cameras. The photogrammetric analysis is being performed by means of commercial dedicated software. The expected final product of the research is the tuning of an automatic procedure for the photogrammetric analysis of the recorded frame series, in order to describe the dynamic evolution of the motion of a gravity-driven granular mass, together with the limits of the proposed techniques. The dynamic three-dimensional reconstruction of the free surface of the sliding granular mass will be used in the calibration process of granular mathematical-numerical models. The comprehension and the estimation of the

  6. Preferred tools and techniques for implantation of cardiac electronic devices in Europe: results of the European Heart Rhythm Association survey.

    Science.gov (United States)

    Bongiorni, Maria Grazia; Proclemer, Alessandro; Dobreanu, Dan; Marinskis, Germanas; Pison, Laurent; Blomstrom-Lundqvist, Carina

    2013-11-01

    The aim of this European Heart Rhythm Association (EHRA) survey was to assess clinical practice in relation to the tools and techniques used for cardiac implantable electronic device procedures in the European countries. Responses to the questionnaire were received from 62 members of the EHRA research network. The survey involved high-, medium-, and low-volume implanting centres, performing, respectively, more than 200, 100-199, and under 100 implants per year. The following topics were explored: the side approach for implantation, surgical techniques for pocket incision, first venous access for lead implantation, preference of lead fixation, preferred coil number for implantable cardioverter-defibrillator (ICD) leads, right ventricular pacing site, generator placement site, subcutaneous ICD implantation, specific tools and techniques for cardiac resynchronization therapy (CRT), lead implantation sequence in CRT, coronary sinus cannulation technique, target site for left ventricular lead placement, strategy in left ventricular lead implant failure, mean CRT implantation time, optimization of the atrioventricular (AV) and ventriculo-ventricular intervals, CRT implants in patients with permanent atrial fibrillation, and AV node ablation in patients with permanent AF. This panoramic view allows us to identify operator preferences regarding the techniques and tools for device implantation in Europe. The results showed different practices in all the fields investigated; nevertheless, the survey also outlines good adherence to common standards and recommendations.

  7. Habitat Modeling in Complex Streams: Comparison of Terrestrial Laser Scanning and Traditional Surveying Techniques for Topographic Surface Generation

    Science.gov (United States)

    Hession, W. C.; Kozarek, J. L.; Resop, J. P.

    2009-12-01

    Accurate stream topography measurement is important for many environmental and ecological applications, such as hydraulic modeling and habitat characterization. Topographic surveys are commonly created from point measurements using methods such as total station or global positioning system (GPS) surveying. However, surveying can be time intensive and limited by poor spatial resolution and difficulty in measuring complex morphology such as boulder-filled mountain streams. This can lead to measurement and interpolation errors, which can propagate to model uncertainty. Terrestrial laser scanning (TLS) has the potential to create high-resolution, high-accuracy topographic maps. Two methods, total station surveying and TLS, were used to measure the topography for an 80-meter forested reach on the Staunton River in Shenandoah National Park, Virginia, USA. The 2,500 surveyed points were directly compared to the TLS point cloud (approximately 9,500,000 points). The total station and TLS datasets were processed to create unique digital elevation models (DEM) of the stream reach. The resulting DEMs were used to evaluate uncertainties in topographic surfaces due to errors in traditional surveying techniques, to evaluate the propagation of uncertainty due to these errors in habitat modeling, and to evaluate the efficacy of utilizing TLS for complex, boulder streams. Figure caption: Comparison of resulting topography of a complex boulder stream using terrestrial laser scanning (grey-scale surfaces) and total station surveying (grid lines).

  8. Simulated JWST/NIRISS Spectroscopy of Anticipated TESS Planets and Selected Super-Earths Discovered from K2 and Ground-Based Surveys

    Science.gov (United States)

    Louie, Dana; Albert, Loic; Deming, Drake

    2017-01-01

    The 2018 launch of the James Webb Space Telescope (JWST), coupled with the 2017 launch of the Transiting Exoplanet Survey Satellite (TESS), heralds a new era in exoplanet science, with TESS projected to detect over one thousand transiting sub-Neptune-sized planets (Ricker et al., 2014) and JWST offering unprecedented spectroscopic capabilities. Sullivan et al. (2015) used Monte Carlo simulations to predict the properties of the planets that TESS is likely to detect, and published a catalog of 962 simulated TESS planets. Prior to TESS launch, the re-scoped Kepler K2 mission and ground-based surveys such as MEarth continue to seek nearby Earth-like exoplanets orbiting M-dwarf host stars. The exoplanet community will undoubtedly employ JWST for atmospheric characterization follow-up studies of promising exoplanets, but the targeted planets for these studies must be chosen wisely to maximize JWST science return. The goal of this project is to estimate the capabilities of JWST's Near InfraRed Imager and Slitless Spectrograph (NIRISS), operating with the GR700XD grism in Single Object Slitless Spectroscopy (SOSS) mode, during observations of exoplanets transiting their host stars. We compare results obtained for the simulated TESS planets, confirmed K2-discovered super-Earths, and exoplanets discovered using ground-based surveys. By determining the target planet characteristics that result in the most favorable JWST observing conditions, we can optimize the choice of target planets in future JWST follow-on atmospheric characterization studies.

  9. The Hannibal Community Survey; A Case Study in a Community Development Technique.

    Science.gov (United States)

    Croll, John A.

    Disturbed by the community's negative attitude toward its prospects for progress, the Hannibal (Missouri) Chamber of Commerce initiated a community self-survey to improve the situation. The questionnaire survey concentrated on felt needs relating to city government, retail facilities and services, recreation, religion, education, industrial…

  10. Extension and application of a scaling technique for duplication of in-flight aerodynamic heat flux in ground test facilities

    NARCIS (Netherlands)

    Veraar, R.G.

    2009-01-01

    To enable direct experimental duplication of the in-flight heat flux distribution on supersonic and hypersonic vehicles, an aerodynamic heating scaling technique has been developed. The scaling technique is based on the analytical equations for convective heat transfer for laminar and turbulent boundary layers.

  11. Ground-Control Networks for Image Based Surface Reconstruction: An Investigation of Optimum Survey Designs Using UAV Derived Imagery and Structure-from-Motion Photogrammetry

    Directory of Open Access Journals (Sweden)

    Toby N. Tonkin

    2016-09-01

    Full Text Available The use of small UAV (Unmanned Aerial Vehicle) and Structure-from-Motion (SfM) with Multi-View Stereopsis (MVS) for acquiring survey datasets is now commonplace; however, aspects of the SfM-MVS workflow require further validation. This work aims to provide guidance for scientists seeking to adopt this aerial survey method by investigating aerial survey data quality in relation to the application of ground control points (GCPs) at a site of undulating topography (Ennerdale, Lake District, UK). Sixteen digital surface models (DSMs) were produced from a UAV survey using a varying number of GCPs (3-101). These DSMs were compared to 530 dGPS spot heights to calculate vertical error. All DSMs produced reasonable surface reconstructions (vertical root-mean-square error (RMSE) of <0.2 m); however, an improvement in DSM quality was found where four or more GCPs (up to 101 GCPs) were applied, with errors falling to within the suggested point quality range of the survey equipment used for GCP acquisition (e.g., vertical RMSE of <0.09 m). The influence of a poor GCP distribution was also investigated by producing a DSM using an evenly distributed network of GCPs, and comparing it to a DSM produced using a clustered network of GCPs. The results accord with existing findings, where vertical error was found to increase with distance from the GCP cluster. Specifically, vertical error and distance to the nearest GCP followed a strong polynomial trend (R2 = 0.792). These findings contribute to our understanding of the sources of error when conducting a UAV-SfM survey and provide guidance on the collection of GCPs. Evidence-driven UAV-SfM survey designs are essential for practitioners seeking reproducible, high-quality topographic datasets for detecting surface change.
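
    The two quality checks reported above (vertical RMSE against dGPS check points, and the polynomial trend of error with distance to the nearest GCP) can be scripted in a few lines. The sketch below assumes the elevations and distances have already been extracted at the check-point locations; array names are placeholders.

        # Sketch of the two checks described above: vertical RMSE of a DSM against dGPS
        # spot heights, and a second-order polynomial fit of error versus distance to the
        # nearest ground control point. Arrays are placeholders for the real survey data.
        import numpy as np

        def vertical_rmse(dsm_elev, gps_elev):
            """Both arguments: 1-D arrays of elevations (m) at the check-point locations."""
            diff = np.asarray(dsm_elev, float) - np.asarray(gps_elev, float)
            return float(np.sqrt(np.mean(diff ** 2)))

        def error_vs_gcp_distance(abs_error, dist_to_nearest_gcp):
            """Fit error = a*d^2 + b*d + c and report R^2, mirroring the polynomial trend
            noted in the abstract (R^2 = 0.792 there; inputs here are illustrative)."""
            abs_error = np.asarray(abs_error, float)
            dist = np.asarray(dist_to_nearest_gcp, float)
            coeffs = np.polyfit(dist, abs_error, 2)
            fitted = np.polyval(coeffs, dist)
            ss_res = np.sum((abs_error - fitted) ** 2)
            ss_tot = np.sum((abs_error - np.mean(abs_error)) ** 2)
            return coeffs, 1.0 - ss_res / ss_tot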

  12. Ground Gravity, Magnetic and Electromagnetic Surveys on a Crater on Basalt of Bajada del Diablo Astrobleme-Strewn Field

    Science.gov (United States)

    Acevedo, R. D.; Prezzi, C.; Orgeira, M. J.; Rocca, M.; Martínez, O.; Ponce, J. F.; Corbella, H.; Rabassa, J.; González-Guillot, M.; Subías, I.

    2014-09-01

    With the aim of further investigating the circular structures of Bajada del Diablo, we carried out geophysical surveys, and we conclude that the geophysical features can be satisfactorily explained assuming an extraterrestrial projectile impact.

  13. Use of Tracer Dye Techniques in Assessing Ground Water Availability and Quality in a Karst Aquifer System (Project Overview)

    Science.gov (United States)

    Problem: The Leetown Science Center, a ~500-acre research facility operated by the U.S. Geological Survey (USGS) Biological Resources Division (BRD) in West Virginia, investigates the health and habitats of aquatic species. Large quantities of good-quality cold water are needed ...

  14. Using economic valuation techniques to inform water resources management: a survey and critical appraisal of available techniques and an application.

    Science.gov (United States)

    Birol, Ekin; Karousakis, Katia; Koundouri, Phoebe

    2006-07-15

    The need for economic analysis for the design and implementation of efficient water resources management policies is well documented in the economics literature. This need is also emphasised in the European Union's recent Water Framework Directive (2000/60/EC), and is relevant to the objectives of Euro-limpacs, an EU funded project which inter alia, aims to provide a decision-support system for valuing the effects of future global change on Europe's freshwater ecosystems. The purpose of this paper is to define the role of economic valuation techniques in assisting in the design of efficient, equitable and sustainable policies for water resources management in the face of environmental problems such as pollution, intensive land use in agriculture and climate change. The paper begins with a discussion of the conceptual economic framework that can be used to inform water policy-making. An inventory of the available economic valuation methods is presented and the scope and suitability of each for studying various aspects of water resources are critically discussed. Recent studies that apply these methods to water resources are reviewed. Finally, an application of one of the economic valuation methods, namely the contingent valuation method, is presented using a case study of the Cheimaditida wetland in Greece.

  15. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Science.gov (United States)

    Hernandez, Wilmar

    2007-01-01

    In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is presented. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art in applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results. These results show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight, because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
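
    As a concrete instance of the optimal-filter class contrasted with classical filters above, the sketch below applies a scalar Kalman filter to a noisy sensor signal; the process and measurement variances are illustrative assumptions rather than values from the paper.

        # Scalar Kalman filter for a slowly varying sensor signal (random-walk state model).
        # Process/measurement variances are illustrative; tuning them is the design problem
        # that separates an "optimal" filter from a fixed classical low-pass filter.
        import numpy as np

        def kalman_smooth(measurements, q=1e-4, r=1e-2):
            """measurements: iterable of noisy readings; q: process variance; r: measurement variance."""
            x, p = 0.0, 1.0          # initial state estimate and its variance
            estimates = []
            for z in measurements:
                p = p + q            # predict: state unchanged, uncertainty grows
                k = p / (p + r)      # Kalman gain
                x = x + k * (z - x)  # update with the new measurement
                p = (1.0 - k) * p
                estimates.append(x)
            return np.array(estimates)

        noisy = 1.0 + np.random.normal(0.0, 0.1, 500)   # constant true value plus sensor noise
        print(kalman_smooth(noisy)[-1])                 # converges towards 1.0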

  16. A Survey on Security Issues in Ad Hoc Routing Protocols and their Mitigation Techniques

    CERN Document Server

    Kayarkar, Harshavardhan

    2012-01-01

    Mobile Ad hoc Networks (MANETS) are transient networks of mobile nodes, connected through wireless links, without any fixed infrastructure or central management. Due to the self-configuring nature of these networks, the topology is highly dynamic. This makes the Ad Hoc Routing Protocols in MANETS highly vulnerable to serious security issues. In this paper, we survey the common security threats and attacks and summarize the solutions suggested in the survey to mitigate these security vulnerabilities.

  17. Prediction of peak ground acceleration of Iran’s tectonic regions using a hybrid soft computing technique

    Institute of Scientific and Technical Information of China (English)

    Mostafa Gandomi; Mohsen Soltanpour; Mohammad R. Zolfaghari; Amir H. Gandomi

    2016-01-01

    A new model is derived to predict the peak ground acceleration (PGA) utilizing a hybrid method coupling an artificial neural network (ANN) and simulated annealing (SA), called SA-ANN. The proposed model relates PGA to earthquake source-to-site distance, earthquake magnitude, average shear-wave velocity, faulting mechanism, and focal depth. A database of strong ground-motion recordings of 36 earthquakes, which occurred in Iran's tectonic regions, is used to establish the model. For further validity verification, the SA-ANN model is employed to predict the PGA of a part of the database beyond the training data domain. The proposed SA-ANN model is compared with a simple ANN in addition to 10 well-known models proposed in the literature. The proposed model's performance is superior to the single ANN and other existing attenuation models. The SA-ANN model is highly correlated to the actual records (R = 0.835 and r = 0.0908), and it is subsequently converted into a tractable design equation.
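
    The hybrid idea, network weights selected by simulated annealing rather than gradient descent, can be prototyped compactly. The sketch below uses a tiny one-hidden-layer network and SciPy's dual_annealing on synthetic data; the architecture, bounds and target function are assumptions and do not reproduce the SA-ANN model or the Iranian strong-motion database.

        # Toy SA-ANN: a one-hidden-layer network whose weights are found by simulated
        # annealing (scipy's dual_annealing) rather than gradient descent. Inputs mimic
        # the predictors named above (distance, magnitude, Vs30, mechanism, depth);
        # the synthetic target is NOT a real attenuation relation.
        import numpy as np
        from scipy.optimize import dual_annealing

        rng = np.random.default_rng(1)
        X = rng.uniform(0.0, 1.0, (200, 5))                    # 5 normalised predictors
        y = np.exp(-2.0 * X[:, 0]) * (0.5 + X[:, 1])           # synthetic "PGA" target

        H = 4                                                   # hidden neurons
        n_w = 5 * H + H + H + 1                                 # weights plus biases

        def unpack(w):
            i = 0
            W1 = w[i:i + 5 * H].reshape(5, H); i += 5 * H
            b1 = w[i:i + H]; i += H
            W2 = w[i:i + H]; i += H
            b2 = w[i]
            return W1, b1, W2, b2

        def predict(w, X):
            W1, b1, W2, b2 = unpack(w)
            hidden = np.tanh(X @ W1 + b1)
            return hidden @ W2 + b2

        def mse(w):
            return float(np.mean((predict(w, X) - y) ** 2))

        result = dual_annealing(mse, bounds=[(-3.0, 3.0)] * n_w, maxiter=200, seed=1)
        print("training MSE:", result.fun)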

  18. Improving Standard Poststratification Techniques For Random-Digit-Dialing Telephone Surveys

    Directory of Open Access Journals (Sweden)

    Michael P. Battaglia

    2008-03-01

    Full Text Available Random-digit-dialing surveys in the United States such as the Behavioral Risk Factor Surveillance System (BRFSS) typically poststratify on age, gender and race/ethnicity using control totals from an appropriate source such as the 2000 Census, the Current Population Survey, or the American Community Survey. Using logistic regression and interaction detection software we identified key "main effect" socio-demographic variables and important two-factor interactions associated with several health risk factor outcomes measured in the BRFSS, one of the largest annual RDD surveys in the United States. A procedure was developed to construct control totals, which were consistent with estimates of age, gender, and race/ethnicity obtained from a commercial source and distributions of other demographic variables from the Current Population Survey. Raking was used to incorporate main effects and two-factor interaction margins into the weighting of the BRFSS survey data. The resulting risk factor estimates were then compared with those based on the current BRFSS weighting methodology and mean squared error estimates were developed. The research demonstrates that by identifying socio-demographic variables associated with key outcome variables and including these variables in the weighting methodology, nonresponse bias can be substantially reduced.
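
    The raking step described above is iterative proportional fitting of the survey weights to the control-total margins. A minimal sketch with two margins follows; the categories, control totals and starting weights are invented for illustration.

        # Minimal raking (iterative proportional fitting) of survey weights to two control
        # margins, the core of the reweighting step described above. The margins and the
        # starting weights are invented for illustration.
        import numpy as np

        def rake(weights, groups_a, groups_b, targets_a, targets_b, iters=50):
            """weights: initial design weights; groups_a/b: integer category per respondent;
            targets_a/b: dicts of population totals for each category of the two margins."""
            w = np.asarray(weights, dtype=float).copy()
            for _ in range(iters):
                for groups, targets in ((groups_a, targets_a), (groups_b, targets_b)):
                    for cat, total in targets.items():
                        mask = groups == cat
                        current = w[mask].sum()
                        if current > 0:
                            w[mask] *= total / current   # scale this margin to its control total
            return w

        age = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 2])         # age group per respondent
        sex = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1])         # sex per respondent
        w = rake(np.ones(10), age, sex,
                 targets_a={0: 300, 1: 300, 2: 400},           # population totals by age
                 targets_b={0: 480, 1: 520})                   # population totals by sex
        print(w.round(1), w.sum())                             # weights now sum to 1000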

  19. Embryo transfer techniques: an American Society for Reproductive Medicine survey of current Society for Assisted Reproductive Technology practices.

    Science.gov (United States)

    Toth, Thomas L; Lee, Malinda S; Bendikson, Kristin A; Reindollar, Richard H

    2017-04-01

    To better understand practice patterns and opportunities for standardization of ET. Cross-sectional survey. Not applicable. Not applicable. An anonymous 82-question survey was emailed to the medical directors of 286 Society for Assisted Reproductive Technology member IVF practices. A follow-up survey composed of three questions specific to ET technique was emailed to the same medical directors. Descriptive statistics of the results were compiled. The survey assessed policies, protocols, restrictions, and specifics pertinent to the technique of ET. There were 117 (41%) responses; 32% practice in academic settings and 68% in private practice. Responders were experienced clinicians, half of whom had performed transfer followed immediately with ET (40%); [2] afterload transfer (30%); and [3] direct transfer without prior trial or afterload (27%). Embryos are discharged in the upper (66%) and middle thirds (29%) of the endometrial cavity and not closer than 1-1.5 cm from fundus (87%). Details of each step were reported and allowed the development of a "common" practice ET procedure. ET training and practices vary widely. Improved training and standardization based on outcomes data and best practices are warranted. A common practice procedure is suggested for validation by a systematic literature review. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  20. Use of cognitive interview techniques in the development of nutrition surveys and interactive nutrition messages for low-income populations.

    Science.gov (United States)

    Carbone, Elena T; Campbell, Marci K; Honess-Morreale, Lauren

    2002-05-01

    The effectiveness of dietary surveys and educational messages is dependent in part on how well the target audience's information processing needs and abilities are addressed. Use of pilot testing is helpful; however, problems with wording and language are often not revealed. Cognitive interview techniques offer 1 approach to assist dietitians in understanding how audiences process information. With this method, respondents are led through a survey or message and asked to paraphrase items; discuss thoughts, feelings, and ideas that come to mind; and suggest alternative wording. As part of a US Department of Agriculture-funded nutrition education project, 23 cognitive interviews were conducted among technical community college students in North Carolina. Interview findings informed the development of tailored computer messages and survey questions. Better understanding of respondents' cognitive processes significantly improved the language and approach used in this intervention. Interview data indicated 4 problem areas: vague or ineffective instructions, confusing questions and response options, variable interpretation of terms, and misinterpretation of dietary recommendations. Interviews also provided insight into the meaning of diet-related stages of change. These findings concur with previous research suggesting that cognitive interview techniques are a valuable tool in the formative evaluation and development of nutrition surveys and materials.

  1. Campaign 9 of the $K2$ Mission: Observational Parameters, Scientific Drivers, and Community Involvement for a Simultaneous Space- and Ground-based Microlensing Survey

    CERN Document Server

    Henderson, Calen B; Street, Rachel A; Bennett, David P; Hogg, David W; Poleski, R; Barclay, T; Barentsen, G; Howell, S B; Udalski, A; Szymański, M K; Skowron, J; Mróz, P; Kozłowski, S; Wyrzykowski, Ł; Pietrukowicz, P; Soszyński, I; Ulaczyk, K; Pawlak, M; Sumi, T; Abe, F; Asakura, Y; Barry, R K; Bhattacharya, A; Bond, I A; Donachie, M; Freeman, M; Fukui, A; Hirao, Y; Itow, Y; Koshimoto, N; Li, M C A; Ling, C H; Masuda, K; Matsubara, Y; Muraki, Y; Nagakane, M; Ohnishi, K; Oyokawa, H; Rattenbury, N; Saito, To; Sharan, A; Sullivan, D J; Tristram, P J; Yonehara, A; Bachelet, E; Bramich, D A; Cassan, A; Dominik, M; Jaimes, R Figuera; Horne, K; Hundertmark, M; Mao, S; Ranc, C; Schmidt, R; Snodgrass, C; Steele, I A; Tsapras, Y; Wambsganss, J; Akeson, R; Batista, V; Beaulieu, J -P; Beichman, C A; Bozza, V; Bryden, G; Ciardi, D; Cole, A; Coutures, C; Dong, S; Foreman-Mackey, D; Fouqué, P; Gaudi, B S; Kerins, E; Korhonen, H; Jørgensen, U; Lang, D; Lineweaver, C; Marquette, J -B; Mogavero, Federico; Morales, J C; Nataf, D; Pogge, R W; Santerne, A; Shvartzvald, Y; Suzuki, D; Tamura, M; Tisserand, P; Wang, D; Zhu, W

    2016-01-01

    $K2$'s Campaign 9 ($K2$C9) will conduct a $\sim$3.4 deg$^{2}$ survey toward the Galactic bulge from 7 April through 1 July of 2016 that will leverage the spatial separation between $K2$ and the Earth to facilitate measurement of the microlens parallax $\pi_{\rm E}$ for $\gtrsim$120 microlensing events, including several that are planetary in nature as well as many short-timescale microlensing events, which are potentially indicative of free-floating planets (FFPs). These satellite parallax measurements will in turn allow for the direct measurement of the masses of and distances to the lensing systems. In this white paper we provide an overview of the $K2$C9 space- and ground-based microlensing survey. Specifically, we detail the demographic questions that can be addressed by this program, including the frequency of FFPs and the Galactic distribution of exoplanets, the observational parameters of $K2$C9, and the array of ground-based resources dedicated to concurrent observations. Finally, we outline the avenues throug...
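
    For context, the reason a satellite parallax is so valuable is the standard microlensing mass-distance relation (a textbook result, not specific to this white paper): combining $\pi_{\rm E}$ with the angular Einstein radius $\theta_{\rm E}$ gives

        $M_{\rm L} = \frac{\theta_{\rm E}}{\kappa\,\pi_{\rm E}}, \qquad \kappa \equiv \frac{4G}{c^{2}\,\mathrm{au}} \approx 8.14~\mathrm{mas}\,M_{\odot}^{-1}$

    while the relative parallax $\pi_{\rm rel} = \theta_{\rm E}\,\pi_{\rm E}$ yields the lens distance through $D_{\rm L} = \left(\pi_{\rm rel}/\mathrm{au} + 1/D_{\rm S}\right)^{-1}$.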

  2. A Survey of Feature Extraction and Classification Techniques in OCR Systems

    Directory of Open Access Journals (Sweden)

    Rohit Verma

    2012-11-01

    Full Text Available This paper describes a set of feature extraction and classification techniques which play a very important role in the recognition of characters. Feature extraction provides methods with the help of which we can identify characters uniquely and with a high degree of accuracy. Feature extraction helps us to find the shape contained in the pattern. Although a number of techniques are available for feature extraction and classification, the choice of an excellent technique decides the degree of recognition accuracy. A lot of research has been done in this field and new techniques of extraction and classification have been developed. The objective of this paper is to review these techniques, so that the set of these techniques can be appreciated.

  3. A Survey on Voltage Boosting Techniques for Step-Up DC-DC Converters

    DEFF Research Database (Denmark)

    Forouzesh, Mojtaba; Siwakoti, Yam Prasad; Gorji, Saman Asghari;

    2016-01-01

    …research on new voltage boosting techniques is inevitable for various power converter applications. This can be achieved either by additional magnetic or by electric field storage elements with switching elements (switch and/or diode) in different configurations. Such combinations of primary voltage...... boosting techniques and topologies are large in number, which at times may be confusing and difficult to follow/adapt for different applications. Considering these aspects, and in order to make a clear sketch of the general law and framework of various voltage boosting techniques, this paper comprehensively reviews...... different voltage boosting techniques and categorizes them according to their circuit performance....
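
    As the baseline against which such boosting techniques are usually measured, the ideal step-up (boost) converter in continuous conduction mode has the textbook voltage gain (not taken from the paper)

        $\frac{V_{o}}{V_{in}} = \frac{1}{1-D}$

    where $D$ is the switch duty cycle; practical boosting techniques combine magnetic or capacitive storage elements precisely to raise the achievable gain without driving $D$ towards unity, where conduction and diode reverse-recovery losses degrade efficiency.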

  4. Bias in little owl population estimates using playback techniques during surveys

    Directory of Open Access Journals (Sweden)

    Zuberogoitia, I.

    2011-12-01

    Full Text Available To test the efficiency of playback methods to survey little owl (Athene noctua) populations we carried out two studies: (1) we recorded the replies of radio-tagged little owls to calls in a small area; (2) we recorded call broadcasts to estimate the effectiveness of the method to detect the presence of little owl. In the first study, we detected an average of 8.12 owls in the 30′ survey period, a number that is close to the real population; we also detected significant little owl movements from the initial location (before the playback) to the next locations during the survey period. However, we only detected an average of 2.25 and 5.37 little owls in the first 5′ and 10′, respectively, of the survey time. In the second study, we detected 137 little owl territories in 105 positive sample units. The occupation rate was 0.35, the estimated occupancy was 0.393, and the probability of detection was 0.439. The estimated cumulative probability of detection suggests that a minimum of four sampling times would be needed in an extensive survey to detect 95% of the areas occupied by little owls.

  5. Ground penetration radar 3-D modelling using the finite difference technique; Modelagem tridimensional de dados de radar (GPR) usando a tecnica das diferencas finitas

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Rene Santos; Botelho, Marco Antonio Barsottelli [Bahia Univ., Salvador, BA (Brazil). Inst. de Geociencias. Programa de Pesquisa e Pos-graduacao em Geofisica

    1995-12-31

    The Ground Penetrating Radar (GPR) is a surface geophysical method that can produce continuous high-resolution profiles much better than seismic methods, but the propagation phenomena of the electromagnetic (EM) pulse can be harder to interpret than those of seismic pulses. The propagation of EM waves in the air adds additional reflections to the GPR data gathers. In this work we illustrate these phenomena using 3-D models with structures above the ground. The 3-D synthetic common shot gathers are important tools to analyse the GPR response in order to improve the understanding of the geometry of overburden conditions for activities such as geotechnical investigations or studies of factors controlling groundwater flow. Groups of 3-D synthetic common shot gathers of ground penetrating radar (GPR) data are simulated using the scalar wave equation, in which the velocity is controlled by the dielectric permittivity distribution. The velocity at which the EM pulses travel depends on the dielectric permittivity of the material. The algorithm uses the finite difference technique with operators of second order to solve the time and spatial derivatives, and also of fourth order to solve the spatial derivatives. The shot gathers, in association with time slices and also with snapshots, constitute a powerful tool to predict the response of buried structures. (author). 14 refs., 4 figs
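
    A minimal 2-D scalar-wave sketch of this modelling approach is given below: the cell velocity is set by the relative permittivity, and the wavefield is advanced with second-order finite differences. It is an illustrative reduction only; the authors' 3-D code, fourth-order spatial operators and boundary treatment are not reproduced.

        # Minimal 2-D scalar-wave finite-difference sketch of GPR modelling: the EM pulse
        # velocity in each cell is c0/sqrt(eps_r). Second-order operators only; absorbing
        # boundaries and the 3-D / 4th-order details of the paper are omitted.
        import numpy as np

        c0 = 3.0e8                      # free-space EM velocity (m/s)
        nx, nz, dx = 200, 200, 0.05     # grid size and spacing (m)
        eps_r = np.ones((nz, nx))       # relative permittivity model
        eps_r[100:, :] = 9.0            # e.g. wetter material below 5 m depth (illustrative)
        v = c0 / np.sqrt(eps_r)

        dt = 0.4 * dx / v.max()         # time step satisfying the CFL condition
        u_prev = np.zeros((nz, nx)); u = np.zeros((nz, nx))
        src_z, src_x, f0 = 2, 100, 100e6   # shallow source, 100 MHz centre frequency

        for it in range(600):
            lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                   np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u) / dx**2
            u_next = 2.0 * u - u_prev + (v * dt)**2 * lap
            t = it * dt                                   # Ricker wavelet source term
            arg = (np.pi * f0 * (t - 1.0 / f0))**2
            u_next[src_z, src_x] += (1.0 - 2.0 * arg) * np.exp(-arg)
            u_prev, u = u, u_next

        print("peak field amplitude:", float(np.abs(u).max()))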

  6. Estimation of leaf area index using ground-based remote sensed NDVI measurements: validation and comparison with two indirect techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pontailler, J.-Y. [Univ. Paris-Sud XI, Dept. d' Ecophysiologie Vegetale, Orsay Cedex (France); Hymus, G.J.; Drake, B.G. [Smithsonian Environmental Research Center, Kennedy Space Center, Florida (United States)

    2003-06-01

    This study took place in an evergreen scrub oak ecosystem in Florida. Vegetation reflectance was measured in situ with a laboratory-made sensor in the red (640-665 nm) and near-infrared (750-950 nm) bands to calculate the normalized difference vegetation index (NDVI) and derive the leaf area index (LAI). LAI estimates from this technique were compared with two other nondestructive techniques, intercepted photosynthetically active radiation (PAR) and hemispherical photographs, in four contrasting 4 m² plots in February 2000 and two 4 m² plots in June 2000. We used Beer's law to derive LAI from PAR interception and the gap fraction distribution to derive LAI from photographs. The plots were harvested manually after the measurements to determine a 'true' LAI value and to calculate a light extinction coefficient (k). The technique based on Beer's law was affected by a large variation of the extinction coefficient, owing to the larger impact of branches in winter when LAI was low. Hemispherical photographs provided satisfactory estimates, slightly overestimated in winter because of the impact of branches or underestimated in summer because of foliage clumping. NDVI provided the best fit, showing saturation only in the densest plot (LAI = 3.5). We conclude that in situ measurement of NDVI is an accurate and simple technique to nondestructively assess LAI in experimental plots or in crops if saturation remains acceptable. (author)
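
    The two index calculations compared above are short enough to show directly; the sketch below computes NDVI from band reflectances and inverts Beer's law for LAI. The example numbers and the extinction coefficient are placeholders, not the harvest-calibrated values of the study.

        # Sketch of the two LAI estimates compared above: NDVI from red / near-infrared
        # reflectance, and Beer's-law inversion of intercepted PAR. The example values and
        # the extinction coefficient are placeholders, not the calibration of the study.
        import numpy as np

        def ndvi(red, nir):
            red, nir = np.asarray(red, float), np.asarray(nir, float)
            return (nir - red) / (nir + red)

        def lai_from_par(par_below, par_above, k=0.5):
            """Beer's law: PAR_below = PAR_above * exp(-k * LAI)  =>  LAI = -ln(tau) / k."""
            tau = np.asarray(par_below, float) / np.asarray(par_above, float)
            return -np.log(tau) / k

        print(ndvi(0.05, 0.40))                 # dense green canopy -> NDVI near 0.78
        print(lai_from_par(150.0, 1200.0))      # LAI around 4.2 for k = 0.5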

  7. Evaluation of bone-to-implant contact and bone density adjacent to titanium implants using a stereological technique on ground sections

    DEFF Research Database (Denmark)

    Balatsouka, Dimitra; Gotfredsen, Klaus; Gundersen, Hans Jørgen Gottlieb

    2006-01-01

    using stereological principles. The aim of the study was to describe an unbiased design for evaluating bone-to-implant contact (BIC) and peri-implant bone density (BD-i) in three dimensions. The unbiased design was based on a fixed axis vertical random sampling technique. Three bone-implant blocks were...... collected from 3 rabbits. Four sections were obtained from each animal using a fixed axis vertical random sampling technique. The BIC was estimated by creating a stereological method based on a systematic test line set. The BD-i was estimated using a design based on a systematic point set. The efficiency...... When bone implants have to be examined in situ ground sections are required. Histomorphometric measurements are usually performed on two-dimensional sections, causing biased results when they are wrongly extrapolated to 3D without any knowledge of stereology. Unbiased results can only be obtained...

  8. Evaluation of bone-to-implant contact and bone density adjacent to titanium implants using a stereological technique on ground sections

    DEFF Research Database (Denmark)

    Balatsouka, Dimitra; Gotfredsen, Klaus; Gundersen, Hans Jørgen Gottlieb

    2006-01-01

    When bone implants have to be examined in situ ground sections are required. Histomorphometric measurements are usually performed on two-dimensional sections, causing biased results when they are wrongly extrapolated to 3D without any knowledge of stereology. Unbiased results can only be obtained...... using stereological principles. The aim of the study was to describe an unbiased design for evaluating bone-to-implant contact (BIC) and peri-implant bone density (BD-i) in three dimensions. The unbiased design was based on a fixed axis vertical random sampling technique. Three bone-implant blocks were...... collected from 3 rabbits. Four sections were obtained from each animal using a fixed axis vertical random sampling technique. The BIC was estimated by creating a stereological method based on a systematic test line set. The BD-i was estimated using a design based on a systematic point set. The efficiency...
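
    The estimators behind the design are simple counting ratios; a minimal sketch follows, with invented counts, showing BIC estimated from test-line intersections and peri-implant bone density from a systematic point grid.

        # Ratio estimators used in point/line-counting stereology: bone-to-implant contact
        # (BIC) from intersections of a systematic test-line set with the implant surface,
        # and peri-implant bone density (BD-i) from a systematic point grid. Counts invented.
        def bic_percent(intersections_on_bone, intersections_total):
            return 100.0 * intersections_on_bone / intersections_total

        def bone_density_percent(points_on_bone, points_in_reference_region):
            return 100.0 * points_on_bone / points_in_reference_region

        print(bic_percent(38, 60), bone_density_percent(152, 400))   # e.g. 63.3% BIC, 38% BD-i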

  9. Ground-based Containerless Levitation Techniques for Crystal Growth

    Institute of Scientific and Technical Information of China (English)

    曹慧玲; 郭云珠; 马晓亮; 卢慧甍; 解旭卓; 周伯儒; 尹大川

    2011-01-01

    The microgravity environment provides a relatively homogeneous and stable environment for crystallization, since natural convection and sedimentation are eliminated. Single crystals of larger size and higher diffraction resolution can therefore be obtained in such an environment. However, there are limited chances to utilize space conditions for crystallization because of the low success rate and high cost. As a result, it is necessary to develop ground-based simulation techniques. Presently, ground-based levitation techniques include mainly aerodynamic, electrostatic, electromagnetic, liquid interface, ultrasonic and magnetic levitation techniques. Crystallization under containerless levitation conditions can be realized using these techniques, through which unfavorable effects of the vessel wall on crystallization can be avoided. As a result, high-quality crystals can be obtained, which provides a new solution to the bottleneck problem in X-ray diffraction and a convenient means to investigate the dynamics and mechanisms of crystallization on the ground. Six ground-based levitation techniques are reviewed from the following four aspects: the principle, the advantages, the disadvantages, and the applications in crystal growth, especially protein crystallization. Three well-developed levitation techniques (liquid interface, ultrasonic, and magnetic levitation techniques) are highlighted specifically for their applications in protein crystallization.

  10. Evaluation of Cast Re-Orientation on a Dental Surveyor Using Three Tripod Techniques: A Survey and In Vitro Study.

    Science.gov (United States)

    Sayed, Mohammed E; Busaily, Idris A; Nahari, Rana J; Hakami, Ruaa O; Maashi, Sami M; Ramireddy, Naveen R

    2017-01-18

    To survey different educational levels (i.e., students, interns, technicians, and prosthodontic faculty) with regard to their opinions, attitudes, and adoption of three selected tripod techniques. The study will also investigate the accuracy of these techniques to reposition casts on the dental surveyor in anterio-posterior (AP) and lateral directions at both technique and educational levels. Tripod points, scored lines, and cemented post tripod techniques were used in this study. Three Kennedy class II modification I stone casts, duplicated from a standard cast, were assigned to each of the tripod techniques. The tilt angles of all casts were set on the dental surveyor to 10° (control angle) in AP and lateral directions using a digital angle gauge with an accuracy of 0.2°. The casts were tripoded accordingly. A total of 243 participants were involved in this study. Participants were first asked to remount the three casts on three different dental surveyors using the tripod technique noted on each cast. Questionnaires were then given to each participant in an individual interview setting; this assured a 100% response rate. The angle differences were calculated. All data were coded and entered into an Excel Spreadsheet file. Statistical analyses were performed using a paired Chi-square, Wilcoxon Matched-pairs, ANOVA, and Tukey post hoc tests at 5% level of significance. No significant difference was found between the educational levels relative to the responses to technique demands, sensitivity, and time required for reorientation (p = 0.08202, 0.8108, 0.6874, respectively); however, the majority of respondents reported low technique demands, low sensitivity, and time saving for technique C in comparison to techniques A and B. Significant differences were noted among the educational levels in response to preference and adoption questions (p = 0.0035 and 0.0015, respectively). The highest percentage of faculty chose technique A for inclusion into the academic

  11. Fine-resolution repeat topographic surveying of dryland landscapes using UAS-based structure-from-motion photogrammetry: Assessing accuracy and precision against traditional ground-based erosion measurements

    Science.gov (United States)

    Gillian, Jeffrey K.; Karl, Jason W.; Elaksher, Ahmed; Duniway, Michael C.

    2017-01-01

    Structure-from-motion (SfM) photogrammetry from unmanned aerial system (UAS) imagery is an emerging tool for repeat topographic surveying of dryland erosion. These methods are particularly appealing due to the ability to cover large landscapes compared to field methods and at reduced costs and finer spatial resolution compared to airborne laser scanning. Accuracy and precision of high-resolution digital terrain models (DTMs) derived from UAS imagery have been explored in many studies, typically by comparing image coordinates to surveyed check points or LiDAR datasets. In addition to traditional check points, this study compared 5 cm resolution DTMs derived from fixed-wing UAS imagery with a traditional ground-based method of measuring soil surface change called erosion bridges. We assessed accuracy by comparing the elevation values between DTMs and erosion bridges along thirty topographic transects each 6.1 m long. Comparisons occurred at two points in time (June 2014, February 2015) which enabled us to assess vertical accuracy with 3314 data points and vertical precision (i.e., repeatability) with 1657 data points. We found strong vertical agreement (accuracy) between the methods (RMSE 2.9 and 3.2 cm in June 2014 and February 2015, respectively) and high vertical precision for the DTMs (RMSE 2.8 cm). Our results from comparing SfM-generated DTMs to check points, and strong agreement with erosion bridge measurements suggests repeat UAS imagery and SfM processing could replace erosion bridges for a more synoptic landscape assessment of shifting soil surfaces for some studies. However, while collecting the UAS imagery and generating the SfM DTMs for this study was faster than collecting erosion bridge measurements, technical challenges related to the need for ground control networks and image processing requirements must be addressed before this technique could be applied effectively to large landscapes.

  12. MODFLOW-2000, the U.S. Geological Survey modular ground-water model : user guide to the LMT6 package, the linkage with MT3DMS for multi-species mass transport modeling

    Science.gov (United States)

    Zheng, Chunmiao; Hill, Mary Catherine; Hsieh, Paul A.

    2001-01-01

    MODFLOW-2000, the newest version of MODFLOW, is a computer program that numerically solves the three-dimensional ground-water flow equation for a porous medium using a finite-difference method. MT3DMS, the successor to MT3D, is a computer program for modeling multi-species solute transport in three-dimensional ground-water systems using multiple solution techniques, including the finite-difference method, the method of characteristics (MOC), and the total-variation-diminishing (TVD) method. This report documents a new version of the Link-MT3DMS Package, which enables MODFLOW-2000 to produce the information needed by MT3DMS, and also discusses new visualization software for MT3DMS. Unlike the Link-MT3D Packages that coordinated previous versions of MODFLOW and MT3D, the new Link-MT3DMS Package requires an input file that, among other things, provides enhanced support for additional MODFLOW sink/source packages and allows list-directed (free) format for the flow model produced flow-transport link file. The report contains four parts: (a) documentation of the Link-MT3DMS Package Version 6 for MODFLOW-2000; (b) discussion of several issues related to simulation setup and input data preparation for running MT3DMS with MODFLOW-2000; (c) description of two test example problems, with comparison to results obtained using another MODFLOW-based transport program; and (d) overview of post-simulation visualization and animation using the U.S. Geological Survey's Model Viewer.

  13. Bathymetric evolution of Tasman Glacier terminal lake, New Zealand, as determined by remote surveying techniques

    Science.gov (United States)

    Purdie, Heather; Bealing, Paul; Tidey, Emily; Gomez, Christopher; Harrison, Justin

    2016-12-01

    Processes that drive iceberg calving at the margins of freshwater-terminating glaciers are still poorly understood. This knowledge gap is in part due to the challenge of obtaining good in situ data in a highly dynamic and dangerous environment. We are using emerging remote technologies, in the form of a remote-controlled jet boat to survey bathymetry and Structure from Motion (SfM) to characterize terminus morphology, to better understand relationships between lake growth and terminus evolution. Comparison of results between the jet-boat-mounted dual-frequency Garmin fish-finder and an Odom Echotrac DF3200 MKII with 200/38 kHz dual-frequency transducer showed that, after a sound velocity adjustment, the remote survey obtained depth data within ± 1 m of the higher-grade survey equipment. Water depths of up to 240 m were recorded only 100 m away from the terminus, and subaerial cliff height ranged from around 6 to 33 m, with the central region of the terminus more likely to experience buoyancy. Subaqueous ice ramps are ephemeral features, and in 2015 multiple ice ramps extended out into the lake from the terminus by 100-200 m. The consistent location of some of the subaqueous ramps between surveys may indicate that other processes, for example subglacial hydrology, also influence evolving terminus morphology.

  14. MALT-45: A 7 mm survey of the southern Galaxy - I. Techniques and spectral line data

    CERN Document Server

    Jordan, Christopher H; Lowe, Vicki; Voronkov, Maxim A; Ellingsen, Simon P; Breen, Shari L; Purcell, Cormac R; Barnes, Peter J; Burton, Michael G; Cunningham, Maria R; Hill, Tracey; Jackson, James M; Longmore, Steven N; Peretto, Nicolas; Urquhart, James S

    2015-01-01

    We present the first results from the MALT-45 (Millimetre Astronomer's Legacy Team - 45 GHz) Galactic Plane survey. We have observed 5 square degrees ($l = 330-335$, $b = \pm0.5$) for spectral lines in the 7 mm band (42-44 and 48-49 GHz), including $\text{CS}$ $(1-0)$, class I $\text{CH}_3\text{OH}$ masers in the $7(0,7)-6(1,6)$ $\text{A}^{+}$ transition and $\text{SiO}$ $(1-0)$ $v=0,1,2,3$. MALT-45 is the first unbiased, large-scale, sensitive spectral line survey in this frequency range. In this paper, we present data from the survey as well as a few intriguing results; rigorous analyses of these science cases are reserved for future publications. Across the survey region, we detected 77 class I $\text{CH}_3\text{OH}$ masers, of which 58 are new detections, along with many sites of thermal and maser $\text{SiO}$ emission and thermal $\text{CS}$. We found that 35 class I $\text{CH}_3\text{OH}$ masers were associated with the published locations of class II $\text{CH}_3\text{OH}$, $\text{H}_2\text{O}$ and $...

  15. Preliminary study of airborne electromagnetic survey using grounded source; Chihyo source gata kuchu denji tansa no kisoteki kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Mogi, T. [Kyushu University, Fukuoka (Japan). Faculty of Engineering; Shimoizumi, M. [Kitakyushu Polytechnic College, Kitakyushu (Japan); Kusunoki, K. [Central Research Institute of Electric Power Industry, Tokyo (Japan); Morikawa, T. [Dowa Engineering Co. Ltd., Okayama (Japan); Jomori, N. [Chiba Electronics Research Institute, Chiba (Japan)

    1996-05-01

    For the development of an airborne electromagnetic prospecting method capable of deeper exploration, a basic study was made of a system wherein a transmitter (source) is positioned on the ground and the receiving is done in the sky. Even for this airborne electromagnetic method, the TDEM method is presumably advantageous over others, as it is for ground-based exploration. In the study, the transient response of an airborne vertical magnetic field to a horizontal layered structure was calculated. The current source was 2,000 m long with a capacity of 30 A. The one-layer structure was a 10 Ohm m semi-infinite ground, and the two-layer structure had a 100 Ohm m structure just under the one-layer structure. The result of the calculation suggests that, in the absence of a layer of extremely low resistivity, observation of an approximately 1 second long transient response aboard a helicopter flying at approximately 50 km/h will enable an approximately 1,000 m deep exploration. Problems affecting airborne observation, such as swinging, natural magnetic field fluctuations, and artificially produced noise, were investigated by use of a magnetometer suspended from a helicopter in flight. 2 refs., 6 figs.

  16. Use of relaxation techniques and complementary and alternative medicine by American adults with insomnia symptoms: results from a national survey.

    Science.gov (United States)

    Bertisch, Suzanne M; Wells, Rebecca Erwin; Smith, Michael T; McCarthy, Ellen P

    2012-12-15

    Though relaxation training is recommended for insomnia, national patterns of use remain unknown. Similarly, rates of complementary and alternative medicine (CAM) use by adults with insomnia are not well established. We sought to elucidate the patterns and reasons for use of relaxation techniques and CAM use by adults with insomnia symptoms. We used the 2007 National Health Interview Survey (n = 23,358) to estimate prevalence of use among adults by self-reported insomnia symptom status. Among adults reporting insomnia symptoms (n = 4,415), we examined reasons for use and disclosure to medical professionals. We employed logistic regression to determine the adjusted associations between relaxation technique use, CAM use, and insomnia symptoms. Among adults with insomnia symptoms, 23% used relaxation techniques and 45% used CAM annually. After adjustment, adults with insomnia symptoms had a higher likelihood of using relaxation techniques (aOR 1.48, 95% CI 1.32, 1.66) and CAM (aOR 1.29, 95% CI 1.15, 1.44) compared with adults without insomnia. Deep breathing exercise was the most commonly used relaxation technique. Fewer than 2% of adults with insomnia used CAM specifically for insomnia. Only 26% of adults with insomnia symptoms disclosed their relaxation technique use to medical professionals. Being male, lower educational and physical activity levels, and lower income were associated with relaxation technique use among adults with insomnia symptoms. While adults with insomnia symptoms commonly use relaxation techniques and CAM, few are using them specifically for their insomnia. Facilitating discussions about relaxation techniques may foster targeted use for insomnia.
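
    The adjusted odds ratios quoted above come from logistic regression with covariate adjustment. The sketch below shows that step on simulated data using statsmodels; the variables, sample and coefficients are invented and do not reproduce the NHIS analysis.

        # Sketch of the adjusted-odds-ratio step described above: logistic regression of
        # relaxation-technique use on insomnia symptoms with covariate adjustment, then
        # aOR = exp(coefficient). Data are simulated; the NHIS variables are not reproduced.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 5000
        df = pd.DataFrame({
            "insomnia": rng.integers(0, 2, n),
            "age": rng.integers(18, 85, n),
            "female": rng.integers(0, 2, n),
        })
        logit_p = -1.5 + 0.4 * df["insomnia"] + 0.2 * df["female"]      # assumed true model
        df["used_relaxation"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

        model = smf.logit("used_relaxation ~ insomnia + age + female", data=df).fit(disp=0)
        aor = np.exp(model.params["insomnia"])
        ci = np.exp(model.conf_int().loc["insomnia"])
        print(f"aOR for insomnia: {aor:.2f} (95% CI {ci[0]:.2f}, {ci[1]:.2f})")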

  17. A survey of imagery techniques for semantic labeling of human-vehicle interactions in persistent surveillance systems

    Science.gov (United States)

    Elangovan, Vinayak; Shirkhodaie, Amir

    2011-06-01

    Understanding and semantic annotation of Human-Vehicle Interactions (HVI) facilitate fusion of Hard Sensor (HS) and Human Intelligence (HUMINT) data in a cohesive way. By characterizing, classifying, and discriminating HVI patterns, pertinent threats may be recognized. Various Persistent Surveillance System (PSS) imagery techniques have been proposed in the past decade for identifying human interactions with various objects in the environment. Understanding of such interactions facilitates the discovery of human intentions and motives. However, without consideration of incidental context, reasoning about and analysis of such behavioral activities is a very challenging and difficult task. This paper presents a current survey of related publications in the area of context-based imagery techniques applied to HVI recognition; in particular, it discusses the taxonomy and ontology of HVI and presents a summary of reported robust image processing techniques for spatiotemporal characterization and tracking of human targets in urban environments. The discussed techniques include model-based, shape-based and appearance-based techniques employed for identification and classification of objects. A detailed overview of major past research activities related to HVI in PSS with exploitation of spatiotemporal reasoning techniques applied to semantic labeling of the HVI is also presented.

  18. A Decolorization Technique with Spent “Greek Coffee” Grounds as Zero-Cost Adsorbents for Industrial Textile Wastewaters

    Science.gov (United States)

    Kyzas, George Z.

    2012-01-01

    In this study, the decolorization of industrial textile wastewaters was studied in batch mode using spent “Greek coffee” grounds (COF) as low-cost adsorbents. In this attempt, there is a cost-saving potential given that there was no further modification of COF (just washed with distilled water to remove dirt and color, then dried in an oven). Furthermore, tests were realized both in synthetic and real textile wastewaters for comparative reasons. The optimum pH of adsorption was acidic (pH = 2) for synthetic effluents, while experiments at free pH (non-adjusted) were carried out for real effluents. Equilibrium data were fitted to the Langmuir, Freundlich and Langmuir-Freundlich (L-F) models. The calculated maximum adsorption capacities (Qmax) for total dye (reactive) removal at 25 °C were 241 mg/g (pH = 2) and 179 mg/g (pH = 10). Thermodynamic parameters were also calculated (ΔH0, ΔG0, ΔS0). Kinetic data were fitted to the pseudo-first-, second- and third-order models. The optimum pH for desorption was determined, in line with desorption and reuse analysis. Experiments dealing with the increase of adsorbent mass showed a strong increase in total dye removal.
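
    The Langmuir and Freundlich fits reported above are ordinary nonlinear least-squares problems. As a minimal sketch, assuming hypothetical equilibrium data rather than the paper's measurements, the following fits both isotherms with scipy:

        # Minimal sketch: fitting Langmuir and Freundlich isotherms by nonlinear least squares.
        # Ce (mg/L) and Qe (mg/g) below are illustrative values, not the paper's data.
        import numpy as np
        from scipy.optimize import curve_fit

        Ce = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])     # equilibrium concentration
        Qe = np.array([40.0, 70.0, 120.0, 160.0, 200.0, 225.0])  # adsorbed amount

        def langmuir(Ce, Qmax, KL):
            return Qmax * KL * Ce / (1.0 + KL * Ce)

        def freundlich(Ce, KF, n):
            return KF * Ce ** (1.0 / n)

        pL, _ = curve_fit(langmuir, Ce, Qe, p0=[250.0, 0.05])
        pF, _ = curve_fit(freundlich, Ce, Qe, p0=[20.0, 2.0])
        print("Langmuir:   Qmax = %.1f mg/g, KL = %.3f L/mg" % tuple(pL))
        print("Freundlich: KF = %.1f, n = %.2f" % tuple(pF))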

  19. A Decolorization Technique with Spent “Greek Coffee” Grounds as Zero-Cost Adsorbents for Industrial Textile Wastewaters

    Directory of Open Access Journals (Sweden)

    George Z. Kyzas

    2012-10-01

    Full Text Available In this study, the decolorization of industrial textile wastewaters was studied in batch mode using spent “Greek coffee” grounds (COF) as low-cost adsorbents. In this attempt, there is a cost-saving potential given that there was no further modification of COF (just washed with distilled water to remove dirt and color, then dried in an oven). Furthermore, tests were realized both in synthetic and real textile wastewaters for comparative reasons. The optimum pH of adsorption was acidic (pH = 2) for synthetic effluents, while experiments at free pH (non-adjusted) were carried out for real effluents. Equilibrium data were fitted to the Langmuir, Freundlich and Langmuir-Freundlich (L-F) models. The calculated maximum adsorption capacities (Qmax) for total dye (reactive) removal at 25 °C were 241 mg/g (pH = 2) and 179 mg/g (pH = 10). Thermodynamic parameters were also calculated (ΔH0, ΔG0, ΔS0). Kinetic data were fitted to the pseudo-first-, second- and third-order models. The optimum pH for desorption was determined, in line with desorption and reuse analysis. Experiments dealing with the increase of adsorbent mass showed a strong increase in total dye removal.

  20. Joint application of Geoelectrical Resistivity and Ground Penetrating Radar techniques for the study of hyper-saturated zones. Case study in Egypt

    Science.gov (United States)

    Mesbah, Hany S.; Morsy, Essam A.; Soliman, Mamdouh M.; Kabeel, Khamis

    2017-06-01

    This paper presents the results of applying Geoelectrical Resistivity Sounding (GRS) and Ground Penetrating Radar (GPR) to outline and investigate the surface springing out (flow) of groundwater at the base of a service building site, and to determine the reason(s) for the zone of maximum saturation, in addition to providing stratigraphic information for this site. The studied building is constructed about 7 m below the ground surface. A Vertical Electrical Sounding (VES) survey was performed at 12 points around the studied building in order to investigate the vertical and lateral extent of the subsurface sequence; three VESs were conducted on each side of the building at discrete distances. In addition, a total of 9 GPR profiles with 100- and 200-MHz antennae were acquired, with the objective of evaluating the depth and the degree of saturation of the subsurface layers. The qualitative and quantitative interpretation of the acquired VESs readily showed the levels of saturation close to and around the studied building. From the interpretation of the GPR profiles it was possible to locate and delineate the saturated layers; the radar signals penetrated sufficiently and enabled identification of the subsurface reflectors. The results of GPR and VES showed good agreement, and the integrated interpretations were supported by the local geology. Finally, the newly constructed geoelectrical resistivity cross-sections (in contoured form) clearly indicate the direction of groundwater flow toward the studied building.
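
    The quantitative VES interpretation mentioned above starts from apparent resistivities computed for each electrode spacing before any layered-earth inversion. A minimal sketch of that first step for a Schlumberger array is given below; the spacing, voltage and current values are illustrative, not taken from this survey.

        # Minimal sketch: apparent resistivity for one Schlumberger VES reading.
        # The spacing, voltage and current values are illustrative only.
        import math

        def schlumberger_rho_a(ab_half_m, mn_m, delta_v, current):
            """Apparent resistivity (ohm.m) from AB/2 (m), MN (m), voltage (V) and current (A)."""
            k = math.pi * (ab_half_m ** 2 - (mn_m / 2.0) ** 2) / mn_m  # geometric factor (m)
            return k * delta_v / current

        # AB/2 = 40 m, MN = 10 m, dV = 25 mV, I = 100 mA
        print(round(schlumberger_rho_a(40.0, 10.0, 0.025, 0.1), 1), "ohm.m")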

  1. Application of spectroscopic techniques to the study of illuminated manuscripts: A survey

    Energy Technology Data Exchange (ETDEWEB)

    Pessanha, S.; Manso, M.; Carvalho, M.L., E-mail: luisa@cii.fc.ul.pt

    2012-05-15

    This work focused on the application of the most relevant spectroscopic techniques used for the characterization of illuminated manuscripts. The historical value of these unique and invaluable artworks, together with the increased awareness concerning the conservation of cultural heritage, prompted the application of analytical techniques to the study of these illuminations. This is essential for the understanding of the artist's working methods, which aids conservation-restoration. The characterization of the pigments may also help assign a probable date to the manuscript. For these purposes, the spectroscopic techniques used so far include those that provide information on the elemental content: X-ray fluorescence, total reflection X-ray fluorescence and scanning electron microscopy coupled with energy-dispersive spectroscopy and laser-induced breakdown spectroscopy. Complementary techniques, such as X-ray diffraction, Fourier transform infrared and Raman spectroscopy, reveal information regarding the compounds present in the samples. The techniques, suitability, technological evolution and development of high-performance detectors, as well as the possibility of microanalysis and the higher sensitivity of the equipment, will also be discussed. Furthermore, issues such as the necessity of sampling, the portability of the equipment and the overall advantages and disadvantages of different techniques will be analyzed. - Highlights: • The techniques used for studying illuminated manuscripts are described and compared. • For in situ, non-destructive analysis the most suitable technique is EDXRF. • For quantitative analysis TXRF is more appropriate. • Raman spectroscopy is mostly used for pigments identification. • FTIR was used for the characterization of binders and parchment.

  2. A Survey on Data Mining Techniques Applied to Electricity-Related Time Series Forecasting

    OpenAIRE

    Francisco Martínez-Álvarez; Alicia Troncoso; Gualberto Asencio-Cortés; Riquelme, José C

    2015-01-01

    Data mining has become an essential tool during the last decade for analyzing large sets of data. The variety of techniques it includes and the successful results obtained in many application fields make this family of approaches powerful and widely used. In particular, this work explores the application of these techniques to time series forecasting. Although classical statistics-based methods provide reasonably good results, the results of applying data mining outperform those of ...

  3. Chest physiotherapy techniques in neurological intensive care units of India: A survey

    OpenAIRE

    2014-01-01

    Context: Neurological intensive care units (ICUs) are a rapidly developing sub-specialty of neurosciences. Chest physiotherapy techniques are of great value in neurological ICUs in preventing, halting, or reversing the impairments caused by neurological disorders and ICU stay. However, chest physiotherapy techniques should be modified to a greater extent in the neurological ICU as compared with general ICUs. Aim: The aim of this study is to obtain data on current chest physiotherapy practi...

  4. Repeat Finding Techniques, Data Structures and Algorithms in DNA sequences: A Survey

    Directory of Open Access Journals (Sweden)

    Freeson Kaniwa

    2015-09-01

    Full Text Available DNA sequencing technologies keep getting faster and cheaper, leading to massive availability of entire human genomes. This massive availability calls for better analysis tools with the potential to realize a shift from reactive to predictive medicine. The challenge remains, since entire human genomes need more space and processing power for their analysis than a standard desktop PC can offer. A background of key concepts surrounding the area of DNA analysis is given, together with a review of selected prominent algorithms used in this area. The significance of this paper is to survey the concepts surrounding DNA analysis so as to provide a deep-rooted understanding of, and knowledge transfer regarding, existing approaches for DNA analysis using the Burrows-Wheeler transform and the wavelet tree, and their respective strengths and weaknesses. Consequent to this survey, the paper attempts to provide some directions for future research.
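
    The Burrows-Wheeler transform referred to above underlies most compressed full-text indexes used in DNA analysis. As a purely illustrative sketch, the following uses the naive textbook construction (sorting all rotations), not the suffix-array-based construction that production tools rely on:

        # Minimal sketch: naive Burrows-Wheeler transform of a short DNA string.
        # Production tools build the BWT via suffix arrays; this is the textbook version.
        def bwt(text: str, sentinel: str = "$") -> str:
            s = text + sentinel
            rotations = sorted(s[i:] + s[:i] for i in range(len(s)))  # sorted rotations
            return "".join(rotation[-1] for rotation in rotations)    # last column

        print(bwt("GATTACA"))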

  5. Aggregation and non aggregation techniques for large facility location problems: A survey

    Directory of Open Access Journals (Sweden)

    Irawan Chandra Ade

    2015-01-01

    Full Text Available A facility location problem is concerned with determining the location of some useful facilities in such a way as to satisfy one or a few objective functions and constraints. We survey those problems where, in the presence of a large number of customers, some form of aggregation may be required. In addition, a review of conditional location problems, where some (say q) facilities already exist in the study area, is presented.

  6. Exchange Rate Forecasting Techniques, Survey Data, and Implications for the Foreign Exchange Market

    OpenAIRE

    Frankel, Jeffrey A.; Kenneth Froot

    1990-01-01

    The paper presents new empirical results that elucidate the dynamics of the foreign exchange market. The first half of the paper is an updated study of the exchange rate expectations held by market participants, as reflected in responses to surveys, and contains the following conclusions. First, the bias observed in the forward discount as a predictor of the future spot rate is not attributable to an exchange risk premium, as is conventionally believed. Second, at short horizons forecasters t...

  7. Application of Multivariate Statistical Techniques for the Characterization of Ground Water Quality of Lahore, Gujranwala and Sialkot (Pakistan

    Directory of Open Access Journals (Sweden)

    Farooq Ahmad

    2011-12-01

    Full Text Available Multivariate statistical techniques such as factor analysis (FA), cluster analysis (CA) and discriminant analysis (DA) were applied for the evaluation of spatial variations and the interpretation of a large complex water quality data set of three cities (Lahore, Gujranwala and Sialkot) in Punjab, Pakistan. 16 parameters of water samples collected from nine different sampling stations of each city were determined. Factor analysis indicates five factors, which explained 74% of the total variance in the water quality data set. The five factors are salinization, alkalinity, temperature, domestic waste and chloride, which explained 31.1%, 14.3%, 10.6%, 10.0% and 8.0% of the total variance, respectively. Hierarchical cluster analysis grouped the nine sampling stations of each city into three clusters, i.e., relatively less polluted (LP), moderately polluted (MP) and highly polluted (HP) sites, based on the similarity of water quality characteristics. Discriminant analysis (DA) identified ten significant parameters (calcium (Ca), ammonia, sulphate, sodium (Na), electrical conductivity (EC), chloride, temperature (Temp), total hardness (TH), turbidity), which discriminate the groundwater quality of the three cities, with close to 100.0% correct assignment for spatial variations. This study illustrates the benefit of multivariate statistical techniques for interpreting complex data sets in the analysis of spatial variations in water quality, and for planning future studies.
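
    The FA/CA/DA workflow described above can be reproduced with standard libraries. The sketch below runs the three steps with scikit-learn on a random placeholder matrix standing in for the 27 stations x 16 parameters; it illustrates the workflow only and is not the authors' computation.

        # Minimal sketch of the FA / CA / DA workflow on a water-quality matrix.
        # The data are random placeholders (27 stations x 16 parameters).
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import FactorAnalysis
        from sklearn.cluster import AgglomerativeClustering
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)
        X = rng.normal(size=(27, 16))           # 9 stations in each of 3 cities
        city = np.repeat([0, 1, 2], 9)          # Lahore, Gujranwala, Sialkot

        Xz = StandardScaler().fit_transform(X)                             # standardize
        scores = FactorAnalysis(n_components=5).fit_transform(Xz)          # factor analysis
        clusters = AgglomerativeClustering(n_clusters=3).fit_predict(Xz)   # hierarchical CA
        lda = LinearDiscriminantAnalysis().fit(Xz, city)                   # discriminant analysis
        print(scores.shape, np.bincount(clusters), lda.score(Xz, city))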

  8. Comparison of microbial and sorbed soil gas surface geochemical techniques with seismic surveys from the Southern Altiplano, Bolivia

    Energy Technology Data Exchange (ETDEWEB)

    Aranibar, O.R.; Tucker, J.D.; Hiltzman, D.C.

    1995-12-31

    Yacimientos Petroliferos Fiscales Bolivianos (YPFB) undertook a large seismic evaluation in the southern Altiplano, Bolivia in 1994. As an additional layer of information, sorbed soil gas and Microbial Oil Survey Technique (MOST) geochemical surveys were conducted to evaluate the hydrocarbon microseepage potential. The Wara Sara Prospect had 387 sorbed soil gas samples, collected from one meter depth, and 539 shallow soil microbial samples, collected from 15 to 20 centimeter depth. The sorbed soil gas samples were collected every 500 meters and microbial samples every 250 meters along geochemical traverses spaced 1 km apart. The presence of anomalous hydrocarbon microseepage is indicated by (1) a single hydrocarbon source identified by gas crossplots, (2) the high gas values with a broad range, (3) the high overall gas average, (4) the clusters of elevated samples, and (5) the right hand skewed data distributions.

  9. A Survey on Data Mining Techniques Applied to Electricity-Related Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Francisco Martínez-Álvarez

    2015-11-01

    Full Text Available Data mining has become an essential tool during the last decade for analyzing large sets of data. The variety of techniques it includes and the successful results obtained in many application fields make this family of approaches powerful and widely used. In particular, this work explores the application of these techniques to time series forecasting. Although classical statistics-based methods provide reasonably good results, the results of applying data mining outperform those of the classical methods. Hence, this work faces two main challenges: (i) to provide a compact mathematical formulation of the most commonly used techniques; (ii) to review the latest works on time series forecasting and, as a case study, those related to electricity price and demand markets.

  10. The Use of 3d Scanning and Photogrammetry Techniques in the Case Study of the Roman Theatre of Nikopolis. Surveying, Virtual Reconstruction and Restoration Study.

    Science.gov (United States)

    Bilis, T.; Kouimtzoglou, T.; Magnisali, M.; Tokmakidis, P.

    2017-02-01

    The aim of this paper is to present the specific methods by which 3D scanning and photogrammetric techniques were incorporated into the architectural study, the documentation and the graphic restoration study of the monument of the ancient theatre of Nikopolis. Traditional surveying methods were enhanced by the use of 3D scanning, image-based 3D reconstruction, 3D remodelling and renderings. For this reason, a team of specialists from different scientific fields was organized. This presented the opportunity to observe every change in the restoration design process, not only through common elevations and ground plans, but also in 3D space. It has also been very helpful to know how the monument will look in this unique site after the restoration, so that the best possible intervention decisions could be made at the study stage. Moreover, these modern work tools of course helped to convince the authorities of the accuracy of the restoration actions and, finally, to make the proposal clear to the public.

  11. A survey of image processing techniques and statistics for ballistic specimens in forensic science.

    Science.gov (United States)

    Gerules, George; Bhatia, Sanjiv K; Jackson, Daniel E

    2013-06-01

    This paper provides a review of recent investigations on the image processing techniques used to match spent bullets and cartridge cases. It is also, to a lesser extent, a review of the statistical methods that are used to judge the uniqueness of fired bullets and spent cartridge cases. We review 2D and 3D imaging techniques as well as many of the algorithms used to match these images. We also provide a discussion of the strengths and weaknesses of these methods for both image matching and statistical uniqueness. The goal of this paper is to be a reference for investigators and scientists working in this field.

  12. SURVEY

    DEFF Research Database (Denmark)

    SURVEY is a widespread method used within, among other fields, social science, the humanities, psychology and health research. Outside the research world there are also many organizations, such as consulting firms and public institutions as well as marketing departments in private companies, that work ... with surveys. This book goes through all phases of survey work and gives a practical introduction to: • design of the study and selection of samples, • formulation of questionnaires as well as collection and coding of data, • methods for analyzing the results...

  13. Estimation of the Above Ground Biomass of Tropical Forests using Polarimetric and Tomographic SAR Data Acquired at P Band and 3-D Imaging Techniques

    Science.gov (United States)

    Ferro-Famil, L.; El Hajj Chehade, B.; Ho Tong Minh, D.; Tebaldini, S.; LE Toan, T.

    2016-12-01

    Developing and improving methods to monitor forest biomass in space and time is a timely challenge, especially for tropical forests, for which SAR imaging at longer wavelengths presents an interesting potential. Nevertheless, directly estimating tropical forest biomass from classical 2-D SAR images may turn out to be a very complex and ill-conditioned problem, since a SAR echo is composed of numerous contributions whose features and importance depend on many geophysical parameters, such as ground humidity, roughness, topography… that are not related to biomass. Recent studies showed that SAR modes of diversity, i.e. polarimetric intensity ratios or interferometric phase centers, do not fully resolve this under-determined problem, whereas Pol-InSAR tree height estimates may be related to biomass through allometric relationships, with, in general over tropical forests, significant levels of uncertainty and lack of robustness. In this context, 3-D imaging using SAR tomography represents an appealing solution at longer wavelengths, for which wave penetration ensures a high quality mapping of a tropical forest reflectivity in the vertical direction. This paper presents a series of studies led, in the frame of the preparation of the next ESA mission BIOMASS, on the estimation of biomass over a tropical forest in French Guiana, using Polarimetric SAR Tomographic (Pol-TomSAR) data acquired at P band by ONERA. It is then shown that Pol-TomSAR significantly improves the retrieval of forest above ground biomass (AGB) in a high biomass forest (200 up to 500 t/ha), with an error of only 10% at 1.5-ha resolution using reflectivity estimates sampled at a predetermined elevation. The robustness of this technique is tested by applying the same approach over another site, and results show a similar relationship between AGB and tomographic reflectivity over both sites. The excellent ability of Pol-TomSAR to retrieve both canopy top heights and ground topography with an error

  14. A Survey on Distributed Web Server Techniques (分布式Web服务器技术综述)

    Institute of Scientific and Technical Information of China (English)

    马晓星; 吕建

    2002-01-01

    With the explosive growth of the World Wide Web, many popular Web sites are faced with the challenge of overload from tremendous numbers of requests. The best way out is to use distributed Web server systems, for their good scalability and low cost. In this paper, we try to give a comprehensive survey of the underlying techniques of distributed Web server systems: request-dispatching mechanisms, load-balancing algorithms, Web content replication and distribution schemes, and other important aspects.
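
    Request dispatching is the first of the surveyed building blocks. As a toy illustration of one classical policy, the sketch below implements a static weighted round-robin dispatcher; the backend addresses and weights are invented, and real dispatchers additionally track server load and health.

        # Minimal sketch: static weighted round-robin dispatching to Web server replicas.
        # Addresses and weights are invented; real dispatchers also track load and health.
        import itertools

        servers = {"10.0.0.1": 3, "10.0.0.2": 1, "10.0.0.3": 2}      # address -> weight
        schedule = itertools.cycle(
            [addr for addr, weight in servers.items() for _ in range(weight)]
        )

        def dispatch(request_id: int) -> str:
            return f"request {request_id} -> {next(schedule)}"

        for i in range(6):
            print(dispatch(i))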

  15. Survey of Techniques for Deep Web Source Selection and Surfacing the Hidden Web Content

    Directory of Open Access Journals (Sweden)

    Khushboo Khurana

    2016-05-01

    Full Text Available Large and continuously growing dynamic web content has created new opportunities for large-scale data analysis in recent years. There is a huge amount of information that traditional web crawlers cannot access, since they use a link-analysis technique by which only the surface web can be reached. Traditional search engine crawlers require web pages to be linked to other pages via hyperlinks, causing a large amount of web data to be hidden from the crawlers. Enormous amounts of data are available in the deep web that can be useful for gaining new insights in various domains, creating a need to access the information of the deep web by developing efficient techniques. As the amount of Web content grows rapidly, the types of data sources are proliferating, and these often provide heterogeneous data. So we need to select the deep web data sources that can be used by integration systems. The paper discusses various techniques that can be used to surface deep web information and techniques for deep web source selection.

  16. Results and analysis of the 2008-2009 Insulin Injection Technique Questionnaire survey

    NARCIS (Netherlands)

    De Coninck, Carina; Frid, Anders; Gaspar, Ruth; Hicks, Debbie; Hirsch, Larry; Kreugel, Gillian; Liersch, Jutta; Letondeur, Corinne; Sauvanet, Jean-Pierre; Tubiana, Nadia; Strauss, Kenneth

    2010-01-01

    Background: The efficacy of injection therapy in diabetes depends on correct injection technique and, to provide patients with guidance in this area, we must understand how they currently inject. Methods: From September 2008 to June 2009, 4352 insulin-injecting Type 1 and Type 2 diabetic patients fr

  17. A survey and taxonomy on energy efficient resource allocation techniques for cloud computing systems

    Energy Technology Data Exchange (ETDEWEB)

    Hameed, Abdul; Khoshkbarforoushha, Alireza; Ranjan, Rajiv; Jayaraman, Prem Prakash; Kolodziej, Joanna; Balaji, Pavan; Zeadally, Sherali; Malluhi, Qutaibah Marwan; Tziritas, Nikos; Vishnu, Abhinav; Khan, Samee U.; Zomaya, Albert

    2014-06-06

    In a cloud computing paradigm, energy efficient allocation of different virtualized ICT resources (servers, storage disks, networks, and the like) is a complex problem due to the presence of heterogeneous application workloads (e.g., content delivery networks, MapReduce, web applications, and the like) having contentious allocation requirements in terms of ICT resource capacities (e.g., network bandwidth, processing speed, response time, etc.). Several recent papers have tried to address the issue of improving energy efficiency in allocating cloud resources to applications, with varying degrees of success. However, to the best of our knowledge there is no published literature on this subject that clearly articulates the research problem and provides a research taxonomy for succinct classification of existing techniques. Hence, the main aim of this paper is to identify the open challenges associated with energy efficient resource allocation. In this regard, the study first outlines the problem and the existing hardware- and software-based techniques available for this purpose. Furthermore, the techniques already presented in the literature are summarized based on the energy-efficient research dimension taxonomy. The advantages and disadvantages of the existing techniques are comprehensively analyzed against the proposed research dimension taxonomy, namely: resource adaptation policy, objective function, allocation method, allocation operation, and interoperability.

  18. A survey of applications and requirements of unique identification systems and RFID techniques

    NARCIS (Netherlands)

    Ilie-Zudor, Elisabeth; Kemeny, Zsolt; van Blommestein, Fred; Monostori, Laszlo; van der Meulen, Andre

    2011-01-01

    The paper contains an overview of unique identification issues and of the various radio frequency identification techniques that are available now or will become available in the short term. The paper also compares RFID with traditional ID technologies. It shows application possibilities and gives e

  19. Introduction to the U.S. Geological Survey National Water-Quality Assessment (NAWQA) of ground-water quality trends and comparison to other national programs

    Science.gov (United States)

    Rosen, Michael R.; Lapham, W.W.

    2008-01-01

    Assessments of temporal trends in national ground-water quality networks are rarely published in scientific journals. This is partly due to the fact that long-term data from these types of networks are uncommon and because many national monitoring networks are not driven by hypotheses that can be easily incorporated into scientific research. Since 1991, the U.S. Geological Survey (USGS) National Water-Quality Assessment Program (NAWQA) has to date (2006) concentrated on the occurrence of contaminants because sufficient data for trend analysis are only just becoming available. This paper introduces the first set of trend assessments from NAWQA and provides an assessment of the success of the program. On a national scale, nitrate concentrations in ground water have generally increased from 1988 to 2004, but trends in pesticide concentrations are less apparent. Regionally, the studies showed that high nitrate concentrations and frequent pesticide detections are linked to agricultural use of fertilizers and pesticides. Most of these areas showed increases in nitrate concentration within the last decade, and these increases are associated with oxic geochemical conditions and well-drained soils. The current NAWQA plan for collecting data to define trends needs to be constantly reevaluated to determine if the approach fulfills the expected outcome. To assist this evaluation, a comparison of NAWQA to other national ground-water quality programs was undertaken. The design and spatial extent of each national program depend on many factors, including current and long-term budgets, purpose of the program, size of the country, and diversity of aquifer types. Comparison of NAWQA to nine other national programs shows a great diversity in program designs, but indicates that different approaches can achieve similar and equally important goals. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.

  20. Lung Cancer Early Diagnosis Using Some Data Mining Classification Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Thangaraju P

    2014-06-01

    Full Text Available Data mining is the process of analyzing data from different perspectives and summarizing it into useful information. Data mining finds its applications in diverse fields such as retail, finance, communication, marketing organizations and medicine. Data mining plays an important role in healthcare organizations because of the growth of population and of dangerous, deadly diseases like cancer, SARS, leprosy and HIV; lung cancer is one of the most dangerous of these. This survey covers appropriate medical image mining, data preprocessing, feature extraction, rule generation and classification, and provides a basic framework for further improvement in medical diagnosis.

  1. Lung Cancer Early Diagnosis Using Some Data Mining Classification Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Thangaraju P

    2015-11-01

    Full Text Available Data mining is the process of analyzing data from different perspectives and summarizing it into useful information. Data mining finds its applications in diverse fields such as retail, finance, communication, marketing organizations and medicine. Data mining plays an important role in healthcare organizations because of the growth of population and of dangerous, deadly diseases like cancer, SARS, leprosy and HIV; lung cancer is one of the most dangerous of these. This survey covers appropriate medical image mining, data preprocessing, feature extraction, rule generation and classification, and provides a basic framework for further improvement in medical diagnosis.

  2. Carbon dioxide for gut distension during digestive endoscopy: Technique and practice survey

    Institute of Scientific and Technical Information of China (English)

    Filip Janssens; Jacques Deviere; Pierre Eisendrath; Jean-Marc Dumonceau

    2009-01-01

    AIM: To assess the adoption of carbon dioxide (CO2) insufflation by endoscopists from various European countries, and its determinants. METHODS: A survey was distributed to 580 endoscopists attending a live course on digestive endoscopy. RESULTS: The response rate was 24.5%. Fewer than half the respondents (66/142, 46.5%) were aware of the fact that room air can be replaced by CO2 for gut distension during endoscopy, and 4.2% of respondents were actually using CO2 as the insufflation agent. Endoscopists aware of the possibility of CO2 insufflation mentioned technical difficulties in implementing the system and the absence of significant advantages of CO2 in comparison with room air as barriers to adoption in daily practice (84% and 49% of answers, respectively; two answers were permitted for this item). CONCLUSION: Based on this survey, adoption of CO2 insufflation during endoscopy seems to remain relatively exceptional. A majority of endoscopists were not aware of this possibility, while others were not aware of recent technical developments that facilitate CO2 implementation in an endoscopy suite.

  3. Current trends in pulp therapy: a survey analyzing pulpotomy techniques taught in pediatric dental residency programs.

    Science.gov (United States)

    Walker, Laquia A; Sanders, Brian J; Jones, James E; Williamson, C Andrew; Dean, Jeffrey A; Legan, Joseph J; Maupome, Gerardo

    2013-01-01

    The study's purpose was to survey directors of pediatric dental residency programs in order to evaluate the materials currently being taught and used for pulpotomy procedures for primary teeth in educational and clinical settings. A web-based survey was emailed to all graduate pediatric dental residency program directors in the United States. Seventy-one emails were sent to program directors; 47 responded, but only 39 respondents (55%) were included in the study. Results suggested a slight decrease in the utilization of formocresol 1:5 dilution. The most common reasons given for using other medicaments instead of formocresol (18% of respondents) were systemic health concerns and carcinogenicity, in addition to the evidence-based literature. Even though 25% of respondents have begun to use MTA for primary pulpotomy procedures, the most common reason for utilization of other medicaments over MTA was its higher cost. With 82% of graduate pediatric dental residency programs still utilizing formocresol 1:5 dilution for pulpotomy procedures in primary teeth, there has been no major shift away from its clinical use, in spite of increased usage of newer medicaments over the last 5 years.

  4. Polarized hard X-ray photoemission system with micro-positioning technique for probing ground-state symmetry of strongly correlated materials.

    Science.gov (United States)

    Fujiwara, Hidenori; Naimen, Sho; Higashiya, Atsushi; Kanai, Yuina; Yomosa, Hiroshi; Yamagami, Kohei; Kiss, Takayuki; Kadono, Toshiharu; Imada, Shin; Yamasaki, Atsushi; Takase, Kouichi; Otsuka, Shintaro; Shimizu, Tomohiro; Shingubara, Shoso; Suga, Shigemasa; Yabashi, Makina; Tamasaku, Kenji; Ishikawa, Tetsuya; Sekiyama, Akira

    2016-05-01

    An angle-resolved linearly polarized hard X-ray photoemission spectroscopy (HAXPES) system has been developed to study the ground-state symmetry of strongly correlated materials. The linear polarization of the incoming X-ray beam is switched by a transmission-type phase retarder composed of two diamond (100) crystals. The best value of the degree of linear polarization was found to be -0.96, containing a vertical polarization component of 98%. A newly developed low-temperature two-axis manipulator enables easy polar and azimuthal rotations to select the detection direction of photoelectrons. The lowest temperature achieved was 9 K, offering the chance to access the ground state even for strongly correlated electron systems in cubic symmetry. A co-axial sample monitoring system with long-working-distance microscope enables the same region on the sample surface to be measured before and after rotation. Combining this sample monitoring system with a micro-focused X-ray beam by means of an ellipsoidal Kirkpatrick-Baez mirror (25 µm × 25 µm FWHM), polarized valence-band HAXPES has been performed on NiO for voltage application as resistive random access memory to demonstrate the micro-positioning technique and polarization switching.

  5. Comparison of VTEC from ground-based space geodetic techniques based on ray-traced mapping factors

    Science.gov (United States)

    Heinkelmann, Robert; Alizadeh, M. Mahdi; Schuh, Harald; Deng, Zhiguo; Zus, Florian; Etemadfard, M. Hossein

    2016-07-01

    For the derivation of vertical total electron content (VTEC) from slant total electron content (STEC), usually a standard approach is used based on mapping functions that assume a single-layer model of the ionosphere (e.g. IERS Conventions 2010). In our study we test the standard approach against a recently developed alternative which is based on station specific ray-traced mapping factors. For the evaluation of this new mapping concept, we compute VTEC at selected Very Long Baseline Interferometry (VLBI) stations using the dispersive delays and the corresponding formal errors obtained by observing extra-galactic radio sources at two radio frequencies in S- and X-bands by the permanent geodetic/astrometric program organized by the IVS (International VLBI Service for Geodesy and Astrometry). Additionally, by applying synchronous sampling and a consistent analysis configuration, we determine VTEC at Global Navigation Satellite System (GNSS) antennas using GPS (Global Positioning System) and/or GLONASS (Globalnaja nawigazionnaja sputnikowaja Sistema) observations provided by the IGS (International GNSS Service) that are operated in the vicinity of the VLBI antennas. We compare the VTEC time series obtained by the individual techniques over a period of about twenty years and describe their characteristics qualitatively and statistically. The length of the time series allows us to assess the long-term climatology of ionospheric VTEC during the last twenty years.
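
    The standard approach mentioned above maps STEC to VTEC with a single-layer (thin-shell) model of the ionosphere. A minimal sketch of that mapping is given below; the Earth radius and the 450 km shell height are typical textbook values, not parameters taken from this study.

        # Minimal sketch: single-layer model (SLM) mapping from slant to vertical TEC.
        # Earth radius and the 450 km shell height are typical values, not from this study.
        import math

        R_E = 6371.0     # km
        H_SHELL = 450.0  # km, assumed height of the single-layer ionosphere

        def slm_mapping_factor(zenith_deg):
            z = math.radians(zenith_deg)
            sin_zp = R_E / (R_E + H_SHELL) * math.sin(z)   # zenith angle at the pierce point
            return 1.0 / math.sqrt(1.0 - sin_zp ** 2)      # = 1 / cos(z')

        def vtec_from_stec(stec_tecu, zenith_deg):
            return stec_tecu / slm_mapping_factor(zenith_deg)

        print(round(vtec_from_stec(45.0, 70.0), 1), "TECU")  # 45 TECU slant at z = 70 deg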

  6. Analysis of ionospheric electrodynamic parameters on mesoscales – a review of selected techniques using data from ground-based observation networks and satellites

    Directory of Open Access Journals (Sweden)

    H. Vanhamäki

    2011-03-01

    Full Text Available We present a review of selected data-analysis methods that are frequently applied in studies of ionospheric electrodynamics and magnetosphere-ionosphere coupling using ground-based and space-based data sets. Our focus is on methods that are data driven (not simulations or statistical models) and can be used in mesoscale studies, where the analysis area is typically some hundreds or thousands of km across. The selection of reviewed methods is such that most combinations of measured input data (electric field, conductances, magnetic field and currents) that occur in practical applications are covered. The techniques are used to solve for the unmeasured parameters from Ohm's law and Maxwell's equations, possibly with the help of some simplifying assumptions. In addition to reviewing existing data-analysis methods, we also briefly discuss possible extensions that may be used for upcoming data sets.

  7. Detection techniques of selective forwarding attacks in wireless sensor networks: a survey

    CERN Document Server

    Sharma, Preeti; Saluja, Krishan Kumar

    2012-01-01

    The wireless sensor network has become a hot research area due to its wide range of applications in the military and civilian domains, but as it uses wireless media for communication it is easily prone to security attacks. There are a number of attacks on wireless sensor networks, such as the black hole attack, sink hole attack, Sybil attack, selective forwarding attack, etc. In this paper we concentrate on selective forwarding attacks. In selective forwarding attacks, malicious nodes behave like normal nodes and selectively drop packets. The selection of dropping nodes may be random. Identifying such attacks is very difficult and sometimes impossible. In this paper we list detection techniques that have been proposed by different researchers in recent years, together with a tabular representation of a qualitative analysis of these detection techniques.

  8. Different types of maximum power point tracking techniques for renewable energy systems: A survey

    Science.gov (United States)

    Khan, Mohammad Junaid; Shukla, Praveen; Mustafa, Rashid; Chatterji, S.; Mathew, Lini

    2016-03-01

    Global demand for electricity is increasing while production of energy from fossil fuels is declining, and therefore the obvious choice of a clean energy source that is abundant and could provide security for future development is energy from the sun. The supply characteristic of a photovoltaic generator is nonlinear and exhibits multiple peaks, including many local peaks and a global peak, under non-uniform irradiance. To track the global peak, MPPT is an important component of photovoltaic systems. Many review articles have discussed conventional techniques such as P&O, incremental conductance and ripple correlation control, but very few attempts have been made to cover intelligent MPPT techniques. This document therefore also discusses algorithms based on fuzzy logic, ant colony optimization, genetic algorithms, artificial neural networks, particle swarm optimization, the firefly algorithm, extremum seeking control and hybrid methods applied to maximum power point tracking in photovoltaic systems under changing irradiance conditions.
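
    Among the conventional techniques listed, perturb and observe (P&O) is the simplest to state. The sketch below shows one possible implementation of its update rule against a toy P-V curve; the curve and step size are invented for illustration, and a real controller would act on measured panel voltage and current.

        # Minimal sketch of the perturb-and-observe (P&O) MPPT update rule.
        # pv_power() is a toy stand-in for a measured P-V curve with its peak near 30 V.
        def pv_power(v):
            return max(0.0, -0.08 * (v - 30.0) ** 2 + 72.0)

        def perturb_and_observe(v, prev_v, prev_p, step=0.5):
            p = pv_power(v)
            # Keep perturbing in the same direction if power increased, otherwise reverse.
            direction = 1.0 if (p - prev_p) * (v - prev_v) > 0 else -1.0
            return v + direction * step, v, p

        v, prev_v, prev_p = 20.0, 19.5, pv_power(19.5)
        for _ in range(40):
            v, prev_v, prev_p = perturb_and_observe(v, prev_v, prev_p)
        print("operating voltage after 40 steps:", round(v, 2), "V")  # settles near 30 V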

  9. New sensor and non-contact geometrical survey for the vibrating wire technique

    Energy Technology Data Exchange (ETDEWEB)

    Geraldes, Renan [Brazilian Synchrotron Light Laboratory (LNLS), Campinas, SP (Brazil); Junqueira Leão, Rodrigo, E-mail: rodrigo.leao@lnls.br [Brazilian Synchrotron Light Laboratory (LNLS), Campinas, SP (Brazil); Cernicchiaro, Geraldo [Brazilian Center for Research in Physics (CBPF), Rio de Janeiro, RJ (Brazil); Terenzi Neuenschwander, Regis; Citadini, James Francisco; Droher Rodrigues, Antônio Ricardo [Brazilian Synchrotron Light Laboratory (LNLS), Campinas, SP (Brazil)

    2016-03-01

    The tolerances for the alignment of the magnets in the girders of the next machine of the Brazilian Synchrotron Light Laboratory (LNLS), Sirius, are as small as 40 µm for translations and 0.2 mrad for rotations. Therefore, a novel approach to the well-known vibrating wire technique has been developed and tested for the precise fiducialization of magnets. The alignment bench consists of four commercial linear stages, a stretched wire, a commercial lock-in amplifier working with phase-locked loop (PLL), a coordinate measuring machine (CMM) and a vibration sensor for the wire. This novel sensor has been designed for a larger linear region of operation. For the mechanical metrology step of the fiducialization of quadrupoles an innovative technique, using the vision system of the CMM, is presented. While the work with pitch and yaw orientations is still ongoing with promising partial results, the system already presents an uncertainty level below 10 µm for translational alignment.

  10. A Survey on Modeling and Simulation of MEMS Switches and Its Application in Power Gating Techniques

    Directory of Open Access Journals (Sweden)

    Pramod Kumar M.P

    2014-04-01

    Full Text Available A large number of techniques have been developed to reduce leakage power, including supply voltage scaling, varying threshold voltages, smaller logic banks, etc. Power gating is a technique used to reduce static power when the sleep transistor is in the off condition. Micro-Electro-Mechanical System (MEMS) switches have properties that are very close to an ideal switch, with infinite off-resistance due to an air gap and low on-resistance due to the ohmic metal-to-metal contact that is formed. In this paper, we discuss the MEMS switch, its material selection and its operation in power-gated circuits for the purpose of massive reduction of leakage power. This CMOS-MEMS combination provides high switching speed and very clean contacts, at the cost of lower reliability and lifetime.

  11. A Survey On Detect - Discovering And Evaluating Trust Using Efficient Clustering Technique For Manets

    Directory of Open Access Journals (Sweden)

    K.Sudharson

    2012-03-01

    Full Text Available Analyzing and predicting node behavior can lead to more secure and more appropriate defense mechanisms against attackers in mobile ad hoc networks (MANETs). In this work, models for dynamic recommendation based on fuzzy clustering techniques are presented, applicable to nodes that currently participate in transmission in the ad hoc network. The approach focuses on both aspects of MANET mining and behavioral mining. Applying fuzzy clustering and mining techniques, the model infers a node's preferences from transmission logs. The fuzzy clustering approach, in this study, provides the possibility of capturing the uncertainty among nodes' behaviors. The results shown are promising and demonstrate that integrating the fuzzy approach provides more interesting and useful patterns, consequently making the recommender system more functional and robust.
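
    A minimal sketch of the kind of soft clustering the abstract relies on is fuzzy c-means, in which every node receives a graded membership in each behaviour cluster. The implementation below is a generic textbook version run on random placeholder features, not the authors' model.

        # Minimal sketch: fuzzy c-means on node-behaviour features (random placeholders).
        # Each row of U holds a node's graded membership in each of the c clusters.
        import numpy as np

        def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
            rng = np.random.default_rng(seed)
            U = rng.random((len(X), c))
            U /= U.sum(axis=1, keepdims=True)                   # rows sum to 1
            for _ in range(iters):
                W = U ** m
                centers = (W.T @ X) / W.sum(axis=0)[:, None]    # weighted cluster centers
                d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
                U = 1.0 / d ** (2.0 / (m - 1.0))                # inverse-distance memberships
                U /= U.sum(axis=1, keepdims=True)
            return centers, U

        X = np.random.default_rng(1).random((60, 4))            # 60 nodes, 4 features
        centers, memberships = fuzzy_c_means(X)
        print(centers.shape, memberships[:3].round(2))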

  12. Survey of Techniques for Deep Web Source Selection and Surfacing the Hidden Web Content

    OpenAIRE

    Khushboo Khurana; M.B. Chandak

    2016-01-01

    Large and continuously growing dynamic web content has created new opportunities for large-scale data analysis in recent years. There is a huge amount of information that traditional web crawlers cannot access, since they use a link-analysis technique by which only the surface web can be reached. Traditional search engine crawlers require web pages to be linked to other pages via hyperlinks, causing a large amount of web data to be hidden from the crawlers. Enormous amounts of data are available in...

  13. Sampling and analytical techniques for an interim survey in the South Carolina lowcountry

    Science.gov (United States)

    Richard L. Welch; Robert A. Cathey

    1976-01-01

    Remeasurement of 675 permanent sample locations in the South Carolina Lowcountry using modified sampling techniques showed that net growth of pine for the 6 years 1968-1974 was 637.0 million cubic feet while removals were slightly over 390.6 million cubic feet. In 1974, there were 1,533.5 million cubic feet of pine in the area with that portion in sawtimber size...

  14. Survey of Techniques for Deep Web Source Selection and Surfacing the Hidden Web Content

    OpenAIRE

    Khushboo Khurana; M B Chandak

    2016-01-01

    Large and continuously growing dynamic web content has created new opportunities for large-scale data analysis in recent years. There is a huge amount of information that traditional web crawlers cannot access, since they use a link-analysis technique by which only the surface web can be reached. Traditional search engine crawlers require web pages to be linked to other pages via hyperlinks, causing a large amount of web data to be hidden from the crawlers. Enormous amounts of data are available in...

  15. Contextualising Water Use in Residential Settings: A Survey of Non-Intrusive Techniques and Approaches

    Directory of Open Access Journals (Sweden)

    Davide Carboni

    2016-05-01

    Full Text Available Water monitoring in households is important to ensure the sustainability of fresh water reserves on our planet. It provides stakeholders with the statistics required to formulate optimal strategies in residential water management. However, this should not be prohibitive, and appliance-level water monitoring cannot practically be achieved by deploying sensors on every faucet or water-consuming device of interest, due to the higher hardware costs and complexity, not to mention the risk of accidental leakages that can derive from the extra plumbing needed. Machine learning and data mining techniques are promising approaches for analysing monitored data to obtain non-intrusive water usage disaggregation. This is because they can discern water usage from the aggregated data acquired from a single point of observation. This paper provides an overview of water usage disaggregation systems and related techniques adopted for water event classification. The state of the art of algorithms and testbeds used for fixture recognition is reviewed, and a discussion of the prominent challenges and future research is also included.

  16. A Survey On: Content Based Image Retrieval Systems Using Clustering Techniques For Large Data sets

    Directory of Open Access Journals (Sweden)

    Monika Jain

    2011-12-01

    Full Text Available Content-based image retrieval (CBIR) is a new but widely adopted method for finding images from vast and unannotated image databases. As networks and multimedia technologies become more popular, users are not satisfied with traditional information retrieval techniques, so nowadays content-based image retrieval (CBIR) is becoming a source of exact and fast retrieval. In recent years, a variety of techniques have been developed to improve the performance of CBIR. Data clustering is an unsupervised method for extracting hidden patterns from huge data sets. With large data sets there is a possibility of high dimensionality, and achieving both accuracy and efficiency for high-dimensional data sets with an enormous number of samples is a challenging arena. In this paper the clustering techniques are discussed and analysed. We also propose a method, HDK, that uses more than one clustering technique to improve the performance of CBIR. This method makes use of hierarchical and divide-and-conquer K-Means clustering techniques with equivalency and compatible relation concepts to improve the performance of K-Means for use in high-dimensional datasets. It also introduces features like color, texture and shape for an accurate and effective retrieval system.
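
    The clustering step described above can be illustrated with plain K-Means: images are grouped by their feature vectors, and a query is compared only against the nearest cluster instead of the whole database. The sketch below uses random placeholder descriptors and is not the proposed HDK method itself.

        # Minimal sketch: K-Means clustering of image descriptors to narrow a CBIR search.
        # Feature vectors are random placeholders for colour/texture/shape descriptors.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(42)
        features = rng.random((1000, 64))           # 1000 images, 64-D descriptors
        kmeans = KMeans(n_clusters=16, n_init=10, random_state=0).fit(features)

        def retrieve(query, top_k=5):
            cluster = kmeans.predict(query[None, :])[0]          # query's nearest cluster
            members = np.where(kmeans.labels_ == cluster)[0]     # candidates in that cluster
            dists = np.linalg.norm(features[members] - query, axis=1)
            return members[np.argsort(dists)[:top_k]]            # best matches only

        print(retrieve(rng.random(64)))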

  17. Contextualising Water Use in Residential Settings: A Survey of Non-Intrusive Techniques and Approaches.

    Science.gov (United States)

    Carboni, Davide; Gluhak, Alex; McCann, Julie A; Beach, Thomas H

    2016-05-20

    Water monitoring in households is important to ensure the sustainability of fresh water reserves on our planet. It provides stakeholders with the statistics required to formulate optimal strategies in residential water management. However, this should not be prohibitive and appliance-level water monitoring cannot practically be achieved by deploying sensors on every faucet or water-consuming device of interest due to the higher hardware costs and complexity, not to mention the risk of accidental leakages that can derive from the extra plumbing needed. Machine learning and data mining techniques are promising techniques to analyse monitored data to obtain non-intrusive water usage disaggregation. This is because they can discern water usage from the aggregated data acquired from a single point of observation. This paper provides an overview of water usage disaggregation systems and related techniques adopted for water event classification. The state-of-the art of algorithms and testbeds used for fixture recognition are reviewed and a discussion on the prominent challenges and future research are also included.

  18. Development of tongue-shaped and multilobate rock glaciers in alpine environments - Interpretations from ground penetrating radar surveys

    Science.gov (United States)

    Degenhardt, John J., Jr.

    2009-08-01

    Rock glaciers occur as lobate or tongue-shaped landforms composed of mixtures of poorly sorted, angular to blocky rock debris and ice. These landforms serve as primary sinks for ice and water storage in mountainous areas and represent transitional forms in the debris transport system, accounting for ~ 60% of all mass transport in some alpine regions. Observations of active (flowing) alpine rock glaciers indicate a common association between the debris that originates from cirque headwalls and the depositional lobes that comprise them. The delivery of this debris to the rock glacier is regulated primarily by the rate of headwall erosion and the point of origin of debris along the headwall. These factors control the relative movement of individual depositional lobes as well as the overall rate of propagation of a rock glacier. In recent geophysical studies, a number of alpine rock glaciers on Prins Karls Forland and Nordenskiöldland, Svalbard, Norway, and the San Juan Mountains of southwest Colorado, USA, have been imaged using ground penetrating radar (GPR) to determine if a relationship exists between the internal structure and surface morphology. Results indicate that the overall morphologic expression of alpine rock glaciers is related to lobate deposition during catastrophic episodes of rockfall that originated from associated cirque headwalls. Longitudinal GPR profiles from alpine rock glaciers examined in this study suggest that the difference in gross morphology between the lobate and tongue-shaped rock glaciers can be attributed primarily (but not exclusively) to cirque geometry, frequency and locations of debris discharge within the cirque, and the trend and magnitude of valley gradient in relation to cirque orientation. Collectively, these factors determine the manner in which high magnitude debris discharges, which seem to be the primary mechanism of formation, accumulate to form these rock glaciers.
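
    Interpreting the GPR profiles discussed above requires converting two-way travel times into depths using the radar velocity of the debris-ice mixture. A minimal sketch of that conversion is given below; the permittivity and travel-time values are illustrative only.

        # Minimal sketch: converting GPR two-way travel time to reflector depth.
        # The relative permittivity is illustrative; ice-rich debris is often quoted near 4-8.
        C_M_PER_NS = 0.3   # speed of light in m/ns

        def gpr_depth(two_way_time_ns, eps_r):
            velocity = C_M_PER_NS / eps_r ** 0.5     # radar velocity in the medium (m/ns)
            return velocity * two_way_time_ns / 2.0  # halve the two-way travel time

        print(round(gpr_depth(200.0, 5.0), 1), "m")  # 200 ns reflector in eps_r = 5 material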

  19. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2007-01-01

    Full Text Available In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. Here, a comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of the application of robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today’s cars need is presented through several experimental results that show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers’ interest in the fusion of intelligent sensors and optimal signal processing techniques.
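
    The archetype of the optimal filters contrasted with classical filters above is the Kalman filter. As a hedged illustration, the sketch below applies a scalar Kalman filter with a random-walk state model to a noisy synthetic sensor signal; the noise variances are arbitrary tuning values, not figures from the paper.

        # Minimal sketch: scalar Kalman filter (random-walk state model) on a noisy signal.
        # The process/measurement noise variances are arbitrary tuning values.
        import numpy as np

        def kalman_1d(measurements, q=1e-3, r=0.1):
            x, p = measurements[0], 1.0          # state estimate and its variance
            filtered = []
            for z in measurements:
                p = p + q                        # predict step
                k = p / (p + r)                  # Kalman gain
                x = x + k * (z - x)              # update with the new measurement
                p = (1.0 - k) * p
                filtered.append(x)
            return np.array(filtered)

        rng = np.random.default_rng(0)
        truth = np.sin(np.linspace(0.0, 6.0, 200))
        noisy = truth + rng.normal(scale=0.3, size=200)
        print("residual std:", round(float(np.std(kalman_1d(noisy) - truth)), 3))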

  20. Campaign 9 of the K2 Mission: Observational Parameters, Scientific Drivers, and Community Involvement for a Simultaneous Space- and Ground-based Microlensing Survey

    Science.gov (United States)

    Henderson, Calen B.; Poleski, Radosław; Penny, Matthew; Street, Rachel A.; Bennett, David P.; Hogg, David W.; Gaudi, B. Scott; K2 Campaign 9 Microlensing Science Team; Zhu, W.; Barclay, T.; Barentsen, G.; Howell, S. B.; Mullally, F.; Udalski, A.; Szymański, M. K.; Skowron, J.; Mróz, P.; Kozłowski, S.; Wyrzykowski, Ł.; Pietrukowicz, P.; Soszyński, I.; Ulaczyk, K.; Pawlak, M.; OGLE Project, The; Sumi, T.; Abe, F.; Asakura, Y.; Barry, R. K.; Bhattacharya, A.; Bond, I. A.; Donachie, M.; Freeman, M.; Fukui, A.; Hirao, Y.; Itow, Y.; Koshimoto, N.; Li, M. C. A.; Ling, C. H.; Masuda, K.; Matsubara, Y.; Muraki, Y.; Nagakane, M.; Ohnishi, K.; Oyokawa, H.; Rattenbury, N.; Saito, To.; Sharan, A.; Sullivan, D. J.; Tristram, P. J.; Yonehara, A.; MOA Collaboration; Bachelet, E.; Bramich, D. M.; Cassan, A.; Dominik, M.; Figuera Jaimes, R.; Horne, K.; Hundertmark, M.; Mao, S.; Ranc, C.; Schmidt, R.; Snodgrass, C.; Steele, I. A.; Tsapras, Y.; Wambsganss, J.; RoboNet Project, The; Bozza, V.; Burgdorf, M. J.; Jørgensen, U. G.; Calchi Novati, S.; Ciceri, S.; D'Ago, G.; Evans, D. F.; Hessman, F. V.; Hinse, T. C.; Husser, T.-O.; Mancini, L.; Popovas, A.; Rabus, M.; Rahvar, S.; Scarpetta, G.; Skottfelt, J.; Southworth, J.; Unda-Sanzana, E.; The MiNDSTEp Team; Bryson, S. T.; Caldwell, D. A.; Haas, M. R.; Larson, K.; McCalmont, K.; Packard, M.; Peterson, C.; Putnam, D.; Reedy, L.; Ross, S.; Van Cleve, J. E.; K2C9 Engineering Team; Akeson, R.; Batista, V.; Beaulieu, J.-P.; Beichman, C. A.; Bryden, G.; Ciardi, D.; Cole, A.; Coutures, C.; Foreman-Mackey, D.; Fouqué, P.; Friedmann, M.; Gelino, C.; Kaspi, S.; Kerins, E.; Korhonen, H.; Lang, D.; Lee, C.-H.; Lineweaver, C. H.; Maoz, D.; Marquette, J.-B.; Mogavero, F.; Morales, J. C.; Nataf, D.; Pogge, R. W.; Santerne, A.; Shvartzvald, Y.; Suzuki, D.; Tamura, M.; Tisserand, P.; Wang, D.

    2016-12-01

    K2's Campaign 9 (K2C9) will conduct a ~3.7 deg2 survey toward the Galactic bulge from 2016 April 22 through July 2 that will leverage the spatial separation between K2 and the Earth to facilitate measurement of the microlens parallax π_E for ≳ 170 microlensing events. These will include several that are planetary in nature as well as many short-timescale microlensing events, which are potentially indicative of free-floating planets (FFPs). These satellite parallax measurements will in turn allow for the direct measurement of the masses of and distances to the lensing systems. In this article we provide an overview of the K2C9 space- and ground-based microlensing survey. Specifically, we detail the demographic questions that can be addressed by this program, including the frequency of FFPs and the Galactic distribution of exoplanets, the observational parameters of K2C9, and the array of resources dedicated to concurrent observations. Finally, we outline the avenues through which the larger community can become involved, and generally encourage participation in K2C9, which constitutes an important pathfinding mission and community exercise in anticipation of WFIRST.

  1. Proximity morality in medical school – medical students forming physician morality "on the job": Grounded theory analysis of a student survey

    Directory of Open Access Journals (Sweden)

    Sallin Karl

    2007-08-01

    Full Text Available Abstract Background The value of ethics education has been questioned. Therefore we did a student survey on attitudes about the teaching of ethics in Swedish medical schools. Methods Questionnaire survey on attitudes to ethics education with 409 Swedish medical students participating. We analyzed > 8000 words of open-ended responses and multiple-choice questions using classic grounded theory procedures. Results In this paper we suggest that medical students take a proximity morality stance towards their ethics education, meaning that they want to form physician morality "on the job". This involves comprehensive ethics courses in which quality lectures provide "ethics grammar" and, together with attitude exercises and vignette reflections, nurture tutored group discussions. Goals of forming physician morality include developing a professional identity, handling diversity of religious and existential worldviews, training students described as ethically naive, processing difficult clinical experiences, and resisting negative role modeling from physicians in clinical or teaching situations, some of whom engage in "ethics suppression" by controlling sensitive topic discussions and serving students politically correct attitudes. Conclusion We found that medical students have a proximity morality attitude towards ethics education. Rather than being taught ethics, they want to form their own physician morality through tutored group discussions in comprehensive ethics courses.

  2. Proximity morality in medical school--medical students forming physician morality "on the job": grounded theory analysis of a student survey.

    Science.gov (United States)

    Thulesius, Hans O; Sallin, Karl; Lynoe, Niels; Löfmark, Rurik

    2007-08-06

    The value of ethics education has been questioned. Therefore we did a student survey on attitudes about the teaching of ethics in Swedish medical schools. Questionnaire survey on attitudes to ethics education with 409 Swedish medical students participating. We analyzed > 8000 words of open-ended responses and multiple-choice questions using classic grounded theory procedures. In this paper we suggest that medical students take a proximity morality stance towards their ethics education, meaning that they want to form physician morality "on the job". This involves comprehensive ethics courses in which quality lectures provide "ethics grammar" and, together with attitude exercises and vignette reflections, nurture tutored group discussions. Goals of forming physician morality are developing a professional identity, handling diversity of religious and existential worldviews, training students described as ethically naive, processing difficult clinical experiences, and desisting negative role modeling from physicians in clinical or teaching situations, some engaging in "ethics suppression" by controlling sensitive topic discussions and serving students politically correct attitudes. We found that medical students have a proximity morality attitude towards ethics education. Rather than being taught ethics they want to form their own physician morality through tutored group discussions in comprehensive ethics courses.

  3. Survey of techniques for reduction of wind turbine blade trailing edge noise.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin

    2011-08-01

    Aerodynamic noise from wind turbine rotors leads to constraints in both rotor design and turbine siting. The primary source of aerodynamic noise on wind turbine rotors is the interaction of turbulent boundary layers on the blades with the blade trailing edges. This report surveys concepts that have been proposed for trailing edge noise reduction, with emphasis on concepts that have been tested at either sub-scale or full-scale. These concepts include trailing edge serrations, low-noise airfoil designs, trailing edge brushes, and porous trailing edges. The demonstrated noise reductions of these concepts are cited, along with their impacts on aerodynamic performance. An assessment is made of future research opportunities in trailing edge noise reduction for wind turbine rotors.

  4. Elevation Change of Drangajokull, Iceland, from Cloud-Cleared ICESat Repeat Profiles and GPS Ground-Survey Data

    Science.gov (United States)

    Shuman, Christopher A.; Sigurdsson, Oddur; Williams, Richard, Jr.; Hall, Dorothy K.

    2009-01-01

    Located on the Vestfirdir (Northwest Fjords), Drangajokull is the northernmost ice cap in Iceland. Currently, the ice cap exceeds 900 m in elevation and covered an area of approx. 146 sq km in August 2004. It was about 204 sq km in area during 1913-1914 and so has lost mass during the 20th century. Drangajokull's size and accessibility for GPS surveys, as well as the availability of repeat satellite altimetry profiles since late 2003, make it a good subject for change-detection analysis. The ice cap was surveyed by four GPS-equipped snowmobiles on 19-20 April 2005 and has been profiled in two places by Ice, Cloud, and land Elevation Satellite (ICESat) 'repeat tracks' fifteen times from late 2003 to early 2009. In addition, traditional mass-balance measurements have been taken seasonally at a number of locations across the ice cap and they show positive net mass balances in 2004/2005 through 2006/2007. Mean elevation differences between the temporally closest ICESat profiles and the GPS-derived digital elevation model (DEM) (ICESat - DEM) are about 1.1 m but have standard deviations of 3 to 4 m. Differencing all ICESat repeats from the DEM shows that the overall elevation difference trend since 2003 is negative, with losses of as much as 1.5 m/a from same-season to same-season (and similar elevation) data subsets. However, the mass balance assessments by traditional stake re-measurement methods suggest that the elevation changes where ICESat tracks 0046 and 0307 cross Drangajokull are not representative of the whole ice cap. Specifically, the area has experienced positive mass balance years during the time frame when ICESat data indicate substantial losses. This analysis suggests that ICESat-derived elevations may be used for multi-year change detection relative to other data but suggests that large uncertainties remain. These uncertainties may be due to geolocation uncertainty on steep slopes and continuing cloud cover that limits temporal and spatial coverage across the

  5. Environmental DNA sampling is more sensitive than a traditional survey technique for detecting an aquatic invader.

    Science.gov (United States)

    Smart, Adam S; Tingley, Reid; Weeks, Andrew R; van Rooyen, Anthony R; McCarthy, Michael A

    2015-10-01

    Effective management of alien species requires detecting populations in the early stages of invasion. Environmental DNA (eDNA) sampling can detect aquatic species at relatively low densities, but few studies have directly compared detection probabilities of eDNA sampling with those of traditional sampling methods. We compare the ability of a traditional sampling technique (bottle trapping) and eDNA to detect a recently established invader, the smooth newt Lissotriton vulgaris vulgaris, at seven field sites in Melbourne, Australia. Over a four-month period, per-trap detection probabilities ranged from 0.01 to 0.26 among sites where L. v. vulgaris was detected, whereas per-sample eDNA estimates were much higher (0.29-1.0). Detection probabilities of both methods varied temporally (across days and months), but temporal variation appeared to be uncorrelated between methods. Only estimates of spatial variation were strongly correlated across the two sampling techniques. Environmental variables (water depth, rainfall, ambient temperature) were not clearly correlated with detection probabilities estimated via trapping, whereas eDNA detection probabilities were negatively correlated with water depth, possibly reflecting higher eDNA concentrations at lower water levels. Our findings demonstrate that eDNA sampling can be an order of magnitude more sensitive than traditional methods, and illustrate that traditional- and eDNA-based surveys can provide independent information on species distributions when occupancy surveys are conducted over short timescales.
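
    To make the sensitivity contrast above concrete, a per-visit detection probability can be converted into the probability of at least one detection over repeated sampling. This small sketch uses the lower ends of the per-trap and per-sample ranges quoted above purely for illustration; it is not the occupancy model fitted by the authors:

```python
def prob_at_least_one_detection(p_single, n_replicates):
    """Probability of detecting the species at least once in n independent replicates."""
    return 1.0 - (1.0 - p_single) ** n_replicates

# Lower ends of the ranges reported above, with 10 replicates each (illustrative):
print(prob_at_least_one_detection(0.01, 10))  # trapping, worst case: ~0.10
print(prob_at_least_one_detection(0.29, 10))  # eDNA, worst case: ~0.97
```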

  6. Ice thickness, volume and subglacial topography of Urumqi Glacier No. 1, Tianshan mountains, central Asia, by ground penetrating radar survey

    Indian Academy of Sciences (India)

    Puyu Wang; Zhongqin Li; Shuang Jin; Ping Zhou; Hongbing Yao; Wenbin Wang

    2014-04-01

    The results of three ground penetrating radar surveys are presented, aiming to determine the ice thickness, volume and subglacial topography of Urumqi Glacier No. 1, Tianshan Mountains, central Asia. Results show that the ice is thicker in the center and thinner at both ends of the glacier. The bedrock is quite regular, with altitudes decreasing towards the ice front, showing a U-shaped subglacial valley. By comparison, typical ice thinning along the centerline of the East Branch of the glacier was 10–18 m for the period 1981–2006, reaching a maximum of ∼30 m at the terminus. The corresponding ice volume was 10296.2 × 10⁴ m³, 8797.9 × 10⁴ m³ and 8115.0 × 10⁴ m³ in 1981, 2001 and 2006, respectively. It has decreased by 21.2% during the past 25 years, which is the direct result of glacier thinning. In the same period, the ice thickness, area and terminus decreased by 12.2%, 10.3%, and 3.6%, respectively. These changes are responses to the regional climatic warming, which shows a dramatic increase of 0.6°C (10 a)⁻¹ during the period 1981–2006.
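
    The percentage loss quoted above follows directly from the reported volumes; a quick arithmetic check:

```python
# Ice volumes of Urumqi Glacier No. 1 reported above (units: 1e4 m^3)
v_1981, v_2001, v_2006 = 10296.2, 8797.9, 8115.0

loss_fraction = (v_1981 - v_2006) / v_1981
print(f"Volume loss 1981-2006: {loss_fraction:.1%}")  # ~21.2%, matching the abstract
```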

  7. Using SERVQUAL and Kano research techniques in a patient service quality survey.

    Science.gov (United States)

    Christoglou, Konstantinos; Vassiliadis, Chris; Sigalas, Ioakim

    2006-01-01

    This article presents the results of a service quality study. After an introduction to the SERVQUAL and the Kano research techniques, a Kano analysis of 75 patients from the General Hospital of Katerini in Greece is presented. The service quality criterion used satisfaction and dissatisfaction indices. The Kano statistical analysis process results strengthened the hypothesis of previous research regarding the importance of personal knowledge, the courtesy of the hospital employees and their ability to convey trust and confidence (assurance dimension). Managerial suggestions are made regarding the best way of acting and approaching hospital patients based on the basic SERVQUAL model.
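
    The satisfaction and dissatisfaction indices used in Kano analysis are commonly computed from the counts of responses classified as Attractive (A), One-dimensional (O), Must-be (M) and Indifferent (I). A minimal sketch of that standard calculation follows; the counts are hypothetical and are not the hospital data themselves:

```python
def kano_indices(a, o, m, i):
    """Berger-style satisfaction (SI) and dissatisfaction (DI) indices.

    a, o, m, i: response counts for Attractive, One-dimensional,
    Must-be and Indifferent classifications of a quality attribute.
    """
    total = a + o + m + i
    si = (a + o) / total   # how much the attribute can raise satisfaction
    di = -(o + m) / total  # how much its absence can raise dissatisfaction
    return si, di

# Hypothetical counts for one service attribute (e.g. staff courtesy):
print(kano_indices(a=20, o=35, m=15, i=5))
```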

  8. Adaptive InSAR combined with surveying techniques for an improved characterisation of active landslides (El Portalet)

    Science.gov (United States)

    Duro, Javier; Albiol, David; Sánchez, Francisco; Herrera, Gerardo; García Davalillo, Juan Carlos; Fernandez Merodo, Jose Antonio; Allasia, Paolo; Lollino, Piernicola; Manconi, Andrea

    2014-05-01

    InSAR and the Persistent Scatterer Interferometry (PSI) are well established techniques for monitoring urban and rural areas. Besides the large number of SAR data available from the past, the current and forthcoming space-borne SAR sensors offer the possibility of selecting the optimal acquisition configuration (wavelength, resolution, incidence angle, etc.) for each application. However, optimal data takes are not always possible and/or the processing area is difficult to analyse from an InSAR point of view. In such situations, additional and adaptive InSAR developments combined with other surveying techniques provide consistent solutions that meet the requirements of different application cases. This work presents an advanced InSAR processing approach adapted to an active slow-deformation landslide in a mountainous area. The presentation will show the benefits of applying advanced and adaptive filtering strategies for improving the InSAR quality in highly decorrelated environments. The availability of artificial corner reflectors over the area of interest makes it possible to tune the filtering procedure and thus maximize the detection and exploitation of natural targets (bare soil, roads, rocks) as measurement points while preserving the phase characteristics over individual and punctual targets (building corners, poles). The new results will be evaluated in terms of the final density and quality of the measurement points that can be retrieved. The results will show that a very high density of measurements improves the detection of the deformation gradients and their perimeters, resulting in a more accurate characterization of the landslide area. The area of study is El Portalet, an active slow-deformation landslide area in the central Spanish Pyrenees. For many years the slope of interest has been monitored with several surveying techniques like DGPS, extensometers, inclinometers, GB-SAR and InSAR, jointly with an extensive geological interpretation. Currently, in the frame of the FP7 Project

  9. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of)

    2015-05-15

    One of the applications using PSA is a risk monitor. Risk monitoring is a real-time analysis tool that determines the real-time risk based on the actual state of components and systems. In order to utilize it more effectively, methodologies that manipulate data from prognostics have been suggested. Generally, prognostics comprehensively includes not only prognosis but also monitoring and diagnosis, and prognostic methods require condition monitoring. In the case of applying PHM to a PSA model, the latest condition of NPPs can be identified more clearly. For reducing the conservatism and uncertainties, we previously suggested a concept that updates the initiating event frequency in a PSA model by using a Bayesian approach, which is one of the prognostics techniques. That research showed the possibility of updating PSA more correctly by using data. In reliability theory, the bathtub curve is divided into three parts (infant failure, constant and random failure, wearout failure). In this paper, in order to investigate the applicability of prognostic methods in updating quantitative data in a PSA model, the OLM acceptance criteria from NUREG, the concept of how to use prognostics in PSA, and the enabling prognostic techniques are presented. The motivation for prognostics is that improved predictive capabilities using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.
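
    Bayesian updating of an initiating event frequency of the kind mentioned above is often illustrated with a conjugate gamma-Poisson model, in which monitored operating experience (events observed over an exposure time) updates a prior frequency. The sketch below shows that generic calculation with hypothetical numbers; it is not the specific PSA model or prior discussed in the paper:

```python
def update_frequency(prior_alpha, prior_beta, n_events, exposure_years):
    """Gamma-Poisson (conjugate) Bayesian update of an initiating event frequency.

    Prior: frequency ~ Gamma(alpha, beta), mean alpha/beta [events/year].
    Evidence: n_events observed over exposure_years.
    Posterior: Gamma(alpha + n_events, beta + exposure_years).
    Returns the posterior mean frequency.
    """
    post_alpha = prior_alpha + n_events
    post_beta = prior_beta + exposure_years
    return post_alpha / post_beta

# Hypothetical: prior mean 0.01/yr (alpha=0.5, beta=50), no events in 20 years of monitoring
print(update_frequency(0.5, 50.0, n_events=0, exposure_years=20.0))
```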

  10. Survey on Security Issues in Cloud Computing and Associated Mitigation Techniques

    CERN Document Server

    Bhadauria, Rohit

    2012-01-01

    Cloud Computing holds the potential to eliminate the requirements for setting up of high-cost computing infrastructure for IT-based solutions and services that the industry uses. It promises to provide a flexible IT architecture, accessible through internet for lightweight portable devices. This would allow multi-fold increase in the capacity or capabilities of the existing and new software. In a cloud computing environment, the entire data reside over a set of networked resources, enabling the data to be accessed through virtual machines. Since these data-centers may lie in any corner of the world beyond the reach and control of users, there are multifarious security and privacy challenges that need to be understood and taken care of. Also, one can never deny the possibility of a server breakdown that has been witnessed, rather quite often in the recent times. There are various issues that need to be dealt with respect to security and privacy in a cloud computing scenario. This extensive survey paper aims to...

  11. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  12. ADAPTIVE E-LEARNING TECHNIQUES IN THE DEVELOPMENT OF TEACHING ELECTRONIC PORTFOLIO – A SURVEY

    Directory of Open Access Journals (Sweden)

    DEKSON D.E.,

    2010-09-01

    Full Text Available Emerging technologies of communication and information influence society, in particular the educational system, in new directions. Media-based educational systems are becoming more popular today and a vast student population relies on them for learning. In the technologically emerging education system, it is necessary to have an e-learning system which can understand the learner's preferences and make attempts to deliver content accordingly. It has become a challenging task to understand the learning preferences of the learners and adapt a method to offer content that suits their learning styles. Many educationists and researchers in education have made attempts and conducted research on delivering adaptive content. The learners of today are on the lookout for content that would suit them in terms of their taste, understanding level, learning curve, own preferences and personal traits. The learning process would be more efficient if we could satisfy the above needs of the learners. This paper makes a survey of the various means of offering adaptive content in an e-learning environment and explores the possible ways of achieving adaptability in learning systems. We conduct a study on the various models of adaptive content delivery systems and propose newer methods of delivering adaptive content in an e-learning environment.

  13. A survey of routing techniques in store-and-forward and wormhole interconnects.

    Energy Technology Data Exchange (ETDEWEB)

    Holman, David Michael; Lee, David S.

    2008-01-01

    This paper presents an overview of algorithms for directing messages through networks of varying topology. These are commonly referred to as routing algorithms in the literature that is presented. In addition to providing background on networking terminology and router basics, the paper explains the issues of deadlock and livelock as they apply to routing. After this, there is a discussion of routing algorithms for both store-and-forward and wormhole-switched networks. The paper covers both algorithms that do and do not adapt to conditions in the network. Techniques targeting structured as well as irregular topologies are discussed. Following this, strategies for routing in the presence of faulty nodes and links in the network are described.

  14. Survey of Region-Based Text Extraction Techniques for Efficient Indexing of Image/Video Retrieval

    Directory of Open Access Journals (Sweden)

    Samabia Tehsin

    2014-11-01

    Full Text Available With the dramatic increase in multimedia data, the escalating trend of the internet, and the amplifying use of image/video capturing devices, content-based indexing and text extraction are gaining more and more importance in the research community. In the last decade, many techniques for text extraction have been reported in the literature. Text extraction from images/videos generally comprises text detection and localization, text tracking, text segmentation and optical character recognition (OCR). This paper intends to highlight the contributions and limitations of the text detection, localization and tracking phases. The problem is exigent due to variations in font styles, size and color, text orientations, animations and backgrounds. The paper can serve as the beacon-house for novice researchers of the text extraction community.

  15. Advanced interpretation of ground motion using Persistent Scatterer Interferometry technique: the Alto Guadalentín Basin (Spain) case of study

    Science.gov (United States)

    Bonì, Roberta; Herrera, Gerardo; Meisina, Claudia; Notti, Davide; Zucca, Francesco; Bejar, Marta; González, Pablo; Palano, Mimmo; Tomás, Roberto; Fernandez, José; Fernández-Merodo, José; Mulas, Joaquín; Aragón, Ramón; Mora, Oscar

    2014-05-01

    Subsidence related to fluid withdrawal has occurred in numerous regions of the world. The phenomenon is an important hazard closely related to the development of urban areas. The analysis of the deformations requires extensive and continuous spatial and temporal monitoring to prevent the negative effects of such risks on structures and infrastructures. Deformation measurements are fundamental in order to identify the extension of the affected area, to evaluate the temporal evolution of deformation velocities and to identify the main control mechanisms. Differential SAR interferometry represents an advanced remote sensing tool which can map displacements at very high spatial resolution. The Persistent Scatterer Interferometry (PSI) technique is a class of SAR interferometry that uses point-wise radar targets (PS) on the ground whose phase is not affected by temporal and geometrical decorrelation. Starting from a set of images, this technique generates two main products: the displacement rate along the line of sight (LOS) of each PS, and the LOS displacement time series of individual PS. In this work, SAR data with different spatio-temporal resolution were used to study the displacements that occurred from 1992 to 2012 in the Alto Guadalentín Basin (southern Spain), where the city of Lorca is located. The area is affected by the highest rate of subsidence measured in Europe (>10 cm/yr) related to long-term exploitation of the aquifer (González et al. 2011). The objectives of the work were 1) to analyse land subsidence evolution over a 20-year period with the PSI technique; 2) to compare the spatial and temporal resolution of SAR data acquired by different sensors, 3) to investigate the causes that could explain this land motion. The SAR data have been obtained with ERS-1/2 & ENVISAT (1992-2007), ALOS PALSAR (2007-2010) and COSMO-SkyMed (2011-2012) images, processed with the Stable Point Network (SPN) technique. The PSI data obtained from different satellites from 1992 to 2012
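
    PSI products report motion along the radar line of sight; when the deformation can be assumed predominantly vertical, as for subsidence driven by aquifer exploitation, LOS rates are often projected to vertical rates using the incidence angle. A minimal sketch of that common conversion follows; the incidence angle and velocity below are illustrative, not values from this study:

```python
import math

def los_to_vertical(v_los_mm_yr, incidence_deg):
    """Project a line-of-sight velocity to vertical, assuming purely vertical motion."""
    return v_los_mm_yr / math.cos(math.radians(incidence_deg))

# Hypothetical: -90 mm/yr along LOS at a 23 deg incidence angle (typical for ERS/ENVISAT)
print(los_to_vertical(-90.0, 23.0))  # ~ -98 mm/yr of subsidence
```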

  16. A Survey on Web Services Discovery Techniques

    Institute of Scientific and Technical Information of China (English)

    毛雪; 关佶红

    2011-01-01

    The research on Web services has become a hot spot in both academic and industrial circles. One of the most important issues is how to discover Web services for requestors efficiently and effectively. Nowadays, more and more novel Web service discovery methods are being presented. This paper introduces current Web service discovery techniques from the aspects of system structure, Web service description, and matching methods, and analyses their advantages and disadvantages. Based on this, the basic issues facing current Web service discovery techniques are brought out and future research directions are pointed out.

  17. Detection of the seasonal cyclic movement in historic buildings by means of surveying techniques

    Directory of Open Access Journals (Sweden)

    Valle-Melón, J. M.

    2011-03-01

    Full Text Available As in other engineering structures, historic buildings are conditioned by atmospheric changes which affect their size and shape. These effects follow a more or less cyclic pattern and do not normally put the stability of such buildings in jeopardy, since they are part of their natural dynamics. Nevertheless, the study of these effects provides valuable information to understand the behavior of both the building and the materials it is made of. This paper arose from the project of geometric monitoring of a presumably unstable historic building: the church of Santa María la Blanca in Agoncillo (La Rioja, Spain), which is being observed with conventional surveying equipment. The computations of the different epochs show several movements that can be explained as due to seasonal cycles.


  18. EVALUATION OF BONE-TO-IMPLANT CONTACT AND BONE DENSITY ADJACENT TO TITANIUM IMPLANTS USING A STEREOLOGICAL TECHNIQUE ON GROUND SECTIONS

    Directory of Open Access Journals (Sweden)

    Dimitra Balatsouka

    2011-05-01

    Full Text Available When bone implants have to be examined in situ, ground sections are required. Histomorphometric measurements are usually performed on two-dimensional sections, causing biased results when they are wrongly extrapolated to 3D without any knowledge of stereology. Unbiased results can only be obtained using stereological principles. The aim of the study was to describe an unbiased design for evaluating bone-to-implant contact (BIC) and peri-implant bone density (BD-i) in three dimensions. The unbiased design was based on a fixed-axis vertical random sampling technique. Three bone-implant blocks were collected from 3 rabbits. Four sections were obtained from each animal using a fixed-axis vertical random sampling technique. The BIC was estimated by creating a stereological method based on a systematic test line set. The BD-i was estimated using a design based on a systematic point set. The efficiency of the systematic sampling was evaluated for each sampling level: the coefficient of error CE_w for the systematic test line set, CE_Syst between the 4 rotated sections, and CE_Noise for the reproducibility. These variances were compared to the biological variation (CV) between animals. The mean CE_w was 5 to 6%; the CE_Syst was 7.8% for the BIC estimates and 5.7% for the BD-i estimates; the CE_Noise was 5.8% for the BIC and 7.7% for the BD-i. The CV was 19% for the BIC estimates and 24% for the BD-i estimates. These results demonstrated that the stereological technique used in the present study is a very efficient method to obtain unbiased estimates of BIC and BD-i in 3D on 2D implant-bone sections.
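
    As a rough illustration of the test-system counting behind the BIC and BD-i estimators described above, both quantities reduce to ratio estimates over a systematic test set. This is a schematic sketch of that counting idea, not the authors' implementation:

```python
def bic_estimate(intersections_bone, intersections_total):
    """Bone-to-implant contact as the fraction of implant-surface test-line
    intersections that fall on bone rather than soft tissue."""
    return intersections_bone / intersections_total

def bd_estimate(points_on_bone, points_total):
    """Peri-implant bone density as the fraction of systematic test points
    within the reference region that hit bone."""
    return points_on_bone / points_total

# Hypothetical counts from one ground section:
print(bic_estimate(42, 60), bd_estimate(85, 200))
```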

  19. Compilation of geology, mineralization, geochemistry and geophysical study of IP/RS & ground magnetic survey at Roudgaz area, southeast of Gonabad, Khorasan Razavi province

    Directory of Open Access Journals (Sweden)

    Hossein Hajimirzajan

    2013-04-01

    Full Text Available The Roudgaz prospect area is a Cu, Sn, Pb, Zn, and Au polymetal vein system located southeast of Gonabad in the northeast of the Lut block. Oxidant subvolcanic Tertiary rocks with monzonite to monzodiorite porphyry composition intruded the metamorphic rocks of the middle Jurassic. The majority of intrusive bodies are affected by carbonation, argillic, sericitic, and silicification-tourmaline alteration. Mineralization in the area is fault-controlled and is present as veins with a dominant NW-SE direction and 85-90° dip. Primary minerals are quartz, tourmaline, chalcopyrite and pyrite, and secondary minerals are malachite, azurite, and goethite. Geochemical sampling using the chip composite method indicated high anomalies of Cu, Sn, Pb, and As (up to 10000 ppm), Zn (up to 5527 ppm), and Au (up to 325 ppb). A broad gossan zone is present in the area and is related to the oxidation of sulfide minerals. An IP/RS survey was performed over the geochemical anomalies to identify the location and extension of sulfide mineralization at depth. Generally, chargeability increases in gossan zones, veins, old workings and geochemical anomalies. Resistivity over the quartzite unit, and also in locations where the mineralized vein is associated with quartz, has a high anomaly of up to 425 ohm-m. Due to the high geochemical anomaly of Sn and its relation with reduced subvolcanic intrusives, a ground magnetic survey was performed to identify the location of the magnetite (oxidant) and ilmenite (reduced) series at depth. The variation of Total Magnetic Intensity (TMI) is 335.1 gamma in the TMI map. The highest magnetic anomalies in the RTP map are located to the north of the survey area, are related to the magnetite series (hornblende biotite monzodiorite porphyry) and extend to the south at depth. The lowest magnetic anomaly is located in the center of the survey area, and particularly to the east of Roudgaz village, correlating with the highest chargeability and geochemical anomaly. Based

  20. The first survey of airborne trace elements at airport using moss bag technique.

    Science.gov (United States)

    Vuković, Gordana; Urošević, Mira Aničić; Škrivanj, Sandra; Vergel, Konstantin; Tomašević, Milica; Popović, Aleksandar

    2017-06-01

    Air traffic represents an important mode of social mobility in the world, and many ongoing discussions relate to the impacts that air transportation has on local air quality. In this study, the moss Sphagnum girgensohnii was used for the first time in the assessment of trace element content at an international airport. The moss bags were exposed during the summer of 2013 at four sampling sites at the airport 'Nikola Tesla' (Belgrade, Serbia): runway (two), auxiliary runway and parking lot. According to the relative accumulation factor (RAF) and the limit of quantification of the moss bag technique (LOQT), the most abundant elements in the samples were Zn, Na, Cr, V, Cu and Fe. A comparison between the element concentrations at the airport and the corresponding values in different land use classes (urban central, suburban, industrial and green zones) across the city of Belgrade did not indicate that air traffic and the associated activities significantly contribute to trace element air pollution. This study emphasised moss bags as an easily operated and robust (bio)monitoring method suitable for assessing air quality within various microenvironments where positioning of reference instrumental devices is restricted.
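
    The relative accumulation factor (RAF) used above to rank elements is conventionally computed from the element concentration in the moss before and after exposure. A minimal sketch of that calculation with made-up concentrations, not the measured airport values:

```python
def relative_accumulation_factor(c_exposed, c_initial):
    """RAF = (C_exposed - C_initial) / C_initial for one element in the moss bag."""
    return (c_exposed - c_initial) / c_initial

# Hypothetical Zn concentrations (ug/g dry weight) before and after exposure:
print(relative_accumulation_factor(c_exposed=55.0, c_initial=20.0))  # RAF = 1.75
```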

  1. Survey of modular ontology techniques and their applications in the biomedical domain.

    Science.gov (United States)

    Pathak, Jyotishman; Johnson, Thomas M; Chute, Christopher G

    2009-08-01

    In the past several years, various ontologies and terminologies such as the Gene Ontology have been developed to enable interoperability across multiple diverse medical information systems. They provide a standard way of representing terms and concepts thereby supporting easy transmission and interpretation of data for various applications. However, with their growing utilization, not only has the number of available ontologies increased considerably, but they are also becoming larger and more complex to manage. Toward this end, a growing body of work is emerging in the area of modular ontologies where the emphasis is on either extracting and managing "modules" of an ontology relevant to a particular application scenario (ontology decomposition) or developing them independently and integrating into a larger ontology (ontology composition). In this paper, we investigate state-of-the-art approaches in modular ontologies focusing on techniques that are based on rigorous logical formalisms as well as well-studied graph theories. We analyze and compare how such approaches can be leveraged in developing tools and applications in the biomedical domain. We conclude by highlighting some of the limitations of the modular ontology formalisms and put forward additional requirements to steer their future development.

  2. A Survey of Surface Modification Techniques for Next-Generation Shape Memory Polymer Stent Devices

    Directory of Open Access Journals (Sweden)

    Tina Govindarajan

    2014-08-01

    Full Text Available The search for a single material with ideal surface properties and necessary mechanical properties is on-going, especially with regard to cardiovascular stent materials. Since the majority of stent problems arise from surface issues rather than bulk material deficiencies, surface optimization of a material that already contains the necessary bulk properties is an active area of research. Polymers can be surface-modified using a variety of methods to increase hemocompatibility by reducing either late-stage restenosis or acute thrombogenicity, or both. These modification methods can be extended to shape memory polymers (SMPs), in an effort to make these materials more surface compatible, based on the application. This review focuses on the role of surface modification of materials, mainly polymers, to improve the hemocompatibility of stent materials; additional discussion of other materials commonly used in stents is also provided. Although shape memory polymers are not yet extensively used for stents, they offer numerous benefits that may make them good candidates for next-generation stents. Surface modification techniques discussed here include roughening, patterning, chemical modification, and surface modification for biomolecule and drug delivery.

  3. A ground electromagnetic survey used to map sulfides and acid sulfate ground waters at the abandoned Cabin Branch Mine, Prince William Forest Park, northern Virginia gold-pyrite belt

    Science.gov (United States)

    Wynn, Jeffrey C.

    2000-01-01

    gold and silver. The environmental impact of massive sulfide deposits can be substantial. These deposits are characterized by high concentrations of heavy-metal sulfide minerals, hosted by silicate rocks. Thus, weathering of these deposits and their mine wastes has the potential to generate heavy-metal laden sulfuric acid that can have negative impacts on aquatic ecosystems. In addition, lead associated with solid mine wastes has the potential for human health impacts through ingestion. The heavy metals that are encountered in these deposits and are most likely to cause environmental impacts include copper, zinc, lead, cadmium, and arsenic. In addition, the weathering of pyrite releases large amounts of iron, and the acid generated attacks the country rocks and causes the release of large amounts of aluminum, which also can severely impact aquatic ecosystems. A reclamation attempt was made at the site in 1995, including construction of storm-water diversion trenches around the abandoned mine area, grading tailings away from the stream bank, addition of pulverized limestone and topsoil, and revegetation. The post-reclamation chemistry of shallow groundwaters (<3 meters deep) shows a neutral pH on the southwestern bank of the stream but pH of 4.1 to 4.5 on the northeastern bank. The dominant ions are Fe²⁺ and SO₄²⁻ (Seal, Haffner, Meier, and Pollio, 1999). A ground electromagnetic survey was conducted over the site in 1999 as part of a wider study (Seal, Haffner, and Meier, 1998a,b, 1999). It was hoped that a 3-D map of the soil conductivity derived from the survey could provide insight into the distribution of the mobilized sulfides present under the ground. This study was conducted in cooperation with the National Park Service

  4. Multidisciplinary Studies of the Fate and Transport of Contaminants in Ground Water at the U.S. Geological Survey Cape Cod Toxic Substances Hydrology Program Research Site, Massachusetts

    Science.gov (United States)

    Leblanc, D. R.; Smith, R. L.; Kent, D. B.; Barber, L. B.; Harvey, R. W.

    2008-12-01

    The U.S. Geological Survey conducts multidisciplinary research on the physical, chemical, and microbiological processes affecting ground-water contaminants of global concern at its Cape Cod Toxic Substances Hydrology Program site in Massachusetts, USA. The work centers on a 6-kilometer-long plume of treated wastewater in a glacial sand and gravel aquifer. The plume is characterized by distinct geochemical zones caused by the biodegradation of organic materials in treated wastewater that was disposed to the aquifer by rapid infiltration during the period 1936-95. A core group of hydrogeologists, geochemists, microbiologists, and geophysicists has been involved in the research effort for more than two decades. The effort has been enhanced by stable funding, a readily accessible site, a relatively simple hydrologic setting, and logistical support from an adjacent military base. The research team uses a three-part approach to plan and conduct research at the site. First, detailed spatial and temporal monitoring of the plume since the late 1970s provides field evidence of important contaminant-transport processes and provides the basis for multidisciplinary, process-oriented studies. Second, ground-water tracer experiments are conducted in various geochemical zones in the plume to study factors that control the rate and extent of contaminant transport. Several arrays of multilevel sampling devices, including an array with more than 15,000 individual sampling points, are used to conduct these experiments. Plume-scale (kilometers) and tracer-test-scale (1- 100 meters) studies are complemented by laboratory experiments and mathematical modeling of flow and reactive transport. Third, results are applied to the treated-wastewater plume, other contaminant plumes at the military base, and other sites nationally to evaluate the applicability of the findings and to point toward further research. Examples of findings to date include that (1) macrodispersivity can be related to

  5. Survey regarding the clinical practice of cardiac CT in Germany. Indications, scanning technique and reporting

    Energy Technology Data Exchange (ETDEWEB)

    Maurer, Marc H.; Hamm, B.; Dewey, M. [Inst. fuer Radiologie, Charite - Universitaetsmedizin Berlin (Germany)

    2009-12-15

    Purpose: to obtain an overview of the current clinical practice of cardiac computed tomography (CT) in Germany. Materials and methods: a 30-item questionnaire was mailed to 149 providers of cardiac CT in Germany. The items asked about indications, scanning technique and reporting, data storage, and cost of the examination. Results: overall 45 questionnaires could be analyzed (30%). The majority of centers (76%, 34 of 45 centers) used CT scanners of the latest generation (at least 64 rows). The most common appropriate indications were exclusion of coronary artery disease (91%, 41/45), coronary anomalies (80%, 36/45), and follow-up after coronary artery bypass grafting (53%, 24/45). Each center examined on average 243 ± 310 patients in 2007 and the number of centers performing cardiac CT increased significantly in 2007 (p = 0.035) compared with the preceding year. Most used sublingual nitroglycerin (84%, 38/45; median of 2 sprays = 0.8 mg) and/or a beta blocker (86%, 39/44; median of 5 mg IV, median heart rate threshold: 70 beats/min). Many providers used ECG-triggered tube current modulation (65%, 29/44) and/or adjusted the tube current to the body mass index or body weight (63%, 28/44). A median slice thickness of 0.75 mm with a 0.5 mm increment and a 20 cm field-of-view was most commonly used. Source images in orthogonal planes (96%, 43/45), curved MPRs (93%, 42/45), and thin-slice MIPs (69%, 31/45) were used most frequently for interpretation. Extracardiac structures were also evaluated by 84% of the centers (38/45). The mean examination time was 16.2 min and reporting took an average of 28.8 min. (orig.)

  6. PET-guided delineation of radiation therapy treatment volumes: a survey of image segmentation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zaidi, Habib [Geneva University Hospital, Division of Nuclear Medicine, Geneva 4 (Switzerland); Geneva University, Geneva Neuroscience Center, Geneva (Switzerland); El Naqa, Issam [Washington University School of Medicine, Department of Radiation Oncology, St. Louis, MO (United States)

    2010-11-15

    Historically, anatomical CT and MR images were used to delineate the gross tumour volumes (GTVs) for radiotherapy treatment planning. The capabilities offered by modern radiation therapy units and the widespread availability of combined PET/CT scanners stimulated the development of biological PET imaging-guided radiation therapy treatment planning with the aim to produce highly conformal radiation dose distribution to the tumour. One of the most difficult issues facing PET-based treatment planning is the accurate delineation of target regions from typical blurred and noisy functional images. The major problems encountered are image segmentation and imperfect system response function. Image segmentation is defined as the process of classifying the voxels of an image into a set of distinct classes. The difficulty in PET image segmentation is compounded by the low spatial resolution and high noise characteristics of PET images. Despite the difficulties and known limitations, several image segmentation approaches have been proposed and used in the clinical setting including thresholding, edge detection, region growing, clustering, stochastic models, deformable models, classifiers and several other approaches. A detailed description of the various approaches proposed in the literature is reviewed. Moreover, we also briefly discuss some important considerations and limitations of the widely used techniques to guide practitioners in the field of radiation oncology. The strategies followed for validation and comparative assessment of various PET segmentation approaches are described. Future opportunities and the current challenges facing the adoption of PET-guided delineation of target volumes and its role in basic and clinical research are also addressed. (orig.)

  7. Evaluation of Analytical and Numerical Techniques for Defining the Radius of Influence for an Open-Loop Ground Source Heat Pump System

    Energy Technology Data Exchange (ETDEWEB)

    Freedman, Vicky L.; Mackley, Rob D.; Waichler, Scott R.; Horner, Jacob A.

    2013-09-26

    In an open-loop groundwater heat pump (GHP) system, groundwater is extracted, run through a heat exchanger, and injected back into the ground, resulting in no mass balance changes to the flow system. Although the groundwater use is non-consumptive, the withdrawal and injection of groundwater may cause negative hydraulic and thermal impacts on the flow system. Because GHP is a relatively new technology and regulatory guidelines for determining environmental impacts of GHPs may not exist, consumptive use metrics may need to be used for permit applications. For consumptive use permits, a radius of influence is often used, which is defined as the radius beyond which hydraulic impacts to the system are considered negligible. In this paper, the hydraulic radius of influence concept was examined using analytical and numerical methods for a non-consumptive GHP system in southeastern Washington State. At this location, the primary hydraulic concerns were impacts to nearby contaminant plumes and a water supply well field. The results of this study showed that the analytical techniques with idealized radial flow were generally unsuited because they overpredicted the influence of the well system. The numerical techniques yielded more reasonable results because they could account for aquifer heterogeneities and flow boundaries. In particular, the use of a capture zone analysis was identified as the best method for determining potential changes in current contaminant plume trajectories. The capture zone analysis is a more quantitative and reliable tool for determining the radius of influence, with greater accuracy and better insight, for a non-consumptive GHP assessment.
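
    One widely used capture-zone relation for a single extraction well in uniform regional flow gives the stagnation-point distance and maximum capture width from the pumping rate, aquifer thickness and Darcy flux; comparing such a width against distances to plumes or supply wells is one simple way to frame the radius-of-influence question discussed above. A sketch with hypothetical parameters, not the values from the Washington State assessment:

```python
import math

def capture_zone_geometry(q_m3_per_day, aquifer_thickness_m,
                          hydraulic_conductivity_m_per_day, hydraulic_gradient):
    """Classic 2-D capture-zone geometry for one well in uniform background flow.

    Returns (downgradient stagnation-point distance, far-field capture width),
    using the Darcy flux q = K * i and the standard relations:
        x_stagnation = Q / (2 * pi * b * q),   W_max = Q / (b * q)
    """
    darcy_flux = hydraulic_conductivity_m_per_day * hydraulic_gradient
    denom = aquifer_thickness_m * darcy_flux
    x_stag = q_m3_per_day / (2 * math.pi * denom)
    w_max = q_m3_per_day / denom
    return x_stag, w_max

# Hypothetical: 500 m3/day pumped from a 20 m thick aquifer, K = 50 m/day, gradient = 0.001
print(capture_zone_geometry(500.0, 20.0, 50.0, 0.001))
```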

  8. Retrieval of nitrogen dioxide stratospheric profiles from ground-based zenith-sky UV-visible observations: validation of the technique through correlative comparisons

    Directory of Open Access Journals (Sweden)

    F. Hendrick

    2004-05-01

    Full Text Available A retrieval algorithm based on the Optimal Estimation Method (OEM) has been developed in order to provide vertical distributions of NO2 in the stratosphere from ground-based (GB) zenith-sky UV-visible observations. It has been applied to observational data sets from the NDSC (Network for Detection of Stratospheric Change) stations of Harestua (60° N, 10° E) and Andøya (69.3° N, 16.1° E) in Norway. The information content and retrieval errors have been analyzed following a formalism used for characterizing ozone profiles retrieved from solar infrared absorption spectra. In order to validate the technique, the retrieved NO2 vertical profiles and columns have been compared to correlative balloon and satellite observations. Such extensive validation of the profile and column retrievals was not reported in previously published work on the profiling from GB UV-visible measurements. A good agreement – generally better than 25% – has been found with the SAOZ (Système d'Analyse par Observations Zénithales) and DOAS (Differential Optical Absorption Spectroscopy) balloon data. A similar agreement has been reached with correlative satellite data from the HALogen Occultation Experiment (HALOE) and Polar Ozone and Aerosol Measurement (POAM III) instruments above 25 km of altitude. Below 25 km, a systematic overestimation of our retrieved profiles – by up to 50% in some cases – has been observed by both HALOE and POAM III, pointing out the limitation of the satellite solar occultation technique at these altitudes. We have concluded that our study strengthens our confidence in the reliability of the retrieval of vertical distribution information from GB UV-visible observations and offers new perspectives in the use of GB UV-visible network data for validation purposes.
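
    The Optimal Estimation Method referred to above combines the measurements with a priori information weighted by their respective covariances; for a linear forward model the retrieved profile has a closed form. A generic sketch of that single linear step (Rodgers formalism) with a tiny synthetic example follows; the dimensions and numbers are illustrative and do not represent the actual NO2 forward model:

```python
import numpy as np

def oem_retrieval(y, K, x_a, S_a, S_e):
    """Linear optimal-estimation retrieval.

    x_hat = x_a + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - K x_a)
    y   : measurement vector,  K : weighting-function (Jacobian) matrix,
    x_a : a priori profile,    S_a, S_e : a priori and measurement covariances.
    """
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    gain = np.linalg.solve(K.T @ Se_inv @ K + Sa_inv, K.T @ Se_inv)
    return x_a + gain @ (y - K @ x_a)

# Tiny synthetic example: 3 measurements constraining a 4-level profile
rng = np.random.default_rng(0)
K = rng.normal(size=(3, 4))
x_true = np.array([1.0, 2.0, 3.0, 2.5])
y = K @ x_true + rng.normal(scale=0.05, size=3)
x_a = np.full(4, 2.0)
print(oem_retrieval(y, K, x_a, S_a=np.eye(4), S_e=0.0025 * np.eye(3)))
```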

  9. Three-dimensional seismic survey planning based on the newest data acquisition design technique; Saishin no data shutoku design ni motozuku sanjigen jishin tansa keikaku

    Energy Technology Data Exchange (ETDEWEB)

    Minehara, M.; Nakagami, K.; Tanaka, H. [Japan National Oil Corp., Tokyo (Japan). Technology Research Center

    1996-10-01

    The theory of parameter setting for data acquisition is summarized, mainly as to the seismic source and receiver geometry. This paper also introduces an example of survey planning for a three-dimensional land seismic exploration in progress. For the design of data acquisition, fundamental parameters are first determined on the basis of the characteristics of reflection records in a given district, and then the survey layout is determined. In this study, information from modeling based on the existing interpretation of geologic structures is also utilized and reflected in the survey specifications. A land three-dimensional seismic survey was designed. The ground surface of the surveyed area consists of rice fields and hilly regions. The target was a nose-shaped structure at a depth of about 2,500 m underground. A survey area of 4 km × 5 km was set. Records in the shallow layers could not be obtained when near offsets were not ensured. Quality control of this distribution was important for grasping the required shallow structure. In this survey, the seismic source points could be secured more reliably than initially expected, which resulted in sufficient near-offset coverage. 2 refs., 2 figs.

  10. Ground-based diffusion experiments on liquid Sn-In systems using the shear cell technique of the satellite mission Foton-M1.

    Science.gov (United States)

    Suzuki, Shinsuke; Kraatz, Kurt-Helmut; Frohberg, Günter

    2004-11-01

    The study reported in this paper was aimed at testing the shear cell that was developed for the satellite mission Foton-M1 to measure diffusion coefficients in liquid metals under microgravity (microg) conditions. Thick-layer diffusion experiments were performed in the system Sn90In10 versus Sn under 1 g conditions. For this system several microg diffusion results are available as reference data. This combination provides a low, but sufficiently stable, density layering throughout the entire experiment, which is important to avoid buoyancy-driven convection. The experimental results were corrected for the influences of the shear-induced convection and mixing after the final shearing, both of which are typical for the shear cell technique. As a result, the reproducibility and the reliability of the diffusion coefficients in the ground-based experiments were within the limits of error of the microg data. Based on our results we discuss the necessary conditions to avoid buoyancy-driven convection.

  11. Illumination Sufficiency Survey Techniques: In-situ Measurements of Lighting System Performance and a User Preference Survey for Illuminance in an Off-Grid, African Setting

    Energy Technology Data Exchange (ETDEWEB)

    Alstone, Peter; Jacobson, Arne; Mills, Evan

    2010-08-26

    Efforts to promote rechargeable electric lighting as a replacement for fuel-based light sources in developing countries are typically predicated on the notion that lighting service levels can be maintained or improved while reducing the costs and environmental impacts of existing practices. However, the extremely low incomes of those who depend on fuel-based lighting create a need to balance the hypothetically possible or desirable levels of light with those that are sufficient and affordable. In a pilot study of four night vendors in Kenya, we document a field technique we developed to simultaneously measure the effectiveness of lighting service provided by a lighting system and conduct a survey of lighting service demand by end-users. We took gridded illuminance measurements across each vendor's working and selling area, with users indicating the sufficiency of light at each point. User light sources included a mix of kerosene-fueled hurricane lanterns, pressure lamps, and LED lanterns. We observed illuminance levels ranging from just above zero to 150 lux. The LED systems markedly improved the lighting service levels over those provided by kerosene-fueled hurricane lanterns. Users reported that the minimum acceptable threshold was about 2 lux. The results also indicated that the LED lamps in use by the subjects did not always provide sufficient illumination over the desired retail areas. Our sample size is much too small, however, to reach any conclusions about requirements in the broader population. Given the small number of subjects and very specific type of user, our results should be regarded as indicative rather than conclusive. We recommend replicating the method at larger scales and across a variety of user types and contexts. Policymakers should revisit the subject of recommended illuminance levels regularly as LED technology advances and the price/service balance point evolves.

  12. Dispersed and piled woody residues volumes in coastal Douglas-fir cutblocks determined using high-resolution imagery from a UAV and from ground-based surveys.

    Science.gov (United States)

    Trofymow, J. A.; Gougeon, F.

    2015-12-01

    After forest harvest, significant amounts of woody residues are left dispersed on site and some are subsequently piled and burned. Quantification of residues is required for estimating C budgets, billable waste, harvest efficiency, bioenergy potential and smoke emissions. Trofymow et al. (2014, CJFR) compared remote sensing methods to ground-based waste and residue survey (WRS) methods for residue piles in 4 cutblocks in the Oyster River (OR) area in coastal BC. Compared to geospatial methods using 15 cm orthophotos and LiDAR acquired in 2011 by helicopter, the WRS method underestimated pile wood by 30% to 50%, while a USFS volume method overestimated pile wood by 50% if site-specific packing ratios were not used. A geospatial method was developed in PCI Geomatica to analyze 2-bit images of logs >15 cm in diameter to determine dispersed wood residues in OR and compare to WRS methods. Across blocks, geospatial and WRS method wood volumes were correlated (R² = 0.69); however, volumes were 2.5 times larger for the geospatial vs WRS method. Methods for dispersed residues could not be properly compared as individual WRS plots were not georeferenced, only 12 plots were sampled in total, and low-resolution images poorly resolved logs. Thus, a new study in 2 cutblocks in the Northwest Bay (NWB) area acquired 2 cm resolution RGB air photography in 2014-15 using an Aeryon Sky Ranger UAV prior to and after burn pile construction. A total of 57 dispersed WRS plots and 24 WRS pile or accumulation plots were georeferenced and measured. Stereo-pairs were used to generate point clouds for pile bulk volumes. Images processed to 8-bit grey scale are being analyzed with a revised PCI method that better accounts for log overlaps. WRS methods depend on a good sample of plots and accurate determination of stratum (dispersed, roadside, piles, accumulations) areas. Analysis of NWB blocks shows WRS field methods for stratum area differ by 5-20% from those determined using orthophotos. Plot-level wood

  13. Development of a vehicle capable of traveling on soft ground. Its application to investigation, survey and management of soft ground; Nanjakuchi sokosha no kaihatsu. Nanjakuchi deno chosa sokuryo kanri eno tekiyo

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, R.; Yano, H. [Ministry of Construction, Tokyo (Japan)

    1998-07-25

    An experimental vehicle utilizing hovercraft technology, capable of travelling on a soft ground surface and therefore usable in reclamation work, is built and tested. When the ground is soft and viscous, merely increasing the vehicle driving force will futilely add to the vehicle weight, and this causes an adverse effect, with the vehicle sinking deeper into the ground and the ground presenting higher resistance. In an effort to decrease the weight and resistance, a vehicle is built that is capable of levitating itself by use of hovercraft technology and is provided with retractable tracks and wheels for travelling. The targets are mostly attained in the test run as far as speeds (5.5 km/h at the maximum across a muddy ground section) and trekking across ground including an undulated surface are concerned, although the levitation level is found to be somewhat lower than the design value. Operating across a hard ground surface with the body elevated, the vehicle exhibits a higher performance in speed and drivability when the hovercraft effect is utilized. When travelling on the hovercraft effect, the frictional resistance of the skirt decreases as the vehicle moves from a hard surface section into a soft surface section, and this allows the vehicle to run more smoothly at higher speeds. 1 refs., 6 figs.

  14. Study of capillary absorption kinetics by X-ray CT imaging techniques: a survey on sedimentary rocks of Sicily

    Directory of Open Access Journals (Sweden)

    Tiziano Schillaci

    2008-04-01

    Full Text Available Sedimentary rocks are natural porous materials with a large percentage of microscopic interconnected pores: they contain fluids and permit their movement on a macroscopic scale. Generally, these rocks present higher porosity than metamorphic rocks. From certain points of view, this feature represents an advantage; on the other hand, it can constitute an obstacle for cultural heritage applications, because the degree of porosity can lead to deterioration of the stone monument through water capillary absorption. In this paper, CT (Computerized Tomography) image techniques are applied to capillary absorption kinetics in sedimentary rocks used for the Greek temples as well as baroque monuments, respectively located in western and southeastern Sicily. Rocks were sampled near the archaeological areas of Agrigento, Segesta, Selinunte and Val di Noto. CT images were acquired at different times, before and after the water contact, using image elaboration techniques during the acquisition as well as the post-processing phases. Water distribution into the pore spaces has been evaluated on the basis of the Hounsfield number, estimated for the 3-D voxel structure of the samples. For most of the considered samples, assumptions based on the Handy model permit correlating the average height of the wetting front with the square root of time. Stochastic equations were introduced in order to describe the percolative water behavior in heterogeneous samples, such as the Agrigento one. Before the CT acquisition, an estimate of the capillary absorption kinetics was carried out by the gravimetric method. A petrographical characterization of the samples was performed by stereomicroscope observations, while porosity and pore morphology were surveyed by SEM (Scanning Electron Microscope) images. Furthermore, the proposed methods have also permitted defining the penetration depth as well as the distribution uniformity of materials used for restoration and conservation of historical
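
    The Handy-type behavior mentioned above, in which the average wetting-front height grows with the square root of time, can be checked by a simple least-squares fit of h = A·√t. The sketch below uses synthetic data points, not the CT measurements of the paper:

```python
import numpy as np

def sorptivity_fit(times_s, heights_mm):
    """Fit h = A * sqrt(t) by least squares and return the coefficient A (mm/s^0.5)."""
    sqrt_t = np.sqrt(np.asarray(times_s, dtype=float))
    h = np.asarray(heights_mm, dtype=float)
    return float(np.dot(sqrt_t, h) / np.dot(sqrt_t, sqrt_t))

# Synthetic wetting-front heights roughly following sqrt(t):
t = [60, 240, 540, 960, 1500]      # seconds
h = [3.9, 7.8, 11.6, 15.7, 19.4]   # millimetres
print(f"A = {sorptivity_fit(t, h):.2f} mm/s^0.5")  # about 0.50 for these numbers
```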

  15. Retrieval of nitrogen dioxide stratospheric profiles from ground-based zenith-sky UV-visible observations: validation of the technique through correlative comparisons

    Directory of Open Access Journals (Sweden)

    F. Hendrick

    2004-01-01

    Full Text Available A retrieval algorithm based on the Optimal Estimation Method (OEM) has been developed in order to provide vertical distributions of NO2 in the stratosphere from ground-based (GB) zenith-sky UV-visible observations. It has been applied to observational data sets from the NDSC (Network for Detection of Stratospheric Change) stations of Harestua (60° N, 10° E) and Andøya (69° N, 16° E) in Norway. The information content and retrieval errors have been analyzed following a formalism used for characterizing ozone profiles retrieved from solar infrared absorption spectra. In order to validate the technique, the retrieved NO2 vertical profiles and columns have been compared to correlative balloon and satellite observations. Such extensive validation of the profile and column retrievals was not reported in previously published work on profiling from GB UV-visible measurements. A good agreement - generally better than 25% - has been found with the SAOZ (Système d'Analyse par Observations Zénithales) and DOAS (Differential Optical Absorption Spectroscopy) balloons. A similar agreement has been reached with correlative satellite data from the HALogen Occultation Experiment (HALOE) and Polar Ozone and Aerosol Measurement (POAM III) instruments above 25 km altitude. Below 25 km, a systematic underestimation - by up to 40% in some cases - of both HALOE and POAM III profiles by our GB profile retrievals has been observed, more likely pointing to a limitation of both satellite instruments at these altitudes. We have concluded that our study strengthens our confidence in the reliability of the retrieval of vertical distribution information from GB UV-visible observations and offers new perspectives in the use of GB UV-visible network data for validation purposes.
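
    The core of such an OEM retrieval is the standard linear optimal-estimation update, which combines the measurement vector with an a priori profile weighted by their covariances. The sketch below shows that update with small synthetic matrices; the real retrieval derives the weighting-function (Jacobian) matrix from a radiative-transfer model of the slant-column NO2 measurements, which is not reproduced here.

```python
# Minimal linear Optimal Estimation Method (OEM) sketch: retrieve a state
# vector x (e.g. an NO2 profile) from measurements y = K x + noise, given an
# a priori xa with covariance Sa and measurement covariance Se.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_meas = 10, 6                      # illustrative dimensions
K = rng.normal(size=(n_meas, n_state))       # weighting-function (Jacobian) matrix (synthetic)
Sa = np.eye(n_state) * 0.5                   # a priori covariance (assumed)
Se = np.eye(n_meas) * 0.1                    # measurement covariance (assumed)
xa = np.ones(n_state)                        # a priori profile
x_true = xa + rng.normal(scale=0.3, size=n_state)
y = K @ x_true + rng.normal(scale=0.1, size=n_meas)

Se_inv, Sa_inv = np.linalg.inv(Se), np.linalg.inv(Sa)
S_hat = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv)    # retrieval error covariance
x_hat = xa + S_hat @ K.T @ Se_inv @ (y - K @ xa)    # retrieved state
A = S_hat @ K.T @ Se_inv @ K                        # averaging kernel matrix
print("degrees of freedom for signal:", np.trace(A))
```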

  16. Estimating Digital Terrain Model in forest areas from TanDEM-X and Stereo-photogrammetric technique by means of Random Volume over Ground model

    Science.gov (United States)

    Lee, S. K.; Fatoyinbo, T. E.; Lagomasino, D.; Osmanoglu, B.; Feliciano, E. A.

    2015-12-01

    The Digital Terrain Model (DTM) in forest areas is invaluable information for various environmental, hydrological and ecological studies, for example, watershed delineation, vegetation canopy height, water dynamic modeling, and forest biomass and carbon estimations. There are few solutions for extracting bare-earth Digital Elevation Model (DEM) information. Airborne lidar systems are widely and successfully used for estimating bare-earth DEMs with centimeter-order accuracy and high spatial resolution. However, the high cost of operation and small image coverage prevent the use of airborne lidar sensors at large or global scales. Although ICESat/GLAS (Ice, Cloud, and Land Elevation Satellite/Geoscience Laser Altimeter System) lidar data sets have been available for global DTM estimation at relatively lower cost, the large footprint size of 70 m and the interval of 172 m are insufficient for various applications. In this study we propose to extract a higher resolution bare-earth DEM over vegetated areas from the combination of interferometric complex coherence from single-pass TanDEM-X (TDX) data at HH polarization and a Digital Surface Model (DSM) derived from high-resolution WorldView (WV) images by means of the random volume over ground (RVoG) model. The RVoG model is a widely and successfully used model for polarimetric SAR interferometry (Pol-InSAR) forest canopy height inversion. The bare-earth DEM is obtained by complex volume decorrelation in the RVoG model with the DSM estimated by the stereo-photogrammetric technique. Forest canopy height can be estimated by subtracting the estimated bare-earth model from the DSM. Finally, the DTM from an airborne lidar system was used to validate the bare-earth DEM and forest canopy height estimates.
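
    The final two steps described above (canopy height by differencing the DSM and the retrieved bare-earth DEM, then validation against an airborne-lidar DTM) reduce to simple raster arithmetic once the RVoG inversion has been done. A minimal sketch with small stand-in arrays, not the study's data or its RVoG inversion:

```python
# Minimal post-processing sketch: canopy height as the difference between a
# stereo-photogrammetric DSM and the retrieved bare-earth DEM, validated
# against an airborne-lidar DTM. Small arrays stand in for the rasters.
import numpy as np

dsm = np.array([[32.0, 35.5], [30.2, 28.9]])          # surface model [m] (illustrative)
bare_earth = np.array([[12.1, 13.0], [11.8, 10.5]])   # TDX/RVoG bare-earth DEM [m] (illustrative)
lidar_dtm = np.array([[11.6, 13.4], [12.1, 10.2]])    # validation DTM [m] (illustrative)

canopy_height = dsm - bare_earth                       # forest canopy height estimate
dem_error = bare_earth - lidar_dtm
rmse = np.sqrt(np.mean(dem_error ** 2))
bias = np.mean(dem_error)
print(f"DEM RMSE = {rmse:.2f} m, bias = {bias:+.2f} m")
print("canopy height [m]:\n", canopy_height)
```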

  17. A survey of clearing techniques for 3D imaging of tissues with special reference to connective tissue.

    Science.gov (United States)

    Azaripour, Adriano; Lagerweij, Tonny; Scharfbillig, Christina; Jadczak, Anna Elisabeth; Willershausen, Brita; Van Noorden, Cornelis J F

    2016-08-01

    For 3-dimensional (3D) imaging of a tissue, 3 methodological steps are essential and their successful application depends on specific characteristics of the type of tissue. The steps are (1) clearing of the opaque tissue to render it transparent for microscopy, (2) fluorescence labeling of the tissues and (3) 3D imaging. In the past decades, new methodologies were introduced for the clearing steps with their specific advantages and disadvantages. Most clearing techniques have been applied to the central nervous system and other organs that contain relatively low amounts of connective tissue including extracellular matrix. However, tissues that contain large amounts of extracellular matrix such as dermis in skin or gingiva are difficult to clear. The present survey lists methodologies that are available for clearing of tissues for 3D imaging. We report here that the BABB method using a mixture of benzyl alcohol and benzyl benzoate and iDISCO using dibenzylether (DBE) are the most successful methods for clearing connective tissue-rich gingiva and dermis of skin for 3D histochemistry and imaging of fluorescence using light-sheet microscopy.

  18. A multi-wavelength survey of AGN in the XMM-LSS field: I. Quasar selection via the KX technique

    CERN Document Server

    Nakos, Th; Andreon, S; Surdej, J; Riaud, P; Hatziminaoglou, E; Garcet, O; Alloin, D; Baes, M; Galaz, G; Pierre, M; Quintana, H; Page, M J; Tedds, J A; Ceballos, M T; Corral, A; Ebrero, J; Krumpe, M; Mateos, S

    2008-01-01

    AIMS: We present a sample of candidate quasars selected using the KX-technique. The data cover 0.68 deg^2 of the X-ray Multi-Mirror (XMM) Large-Scale Structure (LSS) survey area, where overlapping multi-wavelength imaging data permit an investigation of the physical nature of selected sources. METHODS: The KX method identifies quasars on the basis of their optical (R and z') to near-infrared (Ks) photometry and point-like morphology. We combine these data with optical (u*,g',r',i',z') and mid-infrared (3.6-24 micron) wavebands to reconstruct the spectral energy distributions (SEDs) of candidate quasars. RESULTS: Of 93 sources selected as candidate quasars by the KX method, 25 are classified as quasars by the subsequent SED analysis. Spectroscopic observations are available for 12/25 of these sources and confirm the quasar hypothesis in each case. Moreover, 90% of the SED-classified quasars show X-ray emission, a property not shared by any of the false candidates in the KX-selected sample. Applying a photometr...

  19. New Techniques for Relating Dynamically Close Galaxy Pairs to Merger and Accretion Rates Application to the SSRS2 Redshift Survey

    CERN Document Server

    Patton, D R; Marzke, R O; Pritchet, C J; Da Costa, L N; Pellegrini, P S

    2000-01-01

    We introduce two new pair statistics, which relate close galaxy pairs to the merger and accretion rates. We demonstrate the importance of correcting these (and other) pair statistics for selection effects related to sample depth and completeness. In particular, we highlight the severe bias that can result from the use of a flux-limited survey. The first statistic, denoted N_c, gives the number of companions per galaxy, within a specified range in absolute magnitude. N_c is directly related to the galaxy merger rate. The second statistic, called L_c, gives the total luminosity in companions, per galaxy. This quantity can be used to investigate the mass accretion rate. Both N_c and L_c are related to the galaxy correlation function and luminosity function in a straightforward manner. We outline techniques which account for various selection effects, and demonstrate the success of this approach using Monte Carlo simulations. If one assumes that clustering is independent of luminosity (which is appropriate for re...
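
    A minimal sketch of the two pair statistics described above: N_c, the mean number of close companions per galaxy, and L_c, the mean luminosity in companions per galaxy. The separation, velocity and magnitude criteria below are placeholders, and the paper's weighting for flux limits, boundaries and completeness is omitted.

```python
# Count close companions (N_c) and sum companion luminosity (L_c) per galaxy.
# r_proj and dv are n x n pairwise matrices of projected separation [h^-1 kpc]
# and velocity difference [km/s]; mags are absolute magnitudes.
import numpy as np

def pair_statistics(r_proj, dv, mags, r_max=20.0, dv_max=500.0, dm_max=2.0):
    n = len(mags)
    n_c, l_c = 0.0, 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            close = (r_proj[i, j] < r_max) and (abs(dv[i, j]) < dv_max)
            in_mag_range = abs(mags[j] - mags[i]) < dm_max
            if close and in_mag_range:
                n_c += 1.0
                l_c += 10.0 ** (-0.4 * (mags[j] + 20.0))  # companion luminosity in units of an M = -20 galaxy
    return n_c / n, l_c / n

# Toy example: three galaxies, of which the first two form a close pair.
r = np.array([[0., 15., 300.], [15., 0., 310.], [300., 310., 0.]])
v = np.array([[0., 200., 900.], [200., 0., 800.], [900., 800., 0.]])
m = np.array([-20.5, -19.8, -21.0])
print(pair_statistics(r, v, m))   # (N_c, L_c) per galaxy
```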

  20. [Induced abortion among prostitutes: a survey using the ballot-box technique in Teresina-Piauí].

    Science.gov (United States)

    Madeiro, Alberto Pereira; Rufino, Andréa Cronemberger

    2012-07-01

    This study assesses the prevalence of induced abortion among prostitutes and lists the most common abortion practices. A survey was conducted with 310 prostitutes between 18 and 39 years of age, by sampling age quotas in the 5 territorial areas of Teresina in the state of Piauí. Data collection was conducted through the use of 2 questionnaires: the first, by the ballot-box technique, with questions about abortion; the second, completed by the researcher, with socio-demographic information. The practice of abortion was reported by 163 (52.6%) women. Most prostitutes had performed 1 abortion (50.3%), but 16.5% of them reported carrying out 3 or more. Misoprostol was used alone in 68.1% of the reports and associated with tea and/or probes in 9.2%, followed by tea in 13.4%, probes in 3.7%, and uterine curettage in unregulated clinics in 3.7%. There was post-abortion hospitalization in 47.8% of the cases. After adjustment of the multiple logistic regression model, the variable that remained significantly associated with abortion was having had 3 or more pregnancies. These results reveal that induced abortion is highly prevalent among prostitutes in Teresina. Misoprostol is the most common abortion method, and hospitalization was necessary in almost half of the cases.

  1. Monitoring of Non-Linear Ground Movement in an Open Pit Iron Mine Based on an Integration of Advanced DInSAR Techniques Using TerraSAR-X Data

    Directory of Open Access Journals (Sweden)

    José Claudio Mura

    2016-05-01

    Full Text Available This work presents an investigation to determine ground deformation based on an integration of DInSAR Time-Series (DTS) and Persistent Scatterer Interferometry (PSI) techniques, aiming at detecting high rates of linear and non-linear ground movement. The combined techniques were applied in an open pit iron mine located in the Carajás Mineral Province (Brazilian Amazon region), using a set of 33 TerraSAR-X-1 images acquired from March 2012 to April 2013 when, due to a different deformation behavior during the dry and wet seasons in the Amazon region, a non-linear deformation was detected. The DTS analysis was performed on a stack of multi-look unwrapped interferograms using an extension of the SVD (Singular Value Decomposition), where a set of additional weighted constraints on the acceleration of the displacement was incorporated to control the smoothness of the time-series solutions, whose objective was to correct the atmospheric phase artifacts. The height errors and the deformation history provided by the DTS technique were used as prior information to perform the PSI analysis. This procedure improved the capability of the PSI technique to detect non-linear movement as well as increased the point density of the final results. The results of the combined techniques are presented and compared with total station/prism and ground-based radar (GBR) measurements.
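
    The DTS step amounts to inverting a stack of unwrapped interferogram phases for a displacement time series at each coherent pixel, with the rank-deficient system solved through the SVD (pseudo-inverse). The sketch below shows that inversion for a single pixel with illustrative dates, pairs and phases; the paper's additional acceleration-smoothness constraints and atmospheric correction are omitted.

```python
# Minimal SBAS-style sketch: invert unwrapped interferogram phases at one pixel
# for a cumulative displacement time series using the SVD pseudo-inverse.
import numpy as np

dates = np.array([0, 11, 22, 33, 44])                      # acquisition times [days] (illustrative)
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]   # (master, slave) date indices
phase = np.array([0.8, 0.9, 1.7, 1.1, 0.7, 1.8])           # unwrapped phases [rad] (illustrative)

# Design matrix: each interferogram is the sum of the phase increments between
# its two acquisition dates.
n_incr = len(dates) - 1
A = np.zeros((len(pairs), n_incr))
for row, (m, s) in enumerate(pairs):
    A[row, m:s] = 1.0

# SVD-based least-squares solution (pinv handles rank deficiency).
incr = np.linalg.pinv(A) @ phase
ts = np.concatenate(([0.0], np.cumsum(incr)))              # cumulative phase vs. first date
# Convert phase to line-of-sight metres for X-band (wavelength ~3.1 cm);
# the sign convention depends on the processor.
los_disp = ts * (-0.031 / (4 * np.pi))
print(los_disp)
```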

  2. UAV, LiDAR & ground-based surveying from Stackpole Quay: best practice for accuracy of virtual outcrops and structural models

    Science.gov (United States)

    Cawood, A.; Bond, C. E.; Howell, J.; Totake, Y.

    2016-12-01

    Virtual outcrops derived from techniques such as LiDAR and SfM (digital photogrammetry) provide a viable and potentially powerful addition or alternative to traditional field studies, given the large amounts of raw data that can be acquired rapidly and safely. The use of these digital representations of outcrops as a source of geological data has increased greatly in the past decade, and as such, the accuracy and precision of these new acquisition methods applied to geological problems have been addressed by a number of authors. Little work has been done, however, on integrating virtual outcrops into fundamental structural geology workflows and on systematically studying the fidelity of the data derived from them. Here, we use the classic Stackpole Quay syncline outcrop in South Wales to quantitatively evaluate the accuracy of three virtual outcrop models (LiDAR, aerial and terrestrial digital photogrammetry) compared to data collected directly in the field. Using these structural data, we have built 2D and 3D geological models which make predictions of fold geometries. We examine the fidelity of virtual outcrops generated using different acquisition techniques to the outcrop geology and how these affect model building and final outcomes. Finally, we utilize newly acquired data to deterministically test model validity. Based upon these results, we find that acquisition of digital imagery by UAV (Unmanned Aerial Vehicle) yields highly accurate virtual outcrops when compared to terrestrial methods, allowing the construction of robust data-driven predictive models. Careful planning, survey design and choice of a suitable acquisition method are, however, of key importance for best results.

  3. A Survey of Non-conventional Techniques for Low-voltage Low-power Analog Circuit Design

    National Research Council Canada - National Science Library

    F. Khateb; S. Bay Abo Dabbous; S. Vlassis

    2013-01-01

    ...). Therefore, this paper presents the operation principle, the advantages and disadvantages of each of these techniques, enabling circuit designers to choose the proper design technique based on application...

  4. Assessing household wealth in health studies in developing countries: a comparison of participatory wealth ranking and survey techniques from rural South Africa

    Directory of Open Access Journals (Sweden)

    Hargreaves James R

    2007-06-01

    Full Text Available Abstract Background Accurate tools for assessing household wealth are essential for many health studies in developing countries. Household survey and participatory wealth ranking (PWR) are two approaches to generate data for this purpose. Methods A household survey and PWR were conducted among eight villages in rural South Africa. We developed three indicators of household wealth using the data. One indicator used PWR data only, one used principal components analysis to combine data from the survey, while the final indicator used survey data combined in a manner informed by the PWR. We assessed the internal consistency of the indices and their level of agreement in ranking household wealth. Results Food security, asset ownership, housing quality and employment were important indicators of household wealth. PWR, consisting of three independent rankings of 9671 households, showed a high level of internal consistency (intraclass correlation coefficient 0.81, 95% CI 0.79–0.82). Data on 1429 households were available from all three techniques. There was moderate agreement in ranking households into wealth tertiles between the two indicators based on survey data (Spearman's rho = 0.69, kappa = 0.43), but only limited agreement between these techniques and the PWR data (Spearman's rho = 0.38 and 0.31, kappa = 0.20 and 0.17). Conclusion Both PWR and household survey can provide a rapid assessment of household wealth. Each technique had strengths and weaknesses. Reasons for differences might include data inaccuracies or limitations in the methods by which information was weighted. Alternatively, the techniques may measure different things. More research is needed to increase the validity of measures of socioeconomic position used in health studies in developing countries.
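
    The survey-based asset index described above is typically the first principal component of standardized household indicators, which can then be compared against a ranking such as the PWR with a rank correlation. A minimal sketch with illustrative values, not the study's data:

```python
# First-principal-component wealth index from household indicators, compared
# to a participatory wealth ranking with Spearman's rho.
import numpy as np
from scipy import stats

# rows = households; columns = e.g. food security score, assets owned,
# housing quality score, employed members (illustrative values)
X = np.array([[1, 2, 1, 0],
              [3, 5, 2, 1],
              [2, 4, 2, 1],
              [0, 1, 0, 0],
              [3, 6, 3, 2]], dtype=float)

Z = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize indicators
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
pc1 = eigvecs[:, -1]                              # loadings of the first component
if pc1.sum() < 0:                                 # orient so higher score = wealthier
    pc1 = -pc1
wealth_index = Z @ pc1                            # household wealth scores

pwr_rank = np.array([2, 5, 4, 1, 3])              # illustrative PWR ranking
rho, p = stats.spearmanr(wealth_index, pwr_rank)
print(f"Spearman rho = {rho:.2f}")
```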

  5. 2007-039-FA_hypack - Raw HYPACK navigation logs (text) collected by the U.S. Geological Survey from Middle Ground, MA, 2007

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement between the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS),...

  6. 2009-068-FA_hypack - Raw HYPACK navigation logs (text) collected by the U.S. Geological Survey from Middle Ground, MA, September 22, 2009

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data were collected under a cooperative agreement between the Massachusetts Office of Coastal Zone Management (CZM) and the U.S. Geological Survey (USGS),...

  7. Report to Pacific Flyway Study Committee on 1986-1991 breeding ground surveys of dusky Canada geese on the Copper River Delta

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Development of an expanded aerial survey on the Copper River Delta was begun in 1985. Survey design was standardized in 1988 and the same transects have been flown...

  8. Report to Pacific Flyway Study Committee on 1986-1994 breeding ground surveys of Dusky Canada geese on the Copper River Delta

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Development of an expanded aerial survey on the Copper River Delta was begun in 1986. Survey design was standardized in 1988 and the same transects have been flown...

  9. A survey of techniques to reduce and manage external beam radiation-induced xerostomia in British oncology and radiotherapy departments

    Energy Technology Data Exchange (ETDEWEB)

    Macknelly, Andrew [Norfolk and Norwich University Hospital (United Kingdom); Day, Jane [Faculty of Health, Wellbeing and Science, University Campus Suffolk, Waterfront Building, Neptune Quay, Ipswich (United Kingdom)], E-mail: j.day@ucs.ac.uk

    2009-11-15

    Xerostomia is the most common side effect of external beam radiotherapy to the head and neck [Anand A, Jain J, Negi P, Chaudhoory A, Sinha S, Choudhury P, et al. Can dose reduction to one parotid gland prevent xerostomia? - A feasibility study for locally advanced head and neck cancer patients treated with intensity-modulated radiotherapy. Clinical Oncology 2006;18(6):497-504.]. A survey was carried out in British oncology departments to determine what treatment regimes, to minimise xerostomia, are used for patients with head-and-neck cancers treated with external beam radiotherapy. A semi-structured questionnaire consisting of both quantitative and qualitative questions was designed that asked departments which of the identified methods they used, why a method might not be currently employed, and whether its use had ever been considered. The study found that there are wide disparities between the techniques employed by oncology departments to avoid and reduce xerostomia in patients with cancers of the head and neck. The National Institute for Health and Clinical Excellence [National Institute for Health and Clinical Excellence (NICE). Improving outcomes in head and neck cancers: the manual. London: Office of Public Sector Information; 2004.], for example, recommends that patients are given dental care and dietary advice, but some departments did not appear to be doing this. Less than half of departments stated that they offer complementary therapies and less than 40% prescribed pilocarpine, a saliva stimulant. Only two respondents stated that they use amifostine, a radioprotector, during radiotherapy treatment to the head and neck. The results also suggested a move toward using Intensity Modulated Radiotherapy (IMRT) for treating head-and-neck cancers, which offers better normal tissue sparing than three-dimensional conformal radiotherapy. [Anand A, Jain J, Negi P, Chaudhoory A, Sinha S, Choudhury P, et al. Can dose reduction to one parotid gland prevent xerostomia

  10. Tectonic, volcanic and human activity ground deformation signals detected by multitemporal InSAR techniques in the Colima Volcanic Complex (Mexico) rift

    Science.gov (United States)

    Brunori, C.; Norini, G.; Bignami, C.; Groppelli, G.; Zucca, F.; Stramondo, S.; Capra, L.; Cabral-Cano, E.

    2010-12-01

    The evolution of volcanoes is strictly related to their substratum and the regional tectonics. The link among the morphology, geology and structure of volcanic edifices and the geological-structural characteristics of the basement is important to understand hazardous phenomena such as flank eruptions and lateral collapses of volcanoes. The Colima Rift is an active regional structure, N-S oriented, more than 100 km long and 10 km wide. This rift is filled by a ~1 km-thick sequence of Quaternary lacustrine sediments, alluvium, and colluvium, mostly underlying the roughly 3000 m thick volcanic pile of the Colima Volcanic Complex (CVC). In addition to the regional structures, curved faults, roughly E-W oriented, are observed on the CVC edifice due to the spreading of the volcano moving southward on the weak basement. So in the CVC edifice and surrounding area we can observe the interaction of regional structures and volcanic ones due to the gravitational loading of the volcanic edifice on the weak substratum of the graben. To measure displacements due to magma movement at depth and the interaction of regional and volcanic structures, SAR interferometry has proven to be a reliable method; however, andesitic stratovolcanoes like the CVC remain difficult to survey using this technique. The main causes are their specific geometry (steep topography), which induces strong tropospheric artefacts, and environmental conditions (e.g., mainly vegetation, ash and/or snow cover), leading to a loss of coherency. In this work we try to detect deformation phenomena for the wide CVC area using a robust multitemporal InSAR approach, Differential Synthetic Aperture Radar Interferometry (DInSAR). We apply the Hooper (2008) DInSAR algorithm (StaMPS/MTI) both to ENVISAT ASAR images acquired from 1993 to 2007 and to ALOS PALSAR data sets from 2006 to 2010 in order to determine the deformation patterns in the CVC.

  11. Ground penetrating radar and terrestrial laser scanner surveys on deposits of dilute pyroclastic density current deposits: insights for dune bedform genesis

    Science.gov (United States)

    Rémi Dujardin, Jean; Amin Douillet, Guilhem; Abolghasem, Amir; Cordonnier, Benoit; Kueppers, Ulrich; Bano, Maksim; Dingwell, Donald B.

    2014-05-01

    Dune bedforms formed by dilute pyroclastic density currents (PDC) are often described or interpreted as antidunes and chute-and-pools. However, the interpretation remains essentially speculative and is not well understood. This is largely due to the seeming impossibility of in-situ measurements and experimental scaling, as well as the lack of recent, 3D exposures. Indeed, most dune bedform cross-stratifications from the dilute PDC record outcrop in 2D sections. The 2006 eruption of Tungurahua has produced well-developed bedforms that are well exposed on the surface of the deposits with easy access. We performed a survey of these deposits combining ground penetrating radar (GPR) profiling with terrestrial laser scanning of the surface. The GPR survey was carried out in dense arrays (from 10 to 25 cm spacing between profiles) over ca. 10 m long bedforms. GPR profiles were corrected for topography from photogrammetry data. An in-house software package, RadLab (written in MATLAB), was used for common processing of individual profiles and 2D & 3D topographic migration. Each topography-corrected profile was then loaded into a seismic interpretation software, OpendTect, for 3D visualization and interpretation. Most bedforms show high lateral stability that is independent of the cross-stratification pattern (which varies between stoss-aggrading bedsets, stoss-erosive bedsets and stoss-depositional lensoidal layers). A few bedforms have profiles that evolve laterally (i.e. in a direction perpendicular to the flow direction). Cannibalization of two dune bedforms into a single one at one end of the profile can evolve into growth of a single bedform at the other lateral end. Also, lateral variation in the migration direction occurs, i.e. a single bedform can show upstream aggradation at one lateral end, but downstream migration at the other end. Some bedforms have great variations in their internal structure. Several episodes of growth and erosion can be

  12. Enhanced analysis methods to derive the spatial distribution of 131I deposition on the ground by airborne surveys at an early stage after the Fukushima Daiichi nuclear power plant accident.

    Science.gov (United States)

    Torii, Tatsuo; Sugita, Takeshi; Okada, Colin E; Reed, Michael S; Blumenthal, Daniel J

    2013-08-01

    This paper applies both new and well-tested analysis methods to aerial radiological surveys to extract the 131I ground concentrations present after the March 2011 Fukushima Daiichi nuclear power plant (NPP) accident. The analysis provides a complete map of 131I deposition, an important quantity incalculable at the time of the accident due to the short half-life of 131I and the complexity of the analysis. A map of 131I deposition is the first step in conducting internal exposure assessments, population dose reconstruction, and follow-up epidemiological studies. The short half-life of 131I necessitates the use of aerial radiological surveys to cover the large area quickly, thoroughly, and safely. Teams from the U.S. Department of Energy National Nuclear Security Administration (DOE/NNSA) performed aerial radiological surveys to provide initial maps of the dispersal of radioactive material in Japan. This work reports on analyses performed on a subset of the initial survey data by a joint Japan-U.S. collaboration to determine 131I ground concentrations. The analytical results show a high concentration of 131I northwest of the NPP, consistent with the previously reported radioactive cesium deposition, but also show a significant 131I concentration south of the plant, which was not observed in the original cesium analysis. The difference in the radioactive iodine and cesium patterns is possibly the result of differences in the ways these materials settle out of the air.

  13. Grounded theory.

    Science.gov (United States)

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  14. The Raman effect and its application to electronic spectroscopies in metal-centered species : Techniques and investigations in ground and excited states

    NARCIS (Netherlands)

    Browne, W.R.; J. McGarvey, J.

    2007-01-01

    In the decades since its discovery and somewhat limited early applications, Raman scattering has become the basis for the development of a variety of methods for probing molecular structure both in ground and electronically excited states. In this review, following a brief look at the underlying pri

  16. Is the adoption of Strategic Management Accounting techniques really “strategy-driven”? Evidence from a survey

    OpenAIRE

    Cinquini, Lino; Tenucci, Andrea

    2007-01-01

    Several different approaches to Strategic Management Accounting (SMA) can be found in the literature of management accounting since Simmonds (1981) coined the term. However, there is little survey research about SMA practice, with the exception of the studies of Guilding et al. (2000) and Cravens & Guilding (2001). The paper aims to enrich the fragmented knowledge on the topic by a contingency research study based on an internet questionnaire survey of Italian companies. The study foc...

  17. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    Science.gov (United States)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
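
    For exponentially distributed spectral power bins (the noise-only CFAR setting described above), both an order-statistic estimate and a single-pass threshold-and-count estimate of the noise power have simple closed forms. The sketch below illustrates both on simulated data; the threshold and sample sizes are illustrative and the parallel multi-threshold arrangement for dynamic-range expansion is not reproduced.

```python
# Two noise-power estimators for exponentially distributed power bins:
# an order-statistic (median) estimate and a threshold-and-count estimate.
import numpy as np

rng = np.random.default_rng(1)
true_power = 2.0
x = rng.exponential(true_power, size=100_000)      # simulated noise-only bins

# Order-statistic estimator: for an exponential distribution the median is
# mu * ln(2), so mu = median / ln(2).
mu_os = np.median(x) / np.log(2.0)

# Threshold-and-count estimator: the exceedance probability is
# P(X > T) = exp(-T/mu), so mu = T / (-ln(frac)) from a single pass.
T = 3.0
frac = np.count_nonzero(x > T) / x.size
mu_tc = T / (-np.log(frac))

print(f"order-statistic estimate: {mu_os:.3f}, threshold-and-count: {mu_tc:.3f}")
```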

  18. Effectiveness of control techniques in drinking water installations. Survey of recent scientific research results; Effectiviteit beheerstechnieken in drinkwaterinstallaties. RIVM inventariseert recente wetenschappelijke bevindingen

    Energy Technology Data Exchange (ETDEWEB)

    Scheffer, W.

    2012-12-15

    A literature survey has been carried out on current scientific knowledge about the effectiveness of management techniques for Legionella in potable water systems. The examined scientific literature from 2007-2011 comprises mainly case studies on the effect of the introduction of a certain management technique on Legionella growth. [Translated from Dutch] Commissioned by the Inspectie Leefomgeving en Transport (ILT), RIVM carried out a literature survey of current scientific knowledge about the effectiveness of control techniques for Legionella in drinking water installations. The scientific literature examined, from 2007-2011, mainly concerns case studies of the effect of introducing a particular control technique on Legionella growth.

  19. Survey of CT practice in Norway. Examination technique and patient doses; Computer-tomografi ved norske sykehus. Undersoekelsesteknikk og straaledose til pasient

    Energy Technology Data Exchange (ETDEWEB)

    Olerud, H.M.; Finne, I.E.

    1995-12-01

    A Norwegian survey of CT practice is presented, including examination technique and patient doses for 7 typical examinations and 12 specific clinical indications. The results cover 49 CT scanners, and the patient doses are somewhat higher than similar results from other countries. Use of contrast, different scan volumes and exposure techniques are the main reasons for the wide distribution of doses. The collective dose from CT is estimated at 1000 manSv, or 0.2 mSv/caput. 39 refs., 23 figs., 14 tabs.

  20. FY 1998 annual report on the surveys on high-efficiency power generation techniques of the next generation; 1998 nendo jisedai kokoritsu hatsuden gijutsu ni kansuru chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    Comprehensive surveys and assessments of the seeds of next-generation high-efficiency power generation techniques, both domestic and overseas, are conducted to help propose preliminary national research themes. In FY 1998, the survey efforts are directed not only to the trends of techniques for industrial power generation under development, e.g., combined cycle, coal-gasification combined cycle and pressurized fluidized bed combined cycle, but also to the trends of newly proposed systems, e.g., humid air combined cycle, methane reforming combined cycle, fuel reforming/humid air combined cycle, fuel reforming/fuel cell combined cycle and micro gas turbine, intended to further enhance efficiency by integration with a gas turbine power generation system. The efforts are also directed to new power generation techniques in the basic research stage being studied mainly by academic and research organizations, and to development trends of new power generation techniques under development by private enterprises, mainly based on the patent survey. (NEDO)

  1. Grounded cognition.

    Science.gov (United States)

    Barsalou, Lawrence W

    2008-01-01

    Grounded cognition rejects traditional views that cognition is computation on amodal symbols in a modular system, independent of the brain's modal systems for perception, action, and introspection. Instead, grounded cognition proposes that modal simulations, bodily states, and situated action underlie cognition. Accumulating behavioral and neural evidence supporting this view is reviewed from research on perception, memory, knowledge, language, thought, social cognition, and development. Theories of grounded cognition are also reviewed, as are origins of the area and common misperceptions of it. Theoretical, empirical, and methodological issues are raised whose future treatment is likely to affect the growth and impact of grounded cognition.

  2. New-Measurement Techniques to Diagnose Charged Dust and Plasma Layers in the Near-Earth Space Environment Using Ground-Based Ionospheric Heating Facilities

    OpenAIRE

    Mahmoudian, Alireza

    2013-01-01

    Recently, experimental observations have shown that radar echoes from the irregularity source region associated with mesospheric dusty space plasmas may be modulated by radio wave heating with ground-based ionospheric heating facilities. These experiments show great promise as a diagnostic for the associated dusty plasma in the Near-Earth Space Environment, which is believed to have links to global change. This provides an alternative to more complicated and costly space-based observational app...

  3. Ground States of the Lithium Atom and its Ions up to Z = 10 in the Presence of Magnetic Field using Variational Monte Carlo Technique

    CERN Document Server

    Doma, S B; Farag, A M; El-Gammal, F N

    2016-01-01

    The variational Monte Carlo method is applied to investigate the ground state energy of the lithium atom and its ions up to Z=10 in the presence of an external magnetic field. Our calculations are based on using three forms of compact and accurate trial wave functions, which were put forward for calculating energies in the absence of a magnetic field. The obtained results are in good agreement with the most recent accurate values and also with the exact values.
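
    The paper's calculations use elaborate multi-electron trial functions for lithium in a magnetic field; as a much-reduced illustration of the variational Monte Carlo machinery itself, the sketch below samples |psi|^2 for a one-parameter hydrogen-atom trial function with the Metropolis algorithm and averages the local energy. At alpha = 1 the exact field-free ground-state energy of -0.5 hartree is recovered.

```python
# Much-reduced VMC illustration: Metropolis sampling of |psi|^2 for a hydrogen
# atom trial function psi = exp(-alpha*r), averaging the local energy
# E_L = -alpha^2/2 + (alpha - 1)/r (hartree).
import numpy as np

def vmc_energy(alpha, n_steps=100_000, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    pos = np.array([0.5, 0.5, 0.5])
    energies = []
    for i in range(n_steps):
        trial = pos + rng.uniform(-step, step, size=3)
        r_old, r_new = np.linalg.norm(pos), np.linalg.norm(trial)
        # Metropolis acceptance on |psi|^2 = exp(-2*alpha*r)
        if rng.random() < np.exp(-2.0 * alpha * (r_new - r_old)):
            pos, r_old = trial, r_new
        if i > n_steps // 10:                      # discard burn-in
            energies.append(-0.5 * alpha**2 + (alpha - 1.0) / r_old)
    energies = np.asarray(energies)
    return energies.mean(), energies.std() / np.sqrt(energies.size)

print(vmc_energy(alpha=0.9))   # variational energy above -0.5
print(vmc_energy(alpha=1.0))   # -0.5 hartree (exact ground state)
```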

  4. The VIMOS Public Extragalactic Redshift Survey (VIPERS). Never mind the gaps: comparing techniques to restore homogeneous sky coverage

    CERN Document Server

    Cucciati, O; Branchini, E; Marulli, F; Iovino, A; Moscardini, L; Bel, J; Cappi, A; Peacock, J A; de la Torre, S; Bolzonella, M; Guzzo, L; Polletta, M; Fritz, A; Adami, C; Bottini, D; Coupon, J; Davidzon, I; Franzetti, P; Fumana, M; Garilli, B; Krywult, J; Malek, K; Paioro, L; Pollo, A; Scodeggio, M; Tasca, L A M; Vergani, D; Zanichelli, A; Di Porto, C; Zamorani, G

    2014-01-01

    [Abridged] Non-uniform sampling and gaps in sky coverage are common in galaxy redshift surveys but these effects can degrade galaxy counts-in-cells and density estimates. We carry out a comparison of methods that aim to fill the gaps to correct for the systematic effects. Our study is motivated by the analysis of the VIMOS Extragalactic Redshift Survey (VIPERS), a flux-limited survey (i<22.5) based on one-pass observations with VIMOS, with gaps covering 25% of the surveyed area and a mean sampling rate of 35%. Our findings are applicable to other surveys with similar observing strategies. We compare 1) two algorithms based on photometric redshift, that assign redshifts to galaxies based on the spectroscopic redshifts of the nearest neighbours, 2) two Bayesian methods, the Wiener filter and the Poisson-Lognormal filter. Using galaxy mock catalogues we quantify the accuracy of the counts-in-cells measurements on scales of R=5 and 8 Mpc/h after applying each of these methods. We also study how they perform to...

  5. Results of ground level radiation measurements in support of the 1978 aerial survey of the Lake Ontario Ordnance Works, Lewiston, New York

    Energy Technology Data Exchange (ETDEWEB)

    Berven, B A; Doane, R W; Haywood, F F; Shinpaugh, W H

    1979-09-01

    This report contains the results of a limited series of measurements at the Lake Ontario Ordnance Works site, three miles northeast of Lewiston, New York. The scope of this survey was not extensive, and the survey was conducted to support a concurrent aerial survey conducted by EG and G, Inc. Results of this survey indicate two sources of significant external gamma exposure on the site as well as several locations that retain low to intermediate levels of radioactivity in soil. Off-site soil radionuclide concentrations were well within background levels with one exception. Water radionuclide concentrations on the site in the Central Drainage Ditch are significantly above background levels but decrease with distance from the spoil pile, and are within restrictive concentration guides for off-site locations.

  6. Photogrammetric techniques for aerospace applications

    Science.gov (United States)

    Liu, Tianshu; Burner, Alpheus W.; Jones, Thomas W.; Barrows, Danny A.

    2012-10-01

    Photogrammetric techniques have been used for measuring the important physical quantities in both ground and flight testing including aeroelastic deformation, attitude, position, shape and dynamics of objects such as wind tunnel models, flight vehicles, rotating blades and large space structures. The distinct advantage of photogrammetric measurement is that it is a non-contact, global measurement technique. Although the general principles of photogrammetry are well known particularly in topographic and aerial survey, photogrammetric techniques require special adaptation for aerospace applications. This review provides a comprehensive and systematic summary of photogrammetric techniques for aerospace applications based on diverse sources. It is useful mainly for aerospace engineers who want to use photogrammetric techniques, but it also gives a general introduction for photogrammetrists and computer vision scientists to new applications.

  7. Comparison of conventional culture method and fluorescent in situ hybridization technique for detection of Listeria spp. in ground beef, turkey, and chicken breast fillets in İzmir, Turkey.

    Science.gov (United States)

    Baysal, Ayse Handan

    2014-12-01

    The occurrence of Listeria species in refrigerated fresh chicken breast fillet, turkey breast fillet, and ground beef was evaluated, comparing the conventional culture method and fluorescent in situ hybridization (FISH). FISH uses hybridization of a nucleic acid sequence target of a microorganism with a specific DNA probe labeled with a fluorochrome and imaging by a fluorescence microscope. First, Listeria was inoculated in chicken breast fillet, turkey breast fillet, or ground beef, and the applicability of the FISH method was evaluated. Second, Listeria was detected in fresh chicken breast fillet, turkey breast fillet, and ground beef by culture and FISH methods. Listeria was isolated from 27 (37.4%) of 216 samples by the standard culture method, whereas FISH detected 25 (24.7%) preenriched samples. Of these isolates, 17 (63%) were L. innocua, 6 (22%) L. welshimeri, and 4 (14.8%) L. seeligeri. Overall, the prevalences of Listeria spp. found with the conventional culture method in chicken breast fillet, turkey breast fillet, and ground beef were 9.7, 6.9, and 20.8%, whereas with the FISH technique these values were 11.1, 6.9, and 16.7%, respectively. The molecular FISH technique appears to be a cheap, sensitive, and time-efficient procedure that could be used for routine detection of Listeria spp. in meat. This study showed that retail raw meats are potentially contaminated with Listeria spp. and are, thus, vehicles for transmitting diseases caused by foodborne pathogens, underlining the need for increased precautions, such as implementation of hazard analysis and critical control points and consumer food safety education.

  8. Barren-ground caribou (Rangifer tarandus groenlandicus) behaviour after recent fire events; integrating caribou telemetry data with Landsat fire detection techniques.

    Science.gov (United States)

    Rickbeil, Gregory J M; Hermosilla, Txomin; Coops, Nicholas C; White, Joanne C; Wulder, Michael A

    2017-03-01

    Fire regimes are changing throughout the North American boreal forest in complex ways. Fire is also a major factor governing access to high-quality forage such as terricolous lichens for barren-ground caribou (Rangifer tarandus groenlandicus). Additionally, fire alters forest structure, which can affect barren-ground caribou's ability to navigate in a landscape. Here, we characterize how the size and severity of fires are changing across five barren-ground caribou herd ranges in the Northwest Territories and Nunavut, Canada. Additionally, we demonstrate how time since fire, fire severity, and season result in complex changes in caribou behavioural metrics estimated using telemetry data. Fire disturbances were identified using novel gap-free Landsat surface reflectance composites from 1985 to 2011 across all herd ranges. Burn severity was estimated using the differenced normalized burn ratio. Annual area burned and burn severity were assessed through time for each herd and related to two behavioural metrics: velocity and relative turning angle. Neither annual area burned nor burn severity displayed any temporal trend within the study period. However, certain herds, such as the Ahiak/Beverly, have more exposure to fire than other herds (i.e. Cape Bathurst had a maximum forested area burned of less than 4 km²). Time since fire and burn severity both significantly affected velocity and relative turning angles. During fall, winter, and spring, fire virtually eliminated foraging-focused behaviour for all 26 years of analysis, while more severe fires resulted in a marked increase in movement-focused behaviour compared to unburnt patches. Between seasons, caribou used burned areas as early as 1-year postfire, demonstrating complex, nonlinear reactions to time since fire, fire severity, and season. In all cases, increases in movement-focused behaviour were detected postfire. We conclude that changes in caribou behaviour immediately postfire are primarily driven by

  9. Preliminary estimates of residence times and apparent ages of ground water in the Chesapeake Bay watershed, and water-quality data from a survey of springs

    Science.gov (United States)

    Focazio, Michael J.; Plummer, L. Neil; Bohlke, John K.; Busenberg, Eurybiades; Bachman, L. Joseph; Powars, David S.

    1998-01-01

    Knowledge of the residence times of the ground-water systems in the Chesapeake Bay watershed helps resource managers anticipate potential delays between implementation of land-management practices and any improvements in river and estuary water quality. This report presents preliminary estimates of ground-water residence times and apparent ages of water in the shallow aquifers of the Chesapeake Bay watershed. A simple reservoir model, published data, and analyses of spring water were used to estimate residence times and apparent ages of ground-water discharge. Ranges of aquifer hydraulic characteristics throughout the Bay watershed were derived from published literature and were used to estimate ground-water residence times on the basis of a simple reservoir model. Simple combinations of rock type and physiographic province were used to delineate hydrogeomorphic regions (HGMRs) for the study area. The HGMRs are used to facilitate organization and display of the data and analyses. Illustrations depicting the relation of aquifer characteristics and associated residence times as a continuum for each HGMR were developed. In this way, the natural variation of aquifer characteristics can be seen graphically by use of data from selected representative studies. Water samples collected in September and November 1996, from 46 springs throughout the watershed, were analyzed for chlorofluorocarbons (CFCs) to estimate the apparent age of ground water. For comparison purposes, apparent ages of water from springs were calculated assuming piston flow. Additional data are given to estimate apparent ages assuming an exponential distribution of ages in spring discharge. Additionally, results from previous studies of CFC-dating of ground water from other springs and wells in the watershed were compiled. The CFC data, and the data on major ions, nutrients, and nitrogen isotopes in the water collected from the 46 springs are included in this report. The apparent ages of water
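
    Under the piston-flow assumption, the apparent age is simply the sampling date minus the recharge year at which the atmospheric CFC input curve matches the equivalent atmospheric mixing ratio computed from the dissolved-gas measurement. A minimal sketch, using a coarse illustrative input curve rather than the real Northern Hemisphere CFC-12 record, and skipping the Henry's-law solubility step:

```python
# Piston-flow apparent age: find the year whose atmospheric CFC-12 mixing
# ratio matches the sample's equivalent atmospheric mixing ratio.
import numpy as np

years = np.arange(1950, 1997)
# Coarse, illustrative monotonic growth curve [pptv]; real work uses the
# measured atmospheric record.
atm_cfc12 = np.interp(years, [1950, 1970, 1990, 1996], [10.0, 120.0, 480.0, 540.0])

sample_year = 1996.7
sample_equiv_pptv = 300.0            # equivalent atmospheric mixing ratio (illustrative)

recharge_year = np.interp(sample_equiv_pptv, atm_cfc12, years)
apparent_age = sample_year - recharge_year
print(f"recharge ~{recharge_year:.1f}, apparent age ~{apparent_age:.1f} years")
```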

  10. Crop area ground sample survey using Google Earth image-aided%Google Earth影像辅助的农作物面积地面样方调查

    Institute of Scientific and Technical Information of China (English)

    刘佳; 王利民; 滕飞; 李丹丹; 王小龙; 曹怀堂

    2015-01-01

    By using Google Earth (GE) imagery revised by differential global positioning system (DGPS) actual measurement points, this paper conducts a ground sample survey of crop planting areas, and compares the difference in survey accuracy and efficiency between this method and a method completely using GPS field measurement. The study area is the Agricultural High-tech Industrial Park of the Chinese Academy of Agricultural Sciences (Wanzhuang) and its surrounding area, with an area of 3.1 km × 2.0 km. The paper defines the data from the different GE image sources. The images downloaded based on GE Client COM API programming are defined as A-level data, the images revised by online GE images are defined as B-level data, and the images revised by DGPS actual measurement points are defined as C-level data. Compared with the checkpoints of DGPS actual measurement, A-level data of the GE images with spatial resolution of over 0.5 m have a mean square error of 232.7 m in the X and Y directions, while for B-level and C-level data it is 5.4 m and 1.0 m, respectively. The B-level data meet the requirement that "the mean square error in planimetric position of 1:25000 should be no more than 8.75 m", and the C-level data meet the requirement that "the mean square error in planimetric position of 1:10000 flat ground should be no more than 3.5 m", which are specified in the Digital Aerophotogrammetry Aerial Trigonometric Survey Specifications. Choosing samples with 3 structure levels, i.e. simple, medium and complex, in the Langfang survey area, the area measurement accuracy of B-level and C-level data is assessed, and the average errors are 0.108% and 0.018%, respectively, through comparison with DGPS actual measurement areas. The larger the crop area, the higher the accuracy of the measurement. The survey meets the accuracy requirement of large scale ground sample surveys. With respect to the GE online coordinates, the average minimal mean square error of B-level data is 0.5 m, and the
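
    The planimetric accuracy check described above reduces to a root-mean-square error of image-derived coordinates against DGPS check points, per axis and combined. A minimal sketch with illustrative coordinates, not the paper's check-point data:

```python
# Planimetric RMSE of image-derived coordinates against DGPS check points.
import numpy as np

ge_xy   = np.array([[500012.3, 4301008.1], [500240.9, 4301122.7], [500455.0, 4300987.4]])  # image-derived [m]
dgps_xy = np.array([[500013.1, 4301007.2], [500239.8, 4301123.9], [500456.2, 4300986.1]])  # DGPS check points [m]

diff = ge_xy - dgps_xy
rmse_x = np.sqrt(np.mean(diff[:, 0] ** 2))
rmse_y = np.sqrt(np.mean(diff[:, 1] ** 2))
rmse_plan = np.sqrt(rmse_x ** 2 + rmse_y ** 2)
print(f"RMSE X = {rmse_x:.2f} m, RMSE Y = {rmse_y:.2f} m, planimetric = {rmse_plan:.2f} m")
```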

  11. Ground water in Oklahoma

    Science.gov (United States)

    Leonard, A.R.

    1960-01-01

    One of the first requisites for the intelligent planning of utilization and control of water and for the administration of laws relating to its use is data on the quantity, quality, and mode of occurrence of the available supplies. The collection, evaluation and interpretation, and publication of such data are among the primary functions of the U.S. Geological Survey. Since 1895 the Congress has made appropriations to the Survey for investigation of the water resources of the Nation. In 1929 the Congress adopted the policy of dollar-for-dollar cooperation with the States and local governmental agencies in water-resources investigations of the U.S. Geological Survey. In 1937 a program of ground-water investigations was started in cooperation with the Oklahoma Geological Survey, and in 1949 this program was expanded to include cooperation with the Oklahoma Planning and Resources Board. In 1957 the State Legislature created the Oklahoma Water Resources Board as the principal State water agency and it became the principal local cooperator. The Ground Water Branch of the U.S. Geological Survey collects, analyzes, and evaluates basic information on ground-water resources and prepares interpretive reports based on those data. Cooperative ground-water work was first concentrated in the Panhandle counties. During World War II most work was related to problems of water supply for defense requirements. Since 1945 detailed investigations of ground-water availability have been made in 11 areas, chiefly in the western and central parts of the State. In addition, water levels in more than 300 wells are measured periodically, principally in the western half of the State. In Oklahoma current studies are directed toward determining the source, occurrence, and availability of ground water and toward estimating the quantity of water and rate of replenishment to specific areas and water-bearing formations. Ground water plays an important role in the economy of the State. It is

  12. Spectroscopic measurements of atmospheric constituents and pollutants by in situ and remote techniques from the ground and in flight; Mesures spectroscopiques de constituants et de polluants atmospheriques par techniques in situ et a distance, au sol ou embarquees

    Energy Technology Data Exchange (ETDEWEB)

    Camy-Peyret, C.; Payan, S.; Jeseck, P.; Te, Y. [Universite Pierre et Marie Curie, Lab. de Physique Moleculaire et Applications, LPMA, 75 - Paris (France)

    2001-09-01

    Infrared spectroscopy is a powerful tool for precise measurements of atmospheric trace species concentrations through the use of characteristic spectral signatures of the different molecular species and their associated vibration-rotation bands in the mid- or near-infrared. Different methods based on quantitative spectroscopy permit tropospheric or stratospheric measurements: in situ long path absorption, atmospheric absorption/emission by Fourier transform spectroscopy with high spectral resolution instruments on the ground, airborne, balloon-borne or satellite-borne. (authors)

  13. Survey of WBSNs for Pre-Hospital Assistance: Trends to Maximize the Network Lifetime and Video Transmission Techniques

    Directory of Open Access Journals (Sweden)

    Enrique Gonzalez

    2015-05-01

    Full Text Available This survey aims to encourage the multidisciplinary communities to join forces for innovation in the mobile health monitoring area. Specifically, multidisciplinary innovations in medical emergency scenarios can have a significant impact on the effectiveness and quality of the procedures and practices in the delivery of medical care. Wireless body sensor networks (WBSNs) are a promising technology capable of improving the existing practices in condition assessment and care delivery for a patient in a medical emergency. This technology can also facilitate early interventions by a specialist physician during the pre-hospital period. WBSNs make these early interventions possible by establishing remote communication links with video/audio support and by providing medical information such as vital signs, electrocardiograms, etc. in real time. This survey focuses on relevant issues needed to understand how to set up a WBSN for medical emergencies. These issues are: monitoring vital signs and video transmission, energy-efficient protocols, scheduling, optimization and energy consumption on a WBSN.

  14. Education techniques for lifelong learning: international variations in initial certification and maintenance of certification in radiology: a multinational survey.

    Science.gov (United States)

    Bresolin, Linda; McLoud, Theresa C; Becker, Gary J; Kwakwa, Francis

    2008-01-01

    A survey was sent to representatives of national and regional radiology societies around the world regarding the status of certification, maintenance of certification (MOC), and continuing medical education (CME) requirements. Data were forthcoming from 24 countries (response rate, 71%), including the United States. The survey results indicated that most responding countries now have a standardized process and requirements for initial certification of diagnostic and therapeutic radiologists. Similarly, most reporting countries now have some form of mandatory CME, although the degree to which compliance is tracked varies. There is considerable heterogeneity in what these countries require for recertification or MOC, and the development of such requirements is cited as a goal for many of the countries. The standardization and institutionalization of certification and recertification requirements is in rapid evolution globally.

  15. A survey of a pampas deer, Ozotoceros bezoarticus leucogaster (Arctiodactyla, Cervidae, population in the Pantanal wetland, Brazil, using the distance sampling technique

    Directory of Open Access Journals (Sweden)

    Tomás, W. M.

    2001-06-01

    Full Text Available The pampas deer is an endangered South American species which occurs in open grasslands and savannas. The aim of this survey was to evaluate the use of the distance sampling technique to estimate densities of the species in the Pantanal wetland, as well as to analyze the applicability of the method for a monitoring program. The surveys were conducted from vehicles on roads and also on foot along 26 parallel transects in November 1999 and 2000 at Campo Dora ranch, south-central Pantanal, Brazil. Deer densities were estimated using the program DISTANCE, and the program MONITOR was used to run a power analysis to estimate the probability of detecting a decline in the population. The deer density estimated from vehicles, with data from both years, was 9.81±3.8 individuals/km2, and 5.53±0.68 individuals/km2 from transects sampled on foot. The power analysis of these data revealed that a monitoring program would require at least two surveys per year over seven years to obtain a 90% chance of detecting a 5% decline in the population. Our results also indicate that surveys from roads are not recommended for pampas deer counts, as the animals appear to keep a relatively safe distance from cars.
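
    For a line-transect survey like this one, a density estimate can be obtained from perpendicular sighting distances with a detection function; the study itself used the program DISTANCE. The sketch below shows the simplest half-normal version (sigma from the maximum-likelihood fit, effective strip width ESW = sigma*sqrt(pi/2), density D = n/(2*L*ESW)) with illustrative distances and effort, ignoring cluster size and variance estimation.

```python
# Minimal line-transect density estimate with a half-normal detection function.
import numpy as np

perp_dist_km = np.array([0.02, 0.05, 0.11, 0.03, 0.08, 0.15, 0.06, 0.01, 0.09, 0.04])  # illustrative sightings
total_effort_km = 52.0                               # summed transect length (illustrative)

sigma = np.sqrt(np.mean(perp_dist_km ** 2))          # MLE of sigma for half-normal g(x)
esw = sigma * np.sqrt(np.pi / 2.0)                   # effective strip width [km]
n = perp_dist_km.size
density = n / (2.0 * total_effort_km * esw)          # individuals per km^2
print(f"ESW = {esw * 1000:.0f} m, D = {density:.2f} ind/km^2")
```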

  16. Amazonas project: Application of remote sensing techniques for the integrated survey of natural resources in Amazonas. [Brazil

    Science.gov (United States)

    Dejesusparada, N. (Principal Investigator)

    1981-01-01

    The use of LANDSAT multispectral scanner and return beam vidicon imagery for surveying the natural resources of the Brazilian Amazonas is described. Purposes of the Amazonas development project are summarized. The application of LANDSAT imagery to identification of vegetation coverage and soil use, identification of soil types, geomorphology, and geology and highway planning is discussed. An evaluation of the worth of LANDSAT imagery in mapping the region is presented. Maps generated by the project are included.

  17. A Survey of Non-conventional Techniques for Low-voltage Low-power Analog Circuit Design

    Directory of Open Access Journals (Sweden)

    F. Khateb

    2013-06-01

    Full Text Available Designing integrated circuits able to work under low-voltage (LV) low-power (LP) conditions is currently undergoing a very considerable boom. Reducing the voltage supply and power consumption of integrated circuits is a crucial factor, since in general it ensures device reliability, prevents overheating of the circuits and, in particular, prolongs the operation period for battery-powered devices. Recently, non-conventional techniques, i.e. bulk-driven (BD), floating-gate (FG) and quasi-floating-gate (QFG) techniques, have been proposed as powerful ways to reduce the design complexity and push the voltage supply towards the threshold voltage of the MOS transistors (MOST). Therefore, this paper presents the operation principle and the advantages and disadvantages of each of these techniques, enabling circuit designers to choose the proper design technique based on application requirements. As an application example, three operational transconductance amplifiers (OTAs) based on these non-conventional techniques are presented; the voltage supply is only ±0.4 V and the power consumption is 23.5 µW. PSpice simulation results using the 0.18 µm CMOS technology from TSMC are included to verify the design functionality and correspondence with theory.

  18. Survey of the results of acute sciatic nerve repair comparing epineural and perineurial techniques in the lower extremities of rat

    Institute of Scientific and Technical Information of China (English)

    Hamid Karimi; Kamal Seyed Forootan; Gholamreza Moein; Seyed Jaber Mosavi; Batol Ghorbani Iekta

    2015-01-01

    Objective: To compare the results of nerve repair using the two techniques in rats, in order to address the existing disagreement. Methods: Twenty adult male rats were included in the treatment group. The acutely transected sciatic nerve was repaired with the epineural technique in half of the rats; in the other half the perineurial technique was applied. After 80 d, the number of regrown axons distal to the repair site was counted with an optical microscope. Additionally, the return of motor function was evaluated from the rats' footprints. Results: In the epineural group the SFI index was (56.33±32.30) and in the perineurial group (55.71±30.31); P value=0.930, indicating no difference between the two surgical techniques. However, within each group, statistical tests showed significant functional improvement compared with the day before surgery (P value=0.0001). Statistical tests showed that the average number of axons distal to the anastomosis site was (349±80) in the epineural group and (405±174) in the perineurial group. The groups showed no significant difference in the number of axons (P value=0.36). Conclusion: The results of the epineural and perineurial surgical techniques show no difference in nerve repair, SFI index, or axon counts in the distal segment.

  19. Ground Wars

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Kleis

    Political campaigns today are won or lost in the so-called ground war--the strategic deployment of teams of staffers, volunteers, and paid part-timers who work the phones and canvass block by block, house by house, voter by voter. Ground Wars provides an in-depth ethnographic portrait of two...... infrastructures that utilize large databases with detailed individual-level information for targeting voters, and armies of dedicated volunteers and paid part-timers. Nielsen challenges the notion that political communication in America must be tightly scripted, controlled, and conducted by a select coterie...... of professionals. Yet he also quashes the romantic idea that canvassing is a purer form of grassroots politics. In today's political ground wars, Nielsen demonstrates, even the most ordinary-seeming volunteer knocking at your door is backed up by high-tech targeting technologies and party expertise. Ground Wars...

  20. Report to the Pacific Flyway Study Committee on 1986-1997 Breeding Ground Surveys of Dusky Canada Geese on the Copper River Delta

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The 1997 Copper River Delta survey for breeding dusky Canada geese was conducted from 13-15 May by the Division of Migratory Bird Management, U.S. Fish and Wildlife...

  1. Main Sources of Occupational Stress and Symptoms of Burnout, Clinical Distress, and Post-Traumatic Stress Among Distributed Common Ground System Intelligence Exploitation Operators (2011 USAFSAM Survey Results)

    Science.gov (United States)

    2012-09-01

    [Only indexing fragments of this report are available: excerpts from its references and table of contents indicate that the survey assessed sources of occupational stress with standardized instruments, including the Maslach Burnout Inventory-General Survey (MBI-GS) for occupational burnout and an instrument for clinical distress (Outcome ...).]

  2. Report to the Pacific Flyway Study Committee on 1986-2002 Breeding Ground Survey Preliminary Results for Dusky Canada geese on the Copper River Delta, Alaska

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Copper River Delta dusky Canada geese survey was conducted on 17-18 May, 2002, for the 17th consecutive year by the Division of Migratory Bird Management, U.S....

  3. Determination of ground water flow using dilution technique. Modification of equipment and supplementary field measurements; Bestaemning av grundvattenfloedet med utspaedningsteknik. Modifiering av utrustning och kompletterande faeltmaetningar

    Energy Technology Data Exchange (ETDEWEB)

    Gustafsson, Erik [Geosigma AB, Uppsala (Sweden)

    2002-12-01

    Equipment for in-situ measurements of ground water flow was modified and improved. Performance data on the dilution sonde are discussed in the report, as well as possibilities for further improvements of the equipment. The most important results and conclusions from the tests are: If an accuracy of 10% is acceptable, the lowest measurable flow rate is of the order of 10^-11 m/s in fractured granite. Low flows take a long time to measure: measuring the flow through a section with hydraulic conductivity 10^-8 m/s at a gradient of 0.01 will take 30-50 days in a 110 mm borehole; a 56 mm borehole will need half that time. The optical in-situ measurement of tracer concentration is disturbed, in particular at the beginning of the measurement, by sedimenting particles that are loosened when the sonde is lowered into the borehole. The background transmission T_b is a measure of the amount of disturbing particles in the measurement section and enters the calibration equation as a constant. By delaying the tracer injection until the variation of T_b is less than 7.5x10^-3 per hour, the disturbance from sedimenting particles is negligible. The field measurements were performed without problems in 2 m long sections with hydraulic conductivity around 10^-7 - 10^-8 m/s. The measured flow rates were in the interval 10^-8 - 10^-9 m/s. Good correlations were established between changes in ground water flow and measured changes in the hydraulic gradient in the test area. Indirect comparisons with hydraulic tests also show small differences, inside the margin of error for the methods.
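
    A minimal sketch, with hypothetical values, of the point-dilution calculation underlying this kind of flow measurement: the Darcy velocity follows from the exponential decay of tracer concentration in the borehole measurement section.

    ```python
    import numpy as np

    # Hypothetical borehole-dilution test values (all assumed, not from the report)
    V = 2.0e-3            # water volume of the measurement section (m^3)
    A = 0.110 * 2.0       # flow cross-section: borehole diameter x section length (m^2), simplified
    alpha = 2.0           # flow-distortion (convergence) factor, commonly ~2 for an open hole
    t = 5 * 24 * 3600.0   # elapsed time since tracer injection (s)
    C0, Ct = 100.0, 60.0  # tracer concentration at injection and at time t (arbitrary units)

    # Dilution model C(t) = C0 * exp(-alpha * A * v * t / V), solved for the Darcy velocity v
    v = V / (alpha * A * t) * np.log(C0 / Ct)
    print(f"Darcy velocity ~ {v:.2e} m/s")
    ```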

  4. A Survey of \\delta18O and \\delta15N Ratios in Ground Water from an Agricultural Community in the San Joaquin Valley, California

    Science.gov (United States)

    Glowacki, S. D.; Suen, C. J.

    2004-12-01

    We studied ground water samples from domestic and monitoring wells in an agricultural community on the eastern side of the San Joaquin Valley, California. The study area is rich in alluvial soils, creating extremely fertile farmland. Livestock farms and agricultural fields are abundant in the area. Fifty-four ground water samples were analyzed for δ18O and δ15N in dissolved nitrate, in addition to nutrients and major minerals. Nitrate concentration levels in the groundwater are elevated and affected by agricultural and other activities. Possible sources of nutrients include a municipal waste-water treatment facility, a raisin processing plant, a meat processing plant, a turkey farm, dairy operations, and agricultural fields. However, except for the turkey farm and a dairy, we found no statistically significant contribution of nitrate from the other facilities as compared to the rest of the area. The δ18O versus δ15N plot of dissolved ground water nitrate shows most samples clustered around an area consistent with soil organic nitrogen. In addition, the rest of the samples show a trend that is indicative of a denitrification process. Generally, high δ15N values are associated with low nitrate concentrations. The isotopic signal of denitrification is particularly pronounced in samples in the vicinity of the waste-water treatment facility, where the highest values of δ15N and the lowest nitrate concentrations are observed. However, these samples also have elevated chloride concentrations indicating a waste-water source. These data suggest that denitrification in the subsurface may have been enhanced by bacterial species introduced by the effluent of the plant. [This study was performed with the collaboration of Steven R Silva of USGS, Menlo Park, and Iris Yamagata and Holly Jo Ferrin of the California Department of Water Resources.]

  5. Corrosive effect of the type of soil in the systems of grounding more used (copper and stainless steel) for local soil samples from the city of Tunja (Colombia), by means of electrochemical techniques

    Science.gov (United States)

    Guerrero, L.; Salas, Y.; Blanco, J.

    2016-02-01

    In this work, electrochemical techniques were used to determine the corrosion behaviour of copper and stainless steel grounding electrodes as a function of the soil type with which they react. A slight but significant change was observed in the corrosion rate, the linear polarization resistance, and the equivalent-circuit parameters obtained by electrochemical impedance spectroscopy. According to the laboratory analyses the electrolytes in the soils differ only slightly, but an influence was noted from the water-retention capacity, governed mainly by clays, which affects ion mobility and therefore measurements such as the corrosion rate. Copper showed a lower corrosion potential; nevertheless, regardless of the soil type, the corrosion rate was much higher for the copper-based electrodes, by several orders of magnitude.

  6. A LITERATURE SURVEY ON VARIOUS ILLUMINATION NORMALIZATION TECHNIQUES FOR FACE RECOGNITION WITH FUZZY K NEAREST NEIGHBOUR CLASSIFIER

    Directory of Open Access Journals (Sweden)

    A. Thamizharasi

    2015-05-01

    Full Text Available Face recognition is now widely used in video surveillance, social networks and criminal identification. The performance of face recognition is affected by variations in illumination, pose and aging, and by partial occlusion of the face by hats, scarves, glasses, etc. Illumination variation is still a challenging problem in face recognition. The aim is to compare the various illumination normalization techniques. The illumination normalization techniques include: Log transformations, Power Law transformations, Histogram equalization, Adaptive histogram equalization, Contrast stretching, Retinex, Multi scale Retinex, Difference of Gaussian, DCT, DCT Normalization, DWT, Gradient face, Self Quotient, Multi scale Self Quotient and Homomorphic filter. The proposed work consists of three steps. The first step is to preprocess the face image with the above illumination normalization techniques; the second step is to create the train and test databases from the preprocessed face images; and the third step is to recognize the face images using a Fuzzy K nearest neighbour classifier. The face recognition accuracy of all preprocessing techniques is compared using the AR face database of color images.
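
    A minimal sketch of two of the listed preprocessing steps, histogram equalization and Difference of Gaussian filtering, applied to a grayscale face image; OpenCV and NumPy are assumed, and the file name is a placeholder.

    ```python
    import cv2
    import numpy as np

    img = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)  # placeholder image file

    # Histogram equalization: spreads intensities, reducing global illumination differences
    eq = cv2.equalizeHist(img)

    # Difference of Gaussian (DoG): band-pass filter that suppresses slow illumination gradients
    g1 = cv2.GaussianBlur(img.astype(np.float32), (0, 0), sigmaX=1.0)
    g2 = cv2.GaussianBlur(img.astype(np.float32), (0, 0), sigmaX=2.0)
    dog = cv2.normalize(g1 - g2, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    ```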

  7. Data mining techniques for distributed denial of service attacks detection in the internet of things: A research survey

    CSIR Research Space (South Africa)

    Machaka, P

    2016-08-01

    Full Text Available The chapter further investigates the state-of-the-art in data mining techniques for Distributed Denial of Service (DDoS) attacks targeting the various infrastructures. The chapter explores the characteristics and pervasiveness of DDoS attacks. It also explores...

  8. Web-based, mobile-device friendly, self-report survey system incorporating avatars and gaming console techniques.

    Science.gov (United States)

    Savel, Craig; Mierzwa, Stan; Gorbach, Pamina; Lally, Michelle; Zimet, Gregory; Meyer, Kristin; Souidi, Samir; Interventions, Aids

    2014-01-01

    We describe building an avatar-based self-report data collection tool to be used for a specific HIV prevention research project that is evaluating the feasibility and acceptability of this novel approach to collect self-reported data among youth. We discuss the gathering of requirements, the process of building a prototype of the envisioned system, and the lessons learned during the development of the solution. Specific knowledge is shared regarding technical experience with software development technologies and possible avenues for changes that could be considered if such a self-report survey system is used again. Examples of other gaming and avatar technology systems are included to provide further background.

  9. Monitoring of ground movement in open pit iron mines of Carajás Province (Amazon region) based on A-DInSAR techniques using TerraSAR-X data

    Science.gov (United States)

    Silva, Guilherme Gregório; Mura, José Claudio; Paradella, Waldir Renato; Gama, Fabio Furlan; Temporim, Filipe Altoé

    2017-04-01

    Persistent scatterer interferometry (PSI) analysis of a large area is always a challenging task with regard to the removal of the atmospheric phase component. This work presents an investigation of ground movement measurements based on a combination of differential SAR interferometry time-series (DTS) and PSI techniques, applied over a large area containing open pit iron mines located in Carajás (Brazilian Amazon Region), aiming at detecting linear and nonlinear ground movement. These mines have presented a history of instability, and surface monitoring measurements over sectors of the mines (pit walls) have been carried out based on ground-based radar and total station (prisms). Using a priori information regarding the topographic phase error and a phase displacement model derived from DTS, temporal phase unwrapping in the PSI processing and the removal of the atmospheric phases can be performed more efficiently. A set of 33 TerraSAR-X (TSX-1) images, acquired during the period from March 2012 to April 2013, was used to perform this investigation. The DTS analysis was carried out on a stack of multilook unwrapped interferograms using an extension of SVD to obtain the least-squares solution. The height errors and deformation rates provided by the DTS approach were subtracted from the stack of interferograms to perform the PSI analysis. This procedure improved the capability of the PSI analysis to detect high rates of deformation and increased the point density of the final results. The proposed methodology showed good results for monitoring surface displacement in a large mining area located in a rain forest environment, providing very useful information about ground movement for planning and risk control.
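
    A highly simplified sketch, not the authors' processing chain, of the least-squares step used in small-baseline-style time-series analysis: each unwrapped interferogram constrains the displacement difference between its two acquisition dates, and the resulting linear system is solved with an SVD-based least-squares routine.

    ```python
    import numpy as np

    n_dates = 5  # number of SAR acquisitions; displacement at date 0 is fixed to zero
    # Hypothetical interferogram pairs (master, slave) and their displacements in mm
    pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]
    d_obs = np.array([2.0, 1.5, 3.6, 2.1, 1.8, 4.0])

    # Each row encodes displacement(slave) - displacement(master) for one interferogram
    A = np.zeros((len(pairs), n_dates - 1))
    for row, (m, s) in enumerate(pairs):
        if s > 0:
            A[row, s - 1] += 1.0
        if m > 0:
            A[row, m - 1] -= 1.0

    # np.linalg.lstsq solves the system via an SVD; result is the displacement at each date (mm)
    disp, *_ = np.linalg.lstsq(A, d_obs, rcond=None)
    print("cumulative displacement per date (mm):", np.round(disp, 2))
    ```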

  10. Vector Quantization of Harmonic Magnitudes in Speech Coding Applications—A Survey and New Technique

    Directory of Open Access Journals (Sweden)

    Wai C. Chu

    2004-12-01

    Full Text Available A harmonic coder extracts the harmonic components of a signal and represents them efficiently using a few parameters. The principles of harmonic coding have become quite successful, and several standardized speech and audio coders are based on them. One of the key issues in harmonic coder design is the quantization of harmonic magnitudes, for which many proposals have appeared in the literature. The objective of this paper is to provide a survey of the various techniques that have appeared in the literature for vector quantization of harmonic magnitudes, with emphasis on those adopted by the major speech coding standards; these include constant magnitude approximation, partial quantization, dimension conversion, and variable-dimension vector quantization (VDVQ). In addition, a refined VDVQ technique is proposed, and experimental data are provided to demonstrate its effectiveness.
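
    A minimal sketch of codebook-based vector quantization of fixed-dimension magnitude vectors using k-means; it illustrates the basic quantization step only, not the variable-dimension (VDVQ) refinement discussed in the paper.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    train = rng.random((1000, 16))  # hypothetical training vectors of harmonic magnitudes
    test = rng.random((5, 16))

    # Train a 64-entry codebook, then quantize each test vector to its nearest codeword
    codebook = KMeans(n_clusters=64, n_init=10, random_state=0).fit(train)
    indices = codebook.predict(test)                   # indices that a coder would transmit
    reconstructed = codebook.cluster_centers_[indices]
    mse = np.mean((test - reconstructed) ** 2)
    print("codebook indices:", indices, f"mean squared quantization error: {mse:.4f}")
    ```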

  11. Preliminary report on geophysics ground follow-up of the 1977 airborne survey in the Wadi Bidah District, Kingdom of Saudi Arabia

    Science.gov (United States)

    Flanigan, V.J.; Wynn, J.C.; Worl, R.G.; Smith, C.W.

    1981-01-01

    Reconnaissance geologic and geochemical sampling was carried out during the 1978 field season at most of the 50 or so electromagnetic anomalies detected in the 1977 airborne electromagnetic (AEM) survey of the Wadi Bidah district. These Phase 1 studies also included reconnaissance geophysical traverses of nine of the AEM conductors. In addition, the AEM anomalies were classified on the basis of this reconnaissance work into a list of priority targets for use in economic studies, and six AEM anomalies were selected for further study.

  12. Fast ground filtering for TLS data via Scanline Density Analysis

    Science.gov (United States)

    Che, Erzhuo; Olsen, Michael J.

    2017-07-01

    Terrestrial Laser Scanning (TLS) efficiently collects 3D information based on lidar (light detection and ranging) technology. TLS has been widely used in topographic mapping, engineering surveying, forestry, industrial facilities, cultural heritage, and so on. Ground filtering is a common procedure in lidar data processing, which separates the point cloud data into ground points and non-ground points. Effective ground filtering is helpful for subsequent procedures such as segmentation, classification, and modeling. Numerous ground filtering algorithms have been developed for Airborne Laser Scanning (ALS) data. However, many of these are error prone in application to TLS data because of its different angle of view and highly variable resolution. Further, many ground filtering techniques are limited in application within challenging topography and experience difficulty coping with some objects such as short vegetation, steep slopes, and so forth. Lastly, due to the large size of point cloud data, operations such as data traversing, multiple iterations, and neighbor searching significantly affect the computation efficiency. In order to overcome these challenges, we present an efficient ground filtering method for TLS data via a Scanline Density Analysis, which is very fast because it exploits the grid structure storing TLS data. The process first separates the ground candidates, density features, and unidentified points based on an analysis of point density within each scanline. Second, region growing using the scan pattern is performed to cluster the ground candidates and further refine the ground points (clusters). In the experiment, the effectiveness, parameter robustness, and efficiency of the proposed method are demonstrated with datasets collected from an urban scene and a natural scene, respectively.
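
    A toy sketch of the general idea, not the published algorithm: along a single TLS scanline, returns that are packed much more tightly than the line's typical spacing are likely to come from above-ground objects, while the remaining points are kept as ground candidates.

    ```python
    import numpy as np

    def scanline_ground_candidates(points, density_ratio=0.3):
        """points: (x, y, z) coordinates in scan order along ONE scanline.
        Returns a boolean mask; True marks points kept as ground candidates."""
        pts = np.asarray(points, dtype=float)
        spacing = np.linalg.norm(np.diff(pts, axis=0), axis=1)  # spacing between consecutive returns
        spacing = np.append(spacing, spacing[-1])                # pad so every point has a value
        typical = np.median(spacing)
        return spacing > density_ratio * typical                 # dense clusters -> non-ground

    # Hypothetical scanline: flat ground plus a dense vertical cluster (e.g. a trunk)
    line = [(x, 0.0, 0.0) for x in np.linspace(0, 10, 50)] + \
           [(5.0, 0.0, z) for z in np.linspace(0, 2, 40)]
    mask = scanline_ground_candidates(line)
    print(f"{mask.sum()} of {len(line)} points kept as ground candidates")
    ```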

  13. Survey of Techniques for Internet Continuous Media Multicast

    Institute of Scientific and Technical Information of China (English)

    杨明; 张福炎

    2002-01-01

    Continuous media multicast has been an important component of many networked services such as audio-visual broadcast and video conferencing. The problems of scalability, congestion control, heterogeneity and reliability, which confront Internet continuous media multicast, are presented first, and then an overview of the adaptive rate control schemes and techniques to solve these problems is given. Finally, we discuss some trends and unsolved issues in the field.

  14. Research on the Application of the Light-Energy Spectrum Fusion Technique to Land and Resources Survey

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper introduces how remote sensing images, including Landsat (MSS and TM) and airborne radioactivity images, can be used to identify the types of rocks in areas covered by vegetation. The relationship between the light spectrum (Landsat MSS and TM) and the energy spectrum (U, Th and K) is discussed on the basis of correlation analysis, and it is shown that there are correlations between the Landsat MSS or TM data and the U, Th and K data. By using the fusion technique, new images were generated which contain both the light spectrum and the energy spectrum information. Taking the Lucong basin as the study area, the present paper demonstrates the successful identification of various types of rocks using the fusion technique. Different types of rocks are represented by different colours on the new light-energy spectrum images, so that volcanic rocks of the Jurassic and Cretaceous periods can be discriminated. In another example, in the Lingquan basin in Northeast China, not only were the different types of rocks successfully distinguished, but corrections and modifications were also made to the original geological map after some field investigation. Practice has proven that the light-energy spectrum fusion technique is a good approach to lithologic identification.
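
    A minimal sketch, not the authors' workflow, of one simple way to fuse two co-registered raster layers for visual lithologic interpretation: a light-spectrum band and a radiometric channel are normalized and combined into a colour composite.

    ```python
    import numpy as np

    def normalize(band):
        band = band.astype(np.float32)
        return (band - band.min()) / (band.max() - band.min() + 1e-9)

    # Hypothetical co-registered rasters standing in for a Landsat TM band and a Th gamma-ray grid
    rng = np.random.default_rng(1)
    tm_band = rng.random((100, 100))
    th_grid = rng.random((100, 100))

    # Colour-composite fusion: optical band in red, radiometric channel in green,
    # their mean in blue, so units that differ in either data set separate visually
    rgb = np.dstack([normalize(tm_band), normalize(th_grid),
                     normalize(0.5 * (tm_band + th_grid))])
    print("fused composite shape:", rgb.shape)
    ```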

  15. Investigations of the ground-state hyperfine atomic structure and beta decay measurement prospects of 21Na with improved laser trapping techniques

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, Mary Anderson [Univ. of California, Berkeley, CA (United States)

    1999-05-01

    This thesis describes an experiment in which a neutral atom laser trap loaded with radioactive 21Na was improved and then used for measurements. The sodium isotope (half-life = 22 sec) is produced on-line at the 88-inch cyclotron at Lawrence Berkeley National Laboratory. The author developed an effective magnesium oxide target system, which is crucial for delivering a substantial beam of 21Na to the experiment. Efficient manipulation of the 21Na beam with lasers allowed 30,000 atoms to be contained in a magneto-optical trap. Using the cold trapped atoms, the author measured to high precision the hyperfine splitting of the atomic ground state of 21Na. She measured the 3S1/2(F=1,m=0)-3S1/2(F=2,m=0) atomic level splitting of 21Na to be 1,906,471,870±200 Hz. Additionally, she achieved initial detection of beta decay from the trap and evaluated the prospects for precision beta decay correlation studies with trapped atoms.

  16. Age Determination by Back Length for African Savanna Elephants: Extending Age Assessment Techniques for Aerial-Based Surveys

    Science.gov (United States)

    Trimble, Morgan J.; van Aarde, Rudi J.; Ferreira, Sam M.; Nørgaard, Camilla F.; Fourie, Johan; Lee, Phyllis C.; Moss, Cynthia J.

    2011-01-01

    Determining the age of individuals in a population can lead to a better understanding of population dynamics through age structure analysis and estimation of age-specific fecundity and survival rates. Shoulder height has been used to accurately assign age to free-ranging African savanna elephants. However, back length may provide an analog measurable in aerial-based surveys. We assessed the relationship between back length and age for known-age elephants in Amboseli National Park, Kenya, and Addo Elephant National Park, South Africa. We also compared age- and sex-specific back lengths between these populations and compared adult female back lengths across 11 widely dispersed populations in five African countries. Sex-specific Von Bertalanffy growth curves provided a good fit to the back length data of known-age individuals. Based on back length, accurate ages could be assigned relatively precisely for females up to 23 years of age and males up to 17. The female back length curve allowed more precise age assignment to older females than the curve for shoulder height does, probably because of divergence between the respective growth curves. However, this did not appear to be the case for males, but the sample of known-age males was limited to ≤27 years. Age- and sex-specific back lengths were similar in Amboseli National Park and Addo Elephant National Park. Furthermore, while adult female back lengths in the three Zambian populations were generally shorter than in other populations, back lengths in the remaining eight populations did not differ significantly, in support of claims that growth patterns of African savanna elephants are similar over wide geographic regions. Thus, the growth curves presented here should allow researchers to use aerial-based surveys to assign ages to elephants with greater precision than previously possible and, therefore, to estimate population variables. PMID:22028925
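
    A minimal sketch, on simulated data rather than the study's measurements, of fitting a Von Bertalanffy growth curve of the kind described here with SciPy; all numbers are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def von_bertalanffy(age, l_inf, k, t0):
        """Asymptotic growth: length approaches l_inf at rate k."""
        return l_inf * (1.0 - np.exp(-k * (age - t0)))

    # Simulated known-age animals (age in years, back length in cm)
    rng = np.random.default_rng(2)
    ages = rng.uniform(1, 30, 80)
    lengths = von_bertalanffy(ages, 220.0, 0.15, -3.0) + rng.normal(0, 5, ages.size)

    params, _ = curve_fit(von_bertalanffy, ages, lengths, p0=[200.0, 0.1, 0.0])
    print("fitted (L_inf, k, t0):", np.round(params, 3))
    ```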

  17. Age determination by back length for African savanna elephants: extending age assessment techniques for aerial-based surveys.

    Directory of Open Access Journals (Sweden)

    Morgan J Trimble

    Full Text Available Determining the age of individuals in a population can lead to a better understanding of population dynamics through age structure analysis and estimation of age-specific fecundity and survival rates. Shoulder height has been used to accurately assign age to free-ranging African savanna elephants. However, back length may provide an analog measurable in aerial-based surveys. We assessed the relationship between back length and age for known-age elephants in Amboseli National Park, Kenya, and Addo Elephant National Park, South Africa. We also compared age- and sex-specific back lengths between these populations and compared adult female back lengths across 11 widely dispersed populations in five African countries. Sex-specific Von Bertalanffy growth curves provided a good fit to the back length data of known-age individuals. Based on back length, accurate ages could be assigned relatively precisely for females up to 23 years of age and males up to 17. The female back length curve allowed more precise age assignment to older females than the curve for shoulder height does, probably because of divergence between the respective growth curves. However, this did not appear to be the case for males, but the sample of known-age males was limited to ≤27 years. Age- and sex-specific back lengths were similar in Amboseli National Park and Addo Elephant National Park. Furthermore, while adult female back lengths in the three Zambian populations were generally shorter than in other populations, back lengths in the remaining eight populations did not differ significantly, in support of claims that growth patterns of African savanna elephants are similar over wide geographic regions. Thus, the growth curves presented here should allow researchers to use aerial-based surveys to assign ages to elephants with greater precision than previously possible and, therefore, to estimate population variables.

  18. Site Selection for Hvdc Ground Electrodes

    Science.gov (United States)

    Freire, P. F.; Pereira, S. Y.

    2014-12-01

    High-Voltage Direct Current (HVDC) transmission systems are composed of a bipole transmission line with a converter substation at each end. Each substation may be equipped with an HVDC ground electrode, which is a wide-area (up to 1 km in diameter) and deep (from 3 to 100 m) electrical grounding. In normal operation, the ground electrode dissipates in the soil the unbalance current of the bipole (~1.5% of the rated current). In monopolar operation with ground return, the HVDC electrode injects into the soil the nominal pole current, of about 2000 to 3000 amperes, continuously for a period of up to a few hours. Site selection for HVDC ground electrodes is based on extensive geophysical and geological surveys, in order to meet the design requirements established for the electrodes, considering both their operational conditions (maximum soil temperature, working life, local soil voltage gradients, etc.) and the interference effects on installations located up to 50 km away. This poster presents the geophysical investigations conducted primarily for electrode site selection, and subsequently for the development of the crust resistivity model, which will be used for the interference studies. A preliminary site selection is conducted, based on general geographical and geological criteria. Subsequently, the geology of each chosen area is surveyed in detail by means of electromagnetic/electrical geophysical techniques, such as magnetotelluric (deep), TDEM (near-surface) and electroresistivity (shallow) methods. Other complementary geologic and geotechnical surveys are conducted, such as well drilling (for geotechnical characterization, measurement of the water table depth and water flow, and electromagnetic profiling) and soil and water sampling (for measurement of thermal parameters and evaluation of electro-osmosis risk). The site evaluation is a dynamic process along the surveys, and some sites will be discarded. For the two or three final sites, the

  19. Fehlende Daten beim Record Linkage von Prozess- und Befragungsdaten : ein empirischer Vergleich ausgewählter Missing Data Techniken (Missing data in the record linkage of process and survey data : An empirical comparison of selected missing data techniques)

    OpenAIRE

    Krug, Gerhard

    2009-01-01

    "To compare different missing data techniques, in this paper I use a survey where participants were among other things asked permission for combining the survey with administrative data (record linkage). For those who refuse their permission I set their survey answers to missing, creating pseudo-missing data due to an empirical relevant but unknown mechanism (compared to the statistical simulation of a missing data process). OLS Regression is performed using Complete Case Analysis (CCA), Mult...

  20. Wind-induced ground motion

    Science.gov (United States)

    Naderyan, Vahid; Hickey, Craig J.; Raspet, Richard

    2016-02-01

    Wind noise is a problem in seismic surveys and can mask the seismic signals at low frequency. This research investigates ground motions caused by wind pressure and shear stress perturbations on the ground surface. A prediction of the ground displacement spectra using the measured ground properties and predicted pressure and shear stress at the ground surface is developed. Field measurements are conducted at a site having a flat terrain and low ambient seismic noise. Triaxial geophones are deployed at different depths to study the wind-induced ground vibrations as a function of depth and wind velocity. Comparison of the predicted to the measured wind-induced ground displacement spectra shows good agreement for the vertical component but significant underprediction for the horizontal components. To validate the theoretical model, a test experiment is designed to exert controlled normal pressure and shear stress on the ground using a vertical and a horizontal mass-spring apparatus. This experiment verifies the linear elastic rheology and the quasi-static displacements assumptions of the model. The results indicate that the existing surface shear stress models significantly underestimate the wind shear stress at the ground surface and the amplitude of the fluctuation shear stress must be of the same order of magnitude as the normal pressure. Measurement results show that mounting the geophones flush with the ground provides a significant reduction in wind noise on all three components of the geophone. Further reduction in wind noise with depth of burial is small for depths up to 40 cm.

  1. Estimates of evapotranspiration for riparian sites (Eucalyptus) in the Lower Murray -Darling Basin using ground validated sap flow and vegetation index scaling techniques

    Science.gov (United States)

    Doody, T.; Nagler, P. L.; Glenn, E. P.

    2014-12-01

    Water accounting is becoming critical globally, and balancing consumptive water demands with environmental water requirements is especially difficult in arid and semi-arid regions. Within the Murray-Darling Basin (MDB) in Australia, riparian water use has not been assessed across broad scales. This study therefore aimed to apply and validate an existing U.S. riparian ecosystem evapotranspiration (ET) algorithm for the MDB river systems, to assist water resource managers in quantifying environmental water needs over a wide range of niche conditions. Ground-based sap flow ET was correlated with remotely sensed predictions of ET to provide a method to scale annual rates of water consumption by riparian vegetation over entire irrigation districts. Sap flux was measured at nine locations on the Murrumbidgee River between July 2011 and June 2012. Remotely sensed ET was calculated using a combination of local meteorological estimates of potential ET (ETo) and rainfall and the MODIS Enhanced Vegetation Index (EVI) from selected 250 m resolution pixels. The sap flow data correlated well with MODIS EVI. Sap flow ranged from 0.81 mm/day to 3.60 mm/day and corresponded to a MODIS-based ET range of 1.43 mm/day to 2.42 mm/day. We found that mean ET across sites could be predicted by EVI-ETo methods with a standard error of about 20%, but that ET at any given site could vary much more due to differences in aquifer and soil properties among sites. Water use was within the expected range. We conclude that our algorithm developed for US arid land crops and riparian plants is applicable to this region of Australia. Future work includes the development of an adjusted algorithm using these sap flow validated results.
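
    A minimal sketch, with made-up numbers, of the kind of scaling regression described here: site-level sap-flow ET is regressed against a remotely sensed predictor (EVI multiplied by ETo) so that the fitted relation can be applied across a district.

    ```python
    import numpy as np

    # Hypothetical site-level data: MODIS EVI, potential ET (mm/day) and sap-flow ET (mm/day)
    evi = np.array([0.25, 0.30, 0.35, 0.40, 0.45, 0.50, 0.55, 0.60, 0.32])
    eto = np.array([5.0, 5.5, 6.0, 4.5, 6.5, 7.0, 6.0, 5.8, 5.2])
    et_sap = np.array([0.9, 1.3, 1.6, 1.4, 2.2, 2.6, 2.5, 2.7, 1.3])

    # Fit ET ~ a * (EVI * ETo) + b and report the scatter around the fit
    x = evi * eto
    a, b = np.polyfit(x, et_sap, 1)
    pred = a * x + b
    rel_err = np.std(et_sap - pred) / np.mean(et_sap)
    print(f"ET ~ {a:.2f} * EVI*ETo {b:+.2f}; relative scatter ~ {rel_err:.0%}")
    ```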

  2. Civil engineering applications of ground penetrating radar

    CERN Document Server

    Pajewski, Lara

    2015-01-01

    This book, based on Transport and Urban Development COST Action TU1208, presents the most advanced applications of ground penetrating radar (GPR) in a civil engineering context, with documentation of instrumentation, methods, and results. It explains clearly how GPR can be employed for the surveying of critical transport infrastructure, such as roads, pavements, bridges, and tunnels, and for the sensing and mapping of underground utilities and voids. Detailed attention is also devoted to use of GPR in the inspection of geological structures and of construction materials and structures, including reinforced concrete, steel reinforcing bars, and pre/post-tensioned stressing ducts. Advanced methods for solution of electromagnetic scattering problems and new data processing techniques are also presented. Readers will come to appreciate that GPR is a safe, advanced, nondestructive, and noninvasive imaging technique that can be effectively used for the inspection of composite structures and the performance of diagn...

  3. Level of carbon dioxide diffuse degassing from the ground of Vesuvio: comparison between extensive surveys and inferences on the gas source

    Directory of Open Access Journals (Sweden)

    Domenico Granieri

    2013-11-01

    Full Text Available An extensive campaign of diffuse CO2 soil flux measurement was carried out at the cone of Vesuvio in October 2006 with two main objectives: (1) to provide an estimate of the CO2 diffusely discharged through the soils in the summit area and (2) to identify those sectors of the volcano where structural and morphological conditions could favour the gas output. The survey consisted of 502 measurements of soil CO2 flux homogeneously distributed over an area of about 1.8 km2. Results of this survey were compared with those obtained during a similar campaign carried out by Frondini et al. in 2000, from which we have taken and reinterpreted a subset of data belonging to the common investigated area. Graphical statistical analysis showed three overlapping populations in both surveys, evidencing the contribution of three different sources feeding the soil CO2 degassing process. The overall CO2 emission pattern of 2006 is coherent with that observed in 2000 and suggests that a value between 120 and 140 t/day of CO2 is representative of the total CO2 discharged by diffuse degassing from the summit area of Vesuvio. The preferential exhaling area lies in the inner crater, whose contribution amounted to 45.3% of the total CO2 emission in 2006 (62.8 t/day) and 57.4% in 2000 (70.3 t/day), although its extent is only 13% of the investigated area. This highly emissive area correlates closely with the structural discontinuities of the Vesuvio cone, mainly suggesting that the NW-SE trending tectonic line is actually an active fault leaking deep gas to the bottom of the crater. The drainage action of the fault could be enhanced by the "aspiration" effect of the volcanic conduit.

  4. Application of capillary gas chromatography mass spectrometry/computer techniques to synoptic survey of organic material in bed sediment

    Science.gov (United States)

    Steinheimer, T.R.; Pereira, W.E.; Johnson, S.M.

    1981-01-01

    A bed sediment sample taken from an area impacted by heavy industrial activity was analyzed for organic compounds of environmental significance. Extraction was effected on a Soxhlet apparatus using a freeze-dried sample. The Soxhlet extract was fractionated by silica gel micro-column adsorption chromatography. Separation and identification of the organic compounds was accomplished by capillary gas chromatography/mass spectrometry techniques. More than 50 compounds were identified; these include saturated hydrocarbons, olefins, aromatic hydrocarbons, alkylated polycyclic aromatic hydrocarbons, and oxygenated compounds such as aldehydes and ketones. The role of bed sediments as a source or sink for organic pollutants is discussed. ?? 1981.

  5. Teaching research methods in nursing using Aronson's Jigsaw Technique. A cross-sectional survey of student satisfaction.

    Science.gov (United States)

    Leyva-Moral, Juan M; Riu Camps, Marta

    2016-05-01

    To adapt nursing studies to the European Higher Education Area, new teaching methods have been introduced that assign maximum importance to student-centered learning and collaborative work. The Jigsaw Technique is based on collaborative learning, and everyone in the group must play their part because each student's mark depends on the other students. Home group members are given the responsibility to become experts in a specific area of knowledge. Experts meet together to reach an agreement and improve skills. Finally, experts return to their home groups to share all their findings. The aim of this study was to evaluate nursing student satisfaction with the Jigsaw Technique used in the context of a compulsory course in research methods for nursing. A cross-sectional study was conducted using an anonymous self-administered questionnaire given to students who completed the Research Methods course during the 2012-13 and 2013-14 academic years. The questionnaire was developed taking into account the learning objectives, competencies and skills that should be acquired by students, as described in the course syllabus. The responses were compared by age group (younger or older than 22 years). A total of 89.6% of nursing students under 22 years believed that this methodology helped them to develop teamwork, while this figure was 79.6% in older students. Nursing students also believed it helped them to work independently, with differences according to age: 79.7% and 58%, respectively (p=0.010). Students disagreed with the statement "The Jigsaw Technique involves little workload", with percentages of 88.5% in the group under 22 years and 80% in older students. Most believed that this method should not be employed in upcoming courses, although there were differences by age, with 44.3% of the younger group being against it versus 62% of the older group (p=0.037). The method was not highly valued by students, mainly by those older than 22 years, who concluded that they did not learn

  6. Improving the sampling strategy of the Joint Danube Survey 3 (2013) by means of multivariate statistical techniques applied on selected physico-chemical and biological data.

    Science.gov (United States)

    Hamchevici, Carmen; Udrea, Ion

    2013-11-01

    The concept of the basin-wide Joint Danube Survey (JDS) was launched by the International Commission for the Protection of the Danube River (ICPDR) as a tool for investigative monitoring under the Water Framework Directive (WFD), with a frequency of 6 years. The first JDS was carried out in 2001, and its success in providing key information for the characterisation of the Danube River Basin District as required by the WFD led to the organisation of the second JDS in 2007, which was the world's biggest river research expedition in that year. The present paper presents an approach for improving the survey strategy for the next planned survey, JDS3 (2013), by means of several multivariate statistical techniques. In order to design the optimum structure in terms of parameters and sampling sites, principal component analysis (PCA), factor analysis (FA) and cluster analysis were applied to JDS2 data for 13 selected physico-chemical elements and one biological element measured at 78 sampling sites located on the main course of the Danube. Results from PCA/FA showed that most of the dataset variance (above 75%) was explained by five varifactors loaded with 8 out of 14 variables: physical (transparency and total suspended solids), relevant nutrients (N-nitrates and P-orthophosphates), feedback effects of primary production (pH, alkalinity and dissolved oxygen) and algal biomass. Taking into account the representation of the factor scores given by FA versus sampling sites and the major groups generated by the clustering procedure, the spatial network of the next survey could be carefully tailored, leading to a decrease in the number of sampling sites of more than 30%. A target-oriented sampling strategy based on the selected multivariate statistics can provide a strong reduction in the dimensionality of the original data, and in the corresponding costs as well, without any loss of information.
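
    A minimal sketch, on synthetic data, of the kind of dimensionality-reduction and clustering workflow described here: principal component analysis to find the dominant factors among the measured variables, followed by clustering of sampling sites.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    data = rng.random((78, 14))  # hypothetical matrix: 78 sites x 14 measured variables

    X = StandardScaler().fit_transform(data)
    pca = PCA(n_components=5).fit(X)
    site_scores = pca.transform(X)

    # Cluster sites on the retained components to suggest groups that could share fewer stations
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(site_scores)
    print(f"variance explained by 5 components: {pca.explained_variance_ratio_.sum():.0%};",
          "cluster sizes:", np.bincount(labels))
    ```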

  7. Teachers of the Alexander Technique in the UK and the people who take their lessons: A national cross-sectional survey.

    Science.gov (United States)

    Eldred, J; Hopton, A; Donnison, E; Woodman, J; MacPherson, H

    2015-06-01

    Given the rising profile of the Alexander Technique in the UK, there is a need for a comprehensive description of its teachers and of those who currently take lessons. In a national survey of Alexander teachers, we set out to address this information gap. A cross-sectional survey of 871 UK members of three main Alexander Technique teachers' professional associations was conducted. A questionnaire requested information about their professional background, teaching practice and methods, and about the people who attend lessons and their reasons for seeking help. With an overall response rate of 61%, 534 teachers responded; 74% were female with median age of 58 years, 60% had a higher education qualification, and 95% were self-employed, many with additional non-Alexander paid employment. The majority (87%) offered lessons on their own premises or in a privately rented room, and 19% provided home visits; both individual and group lessons were provided. People who took lessons were predominantly female (66%) with a median age of 48 years, and 91% paid for their lessons privately. Nearly two-thirds (62%) began lessons for reasons related to musculoskeletal conditions, including back symptoms, posture, neck pain, and shoulder pain. Other reasons were general (18%, including well-being), performance-related (10%, including voice-, music-, and sport-related), psychological (5%) and neurological (3%). We estimate that Alexander teachers in the UK provide approximately 400,000 lessons per year. This study provides an overview of Alexander Technique teaching in the UK today and data that may be useful when planning future research. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Visual servoing in medical robotics: a survey. Part I: endoscopic and direct vision imaging - techniques and applications.

    Science.gov (United States)

    Azizian, Mahdi; Khoshnam, Mahta; Najmaei, Nima; Patel, Rajni V

    2014-09-01

    Intra-operative imaging is widely used to provide visual feedback to a clinician when he/she performs a procedure. In visual servoing, surgical instruments and parts of tissue/body are tracked by processing the acquired images. This information is then used within a control loop to manoeuvre a robotic manipulator during a procedure. A comprehensive search of electronic databases was completed for the period 2000-2013 to provide a survey of the visual servoing applications in medical robotics. The focus is on medical applications where image-based tracking is used for closed-loop control of a robotic system. Detailed classification and comparative study of various contributions in visual servoing using endoscopic or direct visual images are presented and summarized in tables and diagrams. The main challenges in using visual servoing for medical robotic applications are identified and potential future directions are suggested. 'Supervised automation of medical robotics' is found to be a major trend in this field. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Determination of hydraulic conductivity in three dimensions and its relation to dispersivity: Chapter D in Ground-water contamination by crude oil at the Bemidji, Minnesota, research site; US Geological Survey Toxic Waste--ground-water contamination study

    Science.gov (United States)

    1984-01-01

    Recent investigations suggest that dispersion in aquifers is scale dependent and a function of the heterogeneity of aquifer materials. Theoretical stochastic studies indicate that determining hydraulic-conductivity variability in three dimensions is important in analyzing the dispersion process. Even though field methods are available to approximate hydraulic conductivity in three dimensions, the methods are not generally used because of high cost of field equipment and because measurement and analysis techniques are cumbersome and time consuming. The hypothesis of this study is that field-determined values of dispersivity are scale dependent and that they may be described as a function of hydraulic conductivity in three dimensions. The objectives of the study at the Bemidji research site are to (1) determine hydraulic conductivity of the porous media in three dimensions, (2) determine field values of dispersivity and its scale dependence on hydraulic conductivity, and (3) develop and apply a computerized data-collection, storage, and analysis system for field use in comprehensive determination of hydraulic conductivity and dispersivity. Plans for this investigation involve a variety of methods of analysis. Hydraulic conductivity will be determined separately in the horizontal and vertical planes of the hydraulic-conductivity ellipsoid. Field values of dispersivity will be determined by single-well and doublet-well injection or withdrawal tests with tracers. A computerized data-collection, storage, and analysis system to measure pressure, flow rate, tracer concentrations, and temperature will be designed for field testing. Real-time computer programs will be used to analyze field data. The initial methods of analysis will be utilized to meet the objectives of the study. Preliminary field data indicate the aquifer underlying the Bemidji site is vertically heterogeneous, cross-bedded outwash. Preliminary analysis of the flow field around a hypothetical doublet

  10. Investigation on pitch control technique for an airfoil with ground effect

    Institute of Scientific and Technical Information of China (English)

    沈冬; 陈迎春; 张彬乾

    2011-01-01

    To address the pitch control problem of flying-wing configurations, a novel and simple pitch control effector called a belly-flap is investigated in this paper as a way to increase the nose-up pitching moment efficiently without unacceptable lift loss. Two-dimensional Reynolds-averaged Navier-Stokes calculations for various belly-flap configurations applied to an airfoil are described, and results are presented. The geometric parameters (size, location, deflection angle), ground effect and angle of attack are studied to determine their effects on the aerodynamic characteristics and the associated flow mechanism, and design guidelines for the flap are given. The results show that an optimized belly-flap can improve aerodynamic performance in take-off and landing conditions by producing clear increments in nose-up pitching moment and lift. To address the additional drag increment that accompanies the belly-flap, holes were introduced in the flap surface. The results show that these holes decrease the additional drag and increase the lift-to-drag ratio, with only a small reduction in the nose-up moment. In conclusion, the belly-flap is an effective pitch control effector for the blended-wing-body in take-off and landing conditions, and it merits further study.

  11. Comparison of Satellite Surveying to Traditional Surveying Methods for the Resources Industry

    Science.gov (United States)

    Osborne, B. P.; Osborne, V. J.; Kruger, M. L.

    Modern ground-based survey methods involve detailed surveying, which provides three-space co-ordinates for surveyed points to a high level of accuracy. The instruments are operated by surveyors, who process the raw results to create survey location maps for the subject of the survey. Such surveys are conducted for a location or region and referenced to the earth's global co-ordinate system with global positioning system (GPS) positioning. Due to this referencing, the survey is only as accurate as the GPS reference system. Satellite remote sensing surveys utilise satellite imagery that has been processed using commercial geographic information system software. Three-space co-ordinate maps are generated, with an accuracy determined by the datum position accuracy and the optical resolution of the satellite platform. This paper presents a case study which compares topographic surveying undertaken by traditional survey methods with satellite surveying for the same location. The purpose of this study is to assess the viability of satellite remote sensing for surveying in the resources industry. The case study involves a topographic survey of a dune field for a prospective mining project area in Pakistan. This site has been surveyed using modern surveying techniques and the results are compared to a satellite survey performed on the same area. Analysis of the results from the traditional survey and from the satellite survey involved a comparison of the derived spatial co-ordinates from each method. In addition, comparisons have been made of costs and turnaround time for both methods. The results of this application of remote sensing are of particular interest for surveying in areas with remote and extreme environments, weather extremes, political unrest, or poor travel links, which are commonly associated with mining projects. Such areas frequently suffer from language barriers, poor on-site technical support and limited resources.

  12. A survey and proposed framework on the soft biometrics technique for human identification in intelligent video surveillance system.

    Science.gov (United States)

    Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum

    2012-01-01

    Biometrics verification can be efficiently used for intrusion detection and intruder identification in video surveillance systems. Biometric techniques can be largely divided into traditional and the so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as face features, eye iris, and fingerprints, soft biometrics is concerned with such information as gender, national origin, and height. Traditional biometrics is versatile and highly accurate, but it is very difficult to obtain traditional biometric data from a distance and without personal cooperation. Soft biometrics, although less accurate, can be used much more freely. Recently, much research has been carried out on human identification using soft biometric data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for solving such problems as lighting, occlusion, and shadowing.

  13. A Survey and Proposed Framework on the Soft Biometrics Technique for Human Identification in Intelligent Video Surveillance System

    Directory of Open Access Journals (Sweden)

    Min-Gu Kim

    2012-01-01

    Full Text Available Biometrics verification can be efficiently used for intrusion detection and intruder identification in video surveillance systems. Biometric techniques can be largely divided into traditional and the so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as face features, eye iris, and fingerprints, soft biometrics is concerned with such information as gender, national origin, and height. Traditional biometrics is versatile and highly accurate, but it is very difficult to obtain traditional biometric data from a distance and without personal cooperation. Soft biometrics, although less accurate, can be used much more freely. Recently, much research has been carried out on human identification using soft biometric data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for solving such problems as lighting, occlusion, and shadowing.

  14. Surveying your internal customers.

    Science.gov (United States)

    Weir, V L

    1998-06-01

    Internal customers often are overlooked when business techniques are applied. By applying common external customer satisfaction survey techniques to internal business functions, one hospital identified areas for improvement.

  15. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    Science.gov (United States)

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  16. Source mechanism analysis of strong mining induced seismic event and its influence on ground deformation observed by InSAR technique.

    Science.gov (United States)

    Rudzinski, Lukasz; Mirek, Katarzyna; Mirek, Janusz

    2016-04-01

    On April 17th, 2015, a strong shallow seismic event of M4.0 struck a mining panel in the Wujek-Slask coal mine, southern Poland. The event was widely felt, was followed by a rockburst and caused strong damage inside the mining corridors. Unfortunately, two miners were trapped by the tunnel collapse. The full Moment Tensor (MT) estimated with regional broad-band signals shows that the event was characterized by a very high isotropic (implosive) component. Mining inspections verified the occurrence of a rockfall and floor uplift. The very shallow focal depth (less than 1000 m) and the collapse-like MT solution suggest that the event could be responsible for surface deformation in the vicinity of the epicenter. To verify this we used the Interferometric Synthetic Aperture Radar (InSAR) technique. InSAR relies on measuring phase differences between two SAR images (radarograms). The measured differences may be computed into a single interferometric image, i.e. an interferogram. An interferogram computed from two radarograms of the same terrain taken at different times allows changes in the elevation of the terrain to be detected. Two SAR scenes acquired by the Sentinel-1 satellite (European Space Agency) were processed to obtain an interferogram covering the study area (12.04.2015 and 24.04.2015). The 12-day differential interferogram shows a distinctive concentric feature which indicates a subsidence trough. The subsidence pattern shows one cycle of deformation, corresponding to about 2.5 cm of subsidence. The InSAR solution supports the reliability of the strongly implosive MT component.
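
    A minimal sketch, with assumed values, of the basic conversion behind an interferogram-derived subsidence estimate: differential phase is turned into line-of-sight displacement using the radar wavelength (about 5.55 cm for Sentinel-1 C-band), so one full fringe corresponds to roughly half a wavelength of motion.

    ```python
    import numpy as np

    wavelength_m = 0.0555   # approximate Sentinel-1 C-band wavelength (m)
    phase_cycles = 1.0      # one full fringe observed over the trough (illustrative)
    phase_rad = phase_cycles * 2.0 * np.pi

    # Differential InSAR: phase = 4*pi*displacement/wavelength along the line of sight
    los_displacement_m = phase_rad * wavelength_m / (4.0 * np.pi)
    print(f"line-of-sight displacement ~ {los_displacement_m * 100:.1f} cm per fringe")
    ```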

  17. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    Science.gov (United States)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    A condition monitoring system (CMS) offers substantial potential economic benefits and enables prognostic maintenance for wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive train CMS, as it enables the early detection of impending failure or damage. In variable speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient for diagnosing machine faults under time-varying conditions. Current research in CMS for drive trains focuses on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false alarm rates. In the literature, stationary signal processing algorithms employed in vibration analysis have been reviewed at great length. In this paper, an attempt is made to review the recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable speed wind turbines.
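
    A minimal sketch of one widely used non-stationary analysis step, a short-time Fourier transform of a simulated variable-speed vibration signal using SciPy; this is only one of the many techniques the survey covers.

    ```python
    import numpy as np
    from scipy.signal import stft

    fs = 2000.0                               # sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    # Simulated drive-train vibration whose dominant frequency ramps from 10 Hz to 30 Hz
    inst_freq = 10 + 2 * t
    x = np.sin(2 * np.pi * np.cumsum(inst_freq) / fs) + 0.1 * np.random.randn(t.size)

    # Short-time Fourier transform: frequency content as a function of time
    f, seg_times, Z = stft(x, fs=fs, nperseg=1024)
    dominant = f[np.abs(Z).argmax(axis=0)]    # dominant frequency in each time segment
    print("dominant frequency per segment (Hz):", np.round(dominant[:5], 1))
    ```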

  18. Advances in bio-tactile sensors for minimally invasive surgery using the fibre Bragg grating force sensor technique: a survey.

    Science.gov (United States)

    Abushagur, Abdulfatah A G; Arsad, Norhana; Reaz, Mamun Ibne; Bakar, A Ashrif A

    2014-04-09

    Interest in utilising fibre Bragg grating (FBG) strain sensors for minimally invasive surgery (MIS) applications, to replace conventional electrical tactile sensors, has grown considerably in the past few years. FBG strain sensors offer the advantages of optical fibre sensors, such as high sensitivity, immunity to electromagnetic noise, electrical passivity and chemical inertness, and are not limited by phase discontinuity or intensity fluctuations. FBG sensors feature a wavelength-encoded sensing signal that enables distributed sensing with fewer connections. In addition, their flexibility and lightness allow easy insertion into needles and catheters, thus enabling localised measurements inside tissues and blood. Two types of FBG tactile sensors have been emphasised in the literature: single-point and array FBG tactile sensors. This paper describes the current design, development and research of optical fibre tactile techniques based on FBGs to enhance the performance of MIS procedures in general. Providing MIS or microsurgery surgeons with accurate and precise measurement and control of the contact forces during tissue manipulation will benefit both surgeons and patients.

  19. Ground cultivation techniques of shiitake mushroom in high mountainous areas of southern China in the summer season

    Institute of Scientific and Technical Information of China (English)

    钟松花

    2015-01-01

    This paper summarizes ground cultivation techniques for shiitake mushroom in high mountainous areas of southern China in the summer season, including site selection, variety selection, cultivation timing, production technology, management during mycelial growth, mushroom-house management, control of diseases and pests, and harvest.

  20. Study on Ozone Oxidation Technique for the Treatment of Oil-Polluted Ground Water

    Institute of Scientific and Technical Information of China (English)

    于勇; 谢天强; 蔺延项; 鲍万民

    2001-01-01

    The ozone oxidation technique can be used for ground water with a high oil content. Tests show that ozone has an obvious effect on the removal of pollutants such as benzene compounds and fused-ring compounds; the optimum ozone dose is 7 mg/L and the ozone contact time should be 2 days.