WorldWideScience

Sample records for surveys techniques based

  1. A survey on OFDM channel estimation techniques based on denoising strategies

    Directory of Open Access Journals (Sweden)

    Pallaviram Sure

    2017-04-01

    Channel estimation forms the heart of any orthogonal frequency division multiplexing (OFDM) based wireless communication receiver. Frequency domain pilot aided channel estimation techniques are either least squares (LS) based or minimum mean square error (MMSE) based. LS based techniques are computationally less complex and, unlike MMSE ones, do not require a priori knowledge of channel statistics (KCS). However, the mean square error (MSE) performance of channel estimators incorporating MMSE based techniques is better than that obtained with LS based techniques. To enhance the MSE performance of LS based techniques, a variety of denoising strategies have been developed in the literature, which are applied to the LS estimated channel impulse response (CIR). The advantage of denoising-threshold-based LS techniques is that they do not require KCS but still render near-optimal MSE performance, similar to MMSE based techniques. In this paper, a detailed survey of various existing denoising strategies, with a comparative discussion of these strategies, is presented.
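
    The core idea described above can be sketched in a few lines: form the LS estimate at the pilots, take its IFFT to obtain a noisy CIR, zero the taps below a noise-dependent threshold, and transform back. The pilot layout, noise level and threshold rule below are illustrative assumptions, not taken from any particular paper in the survey.

```python
import numpy as np

# Minimal sketch: LS channel estimation at pilots, then denoising by
# hard-thresholding the LS-estimated CIR (illustrative threshold rule).
rng = np.random.default_rng(0)
N = 64                                   # number of pilot subcarriers (assumption)
L = 8                                    # true channel length in taps (assumption)

h = (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2 * L)   # true CIR
H = np.fft.fft(h, N)                     # true channel frequency response
X = np.exp(1j * 2 * np.pi * rng.integers(0, 4, N) / 4)                # QPSK pilots
noise = 0.05 * (rng.normal(size=N) + 1j * rng.normal(size=N))
Y = X * H + noise                        # received pilot observations

# Least-squares estimate: no knowledge of channel statistics required.
H_ls = Y / X
h_ls = np.fft.ifft(H_ls)                 # LS-estimated CIR (length N)

# Denoising: keep only taps above a noise-dependent threshold (assumed rule).
sigma2 = np.mean(np.abs(h_ls[N // 2:]) ** 2)   # crude noise estimate from "empty" taps
threshold = 3.0 * np.sqrt(sigma2)
h_den = np.where(np.abs(h_ls) > threshold, h_ls, 0.0)

H_den = np.fft.fft(h_den)
mse_ls = np.mean(np.abs(H_ls - H) ** 2)
mse_den = np.mean(np.abs(H_den - H) ** 2)
print(f"MSE (plain LS): {mse_ls:.4e}, MSE (threshold-denoised LS): {mse_den:.4e}")
```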

  2. Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not based on probability schemes

    NARCIS (Netherlands)

    Toepoel, V.; Emerson, Hannah

    2017-01-01

    Weighting techniques in web surveys that are not based on probability schemes are devised to correct biases due to self-selection, undercoverage, and nonresponse. In an interactive panel, 38 survey experts addressed weighting techniques and auxiliary variables in web surveys. Most of them corrected all biases

  3. Knowledge based systems: A critical survey of major concepts, issues and techniques. Visuals

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-9. The objectives of the report are to: examine various techniques used to build the KBS; to examine at least one KBS in detail, i.e., a case study; to list and identify limitations and problems with the KBS; to suggest future areas of research; and to provide extensive reference materials.

  4. Radon survey techniques

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    The report reviews radon measurement surveys in soils and in water. Special applications, advantages, and limitations of the radon measurement techniques are considered. The working group also gives some directions for further research in this field.

  5. Conducting Web-based Surveys.

    OpenAIRE

    David J. Solomon

    2001-01-01

    Web-based surveying is becoming widely used in social science and educational research. The Web offers significant advantages over more traditional survey techniques; however, there are still serious methodological challenges with this approach. Currently, coverage bias (the fact that significant numbers of people do not have access to, or choose not to use, the Internet) is of most concern to researchers. Survey researchers also have much to learn concerning the most effective ways to conduct s...

  6. Complete Denture Impression Techniques Practiced by Private Dental Practitioners: A Survey

    OpenAIRE

    Kakatkar, Vinay R.

    2012-01-01

    Impression making is an important step in fabricating complete dentures. A survey was conducted to determine the materials used and techniques practiced while recording complete denture impressions. It is disheartening that 33% of practitioners still use base plate custom trays to record final impressions, and 8% still use alginate for final impressions. An acceptable technique for recording complete denture impressions is suggested.

  7. Survey of semantic modeling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Smith, C.L.

    1975-07-01

    Numerous modeling techniques have been applied to the analysis of the semantics of programming languages. By providing a brief survey of these techniques together with an analysis of their applicability for answering semantic issues, this report attempts to illuminate the state of the art in this area. The intent is to be illustrative rather than thorough in the coverage of semantic models. A bibliography is included for the reader who is interested in pursuing this area of research in more detail.

  8. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an "Image Processing System" was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  9. Modern Surveying Techniques In National Infrastructural ...

    African Journals Online (AJOL)

    Journal of Research in National Development ... Modern Surveying Techniques In National Infrastructural Development: Case Study Of Roads ... Ways in which remote sensing helps to make highway construction easier were discussed.

  10. UAS Mapping as an alternative for land surveying techniques?

    Directory of Open Access Journals (Sweden)

    L. Devriendt

    2014-03-01

    Can a UAS mapping technique compete with standard surveying techniques? Since the boom in different RPAS (remotely piloted air systems), UAVs (unmanned aerial vehicles), or UAS (unmanned aerial systems), this is one of the crucial questions when it comes to UAS mapping. What matters is not look and feel but the reliability, ease of use, and accuracy that you get with a system based on hardware and corresponding software. This was also one of the questions that the Dutch Land Registry raised a few months ago, aimed at achieving an effective and usable system for updating property boundaries in new-build districts. Orbit GT gave them a ready-made answer: a definitive outcome based on years of research and development in UAS mapping technology and software.

  11. Web-based Surveys: Changing the Survey Process

    OpenAIRE

    Gunn, Holly

    2002-01-01

    Web-based surveys are having a profound influence on the survey process. Unlike other types of surveys, Web page design skills and computer programming expertise play a significant role in the design of Web-based surveys. Survey respondents face new and different challenges in completing a Web-based survey. This paper examines the different types of Web-based surveys, the advantages and challenges of using Web-based surveys, the design of Web-based surveys, and the issues of validity, error, ...

  12. Timing and technique impact the effectiveness of road-based, mobile acoustic surveys of bats.

    Science.gov (United States)

    D'Acunto, Laura E; Pauli, Benjamin P; Moy, Mikko; Johnson, Kiara; Abu-Omar, Jasmine; Zollner, Patrick A

    2018-03-01

    Mobile acoustic surveys are a common method of surveying bat communities. However, there is a paucity of empirical studies exploring different methods for conducting mobile road surveys of bats. During 2013, we conducted acoustic mobile surveys on three routes in north-central Indiana, U.S.A., using (1) a standard road survey, (2) a road survey where the vehicle stopped for 1 min at every half mile of the survey route (called a "start-stop method"), and (3) a road survey with an individual using a bicycle. Linear mixed models with multiple comparison procedures revealed that when all bat passes were analyzed, using a bike to conduct mobile surveys detected significantly more bat passes per unit time compared to other methods. However, incorporating genus-level comparisons revealed no advantage to using a bike over vehicle-based methods. We also found that survey method had a significant effect when analyses were limited to those bat passes that could be identified to genus, with the start-stop method generally detecting more identifiable passes than the standard protocol or bike survey. Additionally, we found that significantly more identifiable bat passes (particularly those of the Eptesicus and Lasiurus genera) were detected in surveys conducted immediately following sunset. As governing agencies, particularly in North America, implement vehicle-based bat monitoring programs, it is important for researchers to understand how variations on protocols influence the inference that can be gained from different monitoring schemes.
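
    The analysis described (bat passes modeled with survey method as a fixed effect and route as a grouping factor) can be sketched with a linear mixed model; the data frame, column names and values below are hypothetical, not the study's data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per survey night, with the number of bat passes
# detected, the survey method used, and the route surveyed.
df = pd.DataFrame({
    "passes": [12, 9, 15, 22, 18, 25, 8, 11, 14, 16, 13, 20, 19, 15, 23, 10, 9, 17],
    "method": ["standard", "start_stop", "bike"] * 6,
    "route":  ["A"] * 6 + ["B"] * 6 + ["C"] * 6,
})

# Linear mixed model: survey method as fixed effect, route as grouping factor.
model = smf.mixedlm("passes ~ method", df, groups=df["route"])
result = model.fit()
print(result.summary())
```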

  13. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA, HANFORD, WASHINGTON

    International Nuclear Information System (INIS)

    Petersen, S.W.

    2010-01-01

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected, with the densest coverage occurring adjacent to the Columbia River Corridor. Time domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency domain RESOLVE system acquired electromagnetic (EM) data along tighter spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground based geophysical methods, and to link results to the underlying geology and/or hydrogeology. Specific goals of this project are as follows: (1) Test ground based geophysical techniques for their efficacy in delineating underlying geology; (2) Use ground

  14. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA HANFORD WASHINGTON

    Energy Technology Data Exchange (ETDEWEB)

    PETERSEN SW

    2010-12-02

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected, with the densest coverage occurring adjacent to the Columbia River Corridor. Time domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency domain RESOLVE system acquired electromagnetic (EM) data along tighter spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground based geophysical methods, and to link results to the underlying geology and/or hydrogeology. Specific goals of this project are as follows: (1) Test ground based geophysical techniques for their efficacy in delineating underlying geology; (2) Use ground

  15. Survey of intravitreal injection techniques among retina specialists in Israel

    Directory of Open Access Journals (Sweden)

    Segal O

    2016-06-01

    Ori Segal,(1,2) Yael Segal-Trivitz,(1,3) Arie Y Nemet,(1,2) Noa Geffen,(1,2) Ronit Nesher,(1,2) Michael Mimouni(4); affiliations: (1) Department of Ophthalmology, Meir Medical Center, Kfar Saba; (2) The Sackler School of Medicine, Tel Aviv University, Tel Aviv; (3) Department of Psychiatry, Geha Psychiatric Hospital, Petah Tikva; (4) Department of Ophthalmology, Rambam Health Care Campus, Haifa, Israel. Purpose: The purpose of this study was to describe antivascular endothelial growth factor intravitreal injection techniques of retinal specialists in order to establish a cornerstone for future practice guidelines. Methods: All members of the Israeli Retina Society were contacted by email to complete an anonymous, 19-question, Internet-based survey regarding their intravitreal injection techniques. Results: Overall, 66% (52/79) completed the survey. Most (98%) do not instruct patients to discontinue anticoagulant therapy, and 92% prescribe treatment for patients in the waiting room. Three quarters wear sterile gloves and prepare the patient in the supine position. A majority (71%) use sterile surgical draping. All respondents apply topical analgesics, and a majority (69%) measure the distance from the limbus to the injection site. A minority (21%) displace the conjunctiva prior to injection. A majority of the survey participants use a 30-gauge needle, and the most common quadrant for injection is superotemporal (33%). Less than half routinely assess postinjection optic nerve perfusion (44%). A majority (92%) apply prophylactic antibiotics immediately after the injection. Conclusion: The majority of retina specialists perform intravitreal injections similarly. However, a relatively large minority performs this procedure differently. Due to the extremely low percentage of complications, it seems as though such differences do not increase the risk. However, more evidence-based medicine, a cornerstone for practice guidelines, is required in order to identify the intravitreal injection techniques

  16. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Data mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information which may be essential to the organization. The extraction of new information is predicted using the existing datasets. Many approaches for analysis and prediction have been developed in data mining, but very few efforts have been made in the criminology field, and fewer still have compared the information all these approaches produce. Police stations and other similar criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and criminal activity in society. Criminals can also be predicted based on the crime data. The main aim of this work is to perform a survey of the supervised learning and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey of crime analysis and crime prediction using several data mining techniques.
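
    As a rough sketch of the two families of techniques the survey covers, the snippet below applies one supervised and one unsupervised learner to hypothetical crime records; the features, labels and values are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans

# Hypothetical crime records: each row is (hour of day, latitude offset,
# longitude offset), with a label for the recorded crime type.
rng = np.random.default_rng(1)
X = np.vstack([
    rng.normal([2, 0.0, 0.0], 0.3, size=(50, 3)),    # late-night incidents
    rng.normal([14, 1.0, 1.0], 0.3, size=(50, 3)),   # daytime incidents
])
y = np.array([0] * 50 + [1] * 50)                    # 0 = burglary, 1 = theft (assumed labels)

# Supervised learning: predict crime type from the features.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))

# Unsupervised learning: group incidents into hotspots without labels.
hotspots = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(hotspots))
```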

  17. A Survey of Model-based Sensor Data Acquisition and Management

    OpenAIRE

    Aggarwal, Charu C.; Sathe, Saket; Papaioannou, Thanasis; Jeung, Hoyoung; Aberer, Karl

    2013-01-01

    In recent years, due to the proliferation of sensor networks, there has been a genuine need for research on techniques for sensor data acquisition and management. To this end, a large number of techniques have emerged that advocate model-based sensor data acquisition and management. These techniques use mathematical models for performing various day-to-day tasks involved in managing sensor data. In this chapter, we survey the state-of-the-art techniques for model-based sensor data acquisition...

  18. Visual servoing in medical robotics: a survey. Part II: tomographic imaging modalities--techniques and applications.

    Science.gov (United States)

    Azizian, Mahdi; Najmaei, Nima; Khoshnam, Mahta; Patel, Rajni

    2015-03-01

    Intraoperative application of tomographic imaging techniques provides a means of visual servoing for objects beneath the surface of organs. The focus of this survey is on therapeutic and diagnostic medical applications where tomographic imaging is used in visual servoing. To this end, a comprehensive search of the electronic databases was completed for the period 2000-2013. Existing techniques and products are categorized and studied, based on the imaging modality and their medical applications. This part complements Part I of the survey, which covers visual servoing techniques using endoscopic imaging and direct vision. The main challenges in using visual servoing based on tomographic images have been identified. 'Supervised automation of medical robotics' is found to be a major trend in this field and ultrasound is the most commonly used tomographic modality for visual servoing. Copyright © 2014 John Wiley & Sons, Ltd.

  19. A Survey on Anomaly Based Host Intrusion Detection System

    Science.gov (United States)

    Jose, Shijoe; Malathi, D.; Reddy, Bharath; Jayaseeli, Dorathi

    2018-04-01

    An intrusion detection system (IDS) is hardware, software, or a combination of the two that monitors network or system activities to detect malicious signs. In computer security, designing a robust intrusion detection system is one of the most fundamental and important problems. The primary function of the system is to detect intrusions and give alerts in a timely manner when a user attempts an intrusion; when the IDS finds an intrusion, it sends an alert message to the system administrator. Anomaly detection is an important problem that has been researched within diverse research areas and application domains. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. Each of the existing anomaly detection techniques has relative strengths and weaknesses. The current state of experimental practice in the field of anomaly-based intrusion detection is reviewed, and recent studies are surveyed. This survey provides a study of existing anomaly detection techniques and how the techniques used in one area can be applied in another application domain.
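
    A minimal sketch of the anomaly-based idea, assuming hypothetical per-window system-call counts as the host audit features and an off-the-shelf outlier detector; it is not any specific technique from the survey.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical host audit data: each row is a vector of system-call frequencies
# (e.g., counts of open/read/write/exec per time window) for one process window.
rng = np.random.default_rng(2)
normal_windows = rng.poisson(lam=[20, 40, 35, 2], size=(200, 4))   # baseline behaviour
suspect_window = np.array([[3, 5, 4, 60]])                         # unusual burst of exec calls

# Anomaly detection: learn a profile of normal behaviour, flag deviations.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_windows)
print("suspect window label:", detector.predict(suspect_window))   # -1 means anomalous
```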

  20. Comparison of survey techniques on detection of northern flying squirrels

    Science.gov (United States)

    Diggins, Corinne A.; Gilley, L. Michelle; Kelly, Christine A.; Ford, W. Mark

    2016-01-01

    The ability to detect a species is central to the success of monitoring for conservation and management purposes, especially if the species is rare or endangered. Traditional methods, such as live capture, can be labor-intensive, invasive, and produce low detection rates. Technological advances and new approaches provide opportunities to survey for species more effectively, in terms of both accuracy and efficiency, than previous methods. We conducted a pilot comparison study of a traditional technique (live-trapping) and 2 novel noninvasive techniques (camera-trapping and ultrasonic acoustic surveys) on detection rates of the federally endangered Carolina northern flying squirrel (Glaucomys sabrinus coloratus) in occupied habitat within the Roan Mountain Highlands of North Carolina, USA. In 2015, we established three 5 × 5 live-trapping grids (6.5 ha) with 4 camera traps and 4 acoustic detectors systematically embedded in each grid. All 3 techniques were used simultaneously during two 4-day survey periods. We compared techniques by assessing probability of detection (POD), latency to detection (LTD; i.e., no. of survey nights until initial detection), and survey effort. Acoustics had the greatest POD (0.37 ± 0.06 SE), followed by camera traps (0.30 ± 0.06) and live traps (0.01 ± 0.005). Acoustics had a lower LTD than camera traps (P = 0.017), where average LTD was 1.5 nights for acoustics and 3.25 nights for camera traps. Total field effort was greatest with live traps (111.9 hr) followed by acoustics (8.4 hr) and camera traps (9.6 hr), although processing and examination of data for the noninvasive techniques made overall effort similar among the 3 methods. This pilot study demonstrated that both noninvasive methods were better rapid-assessment detection techniques for flying squirrels than live traps. However, determining seasonal effects between survey techniques and further development of protocols for both noninvasive techniques is
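
    The two comparison metrics can be illustrated with a toy detection history; the naive POD below is a simple per-night proportion rather than the occupancy-style estimator typically used in such studies, and the values are invented.

```python
import numpy as np

# Hypothetical detection histories: rows = survey units (grids), columns = survey
# nights; 1 means the species was detected by the given method that night.
acoustic = np.array([[0, 1, 1, 1], [1, 1, 0, 1], [0, 0, 1, 1]])
camera   = np.array([[0, 0, 1, 0], [0, 1, 1, 0], [0, 0, 0, 1]])

def naive_pod(history):
    """Naive probability of detection: share of unit-nights with a detection."""
    return history.mean()

def latency_to_detection(history):
    """Mean number of nights until first detection, over units with a detection."""
    firsts = [np.argmax(row) + 1 for row in history if row.any()]
    return float(np.mean(firsts))

for name, hist in [("acoustic", acoustic), ("camera", camera)]:
    print(name, "POD:", round(naive_pod(hist), 2),
          "LTD:", latency_to_detection(hist))
```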

  1. 3D Survey Techniques for the Architectural Restoration: the Case of St. Agata in Pisa

    Science.gov (United States)

    Bevilacqua, M. G.; Caroti, G.; Piemonte, A.; Ruschi, P.; Tenchini, L.

    2017-05-01

    and visualize the historical building in its context. These modern techniques of survey, based on the creation of point clouds, are now widely used both in the study of a building and for the thorough description of architectural details and decorations. This paper aims at describing the methodological approach and the results of the 3D survey of the Chapel of St. Agata in Pisa, aimed at its restoration. For the development of a restoration project, the survey drawings must represent not only the geometry of a building, but also the materials and the level of degradation. We therefore chose to use both the laser scanner, which guarantees uniformity of the geometric survey precision, and 3D image-based modelling. The combined use of these two techniques, supported by a total station survey, produced two point clouds in the same reference system and allowed the determination of the external orientation parameters of the photographic images. Since these parameters are known, it was possible to texture the laser scanner model with high-quality images. The adopted methodology, as expected, yielded metrically correct and graphically high-quality drawings. The level of detail of the survey, and consequently of the final drawings, was defined beforehand to allow the identification of all the elements required for the analysis of the current state, such as the clear identification and position of all the degradation phenomena, materials, and decorative elements such as some fragmented and heavily damaged frescoes.

  2. A Survey of 2D Face Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Mejda Chihaoui

    2016-09-01

    Despite the existence of various biometric techniques, like fingerprints, iris scans, and hand geometry, the most efficient and most widely used one is face recognition, because it is inexpensive, non-intrusive and natural. Therefore, researchers have developed dozens of face recognition techniques over the last few years. These techniques can generally be divided into three categories, based on the face data processing methodology: methods that use the entire face as input data for the proposed recognition system, methods that do not consider the whole face but only some features or areas of the face, and methods that use global and local face characteristics simultaneously. In this paper, we present an overview of some well-known methods in each of these categories. First, we expose the benefits of, as well as the challenges to, the use of face recognition as a biometric tool. Then, we present a detailed survey of the well-known methods by expressing each method's principle. After that, a comparison between the three categories of face recognition techniques is provided. Furthermore, the databases used in face recognition are mentioned, and some results of the applications of these methods on face recognition databases are presented. Finally, we highlight some new promising research directions that have recently appeared.
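
    The first (holistic, whole-face) category can be sketched with the classic eigenfaces recipe: project images onto principal components and match by nearest neighbour in the reduced space. The random arrays below stand in for real face images.

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder "face" images: 100 samples of 32x32 grey-level pixels.
rng = np.random.default_rng(3)
faces = rng.random((100, 32 * 32))

# Holistic (whole-face) approach: project faces onto principal components
# ("eigenfaces") and match in the reduced space by nearest neighbour.
pca = PCA(n_components=20).fit(faces)
gallery = pca.transform(faces)              # enrolled templates
probe = pca.transform(faces[:1])            # query face

distances = np.linalg.norm(gallery - probe, axis=1)
print("best match index:", int(np.argmin(distances)))   # expected: 0
```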

  3. Poisson and negative binomial item count techniques for surveys with sensitive question.

    Science.gov (United States)

    Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin

    2017-04-01

    Although the item count technique is useful in surveys with sensitive questions, the privacy of those respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
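
    The basic idea behind item count designs can be illustrated with simulated Poisson counts and a simple difference-of-means estimate clipped to [0, 1]; this is a method-of-moments sketch under assumed parameters, not the closed-form estimator proposed in the article.

```python
import numpy as np

# Simulated Poisson item count data (illustrative, not the paper's estimator).
rng = np.random.default_rng(4)
true_pi, lam = 0.30, 3.0                 # sensitive proportion and Poisson mean (assumed)
n_t = n_c = 1000

control = rng.poisson(lam, n_c)                                  # reports the innocuous count only
treat = rng.poisson(lam, n_t) + rng.binomial(1, true_pi, n_t)    # innocuous count + sensitive indicator

# Method-of-moments estimate: difference of group means, clipped to [0, 1].
pi_hat = np.clip(treat.mean() - control.mean(), 0.0, 1.0)
se = np.sqrt(treat.var(ddof=1) / n_t + control.var(ddof=1) / n_c)
ci = (max(0.0, pi_hat - 1.96 * se), min(1.0, pi_hat + 1.96 * se))
print(f"estimated sensitive proportion: {pi_hat:.3f}, 95% CI ~ ({ci[0]:.3f}, {ci[1]:.3f})")
```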

  4. Web-Based Surveys: Not Your Basic Survey Anymore

    Science.gov (United States)

    Bertot, John Carlo

    2009-01-01

    Web-based surveys are not new to the library environment. Although such surveys began as extensions of print surveys, the Web-based environment offers a number of approaches to conducting a survey that the print environment cannot duplicate easily. Since 1994, the author and others have conducted national surveys of public library Internet…

  5. A Survey of Face Recognition Technique | Omidiora | Journal of ...

    African Journals Online (AJOL)

    A review of face recognition techniques has been carried out. Face recognition has been an attractive field of research in both the biological and computer vision communities. It exhibits the characteristics of being natural and low-intrusive. In this paper, an updated survey of techniques for face recognition is made. Methods of ...

  6. Motion Transplantation Techniques: A Survey

    NARCIS (Netherlands)

    van Basten, Ben; Egges, Arjan

    2012-01-01

    During the past decade, researchers have developed several techniques for transplanting motions. These techniques transplant a partial auxiliary motion, possibly defined for a small set of degrees of freedom, onto a base motion. Motion transplantation improves motion databases' expressiveness and

  7. A Survey of Soft-Error Mitigation Techniques for Non-Volatile Memories

    Directory of Open Access Journals (Sweden)

    Sparsh Mittal

    2017-02-01

    Non-volatile memories (NVMs) offer superior density and energy characteristics compared to conventional memories; however, NVMs suffer from severe reliability issues that can easily eclipse their energy efficiency advantages. In this paper, we survey architectural techniques for improving the soft-error reliability of NVMs, specifically PCM (phase change memory) and STT-RAM (spin transfer torque RAM). We focus on soft errors such as resistance drift and write disturbance in PCM, and read disturbance and write failures in STT-RAM. By classifying the research works based on key parameters, we highlight their similarities and distinctions. We hope that this survey will underline the crucial importance of addressing NVM reliability for ensuring their system integration and will be useful for researchers, computer architects and processor designers.

  8. Bases of technique of sprinting

    Directory of Open Access Journals (Sweden)

    Valeriy Druz

    2015-06-01

    Purpose: to determine the biomechanical patterns of body movement that provide the highest sprinting speed. Material and Methods: analysis of the scientific and methodological literature on the problem, the anthropometric characteristics of the surveyed contingent of sportsmen, and analysis of high-speed footage of the world's leading runners. Results: the biomechanical basis of sprinting technique is the acceleration and movement of the athlete's general centre of body mass along a parabolic curve in the start phase, taking into account its initial height in the low-start position. Its subsequent movement follows a cycloidal trajectory formed by the pendulum-like movement of the extremities, which creates lift and makes the flight phase of a running step last longer than the support phase. Conclusions: the biomechanical regularities of sprinting technique obtained here allow the efficiency of sprint training to be increased.
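
    As a generic illustration of the parabolic centre-of-mass path described in the results (standard projectile motion under gravity, not an equation taken from the article), a centre of mass leaving the blocks at height h_0 with speed v_0 at angle \theta to the ground follows

        y(x) = h_0 + x\,\tan\theta - \frac{g\,x^{2}}{2\,v_0^{2}\cos^{2}\theta}

    where g is the gravitational acceleration.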

  9. 3D-TV System with Depth-Image-Based Rendering Architectures, Techniques and Challenges

    CERN Document Server

    Zhao, Yin; Yu, Lu; Tanimoto, Masayuki

    2013-01-01

    Riding on the success of 3D cinema blockbusters and advances in stereoscopic display technology, 3D video applications have gathered momentum in recent years. 3D-TV System with Depth-Image-Based Rendering: Architectures, Techniques and Challenges surveys depth-image-based 3D-TV systems, which are expected to be put into applications in the near future. Depth-image-based rendering (DIBR) significantly enhances the 3D visual experience compared to stereoscopic systems currently in use. DIBR techniques make it possible to generate additional viewpoints using 3D warping techniques to adjust the perceived depth of stereoscopic videos and provide for auto-stereoscopic displays that do not require glasses for viewing the 3D image.   The material includes a technical review and literature survey of components and complete systems, solutions for technical issues, and implementation of prototypes. The book is organized into four sections: System Overview, Content Generation, Data Compression and Transmission, and 3D V...
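
    The 3D warping at the core of DIBR can be summarized by the standard multi-view geometry relation below (a generic textbook form, with notation chosen here rather than taken from the book): a reference-view pixel m = (u, v, 1)^T with depth Z is mapped to the virtual view with intrinsic matrix K', relative rotation R and translation t as

        z'\,\mathbf{m}' = \mathbf{K}'\left(\mathbf{R}\,\mathbf{K}^{-1} Z\,\mathbf{m} + \mathbf{t}\right)

    where K is the reference camera's intrinsic matrix and z' is the depth of the warped point in the virtual view.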

  10. A survey on the state-of-the-technique on software based pipeline leak detection systems

    Energy Technology Data Exchange (ETDEWEB)

    Baptista, Renan Martins [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Centro de Pesquisas. Div. de Explotacao]. E-mail: renan@cenpes.petrobras.com.br

    2000-07-01

    This paper describes a general technical survey of software based leak detection systems (LDS), covering their main technological features, the operational situations where they are feasible, and the scenarios within the Brazilian pipeline network. The decision on which LDS to choose for a given pipeline is a matter of cost, suitability and feasibility. A simpler, low-cost, less effective product with a fast installation and tuning procedure may be more suitable for a given operational site (pipeline configuration, kind of fluid, quality of instrumentation and communication) than a complex, high-cost, efficient product that takes a long time to be properly installed. Some sites may have a level of complexity that requires a more sophisticated system. A few will simply not be suitable for an LDS: this may be caused by the poor quality or absence of instrumentation or, in the worst case, by the lack of technology to approach that specific case, e.g., multiphase flow lines, or lines that commonly operate in slack condition. The general state of the technique is addressed here, along with some initial comments on costs. (author)
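
    Many software-based LDS include a line (volume/mass) balance method among their features; the sketch below shows that idea with assumed flow readings, a fixed imbalance threshold and a persistence check. It is illustrative only and not any vendor's algorithm.

```python
# Minimal line-balance leak detection sketch: flag a leak when the flow
# imbalance, corrected for line packing, persistently exceeds a threshold.
# Readings and thresholds below are illustrative assumptions.

inlet_flow   = [100.0, 100.2, 99.8, 100.1, 100.0, 99.9]   # m3/h at pipeline inlet
outlet_flow  = [100.1, 99.9, 100.0, 96.0, 95.8, 95.9]     # m3/h at pipeline outlet
packing_rate = [0.0, 0.1, -0.1, 0.0, 0.1, 0.0]            # m3/h change in line inventory

THRESHOLD = 2.0      # m3/h imbalance considered significant (assumed)
PERSISTENCE = 2      # consecutive samples required before alarming (assumed)

consecutive = 0
for i, (qin, qout, dv) in enumerate(zip(inlet_flow, outlet_flow, packing_rate)):
    imbalance = qin - qout - dv
    consecutive = consecutive + 1 if abs(imbalance) > THRESHOLD else 0
    if consecutive >= PERSISTENCE:
        print(f"sample {i}: possible leak, imbalance {imbalance:.1f} m3/h")
```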

  11. A survey of energy saving techniques for mobile computers

    NARCIS (Netherlands)

    Smit, Gerardus Johannes Maria; Havinga, Paul J.M.

    1997-01-01

    Portable products such as pagers, cordless and digital cellular telephones, personal audio equipment, and laptop computers are increasingly being used. Because these applications are battery powered, reducing power consumption is vital. In this report we first give a survey of techniques for

  12. Radiological survey techniques for decontamination and dismantlement applications

    International Nuclear Information System (INIS)

    Ruesink, G.P.; Stempfley, D.H.; Pettit, P.J.; Warner, R.D.

    1997-01-01

    The Department of Energy's Fernald Environmental Management Project (FEMP) is engaged in an aggressive program to remove all above-ground structures as part of the Fernald site's final remediation remedy. Through the complete removal of major facilities such as Plant 7, Plant 4, and Plant 1, the FEMP has developed radiological survey approaches that are effective for the different phases of the decontamination and dismantlement (D&D) process. Some of the most pressing challenges facing the FEMP are implementing effective, low-cost methods for the D&D of former process buildings while minimizing environmental effects. One of the key components to ensure minimal impact on the environment is the collection of radiological contamination information during the D&D process to facilitate the decision-making process. Prior to the final demolition of any structure, radiological surveys of floors, walls, and ceilings must take place. These surveys must demonstrate that contamination levels are below 5000 dpm removable beta/gamma for non-porous surfaces and below 1000 dpm removable beta/gamma for all porous surfaces. Techniques which can perform these activities in a safe, effective, and cost-efficient manner are greatly desired. The FEMP has investigated new approaches to address this need. These techniques include sampling approaches using standard baseline methodology as well as innovative approaches to accelerate final radiological clearance processes. To further improve upon this process, the FEMP has investigated several new technologies through the Fernald Plant 1 Large Scale Technology Demonstration Project. One of the most promising of these new technologies, Laser Induced Fluorescence, may significantly improve the radiological clearance survey process. This paper will present real-world experiences in applying radiological control limits to D&D projects as well as relate potential productivity and cost improvements with the

  13. Fuzzy Bi-level Decision-Making Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Guangquan Zhang

    2016-04-01

    Bi-level decision-making techniques aim to deal with decentralized management problems that feature interactive decision entities distributed throughout a bi-level hierarchy. A challenge in handling bi-level decision problems is that various uncertainties naturally appear in the decision-making process. Significant efforts have been devoted to showing that fuzzy set techniques can effectively deal with uncertain issues in bi-level decision-making, known as fuzzy bi-level decision-making techniques, and researchers have successfully gained experience in this area. It is thus vital that an instructive review of current trends in this area be conducted, covering not only the theoretical research but also the practical developments. This paper systematically reviews up-to-date fuzzy bi-level decision-making techniques, including models, approaches, algorithms and systems. It also clusters related technique developments into four main categories: basic fuzzy bi-level decision-making, fuzzy bi-level decision-making with multiple optima, fuzzy random bi-level decision-making, and the applications of bi-level decision-making techniques in different domains. By providing state-of-the-art knowledge, this survey paper will directly support researchers and practitioners in their understanding of developments in theoretical research results and applications in relation to fuzzy bi-level decision-making techniques.
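
    For reference, a generic (crisp) bi-level program has the nested form below; the fuzzy variants surveyed replace some coefficients, goals or constraints with fuzzy sets (the notation here is generic, not the paper's):

        \min_{x \in X} F(x, y) \quad \text{subject to} \quad G(x, y) \le 0,
        \text{where } y \text{ solves } \min_{y \in Y} f(x, y) \quad \text{subject to} \quad g(x, y) \le 0,

    with F, G the leader's objective and constraints and f, g the follower's.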

  14. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    Science.gov (United States)

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  15. The Aalborg Survey / Part 1 - Web Based Survey

    DEFF Research Database (Denmark)

    Harder, Henrik; Christensen, Cecilie Breinholm

    Background and purpose The Aalborg Survey consists of four independent parts: a web, GPS and an interview based survey and a literature study, which together form a consistent investigation and research into use of urban space, and specifically into young people's use of urban space: what young......) and the research focus within the cluster of Mobility and Tracking Technologies (MoTT), AAU. Summary / Part 1 Web Based Survey The 1st part of the research project Diverse Urban Spaces (DUS) has been carried out during the period from December 1st 2007 to February 1st 2008 as a Web Based Survey of the 27.040 gross...... [statistikbanken.dk, a] young people aged 14-23 living in Aalborg Municipality in 2008. The web based questionnaire has been distributed among the group of young people studying at upper secondary schools in Aalborg, i.e. 7.680 young people [statistikbanken.dk, b]. The resulting data from those respondents who...

  16. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    International Nuclear Information System (INIS)

    2013-01-01

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results

  17. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-12-15

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results.

  18. High precision survey and alignment techniques in accelerator construction

    CERN Document Server

    Gervaise, J

    1974-01-01

    Basic concepts of precision surveying are briefly reviewed, and an historical account is given of instruments and techniques used during the construction of the Proton Synchrotron (1954-59), the Intersecting Storage Rings (1966-71), and the Super Proton Synchrotron (1971). A nylon wire device, the distinvar, invar wire and tape, and recent automation of the gyrotheodolite and distinvar as well as auxiliary equipment (polyurethane jacks, Centipede) are discussed in detail. The paper ends by summarizing the present accuracy in accelerator metrology, giving an outlook on possible improvements, and discussing some aspects of staffing for the CERN Survey Group. (0 refs).

  19. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H S; Kim, J H; Lee, J C; Choi, Y R; Moon, S S

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Reliability and safety analysis techniques survey - The reviewed reliability and safety analysis techniques are generally accepted techniques in many industries, including the nuclear industry, and we selected a few techniques which are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey on the characteristics of robot systems which are distinguished from other systems and which are important to the analysis. 3. Survey on the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis which were performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and also will be used for the formal qualification and certification of our reactor inspection system.
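
    Two of the listed techniques, reliability block diagrams and fault-tree style top-event probabilities, reduce to simple probability arithmetic for independent components; the sketch below uses assumed reliabilities for a hypothetical inspection-robot subsystem.

```python
# Minimal sketch of reliability-block-diagram and fault-tree style calculations
# for assumed, independent component failure probabilities (illustrative values).

def series(reliabilities):
    """A series block works only if every component works."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def parallel(reliabilities):
    """A parallel (redundant) block fails only if every component fails."""
    q = 1.0
    for ri in reliabilities:
        q *= (1.0 - ri)
    return 1.0 - q

# Hypothetical robot subsystem: two redundant cameras in series with one manipulator.
r_cameras = parallel([0.95, 0.95])     # assumed camera reliabilities
r_system = series([r_cameras, 0.99])   # assumed manipulator reliability
print(f"system reliability: {r_system:.4f}")

# Fault-tree view of the same structure: top event = system failure.
p_top = 1.0 - r_system
print(f"top event (system failure) probability: {p_top:.4f}")
```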

  20. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    International Nuclear Information System (INIS)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S.

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Reliability and safety analysis techniques survey - The reviewed reliability and safety analysis techniques are generally accepted techniques in many industries, including the nuclear industry, and we selected a few techniques which are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey on the characteristics of robot systems which are distinguished from other systems and which are important to the analysis. 3. Survey on the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis which were performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and also will be used for the formal qualification and certification of our reactor inspection system.

  1. A SURVEY OF RETINA BASED DISEASE IDENTIFICATION USING BLOOD VESSEL SEGMENTATION

    Directory of Open Access Journals (Sweden)

    P Kuppusamy

    2016-11-01

    Colour retinal photography is one of the most essential tools for confirming various eye diseases, and the iris is a primary attribute for authenticating a person. This research work presents a survey and comparison of various blood vessel related feature identification, segmentation, extraction and enhancement methods. Additionally, this study examines the performance of various databases for storing the images and testing them in minimal time. Based on the survey, the paper also identifies the better performing techniques.
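
    A common classical pipeline for retinal vessel identification (green channel, CLAHE contrast enhancement, morphological black-hat filtering, Otsu thresholding) is sketched below; the file name and parameters are assumptions for illustration, not drawn from the surveyed papers.

```python
import cv2

# Classical vessel-enhancement sketch: green channel + CLAHE + black-hat + Otsu.
image = cv2.imread("fundus.png")                      # hypothetical fundus photograph
green = image[:, :, 1]                                # vessels contrast best in the green channel

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(green)

kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
vessels = cv2.morphologyEx(enhanced, cv2.MORPH_BLACKHAT, kernel)   # dark vessels on bright background

_, mask = cv2.threshold(vessels, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
mask = cv2.medianBlur(mask, 3)                        # remove isolated speckle
cv2.imwrite("vessel_mask.png", mask)
```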

  2. The Aalborg Survey / Part 3 - Interview Based Survey

    DEFF Research Database (Denmark)

    Harder, Henrik; Christensen, Cecilie Breinholm; Jensen, Maria Vestergaard

    Background and purpose The Aalborg Survey consists of four independent parts: a web, GPS and an interview based survey and a literature study, which together form a consistent investigation and research into use of urban space, and specifically into young people's use of urban space: what young people do in urban spaces, where they are in the urban spaces and when the young people are in the urban spaces. The answers to these questions form the framework and enable further academic discussions and conclusions in relation to the overall research project Diverse Urban Spaces (DUS). The primary......) and the research focus within the cluster of Mobility and Tracking Technologies (MoTT), AAU. Summary / Part 3 - Interview Based Survey The 3rd part of the DUS research project has been carried out during the fall of 2009 and the summer and fall of 2010 as an interview based survey of 18 selected participants (nine...

  3. A survey of visual preprocessing and shape representation techniques

    Science.gov (United States)

    Olshausen, Bruno A.

    1988-01-01

    Many recent theories and methods proposed for visual preprocessing and shape representation are summarized. The survey brings together research from the fields of biology, psychology, computer science, electrical engineering, and most recently, neural networks. It was motivated by the need to preprocess images for a sparse distributed memory (SDM), but the techniques presented may also prove useful for applying other associative memories to visual pattern recognition. The material of this survey is divided into three sections: an overview of biological visual processing; methods of preprocessing (extracting parts of shape, texture, motion, and depth); and shape representation and recognition (form invariance, primitives and structural descriptions, and theories of attention).

  4. Investigation of individual radiation exposures from discharges to the aquatic environment: techniques used in habits surveys

    International Nuclear Information System (INIS)

    Leonard, D.R.P.; Hunt, G.J.; Jones, P.G.W.

    1982-01-01

    The techniques used by the Fisheries Radiobiological Laboratory (FRL) in conducting habits surveys are described and discussed. The main objectives of these surveys are to investigate exposure pathways to the public resulting from radioactive discharges to the aquatic environment and to provide the basic data from which critical groups can be identified. Preparation, conduct and interpretation of the results of surveys are described and possible errors obtained by the interview technique are highlighted. A means of verifying the results of interviews by a logging technique has been devised and some comparative results are presented. (author)

  5. Emerging Technologies and Techniques for Wide Area Radiological Survey and Remediation

    Energy Technology Data Exchange (ETDEWEB)

    Sutton, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zhao, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-03-24

    Technologies to survey and decontaminate wide-area contamination and process the subsequent radioactive waste have been developed and implemented following the Chernobyl nuclear power plant release and the breach of a radiological source resulting in contamination in Goiania, Brazil. These civilian examples of radioactive material releases provided some of the first examples of urban radiological remediation. Many emerging technologies have recently been developed and demonstrated in Japan following the release of radioactive cesium isotopes (Cs-134 and Cs-137) from the Fukushima Dai-ichi nuclear power plant in 2011. Information on technologies reported by several Japanese government agencies, such as the Japan Atomic Energy Agency (JAEA), the Ministry of the Environment (MOE) and the National Institute for Environmental Science (NIES), together with academic institutions and industry are summarized and compared to recently developed, deployed and available technologies in the United States. The technologies and techniques presented in this report may be deployed in response to a wide area contamination event in the United States. In some cases, additional research and testing is needed to adequately validate the technology effectiveness over wide areas. Survey techniques can be deployed on the ground or from the air, allowing a range of coverage rates and sensitivities. Survey technologies also include those useful in measuring decontamination progress and mapping contamination. Decontamination technologies and techniques range from non-destructive (e.g., high pressure washing) and minimally destructive (plowing), to fully destructive (surface removal or demolition). Waste minimization techniques can greatly impact the long-term environmental consequences and cost following remediation efforts. Recommendations on technical improvements to address technology gaps are presented together with observations on remediation in Japan.

  6. Survey on visualization and analysis techniques based on diffusion MRI for in-vivo anisotropic diffusion structures

    International Nuclear Information System (INIS)

    Masutani, Yoshitaka; Sato, Tetsuo; Urayama, Shin-ichi; Bihan, D.L.

    2008-01-01

    In association with the development of diffusion MR imaging technologies for anisotropic diffusion measurement in the living body, related research is increasing explosively, spanning the fields of applied mathematics and visualization in addition to MR imaging, biomedical image technology, and medical science. One of the reasons is that a diffusion MRI data set is a set of high-dimensional image information beyond conventional scalar or vector images, and is therefore attractive to researchers in the related fields. This survey paper mainly introduces the state of the art of post-processing techniques for diffusion MRI data reported in the literature, such as analysis and visualization. (author)
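
    One of the standard scalar analyses of such data is computing fractional anisotropy (FA) from the eigenvalues of the diffusion tensor; the tensor values below are illustrative.

```python
import numpy as np

# Compute fractional anisotropy (FA) from a 3x3 diffusion tensor, one of the
# standard scalar measures used to analyse and visualise anisotropic diffusion.
D = np.array([[1.7e-3, 0.0,    0.0],
              [0.0,    0.3e-3, 0.0],
              [0.0,    0.0,    0.3e-3]])    # mm^2/s, strongly anisotropic example (assumed values)

evals = np.linalg.eigvalsh(D)               # tensor eigenvalues
md = evals.mean()                           # mean diffusivity
fa = np.sqrt(1.5 * np.sum((evals - md) ** 2) / np.sum(evals ** 2))
print(f"mean diffusivity: {md:.2e} mm^2/s, FA: {fa:.3f}")
```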

  7. Use of structured personality survey techniques to indicate operator response to stressful situations

    International Nuclear Information System (INIS)

    Waller, M.A.

    1990-01-01

    Under given circumstances, a person will tend to operate in one of four dominant orientations: (1) to perform tasks; (2) to achieve consensus; (3) to achieve understanding, or (4) to maintain structure. Historically, personality survey techniques, such as the Myers-Briggs type indicator, have been used to determine these tendencies. While these techniques can accurately reflect a person's orientation under normal social situations, under different sets of conditions, the same person may exhibit other tendencies, displaying a similar or entirely different orientation. While most do not exhibit extreme tendencies or changes of orientation, the shift in personality from normal to stressful conditions can be rather dramatic, depending on the individual. Structured personality survey techniques have been used to indicate operator response to stressful situations. These techniques have been extended to indicate the balance between orientations that the control room team has through the various levels of cognizance

  8. Arduino based radiation survey meter

    International Nuclear Information System (INIS)

    Rahman, Nur Aira Abd; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee; Muzakkir, Amir

    2016-01-01

    This paper presents the design of a new digital radiation survey meter with an LND7121 Geiger-Muller tube detector and an ATmega328P microcontroller. Development of the survey meter prototype is carried out on the Arduino Uno platform. The 16-bit Timer1 on the microcontroller is utilized as an external pulse counter to produce counts per second (CPS) measurements. Conversion from CPS to dose rate is also performed by the Arduino to display results in microsieverts per hour (µSv/hr). The conversion factor (CF) for converting CPM to µSv/hr determined from the manufacturer's data sheet is compared with the CF obtained from a calibration procedure. The survey meter measurement results are found to be linear for dose rates below 3500 µSv/hr
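
    The CPS-to-dose-rate arithmetic described above can be illustrated as follows (in Python rather than Arduino firmware); the calibration factor is a hypothetical sensitivity value, not the LND7121 datasheet figure.

```python
# Illustration of the CPS-to-dose-rate conversion. The calibration factor is a
# hypothetical sensitivity expressed as counts per minute per uSv/h.
CF_CPM_PER_USV_H = 120.0          # hypothetical sensitivity (cpm per uSv/h)

def dose_rate_usv_per_h(cps: float) -> float:
    """Convert a counts-per-second reading to an estimated dose rate."""
    cpm = cps * 60.0              # counts per second -> counts per minute
    return cpm / CF_CPM_PER_USV_H

print(dose_rate_usv_per_h(4.0))   # e.g. 4 CPS -> 2.0 uSv/h with this assumed CF
```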

  9. A Survey on Nickel Titanium Rotary Instruments and their Usage Techniques by Endodontists in India.

    Science.gov (United States)

    Patil, Thimmanagowda N; Saraf, Prahlad A; Penukonda, Raghavendra; Vanaki, Sneha S; Kamatagi, Laxmikant

    2017-05-01

    The preference and usage of nickel titanium rotary instruments varies from individual to individual based on their technique, experience with the rotary systems and the clinical situation. Very limited information is available to explain the adoption of changing concepts with respect to nickel titanium rotary instruments among endodontists in India. The aim of this study was to conduct a questionnaire survey to acquire knowledge concerning different NiTi rotary instruments and their usage techniques by endodontists in India. A survey questionnaire was designed consisting of 32 questions regarding designation, demographics, experience with rotary instruments, usage of different file systems, usage techniques, frequency of reuse, occurrence of file fracture, its reasons and their management; it was distributed by hand at the national postgraduate convention and also disseminated via electronic media, to 400 and 600 endodontists, respectively. Information was collected from each individual to gain insight into the experiences and beliefs of endodontists concerning the new endodontic technology of rotary NiTi instrumentation based on their clinical experience with the rotary systems. The questions were designed to ascertain the problems and patterns of use and to identify areas of perceived or potential concern regarding the rotary instruments, and the data acquired were statistically evaluated using Fisher's exact test and the Chi-square test. Overall, 63.8% (638) of endodontists responded. ProTaper was the most commonly used file system, followed by Mtwo and ProTaper Next. There was a significant correlation between years of experience and file reuse frequency, preparation technique, file separation, and management of file separation. A large number of endodontists prefer to reuse rotary NiTi instruments. As experience increased, the incidence of file separation reduced with increasing reuse frequency and with

  10. Trends in Orbital Decompression Techniques of Surveyed American Society of Ophthalmic Plastic and Reconstructive Surgery Members.

    Science.gov (United States)

    Reich, Shani S; Null, Robert C; Timoney, Peter J; Sokol, Jason A

    To assess current members of the American Society of Ophthalmic Plastic and Reconstructive Surgery (ASOPRS) regarding preference in surgical techniques for orbital decompression in Graves' disease. A 10-question web-based, anonymous survey was distributed to oculoplastic surgeons utilizing the ASOPRS listserv. The questions addressed the number of years of experience performing orbital decompression surgery, preferred surgical techniques, and whether orbital decompression was performed in collaboration with an ENT surgeon. Ninety ASOPRS members participated in the study. Most who completed the survey have performed orbital decompression surgery for >15 years. The majority of responders preferred a combined approach of floor and medial wall decompression or balanced lateral and medial wall decompression; only a minority selected a technique limited to 1 wall. Surgeons who perform fat decompression were more likely to operate in collaboration with ENT. Most surgeons rarely remove the orbital strut, citing risk of worsening diplopia or orbital dystopia, except in cases of optic nerve compression or severe proptosis. The most common reason given for performing orbital decompression was exposure keratopathy. The majority of surgeons perform the surgery without ENT involvement, and the number of years of experience did not correlate significantly with collaboration with ENT. The majority of surveyed ASOPRS surgeons prefer a combined wall approach over a single wall approach to initial orbital decompression. Despite the technological advances made in the field of modern endoscopic surgery, no single approach has been adopted by the ASOPRS community as the gold standard.

  11. A Comparison of Web-Based and Paper-Based Survey Methods: Testing Assumptions of Survey Mode and Response Cost

    Science.gov (United States)

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-01-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research to name a few. The proliferation of these online surveys raises the question, how do response rates compare with traditional surveys and at what cost? This research explored response rates and costs for Web-based surveys, paper surveys, and…

  12. Arduino based radiation survey meter

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nm.gov.my; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee [Malaysian Nuclear Agency, 43000, Bangi, Selangor (Malaysia); Muzakkir, Amir [Sinaran Utama Teknologi Sdn Bhd, 43650, Bandar Baru Bangi, Selangor (Malaysia)

    2016-01-22

    This paper presents the design of a new digital radiation survey meter with an LND7121 Geiger-Muller tube detector and an Atmega328P microcontroller. Development of the survey meter prototype is carried out on the Arduino Uno platform. The 16-bit Timer1 on the microcontroller is utilized as an external pulse counter to produce a count-per-second (CPS) measurement. Conversion from CPS to dose rate is also performed by the Arduino to display results in microsievert per hour (µSv/hr). The conversion factor (CF) value for converting CPM to µSv/hr determined from the manufacturer data sheet is compared with the CF obtained from the calibration procedure. The survey meter measurement results are found to be linear for dose rates below 3500 µSv/hr.

  13. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    Science.gov (United States)

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey attempts to cover the state of the art from 2012 up to 2017.

  14. Survey on Ranging Sensors and Cooperative Techniques for Relative Positioning of Vehicles

    Directory of Open Access Journals (Sweden)

    Fabian de Ponte Müller

    2017-01-01

    Future driver assistance systems will rely on accurate, reliable and continuous knowledge of the position of other road participants, including pedestrians, bicycles and other vehicles. The usual approach to meet this requirement is to use on-board ranging sensors inside the vehicle. Radar, laser scanners or vision-based systems are able to detect objects in their line of sight. In contrast to these non-cooperative ranging sensors, cooperative approaches follow a strategy in which other road participants actively support the estimation of the relative position. The limitations of on-board ranging sensors regarding their detection range and angle of view, and their susceptibility to blockage, can be addressed by using a cooperative approach based on vehicle-to-vehicle communication. The fusion of both cooperative and non-cooperative strategies seems to offer the largest benefits regarding accuracy, availability and robustness. This survey offers the reader a comprehensive review of different techniques for vehicle relative positioning. The reader will learn the important performance indicators when it comes to relative positioning of vehicles, the different technologies that are both commercially available and currently under research, their expected performance and their intrinsic limitations. Moreover, the latest research in the area of vision-based systems for vehicle detection, as well as the latest work on GNSS-based vehicle localization and vehicular communication for relative positioning of vehicles, is reviewed. The survey also covers research on the fusion of cooperative and non-cooperative approaches to increase reliability and availability.

  15. A critical survey of live virtual machine migration techniques

    Directory of Open Access Journals (Sweden)

    Anita Choudhary

    2017-11-01

    Virtualization techniques effectively handle the growing demand for computing, storage, and communication resources in large-scale Cloud Data Centers (CDC). They help to achieve different resource management objectives like load balancing, online system maintenance, proactive fault tolerance, power management, and resource sharing through Virtual Machine (VM) migration. VM migration is a resource-intensive procedure, as VMs continuously demand appropriate CPU cycles, cache memory, memory capacity, and communication bandwidth. Therefore, this process degrades the performance of running applications and adversely affects the efficiency of data centers, particularly when Service Level Agreements (SLA) and critical business objectives are to be met. Live VM migration is frequently used because it keeps application services available while migration is performed. In this paper, we make an exhaustive survey of the literature on live VM migration and analyze the various proposed mechanisms. We first classify the types of live VM migration (single, multiple and hybrid). Next, we categorize VM migration techniques based on duplication mechanisms (replication, de-duplication, redundancy, and compression) and awareness of context (dependency, soft page, dirty page, and page fault) and evaluate the various live VM migration techniques. We discuss various performance metrics like application service downtime, total migration time and amount of data transferred. CPU, memory and storage data are transferred during the process of VM migration, and we identify the category of data that needs to be transferred in each case. We present a brief discussion on security threats in live VM migration and categorize them in three different classes (control plane, data plane, and migration module). We also explain the security requirements and existing solutions to mitigate possible attacks. Specific gaps are identified and the research challenges in improving

  16. Lot quality assurance sampling techniques in health surveys in developing countries: advantages and current constraints.

    Science.gov (United States)

    Lanata, C F; Black, R E

    1991-01-01

    Traditional survey methods, which are generally costly and time-consuming, usually provide information at the regional or national level only. The utilization of lot quality assurance sampling (LQAS) methodology, developed in industry for quality control, makes it possible to use small sample sizes when conducting surveys in small geographical or population-based areas (lots). This article describes the practical use of LQAS for conducting health surveys to monitor health programmes in developing countries. Following a brief description of the method, the article explains how to build a sample frame and conduct the sampling to apply LQAS under field conditions. A detailed description of the procedure for selecting a sampling unit to monitor the health programme and a sample size is given. The sampling schemes utilizing LQAS applicable to health surveys, such as simple- and double-sampling schemes, are discussed. The interpretation of the survey results and the planning of subsequent rounds of LQAS surveys are also discussed. When describing the applicability of LQAS in health surveys in developing countries, the article considers current limitations for its use by health planners in charge of health programmes, and suggests ways to overcome these limitations through future research. It is hoped that with increasing attention being given to industrial sampling plans in general, and LQAS in particular, their utilization to monitor health programmes will provide health planners in developing countries with powerful techniques to help them achieve their health programme targets.
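
    The decision logic behind an LQAS plan can be made concrete with a short sketch. The example below is illustrative and not taken from the article: for a hypothetical plan that samples n = 19 individuals per lot and accepts the lot when at most d = 3 of them miss the target condition, it computes the probability of accepting the lot as a function of the true coverage, using the binomial distribution.

```python
# Illustrative LQAS operating characteristic: probability of accepting a lot
# (e.g., a health catchment area) given sample size n, decision value d, and
# true coverage p. The plan parameters below are hypothetical.
from math import comb

def acceptance_probability(n: int, d: int, p: float) -> float:
    """P(at most d 'failures' among n sampled individuals), failure probability = 1 - p."""
    q = 1.0 - p
    return sum(comb(n, k) * q ** k * p ** (n - k) for k in range(d + 1))

if __name__ == "__main__":
    for coverage in (0.50, 0.70, 0.80, 0.95):
        print(f"coverage {coverage:.2f} -> P(accept) = {acceptance_probability(19, 3, coverage):.3f}")
```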

  17. Survey Of Lossless Image Coding Techniques

    Science.gov (United States)

    Melnychuck, Paul W.; Rabbani, Majid

    1989-04-01

    Many image transmission/storage applications requiring some form of data compression additionally require that the decoded image be an exact replica of the original. Lossless image coding algorithms meet this requirement by generating a decoded image that is numerically identical to the original. Several lossless coding techniques are modifications of well-known lossy schemes, whereas others are new. Traditional Markov-based models and newer arithmetic coding techniques are applied to predictive coding, bit plane processing, and lossy-plus-residual coding. Generally speaking, the compression ratios offered by these techniques are in the range of 1.6:1 to 3:1 for 8-bit pictorial images. Compression ratios for 12-bit radiological images approach 3:1, as these images have less detailed structure, and hence their higher pel correlation leads to a greater removal of image redundancy.
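
    As a concrete illustration of the predictive-coding idea, the sketch below predicts each pixel from its left neighbour and estimates the first-order entropy of the prediction residuals; that entropy approximates the bits per pixel an ideal entropy coder (e.g., an arithmetic coder) would need. The image is synthetic and the printed ratio is illustrative only.

```python
# Left-neighbour prediction plus residual entropy: a toy stand-in for the
# predictive lossless coders discussed above (no actual entropy coder is run).
import numpy as np

def residual_entropy_bits(image: np.ndarray) -> float:
    """First-order entropy (bits/pixel) of horizontal prediction residuals."""
    img = image.astype(np.int32)
    residuals = img.copy()
    residuals[:, 1:] = img[:, 1:] - img[:, :-1]     # prediction error w.r.t. left neighbour
    _, counts = np.unique(residuals, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    smooth = np.clip(np.cumsum(rng.integers(-2, 3, size=(64, 64)), axis=1) + 128, 0, 255)
    bits = residual_entropy_bits(smooth)
    print(f"~{bits:.2f} bits/pixel -> roughly {8.0 / bits:.1f}:1 for an 8-bit image")
```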

  18. Nuclear assay of coal. Volume 4. Moisture determination in coal: survey of electromagnetic techniques. Final report

    International Nuclear Information System (INIS)

    Bevan, R.; Luckie, P.; Gozani, T.; Brown, D.R.; Bozorgmanesh, H.; Elias, E.

    1979-01-01

    This survey consists of two basic parts. The first consists of a survey of various non-nuclear moisture determination techniques. Three techniques are identified as promising for eventual on-line application with coal; these are the capacitance, microwave attenuation, and nuclear magnetic resonance (NMR) techniques. The second part is devoted to an in-depth analysis of these three techniques and the current extent to which they have been applied to coal. With a given coal type, accuracies of ±1% absolute in moisture content are achievable with all three techniques. The accuracy of the two electromagnetic techniques has been demonstrated in the laboratory and on-line in coal burning plants, whereas only small samples have been analyzed with NMR. The current shortcoming of the simple electromagnetic techniques is the sensitivity of calibrations to physical parameters and coal type. NMR is currently limited by small sample sizes and non-rugged design. These findings are summarized and a list of manufacturers of moisture analyzers is given in the Appendix.

  19. Technical errors in complete mouth radiographic survey according to radiographic techniques and film holding methods

    International Nuclear Information System (INIS)

    Choi, Karp Sik; Byun, Chong Soo; Choi, Soon Chul

    1986-01-01

    The purpose of this study was to investigate the numbers and causes of retakes in 300 complete mouth radiographic surveys made by 75 senior dental students. According to radiographic techniques and film holding methods, they were divided into 4 groups: Group I: Bisecting-angle technique with patient's fingers. Group II: Bisecting-angle technique with Rinn Snap-A-Ray device. Group III: Bisecting-angle technique with Rinn XCP instrument (short cone). Group IV: Bisecting-angle technique with Rinn XCP instrument (long cone). The most frequent cause of retakes, the tooth area most frequently retaken, and the average number of retakes per complete mouth survey were evaluated. The results obtained were as follows: Group I: incorrect film placement (47.8%), upper canine region, and 0.89. Group II: incorrect film placement (44.0%), upper canine region, and 1.12. Group III: incorrect film placement (79.2%), upper canine region, and 2.05. Group IV: incorrect film placement (67.7%), upper canine region, and 1.69.

  20. Automatic vetting of planet candidates from ground based surveys: Machine learning with NGTS

    Science.gov (United States)

    Armstrong, David J.; Günther, Maximilian N.; McCormac, James; Smith, Alexis M. S.; Bayliss, Daniel; Bouchy, François; Burleigh, Matthew R.; Casewell, Sarah; Eigmüller, Philipp; Gillen, Edward; Goad, Michael R.; Hodgkin, Simon T.; Jenkins, James S.; Louden, Tom; Metrailler, Lionel; Pollacco, Don; Poppenhaeger, Katja; Queloz, Didier; Raynard, Liam; Rauer, Heike; Udry, Stéphane; Walker, Simon R.; Watson, Christopher A.; West, Richard G.; Wheatley, Peter J.

    2018-05-01

    State-of-the-art exoplanet transit surveys are producing ever-increasing quantities of data. Making the best use of this resource, whether in detecting interesting planetary systems or in determining accurate planetary population statistics, requires new automated methods. Here we describe a machine learning algorithm that forms an integral part of the pipeline for the NGTS transit survey, demonstrating the efficacy of machine learning in selecting planetary candidates from multi-night ground-based survey data. Our method uses a combination of random forests and self-organising maps to rank planetary candidates, achieving an AUC score of 97.6% in ranking 12368 injected planets against 27496 false positives in the NGTS data. We build on past examples by using injected transit signals to form a training set, a necessary development for applying similar methods to upcoming surveys. We also make the autovet code used to implement the algorithm publicly accessible. autovet is designed to perform machine-learned vetting of planetary candidates, and can utilise a variety of methods. The apparent robustness of machine learning techniques, whether on space-based or the qualitatively different ground-based data, highlights their importance to future surveys such as TESS and PLATO and the need to better understand their advantages and pitfalls in an exoplanetary context.
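
    A stripped-down sketch of the ranking step is shown below: a random forest is trained to separate injected planets from false positives and its class probability is used as the ranking score. The self-organising-map stage and the real NGTS features are omitted, and the data are synthetic, so the printed AUC has no bearing on the 97.6% reported above.

```python
# Toy candidate-ranking sketch with a random forest (synthetic features only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_planets, n_fps, n_features = 2000, 4000, 8

X = np.vstack([
    rng.normal(loc=0.5, scale=1.0, size=(n_planets, n_features)),  # stand-ins for injected planets
    rng.normal(loc=0.0, scale=1.0, size=(n_fps, n_features)),      # stand-ins for false positives
])
y = np.concatenate([np.ones(n_planets), np.zeros(n_fps)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]                             # ranking score per candidate
print("AUC on synthetic data:", round(roc_auc_score(y_te, scores), 3))
```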

  1. Frequency-Wavenumber (FK)-Based Data Selection in High-Frequency Passive Surface Wave Survey

    Science.gov (United States)

    Cheng, Feng; Xia, Jianghai; Xu, Zongbo; Hu, Yue; Mi, Binbin

    2018-04-01

    Passive surface wave methods have gained much attention from geophysical and civil engineering communities because of the limited application of traditional seismic surveys in highly populated urban areas. Considering that they can provide high-frequency phase velocity information up to several tens of Hz, the active surface wave survey would be omitted and the amount of field work could be dramatically reduced. However, the measured dispersion energy image in the passive surface wave survey would usually be polluted by a type of "crossed" artifacts at high frequencies. It is common in the bidirectional noise distribution case with a linear receiver array deployed along roads or railways. We review several frequently used passive surface wave methods and derive the underlying physics for the existence of the "crossed" artifacts. We prove that the "crossed" artifacts would cross the true surface wave energy at fixed points in the f-v domain and propose an FK-based data selection technique to attenuate the artifacts in order to retrieve the high-frequency information. Numerical tests further demonstrate the existence of the "crossed" artifacts and indicate that the well-known wave field separation method, the FK filter, does not work for the selection of directional noise data. Real-world applications manifest the feasibility of the proposed FK-based technique to improve passive surface wave methods by a priori data selection. Finally, we discuss the applicability of our approach.
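
    The sketch below illustrates, on a synthetic record, the kind of f-k domain check that underlies such data selection: transform a time-offset noise window with a 2-D FFT and compare the energy in the two wavenumber half-planes (restricted to positive frequencies). A strongly one-sided ratio flags a window dominated by a single propagation direction. Array geometry, thresholds and function names are invented, and the sign convention depends on the FFT definition used.

```python
# f-k (frequency-wavenumber) directionality check on a linear-array noise window.
import numpy as np

def directional_energy_ratio(record: np.ndarray, dt: float, dx: float) -> float:
    """record: (n_samples, n_receivers). Returns E(k>0)/E(k<0) over positive frequencies."""
    fk = np.fft.fftshift(np.fft.fft2(record))
    power = np.abs(fk) ** 2
    f = np.fft.fftshift(np.fft.fftfreq(record.shape[0], d=dt))
    k = np.fft.fftshift(np.fft.fftfreq(record.shape[1], d=dx))
    pos_f = power[f > 0, :]                       # keep positive temporal frequencies only
    return float(pos_f[:, k > 0].sum() / pos_f[:, k < 0].sum())

if __name__ == "__main__":
    n_t, n_x, dt, dx = 512, 48, 0.004, 5.0
    t = np.arange(n_t)[:, None] * dt
    x = np.arange(n_x)[None, :] * dx
    record = np.sin(2 * np.pi * 10.0 * (t - x / 300.0))   # plane wave crossing the array one way
    r = directional_energy_ratio(record, dt, dx)
    print(f"E(k>0)/E(k<0) = {r:.3f}; a ratio far from 1 marks one-directional noise")
```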

  2. Frequency-Wavenumber (FK)-Based Data Selection in High-Frequency Passive Surface Wave Survey

    Science.gov (United States)

    Cheng, Feng; Xia, Jianghai; Xu, Zongbo; Hu, Yue; Mi, Binbin

    2018-07-01

    Passive surface wave methods have gained much attention from geophysical and civil engineering communities because of the limited application of traditional seismic surveys in highly populated urban areas. Considering that they can provide high-frequency phase velocity information up to several tens of Hz, the active surface wave survey would be omitted and the amount of field work could be dramatically reduced. However, the measured dispersion energy image in the passive surface wave survey would usually be polluted by a type of "crossed" artifacts at high frequencies. It is common in the bidirectional noise distribution case with a linear receiver array deployed along roads or railways. We review several frequently used passive surface wave methods and derive the underlying physics for the existence of the "crossed" artifacts. We prove that the "crossed" artifacts would cross the true surface wave energy at fixed points in the f-v domain and propose an FK-based data selection technique to attenuate the artifacts in order to retrieve the high-frequency information. Numerical tests further demonstrate the existence of the "crossed" artifacts and indicate that the well-known wave field separation method, the FK filter, does not work for the selection of directional noise data. Real-world applications manifest the feasibility of the proposed FK-based technique to improve passive surface wave methods by a priori data selection. Finally, we discuss the applicability of our approach.

  3. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    Science.gov (United States)

    Al-Mohammed, A. H.; Abido, M. A.

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research. PMID:24701191
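
    As one classical example of the two-end synchronized-measurement family reviewed in this survey, the sketch below solves the short-line relation Vs - d*Zl*Is = Vr - (1 - d)*Zl*Ir for the per-unit fault distance d, using synchronized voltage and current phasors from both line ends. Shunt capacitance is neglected and every numerical value is invented for the demonstration.

```python
# Two-end synchronized-phasor fault location on a short-line model (illustrative values).
import cmath

def two_end_fault_distance(Vs: complex, Is: complex, Vr: complex, Ir: complex, Zl: complex) -> float:
    """Solve Vs - d*Zl*Is = Vr - (1 - d)*Zl*Ir for d (per-unit distance from the sending end)."""
    d = (Vs - Vr + Zl * Ir) / (Zl * (Is + Ir))
    return d.real

if __name__ == "__main__":
    Zl = complex(8.0, 40.0)              # assumed total series line impedance (ohm)
    d_true = 0.37                        # fault placed at 37% of the line length
    If = cmath.rect(900.0, -1.2)         # assumed total fault current (A)
    Is, Ir = 0.6 * If, 0.4 * If          # assumed split of the fault current between the ends
    Vf = cmath.rect(25e3, -0.1)          # assumed voltage at the fault point
    Vs = Vf + d_true * Zl * Is           # terminal phasors consistent with the model
    Vr = Vf + (1.0 - d_true) * Zl * Ir
    print("estimated fault distance (p.u.):", round(two_end_fault_distance(Vs, Is, Vr, Ir, Zl), 3))
```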

  4. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    Directory of Open Access Journals (Sweden)

    A. H. Al-Mohammed

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research.

  5. Survey of Green Radio Communications Networks: Techniques and Recent Advances

    Directory of Open Access Journals (Sweden)

    Mohammed H. Alsharif

    2013-01-01

    Energy efficiency in cellular networks has received significant attention from both academia and industry because of the importance of reducing the operational expenditures and maintaining the profitability of cellular networks, in addition to making these networks "greener." Because the base station is the primary energy consumer in the network, efforts have been made to study base station energy consumption and to find ways to improve energy efficiency. In this paper, we present a brief review of the techniques that have been used recently to improve energy efficiency, such as energy-efficient power amplifier techniques, time-domain techniques, cell switching, management of the physical layer through multiple-input multiple-output (MIMO) techniques, heterogeneous network architectures based on micro-, pico- and femtocells, cell zooming, and relay techniques. In addition, this paper discusses the advantages and disadvantages of each technique to contribute to a better understanding of each of the techniques and thereby offer clear insights to researchers about how to choose the best ways to reduce energy consumption in future green radio networks.

  6. Comprehensive geophysical survey technique in exploration for deep-buried hydrothermal type uranium deposits in Xiangshan volcanic basin, China

    International Nuclear Information System (INIS)

    Ke, D.

    2014-01-01

    According to recent drilling results, uranium mineralization has been found at depths greater than 1000 m in the Xiangshan volcanic basin, where uranium exploration has been carried out for over 50 years. This paper presents a comprehensive geophysical survey technique, including the audio magnetotelluric method (AMT), high-resolution ground magnetic survey and radon survey, which aims at prospecting for deep-buried and concealed uranium deposits in the Xiangshan volcanic basin. Based on research and application, a comprehensive geophysical technique consisting of data acquisition, processing and interpretation has been established. Concealed rock and ore-controlling structures buried deeper than 1000 m can be detected using this technique. Moreover, an anti-interference technique for the AMT survey is presented, which can eliminate the interference induced by high-voltage power lines. The AMT results in the Xiangshan volcanic basin show a high-low-high resistivity pattern, indicating three geological layers. The upper layer, with high resistivity, corresponds mainly to porphyroclastic lava. The middle layer, with low resistivity, is metamorphic schist or dellenite, whereas the lower layer, with high resistivity, is inferred to be granite. The interface between the middle and lower layers is recognized as the potential zone for the occurrence of uranium deposits. Based on the corresponding relation of resistivity and magnetic anomalies with uranium ore bodies, a tracing model for faults and interfaces between the different rocks, and a forecasting model for areas favourable for uranium deposits, have been established. In terms of the forecasting model, several significant sections for uranium deposits were delineated in the west of the Xiangshan volcanic basin. As a result, some achievements in uranium prospecting have been made: high-grade economic uranium ore bodies have been found in several boreholes located in the forecasted zones. (author)

  7. Literature survey of heat transfer enhancement techniques in refrigeration applications

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, M.K.; Shome, B. [Rensselaer Polytechnic Inst., Troy, NY (United States). Dept. of Mechanical Engineering, Aeronautical Engineering and Mechanics

    1994-05-01

    A survey has been performed of the technical and patent literature on enhanced heat transfer of refrigerants in pool boiling, forced convection evaporation, and condensation. Extensive bibliographies of the technical literature and patents are given. Many passive and active techniques were examined for pure refrigerants, refrigerant-oil mixtures, and refrigerant mixtures. The citations were categorized according to enhancement technique, heat transfer mode, and tube or shell side focus. The effects of the enhancement techniques relative to smooth and/or pure refrigerants were illustrated through the discussion of selected papers. Patented enhancement techniques also are discussed. Enhanced heat transfer has demonstrated significant improvements in performance in many refrigerant applications. However, refrigerant mixtures and refrigerant-oil mixtures have not been studied extensively; no research has been performed with enhanced refrigerant mixtures with oil. Most studies have been of the parametric type; there has been inadequate examination of the fundamental processes governing enhanced refrigerant heat transfer, but some modeling is being done and correlations developed. It is clear that an enhancement technique must be optimized for the refrigerant and operating condition. Fundamental processes governing the heat transfer must be examined if models for enhancement techniques are to be developed; these models could provide the method to optimize a surface. Refrigerant mixtures, with and without oil present, must be studied with enhancement devices; there is too little known to be able to estimate the effects of mixtures (particularly NARMs) with enhanced heat transfer. Other conclusions and recommendations are offered.

  8. Survey on development of brown coal liquefaction techniques; Kattan ekika gijutsu ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1980-09-01

    Described herein are the results of a literature survey on brown coal liquefaction reactions and elementary techniques. Liquefaction of brown coal in the presence of CO and steam, or of CO, H₂ and steam, has been investigated. The literature survey does not make clear whether this is superior to the normal process, which uses hydrogen. Brown coal has a high moisture content, and drying techniques need to be developed for its liquefaction. The future coal liquefaction plant will be much larger than past ones, and there are a number of problems to be solved, such as those involved in the design of large-sized high-pressure slurry pumps, heat exchangers and preheaters. It is also necessary to develop materials and production techniques for large reactors that are serviceable under severe conditions. Solid-liquid separation of liquefaction products involves a number of elementary techniques characteristic of coal liquefaction processes and needs many technological developments. The one-stage brown coal liquefaction process is compared with the two-stage process for the secondary hydrogenation of SCR, but no clear conclusions are reached. (NEDO)

  9. Embryo transfer techniques: an American Society for Reproductive Medicine survey of current Society for Assisted Reproductive Technology practices.

    Science.gov (United States)

    Toth, Thomas L; Lee, Malinda S; Bendikson, Kristin A; Reindollar, Richard H

    2017-04-01

    To better understand practice patterns and opportunities for standardization of ET. Cross-sectional survey. Not applicable. Not applicable. An anonymous 82-question survey was emailed to the medical directors of 286 Society for Assisted Reproductive Technology member IVF practices. A follow-up survey composed of three questions specific to ET technique was emailed to the same medical directors. Descriptive statistics of the results were compiled. The survey assessed policies, protocols, restrictions, and specifics pertinent to the technique of ET. There were 117 (41%) responses; 32% practice in academic settings and 68% in private practice. Responders were experienced clinicians.

  10. Watershed-based survey designs

    Science.gov (United States)

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream–downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.
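
    One ingredient of such designs, unequal probability sampling with design weights, can be sketched briefly. The toy example below draws watershed polygons with inclusion probability proportional to an invented area attribute and attaches Horvitz-Thompson weights; real watershed designs (for example spatially balanced GRTS samples) involve considerably more machinery, and every number here is synthetic.

```python
# Toy unequal-probability (area-proportional) selection of watershed polygons.
import numpy as np

rng = np.random.default_rng(7)
n_watersheds, target_sample = 500, 50
area_km2 = rng.lognormal(mean=3.0, sigma=1.0, size=n_watersheds)      # fake sampling frame

# Inclusion probabilities proportional to area, capped at 1.
pi = np.minimum(target_sample * area_km2 / area_km2.sum(), 1.0)

# Poisson sampling: include each unit independently with probability pi.
selected = rng.random(n_watersheds) < pi
weights = 1.0 / pi[selected]                                          # Horvitz-Thompson design weights

print("units selected:", int(selected.sum()),
      "| weighted estimate of frame size:", round(float(weights.sum())))
```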

  11. A critical survey of agent-based wholesale electricity market models

    International Nuclear Information System (INIS)

    Weidlich, Anke; Veit, Daniel

    2008-01-01

    The complexity of electricity markets calls for rich and flexible modeling techniques that help to understand market dynamics and to derive advice for the design of appropriate regulatory frameworks. Agent-Based Computational Economics (ACE) is a fairly young research paradigm that offers methods for realistic electricity market modeling. A growing number of researchers have developed agent-based models for simulating electricity markets. The diversity of approaches makes it difficult to gain an overview of the field of ACE electricity research; this literature survey is intended to guide the reader through the field and describe the state of the art of this research area. In a concluding summary, shortcomings of existing approaches and open issues that should be addressed by ACE electricity researchers are critically discussed. (author)

  12. The History of Electromagnetic Induction Techniques in Soil Survey

    Science.gov (United States)

    Brevik, Eric C.; Doolittle, Jim

    2014-05-01

    Electromagnetic induction (EMI) has been used to characterize the spatial variability of soil properties since the late 1970s. Initially used to assess soil salinity, the use of EMI in soil studies has expanded to include: mapping soil types; characterizing soil water content and flow patterns; assessing variations in soil texture, compaction, organic matter content, and pH; and determining the depth to subsurface horizons, stratigraphic layers or bedrock, among other uses. In all cases the soil property being investigated must influence soil apparent electrical conductivity (ECa) either directly or indirectly for EMI techniques to be effective. An increasing number and diversity of EMI sensors have been developed in response to users' needs and the availability of allied technologies, which have greatly improved the functionality of these tools. EMI investigations provide several benefits for soil studies. The large amount of georeferenced data that can be rapidly and inexpensively collected with EMI provides more complete characterization of the spatial variations in soil properties than traditional sampling techniques. In addition, compared to traditional soil survey methods, EMI can more effectively characterize diffuse soil boundaries and identify included areas of dissimilar soils within mapped soil units, giving soil scientists greater confidence when collecting spatial soil information. EMI techniques do have limitations; results are site-specific and can vary depending on the complex interactions among multiple and variable soil properties. Despite this, EMI techniques are increasingly being used to investigate the spatial variability of soil properties at field and landscape scales.

  13. American Samoa Shore-based Creel Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DMWR staff has also conducted shore-based creel surveys which also have 2 major sub-surveys; one to estimate participation (fishing effort), and one to provide...

  14. A SURVEY OF AUTOMATION TECHNIQUES COMING FORTH IN SHEET-FED OFFSET PRINTING ORGANIZATIONS

    OpenAIRE

    Mr. Ramesh Kumar*, Mr. Bijender & Mr. Sandeep Boora

    2017-01-01

    Sheet-fed offset is one of the premier printing processes in India as well as abroad. To cope with customers' large-quantity demands, automation has become mandatory. From prepress to postpress, a wide range of automation techniques exist and are coming forth for sheet-fed offset presses. The objective of this paper is to throw light on the various sheet-fed offset automation techniques existing today and their futuristic implications. The data related to automation were collected with the help of a survey conducted...

  15. SEM-based characterization techniques

    International Nuclear Information System (INIS)

    Russell, P.E.

    1986-01-01

    The scanning electron microscope is now a common instrument in materials characterization laboratories. The basic role of the SEM as a topographic imaging system has steadily been expanding to include a variety of SEM-based analytical techniques. These techniques cover the range from basic semiconductor materials characterization to live-time device characterization of operating LSI or VLSI devices. This paper introduces many of the more commonly used techniques, describes the modifications or additions to a conventional SEM required to utilize them, and gives examples of their use. First, the types of signals available from a sample being irradiated by an electron beam are reviewed. Then, where applicable, the types of spectroscopy or microscopy that have evolved to utilize the various signal types are described. This is followed by specific examples of the use of such techniques to solve problems related to semiconductor technology. Techniques emphasized include: x-ray fluorescence spectroscopy, electron beam induced current (EBIC), stroboscopic voltage analysis, cathodoluminescence, and electron beam IC metrology. Current and future trends of some of these techniques, as related to the semiconductor industry, are discussed.

  16. Integrating Geological and Geodetic Surveying Techniques for Landslide Deformation Monitoring: Istanbul Case

    Science.gov (United States)

    Menteşe, E. Y.; Kilic, O.; BAS, M.; Tarih, A.; Duran, K.; Gumus, S.; Yapar, E. R.; Karasu, M. E.; Mehmetoğlu, H.; Karaman, A.; Ediğer, V.; Kosma, R. C.; Ozalaybey, S.; Zor, E.; Arpat, E.; Polat, F.; Dogan, U.; Cakir, Z.; Erkan, B.

    2017-12-01

    There are several methods that can be utilized for describing landslide mechanisms. While some of them are commonly used, there are relatively new methods that have been proven to be useful. Obviously, each method has its own limitations, and thus integrated use of these methods contributes to obtaining a realistic landslide model. The slopes of the Küçükçekmece and Büyükçekmece Lagoons, located on the Marmara Sea coast of İstanbul, Turkey, are among the most distinctive examples of complex-type landslides. The landslides in the area started developing at low sea level and appear to have ceased, or at least slowed to a minimum, after the sea-level rise, as opposed to the still-active landslides that continue to cause damage, especially on the valley slopes above the present sea level between the two lagoons. To clarify the characteristics of these slope movements and classify them in the most accurate way, the Directorate of Earthquake and Ground Research of Istanbul Metropolitan Municipality launched a project in cooperation with the Marmara Research Center of The Scientific and Technological Research Council of Turkey (TÜBİTAK). The project draws on techniques from different disciplines such as geology, geophysics, geomorphology, hydrogeology, geotechnics, geodesy, remote sensing and meteorology. Specifically, this study focuses on two main axes of these techniques, namely geological and geodetic, selected because of their efficiency and power in understanding the landslide mechanism in the area. The main approaches used in these studies comprise geological drilling, inclinometer measurements, GPS surveys and SAR (both satellite and ground-based) techniques. Integration of the results gathered from these techniques led the project team to comprehend critical aspects of the landslide phenomenon in the area and to produce precise landslide hazard maps that are basic instruments for resilient urban development.

  17. Improving Standard Poststratification Techniques For Random-Digit-Dialing Telephone Surveys

    Directory of Open Access Journals (Sweden)

    Michael P. Battaglia

    2008-03-01

    Random-digit-dialing surveys in the United States such as the Behavioral Risk Factor Surveillance System (BRFSS) typically poststratify on age, gender and race/ethnicity using control totals from an appropriate source such as the 2000 Census, the Current Population Survey, or the American Community Survey. Using logistic regression and interaction detection software, we identified key "main effect" socio-demographic variables and important two-factor interactions associated with several health risk factor outcomes measured in the BRFSS, one of the largest annual RDD surveys in the United States. A procedure was developed to construct control totals that were consistent with estimates of age, gender, and race/ethnicity obtained from a commercial source and with distributions of other demographic variables from the Current Population Survey. Raking was used to incorporate main-effect and two-factor interaction margins into the weighting of the BRFSS survey data. The resulting risk factor estimates were then compared with those based on the current BRFSS weighting methodology, and mean squared error estimates were developed. The research demonstrates that by identifying socio-demographic variables associated with key outcome variables and including these variables in the weighting methodology, nonresponse bias can be substantially reduced.
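
    The raking step mentioned above is iterative proportional fitting of the survey weights to external control totals. The short sketch below rakes invented weights to two invented margins (age group and sex); it handles main-effect margins only, whereas the paper also incorporates two-factor interaction margins, and all names and totals are placeholders.

```python
# Minimal raking (iterative proportional fitting) of unit weights to marginal control totals.
import numpy as np

def rake(weights, categories, control_totals, n_iter=50):
    """categories: {var: array of category codes per respondent};
    control_totals: {var: {code: population total}}."""
    w = weights.astype(float).copy()
    for _ in range(n_iter):
        for var, codes in categories.items():
            for code, target in control_totals[var].items():
                mask = codes == code
                current = w[mask].sum()
                if current > 0:
                    w[mask] *= target / current      # scale this margin cell to its control total
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 1000
    age = rng.integers(0, 3, n)                      # three age groups
    sex = rng.integers(0, 2, n)                      # two sex groups
    controls = {"age": {0: 400_000, 1: 350_000, 2: 250_000},
                "sex": {0: 510_000, 1: 490_000}}
    w = rake(np.ones(n), {"age": age, "sex": sex}, controls)
    print("age margins:", [round(float(w[age == a].sum())) for a in range(3)])
    print("sex margins:", [round(float(w[sex == s].sum())) for s in range(2)])
```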

  18. Comparative cost assessment of the Kato-Katz and FLOTAC techniques for soil-transmitted helminth diagnosis in epidemiological surveys

    Directory of Open Access Journals (Sweden)

    Speich Benjamin

    2010-08-01

    Background: The Kato-Katz technique is widely used for the diagnosis of soil-transmitted helminthiasis in epidemiological surveys and is believed to be an inexpensive method. The FLOTAC technique shows a higher sensitivity for the diagnosis of light-intensity soil-transmitted helminth infections but is reported to be more complex and expensive. We assessed the costs related to the collection, processing and microscopic examination of stool samples using the Kato-Katz and FLOTAC techniques in an epidemiological survey carried out in Zanzibar, Tanzania. Methods: We measured the time for the collection of a single stool specimen in the field, transfer to a laboratory, preparation and microscopic examination using standard protocols for the Kato-Katz and FLOTAC techniques. Salaries of health workers, life expectancy and asset costs of materials, and infrastructure costs were determined. The average cost for single or duplicate Kato-Katz thick smears and for the FLOTAC dual or double technique was calculated. Results: The average time needed to collect a stool specimen and perform a single or duplicate Kato-Katz thick smears or the FLOTAC dual or double technique was 20 min and 34 sec (20:34 min), 27:21 min, 28:14 min and 36:44 min, respectively. The total costs for single and duplicate Kato-Katz thick smears were US$ 1.73 and US$ 2.06, respectively, and for the FLOTAC double and dual technique US$ 2.35 and US$ 2.83, respectively. Salaries impacted most on the total costs of either method. Conclusions: The time and cost for soil-transmitted helminth diagnosis using either the Kato-Katz or FLOTAC method in epidemiological surveys are considerable. Our results can help to guide healthcare decision makers and scientists in budget planning and funding for epidemiological surveys, anthelminthic drug efficacy trials and monitoring of control interventions.
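
    The cost comparison above boils down to a per-sample model of labour time, consumables and amortized equipment. The sketch below reproduces that arithmetic with invented salary, consumable and equipment figures (only the labour times echo the reported 20:34 and 28:14 minutes), so its outputs are not the study's US$ values.

```python
# Back-of-the-envelope cost-per-sample model (all rates below are invented).
def cost_per_sample(minutes_labour: float, salary_usd_per_hour: float,
                    consumables_usd: float, equipment_usd: float, lifetime_tests: int) -> float:
    labour = (minutes_labour / 60.0) * salary_usd_per_hour
    amortized_equipment = equipment_usd / lifetime_tests
    return labour + consumables_usd + amortized_equipment

if __name__ == "__main__":
    single_kato_katz = cost_per_sample(20.57, salary_usd_per_hour=3.0,
                                       consumables_usd=0.30, equipment_usd=500, lifetime_tests=5000)
    flotac_dual = cost_per_sample(28.23, salary_usd_per_hour=3.0,
                                  consumables_usd=0.60, equipment_usd=900, lifetime_tests=5000)
    print(f"single Kato-Katz ~US$ {single_kato_katz:.2f}, FLOTAC dual ~US$ {flotac_dual:.2f} per sample")
```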

  19. Stereoscopic Visualization of Diffusion Tensor Imaging Data: A Comparative Survey of Visualization Techniques

    International Nuclear Information System (INIS)

    Raslan, O.; Debnam, J.M.; Ketonen, L.; Kumar, A.J.; Schellingerhout, D.; Wang, J.

    2013-01-01

    Diffusion tensor imaging (DTI) data have traditionally been displayed as a gray-scale fractional anisotropy map (GSFM) or a color-coded orientation map (CCOM). These methods use black and white, or color with intensity values, to map the complex multidimensional DTI data to a two-dimensional image. Alternative visualization techniques, such as Vmax maps, utilize enhanced graphical representation of the principal eigenvector by means of a headless arrow on a regular non-stereoscopic (VM) or stereoscopic display (VMS). A survey of clinical utility in patients with intracranial neoplasms was carried out by 8 neuroradiologists using traditional and nontraditional methods of DTI display. Pairwise comparison studies of 5 intracranial neoplasms were performed with a structured questionnaire comparing GSFM, CCOM, VM, and VMS. Six of 8 neuroradiologists favored Vmax maps over the traditional methods of display (GSFM and CCOM). When comparing the stereoscopic (VMS) and non-stereoscopic (VM) modes, 4 favored VMS, 2 favored VM, and 2 had no preference. In conclusion, processing and visualizing DTI data stereoscopically is technically feasible. An initial survey of users indicated that Vmax-based display methodology, with or without stereoscopic visualization, seems to be preferred over traditional methods to display DTI data.

  20. Evaluating autonomous acoustic surveying techniques for rails in tidal marshes

    Science.gov (United States)

    Stiffler, Lydia L.; Anderson, James T.; Katzner, Todd

    2018-01-01

    There is growing interest in the use of autonomous recording units (ARUs) for acoustic surveying of secretive marsh bird populations. However, there is little information on how ARUs compare to human surveyors or how best to use ARU data that can be collected continuously throughout the day. We used ARUs to conduct 2 acoustic surveys for king (Rallus elegans) and clapper rails (R. crepitans) within a tidal marsh complex along the Pamunkey River, Virginia, USA, during May-July 2015. To determine the effectiveness of an ARU in replacing human personnel, we compared results of callback point-count surveys with concurrent acoustic recordings and calculated estimates of detection probability for both rail species combined. The success of ARUs at detecting rails that human observers recorded decreased with distance (P ≤ 0.001), such that at 75 m, only 34.0% of human-detected rails were detected by the ARU. To determine a subsampling scheme for continuous ARU data that allows for effective surveying of presence and call rates of rails, we used ARUs to conduct 15 continuous 48-hr passive surveys, generating 720 hr of recordings. We established 5 subsampling periods of 5, 10, 15, 30, and 45 min to evaluate ARU-based presence and vocalization detections of rails compared with the full 60-min sampling period. All subsampling periods resulted in different (P ≤ 0.001) detection rates and unstandardized vocalization rates compared with the hourly sampling period. However, standardized vocalization counts from the 30-min subsampling period were not different from vocalization counts of the full hourly sampling period. When surveying rail species in estuarine environments, species-, habitat-, and ARU-specific limitations to ARU sampling should be considered when making inferences about abundances and distributions from ARU data.
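
    The subsampling comparison described above can be prototyped in a few lines: given minute-by-minute detections from continuous recordings, compare the fraction of hours with at least one detection when only the first 5, 10, 15, 30, or 45 minutes of each hour are scored against the full 60-minute hour. The detection series below is simulated with an arbitrary call rate, so the printed fractions are illustrative only.

```python
# Simulated comparison of hourly detection under different ARU subsampling windows.
import numpy as np

rng = np.random.default_rng(3)
hours, call_prob_per_min = 720, 0.02                        # 720 recorded hours, toy call rate
detections = rng.random((hours, 60)) < call_prob_per_min    # True = a rail call detected that minute

print(f"full 60 min: {detections.any(axis=1).mean():.2f} of hours with a detection")
for window in (5, 10, 15, 30, 45):
    frac = detections[:, :window].any(axis=1).mean()
    print(f"first {window:>2} min: {frac:.2f}")
```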

  1. Survey on peripheral techniques of brown coal liquefaction techniques; Kattan ekika gijutsu ni kansuru shuhen gijutsu no chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1980-09-01

    Described herein are the results of a survey on brown coal liquefaction techniques and peripheral techniques, centered on the COSTEAM process under development in the USA, on solubilization by alcohol and on liquefaction and cracking with the aid of tetrahydroquinoline as the hydrogen donor under development in Japan, as well as low-temperature carbonization and other promising new techniques. The COSTEAM process shows higher reaction rates, conversions and oil yields for brown coal liquefaction than the process using hydrogen gas. Some of the problems involved in this process are the high viscosity and high oxygenated-compound content of the product oil. The product oil is acceptable as fuel for power generating plants and can be produced at a moderate cost, but may be unsuitable as vehicle fuel. Coal liquefaction and solubilization processes are mainly represented by those which use hydrogen. The hydrogen cost, which is high, determines the product price. The processes which use alcohol or tetrahydroquinoline are still at the experimental stage. (NEDO)

  2. Using Intelligent Techniques in Construction Project Cost Estimation: 10-Year Survey

    Directory of Open Access Journals (Sweden)

    Abdelrahman Osman Elfaki

    2014-01-01

    Cost estimation is the most important preliminary process in any construction project. Therefore, construction cost estimation has the lion's share of the research effort in construction management. In this paper, we have analysed and studied proposals for construction cost estimation from the last 10 years. To implement this survey, we have proposed and applied a methodology that consists of two parts. The first part concerns data collection, for which we have chosen special journals as sources for the surveyed proposals. The second part concerns the analysis of the proposals. To analyse each proposal, the following four questions have been set: Which intelligent technique is used? How have data been collected? How are the results validated? And which construction cost estimation factors have been used? From the results of this survey, two main contributions have been produced. The first contribution is defining the research gap in this area, which has not been fully covered by previous proposals for construction cost estimation. The second contribution of this survey is proposing and highlighting future directions for forthcoming proposals, aimed ultimately at finding the optimal construction cost estimation. Moreover, we consider the second part of our methodology as one of our contributions in this paper. This methodology has been proposed as a standard benchmark for construction cost estimation proposals.

  3. Case-based reasoning diagnostic technique based on multi-attribute similarity

    Energy Technology Data Exchange (ETDEWEB)

    Makoto, Takahashi [Tohoku University, Miyagi (Japan); Akio, Gofuku [Okayama University, Okayamaa (Japan)

    2014-08-15

    A case-based diagnostic technique has been developed based on multi-attribute similarity. A specific feature of the developed system is the use of multiple attributes of process signals in the similarity evaluation used to retrieve a similar case stored in a case base. The present technique has been applied to measurement data from Monju with some simulated anomalies. The results of numerical experiments showed that the present technique can be utilized as one of the methods for a hybrid-type diagnosis system.
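
    A minimal sketch of multi-attribute similarity retrieval in the spirit of the technique, with invented attributes, weights and cases (this is not the Monju system): each stored case is a vector of signal attributes, a range-normalised weighted similarity is computed against the new observation, and the index of the most similar case is returned.

```python
# Weighted multi-attribute similarity retrieval over a small, invented case base.
import numpy as np

def retrieve_most_similar(query: np.ndarray, case_base: np.ndarray, weights: np.ndarray) -> int:
    """Return the index of the stored case with the highest weighted similarity to the query."""
    diff = np.abs(case_base - query)                        # per-attribute distance to each case
    scale = np.ptp(case_base, axis=0) + 1e-9                # normalise by each attribute's range
    similarity = 1.0 - diff / scale                         # 1 = identical, lower = less similar
    scores = (similarity * weights).sum(axis=1) / weights.sum()
    return int(np.argmax(scores))

if __name__ == "__main__":
    # Rows: stored anomaly cases; columns: attributes extracted from process signals
    # (e.g., mean level, trend, variance) - all values invented.
    case_base = np.array([[300.0,  0.01, 1.2],
                          [305.0, -0.20, 3.5],
                          [280.0,  0.00, 0.8]])
    weights = np.array([1.0, 2.0, 1.0])                     # emphasise the trend attribute
    query = np.array([304.0, -0.18, 3.0])
    print("most similar stored case:", retrieve_most_similar(query, case_base, weights))
```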

  4. Aerial radiation survey techniques for efficient characterization of large areas

    International Nuclear Information System (INIS)

    Sydelko, T.; Riedhauser, S.

    2006-01-01

    Full text: Accidental or intentional releases of radioactive isotopes over potentially very large surface areas can pose serious health risks to humans and ecological receptors. Timely and appropriate responses to these releases depend upon rapid and accurate characterization of impacted areas. These characterization efforts can be adversely impacted by heavy vegetation, rugged terrain, urban environments, and the presence of unknown levels of radioactivity. Aerial survey techniques have proven highly successful in measuring gamma emissions from radiological contaminants of concern quickly, efficiently, and safely. Examples of accidental releases include the unintentional distribution of uranium mining ores during transportation, the loss of uranium processing and waste materials, unintentional nuclear power plant emissions into the atmosphere, and the distribution of isotopes during major flooding events such as the one recently occurring in New Orleans. Intentional releases have occurred during depleted uranium ammunition test firing and wartime use by military organizations. The threat of radiological dispersion device (dirty bomb) use by terrorists is currently a major concern of many major cities worldwide. The U.S. Department of Energy, in cooperation with its Remote Sensing Laboratory and Argonne National Laboratory, has developed a sophisticated aerial measurement system for identifying the locations, types, and quantities of gamma-emitting radionuclides over extremely large areas. Helicopter-mounted NaI detectors are flown at low altitude and constant speed along parallel paths, measuring the full spectrum of gamma activity. Analytical procedures are capable of distinguishing between radiological contamination and changes in natural background emissions. Mapped and tabular results of these accurate, timely and cost-effective aerial gamma radiation surveys can be used to assist with emergency response actions, if necessary, and to focus more

  5. Instructional Uses of Web-Based Survey Software

    Directory of Open Access Journals (Sweden)

    Concetta A. DePaolo, Ph.D.

    2006-07-01

    Recent technological advances have led to changes in how instruction is delivered. Such technology can create opportunities to enhance instruction and make instructors more efficient in performing instructional tasks, especially if the technology is easy to use and requires no training. One such technology, web-based survey software, is extremely accessible for anyone with basic computer skills. Web-based survey software can be used for a variety of instructional purposes to streamline instructor tasks, as well as enhance instruction and communication with students. Following a brief overview of the technology, we discuss how Web Forms from nTreePoint can be used to conduct instructional surveys, collect course feedback, conduct peer evaluations of group work, collect completed assignments, schedule meeting times among multiple people, and aid in pedagogical research. We also discuss our experiences with these tasks within traditional on-campus courses and how they were enhanced or expedited by the use of web-based survey software.

  6. Comparison of acrylamide intake from Western and guideline based diets using probabilistic techniques and linear programming.

    Science.gov (United States)

    Katz, Josh M; Winter, Carl K; Buttrey, Samuel E; Fadel, James G

    2012-03-01

    Western and guideline-based diets were compared to determine if dietary improvements resulting from following dietary guidelines reduce acrylamide intake. Acrylamide forms in heat-treated foods and is a human neurotoxin and animal carcinogen. Acrylamide intake from the Western diet was estimated with probabilistic techniques using teenage (13-19 years) National Health and Nutrition Examination Survey (NHANES) food consumption estimates combined with FDA data on the levels of acrylamide in a large number of foods. Guideline-based diets were derived from NHANES data using linear programming techniques to comport with recommendations from the Dietary Guidelines for Americans, 2005. Whereas the guideline-based diets were more properly balanced and richer in fruits, vegetables, and other dietary components than the Western diets, acrylamide intake (mean ± SE) was significantly greater from the guideline-based diets than from the Western diets. The results demonstrate that linear programming techniques can be used to model specific diets for the assessment of toxicological and nutritional dietary components.
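
    The guideline-based diets in this study were built with linear programming. The toy model below shows the general shape of such a formulation with scipy's linprog: choose daily servings of a handful of foods to minimise acrylamide intake subject to calorie and fibre constraints. Every food value and bound is invented and has no connection to the NHANES or FDA data used in the paper.

```python
# Toy diet linear program: minimise acrylamide subject to calorie and fibre bounds.
import numpy as np
from scipy.optimize import linprog

foods = ["fries", "cereal", "fruit", "vegetables", "bread"]
acrylamide_ug = np.array([30.0, 7.0, 0.0, 0.5, 4.0])     # per serving (invented)
calories = np.array([365.0, 110.0, 80.0, 40.0, 75.0])    # per serving (invented)
fibre_g = np.array([3.0, 2.5, 3.0, 2.0, 1.5])            # per serving (invented)

# Constraints: 1800 <= calories <= 2200, fibre >= 25 g, 0-6 servings of any food.
A_ub = np.vstack([calories, -calories, -fibre_g])
b_ub = np.array([2200.0, -1800.0, -25.0])
res = linprog(c=acrylamide_ug, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 6)] * len(foods))

for name, servings in zip(foods, res.x):
    print(f"{name:10s} {servings:4.1f} servings/day")
print("acrylamide:", round(float(res.fun), 1), "ug/day")
```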

  7. Reasons for not changing to activity-based costing: a survey of Irish firms

    Directory of Open Access Journals (Sweden)

    Martin Quinn

    2017-04-01

    Full Text Available Purpose – This paper aims to report on a survey of medium and large Irish firms to ascertain reasons for not changing to more advanced costing techniques, namely, activity-based costing (ABC). Developments in technology and recent poor economic conditions would suggest that the technique could be adopted more by firms, as they make increased efforts to keep costs under control. Design/methodology/approach – A survey instrument was used to gather data drawing from the top 1,000 Irish firms. From a useable population of 821 organisations, a response rate of 20.75 per cent was achieved. Findings – Findings show a rate of adoption of ABC of 18.7 per cent, which is lower than previous studies in an Irish context. The level of information technology in firms is not a key factor for non-adoption. Instead, the main reasons for non-adoption revolve around stable existing costing methods, with which firms expressed satisfaction. Originality/value – This research suggests the adoption of ABC is not necessarily driven by external factors such as technology and economic shocks, at least in the context of Ireland. It also suggests that costing techniques may be deeply embedded within organisations and are less likely to be subject to change.

  8. Reconstructive techniques in transoral robotic surgery for head and neck cancer: a North American survey.

    Science.gov (United States)

    Konofaos, Petros; Hammond, Sarah; Ver Halen, Jon P; Samant, Sandeep

    2013-02-01

    Although the use of transoral robotic surgery for tumor extirpation is expanding, little is known about national trends in the reconstruction of resultant defects. An 18-question electronic survey was created by an expert panel of surgeons from the Department of Otolaryngology-Head and Neck Surgery and the Department of Plastic and Reconstructive Surgery at the University of Tennessee. Eligible participants were identified by the American Head and Neck Society Web site and from the Intuitive Surgical, Inc., Web site after review of surgeons trained in transoral robotic surgery techniques. Twenty-three of 27 preselected head and neck surgeons (85.18 percent) completed the survey. All respondents use transoral robotic surgery for head and neck tumor extirpation. The majority of the respondents [n = 17 (77.3 percent)] did not use any means of reconstruction. With respect to methods of reconstruction following transoral robotic surgery defects, the majority [n = 4 (80.0 percent)] used a free flap, a pedicled local flap [n = 3 (60.0 percent)], or a distant flap [n = 3 (60.0 percent)]. The radial forearm flap was the most commonly used free flap by all respondents. In general, the majority of survey respondents allow defects to heal secondarily or close primarily. Based on this survey, consensus indications for pedicled or free tissue transfer following transoral robotic surgery defects were primary head and neck tumors (stage T3 and T4a), pharyngeal defects with exposure of vital structures, and prior irradiation or chemoradiation to the operative site and neck.

  9. EBR Strengthening Technique for Concrete, Long-Term Behaviour and Historical Survey

    Directory of Open Access Journals (Sweden)

    Christoph Czaderski

    2018-01-01

    Full Text Available Epoxy bonded steel plates (externally bonded reinforcement: EBR) for the strengthening of concrete structures were introduced to the construction industry in the late 1960s, and the use of fibre reinforced polymers (FRPs) was introduced in the 1990s, which means that these techniques have already been used in construction for 50 and 25 years, respectively. In the first part of the paper, a historical survey of the development and introduction of these strengthening techniques into the construction industry is presented. The monitoring of such applications in construction is very important and gives more confidence in this strengthening technique. Therefore, in the second part of the paper, two long-term monitoring campaigns over an extraordinarily long duration are presented. Firstly, a 47-year monitoring campaign on a concrete beam with an epoxy bonded steel plate and, secondly, a 20-year monitoring campaign on a road bridge with epoxy bonded CFRP (carbon fibre reinforced polymer) strips are described. The paper is an expanded version of the paper presented at the SMAR2017 Conference.

  10. A SURVEY ON DELAY AND NEIGHBOR NODE MONITORING BASED WORMHOLE ATTACK PREVENTION AND DETECTION

    Directory of Open Access Journals (Sweden)

    Sudhir T Bagade

    2016-12-01

    Full Text Available In Mobile Ad-hoc Networks (MANETs), network layer attacks, for example wormhole attacks, disrupt the network routing operations and can be used for data theft. Wormhole attacks are of two types: hidden and exposed wormhole. There are various mechanisms in the literature which are used to prevent and detect wormhole attacks. In this paper, we survey wormhole prevention and detection techniques and present our critical observations for each. These techniques are based on cryptographic mechanisms, monitoring of packet transmission delay and control packet forwarding behavior of neighbor nodes. We compare the techniques using the following criteria: extra resources needed, applicability to different network topologies and routing protocols, prevention/detection capability, etc. We conclude the paper with potential research directions.

  11. Survey of technology for decommissioning of nuclear fuel cycle facilities. 8. Remote handling and cutting techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ogawa, Ryuichiro; Ishijima, Noboru [Japan Nuclear Cycle Development Inst., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1999-03-01

    In nuclear fuel cycle facility decommissioning and refurbishment, remote handling techniques such as dismantling, waste handling and decontamination are needed to reduce personnel radiation exposure. A survey was conducted on the status of worldwide R and D activities on remote handling tools suitable for nuclear facilities and on existing domestic commercial cutting tools applicable to decommissioning of such facilities. In addition, the drive mechanisms, sensing elements and control systems applicable to remote handling devices were also surveyed. This report presents brief summaries of the survey. (H. Itami)

  12. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
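
    As a minimal illustration of the metamodeling idea reviewed above, the sketch below samples an "expensive" analysis on a small factorial design and fits a second-order polynomial response surface to it with ordinary least squares. The Rosenbrock-style test function and the design levels are assumptions made for the example; real applications would replace them with the simulation code and an appropriate design of experiments.

```python
import numpy as np

def expensive_analysis(x1, x2):
    """Stand-in for an expensive simulation code (illustrative only)."""
    return (1 - x1) ** 2 + 100 * (x2 - x1 ** 2) ** 2

# Design of experiments: a small full-factorial sample over the design space
levels = np.linspace(-2, 2, 5)
X1, X2 = np.meshgrid(levels, levels)
x1, x2 = X1.ravel(), X2.ravel()
y = expensive_analysis(x1, x2)

# Second-order response surface: y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(x1, x2):
    """Cheap polynomial metamodel that can replace the expensive code in optimization loops."""
    return coef @ np.array([1.0, x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

print("expensive:", expensive_analysis(0.5, 0.5), "surrogate:", round(surrogate(0.5, 0.5), 2))
```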

  13. Composite Techniques Based Color Image Compression

    Directory of Open Access Journals (Sweden)

    Zainab Ibrahim Abood

    2017-03-01

    Full Text Available Compression of color images is now necessary for transmission and storage in databases, since color gives a pleasing and natural appearance to any object, so three composite techniques for color image compression are implemented to achieve images with high compression, no loss of the original image, better performance and good image quality. These techniques are the composite stationary wavelet technique (S), the composite wavelet technique (W) and the composite multi-wavelet technique (M). For the high energy sub-band of the 3rd level of each composite transform in each composite technique, the compression parameters are calculated. The best composite transform among the 27 types is the three levels of multi-wavelet transform (MMM) in the M technique, which has the highest values of energy (En) and compression ratio (CR) and the least values of bit per pixel (bpp), time (T) and rate distortion R(D). Also the values of the compression parameters of the color image are nearly the same as the average values of the compression parameters of the three bands of the same image.
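
    The sketch below illustrates the general flavor of wavelet-based image compression and the figures of merit mentioned in the abstract (CR, bpp, retained energy) for a single color band, using PyWavelets. It is not the authors' composite S/W/M pipeline; the Haar wavelet, the 5% coefficient-retention rule and the random test channel are assumptions made for the example.

```python
import numpy as np
import pywt

def wavelet_compress_channel(channel, wavelet="haar", level=3, keep=0.05):
    """Keep only the largest `keep` fraction of wavelet coefficients of one color channel."""
    coeffs = pywt.wavedec2(channel.astype(float), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)

    # Zero out all but the largest-magnitude coefficients
    threshold = np.quantile(np.abs(arr), 1.0 - keep)
    arr_compressed = np.where(np.abs(arr) >= threshold, arr, 0.0)

    # Simple compression figures of merit
    cr = arr.size / max(np.count_nonzero(arr_compressed), 1)   # compression ratio
    bpp = 8.0 / cr                                             # bits per pixel, assuming 8-bit input
    energy = np.sum(arr_compressed ** 2) / np.sum(arr ** 2)    # retained energy fraction

    coeffs_back = pywt.array_to_coeffs(arr_compressed, slices, output_format="wavedec2")
    reconstructed = pywt.waverec2(coeffs_back, wavelet)
    return reconstructed, cr, bpp, energy

channel = np.random.randint(0, 256, (256, 256))   # stand-in for one band of a color image
recon, cr, bpp, energy = wavelet_compress_channel(channel)
print(f"CR={cr:.1f}, bpp={bpp:.2f}, retained energy={energy:.3f}")
```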

  14. The use of advanced web-based survey design in Delphi research.

    Science.gov (United States)

    Helms, Christopher; Gardner, Anne; McInnes, Elizabeth

    2017-12-01

    A discussion of the application of metadata, paradata and embedded data in web-based survey research, using two completed Delphi surveys as examples. Metadata, paradata and embedded data use in web-based Delphi surveys has not been described in the literature. The rapid evolution and widespread use of online survey methods imply that paper-based Delphi methods will likely become obsolete. Commercially available web-based survey tools offer a convenient and affordable means of conducting Delphi research. Researchers and ethics committees may be unaware of the benefits and risks of using metadata in web-based surveys. Discussion paper. Two web-based, three-round Delphi surveys were conducted sequentially between August 2014 - January 2015 and April - May 2016. Their aims were to validate the Australian nurse practitioner metaspecialties and their respective clinical practice standards. Our discussion paper is supported by researcher experience and data obtained from conducting both web-based Delphi surveys. Researchers and ethics committees should consider the benefits and risks of metadata use in web-based survey methods. Web-based Delphi research using paradata and embedded data may introduce efficiencies that improve individual participant survey experiences and reduce attrition across iterations. Use of embedded data allows the efficient conduct of multiple simultaneous Delphi surveys across a shorter timeframe than traditional survey methods. The use of metadata, paradata and embedded data appears to improve response rates, identify bias and give possible explanation for apparent outlier responses, providing an efficient method of conducting web-based Delphi surveys. © 2017 John Wiley & Sons Ltd.

  15. Evaluating autonomous acoustic surveying techniques for rails in tidal marshes

    Science.gov (United States)

    Stiffler, Lydia L.; Anderson, James T.; Katzner, Todd

    2018-01-01

    There is a growing interest toward the use of autonomous recording units (ARUs) for acoustic surveying of secretive marsh bird populations. However, there is little information on how ARUs compare to human surveyors or how best to use ARU data that can be collected continuously throughout the day. We used ARUs to conduct 2 acoustic surveys for king (Rallus elegans) and clapper rails (R. crepitans) within a tidal marsh complex along the Pamunkey River, Virginia, USA, during May–July 2015. To determine the effectiveness of an ARU in replacing human personnel, we compared results of callback point‐count surveys with concurrent acoustic recordings and calculated estimates of detection probability for both rail species combined. The success of ARUs at detecting rails that human observers recorded decreased with distance (P ≤ 0.001), such that at <75 m a substantially higher proportion of human‐recorded rails also were detected by the ARU, but at >75 m, only 34.0% of human‐detected rails were detected by the ARU. To determine a subsampling scheme for continuous ARU data that allows for effective surveying of presence and call rates of rails, we used ARUs to conduct 15 continuous 48‐hr passive surveys, generating 720 hr of recordings. We established 5 subsampling periods of 5, 10, 15, 30, and 45 min to evaluate ARU‐based presence and vocalization detections of rails compared with the full 60‐min sampling of ARU‐based detection of rails. All subsampling periods resulted in different (P ≤ 0.001) detection rates and unstandardized vocalization rates compared with the hourly sampling period. However, standardized vocalization counts from the 30‐min subsampling period were not different from vocalization counts of the full hourly sampling period. When surveying rail species in estuarine environments, species‐, habitat‐, and ARU‐specific limitations to ARU sampling should be considered when making inferences about abundances and distributions from ARU data.

  16. Point cloud-based survey for cultural heritage – An experience of integrated use of range-based and image-based technology for the San Francesco convent in Monterubbiano

    Directory of Open Access Journals (Sweden)

    A. Meschini

    2014-06-01

    Full Text Available The paper aims at presenting some results of a point cloud-based survey carried out through integrated methodologies based on active and passive 3D acquisition techniques for processing 3D models. This experiment is part of a research project still in progress conducted by an interdisciplinary team from the School of Architecture and Design of Ascoli Piceno and funded by the University of Camerino. We describe an experiment conducted on the convent of San Francesco located in the Monterubbiano town center (Marche, Italy). The whole complex has undergone a number of substantial changes since the year of its foundation in 1247. The survey was based on an approach blending range-based 3D data acquired by a TOF laser scanner and image-based 3D data acquired using a UAV equipped with a digital camera, in order to survey some external parts difficult to reach with TLS. The integration of the two acquisition methods aimed to define a workflow suitable for processing dense 3D models from which to generate high-poly and low-poly 3D models useful for describing complex architectures for different purposes such as photorealistic representations, historical documentation, and risk assessment analyses based on Finite Element Methods (FEM).
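
    A tiny sketch of the data-fusion step is given below: an image-based (UAV) cloud is brought into the laser-scanner reference frame with a known rigid transform, the two clouds are merged, and the result is thinned with a voxel filter. The random point arrays, the identity registration transform and the 5 cm voxel size are placeholders; a real workflow would load the TLS and photogrammetric exports and estimate the registration from targets or an ICP-style alignment.

```python
import numpy as np

def transform_points(points, T):
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homogeneous @ T.T)[:, :3]

def voxel_downsample(points, voxel_size):
    """Keep one representative point (the centroid) per occupied voxel."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.bincount(inverse)
    np.add.at(sums, inverse, points)
    return sums / counts[:, None]

# Stand-ins for the two clouds; a real workflow would load the TLS and
# UAV-photogrammetry exports and estimate the registration transform.
tls_cloud = np.random.rand(10000, 3) * 10.0
uav_cloud = np.random.rand(5000, 3) * 10.0
T_uav_to_tls = np.eye(4)          # assumed known registration (identity here)

merged = np.vstack([tls_cloud, transform_points(uav_cloud, T_uav_to_tls)])
dense_model = voxel_downsample(merged, voxel_size=0.05)
print(merged.shape, "->", dense_model.shape)
```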

  17. Survey of agents and techniques applicable to the solidification of low-level radioactive wastes

    International Nuclear Information System (INIS)

    Fuhrmann, M.; Neilson, R.M. Jr.; Colombo, P.

    1981-12-01

    A review of the various solidification agents and techniques that are currently available or potentially applicable for the solidification of low-level radioactive wastes is presented. An overview of the types and quantities of low-level wastes produced is presented. Descriptions of waste form matrix materials, the wastes types for which they have been or may be applied and available information concerning relevant waste form properties and characteristics follow. Also included are descriptions of the processing techniques themselves with an emphasis on those operating parameters which impact upon waste form properties. The solidification agents considered in this survey include: hydraulic cements, thermoplastic materials, thermosetting polymers, glasses, synthetic minerals and composite materials. This survey is part of a program supported by the United States Department of Energy's Low-Level Waste Management Program (LLWMP). This work provides input into LLWMP efforts to develop and compile information relevant to the treatment and processing of low-level wastes and their disposal by shallow land burial

  18. Survey of agents and techniques applicable to the solidification of low-level radioactive wastes

    Energy Technology Data Exchange (ETDEWEB)

    Fuhrmann, M.; Neilson, R.M. Jr.; Colombo, P.

    1981-12-01

    A review of the various solidification agents and techniques that are currently available or potentially applicable for the solidification of low-level radioactive wastes is presented. An overview of the types and quantities of low-level wastes produced is presented. Descriptions of waste form matrix materials, the wastes types for which they have been or may be applied and available information concerning relevant waste form properties and characteristics follow. Also included are descriptions of the processing techniques themselves with an emphasis on those operating parameters which impact upon waste form properties. The solidification agents considered in this survey include: hydraulic cements, thermoplastic materials, thermosetting polymers, glasses, synthetic minerals and composite materials. This survey is part of a program supported by the United States Department of Energy's Low-Level Waste Management Program (LLWMP). This work provides input into LLWMP efforts to develop and compile information relevant to the treatment and processing of low-level wastes and their disposal by shallow land burial.

  19. Sci-Fri PM: Radiation Therapy, Planning, Imaging, and Special Techniques - 10: Results from Canada Wide Survey on Total Body Irradiation Practice

    Energy Technology Data Exchange (ETDEWEB)

    Studinski, Ryan; Fraser, Danielle; Samant, Rajiv; MacPherson, Miller [The Ottawa Hospital Cancer Centre, The Ottawa Hospital Cancer Centre, The Ottawa Hospital Cancer Centre, The Ottawa Hospital Cancer Centre (Canada)

    2016-08-15

    Purpose: Total Body Irradiation (TBI) is delivered to a relatively small number of patients with a variety of techniques; it has been a challenge to develop consensus studies for best practice. This survey was created to assess the current state of TBI in Canada. Methods: The survey was created with questions focusing on the radiation prescription, delivery technique and resources involved. The survey was circulated electronically to the heads of every clinical medical physics department in Canada. Responses were gathered and collated, and centres that were known to deliver TBI were urged to respond. Results: Responses from 20 centres were received, including 12 from centres that perform TBI. Although a variety of TBI dose prescriptions were reported, 12 Gy in 6 fractions was used in 11 centres while 5 centres use unique prescriptions. For dose rate, a range of 9 to 51 cGy/min was reported. Most centres use an extended SSD technique, with the patient standing or lying down against a wall. The rest use either a “sweeping” technique or a more complicated multi-field technique. All centres but one indicated that they shield the lungs, and only a minority shield other organs. The survey also showed that considerable resources are used for TBI including extra staffing, extended planning and treatment times and the use of locally developed hardware or software. Conclusions: This survey highlights that both similarities and important discrepancies exist between TBI techniques across the country, and is an opportunity to prompt more collaboration between centres.

  20. Practical guidelines for developing a smartphone-based survey instrument

    DEFF Research Database (Denmark)

    Ohme, Jakob; de Vreese, Claes Holger; Albæk, Erik

    The increasing relevance of mobile surveys makes it important to gather empirical evidence on designs of such surveys. This research note presents the results of a test study conducted to identify the best set-up for a smartphone-based survey. We base our analysis on a random sample of Danish...

  1. Array-based techniques for fingerprinting medicinal herbs

    Directory of Open Access Journals (Sweden)

    Xue Charlie

    2011-05-01

    Full Text Available Poor quality control of medicinal herbs has led to instances of toxicity, poisoning and even deaths. The fundamental step in quality control of herbal medicine is accurate identification of herbs. Array-based techniques have recently been adapted to authenticate or identify herbal plants. This article reviews the current array-based techniques, e.g. oligonucleotide microarrays, gene-based probe microarrays, Suppression Subtractive Hybridization (SSH)-based arrays, Diversity Array Technology (DArT) and Subtracted Diversity Array (SDA). We further compare these techniques according to important parameters such as markers, polymorphism rates, restriction enzymes and sample type. The applicability of the array-based methods for fingerprinting depends on the availability of genomics and genetics of the species to be fingerprinted. For species with little genome sequence information but high polymorphism rates, SDA techniques are particularly recommended because they require less labour and lower material cost.

  2. A Survey on Infrastructure-Based Vehicular Networks

    Directory of Open Access Journals (Sweden)

    Cristiano M. Silva

    2017-01-01

    Full Text Available The infrastructure of vehicular networks plays a major role in realizing the full potential of vehicular communications. More and more vehicles are connected to the Internet and to each other, driving new technological transformations in a multidisciplinary way. Researchers in automotive/telecom industries and academia are joining their effort to provide their visions and solutions to increasingly complex transportation systems, also envisioning a myriad of applications to improve the driving experience and the mobility. These trends pose significant challenges to the communication systems: low latency, higher throughput, and increased reliability have to be granted by the wireless access technologies and by a suitable (possibly dedicated infrastructure. This paper presents an in-depth survey of more than ten years of research on infrastructures, wireless access technologies and techniques, and deployment that make vehicular connectivity available. In addition, we identify the limitations of present technologies and infrastructures and the challenges associated with such infrastructure-based vehicular communications, also highlighting potential solutions.

  3. CNMI Shore-based Creel Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Commonwealth of the Northern Mariana Islands (CNMI), Division of Fish and Wildlife (DFW) staff conducted shore-based creel surveys which have 2 major...

  4. Industry Based Survey (IBS) Yellowtail

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The "Southern New England Yellowtail Flounder Industry-Based Survey" was a collaboration between the Rhode Island Division of Fish and Wildlife and the fishing...

  5. Terminating Sequential Delphi Survey Data Collection

    Science.gov (United States)

    Kalaian, Sema A.; Kasim, Rafa M.

    2012-01-01

    The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through well-designed and systematic multiple sequential rounds of survey administrations. Each of the multiple rounds of the Delphi survey…

  6. Industry Based Survey (IBS) Cod

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The "Gulf of Maine Atlantic Cod Industry-Based Survey" was a collaboration of the Massachusetts Division of Marine Fisheries and the fishing industry, with support...

  7. [Abortion in Brazil: a household survey using the ballot box technique].

    Science.gov (United States)

    Diniz, Debora; Medeiros, Marcelo

    2010-06-01

    This study presents the first results of the National Abortion Survey (PNA, Pesquisa Nacional de Aborto), a household random sample survey fielded in 2010 covering urban women in Brazil aged 18 to 39 years. The PNA combined two techniques, interviewer-administered questionnaires and self-administered ballot box questionnaires. The results of the PNA show that by the end of their reproductive lives one in five women has had an abortion, with abortions being more frequent at the main reproductive ages, that is, from 18 to 29 years old. No relevant differentiation was observed in the practice of abortion among religious groups, but abortion was found to be more common among people with lower education. The use of medical drugs to induce abortion occurred in half of the abortions, and post-abortion hospitalization was observed among approximately half of the women who aborted. Such results lead to the conclusion that abortion is a priority in the Brazilian public health agenda.

  8. Localization in Wireless Sensor Networks: A Survey on Algorithms, Measurement Techniques, Applications and Challenges

    Directory of Open Access Journals (Sweden)

    Anup Kumar Paul

    2017-10-01

    Full Text Available Localization is an important aspect in the field of wireless sensor networks (WSNs that has developed significant research interest among academia and research community. Wireless sensor network is formed by a large number of tiny, low energy, limited processing capability and low-cost sensors that communicate with each other in ad-hoc fashion. The task of determining physical coordinates of sensor nodes in WSNs is known as localization or positioning and is a key factor in today’s communication systems to estimate the place of origin of events. As the requirement of the positioning accuracy for different applications varies, different localization methods are used in different applications and there are several challenges in some special scenarios such as forest fire detection. In this paper, we survey different measurement techniques and strategies for range based and range free localization with an emphasis on the latter. Further, we discuss different localization-based applications, where the estimation of the location information is crucial. Finally, a comprehensive discussion of the challenges such as accuracy, cost, complexity, and scalability are given.
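
    For the range-based family of methods surveyed above, the classic building block is least-squares trilateration from distance estimates to anchor nodes. The sketch below linearizes the range equations and solves them with least squares; the anchor coordinates, the noise level and the two-dimensional setting are assumptions chosen for the example.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares position estimate from distances to known anchor nodes.

    Linearizes ||x - a_i||^2 = d_i^2 by subtracting the equation of the last
    anchor, giving a linear system A x = b solved with least squares.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    ref = anchors[-1]
    A = 2.0 * (anchors[:-1] - ref)
    b = (d[-1] ** 2 - d[:-1] ** 2
         + np.sum(anchors[:-1] ** 2, axis=1) - np.sum(ref ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Three anchors with known coordinates and noisy range estimates to an unknown node
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_position = np.array([4.0, 6.0])
distances = [np.linalg.norm(true_position - np.array(a)) + np.random.normal(0, 0.1)
             for a in anchors]
print(trilaterate(anchors, distances))
```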

  9. Radiation techniques used in patients with breast cancer: Results of a survey in Spain

    Science.gov (United States)

    Algara, Manuel; Arenas, Meritxell; De las Peñas Eloisa Bayo, Dolores; Muñoz, Julia; Carceller, José Antonio; Salinas, Juan; Moreno, Ferran; Martínez, Francisco; González, Ezequiel; Montero, Ángel

    2012-01-01

    Aim To evaluate the resources and techniques used in the irradiation of patients with breast cancer after lumpectomy or mastectomy and the status of implementation of new techniques and therapeutic schedules in our country. Background The demand for cancer care has increased among the Spanish population, as cancer treatment innovations have proliferated. Radiation therapy in breast cancer has evolved exponentially in recent years with the implementation of three-dimensional conformal radiotherapy, intensity modulated radiotherapy, image guided radiotherapy and hypofractionation. Material and Methods An original survey questionnaire was sent to institutions participating in the SEOR-Mama group (GEORM). In total, the standards of practice in 969 patients with breast cancer after surgery were evaluated. Results The response rate was 70% (28/40 centers). In 98.5% of cases 3D conformal treatment was used. All the institutions employed CT-based treatment planning. A boost was performed in 56.4% of patients: electrons in 59.8%, photons in 23.7% and HDR brachytherapy in 8.8%. Fractionation was standard in 93.1% of patients. The supine position was the most frequent. Only 3 centers used the prone position. The organs at risk commonly delineated were the ipsilateral lung (80.8%) and the heart (80.8%). In 84% histograms were used. 80.8% of the centers used an isocentric technique. In 62.5% asymmetric fields were employed. CTV was delineated in 46.2%, PTV in 65% and both in 38.5%. 65% of the centers verified treatment with portal films. IMRT and hypofractionation were used in 1% and 5.5% of cases, respectively. Conclusion In most centers, 3D conformal treatment and CT-based treatment planning were used. IMRT and hypofractionation are currently poorly implemented in Spain. PMID:24377012

  10. Web-based surveys as an alternative to traditional mail methods.

    Science.gov (United States)

    Fleming, Christopher M; Bowden, Mark

    2009-01-01

    Environmental economists have long used surveys to gather information about people's preferences. A recent innovation in survey methodology has been the advent of web-based surveys. While the Internet appears to offer a promising alternative to conventional survey administration modes, concerns exist over potential sampling biases associated with web-based surveys and the effect these may have on valuation estimates. This paper compares results obtained from a travel cost questionnaire of visitors to Fraser Island, Australia, that was conducted using two alternate survey administration modes; conventional mail and web-based. It is found that response rates and the socio-demographic make-up of respondents to the two survey modes are not statistically different. Moreover, both modes yield similar consumer surplus estimates.

  11. Statistical techniques applied to aerial radiometric surveys (STAARS): series introduction and the principal-components-analysis method

    International Nuclear Information System (INIS)

    Pirkle, F.L.

    1981-04-01

    STAARS is a new series which is being published to disseminate information concerning statistical procedures for interpreting aerial radiometric data. The application of a particular data interpretation technique to geologic understanding for delineating regions favorable to uranium deposition is the primary concern of STAARS. Statements concerning the utility of a technique on aerial reconnaissance data as well as detailed aerial survey data will be included
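
    The principal-components step can be sketched in a few lines: standardize the count-rate channels, diagonalize their covariance matrix, and project each survey sample onto the resulting components. The synthetic gamma-count matrix below is a placeholder for real aerial radiometric channels (e.g. total count and K, U, Th windows); only the mechanics of the method are being illustrated, not the STAARS procedures themselves.

```python
import numpy as np

# Stand-in aerial radiometric data: rows are survey samples, columns are
# count-rate channels (e.g. total count, K, U, Th windows); values are illustrative.
rng = np.random.default_rng(0)
counts = rng.gamma(shape=5.0, scale=20.0, size=(500, 4))

# Principal components analysis via the covariance matrix of standardized channels
standardized = (counts - counts.mean(axis=0)) / counts.std(axis=0)
cov = np.cov(standardized, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Sort components by explained variance (eigh returns ascending order)
order = np.argsort(eigenvalues)[::-1]
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]
scores = standardized @ eigenvectors          # component scores for each survey sample

explained = eigenvalues / eigenvalues.sum()
print("explained variance ratio:", np.round(explained, 3))
print("first sample scores:", np.round(scores[0], 2))
```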

  12. MCNP Techniques for Modeling Sodium Iodide Spectra of Kiwi Surveys

    International Nuclear Information System (INIS)

    Robert B Hayes

    2007-01-01

    This work demonstrates how MCNP can be used to predict the response of mobile search and survey equipment from base principles. The instrumentation evaluated comes from the U.S. Department of Energy's Aerial Measurement Systems. Through reconstructing detector responses to various point-source measurements, detector responses to distributed sources can be estimated through superposition. Use of this methodology for currently deployed systems allows predictive determinations of activity levels and distributions for common configurations of interest. This work helps determine the quality and efficacy of certain surveys in fully characterizing an affected site following a radiological event of national interest.
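
    The superposition idea is easy to demonstrate: once a point-source response function is known (from MCNP runs or calibration measurements), a distributed source can be approximated as a grid of point sources and the predicted count rate summed over the grid. In the sketch below the response function, attenuation coefficient, grid spacing and activities are all illustrative assumptions, not values from the work described above.

```python
import numpy as np

def detector_response(distance_m, activity_bq, response_1m_cps_per_bq=1e-4, mu_air=0.008):
    """Count rate from a point source, scaled by the inverse square law and air attenuation.

    response_1m_cps_per_bq would come from point-source calculations or calibration
    measurements; the numbers here are placeholders.
    """
    distance_m = np.maximum(distance_m, 1.0)
    return activity_bq * response_1m_cps_per_bq * np.exp(-mu_air * distance_m) / distance_m ** 2

# Distributed source approximated as a grid of point sources (superposition)
xs, ys = np.meshgrid(np.linspace(-50, 50, 21), np.linspace(-50, 50, 21))
cell_activity = 1e6       # Bq per grid cell, uniform for this example

detector_altitude = 30.0  # metres above ground
distances = np.sqrt(xs ** 2 + ys ** 2 + detector_altitude ** 2)
total_count_rate = np.sum(detector_response(distances, cell_activity))
print(f"predicted count rate: {total_count_rate:.1f} cps")
```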

  13. Conducting Surveys and Data Collection: From Traditional to Mobile and SMS-based Surveys

    Directory of Open Access Journals (Sweden)

    Iftikhar Alam

    2014-08-01

    Full Text Available Fresh, bias-free and valid data collected using different survey modes is considered an essential requirement for the smooth functioning and evolution of an organization. Surveys play a major role in making timely, correct decisions and generating reports. The aim of this study is to compare and investigate the state of the art in different survey modes including print, email, online, mobile and SMS-based surveys. Results indicated that existing methods are neither complete nor sufficient to fulfil the overall requirements of an organization which primarily relies on surveys. They also show that SMS is a dominant method for data collection due to its pervasiveness. However, existing SMS-based data collection has limitations such as a limited number of characters per SMS, a single question per SMS and a lack of multimedia support. Recent trends in data collection emphasize data collection applications for smartphones. However, in developing countries low-end mobile devices are still extensively used, which makes data collection from the man in the street difficult. The paper concludes that existing survey modes and methods should be improved to get maximum responses quickly and at low cost. The study has contributed to the area of surveying and data collection by analysing different factors such as cost, time and response rate. The results of this study can help practitioners in creating a more successful surveying method for data collection that can be effectively used for low budget projects in developed as well as developing countries.

  14. The mobile image quality survey game

    Science.gov (United States)

    Rasmussen, D. René

    2012-01-01

    In this paper we discuss human assessment of the quality of photographic still images, that are degraded in various manners relative to an original, for example due to compression or noise. In particular, we examine and present results from a technique where observers view images on a mobile device, perform pairwise comparisons, identify defects in the images, and interact with the display to indicate the location of the defects. The technique measures the response time and accuracy of the responses. By posing the survey in a form similar to a game, providing performance feedback to the observer, the technique attempts to increase the engagement of the observers, and to avoid exhausting observers, a factor that is often a problem for subjective surveys. The results are compared with the known physical magnitudes of the defects and with results from similar web-based surveys. The strengths and weaknesses of the technique are discussed. Possible extensions of the technique to video quality assessment are also discussed.

  15. Methodology of the National School-based Health Survey in Malaysia, 2012.

    Science.gov (United States)

    Yusoff, Fadhli; Saari, Riyanti; Naidu, Balkish M; Ahmad, Noor Ani; Omar, Azahadi; Aris, Tahir

    2014-09-01

    The National School-Based Health Survey 2012 was a nationwide school health survey of students in Standard 4 to Form 5 (10-17 years of age), who were schooling in government schools in Malaysia during the period of data collection. The survey comprised 3 subsurveys: the Global School Health Survey (GSHS), the Mental Health Survey, and the National School-Based Nutrition Survey. The aim of the survey was to provide data on the health status of adolescents in Malaysia toward strengthening the adolescent health program in the country. The design of the survey was created to fulfill the requirements of the 3 subsurveys. A 2-stage stratified sampling method was adopted in the sampling. The methods for data collection were via questionnaire and physical examination. The National School-Based Health Survey 2012 adopted an appropriate methodology for a school-based survey to ensure valid and reliable findings. © 2014 APJPH.

  16. Monitoring beach changes using GPS surveying techniques

    Science.gov (United States)

    Morton, Robert; Leach, Mark P.; Paine, Jeffrey G.; Cardoza, Michael A.

    1993-01-01

    A need exists for frequent and prompt updating of shoreline positions, rates of shoreline movement, and volumetric nearshore changes. To effectively monitor and predict these beach changes, accurate measurements of beach morphology incorporating both shore-parallel and shore-normal transects are required. Although it is possible to monitor beach dynamics using land-based surveying methods, it is generally not practical to collect data of sufficient density and resolution to satisfy a three-dimensional beach-change model of long segments of the coast. The challenge to coastal scientists is to devise new beach monitoring methods that address these needs and are rapid, reliable, relatively inexpensive, and maintain or improve measurement accuracy.

  17. Guam Boat-based Creel Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Similar to other boat-based surveys in basic design, this system is run by the Div. of Aquatic and Wildlife Resources (DAWR) and has been in operation since about...

  18. Survey on Chatbot Design Techniques in Speech Conversation Systems

    OpenAIRE

    Sameera A. Abdul-Kader; Dr. John Woods

    2015-01-01

    Human-Computer Speech is gaining momentum as a technique of computer interaction. There has been a recent upsurge in speech based search engines and assistants such as Siri, Google Chrome and Cortana. Natural Language Processing (NLP) techniques such as NLTK for Python can be applied to analyse speech, and intelligent responses can be found by designing an engine to provide appropriate human like responses. This type of programme is called a Chatbot, which is the focus of this study. This pap...

  19. Recommendations for abortion surveys using the ballot-box technique

    Directory of Open Access Journals (Sweden)

    Marcelo Medeiros

    2012-07-01

    Full Text Available The article lists recommendations for dealing with methodological aspects of an abortion survey and makes suggestions for testing and validating the survey questionnaire. The recommendations are based on the experience of the Brazilian Abortion Survey (PNA), a random sample household survey that used the ballot-box technique and covered adult women in all urban areas of the country.

  20. Reduced-Item Food Audits Based on the Nutrition Environment Measures Surveys.

    Science.gov (United States)

    Partington, Susan N; Menzies, Tim J; Colburn, Trina A; Saelens, Brian E; Glanz, Karen

    2015-10-01

    The community food environment may contribute to obesity by influencing food choice. Store and restaurant audits are increasingly common methods for assessing food environments, but are time consuming and costly. A valid, reliable brief measurement tool is needed. The purpose of this study was to develop and validate reduced-item food environment audit tools for stores and restaurants. Nutrition Environment Measures Surveys for stores (NEMS-S) and restaurants (NEMS-R) were completed in 820 stores and 1,795 restaurants in West Virginia, San Diego, and Seattle. Data mining techniques (correlation-based feature selection and linear regression) were used to identify survey items highly correlated to total survey scores and produce reduced-item audit tools that were subsequently validated against full NEMS surveys. Regression coefficients were used as weights that were applied to reduced-item tool items to generate scores comparable to the full NEMS surveys. Data were collected and analyzed in 2008-2013. The reduced-item tools included eight items for grocery, ten for convenience, seven for variety, and five for other stores; and 16 items for sit-down, 14 for fast casual, 19 for fast food, and 13 for specialty restaurants, about 10% of the full NEMS-S and 25% of the full NEMS-R. There were no significant differences in median scores for varying types of retail food outlets when compared to the full survey scores. Median in-store audit time was reduced 25%-50%. Reduced-item audit tools can reduce the burden and complexity of large-scale or repeated assessments of the retail food environment without compromising measurement quality. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
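
    The reduced-item construction described above can be mimicked in a few lines: rank items by their correlation with the total survey score, keep the top handful, and fit a linear regression whose coefficients become the weights that map reduced-item responses back to a full-survey-scale score. The synthetic 30-item audit data and the choice of an 8-item subset below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in audit data: 200 stores scored on 30 survey items (values illustrative)
items = rng.integers(0, 4, size=(200, 30)).astype(float)
total_score = items.sum(axis=1)

# Correlation-based feature selection: keep the items most correlated with the total
correlations = np.array([np.corrcoef(items[:, j], total_score)[0, 1]
                         for j in range(items.shape[1])])
keep = np.argsort(np.abs(correlations))[::-1][:8]          # reduced 8-item tool

# Linear regression gives weights mapping reduced-item responses to a full-survey score
A = np.column_stack([np.ones(len(total_score)), items[:, keep]])
weights, *_ = np.linalg.lstsq(A, total_score, rcond=None)

predicted = A @ weights
rmse = float(np.sqrt(np.mean((predicted - total_score) ** 2)))
print("kept items:", sorted(keep.tolist()))
print("RMSE of reduced-item score:", round(rmse, 2))
```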

  1. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them with each other and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to the tasks of nuclear power plant operators. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple due to the wide and varying range of applications in each specific domain. Operators' tasks in NPPs are supposed to be performed strictly according to written operational procedures for which operators are well trained, so the method of task analysis for operators' tasks in NPPs can be established to have its unique characteristic of being based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  2. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them with each other and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to the tasks of nuclear power plant operators. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple due to the wide and varying range of applications in each specific domain. Operators' tasks in NPPs are supposed to be performed strictly according to written operational procedures for which operators are well trained, so the method of task analysis for operators' tasks in NPPs can be established to have its unique characteristic of being based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  3. Farmer survey in the hinterland of Kisangani (Democratic Republic of Congo) on rodent crop damage and rodent control techniques used

    DEFF Research Database (Denmark)

    Drazo, Nicaise Amundala; Kennis, Jan; Leirs, Herwig

    2008-01-01

    We conducted a survey on rodent crop damage among farmers in the hinterland of Kisangani (Democratic Republic of Congo). We studied the amount of crop damage, the rodent groups causing crop damage, the growth stages affected and the control techniques used. We conducted this survey in three...... municipalities using a standard questionnaire form translated into local languages, between November 2005 and June 2006 and during July 2007. We used the Quotas method and interviewed 70 households per municipality. Farmers indicated rodent groups implicated in crop damage on color photographs. Two types...... of survey techniques were used: individual and focus-group surveys. The sugar cane rat, Thryonomys sp. and Lemniscomys striatus caused most damage to crops, but inside granaries, Rattus rattus was the primary pest species eating stored food supplies and causing damage to stored goods. Cassava and maize were...

  4. SNE's methodological basis - web-based software in entrepreneurial surveys

    DEFF Research Database (Denmark)

    Madsen, Henning

    This overhead based paper gives an introduction to the research methodology applied in the surveys carried out in the SNE-project.

  5. Technical basis for tumbleweed survey requirements and disposal criteria

    International Nuclear Information System (INIS)

    J. D. Arana

    2000-01-01

    This technical basis document describes the technique for surveying potentially contaminated tumbleweeds in areas where the Environmental Restoration Contractor has jurisdiction and the disposal criteria based on these survey results. The report also discusses the statistical basis for surveys and the historical basis for the assumptions that are used to interpret the surveys

  6. Technical Basis for Tumbleweed Survey Requirements and Disposal Criteria

    International Nuclear Information System (INIS)

    Arana, J.D.

    2000-01-01

    This technical basis document describes the technique for surveying potentially contaminated tumbleweeds in areas where the Environmental Restoration Contractor has jurisdiction and the disposal criteria based on these survey results. The report also discusses the statistical basis for surveys and the historical basis for the assumptions that are used to interpret the surveys

  7. Vector Quantization of Harmonic Magnitudes in Speech Coding Applications—A Survey and New Technique

    Directory of Open Access Journals (Sweden)

    Wai C. Chu

    2004-12-01

    Full Text Available A harmonic coder extracts the harmonic components of a signal and represents them efficiently using a few parameters. The principles of harmonic coding have become quite successful and several standardized speech and audio coders are based on it. One of the key issues in harmonic coder design is in the quantization of harmonic magnitudes, where many propositions have appeared in the literature. The objective of this paper is to provide a survey of the various techniques that have appeared in the literature for vector quantization of harmonic magnitudes, with emphasis on those adopted by the major speech coding standards; these include constant magnitude approximation, partial quantization, dimension conversion, and variable-dimension vector quantization (VDVQ. In addition, a refined VDVQ technique is proposed where experimental data are provided to demonstrate its effectiveness.
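
    A minimal vector-quantization sketch is shown below: a codebook is trained on harmonic-magnitude vectors (here random stand-ins already converted to a fixed dimension) and a new frame is encoded as the index of its nearest codeword. This illustrates plain VQ only, not the dimension-conversion or VDVQ refinements discussed in the paper; the 16-dimensional vectors and 256-entry codebook are arbitrary choices for the example.

```python
import numpy as np
from scipy.cluster.vq import kmeans, vq

rng = np.random.default_rng(2)

# Stand-in training data: harmonic magnitude vectors already converted to a
# fixed dimension (e.g. after dimension conversion), in dB.
train_magnitudes = rng.normal(loc=-20.0, scale=10.0, size=(2000, 16))

# Train a 256-entry codebook (8 bits per magnitude vector)
codebook, distortion = kmeans(train_magnitudes, 256)

# Quantize a new frame: the encoder transmits only the codebook index,
# and the decoder looks the vector back up in its copy of the codebook.
frame = rng.normal(loc=-20.0, scale=10.0, size=(1, 16))
index, quantization_error = vq(frame, codebook)
reconstructed = codebook[index[0]]
print("codebook index:", int(index[0]), "per-frame distortion:", float(quantization_error[0]))
print("first reconstructed magnitudes:", np.round(reconstructed[:4], 1))
```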

  8. Base Oils Biodegradability Prediction with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Malika Trabelsi

    2010-02-01

    Full Text Available In this paper, we apply various data mining techniques, including continuous numeric and discrete classification prediction models of base oil biodegradability, with emphasis on improving prediction accuracy. The results show that highly biodegradable oils can be better predicted through numeric models. In contrast, classification models did not uncover a similar dichotomy. With the exception of Memory Based Reasoning and Decision Trees, the tested classification techniques achieved high classification accuracy. However, the Decision Trees technique helped uncover the most significant predictor. A simple classification rule derived from this predictor resulted in good classification accuracy. The application of this rule enables efficient classification of base oils into either low or high biodegradability classes with high accuracy. For the latter, a higher precision biodegradability prediction can be obtained using continuous modeling techniques.
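
    To show the kind of decision-tree analysis described above, the sketch below fits a shallow tree to synthetic descriptor data and prints the resulting rule. The descriptor names, the synthetic labelling rule and the depth limit are assumptions for the demonstration; they are not the descriptors or the classification rule reported in the study.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(4)

# Stand-in data: molecular/physical descriptors for base oils (columns are illustrative)
n = 300
descriptors = rng.normal(size=(n, 3))              # e.g. viscosity, aromatic content, branching index
labels = (descriptors[:, 1] < 0.2).astype(int)     # 1 = highly biodegradable (synthetic rule for the demo)

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(descriptors, labels)

# The fitted tree exposes the most significant predictor and a simple classification rule
print(export_text(tree, feature_names=["viscosity", "aromatic_content", "branching_index"]))
print("training accuracy:", tree.score(descriptors, labels))
```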

  9. A Survey of Technologies Supporting Virtual Project Based Learning

    DEFF Research Database (Denmark)

    Dirckinck-Holmfeld, Lone

    2002-01-01

    This paper describes a survey of technologies and to what extent they support virtual project based learning. The paper argues that a survey of learning technologies should be related to concrete learning tasks and processes. Problem oriented project pedagogy (POPP) is discussed, and a framework...... for evaluation is proposed where negotiation of meaning, coordination and resource management are identified as the key concepts in virtual project based learning. Three e-learning systems are selected for the survey, Virtual-U, Lotus Learningspace and Lotus Quickplace, as each system offers different strategies...... for e-learning. The paper concludes that virtual project based learning may benefit from facilities of all these systems....

  10. [Estimating child mortality using the previous child technique, with data from health centers and household surveys: methodological aspects].

    Science.gov (United States)

    Aguirre, A; Hill, A G

    1988-01-01

    2 trials of the previous child or preceding birth technique in Bamako, Mali, and Lima, Peru, gave very promising results for measurement of infant and early child mortality using data on survivorship of the 2 most recent births. In the Peruvian study, another technique was tested in which each woman was asked about her last 3 births. The preceding birth technique described by Brass and Macrae has rapidly been adopted as a simple means of estimating recent trends in early childhood mortality. The questions formulated and the analysis of results are direct when the mothers are visited at the time of birth or soon after. Several technical aspects of the method believed to introduce unforeseen biases have now been studied and found to be relatively unimportant. But the problems arising when the data come from a nonrepresentative fraction of the total fertile-aged population have not been resolved. The analysis based on data from 5 maternity centers including 1 hospital in Bamako, Mali, indicated some practical problems and the information obtained showed the kinds of subtle biases that can result from the effects of selection. The study in Lima tested 2 abbreviated methods for obtaining recent early childhood mortality estimates in countries with deficient vital registration. The basic idea was that a few simple questions added to household surveys on immunization or diarrheal disease control for example could produce improved child mortality estimates. The mortality estimates in Peru were based on 2 distinct sources of information in the questionnaire. All women were asked their total number of live born children and the number still alive at the time of the interview. The proportion of deaths was converted into a measure of child survival using a life table. Then each woman was asked for a brief history of the 3 most recent live births. Dates of birth and death were noted in month and year of occurrence. The interviews took only slightly longer than the basic survey

  11. THE OPTIMIZATION OF TECHNOLOGICAL MINING PARAMETERS IN QUARRY FOR DIMENSION STONE BLOCKS QUALITY IMPROVEMENT BASED ON PHOTOGRAMMETRIC TECHNIQUES OF MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Ruslan Sobolevskyi

    2018-01-01

    Full Text Available This research focuses on patterns of change in the quality of commercial dimension stone blocks based on previously identified and measured geometrical parameters of natural cracks, and on modelling and planning the final dimensions of stone products and finished products based on the proposed digital photogrammetric techniques. The optimal parameters of surveying are investigated and the influence of surveying distance on crack length and area is estimated. Rational technological parameters of dimension stone block production are taken into account.

  12. Advantages and limitations of web-based surveys: evidence from a child mental health survey.

    Science.gov (United States)

    Heiervang, Einar; Goodman, Robert

    2011-01-01

    Web-based surveys may have advantages related to the speed and cost of data collection as well as data quality. However, they may be biased by low and selective participation. We predicted that such biases would distort point-estimates such as average symptom level or prevalence but not patterns of associations with putative risk-factors. A structured psychiatric interview was administered to parents in two successive surveys of child mental health. In 2003, parents were interviewed face-to-face, whereas in 2006 they completed the interview online. In both surveys, interviews were preceded by paper questionnaires covering child and family characteristics. The rate of parents logging onto the web site was comparable to the response rate for face-to-face interviews, but the rate of full response (completing all sections of the interview) was much lower for web-based interviews. Full response was less frequent for non-traditional families, immigrant parents, and less educated parents. Participation bias affected point estimates of psychopathology but had little effect on associations with putative risk factors. The time and cost of full web-based interviews was only a quarter of that for face-to-face interviews. Web-based surveys may be performed faster and at lower cost than more traditional approaches with personal interviews. Selective participation seems a particular threat to point estimates of psychopathology, while patterns of associations are more robust.

  13. A Review of the Piezoelectric Electromechanical Impedance Based Structural Health Monitoring Technique for Engineering Structures

    Directory of Open Access Journals (Sweden)

    Wongi S. Na

    2018-04-01

    Full Text Available The birth of smart materials such as piezoelectric (PZT) transducers has aided in revolutionizing the field of structural health monitoring (SHM) based on non-destructive testing (NDT) methods. While a relatively new NDT method known as the electromechanical impedance (EMI) technique has been investigated for more than two decades, there are still various problems that must be solved before it is applied to real structures. The technique, which has a significant potential to contribute to the creation of one of the most effective SHM systems, involves the use of a single PZT for exciting and sensing of the host structure. In this paper, studies from the past decade related to the EMI technique are reviewed to understand its trend. In addition, new concepts and ideas proposed by various authors are also surveyed, and the paper concludes with a discussion of potential directions for future work.
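
    A common way to turn EMI signatures into a scalar damage indicator, widely reported in this literature although not tied to any one study above, is the root-mean-square deviation (RMSD) between a baseline and a current impedance sweep. The sketch below computes it on simulated signatures; the frequency range, the baseline shape and the injected "damage" feature are invented for the example.

```python
import numpy as np

def rmsd_damage_metric(baseline_impedance, current_impedance):
    """Root-mean-square deviation between baseline and current EMI signatures.

    Uses the real part of the measured electrical impedance, which is the
    quantity most commonly tracked in EMI-based SHM studies.
    """
    z0 = np.real(np.asarray(baseline_impedance))
    z1 = np.real(np.asarray(current_impedance))
    return np.sqrt(np.sum((z1 - z0) ** 2) / np.sum(z0 ** 2)) * 100.0  # percent

# Simulated impedance signatures over a 30-400 kHz sweep (illustrative values)
freqs = np.linspace(30e3, 400e3, 500)
baseline = 100.0 + 10.0 * np.sin(freqs / 2e4)
damaged = baseline + 3.0 * np.exp(-((freqs - 2e5) / 1e4) ** 2)   # local signature change due to damage

print(f"RMSD = {rmsd_damage_metric(baseline, damaged):.2f} %")
```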

  14. Isotope techniques in a water survey

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1959-10-15

    The circulation of water is one of the most interesting of natural phenomena. Exact knowledge of fluctuations in precipitation and other factors in water circulation is extremely important for areas which have a very limited water supply. The information about the circulation of water is also important for the disposal of radioactive wastes on land and in the sea. Before satisfactory methods of disposal can be devised, it is essential to know precisely whether and to what extent the wastes can be transferred from one place to another as a result of the circulation of water. One of the most efficient ways of gathering such information is to study the isotopic ratios of hydrogen and oxygen in water in different areas. Tritium can serve as a tracer in the study of water circulation. A variety of information can be obtained by measurements of isotopic composition of water, e.g. the average age of the water molecule in a lake or the age, size, storage time and flow rate of a groundwater body. The modern tools of hydrological research cannot be employed by every country, because measurements of the isotopic composition of water require great technical skill and scientific knowledge. Besides, interpretation of isotope data in terms of hydrology and climatology requires the knowledge of certain basic data for the whole world or at least for large areas. A more complete knowledge of the worldwide variations in the isotopic composition of water would greatly facilitate the interpretation of local conditions. Guided by these considerations, the International Atomic Energy Agency has decided to initiate a study to determine the world-wide distribution of hydrogen and oxygen isotopes in water. On the basis of this study, it will be possible to make available basic data for the use of any country that wishes to apply isotope techniques for hydrological and climatological research. Under this project, it is proposed to collect samples of rain, river and ocean water in different

  15. A survey on the application of robot techniques to an atomic power plant

    International Nuclear Information System (INIS)

    Hasegawa, Tsutomu; Sato, Tomomasa; Hirai, Shigeoki; Suehiro, Takashi; Okada, Tokuji

    1982-01-01

    Tasks of workers in atomic power plants have been surveyed from the viewpoint of the necessity and possibility of their robotization. The daily tasks are classified into the following: (1) plant operation; (2) periodical examination; (3) patrol and inspection; (4) in-service inspection; (5) maintenance and repair; (6) examination and production of the fuel; (7) waste disposal; (8) decommissioning of the plant. The necessity and present status of robotization in atomic power plants are investigated according to the following classification: (1) inspection robots; (2) patrol inspection/maintenance robots; (3) hot cell robots; (4) plant decommissioning robots. The following have been made clear through the survey: (1) Various kinds of tasks are necessary for an atomic power plant; (2) Because most of the tasks take place in intense radiation environments, it is necessary to introduce robots into atomic power plants; (3) In the application of robots in atomic power plant systems, it is necessary to take account of various severe conditions concerning spatial restrictions, radiation endurance and reliability. Lastly, the wide applicability of the techniques of knowledge robots, which operate interactively with men, has been confirmed as a result of the survey. (author)

  16. Three-dimensional seismic survey planning based on the newest data acquisition design technique; Saishin no data shutoku design ni motozuku sanjigen jishin tansa keikaku

    Energy Technology Data Exchange (ETDEWEB)

    Minehara, M; Nakagami, K; Tanaka, H [Japan National Oil Corp., Tokyo (Japan). Technology Research Center

    1996-10-01

    The theory of parameter setting for data acquisition is outlined, mainly with respect to the seismic source and receiver geometry. This paper also introduces an example of survey planning for a three-dimensional land seismic exploration currently in progress. In data acquisition design, fundamental parameters are first determined on the basis of the characteristics of reflection records in a given district, and the survey layout is then determined. In this study, information from modeling based on the existing interpretation of geologic structures is also utilized and reflected in the survey specifications. A land three-dimensional seismic survey was designed. The ground surface of the surveyed area consists of rice fields and hilly regions. The target was a nose-shaped structure at a depth of about 2,500 m. A survey area of 4 km × 5 km was set. Records of the shallow layers could not be obtained when a near offset was not ensured, and quality control of the offset distribution was important for grasping the required shallow structure. In this survey, the seismic source points could be secured more reliably than initially expected, which ensured sufficient near-offset coverage. 2 refs., 2 figs.

  17. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    Directory of Open Access Journals (Sweden)

    Hong Zhang

    2013-01-01

    Full Text Available With the wide application of vision-based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activity, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of recent developments in these techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition.

  18. The socio-economic base line survey; first chapter of the handbook under preparation: "Managing farmers: a handbook for working with farmers in irrigation and drainage projects"

    NARCIS (Netherlands)

    Schrevel, A.

    2002-01-01

    The text The socio-economic base line survey is the first chapter of a book under preparation meant to instruct senior staff of irrigation and drainage projects on techniques to work with farmers. It informs the reader of best practices to set up and execute a socio-economic baseline survey. The

  19. Research on polonium-218 survey technique for uranium

    International Nuclear Information System (INIS)

    Zhou, R.

    1985-01-01

    This article expounds the principles and procedures of the 218Po survey technique for uranium. Large-scale experiments with the 218Po method on deposits of the granite, volcanic rock and carbon-siliceous slate types showed that the method is not only as effective as the track method and the 210Po method, but also has characteristics of its own. The device has a higher working efficiency, with only 5 minutes needed at each measurement point, and its sensitivity is higher, about 0.7 pulse/136 s (pCi/L). The results of measurements by the 218Po method are not affected by thorium emanation, and there is no contamination of the scintillation chamber by radon daughters. The ratio of anomalous peak value to background for the 218Po method proved to be higher than that for the track method and the 210Po method. To avoid the influence of moisture, measurements by the 218Po method should not be scheduled on rainy days, and the holes must be dug some distance away from ditches and rice fields, thus ensuring success in applying the method.

  20. A review on creatinine measurement techniques.

    Science.gov (United States)

    Mohabbati-Kalejahi, Elham; Azimirad, Vahid; Bahrami, Manouchehr; Ganbari, Ahmad

    2012-08-15

    This paper reviews recent global trends in creatinine measurement. Creatinine biosensors involve complex relationships between the biology of the blood sample and the micro-mechatronics to which it is subjected. Comparison between new and old methods shows that new techniques (e.g. molecularly imprinted polymer (MIP) based approaches) are better than old methods (e.g. ELISA) in terms of stability and linear range. All methods and their details for serum, plasma, urine and blood samples are surveyed. They are categorized into five main approaches: optical, electrochemical, impedimetric, ion-selective field-effect transistor (ISFET) based and chromatographic techniques. The response time, detection limit, linear range and selectivity of the reported sensors are discussed. The potentiometric technique has the lowest response time, 4-10 s, while the lowest detection limit, 0.28 nmol L(-1), belongs to the chromatographic technique. Comparison between the various measurement techniques indicates that the best selectivity belongs to the MIP-based and chromatographic techniques. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Discussion on emergency aerial survey system for practical use

    International Nuclear Information System (INIS)

    Moriuchi, Shigeru; Nagaoka, Toshi; Sakamoto, Ryuichi; Tsutsumi, Masahiro; Satio, Kimiaki; Amano, Hikaru; Matsunaga, Takeshi; Yanase, Nobuyuki; Kasai, Atsushi

    1989-02-01

    In 1980, after the occurrence of the TMI-2 accident in the United States, JAERI started research and development of aerial survey techniques, and in 1985 completed two prototype aerial survey systems, one for gamma-ray surveys and one for radioactivity monitoring. Following the Chernobyl reactor accident, which occurred in the Soviet Union in 1986, European countries carried out environmental radiological monitoring using their aerial survey systems and proved the effectiveness of aerial surveys in an emergency. This report describes the outline of the prototype survey systems developed at JAERI and presents practical survey techniques, including data processing, data analysis and example outputs. Based on our past experience, the report also makes some proposals concerning the practical construction and arrangement of aerial survey equipment and the establishment of an organization to take charge of practical emergency surveys and routine maintenance. (author)

  2. A Study on Integrated Community Based Flood Mitigation with Remote Sensing Technique in Kota Bharu, Kelantan

    International Nuclear Information System (INIS)

    Ainullotfi, A A; Ibrahim, A L; Masron, T

    2014-01-01

    This study is conducted to establish a community-based flood management system that is integrated with remote sensing techniques. To understand local knowledge, the demographics of the local society are obtained using a survey approach. The local authorities are approached first to obtain information regarding the society in the study areas, such as the population, gender and the distribution of settlements. Information about age, religion, ethnicity, occupation and years of experience facing floods in the area is recorded to understand how local knowledge emerges. Geographic data such as rainfall, land use, land elevation and river discharge are then obtained and used to establish a hydrological model of flooding in the study area. Analyses of the survey data are used to understand the pattern of society and how people react to floods, while analyses of the geographic data are used to assess the water extent and the damage done by the flood. The final result of this research is a flood mitigation method with a community-based framework for the state of Kelantan. By combining the community's understanding of floods with remote sensing techniques for forecasting heavy rainfall and flood occurrence, it is hoped that casualties and damage to the society and infrastructure in the study area can be reduced.

  3. Test description and preliminary pitot-pressure surveys for Langley Test Technique Demonstrator at Mach 6

    Science.gov (United States)

    Everhart, Joel L.; Ashby, George C., Jr.; Monta, William J.

    1992-01-01

    A propulsion/airframe integration experiment conducted in the NASA Langley 20-Inch Mach 6 Tunnel using a 16.8-in.-long version of the Langley Test Technique Demonstrator configuration with simulated scramjet propulsion is described. Schlieren and vapor screen visualization of the nozzle flow field is presented and correlated with pitot-pressure flow-field surveys. The data were obtained at a nominal free-stream Reynolds number of Re = 2.8 × 10^6 and a nominal engine total pressure of 100 psia. It is concluded that pitot-pressure surveys, coupled with schlieren and vapor-screen photographs and oil flows, have revealed flow features including vortices, free shear layers, and shock waves occurring in the model flow field.

  4. School-based survey of adolescents' opinion on premarital sex in ...

    African Journals Online (AJOL)

    PROF. BARTH EKWEME

    Method: A cross sectional descriptive survey design was used. The sample size was 313 senior secondary school students from four public secondary schools in Yakurr Local Government Area of Cross River State. Simple random sampling technique was used to select 313 students from 4 schools in Yakurr Local ...

  5. Assessing Dental Hygienists' Communication Techniques for Use with Low Oral Health Literacy Patients.

    Science.gov (United States)

    Flynn, Priscilla; Acharya, Amit; Schwei, Kelsey; VanWormer, Jeffrey; Skrzypcak, Kaitlyn

    2016-06-01

    The primary aim of this study was to assess communication techniques used with low oral health literacy patients by dental hygienists in rural Wisconsin dental clinics. A secondary aim was to determine the utility of the survey instrument used in this study. A mixed methods study consisting of a cross-sectional survey, immediately followed by focus groups, was conducted among dental hygienists in the Marshfield Clinic (Wisconsin) service area. The survey quantified the routine use of 18 communication techniques previously shown to be effective with low oral health literacy patients. Linear regression was used to analyze the association between routine use of each communication technique and several indicator variables, including geographic practice region, oral health literacy familiarity, communication skills training and demographic indicators. Qualitative analyses included code mapping to the 18 communication techniques identified in the survey, and generating new codes based on discussion content. On average, the 38 study participants routinely used 6.3 communication techniques. Dental hygienists who used an oral health literacy assessment tool reported using significantly more communication techniques compared to those who did not use an oral health literacy assessment tool. Focus group results differed from survey responses as few dental hygienists stated familiarity with the term "oral health literacy." Motivational interviewing techniques and using an integrated electronic medical-dental record were additional communication techniques identified as useful with low oral health literacy patients. Dental hygienists in this study routinely used approximately one-third of the communication techniques recommended for low oral health literacy patients, supporting the need for training on this topic. Based on focus group results, the survey used in this study warrants modification and psychometric testing prior to further use. Copyright © 2016 The American Dental

  6. Laser-based direct-write techniques for cell printing

    Energy Technology Data Exchange (ETDEWEB)

    Schiele, Nathan R; Corr, David T [Biomedical Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States); Huang Yong [Department of Mechanical Engineering, Clemson University, Clemson, SC (United States); Raof, Nurazhani Abdul; Xie Yubing [College of Nanoscale Science and Engineering, University at Albany, SUNY, Albany, NY (United States); Chrisey, Douglas B, E-mail: schien@rpi.ed, E-mail: chrisd@rpi.ed [Material Science and Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States)

    2010-09-15

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. (topical review)

  7. Laser-based direct-write techniques for cell printing

    International Nuclear Information System (INIS)

    Schiele, Nathan R; Corr, David T; Huang Yong; Raof, Nurazhani Abdul; Xie Yubing; Chrisey, Douglas B

    2010-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. (topical review)

  8. Survey of Analysis of Crime Detection Techniques Using Data Mining and Machine Learning

    Science.gov (United States)

    Prabakaran, S.; Mitra, Shilpa

    2018-04-01

    Data mining is the field containing procedures for finding patterns in huge datasets; it includes strategies at the intersection of machine learning and database systems. It can be applied to various fields such as healthcare, market basket analysis, education, manufacturing engineering, crime investigation, etc. Among these, crime investigation is an interesting application that processes crime characteristics to help society live better. This paper surveys various data mining techniques used in this domain. This study may be helpful in designing new strategies for crime prediction and analysis.

  9. Uranium exploration techniques

    International Nuclear Information System (INIS)

    Nichols, C.E.

    1984-01-01

    The subject is discussed under the headings: introduction (genetic description of some uranium deposits; typical concentrations of uranium in the natural environment); sedimentary host rocks (sandstones; tabular deposits; roll-front deposits; black shales); metamorphic host rocks (exploration techniques); geologic techniques (alteration features in sandstones; favourable features in metamorphic rocks); geophysical techniques (radiometric surveys; surface vehicle methods; airborne methods; input surveys); geochemical techniques (hydrogeochemistry; petrogeochemistry; stream sediment geochemistry; pedogeochemistry; emanometry; biogeochemistry); geochemical model for roll-front deposits; geologic model for vein-like deposits. (U.K.)

  10. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Cremers, D.A.; Archuleta, F.L.; Dilworth, H.C.

    1985-01-01

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  11. Categorizing natural disaster damage assessment using satellite-based geospatial techniques

    Science.gov (United States)

    Myint, S.W.; Yuan, M.; Cerveny, R.S.; Giri, C.

    2008-01-01

    Remote sensing of a natural disaster's damage offers an exciting backup and/or alternative to traditional means of on-site damage assessment. Although necessary for complete assessment of damage areas, ground-based damage surveys conducted in the aftermath of natural hazard passage can sometimes be potentially complicated due to on-site difficulties (e.g., interaction with various authorities and emergency services) and hazards (e.g., downed power lines, gas lines, etc.), the need for rapid mobilization (particularly for remote locations), and the increasing cost of rapid physical transportation of manpower and equipment. Satellite image analysis, because of its global ubiquity, its ability for repeated independent analysis, and, as we demonstrate here, its ability to verify on-site damage assessment provides an interesting new perspective and investigative aide to researchers. Using one of the strongest tornado events in US history, the 3 May 1999 Oklahoma City Tornado, as a case example, we digitized the tornado damage path and co-registered the damage path using pre- and post-Landsat Thematic Mapper image data to perform a damage assessment. We employed several geospatial approaches, specifically the Getis index, Geary's C, and two lacunarity approaches to categorize damage characteristics according to the original Fujita tornado damage scale (F-scale). Our results indicate strong relationships between spatial indices computed within a local window and tornado F-scale damage categories identified through the ground survey. Consequently, linear regression models, even incorporating just a single band, appear effective in identifying F-scale damage categories using satellite imagery. This study demonstrates that satellite-based geospatial techniques can effectively add spatial perspectives to natural disaster damages, and in particular for this case study, tornado damages.
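
    As a hedged illustration of the local spatial statistics mentioned above (not the authors' code), the sketch below computes Geary's C inside a moving window of a single-band raster using rook-contiguity weights; the window size and the input grid are assumptions made for the example. Values mapped this way could then be related to ground-surveyed F-scale categories, e.g. by linear regression, as the abstract describes.

```python
import numpy as np

def geary_c(window: np.ndarray) -> float:
    """Geary's C for a 2-D window with rook (4-neighbour) contiguity weights."""
    x = window.astype(float)
    n = x.size
    denom = ((x - x.mean()) ** 2).sum()
    if denom == 0:
        return np.nan                      # constant window: index undefined
    # Squared differences between horizontally and vertically adjacent cells
    diff2 = ((x[:, 1:] - x[:, :-1]) ** 2).sum() + ((x[1:, :] - x[:-1, :]) ** 2).sum()
    n_pairs = x[:, 1:].size + x[1:, :].size
    return (n - 1) * diff2 / (2 * n_pairs * denom)

def moving_geary(img: np.ndarray, win: int = 5) -> np.ndarray:
    """Slide a win x win window over the image and map Geary's C at each cell."""
    out = np.full(img.shape, np.nan)
    r = win // 2
    for i in range(r, img.shape[0] - r):
        for j in range(r, img.shape[1] - r):
            out[i, j] = geary_c(img[i - r:i + r + 1, j - r:j + r + 1])
    return out

# Toy usage: a random "image" standing in for a post-event Landsat band
damage_index = moving_geary(np.random.default_rng(0).random((60, 60)), win=5)
```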

  12. Developing a hybrid dictionary-based bio-entity recognition technique

    Science.gov (United States)

    2015-01-01

    Background Bio-entity extraction is a pivotal component for information extraction from biomedical literature. The dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. Methods This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities such as Part of Speech (POS) expansion, stemming, and the exploitation of the contextual cues to further improve the performance. Results The experimental results show that the proposed technique achieves the best or at least equivalent performance among compared techniques, GENIA, MESH, UMLS, and combinations of these three resources in F-measure. Conclusions The results imply that the performance of dictionary-based extraction techniques is largely influenced by information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance with three different dictionaries in precision whereas the context-only technique achieves a high-end performance with three different dictionaries in recall. PMID:26043907
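
    For illustration, the sketch below uses a plain Levenshtein edit distance to match a candidate mention against a small dictionary; the paper's shortest-path edit distance variant and its merging heuristics may differ, and the dictionary, threshold and example mention are assumptions.

```python
# Illustrative sketch only: dictionary lookup tolerant to small spelling variations.
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming (single-row version)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                  # deletion
                            curr[j - 1] + 1,              # insertion
                            prev[j - 1] + (ca != cb)))    # substitution
        prev = curr
    return prev[-1]

def best_dictionary_match(mention: str, dictionary, max_dist: int = 2):
    """Return the dictionary entry closest to the mention, if close enough."""
    best = min(dictionary, key=lambda entry: edit_distance(mention.lower(), entry.lower()))
    return best if edit_distance(mention.lower(), best.lower()) <= max_dist else None

print(best_dictionary_match("interleukin-2", ["interleukin 2", "interferon", "p53"]))
```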

  13. Developing a hybrid dictionary-based bio-entity recognition technique.

    Science.gov (United States)

    Song, Min; Yu, Hwanjo; Han, Wook-Shin

    2015-01-01

    Bio-entity extraction is a pivotal component for information extraction from biomedical literature. The dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities such as Part of Speech (POS) expansion, stemming, and the exploitation of the contextual cues to further improve the performance. The experimental results show that the proposed technique achieves the best or at least equivalent performance among compared techniques, GENIA, MESH, UMLS, and combinations of these three resources in F-measure. The results imply that the performance of dictionary-based extraction techniques is largely influenced by information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance with three different dictionaries in precision whereas the context-only technique achieves a high-end performance with three different dictionaries in recall.

  14. Quality of reporting web-based and non-web-based survey studies: What authors, reviewers and consumers should consider.

    Science.gov (United States)

    Turk, Tarek; Elhady, Mohamed Tamer; Rashed, Sherwet; Abdelkhalek, Mariam; Nasef, Somia Ahmed; Khallaf, Ashraf Mohamed; Mohammed, Abdelrahman Tarek; Attia, Andrew Wassef; Adhikari, Purushottam; Amin, Mohamed Alsabbahi; Hirayama, Kenji; Huy, Nguyen Tien

    2018-01-01

    Several influential aspects of survey research have been under-investigated and there is a lack of guidance on reporting survey studies, especially web-based projects. In this review, we aim to investigate the reporting practices and quality of both web- and non-web-based survey studies to enhance the quality of reporting medical evidence that is derived from survey studies and to maximize the efficiency of its consumption. Reporting practices and quality of 100 random web- and 100 random non-web-based articles published from 2004 to 2016 were assessed using the SUrvey Reporting GuidelinE (SURGE). The CHERRIES guideline was also used to assess the reporting quality of Web-based studies. Our results revealed a potential gap in the reporting of many necessary checklist items in both web-based and non-web-based survey studies including development, description and testing of the questionnaire, the advertisement and administration of the questionnaire, sample representativeness and response rates, incentives, informed consent, and methods of statistical analysis. Our findings confirm the presence of major discrepancies in reporting results of survey-based studies. This can be attributed to the lack of availability of updated universal checklists for quality of reporting standards. We have summarized our findings in a table that may serve as a roadmap for future guidelines and checklists, which will hopefully include all types and all aspects of survey research.

  15. Building Assessment Survey and Evaluation Data (BASE)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Building Assessment Survey and Evaluation (BASE) study was a five year study to characterize determinants of indoor air quality and occupant perceptions in...

  16. Australian survey on current practices for breast radiotherapy.

    Science.gov (United States)

    Dundas, Kylie L; Pogson, Elise M; Batumalai, Vikneswary; Boxer, Miriam M; Yap, Mei Ling; Delaney, Geoff P; Metcalfe, Peter; Holloway, Lois

    2015-12-01

    Detailed, published surveys specific to Australian breast radiotherapy practice were last conducted in 2002. More recent international surveys specific to breast radiotherapy practice include a European survey conducted in 2008/2009 and a Spanish survey conducted in 2009. Radiotherapy techniques continue to evolve, and the utilisation of new techniques, such as intensity-modulated radiation therapy (IMRT), is increasing. This survey aimed to determine current breast radiotherapy practices across Australia. An online survey was completed by 50 of the 69 Australian radiation therapy treatment centres. Supine tangential beam whole breast irradiation remains the standard of care for breast radiotherapy in Australia. A growing number of institutions are exploring prone positioning and IMRT utilisation. This survey demonstrated a wide variation in the benchmarks used to limit and report organ at risk doses, prescribed dose regimen, and post-mastectomy bolus practices. This survey also indicated, when compared with international literature, that there may be less interest in or uptake of external beam partial breast irradiation, prone positioning, simultaneous integrated boost and breath hold techniques. These are areas where further review and research may be warranted to ensure Australian patients are receiving the best care possible based on the best evidence available. This survey provides insight into the current radiotherapy practice for breast cancer in Australia. © 2015 The Royal Australian and New Zealand College of Radiologists.

  17. Geophex Airborne Unmanned Survey System

    International Nuclear Information System (INIS)

    Won, I.L.; Keiswetter, D.

    1995-01-01

    Ground-based surveys place personnel at risk due to the proximity of buried unexploded ordnance (UXO) items or by exposure to radioactive materials and hazardous chemicals. The purpose of this effort is to design, construct, and evaluate a portable, remotely-piloted, airborne, geophysical survey system. This non-intrusive system will provide stand-off capability to conduct surveys and detect buried objects, structures, and conditions of interest at hazardous locations. During a survey, the operators remain remote from, but within visual distance of, the site. The sensor system never contacts the Earth, but can be positioned near the ground so that weak geophysical anomalies can be detected. The Geophex Airborne Unmanned Survey System (GAUSS) is designed to detect and locate small-scale anomalies at hazardous sites using magnetic and electromagnetic survey techniques. The system consists of a remotely-piloted, radio-controlled, model helicopter (RCH) with flight computer, light-weight geophysical sensors, an electronic positioning system, a data telemetry system, and a computer base-station. The report describes GAUSS and its test results

  18. Geophex Airborne Unmanned Survey System

    Energy Technology Data Exchange (ETDEWEB)

    Won, I.L.; Keiswetter, D.

    1995-12-31

    Ground-based surveys place personnel at risk due to the proximity of buried unexploded ordnance (UXO) items or by exposure to radioactive materials and hazardous chemicals. The purpose of this effort is to design, construct, and evaluate a portable, remotely-piloted, airborne, geophysical survey system. This non-intrusive system will provide stand-off capability to conduct surveys and detect buried objects, structures, and conditions of interest at hazardous locations. During a survey, the operators remain remote from, but within visual distance of, the site. The sensor system never contacts the Earth, but can be positioned near the ground so that weak geophysical anomalies can be detected. The Geophex Airborne Unmanned Survey System (GAUSS) is designed to detect and locate small-scale anomalies at hazardous sites using magnetic and electromagnetic survey techniques. The system consists of a remotely-piloted, radio-controlled, model helicopter (RCH) with flight computer, light-weight geophysical sensors, an electronic positioning system, a data telemetry system, and a computer base-station. The report describes GAUSS and its test results.

  19. Use of cognitive interview techniques in the development of nutrition surveys and interactive nutrition messages for low-income populations.

    Science.gov (United States)

    Carbone, Elena T; Campbell, Marci K; Honess-Morreale, Lauren

    2002-05-01

    The effectiveness of dietary surveys and educational messages is dependent in part on how well the target audience's information processing needs and abilities are addressed. Use of pilot testing is helpful; however, problems with wording and language are often not revealed. Cognitive interview techniques offer 1 approach to assist dietitians in understanding how audiences process information. With this method, respondents are led through a survey or message and asked to paraphrase items; discuss thoughts, feelings, and ideas that come to mind; and suggest alternative wording. As part of a US Department of Agriculture-funded nutrition education project, 23 cognitive interviews were conducted among technical community college students in North Carolina. Interview findings informed the development of tailored computer messages and survey questions. Better understanding of respondents' cognitive processes significantly improved the language and approach used in this intervention. Interview data indicated 4 problem areas: vague or ineffective instructions, confusing questions and response options, variable interpretation of terms, and misinterpretation of dietary recommendations. Interviews also provided insight into the meaning of diet-related stages of change. These findings concur with previous research suggesting that cognitive interview techniques are a valuable tool in the formative evaluation and development of nutrition surveys and materials.

  20. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

    The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit the required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman Coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique to improve the final compression ratio by taking advantage of both. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman Coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of reencoding unused bits (we call them reencodable bits) in the instruction format for a specific application to improve the compression ratio. Reencoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all overhead incurred). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures
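
    As a hedged sketch of the Huffman-coding core that the scheme above builds on (the instruction splitting/re-encoding steps and the hardware decoder are not shown), the following builds a code table over hypothetical bit patterns and concatenates the resulting variable-length codes.

```python
import heapq
from collections import Counter

def huffman_codes(symbols):
    """Build a Huffman code table {symbol: bitstring} from an iterable of symbols."""
    freq = Counter(symbols)
    # Each heap item: (frequency, tie-breaker, {symbol: code-so-far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate case: a single distinct symbol
        (_, _, table), = heap
        return {s: "0" for s in table}
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, id(merged), merged))
    return heap[0][2]

patterns = ["1010", "1010", "0001", "1111", "1010", "0001"]   # hypothetical instruction patterns
table = huffman_codes(patterns)
compressed = "".join(table[p] for p in patterns)
print(table, compressed)
```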

  1. Using benchmarking techniques and the 2011 maternity practices infant nutrition and care (mPINC) survey to improve performance among peer groups across the United States.

    Science.gov (United States)

    Edwards, Roger A; Dee, Deborah; Umer, Amna; Perrine, Cria G; Shealy, Katherine R; Grummer-Strawn, Laurence M

    2014-02-01

    A substantial proportion of US maternity care facilities engage in practices that are not evidence-based and that interfere with breastfeeding. The CDC Survey of Maternity Practices in Infant Nutrition and Care (mPINC) showed significant variation in maternity practices among US states. The purpose of this article is to use benchmarking techniques to identify states within relevant peer groups that were top performers on mPINC survey indicators related to breastfeeding support. We used 11 indicators of breastfeeding-related maternity care from the 2011 mPINC survey and benchmarking techniques to organize and compare hospital-based maternity practices across the 50 states and Washington, DC. We created peer categories for benchmarking first by region (grouping states by West, Midwest, South, and Northeast) and then by size (grouping states by the number of maternity facilities and dividing each region into approximately equal halves based on the number of facilities). Thirty-four states had scores high enough to serve as benchmarks, and 32 states had scores low enough to reflect the lowest score gap from the benchmark on at least 1 indicator. No state served as the benchmark on more than 5 indicators and no state was furthest from the benchmark on more than 7 indicators. The small peer group benchmarks in the South, West, and Midwest were better than the large peer group benchmarks on 91%, 82%, and 36% of the indicators, respectively. In the West large, the Midwest large, the Midwest small, and the South large peer groups, 4-6 benchmarks showed that less than 50% of hospitals have ideal practice in all states. The evaluation presents benchmarks for peer group state comparisons that provide potential and feasible targets for improvement.
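
    A minimal sketch (not CDC code) of the peer-group benchmarking logic described above: states are grouped by region, each region is split into halves by facility count, and the best score on each indicator within a peer group is taken as the benchmark. The column names and toy data are assumptions.

```python
import pandas as pd

def peer_group_benchmarks(df: pd.DataFrame, indicators) -> pd.DataFrame:
    """Best indicator score within each (region, size) peer group."""
    df = df.copy()
    # Split each region into "small"/"large" halves by number of maternity facilities
    df["size_group"] = (
        df.groupby("region")["n_facilities"]
          .transform(lambda s: pd.qcut(s, 2, labels=["small", "large"]))
    )
    return df.groupby(["region", "size_group"])[list(indicators)].max()

# Toy usage with invented states and a single invented indicator
toy = pd.DataFrame({
    "state": ["A", "B", "C", "D"],
    "region": ["South", "South", "West", "West"],
    "n_facilities": [10, 60, 15, 55],
    "skin_to_skin": [71, 85, 90, 78],
})
print(peer_group_benchmarks(toy, ["skin_to_skin"]))
```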

  2. Nurse Practitioners' Use of Communication Techniques: Results of a Maryland Oral Health Literacy Survey.

    Science.gov (United States)

    Koo, Laura W; Horowitz, Alice M; Radice, Sarah D; Wang, Min Q; Kleinman, Dushanka V

    2016-01-01

    We examined nurse practitioners' use and opinions of recommended communication techniques for the promotion of oral health as part of a Maryland state-wide oral health literacy assessment. Use of recommended health-literate and patient-centered communication techniques have demonstrated improved health outcomes. A 27-item self-report survey, containing 17 communication technique items, across 5 domains, was mailed to 1,410 licensed nurse practitioners (NPs) in Maryland in 2010. Use of communication techniques and opinions about their effectiveness were analyzed using descriptive statistics. General linear models explored provider and practice characteristics to predict differences in the total number and the mean number of communication techniques routinely used in a week. More than 80% of NPs (N = 194) routinely used 3 of the 7 basic communication techniques: simple language, limiting teaching to 2-3 concepts, and speaking slowly. More than 75% of respondents believed that 6 of the 7 basic communication techniques are effective. Sociodemographic provider characteristics and practice characteristics were not significant predictors of the mean number or the total number of communication techniques routinely used by NPs in a week. Potential predictors for using more of the 7 basic communication techniques, demonstrating significance in one general linear model each, were: assessing the office for user-friendliness and ever taking a communication course in addition to nursing school. NPs in Maryland self-reported routinely using some recommended health-literate communication techniques, with belief in their effectiveness. Our findings suggest that NPs who had assessed the office for patient-friendliness or who had taken a communication course beyond their initial education may be predictors for using more of the 7 basic communication techniques. These self-reported findings should be validated with observational studies. Graduate and continuing education for NPs

  3. Synchrotron radiation based analytical techniques (XAS and XRF)

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2014-01-01

    A brief description of the principles of X-ray absorption spectroscopy (XAS) and X-ray fluorescence (XRF) techniques is given in this article with emphasis on the advantages of using synchrotron radiation-based instrumentation/beamline. XAS technique is described in more detail to emphasize the strength of the technique as a local structural probe. (author)

  4. A micro-controller based wide range survey meter

    International Nuclear Information System (INIS)

    Bhingare, R.R.; Bajaj, K.C.; Kannan, S.

    2004-01-01

    Wide range survey meters (1 μSv/h - 10 Sv/h) with the detector(s) mounted at the end of a two-to-four meter-long extendable tube are widely used for radiation protection surveys of difficult-to-reach locations and high dose rate areas. The commercially available survey meters of this type use two GM counters to cover a wide range of dose rate measurement. A new micro-controller based wide range survey meter using two Si diode detectors has been developed. The use of solid state detectors in the survey meter has a number of advantages, such as low power consumption, a lighter battery-powered detector probe, and the elimination of the high voltage needed to operate the detectors. For high reliability, the design uses infrared communication between the probe and the readout unit through a light-weight collapsible extension tube. The design details and features are discussed. (author)

  5. Comparing Four Touch-Based Interaction Techniques for an Image-Based Audience Response System

    NARCIS (Netherlands)

    Jorritsma, Wiard; Prins, Jonatan T.; van Ooijen, Peter M. A.

    2015-01-01

    This study aimed to determine the most appropriate touch-based interaction technique for I2Vote, an image-based audience response system for radiology education in which users need to accurately mark a target on a medical image. Four plausible techniques were identified: land-on, take-off,

  6. Multiagency radiation survey and site investigation manual (MARSSIM): Survey design

    International Nuclear Information System (INIS)

    Abelquist, E.W.; Berger, J.D.

    1996-01-01

    This paper describes the MultiAgency Radiation Survey and Site Investigation Manual (MARSSIM) strategy for designing a final status survey. The purpose of the final status survey is to demonstrate that release criteria established by the regulatory agency have been met. Survey design begins with identification of the contaminants and determination of whether the radionuclides of concern exist in background. The decommissioned site is segregated into Class 1, Class 2, and Class 3 areas, based on contamination potential, and each area is further divided into survey units. Appropriate reference areas for indoor and outdoor background measurements are selected. Survey instrumentation and techniques are selected to assure that the instrumentation is capable of detecting the contamination at the derived concentration guideline level (DCGL). Survey reference systems are established and the number of survey data points is determined, with the required number of data points distributed on a triangular grid pattern. Two statistical tests are used to evaluate data from final status surveys. For contaminants that are present in background, the Wilcoxon Rank Sum test is used; for contaminants that are not present in background, the Wilcoxon Signed Rank (or Sign) test is used. The number of data points needed to satisfy these nonparametric tests is based on the contaminant DCGL value, the expected standard deviation of the contaminant in background and in the survey unit, and the acceptable probability of making Type I and Type II decision errors. The MARSSIM also requires a reasonable level of assurance that any small areas of elevated residual radioactivity that could be significant relative to regulatory limits are not missed during the final status survey. Measurements and sampling on a specified grid size are used to obtain an adequate assurance level that small locations of elevated radioactivity will still satisfy DCGLs applicable to small areas.
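
    The two nonparametric tests named above can be illustrated with SciPy as below; this is only a sketch with invented measurements and a hypothetical DCGL, and it does not reproduce MARSSIM's exact hypothesis formulation, DCGL adjustments or decision-error calculations.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
dcgl = 5.0                                   # hypothetical derived concentration guideline level

reference = rng.normal(1.0, 0.3, 20)         # background reference-area measurements
survey_unit = rng.normal(1.2, 0.3, 20)       # survey-unit measurements

# Contaminant present in background: Wilcoxon Rank Sum test comparing the
# survey unit against the reference area.
wrs_stat, wrs_p = stats.ranksums(survey_unit, reference)

# Contaminant absent from background: Sign test of the survey unit against the
# DCGL, expressed here as a binomial test on the number of exceedances.
exceedances = int(np.sum(survey_unit > dcgl))
sign_p = stats.binomtest(exceedances, n=survey_unit.size, p=0.5).pvalue

print(f"WRS p-value: {wrs_p:.3f}, Sign test p-value: {sign_p:.3f}")
```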

  7. Comparing econometric and survey-based methodologies in measuring offshoring

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    such as the national or regional level. Most macro analyses are based on proxies and trade statistics with limitations. Drawing on unique Danish survey data, this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of employment...

  8. Efficient techniques for wave-based sound propagation in interactive applications

    Science.gov (United States)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. This spherical harmonic-based representation of source directivity can support analytical, data
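
    For reference, the second-order partial differential equation mentioned above is the acoustic wave equation; in the notation assumed here, p(x, t) is the sound pressure field and c the speed of sound in the medium.

```latex
\frac{\partial^2 p}{\partial t^2} = c^2 \nabla^2 p
```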

  9. Using a web-based survey tool to undertake a Delphi study: application for nurse education research.

    Science.gov (United States)

    Gill, Fenella J; Leslie, Gavin D; Grech, Carol; Latour, Jos M

    2013-11-01

    The Internet is increasingly being used as a data collection medium to access research participants. This paper reports on the experience and value of using web-survey software to conduct an eDelphi study to develop Australian critical care course graduate practice standards. The eDelphi technique used involved the iterative process of administering three rounds of surveys to a national expert panel. The survey was developed online using SurveyMonkey. Panel members responded to statements using one rating scale for round one and two scales for rounds two and three. Text boxes for panel comments were provided. For each round, the SurveyMonkey's email tool was used to distribute an individualized email invitation containing the survey web link. The distribution of panel responses, individual responses and a summary of comments were emailed to panel members. Stacked bar charts representing the distribution of responses were generated using the SurveyMonkey software. Panel response rates remained greater than 85% over all rounds. An online survey provided numerous advantages over traditional survey approaches including high quality data collection, ease and speed of survey administration, direct communication with the panel and rapid collation of feedback allowing data collection to be undertaken in 12 weeks. Only minor challenges were experienced using the technology. Ethical issues, specific to using the Internet to conduct research and external hosting of web-based software, lacked formal guidance. High response rates and an increased level of data quality were achieved in this study using web-survey software and the process was efficient and user-friendly. However, when considering online survey software, it is important to match the research design with the computer capabilities of participants and recognize that ethical review guidelines and processes have not yet kept pace with online research practices. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. The Desired Learning Outcomes of School-Based Nutrition/Physical Activity Health Education: A Health Literacy Constructed Delphi Survey of Finnish Experts

    Science.gov (United States)

    Ormshaw, Michael James; Kokko, Sami Petteri; Villberg, Jari; Kannas, Lasse

    2016-01-01

    Purpose: The purpose of this paper is to utilise the collective opinion of a group of Finnish experts to identify the most important learning outcomes of secondary-level school-based health education, in the specific domains of physical activity and nutrition. Design/ Methodology/ Approach: The study uses a Delphi survey technique to collect the…

  11. Is cell culture a risky business? Risk analysis based on scientist survey data.

    Science.gov (United States)

    Shannon, Mark; Capes-Davis, Amanda; Eggington, Elaine; Georghiou, Ronnie; Huschtscha, Lily I; Moy, Elsa; Power, Melinda; Reddel, Roger R; Arthur, Jonathan W

    2016-02-01

    Cell culture is a technique that requires vigilance from the researcher. Common cell culture problems, including contamination with microorganisms or cells from other cultures, can place the reliability and reproducibility of cell culture work at risk. Here we use survey data, contributed by research scientists based in Australia and New Zealand, to assess common cell culture risks and how these risks are managed in practice. Respondents show that sharing of cell lines between laboratories continues to be widespread. Arrangements for mycoplasma and authentication testing are increasingly in place, although scientists are often uncertain how to perform authentication testing. Additional risks are identified for preparation of frozen stocks, storage and shipping. © 2015 UICC.

  12. A Novel Technique for Steganography Method Based on Improved Genetic Algorithm Optimization in Spatial Domain

    Directory of Open Access Journals (Sweden)

    M. Soleimanpour-moghadam

    2013-06-01

    Full Text Available This paper is devoted to the study of secret message delivery using a cover image and introduces a novel steganographic technique based on a genetic algorithm to find a near-optimum structure for the pair-wise least-significant-bit (LSB) matching scheme. A survey of the related literature shows that the LSB matching method developed by Mielikainen employs a binary function to reduce the number of changes of LSB values. This method verifiably reduces the probability of detection and also improves the visual quality of stego images. Our proposal therefore draws on Mielikainen's technique to present an enhanced dual-state scoring model, structured upon a genetic algorithm, which assesses the performance of different orders for LSB matching and searches for a near-optimum solution among all the permutation orders. Experimental results confirm the superiority of the new approach compared to Mielikainen's pair-wise LSB matching scheme.
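
    As an illustration of the pair-wise LSB matching scheme the abstract refers to, the sketch below embeds two message bits into a pixel pair while changing at most one pixel by ±1; the genetic-algorithm search over embedding orders is not shown, and pixel-boundary handling (values 0 and 255) is omitted for brevity.

```python
import random

def lsb(v: int) -> int:
    return v & 1

def f(a: int, b: int) -> int:
    """Binary function used by pair-wise LSB matching: LSB(floor(a/2) + b)."""
    return lsb((a // 2) + b)

def embed_pair(p1: int, p2: int, m1: int, m2: int):
    """Embed two message bits into a pixel pair, changing at most one pixel by +/-1."""
    if m1 == lsb(p1):
        if m2 != f(p1, p2):
            p2 += random.choice((-1, 1))
    else:
        p1 = p1 - 1 if m2 == f(p1 - 1, p2) else p1 + 1
    return p1, p2

def extract_pair(p1: int, p2: int):
    return lsb(p1), f(p1, p2)

# Round-trip check on one pixel pair
stego = embed_pair(152, 67, 1, 0)
assert extract_pair(*stego) == (1, 0)
```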

  13. Techniques for Improving the Accuracy of 802.11 WLAN-Based Networking Experimentation

    Directory of Open Access Journals (Sweden)

    Portoles-Comeras Marc

    2010-01-01

    Full Text Available Wireless networking experimentation research has become highly popular due to both the frequent mismatch between theory and practice and the widespread availability of low-cost WLAN cards. However, current WLAN solutions present a series of performance issues, sometimes difficult to predict in advance, that may compromise the validity of the results gathered. This paper surveys recent literature dealing with such issues and draws attention to the negative consequences of starting experimental research without properly understanding the tools that are going to be used. Furthermore, the paper details how a conscious assessment strategy can prevent placing wrong assumptions on the hardware. Indeed, there are numerous techniques that have been described throughout the literature that can be used to obtain a deeper understanding of the solutions that have been adopted. The paper surveys these techniques and classifies them in order to provide a handy reference for building experimental setups from which accurate measurements may be obtained.

  14. Environmental monitoring using autonomous vehicles: a survey of recent searching techniques.

    Science.gov (United States)

    Bayat, Behzad; Crasta, Naveena; Crespi, Alessandro; Pascoal, António M; Ijspeert, Auke

    2017-06-01

    Autonomous vehicles are becoming an essential tool in a wide range of environmental applications that include ambient data acquisition, remote sensing, and mapping of the spatial extent of pollutant spills. Among these applications, pollution source localization has drawn increasing interest due to its scientific and commercial interest and the emergence of a new breed of robotic vehicles capable of operating in harsh environments without human supervision. The aim is to find the location of a region that is the source of a given substance of interest (e.g. a chemical pollutant at sea or a gas leakage in air) using a group of cooperative autonomous vehicles. Motivated by fast paced advances in this challenging area, this paper surveys recent advances in searching techniques that are at the core of environmental monitoring strategies using autonomous vehicles. Copyright © 2017 Elsevier Ltd. All rights reserved.
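
    For illustration only, the toy sketch below shows gradient climbing, one common family of source-seeking strategies covered by surveys of this kind; the concentration field, step size and stopping rule are all invented for the example and do not correspond to any specific method reviewed in the paper.

```python
import numpy as np

def concentration(p):                        # hypothetical plume with its peak at (3, -2)
    src = np.array([3.0, -2.0])
    return np.exp(-np.sum((p - src) ** 2))

def seek_source(start, step=0.2, eps=1e-3, max_iter=500):
    """Move a single vehicle up the locally estimated concentration gradient."""
    p = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        # Estimate the local gradient by finite differences of sensor readings
        grad = np.array([
            (concentration(p + [eps, 0]) - concentration(p - [eps, 0])) / (2 * eps),
            (concentration(p + [0, eps]) - concentration(p - [0, eps])) / (2 * eps),
        ])
        if np.linalg.norm(grad) < 1e-6:      # effectively at a local maximum
            break
        p += step * grad / np.linalg.norm(grad)
    return p

print(seek_source([0.0, 0.0]))               # ends up near the assumed source location
```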

  15. Towards a common standard - a reporting checklist for web-based stated preference valuation surveys and a critique for mode surveys

    DEFF Research Database (Denmark)

    Menegaki, Angeliki, N.; Olsen, Søren Bøye; Tsagarakis, Konstantinos P.

    2016-01-01

    . The checklist is developed based on the bulk of knowledge gained so far with web-based surveys. This knowledge is compiled based on an extensive review of relevant literature dated from 2001 to beginning of 2015 in the Scopus database. Somewhat surprisingly, relatively few papers are concerned with survey mode...

  16. Power system stabilizers based on modern control techniques

    Energy Technology Data Exchange (ETDEWEB)

    Malik, O P; Chen, G P; Zhang, Y; El-Metwally, K [Calgary Univ., AB (Canada). Dept. of Electrical and Computer Engineering

    1994-12-31

    Developments in digital technology have made it feasible to develop and implement improved controllers based on sophisticated control techniques. Power system stabilizers based on adaptive control, fuzzy logic and artificial neural networks are being developed. Each of these control techniques possesses unique features and strengths. In this paper, the relative performance of power system stabilizers based on adaptive control, fuzzy logic and neural networks, both in simulation studies and in real-time tests on a physical model of a power system, is presented and compared to that of a fixed-parameter conventional power system stabilizer. (author) 16 refs., 45 figs., 3 tabs.

  17. Shallow Depth Geophysical Investigation Through the Application of Magnetic and Electric Resistance Techniques: AN Evaluation Study of the Responses of Magnetic and Electric Resistance Techniques to Archaeogeophysical Prospection Surveys in Greece and Cyprus

    Science.gov (United States)

    Sarris, Apostolos

    The response characteristics of total intensity and vertical gradient magnetic techniques have been investigated in detail and compared with electric resistivity and other geophysical techniques. Four case studies from archaeological sites of Greece and Cyprus have been used as the experimental basis of this research project. Data from shallow depth geophysical investigations in these sites were collected over a period of four years. Interpretation of the geophysical results was based on the integration of the various prospecting methods. The results of the comparative study between the different techniques showed a strong correlation among all methods allowing the detection of certain features and the determination of their dimensions. The application of a large range of geophysical prospecting techniques in the surveyed archaeological sites has been able to detect the approximate position of the subsurface remains and to compare the different techniques in terms of the information that they reveal. Each one of these techniques has been used to examine the characteristic response of each method to the geophysical anomalies associated with the surveyed sites. Magnetic susceptibility measurements at two frequencies have identified areas and levels of intense human activity. A number of processing techniques such as low, high and band pass filtering in the spatial and frequency domain, computation of the residuals and fast Fourier transformation (FFT) of the magnetic potential data have been applied to the geophysical measurements. The subsequent convolution with filters representing apparent susceptibility, reduction to pole and equator, Gaussian and Butterworth regional and residual distributions, and inverse filtering in terms of spiking deconvolution have revealed a wealth of information necessary to obtain a more accurate picture of the concealed features. Inverse modelling of isolated magnetic anomalies has further enriched the information database of the
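
    A hedged sketch (not the survey's own processing chain) of one of the frequency-domain operations listed above: FFT-based regional/residual separation of a gridded magnetic map with a Gaussian low-pass filter. The grid, spacing and cutoff wavelength are illustrative assumptions.

```python
import numpy as np

def gaussian_regional_residual(grid: np.ndarray, dx: float, cutoff: float):
    """Split a gridded field into regional (long-wavelength) and residual parts.

    dx      -- grid spacing in metres
    cutoff  -- wavelength (m) at which the Gaussian low-pass falls to exp(-1/2)
    """
    ny, nx = grid.shape
    kx = np.fft.fftfreq(nx, d=dx)
    ky = np.fft.fftfreq(ny, d=dx)
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)       # radial wavenumber (cycles/m)
    lowpass = np.exp(-0.5 * (k * cutoff) ** 2)              # Gaussian filter in wavenumber space
    spec = np.fft.fft2(grid)
    regional = np.real(np.fft.ifft2(spec * lowpass))
    residual = grid - regional
    return regional, residual

# Toy usage: 128 x 128 grid, 1 m spacing, keep wavelengths longer than ~20 m as "regional"
field = np.random.default_rng(1).normal(size=(128, 128))
regional, residual = gaussian_regional_residual(field, dx=1.0, cutoff=20.0)
```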

  18. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques.

    Science.gov (United States)

    Bergquist, Magnus; Nilsson, Andreas; Hansla, André

    2017-01-01

    Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies participants (n = 347) were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1) and higher personal norms for non-targeted pro-environmental behaviors (Study 2). These findings suggest that a contest-based intervention technique frames a gain goal, while a norm-based intervention frames a normative goal.

  19. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques

    Directory of Open Access Journals (Sweden)

    Magnus Bergquist

    2017-11-01

    Full Text Available Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies participants (n = 347) were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1) and higher personal norms for non-targeted pro-environmental behaviors (Study 2). These findings suggest that a contest-based intervention technique frames a gain goal, while a norm-based intervention frames a normative goal.

  20. Comparison of passive soil vapor survey techniques at a Tijeras Arroyo site, Sandia National Laboratories, Albuquerque, New Mexico

    International Nuclear Information System (INIS)

    Eberle, C.S.; Wade, W.M.; Tharp, T.; Brinkman, J.

    1996-01-01

    Soil vapor surveys were performed to characterize the approximate location of soil contaminants at a hazardous waste site. The samplers were from two separate companies, and a comparison was made between the results of the two techniques. These results will be used to design further investigations at the site.

  1. Population-based absolute risk estimation with survey data

    Science.gov (United States)

    Kovalchik, Stephanie A.; Pfeiffer, Ruth M.

    2013-01-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
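
    As a rough numerical illustration of the absolute-risk quantity described above, the sketch below integrates the cause-1 cumulative incidence when each cause-specific hazard is a piecewise-constant baseline scaled by a fixed relative risk. It is a simplified toy under invented rates, with none of the survey weighting, nonparametric baseline, or variance machinery of the paper; it is not the authors' estimator.

```python
import numpy as np

def absolute_risk(cut_points, base_hazards, rel_risks, tau):
    """Absolute risk of cause 1 by time tau under competing risks.

    cut_points   : right edges of the piecewise-exponential intervals
    base_hazards : (n_causes, n_intervals) baseline hazard rates
    rel_risks    : (n_causes,) individualized relative risks
    """
    base_hazards = np.asarray(base_hazards, dtype=float)
    lam = base_hazards * np.asarray(rel_risks, dtype=float)[:, None]  # cause-specific hazards
    edges = np.concatenate(([0.0], np.asarray(cut_points, dtype=float)))
    risk, cum_all = 0.0, 0.0
    for j in range(len(cut_points)):
        a, b = edges[j], min(edges[j + 1], tau)
        if b <= a:
            break
        width = b - a
        total = lam[:, j].sum()          # all-cause hazard in interval j
        surv_a = np.exp(-cum_all)        # overall survival at the interval start
        # closed-form integral of lam1 * S(t) over [a, b] with constant hazards
        risk += lam[0, j] / total * surv_a * (1.0 - np.exp(-total * width))
        cum_all += total * width
    return risk

# toy example: two competing causes, five yearly intervals, invented rates
print(absolute_risk(cut_points=[1, 2, 3, 4, 5],
                    base_hazards=[[0.01] * 5, [0.02] * 5],
                    rel_risks=[1.8, 1.0], tau=5))
```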

  2. Worldwide enucleation techniques and materials for treatment of retinoblastoma: an international survey.

    Directory of Open Access Journals (Sweden)

    Daphne L Mourits

    Full Text Available To investigate the current practice of enucleation with or without orbital implant for retinoblastoma in countries across the world. A digital survey identifying operation techniques and material used for orbital implants after enucleation in patients with retinoblastoma. We received responses from 58 surgeons in 32 different countries. A primary artificial implant is routinely inserted by 42 (72.4%) surgeons. Ten (17.2%) surgeons leave the socket empty, three (5.2%) decide per case. Other surgeons insert a dermis fat graft as a standard primary implant (n=1), or fill the socket in a standard secondary procedure (n=2); one uses dermis fat grafts and one artificial implants. The choice for porous implants was more frequent than for non-porous implants: 27 (58.7%) and 15 (32.6%), respectively. Both porous and non-porous implant types are used by 4 (8.7%) surgeons. Twenty-five surgeons (54.3%) insert bare implants, 11 (23.9%) use separate wrappings, eight (17.4%) use implants with prefab wrapping and two insert implants with and without wrapping depending on the type of implant. Attachment of the muscles to the wrapping or implant (at various locations) is done by 31 (53.4%) surgeons. Eleven (19.0%) use a myoconjunctival technique, nine (15.5%) suture the muscles to each other and seven (12.1%) do not reattach the muscles. Measures to improve volume are implant exchange at an older age (n=4), the use of Restylane SQ (n=1) and osmotic expanders (n=1). Pegging is done by two surgeons. No (worldwide) consensus exists about the use of materials and techniques for enucleation for the treatment of retinoblastoma. Considerations for the use of different techniques are discussed.

  3. Comparison of Self-Reported Telephone Interviewing and Web-Based Survey Responses: Findings From the Second Australian Young and Well National Survey

    Science.gov (United States)

    Davenport, Tracey A; Burns, Jane M; Hickie, Ian B

    2017-01-01

    Background Web-based self-report surveying has increased in popularity, as it can rapidly yield large samples at a low cost. Despite this increase in popularity, in the area of youth mental health, there is a distinct lack of research comparing the results of Web-based self-report surveys with the more traditional and widely accepted computer-assisted telephone interviewing (CATI). Objective The Second Australian Young and Well National Survey 2014 sought to compare differences in respondent response patterns using matched items on CATI versus a Web-based self-report survey. The aim of this study was to examine whether responses varied as a result of item sensitivity, that is, the item’s susceptibility to exaggeration or underreporting, and to assess whether certain subgroups demonstrated this effect to a greater extent. Methods A subsample of young people aged 16 to 25 years (N=101), recruited through the Second Australian Young and Well National Survey 2014, completed the identical items on two occasions: via CATI and via Web-based self-report survey. Respondents also rated perceived item sensitivity. Results When comparing CATI with the Web-based self-report survey, a Wilcoxon signed-rank analysis showed that respondents answered 14 of the 42 matched items in a significantly different way. Significant variation in responses (CATI vs Web-based) was more frequent if the item was also rated by the respondents as highly sensitive in nature. Specifically, 63% (5/8) of the high sensitivity items, 43% (3/7) of the neutral sensitivity items, and 0% (0/4) of the low sensitivity items were answered in a significantly different manner by respondents when comparing their matched CATI and Web-based question responses. The items that were perceived as highly sensitive by respondents and demonstrated response variability included the following: sexting activities, body image concerns, experience of diagnosis, and suicidal ideation. For high sensitivity items, a regression

  4. Comparison of Self-Reported Telephone Interviewing and Web-Based Survey Responses: Findings From the Second Australian Young and Well National Survey.

    Science.gov (United States)

    Milton, Alyssa C; Ellis, Louise A; Davenport, Tracey A; Burns, Jane M; Hickie, Ian B

    2017-09-26

    Web-based self-report surveying has increased in popularity, as it can rapidly yield large samples at a low cost. Despite this increase in popularity, in the area of youth mental health, there is a distinct lack of research comparing the results of Web-based self-report surveys with the more traditional and widely accepted computer-assisted telephone interviewing (CATI). The Second Australian Young and Well National Survey 2014 sought to compare differences in respondent response patterns using matched items on CATI versus a Web-based self-report survey. The aim of this study was to examine whether responses varied as a result of item sensitivity, that is, the item's susceptibility to exaggeration or underreporting, and to assess whether certain subgroups demonstrated this effect to a greater extent. A subsample of young people aged 16 to 25 years (N=101), recruited through the Second Australian Young and Well National Survey 2014, completed the identical items on two occasions: via CATI and via Web-based self-report survey. Respondents also rated perceived item sensitivity. When comparing CATI with the Web-based self-report survey, a Wilcoxon signed-rank analysis showed that respondents answered 14 of the 42 matched items in a significantly different way. Significant variation in responses (CATI vs Web-based) was more frequent if the item was also rated by the respondents as highly sensitive in nature. Specifically, 63% (5/8) of the high sensitivity items, 43% (3/7) of the neutral sensitivity items, and 0% (0/4) of the low sensitivity items were answered in a significantly different manner by respondents when comparing their matched CATI and Web-based question responses. The items that were perceived as highly sensitive by respondents and demonstrated response variability included the following: sexting activities, body image concerns, experience of diagnosis, and suicidal ideation. For high sensitivity items, a regression analysis showed respondents who were male
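
    For readers unfamiliar with the matched-item comparison mentioned above, the sketch below shows how a Wilcoxon signed-rank test could be applied to paired mode-of-administration data. The responses are simulated and the snippet is not the study's actual analysis code.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)

# hypothetical matched responses: one value per respondent for each survey mode
cati = rng.integers(1, 6, size=101)                        # 5-point item asked by an interviewer
web = np.clip(cati + rng.integers(-1, 2, size=101), 1, 5)  # same item self-reported online

# paired, non-parametric test for a systematic shift between modes
stat, p = wilcoxon(cati, web, zero_method="wilcox")
print(f"Wilcoxon W={stat:.1f}, p={p:.3f}")
```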

  5. Nasal base narrowing: the combined alar base excision technique.

    Science.gov (United States)

    Foda, Hossam M T

    2007-01-01

    To evaluate the role of the combined alar base excision technique in narrowing the nasal base and correcting excessive alar flare. The study included 60 cases presenting with a wide nasal base and excessive alar flaring. The surgical procedure combined an external alar wedge resection with an internal vestibular floor excision. All cases were followed up for a mean of 32 (range, 12-144) months. Nasal tip modification and correction of any preexisting caudal septal deformities were always completed before the nasal base narrowing. The mean width of the external alar wedge excised was 7.2 (range, 4-11) mm, whereas the mean width of the sill excision was 3.1 (range, 2-7) mm. Completing the internal excision first resulted in a more conservative external resection, thus avoiding any blunting of the alar-facial crease. No cases of postoperative bleeding, infection, or keloid formation were encountered, and the external alar wedge excision healed with an inconspicuous scar that was well hidden in the depth of the alar-facial crease. Finally, the risk of notching of the alar rim, which can occur at the junction of the external and internal excisions, was significantly reduced by adopting a 2-layered closure of the vestibular floor (P = .01). The combined alar base excision resulted in effective narrowing of the nasal base with elimination of excessive alar flare. Commonly feared complications, such as blunting of the alar-facial crease or notching of the alar rim, were avoided by using simple modifications in the technique of excision and closure.

  6. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...

  7. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that combines an edge detection technique, Markov Random Field (MRF) modeling, watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation result is obtained based on a K-means clustering technique and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is applied. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image and so obtain the DIS map. This serves as prior knowledge about the likelihood of region boundaries for the next step (MRF), which yields an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by the watershed algorithm. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. The edge map is obtained by a merge process based on averaged region intensity values. Common edge detectors operating on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
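
    A minimal sketch of a comparable pipeline is given below, assuming scikit-image and scikit-learn are available. It uses a Sobel gradient as a stand-in for the DIS map, K-means for the initial segmentation, and a marker-based watershed; the MRF relaxation step is omitted for brevity, so this is an approximation in the spirit of the paper rather than a reproduction of its method.

```python
import numpy as np
from skimage import data, filters, segmentation
from sklearn.cluster import KMeans

# sample grayscale image scaled to [0, 1]
img = data.camera().astype(float) / 255.0

# 1) edge-strength map (used here as a stand-in for the paper's DIS map)
edges = filters.sobel(img)

# 2) initial segmentation: K-means clustering of pixel intensities
k = 4
labels0 = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(
    img.reshape(-1, 1)).reshape(img.shape)

# 3) keep only low-edge pixels as seeds, then flood with a watershed on the
#    gradient image (the MRF relaxation step of the paper is omitted here)
markers = np.where(edges < 0.02, labels0 + 1, 0)
refined = segmentation.watershed(edges, markers=markers)

# 4) crude merge step: fuse region labels whose mean intensities are close
means = {r: img[refined == r].mean() for r in np.unique(refined)}
merged = refined.copy()
for r in sorted(means):
    for s in sorted(means):
        if r < s and abs(means[r] - means[s]) < 0.05:
            merged[merged == s] = r

print("regions before merge:", len(means), "after:", len(np.unique(merged)))
```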

  8. Design of a statewide radiation survey

    International Nuclear Information System (INIS)

    Nagda, N.L.; Koontz, M.D.; Rector, H.E.; Nifong, G.D.

    1989-01-01

    The Florida Institute of Phosphate Research (FIPR) recently sponsored a statewide survey to identify all significant land areas in Florida where the state's environmental radiation rule should be applied. Under this rule, newly constructed buildings must be tested for radiation levels unless approved construction techniques are used. Two parallel surveys - a land-based survey and a population-based survey - were designed and conducted to address the objective. Each survey included measurements in more than 3000 residences throughout the state. Other information sources that existed at the outset of the study, such as geologic profiles mapped by previous investigators and terrestrial uranium levels characterized through aerial gamma radiation surveys, were also examined. Initial data analysis efforts focused on determining the extent of evidence of radon potential for each of the 67 counties in the state. Within the 18 counties that were determined to have definite evidence of elevated radon potential, more detailed spatial analyses were conducted to identify areas to which the rule should apply. A total of 74 quadrangles delineated by the U.S. Geological Survey, representing about 7% of those constituting the state, were identified as having elevated radon potential and being subject to the rule.

  9. Basin Visual Estimation Technique (BVET) and Representative Reach Approaches to Wadeable Stream Surveys: Methodological Limitations and Future Directions

    Science.gov (United States)

    Lance R. Williams; Melvin L. Warren; Susan B. Adams; Joseph L. Arvai; Christopher M. Taylor

    2004-01-01

    Basin Visual Estimation Techniques (BVET) are used to estimate abundance for fish populations in small streams. With BVET, independent samples are drawn from natural habitat units in the stream rather than sampling "representative reaches." This sampling protocol provides an alternative to traditional reach-level surveys, which are criticized for their lack...

  10. Ergonomics in office-based surgery: a survey-guided observational study.

    Science.gov (United States)

    Esser, Adam C; Koshy, James G; Randle, Henry W

    2007-11-01

    The practice of office-based surgery is increasing in many specialties. Using Mohs surgery as a model, we investigated the role of ergonomics in office-based surgery to limit work-related musculoskeletal disorders. All Mayo Clinic surgeons currently performing Mohs surgery and Mohs surgeons trained at Mayo Clinic between 1990 and 2004 received a questionnaire survey between May 2003 and September 2004. A sample of respondents were videotaped during surgery. The main outcome measures were survey responses and an ergonomist's identification of potential causes of musculoskeletal disorders. All 17 surgeons surveyed responded. Those surveyed spend a mean of 24 hours per week in surgery. Sixteen said they had symptoms caused by or made worse by performing surgery. Symptom onset occurred on average at age 35.4 years. The most common complaints were pain and stiffness in the neck, shoulders, and lower back and headaches. Videotapes of 6 surgeons revealed problems with operating room setup, awkward posture, forceful exertion, poor positioning, lighting, and duration of procedures. Symptoms of musculoskeletal injuries are common and may begin early in a physician's career. Modifying footwear, flooring, table height, operating position, lighting, and surgical instruments may improve the ergonomics of office-based surgery.

  11. Aerial radiation survey

    International Nuclear Information System (INIS)

    Pradeep Kumar, K.S.

    1998-01-01

    Aerial gamma spectrometry surveys are the most effective, comprehensive and preferred tool to delimit large-area surface contamination in a radiological emergency, whether due to a nuclear accident or following a nuclear strike. The airborne survey, apart from providing rapid and economical evaluation of ground contamination over large areas owing to its larger ground clearance and higher speed, is the only technique able to overcome the difficulties posed by ground surveys of inaccessible regions. The aerial survey technique can also be used for searching for lost radioactive sources, tracking a radioactive plume and generating background data on the Emergency Planning Zone (EPZ) of nuclear installations.

  12. A survey on bio inspired meta heuristic based clustering protocols for wireless sensor networks

    Science.gov (United States)

    Datta, A.; Nandakumar, S.

    2017-11-01

    Recent studies have shown that utilizing a mobile sink to harvest and carry data from a Wireless Sensor Network (WSN) can improve network operational efficiency as well as maintain uniform energy consumption by the sensor nodes in the network. Due to Sink mobility, the path between two sensor nodes continuously changes and this has a profound effect on the operational longevity of the network and a need arises for a protocol which utilizes minimal resources in maintaining routes between the mobile sink and the sensor nodes. Swarm Intelligence based techniques inspired by the foraging behavior of ants, termites and honey bees can be artificially simulated and utilized to solve real wireless network problems. The author presents a brief survey on various bio inspired swarm intelligence based protocols used in routing data in wireless sensor networks while outlining their general principle and operation.
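
    To make the general principle concrete, the following toy sketch shows the pheromone-plus-heuristic next-hop selection and reinforcement rule common to many ant-colony-inspired routing schemes. The node names, parameter values, and reward are hypothetical; it does not implement any specific protocol from the surveyed literature.

```python
import random

# pheromone (tau) and heuristic desirability (eta, e.g. inverse distance to the
# mobile sink) for each candidate next hop of a sensor node -- values are made up
candidates = {"n1": {"tau": 0.8, "eta": 0.5},
              "n2": {"tau": 0.3, "eta": 0.9},
              "n3": {"tau": 0.5, "eta": 0.4}}
ALPHA, BETA, RHO = 1.0, 2.0, 0.1   # pheromone weight, heuristic weight, evaporation

def choose_next_hop(cands):
    """Pick a next hop with probability proportional to tau^alpha * eta^beta."""
    weights = {n: v["tau"] ** ALPHA * v["eta"] ** BETA for n, v in cands.items()}
    total = sum(weights.values())
    r, acc = random.random() * total, 0.0
    for n, w in weights.items():
        acc += w
        if r <= acc:
            return n
    return n  # numerical safety net: fall back to the last candidate

def reinforce(cands, chosen, reward=0.5):
    """Evaporate pheromone on every link, then deposit extra on the chosen one."""
    for v in cands.values():
        v["tau"] *= (1.0 - RHO)
    cands[chosen]["tau"] += reward

hop = choose_next_hop(candidates)
reinforce(candidates, hop)
print("forwarding via", hop, candidates)
```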

  13. A Twitter-based survey on marijuana concentrate use.

    Science.gov (United States)

    Daniulaityte, Raminta; Zatreh, Mussa Y; Lamy, Francois R; Nahhas, Ramzi W; Martins, Silvia S; Sheth, Amit; Carlson, Robert G

    2018-04-11

    The purpose of this paper is to analyze characteristics of marijuana concentrate users, describe patterns and reasons of use, and identify factors associated with daily use of concentrates among U.S.-based cannabis users recruited via a Twitter-based online survey. An anonymous Web-based survey was conducted in June 2017 with 687 U.S.-based cannabis users recruited via Twitter-based ads. The survey included questions about state of residence, socio-demographic characteristics, and cannabis use including marijuana concentrates. Multiple logistic regression analyses were conducted to identify characteristics associated with lifetime and daily use of marijuana concentrates. Almost 60% of respondents were male, 86% were white, and the mean age was 43.0 years. About 48% reported marijuana concentrate use. After adjusting for multiple testing, significant predictors of concentrate use included: living in "recreational" (AOR = 2.04; adj. p = .042) or "medical, less restrictive" (AOR = 1.74; adj. p = .030) states, being younger (AOR = 0.97, adj. p = .008), and daily herbal cannabis use (AOR = 2.57, adj. p = .008). Out of 329 marijuana concentrate users, about 13% (n = 44) reported daily/near daily use. Significant predictors of daily concentrate use included: living in recreational states (AOR = 3.59, adj. p = .020) and using concentrates for therapeutic purposes (AOR = 4.34, adj. p = .020). Living in states with more liberal marijuana policies is associated with greater likelihood of marijuana concentrate use and with more frequent use. Characteristics of daily users, in particular, patterns of therapeutic use warrant further research with community-recruited samples. Copyright © 2018 Elsevier B.V. All rights reserved.
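
    The adjusted odds ratios reported above come from multiple logistic regression. A hedged sketch of that type of analysis is shown below, using synthetic data whose variable names and coefficients only loosely mirror the abstract; it is not the authors' analysis and relies on statsmodels rather than their software.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 687

# simulated survey respondents (variables mirror those in the abstract; data are synthetic)
df = pd.DataFrame({
    "recreational_state": rng.integers(0, 2, n),
    "age": rng.normal(43, 12, n),
    "daily_herbal_use": rng.integers(0, 2, n),
})
logit_p = (-1.0 + 0.7 * df.recreational_state
           - 0.03 * (df.age - 43) + 0.9 * df.daily_herbal_use)
df["concentrate_use"] = rng.random(n) < 1 / (1 + np.exp(-logit_p))

X = sm.add_constant(df[["recreational_state", "age", "daily_herbal_use"]])
fit = sm.Logit(df["concentrate_use"].astype(int), X).fit(disp=False)

# adjusted odds ratios with 95% confidence intervals
aor = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.concat([aor.rename("AOR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```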

  14. Aerial surveys adjusted by ground surveys to estimate area occupied by black-tailed prairie dog colonies

    Science.gov (United States)

    Sidle, John G.; Augustine, David J.; Johnson, Douglas H.; Miller, Sterling D.; Cully, Jack F.; Reading, Richard P.

    2012-01-01

    Aerial surveys using line-intercept methods are one approach to estimate the extent of prairie dog colonies in a large geographic area. Although black-tailed prairie dogs (Cynomys ludovicianus) construct conspicuous mounds at burrow openings, aerial observers have difficulty discriminating between areas with burrows occupied by prairie dogs (colonies) versus areas of uninhabited burrows (uninhabited colony sites). Consequently, aerial line-intercept surveys may overestimate prairie dog colony extent unless adjusted by an on-the-ground inspection of a sample of intercepts. We compared aerial line-intercept surveys conducted over 2 National Grasslands in Colorado, USA, with independent ground-mapping of known black-tailed prairie dog colonies. Aerial line-intercepts adjusted by ground surveys using a single activity category adjustment overestimated colonies by ≥94% on the Comanche National Grassland and ≥58% on the Pawnee National Grassland. We present a ground-survey technique that involves 1) visiting on the ground a subset of aerial intercepts classified as occupied colonies plus a subset of intercepts classified as uninhabited colony sites, and 2) based on these ground observations, recording the proportion of each aerial intercept that intersects a colony and the proportion that intersects an uninhabited colony site. Where line-intercept techniques are applied to aerial surveys or remotely sensed imagery, this method can provide more accurate estimates of black-tailed prairie dog abundance and trends.
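
    The adjustment idea can be illustrated with a toy calculation: scale each aerial intercept by the ground-verified proportion that actually crossed an active colony. The numbers below are invented and the snippet is only a sketch of the arithmetic, not the authors' estimator.

```python
import numpy as np

# aerial line-intercept lengths (km) initially classified as "occupied colony"
aerial_intercepts = np.array([1.2, 0.8, 2.5, 0.6, 1.9])

# ground-checked proportion of each intercept that actually crossed an active colony
# (the remainder crossed uninhabited colony sites); values are made up
prop_active = np.array([0.9, 0.4, 0.7, 0.0, 1.0])

unadjusted = aerial_intercepts.sum()
adjusted = (aerial_intercepts * prop_active).sum()
print(f"unadjusted occupied length: {unadjusted:.1f} km")
print(f"ground-adjusted estimate:   {adjusted:.1f} km "
      f"({100 * (unadjusted - adjusted) / adjusted:.0f}% overestimate without adjustment)")
```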

  15. Telephone survey to investigate relationships between onychectomy or onychectomy technique and house soiling in cats.

    Science.gov (United States)

    Gerard, Amanda F; Larson, Mandy; Baldwin, Claudia J; Petersen, Christine

    2016-09-15

    OBJECTIVE To determine whether associations existed between onychectomy or onychectomy technique and house soiling in cats. DESIGN Cross-sectional study. SAMPLE 281 owners of 455 cats in Polk County, Iowa, identified via a list of randomly selected residential phone numbers of cat owners in that region. PROCEDURES A telephone survey was conducted to collect information from cat owners on factors hypothesized a priori to be associated with house soiling, including cat sex, reproductive status, medical history, and onychectomy history. When cats that had undergone onychectomy were identified, data were collected regarding the cat's age at the time of the procedure and whether a carbon dioxide laser (CDL) had been used. Information on history of house soiling behavior (urinating or defecating outside the litter box) was also collected. RESULTS Onychectomy technique was identified as a risk factor for house soiling. Cats for which a non-CDL technique was used had a higher risk of house soiling than cats for which the CDL technique was used. Cats that had undergone onychectomy and that lived in a multicat (3 to 5 cats) household were more than 3 times as likely to have house soiled as were single-housed cats with intact claws. CONCLUSIONS AND CLINICAL RELEVANCE Results of this cross-sectional study suggested that use of the CDL technique for onychectomy could decrease the risk of house soiling by cats relative to the risk associated with other techniques. This and other findings can be used to inform the decisions of owners and veterinarians when considering elective onychectomy for cats.

  16. Current STR-based techniques in forensic science

    Directory of Open Access Journals (Sweden)

    Phuvadol Thanakiatkrai

    2013-01-01

    Full Text Available DNA analysis in forensic science is mainly based on short tandem repeat (STR genotyping. The conventional analysis is a three-step process of DNA extraction, amplification and detection. An overview of various techniques that are currently in use and are being actively researched for STR typing is presented. The techniques are separated into STR amplification and detection. New techniques for forensic STR analysis focus on increasing sensitivity, resolution and discrimination power for suboptimal samples. These are achieved by shifting primer-binding sites, using high-fidelity and tolerant polymerases and applying novel methods to STR detection. Examples in which STRs are used in criminal investigations are provided and future research directions are discussed.

  17. Is integrative use of techniques in psychotherapy the exception or the rule? Results of a national survey of doctoral-level practitioners.

    Science.gov (United States)

    Thoma, Nathan C; Cecero, John J

    2009-12-01

    This study sought to investigate the extent to which therapists endorse techniques outside of their self-identified orientation and which techniques are endorsed across orientations. A survey consisting of 127 techniques from 8 major theories of psychotherapy was administered via U.S. mail to a national random sample of doctoral-level psychotherapy practitioners. The 201 participants endorsed substantial numbers of techniques from outside their respective orientations. Many of these techniques were quite different from those of the core theories of the respective orientations. Further examining when and why experienced practitioners switch to techniques outside their primary orientation may help reveal where certain techniques fall short and where others excel, indicating a need for further research that taps the collective experience of practitioners. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  18. Precision surveying the principles and geomatics practice

    CERN Document Server

    Ogundare, John Olusegun

    2016-01-01

    A comprehensive overview of high precision surveying, including recent developments in geomatics and their applications This book covers advanced precision surveying techniques, their proper use in engineering and geoscience projects, and their importance in the detailed analysis and evaluation of surveying projects. The early chapters review the fundamentals of precision surveying: the types of surveys; survey observations; standards and specifications; and accuracy assessments for angle, distance and position difference measurement systems. The book also covers network design and 3-D coordinating systems before discussing specialized topics such as structural and ground deformation monitoring techniques and analysis, mining surveys, tunneling surveys, and alignment surveys. Precision Surveying: The Principles and Geomatics Practice: * Covers structural and ground deformation monitoring analysis, advanced techniques in mining and tunneling surveys, and high precision alignment of engineering structures *...

  19. Explaining discrepancies in reproductive health indicators from population-based surveys and exit surveys: a case from Rwanda.

    Science.gov (United States)

    Meekers, D; Ogada, E A

    2001-06-01

    Reproductive health programmes often need exit surveys and population-based surveys for monitoring and evaluation. This study investigates why such studies produce discrepant estimates of condom use, sexual behaviour and condom brand knowledge, and discusses the implications for future use of exit surveys for programme monitoring. Logistic regression is used to explain differences between a household survey of 1295 persons and an exit survey among a random sample of 2550 consumers at retail outlets in Rwanda. Discrepancies in ever use of condoms and risky sexual behaviours are due to differences in socioeconomic status of the two samples. After controls, exit surveys at most outlet types have the same results as the household survey. Only exit surveys at bars, nightclubs and hotels yield significantly different estimates. However, the above-average knowledge of Prudence Plus condoms in the exit interviews is not attributable to socioeconomic or demographic variables, most likely because respondents have seen the product at the outlets. Information about condom use and sexual behaviour obtained from exit surveys appears as accurate as that obtained through household surveys. Nevertheless, exit surveys must be used cautiously. Because exit surveys may include wealthier and better-educated respondents, they are not representative of the general population. The composition of exit survey samples should be validated through existing household surveys. Comparisons across survey types are generally unadvisable, unless they control for sample differences. When generalizing to the population at large is not needed (e.g. for studies aimed at identifying the characteristics and behaviour of users of particular products or services), exit surveys can provide an appropriate alternative to household surveys.

  20. Search Techniques for the Web of Things: A Taxonomy and Survey

    Science.gov (United States)

    Zhou, Yuchao; De, Suparna; Wang, Wei; Moessner, Klaus

    2016-01-01

    The Web of Things aims to make physical world objects and their data accessible through standard Web technologies to enable intelligent applications and sophisticated data analytics. Due to the amount and heterogeneity of the data, it is challenging to perform data analysis directly; especially when the data is captured from a large number of distributed sources. However, the size and scope of the data can be reduced and narrowed down with search techniques, so that only the most relevant and useful data items are selected according to the application requirements. Search is fundamental to the Web of Things while challenging by nature in this context, e.g., mobility of the objects, opportunistic presence and sensing, continuous data streams with changing spatial and temporal properties, efficient indexing for historical and real time data. The research community has developed numerous techniques and methods to tackle these problems as reported by a large body of literature in the last few years. A comprehensive investigation of the current and past studies is necessary to gain a clear view of the research landscape and to identify promising future directions. This survey reviews the state-of-the-art search methods for the Web of Things, which are classified according to three different viewpoints: basic principles, data/knowledge representation, and contents being searched. Experiences and lessons learned from the existing work and some EU research projects related to Web of Things are discussed, and an outlook to the future research is presented. PMID:27128918

  1. Enhancement of Edge-based Image Quality Measures Using Entropy for Histogram Equalization-based Contrast Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    H. T. R. Kurmasha

    2017-12-01

    Full Text Available An Edge-based image quality measure (IQM) technique for the assessment of histogram equalization (HE)-based contrast enhancement techniques has been proposed that outperforms the Absolute Mean Brightness Error (AMBE) and Entropy, which are the most commonly used IQMs to evaluate histogram equalization based techniques, and also the two prominent fidelity-based IQMs, the Multi-Scale Structural Similarity (MSSIM) and Information Fidelity Criterion-based (IFC) measures. The statistical evaluation results show that the Edge-based IQM, which was designed for detecting noise artifact distortion, has a Pearson Correlation Coefficient (PCC) > 0.86, while the others have poor or fair correlation to human opinion, considering Human Visual Perception (HVP). Based on HVP, this paper proposes an enhancement to the classic Edge-based IQM by taking into account the brightness saturation distortion, which is the most prominent distortion in HE-based contrast enhancement techniques. It is tested and found to have significantly good correlation (PCC > 0.87), Spearman rank order correlation coefficient (SROCC) > 0.92, Root Mean Squared Error (RMSE) < 0.1054, and Outlier Ratio (OR) = 0%.
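
    As a sketch of how such agreement statistics can be computed, the snippet below uses made-up IQM values and mean opinion scores, and a simple tolerance-based outlier ratio rather than the exact definition used in the paper.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# hypothetical scores: objective IQM values vs mean human opinion scores (0-1)
iqm = np.array([0.21, 0.35, 0.50, 0.64, 0.72, 0.80, 0.88, 0.93])
mos = np.array([0.25, 0.33, 0.52, 0.61, 0.70, 0.78, 0.92, 0.90])

pcc, _ = pearsonr(iqm, mos)
srocc, _ = spearmanr(iqm, mos)
rmse = np.sqrt(np.mean((iqm - mos) ** 2))
# outlier ratio: fraction of points whose error exceeds a chosen tolerance
outlier_ratio = np.mean(np.abs(iqm - mos) > 0.1)

print(f"PCC={pcc:.3f}  SROCC={srocc:.3f}  RMSE={rmse:.4f}  OR={outlier_ratio:.0%}")
```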

  2. Tapping Their Patients' Problems Away? Characteristics of Psychotherapists Using Energy Meridian Techniques

    Science.gov (United States)

    Gaudiano, Brandon A.; Brown, Lily A.; Miller, Ivan W.

    2012-01-01

    Objective: The objective was to learn about the characteristics of psychotherapists who use energy meridian techniques (EMTs). Methods: We conducted an Internet-based survey of the practices and attitudes of licensed psychotherapists. Results: Of 149 survey respondents (21.4% social workers), 42.3% reported that they frequently use or are inclined…

  3. An Observed Voting System Based On Biometric Technique

    Directory of Open Access Journals (Sweden)

    B. Devikiruba

    2015-08-01

    Full Text Available This article describes a computational framework which can run on almost every computer connected to an IP-based network to study biometric techniques. The paper discusses a system for protecting confidential information, which places strong security demands on identification. Biometry provides a user-friendly method for this identification and is becoming a competitor to current identification mechanisms. The experimentation section focuses on biometric verification, specifically based on fingerprints. This article should be read as a warning to those thinking of using identification methods without first examining the technical opportunities for compromising such mechanisms and the associated legal consequences. The development is based on the Java language, which easily extends the software packages used to test new control techniques.

  4. Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry presents a detailed survey of knowledge based systems. After being in a relatively dormant state for many years, only recently is Artificial Intelligence (AI) - that branch of computer science that attempts to have machines emulate intelligent behavior - accomplishing practical results. Most of these results can be attributed to the design and use of Knowledge-Based Systems, KBSs (or expert systems) - problem solving computer programs that can reach a level of performance comparable to that of a human expert in some specialized problem domain. These systems can act as a consultant for various requirements like medical diagnosis, military threat analysis, project risk assessment, etc. These systems possess knowledge to enable them to make intelligent decisions. They are, however, not meant to replace the human specialists in any particular domain. A critical survey of recent work in interactive KBSs is reported. A case study (MYCIN) of a KBS, a list of existing KBSs, and an introduction to the Japanese Fifth Generation Computer Project are provided as appendices. Finally, an extensive set of KBS-related references is provided at the end of the report.

  5. Population-based survey of cessation aids used by Swedish smokers

    Directory of Open Access Journals (Sweden)

    Rutqvist Lars E

    2012-12-01

    Full Text Available Abstract Background Most smokers who quit typically do so unassisted although pharmaceutical products are increasingly used by those who want a quitting aid. Previous Scandinavian surveys indicated that many smokers stopped smoking by switching from cigarettes to smokeless tobacco in the form of snus. However, usage of various cessation aids may have changed in Sweden during recent years due to factors such as the wider availability of pharmaceutical nicotine, the public debate about the health effects of different tobacco products, excise tax increases on snus relative to cigarettes, and the widespread public misconception that nicotine is the main cause of the adverse health effects associated with tobacco use. Methods A population-based, cross-sectional survey was done during November 2008 and September 2009 including 2,599 males and 3,409 females aged between 18 and 89 years. The sampling technique was random digit dialing. Data on tobacco habits and quit attempts were collected through structured telephone interviews. Results The proportion of ever smokers was similar among males (47%) compared to females (44%). About two thirds of them reported having stopped smoking at the time of the survey. Among the former smokers, the proportion who reported unassisted quitting was slightly lower among males (68%) compared to females (78%). Among ever smokers who reported having made assisted quit attempts, snus was the most frequently reported cessation aid among males (22%), whereas females more frequently reported counseling (8%), or pharmaceutical nicotine (gum 8%, patch 4%). Of those who reported using snus at their latest quit attempt, 81% of males and 72% of females were successful quitters compared to about 50-60% for pharmaceutical nicotine and counseling. Conclusions This survey confirms and extends previous reports in showing that, although most smokers who have quit did so unassisted, snus continues to be the most frequently reported cessation

  6. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    Science.gov (United States)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel

  7. A Performance Survey on Stack-based and Register-based Virtual Machines

    OpenAIRE

    Fang, Ruijie; Liu, Siqi

    2016-01-01

    Virtual machines have been widely adapted for high-level programming language implementations and for providing a degree of platform neutrality. As the overall use and adaptation of virtual machines grow, the overall performance of virtual machines has become a widely-discussed topic. In this paper, we present a survey on the performance differences of the two most widely adapted types of virtual machines - the stack-based virtual machine and the register-based virtual machine - using various...

  8. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground and a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of clouds' features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (which is a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. Like all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, thus upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as the LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
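
    The LCL calculation mentioned as one of the validation references can be approximated with Espy's rule of roughly 125 m of lift per degree Celsius of dewpoint depression. The sketch below shows that approximation only; it is unrelated to the thermal-imaging method itself, and the surface values are invented.

```python
def lcl_height_m(temp_c, dewpoint_c):
    """Approximate lifted condensation level height above ground
    (Espy's rule: roughly 125 m of lift per degree C of dewpoint depression)."""
    return 125.0 * max(temp_c - dewpoint_c, 0.0)

# example surface observation: 24 C air temperature, 16 C dewpoint
print(f"estimated cloud base: {lcl_height_m(24.0, 16.0):.0f} m above ground")
```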

  9. Problem based Learning in surveying Education

    DEFF Research Database (Denmark)

    Enemark, Stig

    The challenge of the future will be that the only constant is change. Therefore, the educational base must be flexible. The graduates must possess skills to adapt to a rapidly changing labour market and they must possess skills to deal with even the unknown problems of the future. The point is...... that opportunity. The basic principles of this educational model are presented using the surveying programme at Aalborg University as an example....

  10. Development of custom LCD based portable survey/contamination monitors

    International Nuclear Information System (INIS)

    Reddy, J.D.

    2010-01-01

    Equipment for carrying out radiation survey measurements for alpha, beta and gamma radiations has evolved considerably with the advancements in electronics over time. There are two major classes of portable instruments available from most manufacturers: (a) analog indicator type and (b) direct digital readout type. Analog meters give a direct quantitative feel for radiation levels, though they are neither feature-rich nor as smart as a digital meter. Digital versions have the advantages of direct numerical readout and configurability as per user requirements. To combine the best features of both techniques, a dual-indicator LCD module comprising analog-indicating LCD segments and a 7-segment digital readout has been developed. This LCD, comprising the LCD glass and its display driver, has been deployed across various types of survey meters and contamination monitors manufactured by Nucleonix. The display facilitates direct readout of dose rate/count rate in various units simultaneously on both the analog LCD scale and the direct digital indication. (author)

  11. A Survey on Smartphone-Based Crowdsensing Solutions

    Directory of Open Access Journals (Sweden)

    Willian Zamora

    2016-01-01

    Full Text Available In recent years, the widespread adoption of mobile phones, combined with the ever-increasing number of sensors that smartphones are equipped with, has greatly simplified the generalized adoption of crowdsensing solutions by reducing hardware requirements and costs to a minimum. These factors have led to an outstanding growth of crowdsensing proposals from both academia and industry. In this paper, we provide a survey of smartphone-based crowdsensing solutions that have emerged in the past few years, focusing on 64 works published in top-ranked journals and conferences. To properly analyze these previous works, we first define a reference framework based on how we classify the different proposals under study. The results of our survey evidence that there is still much heterogeneity in terms of technologies adopted and deployment approaches, although modular designs at both client and server elements seem to be dominant. Also, the preferred client platform is Android, while server platforms are typically web-based, and client-server communications mostly rely on XML or JSON over HTTP. The main detected pitfall concerns the performance evaluation of the different proposals, which typically fail to make a scalability analysis despite this being a critical issue when targeting very large communities of users.

  12. Mining survey data for SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat

    2016-01-01

    Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis is one of the most important tools for strategic planning. The traditional method of conducting SWOT analysis does not prioritize and is likely to hold subjective views that may result in an improper strategic action. Accordingly, this research exploits Importance-Performance Analysis (IPA), a technique for measuring customers’ satisfaction based on survey data, to systematically generate prioritized SWOT factors based on custom...
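
    A minimal sketch of the importance-performance quadrant logic behind such an analysis is shown below, using invented attribute ratings and mean-based cut-offs. The mapping of quadrants to SWOT-style labels is a simplification for illustration, not the thesis' actual procedure.

```python
import pandas as pd

# mean customer ratings per business attribute (synthetic survey data)
ipa = pd.DataFrame({
    "attribute":   ["delivery speed", "price", "support", "web usability"],
    "importance":  [4.6, 4.2, 3.1, 2.8],
    "performance": [3.0, 4.4, 2.5, 4.1],
})

# quadrant cut-offs: grand means of the two rating scales
imp_cut = ipa.importance.mean()
perf_cut = ipa.performance.mean()

def quadrant(row):
    """Assign an IPA quadrant and a rough SWOT-style hint."""
    if row.importance >= imp_cut:
        return "Strength (keep up the good work)" if row.performance >= perf_cut \
            else "Weakness (concentrate here)"
    return "Possible overkill" if row.performance >= perf_cut else "Low priority"

ipa["swot_hint"] = ipa.apply(quadrant, axis=1)
print(ipa)
```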

  13. Lessons Learned from the Administration of a Web-Based Survey.

    Science.gov (United States)

    Mertler, Craig A.

    This paper describes the methodology used in a research study involving the collection of data through a Web-based survey, focusing on the advantages and limitations of the methodology. The Teacher Motivation and Job Satisfaction Survey was administered to K-12 teachers. Many of the difficulties occurred during the planning phase, as opposed to…

  14. Development and validation of a prediction model for long-term sickness absence based on occupational health survey variables.

    Science.gov (United States)

    Roelen, Corné; Thorsen, Sannie; Heymans, Martijn; Twisk, Jos; Bültmann, Ute; Bjørner, Jakob

    2018-01-01

    The purpose of this study is to develop and validate a prediction model for identifying employees at increased risk of long-term sickness absence (LTSA), by using variables commonly measured in occupational health surveys. Based on the literature, 15 predictor variables were retrieved from the DAnish National working Environment Survey (DANES) and included in a model predicting incident LTSA (≥4 consecutive weeks) during 1-year follow-up in a sample of 4000 DANES participants. The 15-predictor model was reduced by backward stepwise statistical techniques and then validated in a sample of 2524 DANES participants, not included in the development sample. Identification of employees at increased LTSA risk was investigated by receiver operating characteristic (ROC) analysis; the area-under-the-ROC-curve (AUC) reflected discrimination between employees with and without LTSA during follow-up. The 15-predictor model was reduced to a 9-predictor model including age, gender, education, self-rated health, mental health, prior LTSA, work ability, emotional job demands, and recognition by the management. Discrimination by the 9-predictor model was significant (AUC = 0.68; 95% CI 0.61-0.76), but not practically useful. A prediction model based on occupational health survey variables identified employees with an increased LTSA risk, but should be further developed into a practically useful tool to predict the risk of LTSA in the general working population. Implications for rehabilitation Long-term sickness absence risk predictions would enable healthcare providers to refer high-risk employees to rehabilitation programs aimed at preventing or reducing work disability. A prediction model based on health survey variables discriminates between employees at high and low risk of long-term sickness absence, but discrimination was not practically useful. Health survey variables provide insufficient information to determine long-term sickness absence risk profiles. There is a need for
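
    A hedged sketch of this kind of development-and-validation workflow follows, using synthetic data, a plain logistic regression, and AUC for discrimination via scikit-learn; the variable names and coefficients are assumptions loosely echoing the abstract, not the authors' statistical pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 4000

# synthetic stand-ins for a few of the survey predictors named in the abstract
age = rng.normal(45, 11, n)
self_rated_health = rng.integers(1, 6, n)      # 1 = poor ... 5 = excellent
prior_ltsa = rng.integers(0, 2, n)
work_ability = rng.integers(0, 11, n)          # 0-10 work ability score

logit = (-3.0 + 0.02 * (age - 45) - 0.3 * self_rated_health
         + 1.1 * prior_ltsa - 0.15 * work_ability)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)
X = np.column_stack([age, self_rated_health, prior_ltsa, work_ability])

# develop on one sample, then validate discrimination (AUC) on a held-out sample
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.4, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"validation AUC = {auc:.2f}")
```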

  15. Survey of subsurface geophysical exploration technologies adaptable to an airborne platform

    International Nuclear Information System (INIS)

    Taylor, K.A.

    1992-12-01

    This report has been prepared by the US Department of Energy (DOE) as part of a Research Development Demonstration Testing and Evaluation (RDDT&E) project by EG&G Energy Measurement's (EG&G/EM) Remote Sensing Laboratory. It examines geophysical detection techniques which may be used in Environmental Restoration/Waste Management (ER/WM) surveys to locate buried waste, waste containers, potential waste migratory paths, and aquifer depths. Because of the Remote Sensing Laboratory's unique survey capabilities, only those technologies which have been adapted or are capable of being adapted to an airborne platform were studied. This survey describes several of the available subsurface survey technologies and discusses the basic capabilities of each: the target detectability, required geologic conditions, and associated survey methods. Because the airborne capabilities of these survey techniques have not been fully developed, the chapters deal mostly with the ground-based capabilities of each of the technologies, with reference made to the airborne capabilities where applicable. The information about each survey technique came from various contractors whose companies employ these specific technologies. EG&G/EM cannot guarantee or verify the accuracy of the contractor information; however, the data given is an indication of the technologies that are available.

  16. Responsive survey design, demographic data collection, and models of demographic behavior.

    Science.gov (United States)

    Axinn, William G; Link, Cynthia F; Groves, Robert M

    2011-08-01

    To address declining response rates and rising data-collection costs, survey methodologists have devised new techniques for using process data ("paradata") to address nonresponse by altering the survey design dynamically during data collection. We investigate the substantive consequences of responsive survey design-tools that use paradata to improve the representative qualities of surveys and control costs. By improving representation of reluctant respondents, responsive design can change our understanding of the topic being studied. Using the National Survey of Family Growth Cycle 6, we illustrate how responsive survey design can shape both demographic estimates and models of demographic behaviors based on survey data. By juxtaposing measures from regular and responsive data collection phases, we document how special efforts to interview reluctant respondents may affect demographic estimates. Results demonstrate the potential of responsive survey design to change the quality of demographic research based on survey data.

  17. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Science.gov (United States)

    Hernandez, Wilmar

    2007-01-01

    In this paper, a survey on recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. Here, a comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of the application of robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results that show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
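
    As one concrete example of the kind of "optimal" filter contrasted with classical filters above, the sketch below runs a scalar Kalman filter with a random-walk state model over a simulated noisy mechanical signal. The signal, noise levels, and tuning values are assumptions chosen for illustration, not parameters from the survey.

```python
import numpy as np

rng = np.random.default_rng(3)

# simulated slowly varying mechanical quantity (e.g. suspension deflection) plus sensor noise
t = np.arange(0.0, 5.0, 0.01)
truth = np.sin(0.8 * np.pi * t)
measured = truth + rng.normal(0.0, 0.3, t.size)

# scalar Kalman filter with a random-walk state model
q, r = 1e-3, 0.3 ** 2           # process and measurement noise variances (tuning assumptions)
x_hat, p = 0.0, 1.0
estimates = []
for z in measured:
    p += q                      # predict: state uncertainty grows by the process noise
    k = p / (p + r)             # Kalman gain
    x_hat += k * (z - x_hat)    # update with the new measurement
    p *= (1.0 - k)
    estimates.append(x_hat)

err_raw = np.sqrt(np.mean((measured - truth) ** 2))
err_kf = np.sqrt(np.mean((np.array(estimates) - truth) ** 2))
print(f"RMS error raw sensor: {err_raw:.3f}, Kalman-filtered: {err_kf:.3f}")
```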

  18. Post-endodontic treatment of incisors and premolars among dental practitioners in Saarland: an interactive Web-based survey.

    Science.gov (United States)

    Mitov, Gergo; Dörr, Michael; Nothdurft, Frank P; Draenert, Florian; Pospiech, Peter R

    2015-06-01

    The aim of the present study was to evaluate the treatment trends of dental practitioners in the federal state of Saarland in Germany with regard to restoring endodontically treated teeth, using a Web-based survey. An interactive Web-based survey instrument was developed, including seven clinical scenarios presented by photographs of a natural incisor and premolar with different types of cavities. Following a decision tree adapted to the clinical treatment, questions on different aspects of the post-endodontic treatment were asked. All 615 members of the Saarland Dental Association (SDA) were asked to participate in the survey. A total of 33 % completed the survey. The majority of the participants believed in the reinforcement effect of the ferrule design, as well as the post placement. The vast majority of the responding practitioners (92 %) adapted their treatment strategies to a high extent to the degree of destruction of the endodontically treated tooth. Fiber-reinforced composite (FRC) posts are the most popular prefabricated post type, regardless of the cavity size and tooth localization. Significant differences between the dentists according to the degree of experience were detected only for the use of glass-ionomer cements as core buildup material. The predominant post-endodontic treatment strategies of German dental practitioners are only partly in agreement with the current literature. There is a clear trend toward the increasing use of metal-free post and core materials. Although the participants showed a general adoption of modern materials and techniques, different patterns of post-endodontic treatment were revealed that were not consistent with approaches supported by the literature.

  19. Survey as a group interactive teaching technique

    Directory of Open Access Journals (Sweden)

    Ana GOREA

    2017-03-01

    Full Text Available The smooth running of the educational process and its results depend a great deal on the methods used. The methodology of teaching offers a great variety of teaching techniques that the teacher can make use of in the teaching/learning process. Techniques such as brainstorming, the cube, KWL, case study, the Venn diagram, and many others are familiar to teachers, and they use them effectively in the classroom. The present article proposes a technique called ‘survey’, which has been successfully used by the author as a student-centered speaking activity in foreign language classes. It has certain advantages, especially when used in large groups. It can be adapted to any other discipline in cases where the teacher wishes to offer the students space for cooperative activity and creativity.

  20. Skull base tumours part I: Imaging technique, anatomy and anterior skull base tumours

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Alexandra [Instituto Portugues de Oncologia Francisco Gentil, Centro de Lisboa, Servico de Radiologia, Rua Professor Lima Basto, 1093 Lisboa Codex (Portugal)], E-mail: borgesalexandra@clix.pt

    2008-06-15

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have contributed largely to improving the prognosis and lessening the morbidity and mortality of patients with skull base tumours, and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging has become indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology, and the radiologist is increasingly responsible for the fate of these patients. This review will focus on the advances in imaging technique, on their contribution to patient management, and on the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division that takes into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient management will be discussed.

  1. Skull base tumours part I: Imaging technique, anatomy and anterior skull base tumours

    International Nuclear Information System (INIS)

    Borges, Alexandra

    2008-01-01

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have contributed largely to improving the prognosis and lessening the morbidity and mortality of patients with skull base tumours, and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging has become indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology, and the radiologist is increasingly responsible for the fate of these patients. This review will focus on the advances in imaging technique, on their contribution to patient management, and on the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division that takes into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient management will be discussed.

  2. Search Techniques for the Web of Things: A Taxonomy and Survey

    Directory of Open Access Journals (Sweden)

    Yuchao Zhou

    2016-04-01

    Full Text Available The Web of Things aims to make physical world objects and their data accessible through standard Web technologies to enable intelligent applications and sophisticated data analytics. Due to the amount and heterogeneity of the data, it is challenging to perform data analysis directly, especially when the data are captured from a large number of distributed sources. However, the size and scope of the data can be reduced and narrowed down with search techniques, so that only the most relevant and useful data items are selected according to the application requirements. Search is fundamental to the Web of Things but challenging in this context, owing to, e.g., the mobility of objects, opportunistic presence and sensing, continuous data streams with changing spatial and temporal properties, and the need for efficient indexing of historical and real-time data. The research community has developed numerous techniques and methods to tackle these problems, as reported in a large body of literature over the last few years. A comprehensive investigation of current and past studies is necessary to gain a clear view of the research landscape and to identify promising future directions. This survey reviews the state-of-the-art search methods for the Web of Things, which are classified according to three different viewpoints: basic principles, data/knowledge representation, and contents being searched. Experiences and lessons learned from the existing work and some EU research projects related to the Web of Things are discussed, and an outlook on future research is presented.

  3. Nasal base narrowing of the caucasian nose through the cerclage technique

    Directory of Open Access Journals (Sweden)

    Mocellin, Marcos

    2010-06-01

    Full Text Available Introduction: Several techniques can be performed to reduce (narrow) the nasal base, such as vestibular and columellar skin resection, elliptical skin resection at the narial lip, skin undermining and advancement (the V-Y technique of Bernstein), and the use of cerclage sutures at the nasal base. Objective: To evaluate the cerclage technique performed on the nasal base, through endonasal rhinoplasty using the basic technique without delivery, in the Caucasian nose, reducing the inter-alar distance and correcting alar flare, with consequent improvement of nasal harmony with the whole face. Methods: A retrospective analysis of the clinical records and photographs of 43 patients who underwent cerclage of the nasal base, with resection of a skin ellipse in the region of the vestibule and nasal base (modified Weir technique), using colorless 4-0 mononylon® with a straight cutting needle. The study was conducted in 2008 and 2009 at the Hospital of the Paraná Institute of Otolaryngology - IPO in Curitiba, Paraná - Brazil. Patients had a follow-up ranging from 7 to 12 months. Results: An improvement in nasal harmony was achieved in 100% of cases by decreasing the inter-alar distance. Conclusion: Cerclage with minimal resection of vestibular and nasal base skin is an effective, easy-to-perform method for narrowing the nasal base in the Caucasian nose, with predictable results.

  4. Exploiting stock data: a survey of state of the art computational techniques aimed at producing beliefs regarding investment portfolios

    Directory of Open Access Journals (Sweden)

    Mario Linares Vásquez

    2008-01-01

    Full Text Available Selecting an investment portfolio has inspired several models aimed at optimising the set of securities which an investor may select according to a number of specific decision criteria such as risk, expected return and planning horizon. The classical approach has been developed for supporting the two stages of portfolio selection and is supported by disciplines such as econometrics, technical analysis and corporate finance. However, with the emerging field of computational finance, new and interesting techniques have arisen in line with the need for the automatic processing of vast volumes of information. This paper surveys such new techniques, which belong to the body of knowledge concerning computing and systems engineering, focusing on techniques particularly aimed at producing beliefs regarding investment portfolios.

  5. Computerized tablet based versus traditional paper- based survey methods: results from adolescent's health research in schools of Maharashtra, India

    OpenAIRE

    Naveen Agarwal; Balram Paswan; Prakash H. Fulpagare; Dhirendra N Sinha; Thaksaphon Thamarangsi; Manju Rani

    2018-01-01

    Background and challenges to implementation: Technological advancement is growing very fast in India, and the majority of the young population handles electronic devices often, during leisure as well as at work. This study indicates that electronic tablets are less time consuming and improve the survey response rate over the traditional paper-and-pencil survey method. Intervention or response: An Android-based Global School-based Health Survey (GSHS) questionnaire was used with the...

  6. Comparison of Satellite Surveying to Traditional Surveying Methods for the Resources Industry

    Science.gov (United States)

    Osborne, B. P.; Osborne, V. J.; Kruger, M. L.

    Modern ground-based survey methods involve detailed surveying, which provides three-space co-ordinates for surveyed points to a high level of accuracy. The instruments are operated by surveyors, who process the raw results to create survey location maps for the subject of the survey. Such surveys are conducted for a location or region and referenced to the earth global co-ordinate system with global positioning system (GPS) positioning. Due to this referencing, the survey is only as accurate as the GPS reference system. Satellite survey remote sensing utilises satellite imagery which has been processed using commercial geographic information system software. Three-space co-ordinate maps are generated, with an accuracy determined by the datum position accuracy and optical resolution of the satellite platform. This paper presents a case study which compares topographic surveying undertaken by traditional survey methods with satellite surveying for the same location. The purpose of this study is to assess the viability of satellite remote sensing for surveying in the resources industry. The case study involves a topographic survey of a dune field for a prospective mining project area in Pakistan. This site has been surveyed using modern surveying techniques and the results are compared to a satellite survey performed on the same area. Analysis of the results from the traditional survey and from the satellite survey involved a comparison of the derived spatial co-ordinates from each method. In addition, comparisons have been made of costs and turnaround time for both methods. The results of this application of remote sensing are of particular interest for surveys in areas with remote and extreme environments, weather extremes, political unrest, or poor travel links, which are commonly associated with mining projects. Such areas frequently suffer from language barriers and poor onsite technical support and resources.

  7. A correlation-based pulse detection technique for gamma-ray/neutron detectors

    International Nuclear Information System (INIS)

    Faisal, Muhammad; Schiffer, Randolph T.; Flaska, Marek; Pozzi, Sara A.; Wentzloff, David D.

    2011-01-01

    We present a correlation-based detection technique that significantly improves the probability of detection for low-energy pulses. We propose performing a normalized cross-correlation of the incoming pulse data with a predefined pulse template, and using a threshold correlation value to trigger the detection of a pulse. This technique improves detector sensitivity by amplifying the signal component of the incoming pulse data and rejecting noise. Simulation results for various templates are presented. Finally, the performance of the correlation-based detection technique is compared to current state-of-the-art techniques.
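
    As a rough illustration of the idea summarized above (a sketch under assumed parameters, not the authors' implementation), the fragment below correlates a sliding window of incoming samples against a pulse template and flags a detection whenever the normalized correlation exceeds a threshold; the template shape, noise level and threshold value are invented for demonstration.

        import numpy as np

        def normalized_xcorr(window, template):
            # Pearson-style normalized correlation between a data window and a template.
            w = window - window.mean()
            tpl = template - template.mean()
            denom = np.linalg.norm(w) * np.linalg.norm(tpl)
            return float(w @ tpl / denom) if denom > 0 else 0.0

        rng = np.random.default_rng(1)

        # Assumed double-exponential pulse template (a typical detector pulse shape).
        n = 64
        k = np.arange(n)
        template = np.exp(-k / 20.0) - np.exp(-k / 4.0)

        # Synthetic data stream: noise with one small pulse buried at sample 500.
        stream = rng.normal(scale=0.05, size=2000)
        stream[500:500 + n] += 0.1 * template

        threshold = 0.6   # assumed correlation threshold for triggering
        detections = [i for i in range(stream.size - n)
                      if normalized_xcorr(stream[i:i + n], template) > threshold]
        print("candidate pulse start indices:", detections[:5])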

  8. Memory Based Machine Intelligence Techniques in VLSI hardware

    OpenAIRE

    James, Alex Pappachen

    2012-01-01

    We briefly introduce the memory based approaches to emulate machine intelligence in VLSI hardware, describing the challenges and advantages. Implementation of artificial intelligence techniques in VLSI hardware is a practical and difficult problem. Deep architectures, hierarchical temporal memories and memory networks are some of the contemporary approaches in this area of research. The techniques attempt to emulate low level intelligence tasks and aim at providing scalable solutions to high ...

  9. Estimating micro area behavioural risk factor prevalence from large population-based surveys: a full Bayesian approach

    Directory of Open Access Journals (Sweden)

    L. Seliske

    2016-06-01

    Full Text Available Background: An important public health goal is to decrease the prevalence of key behavioural risk factors, such as tobacco use and obesity. Survey information is often available at the regional level, but heterogeneity within large geographic regions cannot be assessed. Advanced spatial analysis techniques are demonstrated to produce sensible micro area estimates of behavioural risk factors that enable identification of areas with high prevalence. Methods: A spatial Bayesian hierarchical model was used to estimate the micro area prevalence of current smoking and excess bodyweight for the Erie-St. Clair region in southwestern Ontario. Estimates were mapped for male and female respondents of five cycles of the Canadian Community Health Survey (CCHS). The micro areas were 2006 Census Dissemination Areas, with an average population of 400–700 people. Two individual-level models were specified: one controlled for survey cycle and age group (model 1), and one controlled for survey cycle, age group and micro area median household income (model 2). Post-stratification was used to derive micro area behavioural risk factor estimates weighted to the population structure. SaTScan analyses were conducted on the granular, postal-code level CCHS data to corroborate findings of elevated prevalence. Results: Current smoking was elevated in two urban areas for both sexes (Sarnia and Windsor), and in an additional small community (Chatham) for males only. Areas of excess bodyweight were prevalent in an urban core (Windsor) among males, but not females. Precision of the posterior post-stratified current smoking estimates was improved in model 2, as indicated by narrower credible intervals and a lower coefficient of variation. For excess bodyweight, both models had similar precision. Aggregation of the micro area estimates to CCHS design-based estimates validated the findings. Conclusions: This is among the first studies to apply a full Bayesian model to complex
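
    The post-stratification step mentioned in the Methods can be sketched in a few lines: stratum-level prevalence estimates (here, hypothetical model-based smoking rates by age group) are weighted by a micro area's census population shares to give the area-level estimate. The numbers below are invented for illustration, and the sketch deliberately omits the spatial Bayesian model itself.

        import numpy as np

        # Hypothetical model-based smoking prevalence by age group for one micro area.
        prevalence_by_age = np.array([0.28, 0.24, 0.18, 0.12])   # e.g. ages 20-34, 35-49, 50-64, 65+

        # Assumed census population counts for the same age groups in that micro area.
        population_by_age = np.array([180, 220, 160, 90])

        # Post-stratified estimate: population-weighted average of the stratum estimates.
        weights = population_by_age / population_by_age.sum()
        micro_area_estimate = float(weights @ prevalence_by_age)
        print(f"post-stratified smoking prevalence: {micro_area_estimate:.3f}")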

  10. A study on quantification of unavailability of DPPS with fault tolerant techniques considering fault tolerant techniques' characteristics

    International Nuclear Information System (INIS)

    Kim, B. G.; Kang, H. G.; Kim, H. E.; Seung, P. H.; Kang, H. G.; Lee, S. J.

    2012-01-01

    With the improvement of digital technologies, digital I and C systems have come to include more varied fault tolerant techniques than conventional analog I and C systems, in order to increase fault detection and to help the system safely perform the required functions in spite of the presence of faults. In the reliability evaluation of digital systems, therefore, the fault tolerant techniques (FTTs) and their fault coverage must be considered. To account for the effects of FTTs in a digital system, there have been several studies on digital system reliability models. This research, based on a literature survey, attempts to develop a model to evaluate the plant reliability of the digital plant protection system (DPPS) with fault tolerant techniques, considering detection and process characteristics and human errors. A sensitivity analysis is performed to identify the important variables affecting fault management coverage and unavailability based on the proposed model.

  11. Void detection beneath reinforced concrete sections: The practical application of ground-penetrating radar and ultrasonic techniques

    Science.gov (United States)

    Cassidy, Nigel J.; Eddies, Rod; Dods, Sam

    2011-08-01

    Ground-penetrating radar (GPR) and ultrasonic 'pulse echo' techniques are well-established methods for the imaging, investigation and analysis of steel reinforced concrete structures and are important civil engineering survey tools. GPR is, arguably, the more widely used technique as it is suitable for a greater range of problem scenarios (i.e., from rebar mapping to moisture content determination). Ultrasonic techniques are traditionally associated with the engineering-based, non-destructive testing of concrete structures and their integrity analyses (e.g., flaw detection, shear/longitudinal velocity determination, etc.). However, when used in an appropriate manner, both techniques can be considered complementary and provide a unique way of imaging the sub-surface that is suited to a range of geotechnical problems. In this paper, we present a comparative study between mid-to-high frequency GPR (450 MHz and 900 MHz) and array-based, shear wave, pulse-echo ultrasonic surveys using proprietary instruments and conventional GPR data processing and visualisation techniques. Our focus is the practical detection of sub-metre scale voids located under steel reinforced concrete sections in realistic survey conditions (e.g., a capped, relict mine shaft or vent). Representative two-dimensional (2D) sections are presented for both methods, illustrating the similarities/differences in signal response and the temporal-spatial target resolutions achieved with each technique. The use of three-dimensional data volumes and time slices (or 'C-scans') for advanced interpretation is also demonstrated, which although common in GPR applications is under-utilised as a technique in general ultrasonic surveys. The results show that ultrasonic methods can perform as well as GPR for this specific investigation scenario and that they have the potential of overcoming some of the inherent limitations of GPR investigations (i.e., the need for careful antenna frequency selection and survey design in

  12. Survey of Object-Based Data Reduction Techniques in Observational Astronomy

    Directory of Open Access Journals (Sweden)

    Łukasik Szymon

    2016-01-01

    Full Text Available Dealing with astronomical observations represents one of the most challenging areas of big data analytics. Besides the huge variety of data types and the dynamics related to continuous data flow from multiple sources, handling enormous volumes of data is essential. This paper provides an overview of methods aimed at reducing both the number of features/attributes and the number of data instances. It concentrates on data mining approaches that are not tied to instruments and observation tools but instead work on processed, object-based data. The main goal of this article is to describe existing datasets on which algorithms are frequently tested, to characterize and classify available data reduction algorithms, and to identify promising solutions capable of addressing present and future challenges in astronomy.

  13. Characterization techniques for graphene-based materials in catalysis

    Directory of Open Access Journals (Sweden)

    Maocong Hu

    2017-06-01

    Full Text Available Graphene-based materials have been studied in a wide range of applications including catalysis due to the outstanding electronic, thermal, and mechanical properties. The unprecedented features of graphene-based catalysts, which are believed to be responsible for their superior performance, have been characterized by many techniques. In this article, we comprehensively summarized the characterization methods covering bulk and surface structure analysis, chemisorption ability determination, and reaction mechanism investigation. We reviewed the advantages/disadvantages of different techniques including Raman spectroscopy, X-ray photoelectron spectroscopy (XPS), Fourier transform infrared spectroscopy (FTIR) and Diffuse Reflectance Fourier Transform Infrared Spectroscopy (DRIFTS), X-Ray diffraction (XRD), X-ray absorption near edge structure (XANES) and X-ray absorption fine structure (XAFS), atomic force microscopy (AFM), scanning electron microscopy (SEM), transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), ultraviolet-visible spectroscopy (UV-vis), X-ray fluorescence (XRF), inductively coupled plasma mass spectrometry (ICP), thermogravimetric analysis (TGA), Brunauer–Emmett–Teller (BET) analysis, and scanning tunneling microscopy (STM). The application of temperature-programmed reduction (TPR), CO chemisorption, and NH3/CO2-temperature-programmed desorption (TPD) was also briefly introduced. Finally, we discussed the challenges and provided possible suggestions on choosing characterization techniques. This review provides key information to the catalysis community to adopt suitable characterization techniques for their research.

  14. Fiscal 1999 survey report. Survey of international cooperation over energy use rationalization techniques; 1999 nendo energy shiyo gorika shuho kokusai kyoryoku chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    In the past, studies have been conducted on the possibility of developing information-provision techniques for the realization of a sustainable society, in which LCA (life cycle assessment) is investigated as a tool for reducing energy consumption and environmental impact, and case studies are conducted in this connection. In this fiscal year, for the purpose of deliberating how to utilize LCA results, a survey is conducted of how LCA is being used as a tool for constructing the environmental management systems set forth in the ISO14000 series, by holding conferences with LCA researchers representing the respective countries involved. Investigations are conducted into the actual state of environmental management systems, environmental performance assessment, and environmental labelling, whose standardization has been under way in compliance with the ISO14000 series; into matters relating to assessment techniques and decision making, such as environmentally friendly design, supply chain management and environmental reports, which are becoming established in enterprises; and into access to and disclosure of information. International cooperative research is conducted with the participation of five leading organizations from Sweden, Germany, Denmark and Canada, together with Japan's National Institute for Resources and Development, in which the actual state of LCA utilization is introduced and improvements to LCA techniques are discussed. (NEDO)

  15. Fiscal 1999 survey report. Survey of international cooperation over energy use rationalization techniques; 1999 nendo energy shiyo gorika shuho kokusai kyoryoku chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    In the past, studies have been conducted on the possibility of developing information-provision techniques for the realization of a sustainable society, in which LCA (life cycle assessment) is investigated as a tool for reducing energy consumption and environmental impact, and case studies are conducted in this connection. In this fiscal year, for the purpose of deliberating how to utilize LCA results, a survey is conducted of how LCA is being used as a tool for constructing the environmental management systems set forth in the ISO14000 series, by holding conferences with LCA researchers representing the respective countries involved. Investigations are conducted into the actual state of environmental management systems, environmental performance assessment, and environmental labelling, whose standardization has been under way in compliance with the ISO14000 series; into matters relating to assessment techniques and decision making, such as environmentally friendly design, supply chain management and environmental reports, which are becoming established in enterprises; and into access to and disclosure of information. International cooperative research is conducted with the participation of five leading organizations from Sweden, Germany, Denmark and Canada, together with Japan's National Institute for Resources and Development, in which the actual state of LCA utilization is introduced and improvements to LCA techniques are discussed. (NEDO)

  16. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
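
    A minimal sketch of the clustering step is shown below, using scikit-learn's standard k-means on standardized three-channel radiometric data; the data are synthetic stand-ins, and the original work used a convergent k-means variant combined with hierarchical clustering rather than this plain implementation.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(42)

        # Synthetic stand-in for three radiometric variables per observation (e.g. K, eU, eTh).
        background = rng.normal(loc=[1.5, 2.0, 8.0], scale=0.3, size=(500, 3))
        anomalies = rng.normal(loc=[1.5, 6.0, 8.0], scale=0.3, size=(20, 3))   # anomalously high second channel
        data = np.vstack([background, anomalies])

        # Standardize, then partition into four clusters, mirroring the report's result.
        scaled = StandardScaler().fit_transform(data)
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scaled)

        for c in range(4):
            members = data[labels == c]
            print(f"cluster {c}: n={len(members)}, mean={members.mean(axis=0).round(2)}")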

  17. Development of PIC-based digital survey meter

    International Nuclear Information System (INIS)

    Nor Arymaswati Abdullah; Nur Aira Abdul Rahman; Mohd Ashhar Khalid; Taiman Kadni; Glam Hadzir Patai Mohamad; Abd Aziz Mhd Ramli; Chong Foh Yong

    2006-01-01

    The need for radiation monitoring and for monitoring of radioactive contamination in the workplace is very important, especially when x-ray machines, linear accelerators, electron beam machines and radioactive sources are present. The appropriate use of a radiation detector is significant in order to maintain a radiation- and contamination-free workplace. This paper reports on the development of a prototype PIC-based digital survey meter. This prototype digital survey meter is a hand-held instrument for general-purpose radiation monitoring and surface contamination measurement. Generally, the device is able to detect some or all of the three major types of ionizing radiation, namely alpha, beta and gamma. It uses a Geiger-Muller tube as a radiation detector, which converts gamma radiation quanta to electric pulses that are further processed by the electronic devices. The development involved the design of the controller, counter and high-voltage circuits. All these circuits are assembled and enclosed in a plastic casing together with a GM detector and LCD display to form a prototype survey meter. The number of counts of the pulses detected by the survey meter varies due to the random nature of radioactivity. By averaging the reading over a time period, a more accurate and stable reading is achieved. To test the accuracy and linearity of the design, the prototype was calibrated using standard procedures at the Secondary Standard Dosimetry Laboratory (SSDL) in MINT. (Author)
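
    The time-averaging mentioned at the end of the abstract can be sketched as a simple rolling mean of per-second GM counts; the count rate and window length below are assumptions for illustration, not values taken from the prototype.

        import numpy as np

        rng = np.random.default_rng(7)

        # Simulated GM-tube counts per one-second interval (Poisson, assumed mean rate).
        counts = rng.poisson(lam=12.0, size=120)

        # Rolling average over a 10 s window to stabilize the displayed reading.
        window = 10
        smoothed_cps = np.convolve(counts, np.ones(window) / window, mode="valid")

        print("instantaneous cps (last 5 s):", counts[-5:])
        print("averaged cps (last 5 values):", smoothed_cps[-5:].round(2))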

  18. Investigation of background acoustical effect on online surveys: A case study of a farmers' market customer survey

    Science.gov (United States)

    Tang, Xingdi

    Since the middle of the 1990s, the internet has become a new platform for surveys. Previous studies have discussed the visual design features of internet surveys. However, the application of acoustics as a design characteristic of online surveys has rarely been investigated. The present study aimed to fill that research gap. The purpose of the study was to assess the impact of background sound on respondents' engagement and satisfaction with online surveys. Two forms of background sound were evaluated: audio recorded in studios and audio edited with a convolution reverb technique. The author recruited 80 undergraduate students for the experiment. These students were assigned to one of three groups. Each of the three groups was asked to evaluate their engagement and satisfaction with a specific online survey. The content of the online survey was the same; however, the three groups were exposed to the online survey with no background sound, with background sound recorded in studios, and with background sound edited with the convolution reverb technique. The results showed no significant difference in engagement and satisfaction across the three versions of the online survey: without background sound, with background sound recorded in studios, and with background sound edited with the convolution reverb technique. The author suggests that background sound does not contribute to online surveys in all contexts. Industry practitioners should carefully evaluate the survey context to decide whether background sound should be added. In particular, ear-piercing noise or acoustics that may be linked to respondents' unpleasant experiences should be avoided. Moreover, although the results did not support the advantage of the convolution reverb technique in improving respondents' engagement and satisfaction, the author suggests that the potential of the convolution reverb technique in the applications of online surveys cannot be totally dismissed, since it may be useful for some contexts which need further

  19. A Survey of Public Key Infrastructure-Based Security for Mobile Communication Systems

    Directory of Open Access Journals (Sweden)

    Mohammed Ramadan

    2016-08-01

    Full Text Available Mobile communication security techniques are employed to guard the communication between network entities. Mobile cellular communication systems have become one of the most important communication systems in recent times and are used by millions of people around the world. Since the 1990s, considerable efforts have been made to improve both the communication and security features of mobile communications systems. These improvements divide the mobile communications field into different generations according to the communication and security techniques used, such as the A3, A5 and A8 algorithms for the 2G-GSM cellular system, and the 3G authentication and key agreement (AKA), evolved packet system authentication and key agreement (EPS-AKA), and long term evolution authentication and key agreement (LTE-AKA) algorithms for 3rd Generation Partnership Project (3GPP) systems. Furthermore, these generations have many vulnerabilities, and substantial security work is required to solve such problems. Some of this work lies in the field of public key cryptography (PKC), which requires a high computational cost and greater network flexibility to be achieved. As such, the public key infrastructure (PKI) is more compatible with the modern generations due to their superior communication features. This paper surveys the latest proposed works on the security of GSM, CDMA, and LTE cellular systems using PKI. Firstly, we present the security issues for each generation of mobile communication systems; then we study and analyze the latest proposed schemes and give some comparisons. Finally, we introduce some new directions for future work. This paper classifies the mobile communication security schemes according to the techniques used for each cellular system and covers some of the PKI-based security techniques such as authentication, key agreement, and privacy preservation.

  20. Development of Energy Management System Based on Internet of Things Technique

    OpenAIRE

    Wen-Jye Shyr; Chia-Ming Lin and Hung-Yun Feng

    2017-01-01

    The purpose of this study was to develop an energy management system for university campuses based on the Internet of Things (IoT) technique. The proposed IoT technique, based on WebAccess, is accessed via the Internet Explorer web browser and uses the TCP/IP protocol. A case study of an IoT-based lighting energy usage management system is presented. The structure of the proposed IoT technique includes a perception layer, an equipment layer, a control layer, an application layer and a network layer.

  1. Biosensor-based microRNA detection: techniques, design, performance, and challenges.

    Science.gov (United States)

    Johnson, Blake N; Mutharasan, Raj

    2014-04-07

    The current state of biosensor-based techniques for amplification-free microRNA (miRNA) detection is critically reviewed. Comparison with non-sensor and amplification-based molecular techniques (MTs), such as polymerase-based methods, is made in terms of transduction mechanism, associated protocol, and sensitivity. Challenges associated with miRNA hybridization thermodynamics which affect assay selectivity and amplification bias are briefly discussed. Electrochemical, electromechanical, and optical classes of miRNA biosensors are reviewed in terms of transduction mechanism, limit of detection (LOD), time-to-results (TTR), multiplexing potential, and measurement robustness. Current trends suggest that biosensor-based techniques (BTs) for miRNA assay will complement MTs due to the advantages of amplification-free detection, LOD being femtomolar (fM)-attomolar (aM), short TTR, multiplexing capability, and minimal sample preparation requirement. Areas of future importance in miRNA BT development are presented which include focus on achieving high measurement confidence and multiplexing capabilities.

  2. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2007-01-01

    Full Text Available In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is presented. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art in applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today’s cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way forward. However, the switch from the traditional methods of designing automotive sensors to the new ones cannot be made overnight, because there are some open research issues that have to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers’ interest in the fusion of intelligent sensors and optimal signal processing techniques.

  3. An employee total health management-based survey of Iowa employers.

    Science.gov (United States)

    Merchant, James A; Lind, David P; Kelly, Kevin M; Hall, Jennifer L

    2013-12-01

    Objective: To implement an Employee Total Health Management (ETHM) model-based questionnaire and provide estimates of model program elements among a statewide sample of Iowa employers. Methods: Survey of a stratified random sample of Iowa employers to characterize and estimate employer participation in ETHM program elements. Results: Iowa employers are implementing fewer than 30% of all 12 components of ETHM, with the exception of occupational safety and health (46.6%) and workers' compensation insurance coverage (89.2%), but intend modest expansion of all components in the coming year. Conclusions: The ETHM questionnaire-based survey provides estimates of the progress Iowa employers are making toward implementing components of Total Worker Health programs.

  4. A Survey on Multimedia-Based Cross-Layer Optimization in Visual Sensor Networks

    Science.gov (United States)

    Costa, Daniel G.; Guedes, Luiz Affonso

    2011-01-01

    Visual sensor networks (VSNs) comprised of battery-operated electronic devices endowed with low-resolution cameras have expanded the applicability of a series of monitoring applications. Those types of sensors are interconnected by ad hoc error-prone wireless links, imposing stringent restrictions on available bandwidth, end-to-end delay and packet error rates. In such context, multimedia coding is required for data compression and error-resilience, also ensuring energy preservation over the path(s) toward the sink and improving the end-to-end perceptual quality of the received media. Cross-layer optimization may enhance the expected efficiency of VSNs applications, disrupting the conventional information flow of the protocol layers. When the inner characteristics of the multimedia coding techniques are exploited by cross-layer protocols and architectures, higher efficiency may be obtained in visual sensor networks. This paper surveys recent research on multimedia-based cross-layer optimization, presenting the proposed strategies and mechanisms for transmission rate adjustment, congestion control, multipath selection, energy preservation and error recovery. We note that many multimedia-based cross-layer optimization solutions have been proposed in recent years, each one bringing a wealth of contributions to visual sensor networks. PMID:22163908

  5. Who should be undertaking population-based surveys in humanitarian emergencies?

    Directory of Open Access Journals (Sweden)

    Spiegel Paul B

    2007-06-01

    , coordinate when and where surveys should be undertaken and act as a survey repository. Technical expertise is expensive and donors must pay for it. As donors increasingly demand evidence-based programming, they have an obligation to ensure that sufficient funds are provided so organisations have adequate technical staff.

  6. Knowledge and use of evidence-based nutrition : a survey of paediatric dietitians

    NARCIS (Netherlands)

    Thomas, DE; Kukuruzovic, R; Martino, B; Chauhan, SS; Elliott, EJ

    2003-01-01

    Objective: To survey paediatric dietitians' knowledge and use of evidence-based nutrition (EBN). Design: Cross-sectional survey using reply-paid questionnaires. Subjects: Paediatric dietitians in Australian teaching hospitals. Main outcome measures: Age, sex, appointment, clinical practice, research

  7. Financial planning and analysis techniques of mining firms: a note on Canadian practice

    Energy Technology Data Exchange (ETDEWEB)

    Blanco, H.; Zanibbi, L.R. (Laurentian University, Sudbury, ON (Canada). School of Commerce and Administration)

    1992-06-01

    This paper reports on the results of a survey of the financial planning and analysis techniques in use in the mining industry in Canada. The study was undertaken to determine the current status of these practices within mining firms in Canada and to investigate the extent to which the techniques are grouped together within individual firms. In addition, tests were performed on the relationship between these groups of techniques and both organizational size and price volatility of end product. The results show that a few techniques are widely utilized in this industry but that the techniques used most frequently are not as sophisticated as reported in previous, more broadly based surveys. The results also show that firms tend to use 'bundles' of techniques and that the relative use of some of these groups of techniques is weakly associated with both organizational size and type of end product. 19 refs., 7 tabs.

  8. Assessment of health surveys: fitting a multidimensional graded response model.

    Science.gov (United States)

    Depaoli, Sarah; Tiemensma, Jitske; Felt, John M

    The multidimensional graded response model, an item response theory (IRT) model, can be used to improve the assessment of surveys, even when sample sizes are restricted. Typically, health-based survey development utilizes classical statistical techniques (e.g. reliability and factor analysis). In a review of four prominent journals within the field of Health Psychology, we found that IRT-based models were used in less than 10% of the studies examining scale development or assessment. However, implementing IRT-based methods can provide more details about individual survey items, which is useful when determining the final item content of surveys. An example using a quality of life survey for Cushing's syndrome (CushingQoL) highlights the main components for implementing the multidimensional graded response model. Patients with Cushing's syndrome (n = 397) completed the CushingQoL. Results from the multidimensional graded response model supported a 2-subscale scoring process for the survey. All items were deemed as worthy contributors to the survey. The graded response model can accommodate unidimensional or multidimensional scales, be used with relatively lower sample sizes, and is implemented in free software (example code provided in online Appendix). Use of this model can help to improve the quality of health-based scales being developed within the Health Sciences.
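
    For readers unfamiliar with the model, the graded response model in its standard textbook (unidimensional) form expresses the probability of responding in category k or above as a logistic function of the latent trait; this is general background rather than a formula reproduced from the article, and in the multidimensional case the trait and discrimination parameters become vectors.

        P\left(X_{ij} \ge k \mid \theta_i\right) = \frac{1}{1 + \exp\left[-a_j\left(\theta_i - b_{jk}\right)\right]},
        \qquad
        P\left(X_{ij} = k \mid \theta_i\right) = P\left(X_{ij} \ge k \mid \theta_i\right) - P\left(X_{ij} \ge k+1 \mid \theta_i\right),

    where θ_i is respondent i's latent trait (here, quality of life), a_j is the discrimination of item j, and b_jk is the threshold parameter for category k of item j.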

  9. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    Science.gov (United States)

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guildford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques. The remaining 22 techniques were classified as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI is a function of both intervention content and the interpersonal style

  10. Research in decommissioning techniques for nuclear fuel cycle facilities in JNC. 7. JWTF decommissioning techniques

    International Nuclear Information System (INIS)

    Ogawa, Ryuichiro; Ishijima, Noboru

    1999-02-01

    Decommissioning techniques used worldwide, such as radiation measuring and monitoring, decontamination, dismantling and remote handling, were surveyed in order to upgrade the technical know-how database for decommissioning of the Joyo Waste Treatment Facility (JWTF). As a result, five publications on measuring and monitoring techniques, 14 on decontamination and 22 on dismantling feasible for JWTF decommissioning were obtained and summarized in tables. On the basis of this research, the practical applicability of those techniques to the decommissioning of JWTF was evaluated. This report contains brief survey summaries related to JWTF decommissioning. (H. Itami)

  11. Understanding Patient Experience Using Internet-based Email Surveys: A Feasibility Study at Mount Sinai Hospital.

    Science.gov (United States)

    Morgan, Matthew; Lau, Davina; Jivraj, Tanaz; Principi, Tania; Dietrich, Sandra; Bell, Chaim M

    2015-01-01

    Email is becoming a widely accepted communication tool in healthcare settings. This study sought to test the feasibility of Internet-based email surveys of patient experience in the ambulatory setting. We conducted a study of email Internet-based surveys sent to patients in selected ambulatory clinics at Mount Sinai Hospital in Toronto, Canada. Our findings suggest that email links to Internet surveys are a feasible, timely and efficient method to solicit patient feedback about their experience. Further research is required to optimally leverage Internet-based email surveys as a tool to better understand the patient experience.

  12. Address-based versus random-digit-dial surveys: comparison of key health and risk indicators.

    Science.gov (United States)

    Link, Michael W; Battaglia, Michael P; Frankel, Martin R; Osborn, Larry; Mokdad, Ali H

    2006-11-15

    Use of random-digit dialing (RDD) for conducting health surveys is increasingly problematic because of declining participation rates and eroding frame coverage. Alternative survey modes and sampling frames may improve response rates and increase the validity of survey estimates. In a 2005 pilot study conducted in six states as part of the Behavioral Risk Factor Surveillance System, the authors administered a mail survey to selected household members sampled from addresses in a US Postal Service database. The authors compared estimates based on data from the completed mail surveys (n = 3,010) with those from the Behavioral Risk Factor Surveillance System telephone surveys (n = 18,780). The mail survey data appeared reasonably complete, and estimates based on data from the two survey modes were largely equivalent. Differences found, such as differences in the estimated prevalences of binge drinking (mail = 20.3%, telephone = 13.1%) or behaviors linked to human immunodeficiency virus transmission (mail = 7.1%, telephone = 4.2%), were consistent with previous research showing that, for questions about sensitive behaviors, self-administered surveys generally produce higher estimates than interviewer-administered surveys. The mail survey also provided access to cell-phone-only households and households without telephones, which cannot be reached by means of standard RDD surveys.
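
    As a rough illustration of how two mode-specific estimates can be contrasted, the sketch below applies an ordinary two-proportion z-test to the reported binge-drinking figures, using the completed-interview counts as if they were simple random samples; the actual analysis would account for survey weights and design effects, which this sketch deliberately ignores.

        import math

        # Reported binge-drinking prevalence and completed-interview counts by mode.
        p_mail, n_mail = 0.203, 3010
        p_tel, n_tel = 0.131, 18780

        # Simple two-proportion z-test (no survey weights or design effects).
        p_pool = (p_mail * n_mail + p_tel * n_tel) / (n_mail + n_tel)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_mail + 1 / n_tel))
        z = (p_mail - p_tel) / se
        print(f"pooled p = {p_pool:.3f}, z = {z:.1f}")   # a large |z| suggests a mode difference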

  13. A Brief History of the use of Electromagnetic Induction Techniques in Soil Survey

    Science.gov (United States)

    Brevik, Eric C.; Doolittle, James

    2017-04-01

    Electromagnetic induction (EMI) has been used to characterize the spatial variability of soil properties since the late 1970s. Initially used to assess soil salinity, the use of EMI in soil studies has expanded to include: mapping soil types; characterizing soil water content and flow patterns; assessing variations in soil texture, compaction, organic matter content, and pH; and determining the depth to subsurface horizons, stratigraphic layers or bedrock, among other uses. In all cases the soil property being investigated must influence soil apparent electrical conductivity (ECa) either directly or indirectly for EMI techniques to be effective. An increasing number and diversity of EMI sensors have been developed in response to users' needs and the availability of allied technologies, which have greatly improved the functionality of these tools and increased the amount and types of data that can be gathered with a single pass. EMI investigations provide several benefits for soil studies. The large amount of georeferenced data that can be rapidly and inexpensively collected with EMI provides more complete characterization of the spatial variations in soil properties than traditional sampling techniques. In addition, compared to traditional soil survey methods, EMI can more effectively characterize diffuse soil boundaries and identify included areas of dissimilar soils within mapped soil units, giving soil scientists greater confidence when collecting spatial soil information. EMI techniques do have limitations; results are site-specific and can vary depending on the complex interactions among multiple and variable soil properties. Despite this, EMI techniques are increasingly being used to investigate the spatial variability of soil properties at field and landscape scales. The future should witness a greater use of multiple-frequency and multiple-coil EMI sensors and integration with other sensors to assess the spatial variability of soil properties. Data analysis

  14. Digital mapping techniques '06 - Workshop proceedings

    Science.gov (United States)

    Soller, David R.

    2007-01-01

    The Digital Mapping Techniques '06 (DMT'06) workshop was attended by more than 110 technical experts from 51 agencies, universities, and private companies, including representatives from 27 state geological surveys (see Appendix A of these Proceedings). This workshop was similar in nature to the previous nine meetings, which were held in Lawrence, Kansas (Soller, 1997), Champaign, Illinois (Soller, 1998), Madison, Wisconsin (Soller, 1999), Lexington, Kentucky (Soller, 2000), Tuscaloosa, Alabama (Soller, 2001), Salt Lake City, Utah (Soller, 2002), Millersville, Pennsylvania (Soller, 2003), Portland, Oregon (Soller, 2004), and Baton Rouge, Louisiana (Soller, 2005). This year's meeting was hosted by the Ohio Geological Survey, from June 11-14, 2006, on the Ohio State University campus in Columbus, Ohio. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, the latter of which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database - and for the State and Federal geological surveys - to provide more high-quality digital maps to the public. At the 2006 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, "publishing" includes Web-based release); 2) field data

  15. Effects of Personalization and Invitation Email Length on Web-Based Survey Response Rates

    Science.gov (United States)

    Trespalacios, Jesús H.; Perkins, Ross A.

    2016-01-01

    Individual strategies to increase response rate and survey completion have been extensively researched. Recently, efforts have been made to investigate a combination of interventions to yield better response rates for web-based surveys. This study examined the effects of four different survey invitation conditions on response rate. From a large…

  16. An Employee Total Health Management–Based Survey of Iowa Employers

    Science.gov (United States)

    Merchant, James A.; Lind, David P.; Kelly, Kevin M.; Hall, Jennifer L.

    2015-01-01

    Objective To implement an Employee Total Health Management (ETHM) model-based questionnaire and provide estimates of model program elements among a statewide sample of Iowa employers. Methods Survey a stratified random sample of Iowa employers, characterize and estimate employer participation in ETHM program elements Results Iowa employers are implementing under 30% of all 12 components of ETHM, with the exception of occupational safety and health (46.6%) and worker compensation insurance coverage (89.2%), but intend modest expansion of all components in the coming year. Conclusions The Employee Total Health Management questionnaire-based survey provides estimates of progress Iowa employers are making toward implementing components of total worker health programs. PMID:24284757

  17. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques, such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR-Fourier-transform Infrared (SRFTIR) spectroscopy, and Hard X-ray Photoelectron Spectroscopy (HAXPS), which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources, these analytical techniques have been further revitalized and have paved the way for new techniques such as microprobe XRF and XAS, FTIR microscopy, and HAXPS. The talk will cover mainly two techniques illustrating their capability in analytical research, namely XRF and XAS. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (proton or alpha particle), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the area of microprobe XRF imaging and trace-level compositional characterisation of samples. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR-induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10⁻¹⁷ g to 10⁻¹⁴ g (depending on the particular element and matrix). Keeping in mind its demand, a microprobe XRF beamline has been set up by RRCAT at the Indus-2 synchrotron

  18. Knowledge based systems advanced concepts, techniques and applications

    CERN Document Server

    1997-01-01

    The field of knowledge-based systems (KBS) has expanded enormously during the last years, and many important techniques and tools are currently available. Applications of KBS range from medicine to engineering and aerospace. This book provides a selected set of state-of-the-art contributions that present advanced techniques, tools and applications. These contributions have been prepared by a group of eminent researchers and professionals in the field. The theoretical topics covered include: knowledge acquisition, machine learning, genetic algorithms, knowledge management and processing under uncertainty

  19. Combination of multielement technique (INAA and ICP-MS) for a French air pollution bio-monitoring survey using mosses

    International Nuclear Information System (INIS)

    Ayrault, S.; Deschamps, C.; Amblard, G.; Galsomies, L.; Letrouit-Galinou, M.A.; Bonhomme, P.

    1998-01-01

    This work presents the use of two trace analysis techniques through the data obtained for a significant part of the 557 mosses sampled in France. Sampling was carried out within the framework of the European survey conducted in 1995-1996 and proposed by the Nordic Council. The analyses were performed with a combination of two multielement analysis techniques: INAA (Instrumental Neutron Activation Analysis) and ICP-MS (Inductively Coupled Plasma Mass Spectrometry). These two techniques are suitable for trace analyses in mosses. They are clearly complementary and provided data for 36 elements, including the heavy metals of key interest in air pollution studies. The choice of the technique for a given element depended on the feasibility (e.g., Pb is not attainable by INAA), the detection limit, the analytical variability, the preparation procedures and the concentration ranges (5-100 μg/g for Pb, 0.5-5 μg/g for As). INAA measures the total content of the sample, while ICP-MS demands a mineralization procedure, resulting in losses/contamination hazards. Thus, INAA results were preferred, although this technique is time consuming. However, the ICP-MS results for Cd, Cu, Ni and Pb were retained, for different reasons: detection limits (Cd, Cu), no convenient INAA conditions (Ni), and feasibility (Pb). (authors)

  20. Fractal Image Compression Based on High Entropy Values Technique

    Directory of Open Access Journals (Sweden)

    Douaa Younis Abbaas

    2018-04-01

    Full Text Available Many attempts have been made to improve the encoding stage of fractal image compression (FIC) because it is time consuming. These attempts reduce the size of the search pool for range-domain matching, but most of them lead to lower quality or a lower compression ratio of the reconstructed image. This paper presents a method to improve the performance of the full search algorithm by combining FIC (lossy compression) with a lossless technique (in this case entropy coding). The entropy technique reduces the size of the domain pool (i.e., the number of domain blocks) based on the entropy value of each range block and domain block. The results of the full search algorithm and of the proposed entropy-based algorithm are then compared to see which gives the better results, such as a reduced encoding time with acceptable values of the two compression quality parameters, C.R (Compression Ratio) and PSNR (Image Quality). The experimental results show that the proposed entropy technique reduces the encoding time while keeping the compression ratio and reconstructed image quality as good as possible.
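    The entropy-based pruning of the domain pool can be illustrated with a minimal sketch; the block size, search step and tolerance below are illustrative assumptions rather than the parameters used in the paper.

      import numpy as np

      def block_entropy(block, bins=256):
          """Shannon entropy (in bits) of an 8-bit image block."""
          hist, _ = np.histogram(block, bins=bins, range=(0, 255))
          p = hist[hist > 0] / hist.sum()
          return -np.sum(p * np.log2(p))

      def filter_domain_pool(image, range_block, domain_size=16, step=8, tol=0.5):
          """Keep only domain blocks whose entropy is close to that of the range block,
          shrinking the pool searched during range-domain matching."""
          target = block_entropy(range_block)
          h, w = image.shape
          kept = []
          for y in range(0, h - domain_size + 1, step):
              for x in range(0, w - domain_size + 1, step):
                  d = image[y:y + domain_size, x:x + domain_size]
                  if abs(block_entropy(d) - target) <= tol:
                      kept.append((y, x))
          return kept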

  1. The Accuracy Assessment of Determining the Axis of Railway Track Basing on the Satellite Surveying

    Science.gov (United States)

    Koc, Władysław; Specht, Cezary; Chrostowski, Piotr; Palikowska, Katarzyna

    2012-09-01

    In 2009, continuous satellite surveying of a railway track was carried out for the first time at the Gdansk University of Technology, using the relative phase method based on the active geodetic network ASG-EUPOS and the NAVGEO service. Ongoing research focuses on evaluating a GNSS multi-receiver platform for design and stock-taking work. In order to assess the accuracy of the railway track axis position, the values of the transverse deviation XTE (Cross Track Error) were evaluated. In order to eliminate the influence of random measurement errors and to obtain coordinates representing the actual shape of the track, the XTE variable was analyzed with signal analysis methods (Chebyshev low-pass filtering and the fast Fourier transform). Finally, the paper presents a module of the computer software SATTRACK, which is currently being developed at the Gdansk University of Technology. The program supports the visualization, assessment and design of railway track, adapted to the technique of continuous satellite surveying. The module called TRACK STRAIGHT is designed to assess straight sections. A description of its operation as well as examples of its functions are presented.
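    As a rough illustration of the signal-analysis step, the sketch below applies a zero-phase Chebyshev type-I low-pass filter to an XTE record; the sampling rate, cutoff frequency and filter order are assumptions, not values taken from the study.

      import numpy as np
      from scipy.signal import cheby1, filtfilt

      def smooth_xte(xte, fs=1.0, cutoff_hz=0.05, order=4, ripple_db=0.5):
          """Zero-phase Chebyshev type-I low-pass filtering of a Cross Track Error signal,
          suppressing random measurement noise while keeping the slow track-shape component."""
          b, a = cheby1(order, ripple_db, cutoff_hz, btype='low', fs=fs)
          return filtfilt(b, a, np.asarray(xte, dtype=float))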

  2. Auto-correlation based intelligent technique for complex waveform presentation and measurement

    International Nuclear Information System (INIS)

    Rana, K P S; Singh, R; Sayann, K S

    2009-01-01

    Waveform acquisition and presentation forms the heart of many measurement systems. In particular, the acquisition and presentation of repeating complex signals such as sine sweeps and frequency-modulated signals introduces the challenge of waveform time-period estimation and live waveform presentation. This paper presents an intelligent technique for waveform period estimation of both complex and simple waveforms, based on the normalized auto-correlation method. The proposed technique is demonstrated using LabVIEW based intensive simulations on several simple and complex waveforms. Implementation of the technique is successfully demonstrated using LabVIEW based virtual instrumentation. Sine sweep vibration waveforms are successfully presented and measured for vibrations generated by an electrodynamic shaker system. The proposed method is also suitable for digital storage oscilloscope (DSO) triggering, for complex signal acquisition and presentation. This intelligence can be embodied into the DSO, making it an intelligent measurement system catering to a wide variety of waveforms. The proposed technique, simulation results, robustness study and implementation results are presented in this paper.
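    Although the paper implements the method in LabVIEW, the core idea of period estimation from the normalized auto-correlation can be sketched in a few lines; the peak-picking rule used here is a common heuristic and only an assumption about how such an estimator might be built.

      import numpy as np

      def estimate_period(x, fs):
          """Estimate the period (in seconds) of a repeating waveform sampled at fs Hz
          from its normalized auto-correlation."""
          x = np.asarray(x, dtype=float) - np.mean(x)
          r = np.correlate(x, x, mode='full')[len(x) - 1:]   # lags 0 .. N-1
          r /= r[0]                                          # normalize so r[0] == 1
          below = np.where(r < 0)[0]                         # wait until the correlation dips
          if len(below) == 0:
              return None                                    # no clear periodicity found
          start = below[0]
          peak = start + np.argmax(r[start:])                # first dominant peak after the dip
          return peak / fs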

  3. Recruiting Pregnant Patients for Survey Research: A Head to Head Comparison of Social Media-Based Versus Clinic-Based Approaches.

    Science.gov (United States)

    Admon, Lindsay; Haefner, Jessica K; Kolenic, Giselle E; Chang, Tammy; Davis, Matthew M; Moniz, Michelle H

    2016-12-21

    Recruiting a diverse sample of pregnant women for clinical research is a challenging but crucial task for improving obstetric services and maternal and child health outcomes. The objective was to compare the feasibility and cost of recruiting pregnant women for survey research using social media-based and clinic-based approaches. Advertisements were used to recruit pregnant women from the social media website Facebook. In-person methods were used to recruit pregnant women from the outpatient clinic of a large, tertiary care center. In both approaches, potential respondents were invited to participate in a 15-minute Web-based survey. Each recruitment method was monitored for 1 month. Using bivariate statistics, we compared the number, demographic characteristics, and health characteristics of women recruited and the cost per completed survey for each recruitment method. The social media-based approach recruited 1178 women and the clinic-based approach recruited 219 women. A higher proportion of subjects recruited through social media identified as African American (29.4%, 207/705 vs 11.2%, 20/179) and reported lower household incomes, whereas a lower proportion had earned a college degree (21.3%, 153/717 vs 62.3%, 114/183) or were married or in a domestic partnership (45.7%, 330/722 vs 72.1%, 132/183); all differences were statistically significant. Social media-based recruitment costs were US $14.63 per completed survey, compared with US $23.51 for clinic-based recruitment. Web-based recruitment through a social networking platform is a feasible, inexpensive, and rapid means of recruiting a large, diverse sample of pregnant women for survey research. ©Lindsay Admon, Jessica K Haefner, Giselle E Kolenic, Tammy Chang, Matthew M Davis, Michelle H Moniz. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 21.12.2016.

  4. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    2011-01-01

    The estimation of the line impedance can be used by the control of numerous grid-connected systems, such as active filters, islanding detection techniques, non-linear current controllers, detection of the on/off grid operation mode. Therefore, estimating the line impedance can add extra functions...... into the operation of the grid-connected power converters. This paper describes a quasi passive method for estimating the line impedance of the distribution electricity network. The method uses the model based identification technique to obtain the resistive and inductive parts of the line impedance. The quasi...
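    The abstract does not give the identification model itself, so the following is only a hedged sketch of how the resistive and inductive parts of a line impedance might be fitted from sampled voltage and current, assuming the simple series model v(t) = R*i(t) + L*di/dt.

      import numpy as np

      def estimate_rl(v, i, dt):
          """Least-squares fit of series resistance R and inductance L from sampled
          voltage v and current i (assumed series R-L model), with sample period dt."""
          v = np.asarray(v, dtype=float)
          i = np.asarray(i, dtype=float)
          di_dt = np.gradient(i, dt)
          A = np.column_stack([i, di_dt])
          (r, l), *_ = np.linalg.lstsq(A, v, rcond=None)
          return r, l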

  5. Craniospinal radiotherapy in children: Electron- or photon-based technique of spinal irradiation

    International Nuclear Information System (INIS)

    Chojnacka, M.; Skowronska-Gardas, A.; Pedziwiatr, K.; Morawska-Kaczynska, M.; Zygmuntowicz-Pietka, A.; Semaniak, A.

    2010-01-01

    Background: The prone position and an electron-based technique for craniospinal irradiation (CSI) have been standard in our department for many years. However, this immobilization makes it difficult for the anaesthesiologist to gain airway access. The increasing number of children treated under anaesthesia led us to reconsider our technique. Aim: The purpose of this study is to report our new photon-based technique for CSI, which can be applied in both the supine and the prone position, and to compare it with our electron-based technique. Materials and methods: Between November 2007 and May 2008, 11 children with brain tumours were treated in the prone position with CSI. For 9 patients, two treatment plans were created: the first using photons and the second using electron beams for spinal irradiation. We prepared seven 3D-conformal photon plans and four forward-planned segmented field plans. We compared 20 treatment plans in terms of target dose homogeneity and sparing of organs at risk. Results: The segmented field plans achieved better dose homogeneity in the thecal sac volume than the electron-based plans. Regarding doses to organs at risk, the photon-based plans gave a lower dose to the thyroid but a higher one to the heart and liver. Conclusions: Our technique can be applied in both the supine and prone position and appears to be more feasible and precise than the electron technique. However, the more homogeneous target coverage and higher precision of dose delivery for photons are obtained at the cost of slightly higher doses to the heart and liver. (authors)

  6. Graph based techniques for tag cloud generation

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2013-01-01

    A tag cloud is one of the navigation aids for exploring documents; tag clouds also link documents through user-defined terms. We explore various graph based techniques to improve tag cloud generation. Moreover, we introduce relevance measures based on underlying data such as ratings...... or citation counts for improved measurement of the relevance of tag clouds. We show that, on the given data sets, our approach outperforms the state-of-the-art baseline methods with respect to such relevance by 41 % on the Movielens dataset and by 11 % on the Bibsonomy data set....
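    The abstract does not name a specific graph algorithm, so the sketch below is only one plausible instance of the idea: build a weighted tag co-occurrence graph and rank tags by PageRank (using the third-party networkx package); the scoring choice and parameters are assumptions.

      import itertools
      import networkx as nx

      def tag_cloud(documents_tags, top_k=20):
          """Rank tags by PageRank over a weighted tag co-occurrence graph.
          documents_tags: iterable of tag collections, one per document."""
          g = nx.Graph()
          for tags in documents_tags:
              for a, b in itertools.combinations(sorted(set(tags)), 2):
                  w = g[a][b]['weight'] + 1 if g.has_edge(a, b) else 1
                  g.add_edge(a, b, weight=w)
          scores = nx.pagerank(g, weight='weight')
          return sorted(scores, key=scores.get, reverse=True)[:top_k]

      # Example: tag_cloud([{"python", "gis"}, {"python", "survey"}, {"gis", "survey"}], top_k=2)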

  7. Backstepping Based Formation Control of Quadrotors with the State Transformation Technique

    Directory of Open Access Journals (Sweden)

    Keun Uk Lee

    2017-11-01

    Full Text Available In this paper, a backstepping-based formation control of quadrotors with the state transformation technique is proposed. First, the dynamics of a quadrotor is derived by using the Newton–Euler formulation. Next, a backstepping-based formation control for quadrotors using a state transformation technique is presented. In the position control, which is the basis of formation control, it is possible to derive the reference attitude angles employing a state transformation technique without the small angle assumption or the simplified dynamics usually used. Stability analysis based on the Lyapunov theorem shows that the proposed formation controller can provide a quadrotor formation error system that is asymptotically stabilized. Finally, we verify the performance of the proposed formation control method through comparison simulations.

  8. Refractive index sensor based on optical fiber end face using pulse reference-based compensation technique

    Science.gov (United States)

    Bian, Qiang; Song, Zhangqi; Zhang, Xueliang; Yu, Yang; Chen, Yuzhong

    2018-03-01

    We proposed a refractive index sensor based on an optical fiber end face using a pulse reference-based compensation technique. With the good compensation effect of this technique, the power fluctuation of the light source and the changes in optical component transmission loss and coupler splitting ratio can be compensated, which largely reduces the background noise. The refractive index resolutions can reach 3.8 × 10^-6 RIU and 1.6 × 10^-6 RIU in different refractive index regions.

  9. Supply chain simulation tools and techniques: a survey

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    The main contribution of this paper is twofold: it surveys different types of simulation for supply chain management, and it discusses several methodological issues. The types of simulation covered are spreadsheet simulation, system dynamics, discrete-event simulation and business games.

  10. A survey of numerical cubature over triangles

    Energy Technology Data Exchange (ETDEWEB)

    Lyness, J.N.; Cools, R.

    1993-12-31

    This survey collects together theoretical results in the area of numerical cubature over triangles and is a vehicle for a current bibliography. We treat first the theory relating to regular integrands and then the corresponding theory for singular integrands, with emphasis on the "full corner singularity." Within these two sections we treat successively approaches based on transforming the triangle into a square, formulas based on polynomial moment fitting, and extrapolation techniques. Within each category we quote key theoretical results without proof, and relate other results and references to these. Nearly all the references we have found may be readily placed in one of these categories. This survey is theoretical in character and does not include recent work in adaptive and automatic integration.

  11. Twitter Strategies for Web-Based Surveying: Descriptive Analysis From the International Concussion Study.

    Science.gov (United States)

    Hendricks, Sharief; Düking, Peter; Mellalieu, Stephen D

    2016-09-01

    Social media provides researchers with an efficient means to reach and engage with a large and diverse audience. Twitter allows for virtual social interaction among a network of users, which enables researchers to recruit and administer surveys using snowball sampling. Although using Twitter to administer surveys for research is not new, strategies to improve response rates are yet to be reported. The aim was to compare the potential and actual reach of 2 Twitter accounts that administered a Web-based concussion survey to rugby players and trainers using 2 distinct Twitter-targeting strategies, and to determine the likelihood of receiving a retweet based on the time of day and day of the week of posting. A survey based on previous concussion research was exported to the Web-based survey website Survey Monkey. The survey comprised 2 questionnaires, one for players and one for those involved in the game (eg, coaches and athletic trainers). The Web-based survey was administered using 2 existing Twitter accounts, each executing a distinct targeting strategy. A list of potential Twitter accounts to target was drawn up, together with a list of predesigned tweets. The list of accounts to target was divided into 'High-Profile' and 'Low-Profile', based on each account's potential to attract publicity with high social interaction. The potential reach (number of followers of the targeted accounts) and actual reach (number of retweets received by each post) of the 2 strategies were compared. The number of retweets received by each account was further analyzed to understand the most likely time of day, and day of the week, for a retweet to be received. The number of retweets received decreased by 72% when using the 'high-profile strategy' compared with the 'low-profile strategy' (incidence rate ratio, IRR, 0.28; 95% confidence interval, CI, 0.21-0.37; P<.001). Retweets were more likely to be received for posts made between 6 PM and 11:59 PM (IRR 1.48, 95% CI 1

  12. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are just as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  13. Digital Mapping Techniques '10-Workshop Proceedings, Sacramento, California, May 16-19, 2010

    Science.gov (United States)

    Soller, David R.; Soller, David R.

    2012-01-01

    The Digital Mapping Techniques '10 (DMT'10) workshop was attended by 110 technical experts from 40 agencies, universities, and private companies, including representatives from 19 State geological surveys (see Appendix A). This workshop, hosted by the California Geological Survey, May 16-19, 2010, in Sacramento, California, was similar in nature to the previous 13 meetings (see Appendix B). The meeting was coordinated by the U.S. Geological Survey's (USGS) National Geologic Map Database project. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was again successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products ("publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  14. Programmatic Environmental Scans: A Survey Based on Program Planning and Evaluation Concepts

    Directory of Open Access Journals (Sweden)

    Donna J. Peterson

    2015-10-01

    Full Text Available Within Extension, environmental scans are most commonly used to assess community or organizational issues or for strategic planning purposes. However, Extension has expanded the use of environmental scans to systematically identify “what programs exist” on a given topic or focus area. Yet, despite recent attention to the topic of environmental scanning in Extension, survey instruments used to conduct environmental scans have not been published. Given the emphasis on implementation of evidence-based practices and programs, having a ready-made survey that can be used to identify programs on a specific topic and that could subsequently lead to an evaluability assessment of those programs would be a useful resource. To encourage the use of environmental scans to identify existing evidence-based programs, this article describes a survey instrument developed for the purpose of scanning for 4-H Healthy Living programs ready for rigorous outcome evaluation and/or national replication. It focuses on the rationale for survey items, as well as provides a summary and definition of those items. The survey tool can be easily adapted for future programmatic environmental scans both within and outside Extension.

  15. Demonstrating the Potential for Web-Based Survey Methodology with a Case Study.

    Science.gov (United States)

    Mertler, Craig

    2002-01-01

    Describes personal experience with using the Internet to administer a teacher-motivation and job-satisfaction survey to elementary and secondary teachers. Concludes that the advantages of Web-based surveys, such as cost savings and efficiency of data collection, outweigh disadvantages, such as the limitations of listservs. (Contains 10 references.)

  16. Findings on education in Bogota (Colombia based on a 2014 multipurpose survey

    Directory of Open Access Journals (Sweden)

    Sandra Patricia Barragán Moreno

    2017-06-01

    Full Text Available This paper presents a study on education based on the 2014 Multipurpose Survey, financed by the District Department of Planning of the city of Bogotá and carried out by the National Statistics Department. The study focuses on determining the main reasons why people of school age were not studying at the time of the survey, and on characterizing the educational levels of household heads and their spouses or partners as the reference adults in the households surveyed. The survey was applied to a sample of 61,725 people, who according to the sample design represent 7,794,463 inhabitants of urban areas in Bogota. Using descriptive and data mining techniques, it is found that the two main reasons for not studying are the lack of money and the need to work. In addition, marital status is a more determinant predictor than sex or socioeconomic stratum. Single people are motivated to earn a college degree when they have had access to higher education at some point in their life. When studying the information of household heads, a similar behavior was observed, because the reasons for not studying were practically the same. Unplanned pregnancy and living with a partner are not outstanding reasons.

  17. Ultrabroadband Phased-Array Receivers Based on Optical Techniques

    Science.gov (United States)

    2016-02-26

    ...bandwidths, and with them receiver noise floors, are unavoidable. (Figure 1: SNR of a thermally limited receiver based on the Friis equation.) ... Techniques for RF and photonic integration based on liquid crystal polymer substrates were pursued that would aid in the realization of a potential imaging... These models assumed that sufficient LNA gain was used on the antenna to set the noise floor of the imaging receiver, which necessitated physical...

  18. LH2 Target Design & Position Survey Techniques for the MUSE experiment for Precise Proton Radius Measurement

    Science.gov (United States)

    Le Pottier, Luc; Roy, Pryiashee; Lorenzon, Wolfgang; Raymond, Richard; Steinberg, Noah; Rossi de La Fuente, Erick; MUSE (MUon proton Scattering Experiment) Collaboration

    2017-09-01

    The proton radius puzzle is a currently unresolved problem which has intrigued the scientific community, dealing with a 7 σ discrepancy between the proton radii determined from muonic hydrogen spectroscopy and electron scattering measurements. The MUon Scattering Experiment (MUSE) aims to resolve this puzzle by performing the first simultaneous elastic scattering measurements of both electrons and muons on the proton, which will allow the comparison of the radii from the two interactions with reduced systematic uncertainties. The data from this experiment is expected to provide the best test of lepton universality to date. The experiment will take place at the Paul Scherrer Institute in Switzerland in 2018. An essential component of the experiment is a liquid hydrogen (LH2) cryotarget system. Our group at the University of Michigan is responsible for the design, fabrication and installation of this system. Here we present our LH2 target cell design and fabrication techniques for successful operation at 20 K and 1 atm, and our computer vision-based target position survey system which will determine the position of the target, installed inside a vacuum chamber, with 0.01 mm or better precision at the height of the liquid hydrogen target and along the beam direction during the experiment.

  19. Improvement of digital image watermarking techniques based on FPGA implementation

    International Nuclear Information System (INIS)

    EL-Hadedy, M.E

    2006-01-01

    Digital watermarking establishes the ownership of a piece of digital data by marking the data invisibly or visibly. This can be used to protect several types of multimedia objects such as audio, text, images and video. This thesis demonstrates the different types of watermarking techniques, such as the discrete cosine transform (DCT) and discrete wavelet transform (DWT), and their characteristics. It then classifies these techniques, stating their advantages and disadvantages. An improved technique with distinguished features, such as peak signal-to-noise ratio (PSNR) and similarity ratio (SR), has been introduced. The modified technique has been compared with the other techniques by measuring their robustness against different attacks. Finally, a field programmable gate array (FPGA) based implementation and comparison for the proposed watermarking technique have been presented and discussed

  20. Application of spectroscopic techniques for the study of paper documents: A survey

    International Nuclear Information System (INIS)

    Manso, M.; Carvalho, M.L.

    2009-01-01

    For many centuries paper was the main material for recording cultural achievements all over the world. Paper is mostly made from cellulose with small amounts of organic and inorganic additives, which allow its identification and characterization and may also contribute to its degradation. Prior to 1850, paper was made entirely from rags, using hemp, flax and cotton fibres. After this period, due to the enormous increase in demand, wood pulp began to be commonly used as raw material, resulting in rapid degradation of paper. Spectroscopic techniques represent one of the most powerful tools to investigate the constituents of paper documents in order to establish its identification and its state of degradation. This review describes the application of selected spectroscopic techniques used for paper characterization and conservation. The spectroscopic techniques that have been used and will be reviewed include: Fourier-Transform Infrared spectroscopy, Raman spectroscopy, Nuclear Magnetic Resonance spectroscopy, X-Ray spectroscopy, Laser-based Spectroscopy, Inductively Coupled Mass Spectroscopy, Laser ablation, Atomic Absorption Spectroscopy and X-Ray Photoelectron Spectroscopy.

  1. A Survey of Spatio-Temporal Grouping Techniques

    National Research Council Canada - National Science Library

    Megret, Remi; DeMenthon, Daniel

    2002-01-01

    ...) segmentation by trajectory grouping, and (3) joint spatial and temporal segmentation. The first category is the broadest, as it inherits the legacy techniques of image segmentation and motion segmentation...

  2. Comparison of small-group training with self-directed internet-based training in inhaler techniques.

    Science.gov (United States)

    Toumas, Mariam; Basheti, Iman A; Bosnic-Anticevich, Sinthia Z

    2009-08-28

    To compare the effectiveness of small-group training in correct inhaler technique with self-directed Internet-based training. Pharmacy students were randomly allocated to 1 of 2 groups: small-group training (n = 123) or self-directed Internet-based training (n = 113). Prior to intervention delivery, all participants were given a placebo Turbuhaler and product information leaflet and received inhaler technique training based on their group. Technique was assessed following training and predictors of correct inhaler technique were examined. There was a significant improvement in the number of participants demonstrating correct technique in both groups (small-group training, 12% to 63%; self-directed Internet-based training, 9% to 59%), with no significant difference between the 2 groups in the percent change (n = 234, p > 0.05). Increased student confidence following the intervention was a predictor of correct inhaler technique. Self-directed Internet-based training is as effective as small-group training in improving students' inhaler technique.

  3. Analgesic techniques in minor painful procedures in neonatal units: a survey in northern Italy.

    Science.gov (United States)

    Codipietro, Luigi; Bailo, Elena; Nangeroni, Marco; Ponzone, Alberto; Grazia, Giuseppe

    2011-01-01

    The aim of this survey was to evaluate current practice regarding pain assessment and the pain management strategies adopted for commonly performed minor painful procedures in Northern Italian Neonatal Intensive Care Units (NICUs). A multicenter survey was conducted between 2008 and 2009 in 35 NICUs. The first part of the survey form covered pain assessment tools, the timing of analgesics, and the availability of written guidelines. A second section evaluated the analgesic strategies adopted in commonly performed painful procedures. The listed analgesic procedures were as follows: oral sweet solutions alone, non-nutritive sucking (NNS) alone, a combination of sweet solutions and NNS, breast-feeding where available, and topical anesthetics. Completed questionnaires were returned from 30 neonatal units (85.7% response rate). Ten of the 30 NICUs reported using pain assessment tools for minor invasive procedures. The Neonatal Infant Pain Scale was the most frequently used pain scale (60%). Twenty neonatal units had written guidelines directing pain management practices. The most frequently used procedures were pacifiers alone (69%), followed by sweet-tasting solutions (58%). A 5% glucose solution was the most frequently utilized sweet-tasting solution (76.7%). A minority of NICUs (16.7%) administered 12% sucrose solutions for analgesia, topical anesthetics were applied in 27% of NICUs, and breast-feeding was used in 7% of NICUs. This study found low adherence to national and international guidelines for analgesia in minor procedures, reflected in the underuse of neonatal pain scales (33%), of sucrose solution administration before heel lance (23.3%), of topical anesthetics before venipuncture, and of other analgesic techniques. The presence of written pain control guidelines in these regions of Northern Italy has increased in recent years (from 25% to 66%). © 2010 World Institute of Pain.

  4. A Diagnostic Technique for Formulating Market Strategies in Higher Education Based on Relative Competitive Position.

    Science.gov (United States)

    Dolinsky, Arthur L.; Quazi, Hesan A.

    1994-01-01

    Importance-performance analysis, a marketing research technique using analysis of consumer attitudes toward salient product or service attributes, is found useful for colleges and universities in developing marketing strategies, particularly when competition is considered as an important dimension. Data are drawn from a survey of 252 students at 1…

  5. Risk-based maintenance-Techniques and applications

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2007-01-01

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to make use of the knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more intensively, in order to achieve tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making that reduces the probability of equipment failure and the consequences of failure. In this paper, risk analysis and risk-based maintenance methodologies are identified and classified into suitable classes. The factors affecting the quality of risk analysis are identified and analyzed. The applications, input data and output data are studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions
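    The core decision step can be illustrated with a minimal sketch of risk-based prioritization, in which risk is taken as the product of failure probability and consequence and compared against a tolerable-risk criterion; the numbers and the simple multiplicative risk measure are illustrative assumptions, not the paper's methodology.

      def prioritize(components, tolerable_risk):
          """Rank components by risk = probability of failure x consequence and flag
          those whose risk exceeds the tolerable criterion (inspect/maintain these first).
          components: iterable of (name, prob_of_failure, consequence_cost) tuples."""
          ranked = sorted(((name, p * c) for name, p, c in components),
                          key=lambda item: item[1], reverse=True)
          return [(name, risk, risk > tolerable_risk) for name, risk in ranked]

      # Example: prioritize([("pump seal", 0.02, 5e5), ("relief valve", 0.1, 2e4)], 5e3)
      # -> [('pump seal', 10000.0, True), ('relief valve', 2000.0, False)]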

  6. Feathering effect detection and artifact agglomeration index-based video deinterlacing technique

    Science.gov (United States)

    Martins, André Luis; Rodrigues, Evandro Luis Linhari; de Paiva, Maria Stela Veludo

    2018-03-01

    Several video deinterlacing techniques have been developed, each performing best under certain conditions. Occasionally, even the most modern deinterlacing techniques create frames of worse quality than primitive deinterlacing processes. This paper shows that the final image quality can be improved by combining different types of deinterlacing techniques. The proposed strategy is able to select between two types of deinterlaced frames and, if necessary, make local corrections of the defects. This decision is based on an artifact agglomeration index obtained from a feathering effect detection map. Starting from a deinterlaced frame produced by the "interfield average" method, the defective areas are identified and, if deemed appropriate, replaced by pixels generated through the "edge-based line average" method. Test results show that the proposed technique produces video frames of higher quality than any single deinterlacing technique, by taking the best from intra- and interfield methods.
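    A minimal sketch of the two component interpolators the paper combines is given below (the feathering-detection map and agglomeration index themselves are not reproduced); the simple three-direction ELA neighbourhood used here is an assumption.

      import numpy as np

      def interfield_average(prev_field, next_field):
          """Temporal interpolation: fill missing lines by averaging the two adjacent fields."""
          return 0.5 * (prev_field.astype(float) + next_field.astype(float))

      def edge_based_line_average(above, below):
          """Spatial interpolation of one missing line (ELA): for every pixel, average along
          the direction (left diagonal, vertical, right diagonal) with the smallest difference."""
          w = len(above)
          out = np.empty(w)
          for x in range(w):
              xl, xr = max(x - 1, 0), min(x + 1, w - 1)
              candidates = [
                  (abs(float(above[xl]) - float(below[xr])), 0.5 * (float(above[xl]) + float(below[xr]))),
                  (abs(float(above[x]) - float(below[x])), 0.5 * (float(above[x]) + float(below[x]))),
                  (abs(float(above[xr]) - float(below[xl])), 0.5 * (float(above[xr]) + float(below[xl]))),
              ]
              out[x] = min(candidates)[1]    # value from the direction with the smallest difference
          return out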

  7. Presumption of the distribution of the geological structure based on the geological survey and the topographic data in and around the Horonobe area

    International Nuclear Information System (INIS)

    Sakai, Toshihiro; Matsuoka, Toshiyuki

    2015-06-01

    The Horonobe Underground Research Laboratory (URL) Project, a comprehensive research project investigating the deep underground environment in sedimentary rock, is being pursued by the Japan Atomic Energy Agency (JAEA) at Horonobe-cho in Northern Hokkaido, Japan. One of the main goals of the URL project is to establish techniques for investigation, analysis and assessment of the deep geological environment. JAEA constructed a geologic map and a database of geological mapping of Horonobe-cho in 2005, based on the existing literature and the 1/200,000 geologic maps published by the Geological Survey of Japan, and then updated the geologic map in 2007 based on the results of various investigations conducted around the URL during the surface-based investigation phase of the project. In addition, there are many geological survey data derived from natural resource (petroleum, natural gas, coal, etc.) exploration in and around Horonobe-cho. In this report, we update the geologic map and the database of geological mapping based on these geological survey and topographical analysis data in and around the Horonobe area, and construct a digital geologic map and a digital database of geological mapping as GIS. These data can be expected to improve the precision of modelling and analysis of the geological environment, including its long-term evaluation. The digital data are attached on CD-ROM. (J.P.N.)

  8. A critical review of survey-based research in supply chain integration

    NARCIS (Netherlands)

    van der Vaart, Taco; van Donk, Dirk Pieter

    Supply chain (SC) integration is considered one of the major factors in improving performance. Based upon some concerns regarding the constructs, measurements and items used, this paper analyses survey-based research with respect to the relationship between SC integration and performance. The review

  9. A Survey of Advances in Vision-Based Human Motion Capture and Analysis

    DEFF Research Database (Denmark)

    Moeslund, Thomas B.; Hilton, Adrian; Krüger, Volker

    2006-01-01

    This survey reviews advances in human motion capture and analysis from 2000 to 2006, following a previous survey of papers up to 2000. Human motion capture continues to be an increasingly active research area in computer vision, with over 350 publications over this period. A number of significant...... actions and behavior. This survey reviews recent trends in video-based human capture and analysis, as well as discussing open problems for future research to achieve automatic visual analysis of human movement.

  10. Cluster cosmology with next-generation surveys.

    Science.gov (United States)

    Ascaso, B.

    2017-03-01

    The advent of next-generation surveys will provide a large number of cluster detections that will serve as the basis for constraining cosmological parameters using cluster counts. The two main observational ingredients needed are the cluster selection function and the calibration of the mass-observable relation. In this talk, we present the methodology designed to obtain robust predictions of both ingredients based on realistic cosmological simulations mimicking the following next-generation surveys: J-PAS, LSST and Euclid. We display recent results on the selection functions for these surveys together with others coming from other next-generation surveys such as eROSITA, ACTpol and SPTpol. We notice that the optical and IR surveys will reach the lowest masses between 0.3 … We also present the technique that we are developing to perform a Fisher Matrix analysis to provide cosmological constraints for the considered next-generation surveys and introduce very preliminary results.

  11. A Survey on Cloud Security Issues and Techniques

    OpenAIRE

    Sharma, Shubhanjali; Gupta, Garima; Laxmi, P. R.

    2014-01-01

    Today, cloud computing is an emerging way of computing in computer science. Cloud computing is a set of resources and services offered over the network or Internet. Cloud computing extends various computing techniques such as grid computing and distributed computing. Today cloud computing is used in both the industrial and academic fields. The cloud facilitates its users by providing virtual resources via the Internet. As the field of cloud computing spreads, new techniques are being developed. ...

  12. A technique for reducing diverse habits survey data and its application to seafood consumption near Winfrith

    International Nuclear Information System (INIS)

    Smith, B.D.; Hunt, G.J.

    1989-01-01

    Habits surveys provide basic information to enable doses to appropriate critical groups of members of the public to be assessed. In some cases, the relevant habits of those to be included in the critical group can be quite diverse, and a simplifying method may be needed. A technique for this is described, and exemplified in relation to liquid radioactive waste discharges from AEE Winfrith, an area where the range of seafoods and radionuclide concentrations in them result in a wide variation of doses. Weighted mean consumption rates are derived for the critical group, and an example of their application in setting a revised liquid discharge authorisation is given. (author)

  13. Survey of NDT techniques, services, qualifications and certification of NDT personnel-preliminary results

    International Nuclear Information System (INIS)

    Aleta, C.R.; Kinilitan, V.E.; Lailo, R.M.

    1987-01-01

    This paper presents the results of a survey conducted to determine the profile of the NDT industry, including its problems. A questionnaire was designed in three parts: 1) present practices on qualification and certification, 2) NDT equipment, and 3) services and problems in NDT. Of the 36 firms contacted, only 20 responded. The results indicated the following: a) most firms are engaged in the four main techniques, RT, UT, MT and PT; only 2 indicated capability in ET; b) level III personnel are relatively few in number; c) most firms follow the ASNT recommendation as the basis for their qualification and certification, are in favor of standardization of the qualification and certification process, and are supportive of a national center for the training of NDT personnel; and d) most firms perceived a lack of adequate repair/maintenance skills/facilities, followed by the high cost of equipment and the lack of a national standard for qualification and certification. (ELC)

  14. Determining flexor-tendon repair techniques via soft computing

    Science.gov (United States)

    Johnson, M.; Firoozbakhsh, K.; Moniem, M.; Jamshidi, M.

    2001-01-01

    An SC-based multi-objective decision-making method for determining the optimal flexor-tendon repair technique from experimental and clinical survey data, and with variable circumstances, was presented. Results were compared with those from the Taguchi method. Using the Taguchi method results in the need to perform ad-hoc decisions when the outcomes for individual objectives are contradictory to a particular preference or circumstance, whereas the SC-based multi-objective technique provides a rigorous straightforward computational process in which changing preferences and importance of differing objectives are easily accommodated. Also, adding more objectives is straightforward and easily accomplished. The use of fuzzy-set representations of information categories provides insight into their performance throughout the range of their universe of discourse. The ability of the technique to provide a "best" medical decision given a particular physician, hospital, patient, situation, and other criteria was also demonstrated.

  15. Benthic Photo Survey: Software for Geotagging, Depth-tagging, and Classifying Photos from Survey Data and Producing Shapefiles for Habitat Mapping in GIS

    Directory of Open Access Journals (Sweden)

    Jared Kibele

    2016-03-01

    Full Text Available Photo survey techniques are common for resource management, ecological research, and ground truthing for remote sensing, but current data processing methods are cumbersome and inefficient. The Benthic Photo Survey (BPS) software described here was created to simplify the data processing and management tasks associated with photo surveys of underwater habitats. BPS is free and open source software written in Python with a QT graphical user interface. BPS takes a GPS log and jpeg images acquired by a diver or drop camera and assigns the GPS position to each photo based on time-stamps (i.e. geotagging). Depth and temperature can be assigned in a similar fashion (i.e. depth-tagging) using log files from an inexpensive consumer grade depth / temperature logger that can be attached to the camera. BPS provides the user with a simple interface to assign quantitative habitat and substrate classifications to each photo. Location, depth, temperature, habitat, and substrate data are all stored with the jpeg metadata in Exchangeable image file format (Exif). BPS can then export all of these data in a spatially explicit point shapefile format for use in GIS. BPS greatly reduces the time and skill required to turn photos into usable data, thereby making photo survey methods more efficient and cost effective. BPS can also be used, as is, for other photo sampling techniques in terrestrial and aquatic environments, and the open source code base offers numerous opportunities for expansion and customization.
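    The time-stamp matching at the heart of geotagging can be sketched as a simple interpolation of the GPS track at each photo's capture time; this is not BPS's actual code, and it assumes all timestamps have already been converted to a common epoch (e.g., UNIX seconds) and that the GPS log is time-sorted.

      import numpy as np

      def geotag(photo_times, gps_times, gps_lats, gps_lons):
          """Assign an interpolated latitude/longitude to each photo by matching its
          capture time against a time-sorted GPS log."""
          photo_times = np.asarray(photo_times, dtype=float)
          lats = np.interp(photo_times, gps_times, gps_lats)
          lons = np.interp(photo_times, gps_times, gps_lons)
          return list(zip(lats, lons))

      # Example: geotag([100.5], gps_times=[100, 101], gps_lats=[-36.0, -36.1], gps_lons=[174.0, 174.1])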

  16. An Image Registration Based Technique for Noninvasive Vascular Elastography

    OpenAIRE

    Valizadeh, Sina; Makkiabadi, Bahador; Mirbagheri, Alireza; Soozande, Mehdi; Manwar, Rayyan; Mozaffarzadeh, Moein; Nasiriavanaki, Mohammadreza

    2018-01-01

    Non-invasive vascular elastography is an emerging technique in vascular tissue imaging. During the past decades, several techniques have been suggested to estimate tissue elasticity by measuring the displacement of the carotid vessel wall. Cross correlation-based methods are the most prevalent approaches to measuring the strain exerted on the vessel wall by the blood pressure. In the case of a low pressure, the displacement is too small to be apparent in ultrasound imaging, especially in th...
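    A minimal sketch of the cross correlation-based displacement estimation step is given below, using two equal-length one-dimensional RF lines; the normalization and the search window are generic assumptions rather than the particular estimator evaluated in the paper.

      import numpy as np

      def displacement_samples(pre, post, max_lag=20):
          """Estimate the shift (in samples) between equal-length pre- and post-compression
          RF lines from the peak of their cross-correlation, searched over +/- max_lag."""
          pre = np.asarray(pre, dtype=float)
          post = np.asarray(post, dtype=float)
          pre_n = (pre - pre.mean()) / (pre.std() * len(pre))
          post_n = (post - post.mean()) / post.std()
          corr = np.correlate(post_n, pre_n, mode='full')      # lags -(N-1) .. N-1
          lags = np.arange(-len(pre) + 1, len(pre))
          window = np.abs(lags) <= max_lag
          return lags[window][np.argmax(corr[window])]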

  17. Evaluating the status of African wild dogs Lycaon pictus and cheetahs Acinonyx jubatus through tourist-based photographic surveys in the Kruger National Park [corrected].

    Science.gov (United States)

    Marnewick, Kelly; Ferreira, Sam M; Grange, Sophie; Watermeyer, Jessica; Maputla, Nakedi; Davies-Mostert, Harriet T

    2014-01-01

    The Kruger National Park is a stronghold for African wild dog Lycaon pictus and cheetah Acinonyx jubatus conservation in South Africa. Tourist photographic surveys have been used to evaluate the minimum number of wild dogs and cheetahs alive over the last two decades. Photographic-based capture-recapture techniques for open populations were used on data collected during a survey done in 2008/9. Models were run for the park as a whole and per region (northern, central, southern). A total of 412 (329-495; SE 41.95) cheetahs and 151 (144-157; SE 3.21) wild dogs occur in the Kruger National Park. Cheetah capture probabilities were affected by time (number of entries) and sex, whereas wild dog capture probabilities were affected by the region of the park. When plotting the number of new individuals identified against the number of entries received, the addition of new wild dogs to the survey reached an asymptote at 210 entries, but cheetahs did not reach an asymptote. The cheetah population of Kruger appears to be acceptable, while the wild dog population size and density are of concern. The effectiveness of tourist-based surveys for estimating population sizes through capture-recapture analyses is shown.

  19. Evaluating the status of African wild dogs Lycaon pictus and cheetahs Acinonyx jubatus through tourist-based photographic surveys in the Kruger National Park [corrected].

    Directory of Open Access Journals (Sweden)

    Kelly Marnewick

    Full Text Available The Kruger National Park is a stronghold for African wild dog Lycaon pictus and cheetah Acinonyx jubatus conservation in South Africa. Tourist photographic surveys have been used to evaluate the minimum number of wild dogs and cheetahs alive over the last two decades. Photographic-based capture-recapture techniques for open populations were used on data collected during a survey done in 2008/9. Models were run for the park as a whole and per region (northern, central, southern). A total of 412 (329-495; SE 41.95) cheetahs and 151 (144-157; SE 3.21) wild dogs occur in the Kruger National Park. Cheetah capture probabilities were affected by time (number of entries) and sex, whereas wild dog capture probabilities were affected by the region of the park. When plotting the number of new individuals identified against the number of entries received, the addition of new wild dogs to the survey reached an asymptote at 210 entries, but cheetahs did not reach an asymptote. The cheetah population of Kruger appears to be acceptable, while the wild dog population size and density are of concern. The effectiveness of tourist-based surveys for estimating population sizes through capture-recapture analyses is shown.

  20. Current status of paediatric post-mortem imaging: an ESPR questionnaire-based survey

    Energy Technology Data Exchange (ETDEWEB)

    Arthurs, Owen J. [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Radiology, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom); Rijn, Rick R. van [Academic Medical Centre, Department of Radiology, Amsterdam (Netherlands); Sebire, Neil J. [Great Ormond Street Hospital for Children, Department of Pathology, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom)

    2014-03-15

    The use of post-mortem imaging, including skeletal radiography, CT and MRI, is increasing, providing a minimally invasive alternative to conventional autopsy techniques. The development of clinical guidelines and national standards is being encouraged, particularly for cross-sectional techniques. To outline the current practice of post-mortem imaging amongst members of the European Society of Paediatric Radiology (ESPR). We e-mailed an online questionnaire of current post-mortem service provisions to members of the ESPR in January 2013. The survey included direct questions about what services were offered, the population imaged, current techniques used, imaging protocols, reporting experience and intended future involvement. Seventy-one percent (47/66) of centres from which surveys were returned reported performing some form of post-mortem imaging in children, of which 81 % perform radiographs, 51% CT and 38% MRI. Eighty-seven percent of the imaging is performed within the radiology or imaging departments, usually by radiographers (75%), and 89% is reported by radiologists, of which 64% is reported by paediatric radiologists. Overall, 72% of positive respondents have a standardised protocol for radiographs, but only 32% have such a protocol for CT and 27% for MRI. Sixty-one percent of respondents wrote that this is an important area that needs to be developed. Overall, the majority of centres provide some post-mortem imaging service, most of which is performed within an imaging department and reported by a paediatric radiologist. However, the populations imaged as well as the details of the services offered are highly variable among institutions and lack standardisation. We have identified people who would be interested in taking this work forwards. (orig.)

  1. An Overview on Base Real-Time Hard Shadow Techniques in Virtual Environments

    Directory of Open Access Journals (Sweden)

    Mohd Shahrizal Sunar

    2012-03-01

    Full Text Available Shadows are essential for creating realistic scenes in virtual environments, and the variety of shadow techniques motivated us to prepare an overview of the basic shadow techniques. Non-real-time and real-time techniques are the two main subdivisions of shadow generation. Among the non-real-time techniques, ray tracing, ray casting and radiosity are well known and are described in depth. Radiosity is used to create very realistic shadows in non-real-time scenes. Although the traditional radiosity algorithm is difficult to implement, we propose a simple one; the proposed pseudo code is easier to understand and implement. Ray tracing can also be used to prevent collisions of moving objects. Projection shadows, shadow volumes and shadow mapping are used to create real-time shadows in virtual environments. We use projection shadows for static objects that cast shadows on flat surfaces. Shadow volumes are used to create accurate shadows with sharp outlines. Shadow mapping, which is the basis of most recent techniques, is reconstructed; the reconstructed algorithm suggests new ideas for further algorithms based on shadow mapping.

  2. Comparative assessment of PIV-based pressure evaluation techniques applied to a transonic base flow

    NARCIS (Netherlands)

    Blinde, P; Michaelis, D; van Oudheusden, B.W.; Weiss, P.E.; de Kat, R.; Laskari, A.; Jeon, Y.J.; David, L; Schanz, D; Huhn, F.; Gesemann, S; Novara, M.; McPhaden, C.; Neeteson, N.; Rival, D.; Schneiders, J.F.G.; Schrijer, F.F.J.

    2016-01-01

    A test case for PIV-based pressure evaluation techniques has been developed by constructing a simulated experiment from a ZDES simulation for an axisymmetric base flow at Mach 0.7. The test case comprises sequences of four subsequent particle images (representing multi-pulse data) as well as

  3. DCT-based cyber defense techniques

    Science.gov (United States)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect the multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content, in order to attack the end user. Most of the attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel, real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
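    The article does not reproduce the smart-threshold algorithm here, so the block below is only a generic illustration of DCT-domain coefficient thresholding on 8x8 blocks (the block size and threshold are assumptions); it shows where such a defense would operate, not the authors' exact rule.

      import numpy as np
      from scipy.fft import dctn, idctn

      def threshold_block(block, thresh):
          """Zero out low-magnitude AC coefficients of one block in the DCT domain."""
          c = dctn(block.astype(float), norm='ortho')
          mask = np.abs(c) >= thresh
          mask[0, 0] = True                       # always keep the DC coefficient
          return idctn(c * mask, norm='ortho')

      def threshold_image(img, thresh=8.0, bs=8):
          """Apply DCT-domain thresholding block-wise over a greyscale image."""
          out = img.astype(float).copy()
          h, w = img.shape
          for y in range(0, h - h % bs, bs):
              for x in range(0, w - w % bs, bs):
                  out[y:y + bs, x:x + bs] = threshold_block(img[y:y + bs, x:x + bs], thresh)
          return np.clip(out, 0, 255)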

  4. Key Techniques for the Development of Web-Based PDM System

    Institute of Scientific and Technical Information of China (English)

    WANG Li-juan; ZHANG Xu; NING Ru-xin

    2006-01-01

    Some key techniques for the development of a web-based product data management (PDM) system are introduced. The four-tiered B/S architecture of a PDM system, BITPDM, is introduced first, followed by its design and implementation, including the virtual data vault, a flexible coding system, document management, product structure and configuration management, workflow/process management and product maturity management. BITPDM can facilitate activities from the new product introduction phase through manufacturing, and manage product data and their dynamic change history. Based on Microsoft .NET, XML, web service and SOAP techniques, BITPDM realizes the integration and efficient management of product information.

  5. A knowledge - based system to assist in the design of soil survey schemes

    NARCIS (Netherlands)

    Domburg, P.

    1994-01-01

    Soil survey information with quantified accuracy is relevant to decisions on land use and environmental problems. To obtain such information statistical strategies should be used for collecting and analysing data. A survey project based on a statistical sampling strategy requires a soil

  6. Engineering surveying

    CERN Document Server

    Schofield, W

    2001-01-01

    The aim of Engineering Surveying has always been to impart and develop a clear understanding of the basic topics of the subject. The author has fully revised the book to make it the most up-to-date and relevant textbook available on the subject.The book also contains the latest information on trigonometric levelling, total stations and one-person measuring systems. A new chapter on satellites ensures a firm grasp of this vitally important topic.The text covers engineering surveying modules for civil engineering students on degree courses and forms a reference for the engineering surveying module in land surveying courses. It will also prove to be a valuable reference for practitioners.* Simple clear introduction to surveying for engineers* Explains key techniques and methods* Details reading systems and satellite position fixing

  7. Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation

    Science.gov (United States)

    Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, EfaMaydhona; Musa, Ardiansyah

    2018-03-01

    To support the growth of wireless geolocation as a key technology for the future, this paper derives a theoretical bound, i.e., the Cramer-Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The theoretical bound is crucially important for evaluating whether the energy-efficient RSS-based factor graph technique is effective, and it opens the opportunity for further innovation of the technique. The CRLB is derived using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation results show that the derived CRLB is the tightest bound, exhibiting the lowest root mean squared error (RMSE) curve compared with the RMSE curve of the RSS-based factor graph geolocation technique. Hence, the derived CRLB serves as the lower bound for the energy-efficient RSS-based factor graph wireless geolocation technique.
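    The paper's own FIM is built from the factor-graph formulation's Jacobian, which is not reproduced in this abstract; as a hedged illustration, the sketch below computes the standard CRLB for 2-D RSS positioning under the commonly assumed log-distance path-loss model with log-normal shadowing (the path-loss exponent and shadowing deviation are assumptions).

      import numpy as np

      def rss_crlb_trace(x, anchors, alpha=3.0, sigma_db=4.0):
          """Trace of the CRLB (a lower bound on the MSE of any unbiased position
          estimate) for RSS-based 2-D localization with anchors at known positions."""
          b = 10.0 * alpha / np.log(10.0)
          fim = np.zeros((2, 2))
          for a in np.asarray(anchors, dtype=float):
              diff = np.asarray(x, dtype=float) - a
              d2 = float(np.dot(diff, diff))
              fim += (b ** 2 / sigma_db ** 2) * np.outer(diff, diff) / d2 ** 2
          return float(np.trace(np.linalg.inv(fim)))

      # Example: rss_crlb_trace([5.0, 5.0], anchors=[[0, 0], [10, 0], [0, 10], [10, 10]])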

  8. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    This study shows the potency of two GIS-based data-driven bivariate techniques, namely ... In view of these weaknesses, there is a strong requirement for reassessment of ... West Bengal (India) using remote sensing, geographical information system and multi-...

  9. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    International Nuclear Information System (INIS)

    Han, G.; Lin, B.; Xu, Z.

    2017-01-01

    Electrocardiogram (ECG) signals are nonlinear, non-stationary and weak, and reflect whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal is a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart diseases. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. EMD is a promising, though not yet perfect, method for processing nonlinear and non-stationary signals such as the ECG. Combining EMD with other algorithms is a good way to improve the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in ECG signal denoising based on the EMD technique are clarified.

  10. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    Science.gov (United States)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    Electrocardiogram (ECG) signals are nonlinear, non-stationary and weak, and reflect whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal is a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart diseases. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. EMD is a promising, though not yet perfect, method for processing nonlinear and non-stationary signals such as the ECG. Combining EMD with other algorithms is a good way to improve the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in ECG signal denoising based on the EMD technique are clarified.
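
    As a concrete illustration of the partial-reconstruction idea behind EMD-based denoising, the sketch below decomposes a noisy signal into intrinsic mode functions and rebuilds it without the noisiest and slowest modes. It assumes the third-party PyEMD package (installed as EMD-signal); the choice of which IMFs to discard is illustrative and not the specific rule of any surveyed method.

    ```python
    # Minimal sketch of EMD-based partial reconstruction for ECG-like denoising.
    # Assumes the PyEMD package (pip install EMD-signal) exposing an EMD class
    # with an `emd` method; discarding the first and last IMFs is illustrative.
    import numpy as np
    from PyEMD import EMD

    def denoise(signal: np.ndarray) -> np.ndarray:
        imfs = EMD().emd(signal)          # rows are intrinsic mode functions
        if imfs.shape[0] < 3:
            return signal                 # too few modes to filter meaningfully
        # Drop the first IMF (high-frequency noise) and the last one (slow trend,
        # i.e. baseline wander), then sum the remaining modes.
        return imfs[1:-1].sum(axis=0)

    if __name__ == "__main__":
        t = np.linspace(0.0, 1.0, 1000)
        clean = np.sin(2 * np.pi * 5 * t)                          # stand-in for an ECG trace
        noisy = clean + 0.2 * np.random.randn(t.size) + 0.5 * t    # noise plus baseline drift
        print(denoise(noisy).shape)
    ```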

  11. An improved technique for non-destructive measurement of the stem ...

    African Journals Online (AJOL)

    It was concluded that the standard volume model based on the non-destructive measurement technique meets the requirements for precision in forest surveys. The precision of the standard volume model for L. gmelinii (a coniferous tree) was superior to that of the model for P. tomentosa (a broad-leaved tree). The electronic ...

  12. A framework for laboratory pre-work based on the concepts, tools and techniques questioning method

    International Nuclear Information System (INIS)

    Huntula, J; Sharma, M D; Johnston, I; Chitaree, R

    2011-01-01

    Learning in the laboratory is different from learning in other contexts because students have to engage with various aspects of the practice of science. They have to use many skills and areas of knowledge in parallel, not only to understand the concepts of physics but also to use the tools and analyse the data. The question arises of how best to guide students' learning in the laboratory. This study is about creating and using questions within a specifically designed framework to aid learning in the laboratory. The concepts, tools and techniques questioning (CTTQ) method was initially designed and used at Mahidol University, Thailand, and was subsequently extended to laboratory pre-work at the University of Sydney. The CTTQ method was implemented in Sydney with 190 first-year students. Three pre-work exercises on a series of electrical experiments were created based on the CTTQ method. The pre-work was completed individually and submitted before the experiment started. Analysis of the pre-work, surveys and interviews was used to evaluate the pre-work questions in this study. The results indicated that the CTTQ method was successful and that the flow of the experiments was better than in the previous year. At the same time, students had difficulty with the last experiment in the sequence and with techniques.

  13. Assessing Caribbean Shallow and Mesophotic Reef Fish Communities Using Baited-Remote Underwater Video (BRUV) and Diver-Operated Video (DOV) Survey Techniques

    Science.gov (United States)

    Macaya-Solis, Consuelo; Exton, Dan A.; Gress, Erika; Wright, Georgina; Rogers, Alex D.

    2016-01-01

    Fish surveys form the backbone of reef monitoring and management initiatives throughout the tropics, and understanding patterns in biases between techniques is crucial if outputs are to address key objectives optimally. Often biases are not consistent across natural environmental gradients such as depth, leading to uncertainty in interpretation of results. Recently there has been much interest in mesophotic reefs (reefs from 30–150 m depth) as refuge habitats from fishing pressure, leading to many comparisons of reef fish communities over depth gradients. Here we compare fish communities using stereo-video footage recorded via baited remote underwater video (BRUV) and diver-operated video (DOV) systems on shallow and mesophotic reefs in the Mesoamerican Barrier Reef, Caribbean. We show inconsistent responses across families, species and trophic groups between methods across the depth gradient. Fish species and family richness were higher using BRUV at both depth ranges, suggesting that BRUV is more appropriate for recording all components of the fish community. Fish length distributions were not different between methods on shallow reefs, yet BRUV recorded more small fish on mesophotic reefs. However, DOV consistently recorded greater relative fish community biomass of herbivores, suggesting that studies focusing on herbivores should consider using DOV. Our results highlight the importance of considering what component of reef fish community researchers and managers are most interested in surveying when deciding which survey technique to use across natural gradients such as depth. PMID:27959907

  14. Searching for millisecond pulsars: surveys, techniques and prospects

    International Nuclear Information System (INIS)

    Stovall, K; Lorimer, D R; Lynch, R S

    2013-01-01

    Searches for millisecond pulsars (which we here loosely define as those with periods < 20 ms) in the galactic field have undergone a renaissance in the past five years. New or recently refurbished radio telescopes utilizing cooled receivers and state-of-the-art digital data acquisition systems are carrying out surveys of the entire sky at a variety of radio frequencies. Targeted searches for millisecond pulsars in point sources identified by the Fermi Gamma-ray Space Telescope have proved phenomenally successful, with over 50 discoveries in the past five years. The current sample of millisecond pulsars now numbers almost 200 and, for the first time in 25 years, now outnumbers their counterparts in galactic globular clusters. While many of these searches are motivated to find pulsars which form part of pulsar timing arrays, a wide variety of interesting systems are now being found. Following a brief overview of the millisecond pulsar phenomenon, we describe these searches and present some of the highlights of the new discoveries in the past decade. We conclude with predictions and prospects for ongoing and future surveys. (paper)

  15. Prediction of drug synergy in cancer using ensemble-based machine learning techniques

    Science.gov (United States)

    Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder

    2018-04-01

    Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents. It can be developed as a pre-processing tool for therapeutic success. Examination of different drug-drug interactions can be done via the drug synergy score. Efficient regression-based machine learning approaches are needed to minimize the prediction errors. Numerous machine learning techniques such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to meet this requirement. However, these techniques individually do not provide significant accuracy in drug synergy score. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques have been implemented on the drug synergy data. Based on the accuracy of each model, the four techniques with the highest accuracy are selected to develop the ensemble-based machine learning model. These models are Random Forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning (GFS.GCCL), Adaptive-Network-Based Fuzzy Inference System (ANFIS) and Dynamic Evolving Neural-Fuzzy Inference System (DENFIS). Ensembling is achieved by evaluating the biased weighted aggregation (i.e. adding more weight to models with higher prediction scores) of the data predicted by the selected models. The proposed and existing machine learning techniques have been evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms the others in terms of accuracy, root mean square error and coefficient of correlation.
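
    The biased weighted aggregation step described above can be illustrated with a small sketch: several regressors are trained, each is scored on validation data, and their predictions are combined with weights proportional to those scores. The models and data below are placeholders (scikit-learn regressors on synthetic scores), not the paper's GFS.GCCL, ANFIS or DENFIS members.

    ```python
    # Minimal sketch of accuracy-weighted ensembling of regressors on synthetic data.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)    # synthetic "synergy scores"

    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)
    models = [RandomForestRegressor(random_state=0), Ridge()]
    scores = []
    for m in models:
        m.fit(X_tr, y_tr)
        scores.append(max(m.score(X_val, y_val), 0.0))          # R^2 plays the "accuracy" role

    weights = np.array(scores) / sum(scores)                    # more weight to better models
    ensemble_pred = sum(w * m.predict(X_val) for w, m in zip(weights, models))
    print(ensemble_pred[:3])
    ```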

  16. Site suitability evaluation of an old operating landfill using AHP and GIS techniques and integrated hydrogeological and geophysical surveys.

    Science.gov (United States)

    Saatsaz, Masoud; Monsef, Iman; Rahmani, Mostafa; Ghods, Abdolreza

    2018-02-16

    Because common landfill-selection methods become outdated, it is imperative to reevaluate the suitability of sites in use. To assess the suitability of the existing waste landfill in Zanjan, Iran, we have used a combination of the analytical hierarchy process (AHP) and GIS techniques, along with fieldwork surveys. Four major criteria and 12 subcriteria were considered, and the AHP was applied to assign the relative importance weights of the criteria and subcriteria. Finally, a landfill suitability map was generated and ranked based on the final suitability scores. The results show that the unsuitable areas are around Zanjan, in the middle parts of the plain. By contrast, the most suitable areas are uncultivated areas, located mostly in the west, north, and south. The results also indicate that the present landfill is a highly suitable site. After the desk studies, geoelectrical surveys and infiltration measurements were conducted to make the final decision. Double-ring permeability tests confirm that the landfill is an acceptable site. The electrical sounding shows that the leachate plume has a width of about 450 m, spreads to a depth of about 55 m, and migrates towards the northeast. Considering the groundwater depth, dry climate, and the low infiltration rate of the landfill soils, it can be concluded that leachate plumes will not contaminate groundwater within this decade. The proposed method can be implemented to reevaluate the suitability of any old operating facility containing liquid hazardous materials, such as oil reservoirs, petrol filling stations, heavy industrial tanks, and landfills.
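
    The AHP weighting step mentioned above can be sketched briefly: a pairwise comparison matrix of the criteria is reduced to priority weights via its principal eigenvector, and a consistency ratio is checked. The 3x3 matrix below is hypothetical, not the study's 4-criteria/12-subcriteria judgements.

    ```python
    # Minimal AHP sketch: priority weights from the principal eigenvector of a
    # pairwise comparison matrix, plus Saaty's consistency ratio (CR < 0.1 is acceptable).
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 3.0],
                  [1/5, 1/3, 1.0]])          # hypothetical pairwise judgements

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)               # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                              # priority weights, summing to 1

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # random index from Saaty's table
    print("weights:", w.round(3), "CR:", round(ci / ri, 3))
    ```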

  17. Tilting-Twisting-Rolling: a pen-based technique for compass geometric construction

    Institute of Scientific and Technical Information of China (English)

    Fei LYU; Feng TIAN; Guozhong DAI; Hongan WANG

    2017-01-01

    This paper presents a new pen-based technique, Tilting-Twisting-Rolling, to support compass geometric construction. By leveraging the 3D orientation information and 3D rotation information of a pen, this technique allows smooth pen action to complete multi-step geometric construction without switching task states. Results from a user study show this Tilting-Twisting-Rolling technique can improve user performance and user experience in compass geometric construction.

  18. EV Charging Analysis Based on the National Travel Surveys of the Nordic Area

    DEFF Research Database (Denmark)

    Liu, Zhaoxi; Wu, Qiuwei

    2014-01-01

    This paper presents the charging demand profiles of electric vehicles (EVs) based on the National Travel Surveys of the Nordic area. The EV charging analysis is carried out considering different types of charging patterns, which are dumb charging, timed charging and spot price based charging. ... The driving behavior of the vehicles is studied through the National Travel Surveys of Denmark, Finland, Norway and Sweden. The features of the charging demand are discussed based on the results of the analysis. The study in this paper provides an estimation of the possible level and patterns of the EV ...

  19. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    Directory of Open Access Journals (Sweden)

    Mohamed M. Ibrahim

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In non-blind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.

  20. Video multiple watermarking technique based on image interlacing using DWT.

    Science.gov (United States)

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In non-blind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.
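
    Of the building blocks named above, the Arnold transform used to scramble the watermark before embedding is simple enough to sketch; repeated application is periodic, which is what makes decryption possible. The sketch below is pure NumPy, the 8x8 watermark size is illustrative, and the DWT embedding step itself is not shown.

    ```python
    # Minimal sketch of the Arnold (cat map) transform on a square watermark image.
    import numpy as np

    def arnold(img: np.ndarray, iterations: int = 1) -> np.ndarray:
        n = img.shape[0]
        assert img.shape[0] == img.shape[1], "Arnold transform needs a square image"
        out = img
        for _ in range(iterations):
            scrambled = np.empty_like(out)
            for x in range(n):
                for y in range(n):
                    # (x, y) -> ((x + y) mod n, (x + 2y) mod n)
                    scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
            out = scrambled
        return out

    if __name__ == "__main__":
        w = np.arange(64).reshape(8, 8)      # toy 8x8 watermark
        s = arnold(w, iterations=3)          # "encrypted" watermark
        # The map is periodic; for an 8x8 image the period is 6, so 3 more
        # iterations recover the original watermark (the "decryption").
        assert np.array_equal(arnold(s, iterations=3), w)
        print("scrambled and recovered")
    ```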

  1. Analysis of Employee's Survey for Preventing Human-Errors

    International Nuclear Information System (INIS)

    Sung, Chanho; Kim, Younggab; Joung, Sanghoun

    2013-01-01

    Human errors in nuclear power plants can cause large and small events or incidents. These events or incidents are among the main contributors to reactor trips and might threaten the safety of nuclear plants. To prevent human errors, KHNP introduced human-error prevention techniques and has applied them to main areas such as plant operation, operation support, and maintenance and engineering. This paper proposes methods to prevent and reduce human errors in nuclear power plants by analyzing survey results covering the utilization of the human-error prevention techniques and the employees' awareness of preventing human errors. With regard to human-error prevention, the survey analysis presented the status of the human-error prevention techniques and the employees' awareness of preventing human errors. Employees' understanding and utilization of the techniques were generally high, and the training level of employees and the training effect on actual work were in good condition. Also, employees answered that the root causes of human error were due to the working environment, including tight processes, manpower shortage, and excessive workload, rather than personal negligence or lack of personal knowledge. Consideration of the working environment is certainly needed. At the present time, based on this survey analysis, the best methods of preventing human error are personal equipment, substantial training/education, private mental health checks before starting work, prohibition of performing multiple tasks at once, compliance with procedures, and enhancement of job site review. However, the most important and basic things for preventing human error are the interest of workers and the organizational atmosphere, such as communication between managers and workers, and communication between employees and their bosses.

  2. Comparative Study of Retinal Vessel Segmentation Based on Global Thresholding Techniques

    Directory of Open Access Journals (Sweden)

    Temitope Mapayi

    2015-01-01

    Due to noise from uneven contrast and illumination during the acquisition of retinal fundus images, the use of efficient preprocessing techniques is highly desirable to produce good retinal vessel segmentation results. This paper develops and compares the performance of different vessel segmentation techniques based on global thresholding, using phase congruency and contrast limited adaptive histogram equalization (CLAHE) for the preprocessing of the retinal images. The results obtained show that the combination of preprocessing technique, global thresholding, and postprocessing techniques must be carefully chosen to achieve good segmentation performance.
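
    A minimal version of the preprocessing-plus-global-thresholding pipeline can be sketched with OpenCV, assuming CLAHE on the green channel and Otsu's method standing in for the global threshold; the phase-congruency variant compared in the paper is not shown, and the input image below is synthetic.

    ```python
    # Minimal sketch: CLAHE preprocessing followed by a global (Otsu) threshold.
    import cv2
    import numpy as np

    # Synthetic stand-in for the green channel of a fundus image
    # (replace with the green channel of a real retinal image).
    rng = np.random.default_rng(0)
    green = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)

    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(green)

    # Vessels are darker than the background, so threshold the inverted image.
    _, vessel_mask = cv2.threshold(255 - enhanced, 0, 255,
                                   cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    print(vessel_mask.shape, vessel_mask.dtype)
    ```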

  3. Low Power LDPC Code Decoder Architecture Based on Intermediate Message Compression Technique

    Science.gov (United States)

    Shimizu, Kazunori; Togawa, Nozomu; Ikenaga, Takeshi; Goto, Satoshi

    Reducing the power dissipation of LDPC code decoders is a major challenge in applying them to practical digital communication systems. In this paper, we propose a low power LDPC code decoder architecture based on an intermediate message-compression technique with the following features: (i) an intermediate message compression technique enables the decoder to reduce the required memory capacity and write power dissipation; (ii) a clock-gated shift-register-based intermediate message memory architecture enables the decoder to decompress the compressed messages in a single clock cycle while reducing the read power dissipation. The combination of the above two techniques enables the decoder to reduce the power dissipation while keeping the decoding throughput. The simulation results show that the proposed architecture improves the power efficiency by up to 52% and 18% compared to that of decoders based on the overlapped schedule and the rapid convergence schedule without the proposed techniques, respectively.

  4. Lessons From Recruitment to an Internet-Based Survey for Degenerative Cervical Myelopathy: Comparison of Free and Fee-Based Methods

    Science.gov (United States)

    2018-01-01

    Background: Degenerative Cervical Myelopathy (DCM) is a syndrome of subacute cervical spinal cord compression due to spinal degeneration. Although DCM is thought to be common, many fundamental questions such as the natural history and epidemiology of DCM remain unknown. In order to answer these, access to a large cohort of patients with DCM is required. With its unrivalled and efficient reach, the Internet has become an attractive tool for medical research and may overcome these limitations in DCM. The most effective recruitment strategy, however, is unknown. Objective: To compare the efficacy of fee-based advertisement with alternative free recruitment strategies to a DCM Internet health survey. Methods: An Internet health survey (SurveyMonkey) accessed by a new DCM Internet platform (myelopathy.org) was created. Using multiple survey collectors and the website’s Google Analytics, the efficacy of fee-based recruitment strategies (Google AdWords) and free alternatives (including Facebook, Twitter, and myelopathy.org) were compared. Results: Overall, 760 surveys (513 [68%] fully completed) were accessed, 305 (40%) from fee-based strategies and 455 (60%) from free alternatives. Accounting for researcher time, fee-based strategies were more expensive ($7.8 per response compared to $3.8 per response for free alternatives) and identified a less motivated audience (Click-Through-Rate of 5% compared to 57% using free alternatives) but were more time efficient for the researcher (2 minutes per response compared to 16 minutes per response for free methods). Facebook was the most effective free strategy, providing 239 (31%) responses, where a single message to 4 existing communities yielded 133 (18%) responses within 7 days. Conclusions: The Internet can efficiently reach large numbers of patients. Free and fee-based recruitment strategies both have merits. Facebook communities are a rich resource for Internet researchers. PMID:29402760

  5. Lessons From Recruitment to an Internet-Based Survey for Degenerative Cervical Myelopathy: Comparison of Free and Fee-Based Methods.

    Science.gov (United States)

    Davies, Benjamin; Kotter, Mark

    2018-02-05

    Degenerative Cervical Myelopathy (DCM) is a syndrome of subacute cervical spinal cord compression due to spinal degeneration. Although DCM is thought to be common, many fundamental questions such as the natural history and epidemiology of DCM remain unknown. In order to answer these, access to a large cohort of patients with DCM is required. With its unrivalled and efficient reach, the Internet has become an attractive tool for medical research and may overcome these limitations in DCM. The most effective recruitment strategy, however, is unknown. To compare the efficacy of fee-based advertisement with alternative free recruitment strategies to a DCM Internet health survey. An Internet health survey (SurveyMonkey) accessed by a new DCM Internet platform (myelopathy.org) was created. Using multiple survey collectors and the website's Google Analytics, the efficacy of fee-based recruitment strategies (Google AdWords) and free alternatives (including Facebook, Twitter, and myelopathy.org) were compared. Overall, 760 surveys (513 [68%] fully completed) were accessed, 305 (40%) from fee-based strategies and 455 (60%) from free alternatives. Accounting for researcher time, fee-based strategies were more expensive ($7.8 per response compared to $3.8 per response for free alternatives) and identified a less motivated audience (Click-Through-Rate of 5% compared to 57% using free alternatives) but were more time efficient for the researcher (2 minutes per response compared to 16 minutes per response for free methods). Facebook was the most effective free strategy, providing 239 (31%) responses, where a single message to 4 existing communities yielded 133 (18%) responses within 7 days. The Internet can efficiently reach large numbers of patients. Free and fee-based recruitment strategies both have merits. Facebook communities are a rich resource for Internet researchers.

  6. Teachers of the Alexander Technique in the UK and the people who take their lessons: A national cross-sectional survey.

    Science.gov (United States)

    Eldred, J; Hopton, A; Donnison, E; Woodman, J; MacPherson, H

    2015-06-01

    Given the rising profile of the Alexander Technique in the UK, there is a need for a comprehensive description of its teachers and of those who currently take lessons. In a national survey of Alexander teachers, we set out to address this information gap. A cross-sectional survey of 871 UK members of three main Alexander Technique teachers' professional associations was conducted. A questionnaire requested information about their professional background, teaching practice and methods, and about the people who attend lessons and their reasons for seeking help. With an overall response rate of 61%, 534 teachers responded; 74% were female with median age of 58 years, 60% had a higher education qualification, and 95% were self-employed, many with additional non-Alexander paid employment. The majority (87%) offered lessons on their own premises or in a privately rented room, and 19% provided home visits; both individual and group lessons were provided. People who took lessons were predominantly female (66%) with a median age of 48 years, and 91% paid for their lessons privately. Nearly two-thirds (62%) began lessons for reasons related to musculoskeletal conditions, including back symptoms, posture, neck pain, and shoulder pain. Other reasons were general (18%, including well-being), performance-related (10%, including voice-, music-, and sport-related), psychological (5%) and neurological (3%). We estimate that Alexander teachers in the UK provide approximately 400,000 lessons per year. This study provides an overview of Alexander Technique teaching in the UK today and data that may be useful when planning future research. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Feasibility of CBCT-based dose calculation: Comparative analysis of HU adjustment techniques

    International Nuclear Information System (INIS)

    Fotina, Irina; Hopfgartner, Johannes; Stock, Markus; Steininger, Thomas; Lütgendorf-Caucig, Carola; Georg, Dietmar

    2012-01-01

    Background and purpose: The aim of this work was to compare the accuracy of different HU adjustments for CBCT-based dose calculation. Methods and materials: Dose calculation was performed on CBCT images of 30 patients. In the first two approaches phantom-based (Pha-CC) and population-based (Pop-CC) conversion curves were used. The third method (WAB) represents override of the structures with standard densities for water, air and bone. In the ROI mapping approach all structures were overridden with average HUs from planning CT. All techniques were benchmarked to the Pop-CC and CT-based plans by DVH comparison and γ-index analysis. Results: For prostate plans, WAB and ROI mapping compared to Pop-CC showed differences in PTV D_median below 2%. The WAB and Pha-CC methods underestimated the bladder dose in IMRT plans. In lung cases PTV coverage was underestimated by the Pha-CC method by 2.3% and slightly overestimated by the WAB and ROI techniques. The use of the Pha-CC method for head–neck IMRT plans resulted in differences in PTV coverage up to 5%. Dose calculation with WAB and ROI techniques showed better agreement with pCT than conversion curve-based approaches. Conclusions: Density override techniques provide an accurate alternative to the conversion curve-based methods for dose calculation on CBCT images.

  8. Profile of Pre-Service Science Teachers Based on STEM Career Interest Survey

    Science.gov (United States)

    Winarno, N.; Widodo, A.; Rusdiana, D.; Rochintaniawati, D.; Afifah, R. M. A.

    2017-09-01

    This study aims to investigate the profile of pre-service science teachers based on the STEM (Science, Technology, Engineering, and Mathematics) Career Interest Survey. The study uses a descriptive survey method as the research design. The sample consisted of 66 pre-service science teachers at a university located in Bandung, Indonesia. The results show that the average career interest score is 4.08 in the field of technology, 3.80 in science, 3.39 in mathematics and 3.30 in engineering. Pre-service science teachers are found to have interests in the STEM career fields. This research is necessary as there are many instances of people choosing majors or studies that are not in accordance with their interests and talents. The recommendation of this study is to develop learning for pre-service science teachers by using the STEM approach.

  9. Computational Intelligence based techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Karimi, M.; Bakar, A.H.A.; Mohamad, Hasmaini

    2014-01-01

    Highlights: • Unintentional and intentional islanding, their causes, and solutions are presented. • Remote, passive, active and hybrid islanding detection techniques are discussed. • The limitations of these techniques in accurately detecting islanding are discussed. • The ability of computational intelligence techniques to detect islanding is discussed. • A review of ANN, fuzzy logic control, ANFIS and decision tree techniques is provided. - Abstract: Accurate and fast islanding detection of distributed generation is highly important for its successful operation in distribution networks. Up to now, various islanding detection techniques based on communication, passive, active and hybrid methods have been proposed. However, each technique suffers from certain demerits that cause inaccuracies in islanding detection. Computational intelligence based techniques, due to their robustness and flexibility in dealing with complex nonlinear systems, are an option that might solve this problem. This paper aims to provide a comprehensive review of computational intelligence based techniques applied for islanding detection of distributed generation. Moreover, the paper compares the accuracy of computational intelligence based techniques with that of existing techniques to provide useful information for industry and utility researchers to determine the best method for their respective systems.

  10. Engaging Community Leaders in the Development of a Cardiovascular Health Behavior Survey Using Focus Group–Based Cognitive Interviewing

    Directory of Open Access Journals (Sweden)

    Gwenyth R Wallen

    2017-04-01

    Establishing the validity of health behavior surveys used in community-based participatory research (CBPR) in diverse populations is often overlooked. A novel, group-based cognitive interviewing method was used to obtain qualitative data for tailoring a survey instrument designed to identify barriers to improved cardiovascular health in at-risk populations in Washington, DC. A focus group–based cognitive interview was conducted to assess item comprehension, recall, and interpretation and to establish the initial content validity of the survey. Thematic analysis of verbatim transcripts yielded 5 main themes for which participants (n = 8) suggested survey modifications, including survey item improvements, suggestions for additional items, community-specific issues, changes in the skip logic of the survey items, and the identification of typographical errors. Population-specific modifications were made, including the development of more culturally appropriate questions relevant to the community. Group-based cognitive interviewing provided an efficient and effective method for piloting a cardiovascular health survey instrument using CBPR.

  11. Beam-based alignment technique for the SLC [Stanford Linear Collider] linac

    International Nuclear Information System (INIS)

    Adolphsen, C.E.; Lavine, T.L.; Atwood, W.B.

    1989-03-01

    Misalignment of quadrupole magnets and beam position monitors (BPMs) in the linac of the SLAC Linear Collider (SLC) cause the electron and positron beams to be steered off-center in the disk-loaded waveguide accelerator structures. Off-center beams produce wakefields which limit the SLC performance at high beam intensities by causing emittance growth. Here, we present a general method for simultaneously determining quadrupole magnet and BPM offsets using beam trajectory measurements. Results from the application of the method to the SLC linac are described. The alignment precision achieved is approximately 100 μm, which is significantly better than that obtained using optical surveying techniques. 2 refs., 4 figs

  12. Artificial Intelligence techniques for mission planning for mobile robots

    International Nuclear Information System (INIS)

    Martinez, J.M.; Nomine, J.P.

    1990-01-01

    This work focuses on spatial modeling techniques and on control software architectures, in order to deal efficiently with the navigation and perception problems encountered in mobile autonomous robotics. After a brief survey of the current approaches to these techniques, we describe ongoing simulation work for a specific mission in robotics. The studies in progress for spatial reasoning are based on new approaches combining artificial intelligence and geometrical techniques. These methods deal with the problem of environment modeling using three types of models: geometrical, topological and semantic models at different levels. The decision-making processes of control are presented as the result of cooperation between a group of decentralized agents that communicate by sending messages. (author)

  13. RF Sub-sampling Receiver Architecture based on Milieu Adapting Techniques

    DEFF Research Database (Denmark)

    Behjou, Nastaran; Larsen, Torben; Jensen, Ole Kiel

    2012-01-01

    A novel sub-sampling based architecture is proposed which has the ability of reducing the problem of image distortion and improving the signal to noise ratio significantly. The technique is based on sensing the environment and adapting the sampling rate of the receiver to the best possible...

  14. Combining Cluster Analysis and Small Unmanned Aerial Systems (sUAS) for Accurate and Low-cost Bathymetric Surveying

    Science.gov (United States)

    Maples, B. L.; Alvarez, L. V.; Moreno, H. A.; Chilson, P. B.; Segales, A.

    2017-12-01

    Given that classical in-situ direct surveying for geomorphological subsurface information in rivers is time-consuming, labor-intensive, costly, and often involves high-risk activities, it is clear that non-intrusive technologies, such as UAS-based and LIDAR-based remote sensing, have promising potential and benefits in terms of efficient and accurate measurement of channel topography over large areas within a short time; therefore, a tremendous amount of attention has been paid to the development of these techniques. Over the past two decades, efforts have been undertaken to develop a specialized technique that can penetrate the water body and detect the channel bed to derive river and coastal bathymetry. In this research, we develop a low-cost, effective technique for water body bathymetry. Using a sUAS and a lightweight sonar, the bathymetry and volume of a small reservoir have been surveyed. The sUAS surveying approach is conducted at low altitude (2 meters above the water), using the sUAS to tow a small boat with the sonar attached. A cluster analysis is conducted to optimize the sUAS data collection and minimize the standard deviation created by under-sampling in areas of highly variable bathymetry, so measurements are densified in regions characterized by steep slopes and drastic changes in the reservoir bed. This technique provides flexibility, efficiency, and freedom from risk to humans while obtaining high-quality information. The irregularly spaced bathymetric survey is then interpolated using unstructured Triangular Irregular Network (TIN)-based maps to avoid re-gridding or re-sampling issues.
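
    The TIN interpolation step can be sketched with SciPy, whose Delaunay-based linear interpolator plays the role of the Triangular Irregular Network; the soundings below are synthetic, and the cluster-analysis densification is not shown.

    ```python
    # Minimal sketch: TIN-style surface from irregular soundings and a volume estimate.
    import numpy as np
    from scipy.interpolate import LinearNDInterpolator

    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 100, size=(500, 2))                   # sounding positions (m)
    depth = 5 + 0.05 * xy[:, 0] + rng.normal(0, 0.2, 500)     # synthetic depths (m)

    tin = LinearNDInterpolator(xy, depth)                     # triangulates the points

    # Evaluate on a 1 m grid and integrate depth * cell area inside the convex hull.
    gx, gy = np.meshgrid(np.arange(0, 100, 1.0), np.arange(0, 100, 1.0))
    gz = tin(gx, gy)                                          # NaN outside the hull
    volume = np.nansum(gz) * 1.0 * 1.0                        # m^3 for 1 m x 1 m cells
    print(f"estimated volume: {volume:.0f} m^3")
    ```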

  15. Chapter 6. Dwarf mistletoe surveys

    Science.gov (United States)

    J.A. Muir; B. Moody

    2002-01-01

    Dwarf mistletoe surveys are conducted for a variety of vegetation management objectives. Various survey and sampling techniques are used, either at a broad landscape scale in forest planning or program review, or at an individual stand or site level for specific project implementation. Standard and special surveys provide data to map mistletoe distributions and quantify...

  16. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    Directory of Open Access Journals (Sweden)

    Jonathan Lueke

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  17. MEMS-based power generation techniques for implantable biosensing applications.

    Science.gov (United States)

    Lueke, Jonathan; Moussa, Walied A

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  18. SELF-ASSEMBLED ROV AND PHOTOGRAMMETRIC SURVEYS WITH LOW COST TECHNIQUES

    Directory of Open Access Journals (Sweden)

    E. Costa

    2018-05-01

    In recent years, ROVs have been employed to explore underwater environments and have played an important role in documentation and surveys in different fields of scientific application. In 2017, the Laboratorio di Fotogrammetria of Iuav University of Venice decided to buy an OpenRov, a low cost ROV that could be assembled by ourselves, adding some external components for our needs, to document archaeological sites. The paper relates to the photogrammetric survey for the documentation of underwater environments and to the comparison between different solutions applied to a case study, five marble columns on a sandy bottom at 5 meters depth. On the lateral sides of the ROV, we applied two GoPro Hero4 Session cameras, which documented the items both with a series of images and with a video. The geometric accuracy of the obtained 3D model has been evaluated through comparison with a photogrammetric model realized with a professional reflex camera, a Nikon D610. Some targets have been topographically surveyed with a trilateration and have been used to connect the different models in the same reference system, allowing comparison of the point clouds. Remotely operated vehicles offer not only safety for their operators, but are also a relatively low cost alternative. The employment of a low-cost vehicle adapted to the necessities of surveys supports the demand for safer, cheaper and more efficient methods for exploring underwater environments.

  19. Assessment of ground-based monitoring techniques applied to landslide investigations

    Science.gov (United States)

    Uhlemann, S.; Smith, A.; Chambers, J.; Dixon, N.; Dijkstra, T.; Haslam, E.; Meldrum, P.; Merritt, A.; Gunn, D.; Mackay, J.

    2016-01-01

    A landslide complex in the Whitby Mudstone Formation at Hollin Hill, North Yorkshire, UK is periodically re-activated in response to rainfall-induced pore-water pressure fluctuations. This paper compares long-term measurements (i.e., 2009-2014) obtained from a combination of monitoring techniques that have been employed together for the first time on an active landslide. The results highlight the relative performance of the different techniques, and can provide guidance for researchers and practitioners for selecting and installing appropriate monitoring techniques to assess unstable slopes. Particular attention is given to the spatial and temporal resolutions offered by the different approaches that include: Real Time Kinematic-GPS (RTK-GPS) monitoring of a ground surface marker array, conventional inclinometers, Shape Acceleration Arrays (SAA), tilt meters, active waveguides with Acoustic Emission (AE) monitoring, and piezometers. High spatial resolution information has allowed locating areas of stability and instability across a large slope. This has enabled identification of areas where further monitoring efforts should be focused. High temporal resolution information allowed the capture of 'S'-shaped slope displacement-time behaviour (i.e. phases of slope acceleration, deceleration and stability) in response to elevations in pore-water pressures. This study shows that a well-balanced suite of monitoring techniques that provides high temporal and spatial resolutions on both measurement and slope scale is necessary to fully understand failure and movement mechanisms of slopes. In the case of the Hollin Hill landslide it enabled detailed interpretation of the geomorphological processes governing landslide activity. It highlights the benefit of regularly surveying a network of GPS markers to determine areas for installation of movement monitoring techniques that offer higher resolution both temporally and spatially. The small sensitivity of tilt meter measurements

  20. Industry Based Monkfish Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Monkfish industry leaders expressed concerns that the NEFSC bottom trawl surveys did not sample in all monkfish habitats; particularly the deeper water outside the...

  1. Combination Base64 Algorithm and EOF Technique for Steganography

    Science.gov (United States)

    Rahim, Robbi; Nurdiyanto, Heri; Hidayat, Rahmat; Saleh Ahmar, Ansari; Siregar, Dodi; Putera Utama Siahaan, Andysah; Faisal, Ilham; Rahman, Sayuti; Suita, Diana; Zamsuri, Ahmad; Abdullah, Dahlan; Napitupulu, Darmawan; Ikhsan Setiawan, Muhammad; Sriadhi, S.

    2018-04-01

    The steganography process combines mathematics and computer science. Steganography consists of a set of methods and techniques to embed data into another medium so that the contents are unreadable to anyone who does not have the authority to read them. The main objective of using the Base64 method is to encode any file in order to achieve privacy. This paper discusses a steganography and encoding method using Base64, an encoding scheme that converts binary data into a series of ASCII characters. In addition, the End of File (EOF) technique is used to embed the text encoded with Base64. As an example of the mechanism, a file is used to represent the text, and using the two methods together increases the level of security for protecting the data. This research aims to secure many types of files in a particular medium with good security and without damaging the stored files or the cover media used.
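
    A minimal sketch of the Base64-plus-EOF idea is shown below: the secret text is Base64-encoded and appended after the end of a cover file behind a marker, so ordinary viewers still open the cover normally. The file names and marker are illustrative, not from the paper.

    ```python
    # Minimal sketch of Base64 encoding combined with end-of-file (EOF) embedding.
    import base64

    MARKER = b"--STEGO--"          # illustrative separator between cover bytes and payload

    def embed(cover_path: str, stego_path: str, secret: str) -> None:
        payload = base64.b64encode(secret.encode("utf-8"))
        with open(cover_path, "rb") as f:
            cover = f.read()
        with open(stego_path, "wb") as f:
            f.write(cover + MARKER + payload)   # payload sits past the cover's normal EOF

    def extract(stego_path: str) -> str:
        with open(stego_path, "rb") as f:
            data = f.read()
        payload = data.split(MARKER, 1)[1]
        return base64.b64decode(payload).decode("utf-8")

    if __name__ == "__main__":
        with open("cover.bin", "wb") as f:      # tiny stand-in cover file for the demo
            f.write(b"\x89PNG fake header for demo")
        embed("cover.bin", "stego.bin", "secret message")
        print(extract("stego.bin"))
    ```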

  2. The role of graphene-based sorbents in modern sample preparation techniques.

    Science.gov (United States)

    de Toffoli, Ana Lúcia; Maciel, Edvaldo Vasconcelos Soares; Fumes, Bruno Henrique; Lanças, Fernando Mauro

    2018-01-01

    The application of graphene-based sorbents in sample preparation techniques has increased significantly since 2011. These materials have good physicochemical properties to be used as sorbent and have shown excellent results in different sample preparation techniques. Graphene and its precursor graphene oxide have been considered to be good candidates to improve the extraction and concentration of different classes of target compounds (e.g., parabens, polycyclic aromatic hydrocarbon, pyrethroids, triazines, and so on) present in complex matrices. Its applications have been employed during the analysis of different matrices (e.g., environmental, biological and food). In this review, we highlight the most important characteristics of graphene-based material, their properties, synthesis routes, and the most important applications in both off-line and on-line sample preparation techniques. The discussion of the off-line approaches includes methods derived from conventional solid-phase extraction focusing on the miniaturized magnetic and dispersive modes. The modes of microextraction techniques called stir bar sorptive extraction, solid phase microextraction, and microextraction by packed sorbent are discussed. The on-line approaches focus on the use of graphene-based material mainly in on-line solid phase extraction, its variation called in-tube solid-phase microextraction, and on-line microdialysis systems. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Alcohol consumption trends in Australia: Comparing surveys and sales-based measures.

    Science.gov (United States)

    Livingston, Michael; Callinan, Sarah; Raninen, Jonas; Pennay, Amy; Dietze, Paul M

    2018-04-01

    Survey data remain a crucial means for monitoring alcohol consumption, but there has been limited work done to ensure that surveys adequately capture changes in per-capita consumption in Australia. In this study, we explore how trends in consumption from two major Australian surveys compare with an official measure of per-capita consumption between 2001 and 2014 and examine age-specific trends in drinking. Data were from five waves of the cross-sectional National Health Survey (total n = 113 279) and 12 waves of the longitudinal Household Income and Labour Dynamics in Australia Study (average n = 12 347). Overall and age-specific estimates of annual alcohol consumption were derived and compared with official per-capita consumption and previous analyses of the National Drug Strategy Household Survey. In terms of overall consumption, both surveys broadly reflected trends in per-capita consumption, especially the decline that has been observed since 2007/2008. Age-specific trends were broadly similar, with the recent decline in consumption clearly concentrated among teenagers and young adults. The main Australian monitoring surveys remain useful monitoring tools for alcohol consumption in Australia. There is consistent evidence that the recent declines in Australian per-capita consumption have been driven by sharp falls in drinking among young people, a trend that requires further study. [Livingston M, Callinan S, Raninen J, Pennay A, Dietze PM. Alcohol consumption trends in Australia: Comparing surveys and sales-based measures. Drug Alcohol Rev 2017;00:000-000]. © 2017 Australasian Professional Society on Alcohol and other Drugs.

  4. A survey of intrusion detection techniques in Cloud

    OpenAIRE

    Modi, C.; Patel, D.; Patel, H.; Borisaniya, B.; Patel, A.; Rajarajan, M.

    2013-01-01

    Cloud computing provides scalable, virtualized on-demand services to the end users with greater flexibility and lesser infrastructural investment. These services are provided over the Internet using known networking protocols, standards and formats under the supervision of different managements. Existing bugs and vulnerabilities in underlying technologies and legacy protocols tend to open doors for intrusion. This paper surveys different intrusions affecting availability, confidentiality and...

  5. Survey of image quality and radiographic technique of pediatric chest examinations performed in Latin America

    International Nuclear Information System (INIS)

    Khoury, H.; Mora, P.; Defaz, M.Y.; Blanco, S.; Leyton, F.; Benavente, T.; Ortiz Lopez, P.; Ramirez, R.

    2008-01-01

    This work presents the results of a survey of entrance surface air kerma values (K_e), image quality and radiographic exposure parameters used in pediatric chest examinations performed in Latin America. This study is part of the activities of the IAEA Regional Project RLA/9/057, whose objective is to optimize the radiological protection of patients in diagnostic and interventional radiology, nuclear medicine and radiotherapy. The survey was performed in nine hospitals in Argentina (1), Brazil (4), Chile (1), Costa Rica (1), Peru (1) and Ecuador (1). The study group consisted of 462 pediatric patients (Group I: from two days to one year; Group II: from four to six years of age) undergoing chest PA/AP examinations. At the time of the examination the exposure parameters (kVp, mAs, focal-spot-to-film distance, etc.) and patient information (gender, height, weight and age) were recorded. The radiographic image quality was evaluated by the local radiologist based on the European Guidelines on Quality Criteria for Diagnostic Radiographic Images in Pediatrics. The results showed that the exposure parameters used on newborn patients were mostly outside the 60-65 kV range recommended by the European Guidelines for good radiographic practice. In the case of examinations of patients aged 4 to 6 years, 80% were performed with a peak tube voltage within the 60-80 kV range, as recommended by the European Guidelines. It was found that none of the countries fully comply with the European Guidelines on Quality Criteria, and that criteria No. 2 and No. 3 (reproduction of the chest without rotation) received the lowest scores. This probably occurs because there are no proper patient immobilization devices. The K_e values, for both patient groups, showed a wide dispersion, ranging from 10 μGy to 160 μGy for the newborn patients and from 20 μGy to 240 μGy for infant patients. It is possible to conclude that, in the participating Latin American countries on this project...

  6. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    Science.gov (United States)

    Chockalingam, Letchumanan

    2005-01-01

    The data of the Gunung Ledang region of Malaysia acquired through LANDSAT are considered for mapping certain hydrogeological features. To map these significant features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods, evaluated from the point of view of their validity in properly isolating features of hydrogeological interest, are discussed. As these techniques take advantage of the spectral aspects of the images, they have several limitations in meeting the objectives. To address these limitations, a morphological transformation, which considers the structural aspects rather than the spectral aspects of the image, is applied to provide comparisons between the results derived from spectral-based and structural-based filtering techniques.
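
    The contrast between a spectral (gradient/edge-based) filter and a structural (morphological) one can be sketched with OpenCV on a single grey-scale band; the synthetic band, kernel size and Canny thresholds below are illustrative.

    ```python
    # Minimal sketch contrasting an edge-detection filter with a morphological one.
    import cv2
    import numpy as np

    # Synthetic stand-in for a single LANDSAT band (replace with a real image band).
    rng = np.random.default_rng(0)
    band = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)

    edges = cv2.Canny(band, 50, 150)                             # spectral/gradient-based edges
    kernel = np.ones((5, 5), np.uint8)
    morph = cv2.morphologyEx(band, cv2.MORPH_GRADIENT, kernel)   # structural boundaries
    print(edges.mean(), morph.mean())
    ```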

  7. Learning Physics through Project-Based Learning Game Techniques

    Science.gov (United States)

    Baran, Medine; Maskan, Abdulkadir; Yasar, Seyma

    2018-01-01

    The aim of the present study, in which Project and game techniques are used together, is to examine the impact of project-based learning games on students' physics achievement. Participants of the study consist of 34 9th grade students (N = 34). The data were collected using achievement tests and a questionnaire. Throughout the applications, the…

  8. Model-based Sensor Data Acquisition and Management

    OpenAIRE

    Aggarwal, Charu C.; Sathe, Saket; Papaioannou, Thanasis G.; Jeung, Ho Young; Aberer, Karl

    2012-01-01

    In recent years, due to the proliferation of sensor networks, there has been a genuine need of researching techniques for sensor data acquisition and management. To this end, a large number of techniques have emerged that advocate model-based sensor data acquisition and management. These techniques use mathematical models for performing various, day-to-day tasks involved in managing sensor data. In this chapter, we survey the state-of-the-art techniques for model-based sensor data acquisition...

  9. Unbiased stereologic techniques for practical use in diagnostic histopathology

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt

    1995-01-01

    Grading of malignancy by the examination of morphologic and cytologic details in histologic sections from malignant neoplasms is based exclusively on qualitative features, associated with significant subjectivity, and thus rather poor reproducibility. The traditional way of malignancy grading may...... by introducing quantitative techniques in the histopathologic discipline of malignancy grading. Unbiased stereologic methods, especially based on measurements of nuclear three-dimensional mean size, have during the last decade proved their value in this regard. In this survey, the methods are reviewed regarding...... the basic technique involved, sampling, efficiency, and reproducibility. Various types of cancers, where stereologic grading of malignancy has been used, are reviewed and discussed with regard to the development of a new objective and reproducible basis for carrying out prognosis-related malignancy grading...

  10. Some fuzzy techniques for staff selection process: A survey

    Science.gov (United States)

    Md Saad, R.; Ahmad, M. Z.; Abu, M. S.; Jusoh, M. S.

    2013-04-01

    With a high level of business competition, it is vital to have flexible staff who are able to adapt themselves to work circumstances. However, the staff selection process is not an easy task to solve, even when it is tackled in a simplified version containing only a single criterion and a homogeneous skill. When multiple criteria and various skills are involved, the problem becomes much more complicated. In addition, some information cannot be measured precisely. This is patently obvious when dealing with opinions, thoughts, feelings, beliefs, etc. One possible tool for handling this issue is fuzzy set theory. Therefore, the objective of this paper is to review the existing fuzzy techniques for solving the staff selection process. It classifies several existing research methods and identifies areas where gaps exist and further research is needed. Finally, this paper concludes by suggesting new ideas for future research based on the gaps identified.

  11. Techniques for laser processing, assay, and examination of spent fuel

    International Nuclear Information System (INIS)

    Gray, J.H.; Mitchell, R.C.; Rogell, M.L.

    1981-11-01

    Fuel examination studies were performed which have application to interim spent fuel storage. These studies covered three areas: a laser drilling and rewelding demonstration, a survey of nondestructive assay techniques, and a survey of fuel examination techniques.

  12. PERFORMANCE ANALYSIS OF PILOT BASED CHANNEL ESTIMATION TECHNIQUES IN MB OFDM SYSTEMS

    Directory of Open Access Journals (Sweden)

    M. Madheswaran

    2011-12-01

    Full Text Available Ultra wideband (UWB) communication is mainly used for short-range communication in wireless personal area networks. Orthogonal Frequency Division Multiplexing (OFDM) is being used as a key physical layer technology for Fourth Generation (4G) wireless communication. OFDM-based communication gives high spectral efficiency and mitigates Inter-Symbol Interference (ISI) in a wireless medium. In this paper the IEEE 802.15.3a based Multiband OFDM (MB OFDM) system is considered. Pilot-based channel estimation techniques are considered to analyze the performance of MB OFDM systems over Linear Time Invariant (LTI) channel models. In this paper, pilot-based Least Square (LS) and Linear Minimum Mean Square Error (LMMSE) channel estimation techniques have been considered for the UWB OFDM system. In the proposed method, the estimated Channel Impulse Responses (CIRs) are filtered in the time domain to account for the channel delay spread. The performance of the proposed system has also been analyzed for different modulation techniques and various pilot density patterns.
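
    As a rough illustration of the pilot-based least-squares step described above, the sketch below estimates the channel at pilot subcarriers and interpolates across the band; the subcarrier count, pilot spacing, and channel taps are invented for the example and are not taken from the paper.

```python
import numpy as np

# Minimal LS pilot-aided channel estimation sketch (illustrative values only).
rng = np.random.default_rng(0)
n_sc, pilot_step = 128, 8                     # subcarriers and pilot spacing (assumed)
pilots = np.arange(0, n_sc, pilot_step)       # pilot subcarrier indices
h_time = rng.normal(size=4) + 1j * rng.normal(size=4)   # toy 4-tap channel impulse response
H_true = np.fft.fft(h_time, n_sc)             # channel frequency response

X = np.ones(n_sc, dtype=complex)              # known pilot/data symbols (all ones for simplicity)
noise = 0.05 * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))
Y = H_true * X + noise                        # received symbols

H_ls_pilots = Y[pilots] / X[pilots]           # LS estimate at pilot positions: H = Y / X
# Linear interpolation of real and imaginary parts across all subcarriers
H_hat = (np.interp(np.arange(n_sc), pilots, H_ls_pilots.real)
         + 1j * np.interp(np.arange(n_sc), pilots, H_ls_pilots.imag))

mse = np.mean(np.abs(H_hat - H_true) ** 2)
print(f"LS estimation MSE: {mse:.4f}")
```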

  13. Illumination Sufficiency Survey Techniques: In-situ Measurements of Lighting System Performance and a User Preference Survey for Illuminance in an Off-Grid, African Setting

    Energy Technology Data Exchange (ETDEWEB)

    Alstone, Peter; Jacobson, Arne; Mills, Evan

    2010-08-26

    Efforts to promote rechargeable electric lighting as a replacement for fuel-based light sources in developing countries are typically predicated on the notion that lighting service levels can be maintained or improved while reducing the costs and environmental impacts of existing practices. However, the extremely low incomes of those who depend on fuel-based lighting create a need to balance the hypothetically possible or desirable levels of light with those that are sufficient and affordable. In a pilot study of four night vendors in Kenya, we document a field technique we developed to simultaneously measure the effectiveness of lighting service provided by a lighting system and conduct a survey of lighting service demand by end-users. We took gridded illuminance measurements across each vendor's working and selling area, with users indicating the sufficiency of light at each point. User light sources included a mix of kerosene-fueled hurricane lanterns, pressure lamps, and LED lanterns. We observed illuminance levels ranging from just above zero to 150 lux. The LED systems markedly improved the lighting service levels over those provided by kerosene-fueled hurricane lanterns. Users reported that the minimum acceptable threshold was about 2 lux. The results also indicated that the LED lamps in use by the subjects did not always provide sufficient illumination over the desired retail areas. Our sample size is much too small, however, to reach any conclusions about requirements in the broader population. Given the small number of subjects and very specific type of user, our results should be regarded as indicative rather than conclusive. We recommend replicating the method at larger scales and across a variety of user types and contexts. Policymakers should revisit the subject of recommended illuminance levels regularly as LED technology advances and the price/service balance point evolves.

  14. Dealing with Magnetic Disturbances in Human Motion Capture: A Survey of Techniques

    Directory of Open Access Journals (Sweden)

    Gabriele Ligorio

    2016-03-01

    Full Text Available Magnetic-Inertial Measurement Units (MIMUs) based on microelectromechanical (MEMS) technologies are widespread in contexts such as human motion tracking. Although they present several advantages (light weight, small size, low cost), their orientation estimation accuracy might be poor. Indoor magnetic disturbances represent one of the limiting factors for their accuracy, and, therefore, a variety of work has been done to characterize and compensate for them. In this paper, the main compensation strategies included within Kalman-based orientation estimators are surveyed and classified according to which degrees of freedom are affected by the magnetic data and to the magnetic disturbance rejection methods implemented. By selecting a representative method from each category, four algorithms were obtained and compared in two different magnetic environments: (1) a small workspace with an active magnetic source; (2) a large workspace without active magnetic sources. A wrist-worn MIMU was used to acquire data from a healthy subject, whereas a stereophotogrammetric system was adopted to obtain ground-truth data. The results suggested that the model-based approaches represent the best compromise between the two testbeds. This is particularly true when the magnetic data are prevented from affecting the estimation of the angles with respect to the vertical direction.
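
    One of the simpler disturbance-rejection strategies surveyed here - gating or down-weighting the magnetometer update whenever the measured field departs from the expected local field magnitude - can be sketched as follows. The reference magnitude, tolerance, and sample values are illustrative assumptions; real implementations embed this logic inside the Kalman filter's measurement model.

```python
import numpy as np

EARTH_FIELD_UT = 50.0       # assumed local geomagnetic field magnitude (microtesla)
TOLERANCE_UT = 5.0          # assumed acceptance band around the reference magnitude

def magnetometer_trustworthy(mag_sample_ut):
    """Accept the magnetometer reading for heading correction only if its norm is plausible."""
    return abs(np.linalg.norm(mag_sample_ut) - EARTH_FIELD_UT) < TOLERANCE_UT

clean = np.array([30.0, 10.0, -38.0])     # |m| ~ 49.4 uT -> accepted
disturbed = np.array([80.0, 5.0, -40.0])  # |m| ~ 89.6 uT -> rejected (e.g. near a motor)

for sample in (clean, disturbed):
    verdict = "use for yaw update" if magnetometer_trustworthy(sample) else "reject"
    print(sample, "->", verdict)
```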

  15. Optical supervised filtering technique based on Hopfield neural network

    Science.gov (United States)

    Bal, Abdullah

    2004-11-01

    The Hopfield neural network is commonly preferred for optimization problems. In image segmentation, conventional Hopfield neural networks (HNN) are formulated as a cost-function-minimization problem to perform gray level thresholding on the image histogram or the pixels' gray levels arranged in a one-dimensional array [R. Sammouda, N. Niki, H. Nishitani, Pattern Rec. 30 (1997) 921-927; K.S. Cheng, J.S. Lin, C.W. Mao, IEEE Trans. Med. Imag. 15 (1996) 560-567; C. Chang, P. Chung, Image and Vision Comp. 19 (2001) 669-678]. In this paper, a new high-speed supervised filtering technique is proposed for image feature extraction and enhancement problems by modifying the conventional HNN. The essential improvement in this technique is the use of a 2D convolution operation instead of weight-matrix multiplication. Thereby, a new neural-network-based filtering technique has been obtained that requires just a 3 × 3 filter mask matrix instead of a large weight coefficient matrix. Optical implementation of the proposed filtering technique is executed easily using the joint transform correlator. The requirement of non-negative data for optical implementation is met by a bias technique that converts the bipolar data to non-negative data. Simulation results of the proposed optical supervised filtering technique are reported for various feature extraction problems such as edge detection, corner detection, horizontal and vertical line extraction, and fingerprint enhancement.
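
    The key point of the abstract - replacing a large weight-matrix multiplication by a 2D convolution with a small 3 × 3 mask - can be illustrated with an ordinary convolution-based edge detector. The mask and test image below are illustrative stand-ins and do not reproduce the authors' trained HNN coefficients.

```python
import numpy as np
from scipy.signal import convolve2d

# Toy image: a bright square on a dark background
image = np.zeros((32, 32))
image[8:24, 8:24] = 1.0

# A 3x3 Laplacian-style mask as a stand-in for the learned filter coefficients
mask = np.array([[ 0, -1,  0],
                 [-1,  4, -1],
                 [ 0, -1,  0]], dtype=float)

# 2D convolution replaces the full weight-matrix multiplication of a conventional HNN
edges = convolve2d(image, mask, mode="same", boundary="symm")
print("non-zero responses (concentrated on the square outline):",
      int(np.count_nonzero(np.abs(edges) > 1e-9)))
```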

  16. A DIFFERENT WEB-BASED GEOCODING SERVICE USING FUZZY TECHNIQUES

    Directory of Open Access Journals (Sweden)

    P. Pahlavani

    2015-12-01

    Full Text Available Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use the available online geocoding services. In existing geocoding services, the concept of proximity and nearness is not modelled appropriately, and these services search for an address only by address matching based on descriptive data. In addition, there are some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, to search for places based on their location, non-point representation of results, as well as displaying search results based on their priority.
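
    A minimal sketch of the fuzzy idea described above: build a fuzzy "nearness" membership from the distance to each reference place and combine the resulting fuzzy distance maps with a simple overlay (here a minimum operator, i.e. fuzzy AND). The grid, place coordinates, and membership shape are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def nearness(dist_m, d_half=300.0):
    """Fuzzy membership for 'near': 1 at distance 0, 0.5 at d_half, tending to 0 far away."""
    return 1.0 / (1.0 + (dist_m / d_half) ** 2)

# Toy 100 x 100 grid with 10 m cells and two reference places (assumed coordinates)
xx, yy = np.meshgrid(np.arange(100), np.arange(100))
places = [(20, 30), (70, 65)]

fuzzy_maps = []
for px, py in places:
    dist_m = np.hypot((xx - px) * 10.0, (yy - py) * 10.0)
    fuzzy_maps.append(nearness(dist_m))

# Fuzzy overlay: a cell scores highly only if it is near *all* places
overlay = np.minimum.reduce(fuzzy_maps)
best = np.unravel_index(np.argmax(overlay), overlay.shape)
print("best cell (row, col):", best, "membership:", round(float(overlay[best]), 3))
```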

  17. Wood lens design philosophy based on a binary additive manufacturing technique

    Science.gov (United States)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  18. Bandwidth-Tunable Fiber Bragg Gratings Based on UV Glue Technique

    Science.gov (United States)

    Fu, Ming-Yue; Liu, Wen-Feng; Chen, Hsin-Tsang; Chuang, Chia-Wei; Bor, Sheau-Shong; Tien, Chuen-Lin

    2007-07-01

    In this study, we have demonstrated that a uniform fiber Bragg grating (FBG) can be transformed into a chirped fiber grating by a simple UV glue adhesive technique without shifting the reflection band with respect to the center wavelength of the FBG. The technique is based on the induced strain of an FBG due to the UV glue adhesive force on the fiber surface that causes a grating period variation and an effective index change. This technique can provide a fast and simple method of obtaining the required chirp value of a grating for applications in the dispersion compensators, gain flattening in erbium-doped fiber amplifiers (EDFAs) or optical filters.
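
    The mechanism described - a glue-induced strain gradient changing the grating period Λ and effective index, and thereby chirping the reflection band - follows directly from the Bragg condition λ_B = 2·n_eff·Λ. The numbers below are generic illustrative values, not measurements from the paper, and the photoelastic change of n_eff is neglected in this sketch.

```python
# Bragg condition: lambda_B = 2 * n_eff * period
n_eff = 1.447                      # typical effective index of a fibre core (assumed)
period0 = 535.0e-9                 # unstrained grating period in metres (assumed)

def bragg_wavelength(n_eff, period):
    return 2.0 * n_eff * period

# A strain gradient along the grating stretches the local period slightly,
# turning a single Bragg wavelength into a reflection band (a chirp).
strains = [0.0, 2e-4, 4e-4]        # local strain at three positions along the FBG
for eps in strains:
    lam = bragg_wavelength(n_eff, period0 * (1.0 + eps))
    print(f"strain {eps:.0e}: local Bragg wavelength = {lam * 1e9:.3f} nm")
```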

  19. Vision based techniques for rotorcraft low altitude flight

    Science.gov (United States)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to the automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image database for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight, are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except in regions close to the focus of expansion (FOE). Closer to the FOE, the error in range increases because the magnitude of the disparity gets smaller, resulting in a low SNR.

  20. Plane and geodetic surveying

    CERN Document Server

    Johnson, Aylmer

    2014-01-01

    Introduction; Aim and Scope; Classification of Surveys; The Structure of This Book; General Principles of Surveying; Errors; Redundancy; Stiffness; Adjustment; Planning and Record Keeping; Principal Surveying Activities; Establishing Control Networks; Mapping; Setting Out; Resectioning; Deformation Monitoring; Angle Measurement; The Surveyor's Compass; The Clinometer; The Total Station; Making Observations; Checks on Permanent Adjustments; Distance Measurement; General; Tape Measurements; Optical Methods (Tachymetry); Electromagnetic Distance Measurement (EDM); Ultrasonic Methods; GNSS; Levelling; Theory; The Instrument; Technique; Booking; Permanent Adjustmen…

  1. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time domain analysis of the crosstalk effect in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. This method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
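
    To make the FDTD idea concrete, the sketch below leapfrogs the telegrapher's equations for a single lossy transmission line and enforces the Courant limit dt ≤ dx·sqrt(LC). The per-unit-length values are arbitrary illustrative numbers, and the coupled (crosstalk) case of the paper would add mutual inductance and capacitance terms between the aggressor and victim lines.

```python
import numpy as np

# Per-unit-length parameters (illustrative, not the SWCNT bundle values of the paper)
R, L, C, G = 5.0, 250e-9, 100e-12, 0.0       # ohm/m, H/m, F/m, S/m
length, nx = 0.01, 200                        # 1 cm line split into 200 cells
dx = length / nx
dt = 0.9 * dx * np.sqrt(L * C)                # Courant condition: dt <= dx * sqrt(L * C)
z0 = np.sqrt(L / C)                           # characteristic impedance (lossless estimate)

v = np.zeros(nx + 1)                          # node voltages
i = np.zeros(nx)                              # branch currents on the staggered half-grid

for _ in range(2000):
    v[0] = 1.0                                # hard 1 V step source at the near end
    # leapfrog updates of the telegrapher's equations
    i += -dt / L * ((v[1:] - v[:-1]) / dx + R * i)
    v[1:-1] += -dt / C * ((i[1:] - i[:-1]) / dx + G * v[1:-1])
    v[-1] = z0 * i[-1]                        # crude matched termination at the far end

print(f"far-end voltage after {2000 * dt * 1e9:.2f} ns: {v[-1]:.3f} V")
```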

  2. Geothermal survey handbook

    Energy Technology Data Exchange (ETDEWEB)

    1974-01-01

    The objective of this handbook is to publicize widely the nature of geothermal surveys. It covers geothermal survey planning and measurement as well as measurement of thermal conductivity. Methods for the detection of eruptive areas, the measurement of radiative heat using snowfall, the measurement of surface temperature using infrared radiation and the measurement of thermal flow are described. The book also contains information on physical detection of geothermal reservoirs, the measurement of spring wells, thermographic measurement of surface heat, irregular layer surveying, air thermographics and aerial photography. Isotope measurement techniques are included.

  3. Mobility Based Key Management Technique for Multicast Security in Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    B. Madhusudhanan

    2015-01-01

    Full Text Available In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes a large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques cope poorly with frequent mobility and disconnections. So there is a need to design a multicast key management technique to overcome these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANETs. Initially, the nodes are categorized according to their stability index, which is estimated based on link availability and mobility. A multicast tree is constructed such that for every weak node there is a strong parent node. A session-key-based encryption technique is utilized to transmit multicast data. The rekeying process is performed periodically by the initiator node. The rekeying interval is fixed depending on the node category, so this technique greatly minimizes the rekeying overhead. By simulation results, we show that our proposed approach reduces the packet drop rate and improves data confidentiality.

  4. Atoms for Food and Nutrition: Application of Nuclear Techniques in Food and Agriculture

    International Nuclear Information System (INIS)

    Nyambura, M.

    2017-01-01

    Light-based soil and plant analysis supports evidence-based decision making, leveraging the latest technology from lab to space. Soil-plant spectroscopy exploits the simplicity of light: light-based techniques for the measurement of soil and plant materials and for farmer advisory services. The interaction of light with matter provides rapid, low-cost and reproducible soil characterization. Soil health surveillance enables surveillance science and evidence-based approaches - things not previously feasible. Human and laboratory capacity for diagnosing, surveying and managing soil nutrient deficiencies in Sub-Saharan Africa is woefully inadequate for the task. Strengthening Africa's capacity in new science and technology is a key resilience strategy for rapid and low-cost analytical and diagnostic techniques, improved and well-targeted guidelines, and scientific expertise.

  5. Position fixing and surveying techniques for marine archaeological studies

    Digital Repository Service at National Institute of Oceanography (India)

    Ganesan, P.

    This technical report will be of great help to marine archaeologists who want to know the capabilities of some of the most commonly available tools for position fixing, their accuracies and methods of surveying, which in turn will help in selecting...

  6. Experience base for Radioactive Waste Thermal Processing Systems: A preliminary survey

    International Nuclear Information System (INIS)

    Mayberry, J.; Geimer, R.; Gillins, R.; Steverson, E.M.; Dalton, D.; Anderson, G.L.

    1992-04-01

    In the process of considering thermal technologies for the potential treatment of the Idaho National Engineering Laboratory mixed transuranic contaminated wastes, a preliminary survey of the experience base available from Radioactive Waste Thermal Processing Systems is reported. A list of known commercial radioactive waste facilities in the United States and some international thermal treatment facilities is provided. The survey focuses on US Department of Energy thermal treatment facilities. A brief facility description and a preliminary summary of facility status and problems experienced are provided for a selected subset of the DOE facilities.

  7. Retinal Vessels Segmentation Techniques and Algorithms: A Survey

    Directory of Open Access Journals (Sweden)

    Jasem Almotiri

    2018-01-01

    Full Text Available Retinal vessel identification and localization aim to separate the different retinal vasculature structure tissues, either wide or narrow ones, from the fundus image background and other retinal anatomical structures such as the optic disc, macula, and abnormal lesions. Retinal vessel identification studies have attracted more and more attention in recent years due to non-invasive fundus imaging and the crucial information contained in the vasculature structure, which is helpful for the detection and diagnosis of a variety of retinal pathologies including but not limited to Diabetic Retinopathy (DR), glaucoma, hypertension, and Age-related Macular Degeneration (AMD). Over almost two decades of development, innovative approaches applying computer-aided techniques for segmenting retinal vessels have become more and more crucial and are coming closer to routine clinical application. The purpose of this paper is to provide a comprehensive overview of retinal vessel segmentation techniques. Firstly, a brief introduction to retinal fundus photography and the imaging modalities of retinal images is given. Then, the preprocessing operations and the state-of-the-art methods of retinal vessel identification are introduced. Moreover, the evaluation and validation of the results of retinal vessel segmentation are discussed. Finally, an objective assessment is presented and future developments and trends are addressed for retinal vessel identification techniques.

  8. Three-Dimensional Inverse Transport Solver Based on Compressive Sensing Technique

    Science.gov (United States)

    Cheng, Yuxiong; Wu, Hongchun; Cao, Liangzhi; Zheng, Youqi

    2013-09-01

    Based on direct exposure measurements from flash radiographic images, a compressive sensing-based method for the three-dimensional inverse transport problem is presented. The linear absorption coefficients and interface locations of objects are reconstructed directly at the same time. It is always very expensive to obtain enough measurements. With limited measurements, the compressive sensing sparse reconstruction technique orthogonal matching pursuit is applied to obtain the sparse coefficients by solving an optimization problem. A three-dimensional inverse transport solver is developed based on this compressive sensing technique. There are three features in this solver: (1) AutoCAD is employed as a geometry preprocessor due to its powerful graphics capability. (2) The forward projection matrix rather than a Gauss matrix is constructed by the visualization tool generator. (3) The Fourier transform and the Daubechies wavelet transform are adopted to convert an underdetermined system into a well-posed system in the algorithm. Simulations are performed, and the numerical results for the pseudo-sine absorption problem, the two-cube problem and the two-cylinder problem obtained with the compressive sensing-based solver agree well with the reference values.
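
    Orthogonal matching pursuit, the sparse-recovery routine mentioned above, can be sketched in a few lines: it greedily picks the column most correlated with the residual and re-solves a least-squares problem on the selected support. The random sensing matrix and sparsity level below are placeholders, not the solver's actual projection matrix or transform basis.

```python
import numpy as np

def omp(A, y, k):
    """Recover a k-sparse x from y = A @ x using orthogonal matching pursuit."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # column best correlated with residual
        if j not in support:
            support.append(j)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)  # least squares on support
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

rng = np.random.default_rng(1)
m, n, k = 40, 120, 4                      # 40 measurements of a 4-sparse, length-120 signal
A = rng.normal(size=(m, n)) / np.sqrt(m)  # random sensing matrix (illustrative)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
y = A @ x_true

x_hat = omp(A, y, k)
print("max reconstruction error:", float(np.max(np.abs(x_hat - x_true))))
```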

  9. Analysis of Cell Phone Usage Using Correlation Techniques

    OpenAIRE

    T S R MURTHY; D. SIVA RAMA KRISHNA

    2011-01-01

    The present paper is a sample survey analysis examined based on correlation techniques. The usage of mobile phones is clearly almost unavoidable these days, and as such the authors have made a systematic survey, through a well-prepared questionnaire, of the use of mobile phones to the maximum extent. The samples cover various economic groups across a population of over one lakh people. The results are scientifically categorized and interpreted to match the ground reality.

  10. SKILLS-BASED ECLECTIC TECHNIQUES MATRIX FOR ELT MICROTEACHINGS

    Directory of Open Access Journals (Sweden)

    İskender Hakkı Sarıgöz

    2016-10-01

    Full Text Available Foreign language teaching undergoes constant change due to methodological improvement. This progress may be examined in two parts: the methods era and the post-methods era. It is not pragmatic today to propose a particular language teaching method and its techniques for all purposes. The holistic inflexibility of mid-century methods has long gone. In the present day, constructivist foreign language teaching trends attempt to see the learner as a whole person and an individual who may differ from the other students in many respects. At the same time, individual differences should not keep learners away from group harmony. For this reason, current teacher training programs require eclectic teaching matrixes for unit design that take mixed-ability student groups into account. These matrixes can be prepared in a multidimensional fashion because there are many functional techniques in different methods, as well as other new techniques to be created freely by instructors in accordance with the teaching aims. The hypothesis in this argument is that the collection of foreign language teaching techniques compiled in ELT microteachings for a particular group of learners has to be arranged eclectically in order to update the teaching process. Nevertheless, designing a teaching format of this sort is a demanding and highly criticized task. This study briefly discusses eclecticism in the language-skills-based methodological struggle from the perspective of ELT teacher education.

  11. Seismic qualification of nuclear control board by using base isolation technique

    International Nuclear Information System (INIS)

    Koizumi, T.; Tsujiuchi, N.; Fujita, T.

    1987-01-01

    The purpose is to adopt the base isolation technique as a new approach to the seismic qualification of nuclear control boards. The basic concept of the base isolation technique is presented. A two-dimensional linear motion mechanism with pre-tensioned coil springs and dampers is included in the isolation device. The control board is regarded as a lumped-mass system with a moment of inertia. The fundamental movement of this device and control board is calculated as a nonlinear response problem. In addition to the fundamental analysis and numerical estimation, an experimental investigation has been undertaken using an actual-size control board. Sufficient agreement was recognized between the experimental results and the numerical estimation. (orig./HP)

  12. A survey informed PV-based cost-effective electrification options for rural sub-Saharan Africa

    International Nuclear Information System (INIS)

    Opiyo, Nicholas

    2016-01-01

    A comprehensive survey was carried out in the Kendu Bay area of Kenya to determine the electrification patterns of a typical rural sub-Saharan African community and the reasons behind such energy choices. The data from the survey are used to build a transition probability matrix (TPM) for different electrification states of Kendu Bay households. The TPM and the survey data are used to model the temporal diffusion of PV systems and PV-based communal (mini/micro) grids in the area. Survey data show that the majority of Kendu Bay residents shun the national grid due to high connection fees, unreliability of the system, and corruption; people who can afford to do so choose small solar home systems for their basic electricity needs. Without any government policy intervention or help, simulation results show that once 100% electrification status has been achieved in Kendu Bay, only 26% of the residents will be electrified through the national grid alone; the majority (38%) will be electrified through PV-based communal grids, while the remaining 36% will be electrified through grid-connected PV home systems (26%) or grid-connected communal grids (10%). - Highlights: • A survey on sources of electricity in the Kendu Bay area of Kenya is carried out. • Survey results are used to determine choices and sources of household electricity. • Factors affecting electrification are highlighted. • Survey data are used to build a transition probability matrix (TPM). • The TPM and data from the survey are used to model temporal PV diffusion.
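
    The modelling step described above - propagating household electrification states forward with a transition probability matrix - reduces to repeated multiplication of a state-share vector by the TPM. The states and probabilities below are invented for illustration and are not the Kendu Bay survey estimates.

```python
import numpy as np

# Hypothetical electrification states and annual transition probabilities
# (NOT the study's actual categories or estimates).
states = ["unelectrified", "solar home system", "communal PV grid", "national grid"]
tpm = np.array([
    [0.80, 0.12, 0.05, 0.03],   # from "unelectrified"
    [0.00, 0.85, 0.10, 0.05],   # from "solar home system"
    [0.00, 0.00, 0.95, 0.05],   # from "communal PV grid"
    [0.00, 0.00, 0.00, 1.00],   # "national grid" treated as absorbing
])

share = np.array([1.0, 0.0, 0.0, 0.0])   # everyone starts unelectrified
for _year in range(15):                   # 15 years of temporal diffusion
    share = share @ tpm

for name, s in zip(states, share):
    print(f"{name:>18}: {s:.1%}")
```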

  13. Improvement in QEPAS system utilizing a second harmonic based wavelength calibration technique

    Science.gov (United States)

    Zhang, Qinduan; Chang, Jun; Wang, Fupeng; Wang, Zongliang; Xie, Yulei; Gong, Weihua

    2018-05-01

    A simple laser wavelength calibration technique, based on the second harmonic signal, is demonstrated in this paper to improve the performance of a quartz-enhanced photoacoustic spectroscopy (QEPAS) gas sensing system, e.g. improving the signal-to-noise ratio (SNR), detection limit and long-term stability. A constant current corresponding to the gas absorption line, combined with a sinusoidal signal at frequency f/2, is used to drive the laser (constant driving mode), and a software-based real-time wavelength calibration technique is developed to eliminate the wavelength drift due to ambient fluctuations. Compared to conventional wavelength modulation spectroscopy (WMS), this method allows a lower filtering bandwidth and an averaging algorithm to be applied to the QEPAS system, improving the SNR and detection limit. In addition, the real-time wavelength calibration technique guarantees that the laser output is modulated steadily at the gas absorption line. Water vapor was chosen as the target gas to evaluate the performance compared to the constant driving mode and the conventional WMS system. The water vapor sensor was made insensitive to incoherent external acoustic noise by the numerical averaging technique. As a result, the SNR of the wavelength-calibration-based system is 12.87 times higher than that of the conventional WMS system. The new system achieved a better linear response (R2 = 0.9995) in the concentration range from 300 to 2000 ppmv, and achieved a minimum detection limit (MDL) of 630 ppbv.
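
    A bare-bones illustration of extracting a component at twice the modulation frequency (the "2f" signal used for calibration above): the detector signal is multiplied by a reference at 2f and low-pass filtered, here by simple averaging. The waveform, modulation depth, and Lorentzian absorption shape are assumptions for the sketch, not the sensor's actual parameters.

```python
import numpy as np

fs, f_mod = 50_000.0, 50.0                      # sample rate and modulation frequency (assumed)
t = np.arange(0, 1.0, 1.0 / fs)

# Laser wavelength dithered sinusoidally around a Lorentzian absorption line centre
nu = 0.8 * np.sin(2 * np.pi * f_mod * t)        # detuning in units of half-widths
absorption = 1.0 / (1.0 + nu ** 2)              # Lorentzian line shape
signal = absorption + 0.01 * np.random.default_rng(2).normal(size=t.size)

# Lock-in style demodulation at 2f: multiply by the reference and average (low-pass)
ref_2f = np.cos(2 * np.pi * 2 * f_mod * t)
second_harmonic = 2.0 * np.mean(signal * ref_2f)
print(f"2f amplitude at line centre: {second_harmonic:.4f}")
```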

  14. Effectiveness of control techniques in drinking water installations. Survey of recent scientific research results; Effectiviteit beheerstechnieken in drinkwaterinstallaties. RIVM inventariseert recente wetenschappelijke bevindingen

    Energy Technology Data Exchange (ETDEWEB)

    Scheffer, W.

    2012-12-15

    A literature survey has been carried out on current scientific knowledge about the effectiveness of control techniques for Legionella in potable water systems. The scientific literature examined, from 2007-2011, comprises mainly case studies on the effect of the introduction of a particular control technique on Legionella growth. [Dutch original, translated]: Commissioned by the Inspectie Leefomgeving en Transport (ILT), RIVM carried out a literature study of the current scientific knowledge on the effectiveness of control techniques for Legionella in drinking water installations. The scientific literature examined, from 2007-2011, mainly concerns case studies of the effect of introducing a particular control technique on Legionella growth.

  15. Advancing US GHG Inventory by Incorporating Survey Data using Machine-Learning Techniques

    Science.gov (United States)

    Alsaker, C.; Ogle, S. M.; Breidt, J.

    2017-12-01

    Crop management data are used in the National Greenhouse Gas Inventory that is compiled annually and reported to the United Nations Framework Convention on Climate Change. Carbon stock changes and N2O emissions for US agricultural soils are estimated using the USDA National Resources Inventory (NRI). The NRI provides basic information on land use and cropping histories, but it does not provide much detail on other management practices. In contrast, the Conservation Effects Assessment Project (CEAP) survey collects detailed crop management data that could be used in the GHG Inventory. The survey data were collected every 10 years from NRI survey locations that are a subset of the NRI. Therefore, imputation of the CEAP data is needed to represent the management practices across all NRI survey locations, both spatially and temporally. Predictive mean matching and artificial neural network methods have been applied to develop the imputation model under a multiple imputation framework. Temporal imputation involves adjusting the imputation model using state-level USDA Agricultural Resource Management Survey data. Distributional and predictive accuracy is assessed for the imputed data, providing not only the management data needed for the inventory but also rigorous estimates of uncertainty.

  16. Developing the online survey.

    Science.gov (United States)

    Gordon, Jeffry S; McNew, Ryan

    2008-12-01

    Institutions of higher education are now using Internet-based technology tools to conduct surveys for data collection. Research shows that the type and quality of responses one receives with online surveys are comparable with what one receives in paper-based surveys. Data collection can take place via Web-based surveys, e-mail-based surveys, and personal digital assistant/Smartphone devices. Web surveys can be built from subscription templates, from software packages installed on one's own server, or from scratch using Web programming development tools. All of these approaches have their advantages and disadvantages. The survey owner must make informed decisions as to the right technology to implement. The correct choice can save hours of work in sorting, organizing, and analyzing data.

  17. Development of the Risk-Based Inspection Techniques and Pilot Plant Activities

    International Nuclear Information System (INIS)

    Phillips, J.H.

    1997-01-01

    Risk-based techniques have been developed for commercial nuclear power plants. System boundaries and success criteria are defined using the probabilistic risk analysis or probabilistic safety analysis developed to meet the individual plant evaluation. Final ranking of components is performed by a plant expert panel similar to the one developed for the maintenance rule. Components are identified as being high risk-significant or low risk-significant. Maintenance and resources are focused on those components that have the highest risk significance. The techniques have been developed and applied at a number of pilot plants. Results from the first risk-based inspection pilot plant indicate that safety with respect to pipe failure can be doubled while inspection is reduced to about 80% of that in current inspection programs. The reduction in inspection reduces the person-rem exposure, resulting in further increases in safety. These techniques have been documented in publications by the ASME CRTD.

  18. A Survey of Phase Change Memory Systems

    Institute of Scientific and Technical Information of China (English)

    夏飞; 蒋德钧; 熊劲; 孙凝晖

    2015-01-01

    As the scale of applications increases, the demand for main memory capacity increases in order to serve large working sets. It is difficult for DRAM (dynamic random access memory) based memory systems to satisfy the memory capacity requirement due to their limited scalability and high energy consumption. Compared to DRAM, PCM (phase change memory) has better scalability, lower leakage energy, and non-volatility. PCM memory systems have become a hot topic of academic and industrial research. However, PCM technology has the following three drawbacks: long write latency, limited write endurance, and high write energy, which raises challenges to its adoption in practice. This paper surveys architectural research work to optimize PCM memory systems. First, this paper introduces the background of PCM. Then, it surveys research efforts on PCM memory systems in performance optimization, lifetime improvement, and energy saving in detail. This paper also compares and summarizes these techniques along multiple dimensions. Finally, it summarizes these optimization techniques and discusses possible research directions for PCM memory systems in the future.

  19. Survey-based Indicators of Regional Labour Markets and Interregional Migration in Norway

    OpenAIRE

    Carlsen, Fredrik; Johansen, Kåre

    2002-01-01

    A rich set of regional labour market variables is utilised to explain interregional migration in Norway. In particular, regional indicators of labour market pressure are computed from survey data in which respondents are asked to evaluate local job prospects in their resident municipality and the surroundings. Mean satisfaction with local job prospects reported by respondents in a region and related survey-based indicators have a positive and significant impact on net in-migration to the regi...

  20. Hyphenated analytical techniques for materials characterisation

    International Nuclear Information System (INIS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-01-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  1. Hyphenated analytical techniques for materials characterisation

    Science.gov (United States)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  2. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  3. The Research of Histogram Enhancement Technique Based on Matlab Software

    Directory of Open Access Journals (Sweden)

    Li Kai

    2014-08-01

    Full Text Available Histogram enhancement has been widely applied as a typical technique in digital image processing. Based on Matlab software, this paper uses the two approaches of histogram equalization and histogram specification to process darker images, employing partial equalization and histogram mapping to transform the original histograms and thereby enhance the image information. The results show that both techniques can significantly improve image quality and enhance image features.
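
    For reference, the histogram-equalization half of the comparison can be written in a few lines using the standard CDF-mapping formulation. The paper itself works in Matlab; this is a language-neutral sketch on an invented dark test image.

```python
import numpy as np

def histogram_equalize(img):
    """Map grey levels through the normalised cumulative histogram (standard equalization)."""
    hist, _ = np.histogram(img.flatten(), bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalise to [0, 1]
    return (cdf[img] * 255).astype(np.uint8)            # look up each pixel's new level

# A deliberately dark test image: values squeezed into the low end of the range
rng = np.random.default_rng(3)
dark = rng.integers(0, 60, size=(64, 64)).astype(np.uint8)

equalized = histogram_equalize(dark)
print("original range:", dark.min(), "-", dark.max(),
      "| equalized range:", equalized.min(), "-", equalized.max())
```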

  4. Survey to explore understanding of the principles of aseptic technique: Qualitative content analysis with descriptive analysis of confidence and training.

    Science.gov (United States)

    Gould, Dinah J; Chudleigh, Jane; Purssell, Edward; Hawker, Clare; Gaze, Sarah; James, Deborah; Lynch, Mary; Pope, Nicola; Drey, Nicholas

    2018-04-01

    In many countries, aseptic procedures are undertaken by nurses in the general ward setting, but variation in practice has been reported, and evidence indicates that the principles underpinning aseptic technique are not well understood. A survey was conducted, employing a brief, purpose-designed, self-reported questionnaire. The response rate was 72%. Of those responding, 65% of nurses described aseptic technique in terms of the procedure used to undertake it, and 46% understood the principles of asepsis. The related concepts of cleanliness and sterilization were frequently confused with one another. Additionally, 72% reported that they had not received training for at least 5 years; 92% were confident of their ability to apply aseptic technique; and 90% reported that they had not been reassessed since their initial training. Qualitative analysis confirmed a lack of clarity about the meaning of aseptic technique. Nurses' understanding of aseptic technique and the concepts of sterility and cleanliness is inadequate, a finding in line with the results of previous studies. This knowledge gap potentially places patients at risk. Nurses' understanding of the principles of asepsis could be improved. Further studies should establish the generalizability of the study findings. Possible improvements include renewed emphasis during initial nurse education, greater opportunity for updating knowledge and skills post-qualification, and audit of practice. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  5. Estimate-Merge-Technique-based algorithms to track an underwater ...

    Indian Academy of Sciences (India)

    D V A N Ravi Kumar

    2017-07-04

    Jul 4, 2017 ... In this paper, two novel methods based on the Estimate Merge Technique ... mentioned advantages of the proposed novel methods is shown by carrying out Monte Carlo simulation in .... equations are converted to sequential equations to make ... estimation error and low convergence time) at feasibly high.

  6. Post-fire debris flow prediction in Western United States: Advancements based on a nonparametric statistical technique

    Science.gov (United States)

    Nikolopoulos, E. I.; Destro, E.; Bhuiyan, M. A. E.; Borga, M., Sr.; Anagnostou, E. N.

    2017-12-01

    Fire disasters affect modern societies on a global scale, inducing significant economic losses and human casualties. In addition to their direct impacts, they have various adverse effects on the hydrologic and geomorphologic processes of a region due to the tremendous alteration of landscape characteristics (vegetation, soil properties, etc.). As a consequence, wildfires often initiate a cascade of hazards such as flash floods and debris flows that usually follow the occurrence of a wildfire, thus magnifying the overall impact on a region. Post-fire debris flows (PFDF) are one such type of hazard, frequently occurring in the Western United States where wildfires are a common natural disaster. Prediction of PFDF is therefore of high importance in this region, and over the last years a number of efforts by the United States Geological Survey (USGS) and the National Weather Service (NWS) have focused on the development of early warning systems that will help mitigate PFDF risk. This work proposes a prediction framework based on a nonparametric statistical technique (random forests) that allows predicting the occurrence of PFDF at regional scale with a higher degree of accuracy than the commonly used approaches based on power-law thresholds and logistic regression procedures. The work presented is based on a recently released database from the USGS that reports a total of 1500 storms that did or did not trigger PFDF in a number of fire-affected catchments in the Western United States. The database includes information on storm characteristics (duration, accumulation, max intensity, etc.) and other auxiliary information on land surface properties (soil erodibility index, local slope, etc.). Results show that the proposed model is able to achieve a satisfactory prediction accuracy (threat score > 0.6), superior to previously published prediction frameworks, highlighting the potential of nonparametric statistical techniques for the development of PFDF prediction systems.
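
    A schematic of the random-forest workflow described above, using synthetic storm records in place of the USGS database; the feature names, values, and triggering rule are invented, and the threat score is computed as hits / (hits + misses + false alarms).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 1500
# Synthetic stand-ins for storm/catchment predictors (NOT the real USGS data)
duration = rng.uniform(0.5, 24, n)              # h
max_intensity = rng.uniform(1, 60, n)           # mm/h
erodibility = rng.uniform(0.1, 0.6, n)          # soil erodibility index
slope = rng.uniform(5, 45, n)                   # degrees
X = np.column_stack([duration, max_intensity, erodibility, slope])
# Toy triggering rule plus noise, just to give the forest something to learn
y = ((max_intensity * erodibility * slope / 45) > 6) ^ (rng.random(n) < 0.05)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

hits = np.sum(pred & y_te)
misses = np.sum(~pred & y_te)
false_alarms = np.sum(pred & ~y_te)
print("threat score:", round(hits / (hits + misses + false_alarms), 2))
```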

  7. Defining the Simulation Technician Role: Results of a Survey-Based Study.

    Science.gov (United States)

    Bailey, Rachel; Taylor, Regina G; FitzGerald, Michael R; Kerrey, Benjamin T; LeMaster, Thomas; Geis, Gary L

    2015-10-01

    In health care simulation, simulation technicians perform multiple tasks to support various educational offerings. Technician responsibilities and the tasks that accompany them seem to vary between centers. The objectives were to identify the range and frequency of tasks that technicians perform and to determine if there is a correspondence between what technicians do and what they feel their responsibilities should be. We hypothesized that there is a core set of responsibilities and tasks for the technician position regardless of background, experience, and type of simulation center. We conducted a prospective, survey-based study of individuals currently functioning in a simulation technician role in a simulation center. This survey was designed internally and piloted within 3 academic simulation centers. Potential respondents were identified through a national mailing list, and the survey was distributed electronically during a 3-week period. A survey request was sent to 280 potential participants, 136 (49%) responded, and 73 met inclusion criteria. Five core tasks were identified as follows: equipment setup and breakdown, programming scenarios into software, operation of software during simulation, audiovisual support for courses, and on-site simulator maintenance. Independent of background before they were hired, technicians felt unprepared for their role once taking the position. Formal training was identified as a need; however, the majority of technicians felt experience over time was the main contributor toward developing knowledge and skills within their role. This study represents a first step in defining the technician role within simulation-based education and supports the need for the development of a formal job description to allow recruitment, development, and certification.

  8. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes.

    Science.gov (United States)

    Chien, Tsair-Wei; Lin, Weir-Sen

    2016-03-02

    The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program applying the Rasch partial credit model to simulate 1000 patients' true scores following a standard normal distribution. The CAT was compared to two other scenarios, answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. We found that the CAT can be more efficient for patients answering questions (i.e., fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering innovative QR code access.
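
    A toy version of the adaptive loop evaluated in the study: items are selected whose (dichotomised) Rasch difficulty is closest to the current ability estimate, and the ability is re-estimated after each response. The item bank, true ability, and stopping rule are simulated placeholders, not the NHS questionnaire's actual partial-credit parameters.

```python
import numpy as np

rng = np.random.default_rng(5)
difficulties = rng.normal(0, 1, 70)            # simulated item bank of 70 items
theta_true = 0.7                               # simulated respondent ability

def rasch_p(theta, b):
    """Probability of a positive response under the dichotomous Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def estimate_theta(responses, bs, grid=np.linspace(-4, 4, 801)):
    """Grid-search maximum-likelihood estimate of ability."""
    ll = np.zeros_like(grid)
    for r, b in zip(responses, bs):
        p = rasch_p(grid, b)
        ll += r * np.log(p) + (1 - r) * np.log(1 - p)
    return grid[np.argmax(ll)]

asked, responses, theta = [], [], 0.0
for _ in range(15):                            # stop after 15 items (assumed rule)
    remaining = [i for i in range(len(difficulties)) if i not in asked]
    nxt = min(remaining, key=lambda i: abs(difficulties[i] - theta))  # most informative item
    asked.append(nxt)
    responses.append(int(rng.random() < rasch_p(theta_true, difficulties[nxt])))
    theta = estimate_theta(responses, difficulties[asked])

print(f"estimated ability {theta:.2f} after {len(asked)} of {len(difficulties)} items")
```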

  9. Proposing a Wiki-Based Technique for Collaborative Essay Writing

    Directory of Open Access Journals (Sweden)

    Mabel Ortiz Navarrete

    2014-10-01

    Full Text Available This paper aims at proposing a technique for students learning English as a foreign language when they collaboratively write an argumentative essay in a wiki environment. A wiki environment and collaborative work play an important role within the academic writing task. Nevertheless, an appropriate and systematic work assignment is required in order to make use of both. The technique proposed in this paper for writing a collaborative essay mainly attempts to provide the most effective way to enhance equal participation among group members, taking computer-mediated collaboration as a base. Within this context, the students' role is clearly defined and individual and collaborative tasks are explained.

  10. Diabetes incidence and projections from prevalence surveys in Fiji.

    Science.gov (United States)

    Morrell, Stephen; Lin, Sophia; Tukana, Isimeli; Linhart, Christine; Taylor, Richard; Vatucawaqa, Penina; Magliano, Dianna J; Zimmet, Paul

    2016-11-25

    Type 2 diabetes mellitus (T2DM) incidence is traditionally derived from cohort studies that are not always feasible, representative, or available. The present study estimates T2DM incidence in Fijian adults from T2DM prevalence estimates assembled from surveys of 25-64 year old adults conducted over 30 years (n = 14,288). T2DM prevalence by five-year age group from five population-based risk factor surveys conducted over 1980-2011 was variously adjusted for urban-rural residency, ethnicity, and sex to previous censuses (1976, 1986, 1996, 2009) to improve representativeness. Prevalence estimates were then used to calculate T2DM incidence based on birth cohorts from the age-period (Lexis) matrix following the Styblo technique, first used to estimate the annual risk of tuberculosis infection (incidence) from sequential Mantoux population surveys. Poisson regression of year, age, sex, and ethnicity strata (n = 160) was used to develop projections of T2DM prevalence and incidence to 2020 based on various scenarios of change in population weight measured by body mass index (BMI). T2DM prevalence and annual incidence increased in Fiji over 1980-2011. Prevalence was higher in Indians and men than in i-Taukei and women. Incidence was higher in Indians and women. From the regression analyses, absolute reductions of 2.6 to 5.1% in T2DM prevalence (13-26% lower), and 0.5-0.9 per 1000 person-years in incidence (8-14% lower), could be expected in 2020 in adults if mean population weight could be reduced by 1-4 kg, compared to the current period trend in weight gain. This is the first application of the Styblo technique to calculate T2DM incidence from population-based prevalence surveys over time. Reductions in population BMI are predicted to reduce T2DM incidence and prevalence in Fiji among adults aged 25-64 years.
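
    The core arithmetic of deriving incidence from repeated prevalence surveys can be illustrated, under strongly simplifying assumptions (closed birth cohort, irreversible diagnosis, constant hazard between surveys, no differential mortality), by treating the non-diabetic fraction as an exponentially decaying pool. This is a didactic approximation of the cohort-based Lexis-matrix calculation, not the exact Styblo procedure used in the paper.

```python
import math

def incidence_from_prevalence(p1, p2, years):
    """
    Approximate annual incidence (per person-year) for a birth cohort observed in two
    prevalence surveys 'years' apart, assuming a constant hazard and no remission:
        (1 - p2) = (1 - p1) * exp(-lam * years)
    """
    return math.log((1.0 - p1) / (1.0 - p2)) / years

# Hypothetical prevalences for the same birth cohort in two surveys 10 years apart
p_survey_1, p_survey_2 = 0.10, 0.18
lam = incidence_from_prevalence(p_survey_1, p_survey_2, 10)
print(f"approximate incidence: {lam * 1000:.1f} per 1000 person-years")
```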

  11. Experimental evaluation of a quasi-modal parameter based rotor foundation identification technique

    Science.gov (United States)

    Yu, Minli; Liu, Jike; Feng, Ningsheng; Hahn, Eric J.

    2017-12-01

    Correct modelling of the foundation of rotating machinery is an invaluable asset in model-based rotor dynamic studies. One attractive approach for this purpose is to identify the relevant modal parameters of an equivalent foundation using motion measurements of the rotor and foundation at the bearing supports. Previous research showed that a complex quasi-modal parameter based system identification technique could be feasible for this purpose; however, the technique was only validated by identifying simple structures under harmonic excitation. In this paper, this identification technique is further extended and evaluated by identifying the foundation of a numerical rotor-bearing-foundation system and an experimental rotor rig, respectively. In the identification of a rotor foundation with multiple bearing supports, all application points of the excitation forces transmitted through the bearings need to be included; however, the assumed vibration modes far outside the rotor operating speed range cannot, or need not, be identified. The extended identification technique allows one to correctly identify an equivalent foundation with fewer modes than the assumed number of degrees of freedom, essentially by generalising the technique to handle rectangular complex modal matrices. The extended technique proved robust in numerical and experimental validation and is therefore likely to be applicable in the field.

  12. A Survey of Librarian Perceptions of Information Literacy Techniques

    Science.gov (United States)

    Yearwood, Simone L.; Foasberg, Nancy M.; Rosenberg, Kenneth D.

    2015-01-01

    Teaching research competencies and information literacy is an integral part of the academic librarian's role. There has long been debate among librarians over what are the most effective methods of instruction for college students. Library Faculty members at a large urban university system were surveyed to determine their perceptions of the…

  13. Efficiencies of Internet-based digital and paper-based scientific surveys and the estimated costs and time for different-sized cohorts.

    Directory of Open Access Journals (Sweden)

    Constantin E Uhlig

    Full Text Available To evaluate the relative efficiencies of five Internet-based digital and three paper-based scientific surveys and to estimate the costs for different-sized cohorts. Invitations to participate in a survey were distributed via e-mail to employees of two university hospitals (E1 and E2) and to members of a medical association (E3), as a link placed in a special text on the municipal homepage regularly read by the administrative employees of two cities (H1 and H2), and in paper form to workers at an automobile enterprise (P1) and to college (P2) and senior (P3) students. The main parameters analyzed included the numbers of invited and actual participants, and the time and cost to complete the survey. Statistical analysis was descriptive, except for the Kruskal-Wallis H test, which was used to compare the three recruitment methods. Cost efficiencies were compared and extrapolated to different-sized cohorts. The ratios of completely answered to distributed questionnaires were between 81.5% (E1) and 97.4% (P2). Between 6.4% (P1) and 57.0% (P2) of the invited participants completely answered the questionnaires. The costs per completely answered questionnaire were $0.57-$1.41 (E1-3), $1.70 and $0.80 for H1 and H2, respectively, and $3.36-$4.21 (P1-3). Based on our results, electronic surveys with 10, 20, 30, or 42 questions would be estimated to be most cost (and time) efficient if more than 101.6-225.9 (128.2-391.7), 139.8-229.2 (93.8-193.6), 165.8-230.6 (68.7-115.7), or 188.2-231.5 (44.4-72.7) participants were required, respectively. The study efficiency depended on the technical modalities of the survey methods and the engagement of the participants. Based on our study design, our results suggest that in similar projects, which will certainly have more than two to three hundred required participants, the most efficient way of conducting a questionnaire-based survey is likely via the Internet with a digital questionnaire, specifically via a centralized e-mail.

  14. Mass spectrometry. [review of techniques]

    Science.gov (United States)

    Burlingame, A. L.; Kimble, B. J.; Derrick, P. J.

    1976-01-01

    Advances in mass spectrometry (MS) and its applications over the past decade are reviewed in depth, with annotated literature references. New instrumentation and techniques surveyed include: modulated-beam MS, chromatographic MS on-line computer techniques, digital computer-compatible quadrupole MS, selected ion monitoring (mass fragmentography), and computer-aided management of MS data and interpretation. Areas of application surveyed include: organic MS and electron impact MS, field ionization kinetics, appearance potentials, translational energy release, studies of metastable species, photoionization, calculations of molecular orbitals, chemical kinetics, field desorption MS, high pressure MS, ion cyclotron resonance, biochemistry, medical/clinical chemistry, pharmacology, and environmental chemistry and pollution studies.

  15. A survey tool for measuring evidence-based decision making capacity in public health agencies

    Directory of Open Access Journals (Sweden)

    Jacobs Julie A

    2012-03-01

    Full Text Available Abstract Background While increasing attention is placed on using evidence-based decision making (EBDM) to improve public health, there is little research assessing the current EBDM capacity of the public health workforce. Public health agencies serve a wide range of populations with varying levels of resources. Our survey tool allows an individual agency to collect data that reflect its unique workforce. Methods Health department leaders and academic researchers collaboratively developed and conducted cross-sectional surveys in Kansas and Mississippi (USA) to assess EBDM capacity. Surveys were delivered to state- and local-level practitioners and community partners working in chronic disease control and prevention. The core component of the surveys was adopted from a previously tested instrument and measured gaps (importance versus availability) in competencies for EBDM in chronic disease. Other survey questions addressed expectations and incentives for using EBDM, self-efficacy in three EBDM skills, and estimates of EBDM within the agency. Results In both states, participants identified communication with policymakers, use of economic evaluation, and translation of research to practice as top competency gaps. Self-efficacy in developing evidence-based chronic disease control programs was lower than in finding or using data. Public health practitioners estimated that approximately two-thirds of programs in their agency were evidence-based. Mississippi participants indicated that health department leaders' expectations for the use of EBDM were approximately twice those of co-workers and that the use of EBDM could be increased with training and leadership prioritization. Conclusions The assessment of EBDM capacity in Kansas and Mississippi built upon previous nationwide findings to identify top gaps in core competencies for EBDM in chronic disease and to estimate the percentage of programs in U.S. health departments that are evidence-based.

  16. Medical Students' Experiences with Addicted Patients: A Web-Based Survey

    Science.gov (United States)

    Midmer, Deana; Kahan, Meldon; Wilson, Lynn

    2008-01-01

    Project CREATE was an initiative to strengthen undergraduate medical education in addictions. As part of a needs assessment, forty-six medical students at Ontario's five medical schools completed a bi-weekly, interactive web-based survey about addiction-related learning events. In all, 704 unique events were recorded, for an average of 16.7…

  17. The Hannibal Community Survey; A Case Study in a Community Development Technique.

    Science.gov (United States)

    Croll, John A.

    Disturbed by the community's negative attitude toward its prospects for progress, the Hannibal (Missouri) Chamber of Commerce initiated a community self-survey to improve the situation. The questionnaire survey concentrated on felt needs relating to city government, retail facilities and services, recreation, religion, education, industrial…

  18. Study of capillary absorption kinetics by X-ray CT imaging techniques: a survey on sedimentary rocks of Sicily

    Directory of Open Access Journals (Sweden)

    Tiziano Schillaci

    2008-04-01

    Full Text Available Sedimentary rocks are natural porous materials with a large proportion of microscopic interconnected pores: they contain fluids and permit their movement on a macroscopic scale. Generally, these rocks present higher porosity than metamorphic rocks. From certain points of view, this feature represents an advantage; on the other hand, it can constitute an obstacle for cultural heritage applications, because the degree of porosity can lead to deterioration of stone monuments through capillary water absorption. In this paper, CT (Computerized Tomography) imaging techniques are applied to capillary absorption kinetics in the sedimentary rocks used for Greek temples as well as baroque monuments, located in western and southeastern Sicily respectively. Rocks were sampled near the archaeological areas of Agrigento, Segesta, Selinunte and Val di Noto. CT images were acquired at different times, before and after water contact, using image processing techniques during the acquisition as well as the post-processing phases. Water distribution within the pore spaces has been evaluated on the basis of the Hounsfield number, estimated for the 3-D voxel structure of the samples. For most of the considered samples, assumptions based on the Handy model allow the average height of the wetting front to be correlated with the square root of time. Stochastic equations were introduced in order to describe the percolative behaviour of water in heterogeneous samples, such as the Agrigento one. Before the CT acquisition, an estimate of the capillary absorption kinetics was carried out by the gravimetric method. A petrographical characterization of the samples has been performed by stereomicroscope observations, while porosity and pore morphology have been surveyed by SEM (Scanning Electron Microscope) images. Furthermore, the proposed methods have also made it possible to determine the penetration depth and distribution uniformity of materials used for the restoration and conservation of historical monuments.
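
    The Handy-type relation mentioned above, in which the average height of the wetting front grows with the square root of time, can be fitted to gravimetric or CT-derived data in a few lines. The time and height values below are assumed, illustrative measurements, not data from the study.

```python
import numpy as np

# Sketch of the square-root-of-time law h(t) = k * sqrt(t) used above,
# fitted to assumed synthetic measurements.
t = np.array([0.5, 1, 2, 4, 8, 16])            # hours (assumed)
h = np.array([1.4, 2.1, 2.9, 4.2, 5.8, 8.3])   # cm, assumed wetting-front heights

k = np.sum(h * np.sqrt(t)) / np.sum(t)         # least-squares slope through the origin
print(f"sorptivity-like coefficient k = {k:.2f} cm/h^0.5")
print("predicted h at t = 25 h:", k * np.sqrt(25.0))
```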

  19. Estimating leptospirosis incidence using hospital-based surveillance and a population-based health care utilization survey in Tanzania.

    Directory of Open Access Journals (Sweden)

    Holly M Biggs

    Full Text Available The incidence of leptospirosis, a neglected zoonotic disease, is uncertain in Tanzania and much of sub-Saharan Africa, resulting in scarce data on which to prioritize resources for public health interventions and disease control. In this study, we estimate the incidence of leptospirosis in two districts in the Kilimanjaro Region of Tanzania. We conducted a population-based household health care utilization survey in two districts in the Kilimanjaro Region of Tanzania and identified leptospirosis cases at two hospital-based fever sentinel surveillance sites in the Kilimanjaro Region. We used multipliers derived from the health care utilization survey and case numbers from hospital-based surveillance to calculate the incidence of leptospirosis. A total of 810 households were enrolled in the health care utilization survey and multipliers were derived based on responses to questions about health care seeking in the event of febrile illness. Of patients enrolled in fever surveillance over a 1-year period and residing in the 2 districts, 42 (7.14%) of 588 met the case definition for confirmed or probable leptospirosis. After applying multipliers to account for hospital selection, test sensitivity, and study enrollment, we estimated that the overall incidence of leptospirosis ranges from 75 to 102 cases per 100,000 persons annually. We calculated a high incidence of leptospirosis in two districts in the Kilimanjaro Region of Tanzania, where leptospirosis incidence was previously unknown. Multiplier methods, such as used in this study, may be a feasible way of improving the availability of incidence estimates for neglected diseases, such as leptospirosis, in resource-constrained settings.
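
    The multiplier logic described above, scaling observed hospital cases by the inverse of the probabilities of seeking care, attending a sentinel site, being enrolled and testing positive, can be sketched as below. All probabilities and the catchment population are assumed placeholders, not the study's actual multipliers.

```python
# Hedged sketch of a multiplier-based incidence estimate; the numbers below
# are illustrative assumptions, not the study's actual data.
confirmed_or_probable_cases = 42            # cases from the 1-year surveillance period
catchment_population = 500_000              # assumed

p_seek_hospital_care = 0.55                 # from a household survey (assumed)
p_attend_sentinel_hospital = 0.60           # hospital selection (assumed)
p_enrolled = 0.80                           # study enrollment (assumed)
test_sensitivity = 0.85                     # assumed

multiplier = 1.0 / (p_seek_hospital_care * p_attend_sentinel_hospital
                    * p_enrolled * test_sensitivity)
estimated_cases = confirmed_or_probable_cases * multiplier
incidence_per_100k = estimated_cases * 100_000 / catchment_population
print(f"estimated incidence: {incidence_per_100k:.0f} per 100,000 per year")
```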

  20. Estimating Leptospirosis Incidence Using Hospital-Based Surveillance and a Population-Based Health Care Utilization Survey in Tanzania

    Science.gov (United States)

    Biggs, Holly M.; Hertz, Julian T.; Munishi, O. Michael; Galloway, Renee L.; Marks, Florian; Saganda, Wilbrod; Maro, Venance P.; Crump, John A.

    2013-01-01

    Background The incidence of leptospirosis, a neglected zoonotic disease, is uncertain in Tanzania and much of sub-Saharan Africa, resulting in scarce data on which to prioritize resources for public health interventions and disease control. In this study, we estimate the incidence of leptospirosis in two districts in the Kilimanjaro Region of Tanzania. Methodology/Principal Findings We conducted a population-based household health care utilization survey in two districts in the Kilimanjaro Region of Tanzania and identified leptospirosis cases at two hospital-based fever sentinel surveillance sites in the Kilimanjaro Region. We used multipliers derived from the health care utilization survey and case numbers from hospital-based surveillance to calculate the incidence of leptospirosis. A total of 810 households were enrolled in the health care utilization survey and multipliers were derived based on responses to questions about health care seeking in the event of febrile illness. Of patients enrolled in fever surveillance over a 1 year period and residing in the 2 districts, 42 (7.14%) of 588 met the case definition for confirmed or probable leptospirosis. After applying multipliers to account for hospital selection, test sensitivity, and study enrollment, we estimated the overall incidence of leptospirosis ranges from 75–102 cases per 100,000 persons annually. Conclusions/Significance We calculated a high incidence of leptospirosis in two districts in the Kilimanjaro Region of Tanzania, where leptospirosis incidence was previously unknown. Multiplier methods, such as used in this study, may be a feasible method of improving availability of incidence estimates for neglected diseases, such as leptospirosis, in resource constrained settings. PMID:24340122

  1. Nitrous oxide-based techniques versus nitrous oxide-free techniques for general anaesthesia.

    Science.gov (United States)

    Sun, Rao; Jia, Wen Qin; Zhang, Peng; Yang, KeHu; Tian, Jin Hui; Ma, Bin; Liu, Yali; Jia, Run H; Luo, Xiao F; Kuriyama, Akira

    2015-11-06

    Nitrous oxide has been used for over 160 years for the induction and maintenance of general anaesthesia. It has been used as a sole agent but is most often employed as part of a technique using other anaesthetic gases, intravenous agents, or both. Its low tissue solubility (and therefore rapid kinetics), low cost, and low rate of cardiorespiratory complications have made nitrous oxide by far the most commonly used general anaesthetic. The accumulating evidence regarding adverse effects of nitrous oxide administration has led many anaesthetists to question its continued routine use in a variety of operating room settings. Adverse events may result from both the biological actions of nitrous oxide and the fact that to deliver an effective dose, nitrous oxide, which is a relatively weak anaesthetic agent, needs to be given in high concentrations that restrict oxygen delivery (for example, a common mixture is 30% oxygen with 70% nitrous oxide). As well as the risk of low blood oxygen levels, concerns have also been raised regarding the risk of compromising the immune system, impaired cognition, postoperative cardiovascular complications, bowel obstruction from distention, and possible respiratory compromise. To determine if nitrous oxide-based anaesthesia results in similar outcomes to nitrous oxide-free anaesthesia in adults undergoing surgery. We searched the Cochrane Central Register of Controlled Trials (CENTRAL; 2014 Issue 10); MEDLINE (1966 to 17 October 2014); EMBASE (1974 to 17 October 2014); and ISI Web of Science (1974 to 17 October 2014). We also searched the reference lists of relevant articles, conference proceedings, and ongoing trials up to 17 October 2014 on specific websites (http://clinicaltrials.gov/, http://controlled-trials.com/, and http://www.centerwatch.com). We included randomized controlled trials (RCTs) comparing general anaesthesia where nitrous oxide was part of the anaesthetic technique used for the induction or maintenance of general

  2. Techniques, processes, and measures for software safety and reliability

    International Nuclear Information System (INIS)

    Sparkman, D.

    1992-01-01

    The purpose of this report is to provide a detailed survey of current recommended practices and measurement techniques for the development of reliable and safe software-based systems. This report is intended to assist the United States Office of Nuclear Reactor Regulation (NRR) in determining the importance and maturity of the available techniques and in assessing the relevance of individual standards for application to instrumentation and control systems in nuclear power generating stations. Lawrence Livermore National Laboratory (LLNL) provides technical support for the Instrumentation and Control System Branch (ICSB) of NRR in advanced instrumentation and control systems, distributed digital systems, software reliability, and the application of verification and validation for the development of software.

  3. Interim status of the Texas uranium survey experiment

    International Nuclear Information System (INIS)

    Jupiter, C.; Wollenberg, H.

    1974-01-01

    The objective of the Texas uranium survey experiment is to evaluate an improved method for prospecting for uranium by determining correlations among (a) geologic analysis, (b) soil sample radiochemical analysis, (c) aerial radiometric data, (d) aerial infrared scans, and (e) aerophotographic data. Although aerial radiometric measurements have been used previously in mineral prospecting, the development of useful correlative techniques based on analysis of data from large terrestrial areas employing the five parameters (a through e, above) remains to be evaluated, and could be of significant value in establishing the uranium resource pools needed to address the nation's energy crisis. The Texas uranium survey field experiment began on June 13, 1973, employing a Martin-404 aircraft to fly gamma-ray recording equipment at a 500-ft altitude over two areas in southeast Texas. The areas surveyed are referred to as the Dubose area (approximately 120 sq. miles) and the Clay West area (approximately 24 sq. miles). This document briefly summarizes the work which has been done, describes the kind and quality of the calibrations and data analysis carried out thus far, and outlines recommended additional work which would bring the experiment to some degree of completion, providing a basis for evaluating the techniques.

  4. A new simple technique for improving the random properties of chaos-based cryptosystems

    Science.gov (United States)

    Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.

    2018-03-01

    A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into short-period cycles due to digitization. In order to test this technique, a stream cipher based on a Skew Tent Map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have proved that our method can considerably improve the randomness of the generated keystreams. In order to incorporate our randomness-enhancement technique, only 41 extra slices were needed, proving that, besides being effective, this method is also efficient in terms of area and hardware resources.
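
    A minimal software sketch of the kind of system described above is given below: a fixed-point skew tent map keystream generator whose low-order bits are periodically perturbed (here by a small LFSR) so that digitization does not trap the state in a short cycle. The map parameters, word width, perturbation period and LFSR are all illustrative assumptions, not the authors' FPGA design.

```python
# Minimal sketch (not the authors' exact design) of a fixed-point skew tent map
# keystream generator with a small periodic perturbation of the least
# significant bits to break the short cycles that digitization can introduce.
WIDTH = 32
MASK = (1 << WIDTH) - 1

def skew_tent(x, a):
    # x and a are WIDTH-bit fixed-point values interpreted as numbers in [0, 1)
    if x < a:
        return ((x << WIDTH) // a) & MASK
    return (((MASK - x) << WIDTH) // (MASK - a)) & MASK

def keystream(seed, a, n, perturb_every=64):
    x, out = seed & MASK, []
    lfsr = 0xACE1                    # 16-bit LFSR used as the perturbation source
    for i in range(n):
        x = skew_tent(x, a)
        if i % perturb_every == 0:
            bit = ((lfsr >> 0) ^ (lfsr >> 2) ^ (lfsr >> 3) ^ (lfsr >> 5)) & 1
            lfsr = (lfsr >> 1) | (bit << 15)
            x ^= lfsr & 0xFF         # flip a few low-order bits of the state
        out.append(x & 0xFF)         # emit the low byte as keystream
    return out

print(keystream(seed=0x12345678, a=0x9E3779B9, n=8))
```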

  5. A comparison of morphological and molecular-based surveys to estimate the species richness of Chaetoceros and Thalassiosira (Bacillariophyta) in the Bay of Fundy.

    Directory of Open Access Journals (Sweden)

    Sarah E Hamsher

    Full Text Available The goal of this study was to compare the ability of morphology- and molecular-based surveys to estimate species richness for two species-rich diatom genera, Chaetoceros Ehrenb. and Thalassiosira Cleve, in the Bay of Fundy. Phytoplankton tows were collected from two sites at intervals over two years and subsampled for morphology-based surveys (2010, 2011), a culture-based DNA reference library (DRL; 2010), and a molecular-based survey (2011). The DRL and molecular-based survey utilized the 3' end of the RUBISCO large subunit (rbcL-3P) to identify genetic species groups (based on 0.1% divergence in rbcL-3P), which were subsequently identified morphologically to allow comparisons to the morphology-based survey. Comparisons were compiled for the year (2011) by site (n = 2) and by season (n = 3). Of the 34 taxa included in the comparisons, 50% of taxa were common to both methods, 35% were unique to the molecular-based survey, and 12% were unique to the morphology-based survey, while the remaining 3% of taxa were unidentified genetic species groups. The morphology-based survey excelled at identifying rare taxa in individual tow subsamples, which were occasionally missed with the molecular approach used here, while the molecular methods (the DRL and molecular-based survey) uncovered nine cryptic species pairs and four previously overlooked species. The latter were typically difficult to identify and were generically assigned to Thalassiosira spp. during the morphology-based survey. Therefore, for now we suggest a combined approach encompassing routine morphology-based surveys accompanied by periodic molecular-based surveys to monitor for cryptic and difficult-to-identify taxa. As sequencing technologies improve, molecular-based surveys should become routine, leading to a more accurate representation of species composition and richness in monitoring programs.

  6. Agent-based simulation in management and organizational studies: a survey

    Directory of Open Access Journals (Sweden)

    Nelson Alfonso Gómez-Cruz

    2017-10-01

    Full Text Available Purpose - The purpose of this paper is to provide a comprehensive survey of the literature about the use of agent-based simulation (ABS in the study of organizational behavior, decision making, and problem-solving. It aims at contributing to the consolidation of ABS as a field of applied research in management and organizational studies. Design/methodology/approach - The authors carried out a non-systematic search in literature published between 2000 and 2016, by using the keyword “agent-based” to search through Scopus’ business, management and accounting database. Additional search criteria were devised using the papers’ keywords and the categories defined by the divisions and interest groups of the Academy of Management. The authors found 181 articles for this survey. Findings - The survey shows that ABS provides a robust and rigorous framework to elaborate descriptions, explanations, predictions and theories about organizations and their processes as well as develop tools that support strategic and operational decision making and problem-solving. The authors show that the areas that report the highest number of applications are operations and logistics (37 percent, marketing (17 percent and organizational behavior (14 percent. Originality/value - The paper illustrates the increasingly prominent role of ABS in fields such as organizational behavior, strategy, human resources, marketing and logistics. To-date, this is the most complete survey about ABS in all management areas.

  7. Application of ranging technique of radar level meter for draft survey

    Directory of Open Access Journals (Sweden)

    SHEN Yijun

    2017-12-01

    Full Text Available [Objectives] This paper aims to solve the problems of high subjectivity and low accuracy and efficiency in draft surveying that relies on human visual inspection. [Methods] Radar level-measurement products for oil and other liquids are widely used in the petrochemical industry. A device is developed that uses radar to survey the draft of a vessel, designed with data-series optimization formulae to ensure that the results are accurate. At the same time, a test is designed to prove the accuracy of the results. [Results] According to the conditions of the ship, the device is composed of a radar sensor, triangular bracket and display, and is put to use in the test. [Conclusions] With 15 vessels as the research objects, the comparison experiment shows differences ranging between 0.001 and 0.022 meters, with an average difference rate of 0.028%, which meets the requirements for ship draft survey accuracy.

  8. DEVELOPMENT AND EVALUATION OF TECHNOLOGY EDUCATION USING EARTH OBSERVATION TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Ito

    2012-07-01

    Full Text Available In the present study, we introduce to secondary education an Earth observation technique using synthetic aperture radar (SAR. The goal is to increase interest in and raise the awareness of students in the Earth observation technique through practical activities. A curriculum is developed based on the result of questionnaire surveys of school teachers. The curriculum is composed of 16 units. Teaching materials related to the Earth observation technique are researched and developed. We designed a visual SAR processor and a small corner reflector (CR as a new teaching technique. In teaching sessions at secondary school, the developed teaching materials and software were used effectively. In observation experiments, students set up CRs that they had built, and ALOS PALSAR was able to clearly observe all of the CRs. The proposed curriculum helped all of the students to understand the usefulness of the Earth observation technique.

  9. Analysis of Employee's Survey for Preventing Human-Errors

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Chanho; Kim, Younggab; Joung, Sanghoun [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Human errors in nuclear power plants can cause large and small events or incidents. These events or incidents are among the main contributors to reactor trips and might threaten the safety of nuclear plants. To prevent human errors, KHNP (Korea Hydro & Nuclear Power) introduced 'human-error prevention techniques' and has applied them to main areas such as plant operation, operation support, and maintenance and engineering. This paper proposes methods to prevent and reduce human errors in nuclear power plants by analyzing survey results covering the utilization of the human-error prevention techniques and the employees' awareness of preventing human errors. With regard to human-error prevention, this survey analysis presented the status of the human-error prevention techniques and the employees' awareness of preventing human errors. Employees' understanding and utilization of the techniques were generally high, and the level of employee training and its effect on actual work were good. Also, employees answered that the root causes of human error were due to the working environment, including tight processes, manpower shortages, and excessive workload, rather than personal negligence or lack of personal knowledge. Consideration of the working environment is certainly needed. At present, based on this survey, the best methods of preventing human error are personal equipment, substantial training and education, personal mental health checks before starting work, prohibition of performing multiple tasks simultaneously, compliance with procedures, and enhancement of job-site review. However, the most important and basic factors for preventing human error are the engagement of workers and an organizational atmosphere that supports communication between managers and workers and between employees and their supervisors.

  10. Guidelines for a Training Course in Noise Survey Techniques.

    Science.gov (United States)

    Shadley, John; And Others

    The course is designed to train noise survey technicians during a 3-5 day period to make reliable measurements of 75 percent of the noise problems encountered in the community. The more complex noise problems remaining will continue to be handled by experienced specialists. These technicians will be trained to assist State and local governments in…

  11. Electromagnetism based atmospheric ice sensing technique - A conceptual review

    Directory of Open Access Journals (Sweden)

    U Mughal

    2016-09-01

    Full Text Available Electromagnetic and vibrational properties of ice can be used to measure certain parameters such as ice thickness, type and icing rate. In this paper we present a review of the dielectric based measurement techniques for matter and the dielectric/spectroscopic properties of ice. Atmospheric Ice is a complex material with a variable dielectric constant, but precise calculation of this constant may form the basis for measurement of its other properties such as thickness and strength using some electromagnetic methods. Using time domain or frequency domain spectroscopic techniques, by measuring both the reflection and transmission characteristics of atmospheric ice in a particular frequency range, the desired parameters can be determined.

  12. Designing and conducting survey research a comprehensive guide

    CERN Document Server

    Rea, Louis M

    2014-01-01

    The industry standard guide, updated with new ideas and SPSS analysis techniques Designing and Conducting Survey Research: A Comprehensive Guide Fourth Edition is the industry standard resource that covers all major components of the survey process, updated to include new data analysis techniques and SPSS procedures with sample data sets online. The book offers practical, actionable guidance on constructing the instrument, administrating the process, and analyzing and reporting the results, providing extensive examples and worksheets that demonstrate the appropriate use of survey and data tech

  13. Vertical Cable Seismic Survey for Hydrothermal Deposit

    Science.gov (United States)

    Asakawa, E.; Murakami, F.; Sekino, Y.; Okamoto, T.; Ishikawa, K.; Tsukahara, H.; Shimura, T.

    2012-04-01

    The vertical cable seismic is one of the reflection seismic methods. It uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by surface, deep-towed or ocean-bottom sources. By analyzing the reflections from below the seabed, we can look into the subsurface structure. This type of survey is generally called VCS (Vertical Cable Seismic). Because VCS is an efficient high-resolution 3D seismic survey method for a spatially bounded area, we proposed the method for the hydrothermal deposit survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We are now developing a VCS system, including not only data acquisition hardware but also data processing and analysis techniques. Our first VCS survey experiment was carried out in Lake Biwa, Japan, in November 2009 as a feasibility study. Prestack depth migration is applied to the 3D VCS data to obtain a high-quality 3D depth volume. Based on the results of the feasibility study, we have developed two autonomous recording VCS systems. We then carried out a trial experiment in the open ocean at a water depth of about 400 m, followed by a second VCS survey at Iheya Knoll with a deep-towed source. In this survey, we established the procedures for the deployment and recovery of the system and examined the locations and fluctuations of the vertical cables at a water depth of around 1000 m. The acquired VCS data clearly show reflections from the sub-seafloor. Through the experiment, we confirmed that our VCS system works well even in the severe conditions around seafloor hydrothermal deposits. We have, however, also confirmed that uncertainty in the locations of the source and of the hydrophones can lower the quality of the subsurface image. It is therefore strongly necessary to develop a total survey system that ensures accurate positioning and reliable deployment techniques.

  14. Progress of new label-free techniques for biosensors: a review.

    Science.gov (United States)

    Sang, Shengbo; Wang, Yajun; Feng, Qiliang; Wei, Ye; Ji, Jianlong; Zhang, Wendong

    2016-01-01

    The detection techniques used in biosensors can be broadly classified into label-based and label-free. Label-based detection relies on the specific properties of labels for detecting a particular target. In contrast, label-free detection is suitable for target molecules that are not labeled or for the screening of analytes that are not easy to tag. More types of label-free biosensors have also emerged with developments in biotechnology. The most recently developed label-free biosensor techniques, such as field-effect transistor (FET) based biosensors including carbon nanotube FET biosensors, graphene FET biosensors and silicon nanowire FET biosensors, magnetoelastic biosensors, optical biosensors, surface-stress-based biosensors and other types of biosensors based on nanotechnology, are discussed. The sensing principles, configurations, sensing performance, applications, advantages and restrictions of the different label-free biosensors are considered and discussed in this review. Most concepts included in this survey could certainly be applied to the development of this kind of biosensor in the future.

  15. Generation of Quasi-Gaussian Pulses Based on Correlation Techniques

    Directory of Open Access Journals (Sweden)

    POHOATA, S.

    2012-02-01

    Full Text Available Gaussian pulses have mostly been used in communications, where some applications can be emphasized: mobile telephony (GSM), where GMSK signals are used, as well as UWB communications, where short-duration pulses based on the Gaussian waveform are generated. Since the Gaussian function is a theoretical concept that cannot be realized exactly from the physical point of view, it must be approximated by functions that admit physical implementation. New techniques for generating Gaussian pulse responses with good precision are proposed and investigated in this paper. The second- and third-order derivatives of the Gaussian pulse response are accurately generated. The third-order derivative is composed of four individual rectangular pulses of fixed amplitude, which are easy to generate by standard techniques. In order to generate pulses able to satisfy spectral mask requirements, an adequate filter must be applied. This paper presents a comparative analysis based on the relative error and the energy spectra of the proposed pulses.
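
    The four-rectangle approximation of the third-order Gaussian derivative described above can be illustrated numerically. The pulse width, and the positions and amplitudes of the rectangles below, are rough assumptions chosen to mimic the four lobes of the derivative; they are not the parameters used in the paper.

```python
import numpy as np

# Sketch (assumed parameters) of a Gaussian pulse, its numerical derivatives,
# and a crude approximation of the third derivative by four fixed-amplitude
# rectangular pulses, in the spirit of the approach described above.
t = np.linspace(-4, 4, 2001)
sigma = 1.0
g = np.exp(-t**2 / (2 * sigma**2))

d1 = np.gradient(g, t)
d2 = np.gradient(d1, t)
d3 = np.gradient(d2, t)                      # numerical third derivative

# Four rectangles roughly matching the lobes of the third derivative
# (positions, widths and amplitudes are assumed for illustration).
rect = np.zeros_like(t)
for center, amp in [(-2.3, 0.37), (-0.75, -1.38), (0.75, 1.38), (2.3, -0.37)]:
    rect[np.abs(t - center) < 0.4] = amp

print("peak |d3|:", np.max(np.abs(d3)))
print("approximation RMS error:", np.sqrt(np.mean((rect - d3) ** 2)))
```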

  16. Prenotification, Incentives, and Survey Modality: An Experimental Test of Methods to Increase Survey Response Rates of School Principals

    Science.gov (United States)

    Jacob, Robin Tepper; Jacob, Brian

    2012-01-01

    Teacher and principal surveys are among the most common data collection techniques employed in education research. Yet there is remarkably little research on survey methods in education, or about the most cost-effective way to raise response rates among teachers and principals. In an effort to explore various methods for increasing survey response…

  17. A Survey on different techniques of steganography

    Directory of Open Access Journals (Sweden)

    Kaur Harpreet

    2016-01-01

    Full Text Available Steganography is important due to the exponential growth of the internet and the need for covert communication among computer users. Steganography is the art of invisible communication: keeping secret information hidden inside other information. Steganalysis is the technology that attempts to defeat steganography by detecting and extracting the hidden information. Steganography embeds data in image, text/document, audio and video files. The paper also highlights how security can be improved by applying various video steganography techniques.
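
    As a concrete, if very basic, illustration of embedding data in an image, the sketch below hides a message in the least significant bits of a grayscale cover image and then recovers it. LSB embedding is only one of the many techniques such a survey covers, and the cover image here is random synthetic data.

```python
import numpy as np

# Hedged sketch of least-significant-bit (LSB) image steganography; real
# image/video steganography schemes are considerably more elaborate.
def embed(cover: np.ndarray, message: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("cover image too small for message")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(cover.shape)

def extract(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # synthetic cover
stego = embed(cover, b"secret")
print(extract(stego, 6))   # b'secret'
```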

  18. The ALHAMBRA survey: 2D analysis of the stellar populations in massive early-type galaxies at z < 0.3

    Science.gov (United States)

    San Roman, I.; Cenarro, A. J.; Díaz-García, L. A.; López-Sanjuan, C.; Varela, J.; González Delgado, R. M.; Sánchez-Blázquez, P.; Alfaro, E. J.; Ascaso, B.; Bonoli, S.; Borlaff, A.; Castander, F. J.; Cerviño, M.; Fernández-Soto, A.; Márquez, I.; Masegosa, J.; Muniesa, D.; Pović, M.; Viironen, K.; Aguerri, J. A. L.; Benítez, N.; Broadhurst, T.; Cabrera-Caño, J.; Cepa, J.; Cristóbal-Hornillos, D.; Infante, L.; Martínez, V. J.; Moles, M.; del Olmo, A.; Perea, J.; Prada, F.; Quintana, J. M.

    2018-01-01

    We present a technique that permits the analysis of stellar population gradients in a relatively low-cost way compared to integral field unit (IFU) surveys. We developed a technique to analyze unresolved stellar populations of spatially resolved galaxies based on photometric multi-filter surveys. This technique allows the analysis of vastly larger samples and out to larger galactic radii. We derived spatially resolved stellar population properties and radial gradients by applying a centroidal Voronoi tessellation and performing multicolor-photometry spectral energy distribution fitting. This technique has been successfully applied to a sample of 29 massive (M⋆ > 10^10.5 M⊙) early-type galaxies at z < 0.3 from the ALHAMBRA survey, based on observations collected at the Calar Alto observatory, operated jointly by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC).

  19. Using SERVQUAL and Kano research techniques in a patient service quality survey.

    Science.gov (United States)

    Christoglou, Konstantinos; Vassiliadis, Chris; Sigalas, Ioakim

    2006-01-01

    This article presents the results of a service quality study. After an introduction to the SERVQUAL and Kano research techniques, a Kano analysis of 75 patients from the General Hospital of Katerini in Greece is presented. Service quality was assessed using satisfaction and dissatisfaction indices. The results of the Kano statistical analysis strengthened the hypothesis of previous research regarding the importance of personal knowledge, the courtesy of the hospital employees and their ability to convey trust and confidence (the assurance dimension). Managerial suggestions are made regarding the best way of acting and approaching hospital patients based on the basic SERVQUAL model.

  20. A survey of reflectometry techniques with applications to TFTR

    International Nuclear Information System (INIS)

    Collazo, I.; Stacey, W.M.; Wilgen, J.; Hanson, G.; Bigelow, T.; Thomas, C.E.; Bretz, N.

    1993-12-01

    This report presents a review of reflectometry with particular attention to eXtraordinary mode (X-mode) reflectometry using the novel technique of dual-frequency differential phase. The advantage of using an X-mode wave is that it can probe the edge of the plasma with much higher resolution, using a much smaller frequency range, than the ordinary mode (O-mode). The general problem with previous full-phase reflectometry techniques is that of keeping track of the phase (on the order of 1000 fringes) as the frequency is swept over the band. The dual-frequency phase-difference technique has the advantage that, because it tracks the phase difference between two frequencies with a constant frequency separation, the fringe count is only on the order of 3 to 5 fringes. This low fringe count, combined with the high resolution of the X-mode wave and the small plasma access requirements of reflectometry, makes X-mode reflectometry a very attractive diagnostic for today's experiments and future fusion devices.
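
    A toy numerical comparison of the two bookkeeping problems is sketched below. It assumes a simple vacuum round trip to a reflecting layer whose distance drifts slightly across the sweep, ignoring plasma dispersion entirely, so the numbers only illustrate why the differential phase accumulates far fewer fringes than the full phase; none of the values correspond to TFTR parameters.

```python
import numpy as np

# Toy illustration (assumed geometry, vacuum propagation, no plasma dispersion):
# the full phase accumulates hundreds of fringes over the sweep, while the
# phase difference between two closely spaced frequencies changes by only a few.
c = 3e8
f = np.linspace(75e9, 110e9, 2000)             # swept probing band (assumed)
df = 0.5e9                                     # fixed frequency separation (assumed)
L = 2.0 + 0.5 * (f - f[0]) / (f[-1] - f[0])    # reflecting-layer distance, assumed profile (m)

phase = 4 * np.pi * f * L / c                  # round-trip phase at each frequency
phase_hi = 4 * np.pi * (f + df) * L / c        # same layer probed at f + df
diff_phase = phase_hi - phase

full_fringes = (phase[-1] - phase[0]) / (2 * np.pi)
diff_fringes = (diff_phase[-1] - diff_phase[0]) / (2 * np.pi)
print(f"full-phase fringes accumulated over the sweep: {full_fringes:.0f}")
print(f"differential-phase fringes over the sweep:     {diff_fringes:.1f}")
```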

  1. Generalized hardware post-processing technique for chaos-based pseudorandom number generators

    KAUST Repository

    Barakat, Mohamed L.

    2013-06-01

    This paper presents a generalized post-processing technique for enhancing the pseudorandomness of digital chaotic oscillators through a nonlinear XOR-based operation with rotation and feedback. The technique allows full utilization of the chaotic output as pseudorandom number generators and improves throughput without a significant area penalty. Digital design of a third-order chaotic system with maximum function nonlinearity is presented with verified chaotic dynamics. The proposed post-processing technique eliminates statistical degradation in all output bits, thus maximizing throughput compared to other processing techniques. Furthermore, the technique is applied to several fully digital chaotic oscillators with performance surpassing previously reported systems in the literature. The enhancement in the randomness is further examined in a simple image encryption application resulting in a better security performance. The system is verified through experiment on a Xilinx Virtex 4 FPGA with throughput up to 15.44 Gbit/s and logic utilization less than 0.84% for 32-bit implementations. © 2013 ETRI.
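
    The post-processing idea, XORing each raw chaotic output word with a rotated, fed-back copy of the previous processed word, can be sketched in a few lines. The word width, rotation amount and the toy 'chaotic' source below are assumptions for illustration, not the circuit reported in the paper.

```python
# Minimal sketch (assumed parameters, not the paper's exact circuit) of a
# nonlinear XOR-based post-processing stage with rotation and feedback applied
# to the raw output words of a digital chaotic oscillator.
WIDTH = 32
MASK = (1 << WIDTH) - 1

def rotl(x, r):
    return ((x << r) | (x >> (WIDTH - r))) & MASK

def post_process(raw_words, rot=7):
    feedback = 0
    for w in raw_words:
        out = (w ^ rotl(feedback, rot)) & MASK   # XOR with rotated feedback
        feedback = out                           # feed the processed word back
        yield out

# Toy stand-in for a digital chaotic oscillator output (assumed, not chaotic
# in any rigorous sense); it only supplies raw words to post-process.
def raw_source(x0, n):
    x = x0 & MASK
    for _ in range(n):
        x = (x * (x ^ 0xDEADBEEF) + 0x9E3779B9) & MASK
        yield x

print([hex(v) for v in post_process(raw_source(0x1234, 6))])
```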

  2. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Heo, Gyunyoung

    2015-01-01

    One of the applications of PSA is a risk monitor. Risk monitoring is a real-time analysis tool that determines the real-time risk based on the actual state of components and systems. To make this more effective, methodologies that manipulate data from prognostics have been suggested. Generally, prognostics comprehensively includes not only prognosis but also monitoring and diagnosis, and prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. To reduce conservatism and uncertainties, we previously suggested a concept that updates the initiating event frequency in a PSA model using a Bayesian approach, which is one of the prognostics techniques. Previous research showed the possibility that PSA can be updated more accurately using such data. In reliability theory, the bathtub curve is divided into three parts (infant failure, constant and random failure, and wear-out failure). In this paper, in order to investigate the applicability of prognostic methods for updating quantitative data in a PSA model, the OLM acceptance criteria from NUREG, the concept of how to use prognostics in PSA, and the enabling prognostic techniques are presented. The motivation for prognostics is that improved predictive capabilities, using existing monitoring systems, data, and information, will enable more accurate equipment risk assessment for improved decision-making.

  3. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of)

    2015-05-15

    One of the applications of PSA is a risk monitor. Risk monitoring is a real-time analysis tool that determines the real-time risk based on the actual state of components and systems. To make this more effective, methodologies that manipulate data from prognostics have been suggested. Generally, prognostics comprehensively includes not only prognosis but also monitoring and diagnosis, and prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. To reduce conservatism and uncertainties, we previously suggested a concept that updates the initiating event frequency in a PSA model using a Bayesian approach, which is one of the prognostics techniques. Previous research showed the possibility that PSA can be updated more accurately using such data. In reliability theory, the bathtub curve is divided into three parts (infant failure, constant and random failure, and wear-out failure). In this paper, in order to investigate the applicability of prognostic methods for updating quantitative data in a PSA model, the OLM acceptance criteria from NUREG, the concept of how to use prognostics in PSA, and the enabling prognostic techniques are presented. The motivation for prognostics is that improved predictive capabilities, using existing monitoring systems, data, and information, will enable more accurate equipment risk assessment for improved decision-making.
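
    A minimal sketch of the Bayesian updating idea referred to above is given below, using the standard conjugate gamma prior with a Poisson likelihood for an initiating event frequency. The prior parameters and the plant-specific evidence are assumed numbers for illustration only.

```python
# Hedged sketch of updating an initiating event frequency with plant-specific
# evidence via a conjugate gamma prior and Poisson likelihood; all numbers
# below are assumptions, not values from the paper.
prior_alpha, prior_beta = 0.5, 10.0         # gamma prior (events, reactor-years), assumed
observed_events, observed_years = 1, 12.0   # assumed plant-specific evidence

post_alpha = prior_alpha + observed_events  # conjugate update of the shape
post_beta = prior_beta + observed_years     # conjugate update of the rate

prior_mean = prior_alpha / prior_beta
post_mean = post_alpha / post_beta
print(f"prior mean frequency:     {prior_mean:.3f} per year")
print(f"posterior mean frequency: {post_mean:.3f} per year")
```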

  4. Progress in the development of a video-based wind farm simulation technique

    OpenAIRE

    Robotham, AJ

    1992-01-01

    The progress in the development of a video-based wind farm simulation technique is reviewed. While improvements have been achieved in the quality of the composite picture created by combining computer generated animation sequences of wind turbines with background scenes of the wind farm site, extending the technique to include camera movements has proved troublesome.

  5. A survey of text clustering techniques used for web mining

    Directory of Open Access Journals (Sweden)

    Dan MUNTEANU

    2005-12-01

    Full Text Available This paper contains an overview of basic formulations and approaches to clustering. Then it presents two important clustering paradigms: a bottom-up agglomerative technique, which collects similar documents into larger and larger groups, and a top-down partitioning technique, which divides a corpus into topic-oriented partitions.
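
    The bottom-up paradigm described above can be demonstrated on a toy corpus: documents start as singleton clusters and the most similar pair (average-link cosine similarity over simple bag-of-words vectors) is merged repeatedly. The corpus, vectorization and stopping point below are assumptions for illustration.

```python
import numpy as np

# Minimal sketch of agglomerative (bottom-up) document clustering on a toy
# corpus using average-link cosine similarity over bag-of-words vectors.
docs = ["web mining with text clustering",
        "clustering techniques for text documents",
        "football match results and scores",
        "latest football league scores"]

vocab = sorted({w for d in docs for w in d.split()})
X = np.array([[d.split().count(w) for w in vocab] for d in docs], dtype=float)
X /= np.linalg.norm(X, axis=1, keepdims=True)
sim = X @ X.T

clusters = [[i] for i in range(len(docs))]
while len(clusters) > 2:                      # stop at two topic-oriented groups
    best, pair = -1.0, None
    for a in range(len(clusters)):
        for b in range(a + 1, len(clusters)):
            s = np.mean([sim[i, j] for i in clusters[a] for j in clusters[b]])
            if s > best:
                best, pair = s, (a, b)
    a, b = pair
    clusters[a] += clusters.pop(b)            # merge the most similar pair
print(clusters)   # expected: [[0, 1], [2, 3]]
```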

  6. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  7. Microprocessor based mobile radiation survey system

    International Nuclear Information System (INIS)

    Gilbert, R.W.; McCormack, W.D.

    1983-12-01

    A microprocessor-based system has been designed and constructed to enhance the performance of routine radiation surveys on roads within the Hanford site. This device continually monitors system performance and the output from four sodium iodide detectors mounted on the rear bumper of a 4-wheel-drive truck. The gamma radiation count rate in counts per second is monitored, and a running average computed, with the results compared to predefined limits. If an abnormal instantaneous or average count rate is detected, an alarm is sounded and the relevant data are displayed on a liquid crystal panel in the cab of the vehicle. The system also has the capability to evaluate detector output using multiple time constants and to perform more complex tests and comparisons of the data. Data can be archived for later analysis on conventional chart recorders or stored in digital form on magnetic tape or other digital storage media. 4 figures
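
    The instantaneous-limit and running-average checks described above reduce to a few lines of logic; the window length, limits and readings in the sketch below are assumed values, not the Hanford system's actual parameters.

```python
from collections import deque

# Sketch (assumed thresholds and window length) of the alarm logic described
# above: each reading is checked against an instantaneous limit and a
# running-average limit.
def monitor(count_rates, inst_limit=500.0, avg_limit=200.0, window=10):
    recent = deque(maxlen=window)
    for i, cps in enumerate(count_rates):
        recent.append(cps)
        running_avg = sum(recent) / len(recent)
        if cps > inst_limit:
            yield i, "instantaneous alarm", cps
        elif running_avg > avg_limit:
            yield i, "average alarm", running_avg

readings = [120, 130, 125, 900, 140, 260, 280, 300, 310, 320, 330]  # assumed cps
for event in monitor(readings):
    print(event)
```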

  8. System reliability worth assessment at a midwest utility-survey results for residential customers

    Energy Technology Data Exchange (ETDEWEB)

    Chowdhury, A.A.; Mielnik, T.C. [Electric System Planning, MidAmerican Energy Company, Davenport, Iowa (United States); Lawton, L.E.; Sullivan, M.J.; Katz, A. [Population Research Systems, San Francisco, CA (United States)

    2005-12-01

    This paper presents the overall results of a residential customer survey conducted in service areas of MidAmerican Energy Company, a Midwest utility. A similar survey was conducted concurrently in the industrial, commercial and institutional sectors and the survey results are presented in a companion paper. The results of this study are compared with the results of other studies performed in the high cost areas of the US east and west coasts. This is the first ever study of this nature performed for the residential customers in the US Midwest region. Methodological differences in the study design compared to coastal surveys are discussed. Customer survey costing techniques can be categorized into three main groups: contingent valuation techniques, direct costing techniques and indirect costing techniques. Most customer surveys conducted by different organizations in the last two decades used a combination of all three techniques. The selection of a technique is mainly dependent on the type of customer being surveyed. In this MidAmerican study, contingent valuation techniques and an indirect costing technique have been used, as most consequences of power outages to residential users are related to inconvenience or disruption of housekeeping and leisure activities that are intangible in nature. The major contribution of this paper is that particulars of Midwest residential customers compared to residential customers of coastal utilities are noted and customer responses on power quality issues that are important to customers are summarized. (author)

  9. A case for a vegetation survey in a developing country based on Zimbabwe

    Directory of Open Access Journals (Sweden)

    T. Müller

    1983-11-01

    Full Text Available The need for a vegetation survey in Zimbabwe, a developing country, is discussed. It is proposed that such a survey should produce a classification which is based on floristic criteria, and in which the vegetation types relate as nearly as possible to homogeneous environmental units. The practical application of such a classification is outlined with reference to the management of natural vegetation resources, land use planning and the preservation of species diversity.

  10. IoT Security Techniques Based on Machine Learning

    OpenAIRE

    Xiao, Liang; Wan, Xiaoyue; Lu, Xiaozhen; Zhang, Yanyong; Wu, Di

    2018-01-01

    The Internet of things (IoT), which integrates a variety of devices into networks to provide advanced and intelligent services, has to protect user privacy and address attacks such as spoofing attacks, denial-of-service attacks, jamming and eavesdropping. In this article, we investigate the attack model for IoT systems, and review the IoT security solutions based on machine learning techniques including supervised learning, unsupervised learning and reinforcement learning. We focus on the machine le...

  11. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    Science.gov (United States)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.
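
    A hedged sketch of a supervised detector in this spirit is shown below, training a random forest on synthetic connection-request records. The feature set (bandwidth, route length, holding time), the rule generating the synthetic labels, and all parameter values are assumptions for illustration, not the paper's data or model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hedged sketch of a machine-learning-based control-plane intrusion detector
# trained on synthetic connection-request records (all values assumed).
rng = np.random.default_rng(1)
n = 2000
bandwidth = rng.uniform(1, 100, n)        # requested bandwidth (Gb/s)
route_length = rng.integers(1, 10, n)     # hops
holding_time = rng.exponential(30, n)     # minutes

# Assumed rule for the synthetic ground truth: abnormally large, long-route,
# short-lived requests are labelled as intrusions, plus a little label noise.
intrusion = ((bandwidth > 80) & (route_length > 6) & (holding_time < 10)).astype(int)
intrusion ^= (rng.random(n) < 0.02).astype(int)

X = np.column_stack([bandwidth, route_length, holding_time])
X_train, X_test, y_train, y_test = train_test_split(X, intrusion, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```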

  12. Development and application of the analyzer-based imaging technique with hard synchrotron radiation

    International Nuclear Information System (INIS)

    Coan, P.

    2006-07-01

    The objective of this thesis is twofold: on the one hand, to apply analyser-based X-ray phase contrast imaging to the study of cartilage, bone and bone implants using ESRF synchrotron radiation sources, and on the other, to contribute to the development of phase contrast techniques from the theoretical and experimental points of view. Several human samples have been studied in vitro using the analyser-based imaging (ABI) technique. Examination included projection and computed tomography imaging and 3-dimensional volume rendering of hip, big toe and ankle articular joints. X-ray ABI images have been critically compared with those obtained with conventional techniques, including radiography, computed tomography, ultrasound, magnetic resonance and histology, the latter taken as the gold standard. Results show that only ABI was able to either visualize or correctly estimate the early pathological status of the cartilage. The status of bone ingrowth in sheep implants has also been examined in vitro: ABI images made it possible to correctly distinguish between good and incomplete bone healing. Pioneering in-vivo ABI experiments on guinea pigs were also successfully performed, confirming the possible use of the technique to follow up the progression of joint diseases, bone/metal ingrowth and the efficacy of drug treatments. As part of the development of the phase contrast techniques, two objectives have been reached. First, it has been experimentally demonstrated for the first time that ABI and propagation-based imaging (PBI) can be combined to create images with original features (hybrid imaging, HI). Secondly, a new simplified set-up capable of producing images with properties similar to those obtained with the ABI technique or HI has been proposed and experimentally tested. Finally, both ABI and HI have been theoretically studied with an innovative, wave-based simulation program, which was able to correctly reproduce the experimental results. (author)

  13. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    This paper presents a method that incorporates Markov Random Field (MRF), watershed segmentation and merging techniques for performing image segmentation and edge detection tasks. MRF is used to obtain an initial estimate of the regions in the image under processing, where in the MRF model the gray level x at pixel location i in an image X depends on the gray levels of neighboring pixels. The process needs an initial segmentation result, which is obtained using the K-means clustering technique and the minimum distance; the region process is then modeled by MRF to obtain an image containing regions of different intensity. Starting from this, we calculate the gradient values of that image and then employ a watershed technique. The MRF step yields an image with distinct intensity regions carrying all the edge and region information; the watershed algorithm then improves the segmentation result by superimposing a closed and accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merge process based on averaged mean values is employed. The final segmentation and edge detection result is one closed boundary per actual region in the image.
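
    A rough sketch of this pipeline is given below using a K-means initial segmentation, a gradient image, and a marker-based watershed. The full MRF relaxation and the final mean-value merging step are omitted, so this is an approximation of the described method rather than a faithful reimplementation, and the test image and number of clusters are assumptions.

```python
import numpy as np
from skimage import data, filters, segmentation
from sklearn.cluster import KMeans

# Approximate sketch of the pipeline above: K-means initialization, gradient
# computation, then marker-based watershed (MRF relaxation and merging omitted).
image = data.coins().astype(float) / 255.0   # assumed test image

# 1. Initial intensity-based segmentation with K-means (stand-in for the
#    MRF-refined initial estimate).
k = 3
labels0 = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(
    image.reshape(-1, 1)).reshape(image.shape)

# 2. Gradient of the piecewise-constant image built from the cluster means.
means = np.array([image[labels0 == c].mean() for c in range(k)])
piecewise = means[labels0]
gradient = filters.sobel(piecewise)

# 3. Watershed on the gradient, seeded by the K-means regions, to obtain
#    closed region boundaries.
final = segmentation.watershed(gradient, markers=labels0 + 1)
print("number of regions:", len(np.unique(final)))
```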

  14. Evaluating integration of inland bathymetry in the U.S. Geological Survey 3D Elevation Program, 2014

    Science.gov (United States)

    Miller-Corbett, Cynthia

    2016-09-01

    Inland bathymetry survey collections, survey data types, features, sources, availability, and the effort required to integrate inland bathymetric data into the U.S. Geological Survey 3D Elevation Program are assessed to help determine the feasibility of integrating three-dimensional water feature elevation data into The National Map. Available data from wading, acoustic, light detection and ranging, and combined technique surveys are provided by the U.S. Geological Survey, National Oceanic and Atmospheric Administration, U.S. Army Corps of Engineers, and other sources. Inland bathymetric data accessed through Web-hosted resources or contacts provide useful baseline parameters for evaluating survey types and techniques used for collection and processing, and serve as a basis for comparing survey methods and the quality of results. Historically, boat-mounted acoustic surveys have provided most inland bathymetry data. Light detection and ranging techniques that are beneficial in areas hard to reach by boat, that can collect dense data in shallow water to provide comprehensive coverage, and that can be cost effective for surveying large areas with good water clarity are becoming more common; however, optimal conditions and techniques for collecting and processing light detection and ranging inland bathymetry surveys are not yet well defined.Assessment of site condition parameters important for understanding inland bathymetry survey issues and results, and an evaluation of existing inland bathymetry survey coverage are proposed as steps to develop criteria for implementing a useful and successful inland bathymetry survey plan in the 3D Elevation Program. These survey parameters would also serve as input for an inland bathymetry survey data baseline. Integration and interpolation techniques are important factors to consider in developing a robust plan; however, available survey data are usually in a triangulated irregular network format or other format compatible with

  15. A multispectral scanner survey of the Tonopah Test Range, Nevada. Date of survey: August 1993

    International Nuclear Information System (INIS)

    Brewster, S.B. Jr.; Howard, M.E.; Shines, J.E.

    1994-08-01

    The Multispectral Remote Sensing Department of the Remote Sensing Laboratory conducted an airborne multispectral scanner survey of a portion of the Tonopah Test Range, Nevada. The survey was conducted on August 21 and 22, 1993, using a Daedalus AADS1268 scanner and coincident aerial color photography. Flight altitudes were 5,000 feet (1,524 meters) above ground level for systematic coverage and 1,000 feet (304 meters) for selected areas of special interest. The multispectral scanner survey was initiated as part of an interim and limited investigation conducted to gather preliminary information regarding historical hazardous material release sites which could have environmental impacts. The overall investigation also includes an inventory of environmental restoration sites, a ground-based geophysical survey, and an aerial radiological survey. The multispectral scanner imagery and coincident aerial photography were analyzed for the detection, identification, and mapping of man-made soil disturbances. Several standard image enhancement techniques were applied to the data to assist image interpretation. A geologic ratio enhancement and a color composite consisting of AADS1268 channels 10, 7, and 9 (mid-infrared, red, and near-infrared spectral bands) proved most useful for detecting soil disturbances. A total of 358 disturbance sites were identified on the imagery and mapped using a geographic information system. Of these sites, 326 were located within the Tonopah Test Range while the remaining sites were present on the imagery but outside the site boundary. The mapped site locations are being used to support ongoing field investigations

  16. Electronic surveys: how to maximise success.

    Science.gov (United States)

    McPeake, Joanne; Bateson, Meghan; O'Neill, Anna

    2014-01-01

    To draw on the researchers' experience of developing and distributing a UK-wide electronic survey. The evolution of electronic surveys in healthcare research will be discussed, as well as simple techniques that can be used to improve response rates for this type of data collection. There is an increasing use of electronic survey methods in healthcare research. However, in recent published research, electronic surveys have had lower response rates than traditional survey methods, such as postal and telephone surveys. This is a methodology paper. Electronic surveys have many advantages over traditional surveys, including a reduction in cost and ease of analysis. Drawbacks to this type of data collection include the potential for selection bias and poorer response rates. However, research teams can use a range of simple strategies to boost response rates. These approaches target the different stages of achieving a complete response: initial attraction through personalisation, engagement by having an easily accessible link to the survey, and transparency of survey length and completion through targeting the correct, and thereby interested, population. The fast, efficient and often 'free' electronic survey has many advantages over the traditional postal data collection method, including ease of analysis for what can be vast amounts of data. However, to capitalise on these benefits, researchers must carefully consider techniques to maximise response rates and minimise selection bias for their target population. Researchers can use a range of strategies to improve responses from electronic surveys, including sending up to three reminders, personalising each email, adding the updated response rate to reminder emails, and stating the average time it would take to complete the survey in the title of the email.

  17. A survey of temperature measurement

    International Nuclear Information System (INIS)

    Saltvold, J.R.

    1976-03-01

    Many different techniques for measuring temperature have been surveyed and are discussed. The concept of temperature and the physical phenomena used in temperature measurement are also discussed. Extensive tables are presented in which the range and accuracy of the various techniques and other related data are included. (author)

  18. Resizing Technique-Based Hybrid Genetic Algorithm for Optimal Drift Design of Multistory Steel Frame Buildings

    Directory of Open Access Journals (Sweden)

    Hyo Seon Park

    2014-01-01

    Full Text Available Since genetic algorithm-based optimization methods are computationally expensive for practical use in the field of structural optimization, a resizing technique-based hybrid genetic algorithm for the drift design of multistory steel frame buildings is proposed to increase the convergence speed of genetic algorithms. To reduce the number of structural analyses required for convergence, a genetic algorithm is combined with a resizing technique, an efficient optimization technique that controls the drift of buildings without repetitive structural analysis. The resizing technique-based hybrid genetic algorithm proposed in this paper is applied to the minimum weight design of three steel frame buildings. To evaluate the performance of the algorithm, optimum weights, computational times, and generation numbers from the proposed algorithm are compared with those from a genetic algorithm. Based on the comparisons, it is concluded that the hybrid genetic algorithm shows clear improvements in convergence properties.

  19. Wear Detection of Drill Bit by Image-based Technique

    Science.gov (United States)

    Sukeri, Maziyah; Zulhilmi Paiz Ismadi, Mohd; Rahim Othman, Abdul; Kamaruddin, Shahrul

    2018-03-01

    Image processing for computer vision plays an essential role in the manufacturing industries for tool condition monitoring. This study proposes a dependable direct measurement method to measure tool wear using image-based analysis. Segmentation and thresholding techniques were used to filter and convert the colour image to binary datasets. The edge detection method was then applied to characterize the edge of the drill bit. Using the cross-correlation method, the edges of the original and worn drill bits were correlated with each other. Cross-correlation graphs were able to detect the worn edge despite the small difference between the graphs. Future development will focus on quantifying the worn profile as well as enhancing the sensitivity of the technique.
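    The processing chain described above (thresholding, edge detection, cross-correlation of edge profiles) can be illustrated with a short sketch. The following Python snippet is only a minimal stand-in, not the study's actual pipeline: the synthetic images, the Otsu threshold, the Canny parameters and the column-wise edge profile are all assumptions made for the example.

```python
# Minimal sketch of an image-based wear check: threshold, edge-detect,
# then cross-correlate edge profiles of a reference and a worn drill bit.
# The images and parameter choices here are illustrative assumptions,
# not the exact pipeline of the cited study.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.feature import canny

def edge_profile(gray_img: np.ndarray) -> np.ndarray:
    """Binarise the image, detect edges, and collapse them to a 1-D profile."""
    binary = gray_img > threshold_otsu(gray_img)      # segmentation by thresholding
    edges = canny(binary.astype(float), sigma=2.0)    # edge detection
    return edges.sum(axis=0).astype(float)            # column-wise edge counts

def edge_similarity(reference: np.ndarray, worn: np.ndarray) -> float:
    """Normalised peak cross-correlation between the two edge profiles."""
    a = edge_profile(reference)
    b = edge_profile(worn)
    a, b = a - a.mean(), b - b.mean()
    corr = np.correlate(a, b, mode="full")
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(corr.max() / denom) if denom > 0 else 0.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    new_bit = rng.random((128, 128))
    worn_bit = new_bit.copy()
    worn_bit[:, 60:68] *= 0.3        # crude stand-in for a worn edge region
    print(f"similarity (1.0 = identical edges): {edge_similarity(new_bit, worn_bit):.3f}")
```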

  20. Evidence-based speech-language pathology practices in schools: findings from a national survey.

    Science.gov (United States)

    Hoffman, Lavae M; Ireland, Marie; Hall-Mills, Shannon; Flynn, Perry

    2013-07-01

    This study documented evidence-based practice (EBP) patterns as reported by speech-language pathologists (SLPs) employed in public schools during 2010-2011. Using an online survey, practitioners reported their EBP training experiences, resources available in their workplaces, and the frequency with which they engage in specific EBP activities, as well as their resource needs and future training format preferences. A total of 2,762 SLPs in 28 states participated in the online survey, 85% of whom reported holding the Certificate of Clinical Competence in Speech-Language Pathology credential. Results revealed that one quarter of survey respondents had no formal training in EBP, 11% of SLPs worked in school districts with official EBP procedural guidelines, and 91% had no scheduled time to support EBP activities. The majority of SLPs posed and researched 0 to 2 EBP questions per year and read 0 to 4 American Speech-Language-Hearing Association (ASHA) journal articles per year on either assessment or intervention topics. Use of ASHA online resources and engagement in EBP activities were documented to be low. However, results also revealed that school-based SLPs have high interest in additional training and resources to support scientifically based practices. Suggestions for enhancing EBP support in public schools and augmenting knowledge transfer are provided.

  1. Biometric image enhancement using decision rule based image fusion techniques

    Science.gov (United States)

    Sagayee, G. Mary Amirtha; Arumugam, S.

    2010-02-01

    Introducing biometrics into information systems may result in considerable benefits. Most researchers have confirmed that the fingerprint is more widely used than the iris or face, and moreover it is the primary choice for most privacy-concerned applications. For fingerprint applications, choosing a proper sensor is a challenge. The proposed work addresses how image quality can be improved by introducing an image fusion technique at the sensor level. The fused images produced by the decision rule based image fusion technique are evaluated and analyzed in terms of their entropy levels and root mean square error.
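    As an illustration of the kind of evaluation the abstract describes, the sketch below fuses two captures with a simple decision rule (keep the block with the higher local variance, an assumed rule since the paper's exact rule is not given here) and then reports the entropy and root mean square error of the result.

```python
# Illustrative sketch (not the paper's exact rule): fuse two captures
# block-by-block, keeping the block with the higher local variance,
# then score the result with entropy and RMSE as the abstract describes.
import numpy as np

def decision_rule_fusion(img_a: np.ndarray, img_b: np.ndarray, block: int = 8) -> np.ndarray:
    fused = np.empty_like(img_a)
    for i in range(0, img_a.shape[0], block):
        for j in range(0, img_a.shape[1], block):
            a = img_a[i:i + block, j:j + block]
            b = img_b[i:i + block, j:j + block]
            fused[i:i + block, j:j + block] = a if a.var() >= b.var() else b
    return fused

def entropy(img: np.ndarray, bins: int = 256) -> float:
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def rmse(img: np.ndarray, ref: np.ndarray) -> float:
    return float(np.sqrt(np.mean((img - ref) ** 2)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sharp = rng.random((64, 64))
    blurred = 0.5 * sharp + 0.25            # lower-contrast capture of the same scene
    fused = decision_rule_fusion(sharp, blurred)
    print(f"entropy: {entropy(fused):.2f} bits, RMSE vs sharp capture: {rmse(fused, sharp):.3f}")
```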

  2. Detection and sizing of cracks using potential drop techniques based on electromagnetic induction

    International Nuclear Information System (INIS)

    Sato, Yasumoto; Kim, Hoon

    2011-01-01

    The potential drop techniques based on electromagnetic induction are classified into the induced current focused potential drop (ICFPD) technique and the remotely induced current potential drop (RICPD) technique. The possibility of numerical simulation of the techniques is investigated and the applicability of these techniques to the measurement of defects in conductive materials is presented. Finite element analysis (FEA) of the RICPD measurements on the plate specimen containing back-wall slits is performed, and the results calculated by FEA show good agreement with experimental results. The detection limit of the RICPD technique for the depth of back-wall slits can also be estimated by FEA. Detection and sizing of artificial defects in parent and welded materials are successfully performed by the ICFPD technique. The applicability of these techniques to the detection of cracks in field components is investigated, and most of the cracks in the components investigated are successfully detected by the ICFPD and RICPD techniques. (author)

  3. Long-Period Exoplanets from Photometric Transit Surveys

    Science.gov (United States)

    Osborn, Hugh

    2017-10-01

    Photometric transit surveys on the ground & in space have detected thousands of transiting exoplanets, typically by analytically combining the signals from multiple transits. This technique of exoplanet detection was exploited in K2 to detect nearly 200 candidate planets, and extensive follow-up was able to confirm the planet K2-110b as a 2.6±0.1R⊕, 16.7±3.2M⊕ planet on a 14d orbit around a K-dwarf. The ability to push beyond the time limit set by transit surveys to detect long-period transiting objects from a single eclipse was also studied. This was performed by developing a search technique to search for planets around bright stars in WASP and NGTS photometry, finding NGTS to be marginally better than WASP at detecting such planets with 4.14±0.16 per year compared to 1.43±0.15, and detecting many planet candidates for which follow-up is on-going. This search was then adapted to search for deep, long-duration eclipses in all WASP targets. The results of this survey are described in this thesis, as well as detailed results for the candidate PDS-110, a young T-Tauri star which exhibited ∼20d-long, 30%-deep eclipses in 2008 and 2011. Space-based photometers such as Kepler have the precision to identify small exoplanets and eclipsing binary candidates from only a single eclipse. K2, with its 75d campaign duration and high-precision photometry, is not only ideally suited to detect significant numbers of single-eclipsing objects, but also to characterise them from a single event. The Bayesian transit-fitting tool ("Namaste: An MCMC Analysis of Single Transit Exoplanets") was developed to extract planetary and orbital information from single transits, and was applied to 71 candidate events detected in K2 photometry. The techniques developed in this thesis are highly applicable to future transit surveys such as TESS & PLATO, which will be able to discover & characterise large numbers of long period planets in this way.

  4. Free and open source enabling technologies for patient-centric, guideline-based clinical decision support: a survey.

    Science.gov (United States)

    Leong, T Y; Kaiser, K; Miksch, S

    2007-01-01

    Guideline-based clinical decision support is an emerging paradigm to help reduce error, lower cost, and improve quality in evidence-based medicine. The free and open source (FOS) approach is a promising alternative for delivering cost-effective information technology (IT) solutions in health care. In this paper, we survey the current FOS enabling technologies for patient-centric, guideline-based care, and discuss the current trends and future directions of their role in clinical decision support. We searched PubMed, major biomedical informatics websites, and the web in general for papers and links related to FOS health care IT systems. We also relied on our background and knowledge for specific subtopics. We focused on the functionalities of guideline modeling tools, and briefly examined the supporting technologies for terminology, data exchange and electronic health record (EHR) standards. To effectively support patient-centric, guideline-based care, the computerized guidelines and protocols need to be integrated with existing clinical information systems or EHRs. Technologies that enable such integration should be accessible, interoperable, and scalable. A plethora of FOS tools and techniques for supporting different knowledge management and quality assurance tasks involved are available. Many challenges, however, remain in their implementation. There are active and growing trends of deploying FOS enabling technologies for integrating clinical guidelines, protocols, and pathways into the main care processes. The continuing development and maturation of such technologies are likely to make increasingly significant contributions to patient-centric, guideline-based clinical decision support.

  5. An improved visualization-based force-measurement technique for short-duration hypersonic facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laurence, Stuart J.; Karl, Sebastian [Institute of Aerodynamics and Flow Technology, Spacecraft Section, German Aerospace Center (DLR), Goettingen (Germany)

    2010-06-15

    This article is concerned with describing and exploring the limitations of an improved version of a recently proposed visualization-based technique for the measurement of forces and moments in short-duration hypersonic wind tunnels. The technique is based on tracking the motion of a free-flying body over a sequence of high-speed visualizations; while this idea is not new in itself, the use of high-speed digital cinematography combined with a highly accurate least-squares tracking algorithm allows improved results over what has previously been possible with such techniques. The technique precision is estimated through the analysis of artificially constructed and experimental test images, and the resulting error in acceleration measurements is characterized. For wind-tunnel scale models, position measurements to within a few microns are shown to be readily attainable. Image data from two previous experimental studies in the T5 hypervelocity shock tunnel are then reanalyzed with the improved technique: the uncertainty in the mean drag acceleration is shown to be reduced to the order of the flow unsteadiness, 2-3%, and time-resolved acceleration measurements are also shown to be possible. The response time of the technique for the configurations studied is estimated to be ∼0.5 ms. Comparisons with computations using the DLR TAU code also yield agreement to within the overall experimental uncertainty. Measurement of the pitching moment for blunt geometries still appears challenging, however. (orig.)
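    The core idea, recovering forces from the tracked motion of a free-flying model, can be reduced to a few lines: a least-squares quadratic fit of position against time yields the mean acceleration, and Newton's second law gives the force. The sketch below uses made-up frame rate, mass and trajectory values and is not a reconstruction of the T5 data or of the DLR tracking algorithm.

```python
# Minimal sketch of the force-from-motion idea: given model positions tracked
# frame-by-frame, a least-squares quadratic fit in time gives the mean
# acceleration, and F = m * a gives the aerodynamic force.  Values below are
# made-up placeholders, not data from the cited T5 experiments.
import numpy as np

frame_rate_hz = 10_000.0                       # assumed high-speed camera rate
model_mass_kg = 0.35                           # assumed model mass

t = np.arange(40) / frame_rate_hz              # 40 frames of tracked motion
true_accel = -85.0                             # m/s^2, synthetic "drag" deceleration
x = 0.5 * true_accel * t**2 + 1e-6 * np.random.default_rng(2).standard_normal(t.size)

# Quadratic least-squares fit: x(t) ~ 0.5*a*t^2 + v0*t + x0
coeffs = np.polyfit(t, x, deg=2)
accel_est = 2.0 * coeffs[0]
drag_force = model_mass_kg * accel_est

print(f"estimated acceleration: {accel_est:.1f} m/s^2")
print(f"estimated drag force:   {drag_force:.2f} N")
```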

  6. Electrical-Based Diagnostic Techniques for Assessing Insulation Condition in Aged Transformers

    Directory of Open Access Journals (Sweden)

    Issouf Fofana

    2016-08-01

    Full Text Available The condition of the internal cellulosic paper and oil insulation is of concern for the performance of power transformers. Over the years, a number of methods have been developed to diagnose and monitor the degradation/aging of the transformer internal insulation system. Some of this degradation/aging can be assessed from electrical responses. Currently there are a variety of electrical-based diagnostic techniques available for insulation condition monitoring of power transformers. In most cases, the electrical signals being monitored are due to mechanical or electric changes caused by physical changes in resistivity, inductance or capacitance, moisture, contamination or aging by-products in the insulation. This paper presents a description of commonly used and modern electrical-based diagnostic techniques along with their interpretation schemes.

  7. A technique for visualizing electrostatic fields based on their topological structures

    International Nuclear Information System (INIS)

    Handa, Susumu

    2004-01-01

    In molecular science, visualization techniques based on computer graphics are now well established as a tool for interpreting simulation results, since molecules have complicated structures and mutual interactions. As a probe for studying such molecular interactions, electrostatic fields are considered useful. However, since they are given as 3D vector fields with complicated distributions, conventional drawing techniques are inadequate. In this article, a new approach based on topological structures in vector fields is presented to visualize the electrostatic fields of molecules. The scheme is to select regions of interest using only the topological structures of the fields. An example application to chemical reactions of an amino acid complex is presented to show how the scheme is used. (author)

  8. Adaptive differential correspondence imaging based on sorting technique

    Directory of Open Access Journals (Sweden)

    Heng Wu

    2017-04-01

    Full Text Available We develop an adaptive differential correspondence imaging (CI method using a sorting technique. Different from the conventional CI schemes, the bucket detector signals (BDS are first processed by a differential technique, and then sorted in a descending (or ascending order. Subsequently, according to the front and last several frames of the sorted BDS, the positive and negative subsets (PNS are created by selecting the relative frames from the reference detector signals. Finally, the object image is recovered from the PNS. Besides, an adaptive method based on two-step iteration is designed to select the optimum number of frames. To verify the proposed method, a single-detector computational ghost imaging (GI setup is constructed. We experimentally and numerically compare the performance of the proposed method with different GI algorithms. The results show that our method can improve the reconstruction quality and reduce the computation cost by using fewer measurement data.
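    A toy numerical sketch of the sorting-based differential correspondence scheme is given below. The speckle patterns, the 8x8 test object and the choice of keeping 10% of the frames at each end of the sorted differential bucket signals are illustrative assumptions, not the authors' experimental settings.

```python
# Toy numerical sketch of differential correspondence imaging with sorting:
# bucket signals are differenced against their mean, sorted, and the frames at
# the two ends select the reference patterns used to form positive/negative
# subsets.  Pattern counts and the simple 8x8 object are illustrative choices.
import numpy as np

rng = np.random.default_rng(3)
n_frames, side = 4000, 8
obj = np.zeros((side, side))
obj[2:6, 3:5] = 1.0                              # simple transmissive object

patterns = rng.random((n_frames, side, side))    # reference (speckle) patterns
bucket = (patterns * obj).sum(axis=(1, 2))       # single-pixel bucket signals

diff = bucket - bucket.mean()                    # differential bucket signals
order = np.argsort(diff)                         # ascending sort
k = n_frames // 10                               # frames kept at each end
negative = patterns[order[:k]].mean(axis=0)      # last-ranked frames
positive = patterns[order[-k:]].mean(axis=0)     # front-ranked frames

recon = positive - negative                      # recovered object estimate
corr = np.corrcoef(recon.ravel(), obj.ravel())[0, 1]
print(f"correlation between reconstruction and object: {corr:.2f}")
```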

  9. A comparison of base running start techniques in collegiate fastpitch softball athletes

    OpenAIRE

    Massey, Kelly P.; Brouillette, Kelly Miller; Martino, Mike

    2018-01-01

    This study examined the time difference between three different base running start techniques. Thirteen Division II collegiate softball players performed maximal sprints off a softball bag at two different distances. Sprint times at 4.57 and 18.29 meters for each technique were measured using Fusion Sport’s Smartspeed System. At both 4.57 and 18.29 meters, the rocking start (0.84 ± 0.10; 3.04 ± 0.16 s) was found to be significantly faster (in seconds) than both the split technique (1.01 ± 0.0...

  10. Restoration techniques: characteristics and performances. RESTRAT-TD3+4

    International Nuclear Information System (INIS)

    Zeevaert, T.; Bousher, A.

    1999-01-01

    This report is submitted as Technical Deliverables No. 3 and 4 against the requirements of the RESTRAT (Restoration Strategies for radioactively contaminated sites and their Close Surroundings) Project. The aim of this report is to present the results of a literature survey for the identification of techniques whose application would be most appropriate to remediating sites that have been contaminated by radionuclides from European nuclear installations. Remediation techniques are selected if they have been demonstrated to be applicable for treating sites which have been contaminated by radionuclides. The techniques encompass physical-, chemical- and biological-based approaches. The remediation techniques have been characterised in terms of their applicability (the contaminants and the media for which they are suited and the manpower required to apply them); their performance (the effectiveness against the contaminants and the time during which they remain effective); the costs (capital, operational and maintenance costs); side effects (in particular, the production of waste)

  11. Restoration techniques: characteristics and performances. RESTRAT-TD3+4

    Energy Technology Data Exchange (ETDEWEB)

    Zeevaert, T.; Bousher, A

    1999-08-02

    This report is submitted as Technical Deliverables No. 3 and 4 against the requirements of the RESTRAT (Restoration Strategies for radioactively contaminated sites and their Close Surroundings) Project. The aim of this report is to present the results of a literature survey for the identification of techniques whose application would be most appropriate to remediating sites that have been contaminated by radionuclides from European nuclear installations. Remediation techniques are selected if they have been demonstrated to be applicable for treating sites which have been contaminated by radionuclides. The techniques encompass physical-, chemical- and biological-based approaches. The remediation techniques have been characterised in terms of their applicability (the contaminants and the media for which they are suited and the manpower required to apply them); their performance (the effectiveness against the contaminants and the time during which they remain effective); the costs (capital, operational and maintenance costs); side effects (in particular, the production of waste)

  12. Avian survey and field guide for Osan Air Base, Korea.

    Energy Technology Data Exchange (ETDEWEB)

    Levenson, J.

    2006-12-05

    This report summarizes the results of the avian surveys conducted at Osan Air Base (AB). This ongoing survey is conducted to comply with requirements of the Environmental Governing Standards (EGS) for the Republic of Korea, the Integrated Natural Resources Management Plan (INRMP) for Osan AB, and the 51st Fighter Wing's Bird Aircraft Strike Hazard (BASH) Plan. One hundred ten bird species representing 35 families were identified and recorded. Seven species are designated as Natural Monuments, and their protection is accorded by the Korean Ministry of Culture and Tourism. Three species appear on the Korean Association for Conservation of Nature's (KACN's) list of Reserved Wild Species and are protected by the Korean Ministry of Environment. Combined, ten different species are Republic of Korea (ROK)-protected. The primary objective of the avian survey at Osan AB was to determine what species of birds are present on the airfield and their respective habitat requirements during the critical seasons of the year. This requirement is specified in Annex J.14.c of the 51st Fighter BASH Plan 91-212 (51 FW OPLAN 91-212). The second objective was to initiate surveys to determine what bird species are present on Osan AB throughout the year and from the survey results, determine if threatened, endangered, or other Korean-listed bird species are present on Osan AB. This overall census satisfies Criterion 13-3.e of the EGS for Korea. The final objective was to formulate management strategies within Osan AB's operational requirements to protect and enhance habitats of known threatened, endangered, and ROK-protected species in accordance with EGS Criterion 13-3.a that are also favorable for the reproduction of indigenous species in accordance with the EGS Criterion 13-3.h.

  13. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    Science.gov (United States)

    2017-11-01

    Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques, by Canh Ly, Nghia Tran, and Ozlem Kilic. Approved for public release.

  14. A comparison of mandibular denture base deformation with different impression techniques for implant overdentures.

    Science.gov (United States)

    Elsyad, Moustafa Abdou; El-Waseef, Fatma Ahmad; Al-Mahdy, Yasmeen Fathy; Fouad, Mohammed Mohammed

    2013-08-01

    This study aimed to evaluate mandibular denture base deformation with three impression techniques used for implant-retained overdentures. Ten edentulous patients (five men and five women) received two implants in the canine region of the mandible and three duplicate mandibular overdentures which were constructed with mucostatic, selective pressure, and definitive pressure impression techniques. Ball abutments and respective gold matrices were used to connect the overdentures to the implants. Six linear strain gauges were bonded to the lingual polished surface of each duplicate overdenture at midline and implant areas to measure strain during maximal clenching and gum chewing. The strains recorded at midline were compressive while strains at implant areas were tensile. Clenching recorded significantly higher strain when compared with gum chewing for all techniques. The mucostatic technique recorded the highest strain and the definitive pressure technique recorded the lowest. There was no significant difference between the strain recorded with the mucostatic technique and that registered with the selective pressure technique. The highest strain was recorded at the level of the ball abutment's top with the mucostatic technique during clenching. The definitive pressure impression technique for implant-retained mandibular overdentures is associated with minimal denture deformation during function when compared with the mucostatic and selective pressure techniques. Reinforcement of the denture base over the implants may be recommended to increase resistance to fracture when the mucostatic or selective pressure impression technique is used. © 2012 John Wiley & Sons A/S.

  15. Regional anesthesia practice in China: a survey.

    Science.gov (United States)

    Huang, Jeffrey; Gao, Huan

    2016-11-01

    Neuraxial anesthesia has been widely used in China. Recently, Chinese anesthesiologists have applied nerve stimulator and ultrasound guidance for peripheral nerve blocks. Nationwide surveys about regional anesthesia practices in China are lacking. We surveyed Chinese anesthesiologists about regional anesthesia techniques, preference, drug selections, complications, and treatments. A survey was sent to all anesthesiologist members by WeChat. Respondents could choose a mobile device or desktop to complete the survey. Each IP address was allowed to complete the survey once. A total of 6589 members read invitations. A total of 2654 responses were received with fully completed questionnaires, which represented an overall response rate of 40%. Forty-one percent of the respondents reported that more than 50% of surgeries in their hospitals were done under regional anesthesia. Most of the participants used a test dose after epidural catheter insertion. The most common drug for the test dose was 3-mL 1.5% lidocaine; 2.6% of the participants reported that they had treated a patient with epidural hematoma after neuraxial anesthesia. Most anesthesiologists (68.2%) performed peripheral nerve blocks as blind procedures based on the knowledge of anatomical landmarks. A majority of hospitals (80%) did not stock Intralipid; 61% of the respondents did not receive peripheral nerve block training. The current survey can serve as a benchmark for future comparisons and evaluation of regional anesthesia practices in China. This survey revealed potential regional anesthesia safety issues in China. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Selected techniques in water resources investigations, 1965

    Science.gov (United States)

    Mesnier, Glennon N.; Chase, Edith B.

    1966-01-01

    Increasing world activity in water-resources development has created an interest in techniques for conducting investigations in the field. In the United States, the Geological Survey has the responsibility for extensive and intensive hydrologic studies, and the Survey places considerable emphasis on discovering better ways to carry out its responsibility. For many years, the dominant interest in field techniques has been "in house," but the emerging world interest has led to a need for published accounts of this progress. In 1963 the Geological Survey published "Selected Techniques in Water Resources Investigations" (Water-Supply Paper 1669-Z) as part of the series "Contributions to the Hydrology of the United States."The report was so favorably received that successive volumes are planned, of which this is the first. The present report contains 25 papers that represent new ideas being tested or applied in the hydrologic field program of the Geological Survey. These ideas range from a proposed system for monitoring fluvial sediment to how to construct stream-gaging wells from steel oil drums. The original papers have been revised and edited by the compilers, but the ideas presented are those of the authors. The general description of the bubble gage on page 2 has been given by the compilers as supplementary information.

  17. Artificial Intelligence based technique for BTS placement

    Science.gov (United States)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

    The increase in base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners, and this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique objectively takes neighbour and regulatory constraints into consideration while determining the cell site. Its application will lead to a quantitatively unbiased and evaluated decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm, and the results obtained show a 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a GA with a neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.

  18. Artificial Intelligence based technique for BTS placement

    International Nuclear Information System (INIS)

    Alenoghena, C O; Emagbetere, J O; Aibinu, A M (Department of Telecommunications Engineering, Federal University of Technology Minna (Nigeria))

    2013-01-01

    The increase in base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners, and this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique objectively takes neighbour and regulatory constraints into consideration while determining the cell site. Its application will lead to a quantitatively unbiased and evaluated decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm, and the results obtained show a 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a GA with a neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.
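    To make the neighbour-constrained GA idea concrete, the following sketch evolves a set of candidate BTS sites that maximise coverage of demand points while penalising sites closer than a minimum separation. It is a generic genetic algorithm written for illustration; the candidate grid, radii, penalty weight and GA settings are assumptions rather than the formulation used in the paper.

```python
# Hedged sketch of a genetic algorithm with a neighbourhood constraint for
# BTS siting.  Candidate sites, coverage radius, minimum separation and all
# GA settings are assumed values for a 2 km x 3 km territory.
import numpy as np

rng = np.random.default_rng(4)
demand = rng.uniform([0, 0], [2000, 3000], size=(300, 2))      # demand points (m)
candidates = rng.uniform([0, 0], [2000, 3000], size=(40, 2))   # candidate BTS sites
n_bts, radius, min_sep = 5, 700.0, 500.0                       # design assumptions

def fitness(idx: np.ndarray) -> float:
    sites = candidates[idx]
    d = np.linalg.norm(demand[:, None, :] - sites[None, :, :], axis=2)
    coverage = float((d.min(axis=1) <= radius).mean())          # fraction covered
    pair = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=2)
    too_close = ((pair < min_sep).sum() - n_bts) // 2           # neighbour constraint
    return coverage - 0.2 * too_close                           # penalise violations

def crossover(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    pool = np.unique(np.concatenate([a, b]))
    return rng.choice(pool, n_bts, replace=False)

def mutate(idx: np.ndarray) -> np.ndarray:
    child = idx.copy()
    child[rng.integers(n_bts)] = rng.integers(len(candidates))
    return child if len(set(child.tolist())) == n_bts else idx  # keep sites distinct

pop = [rng.choice(len(candidates), n_bts, replace=False) for _ in range(40)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)                         # elitist selection
    parents = pop[:20]
    pop = parents + [mutate(crossover(parents[rng.integers(20)],
                                      parents[rng.integers(20)])) for _ in range(20)]

best = max(pop, key=fitness)
print("selected candidate sites:", sorted(best.tolist()),
      "| fitness:", round(fitness(best), 3))
```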

  19. Cogeneration techniques; Les techniques de cogeneration

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-10-01

    This dossier about cogeneration techniques comprises 12 parts dealing successively with: the advantages of cogeneration (examples of installations, electrical and thermal efficiency); the combustion turbine (principle, performances, types); the alternative internal combustion engines (principle, types, rotation speed, comparative performances); the different configurations of cogeneration installations based on alternative engines and based on steam turbines (coal, heavy fuel and natural gas-fueled turbines); the environmental constraints of combustion turbines (pollutants, techniques of reduction of pollutant emissions); the environmental constraints of alternative internal combustion engines (gas and diesel engines); cogeneration and energy saving; the techniques of reduction of pollutant emissions (pollutants, unburnt hydrocarbons, primary and secondary (catalytic) techniques, post-combustion); the most-advanced configurations of cogeneration installations for enhanced performances (counter-pressure turbines, massive steam injection cycles, turbo-chargers); comparison between the performances of the different cogeneration techniques; the tri-generation technique (compression and absorption cycles). (J.S.)

  20. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  1. Survey shows large differences between the Nordic countries in the use of less invasive surfactant administration

    DEFF Research Database (Denmark)

    Jonsson, Baldvin; Andersson, Sture; Björklund, Lars J

    2017-01-01

    AIM: Less invasive surfactant administration (LISA), namely surfactant instillation through a thin catheter in the trachea during spontaneous breathing, is increasingly used for premature infants. We surveyed the use of this technique in the Nordic countries in autumn 2015. METHODS: A link to a web-based survey of surfactant administration methods was emailed to the directors of all neonatal units in the Nordic Region, apart from Finland, where only the five university-based departments were invited. RESULTS: Of the 73 units (85%) who responded, 23 (32%) said that they used LISA. The country rates were......%. The main reasons for not using LISA were lack of familiarity with the technique (61%), no perceived benefit over other methods (22%) and concerns about patient discomfort (26%). CONCLUSION: Less invasive surfactant administration was used in 32% of Nordic neonatal units, most commonly in level three units...

  2. ON THE PAPR REDUCTION IN OFDM SYSTEMS: A NOVEL ZCT PRECODING BASED SLM TECHNIQUE

    Directory of Open Access Journals (Sweden)

    VARUN JEOTI

    2011-06-01

    Full Text Available High Peak to Average Power Ratio (PAPR) reduction is still an important challenge in Orthogonal Frequency Division Multiplexing (OFDM) systems. In this paper, we propose a novel Zadoff-Chu matrix Transform (ZCT) precoding based Selected Mapping (SLM) technique for PAPR reduction in OFDM systems. This technique is based on precoding the constellation symbols with the ZCT precoder after the multiplication of the phase rotation factor and before the Inverse Fast Fourier Transform (IFFT) in SLM based OFDM (SLM-OFDM) systems. Computer simulation results show that the proposed technique can reduce PAPR up to 5.2 dB for N=64 (system subcarriers) and V=16 (dissimilar phase sequences), at a clip rate of 10-3. Additionally, ZCT based SLM-OFDM (ZCT-SLM-OFDM) systems also take advantage of frequency variations of the communication channel and can also offer substantial performance gain in fading multipath channels.
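    The SLM mechanism, generating several candidate OFDM symbols with different phase rotations and transmitting the one with the lowest PAPR, is easy to sketch numerically. In the snippet below a Zadoff-Chu sequence applied element-wise stands in for the paper's ZCT precoding matrix, so the precoder, the QPSK mapping and the parameter choices should all be read as assumptions for illustration only.

```python
# Simplified numerical sketch of SLM-based PAPR reduction with a precoding
# step in front of the IFFT.  A Zadoff-Chu sequence is used here as a simple
# stand-in precoder; the paper's actual ZCT precoder is a Zadoff-Chu matrix
# transform, so treat this as an illustrative assumption, not its implementation.
import numpy as np

rng = np.random.default_rng(5)
N, V = 64, 16                                   # subcarriers, candidate phase sequences

def papr_db(x: np.ndarray) -> float:
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Zadoff-Chu sequence of length N (N even), root u = 1
u, n = 1, np.arange(N)
zc = np.exp(-1j * np.pi * u * n**2 / N)

qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)

best = None
for _ in range(V):
    phases = rng.choice([1, -1, 1j, -1j], N)    # SLM phase rotation sequence
    x_time = np.fft.ifft(zc * (qpsk * phases))  # precode, then IFFT
    cand = papr_db(x_time)
    best = cand if best is None else min(best, cand)

print(f"PAPR without SLM/precoding: {papr_db(np.fft.ifft(qpsk)):.2f} dB")
print(f"best PAPR over {V} SLM candidates: {best:.2f} dB")
```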

  3. Free-free and fixed base modal survey tests of the Space Station Common Module Prototype

    Science.gov (United States)

    Driskill, T. C.; Anderson, J. B.; Coleman, A. D.

    1992-01-01

    This paper describes the testing aspects and the problems encountered during the free-free and fixed base modal surveys completed on the original Space Station Common Module Prototype (CMP). The CMP is a 40-ft long by 14.5-ft diameter 'waffle-grid' cylinder built by the Boeing Company and housed at the Marshall Space Flight Center (MSFC) near Huntsville, AL. The CMP modal survey tests were conducted at MSFC by the Dynamics Test Branch. The free-free modal survey tests (June '90 to Sept. '90) included interface verification tests (IFVT), often referred to as impedance measurements, mass-additive testing and linearity studies. The fixed base modal survey tests (Feb. '91 to April '91), including linearity studies, were conducted in a fixture designed to constrain the CMP in 7 total degrees-of-freedom at five trunnion interfaces (two primary, two secondary, and the keel). The fixture also incorporated an airbag off-load system designed to alleviate the non-linear effects of friction in the primary and secondary trunnion interfaces. Numerous test configurations were performed with the objective of providing a modal data base for evaluating the various testing methodologies to verify dynamic finite element models used for input to coupled load analysis.

  4. Radiation synthesized protein-based nanoparticles: A technique overview

    International Nuclear Information System (INIS)

    Varca, Gustavo H.C.; Perossi, Gabriela G.; Grasselli, Mariano; Lugão, Ademar B.

    2014-01-01

    In the search for alternative routes for protein engineering, a novel technique – radiation-induced synthesis of protein nanoparticles – to achieve size-controlled particles with preserved bioactivity has recently been reported. This work aimed to evaluate different process conditions to optimize and provide an overview of the technique using γ-irradiation. Papain was used as the model protease and the samples were irradiated in a gamma cell irradiator in phosphate buffer (pH=7.0) containing ethanol (0–35%). The dose effect was evaluated by exposure to distinct γ-irradiation doses (2.5, 5, 7.5 and 10 kGy) and scale-up experiments involving distinct protein concentrations (12.5–50 mg mL−1) were also performed. Characterization involved size monitoring using dynamic light scattering. Bityrosine detection was performed using fluorescence measurements in order to provide experimental evidence of the mechanism involved. The best dose effects with regard to size were achieved at 10 kGy, and no relevant changes were observed as a function of papain concentration, highlighting a very broad operational concentration range. Bityrosine changes were identified for the samples as a function of the process, confirming that such linkages play an important role in nanoparticle formation. - Highlights: • Synthesis of protein-based nanoparticles by γ-irradiation. • Optimization of the technique. • Overview of the mechanism involved in nanoparticle formation. • Engineered papain nanoparticles for biomedical applications

  5. Under-Frequency Load Shedding Technique Considering Event-Based for an Islanded Distribution Network

    Directory of Open Access Journals (Sweden)

    Hasmaini Mohamad

    2016-06-01

    Full Text Available One of the biggest challenges for an islanding operation is to sustain frequency stability. A large power imbalance following islanding would cause under-frequency, hence an appropriate control is required to shed a certain amount of load. The main objective of this research is to develop an adaptive under-frequency load shedding (UFLS) technique for an islanded system. The technique is designed to be event-based, covering both the moment the system is islanded and the tripping of any DG unit during islanding operation. A disturbance magnitude is calculated to determine the amount of load to be shed. The technique is modeled using the PSCAD simulation tool. Simulation studies on a distribution network with mini hydro generation are carried out to evaluate the UFLS model. They are performed under different load conditions: peak and base load. Results show that the load shedding technique successfully shed the required amount of load and stabilized the system frequency.
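    The event-based shedding logic reduces to computing the disturbance magnitude (the generation deficit at the moment of islanding or of a DG trip) and disconnecting feeders in priority order until the deficit is covered. The sketch below illustrates that logic only; the feeder sizes, priorities and power figures are invented, and the scheme in the paper is implemented and tuned in PSCAD rather than in plain Python.

```python
# Hedged sketch of event-based under-frequency load shedding: at the islanding
# (or DG-trip) event the disturbance magnitude is taken as the generation
# deficit, and feeders are shed in priority order until the deficit is covered.
# Feeder sizes and priorities are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Feeder:
    name: str
    load_mw: float
    priority: int          # higher number = shed first

def loads_to_shed(generation_mw: float, total_load_mw: float, feeders: list[Feeder]) -> list[str]:
    deficit = total_load_mw - generation_mw          # disturbance magnitude
    if deficit <= 0:
        return []
    shed, covered = [], 0.0
    for f in sorted(feeders, key=lambda f: f.priority, reverse=True):
        if covered >= deficit:
            break
        shed.append(f.name)
        covered += f.load_mw
    return shed

if __name__ == "__main__":
    feeders = [Feeder("residential-A", 1.2, 3), Feeder("commercial-B", 0.8, 2),
               Feeder("hospital-C", 0.6, 1)]
    # mini-hydro supplies 1.5 MW after islanding against a 2.6 MW load
    print(loads_to_shed(generation_mw=1.5, total_load_mw=2.6, feeders=feeders))
```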

  6. Interference Mitigation Technique for Coexistence of Pulse-Based UWB and OFDM

    Directory of Open Access Journals (Sweden)

    Ohno Kohei

    2008-01-01

    Full Text Available Ultra-wideband (UWB) is a useful radio technique for sharing frequency bands between radio systems. It uses very short pulses to spread the spectrum. However, there is a potential for interference between systems using the same frequency bands at close range. In some regulatory systems, interference detection and avoidance (DAA) techniques are required to prevent interference with existing radio systems. In this paper, the effect of interference from pulse-based UWB on orthogonal frequency division multiplexing (OFDM) signals is discussed, and an interference mitigation technique is proposed. This technique focuses on the pulse repetition cycle of UWB. The pulse repetition interval is set to the same as, or half of, the period of the OFDM symbol excluding the guard interval to mitigate interference. These proposals are also made for direct sequence (DS) UWB. Bit error rate (BER) performance is illustrated through both simulation and theoretical approximations.

  7. Use of simulation-based education: a national survey of pediatric clerkship directors.

    Science.gov (United States)

    Vukin, Elizabeth; Greenberg, Robert; Auerbach, Marc; Chang, Lucy; Scotten, Mitzi; Tenney-Soeiro, Rebecca; Trainor, Jennifer; Dudas, Robert

    2014-01-01

    To document the prevalence of simulation-based education (SBE) for third- and fourth-year medical students; to determine the perceived importance of SBE; to characterize the barriers associated with establishing SBE. A 27-item survey regarding simulation was distributed to members of the Council on Medical Student Education in Pediatrics (COMSEP) as part of a larger survey in 2012. Seventy-one (48%) of 147 clerkship directors (CD) at COMSEP institutions responded to the survey questions regarding the use of SBE. Eighty-nine percent (63 of 71) of CDs reported use of SBE in some form: 27% of those programs (17 of 63) reported only the use of the online-based Computer-Assisted Learning in Pediatrics Program, and 73% (46 of 63) reported usage of other SBE modalities. Fifty-four percent of CDs (38 of 71) agreed that SBE is necessary to meet the requirements of the Liaison Committee on Medical Education (LCME). Multiple barriers were reported in initiating and implementing an SBE program. SBE is commonly used for instruction during pediatric undergraduate medical education in North American medical schools. Barriers to the use of SBE remain despite the perception that it is needed to meet requirements of the LCME. Copyright © 2014 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  8. A nationwide population-based cross-sectional survey of health-related quality of life in patients with myeloproliferative neoplasms in Denmark (MPNhealthSurvey): survey design and characteristics of respondents and nonrespondents

    Directory of Open Access Journals (Sweden)

    Brochmann N

    2017-03-01

    Full Text Available Nana Brochmann, Esben Meulengracht Flachs, Anne Illemann Christensen, Christen Lykkegaard Andersen, Knud Juel, Hans Carl Hasselbalch, Ann-Dorthe Zwisler. Objective: The Department of Hematology, Zealand University Hospital, Denmark, and the National Institute of Public Health, University of Southern Denmark, created the first nationwide, population-based, and the most comprehensive cross-sectional health-related quality of life (HRQoL) survey of patients with myeloproliferative neoplasms (MPNs). In Denmark, all MPN patients are treated in public hospitals and the treatments received are free of charge for these patients. Therefore, MPN patients receive the best available treatment to the extent that it is suitable for them and if they wish to receive it. The aims of this article are to describe the survey design and the characteristics of respondents and nonrespondents. Material and methods: Individuals with MPN diagnoses registered in the Danish National Patient Register (NPR) were invited to participate. The registers of the Danish Civil Registration System and Statistics Denmark provided information regarding demographics. The survey contained 120 questions: validated patient-reported outcome (PRO) questionnaires and additional questions addressing lifestyle. Results: A total of 4,704 individuals were registered with MPN diagnoses in the NPR, of whom 4,236 were eligible for participation and 2,613 (62%) responded. Overall, the respondents covered the broad spectrum of MPN patients, but patients 70–79 years old, living with

  9. A citizen science based survey method for estimating the density of urban carnivores

    Science.gov (United States)

    Baker, Rowenna; Charman, Naomi; Karlsson, Heidi; Yarnell, Richard W.; Mill, Aileen C.; Smith, Graham C.; Tolhurst, Bryony A.

    2018-01-01

    Globally there are many examples of synanthropic carnivores exploiting growth in urbanisation. As carnivores can come into conflict with humans and are potential vectors of zoonotic disease, assessing densities in suburban areas and identifying factors that influence them are necessary to aid management and mitigation. However, fragmented, privately owned land restricts the use of conventional carnivore surveying techniques in these areas, requiring development of novel methods. We present a method that combines questionnaire distribution to residents with field surveys and GIS, to determine relative density of two urban carnivores in England, Great Britain. We determined the density of: red fox (Vulpes vulpes) social groups in 14 approximately 1-km2 suburban areas in 8 different towns and cities; and Eurasian badger (Meles meles) social groups in three suburban areas of one city. Average relative fox group density (FGD) was 3.72 km-2, which was double the estimates for cities with resident foxes in the 1980s. Density was comparable to an alternative estimate derived from trapping and GPS-tracking, indicating the validity of the method. However, FGD did not correlate with a national dataset based on fox sightings, indicating unreliability of the national data to determine actual densities or to extrapolate a national population estimate. Using species-specific clustering units that reflect social organisation, the method was additionally applied to suburban badgers to derive relative badger group density (BGD) for one city (Brighton, 2.41 km-2). We demonstrate that citizen science approaches can effectively obtain data to assess suburban carnivore density; however, publicly derived national data sets need to be locally validated before extrapolations can be undertaken. The method we present for assessing densities of foxes and badgers in British towns and cities is also adaptable to other urban carnivores elsewhere. However, this transferability is contingent on

  10. Encoding technique for high data compaction in data bases of fusion devices

    International Nuclear Information System (INIS)

    Vega, J.; Cremy, C.; Sanchez, E.; Portas, A.; Dormido, S.

    1996-01-01

    At present, data requirements of hundreds of Mbytes/discharge are typical in devices such as JET, TFTR, DIII-D, etc., and these requirements continue to increase. With these rates, the amount of storage required to maintain discharge information is enormous. Compaction techniques are now essential to reduce storage. However, general compression techniques may distort signals, but this is undesirable for fusion diagnostics. We have developed a general technique for data compression which is described here. The technique, which is based on delta compression, does not require an examination of the data as in delayed methods. Delta values are compacted according to general encoding forms which satisfy a prefix code property and which are defined prior to data capture. Several prefix codes, which are bit oriented and which have variable code lengths, have been developed. These encoding methods are independent of the signal analog characteristics and enable one to store undistorted signals. The technique has been applied to databases of the TJ-I tokamak and the TJ-IU torsatron. Compaction rates of over 80% with negligible computational effort were achieved. Computer programs were written in ANSI C, thus ensuring portability and easy maintenance. We also present an interpretation, based on information theory, of the high compression rates achieved without signal distortion. copyright 1996 American Institute of Physics
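    The flavour of the approach, delta values emitted with a variable-length prefix code defined before data capture, can be shown with a generic example. The zigzag mapping and Elias gamma code below are stand-ins chosen for the sketch; they satisfy the prefix-code property but are not the encoding forms actually defined for the TJ-I and TJ-IU databases.

```python
# Illustrative sketch of the idea only: delta-encode integer samples as they
# arrive and emit each delta with a variable-length prefix code (Elias gamma
# on a zigzag-mapped value here).  The actual codes used for the TJ-I/TJ-IU
# databases are not reproduced; this is a generic stand-in.
def zigzag(v: int) -> int:
    """Map signed deltas onto non-negative codes: 0,-1,1,-2,2 -> 0,1,2,3,4."""
    return -2 * v - 1 if v < 0 else 2 * v

def elias_gamma(n: int) -> str:
    """Prefix-free code for n >= 1: (len-1) zeros followed by n in binary."""
    b = bin(n)[2:]
    return "0" * (len(b) - 1) + b

def encode(samples: list[int]) -> str:
    bits, prev = [], 0
    for s in samples:
        bits.append(elias_gamma(zigzag(s - prev) + 1))   # +1 so zero deltas are codable
        prev = s
    return "".join(bits)

if __name__ == "__main__":
    samples = [100, 101, 101, 99, 98, 98, 100, 103]       # slowly varying signal
    code = encode(samples)
    raw_bits = 16 * len(samples)                           # e.g. 16-bit ADC words
    print(code, f"-> {len(code)} bits vs {raw_bits} raw ({1 - len(code)/raw_bits:.0%} saved)")
```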

  11. A Survey of DHT Security Techniques

    NARCIS (Netherlands)

    Urdaneta Paredes, G.A.; Pierre, G.E.O.; van Steen, M.R.

    2011-01-01

    Peer-to-peer networks based on distributed hash tables (DHTs) have received considerable attention ever since their introduction in 2001. Unfortunately, DHT-based systems have been shown to be notoriously difficult to protect against security attacks. Various reports have been published that discuss

  12. Problem-based learning in laboratory medicine resident education: a satisfaction survey.

    Science.gov (United States)

    Lepiller, Quentin; Solis, Morgane; Velay, Aurélie; Gantner, Pierre; Sueur, Charlotte; Stoll-Keller, Françoise; Barth, Heidi; Fafi-Kremer, Samira

    2017-04-01

    Theoretical knowledge in biology and medicine plays a substantial role in laboratory medicine resident education. In this study, we assessed the contribution of problem-based learning (PBL) to improve the training of laboratory medicine residents during their internship in the department of virology, Strasbourg University Hospital, France. We compared the residents' satisfaction regarding an educational program based on PBL and a program based on lectures and presentations. PBL induced a high level of satisfaction (100%) among residents compared to lectures and presentations (53%). The main advantages of this technique were to create a situational interest regarding virological problems, to boost the residents' motivation and to help them identify the most relevant learning objectives in virology. However, it appears pertinent to educate the residents in appropriate bibliographic research techniques prior to PBL use and to monitor their learning by regular formative assessment sessions.

  13. CNMI Boat-based Creel Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Commonwealth of the Northern Mariana Islands (CNMI) Creel surveys are operated by the Division of Fish and Wildlife (DFW) and are only on the island of Saipan....

  14. Controlling Aedes aegypti population as DHF vector with radiation based-sterile insect technique in Banjarnegara Regency, Central Java

    International Nuclear Information System (INIS)

    Siti Nurhayati; Bambang Yunianto; Tri Ramadhani; Bina Ikawati; Budi Santoso; Ali Rahayu

    2013-01-01

    The dengue hemorrhagic fever (DHF) control program in Indonesia is still a problem due to incomplete integrated handling. The sterile insect technique (SIT) for Aedes aegypti as the DHF vector was considered a potential strategy for controlling DHF. A preliminary survey was carried out to determine the characteristics of the A. aegypti population in the study site before the implementation of SIT. The implementation of radiation-based SIT was carried out in the Krandegan and Kutabanjar Villages of Banjarnegara Regency, Central Java, and involved 99 houses. One hundred gamma-ray-irradiated male mosquitoes were released into each house up to five times. Eggs, larvae and adult mosquitoes were collected using ovitraps and observed weekly. The initial population density of A. aegypti in the studied area was found to be 6 mosquitoes per house, the mean house index was 15.86%, and the mean sterility of the sterilized mosquitoes was 79.16%. The SIT effectively reduced the A. aegypti population after the fifth release of irradiated mosquitoes into the houses. It can be assumed that the SIT was effective in controlling the DHF vector in the studied area; nevertheless, it will be more effective if combined with other handling techniques. (author)

  15. Overdenture retained by teeth using a definitive denture base technique: a case report.

    Science.gov (United States)

    Nascimento, D F F; dos Santos, J F F; Marchini, L

    2010-09-01

    This paper presents a technique involving the use of a definitive denture base to make overdentures. Cores with ball attachments were cemented over remaining lower teeth. Impressions of the edentulous maxilla and mandible were taken to obtain a definitive acrylic resin base. The definitive base of the mandible was perforated at the location of ball attachments and its female components were fixed to the base using acrylic resin directly in the patient's mouth. Wax rims were then made, jaw relationships recorded, teeth mounted and tried in, and the dentures were cured. This technique allowed for easy fixing of female components and better retention during the recording of jaw relationships, and can also be used in the construction of implant retained dentures.

  16. Vehicle-borne survey techniques for background radiations

    International Nuclear Information System (INIS)

    Minato, Susumu

    1995-01-01

    This paper presented methods for converting count rates measured inside cars and trains in the natural environment into outdoor terrestrial gamma-ray dose rates. First, (1) a calibration method that makes a survey meter applicable to various geological terrains is described. Next, regression formulas were acquired experimentally to correct for (2) the shielding effects of cars and trains and (3) the influence of pavements and ballast. Furthermore, (4) a new method for removing interfering radiation components emitted from cliffs and tunnels was proposed, and the errors in the calculations were evaluated with numerical experiments. In addition, the degree of influence from a cliff was represented by the elevation angle it subtends at the detector. For items (2)-(4) in particular, simple models can explain why these methods are reasonable. A method for simply and accurately evaluating cosmic-ray dose rates by means of a portable barometer was also described. (author)
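    The conversion chain described above can be summarised as a count rate multiplied by a calibration factor and divided by transmission factors for the vehicle body and the pavement or ballast. The function below is a minimal sketch of that chain; every numerical factor in it is an assumed placeholder, not one of the regression coefficients derived in the paper.

```python
# Minimal sketch of the conversion chain the paper describes: an in-vehicle
# count rate is turned into an outdoor terrestrial dose rate via a calibration
# factor and multiplicative correction factors for vehicle shielding and for
# pavement/ballast.  All numerical factors below are illustrative assumptions.
def outdoor_dose_rate_nGy_h(count_rate_cps: float,
                            calibration_nGy_h_per_cps: float = 1.7,
                            vehicle_transmission: float = 0.75,
                            pavement_factor: float = 0.85) -> float:
    """Convert an in-car count rate (cps) to an outdoor terrestrial dose rate."""
    indicated = count_rate_cps * calibration_nGy_h_per_cps
    return indicated / (vehicle_transmission * pavement_factor)

if __name__ == "__main__":
    print(f"{outdoor_dose_rate_nGy_h(25.0):.1f} nGy/h")
```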

  17. Solid state semiconductor detectorized survey meter

    International Nuclear Information System (INIS)

    Okamoto, Eisuke; Nagase, Yoshiyuki; Furuhashi, Masato

    1987-01-01

    Survey meters are used to measure the ambient gamma-ray dose rate and the surface contamination density at atomic energy plants, radiation facilities, and similar sites. We have recently developed a semiconductor-type survey meter (commercial name: Compact Survey Meter). This survey meter is a small-sized dose rate meter with excellent functionality. Its special features are the use of a semiconductor detector developed with our own technique, a stabler and wider range than the old type, long life, and ease of carrying. Here we introduce the efficiency and the functions of the survey meter. (author)

  18. The utilization of irradiation techniques in food industry

    International Nuclear Information System (INIS)

    Szabo, S.A.

    1988-01-01

    Research on irradiation techniques is surveyed, and the main areas of nuclear technical applications where such techniques are used are reported. An overview of radiation techniques used in the food industry, including radiostimulation, radiomutation, radurization, radioecology and isotope techniques, is then presented. (author) 4 refs

  19. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    Science.gov (United States)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Because visual attention is spatially non-uniform, different locations in an image are of different importance for the perception of that image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or by objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM uses low-level features to assign a different importance to each location in the image. Still, none of these objective metrics incorporates an analysis of regions of interest. We address the question of whether these objective metrics can be used for the effective evaluation of images reconstructed by processing techniques based on ROI analysis that exploit high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed to reconstruct the ROI at fine quality while the rest of the image is reconstructed at low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
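
    A minimal sketch of the underlying concern, using scikit-image's SSIM on synthetic data: the same image pair can score well when SSIM is computed over the whole frame yet score much worse when the metric is restricted to a region of interest. The ROI location, the degradation model, and the data are all invented for illustration and are not taken from the paper.

        import numpy as np
        from skimage.metrics import structural_similarity as ssim

        rng = np.random.default_rng(0)
        reference = rng.random((256, 256))                 # stand-in for the reference image
        degraded = reference + 0.02 * rng.standard_normal(reference.shape)

        # Hypothetical ROI: a central patch that is reconstructed much more poorly.
        roi = (slice(96, 160), slice(96, 160))
        degraded[roi] += 0.2 * rng.standard_normal((64, 64))
        degraded = np.clip(degraded, 0.0, 1.0)

        global_score = ssim(reference, degraded, data_range=1.0)
        roi_score = ssim(reference[roi], degraded[roi], data_range=1.0)

        print(f"SSIM over the whole image: {global_score:.3f}")
        print(f"SSIM over the ROI only:    {roi_score:.3f}")   # noticeably lower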

  20. National Structural Survey of Veterans Affairs Home-Based Primary Care Programs.

    Science.gov (United States)

    Karuza, Jurgis; Gillespie, Suzanne M; Olsan, Tobie; Cai, Xeuya; Dang, Stuti; Intrator, Orna; Li, Jiejin; Gao, Shan; Kinosian, Bruce; Edes, Thomas

    2017-12-01

    To describe the current structural and practice characteristics of the Department of Veterans Affairs (VA) Home-Based Primary Care (HBPC) program. We designed a national survey and surveyed HBPC program directors on-line using REDCap. We received 236 surveys from 394 identified HBPC sites (60% response rate). HBPC site characteristics were quantified using closed-ended formats. HBPC program directors were most often registered nurses, and HBPC programs primarily served veterans with complex chronic illnesses who were at high risk of hospitalization and nursing home care. Primary care was delivered using interdisciplinary teams, with nurses, social workers, and registered dietitians as team members in more than 90% of the sites. Most often, nurse practitioners were the principal primary care providers (PCPs), typically working with nurse case managers. Nearly 60% of the sites reported dual PCPs involving VA and community-based physicians. Nearly all sites provided access to a core set of comprehensive services and programs (e.g., case management, supportive home health care). At the same time, there were variations according to site (e.g., size, location (urban, rural), use of non-VA hospitals, primary care models used). HBPC sites reflected the rationale and mission of HBPC by focusing on the complex chronic illnesses of home-based veterans and providing comprehensive primary care using interdisciplinary teams. Our next series of studies will examine how HBPC site structural characteristics and care models are related to the processes and outcomes of care to determine whether there are best practice standards that define an optimal HBPC structure and care model or whether multiple approaches to HBPC better serve the needs of veterans. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  1. Analysis of the usage of techniques and tools in the Six Sigma program based on a survey-type assessment

    Directory of Open Access Journals (Sweden)

    Eduardo Guilherme Satolo

    2009-01-01

    Full Text Available The Six Sigma program is an approach currently adopted by many companies, and its use takes place through a highly disciplined and guided process in which several techniques and tools are applied with the aim of generating a continuous improvement cycle. Given this fact and considering the relevance of the subject, a survey-type study was carried out to identify and analyse the techniques and tools used in the phases of the DMAIC method and to compare them with what the literature prescribes. The study assessed the techniques and tools most and least used by the companies, allowing these results to be compared with previous surveys conducted abroad. It is also worth highlighting that the results confirmed the indications of the literature, namely that the Six Sigma program needs to be supported by measurable and trustworthy data, making it evident that the use of techniques and tools is indispensable to the DMAIC improvement method.

  2. New calibration technique for KCD-based megavoltage imaging

    Science.gov (United States)

    Samant, Sanjiv S.; Zheng, Wei; DiBianca, Frank A.; Zeman, Herbert D.; Laughter, Joseph S.

    1999-05-01

    In megavoltage imaging, current commercial electronic portal imaging devices (EPIDs), despite having the advantage of immediate digital imaging over film, suffer from poor image contrast and spatial resolution. The feasibility of using a kinestatic charge detector (KCD) as an EPID to provide superior image contrast and spatial resolution for portal imaging was demonstrated in a previous paper. The KCD system had the additional advantage of requiring an extremely low dose per acquired image, allowing superior images to be reconstructed from a single linac pulse per image pixel. The KCD-based images used a dose two orders of magnitude less than that for EPIDs and film. Compared with current commercial EPIDs and film, the prototype KCD system exhibited promising image quality, despite being handicapped by the use of a relatively simple image calibration technique and by the performance limits of medical linacs on the maximum linac pulse frequency and energy flux per pulse delivered. This image calibration technique fixed relative image pixel values based on a linear interpolation of extrema provided by an air-water calibration, and accounted only for channel-to-channel variations; its counterpart for area detectors is the standard flat-fielding method. A comprehensive calibration protocol has been developed. The new technique additionally corrects for geometric distortions due to variations in the scan velocity and for timing artifacts caused by mis-synchronization between the linear accelerator and the data acquisition system (DAS). The role of variations in energy flux (2-3%) on imaging is shown to be insignificant for the images considered. The methodology is presented, and the results are discussed for simulated images. The approach also allows for significant improvements in the signal-to-noise ratio (SNR) by increasing the dose using multiple images without having to increase the linac pulse frequency or energy flux per pulse.
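
    The simple baseline calibration mentioned above, a per-channel linear interpolation between two extrema, is easy to sketch. The channel count, noise levels, and readings below are invented for illustration; the paper's comprehensive protocol adds scan-velocity and timing corrections that are not modeled here.

        import numpy as np

        rng = np.random.default_rng(1)
        n_channels = 64

        # Hypothetical raw detector readings for the two calibration extrema.
        air_reading = 1000.0 * (1.0 + 0.05 * rng.standard_normal(n_channels))    # "high" extreme
        water_reading = 200.0 * (1.0 + 0.05 * rng.standard_normal(n_channels))   # "low" extreme

        def calibrate(raw_scan):
            """Map each channel's raw value linearly onto [0, 1] between its water and air extrema."""
            return (raw_scan - water_reading) / (air_reading - water_reading)

        # A fake scan line lying 60% of the way between the two extrema in every channel.
        raw = water_reading + 0.6 * (air_reading - water_reading)
        print(calibrate(raw)[:8])   # 0.6 for every channel once per-channel gain/offset are removed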

  3. Development and application of the analyzer-based imaging technique with hard synchrotron radiation

    Energy Technology Data Exchange (ETDEWEB)

    Coan, P

    2006-07-15

    The objective of this thesis is twofold: on the one hand, the application of analyser-based X-ray phase contrast imaging to the study of cartilage, bone and bone implants using ESRF synchrotron radiation sources, and on the other, a contribution to the theoretical and experimental development of phase contrast techniques. Several human samples have been studied in vitro using the analyser-based imaging (ABI) technique. Examination included projection and computed tomography imaging and 3-dimensional volume rendering of hip, big toe and ankle articular joints. X-ray ABI images have been critically compared with those obtained with conventional techniques, including radiography, computed tomography, ultrasound, magnetic resonance and histology, the latter taken as the gold standard. Results show that only ABI was able to either visualize or correctly estimate the early pathological status of the cartilage. The status of bone ingrowth in sheep implants was also examined in vitro: ABI images made it possible to correctly distinguish between good and incomplete bone healing. Pioneering in vivo ABI of guinea pigs was also successfully performed, confirming the possible use of the technique to follow the progression of joint diseases, bone/metal ingrowth and the efficacy of drug treatments. As part of the development of the phase contrast techniques, two objectives have been reached. First, it was experimentally demonstrated for the first time that ABI and propagation-based imaging (PBI) can be combined to create images with original features (hybrid imaging, HI). Second, a new simplified set-up capable of producing images with properties similar to those obtained with the ABI technique or HI was proposed and experimentally tested. Finally, both ABI and HI have been theoretically studied with an innovative, wave-based simulation program, which was able to correctly reproduce experimental results. (author)

  4. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
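
    A minimal sketch of the idea on synthetic data: residuals from a deliberately misspecified linear fit are cumulated over the covariate, and the observed path is compared with paths generated by standard-normal multipliers. The multiplier step is a simplified stand-in for the zero-mean Gaussian processes described in the abstract (the full method also accounts for parameter-estimation variability), and all data are simulated.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 300
        x = rng.uniform(0, 3, n)
        y = 1.0 + 0.5 * x**2 + rng.standard_normal(n)       # true relationship is quadratic ...

        # ... but we (mis)fit a straight line.
        X = np.column_stack([np.ones(n), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta

        order = np.argsort(x)
        observed = np.cumsum(resid[order])                   # cumulative residual process over x

        # Simple multiplier realizations under the assumed (linear) model.
        n_sim = 1000
        sims = np.cumsum(resid[order] * rng.standard_normal((n_sim, n)), axis=1)

        sup_obs = np.max(np.abs(observed))
        p_value = np.mean(np.max(np.abs(sims), axis=1) >= sup_obs)
        print(f"sup |W(x)| = {sup_obs:.1f}, approximate p-value = {p_value:.3f}")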

  5. An automated radiological survey method for performing site remediation and decommissioning

    International Nuclear Information System (INIS)

    Handy, R.G.; Bolch, W.E.; Harder, G.F.; Tolaymat, T.M.

    1994-01-01

    A portable, computer-based method of performing environmental monitoring and assessment for site remediation and decommissioning has been developed. The integrated system provides survey time reductions and real-time data analysis. The technique utilizes a notebook 486 computer with the necessary hardware and software components, making it applicable to an almost unlimited number of environmental monitoring and assessment scenarios. Results from a pilot "hide-and-seek" gamma survey and an actual alpha decontamination survey are presented. The "hide-and-seek" survey produced timely and accurate conclusions about the position of the source. The use of the automated system in a Th-232 alpha survey resulted in a reduction in the standard time necessary to perform a radiological survey. In addition, the ability to analyze the data on-site allowed areas needing further decontamination to be identified and located. Finally, possible future improvements and field conclusions are discussed.

  6. Modelers' perception of mathematical modeling in epidemiology: a web-based survey.

    Directory of Open Access Journals (Sweden)

    Gilles Hejblum

    Full Text Available BACKGROUND: Mathematical modeling in epidemiology (MME) is being used increasingly. However, there are many uncertainties in terms of definitions, uses and quality features of MME. METHODOLOGY/PRINCIPAL FINDINGS: To delineate the current status of these models, a 10-item questionnaire on MME was devised. Proposed via an anonymous internet-based survey, the questionnaire was completed by 189 scientists who had published in the domain of MME. A small minority (18% of respondents) claimed to have in mind a concise definition of MME. Some techniques were identified by the researchers as characterizing MME (e.g. Markov models), while others, at the same level of mathematical sophistication, were not (e.g. Cox regression). The researchers' opinions also contrasted regarding the potential applications of MME, which was perceived as highly relevant for providing insight into complex mechanisms and less relevant for identifying causal factors. The quality criteria cited were those of good science and were not related to the size and nature of the public health problems addressed. CONCLUSIONS/SIGNIFICANCE: This study shows that perceptions of the nature, uses and quality criteria of MME are contrasted, even among the very community of published authors in this domain. Nevertheless, MME is an emerging discipline in epidemiology, and this study underlines that it is associated with specific areas of application and methods. The development of this discipline is likely to deserve a framework providing recommendations and guidance at the various steps of a study, from design to reporting.

  7. GPR as a Low Impact Paleontogical Survey Technique

    Science.gov (United States)

    Sturdevant, G. C.; Leverence, R.; Stewart, R.

    2013-12-01

    The Deweyville Formation, a Pleistocene fluvial sandstone, is a prolific source of megafaunal fossils from periods of low-stand environmental conditions. GPR was employed in an environmentally sensitive area in close proximity to a salt dome in Northwest Harris County, Texas, as a method of evaluating the probable paleo-depositional environment and to prospect for potential further site development of two distinct fossiliferous zones. The primary zone of interest is a lag-gravel-bounded sand responsible for producing a regionally unique fossil assemblage including South American megafauna (Lundelius et al, 2013). The secondary zone of interest contains undisturbed mammoth remains housed in coarse white sand emplaced on top of a clay drape, which has been hypothesized to represent an oxbow lake formed by the meandering paleo-Brazos river. With an accurate map of the paleo-channel, planning of future activity can focus on maximizing fossil recovery and minimizing site impact. Pulse EKKO 250 MHz, 400 MHz, and 1 GHz systems were employed in a prospect area proximal to the secondary site to calibrate and evaluate these systems' resolution and penetration depth in the modern sediments. The data were processed using the EKKO Mapper and EKKO View Deluxe software packages, and 3D volumes were produced and sliced. Preliminary results from the 250 MHz system demonstrate successful imaging of the sand-clay interface. After these surveys were run, a small portion of the site was excavated to confirm the estimated velocities and the observed anomalies, refine our modeling and interpretation, and improve grid design for further surveys. It was confirmed that the sand-clay interface was easily observable using GPR; however, the grid spacing proved too wide, leading to artifacts in the 3D volume produced.
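
    For readers unfamiliar with how radargram travel times become depths, the sketch below applies the basic two-way travel-time conversion. The velocity is a typical textbook value for moist sand used only as a placeholder; it is not a velocity reported by this survey.

        # Basic GPR depth conversion: depth = velocity * two-way travel time / 2.
        def gpr_depth_m(two_way_time_ns, velocity_m_per_ns=0.10):   # placeholder velocity for moist sand
            """Convert a two-way travel time (ns) to reflector depth (m)."""
            return velocity_m_per_ns * two_way_time_ns / 2.0

        for t in (20.0, 40.0, 80.0):
            print(f"{t:5.1f} ns two-way time -> {gpr_depth_m(t):4.1f} m depth")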

  8. The development of additive manufacturing technique for nickel-base alloys: A review

    Science.gov (United States)

    Zadi-Maad, Ahmad; Basuki, Arif

    2018-04-01

    Nickel-base alloys are attractive due to their excellent mechanical properties and their high resistance to creep deformation, corrosion, and oxidation. However, controlling performance when casting or forging this material is a hard task. In recent years, additive manufacturing (AM) processes have been implemented to replace the conventional directional solidification process for the production of nickel-base alloys. Due to its potentially lower cost and more flexible manufacturing process, AM is considered a substitute for the existing techniques. This paper provides a comprehensive review of the previous work related to AM techniques for Ni-base alloys while highlighting current challenges and methods for solving them. The properties of conventionally manufactured Ni-base alloys are also compared with those of AM-fabricated alloys. The mechanical properties obtained from tensile, hardness, and fatigue tests are included, along with discussions of the effect of post-treatment processes. Recommendations for further work are also provided.

  9. Combining Internet-Based and Postal Survey Methods in a Survey among Gynecologists: Results of a Randomized Trial.

    Science.gov (United States)

    Ernst, Sinja Alexandra; Brand, Tilman; Lhachimi, Stefan K; Zeeb, Hajo

    2018-04-01

    To assess whether a combination of Internet-based and postal survey methods (mixed-mode) compared to postal-only survey methods (postal-only) leads to improved response rates in a physician survey, and to compare the cost implications of the different recruitment strategies. All primary care gynecologists in Bremen and Lower Saxony, Germany, were invited to participate in a cross-sectional survey from January to July 2014. The sample was divided into two strata (A; B) depending on availability of an email address. Within each stratum, potential participants were randomly assigned to mixed-mode or postal-only group. In Stratum A, the mixed-mode group had a lower response rate compared to the postal-only group (12.5 vs. 20.2 percent; RR = 0.61, 95 percent CI: 0.44-0.87). In stratum B, no significant differences were found (15.6 vs. 16.2 percent; RR = 0.95, 95 percent CI: 0.62-1.44). Total costs (in €) per valid questionnaire returned (Stratum A: 399.72 vs. 248.85; Stratum B: 496.37 vs. 455.15) and per percentage point of response (Stratum A: 1,379.02 vs. 861.02; Stratum B 1,116.82 vs. 1,024.09) were higher, whereas variable costs were lower in mixed-mode compared to the respective postal-only groups (Stratum A cost ratio: 0.47, Stratum B cost ratio: 0.71). In this study, primary care gynecologists were more likely to participate by traditional postal-only than by mixed-mode survey methods that first offered an Internet option. However, the lower response rate for the mixed-mode method may be partly due to the older age structure of the responding gynecologists. Variable costs per returned questionnaire were substantially lower in mixed-mode groups and indicate the potential for cost savings if the sample population is sufficiently large. © Health Research and Educational Trust.
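
    The response-rate comparison above is summarized as a rate ratio with a 95% confidence interval; the sketch below shows the standard log-normal approximation for such a ratio. The group sizes are hypothetical and merely chosen so that the two rates match the 12.5% and 20.2% reported for Stratum A, so the resulting interval will not reproduce the paper's exact 0.44-0.87.

        import math

        def rate_ratio_ci(resp_a, n_a, resp_b, n_b, z=1.96):
            """Rate ratio of group A vs. group B with a log-normal 95% CI."""
            rr = (resp_a / n_a) / (resp_b / n_b)
            se_log = math.sqrt(1/resp_a - 1/n_a + 1/resp_b - 1/n_b)
            lo, hi = (rr * math.exp(s * z * se_log) for s in (-1, +1))
            return rr, lo, hi

        # Hypothetical group sizes chosen so the rates equal 12.5% and 20.2%.
        rr, lo, hi = rate_ratio_ci(resp_a=25, n_a=200, resp_b=40, n_b=198)
        print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")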

  10. Efficient Identification Using a Prime-Feature-Based Technique

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Haq, Shaiq A.; Valente, Andrea

    2011-01-01

    Identification of authorized train drivers through biometrics is a growing area of interest in locomotive radio remote control systems. The existing technique of password authentication is not very reliable, and potentially unauthorized personnel may also operate the system on behalf of the operator. A fingerprint identification system, implemented on PC/104-based real-time systems, can accurately identify the operator. Traditionally, the uniqueness of a fingerprint is determined by the overall pattern of ridges and valleys as well as by local ridge anomalies (e.g., a ridge bifurcation or a ridge ending), which are called minutiae points. Designing a reliable automatic fingerprint matching algorithm for a minimal platform is quite challenging. In real-time systems, the efficiency of the matching algorithm is of utmost importance. To achieve this goal, a prime-feature-based indexing algorithm is proposed.

  11. Authentication Protocols for Internet of Things: A Comprehensive Survey

    Directory of Open Access Journals (Sweden)

    Mohamed Amine Ferrag

    2017-01-01

    Full Text Available In this paper, a comprehensive survey of authentication protocols for the Internet of Things (IoT) is presented. Specifically, more than forty authentication protocols developed for or applied in the context of the IoT are selected and examined in detail. These protocols are categorized based on the target environment: (1) Machine-to-Machine Communications (M2M), (2) Internet of Vehicles (IoV), (3) Internet of Energy (IoE), and (4) Internet of Sensors (IoS). Threat models, countermeasures, and formal security verification techniques used in authentication protocols for the IoT are presented. In addition, a taxonomy and comparison of the authentication protocols developed for the IoT in terms of network model, specific security goals, main processes, computation complexity, and communication overhead are provided. Based on the current survey, open issues are identified and future research directions are proposed.
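
    As a generic illustration of one building block that recurs across protocols of this kind (not an example taken from the surveyed paper), the sketch below implements a symmetric-key challenge-response exchange with HMAC from the Python standard library. The device identifier and pre-shared key are hypothetical.

        import hmac, hashlib, secrets

        PRE_SHARED_KEY = b"demo-psk-not-for-production"   # hypothetical key provisioned on the device

        def server_challenge() -> bytes:
            """Server/gateway issues a fresh random nonce as the challenge."""
            return secrets.token_bytes(16)

        def device_response(device_id: str, challenge: bytes) -> bytes:
            """Device proves knowledge of the pre-shared key without revealing it."""
            return hmac.new(PRE_SHARED_KEY, device_id.encode() + challenge, hashlib.sha256).digest()

        def server_verify(device_id: str, challenge: bytes, response: bytes) -> bool:
            expected = hmac.new(PRE_SHARED_KEY, device_id.encode() + challenge, hashlib.sha256).digest()
            return hmac.compare_digest(expected, response)   # constant-time comparison

        challenge = server_challenge()
        tag = device_response("sensor-42", challenge)
        print("authenticated:", server_verify("sensor-42", challenge, tag))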

  12. ULTRALUMINOUS INFRARED GALAXIES IN THE WISE AND SDSS SURVEYS

    International Nuclear Information System (INIS)

    Su, Shanshan; Kong, Xu; Li, Jinrong; Fang, Guanwen

    2013-01-01

    In this paper, we present a large catalog of 419 ultraluminous infrared galaxies (ULIRGs), carefully selected from the Wide-field Infrared Survey Explorer mid-infrared data and the Sloan Digital Sky Survey eighth data release, and classify them into three subsamples based on their emission line properties: H II-like ULIRGs, Seyfert 2 ULIRGs, and composite ULIRGs. We apply our new, efficient spectral synthesis technique, which is based on a mean-field approach to the Bayesian independent component analysis (MF-ICA) method, to the integrated galaxy spectra. We also analyze the stellar population properties, including percentage contribution, stellar age, and stellar mass, for these three types of ULIRGs, and explore the evolution among them. We find no significant difference between the properties of the stellar populations in ULIRGs with or without active galactic nucleus components. Our results suggest that there is no evolutionary link among these three types of ULIRGs.
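
    The decomposition step can be pictured with the toy sketch below, which separates synthetic galaxy spectra into components using scikit-learn's FastICA as a readily available stand-in for the mean-field Bayesian MF-ICA actually used in the paper; the spectra, components, and mixing weights are all invented.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        wavelengths = np.linspace(3800.0, 9200.0, 500)       # angstroms, arbitrary grid

        # Two hypothetical source components: a "red" and a "blue" spectral shape.
        old_pop = np.exp(-(wavelengths - 8000.0) ** 2 / 2e6)
        young_pop = np.exp(-(wavelengths - 4500.0) ** 2 / 2e6)

        # Build 50 fake galaxy spectra as random positive mixtures plus noise.
        weights = rng.uniform(0.2, 1.0, size=(50, 2))
        spectra = weights @ np.vstack([old_pop, young_pop])
        spectra += 0.01 * rng.standard_normal(spectra.shape)

        ica = FastICA(n_components=2, random_state=0)
        mixing_per_galaxy = ica.fit_transform(spectra)        # per-galaxy component amplitudes
        components = ica.components_                          # recovered spectral components

        print(mixing_per_galaxy.shape, components.shape)      # (50, 2) (2, 500)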

  13. Neural Representation. A Survey-Based Analysis of the Notion

    Directory of Open Access Journals (Sweden)

    Oscar Vilarroya

    2017-08-01

    Full Text Available The word representation (as in “neural representation”), and many of its related terms, such as to represent, representational and the like, play a central explanatory role in the neuroscience literature. For instance, in the “place cell” literature, place cells are extensively associated with their role in “the representation of space.” In spite of its extended use, we still lack a clear, universal and widely accepted view on what it means for a nervous system to represent something, on what makes a neural activity a representation, and on what is re-presented. The lack of a theoretical foundation and definition of the notion has not hindered actual research. My aim here is to identify how active scientists use the notion of neural representation, and eventually to list a set of criteria, based on actual use, that can help in distinguishing between genuine and non-genuine neural-representation candidates. In order to attain this objective, I first present the results of a survey of authors within two domains, place-cell and multivariate pattern analysis (MVPA) research. Based on the authors’ replies, and on a review of neuroscientific research, I outline a set of common properties that an account of neural representation seems to require. I then apply these properties to assess the use of the notion in the two domains of the survey, place-cell and MVPA studies. I conclude by exploring a shift in the notion of representation suggested by recent literature.

  14. Evaluation of Clipping Based Iterative PAPR Reduction Techniques for FBMC Systems

    Directory of Open Access Journals (Sweden)

    Zsolt Kollár

    2014-01-01

    Filter bank multicarrier (FBMC) transmission offers advantages compared to the conventional orthogonal frequency division multiplexing (OFDM) technique. The low ACLR of the transmitted FBMC signal makes it especially favorable in cognitive radio applications, where strict requirements are posed on out-of-band radiation. A large dynamic range, resulting in a high peak-to-average power ratio (PAPR), is characteristic of all sorts of multicarrier signals. The advantageous spectral properties of the high-PAPR FBMC signal are significantly degraded if nonlinearities are present in the transceiver chain. Spectral regrowth may appear, causing harmful interference in the neighboring frequency bands. This paper presents novel clipping based PAPR reduction techniques, evaluated and compared by simulations and measurements, with an emphasis on spectral aspects. The paper gives an overall comparison of PAPR reduction techniques, focusing on reducing the dynamic range of FBMC signals without increasing out-of-band radiation. An overview is presented of transmitter-oriented techniques employing baseband clipping, which can maintain the system performance at a desired bit error rate (BER).
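
    The effect targeted by clipping can be reproduced numerically with the small sketch below, which measures the PAPR of a generic multicarrier signal (plain IFFT-based, not a full FBMC filter bank) before and after amplitude clipping. The subcarrier count and clipping ratio are arbitrary choices for illustration, not parameters from the paper.

        import numpy as np

        rng = np.random.default_rng(3)
        n_subcarriers = 256
        n_symbols = 200

        def papr_db(x):
            """Peak-to-average power ratio of a complex baseband signal, in dB."""
            return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

        # QPSK data on every subcarrier, converted to the time domain per symbol.
        bits = rng.integers(0, 2, size=(n_symbols, n_subcarriers, 2))
        qpsk = ((2 * bits[..., 0] - 1) + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)
        time_signal = np.fft.ifft(qpsk, axis=1) * np.sqrt(n_subcarriers)

        clip_ratio_db = 4.0                                    # hypothetical clipping threshold above RMS
        rms = np.sqrt(np.mean(np.abs(time_signal) ** 2))
        threshold = rms * 10 ** (clip_ratio_db / 20)

        mag = np.abs(time_signal)
        safe_mag = np.maximum(mag, 1e-12)                      # avoid division by zero
        clipped = np.where(mag > threshold,
                           threshold * time_signal / safe_mag, # keep phase, limit magnitude
                           time_signal)

        print(f"PAPR before clipping: {papr_db(time_signal):.1f} dB")
        print(f"PAPR after clipping:  {papr_db(clipped):.1f} dB")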

  15. Social workers’ orientation toward the evidence-based practice process : A Dutch survey

    NARCIS (Netherlands)

    van der Zwet, R.J.M.; Beneken Genaamd Kolmer, D.M.; Schalk, R.

    2016-01-01

    Objectives: This study assesses social workers’ orientation toward the evidence-based practice (EBP) process and explores which specific variables (e.g. age) are associated. Methods: Data were collected from 341 Dutch social workers through an online survey which included a Dutch translation of the

  16. The applications of vehicle borne and ground gamma ray spectrometry in environmental radioactivity survey and monitoring: examples from the Philippines

    International Nuclear Information System (INIS)

    Reyes, R.Y.; Petrache, C.A.; Garcia, N.Q.; Tabora, E.U.; Juson, J.G.

    2002-01-01

    In the light of nuclear development all over the world, there is increasing global awareness of matters related to radioactivity and radioactive accidents. As such, the Philippine Nuclear Research Institute (PNRI) acquired, through a technical cooperation project with the International Atomic Energy Agency, vehicle borne (car borne) and portable (ground) gamma ray spectrometers. The objectives of this project were to establish environmental baseline information on the natural radioactivity of the entire country and to generate radioelement maps for geological mapping and mineral resource assessment. The purpose of this paper is to present the results of the different surveys, including the methodologies and techniques used in the country with both spectrometers to effectively map natural and man-made sources of radiation. A pilot survey was successfully carried out over the small island of Marinduque (989 km²) using the combined car borne and ground gamma ray spectrometric survey techniques, in preparation for the planned nationwide survey using this approach. A highlight of this study was the production of the first natural radioactivity maps within the country; interestingly, these maps closely reflect the local geology of Marinduque Island. Car borne gamma ray spectrometric surveys were likewise undertaken at the former US naval base in Subic and the US air force base in Clark, owing to mounting public concern over the presence of possible radioactive contamination or materials left behind by the US military forces in these bases. Results using the gamma-ray spectrum ratio technique indicated the absence of man-made sources of radiation in the areas monitored within the two bases. A sizeable part of Metro Manila, the capital of the Philippines, has also been covered by the car borne survey. The results revealed an area with high thorium measurements. The radiation source is coming from an establishment that uses thorium nitrate in
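
    The spectrum-ratio screening mentioned above can be pictured with the sketch below: the gross count rate along a traverse is compared with the count rate explained by the natural potassium, uranium, and thorium windows, and points where the ratio jumps are flagged as possible man-made sources. The window coefficients, noise levels, and injected anomaly are all hypothetical and are not taken from the surveys described here.

        import numpy as np

        rng = np.random.default_rng(7)
        n_points = 500

        # Hypothetical window count rates along a car-borne traverse (counts per second).
        k_win = rng.normal(30, 3, n_points)
        u_win = rng.normal(10, 1.5, n_points)
        th_win = rng.normal(15, 2, n_points)
        gross = 2.0 * k_win + 3.0 * u_win + 4.0 * th_win + rng.normal(0, 5, n_points)

        # Inject a localized artificial (non-natural) contribution to the gross channel only.
        gross[240:245] += 150.0

        natural_estimate = 2.0 * k_win + 3.0 * u_win + 4.0 * th_win
        ratio = gross / natural_estimate

        threshold = ratio.mean() + 4 * ratio.std()
        flagged = np.flatnonzero(ratio > threshold)
        print("points flagged as possible man-made sources:", flagged)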

  17. Alpha particle response for a prototype radiation survey meter based on poly(ethylene terephthalate) with un-doping fluorescent guest molecules

    International Nuclear Information System (INIS)

    Nguyen, Philip; Nakamura, Hidehito; Sato, Nobuhiro; Takahashi, Tomoyuki; Maki, Daisuke; Kanayama, Masaya; Takahashi, Sentaro; Kitamura, Hisashi; Shirakawa, Yoshiyuki

    2016-01-01

    There is no radiation survey meter that can discriminate among alpha particles, beta particles, and gamma-rays with one material. Previously, undoped poly(ethylene terephthalate) (PET) has been shown to be an effective material for beta particle and gamma-ray detection. Here, we demonstrate a prototype survey meter for alpha particles based on undoped PET. A 140 × 72 × 1-mm PET substrate was fabricated with mirrored surfaces. It was incorporated in a unique detection section of the survey meter that directly detects alpha particles. The prototype exhibited an unambiguous response to alpha particles from a 241Am radioactive source. These results demonstrate that undoped PET can perform well in survey meters for alpha particle detection. Overall, the PET-based survey meter has the potential to detect multiple types of radiation, and will spawn an unprecedented type of radiation survey meter based on undoped aromatic ring polymers. (author)

  18. Appropriateness, acceptance and sensory preferences based on visual information: A web-based survey on meat substitutes in a meal context

    NARCIS (Netherlands)

    Elzerman, J.E.; Hoek, A.C.; Boekel, van T.; Luning, P.A.

    2015-01-01

    The aim of this study was to investigate the appropriateness, attractiveness, use-intention and (un)desirable sensory properties of meat substitutes in different dishes based only on visual information. A web-based survey was developed to let consumers assess the use of meat substitutes in different

  19. Development of accelerator-based γ-ray-induced positron annihilation spectroscopy technique

    International Nuclear Information System (INIS)

    Selim, F.A.; Wells, D.P.; Harmon, J. F.; Williams, J.

    2005-01-01

    Accelerator-based γ-ray-induced positron annihilation spectroscopy performs positron annihilation spectroscopy (PAS) by utilizing MeV bremsstrahlung radiation generated from an accelerator, instead of using positrons from radioactive sources or positron beams. (We have named the technique 'accelerator-based γ-ray-induced PAS', even though 'bremsstrahlung' is more correct here than 'γ rays'; the reason is to make the name of the technique more general, since PAS may also be performed by utilizing MeV γ rays emitted from nuclei through the use of accelerators, as described later in this article and as in the case of positron lifetime spectroscopy [F.A. Selim, D.P. Wells, and J.F. Harmon, Rev. Sci. Instrum. 76, 033905 (2005)].) MeV γ rays create positrons inside the materials by pair production. The induced positrons annihilate with electrons in the material, emitting 511-keV annihilation radiation. Doppler broadening spectroscopy of the 511-keV radiation provides information about open-volume defects and plastic deformation in solids. The high penetration of MeV γ rays allows defects to be probed at depths of up to several centimeters in thick materials, which is not possible with most current nondestructive techniques. In this article, a detailed description of the technique is presented, including its benefits and limitations relative to other nondestructive methods. Its application to the investigation of plastic deformation in thick steel alloys is also shown.
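
    A common way to turn the Doppler-broadened 511-keV line into a defect-sensitive number is the S (shape) parameter, sketched below on a synthetic Gaussian peak: the narrower the peak, the larger the fraction of counts in a central window, indicating more annihilation in open-volume defects. The peak widths and window width are placeholders, not values from this work.

        import numpy as np

        energy = np.linspace(500.0, 522.0, 2201)             # keV, 0.01-keV bins

        def fake_peak(width_kev):
            """Gaussian 511-keV peak; a defect-rich sample gives a narrower peak (higher S)."""
            return np.exp(-0.5 * ((energy - 511.0) / width_kev) ** 2)

        def s_parameter(spectrum, central_half_width=0.8):
            """Ratio of counts in the central window to counts in the whole peak."""
            total = spectrum.sum()
            central = spectrum[np.abs(energy - 511.0) <= central_half_width].sum()
            return central / total

        reference = fake_peak(width_kev=1.4)                 # "defect-free" material
        deformed = fake_peak(width_kev=1.1)                  # plastically deformed material

        print(f"S (reference): {s_parameter(reference):.3f}")
        print(f"S (deformed):  {s_parameter(deformed):.3f}")  # higher S -> more open-volume defects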

  20. Evaluation of irradiation damage effect by applying electric properties based techniques

    International Nuclear Information System (INIS)

    Acosta, B.; Sevini, F.

    2004-01-01

    The most important effect of radiation-induced degradation is the decrease in ductility of the reactor pressure vessel (RPV) ferritic steels. The main way to determine the mechanical behaviour of RPV steels is through tensile and impact tests, from which the ductile-to-brittle transition temperature (DBTT) and its increase due to neutron irradiation can be calculated. These tests are destructive and are regularly applied to surveillance specimens to assess the integrity of the RPV. The possibility of applying validated non-destructive ageing monitoring techniques would, however, facilitate the surveillance of the materials that form the reactor vessel. The JRC-IE has developed two devices, focused on the measurement of electrical properties, to assess non-destructively the embrittlement state of materials. The first technique, called Seebeck and Thomson Effects on Aged Material (STEAM), is based on the measurement of the Seebeck coefficient, which is characteristic of the material and related to the microstructural changes induced by irradiation embrittlement. With the same aim, the second technique, named Resistivity Effects on Aged Material (REAM), instead measures the resistivity of the material. The purpose of this research is to correlate the results of the impact tests and of the STEAM and REAM measurements with the change in mechanical properties due to neutron irradiation. These results will make it possible to improve such techniques, based on the measurement of electrical properties of materials, for application to irradiation embrittlement assessment.
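
    The kind of correlation this study aims to establish can be illustrated with the hypothetical sketch below, in which a non-destructive observable (a change in the Seebeck coefficient) is regressed against destructively measured DBTT shifts. Every number is invented for illustration; the actual correlations are to be established from surveillance-specimen measurements.

        import numpy as np

        # Invented surveillance-specimen data: Seebeck change (uV/K) vs. DBTT shift (deg C).
        seebeck_change = np.array([0.05, 0.12, 0.20, 0.31, 0.40, 0.52])
        dbtt_shift = np.array([8.0, 19.0, 33.0, 47.0, 66.0, 81.0])

        slope, intercept = np.polyfit(seebeck_change, dbtt_shift, 1)
        predicted = slope * seebeck_change + intercept
        r = np.corrcoef(dbtt_shift, predicted)[0, 1]

        print(f"DBTT shift ~ {slope:.0f} * dSeebeck + {intercept:.1f}  (r = {r:.3f})")

        # A new non-destructive reading could then be screened against the fitted trend.
        print("estimated shift for dSeebeck = 0.25 uV/K:", round(slope * 0.25 + intercept, 1), "deg C")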