WorldWideScience

Sample records for surveys techniques based

  1. A survey on OFDM channel estimation techniques based on denoising strategies

    Directory of Open Access Journals (Sweden)

    Pallaviram Sure

    2017-04-01

    Channel estimation forms the heart of any orthogonal frequency division multiplexing (OFDM) based wireless communication receiver. Frequency-domain pilot-aided channel estimation techniques are either least squares (LS) based or minimum mean square error (MMSE) based. LS based techniques are computationally less complex and, unlike MMSE ones, do not require a priori knowledge of channel statistics (KCS). However, the mean square error (MSE) performance of a channel estimator incorporating MMSE based techniques is better than that obtained with LS based techniques. To enhance the MSE performance of LS based techniques, a variety of denoising strategies have been developed in the literature, applied to the LS-estimated channel impulse response (CIR). The advantage of denoising-threshold based LS techniques is that they do not require KCS but still render near-optimal performance similar to MMSE based techniques. In this paper, a detailed survey of various existing denoising strategies, with a comparative discussion, is presented.
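    To make the LS-versus-denoising comparison concrete, here is a minimal NumPy sketch of the thresholding idea. All parameters are hypothetical (64 subcarriers all used as pilots, an 8-tap channel, a simple 3-sigma tap threshold); it illustrates the general approach, not any specific scheme from the surveyed literature.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64          # OFDM subcarriers (all pilots here, for simplicity)
L = 8           # channel length in taps
snr_db = 10

# True channel: L-tap CIR, unit average power, zero-padded to N
h = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2 * L)
H = np.fft.fft(h, N)

# Received pilots: Y = H * X + noise, with BPSK pilots X = 1
noise_var = 10 ** (-snr_db / 10)
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
Y = H + noise

# LS estimate in the frequency domain (X = 1, so LS is just Y / X = Y),
# then transform to the time domain to obtain the estimated CIR
H_ls = Y
h_ls = np.fft.ifft(H_ls)

# Denoising: zero every CIR tap below a noise-dependent threshold
# (per-tap noise std after the IFFT is sqrt(noise_var / N))
threshold = 3 * np.sqrt(noise_var / N)
h_den = np.where(np.abs(h_ls) > threshold, h_ls, 0)
H_den = np.fft.fft(h_den)

mse_ls = np.mean(np.abs(H_ls - H) ** 2)
mse_den = np.mean(np.abs(H_den - H) ** 2)
```

    Because the noise energy spread over all N taps is discarded outside the few true channel taps, the denoised estimate's MSE drops well below the plain LS MSE, approaching MMSE-like behavior without knowing the channel statistics.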

  2. Radon survey techniques

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    The report reviews radon measurement surveys in soils and in water. Special applications, and advantages and limitations of the radon measurement techniques, are considered. The working group also gives some directions for further research in this field.

  3. Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not based on probability schemes

    NARCIS (Netherlands)

    Toepoel, V.; Emerson, Hannah

    2017-01-01

    Weighting techniques in web surveys not based on probability schemes are devised to correct biases due to self-selection, undercoverage, and nonresponse. In an interactive panel, 38 survey experts addressed weighting techniques and auxiliary variables in web surveys. Most of them corrected all biases
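    Post-stratification is one of the simplest weighting techniques for such non-probability web samples: reweight respondents so the sample matches known population shares of an auxiliary variable. A toy sketch with hypothetical age groups, shares, and responses (not data from the study):

```python
from collections import Counter

# Known population shares of an auxiliary variable (hypothetical age groups)
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Self-selected web sample as (age_group, binary_response) pairs;
# young respondents are deliberately over-represented
sample = ([("18-34", 1)] * 50 + [("18-34", 0)] * 10 +
          [("35-54", 1)] * 15 + [("35-54", 0)] * 10 +
          [("55+", 1)] * 5 + [("55+", 0)] * 10)

n = len(sample)
counts = Counter(group for group, _ in sample)

# Post-stratification weight = population share / sample share, per group
weights = {g: population_share[g] / (counts[g] / n) for g in counts}

unweighted = sum(r for _, r in sample) / n
weighted = (sum(weights[g] * r for g, r in sample) /
            sum(weights[g] for g, _ in sample))
```

    Here the unweighted estimate (0.70) is pulled down to about 0.58 once the over-represented young group is down-weighted, illustrating the self-selection correction the panel's experts debated.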

  4. Knowledge based systems: A critical survey of major concepts, issues and techniques. Visuals

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-9. The objectives of the report are to: examine various techniques used to build KBSs; examine at least one KBS in detail, i.e., a case study; list and identify limitations and problems with KBSs; suggest future areas of research; and provide extensive reference materials.

  5. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA, HANFORD, WASHINGTON

    International Nuclear Information System (INIS)

    Petersen, S.W.

    2010-01-01

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected, with the densest coverage occurring adjacent to the Columbia River Corridor. Time-domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency-domain RESOLVE system acquired electromagnetic (EM) data along tighter spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground-based geophysical methods, and to link results to the underlying geology and/or hydrogeology.
Specific goals of this project are as follows: (1) Test ground-based geophysical techniques for their efficacy in delineating underlying geology; (2) Use ground
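    A first-order guide to what frequency-domain systems such as RESOLVE can sense is the EM skin depth, which ties exploration depth to transmitter frequency and ground resistivity. This is the generic textbook approximation, and the example values below are hypothetical, not parameters from the Hanford survey:

```python
import math

def skin_depth_m(resistivity_ohm_m: float, frequency_hz: float) -> float:
    """Approximate EM skin depth in meters: delta ≈ 503 * sqrt(rho / f)."""
    return 503.3 * math.sqrt(resistivity_ohm_m / frequency_hz)

# Example: 100 ohm-m ground at a 400 Hz transmitter frequency
depth = skin_depth_m(100.0, 400.0)   # ≈ 252 m
```

    The inverse dependence on frequency is why the low-frequency time-domain HeliGEOTEM system was used for deeper targets (e.g., top-of-basalt) while the higher-frequency RESOLVE system resolved shallower structure.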

  6. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA HANFORD WASHINGTON

    Energy Technology Data Exchange (ETDEWEB)

    PETERSEN SW

    2010-12-02

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected, with the densest coverage occurring adjacent to the Columbia River Corridor. Time-domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency-domain RESOLVE system acquired electromagnetic (EM) data along tighter spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground-based geophysical methods, and to link results to the underlying geology and/or hydrogeology.
Specific goals of this project are as follows: (1) Test ground-based geophysical techniques for their efficacy in delineating underlying geology; (2) Use ground

  7. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system, which employed an "Image Processing System", was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs.
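    The inertial-navigation approach reviewed by TRW rests on double-integrating sensed acceleration into position. Here is a minimal 1-D dead-reckoning sketch (constant acceleration, a hypothetical sample interval) illustrating the principle, not the TRW system itself:

```python
def dead_reckon(accels, dt):
    """Double-integrate accelerometer samples into positions (1-D sketch).

    accels: acceleration samples in m/s^2; dt: sample interval in seconds.
    Returns the position after each sample, starting from rest at the origin.
    """
    v = p = 0.0
    positions = []
    for a in accels:
        v += a * dt     # integrate acceleration -> velocity
        p += v * dt     # integrate velocity -> position
        positions.append(p)
    return positions

# One second of constant 1 m/s^2 acceleration at 10 Hz
track = dead_reckon([1.0] * 10, 0.1)
```

    The double integration is also why such systems drift: any constant accelerometer bias grows quadratically in position error, which is the central accuracy question for an inertial survey system.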

  8. Timing and technique impact the effectiveness of road-based, mobile acoustic surveys of bats.

    Science.gov (United States)

    D'Acunto, Laura E; Pauli, Benjamin P; Moy, Mikko; Johnson, Kiara; Abu-Omar, Jasmine; Zollner, Patrick A

    2018-03-01

    Mobile acoustic surveys are a common method of surveying bat communities. However, there is a paucity of empirical studies exploring different methods for conducting mobile road surveys of bats. During 2013, we conducted acoustic mobile surveys on three routes in north-central Indiana, U.S.A., using (1) a standard road survey, (2) a road survey where the vehicle stopped for 1 min at every half mile of the survey route (called a "start-stop method"), and (3) a road survey with an individual using a bicycle. Linear mixed models with multiple comparison procedures revealed that when all bat passes were analyzed, using a bike to conduct mobile surveys detected significantly more bat passes per unit time compared to other methods. However, incorporating genus-level comparisons revealed no advantage to using a bike over vehicle-based methods. We also found that survey method had a significant effect when analyses were limited to those bat passes that could be identified to genus, with the start-stop method generally detecting more identifiable passes than the standard protocol or bike survey. Additionally, we found that significantly more identifiable bat passes (particularly those of the Eptesicus and Lasiurus genera) were detected in surveys conducted immediately following sunset. As governing agencies, particularly in North America, implement vehicle-based bat monitoring programs, it is important for researchers to understand how variations on protocols influence the inference that can be gained from different monitoring schemes.

  9. A survey on the state-of-the-technique on software based pipeline leak detection systems

    Energy Technology Data Exchange (ETDEWEB)

    Baptista, Renan Martins [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Centro de Pesquisas. Div. de Explotacao]. E-mail: renan@cenpes.petrobras.com.br

    2000-07-01

    This paper describes a general technical survey of software based leak detection systems (LDS), covering their main technological features, the operational situations where they are feasible, and the scenarios within the Brazilian pipeline network. The decision on which LDS to choose for a given pipeline is a matter of cost, suitability and feasibility. A simpler, low-cost, less effective product with a fast installation and tuning procedure may be more suitable for a given operational site (pipeline configuration, kind of fluid, quality of instrumentation and communication) than a complex, high-cost, efficient product that takes a long time to be properly installed. Some pipelines may have a level of complexity that requires a more sophisticated system. A few will simply not be suitable for an LDS: this may be caused by the poor quality or absence of instrumentation or, in the worst case, by the lack of technology to approach that specific case, e.g., multiphase flow lines, or lines that commonly operate in slack condition. The paper covers the general state of the technique and makes some initial comments on costs. (author)
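    The simplest family of software LDS is volume (mass) balance: compare metered inflow and outflow and alarm on a sustained imbalance. A minimal sketch with simulated readings (window, threshold, and flow values are all hypothetical, and real systems must compensate for line pack and instrument noise):

```python
def leak_alarm(inflow, outflow, window=5, threshold=0.02):
    """Volume-balance LDS sketch: alarm when the rolling mean relative
    imbalance (in - out) / in over `window` samples exceeds `threshold`.
    Assumes nonzero inflow readings."""
    alarms = []
    for i in range(len(inflow)):
        lo = max(0, i - window + 1)
        imbalance = [(inflow[j] - outflow[j]) / inflow[j]
                     for j in range(lo, i + 1)]
        alarms.append(sum(imbalance) / len(imbalance) > threshold)
    return alarms

# Simulated flow readings: a 5% loss (leak) begins at sample 5
flows_in = [100.0] * 10
flows_out = [100.0] * 5 + [95.0] * 5
alarms = leak_alarm(flows_in, flows_out)
```

    The rolling window trades detection speed against false alarms, which is exactly the cost/effectiveness trade-off the paper discusses: a tighter threshold finds smaller leaks but demands better instrumentation quality.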

  10. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    Science.gov (United States)

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  11. Motion Transplantation Techniques: A Survey

    NARCIS (Netherlands)

    van Basten, Ben; Egges, Arjan

    2012-01-01

    During the past decade, researchers have developed several techniques for transplanting motions. These techniques transplant a partial auxiliary motion, possibly defined for a small set of degrees of freedom, on a base motion. Motion transplantation improves motion databases' expressiveness and

  12. Survey of semantic modeling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Smith, C.L.

    1975-07-01

    Numerous modeling techniques have been applied to the analysis of the semantics of programming languages. By providing a brief survey of these techniques together with an analysis of their applicability to semantic issues, this report attempts to illuminate the state of the art in this area. The intent is to be illustrative rather than thorough in the coverage of semantic models. A bibliography is included for the reader who is interested in pursuing this area of research in more detail.

  13. Survey of Object-Based Data Reduction Techniques in Observational Astronomy

    Directory of Open Access Journals (Sweden)

    Łukasik Szymon

    2016-01-01

    Dealing with astronomical observations represents one of the most challenging areas of big data analytics. Besides the huge variety of data types and the dynamics related to continuous data flow from multiple sources, handling enormous volumes of data is essential. This paper provides an overview of methods aimed at reducing both the number of features/attributes and the number of data instances. It concentrates on data mining approaches that are not tied to instruments and observation tools but instead work on processed, object-based data. The main goals of this article are to describe the existing datasets on which algorithms are frequently tested, to characterize and classify available data reduction algorithms, and to identify promising solutions capable of addressing present and future challenges in astronomy.
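    The two reduction axes the paper distinguishes can be illustrated in miniature: feature reduction (here PCA via SVD) and instance reduction (here uniform subsampling). The dataset is synthetic, and neither method is claimed to be one of the specific algorithms the authors survey:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 10))
X[:, 3] = 2.0 * X[:, 0] + 0.01 * rng.standard_normal(1000)  # redundant feature

# Feature reduction: project centered data onto the top-k principal components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
X_reduced = Xc @ Vt[:k].T            # 1000 samples, 5 features

# Instance reduction: keep a uniform random subsample of the objects
idx = rng.choice(len(X_reduced), size=200, replace=False)
X_small = X_reduced[idx]             # 200 samples, 5 features
```

    In survey-scale astronomy the two steps are usually combined, as here: attributes are compressed first so that the retained instances stay cheap to store and compare.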

  14. Survey on visualization and analysis techniques based on diffusion MRI for in-vivo anisotropic diffusion structures

    International Nuclear Information System (INIS)

    Masutani, Yoshitaka; Sato, Tetsuo; Urayama, Shin-ichi; Bihan, D.L.

    2008-01-01

    Alongside the development of diffusion MR imaging technologies for measuring anisotropic diffusion in the living body, related research is expanding rapidly, spanning applied mathematics and visualization in addition to MR imaging, biomedical image technology, and medical science. One reason is that a diffusion MRI data set is high-dimensional image information beyond conventional scalar or vector images, and is attractive to researchers in these fields. This survey paper mainly introduces the state of the art of post-processing techniques reported in the literature for diffusion MRI data, such as analysis and visualization. (author)
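    One of the most common post-processing quantities derived from diffusion tensor data is fractional anisotropy (FA), computed from the tensor's eigenvalues. The formula below is the standard definition, included for illustration rather than taken from the surveyed paper:

```python
import numpy as np

def fractional_anisotropy(eigenvalues):
    """FA from the three eigenvalues of a diffusion tensor.

    FA = sqrt(3/2) * ||lambda - MD|| / ||lambda||, where MD is the
    mean diffusivity. 0 for isotropic diffusion, 1 for a single axis.
    """
    lam = np.asarray(eigenvalues, dtype=float)
    md = lam.mean()
    return np.sqrt(1.5) * np.sqrt(((lam - md) ** 2).sum()) / np.sqrt((lam ** 2).sum())

fa_iso = fractional_anisotropy([1.0, 1.0, 1.0])   # isotropic -> 0
fa_lin = fractional_anisotropy([1.0, 0.0, 0.0])   # purely linear -> 1
```

    Scalar maps like FA are exactly how high-dimensional tensor data are reduced to something a conventional visualization pipeline can render, which is why they recur throughout this literature.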

  15. Conducting Web-based Surveys.

    OpenAIRE

    David J. Solomon

    2001-01-01

    Web-based surveying is becoming widely used in social science and educational research. The Web offers significant advantages over more traditional survey techniques; however, there are still serious methodological challenges with using this approach. Currently, coverage bias (the fact that significant numbers of people do not have access to, or choose not to use, the Internet) is of most concern to researchers. Survey researchers also have much to learn concerning the most effective ways to conduct s...

  16. Modern Surveying Techniques In National Infrastructural ...

    African Journals Online (AJOL)

    Journal of Research in National Development ... Modern Surveying Techniques In National Infrastructural Development: Case Study Of Roads ... Ways that remote sensing helps to make highway construction easier are discussed.

  17. Bases of technique of sprinting

    Directory of Open Access Journals (Sweden)

    Valeriy Druz

    2015-06-01

    Purpose: to determine the biomechanical patterns of body movement that provide the highest sprinting speed. Material and Methods: analysis of the scientific and methodical literature on the problem, anthropometric characteristics of the surveyed contingent of athletes, and analysis of high-speed footage of the world's leading runners. Results: the biomechanical basis of sprinting technique consists of acceleration and movement of the athlete's general center of body mass along a parabolic curve in the start phase, taking into account its initial height in the low-start pose. Its further movement follows a cycloidal trajectory formed by the pendulum movement of the extremities, which creates lift that makes the flight phase of a running step longer than the support phase. Conclusions: the obtained biomechanical regularities of sprinting technique allow increasing the efficiency of sprint training.
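    The claim that flight time exceeds support time in a running step can be sanity-checked with elementary projectile kinematics applied to the center of mass. The takeoff velocity and contact time below are illustrative values, not measurements from the paper:

```python
G = 9.81  # gravitational acceleration, m/s^2

def flight_time_s(vertical_takeoff_velocity):
    """Ballistic flight time of the center of mass: t = 2 * v_z / g."""
    return 2.0 * vertical_takeoff_velocity / G

# A hypothetical vertical takeoff velocity of 0.6 m/s gives ~0.12 s of flight,
# comparable to or longer than typical elite sprint ground-contact times.
t_flight = flight_time_s(0.6)
```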

  18. Survey Of Lossless Image Coding Techniques

    Science.gov (United States)

    Melnychuck, Paul W.; Rabbani, Majid

    1989-04-01

    Many image transmission/storage applications requiring some form of data compression additionally require that the decoded image be an exact replica of the original. Lossless image coding algorithms meet this requirement by generating a decoded image that is numerically identical to the original. Several lossless coding techniques are modifications of well-known lossy schemes, whereas others are new. Traditional Markov-based models and newer arithmetic coding techniques are applied to predictive coding, bit plane processing, and lossy plus residual coding. Generally speaking, the compression ratios offered by these techniques are in the area of 1.6:1 to 3:1 for 8-bit pictorial images. Compression ratios for 12-bit radiological images approach 3:1, as these images have less detailed structure, and hence, their higher pel correlation leads to a greater removal of image redundancy.
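    The predictive-coding idea can be seen in miniature: predict each pel from its left neighbor and entropy-code the residuals. The scanline below is synthetic, and the entropy figures hold for this toy data only, not for the 8-bit pictorial images the survey characterizes:

```python
import math
from collections import Counter

def entropy_bits(symbols):
    """Shannon entropy (bits/symbol) of a symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

row = [10, 11, 12, 12, 13, 14, 14, 15]   # smooth synthetic 8-bit scanline
# First-order prediction: residual = pel minus its left neighbor
residuals = [row[0]] + [row[i] - row[i - 1] for i in range(1, len(row))]

h_raw = entropy_bits(row)        # 2.5 bits/pel on this toy row
h_res = entropy_bits(residuals)  # ~1.3 bits/pel: prediction removes redundancy
```

    An entropy coder (e.g., arithmetic coding) applied to the residuals approaches `h_res` bits per pel, and because the prediction is invertible the decoder reconstructs the row exactly, which is what makes the scheme lossless.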

  19. Three-dimensional seismic survey planning based on the newest data acquisition design technique; Saishin no data shutoku design ni motozuku sanjigen jishin tansa keikaku

    Energy Technology Data Exchange (ETDEWEB)

    Minehara, M; Nakagami, K; Tanaka, H [Japan National Oil Corp., Tokyo (Japan). Technology Research Center

    1996-10-01

    Theory of parameter setting for data acquisition is reviewed, mainly as to the seismic source and receiver geometry. This paper also introduces an example of survey planning for a three-dimensional land seismic exploration in progress. For the design of data acquisition, fundamental parameters are first determined on the basis of the characteristics of reflection records in the given district, and then the survey layout is determined. In this study, information from modeling based on the existing interpretation of geologic structures is also utilized and reflected in the survey specifications. A land three-dimensional seismic survey was designed. The ground surface of the surveyed area consists of rice fields and hilly regions. The target was a nose-shaped structure at a depth of about 2,500 m. A survey area of 4 km × 5 km was set. Records from the shallow layers could not be obtained when near offset was not ensured, so quality control of the offset distribution was important for grasping the shallow structure. In this survey, the seismic source points could be secured more readily than initially expected, which ensured sufficient near-offset coverage. 2 refs., 2 figs.

  20. Monitoring beach changes using GPS surveying techniques

    Science.gov (United States)

    Morton, Robert; Leach, Mark P.; Paine, Jeffrey G.; Cardoza, Michael A.

    1993-01-01

    A need exists for frequent and prompt updating of shoreline positions, rates of shoreline movement, and volumetric nearshore changes. To effectively monitor and predict these beach changes, accurate measurements of beach morphology incorporating both shore-parallel and shore-normal transects are required. Although it is possible to monitor beach dynamics using land-based surveying methods, it is generally not practical to collect data of sufficient density and resolution to satisfy a three-dimensional beach-change model of long segments of the coast. The challenge to coastal scientists is to devise new beach monitoring methods that address these needs and are rapid, reliable, relatively inexpensive, and maintain or improve measurement accuracy.
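    Volumetric beach change from repeated shore-normal profiles reduces to integrating elevation differences across each profile. A minimal trapezoidal sketch with hypothetical profile coordinates (meters), illustrating the computation rather than the authors' GPS workflow:

```python
def profile_volume_change(x, z_before, z_after):
    """Net volume change per unit shoreline length (m^3 per m of shore)
    between two shore-normal elevation profiles, via trapezoidal
    integration of the elevation difference. Negative means erosion."""
    dv = 0.0
    for i in range(1, len(x)):
        dz0 = z_after[i - 1] - z_before[i - 1]
        dz1 = z_after[i] - z_before[i]
        dv += 0.5 * (dz0 + dz1) * (x[i] - x[i - 1])
    return dv

# Hypothetical transect: 20 m of beach lowered by up to 0.5 m between surveys
dv = profile_volume_change([0.0, 10.0, 20.0],
                           [2.0, 1.0, 0.0],
                           [1.5, 0.5, 0.0])   # -7.5 m^3/m (erosion)
```

    Multiplying such per-transect figures by alongshore spacing is what makes dense, accurately positioned transects (the motivation for GPS methods here) essential for three-dimensional beach-change estimates.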

  1. Isotope techniques in a water survey

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1959-10-15

    The circulation of water is one of the most interesting of natural phenomena. Exact knowledge of fluctuations in precipitation and other factors in water circulation is extremely important for areas which have a very limited water supply. The information about the circulation of water is also important for the disposal of radioactive wastes on land and in the sea. Before satisfactory methods of disposal can be devised, it is essential to know precisely whether and to what extent the wastes can be transferred from one place to another as a result of the circulation of water. One of the most efficient ways of gathering such information is to study the isotopic ratios of hydrogen and oxygen in water in different areas. Tritium can serve as a tracer in the study of water circulation. A variety of information can be obtained by measurements of the isotopic composition of water, e.g. the average age of the water molecule in a lake, or the age, size, storage time and flow rate of a groundwater body. These modern tools of hydrological research cannot be employed by every country, because measurements of the isotopic composition of water require great technical skill and scientific knowledge. Besides, interpretation of isotope data in terms of hydrology and climatology requires the knowledge of certain basic data for the whole world, or at least for large areas. A more complete knowledge of the worldwide variations in the isotopic composition of water would greatly facilitate the interpretation of local conditions. Guided by these considerations, the International Atomic Energy Agency has decided to initiate a study to determine the worldwide distribution of hydrogen and oxygen isotopes in water. On the basis of this study, it will be possible to make available basic data for the use of any country that wishes to apply isotope techniques for hydrological and climatological research. Under this project, it is proposed to collect samples of rain, river and ocean water in different
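    Age estimates from a tritium tracer follow directly from the radioactive decay law. A small sketch (the activity values are hypothetical, and real groundwater dating must also account for variations in the tritium input over time):

```python
import math

T_HALF_TRITIUM_YEARS = 12.32   # half-life of tritium

def tritium_age_years(initial_activity, measured_activity):
    """Apparent age from decay: t = (T_half / ln 2) * ln(A0 / A)."""
    return (T_HALF_TRITIUM_YEARS / math.log(2)) * math.log(
        initial_activity / measured_activity)

# A sample whose tritium activity has halved is one half-life old
age = tritium_age_years(100.0, 50.0)   # = 12.32 years
```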

  2. Industry Based Monkfish Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Monkfish industry leaders expressed concerns that the NEFSC bottom trawl surveys did not sample in all monkfish habitats; particularly the deeper water outside the...

  3. Survey of intravitreal injection techniques among retina specialists in Israel

    Directory of Open Access Journals (Sweden)

    Segal O

    2016-06-01

    Ori Segal,1,2 Yael Segal-Trivitz,1,3 Arie Y Nemet,1,2 Noa Geffen,1,2 Ronit Nesher,1,2 Michael Mimouni4 (1Department of Ophthalmology, Meir Medical Center, Kfar Saba; 2The Sackler School of Medicine, Tel Aviv University, Tel Aviv; 3Department of Psychiatry, Geha Psychiatric Hospital, Petah Tikva; 4Department of Ophthalmology, Rambam Health Care Campus, Haifa, Israel)
    Purpose: The purpose of this study was to describe anti-vascular endothelial growth factor intravitreal injection techniques of retinal specialists in order to establish a cornerstone for future practice guidelines. Methods: All members of the Israeli Retina Society were contacted by email to complete an anonymous, 19-question, Internet-based survey regarding their intravitreal injection techniques. Results: Overall, 66% (52/79) completed the survey. Most (98%) do not instruct patients to discontinue anticoagulant therapy and 92% prescribe treatment for patients in the waiting room. Three quarters wear sterile gloves and prepare the patient in the supine position. A majority (71%) use sterile surgical draping. All respondents apply topical analgesics and a majority (69%) measure the distance from the limbus to the injection site. A minority (21%) displace the conjunctiva prior to injection. A majority of the survey participants use a 30-gauge needle and the most common quadrant for injection is superotemporal (33%). Less than half routinely assess postinjection optic nerve perfusion (44%). A majority (92%) apply prophylactic antibiotics immediately after the injection. Conclusion: The majority of retina specialists perform intravitreal injections similarly. However, a relatively large minority performs this procedure differently. Due to the extremely low percentage of complications, it seems as though such differences do not increase the risk. However, more evidence-based medicine, a cornerstone for practice guidelines, is required in order to identify the intravitreal injection techniques

  4. UAS Mapping as an alternative for land surveying techniques?

    Directory of Open Access Journals (Sweden)

    L. Devriendt

    2014-03-01

    Can a UAS mapping technique compete with standard surveying techniques? Since the boom in RPAS (remotely piloted air systems), UAVs (unmanned aerial vehicles), and UAS (unmanned aerial systems), this has been one of the crucial questions for UAS mapping. What matters is not the look and feel but the reliability, ease of use, and accuracy obtained with a system based on hardware and corresponding software. This was also one of the issues the Dutch Land Registry raised a few months ago, aimed at achieving an effective and usable system for updating property boundaries in new-build districts. Orbit GT gave them a ready-made answer: a definitive outcome based on years of research and development in UAS mapping technology and software.

  5. Web-based Surveys: Changing the Survey Process

    OpenAIRE

    Gunn, Holly

    2002-01-01

    Web-based surveys are having a profound influence on the survey process. Unlike other types of surveys, Web page design skills and computer programming expertise play a significant role in the design of Web-based surveys. Survey respondents face new and different challenges in completing a Web-based survey. This paper examines the different types of Web-based surveys, the advantages and challenges of using Web-based surveys, the design of Web-based surveys, and the issues of validity, error, ...

  6. Survey as a group interactive teaching technique

    Directory of Open Access Journals (Sweden)

    Ana GOREA

    2017-03-01

    The smooth running of the educational process and its results depend a great deal on the methods used. The methodology of teaching offers a great variety of teaching techniques that the teacher can make use of in the teaching/learning process. Such techniques as brainstorming, the cube, KWL, case study, the Venn diagram, and many others are familiar to teachers, who use them effectively in the classroom. The present article proposes a technique called 'survey', which has been successfully used by the author as a student-centered speaking activity in foreign language classes. It has certain advantages, especially if used in large groups, and it can be adapted to any other discipline when the teacher wishes to offer students space for cooperative activity and creativity.

  7. Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry presents a detailed survey of knowledge based systems. After being in a relatively dormant state for many years, only recently has Artificial Intelligence (AI), that branch of computer science that attempts to have machines emulate intelligent behavior, begun accomplishing practical results. Most of these results can be attributed to the design and use of Knowledge-Based Systems, KBSs (or expert systems): problem solving computer programs that can reach a level of performance comparable to that of a human expert in some specialized problem domain. These systems can act as consultants for requirements such as medical diagnosis, military threat analysis, project risk assessment, etc. They possess knowledge that enables them to make intelligent decisions. They are, however, not meant to replace the human specialists in any particular domain. A critical survey of recent work in interactive KBSs is reported. A case study (MYCIN) of a KBS, a list of existing KBSs, and an introduction to the Japanese Fifth Generation Computer Project are provided as appendices. Finally, an extensive set of KBS-related references is provided at the end of the report.
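    The inference core of many KBSs of this era was rule-based forward chaining over a working memory of facts. A toy sketch with MYCIN-flavored but entirely hypothetical rules and fact names:

```python
def forward_chain(facts, rules):
    """Naive forward chaining: rules are (premises, conclusion) pairs.
    Fire every rule whose premises are all known, until a fixed point."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical diagnostic rules (illustrative only, not MYCIN's rule base)
rules = [
    ({"gram_negative", "rod_shaped"}, "enterobacteriaceae"),
    ({"enterobacteriaceae", "lactose_fermenter"}, "likely_e_coli"),
]
derived = forward_chain({"gram_negative", "rod_shaped", "lactose_fermenter"},
                        rules)
```

    Production systems like this separate the domain knowledge (the rule base) from the inference engine, which is the architectural property that let KBSs act as consultants across very different problem domains.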

  8. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs.
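    Quantitation in an immunoassay typically proceeds by reading an unknown's signal against a standard curve of known concentrations. Here is a minimal sketch interpolating linearly in log-concentration between bracketing standards; the calibration points are hypothetical, and production assays usually fit a four-parameter logistic curve instead:

```python
import math

# Hypothetical standard curve: (concentration ng/mL, signal) pairs,
# measured signal increasing monotonically with concentration
STANDARDS = [(0.1, 0.05), (1.0, 0.30), (10.0, 0.90), (100.0, 1.40)]

def estimate_concentration(signal):
    """Interpolate an unknown's concentration from the standard curve,
    linearly in log10(concentration) between bracketing standards."""
    for (c0, s0), (c1, s1) in zip(STANDARDS, STANDARDS[1:]):
        if s0 <= signal <= s1:
            frac = (signal - s0) / (s1 - s0)
            return 10 ** (math.log10(c0)
                          + frac * (math.log10(c1) - math.log10(c0)))
    raise ValueError("signal outside calibrated range")

conc = estimate_concentration(0.60)   # midway between the 1 and 10 ng/mL standards
```

    The same calibration logic applies whether the label is a radioisotope, an enzyme, or a fluorescent dye; only the measured signal changes.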

  9. Web-Based Surveys: Not Your Basic Survey Anymore

    Science.gov (United States)

    Bertot, John Carlo

    2009-01-01

    Web-based surveys are not new to the library environment. Although such surveys began as extensions of print surveys, the Web-based environment offers a number of approaches to conducting a survey that the print environment cannot duplicate easily. Since 1994, the author and others have conducted national surveys of public library Internet…

  10. A Survey on different techniques of steganography

    Directory of Open Access Journals (Sweden)

    Kaur Harpreet

    2016-01-01

    Full Text Available Steganography is important due to the exponential growth of the internet and the secret communication needs of computer users. Steganography is the art of invisible communication: keeping secret information hidden inside other information. Steganalysis is the technology that attempts to defeat steganography by detecting and extracting the hidden information. Steganography embeds data in image, text/document, audio and video files. The paper also highlights how security is improved by applying various techniques of video steganography.
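
The least-significant-bit (LSB) embedding that many of the surveyed image techniques build on can be sketched in a few lines. The code below is an illustrative toy, not a scheme from the survey: the "image" is a flat list of 8-bit pixel values rather than a real image file, and all names are invented.

```python
# Toy LSB steganography sketch: hide message bytes in the least
# significant bit of each pixel value, then recover them.

def embed(pixels, message):
    """Hide message bytes in the LSBs of pixel values."""
    bits = [(byte >> i) & 1 for byte in message.encode() for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # overwrite only the LSB
    return stego

def extract(pixels, length):
    """Recover `length` bytes from the LSBs of pixel values."""
    data = bytearray()
    for i in range(length):
        byte = 0
        for bit in pixels[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (bit & 1)
        data.append(byte)
    return data.decode()

cover = [128, 64, 200, 13, 77, 90, 31, 255] * 8   # 64 dummy pixels
stego = embed(cover, "hi")
print(extract(stego, 2))  # -> hi
```

Because only the lowest bit of each pixel changes, the cover "image" is visually indistinguishable from the stego version, which is the property steganalysis techniques try to attack.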

  11. A Survey of DHT Security Techniques

    NARCIS (Netherlands)

    Urdaneta Paredes, G.A.; Pierre, G.E.O.; van Steen, M.R.

    2011-01-01

    Peer-to-peer networks based on distributed hash tables (DHTs) have received considerable attention ever since their introduction in 2001. Unfortunately, DHT-based systems have been shown to be notoriously difficult to protect against security attacks. Various reports have been published that discuss

  12. SEM-based characterization techniques

    International Nuclear Information System (INIS)

    Russell, P.E.

    1986-01-01

    The scanning electron microscope is now a common instrument in materials characterization laboratories. The basic role of the SEM as a topographic imaging system has steadily been expanding to include a variety of SEM-based analytical techniques. These techniques cover the range from basic semiconductor materials characterization to live-time device characterization of operating LSI or VLSI devices. This paper introduces many of the more commonly used techniques, describes the modifications or additions to a conventional SEM required to utilize them, and gives examples of their use. First, the types of signals available from a sample being irradiated by an electron beam are reviewed. Then, where applicable, the types of spectroscopy or microscopy that have evolved to utilize the various signal types are described. This is followed by specific examples of the use of such techniques to solve problems related to semiconductor technology. Techniques emphasized include: x-ray fluorescence spectroscopy, electron beam induced current (EBIC), stroboscopic voltage analysis, cathodoluminescence and electron beam IC metrology. Current and future trends of some of these techniques, as related to the semiconductor industry, are discussed

  13. Industry Based Survey (IBS) Yellowtail

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The "Southern New England Yellowtail Flounder Industry-Based Survey" was a collaboration between the Rhode Island Division of Fish and Wildlife and the fishing...

  14. Industry Based Survey (IBS) Cod

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The "Gulf of Maine Atlantic Cod Industry-Based Survey" was a collaboration of the Massachusetts Division of Marine Fisheries and the fishing industry, with support...

  15. A Survey of 2D Face Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Mejda Chihaoui

    2016-09-01

    Full Text Available Despite the existence of various biometric techniques, like fingerprints, iris scan, as well as hand geometry, the most efficient and most widely-used one is face recognition, because it is inexpensive, non-intrusive and natural. Therefore, researchers have developed dozens of face recognition techniques over the last few years. These techniques can generally be divided into three categories, based on the face data processing methodology: methods that use the entire face as input data for the proposed recognition system, methods that do not consider the whole face but only some features or areas of it, and methods that use global and local face characteristics simultaneously. In this paper, we present an overview of some well-known methods in each of these categories. First, we present the benefits of, and the challenges to, the use of face recognition as a biometric tool. Then, we present a detailed survey of the well-known methods by expressing each method’s principle. After that, a comparison between the three categories of face recognition techniques is provided. Furthermore, the databases used in face recognition are mentioned, and some results of the applications of these methods on face recognition databases are presented. Finally, we highlight some new promising research directions that have recently appeared.

  16. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Full Text Available Data Mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information which may be essential to the organization. The extraction of new information is predicted using the existing datasets. Many approaches for analysis and prediction have been developed in data mining, but very few efforts have been made in the criminology field, and fewer still have compared the information these approaches produce. Police stations and other criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and criminal activity in society. Criminals can also be predicted based on the crime data. The main aim of this work is to survey the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents the survey on crime analysis and crime prediction using several Data Mining techniques.
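
As an illustration of the supervised-learning side such surveys cover, a minimal nearest-neighbour classifier over synthetic crime records might look like the sketch below. The features, labels, and all data values are invented for illustration and are not from any surveyed system.

```python
# Toy 1-nearest-neighbour classification of synthetic crime records.

def nearest_label(train, query):
    """Return the label of the training record closest to `query`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda rec: dist(rec[0], query))
    return label

# (hour-of-day, district-id) -> crime category; all values synthetic
train = [
    ((23, 4), "burglary"),
    ((22, 4), "burglary"),
    ((14, 1), "fraud"),
    ((13, 1), "fraud"),
]
print(nearest_label(train, (21, 4)))  # -> burglary
```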

  17. A survey of statistical downscaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E.; Storch, H. von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik

    1997-12-31

    The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCM) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate regional performance of global climate models by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.) [Deutsch] Die Ableitung regionaler Information aus Integrationen grob aufgeloester Klimamodelle wird als `Regionalisierung` bezeichnet. Dieser Beitrag beschreibt die wichtigsten statistischen Regionalisierungsverfahren und gibt darueberhinaus einige detaillierte Beispiele. Regionalisierungsverfahren lassen sich in drei Hauptgruppen klassifizieren: lineare Verfahren, Klassifikationsverfahren und nicht-lineare deterministische Verfahren. Diese Methoden werden auf den Niederschlag auf der iberischen Halbinsel angewandt und mit den Ergebnissen eines einfachen Analog-Modells verglichen. Es wird festgestellt, dass die Ergebnisse der komplizierteren Verfahren im wesentlichen auch mit der Analog-Methode erzielt werden koennen. Eine weitere Anwendung der Regionalisierungsmethoden besteht in der Validierung globaler Klimamodelle, indem die simulierte und die beobachtete Kovariabilitaet zwischen dem grosskaligen und dem regionalen Klima miteinander verglichen wird. (orig.)
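
The analog method used as the benchmark above has a particularly simple core: for a new large-scale circulation state, find the most similar state in the historical archive and predict the regional value observed alongside it. A toy sketch, with invented data values standing in for pressure anomalies and Iberian rainfall:

```python
# Toy analog downscaling: nearest-analog lookup in a historical archive.

def analog_downscale(large_scale_history, regional_history, new_state):
    """Predict a regional value by nearest-analog lookup."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(large_scale_history)),
               key=lambda i: dist(large_scale_history[i], new_state))
    return regional_history[best]

# Synthetic archive: large-scale anomalies -> regional winter rainfall (mm)
circulation = [(1.0, -0.5), (-0.8, 0.3), (0.2, 0.9)]
rainfall    = [12.0, 45.0, 30.0]
print(analog_downscale(circulation, rainfall, (0.9, -0.4)))  # -> 12.0
```

Despite its simplicity, this lookup is the whole algorithm, which is why the survey's finding that it matches more complicated methods is notable.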

  18. Survey of decontamination and decommissioning techniques

    International Nuclear Information System (INIS)

    Kusler, L.E.

    1977-01-01

    Reports and articles on decommissioning have been reviewed to determine the current status of the technology and to identify potential decommissioning problem areas. It is concluded that the technological roadblocks which limited the decommissioning of facilities in the past have been removed. In general, techniques developed for facility maintenance have been used to decommission facilities. Some of the more promising developments underway which will further simplify decommissioning activities are: electrolytic decontamination, which simplifies some decontaminating operations; the arc saw and vacuum furnace, which reduce the volume of metallic contaminated material by a factor of 10; the remotely operated plasma torch, which reduces personnel exposure; and shaped charges, water cannon, and rock splitters, which simplify concrete removal. Areas in which published data are limited are detailed costs identifying the various components included in the total cost, and the quantity of waste generated during decommissioning activities. With the increased awareness of decommissioning requirements as specified by licensing requirements, design criteria for new facilities are taking into consideration the final decommissioning of buildings. Specific building design features will evolve as designs are evaluated and implemented

  19. A survey of statistical downscaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E; Storch, H von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik

    1998-12-31

    The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCM) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate regional performance of global climate models by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.) [Deutsch] Die Ableitung regionaler Information aus Integrationen grob aufgeloester Klimamodelle wird als `Regionalisierung` bezeichnet. Dieser Beitrag beschreibt die wichtigsten statistischen Regionalisierungsverfahren und gibt darueberhinaus einige detaillierte Beispiele. Regionalisierungsverfahren lassen sich in drei Hauptgruppen klassifizieren: lineare Verfahren, Klassifikationsverfahren und nicht-lineare deterministische Verfahren. Diese Methoden werden auf den Niederschlag auf der iberischen Halbinsel angewandt und mit den Ergebnissen eines einfachen Analog-Modells verglichen. Es wird festgestellt, dass die Ergebnisse der komplizierteren Verfahren im wesentlichen auch mit der Analog-Methode erzielt werden koennen. Eine weitere Anwendung der Regionalisierungsmethoden besteht in der Validierung globaler Klimamodelle, indem die simulierte und die beobachtete Kovariabilitaet zwischen dem grosskaligen und dem regionalen Klima miteinander verglichen wird. (orig.)

  20. A Survey of Face Recognition Technique | Omidiora | Journal of ...

    African Journals Online (AJOL)

    A review of face recognition techniques has been carried out. Face recognition has been an attractive field in the society of both biological and computer vision of research. It exhibits the characteristics of being natural and low-intrusive. In this paper, an updated survey of techniques for face recognition is made. Methods of ...

  1. Arduino based radiation survey meter

    International Nuclear Information System (INIS)

    Rahman, Nur Aira Abd; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee; Muzakkir, Amir

    2016-01-01

    This paper presents the design of a new digital radiation survey meter with an LND7121 Geiger-Muller tube detector and an Atmega328P microcontroller. Development of the survey meter prototype is carried out on the Arduino Uno platform. The 16-bit Timer1 on the microcontroller is utilized as an external pulse counter to produce a count-per-second (CPS) measurement. The conversion from CPS to dose rate is also performed by the Arduino, to display results in microsieverts per hour (μSv/hr). The conversion factor (CF) for converting CPM to μSv/hr determined from the manufacturer's data sheet is compared with the CF obtained from a calibration procedure. The survey meter measurement results are found to be linear for dose rates below 3500 µSv/hr.
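
The count-rate-to-dose-rate conversion the meter performs is a simple scaling by the conversion factor. A sketch of the arithmetic in Python (the CF value below is a placeholder for illustration, not the LND7121's actual factor from the data sheet or calibration):

```python
# Sketch of the CPS -> CPM -> dose-rate conversion a GM survey meter performs.

CF = 0.0057  # placeholder uSv/h per CPM, assumed purely for illustration

def dose_rate_usv_per_hr(counts, interval_s):
    """Convert raw GM-tube counts over `interval_s` seconds to uSv/h."""
    cps = counts / interval_s   # count rate per second
    cpm = cps * 60.0            # count rate per minute
    return cpm * CF             # scale by the calibration conversion factor

print(round(dose_rate_usv_per_hr(counts=50, interval_s=10), 3))  # -> 1.71
```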

  2. Arduino based radiation survey meter

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nm.gov.my; Lombigit, Lojius; Abdullah, Nor Arymaswati; Azman, Azraf; Dolah, Taufik; Jaafar, Zainudin; Mohamad, Glam Hadzir Patai; Ramli, Abd Aziz Mhd; Zain, Rasif Mohd; Said, Fazila; Khalid, Mohd Ashhar; Taat, Muhamad Zahidee [Malaysian Nuclear Agency, 43000, Bangi, Selangor (Malaysia); Muzakkir, Amir [Sinaran Utama Teknologi Sdn Bhd, 43650, Bandar Baru Bangi, Selangor (Malaysia)

    2016-01-22

    This paper presents the design of a new digital radiation survey meter with an LND7121 Geiger-Muller tube detector and an Atmega328P microcontroller. Development of the survey meter prototype is carried out on the Arduino Uno platform. The 16-bit Timer1 on the microcontroller is utilized as an external pulse counter to produce a count-per-second (CPS) measurement. The conversion from CPS to dose rate is also performed by the Arduino, to display results in microsieverts per hour (μSv/hr). The conversion factor (CF) for converting CPM to μSv/hr determined from the manufacturer's data sheet is compared with the CF obtained from a calibration procedure. The survey meter measurement results are found to be linear for dose rates below 3500 µSv/hr.

  3. Complete Denture Impression Techniques Practiced by Private Dental Practitioners: A Survey

    OpenAIRE

    Kakatkar, Vinay R.

    2012-01-01

    Impression making is an important step in fabricating complete dentures. A survey was conducted to determine the materials used and the techniques practiced when recording complete denture impressions. It is disheartening that 33% of practitioners still use base plate custom trays to record final impressions, and 8% still use alginate for making final impressions. An acceptable technique for recording CD impressions is suggested.

  4. A survey of energy saving techniques for mobile computers

    NARCIS (Netherlands)

    Smit, Gerardus Johannes Maria; Havinga, Paul J.M.

    1997-01-01

    Portable products such as pagers, cordless and digital cellular telephones, personal audio equipment, and laptop computers are increasingly being used. Because these applications are battery powered, reducing power consumption is vital. In this report we first give a survey of techniques for

  5. Comparison of survey techniques on detection of northern flying squirrels

    Science.gov (United States)

    Diggins, Corinne A.; Gilley, L. Michelle; Kelly, Christine A.; Ford, W. Mark

    2016-01-01

    The ability to detect a species is central to the success of monitoring for conservation and management purposes, especially if the species is rare or endangered. Traditional methods, such as live capture, can be labor-intensive, invasive, and produce low detection rates. Technological advances and new approaches provide opportunities to survey for species more effectively, in terms of both accuracy and efficiency, than previous methods. We conducted a pilot study comparing a traditional technique (live-trapping) and 2 novel noninvasive techniques (camera-trapping and ultrasonic acoustic surveys) on detection rates of the federally endangered Carolina northern flying squirrel (Glaucomys sabrinus coloratus) in occupied habitat within the Roan Mountain Highlands of North Carolina, USA. In 2015, we established three 5 × 5 live-trapping grids (6.5 ha) with 4 camera traps and 4 acoustic detectors systematically embedded in each grid. All 3 techniques were used simultaneously during two 4-day survey periods. We compared techniques by assessing probability of detection (POD), latency to detection (LTD; i.e., no. of survey nights until initial detection), and survey effort. Acoustics had the greatest POD (0.37 ± 0.06 SE), followed by camera traps (0.30 ± 0.06) and live traps (0.01 ± 0.005). Acoustics had a lower LTD than camera traps (P = 0.017), where average LTD was 1.5 nights for acoustics and 3.25 nights for camera traps. Total field effort was greatest with live traps (111.9 hr) followed by acoustics (8.4 hr) and camera traps (9.6 hr), although processing and examination of data for the noninvasive techniques made overall effort similar among the 3 methods. This pilot study demonstrated that both noninvasive methods were better rapid-assessment detection techniques for flying squirrels than live traps. However, determining seasonal effects between survey techniques and further development of protocols for both noninvasive techniques is
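
The two comparison metrics used in this study, POD and LTD, are straightforward to compute from a per-night detection history. A sketch with an invented 4-night record (the history values are not from the study's data):

```python
# Toy computation of probability of detection (POD) and latency to
# detection (LTD) from a binary detection history (1 = detected).

def probability_of_detection(history):
    """POD = detections / survey-nights."""
    return sum(history) / len(history)

def latency_to_detection(history):
    """Number of survey nights until the first detection."""
    for night, detected in enumerate(history, start=1):
        if detected:
            return night
    return None  # never detected

acoustic_history = [0, 1, 0, 1]   # synthetic 4-night record
print(probability_of_detection(acoustic_history))  # -> 0.5
print(latency_to_detection(acoustic_history))      # -> 2
```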

  6. Aerial radiation survey techniques for efficient characterization of large areas

    International Nuclear Information System (INIS)

    Sydelko, T.; Riedhauser, S.

    2006-01-01

    Full text: Accidental or intentional releases of radioactive isotopes over potentially very large surface areas can pose serious health risks to humans and ecological receptors. Timely and appropriate responses to these releases depend upon rapid and accurate characterization of impacted areas. These characterization efforts can be adversely impacted by heavy vegetation, rugged terrain, urban environments, and the presence of unknown levels of radioactivity. Aerial survey techniques have proven highly successful in measuring gamma emissions from radiological contaminants of concern quickly, efficiently, and safely. Examples of accidental releases include the unintentional distribution of uranium mining ores during transportation, the loss of uranium processing and waste materials, unintentional nuclear power plant emissions into the atmosphere, and the distribution of isotopes during major flooding events such as the one recently occurring in New Orleans. Intentional releases have occurred during depleted uranium ammunition test firing and wartime use by military organizations. The threat of radiological dispersion device (dirty bomb) use by terrorists is currently a major concern of many major cities worldwide. The U.S. Department of Energy, in cooperation with its Remote Sensing Laboratory and Argonne National Laboratory, has developed a sophisticated aerial measurement system for identifying the locations, types, and quantities of gamma-emitting radionuclides over extremely large areas. Helicopter-mounted NaI detectors are flown at low altitude and constant speed along parallel paths, measuring the full spectrum of gamma activity. Analytical procedures are capable of distinguishing between radiological contamination and changes in natural background emissions. Mapped and tabular results of these accurate, timely and cost effective aerial gamma radiation surveys can be used to assist with emergency response actions, if necessary, and to focus more

  7. A critical survey of live virtual machine migration techniques

    Directory of Open Access Journals (Sweden)

    Anita Choudhary

    2017-11-01

    Full Text Available Abstract Virtualization techniques effectively handle the growing demand for computing, storage, and communication resources in large-scale Cloud Data Centers (CDC). They help to achieve different resource management objectives like load balancing, online system maintenance, proactive fault tolerance, power management, and resource sharing through Virtual Machine (VM) migration. VM migration is a resource-intensive procedure as VMs continuously demand appropriate CPU cycles, cache memory, memory capacity, and communication bandwidth. Therefore, this process degrades the performance of running applications and adversely affects the efficiency of data centers, particularly when Service Level Agreements (SLA) and critical business objectives are to be met. Live VM migration is frequently used because it keeps the application service available while migration is performed. In this paper, we make an exhaustive survey of the literature on live VM migration and analyze the various proposed mechanisms. We first classify the types of live VM migration (single, multiple and hybrid). Next, we categorize VM migration techniques based on duplication mechanisms (replication, de-duplication, redundancy, and compression) and awareness of context (dependency, soft page, dirty page, and page fault) and evaluate the various live VM migration techniques. We discuss various performance metrics like application service downtime, total migration time and amount of data transferred. CPU, memory and storage data are transferred during the process of VM migration, and we identify the category of data that needs to be transferred in each case. We present a brief discussion on security threats in live VM migration, categorizing them into three classes (control plane, data plane, and migration module). We also explain the security requirements and existing solutions to mitigate possible attacks. Specific gaps are identified and the research challenges in improving
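
The iterative pre-copy mechanism underlying most live migration schemes, and the downtime/total-transfer trade-off that metrics like application service downtime and amount of data transferred capture, can be illustrated with a toy simulation. The page counts and dirty rate below are invented, not figures from the survey:

```python
# Toy pre-copy live migration: memory pages are copied while the VM
# runs, pages dirtied in the meantime are re-sent, and a final short
# stop-and-copy round transfers whatever remains.

def precopy(total_pages, dirty_rate, stop_threshold, max_rounds=30):
    """Return (total pages transferred, pages sent during downtime)."""
    transferred = 0
    to_send = total_pages                    # first round sends every page
    for _ in range(max_rounds):
        if to_send <= stop_threshold:        # remainder small enough to pause
            break
        transferred += to_send
        to_send = int(to_send * dirty_rate)  # pages dirtied during the round
    return transferred + to_send, to_send    # last round = stop-and-copy

total, downtime_pages = precopy(total_pages=1000, dirty_rate=0.2, stop_threshold=10)
print(total, downtime_pages)  # -> 1248 8
```

The simulation makes the trade-off visible: a lower dirty rate or higher stop threshold shrinks downtime at the cost of re-sending more pages in total.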

  8. High precision survey and alignment techniques in accelerator construction

    CERN Document Server

    Gervaise, J

    1974-01-01

    Basic concepts of precision surveying are briefly reviewed, and an historical account is given of instruments and techniques used during the construction of the Proton Synchrotron (1954-59), the Intersecting Storage Rings (1966-71), and the Super Proton Synchrotron (1971). A nylon wire device, the distinvar, invar wire and tape, and the recent automation of the gyrotheodolite and distinvar, as well as auxiliary equipment (polyurethane jacks, Centipede), are discussed in detail. The paper ends by summarizing the present accuracy in accelerator metrology, giving an outlook on possible improvements, and discussing some aspects of staffing for the CERN Survey Group. (0 refs).

  9. A survey of visual preprocessing and shape representation techniques

    Science.gov (United States)

    Olshausen, Bruno A.

    1988-01-01

    Many recent theories and methods proposed for visual preprocessing and shape representation are summarized. The survey brings together research from the fields of biology, psychology, computer science, electrical engineering, and most recently, neural networks. It was motivated by the need to preprocess images for a sparse distributed memory (SDM), but the techniques presented may also prove useful for applying other associative memories to visual pattern recognition. The material of this survey is divided into three sections: an overview of biological visual processing; methods of preprocessing (extracting parts of shape, texture, motion, and depth); and shape representation and recognition (form invariance, primitives and structural descriptions, and theories of attention).

  10. Watershed-based survey designs

    Science.gov (United States)

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream–downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.
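
The unequal probability weighting mentioned above is often realized as systematic probability-proportional-to-size (PPS) sampling: larger watersheds get a proportionally larger chance of selection. A hedged sketch with invented watershed names and areas:

```python
# Toy systematic PPS sampling: units are selected with probability
# proportional to size by walking evenly spaced points along the
# cumulative size line.

def pps_systematic(units, n, start=0.5):
    """Systematic probability-proportional-to-size sample of n units."""
    total = sum(size for _, size in units)
    step = total / n
    points = [start * step + k * step for k in range(n)]  # evenly spaced hits
    chosen, cum = [], 0.0
    it = iter(units)
    name, size = next(it)
    for p in points:
        while cum + size < p:          # advance until the hit falls in a unit
            cum += size
            name, size = next(it)
        chosen.append(name)
    return chosen

# Synthetic watersheds with areas (arbitrary units)
watersheds = [("A", 10), ("B", 40), ("C", 25), ("D", 25)]
print(pps_systematic(watersheds, n=2))  # -> ['B', 'C']
```

With a fixed `start` the draw is deterministic; a real design would randomize the start point in [0, 1) so that every watershed has its size-proportional inclusion probability.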

  11. Fuzzy Bi-level Decision-Making Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Guangquan Zhang

    2016-04-01

    Full Text Available Bi-level decision-making techniques aim to deal with decentralized management problems that feature interactive decision entities distributed throughout a bi-level hierarchy. A challenge in handling bi-level decision problems is that various uncertainties naturally appear in the decision-making process. Significant efforts have been devoted to showing that fuzzy set techniques can effectively deal with uncertain issues in bi-level decision-making, known as fuzzy bi-level decision-making techniques, and researchers have successfully gained experience in this area. It is thus vital that an instructive review of current trends in this area should be conducted, covering not only the theoretical research but also the practical developments. This paper systematically reviews up-to-date fuzzy bi-level decision-making techniques, including models, approaches, algorithms and systems. It also clusters related technique developments into four main categories: basic fuzzy bi-level decision-making, fuzzy bi-level decision-making with multiple optima, fuzzy random bi-level decision-making, and the applications of bi-level decision-making techniques in different domains. By providing state-of-the-art knowledge, this survey paper will directly support researchers and practitioners in their understanding of developments in theoretical research results and applications in relation to fuzzy bi-level decision-making techniques.
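
The leader-follower structure of a (crisp, non-fuzzy) bi-level decision problem can be shown with a tiny enumeration over finite choice sets: the follower reacts optimally to the leader's choice, and the leader anticipates that reaction. The objective functions below are invented purely for illustration and are not from any surveyed model:

```python
# Toy bi-level decision problem solved by enumeration: the leader
# maximises its payoff knowing the follower will minimise its own cost.

def follower_best(x, ys):
    """Follower minimises its cost given the leader's choice x."""
    return min(ys, key=lambda y: (y - x) ** 2 + y)

def leader_best(xs, ys):
    """Leader maximises payoff, anticipating the follower's reaction."""
    def payoff(x):
        y = follower_best(x, ys)     # anticipated follower response
        return 3 * x - x ** 2 + y
    return max(xs, key=payoff)

xs = [0, 1, 2, 3]   # leader's options (invented)
ys = [0, 1, 2]      # follower's options (invented)
x = leader_best(xs, ys)
print(x, follower_best(x, ys))  # -> 2 1
```

The fuzzy variants the survey covers replace these crisp objectives and constraints with fuzzy sets, but the nested leader-follower structure is the same.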

  12. Literature survey of heat transfer enhancement techniques in refrigeration applications

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, M.K.; Shome, B. [Rensselaer Polytechnic Inst., Troy, NY (United States). Dept. of Mechanical Engineering, Aeronautical Engineering and Mechanics

    1994-05-01

    A survey has been performed of the technical and patent literature on enhanced heat transfer of refrigerants in pool boiling, forced convection evaporation, and condensation. Extensive bibliographies of the technical literature and patents are given. Many passive and active techniques were examined for pure refrigerants, refrigerant-oil mixtures, and refrigerant mixtures. The citations were categorized according to enhancement technique, heat transfer mode, and tube or shell side focus. The effects of the enhancement techniques relative to smooth and/or pure refrigerants were illustrated through the discussion of selected papers. Patented enhancement techniques also are discussed. Enhanced heat transfer has demonstrated significant improvements in performance in many refrigerant applications. However, refrigerant mixtures and refrigerant-oil mixtures have not been studied extensively; no research has been performed with enhanced refrigerant mixtures with oil. Most studies have been of the parametric type; there has been inadequate examination of the fundamental processes governing enhanced refrigerant heat transfer, but some modeling is being done and correlations developed. It is clear that an enhancement technique must be optimized for the refrigerant and operating condition. Fundamental processes governing the heat transfer must be examined if models for enhancement techniques are to be developed; these models could provide the method to optimize a surface. Refrigerant mixtures, with and without oil present, must be studied with enhancement devices; there is too little known to be able to estimate the effects of mixtures (particularly NARMs) with enhanced heat transfer. Other conclusions and recommendations are offered.

  13. Radiological survey techniques for decontamination and dismantlement applications

    International Nuclear Information System (INIS)

    Ruesink, G.P.; Stempfley, D.H.; Pettit, P.J.; Warner, R.D.

    1997-01-01

    The Department of Energy's Fernald Environmental Management Project (FEMP) is engaged in an aggressive program to remove all above-ground structures as part of the Fernald site's final remediation remedy. Through the complete removal of major facilities such as Plant 7, Plant 4, and Plant 1, the FEMP has developed radiological survey approaches that are effective for the different phases of the Decontamination and Dismantlement (D&D) process. Some of the most pressing challenges facing the FEMP are implementing effective, low-cost methods for the D&D of former process buildings while minimizing environmental effects. One of the key components to ensure minimal impact on the environment is the collection of radiological contamination information during the D&D process to facilitate the decision-making process. Prior to the final demolition of any structure, radiological surveys of floors, walls, and ceilings must take place. These surveys must demonstrate that contamination levels are below 5000 dpm removable beta/gamma for non-porous surfaces and below 1000 dpm removable beta/gamma for all porous surfaces. Techniques which can perform these activities in a safe, effective, and cost-efficient manner are greatly desired. The FEMP has investigated new approaches to address this need. These techniques include sampling approaches using standard baseline methodology as well as innovative approaches to accelerate final radiological clearance processes. To further improve upon this process, the FEMP has investigated several new technologies through the Fernald Plant 1 Large Scale Technology Demonstration Project. One of the most promising of these new technologies, Laser Induced Fluorescence, may significantly improve the radiological clearance survey process. This paper will present real-world experiences in applying radiological control limits to D&D projects as well as relate potential productivity and cost improvements with the

  14. MCNP Techniques for Modeling Sodium Iodide Spectra of Kiwi Surveys

    International Nuclear Information System (INIS)

    Robert B Hayes

    2007-01-01

    This work demonstrates how MCNP can be used to predict the response of mobile search and survey equipment from base principles. The instrumentation evaluated comes from the U.S. Department of Energy's Aerial Measurement Systems. By reconstructing detector responses to various point-source measurements, detector responses to distributed sources can be estimated through superposition. Use of this methodology for currently deployed systems allows predictive determination of activity levels and distributions for common configurations of interest. This work helps determine the quality and efficacy of certain surveys in fully characterizing an affected site following a radiological event of national interest
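
The superposition step described above, building a distributed-source response out of known point-source responses, reduces to a sum over a response kernel. A toy one-dimensional sketch; the inverse-square kernel here is a placeholder for illustration, not the MCNP-derived detector response:

```python
# Toy superposition: the response to a distributed source is the sum of
# point-source responses evaluated at each source position.

def point_response(activity, distance):
    """Counts from one point source (placeholder inverse-square kernel)."""
    return activity / (distance ** 2)

def distributed_response(sources, detector_pos):
    """Superpose point-source responses for a list of (position, activity)."""
    return sum(point_response(a, abs(x - detector_pos)) for x, a in sources)

# Synthetic line source: (position, activity) pairs
line_source = [(0.0, 4.0), (1.0, 4.0), (3.0, 4.0)]
print(distributed_response(line_source, detector_pos=2.0))  # -> 9.0
```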

  15. Survey on Chatbot Design Techniques in Speech Conversation Systems

    OpenAIRE

    Sameera A. Abdul-Kader; Dr. John Woods

    2015-01-01

    Human-computer speech is gaining momentum as a technique of computer interaction. There has been a recent upsurge in speech-based search engines and assistants such as Siri, Google Chrome and Cortana. Natural Language Processing (NLP) techniques such as NLTK for Python can be applied to analyse speech, and intelligent responses can be found by designing an engine to provide appropriate human-like responses. This type of programme is called a Chatbot, which is the focus of this study. This pap...

  16. Survey of Green Radio Communications Networks: Techniques and Recent Advances

    Directory of Open Access Journals (Sweden)

    Mohammed H. Alsharif

    2013-01-01

    Energy efficiency in cellular networks has received significant attention from both academia and industry because of the importance of reducing the operational expenditures and maintaining the profitability of cellular networks, in addition to making these networks “greener.” Because the base station is the primary energy consumer in the network, efforts have been made to study base station energy consumption and to find ways to improve energy efficiency. In this paper, we present a brief review of the techniques that have been used recently to improve energy efficiency, such as energy-efficient power amplifier techniques, time-domain techniques, cell switching, management of the physical layer through multiple-input multiple-output (MIMO) management, heterogeneous network architectures based on Micro-Pico-Femtocells, cell zooming, and relay techniques. In addition, this paper discusses the advantages and disadvantages of each technique to contribute to a better understanding of each of the techniques and thereby offer clear insights to researchers about how to choose the best ways to reduce energy consumption in future green radio networks.

  17. Evaluating autonomous acoustic surveying techniques for rails in tidal marshes

    Science.gov (United States)

    Stiffler, Lydia L.; Anderson, James T.; Katzner, Todd

    2018-01-01

    There is growing interest in the use of autonomous recording units (ARUs) for acoustic surveying of secretive marsh bird populations. However, there is little information on how ARUs compare to human surveyors or how best to use ARU data that can be collected continuously throughout the day. We used ARUs to conduct 2 acoustic surveys for king (Rallus elegans) and clapper rails (R. crepitans) within a tidal marsh complex along the Pamunkey River, Virginia, USA, during May–July 2015. To determine the effectiveness of an ARU in replacing human personnel, we compared results of callback point‐count surveys with concurrent acoustic recordings and calculated estimates of detection probability for both rail species combined. The success of ARUs at detecting rails that human observers recorded decreased with distance (P ≤ 0.001), such that at 75 m, only 34.0% of human‐detected rails were detected by the ARU. To determine a subsampling scheme for continuous ARU data that allows for effective surveying of presence and call rates of rails, we used ARUs to conduct 15 continuous 48‐hr passive surveys, generating 720 hr of recordings. We established 5 subsampling periods of 5, 10, 15, 30, and 45 min to evaluate ARU‐based presence and vocalization detections of rails compared with the full 60‐min sampling period. All subsampling periods resulted in different (P ≤ 0.001) detection rates and unstandardized vocalization rates compared with the hourly sampling period. However, standardized vocalization counts from the 30‐min subsampling period were not different from vocalization counts of the full hourly sampling period. When surveying rail species in estuarine environments, species‐, habitat‐, and ARU‐specific limitations to ARU sampling should be considered when making inferences about abundances and distributions from ARU data.
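The subsampling idea in the abstract can be sketched as follows: keep only the first N minutes of each recorded hour, then standardize the call count to a per-hour rate so that subsamples of different lengths are comparable. The function and data here are my own toy construction, not the authors' analysis.

```python
# Sketch of hourly subsampling of continuous ARU data (illustrative only):
# keep detections from the first `window_min` minutes of an hour, then
# standardize the count to a per-hour vocalization rate.
def standardized_rate(detection_minutes, window_min):
    """detection_minutes: minute-of-hour (0-59) of each detected call."""
    kept = [m for m in detection_minutes if m < window_min]
    return len(kept) * 60.0 / window_min  # calls per hour, standardized

calls = [2, 7, 14, 31, 44, 58]           # one hour of (made-up) detections
print(standardized_rate(calls, 30))      # -> 6.0 calls/hr (30-min subsample)
print(standardized_rate(calls, 60))      # -> 6.0 calls/hr (full hour)
```

In this toy hour the standardized 30-min rate happens to match the full-hour rate, mirroring the abstract's finding that standardized 30-min counts did not differ from hourly counts.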

  18. The History of Electromagnetic Induction Techniques in Soil Survey

    Science.gov (United States)

    Brevik, Eric C.; Doolittle, Jim

    2014-05-01

    Electromagnetic induction (EMI) has been used to characterize the spatial variability of soil properties since the late 1970s. Initially used to assess soil salinity, the use of EMI in soil studies has expanded to include: mapping soil types; characterizing soil water content and flow patterns; assessing variations in soil texture, compaction, organic matter content, and pH; and determining the depth to subsurface horizons, stratigraphic layers or bedrock, among other uses. In all cases the soil property being investigated must influence soil apparent electrical conductivity (ECa) either directly or indirectly for EMI techniques to be effective. An increasing number and diversity of EMI sensors have been developed in response to users' needs and the availability of allied technologies, which have greatly improved the functionality of these tools. EMI investigations provide several benefits for soil studies. The large amount of georeferenced data that can be rapidly and inexpensively collected with EMI provides more complete characterization of the spatial variations in soil properties than traditional sampling techniques. In addition, compared to traditional soil survey methods, EMI can more effectively characterize diffuse soil boundaries and identify included areas of dissimilar soils within mapped soil units, giving soil scientists greater confidence when collecting spatial soil information. EMI techniques do have limitations; results are site-specific and can vary depending on the complex interactions among multiple and variable soil properties. Despite this, EMI techniques are increasingly being used to investigate the spatial variability of soil properties at field and landscape scales.

  19. American Samoa Shore-based Creel Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DMWR staff has also conducted shore-based creel surveys which also have 2 major sub-surveys; one to estimate participation (fishing effort), and one to provide...

  20. The Aalborg Survey / Part 3 - Interview Based Survey

    DEFF Research Database (Denmark)

    Harder, Henrik; Christensen, Cecilie Breinholm; Jensen, Maria Vestergaard

    Background and purpose The Aalborg Survey consists of four independent parts: a web, GPS and an interview based survey and a literature study, which together form a consistent investigation and research into use of urban space, and specifically into young people’s use of urban space: what young...... people do in urban spaces, where they are in the urban spaces and when the young people are in the urban spaces. The answers to these questions form the framework and enable further academic discussions and conclusions in relation to the overall research project Diverse Urban Spaces (DUS). The primary......) and the research focus within the cluster of Mobility and Tracking Technologies (MoTT), AAU. Summary / Part 3 - Interview Based Survey The 3rd part of the DUS research project has been carried out during the fall of 2009 and the summer and fall of 2010 as an interview based survey of 18 selected participants (nine...

  1. Microprocessor based techniques at CESR

    International Nuclear Information System (INIS)

    Giannini, G.; Cornell Univ., Ithaca, NY

    1981-01-01

    Microprocessor based systems successfully used in connection with the High Energy Physics experimental program at the Cornell Electron Storage Ring are described. The multiprocessor calibration system for the CUSB calorimeter is analyzed in view of present and future applications. (orig.)

  2. Visual servoing in medical robotics: a survey. Part II: tomographic imaging modalities--techniques and applications.

    Science.gov (United States)

    Azizian, Mahdi; Najmaei, Nima; Khoshnam, Mahta; Patel, Rajni

    2015-03-01

    Intraoperative application of tomographic imaging techniques provides a means of visual servoing for objects beneath the surface of organs. The focus of this survey is on therapeutic and diagnostic medical applications where tomographic imaging is used in visual servoing. To this end, a comprehensive search of the electronic databases was completed for the period 2000-2013. Existing techniques and products are categorized and studied, based on the imaging modality and their medical applications. This part complements Part I of the survey, which covers visual servoing techniques using endoscopic imaging and direct vision. The main challenges in using visual servoing based on tomographic images have been identified. 'Supervised automation of medical robotics' is found to be a major trend in this field and ultrasound is the most commonly used tomographic modality for visual servoing. Copyright © 2014 John Wiley & Sons, Ltd.

  4. Bases en technique du vide

    CERN Document Server

    Rommel, Guy

    2017-01-01

    This second edition, 20 years after the first, should continue to help technicians build their vacuum systems. Vacuum technology is now used in many fields that differ greatly from one another, with highly reliable equipment. Yet it is often given little study, and moreover it is a discipline in which know-how is all-important. Unfortunately, its transmission by experienced engineers and technicians no longer takes place, or takes place too quickly. Vacuum technology draws on physics, chemistry, mechanics, metallurgy, industrial drawing, electronics, thermal engineering, etc. The discipline therefore requires mastery of techniques from very diverse fields, which is no easy task. Each installation is a special case in itself, with its own needs, its own way of treating materials and its own way of using equipment. Vacuum systems are sometimes copied from one laboratory to another and the

  5. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    International Nuclear Information System (INIS)

    2013-01-01

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results

  6. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-12-15

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results.

  7. The Aalborg Survey / Part 1 - Web Based Survey

    DEFF Research Database (Denmark)

    Harder, Henrik; Christensen, Cecilie Breinholm

    Background and purpose The Aalborg Survey consists of four independent parts: a web, GPS and an interview based survey and a literature study, which together form a consistent investigation and research into use of urban space, and specifically into young people’s use of urban space: what young......) and the research focus within the cluster of Mobility and Tracking Technologies (MoTT), AAU. Summary / Part 1 - Web Based Survey The 1st part of the research project Diverse Urban Spaces (DUS) has been carried out during the period from December 1st 2007 to February 1st 2008 as a Web Based Survey of the 27.040 gross...... [statistikbanken.dk, a] young people aged 14-23 living in Aalborg Municipality in 2008. The web based questionnaire has been distributed among the group of young people studying at upper secondary schools in Aalborg, i.e. 7.680 young people [statistikbanken.dk, b]. The resulting data from those respondents who...

  8. Lot quality assurance sampling techniques in health surveys in developing countries: advantages and current constraints.

    Science.gov (United States)

    Lanata, C F; Black, R E

    1991-01-01

    Traditional survey methods, which are generally costly and time-consuming, usually provide information at the regional or national level only. The utilization of lot quality assurance sampling (LQAS) methodology, developed in industry for quality control, makes it possible to use small sample sizes when conducting surveys in small geographical or population-based areas (lots). This article describes the practical use of LQAS for conducting health surveys to monitor health programmes in developing countries. Following a brief description of the method, the article explains how to build a sample frame and conduct the sampling to apply LQAS under field conditions. A detailed description of the procedure for selecting a sampling unit to monitor the health programme and a sample size is given. The sampling schemes utilizing LQAS applicable to health surveys, such as simple- and double-sampling schemes, are discussed. The interpretation of the survey results and the planning of subsequent rounds of LQAS surveys are also discussed. When describing the applicability of LQAS in health surveys in developing countries, the article considers current limitations for its use by health planners in charge of health programmes, and suggests ways to overcome these limitations through future research. It is hoped that with increasing attention being given to industrial sampling plans in general, and LQAS in particular, their utilization to monitor health programmes will provide health planners in developing countries with powerful techniques to help them achieve their health programme targets.
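The core of LQAS is a simple accept/reject decision rule on a small sample. The sketch below uses illustrative parameters of my own choosing (they are not taken from the paper): inspect n subjects from a lot and reject the lot if more than d_max "failures" are found; the binomial sum gives the probability that a lot with a given failure rate is accepted.

```python
# Minimal LQAS-style decision rule (illustrative parameters, not the paper's):
# sample n subjects per lot; more than d_max failures flags the lot.
from math import comb

def lqas_decision(failures: int, d_max: int) -> str:
    return "reject lot" if failures > d_max else "accept lot"

def prob_accept(n: int, d_max: int, p_fail: float) -> float:
    """P(at most d_max failures in n draws), binomial model."""
    return sum(comb(n, k) * p_fail**k * (1 - p_fail)**(n - k)
               for k in range(d_max + 1))

print(lqas_decision(failures=3, d_max=2))               # -> reject lot
print(round(prob_accept(n=19, d_max=2, p_fail=0.2), 3)) # -> 0.237
```

The operating characteristic `prob_accept` is how such sampling plans are tuned: n and d_max are chosen so that lots meeting the programme target are very likely accepted while poor lots are very likely rejected.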

  9. Research on polonium-218 survey technique for uranium

    International Nuclear Information System (INIS)

    Zhou, R.

    1985-01-01

    This article expounds the principles and procedures of the ²¹⁸Po survey technique for uranium. Large-scale experiments with the ²¹⁸Po method on deposits of the granite, volcanic rock and carbon-siliceous slate types showed that the method is not only as effective as the track method and the ²¹⁰Po method, but also has characteristics of its own. The device has higher working efficiency, with only 5 minutes needed at each measurement point, and its sensitivity is higher, about 0.7 pulses per 136 s per pCi/L. The results of measurement by the ²¹⁸Po method are not affected by thorium emanation, and there is no contamination of the scintillation chamber by radon daughters. The ratio of anomalous peak value to background for the ²¹⁸Po method proved to be higher than that for the track method and the ²¹⁰Po method. To avoid the influence of moisture, measurements with the ²¹⁸Po method should be planned for days without rain, and the holes must be dug some distance from ditches and rice fields, thus ensuring success in applying the method

  10. CNMI Shore-based Creel Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Commonwealth of the Northern Mariana Islands (CNMI), Division of Fish and Wildlife (DFW) staff conducted shore-based creel surveys which have 2 major...

  11. Guam Boat-based Creel Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Similar to other boat-based survey in basic design, this system is run by the Div. of Aquatic and Wildlife Resources (DAWR) and has been in operation since about...

  12. Building Assessment Survey and Evaluation Data (BASE)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Building Assessment Survey and Evaluation (BASE) study was a five year study to characterize determinants of indoor air quality and occupant perceptions in...

  13. 3D Survey Techniques for the Architectural Restoration: the Case of St. Agata in Pisa

    Science.gov (United States)

    Bevilacqua, M. G.; Caroti, G.; Piemonte, A.; Ruschi, P.; Tenchini, L.

    2017-05-01

    and visualize the historical building in its context. These modern survey techniques, based on the creation of point clouds, are now widely used both in the study of a building and for the thorough description of architectural details and decorations. This paper describes the methodological approach and the results of the 3D survey of the Chapel of St. Agata in Pisa, aimed at its restoration. For the development of a restoration project, the survey drawings must represent not only the geometry of a building, but also its materials and level of degradation. We therefore chose to use both the laser scanner, which guarantees uniformity of the geometric survey precision, and 3D image-based modelling. The combined use of these two techniques, supported by a total station survey, produced two point clouds in the same reference system and allowed the determination of the external orientation parameters of the photographic images. Since these parameters are known, it was possible to texture the laser scanner model with high-quality images. As expected, the adopted methodology yielded metrically correct and graphically high-quality drawings. The level of detail of the survey, and consequently of the final drawings, was defined beforehand to allow the identification of all the elements required for the analysis of the current state, such as the clear identification and position of all the degradation phenomena, materials, and decorative elements such as some fragmented and heavily damaged frescoes.

  14. Some fuzzy techniques for staff selection process: A survey

    Science.gov (United States)

    Md Saad, R.; Ahmad, M. Z.; Abu, M. S.; Jusoh, M. S.

    2013-04-01

    With a high level of business competition, it is vital to have flexible staff who are able to adapt themselves to work circumstances. However, the staff selection process is not an easy task to solve, even when it is tackled in a simplified version containing only a single criterion and a homogeneous skill. When multiple criteria and various skills are involved, the problem becomes much more complicated. In addition, there is some information that cannot be measured precisely. This is patently obvious when dealing with opinions, thoughts, feelings, beliefs, etc. One possible tool to handle this issue is fuzzy set theory. The objective of this paper is therefore to review the existing fuzzy techniques for solving the staff selection process. It classifies several existing research methods and identifies areas where there is a gap and further research is needed. Finally, the paper concludes by suggesting new ideas for future research based on the gaps identified.
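A minimal flavour of the fuzzy approach can be sketched as follows: each linguistic rating (the imprecise "opinions" the abstract mentions) is mapped to a triangular fuzzy number, and candidates are ranked by the average centroid across criteria. The membership functions and scoring rule here are my own toy choices, not a method from the surveyed papers.

```python
# Toy fuzzy-scoring sketch for candidate screening (illustrative only):
# linguistic ratings become triangular fuzzy numbers (a, b, c), ranked by
# centroid. These particular membership functions are assumed, not standard.
FUZZY = {"poor": (0.0, 0.0, 0.5), "fair": (0.0, 0.5, 1.0), "good": (0.5, 1.0, 1.0)}

def centroid(tfn):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def candidate_score(ratings):
    """ratings: one linguistic rating per selection criterion."""
    return sum(centroid(FUZZY[r]) for r in ratings) / len(ratings)

print(candidate_score(["good", "fair", "good"]))  # average defuzzified score
```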

  15. Composite Techniques Based Color Image Compression

    Directory of Open Access Journals (Sweden)

    Zainab Ibrahim Abood

    2017-03-01

    Compression of color images is now necessary for transmission and storage in databases, since color gives any object a pleasing and natural appearance, so three composite techniques for color image compression are implemented to achieve high compression, no loss of the original image, better performance and good image quality. These techniques are the composite stationary wavelet technique (S), the composite wavelet technique (W) and the composite multi-wavelet technique (M). For the high-energy sub-band of the 3rd level of each composite transform in each composite technique, the compression parameters are calculated. The best composite transform among the 27 types is the three-level multi-wavelet transform (MMM) in the M technique, which has the highest values of energy (En) and compression ratio (CR) and the lowest values of bits per pixel (bpp), time (T) and rate distortion R(D). The values of the compression parameters of the color image are also nearly the same as the average values of the compression parameters of the three bands of the same image.
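Two of the figures of merit named above, compression ratio (CR) and bits per pixel (bpp), have standard generic definitions that can be computed directly; the paper's exact formulas may differ, so treat this as a sketch of the common definitions.

```python
# Generic definitions of two compression figures of merit (sketch; the
# surveyed paper's exact formulas may differ).
def compression_ratio(original_bytes: int, compressed_bytes: int) -> float:
    return original_bytes / compressed_bytes

def bits_per_pixel(compressed_bytes: int, width: int, height: int) -> float:
    return compressed_bytes * 8.0 / (width * height)

# A 256x256 RGB image (196608 bytes) compressed to 12288 bytes:
print(compression_ratio(196608, 12288))   # -> 16.0
print(bits_per_pixel(12288, 256, 256))    # -> 1.5
```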

  16. Trends in Orbital Decompression Techniques of Surveyed American Society of Ophthalmic Plastic and Reconstructive Surgery Members.

    Science.gov (United States)

    Reich, Shani S; Null, Robert C; Timoney, Peter J; Sokol, Jason A

    To assess the preferences of current members of the American Society of Ophthalmic Plastic and Reconstructive Surgery (ASOPRS) regarding surgical techniques for orbital decompression in Graves' disease. A 10-question web-based, anonymous survey was distributed to oculoplastic surgeons utilizing the ASOPRS listserv. The questions addressed the number of years of experience performing orbital decompression surgery, preferred surgical techniques, and whether orbital decompression was performed in collaboration with an ENT surgeon. Ninety ASOPRS members participated in the study. Most who completed the survey have performed orbital decompression surgery for >15 years. The majority of responders preferred a combined approach of floor and medial wall decompression or balanced lateral and medial wall decompression; only a minority selected a technique limited to 1 wall. Surgeons who perform fat decompression were more likely to operate in collaboration with ENT. Most surgeons rarely remove the orbital strut, citing the risk of worsening diplopia or orbital dystopia, except in cases of optic nerve compression or severe proptosis. The most common reason given for performing orbital decompression was exposure keratopathy. The majority of surgeons perform the surgery without ENT involvement, and the number of years of experience did not correlate significantly with collaboration with ENT. The majority of surveyed ASOPRS surgeons prefer a combined-wall approach over a single-wall approach for initial orbital decompression. Despite the technological advances made in the field of modern endoscopic surgery, no single approach has been adopted by the ASOPRS community as the gold standard.

  17. A survey of text clustering techniques used for web mining

    Directory of Open Access Journals (Sweden)

    Dan MUNTEANU

    2005-12-01

    This paper contains an overview of basic formulations and approaches to clustering. It then presents two important clustering paradigms: a bottom-up agglomerative technique, which collects similar documents into larger and larger groups, and a top-down partitioning technique, which divides a corpus into topic-oriented partitions.

  18. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies edge detection to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on K-means clustering and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains regions of different intensity. The gradient values are calculated and the watershed technique is applied. A DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, producing the DIS map. This serves as prior knowledge of possible region segmentations for the next step (MRF), which gives an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are applied and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
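A per-pixel edge-strength map of the kind described can be illustrated with a toy stand-in: each pixel's strength is the largest absolute difference with its 4-neighbours. This is my own simplified construction, not the authors' exact DIS definition.

```python
# Toy stand-in for a per-pixel edge-strength ("DIS"-like) map: the strength
# at each pixel is the maximum absolute difference with its 4-neighbours.
# Illustrative only; not the paper's exact DIS formula.
def dis_map(img):
    """img: 2-D list of grey levels; returns a same-size edge-strength map."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            diffs = [abs(img[i][j] - img[x][y])
                     for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                     if 0 <= x < h and 0 <= y < w]
            out[i][j] = max(diffs, default=0)
    return out

img = [[0, 0, 255],
       [0, 0, 255],
       [0, 0, 255]]
print(dis_map(img)[0])  # strong response along the intensity step -> [0, 255, 255]
```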

  19. A Survey of Spatio-Temporal Grouping Techniques

    National Research Council Canada - National Science Library

    Megret, Remi; DeMenthon, Daniel

    2002-01-01

    ...) segmentation by trajectory grouping, and (3) joint spatial and temporal segmentation. The first category is the broadest, as it inherits the legacy techniques of image segmentation and motion segmentation...

  20. Position fixing and surveying techniques for marine archaeological studies

    Digital Repository Service at National Institute of Oceanography (India)

    Ganesan, P.

    This technical report will be of great help to marine archaeologists who want to know the capabilities of some of the most commonly available tools for position fixing, their accuracies and methods of surveying, which in turn will help in selecting...

  1. Guidelines for a Training Course in Noise Survey Techniques.

    Science.gov (United States)

    Shadley, John; And Others

    The course is designed to train noise survey technicians during a 3-5 day period to make reliable measurements of 75 percent of the noise problems encountered in the community. The more complex noise problems remaining will continue to be handled by experienced specialists. These technicians will be trained to assist State and local governments in…

  2. Supply chain simulation tools and techniques: a survey

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    The main contribution of this paper is twofold: it surveys different types of simulation for supply chain management; it discusses several methodological issues. These different types of simulation are spreadsheet simulation, system dynamics, discrete-event simulation and business games. Which

  3. A Survey of Librarian Perceptions of Information Literacy Techniques

    Science.gov (United States)

    Yearwood, Simone L.; Foasberg, Nancy M.; Rosenberg, Kenneth D.

    2015-01-01

    Teaching research competencies and information literacy is an integral part of the academic librarian's role. There has long been debate among librarians over what are the most effective methods of instruction for college students. Library Faculty members at a large urban university system were surveyed to determine their perceptions of the…

  4. A Survey on Cloud Security Issues and Techniques

    OpenAIRE

    Sharma, Shubhanjali; Gupta, Garima; Laxmi, P. R.

    2014-01-01

    Today, cloud computing is an emerging way of computing in computer science. Cloud computing is a set of resources and services offered over the network or internet. Cloud computing extends various computing techniques such as grid computing and distributed computing. Today cloud computing is used in both industry and academia. The cloud serves its users by providing virtual resources via the internet. As the field of cloud computing spreads, new techniques are developing. ...

  5. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR Fourier-transform Infrared (SRFTIR) spectroscopy and Hard X-ray Photoelectron Spectroscopy (HAXPS) which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized and have paved the way for new techniques such as microprobe XRF and XAS, FTIR microscopy, and HAXPS. The talk will cover mainly two techniques, XRF and XAS, illustrating their capability in analytical research. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (proton or alpha particle), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the areas of microprobe XRF imaging and trace-level compositional characterisation of samples. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR-induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained with charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third-generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10⁻¹⁷ g to 10⁻¹⁴ g (depending on the particular element and matrix). Keeping in mind its demand, a microprobe XRF beamline has been set up by RRCAT at Indus-2 synchrotron

  6. A Survey on Anomaly Based Host Intrusion Detection System

    Science.gov (United States)

    Jose, Shijoe; Malathi, D.; Reddy, Bharath; Jayaseeli, Dorathi

    2018-04-01

    An intrusion detection system (IDS) is hardware, software or a combination of the two that monitors network or system activities for signs of malicious behaviour. In computer security, designing a robust intrusion detection system is one of the most fundamental and important problems. The primary function of the system is to detect intrusions and raise alerts in a timely manner when a user attempts an intrusion; when the IDS detects an intrusion, it sends an alert message to the system administrator. Anomaly detection is an important problem that has been researched within diverse research areas and application domains. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. Each existing anomaly detection technique has relative strengths and weaknesses. The current state of practice in the field of anomaly-based intrusion detection is reviewed along with recent studies. This survey provides a study of existing anomaly detection techniques and of how the techniques used in one area can be applied in another application domain.

  7. A Survey of Soft-Error Mitigation Techniques for Non-Volatile Memories

    Directory of Open Access Journals (Sweden)

    Sparsh Mittal

    2017-02-01

    Full Text Available Non-volatile memories (NVMs) offer superior density and energy characteristics compared to the conventional memories; however, NVMs suffer from severe reliability issues that can easily eclipse their energy efficiency advantages. In this paper, we survey architectural techniques for improving the soft-error reliability of NVMs, specifically PCM (phase change memory) and STT-RAM (spin transfer torque RAM). We focus on soft-errors, such as resistance drift and write disturbance, in PCM and read disturbance and write failures in STT-RAM. By classifying the research works based on key parameters, we highlight their similarities and distinctions. We hope that this survey will underline the crucial importance of addressing NVM reliability for ensuring their system integration and will be useful for researchers, computer architects and processor designers.

  8. A survey of intrusion detection techniques in Cloud

    OpenAIRE

    Modi, C.; Patel, D.; Patel, H.; Borisaniya, B.; Patel, A.; Rajarajan, M.

    2013-01-01

    Cloud computing provides scalable, virtualized on-demand services to the end users with greater flexibility and lesser infrastructural investment. These services are provided over the Internet using known networking protocols, standards and formats under the supervision of different managements. Existing bugs and vulnerabilities in underlying technologies and legacy protocols tend to open doors for intrusion. This paper surveys different intrusions affecting availability, confidentiality and...

  9. Graph based techniques for tag cloud generation

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2013-01-01

    Tag clouds are one of the navigation aids for exploring documents; tag clouds also link documents through user-defined terms. We explore various graph based techniques to improve tag cloud generation. Moreover, we introduce relevance measures based on underlying data such as ratings ... or citation counts for improved measurement of the relevance of tag clouds. We show that, on the given data sets, our approach outperforms the state-of-the-art baseline methods with respect to such relevance by 41% on the Movielens dataset and by 11% on the Bibsonomy data set.
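    The idea of rating-based relevance for tag selection can be sketched as follows. This is a minimal illustration with invented data and function names, not the authors' graph-based algorithm: each tag is scored by the total rating of the documents it annotates, and the highest-scoring tags form the cloud.

```python
from collections import defaultdict

# Invented toy data: document -> (set of tags, rating); a real system would
# draw these from the underlying dataset (e.g. Movielens ratings).
items = {
    "doc1": ({"python", "ml"}, 5),
    "doc2": ({"ml", "stats"}, 4),
    "doc3": ({"python"}, 2),
    "doc4": ({"stats"}, 1),
}

def tag_relevance(items):
    """Score each tag by the total rating of the documents it annotates."""
    scores = defaultdict(int)
    for tags, rating in items.values():
        for tag in tags:
            scores[tag] += rating
    return dict(scores)

def tag_cloud(items, k=2):
    """Select the k most relevant tags for the cloud."""
    scores = tag_relevance(items)
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

    Here "ml" scores 5 + 4 = 9 and "python" 5 + 2 = 7, so those two tags would be rendered in the cloud; citation counts could be substituted for ratings without changing the structure.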

  10. Outlier Detection Techniques For Wireless Sensor Networks: A Survey

    NARCIS (Netherlands)

    Zhang, Y.; Meratnia, Nirvana; Havinga, Paul J.M.

    2008-01-01

    In the field of wireless sensor networks, measurements that significantly deviate from the normal pattern of sensed data are considered as outliers. The potential sources of outliers include noise and errors, events, and malicious attacks on the network. Traditional outlier detection techniques are

  11. A Survey on Nickel Titanium Rotary Instruments and their Usage Techniques by Endodontists in India.

    Science.gov (United States)

    Patil, Thimmanagowda N; Saraf, Prahlad A; Penukonda, Raghavendra; Vanaki, Sneha S; Kamatagi, Laxmikant

    2017-05-01

    The preference and usage of nickel titanium rotary instruments varies from individual to individual based on their technique, experience with the rotary systems and the clinical situation. Very limited information is available to explain the adoption of changing concepts with respect to nickel titanium rotary instruments pertaining to the endodontists in India. The aim of this study was to conduct a questionnaire survey to acquire knowledge concerning different NiTi rotary instruments and their usage techniques by endodontists in India. A survey questionnaire was designed consisting of 32 questions regarding designation, demographics, experience with rotary instruments, usage of different file systems, usage techniques, frequency of reuse, occurrence of file fracture, its reasons and their management; it was distributed by hand at the national postgraduate convention and also disseminated via electronic medium, to 400 and 600 endodontists respectively. Information was collected from each individual to gain insight into the experiences and beliefs of endodontists concerning the new endodontic technology of rotary NiTi instrumentation based on their clinical experience with the rotary systems. The questions were designed to ascertain the problems and patterns of use and to identify areas of perceived or potential concern regarding the rotary instruments, and the data acquired were statistically evaluated using Fisher's exact test and the chi-square test. Overall, 63.8% (638) of endodontists responded. ProTaper was the most commonly used file system, followed by Mtwo and ProTaper Next. There was a significant correlation between years of experience and file reuse frequency, preparation technique, file separation and management of file separation. A large number of endodontists prefer to reuse rotary NiTi instruments. As experience increased, the incidence of file separation reduced with increasing number of reuse frequency and with

  12. Investigation of individual radiation exposures from discharges to the aquatic environment: techniques used in habits surveys

    International Nuclear Information System (INIS)

    Leonard, D.R.P.; Hunt, G.J.; Jones, P.G.W.

    1982-01-01

    The techniques used by the Fisheries Radiobiological Laboratory (FRL) in conducting habits surveys are described and discussed. The main objectives of these surveys are to investigate exposure pathways to the public resulting from radioactive discharges to the aquatic environment and to provide the basic data from which critical groups can be identified. Preparation, conduct and interpretation of the results of surveys are described and possible errors obtained by the interview technique are highlighted. A means of verifying the results of interviews by a logging technique has been devised and some comparative results are presented. (author)

  13. Searching for millisecond pulsars: surveys, techniques and prospects

    International Nuclear Information System (INIS)

    Stovall, K; Lorimer, D R; Lynch, R S

    2013-01-01

    Searches for millisecond pulsars (which we here loosely define as those with periods < 20 ms) in the galactic field have undergone a renaissance in the past five years. New or recently refurbished radio telescopes utilizing cooled receivers and state-of-the-art digital data acquisition systems are carrying out surveys of the entire sky at a variety of radio frequencies. Targeted searches for millisecond pulsars in point sources identified by the Fermi Gamma-ray Space Telescope have proved phenomenally successful, with over 50 discoveries in the past five years. The current sample of millisecond pulsars now numbers almost 200 and, for the first time in 25 years, now outnumbers their counterparts in galactic globular clusters. While many of these searches are motivated to find pulsars which form part of pulsar timing arrays, a wide variety of interesting systems are now being found. Following a brief overview of the millisecond pulsar phenomenon, we describe these searches and present some of the highlights of the new discoveries in the past decade. We conclude with predictions and prospects for ongoing and future surveys. (paper)

  14. A survey of reflectometry techniques with applications to TFTR

    International Nuclear Information System (INIS)

    Collazo, I.; Stacey, W.M.; Wilgen, J.; Hanson, G.; Bigelow, T.; Thomas, C.E.; Bretz, N.

    1993-12-01

    This report presents a review of reflectometry with particular attention to eXtraordinary mode (X-mode) reflectometry using the novel technique of dual frequency differential phase. The advantage of using an X-mode wave is that it can probe the edge of the plasma with much higher resolution and using a much smaller frequency range than with the Ordinary mode (O-Mode). The general problem with previous full phase reflectometry techniques is that of keeping track of the phase (on the order of 1000 fringes) as the frequency is swept over the band. The dual frequency phase difference technique has the advantage that since it is keeping track of the phase difference of two frequencies with a constant frequency separation, the fringe counting is on the order of only 3 to 5 fringes. This fringe count, combined with the high resolution of the X-mode wave and the small plasma access requirements of reflectometry, make X-mode reflectometry a very attractive diagnostic for today's experiments and future fusion devices
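    The fringe-counting advantage described above can be made concrete with a simplified model of reflection from a fixed layer (the distance and bandwidths below are illustrative assumptions, not TFTR parameters): a layer at distance d contributes a round-trip phase phi(f) = 4*pi*f*d/c, so sweeping the full band B accumulates 2*d*B/c fringes, while the differential phase between two frequencies a fixed delta-f apart accumulates only 2*d*delta_f/c.

```python
C = 3.0e8  # speed of light, m/s

def fringes(freq_span_hz, distance_m):
    """Phase wraps (2*pi fringes) accumulated over a frequency span for a
    reflecting layer at a fixed distance (round-trip path length 2*d)."""
    return 2.0 * distance_m * freq_span_hz / C

d = 2.0        # assumed distance to the reflecting layer, m
band = 60e9    # assumed full sweep bandwidth, Hz
df = 500e6     # assumed fixed separation of the two probing frequencies, Hz

full_phase_fringes = fringes(band, d)    # full-phase technique: hundreds of fringes
differential_fringes = fringes(df, d)    # dual frequency difference: only a few
```

    With these assumed numbers the full-phase sweep must track 800 fringes while the differential phase wraps only about 7 times, which is the order-of-magnitude reduction (1000 fringes down to 3-5) that the abstract describes.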

  15. Retinal Vessels Segmentation Techniques and Algorithms: A Survey

    Directory of Open Access Journals (Sweden)

    Jasem Almotiri

    2018-01-01

    Full Text Available Retinal vessels identification and localization aim to separate the different retinal vasculature structure tissues, either wide or narrow ones, from the fundus image background and other retinal anatomical structures such as the optic disc, macula, and abnormal lesions. Retinal vessels identification studies are attracting more and more attention in recent years due to non-invasive fundus imaging and the crucial information contained in the vasculature structure, which is helpful for the detection and diagnosis of a variety of retinal pathologies including but not limited to: Diabetic Retinopathy (DR), glaucoma, hypertension, and Age-related Macular Degeneration (AMD). Over almost two decades of development, the innovative approaches applying computer-aided techniques for segmenting retinal vessels are becoming more and more crucial and coming closer to routine clinical applications. The purpose of this paper is to provide a comprehensive overview of retinal vessels segmentation techniques. Firstly, a brief introduction to retinal fundus photography and imaging modalities of retinal images is given. Then, the preprocessing operations and the state-of-the-art methods of retinal vessels identification are introduced. Moreover, the evaluation and validation of the results of retinal vessels segmentation are discussed. Finally, an objective assessment is presented and future developments and trends are addressed for retinal vessels identification techniques.

  16. Poisson and negative binomial item count techniques for surveys with sensitive question.

    Science.gov (United States)

    Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin

    2017-04-01

    Although the item count technique is useful in surveys with sensitive questions, the privacy of those respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed-form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
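    The mechanics of such a design can be illustrated with a small simulation. This is a hedged sketch assuming the simplest difference-in-means estimator, with invented parameter values, not the authors' exact closed-form procedure: the control group reports a Poisson count of innocuous items, the treatment group adds one if the respondent carries the sensitive trait, and the difference in group means recovers the sensitive proportion without identifying anyone.

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's method: count uniform draws until their product falls below e^-lam."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(42)
lam, pi_true, n = 2.0, 0.30, 20000  # assumed mean innocuous count and true sensitive proportion

# Control respondents report only the innocuous Poisson count.
control = [sample_poisson(lam, rng) for _ in range(n)]
# Treatment respondents add 1 if they carry the sensitive trait.
treatment = [sample_poisson(lam, rng) + (rng.random() < pi_true) for _ in range(n)]

# The difference in group means estimates the sensitive proportion.
pi_hat = sum(treatment) / n - sum(control) / n
```

    Because every respondent reports a nonzero-looking count drawn from the same family of distributions, no individual answer reveals possession of the sensitive trait, which is the privacy property the new designs aim for.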

  17. Vehicle-borne survey techniques for background radiations

    International Nuclear Information System (INIS)

    Minato, Susumu

    1995-01-01

    This paper presented methods for converting count rates measured inside cars and trains in the natural environment into outdoor terrestrial gamma-ray dose rates. First, (1) the method of calibration for a survey meter is described to be applicable to various geological terrains. Next, the regression formulas were acquired experimentally to correct (2) the shielding effects of cars and trains, and (3) the influence of pavements and ballasts. Furthermore, (4) a new method for removing interfering radiation components emitted from cliffs and tunnels was proposed, and the errors in the calculations were evaluated with numerical experiments. In addition, the degree of influence from the cliff was represented with the angle of elevation subtended to the detector. For the items (2)-(4), in particular, it could be explained with simple models that those methods are reasonable. The method for evaluating simply and accurately cosmic-ray dose rates by means of a portable barometer was also described. (author)

  18. GPR as a Low Impact Paleontological Survey Technique

    Science.gov (United States)

    Sturdevant, G. C.; Leverence, R.; Stewart, R.

    2013-12-01

    The Deweyville Formation, a Pleistocene fluvial sandstone, is a prolific source of megafaunal fossils from periods of low-stand environmental conditions. GPR was employed in an environmentally sensitive area in close proximity to a salt dome in northwest Harris County, Texas as a method of evaluating the probable paleo-depositional environment and to prospect for potential further site development of two distinct fossiliferous zones. The primary zone of interest is a lag-gravel-bounded sand responsible for producing a regionally unique fossil assemblage including South American megafauna (Lundelius et al., 2013). The secondary zone of interest contains undisturbed mammoth remains housed in coarse white sand emplaced on top of a clay drape which has been hypothesized to represent an oxbow lake formed by the meandering paleo-Brazos river. With an accurate map of the paleo-channel, planning of future activity can focus on maximizing fossil recovery and minimizing site impact. A pulseEKKO system with 250 MHz, 400 MHz, and 1 GHz antennas was employed in a prospect area proximal to the secondary site to calibrate and evaluate these systems for their resolution and penetration depth in the modern sediments. The data were processed using the EKKO Mapper and EKKO View Deluxe software packages, and 3D volumes were produced and sliced. Preliminary results from the 250 MHz antenna demonstrate successful imaging of the sand-clay interface. After these surveys were run, a small portion of the site was excavated to confirm the estimated velocities and the observed anomalies, refine our modeling and interpretation, and improve grid design for further surveys. It was confirmed that the sand-clay interface was easily observable using GPR; however, the grid spacing proved to be too wide, leading to artifacts in the 3D volume produced.

  19. Survey and assessment of conventional software verification and validation techniques

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-02-01

    Reliable software is required for nuclear power plant applications. Verification and validation (V&V) techniques may be applied during software development to help eliminate errors that can inhibit the proper operation of digital systems and that may cause safety problems. EPRI and the NRC are cosponsoring this investigation to determine the best strategies for V&V of expert system software. The strategy used for a particular system will depend on the complexity of the software and the level of integrity required. This report covers the first task in the investigation of reviewing methods for V&V of conventional software systems and evaluating them for use with expert systems

  20. A survey on multiproperty measurement techniques of solid materials

    International Nuclear Information System (INIS)

    Matsumoto, Tsuyoshi

    1989-01-01

    The term 'multiproperty measurement' has not as yet been widely used. It is defined as the simultaneous (or continuous) measurement of several properties of material using one sample and one set of equipment. It is highly advantageous to measure several properties of a sample simultaneously. Various aspects of the nature of a substance can be clarified by evaluating its nature in terms of many properties. In particular, advanced techniques for measuring thermal properties of material are needed in the fields of atomic energy industry, aerospace industry, energy industry, electronics industry and academic community. Conventional thermal property measurement techniques which can be applied to multiproperty measurement or minute test sample measurement are outlined focusing on measurement of the thermal conductivity (axial flow method, radial flow method, plate method, unsteady state heating coil method, direct current heating method), specific heat (adiabatic method, drop calorimetry, differential scanning calorimetry, AC calorimetric method, pulse heating method, and laser heating method), thermal diffusivity (laser-flash method), and emissivity (separated black body method, incorporated black body method). (N,K.)

  1. A Comparison of Web-Based and Paper-Based Survey Methods: Testing Assumptions of Survey Mode and Response Cost

    Science.gov (United States)

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-01-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research to name a few. The proliferation of these online surveys raises the question, how do response rates compare with traditional surveys and at what cost? This research explored response rates and costs for Web-based surveys, paper surveys, and…

  2. Problem based Learning in surveying Education

    DEFF Research Database (Denmark)

    Enemark, Stig

    The challenge of the future will be that the only constant is change. Therefore, the educational base must be flexible. The graduates must possess skills to adapt to a rapidly changing labour market and they must possess skills to deal with even the unknown problems of the future. The point is ... that opportunity. The basic principles of this educational model are presented using the surveying programme at Aalborg University as an example.

  3. Artificial Intelligence based technique for BTS placement

    Science.gov (United States)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

    The increase of base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners; this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique takes neighbour and regulation considerations into account objectively while determining the cell site. Its application will lead to a quantitatively unbiased decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated for testing the new algorithm; the results obtained show 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of GA with the neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.
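    A genetic algorithm with a neighbour (minimum-separation) constraint, of the kind the abstract describes, can be sketched roughly as follows. All coordinates, radii, penalty weights and genetic operators below are invented for illustration and are not the paper's formulation: candidate sites are scored by how many demand points they cover, minus a penalty for violating the separation constraint against existing neighbour sites.

```python
import random

rng = random.Random(7)

# Invented demand points and existing neighbour sites; a real planner would
# load these from survey data and the regulator's database.
DEMAND = [(rng.uniform(0, 2000), rng.uniform(0, 3000)) for _ in range(200)]
EXISTING = [(500.0, 500.0), (1500.0, 2500.0)]
RADIUS = 600.0   # assumed coverage radius of one BTS, in metres
MIN_SEP = 400.0  # assumed regulatory minimum separation from neighbour sites

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def fitness(site):
    covered = sum(1 for p in DEMAND if dist(site, p) <= RADIUS)
    # Heavy penalty for each violated neighbour constraint.
    penalty = sum(100 for s in EXISTING if dist(site, s) < MIN_SEP)
    return covered - penalty

def evolve(pop_size=30, generations=40):
    pop = [(rng.uniform(0, 2000), rng.uniform(0, 3000)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # elitist selection: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)  # midpoint crossover
            if rng.random() < 0.3:                           # gaussian mutation
                child = (child[0] + rng.gauss(0, 100),
                         child[1] + rng.gauss(0, 100))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

    Folding the regulatory requirement into the fitness function is what makes the placement decision "unbiased" in the abstract's sense: the constraint is evaluated mechanically for every candidate rather than left to a planner's judgment.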

  4. Artificial Intelligence based technique for BTS placement

    International Nuclear Information System (INIS)

    Alenoghena, C O; Emagbetere, J O; Aibinu, A M (Department of Telecommunications Engineering, Federal University of Technology Minna, Nigeria)

    2013-01-01

    The increase of base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners; this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique takes neighbour and regulation considerations into account objectively while determining the cell site. Its application will lead to a quantitatively unbiased decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated for testing the new algorithm; the results obtained show 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of GA with the neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.

  5. A Survey of Model-based Sensor Data Acquisition and Management

    OpenAIRE

    Aggarwal, Charu C.; Sathe, Saket; Papaioannou, Thanasis; Jeung, Hoyoung; Aberer, Karl

    2013-01-01

    In recent years, due to the proliferation of sensor networks, there has been a genuine need of researching techniques for sensor data acquisition and management. To this end, a large number of techniques have emerged that advocate model-based sensor data acquisition and management. These techniques use mathematical models for performing various, day-to-day tasks involved in managing sensor data. In this chapter, we survey the state-of-the-art techniques for model-based sensor data acquisition...
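    One of the simplest schemes in this family can be illustrated with a minimal model-based suppression sketch (function names and the tolerance are illustrative, not from the survey): the sensor transmits a reading only when it deviates from the last transmitted value by more than a tolerance, and the sink reconstructs the stream from that piecewise-constant model.

```python
def acquire(readings, eps=0.5):
    """Sensor side: transmit only readings that deviate from the
    last transmitted value by more than eps."""
    sent, last = [], None
    for i, r in enumerate(readings):
        if last is None or abs(r - last) > eps:
            sent.append((i, r))
            last = r
    return sent

def reconstruct(sent, n):
    """Sink side: hold each transmitted value until the next update."""
    updates = dict(sent)
    out, last = [], None
    for i in range(n):
        if i in updates:
            last = updates[i]
        out.append(last)
    return out

readings = [20.0, 20.1, 20.2, 21.5, 21.6, 23.0]
sent = acquire(readings)                     # only 3 of the 6 values are transmitted
approx = reconstruct(sent, len(readings))    # reconstruction error bounded by eps
```

    The same structure generalizes to the richer models the survey discusses (linear, autoregressive, probabilistic): only the prediction rule changes, while the acquire/reconstruct split between sensor and sink stays the same.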

  6. Reconstructive techniques in transoral robotic surgery for head and neck cancer: a North American survey.

    Science.gov (United States)

    Konofaos, Petros; Hammond, Sarah; Ver Halen, Jon P; Samant, Sandeep

    2013-02-01

    Although the use of transoral robotic surgery for tumor extirpation is expanding, little is known about national trends in the reconstruction of resultant defects. An 18-question electronic survey was created by an expert panel of surgeons from the Department of Otolaryngology-Head and Neck Surgery and the Department of Plastic and Reconstructive Surgery at the University of Tennessee. Eligible participants were identified by the American Head and Neck Society Web site and from the Intuitive Surgical, Inc., Web site after review of surgeons trained in transoral robotic surgery techniques. Twenty-three of 27 preselected head and neck surgeons (85.18 percent) completed the survey. All respondents use transoral robotic surgery for head and neck tumor extirpation. The majority of the respondents [n = 17 (77.3 percent)] did not use any means of reconstruction. With respect to methods of reconstruction following transoral robotic surgery defects, the majority [n = 4 (80.0 percent)] used a free flap, a pedicled local flap [n = 3 (60.0 percent)], or a distant flap [n = 3 (60.0 percent)]. The radial forearm flap was the most commonly used free flap by all respondents. In general, the majority of survey respondents allow defects to heal secondarily or close primarily. Based on this survey, consensus indications for pedicled or free tissue transfer following transoral robotic surgery defects were primary head and neck tumors (stage T3 and T4a), pharyngeal defects with exposure of vital structures, and prior irradiation or chemoradiation to the operative site and neck.

  7. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones, that are being applied in various industrial fields. We compare them with each other and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice has not been so simple due to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to written operational procedures on which the operators are well trained, so a method of task analysis for NPP operators' tasks can be established with unique characteristics based on those operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  8. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones, that are being applied in various industrial fields. We compare them with each other and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice has not been so simple due to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to written operational procedures on which the operators are well trained, so a method of task analysis for NPP operators' tasks can be established with unique characteristics based on those operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  9. Survey on Ranging Sensors and Cooperative Techniques for Relative Positioning of Vehicles

    Directory of Open Access Journals (Sweden)

    Fabian de Ponte Müller

    2017-01-01

    Full Text Available Future driver assistance systems will rely on accurate, reliable and continuous knowledge of the position of other road participants, including pedestrians, bicycles and other vehicles. The usual approach to tackle this requirement is to use on-board ranging sensors inside the vehicle. Radar, laser scanners or vision-based systems are able to detect objects in their line-of-sight. In contrast to these non-cooperative ranging sensors, cooperative approaches follow a strategy in which other road participants actively support the estimation of the relative position. The limitations of on-board ranging sensors regarding their detection range and angle of view, and their susceptibility to blockage, can be addressed by a cooperative approach based on vehicle-to-vehicle communication. The fusion of both cooperative and non-cooperative strategies seems to offer the largest benefits regarding accuracy, availability and robustness. This survey offers the reader a comprehensive review of different techniques for vehicle relative positioning. The reader will learn the important performance indicators when it comes to relative positioning of vehicles, the different technologies that are both commercially available and currently under research, their expected performance and their intrinsic limitations. Moreover, the latest research in the area of vision-based systems for vehicle detection, as well as the latest work on GNSS-based vehicle localization and vehicular communication for relative positioning of vehicles, are reviewed. The survey also includes the research work on the fusion of cooperative and non-cooperative approaches to increase the reliability and the availability.

  10. Survey on peripheral techniques of brown coal liquefaction techniques; Kattan ekika gijutsu ni kansuru shuhen gijutsu no chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1980-09-01

    Described herein are the results of a survey on brown coal liquefaction techniques and peripheral techniques, centered on the COSTEAM process under development in the USA; solubilization by alcohol, and liquefaction and cracking with the aid of tetrahydroquinoline as the hydrogen donor, under development in Japan; and low-temperature carbonization and promising new techniques. The COSTEAM process shows higher reaction rates, conversions and oil yields for brown coal liquefaction than the one using hydrogen gas. Some of the problems involved in this process are the high viscosity and oxygenated compound content of the product oil. The product oil is acceptable as fuel for power generating plants and can be produced at a moderate cost, but may be unsuitable as vehicle fuel. Coal liquefaction and solubilization processes are mainly represented by those which use hydrogen. The hydrogen cost, which is high, determines the product price. The processes which use alcohol or tetrahydroquinoline are still in the experimental stage. (NEDO)

  11. Integrating Geological and Geodetic Surveying Techniques for Landslide Deformation Monitoring: Istanbul Case

    Science.gov (United States)

    Menteşe, E. Y.; Kilic, O.; BAS, M.; Tarih, A.; Duran, K.; Gumus, S.; Yapar, E. R.; Karasu, M. E.; Mehmetoğlu, H.; Karaman, A.; Edi˙ger, V.; Kosma, R. C.; Ozalaybey, S.; Zor, E.; Arpat, E.; Polat, F.; Dogan, U.; Cakir, Z.; Erkan, B.

    2017-12-01

    There are several methods that can be utilized for describing landslide mechanisms. While some of them are commonly used, there are relatively new methods that have proven useful. Obviously, each method has its own limitations, and thus the integrated use of these methods contributes to obtaining a realistic landslide model. The slopes of the Küçükçekmece and Büyükçekmece Lagoons, located on the Marmara Sea coast of İstanbul, Turkey, are among the most distinctive examples of complex-type landslides. The landslides in the area started developing at low sea level and appear to have ceased, or at least slowed to a minimum, after the sea level rise, as opposed to the still-active landslides that continue to cause damage, especially on the valley slopes above the present sea level between the two lagoons. To clarify the characteristics of these slope movements and classify them in the most accurate way, the Directorate of Earthquake and Ground Research of Istanbul Metropolitan Municipality launched a project in cooperation with the Marmara Research Center of The Scientific and Technological Research Council of Turkey (TÜBİTAK). The project draws on the techniques of different disciplines such as geology, geophysics, geomorphology, hydrogeology, geotechnics, geodesy, remote sensing and meteorology. Specifically, this study focuses on two main axes of these techniques, namely geological and geodetic, selected for their efficiency and power in understanding the landslide mechanism in the area. The main approaches used in these studies comprise geological drills, inclinometer measurements, GPS surveys and SAR (both satellite and ground based) techniques. Integration of the results gathered from these techniques led the project team to comprehend critical aspects of the landslide phenomenon in the area and to produce precise landslide hazard maps that are basic instruments for resilient urban development.

  12. Stereoscopic Visualization of Diffusion Tensor Imaging Data: A Comparative Survey of Visualization Techniques

    International Nuclear Information System (INIS)

    Raslan, O.; Debnam, J.M.; Ketonen, L.; Kumar, A.J.; Schellingerhout, D.; Wang, J.

    2013-01-01

    Diffusion tensor imaging (DTI) data have traditionally been displayed as a gray-scale fractional anisotropy map (GSFM) or a color-coded orientation map (CCOM). These methods use black and white or color with intensity values to map the complex multidimensional DTI data to a two-dimensional image. Alternative visualization techniques, such as Vmax maps, utilize an enhanced graphical representation of the principal eigenvector by means of a headless arrow on a regular non-stereoscopic (VM) or stereoscopic display (VMS). A survey of clinical utility in patients with intracranial neoplasms was carried out among 8 neuroradiologists using traditional and nontraditional methods of DTI display. Pairwise comparison studies of 5 intracranial neoplasms were performed with a structured questionnaire comparing GSFM, CCOM, VM, and VMS. Six of the 8 neuroradiologists favored Vmax maps over the traditional methods of display (GSFM and CCOM). When comparing the stereoscopic (VMS) and non-stereoscopic (VM) modes, 4 favored VMS, 2 favored VM, and 2 had no preference. In conclusion, processing and visualizing DTI data stereoscopically is technically feasible. An initial survey of users indicated that Vmax-based display methodology, with or without stereoscopic visualization, seems to be preferred over traditional methods of displaying DTI data.

  13. Radiation techniques used in patients with breast cancer: Results of a survey in Spain

    Science.gov (United States)

    Algara, Manuel; Arenas, Meritxell; De las Peñas Eloisa Bayo, Dolores; Muñoz, Julia; Carceller, José Antonio; Salinas, Juan; Moreno, Ferran; Martínez, Francisco; González, Ezequiel; Montero, Ángel

    2012-01-01

    Aim To evaluate the resources and techniques used in the irradiation of patients with breast cancer after lumpectomy or mastectomy, and the status of implementation of new techniques and therapeutic schedules in our country. Background The demand for cancer care has increased among the Spanish population as cancer treatment innovations have proliferated. Radiation therapy in breast cancer has evolved rapidly in recent years with the implementation of three-dimensional conformal radiotherapy, intensity modulated radiotherapy, image guided radiotherapy and hypofractionation. Material and Methods An original survey questionnaire was sent to institutions participating in the SEOR-Mama group (GEORM). In total, the standards of practice in 969 patients with breast cancer after surgery were evaluated. Results The response rate was 70% (28/40 centers). In 98.5% of cases 3D conformal treatment was used. All institutions employed CT-based treatment planning. A boost was performed in 56.4% of patients: electrons in 59.8%, photons in 23.7% and HDR brachytherapy in 8.8%. Fractionation was standard in 93.1% of patients. The supine position was the most frequent; only 3 centers used the prone position. The organs at risk commonly delineated were the homolateral lung (80.8%) and the heart (80.8%). Histograms were used in 84%. An isocentric technique was used in 80.8% of the centers, and asymmetric fields were employed in 62.5%. The CTV was delineated in 46.2%, the PTV in 65% and both in 38.5%. Portal films were used for verification in 65% of the centers. IMRT and hypofractionation were used in 1% and 5.5%, respectively. Conclusion In most centers, 3D conformal treatment and CT-based treatment planning were used. IMRT and hypofractionation are currently poorly implemented in Spain. PMID:24377012

  14. Survey of technology for decommissioning of nuclear fuel cycle facilities. 8. Remote handling and cutting techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ogawa, Ryuichiro; Ishijima, Noboru [Japan Nuclear Cycle Development Inst., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1999-03-01

    In nuclear fuel cycle facility decommissioning and refurbishment, remote handling techniques for dismantling, waste handling and decontamination are needed to reduce personnel radiation exposure. A survey was conducted of the status of R and D activities on remote handling tools suitable for nuclear facilities worldwide, and of existing domestic commercial cutting tools applicable to decommissioning of such facilities. In addition, the drive mechanisms, sensing elements and control systems applicable to remote handling devices were also surveyed. This report presents brief summaries of the survey. (H. Itami)

  15. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H S; Kim, J H; Lee, J C; Choi, Y R; Moon, S S

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and a few techniques suitable for our robot system were selected: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.
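
    The reliability block diagram method named above can be illustrated with a minimal sketch (the component reliabilities below are hypothetical, not taken from the report): series blocks multiply component reliabilities, while parallel (redundant) blocks multiply failure probabilities.

    ```python
    from functools import reduce

    def series(*rel):
        """Series blocks: the system works only if every block works."""
        return reduce(lambda a, b: a * b, rel, 1.0)

    def parallel(*rel):
        """Parallel (redundant) blocks: the system fails only if all blocks fail."""
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), rel, 1.0)

    # Hypothetical robot subsystem: a drive unit in series with a redundant sensor pair.
    drive = 0.95
    sensor = 0.90
    system = series(drive, parallel(sensor, sensor))
    print(round(system, 4))  # 0.95 * (1 - 0.1**2) = 0.9405
    ```

    The same two operators compose to evaluate any series-parallel diagram; non-series-parallel structures need the Markov or simulation methods also listed in the report.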

  16. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    International Nuclear Information System (INIS)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S.

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and a few techniques suitable for our robot system were selected: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  17. Use of structured personality survey techniques to indicate operator response to stressful situations

    International Nuclear Information System (INIS)

    Waller, M.A.

    1990-01-01

    Under given circumstances, a person will tend to operate in one of four dominant orientations: (1) to perform tasks; (2) to achieve consensus; (3) to achieve understanding; or (4) to maintain structure. Historically, personality survey techniques, such as the Myers-Briggs type indicator, have been used to determine these tendencies. While these techniques can accurately reflect a person's orientation under normal social situations, under different sets of conditions the same person may exhibit other tendencies, displaying a similar or entirely different orientation. While most people do not exhibit extreme tendencies or changes of orientation, the shift in personality from normal to stressful conditions can be rather dramatic, depending on the individual. Structured personality survey techniques have been used to indicate operator response to stressful situations. These techniques have been extended to indicate the balance of orientations that the control room team maintains across the various levels of cognizance.

  18. A microprocessor based mobile radiation survey system

    International Nuclear Information System (INIS)

    Gilbert, R.W.; McCormack, W.D.

    1984-01-01

    A microprocessor-based system has been designed and constructed to enhance the performance of routine radiation surveys on roads within the Hanford site. This device continually monitors system performance and the output from four sodium iodide detectors mounted on the rear bumper of a 4-wheel-drive truck. The gamma radiation count rate in counts per second is monitored and a running average computed, with the results compared to predefined limits. If an abnormal instantaneous or average count rate is detected, an alarm is sounded and the data responsible are displayed on a liquid crystal panel in the cab of the vehicle. The system also has the capability to evaluate detector output using multiple time constants and to perform more complex tests and comparisons of the data. Data can be archived for later analysis on conventional chart recorders or stored in digital form on magnetic tape or other digital storage media.
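
    The alarm logic described above — an instantaneous limit plus a running-average limit over a sliding window — can be sketched as follows (the window length and limits are illustrative, not the Hanford system's actual parameters):

    ```python
    from collections import deque

    def monitor(counts, inst_limit, avg_limit, window=10):
        """Yield (index, alarm) pairs: alarm if a sample exceeds the instantaneous
        limit, or the running average of the last `window` samples exceeds its limit."""
        recent = deque(maxlen=window)
        for i, c in enumerate(counts):
            recent.append(c)
            avg = sum(recent) / len(recent)
            yield i, c > inst_limit or avg > avg_limit

    # Background around 200 cps with one abrupt spike.
    counts = [200, 205, 198, 202, 950, 201, 199]
    alarms = [i for i, a in monitor(counts, inst_limit=500, avg_limit=300) if a]
    print(alarms)  # [4, 5, 6]
    ```

    Note that the spike trips the instantaneous limit at sample 4, and the running average then keeps the alarm asserted for the following samples — the persistence behavior a multi-time-constant scheme is designed to tune.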

  19. Microprocessor based mobile radiation survey system

    International Nuclear Information System (INIS)

    Gilbert, R.W.; McCormack, W.D.

    1983-12-01

    A microprocessor-based system has been designed and constructed to enhance the performance of routine radiation surveys on roads within the Hanford site. This device continually monitors system performance and the output from four sodium iodide detectors mounted on the rear bumper of a 4-wheel-drive truck. The gamma radiation count rate in counts per second is monitored and a running average computed, with the results compared to predefined limits. If an abnormal instantaneous or average count rate is detected, an alarm is sounded and the data responsible are displayed on a liquid crystal panel in the cab of the vehicle. The system also has the capability to evaluate detector output using multiple time constants and to perform more complex tests and comparisons of the data. Data can be archived for later analysis on conventional chart recorders or stored in digital form on magnetic tape or other digital storage media. 4 figures

  20. A SURVEY OF AUTOMATION TECHNIQUES COMING FORTH IN SHEET-FED OFFSET PRINTING ORGANIZATIONS

    OpenAIRE

    Mr. Ramesh Kumar*, Mr. Bijender & Mr. Sandeep Boora

    2017-01-01

    Sheet-fed offset is one of the premier printing processes in India as well as abroad. To cope with customers' large-quantity demands, automation has become mandatory. From prepress to post-press, a wide range of automation techniques exists and is emerging for sheet-fed offset presses. The objective of this paper is to throw light on the various sheet-fed offset automation techniques existing today and their futuristic implications. The data related to automation were collected with the help of a survey conducte...

  1. DCT-based cyber defense techniques

    Science.gov (United States)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content in order to attack the end user. Most attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
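
    As a rough illustration of operating in the DCT domain (this is a generic coefficient-suppression sketch, not the authors' actual smart-threshold or anomaly-correction algorithms), the code below computes a 1-D DCT-II of a block of sample values, zeroes coefficients whose magnitude falls below a threshold, and reconstructs the block:

    ```python
    import math

    def dct(x):
        """Unnormalized DCT-II."""
        N = len(x)
        return [sum(x[n] * math.cos(math.pi / N * (n + 0.5) * k) for n in range(N))
                for k in range(N)]

    def idct(X):
        """Inverse of the unnormalized DCT-II (scaled DCT-III)."""
        N = len(X)
        return [(X[0] / 2 + sum(X[k] * math.cos(math.pi / N * (n + 0.5) * k)
                                for k in range(1, N))) * 2 / N for n in range(N)]

    def suppress(x, threshold):
        """Zero small-magnitude DCT coefficients, then reconstruct the block."""
        X = [c if abs(c) >= threshold else 0.0 for c in dct(x)]
        return idct(X)

    block = [52.0, 55.0, 61.0, 66.0, 70.0, 61.0, 64.0, 73.0]
    cleaned = suppress(block, threshold=10.0)
    ```

    Small coefficients carry little perceptual energy, which is why suppressing them can destroy a hidden payload while keeping PSNR degradation modest; a real defense would work on 8x8 2-D blocks of the decoded image or I-frame.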

  2. Nasal base narrowing: the combined alar base excision technique.

    Science.gov (United States)

    Foda, Hossam M T

    2007-01-01

    To evaluate the role of the combined alar base excision technique in narrowing the nasal base and correcting excessive alar flare. The study included 60 cases presenting with a wide nasal base and excessive alar flaring. The surgical procedure combined an external alar wedge resection with an internal vestibular floor excision. All cases were followed up for a mean of 32 (range, 12-144) months. Nasal tip modification and correction of any preexisting caudal septal deformities were always completed before the nasal base narrowing. The mean width of the external alar wedge excised was 7.2 (range, 4-11) mm, whereas the mean width of the sill excision was 3.1 (range, 2-7) mm. Completing the internal excision first resulted in a more conservative external resection, thus avoiding any blunting of the alar-facial crease. No cases of postoperative bleeding, infection, or keloid formation were encountered, and the external alar wedge excision healed with an inconspicuous scar that was well hidden in the depth of the alar-facial crease. Finally, the risk of notching of the alar rim, which can occur at the junction of the external and internal excisions, was significantly reduced by adopting a 2-layered closure of the vestibular floor (P = .01). The combined alar base excision resulted in effective narrowing of the nasal base with elimination of excessive alar flare. Commonly feared complications, such as blunting of the alar-facial crease or notching of the alar rim, were avoided by using simple modifications in the technique of excision and closure.

  3. Practical guidelines for developing a smartphone-based survey instrument

    DEFF Research Database (Denmark)

    Ohme, Jakob; de Vreese, Claes Holger; Albæk, Erik

    The increasing relevance of mobile surveys makes it important to gather empirical evidence on designs of such surveys. This research note presents the results of a test study conducted to identify the best set-up for a smartphone-based survey. We base our analysis on a random sample of Danish...

  4. Risk based technique for improving technical specifications

    International Nuclear Information System (INIS)

    Kim, I. S.; Jae, M. S.; Kim, B. S.; Hwang, S. W.; Kang, K. M.; Park, S. S.; Yu, Y. S.

    2001-03-01

    The objective of this study is to develop systematic guidance for reviewing the documents associated with changes of technical specifications. The work done in this fiscal year is the following: surveys of TS requirements, TS improvements and TS regulations in foreign countries as well as Korea; a survey of the state of the art of RITSs and their use in Korea; development of a decision-making framework for both the licensee and the regulatory agency; description of risk measures; and an assessment methodology for STI/AOT and for adverse effects caused by periodic maintenance, which are explained in the appendix. The results of this study might contribute to enhancing the quality of the current technical specifications and to preparing a risk-informed regulation program using the decision-making framework developed in this study.

  5. Localization in Wireless Sensor Networks: A Survey on Algorithms, Measurement Techniques, Applications and Challenges

    Directory of Open Access Journals (Sweden)

    Anup Kumar Paul

    2017-10-01

    Full Text Available Localization is an important aspect of the field of wireless sensor networks (WSNs) that has attracted significant research interest in academia and the research community. A wireless sensor network is formed by a large number of tiny, low-energy, low-cost sensors with limited processing capability that communicate with each other in an ad hoc fashion. The task of determining the physical coordinates of sensor nodes in WSNs is known as localization or positioning and is a key factor in today's communication systems for estimating the place of origin of events. As the required positioning accuracy varies between applications, different localization methods are used in different applications, and several challenges arise in special scenarios such as forest fire detection. In this paper, we survey different measurement techniques and strategies for range-based and range-free localization, with an emphasis on the latter. Further, we discuss different localization-based applications where the estimation of the location information is crucial. Finally, a comprehensive discussion of challenges such as accuracy, cost, complexity, and scalability is given.
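
    For the range-based category, the classic trilateration step can be sketched: with three anchors at known positions and measured distances, subtracting one circle equation from the others yields a linear 2x2 system for the node position (idealized noise-free ranges and hypothetical coordinates):

    ```python
    def trilaterate(anchors, dists):
        """Solve for (x, y) from three anchor positions and range measurements
        by linearizing the circle equations against the first anchor."""
        (x1, y1), (x2, y2), (x3, y3) = anchors
        d1, d2, d3 = dists
        # 2*(xi - x1)*x + 2*(yi - y1)*y = d1^2 - di^2 + xi^2 + yi^2 - x1^2 - y1^2
        a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
        a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
        b1 = d1**2 - d2**2 + x2**2 + y2**2 - x1**2 - y1**2
        b2 = d1**2 - d3**2 + x3**2 + y3**2 - x1**2 - y1**2
        det = a11 * a22 - a12 * a21  # Cramer's rule for the 2x2 system
        return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

    # Node at (3, 4); three anchors with noise-free ranges.
    anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
    dists = [5.0, 65.0**0.5, 45.0**0.5]
    print(trilaterate(anchors, dists))  # (3.0, 4.0)
    ```

    With noisy ranges and more than three anchors the same linearization becomes an overdetermined system solved by least squares; range-free methods replace the distances with connectivity-derived estimates.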

  6. Nuclear assay of coal. Volume 4. Moisture determination in coal: survey of electromagnetic techniques. Final report

    International Nuclear Information System (INIS)

    Bevan, R.; Luckie, P.; Gozani, T.; Brown, D.R.; Bozorgmanesh, H.; Elias, E.

    1979-01-01

    This survey consists of two basic parts. The first is a survey of various non-nuclear moisture determination techniques; three are identified as promising for eventual on-line application with coal: the capacitance, microwave attenuation, and nuclear magnetic resonance (NMR) techniques. The second part is devoted to an in-depth analysis of these three techniques and the current extent to which they have been applied to coal. For a given coal type, accuracies of ±1% absolute in moisture content are achievable with all three techniques. The accuracy of the two electromagnetic techniques has been demonstrated in the laboratory and on-line in coal-burning plants, whereas only small samples have been analyzed with NMR. The current shortcoming of the simple electromagnetic techniques is the sensitivity of their calibrations to physical parameters and coal type. NMR is currently limited by small sample sizes and non-rugged design. These findings are summarized, and a list of manufacturers of moisture analyzers is given in the Appendix.

  7. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    Science.gov (United States)

    Al-Mohammed, A. H.; Abido, M. A.

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms for three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when installed on a transmission line, create certain problems for line fault locators, and therefore fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency fault location techniques based on the wavelet transform are discussed. Finally, the paper highlights areas for future research. PMID:24701191
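
    The core idea of two-end synchronized fault location can be sketched for a lumped-parameter short line (shunt capacitance ignored; all phasor values below are synthetic): the fault-point voltage expressed from each terminal must agree, which yields the per-unit distance m directly from the synchronized phasors.

    ```python
    # V_S - m*Z*I_S = V_R - (1 - m)*Z*I_R  =>  m = (V_S - V_R + Z*I_R) / (Z*(I_S + I_R))
    def fault_distance(vs, is_, vr, ir, z):
        """Per-unit distance to the fault from the sending end, short-line model."""
        return ((vs - vr + z * ir) / (z * (is_ + ir))).real

    # Synthesize a consistent scenario with the fault at m = 0.3.
    z = complex(2.0, 10.0)                                     # total line impedance (ohms)
    m_true = 0.3
    i_s, i_r = complex(400.0, -120.0), complex(250.0, -60.0)   # terminal currents (A)
    v_fault = complex(8000.0, 500.0)                           # voltage at the fault point
    v_s = v_fault + m_true * z * i_s
    v_r = v_fault + (1 - m_true) * z * i_r
    print(round(fault_distance(v_s, i_s, v_r, i_r, z), 6))  # 0.3
    ```

    Because both terminals are time-synchronized, the fault resistance cancels out of the equation; the distributed-parameter and series-compensated cases surveyed in the paper refine this same two-end balance.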

  8. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    Directory of Open Access Journals (Sweden)

    A. H. Al-Mohammed

    2014-01-01

    Full Text Available This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms for three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when installed on a transmission line, create certain problems for line fault locators, and therefore fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency fault location techniques based on the wavelet transform are discussed. Finally, the paper highlights areas for future research.

  9. Triangulation-based 3D surveying borescope

    Science.gov (United States)

    Pulwer, S.; Steglich, P.; Villringer, C.; Bauer, J.; Burger, M.; Franz, M.; Grieshober, K.; Wirth, F.; Blondeau, J.; Rautenberg, J.; Mouti, S.; Schrader, S.

    2016-04-01

    In this work, a measurement concept based on triangulation was developed for borescopic 3D surveying of surface defects. The integration of such a measurement system into a borescope environment requires excellent space utilization. The triangulation angle, the projected pattern, the numerical apertures of the optical system, and the viewing angle were calculated using partial coherence imaging and geometric optical raytracing methods. Additionally, optical aberrations and defocus were considered by the integration of Zernike polynomial coefficients. The measurement system is able to measure objects with a size of 50 μm in all dimensions with an accuracy of ±5 μm. To manage the issue of a low depth of field while using a high-resolution optical system, a wavelength-dependent aperture was integrated. Thereby, we are able to control the depth of field and resolution of the optical system, and can use the borescope in measurement mode, with high resolution and low depth of field, or in inspection mode, with low resolution and higher depth of field. First measurements of a demonstrator system are in good agreement with our simulations.
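
    The underlying triangulation principle can be sketched independently of the borescope's actual optics (the baseline and angles below are hypothetical): two viewpoints separated by a known baseline each measure the angle to a surface point, which fixes its position.

    ```python
    import math

    def triangulate(baseline, alpha, beta):
        """Locate a point from two viewpoints on the x-axis separated by `baseline`,
        given the elevation angles alpha (from the origin) and beta (from (baseline, 0))."""
        ta, tb = math.tan(alpha), math.tan(beta)
        y = baseline * ta * tb / (ta + tb)  # intersect the two sight lines
        return y / ta, y

    # Point at (3, 4) observed from (0, 0) and (10, 0).
    alpha = math.atan2(4.0, 3.0)        # angle seen from the first viewpoint
    beta = math.atan2(4.0, 10.0 - 3.0)  # angle seen from the second viewpoint
    x, y = triangulate(10.0, alpha, beta)
    print(round(x, 6), round(y, 6))  # 3.0 4.0
    ```

    In a structured-light borescope the "second viewpoint" is the pattern projector, and the triangulation angle trades height sensitivity against the occlusion and packaging constraints the abstract mentions.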

  10. The Aalborg Survey / Part 2 - GPS Based Survey

    DEFF Research Database (Denmark)

    Harder, Henrik; Reiter, Ida Maria; Christensen, Cecilie Breinholm

    , the GPS based data are supplemented with an attractivity analysis among the 169 respondents. The resulting data are presented in maps in three scales that illustrate the respondents’ use of Aalborg’s urban spaces with regards to the respondent’s movements in time and space. By further coupling the GPS...... data with data on the single respondent’s gender, age, address, activities, mode of transport, use of money etc. it is possible to gain a more detailed knowledge of young people’s use of urban spaces in Aalborg....

  11. LH2 Target Design & Position Survey Techniques for the MUSE experiment for Precise Proton Radius Measurement

    Science.gov (United States)

    Le Pottier, Luc; Roy, Pryiashee; Lorenzon, Wolfgang; Raymond, Richard; Steinberg, Noah; Rossi de La Fuente, Erick; MUSE (MUon proton Scattering Experiment) Collaboration

    2017-09-01

    The proton radius puzzle is a currently unresolved problem that has intrigued the scientific community: a 7σ discrepancy between the proton radii determined from muonic hydrogen spectroscopy and from electron scattering measurements. The MUon Scattering Experiment (MUSE) aims to resolve this puzzle by performing the first simultaneous elastic scattering measurements of both electrons and muons on the proton, which will allow the comparison of the radii from the two interactions with reduced systematic uncertainties. The data from this experiment are expected to provide the best test of lepton universality to date. The experiment will take place at the Paul Scherrer Institute in Switzerland in 2018. An essential component of the experiment is a liquid hydrogen (LH2) cryotarget system. Our group at the University of Michigan is responsible for the design, fabrication and installation of this system. Here we present our LH2 target cell design and fabrication techniques for successful operation at 20 K and 1 atm, and our computer-vision-based target position survey system, which will determine the position of the target, installed inside a vacuum chamber, with 0.01 mm or better precision at the height of the liquid hydrogen target and along the beam direction during the experiment.

  12. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
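
    A minimal k-means sketch (on synthetic three-variable observations, not the NURE data) illustrates the clustering step described above: assign each observation to its nearest centroid, recompute the centroids, and repeat until the assignments converge.

    ```python
    def kmeans(points, k, iters=50):
        """Plain k-means on tuples; deterministic initialization from the data order."""
        centroids = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
        labels = [0] * len(points)
        for _ in range(iters):
            # Assign each point to the nearest centroid (squared Euclidean distance).
            new_labels = [min(range(k), key=lambda j: sum((a - b) ** 2
                              for a, b in zip(p, centroids[j]))) for p in points]
            if new_labels == labels:
                break  # converged: assignments stopped changing
            labels = new_labels
            # Recompute each centroid as the mean of its members.
            for j in range(k):
                members = [p for p, l in zip(points, labels) if l == j]
                if members:
                    centroids[j] = tuple(sum(c) / len(members) for c in zip(*members))
        return labels, centroids

    # Two well-separated groups of three-variable radiometric-style observations.
    data = [(1.0, 1.1, 0.9), (1.2, 0.8, 1.0), (0.9, 1.0, 1.1),
            (9.8, 10.1, 10.0), (10.2, 9.9, 10.3)]
    labels, _ = kmeans(data, k=2)
    print(labels)  # [0, 0, 0, 1, 1]
    ```

    The "convergent" variants used in the report add merge/split heuristics on top of this loop, and the principal-components variant runs the same iteration on decorrelated variables.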

  13. CNMI Boat-based Creel Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Commonwealth of the Northern Mariana Islands (CNMI) Creel surveys are operated by the Division of Fish and Wildlife (DFW) and are only on the island of Saipan....

  14. Synchrotron radiation based analytical techniques (XAS and XRF)

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2014-01-01

    A brief description of the principles of the X-ray absorption spectroscopy (XAS) and X-ray fluorescence (XRF) techniques is given in this article, with emphasis on the advantages of using synchrotron-radiation-based instrumentation/beamlines. The XAS technique is described in more detail to emphasize its strength as a local structural probe. (author)

  15. Technical errors in complete mouth radiographic survey according to radiographic techniques and film holding methods

    International Nuclear Information System (INIS)

    Choi, Karp Sik; Byun, Chong Soo; Choi, Soon Chul

    1986-01-01

    The purpose of this study was to investigate the numbers and causes of retakes in 300 complete mouth radiographic surveys made by 75 senior dental students. According to the radiographic technique and film holding method, they were divided into 4 groups: Group I: bisecting-angle technique with the patient's fingers. Group II: bisecting-angle technique with the Rinn Snap-A-Ray device. Group III: bisecting-angle technique with the Rinn XCP instrument (short cone). Group IV: bisecting-angle technique with the Rinn XCP instrument (long cone). For each group, the most frequent cause of retakes, the tooth area most frequently retaken, and the average number of retakes per complete mouth survey were evaluated. The obtained results were as follows: Group I: incorrect film placement (47.8%), upper canine region, and 0.89. Group II: incorrect film placement (44.0%), upper canine region, and 1.12. Group III: incorrect film placement (79.2%), upper canine region, and 2.05. Group IV: incorrect film placement (67.7%), upper canine region, and 1.69.

  16. Accelerator based techniques for contraband detection

    Science.gov (United States)

    Vourvopoulos, George

    1994-05-01

    It has been shown that narcotics, explosives, and other contraband materials contain various chemical elements such as H, C, N, O, P, S, and Cl in quantities and ratios that differentiate them from each other and from other innocuous substances. Neutrons and γ-rays are able to penetrate various materials to large depths. They are thus able, in a non-intrusive way, to interrogate volumes ranging from suitcases to Sea-Land containers, and to image the object with an appreciable degree of reliability. Neutron-induced reactions such as (n, γ), (n, n'), (n, p) or proton-induced γ-resonance absorption are some of the reactions currently investigated for the identification of the chemical elements mentioned above. Various DC and pulsed techniques are discussed, and their advantages, characteristics, and current progress are shown. Areas where the use of these methods is currently under evaluation include detection of hidden explosives, illicit drug interdiction, chemical warfare agent identification, nuclear waste assay, nuclear weapons destruction and others.

  17. An Authentication Technique Based on Classification

    Institute of Scientific and Technical Information of China (English)

    李钢; 杨杰

    2004-01-01

    We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark differs from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are treated as classes, and a classification algorithm is used to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.

  18. Laser-based techniques for combustion diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Georgiev, N.

    1997-04-01

    Two-photon-induced Degenerate Four-Wave Mixing, DFWM, was applied for the first time to the detection of CO and NH{sub 3} molecules. Measurements were performed in a cell and in atmospheric-pressure flames. In the cell measurements, the signal dependence on the pressure and on the laser beam intensity was studied. The possibility of simultaneous detection of NH{sub 3} and OH was investigated. Carbon monoxide and ammonia were also detected employing two-photon-induced Polarization Spectroscopy, PS. In the measurements performed in a cold gas flow, the signal strength dependence on the laser intensity, and on the polarization of the pump beam, was investigated. An approach to improve the spatial resolution of Amplified Spontaneous Emission, ASE, detection was developed. In this approach, two laser beams at different frequencies were crossed in the sample: if the sum of the frequencies of the two laser beams matches a two-photon resonance of the investigated species, only the molecules in the intersection volume will be excited. NH{sub 3} molecules and C atoms were studied. The potential of using two-photon LIF for two-dimensional imaging of combustion species was investigated. Although LIF is species specific, several species can be detected simultaneously by utilizing spectral coincidences. Combining one- and two-photon processes, OH, NO, and O were detected simultaneously, as well as OH, NO, and NH{sub 3}. Collisional quenching is the major source of uncertainty in quantitative applications of LIF. A technique for two-dimensional, absolute species concentration measurements, circumventing the problems associated with collisional quenching, was developed. By applying simple mathematics to the ratio of two LIF signals generated from two counterpropagating laser beams, the absolute species concentration could be obtained. 41 refs

  19. Survey on development of brown coal liquefaction techniques; Kattan ekika gijutsu ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1980-09-01

    Described herein are the results of a literature survey on brown coal liquefaction reactions and elementary techniques. Liquefaction of brown coal in the presence of CO and steam, or CO, H{sub 2} and steam, has been investigated; the literature survey does not make clear whether this approach is superior to the normal process, which uses hydrogen. Brown coal has a high moisture content, and drying techniques need to be developed for its liquefaction. The future coal liquefaction plant will be much larger than past ones, and there are a number of problems to be solved, such as those involved in the design of large-sized high-pressure slurry pumps, heat exchangers and preheaters. It is also necessary to develop the materials of, and production techniques for, large reactors that are serviceable under severe conditions. The solid-liquid separation of liquefaction products involves a number of elementary techniques characteristic of coal liquefaction processes and needs many technological developments. The one-stage brown coal liquefaction process is compared with the two-stage process with secondary hydrogenation of SCR, but no clear conclusion is reached. (NEDO)

  20. SNE's methodological basis - web-based software in entrepreneurial surveys

    DEFF Research Database (Denmark)

    Madsen, Henning

    This overhead-based paper gives an introduction to the research methodology applied in the surveys carried out in the SNE-project.

  1. The use of continuous improvement techniques: A survey-based ...

    African Journals Online (AJOL)

    ... and effectiveness differ between types of operations (i.e., manufacturing versus service). Hence, this research provides a starting point for deploying lean and/or ... It also provides guidance for practitioners about what tools are effective in a ...

  2. Survey of image quality and radiographic technique of pediatric chest examinations performed in Latin America

    International Nuclear Information System (INIS)

    Khoury, H.; Mora, P.; Defaz, M.Y.; Blanco, S.; Leyton, F.; Benavente, T.; Ortiz Lopez, P.; Ramirez, R.

    2008-01-01

    This work presents the results of a survey of entrance surface air kerma (Ke) values, image quality and radiographic exposure parameters used in pediatric chest examinations performed in Latin America. This study is part of the activities of the IAEA Regional Project RLA/9/057, whose objective is to optimize the radiological protection of patients in diagnostic and interventional radiology, nuclear medicine and radiotherapy. The survey was performed in nine hospitals in Argentina (1), Brazil (4), Chile (1), Costa Rica (1), Peru (1) and Ecuador (1). The study group consisted of 462 pediatric patients (Group I: from two days to one year of age; Group II: from four to six years of age) undergoing chest PA/AP examinations. At the time of the examination, the exposure parameters (kVp, mAs, focal-spot-to-film distance, etc.) and patient information (gender, height, weight and age) were recorded. The radiographic image quality was evaluated by the local radiologist based on the European Guidelines on Quality Criteria for Diagnostic Radiographic Images in Pediatrics. The results showed that the exposure parameters used on newborn patients were mostly outside the 60-65 kV range recommended by the European Guidelines for good radiographic practice. In the case of examinations of patients aged 4 to 6 years, 80% were performed with a peak tube voltage within the 60-80 kV range, as recommended by the European Guidelines. It was found that none of the countries fully comply with the European Guidelines on Quality Criteria, and criteria No. 2 and No. 3 (reproduction of the chest without rotation) received the lowest scores. This probably occurs because there are no proper patient immobilization devices. The Ke values, for both patient groups, showed a wide dispersion, ranging from 10 μGy to 160 μGy for the newborn patients and from 20 μGy to 240 μGy for infant patients. It is possible to conclude that, in the participating Latin American countries on this project

  3. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

    The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit an amount of required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique, to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format for a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all incurred overhead). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively.
In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures
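
As a minimal sketch of the Huffman-coding core that the article builds on (without the instruction-splitting or re-encoding steps, and ignoring decoding-table overhead), one can estimate a compression ratio over raw bytes:

```python
import heapq
from collections import Counter

def huffman_code_lengths(data):
    """Return {symbol: code length in bits} for the symbols in `data`."""
    freq = Counter(data)
    if len(freq) == 1:                       # degenerate single-symbol input
        return {next(iter(freq)): 1}
    # Heap entries: (frequency, tie-breaker, {symbol: depth so far}).
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)       # merge the two rarest subtrees,
        f2, _, b = heapq.heappop(heap)       # deepening every symbol in them
        merged = {s: d + 1 for s, d in {**a, **b}.items()}
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

def compression_ratio(data):
    """Compressed bits / original bits (8 per symbol); table overhead ignored."""
    freq = Counter(data)
    lengths = huffman_code_lengths(data)
    compressed = sum(freq[s] * lengths[s] for s in freq)
    return compressed / (8 * len(data))

print(round(compression_ratio(b"abracadabra"), 3))  # 23 coded bits / 88 raw bits
```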

  4. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
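
A minimal sketch of the metamodel idea: sample an "expensive" analysis at a few design points and replace it with a cheap algebraic approximation. The function and sample points below are invented; real studies fit least-squares response surfaces or kriging models over designed experiments.

```python
def expensive_analysis(x):
    """Stand-in for a costly simulation code (illustrative assumption)."""
    return (x - 2.0) ** 2 + 1.0

def quadratic_metamodel(f, x0, x1, x2):
    """Fit y = a*x^2 + b*x + c exactly through three sampled points
    (Lagrange interpolation collapsed to polynomial coefficients)."""
    ys = [f(x) for x in (x0, x1, x2)]
    a = (ys[0] / ((x0 - x1) * (x0 - x2))
         + ys[1] / ((x1 - x0) * (x1 - x2))
         + ys[2] / ((x2 - x0) * (x2 - x1)))
    b = (-ys[0] * (x1 + x2) / ((x0 - x1) * (x0 - x2))
         - ys[1] * (x0 + x2) / ((x1 - x0) * (x1 - x2))
         - ys[2] * (x0 + x1) / ((x2 - x0) * (x2 - x1)))
    c = ys[0] - a * x0 ** 2 - b * x0
    return lambda x: a * x ** 2 + b * x + c   # the cheap surrogate

surrogate = quadratic_metamodel(expensive_analysis, 0.0, 2.0, 4.0)
print(surrogate(3.0))   # the true analysis is itself quadratic, so this is exact: 2.0
```

Three samples suffice here only because the target is itself quadratic; in practice the metamodel is an approximation whose error must be checked against held-out analysis runs.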

  5. Comprehensive geophysical survey technique in exploration for deep-buried hydrothermal type uranium deposits in Xiangshan volcanic basin, China

    International Nuclear Information System (INIS)

    Ke, D.

    2014-01-01

    According to recent drilling results, uranium mineralization has been found at depths greater than 1000 m in the Xiangshan volcanic basin, where uranium exploration has been carried out for over 50 years. This paper presents a comprehensive geophysical survey technique, including the audio magnetotelluric method (AMT), high-resolution ground magnetic survey and radon survey, which aims to prospect for deep-buried and concealed uranium deposits in the Xiangshan volcanic basin. Based on research and application, a comprehensive geophysical technique consisting of data acquisition, processing and interpretation has been established. Concealed rocks and ore-controlling structures buried deeper than 1000 m can be detected using this technique. Moreover, an anti-interference technique for AMT surveying is presented, which can eliminate the interference induced by high-voltage power lines. The AMT results in the Xiangshan volcanic basin exhibit a high-low-high resistivity pattern, indicating three geological layers. The upper layer, with high resistivity, corresponds mainly to porphyroclastic lava. The middle layer, with low resistivity, is metamorphic schist or dellenite, whereas the lower layer, with high resistivity, is inferred to be granite. The interface between the middle and lower layers is recognized as the potential zone for the occurrence of uranium deposits. Based on the corresponding relation of resistivity and magnetic anomalies with uranium ore bodies, a tracing model for faults and interfaces between the different rocks, and a forecasting model of advantageous areas for uranium deposits, have been established. Using the forecasting model, some significant sections for uranium deposits were delineated in the west of the Xiangshan volcanic basin. As a result, some achievements in uranium prospecting have been acquired: high-grade economic uranium ore bodies have been found in several boreholes located in the forecasted zones. (author)

  6. A SURVEY OF RETINA BASED DISEASE IDENTIFICATION USING BLOOD VESSEL SEGMENTATION

    Directory of Open Access Journals (Sweden)

    P Kuppusamy

    2016-11-01

    Full Text Available Colour retinal photography is one of the most essential tools for confirming various eye diseases, and the iris is the primary attribute for authenticating a person. This research work presents a survey and comparison of various methods for blood-vessel-related feature identification, segmentation, extraction and enhancement. Additionally, this study examines the performance of various databases for storing the images and testing them in minimal time. Based on the survey, the paper also identifies the better-performing techniques.

  7. Emerging Technologies and Techniques for Wide Area Radiological Survey and Remediation

    Energy Technology Data Exchange (ETDEWEB)

    Sutton, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zhao, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-03-24

    Technologies to survey and decontaminate wide-area contamination and process the subsequent radioactive waste have been developed and implemented following the Chernobyl nuclear power plant release and the breach of a radiological source resulting in contamination in Goiania, Brazil. These civilian examples of radioactive material releases provided some of the first examples of urban radiological remediation. Many emerging technologies have recently been developed and demonstrated in Japan following the release of radioactive cesium isotopes (Cs-134 and Cs-137) from the Fukushima Dai-ichi nuclear power plant in 2011. Information on technologies reported by several Japanese government agencies, such as the Japan Atomic Energy Agency (JAEA), the Ministry of the Environment (MOE) and the National Institute for Environmental Science (NIES), together with academic institutions and industry are summarized and compared to recently developed, deployed and available technologies in the United States. The technologies and techniques presented in this report may be deployed in response to a wide area contamination event in the United States. In some cases, additional research and testing is needed to adequately validate the technology effectiveness over wide areas. Survey techniques can be deployed on the ground or from the air, allowing a range of coverage rates and sensitivities. Survey technologies also include those useful in measuring decontamination progress and mapping contamination. Decontamination technologies and techniques range from non-destructive (e.g., high pressure washing) and minimally destructive (plowing), to fully destructive (surface removal or demolition). Waste minimization techniques can greatly impact the long-term environmental consequences and cost following remediation efforts. Recommendations on technical improvements to address technology gaps are presented together with observations on remediation in Japan.

  8. A survey of machine readable data bases

    Science.gov (United States)

    Matlock, P.

    1981-01-01

    Forty-two of the machine readable data bases available to the technologist and researcher in the natural sciences and engineering are described and compared with the data bases and data base services offered by NASA.

  9. Improving Standard Poststratification Techniques For Random-Digit-Dialing Telephone Surveys

    Directory of Open Access Journals (Sweden)

    Michael P. Battaglia

    2008-03-01

    Full Text Available Random-digit-dialing surveys in the United States such as the Behavioral Risk Factor Surveillance System (BRFSS typically poststratify on age, gender and race/ethnicity using control totals from an appropriate source such as the 2000 Census, the Current Population Survey, or the American Community Survey. Using logistic regression and interaction detection software we identified key "main effect" socio-demographic variables and important two-factor interactions associated with several health risk factor outcomes measured in the BRFSS, one of the largest annual RDD surveys in the United States. A procedure was developed to construct control totals, which were consistent with estimates of age, gender, and race/ethnicity obtained from a commercial source and distributions of other demographic variables from the Current Population Survey. Raking was used to incorporate main effects and two-factor interaction margins into the weighting of the BRFSS survey data. The resulting risk factor estimates were then compared with those based on the current BRFSS weighting methodology and mean squared error estimates were developed. The research demonstrates that by identifying socio-demographic variables associated with key outcome variables and including these variables in the weighting methodology, nonresponse bias can be substantially reduced.
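
The raking step described above (iterative proportional fitting of survey weights to control totals) can be sketched as follows; the respondents, categories and control totals are invented for illustration.

```python
def rake(weights, rows, cols, row_targets, col_targets, iters=50):
    """Iterative proportional fitting: rescale unit weights until the weighted
    margins match the control totals. `rows[i]`/`cols[i]` give each unit's
    category on the two raking dimensions."""
    w = list(weights)
    for _ in range(iters):
        for margin, targets in ((rows, row_targets), (cols, col_targets)):
            totals = {}
            for wi, cat in zip(w, margin):            # current weighted margin
                totals[cat] = totals.get(cat, 0.0) + wi
            w = [wi * targets[cat] / totals[cat]      # rescale to the control
                 for wi, cat in zip(w, margin)]
    return w

# Four respondents, two raking dimensions (age group x gender), equal base weights.
ages    = ["18-44", "18-44", "45+", "45+"]
genders = ["F", "M", "F", "M"]
w = rake([1, 1, 1, 1], ages, genders,
         {"18-44": 60, "45+": 40}, {"F": 55, "M": 45})
print([round(x, 2) for x in w])  # [33.0, 27.0, 22.0, 18.0] — both margins match
```

Extending the same loop with margins for two-factor interaction cells (age x gender combinations) gives the interaction raking the abstract describes.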

  10. Nurse Practitioners' Use of Communication Techniques: Results of a Maryland Oral Health Literacy Survey.

    Science.gov (United States)

    Koo, Laura W; Horowitz, Alice M; Radice, Sarah D; Wang, Min Q; Kleinman, Dushanka V

    2016-01-01

    We examined nurse practitioners' use and opinions of recommended communication techniques for the promotion of oral health as part of a Maryland state-wide oral health literacy assessment. Use of recommended health-literate and patient-centered communication techniques has demonstrated improved health outcomes. A 27-item self-report survey, containing 17 communication-technique items across 5 domains, was mailed to 1,410 licensed nurse practitioners (NPs) in Maryland in 2010. Use of communication techniques and opinions about their effectiveness were analyzed using descriptive statistics. General linear models explored provider and practice characteristics to predict differences in the total number and the mean number of communication techniques routinely used in a week. More than 80% of NPs (N = 194) routinely used 3 of the 7 basic communication techniques: simple language, limiting teaching to 2-3 concepts, and speaking slowly. More than 75% of respondents believed that 6 of the 7 basic communication techniques are effective. Sociodemographic provider characteristics and practice characteristics were not significant predictors of the mean number or the total number of communication techniques routinely used by NPs in a week. Two potential predictors of using more of the 7 basic communication techniques, each significant in one general linear model, were assessing the office for user-friendliness and ever taking a communication course in addition to nursing school. NPs in Maryland self-reported routinely using some recommended health-literate communication techniques, with belief in their effectiveness. Our findings suggest that having assessed the office for patient-friendliness or having taken a communication course beyond initial nursing education may predict use of more of the 7 basic communication techniques. These self-reported findings should be validated with observational studies. Graduate and continuing education for NPs

  11. Using Intelligent Techniques in Construction Project Cost Estimation: 10-Year Survey

    Directory of Open Access Journals (Sweden)

    Abdelrahman Osman Elfaki

    2014-01-01

    Full Text Available Cost estimation is the most important preliminary process in any construction project. Therefore, construction cost estimation has the lion’s share of the research effort in construction management. In this paper, we have analysed and studied proposals for construction cost estimation for the last 10 years. To implement this survey, we have proposed and applied a methodology that consists of two parts. The first part concerns data collection, for which we have chosen special journals as sources for the surveyed proposals. The second part concerns the analysis of the proposals. To analyse each proposal, the following four questions have been set. Which intelligent technique is used? How have data been collected? How are the results validated? And which construction cost estimation factors have been used? From the results of this survey, two main contributions have been produced. The first contribution is the defining of the research gap in this area, which has not been fully covered by previous proposals of construction cost estimation. The second contribution of this survey is the proposal and highlighting of future directions for forthcoming proposals, aimed ultimately at finding the optimal construction cost estimation. Moreover, we consider the second part of our methodology as one of our contributions in this paper. This methodology has been proposed as a standard benchmark for construction cost estimation proposals.

  12. [Abortion in Brazil: a household survey using the ballot box technique].

    Science.gov (United States)

    Diniz, Debora; Medeiros, Marcelo

    2010-06-01

    This study presents the first results of the National Abortion Survey (PNA, Pesquisa Nacional de Aborto), a household random sample survey fielded in 2010 covering urban women in Brazil aged 18 to 39 years. The PNA combined two techniques: interviewer-administered questionnaires and self-administered ballot box questionnaires. The results of the PNA show that by the end of her reproductive life one in five women has had an abortion, with abortions being more frequent in the main reproductive ages, that is, from 18 to 29 years. No relevant difference was observed in the practice of abortion among religious groups, but abortion was found to be more common among people with lower education. Medical drugs were used to induce abortion in half of the cases, and post-abortion hospitalization was reported by approximately half of the women who aborted. These results lead to the conclusion that abortion is a priority in the Brazilian public health agenda.

  13. Test description and preliminary pitot-pressure surveys for Langley Test Technique Demonstrator at Mach 6

    Science.gov (United States)

    Everhart, Joel L.; Ashby, George C., Jr.; Monta, William J.

    1992-01-01

    A propulsion/airframe integration experiment conducted in the NASA Langley 20-Inch Mach 6 Tunnel using a 16.8-in.-long version of the Langley Test Technique Demonstrator configuration with simulated scramjet propulsion is described. Schlieren and vapor screen visualization of the nozzle flow field is presented and correlated with pitot-pressure flow-field surveys. The data were obtained at nominal free-stream conditions of Re = 2.8 × 10^6 and a nominal engine total pressure of 100 psia. It is concluded that pitot-pressure surveys, coupled with schlieren and vapor-screen photographs and oil flows, have revealed flow features including vortices, free shear layers, and shock waves occurring in the model flow field.

  14. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    Science.gov (United States)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel
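
The "discovered vs. discoverable" bookkeeping described above can be sketched as a toy detection filter. The limiting magnitude, trailing limit and minimum-detection count below are invented thresholds, not parameters of any real survey.

```python
# Invented survey-model thresholds (assumptions, for illustration only).
LIMITING_MAG = 21.5             # fainter detections are rejected
MAX_RATE_DEG_PER_DAY = 5.0      # faster objects trail too much to detect
MIN_DETECTIONS = 3              # detections needed to link a moving-object track

def triage(objects):
    """Split objects into those the survey links (discovered) and those it
    images but fails to link (discoverable only)."""
    discovered, discoverable_only = [], []
    for obj in objects:
        usable = [d for d in obj["detections"]
                  if d["mag"] <= LIMITING_MAG and d["rate"] <= MAX_RATE_DEG_PER_DAY]
        if len(usable) >= MIN_DETECTIONS:
            discovered.append(obj["name"])
        elif obj["detections"]:            # in the FOV, but cuts/linking failed
            discoverable_only.append(obj["name"])
    return discovered, discoverable_only

objs = [
    {"name": "bright-slow", "detections": [{"mag": 20.1, "rate": 1.2}] * 4},
    {"name": "too-faint",   "detections": [{"mag": 22.3, "rate": 0.8}] * 4},
]
print(triage(objs))  # (['bright-slow'], ['too-faint'])
```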

  15. Array-based techniques for fingerprinting medicinal herbs

    Directory of Open Access Journals (Sweden)

    Xue Charlie

    2011-05-01

    Full Text Available Poor quality control of medicinal herbs has led to instances of toxicity, poisoning and even deaths. The fundamental step in quality control of herbal medicine is accurate identification of herbs. Array-based techniques have recently been adapted to authenticate or identify herbal plants. This article reviews the current array-based techniques, e.g. oligonucleotide microarrays, gene-based probe microarrays, Suppression Subtractive Hybridization (SSH)-based arrays, Diversity Array Technology (DArT) and Subtracted Diversity Array (SDA). We further compare these techniques according to important parameters such as markers, polymorphism rates, restriction enzymes and sample type. The applicability of the array-based methods for fingerprinting depends on the available genomic and genetic information for the species to be fingerprinted. For species with little genome sequence information but high polymorphism rates, SDA techniques are particularly recommended because they require less labour and lower material cost.

  16. Survey of Analysis of Crime Detection Techniques Using Data Mining and Machine Learning

    Science.gov (United States)

    Prabakaran, S.; Mitra, Shilpa

    2018-04-01

    Data mining is the field containing procedures for finding patterns in huge datasets; it includes strategies at the convergence of machine learning and database systems. It can be applied to various fields like future healthcare, market basket analysis, education, manufacturing engineering, crime investigation, etc. Among these, crime investigation is an interesting application that processes crime characteristics to help society towards better living. This paper surveys various data mining techniques used in this domain. This study may be helpful in designing new strategies for crime prediction and analysis.

  17. A technique for reducing diverse habits survey data and its application to seafood consumption near Winfrith

    International Nuclear Information System (INIS)

    Smith, B.D.; Hunt, G.J.

    1989-01-01

    Habits surveys provide basic information to enable doses to appropriate critical groups of members of the public to be assessed. In some cases, the relevant habits of those to be included in the critical group can be quite diverse, and a simplifying method may be needed. A technique for this is described, and exemplified in relation to liquid radioactive waste discharges from AEE Winfrith, an area where the range of seafoods and radionuclide concentrations in them result in a wide variation of doses. Weighted mean consumption rates are derived for the critical group, and an example of their application in setting a revised liquid discharge authorisation is given. (author)
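
A weighted mean consumption rate of the kind derived in the paper can be sketched as follows; the consumption rates and weights are invented examples, not data from the Winfrith survey.

```python
def weighted_mean(rates, weights):
    """Weighted mean consumption rate over surveyed individuals."""
    return sum(r * w for r, w in zip(rates, weights)) / sum(weights)

# Hypothetical seafood consumption rates (kg/y) for three surveyed consumers,
# with illustrative relative weights assigned to each individual.
fish_rates = [50.0, 100.0, 150.0]
weights    = [0.2, 0.5, 0.3]
print(weighted_mean(fish_rates, weights))  # 105.0 kg/y for the critical group
```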

  18. A Survey of Technologies Supporting Virtual Project Based Learning

    DEFF Research Database (Denmark)

    Dirckinck-Holmfeld, Lone

    2002-01-01

    This paper describes a survey of technologies and to what extent they support virtual project based learning. The paper argues that a survey of learning technologies should be related to concrete learning tasks and processes. Problem oriented project pedagogy (POPP) is discussed, and a framework for evaluation is proposed where negotiation of meaning, coordination and resource management are identified as the key concepts in virtual project based learning. Three e-learning systems are selected for the survey, Virtual-U, Lotus Learningspace and Lotus Quickplace, as each system offers different strategies for e-learning. The paper concludes that virtual project based learning may benefit from facilities of all these systems.

  19. Comparing econometric and survey-based methodologies in measuring offshoring

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    such as the national or regional level. Most macro analyses are based on proxies and trade statistics with limitations. Drawing on unique Danish survey data, this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of employment...

  20. Base Oils Biodegradability Prediction with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Malika Trabelsi

    2010-02-01

    Full Text Available In this paper, we apply various data mining techniques, including continuous numeric and discrete classification prediction models of base oil biodegradability, with emphasis on improving prediction accuracy. The results show that highly biodegradable oils can be better predicted through numeric models. In contrast, classification models did not uncover a similar dichotomy. With the exception of Memory Based Reasoning and Decision Trees, the tested classification techniques achieved high classification accuracy. However, the Decision Trees technique helped uncover the most significant predictor. A simple classification rule derived from this predictor resulted in good classification accuracy. The application of this rule enables efficient classification of base oils into either low or high biodegradability classes with high accuracy. For the latter, a higher-precision biodegradability prediction can be obtained using continuous modeling techniques.
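
The "simple classification rule derived from the top predictor" idea can be sketched as a one-threshold rule on a single feature, with the threshold chosen to maximize training accuracy. The feature values and class labels below are invented.

```python
def best_threshold_rule(values, labels):
    """Return (threshold, accuracy) for the rule: predict 'high' when
    value >= threshold, scanning candidate thresholds from the data."""
    best = (None, 0.0)
    for t in sorted(set(values)):
        correct = sum((v >= t) == (y == "high") for v, y in zip(values, labels))
        acc = correct / len(labels)
        if acc > best[1]:
            best = (t, acc)
    return best

# Hypothetical top predictor (e.g. some structural descriptor) vs class label.
vals   = [0.1, 0.3, 0.4, 0.6, 0.7, 0.9]
labels = ["low", "low", "low", "high", "high", "high"]
print(best_threshold_rule(vals, labels))  # (0.6, 1.0): perfectly separable toy data
```

This is the degenerate one-split case of a decision tree: interpretable, cheap to apply, and a useful baseline before reaching for continuous models.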

  1. Conducting Surveys and Data Collection: From Traditional to Mobile and SMS-based Surveys

    Directory of Open Access Journals (Sweden)

    Iftikhar Alam

    2014-08-01

    Full Text Available Fresh, bias-free and valid data collected using different survey modes is considered an essential requirement for the smooth functioning and evolution of an organization. Surveys play a major role in making timely, correct decisions and generating reports. The aim of this study is to compare and investigate the state of the art in different survey modes, including print, email, online, mobile and SMS-based surveys. Results indicated that existing methods are neither complete nor sufficient to fulfil the overall requirements of an organization which primarily relies on surveys. They also show that SMS is a dominant method for data collection due to its pervasiveness. However, existing SMS-based data collection has limitations such as the limited number of characters per SMS, a single question per SMS and lack of multimedia support. Recent trends in data collection emphasize data-collection applications for smartphones; however, in developing countries low-end mobile devices are still extensively used, which makes collecting data from the man in the street difficult. The paper concludes that existing survey modes and methods should be improved to get maximum responses quickly and at low cost. The study has contributed to the area of surveying and data collection by analysing different factors such as cost, time and response rate. The results of this study can help practitioners create a more successful surveying method for data collection that can be effectively used for low-budget projects in developed as well as developing countries.
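
One of the SMS limitations noted above, the per-message character limit, is commonly worked around by segmenting long questions. A minimal sketch (the "(i/n)" part-header convention is an assumption for illustration, not a standard):

```python
SMS_LIMIT = 160  # classic single-SMS character budget (GSM 7-bit alphabet)

def to_sms_parts(text):
    """Split `text` into numbered segments that each fit in one SMS."""
    prefix_len = len("(9/9) ")             # reserve room for the part header
    body = SMS_LIMIT - prefix_len
    chunks = [text[i:i + body] for i in range(0, len(text), body)]
    n = len(chunks)
    return [f"({i}/{n}) {c}" for i, c in enumerate(chunks, 1)]

parts = to_sms_parts("Q1: " + "How satisfied are you with our service? " * 10)
print(len(parts), all(len(p) <= SMS_LIMIT for p in parts))  # 3 True
```

A real deployment would also account for multi-byte characters (UCS-2 halves the budget) and for two-digit part counts, which this sketch ignores.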

  2. Telephone survey to investigate relationships between onychectomy or onychectomy technique and house soiling in cats.

    Science.gov (United States)

    Gerard, Amanda F; Larson, Mandy; Baldwin, Claudia J; Petersen, Christine

    2016-09-15

    OBJECTIVE To determine whether associations existed between onychectomy or onychectomy technique and house soiling in cats. DESIGN Cross-sectional study. SAMPLE 281 owners of 455 cats in Polk County, Iowa, identified via a list of randomly selected residential phone numbers of cat owners in that region. PROCEDURES A telephone survey was conducted to collect information from cat owners on factors hypothesized a priori to be associated with house soiling, including cat sex, reproductive status, medical history, and onychectomy history. When cats that had undergone onychectomy were identified, data were collected regarding the cat's age at the time of the procedure and whether a carbon dioxide laser (CDL) had been used. Information on history of house soiling behavior (urinating or defecating outside the litter box) was also collected. RESULTS Onychectomy technique was identified as a risk factor for house soiling. Cats for which a non-CDL technique was used had a higher risk of house soiling than cats for which the CDL technique was used. Cats that had undergone onychectomy and that lived in a multicat (3 to 5 cats) household were more than 3 times as likely to have house soiled as were single-housed cats with intact claws. CONCLUSIONS AND CLINICAL RELEVANCE Results of this cross-sectional study suggested that use of the CDL technique for onychectomy could decrease the risk of house soiling by cats relative to the risk associated with other techniques. This and other findings can be used to inform the decisions of owners and veterinarians when considering elective onychectomy for cats.

  3. Improvement of digital image watermarking techniques based on FPGA implementation

    International Nuclear Information System (INIS)

    EL-Hadedy, M.E

    2006-01-01

    Digital watermarking provides proof of ownership of a piece of digital data by marking the considered data invisibly or visibly. This can be used to protect several types of multimedia objects such as audio, text, image and video. This thesis demonstrates the different types of watermarking techniques, such as the discrete cosine transform (DCT) and the discrete wavelet transform (DWT), and their characteristics. It then classifies these techniques, stating their advantages and disadvantages. An improved technique with distinguished features, such as peak signal-to-noise ratio (PSNR) and similarity ratio (SR), has been introduced. The modified technique has been compared with the other techniques by measuring their robustness against different attacks. Finally, a field programmable gate array (FPGA) based implementation and comparison for the proposed watermarking technique have been presented and discussed

  4. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Cremers, D.A.; Archuleta, F.L.; Dilworth, H.C.

    1985-01-01

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  5. Survey of agents and techniques applicable to the solidification of low-level radioactive wastes

    International Nuclear Information System (INIS)

    Fuhrmann, M.; Neilson, R.M. Jr.; Colombo, P.

    1981-12-01

    A review of the various solidification agents and techniques that are currently available or potentially applicable for the solidification of low-level radioactive wastes is presented. An overview of the types and quantities of low-level wastes produced is presented. Descriptions of waste form matrix materials, the waste types to which they have been or may be applied, and available information concerning relevant waste form properties and characteristics follow. Also included are descriptions of the processing techniques themselves, with an emphasis on those operating parameters which impact upon waste form properties. The solidification agents considered in this survey include: hydraulic cements, thermoplastic materials, thermosetting polymers, glasses, synthetic minerals and composite materials. This survey is part of a program supported by the United States Department of Energy's Low-Level Waste Management Program (LLWMP). This work provides input into LLWMP efforts to develop and compile information relevant to the treatment and processing of low-level wastes and their disposal by shallow land burial

  6. Survey of agents and techniques applicable to the solidification of low-level radioactive wastes

    Energy Technology Data Exchange (ETDEWEB)

    Fuhrmann, M.; Neilson, R.M. Jr.; Colombo, P.

    1981-12-01

    A review of the various solidification agents and techniques that are currently available or potentially applicable for the solidification of low-level radioactive wastes is presented. An overview of the types and quantities of low-level wastes produced is presented. Descriptions of waste form matrix materials, the waste types to which they have been or may be applied, and available information concerning relevant waste form properties and characteristics follow. Also included are descriptions of the processing techniques themselves, with an emphasis on those operating parameters which impact upon waste form properties. The solidification agents considered in this survey include: hydraulic cements, thermoplastic materials, thermosetting polymers, glasses, synthetic minerals and composite materials. This survey is part of a program supported by the United States Department of Energy's Low-Level Waste Management Program (LLWMP). This work provides input into LLWMP efforts to develop and compile information relevant to the treatment and processing of low-level wastes and their disposal by shallow land burial.

  7. A survey on the application of robot techniques to an atomic power plant

    International Nuclear Information System (INIS)

    Hasegawa, Tsutomu; Sato, Tomomasa; Hirai, Shigeoki; Suehiro, Takashi; Okada, Tokuji

    1982-01-01

    Tasks of workers in atomic power plants have been surveyed from the viewpoint of the necessity and possibility of their robotization. The daily tasks are classified into the following: (1) plant operation; (2) periodical examination; (3) patrol and inspection; (4) in-service inspection; (5) maintenance and repair; (6) examination and production of the fuel; (7) waste disposal; (8) decommissioning of the plant. The necessity and present status of robotization in atomic power plants are investigated according to the following classification: (1) inspection robots; (2) patrol inspection/maintenance robots; (3) hot cell robots; (4) plant decommissioning robots. The following have been made clear through the survey: (1) various kinds of tasks are necessary in an atomic power plant; (2) because most of the tasks take place in intense radiation environments, it is necessary to introduce robots into atomic power plants; (3) in applying robots to atomic power plant systems, it is necessary to take account of various severe conditions concerning spatial restrictions, radiation endurance and reliability. Lastly, the wide applicability of the techniques of knowledge robots, which operate interactively with humans, has been confirmed as a result of the survey. (author)

  8. Vision-based Vehicle Detection Survey

    Directory of Open Access Journals (Sweden)

    Alex David S

    2016-03-01

    Full Text Available Nowadays thousands of drivers and passengers lose their lives every year in road accidents, due to deadly crashes between vehicles. Over the past decade, many research efforts have been dedicated to the development of intelligent driver assistance systems and autonomous vehicles, which reduce the danger by monitoring the on-road environment. In particular, researchers have been attracted to the on-road detection of vehicles in recent years. Different parameters are analyzed in this paper, including camera placement, the various applications of monocular vehicle detection, common features and common classification methods, motion-based approaches, nighttime vehicle detection and monocular pose estimation. Previous works on vehicle detection are listed based on camera positions, feature-based detection, motion-based detection and nighttime detection.

  9. EBR Strengthening Technique for Concrete, Long-Term Behaviour and Historical Survey

    Directory of Open Access Journals (Sweden)

    Christoph Czaderski

    2018-01-01

    Full Text Available Epoxy-bonded steel plates (externally bonded reinforcement, EBR) for the strengthening of concrete structures were introduced to the construction industry in the late 1960s, and the use of fibre reinforced polymers (FRPs) was introduced in the 1990s, which means that these techniques have already been used in construction for 50 and 25 years, respectively. In the first part of the paper, a historical survey of the development and introduction of these strengthening techniques into the construction industry is presented. The monitoring of such applications in construction is very important and gives more confidence in this strengthening technique. Therefore, in the second part of the paper, two long-term monitoring campaigns of extraordinarily long duration are presented: firstly, a 47-year monitoring campaign on a concrete beam with an epoxy-bonded steel plate and, secondly, a 20-year monitoring campaign on a road bridge with epoxy-bonded CFRP (carbon fibre reinforced polymer) strips. The paper is an expanded version of the paper presented at the SMAR2017 Conference.

  10. Power system stabilizers based on modern control techniques

    Energy Technology Data Exchange (ETDEWEB)

    Malik, O P; Chen, G P; Zhang, Y; El-Metwally, K [Calgary Univ., AB (Canada). Dept. of Electrical and Computer Engineering

    1994-12-31

    Developments in digital technology have made it feasible to develop and implement improved controllers based on sophisticated control techniques. Power system stabilizers based on adaptive control, fuzzy logic and artificial neural networks are being developed. Each of these control techniques possesses unique features and strengths. In this paper, the relative performance of power system stabilizers based on adaptive control, fuzzy logic and neural networks, both in simulation studies and in real-time tests on a physical model of a power system, is presented and compared to that of a fixed-parameter conventional power system stabilizer. (author) 16 refs., 45 figs., 3 tabs.

  11. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...
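
The book's parametric (static) half concerns tuning the parameters of a system whose performance can only be sampled by running a noisy simulation. The sketch below shows one classic scheme in that family, finite-difference stochastic approximation (Kiefer-Wolfowitz); the toy objective, step sizes and perturbation widths are illustrative assumptions, not taken from the book.

```python
# Minimal Kiefer-Wolfowitz sketch: estimate a gradient from two noisy
# simulation runs per iteration and descend with decaying step sizes.
import random

def noisy_sim(x):
    """Toy 'simulation': a noisy measurement of (x - 3)^2."""
    return (x - 3.0) ** 2 + random.gauss(0, 0.01)

def kiefer_wolfowitz(x0, iters=2000):
    """Minimize the expected simulation output via finite differences."""
    x = x0
    for k in range(1, iters + 1):
        a = 0.5 / k           # decaying step size
        c = 1.0 / k ** 0.25   # decaying perturbation width
        # Two simulation runs give a noisy central-difference gradient
        grad = (noisy_sim(x + c) - noisy_sim(x - c)) / (2 * c)
        x -= a * grad
    return x

random.seed(0)
print(kiefer_wolfowitz(0.0))  # close to the true minimizer x = 3
```

The decaying perturbation width trades gradient bias against noise amplification, which is the core design choice in this class of methods.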

  12. A Survey on Infrastructure-Based Vehicular Networks

    Directory of Open Access Journals (Sweden)

    Cristiano M. Silva

    2017-01-01

    Full Text Available The infrastructure of vehicular networks plays a major role in realizing the full potential of vehicular communications. More and more vehicles are connected to the Internet and to each other, driving new technological transformations in a multidisciplinary way. Researchers in automotive/telecom industries and academia are joining their effort to provide their visions and solutions to increasingly complex transportation systems, also envisioning a myriad of applications to improve the driving experience and the mobility. These trends pose significant challenges to the communication systems: low latency, higher throughput, and increased reliability have to be granted by the wireless access technologies and by a suitable (possibly dedicated infrastructure. This paper presents an in-depth survey of more than ten years of research on infrastructures, wireless access technologies and techniques, and deployment that make vehicular connectivity available. In addition, we identify the limitations of present technologies and infrastructures and the challenges associated with such infrastructure-based vehicular communications, also highlighting potential solutions.

  13. A micro-controller based wide range survey meter

    International Nuclear Information System (INIS)

    Bhingare, R.R.; Bajaj, K.C.; Kannan, S.

    2004-01-01

    Wide range survey meters (1 μSv/h to 10 Sv/h) with the detector(s) mounted at the end of a two-to-four-meter-long extendable tube are widely used for radiation protection surveys of difficult-to-reach locations and high dose rate areas. The commercially available survey meters of this type use two GM counters to cover a wide range of dose rate measurement. A new micro-controller based wide range survey meter using two Si diode detectors has been developed. The use of solid state detectors in the survey meter has a number of advantages, such as low power consumption, a lighter battery-powered detector probe, and the elimination of the high voltage needed to operate GM detectors. The design uses infrared communication between the probe and the readout unit through a light-weight collapsible extension tube for high reliability. The design details and features are discussed in detail. (author)

  14. An Image Registration Based Technique for Noninvasive Vascular Elastography

    OpenAIRE

    Valizadeh, Sina; Makkiabadi, Bahador; Mirbagheri, Alireza; Soozande, Mehdi; Manwar, Rayyan; Mozaffarzadeh, Moein; Nasiriavanaki, Mohammadreza

    2018-01-01

    Non-invasive vascular elastography is an emerging technique in vascular tissue imaging. During the past decades, several techniques have been suggested to estimate tissue elasticity by measuring the displacement of the carotid vessel wall. Cross correlation-based methods are the most prevalent approaches to measure the strain exerted on the vessel wall by the blood pressure. In the case of a low pressure, the displacement is too small to be apparent in ultrasound imaging, especially in th...
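
The cross correlation-based displacement estimation the abstract refers to can be sketched in one dimension: slide one signal over the other and keep the lag with the highest normalized cross-correlation. The signals and lag search below are illustrative assumptions, not the authors' implementation:

```python
# 1-D sketch of cross correlation-based displacement estimation,
# as used in elastography to find how far a signal has moved.

def best_lag(ref, shifted, max_lag):
    """Return the lag (in samples) maximizing normalized cross-correlation."""
    def ncc(a, b):
        n = min(len(a), len(b))
        a, b = a[:n], b[:n]
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        da = sum((x - ma) ** 2 for x in a) ** 0.5
        db = sum((y - mb) ** 2 for y in b) ** 0.5
        return num / (da * db) if da and db else 0.0
    scores = {}
    for lag in range(-max_lag, max_lag + 1):
        # Positive lag means 'shifted' is delayed relative to 'ref'
        scores[lag] = ncc(ref, shifted[lag:]) if lag >= 0 else ncc(ref[-lag:], shifted)
    return max(scores, key=scores.get)

pulse = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0]
moved = [0, 0, 0, 1, 4, 9, 4, 1, 0, 0]  # same pulse, displaced by 2 samples
print(best_lag(pulse, moved, max_lag=3))  # → 2
```

In practice the correlation is computed over small windows along the vessel wall, and sub-sample displacements are recovered by interpolating around the peak.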

  15. Memory Based Machine Intelligence Techniques in VLSI hardware

    OpenAIRE

    James, Alex Pappachen

    2012-01-01

    We briefly introduce the memory based approaches to emulate machine intelligence in VLSI hardware, describing the challenges and advantages. Implementation of artificial intelligence techniques in VLSI hardware is a practical and difficult problem. Deep architectures, hierarchical temporal memories and memory networks are some of the contemporary approaches in this area of research. The techniques attempt to emulate low level intelligence tasks and aim at providing scalable solutions to high ...

  16. The Research of Histogram Enhancement Technique Based on Matlab Software

    Directory of Open Access Journals (Sweden)

    Li Kai

    2014-08-01

    Full Text Available Histogram enhancement technique has been widely applied as a typical pattern in digital image processing. The paper is based on Matlab software, through the two ways of histogram equalization and histogram specification technologies to deal with the darker images, using two methods of partial equilibrium and mapping histogram to transform the original histograms, thereby enhanced the image information. The results show that these two kinds of techniques both can significantly improve the image quality and enhance the image feature.
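
As a rough illustration of the first of the two techniques the abstract applies (histogram equalization, there in Matlab), a plain-Python sketch for an 8-bit grayscale image; the tiny sample 'image' is invented for demonstration:

```python
# Histogram equalization: remap gray levels through the normalized
# cumulative histogram so a dark image spans the full intensity range.

def equalize_histogram(pixels, levels=256):
    """Equalize a flat list of gray-level values in [0, levels - 1]."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Cumulative distribution function of the gray levels
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # Standard equalization mapping onto the full [0, levels - 1] range
    lut = [round((c - cdf_min) / (n - cdf_min) * (levels - 1)) if n > cdf_min else 0
           for c in cdf]
    return [lut[p] for p in pixels]

dark = [10, 10, 10, 50, 50, 90]   # values crowded near black
print(equalize_histogram(dark))   # → [0, 0, 0, 170, 170, 255]
```

Histogram specification, the abstract's second technique, generalizes this by mapping the cumulative histogram onto a chosen target distribution rather than a uniform one.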

  17. Worldwide enucleation techniques and materials for treatment of retinoblastoma: an international survey.

    Directory of Open Access Journals (Sweden)

    Daphne L Mourits

    Full Text Available To investigate the current practice of enucleation with or without an orbital implant for retinoblastoma in countries across the world. A digital survey identified operation techniques and material used for orbital implants after enucleation in patients with retinoblastoma. We received responses from 58 surgeons in 32 different countries. A primary artificial implant is routinely inserted by 42 (72.4%) surgeons. Ten (17.2%) surgeons leave the socket empty, and three (5.2%) decide per case. Other surgeons insert a dermis fat graft as a standard primary implant (n=1), or fill the socket in a standard secondary procedure (n=2; one uses dermis fat grafts and one artificial implants). The choice of porous implants was more frequent than that of non-porous implants: 27 (58.7%) and 15 (32.6%), respectively. Both porous and non-porous implant types are used by 4 (8.7%) surgeons. Twenty-five surgeons (54.3%) insert bare implants, 11 (23.9%) use separate wrappings, eight (17.4%) use implants with prefabricated wrapping and two insert implants with and without wrapping depending on the type of implant. Attachment of the muscles to the wrapping or implant (at various locations) is done by 31 (53.4%) surgeons. Eleven (19.0%) use a myoconjunctival technique, nine (15.5%) suture the muscles to each other and seven (12.1%) do not reattach the muscles. Measures to improve volume are implant exchange at an older age (n=4), the use of Restylane SQ (n=1) and osmotic expanders (n=1). Pegging is done by two surgeons. No worldwide consensus exists about the use of materials and techniques for enucleation in the treatment of retinoblastoma. Considerations for the use of different techniques are discussed.

  18. A Brief History of the use of Electromagnetic Induction Techniques in Soil Survey

    Science.gov (United States)

    Brevik, Eric C.; Doolittle, James

    2017-04-01

    Electromagnetic induction (EMI) has been used to characterize the spatial variability of soil properties since the late 1970s. Initially used to assess soil salinity, the use of EMI in soil studies has expanded to include: mapping soil types; characterizing soil water content and flow patterns; assessing variations in soil texture, compaction, organic matter content, and pH; and determining the depth to subsurface horizons, stratigraphic layers or bedrock, among other uses. In all cases the soil property being investigated must influence soil apparent electrical conductivity (ECa) either directly or indirectly for EMI techniques to be effective. An increasing number and diversity of EMI sensors have been developed in response to users' needs and the availability of allied technologies, which have greatly improved the functionality of these tools and increased the amount and types of data that can be gathered with a single pass. EMI investigations provide several benefits for soil studies. The large amount of georeferenced data that can be rapidly and inexpensively collected with EMI provides more complete characterization of the spatial variations in soil properties than traditional sampling techniques. In addition, compared to traditional soil survey methods, EMI can more effectively characterize diffuse soil boundaries and identify included areas of dissimilar soils within mapped soil units, giving soil scientists greater confidence when collecting spatial soil information. EMI techniques do have limitations; results are site-specific and can vary depending on the complex interactions among multiple and variable soil properties. Despite this, EMI techniques are increasingly being used to investigate the spatial variability of soil properties at field and landscape scales. The future should witness a greater use of multiple-frequency and multiple-coil EMI sensors and integration with other sensors to assess the spatial variability of soil properties. 

  19. Environmental monitoring using autonomous vehicles: a survey of recent searching techniques.

    Science.gov (United States)

    Bayat, Behzad; Crasta, Naveena; Crespi, Alessandro; Pascoal, António M; Ijspeert, Auke

    2017-06-01

    Autonomous vehicles are becoming an essential tool in a wide range of environmental applications that include ambient data acquisition, remote sensing, and mapping of the spatial extent of pollutant spills. Among these applications, pollution source localization has drawn increasing interest due to its scientific and commercial interest and the emergence of a new breed of robotic vehicles capable of operating in harsh environments without human supervision. The aim is to find the location of a region that is the source of a given substance of interest (e.g. a chemical pollutant at sea or a gas leakage in air) using a group of cooperative autonomous vehicles. Motivated by fast paced advances in this challenging area, this paper surveys recent advances in searching techniques that are at the core of environmental monitoring strategies using autonomous vehicles. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Survey of NDT techniques, services, qualifications and certification of NDT personnel-preliminary results

    International Nuclear Information System (INIS)

    Aleta, C.R.; Kinilitan, V.E.; Lailo, R.M.

    1987-01-01

    This paper presents the results of a survey conducted to determine the profile of the NDT industry, including its problems. A questionnaire was designed in three parts: 1) present practices in qualification and certification, 2) NDT equipment, and 3) services and problems in NDT. Of the 36 firms contacted, only 20 responded. Results indicated the following: a) most firms are engaged in four (4) main techniques, RT, UT, MT and PT; only 2 indicated capability in ET; b) level III personnel are relatively few in number; c) most firms follow the ASNT recommendation as a basis for their qualification and certification, are in favor of standardization of the qualification and certification process, and are supportive of a national center for the training of NDT personnel; and d) most firms perceived a lack of adequate repair/maintenance skills/facilities, followed by the high cost of equipment and the lack of a national standard for qualification and certification. (ELC)

  1. Laser-based direct-write techniques for cell printing

    Energy Technology Data Exchange (ETDEWEB)

    Schiele, Nathan R; Corr, David T [Biomedical Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States); Huang Yong [Department of Mechanical Engineering, Clemson University, Clemson, SC (United States); Raof, Nurazhani Abdul; Xie Yubing [College of Nanoscale Science and Engineering, University at Albany, SUNY, Albany, NY (United States); Chrisey, Douglas B, E-mail: schien@rpi.ed, E-mail: chrisd@rpi.ed [Material Science and Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States)

    2010-09-15

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. (topical review)

  2. Laser-based direct-write techniques for cell printing

    International Nuclear Information System (INIS)

    Schiele, Nathan R; Corr, David T; Huang Yong; Raof, Nurazhani Abdul; Xie Yubing; Chrisey, Douglas B

    2010-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. (topical review)

  3. Analgesic techniques in minor painful procedures in neonatal units: a survey in northern Italy.

    Science.gov (United States)

    Codipietro, Luigi; Bailo, Elena; Nangeroni, Marco; Ponzone, Alberto; Grazia, Giuseppe

    2011-01-01

    The aim of this survey was to evaluate the current practice regarding pain assessment and pain management strategies adopted in commonly performed minor painful procedures in Northern Italian Neonatal Intensive Care Units (NICUs). A multicenter survey was conducted between 2008 and 2009 in 35 NICUs. The first part of the survey form covered pain assessment tools, the timing of analgesics, and the availability of written guidelines. A second section evaluated the analgesic strategies adopted in commonly performed painful procedures. The listed analgesic procedures were as follows: oral sweet solutions alone, non-nutritive sucking (NNS) alone, a combination of sweet solutions and NNS, breast-feeding where available, and topical anesthetics. Completed questionnaires were returned from 30 neonatal units (85.7% response rate). Ten of the 30 NICUs reported using pain assessment tools for minor invasive procedures. Neonatal Infant Pain Scale was the most frequently used pain scale (60%). Twenty neonatal units had written guidelines directing pain management practices. The most frequently used procedures were pacifiers alone (69%), followed by sweet-tasting solutions (58%). A 5% glucose solution was the most frequently utilized sweet-tasting solution (76.7%). A minority of NICUs (16.7%) administered 12% sucrose solutions for analgesia and the application of topical anesthetics was found in 27% of NICUs while breast-feeding was performed in 7% of NICUs. This study found a low adherence to national and international guidelines for analgesia in minor procedures: the underuse of neonatal pain scales (33%), sucrose solution administration before heel lance (23.3%), topical anesthetics before venipuncture, or other analgesic techniques. The presence of written pain control guidelines in these regions of Northern Italy increased in recent years (from 25% to 66%). © 2010 World Institute of Pain.

  4. Search Techniques for the Web of Things: A Taxonomy and Survey

    Science.gov (United States)

    Zhou, Yuchao; De, Suparna; Wang, Wei; Moessner, Klaus

    2016-01-01

    The Web of Things aims to make physical world objects and their data accessible through standard Web technologies to enable intelligent applications and sophisticated data analytics. Due to the amount and heterogeneity of the data, it is challenging to perform data analysis directly; especially when the data is captured from a large number of distributed sources. However, the size and scope of the data can be reduced and narrowed down with search techniques, so that only the most relevant and useful data items are selected according to the application requirements. Search is fundamental to the Web of Things while challenging by nature in this context, e.g., mobility of the objects, opportunistic presence and sensing, continuous data streams with changing spatial and temporal properties, efficient indexing for historical and real time data. The research community has developed numerous techniques and methods to tackle these problems as reported by a large body of literature in the last few years. A comprehensive investigation of the current and past studies is necessary to gain a clear view of the research landscape and to identify promising future directions. This survey reviews the state-of-the-art search methods for the Web of Things, which are classified according to three different viewpoints: basic principles, data/knowledge representation, and contents being searched. Experiences and lessons learned from the existing work and some EU research projects related to Web of Things are discussed, and an outlook to the future research is presented. PMID:27128918

  5. Search Techniques for the Web of Things: A Taxonomy and Survey

    Directory of Open Access Journals (Sweden)

    Yuchao Zhou

    2016-04-01

    Full Text Available The Web of Things aims to make physical world objects and their data accessible through standard Web technologies to enable intelligent applications and sophisticated data analytics. Due to the amount and heterogeneity of the data, it is challenging to perform data analysis directly; especially when the data is captured from a large number of distributed sources. However, the size and scope of the data can be reduced and narrowed down with search techniques, so that only the most relevant and useful data items are selected according to the application requirements. Search is fundamental to the Web of Things while challenging by nature in this context, e.g., mobility of the objects, opportunistic presence and sensing, continuous data streams with changing spatial and temporal properties, efficient indexing for historical and real time data. The research community has developed numerous techniques and methods to tackle these problems as reported by a large body of literature in the last few years. A comprehensive investigation of the current and past studies is necessary to gain a clear view of the research landscape and to identify promising future directions. This survey reviews the state-of-the-art search methods for the Web of Things, which are classified according to three different viewpoints: basic principles, data/knowledge representation, and contents being searched. Experiences and lessons learned from the existing work and some EU research projects related to Web of Things are discussed, and an outlook to the future research is presented.

  6. Advantages and limitations of web-based surveys: evidence from a child mental health survey.

    Science.gov (United States)

    Heiervang, Einar; Goodman, Robert

    2011-01-01

    Web-based surveys may have advantages related to the speed and cost of data collection as well as data quality. However, they may be biased by low and selective participation. We predicted that such biases would distort point-estimates such as average symptom level or prevalence but not patterns of associations with putative risk-factors. A structured psychiatric interview was administered to parents in two successive surveys of child mental health. In 2003, parents were interviewed face-to-face, whereas in 2006 they completed the interview online. In both surveys, interviews were preceded by paper questionnaires covering child and family characteristics. The rate of parents logging onto the web site was comparable to the response rate for face-to-face interviews, but the rate of full response (completing all sections of the interview) was much lower for web-based interviews. Full response was less frequent for non-traditional families, immigrant parents, and less educated parents. Participation bias affected point estimates of psychopathology but had little effect on associations with putative risk factors. The time and cost of full web-based interviews was only a quarter of that for face-to-face interviews. Web-based surveys may be performed faster and at lower cost than more traditional approaches with personal interviews. Selective participation seems a particular threat to point estimates of psychopathology, while patterns of associations are more robust.

  7. Estimate-Merge-Technique-based algorithms to track an underwater ...

    Indian Academy of Sciences (India)

    D V A N Ravi Kumar

    2017-07-04

    Jul 4, 2017 ... In this paper, two novel methods based on the Estimate Merge Technique ... mentioned advantages of the proposed novel methods is shown by carrying out Monte Carlo simulation in .... equations are converted to sequential equations to make ... estimation error and low convergence time) at feasibly high.

  8. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    24

    This study shows the potency of two GIS-based data-driven bivariate techniques, namely ... In the view of these weaknesses, there is a strong requirement for reassessment of ... West Bengal (India) using remote sensing, geographical information system and multi-.

  9. Learning Physics through Project-Based Learning Game Techniques

    Science.gov (United States)

    Baran, Medine; Maskan, Abdulkadir; Yasar, Seyma

    2018-01-01

    The aim of the present study, in which project and game techniques are used together, is to examine the impact of project-based learning games on students' physics achievement. Participants of the study consisted of 34 9th grade students (N = 34). The data were collected using achievement tests and a questionnaire. Throughout the applications, the…

  10. Ultrabroadband Phased-Array Receivers Based on Optical Techniques

    Science.gov (United States)

    2016-02-26

    bandwidths, and with it receiver noise floors , are unavoidable. Figure 1. SNR of a thermally limited receiver based on Friis equation showing the...techniques for RF and photonic integration based on liquid crystal polymer substrates were pursued that would aid in the realization of potential imaging...These models assumed that sufficient LNA gain was used on the antenna to set the noise floor of the imaging receiver, which necessitated physical

  11. Optical supervised filtering technique based on Hopfield neural network

    Science.gov (United States)

    Bal, Abdullah

    2004-11-01

    Hopfield neural networks are commonly preferred for optimization problems. In image segmentation, conventional Hopfield neural networks (HNN) are formulated as a cost-function-minimization problem to perform gray level thresholding on the image histogram or the pixels' gray levels arranged in a one-dimensional array [R. Sammouda, N. Niki, H. Nishitani, Pattern Rec. 30 (1997) 921-927; K.S. Cheng, J.S. Lin, C.W. Mao, IEEE Trans. Med. Imag. 15 (1996) 560-567; C. Chang, P. Chung, Image and Vision Comp. 19 (2001) 669-678]. In this paper, a new high-speed supervised filtering technique is proposed for image feature extraction and enhancement problems by modifying the conventional HNN. The essential improvement in this technique is the use of a 2D convolution operation instead of weight-matrix multiplication. Thereby, a new neural-network-based filtering technique has been obtained that requires just a 3 × 3 filter mask matrix instead of a large weight coefficient matrix. Optical implementation of the proposed filtering technique is easily executed using the joint transform correlator. The requirement of non-negative data for optical implementation is met by a bias technique that converts the bipolar data to non-negative data. Simulation results of the proposed optical supervised filtering technique are reported for various feature extraction problems such as edge detection, corner detection, horizontal and vertical line extraction, and fingerprint enhancement.
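
    The core improvement described above, replacing the weight-matrix multiplication with a 2D convolution using a small mask, can be sketched in isolation. A minimal NumPy illustration, with a hypothetical Sobel-style vertical-edge mask standing in for a trained 3 × 3 filter; the mask values and test image are illustrative, not from the paper:

```python
import numpy as np

def filter2d(image, mask):
    """Filter an image by 2D convolution with a 3x3 mask (zero padding)."""
    m = np.flipud(np.fliplr(mask))           # flip kernel: convolution, not correlation
    padded = np.pad(image, 1)                # zero-pad one pixel on each side
    out = np.zeros(image.shape, dtype=float)
    for r in range(image.shape[0]):
        for c in range(image.shape[1]):
            out[r, c] = np.sum(padded[r:r + 3, c:c + 3] * m)
    return out

# Hypothetical Sobel-style vertical-edge mask standing in for a trained filter
edge_mask = np.array([[-1.0, 0.0, 1.0],
                      [-2.0, 0.0, 2.0],
                      [-1.0, 0.0, 1.0]])

img = np.zeros((8, 8))
img[:, 4:] = 1.0                             # vertical step edge at column 4
edges = filter2d(img, edge_mask)             # responds only around the edge
```

    Only the 9 mask coefficients are stored, versus an N²×N² weight matrix in the conventional HNN formulation.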

  12. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    Directory of Open Access Journals (Sweden)

    Mohamed M. Ibrahim

    2014-01-01

    Full Text Available Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubles memory capacity, and doubles communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.

  13. Video multiple watermarking technique based on image interlacing using DWT.

    Science.gov (United States)

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubles memory capacity, and doubles communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.
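
    A minimal sketch may clarify the general idea of DWT-domain watermarking. It uses a single-level 2D Haar transform and simple additive embedding in the HH band with non-blind extraction; the paper's three-level DWT, Arnold encryption, and image-interlacing scheme are not reproduced here, and `alpha`, the host, and the watermark are illustrative placeholders:

```python
import numpy as np

def haar2d(x):
    """One-level 2D Haar DWT: returns (LL, LH, HL, HH) subbands."""
    a = (x[:, 0::2] + x[:, 1::2]) / 2.0      # horizontal average
    d = (x[:, 0::2] - x[:, 1::2]) / 2.0      # horizontal difference
    LL = (a[0::2, :] + a[1::2, :]) / 2.0
    LH = (a[0::2, :] - a[1::2, :]) / 2.0
    HL = (d[0::2, :] + d[1::2, :]) / 2.0
    HH = (d[0::2, :] - d[1::2, :]) / 2.0
    return LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Exact inverse of haar2d."""
    a = np.zeros((LL.shape[0] * 2, LL.shape[1]))
    d = np.zeros_like(a)
    a[0::2, :] = LL + LH
    a[1::2, :] = LL - LH
    d[0::2, :] = HL + HH
    d[1::2, :] = HL - HH
    x = np.zeros((a.shape[0], a.shape[1] * 2))
    x[:, 0::2] = a + d
    x[:, 1::2] = a - d
    return x

alpha = 4.0                                   # embedding strength (illustrative)
host = np.arange(64, dtype=float).reshape(8, 8)
wm = np.where(np.random.default_rng(0).random((4, 4)) > 0.5, 1.0, -1.0)

# Embed the +/-1 watermark additively in the HH band
LL, LH, HL, HH = haar2d(host)
watermarked = ihaar2d(LL, LH, HL, HH + alpha * wm)

# Non-blind extraction: the original HH band is required
_, _, _, HH_w = haar2d(watermarked)
recovered = np.sign((HH_w - HH) / alpha)
```

    The extraction step above is exactly the non-blind dependence on the host that the paper's interlacing scheme is designed to avoid.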

  14. Site suitability evaluation of an old operating landfill using AHP and GIS techniques and integrated hydrogeological and geophysical surveys.

    Science.gov (United States)

    Saatsaz, Masoud; Monsef, Iman; Rahmani, Mostafa; Ghods, Abdolreza

    2018-02-16

    Because of the outdated methods of common landfill selection, it is imperative to reevaluate the suitability of existing sites. To assess the suitability of the existing waste landfill in Zanjan, Iran, we have used a combination of the analytical hierarchy process (AHP) and GIS techniques, along with fieldwork surveys. Four major criteria and 12 subcriteria were considered, and the AHP was applied to assign the relative importance weights of criteria and subcriteria to each other. Finally, a landfill suitability map was generated and ranked based on the final suitability scores. The results show that the unsuitable areas are around Zanjan, in the middle parts of the plain. By contrast, the most suitable areas are uncultivated areas, located mostly in the west, north, and south. The results also indicate that the present landfill is a highly suitable site. After desk studies, geoelectrical surveys and infiltration measurements were conducted to make the final decision. Double-ring permeability tests confirm the landfill is an acceptable site. The electrical sounding shows that the leachate plume has a width of about 450 m, spreads to a depth of about 55 m, and migrates towards the northeast. Considering the groundwater depth, dry climate, and low infiltration rate of the landfill soils, it can be concluded that leachate plumes will not contaminate groundwater within this decade. The proposed method can be implemented to reevaluate the suitability of any old operating facility containing liquid hazardous materials, such as oil reservoirs, petrol filling stations, heavy industrial tanks, and landfills.
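
    The AHP step described above, deriving criteria weights from a pairwise comparison matrix via the principal eigenvector and checking consistency, can be sketched as follows. The 3 × 3 comparison matrix is hypothetical, not the study's 4-criteria/12-subcriteria hierarchy:

```python
import numpy as np

def ahp_weights(P):
    """Criteria weights (principal eigenvector) and consistency ratio
    for a reciprocal AHP pairwise comparison matrix P."""
    vals, vecs = np.linalg.eig(P)
    k = int(np.argmax(vals.real))                 # Perron (largest) eigenvalue
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()                               # normalize to sum to 1
    n = P.shape[0]
    ci = (vals[k].real - n) / (n - 1)             # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
    return w, ci / ri                             # CR < 0.1: acceptably consistent

# Hypothetical 3-criterion comparison matrix (criteria are placeholders)
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w, cr = ahp_weights(P)
```

    The resulting weights would then multiply the reclassified GIS criterion layers to produce the suitability score surface.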

  15. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    2011-01-01

    The estimation of the line impedance can be used by the control of numerous grid-connected systems, such as active filters, islanding detection techniques, non-linear current controllers, detection of the on/off grid operation mode. Therefore, estimating the line impedance can add extra functions...... into the operation of the grid-connected power converters. This paper describes a quasi passive method for estimating the line impedance of the distribution electricity network. The method uses the model based identification technique to obtain the resistive and inductive parts of the line impedance. The quasi...
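
    A minimal sketch of model-based line impedance identification: assuming a series model v = R·i + L·di/dt, the resistive and inductive parts can be recovered by least squares from sampled waveforms. The signal parameters, assumed true values, and noise-free setting are illustrative, not the paper's quasi-passive method:

```python
import numpy as np

# Synthetic 50 Hz waveforms with assumed true values R = 0.5 ohm, L = 1.2 mH
R_true, L_true = 0.5, 1.2e-3
f, fs = 50.0, 10e3
t = np.arange(0.0, 0.1, 1.0 / fs)
i = 10.0 * np.sin(2 * np.pi * f * t) + 2.0 * np.sin(2 * np.pi * 3 * f * t)
di = np.gradient(i, 1.0 / fs)                 # numerical di/dt
v = R_true * i + L_true * di                  # series R-L line model

# Model-based identification: least-squares fit of v = R*i + L*di/dt
A = np.column_stack([i, di])
(R_est, L_est), *_ = np.linalg.lstsq(A, v, rcond=None)
```

    A harmonic component is included deliberately: persistent excitation at more than one frequency is what makes the two parameters separately identifiable.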

  16. Biometric image enhancement using decision rule based image fusion techniques

    Science.gov (United States)

    Sagayee, G. Mary Amirtha; Arumugam, S.

    2010-02-01

    Introducing biometrics into information systems may result in considerable benefits. Most researchers confirm that the fingerprint is more widely used than the iris or face, and moreover it is the primary choice for most privacy-concerned applications. For fingerprint applications, choosing a proper sensor is a risk. The proposed work addresses how image quality can be improved by introducing an image fusion technique at the sensor level. The results of the images after applying the decision-rule-based image fusion technique are evaluated and analyzed with their entropy levels and root mean square error.
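
    The two evaluation metrics mentioned above, entropy level and root mean square error, can be sketched for fused-image assessment. The test images below are synthetic placeholders, not fingerprint data:

```python
import numpy as np

def entropy_bits(img, bins=256):
    """Shannon entropy (bits) of an 8-bit image's gray-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist[hist > 0] / img.size
    return float(-(p * np.log2(p)).sum())

def rmse(ref, img):
    """Root mean square error between a reference and a test image."""
    return float(np.sqrt(np.mean((ref.astype(float) - img.astype(float)) ** 2)))

rng = np.random.default_rng(1)
flat = np.full((64, 64), 128, dtype=np.uint8)               # uninformative image
textured = rng.integers(0, 256, (64, 64), dtype=np.uint8)   # information-rich image
```

    Higher entropy in the fused image suggests more retained information; lower RMSE against a reference suggests less distortion.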

  17. Electromagnetism based atmospheric ice sensing technique - A conceptual review

    Directory of Open Access Journals (Sweden)

    U Mughal

    2016-09-01

    Full Text Available Electromagnetic and vibrational properties of ice can be used to measure certain parameters such as ice thickness, type and icing rate. In this paper we present a review of dielectric-based measurement techniques for matter and the dielectric/spectroscopic properties of ice. Atmospheric ice is a complex material with a variable dielectric constant, but precise calculation of this constant may form the basis for measurement of its other properties, such as thickness and strength, using electromagnetic methods. Using time domain or frequency domain spectroscopic techniques, by measuring both the reflection and transmission characteristics of atmospheric ice in a particular frequency range, the desired parameters can be determined.

  18. Proposing a Wiki-Based Technique for Collaborative Essay Writing

    Directory of Open Access Journals (Sweden)

    Mabel Ortiz Navarrete

    2014-10-01

    Full Text Available This paper aims at proposing a technique for students learning English as a foreign language when they collaboratively write an argumentative essay in a wiki environment. A wiki environment and collaborative work play an important role within the academic writing task. Nevertheless, an appropriate and systematic work assignment is required in order to make use of both. The technique proposed in this paper for writing a collaborative essay mainly attempts to provide an effective way to enhance equal participation among group members, taking computer-mediated collaboration as its basis. Within this context, the students' role is clearly defined and individual and collaborative tasks are explained.

  19. Knowledge based systems advanced concepts, techniques and applications

    CERN Document Server

    1997-01-01

    The field of knowledge-based systems (KBS) has expanded enormously in recent years, and many important techniques and tools are currently available. Applications of KBS range from medicine to engineering and aerospace. This book provides a selected set of state-of-the-art contributions that present advanced techniques, tools and applications. These contributions have been prepared by a group of eminent researchers and professionals in the field. The theoretical topics covered include: knowledge acquisition, machine learning, genetic algorithms, knowledge management and processing under uncertainty.

  20. Case-based reasoning diagnostic technique based on multi-attribute similarity

    Energy Technology Data Exchange (ETDEWEB)

    Makoto, Takahashi [Tohoku University, Miyagi (Japan); Akio, Gofuku [Okayama University, Okayamaa (Japan)

    2014-08-15

    A case-based diagnostic technique has been developed based on multi-attribute similarity. The specific feature of the developed system is the use of multiple attributes of process signals for similarity evaluation to retrieve a similar case stored in a case base. The present technique has been applied to measurement data from Monju with some simulated anomalies. The results of numerical experiments showed that the present technique can be utilized as one of the methods for a hybrid-type diagnosis system.
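
    Case retrieval by multi-attribute similarity can be sketched as a weighted sum of per-attribute similarities over a small case base. The attributes, weights, normalization ranges, and cases below are all hypothetical, not Monju measurement data:

```python
# Minimal sketch of case-based retrieval by weighted multi-attribute similarity.
case_base = [
    {"id": "normal",       "temp": 350.0, "flow": 10.0, "pressure": 2.0},
    {"id": "pump-anomaly", "temp": 352.0, "flow":  6.0, "pressure": 1.5},
    {"id": "leak",         "temp": 340.0, "flow":  9.0, "pressure": 0.8},
]
weights = {"temp": 0.2, "flow": 0.5, "pressure": 0.3}   # sum to 1
ranges  = {"temp": 20.0, "flow": 10.0, "pressure": 2.0} # attribute scales

def similarity(query, case):
    """Weighted sum of per-attribute similarities, each in [0, 1]."""
    s = 0.0
    for attr, w in weights.items():
        d = abs(query[attr] - case[attr]) / ranges[attr]
        s += w * max(0.0, 1.0 - d)
    return s

def retrieve(query):
    """Return the stored case most similar to the query signals."""
    return max(case_base, key=lambda c: similarity(query, c))

best = retrieve({"temp": 351.0, "flow": 6.5, "pressure": 1.4})
```

    Using several signal attributes jointly, rather than a single signal, is what lets the retrieval separate anomalies with similar individual symptoms.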

  1. Embryo transfer techniques: an American Society for Reproductive Medicine survey of current Society for Assisted Reproductive Technology practices.

    Science.gov (United States)

    Toth, Thomas L; Lee, Malinda S; Bendikson, Kristin A; Reindollar, Richard H

    2017-04-01

    To better understand practice patterns and opportunities for standardization of ET. Cross-sectional survey. Not applicable. Not applicable. An anonymous 82-question survey was emailed to the medical directors of 286 Society for Assisted Reproductive Technology member IVF practices. A follow-up survey composed of three questions specific to ET technique was emailed to the same medical directors. Descriptive statistics of the results were compiled. The survey assessed policies, protocols, restrictions, and specifics pertinent to the technique of ET. There were 117 (41%) responses; 32% practice in academic settings and 68% in private practice. Responders were experienced clinicians, half of whom had performed Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  2. Farmer survey in the hinterland of Kisangani (Democratic Republic of Congo) on rodent crop damage and rodent control techniques used

    DEFF Research Database (Denmark)

    Drazo, Nicaise Amundala; Kennis, Jan; Leirs, Herwig

    2008-01-01

    We conducted a survey on rodent crop damage among farmers in the hinterland of Kisangani (Democratic Republic of Congo). We studied the amount of crop damage, the rodent groups causing crop damage, the growth stages affected and the control techniques used. We conducted this survey in three...... municipalities using a standard questionnaire form translated into local languages, between November 2005 and June 2006 and during July 2007. We used the Quotas method and interviewed 70 households per municipality. Farmers indicated rodent groups implicated in crop damage on color photographs. Two types...... of survey techniques were used: individual and focus-group surveys. The sugar cane rat, Thryonomys sp. and Lemniscomys striatus caused most damage to crops, but inside granaries, Rattus rattus was the primary pest species eating stored food supplies and causing damage to stored goods. Cassava and maize were...

  3. Different types of maximum power point tracking techniques for renewable energy systems: A survey

    Science.gov (United States)

    Khan, Mohammad Junaid; Shukla, Praveen; Mustafa, Rashid; Chatterji, S.; Mathew, Lini

    2016-03-01

    Global demand for electricity is increasing while production of energy from fossil fuels is declining, and therefore the obvious choice of a clean energy source that is abundant and could provide security for future development is energy from the sun. The supply characteristic of the photovoltaic generator is nonlinear and exhibits multiple peaks, including many local peaks and a global peak, under non-uniform irradiance. To keep the operating point at the global peak, maximum power point tracking (MPPT) is an important component of photovoltaic systems. Many review articles discuss conventional techniques such as perturb and observe (P&O), incremental conductance, and ripple correlation control, but very few attempts have been made to cover intelligent MPPT techniques. This document also discusses different algorithms based on fuzzy logic, ant colony optimization, genetic algorithms, artificial neural networks, particle swarm optimization, the firefly algorithm, the extremum seeking control method and hybrid methods, applied to the tracking of the maximum power point in photovoltaic systems under changing irradiance conditions.
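
    Among the conventional techniques listed, perturb and observe (P&O) is the simplest to sketch: perturb the operating voltage, observe the power change, and reverse the perturbation direction when power drops. The PV curve below is a toy single-peak model with an assumed maximum near 17 V; real curves under partial shading have multiple peaks, which is exactly where plain P&O fails and the intelligent techniques are motivated:

```python
# Minimal perturb-and-observe (P&O) MPPT sketch on a toy PV power curve.
def pv_power(v):
    """Hypothetical single-peak P-V curve with its maximum at v = 17.0 V."""
    return max(0.0, -0.5 * (v - 17.0) ** 2 + 60.0)

def perturb_and_observe(v0=10.0, step=0.2, iters=200):
    """Climb the P-V curve: reverse the perturbation when power decreases."""
    v, p = v0, pv_power(v0)
    direction = 1.0
    for _ in range(iters):
        v_new = v + direction * step
        p_new = pv_power(v_new)
        if p_new < p:               # power dropped: reverse perturbation
            direction = -direction
        v, p = v_new, p_new
    return v

v_mpp = perturb_and_observe()       # settles into a small cycle around 17 V
```

    The steady-state oscillation of one step around the peak is the classic P&O trade-off between tracking speed (large step) and residual ripple (small step).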

  4. Mass spectrometry techniques in the survey of steroid metabolites as potential disease biomarkers: a review.

    Science.gov (United States)

    Gouveia, Maria João; Brindley, Paul J; Santos, Lúcio Lara; Correia da Costa, José Manuel; Gomes, Paula; Vale, Nuno

    2013-09-01

    Mass spectrometric approaches have been fundamental to the identification of metabolites associated with steroid hormones, yet this topic has not been reviewed in depth in recent years. To this end, and given the increasing relevance of liquid chromatography-mass spectrometry (LC-MS) studies on steroid hormones and their metabolites, the present review addresses this subject. This review provides a timely summary of the use of various mass spectrometry-based analytical techniques during the evaluation of steroidal biomarkers in a range of human disease settings. The sensitivity and specificity of these technologies are clearly providing valuable new insights into breast cancer and cardiovascular disease. We aim to contribute to an enhanced understanding of steroid metabolism and how it can be profiled by LC-MS techniques. Crown Copyright © 2013. Published by Elsevier Inc. All rights reserved.

  5. An Observed Voting System Based On Biometric Technique

    Directory of Open Access Journals (Sweden)

    B. Devikiruba

    2015-08-01

    Full Text Available This article describes a computational framework which can run on almost every computer connected to an IP-based network to study biometric techniques. This paper discusses a system in which protecting confidential information puts strong security demands on identification. Biometry provides a user-friendly method for this identification and is becoming a competitor to current identification mechanisms. The experimentation section focuses on biometric verification based specifically on fingerprints. This article should be read as a warning to those thinking of using identification methods without first examining the technical opportunities for compromising the mechanisms and the associated legal consequences. The development is based on the Java language, which easily extends into software packages useful for testing new control techniques.

  6. Current STR-based techniques in forensic science

    Directory of Open Access Journals (Sweden)

    Phuvadol Thanakiatkrai

    2013-01-01

    Full Text Available DNA analysis in forensic science is mainly based on short tandem repeat (STR) genotyping. The conventional analysis is a three-step process of DNA extraction, amplification and detection. An overview of various techniques that are currently in use and are being actively researched for STR typing is presented. The techniques are separated into STR amplification and detection. New techniques for forensic STR analysis focus on increasing sensitivity, resolution and discrimination power for suboptimal samples. These are achieved by shifting primer-binding sites, using high-fidelity and tolerant polymerases and applying novel methods to STR detection. Examples in which STRs are used in criminal investigations are provided and future research directions are discussed.

  7. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    Directory of Open Access Journals (Sweden)

    Jonathan Lueke

    2011-01-01

    Full Text Available Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  8. MEMS-based power generation techniques for implantable biosensing applications.

    Science.gov (United States)

    Lueke, Jonathan; Moussa, Walied A

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  9. Power system dynamic state estimation using prediction based evolutionary technique

    International Nuclear Information System (INIS)

    Basetti, Vedik; Chandel, Ashwani K.; Chandel, Rajeevan

    2016-01-01

    In this paper, a new robust LWS (least winsorized square) estimator is proposed for dynamic state estimation of a power system. One of the main advantages of this estimator is that it has an inbuilt bad data rejection property and is less sensitive to bad data measurements. In the proposed approach, Brown's double exponential smoothing technique has been utilised for its reliable performance at the prediction step. The state estimation problem is solved as an optimisation problem using a new jDE-self adaptive differential evolution with prediction based population re-initialisation technique at the filtering step. This new stochastic search technique has been embedded with different state scenarios using the predicted state. The effectiveness of the proposed LWS technique is validated under different conditions, namely normal operation, bad data, sudden load change, and loss of transmission line conditions on three different IEEE test bus systems. The performance of the proposed approach is compared with the conventional extended Kalman filter. On the basis of various performance indices, the results thus obtained show that the proposed technique increases the accuracy and robustness of power system dynamic state estimation performance. - Highlights: • To estimate the states of the power system under dynamic environment. • The performance of the EKF method is degraded during anomaly conditions. • The proposed method remains robust towards anomalies. • The proposed method provides precise state estimates even in the presence of anomalies. • The results show that prediction accuracy is enhanced by using the proposed model.
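
    Brown's double exponential smoothing, used at the prediction step above, can be sketched as follows; on a clean linear trend the one-step-ahead forecast converges to the true next value. The series and smoothing constant are illustrative, not power-system state data:

```python
def brown_forecast(series, alpha=0.5):
    """Brown's double exponential smoothing: one-step-ahead forecast
    after processing the whole series."""
    s1 = s2 = float(series[0])
    for x in series:
        s1 = alpha * x + (1 - alpha) * s1     # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2    # second smoothing
    level = 2 * s1 - s2                       # estimated level
    trend = alpha / (1 - alpha) * (s1 - s2)   # estimated trend
    return level + trend                      # forecast for the next step

# On a clean linear ramp the forecast converges to the true next value
nxt = brown_forecast(list(range(60)))         # series 0..59; true next value is 60
```

    In the paper's scheme, such predicted states seed the differential-evolution population re-initialisation at the filtering step.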

  10. Using SERVQUAL and Kano research techniques in a patient service quality survey.

    Science.gov (United States)

    Christoglou, Konstantinos; Vassiliadis, Chris; Sigalas, Ioakim

    2006-01-01

    This article presents the results of a service quality study. After an introduction to the SERVQUAL and Kano research techniques, a Kano analysis of 75 patients from the General Hospital of Katerini in Greece is presented. The service quality criterion used satisfaction and dissatisfaction indices. The results of the Kano statistical analysis strengthened the hypothesis of previous research regarding the importance of the personal knowledge and courtesy of the hospital employees and their ability to convey trust and confidence (the assurance dimension). Managerial suggestions are made regarding the best way of acting towards and approaching hospital patients, based on the basic SERVQUAL model.
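
    The satisfaction and dissatisfaction indices used as the service quality criterion can be sketched with the standard Kano coefficients: CS = (A + O) / (A + O + M + I) and DS = -(O + M) / (A + O + M + I). The response counts below are illustrative, not the study's data:

```python
def kano_indices(counts):
    """Kano satisfaction (CS) and dissatisfaction (DS) coefficients from
    category counts: A = attractive, O = one-dimensional, M = must-be,
    I = indifferent."""
    total = counts["A"] + counts["O"] + counts["M"] + counts["I"]
    cs = (counts["A"] + counts["O"]) / total    # how much presence satisfies
    ds = -(counts["O"] + counts["M"]) / total   # how much absence dissatisfies
    return cs, ds

# Hypothetical response counts for one questionnaire item (illustrative only)
cs, ds = kano_indices({"A": 20, "O": 30, "M": 15, "I": 10})
```

    CS near 1 and DS near -1 mark the attributes, like assurance in this study, whose presence delights and whose absence strongly dissatisfies.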

  11. Fractal Image Compression Based on High Entropy Values Technique

    Directory of Open Access Journals (Sweden)

    Douaa Younis Abbaas

    2018-04-01

    Full Text Available Many attempts have been made to improve the encoding stage of FIC because it is time-consuming. These attempts work by reducing the size of the search pool for range-domain matching, but most of them lead to poor quality or a lower compression ratio of the reconstructed image. This paper aims to present a method to improve the performance of the full search algorithm by combining FIC (lossy compression) with a lossless technique (in this case, entropy coding). The entropy technique reduces the size of the domain pool (i.e., the number of domain blocks) based on the entropy value of each range block and domain block. The results of the full search algorithm and the proposed entropy-based algorithm are then compared to see which gives the best results, such as reduced encoding time with acceptable values of both compression quality parameters, C.R (compression ratio) and PSNR (image quality). The experimental results of the proposed algorithm prove that the entropy technique reduces the encoding time while keeping the compression ratio and reconstructed image quality as good as possible.
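
    The entropy-based pruning of the domain pool can be sketched as follows: compute the histogram entropy of each block and keep only domain blocks whose entropy is close to that of the current range block, so the expensive range-domain matching searches a smaller pool. Block sizes, the bin count, and the tolerance are illustrative choices, not the paper's parameters:

```python
import numpy as np

def block_entropy(block, bins=8):
    """Shannon entropy (bits) of a block's gray-level histogram."""
    hist, _ = np.histogram(block, bins=bins, range=(0, 256))
    p = hist[hist > 0] / block.size
    return float(-(p * np.log2(p)).sum())

def prune_domain_pool(range_block, domains, tol=0.5):
    """Keep only domain blocks whose entropy is within tol of the
    range block's, shrinking the pool before range-domain matching."""
    e_r = block_entropy(range_block)
    return [d for d in domains if abs(block_entropy(d) - e_r) <= tol]

rng = np.random.default_rng(0)
flat = np.full((8, 8), 100.0)                 # near-zero entropy block
noisy = rng.uniform(0, 256, (8, 8))           # high-entropy block
domains = [flat, noisy, flat + 2.0]
kept = prune_domain_pool(flat, domains)       # the noisy block is pruned away
```

    Blocks with very different entropy are unlikely to match well anyway, which is why the pruning can shorten encoding with little quality loss.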

  12. Characterization techniques for graphene-based materials in catalysis

    Directory of Open Access Journals (Sweden)

    Maocong Hu

    2017-06-01

    Full Text Available Graphene-based materials have been studied in a wide range of applications including catalysis due to their outstanding electronic, thermal, and mechanical properties. The unprecedented features of graphene-based catalysts, which are believed to be responsible for their superior performance, have been characterized by many techniques. In this article, we comprehensively summarized the characterization methods covering bulk and surface structure analysis, chemisorption ability determination, and reaction mechanism investigation. We reviewed the advantages/disadvantages of different techniques including Raman spectroscopy, X-ray photoelectron spectroscopy (XPS), Fourier transform infrared spectroscopy (FTIR) and diffuse reflectance Fourier transform infrared spectroscopy (DRIFTS), X-ray diffraction (XRD), X-ray absorption near edge structure (XANES) and X-ray absorption fine structure (XAFS), atomic force microscopy (AFM), scanning electron microscopy (SEM), transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), ultraviolet-visible spectroscopy (UV-vis), X-ray fluorescence (XRF), inductively coupled plasma mass spectrometry (ICP), thermogravimetric analysis (TGA), Brunauer–Emmett–Teller (BET) analysis, and scanning tunneling microscopy (STM). The application of temperature-programmed reduction (TPR), CO chemisorption, and NH3/CO2 temperature-programmed desorption (TPD) was also briefly introduced. Finally, we discussed the challenges and provided possible suggestions on choosing characterization techniques. This review provides key information to the catalysis community to adopt suitable characterization techniques for their research.

  13. Application of spectroscopic techniques for the study of paper documents: A survey

    International Nuclear Information System (INIS)

    Manso, M.; Carvalho, M.L.

    2009-01-01

    For many centuries paper was the main material for recording cultural achievements all over the world. Paper is mostly made from cellulose with small amounts of organic and inorganic additives, which allow its identification and characterization and may also contribute to its degradation. Prior to 1850, paper was made entirely from rags, using hemp, flax and cotton fibres. After this period, due to the enormous increase in demand, wood pulp began to be commonly used as raw material, resulting in rapid degradation of paper. Spectroscopic techniques represent one of the most powerful tools to investigate the constituents of paper documents in order to establish their identification and their state of degradation. This review describes the application of selected spectroscopic techniques used for paper characterization and conservation. The spectroscopic techniques that have been used and will be reviewed include: Fourier-Transform Infrared spectroscopy, Raman spectroscopy, Nuclear Magnetic Resonance spectroscopy, X-Ray spectroscopy, Laser-based Spectroscopy, Inductively Coupled Mass Spectroscopy, Laser ablation, Atomic Absorption Spectroscopy and X-Ray Photoelectron Spectroscopy.

  14. Development of custom LCD based portable survey/contamination monitors

    International Nuclear Information System (INIS)

    Reddy, J.D.

    2010-01-01

    Equipment for carrying out radiation survey measurements for alpha, beta and gamma radiations has evolved considerably with the advancements in electronics over time. There are two major classes of portable instruments available from most manufacturers: (a) analog indicator type and (b) direct digital readout type. Analog meters give a direct quantifying feel for radiation levels, though they are neither feature-rich nor as smart as digital meters. Digital versions have the advantages of direct numerical readout and configurability as per user requirements. To combine the best features of both techniques, a dual-indicator LCD module comprising analog indicating LCD segments and a 7-segment digital readout has been developed. This LCD, comprising the LCD glass and its display driver, has been deployed across various types of survey meters and contamination monitors manufactured by Nucleonix. The display facilitates direct readout of dose rate/count rate in various units simultaneously on both the analog LCD scale and the direct digital indication. (author)

  15. 3D-TV System with Depth-Image-Based Rendering Architectures, Techniques and Challenges

    CERN Document Server

    Zhao, Yin; Yu, Lu; Tanimoto, Masayuki

    2013-01-01

    Riding on the success of 3D cinema blockbusters and advances in stereoscopic display technology, 3D video applications have gathered momentum in recent years. 3D-TV System with Depth-Image-Based Rendering: Architectures, Techniques and Challenges surveys depth-image-based 3D-TV systems, which are expected to be put into applications in the near future. Depth-image-based rendering (DIBR) significantly enhances the 3D visual experience compared to stereoscopic systems currently in use. DIBR techniques make it possible to generate additional viewpoints using 3D warping techniques to adjust the perceived depth of stereoscopic videos and provide for auto-stereoscopic displays that do not require glasses for viewing the 3D image.   The material includes a technical review and literature survey of components and complete systems, solutions for technical issues, and implementation of prototypes. The book is organized into four sections: System Overview, Content Generation, Data Compression and Transmission, and 3D V...

  16. A Review of the Piezoelectric Electromechanical Impedance Based Structural Health Monitoring Technique for Engineering Structures

    Directory of Open Access Journals (Sweden)

    Wongi S. Na

    2018-04-01

    Full Text Available The birth of smart materials such as piezoelectric (PZT) transducers has aided in revolutionizing the field of structural health monitoring (SHM) based on non-destructive testing (NDT) methods. While a relatively new NDT method known as the electromechanical impedance (EMI) technique has been investigated for more than two decades, there are still various problems that must be solved before it is applied to real structures. The technique, which has significant potential to contribute to the creation of one of the most effective SHM systems, involves the use of a single PZT for both exciting and sensing the host structure. In this paper, studies from the past decade related to the EMI technique are reviewed to understand its trends. In addition, new concepts and ideas proposed by various authors are also surveyed, and the paper concludes with a discussion of potential directions for future work.

  17. Statistical techniques applied to aerial radiometric surveys (STAARS): series introduction and the principal-components-analysis method

    International Nuclear Information System (INIS)

    Pirkle, F.L.

    1981-04-01

    STAARS is a new series which is being published to disseminate information concerning statistical procedures for interpreting aerial radiometric data. The application of a particular data interpretation technique to geologic understanding for delineating regions favorable to uranium deposition is the primary concern of STAARS. Statements concerning the utility of a technique on aerial reconnaissance data as well as detailed aerial survey data will be included

  18. IoT Security Techniques Based on Machine Learning

    OpenAIRE

    Xiao, Liang; Wan, Xiaoyue; Lu, Xiaozhen; Zhang, Yanyong; Wu, Di

    2018-01-01

    Internet of things (IoT) that integrate a variety of devices into networks to provide advanced and intelligent services have to protect user privacy and address attacks such as spoofing attacks, denial of service attacks, jamming and eavesdropping. In this article, we investigate the attack model for IoT systems, and review the IoT security solutions based on machine learning techniques including supervised learning, unsupervised learning and reinforcement learning. We focus on the machine le...

  19. Instructional Uses of Web-Based Survey Software

    Directory of Open Access Journals (Sweden)

    Concetta A. DePaolo, Ph.D.

    2006-07-01

    Full Text Available Recent technological advances have led to changes in how instruction is delivered. Such technology can create opportunities to enhance instruction and make instructors more efficient in performing instructional tasks, especially if the technology is easy to use and requires no training. One such technology, web-based survey software, is extremely accessible for anyone with basic computer skills. Web-based survey software can be used for a variety of instructional purposes to streamline instructor tasks, as well as enhance instruction and communication with students. Following a brief overview of the technology, we discuss how Web Forms from nTreePoint can be used to conduct instructional surveys, collect course feedback, conduct peer evaluations of group work, collect completed assignments, schedule meeting times among multiple people, and aid in pedagogical research. We also discuss our experiences with these tasks within traditional on-campus courses and how they were enhanced or expedited by the use of web-based survey software.

  20. Dealing with Magnetic Disturbances in Human Motion Capture: A Survey of Techniques

    Directory of Open Access Journals (Sweden)

    Gabriele Ligorio

    2016-03-01

    Full Text Available Magnetic-Inertial Measurement Units (MIMUs) based on microelectromechanical (MEMS) technologies are widespread in contexts such as human motion tracking. Although they present several advantages (light weight, small size, low cost), their orientation estimation accuracy might be poor. Indoor magnetic disturbances represent one of the limiting factors for their accuracy, and, therefore, a variety of work has been done to characterize and compensate for them. In this paper, the main compensation strategies included within Kalman-based orientation estimators are surveyed and classified according to which degrees of freedom are affected by the magnetic data and to the magnetic disturbance rejection methods implemented. By selecting a representative method from each category, four algorithms were obtained and compared in two different magnetic environments: (1) a small workspace with an active magnetic source; (2) a large workspace without active magnetic sources. A wrist-worn MIMU was used to acquire data from a healthy subject, whereas a stereophotogrammetric system was adopted to obtain ground-truth data. The results suggested that the model-based approaches represent the best compromise between the two testbeds. This is particularly true when the magnetic data are prevented from affecting the estimation of the angles with respect to the vertical direction.

  1. An Image Morphing Technique Based on Optimal Mass Preserving Mapping

    Science.gov (United States)

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2013-01-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  2. Comparing Four Touch-Based Interaction Techniques for an Image-Based Audience Response System

    NARCIS (Netherlands)

    Jorritsma, Wiard; Prins, Jonatan T.; van Ooijen, Peter M. A.

    2015-01-01

    This study aimed to determine the most appropriate touch-based interaction technique for I2Vote, an image-based audience response system for radiology education in which users need to accurately mark a target on a medical image. Four plausible techniques were identified: land-on, take-off,

  3. A DIFFERENT WEB-BASED GEOCODING SERVICE USING FUZZY TECHNIQUES

    Directory of Open Access Journals (Sweden)

    P. Pahlavani

    2015-12-01

    Full Text Available Geocoding – the process of finding a position based on descriptive data such as an address or postal code – is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concept of proximity and nearness is not modelled appropriately, and these services search addresses only by address matching based on descriptive data. In addition, there are also some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system was designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, searching for places based on their location, non-point representation of results, as well as displaying search results based on their priority.
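
    The fuzzy-nearness step described above can be sketched with simple membership functions; the breakpoint distances and the min-based overlay rule below are our own illustrative assumptions, not values from the paper:

    ```python
    def nearness(dist_m, full=100.0, zero=500.0):
        """Fuzzy 'near' membership: 1.0 within `full` metres of a place,
        tapering linearly to 0.0 at `zero` metres (assumed breakpoints)."""
        if dist_m <= full:
            return 1.0
        if dist_m >= zero:
            return 0.0
        return (zero - dist_m) / (zero - full)

    def fuzzy_overlay(memberships):
        """Fuzzy AND across several distance maps: a candidate point scores
        well only if it is near all of the referenced places."""
        return min(memberships)
    ```

    Candidate locations can then be ranked by their overlay score, which is one way to display search results by priority.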

  4. Automatic vetting of planet candidates from ground based surveys: Machine learning with NGTS

    Science.gov (United States)

    Armstrong, David J.; Günther, Maximilian N.; McCormac, James; Smith, Alexis M. S.; Bayliss, Daniel; Bouchy, François; Burleigh, Matthew R.; Casewell, Sarah; Eigmüller, Philipp; Gillen, Edward; Goad, Michael R.; Hodgkin, Simon T.; Jenkins, James S.; Louden, Tom; Metrailler, Lionel; Pollacco, Don; Poppenhaeger, Katja; Queloz, Didier; Raynard, Liam; Rauer, Heike; Udry, Stéphane; Walker, Simon R.; Watson, Christopher A.; West, Richard G.; Wheatley, Peter J.

    2018-05-01

    State-of-the-art exoplanet transit surveys are producing ever increasing quantities of data. Making the best use of this resource, whether in detecting interesting planetary systems or in determining accurate planetary population statistics, requires new automated methods. Here we describe a machine learning algorithm that forms an integral part of the pipeline for the NGTS transit survey, demonstrating the efficacy of machine learning in selecting planetary candidates from multi-night ground-based survey data. Our method uses a combination of random forests and self-organising maps to rank planetary candidates, achieving an AUC score of 97.6% in ranking 12368 injected planets against 27496 false positives in the NGTS data. We build on past examples by using injected transit signals to form a training set, a necessary development for applying similar methods to upcoming surveys. We also make the autovet code used to implement the algorithm publicly accessible. autovet is designed to perform machine-learned vetting of planetary candidates, and can utilise a variety of methods. The apparent robustness of machine learning techniques, whether on space-based or the qualitatively different ground-based data, highlights their importance to future surveys such as TESS and PLATO and the need to better understand their advantages and pitfalls in an exoplanetary context.
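
    The AUC figure quoted above has a direct ranking interpretation: the probability that a randomly chosen injected planet is scored above a randomly chosen false positive. A minimal pure-Python sketch with toy scores (not NGTS data or the autovet implementation):

    ```python
    def auc_score(planet_scores, fp_scores):
        """AUC as a pairwise ranking probability: P(planet score > false-positive
        score), counting ties as one half. O(n*m), fine for an illustration."""
        wins = ties = 0
        for p in planet_scores:
            for f in fp_scores:
                if p > f:
                    wins += 1
                elif p == f:
                    ties += 1
        return (wins + 0.5 * ties) / (len(planet_scores) * len(fp_scores))
    ```

    In practice the scores would come from the trained ranker, e.g. a random forest's class probabilities.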

  5. SKILLS-BASED ECLECTIC TECHNIQUES MATRIX FOR ELT MICROTEACHINGS

    Directory of Open Access Journals (Sweden)

    İskender Hakkı Sarıgöz

    2016-10-01

    Full Text Available Foreign language teaching undergoes constant changes due to the methodological improvement. This progress may be examined in two parts. They are the methods era and the post-methods era. It is not pragmatic today to propose a particular language teaching method and its techniques for all purposes. The holistic inflexibility of mid-century methods has long gone. In the present day, constructivist foreign language teaching trends attempt to see the learner as a whole person and an individual who may be different from the other students in many respects. At the same time, the individual differences should not keep the learners away from group harmony. For this reason, current teacher training programs require eclectic teaching matrixes for unit design considering the mixed ability student groups. These matrixes can be prepared in a multidimensional fashion because there are many functional techniques in different methods and other new techniques to be created by instructors freely in accordance with the teaching aims. The hypothesis in this argument is that the collection of foreign language teaching techniques compiled in ELT microteachings for a particular group of learners has to be arranged eclectically in order to update the teaching process. Nevertheless, designing a teaching format of this sort is a demanding and highly criticized task. This study briefly argues eclecticism in language-skills based methodological struggle from the perspective of ELT teacher education.

  6. Comparative cost assessment of the Kato-Katz and FLOTAC techniques for soil-transmitted helminth diagnosis in epidemiological surveys

    Directory of Open Access Journals (Sweden)

    Speich Benjamin

    2010-08-01

    Full Text Available Background The Kato-Katz technique is widely used for the diagnosis of soil-transmitted helminthiasis in epidemiological surveys and is believed to be an inexpensive method. The FLOTAC technique shows a higher sensitivity for the diagnosis of light-intensity soil-transmitted helminth infections but is reported to be more complex and expensive. We assessed the costs related to the collection, processing and microscopic examination of stool samples using the Kato-Katz and FLOTAC techniques in an epidemiological survey carried out in Zanzibar, Tanzania. Methods We measured the time for the collection of a single stool specimen in the field, transfer to a laboratory, preparation and microscopic examination using standard protocols for the Kato-Katz and FLOTAC techniques. Salaries of health workers, life expectancy and asset costs of materials, and infrastructure costs were determined. The average cost for single or duplicate Kato-Katz thick smears and the FLOTAC dual or double technique were calculated. Results The average time needed to collect a stool specimen and perform a single or duplicate Kato-Katz thick smears or the FLOTAC dual or double technique was 20 min and 34 sec (20:34 min), 27:21 min, 28:14 min and 36:44 min, respectively. The total costs for single and duplicate Kato-Katz thick smears were US$ 1.73 and US$ 2.06, respectively, and for the FLOTAC double and dual technique US$ 2.35 and US$ 2.83, respectively. Salaries had the greatest impact on the total costs of either method. Conclusions The time and cost for soil-transmitted helminth diagnosis using either the Kato-Katz or FLOTAC method in epidemiological surveys are considerable. Our results can help to guide healthcare decision makers and scientists in budget planning and funding for epidemiological surveys, anthelminthic drug efficacy trials and monitoring of control interventions.

  7. Basin Visual Estimation Technique (BVET) and Representative Reach Approaches to Wadeable Stream Surveys: Methodological Limitations and Future Directions

    Science.gov (United States)

    Lance R. Williams; Melvin L. Warren; Susan B. Adams; Joseph L. Arvai; Christopher M. Taylor

    2004-01-01

    Basin Visual Estimation Techniques (BVET) are used to estimate abundance for fish populations in small streams. With BVET, independent samples are drawn from natural habitat units in the stream rather than sampling "representative reaches." This sampling protocol provides an alternative to traditional reach-level surveys, which are criticized for their lack...

  8. Comparison of passive soil vapor survey techniques at a Tijeras Arroyo site, Sandia National Laboratories, Albuquerque, New Mexico

    International Nuclear Information System (INIS)

    Eberle, C.S.; Wade, W.M.; Tharp, T.; Brinkman, J.

    1996-01-01

    Soil vapor surveys were performed to characterize the approximate location of soil contaminants at a hazardous waste site. The samplers were from two separate companies and a comparison was made between the results of the two techniques. These results will be used to design further investigations at the site

  9. [Estimating child mortality using the previous child technique, with data from health centers and household surveys: methodological aspects].

    Science.gov (United States)

    Aguirre, A; Hill, A G

    1988-01-01

    2 trials of the previous child or preceding birth technique in Bamako, Mali, and Lima, Peru, gave very promising results for measurement of infant and early child mortality using data on survivorship of the 2 most recent births. In the Peruvian study, another technique was tested in which each woman was asked about her last 3 births. The preceding birth technique described by Brass and Macrae has rapidly been adopted as a simple means of estimating recent trends in early childhood mortality. The questions formulated and the analysis of results are direct when the mothers are visited at the time of birth or soon after. Several technical aspects of the method believed to introduce unforeseen biases have now been studied and found to be relatively unimportant. But the problems arising when the data come from a nonrepresentative fraction of the total fertile-aged population have not been resolved. The analysis based on data from 5 maternity centers including 1 hospital in Bamako, Mali, indicated some practical problems and the information obtained showed the kinds of subtle biases that can result from the effects of selection. The study in Lima tested 2 abbreviated methods for obtaining recent early childhood mortality estimates in countries with deficient vital registration. The basic idea was that a few simple questions added to household surveys on immunization or diarrheal disease control for example could produce improved child mortality estimates. The mortality estimates in Peru were based on 2 distinct sources of information in the questionnaire. All women were asked their total number of live born children and the number still alive at the time of the interview. The proportion of deaths was converted into a measure of child survival using a life table. Then each woman was asked for a brief history of the 3 most recent live births. Dates of birth and death were noted in month and year of occurrence. The interviews took only slightly longer than the basic survey

  10. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time domain analysis of the crosstalk effect in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. This method is used for the estimation of crosstalk induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
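
    To illustrate the FDTD idea, a leapfrog update for the 1D telegrapher's equations on a single lossy line, with the time step set at the Courant limit. This is a sketch, not the paper's coupled bundled-SWCNT model, and the per-unit-length values are placeholders:

    ```python
    import math

    def fdtd_line(nz=50, nt=200, L=2.5e-7, C=1.0e-10, R=0.0, dx=1e-3):
        """Leapfrog FDTD for the 1D telegrapher's equations on a single line.
        dt is chosen at the Courant limit dx*sqrt(L*C), the stability bound."""
        dt = dx * math.sqrt(L * C)
        V = [0.0] * (nz + 1)   # node voltages
        I = [0.0] * nz         # branch currents (staggered half a cell)
        for n in range(nt):
            V[0] = 1.0 if n < 20 else 0.0              # pulse source at the near end
            for k in range(nz):                        # current update (loss via R)
                I[k] += dt / (L * dx) * (V[k] - V[k + 1]) - (R * dt / L) * I[k]
            for k in range(1, nz):                     # interior voltage update
                V[k] += dt / (C * dx) * (I[k - 1] - I[k])
        return V
    ```

    Coupled-line crosstalk analysis adds mutual inductance and capacitance terms to the same update loop.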

  11. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    This paper presents a method that incorporates Markov Random Field (MRF), watershed segmentation and merging techniques for performing image segmentation and edge detection tasks. MRF is used to obtain an initial estimate of the regions in the image under process, where in the MRF model the gray level x at pixel location i in an image X depends on the gray levels of neighboring pixels. The process needs an initial segmented result. An initial segmentation is obtained based on a K-means clustering technique and the minimum distance; the region process is then modeled by MRF to obtain an image containing different intensity regions. Starting from this, we calculate the gradient values of that image and then employ a watershed technique. The MRF step yields an image that has different intensity regions and carries all the edge and region information; the watershed algorithm then improves the segmentation result by superimposing a closed and accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merge process based on averaged mean values is employed. The final segmentation and edge detection result is one closed boundary per actual region in the image.
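
    The K-means/minimum-distance initialization can be sketched in pure Python (a gray-level-only version with an assumed k; the paper's actual implementation may differ):

    ```python
    def kmeans_gray(pixels, k=2, iters=10):
        """Cluster gray levels by minimum distance to cluster means (Lloyd's
        algorithm), producing the initial label field that the MRF model refines."""
        lo, hi = min(pixels), max(pixels)
        means = [lo + (hi - lo) * c / max(k - 1, 1) for c in range(k)]
        labels = [0] * len(pixels)
        for _ in range(iters):
            # assign each pixel to the nearest cluster mean
            labels = [min(range(k), key=lambda c: abs(p - means[c])) for p in pixels]
            # recompute each cluster mean from its members
            for c in range(k):
                members = [p for p, lab in zip(pixels, labels) if lab == c]
                if members:
                    means[c] = sum(members) / len(members)
        return labels, means
    ```

    The resulting label field would then seed the MRF region model before the gradient and watershed stages.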

  12. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  13. Adaptive differential correspondence imaging based on sorting technique

    Directory of Open Access Journals (Sweden)

    Heng Wu

    2017-04-01

    Full Text Available We develop an adaptive differential correspondence imaging (CI) method using a sorting technique. Different from conventional CI schemes, the bucket detector signals (BDS) are first processed by a differential technique, and then sorted in a descending (or ascending) order. Subsequently, according to the front and last several frames of the sorted BDS, the positive and negative subsets (PNS) are created by selecting the relative frames from the reference detector signals. Finally, the object image is recovered from the PNS. Besides, an adaptive method based on two-step iteration is designed to select the optimum number of frames. To verify the proposed method, a single-detector computational ghost imaging (GI) setup is constructed. We experimentally and numerically compare the performance of the proposed method with different GI algorithms. The results show that our method can improve the reconstruction quality and reduce the computation cost by using fewer measurement data.
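
    A toy numerical sketch of the sort-then-select idea (with a fixed subset size instead of the adaptive two-step iteration; the names and data are ours):

    ```python
    def differential_ci(ref_frames, bucket, n_sel=2):
        """Recover an image from reference frames and bucket detector signals (BDS):
        1) differential BDS: subtract the mean; 2) sort frame indices by that value;
        3) average the top frames (positive subset) minus the bottom frames
        (negative subset)."""
        mean_b = sum(bucket) / len(bucket)
        diff = [b - mean_b for b in bucket]
        order = sorted(range(len(diff)), key=diff.__getitem__, reverse=True)
        pos, neg = order[:n_sel], order[-n_sel:]
        npix = len(ref_frames[0])
        mean_of = lambda idx: [sum(ref_frames[i][p] for i in idx) / len(idx)
                               for p in range(npix)]
        pa, na = mean_of(pos), mean_of(neg)
        return [pa[p] - na[p] for p in range(npix)]
    ```

    With a two-pixel "object" that transmits only pixel 0, the recovered image correctly highlights that pixel.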

  14. Wear Detection of Drill Bit by Image-based Technique

    Science.gov (United States)

    Sukeri, Maziyah; Zulhilmi Paiz Ismadi, Mohd; Rahim Othman, Abdul; Kamaruddin, Shahrul

    2018-03-01

    Image processing for computer vision plays an essential role in the manufacturing industries for tool condition monitoring. This study proposes a dependable direct measurement method for tool wear using image-based analysis. Segmentation and thresholding techniques were used to filter and convert the colour image to binary datasets. Then, an edge detection method was applied to characterize the edge of the drill bit. Using the cross-correlation method, the edges of the original and worn drill bits were correlated to each other. The cross-correlation graphs were able to detect the worn edge despite the small difference between the graphs. Future development will focus on quantifying the worn profile as well as enhancing the sensitivity of the technique.
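
    The cross-correlation comparison of original and worn edge profiles can be sketched as a zero-mean normalized correlation coefficient (pure Python; the profiles below are invented for illustration):

    ```python
    def ncc(profile_a, profile_b):
        """Zero-mean normalized cross-correlation of two equal-length edge
        profiles; 1.0 means identical shape, lower values indicate wear."""
        n = len(profile_a)
        ma = sum(profile_a) / n
        mb = sum(profile_b) / n
        num = sum((a - ma) * (b - mb) for a, b in zip(profile_a, profile_b))
        den = (sum((a - ma) ** 2 for a in profile_a)
               * sum((b - mb) ** 2 for b in profile_b)) ** 0.5
        return num / den
    ```

    A worn edge profile departs from the original shape, pulling the coefficient below 1.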

  15. Underwater Time Service and Synchronization Based on Time Reversal Technique

    Science.gov (United States)

    Lu, Hao; Wang, Hai-bin; Aissa-El-Bey, Abdeldjalil; Pyndiah, Ramesh

    2010-09-01

    Real-time service and synchronization are very important to many underwater systems. However, existing time service and synchronization methods cannot work well due to the multi-path propagation and random phase fluctuation of signals in the ocean channel. The time reversal mirror (TRM) technique can realize energy concentration through self-matching of the ocean channel and has very good spatial and temporal focusing properties. Based on the TRM technique, we present the Time Reversal Mirror Real Time service and synchronization (TRMRT) method, which can bypass the processing of multi-path on the server side and reduce multi-path contamination on the client side. So TRMRT can improve the accuracy of time service. Furthermore, as an efficient and precise method of time service, TRMRT could be widely used in underwater exploration activities and underwater navigation and positioning systems.
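
    The temporal focusing property can be illustrated numerically: retransmitting the time-reversed channel impulse response back through the same channel yields the channel's autocorrelation, which peaks sharply at the focus instant. This is our own toy sketch, not the TRMRT implementation:

    ```python
    def autocorrelation(h):
        """Autocorrelation of a channel impulse response h at non-negative lags.
        Time-reversal transmission through the same multipath channel produces
        this sequence, concentrating energy at lag 0 (the focus time)."""
        n = len(h)
        return [sum(h[i] * h[i + lag] for i in range(n - lag)) for lag in range(n)]
    ```

    Even for a multipath response, the lag-0 term sums the energy of every path, so the focus peak dominates the sidelobes.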

  16. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of)

    2015-05-15

    One of the applications of PSA is the risk monitor, a real-time analysis tool that evaluates real-time risk based on the actual state of components and systems. In order to utilize it more effectively, methodologies that manipulate data from prognostics have been suggested. Generally, prognostics comprehensively includes not only prognosis but also monitoring and diagnostics, and prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. To reduce conservatism and uncertainties, we previously suggested a concept that updates the initiating event frequency in a PSA model by using a Bayesian approach, which is one of the prognostics techniques. From previous research, the possibility of updating PSA with data more correctly was found. In reliability theory, the bathtub curve divides into three parts (infant failure; constant, random failure; and wearout failure). In this paper, in order to investigate the applicability of prognostic methods in updating quantitative data in a PSA model, the OLM acceptance criteria from NUREG, the concept of how to use prognostics in PSA, and the enabling prognostic techniques are presented. The motivation for prognostics is that improved predictive capabilities using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.
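
    For the Bayesian updating step, a standard conjugate sketch (our illustration, not necessarily the authors' model): with a Gamma(α, β) prior on the initiating event frequency and n observed events over an exposure time T, the posterior is Gamma(α + n, β + T):

    ```python
    def update_initiating_event_frequency(alpha, beta, n_events, exposure_years):
        """Gamma-Poisson conjugate update of an initiating event frequency.
        Returns the posterior mean (events per year) and updated parameters."""
        alpha_post = alpha + n_events
        beta_post = beta + exposure_years
        return alpha_post / beta_post, (alpha_post, beta_post)
    ```

    For example, a Gamma(1, 10) prior (mean 0.1/yr) updated with 2 events in 10 years gives a posterior mean of 3/20 = 0.15/yr.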

  17. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Heo, Gyunyoung

    2015-01-01

    One of the applications of PSA is the risk monitor, a real-time analysis tool that evaluates real-time risk based on the actual state of components and systems. In order to utilize it more effectively, methodologies that manipulate data from prognostics have been suggested. Generally, prognostics comprehensively includes not only prognosis but also monitoring and diagnostics, and prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. To reduce conservatism and uncertainties, we previously suggested a concept that updates the initiating event frequency in a PSA model by using a Bayesian approach, which is one of the prognostics techniques. From previous research, the possibility of updating PSA with data more correctly was found. In reliability theory, the bathtub curve divides into three parts (infant failure; constant, random failure; and wearout failure). In this paper, in order to investigate the applicability of prognostic methods in updating quantitative data in a PSA model, the OLM acceptance criteria from NUREG, the concept of how to use prognostics in PSA, and the enabling prognostic techniques are presented. The motivation for prognostics is that improved predictive capabilities using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making

  18. Illumination Sufficiency Survey Techniques: In-situ Measurements of Lighting System Performance and a User Preference Survey for Illuminance in an Off-Grid, African Setting

    Energy Technology Data Exchange (ETDEWEB)

    Alstone, Peter; Jacobson, Arne; Mills, Evan

    2010-08-26

    Efforts to promote rechargeable electric lighting as a replacement for fuel-based light sources in developing countries are typically predicated on the notion that lighting service levels can be maintained or improved while reducing the costs and environmental impacts of existing practices. However, the extremely low incomes of those who depend on fuel-based lighting create a need to balance the hypothetically possible or desirable levels of light with those that are sufficient and affordable. In a pilot study of four night vendors in Kenya, we document a field technique we developed to simultaneously measure the effectiveness of lighting service provided by a lighting system and conduct a survey of lighting service demand by end-users. We took gridded illuminance measurements across each vendor's working and selling area, with users indicating the sufficiency of light at each point. User light sources included a mix of kerosene-fueled hurricane lanterns, pressure lamps, and LED lanterns. We observed illuminance levels ranging from just above zero to 150 lux. The LED systems markedly improved the lighting service levels over those provided by kerosene-fueled hurricane lanterns. Users reported that the minimum acceptable threshold was about 2 lux. The results also indicated that the LED lamps in use by the subjects did not always provide sufficient illumination over the desired retail areas. Our sample size is much too small, however, to reach any conclusions about requirements in the broader population. Given the small number of subjects and very specific type of user, our results should be regarded as indicative rather than conclusive. We recommend replicating the method at larger scales and across a variety of user types and contexts. Policymakers should revisit the subject of recommended illuminance levels regularly as LED technology advances and the price/service balance point evolves.
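
    The gridded measurements lend themselves to a simple sufficiency metric: the fraction of grid points at or above the roughly 2 lux threshold reported by users. A sketch with made-up readings:

    ```python
    def sufficiency_fraction(illuminance_grid, threshold_lux=2.0):
        """Fraction of gridded illuminance readings meeting the user-reported
        minimum acceptable level (about 2 lux in this pilot study)."""
        readings = [lux for row in illuminance_grid for lux in row]
        return sum(1 for lux in readings if lux >= threshold_lux) / len(readings)
    ```

    Comparing this fraction between lamp types is one way to quantify how the LED systems improved service over hurricane lanterns.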

  19. Vision based techniques for rotorcraft low altitude flight

    Science.gov (United States)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image data base for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except at regions close to the FOE. Closer to the FOE, the error in range increases since the magnitude of the disparity gets smaller, resulting in a low SNR.

  20. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    Directory of Open Access Journals (Sweden)

    Hong Zhang

    2013-01-01

Full Text Available With the wide application of vision-based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activity, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of recent developments in these techniques, including methods, systems, and quantitative evaluation of human activity recognition performance.

  1. Combination Base64 Algorithm and EOF Technique for Steganography

    Science.gov (United States)

    Rahim, Robbi; Nurdiyanto, Heri; Hidayat, Rahmat; Saleh Ahmar, Ansari; Siregar, Dodi; Putera Utama Siahaan, Andysah; Faisal, Ilham; Rahman, Sayuti; Suita, Diana; Zamsuri, Ahmad; Abdullah, Dahlan; Napitupulu, Darmawan; Ikhsan Setiawan, Muhammad; Sriadhi, S.

    2018-04-01

The steganography process combines mathematics and computer science. Steganography consists of a set of methods and techniques for embedding data into another medium so that the contents are unreadable to anyone who lacks the authority to read them. The main objective of using the Base64 method here is to convert a file's contents in order to achieve privacy. This paper discusses a steganography and encoding method using Base64, a set of encoding schemes that represent binary data as a series of ASCII characters. The EOF (end-of-file) technique is then used to embed the Base64-encoded text. As an example, a text file is used as the payload; using the two methods together increases the level of protection for the data. This research aims to secure many types of files within a given cover medium with good security and without damaging either the stored files or the cover medium used.
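The two-step scheme described above can be sketched in a few lines. This is an illustrative assumption of how Base64 encoding and EOF-style appending might be combined; the marker constant and the cover bytes are invented for the example, not taken from the paper:

```python
import base64

# Assumed delimiter separating the cover file's bytes from the payload.
MARKER = b"--STEGO--"

def embed(cover: bytes, secret: str) -> bytes:
    """Append the Base64-encoded secret after the cover file's bytes.

    Many file formats (and viewers) ignore bytes past the logical EOF,
    so the cover still opens normally.
    """
    payload = base64.b64encode(secret.encode("utf-8"))
    return cover + MARKER + payload

def extract(stego: bytes) -> str:
    """Recover the secret by splitting at the marker and Base64-decoding."""
    _, _, payload = stego.partition(MARKER)
    return base64.b64decode(payload).decode("utf-8")

cover = b"\x89PNG...image bytes..."   # stand-in for a real cover file
stego = embed(cover, "secret message")
assert stego.startswith(cover)         # cover content is untouched
assert extract(stego) == "secret message"
```

Note that Base64 is an encoding, not encryption; as the abstract implies, the privacy gain comes from obscurity plus the hiding step, not cryptographic strength.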

  2. Full-duplex MIMO system based on antenna cancellation technique

    DEFF Research Database (Denmark)

    Foroozanfard, Ehsan; Franek, Ondrej; Tatomirescu, Alexandru

    2014-01-01

The performance of an antenna cancellation technique for a multiple-input multiple-output (MIMO) full-duplex system that is based on null-steering beamforming and antenna polarization diversity is investigated. A practical implementation of a symmetric antenna topology comprising three dual-polarized patch antennas operating at 2.4 GHz is described. The measurement results show an average of 60 dB self-interference cancellation over a 200 MHz bandwidth. Moreover, a decoupling level of up to 22 dB is achieved for MIMO multiplexing using antenna polarization diversity. The performance evaluation...

  3. Cooperative Technique Based on Sensor Selection in Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    ISLAM, M. R.

    2009-02-01

Full Text Available An energy-efficient cooperative technique is proposed for IEEE 1451-based Wireless Sensor Networks. A selected number of Wireless Transducer Interface Modules (WTIMs) are used to form a Multiple Input Single Output (MISO) structure wirelessly connected with a Network Capable Application Processor (NCAP). Energy efficiency and delay of the proposed architecture are derived for different combinations of cluster size and number of selected WTIMs. Optimized constellation parameters are used for evaluating the derived parameters. The results show that the selected MISO structure outperforms the unselected MISO structure and is more energy efficient than a SISO structure beyond a certain distance.

  4. Development of PIC-based digital survey meter

    International Nuclear Information System (INIS)

    Nor Arymaswati Abdullah; Nur Aira Abdul Rahman; Mohd Ashhar Khalid; Taiman Kadni; Glam Hadzir Patai Mohamad; Abd Aziz Mhd Ramli; Chong Foh Yong

    2006-01-01

The need for radiation monitoring and for monitoring of radioactive contamination in the workplace is very important, especially where x-ray machines, linear accelerators, electron beam machines and radioactive sources are present. The appropriate use of a radiation detector is essential to maintaining a radiation- and contamination-free workplace. This paper reports on the development of a prototype PIC-based digital survey meter. The prototype is a hand-held instrument for general-purpose radiation monitoring and surface contamination measurement. Generally, the device is able to detect some or all of the three major types of ionizing radiation, namely alpha, beta and gamma. It uses a Geiger-Muller tube as the radiation detector, which converts gamma radiation quanta into electric pulses that are further processed by the electronic circuitry. The development involved the design of the controller, counter and high-voltage circuits. All of these circuits are assembled and enclosed in a plastic casing together with a GM detector and an LCD display to form a prototype survey meter. The number of pulses detected by the survey meter varies due to the random nature of radioactivity. By averaging the readings over a time period, a more accurate and stable reading is achieved. To test the accuracy and linearity of the design, the prototype was calibrated using standard procedures at the Secondary Standard Dosimetry Laboratory (SSDL) in MINT. (Author)
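The averaging step described above can be illustrated with simulated counts: radioactive decay is Poisson-distributed, so a single short reading fluctuates while the average of many readings is stable. The rate value and the sampler below are illustrative assumptions, not instrument code:

```python
import math
import random

def poisson(lam):
    """Knuth's Poisson sampler: count uniforms until their product < e^-lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < limit:
            return k
        k += 1

random.seed(42)
true_rate = 120.0                                   # assumed counts per second
readings = [poisson(true_rate) for _ in range(60)]  # 60 one-second readings

avg = sum(readings) / len(readings)
# A single one-second reading can easily be off by ~10 counts, while the
# standard error of the 60-second average is only sqrt(120/60) ~ 1.4 counts.
print(round(avg, 1))
```

This is the statistical reason the prototype averages over a time period before updating its display.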

  5. Nitrous oxide-based techniques versus nitrous oxide-free techniques for general anaesthesia.

    Science.gov (United States)

    Sun, Rao; Jia, Wen Qin; Zhang, Peng; Yang, KeHu; Tian, Jin Hui; Ma, Bin; Liu, Yali; Jia, Run H; Luo, Xiao F; Kuriyama, Akira

    2015-11-06

    Nitrous oxide has been used for over 160 years for the induction and maintenance of general anaesthesia. It has been used as a sole agent but is most often employed as part of a technique using other anaesthetic gases, intravenous agents, or both. Its low tissue solubility (and therefore rapid kinetics), low cost, and low rate of cardiorespiratory complications have made nitrous oxide by far the most commonly used general anaesthetic. The accumulating evidence regarding adverse effects of nitrous oxide administration has led many anaesthetists to question its continued routine use in a variety of operating room settings. Adverse events may result from both the biological actions of nitrous oxide and the fact that to deliver an effective dose, nitrous oxide, which is a relatively weak anaesthetic agent, needs to be given in high concentrations that restrict oxygen delivery (for example, a common mixture is 30% oxygen with 70% nitrous oxide). As well as the risk of low blood oxygen levels, concerns have also been raised regarding the risk of compromising the immune system, impaired cognition, postoperative cardiovascular complications, bowel obstruction from distention, and possible respiratory compromise. To determine if nitrous oxide-based anaesthesia results in similar outcomes to nitrous oxide-free anaesthesia in adults undergoing surgery. We searched the Cochrane Central Register of Controlled Trials (CENTRAL; 2014 Issue 10); MEDLINE (1966 to 17 October 2014); EMBASE (1974 to 17 October 2014); and ISI Web of Science (1974 to 17 October 2014). We also searched the reference lists of relevant articles, conference proceedings, and ongoing trials up to 17 October 2014 on specific websites (http://clinicaltrials.gov/, http://controlled-trials.com/, and http://www.centerwatch.com). We included randomized controlled trials (RCTs) comparing general anaesthesia where nitrous oxide was part of the anaesthetic technique used for the induction or maintenance of general

  6. Failure Mechanism of Rock Bridge Based on Acoustic Emission Technique

    Directory of Open Access Journals (Sweden)

    Guoqing Chen

    2015-01-01

Full Text Available The acoustic emission (AE) technique is widely used in various fields as a reliable nondestructive examination technology. Two experimental tests were carried out in a rock mechanics laboratory: (1) small-scale direct shear tests of rock bridges of different lengths and (2) a large-scale landslide model with a locked section. The relationship between AE event count and recording time was analyzed during the tests. AE source location was performed and compared with the actual failure mode. In both the small-scale tests and the large-scale landslide model test, the AE technique accurately located the AE source points, reflecting the generation and expansion of internal cracks in the rock samples. The large-scale landslide model test showed that a rock bridge in a rocky slope exhibits typical brittle failure behavior. The two tests based on the AE technique clearly revealed the rock failure mechanism in rocky slopes and clarified the cause of high-speed, long-distance sliding of rocky slopes.

  7. Radiation synthesized protein-based nanoparticles: A technique overview

    International Nuclear Information System (INIS)

    Varca, Gustavo H.C.; Perossi, Gabriela G.; Grasselli, Mariano; Lugão, Ademar B.

    2014-01-01

Seeking alternative routes for protein engineering, a novel technique – radiation-induced synthesis of protein nanoparticles – that achieves size-controlled particles with preserved bioactivity has recently been reported. This work aimed to evaluate different process conditions to optimize the technique and provide an overview of it using γ-irradiation. Papain was used as a model protease, and the samples were irradiated in a gamma-cell irradiator in phosphate buffer (pH=7.0) containing ethanol (0–35%). The dose effect was evaluated by exposure to distinct γ-irradiation doses (2.5, 5, 7.5 and 10 kGy), and scale-up experiments involving distinct protein concentrations (12.5–50 mg mL⁻¹) were also performed. Characterization involved size monitoring using dynamic light scattering. Bityrosine detection was performed using fluorescence measurements in order to provide experimental evidence of the mechanism involved. The best dose effects with regard to size were achieved at 10 kGy, and no relevant changes were observed as a function of papain concentration, highlighting a very broad operational concentration range. Bityrosine changes were identified in the samples as a function of the process, confirming that such linkages play an important role in nanoparticle formation. - Highlights: • Synthesis of protein-based nanoparticles by γ-irradiation. • Optimization of the technique. • Overview of mechanism involved in the nanoparticle formation. • Engineered papain nanoparticles for biomedical applications

  8. The Hannibal Community Survey; A Case Study in a Community Development Technique.

    Science.gov (United States)

    Croll, John A.

Disturbed by the community's negative attitude toward its prospects for progress, the Hannibal (Missouri) Chamber of Commerce initiated a community self-survey to improve the situation. The questionnaire survey concentrated on felt needs relating to city government, retail facilities and services, recreation, religion, education, industrial…

  9. Survey on Security Issues in Cloud Computing and Associated Mitigation Techniques

    Science.gov (United States)

    Bhadauria, Rohit; Sanyal, Sugata

    2012-06-01

Cloud computing holds the potential to eliminate the requirement to set up high-cost computing infrastructure for the IT-based solutions and services that industry uses. It promises to provide a flexible IT architecture, accessible through the internet from lightweight portable devices. This would allow a multi-fold increase in the capacity and capabilities of existing and new software. In a cloud computing environment, the entire data reside on a set of networked resources, enabling the data to be accessed through virtual machines. Since these data centers may lie in any corner of the world, beyond the reach and control of users, there are multifarious security and privacy challenges that need to be understood and addressed. Also, one can never deny the possibility of a server breakdown, which has been witnessed quite often in recent times. There are various issues that need to be dealt with regarding security and privacy in a cloud computing scenario. This extensive survey paper aims to elaborate and analyze the numerous unresolved issues threatening cloud computing adoption and diffusion, affecting the various stakeholders linked to it.

  10. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim

    2015-05-21

Base oils, blended for finished lubricant formulations, are classified by the American Petroleum Institute into five groups, viz., groups I-V. Groups I-III consist of petroleum-based hydrocarbons, whereas groups IV and V are made of synthetic polymers. In the present study, five base oil samples belonging to groups I and III were extensively characterized using high performance liquid chromatography (HPLC), comprehensive two-dimensional gas chromatography (GC×GC), and Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated, and then the available information was combined to reveal compositional details of the base oil samples studied. HPLC showed the overwhelming predominance of saturated over aromatic compounds in all five base oils. A similar trend was further corroborated using GC×GC, which yielded semiquantitative information on the compound classes present in the samples and provided further details on the carbon number distributions within these classes. In addition to the chromatography methods, FT-ICR MS supplemented the compositional information by resolving the aromatic compounds into alkyl- and naphtheno-substituted families. APCI proved more effective than APPI for the ionization of the highly saturated base oil components. Furthermore, in addition to detailed information on hydrocarbon molecules, FT-ICR MS revealed the presence of saturated and aromatic sulfur species in all base oil samples. The results presented herein offer a unique perspective into the detailed molecular structure of base oils typically used to formulate lubricants. © 2015 American Chemical Society.

  11. Study of capillary absorption kinetics by X-ray CT imaging techniques: a survey on sedimentary rocks of Sicily

    Directory of Open Access Journals (Sweden)

    Tiziano Schillaci

    2008-04-01

Full Text Available Sedimentary rocks are natural porous materials with a large percentage of microscopic interconnected pores: they contain fluids and permit their movement on a macroscopic scale. These rocks generally present higher porosity than metamorphic rocks. From some points of view this feature is an advantage; on the other hand, it can constitute an obstacle in cultural heritage applications, because the degree of porosity can lead to deterioration of stone monuments through capillary water absorption. In this paper, CT (computerized tomography) imaging techniques are applied to capillary absorption kinetics in the sedimentary rocks used for the Greek temples and the baroque monuments located in western and southeastern Sicily, respectively. Rocks were sampled near the archaeological areas of Agrigento, Segesta, Selinunte and Val di Noto. CT images were acquired at different times, before and after water contact, using image elaboration techniques during both the acquisition and post-processing phases. Water distribution in the pore space was evaluated on the basis of the Hounsfield number, estimated for the 3-D voxel structure of the samples. For most of the samples considered, assumptions based on the Handy model allow the average height of the wetting front to be correlated with the square root of time. Stochastic equations were introduced to describe the percolative behavior of water in heterogeneous samples, such as the Agrigento one. Before the CT acquisition, the capillary absorption kinetics was estimated by the gravimetric method. A petrographical characterization of the samples was performed by stereomicroscope observations, while porosity and pore morphology were surveyed by SEM (scanning electron microscope) images. Furthermore, the proposed methods have also permitted the definition of the penetration depth as well as the distribution uniformity of materials used for restoration and conservation of historical
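The square-root-of-time relation from the Handy model can be sketched as a least-squares fit of the no-intercept model h(t) = S·√t, where S is a sorptivity-like coefficient. The data points below are synthetic and only illustrate the calculation:

```python
import math

# Synthetic wetting-front heights, roughly following h = 2 * sqrt(t).
times = [1, 4, 9, 16, 25]            # elapsed time, minutes (illustrative)
heights = [2.1, 3.9, 6.2, 8.0, 9.9]  # average front height, mm (illustrative)

# Least-squares estimate for the no-intercept model h = S * sqrt(t):
#   S = sum(h_i * sqrt(t_i)) / sum(t_i)
num = sum(h * math.sqrt(t) for h, t in zip(heights, times))
den = sum(times)
S = num / den
print(round(S, 2))  # close to 2.0 for these synthetic points
```

A straight line of h against √t (slope S) is the usual diagnostic plot for capillary absorption of this kind; deviations from it signal the heterogeneous, percolative behavior the paper models stochastically.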

  12. On HTML and XML based web design and implementation techniques

    International Nuclear Information System (INIS)

    Bezboruah, B.; Kalita, M.

    2006-05-01

Web implementation is truly a multidisciplinary field, with influences from programming, the choice of scripting languages, graphic design, user interface design, and database design. The challenge for a Web designer/implementer is to create an attractive and informative Web site. To work with the universal framework and link diagrams from the design process, as well as the Web specifications and domain information, it is essential to create Hypertext Markup Language (HTML) or other software and multimedia to accomplish the site's objective. In this article we discuss Web design standards and the techniques involved in Web implementation based on HTML and Extensible Markup Language (XML). We also discuss the advantages and disadvantages of HTML relative to its successor XML for designing and implementing a Web site. We have developed two Web pages, one utilizing the features of HTML and the other based on the features of XML, to carry out the present investigation. (author)

  13. Efficient Identification Using a Prime-Feature-Based Technique

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Haq, Shaiq A.; Valente, Andrea

    2011-01-01

Identification of authorized train drivers through biometrics is a growing area of interest in locomotive radio remote control systems. The existing technique of password authentication is not very reliable, and potentially unauthorized personnel may also operate the system on behalf of the operator. A fingerprint identification system, implemented on PC/104-based real-time systems, can accurately identify the operator. Traditionally, the uniqueness of a fingerprint is determined by the overall pattern of ridges and valleys as well as local ridge anomalies, e.g., a ridge bifurcation or a ridge ending, which are called minutiae points. Designing a reliable automatic fingerprint matching algorithm for a minimal platform is quite challenging. In real-time systems, efficiency of the matching algorithm is of utmost importance. To achieve this goal, a prime-feature-based indexing algorithm is proposed.

  14. Designing on ICT reconstruction software based on DSP techniques

    International Nuclear Information System (INIS)

    Liu Jinhui; Xiang Xincheng

    2006-01-01

The convolution back-projection (CBP) algorithm is generally used for CT image reconstruction in ICT, and is usually performed on a PC or workstation. To add multi-platform capability to CT reconstruction software, a CT reconstruction method based on modern digital signal processor (DSP) techniques is proposed and realized in this paper. A hardware system based on TI's C6701 DSP processor was selected to support the software development. The CT reconstruction software was written entirely in assembly language specific to the DSP hardware. The software can be run on TI's C6701 EVM board with the CT data as input, and produces CT images that satisfy practical demands. (authors)
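The computational core of CBP, and the part that dominates the DSP workload, is a 1-D convolution of each projection with a reconstruction filter before back-projection. A minimal sketch of that filtering step follows; the kernel values are illustrative, not a real Ram-Lak filter:

```python
def convolve(signal, kernel):
    """Direct-form FIR convolution: the inner loop DSPs are built to accelerate
    (one multiply-accumulate per tap per sample)."""
    n, m = len(signal), len(kernel)
    out = [0.0] * (n + m - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

projection = [0, 1, 2, 1, 0]   # one detector row of a projection (toy data)
kernel = [0.5, 1.0, 0.5]       # illustrative smoothing kernel, not Ram-Lak
print(convolve(projection, kernel))  # [0.0, 0.5, 2.0, 3.0, 2.0, 0.5, 0.0]
```

On a C6701-class DSP the same multiply-accumulate loop would be hand-scheduled in assembly to exploit the parallel functional units, which is what the paper's implementation does.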

  15. New calibration technique for KCD-based megavoltage imaging

    Science.gov (United States)

    Samant, Sanjiv S.; Zheng, Wei; DiBianca, Frank A.; Zeman, Herbert D.; Laughter, Joseph S.

    1999-05-01

In megavoltage imaging, current commercial electronic portal imaging devices (EPIDs), despite having the advantage of immediate digital imaging over film, suffer from poor image contrast and spatial resolution. The feasibility of using a kinestatic charge detector (KCD) as an EPID to provide superior image contrast and spatial resolution for portal imaging was demonstrated in a previous paper. The KCD system had the additional advantage of requiring an extremely low dose per acquired image, allowing superior images to be reconstructed from a single linac pulse per image pixel. The KCD-based images used a dose two orders of magnitude lower than that for EPIDs and film. Compared with current commercial EPIDs and film, the prototype KCD system exhibited promising image quality, despite being handicapped by the use of a relatively simple image calibration technique and by the performance limits of medical linacs on the maximum linac pulse frequency and energy flux per pulse delivered. This image calibration technique fixed relative image pixel values based on a linear interpolation of extrema provided by an air-water calibration, and accounted only for channel-to-channel variations. Its counterpart for area detectors is the standard flat-fielding method. A comprehensive calibration protocol has now been developed. The new technique additionally corrects for geometric distortions due to variations in the scan velocity, and for timing artifacts caused by mis-synchronization between the linear accelerator and the data acquisition system (DAS). The role of variations in energy flux (2 - 3%) in imaging is demonstrated to be insignificant for the images considered. The methodology is presented, and the results are discussed for simulated images. It also allows for significant improvements in the signal-to-noise ratio (SNR) by increasing the dose using multiple images, without having to increase the linac pulse frequency or energy flux per pulse.

  16. Risk-based maintenance-Techniques and applications

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2007-01-01

Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to use knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more intensively, in order to achieve tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making to reduce both the probability of equipment failure and the consequences of failure. In this paper, risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary for proper risk-based maintenance decisions.

  17. Generation of Quasi-Gaussian Pulses Based on Correlation Techniques

    Directory of Open Access Journals (Sweden)

    POHOATA, S.

    2012-02-01

Full Text Available Gaussian pulses have mostly been used within communications, where some applications can be emphasized: mobile telephony (GSM), where GMSK signals are used, as well as UWB communications, where short-period pulses based on the Gaussian waveform are generated. Since the ideal Gaussian function is a theoretical concept that cannot be realized physically, it must be approximated by functions that admit physical implementation. New techniques for generating Gaussian pulse responses of good precision are proposed and investigated in this paper. The second- and third-order derivatives of the Gaussian pulse response are accurately generated. The third-order derivative is composed of four individual rectangular pulses of fixed amplitude, which are easily generated by standard techniques. In order to generate pulses able to satisfy spectral mask requirements, an adequate filter must be applied. This paper presents a comparative analysis based on the relative error and the energy spectra of the proposed pulses.
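As a sketch of the waveforms discussed above, the Gaussian and its second and third derivatives can be evaluated analytically; here sigma is an assumed pulse width, and the paper's hardware instead approximates the third derivative with four fixed-amplitude rectangular pulses:

```python
import math

def gauss(t, sigma=1.0):
    """Unit-amplitude Gaussian pulse g(t) = exp(-t^2 / (2 sigma^2))."""
    return math.exp(-t * t / (2 * sigma * sigma))

def gauss_d2(t, sigma=1.0):
    """Second derivative: g''(t) = (t^2/sigma^4 - 1/sigma^2) g(t)."""
    s2 = sigma * sigma
    return (t * t / (s2 * s2) - 1 / s2) * gauss(t, sigma)

def gauss_d3(t, sigma=1.0):
    """Third derivative: g'''(t) = (3t/sigma^4 - t^3/sigma^6) g(t)."""
    s2 = sigma * sigma
    return (3 * t / (s2 * s2) - t ** 3 / (s2 ** 3)) * gauss(t, sigma)

# The third derivative is an odd function: zero at t = 0, antisymmetric
# about the origin - hence the four-lobe shape the rectangles approximate.
print(gauss_d3(0.0))        # 0.0
print(gauss_d3(1.0) > 0.0)  # True
print(gauss_d3(-1.0) < 0.0) # True
```

The four lobes of g''' (two positive, two negative) are what motivate the four-rectangle approximation: each rectangle stands in for one lobe.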

  18. Avian survey and field guide for Osan Air Base, Korea.

    Energy Technology Data Exchange (ETDEWEB)

    Levenson, J.

    2006-12-05

    This report summarizes the results of the avian surveys conducted at Osan Air Base (AB). This ongoing survey is conducted to comply with requirements of the Environmental Governing Standards (EGS) for the Republic of Korea, the Integrated Natural Resources Management Plan (INRMP) for Osan AB, and the 51st Fighter Wing's Bird Aircraft Strike Hazard (BASH) Plan. One hundred ten bird species representing 35 families were identified and recorded. Seven species are designated as Natural Monuments, and their protection is accorded by the Korean Ministry of Culture and Tourism. Three species appear on the Korean Association for Conservation of Nature's (KACN's) list of Reserved Wild Species and are protected by the Korean Ministry of Environment. Combined, ten different species are Republic of Korea (ROK)-protected. The primary objective of the avian survey at Osan AB was to determine what species of birds are present on the airfield and their respective habitat requirements during the critical seasons of the year. This requirement is specified in Annex J.14.c of the 51st Fighter BASH Plan 91-212 (51 FW OPLAN 91-212). The second objective was to initiate surveys to determine what bird species are present on Osan AB throughout the year and from the survey results, determine if threatened, endangered, or other Korean-listed bird species are present on Osan AB. This overall census satisfies Criterion 13-3.e of the EGS for Korea. The final objective was to formulate management strategies within Osan AB's operational requirements to protect and enhance habitats of known threatened, endangered, and ROK-protected species in accordance with EGS Criterion 13-3.a that are also favorable for the reproduction of indigenous species in accordance with the EGS Criterion 13-3.h.

  19. A Twitter-based survey on marijuana concentrate use.

    Science.gov (United States)

    Daniulaityte, Raminta; Zatreh, Mussa Y; Lamy, Francois R; Nahhas, Ramzi W; Martins, Silvia S; Sheth, Amit; Carlson, Robert G

    2018-04-11

    The purpose of this paper is to analyze characteristics of marijuana concentrate users, describe patterns and reasons of use, and identify factors associated with daily use of concentrates among U.S.-based cannabis users recruited via a Twitter-based online survey. An anonymous Web-based survey was conducted in June 2017 with 687 U.S.-based cannabis users recruited via Twitter-based ads. The survey included questions about state of residence, socio-demographic characteristics, and cannabis use including marijuana concentrates. Multiple logistic regression analyses were conducted to identify characteristics associated with lifetime and daily use of marijuana concentrates. Almost 60% of respondents were male, 86% were white, and the mean age was 43.0 years. About 48% reported marijuana concentrate use. After adjusting for multiple testing, significant predictors of concentrate use included: living in "recreational" (AOR = 2.04; adj. p = .042) or "medical, less restrictive" (AOR = 1.74; adj. p = .030) states, being younger (AOR = 0.97, adj. p = .008), and daily herbal cannabis use (AOR = 2.57, adj. p = .008). Out of 329 marijuana concentrate users, about 13% (n = 44) reported daily/near daily use. Significant predictors of daily concentrate use included: living in recreational states (AOR = 3.59, adj. p = .020) and using concentrates for therapeutic purposes (AOR = 4.34, adj. p = .020). Living in states with more liberal marijuana policies is associated with greater likelihood of marijuana concentrate use and with more frequent use. Characteristics of daily users, in particular, patterns of therapeutic use warrant further research with community-recruited samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Population-based absolute risk estimation with survey data

    Science.gov (United States)

    Kovalchik, Stephanie A.; Pfeiffer, Ruth M.

    2013-01-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
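As a toy illustration of absolute risk under competing events, assume constant cause-specific hazards h1 and h2 (a single-piece version of the paper's piecewise exponential baseline). The absolute risk of cause 1 by time T then has the closed form AR1 = h1/(h1+h2) · (1 − e^(−(h1+h2)T)). The hazard values below are made up for illustration:

```python
import math

def absolute_risk(h1, h2, T):
    """Absolute risk of cause 1 by time T with constant competing hazards.

    Integrates h1 * exp(-(h1 + h2) * t) over [0, T], which accounts for the
    chance of the competing event (cause 2) removing a person from risk.
    """
    total = h1 + h2
    return h1 / total * (1.0 - math.exp(-total * T))

h_cvd, h_cancer = 0.02, 0.01   # assumed annual cause-specific hazards
T = 10                          # 10-year horizon
print(round(absolute_risk(h_cvd, h_cancer, T), 4))  # 0.1728
```

Note that this is strictly less than the naive 1 − e^(−h1·T), because competing deaths from cause 2 prevent some cause-1 events from ever occurring; that correction is the essence of the competing-risks formulation.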

  1. Explaining discrepancies in reproductive health indicators from population-based surveys and exit surveys: a case from Rwanda.

    Science.gov (United States)

    Meekers, D; Ogada, E A

    2001-06-01

Reproductive health programmes often need exit surveys and population-based surveys for monitoring and evaluation. This study investigates why such studies produce discrepant estimates of condom use, sexual behaviour and condom brand knowledge, and discusses the implications for the future use of exit surveys for programme monitoring. Logistic regression is used to explain differences between a household survey of 1295 persons and an exit survey among a random sample of 2550 consumers at retail outlets in Rwanda. Discrepancies in ever-use of condoms and risky sexual behaviours are due to differences in the socioeconomic status of the two samples. After controls, exit surveys at most outlet types give the same results as the household survey. Only exit surveys at bars, nightclubs and hotels yield significantly different estimates. However, the above-average knowledge of Prudence Plus condoms in the exit interviews is not attributable to socioeconomic or demographic variables, most likely because respondents have seen the product at the outlets. Information about condom use and sexual behaviour obtained from exit surveys appears as accurate as that obtained through household surveys. Nevertheless, exit surveys must be used cautiously. Because exit surveys may include wealthier and better-educated respondents, they are not representative of the general population. The composition of exit survey samples should be validated through existing household surveys. Comparisons across survey types are generally inadvisable unless they control for sample differences. When generalizing to the population at large is not needed (e.g. for studies aimed at identifying the characteristics and behaviour of users of particular products or services), exit surveys can provide an appropriate alternative to household surveys.

  2. A Novel Technique for Steganography Method Based on Improved Genetic Algorithm Optimization in Spatial Domain

    Directory of Open Access Journals (Sweden)

    M. Soleimanpour-moghadam

    2013-06-01

    Full Text Available This paper devotes itself to the study of secret message delivery using a cover image and introduces a novel steganographic technique based on a genetic algorithm to find a near-optimum structure for the pair-wise least-significant-bit (LSB) matching scheme. A survey of the related literature shows that the LSB matching method developed by Mielikainen employs a binary function to reduce the number of changes of LSB values. This method verifiably reduces the probability of detection and also improves the visual quality of stego images. Our proposal therefore draws on Mielikainen's technique to present an enhanced dual-state scoring model, structured upon a genetic algorithm, which assesses the performance of different orders for LSB matching and searches for a near-optimum solution among all the permutation orders. Experimental results confirm the superiority of the new approach compared to Mielikainen's pair-wise LSB matching scheme.
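    The pair-wise LSB matching scheme the abstract builds on can be illustrated in a few lines. The sketch below (helper names are ours) embeds two message bits into a pixel pair while changing at most one pixel by ±1, using Mielikainen's binary function f(a, b) = LSB(⌊a/2⌋ + b); clipping at the 0/255 grayscale boundaries is omitted for brevity.

```python
def lsb(v):
    return v & 1

def f(a, b):
    # Mielikainen's binary function: LSB of (floor(a/2) + b)
    return ((a // 2) + b) & 1

def embed_pair(x1, x2, m1, m2):
    """Embed bits (m1, m2) into pixel pair (x1, x2) with at most one +/-1 change."""
    if lsb(x1) == m1:
        if f(x1, x2) != m2:
            x2 += 1            # any +/-1 change of x2 flips f(x1, x2)
    else:
        # flipping LSB(x1) fixes m1; the sign of the change controls f
        if f(x1 - 1, x2) == m2:
            x1 -= 1
        else:
            x1 += 1
    return x1, x2

def extract_pair(x1, x2):
    return lsb(x1), f(x1, x2)
```

    A genetic algorithm, as proposed above, would then search over the order in which pixel pairs are visited to minimize a distortion score.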

  3. A SURVEY ON DELAY AND NEIGHBOR NODE MONITORING BASED WORMHOLE ATTACK PREVENTION AND DETECTION

    Directory of Open Access Journals (Sweden)

    Sudhir T Bagade

    2016-12-01

    Full Text Available In Mobile Ad-hoc Networks (MANET), network layer attacks, for example wormhole attacks, disrupt the network routing operations and can be used for data theft. Wormhole attacks are of two types: hidden and exposed wormhole. There are various mechanisms in the literature which are used to prevent and detect wormhole attacks. In this paper, we survey wormhole prevention and detection techniques and present our critical observations for each. These techniques are based on cryptographic mechanisms, monitoring of packet transmission delay, and control packet forwarding behavior of neighbor nodes. We compare the techniques using the following criteria: extra resources needed, applicability to different network topologies and routing protocols, prevention/detection capability, etc. We conclude the paper with potential research directions.
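    The delay-monitoring idea surveyed above can be sketched in a few lines: a wormhole tunnels a physically long path but advertises it as very few hops, so its per-hop delay stands out. The route records and threshold below are hypothetical.

```python
def per_hop_delay(rtt_ms, hops):
    """Estimated one-way delay per hop from a measured route round-trip time."""
    return rtt_ms / (2 * hops)

def flag_wormholes(routes, max_hop_delay_ms=2.0):
    """Flag routes whose per-hop delay exceeds a plausibility threshold."""
    return [r["id"] for r in routes
            if per_hop_delay(r["rtt_ms"], r["hops"]) > max_hop_delay_ms]

routes = [
    {"id": "A-B-C-D", "rtt_ms": 12.0, "hops": 3},  # 2.0 ms/hop: plausible
    {"id": "A-W-D",   "rtt_ms": 40.0, "hops": 2},  # 10.0 ms/hop: suspect tunnel
]
suspects = flag_wormholes(routes)
```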

  4. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Science.gov (United States)

    Hernandez, Wilmar

    2007-01-01

    In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is presented. Classical filters are compared with optimal filters for automotive sensors, and the current state of the art in applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way forward. However, the switch from the traditional methods of designing automotive sensors to the new ones cannot be made overnight, because some open research issues still have to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
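    As a concrete contrast between a raw sensor reading and an optimal filter, the sketch below runs a scalar Kalman filter over a noisy constant measurement; the random-walk model and noise parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=1.0, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a nearly constant signal observed in noise.

    q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    out = []
    for z in measurements:
        p = p + q                # predict (random-walk state model)
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # measurement update
        p = (1 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(0)
true_value = 5.0
z = true_value + rng.normal(0.0, 1.0, 300)   # noisy sensor samples
est = kalman_1d(z, q=1e-4, r=1.0)
```

    The filtered track settles near the true value with far less variance than the raw samples, which is the kind of gain the surveyed optimal techniques exploit.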

  5. Refractive index sensor based on optical fiber end face using pulse reference-based compensation technique

    Science.gov (United States)

    Bian, Qiang; Song, Zhangqi; Zhang, Xueliang; Yu, Yang; Chen, Yuzhong

    2018-03-01

    We propose a refractive index sensor based on the optical fiber end face using a pulse reference-based compensation technique. With the good compensation effect of this technique, fluctuations in the light source power and changes in the transmission loss of optic components and in the coupler splitting ratio can be compensated, which largely reduces the background noise. Refractive index resolutions of 3.8 × 10⁻⁶ RIU and 1.6 × 10⁻⁶ RIU are achieved in different refractive index regions.
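    The core of any reference-based compensation scheme can be shown in a toy model: both the sensing and reference channels see the same source power fluctuation, so their ratio depends only on the (stable) path transmissions. All numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
source_power = 1.0 + 0.05 * rng.standard_normal(n)  # fluctuating source

t_signal = 0.42       # transmission of the sensing path (index dependent)
t_reference = 0.80    # transmission of the stable reference path

signal = source_power * t_signal
reference = source_power * t_reference
compensated = signal / reference   # the source fluctuation cancels exactly
```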

  6. Enhancing the effectiveness of IST through risk-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  7. Digital 3D Borobudur – Integration of 3D surveying and modeling techniques

    Directory of Open Access Journals (Sweden)

    D. Suwardhi

    2015-08-01

    Full Text Available The Borobudur temple (Indonesia) is one of the greatest Buddhist monuments in the world, now listed as a UNESCO World Heritage Site. The present state of the temple is the result of restorations carried out after it was exposed to natural disasters several times. Today there is still a growing rate of deterioration of the building stones, whose causes need further research. Monitoring programs, supported at institutional level, have been effectively executed to observe the problem. The paper presents the latest efforts to digitally document the Borobudur Temple and its surrounding area in 3D with photogrammetric techniques. UAV and terrestrial images were acquired to completely digitize the temple and produce DEMs, orthoimages and maps at 1:100 and 1:1000 scales. The results of the project are now employed by the local government organizations to manage the heritage area and plan new policies for the conservation and preservation of the UNESCO site. In order to help data management and policy makers, a web-based information system of the heritage area was also built to visualize and easily access all the data and the achieved 3D results.

  8. A Study on Integrated Community Based Flood Mitigation with Remote Sensing Technique in Kota Bharu, Kelantan

    International Nuclear Information System (INIS)

    Ainullotfi, A A; Ibrahim, A L; Masron, T

    2014-01-01

    This study is conducted to establish a community based flood management system integrated with remote sensing techniques. To understand local knowledge, the demographics of the local society are obtained using a survey approach. The local authorities are approached first to obtain information about the society in the study areas, such as the population, gender, and the distribution of settlements. Information about age, religion, ethnicity, occupation, and years of experience facing floods in the area is recorded to understand how local knowledge emerges. Geographic data such as rainfall, land use, land elevation, and river discharge are then obtained; this information is used to establish a hydrological model of floods in the study area. Analyses of the survey data are made to understand the patterns of society and how people react to floods, while analyses of the geographic data are used to assess the water extent and damage done by floods. The final result of this research is a flood mitigation method with a community based framework for the state of Kelantan. With flood mitigation that draws on the community's understanding of floods, together with remote sensing techniques to forecast heavy rainfall and flood occurrence, it is hoped that casualties and damage to society and infrastructure in the study area can be reduced.

  9. Frequency-Wavenumber (FK)-Based Data Selection in High-Frequency Passive Surface Wave Survey

    Science.gov (United States)

    Cheng, Feng; Xia, Jianghai; Xu, Zongbo; Hu, Yue; Mi, Binbin

    2018-04-01

    Passive surface wave methods have gained much attention from geophysical and civil engineering communities because of the limited application of traditional seismic surveys in highly populated urban areas. Considering that they can provide high-frequency phase velocity information up to several tens of Hz, the active surface wave survey could be omitted and the amount of field work dramatically reduced. However, the measured dispersion energy image in the passive surface wave survey would usually be polluted by a type of "crossed" artifacts at high frequencies. This is common in the bidirectional noise distribution case with a linear receiver array deployed along roads or railways. We review several frequently used passive surface wave methods and derive the underlying physics for the existence of the "crossed" artifacts. We prove that the "crossed" artifacts would cross the true surface wave energy at fixed points in the f-v domain and propose an FK-based data selection technique to attenuate the artifacts in order to retrieve the high-frequency information. Numerical tests further demonstrate the existence of the "crossed" artifacts and indicate that the well-known wave field separation method, the FK filter, does not work for the selection of directional noise data. Real-world applications manifest the feasibility of the proposed FK-based technique to improve passive surface wave methods by a priori data selection. Finally, we discuss the applicability of our approach.
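    The directional selection described above can be sketched with a 2-D Fourier transform: events travelling in one direction map into one half of the f-k plane, so zeroing the other half keeps only that direction. This is a minimal illustration with a synthetic gather, not the authors' full selection procedure.

```python
import numpy as np

# Synthetic gather: one spike per trace with linear moveout
# (nt time samples as rows, nx traces as columns)
nt, nx = 64, 16
gather = np.zeros((nt, nx))
for ix in range(nx):
    gather[5 + 2 * ix, ix] = 1.0   # event arrives later on farther traces

# f-k spectrum: FFT over time (rows) and space (columns)
fk = np.fft.fft2(gather)
k = np.fft.fftfreq(nx)             # normalized wavenumber axis

# Keep only non-negative wavenumbers, i.e. one propagation direction
mask = np.zeros(nx)
mask[k >= 0] = 1.0
filtered = np.fft.ifft2(fk * mask[None, :]).real
```

    In the surveyed application the retained half-plane would be chosen from the dominant noise direction before dispersion imaging.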

  11. Detecting Molecular Properties by Various Laser-Based Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hsin, Tse-Ming [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy, and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer and the corresponding energy transfer time is ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of the dimer geometry. Direct observation of vibrational peaks and evolution of coumarin 153 in the electronic excited state was demonstrated by using the fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile, and butanol, a vibration peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with biomimetic containers (liposomes), allows the measurement of the enzymatic activity of individual alkaline phosphatase from bovine intestinal mucosa without potential interferences from glass surfaces. The result showed a wide distribution of the enzyme reactivity. Protein structural variation is one of the major reasons that are responsible for this highly heterogeneous behavior.

  12. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
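    A minimal sketch of the idea: cumulate the residuals of a fitted model along a covariate and compare the observed supremum to realizations of a zero-mean process obtained by perturbing the residuals with standard normal multipliers. This is a simplified stand-in for the simulation scheme described above, on synthetic data from a correctly specified linear model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)   # correctly specified model

# Fit y = b0 + b1*x and cumulate residuals ordered by the covariate
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
order = np.argsort(x)
W = np.cumsum(resid[order]) / np.sqrt(n)     # observed cumulative process
stat = np.max(np.abs(W))                     # supremum statistic

# Null realizations: multiply residuals by independent N(0,1) variables
reps = 500
sups = np.empty(reps)
for i in range(reps):
    g = rng.standard_normal(n)
    sups[i] = np.max(np.abs(np.cumsum(resid[order] * g) / np.sqrt(n)))
p_value = np.mean(sups >= stat)              # large stat -> small p -> misfit
```

    A covariate omitted from the model would inflate `stat` relative to the simulated suprema, which is exactly the graphical-plus-numerical comparison the paper formalizes.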

  13. Comparison of acrylamide intake from Western and guideline based diets using probabilistic techniques and linear programming.

    Science.gov (United States)

    Katz, Josh M; Winter, Carl K; Buttrey, Samuel E; Fadel, James G

    2012-03-01

    Western and guideline based diets were compared to determine if dietary improvements resulting from following dietary guidelines reduce acrylamide intake. Acrylamide forms in heat treated foods and is a human neurotoxin and animal carcinogen. Acrylamide intake from the Western diet was estimated with probabilistic techniques using teenage (13-19 years) National Health and Nutrition Examination Survey (NHANES) food consumption estimates combined with FDA data on the levels of acrylamide in a large number of foods. Guideline based diets were derived from NHANES data using linear programming techniques to comport to recommendations from the Dietary Guidelines for Americans, 2005. Whereas the guideline based diets were more properly balanced and rich in consumption of fruits, vegetables, and other dietary components than the Western diets, acrylamide intake (mean±SE) was significantly greater from the guideline based diets than from the Western diets. The results demonstrate that linear programming techniques can be used to model specific diets for the assessment of toxicological and nutritional dietary components. Copyright © 2011 Elsevier Ltd. All rights reserved.
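    To make the linear-programming step concrete, here is a toy two-food "diet problem" (minimize cost subject to minimum nutrient intakes), solved by enumerating the vertices of the feasible region, which is adequate at this size; a real diet model would use an LP solver. All foods, costs, and nutrient values are hypothetical.

```python
import numpy as np
from itertools import combinations

# Minimize c.x subject to A x >= b, x >= 0 (servings of two foods)
c = np.array([1.0, 0.8])           # cost per serving
A = np.array([[2.0, 1.0],          # calorie units per serving (need >= 4)
              [1.0, 3.0]])         # vitamin units per serving (need >= 3)
b = np.array([4.0, 3.0])

# Candidate vertices: intersections of pairs drawn from the two
# constraint lines and the axes x1 = 0, x2 = 0.
lines = [(A[0], b[0]), (A[1], b[1]),
         (np.array([1.0, 0.0]), 0.0), (np.array([0.0, 1.0]), 0.0)]
best, best_cost = None, np.inf
for (a1, b1), (a2, b2) in combinations(lines, 2):
    M = np.array([a1, a2])
    if abs(np.linalg.det(M)) < 1e-12:
        continue                    # parallel lines, no vertex
    v = np.linalg.solve(M, np.array([b1, b2]))
    if np.all(v >= -1e-9) and np.all(A @ v >= b - 1e-9):
        cost = float(c @ v)
        if cost < best_cost:
            best, best_cost = v, cost
```

    Here the optimum is 1.8 servings of food 1 and 0.4 of food 2; NHANES-scale diet models work the same way with many more foods and constraints.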

  14. Resident fatigue in otolaryngology residents: a Web based survey.

    Science.gov (United States)

    Nida, Andrew M; Googe, Benjamin J; Lewis, Andrea F; May, Warren L

    2016-01-01

    Resident fatigue has become a point of emphasis in medical education, and its effects on otolaryngology residents and their patients require further study. The purpose of our study was to evaluate the prevalence and nature of fatigue in otolaryngology residents, evaluate various quality of life measures, and investigate associations between increased fatigue and resident safety. The study was an anonymous, Internet-based survey of United States allopathic otolaryngology residents. The survey topics included demographics, residency structure, sleep habits and perceived stress. Responses were correlated with a concurrent Epworth Sleep Scale questionnaire to evaluate the effects of fatigue on resident training and quality of life. 190 residents responded to the survey, with 178 completing the Epworth Sleep Scale questionnaire. Results revealed a mean Epworth Sleep Scale score of 9.9±5.1 with a median of 10.0, indicating that a significant number of otolaryngology residents are excessively sleepy. Statistically significant correlations between Epworth Sleep Scale score and sex, region, hours of sleep, and work hours were found. Residents taking in-house call had significantly fewer hours of sleep compared to home call (p=0.01). Residents on "head and neck" rotations (typically consisting of a large proportion of head and neck oncologic surgery) tended to have higher Epworth Sleep Scale scores and had significantly fewer hours of sleep (p=.003) and greater work hours. Our data suggest that the effects of fatigue play a role in resident well-being and resident safety. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Use of cognitive interview techniques in the development of nutrition surveys and interactive nutrition messages for low-income populations.

    Science.gov (United States)

    Carbone, Elena T; Campbell, Marci K; Honess-Morreale, Lauren

    2002-05-01

    The effectiveness of dietary surveys and educational messages is dependent in part on how well the target audience's information processing needs and abilities are addressed. Use of pilot testing is helpful; however, problems with wording and language are often not revealed. Cognitive interview techniques offer one approach to assist dietitians in understanding how audiences process information. With this method, respondents are led through a survey or message and asked to paraphrase items; discuss thoughts, feelings, and ideas that come to mind; and suggest alternative wording. As part of a US Department of Agriculture-funded nutrition education project, 23 cognitive interviews were conducted among technical community college students in North Carolina. Interview findings informed the development of tailored computer messages and survey questions. Better understanding of respondents' cognitive processes significantly improved the language and approach used in this intervention. Interview data indicated 4 problem areas: vague or ineffective instructions, confusing questions and response options, variable interpretation of terms, and misinterpretation of dietary recommendations. Interviews also provided insight into the meaning of diet-related stages of change. These findings concur with previous research suggesting that cognitive interview techniques are a valuable tool in the formative evaluation and development of nutrition surveys and materials.

  16. A Survey on Smartphone-Based Crowdsensing Solutions

    Directory of Open Access Journals (Sweden)

    Willian Zamora

    2016-01-01

    Full Text Available In recent years, the widespread adoption of mobile phones, combined with the ever-increasing number of sensors that smartphones are equipped with, greatly simplified the generalized adoption of crowdsensing solutions by reducing hardware requirements and costs to a minimum. These factors have led to an outstanding growth of crowdsensing proposals from both academia and industry. In this paper, we provide a survey of smartphone-based crowdsensing solutions that have emerged in the past few years, focusing on 64 works published in top-ranked journals and conferences. To properly analyze these previous works, we first define a reference framework based on how we classify the different proposals under study. The results of our survey evidence that there is still much heterogeneity in terms of technologies adopted and deployment approaches, although modular designs at both client and server elements seem to be dominant. Also, the preferred client platform is Android, while server platforms are typically web-based, and client-server communications mostly rely on XML or JSON over HTTP. The main detected pitfall concerns the performance evaluation of the different proposals, which typically fail to make a scalability analysis, despite this being a critical issue when targeting very large communities of users.

  17. Determination of rock fragmentation based on a photographic technique

    International Nuclear Information System (INIS)

    Dehgan Banadaki, M.M.; Majdi, A.; Raessi Gahrooei, D.

    2002-01-01

    The paper presents a physical blasting model at laboratory scale, along with a photographic approach to describe the size distribution of blasted rock materials. For this purpose, based on the Weibull probability distribution function, eight samples, each weighing 100 kg, were obtained. Four pictures of four different sections of each sample were taken. The pictures were then converted into graphic files, with the boundary of each piece of rock in the samples characterized. Errors caused by perspective were eliminated. The volume of each piece of the blasted rock material, and hence the sieve size required for each piece of rock to pass through, were calculated. Finally, the original blasted rock size distribution was compared with that obtained from the photographic method. The paper concludes by presenting an approach to convert the results of the photographic technique into the size distribution obtained by sieve analysis, with sufficient verification.
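    Blast fragmentation curves of the kind compared here are commonly summarized with the Rosin-Rammler (Weibull) form P(x) = 1 − exp(−(x/x_c)^n). The sketch below evaluates such a cumulative passing curve with hypothetical parameters.

```python
import math

def passing_fraction(x, x_c, n):
    """Rosin-Rammler (Weibull) cumulative fraction passing sieve size x.

    x_c: characteristic size (63.2% passing), n: uniformity index."""
    return 1.0 - math.exp(-((x / x_c) ** n))

sizes = [50, 100, 200, 400, 800]   # sieve sizes in mm (hypothetical)
x_c, n = 200.0, 1.2                # hypothetical fit parameters
curve = [passing_fraction(s, x_c, n) for s in sizes]
```

    Fitting x_c and n to the photographically derived volumes gives a direct comparison against the sieve-analysis distribution.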

  18. Whitelists Based Multiple Filtering Techniques in SCADA Sensor Networks

    Directory of Open Access Journals (Sweden)

    DongHo Kang

    2014-01-01

    Full Text Available The Internet of Things (IoT) consists of several tiny devices connected together to form a collaborative computing environment. Recently, IoT technologies have begun to merge with supervisory control and data acquisition (SCADA) sensor networks to more efficiently gather and analyze real-time data from sensors in industrial environments. But SCADA sensor networks are becoming more and more vulnerable to cyber-attacks due to increased connectivity. To safely adopt IoT technologies in SCADA environments, it is important to improve the security of SCADA sensor networks. In this paper we propose a multiple filtering technique based on whitelists to detect illegitimate packets. Our proposed system detects the traffic of network and application protocol attacks with a set of whitelists collected from normal traffic.
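    The whitelist-based multiple filtering idea can be sketched as two ordered checks, first on the protocol and then on the full flow tuple. All addresses and port numbers below are hypothetical examples of learned "normal" SCADA traffic, not from the paper.

```python
# Hypothetical whitelist of allowed flows: (src, dst, port, proto)
WHITELIST_FLOWS = {
    ("10.0.0.5", "10.0.0.1", 502, "tcp"),    # e.g. Modbus/TCP master -> PLC
    ("10.0.0.6", "10.0.0.1", 20000, "tcp"),  # e.g. DNP3 master -> outstation
}
WHITELIST_PROTOCOLS = {"tcp", "udp"}

def filter_packet(pkt):
    """Multiple filtering: protocol whitelist first, then flow whitelist."""
    if pkt["proto"] not in WHITELIST_PROTOCOLS:
        return "drop"
    flow = (pkt["src"], pkt["dst"], pkt["port"], pkt["proto"])
    return "accept" if flow in WHITELIST_FLOWS else "drop"
```

    Anything not observed during the learning phase is dropped by default, which suits the highly regular traffic of industrial networks.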

  19. Demand Management Based on Model Predictive Control Techniques

    Directory of Open Access Journals (Sweden)

    Yasser A. Davizón

    2014-01-01

    Full Text Available Demand management (DM) is the process that helps companies sell the right product to the right customer, at the right time, and for the right price. Therefore, the challenge for any company is to determine how much to sell, at what price, and to which market segment, while maximizing its profits. DM also helps managers efficiently allocate undifferentiated units of capacity to the available demand with the goal of maximizing revenue. This paper introduces a control system approach to demand management with dynamic pricing (DP) using the model predictive control (MPC) technique. In addition, we present a dynamical system analogy based on active suspension, and a stability analysis is provided via the Lyapunov direct method.
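    A minimal receding-horizon sketch of the MPC idea on a scalar demand model (the model, horizon, and weights are illustrative assumptions, not the paper's formulation): at each step the full-horizon quadratic cost is minimized, only the first input is applied, and the optimization repeats.

```python
import numpy as np

# Hypothetical demand model: x_{k+1} = a*x_k + b*u_k, where x is demand
# and u is a pricing action; track a demand reference over a horizon.
a, b = 0.9, 0.5
N, r = 10, 0.01                       # horizon length, input penalty
ref = 1.0

# Prediction matrices for the stacked horizon: x = F*x0 + G*u
F = np.array([a ** (i + 1) for i in range(N)])
G = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        G[i, j] = a ** (i - j) * b

x = 0.0
for _ in range(30):                   # receding-horizon loop
    # Unconstrained minimizer of ||F*x0 + G*u - ref||^2 + r*||u||^2
    rhs = G.T @ (ref * np.ones(N) - F * x)
    u = np.linalg.solve(G.T @ G + r * np.eye(N), rhs)
    x = a * x + b * u[0]              # apply only the first input
```

    Constraints on price moves or capacity would turn each step into a small quadratic program, but the receding-horizon structure is unchanged.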

  20. Clustering economies based on multiple criteria decision making techniques

    Directory of Open Access Journals (Sweden)

    Mansour Momeni

    2011-10-01

    Full Text Available One of the primary concerns in many countries is to determine the important factors affecting economic growth. In this paper, we study factors such as the unemployment rate, inflation ratio, population growth, and average annual income to cluster different countries. The proposed model uses the analytic hierarchy process (AHP) to prioritize the criteria and then uses a K-means technique to cluster 59 countries into four groups based on the ranked criteria. The first group includes countries with high standards such as Germany and Japan. In the second cluster, there are developing countries with relatively good economic growth, such as Saudi Arabia and Iran. The third cluster belongs to countries with faster rates of growth compared with the countries in the second group, such as China, India and Mexico. Finally, the fourth cluster includes countries with relatively very low rates of growth, such as Jordan, Mali, Niger, etc.
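    The clustering step can be sketched with a plain Lloyd's K-means on a small synthetic indicator table (farthest-point initialization keeps it deterministic); the AHP weighting stage is omitted and the data are hypothetical.

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Lloyd's K-means with deterministic farthest-point initialization."""
    centroids = [points[0]]
    for _ in range(k - 1):
        dists = np.min([np.linalg.norm(points - c, axis=1) for c in centroids],
                       axis=0)
        centroids.append(points[np.argmax(dists)])
    centroids = np.array(centroids)
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)            # assign each point to nearest mean
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels, centroids

# Two well-separated groups of countries in a 2-D indicator space
# (e.g. normalized income vs. growth); values are synthetic.
rng = np.random.default_rng(1)
group_a = rng.normal([1.0, 1.0], 0.1, size=(10, 2))
group_b = rng.normal([8.0, 8.0], 0.1, size=(10, 2))
points = np.vstack([group_a, group_b])
labels, centroids = kmeans(points, 2)
```

    In the paper's pipeline the indicator columns would first be weighted by the AHP priorities before distances are computed.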

  1. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    Science.gov (United States)

    Parkash, Om; Hanim Shueb, Rafidah

    2015-01-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  2. An RSS based location estimation technique for cognitive relay networks

    KAUST Repository

    Qaraqe, Khalid A.

    2010-11-01

    In this paper, a received signal strength (RSS) based location estimation method is proposed for a cooperative wireless relay network where the relay is a cognitive radio. We propose a method for the considered cognitive relay network to determine the location of the source using the direct and the relayed signal at the destination. We derive the Cramer-Rao lower bound (CRLB) expressions separately for x and y coordinates of the location estimate. We analyze the effects of cognitive behaviour of the relay on the performance of the proposed method. We also discuss and quantify the reliability of the location estimate using the proposed technique if the source is not stationary. The overall performance of the proposed method is presented through simulations. ©2010 IEEE.
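    A noise-free sketch of RSS-based source localization under a log-distance path-loss model: invert each RSS reading to a range, then solve the linearized trilateration system. The path-loss parameters and geometry are hypothetical, and the cognitive-relay aspects of the paper are omitted.

```python
import numpy as np

# Log-distance path-loss model (hypothetical): RSS = P0 - 10*n*log10(d)
P0, n_pl = -40.0, 2.0
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
source = np.array([3.0, 4.0])

d_true = np.linalg.norm(anchors - source, axis=1)
rss = P0 - 10.0 * n_pl * np.log10(d_true)        # noise-free measurements

# Invert the model to ranges, then linearize by differencing squared
# distances against the first anchor:  2*(a_i - a_0).p = |a_i|^2 - |a_0|^2 - d_i^2 + d_0^2
d = 10.0 ** ((P0 - rss) / (10.0 * n_pl))
a0, d0 = anchors[0], d[0]
A = 2.0 * (anchors[1:] - a0)
rhs = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
       - d[1:] ** 2 + d0 ** 2)
estimate, *_ = np.linalg.lstsq(A, rhs, rcond=None)
```

    With noisy RSS the same least-squares system is solved, and the CRLB derived in the paper bounds the attainable variance of this estimate.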

  3. A Survey on Voltage Boosting Techniques for Step-Up DC-DC Converters

    DEFF Research Database (Denmark)

    Forouzesh, Mojtaba; Siwakoti, Yam Prasad; Gorji, Saman Asghari

    2016-01-01

    The range of existing voltage boosting techniques and topologies is large, which at times may be confusing and difficult to follow or adapt for different applications. Considering these aspects, and in order to give a clear sketch of the general principles and framework of the various voltage boosting techniques, this paper comprehensively reviews them.
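    For orientation, the baseline against which most step-up techniques are measured is the conventional boost converter, whose ideal continuous-conduction voltage gain is M = 1/(1 − D) for duty cycle D:

```python
def boost_gain(duty):
    """Ideal boost-converter voltage gain M = 1 / (1 - D), for 0 <= D < 1."""
    if not 0.0 <= duty < 1.0:
        raise ValueError("duty cycle must be in [0, 1)")
    return 1.0 / (1.0 - duty)

v_in = 12.0
outputs = {d: v_in * boost_gain(d) for d in (0.0, 0.25, 0.5, 0.75)}
```

    The practical limits of this gain at extreme duty cycles are what motivate the switched-capacitor, coupled-inductor, and multiplier techniques such surveys classify.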

  4. Astronomical Image Compression Techniques Based on ACC and KLT Coder

    Directory of Open Access Journals (Sweden)

    J. Schindler

    2011-01-01

    Full Text Available This paper deals with the compression of image data in astronomy applications. Astronomical images have typical specific properties: high grayscale bit depth, size, noise occurrence and special processing algorithms. They belong to the class of scientific images. Their processing and compression are quite different from the classical approach of multimedia image processing. The database of images from BOOTES (Burst Observer and Optical Transient Exploring System) has been chosen as a source of the testing signal. BOOTES is a Czech-Spanish robotic telescope for observing AGN (active galactic nuclei) and searching for the optical transients of GRBs (gamma ray bursts). This paper discusses an approach based on an analysis of the statistical properties of image data. A comparison of two irrelevancy reduction methods is presented from a scientific (astrometric and photometric) point of view. The first method is based on a statistical approach, using the Karhunen-Loeve transform (KLT) with uniform quantization in the spectral domain. The second technique is derived from wavelet decomposition with adaptive selection of the prediction coefficients used. Finally, a comparison of three redundancy reduction methods is discussed. The multimedia format JPEG2000 and HCOMPRESS, designed especially for astronomical images, are compared with the new Astronomical Context Coder (ACC), based on adaptive median regression.
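    The KLT stage can be sketched as a block-wise principal-component projection: estimate the covariance of image blocks, keep the leading eigenvectors, and reconstruct. This toy version on a synthetic image shows only the transform/truncation step, without the quantization and entropy coding a real coder adds.

```python
import numpy as np

def klt_blocks(image, block=8, keep=8):
    """KLT of non-overlapping blocks: project onto the top `keep`
    eigenvectors of the empirical block covariance, then reconstruct."""
    h, w = image.shape
    blocks = (image.reshape(h // block, block, w // block, block)
                   .swapaxes(1, 2).reshape(-1, block * block))
    mean = blocks.mean(axis=0)
    centered = blocks - mean
    cov = centered.T @ centered / len(blocks)
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    basis = eigvecs[:, -keep:]                  # top-`keep` components
    recon = (centered @ basis) @ basis.T + mean
    return (recon.reshape(h // block, w // block, block, block)
                 .swapaxes(1, 2).reshape(h, w))

rng = np.random.default_rng(0)
img = np.add.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64)) \
      + 0.05 * rng.standard_normal((64, 64))    # smooth ramp plus noise
err = {k: np.mean((img - klt_blocks(img, keep=k)) ** 2) for k in (4, 16, 64)}
```

    The reconstruction error shrinks as more components are kept and vanishes with the full basis, which is the trade-off the quantizer then exploits.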

  5. An investigation of a video-based patient repositioning technique

    International Nuclear Information System (INIS)

    Yan Yulong; Song Yulin; Boyer, Arthur L.

    2002-01-01

    Purpose: We have investigated a video-based patient repositioning technique designed to use skin features for radiotherapy repositioning. We investigated the feasibility of the clinical application of this system by quantitative evaluation of the performance characteristics of the methodology. Methods and Materials: Multiple regions of interest (ROIs) were specified in the field of view of video cameras. We used a normalized correlation pattern-matching algorithm to compute the translations of each ROI pattern in a target image. These translations were compared against trial translations using a quadratic cost function for an optimization process in which the patient rotation and translational parameters were calculated. Results: A hierarchical search technique achieved high speed (computing the correlation for a 128x128 ROI in a 512x512 target image within 0.005 s) and subpixel spatial accuracy (as high as 0.2 pixel). By treating the observed translations as movements of points on the surfaces of a hypothetical cube, we were able to estimate accurately the actual translations and rotations of the test phantoms used in our experiments to less than 1 mm and 0.2 deg., with standard deviations of 0.3 mm and 0.5 deg., respectively. For human volunteer cases, we estimated the translations and rotations to have an accuracy of 2 mm and 1.2 deg. Conclusion: A personal computer-based video system is suitable for routine patient setup of fractionated conformal radiotherapy. It is expected to achieve high-precision repositioning of the skin surface with high efficiency.
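    The normalized correlation pattern matching at the heart of the method can be written directly. This is a brute-force sketch on synthetic data; the paper's hierarchical, subpixel search is far faster.

```python
import numpy as np

def normalized_correlation(image, template):
    """Brute-force normalized cross-correlation over all integer offsets."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    scores = np.full((ih - th + 1, iw - tw + 1), -1.0)
    for i in range(ih - th + 1):
        for j in range(iw - tw + 1):
            w = image[i:i + th, j:j + tw]
            wc = w - w.mean()
            denom = np.sqrt((wc ** 2).sum()) * tnorm
            if denom > 0:
                scores[i, j] = (wc * t).sum() / denom
    return scores

rng = np.random.default_rng(3)
img = rng.standard_normal((32, 32))      # stand-in for a camera frame
tmpl = img[10:18, 14:22]                 # ROI pattern cut from the frame
scores = normalized_correlation(img, tmpl)
offset = np.unravel_index(np.argmax(scores), scores.shape)
```

    The peak location gives the ROI translation; the repositioning system then combines several such translations in the quadratic cost described above.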

  6. Advancing US GHG Inventory by Incorporating Survey Data using Machine-Learning Techniques

    Science.gov (United States)

    Alsaker, C.; Ogle, S. M.; Breidt, J.

    2017-12-01

    Crop management data are used in the National Greenhouse Gas Inventory that is compiled annually and reported to the United Nations Framework Convention on Climate Change. Soil carbon stock change and N2O emissions for US agricultural soils are estimated using the USDA National Resources Inventory (NRI). The NRI provides basic information on land use and cropping histories, but it does not provide much detail on other management practices. In contrast, the Conservation Effects Assessment Project (CEAP) survey collects detailed crop management data that could be used in the GHG Inventory. The survey data were collected every 10 years from survey locations that are a subset of the NRI. Therefore, imputation of the CEAP data is needed to represent the management practices across all NRI survey locations, both spatially and temporally. Predictive mean matching and artificial neural network methods have been applied to develop imputation models under a multiple imputation framework. Temporal imputation involves adjusting the imputation model using state-level USDA Agricultural Resource Management Survey data. Distributional and predictive accuracy is assessed for the imputed data, providing not only the management data needed for the inventory but also rigorous estimates of uncertainty.
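Predictive mean matching, one of the two imputation methods named above, can be illustrated as follows. This is a minimal single-predictor sketch; the variable names are illustrative and the real inventory models are multivariate.

```python
import numpy as np

def predictive_mean_matching(X_obs, y_obs, X_mis, k=5, seed=0):
    """Impute a missing response y for cases X_mis by predictive mean matching.

    Fit a linear model on the observed cases, predict for both groups, and
    for each missing case draw the observed y of one of the k donors whose
    predicted means are closest. Donated values are always actually observed
    values, which preserves the distribution of the response.
    """
    rng = np.random.default_rng(seed)
    # Least-squares fit with intercept on the observed (donor) cases
    A = np.column_stack([np.ones(len(X_obs)), X_obs])
    beta, *_ = np.linalg.lstsq(A, y_obs, rcond=None)
    pred_obs = A @ beta
    pred_mis = np.column_stack([np.ones(len(X_mis)), X_mis]) @ beta
    imputed = np.empty(len(X_mis))
    for i, p in enumerate(pred_mis):
        donors = np.argsort(np.abs(pred_obs - p))[:k]  # k nearest predicted means
        imputed[i] = y_obs[rng.choice(donors)]
    return imputed
```

Under a multiple imputation framework this draw is repeated several times (with different seeds or bootstrapped fits) so that the between-imputation spread feeds the uncertainty estimates.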

  7. Shallow Depth Geophysical Investigation Through the Application of Magnetic and Electric Resistance Techniques: AN Evaluation Study of the Responses of Magnetic and Electric Resistance Techniques to Archaeogeophysical Prospection Surveys in Greece and Cyprus

    Science.gov (United States)

    Sarris, Apostolos

    The response characteristics of total intensity and vertical gradient magnetic techniques have been investigated in detail and compared with electric resistivity and other geophysical techniques. Four case studies from archaeological sites of Greece and Cyprus have been used as the experimental basis of this research project. Data from shallow depth geophysical investigations in these sites were collected over a period of four years. Interpretation of the geophysical results was based on the integration of the various prospecting methods. The results of the comparative study between the different techniques showed a strong correlation among all methods allowing the detection of certain features and the determination of their dimensions. The application of a large range of geophysical prospecting techniques in the surveyed archaeological sites has been able to detect the approximate position of the subsurface remains and to compare the different techniques in terms of the information that they reveal. Each one of these techniques has been used to examine the characteristic response of each method to the geophysical anomalies associated with the surveyed sites. Magnetic susceptibility measurements at two frequencies have identified areas and levels of intense human activity. A number of processing techniques such as low, high and band pass filtering in the spatial and frequency domain, computation of the residuals and fast Fourier transformation (FFT) of the magnetic potential data have been applied to the geophysical measurements. The subsequent convolution with filters representing apparent susceptibility, reduction to pole and equator, Gaussian and Butterworth regional and residual distributions, and inverse filtering in terms of spiking deconvolution have revealed a wealth of information necessary to obtain a more accurate picture of the concealed features. Inverse modelling of isolated magnetic anomalies has further enriched the information database of the
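The band-pass filtering in the frequency domain mentioned above can be illustrated with a generic 2-D FFT filter. This is a sketch of the idea only, not the authors' processing chain; the cutoffs are expressed as fractions of the Nyquist frequency.

```python
import numpy as np

def bandpass_filter_grid(grid, low, high):
    """Band-pass filter a 2-D survey grid (e.g. magnetic data) in the
    frequency domain.

    Keeps spatial frequencies whose radial wavenumber lies in [low, high],
    where low and high are fractions of the Nyquist frequency. Removing the
    lowest frequencies suppresses the regional trend; removing the highest
    suppresses pixel-scale noise.
    """
    ny, nx = grid.shape
    F = np.fft.fft2(grid)
    fy = np.fft.fftfreq(ny)[:, None]       # cycles per sample, row direction
    fx = np.fft.fftfreq(nx)[None, :]       # cycles per sample, column direction
    r = np.sqrt(fx**2 + fy**2) / 0.5       # radial frequency, Nyquist = 1.0
    mask = (r >= low) & (r <= high)
    return np.real(np.fft.ifft2(F * mask))
```

Reduction to the pole, apparent-susceptibility filtering and the other operations listed above follow the same pattern: multiply the spectrum by a suitable transfer function and invert the transform.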

  8. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal; Henkel, Jörg

    2010-01-01

    % for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures, namely ARM and MIPS. © 2010 ACM.
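The Huffman step underlying such code-compression schemes can be sketched as follows. The paper's full pipeline (instruction-stream segmentation and the decoding tables for ARM/MIPS) is not reproduced; this only builds a prefix-free code table from symbol frequencies.

```python
import heapq
from collections import Counter

def huffman_codes(symbols):
    """Build a Huffman code table (symbol -> bit string) from a sequence.

    Frequent symbols receive shorter codes, which is what makes the scheme
    effective on skewed instruction-pattern distributions.
    """
    freq = Counter(symbols)
    if len(freq) == 1:                      # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, tie-breaker, {symbol: code-so-far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)     # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]
```

For embedded use, the decoder side is the critical part: the table must be small and fast enough to sit between memory and the processor's fetch stage.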

  9. Skull base tumours part I: Imaging technique, anatomy and anterior skull base tumours

    International Nuclear Information System (INIS)

    Borges, Alexandra

    2008-01-01

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have largely contributed to improving the prognosis, lessening the morbidity and mortality of patients with skull base tumours, and increasing the medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging has become indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology, and the radiologist is increasingly responsible for the fate of these patients. This review will focus on the advances in imaging technique, their contribution to patient management, and the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division that takes into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient management will be discussed

  10. Comparative assessment of PIV-based pressure evaluation techniques applied to a transonic base flow

    NARCIS (Netherlands)

    Blinde, P; Michaelis, D; van Oudheusden, B.W.; Weiss, P.E.; de Kat, R.; Laskari, A.; Jeon, Y.J.; David, L; Schanz, D; Huhn, F.; Gesemann, S; Novara, M.; McPhaden, C.; Neeteson, N.; Rival, D.; Schneiders, J.F.G.; Schrijer, F.F.J.

    2016-01-01

    A test case for PIV-based pressure evaluation techniques has been developed by constructing a simulated experiment from a ZDES simulation for an axisymmetric base flow at Mach 0.7. The test case comprises sequences of four subsequent particle images (representing multi-pulse data) as well as

  11. A novel technique for active vibration control, based on optimal

    Indian Academy of Sciences (India)

    In the last few decades, researchers have proposed many control techniques to suppress unwanted vibrations in a structure. In this work, a novel and simple technique is proposed for active vibration control. In this technique, an optimal tracking control is employed to suppress vibrations in a structure by simultaneously ...

  12. Neural Representation. A Survey-Based Analysis of the Notion

    Directory of Open Access Journals (Sweden)

    Oscar Vilarroya

    2017-08-01

    Full Text Available The word representation (as in “neural representation”, and many of its related terms, such as to represent, representational and the like, play a central explanatory role in neuroscience literature. For instance, in “place cell” literature, place cells are extensively associated with their role in “the representation of space.” In spite of its extended use, we still lack a clear, universal and widely accepted view on what it means for a nervous system to represent something, on what makes a neural activity a representation, and on what is re-presented. The lack of a theoretical foundation and definition of the notion has not hindered actual research. My aim here is to identify how active scientists use the notion of neural representation, and eventually to list a set of criteria, based on actual use, that can help in distinguishing between genuine or non-genuine neural-representation candidates. In order to attain this objective, I present first the results of a survey of authors within two domains, place-cell and multivariate pattern analysis (MVPA research. Based on the authors’ replies, and on a review of neuroscientific research, I outline a set of common properties that an account of neural representation seems to require. I then apply these properties to assess the use of the notion in two domains of the survey, place-cell and MVPA studies. I conclude by exploring a shift in the notion of representation suggested by recent literature.

  13. Is cell culture a risky business? Risk analysis based on scientist survey data.

    Science.gov (United States)

    Shannon, Mark; Capes-Davis, Amanda; Eggington, Elaine; Georghiou, Ronnie; Huschtscha, Lily I; Moy, Elsa; Power, Melinda; Reddel, Roger R; Arthur, Jonathan W

    2016-02-01

    Cell culture is a technique that requires vigilance from the researcher. Common cell culture problems, including contamination with microorganisms or cells from other cultures, can place the reliability and reproducibility of cell culture work at risk. Here we use survey data, contributed by research scientists based in Australia and New Zealand, to assess common cell culture risks and how these risks are managed in practice. Respondents show that sharing of cell lines between laboratories continues to be widespread. Arrangements for mycoplasma and authentication testing are increasingly in place, although scientists are often uncertain how to perform authentication testing. Additional risks are identified for preparation of frozen stocks, storage and shipping. © 2015 UICC.

  14. Methodology of the National School-based Health Survey in Malaysia, 2012.

    Science.gov (United States)

    Yusoff, Fadhli; Saari, Riyanti; Naidu, Balkish M; Ahmad, Noor Ani; Omar, Azahadi; Aris, Tahir

    2014-09-01

    The National School-Based Health Survey 2012 was a nationwide school health survey of students in Standard 4 to Form 5 (10-17 years of age), who were schooling in government schools in Malaysia during the period of data collection. The survey comprised 3 subsurveys: the Global School Health Survey (GSHS), the Mental Health Survey, and the National School-Based Nutrition Survey. The aim of the survey was to provide data on the health status of adolescents in Malaysia toward strengthening the adolescent health program in the country. The design of the survey was created to fulfill the requirements of the 3 subsurveys. A 2-stage stratified sampling method was adopted in the sampling. The methods for data collection were via questionnaire and physical examination. The National School-Based Health Survey 2012 adopted an appropriate methodology for a school-based survey to ensure valid and reliable findings. © 2014 APJPH.

  15. Broadcast Expenses Controlling Techniques in Mobile Ad-hoc Networks: A Survey

    Directory of Open Access Journals (Sweden)

    Naeem Ahmad

    2016-07-01

    Full Text Available The blind flooding of query packets during route discovery often gives rise to the broadcast storm problem: it exponentially increases the energy consumption of intermediate nodes and congests the entire network. In such a congested network, the task of establishing a path between resources may become very complex and unwieldy. Extensive research work has been done in this area to improve the route discovery phase of routing protocols by reducing broadcast expenses. The purpose of this study is to provide a comparative analysis of existing broadcasting techniques for the route discovery phase, in order to arrive at an efficient broadcasting technique for determining a route with the minimum number of conveying nodes in ad-hoc networks. The study is designed to highlight the collective merits and demerits of such broadcasting techniques, along with certain conclusions that would contribute to the choice of broadcasting technique.
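One classic broadcast-expense-reduction scheme of the kind surveyed here, probabilistic (gossip) rebroadcast, can be sketched as a simple simulation. The sketch is illustrative and is not drawn from any specific protocol covered by the survey.

```python
import random

def probabilistic_flood(adjacency, source, p, seed=0):
    """Simulate probabilistic rebroadcast of a route-discovery query.

    Each node that receives the query rebroadcasts it with probability p
    (the source always broadcasts); with p = 1.0 this degenerates to blind
    flooding. Returns the set of nodes reached and the number of
    rebroadcasts performed, the quantity such schemes try to minimize.
    """
    rng = random.Random(seed)
    reached = {source}
    frontier = [source]
    rebroadcasts = 0
    while frontier:
        node = frontier.pop()
        if node == source or rng.random() < p:
            rebroadcasts += 1
            for nb in adjacency[node]:
                if nb not in reached:
                    reached.add(nb)
                    frontier.append(nb)
    return reached, rebroadcasts
```

Sweeping p over a dense topology shows the trade-off the survey discusses: lower p means fewer rebroadcasts but a growing risk that some nodes are never reached.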

  16. Acellular dermal matrix based nipple reconstruction: A modified technique

    Directory of Open Access Journals (Sweden)

    Raghavan Vidya

    2017-09-01

    Full Text Available Nipple areolar reconstruction (NAR) has evolved with the advancement of breast reconstruction and can improve self-esteem and, consequently, patient satisfaction. Although a variety of reconstruction techniques have been described in the literature, varying from nipple sharing and local flaps to alloplastic and allograft augmentation, loss of nipple projection over time remains a major problem. Acellular dermal matrices (ADM) have revolutionised breast reconstruction more recently. We discuss the use of ADM to act as a base plate and strut to give support to the base and offer nipple bulk and projection in a primary NAR procedure with a local clover-shaped dermal flap in 5 breasts (4 patients). We used 5-point Likert scales (1 = highly unsatisfied, 5 = highly satisfied) to assess patient satisfaction. Median age was 46 years (range: 38–55 years). Nipple projections of 8 mm, 7 mm, and 7 mm were achieved in the unilateral cases and 6 mm in the bilateral case over a median 18-month period. All patients reported at least a 4 on the Likert scale. We had no post-operative complications. It seems that nipple areolar reconstruction (NAR) using ADM can achieve nipple projection which is considered aesthetically pleasing for patients.

  17. Crack identification based on synthetic artificial intelligent technique

    International Nuclear Information System (INIS)

    Shim, Mun Bo; Suh, Myung Won

    2001-01-01

    It has been established that a crack has an important effect on the dynamic behavior of a structure. This effect depends mainly on the location and depth of the crack. To identify the location and depth of a crack in a structure, a method is presented in this paper that uses a synthetic artificial intelligence technique: an Adaptive-Network-based Fuzzy Inference System (ANFIS), solved via a hybrid learning algorithm (the back-propagation gradient descent and the least-squares method), is used to learn the input (the location and depth of a crack)-output (the structural eigenfrequencies) relation of the structural system. With this ANFIS and a Continuous Evolutionary Algorithm (CEA), it is possible to formulate the inverse problem. CEAs based on genetic algorithms work efficiently for continuous-search-space optimization problems like a parameter identification problem. With this ANFIS, CEAs are used to identify the crack location and depth by minimizing the difference from the measured frequencies. We have tried this new idea on a simple beam structure and the results are promising
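The evolutionary half of the identification loop can be illustrated with a toy forward model standing in for the trained ANFIS. Both the forward model and the parameter ranges below are invented for the sketch; the real method learns the (location, depth) → eigenfrequency map from structural data.

```python
import random

def toy_frequencies(location, depth):
    """Stand-in forward model mapping a crack (location, depth), both in
    [0, 1], to two 'eigenfrequencies'. A real study would query an ANFIS
    trained on finite-element results instead."""
    return (10.0 - 3.0 * depth * location, 25.0 - 5.0 * depth * (1 - location))

def identify_crack(measured, pop_size=40, generations=60, seed=0):
    """Continuous evolutionary search for the (location, depth) whose
    modelled frequencies best match the measured ones (sum of squared
    differences), mirroring the CEA-based inverse problem described above."""
    rng = random.Random(seed)

    def cost(ind):
        f = toy_frequencies(*ind)
        return sum((a - b) ** 2 for a, b in zip(f, measured))

    pop = [(rng.random(), rng.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        for p in parents:                        # Gaussian mutation, clipped
            loc = min(1.0, max(0.0, p[0] + rng.gauss(0, 0.05)))
            dep = min(1.0, max(0.0, p[1] + rng.gauss(0, 0.05)))
            children.append((loc, dep))
        pop = parents + children
    return min(pop, key=cost)
```

The same loop works unchanged when `toy_frequencies` is replaced by the trained ANFIS surrogate, which is exactly the substitution the paper makes.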

  18. Structural design systems using knowledge-based techniques

    International Nuclear Information System (INIS)

    Orsborn, K.

    1993-01-01

    Engineering information management and the corresponding information systems are of a strategic importance for industrial enterprises. This thesis treats the interdisciplinary field of designing computing systems for structural design and analysis using knowledge-based techniques. Specific conceptual models have been designed for representing the structure and the process of objects and activities in a structural design and analysis domain. In this thesis, it is shown how domain knowledge can be structured along several classification principles in order to reduce complexity and increase flexibility. By increasing the conceptual level of the problem description and representation of the domain knowledge in a declarative form, it is possible to enhance the development, maintenance and use of software for mechanical engineering. This will result in a corresponding increase of the efficiency of the mechanical engineering design process. These ideas together with the rule-based control point out the leverage of declarative knowledge representation within this domain. Used appropriately, a declarative knowledge representation preserves information better, is more problem-oriented and change-tolerant than procedural representations. 74 refs

  19. SELF-ASSEMBLED ROV AND PHOTOGRAMMETRIC SURVEYS WITH LOW COST TECHNIQUES

    Directory of Open Access Journals (Sweden)

    E. Costa

    2018-05-01

    Full Text Available In recent years, ROVs have been employed to explore underwater environments and have played an important role in documentation and surveys in different fields of scientific application. In 2017, the Laboratorio di Fotogrammetria of Iuav University of Venice decided to buy an OpenRov, a low-cost ROV that we could assemble ourselves and fit with external components for our needs, to document archaeological sites. The paper concerns the photogrammetric survey for the documentation of underwater environments and the comparison between different solutions applied to a case study: five marble columns on a sandy bottom at 5 meters' depth. On the lateral sides of the ROV we applied two GoPro Hero4 Session cameras, which documented the items both with a series of images and with a video. The geometric accuracy of the obtained 3D model has been evaluated through comparison with a photogrammetric model realized with a professional reflex camera, a Nikon D610. Some targets were topographically surveyed with a trilateration and used to connect the different models in the same reference system, allowing comparison of the point clouds. Remotely operated vehicles offer not only safety for their operators, but are also a relatively low-cost alternative. The employment of a low-cost vehicle adapted to the necessities of surveys supports the demand for safer, cheaper and more efficient methods of exploring underwater environments.
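Bringing two photogrammetric models into the same reference system via common surveyed targets amounts to a rigid-body fit. A standard Kabsch/Procrustes sketch is shown below; it assumes both clouds are already metric (no scale term), which holds once each model has been scaled from the surveyed target distances.

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t mapping `src` target points
    onto `dst` (least squares, Kabsch algorithm).

    `src` and `dst` are (n, 3) arrays of corresponding target coordinates
    in the two models. Apply the result to the whole source cloud as
    cloud @ R.T + t before comparing point clouds.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                       # cross-covariance of targets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With the two models aligned, cloud-to-cloud distances become a direct measure of the low-cost ROV model's geometric accuracy against the reflex-camera reference.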

  20. Positron emission tomography, physical bases and comparison with other techniques

    International Nuclear Information System (INIS)

    Guermazi, Fadhel; Hamza, F; Amouri, W.; Charfeddine, S.; Kallel, S.; Jardak, I.

    2013-01-01

    Positron emission tomography (PET) is a medical imaging technique that measures the three-dimensional distribution of molecules marked by a positron-emitting particle. PET has grown significantly in clinical fields, particularly in oncology for diagnosis and therapeutic follow-up purposes. The technical evolution of this technique is fast; among the technical improvements is the coupling of the PET scan with computed tomography (CT). PET is obtained by intravenous injection of a radioactive tracer. The marker is usually fluorine (18F) embedded in a glucose molecule, forming 18-fluorodeoxyglucose (FDG-18). This tracer, similar to glucose, binds to tissues that consume large quantities of sugar, such as cancerous tissue, cardiac muscle or brain. Detection using scintillation crystals (BGO, LSO, LYSO) suitable for high energy (511 keV) recognizes the lines of the gamma photons originating from the annihilation of a positron with an electron. The detection electronics, or coincidence circuit, is based on two criteria: a time window of about 6 to 15 ns, and an energy window. This system measures the true coincidences that correspond to the detection of two 511 keV photons from the same annihilation. Most PET devices are constituted by a series of elementary detectors distributed annularly around the patient. Each detector comprises a scintillation crystal matrix coupled to a finite number (4 or 6) of photomultipliers. The coincidence circuit determines the projection line of the annihilation by means of two elementary detectors. The processing of this information must be extremely fast, considering the count rates encountered in practice. The information measured by the coincidence circuit is then positioned in a matrix, or sinogram, which contains a set of elements of a projection section of the object. Images are obtained by tomographic reconstruction on powerful computer stations equipped with software tools allowing the analysis and
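The coincidence criteria described above (time window, energy window, distinct detectors) can be sketched as a simple event-pairing routine. The window values below are typical figures, not instrument-specific.

```python
def find_coincidences(events, time_window_ns=10.0,
                      energy_low=425.0, energy_high=650.0):
    """Pair detected single photons into true-coincidence candidates.

    `events` is a list of (time_ns, energy_keV, detector_id) tuples sorted
    by time. Two singles form a coincidence when both pass the energy
    window around 511 keV, they occur within the time window, and they hit
    different detectors (defining the line of response).
    """
    # Energy window: reject scattered photons that lost energy
    accepted = [e for e in events if energy_low <= e[1] <= energy_high]
    pairs = []
    for i in range(len(accepted) - 1):
        t1, _, d1 = accepted[i]
        t2, _, d2 = accepted[i + 1]
        if t2 - t1 <= time_window_ns and d1 != d2:
            pairs.append((accepted[i], accepted[i + 1]))
    return pairs
```

Each accepted pair contributes one count to the sinogram bin of its line of response, which the reconstruction stage then inverts into an image.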

  1. Exploiting stock data: a survey of state of the art computational techniques aimed at producing beliefs regarding investment portfolios

    Directory of Open Access Journals (Sweden)

    Mario Linares Vásquez

    2008-01-01

    Full Text Available Selecting an investment portfolio has inspired several models aimed at optimising the set of securities which an investor may select according to a number of specific decision criteria such as risk, expected return and planning horizon. The classical approach has been developed for supporting the two stages of portfolio selection and is supported by disciplines such as econometrics, technical analysis and corporative finance. However, with the emerging field of computational finance, new and interesting techniques have arisen in line with the need for the automatic processing of vast volumes of information. This paper surveys such new techniques which belong to the body of knowledge concerning computing and systems engineering, focusing on techniques particularly aimed at producing beliefs regarding investment portfolios.

  2. A Survey on Modeling and Simulation of MEMS Switches and Its Application in Power Gating Techniques

    OpenAIRE

    Pramod Kumar M.P; A.S. Augustine Fletcher

    2014-01-01

    A large number of techniques have been developed to reduce leakage power, including supply voltage scaling, varying threshold voltages, smaller logic banks, etc. Power gating is a technique used to reduce static power when the sleep transistor is in the off condition. Micro-Electro-Mechanical System (MEMS) switches have properties that are very close to those of an ideal switch, with infinite off-resistance due to an air gap and low on-resistance due to the ohmic metal to m...

  3. Application of spectroscopic techniques to the study of illuminated manuscripts: A survey

    International Nuclear Information System (INIS)

    Pessanha, S.; Manso, M.; Carvalho, M.L.

    2012-01-01

    This work focused on the application of the most relevant spectroscopic techniques used for the characterization of illuminated manuscripts. The historical value of these unique and invaluable artworks, together with the increased awareness concerning the conservation of cultural heritage, prompted the application of analytical techniques to the study of these illuminations. This is essential for the understanding of the artist's working methods, which aids conservation–restoration. The characterization of the pigments may also help assign a probable date to the manuscript. For these purposes, the spectroscopic techniques used so far include those that provide information on the elemental content: X-ray fluorescence, total reflection X-ray fluorescence and scanning electron microscopy coupled with energy-dispersive spectroscopy and laser-induced breakdown spectroscopy. Complementary techniques, such as X-ray diffraction, Fourier transform infrared and Raman spectroscopy, reveal information regarding the compounds present in the samples. The techniques, suitability, technological evolution and development of high-performance detectors, as well as the possibility of microanalysis and the higher sensitivity of the equipment, will also be discussed. Furthermore, issues such as the necessity of sampling, the portability of the equipment and the overall advantages and disadvantages of different techniques will be analyzed. - Highlights: ► The techniques used for studying illuminated manuscripts are described and compared. ► For in situ, non-destructive analysis the most suitable technique is EDXRF. ► For quantitative analysis TXRF is more appropriate. ► Raman spectroscopy is mostly used for pigments identification. ► FTIR was used for the characterization of binders and parchment.

  4. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation study is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI’s ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users’ growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and are applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequent emergence patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  5. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation study is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and are applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequent emergence patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  6. Ground-based intercomparison of two isoprene measurement techniques

    Directory of Open Access Journals (Sweden)

    E. Leibrock

    2003-01-01

    Full Text Available An informal intercomparison of two isoprene (C5H8) measurement techniques was carried out during the Fall of 1998 at a field site located approximately 3 km west of Boulder, Colorado, USA. A new chemical ionization mass spectrometric technique (CIMS) was compared to a well-established gas chromatographic technique (GC). The CIMS technique utilized benzene cation chemistry to ionize isoprene. The isoprene levels measured by the CIMS were often larger than those obtained with the GC. The results indicate that the CIMS technique suffered from an anthropogenic interference associated with air masses from the Denver, CO metropolitan area, as well as an additional interference occurring in clean conditions. However, the CIMS technique is also demonstrated to be sensitive and fast. Especially after the introduction of a tandem mass spectrometric technique, it is therefore a candidate for isoprene measurements in remote environments near isoprene sources.

  7. The socio-economic base line survey; first chapter of the handbook under preparation: "Managing farmers: a handbook for working with farmers in irrigation and drainage projects"

    NARCIS (Netherlands)

    Schrevel, A.

    2002-01-01

    The text The socio-economic base line survey is the first chapter of a book under preparation intended to instruct senior staff of irrigation and drainage projects in techniques for working with farmers. It informs the reader of best practices for setting up and executing a socio-economic baseline survey. The

  8. A linac-based stereotactic irradiation technique of uveal melanoma

    International Nuclear Information System (INIS)

    Dieckmann, Karin; Bogner, Joachim; Georg, Dietmar; Zehetmayer, Martin; Kren, Gerhard; Poetter, Richard

    2001-01-01

    Purpose: To describe a stereotactic irradiation technique for uveal melanomas performed at a linac, based on a non-invasive eye fixation and eye monitoring system. Methods: For eye immobilization a light source system is integrated in a standard stereotactic mask system in front of the healthy eye: During treatment preparation (computed tomography/magnetic resonance imaging) as well as for treatment delivery, patients are instructed to gaze at the fixation light source. A mini-video camera monitors the pupil center position of the diseased eye. For treatment planning and beam delivery standard stereotactic radiotherapy equipment is used. If the pupil center deviation from a predefined 'zero-position' exceeds 1 mm (for more than 2 s), treatment delivery is interrupted. Between 1996 and 1999 60 patients with uveal melanomas, where (i) tumor height exceeded 7 mm, or (ii) tumor height was more than 3 mm, and the central tumor distance to the optic disc and/or the macula was less than 3 mm, have been treated. A total dose of 60 or 70 Gy has been given in 5 fractions within 10 days. Results: The repositioning accuracy in the mask system is 0.47±0.36 mm in rostral-occipital direction, 0.75±0.52 mm laterally, and 1.12±0.96 mm in vertical direction. An eye movement analysis performed for 23 patients shows a pupil center deviation from the 'zero' position<1 mm in 91% of all cases investigated. In a theoretical analysis, pupil center deviations are correlated with GTV 'movements'. For a pupil center deviation of 1 mm (rotation of the globe of 5 degree sign ) the GTV is still encompassed by the 80% isodose in 94%. Conclusion: For treatments of uveal melanomas, linac-based stereotactic radiotherapy combined with a non-invasive eye immobilization and monitoring system represents a feasible, accurate and reproducible method. Besides considerable technical requirements, the complexity of the treatment technique demands an interdisciplinary team continuously dedicated to this
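The interlock rule described above (interrupt delivery when the pupil-centre deviation exceeds 1 mm for more than 2 s) can be sketched as a per-sample gating function. The sampling interval below is an assumed value, not taken from the paper.

```python
def gate_beam(deviations_mm, dt_s=0.1, threshold_mm=1.0, hold_s=2.0):
    """Decide, sample by sample, whether beam delivery may continue.

    Mirrors the monitoring rule: delivery is interrupted once the pupil
    centre has deviated beyond the threshold continuously for longer than
    the hold time; any in-tolerance sample resets the timer. Returns one
    boolean per sample (True = beam on).
    """
    beam_on, over_time = [], 0.0
    for d in deviations_mm:
        over_time = over_time + dt_s if d > threshold_mm else 0.0
        beam_on.append(over_time <= hold_s)
    return beam_on
```

In the clinical setup the deviation samples come from the video camera tracking the diseased eye while the patient fixates the light source with the healthy eye.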

  9. Application of Knowledge-Based Techniques to Tracking Function

    National Research Council Canada - National Science Library

    Farina, A

    2006-01-01

    ...: historical survey of stochastic filtering theory; overview of tracking systems with some details on mono-sensor and multi-sensor tracking, evolution of filtering logics, evolution of correlation logics, and presentation of recent findings on non...

  10. Mammalian Survey Techniques for Level II Natural Resource Inventories on Corps of Engineers Projects (Part 1)

    Science.gov (United States)

    2009-07-01

    sheep (Ovis dalli dalli), mountain goats (Oreamnos americanus) and other hoofed animals are often surveyed using aerial counts from fixed-wing...

  11. THE OPTIMIZATION OF TECHNOLOGICAL MINING PARAMETERS IN QUARRY FOR DIMENSION STONE BLOCKS QUALITY IMPROVEMENT BASED ON PHOTOGRAMMETRIC TECHNIQUES OF MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Ruslan Sobolevskyi

    2018-01-01

    Full Text Available This research focuses on patterns of change in the quality of commercial dimension stone blocks, based on previously identified and measured geometrical parameters of natural cracks, and on modelling and planning the final dimensions of stone products and finished products using the proposed digital photogrammetric techniques. The optimal parameters of surveying are investigated and the influence of surveying distance on crack length and area is estimated. Rational technological parameters of dimension stone block production are taken into account.

  12. Application of ranging technique of radar level meter for draft survey

    Directory of Open Access Journals (Sweden)

    SHEN Yijun

    2017-12-01

    Full Text Available [Objectives] This paper aims to solve the problems of the high subjectivity and low accuracy and efficiency of draft surveying that relies on human visual inspection. [Methods] Radar level-meter products for oil and liquid measurement are widely used in the petrochemical industry. A device is developed that uses radar to survey the draft of a ship, designed with data series optimization formulae to ensure that the results are true and correct. At the same time, a test is designed to prove the accuracy of the results. [Results] According to the conditions of the ship, the device is composed of a radar sensor, triangular bracket and display, and is put to use in the test. [Conclusions] With 15 vessels as the research objects, the comparison experiment shows a difference in range between 0.001-0.022 meters, with an average difference rate of 0.028%, which meets the requirements for ship draft survey accuracy.
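
    The abstract does not reproduce the "data series optimization formulae". A minimal sketch of one plausible approach, median-based outlier rejection of wave-disturbed radar readings followed by averaging, with all numbers invented:

```python
from statistics import median

def optimize_draft_series(samples, tolerance=0.05):
    """Reject wave/spray outliers from a series of radar range
    readings (metres), then average the remainder.

    `tolerance` (metres) is an assumed parameter: readings farther
    than this from the series median are treated as noise.
    """
    m = median(samples)
    kept = [s for s in samples if abs(s - m) <= tolerance]
    return sum(kept) / len(kept)

# Invented air-gap readings from sensor to waterline, disturbed by waves.
readings = [2.41, 2.43, 2.39, 2.74, 2.42, 2.40, 2.08, 2.44]
print(round(optimize_draft_series(readings), 3))  # prints 2.415
```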

  13. Reasons for not changing to activity-based costing: a survey of Irish firms

    Directory of Open Access Journals (Sweden)

    Martin Quinn

    2017-04-01

    Full Text Available Purpose – This paper aims to report on a survey of medium and large Irish firms to ascertain reasons for not changing to more advanced costing techniques, namely, activity-based costing (ABC). Developments in technology and recent poor economic conditions would suggest that the technique could be adopted more by firms, as they make increased efforts to keep costs under control. Design/methodology/approach – A survey instrument was used to gather data drawing from the top 1,000 Irish firms. From a useable population of 821 organisations, a response rate of 20.75 per cent was achieved. Findings – Findings show a rate of adoption of ABC of 18.7 per cent, which is lower than reported in previous studies in an Irish context. The level of information technology in firms is not a key factor for non-adoption. Instead, the main reasons for non-adoption revolve around stable existing costing methods with which firms expressed satisfaction. Originality/value – This research suggests the adoption of ABC is not necessarily driven by external factors such as technology and economic shocks, at least in the context of Ireland. It also suggests that costing techniques may be deeply embedded within organisations and are less likely to be subject to change.

  14. Reduced-Item Food Audits Based on the Nutrition Environment Measures Surveys.

    Science.gov (United States)

    Partington, Susan N; Menzies, Tim J; Colburn, Trina A; Saelens, Brian E; Glanz, Karen

    2015-10-01

    The community food environment may contribute to obesity by influencing food choice. Store and restaurant audits are increasingly common methods for assessing food environments, but are time-consuming and costly. A valid, reliable brief measurement tool is needed. The purpose of this study was to develop and validate reduced-item food environment audit tools for stores and restaurants. Nutrition Environment Measures Surveys for stores (NEMS-S) and restaurants (NEMS-R) were completed in 820 stores and 1,795 restaurants in West Virginia, San Diego, and Seattle. Data mining techniques (correlation-based feature selection and linear regression) were used to identify survey items highly correlated with total survey scores and produce reduced-item audit tools that were subsequently validated against full NEMS surveys. Regression coefficients were used as weights applied to reduced-item tool items to generate scores comparable to full NEMS surveys. Data were collected and analyzed in 2008-2013. The reduced-item tools included eight items for grocery, ten for convenience, seven for variety, and five for other stores; and 16 items for sit-down, 14 for fast casual, 19 for fast food, and 13 for specialty restaurants: 10% of the full NEMS-S and 25% of the full NEMS-R. There were no significant differences in median scores for varying types of retail food outlets when compared to the full survey scores. Median in-store audit time was reduced 25%-50%. Reduced-item audit tools can reduce the burden and complexity of large-scale or repeated assessments of the retail food environment without compromising measurement quality. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
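
    The item-reduction procedure described above (correlate items with the total score, keep the top items, reuse regression coefficients as weights) can be sketched as follows. The audit data and the choice of k are invented for illustration:

```python
def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def gauss_solve(A, b):
    """Solve A w = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]
    return w

# Hypothetical audit data: items[i][s] is item i's score at store s;
# the full-survey score of a store is the sum of its item scores.
items = [
    [5, 3, 4, 2, 5, 1, 4, 3],
    [2, 1, 2, 0, 2, 0, 1, 1],
    [1, 4, 0, 3, 2, 5, 1, 2],
    [3, 2, 3, 1, 3, 1, 3, 2],
]
totals = [sum(col) for col in zip(*items)]

# 1) keep the k items most correlated with the full-survey score
k = 2
keep = sorted(range(len(items)), key=lambda i: -abs(pearson(items[i], totals)))[:k]

# 2) least-squares weights (intercept + one weight per kept item),
#    via the normal equations X^T X w = X^T y
X = [[1.0] + [items[i][s] for i in keep] for s in range(len(totals))]
p = len(X[0])
XtX = [[sum(r[a] * r[b] for r in X) for b in range(p)] for a in range(p)]
Xty = [sum(r[a] * t for r, t in zip(X, totals)) for a in range(p)]
w = gauss_solve(XtX, Xty)

# reduced-item score comparable to the full-survey score
preds = [sum(wa * xa for wa, xa in zip(w, row)) for row in X]
```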

  15. Orientation of student entrepreneurial practices based on administrative techniques

    Directory of Open Access Journals (Sweden)

    Héctor Horacio Murcia Cabra

    2005-07-01

    Full Text Available As part of the second phase of the research project «Application of a creativity model to update the teaching of the administration in Colombian agricultural entrepreneurial systems» it was decided to re-enforce student planning and execution of the students of the Agricultural business Administration Faculty of La Salle University. Those finishing their studies were given special attention. The plan of action was initiated in the second semester of 2003. It was initially defined as a model of entrepreneurial strengthening based on a coherent methodology that included the most recent administration and management techniques. Later, the applicability of this model was tested in some organizations of the agricultural sector that had asked for support in their planning processes. Through an investigation-action process the methodology was redefined in order to arrive at a final model that could be used by faculty students and graduates. The results obtained were applied to the teaching of Entrepreneurial Laboratory of ninth semester students with the hope of improving administrative support to agricultural enterprises. Following this procedure more than 100 students and 200 agricultural producers have applied this procedure between June 2003 and July 2005. The methodology used and the results obtained are presented in this article.

  16. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, the cokriging method, an extension of kriging, is proposed to calculate structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.
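
    As a minimal illustration of surrogate-based reliability analysis, the sketch below builds a plain kriging interpolator (not cokriging, for brevity) of a synthetic limit state g(x) = 5 - x² and compares Monte Carlo failure probabilities on the true function and on the surrogate. The kernel, design points and input distribution are all assumptions:

```python
import math
import random

def gauss_solve(A, b):
    """Small dense linear solver (partial pivoting)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def kern(a, b, l=1.25):
    """Gaussian covariance kernel with assumed length scale l."""
    return math.exp(-(a - b) ** 2 / (2 * l * l))

def g(x):
    """Synthetic limit state: failure when g(x) < 0."""
    return 5.0 - x * x

xs = [-5.0 + 1.25 * i for i in range(9)]          # design points
ys = [g(x) for x in xs]
ybar = sum(ys) / len(ys)
K = [[kern(a, b) + (1e-8 if i == j else 0.0)      # tiny nugget for stability
      for j, b in enumerate(xs)] for i, a in enumerate(xs)]
alpha = gauss_solve(K, [y - ybar for y in ys])

def g_hat(x):
    """Kriging predictor (zero-mean GP on centred data)."""
    return ybar + sum(a * kern(x, xi) for a, xi in zip(alpha, xs))

random.seed(1)
samples = [random.gauss(0.0, 1.2) for _ in range(20000)]
pf_true = sum(g(x) < 0 for x in samples) / len(samples)
pf_hat = sum(g_hat(x) < 0 for x in samples) / len(samples)
```

    Only 9 evaluations of g are needed to train the surrogate, versus 20000 for the direct Monte Carlo estimate; cokriging would additionally condition on gradient values at the design points.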

  17. Microgrids Real-Time Pricing Based on Clustering Techniques

    Directory of Open Access Journals (Sweden)

    Hao Liu

    2018-05-01

    Full Text Available Microgrids are spreading widely in electricity markets worldwide. Besides the security and reliability concerns for these microgrids, their operators need to address consumers’ pricing. Considering the growth of smart grids and smart meter facilities, it is expected that microgrids will have some level of flexibility to determine real-time pricing for at least some consumers. As such, the key challenge is finding an optimal pricing model for consumers. This paper, accordingly, proposes a new pricing scheme in which microgrids are able to deploy clustering techniques in order to understand their consumers’ load profiles and then assign real-time prices based on their load profile patterns. An improved weighted fuzzy average k-means is proposed to cluster the load curves of consumers into an optimal number of clusters, through which the load profile of each cluster is determined. Having obtained the load profile of each cluster, real-time prices are given to each cluster, which is the best price given to all consumers in that cluster.
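
    A stripped-down version of the pricing scheme, using ordinary k-means rather than the paper's improved weighted fuzzy average k-means, with invented smart-meter profiles and an illustrative peak-to-average pricing rule:

```python
def dist2(p, q):
    """Squared Euclidean distance between two load profiles."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(profiles, k, iters=10):
    """Plain Lloyd's k-means; deterministic init (an assumption):
    first and last profiles for k=2, else the first k profiles."""
    if k == 2:
        centroids = [profiles[0][:], profiles[-1][:]]
    else:
        centroids = [profiles[i][:] for i in range(k)]
    labels = [0] * len(profiles)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: dist2(p, centroids[c]))
                  for p in profiles]
        for c in range(k):
            members = [p for p, l in zip(profiles, labels) if l == c]
            if members:
                centroids[c] = [sum(v) / len(members) for v in zip(*members)]
    return labels, centroids

# Hypothetical smart-meter data: 4 daily readings (kW) per consumer.
profiles = [
    [4.8, 1.2, 1.0, 1.1],   # morning-peak consumers
    [5.1, 1.0, 1.2, 0.9],
    [4.5, 1.3, 0.9, 1.2],
    [1.0, 1.1, 1.2, 5.2],   # evening-peak consumers
    [0.9, 1.2, 1.0, 4.7],
    [1.2, 0.8, 1.1, 5.0],
]
labels, centroids = kmeans(profiles, 2)

# Illustrative rule: price each cluster by its peak-to-average ratio.
base = 0.10  # assumed base tariff, $/kWh
prices = [round(base * max(c) / (sum(c) / len(c)), 3) for c in centroids]
```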

  18. Using Neutron-based techniques to investigate battery behaviour

    International Nuclear Information System (INIS)

    Pramudita, James C.; Goonetilleke, Damien; Sharma, Neeraj; Peterson, Vanessa K.

    2016-01-01

    The extensive use of portable electronic devices has given rise to increasing demand for reliable high energy density storage in the form of batteries. Today, lithium-ion batteries (LIBs) are the leading technology as they offer high energy density and relatively long lifetimes. Despite their widespread adoption, Li-ion batteries still suffer from significant degradation in their performance over time. The most obvious degradation in lithium-ion battery performance is capacity fade – where the capacity of the battery decreases after extended cycling. This talk will focus on how in situ time-resolved neutron powder diffraction (NPD) can be used to gain a better understanding of the structural changes which contribute to the observed capacity fade. The commercial batteries studied each feature different electrochemical and storage histories that are precisely known, allowing us to elucidate the tell-tale signs of battery degradation using NPD and relate these to battery history. Moreover, this talk will also showcase the diverse use of other neutron-based techniques such as neutron imaging to study electrolyte concentrations in lead-acid batteries, and the use of quasi-elastic neutron scattering to study Na-ion dynamics in sodium-ion batteries.

  19. Light based techniques for improving health care: studies at RRCAT

    International Nuclear Information System (INIS)

    Gupta, P.K.; Patel, H.S.; Ahlawat, S.

    2015-01-01

    The invention of the laser in 1960, the phenomenal advances in photonics, and the information-processing capability of computers have given a major boost to R and D activity on the use of light for high-resolution biomedical imaging, sensitive non-invasive diagnosis and precision therapy. The effort has resulted in remarkable progress, and it is widely believed that light-based techniques hold great potential to offer simpler, portable systems which can help provide diagnostics and therapy in low-resource settings. At Raja Ramanna Centre for Advanced Technology (RRCAT), extensive studies have been carried out on fluorescence spectroscopy of native tissue. This work led to two important outcomes: first, a better understanding of tissue fluorescence and insights into the possible use of fluorescence spectroscopy for screening of cancer; and second, the development of diagnostic systems that can serve as standalone tools for non-invasive screening of cancer of the oral cavity. Optical coherence tomography setups and their functional extensions (polarization-sensitive, Doppler) have also been developed and used for high-resolution (∼10 µm) biomedical imaging applications, in particular for non-invasive monitoring of the healing of wounds. Chlorophyll-based photosensitisers and their derivatives have been synthesized in house and used for photodynamic therapy of tumors in animal models and for antimicrobial applications. Various variants of optical tweezers (holographic, Raman etc.) have also been developed and utilised for different applications, notably Raman spectroscopy of optically trapped red blood cells. An overview of these activities carried out at RRCAT is presented in this article. (author)

  20. Weighted graph based ordering techniques for preconditioned conjugate gradient methods

    Science.gov (United States)

    Clift, Simon S.; Tang, Wei-Pai

    1994-01-01

    We describe the basis of a matrix ordering heuristic for improving the incomplete factorization used in preconditioned conjugate gradient techniques applied to anisotropic PDEs. Several new matrix ordering techniques, derived from well-known algorithms in combinatorial graph theory, which attempt to implement this heuristic, are described. These ordering techniques are tested against a number of matrices arising from linear anisotropic PDEs, and compared with other matrix ordering techniques. A variation of RCM (reverse Cuthill-McKee) is shown to generally improve the quality of incomplete factorization preconditioners.
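
    The classical reverse Cuthill-McKee (RCM) reordering mentioned above can be sketched for a small graph. Reducing matrix bandwidth concentrates nonzeros near the diagonal, which is what makes incomplete factorizations of the reordered matrix better preconditioners. A minimal sketch for a connected graph, with an invented example:

```python
from collections import deque

def bandwidth(adj, order):
    """Matrix bandwidth induced by a vertex ordering."""
    pos = {v: i for i, v in enumerate(order)}
    return max(abs(pos[u] - pos[v]) for u in adj for v in adj[u])

def rcm(adj):
    """Reverse Cuthill-McKee for a connected graph: BFS from a
    minimum-degree vertex, visiting neighbours by increasing degree,
    then reverse the visit order."""
    start = min(adj, key=lambda v: len(adj[v]))
    seen, order, q = {start}, [], deque([start])
    while q:
        v = q.popleft()
        order.append(v)
        for u in sorted(adj[v], key=lambda u: len(adj[u])):
            if u not in seen:
                seen.add(u)
                q.append(u)
    return order[::-1]

# A path graph with a deliberately bad vertex numbering.
edges = [(0, 3), (3, 5), (5, 1), (1, 4), (4, 2)]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

print(bandwidth(adj, sorted(adj)))  # natural order: bandwidth 4
print(bandwidth(adj, rcm(adj)))     # RCM order: bandwidth 1
```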

  1. Fiscal 1999 survey report. Survey of international cooperation over energy use rationalization techniques; 1999 nendo energy shiyo gorika shuho kokusai kyoryoku chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    In the past, studies have been conducted involving the possibility of development of information offering techniques for the realization of a sustainable society, for which LCA (life cycle assessment) as a tool for reducing energy consumption and environmental impact is investigated and case studies are conducted in this connection. In this fiscal year, for the purpose of deliberating how to utilize the LCA results, a survey is conducted of how LCA is being used as a tool for constructing an environmental management system as set forth in the ISO14000 series, by holding conferences with LCA researchers representing the respective countries involved. Investigations are conducted into the actualities of the environmental management system, environmental performance assessment, and environmental labelling whose standardization has been under way in compliance with the ISO14000 series; into the actualities of matters relating to assessment techniques and decision making, such as environmentally friendly design, supply chain management and reports on environments, which are becoming established in enterprises; and into the actualities of access to and disclosure of information. International cooperative research is conducted, with participation by five leading organizations of Sweden, Germany, Denmark, and Canada, and Japan's National Institute for Resources and Development, where actual states of LCA utilization are introduced and improvements to LCA techniques are discussed. (NEDO)

  3. Combination of multielement technique (INAA and ICP-MS) for a French air pollution bio-monitoring survey using mosses

    International Nuclear Information System (INIS)

    Ayrault, S.; Deschamps, C.; Amblard, G.; Galsomies, L.; Letrouit-Galinou, M.A.; Bonhomme, P.

    1998-01-01

    This work presents the use of two trace analysis techniques through the data obtained for a significant part of the 557 moss samples collected in France. Samples were collected within the framework of the European survey carried out in 1995-1996 and proposed by the Nordic Council. The analyses were produced with a combination of two multielement analysis techniques: INAA (Instrumental Neutron Activation Analysis) and ICP-MS (Inductively Coupled Plasma Mass Spectrometry). These two techniques were suitable for trace analyses in mosses. They were clearly complementary and provided data for 36 elements, including the heavy metals of key interest in air pollution studies. The choice of the technique for a given element depended on the feasibility (e.g. Pb is not attainable by INAA), the detection limit, the analytical variability, the preparation procedures and the concentration ranges (5-100 μg/g for Pb, 0.5-5 μg/g for As). INAA measured the total content in the sample, while ICP-MS demanded a mineralization procedure resulting in loss/contamination hazards. Thus, INAA results were preferred, although this technique is time-consuming. However, the ICP-MS results for Cd, Cu, Ni and Pb were retained, for different reasons: detection limits (Cd, Cu), no convenient INAA conditions (Ni), and feasibility (Pb). (authors)

  4. Survey to explore understanding of the principles of aseptic technique: Qualitative content analysis with descriptive analysis of confidence and training.

    Science.gov (United States)

    Gould, Dinah J; Chudleigh, Jane; Purssell, Edward; Hawker, Clare; Gaze, Sarah; James, Deborah; Lynch, Mary; Pope, Nicola; Drey, Nicholas

    2018-04-01

    In many countries, aseptic procedures are undertaken by nurses in the general ward setting, but variation in practice has been reported, and evidence indicates that the principles underpinning aseptic technique are not well understood. A survey was conducted, employing a brief, purpose-designed, self-reported questionnaire. The response rate was 72%. Of those responding, 65% of nurses described aseptic technique in terms of the procedure used to undertake it, and 46% understood the principles of asepsis. The related concepts of cleanliness and sterilization were frequently confused with one another. Additionally, 72% reported that they had not received training for at least 5 years; 92% were confident of their ability to apply aseptic technique; and 90% reported that they had not been reassessed since their initial training. Qualitative analysis confirmed a lack of clarity about the meaning of aseptic technique. Nurses' understanding of aseptic technique and the concepts of sterility and cleanliness is inadequate, a finding in line with results of previous studies. This knowledge gap potentially places patients at risk. Nurses' understanding of the principles of asepsis could be improved. Further studies should establish the generalizability of the study findings. Possible improvements include renewed emphasis during initial nurse education, greater opportunity for updating knowledge and skills post-qualification, and audit of practice. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  5. The sterile-insect technique for the control of fruit flies: a survey

    International Nuclear Information System (INIS)

    Harris, E.J.

    1975-01-01

    Some advantages of the sterile-insect technique (SIT) are its minimum contribution to environmental pollution and its minimum adverse effect on non-target organisms. A review is made of the melon fly and sterile Mediterranean fruit fly release programmes, the accomplishments, and the implications. Recommendations are made for research leading to development of methods for practical use of the SIT. (author)

  6. Results and analysis of the 2008-2009 Insulin Injection Technique Questionnaire survey

    NARCIS (Netherlands)

    De Coninck, Carina; Frid, Anders; Gaspar, Ruth; Hicks, Debbie; Hirsch, Larry; Kreugel, Gillian; Liersch, Jutta; Letondeur, Corinne; Sauvanet, Jean-Pierre; Tubiana, Nadia; Strauss, Kenneth

    Background: The efficacy of injection therapy in diabetes depends on correct injection technique and, to provide patients with guidance in this area, we must understand how they currently inject. Methods: From September 2008 to June 2009, 4352 insulin-injecting Type 1 and Type 2 diabetic patients

  7. Vector Quantization of Harmonic Magnitudes in Speech Coding Applications—A Survey and New Technique

    Directory of Open Access Journals (Sweden)

    Wai C. Chu

    2004-12-01

    Full Text Available A harmonic coder extracts the harmonic components of a signal and represents them efficiently using a few parameters. The principles of harmonic coding have proved quite successful, and several standardized speech and audio coders are based on them. One of the key issues in harmonic coder design is the quantization of harmonic magnitudes, for which many proposals have appeared in the literature. The objective of this paper is to provide a survey of the various techniques that have appeared in the literature for vector quantization of harmonic magnitudes, with emphasis on those adopted by the major speech coding standards; these include constant magnitude approximation, partial quantization, dimension conversion, and variable-dimension vector quantization (VDVQ). In addition, a refined VDVQ technique is proposed, and experimental data are provided to demonstrate its effectiveness.
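
    The dimension-conversion idea, mapping a variable number of harmonic magnitudes to a fixed-dimension vector before codebook lookup, can be sketched as follows. The codebook and magnitude values are invented for illustration:

```python
def resample(mags, m):
    """Dimension conversion by linear interpolation: len(mags) -> m."""
    n = len(mags)
    out = []
    for i in range(m):
        t = i * (n - 1) / (m - 1)     # fractional index into mags
        lo = int(t)
        hi = min(lo + 1, n - 1)
        out.append(mags[lo] + (t - lo) * (mags[hi] - mags[lo]))
    return out

def quantize(vec, codebook):
    """Index of the nearest codevector (squared-error criterion)."""
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(vec, codebook[i])))

# Toy fixed-dimension codebook (an assumption): flat vs falling spectra.
codebook = [[1.0, 1.0, 1.0, 1.0],
            [3.0, 2.0, 1.0, 0.0]]

mags = [3.1, 2.6, 2.0, 1.4, 1.0, 0.4, 0.1]   # 7 harmonic magnitudes
idx = quantize(resample(mags, 4), codebook)   # falling spectrum -> index 1
```

    A real coder would train the codebook (e.g. with the LBG algorithm) and transmit only the index; the decoder resamples the codevector back to the frame's harmonic count.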

  8. A Survey on Evolutionary Algorithm Based Hybrid Intelligence in Bioinformatics

    Directory of Open Access Journals (Sweden)

    Shan Li

    2014-01-01

    Full Text Available With the rapid advance in genomics, proteomics, metabolomics, and other types of omics technologies during the past decades, a tremendous amount of data related to molecular biology has been produced. It is becoming a big challenge for bioinformatists to analyze and interpret these data with conventional intelligent techniques, for example, support vector machines. Recently, hybrid intelligent methods, which integrate several standard intelligent approaches, have become more and more popular due to their robustness and efficiency. In particular, hybrid intelligent approaches based on evolutionary algorithms (EAs) are widely used in various fields due to the efficiency and robustness of EAs. In this review, we introduce the applications of hybrid intelligent methods, particularly those based on evolutionary algorithms, in bioinformatics, focusing on three common problems: feature selection, parameter estimation, and reconstruction of biological networks.
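
    As a minimal example of the evolutionary side of such hybrids, the sketch below runs a small genetic algorithm over feature-selection masks. The fitness function is a synthetic stand-in for a real wrapper score (e.g. cross-validated classifier accuracy), and all parameters are assumptions:

```python
import random

random.seed(0)
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]  # hypothetical informative-feature mask

def fitness(mask):
    # surrogate score: +1 per position agreeing with the informative mask;
    # in a real hybrid this would be a classifier's validation accuracy
    return sum(m == t for m, t in zip(mask, TARGET))

def tournament(pop, size=3):
    return max(random.sample(pop, size), key=fitness)

def crossover(a, b):
    return [x if random.random() < 0.5 else y for x, y in zip(a, b)]

def mutate(mask, rate=0.05):
    return [1 - m if random.random() < rate else m for m in mask]

pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
initial_best = max(map(fitness, pop))
for _ in range(40):
    elite = max(pop, key=fitness)   # elitism: best mask survives unchanged
    pop = [elite] + [mutate(crossover(tournament(pop), tournament(pop)))
                     for _ in range(len(pop) - 1)]
final_best = max(map(fitness, pop))
```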

  9. Visualization-based analysis of multiple response survey data

    Science.gov (United States)

    Timofeeva, Anastasiia

    2017-11-01

    During a survey, respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because of the need to process multiple response variables. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, it is suggested to use similarity and association matrices. Some aggregate indicators of dissimilarity (similarity) are proposed, based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated by the example of the analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources. This information is very important for the allocation of the advertising budget. The differences between target groups in advertising sources are of interest. To identify such differences, the hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared. An alternative procedure is suggested. It is based on partition of the consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between population proportions. It turned out to be more suitable for the real problem being solved.
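
    The association-matrix indicator described above can be sketched with invented multiple-response data: one binary indicator column per answer option, pairwise Jaccard association between options, and a maximum eigenvalue obtained by power iteration:

```python
def jaccard(a, b):
    """Association between two answer options from 0/1 indicator columns."""
    inter = sum(x and y for x, y in zip(a, b))
    union = sum(x or y for x, y in zip(a, b))
    return inter / union if union else 0.0

def max_eigenvalue(M, iters=200):
    """Dominant eigenvalue of a nonnegative matrix by power iteration."""
    n = len(M)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        v = [x / lam for x in w]
    return lam

# Hypothetical responses: rows = respondents; columns = TV, radio, web.
data = [
    [1, 1, 0],
    [1, 1, 0],
    [1, 0, 1],
    [0, 0, 1],
    [1, 1, 0],
]
cols = list(zip(*data))
S = [[jaccard(a, b) for b in cols] for a in cols]  # association matrix
lam = max_eigenvalue(S)   # aggregate overlap indicator (larger = more overlap)
```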

  10. Using benchmarking techniques and the 2011 maternity practices infant nutrition and care (mPINC) survey to improve performance among peer groups across the United States.

    Science.gov (United States)

    Edwards, Roger A; Dee, Deborah; Umer, Amna; Perrine, Cria G; Shealy, Katherine R; Grummer-Strawn, Laurence M

    2014-02-01

    A substantial proportion of US maternity care facilities engage in practices that are not evidence-based and that interfere with breastfeeding. The CDC Survey of Maternity Practices in Infant Nutrition and Care (mPINC) showed significant variation in maternity practices among US states. The purpose of this article is to use benchmarking techniques to identify states within relevant peer groups that were top performers on mPINC survey indicators related to breastfeeding support. We used 11 indicators of breastfeeding-related maternity care from the 2011 mPINC survey and benchmarking techniques to organize and compare hospital-based maternity practices across the 50 states and Washington, DC. We created peer categories for benchmarking first by region (grouping states by West, Midwest, South, and Northeast) and then by size (grouping states by the number of maternity facilities and dividing each region into approximately equal halves based on the number of facilities). Thirty-four states had scores high enough to serve as benchmarks, and 32 states had scores low enough to reflect the lowest score gap from the benchmark on at least 1 indicator. No state served as the benchmark on more than 5 indicators and no state was furthest from the benchmark on more than 7 indicators. The small peer group benchmarks in the South, West, and Midwest were better than the large peer group benchmarks on 91%, 82%, and 36% of the indicators, respectively. In the West large, the Midwest large, the Midwest small, and the South large peer groups, 4-6 benchmarks showed that less than 50% of hospitals have ideal practice in all states. The evaluation presents benchmarks for peer group state comparisons that provide potential and feasible targets for improvement.

  11. New sensor and non-contact geometrical survey for the vibrating wire technique

    Energy Technology Data Exchange (ETDEWEB)

    Geraldes, Renan [Brazilian Synchrotron Light Laboratory (LNLS), Campinas, SP (Brazil); Junqueira Leão, Rodrigo, E-mail: rodrigo.leao@lnls.br [Brazilian Synchrotron Light Laboratory (LNLS), Campinas, SP (Brazil); Cernicchiaro, Geraldo [Brazilian Center for Research in Physics (CBPF), Rio de Janeiro, RJ (Brazil); Terenzi Neuenschwander, Regis; Citadini, James Francisco; Droher Rodrigues, Antônio Ricardo [Brazilian Synchrotron Light Laboratory (LNLS), Campinas, SP (Brazil)

    2016-03-01

    The tolerances for the alignment of the magnets in the girders of the next machine of the Brazilian Synchrotron Light Laboratory (LNLS), Sirius, are as small as 40 µm for translations and 0.2 mrad for rotations. Therefore, a novel approach to the well-known vibrating wire technique has been developed and tested for the precise fiducialization of magnets. The alignment bench consists of four commercial linear stages, a stretched wire, a commercial lock-in amplifier working with phase-locked loop (PLL), a coordinate measuring machine (CMM) and a vibration sensor for the wire. This novel sensor has been designed for a larger linear region of operation. For the mechanical metrology step of the fiducialization of quadrupoles an innovative technique, using the vision system of the CMM, is presented. While the work with pitch and yaw orientations is still ongoing with promising partial results, the system already presents an uncertainty level below 10 µm for translational alignment.

  12. Survey of Techniques for Deep Web Source Selection and Surfacing the Hidden Web Content

    OpenAIRE

    Khushboo Khurana; M.B. Chandak

    2016-01-01

    Large and continuously growing dynamic web content has created new opportunities for large-scale data analysis in recent years. There is a huge amount of information that traditional web crawlers cannot access, since they use a link analysis technique by which only the surface web can be accessed. Traditional search engine crawlers require the web pages to be linked to other pages via hyperlinks, causing a large amount of web data to be hidden from the crawlers. Enormous data is available in...

  13. A Survey of Congestion Control Techniques and Data Link Protocols in Satellite Networks

    OpenAIRE

    Fahmy, Sonia; Jain, Raj; Lu, Fang; Kalyanaraman, Shivkumar

    1998-01-01

    Satellite communication systems are the means of realizing a global broadband integrated services digital network. Due to the statistical nature of the integrated services traffic, the resulting rate fluctuations and burstiness render congestion control a complicated, yet indispensable function. The long propagation delay of the earth-satellite link further imposes severe demands and constraints on the congestion control schemes, as well as the media access control techniques and retransmissi...

  14. Survey of geophysical techniques for site characterization in basalt, salt and tuff

    International Nuclear Information System (INIS)

    Jones, G.M.; Blackey, M.E.; Rice, J.E.; Murphy, V.J.; Levine, E.N.; Fisk, P.S.; Bromery, R.W.

    1987-07-01

    Geophysical techniques may help determine the nature and extent of faulting in the target areas, along with structural information that would be relevant to questions concerning the future integrity of a high-level-waste repository. Chapters focus on particular geophysical applications to four rock types - basalt, bedded salt, domal salt and tuff - characteristic of the sites originally proposed for site characterization. No one geophysical method can adequately characterize the geological structure beneath any site. The seismic reflection method, which is generally considered to be the most incisive of the geophysical techniques, has to date provided only marginal information on structure at the depth of the proposed repository at the Hanford, Washington, site, and no useful results at all at the Yucca Mountain, Nevada, site. This result is partially due to geological complexity beneath these sites, but may also be partially attributed to the use of inappropriate acquisition and processing parameters. To adequately characterize a site using geophysics, modifications will have to be made to standard techniques to emphasize structural details at the depths of interest. 137 refs., 43 figs., 4 tabs

  15. Contextualising Water Use in Residential Settings: A Survey of Non-Intrusive Techniques and Approaches

    Directory of Open Access Journals (Sweden)

    Davide Carboni

    2016-05-01

Full Text Available Water monitoring in households is important to ensure the sustainability of fresh water reserves on our planet. It provides stakeholders with the statistics required to formulate optimal strategies in residential water management. However, this should not be prohibitive, and appliance-level water monitoring cannot practically be achieved by deploying sensors on every faucet or water-consuming device of interest, due to the high hardware costs and complexity, not to mention the risk of accidental leakages that can derive from the extra plumbing needed. Machine learning and data mining are promising techniques for analysing monitored data to obtain non-intrusive water usage disaggregation, because they can discern water usage from the aggregated data acquired from a single point of observation. This paper provides an overview of water usage disaggregation systems and related techniques adopted for water event classification. The state of the art of algorithms and testbeds used for fixture recognition is reviewed, and a discussion of the prominent challenges and future research is also included.

  16. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    Science.gov (United States)

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017.

  17. SPAM CLASSIFICATION BASED ON SUPERVISED LEARNING USING MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Hamsapriya

    2011-12-01

Full Text Available E-mail is one of the most popular and frequently used ways of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. The flaws in the e-mail protocols and the increasing amount of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of today’s Internet, bringing financial damage to companies and annoying individual users. Spam emails invade users without their consent and fill their mailboxes. They consume network capacity as well as time spent checking and deleting spam mails. The vast majority of Internet users are outspoken in their disdain for spam, although enough of them respond to commercial offers that spam remains a viable source of income to spammers. While most users want to do the right thing to avoid and get rid of spam, they need clear and simple guidelines on how to behave. In spite of all the measures taken to eliminate spam, it has not yet been eradicated, and when the countermeasures are over-sensitive, even legitimate emails are eliminated. Among the approaches developed to stop spam, filtering is one of the most important techniques. Much research in spam filtering has centered on the more sophisticated classifier-related issues. In recent years, machine learning for spam classification has become an important research issue. The proposed work explores and identifies the effectiveness of different learning algorithms for classifying spam messages from e-mail. A comparative analysis among the algorithms is also presented.
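Bayesian filtering is one of the classifier families commonly compared in such studies. As an illustration only (a minimal sketch, not the implementation evaluated in the paper), a multinomial Naive Bayes spam filter with Laplace smoothing can be written in a few lines of Python:

```python
import math
from collections import Counter

class NaiveBayesSpamFilter:
    """Multinomial Naive Bayes with Laplace (add-one) smoothing."""

    def __init__(self):
        self.word_counts = {"spam": Counter(), "ham": Counter()}
        self.doc_counts = {"spam": 0, "ham": 0}

    def train(self, text, label):
        # Count one more document and its tokens for the given class.
        self.doc_counts[label] += 1
        self.word_counts[label].update(text.lower().split())

    def classify(self, text):
        vocab = set(self.word_counts["spam"]) | set(self.word_counts["ham"])
        total_docs = sum(self.doc_counts.values())
        scores = {}
        for label in ("spam", "ham"):
            # Log prior plus smoothed log likelihood of each token.
            score = math.log(self.doc_counts[label] / total_docs)
            total_words = sum(self.word_counts[label].values())
            for word in text.lower().split():
                count = self.word_counts[label][word]
                score += math.log((count + 1) / (total_words + len(vocab)))
            scores[label] = score
        return max(scores, key=scores.get)
```

Training on a handful of labelled messages and calling `classify` on new text returns the higher-scoring class; real filters add tokenization, feature selection, and larger corpora on top of this skeleton.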

  18. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log n) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
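For readers unfamiliar with the alias method the abstract relies on, the following Python sketch (Vose's variant; an illustration, not the MCNP subroutine described above) shows how a table built once in O(n) time supports O(1) draws from a discrete distribution such as per-voxel source strengths:

```python
import random

def build_alias_table(probs):
    """Construct probability/alias tables (Vose's method) for O(1) sampling."""
    n = len(probs)
    prob = [0.0] * n
    alias = [0] * n
    # Scale probabilities by n and split indices into small/large worklists.
    scaled = [p * n for p in probs]
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s] = scaled[s]
        alias[s] = l
        # The large entry donates mass to fill column s up to 1.
        scaled[l] = (scaled[l] + scaled[s]) - 1.0
        (small if scaled[l] < 1.0 else large).append(l)
    for leftover in small + large:
        prob[leftover] = 1.0
    return prob, alias

def alias_sample(prob, alias, rng=random):
    """Draw one index in O(1): pick a column, keep it or take its alias."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

Each draw costs two random numbers and one comparison regardless of the number of voxels, which is what makes the method competitive with uniform spatial sampling.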

  19. Using a web-based survey tool to undertake a Delphi study: application for nurse education research.

    Science.gov (United States)

    Gill, Fenella J; Leslie, Gavin D; Grech, Carol; Latour, Jos M

    2013-11-01

    The Internet is increasingly being used as a data collection medium to access research participants. This paper reports on the experience and value of using web-survey software to conduct an eDelphi study to develop Australian critical care course graduate practice standards. The eDelphi technique used involved the iterative process of administering three rounds of surveys to a national expert panel. The survey was developed online using SurveyMonkey. Panel members responded to statements using one rating scale for round one and two scales for rounds two and three. Text boxes for panel comments were provided. For each round, the SurveyMonkey's email tool was used to distribute an individualized email invitation containing the survey web link. The distribution of panel responses, individual responses and a summary of comments were emailed to panel members. Stacked bar charts representing the distribution of responses were generated using the SurveyMonkey software. Panel response rates remained greater than 85% over all rounds. An online survey provided numerous advantages over traditional survey approaches including high quality data collection, ease and speed of survey administration, direct communication with the panel and rapid collation of feedback allowing data collection to be undertaken in 12 weeks. Only minor challenges were experienced using the technology. Ethical issues, specific to using the Internet to conduct research and external hosting of web-based software, lacked formal guidance. High response rates and an increased level of data quality were achieved in this study using web-survey software and the process was efficient and user-friendly. However, when considering online survey software, it is important to match the research design with the computer capabilities of participants and recognize that ethical review guidelines and processes have not yet kept pace with online research practices. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim; Muller, Hendrik; Adam, Frederick M.; Panda, Saroj K.; Witt, Matthias; Al-Hajji, Adnan A.; Sarathy, Mani

    2015-01-01

Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated

  1. Survey of the University "Luis Vargas Torres" through Econometric Techniques. Comprehensive Income

    Directory of Open Access Journals (Sweden)

    Ramón Rodríguez-Betancourt

    2015-12-01

Full Text Available The 2013-2017 Government Programme defines, for higher education, the principles of curriculum proposals for careers of high public interest. In this regard, the authorities of the Technical University "Luis Vargas Torres" of Esmeraldas have decided to evaluate the university's management and its actions with respect to the development of the province and the country. The objective of this research is therefore to apply a survey to students of different specialties in the Faculty of Engineering and Technology, to explore their views on the educational process, research, outreach, culture and sport. Stratified random sampling was applied, with results showing that key processes are completed at 66% on average, indicating that the authorities still need to set goals to overcome the remaining shortcomings, with an emphasis on research.

  2. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques.

    Science.gov (United States)

    Bergquist, Magnus; Nilsson, Andreas; Hansla, André

    2017-01-01

Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies participants (n = 347) were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1) and higher personal norms for non-targeted pro-environmental behaviors (Study 2). These findings suggest that the contest-based intervention technique frames a gain goal, while the norm-based intervention frames a normative goal.

  3. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques

    Directory of Open Access Journals (Sweden)

    Magnus Bergquist

    2017-11-01

Full Text Available Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies participants (n = 347) were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1) and higher personal norms for non-targeted pro-environmental behaviors (Study 2). These findings suggest that the contest-based intervention technique frames a gain goal, while the norm-based intervention frames a normative goal.

  4. FPGA based mixed-signal circuit novel testing techniques

    International Nuclear Information System (INIS)

    Pouros, Sotirios; Vassios, Vassilios; Papakostas, Dimitrios; Hristov, Valentin

    2013-01-01

Fault detection techniques for electronic circuits, especially modern mixed-signal circuits, have evolved and been customized around the world to meet industry needs. The paper presents techniques used for fault detection in mixed-signal circuits. Moreover, the paper covers standardized methods, along with current innovations for external testing such as Design for Testability (DfT) and Built-In Self-Test (BIST) systems. Finally, the research team introduces a circuit implementation scheme using an FPGA

  5. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2007-01-01

Full Text Available In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. Here, a comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of the application of robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today’s cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers’ interest in the fusion of intelligent sensors and optimal signal processing techniques.

  6. Survey: technique of performing intravitreal injection among members of the Brazilian Retina and Vitreous Society (SBRV)

    Directory of Open Access Journals (Sweden)

    Helio F. Shiroma

    2015-02-01

Full Text Available Purpose: To evaluate and describe the precautions involved in the technique of intravitreal injection of antiangiogenic drugs adopted by the ophthalmologists who are members of the Brazilian Society of Retina and Vitreous (SBRV). Methods: A questionnaire containing 22 questions related to precautions taken before, during, and after intravitreal injection was sent electronically to 920 members of SBRV between November 15, 2013 and April 31, 2014. Results: 352 responses (38%) were obtained. There was a predominance of men (76%) from the southwest region of Brazil (51%). The professional experience varied between 6 and 15 years after medical specialization (50%). Most professionals (76%) performed an average of 1 to 10 intravitreal injections a week, and 88% of the procedures were performed in the operating room using povidone iodine (99%), sterile gloves, and blepharostat (94%). For inducing topical anesthesia, usage of anesthetic eye drops was the most used technique (65%). Ranibizumab (Lucentis®) was the most common drug (55%), and age-related macular degeneration (AMD) was the most treated disease (57%). Regarding the complications treated, 6% of the ophthalmologists had treated at least one case of retinal detachment, 20% had treated cases of endophthalmitis, 9% had treated cases of vitreous hemorrhage, and 12% had encountered cases of crystalline lens touch. Conclusion: Intravitreal injection is a procedure routinely performed by retina specialists and has a low incidence of complications. Performing the procedure in the operating room using an aseptic technique was preferred by most of the respondents. Ranibizumab was the most used drug, and AMD was the most treated disease.

  7. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    Science.gov (United States)

    Batukaev, Abdulmalik

    2016-04-01

The world water strategy must change, because the current imitational gravitational frontal isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water - a global deficit of up to 4-15 times - and adverse effects on soils and landscapes. Current methods of irrigation do not control the spread of water throughout the soil continuum. Preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost into the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon and non-imitating methods for Sustainable Natural Resources Management. The new paradigm of irrigation is based on the intra-soil pulse discrete method of water supply into the soil continuum by injection in small discrete portions. An individual volume of water is supplied as a vertical cylinder of preliminary soil watering. The cylinder position in the soil is at a depth of 10 to 30 cm, and the diameter of the cylinder is 1-2 cm. Within 5-10 min after injection, the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film and vapor transfer. A small amount of water is transferred gravitationally to a depth of 35-40 cm. The soil watering cylinder position in the soil profile is at a depth of 5-50 cm, with a cylinder diameter of 2-4 cm. The lateral distance between adjacent cylinders along the plant row is 10-15 cm. The carcass of non-watered soil surrounding the cylinder remains relatively dry and mechanically stable. After water injection, the structure of the soil in the cylinder restores quickly because there is no compression from the stable adjoining volume of soil, and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. At this potential...

  8. [Surveying a zoological facility through satellite-based geodesy].

    Science.gov (United States)

    Böer, M; Thien, W; Tölke, D

    2000-06-01

In the course of a thesis submitted for a diploma degree at the Fachhochschule Oldenburg, the Serengeti Safaripark was surveyed in autumn and winter 1996/97, laying the planning foundations for licence applications to the controlling authorities. Taking into consideration the special way of keeping animals in the Serengeti Safaripark (game ranching, spacious walk-through facilities), the intention was to employ satellite-based geodesy. This technology relies on special aerials receiving signals from the 24 satellites that circle the globe. These data are gathered and examined, and the examination yields the exact position of the aerial in a coordinate system, which allows this point to be depicted on a map. This procedure was used both stationary (from a strictly defined point) and in motion (in a moving car). Additionally, conventional procedures were used where satellite-based geodesy reached its limits. Finally, a detailed map of the Serengeti Safaripark was created, which shows the position and size of stables and enclosures as well as wooded and water areas and the sectors of the leisure park. Furthermore, the established areas of the enclosures, together with an already existing animal databank, have been incorporated into an information system with the help of which the stock of animals can be managed per enclosure.

  9. Web-based drug repurposing tools: a survey.

    Science.gov (United States)

    Sam, Elizabeth; Athri, Prashanth

    2017-10-06

Drug repurposing (a.k.a. drug repositioning) is the search for new indications or molecular targets distinct from a drug's putative activity, pharmacological effect or binding specificities. With the ever-increasing rates of termination of drugs in clinical trials, drug repositioning has risen as one of the effective solutions against the risk of drug failures. Repositioning finds a way to reverse the grim but real trend that Eroom's law portends for the pharmaceutical and biotech industry, and drug discovery in general. Further, the advent of high-throughput technologies to explore biological systems has enabled the generation of zettabytes of data and a massive collection of databases that store them. Computational analytics and mining are frequently used as effective tools to explore this byzantine series of biological and biomedical data. However, advanced computational tools are often difficult to understand or use, thereby limiting their accessibility to scientists without a strong computational background. Hence it is of great importance to build user-friendly interfaces to extend the user-base beyond computational scientists, to include life scientists who may have deeper chemical and biological insights. This survey is focused on systematically presenting the available Web-based tools that aid in repositioning drugs. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Knowledge and use of evidence-based nutrition : a survey of paediatric dietitians

    NARCIS (Netherlands)

    Thomas, DE; Kukuruzovic, R; Martino, B; Chauhan, SS; Elliott, EJ

    2003-01-01

    Objective To survey paediatric dietitians' knowledge and use of evidence-based nutrition (EBN). Design Cross-sectional survey using reply-paid questionnaires. Subjects Paediatric dietitians in Australian teaching hospitals. Main outcome measures Age, sex, appointment, clinical practice, research

  11. A Survey on Formal Verification Techniques for Safety-Critical Systems-on-Chip

    Directory of Open Access Journals (Sweden)

    Tomás Grimm

    2018-05-01

Full Text Available The high degree of miniaturization in the electronics industry has been, for several years, a driver pushing embedded systems into different fields and applications. One example is safety-critical systems, where compactness in the form factor helps to reduce costs and allows for the implementation of new techniques. The automotive industry is a great example of a safety-critical area with a great rise in the adoption of microelectronics. With it came the creation of the ISO 26262 standard, with the goal of guaranteeing a high level of dependability in the designs. Other areas in the safety-critical applications domain have similar standards. However, these standards are mostly guidelines to make sure that designs reach the desired dependability level, without explicit instructions. In the end, the success of a design in fulfilling the standard is the result of a thorough verification process. Naturally, the goal of any verification team dealing with such important designs is complete coverage as well as standards conformity, but as these designs are complex hardware, complete functional verification is a difficult task. Of the several techniques that exist to verify hardware, each with its pros and cons, we studied six that are well established in academia and in industry. We can divide them into two categories: simulation, which needs extremely large amounts of time, and formal verification, which needs unrealistic amounts of resources. Therefore, we conclude that a hybrid approach offers the best balance between simulation (time) and formal verification (resources).

  12. A Performance Survey on Stack-based and Register-based Virtual Machines

    OpenAIRE

    Fang, Ruijie; Liu, Siqi

    2016-01-01

Virtual machines have been widely adopted for high-level programming language implementations and for providing a degree of platform neutrality. As the overall use and adoption of virtual machines grow, their overall performance has become a widely discussed topic. In this paper, we present a survey on the performance differences of the two most widely adopted types of virtual machines - the stack-based virtual machine and the register-based virtual machine - using various...

  13. Proximal occlusion of hydrosalpinges by Essure(®) before assisted reproduction techniques: a French survey.

    Science.gov (United States)

    Legendre, Guillaume; Moulin, Julie; Vialard, Jean; Ziegler, Dominique D E; Fanchin, Renato; Pouly, Jean Luc; Watrelot, Antoine; Belaisch Allart, Joëlle; Massin, Nathalie; Fernandez, Hervé

    2014-10-01

To study the feasibility and results (live-birth and complication rates) of placement of Essure(®) microinserts before assisted reproductive technology (ART) treatment of women with hydrosalpinx when laparoscopy should be avoided. Study design: National survey of 45 French hospital centres providing ART, reporting a retrospective analysis of 43 women with unilateral or bilateral hydrosalpinges and Essure(®) placement. The results of the following ART cycle were studied for 54 embryo transfers. The placement success rate reached 92.8% (65/70 tubes), and the mean number of visible intrauterine coils was 1.61 (range: 0-6). Pyosalpinx occurred in one case, and expulsion of the device into the uterus in two others. Of 43 women, 29 (67.4%) had a total of 54 fresh or frozen embryos transferred. The clinical pregnancy rate was 40.7% (22/54) and the live-birth rate 25.9% (14/54). The implantation rate was 29.3% (27/92). Essure(®) placement is an effective method for occlusion of hydrosalpinges before IVF. Monitoring the live-birth rate confirms that this option is the strongest in cases where laparoscopy is impossible or contraindicated. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. A compressed sensing based approach on Discrete Algebraic Reconstruction Technique.

    Science.gov (United States)

    Demircan-Tureyen, Ezgi; Kamasak, Mustafa E

    2015-01-01

Discrete tomography (DT) techniques are capable of computing better results than continuous tomography techniques, even from a smaller number of projections. The Discrete Algebraic Reconstruction Technique (DART) is an iterative reconstruction method proposed to achieve this goal by exploiting prior knowledge of the gray levels and assuming that the scanned object is composed of a few different densities. In this paper, the DART method is combined with an initial total variation minimization (TvMin) phase to ensure a better initial guess, and extended with a segmentation procedure in which the threshold values are estimated from a finite set of candidates to minimize both the projection error and the total variation (TV) simultaneously. The accuracy and robustness of the algorithm are compared with the original DART in simulation experiments performed under (1) a limited number of projections, (2) a limited viewing angle, and (3) noisy projections.
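To make the TV criterion concrete, here is a small illustrative Python function (assuming the standard anisotropic definition of total variation; this is not code from the paper) that computes the TV of a 2-D image as the sum of absolute differences between neighbouring pixels:

```python
def total_variation(image):
    """Anisotropic total variation of a 2-D grid (list of lists):
    sum of absolute horizontal and vertical neighbour differences."""
    tv = 0.0
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                tv += abs(image[r][c + 1] - image[r][c])  # horizontal edge
            if r + 1 < rows:
                tv += abs(image[r + 1][c] - image[r][c])  # vertical edge
    return tv
```

A piecewise-constant reconstruction with few density levels has low TV, which is why minimizing this quantity alongside the projection error favours the discrete objects DART targets.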

  15. Non-Destructive Techniques Based on Eddy Current Testing

    Science.gov (United States)

    García-Martín, Javier; Gómez-Gil, Jaime; Vázquez-Sánchez, Ernesto

    2011-01-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future. PMID:22163754

  16. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.
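As a toy illustration of the idea (hypothetical parameter values and a deliberately simple constitutive law, not the Sandia models), uncertainty in a material parameter can be propagated through an elastic/linear-hardening response by Monte Carlo sampling to obtain a distribution of stresses rather than a single mean prediction:

```python
import random
import statistics

def simulate_stress(strain, yield_stress, youngs=200e3, hardening=2e3):
    """Elastic/linear-hardening stress response (units: MPa, dimensionless strain)."""
    elastic = youngs * strain
    if elastic <= yield_stress:
        return elastic
    yield_strain = yield_stress / youngs
    return yield_stress + hardening * (strain - yield_strain)

def response_distribution(strain, mean_yield=350.0, std_yield=20.0, n=5000, seed=1):
    """Propagate a normally distributed yield stress through the model,
    returning the mean and standard deviation of the stress at `strain`."""
    rng = random.Random(seed)
    stresses = [simulate_stress(strain, rng.gauss(mean_yield, std_yield))
                for _ in range(n)]
    return statistics.mean(stresses), statistics.stdev(stresses)
```

Calling `response_distribution(0.01)` yields a spread of post-yield stresses driven entirely by the assumed yield-stress variability, mirroring (in miniature) how UQ techniques turn material scatter into a distribution of model predictions.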

  17. A survey on bio inspired meta heuristic based clustering protocols for wireless sensor networks

    Science.gov (United States)

    Datta, A.; Nandakumar, S.

    2017-11-01

Recent studies have shown that utilizing a mobile sink to harvest and carry data from a Wireless Sensor Network (WSN) can improve network operational efficiency as well as maintain uniform energy consumption by the sensor nodes in the network. Due to sink mobility, the path between two sensor nodes continuously changes, and this has a profound effect on the operational longevity of the network; a need therefore arises for a protocol that utilizes minimal resources in maintaining routes between the mobile sink and the sensor nodes. Swarm intelligence techniques inspired by the foraging behavior of ants, termites and honey bees can be artificially simulated and utilized to solve real wireless network problems. The authors present a brief survey of various bio-inspired, swarm intelligence based protocols used for routing data in wireless sensor networks, while outlining their general principles and operation.
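The ant-inspired protocols such surveys cover typically share one core mechanism: pheromone evaporation plus reinforcement of edges on successful routes. A generic sketch of that update step (illustrative only; the rates and the inverse-length deposit rule are assumptions, not taken from any specific protocol):

```python
def update_pheromones(pheromone, paths, evaporation=0.5, deposit=1.0):
    """One ant-colony pheromone update: evaporate every trail, then
    deposit on the edges of each completed path in inverse proportion
    to its hop count, so shorter sink-to-node routes are reinforced."""
    # Evaporation: all trails decay so stale routes fade away.
    for edge in pheromone:
        pheromone[edge] *= (1.0 - evaporation)
    # Reinforcement: each path deposits deposit/length on its edges.
    for path in paths:
        length = len(path) - 1
        for a, b in zip(path, path[1:]):
            key = tuple(sorted((a, b)))  # undirected edge
            pheromone[key] = pheromone.get(key, 0.0) + deposit / length
    return pheromone
```

Repeated over many "ant" traversals, this biases route selection toward short, frequently successful paths while letting routes broken by sink movement decay, which is the adaptivity these protocols exploit.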

  18. A critical survey of agent-based wholesale electricity market models

    International Nuclear Information System (INIS)

    Weidlich, Anke; Veit, Daniel

    2008-01-01

    The complexity of electricity markets calls for rich and flexible modeling techniques that help to understand market dynamics and to derive advice for the design of appropriate regulatory frameworks. Agent-Based Computational Economics (ACE) is a fairly young research paradigm that offers methods for realistic electricity market modeling. A growing number of researchers have developed agent-based models for simulating electricity markets. The diversity of approaches makes it difficult to overview the field of ACE electricity research; this literature survey should guide the way through and describe the state-of-the-art of this research area. In a conclusive summary, shortcomings of existing approaches and open issues that should be addressed by ACE electricity researchers are critically discussed. (author)

  19. Skull base tumours part I: Imaging technique, anatomy and anterior skull base tumours

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Alexandra [Instituto Portugues de Oncologia Francisco Gentil, Centro de Lisboa, Servico de Radiologia, Rua Professor Lima Basto, 1093 Lisboa Codex (Portugal)], E-mail: borgesalexandra@clix.pt

    2008-06-15

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have contributed largely to improving the prognosis and lessening the morbidity and mortality of patients with skull base tumours, and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging has become indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology, and the radiologist is increasingly responsible for the fate of these patients. This review will focus on advances in imaging technique, their contribution to patient management, and the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division that takes into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient management will be discussed.

  20. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary-layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (which is a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. Like all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, thus upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary-layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
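
    One plausible reading of how thermal imagery and a sounded wind profile can combine (an illustrative assumption, not necessarily the authors' published algorithm): a cloud advected at the wind speed of its own level appears, from a zenith-viewing camera, to move at an angular rate inversely proportional to its height, so h ≈ v / ω:

```python
def cloud_base_height(wind_speed_ms, angular_speed_rad_s):
    """Small-angle, zenith-view sketch: the height at which a cloud
    advected at wind_speed would show the observed angular motion."""
    return wind_speed_ms / angular_speed_rad_s
```

    For example, a cloud feature tracked at 0.001 rad/s under a 10 m/s sounded wind would be placed at roughly 10 km.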

  1. MySQL based selection of appropriate indexing technique in ...

    African Journals Online (AJOL)

    This paper deals with the selection of an appropriate indexing technique applied to a MySQL database for a health care system, and with related performance issues, using a multiclass support vector machine (SVM). The patient database is generally huge and contains a lot of variation. For the quick search or fast retrieval of the desired ...

  2. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...

  3. an architecture-based technique to mobile contact recommendation

    African Journals Online (AJOL)

    user

    Aside being able to store the name of contacts and their phone numbers, there are ... the artificial neural network technique [21], along with ... Recommendation is part of everyday life. This concept ... However, to use RSs some level of intelligence must be ...... [3] Min J.-K. & Cho S.-B.Mobile Human Network Management.

  4. MRA Based Efficient Database Storing and Fast Querying Technique

    Directory of Open Access Journals (Sweden)

    Mitko Kostov

    2017-02-01

    Full Text Available In this paper we consider a specific way of organizing 1D signal or 2D image databases, such that more efficient storage and faster querying are achieved. A multiresolution technique of data processing is used in order to save the most significant processed data.
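
    As an illustration of the multiresolution idea (the abstract does not specify its transform; a one-level Haar decomposition is assumed here), a signal can be stored as a coarse approximation plus only the largest-magnitude detail coefficients:

```python
def haar_step(signal):
    """One Haar level: pairwise averages (approximation) and pairwise
    half-differences (detail), each half the input length."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def compress(signal, keep):
    """Store the coarse approximation plus only the 'keep' largest-magnitude
    detail coefficients -- i.e. the most significant processed data."""
    approx, detail = haar_step(signal)
    ranked = sorted(range(len(detail)), key=lambda i: abs(detail[i]), reverse=True)
    kept = {i: detail[i] for i in ranked[:keep]}
    return approx, kept

def reconstruct(approx, kept, n):
    """Invert the Haar step, treating discarded details as zero."""
    out = []
    for i, a in enumerate(approx):
        d = kept.get(i, 0.0)
        out.extend([a + d, a - d])
    return out[:n]
```

    Queries can then run against the short approximation first, touching detail coefficients only when needed.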

  5. Calcium intake by adolescents: a population-based health survey.

    Science.gov (United States)

    de Assumpção, Daniela; Dias, Marcia Regina Messaggi Gomes; de Azevedo Barros, Marilisa Berti; Fisberg, Regina Mara; de Azevedo Barros Filho, Antonio

    2016-01-01

    To analyze calcium intake in adolescents according to sociodemographic variables, health-related behaviors, morbidities, and body mass index. This was a cross-sectional population-based study, with a two-stage cluster sampling that used data from a survey conducted in Campinas, São Paulo, Brazil, between 2008 and 2009. Food intake was assessed using a 24-hour dietary recall. The study included 913 adolescents aged 10-19 years. Average nutrient intake was significantly lower in the segment with lower education of the head of the family and lower per capita family income, in individuals from other cities or states, those who consumed fruit less than four times a week, those who did not drink milk daily, those who were smokers, and those who reported the occurrence of headaches and dizziness. Higher mean calcium intake was found in individuals who slept less than seven hours a day. The prevalence of calcium intake below the recommendation was 88.6% (95% CI: 85.4-91.2). The results point to an insufficient calcium intake and suggest that certain subgroups of adolescents need specific strategies to increase the intake of this nutrient. Copyright © 2015 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  6. Calcium intake by adolescents: a population-based health survey

    Directory of Open Access Journals (Sweden)

    Daniela de Assumpção

    2016-06-01

    Full Text Available Abstract Objective: To analyze calcium intake in adolescents according to sociodemographic variables, health-related behaviors, morbidities, and body mass index. Methods: This was a cross-sectional population-based study, with a two-stage cluster sampling that used data from a survey conducted in Campinas, São Paulo, Brazil, between 2008 and 2009. Food intake was assessed using a 24-hour dietary recall. The study included 913 adolescents aged 10-19 years. Results: Average nutrient intake was significantly lower in the segment with lower education of the head of the family and lower per capita family income, in individuals from other cities or states, those who consumed fruit less than four times a week, those who did not drink milk daily, those who were smokers, and those who reported the occurrence of headaches and dizziness. Higher mean calcium intake was found in individuals who slept less than seven hours a day. The prevalence of calcium intake below the recommendation was 88.6% (95% CI: 85.4-91.2). Conclusion: The results point to an insufficient calcium intake and suggest that certain subgroups of adolescents need specific strategies to increase the intake of this nutrient.

  7. Digital Survey Meter based on PIC16F628 Microcontroller

    International Nuclear Information System (INIS)

    Al-Mohamad, A.; Shliwitt, J.

    2010-01-01

    A Digital Survey Meter based on the PIC16F628 Microcontroller was designed using a simple Geiger-Muller counter tube (ZP1320, made by Centronic in the UK) as the detector. The sensitivity of this tube is about 9 counts/s at 10 μGy/h. It is sensitive to gamma rays and to beta particles above 0.25 MeV, and has a sensitive length of 28 mm. Count rate versus dose rate is quite linear up to about 10^4 counts/s. Indication is given by a speaker which emits one click for each count. In addition to the acoustic alarm, the meter works in one of three measurement modes, selected by a 3-state switch: 1- Measurement of dose rate (in μGy/h) and counting rate (in CPS), for high counting rates. 2- Measurement of dose rate (in μGy/h) and counting rate (in CPM), for low counting rates. 3- Accumulated counting, with continuous display of the number of counts and the counting time, updated every 2 s. The results are displayed on an alphanumeric LCD display, and the circuit will give many hours of operation from a single 9V PP3 battery. The design of the circuit combines accuracy, simplicity and low power consumption. We built two models of this design: the first with only an internal detector, and the second equipped with an external detector. (author)
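
    Using the linearity and the sensitivity figure quoted above (about 9 counts/s at 10 μGy/h), the count-rate-to-dose-rate conversion such a meter performs can be sketched as follows (a hypothetical helper, not the authors' firmware; dead-time correction is ignored):

```python
def dose_rate_uGy_per_h(counts_per_s, sensitivity_cps=9.0, ref_dose_uGy_h=10.0):
    """Linear CPS-to-dose-rate conversion from the quoted ZP1320 figure
    (~9 counts/s at 10 uGy/h), valid in the tube's linear range."""
    return counts_per_s * ref_dose_uGy_h / sensitivity_cps
```
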

  8. The first survey of airborne trace elements at airport using moss bag technique.

    Science.gov (United States)

    Vuković, Gordana; Urošević, Mira Aničić; Škrivanj, Sandra; Vergel, Konstantin; Tomašević, Milica; Popović, Aleksandar

    2017-06-01

    Air traffic represents an important mode of mobility worldwide, and many ongoing discussions concern the impact that air transportation has on local air quality. In this study, the moss Sphagnum girgensohnii was used for the first time to assess trace element content at an international airport. The moss bags were exposed during the summer of 2013 at four sampling sites at the 'Nikola Tesla' airport (Belgrade, Serbia): two on the runway, one on the auxiliary runway and one at the parking lot. According to the relative accumulation factor (RAF) and the limit of quantification of the moss bag technique (LOQt), the most abundant elements in the samples were Zn, Na, Cr, V, Cu and Fe. A comparison between the element concentrations at the airport and the corresponding values in different land-use classes (urban central, suburban, industrial and green zones) across the city of Belgrade did not indicate that air traffic and associated activities contribute significantly to trace element air pollution. This study demonstrates that easy, operationally robust (bio)monitoring using moss bags is a suitable method for assessing air quality within various microenvironments where positioning reference instrumental devices is restricted.
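
    The RAF used above is conventionally computed as the element concentration gained during exposure relative to the initial (unexposed) moss concentration; a minimal sketch assuming that standard definition:

```python
def relative_accumulation_factor(c_exposed, c_initial):
    """RAF = (C_exposed - C_initial) / C_initial: how many times the
    initial element content the moss accumulated during exposure."""
    return (c_exposed - c_initial) / c_initial
```

    An element is typically reported as "accumulated" only when its RAF is positive and its concentration change exceeds the technique's limit of quantification.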

  9. Characterizing marijuana concentrate users: A web-based survey.

    Science.gov (United States)

    Daniulaityte, Raminta; Lamy, Francois R; Barratt, Monica; Nahhas, Ramzi W; Martins, Silvia S; Boyer, Edward W; Sheth, Amit; Carlson, Robert G

    2017-09-01

    The study seeks to characterize marijuana concentrate users, describe reasons and patterns of use and perceived risk, and identify predictors of daily/near daily use. An anonymous web-based survey was conducted (April-June 2016) with 673 US-based cannabis users recruited via the Bluelight.org web-forum; it included questions about marijuana concentrate use, other drugs, and socio-demographics. Multivariable logistic regression analyses were conducted to identify characteristics associated with greater odds of lifetime and daily use of marijuana concentrates. About 66% of respondents reported marijuana concentrate use. The sample was 76% male and 87% white. Marijuana concentrate use was viewed as riskier than flower cannabis. Greater odds of marijuana concentrate use were associated with living in states with "recreational" (AOR=4.91; p=0.001) or "medical, less restrictive" marijuana policies (AOR=1.87; p=0.014), being male (AOR=2.21, p=0.002), and being younger (AOR=0.95, p<0.05). Some marijuana concentrate users reported daily/near daily use. Greater odds of daily concentrate use were associated with being male (AOR=9.29, p=0.033), using concentrates for therapeutic purposes (AOR=7.61, p=0.001), using vape pens for marijuana concentrate administration (AOR=4.58, p=0.007), and lower perceived risk of marijuana concentrate use (AOR=0.92, p=0.017). Marijuana concentrate use was more common among male, younger and more experienced users, and those living in states with more liberal marijuana policies. Characteristics of daily users, in particular patterns of therapeutic use and utilization of different vaporization devices, warrant further research with community-recruited samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Evaluation of nuclear reactor based activation analysis techniques

    International Nuclear Information System (INIS)

    Obrusnik, I.; Kucera, J.

    1977-09-01

    A survey is presented of the basic types of activation analysis applied in environmental control. Reactor neutron activation analysis is described (including the reactor as a neutron source, sample activation in the reactor, methodology of neutron activation analysis, sample transport into the reactor and sample packaging after irradiation, instrumental activation analysis with radiochemical separation, data measurement and evaluation, sampling and sample preparation). Sources of environmental contamination with trace elements, sampling and sample analysis by neutron activation are described. The analysis is described of soils, waters and biological materials. Methods are shown of evaluating neutron activation analysis results and of their interpretation for purposes of environmental control. (J.B.)

  11. Survey regarding the clinical practice of cardiac CT in Germany. Indications, scanning technique and reporting

    Energy Technology Data Exchange (ETDEWEB)

    Maurer, Marc H.; Hamm, B.; Dewey, M. [Inst. fuer Radiologie, Charite - Universitaetsmedizin Berlin (Germany)

    2009-12-15

    Purpose: to obtain an overview of the current clinical practice of cardiac computed tomography (CT) in Germany. Materials and methods: a 30-item questionnaire was mailed to 149 providers of cardiac CT in Germany. The items asked about indications, scanning technique and reporting, data storage, and cost of the examination. Results: overall, 45 questionnaires could be analyzed (30%). The majority of centers (76%, 34 of 45 centers) used CT scanners of the latest generation (at least 64 rows). The most common appropriate indications were exclusion of coronary artery disease (91%, 41/45), coronary anomalies (80%, 36/45), and follow-up after coronary artery bypass grafting (53%, 24/45). Each center examined on average 243 ± 310 patients in 2007, and the number of centers performing cardiac CT increased significantly in 2007 (p = 0.035) compared with the preceding year. Most used sublingual nitroglycerin (84%, 38/45; median of 2 sprays = 0.8 mg) and/or a beta blocker (86%, 39/44; median of 5 mg IV, median heart rate threshold: 70 beats/min). Many providers used ECG-triggered tube current modulation (65%, 29/44) and/or adjusted the tube current to the body mass index or body weight (63%, 28/44). A median slice thickness of 0.75 mm with a 0.5 mm increment and a 20 cm field-of-view was most commonly used. Source images in orthogonal planes (96%, 43/45), curved MPRs (93%, 42/45), and thin-slice MIPs (69%, 31/45) were used most frequently for interpretation. Extracardiac structures were also evaluated by 84% of the centers (38/45). The mean examination time was 16.2 min and reporting took an average of 28.8 min. (orig.)

  12. A New Three Dimensional Based Key Generation Technique in AVK

    Science.gov (United States)

    Banerjee, Subhasish; Dutta, Manash Pratim; Bhunia, Chandan Tilak

    2017-08-01

    In the modern era, ensuring a high order of security has become a central objective of computer networks. Over the last few decades, many researchers have contributed to achieving secrecy over the communication channel. Shannon did the pioneering work on perfect secrecy, showing that the secrecy of shared information can be maintained if the key is variable rather than static. In this regard, a key generation technique has been proposed in which the key changes whenever a new block of data is exchanged. In our scheme, the keys vary not only in bit sequence but also in size. An experimental study is also included in this article to demonstrate the correctness and effectiveness of the proposed technique.
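
    A minimal sketch of the time-variant key idea, using the classic Automatic Variable Key rule key_{i+1} = key_i XOR plaintext_i as a stand-in (the size-varying refinement described above is not shown):

```python
def xor_bytes(a, b):
    """Byte-wise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def avk_encrypt(blocks, key0):
    """Encrypt equal-length blocks with a key that changes per block:
    key_{i+1} = key_i XOR plaintext_i (one classic AVK rule)."""
    key, out = key0, []
    for block in blocks:
        out.append(xor_bytes(block, key))
        key = xor_bytes(key, block)
    return out

def avk_decrypt(blocks, key0):
    """Invert avk_encrypt by evolving the key from recovered plaintext."""
    key, out = key0, []
    for block in blocks:
        plain = xor_bytes(block, key)
        out.append(plain)
        key = xor_bytes(key, plain)
    return out
```

    Because the key evolves with the data, identical plaintext blocks produce different ciphertext blocks, which is the property a static key lacks.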

  13. A Review On Segmentation Based Image Compression Techniques

    Directory of Open Access Journals (Sweden)

    S.Thayammal

    2013-11-01

    Full Text Available Abstract - The storage and transmission of imagery become more challenging tasks in the current scenario of multimedia applications. Hence, an efficient compression scheme is highly essential for imagery, as it reduces the requirements on storage media and transmission bandwidth. Besides offering good performance, compression techniques must also converge quickly in order to be applied to real-time applications. Various algorithms have been proposed for image compression, but each has its own pros and cons. Here, an extensive analysis of existing methods is performed, and their use is highlighted as a basis for developing novel techniques that address the challenging tasks of image storage and transmission in multimedia applications.

  14. Brain tumor segmentation based on a hybrid clustering technique

    Directory of Open Access Journals (Sweden)

    Eman Abdel-Maksoud

    2015-03-01

    This paper presents an efficient image segmentation approach using the K-means clustering technique integrated with the Fuzzy C-means algorithm. It is followed by thresholding and level-set segmentation stages to provide accurate brain tumor detection. The proposed technique benefits from the minimal computation time of K-means clustering for image segmentation, and from the accuracy of Fuzzy C-means. The performance of the proposed image segmentation approach was evaluated by comparing it with some state-of-the-art segmentation algorithms in terms of accuracy, processing time, and performance. The accuracy was evaluated by comparing the results with the ground truth of each processed image. The experimental results demonstrate the effectiveness of the proposed approach in dealing with a larger number of segmentation problems by improving segmentation quality and accuracy in minimal execution time.
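
    The K-means stage of such a hybrid can be sketched on scalar pixel intensities as follows (a generic K-means, not the authors' exact implementation; the Fuzzy C-means refinement would then soften these hard assignments into membership degrees):

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Plain K-means on scalar intensities: assign each value to its
    nearest center, then move each center to its cluster mean."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            j = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[j].append(v)
        # empty clusters keep their previous center
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)
```
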

  15. LFC based adaptive PID controller using ANN and ANFIS techniques

    Directory of Open Access Journals (Sweden)

    Mohamed I. Mosaad

    2014-12-01

    Full Text Available This paper presents an adaptive PID Load Frequency Control (LFC) for power systems using Adaptive Neuro-Fuzzy Inference Systems (ANFIS) and Artificial Neural Networks (ANN) guided by a Genetic Algorithm (GA). The PID controller parameters are tuned off-line using the GA to minimize the integral of squared error over a wide range of load variations. The values of the PID controller parameters obtained from the GA are used to train both the ANFIS and the ANN. The two proposed techniques can therefore tune the PID controller parameters online for optimal response at any other load point within the operating range. Testing of the developed techniques shows that the adaptive PID-LFC preserves optimal performance over the whole loading range. The results signify the superiority of ANFIS over ANN in terms of performance measures.
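
    The online tuning step can be caricatured with a lookup-plus-interpolation stand-in for the trained ANN/ANFIS: given (Kp, Ki, Kd) triples GA-tuned at discrete load points, interpolate the gains for the current load (table values below are purely illustrative):

```python
def interpolate_gains(load, table):
    """Stand-in for the trained ANN/ANFIS: linearly interpolate the
    GA-tuned (Kp, Ki, Kd) between the nearest tabulated load points,
    clamping outside the tabulated range."""
    pts = sorted(table)
    if load <= pts[0]:
        return table[pts[0]]
    for lo, hi in zip(pts, pts[1:]):
        if load <= hi:
            t = (load - lo) / (hi - lo)
            return tuple(a + t * (b - a) for a, b in zip(table[lo], table[hi]))
    return table[pts[-1]]
```

    A trained network generalizes this idea: it learns a smooth nonlinear map from load point to gains instead of piecewise-linear interpolation.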

  16. New technique for producing the alloys based on transition metals

    International Nuclear Information System (INIS)

    Dolukhanyan, S.K.; Aleksanyan, A.G.; Shekhtman, V.Sh.; Mantashyan, A.A.; Mayilyan, D.G.; Ter-Galstyan, O.P.

    2007-01-01

    A fundamentally new technique was elaborated for obtaining alloys of refractory metals by compacting their hydrides and subsequent dehydrogenation. The elaborated technique is described. The conditions of alloy formation from different hydrides of the appropriate metals were investigated in detail. The influence of process parameters such as chemical peculiarities, composition of the source hydrides, and phase transformations during dehydrogenation on alloy formation was established. Binary and ternary alloys of α and ω phases were obtained: Ti0.8Zr0.8; Ti0.66Zr0.33; Ti0.3Zr0.8; Ti0.2Zr0.8; Ti0.8Hf0.2; Ti0.6Hf0.4; Ti0.66Zr0.23Hf0.11; etc. Using the elaborated special hydride cycle, a previously unknown effective process for the formation of transition metal alloys was realized. The dependence of the final alloy structure on the composition of the initial mixture and the hydrogen content of the source hydrides was established

  17. Assessment of soil compaction properties based on surface wave techniques

    Science.gov (United States)

    Jihan Syamimi Jafri, Nur; Rahim, Mohd Asri Ab; Zahid, Mohd Zulham Affandi Mohd; Faizah Bawadi, Nor; Munsif Ahmad, Muhammad; Faizal Mansor, Ahmad; Omar, Wan Mohd Sabki Wan

    2018-03-01

    Soil compaction plays an important role in every construction activity, reducing the risk of damage. Traditional methods of assessing compaction, such as field tests and invasive penetration tests of compacted areas, have serious limitations and are time-consuming when evaluating large areas. Thus, this study proposed the use of a non-invasive surface wave method, Multi-channel Analysis of Surface Waves (MASW), as a tool for assessing soil compaction. The aim of this study was to determine the shear wave velocity profiles and field density of compacted soils under varying compaction efforts by using the MASW method. Pre- and post-compaction MASW surveys were conducted at Pauh Campus, UniMAP, after applying rolling compaction with a varying number of passes (2, 6 and 10). The seismic data were recorded by a GEODE seismograph. A sand replacement test was conducted for each survey line to obtain the field density data. All seismic data were processed using the SeisImager/SW software. The results show that the shear wave velocity profiles increase with the number of passes from 0 to 6 passes, but decrease after 10 passes. This method could attract the interest of the geotechnical community, as it can be an alternative to the standard tests for assessing soil compaction in field operations.

  18. A framework for laboratory pre-work based on the concepts, tools and techniques questioning method

    International Nuclear Information System (INIS)

    Huntula, J; Sharma, M D; Johnston, I; Chitaree, R

    2011-01-01

    Learning in the laboratory is different from learning in other contexts because students have to engage with various aspects of the practice of science. They have to use many skills and knowledge in parallel-not only to understand the concepts of physics but also to use the tools and analyse the data. The question arises, how to best guide students' learning in the laboratory. This study is about creating and using questions with a specifically designed framework to aid learning in the laboratory. The concepts, tools and techniques questioning (CTTQ) method was initially designed and used at Mahidol University, Thailand, and was subsequently extended to laboratory pre-work at the University of Sydney. The CTTQ method was implemented in Sydney with 190 first-year students. Three pre-work exercises on a series of electrical experiments were created based on the CTTQ method. The pre-works were completed individually and submitted before the experiment started. Analysed pre-work, surveys and interviews were used to evaluate the pre-work questions in this study. The results indicated that the CTTQ method was successful and the flow in the experiments was better than that in the previous year. At the same time students had difficulty with the last experiment in the sequence and with techniques.

  19. EVE: Explainable Vector Based Embedding Technique Using Wikipedia

    OpenAIRE

    Qureshi, M. Atif; Greene, Derek

    2017-01-01

    We present an unsupervised explainable word embedding technique, called EVE, which is built upon the structure of Wikipedia. The proposed model defines the dimensions of a semantic vector representing a word using human-readable labels, thereby making it readily interpretable. Specifically, each vector is constructed using the Wikipedia category graph structure together with the Wikipedia article link structure. To test the effectiveness of the proposed word embedding model, we consider its usefulne...

  20. Voltage Stabilizer Based on SPWM technique Using Microcontroller

    OpenAIRE

    K. N. Tarchanidis; J. N. Lygouras; P. Botsaris

    2013-01-01

    This paper presents an application of the well-known SPWM technique to a voltage stabilizer, using a microcontroller. The stabilizer is of the AC/DC/AC type: the system rectifies the input AC voltage to a suitable DC level, and the intelligent control of an embedded microcontroller regulates the pulse width of the output voltage in order to produce, through a filter, a near-perfect sinusoidal AC voltage. The control program on the microcontroller has the ability to change the FET transistor ...
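
    The SPWM principle the stabilizer relies on can be sketched as follows: the switch conducts whenever a sinusoidal reference exceeds a triangular carrier, so low-pass filtering the pulse train recovers a sinusoid (the frequencies and amplitude below are illustrative, not from the paper):

```python
import math

def spwm_samples(f_ref, f_carrier, f_sample, duration, amplitude=0.9):
    """Sinusoidal PWM: output 1 while the sine reference exceeds a
    triangular carrier in [-1, 1], else 0."""
    out = []
    for i in range(int(duration * f_sample)):
        t = i / f_sample
        ref = amplitude * math.sin(2 * math.pi * f_ref * t)
        phase = (t * f_carrier) % 1.0
        tri = 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase
        out.append(1 if ref > tri else 0)
    return out
```

    Over a full reference cycle the average duty is 1/2, with the instantaneous duty tracking the sine, which is what the output filter turns back into a sinusoid.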

  1. The use of advanced web-based survey design in Delphi research.

    Science.gov (United States)

    Helms, Christopher; Gardner, Anne; McInnes, Elizabeth

    2017-12-01

    A discussion of the application of metadata, paradata and embedded data in web-based survey research, using two completed Delphi surveys as examples. Metadata, paradata and embedded data use in web-based Delphi surveys has not been described in the literature. The rapid evolution and widespread use of online survey methods imply that paper-based Delphi methods will likely become obsolete. Commercially available web-based survey tools offer a convenient and affordable means of conducting Delphi research. Researchers and ethics committees may be unaware of the benefits and risks of using metadata in web-based surveys. Discussion paper. Two web-based, three-round Delphi surveys were conducted sequentially between August 2014 - January 2015 and April - May 2016. Their aims were to validate the Australian nurse practitioner metaspecialties and their respective clinical practice standards. Our discussion paper is supported by researcher experience and data obtained from conducting both web-based Delphi surveys. Researchers and ethics committees should consider the benefits and risks of metadata use in web-based survey methods. Web-based Delphi research using paradata and embedded data may introduce efficiencies that improve individual participant survey experiences and reduce attrition across iterations. Use of embedded data allows the efficient conduct of multiple simultaneous Delphi surveys across a shorter timeframe than traditional survey methods. The use of metadata, paradata and embedded data appears to improve response rates, identify bias and give possible explanation for apparent outlier responses, providing an efficient method of conducting web-based Delphi surveys. © 2017 John Wiley & Sons Ltd.

  2. Effects of Personalization and Invitation Email Length on Web-Based Survey Response Rates

    Science.gov (United States)

    Trespalacios, Jesús H.; Perkins, Ross A.

    2016-01-01

    Individual strategies to increase response rate and survey completion have been extensively researched. Recently, efforts have been made to investigate a combination of interventions to yield better response rates for web-based surveys. This study examined the effects of four different survey invitation conditions on response rate. From a large…

  3. New measurement techniques of environmental radioactivity. Methods of surveying marine radioactivity

    International Nuclear Information System (INIS)

    Kobayashi, Yoshii

    1994-01-01

    Measurements of radioactivity have been carried out on material in solution or suspension in seawater, on bottom sediments and on specific marine organisms. The general approach to radionuclide measurement in seawater and bottom sediments has been concentration by coprecipitation, adsorption, ion exchange or solvent extraction. The methods employed are based primarily on shipboard collection of samples followed by land-based laboratory analyses, and are too time-consuming. For rapid measurement, in situ measurement of gamma rays in seawater or on the seabed has been developed. A gamma-ray detecting probe containing a NaI(Tl) scintillation or germanium detector is enclosed in a sealed cylinder. The measurements are made by suspending the probe in a 200-300 liter tank and passing seawater through the tank by means of a ship-deck pumping system, by towing the probe across the seafloor, by lowering the probe to the seabed, or by loading the probe on a remotely operated undersea vehicle. In situ measurement of gamma rays in the marine environment has applications in mineral exploration and in monitoring sea areas which may become contaminated as the result of accidents or contamination incidents. This article reviews several gamma-ray detecting probes and describes recent studies at JAERI on the development of a small electrically cooled Ge gamma-ray detector and a marine environmental radioactivity investigation system for in situ gamma-ray measurement. (J.P.N.)

  4. Community Based Survey on Psychiatric Morbidity in Eastern Nepal

    Directory of Open Access Journals (Sweden)

    Pramod Mohan Shyangwa

    2014-12-01

    Conclusions: The community prevalence rate of some common psychiatric disorders is high, which calls for special attention to depressive and alcohol-related disorders from all quarters of society, particularly from the government. Keywords: community survey; mental illness; psychiatric morbidity.

  5. school-based survey of adolescents' opinion on premarital sex

    African Journals Online (AJOL)

    PROF. BARTH EKWEME

    Method: A cross sectional descriptive survey design was used. ... a taboo between parents and children. The adolescents learned through the mass media and peers unguided. ... adolescents, males reported more permissive attitudes towards ...

  6. Analytical techniques for in-line/on-line monitoring of uranium and plutonium in process solutions : a brief literature survey

    International Nuclear Information System (INIS)

    Marathe, S.G.; Sood, D.D.

    1991-01-01

    In-line/on-line monitoring of various parameters such as uranium-plutonium-fission product concentration, acidity, density etc. plays an important role in quickly assessing the efficiency of processes in a reprocessing plant. Efforts in the study and installation of such analytical instruments have been going on for more than three decades, with the adaptation of newer methods and technologies. A review of the development of in-line analytical instrumentation was carried out in this laboratory about two decades ago. This report presents a very short literature survey of the work in the last two decades. The report includes an outline of the principles of the main techniques employed in in-line/on-line monitoring. (author). 77 refs., 6 tabs

  7. Research and development of LANDSAT-based crop inventory techniques

    Science.gov (United States)

    Horvath, R.; Cicone, R. C.; Malila, W. A. (Principal Investigator)

    1982-01-01

    A wide spectrum of technology pertaining to the inventory of crops using LANDSAT without in situ training data is addressed. Methods considered include Bayesian-based through-the-season methods, estimation technology based on analytical profile-fitting methods, and expert-based computer-aided methods. Although the research was conducted using U.S. data, the adaptation of the technology to the Southern Hemisphere, especially Argentina, was considered.

  8. Geochemical drainage surveys for uranium: sampling and analytical methods based on trial surveys in Pennsylvania

    International Nuclear Information System (INIS)

    Rose, A.W.; Keith, M.L.; Suhr, N.H.

    1976-01-01

Geochemical surveys near sandstone-type uranium prospects in northeastern and north-central Pennsylvania show that the deposits can be detected by carefully planned stream sediment surveys, but not by stream water surveys. Stream waters at single sites changed in U content by factors of 10 to 50 during the 18 months of our studies and, even near known prospects, contain less than 0.2 ppb U most of the time. Uranium extractable from stream sediment by acetic acid-H2O2 provides useful contrast between mineralized and nonmineralized drainages of a square mile or less; total U in sediment does not. High organic material results in increased U content of sediments and must be corrected for. Changes in U content of sediment with time reach a maximum of a factor of 3 and appear to be of short duration. A sediment survey of about 200 mi² near Jim Thorpe detects anomalies extending over several square miles near known occurrences, and a second anomaly about two miles northeast of Penn Haven Jct. A similar survey in Lycoming-Sullivan Counties shows anomalous zones near known prospects of the Beaver Lake area and northwest of Muncy Creek. As, Mn, Pb, and V are enriched in the mineralized zones, and perhaps in surrounding halo zones, but do not appear to be pathfinder elements useful for reconnaissance exploration

  9. Hiding Techniques for Dynamic Encryption Text based on Corner Point

    Science.gov (United States)

    Abdullatif, Firas A.; Abdullatif, Alaa A.; al-Saffar, Amna

    2018-05-01

A hiding technique for dynamic encryption of text using an encoding table and a symmetric encryption method (the AES algorithm) is presented in this paper. The encoding table is generated dynamically from the MSB of the cover image points and is used as the first phase of encryption. The Harris corner point algorithm is applied to the cover image to generate the corner points, which are used to generate a dynamic AES key for the second phase of text encryption. The embedding process uses the LSB of the image pixels, excluding the Harris corner points, for greater robustness. Experimental results have demonstrated that the proposed scheme has good embedding quality, error-free text recovery, and a high PSNR value.
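The embedding step described in the abstract, writing message bits into pixel LSBs while skipping the detected corner points, can be sketched in a few lines. This is an illustrative sketch only: the `skip` set standing in for the Harris corner-point indices is a hypothetical input, and the pixel list is a flattened stand-in for an image, not the paper's actual implementation.

```python
def embed_lsb(pixels, bits, skip):
    """Embed a bit stream into the LSBs of pixels, skipping the given indices."""
    out = list(pixels)
    it = iter(bits)
    for i in range(len(out)):
        if i in skip:
            continue  # corner-point pixels are left untouched for robustness
        b = next(it, None)
        if b is None:
            break  # message fully embedded
        out[i] = (out[i] & ~1) | b
    return out

def extract_lsb(pixels, n_bits, skip):
    """Recover the first n_bits from the LSBs of the non-skipped pixels."""
    return [p & 1 for i, p in enumerate(pixels) if i not in skip][:n_bits]
```

Round-tripping a short bit stream through `embed_lsb` and `extract_lsb` recovers the message exactly, which is the error-free recovery property the abstract claims.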

  10. Skin fluorescence model based on the Monte Carlo technique

    Science.gov (United States)

    Churmakov, Dmitry Y.; Meglinski, Igor V.; Piletsky, Sergey A.; Greenhalgh, Douglas A.

    2003-10-01

A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores following the packing of the collagen fibers, whereas in the epidermis and stratum corneum the distribution of fluorophores is assumed to be homogeneous. The results of the simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the NIR spectral region, while fluorescence of a sensor layer embedded in the epidermis is localized at the adjusted depth. The model is also able to simulate skin fluorescence spectra.

  11. A Computer Based Moire Technique To Measure Very Small Displacements

    Science.gov (United States)

    Sciammarella, Cesar A.; Amadshahi, Mansour A.; Subbaraman, B.

    1987-02-01

The accuracy that can be achieved in the measurement of very small displacements with techniques such as moire, holography and speckle is limited by the noise inherent to the optical devices utilized. To reduce the noise-to-signal ratio, the moire method can be utilized. Two systems of carrier fringes are introduced: an initial system before the load is applied and a final system when the load is applied. The moire pattern of these two systems contains the sought displacement information, and the noise common to the two patterns is eliminated. The whole process is performed by a computer on digitized versions of the patterns. Examples of application are given.

  12. Indirect Fluorescent Antibody Technique based Prevalence of Surra in Equines

    Directory of Open Access Journals (Sweden)

    Ahsan Nadeem, Asim Aslam*, Zafar Iqbal Chaudhary, Kamran Ashraf1, Khalid Saeed1, Nisar Ahmad1, Ishtiaq Ahmed and Habib ur Rehman2

    2011-04-01

Full Text Available This project was carried out to find the prevalence of trypanosomiasis in equines in District Gujranwala by using the indirect fluorescent antibody technique and the thin smear method. Blood samples were collected from a total of 200 horses and donkeys of different ages and either sex. Duplicate thin blood smears were prepared from each sample, and the remaining blood samples were centrifuged to separate the serum. Smears from each animal were processed for Giemsa staining and the indirect fluorescent antibody test (IFAT). Giemsa-stained smears revealed Trypanosoma infection in 4/200 (2.0%) samples and IFAT in 12/200 (6.0%) animals.

  13. Multimedia-Based Integration of Cross-Layer Techniques

    Science.gov (United States)

    2014-06-01

    Wireless Commun. Mag., vol. 12, no. 4, pp. 50–58, August 2005. 11. E. Setton, T. Yoo, X. Zhu, A. Goldsmith , and B. Girod, “Cross-layer design of ad-hoc...Overview,” DARPA Presentation by Preston Marshall and Todd Martin, WAND Industry Day Workshop, Feb. 27, 2007. 17. S. Chan, “Shared spectrum access for DOD...Lavery, A. Goldsmith , and D. J. Goodman, “Throughput optimization using adaptive techniques,” IEEE Commun. Lett., pp. 1–7, 2006. 32. S. Choudhury and J

  14. GPU-Based Techniques for Global Illumination Effects

    CERN Document Server

    Szirmay-Kalos, László; Sbert, Mateu

    2008-01-01

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. This book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make this book self-contained, the most important c

  15. A fast image reconstruction technique based on ART

    International Nuclear Information System (INIS)

    Zhang Shunli; Zhang Dinghua; Wang Kai; Huang Kuidong; Li Weibin

    2007-01-01

Algebraic Reconstruction Technique (ART) is an iterative method for image reconstruction. Improving its reconstruction speed has been one of the important research directions for ART. For the simplified weight-coefficient reconstruction model of ART, a fast grid traverse algorithm is proposed, which can determine the grid index by simple operations such as addition, subtraction and comparison. Since the weight coefficients are calculated in real time during iteration, a large amount of storage is saved and the reconstruction speed is greatly increased. Experimental results show that the new algorithm is very effective and the reconstruction speed is improved about 10 times compared with the traditional algorithm. (authors)
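As a minimal illustration of the ART iteration itself (not the paper's fast grid-traverse algorithm), the classical Kaczmarz row-action update for a projection system A x = b can be sketched as follows; each measurement row pulls the current image estimate onto its hyperplane.

```python
def art(A, b, n_iter=50, relax=1.0):
    """Kaczmarz/ART iteration for A x = b; A is a list of rows, b the measurements."""
    n = len(A[0])
    x = [0.0] * n  # start from a zero image
    for _ in range(n_iter):
        for row, bi in zip(A, b):
            norm = sum(a * a for a in row)
            if norm == 0.0:
                continue  # ray touches no grid cell
            # project x onto the hyperplane row . x = bi, scaled by the relaxation factor
            c = relax * (bi - sum(a * xi for a, xi in zip(row, x))) / norm
            x = [xi + c * a for xi, a in zip(x, row)]
    return x
```

In a real CT setting each row of A holds the ray-grid intersection weights, which is exactly the quantity the paper's grid-traverse algorithm computes on the fly instead of storing.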

  16. Sci-Fri PM: Radiation Therapy, Planning, Imaging, and Special Techniques - 10: Results from Canada Wide Survey on Total Body Irradiation Practice

    Energy Technology Data Exchange (ETDEWEB)

    Studinski, Ryan; Fraser, Danielle; Samant, Rajiv; MacPherson, Miller [The Ottawa Hospital Cancer Centre, The Ottawa Hospital Cancer Centre, The Ottawa Hospital Cancer Centre, The Ottawa Hospital Cancer Centre (Canada)

    2016-08-15

    Purpose: Total Body Irradiation (TBI) is delivered to a relatively small number of patients with a variety of techniques; it has been a challenge to develop consensus studies for best practice. This survey was created to assess the current state of TBI in Canada. Methods: The survey was created with questions focusing on the radiation prescription, delivery technique and resources involved. The survey was circulated electronically to the heads of every clinical medical physics department in Canada. Responses were gathered and collated, and centres that were known to deliver TBI were urged to respond. Results: Responses from 20 centres were received, including 12 from centres that perform TBI. Although a variety of TBI dose prescriptions were reported, 12 Gy in 6 fractions was used in 11 centres while 5 centres use unique prescriptions. For dose rate, a range of 9 to 51 cGy/min was reported. Most centres use an extended SSD technique, with the patient standing or lying down against a wall. The rest use either a “sweeping” technique or a more complicated multi-field technique. All centres but one indicated that they shield the lungs, and only a minority shield other organs. The survey also showed that considerable resources are used for TBI including extra staffing, extended planning and treatment times and the use of locally developed hardware or software. Conclusions: This survey highlights that both similarities and important discrepancies exist between TBI techniques across the country, and is an opportunity to prompt more collaboration between centres.

  17. Web-based surveys as an alternative to traditional mail methods.

    Science.gov (United States)

    Fleming, Christopher M; Bowden, Mark

    2009-01-01

    Environmental economists have long used surveys to gather information about people's preferences. A recent innovation in survey methodology has been the advent of web-based surveys. While the Internet appears to offer a promising alternative to conventional survey administration modes, concerns exist over potential sampling biases associated with web-based surveys and the effect these may have on valuation estimates. This paper compares results obtained from a travel cost questionnaire of visitors to Fraser Island, Australia, that was conducted using two alternate survey administration modes; conventional mail and web-based. It is found that response rates and the socio-demographic make-up of respondents to the two survey modes are not statistically different. Moreover, both modes yield similar consumer surplus estimates.

  18. Satellite-based technique for nowcasting of thunderstorms over ...

    Indian Academy of Sciences (India)

    Suman Goyal

    2017-08-31

    Aug 31, 2017 ... Due to inadequate radar network, satellite plays the dominant role for nowcast of these thunderstorms. In this study, a nowcast based algorithm ForTracc developed by Vila ... of actual development of cumulonimbus clouds, ... MCS over Indian region using Infrared Channel ... (2016) based on case study of.

  19. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    Science.gov (United States)

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guildford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques. The remaining 22 techniques were classified as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI are a function of both intervention content and the interpersonal style

  20. Microrheometric upconversion-based techniques for intracellular viscosity measurements

    Science.gov (United States)

    Rodríguez-Sevilla, Paloma; Zhang, Yuhai; de Sousa, Nuno; Marqués, Manuel I.; Sanz-Rodríguez, Francisco; Jaque, Daniel; Liu, Xiaogang; Haro-González, Patricia

    2017-08-01

Rheological parameters (viscosity, creep compliance and elasticity) play an important role in cell function and viability, and for this reason different strategies have been developed for their study. In this work, two new microrheometric techniques are presented. Both methods take advantage of the analysis of the polarized emission of an upconverting particle to determine its orientation inside an optical trap. Upconverting particles are optical materials that are able to convert infrared radiation into visible light. Their usefulness has been further boosted by the recent demonstration of their three-dimensional control and tracking by single-beam infrared optical traps. In this work it is demonstrated that optical torques are responsible for the stable orientation of the upconverting particle inside the trap. Moreover, numerical calculations and experimental data allowed the rotation dynamics of the optically trapped upconverting particle to be used for environmental sensing. In particular, the cytoplasm viscosity could be measured from the rotation time and thermal fluctuations of an intracellular optically trapped upconverting particle, by means of the two microrheometric techniques mentioned above.

  1. Categorizing natural disaster damage assessment using satellite-based geospatial techniques

    Science.gov (United States)

    Myint, S.W.; Yuan, M.; Cerveny, R.S.; Giri, C.

    2008-01-01

Remote sensing of a natural disaster's damage offers an exciting backup and/or alternative to traditional means of on-site damage assessment. Although necessary for complete assessment of damage areas, ground-based damage surveys conducted in the aftermath of natural hazard passage can sometimes be potentially complicated due to on-site difficulties (e.g., interaction with various authorities and emergency services) and hazards (e.g., downed power lines, gas lines, etc.), the need for rapid mobilization (particularly for remote locations), and the increasing cost of rapid physical transportation of manpower and equipment. Satellite image analysis, because of its global ubiquity, its ability for repeated independent analysis, and, as we demonstrate here, its ability to verify on-site damage assessment, provides an interesting new perspective and investigative aid to researchers. Using one of the strongest tornado events in US history, the 3 May 1999 Oklahoma City Tornado, as a case example, we digitized the tornado damage path and co-registered the damage path using pre- and post-event Landsat Thematic Mapper image data to perform a damage assessment. We employed several geospatial approaches, specifically the Getis index, Geary's C, and two lacunarity approaches, to categorize damage characteristics according to the original Fujita tornado damage scale (F-scale). Our results indicate strong relationships between spatial indices computed within a local window and tornado F-scale damage categories identified through the ground survey. Consequently, linear regression models, even incorporating just a single band, appear effective in identifying F-scale damage categories using satellite imagery. This study demonstrates that satellite-based geospatial techniques can effectively add spatial perspectives to natural disaster damage and, in this case study, to tornado damage.
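Of the spatial indices the study computes, Geary's C is the most compact to illustrate. The sketch below implements the standard global form of the statistic; the binary contiguity weights in the usage example are a hypothetical choice for illustration, not the study's actual local-window weighting.

```python
def gearys_c(values, weights):
    """Global Geary's C: C < 1 suggests positive spatial autocorrelation, C > 1 negative."""
    n = len(values)
    mean = sum(values) / n
    ss = sum((v - mean) ** 2 for v in values)          # total sum of squares
    W = sum(sum(row) for row in weights)               # sum of all spatial weights
    num = sum(weights[i][j] * (values[i] - values[j]) ** 2
              for i in range(n) for j in range(n))     # weighted squared differences
    return (n - 1) * num / (2.0 * W * ss)
```

Applied to pixel brightness values inside a moving window, as the study does, low C values flag locally homogeneous (heavily damaged or undamaged) patches.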

  2. Development of Tools and Techniques to Survey, Assess, Stabilise, Monitor and Preserve Underwater Archaeological Sites: SASMAP

    Science.gov (United States)

    Gregory, D. J.

    2015-08-01

SASMAP's purpose is to develop new technologies and best practices in order to locate, assess and manage Europe's underwater cultural heritage more effectively than is possible today. SASMAP has taken a holistic, process-based approach to investigating underwater environments and the archaeological sites contained therein. The end uses of the results of SASMAP are severalfold: i) to benefit the SMEs involved in the project and the development of their products for the offshore industry (not just for archaeological purposes); ii) a better understanding of the marine environment and its effect on archaeological materials; iii) the collation of the results from the project into guidelines that can be used by cultural resource managers to better administer and optimise developer-led underwater archaeological projects within Europe in accordance with European legislation (Treaty of Valletta, 1992). In summary, the project has utilised a downscaling approach to localise archaeological sites at a large-scale regional level. This has involved using innovative satellite imagery to obtain seamless topography maps over coastal areas and the seabed (accurate to a depth of 6 m), as well as the development of a 3D sub-bottom profiler to look within the seabed. Results obtained from the downscaling approach at the study areas in the project (Greece and Denmark) have enabled geological models to be developed in order to work towards predictive modelling of where submerged prehistoric sites may be encountered. Once sites have been located, an upscaling approach has been taken to assessing an individual site and the materials on and within it in order to better understand the state of preservation and dynamic conditions of a site and how it can best be preserved through in situ preservation or excavation. This has involved the development of equipment to monitor the seabed environment (open water and in sediments), equipment for sampling sediments and assessing the state of

  3. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi; Sun, Ying

    2016-01-01

Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widens their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.
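The memory control charts the authors combine with PCA extend the univariate EWMA idea to several dimensions. As a one-dimensional sketch of that underlying mechanism (parameter values here are common textbook defaults, not the paper's settings), the chart accumulates an exponentially weighted memory of past samples so that small persistent mean shifts eventually cross the control limit.

```python
def ewma_chart(x, lam=0.2, L=3.0, mu0=0.0, sigma=1.0):
    """Flag samples whose EWMA statistic leaves the steady-state control limits."""
    # steady-state control limit of the EWMA statistic
    limit = L * sigma * (lam / (2.0 - lam)) ** 0.5
    z, alarms = mu0, []
    for xt in x:
        z = lam * xt + (1.0 - lam) * z  # exponentially weighted memory of past samples
        alarms.append(abs(z - mu0) > limit)
    return alarms
```

A Shewhart-style chart with a 3-sigma limit would never flag a persistent shift of 2 sigma-equivalents in this noiseless example, whereas the EWMA statistic crosses its limit within a few samples of the change point, which is the sensitivity to small shifts the paper exploits.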

  4. Wavelet packet transform-based robust video watermarking technique

    Indian Academy of Sciences (India)

    If any conflict happens to the copyright identification and authentication, ... the present work is concentrated on the robust digital video watermarking. .... the wavelet decomposition, resulting in a new family of orthonormal bases for function ...

  5. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi

    2016-06-13

Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widens their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.

  6. An RSS based location estimation technique for cognitive relay networks

    KAUST Repository

    Qaraqe, Khalid A.; Hussain, Syed Imtiaz; Ç elebi, Hasari Burak; Abdallah, Mohamed M.; Alouini, Mohamed-Slim

    2010-01-01

    In this paper, a received signal strength (RSS) based location estimation method is proposed for a cooperative wireless relay network where the relay is a cognitive radio. We propose a method for the considered cognitive relay network to determine

  7. An Automated Sorting System Based on Virtual Instrumentation Techniques

    Directory of Open Access Journals (Sweden)

    Rodica Holonec

    2008-07-01

Full Text Available The application presented in this paper represents an experimental model and refers to the implementation of an automated sorting system for pieces of the same shape but different sizes and/or colors. The classification is made according to two features: the color and the weight of these pieces. The system is a complex combination of NI Vision hardware and software tools, strain-gauge transducers, signal conditioning connected to data acquisition boards, and motion and control elements. The system is very useful for students to learn and experiment with different virtual instrumentation techniques, in order to be able to develop a wide range of applications, from inspection and process control to sorting and assembly

  8. Innovative instrumentation for VVERs based in non-invasive techniques

    International Nuclear Information System (INIS)

    Jeanneau, H.; Favennec, J.M.; Tournu, E.; Germain, J.L.

    2000-01-01

    Nuclear power plants such as VVERs can greatly benefit from innovative instrumentation to improve plant safety and efficiency. In recent years innovative instrumentation has been developed for PWRs with the aim of providing additional measurements of physical parameters on the primary and secondary circuits: the addition of new instrumentation is made possible by using non-invasive techniques such as ultrasonics and radiation detection. These innovations can be adapted for upgrading VVERs presently in operation and also in future VVERs. The following innovative instrumentation for the control, monitoring or testing at VVERs is described: 1. instrumentation for more accurate primary side direct measurements (for a better monitoring of the primary circuit); 2. instrumentation to monitor radioactivity leaks (for a safer plant); 3. instrumentation-related systems to improve the plant efficiency (for a cheaper kWh)

  9. Extending Driving Vision Based on Image Mosaic Technique

    Directory of Open Access Journals (Sweden)

    Chen Deng

    2017-01-01

Full Text Available Car cameras have been used extensively to assist driving by making a car's surroundings visible. However, due to the limitation of the Angle of View (AoV), a dead zone still exists, which is a primary cause of car accidents. In this paper, we introduce a system to extend the vision of drivers to 360 degrees. Our system consists of four wide-angle cameras, which are mounted on different sides of a car. Although the AoV of each camera is within 180 degrees, by relying on the image mosaic technique our system can seamlessly integrate the 4-channel videos into a panorama video. The panorama video enables drivers to observe everything around a car as far as three meters away from a top view. We performed experiments in a laboratory environment. Preliminary results show that our system can eliminate the vision dead zone completely. Additionally, the real-time performance of our system satisfies the requirements for practical use.
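Real multi-camera mosaicking requires calibration, feature matching and warping, but the core overlap-and-merge idea can be shown in one dimension. The toy sketch below assumes exact-match overlap detection, a deliberate simplification: actual mosaicking aligns image features rather than raw samples.

```python
def stitch(left, right):
    """Merge two overlapping sequences by finding where right's head matches left's tail."""
    for k in range(min(len(left), len(right)), 0, -1):
        if left[-k:] == right[:k]:
            return left + right[k:]  # drop the duplicated overlap from the right piece
    return left + right  # no overlap found: simple concatenation
```

With image rows instead of number lists, the same principle, keeping each overlapping region only once, is what lets the four camera feeds join into a single seamless panorama.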

  10. A Diagnostic Technique for Formulating Market Strategies in Higher Education Based on Relative Competitive Position.

    Science.gov (United States)

    Dolinsky, Arthur L.; Quazi, Hesan A.

    1994-01-01

    Importance-performance analysis, a marketing research technique using analysis of consumer attitudes toward salient product or service attributes, is found useful for colleges and universities in developing marketing strategies, particularly when competition is considered as an important dimension. Data are drawn from a survey of 252 students at 1…

  11. Computerized tablet based versus traditional paper- based survey methods: results from adolescent's health research in schools of Maharashtra, India

    OpenAIRE

    Naveen Agarwal; Balram Paswan; Prakash H. Fulpagare; Dhirendra N Sinha; Thaksaphon Thamarangsi; Manju Rani

    2018-01-01

Background and challenges to implementation: Technological advancement is growing very fast in India, and the majority of the young population handles electronic devices often, during leisure as well as at work. This study indicates that electronic tablets are less time-consuming and improve the survey response rate over the traditional paper-and-pencil survey method. Intervention or response: An Android-based Global School-based Health Survey (GSHS) questionnaire was used with the...

  12. Moving base Gravity Gradiometer Survey System (GGSS) program

    Science.gov (United States)

    Pfohl, Louis; Rusnak, Walter; Jircitano, Albert; Grierson, Andrew

    1988-04-01

    The GGSS program began in early 1983 with the objective of delivering a landmobile and airborne system capable of fast, accurate, and economical gravity gradient surveys of large areas anywhere in the world. The objective included the development and use of post-mission data reduction software to process the survey data into solutions for the gravity disturbance vector components (north, east and vertical). This document describes the GGSS equipment hardware and software, integration and lab test procedures and results, and airborne and land survey procedures and results. Included are discussions on test strategies, post-mission data reduction algorithms, and the data reduction processing experience. Perspectives and conclusions are drawn from the results.

  13. Understanding Patient Experience Using Internet-based Email Surveys: A Feasibility Study at Mount Sinai Hospital.

    Science.gov (United States)

    Morgan, Matthew; Lau, Davina; Jivraj, Tanaz; Principi, Tania; Dietrich, Sandra; Bell, Chaim M

    2015-01-01

    Email is becoming a widely accepted communication tool in healthcare settings. This study sought to test the feasibility of Internet-based email surveys of patient experience in the ambulatory setting. We conducted a study of email Internet-based surveys sent to patients in selected ambulatory clinics at Mount Sinai Hospital in Toronto, Canada. Our findings suggest that email links to Internet surveys are a feasible, timely and efficient method to solicit patient feedback about their experience. Further research is required to optimally leverage Internet-based email surveys as a tool to better understand the patient experience.

  14. Radiation level survey of a mobile phone base station

    International Nuclear Information System (INIS)

    Campos, M.C.; Schaffer, S.R.

    2006-01-01

Electromagnetic field (E.M.F.) evaluations were carried out in the surroundings of a roof-top mobile-phone radio-base station (R.B.S.). Four of its sector-panel antennas are installed on two parallel vertical masts, each supporting two panels in a vertical collinear array. The geometry is such that the vertical plane containing both masts is about 10 meters distant from, and parallel to, the backside of an educational institution. This proximity provoked great anxiety among the local community members regarding potential health hazards. 1. Introduction: To keep up with the expansion of mobile-phone services, the number of Radio-Base Station (R.B.S.) installations is increasing tremendously in Brazil. Efficient control and radiation monitoring to assess R.B.S. compliance with existing regulations are still lacking, and, particularly in big cities, clearly non-compliant R.B.S. can be seen which represent potentially hazardous E.M.F. sources to the nearby population. This first survey of an irregular R.B.S. revealed significant E-field strengths outside, as well as inside, a classroom of an educational building where a usually prolonged stay is necessary. These results confirm that this problem deserves further attention, moreover if one considers that the public and occupational exposure limits set by I.C.N.I.R.P. (also adopted in Brazil) are exclusively based on the immediate thermal effects of acute exposure, disregarding any potential health risk from prolonged exposure to lower-level radiation. Research activities focusing on quantitative aspects of electromagnetic radiation from R.B.S., as well as on biological and adverse health effects, are still at a very incipient level, urging immediate actions to improve this scenario in our country. 2. Material, methods and results: Measurements were carried out with a broadband field strength monitor, E.M.R.-300 (W and G), coupled to an isotropic E-field probe (100 kHz to 3 GHz). Preliminary measurements helped locating

  15. Address-based versus random-digit-dial surveys: comparison of key health and risk indicators.

    Science.gov (United States)

    Link, Michael W; Battaglia, Michael P; Frankel, Martin R; Osborn, Larry; Mokdad, Ali H

    2006-11-15

    Use of random-digit dialing (RDD) for conducting health surveys is increasingly problematic because of declining participation rates and eroding frame coverage. Alternative survey modes and sampling frames may improve response rates and increase the validity of survey estimates. In a 2005 pilot study conducted in six states as part of the Behavioral Risk Factor Surveillance System, the authors administered a mail survey to selected household members sampled from addresses in a US Postal Service database. The authors compared estimates based on data from the completed mail surveys (n = 3,010) with those from the Behavioral Risk Factor Surveillance System telephone surveys (n = 18,780). The mail survey data appeared reasonably complete, and estimates based on data from the two survey modes were largely equivalent. Differences found, such as differences in the estimated prevalences of binge drinking (mail = 20.3%, telephone = 13.1%) or behaviors linked to human immunodeficiency virus transmission (mail = 7.1%, telephone = 4.2%), were consistent with previous research showing that, for questions about sensitive behaviors, self-administered surveys generally produce higher estimates than interviewer-administered surveys. The mail survey also provided access to cell-phone-only households and households without telephones, which cannot be reached by means of standard RDD surveys.
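Whether mode differences like these (e.g. binge drinking at 20.3% by mail versus 13.1% by telephone) are statistically meaningful can be checked with a standard two-proportion z-test. The sketch below uses the abstract's sample sizes; the pooled-variance form is a conventional choice for illustration, not necessarily the test the authors applied.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z statistic with pooled variance under H0: p1 == p2."""
    x1, x2 = p1 * n1, p2 * n2              # implied counts
    p = (x1 + x2) / (n1 + n2)              # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se
```

For the binge-drinking estimates the statistic is far beyond the usual 1.96 threshold, consistent with the authors' point that self-administered surveys elicit higher reports of sensitive behaviors.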

  16. Adaptive Landmark-Based Navigation System Using Learning Techniques

    DEFF Research Database (Denmark)

    Zeidan, Bassel; Dasgupta, Sakyasingha; Wörgötter, Florentin

    2014-01-01

    The goal-directed navigational ability of animals is an essential prerequisite for them to survive. They can learn to navigate to a distal goal in a complex environment. During this long-distance navigation, they exploit environmental features, like landmarks, to guide them towards their goal. Inspired by this, we develop an adaptive landmark-based navigation system based on sequential reinforcement learning. In addition, correlation-based learning is also integrated into the system to improve learning performance. The proposed system has been applied to simulated simple wheeled and more complex hexapod robots. As a result, it allows the robots to successfully learn to navigate to distal goals in complex environments.

  17. Estimating monthly temperature using point based interpolation techniques

    Science.gov (United States)

    Saaban, Azizan; Mah Hashim, Noridayu; Murat, Rusdi Indra Zuhdi

    2013-04-01

    This paper discusses the use of point-based interpolation to estimate the temperature at unallocated meteorology stations in Peninsular Malaysia, using data for the year 2010 collected from the Malaysian Meteorology Department. Two point-based interpolation methods are considered: Inverse Distance Weighted (IDW) and Radial Basis Function (RBF). The accuracy of the methods is evaluated using the Root Mean Square Error (RMSE). The results show that RBF with a thin-plate spline model is suitable for estimating temperature for the months of January and December, while RBF with a multiquadric model is suitable for the rest of the months.
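    The IDW estimator compared in this record weights each station's observation by an inverse power of its distance to the target point. A minimal sketch (the station coordinates and temperatures below are made-up illustrative values, not data from the study):

```python
import math

def idw(points, values, target, power=2.0):
    """Inverse Distance Weighted estimate at `target`.
    points: list of (x, y) station coordinates; values: observations there."""
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d = math.hypot(x - target[0], y - target[1])
        if d == 0.0:
            return v                 # target coincides with a station
        w = d ** -power              # closer stations get larger weight
        num += w * v
        den += w
    return num / den

# Hypothetical station coordinates (km) and monthly mean temperatures (deg C)
stations = [(0, 0), (10, 0), (0, 10), (10, 10)]
temps = [26.0, 27.0, 25.5, 26.5]
estimate = idw(stations, temps, (5, 5))
```

    Note that IDW estimates always stay within the range of the observed values, whereas RBF variants such as the thin-plate spline can extrapolate beyond it.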

  18. Product and process effectiveness using performance-based auditing techniques

    International Nuclear Information System (INIS)

    Horseman, M.L.

    1995-01-01

    Focus is the backbone of genius. Focus is the lifeblood of adequate products and effective processes. Focus is the theme of Performance-Based Audits (PBA). The Civilian Radioactive Waste Management (CRWM) Program is using the PBA tool extensively to focus on the evaluation of product adequacy and process effectiveness. The term Performance-Based Audit has been around for several years; however, the approach presented here for the systematic end-product selection, planning, and measurement of adequacy and effectiveness is new and innovative.

  19. Recommendations for abortion surveys using the ballot-box technique

    Directory of Open Access Journals (Sweden)

    Marcelo Medeiros

    2012-07-01

    Full Text Available The article lists recommendations for dealing with methodological aspects of an abortion survey and makes suggestions for testing and validating the survey questionnaire. The recommendations are based on the experience of the Brazilian Abortion Survey (PNA), a random-sample household survey covering adult women in all urban areas of the country that used the ballot-box technique.

  20. Detection and sizing of cracks using potential drop techniques based on electromagnetic induction

    International Nuclear Information System (INIS)

    Sato, Yasumoto; Kim, Hoon

    2011-01-01

    The potential drop techniques based on electromagnetic induction are classified into the induced current focused potential drop (ICFPD) technique and the remotely induced current potential drop (RICPD) technique. The feasibility of numerically simulating these techniques is investigated, and their applicability to the measurement of defects in conductive materials is presented. Finite element analysis (FEA) of RICPD measurements on a plate specimen containing back-wall slits is performed, and the calculated FEA results show good agreement with experimental results. The detection limit of the RICPD technique for the depth of back-wall slits can also be estimated by FEA. Detection and sizing of artificial defects in parent and welded materials are successfully performed by the ICFPD technique. The applicability of these techniques to the detection of cracks in field components is investigated, and most of the cracks in the components investigated are successfully detected by the ICFPD and RICPD techniques. (author)

  1. GIS-based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    Ali Haghizadeh

    2017-11-23

    Nov 23, 2017 ... This study shows the potency of two GIS-based data-driven techniques ... and artificial neural networks for potential groundwater mapping: a case study at the Mehran region, Iran; Catena 137.

  2. Customer requirements based ERP customization using AHP technique

    NARCIS (Netherlands)

    Parthasarathy, S.; Daneva, Maia

    2014-01-01

    Purpose – Customization is a difficult task for many organizations implementing enterprise resource planning (ERP) systems. The purpose of this paper is to develop a new framework based on customers' requirements to examine the ERP customization choices for the enterprise. The analytical hierarchy
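    The AHP technique named in the title ranks alternatives from pairwise comparison matrices. A minimal sketch using the standard column-normalization approximation of the priority eigenvector, together with Saaty's consistency ratio; the 3x3 matrix is a hypothetical comparison, not data from the paper:

```python
import numpy as np

def ahp_priorities(pairwise):
    """Approximate AHP priority vector: normalize each column, average rows."""
    A = np.asarray(pairwise, dtype=float)
    col_norm = A / A.sum(axis=0)
    return col_norm.mean(axis=1)

def consistency_ratio(pairwise, ri=0.58):
    """CR = CI / RI, with CI = (lambda_max - n) / (n - 1).
    ri = 0.58 is Saaty's random index for a 3x3 matrix."""
    A = np.asarray(pairwise, dtype=float)
    w = ahp_priorities(A)
    lam = (A @ w / w).mean()          # estimate of the principal eigenvalue
    n = A.shape[0]
    return ((lam - n) / (n - 1)) / ri

# Hypothetical pairwise comparison of three customization choices
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w = ahp_priorities(M)
```

    Judgments are usually accepted only when the consistency ratio is below 0.1; the matrix above passes that check.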

  3. The Visual Memory-Based Memorization Techniques in Piano Education

    Science.gov (United States)

    Yucetoker, Izzet

    2016-01-01

    Problem Statement: Johann Sebastian Bach is one of the leading composers of the baroque period. In addition to his huge contributions in the artistic dimension, he also served greatly in the field of education. This study has been done for determining the impact of visual memory-based memorization practices in the piano education on the visual…

  4. Photon attenuation correction technique in SPECT based on nonlinear optimization

    International Nuclear Information System (INIS)

    Suzuki, Shigehito; Wakabayashi, Misato; Okuyama, Keiichi; Kuwamura, Susumu

    1998-01-01

    Photon attenuation correction in SPECT was performed using nonlinear optimization theory, in which an optimum image is searched for so that the sum of squared errors between observed and reprojected projection data is minimized. This correction technique consists of optimization and step-width algorithms, which determine at each iteration a pixel-by-pixel search direction and its step width, respectively. We used the conjugate gradient and quasi-Newton methods as the optimization algorithm, and the Curry rule and the quadratic function method as the step-width algorithm. Statistical fluctuations in the corrected image due to statistical noise in the emission projection data grew as the iterations increased, depending on the combination of optimization and step-width algorithms. To suppress them, smoothing of the search directions was introduced. Computer experiments and clinical applications showed a pronounced reduction in statistical fluctuations of the corrected image for all combinations. Combinations using the conjugate gradient method were superior in noise characteristics and computation time. The use of that method with the quadratic function method is optimal when noise properties are regarded as important. (author)
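    The core of this correction is iterative minimization of the squared error between observed and reprojected projection data. A toy steepest-descent sketch on a small, deliberately well-conditioned linear system (the paper uses conjugate-gradient and quasi-Newton directions with the step-width rules described above; the matrix and sizes here are illustrative only):

```python
import numpy as np

# Toy system: A plays the role of the (known) projection operator,
# x_true the activity image, p the noise-free projection data.
rng = np.random.default_rng(0)
A = np.vstack([np.eye(6), rng.random((6, 6))])   # well conditioned by construction
x_true = rng.random(6)
p = A @ x_true

def reprojection_descent(A, p, iters=2000, step=None):
    """Minimize ||A x - p||^2 by steepest descent on the reprojection error."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step from spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x -= step * A.T @ (A @ x - p)            # gradient of the squared error
    return x

x_hat = reprojection_descent(A, p)
```

    Conjugate-gradient directions reach the same minimizer in far fewer iterations, which is one reason the paper found that method superior in computation time.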

  5. Key techniques for space-based solar pumped semiconductor lasers

    Science.gov (United States)

    He, Yang; Xiong, Sheng-jun; Liu, Xiao-long; Han, Wei-hua

    2014-12-01

    In space, laser transmission is free from atmospheric turbulence, absorption, dispersion and aerosol effects. Space-based lasers therefore have important value in satellite communication, satellite attitude control, space debris clearing, long-distance energy transmission, etc. On the other hand, solar energy is a clean and renewable resource; the average intensity of solar irradiation on the earth is 1353 W/m2, and it is even higher in space. Space-based solar-pumped lasers have therefore attracted much research in recent years, mostly focused on solar-pumped solid-state lasers and solar-pumped fiber lasers. Both lasing principles are based on stimulated emission by rare-earth ions such as Nd, Yb and Cr. These ions absorb light only in narrow bands, which leads to inefficient absorption of the broad-band solar spectrum and increases the system heat load, making the solar-to-laser power conversion efficiency very low. A solar-pumped semiconductor laser, by contrast, can absorb all photons with energy greater than the bandgap, and so could reach considerably higher efficiencies than other solar-pumped lasers. In addition, solar-pumped semiconductor lasers have smaller chips, simpler structures and better heat dissipation; they can be mounted on a small satellite platform and composed into satellite arrays, which can greatly improve the output power of the system and offer flexibility. This paper summarizes the research progress of space-based solar-pumped semiconductor lasers and analyses the key technologies for several application areas, including the processing of the semiconductor chip, the design of small and efficient solar condensers, and the cooling system of the lasers. We conclude that solar-pumped vertical-cavity surface-emitting semiconductor lasers will have wide application prospects in space.

  6. Measurement model equivalence in web- and paper-based surveys

    African Journals Online (AJOL)

    Confirmatory factor analysis (CFA) in a Structural Equation Modelling ... satisfaction (5 dimensions), and leadership and transformational issues (7 dimensions) ... and customers directly ... Further into the future, some experts predict that the majority of all survey research ... In addition, the perceived questionnaire length.

  7. Teaching Margery and Julian in Anthology-Based Survey Courses

    Science.gov (United States)

    Petersen, Zina

    2006-01-01

    Recognizing that many of us teach the medieval English women mystics Margery Kempe and Julian of Norwich in survey courses, this essay attempts to put these writers in context for teachers who may have only a passing familiarity with the period. Focusing on passages of their writings found in the Longman and Norton anthologies of British…

  8. Whole arm manipulation planning based on feedback velocity fields and sampling-based techniques.

    Science.gov (United States)

    Talaei, B; Abdollahi, F; Talebi, H A; Omidi Karkani, E

    2013-09-01

    Changing the configuration of a cooperative whole arm manipulator is not easy while enclosing an object, mainly because of the risk of jamming caused by kinematic constraints. To reduce this risk, this paper proposes a feedback manipulation planning algorithm that takes grasp kinematics into account. The idea is based on a vector field that imposes a perturbation in object-motion-inducing directions when the movement is considerably along manipulator redundant directions. The obstacle avoidance problem is then considered by combining the algorithm with sampling-based techniques. As experimental results confirm, the proposed algorithm is effective in avoiding jamming as well as obstacles for a 6-DOF dual-arm whole arm manipulator. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  9. A Survey of Public Key Infrastructure-Based Security for Mobile Communication Systems

    Directory of Open Access Journals (Sweden)

    Mohammed Ramadan

    2016-08-01

    Full Text Available Mobile communication security techniques are employed to guard the communication between network entities. Mobile cellular communication systems have become one of the most important communication systems in recent times and are used by millions of people around the world. Since the 1990s, considerable effort has been made to improve both the communication and security features of mobile communications systems. These improvements divide the mobile communications field into different generations according to the communication and security techniques employed, such as the A3, A5 and A8 algorithms for the 2G-GSM cellular system, and the 3G authentication and key agreement (AKA, evolved packet system authentication and key agreement (EPS-AKA, and long term evolution authentication and key agreement (LTE-AKA algorithms for 3rd generation partnership project (3GPP systems. These generations still have many vulnerabilities, and substantial security work is required to solve such problems. Some of the problems lie in the field of public key cryptography (PKC, which requires high computational cost and more network flexibility to be achieved. As such, the public key infrastructure (PKI is more compatible with the modern generations due to their superior communication features. This paper surveys the latest proposed works on the security of GSM, CDMA, and LTE cellular systems using PKI. Firstly, we present the security issues for each generation of mobile communication systems; then we study and analyze the latest proposed schemes and give some comparisons. Finally, we introduce some new directions for future work. This paper classifies the mobile communication security schemes according to the techniques used for each cellular system and covers some of the PKI-based security techniques such as authentication, key agreement, and privacy preserving.

  10. Eye gazing direction inspection based on image processing technique

    Science.gov (United States)

    Hao, Qun; Song, Yong

    2005-02-01

    According to results from neurobiology, human eyes obtain high resolution only at the center of the field of view. In our research on a Virtual Reality helmet, we aim to detect the gazing direction of the human eye in real time and feed it back to the control system to improve the resolution of the image at the center of the field of view. Given current display instruments, this method can serve both the field of view of the virtual scene and its resolution, and greatly improve the immersion of the virtual system. Detecting the gazing direction rapidly and exactly is therefore the basis for realizing this novel VR helmet design. In this paper, the conventional method of gazing-direction detection based on the Purkinje spot is introduced first. To overcome the disadvantages of the Purkinje-spot method, this paper proposes a method based on image processing to detect and determine the gazing direction. The locations of the pupils and the shapes of the eye sockets change with the gazing direction. With the aid of these changes, by analyzing eye images captured by cameras, the gazing direction can be determined. Experiments have been done to validate the efficiency of this method by analyzing such images. The algorithm detects the gazing direction directly from normal eye images and eliminates the need for special hardware. Experimental results show that the method is easy to implement and has high precision.
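    The first step of such an image-based pipeline is locating the pupil in the eye image. A minimal sketch, not the paper's algorithm: it thresholds a grayscale image and takes the centroid of the dark pixels, exploiting the fact that the pupil is the darkest region; the synthetic "eye image" below is a hypothetical stand-in for a camera frame:

```python
import numpy as np

def pupil_centroid(gray, threshold=60):
    """Estimate the pupil position as the centroid of pixels darker than
    `threshold` (the pupil is the darkest region of an eye image)."""
    dark = gray < threshold
    ys, xs = np.nonzero(dark)
    if len(xs) == 0:
        return None                      # no dark region found
    return xs.mean(), ys.mean()          # (column, row) centre

# Synthetic 40x60 "eye image": bright background with a dark pupil disc
img = np.full((40, 60), 200, dtype=np.uint8)
yy, xx = np.ogrid[:40, :60]
img[(yy - 18) ** 2 + (xx - 35) ** 2 <= 5 ** 2] = 20
cx, cy = pupil_centroid(img)
```

    In a real system the pupil position would then be combined with eye-socket shape features, as the abstract describes, to resolve the gazing direction.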

  11. COMPARISON AND EVALUATION OF CLUSTER BASED IMAGE SEGMENTATION TECHNIQUES

    OpenAIRE

    Hetangi D. Mehta*, Daxa Vekariya, Pratixa Badelia

    2017-01-01

    Image segmentation is the classification of an image into different groups. Numerous algorithms using different approaches have been proposed for image segmentation. A major challenge in segmentation evaluation comes from the fundamental conflict between generality and objectivity. A review is done on different types of clustering methods used for image segmentation. Also a methodology is proposed to classify and quantify different clustering algorithms based on their consistency in different...

  12. A new imaging riometer based on Mills Cross technique

    OpenAIRE

    Grill, M.; Honary, F.; Nielsen, E.; Hagfors, T.; Dekoulis, G.; Chapman, P.; Yamagishi, H.

    2003-01-01

    A new type of imaging riometer system based on a Mills Cross antenna array is currently under construction by the Ionosphere and Radio Propagation Group, Department of Communication Systems, Lancaster, in collaboration with the Max-Planck-Institut für Aeronomie, Germany. The system will have an unprecedented spatial resolution in a viewing area of 300 x 300 km. It is located at Ramfjordmoen, near Tromsø, Norway. The riometer (relative ionospheric opacity meter) determines the radio-wave absorptio...

  13. Projection computation based on pixel in simultaneous algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Wang Xu; Chen Zhiqiang; Xiong Hua; Zhang Li

    2005-01-01

    SART is an important algorithm for image reconstruction, in which the projection computation takes over half of the reconstruction time. An efficient way to compute the projection coefficient matrix, together with a memory optimization, is presented in this paper. Unlike the normal method, projection lines are located based on each pixel, and subsequent projection coefficient computations can reuse the results. The correlation between projection lines and pixels can be exploited to optimize the computation. (authors)
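    Once the projection coefficient matrix is available, the SART iteration itself is a normalized back-projection of the ray residuals. A toy sketch on a dense random system (real implementations use the sparse pixel-ray geometry whose computation this paper optimizes; the matrix sizes here are illustrative only):

```python
import numpy as np

def sart(A, p, iters=10000, lam=1.0):
    """Simultaneous Algebraic Reconstruction Technique on a precomputed
    projection coefficient matrix A (rays x pixels), data p (per ray)."""
    row_sum = A.sum(axis=1)                   # sum of coefficients per ray
    col_sum = A.sum(axis=0)                   # sum of coefficients per pixel
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        resid = (p - A @ x) / row_sum         # ray residuals, ray-normalized
        x += lam * (A.T @ resid) / col_sum    # back-project onto pixels
    return x

rng = np.random.default_rng(1)
A = rng.random((16, 6)) + 0.1                 # strictly positive coefficients
x_true = rng.random(6)
p = A @ x_true                                # consistent projection data
x_hat = sart(A, p)
```

    For a consistent system and relaxation parameter in (0, 2), this iteration converges to a solution of A x = p; the iteration count here is generous to make the toy example converge tightly.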

  14. Quality of reporting web-based and non-web-based survey studies: What authors, reviewers and consumers should consider.

    Science.gov (United States)

    Turk, Tarek; Elhady, Mohamed Tamer; Rashed, Sherwet; Abdelkhalek, Mariam; Nasef, Somia Ahmed; Khallaf, Ashraf Mohamed; Mohammed, Abdelrahman Tarek; Attia, Andrew Wassef; Adhikari, Purushottam; Amin, Mohamed Alsabbahi; Hirayama, Kenji; Huy, Nguyen Tien

    2018-01-01

    Several influential aspects of survey research have been under-investigated and there is a lack of guidance on reporting survey studies, especially web-based projects. In this review, we aim to investigate the reporting practices and quality of both web- and non-web-based survey studies to enhance the quality of reporting medical evidence that is derived from survey studies and to maximize the efficiency of its consumption. Reporting practices and quality of 100 random web- and 100 random non-web-based articles published from 2004 to 2016 were assessed using the SUrvey Reporting GuidelinE (SURGE). The CHERRIES guideline was also used to assess the reporting quality of Web-based studies. Our results revealed a potential gap in the reporting of many necessary checklist items in both web-based and non-web-based survey studies including development, description and testing of the questionnaire, the advertisement and administration of the questionnaire, sample representativeness and response rates, incentives, informed consent, and methods of statistical analysis. Our findings confirm the presence of major discrepancies in reporting results of survey-based studies. This can be attributed to the lack of availability of updated universal checklists for quality of reporting standards. We have summarized our findings in a table that may serve as a roadmap for future guidelines and checklists, which will hopefully include all types and all aspects of survey research.

  15. The Accuracy Assessment of Determining the Axis of Railway Track Basing on the Satellite Surveying

    Science.gov (United States)

    Koc, Władysław; Specht, Cezary; Chrostowski, Piotr; Palikowska, Katarzyna

    2012-09-01

    In 2009, at the Gdansk University of Technology, continuous satellite surveying of a railway track was carried out for the first time, using the relative phase method based on the geodesic active network ASG-EUPOS and the NAVGEO service. Ongoing research focuses on evaluating a GNSS multi-receiver platform for design and stock-taking. To assess the accuracy of the railway track axis position, the values of the transverse position deviation XTE (Cross Track Error) were evaluated. To eliminate the influence of random measurement errors and to obtain coordinates representing the actual shape of the track, the XTE variable was analyzed with signal analysis methods (Chebyshev low-pass filtering and the fast Fourier transform). Finally, the paper presents a module of the computer software SATTRACK, currently under development at the Gdansk University of Technology. The program supports visualization, assessment and design of railway track, adapted to the technique of continuous satellite surveying. The module called TRACK STRAIGHT is designed to assess straight sections. A description of its operation as well as examples of its functions is presented.
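    The FFT-based noise suppression mentioned above can be illustrated on a synthetic XTE series. This sketch zeroes all FFT bins above a cut-off, which is an idealized brick-wall low-pass rather than the paper's Chebyshev filter, and the drift/noise signal is made up for illustration:

```python
import numpy as np

def fft_lowpass(signal, keep_fraction=0.05):
    """Suppress high-frequency measurement noise by zeroing all FFT bins
    above a cut-off fraction of the (one-sided) spectrum."""
    spec = np.fft.rfft(signal)
    cut = max(1, int(len(spec) * keep_fraction))
    spec[cut:] = 0.0                          # brick-wall low-pass
    return np.fft.irfft(spec, n=len(signal))

# Hypothetical XTE record: a slow lateral drift plus random surveying noise
rng = np.random.default_rng(2)
n = 1000
t = np.linspace(0.0, 1.0, n)
drift = 0.02 * np.sin(2 * np.pi * 2 * t)      # metres, 2 cycles over the record
xte = drift + rng.normal(0.0, 0.005, n)       # add 5 mm measurement noise
smooth = fft_lowpass(xte, keep_fraction=0.05)
```

    Keeping only the lowest few percent of the spectrum preserves the slow track-shape component while removing most of the random measurement error, which is exactly the goal stated in the abstract.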

  16. Development of Energy Management System Based on Internet of Things Technique

    OpenAIRE

    Wen-Jye Shyr; Chia-Ming Lin and Hung-Yun Feng

    2017-01-01

    The purpose of this study was to develop an energy management system for university campuses based on the Internet of Things (IoT). The proposed IoT technique, based on WebAccess, is accessed via the Internet Explorer web browser and uses the TCP/IP protocol. A case study of IoT for a lighting energy usage management system is presented. The structure of the proposed IoT technique comprises a perception layer, equipment layer, control layer, application layer and network layer.

  17. Spreading of suppository bases assessed with histological and scintigraphic techniques

    International Nuclear Information System (INIS)

    Tupper, C.H.; Copping, N.; Thomas, N.W.; Wilson, C.G.

    1982-01-01

    Suppositories of PEG 15400 and PEG 600, Myrj 52 and Brij 35 were administered rectally to fasted male rats. At 30 and 60 min after the liquefaction time, samples of rectal mucosa were taken from treated and untreated rats, and a reduction in rectal cell volume and density in treated rats was noted. Similar suppositories, containing anion exchange resin and labelled with technetium-99, were administered to other rats. Serial scintiscanning was carried out using a gamma camera linked to a computer. Spreading of the suppository bases was assessed histologically and by imaging. (U.K.)

  18. Is integrative use of techniques in psychotherapy the exception or the rule? Results of a national survey of doctoral-level practitioners.

    Science.gov (United States)

    Thoma, Nathan C; Cecero, John J

    2009-12-01

    This study sought to investigate the extent to which therapists endorse techniques outside of their self-identified orientation and which techniques are endorsed across orientations. A survey consisting of 127 techniques from 8 major theories of psychotherapy was administered via U.S. mail to a national random sample of doctoral-level psychotherapy practitioners. The 201 participants endorsed substantial numbers of techniques from outside their respective orientations. Many of these techniques were quite different from those of the core theories of the respective orientations. Further examining when and why experienced practitioners switch to techniques outside their primary orientation may help reveal where certain techniques fall short and where others excel, indicating a need for further research that taps the collective experience of practitioners. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  19. Flash floods warning technique based on wireless communication networks data

    Science.gov (United States)

    David, Noam; Alpert, Pinhas; Messer, Hagit

    2010-05-01

    Flash floods can occur during or after rainfall events, particularly when the precipitation is of high intensity. Unfortunately, each year these floods cause severe property damage and heavy casualties. At present, there are no sufficient real-time flash flood warning facilities to cope with this phenomenon. Here we show the tremendous potential for advanced flash flood warning based on precipitation measurements from commercial microwave links. As was recently shown, wireless communication networks supply high-resolution precipitation measurements at ground level while often being situated in flood-prone areas, covering large parts of these hazardous regions. We present the flash flood warning potential of the wireless communication system for two cases of floods that occurred in the Judean desert and the northern Negev in Israel. In both cases, an advance warning of the hazard could have been announced based on this system. • This research was supported by THE ISRAEL SCIENCE FOUNDATION (grant No. 173/08). This work was also supported by a grant from the Yeshaya Horowitz Association, Jerusalem. Additional support was given by the PROCEMA-BMBF project and by the GLOWA-JR BMBF project.
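    The record does not spell out how link measurements become rain rates, but the standard approach in this literature is the power law A = k * R**alpha * L relating path-integrated attenuation to rain intensity. A hedged sketch; the k and alpha values below are placeholder magnitudes, not the frequency- and polarization-dependent coefficients a real system would take from ITU-R P.838:

```python
def rain_rate_from_attenuation(attenuation_db, length_km, k=0.12, alpha=1.06):
    """Invert the power law A = k * R**alpha * L for the path-averaged
    rain rate R (mm/h) from a microwave link's rain-induced attenuation.
    attenuation_db: excess attenuation over the dry baseline (dB);
    length_km: link path length; k, alpha: placeholder coefficients."""
    specific = attenuation_db / length_km        # specific attenuation, dB/km
    return (specific / k) ** (1.0 / alpha)

# Hypothetical link: 6 dB of rain-induced attenuation over a 5 km path
R = rain_rate_from_attenuation(attenuation_db=6.0, length_km=5.0)
```

    A network of such links then yields the high-resolution ground-level rain field that the warning scheme builds on; heavier attenuation maps monotonically to heavier estimated rainfall.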

  20. Concepts and techniques for conducting performance-based audits

    International Nuclear Information System (INIS)

    Espy, I.J.

    1990-01-01

    Quality assurance (QA) audits have historically varied in purpose and approach and have earned labels that attempt to name each type of audit. Some common labels for QA audits include compliance, program, product, and performance-based. While documentation and methodologies are important and hence controlled, an organization's product has the ultimate impact on the user. Product quality must therefore be of more concern to the user than the documentation and methodologies of the provider. Performance-based auditing (PBA) provides for assessing product quality by evaluating the supplier's activities that produce and affect product quality. PBA is defined as auditing that evaluates the ability of an activity to regularly produce and release only acceptable product, where product refers to the output of the activity. The output may be hardware, software, or a service, and acceptability includes suitability to the user's needs. To satisfy this definition, PBA should focus on the activities that produce and affect the product and should evaluate each activity in terms of its ability to produce acceptable product. The activity evaluation model provides a framework for evaluating how systematic any activity is. Elements of the activity evaluation model are described.

  1. Who should be undertaking population-based surveys in humanitarian emergencies?

    Directory of Open Access Journals (Sweden)

    Spiegel Paul B

    2007-06-01

    , coordinate when and where surveys should be undertaken and act as a survey repository. Technical expertise is expensive and donors must pay for it. As donors increasingly demand evidence-based programming, they have an obligation to ensure that sufficient funds are provided so organisations have adequate technical staff.

  2. Developing Visualization Techniques for Semantics-based Information Networks

    Science.gov (United States)

    Keller, Richard M.; Hall, David R.

    2003-01-01

    Information systems incorporating complex network-structured information spaces with a semantic underpinning - such as hypermedia networks, semantic networks, topic maps, and concept maps - are being deployed to solve some of NASA's critical information management problems. This paper describes some of the human interaction and navigation problems associated with complex semantic information spaces and describes a set of new visual interface approaches to address these problems. A key strategy is to leverage semantic knowledge represented within these information spaces to construct abstractions and views that will be meaningful to the human user. Human-computer interaction methodologies will guide the development and evaluation of these approaches, which will benefit deployed NASA systems and also apply to information systems based on the emerging Semantic Web.

  3. Computational Intelligence based techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Karimi, M.; Bakar, A.H.A.; Mohamad, Hasmaini

    2014-01-01

    Highlights: • Unintentional and intentional islanding, their causes, and solutions are presented. • Remote, passive, active and hybrid islanding detection techniques are discussed. • The limitations of these techniques in accurately detecting islanding are discussed. • The ability of computational intelligence techniques to detect islanding is discussed. • A review of ANN, fuzzy logic control, ANFIS and decision tree techniques is provided. - Abstract: Accurate and fast islanding detection of distributed generation is highly important for its successful operation in distribution networks. Up to now, various islanding detection techniques based on communication, passive, active and hybrid methods have been proposed. However, each technique suffers from certain demerits that cause inaccuracies in islanding detection. Computational intelligence based techniques, due to their robustness and flexibility in dealing with complex nonlinear systems, are an option that might solve this problem. This paper aims to provide a comprehensive review of computational intelligence based techniques applied to islanding detection of distributed generation. Moreover, the paper compares the accuracy of computational intelligence based techniques with that of existing techniques to provide a handful of information for industry and utility researchers to determine the best method for their respective systems.

  4. Mobile Robot Navigation Based on Q-Learning Technique

    Directory of Open Access Journals (Sweden)

    Lazhar Khriji

    2011-03-01

    Full Text Available This paper shows how the Q-learning approach can be used successfully to deal with the problem of mobile robot navigation. In real situations where a large number of obstacles are involved, the normal Q-learning approach encounters two major problems due to the excessively large state space. First, learning the Q-values in tabular form may be infeasible because of the excessive amount of memory needed to store the table. Second, rewards in the state space may be so sparse that with random exploration they will only be discovered extremely slowly. In this paper, we propose a navigation approach for mobile robots in which prior knowledge is used within Q-learning. We address the issue of individual behavior design using fuzzy logic. The strategy of behavior-based navigation reduces the complexity of the navigation problem by dividing it into small actions that are easier to design and implement. The Q-learning algorithm is applied to coordinate these behaviors, which greatly reduces learning convergence times. Simulation and experimental results confirm convergence to the desired results in terms of saved time and computational resources.
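    The tabular Q-learning update at the heart of this approach can be shown on a toy problem. A minimal sketch on a hypothetical one-dimensional corridor (not the paper's robot task, which layers fuzzy behaviors on top of this update):

```python
import random

def greedy(qs):
    """Argmax with random tie-breaking (matters when Q is still all zeros)."""
    m = max(qs)
    return random.choice([a for a, v in enumerate(qs) if v == m])

def q_learning_corridor(n=6, episodes=500, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning on a corridor of n cells: start at cell 0,
    reward 1 for reaching the last cell; actions: 0 = left, 1 = right."""
    q = [[0.0, 0.0] for _ in range(n)]
    for _ in range(episodes):
        s = 0
        while s != n - 1:
            # epsilon-greedy action selection
            a = random.randrange(2) if random.random() < eps else greedy(q[s])
            s2 = min(max(s + (1 if a else -1), 0), n - 1)
            r = 1.0 if s2 == n - 1 else 0.0
            # Q-learning update: bootstrap on the best next-state value
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

random.seed(0)
q_table = q_learning_corridor()
```

    After training, the greedy policy moves right from every cell, and the Q-values decay geometrically with distance from the goal, illustrating why the table grows unmanageable once the state space is large, the problem the paper's behavior decomposition addresses.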

  5. Damage identification in beams by a response surface based technique

    Directory of Open Access Journals (Sweden)

    Teidj S.

    2014-01-01

    Full Text Available In this work, identification of damage in uniform homogeneous metallic beams was considered through the propagation of non-dispersive elastic torsional waves. The proposed damage detection procedure consists of the following sequence. Given a localized torque excitation in the form of a short half-sine pulse, the first step is to calculate the transient solution of the resulting torsional wave. In practice, this torque could be generated by asymmetric laser irradiation of the beam surface. A localized defect, assumed to be characterized by an abrupt reduction of the beam section area with a given height and extent, is then placed at a known location along the beam. Next, the response in terms of transverse section rotation rate is obtained at a point beyond the defect, where the sensor is positioned; in practice, this could use laser vibrometry. A parametric study was then conducted using a full-factorial design-of-experiments table and numerical simulations based on a finite difference characteristic scheme. This enabled the derivation of a response surface model that was shown to represent adequately the response of the system in terms of the following factors: defect extent and severity. The final step is solving the inverse problem in order to identify the defect characteristics from the measurements.
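    A response surface model of the kind used here is typically a low-order polynomial fitted by least squares over the design-of-experiments grid. A minimal sketch with a full quadratic surface in two factors; the factor grid and the synthetic response below are hypothetical, not the paper's simulation data:

```python
import numpy as np

def fit_quadratic_rs(x1, x2, y):
    """Least-squares quadratic response surface:
    y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Hypothetical full-factorial design over defect extent (x1) and severity (x2)
x1, x2 = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()
y = 1.0 + 2.0 * x1 - 0.5 * x2 + 0.3 * x1 * x2      # synthetic noise-free response
coef = fit_quadratic_rs(x1, x2, y)
```

    Once fitted, the inverse problem amounts to finding the (extent, severity) pair whose predicted response matches the measured one, e.g. by minimizing the squared mismatch over the surface.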

  6. PELAN - a transportable, neutron-based UXO identification technique

    International Nuclear Information System (INIS)

    Vourvopoulos, G.

    1998-01-01

    An elemental characterization method is used to differentiate between inert projectiles and UXOs. This method identifies, in a non-intrusive, nondestructive manner, the elemental composition of the projectile contents. Most major and minor chemical elements within the interrogated object (hydrogen, carbon, nitrogen, oxygen, fluorine, phosphorus, chlorine, arsenic, etc.) are identified and quantified. The method is based on PELAN - Pulsed Elemental Analysis with Neutrons. PELAN uses pulsed neutrons produced from a compact, sealed-tube neutron generator. Using an automatic analysis computer program, the quantities of each major and minor chemical element are determined. A decision-making tree identifies the object by comparing its elemental composition with stored elemental composition libraries of substances that could be contained within the projectile. In a series of blind tests, PELAN was able to identify, without failure, the contents of each shell placed in front of it. The PELAN probe does not need to be in contact with the interrogated projectile. If the object is buried, the interrogation can take place in situ provided the probe can be inserted a few centimeters from the object's surface. (author)

  7. A survey of clearing techniques for 3D imaging of tissues with special reference to connective tissue.

    Science.gov (United States)

    Azaripour, Adriano; Lagerweij, Tonny; Scharfbillig, Christina; Jadczak, Anna Elisabeth; Willershausen, Brita; Van Noorden, Cornelis J F

    2016-08-01

    For 3-dimensional (3D) imaging of a tissue, 3 methodological steps are essential and their successful application depends on specific characteristics of the type of tissue. The steps are 1° clearing of the opaque tissue to render it transparent for microscopy, 2° fluorescence labeling of the tissue and 3° 3D imaging. In the past decades, new methodologies were introduced for the clearing step, each with specific advantages and disadvantages. Most clearing techniques have been applied to the central nervous system and other organs that contain relatively low amounts of connective tissue including extracellular matrix. However, tissues that contain large amounts of extracellular matrix, such as dermis in skin or gingiva, are difficult to clear. The present survey lists methodologies that are available for clearing of tissues for 3D imaging. We report here that the BABB method, using a mixture of benzyl alcohol and benzyl benzoate, and iDISCO, using dibenzylether (DBE), are the most successful methods for clearing connective tissue-rich gingiva and dermis of skin for 3D histochemistry and imaging of fluorescence using light-sheet microscopy. Copyright © 2016 The Authors. Published by Elsevier GmbH. All rights reserved.

  8. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    Lundtang Paulsen, J.

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information for the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dots and the lines: when all information is on the table, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  9. A correlation-based pulse detection technique for gamma-ray/neutron detectors

    International Nuclear Information System (INIS)

    Faisal, Muhammad; Schiffer, Randolph T.; Flaska, Marek; Pozzi, Sara A.; Wentzloff, David D.

    2011-01-01

    We present a correlation-based detection technique that significantly improves the probability of detection for low-energy pulses. We propose performing a normalized cross-correlation of the incoming pulse data with a predefined pulse template, and using a threshold correlation value to trigger the detection of a pulse. This technique improves detector sensitivity by amplifying the signal component of the incoming pulse data and rejecting noise. Simulation results for various templates are presented. Finally, the performance of the correlation-based detection technique is compared to current state-of-the-art techniques.
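A minimal sketch of such a normalized cross-correlation trigger: slide a pulse template over the data stream and flag window positions whose correlation with the template exceeds a threshold. The pulse shape, noise level, pulse position and threshold below are all invented for illustration:

```python
import numpy as np

def detect_pulses(signal, template, threshold=0.9):
    """Flag window positions whose normalized cross-correlation with the
    (mean-removed, unit-norm) template exceeds the threshold."""
    n = len(template)
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    hits = []
    for i in range(len(signal) - n + 1):
        w = signal[i:i + n] - signal[i:i + n].mean()
        norm = np.linalg.norm(w)
        if norm > 0 and np.dot(w, t) / norm >= threshold:
            hits.append(i)
    return hits

# Synthetic stream: one low-amplitude pulse buried in noise at sample 40
rng = np.random.default_rng(1)
x = np.linspace(0, 4, 16)
template = np.exp(-x) * (1 - np.exp(-3 * x))  # assumed fast-rise/slow-decay shape
stream = rng.normal(0.0, 0.01, 100)
stream[40:56] += 0.5 * template
hits = detect_pulses(stream, template)
```

Because the correlation is normalized, the trigger responds to pulse shape rather than absolute amplitude, which is what makes low-energy pulses detectable above a fixed correlation threshold.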

  10. A Web-based survey on students' conceptions of 'accident'.

    Science.gov (United States)

    Blank, Danilo; Hohgraefe Neto, Guilherme; Grando, Elisa; Siqueira, Pauline Z; Lunkes, Roberta P; Pietrobeli, João Leonardo; Marzola, Norma Regina; Goldani, Marcelo Z

    2009-12-01

    To report the implementation of an open-source web survey application and a case study of its first utilisation, particularly with respect to logistics and response behaviour, in a survey of Brazilian university students' conceptions about injury-causing events. We developed an original application capable of recruiting respondents, sending personal e-mail invitations, storing responses and exporting data. Students of medical, law, communication and education schools were asked about personal attributes and conceptions of the term accident, as to associations and preventability. The response rate was 34.5%. Half of the subjects responded by the second day and 66.3% within the first week. Around 4.2% of subjects (95% CI 3.3-5.4) refused to disclose their religious persuasion and 19.2% (95% CI 17.2-21.3) refused to disclose their political persuasion, whereas only 2.8% (95% CI 2.1-3.8), on average, refused to answer questions on conceptions and attitudes. There was no significant difference between early and late respondents with respect to selected attributes and conceptions of accident (P-value varied from 0.145 to 0.971). The word accident evoked the notion of preventability for 85.1% (95% CI 83.2-87.0) of the subjects, foreseeability for 50.3% (95% CI 47.7-53.0), fatality for 15.1% (95% CI 13.3-17.1) and intentionality for 2.3% (95% CI 1.6-3.2). Web surveying university students' conceptions about injuries is feasible in a middle-income country setting, yielding response rates similar to those found in the literature.

  11. A Survey of Advances in Vision-Based Human Motion Capture and Analysis

    DEFF Research Database (Denmark)

    Moeslund, Thomas B.; Hilton, Adrian; Krüger, Volker

    2006-01-01

    This survey reviews advances in human motion capture and analysis from 2000 to 2006, following a previous survey of papers up to 2000. Human motion capture continues to be an increasingly active research area in computer vision, with over 350 publications over this period. A number of significant...... actions and behavior. This survey reviews recent trends in video-based human capture and analysis, as well as discussing open problems for future research to achieve automatic visual analysis of human movement.

  12. A comparison of mandibular denture base deformation with different impression techniques for implant overdentures.

    Science.gov (United States)

    Elsyad, Moustafa Abdou; El-Waseef, Fatma Ahmad; Al-Mahdy, Yasmeen Fathy; Fouad, Mohammed Mohammed

    2013-08-01

    This study aimed to evaluate mandibular denture base deformation with three impression techniques used for implant-retained overdentures. Ten edentulous patients (five men and five women) received two implants in the canine region of the mandible and three duplicate mandibular overdentures, which were constructed with mucostatic, selective pressure, and definitive pressure impression techniques. Ball abutments and respective gold matrices were used to connect the overdentures to the implants. Six linear strain gauges were bonded to the lingual polished surface of each duplicate overdenture at the midline and implant areas to measure strain during maximal clenching and gum chewing. The strains recorded at the midline were compressive, while strains at the implant areas were tensile. Clenching recorded significantly higher strain when compared with gum chewing for all techniques. The mucostatic technique recorded the highest strain and the definitive pressure technique the lowest. There was no significant difference between the strain recorded with the mucostatic technique and that registered with the selective pressure technique. The highest strain was recorded at the level of the ball abutment's top with the mucostatic technique during clenching. The definitive pressure impression technique for implant-retained mandibular overdentures is associated with minimal denture deformation during function when compared with the mucostatic and selective pressure techniques. Reinforcement of the denture base over the implants may be recommended to increase resistance to fracture when the mucostatic or selective pressure impression technique is used. © 2012 John Wiley & Sons A/S.

  13. Techniques for Improving the Accuracy of 802.11 WLAN-Based Networking Experimentation

    Directory of Open Access Journals (Sweden)

    Portoles-Comeras Marc

    2010-01-01

    Full Text Available Wireless networking experimentation research has become highly popular due to both the frequent mismatch between theory and practice and the widespread availability of low-cost WLAN cards. However, current WLAN solutions present a series of performance issues, sometimes difficult to predict in advance, that may compromise the validity of the results gathered. This paper surveys recent literature dealing with such issues and draws attention to the negative consequences of starting experimental research without properly understanding the tools that are going to be used. Furthermore, the paper details how a conscious assessment strategy can prevent placing wrong assumptions on the hardware. Indeed, numerous techniques have been described throughout the literature that can be used to obtain a deeper understanding of the solutions that have been adopted. The paper surveys these techniques and classifies them in order to provide a handy reference for building experimental setups from which accurate measurements may be obtained.

  14. Towards a common standard - a reporting checklist for web-based stated preference valuation surveys and a critique for mode surveys

    DEFF Research Database (Denmark)

    Menegaki, Angeliki, N.; Olsen, Søren Bøye; Tsagarakis, Konstantinos P.

    2016-01-01

    The checklist is developed based on the bulk of knowledge gained so far with web-based surveys. This knowledge is compiled based on an extensive review of relevant literature dated from 2001 to the beginning of 2015 in the Scopus database. Somewhat surprisingly, relatively few papers are concerned with survey mode...

  15. The Desired Learning Outcomes of School-Based Nutrition/Physical Activity Health Education: A Health Literacy Constructed Delphi Survey of Finnish Experts

    Science.gov (United States)

    Ormshaw, Michael James; Kokko, Sami Petteri; Villberg, Jari; Kannas, Lasse

    2016-01-01

    Purpose: The purpose of this paper is to utilise the collective opinion of a group of Finnish experts to identify the most important learning outcomes of secondary-level school-based health education, in the specific domains of physical activity and nutrition. Design/ Methodology/ Approach: The study uses a Delphi survey technique to collect the…

  16. Demonstrating the Potential for Web-Based Survey Methodology with a Case Study.

    Science.gov (United States)

    Mertler, Craig

    2002-01-01

    Describes personal experience with using the Internet to administer a teacher-motivation and job-satisfaction survey to elementary and secondary teachers. Concludes that the advantages of Web-based surveys, such as cost savings and efficiency of data collection, outweigh disadvantages, such as the limitations of listservs. (Contains 10 references.)…

  17. Lessons Learned from the Administration of a Web-Based Survey.

    Science.gov (United States)

    Mertler, Craig A.

    This paper describes the methodology used in a research study involving the collection of data through a Web-based survey, focusing on the advantages and limitations of the methodology. The Teacher motivation and Job Satisfaction Survey was administered to K-12 teachers. Many of the difficulties occurred during the planning phase, as opposed to…

  18. A knowledge - based system to assist in the design of soil survey schemes

    NARCIS (Netherlands)

    Domburg, P.

    1994-01-01

    Soil survey information with quantified accuracy is relevant to decisions on land use and environmental problems. To obtain such information statistical strategies should be used for collecting and analysing data. A survey project based on a statistical sampling strategy requires a soil

  19. Wood lens design philosophy based on a binary additive manufacturing technique

    Science.gov (United States)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  20. Tilting-Twisting-Rolling: a pen-based technique for compass geometric construction

    Institute of Scientific and Technical Information of China (English)

    Fei LYU; Feng TIAN; Guozhong DAI; Hongan WANG

    2017-01-01

    This paper presents a new pen-based technique, Tilting-Twisting-Rolling, to support compass geometric construction. By leveraging the 3D orientation information and 3D rotation information of a pen, this technique allows smooth pen action to complete multi-step geometric construction without switching task states. Results from a user study show this Tilting-Twisting-Rolling technique can improve user performance and user experience in compass geometric construction.

  1. Population-based survey of cessation aids used by Swedish smokers

    Directory of Open Access Journals (Sweden)

    Rutqvist Lars E

    2012-12-01

    Full Text Available Abstract Background Most smokers who quit typically do so unassisted, although pharmaceutical products are increasingly used by those who want a quitting aid. Previous Scandinavian surveys indicated that many smokers stopped smoking by switching from cigarettes to smokeless tobacco in the form of snus. However, usage of various cessation aids may have changed in Sweden during recent years due to factors such as the wider availability of pharmaceutical nicotine, the public debate about the health effects of different tobacco products, excise tax increases on snus relative to cigarettes, and the widespread public misconception that nicotine is the main cause of the adverse health effects associated with tobacco use. Methods A population-based, cross-sectional survey was done between November 2008 and September 2009 including 2,599 males and 3,409 females aged between 18 and 89 years. The sampling technique was random digit dialing. Data on tobacco habits and quit attempts were collected through structured telephone interviews. Results The proportion of ever smokers was similar among males (47%) compared to females (44%). About two thirds of them reported having stopped smoking at the time of the survey. Among the former smokers, the proportion who reported unassisted quitting was slightly lower among males (68%) compared to females (78%). Among ever smokers who reported having made assisted quit attempts, snus was the most frequently reported cessation aid among males (22%), whereas females more frequently reported counseling (8%) or pharmaceutical nicotine (gum 8%, patch 4%). Of those who reported using snus at their latest quit attempt, 81% of males and 72% of females were successful quitters compared to about 50-60% for pharmaceutical nicotine and counseling. Conclusions This survey confirms and extends previous reports in showing that, although most smokers who have quit did so unassisted, snus continues to be the most frequently reported cessation

  2. school-based survey of adolescents' opinion on premarital sex in ...

    African Journals Online (AJOL)

    PROF. BARTH EKWEME

    Method: A cross sectional descriptive survey design was used. The sample size was 313 senior secondary school students from four public secondary schools in Yakurr Local Government Area of Cross River State. Simple random sampling technique was used to select 313 students from 4 schools in Yakurr Local ...

  3. Modelling of Surface Fault Structures Based on Ground Magnetic Survey

    Science.gov (United States)

    Michels, A.; McEnroe, S. A.

    2017-12-01

    The island of Leka confines the exposure of the Leka Ophiolite Complex (LOC), which contains mantle and crustal rocks and provides a rare opportunity to study the magnetic properties and response of these formations. The LOC is comprised of five rock units: (1) strongly deformed harzburgite, shifting into an increasingly olivine-rich dunite; (2) ultramafic cumulates with layers of olivine, chromite, clinopyroxene and orthopyroxene. These cumulates are overlain by (3) metagabbros, which are cut by (4) metabasaltic dykes and (5) pillow lavas (Furnes et al. 1988). Over the course of three field seasons, a detailed ground-magnetic survey was made over the island, covering all units of the LOC, and samples were collected from 109 sites for magnetic measurements. NRM, susceptibility, density and hysteresis properties were measured. In total, 66% of samples have a Q value > 1, suggesting that the magnetic anomalies include both induced and remanent components, and both should be included in the model. This ophiolite originated from a suprasubduction zone near the coast of Laurentia (497±2 Ma), was obducted onto Laurentia (≈460 Ma) and then transferred to Baltica during the Caledonide Orogeny (≈430 Ma). The LOC was faulted, deformed and serpentinized during these events. The gabbro and ultramafic rocks are separated by a normal fault. The dominant magnetic anomaly that crosses the island correlates with this normal fault. There is a series of smaller-scale faults parallel to this one, and some correspond to local highs that can be highlighted by a tilt derivative of the magnetic data. The distinct magnetic anomalies that delineate these fault boundaries in both ground and aeromagnetic survey data are likely caused by an increased amount of serpentinization of the ultramafic rocks in the fault areas.

  4. Analysis of web-based online services for GPS relative and precise point positioning techniques

    Directory of Open Access Journals (Sweden)

    Taylan Ocalan

    Full Text Available Nowadays, the Global Positioning System (GPS) is used effectively in several engineering applications for survey purposes by multiple disciplines. Web-based online services developed by several organizations, which are user friendly, unlimited in use and mostly free, have become a significant alternative to high-cost scientific and commercial software for post-processing and analyzing GPS data. When centimeter (cm) or decimeter (dm) level accuracies are desired, they can easily be obtained through these services for engineering applications with different quality requirements. In this paper, a test study was conducted on the ISKI-CORS network (Istanbul, Turkey) in order to analyze the accuracy of the most widely used web-based online services around the world (namely OPUS, AUSPOS, SCOUT, CSRS-PPP, GAPS, APPS and magicGNSS). These services use relative and precise point positioning (PPP) solution approaches. In this test study, the coordinates of eight stations were estimated using both the online services and the Bernese 5.0 scientific GPS processing software from a 24-hour GPS data set, and the coordinate differences between the online services and the Bernese processing software were computed. From the evaluations, it was seen that the individual coordinate differences were less than 10 mm for the relative online services and less than 20 mm for the precise point positioning services. The accuracy analysis was derived from these coordinate differences and the standard deviations of the coordinates obtained with the different techniques, and the online services were then compared to each other. The results show that the position accuracies obtained by the associated online services are high enough to be used in many engineering applications and geodetic analyses.

  5. A survey of techniques to reduce and manage external beam radiation-induced xerostomia in British oncology and radiotherapy departments

    International Nuclear Information System (INIS)

    Macknelly, Andrew; Day, Jane

    2009-01-01

    Xerostomia is the most common side effect of external beam radiotherapy to the head and neck [Anand A, Jain J, Negi P, Chaudhoory A, Sinha S, Choudhury P, et al. Can dose reduction to one parotid gland prevent xerostomia? - A feasibility study for locally advanced head and neck cancer patients treated with intensity-modulated radiotherapy. Clinical Oncology 2006;18(6):497-504.]. A survey was carried out in British oncology departments to determine what treatment regimes are used to minimise xerostomia in patients with head-and-neck cancers treated with external beam radiotherapy. A semi-structured questionnaire consisting of both quantitative and qualitative questions was designed that asked departments which of the identified methods they used, why a method might not be currently employed, and whether its use had ever been considered. The study found that there are wide disparities between the techniques employed by oncology departments to avoid and reduce xerostomia in patients with cancers of the head and neck. The National Institute for Clinical Health and Excellence [National Institute for Clinical Health and Excellence (NICE). Improving outcomes in head and neck cancers: the manual. London: Office of Public Sector Information; 2004.], for example, recommends that patients are given dental care and dietary advice, but some departments did not appear to be doing this. Less than half of the departments stated that they offer complementary therapies, and less than 40% prescribed pilocarpine, a saliva stimulant. Only two respondents stated that they use amifostine, a radioprotector, during radiotherapy treatment to the head and neck. The results also suggested a move toward using Intensity Modulated Radiotherapy (IMRT) for treating head-and-neck cancers, which offers better normal tissue sparing than three-dimensional conformal radiotherapy [Anand A, Jain J, Negi P, Chaudhoory A, Sinha S, Choudhury P, et al. Can dose reduction to one parotid gland prevent xerostomia? - A feasibility study for locally advanced head and neck cancer patients treated with intensity-modulated radiotherapy. Clinical Oncology 2006;18(6):497-504.].

  6. A Survey on Multimedia-Based Cross-Layer Optimization in Visual Sensor Networks

    Science.gov (United States)

    Costa, Daniel G.; Guedes, Luiz Affonso

    2011-01-01

    Visual sensor networks (VSNs) comprised of battery-operated electronic devices endowed with low-resolution cameras have expanded the applicability of a series of monitoring applications. Those types of sensors are interconnected by ad hoc error-prone wireless links, imposing stringent restrictions on available bandwidth, end-to-end delay and packet error rates. In such context, multimedia coding is required for data compression and error-resilience, also ensuring energy preservation over the path(s) toward the sink and improving the end-to-end perceptual quality of the received media. Cross-layer optimization may enhance the expected efficiency of VSNs applications, disrupting the conventional information flow of the protocol layers. When the inner characteristics of the multimedia coding techniques are exploited by cross-layer protocols and architectures, higher efficiency may be obtained in visual sensor networks. This paper surveys recent research on multimedia-based cross-layer optimization, presenting the proposed strategies and mechanisms for transmission rate adjustment, congestion control, multipath selection, energy preservation and error recovery. We note that many multimedia-based cross-layer optimization solutions have been proposed in recent years, each one bringing a wealth of contributions to visual sensor networks. PMID:22163908

  7. Progress in the development of a video-based wind farm simulation technique

    OpenAIRE

    Robotham, AJ

    1992-01-01

    The progress in the development of a video-based wind farm simulation technique is reviewed. While improvements have been achieved in the quality of the composite picture created by combining computer generated animation sequences of wind turbines with background scenes of the wind farm site, extending the technique to include camera movements has proved troublesome.

  8. Development and validation of a prediction model for long-term sickness absence based on occupational health survey variables.

    Science.gov (United States)

    Roelen, Corné; Thorsen, Sannie; Heymans, Martijn; Twisk, Jos; Bültmann, Ute; Bjørner, Jakob

    2018-01-01

    The purpose of this study is to develop and validate a prediction model for identifying employees at increased risk of long-term sickness absence (LTSA), by using variables commonly measured in occupational health surveys. Based on the literature, 15 predictor variables were retrieved from the DAnish National working Environment Survey (DANES) and included in a model predicting incident LTSA (≥4 consecutive weeks) during 1-year follow-up in a sample of 4000 DANES participants. The 15-predictor model was reduced by backward stepwise statistical techniques and then validated in a sample of 2524 DANES participants, not included in the development sample. Identification of employees at increased LTSA risk was investigated by receiver operating characteristic (ROC) analysis; the area-under-the-ROC-curve (AUC) reflected discrimination between employees with and without LTSA during follow-up. The 15-predictor model was reduced to a 9-predictor model including age, gender, education, self-rated health, mental health, prior LTSA, work ability, emotional job demands, and recognition by the management. Discrimination by the 9-predictor model was significant (AUC = 0.68; 95% CI 0.61-0.76), but not practically useful. A prediction model based on occupational health survey variables identified employees with an increased LTSA risk, but should be further developed into a practically useful tool to predict the risk of LTSA in the general working population. Implications for rehabilitation Long-term sickness absence risk predictions would enable healthcare providers to refer high-risk employees to rehabilitation programs aimed at preventing or reducing work disability. A prediction model based on health survey variables discriminates between employees at high and low risk of long-term sickness absence, but discrimination was not practically useful. Health survey variables provide insufficient information to determine long-term sickness absence risk profiles. There is a need for
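The discrimination statistic reported in this abstract (AUC) can be computed directly as the Mann-Whitney probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case. The risk scores and outcome labels below are invented for illustration:

```python
def auc(scores, labels):
    """ROC AUC as the probability that a positive case outranks a negative one
    (ties counted as half a win)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

risk = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3]  # hypothetical predicted LTSA risks
ltsa = [1,   1,   0,   1,   0,   0]    # hypothetical observed LTSA outcomes
```

An AUC of 0.5 means no discrimination and 1.0 means perfect ranking, which is why the study's 0.68 is significant but judged not practically useful.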

  9. Location-based activity adviser - a survey study

    NARCIS (Netherlands)

    Lin, Y.; Vries, de B.; Timmermans, H.J.P.

    2009-01-01

    The objective of the research is to explore the potential of a recommendation system that provides information and suggestions on physical activities based on the environment. We aim at employing location-based and mobile technologies to build an activity-adviser system and motivate users to change

  10. Energy-Based Acoustic Source Localization Methods: A Survey

    Directory of Open Access Journals (Sweden)

    Wei Meng

    2017-02-01

    Full Text Available Energy-based source localization is an important problem in wireless sensor networks (WSNs), which has been studied actively in the literature. Numerous localization algorithms, e.g., maximum likelihood estimation (MLE) and nonlinear least squares (NLS) methods, have been reported. In the literature, there are relevant review papers for localization in WSNs, e.g., for distance-based localization. However, not much work related to energy-based source localization is covered in the existing review papers. Energy-based methods are proposed and specially designed for WSNs due to their limited sensor capabilities. This paper aims to give a comprehensive review of these different algorithms for energy-based single and multiple source localization problems, to discuss their merits and demerits, and to point out possible future research directions.
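A minimal sketch of the NLS flavor of energy-based localization, assuming the common inverse-square energy decay model E_i = S / d_i² with unit sensor gains and noise-free readings; the sensor layout, source position and coarse grid search are all invented for illustration:

```python
import numpy as np

# Four sensors at the corners of a 10 x 10 field (hypothetical layout)
sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_src, S = np.array([3.0, 7.0]), 100.0

# Noise-free energy readings under the assumed decay model E_i = S / d_i^2
d2 = ((sensors - true_src) ** 2).sum(axis=1)
energy = S / d2

def locate(sensors, energy, S, grid_step=0.1):
    """Nonlinear least squares by exhaustive grid search over candidate
    source positions (a coarse stand-in for iterative NLS solvers)."""
    best, best_err = None, np.inf
    for x in np.arange(0, 10 + grid_step, grid_step):
        for y in np.arange(0, 10 + grid_step, grid_step):
            d2 = ((sensors - np.array([x, y])) ** 2).sum(axis=1)
            if (d2 == 0).any():          # skip candidates on top of a sensor
                continue
            err = ((energy - S / d2) ** 2).sum()
            if err < best_err:
                best, best_err = np.array([x, y]), err
    return best

est = locate(sensors, energy, S)
```

Real algorithms surveyed in the paper replace the grid search with MLE or gradient-based NLS and must also handle unknown source power, sensor gains and measurement noise.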

  11. Photoacoustic Techniques for Trace Gas Sensing Based on Semiconductor Laser Sources

    Directory of Open Access Journals (Sweden)

    Vincenzo Spagnolo

    2009-12-01

    Full Text Available The paper provides an overview on the use of photoacoustic sensors based on semiconductor laser sources for the detection of trace gases. We review the results obtained using standard, differential and quartz enhanced photoacoustic techniques.

  12. Perceptual evaluation of corpus-based speech synthesis techniques in under-resourced environments

    CSIR Research Space (South Africa)

    Van Niekerk, DR

    2009-11-01

    Full Text Available With the increasing prominence and maturity of corpus-based techniques for speech synthesis, the process of system development has in some ways been simplified considerably. However, the dependence on sufficient amounts of relevant speech data...

  13. Resizing Technique-Based Hybrid Genetic Algorithm for Optimal Drift Design of Multistory Steel Frame Buildings

    Directory of Open Access Journals (Sweden)

    Hyo Seon Park

    2014-01-01

    Full Text Available Since genetic algorithm-based optimization methods are computationally expensive for practical use in the field of structural optimization, a resizing technique-based hybrid genetic algorithm for the drift design of multistory steel frame buildings is proposed to increase the convergence speed of genetic algorithms. To reduce the number of structural analyses required for convergence, a genetic algorithm is combined with a resizing technique, an efficient optimization technique that controls the drift of buildings without repetitive structural analysis. The resizing technique-based hybrid genetic algorithm proposed in this paper is applied to the minimum weight design of three steel frame buildings. To evaluate the performance of the algorithm, the optimum weights, computational times, and generation numbers of the proposed algorithm are compared with those of a plain genetic algorithm. Based on these comparisons, it is concluded that the hybrid genetic algorithm shows clear improvements in convergence properties.
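The resizing-plus-GA idea can be sketched on a toy problem: minimize total member "weight" under a drift-like constraint, with a resizing repair step that scales violating designs back to feasibility without any re-analysis. The constraint form, member count and all GA parameters below are invented and stand in for the paper's actual structural model:

```python
import random

N, LIMIT = 5, 10.0  # hypothetical member count and drift limit

def drift(sizes):
    # Toy drift proxy: smaller members -> larger drift
    return sum(1.0 / s for s in sizes)

def resize(sizes):
    """Repair step: scale all members up so the drift limit is met exactly,
    avoiding a repeated 'structural analysis' of the individual."""
    d = drift(sizes)
    return [s * d / LIMIT for s in sizes] if d > LIMIT else sizes

def weight(sizes):
    return sum(sizes)

def ga(pop_size=30, gens=60, seed=0):
    random.seed(seed)
    pop = [[random.uniform(0.1, 2.0) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        pop = [resize(ind) for ind in pop]          # repair before selection
        pop.sort(key=weight)
        parents = pop[:pop_size // 2]               # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # averaging crossover
            child[random.randrange(N)] *= random.uniform(0.8, 1.2)  # mutation
            children.append(child)
        pop = parents + children
    return min((resize(ind) for ind in pop), key=weight)
```

For this toy constraint the optimum is all members equal at 0.5 (total weight 2.5); the repair step keeps every evaluated design feasible, so selection pressure goes entirely into weight reduction, which mirrors the convergence speedup claimed in the abstract.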

  14. A Comparative Analysis of Transmission Control Protocol Improvement Techniques over Space-Based Transmission Media

    National Research Council Canada - National Science Library

    Lawson, Joseph M

    2006-01-01

    ... justification for the implementation of a given enhancement technique. The research questions were answered through model and simulation of a satellite transmission system via a Linux-based network topology...

  15. Comparative Study of Retinal Vessel Segmentation Based on Global Thresholding Techniques

    Directory of Open Access Journals (Sweden)

    Temitope Mapayi

    2015-01-01

    Full Text Available Due to noise from uneven contrast and illumination during acquisition process of retinal fundus images, the use of efficient preprocessing techniques is highly desirable to produce good retinal vessel segmentation results. This paper develops and compares the performance of different vessel segmentation techniques based on global thresholding using phase congruency and contrast limited adaptive histogram equalization (CLAHE for the preprocessing of the retinal images. The results obtained show that the combination of preprocessing technique, global thresholding, and postprocessing techniques must be carefully chosen to achieve a good segmentation performance.
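    As a minimal, hypothetical illustration of the global-thresholding stage (not the paper's exact pipeline, which also uses phase congruency and CLAHE preprocessing), Otsu's method picks the threshold that maximizes between-class variance of the pixel histogram:

```python
def otsu_threshold(pixels, levels=256):
    """Return the global threshold that maximizes between-class variance (Otsu)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg = 0.0
    weight_bg = 0
    best_thresh, best_var = 0, -1.0
    for t in range(levels):
        weight_bg += hist[t]
        if weight_bg == 0:
            continue
        weight_fg = total - weight_bg
        if weight_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (sum_all - sum_bg) / weight_fg
        var_between = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_thresh = var_between, t
    return best_thresh

# Bimodal toy "image": dark background intensities and bright vessel intensities.
pixels = [20, 22, 25, 30, 28, 200, 210, 205, 215, 198]
t = otsu_threshold(pixels)
mask = [1 if p > t else 0 for p in pixels]  # crude vessel mask
```

    In a real retinal pipeline the thresholding would be preceded by CLAHE (or phase congruency) and followed by the postprocessing the abstract mentions; this sketch shows only the global-threshold step itself.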

  16. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    International Nuclear Information System (INIS)

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-01-01

    In an electronic tongue, preprocessing on raw data precedes pattern analysis and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  17. Developing cloud-based Business Process Management (BPM): a survey

    Science.gov (United States)

    Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh

    2018-03-01

    In today’s highly competitive business environment, modern enterprises face difficulties in cutting unnecessary costs, eliminating waste and delivering greater benefits to the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, this article applies cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically scalable resources over the internet as an IT service. Cloud-based BPM services address common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.

  18. A Survey study on design procedure of Seismic Base Isolation ...

    African Journals Online (AJOL)

    Michael Horsfall

    Base Isolation Systems that is flexible approach to decrease the potential damage. In this ... In addition, we analyze the seismic responses of isolated structures. The seismic ..... Equation 3.7, is examined; it is realized that the inequality ...

  19. A survey on the status of ATM based LAN

    International Nuclear Information System (INIS)

    Yang, Sung Woon; Kang, Soon Ju

    1996-03-01

    This report presents the technical status of ATM (Asynchronous Transfer Mode) as a new high speed data communication method. Since the FDDI (Fiber Distributed Data Interface) backbone was installed in September 1995, it has been used as the main network structure of KAERI. However, recently a high speed, multimedia data communication environment is being required to accommodate the recent trend of network usage in KAERI. For example, the rapid growth of Internet usage and increased activities of information retrieval systems on KAERI-Net demand a more effective network system. Chapter 1 discusses the status of KAERI-Net and the selection criteria for a network model according to the national plan of super high speed network structure. In Chapter 2, the basic concepts of ATM such as communication method and communication structure are studied, and Chapter 3 presents the overall concepts of the standard model of ATM. In Chapter 4, we survey the recent trend of technical development of ATM and analyze the status of ATM technology. As a concluding remark, Chapter 5 discusses the criteria and check points for the optimal design of the KAERI-Net backbone. This report will be used as a technical reference for the installation of ATM in KAERI-Net. 10 tabs., 32 figs., 11 refs. (Author)

  20. Teaching research methods in nursing using Aronson's Jigsaw Technique. A cross-sectional survey of student satisfaction.

    Science.gov (United States)

    Leyva-Moral, Juan M; Riu Camps, Marta

    2016-05-01

    To adapt nursing studies to the European Higher Education Area, new teaching methods have been included that assign maximum importance to student-centered learning and collaborative work. The Jigsaw Technique is based on collaborative learning and everyone in the group must play their part, because each student's mark depends on the other students'. Home group members are given the responsibility to become experts in a specific area of knowledge. Experts meet together to reach an agreement and improve skills. Finally, experts return to their home groups to share all their findings. The aim of this study was to evaluate nursing student satisfaction with the Jigsaw Technique used in the context of a compulsory course in research methods for nursing. A cross-sectional study was conducted using a self-administered anonymous questionnaire administered to students who completed the Research Methods course during the 2012-13 and 2013-14 academic years. The questionnaire was developed taking into account the learning objectives, competencies and skills that should be acquired by students, as described in the course syllabus. The responses were compared by age group (younger or older than 22 years). A total of 89.6% of nursing students under 22 years believed that this methodology helped them to develop teamwork, while this figure was 79.6% in older students. Nursing students also believed it helped them to work independently, with differences according to age, 79.7% and 58% respectively (p=0.010). Students disagreed with the statement "The Jigsaw Technique involves little workload", with percentages of 88.5% in the group under 22 years and 80% in older students. Most believed that this method should not be employed in upcoming courses, although there were differences by age, with 44.3% of the younger group being against it, compared with 62% of the older group (p=0.037). The method was not highly valued by students, mainly by those older than 22 years, who concluded that they did not learn

  1. Developing a hybrid dictionary-based bio-entity recognition technique

    Science.gov (United States)

    2015-01-01

    Background Bio-entity extraction is a pivotal component for information extraction from biomedical literature. The dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. Methods This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities such as Part of Speech (POS) expansion, stemming, and the exploitation of contextual cues to further improve the performance. Results The experimental results show that the proposed technique achieves the best or at least equivalent performance among compared techniques, GENIA, MESH, UMLS, and combinations of these three resources in F-measure. Conclusions The results imply that the performance of dictionary-based extraction techniques is largely influenced by the information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance with three different dictionaries in precision whereas the context-only technique achieves a high-end performance with three different dictionaries in recall. PMID:26043907

  2. Developing a hybrid dictionary-based bio-entity recognition technique.

    Science.gov (United States)

    Song, Min; Yu, Hwanjo; Han, Wook-Shin

    2015-01-01

    Bio-entity extraction is a pivotal component for information extraction from biomedical literature. The dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities such as Part of Speech (POS) expansion, stemming, and the exploitation of contextual cues to further improve the performance. The experimental results show that the proposed technique achieves the best or at least equivalent performance among compared techniques, GENIA, MESH, UMLS, and combinations of these three resources in F-measure. The results imply that the performance of dictionary-based extraction techniques is largely influenced by the information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance with three different dictionaries in precision whereas the context-only technique achieves a high-end performance with three different dictionaries in recall.
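    Edit-distance matching of mentions against a dictionary, as described above, can be sketched as follows; the tiny dictionary, the distance cutoff, and the helper names are illustrative assumptions, not taken from the paper:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming (one row at a time)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def match_entity(mention, dictionary, max_dist=2):
    """Return the dictionary entry closest to `mention`, or None if all are too far."""
    best = min(dictionary, key=lambda term: edit_distance(mention.lower(), term.lower()))
    return best if edit_distance(mention.lower(), best.lower()) <= max_dist else None

# Toy gene/protein dictionary (illustrative entries only).
dictionary = ["interleukin-2", "p53", "NF-kappaB"]
print(match_entity("interleukin 2", dictionary))  # matches despite the missing hyphen
```

    A production system would additionally apply the POS expansion, stemming, and contextual-cue steps the abstract mentions before and after this fuzzy lookup.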

  3. Feathering effect detection and artifact agglomeration index-based video deinterlacing technique

    Science.gov (United States)

    Martins, André Luis; Rodrigues, Evandro Luis Linhari; de Paiva, Maria Stela Veludo

    2018-03-01

    Several video deinterlacing techniques have been developed, and each one performs better under certain conditions. Occasionally, even the most modern deinterlacing techniques create frames of worse quality than primitive deinterlacing processes. This paper shows that the final image quality can be improved by combining different types of deinterlacing techniques. The proposed strategy is able to select between two types of deinterlaced frames and, if necessary, make local corrections of the defects. This decision is based on an artifact agglomeration index obtained from a feathering effect detection map. Starting from a deinterlaced frame produced by the "interfield average" method, the defective areas are identified and, if deemed appropriate, replaced by pixels generated through the "edge-based line average" method. Test results show that the proposed technique produces video frames of higher quality than any single deinterlacing technique, by taking the best from intra- and interfield methods.
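    The "edge-based line average" interpolation mentioned above can be sketched as follows; this is the common ELA formulation (three candidate directions), which may differ in detail from the paper's variant:

```python
def edge_based_line_average(above, below):
    """Interpolate one missing scanline from its two neighbours (ELA).

    For each pixel, the direction (left diagonal, vertical, right diagonal)
    with the smallest luminance difference is chosen and averaged.
    """
    width = len(above)
    out = []
    for x in range(width):
        xl, xr = max(x - 1, 0), min(x + 1, width - 1)  # clamp at frame edges
        candidates = [
            (abs(above[xl] - below[xr]), (above[xl] + below[xr]) // 2),  # 45 deg
            (abs(above[x] - below[x]),   (above[x] + below[x]) // 2),    # vertical
            (abs(above[xr] - below[xl]), (above[xr] + below[xl]) // 2),  # 135 deg
        ]
        out.append(min(candidates)[1])  # smallest difference wins
    return out

# A vertical edge: both neighbouring lines agree, so the output follows them.
line = edge_based_line_average([10, 10, 200, 200], [10, 10, 200, 200])
```

    The paper's contribution is not ELA itself but deciding, via the feathering-detection map and agglomeration index, *where* to substitute such ELA pixels into the interfield-averaged frame.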

  4. A new simple technique for improving the random properties of chaos-based cryptosystems

    Science.gov (United States)

    Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.

    2018-03-01

    A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into the short period cycles caused by digitization. In order to test this technique, a stream cipher based on a Skew Tent Map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have shown that our method can considerably improve the randomness of the generated keystreams. Incorporating the randomness-enhancement technique required only 41 extra slices, proving that, besides being effective, this method is also efficient in terms of area and hardware resources.
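    A minimal sketch of keystream generation with a skew tent map follows; the initial condition and map parameter are illustrative, and the paper's randomness-enhancement step (which breaks the short cycles that finite-precision arithmetic introduces) is deliberately not reproduced here:

```python
def skew_tent_keystream(x0, gamma, nbits):
    """Generate one keystream bit per iteration of the skew tent map.

    x0: initial condition in (0, 1); gamma: skew point in (0, 1).
    A bit is emitted by thresholding the state at 0.5. Without a
    perturbation mechanism, finite-precision iteration can fall into
    short cycles -- the problem the paper's technique addresses.
    """
    x = x0
    bits = []
    for _ in range(nbits):
        x = x / gamma if x < gamma else (1 - x) / (1 - gamma)
        bits.append(1 if x >= 0.5 else 0)
    return bits

stream = skew_tent_keystream(x0=0.3141592, gamma=0.4999, nbits=64)
```

    A hardware implementation like the paper's would run this recurrence in fixed point on the FPGA and XOR the resulting keystream with the plaintext.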

  5. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    Science.gov (United States)

    Chockalingam, Letchumanan

    2005-01-01

    The data of the Gunung Ledang region of Malaysia acquired through LANDSAT are considered to map certain hydrogeological features. To map these significant features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods are evaluated in terms of their validity in properly isolating features of hydrogeological interest. As these techniques take advantage of the spectral aspects of the images, they have several limitations in meeting the objectives. To discuss these limitations, a morphological transformation, which considers the structural rather than the spectral aspects of the image, is applied, providing comparisons between the results derived from the spectrally based and the structurally based filtering techniques.

  6. A citizen science based survey method for estimating the density of urban carnivores

    Science.gov (United States)

    Baker, Rowenna; Charman, Naomi; Karlsson, Heidi; Yarnell, Richard W.; Mill, Aileen C.; Smith, Graham C.; Tolhurst, Bryony A.

    2018-01-01

    Globally there are many examples of synanthropic carnivores exploiting growth in urbanisation. As carnivores can come into conflict with humans and are potential vectors of zoonotic disease, assessing densities in suburban areas and identifying factors that influence them are necessary to aid management and mitigation. However, fragmented, privately owned land restricts the use of conventional carnivore surveying techniques in these areas, requiring the development of novel methods. We present a method that combines questionnaire distribution to residents with field surveys and GIS, to determine the relative density of two urban carnivores in England, Great Britain. We determined the density of: red fox (Vulpes vulpes) social groups in 14 approximately 1 km2 suburban areas in 8 different towns and cities; and Eurasian badger (Meles meles) social groups in three suburban areas of one city. Average relative fox group density (FGD) was 3.72 km-2, which was double the estimates for cities with resident foxes in the 1980s. Density was comparable to an alternative estimate derived from trapping and GPS-tracking, indicating the validity of the method. However, FGD did not correlate with a national dataset based on fox sightings, indicating the unreliability of the national data for determining actual densities or extrapolating a national population estimate. Using species-specific clustering units that reflect social organisation, the method was additionally applied to suburban badgers to derive relative badger group density (BGD) for one city (Brighton, 2.41 km-2). We demonstrate that citizen science approaches can effectively obtain data to assess suburban carnivore density; however, publicly derived national datasets need to be locally validated before extrapolations can be undertaken. The method we present for assessing densities of foxes and badgers in British towns and cities is also adaptable to other urban carnivores elsewhere. However this transferability is contingent on

  7. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    Science.gov (United States)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON, with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.
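    As a toy illustration of feature-based intrusion detection of the kind surveyed above: the abstract names bandwidth and route length as features, but the nearest-centroid classifier and every numeric value below are assumptions for the sketch, not the paper's models or data:

```python
def train_centroids(samples):
    """Nearest-centroid classifier: average the feature vectors per label."""
    sums, counts = {}, {}
    for feats, label in samples:
        vec = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            vec[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in vec] for lbl, vec in sums.items()}

def classify(centroids, feats):
    """Label a connection request by its closest centroid (squared Euclidean)."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(feats, c))
    return min(centroids, key=lambda lbl: dist(centroids[lbl]))

# (bandwidth in Gb/s, route length in hops) -- illustrative values only.
train = [((10.0, 3), "normal"), ((9.5, 4), "normal"),
         ((0.5, 12), "intrusion"), ((0.8, 11), "intrusion")]
model = train_centroids(train)
```

    Real SDON detectors would use richer feature sets and the trained models the paper evaluates; the point here is only the shape of the pipeline: extract features per control-plane request, then classify.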

  8. Community-based survey versus sentinel site sampling in ...

    African Journals Online (AJOL)

    rural children. Implications for nutritional surveillance and the development of nutritional programmes. G. c. Solarsh, D. M. Sanders, C. A. Gibson, E. Gouws. A study of the anthropometric status of under-5-year-olds was conducted in the Nqutu district of Kwazulu by means of a representative community-based sample and.

  9. Questionnaire-based survey of parturition in the queen

    NARCIS (Netherlands)

    Musters, J.; de Gier, J.; Kooistra, H.S.; Okkens, A.C.

    The lack of scientific data concerning whether parturition in the queen proceeds normally or not may prevent veterinarians and cat owners from recognizing parturition problems in time. A questionnaire-based study of parturition in 197 queens was performed to determine several parameters of

  10. A survey on vision-based human action recognition

    NARCIS (Netherlands)

    Poppe, Ronald Walter

    Vision-based human action recognition is the process of labeling image sequences with action labels. Robust solutions to this problem have applications in domains such as visual surveillance, video retrieval and human–computer interaction. The task is challenging due to variations in motion

  11. A Survey study on design procedure of Seismic Base Isolation ...

    African Journals Online (AJOL)

    Adding shear walls or braced frames can decrease the potential damage caused by earthquakes.We can isolate the structures from the ground using the Seismic Base Isolation Systems that is flexible approach to decrease the potential damage. In this research we present information on the design procedure of seismic ...

  12. Visual servoing in medical robotics: a survey. Part I: endoscopic and direct vision imaging - techniques and applications.

    Science.gov (United States)

    Azizian, Mahdi; Khoshnam, Mahta; Najmaei, Nima; Patel, Rajni V

    2014-09-01

    Intra-operative imaging is widely used to provide visual feedback to a clinician when he/she performs a procedure. In visual servoing, surgical instruments and parts of tissue/body are tracked by processing the acquired images. This information is then used within a control loop to manoeuvre a robotic manipulator during a procedure. A comprehensive search of electronic databases was completed for the period 2000-2013 to provide a survey of the visual servoing applications in medical robotics. The focus is on medical applications where image-based tracking is used for closed-loop control of a robotic system. Detailed classification and comparative study of various contributions in visual servoing using endoscopic or direct visual images are presented and summarized in tables and diagrams. The main challenges in using visual servoing for medical robotic applications are identified and potential future directions are suggested. 'Supervised automation of medical robotics' is found to be a major trend in this field. Copyright © 2013 John Wiley & Sons, Ltd.

  13. Low Power LDPC Code Decoder Architecture Based on Intermediate Message Compression Technique

    Science.gov (United States)

    Shimizu, Kazunori; Togawa, Nozomu; Ikenaga, Takeshi; Goto, Satoshi

    Reducing the power dissipation of LDPC code decoders is a major challenge in applying them to practical digital communication systems. In this paper, we propose a low power LDPC code decoder architecture based on an intermediate message-compression technique with the following features: (i) an intermediate message-compression technique enables the decoder to reduce the required memory capacity and write power dissipation; (ii) a clock-gated shift-register-based intermediate message memory architecture enables the decoder to decompress the compressed messages in a single clock cycle while reducing the read power dissipation. The combination of these two techniques enables the decoder to reduce power dissipation while maintaining decoding throughput. Simulation results show that the proposed architecture improves power efficiency by up to 52% and 18% compared to that of decoders based on the overlapped schedule and the rapid convergence schedule, respectively, without the proposed techniques.
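    The abstract does not detail the compression scheme, but a closely related, well-known observation in min-sum LDPC decoding is that all outgoing check-node messages share just two magnitudes (the smallest and second-smallest incoming magnitude), so the message set can be stored in compressed form. A minimal sketch under that assumed min-sum setting, not the paper's exact scheme:

```python
def compress(msgs):
    """Store a check node's incoming messages as (min1, min2, argmin, signs):
    two magnitudes instead of one per edge. Requires at least two messages."""
    mags = [abs(m) for m in msgs]
    idx = mags.index(min(mags))
    min1 = mags[idx]
    min2 = min(mags[:idx] + mags[idx + 1:])
    signs = [m < 0 for m in msgs]
    return min1, min2, idx, signs

def decompress(comp, n):
    """Reconstruct the n outgoing min-sum check-node messages."""
    min1, min2, idx, signs = comp
    parity = sum(signs) % 2
    out = []
    for j in range(n):
        mag = min2 if j == idx else min1   # exclude edge j's own magnitude
        neg = parity ^ signs[j]            # sign product of all *other* edges
        out.append(-mag if neg else mag)
    return out
```

    In hardware, storing only (min1, min2, argmin, signs) per check node is what shrinks the intermediate message memory and its write power; the single-cycle decompression in the paper corresponds to reconstructing the per-edge messages on read-out.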

  14. Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation

    Science.gov (United States)

    Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, EfaMaydhona; Musa, Ardiansyah

    2018-03-01

    To support the growing development of wireless geolocation as a key future technology, this paper derives a theoretical bound, i.e., the Cramer Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The theoretical bound derivation is crucially important for evaluating whether the energy-efficient RSS-based factor graph technique is effective, and it opens opportunities for further innovation of the technique. The CRLB is derived using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation results show that the derived CRLB is the tightest bound, exhibiting the lowest root mean squared error (RMSE) curve compared to the RMSE curve of the RSS-based factor graph geolocation technique. Hence, the derived CRLB serves as the lower bound for the energy-efficient RSS-based factor graph wireless geolocation technique.
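    For orientation, the standard CRLB for RSS-based positioning under the common log-distance shadowing model can be written as follows; this generic form assumes Gaussian shadowing with path-loss exponent η, and does not reproduce the paper's factor-graph-specific Jacobian derivation:

```latex
% RSS measurement at anchor a_i for a device at p = (x, y):
P_i = P_0 - 10\,\eta\,\log_{10}\!\frac{d_i}{d_0} + w_i,
\qquad d_i = \lVert \mathbf{p}-\mathbf{a}_i\rVert,
\quad w_i \sim \mathcal{N}(0,\sigma^2)

% Fisher information matrix over N anchors:
\mathbf{F}(\mathbf{p}) = \left(\frac{10\,\eta}{\sigma\ln 10}\right)^{\!2}
\sum_{i=1}^{N}\frac{(\mathbf{p}-\mathbf{a}_i)(\mathbf{p}-\mathbf{a}_i)^{\mathsf{T}}}{d_i^{4}}

% CRLB on the RMSE of any unbiased position estimator \hat{\mathbf{p}}:
\mathrm{RMSE} \;\ge\; \sqrt{\operatorname{tr}\,\mathbf{F}^{-1}(\mathbf{p})}
```

    The paper's derived CRLB plays the role of the last inequality: the RMSE curve of the factor-graph technique is compared against this floor in the simulations.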

  15. LITERATURE SURVEY ON EXISTING POWER SAVING ROUTING METHODS AND TECHNIQUES FOR INCREASING NETWORK LIFE TIME IN MANET

    Directory of Open Access Journals (Sweden)

    K Mariyappan

    2017-06-01

    Full Text Available Mobile ad hoc network (MANET) is a special type of wireless network in which a collection of wireless mobile devices (also called nodes) dynamically forms a temporary network without the need for any pre-existing network infrastructure or centralized administration. Currently, MANETs play a significant role in university campuses, advertisement, emergency response, disaster recovery, military use on battlefields, disaster management scenarios, sensor networks, and so on. However, wireless network devices, especially in ad hoc networks, are typically battery-powered. Thus, energy efficiency is a critical issue for battery-powered mobile devices in ad hoc networks: failure of a node or link requires re-routing and establishing a new path from source to destination, which causes extra energy consumption of nodes and sparse network connectivity, leading to a higher likelihood of network partition. Routing based on energy-related parameters is one of the important solutions to extend node lifetime and reduce the energy consumption of the network. In this paper, a detailed literature survey of existing energy-efficient routing methods is presented, and the methods are compared for their performance under different conditions. The results show that both broadcast schemes and energy-aware metrics have great potential in overcoming the broadcast storm problem associated with flooding. However, the performance of these approaches relies on either the appropriate selection of the broadcast decision parameter or an energy-efficient path. In the earlier proposed broadcast methods, the forwarding probability is selected based on a fixed probability or the number of neighbors regardless of node battery capacity, whereas in energy-aware schemes an energy-inefficient node could be part of an established path.
Therefore, in an attempt to remedy the paucity of research and to address the gaps identified in this area, a study

  16. Nasal base narrowing of the caucasian nose through the cerclage technique

    Directory of Open Access Journals (Sweden)

    Mocellin, Marcos

    2010-06-01

    Full Text Available Introduction: Several techniques can be performed to reduce (narrow) the nasal base, such as vestibular and columellar skin resection, elliptical skin resection of the narial lip, skin undermining and advancement (the V-Y technique of Bernstein), and the use of cerclage sutures at the nasal base. Objective: To evaluate the cerclage technique performed at the nasal base, through endonasal rhinoplasty without delivery of the basic technique, in the Caucasian nose, reducing the inter-alar distance and correcting alar flare, with consequent improvement in nasal harmony with the whole face. Methods: A retrospective analysis of the clinical records and photographs of 43 patients in whom cerclage of the nasal base was performed, with resection of a skin ellipse in the region of the vestibule and the nasal base (modified Weir technique), using colorless mononylon® 4-0 with a straight cutting needle. The study was conducted in 2008 and 2009 at the Hospital of the Paraná Institute of Otolaryngology - IPO in Curitiba, Paraná - Brazil. Patients had a follow-up ranging from 7 to 12 months. Results: In 100% of cases an improvement in nasal harmony was achieved by decreasing the inter-alar distance. Conclusion: Cerclage with minimal resection of vestibular skin and of the nasal base is an effective method for narrowing the nasal base in the Caucasian nose, with predictable results, and is easy to perform.

  17. Problem Based Internship in Surveying and Planning Curricula

    DEFF Research Database (Denmark)

    Sørensen, Esben Munk; Enemark, Stig

    2006-01-01

    Programme has been divided into a 3 year Bachelor-Programme and after this a 2 year Master-Programme. It has been done as a part of a governmental policy to adapt and fulfil the Bologna-charter in all University Curricula in Denmark. A new element in the Master Programme is a problem-based internship...... economy and – leadership”. This course is organized as an e-Learning course and the students have to develop and document their skills in following distance e-learning courses. It will prepare them to follow and organize self-paced learning in a virtual environment, which will develop their capacity for life...... by the society to serve the community with still more new knowledge and technology transfer from the international research community. The internship and still more real-world-influenced problem-based learning through thesis writing will be an important bridge builder in the following years....

  18. A LITERATURE SURVEY ON RECOMMENDATION SYSTEM BASED ON SENTIMENTAL ANALYSIS

    OpenAIRE

    Achin Jain; Vanita Jain; Nidhi Kapoor

    2016-01-01

    Recommender systems have grown to be a critical research subject since the emergence of the first paper on collaborative filtering in the nineties. Although academic studies on recommender systems have expanded extensively over the last 10 years, there are deficiencies in the comprehensive literature review and classification of that research. Because of this, we reviewed articles on recommender systems, and then classified those based on sentiment analysis. The articles are...

  19. SURVEY REGARDING THE ULTRAFILTRATION OF PROTEINES THROUGH MEMBRANE BASED PROCEDURES

    Directory of Open Access Journals (Sweden)

    CAMELIA HODOSAN

    2008-05-01

    Full Text Available This work is based on examples that emphasize the complexity of the protein ultrafiltration process, with particular attention to the first 10-15 minutes of ultrafiltration. Knowledge of the factors that influence the separation of proteins through ultrafiltration will allow choosing the right type of membrane, the repeated use of the same membrane, and operation under mechanical and chemical conditions adequate to the ultrafiltration system when separating a protein of a certain molecular weight.

  20. Modelers' perception of mathematical modeling in epidemiology: a web-based survey.

    Directory of Open Access Journals (Sweden)

    Gilles Hejblum

    Full Text Available BACKGROUND: Mathematical modeling in epidemiology (MME is being used increasingly. However, there are many uncertainties in terms of definitions, uses and quality features of MME. METHODOLOGY/PRINCIPAL FINDINGS: To delineate the current status of these models, a 10-item questionnaire on MME was devised. Proposed via an anonymous internet-based survey, the questionnaire was completed by 189 scientists who had published in the domain of MME. A small minority (18% of respondents claimed to have in mind a concise definition of MME. Some techniques were identified by the researchers as characterizing MME (e.g. Markov models, while others, at the same level of mathematical sophistication, were not (e.g. Cox regression. The researchers' opinions were also contrasted regarding the potential applications of MME, perceived as highly relevant for providing insight into complex mechanisms and less relevant for identifying causal factors. The quality criteria were those of good science and were not related to the size and nature of the public health problems addressed. CONCLUSIONS/SIGNIFICANCE: This study shows that perceptions of the nature, uses and quality criteria of MME are contrasted, even among the very community of published authors in this domain. Nevertheless, MME is an emerging discipline in epidemiology and this study underlines that it is associated with specific areas of application and methods. The development of this discipline is likely to deserve a framework providing recommendations and guidance at various steps of the studies, from design to report.