WorldWideScience

Sample records for surveying methods and processes

  1. Remote sensing, airborne radiometric survey and aeromagnetic survey data processing and analysis

    International Nuclear Information System (INIS)

    Dong Xiuzhen; Liu Dechang; Ye Fawang; Xuan Yanxiu

    2009-01-01

    Taking remote sensing, airborne radiometric and aeromagnetic survey data as an example, the authors describe the basic approach to remote sensing data processing, spectral feature analysis and the processing methods adopted. They also explore combining remote sensing data with the processing of airborne radiometric and aeromagnetic survey data, and analyze the geological significance of the processed images. The results are useful not only for geological environment research and uranium prospecting in the study area, but also as a reference for applications in other areas. (authors)

  2. Survey: interpolation methods for whole slide image processing.

    Science.gov (United States)

    Roszkowiak, L; Korzynska, A; Zak, J; Pijanowska, D; Swiderska-Chadaj, Z; Markiewicz, T

    2017-02-01

    Evaluating whole slide images of histological and cytological samples is used in pathology for diagnostics, grading and prognosis. It is often necessary to rescale whole slide images of a very large size. Image resizing is one of the most common applications of interpolation. We collect the advantages and drawbacks of nine interpolation methods, and as a result of our analysis, we try to select one interpolation method as the preferred solution. To compare the performance of interpolation methods, test images were scaled and then rescaled to the original size using the same algorithm. The modified image was compared to the original image in various aspects. The time needed for calculations and the results of quantification performance on modified images were also compared. For evaluation purposes, we used four general test images and 12 specialized biological immunohistochemically stained tissue sample images. The purpose of this survey is to determine which method of interpolation is the best to resize whole slide images, so they can be further processed using quantification methods. As a result, the interpolation method has to be selected depending on the task involving whole slide images. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
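
    A minimal sketch of the round-trip evaluation protocol described above: an image is scaled down and back up with the same interpolation method, and the result is compared with the original. OpenCV and NumPy are assumed to be available, the synthetic test image stands in for a whole-slide tile, and PSNR is an illustrative metric rather than the survey's exact set of comparison measures.

```python
import cv2
import numpy as np

def psnr(a: np.ndarray, b: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit images."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

def rescale_roundtrip(img: np.ndarray, method: int, factor: float = 0.25) -> np.ndarray:
    """Downscale and upscale back to the original size with one interpolation method."""
    h, w = img.shape[:2]
    small = cv2.resize(img, (int(w * factor), int(h * factor)), interpolation=method)
    return cv2.resize(small, (w, h), interpolation=method)

if __name__ == "__main__":
    # Synthetic stand-in for a whole-slide image tile (hypothetical data).
    rng = np.random.default_rng(0)
    image = rng.integers(0, 256, size=(512, 512, 3), dtype=np.uint8)
    methods = {
        "nearest": cv2.INTER_NEAREST,
        "bilinear": cv2.INTER_LINEAR,
        "bicubic": cv2.INTER_CUBIC,
        "lanczos4": cv2.INTER_LANCZOS4,
    }
    for name, m in methods.items():
        restored = rescale_roundtrip(image, m)
        print(f"{name:9s} PSNR = {psnr(image, restored):.2f} dB")
```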

  3. A method to automate the radiological survey process

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.

    1987-01-01

    This document describes USRADS, the Ultrasonic Ranging and Data System: a hardware/software ranging and data transmission system that provides real-time position data and combines it with other portable instrument measurements. Live display of position data and onsite data reduction, presentation, and formatting for reports and automatic transfer into databases are among the unusual attributes of USRADS. Approximately 25% of any survey-to-survey-report process is dedicated to data recording and formatting, which USRADS eliminates. Cost savings are realized by eliminating manual transcription of instrument readout in the field and clerical formatting of data in the office. Increased data reliability is realized by ensuring complete survey coverage of an area in the field, by eliminating mathematical errors in the conversion of instrument readout to unit concentration, and by eliminating errors associated with transcribing data from the field into report format. USRADS can be adapted to measure other types of pollutants or physical/chemical/geological/biological conditions for which portable instrumentation exists. 2 refs., 2 figs

  4. Comparison of Satellite Surveying to Traditional Surveying Methods for the Resources Industry

    Science.gov (United States)

    Osborne, B. P.; Osborne, V. J.; Kruger, M. L.

    Modern ground-based survey methods involve detailed survey, which provides three-space co-ordinates for surveyed points to a high level of accuracy. The instruments are operated by surveyors, who process the raw results to create survey location maps for the subject of the survey. Such surveys are conducted for a location or region and referenced to the earth global co-ordinate system with global positioning system (GPS) positioning. Due to this referencing, the survey is only as accurate as the GPS reference system. Satellite survey remote sensing utilises satellite imagery which has been processed using commercial geographic information system software. Three-space co-ordinate maps are generated, with an accuracy determined by the datum position accuracy and optical resolution of the satellite platform. This paper presents a case study which compares topographic surveying undertaken by traditional survey methods with satellite surveying for the same location. The purpose of this study is to assess the viability of satellite remote sensing for surveying in the resources industry. The case study involves a topographic survey of a dune field for a prospective mining project area in Pakistan. This site has been surveyed using modern surveying techniques and the results are compared to a satellite survey performed on the same area. Analysis of the results from the traditional survey and from the satellite survey involved a comparison of the derived spatial co-ordinates from each method. In addition, comparisons have been made of costs and turnaround time for both methods. The results of this application of remote sensing are of particular interest for survey in areas with remote and extreme environments, weather extremes, political unrest, and poor travel links, which are commonly associated with mining projects. Such areas frequently suffer language barriers, poor onsite technical support and resources.

  5. Radiological decontamination, survey, and statistical release method for vehicles

    International Nuclear Information System (INIS)

    Goodwill, M.E.; Lively, J.W.; Morris, R.L.

    1996-06-01

    Earth-moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically accommodates fluctuating batch sizes and changing conditions without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been used successfully for one year at the former uranium mill site in Monticello, Utah (a cleanup site regulated under the Comprehensive Environmental Response, Compensation, and Liability Act). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method on Monticello projects has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site.
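
    The sampling frequency described above depends on the expected vehicle throughput and the desired confidence level. Below is a minimal sketch of one way such a frequency could be derived, using the standard zero-failure acceptance-sampling bound; the formula and the numbers are illustrative assumptions, not the procedure actually used at Monticello.

```python
import math
import random

def required_sample_size(max_defect_rate: float, confidence: float) -> int:
    """Smallest n such that, with zero nonconforming samples observed, we are
    `confidence` sure the true nonconforming fraction is below `max_defect_rate`."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - max_defect_rate))

def sampling_fraction(expected_vehicles: int, max_defect_rate: float, confidence: float) -> float:
    """Fraction of decontaminated vehicles to survey in the given period."""
    n = required_sample_size(max_defect_rate, confidence)
    return min(1.0, n / expected_vehicles)

if __name__ == "__main__":
    frac = sampling_fraction(expected_vehicles=400, max_defect_rate=0.05, confidence=0.95)
    print(f"survey roughly {frac:.1%} of decontaminated vehicles")
    # Random selection of vehicles as they complete decontamination (hypothetical ids):
    released = [f"truck-{i}" for i in range(400)]
    surveyed = [v for v in released if random.random() < frac]
    print(f"{len(surveyed)} vehicles selected for radiological survey")
```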

  6. Biological variables for the site survey of surface ecosystems - existing data and survey methods

    International Nuclear Information System (INIS)

    Kylaekorpi, Lasse; Berggren, Jens; Larsson, Mats; Liberg, Maria; Rydgren, Bernt

    2000-06-01

    In the process of selecting a safe and environmentally acceptable location for the deep level repository of nuclear waste, site surveys will be carried out. These site surveys will also include studies of the biota at the site, in order to assure that the chosen site will not conflict with important ecological interests, and to establish a thorough baseline for future impact assessments and monitoring programmes. In preparation for the site survey programme, a review of the variables that need to be surveyed has been conducted. This report contains the review for some of those variables. For each variable, existing data sources and their characteristics are listed. For those variables for which existing data sources are inadequate, suggestions are made for appropriate methods that will enable the establishment of an acceptable baseline. In this report the following variables are reviewed: Fishery, Landscape, Vegetation types, Key biotopes, Species (flora and fauna), Red-listed species (flora and fauna), Biomass (flora and fauna), Water level, water retention time (incl. water body and flow), Nutrients/toxins, Oxygen concentration, Layering, stratification, Light conditions/transparency, Temperature, and Sediment transport (marine environments are excluded from this review). For a major part of the variables, the existing data coverage is most likely insufficient. The temporal and/or geographical resolution is often limited, which means that complementary surveys must be performed during (or before) the site surveys. It is, however, in general difficult to make exact judgements on the extent of existing data, and also to give suggestions for relevant methods to use in the site surveys. This can be finally decided only when the locations for the sites are decided upon. The relevance of the different variables also depends on the environmental characteristics of the sites. Therefore, we suggest that when the survey sites are selected, an additional review is

  7. Biological variables for the site survey of surface ecosystems - existing data and survey methods

    Energy Technology Data Exchange (ETDEWEB)

    Kylaekorpi, Lasse; Berggren, Jens; Larsson, Mats; Liberg, Maria; Rydgren, Bernt [SwedPower AB, Stockholm (Sweden)

    2000-06-01

    In the process of selecting a safe and environmentally acceptable location for the deep level repository of nuclear waste, site surveys will be carried out. These site surveys will also include studies of the biota at the site, in order to assure that the chosen site will not conflict with important ecological interests, and to establish a thorough baseline for future impact assessments and monitoring programmes. In preparation for the site survey programme, a review of the variables that need to be surveyed has been conducted. This report contains the review for some of those variables. For each variable, existing data sources and their characteristics are listed. For those variables for which existing data sources are inadequate, suggestions are made for appropriate methods that will enable the establishment of an acceptable baseline. In this report the following variables are reviewed: Fishery, Landscape, Vegetation types, Key biotopes, Species (flora and fauna), Red-listed species (flora and fauna), Biomass (flora and fauna), Water level, water retention time (incl. water body and flow), Nutrients/toxins, Oxygen concentration, Layering, stratification, Light conditions/transparency, Temperature, and Sediment transport (marine environments are excluded from this review). For a major part of the variables, the existing data coverage is most likely insufficient. The temporal and/or geographical resolution is often limited, which means that complementary surveys must be performed during (or before) the site surveys. It is, however, in general difficult to make exact judgements on the extent of existing data, and also to give suggestions for relevant methods to use in the site surveys. This can be finally decided only when the locations for the sites are decided upon. The relevance of the different variables also depends on the environmental characteristics of the sites. Therefore, we suggest that when the survey sites are selected, an additional review is

  8. Survey of Technetium Analytical Production Methods Supporting Hanford Nuclear Materials Processing

    International Nuclear Information System (INIS)

    TROYER, G.L.

    1999-01-01

    This document provides a historical survey of analytical methods used for measuring 99Tc in nuclear fuel reprocessing materials and wastes at Hanford. Method challenges, including the special sludge matrices tested, are discussed. Special problems and recommendations are presented

  9. Web-based Surveys: Changing the Survey Process

    OpenAIRE

    Gunn, Holly

    2002-01-01

    Web-based surveys are having a profound influence on the survey process. Unlike other types of surveys, Web page design skills and computer programming expertise play a significant role in the design of Web-based surveys. Survey respondents face new and different challenges in completing a Web-based survey. This paper examines the different types of Web-based surveys, the advantages and challenges of using Web-based surveys, the design of Web-based surveys, and the issues of validity, error, ...

  10. Assessing risk of draft survey by AHP method

    Science.gov (United States)

    Xu, Guangcheng; Zhao, Kuimin; Zuo, Zhaoying; Liu, Gang; Jian, Binguo; Lin, Yan; Fan, Yukun; Wang, Fei

    2018-04-01

    The paper assesses the risks of a vessel floating in seawater during a draft survey by using the analytic hierarchy process. On this basis, the paper establishes a draft survey risk index from the viewpoint of draft reading, ballast water, fresh water, the calculation process and so on. The paper then proposes a method for carrying out the risk assessment, illustrated with one concrete example.
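
    A minimal sketch of the AHP weighting step: pairwise comparison judgements on Saaty's 1-9 scale are turned into factor weights via the principal eigenvector, with a consistency check. The factor names follow the abstract, but the judgement values are hypothetical.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray):
    """Principal-eigenvector weights and consistency ratio for an AHP comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = pairwise.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random index
    return w, ci / ri

if __name__ == "__main__":
    factors = ["draft reading", "ballast water", "fresh water", "calculation process"]
    # Pairwise judgements on the 1-9 scale (hypothetical values):
    A = np.array([
        [1.0, 3.0, 5.0, 2.0],
        [1/3, 1.0, 3.0, 1/2],
        [1/5, 1/3, 1.0, 1/4],
        [1/2, 2.0, 4.0, 1.0],
    ])
    w, cr = ahp_weights(A)
    for f, wi in zip(factors, w):
        print(f"{f:20s} weight = {wi:.3f}")
    print(f"consistency ratio = {cr:.3f} (commonly accepted if < 0.10)")
```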

  11. Methods for rapidly processing angular masks of next-generation galaxy surveys

    Science.gov (United States)

    Swanson, M. E. C.; Tegmark, Max; Hamilton, Andrew J. S.; Hill, J. Colin

    2008-07-01

    As galaxy surveys become larger and more complex, keeping track of the completeness, magnitude limit and other survey parameters as a function of direction on the sky becomes an increasingly challenging computational task. For example, typical angular masks of the Sloan Digital Sky Survey contain about N = 300000 distinct spherical polygons. Managing masks with such large numbers of polygons becomes intractably slow, particularly for tasks that run in O(N²) time with a naive algorithm, such as finding which polygons overlap each other. Here we present a `divide-and-conquer' solution to this challenge: we first split the angular mask into pre-defined regions called `pixels', such that each polygon is in only one pixel, and then perform further computations, such as checking for overlap, on the polygons within each pixel separately. This reduces O(N²) tasks to O(N), and also reduces the important task of determining in which polygon(s) a point on the sky lies from O(N) to O(1), resulting in significant computational speedup. Additionally, we present a method to efficiently convert any angular mask to and from the popular HEALPix format. This method can be generically applied to convert to and from any desired spherical pixelization. We have implemented these techniques in a new version of the MANGLE software package, which is freely available at http://space.mit.edu/home/tegmark/mangle/, along with complete documentation and example applications. These new methods should prove quite useful to the astronomical community, and since MANGLE is a generic tool for managing angular masks on a sphere, it has the potential to benefit terrestrial mapmaking applications as well.
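
    A toy illustration of the divide-and-conquer idea, built only from the description above: polygons are bucketed into coarse sky cells once, so locating the polygon containing a point only tests the few polygons in that point's cell rather than all N. For brevity the "polygons" here are RA/Dec rectangles and a polygon may be listed in several cells, whereas MANGLE works with true spherical polygons and splits them so each lies in exactly one pixel.

```python
from collections import defaultdict

CELL = 10.0  # coarse pixel size in degrees (illustrative)

def cell_of(ra, dec):
    """Map a sky position to a coarse (ra, dec) grid cell index."""
    return (int(ra // CELL), int((dec + 90.0) // CELL))

class RectPolygon:
    """Toy stand-in for a mask polygon: an RA/Dec rectangle with an id."""
    def __init__(self, pid, ra0, ra1, dec0, dec1):
        self.pid, self.ra0, self.ra1, self.dec0, self.dec1 = pid, ra0, ra1, dec0, dec1
    def contains(self, ra, dec):
        return self.ra0 <= ra < self.ra1 and self.dec0 <= dec < self.dec1

def build_index(polygons):
    """Bucket each polygon into every coarse cell its bounds overlap (done once)."""
    index = defaultdict(list)
    for p in polygons:
        for ci in range(int(p.ra0 // CELL), int((p.ra1 - 1e-9) // CELL) + 1):
            for cj in range(int((p.dec0 + 90.0) // CELL), int((p.dec1 + 90.0 - 1e-9) // CELL) + 1):
                index[(ci, cj)].append(p)
    return index

def polyid(index, ra, dec):
    """Which polygon(s) contain this point? Only the point's own cell is searched."""
    return [p.pid for p in index.get(cell_of(ra, dec), []) if p.contains(ra, dec)]

if __name__ == "__main__":
    polys = [RectPolygon(i, ra, ra + 2.0, 10.0, 12.0) for i, ra in enumerate(range(0, 360, 2))]
    idx = build_index(polys)
    print(polyid(idx, 123.4, 11.0))   # -> [61]
```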

  12. A survey of decontamination processes applicable to DOE nuclear facilities

    International Nuclear Information System (INIS)

    Chen, L.; Chamberlain, D.B.; Conner, C.; Vandegrift, G.F.

    1997-05-01

    The objective of this survey was to select an appropriate technology for in situ decontamination of equipment interiors as part of the decommissioning of U.S. Department of Energy nuclear facilities. This selection depends on knowledge of existing chemical decontamination methods. This report provides an up-to-date review of chemical decontamination methods. According to available information, aqueous systems are probably the most universally used method for decontaminating and cleaning metal surfaces. We have subdivided the technologies, on the basis of the types of chemical solvents, into acid, alkaline permanganate, highly oxidizing, peroxide, and miscellaneous systems. Two miscellaneous chemical decontamination methods (electrochemical processes and foam and gel systems) are also described. A concise technical description of various processes is given, and the report also outlines technical considerations in the choice of technologies, including decontamination effectiveness, waste handling, fields of application, and the advantages and limitations in application. On the basis of this survey, six processes were identified for further evaluation. 144 refs., 2 tabs

  13. A survey of decontamination processes applicable to DOE nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Chen, L.; Chamberlain, D.B.; Conner, C.; Vandegrift, G.F.

    1997-05-01

    The objective of this survey was to select an appropriate technology for in situ decontamination of equipment interiors as part of the decommissioning of U.S. Department of Energy nuclear facilities. This selection depends on knowledge of existing chemical decontamination methods. This report provides an up-to-date review of chemical decontamination methods. According to available information, aqueous systems are probably the most universally used method for decontaminating and cleaning metal surfaces. We have subdivided the technologies, on the basis of the types of chemical solvents, into acid, alkaline permanganate, highly oxidizing, peroxide, and miscellaneous systems. Two miscellaneous chemical decontamination methods (electrochemical processes and foam and gel systems) are also described. A concise technical description of various processes is given, and the report also outlines technical considerations in the choice of technologies, including decontamination effectiveness, waste handling, fields of application, and the advantages and limitations in application. On the basis of this survey, six processes were identified for further evaluation. 144 refs., 2 tabs.

  14. The IMACS Cluster Building Survey. I. Description of the Survey and Analysis Methods

    Science.gov (United States)

    Oemler Jr., Augustus; Dressler, Alan; Gladders, Michael G.; Rigby, Jane R.; Bai, Lei; Kelson, Daniel; Villanueva, Edward; Fritz, Jacopo; Rieke, George; Poggianti, Bianca M.; et al.

    2013-01-01

    The IMACS Cluster Building Survey uses the wide field spectroscopic capabilities of the IMACS spectrograph on the 6.5 m Baade Telescope to survey the large-scale environment surrounding rich intermediate-redshift clusters of galaxies. The goal is to understand the processes which may be transforming star-forming field galaxies into quiescent cluster members as groups and individual galaxies fall into the cluster from the surrounding supercluster. This first paper describes the survey: the data taking and reduction methods. We provide new calibrations of star formation rates (SFRs) derived from optical and infrared spectroscopy and photometry. We demonstrate that there is a tight relation between the observed SFR per unit B luminosity, and the ratio of the extinctions of the stellar continuum and the optical emission lines. With this, we can obtain accurate extinction-corrected colors of galaxies. Using these colors as well as other spectral measures, we determine new criteria for the existence of ongoing and recent starbursts in galaxies.

  15. THE IMACS CLUSTER BUILDING SURVEY. I. DESCRIPTION OF THE SURVEY AND ANALYSIS METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Oemler, Augustus Jr.; Dressler, Alan; Kelson, Daniel; Villanueva, Edward [Observatories of the Carnegie Institution for Science, 813 Santa Barbara St., Pasadena, CA 91101-1292 (United States); Gladders, Michael G. [Department of Astronomy and Astrophysics, University of Chicago, Chicago, IL 60637 (United States); Rigby, Jane R. [Observational Cosmology Lab, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Bai Lei [Department of Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, ON M5S 3H4 (Canada); Fritz, Jacopo [Sterrenkundig Observatorium, Universiteit Gent, Krijgslaan 281 S9, B-9000 Gent (Belgium); Rieke, George [Steward Observatory, University of Arizona, Tucson, AZ 8572 (United States); Poggianti, Bianca M.; Vulcani, Benedetta, E-mail: oemler@obs.carnegiescience.edu [INAF-Osservatorio Astronomico di Padova, Vicolo dell' Osservatorio 5, I-35122 Padova (Italy)

    2013-06-10

    The IMACS Cluster Building Survey uses the wide field spectroscopic capabilities of the IMACS spectrograph on the 6.5 m Baade Telescope to survey the large-scale environment surrounding rich intermediate-redshift clusters of galaxies. The goal is to understand the processes which may be transforming star-forming field galaxies into quiescent cluster members as groups and individual galaxies fall into the cluster from the surrounding supercluster. This first paper describes the survey: the data taking and reduction methods. We provide new calibrations of star formation rates (SFRs) derived from optical and infrared spectroscopy and photometry. We demonstrate that there is a tight relation between the observed SFR per unit B luminosity, and the ratio of the extinctions of the stellar continuum and the optical emission lines. With this, we can obtain accurate extinction-corrected colors of galaxies. Using these colors as well as other spectral measures, we determine new criteria for the existence of ongoing and recent starbursts in galaxies.

  16. Methods of practice and guidelines for using survey-grade global navigation satellite systems (GNSS) to establish vertical datum in the United States Geological Survey

    Science.gov (United States)

    Rydlund, Jr., Paul H.; Densmore, Brenda K.

    2012-01-01

    Geodetic surveys have evolved through the years to the use of survey-grade (centimeter level) global positioning to perpetuate and post-process vertical datum. The U.S. Geological Survey (USGS) uses Global Navigation Satellite Systems (GNSS) technology to monitor natural hazards, ensure geospatial control for climate and land use change, and gather data necessary for investigative studies related to water, the environment, energy, and ecosystems. Vertical datum is fundamental to a variety of these integrated earth sciences. Essentially, GNSS surveys provide a three-dimensional position (x, y, and z) as a function of the North American Datum of 1983 ellipsoid and the most current hybrid geoid model. A GNSS survey may be approached with post-processed positioning for static observations related to a single point or network, or involve real-time corrections to provide positioning "on-the-fly." Field equipment required to facilitate GNSS surveys ranges from a single receiver with a power source, for static positioning, to an additional receiver or network communicating by radio or cellular for real-time positioning. A real-time approach in its most common form may be described as a roving receiver augmented by a single-base station receiver, known as a single-base real-time (RT) survey. More efficient real-time methods involving a Real-Time Network (RTN) permit the use of only one roving receiver that is augmented to a network of fixed receivers commonly known as Continuously Operating Reference Stations (CORS). A post-processed approach in its most common form involves static data collection at a single point. Data are most commonly post-processed through a universally accepted utility maintained by the National Geodetic Survey (NGS), known as the Online Position User Service (OPUS). More complex post-processed methods involve static observations among a network of additional receivers collecting static data at known benchmarks. Both classifications provide users

  17. Research Methods in Healthcare Epidemiology: Survey and Qualitative Research.

    Science.gov (United States)

    Safdar, Nasia; Abbo, Lilian M; Knobloch, Mary Jo; Seo, Susan K

    2016-11-01

    Surveys are one of the most frequently employed study designs in healthcare epidemiology research. Generally easier to undertake and less costly than many other study designs, surveys can be invaluable for gaining insights into opinions and practices in large samples and may be descriptive and/or be used to test associations. In this context, qualitative research methods may complement this study design either at the survey development phase and/or at the interpretation/extension of results stage. This methods article focuses on key considerations for designing and deploying surveys in healthcare epidemiology and antibiotic stewardship, including identification of whether or not de novo survey development is necessary, ways to optimally lay out and display a survey, denominator measurement, discussion of biases to keep in mind particularly in research using surveys, and the role of qualitative research methods to complement surveys. We review examples of surveys in healthcare epidemiology and antimicrobial stewardship and review the pros and cons of the methods used. A checklist is provided to aid the design and deployment of surveys in healthcare epidemiology and antimicrobial stewardship. Infect Control Hosp Epidemiol 2016;1-6.

  18. Literature Review on Processing and Analytical Methods for ...

    Science.gov (United States)

    The purpose of this report was to survey the open literature to determine the current state of the science regarding the processing and analytical methods currently available for recovery of F. tularensis from water and soil matrices, and to determine what gaps remain in the collective knowledge concerning F. tularensis identification from environmental samples.

  19. GPR survey, as one of the best geophysical methods for social and industrial needs

    Science.gov (United States)

    Chernov, Anatolii

    2016-04-01

    This paper is about the ways and methods of applying a non-invasive geophysical method, ground penetrating radar (GPR) survey, in different spheres of science, industry, social life and culture. The author aims to show that geophysical methods can be widely used to solve a great variety of industrial, human-safety and other problems, and takes GPR survey as an example of such a useful method. Investigation of the near-surface underground medium is an important process which influences the development of different spheres of science and social life: investigation of near-surface geology (layering, spreading of rock types, identification of voids, etc.), hydrogeology (depth to water horizons, their thickness), preparation for the construction of roads and buildings (civil and engineering geology), investigation of cultural heritage (burial places, building remains, ...), ecological investigations (landslides, variation in underground water level, etc.) and glaciology. These tasks can be solved by geophysical methods, but such surveys usually take a lot of time and energy (especially electric current and resistivity methods and seismic survey). The author argues that a GPR survey can be performed faster than other geophysical surveys and that its results are informative enough to draw proper conclusions. Some problems cannot even be solved without GPR. For example, in the identification of a burial place (one of the author's research objects), the results of magnetic and electric resistivity tomography surveys did not contain enough information to identify the burial place, whereas its presence could be proven from anomalies on the GPR radargrams. Identification of voids and non-magnetic objects can also hardly be done by other non-invasive geophysical surveys, and GPR is applicable for that purpose. GPR can be applied for monitoring of dangerous processes in the geological medium under roads, buildings, parks and other places of human

  20. Near Real-Time Processing and Archiving of GPS Surveys for Crustal Motion Monitoring

    Science.gov (United States)

    Crowell, B. W.; Bock, Y.

    2008-12-01

    We present an inverse instantaneous RTK method for rapidly processing and archiving GPS data for crustal motion surveys that gives positional accuracy similar to traditional post-processing methods. We first stream 1 Hz data from GPS receivers over Bluetooth to Verizon XV6700 smartphones equipped with Geodetics, Inc. RTD Rover software. The smartphone transmits raw receiver data to a real-time server at the Scripps Orbit and Permanent Array Center (SOPAC) running RTD Pro. At the server, instantaneous positions are computed every second relative to the three closest base stations in the California Real Time Network (CRTN), using ultra-rapid orbits produced by SOPAC, the NOAATrop real-time tropospheric delay model, and ITRF2005 coordinates computed by SOPAC for the CRTN stations. The raw data are converted on-the-fly to RINEX format at the server. Data in both formats are stored on the server along with a file of instantaneous positions, computed independently at each observation epoch. The single-epoch instantaneous positions are continuously transmitted back to the field surveyor's smartphone, where RTD Rover computes a median position and interquartile range for each new epoch of observation. The best-fit solution is the last median position and is available as soon as the survey is completed. We describe how we used this method to process 1 Hz data from the February, 2008 Imperial Valley GPS survey of 38 geodetic monuments established by Imperial College, London in the 1970's, and previously measured by SOPAC using rapid-static GPS methods in 1993, 1999 and 2000, as well as 14 National Geodetic Survey (NGS) monuments. For redundancy, each monument was surveyed for about 15 minutes at least twice and at staggered intervals using two survey teams operating autonomously. Archiving of data and the overall project at SOPAC is performed using the PGM software, developed by the California Spatial Reference Center (CSRC) for the National Geodetic Survey (NGS). The
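
    A minimal sketch of the per-epoch reduction described above: each single-epoch instantaneous position updates a running median and interquartile range, and the final median is taken as the monument solution. The coordinates and noise levels below are simulated, illustrative values.

```python
import numpy as np

def median_and_iqr(positions: np.ndarray):
    """positions: (n_epochs, 3) array of east/north/up coordinates in metres."""
    med = np.median(positions, axis=0)
    q25, q75 = np.percentile(positions, [25, 75], axis=0)
    return med, q75 - q25

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = np.array([610432.114, 3625018.532, 41.207])   # hypothetical monument coordinates
    # Simulated 1 Hz single-epoch positions over ~15 minutes of occupation:
    epochs = truth + rng.normal(scale=[0.01, 0.01, 0.03], size=(900, 3))
    for n in (60, 300, 900):                              # report as epochs accumulate
        med, iqr = median_and_iqr(epochs[:n])
        print(f"after {n:4d} s: median = {med.round(3)}  IQR = {iqr.round(3)}")
```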

  1. Survey of postharvest handling, preservation and processing ...

    African Journals Online (AJOL)

    Survey of postharvest handling, preservation and processing practices along the camel milk chain in Isiolo district, Kenya. ... Despite the important contribution of camel milk to food security for pastoralists in Kenya, little is known about the postharvest handling, preservation and processing practices. In this study, existing ...

  2. Methods and systems for the processing of physiological signals

    International Nuclear Information System (INIS)

    Cosnac, B. de; Gariod, R.; Max, J.; Monge, V.

    1975-01-01

    This note is a general survey of the processing of physiological signals. After an introduction about electrodes and their limitations, the physiological nature of the main signals is briefly recalled. Different methods (signal averaging, spectral analysis, morphological shape analysis) are described with applications to the fields of magnetocardiography, electroencephalography, cardiography and electronystagmography. Processing means (single portable instruments and programmable systems) are described through the example of application to rheography and to the Plurimat'S general system. In conclusion, the methods of signal processing are dominated by the morphological analysis of curves and by the need for a wider introduction of statistical classification. As for the instruments, microprocessors will appear, but specific operators linked to a computer will certainly grow. [fr]
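
    A minimal sketch of two of the techniques named above, signal averaging and spectral analysis, applied to a synthetic repetitive signal; the waveform, sampling rate and noise level are illustrative assumptions.

```python
import numpy as np

fs = 250.0                                     # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)                  # one-second epochs
template = np.exp(-((t - 0.3) ** 2) / 0.001)   # synthetic "evoked" waveform

rng = np.random.default_rng(1)
epochs = np.stack([template + rng.normal(scale=0.5, size=t.size) for _ in range(200)])

# Signal averaging: coherent averaging over repeated epochs suppresses noise ~ 1/sqrt(N).
averaged = epochs.mean(axis=0)
print("noise std before/after averaging:",
      round((epochs[0] - template).std(), 3), round((averaged - template).std(), 3))

# Spectral analysis: power spectrum of the averaged epoch via the FFT.
spectrum = np.abs(np.fft.rfft(averaged)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print("dominant non-DC frequency bin:", freqs[spectrum[1:].argmax() + 1], "Hz")
```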

  3. Survey Research: Methods, Issues and the Future

    Science.gov (United States)

    Brewer, Ernest W.; Torrisi-Steele, Geraldine; Wang, Victor C. X.

    2015-01-01

    Survey research is prevalent among many professional fields. Both cost effective and time efficient, this method of research is commonly used for the purposes of gaining insight into the attitudes, thoughts, and opinions of populations. Additionally, because there are several types of survey research designs and data collection instruments, the…

  4. Y-STR frequency surveying method

    DEFF Research Database (Denmark)

    Willuweit, Sascha; Caliebe, Amke; Andersen, Mikkel Meyer

    2011-01-01

    Reasonable formalized methods to estimate the frequencies of DNA profiles generated from lineage markers have been proposed in the past years and were discussed in the forensic community. Recently, collections of population data on the frequencies of variations in Y chromosomal STR profiles have reached a new quality with the establishment of the comprehensive, neatly quality-controlled reference database YHRD. Grounded on such unrivalled empirical material from hundreds of population studies, the core assumption of the Haplotype Frequency Surveying Method originally described 10 years ago can be tested and improved. Here we provide new approaches to calculate the parameters used in the frequency surveying method: a maximum likelihood estimation of the regression parameters (r1, r2, s1 and s2) and a revised Frequency Surveying framework with variable binning and a database preprocessing to take...

  5. Statistical problems raised by data processing of food surveys

    International Nuclear Information System (INIS)

    Lacourly, Nancy

    1974-01-01

    The methods used for the analysis of dietary habits of national populations - food surveys - have been studied. S. Lederman's linear model for the estimation of average individual consumptions from total family diets was examined in the light of a food survey carried out on 250 Roman families in 1969. An important bias in the estimates thus obtained was shown by a simulation assuming a 'housewife's dictatorship'; these assumptions should contribute to setting up an unbiased model. Several techniques of multidimensional analysis were therefore used, and the theoretical aspects of linear regression had to be investigated for some particular situations: quasi-collinear 'independent variables', measurements with errors, and positive constraints on regression coefficients. A new survey methodology was developed taking account of the new 'Integrated Information Systems', which have an incidence on all the stages of a consumption survey: organization, data collection, constitution of an information bank and data processing. (author) [fr]
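
    A minimal sketch of the kind of constrained linear model discussed above: household food totals are regressed on household composition to estimate average individual consumptions per member category, with positivity imposed through non-negative least squares (SciPy assumed available). The data are simulated and the category breakdown is hypothetical, not taken from the 1969 survey.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
true_intake = np.array([250.0, 400.0, 320.0])      # g/day: child, adult man, adult woman (assumed)
composition = rng.integers(0, 4, size=(250, 3))    # members per category in 250 households
totals = composition @ true_intake + rng.normal(scale=60.0, size=250)  # household totals

# Non-negative least squares keeps the estimated individual consumptions positive.
estimated, residual_norm = nnls(composition.astype(float), totals)
print("estimated individual consumptions (g/day):", estimated.round(1))
```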

  6. Fast and accurate methods of independent component analysis: A survey

    Czech Academy of Sciences Publication Activity Database

    Tichavský, Petr; Koldovský, Zbyněk

    2011-01-01

    Roč. 47, č. 3 (2011), s. 426-438 ISSN 0023-5954 R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Institutional research plan: CEZ:AV0Z10750506 Keywords : Blind source separation * artifact removal * electroencephalogram * audio signal processing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/tichavsky-fast and accurate methods of independent component analysis a survey.pdf

  7. a survey of rice production and processing in south east nigeria

    African Journals Online (AJOL)

    A SURVEY OF RICE PRODUCTION AND PROCESSING IN SOUTH EAST NIGERIA. ... in South-Eastern Nigeria was carried out by investigative survey approach. ... labour and traditional approach in the production and processing of rice.

  8. Automation of the radiological survey process: USRADS ultrasonic ranging and data system

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Little, C.A.

    1987-01-01

    The Radiological Survey Activities (RASA) program at Oak Ridge National Laboratory (ORNL) serves as the Inclusion Survey Contractor (ISC) in the Department of Energy's (DOE) Uranium Mill Tailings Remedial Action project (UMTRAP). The ISC is to identify properties in the vicinity of 24 inactive uranium mill sites suspected of having 226 Ra-bearing uranium mill tailings by-product material originating from the processing of uranium ore contamination. Mobile gamma scanning was the primary method used to identify these properties. Once identified, the ISC conducts an inclusion survey. This survey performs sufficient radiological measurements to determine if uranium mill tailing contamination is present, and, if so, if it is in excess of relevant Environmental Protection Agency (EPA) criteria. Radon emanating from 226 Ra is the primary pathway of exposure to human occupants at these sites. EPA criteria focus on controlling 226 Ra concentration in soil. The concentration of 226 Ra in soil can be measured directly by soil sampling and subsequent gamma spectrographic analysis of the sample, or by direct measurement of the gamma exposure rate at the soil surface using portable instrumentation in the field. In both methods, the concentration of 226 Ra is inferred by examining the frequency of gamma emission of 214 Bi, a radioactive decay product in the 238 U decay chain

  9. Social Workers' Orientation toward the Evidence-Based Practice Process: A Dutch Survey

    Science.gov (United States)

    van der Zwet, Renske J. M.; Kolmer, Deirdre M. Beneken genaamd; Schalk, René

    2016-01-01

    Objectives: This study assesses social workers' orientation toward the evidence-based practice (EBP) process and explores which specific variables (e.g. age) are associated. Methods: Data were collected from 341 Dutch social workers through an online survey which included a Dutch translation of the EBP Process Assessment Scale (EBPPAS), along with…

  10. Survey process quality: a question of healthcare manager approach.

    Science.gov (United States)

    Nilsson, Petra; Blomqvist, Kerstin

    2017-08-14

    Purpose The purpose of this paper is to explore how healthcare first-line managers think about and act regarding workplace survey processes. Design/methodology/approach This interview study was performed at a hospital in south Sweden. First-line healthcare managers (n=24) volunteered. The analysis was inspired by phenomenography, which aims to describe the ways in which different people experience a phenomenon. The phenomenon was a workplace health promotion (WHP) survey process. Findings Four main WHP survey process approaches were identified among the managers: as a possibility, as a competition, as a work task among others and as an imposition. For each, three common subcategories emerged; how managers: stated challenges and support from hospital management; described their own work group and collaboration with other managers; and expressed themselves and their situation in their roles as first-line managers. Practical implications Insights into how hospital management can understand their first-line managers' motivation for survey processes, and practical suggestions for how managers can work proactively at the organizational, group and individual level, are presented. Originality/value Usually these studies focus on those who should respond to a survey, not those who should run the survey process. Focusing on managers and not co-workers can lead to more committed and empowered managers and thereby success in survey processes.

  11. Man Versus Machine: Comparing Double Data Entry and Optical Mark Recognition for Processing CAHPS Survey Data.

    Science.gov (United States)

    Fifolt, Matthew; Blackburn, Justin; Rhodes, David J; Gillespie, Shemeka; Bennett, Aleena; Wolff, Paul; Rucks, Andrew

    Historically, double data entry (DDE) has been considered the criterion standard for minimizing data entry errors. However, previous studies considered data entry alternatives through the limited lens of data accuracy. This study supplies information regarding data accuracy, operational efficiency, and cost for DDE and optical mark recognition (OMR) for processing the Consumer Assessment of Healthcare Providers and Systems 5.0 survey. To assess data accuracy, we compared error rates for DDE and OMR by dividing the number of surveys that were arbitrated by the total number of surveys processed for each method. To assess operational efficiency, we tallied the cost of data entry for DDE and OMR after survey receipt. Costs were calculated on the basis of personnel, depreciation for capital equipment, and costs of noncapital equipment. The cost savings attributed to DDE were negated by the operational efficiency of OMR. There was a statistically significant difference between the arbitration rates of DDE and OMR; however, this statistical significance did not translate into a practical significance. The potential benefits of DDE in terms of data accuracy did not outweigh the operational efficiency, and thereby the financial savings, of OMR.
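
    A minimal sketch of the error-rate comparison described above: the arbitration rate is the number of arbitrated surveys divided by the total processed, and the two methods are compared with a two-proportion z-test. The counts below are hypothetical, not the study's data.

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int):
    """Rates and z statistic for comparing two arbitration proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return p1, p2, (p1 - p2) / se

if __name__ == "__main__":
    # Hypothetical counts: arbitrated surveys out of all surveys processed per method.
    p_dde, p_omr, z = two_proportion_z(x1=12, n1=4000, x2=31, n2=4000)
    print(f"DDE arbitration rate = {p_dde:.4f}, OMR = {p_omr:.4f}, z = {z:.2f}")
```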

  12. Multidisciplinary eHealth Survey Evaluation Methods

    Science.gov (United States)

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  13. A Delphi Method Analysis to Create an Emergency Medicine Educational Patient Satisfaction Survey

    Directory of Open Access Journals (Sweden)

    Kory S. London

    2015-12-01

    Introduction: Feedback on patient satisfaction (PS) as a means to monitor and improve performance in patient communication is lacking in residency training. A physician's promotion, compensation and job satisfaction may be impacted by his or her individual PS scores once in practice. Many communication and satisfaction surveys exist, but none focus on the emergency department setting for educational purposes. The goal of this project was to create an emergency medicine-based educational PS survey with strong evidence for content validity. Methods: We used the Delphi Method (DM) to obtain expert opinion via an iterative process of surveying. Questions were mined from four PS surveys as well as from group suggestion. The DM analysis determined the structure, content and appropriate use of the tool. The group used four-point Likert-type scales and Lynn's criteria for content validity to determine relevant questions from the stated goals. Results: Twelve recruited experts participated in a series of seven surveys to achieve consensus. A 10-question, single-page survey with an additional page of qualitative and demographic questions was selected. Thirty-one questions were judged to be relevant from an original 48-question list; of these, the final 10 questions were chosen. The response rate for individual survey items was 99.5%. Conclusion: The DM produced a consensus survey with content validity evidence. Future work will be needed to obtain evidence for response process, internal structure and construct validity.
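
    A minimal sketch of applying Lynn's content-validity criteria to four-point relevance ratings: the item-level content validity index (I-CVI) is the proportion of experts rating an item 3 or 4, and items below a chosen threshold are dropped. The simulated ratings and the 0.78 retention threshold are illustrative assumptions, not the panel's actual data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_experts, n_items = 12, 48
ratings = rng.integers(1, 5, size=(n_experts, n_items))   # simulated 4-point relevance ratings

i_cvi = (ratings >= 3).mean(axis=0)                        # proportion of experts rating item relevant
retained = np.flatnonzero(i_cvi >= 0.78)                   # commonly cited cut-off for large panels
print(f"{retained.size} of {n_items} items retained; mean I-CVI = {i_cvi.mean():.2f}")
```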

  14. Data acquisition and processing - helicopter radiometric survey, Krageroe, 1998

    CERN Document Server

    Beard, L P

    2000-01-01

    On 07 October 1998 a helicopter radiometric survey was flown in the vicinity of Krageroe municipality. The purpose of the survey was to provide radiometric information to help assess radon hazard from radioactive rocks in the area. A total of 60 line-kilometres of radiometric data were acquired in a single flight, covering an area of approximately 3 square km with a 50-m line spacing. The data were collected by Geological Survey of Norway (NGU) personnel and processed at NGU. Radiometric data were reduced using the three-channel procedure recommended by the International Atomic Energy Agency. All data were gridded using square cells with 30-m sides and geophysical maps were produced at a scale of 1:5000. This report covers aspects of data acquisition and processing (Author)

  15. Data acquisition and processing - helicopter radiometric survey, Krageroe, 1998

    Energy Technology Data Exchange (ETDEWEB)

    Beard, Les P.; Mogaard, John Olav

    2000-07-01

    On 07 October 1998 a helicopter radiometric survey was flown in the vicinity of Krageroe municipality. The purpose of the survey was to provide radiometric information to help assess radon hazard from radioactive rocks in the area. A total of 60 line-kilometres of radiometric data were acquired in a single flight, covering an area of approximately 3 square km with a 50-m line spacing. The data were collected by Geological Survey of Norway (NGU) personnel and processed at NGU. Radiometric data were reduced using the three-channel procedure recommended by the International Atomic Energy Agency. All data were gridded using square cells with 30-m sides and geophysical maps were produced at a scale of 1:5000. This report covers aspects of data acquisition and processing (Author)

  16. Methods of the National Nutrition Survey 1999

    OpenAIRE

    Resano-Pérez, Elsa; Méndez-Ramírez, Ignacio; Shamah-Levy, Teresa; Rivera, Juan A; Sepúlveda-Amor, Jaime

    2003-01-01

    OBJECTIVE: To describe the methods and analyses of the 1999 National Nutrition Survey (NNS-99). MATERIAL AND METHODS: The 1999 National Nutrition Survey (NNS-99) is a probabilistic survey with nationwide representativity. The NNS-99 included four regions and urban and rural areas of Mexico. The last sampling units were households, selected through stratified cluster sampling. The study population consisted of children under five years of age, school-age children (6-11 years), and women of chi...

  17. Methodical recommendations for power unit comprehensive engineering and radiation survey

    International Nuclear Information System (INIS)

    Nosovskij, A.V.

    2000-01-01

    The article describes power unit radiation survey methods developed and applied during the Comprehensive Engineering and Radiation Survey of ChNPP Unit 1. Special requirements for units under decommissioning, the main survey principles, and criteria for defining the scope and order of the survey for the various systems of an NPP unit are included

  18. Process-tracing methods in decision making: on growing up in the 70s

    NARCIS (Netherlands)

    Schulte-Mecklenbeck, M.; Johnson, J.G.; Böckenholt, U.; Goldstein, D.G.; Russo, J.E.; Sullivan, N.J.; Willemsen, M.C.

    2017-01-01

    Decision research has experienced a shift from simple algebraic theories of choice to an appreciation of mental processes underlying choice. A variety of process-tracing methods has helped researchers test these process explanations. Here, we provide a survey of these methods, including specific

  19. Optimal power flow: a bibliographic survey I. Formulations and deterministic methods

    Energy Technology Data Exchange (ETDEWEB)

    Frank, Stephen [Colorado School of Mines, Department of Electrical Engineering and Computer Science, Golden, CO (United States); Steponavice, Ingrida [University of Jyvaskyla, Department of Mathematical Information Technology, Agora (Finland); Rebennack, Steffen [Colorado School of Mines, Division of Economics and Business, Golden, CO (United States)

    2012-09-15

    Over the past half-century, optimal power flow (OPF) has become one of the most important and widely studied nonlinear optimization problems. In general, OPF seeks to optimize the operation of electric power generation, transmission, and distribution networks subject to system constraints and control limits. Within this framework, however, there is an extremely wide variety of OPF formulations and solution methods. Moreover, the nature of OPF continues to evolve due to modern electricity markets and renewable resource integration. In this two-part survey, we survey both the classical and recent OPF literature in order to provide a sound context for the state of the art in OPF formulation and solution methods. The survey contributes a comprehensive discussion of specific optimization techniques that have been applied to OPF, with an emphasis on the advantages, disadvantages, and computational characteristics of each. Part I of the survey (this article) provides an introduction and surveys the deterministic optimization methods that have been applied to OPF. Part II of the survey examines the recent trend towards stochastic, or non-deterministic, search techniques and hybrid methods for OPF. (orig.)

  20. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    Science.gov (United States)

    James, Mike R.; Robson, Stuart; d'Oleire-Oltmanns, Sebastian; Niethammer, Uwe

    2016-04-01

    Structure-from-motion (SfM) algorithms are greatly facilitating the production of detailed topographic models based on images collected by unmanned aerial vehicles (UAVs). However, SfM-based software does not generally provide the rigorous photogrammetric analysis required to fully understand survey quality. Consequently, error related to problems in control point data or the distribution of control points can remain undiscovered. Even if these errors are not large in magnitude, they can be systematic, and thus have strong implications for the use of products such as digital elevation models (DEMs) and orthophotos. Here, we develop a Monte Carlo approach to (1) improve the accuracy of products when SfM-based processing is used and (2) reduce the associated field effort by identifying suitable lower density deployments of ground control points. The method highlights over-parameterisation during camera self-calibration and provides enhanced insight into control point performance when rigorous error metrics are not available. Processing was implemented using commonly-used SfM-based software (Agisoft PhotoScan), which we augment with semi-automated and automated GCP image measurement. We apply the Monte Carlo method to two contrasting case studies - an erosion gully survey (Taurodont, Morocco) carried out with a fixed-wing UAV, and an active landslide survey (Super-Sauze, France), acquired using a manually controlled quadcopter. The results highlight the differences in the control requirements for the two sites, and we explore the implications for future surveys. We illustrate DEM sensitivity to critical processing parameters and show how the use of appropriate parameter values increases DEM repeatability and reduces the spatial variability of error due to processing artefacts.
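
    A minimal sketch of the Monte Carlo idea described above: a random subset of the surveyed ground control points is used as control in each trial, the survey is reprocessed, and the spread of the resulting DEMs maps the error due to the control configuration. The processing call is a placeholder, not a real PhotoScan/Metashape API, and all numbers are illustrative.

```python
import numpy as np

def run_sfm_and_export_dem(control_subset):
    """Placeholder for the external SfM processing step: returns a DEM of elevation
    differences produced using only `control_subset` as control (fake data here)."""
    rng = np.random.default_rng(abs(hash(tuple(control_subset))) % (2 ** 32))
    return rng.normal(scale=0.05, size=(100, 100))

all_gcps = list(range(20))          # 20 surveyed control points (hypothetical)
n_use, n_trials = 5, 50             # deploy 5 as control per trial

dems = []
rng = np.random.default_rng(4)
for _ in range(n_trials):
    subset = sorted(rng.choice(all_gcps, size=n_use, replace=False).tolist())
    dems.append(run_sfm_and_export_dem(subset))

stack = np.stack(dems)
# Per-cell spread across trials maps where the DEM is sensitive to control configuration.
print("mean DEM standard deviation per cell:", float(stack.std(axis=0).mean()))
```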

  1. The Dark Energy Survey Image Processing Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Morganson, E.; et al.

    2018-01-09

    The Dark Energy Survey (DES) is a five-year optical imaging campaign with the goal of understanding the origin of cosmic acceleration. DES performs a 5000 square degree survey of the southern sky in five optical bands (g,r,i,z,Y) to a depth of ~24th magnitude. Contemporaneously, DES performs a deep, time-domain survey in four optical bands (g,r,i,z) over 27 square degrees. DES exposures are processed nightly with an evolving data reduction pipeline and evaluated for image quality to determine if they need to be retaken. Difference imaging and transient source detection are also performed in the time domain component nightly. On a bi-annual basis, DES exposures are reprocessed with a refined pipeline and coadded to maximize imaging depth. Here we describe the DES image processing pipeline in support of DES science, as a reference for users of archival DES data, and as a guide for future astronomical surveys.

  2. Social workers’ orientation toward the evidence-based practice process : A Dutch survey

    NARCIS (Netherlands)

    van der Zwet, R.J.M.; Beneken Genaamd Kolmer, D.M.; Schalk, R.

    2016-01-01

    Objectives: This study assesses social workers’ orientation toward the evidence-based practice (EBP) process and explores which specific variables (e.g. age) are associated. Methods: Data were collected from 341 Dutch social workers through an online survey which included a Dutch translation of the

  3. Prenotification, Incentives, and Survey Modality: An Experimental Test of Methods to Increase Survey Response Rates of School Principals

    Science.gov (United States)

    Jacob, Robin Tepper; Jacob, Brian

    2012-01-01

    Teacher and principal surveys are among the most common data collection techniques employed in education research. Yet there is remarkably little research on survey methods in education, or about the most cost-effective way to raise response rates among teachers and principals. In an effort to explore various methods for increasing survey response…

  4. Commercial off-the-shelf software dedication process based on the commercial grade survey of supplier

    International Nuclear Information System (INIS)

    Kim, J. Y.; Lee, J. S.; Chon, S. W.; Lee, G. Y.; Park, J. K.

    2000-01-01

    Commercial off-the-shelf (COTS) software dedication can apply a combination of methods, like the hardware commercial grade item dedication process. In general, these methods are: method 1 (special test and inspection), method 2 (commercial grade survey of supplier), method 3 (source verification), and method 4 (acceptance of supplier/item performance record). In this paper, the suggested procedure-oriented dedication process, based on method 2 for COTS software, is consistent with EPRI/TR-106439 and NUREG/CR-6421 requirements. An additional tailoring policy based on codes and standards related to COTS software may also be found in the suggested commercial software dedication process. The suggested process has been developed for a commercial I and C software dedicator who performs COTS qualification according to the dedication procedure

  5. Survey Methods, Traditional, Public Opinion Polling

    DEFF Research Database (Denmark)

    Elmelund-Præstekær, Christian; Hopmann, David Nicolas; Pedersen, Rasmus Tue

    2017-01-01

    Traditional public opinion polls are surveys in which a random sample of a given population is asked questions about their attitudes, knowledge, or behavior. If conducted properly, the answers from such surveys are approximately representative of the entire population. Traditional public opinion polling is typically based on four different methods of data gathering, or combinations hereof: face-to-face, postal surveys, phone surveys, and web surveys. However, given that opinion polls are based on a sample, we cannot be sure that the sample reflects public opinion perfectly, even if randomness is perfect. Moreover, responses may be highly dependent on the contextual information provided with the question. Also, it may be difficult to capture past or complex causes of attitudes or behavior. In short, surveys are a precise way of measuring public opinion, but they do not come without challenges.

  6. METHODS IN THE POST-METHODS ERA. REPORT ON AN INTERNATIONAL SURVEY ON LANGUAGE TEACHING METHODS'

    Directory of Open Access Journals (Sweden)

    Jun Liu

    2004-06-01

    Do methods still have a place in 21st century language teaching? To answer this question, an international survey was conducted in the summer of 1999. A sample of 800 language teachers worldwide, randomly drawn from 17,800 TESOLers, were each given a 2-page survey. The return rate was 58.5%, with an actual usable data set of 448, which was analyzed using both descriptive and inferential statistics. Among the ten commonly recognized teaching methods surveyed, both the Communicative Language Teaching Approach and an eclectic method seem to have the highest rates of familiarity, preference, and use. But when multiple factors, such as teaching contexts, instructional settings, learners' proficiency levels, class size, teaching experience and educational backgrounds of the teachers, and the status of being a native or nonnative English speaking professional, were taken into consideration, various patterns and themes emerged. One interesting finding is that Grammar Translation is still used in EFL contexts, in larger classes, and with learners at low proficiency levels, though the ratio between the actual use of this method and teachers' preference does not match. Based on the results of the survey, a new theoretical framework is proposed to conceptualize language teaching methods in the post-methods era.

  7. Survey of electronic payment methods and systems

    NARCIS (Netherlands)

    Havinga, Paul J.M.; Smit, Gerardus Johannes Maria; Helme, A.; Verbraeck, A.

    1996-01-01

    In this paper an overview of electronic payment methods and systems is given. This survey is done as part of the Moby Dick project. Electronic payment systems can be grouped into three broad classes: traditional money transactions, digital currency, and credit-debit payments. Such payment systems have

  8. Training practices of cell processing laboratory staff: analysis of a survey by the Alliance for Harmonization of Cellular Therapy Accreditation.

    Science.gov (United States)

    Keever-Taylor, Carolyn A; Slaper-Cortenbach, Ineke; Celluzzi, Christina; Loper, Kathy; Aljurf, Mahmoud; Schwartz, Joseph; Mcgrath, Eoin; Eldridge, Paul

    2015-12-01

    Methods for processing products used for hematopoietic progenitor cell (HPC) transplantation must ensure their safety and efficacy. Personnel training and ongoing competency assessment are critical to this goal. Here we present results from a global survey of methods used by a diverse array of cell processing facilities for the initial training and ongoing competency assessment of key personnel. The Alliance for Harmonisation of Cellular Therapy Accreditation (AHCTA) created a survey to identify facility type, location, activity, personnel, and methods used for training and competency. A survey link was disseminated through organizations represented in AHCTA to processing facilities worldwide. Responses were tabulated and analyzed as a percentage of total responses and as a percentage of responses by region group. Most facilities were based at academic medical centers or hospitals. Facilities with a broad range of activities, product sources and processing procedures were represented. Facilities reported using a combination of training and competency methods, although some methods predominated. The cellular sources used differed between training and competency assessment, and also differed based on the frequency of procedures performed. Most facilities had responsibilities for procedures in addition to processing, for which training and competency methods differed. Although regional variation was observed, training and competency requirements were generally consistent. The survey data showed the use of a variety of training and competency methods, but some methods predominated, suggesting their utility. These results could help new and established facilities in making decisions for their own training and competency programs. Copyright © 2015 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  9. Research on 3-D terrain correction methods of airborne gamma-ray spectrometry survey

    International Nuclear Information System (INIS)

    Liu Yanyang; Liu Qingcheng; Zhang Zhiyong

    2008-01-01

    The general height-correction method is not effective over complex terrain when interpreting airborne gamma-ray spectrometry data, and the 2-D terrain correction method developed in recent years applies only to the measured section. A new 3-D sector terrain correction method is studied. In this method, the ground radiator is divided into many small sector radiators, the irradiation rate of each sector at the survey distance is calculated, and the sum over all the small radiation sources is taken as the irradiation rate of the ground radiator at the given aerial survey point; correction coefficients are then calculated for every point and applied to the airborne gamma-ray spectrometry data. By dividing the ground radiator into many small sectors, the method supports forward calculation, inversion calculation, and terrain correction for airborne gamma-ray spectrometry surveys over complex topography. Other factors are also considered, such as the unsaturated degree of the measurement scope and uneven radiator content on the ground. The results of a forward model and an example analysis show that the 3-D terrain correction method is appropriate and effective. (authors)
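
    The abstract gives only a verbal description of the sector-summation idea; the sketch below is a rough illustration under stated assumptions, not the authors' implementation. The inverse-square-with-attenuation kernel, the parameter names (dem, mu_air, r_max), and the flat-terrain reference used to form the correction coefficient are all illustrative choices.

    import numpy as np

    def sector_terrain_correction(dem, detector_xyz, mu_air=0.0057,
                                  n_az=72, n_rad=50, r_max=300.0):
        # dem: callable (x, y) -> ground elevation in metres (hypothetical interface)
        # detector_xyz: (x, y, z) of the airborne survey point
        # mu_air: illustrative air attenuation coefficient, 1/m
        x0, y0, z0 = detector_xyz
        dr = r_max / n_rad
        d_az = 2.0 * np.pi / n_az
        total, total_flat = 0.0, 0.0
        z_ref = dem(x0, y0)                        # flat-terrain reference elevation
        for az in np.arange(n_az) * d_az:
            for r in (np.arange(n_rad) + 0.5) * dr:
                area = r * d_az * dr               # area of this small sector radiator
                gz = dem(x0 + r * np.cos(az), y0 + r * np.sin(az))
                d = np.hypot(r, z0 - gz)           # slant distance to the real terrain
                d_flat = np.hypot(r, z0 - z_ref)   # slant distance over flat terrain
                total += area * np.exp(-mu_air * d) / (4.0 * np.pi * d ** 2)
                total_flat += area * np.exp(-mu_air * d_flat) / (4.0 * np.pi * d_flat ** 2)
        return total_flat / total                  # multiply measured count rates by this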

  10. Optimal power flow: a bibliographic survey II. Non-deterministic and hybrid methods

    Energy Technology Data Exchange (ETDEWEB)

    Frank, Stephen [Colorado School of Mines, Department of Electrical Engineering and Computer Science, Golden, CO (United States); Steponavice, Ingrida [Univ. of Jyvaskyla, Dept. of Mathematical Information Technology, Agora (Finland); Rebennack, Steffen [Colorado School of Mines, Division of Economics and Business, Golden, CO (United States)

    2012-09-15

    Over the past half-century, optimal power flow (OPF) has become one of the most important and widely studied nonlinear optimization problems. In general, OPF seeks to optimize the operation of electric power generation, transmission, and distribution networks subject to system constraints and control limits. Within this framework, however, there is an extremely wide variety of OPF formulations and solution methods. Moreover, the nature of OPF continues to evolve due to modern electricity markets and renewable resource integration. In this two-part survey, we survey both the classical and recent OPF literature in order to provide a sound context for the state of the art in OPF formulation and solution methods. The survey contributes a comprehensive discussion of specific optimization techniques that have been applied to OPF, with an emphasis on the advantages, disadvantages, and computational characteristics of each. Part I of the survey provides an introduction and surveys the deterministic optimization methods that have been applied to OPF. Part II of the survey (this article) examines the recent trend towards stochastic, or non-deterministic, search techniques and hybrid methods for OPF. (orig.)

  11. An UAV scheduling and planning method for post-disaster survey

    Science.gov (United States)

    Li, G. Q.; Zhou, X. G.; Yin, J.; Xiao, Q. Y.

    2014-11-01

    Extreme climate and special geological environments lead to frequent natural disasters every year, e.g., earthquakes and floods, which often bring serious casualties and enormous economic losses. Post-disaster surveying is very important for disaster relief and assessment. Because Unmanned Aerial Vehicle (UAV) remote sensing offers high efficiency, high precision, high flexibility, and low cost, it has been widely used for emergency surveying in recent years. The UAVs used in emergency surveying cannot stand by waiting for a disaster to happen; when a disaster occurs they are usually working elsewhere. To improve emergency surveying efficiency, the UAVs must be tracked and the emergency surveying tasks assigned to each selected UAV. Therefore, a UAV tracking and scheduling method for post-disaster survey is presented in this paper. In this method, the Global Positioning System (GPS) and the GSM network are used to track the UAVs. An emergency tracking UAV information database is built in advance by registration; it includes at least the ID and the communication number of each UAV. When a catastrophe happens, the real-time locations of all UAVs in the database are first obtained using the emergency tracking method; then the travel cost time from each UAV to the disaster region is calculated from the UAV's real-time location and the road network using a nearest-services analysis algorithm. The disaster region is subdivided into several emergency surveying regions based on the DEM, area, and population distribution map, and the emergency surveying regions are assigned to the appropriate UAVs according to a shortest-cost-time rule. The UAV tracking and scheduling prototype is implemented using SQLServer2008, ArcEngine 10.1 SDK, Visual Studio 2010 C#, Android, SMS Modem, and the Google Maps API.
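
    The shortest-cost-time assignment step can be illustrated with a minimal greedy allocation sketch; the UAV identifiers, travel times, and uniform per-region survey time below are hypothetical, and the paper's prototype obtains the travel costs from a road-network nearest-services analysis rather than from fixed inputs.

    def assign_survey_regions(uav_travel_time, regions, survey_hours_per_region=1.0):
        # uav_travel_time: {uav_id: hours from its real-time position to the disaster area}
        # regions: emergency surveying region ids, ordered by priority
        load = dict(uav_travel_time)               # running cost time per UAV
        plan = {uav: [] for uav in uav_travel_time}
        for region in regions:
            uav = min(load, key=load.get)          # shortest-cost-time rule
            plan[uav].append(region)
            load[uav] += survey_hours_per_region   # assumed uniform survey time per region
        return plan

    # Hypothetical example: two tracked UAVs, three surveying regions
    print(assign_survey_regions({"UAV-1": 0.5, "UAV-2": 1.2}, ["R1", "R2", "R3"]))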

  12. U.S. Geological Survey experience with the residual absolutes method

    Directory of Open Access Journals (Sweden)

    E. W. Worthington

    2017-10-01

    Full Text Available The U.S. Geological Survey (USGS) Geomagnetism Program has developed and tested the residual method of absolutes, with the assistance of the Danish Technical University's (DTU) Geomagnetism Program. Three years of testing were performed at College Magnetic Observatory (CMO), Fairbanks, Alaska, to compare the residual method with the null method. Results show that the two methods compare very well with each other, and both sets of baseline data were used to process the 2015 definitive data. The residual method will be implemented at the other USGS high-latitude geomagnetic observatories in the summers of 2017 and 2018.

  13. The Jamaica asthma and allergies national prevalence survey: rationale and methods

    Directory of Open Access Journals (Sweden)

    Edwards Nancy C

    2010-04-01

    Full Text Available Abstract Background Asthma is a significant public health problem in the Caribbean. Prevalence surveys using standardized measures of asthma provide valid prevalence estimates to facilitate regional and international comparisons and monitoring of trends. This paper describes methods used in the Jamaica Asthma and Allergies National Prevalence Survey, challenges associated with this survey and strategies used to overcome these challenges. Methods/Design An island-wide, cross-sectional, community-based survey of asthma, asthma symptoms and allergies was done among adults and children using the European Community Respiratory Health Survey Questionnaire for adults and the International Study of Asthma and Allergies in Children. Stratified multi-stage cluster sampling was used to select 2,163 adults aged 18 years and older and 2,017 children aged 2-17 years for the survey. The Kish selection table was used to select one adult and one child per household. Data analysis accounted for the sampling design, and prevalence estimates were weighted to produce national estimates. Discussion The Jamaica Asthma and Allergies National Prevalence Survey is the first population-based survey in the Caribbean to determine the prevalence of asthma and allergies in both adults and children using standardized methods. With response rates exceeding 80% in both groups, this approach facilitated cost-effective gathering of high quality asthma prevalence data that will facilitate international and regional comparison and monitoring of asthma prevalence trends. Another unique feature of this study was the partnership with the Ministry of Health in Jamaica, which ensured the collection of data relevant for decision-making and facilitated the uptake of research evidence. The findings of this study will provide important data on the burden of asthma and allergies in Jamaica and contribute to evidence-informed planning of comprehensive asthma management and education programs.

  14. Survey method for radiological surveys of 300 FF-1 Operable Unit soil and materials

    International Nuclear Information System (INIS)

    Greif, A.A.

    1997-06-01

    This technical basis document is to be used to survey soils at the 300-FF-1 Operable Unit during remediation of the site. Its purpose is to provide a basis for the survey methods to be employed by the Radiological Control Technician to determine if excavated areas require continued remediation in accordance with the Record of Decision for the operable unit

  15. Double-observer line transect surveys with Markov-modulated Poisson process models for animal availability.

    Science.gov (United States)

    Borchers, D L; Langrock, R

    2015-12-01

    We develop maximum likelihood methods for line transect surveys in which animals go undetected at distance zero, either because they are stochastically unavailable while within view or because they are missed when they are available. These incorporate a Markov-modulated Poisson process model for animal availability, allowing more clustered availability events than is possible with Poisson availability models. They include a mark-recapture component arising from the independent-observer survey, leading to more accurate estimation of detection probability given availability. We develop models for situations in which (a) multiple detections of the same individual are possible and (b) some or all of the availability process parameters are estimated from the line transect survey itself, rather than from independent data. We investigate estimator performance by simulation, and compare the multiple-detection estimators with estimators that use only initial detections of individuals, and with a single-observer estimator. Simultaneous estimation of detection function parameters and availability model parameters is shown to be feasible from the line transect survey alone with multiple detections and double-observer data but not with single-observer data. Recording multiple detections of individuals improves estimator precision substantially when estimating the availability model parameters from survey data, and we recommend that these data be gathered. We apply the methods to estimate detection probability from a double-observer survey of North Atlantic minke whales, and find that double-observer data greatly improve estimator precision here too. © 2015 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
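
    As a minimal illustration of the availability model, the sketch below simulates a two-state Markov-modulated Poisson process: a hidden state switches at exponential rates and modulates the Poisson rate of availability cues, producing the clustered availability events described above. The two-state restriction and all rate values are illustrative assumptions, not the fitted minke whale model.

    import numpy as np

    rng = np.random.default_rng(42)

    def simulate_mmpp(q01, q10, lam0, lam1, t_end):
        # q01, q10: state-switching rates (0 -> 1 and 1 -> 0)
        # lam0, lam1: Poisson rates of availability events in each hidden state
        t, state, events = 0.0, 0, []
        while t < t_end:
            q = q01 if state == 0 else q10
            dwell = rng.exponential(1.0 / q)                # time until the state switches
            window = min(dwell, t_end - t)
            lam = lam0 if state == 0 else lam1
            n = rng.poisson(lam * window)                   # events during this dwell
            events.extend(t + rng.uniform(0.0, window, n))  # spread them uniformly in the dwell
            t += dwell
            state = 1 - state
        return np.sort(np.array(events))

    # Illustrative parameters only: long quiet spells, short bursts of surfacings
    print(simulate_mmpp(q01=0.1, q10=0.5, lam0=0.05, lam1=2.0, t_end=600.0))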

  16. Getting from neuron to checkmark: Models and methods in cognitive survey research

    NARCIS (Netherlands)

    Holleman, B.C.; Murre, J.M.J.

    2008-01-01

    Since the 1980s much work has been done in the field of Cognitive Survey Research. In an interdisciplinary endeavour, survey methodologists and cognitive psychologists (as well as social psychologists and linguists) have worked to unravel the cognitive processes underlying survey responses: to

  17. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    Science.gov (United States)

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. In practice, human inspection systems are often used for grading purposes. An automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines, with distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made of the pattern recognition and image processing algorithms which have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction and texture/image classification. Methods such as the wavelet transform, filtering, morphology and the contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks and model-based algorithms, can be applied to extract the surface defects. Although statistical methods are often appropriate for identifying large defects such as Spots, techniques such as wavelet processing provide an acceptable response for detecting small defects such as Pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
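
    As one concrete example of the statistical subgroup mentioned above, the sketch below flags large Spot-like defects by thresholding the deviation from a smoothed background. The filter size, threshold k, and minimum area are illustrative assumptions, and small defects such as Pinholes would need a different (e.g., wavelet-based) detector.

    import numpy as np
    from scipy import ndimage

    def detect_spot_defects(tile_gray, k=4.0, min_area=20):
        # tile_gray: 2-D grayscale image of a tile surface
        background = ndimage.uniform_filter(tile_gray.astype(float), size=31)
        residual = tile_gray - background                  # deviation from local background
        mask = np.abs(residual) > k * residual.std()       # statistical threshold
        labels, n = ndimage.label(mask)                    # connected candidate regions
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        return [i + 1 for i, s in enumerate(sizes) if s >= min_area]  # labels of Spot-sized defects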

  18. The Dark Energy Survey Data Processing and Calibration System

    Energy Technology Data Exchange (ETDEWEB)

    Mohr, Joseph J. [Munich U.; Armstrong, Robert [Penn State U.; Bertin, Emmanuel [Paris, Inst. Astrophys.; Daues, Gregory E. [NCSA, Urbana; Desai, Shantanu [Munich U.; Gower, Michelle [NCSA, Urbana; Gruendl, Robert [Illinois U., Urbana (main); Hanlon, William [Illinois U., Urbana (main); Kuropatkin, Nikolay [Fermilab; Lin, Huan [Fermilab; Marriner, John [Fermilab; Petravick, Don; Sevilla, Ignacio [Madrid, CIEMAT; Swanson, Molly [Harvard-Smithsonian Ctr. Astrophys.; Tomashek, Todd [NCSA, Urbana; Tucker, Douglas [Fermilab; Yanny, Brian [Fermilab

    2012-09-24

    The Dark Energy Survey (DES) is a 5000 deg2 grizY survey reaching characteristic photometric depths of 24th magnitude (10 sigma) and enabling accurate photometry and morphology of objects ten times fainter than in SDSS. Preparations for DES have included building a dedicated 3 deg2 CCD camera (DECam), upgrading the existing CTIO Blanco 4m telescope and developing a new high performance computing (HPC) enabled data management system (DESDM). The DESDM system will be used for processing, calibrating and serving the DES data. The total data volumes are high (~2PB), and so considerable effort has gone into designing an automated processing and quality control system. Special purpose image detrending and photometric calibration codes have been developed to meet the data quality requirements, while survey astrometric calibration, coaddition and cataloging rely on new extensions of the AstrOmatic codes which now include tools for PSF modeling, PSF homogenization, PSF corrected model fitting cataloging and joint model fitting across multiple input images. The DESDM system has been deployed on dedicated development clusters and HPC systems in the US and Germany. An extensive program of testing with small rapid turn-around and larger campaign simulated datasets has been carried out. The system has also been tested on large real datasets, including Blanco Cosmology Survey data from the Mosaic2 camera. In Fall 2012 the DESDM system will be used for DECam commissioning, and, thereafter, the system will go into full science operations.

  19. Moving beyond Traditional Methods of Survey Validation

    Science.gov (United States)

    Maul, Andrew

    2017-01-01

    In his focus article, "Rethinking Traditional Methods of Survey Validation," published in this issue of "Measurement: Interdisciplinary Research and Perspectives," Andrew Maul wrote that it is commonly believed that self-report, survey-based instruments can be used to measure a wide range of psychological attributes, such as…

  20. FY1998 report on the surveys and studies on developing next generation chemical process technologies; 1998 nendo jisedai kagaku process gijutsu kaihatsu ni kansuru chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    For further resource and energy conservation and environmental load reduction, development is necessary on innovative chemical reaction technologies. This paper describes surveys on next generation chemical processes. As non-halogen processes subject to development of new catalysts, new processes were investigated and searching experiments and discussions were given on isocyanate, propylene oxide, and phenol. Technological progress in the C1 chemistry was investigated. Problems in hydrocarbon compound oxidation, hydroxylation, and decomposition by utilizing microorganisms were put into order as application of environmentally friendly technologies. Marine biotechnical possibilities were surveyed. The surveys were given on new processes utilizing the phase transfer catalyst forming a third phase, manufacture of biodegradable plastics, and a novel reaction system combined with self-separation process using molecular assembly. Possibilities were searched on designing a truly simple production system of highly energy saving type. Such fundamental common technologies as structure analysis, property control and reaction engineering were investigated for methods to manufacture functional micro-powder chemical materials. Development was discussed on a system for technology assessment over whole product life cycle to structure a technology assessment basis. (NEDO)

  1. Graph Processing on GPUs: A Survey

    DEFF Research Database (Denmark)

    Shi, Xuanhua; Zheng, Zhigao; Zhou, Yongluan

    2018-01-01

    Graph processing at the scale of today's real-world graphs, whose sizes reach hundreds of billions of edges, has attracted much attention in both industry and academia. It still remains a great challenge to process such large-scale graphs, and researchers have been seeking new possible solutions. Because of the massive degree of parallelism and the high memory access bandwidth of GPUs, utilizing GPUs to accelerate graph processing proves to be a promising solution. This article surveys the key issues of graph processing on GPUs, including data layout, memory access pattern, workload mapping, and specific GPU programming. In this article, we summarize the state-of-the-art research on GPU-based graph processing.
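
    A compressed sparse row (CSR) layout is one common answer to the data-layout and memory-access issues such surveys discuss; the sketch below (plain Python/NumPy on the CPU, not GPU code) shows how such a layout is built. Treating CSR as the representative layout is an assumption for illustration only.

    import numpy as np

    def to_csr(num_vertices, edges):
        # edges: iterable of (src, dst) pairs; returns (row_ptr, col_idx)
        counts = np.zeros(num_vertices + 1, dtype=np.int64)
        for s, _ in edges:
            counts[s + 1] += 1
        row_ptr = np.cumsum(counts)                 # row_ptr[v]..row_ptr[v+1] indexes v's neighbors
        col_idx = np.empty(row_ptr[-1], dtype=np.int64)
        cursor = row_ptr[:-1].copy()
        for s, d in edges:
            col_idx[cursor[s]] = d
            cursor[s] += 1
        return row_ptr, col_idx

    print(to_csr(4, [(0, 1), (0, 2), (2, 3), (3, 0)]))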

  2. A Survey of Fish Production and Processing Machinery in Rivers ...

    African Journals Online (AJOL)

    A survey of fish production and processing machinery in Port Harcourt City Local Government Area of Rivers State, Nigeria, was carried out to evaluate the following: the different machines used for fish production and processing, the most acceptable machine, the effect of the cost of machinery on the fish farmer, and whether gender has ...

  3. Comparing two survey methods for estimating maternal and perinatal mortality in rural Cambodia.

    Science.gov (United States)

    Chandy, Hoeuy; Heng, Yang Van; Samol, Ha; Husum, Hans

    2008-03-01

    We need solid estimates of maternal mortality rates (MMR) to monitor the impact of maternal care programs. Cambodian health authorities and WHO report the MMR in Cambodia at 450 per 100,000 live births. The figure is drawn from surveys in which information is obtained by interviewing respondents about the survival of all their adult sisters (the sisterhood method). The estimate is statistically imprecise, with 95% confidence intervals ranging from 260 to 620/100,000. The MMR estimate is also uncertain due to under-reporting; where 80-90% of women deliver at home, maternal fatalities may go undetected, especially where mortality is highest, in remote rural areas. The aim of this study was to attain more reliable MMR estimates by using survey methods other than the sisterhood method prior to an intervention targeting rural obstetric emergencies. The study was carried out in rural Northwestern Cambodia, where access to health services is poor and poverty, endemic diseases, and land mines are widespread. Two survey methods were applied in two separate sectors: a community-based survey gathering data from public sources and a household survey gathering data directly from primary sources. There was no statistically significant difference between the two survey results for maternal deaths; both types of survey reported mortality rates around the public figure. The household survey reported a significantly higher perinatal mortality rate than the community-based survey, 8.6% versus 5.0%. The household survey also yielded qualitative data important for a better understanding of the many problems faced by mothers giving birth in the remote villages. There are detection failures in both surveys; the failure rate may be as high as 30-40%. PRINCIPAL CONCLUSION: Both survey methods are inaccurate, and therefore inappropriate for evaluation of short-term changes in mortality rates. Surveys based on primary informants yield qualitative information about mothers' hardships important for the design

  4. Formal methods for industrial critical systems a survey of applications

    CERN Document Server

    Margaria-Steffen, Tiziana

    2012-01-01

    "Today, formal methods are widely recognized as an essential step in the design process of industrial safety-critical systems. In its more general definition, the term formal methods encompasses all notations having a precise mathematical semantics, together with their associated analysis methods, that allow description and reasoning about the behavior of a system in a formal manner.Growing out of more than a decade of award-winning collaborative work within the European Research Consortium for Informatics and Mathematics, Formal Methods for Industrial Critical Systems: A Survey of Applications presents a number of mainstream formal methods currently used for designing industrial critical systems, with a focus on model checking. The purpose of the book is threefold: to reduce the effort required to learn formal methods, which has been a major drawback for their industrial dissemination; to help designers to adopt the formal methods which are most appropriate for their systems; and to offer a panel of state-of...

  5. Comparing Traditional and Crowdsourcing Methods for Pretesting Survey Questions

    Directory of Open Access Journals (Sweden)

    Jennifer Edgar

    2016-10-01

    Full Text Available Cognitive interviewing is a common method used to evaluate survey questions. This study compares traditional cognitive interviewing methods with crowdsourcing, or “tapping into the collective intelligence of the public to complete a task.” Crowdsourcing may provide researchers with access to a diverse pool of potential participants in a very timely and cost-efficient way. Exploratory work found that crowdsourcing participants, with self-administered data collection, may be a viable alternative, or addition, to traditional pretesting methods. Using three crowdsourcing designs (TryMyUI, Amazon Mechanical Turk, and Facebook), we compared the participant characteristics, costs, and quantity and quality of data with traditional laboratory-based cognitive interviews. Results suggest that crowdsourcing and self-administered protocols may be a viable way to collect survey pretesting information, as participants were able to complete the tasks and provide useful information; however, complex tasks may require the skills of an interviewer to administer unscripted probes.

  6. Innovation indicators: a survey of innovative activities in the international food processed industry

    Directory of Open Access Journals (Sweden)

    Vinicius Cardoso de Barros Fornari

    2015-02-01

    Full Text Available This paper seeks to combine traditional methods of measuring intensity with other alternative indicators to examine the dispersion of innovation activities in different industries and countries. The hypothesis that underlies the study is that, in the Food Processed Industry (IAP), the traditional methods are insufficient to detect the core of the innovation process. As method, we analyzed patent data extracted from the twenty-five largest food processed companies in the world and suggested different indicators developed from the Pesquisa de Inovação Tecnológica (PINTEC, 2010) – for Brazilian companies – and the Community Innovation Survey (CIS, 2009) – for European Union companies. The results allowed us to establish relationships in three dimensions: (i) the complexity of the innovative effort of the IAP; (ii) the efforts toward innovation in different countries are distinct; and (iii) there is heterogeneity in country performance.

  7. Surveying and assessing the hazards associated with the processing of uranium

    International Nuclear Information System (INIS)

    Kruger, J.

    1980-01-01

    The control of uranium during the milling process has not received extensive attention. The results of several surveys of surface contamination, airborne contamination and external radiation made at South African processing facilities are presented and compared with derived norms for permissible exposure to uranium dust. The routine urine sampling results are used as an indicator of personnel exposures. Results of sampling identify the main sources of airborne activity and indicate the contribution of general surface contamination levels to airborne levels. The use of surface contamination levels together with frequent air sampling for assessing the environmental conditions is illustrated. It is concluded that infrequent grab air sampling alone is not adequate for assessing the hazards during uranium processing. Detailed surveys are required and proper area and personnel access control are indicated. (H.K.)

  8. Trends in the nursing doctoral comprehensive examination process: a national survey.

    Science.gov (United States)

    Mawn, Barbara E; Goldberg, Shari

    2012-01-01

    The doctoral comprehensive or qualifying examination (CE/QE) is a traditional rite of passage into the community of scholars for the nursing profession. This exploratory, descriptive cross-sectional study examined trends in the process, timing, and methodology of comprehensive and qualifying examinations in nursing doctoral programs in the United States. Administrators from 45 schools responded to an online survey from 27 states across the country (37% response rate). Participants reported wide variations in the process. The most common method of implementation was the written take-home test (47%), two thirds of which had a subsequent oral examination. Eleven survey respondents (24%) reported using a form of the traditional written, timed, on-site examination; however, only 4 of these also followed up with an oral defense. Nine schools (20%) moved to a requirement for a written publishable paper; three schools consider the written proposal and its defense as the CE/QE. Approximately half had changed their policy in the past 5 years. With the increase in nursing doctor of philosophy programs over the past decade, information is needed to facilitate the development of methods to achieve program outcomes. An understanding of national CE/QE trends can provide a starting point for discussion and allow innovative ideas to meet the need of individual programs. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Studying Cannabis Use Behaviors With Facebook and Web Surveys: Methods and Insights

    Science.gov (United States)

    2018-01-01

    The rapid and wide-reaching expansion of internet access and digital technologies offers epidemiologists numerous opportunities to study health behaviors. One particularly promising new data collection strategy is the use of Facebook’s advertising platform in conjunction with Web-based surveys. Our research team at the Center for Technology and Behavioral Health has used this quick and cost-efficient method to recruit large samples and address unique scientific questions related to cannabis use. In conducting this research, we have gleaned several insights for using this sampling method effectively and have begun to document the characteristics of the resulting data. We believe this information could be useful to other researchers attempting to study cannabis use or, potentially, other health behaviors. The first aim of this paper is to describe case examples of procedures for using Facebook as a survey sampling method for studying cannabis use. We then present several distinctive features of the data produced using this method. Finally, we discuss the utility of this sampling method for addressing specific types of epidemiological research questions. Overall, we believe that sampling with Facebook advertisements and Web surveys is best conceptualized as a targeted, nonprobability-based method for oversampling cannabis users across the United States. PMID:29720366

  10. International survey of methods used in health technology assessment (HTA): does practice meet the principles proposed for good research?

    Directory of Open Access Journals (Sweden)

    Stephens JM

    2012-08-01

    Full Text Available Jennifer M Stephens,1 Bonnie Handke,2 Jalpa A Doshi3 On behalf of the HTA Principles Working Group, part of the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) HTA Special Interest Group (SIG). 1Pharmerit International, Bethesda, MD, USA; 2Medtronic Neuromodulation, Minneapolis, MN, USA; 3Center for Evidence-Based Practice and Center for Health Incentives and Behavioral Economics, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA. Objective: To describe research methods used internationally in health technology assessment (HTA) and health-care reimbursement policies; compare the survey findings on research methods and processes to published HTA principles; and discuss important issues/trends reported by HTA bodies related to current research methods and applications of the HTA process. Methods: Representatives from HTA bodies worldwide were recruited to complete an online survey consisting of 47 items within four topics: (1) organizational information and process, (2) primary HTA methodologies and importance of attributes, (3) HTA application and dissemination, and (4) quality of HTA, including key issues. Results were presented as a comparison of current HTA practices and research methods to published HTA principles. Results: The survey was completed by 30 respondents representing 16 countries in five major regions: Australia (n = 3), Canada (n = 2), Europe (n = 17), Latin America (n = 2), and the United States (n = 6). The most common methodologies used were systematic review, meta-analysis, and economic modeling. The most common attributes evaluated were effectiveness (more commonly than efficacy), cost-effectiveness, safety, and quality of life. The attributes assessed, the relative importance of the attributes, and conformance with HTA principles varied by region/country. Key issues and trends facing HTA bodies included standardizing methods for economic evaluations and grading of evidence, lack of evidence

  11. Design and methodology of a mixed methods follow-up study to the 2014 Ghana Demographic and Health Survey.

    Science.gov (United States)

    Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys

    2017-01-01

    The intended meaning behind responses to standard questions posed in large-scale health surveys are not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS). The study was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected sub-sample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, filtering, successful use of tablets, and share lessons learned for future such follow-up surveys.

  12. A method for energy and exergy analyses of product transformation processes in industry

    International Nuclear Information System (INIS)

    Abou Khalil, B.

    2008-12-01

    After a literature survey identifying the advantages and drawbacks of existing methods for assessing the potential energy gains of an industrial site, this research report presents a newly developed method named Energy and Exergy Analysis of Transformation Processes (AEEP, for Analyse energetique et exergetique des procedes de transformation), applied to actual industrial operations in order to demonstrate its systematic character. The different steps of the method are presented and detailed; one of them, the process analysis, is critical for the application of the developed method. This particular step is then applied to several industrial unitary operations, both to serve as a basis for future energy audits in the industry sectors concerned and to demonstrate its generic and systematic character. The method is then applied in a global manner to a cheese manufacturing plant, with all the steps of the AEEP being carried out. The author demonstrates that the AEEP is a systematic method that can be applied at all energy audit levels, including the lowest levels, which have a relatively low cost

  13. Survey Method for Radiological Surveys of 300-FF-1 Operable Unit Soils and Material

    International Nuclear Information System (INIS)

    Brehm, D.M.

    1998-06-01

    This technical basis is to be used to survey soils at the 300-FF-1 Operable Unit during remediation of the site. Its purpose is to provide a basis for the survey methods to be employed by radiological control technicians (RCTs) to guide the excavation effort in accordance with the 300-FF-1 waste site Record of Decision (ROD). The ROD for the 300-FF-1 Operable Unit requires selective excavation, removal, and disposal of contaminated soil above 350 pCi/g total uranium activity. Soil above this level will be disposed of as radioactive waste. The remaining soil will remain onsite

  14. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them with each other and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are supposed to be performed strictly according to written operational procedures in which operators are well trained, so a method of task analysis for operators' tasks in NPPs can be established with unique characteristics based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  15. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them with each other and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are supposed to be performed strictly according to written operational procedures in which operators are well trained, so a method of task analysis for operators' tasks in NPPs can be established with unique characteristics based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  16. Studying Cannabis Use Behaviors With Facebook and Web Surveys: Methods and Insights.

    Science.gov (United States)

    Borodovsky, Jacob T; Marsch, Lisa A; Budney, Alan J

    2018-05-02

    The rapid and wide-reaching expansion of internet access and digital technologies offers epidemiologists numerous opportunities to study health behaviors. One particularly promising new data collection strategy is the use of Facebook's advertising platform in conjunction with Web-based surveys. Our research team at the Center for Technology and Behavioral Health has used this quick and cost-efficient method to recruit large samples and address unique scientific questions related to cannabis use. In conducting this research, we have gleaned several insights for using this sampling method effectively and have begun to document the characteristics of the resulting data. We believe this information could be useful to other researchers attempting to study cannabis use or, potentially, other health behaviors. The first aim of this paper is to describe case examples of procedures for using Facebook as a survey sampling method for studying cannabis use. We then present several distinctive features of the data produced using this method. Finally, we discuss the utility of this sampling method for addressing specific types of epidemiological research questions. Overall, we believe that sampling with Facebook advertisements and Web surveys is best conceptualized as a targeted, nonprobability-based method for oversampling cannabis users across the United States. ©Jacob T Borodovsky, Lisa A Marsch, Alan J Budney. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 02.05.2018.

  17. Processing module operating methods, processing modules, and communications systems

    Science.gov (United States)

    McCown, Steven Harvey; Derr, Kurt W.; Moore, Troy

    2014-09-09

    A processing module operating method includes using a processing module physically connected to a wireless communications device, requesting that the wireless communications device retrieve encrypted code from a web site and receiving the encrypted code from the wireless communications device. The wireless communications device is unable to decrypt the encrypted code. The method further includes using the processing module, decrypting the encrypted code, executing the decrypted code, and preventing the wireless communications device from accessing the decrypted code. Another processing module operating method includes using a processing module physically connected to a host device, executing an application within the processing module, allowing the application to exchange user interaction data communicated using a user interface of the host device with the host device, and allowing the application to use the host device as a communications device for exchanging information with a remote device distinct from the host device.
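
    The first flow described above can be sketched as a toy analogue (not the patented implementation): the host only ferries an encrypted payload, while the module holds the key, decrypts internally, executes, and never returns the plaintext. The use of the cryptography package's Fernet cipher and the fetch_encrypted callback are illustrative assumptions.

    from cryptography.fernet import Fernet  # assumed third-party dependency

    class ProcessingModule:
        def __init__(self, key: bytes):
            self._fernet = Fernet(key)          # key is held only inside the module

        def run_remote_code(self, fetch_encrypted):
            ciphertext = fetch_encrypted()      # host/wireless device retrieves encrypted bytes
            plaintext = self._fernet.decrypt(ciphertext)
            # Executed inside the module only; nothing is returned to the host
            exec(compile(plaintext, "<module-code>", "exec"), {})

    # Toy round trip with a locally generated key and payload
    key = Fernet.generate_key()
    module = ProcessingModule(key)
    module.run_remote_code(lambda: Fernet(key).encrypt(b"print('hello from the module')"))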

  18. Measuring fertility through mobile‒phone based household surveys: Methods, data quality, and lessons learned from PMA2020 surveys

    OpenAIRE

    Yoonjoung Choi; Qingfeng Li; Blake Zachary

    2018-01-01

    Background: PMA2020 is a survey platform with resident enumerators using mobile phones. Instead of collecting full birth history, total fertility rates (TFR) have been measured with a limited number of questions on recent births. Employing new approaches provides opportunities to test and advance survey methods. Objective: This study aims to assess the quality of fertility data in PMA2020 surveys, focusing on bias introduced from the questionnaire and completeness and distribution of birth...

  19. Proper survey methods for research of aquatic plant ecology and management

    Science.gov (United States)

    Proper survey methods are essential for objective, quantitative assessment of the distribution and abundance of aquatic plants as part of research and demonstration efforts. For research, the use of the appropriate method is an essential part of the scientific method, to ensure that the experimenta...

  20. Research and application of soil-mercury-surveys method for locating uranium

    International Nuclear Information System (INIS)

    You Yunfei; Lu Shili; Jiao Zongrun

    1995-06-01

    A soil-Hg-survey method for locating uranium ore is presented. With a soil sampler that collects from the bottom of a drill hole, the ability of the method to survey deep uranium orebodies is improved. Application of minicomputer technology to pyrolytic Hg analysis raises the degree of automation and the precision of the analysis. The optimum application condition is an Hg content of the orebodies >1 x 10^-6. The locating depth is about 100 m. Forecasts of uranium orebodies were successful in two previously unknown sections, the 534 and 510 mining areas, so two small-size deposits were expanded into middle-size deposits. This method is also applicable to locating gold, silver, copper, lead, zinc, oil-gas natural resources and so on. (8 figs., 3 tabs.)

  1. Beowulf Distributed Processing and the United States Geological Survey

    Science.gov (United States)

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing

  2. Don't spin the pen: two alternative methods for second-stage sampling in urban cluster surveys

    Directory of Open Access Journals (Sweden)

    Rose Angela MC

    2007-06-01

    Full Text Available Abstract In two-stage cluster surveys, the traditional method used in second-stage sampling (in which the first household in a cluster is selected) is time-consuming and may result in biased estimates of the indicator of interest. Firstly, a random direction from the center of the cluster is selected, usually by spinning a pen. The houses along that direction are then counted out to the boundary of the cluster, and one is then selected at random to be the first household surveyed. This process favors households towards the center of the cluster, but it could easily be improved. During a recent meningitis vaccination coverage survey in Maradi, Niger, we compared this method of first household selection to two alternatives in urban zones: (1) using a superimposed grid on the map of the cluster area and randomly selecting an intersection; and (2) drawing the perimeter of the cluster area using a Global Positioning System (GPS) and randomly selecting one point within the perimeter. Although we only compared a limited number of clusters using each method, we found the sampling grid method to be the fastest and easiest for field survey teams, although it does require a map of the area. Selecting a random GPS point was also found to be a good method, once adequate training can be provided. Spinning the pen and counting households to the boundary was the most complicated and time-consuming. The two methods tested here represent simpler, quicker and potentially more robust alternatives to spinning the pen for cluster surveys in urban areas. However, in rural areas, these alternatives would favor initial household selection from lower density (or even potentially empty) areas. Bearing in mind these limitations, as well as available resources and feasibility, investigators should choose the most appropriate method for their particular survey context.
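
    The random-GPS-point alternative described above amounts to drawing a uniform point inside the cluster polygon; a minimal sketch is given below, assuming the shapely package and a GPS-walked perimeter supplied as a list of coordinates (both assumptions, not the authors' field procedure).

    import random
    from shapely.geometry import Point, Polygon  # assumed third-party dependency

    def random_point_in_cluster(boundary_coords, seed=None):
        # boundary_coords: list of (lon, lat) vertices walked with a GPS around the cluster perimeter
        rng = random.Random(seed)
        poly = Polygon(boundary_coords)
        minx, miny, maxx, maxy = poly.bounds
        while True:
            # rejection sampling: draw in the bounding box until the point falls inside the polygon
            p = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
            if poly.contains(p):
                return p.x, p.y

    print(random_point_in_cluster([(7.09, 13.49), (7.11, 13.49), (7.11, 13.51), (7.09, 13.51)], seed=1))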

  3. Underwater photography - A visual survey method

    Digital Repository Service at National Institute of Oceanography (India)

    Sharma, R.

    Introduction: “Photography as a means of observing ...” The earliest deep-sea photographs were those made by Maurice Ewing and his co-workers during cruises on Atlantis in 1940-48. Their subject was the seafloor, and their method of clicking was to trigger the camera mechanically when its mounting struck bottom. This is the only...

  4. A field survey on coffee beans drying methods of Indonesian small holder farmers

    Science.gov (United States)

    Siagian, Parulian; Setyawan, Eko Y.; Gultom, Tumiur; Napitupulu, Farel H.; Ambarita, Himsar

    2017-09-01

    Drying of agricultural products is a post-harvest process that consumes significant energy and can affect the quality of the product. This paper presents a literature review and a field survey of the coffee-bean drying methods of Indonesian smallholder farmers. The objective is to supply the information necessary for developing a continuous solar drier. The results show that the intermittent character of sun drying yields better-quality coffee beans in comparison with constant convective drying. In order to use energy efficiently, the drying process should be divided into several stages: in the first stage, when the moisture content is high, a higher drying air temperature is more effective; after this step, when the moisture content is low, a lower drying air temperature is better. The field survey of coffee-bean drying in Sumatera Utara province reveals that the drying processes used are very traditional. They can be divided into two modes, depending on the type of coffee bean. Arabica coffee is first fermented and dried to a moisture content of 80% by sun drying, followed by greenhouse drying down to a moisture content of about 12%; the latter typically takes 3 days of drying time. Robusta coffee, on the other hand, is dried by exposing it to the sun directly without any treatment. After the coffee beans are dried, they are peeled. These findings can be considered in developing a continuous solar drier suitable for coffee-bean drying.

  5. Global Positioning System (GPS) survey of Augustine Volcano, Alaska, August 3-8, 2000: data processing, geodetic coordinates and comparison with prior geodetic surveys

    Science.gov (United States)

    Pauk, Benjamin A.; Power, John A.; Lisowski, Mike; Dzurisin, Daniel; Iwatsubo, Eugene Y.; Melbourne, Tim

    2001-01-01

    Between August 3 and 8, 2000, the Alaska Volcano Observatory completed a Global Positioning System (GPS) survey at Augustine Volcano, Alaska. Augustine is a frequently active calc-alkaline volcano located in the lower portion of Cook Inlet (fig. 1), with reported eruptions in 1812, 1882, 1909?, 1935, 1964, 1976, and 1986 (Miller et al., 1998). Geodetic measurements using electronic and optical surveying techniques (EDM and theodolite) were begun at Augustine Volcano in 1986. In 1988 and 1989, an island-wide trilateration network comprising 19 benchmarks was completed and measured in its entirety (Power and Iwatsubo, 1998). Partial GPS surveys of the Augustine Island geodetic network were completed in 1992 and 1995; however, neither of these surveys included all marks on the island. Additional GPS measurements of benchmarks A5 and A15 (fig. 2) were made during the summers of 1992, 1993, 1994, and 1996. The goals of the 2000 GPS survey were to: 1) re-measure all existing benchmarks on Augustine Island using a homogeneous set of GPS equipment operated in a consistent manner, 2) add measurements at benchmarks on the western shore of Cook Inlet at distances of 15 to 25 km, 3) add measurements at an existing benchmark (BURR) on Augustine Island that was not previously surveyed, and 4) add additional marks in areas of the island thought to be actively deforming. The entire survey resulted in the collection of GPS data at a total of 24 sites (figs. 1 and 2). In this report we describe the methods of GPS data collection and processing used at Augustine during the 2000 survey. We use these data to calculate coordinates and elevations for all 24 sites surveyed. Data from the 2000 survey are then compared to electronic and optical measurements made in 1988 and 1989. This report also contains a general description of all marks surveyed in 2000 and photographs of all new marks established during the 2000 survey (Appendix A).

  6. Microencapsulation and Electrostatic Processing Method

    Science.gov (United States)

    Morrison, Dennis R. (Inventor); Mosier, Benjamin (Inventor)

    2000-01-01

    Methods are provided for forming spherical multilamellar microcapsules having alternating hydrophilic and hydrophobic liquid layers, surrounded by flexible, semi-permeable hydrophobic or hydrophilic outer membranes which can be tailored specifically to control the diffusion rate. The methods of the invention rely on low shear mixing and liquid-liquid diffusion process and are particularly well suited for forming microcapsules containing both hydrophilic and hydrophobic drugs. These methods can be carried out in the absence of gravity and do not rely on density-driven phase separation, mechanical mixing or solvent evaporation phases. The methods include the process of forming, washing and filtering microcapsules. In addition, the methods contemplate coating microcapsules with ancillary coatings using an electrostatic field and free fluid electrophoresis of the microcapsules. The microcapsules produced by such methods are particularly useful in the delivery of pharmaceutical compositions.

  7. Combining Internet-Based and Postal Survey Methods in a Survey among Gynecologists: Results of a Randomized Trial.

    Science.gov (United States)

    Ernst, Sinja Alexandra; Brand, Tilman; Lhachimi, Stefan K; Zeeb, Hajo

    2018-04-01

    To assess whether a combination of Internet-based and postal survey methods (mixed-mode) compared to postal-only survey methods (postal-only) leads to improved response rates in a physician survey, and to compare the cost implications of the different recruitment strategies. All primary care gynecologists in Bremen and Lower Saxony, Germany, were invited to participate in a cross-sectional survey from January to July 2014. The sample was divided into two strata (A; B) depending on availability of an email address. Within each stratum, potential participants were randomly assigned to mixed-mode or postal-only group. In Stratum A, the mixed-mode group had a lower response rate compared to the postal-only group (12.5 vs. 20.2 percent; RR = 0.61, 95 percent CI: 0.44-0.87). In stratum B, no significant differences were found (15.6 vs. 16.2 percent; RR = 0.95, 95 percent CI: 0.62-1.44). Total costs (in €) per valid questionnaire returned (Stratum A: 399.72 vs. 248.85; Stratum B: 496.37 vs. 455.15) and per percentage point of response (Stratum A: 1,379.02 vs. 861.02; Stratum B 1,116.82 vs. 1,024.09) were higher, whereas variable costs were lower in mixed-mode compared to the respective postal-only groups (Stratum A cost ratio: 0.47, Stratum B cost ratio: 0.71). In this study, primary care gynecologists were more likely to participate by traditional postal-only than by mixed-mode survey methods that first offered an Internet option. However, the lower response rate for the mixed-mode method may be partly due to the older age structure of the responding gynecologists. Variable costs per returned questionnaire were substantially lower in mixed-mode groups and indicate the potential for cost savings if the sample population is sufficiently large. © Health Research and Educational Trust.
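
    The reported effect sizes are response-rate ratios with log-normal confidence intervals; a small sketch of that computation is shown below. The counts used in the example are hypothetical, chosen only so that the rates roughly match those reported for Stratum A; they are not the study's actual denominators.

    from math import exp, sqrt

    def rate_ratio(responders_a, n_a, responders_b, n_b):
        # Group a = mixed-mode, group b = postal-only; 95% CI on the log scale
        p_a, p_b = responders_a / n_a, responders_b / n_b
        rr = p_a / p_b
        se = sqrt(1 / responders_a - 1 / n_a + 1 / responders_b - 1 / n_b)
        return rr, (rr * exp(-1.96 * se), rr * exp(1.96 * se))

    # Hypothetical counts giving ~12.5% vs ~20% response, RR ~0.62
    print(rate_ratio(50, 400, 81, 400))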

  8. Measuring fertility through mobile‒phone based household surveys: Methods, data quality, and lessons learned from PMA2020 surveys

    Directory of Open Access Journals (Sweden)

    Yoonjoung Choi

    2018-05-01

    Background: PMA2020 is a survey platform with resident enumerators using mobile phones. Instead of collecting full birth history, total fertility rates (TFR) have been measured with a limited number of questions on recent births. Employing new approaches provides opportunities to test and advance survey methods. Objective: This study aims to assess the quality of fertility data in PMA2020 surveys, focusing on bias introduced from the questionnaire and completeness and distribution of birth month and year, and to estimate TFR adjusted for identified data quality issues. Methods: To assess underestimation from the questionnaire, we simulated births that would be counted using the PMA2020 questionnaires compared to births identified from full birth history. We analyzed the latest Demographic and Health Surveys in ten countries where PMA2020 surveys have been implemented. We assessed the level of reporting completeness for birth month and year and heaping of birth month, analyzing 39 PMA2020 surveys. Finally, TFR were calculated and adjusted for biases introduced from the questionnaire and heaping in birth month. Results: Simple questions introduced minor bias from undercounting multiple births, which was expected and correctable. Meanwhile, incomplete reporting of birth month was relatively high, and the default value of January in data collection software systematically moved births with missing months out of the reference period. On average across the 39 surveys, TFR increased by 1.6% and 2.4% when adjusted for undercounted multiple births and heaping on January, respectively. Contribution: This study emphasizes the importance of enumerator training and provides critical insight into software programming in surveys using mobile technologies.
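
    The January-default problem described in the results lends itself to a small worked example: when a birth's month is unknown and the software stores January by default, that birth can fall outside (or inside) the reference period by accident, and one crude correction is to impute a uniform random month for month-missing records before counting recent births. The sketch below is a hypothetical illustration of that idea only, not PMA2020's actual adjustment procedure; the records and reference window are invented.

```python
import random

random.seed(1)

births = [
    # (birth_year, birth_month, month_was_reported)
    (2015, 1, False),   # month missing -> software default of January
    (2015, 6, True),
    (2016, 1, False),
    (2016, 11, True),
    (2017, 1, True),    # genuinely born in January
]

def impute_month(record):
    """Replace the default January with a uniformly drawn month when the month is missing."""
    year, month, reported = record
    return year, (month if reported else random.randint(1, 12))

def births_in_window(records, start=(2015, 7), end=(2017, 6)):
    """Count births falling inside a reference window given as (year, month) bounds."""
    s, e = start[0] * 12 + start[1], end[0] * 12 + end[1]
    return sum(1 for y, m in map(impute_month, records) if s <= y * 12 + m <= e)

print("births counted in the reference window:", births_in_window(births))
```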

  9. [Data validation methods and discussion on Chinese materia medica resource survey].

    Science.gov (United States)

    Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing

    2013-07-01

    Since the beginning of the fourth national survey of Chinese materia medica resources, 22 provinces have conducted pilot surveys. The survey teams have reported an immense amount of data, which places very high demands on the construction of the database system. In order to ensure data quality, it is necessary to check and validate the data in the database system. Data validation is an important method for ensuring the validity, integrity and accuracy of census data. This paper comprehensively introduces the data validation system of the database for the fourth national survey of Chinese materia medica resources, and further refines the design ideas and procedures for data validation. The purpose of this study is to help the survey work proceed smoothly.

  10. Does the Underground Sidewall Station Survey Method Meet MHSA ...

    African Journals Online (AJOL)

    Grobler, Hendrik

    The underground survey network in a deep level platinum mine in ... The time duration for peg installation during the initial phase of learning the method was ..... changes to the survey “hardware” including prisms, stems and attachment points ...

  11. An automated radiological survey method for performing site remediation and decommissioning

    International Nuclear Information System (INIS)

    Handy, R.G.; Bolch, W.E.; Harder, G.F.; Tolaymat, T.M.

    1994-01-01

    A portable, computer-based method of performing environmental monitoring and assessment for site remediation and decommissioning has been developed. The integrated system has been developed to provide for survey time reductions and real-time data analysis. The technique utilizes a notebook 486 computer with the necessary hardware and software components that makes it possible to be used in an almost unlimited number of environmental monitoring and assessment scenarios. The results from a pilot "hide-and-seek" gamma survey and an actual alpha decontamination survey were elucidated. It was found that a "hide-and-seek" survey could come up with timely and accurate conclusions about the position of the source. The use of the automated system in a Th-232 alpha survey resulted in a reduction in the standard time necessary to do a radiological survey. In addition, the ability to analyze the data on-site allowed for identification and location of areas which needed further decontamination. Finally, a discussion on possible future improvements and field conclusions was made.

  12. Features of digital photogrammetry methods application and image processing in small and medium-sized enterprises

    Directory of Open Access Journals (Sweden)

    Samsonova N. V.

    2018-05-01

    The paper discusses methods for the effective training of survey-enterprise employees in the use of modern measurement systems and in the further photogrammetric processing of the results obtained. Attention is also paid to integrated learning, based primarily on web content, and to the introduction of a social component in mastering new photogrammetric equipment and technologies, in order to increase the competitiveness of small and medium-sized engineering and research enterprises.

  13. HTA and decision-making processes in Central, Eastern and South Eastern Europe: Results from a survey.

    Science.gov (United States)

    García-Mochón, Leticia; Espín Balbino, Jaime; Olry de Labry Lima, Antonio; Caro Martinez, Araceli; Martin Ruiz, Eva; Pérez Velasco, Román

    2017-03-31

    To gain knowledge and insights on health technology assessment (HTA) and decision-making processes in Central, Eastern and South Eastern Europe (CESEE) countries. A cross-sectional study was performed. Based on the literature, a questionnaire was developed in a multi-stage process. The questionnaire was arranged according to 5 broad domains: (i) introduction/country settings; (ii) use of HTA in the country; (iii) decision-making process; (iv) implementation of decisions; and (v) HTA and decision-making: future challenges. Potential survey respondents were identified through literature review, with a total of 118 contacts from the 24 CESEE countries. From March to July 2014, the survey was administered via e-mail. A total of 22 questionnaires were received, generating an 18.6% response rate, including 4 responses indicating that their institutions had no involvement in HTA. Most of the CESEE countries have entities under government mandates with advisory functions and different responsibilities for decision-making, but mainly in charge of the reimbursement and pricing of medicines. Other areas where discrepancies across countries were found include criteria for selecting technologies to be assessed, stakeholder involvement, evidence requirements, use of economic evaluation, and timeliness of HTA. A number of CESEE countries have created formal decision-making processes for which HTA is used. However, there is a high level of heterogeneity related to the degree of development of HTA structures, and the methods and processes followed. Further studies focusing on the countries from which information is scarcer and on the HTA of health technologies other than medicines are warranted. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Training practices of cell processing laboratory staff : Analysis of a survey by the Alliance for Harmonization of Cellular Therapy Accreditation

    NARCIS (Netherlands)

    Keever-Taylor, Carolyn A.; Slaper-Cortenbach, Ineke; Celluzzi, Christina; Loper, Kathy; Aljurf, Mahmoud; Schwartz, Joseph; Mcgrath, Eoin; Eldridge, Paul

    2015-01-01

    Background aims: Methods for processing products used for hematopoietic progenitor cell (HPC) transplantation must ensure their safety and efficacy. Personnel training and ongoing competency assessment is critical to this goal. Here we present results from a global survey of methods used by a

  15. The automated data processing architecture for the GPI Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce

    2017-09-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
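
    As a rough illustration of the indexing idea described above (every file registered in a backend database so that a front-end can browse it), the sketch below uses Python's built-in sqlite3 as a stand-in for the MySQL backend; the table and column names are hypothetical and are not the GPIES schema.

```python
import hashlib
import pathlib
import sqlite3
import time

# Minimal file-indexing sketch; sqlite3 stands in for the MySQL database
# described in the abstract, and the schema below is invented for illustration.
conn = sqlite3.connect("survey_index_demo.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS data_files (
        path       TEXT PRIMARY KEY,
        sha1       TEXT NOT NULL,
        size_bytes INTEGER NOT NULL,
        indexed_at REAL NOT NULL
    )
""")

def index_file(path: pathlib.Path) -> None:
    """Insert or update one file's metadata so a front end can browse it."""
    digest = hashlib.sha1(path.read_bytes()).hexdigest()
    conn.execute(
        "INSERT OR REPLACE INTO data_files VALUES (?, ?, ?, ?)",
        (str(path), digest, path.stat().st_size, time.time()),
    )
    conn.commit()

for product in pathlib.Path(".").glob("*.fits"):   # e.g. reduced data products
    index_file(product)
print(conn.execute("SELECT COUNT(*) FROM data_files").fetchone()[0], "files indexed")
```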

  16. Accelerator and transport line survey and alignment

    International Nuclear Information System (INIS)

    Ruland, R.E.

    1991-10-01

    This paper summarizes the survey and alignment processes of accelerators and transport lines and discusses the propagation of errors associated with these processes. The major geodetic principles governing the survey and alignment measurement space are introduced and their relationship to a lattice coordinate system shown. The paper continues with a broad overview about the activities involved in the step sequence from initial absolute alignment to final smoothing. Emphasis is given to the relative alignment of components, in particular to the importance of incorporating methods to remove residual systematic effects in surveying and alignment operations. Various approaches to smoothing used at major laboratories are discussed. 47 refs., 19 figs., 1 tab

  17. Targets, drivers and metrics in software process improvement: results of a survey in a multinational organization

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Genuchten, van M.J.I.M.; Aerts, H.

    2007-01-01

    This paper reports on a survey amongst software groups in a multinational organization. The survey was initiated by the Software Process Improvement (SPI) Steering Committee of Philips, a committee that monitors the status and quality of software process improvement in the global organization. The

  18. Survey Shows Variation in Ph.D. Methods Training.

    Science.gov (United States)

    Steeves, Leslie; And Others

    1983-01-01

    Reports on a 1982 survey of journalism graduate studies indicating considerable variation in research methods requirements and emphases in 23 universities offering doctoral degrees in mass communication. (HOD)

  19. Business process management: a survey

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Hofstede, ter A.H.M.; Weske, M.H.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.; Weske, M.H.

    2003-01-01

    Business Process Management (BPM) includes methods, techniques, and tools to support the design, enactment, management, and analysis of operational business processes. It can be considered as an extension of classical Workflow Management (WFM) systems and approaches. Although the practical relevance

  20. A Survey of Formal Methods in Software Development

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2012-01-01

    The use of formal methods and formal techniques in industry is steadily growing. In this survey we shall characterise what we mean by software development and by a formal method; briefly overview a history of formal specification languages - some of which are: VDM (Vienna Development Method, 1974...... need for multi-language formalisation (Petri Nets, MSC, StateChart, Temporal Logics); the sociology of university and industry acceptance of formal methods; the inevitability of the use of formal software development methods; while referring to seminal monographs and textbooks on formal methods....

  1. Model of cognitive processes and conversational principles in survey interview interaction

    NARCIS (Netherlands)

    Ongena, Y.P.; Dijkstra, W.

    2007-01-01

    In this paper we provide a model of interviewer-respondent interaction in survey interviews. Our model is primarily focused on the occurrence of problems within this interaction that seem likely to affect data quality. Both conversational principles and cognitive processes, especially where they do

  2. Exploring selection and recruitment processes for newly qualified nurses: a sequential-explanatory mixed-method study.

    Science.gov (United States)

    Newton, Paul; Chandler, Val; Morris-Thomson, Trish; Sayer, Jane; Burke, Linda

    2015-01-01

    To map current selection and recruitment processes for newly qualified nurses and to explore the advantages and limitations of current selection and recruitment processes. The need to improve current selection and recruitment practices for newly qualified nurses is highlighted in health policy internationally. A cross-sectional, sequential-explanatory mixed-method design with 4 components: (1) Literature review of selection and recruitment of newly qualified nurses; (2) Literature review of a public sector profession's selection and recruitment processes; (3) Survey mapping existing selection and recruitment processes for newly qualified nurses; and (4) Qualitative study about recruiters' selection and recruitment processes. Literature searches on the selection and recruitment of newly qualified candidates in teaching and nursing (2005-2013) were conducted. Cross-sectional, mixed-method data were collected using a survey instrument from thirty-one (n = 31) individuals at health providers in London who had responsibility for the selection and recruitment of newly qualified nurses. Of these providers who took part, six (n = 6) were purposively selected to be interviewed qualitatively. Issues of supply and demand in the workforce, rather than selection and recruitment tools, predominated in the literature reviews. Examples of tools to measure values, attitudes and skills were found in the nursing literature. The mapping exercise found that providers used many selection and recruitment tools; some providers combined tools to streamline the process and assure the quality of candidates. Most providers had processes which addressed the issue of quality in the selection and recruitment of newly qualified nurses. The 'assessment centre model', which providers were adopting, allowed for multiple levels of assessment and streamlined recruitment. There is a need to validate the efficacy of the selection tools. © 2014 John Wiley & Sons Ltd.

  3. Estimating health expectancies from two cross-sectional surveys: The intercensal method

    Directory of Open Access Journals (Sweden)

    Michel Guillot

    2009-10-01

    Health expectancies are key indicators for monitoring the health of populations, as well as for informing debates about compression or expansion of morbidity. However, current methodologies for estimating them are not entirely satisfactory. They are either of limited applicability because of high data requirements (the multistate method) or based on questionable assumptions (the Sullivan method). This paper proposes a new method, called the "intercensal" method, which relies on the multistate framework but uses widely available data. The method uses age-specific proportions "healthy" at two successive, independent cross-sectional health surveys, and, together with information on general mortality, solves for the set of transition probabilities that produces the observed sequence of proportions healthy. The system is solved by making realistic parametric assumptions about the age patterns of transition probabilities. Using data from the Health and Retirement Survey (HRS) and from the National Health Interview Survey (NHIS), the method is tested against both the multistate method and the Sullivan method. We conclude that the intercensal approach is a promising framework for the indirect estimation of health expectancies.
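
    For context, the Sullivan method that the intercensal approach is tested against combines a period life table with age-specific proportions healthy: health expectancy at age x is the sum of person-years lived above x, weighted by the proportion healthy at each age, divided by the survivors l(x). The sketch below uses invented toy values, not HRS or NHIS estimates.

```python
# Prevalence-based (Sullivan) health expectancy from toy life-table inputs.
ages         = [50, 55, 60, 65, 70, 75]              # start of each age interval
L            = [4.8, 4.7, 4.5, 4.2, 3.7, 6.0]        # person-years lived in each interval (unit radix)
l            = [0.97, 0.95, 0.92, 0.87, 0.80, 0.70]  # survivorship to interval start
prop_healthy = [0.90, 0.86, 0.80, 0.72, 0.62, 0.45]  # age-specific proportion healthy

def sullivan_health_expectancy(start_index: int) -> float:
    """Healthy life expectancy at ages[start_index] under the Sullivan method."""
    healthy_person_years = sum(p * Lx for p, Lx in zip(prop_healthy[start_index:], L[start_index:]))
    return healthy_person_years / l[start_index]

print(f"Healthy life expectancy at age {ages[0]}: {sullivan_health_expectancy(0):.1f} years")
```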

  4. Survey of sterile admixture practices in canadian hospital pharmacies: part 1. Methods and results.

    Science.gov (United States)

    Warner, Travis; Nishi, Cesilia; Checkowski, Ryan; Hall, Kevin W

    2009-03-01

    The 1996 Guidelines for Preparation of Sterile Products in Pharmacies of the Canadian Society of Hospital Pharmacists (CSHP) represent the current standard of practice for sterile compounding in Canada. However, these guidelines are practice recommendations, not enforceable standards. Previous surveys of sterile compounding practices have shown that actual practice deviates markedly from voluntary practice recommendations. In 2004, the United States Pharmacopeia (USP) published its "General Chapter Pharmaceutical Compounding-Sterile Preparations", which set a more rigorous and enforceable standard for sterile compounding in the United States. To assess sterile compounding practices in Canadian hospital pharmacies and to compare them with current CSHP recommendations and USP chapter standards. An online survey, based on previous studies of sterile compounding practices, the CSHP guidelines, and the chapter standards, was created and distributed to 193 Canadian hospital pharmacies. A total of 133 pharmacies completed at least part of the survey, for a response rate of 68.9%. All respondents reported the preparation of sterile products. Various degrees of deviation from the practice recommendations were noted for virtually all areas of the CSHP guidelines and the USP standards. Low levels of compliance were most notable in the areas of facilities and equipment, process validation, and product testing. Availability in the central pharmacy of a clean room facility meeting or exceeding the criteria of International Organization for Standardization (ISO) class 8 is a requirement of the chapter standards, but more than 40% of responding pharmacies reported that they did not have such a facility. Higher levels of compliance were noted for policies and procedures, garbing requirements, aseptic technique, and handling of hazardous products. Part 1 of this series reports the survey methods and results relating to policies, personnel, raw materials, storage and handling

  5. Predicting survey responses: how and why semantics shape survey statistics on organizational behaviour.

    Directory of Open Access Journals (Sweden)

    Jan Ketil Arnulf

    Some disciplines in the social sciences rely heavily on collecting survey responses to detect empirical relationships among variables. We explored whether these relationships were a priori predictable from the semantic properties of the survey items, using language processing algorithms which are now available as new research methods. Language processing algorithms were used to calculate the semantic similarity among all items in state-of-the-art surveys from Organisational Behaviour research. These surveys covered areas such as transformational leadership, work motivation and work outcomes. This information was used to explain and predict the response patterns from real subjects. Semantic algorithms explained 60-86% of the variance in the response patterns and allowed remarkably precise prediction of survey responses from humans, except in a personality test. Even the relationships between independent and their purported dependent variables were accurately predicted. This raises concern about the empirical nature of data collected through some surveys if results are already given a priori through the way subjects are being asked. Survey response patterns seem heavily determined by semantics. Language algorithms may suggest these prior to administering a survey. This study suggests that semantic algorithms are becoming new tools for the social sciences, opening perspectives on survey responses that prevalent psychometric theory cannot explain.
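
    As a simplified illustration of the idea, the sketch below scores the semantic similarity of survey items with TF-IDF cosine similarity (a much cruder stand-in for the language algorithms used in the study) and compares it with the empirical inter-item correlations of some randomly generated responses. Items and response data are invented.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

items = [
    "My leader communicates a clear and positive vision of the future",
    "My leader encourages staff with a clear vision of where we are going",
    "I feel energised and motivated by my daily work tasks",
]

# Item-by-item semantic similarity computed from the item wordings alone.
semantic_sim = cosine_similarity(TfidfVectorizer().fit_transform(items))

# Fake Likert responses (rows = respondents, columns = items), for illustration only.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(200, len(items)))
empirical_corr = np.corrcoef(responses, rowvar=False)

# Compare off-diagonal semantic similarity with empirical inter-item correlation.
for i, j in zip(*np.triu_indices(len(items), k=1)):
    print(f"items {i}-{j}: semantic similarity = {semantic_sim[i, j]:.2f}, "
          f"empirical r = {empirical_corr[i, j]:.2f}")
```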

  6. Does the underground sidewall station survey method meet MHSA ...

    African Journals Online (AJOL)

    The question is asked whether or not this method of surveying will meet the MHSA standards of accuracy that were developed for typical hangingwall traverse type networks. Results obtained from a survey closure using a network of clusters of four sidewall stations demonstrate that under the described circumstances it will ...

  7. 76 FR 19976 - Proposed Information Collection; Comment Request; Survey of EDA Grant Process Improvement

    Science.gov (United States)

    2011-04-11

    ...; Comment Request; Survey of EDA Grant Process Improvement AGENCY: Economic Development Administration.... In 2010, EDA made improvements in its grant application process. The proposed short survey of five to... improvements to the grant application process and to make any necessary adjustments. EDA would like to conduct...

  8. Photometric redshifts for the next generation of deep radio continuum surveys - II. Gaussian processes and hybrid estimates

    Science.gov (United States)

    Duncan, Kenneth J.; Jarvis, Matt J.; Brown, Michael J. I.; Röttgering, Huub J. A.

    2018-04-01

    Building on the first paper in this series (Duncan et al. 2018), we present a study investigating the performance of Gaussian process photometric redshift (photo-z) estimates for galaxies and active galactic nuclei detected in deep radio continuum surveys. A Gaussian process redshift code is used to produce photo-z estimates targeting specific subsets of both the AGN population - infrared, X-ray and optically selected AGN - and the general galaxy population. The new estimates for the AGN population are found to perform significantly better at z > 1 than the template-based photo-z estimates presented in our previous study. Our new photo-z estimates are then combined with template estimates through hierarchical Bayesian combination to produce a hybrid consensus estimate that outperforms both of the individual methods across all source types. Photo-z estimates for radio sources that are X-ray sources or optical/IR AGN are significantly improved in comparison to previous template-only estimates - with outlier fractions and robust scatter reduced by up to a factor of ~4. The ability of our method to combine the strengths of the two input photo-z techniques and the large improvements we observe illustrate its potential for enabling future exploitation of deep radio continuum surveys for both the study of galaxy and black hole co-evolution and for cosmological studies.
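
    To illustrate the general flavour of Gaussian-process photometric-redshift estimation (not the specific code, features, or training sets used in the paper), the sketch below fits scikit-learn's GaussianProcessRegressor to synthetic magnitudes and reports a catastrophic-outlier fraction on a held-out set; all data are simulated.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(42)
n = 300
z_true = rng.uniform(0.0, 3.0, n)                       # "spectroscopic" redshifts
# Toy magnitudes in three bands, loosely correlated with redshift plus noise.
mags = np.column_stack([
    20 + 1.5 * z_true + rng.normal(0, 0.3, n),
    19 + 1.2 * z_true + rng.normal(0, 0.3, n),
    18 + 0.8 * z_true + rng.normal(0, 0.3, n),
])

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(mags[:200], z_true[:200])                        # train on the first 200 objects

z_pred, z_std = gp.predict(mags[200:], return_std=True)  # predictive mean and uncertainty
outliers = np.abs(z_pred - z_true[200:]) / (1 + z_true[200:]) > 0.15
print(f"mean predictive sigma: {z_std.mean():.2f}; "
      f"catastrophic outlier fraction: {outliers.mean():.2%}")
```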

  9. A multifaceted approach to understanding dynamic urban processes: satellites, surveys, and censuses.

    Science.gov (United States)

    Jones, B.; Balk, D.; Montgomery, M.; Liu, Z.

    2014-12-01

    Urbanization will arguably be the most significant demographic trend of the 21st century, particularly in fast-growing regions of the developing world. Characterizing urbanization in a spatial context, however, is a difficult task given only the moderate resolution data provided by traditional sources of demographic data (i.e., censuses and surveys). Using a sample of five world "mega-cities" we demonstrate how new satellite data products and new analysis of existing satellite data, when combined with new applications of census and survey microdata, can reveal more about cities and urbanization in combination than either data type can by itself. In addition to the partially modelled Global Urban-Rural Mapping Project (GRUMP) urban extents we consider four sources of remotely sensed data that can be used to estimate urban extents: the NOAA Defense Meteorological Satellite Program (DMSP) Operational Linescan System (OLS) intercalibrated nighttime lights time series data, the newer NOAA Visible Infrared Imager Radiometer Suite (VIIRS) nighttime lights data, the German Aerospace Center (DLR) radar satellite data, and Dense Sampling Method (DSM) analysis of the NASA scatterometer data. Demographic data come from national censuses and/or georeferenced survey data from the Demographic & Health Survey (DHS) program. We overlay demographic and remotely sensed data (e.g., Figs 1, 2) to address two questions: (1) how well do satellite derived measures of urban intensity correlate with demographic measures, and (2) how well are temporal changes in the data correlated. Using spatial regression techniques, we then estimate statistical relationships (controlling for influences such as elevation, coastal proximity, and economic development) between the remotely sensed and demographic data and test the ability of each to predict the other. Satellite derived imagery help us to better understand the evolution of the built environment and urban form, while the underlying demographic

  10. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan; Bachl, Fabian E.; Lindgren, Finn; Borchers, David L.; Illian, Janine B.; Buckland, Stephen T.; Rue, Haavard; Gerrodette, Tim

    2017-01-01

    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.
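
    The "thinned spatial point process" formulation can be illustrated with a toy simulation: animals are generated from a spatial intensity, and the observed detections are a thinned version of that pattern, where the thinning probability is a half-normal detection function of perpendicular distance from the transect line. The sketch below uses a constant intensity rather than the log-Gaussian Cox process fitted in the paper (which uses an SPDE/INLA approach), and every parameter value is invented.

```python
import numpy as np

rng = np.random.default_rng(7)

area_x, area_y = 100.0, 20.0          # study region (km); transect runs along y = 10
lam = 0.05                            # toy intensity: animals per km^2
n_animals = rng.poisson(lam * area_x * area_y)
x = rng.uniform(0, area_x, n_animals)
y = rng.uniform(0, area_y, n_animals)

sigma = 2.0                           # half-normal detection scale (km)
dist = np.abs(y - 10.0)               # perpendicular distance to the transect
p_detect = np.exp(-dist**2 / (2 * sigma**2))
detected = rng.uniform(size=n_animals) < p_detect      # the thinning step

# Naive Horvitz-Thompson-style abundance estimate from the thinned pattern.
n_hat = np.sum(1.0 / p_detect[detected])
print(f"true N = {n_animals}, detected = {detected.sum()}, estimated N = {n_hat:.0f}")
```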

  11. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan

    2017-12-28

    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.

  12. Surveying immigrants without sampling frames - evaluating the success of alternative field methods.

    Science.gov (United States)

    Reichel, David; Morales, Laura

    2017-01-01

    This paper evaluates the sampling methods of an international survey, the Immigrant Citizens Survey, which aimed at surveying immigrants from outside the European Union (EU) in 15 cities in seven EU countries. In five countries, no sample frame was available for the target population. Consequently, alternative ways to obtain a representative sample had to be found. In three countries 'location sampling' was employed, while in two countries traditional methods were used with adaptations to reach the target population. The paper assesses the main methodological challenges of carrying out a survey among a group of immigrants for whom no sampling frame exists. The samples of the survey in these five countries are compared to results of official statistics in order to assess the accuracy of the samples obtained through the different sampling methods. It can be shown that alternative sampling methods can provide meaningful results in terms of core demographic characteristics although some estimates differ to some extent from the census results.

  13. Comparing Coral Reef Survey Methods. Unesco Reports in Marine Science No. 21 Report of a Regional Unesco/UNEP Workshop on Coral Reef Survey Management and Assessment Methods in Asia and the Pacific (Phuket, Thailand, December 13-17, 1982).

    Science.gov (United States)

    United Nations Educational, Scientific, and Cultural Organization, Paris (France). Div. of Marine Sciences.

    This report includes nine papers prepared for a workshop on coral reef survey management and assessment methods in Asia and the Pacific. The papers are: "A Contrast in Methodologies between Surveying and Testing" (Charles Birkeland); "Coral Reef Survey Methods in the Andaman Sea" (Hansa Chansang); "A Review of Coral Reef…

  14. [Essential procedure and key methods for survey of traditional knowledge related to Chinese materia medica resources].

    Science.gov (United States)

    Cheng, Gong; Huang, Lu-qi; Xue, Da-yuan; Zhang, Xiao-bo

    2014-12-01

    The survey of traditional knowledge related to Chinese materia medica resources is an important component and one of the innovative aspects of the fourth national survey of the Chinese materia medica resources. China has rich traditional knowledge of traditional Chinese medicine (TCM), and the comprehensive investigation of TCM traditional knowledge aims to promote conservation and sustainable use of Chinese materia medica resources. Building upon the field work of pilot investigations, this paper introduces the essential procedures and key methods for conducting the survey of traditional knowledge related to Chinese materia medica resources. The essential procedures are as follows. First is the preparation phase. It is important to review all relevant literature and provide training to the survey teams so that they have a clear understanding of the concept of traditional knowledge and master key survey methods. Second is the field investigation phase. When conducting field investigations, survey teams should identify the traditional knowledge holders by using the 'snowball method', and record the traditional knowledge after obtaining prior informed consent from the traditional knowledge holders. Researchers should fill out the survey forms provided by the Technical Specification of the Fourth National Survey of Chinese Materia Medica Resources. Researchers should pay particular attention to the scope of traditional knowledge and the method of inheriting the knowledge, which are the key information for traditional knowledge holders and potential users to reach mutually agreed terms to achieve benefit sharing. Third is the data compilation and analysis phase. Researchers should try to compile and edit the TCM traditional knowledge in accordance with intellectual property rights requirements so that the information collected through the national survey can serve as the basic data for the TCM traditional knowledge database. The key methods of the survey include regional

  15. Experience base for Radioactive Waste Thermal Processing Systems: A preliminary survey

    International Nuclear Information System (INIS)

    Mayberry, J.; Geimer, R.; Gillins, R.; Steverson, E.M.; Dalton, D.; Anderson, G.L.

    1992-04-01

    In the process of considering thermal technologies for potential treatment of the Idaho National Engineering Laboratory mixed transuranic contaminated wastes, a preliminary survey of the experience base available from Radioactive Waste Thermal Processing Systems is reported. A list of known commercial radioactive waste facilities in the United States and some international thermal treatment facilities is provided. The survey focuses on the US Department of Energy thermal treatment facilities. A brief facility description and a preliminary summary of facility status and problems experienced are provided for a selected subset of the DOE facilities

  16. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    Science.gov (United States)

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts (1) that number crunching is usually carried out using software that was developed before information technology existed, and (2) that the educational research is to a great extent trapped…

  17. Survey and assessment of conventional software verification and validation methods

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-04-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 134 methods so identified were classified according to their appropriateness for various phases of a developmental lifecycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measures were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the four identified components of knowledge-based and expert systems, as well as for the system as a whole
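
    The abstract does not give the formulas behind the Cost-Benefit and Effectiveness metrics, so the sketch below is only a hypothetical illustration of the general approach it describes: rate each method on ease-of-use and defect-detection factors, combine the ratings into composite scores, and rank-order the methods. All method names, ratings, and the weighting rule are invented.

```python
# Hypothetical composite scoring and ranking of V&V methods (illustration only).
methods = {
    #                   ease-of-use factors   defect-detection factors
    "inspection":       ([4, 5, 4, 3],        [3, 3, 4, 3]),
    "static analysis":  ([3, 4, 3, 4],        [4, 3, 3, 4]),
    "unit testing":     ([4, 3, 4, 4],        [4, 4, 3, 3]),
    "model checking":   ([2, 2, 3, 2],        [5, 5, 4, 5]),
}

def mean(xs):
    return sum(xs) / len(xs)

scored = []
for name, (ease, power) in methods.items():
    effectiveness = mean(power)                       # power to detect defects
    cost_benefit = effectiveness / (6 - mean(ease))   # higher ease-of-use -> lower cost
    scored.append((name, effectiveness, cost_benefit))

for name, eff, cb in sorted(scored, key=lambda t: t[2], reverse=True):
    print(f"{name:16s} effectiveness = {eff:.2f}   cost-benefit = {cb:.2f}")
```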

  18. Evaluation of methods to calibrate radiation survey meters

    International Nuclear Information System (INIS)

    Robinson, R.C.; Arbeau, N.D.

    1987-04-01

    Calibration requirements for radiation survey meters used in industrial radiography have been reviewed. Information obtained from a literature search, discussions with CSLD inspectors and firms performing calibrations has been considered. Based on this review a set of minimum calibration requirements was generated which, when met, will determine that the survey meter is suited for measurements described in the current AEC Regulations that apply to industrial radiography equipment. These requirements are presented in this report and may be used as guidelines for evaluating calibration methods proposed or in use in industry. 39 refs

  19. The use of qualitative methods to inform Delphi surveys in core outcome set development.

    Science.gov (United States)

    Keeley, T; Williamson, P; Callery, P; Jones, L L; Mathers, J; Jones, J; Young, B; Calvert, M

    2016-05-04

    Core outcome sets (COS) help to minimise bias in trials and facilitate evidence synthesis. Delphi surveys are increasingly being used as part of a wider process to reach consensus about what outcomes should be included in a COS. Qualitative research can be used to inform the development of Delphi surveys. This is an advance in the field of COS development and one which is potentially valuable; however, little guidance exists for COS developers on how best to use qualitative methods and what the challenges are. This paper aims to provide early guidance on the potential role and contribution of qualitative research in this area. We hope the ideas we present will be challenged, critiqued and built upon by others exploring the role of qualitative research in COS development. This paper draws upon the experiences of using qualitative methods in the pre-Delphi stage of the development of three different COS. Using these studies as examples, we identify some of the ways that qualitative research might contribute to COS development, the challenges in using such methods and areas where future research is required. Qualitative research can help to identify what outcomes are important to stakeholders; facilitate understanding of why some outcomes may be more important than others, determine the scope of outcomes; identify appropriate language for use in the Delphi survey and inform comparisons between stakeholder data and other sources, such as systematic reviews. Developers need to consider a number of methodological points when using qualitative research: specifically, which stakeholders to involve, how to sample participants, which data collection methods are most appropriate, how to consider outcomes with stakeholders and how to analyse these data. A number of areas for future research are identified. Qualitative research has the potential to increase the research community's confidence in COS, although this will be dependent upon using rigorous and appropriate

  20. Method and apparatus for processing algae

    Science.gov (United States)

    Chew, Geoffrey; Reich, Alton J.; Dykes, Jr., H. Waite; Di Salvo, Roberto

    2012-07-03

    Methods and apparatus for processing algae are described in which a hydrophilic ionic liquid is used to lyse algae cells. The lysate separates into at least two layers including a lipid-containing hydrophobic layer and an ionic liquid-containing hydrophilic layer. A salt or salt solution may be used to remove water from the ionic liquid-containing layer before the ionic liquid is reused. The used salt may also be dried and/or concentrated and reused. The method can operate at relatively low lysis, processing, and recycling temperatures, which minimizes the environmental impact of algae processing while providing reusable biofuels and other useful products.

  1. Methods and representativeness of a European survey in children and adolescents: the KIDSCREEN study

    Directory of Open Access Journals (Sweden)

    von Rueden Ursula

    2007-07-01

    Background: The objective of the present study was to compare three different sampling and questionnaire administration methods used in the international KIDSCREEN study in terms of participation, response rates, and external validity. Methods: Children and adolescents aged 8–18 years were surveyed in 13 European countries using either telephone sampling and mail administration, random sampling of school listings followed by classroom or mail administration, or multistage random sampling of communities and households with self-administration of the survey materials at home. Cooperation, completion, and response rates were compared across countries and survey methods. Data on non-respondents were collected in 8 countries. The population fraction (PF: respondents in each sex-age or educational-level category, divided by the population in the same category from Eurostat census data) and the population fraction ratio (PFR: ratio of PFs), with their corresponding 95% confidence intervals, were used to analyze differences by country between the KIDSCREEN samples and a reference Eurostat population. Results: Response rates by country ranged from 18.9% to 91.2%. Response rates were highest in the school-based surveys (69.0%–91.2%). Sample proportions by age and gender were similar to the reference Eurostat population in most countries, although boys and adolescents were slightly underrepresented (PFR < 1). Conclusion: School-based sampling achieved the highest overall response rates but also produced slightly more biased samples than the other methods. The results suggest that the samples were sufficiently representative to provide reference population values for the KIDSCREEN instrument.

  2. Automated data processing architecture for the Gemini Planet Imager Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Maire, Jérôme; Marchis, Franck; Graham, James R.; Macintosh, Bruce; Ammons, S. Mark; Bailey, Vanessa P.; Barman, Travis S.; Bruzzone, Sebastian; Bulger, Joanna; Cotten, Tara; Doyon, René; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Goodsell, Stephen; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn M.; Larkin, James E.; Marley, Mark S.; Metchev, Stanimir; Nielsen, Eric L.; Oppenheimer, Rebecca; Palmer, David W.; Patience, Jennifer; Poyneer, Lisa A.; Pueyo, Laurent; Rajan, Abhijith; Rantakyrö, Fredrik T.; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane J.

    2018-01-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multiyear direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the Data Cruncher, combines multiple data reduction pipelines (DRPs) together to process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our DRPs. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  3. Application of multivariate statistical methods in analyzing expectation surveys in Central Bank of Nigeria

    OpenAIRE

    Raymond, Ogbuka Obinna

    2017-01-01

    In analyzing survey data, most researchers and analysts rely on straightforward statistical approaches. More common is the use of one-way, two-way or multi-way tables, and graphical displays such as bar charts, line charts, etc. A brief overview of these approaches and a good discussion of aspects needing attention during the data analysis process can be found in Wilson & Stern (2001). In most cases, however, analysis procedures that go beyond simp...

  4. Design and Validation of the Quantum Mechanics Conceptual Survey

    Science.gov (United States)

    McKagan, S. B.; Perkins, K. K.; Wieman, C. E.

    2010-01-01

    The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included…

  5. Automated processing of zebrafish imaging data: a survey.

    Science.gov (United States)

    Mikut, Ralf; Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A; Kausler, Bernhard X; Ledesma-Carbayo, María J; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine

    2013-09-01

    Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines.

  6. Automated Processing of Zebrafish Imaging Data: A Survey

    Science.gov (United States)

    Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A.; Kausler, Bernhard X.; Ledesma-Carbayo, María J.; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine

    2013-01-01

    Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines. PMID:23758125

  7. Using stable isotopes to monitor forms of sulfur during desulfurization processes: A quick screening method

    Science.gov (United States)

    Liu, Chao-Li; Hackley, Keith C.; Coleman, D.D.; Kruse, C.W.

    1987-01-01

    A method using stable isotope ratio analysis to monitor the reactivity of sulfur forms in coal during thermal and chemical desulfurization processes has been developed at the Illinois State Geological Survey. The method is based upon the fact that a significant difference exists in some coals between the 34S/32S ratios of the pyritic and organic sulfur. A screening method for determining the suitability of coal samples for use in isotope ratio analysis is described. Making these special coals available from coal sample programs would assist research groups in sorting out the complex sulfur chemistry which accompanies thermal and chemical processing of high sulfur coals. © 1987.
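
    The 34S/32S ratios referred to above are conventionally reported in delta notation relative to a reference standard; the usual definition (a general convention in isotope geochemistry, not a formula specific to this report) is, in per mil:

```latex
\delta^{34}\mathrm{S} \;=\;
  \left(
    \frac{\left(^{34}\mathrm{S}/^{32}\mathrm{S}\right)_{\mathrm{sample}}}
         {\left(^{34}\mathrm{S}/^{32}\mathrm{S}\right)_{\mathrm{standard}}} - 1
  \right) \times 1000
```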

  8. Practical methods for radiation survey in nuclear installations

    International Nuclear Information System (INIS)

    Shweikani, R.

    2001-12-01

    This study is addressed to those who are responsible for performing radiation surveys in nuclear installations, especially beginners. It therefore gives a comprehensive view of all important aspects of their work, from the structure of the atom to the practical steps of radiation survey work. It explains how to perform personal monitoring, methods for monitoring surface contamination, methods for measuring the radioactivity of gases and radioactive aerosols in air, monitoring of radiation doses, measurement of radiation effects in workplaces and, finally, measurement of the internal exposure of radiation workers in nuclear installations. The study also presents some cases of breaches of radiation protection rules in American nuclear installations and describes the final results of these breaches. The aim is to emphasize that any breach of or disregard for radiation protection rules may produce bad results, and that there is no leniency in implementing environmental radiation protection principles. (author)

  9. Sample size methods for estimating HIV incidence from cross-sectional surveys.

    Science.gov (United States)

    Konikoff, Jacob; Brookmeyer, Ron

    2015-12-01

    Understanding HIV incidence, the rate at which new infections occur in populations, is critical for tracking and surveillance of the epidemic. In this article, we derive methods for determining sample sizes for cross-sectional surveys to estimate incidence with sufficient precision. We further show how to specify sample sizes for two successive cross-sectional surveys to detect changes in incidence with adequate power. In these surveys biomarkers such as CD4 cell count, viral load, and recently developed serological assays are used to determine which individuals are in an early disease stage of infection. The total number of individuals in this stage, divided by the number of people who are uninfected, is used to approximate the incidence rate. Our methods account for uncertainty in the durations of time spent in the biomarker defined early disease stage. We find that failure to account for this uncertainty when designing surveys can lead to imprecise estimates of incidence and underpowered studies. We evaluated our sample size methods in simulations and found that they performed well in a variety of underlying epidemics. Code for implementing our methods in R is available with this article at the Biometrics website on Wiley Online Library. © 2015, The International Biometric Society.
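
    A minimal sketch of the cross-sectional estimator described above (the count of people in the biomarker-defined early stage divided by the count uninfected, scaled by the mean duration of that stage) is given below. The counts and the assumed mean duration are illustrative, and the crude log-scale interval ignores the uncertainty in the stage duration that the article's sample size methods explicitly account for.

```python
import math

def incidence_estimate(n_recent, n_uninfected, mean_duration_years, z=1.96):
    """Cross-sectional incidence estimate with a crude log-scale interval (counts treated as Poisson)."""
    lam = (n_recent / n_uninfected) / mean_duration_years   # infections per person-year
    se_log = math.sqrt(1.0 / n_recent + 1.0 / n_uninfected)
    return lam, (lam * math.exp(-z * se_log), lam * math.exp(z * se_log))

# Illustrative numbers only: 60 "recent" infections, 8,000 uninfected,
# and an assumed mean duration of 0.5 years in the early stage.
lam, (lo, hi) = incidence_estimate(60, 8000, 0.5)
print(f"incidence ~ {lam * 100:.2f} per 100 person-years (95% CI {lo * 100:.2f}-{hi * 100:.2f})")
```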

  10. Natural language processing-based COTS software and related technologies survey.

    Energy Technology Data Exchange (ETDEWEB)

    Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.

    2003-09-01

    Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.

  11. Indexing contamination surveys

    International Nuclear Information System (INIS)

    Brown, R.L.

    1998-01-01

    The responsibility for safely managing the Tank Farms at Hanford belongs to Lockheed Martin Hanford Corporation, which is part of the six-company Project Hanford Management Team led by Fluor Daniel Hanford, Inc. These Tank Farm Facilities contain numerous outdoor contamination areas which are surveyed at a periodicity consistent with the potential radiological conditions, occupancy, and risk of changes in radiological conditions. This document describes the survey documentation and data tracking method devised to track the results of contamination surveys; this process is referred to as indexing. The indexing process takes a representative data set as an indicator of the contamination status of the facility. The data are further manipulated into a single value that can be tracked and trended using standard statistical methodology. To report meaningful data, the routine contamination surveys must be performed in a manner that allows the survey method and the data collection process to be recreated. Three key criteria are necessary to accomplish this goal: accurate maps, consistent documentation, and consistent consolidation of data. Meeting these criteria provides data of sufficient quality to be tracked. Tracking of survey data is accomplished by converting the individual survey results into a weighted value, corrected for the actual number of survey points. This information can be compared over time using standard statistical analysis to identify trends. At the Tank Farms, the need to track and trend the facility's radiological status presents unique challenges. Many of these Tank Farm facilities date back to the Second World War. The Tank Farm Facilities are exposed to weather extremes, plant and animal intrusion, as well as all of the normal challenges associated with handling radiological waste streams. Routine radiological surveys did not provide a radiological status adequate for continuing comparisons
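
    The abstract does not spell out how the weighted index is computed, so the sketch below is only a hypothetical illustration of the general idea: collapse each routine survey's readings into a single value normalised by the number of survey points, then trend those values over time with a simple linear fit. The survey data, weighting, and time points are invented.

```python
import numpy as np

surveys = [
    # (days since baseline, contamination readings at the surveyed points)
    (0,  [120,  80, 200,  90]),
    (30, [100,  95, 180, 110,  60]),
    (60, [150, 130, 220, 140]),
    (90, [170, 160, 240, 150, 130]),
]

def survey_index(readings, weight=1.0):
    """A single weighted value corrected for the actual number of survey points."""
    return weight * sum(readings) / len(readings)

t = np.array([day for day, _ in surveys], dtype=float)
idx = np.array([survey_index(readings) for _, readings in surveys])

slope, intercept = np.polyfit(t, idx, deg=1)   # simple linear trend over time
print("per-survey index values:", np.round(idx, 1))
print(f"trend: {slope:+.2f} index units per day")
```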

  12. From "models" to "reality", and Return. Some Reflections on the Interaction Between Survey and Interpretative Methods for Built Heritage Conservation

    Science.gov (United States)

    Ottoni, F.; Freddi, F.; Zerbi, A.

    2017-05-01

    It is well known that increasingly accurate methodologies and automatic tools are now available in the field of geometric survey and image processing, and they constitute a fundamental instrument for cultural heritage knowledge and preservation; on the other side, very smart and precise numerical models are continuously improved and used to simulate the mechanical behaviour of masonry structures: both instruments and technologies are important parts of a global process of knowledge which is at the base of any conservation project of cultural heritage. Despite the high accuracy and automation level reached by both technologies and programs, the transfer of data between them is not an easy task, and defining the most reliable way to translate and exchange information without data loss is still an open issue. The goal of the present paper is to analyse the complex process of translation from the very precise (and sometimes redundant) information obtainable by modern survey methodologies for historic buildings (such as laser scanning) into the very simplified (perhaps too simplified) schemes used to understand their real structural behaviour, with the final aim of contributing to the discussion on reliable methods for improving cultural heritage knowledge through empiricism.

  13. A model of cognitive processes and conversational principles in survey interview interaction

    NARCIS (Netherlands)

    Ongena, Yfke P.; Dijkstra, Wil

    In this paper we provide a model of interviewer–respondent interaction in survey interviews. Our model is primarily focused on the occurrence of problems within this interaction that seem likely to affect data quality. Both conversational principles and cognitive processes, especially where they do

  14. Comparing two survey methods of measuring health-related indicators: Lot Quality Assurance Sampling and Demographic Health Surveys.

    Science.gov (United States)

    Anoke, Sarah C; Mwai, Paul; Jeffery, Caroline; Valadez, Joseph J; Pagano, Marcello

    2015-12-01

    Two common methods used to measure indicators for health programme monitoring and evaluation are the demographic and health surveys (DHS) and lot quality assurance sampling (LQAS); each one has different strengths. We report on both methods when utilised in comparable situations. We compared 24 indicators in south-west Uganda, where data for prevalence estimations were collected independently for the two methods in 2011 (LQAS: n = 8876; DHS: n = 1200). Data were stratified (e.g., by gender and age), resulting in 37 comparisons. We used a two-sample two-sided Z-test of proportions to compare both methods. The average difference between LQAS and DHS for the 37 estimates was 0.062 (SD = 0.093; median = 0.039). The average difference among the 21 failures to reject equality of proportions was 0.010 (SD = 0.041; median = 0.009); among the 16 rejections, it was 0.130 (SD = 0.010, median = 0.118). Seven of the 16 rejections exhibited absolute differences of 0.10 and 0.20 (mean = 0.261, SD = 0.083). There is 75.7% agreement across the two surveys. Both methods yield regional results, but only LQAS provides information at less granular levels (e.g., the district level) where managerial action is taken. The cost advantage and localisation make LQAS feasible to conduct more frequently and provide the possibility of real-time health outcomes monitoring. © 2015 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.
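    For readers unfamiliar with the test named above, a small sketch of a two-sample, two-sided Z-test of proportions follows. The sample sizes and proportions are placeholders, not values from the study.

        # Two-sample two-sided Z-test of proportions using the pooled estimate.
        import math

        def two_proportion_z_test(p1, n1, p2, n2):
            """Return (z, two-sided p-value) for H0: p1 == p2."""
            pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
            se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
            z = (p1 - p2) / se
            # Two-sided p-value from the standard normal CDF via the error function.
            p_value = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
            return z, p_value

        # Placeholder example comparing one indicator measured by both surveys.
        print(two_proportion_z_test(0.62, 8876, 0.58, 1200))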

  15. A Survey of Symplectic and Collocation Integration Methods for Orbit Propagation

    Science.gov (United States)

    Jones, Brandon A.; Anderson, Rodney L.

    2012-01-01

    Demands on numerical integration algorithms for astrodynamics applications continue to increase. Common methods, like explicit Runge-Kutta, meet the orbit propagation needs of most scenarios, but more specialized scenarios require new techniques to meet both computational efficiency and accuracy needs. This paper provides an extensive survey on the application of symplectic and collocation methods to astrodynamics. Both of these methods benefit from relatively recent theoretical developments, which improve their applicability to artificial satellite orbit propagation. This paper also details their implementation, with several tests demonstrating their advantages and disadvantages.
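    As an illustration of the class of methods surveyed (not an algorithm taken from the paper), the sketch below propagates a two-body Kepler orbit with a fixed-step leapfrog (Stoermer-Verlet) scheme, one of the simplest symplectic integrators. Units, step size, and initial state are arbitrary.

        # Leapfrog (kick-drift-kick) propagation of a planar Kepler orbit, mu = 1.
        def acceleration(r, mu=1.0):
            d = (r[0] ** 2 + r[1] ** 2) ** 1.5
            return (-mu * r[0] / d, -mu * r[1] / d)

        def leapfrog(r, v, dt, steps, mu=1.0):
            a = acceleration(r, mu)
            for _ in range(steps):
                v = (v[0] + 0.5 * dt * a[0], v[1] + 0.5 * dt * a[1])   # half kick
                r = (r[0] + dt * v[0], r[1] + dt * v[1])               # full drift
                a = acceleration(r, mu)
                v = (v[0] + 0.5 * dt * a[0], v[1] + 0.5 * dt * a[1])   # half kick
            return r, v

        # Circular orbit of radius 1: the energy error stays bounded over many periods,
        # which is the practical appeal of symplectic schemes for long propagations.
        r, v = leapfrog((1.0, 0.0), (0.0, 1.0), dt=0.01, steps=100000)
        print(r, v)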

  16. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    Science.gov (United States)

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  17. Survey of systems safety analysis methods and their application to nuclear waste management systems

    International Nuclear Information System (INIS)

    Pelto, P.J.; Winegardner, W.K.; Gallucci, R.H.V.

    1981-11-01

    This report reviews system safety analysis methods and examines their application to nuclear waste management systems. The safety analysis methods examined include expert opinion, maximum credible accident approach, design basis accidents approach, hazard indices, preliminary hazards analysis, failure modes and effects analysis, fault trees, event trees, cause-consequence diagrams, GO methodology, Markov modeling, and a general category of consequence analysis models. Previous and ongoing studies on the safety of waste management systems are discussed along with their limitations and potential improvements. The major safety methods and waste management safety related studies are surveyed. This survey provides information on what safety methods are available, what waste management safety areas have been analyzed, and what are potential areas for future study.

  18. Survey of systems safety analysis methods and their application to nuclear waste management systems

    Energy Technology Data Exchange (ETDEWEB)

    Pelto, P.J.; Winegardner, W.K.; Gallucci, R.H.V.

    1981-11-01

    This report reviews system safety analysis methods and examines their application to nuclear waste management systems. The safety analysis methods examined include expert opinion, maximum credible accident approach, design basis accidents approach, hazard indices, preliminary hazards analysis, failure modes and effects analysis, fault trees, event trees, cause-consequence diagrams, GO methodology, Markov modeling, and a general category of consequence analysis models. Previous and ongoing studies on the safety of waste management systems are discussed along with their limitations and potential improvements. The major safety methods and waste management safety related studies are surveyed. This survey provides information on what safety methods are available, what waste management safety areas have been analyzed, and what are potential areas for future study.

  19. Housing decision making methods for initiation development phase process

    Science.gov (United States)

    Zainal, Rozlin; Kasim, Narimah; Sarpin, Norliana; Wee, Seow Ta; Shamsudin, Zarina

    2017-10-01

    Late delivery and sick housing project problems have been attributed to poor decision making. These problems stem from housing developers who prefer to create their own approaches based on their experience and expertise, taking the simplest route of just applying the available standards and rules in decision making. This paper seeks to identify the decision making methods for housing development at the initiation phase in Malaysia. The research used the Delphi method with a questionnaire survey involving 50 developers as the sample for the first stage of data collection. However, only 34 developers contributed to the second stage of the information gathering process, and only 12 developers remained for the final data collection stage. The findings affirm that Malaysian developers prefer to make their investment decisions based on simple interpolation of historical data, using simple statistical or mathematical techniques to produce the required reports. It is suggested that they tend to skip several important decision-making functions at the primary development stage. These shortcomings were mainly due to time and financial constraints and the lack of statistical or mathematical expertise among the professional and management groups in the developer organisations.

  20. Knowledge-attitude-practice survey among Portuguese gynaecologists regarding combined hormonal contraceptives methods.

    Science.gov (United States)

    Bombas, Teresa; Costa, Ana Rosa; Palma, Fátima; Vicente, Lisa; Sá, José Luís; Nogueira, Ana Maria; Andrade, Sofia

    2012-04-01

    Objectives: To evaluate knowledge, attitude and practices of Portuguese gynaecologists regarding combined hormonal contraceptives. Methods: A cross-sectional survey was conducted among 303 gynaecologists. Results: Ninety percent of the gynaecologists considered that deciding on contraceptive methods is a process wherein the woman has her say. Efficacy, safety and the woman's preference were the major factors influencing gynaecologists, while efficacy, tolerability and ease of use were the major factors perceived by the specialists to influence the women's choice. Gynaecologists believed that only 2% of women taking the pill were 100% compliant compared to 48% of those using the patch and 75% of those using the ring. The lower risk of omission was the strong point for the latter methods. Side effects were the main reason to change to another method. Vaginal manipulation was the most difficult topic to discuss. Conclusions: Most gynaecologists decided with the woman on the contraceptive method. The main reasons for the gynaecologist's recommendation of a given contraceptive method and the women's choice were different. Counselling implies an open discussion, and topics related to sexuality were considered difficult to discuss. Improving communication skills and understanding women's requirements are critical for contraceptive counselling.

  1. Digital image processing mathematical and computational methods

    CERN Document Server

    Blackledge, J M

    2005-01-01

    This authoritative text (the second part of a complete MSc course) provides mathematical methods required to describe images, image formation and different imaging systems, coupled with the principle techniques used for processing digital images. It is based on a course for postgraduates reading physics, electronic engineering, telecommunications engineering, information technology and computer science. This book relates the methods of processing and interpreting digital images to the 'physics' of imaging systems. Case studies reinforce the methods discussed, with examples of current research

  2. National Survey on Access, Use and Promotion of Rational Use of Medicines (PNAUM: household survey component methods

    Directory of Open Access Journals (Sweden)

    Sotero Serrate Mengue

    OBJECTIVE: To describe methodological aspects of the household survey National Survey on Access, Use and Promotion of Rational Use of Medicines (PNAUM) related to sampling design and implementation, the actual obtained sample, instruments and fieldwork. METHODS: A cross-sectional, population-based study with probability sampling in three stages of the population living in households located in Brazilian urban areas. Fieldwork was carried out between September 2013 and February 2014. The data collection instrument included questions related to: information about households, residents and respondents; chronic diseases and medicines used; use of health services; acute diseases and events treated with drugs; use of contraceptives; use of pharmacy services; behaviors that may affect drug use; package inserts and packaging; lifestyle and health insurance. RESULTS: In total, 41,433 interviews were carried out in 20,404 households and 576 urban clusters corresponding to 586 census tracts distributed in the five Brazilian regions, according to eight domains defined by age and gender. CONCLUSIONS: The results of the survey may be used as a baseline for future studies aiming to assess the impact of government action on drug access and use. For local studies using a compatible method, PNAUM may serve as a reference point to evaluate variations in space and population. With a comprehensive evaluation of drug-related aspects, PNAUM is a major source of data for a variety of analyses to be carried out both at academic and government level.

  3. Use of methods for specifying the target difference in randomised controlled trial sample size calculations: Two surveys of trialists' practice.

    Science.gov (United States)

    Cook, Jonathan A; Hislop, Jennifer M; Altman, Doug G; Briggs, Andrew H; Fayers, Peter M; Norrie, John D; Ramsay, Craig R; Harvey, Ian M; Vale, Luke D

    2014-06-01

    In the most recent trial, the target difference was usually one viewed as important by a stakeholder group, mostly also viewed as a realistic difference given the interventions under evaluation, and sometimes one that led to an achievable sample size. The response rates achieved were relatively low despite the surveys being short, well presented, and having utilised reminders. Substantial variations in practice exist, with awareness, use, and willingness to recommend methods varying widely. The findings support the view that sample size calculation is a more complex process than would appear to be the case from trial reports and protocols. Guidance on approaches for sample size estimation may increase both awareness and use of appropriate formal methods. © The Author(s), 2014.

  4. Using mark-recapture distance sampling methods on line transect surveys

    Science.gov (United States)

    Burt, Louise M.; Borchers, David L.; Jenkins, Kurt J.; Marques, Tigao A

    2014-01-01

    Mark-recapture distance sampling (MRDS) methods are widely used for density and abundance estimation when the conventional DS assumption of certain detection at distance zero fails, as they allow detection at distance zero to be estimated and incorporated into the overall probability of detection to better estimate density and abundance. However, incorporating MR data in DS models raises survey and analysis issues not present in conventional DS. Conversely, incorporating DS assumptions in MR models raises issues not present in conventional MR. As a result, being familiar with either conventional DS methods or conventional MR methods does not on its own put practitioners in a good position to apply MRDS methods appropriately. This study explains the sometimes subtly different varieties of MRDS survey methods and the associated concepts underlying MRDS models. This is done as far as possible without giving mathematical details, in the hope that this will make the key concepts underlying the methods accessible to a wider audience than if we were to present the concepts via equations.

  5. A Survey of Various Object Oriented Requirement Engineering Methods

    OpenAIRE

    Anandi Mahajan; Dr. Anurag Dixit

    2013-01-01

    In recent years many industries have been moving to the use of object-oriented methods for the development of large-scale information systems. The demand for an object-oriented approach in the development of software systems is increasing day by day. This paper is basically a survey paper on various object-oriented requirement engineering methods. It contains a summary of the available object-oriented requirement engineering methods with their relative advantages and disadvantages...

  6. Comparison of survey and photogrammetry methods to position gravity data, Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Ponce, D.A.; Wu, S.S.C.; Spielman, J.B.

    1985-01-01

    Locations of gravity stations at Yucca Mountain, Nevada, were determined by a survey using an electronic distance-measuring device and by a photogrammetric method. The data from both methods were compared to determine if horizontal and vertical coordinates developed from photogrammetry are sufficiently accurate to position gravity data at the site. The results show that elevations from the photogrammetric data have a mean difference of 0.57 ± 0.70 m when compared with those of the surveyed data. Comparison of the horizontal control shows that the two methods agreed to within 0.01 minute. At a latitude of 45°, an error of 0.01 minute (18 m) corresponds to a gravity anomaly error of 0.015 mGal. Bouguer gravity anomalies are most sensitive to errors in elevation, thus elevation is the determining factor in choosing between photogrammetric and survey methods to position gravity data. Because gravity station positions are difficult to locate on aerial photographs, photogrammetric positions are not always exactly at the gravity station; therefore, large disagreements may appear when comparing electronic and photogrammetric measurements. A mean photogrammetric elevation error of 0.57 m corresponds to a gravity anomaly error of 0.11 mGal. Errors of 0.11 mGal are too large for high-precision or detailed gravity measurements but acceptable for regional work. 1 ref., 2 figs., 4 tabs.
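    The error figures quoted above can be checked with a back-of-the-envelope calculation using standard reduction constants (free-air gradient 0.3086 mGal/m, Bouguer slab for an assumed crustal density of 2.67 g/cm3, and a latitude gradient of roughly 0.812·sin(2·latitude) mGal/km); these constants are textbook values, not taken from the report.

        # Rough check of the 0.11 mGal and 0.015 mGal figures quoted in the abstract.
        import math

        FREE_AIR = 0.3086                      # mGal per metre
        BOUGUER_SLAB = 0.04193 * 2.67          # mGal per metre for 2.67 g/cm3
        elev_error_m = 0.57
        anomaly_from_elevation = elev_error_m * (FREE_AIR - BOUGUER_SLAB)

        lat_deg = 45.0
        lat_error_km = 18.0 / 1000.0           # 0.01 arc-minute of latitude is ~18 m
        anomaly_from_latitude = lat_error_km * 0.812 * math.sin(math.radians(2 * lat_deg))

        print(round(anomaly_from_elevation, 3))  # ~0.11 mGal
        print(round(anomaly_from_latitude, 3))   # ~0.015 mGal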

  7. Market applications of Resistivity, Induced Polarisation, Magnetic Resonance and Electromagnetic methods for Groundwater Investigations, Mining Exploration, Environmental and Engineering Surveys

    Science.gov (United States)

    Bernard, J.

    2012-12-01

    Manufacturers of geophysical instruments have faced, over the past decades, the fast evolution of electronics and computer science. More automatisms have been introduced into the equipment and into the processing and interpretation software, which may suggest that conducting geophysical surveys requires less understanding of the method and less experience than in the past. Hence some misunderstandings about the skills needed to make the geophysical results well integrated into the global information which the applied geologist needs to acquire to be successful in his applications. Globally, the demand in geophysical investigation goes towards more penetration depth, requiring more powerful transmitters, and towards better resolution, requiring more data, such as in 3D analysis. Budget aspects strongly suggest high efficiency in the field associated with high-speed data processing. Innovation is required in all aspects of geophysics to fit the market needs, including new technological (instruments, software) and methodological (methods, procedures, arrays) developments. The structures in charge of the geophysical work can be public organisations (institutes, ministries, geological surveys, ...) or can come from the private sector (large companies, sub-contractors, consultants, ...), each of them having their own constraints in the field work and in the processing and interpretation phases. In the applications concerning groundwater investigations, mining exploration, environmental and engineering surveys, examples of data and their interpretation presently carried out all around the world will be presented for DC Resistivity (Vertical Electrical Sounding, 2D, 3D Resistivity Imaging, Resistivity Monitoring), Induced Polarisation (Time Domain 2D, 3D arrays for mining and environmental), Magnetic Resonance Sounding (direct detection and characterisation of groundwater) and Electromagnetic (multi-component and multi

  8. Survey compare team based learning and lecture teaching method, on learning-teaching process nursing student\\'s, in Surgical and Internal Diseases course

    Directory of Open Access Journals (Sweden)

    AA Vaezi

    2015-12-01

    Introduction: The effect of teaching methods on the learning process of students can help teachers improve the quality of teaching by selecting an appropriate method. This study aimed to compare the team-based learning and lecture teaching methods on the learning-teaching process of nursing students in surgical and internal diseases courses. Method: This quasi-experimental study was carried out with nursing students in the Schools of Nursing and Midwifery in Yazd and Meybod cities. The studied sample was all of the students in the sixth term in the Faculty of Nursing in Yazd (48 persons) and the Faculty of Nursing in Meybod (28 persons). The rate of students' learning through lecture was measured using MCQ tests, and teaching based on the team-based learning (TBL) method was assessed using MCQ tests (IRAT, GRAT, appeals and task group). In order to examine the students' satisfaction with the TBL method, a translated 22-item questionnaire on a 5-point Likert scale (1 = completely disagree, 2 = disagree, 3 = not effective, 4 = agree, and 5 = completely agree) was utilized. The reliability and validity of this translated questionnaire were measured. The collected data were analyzed through SPSS 17.0 using descriptive and analytical statistics. Result: The results showed that the mean scores in team-based learning were meaningful in the individual assessment (17±84) and the group assessment (17.2±1.17). The mean of overall scores in the TBL method (17.84±0.98) was higher compared with the lecture teaching method (16±2.31). Most of the students believed that the TBL method improved their interpersonal and group interaction skills (100%). Among them, 97.7% of students mentioned that this method (TBL) helped them to understand the course content better. The lowest level of satisfaction related to continuous lifelong learning (51.2%). Conclusion: The results of the present study showed that the TBL method led to improving the communication skills, understanding

  9. The expert's guide to mealtime interventions - A Delphi method survey.

    Science.gov (United States)

    Conchin, Simone; Carey, Sharon

    2017-09-27

    Prevalence of malnutrition and a myriad of barriers to adequate oral diet in hospitalised patients warrant further investment to improve the patient mealtime experience. The aim of this study was to explore barriers and enablers to implementing effective mealtime interventions and develop a process framework to guide clinicians and researchers in the area. Potential experts in the area of hospital mealtime intervention were identified as having published in this field of work within the Australasian setting. Further information was sought by email and telephone communication on professional background; research experience; interest; and capacity to participate. Recruited participants were surveyed using a modified Delphi method to establish opinion and experience in the area of mealtime interventions. Results were collated and content was coded using a thematic analysis approach by the primary researcher and two additional reviewers. Thirty-two Australian authors in the area of mealtime interventions within the hospital environment were identified from publication. Twenty-one participants were able to be contacted and nineteen of these consented to enrol in the study. Participants included those from a dietetic (n = 14), nursing (n = 4) and medical (n = 1) background. Participants were deemed to have expert knowledge if they had significant involvement in the published research and demonstrated a deep level of understanding of hospital mealtime interventions. All participants provided key insights into barriers to oral intake in the hospital environment and suggestions for interventions to address these barriers. From the survey, an eight-step framework to guide mealtime interventions was developed. Hospital mealtime interventions are complex processes. Interventions should be implemented after careful consideration of the local context and baseline data, and tailored to address barriers. Roles and responsibilities for nutrition care should be clear and

  10. An Alternative to the Carlson-Parkin Method for the Quantification of Qualitative Inflation Expectations: Evidence from the Ifo World Economic Survey

    OpenAIRE

    Henzel, Steffen; Wollmershäuser, Timo

    2005-01-01

    This paper presents a new methodology for the quantification of qualitative survey data. Traditional conversion methods, such as the probability approach of Carlson and Parkin (1975) or the time-varying parameters model of Seitz (1988), require very restrictive assumptions concerning the expectations formation process of survey respondents. Above all, the unbiasedness of expectations, which is a necessary condition for rationality, is imposed. Our approach avoids these assumptions. The novelt...
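    For context, the sketch below illustrates the traditional Carlson-Parkin (1975) probability approach that this paper argues against, not the authors' alternative. It assumes normally distributed expectations and a known indifference interval (-delta, +delta); the example shares and the value of delta are placeholders.

        # Carlson-Parkin style conversion of "prices will fall / rise" survey shares
        # into an implied mean and standard deviation of inflation expectations.
        from statistics import NormalDist

        def carlson_parkin(share_down, share_up, delta):
            """Return (mean, std dev) of implied expectations."""
            ndist = NormalDist()
            a = ndist.inv_cdf(share_down)        # (-delta - mu) / sigma
            b = ndist.inv_cdf(1.0 - share_up)    # ( delta - mu) / sigma
            sigma = 2.0 * delta / (b - a)
            mu = -delta * (a + b) / (b - a)
            return mu, sigma

        # Example: 10% expect falling prices, 55% expect rising prices,
        # indifference band of +/-0.5 percentage points.
        print(carlson_parkin(0.10, 0.55, 0.5))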

  11. Introduction of an automated mine surveying system - a method for effective control of mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Mazhdrakov, M.

    1987-04-01

    Reviews developments in automated processing of mine survey data in Bulgaria for 1965-1970. This development has occurred in three phases. In the first phase, computers calculated coordinates of mine survey points; in the second phase, these data were electronically processed; in the third phase, surface and underground mine development is controlled by electronic data processing equipment. Centralized and decentralized electronic processing of data has been introduced at major coal mines. The Bulgarian Pravets 82 microcomputer and the ASMO-MINI program package are in current use at major coal mines. A lack of plotters, due to financial limitations, handicaps large-scale application of automated mine surveying in Bulgaria.

  12. Application of iterative method with dynamic weight based on observation equation's constant in NPP's surveying

    International Nuclear Information System (INIS)

    Chen Benfu; Guo Xianchun; Zou Zili

    2009-01-01

    It is useful to identify the data with errors among the large number of observations during the adjustment process, in order to decrease the influence of the errors and to improve the quality of the final surveying result. Based on the practical conditions of the nuclear power plant's plane control network, a simple way to calculate the threshold value used to pre-weight each datum before the adjustment calculation is given; the approach shows some advantages in the efficiency of data snooping and in the quality of the final calculation compared with traditional methods such as robust estimation, which processes data with dynamic weights based on each observation's correction after each iteration. (authors)
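    The paper's exact threshold formula is not reproduced in the abstract, so the following is only a generic sketch of iterative down-weighting for data snooping: a trivial "adjustment" (a weighted mean of repeated observations of one quantity) is recomputed while observations whose residuals exceed a k·sigma threshold are down-weighted rather than deleted. All numbers are illustrative.

        # Generic iterative reweighting sketch, not the authors' method.
        def reweighted_mean(obs, sigma=0.003, k=3.0, iterations=5):
            weights = [1.0] * len(obs)
            estimate = sum(obs) / len(obs)
            for _ in range(iterations):
                estimate = sum(w * x for w, x in zip(weights, obs)) / sum(weights)
                for i, x in enumerate(obs):
                    residual = abs(x - estimate)
                    # Down-weight suspect data instead of discarding them outright.
                    weights[i] = 1.0 if residual <= k * sigma else (k * sigma / residual) ** 2
            return estimate, weights

        obs = [100.001, 100.002, 99.999, 100.000, 100.025]   # last value carries a gross error
        print(reweighted_mean(obs))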

  13. Method and apparatus for lysing and processing algae

    Science.gov (United States)

    Chew, Geoffrey; Reich, Alton J.; Dykes, Jr., H. Waite H.; Di Salvo, Roberto

    2013-03-05

    Methods and apparatus for processing algae are described in which a hydrophilic ionic liquid is used to lyse algae cells at lower temperatures than existing algae processing methods. A salt or salt solution is used as a separation agent and to remove water from the ionic liquid, allowing the ionic liquid to be reused. The used salt may be dried or concentrated and reused. The relatively low lysis temperatures and recycling of the ionic liquid and salt reduce the environmental impact of the algae processing while providing biofuels and other useful products.

  14. An enhanced sine dwell method as applied to the Galileo core structure modal survey

    Science.gov (United States)

    Smith, Kenneth S.; Trubert, Marc

    1990-01-01

    An incremental modal survey, performed in 1988 on the core structure of the Galileo spacecraft and its adapters to assess the dynamics of the new portions of the structure, is considered. Emphasis is placed on the enhancements of the sine dwell method employed in the test. For each mode, response data are acquired at 32 frequencies in a narrow band enclosing the resonance, utilizing the SWIFT technique. It is pointed out that, due to the simplicity of the data processing involved, the diagnostic and modal-parameter data are available within several minutes after data acquisition; however, compared with straight curve-fitting approaches, the method requires more time for data acquisition.

  15. Survey of artificial intelligence methods for detection and identification of component faults in nuclear power plants

    International Nuclear Information System (INIS)

    Reifman, J.

    1997-01-01

    A comprehensive survey of computer-based systems that apply artificial intelligence methods to detect and identify component faults in nuclear power plants is presented. Classification criteria are established that categorize artificial intelligence diagnostic systems according to the types of computing approaches used (e.g., computing tools, computer languages, and shell and simulation programs), the types of methodologies employed (e.g., types of knowledge, reasoning and inference mechanisms, and diagnostic approach), and the scope of the system. The major issues of process diagnostics and computer-based diagnostic systems are identified and cross-correlated with the various categories used for classification. Ninety-five publications are reviewed

  16. Data processing for the mise-a-la-masse survey; Ryuden den`iho tansa no data shori

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, K; Mizunaga, H; Ushijima, K [Kyushu University, Fukuoka (Japan). Faculty of Engineering; Kaieda, H [Central Research Institute of Electric Power Industry, Tokyo (Japan)

    1996-10-01

    A data processing method was studied for the mise-a-la-masse survey. The mise-a-la-masse method, using wells as current sources, measures the resultant ground surface potential difference and detects local underground resistivity anomalies. To extract a resistivity anomaly that differs greatly from its surroundings, the anomaly is taken as the difference between the observed potential difference or apparent resistivity and the value estimated for a regular resistivity structure (background). The following three approximations were attempted to estimate the background: a theoretical equation assuming an isotropic homogeneous resistivity structure, one assuming a horizontally multi-layered structure, and an exponential function of distance from the linear electrode for apparent resistivity; these were compared with each other by numerical model experiments. The data processing method, which determines the background resistivity structure and the residual of the apparent resistivity distribution by inversion analysis using the second equation above, could precisely extract local resistivity anomalies and estimate the depth distribution of resistivity. 5 refs., 10 figs., 2 tabs.
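    The sketch below illustrates only the simplest of the three background approximations mentioned above (the exponential function of distance): the apparent resistivity is fitted by an exponential decay in log space and the residual is taken as the local anomaly. The distances and resistivities are synthetic; the paper's actual inversion is more elaborate.

        # Fit rho_a(r) ~ A * exp(k * r) by least squares on the logarithm, then take
        # the residual (observed minus background) as the anomaly indicator.
        import math

        def exp_background(distances, rho_app):
            logs = [math.log(r) for r in rho_app]
            n = len(distances)
            mean_x = sum(distances) / n
            mean_y = sum(logs) / n
            slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(distances, logs))
                     / sum((x - mean_x) ** 2 for x in distances))
            intercept = mean_y - slope * mean_x
            background = [math.exp(intercept + slope * x) for x in distances]
            residual = [obs - bg for obs, bg in zip(rho_app, background)]
            return background, residual

        d = [10, 20, 30, 40, 50]
        rho = [120.0, 95.0, 60.0, 70.0, 55.0]   # the bump near 40 m plays the "anomaly"
        print(exp_background(d, rho))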

  17. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    Science.gov (United States)

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.

  18. The auditory processing battery: survey of common practices.

    Science.gov (United States)

    Emanuel, Diana C

    2002-02-01

    A survey of auditory processing (AP) diagnostic practices was mailed to all licensed audiologists in the State of Maryland and sent as an electronic mail attachment to the American Speech-Language-Hearing Association and Educational Audiology Association Internet forums. Common AP protocols (25 from the Internet, 28 from audiologists in Maryland) included requiring basic audiologic testing, using questionnaires, and administering dichotic listening, monaural low-redundancy speech, temporal processing, and electrophysiologic tests. Some audiologists also administer binaural interaction, attention, memory, and speech-language/psychological/educational tests and incorporate a classroom observation. The various AP batteries presently administered appear to be based on the availability of AP tests with well-documented normative data. Resources for obtaining AP tests are listed.

  19. Development of a numerical simulation method for melting/solidification and dissolution/precipitation phenomena. 1. Literature survey for computer program design

    International Nuclear Information System (INIS)

    Uchibori, Akihiro; Ohshima, Hiroyuki

    2004-04-01

    Survey research of numerical methods for melting/solidification and dissolution/precipitation phenomena was performed to determine the policy for a simulation program development. Melting/solidification and dissolution/ precipitation have been key issues for feasibility evaluation of several techniques applied in the nuclear fuel cycle processes. Physical models for single-component melting/solidification, two-component solution solidification or precipitation by cooling and precipitation by electrolysis, which are moving boundary problems, were made clear from the literature survey. The transport equations are used for thermal hydraulic analysis in the solid and the liquid regions. Behavior of the solid-liquid interface is described by the heat and mass transfer model. These physical models need to be introduced into the simulation program. The numerical methods for the moving boundary problems are categorized into two types: interface tracking method and interface capturing method. Based on the classification, performance of each numerical method was evaluated. The interface tracking method using the Lagrangian moving mesh requires relatively complicated algorithm. The algorithm has high accuracy for predicting the moving interface. On the other hand, the interface capturing method uses the Eulerian fixing mesh, leading to simple algorithm. Prediction accuracy of the method is relatively low. The extended finite element method classified as the interface capturing method can predict the interface behavior accurately even though the Eulerian fixing mesh is used. We decided to apply the extended finite element method to the simulation program. (author)

  20. Generations and Gender Survey (GGS: Towards a Better Understanding of Relationships and Processes in the Life Course

    Directory of Open Access Journals (Sweden)

    Zsolt Spéder

    2007-11-01

    Full Text Available The Generations and Gender Survey (GGS is one of the two pillars of the Generations and Gender Programme designed to improve understanding of demographic and social development and of the factors that influence these developments. This article describes how the theoretical perspectives applied in the survey, the survey design and the questionnaire are related to this objective. The key features of the survey include panel design, multidisciplinarity, comparability, context-sensitivity, inter-generational and gender relationships. The survey applies the life course approach, focussing on the processes of childbearing, partnership dynamics, home leaving, and retiring. The selection of topics for data collection mainly follows the criterion of theoretically grounded relevance to explaining one or more of the mentioned processes. A large portion of the survey deals with economic aspects of life, such as economic activity, income, and economic well-being; a comparably large section is devoted to values and attitudes. Other domains covered by the survey include gender relationships, household composition and housing, residential mobility, social networks and private transfers, education, health, and public transfers. The third chapter of the article describes the motivations for their inclusion. The GGS questionnaire is designed for a face-to-face interview. It includes the core that each participating country needs to implement in full, and four optional sub-modules on nationality and ethnicity, on previous partners, on intentions of breaking up, and on housing, respectively. The participating countries are encouraged to include also the optional sub-modules to facilitate comparative research on these topics.

  1. Optimizing Methods of Obtaining Stellar Parameters for the H3 Survey

    Science.gov (United States)

    Ivory, KeShawn; Conroy, Charlie; Cargile, Phillip

    2018-01-01

    The Stellar Halo at High Resolution with Hectochelle Survey (H3) is in the process of observing and collecting stellar parameters for stars in the Milky Way's halo. With a goal of measuring radial velocities for fainter stars, it is crucial that we have optimal methods of obtaining this and other parameters from the data for these stars. The method currently developed is The Payne, named after Cecilia Payne-Gaposchkin, a code that uses neural networks and Markov Chain Monte Carlo methods to utilize both spectra and photometry to obtain values for stellar parameters. This project was to investigate the benefit of fitting both spectra and spectral energy distributions (SEDs). Mock spectra using the parameters of the Sun were created and noise was inserted at various signal-to-noise values. The Payne then fit each mock spectrum with and without a mock SED also generated from solar parameters. The result was that at high signal to noise, the spectrum dominated and the effect of fitting the SED was minimal. But at low signal to noise, the addition of the SED greatly decreased the standard deviation of the data and resulted in more accurate values for temperature and metallicity.

  2. HIDE & SEEK: End-to-end packages to simulate and process radio survey data

    Science.gov (United States)

    Akeret, J.; Seehars, S.; Chang, C.; Monstein, C.; Amara, A.; Refregier, A.

    2017-01-01

    As several large single-dish radio surveys begin operation within the coming decade, a wealth of radio data will become available and provide a new window to the Universe. In order to fully exploit the potential of these datasets, it is important to understand the systematic effects associated with the instrument and the analysis pipeline. A common approach to tackle this is to forward-model the entire system, from the hardware to the analysis of the data products. For this purpose, we introduce two newly developed, open-source Python packages: the HI Data Emulator (HIDE) and the Signal Extraction and Emission Kartographer (SEEK) for simulating and processing single-dish radio survey data. HIDE forward-models the process of collecting astronomical radio signals in a single-dish radio telescope instrument and outputs pixel-level time-ordered-data. SEEK processes the time-ordered-data, removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and aims to recover the astronomical radio signal. The two packages can be used separately or together depending on the application. Their modular and flexible nature allows easy adaptation to other instruments and datasets. We describe the basic architecture of the two packages and examine in detail the noise and RFI modeling in HIDE, as well as the implementation of gain calibration and RFI mitigation in SEEK. We then apply HIDE & SEEK to forward-model a Galactic survey in the frequency range 990-1260 MHz based on data taken at the Bleien Observatory. For this survey, we expect to cover 70% of the full sky and achieve a median signal-to-noise ratio of approximately 5-6 in the cleanest channels including systematic uncertainties. However, we also point out the potential challenges of high RFI contamination and baseline removal when examining the early data from the Bleien Observatory. The fully documented HIDE & SEEK packages are available at http://hideseek.phys.ethz.ch/ and are published

  3. [Method for the quality assessment of data collection processes in epidemiological studies].

    Science.gov (United States)

    Schöne, G; Damerow, S; Hölling, H; Houben, R; Gabrys, L

    2017-10-01

    For a quantitative evaluation of primary data collection processes in epidemiological surveys based on accompaniments and observations (in the field), there is no description of test criteria and methodologies in relevant literature and thus no known application in practice. Therefore, methods need to be developed and existing procedures adapted. The aim was to identify quality-relevant developments within quality dimensions by means of inspection points (quality indicators) during the process of data collection. As a result we seek to implement and establish a methodology for the assessment of overall survey quality supplementary to standardized data analyses. Monitors detect deviations from standard primary data collection during site visits by applying standardized checklists. Quantitative results - overall and for each dimension - are obtained by numerical calculation of quality indicators. Score results are categorized and color coded. This visual prioritization indicates necessity for intervention. The results obtained give clues regarding the current quality of data collection. This allows for the identification of such sections where interventions for quality improvement are needed. In addition, process quality development can be shown over time on an intercomparable basis. This methodology for the evaluation of data collection quality can identify deviations from norms, focalize quality analyses and help trace causes for significant deviations.
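    The abstract does not publish the scoring rules, so the following is an invented illustration of the general idea it describes: checklist observations are turned into a per-dimension score and colour-coded so that sections needing intervention stand out. Dimension names, weights, and cut-offs are assumptions.

        # Illustrative quality-indicator scoring with traffic-light categorisation.
        def dimension_score(checks):
            """checks: list of booleans, one per inspection point in a quality dimension."""
            return 100.0 * sum(1 for passed in checks if passed) / len(checks)

        def traffic_light(score):
            if score >= 90.0:
                return "green"      # no action needed
            if score >= 75.0:
                return "yellow"     # monitor / minor corrective action
            return "red"            # intervention required

        observations = {
            "standardised measurement": [True, True, False, True],
            "interview conduct": [True, True, True, True],
            "documentation": [True, False, False, True],
        }
        for dim, checks in observations.items():
            s = dimension_score(checks)
            print(dim, round(s, 1), traffic_light(s))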

  4. Einstein Slew Survey: Data analysis innovations

    Science.gov (United States)

    Elvis, Martin S.; Plummer, David; Schachter, Jonathan F.; Fabbiano, G.

    1992-01-01

    Several new methods were needed in order to make the Einstein Slew X-ray Sky Survey. The innovations which enabled the Slew Survey to be done are summarized. These methods included an experimental approach to large projects, parallel processing on a LAN, percolation source detection, minimum action identifications, and rapid dissemination of the whole database.

  5. Survey and alignment of high energy physics accelerators and transport lines

    International Nuclear Information System (INIS)

    Ruland, R.E.

    1992-11-01

    This talk summarizes the survey and alignment processes of accelerators and transport lines and discusses the propagation of errors associated with these processes. The major geodetic principles governing the survey and alignment measurement space are revisited and their relationship to a lattice coordinate system shown. The paper continues with a broad overview about the activities involved in the step by step sequence from initial absolute alignment to final smoothing. Emphasis is given to the relative alignment of components, in particular to the importance of incorporating methods to remove residual systematic effects in surveying and alignment operations

  6. Comparison of Health Examination Survey Methods in Brazil, Chile, Colombia, Mexico, England, Scotland, and the United States.

    Science.gov (United States)

    Mindell, Jennifer S; Moody, Alison; Vecino-Ortiz, Andres I; Alfaro, Tania; Frenz, Patricia; Scholes, Shaun; Gonzalez, Silvia A; Margozzini, Paula; de Oliveira, Cesar; Sanchez Romero, Luz Maria; Alvarado, Andres; Cabrera, Sebastián; Sarmiento, Olga L; Triana, Camilo A; Barquera, Simón

    2017-09-15

    Comparability of population surveys across countries is key to appraising trends in population health. Achieving this requires deep understanding of the methods used in these surveys to examine the extent to which the measurements are comparable. In this study, we obtained detailed protocols of 8 nationally representative surveys from 2007-2013 from Brazil, Chile, Colombia, Mexico, the United Kingdom (England and Scotland), and the United States, countries that differ in economic and inequity indicators. Data were collected on sampling frame, sample selection procedures, recruitment, data collection methods, content of interview and examination modules, and measurement protocols. We also assessed their adherence to the World Health Organization's "STEPwise Approach to Surveillance" framework for population health surveys. The surveys, which included half a million participants, were highly comparable on sampling methodology, survey questions, and anthropometric measurements. Heterogeneity was found for physical activity questionnaires and biological samples collection. The common age range included by the surveys was adults aged 18-64 years. The methods used in these surveys were similar enough to enable comparative analyses of the data across the 7 countries. This comparability is crucial in assessing and comparing national and subgroup population health, and to assisting the transfer of research and policy knowledge across countries. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Survey, applications, and prospects of Johnson noise thermometry

    International Nuclear Information System (INIS)

    Blalock, T.V.; Shepard, R.L.

    1981-01-01

    Significant progress in the field of Johnson noise thermometry has occurred since the 1971 survey of Kamper. This paper will review the foundation work of Johnson noise thermometry, survey the basic methods which do not utilize quantum devices for noise thermometry for industrial temperatures, and present some applications of noise thermometry in temperature scale metrology and process temperature instrumentation. 35 references
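    The review does not restate the underlying physics, but the Nyquist relation on which Johnson noise thermometry rests is a textbook fact: the mean-square thermal noise voltage across a resistance R in a bandwidth df is V^2 = 4·k_B·T·R·df, so measuring V, R and df yields the thermodynamic temperature T. A small numerical illustration follows; the example values are arbitrary.

        # Temperature from Johnson (thermal) noise via the Nyquist relation.
        K_B = 1.380649e-23          # Boltzmann constant, J/K

        def temperature_from_noise(v_rms, resistance_ohm, bandwidth_hz):
            return v_rms ** 2 / (4.0 * K_B * resistance_ohm * bandwidth_hz)

        # ~0.129 uV rms across 1 kOhm in a 1 kHz bandwidth corresponds to ~300 K.
        print(temperature_from_noise(1.29e-7, 1.0e3, 1.0e3))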

  8. Process control and optimization with simple interval calculation method

    DEFF Research Database (Denmark)

    Pomerantsev, A.; Rodionova, O.; Høskuldsson, Agnar

    2006-01-01

    Methods of process control and optimization are presented and illustrated with a real world example. The optimization methods are based on the PLS block modeling as well as on the simple interval calculation methods of interval prediction and object status classification. It is proposed to employ the series of expanding PLS/SIC models in order to support the on-line process improvements. This method helps to predict the effect of planned actions on the product quality and thus enables passive quality control. We have also considered an optimization approach that proposes the correcting actions for the quality improvement in the course of production. The latter is an active quality optimization, which takes into account the actual history of the process. The advocated approach is allied to the conventional method of multivariate statistical process control (MSPC) as it also employs the historical process data.

  9. Analytical methods manual for the Mineral Resource Surveys Program, U.S. Geological Survey

    Science.gov (United States)

    Arbogast, Belinda F.

    1996-01-01

    The analytical methods validated by the Mineral Resource Surveys Program, Geologic Division, are the subject of this manual. This edition replaces the methods portion of Open-File Report 90-668 published in 1990. Newer methods may be used which have been approved by the quality assurance (QA) project and are on file with the QA coordinator. This manual is intended primarily for use by laboratory scientists; it can also assist laboratory users in evaluating the data they receive. The analytical methods are written in a step-by-step approach so that they may be used as a training tool and provide detailed documentation of the procedures for quality assurance. A "Catalog of Services" is available for customer (submitter) use with brief listings of: the element(s)/species determined, method of determination, reference to cite, contact person, summary of the technique, and analyte concentration range. For a copy please contact the Branch office at (303) 236-1800 or fax (303) 236-3200.

  10. Survey and analysis of deep water mineral deposits using nuclear methods

    International Nuclear Information System (INIS)

    Staehle, C.M.; Noakes, J.E.; Spaulding, J.

    1991-01-01

    Present knowledge of the location, quality, quantity and recoverability of sea floor minerals is severely limited, particularly in the abyssal depths and deep water within the 200-mile Exclusive Economic Zone (EEZ) surrounding the U.S. Pacific Islands. To improve this understanding and permit exploitation of these mineral reserves, much additional data is needed. This paper discusses a sponsored program for extending existing proven nuclear survey methods, currently used on the shallow continental margins of the Atlantic and Gulf of Mexico, into the deeper waters of the Pacific. This nuclear technology can be readily integrated and extended to depths of 2000 m using the existing RCV-150 remotely operated vehicle (ROV) and the PISCES V manned deep submersible vehicle (DSV) operated by the University of Hawaii's Hawaii Undersea Research Laboratory (HURL). Previous papers by the authors have also proposed incorporating these nuclear analytical methods for survey of the deep ocean through the use of autonomous underwater vehicles (AUVs). Such a vehicle could extend the use of passive nuclear instrument operation, in addition to conventional analytical methods, into the abyssal depths and do so with speed and economy not otherwise possible. The natural radioactivity associated with manganese nodules and crustal deposits is sufficiently above normal background levels to allow discrimination and quantification in near real time.

  11. A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice

    Science.gov (United States)

    Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.

    2015-01-01

    To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…

  12. Strategic survey framework for the Northwest Forest Plan survey and manage program.

    Science.gov (United States)

    Randy Molina; Dan McKenzie; Robin Lesher; Jan Ford; Jim Alegria; Richard Cutler

    2003-01-01

    This document outlines an iterative process for assessing the information needs for all Northwest Forest Plan (NWFP) survey and manage species, designing and implementing strategic surveys (including field surveys and other information-gathering processes), and analyzing that information for use in the NWFP annual species review and adaptive-management processes. The...

  13. Mobile acoustic transects miss rare bat species: implications of survey method and spatio-temporal sampling for monitoring bats

    Directory of Open Access Journals (Sweden)

    Elizabeth C. Braun de Torrez

    2017-11-01

    Due to increasing threats facing bats, long-term monitoring protocols are needed to inform conservation strategies. Effective monitoring should be easily repeatable while capturing spatio-temporal variation. Mobile acoustic driving transect surveys ('mobile transects') have been touted as a robust, cost-effective method to monitor bats; however, it is not clear how well mobile transects represent dynamic bat communities, especially when used as the sole survey approach. To assist biologists who must select a single survey method due to resource limitations, we assessed the effectiveness of three acoustic survey methods at detecting species richness in a vast protected area (Everglades National Park): (1) mobile transects, (2) stationary surveys that were strategically located by sources of open water and (3) stationary surveys that were replicated spatially across the landscape. We found that mobile transects underrepresented bat species richness compared to stationary surveys across all major vegetation communities and in two distinct seasons (dry/cool and wet/warm). Most critically, mobile transects failed to detect three rare bat species, one of which is federally endangered. Spatially replicated stationary surveys did not estimate higher species richness than strategically located stationary surveys, but increased the rate at which species were detected in one vegetation community. The survey strategy that detected maximum species richness and the highest mean nightly species richness with minimal effort was a strategically located stationary detector in each of two major vegetation communities during the wet/warm season.

  14. Mobile acoustic transects miss rare bat species: implications of survey method and spatio-temporal sampling for monitoring bats.

    Science.gov (United States)

    Braun de Torrez, Elizabeth C; Wallrichs, Megan A; Ober, Holly K; McCleery, Robert A

    2017-01-01

    Due to increasing threats facing bats, long-term monitoring protocols are needed to inform conservation strategies. Effective monitoring should be easily repeatable while capturing spatio-temporal variation. Mobile acoustic driving transect surveys ('mobile transects') have been touted as a robust, cost-effective method to monitor bats; however, it is not clear how well mobile transects represent dynamic bat communities, especially when used as the sole survey approach. To assist biologists who must select a single survey method due to resource limitations, we assessed the effectiveness of three acoustic survey methods at detecting species richness in a vast protected area (Everglades National Park): (1) mobile transects, (2) stationary surveys that were strategically located by sources of open water and (3) stationary surveys that were replicated spatially across the landscape. We found that mobile transects underrepresented bat species richness compared to stationary surveys across all major vegetation communities and in two distinct seasons (dry/cool and wet/warm). Most critically, mobile transects failed to detect three rare bat species, one of which is federally endangered. Spatially replicated stationary surveys did not estimate higher species richness than strategically located stationary surveys, but increased the rate at which species were detected in one vegetation community. The survey strategy that detected maximum species richness and the highest mean nightly species richness with minimal effort was a strategically located stationary detector in each of two major vegetation communities during the wet/warm season.

  15. Development of a Survey to Assess Local Health Department Organizational Processes and Infrastructure for Supporting Obesity Prevention.

    Science.gov (United States)

    Xiao, Ting; Stamatakis, Katherine A; McVay, Allese B

    Local health departments (LHDs) have an important function in controlling the growing epidemic of obesity in the United States. Data are needed to gain insight into the existence of routine functions and structures of LHDs that support and sustain obesity prevention efforts. The purpose of this study was to develop and examine the reliability of measures to assess foundational LHD organizational processes and functions specific to obesity prevention. Survey measures were developed using a stratified, random sample of US LHDs to assess supportive organizational processes and infrastructure for obesity prevention representing different domains. Data were analyzed using weighted κ and intraclass correlation coefficient for assessing test-retest reliability. Most items and summary indices in the majority of survey domains had moderate/substantial or almost perfect reliability. The overall findings support this survey instrument to be a reliable measurement tool for a large number of processes and functions that comprise obesity prevention-related capacity in LHDs.

  16. Continuing to Confront COPD International Patient Survey: methods, COPD prevalence, and disease burden in 2012-2013

    NARCIS (Netherlands)

    Landis, Sarah H.; Muellerova, Hana; Mannino, David M.; Menezes, Ana M.; Han, MeiLan K.; van der Molen, Thys; Ichinose, Masakazu; Aisanov, Zaurbek; Oh, Yeon-Mok; Davis, Kourtney J.

    2014-01-01

    Purpose: The Continuing to Confront COPD International Patient Survey aimed to estimate the prevalence and burden of COPD globally and to update findings from the Confronting COPD International Survey conducted in 1999-2000. Materials and methods: Chronic obstructive pulmonary disease (COPD)

  17. Resident Choice and the Survey Process: The Need for Standardized Observation and Transparency

    Science.gov (United States)

    Schnelle, John F.; Bertrand, Rosanna; Hurd, Donna; White, Alan; Squires, David; Feuerberg, Marvin; Hickey, Kelly; Simmons, Sandra F.

    2009-01-01

    Purpose: To describe a standardized observation protocol to determine if nursing home (NH) staff offer choice to residents during 3 morning activities of daily living (ADL) and compare the observational data with deficiency statements cited by state survey staff. Design and Methods: Morning ADL care was observed in 20 NHs in 5 states by research…

  18. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work-flows. The use of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.

  19. Survey of potential chlorine production processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-04-01

    This report is part of the ongoing study of industrial electrochemical processes for the purpose of identifying methods of improving energy efficiencies. A computerized literature search of past and current chlorine generation methods was performed to identify basic chlorine production processes. Over 200 pertinent references are cited involving 20 separate and distinct chlorine processes. Each basic process is evaluated for its engineering and economic viability and energy efficiency. A flow diagram is provided for each basic process. Four criteria are used to determine the most promising processes: raw material availability, type and amount of energy required, by-product demand/disposal and status of development. The most promising processes are determined to be the membrane process (with and without catalytic electrodes), Kel-Chlor, Mobay (direct electrolysis of hydrogen chloride), the Shell process (catalytic oxidation of hydrogen chloride) and oxidation of ammonium chloride. Each of these processes is further studied to determine what activities may be pursued.

  20. A Survey of Methods for Gas-Lift Optimization

    Directory of Open Access Journals (Sweden)

    Kashif Rashid

    2012-01-01

    Full Text Available This paper presents a survey of methods and techniques developed for the solution of the continuous gas-lift optimization problem over the last two decades. These range from isolated single-well analysis all the way to real-time multivariate optimization schemes encompassing all wells in a field. While some methods are clearly limited due to their neglect of treating the effects of inter-dependent wells with common flow lines, other methods are limited due to the efficacy and quality of the solution obtained when dealing with large-scale networks comprising hundreds of difficult to produce wells. The aim of this paper is to provide an insight into the approaches developed and to highlight the challenges that remain.
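
    As a concrete illustration of the single-facility allocation problem surveyed here, the sketch below distributes a fixed lift-gas supply across a few wells so as to maximise total production. The concave well performance curves and the gas limit are invented for the example; real studies would use measured gas-lift performance relationships and a network model rather than these placeholders.

```python
# Toy continuous gas-lift allocation: maximise total production subject to a
# shared lift-gas limit. Well response curves are hypothetical placeholders.
import numpy as np
from scipy.optimize import minimize

# Simple concave gas-lift performance curves q_i(g) = a*(1 - exp(-g/b)) - c*g
params = [(900.0, 2.0, 20.0), (650.0, 1.5, 15.0), (400.0, 1.0, 10.0)]
GAS_AVAILABLE = 4.0  # total lift gas available (arbitrary units)

def total_rate(g):
    return sum(a * (1.0 - np.exp(-gi / b)) - c * gi
               for gi, (a, b, c) in zip(g, params))

res = minimize(
    lambda g: -total_rate(g),              # maximise by minimising the negative
    x0=np.full(len(params), GAS_AVAILABLE / len(params)),
    bounds=[(0.0, GAS_AVAILABLE)] * len(params),
    constraints=[{"type": "ineq",
                  "fun": lambda g: GAS_AVAILABLE - g.sum()}],  # sum(g) <= limit
    method="SLSQP",
)
print("gas split:", np.round(res.x, 2), "total rate:", round(total_rate(res.x), 1))
```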

  1. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1996-03-01

    During the 'Workshop on R and D needs' at the 3rd Meeting of the International Group on Research Reactors (IGORR-III), the participants agreed that it would be useful to compile a survey of the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods various organizations use to verify and validate their codes and libraries. Five organizations, Atomic Energy of Canada Limited (AECL, Canada), China Institute of Atomic Energy (CIAE, People's Republic of China), Japan Atomic Energy Research Institute (JAERI, Japan), Oak Ridge National Laboratories (ORNL, USA), and Siemens (Germany) responded to the survey. The results of the survey are compiled in this report. (author) 36 refs., 3 tabs

  2. Methods of information processing

    Energy Technology Data Exchange (ETDEWEB)

    Kosarev, Yu G; Gusev, V D

    1978-01-01

    Works are presented on automation systems for editing and publishing operations using methods for processing symbolic information and information contained in a training selection (ranking of objectives by promise, a classification algorithm for tones and noise). The book will be of interest to specialists in the automation of processing textual information, programming, and pattern recognition.

  3. Development of continuous pharmaceutical production processes supported by process systems engineering methods and tools

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2012-01-01

    The pharmaceutical industry is undergoing a radical transition towards continuous production processes. Systematic use of process systems engineering (PSE) methods and tools forms the key to achieving this transition in a structured and efficient way.

  4. Traditional methods v. new technologies - dilemmas for dietary assessment in large-scale nutrition surveys and studies: a report following an international panel discussion at the 9th International Conference on Diet and Activity Methods (ICDAM9), Brisbane, 3 September 2015.

    Science.gov (United States)

    Amoutzopoulos, B; Steer, T; Roberts, C; Cade, J E; Boushey, C J; Collins, C E; Trolle, E; de Boer, E J; Ziauddeen, N; van Rossum, C; Buurma, E; Coyle, D; Page, P

    2018-01-01

    The aim of the present paper is to summarise current and future applications of dietary assessment technologies in nutrition surveys in developed countries. It includes the discussion of key points and highlights of subsequent developments from a panel discussion to address strengths and weaknesses of traditional dietary assessment methods (food records, FFQ, 24 h recalls, diet history with interviewer-assisted data collection) v. new technology-based dietary assessment methods (web-based and mobile device applications). The panel discussion 'Traditional methods v. new technologies: dilemmas for dietary assessment in population surveys', was held at the 9th International Conference on Diet and Activity Methods (ICDAM9), Brisbane, September 2015. Despite respondent and researcher burden, traditional methods have been most commonly used in nutrition surveys. However, dietary assessment technologies offer potential advantages including faster data processing and better data quality. This is a fast-moving field and there is evidence of increasing demand for the use of new technologies amongst the general public and researchers. There is a need for research and investment to support efforts being made to facilitate the inclusion of new technologies for rapid, accurate and representative data.

  5. Comparison of Survey Data Collection Methods

    Directory of Open Access Journals (Sweden)

    VIDAL DÍAZ DE RADA

    2017-01-01

    Full Text Available This paper presents the results of a mixed-mode survey of the general population using a combination of postal, telephone, and Internet survey protocols. Potential respondents were invited to participate by ordinary mail and were allowed to choose their preferred response mode. The paper focuses on response quality (taking non-responses into consideration), fieldwork time and data collection cost. The results reveal that the Internet survey produces the lowest rate of non-responses and requires significantly less fieldwork time, although it is slightly more costly than the postal survey. However, when differences in cost structure are taken into account, we find that the same number of completed questionnaires could have been obtained through the Internet alone at a cost that is 18.2% lower than the mixed-mode survey.

  6. Hydrographic processing considerations in the “Big Data” age: An overview of technology trends in ocean and coastal surveys

    Science.gov (United States)

    Holland, M.; Hoggarth, A.; Nicholson, J.

    2016-04-01

    The quantity of information generated by survey sensors for ocean and coastal zone mapping has reached the “Big Data” age. This is influenced by the number of survey sensors available to conduct a survey, high data resolution, commercial availability, as well as an increased use of autonomous platforms. The number of users of sophisticated survey information is also growing with the increase in data volume. This is leading to a greater demand and broader use of the processed results, which includes marine archeology, disaster response, and many other applications. Data processing and exchange techniques are evolving to ensure this increased accuracy in acquired data meets the user demand, and leads to an improved understanding of the ocean environment. This includes the use of automated processing, models that maintain the best possible representation of varying resolution data to reduce duplication, as well as data plug-ins and interoperability standards. Through the adoption of interoperable standards, data can be exchanged between stakeholders and used many times in any GIS to support an even wider range of activities. The growing importance of Marine Spatial Data Infrastructure (MSDI) is also contributing to the increased access of marine information to support sustainable use of ocean and coastal environments. This paper offers an industry perspective on trends in hydrographic surveying and processing, and the increased use of marine spatial data.

  7. Google Street View as an alternative method to car surveys in large-scale vegetation assessments.

    Science.gov (United States)

    Deus, Ernesto; Silva, Joaquim S; Catry, Filipe X; Rocha, Miguel; Moreira, Francisco

    2015-10-01

    Car surveys (CS) are a common method for assessing the distribution of alien invasive plants. Google Street View (GSV), a free-access web technology where users may experience a virtual travel along roads, has been suggested as a cost-effective alternative to car surveys. We tested if we could replicate the results from a countrywide survey conducted by car in Portugal using GSV as a remote sensing tool, aiming at assessing the distribution of Eucalyptus globulus Labill. wildlings on roadsides adjacent to eucalypt stands. Georeferenced points gathered along CS were used to create road transects visible as lines overlapping the road in GSV environment, allowing surveying the same sampling areas using both methods. This paper presents the results of the comparison between the two methods. Both methods produced similar models of plant abundance, selecting the same explanatory variables, in the same hierarchical order of importance and depicting a similar influence on plant abundance. Even though the GSV model had a lower performance and the GSV survey detected fewer plants, additional variables collected exclusively with GSV improved model performance and provided a new insight into additional factors influencing plant abundance. The survey using GSV required ca. 9 % of the funds and 62 % of the time needed to accomplish the CS. We conclude that GSV may be a cost-effective alternative to CS. We discuss some advantages and limitations of GSV as a survey method. We forecast that GSV may become a widespread tool in road ecology, particularly in large-scale vegetation assessments.

  8. Digital signal processing with kernel methods

    CERN Document Server

    Rojo-Alvarez, José Luis; Muñoz-Marí, Jordi; Camps-Valls, Gustavo

    2018-01-01

    A realistic and comprehensive review of joint approaches to machine learning and signal processing algorithms, with application to communications, multimedia, and biomedical engineering systems Digital Signal Processing with Kernel Methods reviews the milestones in the mixing of classical digital signal processing models and advanced kernel machines statistical learning tools. It explains the fundamental concepts from both fields of machine learning and signal processing so that readers can quickly get up to speed in order to begin developing the concepts and application software in their own research. Digital Signal Processing with Kernel Methods provides a comprehensive overview of kernel methods in signal processing, without restriction to any application field. It also offers example applications and detailed benchmarking experiments with real and synthetic datasets throughout. Readers can find further worked examples with Matlab source code on a website developed by the authors. * Presents the necess...
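
    To give a flavour of the joint signal-processing/machine-learning view the book takes, the short sketch below uses kernel ridge regression (a basic kernel method available in scikit-learn, chosen here as an illustrative assumption rather than an example taken from the book) to recover a smooth signal from noisy samples.

```python
# Kernel ridge regression used as a nonparametric denoiser for a 1-D signal.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 200)[:, None]                 # sample times
clean = np.sin(2 * np.pi * 3 * t).ravel()           # underlying signal
noisy = clean + 0.3 * rng.standard_normal(t.shape[0])

# RBF kernel: alpha controls smoothing, gamma the kernel width.
model = KernelRidge(kernel="rbf", alpha=0.1, gamma=50.0).fit(t, noisy)
denoised = model.predict(t)

print("noisy RMSE   :", round(np.sqrt(np.mean((noisy - clean) ** 2)), 3))
print("denoised RMSE:", round(np.sqrt(np.mean((denoised - clean) ** 2)), 3))
```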

  9. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. Results MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%–95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. Conclusions National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. PMID:26965869

  10. Implementation Of Carlson Survey Software 2009 In Survey Works And Comparison With CDS Software

    Directory of Open Access Journals (Sweden)

    Mohamed Faraj EL Megrahi

    2017-02-01

    Full Text Available Automation is one of the most influential changes the surveying concept and profession has had to go through. It has taken effect in two major areas: hardware (the instrumentation used in data collection and presentation) and software (the applications used in data processing and manipulation). Automation is largely computer based and, like all such systems, is subject to frequent improvement; this is manifested in new instrument models every few years, such as total stations, and in newer software versions. One software package with the potential to greatly affect survey automation is Carlson Survey software. When coupled with a total station for data processing and collection respectively, it is capable of greatly improving productivity while reducing the time and cost required in the long run. However, it is only natural for users to want competent software and to be able to choose from what is available on the market based on guided research and credible information from previous studies. Such studies not only help in the choice of software but are also useful for testing approaches and recommending improvements to manufacturers, based on observed advantages and disadvantages, to advance the software industry towards better and more comfortable use. The expected outcome of the research is a successful implementation of Carlson Survey 2009 software in survey works and a comparison with other existing software such as Civil Design Software (CDS), highlighting its advantages and disadvantages.

  11. Proposed methods for treating high-level pyrochemical process wastes

    International Nuclear Information System (INIS)

    Johnson, T.R.; Miller, W.E.; Steunenberg, R.K.

    1985-01-01

    This survey illustrates the large variety and number of possible techniques available for treating pyrochemical wastes; there are undoubtedly other process types and many variations. The choice of a suitable process is complicated by the uncertainty as to what will be an acceptable waste form in the future for both TRU and non-TRU wastes

  12. Survey of an evaluation method for research and development projects; Kenkyu kaihatsu project no hyoka shuho ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This report describes an interim evaluation method and a concrete evaluation method for projects promoted by the Agency of Industrial Science and Technology and NEDO. As a result of the survey, a highly practical interim evaluation method, concrete evaluation items, and evaluation criteria have been proposed, assuming that the projects are evaluated by a project evaluation department independent of the project promotion department. Long-term issues in constructing the evaluation system are also described. The most essential requirement is that the evaluation fulfill its function of effectively promoting subsequent projects. It is also indispensable that the evaluation method and issues proposed in this report be communicated closely to project promoters and researchers, and that the projects be reassessed continuously. Continuous attention to feeding back the evaluation process and improving the evaluation is significant for the long-term construction of the system. 21 refs., 9 figs., 23 tabs.

  13. Preparing investigation of methods for surveying tree seed demands among farmers in Tanzania

    DEFF Research Database (Denmark)

    Aabæk, Anders

    Insufficient seed supply is often a major constraint on tree planting activities in developing countries. A central problem is to assess the actual demand for tree seed. This report shall, as part of a PhD study, prepare an investigation of different methods for surveying tree seed demands and preferences among private farmers in Tanzania. A framework for investigating seed demand and supply is outlined. The role of a national tree seed project in a seed supply sector is discussed, and data requirements for a strategy on seed procurement and tree improvement are outlined. Earlier surveys on seed demand patterns in Tanzania, Uganda and Nicaragua are discussed, and a choice of strategy for an extensive survey of seed demand and supply in Tanzania is made. Different data collection methods and tools, e.g. quantitative and qualitative surveys and rapid rural appraisals, are described in detail.

  14. Finite Element Method in Machining Processes

    CERN Document Server

    Markopoulos, Angelos P

    2013-01-01

    Finite Element Method in Machining Processes provides a concise study on the way the Finite Element Method (FEM) is used in the case of manufacturing processes, primarily in machining. The basics of this kind of modeling are detailed to create a reference that will provide guidelines for those who start to study this method now, but also for scientists already involved in FEM and want to expand their research. A discussion on FEM, formulations and techniques currently in use is followed up by machining case studies. Orthogonal cutting, oblique cutting, 3D simulations for turning and milling, grinding, and state-of-the-art topics such as high speed machining and micromachining are explained with relevant examples. This is all supported by a literature review and a reference list for further study. As FEM is a key method for researchers in the manufacturing and especially in the machining sector, Finite Element Method in Machining Processes is a key reference for students studying manufacturing processes but al...

  15. Image restoration and processing methods

    International Nuclear Information System (INIS)

    Daniell, G.J.

    1984-01-01

    This review will stress the importance of using image restoration techniques that deal with incomplete, inconsistent, and noisy data and do not introduce spurious features into the processed image. No single image is equally suitable for both the resolution of detail and the accurate measurement of intensities. A good general purpose technique is the maximum entropy method and the basis and use of this will be explained. (orig.)
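
    The maximum entropy method mentioned above can be sketched in a few lines for a 1-D deconvolution problem: one seeks the positive reconstruction f that balances the entropy S = -Σ f ln(f/m) against the χ² misfit to the blurred, noisy data. The blur kernel, noise level and simple gradient-ascent scheme below are illustrative assumptions, not the algorithm described in the review.

```python
# Toy maximum-entropy restoration of a blurred, noisy 1-D "image".
import numpy as np

rng = np.random.default_rng(0)
n = 64
truth = np.zeros(n); truth[20] = 5.0; truth[40:44] = 2.0       # a few bright features
kernel = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2)
kernel /= kernel.sum()
blur = lambda x: np.convolve(x, kernel, mode="same")           # forward model A
sigma = 0.2
data = blur(truth) + sigma * rng.standard_normal(n)

m = np.full(n, 0.1)     # default (prior) image
f = m.copy()            # start the reconstruction from the default
lam = 1.0               # trade-off between entropy and chi-square misfit
step = 0.002

for _ in range(3000):
    resid = blur(f) - data
    grad_chi2 = 2.0 * blur(resid) / sigma**2        # A^T(Af - d); kernel symmetric, so A^T ~ A
    grad_entropy = -np.log(f / m) - 1.0             # d/df of S = -sum f*ln(f/m)
    f += step * (grad_entropy - lam * grad_chi2)    # ascend Q = S - lam*chi2
    f = np.clip(f, 1e-6, None)                      # entropy requires a positive image

print("reduced chi-square:", round(np.mean((blur(f) - data) ** 2) / sigma**2, 2))
```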

  16. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in
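
    The outcome rates quoted above follow standard AAPOR definitions. The sketch below computes one common formulation of the response, cooperation, refusal and contact rates from hypothetical call-disposition counts; the counts and the assumed eligibility rate e are placeholders, not figures from the Ghana survey.

```python
# One common AAPOR-style calculation of telephone survey outcome rates.
# Disposition counts below are hypothetical placeholders.
I = 9469     # complete interviews
P = 3547     # partial interviews
R = 2000     # refusals and break-offs
NC = 15000   # non-contacts
O = 500      # other eligible non-interviews
U = 300000   # cases of unknown eligibility (e.g. never-answered numbers)
e = 0.10     # assumed fraction of unknown-eligibility cases that are eligible

eligible = I + P + R + NC + O + e * U
response_rate = (I + P) / eligible                  # counts partials as respondents
cooperation_rate = (I + P) / (I + P + R + O)        # among eligible cases actually contacted
refusal_rate = R / eligible
contact_rate = (I + P + R + O) / eligible

for name, rate in [("response", response_rate), ("cooperation", cooperation_rate),
                   ("refusal", refusal_rate), ("contact", contact_rate)]:
    print(f"{name:>11} rate: {rate:.1%}")
```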

  17. FROM “MODELS” TO “REALITY”, AND RETURN. SOME REFLECTIONS ON THE INTERACTION BETWEEN SURVEY AND INTERPRETATIVE METHODS FOR BUILT HERITAGE CONSERVATION

    Directory of Open Access Journals (Sweden)

    F. Ottoni

    2017-05-01

    Full Text Available It is well known that ever more accurate methodologies and automatic tools are now available in the field of geometric survey and image processing, and that they constitute a fundamental instrument for cultural heritage knowledge and preservation; on the other side, very smart and precise numerical models are continuously improved and used in order to simulate the mechanical behaviour of masonry structures. Both instruments and technologies are an important part of a global process of knowledge which is at the base of any conservation project for cultural heritage. Despite the high accuracy and automation level reached by both technologies and programs, the transfer of data between them is not an easy task, and defining the most reliable way to translate and exchange information without data loss is still an open issue. The goal of the present paper is to analyse the complex process of translation from the very precise (and sometimes redundant) information obtainable by modern survey methodologies for historic buildings (such as laser scanning) into the very simplified (maybe too much so) schemes used to understand their real structural behaviour, with the final aim of contributing to the discussion on reliable methods for improving cultural heritage knowledge through empiricism.

  18. Behavioural Finance: Theory and Survey

    Directory of Open Access Journals (Sweden)

    Daiva Jurevičienė

    2013-04-01

    Full Text Available The paper analyses the importance of behavioural finance theories in the household decision-making process. Behavioural finance theories investigate emotional characteristics to explain subjective factors and irrational anomalies in financial markets. In this regard, behavioural theories and behavioural anomalies in the decision-making process are examined, and the application opportunities in the financial market are described. The aim of the investigation is to determine the basic features and tendencies of behavioural finance in relation to the financial decisions of a household. The survey method was applied to ascertain the financial behaviour of literate households.

  19. The former Semipalatinsk Test Site survey by field γ-spectrometry method

    International Nuclear Information System (INIS)

    Pakhomov, S.A.; Dubasov, Yu.V.; Biryukov, E.I.; Gavrilin, S.S.; Ilyin, L.I.

    2001-01-01

    Full text: Field γ-spectrometry is the most productive method for obtaining information on the content of gamma-emitting radionuclides in the objects under survey. For many years the Radium Institute was one of the most active participants in radiation survey work at the Semipalatinsk Test Site, including work by the field γ-spectrometry method. Applying field γ-spectrometry presupposes the availability of appropriate apparatus and methodological software that allow both the measurement and recording of γ-spectral energy characteristics under field conditions and the final processing (using corresponding physical models) that yields the information on radionuclide contents. Different variants of such apparatus were developed and built at the Radium Institute. For conditions of complex spectral structure and high intensity, a portable Ge(Li)-detector spectrometer, fed by alkaline accumulators and able to record spectra at high count rates, was developed. For walking γ-surveys, two variants of the portable spectrometer 'Skif-3' were developed jointly with the University: one with a standard scintillation detector based on a NaI(Tl) crystal of 63x63 mm, and the other with a large 'X-ray' detector based on a CsI crystal 165 mm in diameter, designed for registration of soft gamma radiation, including that of 241 Am. During 8 hours of autonomous operation, 'Skif-3' can record into internal memory up to 100 spectra with exposures of up to 9999 s. At a 10 min exposure, the sensitivity of 'Skif-3' for detecting 241 Am (10 Bq/kg) exceeds that of the portable 'InSpector' spectrometer (Canberra) operating with a 1 hour exposure. For automobile measurements, a car spectrometer was developed, fed from the vehicle's on-board supply and carrying four detectors based on NaI crystals of 200x110 mm with a total volume of about 12 l. For express processing of 'Skif-3' scintillation spectra, special

  20. Survey nonresponse among ethnic minorities in a national health survey - a mixed-method study of participation, barriers, and potentials

    DEFF Research Database (Denmark)

    Ahlmark, Nanna; Algren, Maria Holst; Holmberg, Teresa

    2015-01-01

    Objectives. The participation rate in the Danish National Health Survey (DNHS) 2010 was significantly lower among ethnic minorities than ethnic Danes. The purpose was to characterize nonresponse among ethnic minorities in DNHS, analyze variations in item nonresponse, and investigate barriers and incentives to participation. Design. This was a mixed-method study. Logistic regression was used to analyze nonresponse using data from DNHS (N = 177,639) and chi-square tests in item nonresponse analyses. We explored barriers and incentives regarding participation through focus groups and cognitive interviews. Barriers ranged from alienation generated by the questions' focus on disease and cultural assumptions to mistrust regarding anonymity. Ethnic minorities seem particularly affected by such barriers. To increase survey participation, questions could be sensitized to reflect multicultural traditions, and the impact of the sender considered.

  1. Process synthesis, design and analysis using a process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario R.; Gani, Rafiqul

    2015-01-01

    This paper describes the development and application of a process-group contribution method to model, simulate and synthesize chemical processes. Process flowsheets are generated in the same way as atoms or groups of atoms are combined to form molecules in computer aided molecular design (CAMD) techniques. The fundamental pillars of this framework are the definition and use of functional process-groups (building blocks) representing a wide range of process operations, flowsheet connectivity rules to join the process-groups to generate all the feasible flowsheet alternatives, and flowsheet property models like energy consumption, atom efficiency and environmental impact to evaluate the performance of the generated alternatives. In this way, a list of feasible flowsheets is quickly generated, screened and selected for further analysis. Since the flowsheet is synthesized and the operations ...

  2. A Study on the Representative Sampling Survey for Radionuclide Analysis of RI Waste

    Energy Technology Data Exchange (ETDEWEB)

    Jee, K. Y. [KAERI, Daejeon (Korea, Republic of); Kim, Juyoul; Jung, Gunhyo [FNC Tech. Co., Daejeon (Korea, Republic of)

    2007-07-15

    We developed a quantitative method for obtaining a representative sample during a sampling survey of RI waste. Considering the source, process, and type of RI waste, the method computes the number of samples, confidence interval, variance, and coefficient of variation. We also systematized the method of sampling survey logically and quantitatively. The result of this study can be applied to sampling surveys of low- and intermediate-level waste generated from nuclear power plants during the transfer process to a disposal facility.
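
    The report's exact formulas are not reproduced in this record, but the kind of calculation it describes can be sketched with a standard survey-sampling relation: the number of samples needed to estimate the mean within a relative error E at a given confidence, based on the coefficient of variation of the waste stream. The CV, relative error and lot size below are illustrative assumptions.

```python
# Sketch: number of samples for a target relative error, from the coefficient of
# variation (CV) of the waste stream, with a finite-population correction.
import math
from scipy.stats import norm

cv = 0.35          # assumed coefficient of variation of the nuclide concentration
E = 0.10           # target relative error of the mean (10 %)
confidence = 0.95
N = 400            # number of drums/packages in the lot (finite population)

z = norm.ppf(0.5 + confidence / 2.0)      # two-sided critical value
n0 = (z * cv / E) ** 2                    # infinite-population sample size
n = math.ceil(n0 / (1.0 + n0 / N))        # finite-population correction

print(f"z = {z:.2f}, n0 = {n0:.1f}, required samples n = {n}")
```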

  3. Introductory study of super survey (next generation underground exploration technology); Super survey (jisedai chika tansa gijutsu) no sendo kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    An investigational study was conducted on R and D trends in innovative technology aimed at high-accuracy, high-efficiency next-generation underground exploration technology (super survey technology). Focusing on seismic and electromagnetic surveys, the study examined technical characteristics, utilization status and needs at sites, R and D trends, etc. In current R and D, development is proceeding on the time-domain method in electromagnetic surveying, and on efficient methods for processing and analysing large quantities of data and on display methods using the reflection method in seismic surveying. Noteworthy new technologies cited include the SQUID magnetometer, underground analysis using magnetic deviation data, electromagnetic migration, ACROSS, the rotating seismometer, the laser Doppler vibrator, etc. Concerning the course of next-generation underground survey technology, a system of integrated underground exploration theory is considered which is based on a combination of electromagnetic survey and seismic exploration. In the study, a plan is worked out for research and development of a technology for analyzing the different data obtained. 49 figs., 13 tabs.

  4. Mobile phones are a viable option for surveying young Australian women: a comparison of two telephone survey methods

    Directory of Open Access Journals (Sweden)

    Liu Bette

    2011-11-01

    Full Text Available Abstract Background Households with fixed-line telephones have decreased while mobile (cell) phone ownership has increased. We therefore sought to examine the feasibility of recruiting young women for a national health survey through random digit dialling mobile phones. Methods Two samples of women aged 18 to 39 years were surveyed by random digit dialling fixed and mobile numbers. We compared participation rates and responses to a questionnaire between women surveyed by each contact method. Results After dialling 5,390 fixed-lines and 3,697 mobile numbers, 140 and 128 women were recruited respectively. Among women contacted and found to be eligible, participation rates were 74% for fixed-lines and 88% for mobiles. Taking into account calls to numbers where eligibility was unknown (e.g. unanswered calls), the estimated response rates were 54% and 45% respectively. Of women contacted by fixed-line, 97% reported having a mobile while 61% of those contacted by mobile reported having a fixed-line at home. After adjusting for age, there were no significant differences between mobile-only and fixed-line responders with respect to education, residence, and various health behaviours; however compared to those with fixed-lines, mobile-only women were more likely to identify as Indigenous (OR 4.99, 95%CI 1.52-16.34) and less likely to live at home with their parents (OR 0.09, 95%CI 0.03-0.29). Conclusions Random digit dialling mobile phones to conduct a health survey in young Australian women is feasible, gives a comparable response rate and a more representative sample than dialling fixed-lines only. Telephone surveys of young women should include mobile dialling.

  5. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    OpenAIRE

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective: To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods: We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European...

  6. Apparatus and method for radiation processing of materials

    International Nuclear Information System (INIS)

    Neuberg, W.B.; Luniewski, R.

    1983-01-01

    A method and apparatus for radiation degradation processing of polytetrafluoroethylene makes use of simultaneous irradiation, agitation and cooling. The apparatus is designed to make efficient use of radiation in the processing. (author)

  7. Methods of surveying and monitoring marine radioactivity. Report of an ad hoc panel of experts

    International Nuclear Information System (INIS)

    1965-01-01

    An effective control of the radioactive pollution of the sea depends partly on the availability of adequate technical methods for surveying and monitoring the sea and marine products with regard to the presence of radioactive substances. The purpose of this manual is to offer such methods.

  8. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and image processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms, and methods as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  9. Geostatistical methods for rock mass quality prediction using borehole and geophysical survey data

    Science.gov (United States)

    Chen, J.; Rubin, Y.; Sege, J. E.; Li, X.; Hehua, Z.

    2015-12-01

    For long, deep tunnels, the number of geotechnical borehole investigations during the preconstruction stage is generally limited. Yet tunnels are often constructed in geological structures with complex geometries, and in which the rock mass is fragmented from past structural deformations. Tunnel Geology Prediction (TGP) is a geophysical technique widely used during tunnel construction in China to ensure safety during construction and to prevent geological disasters. In this paper, geostatistical techniques were applied in order to integrate seismic velocity from TGP and borehole information into spatial predictions of RMR (Rock Mass Rating) in unexcavated areas. This approach is intended to apply conditional probability methods to transform seismic velocities to directly observed RMR values. The initial spatial distribution of RMR, inferred from the boreholes, was updated by including geophysical survey data in a co-kriging approach. The method applied to a real tunnel project shows significant improvements in rock mass quality predictions after including geophysical survey data, leading to better decision-making for construction safety design.
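
    A full co-kriging implementation is beyond the scope of this record, but the idea of updating borehole-based RMR with seismic velocity can be illustrated with regression kriging, a simpler stand-in: regress RMR on velocity at the boreholes, then krige the residuals along the tunnel alignment. The variogram model, data values and chainage coordinates below are invented for the sketch.

```python
# Regression kriging along a tunnel axis: RMR ~ trend(Vp) + kriged residual.
# Data, variogram parameters and coordinates are hypothetical.
import numpy as np

bore_x = np.array([0.0, 120.0, 260.0, 410.0, 555.0])     # borehole chainage (m)
bore_rmr = np.array([62.0, 55.0, 40.0, 48.0, 66.0])      # observed RMR
bore_vp = np.array([4.2, 3.8, 3.0, 3.4, 4.5])            # seismic velocity (km/s)

# 1) Linear trend of RMR on seismic velocity (least squares).
slope, intercept = np.polyfit(bore_vp, bore_rmr, 1)
resid = bore_rmr - (slope * bore_vp + intercept)

# 2) Ordinary kriging of the residuals with an exponential covariance model.
sill, corr_range = 30.0, 150.0
cov = lambda h: sill * np.exp(-np.abs(h) / corr_range)

n = len(bore_x)
K = np.ones((n + 1, n + 1)); K[-1, -1] = 0.0              # unbiasedness constraint
K[:n, :n] = cov(bore_x[:, None] - bore_x[None, :])

def predict(x0, vp0):
    k = np.ones(n + 1)
    k[:n] = cov(x0 - bore_x)
    w = np.linalg.solve(K, k)[:n]            # kriging weights (Lagrange term dropped)
    return slope * vp0 + intercept + w @ resid

# Unexcavated chainage where TGP gives a velocity estimate but no borehole exists.
print("predicted RMR at 330 m:", round(predict(330.0, 3.1), 1))
```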

  10. Decision maker perceptions of resource allocation processes in Canadian health care organizations: a national survey.

    Science.gov (United States)

    Smith, Neale; Mitton, Craig; Bryan, Stirling; Davidson, Alan; Urquhart, Bonnie; Gibson, Jennifer L; Peacock, Stuart; Donaldson, Cam

    2013-07-02

    Resource allocation is a key challenge for healthcare decision makers. While several case studies of organizational practice exist, there have been few large-scale cross-organization comparisons. Between January and April 2011, we conducted an on-line survey of senior decision makers within regional health authorities (and closely equivalent organizations) across all Canadian provinces and territories. We received returns from 92 individual managers, from 60 out of 89 organizations in total. The survey inquired about structures, process features, and behaviours related to organization-wide resource allocation decisions. We focus here on three main aspects: type of process, perceived fairness, and overall rating. About one-half of respondents indicated that their organization used a formal process for resource allocation, while the others reported that political or historical factors were predominant. Seventy percent (70%) of respondents self-reported that their resource allocation process was fair and just over one-half assessed their process as 'good' or 'very good'. This paper explores these findings in greater detail and assesses them in context of the larger literature. Data from this large-scale cross-jurisdictional survey helps to illustrate common challenges and areas of positive performance among Canada's health system leadership teams.

  11. Assessment of distribution and abundance estimates for Mariana swiftlets (Aerodramus bartschi) via examination of survey methods

    Science.gov (United States)

    Johnson, Nathan C.; Haig, Susan M.; Mosher, Stephen M.

    2018-01-01

    We described past and present distribution and abundance data to evaluate the status of the endangered Mariana Swiftlet (Aerodramus bartschi), a little-known echolocating cave swiftlet that currently inhabits 3 of 5 formerly occupied islands in the Mariana archipelago. We then evaluated the survey methods used to attain these estimates via fieldwork carried out on an introduced population of Mariana Swiftlets on the island of O'ahu, Hawaiian Islands, to derive better methods for future surveys. We estimate the range-wide population of Mariana Swiftlets to be 5,704 individuals occurring in 15 caves on Saipan, Aguiguan, and Guam in the Marianas; and 142 individuals occupying one tunnel on O'ahu. We further confirm that swiftlets have been extirpated from Rota and Tinian and have declined on Aguiguan. Swiftlets have remained relatively stable on Guam and Saipan in recent years. Our assessment of survey methods used for Mariana Swiftlets suggests overestimates depending on the technique used. We suggest the use of night vision technology and other changes to more accurately reflect their distribution, abundance, and status.

  12. A method to evaluate process performance by integrating time and resources

    Science.gov (United States)

    Wang, Yu; Wei, Qingjie; Jin, Shuang

    2017-06-01

    The purpose of process mining is to improve an enterprise's existing processes, so measuring process performance is particularly important. However, current research on performance evaluation methods is still insufficient: the main evaluation methods use only time or only resources, and such basic statistics cannot evaluate process performance very well. In this paper, a method for evaluating process performance based on both the time dimension and the resource dimension is proposed. The method can be used to measure the utilization and redundancy of resources in the process. The paper introduces the design principle and formula of the evaluation algorithm, followed by the design and implementation of the evaluation method. Finally, the method is used to analyse the event log from a telephone maintenance process and to propose an optimization plan.
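
    The record does not give the paper's formulas, so the sketch below illustrates the general idea with one plausible pair of measures computed from an event log: per-resource utilisation (busy time over the log's time span) and a simple redundancy measure (share of activity instances that repeat within a case). The log, column names and both measures are assumptions for illustration only.

```python
# Sketch: time- and resource-based measures from a small event log (pandas).
# The log, column names and the two measures are illustrative assumptions.
import pandas as pd

log = pd.DataFrame({
    "case":     ["c1", "c1", "c1", "c2", "c2", "c2"],
    "activity": ["register", "repair", "repair", "register", "repair", "close"],
    "resource": ["r1", "r2", "r2", "r1", "r2", "r1"],
    "start": pd.to_datetime(["09:00", "09:30", "11:00", "09:15", "10:00", "12:00"]),
    "end":   pd.to_datetime(["09:20", "10:30", "11:45", "09:40", "11:30", "12:10"]),
})
log["busy"] = log["end"] - log["start"]

span = log["end"].max() - log["start"].min()
utilisation = log.groupby("resource")["busy"].sum() / span   # busy time / observed span

# "Redundancy" here: repeated executions of the same activity within one case.
repeats = (log.groupby(["case", "activity"]).size() - 1).clip(lower=0).sum()
redundancy = repeats / len(log)

print(utilisation.rename("utilisation"))
print("redundancy (share of repeated activity instances):", round(redundancy, 2))
```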

  13. A Comparison of Web-Based and Paper-Based Survey Methods: Testing Assumptions of Survey Mode and Response Cost

    Science.gov (United States)

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-01-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research to name a few. The proliferation of these online surveys raises the question, how do response rates compare with traditional surveys and at what cost? This research explored response rates and costs for Web-based surveys, paper surveys, and…

  14. Methods of digital image processing

    International Nuclear Information System (INIS)

    Doeler, W.

    1985-01-01

    Increasing use of computerized methods for diagnostical imaging of radiological problems will open up a wide field of applications for digital image processing. The requirements set by routine diagnostics in medical radiology point to picture data storage and documentation and communication as the main points of interest for application of digital image processing. As to the purely radiological problems, the value of digital image processing is to be sought in the improved interpretability of the image information in those cases where the expert's experience and image interpretation by human visual capacities do not suffice. There are many other domains of imaging in medical physics where digital image processing and evaluation is very useful. The paper reviews the various methods available for a variety of problem solutions, and explains the hardware available for the tasks discussed. (orig.) [de

  15. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of the total phosphorus by a Kjeldahl digestion method and an automated colorimetric finish that includes dialysis

    Science.gov (United States)

    Patton, Charles J.; Truitt, Earl P.

    1992-01-01

    A method to determine total phosphorus (TP) in the same digests prepared for total Kjeldahl nitrogen (TKN) determinations is described. The batch, high-temperature (block digester), Hg(II)-catalyzed digestion step is similar to U.S. Geological Survey methods I-2552-85/I-4552-85 and U.S. Environmental Protection Agency method 365.4 except that sample and reagent volumes are halved. Prepared digests are desolvated at 220 degrees Celsius and digested at 370 degrees Celsius in separate block digesters set at these temperatures, rather than in a single, temperature-programmed block digester. This approach is used in the method described here, which permits 40 calibrants, reference waters, and samples to be digested and resolvated in about an hour. Orthophosphate ions originally present in samples, along with those released during the digestion step, are determined colorimetrically at a rate of 90 tests per hour by an automated version of the phosphoantimonylmolybdenum blue procedure. About 100 microliters of digest are required per determination. The upper concentration limit is 2 milligrams per liter (mg/L) with a method detection limit of 0.01 mg/L. Repeatability for a sample containing approximately 1.6 mg/L of TP in a high suspended-solids matrix is 0.7 percent. Between-day precision for the same sample is 5.0 percent. A dialyzer in the air-segmented continuous flow analyzer provides on-line digest cleanup, eliminating particulates that otherwise would interfere in the colorimetric finish. A single-channel analyzer can process the resolvated digests from two pairs of block digesters each hour. Paired t-test analysis of TP concentrations for approximately 1,600 samples determined by the new method (U.S. Geological Survey methods I-2610-91 and I-4610-91) and the old method (U.S. Geological Survey methods I-2600-85 and I-4600-85) revealed positive bias in the former of 0.02 to 0.04 mg/L for surface-water samples in agreement with previous studies. Concentrations of total
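
    The bias assessment described above rests on a paired comparison of the two methods on the same samples. A minimal sketch of that kind of test is shown below; the concentration values are synthetic stand-ins, not the roughly 1,600 survey samples analysed in the report.

```python
# Paired t-test comparing total phosphorus results from two methods on the same samples.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(7)
old_method = rng.uniform(0.05, 1.5, size=30)                    # mg/L, synthetic
new_method = old_method + 0.03 + rng.normal(0, 0.02, size=30)   # small positive bias

t_stat, p_value = ttest_rel(new_method, old_method)
mean_diff = np.mean(new_method - old_method)
print(f"mean difference = {mean_diff:.3f} mg/L, t = {t_stat:.2f}, p = {p_value:.3g}")
```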

  16. Digital processing methods for bronchograms

    International Nuclear Information System (INIS)

    Mamilyaev, R.M.; Popova, N.P.; Matsulevich, T.V.

    1989-01-01

    The technique of digital processing of bronchograms, with the aim of separating morphological details of the bronchi and increasing the clarity of the outlines of contrasted bronchi, is described. A block diagram of digital processing on an automated image processing system is given. It is shown that digital processing of bronchograms permits clear outlining of the bronchial walls and makes measurement of bronchial diameters easier and more reliable. Considerable advantages of digital image processing over optical methods are shown.

  17. A Mixed-Methods Research Framework for Healthcare Process Improvement.

    Science.gov (United States)

    Bastian, Nathaniel D; Munoz, David; Ventura, Marta

    2016-01-01

    The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.

  18. Occupational injury proneness in Indian women: A survey in fish processing industries

    Directory of Open Access Journals (Sweden)

    Saha Asim

    2006-09-01

    Full Text Available Abstract A cross-sectional survey was initiated to understand the frequency of occupational injury occurrence and the associated factors in the fish processing industries of western India, involving 185 randomly selected women subjects. All subjects were interviewed with the help of an interviewer-administered questionnaire to collect information regarding their personal, occupational and work-related morbidity details (including details of occupational injuries). Logistic regression was used to analyze the data in order to obtain the contribution of individual factors to occupational injuries. The study showed that work-related morbidity such as blanching of the hands (OR 2.30, 95%CI 1.12–4.74) and nature of job such as grading (OR 3.99, 95%CI 1.41–11.27) and packing (OR 5.68, 95%CI 1.65–19.57) had a significant impact on injury causation. The study concludes that, apart from the nature of the job of fish processing workers, occupational hazards prevailing in the work environment contribute significantly to the occurrence of work-related injuries, and that prevention of such occupational hazards may also help protect workers from occupational injuries.
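
    Odds ratios of the kind reported above come from a logistic regression of injury occurrence on work-related factors. The sketch below shows the mechanics on a small synthetic data set using statsmodels; the variables and coefficients are placeholders, not the survey's data.

```python
# Logistic regression of occupational injury on job factors, reported as odds ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 185
df = pd.DataFrame({
    "grading":   rng.integers(0, 2, n),     # 1 = works in grading section
    "packing":   rng.integers(0, 2, n),     # 1 = works in packing section
    "blanching": rng.integers(0, 2, n),     # 1 = reports blanching of hands
})
logit_p = -1.5 + 1.0 * df["grading"] + 1.4 * df["packing"] + 0.8 * df["blanching"]
df["injury"] = rng.random(n) < 1 / (1 + np.exp(-logit_p))

X = sm.add_constant(df[["grading", "packing", "blanching"]].astype(float))
fit = sm.Logit(df["injury"].astype(float), X).fit(disp=0)

odds_ratios = np.exp(fit.params)
ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```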

  19. A method of encountering the ratio of adjacent sides and its applied study in nuclear engineering survey

    International Nuclear Information System (INIS)

    Wu Jingqin

    1996-01-01

    The cross-side or range-net survey method computes the average error of the measured side lengths. As the side length increases, the observed variance increases greatly. Photo-electrical distance survey equipment generally has a high internal precision, but it is affected by a typical weather error that decreases the external precision; this weather error, similar to a systematic error, greatly degrades the observed side precision. To solve this problem, a theoretical study and field tests were carried out on the correlation of ratios among short sides measured by photo-electrical survey and on the degree of stability of the side ratios, and a new method of ratio encountering of adjacent sides is put forward. Because the weights of the ratio variance σ_γ² = 2η²γ² and the angular variance σ_β² = 2J²ρ² match each other, the systematic error can be eliminated completely, and survey point co-ordinates of high precision can be obtained. The method is easy to operate, as it requires neither a multi-photo-band survey nor operation at the optimal observation time, and it is especially suitable for nuclear engineering survey applications. (3 tabs.)

  20. Traditional methods v. new technologies – dilemmas for dietary assessment in large-scale nutrition surveys and studies

    DEFF Research Database (Denmark)

    Amoutzopoulos, B.; Steer, T.; Roberts, C.

    2018-01-01

    The aim of the present paper is to summarise current and future applications of dietary assessment technologies in nutrition surveys in developed countries. It includes the discussion of key points and highlights of subsequent developments from a panel discussion to address strengths and weaknesses of traditional dietary assessment methods (food records, FFQ, 24 h recalls, diet history with interviewer-assisted data collection) v. new technology-based dietary assessment methods (web-based and mobile device applications). The panel discussion 'Traditional methods v. new technologies: dilemmas for dietary assessment in population surveys' was held at the 9th International Conference on Diet and Activity Methods (ICDAM9), Brisbane, September 2015. Despite respondent and researcher burden, traditional methods have been most commonly used in nutrition surveys. However, dietary assessment technologies offer potential advantages including faster data processing and better data quality.

  1. BUSINESS PROCESS REENGINEERING AS THE METHOD OF PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2013-09-01

    Full Text Available The article is devoted to the analysis of the process management approach. The main understanding of the process management approach is examined, and definitions of process and process management are given. The methods of business process improvement are also analyzed, among them fast-analysis solution technology (FAST), benchmarking, reprojecting and reengineering. The main results of using business process improvement are described in terms of reduced cycle time, costs and errors. The tasks of business process reengineering are noted, the main stages of business process reengineering are outlined, and the main efficiency results of business process reengineering and its success factors are determined.

  2. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs

    International Nuclear Information System (INIS)

    Brooks, Robin; Thorpe, Richard; Wilson, John

    2004-01-01

    A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to Manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral with the method is a new format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field Trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method for improving the quality of both process operations and product by providing Process Alarms and Alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for Alarm Rationalisation that quickly delivers large sets of Consistent Alarm Limits, and the extension to full Alert Management with highlights from the Field Trials to indicate the overall effectiveness of the method in practice

  3. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, Robin [Curvaceous Software Limited, P.O. Box 43, Gerrards Cross, Bucks SL98UX (United Kingdom)]. E-mail: enquiries@curvaceous.com; Thorpe, Richard [Curvaceous Software Limited, P.O. Box 43, Gerrards Cross, Bucks SL98UX (United Kingdom); Wilson, John [Curvaceous Software Limited, P.O. Box 43, Gerrards Cross, Bucks SL98UX (United Kingdom)

    2004-11-11

    A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to Manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral with the method is a new format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field Trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method for improving the quality of both process operations and product by providing Process Alarms and Alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for Alarm Rationalisation that quickly delivers large sets of Consistent Alarm Limits, and the extension to full Alert Management with highlights from the Field Trials to indicate the overall effectiveness of the method in practice.

  4. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs.

    Science.gov (United States)

    Brooks, Robin; Thorpe, Richard; Wilson, John

    2004-11-11

    A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to Manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral with the method is a new format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field Trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method for improving the quality of both process operations and product by providing Process Alarms and Alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for Alarm Rationalisation that quickly delivers large sets of Consistent Alarm Limits, and the extension to full Alert Management with highlights from the Field Trials to indicate the overall effectiveness of the method in practice.

  5. Social Capital: Its Constructs and Survey Development

    Science.gov (United States)

    Enfield, Richard P.; Nathaniel, Keith C.

    2013-01-01

    This article reports on experiences and methods of adapting a valid adult social capital assessment to youth audiences in order to measure social capital and sense of place. The authors outline the process of adapting, revising, prepiloting, piloting, and administering a youth survey exploring young people's sense of community, involvement in the…

  6. Effects of phone versus mail survey methods on the measurement of health-related quality of life and emotional and behavioural problems in adolescents.

    Science.gov (United States)

    Erhart, Michael; Wetzel, Ralf M; Krügel, André; Ravens-Sieberer, Ulrike

    2009-12-30

    Telephone interviews have become established as an alternative to traditional mail surveys for collecting epidemiological data in public health research. However, the use of telephone and mail surveys raises the question of to what extent the results of different data collection methods deviate from one another. We therefore set out to study possible differences in using telephone and mail survey methods to measure health-related quality of life and emotional and behavioural problems in children and adolescents. A total of 1700 German children aged 8-18 years and their parents were interviewed randomly either by telephone or by mail. Health-related Quality of Life (HRQoL) and mental health problems (MHP) were assessed using the KINDL-R Quality of Life instrument and the Strengths and Difficulties Questionnaire (SDQ) children's self-report and parent proxy report versions. Mean Differences ("d" effect size) and differences in Cronbach alpha were examined across modes of administration. Pearson correlation between children's and parents' scores was calculated within a multi-trait-multi-method (MTMM) analysis and compared across survey modes using Fisher-Z transformation. Telephone and mail survey methods resulted in similar completion rates and similar socio-demographic and socio-economic makeups of the samples. Telephone methods resulted in more positive self- and parent proxy reports of children's HRQoL (SMD ≤ 0.27) and MHP (SMD ≤ 0.32) on many scales. For the phone administered KINDL, lower Cronbach alpha values (self/proxy Total: 0.79/0.84) were observed (mail survey self/proxy Total: 0.84/0.87). KINDL MTMM results were weaker for the phone surveys: mono-trait-multi-method mean r = 0.31 (mail: r = 0.45); multi-trait-mono-method mean (self/parents) r = 0.29/0.36 (mail: r = 0.34/0.40); multi-trait-multi-method mean r = 0.14 (mail: r = 0.21). Weaker MTMM results were also observed for the phone administered SDQ: mono-trait-multi-method mean r = 0.32 (mail: r = 0.40); multi-trait-mono-method mean (self/parents) r = 0.24/0.30 (mail: r = 0.20/0.32); multi-trait-multi-method mean r = 0.14 (mail = 0.14). The SDQ
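
    For readers unfamiliar with the Fisher-Z comparison of correlations used in the MTMM analysis above, the following Python sketch shows the standard calculation applied to two of the reported mean correlations; the per-mode sample sizes are assumptions, not figures from the study.

      # Illustrative sketch: compare a correlation from the phone sample with the
      # corresponding correlation from the mail sample via Fisher z transformation.
      # The sample sizes below are assumed values, not taken from the paper.
      import math

      def fisher_z(r):
          return 0.5 * math.log((1 + r) / (1 - r))

      def compare_correlations(r1, n1, r2, n2):
          """Two-sided z test for H0: rho1 == rho2 (independent samples)."""
          z = (fisher_z(r1) - fisher_z(r2)) / math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
          p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
          return z, p

      # reported mono-trait-multi-method means: phone 0.31 vs. mail 0.45
      print(compare_correlations(0.31, 850, 0.45, 850))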

  7. Apparatus and method X-ray image processing

    International Nuclear Information System (INIS)

    1984-01-01

    The invention relates to a method for X-ray image processing. The radiation passed through the object is transformed into an electric image signal from which the logarithmic value is determined and displayed by a display device. Its main objective is to provide a method and apparatus that renders X-ray images or X-ray subtraction images with strong reduction of stray radiation. (Auth.)

  8. Bridging the Gap between Social Animal and Unsocial Machine: A Survey of Social Signal Processing

    NARCIS (Netherlands)

    Vinciarelli, Alessandro; Pantic, Maja; Heylen, Dirk K.J.; Pelachaud, Catherine; Poggi, Isabella; D’Ericco, Francesca; Schröder, Marc

    Social Signal Processing is the research domain aimed at bridging the social intelligence gap between humans and machines. This paper is the first survey of the domain that jointly considers its three major aspects, namely, modeling, analysis, and synthesis of social behavior. Modeling investigates

  9. Fiscal 1999 survey report. Survey and research concerning development of next-generation chemical process technologies; 1999 nendo jisedai kagaku process gijutsu kaihatsu ni kansuru chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    To further enhance resource/energy conservation and environmental impact reduction, it is necessary to develop innovative chemical reaction process technologies. It is for this reason that 'development of next-generation chemical reaction process technologies' is being carried out under the New Sunshine Program. The survey and research, for the fulfilment of the above goal, aim to select important technologies and put in an easy-to-study order the problems contained in associated technologies for picking out tasks for future studies for the purpose of suggesting some subjects to be taken up for future development. In addition, studies are made about how a comprehensive technology assessment system should be. In this fiscal year, propositions are compiled for research and development projects on five subjects. Studies of subjects other than these five will also continue to eventually build concrete propositions on them. The said five subjects involve 1) the development and application of nonaqueous biotechnologies, 2) biotechnology-aided polymeric material creation processes, 3) construction of high-efficiency energy conservation processes using innovative grain handling technologies in the high-temperature reaction field, 4) manufacture of high-performance polymeric materials for batteries and development of battery fabrication processes, and 5) the development of an energy conservation process maximally utilizing environmentally-friendly polyolefin. (NEDO)

  10. Fiscal 1999 survey report. Survey and research concerning development of next-generation chemical process technologies; 1999 nendo jisedai kagaku process gijutsu kaihatsu ni kansuru chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    To further enhance resource/energy conservation and environmental impact reduction, it is necessary to develop innovative chemical reaction process technologies. It is for this reason that 'development of next-generation chemical reaction process technologies' is being carried out under the New Sunshine Program. The survey and research, for the fulfilment of the above goal, aim to select important technologies and put in an easy-to-study order the problems contained in associated technologies for picking out tasks for future studies for the purpose of suggesting some subjects to be taken up for future development. In addition, studies are made about how a comprehensive technology assessment system should be. In this fiscal year, propositions are compiled for research and development projects on five subjects. Studies of subjects other than these five will also continue to eventually build concrete propositions on them. The said five subjects involve 1) the development and application of nonaqueous biotechnologies, 2) biotechnology-aided polymeric material creation processes, 3) construction of high-efficiency energy conservation processes using innovative grain handling technologies in the high-temperature reaction field, 4) manufacture of high-performance polymeric materials for batteries and development of battery fabrication processes, and 5) the development of an energy conservation process maximally utilizing environmentally-friendly polyolefin. (NEDO)

  11. Survey of Processing Methods for High Strength High Conductivity Wires for High Field Magnet Applications

    Energy Technology Data Exchange (ETDEWEB)

    Han, K.; Embury, J.D.

    1998-10-01

    This paper will deal with the basic concepts of attaining a combination of high strength and high conductivity in pure materials, in-situ composites and macrocomposites. It will survey current attainments, and outline where some future developments may lie in developing wire products that are close to the theoretical strength for future magnet applications.

  12. Survey of Processing Methods for High Strength High Conductivity Wires for High Field Magnet Applications

    International Nuclear Information System (INIS)

    Han, K.; Embury, J.D.

    1998-01-01

    This paper will deal with the basic concepts of attaining a combination of high strength and high conductivity in pure materials, in-situ composites and macrocomposites. It will survey current attainments, and outline where some future developments may lie in developing wire products that are close to the theoretical strength for future magnet applications.

  13. Advances in Packaging Methods, Processes and Systems

    Directory of Open Access Journals (Sweden)

    Nitaigour Premchand Mahalik

    2014-10-01

    Full Text Available The food processing and packaging industry is becoming a multi-trillion dollar global business. The reason is that the recent increase in incomes in traditionally less economically developed countries has led to a rise in standards of living that includes a significantly higher consumption of packaged foods. As a result, food safety guidelines have been more stringent than ever. At the same time, the number of research and educational institutions—that is, the number of potential researchers and stakeholders—has increased in the recent past. This paper reviews recent developments in food processing and packaging (FPP, keeping in view the aforementioned advancements and bearing in mind that FPP is an interdisciplinary area in that materials, safety, systems, regulation, and supply chains play vital roles. In particular, the review covers processing and packaging principles, standards, interfaces, techniques, methods, and state-of-the-art technologies that are currently in use or in development. Recent advances such as smart packaging, non-destructive inspection methods, printing techniques, application of robotics and machineries, automation architecture, software systems and interfaces are reviewed.

  14. Study on method of dose estimation for the Dual-moderated neutron survey meter

    International Nuclear Information System (INIS)

    Zhou, Bo; Li, Taosheng; Xu, Yuhai; Gong, Cunkui; Yan, Qiang; Li, Lei

    2013-01-01

    In order to study neutron dose measurement in high-energy radiation fields, a Dual-moderated survey meter covering neutron spectra with mean energies from 1 keV to 300 MeV has been developed. Measurement results of some survey meters depend on the characteristics of the neutron spectra in different neutron radiation fields, so the responses to various neutron spectra should be studied in order to obtain more reasonable dose estimates. In this paper, the responses of the survey meter were calculated for different neutron spectra taken from IAEA Technical Reports Series No. 318 and other references. Finally, one dose estimation method was determined. The range of the reading per H*(10) for the estimated method is about 0.7–1.6 for the neutron mean energy range from 50 keV to 300 MeV. -- Highlights: • We studied a novel high-energy neutron survey meter. • Response characteristics of the survey meter were calculated by using a series of neutron spectra. • One significant advantage of the survey meter is that it can provide the mean energy of the radiation field. • Dose estimate deviation can be corrected. • The range of corrected reading per H*(10) is about 0.7–1.6 for the neutron fluence mean energy range from 0.05 MeV to 300 MeV.
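
    The reading-per-H*(10) figure quoted above comes from folding the instrument response and fluence-to-dose coefficients with a neutron spectrum. The Python sketch below illustrates that folding; the energy grid, response function and dose coefficients are placeholder numbers, not the instrument's real data.

      # Hedged sketch: spectrum-dependent "reading per H*(10)" and its use to
      # correct a survey-meter reading. All numeric arrays are assumed values.
      import numpy as np

      energies = np.array([0.05, 0.5, 5.0, 50.0, 300.0])   # MeV, illustrative grid
      r = np.array([0.8, 1.0, 1.1, 0.9, 0.7])              # counts per unit fluence (assumed)
      h = np.array([0.9, 1.0, 1.2, 1.0, 0.8])              # pSv*cm^2 per neutron (assumed)
      phi = np.array([0.1, 0.3, 0.4, 0.15, 0.05])          # normalised fluence spectrum (assumed)

      reading = np.sum(r * phi)                # instrument reading for this spectrum
      dose = np.sum(h * phi)                   # reference H*(10) for this spectrum
      reading_per_dose = reading / dose        # analogous to the 0.7-1.6 range in the paper
      corrected_dose = reading / reading_per_dose   # in practice the factor comes from the field's mean energy
      print(reading_per_dose, corrected_dose)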

  15. Survey on Industry Requirements and Drivers for the Development of a Process-Related Certification Scheme for Ecodesign Implementation and Management

    DEFF Research Database (Denmark)

    Pigosso, Daniela Cristina Antelmi; Jakobsen, Maja; McAloone, Tim C.

    2014-01-01

    Despite the existence of a large number of eco-labels and eco-standards for product declaration, there is still limited research for the development of process-related certification schemes dealing with ecodesign implementation and management. In order to identify companies’ drivers, barriers...... and expected benefits with regard to the development and application of process-related ecodesign certification schemes, a survey was carried out in this research. This paper presents and discusses the main results obtained in the survey, which comprised the participation of more than 100 professionals from more...... than 25 countries. The results will be employed for the development of an ecodesign process-related certification scheme based on the Ecodesign Maturity Model (EcoM2)....

  16. Processing method and device for radioactive liquid waste

    International Nuclear Information System (INIS)

    Matsuo, Toshiaki; Nishi, Takashi; Matsuda, Masami; Yukita, Atsushi.

    1997-01-01

    When only suspended particulate ingredients are contained as COD components in radioactive washing liquid wastes, the liquid wastes are heated in a first process, for example an adsorption step, to adsorb the suspended particulate ingredients onto activated carbon, and the suspended particulate ingredients are then separated and removed by filtration. When both suspended particulate ingredients and soluble organic ingredients are contained, the suspended particulate ingredients are separated and removed by the first process and the soluble organic ingredients are then removed by another process, or both the suspended particulate ingredients and the soluble organic ingredients are removed by the first process. In an existing method of adding activated carbon and then filtering at normal temperature, the suspended particulate ingredients cover the layer of activated carbon formed on the filter paper or fabric and sometimes cause clogging. However, according to the method of the present invention, since disturbance by the suspended particulate ingredients does not occur, the COD components can be separated and removed sufficiently without lowering the liquid waste processing speed. (T.M.)

  17. The JCMT Transient Survey: Data Reduction and Calibration Methods

    International Nuclear Information System (INIS)

    Mairs, Steve; Lane, James; Johnstone, Doug; Kirk, Helen; Lacaille, Kevin; Chapman, Scott; Bower, Geoffrey C.; Bell, Graham S.; Graves, Sarah

    2017-01-01

    Though there has been a significant amount of work investigating the early stages of low-mass star formation in recent years, the evolution of the mass assembly rate onto the central protostar remains largely unconstrained. Examining in depth the variation in this rate is critical to understanding the physics of star formation. Instabilities in the outer and inner circumstellar disk can lead to episodic outbursts. Observing these brightness variations at infrared or submillimeter wavelengths constrains the current accretion models. The JCMT Transient Survey is a three-year project dedicated to studying the continuum variability of deeply embedded protostars in eight nearby star-forming regions at a one-month cadence. We use the SCUBA-2 instrument to simultaneously observe these regions at wavelengths of 450 and 850 μm. In this paper, we present the data reduction techniques, image alignment procedures, and relative flux calibration methods for 850 μm data. We compare the properties and locations of bright, compact emission sources fitted with Gaussians over time. Doing so, we achieve a spatial alignment of better than 1″ between the repeated observations and an uncertainty of 2%–3% in the relative peak brightness of significant, localized emission. This combination of imaging performance is unprecedented in ground-based, single-dish submillimeter observations. Finally, we identify a few sources that show possible and confirmed brightness variations. These sources will be closely monitored and presented in further detail in additional studies throughout the duration of the survey.
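
    As a rough illustration of relative flux calibration in the spirit described above (not the survey's actual pipeline), the sketch below rescales one epoch onto a reference epoch using the median ratio of matched compact-source peak brightnesses; all numbers are invented.

      # Illustrative relative flux calibration: bright compact sources fitted in a
      # new epoch are matched to a reference epoch and the per-epoch calibration
      # factor is a robust mean of the peak-brightness ratios. Values are made up.
      import numpy as np

      ref_peaks = np.array([2.10, 1.45, 0.98, 3.20])   # peak brightnesses in the reference co-add (assumed)
      new_peaks = np.array([2.02, 1.41, 0.93, 3.05])   # same sources in a new epoch (assumed)

      ratios = new_peaks / ref_peaks
      factor = np.median(ratios)                 # robust relative calibration factor
      scatter = np.std(ratios) / factor          # fractional uncertainty (a few percent here)
      calibrated_epoch = new_peaks / factor      # rescale the epoch onto the reference flux scale
      print(factor, scatter, calibrated_epoch)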

  18. The JCMT Transient Survey: Data Reduction and Calibration Methods

    Energy Technology Data Exchange (ETDEWEB)

    Mairs, Steve; Lane, James [Department of Physics and Astronomy, University of Victoria, Victoria, BC, V8P 1A1 (Canada); Johnstone, Doug; Kirk, Helen [NRC Herzberg Astronomy and Astrophysics, 5071 West Saanich Road, Victoria, BC, V9E 2E7 (Canada); Lacaille, Kevin; Chapman, Scott [Department of Physics and Atmospheric Science, Dalhousie University, Halifax, NS, B3H 4R2 (Canada); Bower, Geoffrey C. [Academia Sinica Institute of Astronomy and Astrophysics, 645 N. A‘ohōkū Place, Hilo, HI 96720 (United States); Bell, Graham S.; Graves, Sarah, E-mail: smairs@uvic.ca [East Asian Observatory, 660 North A‘ohōkū Place, University Park, Hilo, Hawaii 96720 (United States); Collaboration: JCMT Transient Team

    2017-07-01

    Though there has been a significant amount of work investigating the early stages of low-mass star formation in recent years, the evolution of the mass assembly rate onto the central protostar remains largely unconstrained. Examining in depth the variation in this rate is critical to understanding the physics of star formation. Instabilities in the outer and inner circumstellar disk can lead to episodic outbursts. Observing these brightness variations at infrared or submillimeter wavelengths constrains the current accretion models. The JCMT Transient Survey is a three-year project dedicated to studying the continuum variability of deeply embedded protostars in eight nearby star-forming regions at a one-month cadence. We use the SCUBA-2 instrument to simultaneously observe these regions at wavelengths of 450 and 850 μm. In this paper, we present the data reduction techniques, image alignment procedures, and relative flux calibration methods for 850 μm data. We compare the properties and locations of bright, compact emission sources fitted with Gaussians over time. Doing so, we achieve a spatial alignment of better than 1″ between the repeated observations and an uncertainty of 2%–3% in the relative peak brightness of significant, localized emission. This combination of imaging performance is unprecedented in ground-based, single-dish submillimeter observations. Finally, we identify a few sources that show possible and confirmed brightness variations. These sources will be closely monitored and presented in further detail in additional studies throughout the duration of the survey.

  19. Report on the survey of geothermal development at Okushiri Island, Hokkaido. Geochemical survey (GC/MS and MS method); Hokkaido Okushiritou chinetsu kaihatsu chosa chikagaku chosa (GC/MS and MS ho) hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-09-01

    To elucidate chemical components of soil gas in the Okushiri Island area, soil gas was collected by the method using charcoal adsorbent, and analysis was made by the GC/MS method. Out of the 19 measuring points, 17 points were set up near the measuring points in the FY 1998 survey by the finger print method. At the same measuring points, analytical survey by the MS method was also conducted to sort the type of soil gas. As a result of the GC/MS analysis, xylene or ethyl benzene was detected at 12 measuring points of all 19 measuring points, and from the distribution, it was predicted that there were anomaly zones in the district along the road of the Okushiri Island line and the district southward from the 5.8K Pass. These results were in harmony with the results of the survey by the finger print method. As to the sorting of soil gas based on the results of the MS analysis, the results were different at 8 measuring points from those of the survey by the finger print method in FY 1998. It was considered that the cause was the accidental vaporization of a large quantity of acetaldehyde, and acetaldehyde was regarded as a noise gas component that does not reflect the geothermal structure. (NEDO)

  20. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an ''Image Processing System'' was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  1. Method and Achievement of Survey and Evaluation of Groundwater Resources of Guangzhou City

    Science.gov (United States)

    Lin, J.

    2017-12-01

    Based on the documents and achievements relevant to hydrogeological surveying and mapping at 1:100,000 scale, hydrogeological drilling, pumping tests and dynamic monitoring of groundwater levels in Guangzhou, considering the hydrogeological conditions of Guangzhou and combining advanced technologies such as remote sensing, the survey and evaluation of the volume of the groundwater resources of Guangzhou was carried out in plain and mountain areas separately. The recharge method was used to evaluate the volume of groundwater resources in plain areas; meanwhile, the output volume and the storage change volume of groundwater were calculated and the volume of groundwater resources was corrected by water balance analysis, while the discharge method was used to evaluate the volume of groundwater resources in mountain areas. The result of the survey and evaluation indicates that the volume of the natural groundwater resources in Guangzhou City is 1.83 billion m3, of which the groundwater replenishment quantity in plain areas is 510,045,000 m3, with a total output of 509,729,000 m3, an absolute balance difference of 316,000 m3 and a relative balance difference of 0.062%; the volume of groundwater resources in mountain areas is 1,358,208,000 m3, of which the river base flow is 965,054,000 m3; the repetitively counted volume of groundwater resources in both plain areas and mountain areas is 38,839,000 m3. This work used refined means for the first time to determine in full the volume of groundwater resources of Guangzhou City and the law of their distribution, so as to lay an important foundation for the protection and reasonable development and exploration of the groundwater resources of Guangzhou City.
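
    The balance figures quoted above can be reproduced with a few lines of arithmetic; the sketch below simply recomputes the absolute and relative balance differences and the total resource volume from the numbers in the abstract.

      # Quick check of the water-balance figures quoted in the abstract (units: m^3).
      recharge = 510_045_000            # plain-area groundwater replenishment
      output = 509_729_000              # plain-area groundwater output
      absolute_difference = recharge - output              # 316,000 m^3
      relative_difference = absolute_difference / recharge  # about 0.062 %
      print(absolute_difference, f"{relative_difference:.3%}")

      plain = 510_045_000
      mountain = 1_358_208_000
      repeated = 38_839_000
      total = plain + mountain - repeated                   # about 1.83 billion m^3
      print(total)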

  2. Teaching Intercultural Communication in a Basic Technical Writing Course: A Survey of Our Current Practices and Methods

    Science.gov (United States)

    Matveeva, Natalia

    2008-01-01

    This research article reports the results of an online survey distributed among technical writing instructors in 2006. The survey aimed to examine how we teach intercultural communication in basic technical writing courses: our current practices and methods. The article discusses three major challenges that instructors may face when teaching about…

  3. Stochastic approximation methods-Powerful tools for simulation and optimization: A survey of some recent work on multi-agent systems and cyber-physical systems

    International Nuclear Information System (INIS)

    Yin, George; Wang, Le Yi; Zhang, Hongwei

    2014-01-01

    Stochastic approximation methods have found extensive and diversified applications. Recent emergence of networked systems and cyber-physical systems has generated renewed interest in advancing stochastic approximation into a general framework to support algorithm development for information processing and decisions in such systems. This paper presents a survey on some recent developments in stochastic approximation methods and their applications. Using connected vehicles in platoon formation and coordination as a platform, we highlight some traditional and new methodologies of stochastic approximation algorithms and explain how they can be used to capture essential features in networked systems. Distinct features of networked systems with randomly switching topologies, dynamically evolving parameters, and unknown delays are presented, and control strategies are provided
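
    As a minimal, generic illustration of the stochastic approximation idea surveyed above (a textbook Robbins-Monro recursion, not the networked-systems algorithms discussed in the paper), the following sketch finds the root of a function observed only through noisy measurements; the target function, noise level and step-size schedule are assumptions.

      # Robbins-Monro style stochastic approximation sketch: drive theta towards
      # the root of an unknown regression function using noisy observations only.
      import numpy as np

      rng = np.random.default_rng(1)

      def noisy_measurement(theta):
          # assumed true regression function f(theta) = theta - 2, observed with noise
          return (theta - 2.0) + rng.normal(0.0, 0.5)

      theta = 0.0
      for n in range(1, 2001):
          a_n = 1.0 / n            # decreasing steps: sum a_n = inf, sum a_n^2 < inf
          theta = theta - a_n * noisy_measurement(theta)
      print(theta)                 # converges towards the root theta* = 2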

  4. Aeromagnetic survey in Eurajoensalmi, Olkiluoto 2008

    International Nuclear Information System (INIS)

    Levaeniemi, H.

    2008-08-01

    This report describes the survey operation, survey and processing methods and the deliverables of an aerogeophysical survey in Olkiluoto area in April 2008. The survey was conducted by Geological Survey of Finland (GTK). The survey aircraft was a twin-engine Twin Otter operated by Finnish Aviation Academy (SIO) and owned by Natural Environment Research Council / British Geological Survey (NERC / BGS), with whom GTK has established a joint venture called Joint Airborne-geoscience Capability (JAC). The survey was conducted in April 2008 during six days. The survey consists of six separate survey flights, one of which was a magnetic calibration flight. The survey was based in Pori airport. Survey line spacing was 50 meters and nominal survey altitude was 30 meters. Measurements were completed in April 2008, and data processing and reporting was done in June 2008. Two cesium magnetometers installed onboard the aircraft (at the left wingtip and in a nose cone) were measuring the magnetic total field intensity during the survey flights. An automatic compensation unit corrected the aircraft attitude errors in the magnetic data in real time. In addition to magnetic measurement, auxiliary parameters such as flight altitude and aircraft attitude were also recorded simultaneously. Reference ground base station was used for recording the temporal variations in the magnetic field and also reference data for post-positioning of coordinate information. In the post-processing phase, heading correction, base station correction and microlevelling procedures were applied to the magnetic data. The data was exported to numeric XYZ files and interpolated into grid data file. A noteworthy local detail present in the survey and in the processing was the massive power line. For safety reasons, flight altitude had to be increased and survey lines had to be cut short in the vicinity of the powerline. However, due to reasonable planning of the survey area boundaries, this caused no great
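
    One of the post-processing steps named above, the base station correction, amounts to subtracting the diurnal variation recorded at the ground reference station from the airborne total-field readings. The sketch below illustrates this with invented values; it does not reproduce the actual GTK processing chain.

      # Hedged sketch of a base-station (diurnal) correction. All values are invented.
      import numpy as np

      airborne_total = np.array([52010.4, 52012.1, 52009.8, 52015.3])      # nT along a line (assumed)
      base_at_same_times = np.array([51998.0, 51999.5, 51997.2, 52001.0])  # nT at the base station (assumed)
      base_reference_level = 51998.5                                       # quiet-time base value (assumed)

      diurnal = base_at_same_times - base_reference_level   # temporal variation of the field
      corrected = airborne_total - diurnal                  # airborne data with diurnal variation removed
      print(corrected)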

  5. A Survey on Banknote Recognition Methods by Various Sensors

    Science.gov (United States)

    Lee, Ji Woo; Hong, Hyung Gil; Kim, Ki Wan; Park, Kang Ryoung

    2017-01-01

    Despite a decrease in the use of currency due to the recent growth in the use of electronic financial transactions, real money transactions remain very important in the global market. While performing transactions with real money, touching and counting notes by hand, is still a common practice in daily life, various types of automated machines, such as ATMs and banknote counters, are essential for large-scale and safe transactions. This paper presents studies that have been conducted in four major areas of research (banknote recognition, counterfeit banknote detection, serial number recognition, and fitness classification) in the accurate banknote recognition field by various sensors in such automated machines, and describes the advantages and drawbacks of the methods presented in those studies. While to a limited extent some surveys have been presented in previous studies in the areas of banknote recognition or counterfeit banknote recognition, this paper is the first of its kind to review all four areas. Techniques used in each of the four areas recognize banknote information (denomination, serial number, authenticity, and physical condition) based on image or sensor data, and are actually applied to banknote processing machines across the world. This study also describes the technological challenges faced by such banknote recognition techniques and presents future directions of research to overcome them. PMID:28208733

  6. Effects of phone versus mail survey methods on the measurement of health-related quality of life and emotional and behavioural problems in adolescents

    Directory of Open Access Journals (Sweden)

    Ravens-Sieberer Ulrike

    2009-12-01

    Full Text Available Abstract Background Telephone interviews have become established as an alternative to traditional mail surveys for collecting epidemiological data in public health research. However, the use of telephone and mail surveys raises the question of to what extent the results of different data collection methods deviate from one another. We therefore set out to study possible differences in using telephone and mail survey methods to measure health-related quality of life and emotional and behavioural problems in children and adolescents. Methods A total of 1700 German children aged 8-18 years and their parents were interviewed randomly either by telephone or by mail. Health-related Quality of Life (HRQoL) and mental health problems (MHP) were assessed using the KINDL-R Quality of Life instrument and the Strengths and Difficulties Questionnaire (SDQ) children's self-report and parent proxy report versions. Mean Differences ("d" effect size) and differences in Cronbach alpha were examined across modes of administration. Pearson correlation between children's and parents' scores was calculated within a multi-trait-multi-method (MTMM) analysis and compared across survey modes using Fisher-Z transformation. Results Telephone and mail survey methods resulted in similar completion rates and similar socio-demographic and socio-economic makeups of the samples. Telephone methods resulted in more positive self- and parent proxy reports of children's HRQoL (SMD ≤ 0.27) and MHP (SMD ≤ 0.32) on many scales. For the phone administered KINDL, lower Cronbach alpha values (self/proxy Total: 0.79/0.84) were observed (mail survey self/proxy Total: 0.84/0.87). KINDL MTMM results were weaker for the phone surveys: mono-trait-multi-method mean r = 0.31 (mail: r = 0.45); multi-trait-mono-method mean (self/parents) r = 0.29/0.36 (mail: r = 0.34/0.40); multi-trait-multi-method mean r = 0.14 (mail: r = 0.21). Weaker MTMM results were also observed for the phone administered SDQ: mono-trait-multi-method

  7. Present status of processing method

    Energy Technology Data Exchange (ETDEWEB)

    Kosako, Kazuaki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)

    1998-11-01

    The present status of processing methods for a high-energy nuclear data file was examined. The NJOY94 code is the only one available for this processing. In Japan, present processing with NJOY94 is oriented toward the production of a traditional cross-section library, because it remains unclear which high-energy transport code would use a high-energy cross-section library. (author)

  8. Survey of electrochemical metal winning processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Vaaler, L.E.

    1979-03-01

    The subject program was undertaken to find electrometallurgical technology that could be developed into energy-saving commercial metal winning processes. Metals whose current production processes consume significant energy (excepting copper and aluminum) are magnesium, zinc, lead, chromium, manganese, sodium, and titanium. The technology of these metals, with the exception of titanium, was reviewed. Growth of titanium demand has been too small to justify the installation of an electrolytic process that has been developed. This fact and the uncertainty of estimates of future demand dissuaded us from reviewing titanium technology. Opportunities for developing energy-saving processes were found for magnesium, zinc, lead, and sodium. Costs for R and D and demonstration plants have been estimated. It appeared that electrolytic methods for chromium and manganese cannot compete energywise or economically with the pyrometallurgical methods of producing the ferroalloys, which are satisfactory for most uses of chromium and manganese.

  9. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1995-01-01

    This report is a compilation of the information submitted by AECL, CIAE, JAERI, ORNL and Siemens in response to a need identified at the 'Workshop on R and D Needs' at the IGORR-3 meeting. The survey compiled information on the national standards applied to the Safety Quality Assurance (SQA) programs undertaken by the participants. Information was assembled for the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods used to verify and validate the codes and libraries. Although the survey was not comprehensive, it provides a basis for exchanging information of common interest to the research reactor community

  10. Program software for the automated processing of gravity and magnetic survey data for the Mir computer

    Energy Technology Data Exchange (ETDEWEB)

    Lyubimov, G.A.

    1980-01-01

    A presentation is made of the content of program software for the automated processing of gravity and magnetic survey data for the small Mir-1 and Mir-2 computers as worked out on the Voronezh geophysical expedition.

  11. Device and method for shortening reactor process tubes

    Science.gov (United States)

    Frantz, Charles E.; Alexander, William K.; Lander, Walter E. B.

    1980-01-01

    This disclosure describes a device and method for in situ shortening of nuclear reactor zirconium alloy process tubes which have grown as a result of radiation exposure. An upsetting technique is utilized which involves inductively heating a short band of a process tube with simultaneous application of an axial load sufficient to cause upsetting with an attendant decrease in length of the process tube.

  12. Scientists' attitudes on science and values: Case studies and survey methods in philosophy of science.

    Science.gov (United States)

    Steel, Daniel; Gonnerman, Chad; O'Rourke, Michael

    2017-06-01

    This article examines the relevance of survey data of scientists' attitudes about science and values to case studies in philosophy of science. We describe two methodological challenges confronting such case studies: 1) small samples, and 2) potential for bias in selection, emphasis, and interpretation. Examples are given to illustrate that these challenges can arise for case studies in the science and values literature. We propose that these challenges can be mitigated through an approach in which case studies and survey methods are viewed as complementary, and use data from the Toolbox Dialogue Initiative to illustrate this claim. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. [Comparison study on sampling methods of Oncomelania hupensis snail survey in marshland schistosomiasis epidemic areas in China].

    Science.gov (United States)

    An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang

    2016-06-29

    To optimize and simplify the survey method for Oncomelania hupensis snails in marshland regions endemic for schistosomiasis, and to increase the precision, efficiency and economy of the snail survey. A 50 m × 50 m quadrat experimental field was selected in the Chayegang marshland near Henghu farm in the Poyang Lake region, and a whole-coverage method was adopted to survey the snails. Simple random sampling, systematic sampling and stratified random sampling methods were applied to calculate the minimum sample size, relative sampling error and absolute sampling error. The minimum sample sizes of the simple random sampling, systematic sampling and stratified random sampling methods were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.221 7, 0.302 4 and 0.047 8, respectively. Spatial stratified sampling with altitude as the stratum variable is an efficient approach of lower cost and higher precision for the snail survey.
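
    As an illustration of why stratified sampling gave the smallest error above, the following sketch compares simple random and stratified random sampling on a synthetic quadrat-count surface; the density values and the two altitude strata are assumptions, not the Poyang Lake data.

      # Compare simple random vs. stratified random sampling on synthetic counts.
      # The two "altitude bands" and their Poisson densities are assumed values.
      import numpy as np

      rng = np.random.default_rng(2)
      low_band = rng.poisson(5.0, size=1250)     # quadrats in the low-altitude stratum
      high_band = rng.poisson(1.0, size=1250)    # quadrats in the high-altitude stratum
      population = np.concatenate([low_band, high_band])
      true_mean = population.mean()

      def srs_estimate(n):
          return rng.choice(population, size=n, replace=False).mean()

      def stratified_estimate(n):
          n_half = n // 2
          est_low = rng.choice(low_band, size=n_half, replace=False).mean()
          est_high = rng.choice(high_band, size=n_half, replace=False).mean()
          return 0.5 * est_low + 0.5 * est_high   # equal stratum weights (equal stratum sizes)

      srs_err = np.std([srs_estimate(100) - true_mean for _ in range(500)])
      strat_err = np.std([stratified_estimate(100) - true_mean for _ in range(500)])
      print(srs_err, strat_err)   # the stratified error is typically smaller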

  14. Survey and evaluation of aging risk assessment methods and applications

    International Nuclear Information System (INIS)

    Sanzo, D.; Kvam, P.; Apostolakis, G.; Wu, J.; Milici, T.; Ghoniem, N.; Guarro, S.

    1994-11-01

    The US Nuclear Regulatory Commission initiated the nuclear power plant aging research program about 6 years ago to gather information about nuclear power plant aging. Since then, this program has collected a significant amount of information, largely qualitative, on plant aging and its potential effects on plant safety. However, this body of knowledge has not yet been integrated into formalisms that can be used effectively and systematically to assess plant risk resulting from aging, although models for assessing the effect of increasing failure rates on core damage frequency have been proposed. This report surveys the work on the aging of systems, structures, and components (SSCs) of nuclear power plants, as well as associated data bases. We take a critical look at the need to revise probabilistic risk assessments (PRAs) so that they will include the contribution to risk from plant aging, the adequacy of existing methods for evaluating this contribution, and the adequacy of the data that have been used in these evaluation methods. We identify a preliminary framework for integrating the aging of SSCs into the PRA and include the identification of necessary data for such an integration

  15. Three-dimensional image signals: processing methods

    Science.gov (United States)

    Schiopu, Paul; Manea, Adrian; Craciun, Anca-Ileana; Craciun, Alexandru

    2010-11-01

    Over the years extensive studies have been carried out to apply coherent optics methods in real-time processing, communications and image transmission. This is especially true when a large amount of information needs to be processed, e.g., in high-resolution imaging. The recent progress in data-processing networks and communication systems has considerably increased the capacity of information exchange. We describe the results of a literature survey of processing methods for three-dimensional image signals. All commercially available 3D technologies today are based on stereoscopic viewing. 3D technology was once the exclusive domain of skilled computer-graphics developers with high-end machines and software. Images captured with an advanced 3D digital camera can be displayed on the screen of a 3D digital viewer with or without special glasses. This requires considerable processing power and memory to create and render the complex mix of colors, textures, and virtual lighting and perspective necessary to make figures appear three-dimensional. Also, using a standard digital camera and a technique called phase-shift interferometry we can capture "digital holograms." These are holograms that can be stored on a computer and transmitted over conventional networks. We present some research methods for processing "digital holograms" for Internet transmission, together with results.
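
    Phase-shift interferometry, mentioned above as the route to "digital holograms", is often implemented with a four-step phase-shifting relation; the sketch below demonstrates that standard relation on synthetic frames and is not drawn from the paper itself.

      # Four-step phase-shifting sketch: four intensity frames recorded with
      # reference-beam phase shifts of 0, pi/2, pi and 3*pi/2 recover the object
      # phase via an arctangent. The synthetic frames below are assumptions.
      import numpy as np

      true_phase = np.linspace(0.0, 2.0 * np.pi, 256).reshape(16, 16)
      amplitude = 1.0
      frames = [1.0 + amplitude * np.cos(true_phase + shift)
                for shift in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]
      I1, I2, I3, I4 = frames

      recovered_phase = np.arctan2(I4 - I2, I1 - I3)   # wrapped to (-pi, pi]
      print(np.allclose(np.cos(recovered_phase), np.cos(true_phase)))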

  16. A survey of visual preprocessing and shape representation techniques

    Science.gov (United States)

    Olshausen, Bruno A.

    1988-01-01

    Many recent theories and methods proposed for visual preprocessing and shape representation are summarized. The survey brings together research from the fields of biology, psychology, computer science, electrical engineering, and most recently, neural networks. It was motivated by the need to preprocess images for a sparse distributed memory (SDM), but the techniques presented may also prove useful for applying other associative memories to visual pattern recognition. The material of this survey is divided into three sections: an overview of biological visual processing; methods of preprocessing (extracting parts of shape, texture, motion, and depth); and shape representation and recognition (form invariance, primitives and structural descriptions, and theories of attention).

  17. Calcification–carbonation method for red mud processing

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ruibing [School of Metallurgy, Northeastern University, Shenyang 110819 (China); Laboratory for Simulation and Modelling of Particulate Systems, Department of Chemical Engineering, Monash University, Clayton, Victoria, 3800 (Australia); Zhang, Tingan, E-mail: zhangta@smm.neu.edu.cn [School of Metallurgy, Northeastern University, Shenyang 110819 (China); Liu, Yan; Lv, Guozhi; Xie, Liqun [School of Metallurgy, Northeastern University, Shenyang 110819 (China)

    2016-10-05

    Highlights: • A new approach named the calcification–carbonation method for red mud processing is proposed. • The method can prevent emission of red mud from alumina production and is good for the environment. • Thermodynamic characteristics were investigated. • The method was verified experimentally using a jet-flow reactor. - Abstract: Red mud, the Bayer process residue, is generated by the alumina industry and causes environmental problems. In this paper, a novel calcification–carbonation method that utilizes a large amount of the Bayer process residue is proposed. Using this method, the red mud was calcified with lime to transform the silicon phase into hydrogarnet, and the alkali in red mud was recovered. Then, the resulting hydrogarnet was decomposed by CO2 carbonation, affording calcium silicate, calcium carbonate, and aluminum hydroxide. Alumina was recovered using an alkaline solution at a low temperature. The effects of the new process were analyzed by thermodynamic analysis and experiments. The extraction efficiency of the alumina and soda obtained from the red mud reached 49.4% and 96.8%, respectively. The new red mud with <0.3% alkali can be used in cement production. Using a combination of this method and cement production, the Bayer process red mud can be completely utilized.

  18. A Survey on Chinese Scholars' Adoption of Mixed Methods

    Science.gov (United States)

    Zhou, Yuchun

    2018-01-01

    Since the 1980s, when mixed methods emerged as "the third research methodology", the approach has been widely adopted in Western countries. However, little literature has revealed how this methodology has been accepted by scholars in Asian countries, such as China. Therefore, this paper used a quantitative survey to investigate Chinese scholars' perceptions…

  19. Survey of Nuclear Methods in Chemical Technology

    International Nuclear Information System (INIS)

    Broda, E.

    1966-01-01

    An attempt is made to classify nuclear methods on a logical basis to facilitate assimilation by the technologist. The three main groups are: (I) Tracer methods, (II) Methods based on the influence of absorbers on radiations to be measured, and (III) Radiation chemical methods. The variants of the first two groups are discussed in some detail, and typical examples are given. Group I can be subdivided into (1) Indicator methods, (2) Emanation methods, (3) Radioreagent methods, and (4) Isotope dilution methods, Group II into (5) Activation methods, (6) Absorption methods, (7) Induced Nuclear Reaction methods, (8) Scattering methods, and (9) Fluorescence methods. While the economic benefits due to nuclear methods already run into hundreds of millions of dollars annually, owing to radiation protection problems radiochemical methods in the strict sense are not widely used in actual production. It is suggested that more use should be made of pilot plant tracer studies of chemical processes as used in industry. (author)

  20. 3D seismic survey in Honjo, Akita. Problems and struggles in acquisition and processing; Akitaken Honjo koku ni okeru sanjigen jishin tansa. Genba sagyo to data shori ni okeru mondaiten

    Energy Technology Data Exchange (ETDEWEB)

    Imahori, S; Kotera, Y; Nakanishi, T [Japan Energy Development Co. Ltd., Tokyo (Japan)

    1997-05-27

    The Honjo mining area where the investigations are conducted is hilly and has complicated terrain, with gas pipes buried just under the access road, which prevents the proper positioning of large shock-generating excavators or vibrators. A shallow-hole shooting method using auger drilling is used in this survey to execute blastings at 639 points. In this method, which uses charge depths of 4 m rather than the conventional deeper charge depths (20-25 m), surface waves prevail in the shot records, giving rise to a new problem of removing them in the data processing stage. The 2D filter that is a powerful tool in 2D data processing is not available in a 3D survey, where the trace intervals in the shot records are irregular. In this work, a window length is specified as a parameter in the time direction, and the F-X dip filtering method is employed, in which any event that continues linearly beyond a certain number of traces in the said window is eliminated as linear noise. It is recommended that the weighting function be changed in the space direction since surface wave velocities differ at different locations. 1 fig., 1 tab.

  1. Methods in Astronomical Image Processing

    Science.gov (United States)

    Jörsäter, S.

    A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future
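
    Several of the CCD reduction steps listed above (bias subtraction, dark subtraction, flat fielding) reduce to simple array arithmetic; the Python sketch below illustrates them on synthetic frames and is not taken from the article.

      # Minimal CCD reduction sketch: calibrated = (raw - bias - dark) / flat.
      # The synthetic frames stand in for real calibration exposures.
      import numpy as np

      rng = np.random.default_rng(3)
      shape = (64, 64)
      bias = np.full(shape, 300.0)                  # bias level in counts (assumed)
      dark_rate = 0.1                               # counts / s / pixel (assumed)
      flat = rng.normal(1.0, 0.02, shape)           # pixel-to-pixel sensitivity (assumed)
      exposure = 120.0                              # seconds

      sky_plus_source = rng.poisson(50.0, shape).astype(float)
      raw = bias + dark_rate * exposure + flat * sky_plus_source

      master_bias = bias                            # in practice a median of many bias frames
      master_dark = dark_rate * exposure            # scaled from dark exposures
      master_flat = flat / flat.mean()              # normalised flat field

      calibrated = (raw - master_bias - master_dark) / master_flat
      print(calibrated.mean())                      # recovers the sky+source level (~50)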

  2. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile
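
    The response, cooperation, refusal and contact rates quoted above follow the AAPOR outcome-rate definitions; the sketch below shows a simplified version of such a calculation. Only the completed (9,469) and partial (3,547) interview counts come from the abstract; the remaining disposition counts, and the omission of AAPOR's eligibility adjustment, are assumptions.

      # Hedged sketch of simplified AAPOR-style outcome rates. The refusal,
      # non-contact and "other" counts below are invented for illustration.
      complete = 9469
      partial = 3547
      refusal = 2500          # assumed
      non_contact = 20000     # assumed
      other_eligible = 1000   # assumed

      eligible = complete + partial + refusal + non_contact + other_eligible
      response_rate = complete / eligible
      cooperation_rate = complete / (complete + partial + refusal)
      refusal_rate = refusal / eligible
      contact_rate = (complete + partial + refusal) / eligible
      print(f"RR={response_rate:.0%} COOP={cooperation_rate:.0%} "
            f"REF={refusal_rate:.0%} CON={contact_rate:.0%}")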

  3. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit

  4. Conducting Surveys and Data Collection: From Traditional to Mobile and SMS-based Surveys

    Directory of Open Access Journals (Sweden)

    Iftikhar Alam

    2014-08-01

    Full Text Available Fresh, bias-free and valid data collected using different survey modes are considered an essential requirement for the smooth functioning and evolution of an organization. Surveys play a major role in making timely, correct decisions and generating reports. The aim of this study is to compare and investigate the state of the art in different survey modes, including print, email, online, mobile and SMS-based surveys. Results indicated that existing methods are neither complete nor sufficient to fulfil the overall requirements of an organization that relies primarily on surveys. They also show that SMS is a dominant method for data collection due to its pervasiveness. However, existing SMS-based data collection has limitations such as the limited number of characters per SMS, a single question per SMS and the lack of multimedia support. Recent trends in data collection emphasize data collection applications for smartphones. However, in developing countries low-end mobile devices are still extensively used, which makes data collection from the man in the street difficult. The paper concludes that existing survey modes and methods should be improved to get maximum responses quickly and at low cost. The study has contributed to the area of surveying and data collection by analysing different factors such as cost, time and response rate. The results of this study can help practitioners in creating a more successful surveying method for data collection that can be effectively used for low-budget projects in developed as well as developing countries.

  5. Processing of fallopian tube, ovary, and endometrial surgical pathology specimens: A survey of U.S. laboratory practices.

    Science.gov (United States)

    Samimi, Goli; Trabert, Britton; Duggan, Máire A; Robinson, Jennifer L; Coa, Kisha I; Waibel, Elizabeth; Garcia, Edna; Minasian, Lori M; Sherman, Mark E

    2018-03-01

    Many high-grade serous carcinomas initiate in fallopian tubes as serous tubal intraepithelial carcinoma (STIC), a microscopic lesion identified with specimen processing according to the Sectioning and Extensive Examination of the Fimbria protocol (SEE-Fim). Given that the tubal origin of these cancers was recently recognized, we conducted a survey of pathology practices to assess processing protocols that are applied to gynecologic surgical pathology specimens in clinical contexts in which finding STIC might have different implications. We distributed a survey electronically to the American Society for Clinical Pathology list-serve to determine practice patterns and compared results between practice types by chi-square (χ2) tests for categorical variables. Free text comments were qualitatively reviewed. Survey responses were received from 159 laboratories (72 academic, 87 non-academic), which reported diverse specimen volumes and percentage of gynecologic samples. Overall, 74.1% of laboratories reported performing SEE-Fim for risk-reducing surgical specimens (82.5% academic versus 65.7% non-academic, p STIC or early cancer precursors. Published by Elsevier Inc.

  6. Metal speciation: survey of environmental methods of analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mach, M.H.; Nott, B.; Scott, J.W.; Maddalone, R.F.; Whiddon, N.T. [TRW, Redondo Beach, CA (United States). Chemistry Technology Dept.

    1996-07-01

    As part of a recent task under the EPRI Analytical Methods Qualification Program (RP 1851), TRW has surveyed the methods available for monitoring metal species in typical utility aqueous discharge streams. Methods for determining the individual species of these metals can become important in a regulatory sense as the EPA transitions to assessment of environmental risk based on bioavailability. For example, EPA considers methyl mercury and Cr(VI) much more toxic to the aquatic environment than inorganic mercury or Cr(III). The species of a given element can also differ in their transport and bioaccumulation. Methods for speciation generally include a selective separation step followed by standard metals analysis. Speciation, therefore, is mainly derived from the separation step and not from the method of final quantitation. Examples of separation/analysis include: selective extraction followed by graphite furnace atomic absorption or ICP-MS; separation by GC followed by metals detection; chelation and/or direct separation by LC followed by UV measurement or metals detection; and ion chromatography with conductivity, UV, or metals detection. There are a number of sampling issues associated with metal species such as stabilization (maintaining oxidation state), absorption, and filtration that need to be addressed in order to obtain and maintain a representative sample for analysis. 45 refs., 1 tab.

  7. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. The objective was to guide the design of multiplier method population size estimation studies using respondent-driven sampling surveys to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
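
    The basic estimator described above, dividing the service or object count M by the surveyed proportion P, can be sketched together with a delta-method confidence interval as below. This illustrates the general idea rather than the variance formulation developed in the paper; the design effect, sample size and counts are hypothetical.

```python
import math

def multiplier_estimate(M, p_hat, n, design_effect=2.0, z=1.96):
    """Population size estimate N = M / p with a delta-method CI.
    M: count of unique objects distributed (or service users),
    p_hat: proportion in the RDS survey reporting receipt,
    n: survey sample size, design_effect: assumed RDS design effect."""
    N_hat = M / p_hat
    var_p = design_effect * p_hat * (1 - p_hat) / n   # inflated binomial variance
    var_N = (M ** 2) * var_p / p_hat ** 4             # delta method for M/p
    se_N = math.sqrt(var_N)
    return N_hat, (N_hat - z * se_N, N_hat + z * se_N)

# Hypothetical numbers: 2,000 unique objects, 25% of 500 respondents report receipt
N, ci = multiplier_estimate(M=2000, p_hat=0.25, n=500)
print(f"N = {N:.0f}, 95% CI ~ ({ci[0]:.0f}, {ci[1]:.0f})")
```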

  8. Field Methods for the Study of Slope and Fluvial Processes

    Science.gov (United States)

    Leopold, Luna Bergere; Leopold, Luna Bergere

    1967-01-01

    In Belgium during the summer of 1966 the Commission on Slopes and the Commission on Applied Geomorphology of the International Geographical Union sponsored a joint symposium, with field excursions, and meetings of the two commissions. As a result of the conference and associated discussions, the participants expressed the view that it would be a contribution to scientific work relating to the subject area if the Commission on Applied Geomorphology could prepare a small manual describing the methods of field investigation being used by research scientists throughout the world in the study of various aspects of slope development and fluvial processes. The Commission then assumed this responsibility and asked as many persons as were known to be working on this subject to contribute whatever they wished in the way of descriptions of methods being employed. The purpose of the present manual is to show the variety of study methods now in use, to describe from the experience gained the limitations and advantages of different techniques, and to give pertinent detail which might be useful to other investigators. Some details that would be useful to know are not included in scientific publications, but in a manual on methods the details of how best to use a method have a place. Various persons have learned certain things which cannot be done, as well as some methods that are successful. It is our hope that comparison of methods tried will give the reader suggestions as to how a particular method might best be applied to his own circumstance. The manual does not purport to include methods used by all workers. In particular, it does not interfere with a more systematic treatment of the subject (1) or with various papers already published in the present journal. In fact we are sure that there are pertinent research methods that we do not know of, and the Commission would be glad to receive additions and other ideas from those who find they have something to contribute. Also, the

  9. Elastic versus acoustic inversion for marine surveys

    KAUST Repository

    Mora, Peter; Wu, Zedong

    2018-01-01

    Full Wavefield Inversion (FWI) is a powerful and elegant approach for seismic imaging that is on the way to becoming the method of choice when processing exploration or global seismic data. In the case of processing marine survey data, one may

  10. Web-based Survey Data Collection With Peer Support and Advocacy Organizations: Implications of Participatory Methods.

    Science.gov (United States)

    Ostrow, Laysha; Penney, Darby; Stuart, Elizabeth; Leaf, Phillip J

    2017-01-01

    The 2012 National Survey of Peer-Run Organizations is one of the first to survey a nationally representative sample of mental health peer-run organizations, nonprofit venues for support and advocacy which are defined by people with psychiatric histories being in positions of authority and control. This paper describes data collection methods and demonstrates how participatory strategies to involve people with psychiatric histories intersected with Internet research to achieve study aims. People with psychiatric histories were involved in designing and implementing a web-based survey to collect data on peer-run organizations' operations and views on national policy. Participatory approaches were used throughout design, data collection, analysis, and dissemination. The extensive involvement of people with psychiatric histories in project design and implementation was an important strategy that contributed to this study's success.

  11. Radiological survey techniques for decontamination and dismantlement applications

    International Nuclear Information System (INIS)

    Ruesink, G.P.; Stempfley, D.H.; Pettit, P.J.; Warner, R.D.

    1997-01-01

    The Department of Energy's Fernald Environmental Management Project (FEMP) is engaged in an aggressive program to remove all above-ground structures as part of the Fernald site's final remediation remedy. Through the complete removal of major facilities such as Plant 7, Plant 4, and Plant 1, the FEMP has developed radiological survey approaches that are effective for the different phases of the Decontamination and Dismantlement (D&D) process. Some of the most pressing challenges facing the FEMP are implementing effective, low-cost methods for the D&D of former process buildings while minimizing environmental effects. One of the key components to ensure minimal impact on the environment is the collection of radiological contamination information during the D&D process to facilitate decision making. Prior to the final demolition of any structure, radiological surveys of floors, walls, and ceilings must take place. These surveys must demonstrate that contamination levels are below 5000 dpm removable beta/gamma for non-porous surfaces and below 1000 dpm removable beta/gamma for all porous surfaces. Techniques that can perform these activities in a safe, effective, and cost-efficient manner are greatly desired. The FEMP has investigated new approaches to address this need. These techniques include sampling approaches using standard baseline methodology as well as innovative approaches to accelerate final radiological clearance processes. To further improve upon this process, the FEMP has investigated several new technologies through the Fernald Plant 1 Large Scale Technology Demonstration Project. One of the most promising of these new technologies, Laser Induced Fluorescence, may significantly improve the radiological clearance survey process. This paper will present real-world experiences in applying radiological control limits to D&D projects as well as relate potential productivity and cost improvements with the

  12. Studying Landslide Displacements in Megamendung (Indonesia Using GPS Survey Method

    Directory of Open Access Journals (Sweden)

    Hasanuddin Z. Abidin

    2004-11-01

    Full Text Available Landslide is one of the prominent geohazards that frequently affects Indonesia, especially in the rainy season. It destroys not only the environment and property, but usually also causes deaths. Landslide monitoring is therefore crucial and should be done continuously. One of the methods that can contribute to the study of landslide phenomena is the repeated GPS survey method. This paper presents and discusses the operational performance, constraints and results of GPS surveys conducted in a well-known landslide-prone area in West Java (Indonesia), namely Megamendung, the hilly region close to Bogor. Three GPS surveys involving 8 GPS points have been conducted, in April 2002, May 2003 and May 2004, respectively. The estimated landslide displacements in the area are relatively large, at the level of a few decimetres to a few metres. Displacements of up to about 2-3 m were detected in the April 2002 to May 2003 period, and of up to about 3-4 dm in the May 2003 to May 2004 period. In both periods, landslides generally show a northwest direction of displacement. Displacements vary both spatially and temporally. This study also suggested that, in order to conclude that real and significant displacements of GPS points exist, the GPS-estimated displacements should be subjected to three types of testing, namely: a congruency test on spatial displacements, a test of the agreement between the horizontal distance changes and the predicted direction of landslide displacement, and a test of the consistency of displacement directions over two consecutive periods.
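
    As an illustration of the first of those tests, the sketch below applies a chi-square significance test to the displacement vector estimated from two GPS campaigns. It is a generic congruency-style test under the assumption of uncorrelated campaigns with known covariance matrices, not the authors' exact procedure; the coordinates and covariances are hypothetical.

```python
import numpy as np
from scipy import stats

def displacement_test(x1, x2, cov1, cov2, alpha=0.05):
    """Chi-square test for a significant displacement between two GPS
    campaigns. x1, x2: estimated coordinates (m); cov1, cov2: their
    covariance matrices. Assumes the two campaigns are uncorrelated."""
    d = np.asarray(x2, float) - np.asarray(x1, float)
    cov_d = np.asarray(cov1) + np.asarray(cov2)
    t_stat = float(d @ np.linalg.solve(cov_d, d))
    threshold = stats.chi2.ppf(1 - alpha, df=d.size)
    return d, t_stat, t_stat > threshold

# Hypothetical 2D (east, north) example with centimetre-level precision
d, t, significant = displacement_test(
    x1=[0.000, 0.000], x2=[0.215, -0.480],
    cov1=np.diag([0.010**2, 0.010**2]),
    cov2=np.diag([0.012**2, 0.012**2]))
print(d, round(t, 1), significant)
```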

  13. Optimum survey methods when interviewing employed women.

    Science.gov (United States)

    Dunning, Kari; LeMasters, Grace K

    2009-02-01

    While survey studies have examined bias, much is unknown regarding specific subpopulations, especially women workers. A population-based phone, Internet, and mail survey of workplace falls during pregnancy was undertaken. Participation by industry, occupation, and survey approach, as well as bias, reliability, and incomplete data, were examined. Of the 3,997 women surveyed, 71% were employed during their pregnancy. Internet responders were the most likely to have been employed while pregnant and to report a workplace fall, at 8.8% compared to 5.8% and 6.1% for mail and phone respondents. Internet responders also had the most missing employment data, with company name missing for 17.9% compared to 1.3% for phone responders. Mail surveys were best for recruiting those employed in eight of nine industries, and this was especially true for service occupations. To decrease bias and increase participation, mixed approaches may be useful, with particular attention to collecting occupational data. Am. J. Ind. Med. 52:105-112, 2009. (c) 2008 Wiley-Liss, Inc.

  14. Aerogeophysical survey in Olkiluoto 2009

    International Nuclear Information System (INIS)

    Kurimo, M.

    2009-08-01

    This report describes the survey operation, the survey and processing methods, and the deliverables of an aerogeophysical survey of the Olkiluoto area in May 2009. The survey was conducted by the Geological Survey of Finland (GTK). The survey aircraft was a twin-engine Twin Otter operated by the Finnish Aviation Academy (SIO) and owned by the Natural Environment Research Council / British Geological Survey (NERC/BGS), with whom GTK has established a joint venture called Joint Airborne-geoscience Capability (JAC). The survey was conducted between May 5th and May 18th, 2009, and consisted of ten separate survey flights and two magnetic calibration flights. The survey was based at Pori airport. Survey line spacing was 50 meters and the nominal survey altitude was 30 meters. Measurements were completed in May 2009, and data processing and reporting were done in June 2009. Two cesium magnetometers installed onboard the aircraft (at the left wingtip and in a nose cone) measured the magnetic total field intensity during the survey flights. An automatic compensation unit corrected the aircraft attitude errors in the magnetic data in real time. The four-frequency electromagnetic (EM) unit included four transmitter coils with amplifiers in the right wingtip and four receiver coils in the left wingtip. Frequencies were 900 Hz, 3 kHz, 14 kHz and 24.5 kHz. The gamma spectrometer with two crystal packages (total volume 42 litres) measured the 256-channel energy spectra. In addition, auxiliary parameters such as flight altitude and aircraft attitude were recorded simultaneously. A reference ground base station was used to record the temporal variations in the magnetic field and to provide reference data for post-positioning of coordinate information. In the post-processing phase, heading correction, base station correction and microlevelling procedures were applied to the magnetic data. The EM data and radiometric data were corrected with calibration coefficients and levelled. The data was
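
    The base-station correction mentioned above can be illustrated with a short sketch: the temporal (diurnal) variation recorded at the ground station is interpolated to the airborne sampling times and subtracted from the airborne readings. This is a generic illustration of the idea, not GTK's processing code, and all readings are hypothetical.

```python
import numpy as np

def base_station_correction(t_air, tmi_air, t_base, tmi_base, datum=None):
    """Remove temporal (diurnal) variation recorded at a ground base station
    from airborne total magnetic intensity (TMI) readings.
    t_*: times in seconds, tmi_*: readings in nT."""
    if datum is None:
        datum = float(np.mean(tmi_base))      # reference level of the base station
    base_at_air_times = np.interp(t_air, t_base, tmi_base)
    return tmi_air - (base_at_air_times - datum)

# Hypothetical airborne and base-station readings
t_air = np.array([0.0, 1.0, 2.0])
tmi_air = np.array([51230.5, 51231.0, 51228.7])
t_base = np.array([0.0, 5.0])
tmi_base = np.array([51000.2, 51003.4])
print(base_station_correction(t_air, tmi_air, t_base, tmi_base))
```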

  15. An international survey and modified Delphi process revealed editors' perceptions, training needs, and ratings of competency-related statements for the development of core competencies for scientific editors of biomedical journals.

    Science.gov (United States)

    Galipeau, James; Cobey, Kelly D; Barbour, Virginia; Baskin, Patricia; Bell-Syer, Sally; Deeks, Jonathan; Garner, Paul; Shamseer, Larissa; Sharon, Straus; Tugwell, Peter; Winker, Margaret; Moher, David

    2017-01-01

    Background: Scientific editors (i.e., those who make decisions on the content and policies of a journal) have a central role in the editorial process at biomedical journals. However, very little is known about the training needs of these editors or what competencies are required to perform effectively in this role. Methods: We conducted a survey of perceptions and training needs among scientific editors from major editorial organizations around the world, followed by a modified Delphi process in which we invited the same scientific editors to rate the importance of competency-related statements obtained from a previous scoping review. Results: A total of 148 participants completed the survey of perceptions and training needs. At least 80% of participants agreed on six of the 38 skill and expertise-related statements presented to them as being important or very important to their role as scientific editors. At least 80% agreed on three of the 38 statements as necessary skills they perceived themselves as possessing (well or very well).  The top five items on participants' list of top training needs were training in statistics, research methods, publication ethics, recruiting and dealing with peer reviewers, and indexing of journals. The three rounds of the Delphi were completed by 83, 83, and 73 participants, respectively, which ultimately produced a list of 23 "highly rated" competency-related statements and another 86 "included" items. Conclusion: Both the survey and the modified Delphi process will be critical for understanding knowledge and training gaps among scientific editors when designing curriculum around core competencies in the future.

  16. An international survey and modified Delphi process revealed editors’ perceptions, training needs, and ratings of competency-related statements for the development of core competencies for scientific editors of biomedical journals

    Science.gov (United States)

    Galipeau, James; Cobey, Kelly D.; Barbour, Virginia; Baskin, Patricia; Bell-Syer, Sally; Deeks, Jonathan; Garner, Paul; Shamseer, Larissa; Sharon, Straus; Tugwell, Peter; Winker, Margaret; Moher, David

    2017-01-01

    Background: Scientific editors (i.e., those who make decisions on the content and policies of a journal) have a central role in the editorial process at biomedical journals. However, very little is known about the training needs of these editors or what competencies are required to perform effectively in this role. Methods: We conducted a survey of perceptions and training needs among scientific editors from major editorial organizations around the world, followed by a modified Delphi process in which we invited the same scientific editors to rate the importance of competency-related statements obtained from a previous scoping review. Results: A total of 148 participants completed the survey of perceptions and training needs. At least 80% of participants agreed on six of the 38 skill and expertise-related statements presented to them as being important or very important to their role as scientific editors. At least 80% agreed on three of the 38 statements as necessary skills they perceived themselves as possessing (well or very well).  The top five items on participants’ list of top training needs were training in statistics, research methods, publication ethics, recruiting and dealing with peer reviewers, and indexing of journals. The three rounds of the Delphi were completed by 83, 83, and 73 participants, respectively, which ultimately produced a list of 23 “highly rated” competency-related statements and another 86 “included” items. Conclusion: Both the survey and the modified Delphi process will be critical for understanding knowledge and training gaps among scientific editors when designing curriculum around core competencies in the future. PMID:28979768

  17. Analytical techniques for in-line/on-line monitoring of uranium and plutonium in process solutions : a brief literature survey

    International Nuclear Information System (INIS)

    Marathe, S.G.; Sood, D.D.

    1991-01-01

    In-line/on-line monitoring of various parameters such as uranium-plutonium-fission product concentration, acidity, density etc. plays an important role in quickly understanding the efficiency of processes in a reprocessing plant. Efforts to study and install such analytical instruments have been going on for more than three decades, with the adaptation of newer methods and technologies. A review of the development of in-line analytical instrumentation was carried out in this laboratory about two decades ago. This report presents a very short literature survey of the work in the last two decades. The report includes an outline of the principles of the main techniques employed in in-line/on-line monitoring. (author). 77 refs., 6 tabs

  18. Plane and geodetic surveying

    CERN Document Server

    Johnson, Aylmer

    2014-01-01

    Introduction; Aim and Scope; Classification of Surveys; The Structure of This Book; General Principles of Surveying; Errors; Redundancy; Stiffness; Adjustment; Planning and Record Keeping; Principal Surveying Activities; Establishing Control Networks; Mapping; Setting Out; Resectioning; Deformation Monitoring; Angle Measurement; The Surveyor's Compass; The Clinometer; The Total Station; Making Observations; Checks on Permanent Adjustments; Distance Measurement; General; Tape Measurements; Optical Methods (Tachymetry); Electromagnetic Distance Measurement (EDM); Ultrasonic Methods; GNSS; Levelling; Theory; The Instrument; Technique; Booking; Permanent Adjustmen

  19. Uranium manufacturing process employing the electrolytic reduction method

    International Nuclear Information System (INIS)

    Oda, Yoshio; Kazuhare, Manabu; Morimoto, Takeshi.

    1986-01-01

    The present invention relates to a uranium manufacturing process that employs the electrolytic reduction method, and particularly to a uranium manufacturing process that employs an electrolytic reduction method requiring low voltage. The process in which uranium is obtained by the electrolytic method with uranyl acid as the raw material is prior art

  20. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Science.gov (United States)

    Hernandez, Wilmar

    2007-01-01

    In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. A comparison between classical filters and optimal filters for automotive sensors is presented, and the current state of the art of applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is illustrated through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be made overnight because there are some open research issues that have to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
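
    As a concrete illustration of the kind of optimal filtering discussed above, the sketch below implements a scalar random-walk Kalman filter applied to a noisy sensor signal. It is a generic textbook filter rather than one of the filters evaluated in the paper; the noise variances and the simulated signal are assumptions chosen only for the example.

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.05**2, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter for a noisy sensor signal.
    q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the random-walk model carries the state over, uncertainty grows
        p = p + q
        # Update with the new measurement
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Hypothetical noisy sensor signal around a slowly varying true value
rng = np.random.default_rng(0)
truth = np.linspace(0.0, 1.0, 200)
noisy = truth + rng.normal(0, 0.05, size=truth.size)
smoothed = kalman_1d(noisy)
print(float(np.mean(np.abs(smoothed[50:] - truth[50:]))))  # residual error
```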

  1. Analysis of method of polarization surveying of water surface oil pollution

    Science.gov (United States)

    Zhukov, B. S.

    1979-01-01

    A method of polarization surveying of oil films on the water surface is analyzed. Model calculations of the contrast of oil and water obtained with different orientations of the analyzer are discussed. The model depends on the spectral range, the transparency of the water and the oil film, and the selection of the observational direction.

  2. Geotechnical survey procedures for a repository mine

    International Nuclear Information System (INIS)

    Walther, C.

    1993-01-01

    The approach to the survey involves the operational realisation of an information process beginning with the definition of the survey programme and ending with the presentation of the survey results in the form of planning and assessment documents. The survey methods must conform with the mine regulations, provide reliable predictions and produce the maximum possible salient information. The recording of large and varied amounts of data, and the complex interpretation procedures that follow, require effective data and information management to allow the presentation of the results in accordance with the planning specifications. (orig.)

  3. Multi-GNSS high-rate RTK, PPP and novel direct phase observation processing method: application to precise dynamic displacement detection

    Science.gov (United States)

    Paziewski, Jacek; Sieradzki, Rafal; Baryla, Radoslaw

    2018-03-01

    This paper provides the methodology and performance assessment of multi-GNSS signal processing for the detection of small-scale high-rate dynamic displacements. For this purpose, we used methods of relative (RTK) and absolute positioning (PPP), and a novel direct signal processing approach. The first two methods are recognized as providing accurate information on position in many navigation and surveying applications. The latter is an innovative method for dynamic displacement determination with the use of GNSS phase signal processing. This method is based on the developed functional model with parametrized epoch-wise topocentric relative coordinates derived from filtered GNSS observations. Current regular kinematic PPP positioning, as well as medium/long range RTK, may not offer coordinate estimates with subcentimeter precision. Thus, extended processing strategies of absolute and relative GNSS positioning have been developed and applied for displacement detection. The study also aimed to comparatively analyze the developed methods as well as to analyze the impact of combined GPS and BDS processing and the dependence of the results of the relative methods on the baseline length. All the methods were implemented with in-house developed software allowing for high-rate precise GNSS positioning and signal processing. The phase and pseudorange observations collected with a rate of 50 Hz during the field test served as the experiment’s data set. The displacements at the rover station were triggered in the horizontal plane using a device which was designed and constructed to ensure a periodic motion of GNSS antenna with an amplitude of ~3 cm and a frequency of ~4.5 Hz. Finally, a medium range RTK, PPP, and direct phase observation processing method demonstrated the capability of providing reliable and consistent results with the precision of the determined dynamic displacements at the millimeter level. Specifically, the research shows that the standard deviation of
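
    The parametrization of epoch-wise topocentric coordinates mentioned above relies on rotating coordinate differences from the Earth-centred (ECEF) frame into a local east-north-up frame. The sketch below shows that standard rotation; it is a generic illustration rather than the authors' software, and the station latitude/longitude and displacement values are hypothetical.

```python
import numpy as np

def ecef_to_enu_matrix(lat_deg, lon_deg):
    """Rotation from ECEF coordinate differences to local east-north-up
    (topocentric) components at geodetic latitude/longitude."""
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([
        [-np.sin(lon),                np.cos(lon),               0.0],
        [-np.sin(lat) * np.cos(lon), -np.sin(lat) * np.sin(lon), np.cos(lat)],
        [ np.cos(lat) * np.cos(lon),  np.cos(lat) * np.sin(lon), np.sin(lat)],
    ])

# Hypothetical epoch-wise ECEF solutions of a rover antenna (metres)
xyz_ref = np.array([3631234.123, 1456789.456, 5123456.789])
xyz_epoch = xyz_ref + np.array([0.012, -0.008, 0.015])
enu = ecef_to_enu_matrix(53.9, 20.5) @ (xyz_epoch - xyz_ref)
print(enu)   # east, north, up displacement in metres
```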

  4. First-order Convex Optimization Methods for Signal and Image Processing

    DEFF Research Database (Denmark)

    Jensen, Tobias Lindstrøm

    2012-01-01

    In this thesis we investigate the use of first-order convex optimization methods applied to problems in signal and image processing. First we make a general introduction to convex optimization, first-order methods and their iteration complexity. Then we look at different techniques which can be used with first-order methods, such as smoothing, Lagrange multipliers and proximal gradient methods. We continue by presenting different applications of convex optimization and notable convex formulations with an emphasis on inverse problems and sparse signal processing. We also describe the multiple
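
    One of the techniques named above, the proximal gradient method, can be sketched for the classic l1-regularised least-squares (sparse recovery) problem as follows. This is a standard ISTA iteration given for illustration, not code from the thesis; the problem size, regularisation weight and iteration count are arbitrary assumptions.

```python
import numpy as np

def ista(A, b, lam, step=None, iters=500):
    """Proximal-gradient (ISTA) sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    a standard sparse signal-processing formulation."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2         # 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                       # gradient of the smooth part
        z = x - step * grad
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

# Hypothetical compressed-sensing style example
rng = np.random.default_rng(1)
A = rng.normal(size=(80, 200))
x_true = np.zeros(200)
x_true[[5, 50, 120]] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.normal(size=80)
x_hat = ista(A, b, lam=0.1)
print(np.flatnonzero(np.abs(x_hat) > 0.1))             # recovered support
```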

  5. Benthic Photo Survey: Software for Geotagging, Depth-tagging, and Classifying Photos from Survey Data and Producing Shapefiles for Habitat Mapping in GIS

    Directory of Open Access Journals (Sweden)

    Jared Kibele

    2016-03-01

    Full Text Available Photo survey techniques are common for resource management, ecological research, and ground truthing for remote sensing, but current data processing methods are cumbersome and inefficient. The Benthic Photo Survey (BPS) software described here was created to simplify the data processing and management tasks associated with photo surveys of underwater habitats. BPS is free and open source software written in Python with a Qt graphical user interface. BPS takes a GPS log and jpeg images acquired by a diver or drop camera and assigns the GPS position to each photo based on time-stamps (i.e. geotagging). Depth and temperature can be assigned in a similar fashion (i.e. depth-tagging) using log files from an inexpensive consumer-grade depth/temperature logger that can be attached to the camera. BPS provides the user with a simple interface to assign quantitative habitat and substrate classifications to each photo. Location, depth, temperature, habitat, and substrate data are all stored with the jpeg metadata in Exchangeable image file format (Exif). BPS can then export all of these data in a spatially explicit point shapefile format for use in GIS. BPS greatly reduces the time and skill required to turn photos into usable data, thereby making photo survey methods more efficient and cost-effective. BPS can also be used, as is, for other photo sampling techniques in terrestrial and aquatic environments, and the open source code base offers numerous opportunities for expansion and customization.
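
    The time-stamp matching at the heart of the geotagging and depth-tagging steps can be illustrated with a short sketch that interpolates a GPS track and a depth log at each photo's timestamp. This mirrors the general idea rather than BPS's actual implementation (BPS also writes the results into Exif metadata and shapefiles, which is omitted here); all timestamps and coordinates are made up.

```python
import numpy as np

def geotag(photo_times, gps_times, lats, lons, depth_times=None, depths=None):
    """Assign positions (and optionally depths) to photos by interpolating
    logger records at each photo's timestamp (all times as POSIX seconds)."""
    photo_times = np.asarray(photo_times, dtype=float)
    lat = np.interp(photo_times, gps_times, lats)
    lon = np.interp(photo_times, gps_times, lons)
    if depth_times is not None:
        depth = np.interp(photo_times, depth_times, depths)
        return list(zip(lat, lon, depth))
    return list(zip(lat, lon))

# Hypothetical GPS track (two fixes, 10 s apart) and one photo in between
tags = geotag(photo_times=[1005.0],
              gps_times=[1000.0, 1010.0],
              lats=[-36.8500, -36.8502], lons=[174.7600, 174.7604],
              depth_times=[1000.0, 1010.0], depths=[5.2, 6.0])
print(tags)   # [(lat, lon, depth)] for the photo
```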

  6. SKOCh modified parameters and data processing method

    International Nuclear Information System (INIS)

    Abramov, V.V.; Baldin, B.Yu.; Vasil'chenko, V.G.

    1986-01-01

    Characteristics of a modified variant of the Cherenkov radiation ring spectrometer (SKOCH) are presented. Methods of experimental data processing are described. Different SKOCH optics variants are investigated. Multi-particle registering electronics for data read-out from SKOCH, which improve the conditions for registering multiparticle occurrences, were used in the course of measurements with proton beams. A system of programs for processing SKOCH spectrometer data has been developed and experimentally tested. An effective algorithm for calibrating Cherenkov radiation ring spectrometers with a rather large angular and radial aperture has been developed. The on-line and off-line processing program complex provides complete control of SKOCH operation during statistics collection and particle (π, K, P) identification within the 5.5-30 GeV/c range.

  7. Short assessment of the Big Five: robust across survey methods except telephone interviewing.

    Science.gov (United States)

    Lang, Frieder R; John, Dennis; Lüdtke, Oliver; Schupp, Jürgen; Wagner, Gert G

    2011-06-01

    We examined measurement invariance and age-related robustness of a short 15-item Big Five Inventory (BFI-S) of personality dimensions, which is well suited for applications in large-scale multidisciplinary surveys. The BFI-S was assessed in three different interviewing conditions: computer-assisted or paper-assisted face-to-face interviewing, computer-assisted telephone interviewing, and a self-administered questionnaire. Randomized probability samples from a large-scale German panel survey and a related probability telephone study were used in order to test method effects on self-report measures of personality characteristics across early, middle, and late adulthood. Exploratory structural equation modeling was used in order to test for measurement invariance of the five-factor model of personality trait domains across different assessment methods. For the short inventory, findings suggest strong robustness of self-report measures of personality dimensions among young and middle-aged adults. In old age, telephone interviewing was associated with greater distortions in reliable personality assessment. It is concluded that the greater mental workload of telephone interviewing limits the reliability of self-report personality assessment. Face-to-face surveys and self-administered questionnaire completion are clearly better suited than phone surveys when personality traits in age-heterogeneous samples are assessed.

  8. The Global Survey Method Applied to Ground-level Cosmic Ray Measurements

    Science.gov (United States)

    Belov, A.; Eroshenko, E.; Yanke, V.; Oleneva, V.; Abunin, A.; Abunina, M.; Papaioannou, A.; Mavromichalaki, H.

    2018-04-01

    The global survey method (GSM) technique unites simultaneous ground-level observations of cosmic rays in different locations and allows us to obtain the main characteristics of cosmic-ray variations outside of the atmosphere and magnetosphere of Earth. This technique has been developed and applied in numerous studies over many years by the Institute of Terrestrial Magnetism, Ionosphere and Radiowave Propagation (IZMIRAN). We here describe the IZMIRAN version of the GSM in detail. With this technique, the hourly data of the world-wide neutron-monitor network from July 1957 until December 2016 were processed, and further processing is enabled upon the receipt of new data. The result is a database of homogeneous and continuous hourly characteristics of the density variations (an isotropic part of the intensity) and the 3D vector of the cosmic-ray anisotropy. It includes all of the effects that could be identified in galactic cosmic-ray variations that were caused by large-scale disturbances of the interplanetary medium in more than 50 years. These results in turn became the basis for a database on Forbush effects and interplanetary disturbances. This database allows correlating various space-environment parameters (the characteristics of the Sun, the solar wind, et cetera) with cosmic-ray parameters and studying their interrelations. We also present features of the coupling coefficients for different neutron monitors that enable us to make a connection from ground-level measurements to primary cosmic-ray variations outside the atmosphere and the magnetosphere. We discuss the strengths and weaknesses of the current version of the GSM as well as further possible developments and improvements. The method developed allows us to minimize the problems of the neutron-monitor network, which are typical for experimental physics, and to considerably enhance its advantages.

  9. Feasibility of dietary assessment methods, other tools and procedures for a pan-European food consumption survey among infants, toddlers and children

    DEFF Research Database (Denmark)

    Ocké, Marga; Brants, Henny; Dofkova, Marcela

    2014-01-01

    Purpose: To test the feasibility of tools and procedures for a pan-European food consumption survey among children 0-10 years and to recommend one of two tested dietary assessment methods. Methods: Two pilot studies including 378 children were conducted in Belgium and the Czech Republic in the Pilot [...] more challenging by the interviewers. Conclusions: Both dietary assessment methods with related tools and administration protocols were evaluated as feasible. The administration protocol with two 1-day food diaries with completion interviews offers more advantages for the future pan-European survey...

  10. Pairing field methods to improve inference in wildlife surveys while accommodating detection covariance.

    Science.gov (United States)

    Clare, John; McKinney, Shawn T; DePue, John E; Loftin, Cynthia S

    2017-10-01

    It is common to use multiple field sampling methods when implementing wildlife surveys to compare method efficacy or cost efficiency, integrate distinct pieces of information provided by separate methods, or evaluate method-specific biases and misclassification error. Existing models that combine information from multiple field methods or sampling devices permit rigorous comparison of method-specific detection parameters, enable estimation of additional parameters such as false-positive detection probability, and improve occurrence or abundance estimates, but with the assumption that the separate sampling methods produce detections independently of one another. This assumption is tenuous if methods are paired or deployed in close proximity simultaneously, a common practice that reduces the additional effort required to implement multiple methods and reduces the risk that differences between method-specific detection parameters are confounded by other environmental factors. We develop occupancy and spatial capture-recapture models that permit covariance between the detections produced by different methods, use simulation to compare estimator performance of the new models to models assuming independence, and provide an empirical application based on American marten (Martes americana) surveys using paired remote cameras, hair catches, and snow tracking. Simulation results indicate existing models that assume that methods independently detect organisms produce biased parameter estimates and substantially understate estimate uncertainty when this assumption is violated, while our reformulated models are robust to either methodological independence or covariance. Empirical results suggested that remote cameras and snow tracking had comparable probability of detecting present martens, but that snow tracking also produced false-positive marten detections that could potentially substantially bias distribution estimates if not corrected for. Remote cameras detected marten

  11. Multi-block methods in multivariate process control

    DEFF Research Database (Denmark)

    Kohonen, J.; Reinikainen, S.P.; Aaljoki, K.

    2008-01-01

    In chemometric studies all predictor variables are usually collected in one data matrix X. This matrix is then analyzed by PLS regression or other methods. When data from several different sub-processes are collected in one matrix, there is a possibility that the effects of some sub-processes may [...] methods the effect of a sub-process can be seen, and an example with two blocks, near infra-red (NIR) and process data, is shown. The results show improvements in the modelling task when a MB-based approach is used. This way of working with data gives more information on the process than if all data are in one X-matrix. The procedure is demonstrated by an industrial continuous process, where knowledge about the sub-processes is available and the X-matrix can be divided into blocks between process variables and NIR spectra.

  12. A review of neutron scattering correction for the calibration of neutron survey meters using the shadow cone method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sang In; Kim, Bong Hwan; Kim, Jang Lyul; Lee, Jung Il [Health Physics Team, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-12-15

    The calibration methods of neutron-measuring devices such as the neutron survey meter have advantages and disadvantages. To compare the calibration factors obtained by the shadow cone method and semi-empirical method, 10 neutron survey meters of five different types were used in this study. This experiment was performed at the Korea Atomic Energy Research Institute (KAERI; Daejeon, South Korea), and the calibration neutron fields were constructed using a californium-252 (252Cf) neutron source, which was positioned in the center of the neutron irradiation room. The neutron spectra of the calibration neutron fields were measured by a europium-activated lithium iodide scintillator in combination with KAERI's Bonner sphere system. When the shadow cone method was used, 10 single moderator-based survey meters exhibited a smaller calibration factor, by as much as 3.1-9.3%, than that of the semi-empirical method. This finding indicates that neutron survey meters underestimated the scattered neutrons and attenuated neutrons (i.e., the total scatter corrections). This underestimation of the calibration factor was attributed to the fact that single moderator-based survey meters have an under-ambient dose equivalent response in the thermal or thermal-dominant neutron field. As a result, when the shadow cone method is used for a single moderator-based survey meter, an additional correction and the International Organization for Standardization standard 8529-2 for room-scattered neutrons should be considered.
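
    The shadow cone correction compared above can be sketched as follows: the reading taken with the cone interposed estimates the room- and air-scattered component, so the direct response is the difference of the two readings, and the calibration factor is the conventionally true H*(10) rate divided by that net reading. This is only the basic idea; as the abstract notes, ISO 8529-2 prescribes further corrections, and the readings below are hypothetical.

```python
def shadow_cone_calibration(reading_total, reading_cone, h_star_rate):
    """Illustrative shadow-cone calibration factor.
    reading_total: meter reading with the bare source,
    reading_cone: reading with the shadow cone blocking the direct beam
                  (estimates the scattered component),
    h_star_rate: conventionally true H*(10) rate at the reference point."""
    net = reading_total - reading_cone      # scatter-corrected (direct) reading
    return h_star_rate / net                # calibration factor

# Hypothetical survey-meter readings (uSv/h) in a 252Cf field
factor = shadow_cone_calibration(reading_total=105.0, reading_cone=18.0,
                                 h_star_rate=92.5)
print(round(factor, 3))
```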

  13. A review of neutron scattering correction for the calibration of neutron survey meters using the shadow cone method

    International Nuclear Information System (INIS)

    Kim, Sang In; Kim, Bong Hwan; Kim, Jang Lyul; Lee, Jung Il

    2015-01-01

    The calibration methods of neutron-measuring devices such as the neutron survey meter have advantages and disadvantages. To compare the calibration factors obtained by the shadow cone method and semi-empirical method, 10 neutron survey meters of five different types were used in this study. This experiment was performed at the Korea Atomic Energy Research Institute (KAERI; Daejeon, South Korea), and the calibration neutron fields were constructed using a californium-252 (252Cf) neutron source, which was positioned in the center of the neutron irradiation room. The neutron spectra of the calibration neutron fields were measured by a europium-activated lithium iodide scintillator in combination with KAERI's Bonner sphere system. When the shadow cone method was used, 10 single moderator-based survey meters exhibited a smaller calibration factor, by as much as 3.1-9.3%, than that of the semi-empirical method. This finding indicates that neutron survey meters underestimated the scattered neutrons and attenuated neutrons (i.e., the total scatter corrections). This underestimation of the calibration factor was attributed to the fact that single moderator-based survey meters have an under-ambient dose equivalent response in the thermal or thermal-dominant neutron field. As a result, when the shadow cone method is used for a single moderator-based survey meter, an additional correction and the International Organization for Standardization standard 8529-2 for room-scattered neutrons should be considered.

  14. Transport survey calculations using the spectral collocation method

    International Nuclear Information System (INIS)

    Painter, S.L.; Lyon, J.F.

    1989-01-01

    A novel transport survey code has been developed and is being used to study the sensitivity of stellarator reactor performance to various transport assumptions. Instead of following one of the usual approaches, the steady-state transport equations are solved in integral form using the spectral collocation method. This approach effectively combines the computational efficiency of global models with the general nature of 1-D solutions. A compact torsatron reactor test case was used to study the convergence properties and flexibility of the new method. The heat transport model combined Shaing's model for ripple-induced neoclassical transport, the Chang-Hinton model for axisymmetric neoclassical transport, and neo-Alcator scaling for anomalous electron heat flux. Alpha particle heating, radiation losses, classical electron-ion heat flow, and external heating were included. For the test problem, the method exhibited some remarkable convergence properties. As the number of basis functions was increased, the maximum pointwise error in the integrated power balance decayed exponentially until the numerical noise level was reached. Better than 10% accuracy in the globally-averaged quantities was achieved with only 5 basis functions; better than 1% accuracy was achieved with 10 basis functions. The numerical method was also found to be very general. Extreme temperature gradients at the plasma edge, which sometimes arise from the neoclassical models and are difficult to resolve with finite-difference methods, were easily resolved. 8 refs., 6 figs

  15. An exploratory survey of methods used to develop measures of performance

    Science.gov (United States)

    Hamner, Kenneth L.; Lafleur, Charles A.

    1993-09-01

    Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommend improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metrics development method. Those components were incorporated into a proposed metric development method that was based on the OMB Generic Method and should be more likely to produce high-quality metrics that will result in continuous process improvement.

  16. Method for pre-processing LWR spent fuel

    International Nuclear Information System (INIS)

    Otsuka, Katsuyuki; Ebihara, Hikoe.

    1986-01-01

    Purpose: To facilitate the decladding of spent fuel, cladding tube processing, and waste gas recovery, and to enable the efficient execution of the main reprocessing processes thereafter. Constitution: Spent fuel assemblies are sent to a cutting process where they are cut into chips of an easy-to-process size. In a thermal decladding process, the chips undergo thermal cycling in air, with the processing temperature raised and lowered within the range of 700 deg C to 1200 deg C, oxidizing the zircaloy of the cladding tubes into zirconia. The oxidized cladding tubes develop numerous fine cracks and become very brittle, so that they loosen from the fuel pellets when even a slight mechanical force is applied and turn into a powder. The processed products are then separated into zirconia sand and fuel pellets by a gravitational selection method or by a sifting method, the zirconia sand being sent to a waste processing process and the fuel pellets to a melting-refining process. (Yoshino, Y.)

  17. Quality assessment of crude and processed Arecae semen based on colorimeter and HPLC combined with chemometrics methods.

    Science.gov (United States)

    Sun, Meng; Yan, Donghui; Yang, Xiaolu; Xue, Xingyang; Zhou, Sujuan; Liang, Shengwang; Wang, Shumei; Meng, Jiang

    2017-05-01

    Raw Arecae Semen, the seed of Areca catechu L., as well as Arecae Semen Tostum and Arecae semen carbonisata are traditionally processed by stir-baking for subsequent use in a variety of clinical applications. These three Arecae semen types, important Chinese herbal drugs, have been used in China and other Asian countries for thousands of years. In this study, the sensory technologies of a colorimeter and sensitive validated high-performance liquid chromatography with diode array detection were employed to discriminate raw Arecae semen and its processed drugs. The color parameters of the samples were determined by a colorimeter instrument CR-410. Moreover, the fingerprints of the four alkaloids of arecaidine, guvacine, arecoline and guvacoline were surveyed by high-performance liquid chromatography. Subsequently, Student's t test, the analysis of variance, fingerprint similarity analysis, hierarchical cluster analysis, principal component analysis, factor analysis and Pearson's correlation test were performed for final data analysis. The results obtained demonstrated a significant color change characteristic for components in raw Arecae semen and its processed drugs. Crude and processed Arecae semen could be determined based on colorimetry and high-performance liquid chromatography with a diode array detector coupled with chemometrics methods for a comprehensive quality evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Laser Scanning in Engineering Surveying: Methods of Measurement and Modeling of Structures

    Directory of Open Access Journals (Sweden)

    Lenda Grzegorz

    2016-06-01

    Full Text Available The study is devoted to the uses of laser scanning in the field of engineering surveying. Laser scanning is currently one of the main research directions developed at the Department of Engineering Surveying and Civil Engineering at the Faculty of Mining Surveying and Environmental Engineering of AGH University of Science and Technology in Krakow. These studies mainly relate to issues associated with tower and shell structures, the infrastructure of rail routes, and the development of digital elevation models for a wide range of applications. Such issues often require the use of a variety of scanning techniques (stationary, mobile), and the differences also concern the planning of measurement stations and the methods of merging point clouds. Significant differences appear during the analysis of point clouds, especially when modeling objects. Analysis of selected parameters is already possible based on ad hoc measurements carried out on a point cloud. However, only the construction of three-dimensional models provides complete information about the shape of structures, allows the analysis to be performed in any place, and reduces the amount of stored data. Some structures can be modeled in the form of simple axes, sections, or solids; for others it becomes necessary to create sophisticated models of surfaces depicting local deformations. The examples selected for the study allow the scope of measurement and office work to be assessed for a variety of uses related to the issue set forth in the title of this study. Additionally, the latest, forward-looking technology is presented: laser scanning performed from Unmanned Aerial Vehicles (drones). Currently it is basically in the prototype phase, but it can be expected to make significant progress in numerous applications in the field of engineering surveying.
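
    As a small illustration of deriving a geometric model from scan data, the sketch below fits a best-fit plane to a point cloud by singular value decomposition and reports the RMS point-to-plane distance, which is one simple way of quantifying local deformations. It is a generic example, not code from the study; the simulated slab and its noise level are assumptions.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to a point cloud (n x 3 array) via SVD.
    Returns the centroid, the unit normal, and the RMS of the
    point-to-plane distances."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                                  # direction of least variance
    distances = (pts - centroid) @ normal            # signed point-to-plane distances
    return centroid, normal, float(np.sqrt(np.mean(distances ** 2)))

# Hypothetical noisy scan of a nearly horizontal, slightly tilted slab
rng = np.random.default_rng(2)
xy = rng.uniform(0, 10, size=(500, 2))
z = 0.02 * xy[:, 0] + rng.normal(0, 0.003, size=500)
centroid, normal, rms = fit_plane(np.column_stack([xy, z]))
print(normal, rms)
```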

  19. The relative size of measurement error and attrition error in a panel survey. Comparing them with a new multi-trait multi-method model

    NARCIS (Netherlands)

    Lugtig, Peter

    2017-01-01

    This paper proposes a method to simultaneously estimate both measurement and nonresponse errors for attitudinal and behavioural questions in a longitudinal survey. The method uses a Multi-Trait Multi-Method (MTMM) approach, which is commonly used to estimate the reliability and validity of survey

  20. Chapter 12: Survey Design and Implementation for Estimating Gross Savings Cross-Cutting Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Baumgartner, Robert [Tetra Tech, Madison, WI (United States)

    2017-10-05

    This chapter presents an overview of best practices for designing and executing survey research to estimate gross energy savings in energy efficiency evaluations. A detailed description of the specific techniques and strategies for designing questions, implementing a survey, and analyzing and reporting the survey procedures and results is beyond the scope of this chapter. So for each topic covered below, readers are encouraged to consult articles and books cited in References, as well as other sources that cover the specific topics in greater depth. This chapter focuses on the use of survey methods to collect data for estimating gross savings from energy efficiency programs.

  1. Parametric methods for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text is submitted for the volume 'A Handbook of Spatial Statistics' edited by A.E. Gelfand, P. Diggle, M. Fuentes, and P. Guttorp, to be published by Chapman and Hall/CRC Press, and planned to appear as Chapter 4.4 with the title 'Parametric methods'.) 1 Introduction: This chapter considers inference procedures for parametric spatial point process models. The widespread use of sensible but ad hoc methods based on summary statistics of the kind studied in Chapter 4.3 has through the last two decades been supplemented by likelihood-based methods for parametric spatial point process models [...] is studied in Section 4, and Bayesian inference in Section 5. On one hand, as the development in computer technology and computational statistics continues, computationally-intensive simulation-based methods for likelihood inference probably will play an increasing role for statistical analysis of spatial
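
    For readers who want a concrete starting point, the sketch below simulates the simplest parametric model in this setting, a homogeneous Poisson point process on a rectangular window. It is an illustrative baseline only, not taken from the chapter; the intensity and window size are arbitrary.

```python
import numpy as np

def simulate_poisson_process(intensity, width, height, rng=None):
    """Simulate a homogeneous Poisson point process with the given
    intensity (points per unit area) on a rectangular window."""
    rng = np.random.default_rng() if rng is None else rng
    n = rng.poisson(intensity * width * height)      # random number of points
    x = rng.uniform(0, width, size=n)
    y = rng.uniform(0, height, size=n)
    return np.column_stack([x, y])

pts = simulate_poisson_process(intensity=50, width=2.0, height=1.0,
                               rng=np.random.default_rng(3))
print(len(pts))   # roughly 100 points on average
```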

  2. Survey of engineering computational methods and experimental programs for estimating supersonic missile aerodynamic characteristics

    Science.gov (United States)

    Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.

    1982-01-01

    This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.

  3. Survey of meshless and generalized finite element methods: A unified approach

    Science.gov (United States)

    Babuška, Ivo; Banerjee, Uday; Osborn, John E.

    In the past few years meshless methods for numerically solving partial differential equations have come into the focus of interest, especially in the engineering community. This class of methods was essentially stimulated by difficulties related to mesh generation. Mesh generation is delicate in many situations, for instance, when the domain has complicated geometry; when the mesh changes with time, as in crack propagation, and remeshing is required at each time step; when a Lagrangian formulation is employed, especially with nonlinear PDEs. In addition, the need for flexibility in the selection of approximating functions (e.g., the flexibility to use non-polynomial approximating functions), has played a significant role in the development of meshless methods. There are many recent papers, and two books, on meshless methods; most of them are of an engineering character, without any mathematical analysis. In this paper we address meshless methods and the closely related generalized finite element methods for solving linear elliptic equations, using variational principles. We give a unified mathematical theory with proofs, briefly address implementational aspects, present illustrative numerical examples, and provide a list of references to the current literature. The aim of the paper is to provide a survey of a part of this new field, with emphasis on mathematics. We present proofs of essential theorems because we feel these proofs are essential for the understanding of the mathematical aspects of meshless methods, which has approximation theory as a major ingredient. As always, any new field is stimulated by and related to older ideas. This will be visible in our paper.
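
    One simple way to see the "non-polynomial approximating functions" mentioned above in action is scattered-data interpolation with radial basis functions, a building block shared by several meshless methods. The sketch below is an illustrative Gaussian-RBF interpolant, not a meshless PDE solver from the survey; the test function, shape parameter and sample points are assumptions.

```python
import numpy as np

def rbf_interpolate(centers, values, query, eps=1.0):
    """Meshless scattered-data interpolation with Gaussian radial basis
    functions: solve A w = f with A_ij = exp(-(eps*||x_i - x_j||)^2),
    then evaluate s(x) = sum_j w_j * phi(||x - x_j||)."""
    centers = np.asarray(centers, float)
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    A = np.exp(-(eps * d) ** 2)
    A += 1e-9 * np.eye(len(centers))                 # tiny ridge for stability
    w = np.linalg.solve(A, np.asarray(values, float))
    dq = np.linalg.norm(np.asarray(query, float)[:, None, :] -
                        centers[None, :, :], axis=-1)
    return np.exp(-(eps * dq) ** 2) @ w

# Hypothetical scattered samples of f(x, y) = sin(x) + y
rng = np.random.default_rng(4)
pts = rng.uniform(0, 3, size=(40, 2))
vals = np.sin(pts[:, 0]) + pts[:, 1]
print(rbf_interpolate(pts, vals, np.array([[1.5, 1.0]])))  # approx sin(1.5) + 1.0
```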

  4. Design and methods in a survey of living conditions in the Arctic - the SLiCA study.

    Science.gov (United States)

    Eliassen, Bent-Martin; Melhus, Marita; Kruse, Jack; Poppel, Birger; Broderstad, Ann Ragnhild

    2012-03-19

    The main objective of this study is to describe the methods and design of the survey of living conditions in the Arctic (SLiCA), relevant participation rates and the distribution of participants, as applicable to the survey data in Alaska, Greenland and Norway. This article briefly addresses possible selection bias in the data and also the ways to tackle it in future studies. Population-based cross-sectional survey. Indigenous individuals aged 16 years and older, living in Greenland, Alaska and in traditional settlement areas in Norway, were invited to participate. Random sampling methods were applied in Alaska and Greenland, while non-probability sampling methods were applied in Norway. Data were collected in 3 periods: in Alaska, from January 2002 to February 2003; in Greenland, from December 2003 to August 2006; and in Norway, in 2003 and from June 2006 to June 2008. The principal method in SLiCA was standardised face-to-face interviews using a questionnaire. A total of 663, 1,197 and 445 individuals were interviewed in Alaska, Greenland and Norway, respectively. Very high overall participation rates of 83% were obtained in Greenland and Alaska, while a more conventional rate of 57% was achieved in Norway. A predominance of female respondents was obtained in Alaska. Overall, the Sami cohort is older than the cohorts from Greenland and Alaska. Preliminary assessments suggest that selection bias in the Sami sample is plausible but not a major threat. Few or no threats to validity are detected in the data from Alaska and Greenland. Despite different sampling and recruitment methods, and sociocultural differences, a unique database has been generated, which shall be used to explore relationships between health and other living conditions variables.

  5. Improvement of nuclear reactor component materials by application of hot isostatic processing (HIP). Survey report on Phase 1

    International Nuclear Information System (INIS)

    Mueller, J.J.

    1975-12-01

    The report summarizes the results of an EPRI-sponsored state-of-the-art survey of hot isostatic processing (HIP). The purpose of the study was to identify potential nuclear plant applications of HIP with high pay-off through improvement in component quality and reliability. The survey shows that HIP will reduce cost and manufacturing time and improve quality and ease of nondestructive examination of all castings for which porosity is a problem. Nuclear valves are a prime example. Tubing, pipe, and sheet and bar present other possibilities of somewhat less immediate promise. This report includes a review of some of the EPRI motivations for undertaking this research; a brief explanation of HIP; the survey methodology employed; the basic operations in the processes studied; a review of the historical applications of HIP to problem areas consistent with those addressed in the survey; the results of the survey and associated analyses of the problems; and the recommendations and justifications for the Phase II program.

  6. Structure from motion, a low cost, very high resolution method for surveying glaciers using GoPros and opportunistic helicopter flights

    Science.gov (United States)

    Girod, L.; Nuth, C.; Schellenberger, T.

    2014-12-01

    The capability of structure from motion techniques to survey glaciers with a very high spatial and temporal resolution is a promising tool for better understanding the dynamic changes of glaciers. Modern software and computing power allow us to produce accurate data sets from low cost surveys, thus improving the observational capabilities on a wider range of glaciers and glacial processes. In particular, highly accurate glacier volume change monitoring and 3D movement computations will be possible. Taking advantage of the helicopter flight needed to survey the ice stakes on Kronenbreen, NW Svalbard, we acquired high resolution photogrammetric data over the well-studied Midre Lovénbreen in September 2013. GoPro Hero 2 cameras were attached to the landing gear of the helicopter, acquiring two images per second. A C/A code based GPS was used for registering the stereoscopic model. Camera clock calibration is obtained through fitting together the shapes of the flight given by both the GPS logger and the relative orientation of the images. A DEM and an ortho-image are generated at 30 cm resolution from the 300 images collected. The comparison with a 2005 LiDAR DEM (5 m resolution) shows an absolute error in the direct registration of about 6±3 m in 3D, which could be easily reduced to 1.5±1 m by using fine point cloud alignment algorithms on stable ground. Due to the different nature of the acquisition method, it was not possible to use tie point based co-registration. A combination of the DEM and ortho-image is shown with the point cloud in the figure below. A second photogrammetric data set will be acquired in September 2014 to survey the annual volume change and movement. These measurements will then be compared to the annual resolution glaciological stake mass balance and velocity measurements to assess the precision of the method to monitor at an annual resolution.
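
    As an illustration of the DEM co-registration and differencing workflow described in this record, the sketch below removes a vertical bias estimated over stable ground before computing elevation change. It is a deliberately simplified stand-in (bias-only alignment, synthetic grids, hypothetical function names) for the point-cloud alignment the authors mention, not their processing chain.

        # Minimal sketch: vertical-bias co-registration over stable ground, then DEM differencing.
        # Synthetic data; a real workflow would use full 3D point-cloud alignment (e.g. ICP).
        import numpy as np

        def coregister_and_difference(dem_new, dem_old, stable_mask):
            """Remove the median elevation bias over stable ground, then difference the DEMs."""
            bias = np.nanmedian((dem_new - dem_old)[stable_mask])
            dh = (dem_new - bias) - dem_old              # elevation change in metres
            return dh, bias

        # Synthetic example: a 3 m registration offset plus thinning over a "glacier" patch
        old = np.random.normal(500.0, 50.0, (100, 100))
        new = old + 3.0
        new[40:60, 40:60] -= 5.0
        stable = np.ones_like(old, dtype=bool)
        stable[40:60, 40:60] = False                     # exclude the glacier from the bias estimate
        dh, bias = coregister_and_difference(new, old, stable)
        print(f"bias: {bias:.2f} m, mean glacier dh: {dh[40:60, 40:60].mean():.2f} m")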

  7. Methods Used to Deal with Peace Process Spoilers

    Directory of Open Access Journals (Sweden)

    MA. Bilbil Kastrati

    2014-06-01

    Full Text Available The conflicts of the past three decades have shown that the major problems which peace processes face are the spoilers. Spoilers are warring parties and their leaders who believe that peaceful settlement of disputes threatens their interests, power and their reputation; therefore, they use all means to undermine or completely spoil the process. Spoilers of peace processes can be inside or outside of the process and are characterized as limited, greedy or total spoilers. Their motives for spoiling can be different, such as: political, financial, ethnic, security, etc. Furthermore, it is important to emphasise that spoilers are not only rebels and insurgents, but can often be governments, diasporas, warlords, private military companies, etc. In order to counteract the spoilers, the international community has adopted and implemented three methods: inducement, socialization and coercion. Often all three methods are used to convince the spoilers to negotiate, accept and implement peace agreements. Hence, this paper will examine the methods used to deal with peace process spoilers through an assessment of the strategies employed, their impact, successes and failures. This paper will also argue that the success or failure of the peace process depends on the method(s) used to deal with spoilers. If the right method is chosen, with a persistent engagement of the international community, the peace process will be successful; on the contrary, if they fail to do so, the consequences will be devastating.

  8. Methods of extending signatures and training without ground information. [data processing, pattern recognition

    Science.gov (United States)

    Henderson, R. G.; Thomas, G. S.; Nalepka, R. F.

    1975-01-01

    Methods of performing signature extension, using LANDSAT-1 data, are explored. The emphasis is on improving the performance and cost-effectiveness of large area wheat surveys. Two methods were developed: ASC and MASC. Two methods, Ratio and RADIFF, previously used with aircraft data were adapted to and tested on LANDSAT-1 data. An investigation into the sources and nature of between-scene data variations was included. Initial investigations into the selection of training fields without in situ ground truth were undertaken.

  9. Geophysical methods for evaluation of plutonic rocks

    International Nuclear Information System (INIS)

    Gibb, R.A.; Scott, J.S.

    1986-04-01

    Geophysical methods are systematically described according to the physical principle and operational mode of each method, the type of information produced, limitations of a technical and/or economic nature, and the applicability of the method to rock-mass evaluation at Research Areas of the Nuclear Fuel Waste Management Program. The geophysical methods fall into three categories: (1) airborne and other reconnaissance surveys, (2) detailed or surface (ground) surveys, and (3) borehole or subsurface surveys. The possible roles of each method in the site-screening and site-evaluation processes of disposal vault site selection are summarized

  10. Studies of neutron methods for process control and criticality surveillance of fissile material processing facilities

    International Nuclear Information System (INIS)

    Zoltowski, T.

    1988-01-01

    The development of radiochemical processes for fissile material processing and spent fuel handling needs new control procedures enabling an improvement of plant throughput. This is strictly related to the implementation of a continuous criticality control policy and to developing reliable methods for monitoring the reactivity of radiochemical plant operations in the presence of process perturbations. Neutron methods seem to be applicable for fissile material control in some technological facilities. The measurement of epithermal neutron source multiplication with heuristic evaluation of measured data enables surveillance of anomalous reactivity enhancement leading to unsafe states. 80 refs., 47 figs., 33 tabs. (author)

  11. Comparative analysis of accelerogram processing methods

    International Nuclear Information System (INIS)

    Goula, X.; Mohammadioun, B.

    1986-01-01

    The work described hereinafter is a short development of an on-going research project concerning high-quality processing of strong-motion recordings of earthquakes. Several processing procedures have been tested, applied to synthetic signals simulating ground motion designed for this purpose. The methods of correction operating in the time domain are seen to be strongly dependent upon the sampling rate. Two methods of low-frequency filtering followed by an integration of accelerations yielded satisfactory results.
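
    The correction strategy that performed well here, low-frequency filtering followed by integration of the accelerations, can be sketched as follows. This is an illustrative implementation with an arbitrary corner frequency and a synthetic record, not one of the procedures tested in the report.

        # Illustrative sketch: high-pass filter an accelerogram, then integrate to velocity and displacement.
        import numpy as np
        from scipy.signal import butter, filtfilt
        from scipy.integrate import cumulative_trapezoid

        def filter_and_integrate(acc, dt, fc=0.1, order=4):
            """High-pass filter (corner fc in Hz), then integrate twice."""
            b, a = butter(order, fc, btype="highpass", fs=1.0 / dt)
            acc_f = filtfilt(b, a, acc)
            vel = cumulative_trapezoid(acc_f, dx=dt, initial=0.0)
            disp = cumulative_trapezoid(vel, dx=dt, initial=0.0)
            return acc_f, vel, disp

        # Synthetic test signal sampled at 100 Hz: harmonic motion plus a slow drift
        dt = 0.01
        t = np.arange(0.0, 20.0, dt)
        acc = np.sin(2 * np.pi * 1.5 * t) + 0.05 * t
        acc_f, vel, disp = filter_and_integrate(acc, dt)
        print(disp[-1])          # residual displacement after removing the low-frequency drift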

  12. Vertical Cable Seismic Survey for SMS exploration

    Science.gov (United States)

    Asakawa, Eiichi; Murakami, Fumitoshi; Tsukahara, Hotoshi; Mizohata, Shigeharu

    2014-05-01

    The Vertical Cable Seismic (VCS) survey is one of the reflection seismic methods. It uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by sea-surface, deep-towed or ocean bottom sources. By analyzing the reflections from the sub-seabed, we can look into the subsurface structure. Because the VCS is an efficient high-resolution 3D seismic survey method for a spatially-bounded area, we proposed it for the SMS survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We have been developing the VCS survey system, including not only the data acquisition hardware but also the data processing and analysis techniques. We carried out several VCS surveys combined with surface-towed, deep-towed and ocean bottom sources. The water depths of these surveys range from 100 m up to 2100 m. Through these experiments, our VCS data acquisition system has been completed, but the data processing techniques are still under development. One of the most critical issues is positioning in the water. The uncertainty in the positions of the source and of the hydrophones in the water degrades the quality of the subsurface image. A GPS navigation system is available at the sea surface, but in the case of a deep-towed or ocean bottom source, the accuracy of the shot position from SSBL/USBL is not sufficient for very high-resolution imaging. We have developed a new approach to determine positions in the water using the travel time data from the source to the VCS hydrophones. In 2013, we carried out the second VCS survey using a surface-towed high-voltage sparker and an ocean bottom source in the Izena Cauldron, which is one of the most promising SMS areas around Japan. The positions of the ocean bottom source estimated by this method are consistent with the VCS field records. The VCS data with the sparker have been processed with 3D PSTM. It gives the very high resolution 3D volume deeper than two
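
    The travel-time positioning idea mentioned above can be illustrated with a simple least-squares inversion: given hydrophone coordinates, observed first-arrival times and an assumed constant water velocity, solve for the source position. The geometry, velocity and function names below are hypothetical; the authors' actual algorithm is not reproduced here.

        # Illustrative sketch: locate a source from travel times to hydrophones of known position.
        import numpy as np
        from scipy.optimize import least_squares

        def locate_source(hydrophones, travel_times, c=1500.0, x0=None):
            """Least-squares source position; hydrophones is (N, 3) in metres, travel_times is (N,) in s."""
            def residuals(x):
                return np.linalg.norm(hydrophones - x, axis=1) / c - travel_times
            if x0 is None:
                x0 = hydrophones.mean(axis=0)
            return least_squares(residuals, x0).x

        # Synthetic example: three vertical cables, four hydrophones each, source near the seafloor
        cables_xy = [(0.0, 0.0), (400.0, 0.0), (200.0, 350.0)]
        rec = np.array([[x, y, -z] for x, y in cables_xy for z in (1400, 1600, 1800, 2000)])
        true_src = np.array([250.0, -120.0, -2100.0])
        tt = np.linalg.norm(rec - true_src, axis=1) / 1500.0
        print(locate_source(rec, tt))        # should be close to true_src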

  13. Wet treatment of ashes, a survey of methods; Vaat rening av askor, metodoeversikt

    Energy Technology Data Exchange (ETDEWEB)

    Bjurstroem, Henrik [AaF-Energi och Miljoe AB, Stockhom (Sweden); Steenari, Britt-Marie [Chalmers Univ. of Technology, Goeteborg (Sweden)

    2003-10-01

    Ash contains elements and compounds that are questionable from an environmental point of view, such as very soluble salts, alkali yielding a high pH value, metals, heavy metals and organic compounds. When ash is to be used, it is required to be stable, i.e. it must not influence the immediate surroundings and the environment in a negative way. Stability means that water that comes into contact with ash shall not pick up environmentally disruptive compounds to any significant extent. The presence of heavy metals in the ash does not always lead to their being leached to the surroundings, but it does always imply an uncertainty. It is probable that fly ash from incineration of municipal solid waste has to be treated in some way before it is landfilled. Washing the ash or dissolving it partially with a solvent such as water or an acid is a relatively simple method to reduce the risk for contamination of the environment by removing soluble compounds from the ash. Such methods consist of techniques that are proven and robust in other applications, and that may be adapted to the present conditions: the composition and the properties of the ash. In this report, a survey of methods is presented. Wet treatments may be understood as a combined separation and concentration process: on the one hand environmentally disruptive compounds are removed from the ash, on the other hand these are concentrated in a smaller residual fraction. These methods are a perfect pretreatment for various stabilization methods, among others thermal treatments such as vitrifying or sintering, or for utilization of the ashes, e.g. in public works, as they remove the obstacles to a good performance, namely the soluble salts. The report presents a systematic description of wet treatments aiming at purification and a survey of methods of industrial interest. A certain number of wet treatment methods are in operation outside Sweden, principally for fly ash from municipal solid waste

  14. Processing methods for temperature-dependent MCNP libraries

    International Nuclear Information System (INIS)

    Li Songyang; Wang Kan; Yu Ganglin

    2008-01-01

    In this paper, the processing method of NJOY, which transfers ENDF files to ACE (A Compact ENDF) files (the point-wise cross-section files used by the MCNP program), is discussed. Temperatures that cover the range for reactor design and operation are considered. Three benchmarks are used for testing the method: the Jezebel Benchmark, the 28 cm-thick Slab Core Benchmark and the LWR Benchmark with Burnable Absorbers. The calculation results showed the precision of the neutron cross-section library and verified the correctness of the NJOY processing methods. (authors)

  15. Worldwide trends of the mine surveying services

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Z. (Coal Science Research Institute, Tangshan (China))

    1991-08-01

    The author discusses the role of mine surveying in the present and future mining industry, and summarizes the main trends of development as follows: active training of specialists and technical experts; employment of advanced surveying technology and data and image processing technology; research and development of measuring methods and equipment for accurate measurement of strata movement and deformation; and accurate forecasts of ground motion or earthquakes.

  16. What is the Process Approvals for Survey Research in the Department of Defense (DoD)

    Science.gov (United States)

    2017-04-26

    persons as individuals/representatives of agencies that elicit attitudes, opinions, behavior, and related demographic, social, and economic data to... processes are reported to be confusing. The survey approval process between services is inconsistent and time consuming. Barriers, real or perceived... The working group formed as part of the Behavioral Health Research Interest Group

  17. [Sampling and measurement methods of the protocol design of the China Nine-Province Survey for blindness, visual impairment and cataract surgery].

    Science.gov (United States)

    Zhao, Jia-liang; Wang, Yu; Gao, Xue-cheng; Ellwein, Leon B; Liu, Hu

    2011-09-01

    To design the protocol of the China nine-province survey for blindness, visual impairment and cataract surgery, in order to evaluate the prevalence and main causes of blindness and visual impairment, and the prevalence and outcomes of cataract surgery. The protocol design began after accepting the task for the national survey for blindness, visual impairment and cataract surgery from the Department of Medicine, Ministry of Health, China, in November 2005. The protocols of the Beijing Shunyi Eye Study in 1996 and the Guangdong Doumen County Eye Study in 1997, both supported by the World Health Organization, were taken as the basis for the protocol design. Relevant experts were invited to discuss and refine the draft protocol. An international advisory committee was established to examine and approve the draft protocol. Finally, the survey protocol was checked and approved by the Department of Medicine, Ministry of Health, China and the Prevention Program of Blindness and Deafness, WHO. The survey protocol was designed according to the characteristics and the scale of the survey. The contents of the protocol included determination of the target population and survey sites, calculation of the sample size, design of the random sampling, composition and organization of the survey teams, determination of the examinees, the flowchart of the field work, survey items and methods, diagnostic criteria for blindness and moderate and severe visual impairment, measures of quality control, and methods of data management. The designed protocol became the standard and practical protocol for the survey to evaluate the prevalence and main causes of blindness and visual impairment, and the prevalence and outcomes of cataract surgery.
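
    The protocol item "calculation of the sample size" can be illustrated with the standard formula for a prevalence survey, n = deff · z² · p(1 − p) / d², inflated for expected non-response. The parameter values below are purely illustrative and are not those used in the China nine-province survey.

        # Standard sample-size formula for a prevalence survey (illustrative parameters only).
        import math

        def prevalence_sample_size(p, d, deff=1.0, z=1.96, response_rate=1.0):
            """n = deff * z^2 * p * (1 - p) / d^2, inflated for expected non-response."""
            n = deff * z**2 * p * (1.0 - p) / d**2
            return math.ceil(n / response_rate)

        # e.g. expected prevalence 4%, precision +/-1%, design effect 2, 90% expected response
        print(prevalence_sample_size(p=0.04, d=0.01, deff=2.0, response_rate=0.9))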

  18. Methods of removing uranium from drinking water. 1. A literature survey. 2. Present municipal water treatment and potential removal methods

    International Nuclear Information System (INIS)

    Drury, J.S.; Michelson, D.; Ensminger, J.T.; Lee, S.Y.; White, S.K.

    1982-12-01

    Literature was searched for methods of removing uranium from drinking water. U.S. manufacturers and users of water-treatment equipment and products were also contacted regarding methods of removing uranium from potable water. Based on the results of these surveys, it was recommended that untreated, partially treated, and finished water samples from municipal water-treatment facilities be analyzed to determine the extent of removal of uranium by presently used procedures, and that additional laboratory studies be performed to determine what changes are needed to maximize the effectiveness of treatments that are already in use in existing water-treatment plants

  19. Lesson learned - CGID based on the Method 1 and Method 2 for digital equipment

    International Nuclear Information System (INIS)

    Hwang, Wonil; Sohn, Kwang Young; Cho, Chang Hwan; Kim, Sung Jong

    2015-01-01

    The acceptance methods associated with commercial-grade dedication are the following: 1) special tests and inspections (Method 1); 2) commercial-grade surveys (Method 2); 3) source verification (Method 3); and 4) an acceptable item and supplier performance record (Method 4). Special tests and inspections, often referred to as Method 1, are performed by the dedicating entity after the item is received to verify selected critical characteristics. Conducting a commercial-grade survey of a supplier is often referred to as Method 2; supplier audits to verify compliance with a nuclear QA program do not meet the intent of a commercial-grade survey. Source verification, often referred to as Method 3, entails verification of critical characteristics during manufacture and testing of the item being procured. The performance history (good or bad) of the item and supplier is a consideration when determining the use of the other acceptance methods and the rigor with which they are used on a case-by-case basis. Some digital equipment has delivery references and an operating history in nuclear power plants, as far as could be surveyed. However, it was found that collecting the supporting data sheets is difficult, so suppliers usually decide to conduct the dedication based on Method 1 and Method 2, as in an initial qualification. It is conceived that Method 4 might be a better approach for CGID (Commercial Grade Item Dedication), even though there are some difficulties in assembling the data package that justifies CGID from the vendor and the operating organization. This paper presents the lessons learned from consulting on Method 1 and Method 2 dedication of digital equipment. Considering all the information above, a couple of issues should be kept in mind when performing CGID under Method 2. In conducting a commercial-grade survey based on Method 2, quality personnel as well as technical engineers shall be involved for an integral dedication. Other than this, the review of critical

  20. University Community Engagement and the Strategic Planning Process

    Directory of Open Access Journals (Sweden)

    Laura Newton Miller

    2018-03-01

    Full Text Available Abstract  Objectives – To understand how university libraries are engaging with the university community (students, faculty, campus partners, and administration) when working through the strategic planning process.  Methods – Literature review and exploratory open-ended survey to members of CAUL (Council of Australian University Librarians), CARL (Canadian Association of Research Libraries), CONZUL (Council of New Zealand University Librarians), and RLUK (Research Libraries UK) who are most directly involved in the strategic planning process at their library.  Results – Out of a potential 113 participants from 4 countries, 31 people (27%) replied to the survey. Libraries most often mentioned the use of regularly-scheduled surveys to inform their strategic planning, which helps to truncate the process for some respondents, as opposed to conducting user feedback specifically for the strategic planning process. Other quantitative methods include customer intelligence and library-produced data. Qualitative methods include the use of focus groups, interviews, and user experience/design techniques to help inform the strategic plan. The focus of questions to users tended to fall towards user-focused (with or without library lens), library-focused, trends and vision, and feedback on plan.  Conclusions – Combining both quantitative and qualitative methods can help give a fuller picture for librarians working on a strategic plan. Having the university community join the conversation on how the library moves forward is an important but difficult endeavour.  Regardless, the university library needs to be adaptive to the rapidly changing environment around it. Having a sense of how other libraries engage with the university community benefits others who are tasked with strategic planning.

  1. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys.

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-09-01

    To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. We compared 148 MSM aged 18-64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010-2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%-95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  2. Photometric calibration of the COMBO-17 survey with the Softassign Procrustes Matching method

    Science.gov (United States)

    Sheikhbahaee, Z.; Nakajima, R.; Erben, T.; Schneider, P.; Hildebrandt, H.; Becker, A. C.

    2017-11-01

    Accurate photometric calibration of optical data is crucial for photometric redshift estimation. We present the Softassign Procrustes Matching (SPM) method to improve the colour calibration upon the commonly used Stellar Locus Regression (SLR) method for the COMBO-17 survey. Our colour calibration approach can be categorised as a point-set matching method, which is frequently used in medical imaging and pattern recognition. We attain a photometric redshift precision Δz/(1 + zs) of better than 2 per cent. Our method is based on aligning the stellar locus of the uncalibrated stars to that of a spectroscopic sample of the Sloan Digital Sky Survey standard stars. We achieve our goal by finding a correspondence matrix between the two point-sets and applying the matrix to estimate the appropriate translations in multidimensional colour space. The SPM method is able to find the translation between two point-sets, despite the existence of noise and incompleteness of the common structures in the sets, as long as there is a distinct structure in at least one of the colour-colour pairs. We demonstrate the precision of our colour calibration method with a mock catalogue. The SPM colour calibration code is publicly available at https://neuronphysics@bitbucket.org/neuronphysics/spm.git.
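
    The core idea, estimating a translation in colour space from soft correspondences between two stellar-locus point sets, can be sketched in a few lines. This toy version handles only a translation with a fixed kernel width and synthetic loci; the full SPM implementation is the code released at the Bitbucket URL above.

        # Toy sketch of translation estimation between two point sets via soft correspondences.
        import numpy as np

        def soft_translation(src, ref, sigma=0.05, n_iter=50):
            """Estimate t so that src + t aligns with ref (both (N, d) arrays)."""
            t = ref.mean(axis=0) - src.mean(axis=0)              # crude initialisation
            for _ in range(n_iter):
                d2 = (((src + t)[:, None, :] - ref[None, :, :]) ** 2).sum(axis=2)
                w = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / (2 * sigma**2))
                w /= w.sum(axis=1, keepdims=True)                # soft correspondences per source star
                matched = w @ ref                                # weighted "virtual" match for each star
                t = (matched - src).mean(axis=0)
            return t

        # Synthetic colour-colour locus with a known (0.08, -0.05) mag zero-point offset
        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 2.0, 300)
        locus = np.column_stack([x, 0.5 * x + 0.3 * np.sin(3.0 * x)])
        obs = locus[:200] + np.array([0.08, -0.05]) + rng.normal(0.0, 0.01, (200, 2))
        print(soft_translation(obs, locus))                      # roughly [-0.08, 0.05]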

  3. Optimizing household survey methods to monitor the Sustainable Development Goals targets 6.1 and 6.2 on drinking water, sanitation and hygiene: A mixed-methods field-test in Belize.

    Science.gov (United States)

    Khan, Shane M; Bain, Robert E S; Lunze, Karsten; Unalan, Turgay; Beshanski-Pedersen, Bo; Slaymaker, Tom; Johnston, Richard; Hancioglu, Attila

    2017-01-01

    The Sustainable Development Goals (SDGs) require household survey programmes such as the UNICEF-supported Multiple Indicator Cluster Surveys (MICS) to enhance data collection to cover new indicators. This study aims to evaluate methods for assessing water quality, water availability, emptying of sanitation facilities, menstrual hygiene management and the acceptability of water quality testing in households, which are key to monitoring SDG targets 6.1 and 6.2 on drinking water, sanitation and hygiene (WASH) and emerging issues. As part of a MICS field test, we interviewed 429 households and 267 women aged 15-49 in Stann Creek, Belize in a split-sample experiment. In a concurrent qualitative component, we conducted focus groups with interviewers and cognitive interviews with respondents during and immediately following questionnaire administration in the field to explore their question comprehension and response processes. About 88% of respondents agreed to water quality testing but also desired test results, given the potential implications for their own health. Escherichia coli was present in 36% of drinking water samples collected at the source, and in 47% of samples consumed in the household. Both questions on water availability necessitated probing by interviewers. About one quarter of households reported emptying of pit latrines and septic tanks, though one quarter could not provide an answer to the question. Asking questions on menstrual hygiene was acceptable to respondents, but required some clarification and probing. In the context of Belize, this study confirmed the feasibility of collecting information on the availability and quality of drinking water, emptying of sanitation facilities and menstrual hygiene in a multi-purpose household survey, indicating specific areas to improve question formulation and field protocols. Improvements have been incorporated into the latest round of MICS surveys, which will be a major source of national data for monitoring of SDG

  4. Methods and introductory results of the Greek national health and nutrition survey - HYDRIA

    Directory of Open Access Journals (Sweden)

    Georgia Martimianaki

    2018-06-01

    Full Text Available Background: According to a large prospective cohort study (with baseline examination in the 1990s) and smaller studies that followed, the population in Greece has been gradually deprived of the favorable morbidity and mortality indices recorded in the 1960s. The HYDRIA survey, conducted in 2013-14, is the first nationally representative survey that collected data related to the health and nutrition of the population in Greece. Methods: The survey sample consists of 4011 males (47%) and females aged 18 years and over. Data collection included interviewer-administered questionnaires on personal characteristics, lifestyle choices, dietary habits and medical history; measurements of somatometry and blood pressure; and blood drawing. Weighting factors were applied to ensure national representativeness of results. Results: Three out of five adults in Greece reported suffering from a chronic disease, with diabetes mellitus and chronic depression being the more frequent ones among older individuals. The population is also experiencing an overweight/obesity epidemic, since seven out of 10 adults are either overweight or obese. In addition, 40% of the population bears indications of hypertension. Smoking is still common, and among women the prevalence was higher in younger age groups. Social disparities were observed in the prevalence of chronic diseases and mortality risk factors (hypertension, obesity, impaired lipid profile and high blood glucose levels). Conclusion: Excess body weight, hypertension, the smoking habit and the population's limited physical activity are the predominant challenges that public health officials have to deal with in formulating policies and designing actions for the population in Greece.

  5. PEP surveying procedures and equipment

    International Nuclear Information System (INIS)

    Linker, F.

    1982-06-01

    The PEP Survey and Alignment System, which employs both laser-based and optical survey methods, is described. The laser is operated in conjunction with the Tektronix 4051 computer and surveying instruments such as ARM and SAM, a system designed to automate data input, reduction, and the production of alignment instructions. The laser system is used when surveying ring quadrupoles, main bend magnets, and sextupoles, and is optional when surveying RF cavities and insertion quadrupoles. Optical methods usually require that data be manually entered into the computer for alignment, but in some cases an element can be aligned using nominal values of fiducial locations without use of the computer. Optical surveying is used in the alignment of NIT and SIT, low field bend magnets, wigglers, RF cavities, and insertion quadrupoles.

  6. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2007-01-01

    Full Text Available In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. Here, a comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of the application of robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
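
    As a generic illustration of the contrast between a classical fixed filter and an "optimal" recursive estimator, the sketch below applies a scalar Kalman filter to a noisy sensor signal. It is not one of the sensor designs surveyed in the paper; the noise parameters are arbitrary.

        # Generic illustration: a scalar Kalman filter applied to a noisy sensor reading.
        import numpy as np

        def kalman_1d(z, q=1e-3, r=0.04, x0=0.0, p0=1.0):
            """Filter a noisy measurement sequence z assuming a slowly varying true value."""
            x, p, out = x0, p0, []
            for zk in z:
                p = p + q                      # predict (random-walk state model)
                k = p / (p + r)                # Kalman gain
                x = x + k * (zk - x)           # update with the measurement
                p = (1 - k) * p
                out.append(x)
            return np.array(out)

        rng = np.random.default_rng(5)
        truth = 0.5 * np.sin(np.linspace(0, 2 * np.pi, 500))
        meas = truth + rng.normal(0.0, 0.2, 500)
        est = kalman_1d(meas)
        print(np.std(meas - truth).round(3), np.std(est - truth).round(3))   # noise is reduced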

  7. 78 FR 76285 - Proposed Information Collection; Comment Request; Panel Member Survey To Develop Indicators of...

    Science.gov (United States)

    2013-12-17

    ... mathematically. NOAA will apply the Delphi Method to a multi-round survey of panels of individuals with... gather this information, NOAA will conduct a multi-round, iterative survey process based on the Delphi Method, which is a structured method for eliciting and combining expert opinion. The method requires...

  8. Survey the Effect of Pistachio Waste Composting Process with Different Treatments on Concentration of Heavy Metals

    Directory of Open Access Journals (Sweden)

    M Jalili

    2016-09-01

    Full Text Available Abstract Introduction: Composting is one of the pistachio waste management methods. In producing an appropriate compost, nutrients and heavy metals are determinant factors. The aim of this study is to survey the effect of the pistachio waste composting process with different treatments on the concentration of heavy metals. Methods: In this study, during a 60-day pistachio waste composting process with two treatments, dewatered sewage sludge and cow manure, the pH, EC, carbon-to-nitrogen ratio, heavy metals and nutrient indicators were studied. The results were compared with the WHO and Iranian national standards. Diagrams were drawn with Excel (version 2007) and statistical analysis was performed with SPSS (version 20) at a significance level of 0.005. Results: During the 60-day composting, the pH initially showed a downward trend and then increased. Cu, Zn, Fe, Mn and the C/N ratio showed downward trends, while EC, Na and K showed increasing trends. In the compost produced from pistachio waste with both treatments, iron, zinc, copper and manganese were below the standard limits, sodium was within the standard range and potassium exceeded the specified standards. Conclusion: The results showed that the concentrations of heavy metals and nutrients in the compost produced with both treatments were in the acceptable range. The quality of the compost produced with the cow manure treatment was better than that of the compost processed with the dewatered sewage sludge treatment, owing to better decomposition and greater stability.

  9. Multi-method Near-surface Geophysical Surveys for Site Response and Earthquake Damage Assessments at School Sites in Washington, USA

    Science.gov (United States)

    Cakir, R.; Walsh, T. J.; Norman, D. K.

    2017-12-01

    We, the Washington Geological Survey (WGS), have been performing multi-method near-surface geophysical surveys to help assess potential earthquake damage at public schools in Washington. We have been conducting active and passive seismic surveys, estimating shear-wave velocity (Vs) profiles, and then determining the NEHRP soil classifications based on Vs30m values at school sites in Washington. The survey methods we have used include 1D and 2D MASW and MAM, P- and S-wave refraction, horizontal-to-vertical spectral ratio (H/V), and 2ST-SPAC, to measure Vs and Vp at shallow (0-70 m) and greater depths at the sites. We have also run Ground Penetrating Radar (GPR) surveys at the sites to check possible horizontal subsurface variations along and between the seismic survey lines and at the actual locations of the school buildings. The seismic survey results were then used to calculate Vs30m for determining the NEHRP soil classifications at school sites, and thus soil amplification effects on ground motions. The resulting shear-wave velocity profiles can also be used for site response and liquefaction potential studies, as well as for efforts to improve the national Vs30m database, essential information for ShakeMap and ground motion modeling efforts in Washington and the Pacific Northwest. To estimate casualties and nonstructural and structural losses caused by potential earthquakes in the region, we used these seismic site characterization results, combined with structural engineering evaluations based on ASCE 41 or FEMA 154 (Rapid Visual Screening), as inputs to FEMA Hazus Advanced Engineering Building Module (AEBM) analysis. Compelling example surveys will be presented for school sites in western and eastern Washington.
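
    The Vs30 value behind the NEHRP classification is the time-averaged shear-wave velocity over the uppermost 30 m, Vs30 = 30 / Σ(h_i / Vs_i). The layer values below are illustrative, not data from these school-site surveys.

        # Vs30 from a layered shear-wave velocity profile (illustrative layer values).
        def vs30(thicknesses_m, velocities_ms):
            """Time-averaged shear-wave velocity over the top 30 m."""
            travel_time, depth = 0.0, 0.0
            for h, v in zip(thicknesses_m, velocities_ms):
                use = min(h, 30.0 - depth)
                if use <= 0.0:
                    break
                travel_time += use / v
                depth += use
            return 30.0 / travel_time

        # Example profile: 5 m at 180 m/s, 12 m at 320 m/s, halfspace at 550 m/s
        print(vs30([5.0, 12.0, 1e9], [180.0, 320.0, 550.0]))   # ~337 m/s, NEHRP class D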

  10. System and method for deriving a process-based specification

    Science.gov (United States)

    Hinchey, Michael Gerard (Inventor); Rash, James Larry (Inventor); Rouff, Christopher A. (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  11. Comparative Study of Different Processing Methods for the ...

    African Journals Online (AJOL)

    The result of the two processing methods reduced the cyanide concentration to the barest minimum level required by World Health Organization (10mg/kg). The mechanical pressing-fermentation method removed more cyanide when compared to fermentation processing method. Keywords: Cyanide, Fermentation, Manihot ...

  12. Collimation method using an image processing technique for an assembling-type antenna

    Science.gov (United States)

    Okuyama, Toshiyuki; Kimura, Shinichi; Fukase, Yutaro; Ueno, Hiroshi; Harima, Kouichi; Sato, Hitoshi; Yoshida, Tetsuji

    1998-10-01

    To construct highly precise space structures, such as antennas, it is essential to be able to collimate them with high precision by remote operation. Surveying techniques that are commonly used for collimating ground-based antennas cannot be applied to space systems, since they require relatively sensitive and complex instruments. In this paper, we propose a collimation method that is applied to mark-patterns mounted on an antenna dish for detecting very slight displacements. By calculating a cross-correlation function between the target and reference mark-patterns, and by interpolating this calculated function, we can measure the displacement of the target mark-pattern with sub-pixel precision. We developed a test-bed for the measuring system and evaluated several mark-patterns suitable for our image processing technique. A mark-pattern was found that enabled displacement detection within an RMS error of 1/100 pixel. Several tests conducted using this chosen pattern verified the robustness of the method to different light conditions and alignment errors. This collimating method is designed for application to an assembling-type antenna which is being developed by the Communications Research Laboratory.
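
    The displacement measurement described above, cross-correlating a target mark-pattern with a reference and interpolating the correlation peak, can be sketched as follows. Parabolic peak interpolation and a Gaussian test pattern are used here as simple stand-ins; the authors' interpolation scheme and mark-patterns may differ.

        # Sketch: sub-pixel displacement from cross-correlation plus parabolic peak interpolation.
        import numpy as np
        from scipy.signal import correlate

        def subpixel_shift(reference, target):
            """Estimate (dy, dx) of target relative to reference."""
            ref = reference - reference.mean()
            tgt = target - target.mean()
            cc = correlate(tgt, ref, mode="same")
            iy, ix = np.unravel_index(np.argmax(cc), cc.shape)

            def parabolic(c_m, c_0, c_p):      # peak offset estimated from three samples
                denom = c_m - 2 * c_0 + c_p
                return 0.0 if denom == 0 else 0.5 * (c_m - c_p) / denom

            dy = iy - cc.shape[0] // 2 + parabolic(cc[iy - 1, ix], cc[iy, ix], cc[iy + 1, ix])
            dx = ix - cc.shape[1] // 2 + parabolic(cc[iy, ix - 1], cc[iy, ix], cc[iy, ix + 1])
            return dy, dx

        # Synthetic mark-pattern shifted by a known sub-pixel amount
        yy, xx = np.mgrid[0:64, 0:64]
        ref = np.exp(-((xx - 32.0)**2 + (yy - 32.0)**2) / 40.0)
        tgt = np.exp(-((xx - 32.6)**2 + (yy - 31.8)**2) / 40.0)
        print(subpixel_shift(ref, tgt))        # approximately (-0.2, 0.6)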

  13. U.S. residential consumer product information: Validation of methods for post-stratification weighting of Amazon Mechanical Turk surveys

    Energy Technology Data Exchange (ETDEWEB)

    Greenblatt, Jeffery B. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yang, Hung-Chia [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Desroches, Louis-Benoit [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Young, Scott J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Beraki, Bereket [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Sarah K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Pratt, Stacy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Willem, Henry [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Donovan, Sally M. [Consultant, Melbourne (Australia)

    2013-04-01

    We present two post-stratification weighting methods to validate survey data collected using Amazon Mechanical Turk (AMT). Two surveys focused on appliance and consumer electronics devices were administered in the spring and summer of 2012 to each of approximately 3,000 U.S. households. Specifically, the surveys asked questions about residential refrigeration products, televisions (TVs) and set-top boxes (STBs). Filtered data were assigned weights using each of two weighting methods, termed “sequential” and “simultaneous,” by examining up to eight demographic variables (income, education, gender, race, Hispanic origin, number of occupants, ages of occupants, and geographic region) in comparison to reference U.S. demographic data from the 2009 Residential Energy Consumption Survey (RECS). Five key questions from the surveys (number of refrigerators, number of freezers, number of TVs, number of STBs and primary service provider) were evaluated with a set of statistical tests to determine whether either method improved the agreement of AMT with reference data, and if so, which method was better. The statistical tests used were: differences in proportions, distributions of proportions (using Pearson's chi-squared test), and differences in average numbers of devices as functions of all demographic variables. The results indicated that both methods generally improved the agreement between AMT and reference data, sometimes greatly, but that the simultaneous method was usually superior to the sequential method. Some differences in sample populations were found between the AMT and reference data. Differences in the proportion of STBs reflected large changes in the STB market since the time our reference data were acquired in 2009. Differences in the proportions of some primary service providers suggested real sample bias, with the possible explanation that AMT users are more likely to subscribe to providers who also provide home internet service. Differences in
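
    For readers unfamiliar with post-stratification weighting, the sketch below shows one common approach, iterative proportional fitting ("raking") of weights to marginal population targets. It is illustrative only; the report's "sequential" and "simultaneous" methods are its own constructions and are not reproduced here.

        # Illustrative raking (iterative proportional fitting) to marginal population targets.
        import numpy as np

        def rake(sample, margins, n_iter=50):
            """sample: dict of 1-D category arrays per variable (same length).
            margins: dict of target population proportions per variable.
            Returns weights whose weighted margins match the targets."""
            n = len(next(iter(sample.values())))
            w = np.ones(n)
            for _ in range(n_iter):
                for var, target in margins.items():
                    cats = sample[var]
                    total = w.sum()
                    for cat, prop in target.items():
                        mask = cats == cat
                        current = w[mask].sum() / total
                        if current > 0:
                            w[mask] *= prop / current
            return w / w.mean()

        # Toy example: re-weight a sample that over-represents young, high-income respondents
        rng = np.random.default_rng(1)
        s = {"age": rng.choice(["18-39", "40+"], 1000, p=[0.7, 0.3]),
             "income": rng.choice(["low", "high"], 1000, p=[0.3, 0.7])}
        targets = {"age": {"18-39": 0.4, "40+": 0.6}, "income": {"low": 0.5, "high": 0.5}}
        w = rake(s, targets)
        print((w[s["age"] == "18-39"].sum() / w.sum()).round(3))   # ~0.40 after raking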

  14. Survey of evaluation methods for thermal striping in FBR structures

    International Nuclear Information System (INIS)

    Miura, Naoki; Nitta, Akito; Take, Kohji

    1988-01-01

    In the upper core structures or the sodium mixing tees of Fast Breeder Reactors, mixing sodium streams at different temperatures produce rapid temperature fluctuations, namely 'thermal striping', on component surfaces, and there is concern that the resulting high-cycle thermal fatigue causes crack initiation and propagation. Thermal striping is one of the factors considered in FBR component design; however, a standard evaluation method has not been established yet because of the intricacy of the mechanism, the difficulty of experimental verification, the lack of data, and so on. This report surveys the details and present status of methods for evaluating crack initiation and propagation due to thermal striping, and studies an appropriate method that can be used to rationalize the design. It is concluded that a method based on quantitative prediction of crack propagation is optimal for evaluating the thermal striping phenomenon. (author)

  15. RED AND PROCESSED MEAT AND CARDIOVASCULAR RISK FACTOR

    OpenAIRE

    ATALIĆ, BRUNO; TOTH, JURICA; ATALIĆ, VLASTA; RADANOVIĆ, DANIJELA; MIŠKULIN, MAJA; LUČIN, ANA

    2014-01-01

    Aims: The British National Diet and Nutrition 2000/1 Survey data set records on 1,724 respondents (766 males and 958 females) were analyzed in order to assess the potential influences of red and processed meat intakes on cardiovascular risk factors. Methods: Linear regression of the associations of the red, processed, combination of red and processed, and total meat intakes with body mass index (BMI), systolic blood pressure and plasma total cholesterol as cardiovascular risk factors was cond...

  16. Method for double-sided processing of thin film transistors

    Science.gov (United States)

    Yuan, Hao-Chih; Wang, Guogong; Eriksson, Mark A.; Evans, Paul G.; Lagally, Max G.; Ma, Zhenqiang

    2008-04-08

    This invention provides methods for fabricating thin film electronic devices with both front- and backside processing capabilities. Using these methods, high temperature processing steps may be carried out during both frontside and backside processing. The methods are well-suited for fabricating back-gate and double-gate field effect transistors, double-sided bipolar transistors and 3D integrated circuits.

  17. Elevation data fitting and precision analysis of Google Earth in road survey

    Science.gov (United States)

    Wei, Haibin; Luan, Xiaohan; Li, Hanchao; Jia, Jiangkun; Chen, Zhao; Han, Leilei

    2018-05-01

    Objective: In order to improve the efficiency of road surveys and save manpower and material resources, this paper applies Google Earth to the feasibility study stage of road survey and design. Because Google Earth elevation data lack precision, the paper focuses on finding several different fitting or interpolation methods to improve the data precision, so as to meet the accuracy requirements of road survey and design specifications as far as possible. Method: On the basis of the elevation differences at a limited number of known (public) points, the elevation difference at any other point can be fitted or interpolated. Thus, a precise elevation can be obtained by subtracting the elevation difference from the Google Earth data. The quadratic polynomial surface fitting method, the cubic polynomial surface fitting method, the V4 interpolation method in MATLAB and a neural network method are used in this paper to process Google Earth elevation data. Internal conformity, external conformity and the cross-correlation coefficient are used as evaluation indexes for the data processing effect. Results: There is no fitting residual at the fitting points when the V4 interpolation method is used, but its external conformity is the largest and its accuracy improvement is the worst, so the V4 interpolation method is ruled out. The internal and external conformity of the cubic polynomial surface fitting method are both better than those of the quadratic polynomial surface fitting method. The neural network method has a fitting effect similar to that of the cubic polynomial surface fitting method, but performs better where the elevation differences are larger. Because the neural network method is a less manageable fitting model, the cubic polynomial surface fitting method should be used as the main method and the neural network method as an auxiliary method where the elevation differences are larger. Conclusions: Cubic polynomial surface fitting method can obviously
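
    The quadratic-surface fit of elevation differences, together with the internal and external conformity measures (RMS at the fitted points and at withheld check points), can be sketched as follows. The control-point data are synthetic; the paper's coefficients and accuracy figures are not reproduced.

        # Sketch: least-squares quadratic surface fit of elevation differences and its conformity.
        import numpy as np

        def fit_quadratic_surface(xy, dz):
            x, y = xy[:, 0], xy[:, 1]
            A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
            coeffs, *_ = np.linalg.lstsq(A, dz, rcond=None)
            return coeffs

        def eval_quadratic_surface(coeffs, xy):
            x, y = xy[:, 0], xy[:, 1]
            A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
            return A @ coeffs

        rng = np.random.default_rng(2)
        pts = rng.uniform(0, 1000, (60, 2))                 # control-point coordinates (m)
        true_dz = 5 + 0.002 * pts[:, 0] - 0.001 * pts[:, 1] + rng.normal(0, 0.3, 60)
        fit_pts, chk_pts = pts[:40], pts[40:]               # fit on 40 points, check on 20
        c = fit_quadratic_surface(fit_pts, true_dz[:40])
        internal = np.sqrt(np.mean((eval_quadratic_surface(c, fit_pts) - true_dz[:40])**2))
        external = np.sqrt(np.mean((eval_quadratic_surface(c, chk_pts) - true_dz[40:])**2))
        print(f"internal conformity {internal:.2f} m, external conformity {external:.2f} m")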

  18. Integrated process development-a robust, rapid method for inclusion body harvesting and processing at the microscale level.

    Science.gov (United States)

    Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid

    2017-10-21

    Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, after cell disintegration, the first step to optimal product recovery is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, due to laboratory-scale methods requiring large amounts of materials and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing, regarding impurity depletion, and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.

  19. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods of statistical evaluation of quality – SPC (item 20 of the documentation system of quality control of the ISO norms, series 9000) of various processes, products and services belong amongst the basic quality methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also make it possible, on that basis, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the following principles are presented in the contribution: diagnostics of causes and effects; Pareto analysis and the Lorenz curve; number distributions and frequency curves of random variable distributions; and Shewhart control charts.
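
    As an example of the Shewhart chart idea mentioned above, the sketch below computes X-bar/R control limits using the standard constants for subgroups of five. The data are synthetic and the implementation is illustrative, not taken from the article.

        # Minimal Shewhart X-bar/R control limits for subgroups of size 5 (illustrative data).
        import numpy as np

        A2, D3, D4 = 0.577, 0.0, 2.114          # standard chart constants for n = 5

        def xbar_r_limits(subgroups):
            """subgroups: (k, 5) array of measurements; returns (LCL, centre, UCL) per chart."""
            xbar = subgroups.mean(axis=1)
            r = subgroups.max(axis=1) - subgroups.min(axis=1)
            xbarbar, rbar = xbar.mean(), r.mean()
            return {"xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),
                    "R": (D3 * rbar, rbar, D4 * rbar)}

        rng = np.random.default_rng(3)
        data = rng.normal(10.0, 0.2, (25, 5))   # 25 subgroups of 5 measurements
        print(xbar_r_limits(data))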

  20. Geophysical methods for monitoring soil stabilization processes

    Science.gov (United States)

    Saneiyan, Sina; Ntarlagiannis, Dimitrios; Werkema, D. Dale; Ustra, Andréa

    2018-01-01

    Soil stabilization involves methods used to turn unconsolidated and unstable soil into a stiffer, consolidated medium that could support engineered structures, alter permeability, change subsurface flow, or immobilize contamination through mineral precipitation. Among the variety of available methods carbonate precipitation is a very promising one, especially when it is being induced through common soil borne microbes (MICP - microbial induced carbonate precipitation). Such microbial mediated precipitation has the added benefit of not harming the environment as other methods can be environmentally detrimental. Carbonate precipitation, typically in the form of calcite, is a naturally occurring process that can be manipulated to deliver the expected soil strengthening results or permeability changes. This study investigates the ability of spectral induced polarization and shear-wave velocity for monitoring calcite driven soil strengthening processes. The results support the use of these geophysical methods as soil strengthening characterization and long term monitoring tools, which is a requirement for viable soil stabilization projects. Both tested methods are sensitive to calcite precipitation, with SIP offering additional information related to long term stability of precipitated carbonate. Carbonate precipitation has been confirmed with direct methods, such as direct sampling and scanning electron microscopy (SEM). This study advances our understanding of soil strengthening processes and permeability alterations, and is a crucial step for the use of geophysical methods as monitoring tools in microbial induced soil alterations through carbonate precipitation.

  1. The importance of standardized observations to evaluate nutritional care quality in the survey process.

    Science.gov (United States)

    Schnelle, John F; Bertrand, Rosanna; Hurd, Donna; White, Alan; Squires, David; Feuerberg, Marvin; Hickey, Kelly; Simmons, Sandra F

    2009-10-01

    Guidelines written for government surveyors who assess nursing home (NH) compliance with federal standards contain instructions to observe the quality of mealtime assistance. However, these instructions are vague and no protocol is provided for surveyors to record observational data. This study compared government survey staff observations of mealtime assistance quality to observations by research staff using a standardized protocol that met basic standards for accurate behavioral measurement. Survey staff used either the observation instructions in the standard survey process or those written for the revised Quality Improvement Survey (QIS). Trained research staff observed mealtime care in 20 NHs in 5 states during the same time period that survey staff evaluated care in the same facilities, although it could not be determined if survey and research staff observed the same residents during the same meals. Ten NHs were evaluated by government surveyors using the QIS survey instructions and 10 NHs were evaluated by surveyors using the standard survey instructions. Research staff observations using a standardized observation protocol identified a higher proportion of residents receiving inadequate feeding assistance during meals relative to survey staff using either the standard or QIS survey instructions. For example, more than 50% of the residents who ate less than half of their meals based on research staff observation were not offered an alternative to the served meal, and the lack of alternatives, or meal substitutions, was common in all 20 NHs. In comparison, the QIS survey teams documented only 2 instances when meal substitutes were not offered in 10 NHs and the standard survey teams documented no instances in 10 NHs. Standardized mealtime observations by research staff revealed feeding assistance care quality issues in all 20 study NHs. Surveyors following the instructions in either the standard or revised QIS surveys did not detect most of these care quality

  2. Development and Psychometric Evaluation of the HPV Clinical Trial Survey for Parents (CTSP-HPV) Using Traditional Survey Development Methods and Community Engagement Principles.

    Science.gov (United States)

    Cunningham, Jennifer; Wallston, Kenneth A; Wilkins, Consuelo H; Hull, Pamela C; Miller, Stephania T

    2015-12-01

    This study describes the development and psychometric evaluation of the HPV Clinical Trial Survey for Parents with Children Aged 9 to 15 (CTSP-HPV) using traditional instrument development methods and community engagement principles. An expert panel and parental input informed survey content, and parents recommended study design changes (e.g., flyer wording). A convenience sample of 256 parents completed the final survey measuring parental willingness to consent to HPV clinical trial (CT) participation and other factors hypothesized to influence willingness (e.g., HPV vaccine benefits). Cronbach's α, Spearman correlations, and multiple linear regression were used to estimate internal consistency, convergent and discriminant validity, and predictive validity, respectively. Internal reliability was confirmed for all scales (α ≥ 0.70). Parental willingness was positively associated (p < 0.05) with trust in medical researchers, adolescent CT knowledge, HPV vaccine benefits, and advantages of adolescent CTs (r range 0.33-0.42), supporting convergent validity. Moderate discriminant construct validity was also demonstrated. Regression results indicate reasonable predictive validity, with the six scales accounting for 31% of the variance in parents' willingness. This instrument can inform interventions based on factors that influence parental willingness, which may lead to the eventual increase in trial participation. Further psychometric testing is warranted. © 2015 Wiley Periodicals, Inc.
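
    Cronbach's α, used above as the internal-consistency criterion (α ≥ 0.70), can be computed as α = k/(k−1)·(1 − Σ item variances / total-score variance). The implementation and item data below are illustrative, not the study's.

        # Cronbach's alpha for a set of scale items (synthetic item data).
        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, n_items) array of item scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(4)
        latent = rng.normal(size=(200, 1))
        scale = latent + rng.normal(scale=0.8, size=(200, 5))   # five correlated items
        print(round(cronbach_alpha(scale), 2))                   # typically >= 0.70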

  3. Geochemical drainage surveys for uranium: sampling and analytical methods based on trial surveys in Pennsylvania

    International Nuclear Information System (INIS)

    Rose, A.W.; Keith, M.L.; Suhr, N.H.

    1976-01-01

    Geochemical surveys near sandstone-type uranium prospects in northeastern and north-central Pennsylvania show that the deposits can be detected by carefully planned stream sediment surveys, but not by stream water surveys. Stream waters at single sites changed in U content by a factor of 10 to 50 during the 18 months of our studies, and even near known prospects, contain less than 0.2 ppb U most of the time. Uranium extractable from stream sediment by acetic acid-H₂O₂ provides useful contrast between mineralized and nonmineralized drainages of a square mile or less; total U in sediment does not. High organic material results in increased U content of sediments and must be corrected. Changes in U content of sediment with time reach a maximum of a factor of 3 and appear to be of short duration. A sediment survey of about 200 mi² near Jim Thorpe detects anomalies extending over several square miles near known occurrences and a second anomaly about two miles northeast of Penn Haven Jct. A similar survey in Lycoming-Sullivan Counties shows anomalous zones near known prospects of the Beaver Lake area and northwest of Muncy Creek. As, Mn, Pb, and V are enriched in the mineralized zones, and perhaps in surrounding halo zones, but do not appear to be pathfinder elements useful for reconnaissance exploration

  4. Sources of Salmonella on broiler carcasses during transportation and processing: modes of contamination and methods of control.

    Science.gov (United States)

    Corry, Janet E L; Allen, V M; Hudson, W R; Breslin, M F; Davies, R H

    2002-01-01

    The prevalence and types of salmonella in broiler chickens during transportation and during slaughter and dressing were studied. This was part of a comprehensive investigation of salmonellas in two UK poultry companies, which aimed to find the origins and mechanisms of salmonella contamination. Salmonellas were isolated using cultural methods. Serovars of Salmonella detected during rearing were usually also found in a small proportion of birds on the day of slaughter and on the carcasses at various points during processing. There was little evidence of salmonellas spreading to large numbers of carcasses during processing. Many serovars found in the feedmills or hatcheries were also detected in the birds during rearing and/or slaughter. Transport crates were contaminated with salmonellas after washing and disinfection. Prevalence of salmonellas fell in the two companies during this survey. A small number of serovars predominated in the processing plants of each company. These serovars originated from the feed mills. Reasons for transport crate contamination were: (1) inadequate cleaning, resulting in residual faecal soiling; (2) disinfectant concentration and temperature of disinfectant too low; (3) contaminated recycled flume water used to soak the crates. Efforts to control salmonella infection in broilers need to concentrate on crate cleaning and disinfection and hygiene in the feed mills.

  5. Identifying research priorities for public health research to address health inequalities: use of Delphi-like survey methods.

    Science.gov (United States)

    Turner, S; Ollerhead, E; Cook, A

    2017-10-09

    In the funding of health research and public health research it is vital that research questions posed are important and that funded research meets a research need or a gap in evidence. Many methods are used in the identification of research priorities, however, these can be resource intensive, costly and logistically challenging. Identifying such research priorities can be particularly challenging for complex public health problems as there is a need to consult a number of experts across disciplines and with a range of expertise. This study investigated the use of Delphi-like survey methods in identifying important research priorities relating to health inequalities and framing tractable research questions for topic areas identified. The study was conducted in two phases, both using Delphi-like survey methods. Firstly, public health professionals with an interest in health inequalities were asked to identify research priorities. Secondly academic researchers were asked to frame tractable research questions relating to the priorities identified. These research priorities identified using Delphi-like survey methods were subsequently compared to those identified using different methods. A total of 52 public health professionals and 21 academics across the United Kingdom agreed to take part. The response rates were high, from public health professionals across three survey rounds (69%, 50% and 40%) and from academics across one round (52%), indicating that participants were receptive to the method and motivated to respond. The themes identified as encompassing the most important research priorities were mental health, healthy environment and health behaviours. Within these themes, the topic areas that emerged most strongly included community interventions for prevention of mental health problems and the food and alcohol environment. Some responses received from academic researchers were (as requested) in the form of tractable research questions, whereas others

  6. Bridging Technometric Method and Innovation Process: An Initial Study

    Science.gov (United States)

    Rumanti, A. A.; Reynaldo, R.; Samadhi, T. M. A. A.; Wiratmadja, I. I.; Dwita, A. C.

    2018-03-01

    The process of innovation is one of the ways utilized to increase the capability of a technology component that reflects the needs of an SME. The technometric method can be used to identify the level of technology advancement in an SME and which technology component needs to be maximized in order to deliver a significant innovation. This paper serves as an early study, laying out a conceptual framework that identifies and elaborates the principles of the innovation process from a well-established innovation model by Martin together with the technometric method, based on the initial background research conducted at SME Ira Silver in Jogjakarta, Indonesia.

  7. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, was quite parallel to its development in the manufacturing industry, it is not the same in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a University in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (χ² = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
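
    To make the reported fit statistics concrete, the sketch below fits a logistic model for the likelihood of occurrence of risk on synthetic data and computes a Hosmer-Lemeshow goodness-of-fit statistic. The data, variable names and group count are hypothetical, and scikit-learn's regularised logistic regression stands in for whatever estimation routine the study actually used.

        import numpy as np
        from scipy import stats
        from sklearn.linear_model import LogisticRegression

        def hosmer_lemeshow(y_true, p_hat, groups=10):
            """Hosmer-Lemeshow goodness-of-fit statistic for a fitted binary model."""
            order = np.argsort(p_hat)
            y, p = np.asarray(y_true)[order], np.asarray(p_hat)[order]
            bins = np.array_split(np.arange(len(y)), groups)   # groups ("deciles of risk")
            chi2 = 0.0
            for idx in bins:
                obs, exp, n = y[idx].sum(), p[idx].sum(), len(idx)
                chi2 += (obs - exp) ** 2 / exp                 # events
                chi2 += ((n - obs) - (n - exp)) ** 2 / (n - exp)  # non-events
            return chi2, stats.chi2.sf(chi2, df=groups - 2)

        # Hypothetical data: X holds risk-factor scores, y marks whether a risk materialised
        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 3))
        y = (rng.random(100) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)
        model = LogisticRegression().fit(X, y)
        hl_chi2, hl_p = hosmer_lemeshow(y, model.predict_proba(X)[:, 1])
        print(f"HL chi-square = {hl_chi2:.3f}, p = {hl_p:.3f}")  # non-significant p suggests adequate fit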

  8. A survey of artifact-simulation approaches from the perspective of application to use processes of consumer durables

    NARCIS (Netherlands)

    Van der Vegte, W.F.

    2006-01-01

    In this paper, approaches for artifact-behavior simulation are reviewed. The motivation behind the survey is to explore available knowledge for the development of a new form of computer support for conceptual design to simulate use processes of consumer durables. The survey covers the simulation of

  9. [Research on evolution and transition of processing method of fuzi in ancient and modern times].

    Science.gov (United States)

    Liu, Chan-Chan; Cheng, Ming-En; Duan, Hai-Yan; Peng, Hua-Sheng

    2014-04-01

    Fuzi is a medicine used for rescuing from collapse by restoring yang as well as a famous toxic traditional Chinese medicine. In order to ensure the efficacy and safe medication, Fuzi has mostly been applied after being processed. There have been different Fuzi processing methods recorded by doctors of previous generations. Besides, there have also been differences in Fuzi processing methods recorded in modern pharmacopeia and ancient medical books. In this study, the authors traced back to medical books between the Han Dynasty and the period of the Republic of China, and summarized Fuzi processing methods collected in ancient and modern literature. According to the results, Fuzi processing methods and using methods have changed along with the evolution of dynasties, with differences in ancient and modern processing methods. Before the Tang Dynasty, Fuzi had been mostly processed and soaked. From the Tang to Ming Dynasties, Fuzi had been mostly processed, soaked and stir-fried. During the Qing Dynasty, Fuzi had been mostly soaked and boiled. In modern times, Fuzi is mostly processed by being boiled and soaked. Before the Tang Dynasty, a whole piece of Fuzi herbs or their fragments had been applied in medicines; whereas their fragments are primarily used in modern times. Because different processing methods have great impacts on the toxicity of Fuzi, it is suggested to study Fuzi processing methods.

  10. Methods for Optimisation of the Laser Cutting Process

    DEFF Research Database (Denmark)

    Dragsted, Birgitte

    This thesis deals with the adaptation and implementation of various optimisation methods, in the field of experimental design, for the laser cutting process. The problem in optimising the laser cutting process has been defined and a structure for a Decision Support System (DSS) for the optimisation of the laser cutting process has been suggested. The DSS consists of a database with the currently used and old parameter settings. Also, one of the optimisation methods has been implemented in the DSS in order to facilitate the optimisation procedure for the laser operator. The Simplex Method has been adapted in two versions: a qualitative one that optimises the process by comparing the laser-cut items, and a quantitative one that uses a weighted quality response in order to achieve a satisfactory quality and after that maximises the cutting speed, thus increasing the productivity of the process.
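
    As an illustration of the quantitative variant described above, the sketch below runs a downhill (Nelder-Mead) simplex search over cutting speed and laser power against a weighted quality response. The response function, penalty terms and parameter ranges are invented placeholders; the thesis' DSS would score actual laser-cut items rather than an analytic surface.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical weighted quality response for a cut, as a function of cutting
        # speed (m/min) and laser power (W); all constants below are placeholders.
        def weighted_quality(x):
            speed, power = x
            if speed <= 0.1:                                   # guard against non-physical trial points
                return 1e6
            kerf  = ((power / speed - 600.0) / 100.0) ** 2     # deviation from a notional power/speed ratio
            dross = max(0.0, 2.5 - speed) ** 2                 # too slow -> dross on the cut edge
            rough = 0.2 * max(0.0, speed - 6.0) ** 2           # too fast -> rough cut surface
            return kerf + dross + rough - 0.1 * speed          # last term rewards productivity

        result = minimize(weighted_quality, x0=[2.0, 1500.0], method="Nelder-Mead")
        print(result.x, result.fun)   # candidate (speed, power) setting found by the downhill simplex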

  11. The patterns of care study for breast-conserving therapy in Japan: Analysis of process survey from 1995 to 1997

    International Nuclear Information System (INIS)

    Mitsumori, Michihide; Hiraoka, Masahiro; Negoro, Yoshiharu; Yamauchi, Chikako; Shikama, Naoto; Sasaki, Shigeru; Yamamoto, Tokihiro; Teshima, Teruki; Inoue, Toshihiko

    2005-01-01

    Purpose: To present the results of a process survey on breast-conserving therapy (BCT) in Japan from 1995 to 1997. Methods and Materials: From September 1998 to December 1999, data on the treatment process of 865 randomly selected BCT patients were collected by extramural audits. Results: For primary surgery, wide excision or tumorectomy was performed in 372 patients (43.0%), and quadrantectomy or segmental mastectomy was performed in 493 patients (57%). The extent of axillary dissection was equal to or beyond Level II in 590 patients (68.2%). Systemic chemotherapy was administered to 103 of 160 node-positive patients (64.4%) and 180 of 569 node-negative patients (31.6%). Tamoxifen was administered to 234 of 323 hormone receptor-positive patients (72.5%) and 68 of 130 hormone receptor-negative patients (52.3%). Photon energy of 10 MV was administered for whole breast irradiation in 38 patients (4.4%) without bolus. Conclusions: The extent of surgical resection for BCT was large in Japan. Pathologic assessment and the technique of radiation therapy were apparently suboptimal in some cases. Information on prognostic/predictive factors was not fully utilized to individualize systemic adjuvant therapy. Establishment and widespread use of guidelines for BCT in Japan are desirable. Repeated surveys will demonstrate how such guidelines affect clinical practices

  12. Methods of the National Nutrition Survey 1999 Metodología de la Encuesta Nacional de Nutrición 1999

    OpenAIRE

    Elsa Resano-Pérez; Ignacio Méndez-Ramírez; Teresa Shamah-Levy; Juan A Rivera; Jaime Sepúlveda-Amor

    2003-01-01

    OBJECTIVE: To describe the methods and analyses of the 1999 National Nutrition Survey (NNS-99). MATERIAL AND METHODS: The 1999 National Nutrition Survey (NNS-99) is a probabilistic survey with nationwide representativity. The NNS-99 included four regions and urban and rural areas of Mexico. The last sampling units were households, selected through stratified cluster sampling. The study population consisted of children under five years of age, school-age children (6-11 years), and women of chi...

  13. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    In automated production processes grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases, describes how different principles are adopted at different scales in different applications; applications from assembly to disassembly, from aerospace to food industry, and from textile to logistics are discussed. Finally, the most recent research is reviewed in order to introduce the new trends in grasping. They provide an outlook on the future of both grippers and robotic hands in automated production processes.

  14. Equipment decontamination: A brief survey of the DOE complex

    International Nuclear Information System (INIS)

    Conner, C.; Chamberlain, D.B.; Chen, L.; Vandegrift, G.F.

    1995-03-01

    Deactivation at DOE facilities has left a tremendous amount of contaminated equipment behind. In-situ methods are needed to decontaminate the interiors of the equipment sufficiently to allow either free release or land disposal. A brief survey was completed of the DOE complex on their needs for equipment decontamination with in-situ technology to determine (1) the types of contamination problems within the DOE complex, (2) decontamination processes that are being used or are being developed within the DOE, and (3) the methods that are available to dispose of spent decontamination solutions. In addition, potential sites for testing decontamination methods were located. Based on the information obtained from these surveys, the Rocky Flats Plant and the Idaho National Engineering Laboratory appear to be best suited to complete the initial testing of the decontamination processes

  15. Application of gamma spectrometry survey and discussion on data processing

    International Nuclear Information System (INIS)

    Li Ji'an; He Jianguo

    2008-01-01

    This paper analyzes and discusses the different opinions about the measured parameters of gamma spectrometry data and introduces the contribution of gamma spectrometry surveys to the search for sandstone-type uranium deposits. The author believes that it is necessary to perform ground gamma spectrometry surveys and to strengthen the development and application of airborne radiometric data so as to fulfil the role of gamma spectrometry in the exploration of sandstone-type uranium deposits. (authors)

  16. The Role of Character in the Hiring Process: A Pilot Study Survey of College Seniors' Potential Employers

    Science.gov (United States)

    Firmin, Michael; Proemmel, Elizabeth; McDivitt, Sarah; Evens, Jennifer; Gibbs, Lew

    2009-01-01

    We surveyed 31 prospective employers (65% response rate) regarding their views on character as part of the employment selection process. The results showed character qualities to be superordinate relative to the skills that prospective employees bring to potential jobs. We discuss survey results in light of business educators' responsibility for helping…

  17. Implementation of a new rapid tissue processing method--advantages and challenges

    DEFF Research Database (Denmark)

    Munkholm, Julie; Talman, Maj-Lis; Hasselager, Thomas

    2008-01-01

    Conventional tissue processing of histologic specimens has been carried out in the same manner for many years. It is a time-consuming process involving batch production, resulting in a 1-day delay of the diagnosis. Microwave-assisted tissue processing enables a continuous high flow of histologic specimens through the processor with a processing time of as low as 1h. In this article, we present the effects of the automated microwave-assisted tissue processor on the histomorphologic quality and the turnaround time (TAT) for histopathology reports. We present a blind comparative study regarding the histomorphologic quality of microwave-processed and conventionally processed tissue samples. A total of 333 specimens were included. The microwave-assisted processing method showed a histomorphologic quality comparable to the conventional method for a number of tissue types, including skin and specimens from...

  18. Application of finite-element-methods in food processing

    DEFF Research Database (Denmark)

    Risum, Jørgen

    2004-01-01

    Presentation of the possible use of finite-element-methods in food processing. Examples from diffusion studies are given.

  19. Strapdown Airborne Gravimetry Quality Assessment Method Based on Single Survey Line Data: A Study by SGA-WZ02 Gravimeter

    Science.gov (United States)

    Wu, Meiping; Cao, Juliang; Zhang, Kaidong; Cai, Shaokun; Yu, Ruihang

    2018-01-01

    Quality assessment is an important part of strapdown airborne gravimetry. The root mean square error (RMSE) evaluation method is a classical way to evaluate gravimetry quality, but classical evaluation methods are preconditioned on extra flights or reference data. Thus, a method that largely removes the preconditions of classical quality assessment methods and can be used on a single survey line has been developed in this paper. According to theoretical analysis, the method chooses the stability of the two horizontal attitude angles, the horizontal specific force and the vertical specific force as the determinants of the quality assessment. The actual data, collected by the SGA-WZ02 from 13 flights and 21 lines in a survey, were used to build the model and elaborate the method. To substantiate the performance of the quality assessment model, the model was applied to extra repeat-line flights from two surveys. Compared with the internal RMSE, the standard deviations of the assessment residuals are 0.23 mGal and 0.16 mGal in the two surveys, which shows that the quality assessment method is reliable and stricter. Extra flights specially arranged for quality assessment are therefore not necessary. The method, summarized from the SGA-WZ02, is a feasible approach to assess gravimetry quality using single-line data and is also suitable for other strapdown gravimeters. PMID:29373535
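
    The sketch below contrasts the two kinds of quantities mentioned above: the classical repeat-line internal RMSE, which needs extra flights, and simple single-line stability indicators (standard deviations of the attitude and specific-force channels). It is an illustrative reading of the approach, not the authors' actual assessment model, and all array names are hypothetical.

        import numpy as np

        def internal_rmse(repeat_lines):
            """Classical repeat-line check: RMS deviation of each line from the mean profile (mGal)."""
            lines = np.asarray(repeat_lines, dtype=float)     # shape (n_repeat_lines, n_samples)
            residuals = lines - lines.mean(axis=0)
            return float(np.sqrt(np.mean(residuals ** 2)))

        def single_line_indicators(roll, pitch, f_horizontal, f_vertical):
            """Stability indicators available on one survey line, without any reference data."""
            channels = {"roll": roll, "pitch": pitch,
                        "f_horizontal": f_horizontal, "f_vertical": f_vertical}
            return {name: float(np.std(values)) for name, values in channels.items()}

        # Hypothetical use: indicators from several flown lines could be regressed against the
        # internal RMSE of repeat lines to calibrate a single-line quality model.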

  20. Review of conventional and novel food processing methods on food allergens.

    Science.gov (United States)

    Vanga, Sai Kranthi; Singh, Ashutosh; Raghavan, Vijaya

    2017-07-03

    With the turn of this century, novel food processing techniques have become commercially very important because of their profound advantages over the traditional methods. These novel processing methods tend to preserve the characteristic properties of food, including their organoleptic and nutritional qualities, better than the conventional food processing methods. During the same period, there has been a clear rise in the populations suffering from food allergies, especially infants and children. Though this fact is widely attributed to the changing livelihoods of populations in both developed and developing nations and to the introduction of new food habits with the advent of novel foods and new processing techniques, their complete role is still uncertain. Under these circumstances, it is very important to understand the structural changes in the protein as food is processed, to comprehend whether a specific processing technique (conventional or novel) increases or mitigates allergenicity. Various modern means are now being employed to understand the conformational changes in the protein which can affect allergenicity. In this review, the processing effects on protein structure and allergenicity are discussed, along with the implications of recent studies and techniques, to establish a platform for investigating future pathways to reduce or eliminate allergenicity in the population.

  1. The lithosphere structure and deep processes of the Mesozoic metallogenic belt in eastern China: constraints from passive and active seismic methods

    Science.gov (United States)

    Lu, Q.; Shi, D.; Jiang, G.; Yan, J.

    2013-12-01

    The lithosphere structure and deep processes are keys to understanding mineral systems and ore-forming processes. Lithosphere-scale processes can create large footprints or signatures which can be observed by geophysical methods. SinoProbe-03 has conducted a transect exploration across the middle and lower Yangtze Metallogenic Belt (YMB) in Eastern China. Broadband seismic, reflection seismic, wide-angle reflection and magnetotelluric surveys were carried out along the transect. Seismic reflection profiles and MT surveys were also performed in the Luzong, Tongling and Ningwu ore districts to construct 3D geological models. The resulting geophysical data provide new information which helps in better understanding the lithosphere structure, deep processes and deformation history of the Metallogenic Belt. The major results are: (1) a lower-velocity body at the top of the upper mantle and a SE-dipping high-velocity body were imaged by teleseismic tomography beneath the YMB; (2) shear wave splitting results show a NE-parallel fast-wave polarization direction which parallels the tectonic lineament; (3) the reflection seismic data support the crustal-detachment model, the lower and upper crust having been detached during contraction deformation near the Tanlu fault and the Ningwu volcanic basin; (4) broadband and reflection seismic data confirm the shallow Moho beneath the YMB; (5) lower-crust reflectivity correlates strongly with magmatism; (6) the lower crust below the Luzong Volcanics shows obvious reflective anisotropy both at the crust-mantle transition and at the brittle-ductile transition in the crust. All these features suggest that intracontinental subduction, lithosphere delamination, mantle-sourced magmatic underplating, and the MASH process are responsible for the formation of this Mesozoic metallogenic belt. Acknowledgment: We acknowledge the financial support of SinoProbe by the Ministry of Finance and Ministry of Land and Resources, P. R. China, under Grant sinoprobe-03, and financial support by National Natural

  2. Efficient point cloud data processing in shipbuilding: Reformative component extraction method and registration method

    Directory of Open Access Journals (Sweden)

    Jingyu Sun

    2014-07-01

    Full Text Available To survive in the current shipbuilding industry, it is of vital importance for shipyards to have the ship components' accuracy evaluated efficiently during most of the manufacturing steps. Evaluating components' accuracy by comparing each component's point cloud data scanned by laser scanners with the ship's design data formatted in CAD cannot be performed efficiently when (1) components extracted from the point cloud data include irregular obstacles endogenously, or when (2) registration of the two data sets has no clear direction setting. This paper presents reformative point cloud data processing methods to solve these problems. K-d tree construction of the point cloud data speeds up the neighbor search for each point. A region growing method performed on the neighbor points of the seed point extracts the continuous part of the component, while curved surface fitting and B-spline curve fitting at the edge of the continuous part recognize the neighbor domains of the same component divided by obstacles' shadows. The ICP (Iterative Closest Point) algorithm conducts a registration of the two sets of data after the proper registration direction is decided by principal component analysis. In experiments conducted at the shipyard, 200 curved shell plates were extracted from the scanned point cloud data, and registrations were conducted between them and the designed CAD data using the proposed methods for an accuracy evaluation. Results show that the methods proposed in this paper support accuracy-evaluation-targeted point cloud data processing efficiently in practice.
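
    As a rough illustration of two of the building blocks named above, the sketch below uses a k-d tree for nearest-neighbour search inside a minimal point-to-point ICP registration loop. It is a generic, simplified version of these techniques (no region growing, surface fitting or PCA-based initial direction), and the scan/CAD arrays referred to in the comments are hypothetical.

        import numpy as np
        from scipy.spatial import cKDTree

        def icp(source, target, iterations=20):
            """Minimal point-to-point ICP: rigidly align `source` (N x 3) onto `target` (M x 3)."""
            src = np.asarray(source, dtype=float).copy()
            target = np.asarray(target, dtype=float)
            tree = cKDTree(target)                      # k-d tree makes the neighbour search fast
            for _ in range(iterations):
                _, idx = tree.query(src)                # closest target point for every source point
                matched = target[idx]
                mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
                H = (src - mu_s).T @ (matched - mu_t)   # cross-covariance of the matched pairs
                U, _, Vt = np.linalg.svd(H)
                R = Vt.T @ U.T
                if np.linalg.det(R) < 0:                # guard against reflections
                    Vt[-1] *= -1
                    R = Vt.T @ U.T
                t = mu_t - R @ mu_s
                src = src @ R.T + t                     # apply the rigid transform to every point
            return src

        # Hypothetical use: register a scanned shell plate onto points sampled from the CAD model
        # aligned = icp(scan_points, cad_points)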

  3. Examination of the equivalence of self-report survey-based paper-and-pencil and internet data collection methods.

    Science.gov (United States)

    Weigold, Arne; Weigold, Ingrid K; Russell, Elizabeth J

    2013-03-01

    Self-report survey-based data collection is increasingly carried out using the Internet, as opposed to the traditional paper-and-pencil method. However, previous research on the equivalence of these methods has yielded inconsistent findings. This may be due to methodological and statistical issues present in much of the literature, such as nonequivalent samples in different conditions due to recruitment, participant self-selection to conditions, and data collection procedures, as well as incomplete or inappropriate statistical procedures for examining equivalence. We conducted 2 studies examining the equivalence of paper-and-pencil and Internet data collection that accounted for these issues. In both studies, we used measures of personality, social desirability, and computer self-efficacy, and, in Study 2, we used personal growth initiative to assess quantitative equivalence (i.e., mean equivalence), qualitative equivalence (i.e., internal consistency and intercorrelations), and auxiliary equivalence (i.e., response rates, missing data, completion time, and comfort completing questionnaires using paper-and-pencil and the Internet). Study 1 investigated the effects of completing surveys via paper-and-pencil or the Internet in both traditional (i.e., lab) and natural (i.e., take-home) settings. Results indicated equivalence across conditions, except for auxiliary equivalence aspects of missing data and completion time. Study 2 examined mailed paper-and-pencil and Internet surveys without contact between experimenter and participants. Results indicated equivalence between conditions, except for auxiliary equivalence aspects of response rate for providing an address and completion time. Overall, the findings show that paper-and-pencil and Internet data collection methods are generally equivalent, particularly for quantitative and qualitative equivalence, with nonequivalence only for some aspects of auxiliary equivalence. PsycINFO Database Record (c) 2013 APA, all

  4. The SAGES Legacy Unifying Globulars and Galaxies survey (SLUGGS): sample definition, methods, and initial results

    Energy Technology Data Exchange (ETDEWEB)

    Brodie, Jean P.; Romanowsky, Aaron J.; Jennings, Zachary G.; Pota, Vincenzo; Kader, Justin; Roediger, Joel C.; Villaume, Alexa; Arnold, Jacob A.; Woodley, Kristin A. [University of California Observatories, 1156 High Street, Santa Cruz, CA 95064 (United States); Strader, Jay [Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States); Forbes, Duncan A.; Pastorello, Nicola; Usher, Christopher; Blom, Christina; Kartha, Sreeja S. [Centre for Astrophysics and Supercomputing, Swinburne University, Hawthorn, VIC 3122 (Australia); Foster, Caroline; Spitler, Lee R., E-mail: jbrodie@ucsc.edu [Australian Astronomical Observatory, P.O. Box 915, North Ryde, NSW 1670 (Australia)

    2014-11-20

    We introduce and provide the scientific motivation for a wide-field photometric and spectroscopic chemodynamical survey of nearby early-type galaxies (ETGs) and their globular cluster (GC) systems. The SAGES Legacy Unifying Globulars and GalaxieS (SLUGGS) survey is being carried out primarily with Subaru/Suprime-Cam and Keck/DEIMOS. The former provides deep gri imaging over a 900 arcmin² field-of-view to characterize GC and host galaxy colors and spatial distributions, and to identify spectroscopic targets. The NIR Ca II triplet provides GC line-of-sight velocities and metallicities out to typically ∼8 R_e, and to ∼15 R_e in some cases. New techniques to extract integrated stellar kinematics and metallicities to large radii (∼2-3 R_e) are used in concert with GC data to create two-dimensional (2D) velocity and metallicity maps for comparison with simulations of galaxy formation. The advantages of SLUGGS compared with other, complementary, 2D-chemodynamical surveys are its superior velocity resolution, radial extent, and multiple halo tracers. We describe the sample of 25 nearby ETGs, the selection criteria for galaxies and GCs, the observing strategies, the data reduction techniques, and modeling methods. The survey observations are nearly complete and more than 30 papers have so far been published using SLUGGS data. Here we summarize some initial results, including signatures of two-phase galaxy assembly, evidence for GC metallicity bimodality, and a novel framework for the formation of extended star clusters and ultracompact dwarfs. An integrated overview of current chemodynamical constraints on GC systems points to separate, in situ formation modes at high redshifts for metal-poor and metal-rich GCs.

  5. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR)

    Directory of Open Access Journals (Sweden)

    Bochaton Audrey

    2007-06-01

    Full Text Available Abstract Background Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. Methods We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. Application We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. Conclusion This paper describes the conceptual reasoning behind
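
    The sketch below illustrates the two-stage logic described above: a deliberate (non-random) first-stage choice of the cluster combination whose pooled determinant profile best matches the city, followed by a probabilistic second stage within the chosen clusters. All figures (cluster profiles, city profile, cluster and household counts) are invented placeholders, not the Vientiane survey data.

        import itertools
        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical cluster profiles: share of Lao nationals and literacy rate in each of 20
        # candidate clusters, plus the same two figures for the whole city (e.g. from a census).
        cluster_profiles = rng.uniform(0.4, 1.0, size=(20, 2))
        city_profile = np.array([0.82, 0.75])

        def purposive_cluster_choice(profiles, reference, n_clusters=6):
            """First, non-random stage: pick the cluster set whose pooled profile best matches the city."""
            best_combo, best_dist = None, np.inf
            for combo in itertools.combinations(range(len(profiles)), n_clusters):
                pooled = profiles[list(combo)].mean(axis=0)
                dist = np.linalg.norm(pooled - reference)
                if dist < best_dist:
                    best_combo, best_dist = combo, dist
            return best_combo

        chosen = purposive_cluster_choice(cluster_profiles, city_profile)

        # Second, probabilistic stage: a simple random sample of 25 households in each chosen cluster
        # (assuming 500 enumerated households per cluster).
        households = {cluster: rng.choice(500, size=25, replace=False) for cluster in chosen}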

  6. Comparing two remote video survey methods for spatial predictions of the distribution and environmental niche suitability of demersal fishes.

    Science.gov (United States)

    Galaiduk, Ronen; Radford, Ben T; Wilson, Shaun K; Harvey, Euan S

    2017-12-15

    Information on habitat associations from survey data, combined with spatial modelling, allow the development of more refined species distribution modelling which may identify areas of high conservation/fisheries value and consequentially improve conservation efforts. Generalised additive models were used to model the probability of occurrence of six focal species after surveys that utilised two remote underwater video sampling methods (i.e. baited and towed video). Models developed for the towed video method had consistently better predictive performance for all but one study species although only three models had a good to fair fit, and the rest were poor fits, highlighting the challenges associated with modelling habitat associations of marine species in highly homogenous, low relief environments. Models based on baited video dataset regularly included large-scale measures of structural complexity, suggesting fish attraction to a single focus point by bait. Conversely, models based on the towed video data often incorporated small-scale measures of habitat complexity and were more likely to reflect true species-habitat relationships. The cost associated with use of the towed video systems for surveying low-relief seascapes was also relatively low providing additional support for considering this method for marine spatial ecological modelling.

  7. THE BASE OF THE METHODICAL DESIGN AND IMPLEMENTATION OF ENGINEERING EDUCATION PROCESS

    Directory of Open Access Journals (Sweden)

    Renata Lis

    2012-12-01

    Full Text Available The article is devoted to the methodology of implementing the European and national qualifications frameworks in the academic process. It consists of the methodology of designing degree programs and classes, and the methodology of the teaching process.

  8. Optoelectronic imaging of speckle using image processing method

    Science.gov (United States)

    Wang, Jinjiang; Wang, Pengfei

    2018-01-01

    A detailed image-processing treatment of laser speckle interferometry is proposed as an example for a postgraduate course. Several image processing methods are used together for dealing with the optoelectronic imaging system: partial differential equations (PDEs) are used to reduce the effect of noise, the thresholding segmentation is also based on the heat equation with PDEs, the centre line is extracted based on the image skeleton and branches are removed automatically, the phase level is calculated by a spline interpolation method, and the fringe phase can be unwrapped. Finally, the image processing method was used to automatically measure the bubble in rubber under negative pressure, which could be used in tire detection.
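
    A compressed version of such a pipeline is sketched below using scikit-image: smoothing (here plain Gaussian filtering, i.e. linear heat-equation diffusion rather than the authors' specific PDE scheme), Otsu thresholding, removal of small speckle blobs and skeletonisation to one-pixel-wide fringe centre lines. It is a generic illustration; the parameter values are arbitrary.

        from skimage.filters import gaussian, threshold_otsu
        from skimage.morphology import skeletonize, remove_small_objects

        def fringe_centrelines(fringe_image):
            """Rough fringe-pattern pipeline: smooth, segment, then thin to centre lines."""
            smoothed = gaussian(fringe_image, sigma=2)            # linear diffusion ~ heat-equation smoothing
            binary = smoothed > threshold_otsu(smoothed)          # global threshold segmentation
            binary = remove_small_objects(binary, min_size=64)    # drop small speckle blobs
            return skeletonize(binary)                            # one-pixel-wide fringe skeleton

        # The resulting skeleton can then be labelled fringe by fringe and the phase interpolated
        # between fringe orders (e.g. with scipy.interpolate) before unwrapping.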

  9. Systematic Development of Miniaturized (Bio)Processes using Process Systems Engineering (PSE) Methods and Tools

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Larsson, Hilde; Heintz, Søren

    2014-01-01

    The focus of this work is on process systems engineering (PSE) methods and tools, and especially on how such PSE methods and tools can be used to accelerate and support systematic bioprocess development at a miniature scale. After a short presentation of the PSE methods and the bioprocess...... development drivers, three case studies are presented. In the first example it is demonstrated how experimental investigations of the bi-enzymatic production of lactobionic acid can be modeled with help of a new mechanistic mathematical model. The reaction was performed at lab scale and the prediction quality...

  10. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    Energy Technology Data Exchange (ETDEWEB)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles.

  11. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    International Nuclear Information System (INIS)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles

  12. Methods of process management in radiology

    International Nuclear Information System (INIS)

    Teichgraeber, U.K.M.; Gillessen, C.; Neumann, F.

    2003-01-01

    The main emphasis in health care has been on quality and availability but increasing cost pressure has made cost efficiency ever more relevant for nurses, technicians, and physicians. Within a hospital, the radiologist considerably influences the patient's length of stay through the availability of service and diagnostic information. Therefore, coordinating and timing radiologic examinations become increasingly more important. Physicians are not taught organizational management during their medical education and residency training, and the necessary expertise in economics is generally acquired through the literature or specialized courses. Beyond the medical service, the physicians are increasingly required to optimize their work flow according to economic factors. This review introduces various tools for process management and its application in radiology. By means of simple paper-based methods, the work flow of most processes can be analyzed. For more complex work flow, it is suggested to choose a method that allows for an exact qualitative and quantitative prediction of the effect of variations. This review introduces network planning technique and process simulation. (orig.) [de

  13. Indications and organisational methods for autologous blood transfusion procedures in Italy: results of a national survey.

    Science.gov (United States)

    Catalano, Liviana; Campolongo, Alessandra; Caponera, Maurizio; Berzuini, Alessandra; Bontadini, Andrea; Furlò, Giuseppe; Pasqualetti, Patrizio; Liumbruno, Giancarlo M

    2014-10-01

    Pre-operative donation of autologous blood is a practice that is now being abandoned. Alternative methods of transfusing autologous blood, other than predeposited blood, do however play a role in limiting the need for transfusion of allogeneic blood. This survey of autologous blood transfusion practices, promoted by the Italian Society of Transfusion Medicine and Immunohaematology more than 2 years after the publication of national recommendations on the subject, was intended to acquire information on the indications for predeposit in Italy and on some organisational aspects of the alternative techniques of autotransfusion. A structured questionnaire consisting of 22 questions on the indications and organisational methods of autologous blood transfusion was made available on a web platform from 15 January to 15 March, 2013. The 232 Transfusion Services in Italy were invited by e-mail to complete the online survey. Of the 232 transfusion structures contacted, 160 (69%) responded to the survey, with the response rate decreasing from the North towards the South and the Islands. The use of predeposit has decreased considerably in Italy and about 50% of the units collected are discarded because of lack of use. Alternative techniques (acute isovolaemic haemodilution and peri-operative blood salvage) are used at different frequencies across the country. The data collected in this survey can be considered representative of national practice; they show that the already very limited indications for predeposit autologous blood transfusion must be adhered to even more scrupulously, also to avoid the notable waste of resources due to unused units. Users of alternative autotransfusion techniques must be involved in order to gain a full picture of the degree of use of such techniques; multidisciplinary agreement on the indications for their use is essential in order for these indications to have an effective role in "patient blood management" programmes.

  14. Catalytic Enzyme-Based Methods for Water Treatment and Water Distribution System Decontamination. 1. Literature Survey

    Science.gov (United States)

    2006-06-01

    best examples of this is glucose isomerase, which has been used in the commercial production of high fructose corn syrup (HFCS) since 1967. (Report front matter: Edgewood Chemical Biological Center, U.S. Army Research, Development and Engineering Command, ECBC-TR-489, Catalytic Enzyme-Based Methods for Water Treatment and Water Distribution System Decontamination. 1. Literature Survey, Joseph J. DeFrank, Research and Technology Directorate, June 2006.)

  15. Application of remote sensing methods and GIS in erosive process investigations

    Directory of Open Access Journals (Sweden)

    Mustafić Sanja

    2007-01-01

    Full Text Available Modern geomorphological investigations of the condition and change in intensity of erosive processes should be based on the application of remote sensing methods, which rely on the processing of aerial and satellite photographs. The use of these methods is very important because it makes it possible to capture the regional relations of the investigated phenomenon, as well as to estimate the spatial and temporal variability of all physical-geographical and anthropogenic factors influencing the given process. Understanding the process of land erosion as a whole is only possible by creating a universal database and using appropriate software, that is, by establishing a uniform information system. A geographical information system, as the most effective, most complex and most integrated system of spatial information, enables the unification as well as the analytical and synthetic processing of all data.

  16. Transgender-inclusive measures of sex/gender for population surveys: Mixed-methods evaluation and recommendations.

    Science.gov (United States)

    Bauer, Greta R; Braimoh, Jessica; Scheim, Ayden I; Dharma, Christoffer

    2017-01-01

    Given that an estimated 0.6% of the U.S. population is transgender (trans) and that large health disparities for this population have been documented, government and research organizations are increasingly expanding measures of sex/gender to be trans inclusive. Options suggested for trans community surveys, such as expansive check-all-that-apply gender identity lists and write-in options that offer maximum flexibility, are generally not appropriate for broad population surveys. These require limited questions and a small number of categories for analysis. Limited evaluation has been undertaken of trans-inclusive population survey measures for sex/gender, including those currently in use. Using an internet survey and follow-up of 311 participants, and cognitive interviews from a maximum-diversity sub-sample (n = 79), we conducted a mixed-methods evaluation of two existing measures: a two-step question developed in the United States and a multidimensional measure developed in Canada. We found very low levels of item missingness, and no indicators of confusion on the part of cisgender (non-trans) participants for both measures. However, a majority of interview participants indicated problems with each question item set. Agreement between the two measures in assessment of gender identity was very high (K = 0.9081), but gender identity was a poor proxy for other dimensions of sex or gender among trans participants. Issues to inform measure development or adaptation that emerged from analysis included dimensions of sex/gender measured, whether non-binary identities were trans, Indigenous and cultural identities, proxy reporting, temporality concerns, and the inability of a single item to provide a valid measure of sex/gender. Based on this evaluation, we recommend that population surveys meant for multi-purpose analysis consider a new Multidimensional Sex/Gender Measure for testing that includes three simple items (one asked only of a small sub-group) to assess gender

  17. Transgender-inclusive measures of sex/gender for population surveys: Mixed-methods evaluation and recommendations.

    Directory of Open Access Journals (Sweden)

    Greta R Bauer

    Full Text Available Given that an estimated 0.6% of the U.S. population is transgender (trans) and that large health disparities for this population have been documented, government and research organizations are increasingly expanding measures of sex/gender to be trans inclusive. Options suggested for trans community surveys, such as expansive check-all-that-apply gender identity lists and write-in options that offer maximum flexibility, are generally not appropriate for broad population surveys. These require limited questions and a small number of categories for analysis. Limited evaluation has been undertaken of trans-inclusive population survey measures for sex/gender, including those currently in use. Using an internet survey and follow-up of 311 participants, and cognitive interviews from a maximum-diversity sub-sample (n = 79), we conducted a mixed-methods evaluation of two existing measures: a two-step question developed in the United States and a multidimensional measure developed in Canada. We found very low levels of item missingness, and no indicators of confusion on the part of cisgender (non-trans) participants for both measures. However, a majority of interview participants indicated problems with each question item set. Agreement between the two measures in assessment of gender identity was very high (K = 0.9081), but gender identity was a poor proxy for other dimensions of sex or gender among trans participants. Issues to inform measure development or adaptation that emerged from analysis included dimensions of sex/gender measured, whether non-binary identities were trans, Indigenous and cultural identities, proxy reporting, temporality concerns, and the inability of a single item to provide a valid measure of sex/gender. Based on this evaluation, we recommend that population surveys meant for multi-purpose analysis consider a new Multidimensional Sex/Gender Measure for testing that includes three simple items (one asked only of a small sub-group) to

  18. Transgender-inclusive measures of sex/gender for population surveys: Mixed-methods evaluation and recommendations

    Science.gov (United States)

    Bauer, Greta R.; Braimoh, Jessica; Scheim, Ayden I.; Dharma, Christoffer

    2017-01-01

    Given that an estimated 0.6% of the U.S. population is transgender (trans) and that large health disparities for this population have been documented, government and research organizations are increasingly expanding measures of sex/gender to be trans inclusive. Options suggested for trans community surveys, such as expansive check-all-that-apply gender identity lists and write-in options that offer maximum flexibility, are generally not appropriate for broad population surveys. These require limited questions and a small number of categories for analysis. Limited evaluation has been undertaken of trans-inclusive population survey measures for sex/gender, including those currently in use. Using an internet survey and follow-up of 311 participants, and cognitive interviews from a maximum-diversity sub-sample (n = 79), we conducted a mixed-methods evaluation of two existing measures: a two-step question developed in the United States and a multidimensional measure developed in Canada. We found very low levels of item missingness, and no indicators of confusion on the part of cisgender (non-trans) participants for both measures. However, a majority of interview participants indicated problems with each question item set. Agreement between the two measures in assessment of gender identity was very high (K = 0.9081), but gender identity was a poor proxy for other dimensions of sex or gender among trans participants. Issues to inform measure development or adaptation that emerged from analysis included dimensions of sex/gender measured, whether non-binary identities were trans, Indigenous and cultural identities, proxy reporting, temporality concerns, and the inability of a single item to provide a valid measure of sex/gender. Based on this evaluation, we recommend that population surveys meant for multi-purpose analysis consider a new Multidimensional Sex/Gender Measure for testing that includes three simple items (one asked only of a small sub-group) to assess gender

  19. Data-driven fault detection for industrial processes canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With the ever increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although the model-based fault detection theory has been well studied in the past decades, its applications are limited to large-scale industrial processes because it is difficult to build accurate models. Furthermore, motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, less engineering effort and wide application scope. For performance evaluation of FD methods, a new index is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...
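
    The sketch below shows one common way a CCA-based residual generator of this kind can be built from input/output data: whiten both blocks, take the SVD of the whitened cross-covariance, and form a residual from the strongest canonical pairs. It is a compact numpy illustration of the general idea, not necessarily the exact algorithms developed in the book.

        import numpy as np

        def cca_directions(U, Y):
            """Canonical correlation directions between input data U (N x m) and output data Y (N x p)."""
            Uc, Yc = U - U.mean(axis=0), Y - Y.mean(axis=0)
            n = len(Uc)
            Suu, Syy = Uc.T @ Uc / (n - 1), Yc.T @ Yc / (n - 1)
            Suy = Uc.T @ Yc / (n - 1)
            Lu_inv = np.linalg.inv(np.linalg.cholesky(Suu))   # whitening transform for the inputs
            Ly_inv = np.linalg.inv(np.linalg.cholesky(Syy))   # whitening transform for the outputs
            K = Lu_inv @ Suy @ Ly_inv.T                       # whitened cross-covariance
            J, s, Lt = np.linalg.svd(K, full_matrices=False)
            A = Lu_inv.T @ J                                  # canonical directions for the inputs
            B = Ly_inv.T @ Lt.T                               # canonical directions for the outputs
            return A, B, s

        def residual(u_centered, y_centered, A, B, s, k):
            """Fault-detection residual from the k strongest canonical pairs (small for fault-free data)."""
            return A[:, :k].T @ u_centered - np.diag(s[:k]) @ (B[:, :k].T @ y_centered)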

  20. Influence of harvesting and processing methods on organic viability of soybean seed

    Directory of Open Access Journals (Sweden)

    Đukanović Lana

    2000-01-01

    Full Text Available The organic viability of soybean seed of three elite soybean varieties (Bosa, ZPS 015 and Nena), depending on the methods of seed handling during the harvesting and processing phases, was determined in this paper. The trial was conducted in Zemun Polje during 1999; manual and mechanized harvesting or processing methods were applied. Seed germination was tested using ISTA methods (standard method and cold test). The following parameters were evaluated: germination viability, germination, rate (speed) of emergence, and length of hypocotyl and main root. The rate of emergence was based on the number of emerged plants per day. The length of the hypocotyl or root and the percentage of germination determined the vigour index. Based on the obtained results it may be concluded that the methods of seed handling during the harvesting or processing phase influenced the soybean seed quality parameters evaluated. The evaluated handling methods reduced the organic viability of soybean seed by decreasing germination viability, total germination and the length of the main root.

  1. A Survey of Logistics’ Processes in Tabriz Health and Nutrition Faculty Using Process Mapping and Analyzing in 2012

    Directory of Open Access Journals (Sweden)

    Jafar Sadegh Tabrizi

    2015-08-01

    Full Text Available ​Background and objectives : Considering the importance of process improvement and support system, we tried to take a step to clarify logistics’ processes and initiate quality improvement in health and nutrition faculty of Tabriz University of Medical Sciences using process mapping and decision making matrix. Material and Methods : This study is qualitative and was conducted in 2012. The data were gathered by interview with the participation of the individuals involving in the process and researchers’ direct observation. In this study, we clarified logistics’ processes of financial and administration deputy using block diagram and detailed flowchart. After clarifying the process, research team analyzed the data and proposed an improved way for process as a suggestive detailed flowchart. At the next stage, we compared the processes in decision making matrix and finally, the best option for intervention was chose among 10 worst functioning processes in decision making matrix of prioritization. Results : In this study, 35 processes were documented using process mapping in general affairs, personnel affairs, secretariat, archive deputy, storage department, accounting, properties, services, supply and financial department. The accounting documenting process had the worst function according to comparison matrix and the purchase and supply process was selected as the best option of intervention. Conclusion : The results of this study showed that most of the processes in this deputy have problems in theory and practice and system improvement is in need of reforming which will improve quality and prevent organization sources to be wasted. Due to the multi-department function of the processes, the unity of departments and a qualified management is required for better reform.

  2. A survey of castration methods and associated livestock management practices performed by bovine veterinarians in the United States.

    Science.gov (United States)

    Coetzee, Johann F; Nutsch, Abbey L; Barbur, Laura A; Bradburn, Ryan M

    2010-03-03

    Castration of male calves destined for beef production is a common management practice performed in the United States amounting to approximately 15 million procedures per year. Societal concern about the moral and ethical treatment of animals is increasing. Therefore, production agriculture is faced with the challenge of formulating animal welfare policies relating to routine management practices such as castration. To enable the livestock industry to effectively respond to these challenges there is a need for more data on management practices that are commonly used in cattle production systems. The objective of this survey was to describe castration methods, adverse events and husbandry procedures performed by U.S. veterinarians at the time of castration. Invitations to participate in the survey were sent to email addresses of 1,669 members of the American Association of Bovine Practitioners and 303 members of the Academy of Veterinary Consultants. After partially completed surveys and missing data were omitted, 189 responses were included in the analysis. Surgical castration with a scalpel followed by testicular removal by twisting (calves 90 kg) was the most common method of castration used. The potential risk of injury to the operator, size of the calf, handling facilities and experience with the technique were the most important considerations used to determine the method of castration used. Swelling, stiffness and increased lying time were the most prevalent adverse events observed following castration. One in five practitioners report using an analgesic or local anesthetic at the time of castration. Approximately 90% of respondents indicated that they vaccinate and dehorn calves at the time of castration. Over half the respondents use disinfectants, prophylactic antimicrobials and tetanus toxoid to reduce complications following castration. The results of this survey describe current methods of castration and associated management practices employed by

  3. A survey of castration methods and associated livestock management practices performed by bovine veterinarians in the United States

    Directory of Open Access Journals (Sweden)

    Bradburn Ryan M

    2010-03-01

    Full Text Available Abstract Background Castration of male calves destined for beef production is a common management practice performed in the United States amounting to approximately 15 million procedures per year. Societal concern about the moral and ethical treatment of animals is increasing. Therefore, production agriculture is faced with the challenge of formulating animal welfare policies relating to routine management practices such as castration. To enable the livestock industry to effectively respond to these challenges there is a need for more data on management practices that are commonly used in cattle production systems. The objective of this survey was to describe castration methods, adverse events and husbandry procedures performed by U.S. veterinarians at the time of castration. Invitations to participate in the survey were sent to email addresses of 1,669 members of the American Association of Bovine Practitioners and 303 members of the Academy of Veterinary Consultants. Results After partially completed surveys and missing data were omitted, 189 responses were included in the analysis. Surgical castration with a scalpel followed by testicular removal by twisting (calves <90 kg) or an emasculator (calves >90 kg) was the most common method of castration used. The potential risk of injury to the operator, size of the calf, handling facilities and experience with the technique were the most important considerations used to determine the method of castration used. Swelling, stiffness and increased lying time were the most prevalent adverse events observed following castration. One in five practitioners report using an analgesic or local anesthetic at the time of castration. Approximately 90% of respondents indicated that they vaccinate and dehorn calves at the time of castration. Over half the respondents use disinfectants, prophylactic antimicrobials and tetanus toxoid to reduce complications following castration. Conclusions The results of this survey describe current methods of

  4. The cosmological analysis of X-ray cluster surveys - I. A new method for interpreting number counts

    Science.gov (United States)

    Clerc, N.; Pierre, M.; Pacaud, F.; Sadibekova, T.

    2012-07-01

    We present a new method aimed at simplifying the cosmological analysis of X-ray cluster surveys. It is based on purely instrumental observable quantities considered in a two-dimensional X-ray colour-magnitude diagram (hardness ratio versus count rate). The basic principle is that even in rather shallow surveys, substantial information on cluster redshift and temperature is present in the raw X-ray data and can be statistically extracted; in parallel, such diagrams can be readily predicted from an ab initio cosmological modelling. We illustrate the methodology for the case of a 100 deg^2 XMM survey having a sensitivity of ~10^-14 erg s^-1 cm^-2 and fit, at the same time, the survey selection function, the cluster evolutionary scaling relations and the cosmology; our sole assumption - driven by the limited size of the sample considered in the case study - is that the local cluster scaling relations are known. We devote special attention to the realistic modelling of the count-rate measurement uncertainties and evaluate the potential of the method via a Fisher analysis. In the absence of individual cluster redshifts, the count rate and hardness ratio (CR-HR) method appears to be much more efficient than the traditional approach based on cluster counts (i.e. dn/dz, requiring redshifts). In the case where redshifts are available, our method performs similarly to the traditional mass function (dn/dM/dz) for the purely cosmological parameters, but better constrains the parameters defining the cluster scaling relations and their evolution. A further practical advantage of the CR-HR method is its simplicity: this fully top-down approach totally bypasses the tedious steps consisting in deriving cluster masses from X-ray temperature measurements.
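
    For orientation, the two instrumental observables behind the CR-HR diagram can be computed as in the Python sketch below; the exposure time and toy photon counts are assumptions for illustration and do not reproduce the authors' XMM pipeline.

        import numpy as np

        def cr_hr(soft_counts, hard_counts, exposure_s):
            # Count rate (cts/s) and hardness ratio from soft- and hard-band counts.
            cr = (soft_counts + hard_counts) / exposure_s
            hr = (hard_counts - soft_counts) / (hard_counts + soft_counts)
            return cr, hr

        rng = np.random.default_rng(0)
        exposure = 10e3                       # 10 ks exposure (assumed)
        soft = rng.poisson(lam=200, size=5)   # toy soft-band counts for five clusters
        hard = rng.poisson(lam=80, size=5)    # toy hard-band counts
        for c, h in zip(*cr_hr(soft, hard, exposure)):
            print(f"CR = {c:.4f} cts/s, HR = {h:+.3f}")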

  5. An international survey and modified Delphi process revealed editors’ perceptions, training needs, and ratings of competency-related statements for the development of core competencies for scientific editors of biomedical journals [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    James Galipeau

    2017-09-01

    Full Text Available Background: Scientific editors (i.e., those who make decisions on the content and policies of a journal) have a central role in the editorial process at biomedical journals. However, very little is known about the training needs of these editors or what competencies are required to perform effectively in this role. Methods: We conducted a survey of perceptions and training needs among scientific editors from major editorial organizations around the world, followed by a modified Delphi process in which we invited the same scientific editors to rate the importance of competency-related statements obtained from a previous scoping review. Results: A total of 148 participants completed the survey of perceptions and training needs. At least 80% of participants agreed on six of the 38 skill and expertise-related statements presented to them as being important or very important to their role as scientific editors. At least 80% agreed on three of the 38 statements as necessary skills they perceived themselves as possessing (well or very well). The top five items on participants’ list of top training needs were training in statistics, research methods, publication ethics, recruiting and dealing with peer reviewers, and indexing of journals. The three rounds of the Delphi were completed by 83, 83, and 73 participants, respectively, which ultimately produced a list of 23 “highly rated” competency-related statements and another 86 “included” items. Conclusion: Both the survey and the modified Delphi process will be critical for understanding knowledge and training gaps among scientific editors when designing curriculum around core competencies in the future.
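
    The consensus rule used in such Delphi rounds can be illustrated with the Python sketch below, which flags statements rated 4 or 5 by at least 80% of panellists; the statements and ratings shown are hypothetical.

        # Flag "highly rated" statements: >= 80% of panellists give a rating of 4 or 5.
        ratings = {
            "statistics":       [5, 5, 4, 3, 5, 4, 4, 5],
            "research methods": [4, 4, 5, 5, 4, 3, 5, 4],
            "journal indexing": [3, 2, 4, 3, 5, 2, 3, 4],
        }
        THRESHOLD = 0.80

        for statement, scores in ratings.items():
            agreement = sum(s >= 4 for s in scores) / len(scores)
            status = "highly rated" if agreement >= THRESHOLD else "carry to next round"
            print(f"{statement:18s} agreement = {agreement:.0%} -> {status}")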

  6. A review of experiment data processing method for uranium mining and metallurgy in BRICEM

    International Nuclear Information System (INIS)

    Ye Guoqiang; Lu Kehong; Wang Congying

    1997-01-01

    The authors investigate the methods of experimental data processing used at the Beijing Research Institute of Chemical Engineering and Metallurgy (BRICEM). Error analysis is used to process experimental data, single-factor transformation and orthogonal test design are adopted for arranging tests, and regression analysis and mathematical process simulation are applied to build mathematical models of uranium mining and metallurgy processes. These methods lay a foundation for the use of mathematical statistics in this field
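
    A minimal Python sketch of the regression-analysis step mentioned above is given below; the leaching data and the linear model are invented for illustration and are not BRICEM measurements.

        import numpy as np

        # Hypothetical leaching-test data: acid concentration (g/L) vs. uranium recovery (%).
        acid_conc = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
        recovery = np.array([62.0, 70.0, 78.0, 83.0, 88.0])

        # Least-squares fit of a straight line and a simple residual error estimate.
        slope, intercept = np.polyfit(acid_conc, recovery, deg=1)
        residual_sd = np.std(recovery - (slope * acid_conc + intercept), ddof=2)
        print(f"recovery = {slope:.2f} * conc + {intercept:.1f} (residual s.d. {residual_sd:.2f})")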

  7. Application of QMC methods to PDEs with random coefficients : a survey of analysis and implementation

    KAUST Repository

    Kuo, Frances

    2016-01-05

    In this talk I will provide a survey of recent research efforts on the application of quasi-Monte Carlo (QMC) methods to PDEs with random coefficients. Such PDE problems occur in the area of uncertainty quantification. In recent years many papers have been written on this topic using a variety of methods. QMC methods are relatively new to this application area. I will consider different models for the randomness (uniform versus lognormal) and contrast different QMC algorithms (single-level versus multilevel, first order versus higher order, deterministic versus randomized). I will give a summary of the QMC error analysis and proof techniques in a unified view, and provide a practical guide to the software for constructing QMC points tailored to the PDE problems.
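
    The contrast between plain Monte Carlo and a randomized QMC rule can be illustrated with the Python sketch below, which uses a scrambled Sobol' sequence on a toy integrand standing in for a PDE quantity of interest; the dimension and integrand are assumptions.

        import numpy as np
        from scipy.stats import qmc

        d, n = 8, 2**12

        def f(y):
            # Toy surrogate for a quantity of interest depending on random coefficients y in [0,1]^d.
            return np.exp(-np.sum(y / (1 + np.arange(1, d + 1)), axis=1))

        mc_estimate = f(np.random.default_rng(0).random((n, d))).mean()
        qmc_estimate = f(qmc.Sobol(d=d, scramble=True, seed=0).random(n)).mean()
        print(f"MC  estimate: {mc_estimate:.6f}")
        print(f"QMC estimate: {qmc_estimate:.6f}")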

  8. Wiki Surveys: Open and Quantifiable Social Data Collection

    Science.gov (United States)

    Salganik, Matthew J.; Levy, Karen E. C.

    2015-01-01

    In the social sciences, there is a longstanding tension between data collection methods that facilitate quantification and those that are open to unanticipated information. Advances in technology now enable new, hybrid methods that combine some of the benefits of both approaches. Drawing inspiration from online information aggregation systems like Wikipedia and from traditional survey research, we propose a new class of research instruments called wiki surveys. Just as Wikipedia evolves over time based on contributions from participants, we envision an evolving survey driven by contributions from respondents. We develop three general principles that underlie wiki surveys: they should be greedy, collaborative, and adaptive. Building on these principles, we develop methods for data collection and data analysis for one type of wiki survey, a pairwise wiki survey. Using two proof-of-concept case studies involving our free and open-source website www.allourideas.org, we show that pairwise wiki surveys can yield insights that would be difficult to obtain with other methods. PMID:25992565

  9. Wiki surveys: open and quantifiable social data collection.

    Science.gov (United States)

    Salganik, Matthew J; Levy, Karen E C

    2015-01-01

    In the social sciences, there is a longstanding tension between data collection methods that facilitate quantification and those that are open to unanticipated information. Advances in technology now enable new, hybrid methods that combine some of the benefits of both approaches. Drawing inspiration from online information aggregation systems like Wikipedia and from traditional survey research, we propose a new class of research instruments called wiki surveys. Just as Wikipedia evolves over time based on contributions from participants, we envision an evolving survey driven by contributions from respondents. We develop three general principles that underlie wiki surveys: they should be greedy, collaborative, and adaptive. Building on these principles, we develop methods for data collection and data analysis for one type of wiki survey, a pairwise wiki survey. Using two proof-of-concept case studies involving our free and open-source website www.allourideas.org, we show that pairwise wiki surveys can yield insights that would be difficult to obtain with other methods.
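
    A minimal Python sketch of scoring ideas from pairwise votes is shown below; it uses a simple win rate, whereas the allourideas.org platform applies a more elaborate statistical estimate, so this is an illustration only and the vote records are invented.

        from collections import defaultdict

        # Each vote is (winning idea, losing idea) from one pairwise comparison.
        votes = [("bike lanes", "more parking"),
                 ("bike lanes", "new stadium"),
                 ("more parking", "new stadium"),
                 ("new stadium", "bike lanes")]

        wins, appearances = defaultdict(int), defaultdict(int)
        for winner, loser in votes:
            wins[winner] += 1
            appearances[winner] += 1
            appearances[loser] += 1

        for idea in sorted(appearances, key=lambda i: wins[i] / appearances[i], reverse=True):
            print(f"{idea:14s} win rate = {wins[idea] / appearances[idea]:.2f}")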

  10. Radiation process control, study and acceptance of dosimetric methods

    International Nuclear Information System (INIS)

    Radak, B.B.

    1984-01-01

    The methods of primary dosimetric standardization and the calibration of dosimetric monitors suitable for radiation process control were outlined in the form of a logical pattern in which they are in current use on industrial scale in Yugoslavia. The reliability of the process control of industrial sterilization of medical supplies for the last four years was discussed. The preparatory works for the intermittent use of electron beams in cable industry were described. (author)

  11. Microbiological survey of a South African poultry processing plant.

    Science.gov (United States)

    Geornaras, I; de Jesus, A; van Zyl, E; von Holy, A

    1995-01-01

    Bacterial populations associated with poultry processing were determined on neck skin samples, equipment surfaces and environmental samples by replicate surveys. Aerobic plate counts, Enterobacteriaceae counts and Pseudomonas counts were performed by standard procedures and the prevalence of Listeria, presumptive Salmonella and Staphylococcus aureus determined. Statistically significant (P < 0.05) differences were found between sampling points; defeathering curtains, shackles and conveyor belts repeatedly showed aerobic plate counts in excess of 5.0 log CFU 25 cm-2. Aerobic plate counts of scald tank and spinchiller water were 2 log CFU ml-1 higher than those of potable water samples. Bacterial numbers of the air in the "dirty" area were higher than those of the "clean" area. Listeria, presumptive Salmonella and Staphylococcus aureus were isolated from 27.6, 51.7 and 24.1% of all product samples, respectively, and Listeria and Staphylococcus aureus were also isolated from selected equipment surfaces.

  12. A Technical Survey on Optimization of Processing Geo Distributed Data

    Science.gov (United States)

    Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.

    2018-04-01

    With the growth of cloud services and technology, a number of geographically distributed data centers have emerged to store large amounts of data. Analysis of geo-distributed data is required by various services for data processing, storage of essential information, etc., and processing this geo-distributed data and performing analytics on it is a challenging task. Distributed data processing is accompanied by issues in storage, computation and communication, and the key issues to be dealt with are time efficiency, cost minimization and utility maximization. This paper describes various optimization methods, such as end-to-end multiphase and G-MR, using techniques such as Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE and AMP (Ant Colony Optimization) to handle these issues. The various optimization methods and techniques used are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving quality of service and on reducing computation and communication cost; and SAGE achieves performance improvement in processing geo-distributed data sets.

  13. FY 2000 report on the survey for introduction of the hot spring effect prediction method in the geothermal development promotion survey. Improvement of the hot spring effect prediction method in the geothermal development promotion survey; 2000 nendo chinetsu kaihatsu sokushin chosa. Onsen eikyo yosoku shuho donyu chosa - Chinetsu kaihatsu sokushin chosa ni okeru onsen eikyo yosoku shuho no kairyo hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Assuming that the geothermal development promotion survey would be carried out in promising geothermal areas of Japan, an investigational study was conducted on the possibility of introducing the hot spring effect prediction method and on improvements to the method. In the survey, the formation mechanisms of hot springs were organized and classified, and for each mechanism the ways in which geothermal development could affect hot springs were studied and summarized. The anticipated effects include a lowering of the water level and a decrease in discharge as reservoir pressure declines, dilution by increased inflow of the surrounding groundwater, and vaporization of hot spring aquifers due to an increased rate of vapor inflow. For introducing the hot spring effect prediction method into the geothermal development promotion survey, the main problem is the short supply of various kinds of data, and this issue was examined. Based on the results of the survey, the items to be studied when introducing the hot spring effect prediction method were selected, and a hot spring effect prediction flow for the surface survey and well survey stages was drawn up. (NEDO)

  14. Learning-based controller for biotechnology processing, and method of using

    Science.gov (United States)

    Johnson, John A.; Stoner, Daphne L.; Larsen, Eric D.; Miller, Karen S.; Tolle, Charles R.

    2004-09-14

    The present invention relates to process control where some of the controllable parameters are difficult or impossible to characterize. It relates to process control of such systems in biotechnology, but is not limited to them; additionally, it relates to process control in biotechnological minerals processing. In the inventive method, an application of the present invention manipulates a minerals bioprocess to find local extrema (maxima or minima) for selected output variables/process goals by using a learning-based controller for bioprocess oxidation of minerals during hydrometallurgical processing. The learning-based controller operates with or without human supervision and works to find process optima without previously defined optima, owing to the non-characterized nature of the process being manipulated.

  15. Dead or Alive? Dealing with Unknown Eligibility in Longitudinal Surveys

    Directory of Open Access Journals (Sweden)

    Watson Nicole

    2016-12-01

    Full Text Available Longitudinal surveys follow people over time and some of these people will die during the life of the panel. Through fieldwork effort, some deaths will be reported or known, but others will be unobserved due to sample members no longer being issued to field or having inconclusive fieldwork outcomes (such as a noncontact that is not followed by a contact at a later wave). The coverage of deaths identified among sample members has flow-on implications to nonresponse correction. Using the Household, Income and Labour Dynamics in Australia (HILDA) Survey, four methods are used to examine the extent of missing death reports. The first method matches the sample to the national death register. The second method uses life-expectancy tables to extrapolate the expected number of deaths among the sample with unknown eligibility. The third method is similar but models deaths from data internal to the survey. The fourth method models deaths as part of the attrition process of a longitudinal survey. The last three methods are compared to the first method and the implications for the construction of balanced panel weights and subsequent population inference are explored.
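
    The life-expectancy extrapolation (the second method above) can be sketched in Python as follows; the mortality rates, ages and wave counts are hypothetical rather than official life-table values.

        # Expected number of unreported deaths among sample members with unknown eligibility.
        annual_death_prob = {60: 0.010, 61: 0.011, 62: 0.012, 63: 0.013, 64: 0.014, 65: 0.016}

        def prob_died(age_last_seen, waves_missing):
            # Probability of having died over the waves since last contact (1 wave = 1 year).
            p_alive = 1.0
            for k in range(waves_missing):
                q = annual_death_prob.get(age_last_seen + k, 0.02)  # fallback rate (assumed)
                p_alive *= 1.0 - q
            return 1.0 - p_alive

        unknown = [(60, 3), (64, 5), (62, 2)]   # (age at last contact, waves missing) - invented
        expected = sum(prob_died(a, w) for a, w in unknown)
        print(f"Expected unreported deaths among {len(unknown)} members: {expected:.2f}")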

  16. Effect of the method of processing on quality and oxidative stability ...

    African Journals Online (AJOL)

    In this study four samn samples prepared from cow milk using two processing methods (traditional T1, T2 and factory processed T3, T4) were investigated for their physico-chemical properties, fatty acids composition, oxidative stability and sensory properties. The traditionally processed samples showed a significance ...

  17. Statistical sampling method for releasing decontaminated vehicles

    International Nuclear Information System (INIS)

    Lively, J.W.; Ware, J.A.

    1996-01-01

    Earth moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method (MIL-STD-105E, "Sampling Procedures and Tables for Inspection by Attributes") for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically compensates and accommodates fluctuating batch sizes and changing conditions without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been successfully used for 1 year at the former uranium mill site in Monticello, Utah (a CERCLA regulated clean-up site). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method on Monticello Projects has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site
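
    The idea of tying sampling frequency to an assumed contamination rate and a desired confidence level can be illustrated with the back-of-the-envelope Python sketch below; it is not the MIL-STD-105E tables themselves, and the rates used are assumptions.

        import math

        def vehicles_to_survey(p_contaminated, confidence):
            # Smallest n such that P(at least one contaminated vehicle is caught) >= confidence,
            # assuming each surveyed vehicle is independently contaminated with probability p.
            return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_contaminated))

        for p in (0.05, 0.10):
            print(f"assumed contamination rate {p:.0%}: survey at least "
                  f"{vehicles_to_survey(p, confidence=0.95)} vehicles per batch")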

  18. Methods used to conduct the pan-European Union survey on consumer attitudes to physical activity, body weight and health.

    Science.gov (United States)

    Kearney, J M; Kearney, M J; McElhone, S; Gibney, M J

    1999-03-01

    The purpose of conducting this survey was to identify data on consumer attitudes towards and beliefs about physical activity, body weight and health among the 15 countries of the EU. A cross-sectional study to get a picture of the attitudes to physical activity, body weight and health in the EU. For this, it was considered important that samples be nationally representative so that inferences drawn from the data could be applied to the population in each country as well as to the EU population as a whole. Using a non-probability sampling method employing quota controls (and the national weight) we obtained large sample sizes from each country which were nationally representative in terms of the variables age, sex and regional distribution. To ensure samples were truly nationally representative a national weight was used when analysing the data using the same characteristics as those used to define quotas. When examining pooled estimates for the total EU sample a population weight was applied. In total, 15,239 subjects aged 15 years and upwards in the EU completed the survey. This article gives details on the methods used in carrying out the survey from design of the questionnaire to sample selection, questionnaire administration and analysis of the data. The methods and their limitations are discussed.
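
    The effect of applying a national weight when estimating a prevalence from a quota sample can be illustrated with the short Python sketch below; the responses and weights are invented.

        import numpy as np

        responses = np.array([1, 0, 1, 1, 0, 1, 0, 0])                   # 1 = holds a given attitude
        weights = np.array([0.8, 1.2, 1.0, 0.9, 1.4, 0.7, 1.1, 0.9])     # national weights (invented)

        print(f"unweighted prevalence = {responses.mean():.1%}")
        print(f"weighted prevalence   = {np.average(responses, weights=weights):.1%}")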

  19. An Automated Processing Method for Agglomeration Areas

    Directory of Open Access Journals (Sweden)

    Chengming Li

    2018-05-01

    Full Text Available Agglomeration operations are a core component of the automated generalization of aggregated area groups. However, because geographical elements that possess agglomeration features are relatively scarce, the current literature has not given sufficient attention to agglomeration operations. Furthermore, most reports on the subject are limited to the general conceptual level. Consequently, current agglomeration methods are highly reliant on subjective determinations and cannot support intelligent computer processing. This paper proposes an automated processing method for agglomeration areas. Firstly, the proposed method automatically identifies agglomeration areas based on the width of the striped bridging area, distribution pattern index (DPI), shape similarity index (SSI), and overlap index (OI). Next, the progressive agglomeration operation is carried out, including the computation of the external boundary outlines and the extraction of agglomeration lines. The effectiveness and rationality of the proposed method have been validated using actual census data on Chinese geographical conditions in Jiangsu Province.

  20. Municipal solid waste processing methods: Technical-economic comparison

    International Nuclear Information System (INIS)

    Bertanza, G.

    1993-01-01

    This paper points out the advantages and disadvantages of municipal solid waste processing methods incorporating different energy and/or materials recovery techniques, i.e., those involving composting or incineration and those with a mix of composting and incineration. The various technologies employed are compared especially with regard to process reliability, flexibility, modularity, pollution control efficiency and cost effectiveness. With regard to composting, biodigestors are examined, while for incineration the paper analyzes systems using combustion with complete recovery of vapour, combustion with total recovery of available electric energy, and combustion with cogeneration. Each of the processing methods examined includes an iron recovery cycle

  1. An aerial radiological survey of the Double Track Site and surrounding area, Central Nevada. Date of survey: December 1993

    International Nuclear Information System (INIS)

    1995-08-01

    An aerial radiological survey of the Double Track Site was conducted in December 1993. An interim report described survey procedures and presented terrestrial exposure rate and wide-area-averaged plutonium isopleth plots. This letter report presents additional plutonium plots and some rule-of-thumb calculations which should help the reader to properly interpret the data presented. Attached to this report are three isopleth plots produced from the Double Track data. No one processing method provides all the answers regarding a particular surveyed area. Where peak values are most important, a figure created from the original unsmoothed data is the presentation of choice. A figure from smoothed data is superior for the detection of areas of widespread low-level contamination. A third figure, also from smoothed data, satisfied a particular early mission goal but is not as useful for cleanup operations as the other two. This last figure is presented for historical completeness only
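
    The trade-off between smoothed and unsmoothed presentations can be illustrated with the Python sketch below, which applies a moving-average smooth to a synthetic exposure-rate grid; the values are not survey data.

        import numpy as np
        from scipy.ndimage import uniform_filter

        rng = np.random.default_rng(1)
        grid = rng.normal(loc=10.0, scale=2.0, size=(50, 50))   # background level (assumed units)
        grid[20:23, 30:33] += 40.0                              # small isolated hot spot
        grid[5:35, 5:15] += 3.0                                 # widespread low-level excess

        smoothed = uniform_filter(grid, size=5)                 # 5x5 moving-average smooth
        print(f"peak value:      raw = {grid.max():.1f}, smoothed = {smoothed.max():.1f}")
        print(f"broad-area mean: raw = {grid[5:35, 5:15].mean():.1f}, "
              f"smoothed = {smoothed[5:35, 5:15].mean():.1f}")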

  2. Enhancing Field Research Methods with Mobile Survey Technology

    Science.gov (United States)

    Glass, Michael R.

    2015-01-01

    This paper assesses the experience of undergraduate students using mobile devices and a commercial application, iSurvey, to conduct a neighborhood survey. Mobile devices offer benefits for enhancing student learning and engagement. This field exercise created the opportunity for classroom discussions on the practicalities of urban research, the…

  3. Classical-processing and quantum-processing signal separation methods for qubit uncoupling

    Science.gov (United States)

    Deville, Yannick; Deville, Alain

    2012-12-01

    The Blind Source Separation problem consists in estimating a set of unknown source signals from their measured combinations. It was only investigated in a non-quantum framework up to now. We propose its first quantum extensions. We thus introduce the Quantum Source Separation field, investigating both its blind and non-blind configurations. More precisely, we show how to retrieve individual quantum bits (qubits) only from the global state resulting from their undesired coupling. We consider cylindrical-symmetry Heisenberg coupling, which e.g. occurs when two electron spins interact through exchange. We first propose several qubit uncoupling methods which typically measure repeatedly the coupled quantum states resulting from individual qubits preparations, and which then statistically process the classical data provided by these measurements. Numerical tests prove the effectiveness of these methods. We then derive a combination of quantum gates for performing qubit uncoupling, thus avoiding repeated qubit preparations and irreversible measurements.

  4. Web-based surveys as an alternative to traditional mail methods.

    Science.gov (United States)

    Fleming, Christopher M; Bowden, Mark

    2009-01-01

    Environmental economists have long used surveys to gather information about people's preferences. A recent innovation in survey methodology has been the advent of web-based surveys. While the Internet appears to offer a promising alternative to conventional survey administration modes, concerns exist over potential sampling biases associated with web-based surveys and the effect these may have on valuation estimates. This paper compares results obtained from a travel cost questionnaire of visitors to Fraser Island, Australia, that was conducted using two alternate survey administration modes; conventional mail and web-based. It is found that response rates and the socio-demographic make-up of respondents to the two survey modes are not statistically different. Moreover, both modes yield similar consumer surplus estimates.

  5. Novel welding image processing method based on fractal theory

    Institute of Scientific and Technical Information of China (English)

    陈强; 孙振国; 肖勇; 路井荣

    2002-01-01

    Computer vision has come into use in the fields of welding process control and automation. In order to improve the precision and speed of welding image processing, a novel method based on fractal theory is put forward in this paper. In contrast to traditional methods, the image is first processed coarsely in macroscopic regions and then analyzed thoroughly in microscopic regions. The image is divided into regions according to the different fractal characteristics of the image edges, the fuzzy regions containing image edges are detected, and the edges are then identified with the Sobel operator and fitted by the least squares method (LSM). Since the amount of data to be processed is decreased and image noise is reduced, experiments have verified that the edges of the weld seam or weld pool can be recognized correctly and quickly.
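
    The final two steps described above (Sobel edge detection followed by a least-squares curve fit) can be sketched in Python as follows on a synthetic image; the fractal region-selection step is not reproduced.

        import numpy as np
        from scipy import ndimage

        # Synthetic binary image with a gently curved step edge standing in for a weld edge.
        img = np.zeros((64, 64))
        rows = np.arange(64)
        edge_cols = (20 + 0.3 * rows + 0.002 * rows**2).astype(int)
        for r, c in zip(rows, edge_cols):
            img[r, c:] = 1.0

        # Sobel gradient magnitude, thresholded to pick out edge pixels.
        grad = np.hypot(ndimage.sobel(img, axis=0), ndimage.sobel(img, axis=1))
        ys, xs = np.nonzero(grad > 0.5 * grad.max())

        # Least-squares fit of the edge as a quadratic x(y).
        coeffs = np.polyfit(ys, xs, deg=2)
        print("fitted edge: x(y) ~ {:.4f}*y^2 + {:.3f}*y + {:.2f}".format(*coeffs))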

  6. Investigation of error estimation method of observational data and comparison method between numerical and observational results toward V and V of seismic simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Kawakami, Yoshiaki; Nakajima, Norihiro

    2017-01-01

    Methods for estimating the errors included in observational data and for comparing numerical results with observational results are investigated, with a view to the verification and validation (V and V) of seismic simulations. For the error estimation method, 144 publications from the past five years (2010 to 2014) in the structural and earthquake engineering fields, where acceleration data are frequently described, were surveyed. It was found that processes to remove components regarded as errors from observational data are used in about 30% of those publications. Errors are caused by the resolution, the linearity, the temperature coefficient for sensitivity, the temperature coefficient for zero shift, the transverse sensitivity, the seismometer properties, aliasing, and so on. Those processes can be exploited to estimate errors individually. For the method of comparing numerical results with observational results, the public materials of the ASME V and V Symposium 2012-2015, their references, and the 144 publications mentioned above were surveyed, and it was found that six methods have mainly been proposed in existing research. The advantages and disadvantages of these methods were organized by evaluating them against nine criteria. No method is yet well established, so it is necessary either to apply the existing methods while compensating for their disadvantages or to search for a novel method. (author)

  7. Tertiary survey in polytrauma patients should be an ongoing process.

    Science.gov (United States)

    Ferree, Steven; Houwert, Roderick M; van Laarhoven, Jacqueline J E M; Smeeing, Diederik P J; Leenen, Luke P H; Hietbrink, Falco

    2016-04-01

    Due to prioritisation in the initial trauma care, non-life threatening injuries can be overlooked or temporarily neglected. Polytrauma patients in particular might be at risk for delayed diagnosed injuries (DDI). Studies that solely focus on DDI in polytrauma patients are not available. Therefore the aim of this study was to analyze DDI and determine risk factors associated with DDI in polytrauma patients. In this single centre retrospective cohort study, patients were considered polytrauma when the Injury Severity Score was ≥ 16 as a result of injury in at least 2 body regions. Adult polytrauma patients admitted from 2007 until 2012 were identified. Hospital charts were reviewed to identify DDI. 1416 polytrauma patients were analyzed of which 12% had DDI. Most DDI were found during initial hospital admission after tertiary survey (63%). Extremities were the most affected regions for all types of DDI (78%) with the highest intervention rate (35%). Most prevalent DDI were fractures of the hand (54%) and foot (38%). In 2% of all patients a DDI was found after discharge, consisting mainly of injuries other than a fracture. High energy trauma mechanism (OR 1.8, 95% CI 1.2-2.7), abdominal injury (OR 1.5, 95% CI 1.1-2.1) and extremity injuries found during initial assessment (OR 2.3, 95% CI 1.6-3.3) were independent risk factors for DDI. In polytrauma patients, most DDI were found during hospital admission but after tertiary survey. This demonstrates that the tertiary survey should be an ongoing process and thus repeated daily in polytrauma patients. Most frequent DDI were extremity injuries, especially injuries of the hand and foot. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Assessing the search for information on Three Rs methods, and their subsequent implementation: a national survey among scientists in the Netherlands.

    Science.gov (United States)

    van Luijk, Judith; Cuijpers, Yvonne; van der Vaart, Lilian; Leenaars, Marlies; Ritskes-Hoitinga, Merel

    2011-10-01

    A local survey conducted among scientists into the current practice of searching for information on Three Rs (i.e. Replacement, Reduction and Refinement) methods has highlighted the gap between the statutory requirement to apply Three Rs methods and the lack of criteria to search for them. To verify these findings on a national level, we conducted a survey among scientists throughout The Netherlands. Due to the low response rate, the results give an impression of opinions, rather than being representative of The Netherlands as a whole. The findings of both surveys complement each other, and indicate that there is room for improvement. Scientists perceive searching the literature for information on Three Rs methods to be a difficult task, and specific Three Rs search skills and knowledge of Three Rs databases are limited. Rather than using a literature search, many researchers obtain information on these methods through personal communication, which means that published information on possible Three Rs methods often remains unfound and unused. A solution might be to move beyond the direct search for information on Three Rs methods and choose another approach. One approach that seems rather appropriate is that of systematic review. This provides insight into the necessity for any new animal studies, as well as optimal implementation of available data and the prevention of unnecessary animal use in the future. 2011 FRAME.

  9. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  10. THE BOLOCAM GALACTIC PLANE SURVEY. VIII. A MID-INFRARED KINEMATIC DISTANCE DISCRIMINATION METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Ellsworth-Bowers, Timothy P.; Glenn, Jason; Battersby, Cara; Ginsburg, Adam; Bally, John [CASA, University of Colorado, UCB 389, University of Colorado, Boulder, CO 80309 (United States); Rosolowsky, Erik [Department of Physics and Astronomy, University of British Columbia Okanagan, 3333 University Way, Kelowna, BC V1V 1V7 (Canada); Mairs, Steven [Department of Physics and Astronomy, University of Victoria, 3800 Finnerty Road, Victoria, BC V8P 1A1 (Canada); Evans, Neal J. II [Department of Astronomy, University of Texas, 1 University Station C1400, Austin, TX 78712 (United States); Shirley, Yancy L., E-mail: timothy.ellsworthbowers@colorado.edu [Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721 (United States)

    2013-06-10

    We present a new distance estimation method for dust-continuum-identified molecular cloud clumps. Recent (sub-)millimeter Galactic plane surveys have cataloged tens of thousands of these objects, plausible precursors to stellar clusters, but detailed study of their physical properties requires robust distance determinations. We derive Bayesian distance probability density functions (DPDFs) for 770 objects from the Bolocam Galactic Plane Survey in the Galactic longitude range 7.5° ≤ l ≤ 65°. The DPDF formalism is based on kinematic distances, and uses any number of external data sets to place prior distance probabilities to resolve the kinematic distance ambiguity (KDA) for objects in the inner Galaxy. We present here priors related to the mid-infrared absorption of dust in dense molecular regions and the distribution of molecular gas in the Galactic disk. By assuming a numerical model of Galactic mid-infrared emission and simple radiative transfer, we match the morphology of (sub-)millimeter thermal dust emission with mid-infrared absorption to compute a prior DPDF for distance discrimination. Selecting objects first from (sub-)millimeter source catalogs avoids a bias towards the darkest infrared dark clouds (IRDCs) and extends the range of heliocentric distance probed by mid-infrared extinction and includes lower-contrast sources. We derive well-constrained KDA resolutions for 618 molecular cloud clumps, with approximately 15% placed at or beyond the tangent distance. Objects with mid-infrared contrast sufficient to be cataloged as IRDCs are generally placed at the near kinematic distance. Distance comparisons with Galactic Ring Survey KDA resolutions yield a 92% agreement. A face-on view of the Milky Way using resolved distances reveals sections of the Sagittarius and Scutum-Centaurus Arms. This KDA-resolution method for large catalogs of sources through the combination of (sub-)millimeter and mid-infrared observations of molecular
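
    A heavily simplified Python sketch of the DPDF idea follows: a double-peaked kinematic-distance likelihood is multiplied by a prior that favours the near distance for a source seen in mid-infrared absorption; all numbers are illustrative rather than BGPS catalogue values.

        import numpy as np

        d = np.linspace(0.1, 20.0, 2000)              # heliocentric distance grid [kpc]

        def gaussian(x, mu, sigma):
            return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

        d_near, d_far, sigma_kd = 3.5, 11.0, 0.6      # near/far kinematic distances (assumed)
        likelihood = gaussian(d, d_near, sigma_kd) + gaussian(d, d_far, sigma_kd)

        seen_in_absorption = True                      # mid-IR dark source => favour near distance
        prior = np.exp(-d / 6.0) if seen_in_absorption else np.ones_like(d)

        dpdf = likelihood * prior
        dpdf /= np.trapz(dpdf, d)                      # normalise to a probability density
        near_side = d < 0.5 * (d_near + d_far)
        print(f"P(near kinematic distance) = {np.trapz(dpdf[near_side], d[near_side]):.2f}")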

  11. Borehole radar survey at the granite quarry mine, Pocheon, Kyounggi province

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Ho; Cho, Seong Jun; Yi, Myeong Jong; Chung, Seung Hwan; Lee, Hee Il; Shin, In Chul [Korea Institute of Geology Mining and Materials, Taejon (Korea, Republic of)

    1997-12-01

    A borehole radar survey combining reflection and tomography methods was conducted at the Donga granite quarry mine in the Pocheon area of Kyounggi province. The purpose of a radar survey in a quarry mine is to delineate inhomogeneities, including fractures, and to estimate the freshness of the rock. A central frequency of 20 MHz was adopted for the radar reflection and tomography surveys to obtain a longer penetration distance. A reflection survey using a direction-finding antenna was also conducted to obtain information on the spatial orientation of reflectors. Besides the various kinds of borehole radar survey, two surface geophysical methods, dipole-dipole resistivity and ground penetrating radar, were also applied to delineate the hidden parts of geological structures that had been confirmed by geological mapping. The reflection data processing package RADPRO ver. 2.2, developed continuously through this study, was used to process the borehole reflection radar data. New programs to process radar reflection data from the directional antenna were devised and used to calculate and image the orientation of reflectors. The major dip angles of fractured zones were determined from the radar reflection images. With the aid of the direction-finding antenna and the newly developed algorithm for imaging reflector orientation, it was possible to obtain the three-dimensional attitudes of reflectors. Detailed interpretation results for the surveyed area are included in this report. Through the interpretation of borehole reflection data acquired with the dipole and direction-finding antennas, we could determine the orientation of the major fractured zone, the boundary between the two mining areas. Many hidden inhomogeneities were found by the borehole radar methods. From the direction-finding antenna images, it was confirmed that nearly all of them were located outside the planned mining area or were situated very deep. Therefore, the surveyed area consists of very fresh and

  12. Signal processing methods for MFE plasma diagnostics

    International Nuclear Information System (INIS)

    Candy, J.V.; Casper, T.; Kane, R.

    1985-02-01

    The application of various signal processing methods to extract energy storage information from plasma diamagnetism sensor signals recorded during physics experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) is discussed. We show how these processing techniques can be used to decrease the uncertainty in the corresponding sensor measurements. The algorithms suggested are implemented using SIG, an interactive signal processing package developed at LLNL

  13. Survey and Restoration

    Science.gov (United States)

    Mileto, C.; Vegas, F.

    2017-05-01

    In addition to the technological evolution over the last two centuries, survey has experienced two main conceptual leaps: the introduction of photography as a tool for an indiscriminate register for reality, and the shift from autographic to allographic survey, phenomena which can generate a distancing effect within the restoration process. Besides, this text presents the relationship between survey in its numerous forms and technologies (manual and semi-manual to more complex ones like scanner-laser) and the restoration of the building, either for establishing a diagnosis, operating or valorizating, illustrating it with examples developed by the authors, as well as the criteria to be applied when documenting a building to be restored, irrespective of the means and technology available in each case.

  14. SURVEY AND RESTORATION

    Directory of Open Access Journals (Sweden)

    C. Mileto

    2017-05-01

    Full Text Available In addition to the technological evolution over the last two centuries, survey has experienced two main conceptual leaps: the introduction of photography as a tool for an indiscriminate register for reality, and the shift from autographic to allographic survey, phenomena which can generate a distancing effect within the restoration process. Besides, this text presents the relationship between survey in its numerous forms and technologies (manual and semi-manual to more complex ones like scanner-laser) and the restoration of the building, either for establishing a diagnosis, operating or valorizating, illustrating it with examples developed by the authors, as well as the criteria to be applied when documenting a building to be restored, irrespective of the means and technology available in each case.

  15. Unsupervised process monitoring and fault diagnosis with machine learning methods

    CERN Document Server

    Aldrich, Chris

    2013-01-01

    This unique text/reference describes in detail the latest advances in unsupervised process monitoring and fault diagnosis with machine learning methods. Abundant case studies throughout the text demonstrate the efficacy of each method in real-world settings. The broad coverage examines such cutting-edge topics as the use of information theory to enhance unsupervised learning in tree-based methods, the extension of kernel methods to multiple kernel learning for feature extraction from data, and the incremental training of multilayer perceptrons to construct deep architectures for enhanced data

  16. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR).

    Science.gov (United States)

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-06-01

    Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be
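
    The deliberate first-stage selection can be sketched in Python as a small combinatorial search for the set of clusters whose combined distribution of the chosen determinants best matches the city-wide distribution; the cluster figures below are invented.

        from itertools import combinations

        # Per-cluster (share of non-Lao nationals, share literate) -- invented values.
        clusters = {
            "A": (0.05, 0.95), "B": (0.30, 0.80), "C": (0.10, 0.90),
            "D": (0.25, 0.70), "E": (0.02, 0.98), "F": (0.15, 0.85),
        }
        population_target = (0.12, 0.88)   # city-wide shares (invented)
        k = 3                              # number of clusters to select

        def mismatch(combo):
            # Squared distance between the combo's mean shares and the population target.
            means = [sum(clusters[c][i] for c in combo) / k for i in range(2)]
            return sum((m - t) ** 2 for m, t in zip(means, population_target))

        best = min(combinations(clusters, k), key=mismatch)
        print("selected clusters:", best)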

  17. Assessing Commercial and Alternative Poultry Processing Methods using Microbiome Analyses

    Science.gov (United States)

    Assessing poultry processing methods/strategies has historically used culture-based methods to assess bacterial changes or reductions, both in terms of general microbial communities (e.g. total aerobic bacteria) or zoonotic pathogens of interest (e.g. Salmonella, Campylobacter). The advent of next ...

  18. Data Processing Procedures and Methodology for Estimating Trip Distances for the 1995 American Travel Survey (ATS)

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, H.-L.; Rollow, J.

    2000-05-01

    The 1995 American Travel Survey (ATS) collected information from approximately 80,000 U.S. households about their long distance travel (one-way trips of 100 miles or more) during the year of 1995. It is the most comprehensive survey of where, why, and how U.S. residents travel since 1977. ATS is a joint effort by the U.S. Department of Transportation (DOT) Bureau of Transportation Statistics (BTS) and the U.S. Department of Commerce Bureau of Census (Census); BTS provided the funding and supervision of the project, and Census selected the samples, conducted interviews, and processed the data. This report documents the technical support for the ATS provided by the Center for Transportation Analysis (CTA) in Oak Ridge National Laboratory (ORNL), which included the estimation of trip distances as well as data quality editing and checking of variables required for the distance calculations.

  19. Systematic methods for synthesis and design of sustainable chemical and biochemical processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Chemical and biochemical process design consists of designing the process that can sustainably manufacture an identified chemical product through a chemical or biochemical route. The chemical product tree is potentially very large; starting from a set of basic raw materials (such as petroleum...... for process intensification, sustainable process design, identification of optimal biorefinery models as well as integrated process-control design, and chemical product design. The lecture will present the main concepts, the decomposition based solution approach, the developed methods and tools together...

  20. Drug and Therapeutics (D & T) committees in Dutch hospitals : a nation-wide survey of structure, activities, and drug selection procedures

    NARCIS (Netherlands)

    Fijn, R; Brouwers, JRBJ; Knaap, RJ; De Jong-Van den Berg, LTW

    Aims To determine structure, activities and drug selection processes used by Dutch hospital drug and therapeutics (D & T) committees. Methods A pretested structured survey questionnaire based on the Australian process and impact indicators, previous research, and consultation of professionals was

  1. Intelligent methods for the process parameter determination of plastic injection molding

    Science.gov (United States)

    Gao, Huang; Zhang, Yun; Zhou, Xundao; Li, Dequn

    2018-03-01

    Injection molding is one of the most widely used material processing methods in producing plastic products with complex geometries and high precision. The determination of process parameters is important in obtaining qualified products and maintaining product quality. This article reviews the recent studies and developments of the intelligent methods applied in the process parameter determination of injection molding. These intelligent methods are classified into three categories: case-based reasoning methods, expert system-based methods, and data fitting and optimization methods. A framework of process parameter determination is proposed after comprehensive discussions. Finally, the conclusions and future research topics are discussed.

  2. Method for processing spent nuclear reactor fuel

    International Nuclear Information System (INIS)

    Levenson, M.; Zebroski, E.L.

    1981-01-01

    A method and apparatus are claimed for processing spent nuclear reactor fuel wherein plutonium is continuously contaminated with radioactive fission products and diluted with uranium. Plutonium of sufficient purity to fabricate nuclear weapons cannot be produced by the process or in the disclosed reprocessing plant. Diversion of plutonium is prevented by radiation hazards and ease of detection

  3. A new survey method of tsunami inundation area using chemical analysis of soil. Application to the field survey on the 2010 Chilean tsunami at Chile

    International Nuclear Information System (INIS)

    Yoshii, Takumi; Matsuyama, Masafumi; Koshimura, Shunichi; Mas, Erick; Matsuoka, Masashi; Jimenez, Cesar

    2011-01-01

    The severe earthquake of Mw 8.8 occurred on 27 Feb. 2010 in central Chile. The tsunami generated by the earthquake struck the coast of Chile and propagated across the Pacific Ocean. A field survey of the damage caused by the tsunami was conducted near Talcahuano, Chile, in preparation for the great tsunamis expected to accompany earthquakes predicted to occur near Japan within the next several decades. The aims of the survey were to document damage, especially to electrical equipment, and to develop a survey method based on chemical analysis of inundated soil, which supplies more objective and accurate data than conventional methods. In the survey area the average inundation height was 6 m, although it locally reached 25 m. The maximum sea level of the series of tsunami waves was recorded in the third or fourth wave (roughly 3 hours after the earthquake occurrence). The first floors of houses were severely destroyed, some ships were carried onto land and left there, and a large amount of sediment was deposited in towns. Removing the stranded ships and the tsunami deposits is an important consideration for quick recovery from a tsunami disaster. Soil samples were obtained from both inundated and non-inundated positions. A stirred solution of soil and ultrapure water was prepared, and the content of water-soluble ions, the electrical conductivity (EC), and the pH were measured. Soil from the tsunami-inundated area contains more water-soluble ions (Na+, Mg2+, Cl-, Br-, SO42-) than samples from the non-inundated area. A discriminant analysis of tsunami inundation was conducted using the amounts of ions in the soil. High discriminant accuracy (over 90%) was obtained with Na+, Mg2+, Cl-, Br-, SO42- and EC. Br-, Cl- and Na+ are believed to be suitable for the discriminant analysis about tsunamis considering the contaminant
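
    A minimal Python sketch of such a discriminant analysis is given below, using linear discriminant analysis on ion concentrations; the concentrations are invented and do not reproduce the survey's measurements.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Columns: Na+, Mg2+, Cl-, Br-, SO4(2-) in mg per kg of soil (hypothetical values).
        X = np.array([
            [850, 110, 1400, 4.0, 220],   # inundated
            [920, 130, 1600, 4.5, 260],   # inundated
            [780, 95, 1200, 3.2, 190],    # inundated
            [60, 20, 90, 0.2, 40],        # not inundated
            [45, 15, 70, 0.1, 30],        # not inundated
            [80, 25, 120, 0.3, 55],       # not inundated
        ])
        y = np.array([1, 1, 1, 0, 0, 0])  # 1 = tsunami-inundated

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("training accuracy:", lda.score(X, y))
        print("new sample classified as inundated?", bool(lda.predict([[500, 70, 900, 2.0, 150]])[0]))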

  4. A Survey on the Ship Loading Problem

    DEFF Research Database (Denmark)

    Iris, Cagatay; Pacino, Dario

    2015-01-01

    Recent statistics show that large container terminals can process more than 30 million containers a year, and are constantly in search for the better ways to optimize processing time, deliver high quality and profitable services. Some of the terminal decisions are, however, dependent...... are integrated to improve the efficiency of the ship handling operations. We present a survey of the state-of-the-art methods and of the available benchmarking data....

  5. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    Full Text Available The paper systematizes several theoretical viewpoints on the skill of scientific information processing and decomposes this skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction and document analysis, were used to build up a theoretical framework. Interviews and a survey of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  6. International Heart Valve Bank Survey: A Review of Processing Practices and Activity Outcomes

    Science.gov (United States)

    Albrecht, Helmi; Lim, Yeong Phang; Manning, Linda

    2013-01-01

    A survey of 24 international heart valve banks was conducted to acquire information on heart valve processing techniques used and outcomes achieved. The objective was to provide an overview of heart valve banking activities for tissue bankers, tissue banking associations, and regulatory bodies worldwide. Despite similarities found for basic manufacturing processes, distinct differences in procedural details were also identified. The similarities included (1) use of sterile culture media for procedures, (2) antibiotic decontamination, (3) use of dimethyl sulfoxide (DMSO) as a cryoprotectant, (4) controlled rate freezing for cryopreservation, and (5) storage at ultralow temperatures of below −135°C. Differences in procedures included (1) type of sterile media used, (2) antibiotics combination, (3) temperature and duration used for bioburden reduction, (4) concentration of DMSO used for cryopreservation, and (5) storage duration for released allografts. For most banks, the primary reasons why allografts failed to meet release criteria were positive microbiological culture and abnormal morphology. On average, 85% of allografts meeting release criteria were implanted, with valve size and type being the main reasons why released allografts were not used clinically. The wide variation in percentage of allografts meeting release requirements, despite undergoing validated manufacturing procedures, justifies the need for regular review of important outcomes as cited in this paper, in order to encourage comparison and improvements in the HVBs' processes. PMID:24163756

  7. A Method for Recruiting Participants from Isolated Islands of Small Island Developing States (SIDS) for Survey Research

    Science.gov (United States)

    Moosa, Sheena; Koopman-Boyden, Peggy

    2016-01-01

    Representing isolated small island communities through social survey research continues to be challenging. We examine a locally developed method to reach and recruit older people (65+ years) for a survey on well-being in the small island developing state of Maldives. The use of messengers to recruit participants is examined in the context of these…

  8. Non-filtration method of processing uranium ores

    International Nuclear Information System (INIS)

    Laskorin, B.N.; Vodolazov, L.I.; Tokarev, N.N.; Vyalkov, V.I.; Goldobina, V.A.; Gosudarstvennyj Komitet po Ispol'zovaniyu Atomnoj Ehnergii SSSR, Moscow)

    1977-01-01

    The development of the non-filtration sorption method has led to procedures for sorption leaching and extraction desorption, which have made it possible to intensify the processing of uranium ores and to greatly improve the technical and economic indexes by eliminating the complex method of multiple filtration and re-pulping of cakes. This method makes it possible to process poorer uranium raw materials while at the same time extracting valuable components such as molybdenum, vanadium, copper, etc. Considerable industrial experience has been acquired in the sorption of dense pulp with a solid-to-liquid phase ratio of 1:1. This has led to a 1.5-3.0-fold increase in plant production, an increase in uranium extraction of 5-10%, a two- to three-fold increase in the labour capacity of the main workers, and a several-fold decrease in the consumption of reagents, auxiliary materials, electric energy and vapour. This non-filtration method is a continuous process in all its phases thanks to the use of high-yield and high-power equipment for high-density pulps. (author)

  9. TYPE II-P SUPERNOVAE FROM THE SDSS-II SUPERNOVA SURVEY AND THE STANDARDIZED CANDLE METHOD

    International Nuclear Information System (INIS)

    D'Andrea, Chris B.; Sako, Masao; Dilday, Benjamin; Jha, Saurabh; Frieman, Joshua A.; Kessler, Richard; Holtzman, Jon; Konishi, Kohki; Yasuda, Naoki; Schneider, D. P.; Sollerman, Jesper; Wheeler, J. Craig; Cinabro, David; Nichol, Robert C.; Lampeitl, Hubert; Smith, Mathew; Atlee, David W.; Bassett, Bruce; Castander, Francisco J.; Goobar, Ariel

    2010-01-01

    We apply the Standardized Candle Method (SCM) for Type II Plateau supernovae (SNe II-P), which relates the velocity of the ejecta of a SN to its luminosity during the plateau, to 15 SNe II-P discovered over the three-season run of the Sloan Digital Sky Survey-II Supernova Survey. The redshifts of these SNe (0.027 and above) give this sample roughly as many SNe in the Hubble flow (z > 0.01) as all of the current literature on the SCM combined. We find that the SDSS SNe have a very small intrinsic I-band dispersion (0.22 mag), which can be attributed to selection effects. When the SCM is applied to the combined SDSS-plus-literature set of SNe II-P, the dispersion increases to 0.29 mag, larger than the scatter for either set of SNe separately. We show that the standardization cannot be further improved by eliminating SNe with positive plateau decline rates, as proposed in Poznanski et al. We thoroughly examine all potential systematic effects and conclude that for the SCM to be useful for cosmology, the methods currently used to determine the Fe II velocity at day 50 must be improved, and spectral templates able to encompass the intrinsic variations of Type II-P SNe will be needed.
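
    As a rough illustration of the SCM standardization described above, the sketch below computes a distance modulus from an apparent plateau magnitude, a (V - I) color and an Fe II velocity. The coefficient values (alpha, R_I, M_I0) and the example inputs are illustrative placeholders, not the values fitted in this survey.

```python
import numpy as np

def scm_distance_modulus(m_I, color_VI, v_fe_km_s,
                         alpha=4.4, R_I=0.8, M_I0=-17.5):
    """SCM-style distance modulus for a SN II-P.

    m_I        : apparent I-band magnitude on the plateau (around day 50)
    color_VI   : (V - I) color on the plateau
    v_fe_km_s  : Fe II line velocity on the plateau, in km/s
    alpha, R_I, M_I0 : illustrative coefficients (placeholders, not the
                       values fitted in the survey paper)
    """
    # The velocity term standardizes the plateau luminosity; the color term
    # corrects for dust extinction and intrinsic color differences.
    m_corr = m_I + alpha * np.log10(v_fe_km_s / 5000.0) - R_I * color_VI
    return m_corr - M_I0  # distance modulus mu = m_corr - M

# Example with invented numbers: a hypothetical SN with v_Fe = 4000 km/s
mu = scm_distance_modulus(m_I=20.1, color_VI=0.3, v_fe_km_s=4000.0)
print(f"mu = {mu:.2f} mag")
```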

  10. The Canadian Human Activity Pattern Survey: report of methods and population surveyed.

    Science.gov (United States)

    Leech, J A; Wilby, K; McMullen, E; Laporte, K

    1996-01-01

    The assessment of health risk due to environmental contaminants depends upon accurate estimates of the distribution of population exposures. Exposure assessment, in turn, requires information on the time people spend in micro-environments and their activities during periods of exposure. This paper describes preliminary results including study methodology and population sampled in a large Canadian survey of time-activity patterns. A 24-hour diary recall survey was performed in 2381 households (representing a 65% response rate) to describe in detail the timing, location and activity pattern of one household member (the adult or child with the next birthday). Four cities (Toronto, Vancouver, Edmonton and Saint John, NB) and their suburbs were sampled by random-digit dialling over a nine-month period in 1994/1995. Supplemental questionnaires inquiring about sociodemographic information, house and household characteristics and potential exposure to toxins in the air and water were also administered. In general, the results show that respondents spend the majority of their time indoors (88.6%) with smaller proportions of time outdoors (6.1%) and in vehicles (5.3%). Children under the age of 12 spend more time both indoors and outdoors and less time in transit than do adults. The data from this study will be used to define more accurately the exposure of Canadians to a variety of toxins in exposure assessment models and to improve upon the accuracy of risk assessment for a variety of acute and chronic health effects known or suspected to be related to environmental exposures.

  11. Methods for solving reasoning problems in abstract argumentation – A survey

    Science.gov (United States)

    Charwat, Günther; Dvořák, Wolfgang; Gaggl, Sarah A.; Wallner, Johannes P.; Woltran, Stefan

    2015-01-01

    Within the last decade, abstract argumentation has emerged as a central field in Artificial Intelligence. Besides providing a core formalism for many advanced argumentation systems, abstract argumentation has also served to capture several non-monotonic logics and other AI related principles. Although the idea of abstract argumentation is appealingly simple, several reasoning problems in this formalism exhibit high computational complexity. This calls for advanced techniques when it comes to implementation issues, a challenge which has been recently faced from different angles. In this survey, we give an overview on different methods for solving reasoning problems in abstract argumentation and compare their particular features. Moreover, we highlight available state-of-the-art systems for abstract argumentation, which put these methods to practice. PMID:25737590

  12. Spectrometric aerial survey as a new tool for geological survey and mining prospecting

    International Nuclear Information System (INIS)

    Cambon, R.

    1997-01-01

    Airborne surveying for radioactive minerals started around 1945. The limited sensitivity of the tools used, and the difficulties encountered in correcting for topographic and training effects, made evaluation of the results difficult. The technical progress made in recent years in electronics and computer science has overcome these difficulties and given the method its full potential. Aerial spectrometric surveying represents a further step, because this method can be used for purposes other than radioactive prospecting, such as geological surveying and mining prospecting for metallic and industrial minerals. The spectrometric method is based on measuring the photopeak energies (gamma radiation) emitted by radioactive minerals and discriminating among them those emitted by U-238, Tl-208 and K-40, respectively daughter products of uranium, thorium and potassium. For airborne surveys, it is considered that the measuring instruments pick up 80% of the radioactive emission from the first 15 to 30 centimetres of ground (1 metre maximum). The use of this method for geological and mineral exploration is based on the assumption that different rock types, or ore-bearing rock types, are composed of certain amounts of rock-forming minerals that contain specific quantities of radioactive elements such as potassium, uranium and thorium (cf. Gabelman 77). To evaluate the results of a spectrometric survey it is necessary to know roughly the behaviour of the different radioactive elements through a complete geological cycle. (author)

  13. How characterization and clearance process is planned to be optimized by combining MARSSIM methods with parametric statistics in decommissioning of Karolinska University Hospital in Stockholm

    International Nuclear Information System (INIS)

    Jiselmark, J.

    2017-01-01

    There are different standards for the characterization and clearance process used globally in the radiological industry. All of them have advantages and disadvantages. This paper describes a decommissioning project which combines two methods in order to use the advantages of both and minimize the disadvantages. In Sweden it has been standard for several years to use a method based on parametric Bayesian statistics for the characterization and clearance process. This method has great advantages close to the clearance limits due to few measurements per m², the ability to add extra measurements if needed, and the ability to reshape area units without restarting the clearance process. Since the method is based on units with a normal or log-normal distribution of the contamination, there can be several units far from the clearance limits. The American MARSSIM method uses non-parametric statistics instead of parametric statistics. In comparison to the Bayesian methods, this results in the disadvantage of less accuracy close to the clearance limits, but also in the great advantage of few units far from the clearance limits. In the characterization and clearance process of old radiological facilities at the Karolinska University Hospital in Stockholm, the MARSSIM method is combined with the Bayesian statistics method to minimize the number of measurements and, by that, the cost of clearance. By using Bayesian statistics close to the clearance limits, more areas will be approved for clearance and the risk of having to redo the survey is minimized. By using MARSSIM methods in areas with an assumed contamination below 25% of the clearance limits, the areas do not need to be divided into units with normally or log-normally distributed activity. Bigger areas can be handled as units, which results in fewer measurements and a faster process. (authors)

  14. The process and organizational characteristics of memory clinics in Israel in 2007.

    Science.gov (United States)

    Werner, Perla; Goldstein, Dovrat; Heinik, Jeremia

    2009-01-01

    We previously described the characteristics and activities of 25 memory clinics in Israel in 1998 using a mail survey. Questionnaires assessing the administrative structure of the clinics, patient characteristics, processes and methods used, and outcomes of the assessment were mailed again in 2007 to 35 memory clinics. Overall, the general operating characteristics of the clinics in 2007 were found to be similar to those reported in the previous survey conducted in 1998. The assessment process in 2007 was shorter than in 1998 (mean time=1.92 and 3.12 h, respectively), although both surveys were based on an interdisciplinary team, including a physician, a nurse and a social worker. However, in 2007 the teams were more wide-ranging. A wider variety of instruments were reported in the more recent survey. Most of the clinics in both surveys reported that family members were involved at all stages of the assessment. Medication treatment was the main outcome reported by the clinics in both surveys. There has been a development in the process and organizational characteristics of memory clinics in Israel over the years, probably as a consequence of the development of knowledge in the area of cognitive deterioration.

  15. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to variations in the geometry of the part, in material properties, or to a lack of knowledge about the phenomena being modeled. Deterministic design optimization does not take uncertainty into account, and worst-case assumptions lead to vastly over-conservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions under the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure of optimization and iterative probabilistic assessment, which results in high computational demand. This demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the Sequential Optimization and Reliability Assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment, decoupled within each cycle. This leads to quick improvement of the design from one cycle to the next and increases computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations.
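
    The following sketch illustrates the single-loop SORA cycle on a deliberately simple, hypothetical limit-state function (not the sheet-metal flanging model of the paper): each cycle runs a deterministic optimization with the constraint evaluated at a shifted point, then an inverse reliability assessment that updates the shift for the next cycle.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem: minimize cost d1 + d2 subject to the requirement that
# g(X) = X1^2 * X2 / 20 - 1 stays non-negative with reliability index
# beta >= 2, where X_i ~ N(d_i, sigma_i). All numbers are invented.
sigma = np.array([0.3, 0.3])
beta_target = 2.0

def cost(d):
    return d[0] + d[1]

def g(x):
    return x[0] ** 2 * x[1] / 20.0 - 1.0

def worst_case_u(d):
    """Inverse reliability assessment: the direction in standard-normal space,
    at the target reliability index, where the limit state g is lowest."""
    res = minimize(lambda u: g(d + sigma * u), np.zeros(2),
                   constraints={"type": "eq",
                                "fun": lambda u: u @ u - beta_target ** 2})
    return res.x

d = np.array([4.0, 4.0])          # initial design (means of X)
shift = np.zeros(2)               # offset toward failure, updated each cycle
for cycle in range(5):
    # 1) deterministic optimization with the limit state evaluated at the
    #    shifted point rather than at the mean (decoupled from reliability)
    d = minimize(cost, d,
                 constraints={"type": "ineq",
                              "fun": lambda dd: g(dd + shift)}).x
    # 2) reliability assessment at the new design; the worst-case direction
    #    at beta_target becomes the shift for the next cycle
    shift = sigma * worst_case_u(d)
    print(f"cycle {cycle}: d = {d.round(3)}, "
          f"g at target reliability = {g(d + shift):.3f}")
```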

  16. Data Processing And Machine Learning Methods For Multi-Modal Operator State Classification Systems

    Science.gov (United States)

    Hearn, Tristan A.

    2015-01-01

    This document is intended as an introduction to a set of common signal processing learning methods that may be used in the software portion of a functional crew state monitoring system. This includes overviews of both the theory of the methods involved, as well as examples of implementation. Practical considerations are discussed for implementing modular, flexible, and scalable processing and classification software for a multi-modal, multi-channel monitoring system. Example source code is also given for all of the discussed processing and classification methods.

  17. Service Quality Evaluation of Restaurants Using The Ahp And Topsis Method

    OpenAIRE

    YILDIZ, Salih; YILDIZ, Emel

    2015-01-01

    The purpose of this study is to determine the factors affecting service quality at restaurants that operate in the service industry. Three restaurants operating in Trabzon were analyzed in terms of the quality of service they provided. In the study, simple random sampling was used to conduct 300 face-to-face interviews, of which 30 surveys were deemed invalid and eliminated, resulting in a total of 270 surveys being evaluated. The Analytical Hierarchy Process (AHP) method was used t...
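
    As an illustration of the TOPSIS ranking step mentioned above, the sketch below ranks three hypothetical restaurants on four service-quality criteria; the scores and the (notionally AHP-derived) weights are invented for the example, not taken from the study.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    decision_matrix : (alternatives x criteria) raw scores
    weights         : criteria weights (e.g. derived from AHP), summing to 1
    benefit         : True for benefit criteria, False for cost criteria
    """
    X = np.asarray(decision_matrix, dtype=float)
    # 1) vector-normalize each criterion column and apply the weights
    V = weights * X / np.linalg.norm(X, axis=0)
    # 2) ideal and anti-ideal solutions per criterion
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 3) distances to ideal / anti-ideal and the closeness coefficient
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)

# Hypothetical example: 3 restaurants scored on 4 service-quality criteria
scores = [[7, 8, 6, 9],
          [8, 6, 7, 7],
          [6, 9, 8, 6]]
weights = np.array([0.40, 0.25, 0.20, 0.15])   # e.g. AHP-derived weights
benefit = np.array([True, True, True, True])
closeness = topsis(scores, weights, benefit)
print("ranking (best first):", np.argsort(-closeness) + 1)
```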

  18. The application of integrated geophysical methods composed of AMT and high-precision ground magnetic survey to the exploration of granite uranium deposits

    International Nuclear Information System (INIS)

    Qiao Yong; Shen Jingbang; Wu Yong; Wang Zexia

    2014-01-01

    Two methods, AMT and high-precision ground magnetic surveying, were used in the exploration of granite-hosted uranium deposits in the Yin gongshan area in the middle part of Nei Monggol. Based on methodological experiments and analysis of the application results, we consider that AMT has good vertical resolution and can reliably survey the thickness of rock masses, the position and depth of fractures, and the spatial distribution of fracture zones, but its response to rock masses and xenoliths is not clear. High-precision ground magnetic surveying can delineate the distribution of rock masses and xenoliths and identify rock contact zones and fractures, but it generally only locates them in plan and does not clearly resolve their occurrence and extension. Some geological structures can be resolved by using the integrated methods and sharing their complementary advantages. Effective technical measures are thus provided for the exploration of deeply buried uranium bodies in granite-hosted uranium deposits and for extension of deposits to their outskirts. (authors)

  19. Improved methods to deduct trip legs and mode from travel surveys using wearable GPS devices: A case study from the Greater Copenhagen area

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Kjær; Ingvardson, Jesper Bláfoss; Halldórsdóttir, Katrín

    2015-01-01

    GPS data collection has become an important means of investigating travel behaviour. This is because such data ideally provide far more detailed information on route choice and travel patterns over a longer time period than possible from traditional travel survey methods. Wearing a GPS unit...... the specification of the model parameters and thresholds. The method thus makes it possible to use GPS for travel surveys in large-scale multi-modal networks....

  20. Collaborative simulation method with spatiotemporal synchronization process control

    Science.gov (United States)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronic system, such as a high-speed train, it is relatively difficult to effectively simulate the entire system's dynamic behavior because it involves multi-disciplinary subsystems. Currently, the most practical approach for multi-disciplinary simulation is interface-based coupling simulation, but it faces a twofold challenge: spatial and temporal unsynchronization among the multi-directional coupling simulations of the subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling simulation of a given complex mechatronic system across multiple subsystems on different platforms. The method consists of (1) a coupler-based coupling mechanism to define the interfacing and interaction among subsystems, and (2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method (1) can be used to simulate subsystem interactions under different simulation conditions in an engineering system, and (2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high-speed train design and development processes, demonstrating that it can be applied to a wide range of engineering systems design and simulation with improved efficiency and effectiveness.

  1. CESAR cost-efficient methods and processes for safety-relevant embedded systems

    CERN Document Server

    Wahl, Thomas

    2013-01-01

    The book summarizes the findings and contributions of the European ARTEMIS project CESAR for improving and enabling interoperability of methods, tools, and processes to meet the demands of embedded systems development across four domains: avionics, automotive, automation, and rail. The contributions give insight into an improved engineering and safety process life-cycle for the development of safety-critical systems. They present a new concept of an engineering tool integration platform to improve the development of safety-critical embedded systems and illustrate the capacity of this framework for end-user instantiation to specific domain needs and processes. They also advance the state of the art in component-based development as well as component and system validation and verification, with tool support. Finally, they describe industry-relevant evaluated processes and methods especially designed for the embedded systems sector, as well as easily adoptable common interoperability principles for software tool integratio...

  2. Methods for computing water-quality loads at sites in the U.S. Geological Survey National Water Quality Network

    Science.gov (United States)

    Lee, Casey J.; Murphy, Jennifer C.; Crawford, Charles G.; Deacon, Jeffrey R.

    2017-10-24

    The U.S. Geological Survey publishes information on concentrations and loads of water-quality constituents at 111 sites across the United States as part of the U.S. Geological Survey National Water Quality Network (NWQN). This report details historical and updated methods for computing water-quality loads at NWQN sites. The primary updates to historical load estimation methods include (1) an adaptation to methods for computing loads to the Gulf of Mexico; (2) the inclusion of loads computed using the Weighted Regressions on Time, Discharge, and Season (WRTDS) method; and (3) the inclusion of loads computed using continuous water-quality data. Loads computed using WRTDS and continuous water-quality data are provided along with those computed using historical methods. Various aspects of method updates are evaluated in this report to help users of water-quality loading data determine which estimation methods best suit their particular application.
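
    As a simplified illustration of regression-based load computation (a rating-curve style approach, not the WRTDS method itself), the sketch below fits log concentration to log discharge plus seasonal terms from sparse samples, predicts daily concentrations, and sums the daily loads. All data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(365)                                            # day of year
Q = 50 + 40 * np.sin(2 * np.pi * (t - 60) / 365) + rng.gamma(2, 5, 365)  # discharge, m3/s

# Pretend concentration was only sampled on ~20 days of the year
sample_days = rng.choice(t, size=20, replace=False)
true_C = 2.0 * (Q / 50) ** 0.4 * (1 + 0.2 * np.sin(2 * np.pi * t / 365))
C_obs = true_C[sample_days] * rng.lognormal(0, 0.1, 20)       # mg/L

def design(days):
    """Regression design matrix: intercept, ln(Q), seasonal sine/cosine."""
    return np.column_stack([np.ones_like(days, dtype=float),
                            np.log(Q[days]),
                            np.sin(2 * np.pi * days / 365),
                            np.cos(2 * np.pi * days / 365)])

beta, *_ = np.linalg.lstsq(design(sample_days), np.log(C_obs), rcond=None)
C_hat = np.exp(design(t) @ beta)   # daily concentration estimate, mg/L
                                   # (ignores the retransformation bias correction)

# mg/L x m3/s equals g/s numerically; convert to tonnes/day and sum
daily_load_t = C_hat * Q * 86400 / 1e6
print(f"estimated annual load ≈ {daily_load_t.sum():.1f} tonnes")
```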

  3. Use of deterministic methods in survey calculations for criticality problems

    International Nuclear Information System (INIS)

    Hutton, J.L.; Phenix, J.; Course, A.F.

    1991-01-01

    A code package using deterministic methods for solving the Boltzmann transport equation is the WIMS suite. This has been very successful for a range of situations. In particular it has been used with great success to analyse trends in reactivity with a range of changes in state. The WIMS suite of codes has a range of methods and is very flexible in the way they can be combined. A wide variety of situations can be modelled, ranging through all the current thermal reactor variants to storage systems and items of chemical plant. These methods have recently been enhanced by the introduction of the CACTUS method. This is based on a characteristics technique for solving the transport equation and has the advantage that complex geometrical situations can be treated. In this paper the basis of the method is outlined and examples of its use are illustrated. In parallel with these developments, the validation for out-of-pile situations has been extended to include experiments with relevance to criticality situations. The paper summarises this evidence and shows how these results point to a partial re-adoption of deterministic methods for some areas of criticality. The paper also presents results to illustrate the use of WIMS in criticality situations and in particular shows how it can complement codes such as MONK when used for surveying the reactivity effect due to changes in geometry or materials. (Author)

  4. Method for qualification of cementation processes and its application to a vibration mixer

    International Nuclear Information System (INIS)

    Vicente, R.; Rzyski, B.M.; Suarez, A.A.

    1987-01-01

    In this paper the definition of homogeneity is discussed and methods to measure the 'degree of heterogeneity' of waste forms are proposed. These measurements are important as aids for mixing process qualification, and as tools in quality assurance procedures and in the development of waste management standards. Homogeneity is a basic quality requirement for waste forms to be accepted at final disposal sites. It does not depend on the immobilization matrix; rather, it is one means of qualifying the immobilization process. The proposed methods were applied to a vibration-assisted mixing process and have proved to be a useful means of judging process improvements. There are many conceivable methods to evaluate the homogeneity of waste forms. Some were selected as screening tests aimed at quickly reaching a promising set of process variables. Others were selected to evaluate the degree of excellence of the process with respect to product quality. The envisaged methods were: visual inspection, the use of cement dye as a tracer, scanning of radioactive tracers, and measurements of variations of density, water absorption, porosity and mechanical strength across the waste form sample. The process variables were: waste-cement and water-cement ratios, mixer geometry, mixing time and vibration intensity. Some of the apparatus details were changed during the experimental work in order to improve product quality. Experimental methods and results were statistically analysed and compared with data obtained from samples prepared with a planetary paddle mixer, which were adopted as the homogeneity standard. (Author)

  5. Methods for the Evaluation of Waste Treatment Processes

    Directory of Open Access Journals (Sweden)

    Hans-Joachim Gehrmann

    2017-01-01

    Full Text Available Decision makers for waste management are confronted with the problem of selecting the most economical, environmentally sound, and socially acceptable waste treatment process. This paper elucidates evaluation methods for waste treatment processes for the comparison of ecological and economic aspects, such as material flow analysis, statistical entropy analysis, energetic and exergetic assessment, cumulative energy demand, and life cycle assessment. The work is based on the VDI guideline 3925. A comparison of two thermal waste treatment plants with different process designs and energy recovery systems was performed with the described evaluation methods. The results are mainly influenced by the type of energy recovery, where the waste-to-energy plant providing district heat and process steam emerged as beneficial in most aspects. Material recovery options from waste incineration were evaluated according to sustainability targets, such as saving of resources and environmental protection.

  6. A Survey of Rabbit Handling Methods Within the United Kingdom and the Republic of Ireland.

    Science.gov (United States)

    Oxley, James Andrew; Ellis, Clare Frances; McBride, E Anne; McCormick, Wanda Denise

    2018-04-25

    Rabbits are commonly kept in a variety of settings, including homes, laboratories, and veterinary clinics. Despite the popularity of keeping this prey species, little research has investigated current methods of handling. The aim of this study was to examine the experience of caregivers (owners and keepers) in using five handling methods commonly referred to in books written for companion animal (pet) owners and veterinary and/or laboratory personnel. An online survey was completed by 2644 respondents, representing all three of these groups, and breeders. Data were acquired to determine sources that participants used to gain knowledge of different handling methods, the methods they used and for what purposes they used them, and their perceptions of any associated difficulties or welfare concerns. Results indicated that participants most frequently used the method of supporting a rabbit's body against a person's chest, which was considered the easiest and most welfare-friendly method of the handling methods explored. "Scruffing with rear support" was the least used method and was considered to be distressing and painful for the rabbit. As rabbits are a terrestrial prey species, being picked up is likely an innately stressful experience. Additional research is encouraged to explore the experience of rabbits during handling to identify methods that can be easily used with the fewest welfare compromises.

  7. Measuring methods, registration and signal processing for magnetic field research

    International Nuclear Information System (INIS)

    Nagiello, Z.

    1981-01-01

    Some measuring methods and signal processing systems based on analogue and digital techniques, which have been applied in magnetic field research using magnetometers with ferromagnetic transducers, are presented. (author)

  8. Statistical problems raised by data processing of food surveys; Problemes statistiques poses par le depouillement d'enquetes alimentaires

    Energy Technology Data Exchange (ETDEWEB)

    Lacourly, Nancy

    1974-02-08

    The methods used for the analysis of dietary habits of national populations - food surveys - have been studied. S. Lederman's linear model for the estimation of average individual consumptions from total family diets was criticised in the light of a food survey carried out with 250 Roman families in 1969. An important bias in the estimates thus obtained was shown by a simulation assuming 'housewife's dictatorship'; these assumptions should contribute to setting up an unbiased model. Several techniques of multidimensional analysis were therefore used, and the theoretical aspects of linear regression had to be investigated for some particular situations: quasi-colinear 'independent variables', measurements with errors, and positive constraints on regression coefficients. A new survey methodology was developed taking account of the new 'Integrated Information Systems', which have an incidence on all the stages of a consumption survey: organization, data collection, constitution of an information bank and data processing. (author) [French abstract, translated] After a review of the fundamental concepts and definitions in the fields of human nutrition and food, the available methods for analysing the dietary behaviour of populations at the national level - food consumption surveys - were examined in depth. The linear model proposed by S. Lederman for estimating average individual consumption from the total consumption of families is subjected to a critique based on a food survey carried out in Rome in 1969 among 250 households. A simulation, whose assumptions are based on the 'housewife's dictatorship', revealed an important bias in the estimates obtained with this model; these assumptions should serve to establish an unbiased model. Various multidimensional analysis techniques were used, and the theoretical aspects of linear regression had to be examined in depth for some...
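
    A minimal sketch of the household-to-individual linear-model idea discussed above: household totals are regressed on household composition with non-negativity constraints on the coefficients, giving estimated mean individual consumptions per demographic category. The data and categories are invented, and the specification is not Lederman's exact model.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Each household's total consumption of a food is modelled as the sum, over
# demographic categories (men, women, children), of the number of members in
# that category times the category's mean individual consumption. The category
# means are estimated by least squares with non-negativity constraints.
rng = np.random.default_rng(1)
n_households = 250
composition = rng.integers(0, 4, size=(n_households, 3))   # men, women, children
true_means = np.array([180.0, 140.0, 90.0])                # g/day per person (invented)
totals = composition @ true_means + rng.normal(0, 30, n_households)

fit = lsq_linear(composition.astype(float), totals, bounds=(0, np.inf))
print("estimated mean individual consumptions (g/day):", fit.x.round(1))
```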

  9. Duality of Ross Ice Shelf systems: crustal boundary, ice sheet processes and ocean circulation from ROSETTA-Ice surveys

    Science.gov (United States)

    Tinto, K. J.; Siddoway, C. S.; Padman, L.; Fricker, H. A.; Das, I.; Porter, D. F.; Springer, S. R.; Siegfried, M. R.; Caratori Tontini, F.; Bell, R. E.

    2017-12-01

    Bathymetry beneath Antarctic ice shelves controls sub-ice-shelf ocean circulation and has a major influence on the stability and dynamics of the ice sheets. Beneath the Ross Ice Shelf, the sea-floor bathymetry is a product of both tectonics and glacial processes, and is influenced by the processes it controls. New aerogeophysical surveys have revealed a fundamental crustal boundary bisecting the Ross Ice Shelf and imparting a duality to the Ross Ice Shelf systems, encompassing bathymetry, ocean circulation and ice flow history. The ROSETTA-Ice surveys were designed to increase the resolution of Ross Ice Shelf mapping from the 55 km RIGGS survey of the 1970s to a 10 km survey grid, flown over three years from New York Air National Guard LC130s. Radar, LiDAR, gravity and magnetic instruments provide a top to bottom profile of the ice shelf and the underlying seafloor, with 20 km resolution achieved in the first two survey seasons (2015 and 2016). ALAMO ocean-profiling floats deployed in the 2016 season are measuring the temperature and salinity of water entering and exiting the sub-ice water cavity. A significant east-west contrast in the character of the magnetic and gravity fields reveals that the lithospheric boundary between East and West Antarctica exists not at the base of the Transantarctic Mountains (TAM), as previously thought, but 300 km further east. The newly-identified boundary spatially coincides with the southward extension of the Central High, a rib of shallow basement identified in the Ross Sea. The East Antarctic side is characterized by lower amplitude magnetic anomalies and denser TAM-type lithosphere compared to the West Antarctic side. The crustal structure imparts a fundamental duality on the overlying ice and ocean, with deeper bathymetry and thinner ice on the East Antarctic side creating a larger sub-ice cavity for ocean circulation. The West Antarctic side has a shallower seabed, more restricted ocean access and a more complex history of

  10. Mathematical methods for diffusion MRI processing

    International Nuclear Information System (INIS)

    Lenglet, C.; Lenglet, C.; Sapiro, G.; Campbell, J.S.W.; Pike, G.B.; Campbell, J.S.W.; Siddiqi, K.; Descoteaux, M.; Haro, G.; Wassermann, D.; Deriche, R.; Wassermann, D.; Anwander, A.; Thompson, P.M.

    2009-01-01

    In this article, we review recent mathematical models and computational methods for the processing of diffusion Magnetic Resonance Images, including state-of-the-art reconstruction of diffusion models, cerebral white matter connectivity analysis, and segmentation techniques. We focus on Diffusion Tensor Images (DTI) and Q-Ball Images (QBI). (authors)
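
    As an illustration of the diffusion-model reconstruction mentioned above, the sketch below performs a basic log-linear least-squares fit of a diffusion tensor from synthetic diffusion-weighted signals and derives mean diffusivity and fractional anisotropy; it omits the noise handling, masking, and weighted or robust fitting used in real DTI pipelines.

```python
import numpy as np

# Model: S_i = S0 * exp(-b * g_i^T D g_i) for gradient directions g_i.
rng = np.random.default_rng(2)
b = 1000.0                                           # s/mm^2
gradients = rng.normal(size=(30, 3))
gradients /= np.linalg.norm(gradients, axis=1, keepdims=True)

D_true = np.diag([1.7e-3, 0.4e-3, 0.3e-3])           # mm^2/s, anisotropic tensor
S0 = 1000.0
signals = S0 * np.exp(-b * np.einsum("ij,jk,ik->i", gradients, D_true, gradients))

# Design matrix for the 6 unique tensor elements plus ln(S0)
gx, gy, gz = gradients.T
B = -b * np.column_stack([gx**2, gy**2, gz**2, 2*gx*gy, 2*gx*gz, 2*gy*gz])
A = np.column_stack([B, np.ones(len(gradients))])
coef, *_ = np.linalg.lstsq(A, np.log(signals), rcond=None)

Dxx, Dyy, Dzz, Dxy, Dxz, Dyz, ln_S0 = coef
D_fit = np.array([[Dxx, Dxy, Dxz], [Dxy, Dyy, Dyz], [Dxz, Dyz, Dzz]])
evals = np.linalg.eigvalsh(D_fit)
md = evals.mean()                                    # mean diffusivity
fa = np.sqrt(1.5 * np.sum((evals - md) ** 2) / np.sum(evals ** 2))
print(f"mean diffusivity ≈ {md:.2e} mm^2/s, FA ≈ {fa:.2f}")
```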

  11. Using hidden Markov models to deal with availability bias on line transect surveys.

    Science.gov (United States)

    Borchers, D L; Zucchini, W; Heide-Jørgensen, M P; Cañadas, A; Langrock, R

    2013-09-01

    We develop estimators for line transect surveys of animals that are stochastically unavailable for detection while within detection range. The detection process is formulated as a hidden Markov model with a binary state-dependent observation model that depends on both perpendicular and forward distances. This provides a parametric method of dealing with availability bias when estimates of availability process parameters are available even if series of availability events themselves are not. We apply the estimators to an aerial and a shipboard survey of whales, and investigate their properties by simulation. They are shown to be more general and more flexible than existing estimators based on parametric models of the availability process. We also find that methods using availability correction factors can be very biased when surveys are not close to being instantaneous, as can estimators that assume temporal independence in availability when there is temporal dependence. © 2013, The International Biometric Society.
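
    A minimal sketch of the hidden-Markov idea described above, with invented parameters: availability follows a two-state Markov chain, detection is only possible while available, and a forward-style recursion accumulates the probability of detection as the observer approaches.

```python
import numpy as np

# States: 0 = unavailable (e.g. diving), 1 = available (e.g. at the surface).
trans = np.array([[0.90, 0.10],      # unavailable -> (unavailable, available)
                  [0.30, 0.70]])     # available   -> (unavailable, available)
stationary = np.array([0.75, 0.25])  # long-run proportion of time in each state

def p_detect(perp_dist, n_steps=40, sigma=200.0, step=25.0):
    """P(detected at least once) as the observer closes from ahead.

    phi holds the joint probability of the current state and of not yet
    having been detected; at each forward-distance step the state transitions
    and, if available, the animal survives detection only if it is missed.
    """
    phi = stationary.copy()
    for k in range(n_steps, 0, -1):
        forward = k * step
        r = np.hypot(perp_dist, forward)
        p_k = np.exp(-r ** 2 / (2 * sigma ** 2))   # detection prob. if available
        phi = phi @ trans                          # availability transition
        phi[1] *= (1 - p_k)                        # missed while available
    return 1 - phi.sum()

print({x: round(p_detect(x), 3) for x in (0, 100, 200, 400)})
```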

  12. Study on Processing Method of Image Shadow

    Directory of Open Access Journals (Sweden)

    Wang Bo

    2014-07-01

    Full Text Available In order to effectively remove the disturbance caused by shadows and enhance the robustness of computer vision image processing, this paper studies the detection and removal of image shadows. It examines shadow removal algorithms based on integration, on the illumination surface, and on texture, introduces their working principles and implementation methods, and shows through tests that shadows can be processed effectively.

  13. Maintenance Approaches for Different Production Methods

    Directory of Open Access Journals (Sweden)

    Mungani, Dzivhuluwani Simon

    2013-11-01

    Full Text Available Various production methods are used in industry to manufacture or produce a variety of products needed by industry and consumers. The nature of a product determines which production method is most suitable or cost-effective. A continuous process is typically used to produce large volumes of liquids or gases. Batch processing is often used for small volumes, such as pharmaceutical products. This paper discusses a research project to determine the relationship between maintenance approaches and production methods. A survey was done to determine to what extent three maintenance approaches, namely reliability-centred maintenance (RCM), total productive maintenance (TPM), and business-centred maintenance (BCM), are used for three different processing methods (continuous process, batch process, and a production line method).

  14. Rethinking Traditional Methods of Survey Validation

    Science.gov (United States)

    Maul, Andrew

    2017-01-01

    It is commonly believed that self-report, survey-based instruments can be used to measure a wide range of psychological attributes, such as self-control, growth mindsets, and grit. Increasingly, such instruments are being used not only for basic research but also for supporting decisions regarding educational policy and accountability. The…

  15. Technical errors in complete mouth radiographic survey according to radiographic techniques and film holding methods

    International Nuclear Information System (INIS)

    Choi, Karp Sik; Byun, Chong Soo; Choi, Soon Chul

    1986-01-01

    The purpose of this study was to investigate the number and causes of retakes in 300 complete mouth radiographic surveys made by 75 senior dental students. According to radiographic technique and film-holding method, they were divided into 4 groups: Group I: bisecting-angle technique with the patient's fingers. Group II: bisecting-angle technique with the Rinn Snap-A-Ray device. Group III: bisecting-angle technique with the Rinn XCP instrument (short cone). Group IV: bisecting-angle technique with the Rinn XCP instrument (long cone). The most frequent causes of retakes, the tooth areas most frequently retaken, and the average number of retakes per complete mouth survey were evaluated. The results obtained were as follows: Group I: incorrect film placement (47.8%), upper canine region, and 0.89. Group II: incorrect film placement (44.0%), upper canine region, and 1.12. Group III: incorrect film placement (79.2%), upper canine region, and 2.05. Group IV: incorrect film placement (67.7%), upper canine region, and 1.69.

  16. A Protocol for Advanced Psychometric Assessment of Surveys

    Science.gov (United States)

    Squires, Janet E.; Hayduk, Leslie; Hutchinson, Alison M.; Cranley, Lisa A.; Gierl, Mark; Cummings, Greta G.; Norton, Peter G.; Estabrooks, Carole A.

    2013-01-01

    Background and Purpose. In this paper, we present a protocol for advanced psychometric assessments of surveys based on the Standards for Educational and Psychological Testing. We use the Alberta Context Tool (ACT) as an exemplar survey to which this protocol can be applied. Methods. Data mapping, acceptability, reliability, and validity are addressed. Acceptability is assessed with missing data frequencies and the time required to complete the survey. Reliability is assessed with internal consistency coefficients and information functions. A unitary approach to validity consisting of accumulating evidence based on instrument content, response processes, internal structure, and relations to other variables is taken. We also address assessing performance of survey data when aggregated to higher levels (e.g., nursing unit). Discussion. In this paper we present a protocol for advanced psychometric assessment of survey data using the Alberta Context Tool (ACT) as an exemplar survey; application of the protocol to the ACT survey is underway. Psychometric assessment of any survey is essential to obtaining reliable and valid research findings. This protocol can be adapted for use with any nursing survey. PMID:23401759
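
    As an illustration of the internal-consistency step in such a protocol, the sketch below computes Cronbach's alpha for a hypothetical four-item Likert subscale; the data are simulated, not ACT responses.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of scale totals
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses for a 4-item subscale, driven by one
# latent factor so the items are internally consistent by construction.
rng = np.random.default_rng(3)
latent = rng.normal(size=200)
responses = np.clip(np.round(3 + latent[:, None] + rng.normal(0, 0.8, (200, 4))), 1, 5)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```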

  17. Process qualification and control in electron beams--requirements, methods, new concepts and challenges

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Gratzl, F.; Hanis, D.

    2004-01-01

    In this paper the status of process qualification and control in electron beam irradiation is analyzed in terms of requirements, concepts, methods and challenges for a state-of-the-art process control concept for medical device sterilization. Aspects from process qualification to routine process control are described together with the associated process variables. As a case study, the 10 MeV beams at Mediscan GmbH are considered. Process control concepts such as statistical process control (SPC) and a new concept for determining process capability are briefly discussed.
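
    As an illustration of the SPC and process-capability concepts mentioned above (with invented dose readings and specification limits, not Mediscan's actual process parameters), the sketch below derives individuals-chart control limits from the average moving range and computes Cp and Cpk.

```python
import numpy as np

rng = np.random.default_rng(4)
dose = rng.normal(27.0, 0.6, 50)          # kGy, hypothetical routine readings

mean = dose.mean()
moving_range = np.abs(np.diff(dose)).mean()
sigma_hat = moving_range / 1.128           # d2 constant for subgroups of size 2
ucl, lcl = mean + 3 * sigma_hat, mean - 3 * sigma_hat

lsl, usl = 25.0, 40.0                      # hypothetical specification limits (kGy)
cp = (usl - lsl) / (6 * sigma_hat)
cpk = min(usl - mean, mean - lsl) / (3 * sigma_hat)

print(f"control limits: [{lcl:.2f}, {ucl:.2f}] kGy")
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
print("out-of-control points:", np.where((dose > ucl) | (dose < lcl))[0].tolist())
```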

  18. Potential Theory Surveys and Problems

    CERN Document Server

    Lukeš, Jaroslav; Netuka, Ivan; Veselý, Jiří

    1988-01-01

    The volume comprises eleven survey papers based on survey lectures delivered at the Conference in Prague in July 1987, which covered various facets of potential theory, including its applications in other areas. The survey papers deal with both classical and abstract potential theory and its relations to partial differential equations, stochastic processes and other branches such as numerical analysis and topology. A collection of problems from potential theory, compiled on the occasion of the conference, is included, with additional commentaries, in the second part of this volume.

  19. The Hyper Suprime-Cam SSP Survey: Overview and survey design

    Science.gov (United States)

    Aihara, Hiroaki; Arimoto, Nobuo; Armstrong, Robert; Arnouts, Stéphane; Bahcall, Neta A.; Bickerton, Steven; Bosch, James; Bundy, Kevin; Capak, Peter L.; Chan, James H. H.; Chiba, Masashi; Coupon, Jean; Egami, Eiichi; Enoki, Motohiro; Finet, Francois; Fujimori, Hiroki; Fujimoto, Seiji; Furusawa, Hisanori; Furusawa, Junko; Goto, Tomotsugu; Goulding, Andy; Greco, Johnny P.; Greene, Jenny E.; Gunn, James E.; Hamana, Takashi; Harikane, Yuichi; Hashimoto, Yasuhiro; Hattori, Takashi; Hayashi, Masao; Hayashi, Yusuke; Hełminiak, Krzysztof G.; Higuchi, Ryo; Hikage, Chiaki; Ho, Paul T. P.; Hsieh, Bau-Ching; Huang, Kuiyun; Huang, Song; Ikeda, Hiroyuki; Imanishi, Masatoshi; Inoue, Akio K.; Iwasawa, Kazushi; Iwata, Ikuru; Jaelani, Anton T.; Jian, Hung-Yu; Kamata, Yukiko; Karoji, Hiroshi; Kashikawa, Nobunari; Katayama, Nobuhiko; Kawanomoto, Satoshi; Kayo, Issha; Koda, Jin; Koike, Michitaro; Kojima, Takashi; Komiyama, Yutaka; Konno, Akira; Koshida, Shintaro; Koyama, Yusei; Kusakabe, Haruka; Leauthaud, Alexie; Lee, Chien-Hsiu; Lin, Lihwai; Lin, Yen-Ting; Lupton, Robert H.; Mandelbaum, Rachel; Matsuoka, Yoshiki; Medezinski, Elinor; Mineo, Sogo; Miyama, Shoken; Miyatake, Hironao; Miyazaki, Satoshi; Momose, Rieko; More, Anupreeta; More, Surhud; Moritani, Yuki; Moriya, Takashi J.; Morokuma, Tomoki; Mukae, Shiro; Murata, Ryoma; Murayama, Hitoshi; Nagao, Tohru; Nakata, Fumiaki; Niida, Mana; Niikura, Hiroko; Nishizawa, Atsushi J.; Obuchi, Yoshiyuki; Oguri, Masamune; Oishi, Yukie; Okabe, Nobuhiro; Okamoto, Sakurako; Okura, Yuki; Ono, Yoshiaki; Onodera, Masato; Onoue, Masafusa; Osato, Ken; Ouchi, Masami; Price, Paul A.; Pyo, Tae-Soo; Sako, Masao; Sawicki, Marcin; Shibuya, Takatoshi; Shimasaku, Kazuhiro; Shimono, Atsushi; Shirasaki, Masato; Silverman, John D.; Simet, Melanie; Speagle, Joshua; Spergel, David N.; Strauss, Michael A.; Sugahara, Yuma; Sugiyama, Naoshi; Suto, Yasushi; Suyu, Sherry H.; Suzuki, Nao; Tait, Philip J.; Takada, Masahiro; Takata, Tadafumi; Tamura, Naoyuki; Tanaka, Manobu M.; Tanaka, Masaomi; Tanaka, Masayuki; Tanaka, Yoko; Terai, Tsuyoshi; Terashima, Yuichi; Toba, Yoshiki; Tominaga, Nozomu; Toshikawa, Jun; Turner, Edwin L.; Uchida, Tomohisa; Uchiyama, Hisakazu; Umetsu, Keiichi; Uraguchi, Fumihiro; Urata, Yuji; Usuda, Tomonori; Utsumi, Yousuke; Wang, Shiang-Yu; Wang, Wei-Hao; Wong, Kenneth C.; Yabe, Kiyoto; Yamada, Yoshihiko; Yamanoi, Hitomi; Yasuda, Naoki; Yeh, Sherry; Yonehara, Atsunori; Yuma, Suraphong

    2018-01-01

    Hyper Suprime-Cam (HSC) is a wide-field imaging camera on the prime focus of the 8.2-m Subaru telescope on the summit of Mauna Kea in Hawaii. A team of scientists from Japan, Taiwan, and Princeton University is using HSC to carry out a 300-night multi-band imaging survey of the high-latitude sky. The survey includes three layers: the Wide layer will cover 1400 deg2 in five broad bands (grizy), with a 5 σ point-source depth of r ≈ 26. The Deep layer covers a total of 26 deg2 in four fields, going roughly a magnitude fainter, while the UltraDeep layer goes almost a magnitude fainter still in two pointings of HSC (a total of 3.5 deg2). Here we describe the instrument, the science goals of the survey, and the survey strategy and data processing. This paper serves as an introduction to a special issue of the Publications of the Astronomical Society of Japan, which includes a large number of technical and scientific papers describing results from the early phases of this survey.

  20. MULTIPLE CRITERIA METHODS WITH FOCUS ON ANALYTIC HIERARCHY PROCESS AND GROUP DECISION MAKING

    Directory of Open Access Journals (Sweden)

    Lidija Zadnik-Stirn

    2010-12-01

    Full Text Available Managing natural resources is a group multiple criteria decision making problem. In this paper the analytic hierarchy process is the method chosen for handling natural resource problems. The single decision maker problem is discussed, and three methods for deriving the priority vector are presented: the eigenvector method, the data envelopment analysis method, and the logarithmic least squares method. Further, the group analytic hierarchy process is discussed, and six methods for the aggregation of individual judgments or priorities are compared: the weighted arithmetic mean method, the weighted geometric mean method, and four methods based on data envelopment analysis. A case study on land use in Slovenia is presented. The conclusions review consistency, sensitivity analyses, and some future directions of research.
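
    A minimal sketch of the eigenvector method and geometric-mean aggregation mentioned above: two hypothetical decision makers' pairwise-comparison matrices are aggregated element-wise by geometric mean, the group priority vector is taken from the principal eigenvector, and a consistency ratio is computed with standard random-index values.

```python
import numpy as np

def priority_vector(A):
    """Eigenvector method: principal eigenvector of A, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(A)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def consistency_ratio(A, random_index={3: 0.58, 4: 0.90, 5: 1.12}):
    """Saaty consistency ratio: CI / RI with CI = (lambda_max - n) / (n - 1)."""
    n = len(A)
    lam_max = np.max(np.real(np.linalg.eigvals(A)))
    return ((lam_max - n) / (n - 1)) / random_index[n]

# Two hypothetical decision makers' pairwise comparison matrices (3 criteria)
A1 = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
A2 = np.array([[1, 2, 4], [1/2, 1, 3], [1/4, 1/3, 1]])

# Aggregation of individual judgments by element-wise geometric mean
G = np.exp((np.log(A1) + np.log(A2)) / 2)
w = priority_vector(G)
print("group priorities:", w.round(3), " CR =", round(consistency_ratio(G), 3))
```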

  1. Applying Process Improvement Methods to Clinical and Translational Research: Conceptual Framework and Case Examples.

    Science.gov (United States)

    Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K

    2015-12-01

    There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.

  2. Indigenous processing methods and raw materials of borde , an ...

    African Journals Online (AJOL)

    A flow chart of borde production was constructed showing four major processing stages. The short shelf life of borde and the seasonal variations in production volume were identified as major problems for the vendors in the study areas. Keywords: indigenous methods; cereal fermentation; borde; beverage; Ethiopia J Food ...

  3. A Survey of tooth morphology teaching methods employed in the United Kingdom and Ireland.

    Science.gov (United States)

    Lone, M; McKenna, J P; Cryan, J F; Downer, E J; Toulouse, A

    2018-01-15

    Tooth morphology is a central component of the dental curriculum and is applicable to all dental specialities. Traditional teaching methods are being supplemented with innovative strategies to tailor teaching and accommodate the learning styles of the recent generation of students. An online survey was compiled and distributed to the staff involved in teaching tooth morphology in the United Kingdom and Ireland to assess the importance of tooth morphology in the dentistry curriculum and the methodologies employed in teaching. The results of the survey show that tooth morphology constitutes a small module in the dental curriculum. It is taught in the first 2 years of the dental curriculum but is applicable in the clinical years and throughout the dental career. Traditional teaching methods, lecture and practical, are being augmented with innovative teaching including e-learning via virtual learning environment, tooth atlas and e-books leading to blended learning. The majority of the schools teach both normal dental anatomy and morphologic variations of dental anatomy and utilise plastic teeth for practical and examination purposes. Learning the 3D aspects of tooth morphology was deemed important by most of the respondents who also agreed that tooth morphology is a difficult topic for the students. Despite being core to the dental curriculum, overall minimal time is dedicated to the delivery of tooth morphology, creating a reliance on the student to learn the material. New forms of delivery including computer-assisted learning tools should help sustain learning and previously acquired knowledge. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  4. Information processing systems, reasoning modules, and reasoning system design methods

    Science.gov (United States)

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
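
    A minimal sketch of the dispatch idea in this record, under the assumption that it can be reduced to routing graph abstractions to reasoning modules by ontology classification type; the class and module names are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Abstraction:
    individual: str          # the individual defined according to an ontology
    classification: str      # its ontology classification type

class ReasoningSystem:
    """Registers reasoning modules per classification type and dispatches
    abstractions from the working-memory graph to the matching modules."""

    def __init__(self) -> None:
        self.modules: dict[str, list[Callable[[Abstraction], None]]] = {}

    def register(self, classification: str,
                 module: Callable[[Abstraction], None]) -> None:
        self.modules.setdefault(classification, []).append(module)

    def process(self, graph: list[Abstraction]) -> None:
        for abstraction in graph:
            for module in self.modules.get(abstraction.classification, []):
                module(abstraction)

# Illustrative usage with two classification types and two modules
rs = ReasoningSystem()
rs.register("Person", lambda a: print("person module saw", a.individual))
rs.register("Event", lambda a: print("event module saw", a.individual))
rs.process([Abstraction("alice", "Person"), Abstraction("login-42", "Event")])
```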

  5. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  6. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  7. Method and equipment of processing radioactive laundry wastes

    International Nuclear Information System (INIS)

    Shirai, Takamori; Suzuki, Takeo; Tabata, Masayuki; Takada, Takao; Yamaguchi, Shin-ichi; Noda, Tetsuya.

    1985-01-01

    Purpose: To effectively process radioactive laundry wastes generated by the water-washing that follows dry-cleaning of protective clothing worn in nuclear facilities. Method: Dry-cleaning soaps and ionic radioactive materials contained in the radioactive laundry wastes are selectively adsorbed onto adsorbents for decontamination. The adsorbents that have taken up the dry-cleaning soaps and ionic radioactive materials are then purified by removal of these materials, and the purified adsorbents are re-used. (Seki, T.)

  8. Dental ceramics: a review of new materials and processing methods.

    Science.gov (United States)

    Silva, Lucas Hian da; Lima, Erick de; Miranda, Ranulfo Benedito de Paula; Favero, Stéphanie Soares; Lohbauer, Ulrich; Cesar, Paulo Francisco

    2017-08-28

    The evolution of computerized systems for the production of dental restorations, associated with the development of novel microstructures for ceramic materials, has caused an important change in the clinical workflow for dentists and technicians, as well as in the treatment options offered to patients. New microstructures have also been developed by the industry in order to offer ceramic and composite materials with optimized properties, i.e., good mechanical properties, appropriate wear behavior and acceptable aesthetic characteristics. The objective of this literature review is to discuss the main advantages and disadvantages of the new ceramic systems and processing methods. The manuscript is divided into five parts: I) monolithic zirconia restorations; II) multilayered dental prostheses; III) new glass-ceramics; IV) polymer infiltrated ceramics; and V) novel processing technologies. Dental ceramics and processing technologies have evolved significantly in the past ten years, with most of the evolution being related to new microstructures and CAD-CAM methods. In addition, a trend towards the use of monolithic restorations has changed the way clinicians produce all-ceramic dental prostheses, since the more aesthetic multilayered restorations unfortunately are more prone to chipping or delamination. Composite materials processed via CAD-CAM have become an interesting option, as they have intermediate properties between ceramics and polymers and are more easily milled and polished.

  9. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    Science.gov (United States)

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a constituent medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis. A multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored with PC score, DModX and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. The established process monitoring method could provide a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
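
    A rough sketch of PCA-based MSPC monitoring of the kind described above: a PCA model is fitted to in-control batch data, and new observations are screened with Hotelling T2 and a DModX-style residual distance. The data are synthetic, and the control limits are simple empirical percentiles rather than the F- or chi-squared-based limits used in practice.

```python
import numpy as np

rng = np.random.default_rng(5)
# Correlated "in-control" process data (e.g. features derived from NIR spectra)
normal = rng.normal(size=(100, 8)) @ rng.normal(size=(8, 8))
mu, sd = normal.mean(axis=0), normal.std(axis=0)
Z = (normal - mu) / sd

# PCA via SVD, keeping 3 principal components
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
k = 3
P = Vt[:k].T                      # loadings
scores = Z @ P
score_var = scores.var(axis=0, ddof=1)

def monitor(x):
    """Hotelling T2 (within the model plane) and DModX-style residual distance."""
    z = (x - mu) / sd
    t = z @ P
    t2 = np.sum(t ** 2 / score_var)
    resid = z - t @ P.T
    return t2, np.sqrt(resid @ resid)

t2_lim = np.percentile([monitor(x)[0] for x in normal], 99)
dmodx_lim = np.percentile([monitor(x)[1] for x in normal], 99)

fault = normal[0] + 4.0           # test observation with a systematic offset
t2, dmodx = monitor(fault)
print(f"T2 = {t2:.1f} (limit {t2_lim:.1f}), DModX = {dmodx:.2f} (limit {dmodx_lim:.2f})")
```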

  10. Intensive care nurses' perceptions of their professional competence in the organ donor process: a national survey.

    Science.gov (United States)

    Meyer, Käthe; Bjørk, Ida Torunn; Eide, Hilde

    2012-01-01

    This paper is a report of a study that explored Norwegian intensive care nurses' perceptions of their professional competence, in order to identify educational needs in the organ donor process. Intensive care professionals are requested to consider organ donation each time they care for patients with severe cerebral lesions to ensure donor organs for transplantation. The donor process challenges intensive care nurses' professional competence. Nurses' knowledge and experience may influence their professional competence in caring for organ donors and their relatives. Methods: A cross-sectional survey was conducted in all 28 Norwegian donor hospitals between October 2008 and January 2009. Intensive care nurses (N = 801) were invited to participate and the response rate was 71.4%. Dimensions of professional competence, learning needs and contextual and demographic variables were explored. Data were analysed using descriptive and inferential statistics. Few intensive care nurses had extensive experience of, or competence and training in, organ donation. Nurses working at university hospitals had more experience, but less training, than nurses in local hospitals. Experience of donor acquisition had an impact on intensive care nurses' perceptions of their professional competence in the donor process. Discussions on the ward and educational input were seen as important for the further development of professional competence. Training provided by experienced colleagues and a culture that encourages discussion about aspects of the donor process can develop nurses' professional competence and communally defined professional practice. Educational input that cultivates various types of knowledge can be beneficial in organ donation. © 2011 The Authors. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.

  11. Clinical Reasoning: Survey of Teaching Methods, Integration, and Assessment in Entry-Level Physical Therapist Academic Education.

    Science.gov (United States)

    Christensen, Nicole; Black, Lisa; Furze, Jennifer; Huhn, Karen; Vendrely, Ann; Wainwright, Susan

    2017-02-01

    Although clinical reasoning abilities are important learning outcomes of physical therapist entry-level education, best practice standards have not been established to guide clinical reasoning curricular design and learning assessment. This research explored how clinical reasoning is currently defined, taught, and assessed in physical therapist entry-level education programs. A descriptive, cross-sectional survey was administered to physical therapist program representatives. An electronic 24-question survey was distributed to the directors of 207 programs accredited by the Commission on Accreditation in Physical Therapy Education. Descriptive statistical analysis and qualitative content analysis were performed. Post hoc demographic and wave analyses revealed no evidence of nonresponse bias. A response rate of 46.4% (n=96) was achieved. All respondents reported that their programs incorporated clinical reasoning into their curricula. Only 25% of respondents reported a common definition of clinical reasoning in their programs. Most respondents (90.6%) reported that clinical reasoning was explicit in their curricula, and 94.8% indicated that multiple methods of curricular integration were used. Instructor-designed materials were most commonly used to teach clinical reasoning (83.3%). Assessment of clinical reasoning included practical examinations (99%), clinical coursework (94.8%), written examinations (87.5%), and written assignments (83.3%). Curricular integration of clinical reasoning-related self-reflection skills was reported by 91%. A large number of incomplete surveys affected the response rate, and the program directors to whom the survey was sent may not have consulted the faculty members who were most knowledgeable about clinical reasoning in their curricula. The survey construction limited some responses and application of the results. Although clinical reasoning was explicitly integrated into program curricula, it was not consistently defined, taught, or assessed.

  12. Curriculum and instructional methods for drug information, literature evaluation, and biostatistics: survey of US pharmacy schools.

    Science.gov (United States)

    Phillips, Jennifer A; Gabay, Michael P; Ficzere, Cathy; Ward, Kristina E

    2012-06-01

    The drug information curriculum in US colleges of pharmacy continues to evolve. The American College of Clinical Pharmacy (ACCP) Drug Information Practice and Research Network (DI PRN) published an opinion paper with specific recommendations regarding drug information education in 2009. Adoption of these recommendations has not been evaluated. To assess which recommendations made in the ACCP DI PRN opinion paper are included in US pharmacy school curricula and characterize faculty qualifications, educational methods, and recent changes in drug information education. An electronic survey was designed using the ACCP DI PRN opinion paper and the Accreditation Council for Pharmacy Education standards and guidelines for accreditation of PharmD programs in the US. Survey questions addressed curricular content within the following categories: drug information, literature evaluation, and biostatistics. A letter including the online survey link was sent via email to the dean of each US college/school of pharmacy (N = 128). Recipients were instructed to forward the email to the individual at their institution who was the most knowledgeable about the content and methodology used for didactic drug information education. Sixty-four responses were included in the final analysis. Of the 19 ACCP DI PRN minimum core concepts, 9 (47%) were included in curricula of all responding institutions; 14 of 19 (74%) were included in curricula for all but 1 institution. In contrast, 5 of 16 concepts (31%) were not formally taught by a number of institutions. Many respondents noted an increased focus on evidence-based medicine, medication safety, and informatics. Although a survey of drug information curricula documented substantial inclusion of the essential concepts presented in the ACCP DI PRN opinion paper, room for improvement remains in drug information curricula in US colleges of pharmacy.

  13. The multicriteria method for environmentally oriented business decision-making

    Directory of Open Access Journals (Sweden)

    Čančer Vesna

    2004-01-01

    Stimulated by the need expressed by managers for comprehensive methods for environmental management in enterprises, we present a method for environmentally oriented business decision-making. It is based on simulations in which optimization models of business processes are used as scenarios. The possibilities for an integrated approach to environmental protection are introduced and, decomposed according to the type of the considered element by using zero-one variables, included in the optimization models. The method is extended to multicriteria decision-making, in which the optimal values obtained in the simulations are included. In a real-life case where the Analytic Hierarchy Process technique is used to evaluate environmentally oriented business processes, special attention is given to criteria and weights: we consider preferences and survey findings on the environmental impact of business processes in the enterprise, survey findings on environmental management in the processing industry, and ecobalances.
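
    A minimal sketch of the Analytic Hierarchy Process step mentioned above, in Python: priority weights are taken from the principal eigenvector of a pairwise comparison matrix and a consistency ratio is checked. The criteria and judgement values are invented for illustration and are not those used in the paper.

```python
# AHP sketch: derive criteria weights from a Saaty-style pairwise comparison
# matrix via its principal eigenvector, then check consistency.
import numpy as np

# Hypothetical comparisons of three criteria, e.g. cost, environmental impact,
# product quality (values are illustrative only).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalised priority weights

ci = (eigvals[k].real - len(A)) / (len(A) - 1)   # consistency index
cr = ci / 0.58                                   # random index for n=3 is 0.58
print(weights, cr)
```

    A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgements are acceptably consistent.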

  14. Bayesian adaptive survey protocols for resource management

    Science.gov (United States)

    Halstead, Brian J.; Wylie, Glenn D.; Coates, Peter S.; Casazza, Michael L.

    2011-01-01

    Transparency in resource management decisions requires a proper accounting of uncertainty at multiple stages of the decision-making process. As information becomes available, periodic review and updating of resource management protocols reduces uncertainty and improves management decisions. One of the most basic steps to mitigating anthropogenic effects on populations is determining if a population of a species occurs in an area that will be affected by human activity. Species are rarely detected with certainty, however, and falsely declaring a species absent can cause improper conservation decisions or even extirpation of populations. We propose a method to design survey protocols for imperfectly detected species that accounts for multiple sources of uncertainty in the detection process, is capable of quantitatively incorporating expert opinion into the decision-making process, allows periodic updates to the protocol, and permits resource managers to weigh the severity of consequences if the species is falsely declared absent. We developed our method using the giant gartersnake (Thamnophis gigas), a threatened species precinctive to the Central Valley of California, as a case study. Survey date was negatively related to the probability of detecting the giant gartersnake, and water temperature was positively related to the probability of detecting the giant gartersnake at a sampled location. Reporting sampling effort, timing and duration of surveys, and water temperatures would allow resource managers to evaluate the probability that the giant gartersnake occurs at sampled sites where it is not detected. This information would also allow periodic updates and quantitative evaluation of changes to the giant gartersnake survey protocol. Because it naturally allows multiple sources of information and is predicated upon the idea of updating information, Bayesian analysis is well-suited to solving the problem of developing efficient sampling protocols for species of conservation concern.
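
    The occupancy-updating logic described above can be sketched as a direct application of Bayes' rule: after each survey with no detection, the probability that the site is occupied is revised downward. The prior and the per-survey detection probabilities below are placeholders, not the paper's estimates for the giant gartersnake.

```python
# Bayesian update of site-occupancy probability after repeated non-detections.
prior_occupancy = 0.5               # expert-opinion prior that the site is occupied
detection_probs = [0.4, 0.3, 0.5]   # per-survey detection probability (could vary
                                    # with survey date and water temperature)

p_occupied = prior_occupancy
for p_detect in detection_probs:
    # P(occupied | not detected) by Bayes' rule
    numerator = (1.0 - p_detect) * p_occupied
    denominator = numerator + (1.0 - p_occupied)   # always "not detected" if absent
    p_occupied = numerator / denominator

print(f"Posterior probability of occupancy after all surveys: {p_occupied:.3f}")
```

    A manager could then require enough surveys that this posterior falls below an agreed threshold before declaring the species absent from the site.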

  15. Target Selection for the SDSS-III MARVELS Survey

    Science.gov (United States)

    Paegert, Martin; Stassun, Keivan G.; De Lee, Nathan; Pepper, Joshua; Fleming, Scott W.; Sivarani, Thirupathi; Mahadevan, Suvrath; Mack, Claude E., III; Dhital, Saurav; Hebb, Leslie; Ge, Jian

    2015-06-01

    We present the target selection process for the Multi-object APO Radial Velocity Exoplanets Large-area Survey (MARVELS), which is part of the Sloan Digital Sky Survey (SDSS) III. MARVELS is a medium-resolution (R ∼ 11,000) multi-fiber spectrograph capable of obtaining radial velocities for 60 objects at a time in order to find brown dwarfs and giant planets. The survey was configured to target dwarf stars with effective temperatures approximately between 4500 and 6250 K. For the first 2 years MARVELS relied on low-resolution spectroscopic pre-observations to estimate the effective temperature and log(g) for candidate stars and then selected suitable dwarf stars from this pool. Ultimately, the pre-observation spectra proved ineffective at filtering out giant stars; many giants were incorrectly classified as dwarfs, resulting in a giant contamination rate of ∼30% for the first phase of the MARVELS survey. Thereafter, the survey instead applied a reduced proper motion cut to eliminate giants and used the Infrared Flux Method to estimate effective temperatures, using only extant photometric and proper-motion catalog information. The target selection method introduced here may be useful for other surveys that need to rely on extant catalog data for selection of specific stellar populations.
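
    A reduced proper motion cut of the kind described above can be sketched with catalog photometry and proper motions alone. The coefficients of the cut and the sample values below are illustrative assumptions, not the selection actually used by MARVELS.

```python
# Reduced proper motion (RPM) as a luminosity proxy for separating dwarfs
# from giants using only catalog magnitudes and proper motions.
import numpy as np

def reduced_proper_motion(mag, pm_mas_per_yr):
    """H = m + 5*log10(mu [arcsec/yr]) + 5, with mu given in mas/yr."""
    mu_arcsec = np.asarray(pm_mas_per_yr) / 1000.0
    return np.asarray(mag) + 5.0 * np.log10(mu_arcsec) + 5.0

# Example catalog entries: J magnitude, proper motion (mas/yr), J-K colour.
j_mag = np.array([9.5, 10.2, 8.8])
pm = np.array([45.0, 6.0, 80.0])
j_k = np.array([0.45, 0.60, 0.40])

h_j = reduced_proper_motion(j_mag, pm)
# Illustrative linear cut in the (J-K, H_J) plane: keep likely dwarfs only.
is_dwarf = h_j > 3.0 + 7.0 * j_k
print(h_j, is_dwarf)
```

    Giants at a given colour tend to have small proper motions and hence small H, so a colour-dependent lower bound on H removes most of them without spectroscopy.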

  16. Computerized tablet-based versus traditional paper-based survey methods: results from adolescents' health research in schools of Maharashtra, India

    OpenAIRE

    Naveen Agarwal; Balram Paswan; Prakash H. Fulpagare; Dhirendra N Sinha; Thaksaphon Thamarangsi; Manju Rani

    2018-01-01

    Background and challenges to implementation: Technological advancement is growing very fast in India, and the majority of the young population handles electronic devices often, during leisure as well as at work. This study indicates that electronic tablets are less time consuming and improve the survey response rate compared with the traditional paper-pencil survey method. Intervention or response: An Android-based Global School-based Health Survey (GSHS) questionnaire was used with the...

  17. Method and apparatus for surface characterization and process control utilizing radiation from desorbed particles

    International Nuclear Information System (INIS)

    Feldman, L.C.; Kraus, J.S.; Tolk, N.H.; Traum, M.M.; Tully, J.C.

    1983-01-01

    Emission of characteristic electromagnetic radiation in the infrared, visible, or UV from excited particles, typically ions, molecules, or neutral atoms, desorbed from solid surfaces by an incident beam of low-momentum probe radiation has been observed. Disclosed is a method for characterizing solid surfaces based on the observed effect, with low-momentum probe radiation consisting of electrons or photons. Further disclosed is a method for controlling manufacturing processes that is also based on the observed effect. The latter method can, for instance, be advantageously applied in integrated circuit-, integrated optics-, and magnetic bubble device manufacture. Specific examples of applications of the method are registering of masks, control of a direct-writing processing beam, end-point detection in etching, and control of a processing beam for laser- or electron-beam annealing or ion implantation

  18. Advances in survey monitoring and deformation analysis of dams

    International Nuclear Information System (INIS)

    Teskey, W.F.; Biacs, Z.; Ingraham, T.J.

    1989-01-01

    Survey monitoring is an important method of determining the deformation behavior of structures such as dams. The deformation survey monitoring method used by Alberta Environment is designed to be able to detect horizontal movement in the order of 1.5 cm and vertical movement in the order of 0.5 cm. Using computer simulation, reference and observation points are varied to enable precisions of less than 1 cm at a 95% confidence level. Reference network points are pillars of 20 cm diameter steel pipe driven to refusal, which protrude above ground level to a comfortable instrument height of 1.5 m. Object points are 3 m long, 5 cm diameter steel pipes fitted with a helix base and drilled flush with ground level. Data processing is completely automated from data collection to preparation of report plots, using a microcomputer. If suitable procedures are followed, trigonometric (trig) leveling can replace spirit leveling in deformation surveys. Trig leveling can be used to determine heights of inaccessible points impossible to determine with spirit leveling, and allows totally automated data collection. An example is provided of application of the technique to deformation analysis of the Paddle River Dam situated north of Edmonton. 8 refs., 3 figs

  19. Improving low power and shutdown PSA methods and data to permit better risk comparison and trade-off decision-making. Volume 1: summary of COOPRA and WGRISK surveys; Volume 2: responses to the WGRISK survey; Volume 3: responses to the COOPRA survey

    International Nuclear Information System (INIS)

    2005-01-01

    The COOPRA LPSD working group is charged with the responsibility to assess Member countries' plant operations at Low Power and Shutdown (LPSD) conditions. The sharing of information is expected to provide each Member country with the means from which to render informed regulatory decisions for the benefit of public health and safety. Each organization had developed a questionnaire to gather information from Member countries on LPSD PSA experiences. The responses cover a broad spectrum of LPSD PSA topics and identify work for improving risk-informed trade-off decisions, using PSA techniques, between LPSD and full power operational states. Each organization recognized a potential benefit in improving the state-of-the-art by combining the wealth of experiences from the questionnaire responses into a common report. This report provides a summary of the current LPSD PSAs in Member countries, covering the elements which make up the PSAs. This report identifies the uses of the LPSD PSAs, summarizes current approaches, aspects, and good practices, identifies and defines differences between methods and data in full power and LPSD PSAs, and identifies guidance, methods, data, and basic research needs to address the differences. The responses to the questionnaires are provided in the appendixes. The information contained in this report was gathered from two surveys, one by COOPRA and the other by WGRisk, which were performed over several years. Volume 2 of this report contains the responses from the CSNI / WGRisk survey; Volume 3 contains the responses from the COOPRA survey.

  20. The Selected Method and Tools for Performance Measurement in the Green Supply Chain—Survey Analysis in Poland

    Directory of Open Access Journals (Sweden)

    Blanka Tundys

    2018-02-01

    The methods and tools for performance measurement and evaluation of green supply chain management are very important elements for the construction and functioning of this type of supply chain. The result is a presentation of the considerations underlying a very general model, which presents some selected tools but no breakdown by individual industries. The considerations undertaken are important and have scientific added value, as in practice a very large number of tools are usually used to assess the supply chain, and these are not always correlated with or adapted to the specificity of the chain. It is worth pointing out which of the already used or completely new tools and methods will be most useful for assessing the green supply chain. The structure of the paper covers both theoretical and empirical parts. It includes an introduction, our goals and hypotheses, state of the art, methodology, empirical findings, and discussion. We present the definitional differences between green and sustainable supply chains and focus on the selection and identification of methods for the framework model for evaluating the green supply chain. In the next step, the theoretically selected methods and tools were compared with a survey of Poland. On the basis of the survey, we present the findings and discussion in this area. The main methodology used includes a literature review, a survey analysis using a questionnaire, and statistical tools. The survey was carried out in 2015 in sample organizations in Poland. The research results showed that organizations were aware of the environmental elements of measuring and assessing the supply chain from an environmental point of view, but their use depended on many factors: the area, the size of the organization, or the industry. If certain boundary conditions are met and the organizations are aware of the essence of environmental aspects in the chain, then they apply green measures to the supply chain. These findings

  1. Non-filtration method of processing of uranium ores

    International Nuclear Information System (INIS)

    Laskorin, B.N.; Vodolazov, L.I.; Tokarev, N.N.; Vyalkov, V.I.; Goldobina, V.A.; Gosudarstvennyj Komitet po Ispol'zovaniyu Atomnoj Ehnergii SSSR, Moscow)

    1977-01-01

    The development of the filterless sorption method has led to working out the sorption leaching process and the extraction desorption process, which has made it possible to intensify uranium ore processing and to greatly improve the technical and economic indexes by eliminating the complex method of multiple filtration and repulping of cakes. This method makes it possible to involve poorer uranium raw materials and at the same time to extract valuable components: molybdenum, vanadium, copper, etc. Great industrial experience has been accumulated in the sorption of dense pulp with a solid-to-liquid phase ratio of 1:1. This has led to an increase in the productivity of working plants by 1.5-3.0 times, an increase in uranium extraction by 5-10%, an increase in the labour productivity of the main workers by 2-3 times, and a several-fold decrease in the consumption of reagents, auxiliary materials, electric energy and steam. In fact the developed technology is continuous in all its steps, with full automation of the process using the simplest and most readily available means of regulation and control. The process is equipped with high-productivity, high-power apparatuses with mechanical and pneumatic mixing for high-density pulps, and with the columns KDS, KDZS, KNSPR and PIK for the regeneration of saturated sorbent in counterflow regime. The use of fine-granular hydrophilic ion-exchange resins in a hydrophobized state is foreseen [ru

  2. Medication non-adherence and uncertainty: Information-seeking and processing in the Danish LIFESTAT survey.

    Science.gov (United States)

    Kriegbaum, Margit; Lau, Sofie Rosenlund

    2017-09-23

    Statins are widely prescribed to lower cardiovascular morbidity and mortality. However, statin non-adherence is very high. The aim of this paper was to investigate reasons for stopping statin treatment in the general population and to study how aspects of information-seeking and processing are associated with statin non-adherence. This study used a population survey of 3050 Danish residents aged 45-65 years. Reasons for statin discontinuation were studied among those who were previous statin users. The association between information seeking and processing and statin discontinuation was analysed using multivariate logistic regression models. Experience of side effects and fear of side effects played an important role in the discontinuation of statin treatment. Feelings of uncertainty and confusion regarding information on statins predicted statin discontinuation. This applied to information from both mass media and general practitioners. There was no clear pattern between information seeking and statin non-adherence. The article points to the impact of information-seeking on the decision to take cholesterol-lowering medication, including contributions from information disseminated by media outlets. Side effects and fear of side effects should be addressed in clinical practice. Health care professionals should pay attention to emotional aspects of how information is disseminated and perceived by statin users. Copyright © 2017. Published by Elsevier Inc.

  3. A review of the uses and methods of processing banana and ...

    African Journals Online (AJOL)

    Journal of Agricultural Research and Development ... Different processing methods of Musa spp. into new food products, which include production of flour, preparation of jams and jellies, and the quality attributes of the products obtained from ...

  4. EPR of free radicals in solids I trends in methods and applications

    CERN Document Server

    Lund, Anders

    2012-01-01

    In its updated 2nd edition, this book surveys methods and applications of EPR in the study of free radical processes in solids. The focus is on trends in methods for extracting structural and dynamical properties of radicals and spin probes in solid matrices.

  5. Lessons From Recruitment to an Internet-Based Survey for Degenerative Cervical Myelopathy: Comparison of Free and Fee-Based Methods

    Science.gov (United States)

    2018-01-01

    Background Degenerative Cervical Myelopathy (DCM) is a syndrome of subacute cervical spinal cord compression due to spinal degeneration. Although DCM is thought to be common, many fundamental questions such as the natural history and epidemiology of DCM remain unknown. In order to answer these, access to a large cohort of patients with DCM is required. With its unrivalled and efficient reach, the Internet has become an attractive tool for medical research and may overcome these limitations in DCM. The most effective recruitment strategy, however, is unknown. Objective To compare the efficacy of fee-based advertisement with alternative free recruitment strategies to a DCM Internet health survey. Methods An Internet health survey (SurveyMonkey) accessed by a new DCM Internet platform (myelopathy.org) was created. Using multiple survey collectors and the website’s Google Analytics, the efficacy of fee-based recruitment strategies (Google AdWords) and free alternatives (including Facebook, Twitter, and myelopathy.org) were compared. Results Overall, 760 surveys (513 [68%] fully completed) were accessed, 305 (40%) from fee-based strategies and 455 (60%) from free alternatives. Accounting for researcher time, fee-based strategies were more expensive ($7.8 per response compared to $3.8 per response for free alternatives) and identified a less motivated audience (Click-Through-Rate of 5% compared to 57% using free alternatives) but were more time efficient for the researcher (2 minutes per response compared to 16 minutes per response for free methods). Facebook was the most effective free strategy, providing 239 (31%) responses, where a single message to 4 existing communities yielded 133 (18%) responses within 7 days. Conclusions The Internet can efficiently reach large numbers of patients. Free and fee-based recruitment strategies both have merits. Facebook communities are a rich resource for Internet researchers. PMID:29402760
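
    The cost-per-response comparison above is straightforward arithmetic once advertising spend and researcher time are combined. The sketch below illustrates the calculation; the ad spend and the researcher hourly rate are invented assumptions, and only the response counts and minutes per response are taken from the abstract, so the outputs will not reproduce the reported $7.8 and $3.8 figures exactly.

```python
# All-in cost per survey response: advertising spend plus researcher time.
def cost_per_response(ad_spend, minutes_per_response, responses, hourly_rate=20.0):
    """Cost per response including researcher time at an assumed hourly rate."""
    researcher_cost = minutes_per_response / 60.0 * hourly_rate * responses
    return (ad_spend + researcher_cost) / responses

fee_based = cost_per_response(ad_spend=2000.0, minutes_per_response=2, responses=305)
free_based = cost_per_response(ad_spend=0.0, minutes_per_response=16, responses=455)
print(f"fee-based: ${fee_based:.2f}/response, free: ${free_based:.2f}/response")
```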

  6. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  7. Consumption of Ultra-processed Foods and Obesity in Brazilian Adolescents and Adults

    OpenAIRE

    da Costa Louzada, Maria Laura; Baraldi, Larissa Galastri; Steele, Euridice Martinez; Martins, Ana Paula Bortoletto; Canella, Daniela Silva; Moubarac, Jean-Claude; Levy, Renata Bertazzi; Cannon, Geoffrey; Afshin, Ashkan; Imamura, Fumiaki; Mozaffarian, Dariush; Monteiro, Carlos Augusto

    2015-01-01

    Objectives: The aim of this study was to evaluate the relationship between the consumption of ultra-processed foods and obesity indicators among Brazilian adults and adolescents. Methods: We used cross-sectional data on 30,243 individuals aged ≥ 10 years from the 2008–2009 Brazilian Dietary Survey. Food consumption data were collected through 24-h food records. We classified food items according to characteristics of food processing. Ultra-processed foods were defined as formulati...

  8. Evaluation of processing methods for static radioisotope scan images

    International Nuclear Information System (INIS)

    Oakberg, J.A.

    1976-12-01

    Radioisotope scanning in the field of nuclear medicine provides a method for mapping a radioactive drug in the human body to produce maps (images) which prove useful in detecting abnormalities in vital organs. At best, radioisotope scanning methods produce images with poor counting statistics. One solution for improving the body scan images is to use dedicated small computers with appropriate software to process the scan data. Eleven methods for processing image data are compared.

  9. The outlier sample effects on multivariate statistical data processing geochemical stream sediment survey (Moghangegh region, North West of Iran)

    International Nuclear Information System (INIS)

    Ghanbari, Y.; Habibnia, A.; Memar, A.

    2009-01-01

    In a geochemical stream sediment survey of the Moghangegh Region in north west Iran (1:50,000 sheet), 152 samples were collected. After analysis and processing of the data, it was revealed that the Yb, Sc, Ni, Li, Eu, Cd, Co and As contents in one sample were far higher than in the other samples. After identifying this sample as an outlier, its effect on multivariate statistical data processing was investigated to assess the destructive effects of outlier samples in geochemical exploration. Pearson and Spearman correlation coefficient methods and cluster analysis were used for the multivariate studies, and the scatter plots of some element pairs, together with the regression profiles, are given for the cases of 152 and 151 samples and the results compared. The investigation of the multivariate statistical data processing results showed that the presence of an outlier sample may produce the following relations between elements: - a true relation between two elements, neither of which has an outlier value in the outlier sample; - a false relation between two elements, one of which has an outlier value in the outlier sample; - a completely false relation between two elements, both of which have outlier values in the outlier sample
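
    The effect described above is easy to reproduce: one extreme sample can manufacture a strong Pearson correlation between two otherwise unrelated elements, while the rank-based Spearman coefficient barely moves. The data in this sketch are synthetic, not the Moghangegh survey values.

```python
# Effect of a single outlier sample on Pearson vs Spearman correlation.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(1)
ni = rng.normal(50, 5, 151)     # e.g. Ni contents in 151 ordinary samples
yb = rng.normal(2, 0.2, 151)    # e.g. Yb contents, unrelated to Ni

# Append one outlier sample with extreme contents of both elements.
ni_out = np.append(ni, 400.0)
yb_out = np.append(yb, 20.0)

r_p, _ = pearsonr(ni, yb)
r_s, _ = spearmanr(ni, yb)
r_p_out, _ = pearsonr(ni_out, yb_out)
r_s_out, _ = spearmanr(ni_out, yb_out)
print(f"without outlier: Pearson={r_p:.2f}, Spearman={r_s:.2f}")
print(f"with outlier:    Pearson={r_p_out:.2f}, Spearman={r_s_out:.2f}")
```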

  10. Processing of low-quality bauxite feedstock by thermochemistry-Bayer method

    Directory of Open Access Journals (Sweden)

    О. А. Дубовиков

    2016-11-01

    The modern production of aluminum, which by its global output ranks first among the non-ferrous metals, includes three main stages: ore extraction, its processing into alumina and, finally, the production of primary aluminum. Alumina production from bauxites, the primary raw material of the alumina industry, is based on two main methods: the Bayer method and the sintering method developed in Russia under the lead of academician Nikolay Semenovich Kurnakov. Alumina production by the Bayer method is more cost effective, but places higher requirements on the quality of the bauxite feedstock. A great deal of research has been carried out on low quality bauxites, focusing firstly on finding ways to enrich the feedstock, secondly on improving the combined sequential Bayer-sintering method, and thirdly on developing new hydrometallurgical ways of processing bauxites. Mechanical methods of bauxite enrichment have not yet brought any positive outcome, and the development of a new high-alkaline hydrometallurgical autoclave process has faced significant hardware difficulties not addressed so far. For efficient processing of such low quality bauxite feedstock it is suggested to use the universal thermochemistry-Bayer method developed in St. Petersburg Mining University under the lead of Nikolay Ivanovich Eremin, which allows processing of different substandard bauxite feedstocks and has a competitive cost as compared to the sintering method and combined methods. The main stages of the thermochemistry-Bayer method are thermal activation of the feedstock, its further desiliconization with alkaline solution, and leaching of the resultant bauxite product by the Bayer method. Despite high energy consumption at the baking stage, it allows conditioning of the low quality bauxite feedstock by neutralizing a variety of technologically harmful impurities such as organic matter, sulfide sulfur, carbonates, and at the

  11. Processing of plastic track detectors

    International Nuclear Information System (INIS)

    Somogyi, G.

    1977-01-01

    A survey of some actual problems of the track processing methods available at this time for plastics is presented. In the case of the conventional chemical track-etching technique, mainly the etching situations related to detector geometry, and the relationship between registration sensitivity and the etching parameters are considered. Special attention is paid to the behaviour of track-revealing by means of electrochemical etching. Finally, some properties of a promising new track processing method based on graft polymerization are discussed. (author)

  12. Processing of plastic track detectors

    International Nuclear Information System (INIS)

    Somogyi, G.

    1976-01-01

    A survey of some actual problems of the track processing methods available at this time for plastics is presented. In the case of the conventional chemical track etching technique, mainly the etching situations related to detector geometry and the relationship between registration sensitivity and the etching parameters are considered. Special attention is paid to the behaviour of track revealing by means of electrochemical etching. Finally, some properties of a promising new track processing method based on graft polymerization are discussed. (orig.) [de

  13. The various correction methods to the high precision aeromagnetic data

    International Nuclear Information System (INIS)

    Xu Guocang; Zhu Lin; Ning Yuanli; Meng Xiangbao; Zhang Hongjian

    2014-01-01

    In an airborne geophysical survey, an outstanding result depends first on the measurement precision of the instrument, the choice of measurement conditions and the reliability of data collection, and then on the correct method of measurement data processing and the rationality of the data interpretation. Obviously, geophysical data processing is an important task for the comprehensive interpretation of the measurement results, and whether the processing method is correct directly determines the quality of the final results. In recent years, in the course of actual production and scientific research, we have developed a set of personal computer software for aeromagnetic and radiometric survey data processing and have successfully applied it in production. The processing methods and flowcharts for high precision aeromagnetic data are briefly introduced in this paper. The mathematical techniques of the various correction programs for IGRF, flying height and magnetic diurnal variation are discussed in detail, and their effectiveness is illustrated with an example. (authors)
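
    The basic corrections named above amount to subtracting the base-station diurnal departure and the IGRF normal field from each observed total-field reading. The sketch below shows only this arithmetic with placeholder values; a real workflow would evaluate the IGRF model at each point's position, altitude and date, and would also apply the flying-height correction.

```python
# Simple total-field corrections: remove diurnal variation and the IGRF
# main (normal) field to obtain the residual magnetic anomaly.
import numpy as np

observed = np.array([48520.3, 48515.8, 48530.1])   # nT, airborne readings
diurnal = np.array([12.4, 11.9, 13.2])             # nT, base-station departure
igrf = np.array([48450.0, 48449.5, 48451.2])       # nT, main-field model value

anomaly = observed - diurnal - igrf                 # nT, residual anomaly
print(anomaly)
```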

  14. Methods of radioactive waste processing and disposal in the United Kingdom

    International Nuclear Information System (INIS)

    Tolstykh, V.D.

    1983-01-01

    The results of investigations into radioactive waste processing and disposal in the United Kingdom are discussed. Methods for solidification of metal and graphite radioactive wastes and radioactive slime from the Magnox reactors are described. Specifications of different installations used for radioactive waste disposal are given. Climatic and geological conditions in the United Kingdom are such that any deep waste storage will lie below the groundwater level. That is why dissolution and transport by groundwater will inevitably result in radionuclide mobility. In this connection an extended programme of investigations into the three main aspects of the disposal problem, namely radionuclide release in storages, groundwater transport and radionuclide migration, is being carried out. The programme is divided into two parts. The first part deals with the retrieval of hydrological and geochemical data on geological formations and the development of the specialized methods of investigation necessary for identifying sites for final waste disposal. The second part comprises theoretical and laboratory investigations into the processes of radionuclide transport in the ''storage-geological formation'' system. It is concluded that vitrification based on borosilicate glass is the most advanced method of radioactive waste solidification

  15. Field Survey - A Journey of Exploration and Discovery

    Science.gov (United States)

    Jadhav, Kiran

    2017-04-01

    You can teach a student a lesson a day, but if you teach him to learn by creating curiosity, he will continue the learning process for life. This abstract is a manifestation of my innate desire as an educator to build students' cognitive level of thinking and refine their processes to gain knowledge from the environment, process it and put it to optimum use. This field survey has been planned and conducted for students of 12th grade (+16 yrs). At this level students are introduced to various aspects of human geography and how human intervention has harnessed environmental resources for its growth and development. They are also encouraged to observe how humans have adapted to the environment and in this process also modified it to satisfy their needs and demands. Students are also sensitized to understand how geography has evolved as a scientific subject of enquiry. Hence it calls for a deeper understanding and analysis of issues from the local to the global level. Through the pedagogical approach of field survey, they have been oriented to the process of conducting research as a well-defined procedure. It involves three phases: 1. planning and preliminary preparation before the survey, 2. data collection during the field survey, and 3. compilation, computation and presentation after the survey. This activity has been planned over a period of 3 months, and as of now the topic and area of case study have been selected. The general concern was studying low rainfall and agriculturally less productive regions; hence a case study of a drought-prone village, Hiware Bazaar in Ahmednagar district of Maharashtra, was undertaken. The site Hiware Bazar has been selected as it is based on the principles of sustainable development and a watershed development programme to combat severe drought. The statement of the problem has been defined along with the outlined objectives, scope of study and the time frame needed to gather the information. The field visit spanned over 5 days for data collection has

  16. Survey nonresponse among ethnic minorities in a national health survey--a mixed-method study of participation, barriers, and potentials.

    Science.gov (United States)

    Ahlmark, Nanna; Algren, Maria Holst; Holmberg, Teresa; Norredam, Marie Louise; Nielsen, Signe Smith; Blom, Astrid Benedikte; Bo, Anne; Juel, Knud

    2015-01-01

    The participation rate in the Danish National Health Survey (DNHS) 2010 was significantly lower among ethnic minorities than among ethnic Danes. The purpose was to characterize nonresponse among ethnic minorities in the DNHS, analyze variations in item nonresponse, and investigate barriers and incentives to participation. This was a mixed-method study. Logistic regression was used to analyze nonresponse using data from the DNHS (N = 177,639), and chi-square tests were used in the item nonresponse analyses. We explored barriers and incentives regarding participation through focus groups and cognitive interviews. Informants included immigrants and their descendants of both sexes, with and without higher education. The highest nonresponse rates were for non-Western descendants (80.0%) and immigrants 25 (72.3%) with basic education. Immigrants and descendants had higher odds ratios (OR = 3.07 and OR = 3.35, respectively) for nonresponse than ethnic Danes when adjusted for sex, age, marital status, and education. Non-Western immigrants had higher item nonresponse in several question categories. Barriers to participation related to the content, language, format, and layout of both the questionnaire and the cover letter. The sender and the setting in which the questionnaire was received also influenced incentives to answer. We observed differences in barriers and incentives between immigrants and descendants. Nonresponse appears related to linguistic and/or educational limitations, to alienation generated by the questions' focus on disease and cultural assumptions, or to mistrust regarding anonymity. Ethnic minorities seem particularly affected by such barriers. To increase survey participation, questions could be sensitized to reflect multicultural traditions, and the impact of sender and setting considered.
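
    The nonresponse model described above is an ordinary logistic regression whose exponentiated coefficients are the reported odds ratios. The sketch below shows the mechanics on synthetic data; the variables, group labels and effect sizes are assumptions, not the DNHS microdata.

```python
# Logistic regression of survey nonresponse on ethnic group, adjusted for
# covariates, with odds ratios obtained by exponentiating the coefficients.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "group": rng.choice(["Danish", "immigrant", "descendant"], size=n),
    "age": rng.integers(16, 80, size=n),
    "female": rng.integers(0, 2, size=n),
})
# Synthetic outcome: minorities given higher nonresponse propensity.
logit_p = -1.0 + 1.1 * (df.group != "Danish") + 0.005 * df.age
df["nonresponse"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("nonresponse ~ C(group, Treatment('Danish')) + age + female",
                  data=df).fit(disp=False)
odds_ratios = np.exp(model.params)   # OR per covariate, reference = ethnic Danes
print(odds_ratios)
```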

  17. Contingent valuation method applied to survey on personal preferences on choice of electric power source

    International Nuclear Information System (INIS)

    Takahashi, Reiko; Nakagome, Yoshihiro

    2004-01-01

    A survey was conducted on personal preferences regarding the choice of electric power source to verify the applicability of the Contingent Valuation Method (CVM) to such analysis. The survey was carried out on local and urban inhabitants in two steps, first by mail and thereafter by direct interview. A choice of four typical forms of power source was presented: nuclear, coal, hydro and green power, and the question was asked whether the respondent would be willing to pay an additional charge for specifying their preferred power source. The mail survey indicated that more than half of the respondents held some willingness to pay, either for forgoing nuclear power or for expanding green power. The interview survey revealed various complex motives lying behind their answers. Consequently, it was found that their preference is significantly correlated with their personal image or knowledge of power sources, their thinking or attitude toward energy conservation, their sense of consumption, and their private view of life. It is concluded that CVM is pertinently applicable to quantitative analysis of individual opinions, particularly in terms of their motivation to participate in national energy issues. A number of modifications, however, need to be made to the survey design in order to ensure smooth application in practice. (author)

  18. Business process simulation - tool survey

    NARCIS (Netherlands)

    Jansen-Vullers, M.H.; Netjes, M.; Jensen, K.

    2006-01-01

    In the nineties, more and more attention was paid to process-oriented analysis of the performance of companies. Nowadays, many process-aware information systems are implemented (e.g., workflow management systems) and business processes are evaluated and redesigned. The discipline related to this

  19. A citizen science based survey method for estimating the density of urban carnivores

    Science.gov (United States)

    Baker, Rowenna; Charman, Naomi; Karlsson, Heidi; Yarnell, Richard W.; Mill, Aileen C.; Smith, Graham C.; Tolhurst, Bryony A.

    2018-01-01

    Globally there are many examples of synanthropic carnivores exploiting growth in urbanisation. As carnivores can come into conflict with humans and are potential vectors of zoonotic disease, assessing densities in suburban areas and identifying factors that influence them are necessary to aid management and mitigation. However, fragmented, privately owned land restricts the use of conventional carnivore surveying techniques in these areas, requiring the development of novel methods. We present a method that combines questionnaire distribution to residents with field surveys and GIS to determine the relative density of two urban carnivores in England, Great Britain. We determined the density of: red fox (Vulpes vulpes) social groups in 14 suburban areas of approximately 1 km² in 8 different towns and cities; and Eurasian badger (Meles meles) social groups in three suburban areas of one city. Average relative fox group density (FGD) was 3.72 km⁻², which was double the estimates for cities with resident foxes in the 1980s. Density was comparable to an alternative estimate derived from trapping and GPS-tracking, indicating the validity of the method. However, FGD did not correlate with a national dataset based on fox sightings, indicating the unreliability of the national data for determining actual densities or extrapolating a national population estimate. Using species-specific clustering units that reflect social organisation, the method was additionally applied to suburban badgers to derive relative badger group density (BGD) for one city (Brighton, 2.41 km⁻²). We demonstrate that citizen science approaches can effectively obtain data to assess suburban carnivore density; however, publicly derived national data sets need to be locally validated before extrapolations can be undertaken. The method we present for assessing densities of foxes and badgers in British towns and cities is also adaptable to other urban carnivores elsewhere. However this transferability is contingent on

  20. APPLICATION OF FTA AND FMEA METHOD TO IMPROVE SUGAR PRODUCTION PROCESS QUALITY

    Directory of Open Access Journals (Sweden)

    JojoAndriana

    2016-10-01

    A defective product is a product that has poor quality and does not meet the standard. Defective products can have a bad impact on the company, such as high production costs and a damaged company image. Several methods that can be used to improve quality are the Six Sigma DMAIC methodology, FTA, and FMEA. This study was conducted for several purposes: to determine the sigma level of the sugar production process at PT. PG. Krebet Baru, to determine the factors that cause defective products in the sugar production process using the FTA method, and to propose suitable solutions based on the causes of defects identified with FMEA. The process sigma level at PT. PG. Krebet Baru is 3.58. That sigma level indicates that PT. PG. Krebet Baru is a company that is still growing and needs improvement. The primary causes of defects in the production process are operator and machine factors. The failure mode with the highest RPN, at 210, is an excessively long steaming time, so equipment that can detect the water content of the sugar needs to be installed. When this equipment is installed, the exact time for drying will be known and the amount of defective product will decrease.
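
    The two quantities reported above follow from standard formulas: the FMEA risk priority number is the product of severity, occurrence and detection ratings, and a process sigma level can be derived from the defect rate (DPMO) with the conventional 1.5-sigma shift. The ratings and counts in the sketch below are illustrative assumptions, chosen only so the outputs land near the values quoted in the abstract.

```python
# FMEA risk priority number and Six Sigma level from the defect rate.
from scipy.stats import norm

def rpn(severity, occurrence, detection):
    """FMEA risk priority number on 1-10 scales: RPN = S x O x D."""
    return severity * occurrence * detection

print(rpn(7, 6, 5))   # 210, e.g. the highest-ranked failure mode above

def sigma_level(defects, opportunities, shift=1.5):
    """Short-term sigma level from DPMO, using the conventional 1.5-sigma shift."""
    dpmo = defects / opportunities * 1_000_000
    return norm.ppf(1 - dpmo / 1_000_000) + shift

# Illustrative counts giving roughly the 3.58 sigma reported in the abstract.
print(round(sigma_level(defects=1_870, opportunities=100_000), 2))
```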